Hitting Tech Sufficiency
Posted on May 8, 2017 Posted by John Scalzi 52 Comments
I was recently gifted with a Google Chromebook Pixel, which although now two years old is still the most specced-out Chromebook you can get (the version I received has an i7 processor, 16 gigs of RAM and a 64GB SSD, as well as a retina-like touchscreen). I was delighted to get it, and can attest to it being an all around lovely laptop, as well as (of course) just about the best Chromebook I’ve come across. It can run Android apps too, which is a bonus, although I don’t find myself actually using that ability much, either on this or the other two Chromebooks I currently have in the house. Be that as it may, if you have a hankering for a Chromebook, the Pixels are still well worth looking into. Google’s not making them anymore, so supplies are limited, but on the other hand you can pick one up these days for about $400, a steep discount from their original pricing (of about $1k).
As much as I like the Pixel (and I do!), one of the things I’m aware of at the moment is that I’m currently in a moment of technological sufficiency, which is to say that I’m at a point where I don’t really have a hankering for any new bit of tech. Before the Pixel arrived I already had the latest Asus Flip Chromebook, which I liked quite a bit and which I took on tour with me, where it performed in an entirely satisfactory manner. My desktop computer is a couple years old now but still near the upper end of things, techwise; as long as it doesn’t explode I’m fine. My cell phone is likewise well-specced and I’m in no rush to upgrade it. Basically, there’s no tech out there in the world I really feel the urge to pick up. I’m good.
This is very weird for me, I should note. There’s usually a laptop or cell phone or graphics card or camera or TV or whatever that I don’t have that I wish I did, and which I’m sorely tempted to get even if I don’t exactly need it (this is what Charlie Stross calls “having to make a saving throw against shiny”). But at the moment: Nope.
I think part of the reason for this is a bit of self-awareness, i.e., no matter what new computer (or phone, or whatever) I get, I’m almost certainly going to use it for the same things I always do — in the case of a laptop, to write emails and occasionally work on a novel (if I’m not at home), and read social media. These are not things which require blazing speeds or massive computing power, which is one reason I’ve become enamored of Chromebooks in the last couple of years; they’re nicely good enough, especially now that I can get models with backlit keyboards. They are so “good enough,” in fact, that at this point (for me, anyway), it becomes increasingly difficult to justify spending hundreds more for a PC or Mac ever again. Maybe if my laptops were my primary computers (i.e., no desktop computer). But they’re not.
Also, I think I might have a little bit of technology fatigue, which is to say at this moment in time there’s nothing so particularly new or innovative in terms of technology that I feel an urge to race out and upgrade. Laptops are sufficiently small and light and capable; their functionality isn’t notably different from what it was five or even ten years ago, at least in terms of how I use them. The most recent attempts to innovate in that area amount to either removing capability (Apple ditching inputs and forcing its users to use dongles) or adding capability of dubious utility (Apple again, with their “Touch Bar”). Likewise, the newest generation of cell phones doesn’t add much to the party for me — again they’re either dropping capability (no headphone jacks? Screw you), or what’s being added doesn’t impress me much.
(Tablets, I’ll note, have dropped entirely off my radar; I loved the Nexus 7 tablet, which was the perfect size for me, but I barely use mine anymore. Likewise the iPad Mini I have, which I got because I’m working on games designed for iOS. What I used tablets for previously are now handled by my phone, which now has a large enough screen, or by my Asus Chromebook, which flips about to make a perfectly serviceable tablet, especially now that it runs Android apps.)
There’s nothing that grabs me, upgrade-wise, so I suspect I’m unlikely to upgrade until my current set of toys breaks. Which will be soon enough, as tech these days is not made to last. But when it does break, the question will be whether I’ll upgrade, or just… sidegrade, and get tech that is equivalent to what I have now and thus relatively cheaper, because it will no longer be the shiniest of the shinies.
I don’t suspect this state of affairs will last, mind you. I am famously susceptible to new tech toys, and I suspect that soon some as-yet-unheralded feature or functionality will presently become indispensable (or will at least feel like it is) and then there I will be, Fry-like, thrusting out a fist of dollars and telling someone to shut up and take my money. But for the moment? Yeah, I’m fine, tech-wise. It’s a weird feeling. But I could get used to it. And so could my wallet.
Technically satiated. Me too except for cars… I really NEED a Tesla.
It was fascinating reading this, since I’m in a similar position. Between cameras, computers, and music equipment, I’m pretty satisfied right now, which is strange, since I just sold some older gear on eBay. I consider money like that found money and usually spend it back on new toys, but I’m actually pretty caught up on my toys right now, which is certainly an odd feeling. I have $600 burning a hole in my PayPal account and nothing I want badly enough right now to spend it on. But I suspect this to be a fleeting moment that will change too, very quickly, like a planet spinning off into the universe, to quote a character playing Patton in a movie.
I’m mostly in the same place – just upgraded the wife’s gaming desktop and the home theater PC, and I’m not longing for anything. I’m foreseeing that the next fail to save vs. Shiny will be when VR headsets finally come up with a killer app/must-play game that justifies their price tags. Until then, I’m going to idle and wait for the price to come down. Eventually, the falling price and incoming games will cross over and I’ll be getting the new Oculus or Vive, then do it again so I can play with my wife & daughter, and a new cascade will start.
Is this how Moore’s law dies? Not because they can’t get more transistors in an inch but because there isn’t a need for more? I remember getting a new PC every year or two because a new game would come out that couldn’t run at the higher settings with the current computing power. Or are we just getting old and aren’t as enamored with the latest and greatest? I’m running an older Dell. I did a graphics card upgrade and am fine. Yeah, there are faster machines out there and I’m sure there are newer games out there with even better graphics, but I really don’t care enough to drop the cash on a new machine.
What about an Amazon Echo or something similar? Using voice commands to turn on lights, pause TV, etc. is pretty amazing… like living in the future!
Disclosure: I’m an Amazon employee.
I have one. At this point I use it mostly as a speaker. Do the same with the Google Home we have.
Won’t Athena need a laptop to take to college with her? Or can she use one of your computers?
She has her own computers.
I’ve been that way for several years, and I’m a developer. Everyone I know is constantly building a new computer, upgrading cards, etc., like a Jedi building their own lightsaber. Me? Besides my work computer, I’ve used 3 laptops since 2001… and one I had to replace because it started overheating. Hell, we still have the same washer, dryer, oven and dishwasher that we got when we got married 30 years ago.
If it ain’t broke, don’t fix it (or upgrade)
You clearly are not sufficiently serious about photography, or there would be a constant stream of new bodies and lenses that you NEEEEEEEDDDDD.
Maybe you need to go back to 35mm film…
David G. Lewis:
In fact I’ve been thinking about an 85mm prime lens, but not too seriously yet.
This sort of answers a question I have about almost all science fiction, in that within the worlds of most future/near-future sci-fi stories, you never see characters griping about their technology or wishing it was faster or bigger or better. Everyone is always perfectly happy with the state of tech in their world. No one’s ever talking about how the Brainernet is okay for now, but just think what we could do if it was X times faster. No impatient whiner on their warp phone is ever having to be told “Give it a minute! It’s going into space!” I’ve been banging this drum for a half a decade now, but it seems like maybe I’ll have to retire this crusade. Because I agree with you. There’s not a single bit of tech I’m lusting over. :(
Personally, I have made the decision to never, ever buy a new Windows machine again. This leaves me with a dilemma. What will replace my laptop when it dies? I could simply buy a used Windows laptop. However, I think I am going to start migrating all my documents to Google Docs and eventually get a Chromebook.
There’s a point where tech specs exceed our perceptual capabilities. Printers got to 1200 dpi, and there really wasn’t any need for more dots; people just can’t tell the difference when you go to 2400. Same with displays on your laptop, television, or smartphone: more pixels become imperceptible. The majority of people cannot tell the difference between a raw WAV file and MP3 compression. Professionals who work with visuals can tell the difference between raw and lossy-compressed image formats, but they’re a minority.
Rembrandt: “Is this how Moore’s law dies? Not because they can’t get more transistors in an inch but because there isn’t a need for more?”
Advancements have slowed from every 18 months, to every 24 months, to every 30 months per doubling. The man who Moore’s Law is named after predicts the end of Moore’s Law around 2025. A single silicon atom is about 0.2 nanometers in diameter. High-end fabs are producing gates at 10 nanometers now. Cutting in half per step, that would be 5, 2.5, 1.25, 0.6, 0.3, and then it’s smaller than the size of a single silicon atom. That’s five halvings, and Moore’s law says two years per halving, so 10 years from now. Which is 2027, right about where Moore says it will end.
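The back-of-the-envelope math above fits in a few lines of Python. A minimal sketch, using the rough assumptions from this comment (a 10 nm starting point, a ~0.2 nm silicon atom, one halving every two years):

```python
# Halve the feature size every 2 years until a gate would be
# smaller than a single silicon atom (~0.2 nm across).
SILICON_ATOM_NM = 0.2
size_nm = 10.0  # roughly where high-end fabs are in 2017
year = 2017

while size_nm / 2 >= SILICON_ATOM_NM:
    size_nm /= 2
    year += 2
    print(f"{year}: ~{size_nm:g} nm")
# Five halvings fit before the atomic floor, ending in 2027.
```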
Even if Moore’s law ends in 10 years, five doublings means we peak with processors that are 32 times faster than today’s CPUs. Which is, well, it boggles my mind.
After that, I don’t see how chips will be able to follow Moore’s law without fundamentally, radically, world-altering physics being discovered. Folks have been talking about three-dimensional chips, but the problem with that is there’s no easy way to dissipate heat when gates are in a cube shape, and heat kills transistors. Racks of processors, the way server farms look today, pack processors into a three-dimensional array about as densely as they can handle heat limits and power limits. If you get 30x more transistors, you get ~30x more heat (not exactly, but still).
I think what we will have to do is look for a radical redesign of what it means to be a processor. Something like the way graphics cards have hundreds of cores in them. Which will require a radical reevaluation of how software is written and compiled to take advantage of thousands of cores. Or quantum computers need to be figured out and applications found for them, because they’re only good for certain kinds of calculations, and they need to be supercooled. Which will also require a radical reevaluation of how software is written and compiled to take advantage of quantum bits.
I think we have hit the point where the software that “most” people use today has reached, or is close to reaching, its max need for processing power. Currently, the highest-end computer setups for consumers are video game systems. And if you had 30x processing power, I don’t know that games would get 30x better. Graphics are good enough. Maps of game worlds are big enough. And the AI engines are decent enough to challenge players. The capacity of computers has basically met many of our perception limits. Meanwhile, people still play low-tech games like Angry Birds.
We live in interesting times, to be sure.
When I look at what’s currently driving tech to the next speedier, shinier thing-level, it’s either stuff at the edge cases of human perception (a 16K UHD display at 20,000 FPS) or it’s upcoming tech that’s not quite ready for my investment. Things like VR and AR goggles come to mind. The “killer app” for those things hasn’t hit the market. I anticipate that when it does is when I’ll get past this phase.
In the meantime, yeah right there with you. Running on mostly 2-4yr old tech right now and nothing is screaming to be any faster.
You NEED an 85mm lens — trust me.
Years ago, in a different life, I was a semi-pro photographer. The 85mm was my go-to lens. It gives just enough enlargement to make almost any picture (except perhaps landscapes) a little bit better. Certainly easier to enlarge, and no matter how many megapixels you have, sooner or later you’re going to want to make something REALLY big. Also, almost always, even if you think you are too close with the 85mm, many, many times it forces you to compose a better picture by including just the important stuff.
Just my opinion, and in fairness, this was back in the film days, where “graininess” was a thing, so your mileage may vary. Also, as a card-carrying geek for 65 years, it offends my sensibilities that there doesn’t seem to be at least one toy (grrr … I meant “tool”) that you absolutely must have at any given point in time.
I’m looking at getting the Asus c302 chromebook. Would you recommend it over the used Pixel?
I would recommend new over used with tech generally, but the Pixels I saw on Amazon were “new” (i.e., previously unused).
Beyond that I think it’s mostly about what you want. The Asus c302 can flip over to a tablet, which I actually found very useful on tour (I did my readings off of it). The Pixel is more solidly built and with faster processors and more RAM, and a better screen (although the c302 has a 1080 screen, which is good enough).
Honestly I think either is a pretty good choice.
What do you use for a WiFi network at home?
I’ve been using Unifi by Ubiquiti over the past two years and it is outstanding.
We used to have a central router that had terrible signal at the edges of the house (including the bedroom) and then I got Unifi set up with three access points and I’m in love.
I expect that we are due for another round of obsolescence in the near future as Intel goes back into combat mode with AMD and all the programmers find that the new stuff has the ability to support more (lines of code, threads, cores, terahertz).
I’m using the Google Wifi setup (three hubs) and it works very well for me.
I have a 135mm lens on my Canon that I really love. It is still light enough to keep the camera in balance and zooms nicely enough for the wildlife photos I take.
Back on point: I think the current tech is ten years ahead of its support systems. There are some truly archaic server farms out there and internet speeds stay the same.
People like Microsoft just seem to make things that are harder to use each time. Maybe they do that so you don’t notice the missing things they took out until it is too late.
I had to replace my old desktop last Christmas and don’t see a reason to try to top the sixth-generation i7-6700T processor, HD 530 graphics package, and 2 TB hard drive with 12 GB DDR4-2133 SDRAM.
I stuck my Galaxy S7 in a drawer before they started exploding because it was too big for me to carry around without breaking it or leaving it somewhere.
I am fairly satiated by tech too, at least until the transmitting side catches up.
@Greg, the number of transistors you can fit in an area goes inversely as linear dimension squared: there are 100 square nanometers in a 10-nanometer square. So that doubles the number of doublings we have left.
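A quick sketch of that area scaling, reusing the 10 nm and 0.2 nm figures from earlier in the thread (rough assumptions, not fab roadmap numbers):

```python
import math

# Transistor density scales with the inverse square of the linear
# feature size, so each linear halving is worth two density doublings.
linear_shrink = 10 / 0.2            # features get 50x smaller
density_gain = linear_shrink ** 2   # 2500x more transistors per area

print(math.log2(linear_shrink))     # ~5.6 linear halvings
print(math.log2(density_gain))      # ~11.3 density doublings
```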
And then you can go vertical, and then you can go fast, and then you can go big. A cubic kilometer of sub-nanometer elements switching at optical frequencies is more than enough computing power for anything you are doing now.
And that’s before you go quantum.
Heat dissipation will be an engineering challenge.
A lot of video game graphics look stagnant right now because so many publishers want to also support consoles.
There are a few that just don’t care. I have high hopes for Star Citizen for example. I fully expect it to be using 8 CPU threads and 16 GB of RAM by the time it releases. They’re doing 64 bit coordinates to build star systems that take 40 minutes to fly across at light speed.
As software developers, we really haven’t reached limits on how much computer we can use. We may have reached limits on how many people want to buy new systems though.
If all else fails we can burn unlimited CPU time doing true AI.
I think the killer app for a 16k display involves a nursery and an African veldt.
I’ve been at peak tech many times in my life. I had everything I really wanted, and I could easily walk through an Apple Store or a Best Buy without being tempted.
Being there multiple times obviously means the situation was temporary. Now is different because I’m early retired. I am getting the next iPhone, but I held onto my current phone for three years instead of my usual two. Even in the case of the Tesla Model 3, where I have already put $1000 towards a reservation, or the Glowforge, where I paid in full over a year ago (before retirement), I’m not on pins and needles waiting for delivery. They’ll come in their own time. I don’t mind.
Sounds like it is time to short some stocks…
My current desktop is sufficient unto my needs. I never became enamored of tablets, though I can use one (and consider Win 7 superior to all its replacements). I will be finally replacing my old flip phone sometime this summer with a smartphone, because e-mailing pictures. Otherwise, I seem largely immune to the lure of tech-shiny, although other things may occasionally catch my eye.
John, whenever you decide you would like to enter into a new phase of gadget lust, I do recommend trying one of the good VR headsets (Rift, Vive, PSVR), if you haven’t already. There’s no killer app or game quite yet (although a couple pretty good ones, and I suspect that Tilt Brush may cross the threshold once I actually get to try it). But I find it’s the highest DC check versus “*shiny!*” out there at the moment that I know of. It will make you want new cards, new peripherals, new *chairs*, whole new *ROOMS* for the purpose of its exploration and enjoyment. It’s really something else.
On the money. I feel the same way. Thx for sharing.
I have the Samsung VR thing and after playing with it for a while I’m under the impression it needs a couple more iterations before it’s going to be something I find useful and/or entertaining (and also, ergonomically, they need to do something beyond strapping a cell phone Viewmaster to people’s heads).
If you’re not a gamer or don’t work in photo/video editing, you could easily use a 10-year-old desktop, perhaps upgrading the HD and the monitor. CPUs, for a normal user, stopped being the bottleneck a long time ago. Even for gamers they come last in upgrading (GPU, HD, memory, monitor, and last CPU).
As for Moore’s law, we’re at the last gasps for silicon-based CPUs. The speed of semiconductor-based CPUs pretty much peaked 10 years ago; improvements since then have come purely from adding more transistors. Intel Skylake chips have about 2bn transistors on a 14-nanometre process (for comparison, a human cell is usually between 1,000 and 10,000 nanometres, and visible light wavelength is between 400 and 700 nm); this year they should reach 10 nm, and the physical limit for silicon is said to be about 5 nm. I think we’ll be limited to two, perhaps three doublings for silicon-based CPUs. After that? Either carbon-based CPUs (which should allow more transistors and faster speeds) or 3D chips (with transistors stacked in several layers). And still, right now transistors are a few dozen atoms each, so to keep Moore’s law viable in the long run, either we learn how to make transistors smaller than a hydrogen atom or we develop quantum processors.
The big shiny thing I’m working on right now is the smart home. We’ve been buying bits and bobs a little at a time because it can be really expensive. It seems a little gimmicky, but the voice/motion controlled lights are super convenient in places like the garage or those weird hallways and staircases with two lightswitches and the one you need is always on the wrong side. The next thing we’re trying to figure out is using Arduino/Raspberry Pi to control the blinds, which is gonna be awesome.
Beyond that I still really want a smart watch. The new slimmer design on the Moto 360 actually fits my teeny tiny wrist, and the ability to manage notifications from a watch means I can keep my phone stowed in my purse, which is important when you don’t have functional pockets. (Although I’ve taken to buying my pants in the men’s section to address the pocket problem.)
At this point, I only use my tablet for video and Feedly. When the iPad finally dies, I’ll probably get a large screen Kindle Fire to replace it, since they’re significantly cheaper.
I’m generally in the same boat. The desktops I bought for gaming some…what, seven?!? years ago are still functioning. They can run Overwatch and other games (my intent was to have a machine powerful enough to run Starcraft II and Diablo 3, at the time). Could they be faster and handle more intense graphics? Sure. Do I care that much? Not like I used to, no. Especially as I have a newer laptop (2 years old, now) from Alienware that is probably more powerful than my desktops. The only game I’ve had trouble playing in the last year was Battleborn…which is just one more reason that Overwatch cleaned their clock.
I was listening to NPR last night and they were talking to Walter Mossberg (formerly the tech editor at the WSJ, then of his own company, now retiring), who said that the smartphone is the personal computer now. I was kind of floored at this simple revelation. If not for work, I’d do most of my web browsing on my phone or tablet.
Someone mentions above that consoles are holding PCs back in terms of graphic fidelity. That may be true… but I’m not sure that it matters that much. Slightly higher textures, slightly higher resolutions, somewhat better performance… but is it really that noticeable? The visual design of the new Zelda, for example, is more compelling to me than seeing Titanfall 2 running at 4K. My new Samsung TV can do 4K, but the jump from HD to 4K is not the same as the jump from 480i to HD was. When I first saw 1080i native content on my first HDTV, my jaw literally dropped open. When I watched 4K native content, I was… squinting to make sure it was not just regular HD.
We’ve reached a place where the improvements are incremental and dwindling. What was the last iPhone improvement that was compelling? Thumb-scanning, maybe? I have played some pretty compelling content on my friend’s HTC Vive… but spending hundreds or thousands to play it? Not so much.
Nice Save vs Shania Twain with “doesn’t impress me much”
Totally understand this. Growing up, it was always about having the latest and greatest technology. The latest iPhone. The latest computer. But maybe it’s age, maybe it’s the fact there isn’t really anywhere to innovate anymore with phones and computers. I’m just not excited by new product releases anymore. I’ve had my current iMac for 5 years and it still works perfectly fine and is fast. I have no reason to need a new one or upgrade. Same with my laptop, I only replaced it last time because the old one was broken. When it came time to upgrade my phone after a screen broke, I didn’t get the latest iPhone, I got the one below it because I still wanted a headphone jack. They’ve made products so well in the last few years, they just last and don’t slow down.
Sometimes I think it’s almost like Moore’s Law has stopped.
Desktops – I used to build a new desktop every 18 months, now it’s every 5 years. Cameras – my Canon M3 is still pretty amazing, great pictures, light, and fast. TVs – 4k is nice, but at 10 feet away you can’t really see much difference. Audio – nothing new there beyond various Sonos clones for years. Tablets – my iPad is now nothing more than a clock by my bedside. Laptops – my Core i5 Surface Pro 3 is still pretty damned good. At this point, unless something dies, I just don’t feel the need. Even the phone I only upgrade once a year because Apple makes it easy – not because I need it.
I’m thinking our technology has hit a (probably temporary) plateau. There’s simply no “killer app” for anything anymore. Everything is just incrementing along.
All that said, we do live in a science fiction world – I carry a supercomputer in my pocket that brings me a sizable percentage of the world’s information just by asking for it, I’ve seen live pictures from every major body in the solar system, I can actually buy a ticket to space and have a reasonable expectation that I’d get to go. I can be virtually anywhere on the planet by this time tomorrow. I’ll probably get to have a car drive me, by itself, to the grocery in the next 10-15 years. I am alive today because our medical technology makes it so. All in all, while it may be a technological plateau, it’s still a pretty damned good one.
I’ve felt this way ever since getting my S7 Edge. It really does everything you want on anything smaller than a laptop. Also, I think this is helped along by really good cloud applications. Everything important in my life is now on Google Drive, and when new things arrive, I can point my phone at it and presto! I’ve almost forgotten what a “save” button does.
David Palmer: Moore himself says his own law will break down in about ten years. Ignore him at your own peril.
Quantum computing is only good for weird applications right now. If you want to crack 256-bit encryption, it might help. If you want to play a video game, probably not nearly as helpful. It gives you a fuzzy answer which probably needs to be verified digitally. Also, quantum bits need a lot of support structure around them to work. They aren’t like a single transistor.
“you can go vertical, and then you can go fast, and then you can go big”
You can’t go much faster. Clock speeds are currently poking at 5 gigahertz. That’s a clock period of 0.2 nanoseconds. Light travels about a foot per nanosecond. So that means electricity could travel, at most, about 2.4 inches in a single clock cycle. A die for an Intel processor is about an inch diagonally, and electricity doesn’t travel at the speed of light in metal. Then you have to subtract the delay it takes a signal to get through a transistor (the actual logic gates on the chip), and then there is clock uncertainty between the sending and receiving flip-flops, and then there is the setup and hold delay on the flops. That all adds up to a hard limit on how much faster we can go. If we get two more doublings out of clock speed, to 20 gigahertz, in 4 years, I will be quite surprised.
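For the curious, here is the speed-of-light arithmetic above in a few lines of Python (using the vacuum speed of light; signals in metal interconnect are slower still, so these are upper bounds):

```python
# How far a signal can possibly travel in one clock cycle.
# Light covers about 29.98 cm (~11.8 inches) per nanosecond in vacuum.
C_INCHES_PER_NS = 11.8

def inches_per_cycle(ghz):
    """Upper bound on signal travel distance per clock cycle."""
    period_ns = 1.0 / ghz
    return C_INCHES_PER_NS * period_ns

print(inches_per_cycle(5))    # ~2.4 inches at 5 GHz
print(inches_per_cycle(20))   # ~0.6 inches at 20 GHz
```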
Vertical? Heat is already a huge problem, and that’s with the gates mounted directly to a liquid-cooled metal plate with pumps and fans. Gaming rigs now have to have crazy active cooling or they burn out. If you try to sandwich 8 or 16 layers of gates on top of each other, computers will have to come with their own halon fire suppression system.
Big is an option if by big you mean distributed. Processor speeds aren’t increasing the way they used to. Now the solution is to add more cores. We will probably see an approach like graphics cards, with hundreds of little cores. But software will need to be completely redesigned to take full advantage of distributed calculations. Server racks are the current solution for commercial apps. SETI@home lets people download code to borrow unused cycles on their computers. But not every app works well distributed.
Raspberry Pis now sell for $5. You need to add a USB power supply, a keyboard, mouse, and monitor. But for five bucks you get a computer that can run Linux, play old video games, act as a media server for your music and video, let you surf the web, and so on.
That level of functionality would satisfy what a lot of people want to do with a computer. And it’s only five bucks.
So long as the latest version of Sid Meier’s Civilization (now VI) runs on my Lenovo home desktop computer, I am a happy camper. I will only need to upgrade when a newer version will not run on this three-year-old piece of tech. And since I have the internet on the thing, why ever would I need a smart phone with internet access? My flip cell phone does not have a touch screen and you know what? I can still make phone calls on it and even read an occasional text message; not that I ever text back. I will call you back. I think Greg is on target about the limits of current silicon-based processing advancements. Not much room left for silicon-based advances, and I doubt any of them will mess up my playing Civilization VI or some later version.
You mentioned TV. I don’t remember you ever sharing your TV/home theater setup. Any chance of a post on that type of stuff? 4K? HDR? 5.1?
(I work in video production, but I haven’t made the jump to 4K at home or at work yet.)
I’ve been feeling very much the same in regards to my current crop of tech gadgets but I’ve realized that I’m kinda waiting for 2 key technology advances that I see as having the potential to spur a lot of developmental upgrades. Internet speed is a bottleneck right now (in the US at least). Especially when so much of our technology use depends on streaming – from media to gaming to video-conference type use. In my experience, my hardware technology is more powerful than my internet connection and I can’t upgrade the internet any more so it makes no sense to upgrade the hardware.
Smart home systems are the other thing. Amazon Echo and Google Home and whatnot are cool but they need some usability advances to get beyond the current general use as music speakers and verbal google. And the full home systems are expensive, buggy, and generally take a lot of tech know-how to set up and get working correctly. When fully integrating the smarthome components becomes cheaper and easier for the average consumer I think we will see a lot of advances in that as well.
Might this be a second cousin to the practice of always waiting for the second generation/first update of a product? I used to do business/computer tech reviews, but since those days I’ve never bought for my own use the newest generation of gear–partly to avoid the New! Improved! price penalty, partly to let the bugs get stomped, and partly because there is rarely a feature or capacity that I need for my actual work or play.
Then there’s the bad habit manufacturers and OS builders have of dropping connections or functions that I depend on or prefer. Last time I replaced my main machine, I had to hunt for a used model that still had old-standard keyboard and mouse ports, because there’s no way I’m giving up my favorite keyboard and trackball. For similar reasons, I find myself skipping entire generations of Windows versions while they unbugger the various pseudo-improvements. On the other hand, I finally discovered the utility and convenience of the tablet for simple tasks such as reading e-mail or Googling–but I’m not going to be doing any writing on one, even with a marginally usable Bluetooth keyboard. (My wife does her writing on a 12-year-old ThinkPad running Windows XP. Sufficiency, as they say, is enough.)
I confess, though, that I no longer use my VHS-C video gear, which replaced my Super-8 movie gear, which replaced my standard 8mm cameras and editing stuff. . . . And don’t get me started on audio formats and technologies.
Moore’s law has radically slowed down. There is no reason to upgrade often, because upgrades are so small. I built a PC 4 years ago: a mid-grade PC, i5, a two-year-old video card, etc. Through the early 2000s, if I’d had a PC like this, 18 months later I would have been turning down the graphics level a lot, and within 2.5 years there would have been games I could not play. I have not upgraded once, and I can play every game on max settings.
Unfortunately, tech progress is slowing down. It’s a shame, since it was a primary driver of economic growth. It also means science and other technologies will not progress as fast. Phones, tablets, laptops: people are just using them until they stop working. There is no need to upgrade, since the performance improvements are not noticeable.
This was masked for a few years when the iPhone and iPad came out, because they were “cool” and people wanted to get the next version even though its improvements were so small as to be irrelevant.
TVs have topped out. They tried to get us to buy 3D, but the tech still sucks, so that faded. Then they came out with the gimmick of the curved TV, and I still see some of those, but it’s stupid and doesn’t do anything.
CPU clock speed has been topped out for 15 years due to heat and power costs. You can’t reliably cool CPUs with fans above a certain speed, and water-cooled systems are for enthusiasts. Do you really want to risk having water in your iPhone? Plus, faster clocks just use more power, and battery technology has not kept up. If we get a better way to cool chips, plus better battery tech and energy savings at higher clock rates, we may see an uptick.
Now we are running into physics limits on shrinking CPUs. They can still be shrunk for a while, but it’s getting harder to do (which explains Intel’s delays in releasing new CPUs).
One thing that isn’t slowing down is the rate at which supercomputers improve. If you find tech stuff cool, check out top500.org. It lists the 500 fastest supercomputers in the world, updated every six months. Every year, about 65% of the top 500 falls off the list. The very top ones don’t climb as fast, but overall they are improving rapidly: it’s literally a straight line, which reminds me of the straight-line returns of the Madoff Ponzi scheme. This hasn’t slowed down at all. It’s pretty cool, but supercomputers are mainly for governments and research; they are too expensive to be drivers of job creation.
The SF-style quantum computers are so far from leaving the research sector that no one has a product or any real release date.
It’s a shame.
Same here. Sometime recently we hit the point where everything was powerful enough to do almost anything most of us want. Yeah, the new phones will be faster than the current ones, but… meh. Same for laptops and desktops. Wearables will become a thing when the world around them has sensors they can interact with, and as several people have pointed out, home automation is still pricey and not that compelling for most people. VR? I don’t see the current VR gear ever moving out of the hobbyist/gamer/geek demographic. Can you see most people you know with a facehugger thing strapped to their head? Sure, the experience is likely cool, but I just don’t see most people thinking “Why yes, I’d like to put this thing over my face regularly!” AR? Honestly, I wonder what would have happened to Google Glass if it had come out with Pokemon Go on it. But again, I don’t see most people wearing AR glasses most of the time in the near future.
VR is a no-go as long as it makes you instantaneously seasick when you turn your head a certain way.
I got to play with someone’s VR set for a little while. It was pretty impressive tech. And then something happened: I turned my head and got hit with a wall of nausea. Took the goggles off, and it didn’t go away. My friend says it happens to him too, so he doesn’t use it much. Several people I know who have VR say they get vertigo.
Who wants to pay a couple grand or more for goggles and a machine powerful enough to run VR, just to feel like someone kicked you in the gut? No thanks.
I was all psyched for VR, but it’s got major issues yet.
I’m still part of the Apple universe. I went from Microsoft to Apple because I have always been very bad with computers. Fifteen years ago a friend who got tired of solving my problems for me advised me to go Apple because even digi-morons like me couldn’t fuck up an Apple computer.
That proved to be the case – and I still like their products, even though I dislike their arrogance (no ‘delete’, now this headphone jack shit.)
So, for now I have all I need: a mainframe in one home and a laptop in the other for my writing. A tablet for my newspaper reading. An iPod for music (well, mostly audio books these days) and email checking on the hoof. (I don’t even have a normal mobile, let alone a smartphone.)
My only problem is that I can see a moment coming that I will be truly fed up with Apple. I’m not sure what I will do then – though it will be a sidegrade. I know what I want from my tech and that won’t change.
Given the amount of traveling you do, you might benefit from moving to a small, portable gaming laptop like the *ugh the name* Razer Blade. Then you could check social media or write something *OR* take half an hour to murder some super-mutants while you were traveling.
I mean, they’re not going to murder themselves. Except for the suicide-bomber ones.
The Gear VR is functionally a mobile phone strapped to your face, so your comparison with the Viewmaster is pretty apt. I did actually use my Cardboard to look at apartments in San Francisco with, and it was better than just looking at pictures online, but the Rift/Vive/PSVR are in another league.
There’s definitely still a lot of room for growth, but there’s enough of a difference that it’s one of kind and not just degree. I don’t know how you feel about arcade cockpit fighters, but I find EVE Valkyrie on the Rift to be increasingly varied and compelling in the experiences it offers, and the kind of thing I can just drop into the way I do War Thunder or Overwatch.
Also, I haven’t tried room-scale and the motion-controllers yet, so I can’t be sure, but I suspect that their addition to the Rift/Vive/PSVR is what definitively puts those systems head and shoulders above the simpler VR solutions.
Yep. I bought a new Lenovo laptop about a year and a half ago as a backup for when my current one, a 2011 Satellite laptop, dies. I figured I’d start migrating stuff to the new computer and eventually the Satellite would become the backup. Hasn’t worked that way. The Satellite has Win 7; the Lenovo has Win 10. I much prefer Win 7 to Win 10. So I am backing up files to the Lenovo but not really using it much, even though the Lenovo has the lit-up keys on the keyboard. (The keyboard on the Satellite died a while back, so I bought a lit-up USB external keyboard for it… sometimes the external keyboard is more useful than the built-in one.) Plus I use Word 2007, and I had to put a newer Word on the Lenovo. Ugh. I’m used to and familiar with the older programs. Learning the updated programs is much like having to learn entirely new programs, and the learning curve, for me at least, is pretty steep.
For much of the time I use an iPad mini for surfing the web on the go, but even though I have a word processing app on it, I don’t use it for that much. (There are times I wish the iPad had a USB port – I don’t have a Bluetooth printer, etc.) I do have an HP Pavilion tablet with a detachable keyboard, but I have discovered that for what I generally do on the go, the iPad mini works better, is smaller and lighter, and has better battery life. So the Pavilion only gets used on research trips.
No longer have a desktop computer, as I find laptops far more convenient. But I don’t do a lot of computer gaming, nor do I stream much content – I’m one of those Luddites who much prefer to buy CDs and DVDs. For that matter, I really dislike the fact that most programs now HAVE to be downloaded from the web; I would rather have them on CD and install from the CD rather than the web. Especially Word – the new version wants to be connected to the web and is not “fully functional” if disconnected from it – but a lot of places where I would be using that computer outside my house are places where free wifi isn’t available, and many don’t have wifi at all.
I think I also agree with what seems to be the consensus above: the real roadblock to upgrading computing performance these days is slow internet. I have Comcast as my service provider. Even with only one device using wifi, sometimes my connection speed could be beaten by any of the local snails. Pretty darned frustrating. Most of the internet providers in this area have the same problem. It’s that internet backbone that really needs to be upgraded before most of the new online tech can be fully utilized.
This is me lately. I’ve been dragging my feet on an upgraded phone (which my employer will provide to me for free) because my old one is plenty fast and has better specs in some ways. I’m only thinking about a new laptop because the old one has a hardware issue.
What I’m really excited about is solar panels. I want some on my house desperately. At the same time I’m not ready to make a 20-year commitment because I know they’ll be much better and hopefully less ugly if I wait just a couple years. But… I WANT them now.
All I really need is a new main TV, but I’ve drawn a line in the tech-sand and promised myself I’d hold off until I can reasonably afford a 65″ 4K OLED. Right now, last year’s model is running $2300 on Amazon, so it’s getting closer.
Also, as a user, I understand the loss felt by the obsolescence of the 3.5mm phone jack for headphones. As an electrical engineer, I agree with the decision to delete it. That may put me in the minority, but so be it…