Tuesday, March 21, 2006

Where's My Beefy New Desktop, Dude?

Last August I talked about hardware trends in terms of what's changing in the consumer electronics landscape. But I think an update is in order, as several things have changed and there were a few things I didn't really cover in the first article.

I was recently in the market for some new hardware because my current computer had run out of hard drive space and I'd run across a game my video card couldn't handle. So I got to thinking about what has and hasn't changed over the 10 years I've been a computer consumer.

I tend to be willing to spend between $1,000 and $1,500 on a new system, so if we take price as a constant, all other things being equal, a computer bought in 2006 should be better than my 2002 machine by about the same ratio of goodness as that 2002 machine was over my 1996 one. That is, over equal stretches of time, the various components should improve by roughly the same multipliers. (You might have heard of Moore's Law.)

In 1996, I was running a 166 MHz Pentium with 96 MB of memory and a 1.6 GB hard drive. Sadly, I was still using that machine in 2002. But in 2002 I bought a new one: a 1.8 GHz Pentium 4 with 512 MB of memory and 40 GB of hard drive space. Furthermore, the memory I bought with that computer was RDRAM, which is very fast. If I were to buy a comparably priced computer in 2006, it would be about a 2.8 GHz Pentium 4 with 1024 MB of memory and 160 GB of hard drive space. Plus, that 1024 MB of memory is SLOWER than the 512 MB I bought in 2002.

Whew – that's a lot of numbers. The point is this: from 1996 to 2002, the hard drive got 25 times bigger, the processor got 10 times faster, and the memory got 5 times bigger. From 2002 to 2006, the hard drive got 4 times bigger, the processor got 1.5 times faster, and the memory got 2 times bigger. If Moore's Law were holding up, we'd expect the two periods to show much more comparable gains. (The extra year in the 1996-to-2002 span doesn't explain the jump.)
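If you want to check my math, here's a quick back-of-the-envelope script (just a sketch, using the specs quoted above) that computes those multipliers:

```typescript
// The three systems quoted above: clock speed in MHz, memory in MB, disk in GB.
const systems = [
  { year: 1996, cpuMHz: 166,  ramMB: 96,   diskGB: 1.6 },
  { year: 2002, cpuMHz: 1800, ramMB: 512,  diskGB: 40  },
  { year: 2006, cpuMHz: 2800, ramMB: 1024, diskGB: 160 },
];

// Print the improvement multiplier for each component between consecutive machines.
for (let i = 1; i < systems.length; i++) {
  const prev = systems[i - 1];
  const cur = systems[i];
  console.log(
    `${prev.year} -> ${cur.year}: ` +
      `cpu x${(cur.cpuMHz / prev.cpuMHz).toFixed(1)}, ` +
      `ram x${(cur.ramMB / prev.ramMB).toFixed(1)}, ` +
      `disk x${(cur.diskGB / prev.diskGB).toFixed(1)}`
  );
}
// 1996 -> 2002: cpu x10.8, ram x5.3, disk x25.0
// 2002 -> 2006: cpu x1.6,  ram x2.0, disk x4.0
```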

What we saw from 1996 to 2002 was a jump to a new generation of machines. In 2006, consumers are still buying the same family of Pentium processors and the same type of memory. It's the same generation of hardware. Just a gradual evolution.

So what explains this difference?

The main thing is all of the research hardware companies have poured into notebook computers. Notebooks in 1996 were total crap. In 2002, they were decent, but not great. In 2006, most people I know who own a computer have a notebook, and overall notebook sales have surpassed desktop sales. Hardware makers spent gobs of money racing for market share, pouring time and money into miniaturizing current technology and cutting its power consumption. Plus the explosion of Wi-Fi. (I'm not saying this is a bad thing.)

The other thing is the rise of flat-panel monitors. Almost all new desktops are packaged with flat panels, which are still on the expensive side. So, since my numbers kept price constant, the flat panel ate away at other hardware I could have bought. A new desktop with a CRT would come with better specs in my price range. (Who wants a CRT these days, though? I scowl at mine daily. There are days I go to work just so I can stare at my dual 19-inch flat panels in silent reverence. It's so choice.) Ahem. Anyway, the CRT-equipped 2006-era system came out like this: 3.0 GHz, 2 GB of memory, and a 250 GB hard drive. That ups the ratios slightly, but only the memory ratio comes close to the 1996-to-2002 jump. And keep in mind, that memory is still slower than the memory from 2002.

As a side note, the fast RDRAM from 2002 turned out to be a dead-end technology for various reasons. It basically turned out that doubling your RAM's speed wasn't as good as doubling its size, especially when the bigger RAM cost less. So people bought big, cheap memory over small, fast, expensive memory.

There are a few hidden benefits to current technology: current processors are dual-core or hyper-threaded, which basically means you can run more programs at once without your computer slowing down. Intel has invested a lot of money betting that you care more about running a lot of things at once than about running one thing quickly. It's a smart bet, considering how most people use their computers.

There was also a lot of money spent on 64-bit processors. AMD has come out with some successful 64-bit chips and has really surpassed Intel in that area. Most of the money Intel spent on 64-bit processors seems to have been wasted, so that's another reason why progress may have slowed down.

Clock speeds haven't increased much, but your hardware's ability to multitask has. Unfortunately, software and software developers haven't caught up to dual-core technology yet, so buying a dual-core machine probably won't be as big of an advantage as it seems for another few years, until your favorite operating systems and compilers are rewritten to take advantage of the new hardware. Tough luck.
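To give a rough feel for what "taking advantage" looks like, here's a minimal browser-side sketch (the number-crunching is made up, and it uses Web Workers, a browser feature that arrived a few years after this post) that splits one big loop across two workers so both cores can actually stay busy:

```typescript
// A sketch only: the worker body is a stand-in for any CPU-bound work.
// The worker script is built inline so the example stays self-contained.
const workerSource = `
  onmessage = function (e) {
    let sum = 0;
    for (let i = e.data.start; i < e.data.end; i++) sum += Math.sqrt(i);
    postMessage(sum);
  };
`;
const workerUrl = URL.createObjectURL(
  new Blob([workerSource], { type: "application/javascript" })
);

// Split one big loop in half and hand each half to its own worker,
// so a dual-core machine really does run both halves at the same time.
function crunchInParallel(n: number): Promise<number> {
  const ranges = [
    { start: 0, end: n / 2 },
    { start: n / 2, end: n },
  ];
  const parts = ranges.map(
    (range) =>
      new Promise<number>((resolve) => {
        const worker = new Worker(workerUrl);
        worker.onmessage = (e) => {
          resolve(e.data as number);
          worker.terminate();
        };
        worker.postMessage(range);
      })
  );
  return Promise.all(parts).then((sums) => sums.reduce((a, b) => a + b, 0));
}

// Usage: crunchInParallel(50_000_000).then((total) => console.log(total));
```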

The elephant in the room, though, is that consumers aren't buying computers as often as they used to. Is it the chicken or the egg? Would I feel that I needed a new computer if new hardware were 10 times better, rather than 2 times better, than what I currently have? Or is it more that I'm satisfied with a computer that runs every program I can possibly pay attention to at once, without any problems?

Computers seem to be hitting diminishing returns when it comes to price compared to what new things you can actually do with them. That's why mobility is all the rage. People are realizing there's a lot more value in a computer they can use where and when they want than in a stationary power desktop that doesn't do anything stunningly better. The exception is power gamers, but at this point they're a small minority of the computer market.

The hardware industry looks to me like it's becoming a commodity business, if it isn't one already, where profit margins are very thin. The good thing is that everyone will be able to afford multiple computers in the near future. The bad thing is that new and exciting developments in consumer electronics are going to keep slowing down, and will never come as fast again as they did in the 1990s. (Unless we meet an alien race or figure out fusion power or something else totally whack like that.)

*sigh* I said all that and I didn’t even get to tell all you guys about the cool new transparent computer technology. I guess that’ll have to wait for another time.

6 comments:

Matt C. Wilson said...

I would argue that computers have hit a "good enough" point for the majority of the market (read: most moms you know).

Looking over the past 25 years, you can break down each 5 year section into overriding themes: Personal Computers (81-86), Multitasking GUIs (86-91), Desktop Business (91-96), Web (96-01), and Mobile/Wi-Fi (01-06). I think right now it's hard to see any big problem that could spawn a new computing revolution.

Each of these phases has helped hardware manufacturers and monopolistic OS companies alike. Perhaps one of Microsoft's biggest successes has been making each subsequent iteration of Windows push the available technology to its limits.

Except recently. When Windows "just works", it's not busy blue-screening, lagging during a task switch, or failing to adapt itself to new devices or internet connectivity. In short, XP was functional but not bloated, and now the seething mob is silent and happy.

Which is why Vista is interesting to me. I can't help wondering if somewhere deep in MS, perhaps in the recesses of Bill's own brain, the need for this bloat is obvious. Because for as slick and pretty as it purports to be - what problem is it solving?

Assuming it doesn't spawn another revolution, the only remaining real competitive statistic Microsoft has is benefit/cost. And uh, you know - give enough hackers enough time to write a comparable OS and that's division by zero. :)

Kurt said...

Nice analysis, both of you. Corporate use will still keep the desktop business rolling for a while, but I think we are definitely at or close to a "good enough" point. The major thing taxing processors today is interpreted languages in the browser, i.e. Javascript and the whole AJAX thing. Imagine if all of MS Office goes AJAX for corporations: that's a huge memory and processor demand, and it would force big corporate upgrades. It remains to be seen what will happen in this area, but Web 2.0 could be a tipping point for hardware upgrades on the desktop.

Mobility is definitely a huge driver today and where all the money is going. Look at all the players trying to make inroads: the Apple iPod, everyone trying to be an iPod killer, MS Origami, the PSP, the Motorola SLVR/ROKR, the BlackBerry, all the smart phones... etc. No one has yet perfected the mobile platform that can be a cellphone, MP3 player, GPS, PDA, gaming device, and all-purpose computer I can plug into a dock at home to use with a full screen, keyboard, and mouse... but I think someone will, and I think it will be amazing.

Jake said...

I agree that corporate business will drive desktop sales for a while, but I think you might be overestimating the impact of AJAX.

I haven't given it serious analysis at this point, but the whole theory behind AJAX is to create lean-traffic, high-functionality websites. I think AJAX is actually a more horsepower-efficient model than traditional postbacks.

Matt C. Wilson said...

I agree with Jake. AJAX, judiciously applied, should cut down on the total amount of churn to render a given page of web content.

We're actually adding it to the application we work on, and the benefit shows up on large pages with different sections of content. Being able to update just the info relevant to one section, rather than reposting the entire page and rerendering the (mostly identical) remainder, is the big win.
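Here's a minimal sketch of that kind of partial update (the /section endpoint and the element id are made up for illustration):

```typescript
// Fetch fresh markup for one section and swap it in,
// leaving the rest of the page alone.
function refreshSection(sectionId: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", "/section?id=" + encodeURIComponent(sectionId), true); // async
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4 && xhr.status === 200) {
      const target = document.getElementById(sectionId);
      if (target) {
        target.innerHTML = xhr.responseText; // only this section rerenders
      }
    }
  };
  xhr.send();
}

// Usage: refreshSection("order-summary"); // hypothetical section id
```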

Granted, if you want to use javascript for any heavy lifting on the client end, it would tax the local machine. But I'm not familiar with any apps where the majority of the work happens on the client - at least not web based ones.

Maybe if we ever saw something like web-based SETI@Home, where server communication is light but client load is heavy?

If you're thinking more "Excel or Photoshop, on the web" - I think there will need to be another iteration of client-side standardization before we see that day.

AJAX is key for asynchronicity, but for powerful, web-delivered, rich client applications, we need more. We need big blocks, like a JS graphics library, or a mini-database, so that each page of a web app isn't rerendering the wheel. The standard HTML input controls don't do much in terms of slick interface. I wouldn't build WebExcel as a grid of textboxes. :)

Anonymous said...

A few comments:

1) Economics: Moore's law is about computing power at constant real price, not at a constant number of US dollars. Our currency has weakened a good bit since 2002.

2) Revolution I want? A new OS paradigm. I think we've gotten to the point where OS's make the mundane simple. I want one that makes the fantastic accessible. For creative uses, today's OS's make creative projects feel like planning a space shuttle launch. I want them to feel like driving a Ferrari.

3) Revolution we'll get? Super BlackBerries. Streamline, miniaturize, mobilize, for exceptional convenience. I think it's apparent that this will make mundane tasks accessible 24 hours a day, all in one device. Yay!

Yeah, I'm still a dreamer.

-Ben

Ben said...

I would like to point out that the comments I made in March 06 have aged like a fine wine.