[OCN] Less Is the Future of Computing?

If you've been following hardware news for the last two years, you may have noticed some of the many trends that are becoming the new hotness in computing. As always, the race for price/performance between Intel and AMD is on, and the two giants, along with Nvidia, are releasing new flagship products and die-shrink refreshes almost every month. That's nothing new... But other hot topics are also shaking up the hardware world: more and more parallelized computing via multi-core CPUs, GPGPUs and GPU acceleration, the growing presence of DDR3 in high-performance systems, and something I think is really special, the hype over smaller, cheaper, more power-efficient computers, which is the topic of this article.

If you look around, you'll realize pretty much everybody is trying to take a stab at the new nettop/netbook market: Via has been in the game for years, of course, but both Intel and AMD are churning out more and more interesting products. The long-awaited Atom is pretty much leading the pack, but the Athlon 2000+ and the upcoming Bobcat are also very promising alternatives. As for the actual manufacturers of such integrated systems, it's pretty much a free-for-all: Asus, MSI, Acer, HTC, HP, Fujitsu, everybody wants a piece of the pie. Since I find that releasing old, revamped technology is almost an insult to the thousands of engineers who have been working their asses off to make better, more powerful chips, allow me to question this new hype: tiny computing has already caught on, but is it going to live on? Will people really continue buying underpowered hardware just because it costs peanuts?

I can see valid arguments on both sides of the scale. I know that this isn't necessarily the right place to ask, but if you think about it, what applications apart from games do you actually use?
Chances are that unless you're doing some hardcore 3D modeling in SolidWorks or touching up video in Premiere, all you're doing is surfing the net, writing up documents, and chatting on AIM or MSN. Do you really need an E8600 with 8 gigs of RAM to do that? Absolutely not, and chances are that most people who currently can't afford a computer won't be doing much more than that either, so all these mini laptops are ideal for them. I can also see these new chips working wonders in carputers, boatputers(?), and computer-aided navigation/entertainment devices of all sorts. Carputers have been around for a while, but most will agree that the Atom is pretty much the beefiest chip available in an integrated nano-ITX form factor, which can fit pretty much anywhere. With the rising popularity of home servers, the integrated Atom board is also a great choice, with a smaller physical footprint, lower price, and lower power consumption than the AMD Sempron-powered prebuilts I keep seeing in retail stores.

On the other hand, I'm questioning the longevity of the current low-power, low-cost chips. After all, what's the point in giving lower-income families computers if they are rendered obsolete only a few months after they are bought? The Atom 230 has been called inept at handling day-to-day computing by Tom's, which right off the bat is a pretty bad indication of how long the chip will survive, and even if it were to be good for another year or two of browsing and typing, the roadmap for the Intel Atom kind of makes me want to wait. Dual-core Diamondvilles should be landing in nettops and integrated ITX devices near you any time soon, and next-gen Atoms, code-named "Pineview" and featuring a core shrink to 32nm, are just around the bend, ETA 2009. Normally, I'm not the type to wait for a newer, better component, because in the end you can just end up waiting eternally, but in this case, buying an Atom 230 right now is pretty much pointless. Cost-effectiveness is also an issue.
Because of the almost disposable nature of the new nettops, my guess is that people will most likely want to get rid of them after only a year of use (some Eee PC 701s are already going up for sale cheap on the net), which has me questioning whether it's better to buy a $450 MID and have a so-so user experience for a year, or to shell out under $900 up front on a REAL laptop and enjoy its full-sized keyboard, screen, and greater performance for two-plus years until it becomes sluggish. Part of me just wants to empty my secondary savings account and buy a Dell Mini 9 to run OS X on, but the other part tells me that it's a pretty stupid investment.

So, is minimal computing really the new hotness, or is it just a passing fad? I guess only time will tell, but in any case, I'm just happy that computers are finally becoming something that everybody can afford.


Check out the original article with replies from the Overclock.net community on my OCN blog.