
Ion Platform For Atom Tested With Games, HD Video

timothy posted more than 5 years ago | from the setting-the-bar-moderately-high dept.


J. Dzhugashvili writes "Nvidia has already pulled the curtain off its Ion platform, which couples GeForce 9400 integrated graphics with Intel's Atom processor. But how does it perform? The Tech Report has taken the tiny Ion reference system for a spin in games and video decoding to see if the GeForce GPU really helps. The verdict? 1080p playback is actually smooth, and the whole system draws only 25W during playback. Fast-paced action games are another story—Half-Life 2, Quake Wars, and Call of Duty 4 are all choppy with a single Atom core and single-channel RAM, although they do run. TR concludes that Ion is nevertheless a clear improvement over Intel's 945G chipset, especially since Nvidia doesn't expect Ion-based Atom systems to cost significantly more than all-Intel ones." Update: 02/04 09:14 GMT by T : HotHardware is one of several other sites offering performance benchmark numbers on the new chipset.

115 comments

But the real question is- (2, Funny)

Gizzmonic (412910) | more than 5 years ago | (#26713099)

Does the Atom processor make the Internet faster? Because if not, I'm going back to a P4!

Re:But the real question is- (3, Informative)

frieko (855745) | more than 5 years ago | (#26714261)

If I may take a moment to be a smartass: Assuming your P4 has a 25 watt power supply, the internet is about infinity times faster.

Re:But the real question is- (1)

default luser (529332) | more than 5 years ago | (#26717367)

Does the Atom processor make the Internet faster? Because if not, I'm going back to a P4!

You're thinking of the Pentium !!! [suite101.com], which made the interwebs your bitch.

Damn (1)

Jaysyn (203771) | more than 5 years ago | (#26713193)

Looks like I didn't wait long enough to get the netbook.

Re:Damn (1)

Nursie (632944) | more than 5 years ago | (#26713527)

I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU-intensive, or if I want a bigger screen. Or for typing anything other than the odd email/slashdot post.

What happened to the dual core Atom chips?

Re:Damn (1)

vux984 (928602) | more than 5 years ago | (#26715295)

I'm very happy with my eee901. I've debianised it and it's replaced my "big" 13-inch Vaio for casual use. I still use the Vaio for anything CPU-intensive, or if I want a bigger screen. Or for typing anything other than the odd email/slashdot post.

So it's 'replaced' your Vaio for casual use... yet you use the Vaio for anything CPU-intensive, when you want a bigger screen, or if you're typing something longer than a /. post. Serious question: what does that leave?

I've been eyeing netbooks myself, but am having trouble assessing whether I'd actually use one... I suspect I'd practically -always- want the bigger screen, more comfortable keyboard, etc.

I currently have an iPod touch, and really, I almost NEVER use its web capabilities; the laptop is only a few feet away, and as good as the iPod touch browser is (and it IS good), I find I prefer the laptop. I only use the touch if I want to look something up when I'm out, because it's in my pocket and the laptop is at home. If I had a netbook... it wouldn't fit in my pocket, and thus would never be more than 20 feet closer to me than my laptop... and if I'm going somewhere where I need an ultra-portable, either I can bring the laptop... or the iPod touch...

I'm not criticizing your choice. I'm just trying to figure out the niche where the netbook comes into its own... without buying one to see where/if I'd actually use it. :)

Re:Damn (2, Interesting)

Nursie (632944) | more than 5 years ago | (#26716177)

"Serious question: what does that leave?"

Email.
Slashdot.
Flashing Neo Freerunner with stuff.
Using it as a terminal into my servers for maintenance tasks.
Music & Movies (on the plane or sometimes hooked up to a big LCD)

Err...

Taking it places I wouldn't take a decent laptop. Or places I wouldn't think to take a normal one, but it's small enough to throw in the bag.
Seeing something on tv and wanting to look it up on wikipedia NOW and every other computer is out of reach and takes ages to boot...

I don't know if it's for everyone, but I use it all the time. To the extent I feared I'd stop using my vaio completely for a while. Then I remembered that I occasionally type letters or play games. Or do the odd bit of crypto-related programming.

Re:Damn (1)

Nursie (632944) | more than 5 years ago | (#26716439)

To answer another bit of your post: the niche, for me, is fast boot (wakeup from suspend is *damn* quick), very portable, but also fully featured. It's not a mobile-phone screen; it can do most of what a larger and more powerful machine can do. That's it, really.

It's just another debian device...

You may wish to take into account that I very nearly bought a Sony Vaio TZ a couple of years back, so *really* small is something that appeals to me.

Re:Damn (5, Insightful)

Midnight Thunder (17205) | more than 5 years ago | (#26713743)

Looks like I didn't wait long enough to get the netbook.

You can never wait too long to get the ultimate configuration, but there is only so long you can wait to have something to use.

Re:Damn (2, Informative)

Chabo (880571) | more than 5 years ago | (#26713887)

Right. The only way you can really be screwed by new hardware coming out is if you buy right before a price reduction. If you pay attention to the market, you pretty much know when those are going to happen. Athlon X2 price drop when Conroe was released, Penryn price drop when Phenom II was released, etc.

Re:Damn (1)

Lord Ender (156273) | more than 5 years ago | (#26714923)

Yeah, I have an Atom netbook with the Intel GPU, and it is a little slow for full-screen HD video (Hulu). I would love it if an NVIDIA GPU were an option, even for significantly more money.

On the other hand, I got a netbook so I could escape the distractions of TV, video games, and home; and instead escape to a café where I can actually get work done.

first Gen (1)

xenolion (1371363) | more than 5 years ago | (#26713229)

First-generation chips will be slow, but give them time; Nvidia will make this the budget video chip worth having in laptops and more. Anything over Intel's video is a major step up. Let the war begin....

nVidia is doomed. (4, Interesting)

tjstork (137384) | more than 5 years ago | (#26713253)

I hate to say it because they do good work, but I think nVidia as it is today is ultimately doomed. Everyone rips Intel's integrated 3D graphics, but they just keep getting better every year. Although AMD should have bought nVidia instead of ATI, they do own ATI, and so have a pretty good graphics system of their own. Eventually, both AMD and Intel are going to wind up with 3D calculations on the die in some fashion, and where does that leave nVidia?

Re:nVidia is doomed. (3, Informative)

Bearhouse (1034238) | more than 5 years ago | (#26713381)

Insightful. If one looks at the post here today:

http://hardware.slashdot.org/article.pl?sid=09/02/02/2344208 [slashdot.org]

about the new Acer with Intel's highly integrated N280/GN40 chipset, you've got to wonder about the long-term viability of nVidia.

Re:nVidia is doomed. (1)

jebrew (1101907) | more than 5 years ago | (#26714115)

I've got an e-machines laptop with that duo (see Woot sometime last week).

I've got to say, I didn't have terribly high hopes going in (~$300... and e-machines), but it plays 720p content just fine, edits my source code with very little lag, and can actually play back 1080p content well enough (when plugged into my TV... so the 1080 is actually useful).

Still sucks for games...but I don't play much.

Re:nVidia is doomed. (1)

LoRdTAW (99712) | more than 5 years ago | (#26715055)

x86-64 is an open standard. Who's to say they couldn't jump in with an x86 CPU? Yes, it sounds far-fetched, but VIA has CPU tech and very poor video. Maybe a future Nvidia/VIA mash-up is what they need. It would be nice to have some decent embedded competition.

An all-in-one CPU/GPU chip could house a dual-core N-VIA processor, memory controller, and GPU core, with a standard IO interconnect and a compact south bridge for SATA/Ethernet/sound/USB etc. Not very memory-bandwidth oriented, but nonetheless perfect for compact PCs and portables.

Re:nVidia is doomed. (0)

Anonymous Coward | more than 5 years ago | (#26715543)

Even with Nvidia + VIA they would still lose:

Intel: good CPU, bad graphics
AMD: good CPU, good graphics
Nvidia: bad CPU, good graphics

Anybody still think AMD buying ATI was a mistake? Right now Intel has a slight edge on CPU and Nvidia has a slight edge on graphics, but if AMD can execute they are in a better position than any competitor.

Re:nVidia is doomed. (1)

DesertBlade (741219) | more than 5 years ago | (#26715749)

AMD in Linux - Pain to set up (especially Dual monitors)

NVIDIA in Linux - Super easy

Re:nVidia is doomed. (1)

Jorophose (1062218) | more than 5 years ago | (#26720419)

Not any more. nVidia breaks kernels. AMD has a free driver for pretty much all of their GPUs.

nVidia + VIA would be nice, if they weren't morons and hadn't dropped the platform.

Re:nVidia is doomed. (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26713681)

nVidia is in no way hurting, nor will it be in the foreseeable future. Consumer entertainment graphics cards are a big slice of their pie, no doubt, but even if that slice were to go away, nVidia is where professionals turn for high-end data modeling etc.

http://www.nvidia.com/object/tesla_computing_solutions.html

We're talking TFLOPS of GPU power in a 1U rackmount.

Re:nVidia is doomed. (2, Insightful)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#26714659)

I agree that Nvidia has a good slice of life left; but do remember: Nvidia (as well as ATI) got into high-performance workstation stuff, and undercut the super-esoteric stuff, in large part because they could amortize much of the R&D cost over huge numbers of consumer parts. There are niches where totally custom, esoteric high-end stuff can survive, even prosper; but trying to survive exclusively on high-end stuff is an ugly business.

The history of computing is littered with the corpses of high end outfits who were devoured by their cheap junk competitor's gradually improving parts, and rapidly improving price/performance ratio.

Re:nVidia is doomed. (1)

Phortune (1455837) | more than 5 years ago | (#26713703)

Bring Intel's Larrabee into the equation and things get a whole lot more uncomfortable for the Green Team... Maybe the days of discrete graphics solutions are over, but expect Larrabee to give nVidia a jolly good thrashing in the market before on-die becomes the status quo. Ray tracing, anyone?

Re:nVidia is doomed. (1)

Kjella (173770) | more than 5 years ago | (#26713903)

Eventually could take a very long time... Then there are consoles, which are also a big market to fight for. Putting the CPU and GPU, the two biggest power draws in a modern computer, on the same die isn't exactly without drawbacks. Sure, latency would improve, but that means more memory lines, more power lines, and more heat to dissipate from the same area. Given AMD's breathing problems on the CPU side, I'd say the GPU wars are in much better shape than the CPU wars.

Re:nVidia is doomed. (0)

Anonymous Coward | more than 5 years ago | (#26713917)

Yes, Intel and AMD are both making strides towards embedding good graphics on chip, but nVidia isn't standing still waiting for it to happen. They're fighting back, making GPUs more useful, more general purpose. So Intel is adding more GPU power to their CPUs, and nVidia is adding more CPU power to their GPUs. Where will that leave the market 5 or 10 years down the road? I don't know, but nVidia is certainly not the walking dead.

Re:nVidia is doomed. (1)

chipace (671930) | more than 5 years ago | (#26714341)

ION is about a very targeted market (mobile gamers and HD enthusiasts), who are very willing to pay 2x-3x the profit (not the cost) for a mobile gaming/HD product.

I see Nvidia as having a good future, as they are listening to their customers, and not trying to predict the market.

I am not interested in ION at all, but it delivers the goods to those that want it.

Re:nVidia is doomed. (0)

Anonymous Coward | more than 5 years ago | (#26714395)

nVidia/ATI will end up going the way of Creative. It used to be that to get any sort of decent sound you were required to buy a PCI sound card. I'm out of the hard-core gaming scene, but I don't know anyone that uses anything but integrated sound. When I can get 7.1 sound from my motherboard, why would I consider buying something else?

The purchase of ATI by AMD was a great move. Now I can buy my 780G based motherboard and get HD HDMI out (with sound) on an $85 motherboard that is capable of decoding a Blu-Ray movie without any stutters. For the basic user, this is more than enough power, and will even let them play some games.

Re:nVidia is doomed. (3, Insightful)

Joe U (443617) | more than 5 years ago | (#26715327)

nVidia/ATI will end up going the way of Creative. It used to be that to get any sort of decent sound you were required to buy a PCI sound card. I'm out of the hard-core gaming scene, but I don't know anyone that uses anything but integrated sound. When I can get 7.1 sound from my motherboard, why would I consider buying something else?

Creative seriously fucked up the sound card market to try and corner it and wound up destroying audio on the PC. Most of the serious competition got bought up or put out of business by Creative's 'win by any means necessary' plan.

Re:nVidia is doomed. (2, Informative)

Chabo (880571) | more than 5 years ago | (#26717291)

According to Anandtech, currently Creative still has the best game compatibility, because the game devs write to their cards, but Asus' Xonar line has better sound quality, and nearly the same level of game compatibility. I know if I were to build a new machine I'd take their advice on that, what with Creative's driver troubles, especially on x86-64.

http://anandtech.com/guides/showdoc.aspx?i=3497&p=5 [anandtech.com]

Based on Valve's stats [steampowered.com], it looks like only about 3.5% of Steam users have an X-Fi card. I do know a large portion of people who were wary of the X-Fi series, though, and kept buying Audigys, and people like me who kept their "Creative Live!" cards, which are likely a good portion of that 33% with "other" sound devices.

GP is right though; most people are perfectly happy with onboard sound. This is especially true in the laptop market, which last I heard was now well over 50% of total computer sales.

Specifically. (1)

tjstork (137384) | more than 5 years ago | (#26720101)

Creative's 'win by any means necessary' plan.

We're talking about how Creative went and bought E-mu, then turned around and shut off the flow of chips to Turtle Beach. There was a nice little competition there, and Creative just pissed all over it with a pretty sleazy play. I don't feel bad about Intel and Microsoft screwing Creative out of the equation at all.

Re:nVidia is doomed. (0)

Anonymous Coward | more than 5 years ago | (#26715105)

Everyone rips Intel's integrated 3D graphics, but they just keep getting better every year

At this rate they'll be viable in about 10 years.

Re:nVidia is doomed. (1)

TheRaven64 (641858) | more than 5 years ago | (#26715401)

nVidia don't just make GPUs for x86. They also have a very advanced SoC incorporating a multi-core ARM CPU and an nVidia GPU. The version aimed at Netbook-type systems draws under 4W for the entire SoC.

Re:nVidia is doomed. (0)

Anonymous Coward | more than 5 years ago | (#26715435)

nVidia > AMD.
AMD trying to buy nVidia?! HAHAHAHAHHAHAHAHA

Re:nVidia is doomed. (1)

h3llfish (663057) | more than 5 years ago | (#26717227)

It wouldn't surprise me one bit if you were correct, and nVidia ended up going the same way as 3Com and other vendors whose products ended up being integrated into the motherboard. But I think it's also a pretty valid argument that there will continue to be space at the top of the food chain, selling cards that appeal only to the more hardcore gaming enthusiast. There are just too many gamers out there with plenty of money and a love of bleeding-edge hardware.

Re:nVidia is doomed. (1)

p0tat03 (985078) | more than 5 years ago | (#26717725)

Except that the performance of Intel's integrated graphics is still junk. NVidia's 9400M chipset at least offers decent performance, and IMHO is poised to take a lot of market share from products that are currently Intel's.

AMD? AMD is a has-been. They bought ATI how long ago? They've been promising a CPU-GPU hybrid for years now, and it's always "just around the corner". As far as I'm concerned, I'll believe it when I see it, because AMD doesn't look capable of delivering on that promise.

The days of non-accelerated graphics are over. As we've seen with OS X and now Vista, the requirement for graphics horsepower is no longer limited to the few gamers out there; the OS now demands some of that power for itself as well. The 9400M IMHO offers the right amount of performance, whereas Intel's X3100 and the previous GMA950s (which a lot of platforms are still using) are simply too slow to get the job done.

That being said, I don't get why TFA complains about the inability to play HD movies on a $300 netbook. Seriously, it's a *netbook*. It was *specifically invented* to be a low-performance but cheap and highly mobile laptop. If you want to play HD movies and run Photoshop or whatever, get a real laptop!

Re:nVidia is doomed. (1)

ChunderDownunder (709234) | more than 5 years ago | (#26718465)

X3100 is old technology; I have a 4500MHD [intel.com] in my 12" notebook. I haven't tested the performance much, but it's supposed to support Blu-ray and 1080p. I haven't seen it in a netbook yet, though.

Re:nVidia is doomed. (1)

jedidiah (1196) | more than 5 years ago | (#26718499)

Playing a movie is not a terribly interesting problem. Neither is Photoshop anymore, for that matter.

Playing a movie only becomes interesting because the number of pixels being pushed around has increased dramatically at the same time encoding methods have gotten more sophisticated.

All of the effort spent on speeding up games can probably be easily re-purposed for things like "simple movies" if you're just a little clever about it. That's all this really is.

Nvidia, as usual, is just being smarter than the average bear. ...and it would be nice if a $300 PC could keep up with a $300 (HD/BD)DVD player.

Nvidia is by no means doomed. They just made it a no-brainer for companies like Apple or HP to replace nv7x00s and nv6x00s in their current products with nv8x00s and nv9x00s.

Re:nVidia is doomed. (1)

p0tat03 (985078) | more than 5 years ago | (#26718685)

and it would be nice if a $300 PC could keep up with a $300 (HD/BD)DVD player.

Your HD/BD player doesn't come with a screen, keyboard, battery, and all the trimmings... this isn't even a fair comparison. At this stage expecting a netbook-level device to handle HD movies is simply ridiculous.

And that's precisely my point: all the effort speeding up games is easily re-purposed for playing HD movies, and this is in fact *why* NVidia will succeed where Intel fails. Intel has always made poor-performing integrated solutions on the expectation that only gamers need performance. The reality now is that *everyone* needs a little of that gaming performance.

Re:nVidia is doomed. (1)

polywaffle (827427) | more than 5 years ago | (#26718155)

Intel's 3D graphics have been a joke for the longest time. While they might be getting better every year, they still have a lot of catching up to do.

What about power consumption? (4, Insightful)

hattig (47930) | more than 5 years ago | (#26713259)

How does the Ion chipset compare in power consumption with the mobile 945 used in netbooks (the 6W TDP one, not the 20W+ TDP desktop variant that's a total joke)?

25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.

Still, if they can get 8 hours out of a 6 cell battery in a netbook with it, great. It's a far far far more advanced chipset than the Intel crud.

Re:What about power consumption? (1)

drinkypoo (153816) | more than 5 years ago | (#26714951)

25W for CPU, Chipset, HD, Memory, motherboard doesn't seem as low as it could be.

That's the power consumption while playing HD video. Even the little Roku box is supposed to peak over 5W, and ALL it does is play video; it has no storage to speak of and no GPU to speak of (but it does have a dedicated video decoder). Running the LCD backlight is probably one of the big loads, but using a general-purpose GPU (it's not just for pushing pixels any more, after all) in this application is necessarily going to hurt power consumption.

I think it's pretty fantastic for what it is. The best questions are: is this the best way to play HD video, and if not, is it really worth the power consumption?

Re:What about power consumption? (1)

TheRaven64 (641858) | more than 5 years ago | (#26715437)

Maybe 1080p is really processor-intensive, but the OMAP 3530 can decode 720p H.264 with a power draw of under 1.8W for the CPU/GPU/DSP, flash, and RAM. If the WiFi and TFT are drawing less than 20W, then this number is not very impressive.

Re:What about power consumption? (1)

slapys (993739) | more than 5 years ago | (#26719531)

That's the power consumption while playing HD video. Even the little Roku box is supposed to peak over 5W, and ALL it does is play video; it has no storage to speak of and no GPU to speak of (but it does have a dedicated video decoder). Running the LCD backlight is probably one of the big loads, but using a general-purpose GPU (it's not just for pushing pixels any more, after all) in this application is necessarily going to hurt power consumption.

Can't we get a dedicated chip for the pipeline of network packets -> encoded video data -> pixels -> screen yet? There's a reason some solutions are implemented in hardware rather than software. If my iPod can play music for 14 hours on a tiny battery, I should be able to stream video off Hulu without murdering my netbook's battery life. This seems like a common enough application that the CPU could offload the work to a specialized chip. Now that we're in the age of open-source browsers, web sites could even tell Firefox or Opera about streaming video somehow to kick off the process.

P.S. Some background: I'm a software engineer (just graduated, actually!), but I have a tremendous amount of respect for the EE people who can whip up a DSP chip. I just bought an EEE 1000HA and I use it at least four hours a day. It's my only computer at home, and I hook it up to my car to play music. Maybe I should get out more.

Valve games (2, Interesting)

Chabo (880571) | more than 5 years ago | (#26713265)

Well of course Half-Life 2 is choppy on the platform -- the Source engine is very CPU-intensive. Almost every system is going to be CPU-bound with Valve games, unless you happen to be running a Core i7 with an entry-level video card at 1920x1200. As for the other games, you're still running on integrated graphics, and there's only so much you can do before you need a separate card.

Disclaimer: I work for Intel, but I've been a fan of Valve's games for much longer than that.

Re:Valve games (1)

800DeadCCs (996359) | more than 5 years ago | (#26713441)

I've been running HL2 on an Atom 330 with a PCI ATI 2400 (note: old-school PCI, not PCIe), and except for a little stutter when levels start, it's been great. Then again, I'm running 1024x768 (the monitor's max res).

I do wonder why everyone is so gung-ho about the single-core Atom... isn't the dual-core only about $5 more? That's worth it for me... add on an Nvidia chipset (please do CUDA) and it'd be beautiful.

Re:Valve games (1)

Chabo (880571) | more than 5 years ago | (#26713747)

Plus most hardware review sites tend to say "If you can't play it at full settings, it's not worth it." ;)

I first played HL2 on a GeForce 4 MX 440, at 640x480, on minimal settings. That was alright, but the real problem was that if you set the game to DX7 (which is all the GF4 was capable of), the draw distance for models was greatly reduced. That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, because you couldn't see any boxes more than 15 feet away unless you zoomed in, and you couldn't use the grav gun while zoomed.
Zoom in, find an object, zoom out, pull it, zoom back in to see if you lost it, repeat.

However, other than that chapter, the game ran fine, and I still had an enjoyable experience. If that card had been capable of running DX8, I would've had just as much fun as with a top-of-the-line card. If the designers are smart, as Valve's people are, the game experience will not be affected by hardware.

Re:Valve games (1)

Hal_Porter (817932) | more than 5 years ago | (#26713961)

That meant the chapter where you had to use your grav gun to get objects to stand on top of the sand was a real pain, because you couldn't see any boxes more than 15 feet away unless you zoomed in,

BAM! Antlioned.

and you couldn't use the grav gun while zoomed. Zoom in, find an object, zoom out, pull it, zoom back in to see if you lost it, repeat.

Sorry, did I interrupt your train of thought?

Re:Valve games (1)

kenh (9056) | more than 5 years ago | (#26714869)

I agree - the dual-core (four pseudo-cores if you count Hyper-Threading) Atom CPU is very capable. I'm not a gamer, but I did get the Intel D945GCLF2 board with the dual-core CPU, Gigabit LAN, and dual SATA ports for a test/trainer 64-bit Windows Server 2008 machine, and it works great. Is it the fastest machine I own? No, but as a build-up/test/tear-down box it works very well, and if I leave it on, the power drain is minimal. I do wish it had something other than a PCI slot (PCIe or even PCI-X would be a big plus)... At under $80 for motherboard, CPU, and cooling solution, it is a good deal.

Re:Valve games (1)

Choad Namath (907723) | more than 5 years ago | (#26715549)

I do wonder why everyone is so gung-ho about the single-core Atom... isn't the dual-core only about $5 more? That's worth it for me... add on an Nvidia chipset (please do CUDA) and it'd be beautiful.

Intel is restricting the dual-core Atom to desktops, AFAIK. Probably something like 80-90% of Atoms are in netbooks, so the dual-core Atom is not an option for most people, for now at least.

Intel will hate it. (1)

LWATCDR (28044) | more than 5 years ago | (#26713277)

This is a shift away from the CPU to the GPU, and Intel will hate it.
This, or even the plain Atom, is good enough for a very large percentage of users.
This would work for just about every office PC, average home user, and media center.
About the only tasks this won't work for are media editing, gaming, and heavy technical use.

The one problem I see with it is the cost. That extra money is a big percentage of the price of one of these mini systems.
I so want one.

Re:Intel will hate it. (1)

Midnight Thunder (17205) | more than 5 years ago | (#26713889)

This is a shift away for the CPU to the GPU and Intel will hate it.

Well, they should have seen the writing on the wall. It is much easier to make an efficient specialised processor than an efficient generic one. With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL). Note that I see OpenCL being beneficial for the physics side of games. If Intel has any sense, they will either improve their graphics chips or invest in Nvidia.
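
To make that concrete, here is a minimal sketch in C of what addressing that second chip could look like from the host side: it just enumerates the GPUs an OpenCL runtime exposes, the first step before a game could hand physics kernels to one device while rendering on the other. It assumes an OpenCL 1.0 SDK with the standard CL/cl.h header; the array sizes are arbitrary.

<ecode>
/* List every GPU an OpenCL runtime exposes.
 * Build (with a vendor SDK installed): cc list_gpus.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        /* Ask each platform only for GPU-class devices. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devs, &ndev) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256];
            cl_uint units = 0;
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                            sizeof name, name, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof units, &units, NULL);
            printf("GPU %u.%u: %s (%u compute units)\n",
                   p, d, name, units);
        }
    }
    return 0;
}
</ecode>

In the dual-chip scenario above, the application would simply create one context and command queue per device, keeping graphics on one while queuing compute kernels on the other.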

Re:Intel will hate it. (1)

drinkypoo (153816) | more than 5 years ago | (#26715425)

With OpenCL on the horizon, don't be surprised to see computers with dual graphics chips (one used for OpenGL/DirectX and the other for OpenCL).

I believe that you are incredibly wrong - thank goodness. All you have to do is look at history to see that computers have been steadily moving towards commoditization. Even the last Cray machines were just bundles of commodity processors with some (very important and significant) glue.

Intel is simply going to put more of the functionality of GPUs into its CPUs. Meanwhile they are talking about a future with "thousands of cores" in the computers of mainstream users. While that is clearly a long way off if it is coming at all, eight-way processors cannot be far off and you can already build a dual quad. Since processors are getting better at turning off unused functional units and such, this greater parallelism is a valid approach to computing into the next decade and beyond - after all, programs are only getting more multiprocessor-aware.

Re:Intel will hate it. (2, Informative)

forkazoo (138186) | more than 5 years ago | (#26717713)

Intel is simply going to put more of the functionality of GPUs into its CPUs. Meanwhile they are talking about a future with "thousands of cores" in the computers of mainstream users.

Actually, AMD is out front on putting GPU functionality into CPUs with the "Fusion" platform. Intel is taking the long way around with Larrabee and putting x86 into the GPU. Go figure. Anyhow, the end result will be to reduce chip counts, take advantage of the high number of transistors that can cheaply be put on a single chip, and integrate as much as practical.

Personally, I'm surprised we haven't seen more in the way of SoCs marketed at the desktop yet. I'm sure it'll come, and you'll just get a motherboard with PCIe slots, DIMM slots, and a CPU socket, but no chipset or anything soldered on.

Re:Intel will hate it. (1)

LWATCDR (28044) | more than 5 years ago | (#26719903)

Actually, I am not so sure.
For one thing, supercomputers often do include coprocessors for things like vector ops.
GPUs are becoming more and more important, not less and less. Intel has failed to make a good GPU; why is up for debate. Frankly, I think CPUs have for now reached a good-enough level for most people. Now they want lower power, heat, and size, and at best HD playback.
As for gaming, take a look at the Xbox 360 and PS3. They are very GPU-heavy and pretty CPU-light. Even the Cell's SPEs are really more like GPUs than a general-purpose CPU.

Re:Intel will hate it. (1)

Lord Ender (156273) | more than 5 years ago | (#26715903)

Intel has claimed that they will get rid of the need for GPUs by adding GPU cores to their CPUs. NVidia has claimed that CPUs don't matter much anymore, and that their GPUs are what consumers really need to go forward.

Time will tell who is right, so I own both INTC and NVDA ;-)

A reasonable start (5, Insightful)

abigsmurf (919188) | more than 5 years ago | (#26713427)

Games performance isn't really the issue for these. These things aren't designed for games.

What these are best used for is Media Centre setups. However, it doesn't play all 1080p content smoothly, which is a major issue. There are plenty of options for this kind of thing: the Popcorn Hour, the WD TV box. Those are good to a point but fall down on format support, especially MKV, which doesn't have full subtitle and codec support on either.

The current best option is an energy-efficient Athlon-based setup. These cost about $75-$100 more than an Atom system and use a bit more power, but they'll play back any video you throw at them without dropping frames.

Maybe with a dual-core Atom and dual-core-optimised codecs this will reach the goal of never noticing a dropped frame, regardless of format and bit rate, but this Atom solution still isn't the media-center beast it could be.

Re:A reasonable start (1)

CannonballHead (842625) | more than 5 years ago | (#26713657)

Games performance isn't really the issue for these. These things aren't designed for games.

Totally agree. Who is going to be playing Half-Life 2 on an 8" or 10" screen? =P

Maybe some other games, but an FPS? With a trackpad? (Who wants to lug around a mouse with their ultra-portable netbook?)

Re:A reasonable start (4, Interesting)

Midnight Thunder (17205) | more than 5 years ago | (#26713969)

Totally agree. Who is going to be playing Half-Life 2 on an 8" or 10" screen? =P

Certainly, but looking at platforms such as the PSP, Nintendo DS, and iPhone, we can see that there is a market for games that take advantage of small-format screens. While Half-Life 2 won't be targeted at these platforms, there are already FPS games for some of them, though we are more likely to see an Nvidia + ARM combination than an Nvidia + x86 combination, simply because of battery limitations.

Re:A reasonable start (1)

CannonballHead (842625) | more than 5 years ago | (#26714373)

I might be wrong, but I think any given netbook is going to be pretty capable of running PSP/DS/iPhone-class games fairly well. The, for lack of a better term, computing power of an iPhone vs. an MSI Wind has to be tilted towards the Wind. I don't think Half-Life 2 is going to be running on an iPhone anytime soon :)

Re:A reasonable start (0)

Anonymous Coward | more than 5 years ago | (#26715013)

It might not really be a single-core Atom as they seem to suggest in TFA. They mention an Atom 330 processor onboard, which is actually a dual-core part. Later on they start talking about a single-core Atom, which might just be a mistake?

Either way, even this setup leaves a lot to be desired.

Re:A reasonable start (1)

the_B0fh (208483) | more than 5 years ago | (#26715687)

So, what's a good energy-efficient Athlon setup? With dual core? And 64-bit?

I used to know both Intel and AMD CPUs quite well, but that was years and years ago, when things still made sense. Nowadays I keep putting off buying something because I can't figure out what's what!

Re:A reasonable start (1)

mrsalty (104200) | more than 5 years ago | (#26716913)

I have to disagree with you on this point. I replaced an Athlon64 3800+ machine with a Popcorn Hour because the Athlon simply did not have the power to play back 1080p content. It could manage 720p no problem, but would stutter on 1080p enough to make it unwatchable. It was also far harder to set up and use MythTV than it is the PH. Sure, the PH has its flaws, but it works out of the box with a remote and plays nearly any media you stream at it. Also, if you are starting from scratch, it is cheaper.
-mrsalty

ohshi- (0, Troll)

Anonymous Coward | more than 5 years ago | (#26713749)

Stalin's submitting stories to Slashdot!!!

Re:ohshi- (0)

Anonymous Coward | more than 5 years ago | (#26713943)

How the fuck is this a troll? The submitter is J. Dzhugashvili, which is Stalin's real name. Fucking hypersensitive American idiots who fail at life, in the classroom, and at economics. Get a sense of humour.

What's wrong... (1)

jd (1658) | more than 5 years ago | (#26713973)

...with using IBM's Sequoia for a graphics processor? Ok, they need to work on the price a little, and maybe it's a little bulky for most portables, oh and the power supply might need upgrading, but aside from that, what have the Romans ever done for us?

Re:What's wrong... (0)

Anonymous Coward | more than 5 years ago | (#26715031)

drink my goo, chelloveck

I'd buy some. (3, Interesting)

Big Boss (7354) | more than 5 years ago | (#26714035)

With recent developments in VDPAU, the HD-capable GPU acceleration for Linux, I could use this board. The only things I would change are to make it wider and move all the ports to the back. Include an LCD or VFD if you want to get fancy, and an IR receiver on the front. A perfect MythTV frontend machine. I would like dual-channel RAM, though, to help with 1080i playback.

Put it in a nice small case like those used for modern DVD players, and they'd have a winner.
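
For anyone curious what "VDPAU support" means at the code level, here is a minimal C sketch of the bootstrap a player like MythTV or mplayer performs before any accelerated decoding happens. It assumes the libvdpau development headers and a running X session; the 1080p H.264 profile is just an illustrative choice.

<ecode>
/* Probe for VDPAU hardware H.264 decode on the current X display.
 * Build: cc vdpau_probe.c -lvdpau -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    /* One exported call creates the device; every other entry
     * point is then fetched through get_proc_address. */
    VdpDevice dev;
    VdpGetProcAddress *get_proc;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy),
                              &dev, &get_proc) != VDP_STATUS_OK)
        return 1;

    VdpDecoderCreate *decoder_create;
    get_proc(dev, VDP_FUNC_ID_DECODER_CREATE,
             (void **)&decoder_create);

    /* Try to build a 1080p H.264 decoder on the GPU. */
    VdpDecoder dec;
    VdpStatus st = decoder_create(dev, VDP_DECODER_PROFILE_H264_HIGH,
                                  1920, 1080, 16, &dec);
    printf(st == VDP_STATUS_OK
               ? "1080p H.264 decode: accelerated\n"
               : "no hardware H.264 decode here\n");
    return 0;
}
</ecode>

From there the player feeds compressed bitstream buffers to VdpDecoderRender and hands the resulting surfaces to a presentation queue, so the Atom never touches the heavy lifting.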

Re:I'd buy some. (2, Interesting)

QuasiEvil (74356) | more than 5 years ago | (#26714111)

My first thought exactly: gimme, gimme, gimme, I need a new Myth frontend. Let's see: low power, good Linux-supported decompression acceleration, and an HDMI port. This is exactly what I've been waiting for.

Re:I'd buy some. (0)

Anonymous Coward | more than 5 years ago | (#26716049)

Well, I'm waiting for the Asus Eee Box B204 and B206 to arrive. They have an ATI Radeon HD 3450 with 256 MB of DDR2 memory and HDMI output.
The link. [asus.com]

Re:I'd buy some. (1)

blackpaw (240313) | more than 5 years ago | (#26718887)

Well, I'm waiting for the Asus Eee Box B204 and B206 to arrive. They have an ATI Radeon HD 3450 with 256 MB of DDR2 memory and HDMI output.

And ATI hardware-accelerated playback is *not* supported under Linux. They barely manage video playback via XVideo.

Re:I'd buy some. (1)

Clarious (1177725) | more than 5 years ago | (#26719987)

VDPAU still can't play a lot of video.

Another important thing: VDPAU can't use memory allocated by TurboCache, and the 9400M doesn't have enough dedicated RAM to decode HD video (not sure about 720p, but it certainly can't do 1080p).

25 Watts? (2, Informative)

Bruce Perens (3872) | more than 5 years ago | (#26714065)

25 watts during playback? Huh? This is more than twice what my current netbook with Intel graphics uses. It generally runs less than 11 watts, and I can get it to a bit less than 9 with some tweaking. I don't suppose anyone's seriously proposing an Atom platform that pulls 25 watts during playback.

Bruce

Re:25 Watts? (1)

ninjackn (1424235) | more than 5 years ago | (#26714255)

But Intel graphics can't do 1080p at 9W or 11W.

Re:25 Watts? (2, Insightful)

Bruce Perens (3872) | more than 5 years ago | (#26714357)

But Intel graphics can't do 1080p

Perhaps because there isn't optimized MPEG playback code for that chipset?

Part of the problem here is that on the desktop, Intel's vendors don't want great Intel graphics, they want to be able to sell up to an external display card. So, it's only the laptop platform that could drive an improvement in Intel graphics.

Re:25 Watts? (1)

gEvil (beta) (945888) | more than 5 years ago | (#26714503)

But Intel graphics can't do 1080p at 9W or 11W.

Not that it even matters on the 1024x600 screen on most netbooks (or the 1280x800 screen on a select few). Oh, you were talking about hooking it up to an HDTV monitor. Well, that's what the HTPC or game console is for. At the moment it really seems like the Ion platform is aiming for a niche that barely exists. Now, for a really low-powered HTPC, this might show some promise in another generation or two. But at the moment, I'll pass.

Re:25 Watts? (0)

Anonymous Coward | more than 5 years ago | (#26719161)

But Intel graphics can't do 1080p at 9W or 11W. Not that it even matters on the 1024x600 screen on most netbooks (or the 1280x800 screen on a select few). Oh, you were talking about hooking it up to an HDTV monitor. Well, that's what the HTPC or game console is for. At the moment it really seems like the Ion platform is aiming for a niche that barely exists. Now, for a really low-powered HTPC, this might show some promise in another generation or two. But at the moment, I'll pass.

There's nothing except Intel preventing these lower-power netbook/Atom platforms from improving on the HTPC niche. Paired with a decent GPU for decode/encode, an Atom platform uses less power and runs quieter and cooler than most HTPCs, and it's cheaper to boot. There's no reason we should have to wait around a generation or two. The netbook market is in its third generation now, and in this market Intel is moving like a sloth; it reminds me of the original Pentium days, when they didn't have much competition. They certainly don't want to eat into their full-sized laptop/desktop lines by actually releasing a performance part. I'd love to see AMD or VIA really enter or step up (respectively) the competition in this market.

Re:25 Watts? (1)

afidel (530433) | more than 5 years ago | (#26714277)

The Intel GMA950, which can do 1080p, pulls greater than 20W just for the chipset.

Re:25 Watts? (1)

default luser (529332) | more than 5 years ago | (#26718219)

For the last time, the chipset you are referencing is the DESKTOP variant, not the MOBILE variant.

Mobile 945: 7W TDP [intel.com].

945 Desktop: 22W TDP [intel.com].

Nobody in their right mind is going to put a desktop chipset in a netbook. So yes, the 25W consumed by this test platform (4W for the CPU + 20W for the chipset) IS significantly more than most netbooks (4W for the CPU + 7W for the chipset), and it raises the question of whether this is a viable solution for HD playback.

Re:25 Watts? (0)

TheRaven64 (641858) | more than 5 years ago | (#26715523)

The Atom platform is a joke. It's competing with ARM SoC solutions that draw under 2W and can play 720p H.264 in that power envelope, dropping to around 15mW if all you're doing is playing music. It's an order of magnitude worse than its competitors, but the massive Intel marketing machine keeps screaming 'look at us! We're relevant!' and the tech press believes them.

Unless you want to run Windows (and surely you don't, Bruce), there's no compelling reason to go with Intel for a portable.

Re:25 Watts? (1)

Bruce Perens (3872) | more than 5 years ago | (#26716313)

Well, there's a MIPS laptop somewhere out there, on fire sale for less than $200. But where would I get an ARM laptop? I'm currently using an Acer Aspire One, and I have two of the 9-cell batteries, and can get 8 hours out of each. This is sufficient for my traveling and public speaking. I'd be happy to try an ARM laptop, if it's physically as nice as the Aspire One, and has a VGA output (because I want to project from the Linux system, not whatever Windows system they have at the venue), and doesn't cost more than the Aspire One.

Game engines targeted to this platform? (1)

JSBiff (87824) | more than 5 years ago | (#26714175)

While off-the-shelf PC games might not work that great on this combo, I suspect there could be really good, beautiful-looking games created and fine-tuned just for such a platform. Someone could use this as the basis for a portable entertainment system to compete with the Nintendo DS, PSP, etc. An Xbox Portable, anyone? If you consider that Atom-based systems would typically have smaller screens, with resolutions of maybe 640x480 or 800x600 (or perhaps wide-screen aspect ratios with similar vertical resolution), I could see this being sufficient for such a portable entertainment system.

Re:Game engines targeted to this platform? (1)

xenolion (1371363) | more than 5 years ago | (#26714589)

I'm starting to think about some of the old classic games making something of a comeback on these machines. Doom, Quake, Warcraft? Or someone just creating a bunch of open-source games for them. Then again, I'm starting to show my age with the classic games. I think I'm going to start saving some cash.

Re:Game engines targeted to this platform? (0)

Anonymous Coward | more than 5 years ago | (#26715293)

Um. Two of the games you mentioned aren't 3D at all. Think more along the lines of, say, KOTOR. Games that aren't too old, still look great (to anyone who isn't a whiny child demanding shiny toys), but will run on modest hardware.

Re:Game engines targeted to this platform? (1)

sznupi (719324) | more than 5 years ago | (#26716209)

I would like such a standard to go even "lower", towards current netbooks as the base; there are already quite a lot of them, and the Poulsbo/new integrated-Atom ones won't be significantly more powerful.

But... there's either no need for it, or it won't bring anything noteworthy to the table; definitely not any big release, just casual games at most. And you can already find a lot of those that run nicely, plus a lot of really good older "big" releases.

Mac Mini (1)

idiotnot (302133) | more than 5 years ago | (#26714421)

And maybe the AppleTV? I wouldn't imagine a lot of mini owners are gaming on them, and if they can get the price down to something reasonable, why not?

I'll stick with my 45W Athlon X2/AMD 740G MythTV box until the AppleTV gets a tuner. But I would like to replace that aging G4 I occasionally use.

My new mythtv frontends!!!! (3, Interesting)

jhfry (829244) | more than 5 years ago | (#26714521)

I see this being the hot new frontend for MythTV. With VDPAU support for HD decoding, a fanless (or quiet-fan) design, an Atom processor, a bit of RAM, and an SD card for storage, I could make one hell of a nice tiny frontend.

I want one now!

An Atom-based Mac mini is a CPU power downgrade (1)

Joe The Dragon (967727) | more than 5 years ago | (#26715119)

An Atom-based mini is a CPU power downgrade, and there is no way it will be a good buy at $600; maybe $400-$500. And it should use DDR2, not high-cost DDR3 laptop RAM. And don't even think about not including a free Mini DisplayPort-to-DVI cable with it: $30-$100 more just to use your own display?

Apple may try to sell a system with this and 1GB of RAM at $600, but that will make the old mini still look good, even more so if it drops to $500 or less; DDR3 will just make it cost more, and the CPU is so slow that it is better to get more DDR2 at lower cost.

That would also tell Psystar: we can't even try to match or beat your hardware, but we can make the mini look even worse next to your system.

You can get a nice mid-tower for about $600 with a high-end AMD dual core plus a low-to-mid-range video card (much better than the 9400M) alongside 790GX onboard video, or just use the 790GX (64-128 MB of dedicated video RAM is cool, and faster than the 9400M) if you want to pay less.

Who cares about the Mac mini any more? (0)

Anonymous Coward | more than 5 years ago | (#26716275)

Except for people who are obsessed with only buying Apple-branded products, Apple has largely conceded the entry-level business to its faster and more dynamic competitors.

The Atom + Ion platform is better suited to the AppleTV (1)

willy_me (212994) | more than 5 years ago | (#26716935)

Assuming it plays 1080p correctly, the Ion platform would make for an excellent AppleTV. The only question is whether it will be cheap enough. And the extra horsepower of the Atom could allow the AppleTV to be used for other things.

There were rumors about Apple using the Ion platform in the mini, but I believe those to be false. The AppleTV appears to be a much more likely target.

mod 0 p (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26715265)

mutated tes7iclHe of Notorious OpenBSD

WD TV (1)

Pinback (80041) | more than 5 years ago | (#26719883)

I bought a WD TV on Friday, and it is going back to Fry's tonight. The fact that it couldn't "aggregate" my MP3s or play back two thirds of my MST3K AVI files was a deal-breaker.

It also failed to indicate what the problem was when I (unknowingly) tried to use a non-HDCP-compliant HDMI cable. (It allows selecting up to 720p output over HDMI, but the configuration menu keeps reverting to showing "composite" as the display type.) I figured it out by swapping cables with my Philips (HDMI-equipped) DVD player.

Remote sensitivity is marginal, and response to button pushes is sluggish. And even though it is Linux-based, WD went with a closed-source video decoder chip, etc.
