
AMD Releases Open Source Fusion Driver

kdawson posted more than 3 years ago | from the knock-yourselves-out dept.


An anonymous reader writes "Yesterday AMD released open source Linux driver support for their Fusion APUs, primarily for the first Ontario processor. As detailed on Phoronix, this includes support for kernel mode-setting, 2D acceleration, and 3D acceleration via both Mesa and Gallium3D graphics drivers."


Is Fusion any good? (0)

Anonymous Coward | more than 3 years ago | (#34319702)

Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?

Core 2 is dying; Intel's next on-board video is 9400M-level (2, Interesting)

Joe The Dragon (967727) | more than 3 years ago | (#34319790)

Core 2 is dying; Intel's next on-board video is at nVidia 9400M level, but it also locks out better on-board video.

Some of Apple's systems may not fit a full x16 PCIe video chip.

Apple may use an i3/i5 CPU with an added ATI or nVidia chip. But they don't like to use add-in video in their low-end systems.

Re:Is Fusion any good? (1)

gman003 (1693318) | more than 3 years ago | (#34319966)

IIRC, Fusion is aimed more at smartphones, tablets and maybe netbooks/nettops. A Core 2 Duo / nVidia 320M should still be significantly more powerful. They were once planning a desktop-grade processor, but that seems to have been cancelled.

Re:Is Fusion any good? (2, Interesting)

hedwards (940851) | more than 3 years ago | (#34320130)

I think there was some speculation that it could be used alongside the main GPU as in some of the newer multicard ones. Basically to do things like calculating what things are visible so that the processor doesn't have to send those over the bus. Normally the GPU itself does that after the data goes over the wire, doing it on chip would be a lot cheaper, and probably quite doable if you've got another chip that ends up doing most of the rest of the work. I suspect that they'll find a way of adding that sort of flexibility.

I'm not sure if that's something which AMD has any designs on, though I'd be shocked if they weren't.

Requires running the vertex shader first (1)

tepples (727027) | more than 3 years ago | (#34320482)

Basically to do things like calculating what things are visible so that the processor doesn't have to send those over the bus.

Calculating occlusion requires knowing where the points are relative to the camera's line of sight, which requires running the vertex shaders first. How much of a speed boost would result from running the vertex shaders on an on-die GPU and delegating pixel shading to the discrete GPU? I'd imagine that pixel shaders, which are run for each pixel, need a lot more time than vertex shaders.
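A back-of-the-envelope count makes tepples's intuition concrete. This is a minimal sketch with purely illustrative numbers (vertex count, resolution, and overdraw factor are assumptions, not measurements of any real game):

```python
# Rough per-frame invocation counts for vertex vs. pixel (fragment)
# shaders. All figures below are illustrative assumptions.

vertices_per_frame = 100_000          # a moderately detailed scene
width, height = 1920, 1080            # full-HD framebuffer
overdraw = 2.5                        # each pixel shaded ~2.5x on average

vertex_invocations = vertices_per_frame
pixel_invocations = int(width * height * overdraw)

ratio = pixel_invocations / vertex_invocations
print(f"vertex shader runs: {vertex_invocations:,}")
print(f"pixel shader runs:  {pixel_invocations:,}")
print(f"pixel/vertex ratio: {ratio:.0f}x")
```

On these assumptions the pixel-shading stage runs roughly 50 times more often than the vertex stage, so offloading only vertex work to an on-die GPU would shave a comparatively small slice off the discrete GPU's load.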

When AMD turns to 28nm production... (3, Interesting)

IYagami (136831) | more than 3 years ago | (#34319972)

Any chance Apple could use that for the next versions of Mac mini and MacBooks? Or is a Core 2 Duo with nVidia 320M still better than Fusion?

... according to Fudzilla.com

http://www.fudzilla.com/notebooks/item/20888-amd-apple-deal-is-28nm-notebooks [fudzilla.com]

"Fusion goes Apple 28 / 32nm
It all started here, when AMD’s Senior VP and Chief Sales Officer Emilio Ghilardi was brave enough to show an image of several Apple products in a Fusion presentation. After we wrote our part AMD was quick to deny it, perhaps a bit too quick, which gave us a reason to dig some more, only to find that we were on the right track.

We asked around and some sources close to Intel / Nvidia have denied the rumour saying that they know nothing about it. However, just a day later we managed to confirm that the leak is real and that Apple will indeed use Fusion, here.

Our industry sources have indicated that the deal will be announced at some point in 2011, that it will involve 28nm and 32nm Fusion parts, particularly Krishna, and that Apple plans to launch notebooks based on AMD chips. Apple is also not cold on Trinity 32nm Fusion parts.

The announcement can be as much as a year away, as 28nm parts won't materialise until the second half of 2011, and since AMD doesn't have a tablet chip, it won't happen in the iPad segment. At this point Apple doesn't plan to use any AMD chips in desktop or server parts, but in case Bulldozer impresses us all, maybe Steve might change his mind.

So if you like Apple and love AMD, start saving money, as roughly a year from now you should be able to buy an Apple notebook with a Fusion Krishna / Trinity class APU."

And if you want Fusion benchmarks, check the usual suspects:
http://techreport.com/articles.x/19981 [techreport.com]
http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked [anandtech.com]

Re:When AMD turns to 28nm production... (4, Interesting)

hedwards (940851) | more than 3 years ago | (#34320162)

One of the complaints I've had about Apple is that they don't have any products at all that use AMD chips. Not really a deal breaker, but I prefer AMD because for as long as I can recall they've had the best performance for the price. Sure, Intel is almost always faster, but just about anybody can be if they're not worried about price.

Re:When AMD turns to 28nm production... (4, Funny)

ifiwereasculptor (1870574) | more than 3 years ago | (#34320266)

And you think Apple customers are that worried about price?

Price of Android pod touch (0)

tepples (727027) | more than 3 years ago | (#34320542)

And you think Apple customers are that worried about price?

Of course they are. If you want a portable media player with an app store, and you live in the United States, an iPod touch 4 is a lot cheaper than an unlocked Android phone. As I understand it, there still aren't any Android-based portable media players that officially support Android Market.

Re:Price of Android pod touch (2, Informative)

SuricouRaven (1897204) | more than 3 years ago | (#34321084)

Unfair comparison: the Android phone is also a phone. You should be comparing equivalent products: an Android phone vs. the iPhone.
I'm not sure about Android portable media players, but there are tablets that could be regarded as equivalent in intended use to an iPad.

Re:Price of Android pod touch (1)

drinkypoo (153816) | more than 3 years ago | (#34321524)

I'm not sure about Android portable media players, but there are tablets that could be regarded as equivalent in intended use to an iPad.

In order for your objection to carry, you are going to have to show that there are Android-based media players with an app store which are also otherwise comparable to the iPod Touch and/or iPad. So far tablets don't have app store access. Perhaps in Gingerbread, but I wouldn't hold my breath.

Re:Price of Android pod touch (1)

Vancorps (746090) | more than 3 years ago | (#34322652)

See Archos: [archos.com] Their tablets and media players run Android; some even include dual-OS features to run Linux on them. They run on Cortex A8 processors.

Re:Price of Android pod touch (3, Insightful)

MBGMorden (803437) | more than 3 years ago | (#34321138)

That's the same argument fanboys always use to call Apple products cheaper. Hand-pick the specific criteria that must be included (app store) and excluded (an actual phone . . .) until you get just the right oddball combination of features that you can call it cheaper.

Meanwhile, when you compare the iPod Touch to other touch-screen media players, its pricing is atrocious, and Apple's laptops, desktops, and servers all fare equally poorly against their general competitors.

As a matter of fact, the only segment in which Apple competes well on price is the iPhone. It's about the same as other similar smartphones. Other than that, though? You're definitely paying your turtleneck tax.

Re:Price of Android pod touch (1)

AvitarX (172628) | more than 3 years ago | (#34321648)

Last time I specced an Apple desktop against any others, it was the cheapest.

It had to be compared to workstations, as there were no other brands matching Apple's in the desktop range.

Re:Price of Android pod touch (2, Informative)

MrHanky (141717) | more than 3 years ago | (#34322906)

Funny. Last time I bought a PC, the cheapest Apple option for my needs was the most expensive iMac. It would have cost twice what I paid, and performed worse. Apple simply isn't competitive in the midrange.

Re:Price of Android pod touch (2, Informative)

AvitarX (172628) | more than 3 years ago | (#34323682)

Just looked at their prices too, they've gotten worse.

And the crappy displays on the iMacs (maybe this has improved) kill them for serious use, leaving the cheapest desktop at $2,500, and it's only one CPU.

But trying to match their $5k computer at Dell runs $6k.

Re:Price of Android pod touch (1)

tepples (727027) | more than 3 years ago | (#34321914)

when you compare the iPod Touch to other touch-screen media players, its pricing is atrocious

What other touch screen media players can I try in stores, even those without an app store?

Re:Price of Android pod touch (1)

ADRA (37398) | more than 3 years ago | (#34322184)

Archos, Zune, etc. There are other little guys in the market as well, but I can't remember them off the top of my head. Archos has pegged its hopes on Android, so they'll probably get app store access, and Zune is tied to the Microsoft ecosystem for better or worse.

Re:Price of Android pod touch (1)

tepples (727027) | more than 3 years ago | (#34325110)

Archos has pegged its hopes on Android, so they'll probably get app store access

Any evidence of that? Archos has had Android media players out for years, but none have Android Market access. Archos even set up its own "AppsLib" store out of frustration with Google not opening up Android Market to devices other than telephones. But as far as I know, the selection on AppsLib is nowhere near that of Android Market.

Re:Price of Android pod touch (1)

h4rr4r (612664) | more than 3 years ago | (#34322542)

Why try it in a store? Buy it online and if you do not like it return it.

Welcome to the 21st century.

The 15% restocking fee (1)

tepples (727027) | more than 3 years ago | (#34325086)

Buy it online and if you do not like it return it.

And pay how many 15% restocking fees?

Re:Price of Android pod touch (1)

644bd346996 (1012333) | more than 3 years ago | (#34322258)

You don't need to handpick criteria in order to make Apple products seem cheaper. All you really need to do is include some criteria other than clock speeds and memory capacities. Geeks have a tendency to ignore advantages that aren't trivially quantifiable, even when those advantages have real monetary value to most consumers.

When you look at Apple's product line, you find that many products have no true head-on competitors. Most obvious are the iPod Touch, the iMac, and the Mac Mini. Those are products that clearly have a large market and large margins for Apple, so you would expect there to be some competitors trying to undercut Apple while matching them for features and capabilities. Instead, what you see are companies that try to undercut Apple by offering products that have significant disadvantages, such as ultra-small form factor PCs that only offer Atom processors and crappy Intel graphics, while still being bigger than the Mac Mini, or slightly larger boxes that are as fast or faster than an iMac, but when you add in the price of a good monitor, it ends up being several hundred dollars more expensive than the iMac, while lacking the convenience and not really offering much more in the way of upgradeability.

The only reasonable way to explain this is that all would-be Apple competitors lack either the engineering talent or the scale necessary to compete head-on with Apple's offerings. But if that's the case, then the "Apple tax" is no longer arbitrary - it is supported at least in part by very narrow and apparently natural monopolies Apple has in some niches.

Re:Price of Android pod touch (1)

h4rr4r (612664) | more than 3 years ago | (#34322518)

As I understand it, the Android Market is not a requirement for installing apps.

Re:When AMD turns to 28nm production... (4, Insightful)

C_Kode (102755) | more than 3 years ago | (#34321416)

And you think Apple customers are that worried about price?

Apple customers are going to pay a premium no matter what. It's Apple that wants the discount. The less Apple pays for the hardware, the larger the margin they get with each product. Apple's customers aren't going to see any discount, even if Apple's discount is $100 per processor to move to AMD.

Apple has $50B in cash. Considering what they sell, that's an absurd amount of money. Enough to buy Sony outright. It just goes to show you the enormous margins that consumers pay for Apple products. It's like Sun / Oracle / Cisco in the 90s, except these are consumers paying the outrageous margins rather than large, money-fat corporations.

Re:When AMD turns to 28nm production... (1)

qbast (1265706) | more than 3 years ago | (#34321424)

Or about performance?
It just has to be shiny and have an Apple logo. The rest does not matter.

Re:When AMD turns to 28nm production... (0)

Anonymous Coward | more than 3 years ago | (#34320448)

I prefer AMD because for as long as I can recall they've had the best performance for the price.

Seriously, not to flame Apple, but is "best performance for the price" a target they're going after?

Re:When AMD turns to 28nm production... (0)

Anonymous Coward | more than 3 years ago | (#34320916)

No, they're after maximizing margins per unit sold. That's what they do.

Re:When AMD turns to 28nm production... (1)

RyuuzakiTetsuya (195424) | more than 3 years ago | (#34321486)

AMD CPUs you mean? Apple uses Radeon parts.

As for why they don't go AMD, it's probably due to AMD's supply line for midrange CPU parts. Note that Apple doesn't sell Celeron chips; they sell C2D, Core i5, i7 and Xeon.

Re:When AMD turns to 28nm production... (1)

Daniel Phillips (238627) | more than 3 years ago | (#34322354)

For quite some time, Paul Otellini, a board member of Google, was also sitting on Intel's board, while Eric Schmidt sat on Apple's board. While this linking of boards is not in itself proof of anything, it is suggestive that some "out of band" communication may have been taking place. (In my opinion, Google, as a huge consumer of data center hardware, should be expected to avoid on its own recognizance all appearance of bias towards one processor manufacturer or another.)

Re:When AMD turns to 28nm production... (0)

Anonymous Coward | more than 3 years ago | (#34320544)

Wow - seems like AMD has forgotten the lessons of ATI. This isn't the first time ATI products were supposed to be in an Apple product. The last time, the deal got scuttled & Apple went with nVidia because ATI leaked that they were going to be the graphics chip on the new products before Apple announced it.

Re:When AMD turns to 28nm production... (1)

CODiNE (27417) | more than 3 years ago | (#34320818)

And if you want future Apple news to happen, don't leak it.

Steve Jobs has canceled actual products and ripped up supplier contracts for much less than this.

Re:Is Fusion any good? (1)

ceeam (39911) | more than 3 years ago | (#34320310)

The word is that they are seriously considering it at least. (And "the word" is the best you get when discussing Apple)

SALE ? (1, Interesting)

Anonymous Coward | more than 3 years ago | (#34319774)

Aren't there too many Phoronix stories lately? Are they on sale?

Re:SALE ? (1)

TheLevelHeadedOne (700023) | more than 3 years ago | (#34319940)

Not yet...Black Friday is still a few days off...

Fusion (1, Insightful)

Anonymous Coward | more than 3 years ago | (#34319990)

Fusion is going to be important. AMD will finally have a portable product that rivals Intel. Integrated video hardware is now commonplace on the desktop [dell.com] . Embedded AMD hardware is beginning to appear [innocoregaming.com] and Fusion will accelerate this.

Intel doesn't have a 3D core they can integrate onto the CPU die. Bottom line is AMD has an edge.

Re:Fusion (2, Insightful)

hedwards (940851) | more than 3 years ago | (#34320186)

Intel doesn't have a 3D chipset they can integrate onto the processor die largely because they'd need to have a competent 3D chipset to start with. They haven't gotten that right up until now, so it's not a shocker that they haven't managed to get a competent one on die.

Re:Fusion (4, Informative)

RightSaidFred99 (874576) | more than 3 years ago | (#34320858)

huh? [geek.com]

Double huh? [anandtech.com]

It's rare to read someone post something both factually and subjectively wrong at the same time in so few words. Congratulations.

Re:Fusion (1)

TheEyes (1686556) | more than 3 years ago | (#34321502)

1) Sandy Bridge doesn't exist yet, and won't until next year. It'll be great when it does exist, though.

2) The GP was probably talking about OpenCL support, which is generally lagging on Intel IGPs. Apple is into OpenCL in a big way, and the lack of OpenCL support was reportedly one of the reasons they stayed with the aging Core 2s (paired with NVIDIA GPUs) for the current generation of MacBook Air. Maybe we'll see OpenCL support next year with Sandy?

Re:Fusion (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34322484)

Well, he was saying Intel doesn't have a "3D chipset they can integrate onto the processor die" when in fact they have already done so. And if we're talking about released products, where's Fusion?

And actually their IGP is reasonable these days, even pre-SB. It's not fit for high res gaming, of course, but it's suitable for a majority of non-gaming needs.

I don't think SB will have OpenCL support, from my limited understanding Intel's GPU solution is less flexible than ATI/NVidia's.

Re:Fusion (1)

TeknoHog (164938) | more than 3 years ago | (#34321706)

I'm using an Atom D510 right now, with on-die integrated graphics. Of course, these are not GPUs in the modern sense of "general-purpose processing unit", but at least it is an Intel 3D chipset on the processor.

http://en.wikipedia.org/wiki/List_of_Intel_Atom_microprocessors#.22Pineview.22_.2845_nm.29_2 [wikipedia.org]

Fusion Driver (-1, Offtopic)

Stargoat (658863) | more than 3 years ago | (#34319996)

I read that incorrectly. I was really hoping this was going to be a fusion screwdriver. I could hop in my handy box-shaped spaceship and have some awesome adventures. But alas no, back to the office with me.

Time to move away from NVidia now? (5, Interesting)

erroneus (253617) | more than 3 years ago | (#34320032)

Long ago, I went with ATI video because it had the best support for Linux. Eventually, NVidia caught on to this trend and started supporting Linux too... and better than ATI. So I switched. Now NVidia has screwed the community that helped it grow in popularity by putting out "Optimus" hybrid graphics everywhere, then refusing to update their Linux drivers to support it and refusing to release any details about it either. So now, the best anyone has been able to do is disable the nvidia GPU to reduce power consumption in laptops unable to utilize the nvidia hardware.

As AMD/ATI is doing this, perhaps my next selection will be to the exclusion of NVidia (again).

When will these jerks ever learn? The future of computing is in embedded devices and those devices will run Linux. Get Linux developers using YOUR hardware and it will have a better shot at a prosperous future as well. So far, Intel and ATI are the only options.

Re:Time to move away from NVidia now? (4, Insightful)

hedwards (940851) | more than 3 years ago | (#34320236)

nVidia is the last man standing in a sense. Both Intel and ATI (Obviously now owned by AMD) have released or are releasing pretty much everything necessary to have native drivers for whatever OS one wants to use. At some point they'll likely give up on that as more and more geeks decide that they don't want to recommend something that's limited like that.

Not so much with cutting edge gaming rigs, but with older computers especially it's fairly common for video cards to outlive their manufacturer support and still contain a few bugs or optimization problems.

Re:Time to move away from NVidia now? (2, Insightful)

Kjella (173770) | more than 3 years ago | (#34321116)

That said, nVidia has given very long support on old graphics cards. The primary reason they support Linux is OpenGL workstations, which demand long support cycles; regular users get the benefit. And as even AMD admits, you get bigger benefits from cross-platform code (Win/Mac/*nix) than you do from open source collaboration, as long as it's not possible to open up the closed source drivers due to DRM licensing, licensed code and so on.

The open source team is very small compared to the Catalyst/fglrx one, whether you count just the AMD employees or all the contributors. I have an HD5850, and the open source drivers are still not in any way on par with nVidia's (or AMD's) closed source drivers despite it being 14 months since release; in some ways they'll probably never get there. As long as it's possible to fix bugs with the given documentation on how it should work, you are good; if you're triggering some kind of undocumented lockup, it might not be that easy getting resources on it except to say "don't do that".

Re:Time to move away from NVidia now? (2, Insightful)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#34321390)

Arguably, it might actually be getting less likely that Nvidia will ever provide decent OSS support...

Intel has all-but-formally announced their intention to lock Nvidia out of everything they can, as fast as the feds will let them: on-die video, no QPI licence, trimming PCIe lanes off lower-end products, etc. AMD has not been as frankly rude about it, but their on-die video will be even more competent than Intel's, and they control a smaller slice of the market in any case. Pretty much across the board, Nvidia can reasonably expect to be shoved out of anything too small, power-constrained, thermal-constrained, or cost-constrained to have either a full discrete GPU (in laptops) or a full, populated PCIe expansion slot (desktops/servers).

Unless they can think of some fairly clever pushback, and fast, this will leave them with a market of A) enthusiast gamers (who tend to run Windows and replace GPUs frequently), B) serious CAD/visualization guys (who may or may not run Windows, but whose Very Expensive software packages depend on Nvidia's "makes the trains run on time" approach to OpenGL support, rather than software freedom, seamless OSX integration, or still working in 5 years), and C) GPU compute types (who, again, are running very expensive software on very expensive hardware, and care that it works and, if they are large enough, that they can get engineering support). None of these markets places a premium on FOSS drivers, and most of them make driver quality and featurefulness a major part of Nvidia's competitive advantage (going from "foremost provider of GPU computing solutions" to "just another fabless silicon vendor whose stuff will work if you target Gallium3D" would be bad news for Nvidia...).

AMD and Intel, on the other hand, while deadly rivals, are in virtually identical positions RE: FOSS drivers: For their low-end stuff, drivers are just a pain in the ass. Especially for Intel, if team Linux will overlook their suckitude because their ttys come back after suspend, or whatever it happens to be, that is a pure win. They are both racing to make low to midrange GPU capabilities just part of the CPU, and it is very much to their advantage if all their CPU capabilities are Just Supported on whatever OSes the market cares about. I would expect to see increasing divergence in strategy between Intel/AMD on the one hand and Nvidia on the other.

Re:Time to move away from NVidia now? (3, Interesting)

RotateLeftByte (797477) | more than 3 years ago | (#34320252)

NVidia had their opportunity, but since AMD got their ATI department's act together, their GPU performance and, importantly, their Linux support have come on in leaps and bounds.
With NVidia being squeezed out of the chipset market by AMD & Intel, and even consumer graphics cards able to play most FPS games at more than adequate frame rates, I (sadly) see NVidia slowly but surely going the way of Novell's NetWare, i.e. to an inevitable death.
They really need to buy an ARM user and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger.

Re:Time to move away from NVidia now? (3, Insightful)

ArcherB (796902) | more than 3 years ago | (#34321856)

They really need to buy an ARM user and get their GPUs into mobile devices, provided they can make them sip power instead of gulping it like a 6-litre Dodge Charger

Doesn't NVidia make the Tegra/Tegra2 processors for mobile devices?

Re:Time to move away from NVidia now? (1)

RotateLeftByte (797477) | more than 3 years ago | (#34322358)

Yeah, I'd forgotten that. But like AMD & Intel, their market share is minuscule. Most other CPU/GPU builders are far better placed in the market.

The Register has a bit about the ARM A15 and a 1U server rack with the current A9 CPU inside. Now this is interesting: low-power, powerful servers. I use a couple of EeeBox boxes, but the A15 has certainly made me think again about dumping x86 CPUs for this use.

Re:Time to move away from NVidia now? (1)

antdude (79039) | more than 3 years ago | (#34323192)

So for an ATI Radeon 4870 video card (512 MB) in Debian/Linux, will I have easy, good support and speed with ATI's closed binary drivers, like NVIDIA's? I am currently using my old NVIDIA GeForce 8800 GT and GeForce 5200 FX video cards in my old Debian/Linux boxes. They work well even with games, Google Earth, 3D screen savers, etc. I do want speed. I am not crazy about the open drivers since they tend to be slow and I do 3D stuff. :(

Re:Time to move away from NVidia now? (1)

RyuuzakiTetsuya (195424) | more than 3 years ago | (#34320302)

Normally I'd shake my head and walk away. However, with the Boxee Box and other embedded devices running on Atom and other low end x86 CPUs and Linux, having the community bash out bugs in your drivers means that nVidia can get to the top of the embedded heap.

Re:Time to move away from NVidia now? (4, Insightful)

diegocg (1680514) | more than 3 years ago | (#34320476)

Time? You are late. ATI has been releasing specs and employing engineers to write opensource drivers for some time already. I haven't bought a Nvidia GPU for years, and I have no plans to do it in the future.

Re:Time to move away from NVidia now? (1)

Compaqt (1758360) | more than 3 years ago | (#34320934)

I had considered ATI, but heard bad things about the driver situation (the open source problems hadn't fully been worked through at the time).

How is $YOUR_LINUX_DISTRO these days with open-source ATI drivers? Any problems with Linux games, other programs requiring 3D (like KDE Stars), and Windows games running on Wine?

Re:Time to move away from NVidia now? (1, Troll)

ArsonSmith (13997) | more than 3 years ago | (#34321070)

Any problems with Linux games

Nope, they both work.

I know, "both", we all know the obvious one, but can you name the other one?

Re:Time to move away from NVidia now? (0)

Anonymous Coward | more than 3 years ago | (#34321220)

Quake?

Re:Time to move away from NVidia now? (1, Interesting)

Anonymous Coward | more than 3 years ago | (#34320520)

It's always been time to "move away" from nvidia.
I asked years ago which card to use. Linux gaming fanboys said go nvidia, their closed blobs have great 3D support. WTF do I care about that?! When they decide to release new models and want to force everyone to upgrade, they quit maintaining the old blobs. Therefore, I'm left in simple open source 2D land with both vendors. Plus some marginal 3D here and there with both, but whatever. And further, when that blob starts crashing as I update around it, exposes bugs, gets cracked, etc., guess what: it's closed, no further releases. Oops.

Now it's this year, and guess what: ATI has the same decent open 2D support they always have. And looky here, even more 3D specs are being released, again as always... so GOGOGO AMD/ATI.
Intel has had their own capable graphics for well over a decade. They don't need a graphics partner. AMD did not have one, and wisely bought the less encumbered ATI.
Excepting all the Windows gaming fanboys, Nvidia's graphics division is just waffling in the breeze for the moment. It has no purpose in life; no one has need to buy them.

And further, both AMD and Intel have in-house/partnered north/south bridges and BIOS partners. So nvidia's other lines of business are similarly mooted.
This news just cements my prior conclusion to go with AMD/ATI in the future, and shows I was right back then.
NEVER follow the advice of fanboys when you're seeking to keep your long-term investments usable.

Re:Time to move away from NVidia now? (2, Insightful)

FreonTrip (694097) | more than 3 years ago | (#34321430)

I'll grant that the situation's always sucked for non-x86 platforms, but Nvidia's done a remarkable job of supporting their older hardware in Linux.

Drivers for GeForce FX Cards, Updated 10/18/2010 [nvidia.com]

Drivers for GeForce2-4 Cards, Updated 11/16/2010 [nvidia.com]

Drivers for the Riva 128 (?!) through GeForce256, Updated 08/04/2010 [nvidia.com]

There are also supported drivers for all of these products for AMD64 Linux. It's no substitute for an open source driver - I support nouveau - but declaring that they leave their old cards unsupported is patently false. They're still one of the only games in town for CUDA and GPU computing. And, as someone who has a house full of systems running Nvidia graphics cards, Nvidia has treated me very well.

Re:Time to move away from NVidia now? (1)

AvitarX (172628) | more than 3 years ago | (#34322712)

From what I've seen in forums, nouveau is not so bad, actually.

I like that the team is working hard on using Gallium effectively (for example, they used it to make VA-API work using VDPAU (decode only)).

As far as I can tell, the nouveau driver is more feature-complete than the open source ATI driver, though maybe not as performant (I'm waiting for open source VA-API support for h.264 with encoding and decoding; I am not too concerned about the overall performance if it can run Compiz).

Intel appears to be the closest/already there, but I have hopes for the others. I really want one of these AMD APUs for a media center, but at the very least I need the playback. And I have enough patience to wait for OSS drivers and vote with my wallet. The progress on nouveau I find quite interesting, as it does not have the same type of backing that ATI is giving, yet appears to be moving along very well.

I'd really like to see VA-API supported more broadly, as it is not purely a playback solution. In doing so, I think Intel made the right call. I additionally like that it is not locked into certain codecs at all, and a GPU-accelerated solution using just GPU hardware (not the special chip that VDPAU has, for example) will theoretically work across all Gallium cards that support the same version of OpenGL (though not as well as using the dedicated hardware on the card).

Re:Time to move away from NVidia now? (0)

Anonymous Coward | more than 3 years ago | (#34324970)

> As far as I can tell the nouveau driver is more feature complete than the open source ATI driver, though maybe not as performant

No, it isn't.
http://wiki.x.org/wiki/RadeonFeature
http://nouveau.freedesktop.org/wiki/FeatureMatrix

As you would expect, since AMD/ATI have released the programming specifications for their graphics chipsets and nVidia have not, there is significantly more green on the Radeon open source driver features page. The "page flipping" feature is a work in progress, but when it lands it will deliver a significant performance boost to the Radeon open source drivers.

> I'm waiting for Open Source VA-API support for h.264 with encoding and decoding, I am not too concerned about the overall performance if it can run compiz

The Radeon open source driver can readily run compiz already. In fact they have been able to do so for some time now, over a year.

As for support for h.264 decoding (and WebM support, BTW), this will depend on the Gallium3D version, which is also still a work in progress. Up to now, the Gallium3D driver supports XvMC with iDCT. This new work goes a long way towards bringing VDPAU support to the open-source ATI Radeon "R600g" Gallium3D driver for the Radeon HD 2000/3000/4000/5000 series graphics cards.

http://www.phoronix.com/scan.php?page=news_item&px=ODgwOQ

Re:Time to move away from NVidia now? (2, Insightful)

blair1q (305137) | more than 3 years ago | (#34321196)

I don't know about "time to", but in any case where the software is open vs. closed, the open-source community will not make the effort with the closed system. This will absolutely make linux hackers choose AMD graphics now, which will almost certainly result in improved reliability of AMD cards in linux systems overall, and eventually almost total domination of the consumer linux segment by AMD graphics.

Re:Time to move away from NVidia now? (1)

Yfrwlf (998822) | more than 3 years ago | (#34324242)

Providing AMD isn't withholding critical documentation about their GPUs from open source driver developers, and they can actually bring the open source drivers [phoronix.com] up-to-par [phoronix.com] .

Re:Time to move away from NVidia now? (2, Interesting)

drinkypoo (153816) | more than 3 years ago | (#34321578)

I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset. The "fglrx" driver does not support it as it is too old, and the open source driver causes massive display trashing and lockups. Whatever they did to it, it's not a faithful R2xx, and so the existing driver (which works on genuine, old R2xx stuff) does not work on it. But that's not all; AMD also didn't bother to contribute proper support for the power saving features of this processor or chipset, nor decent documentation for either... for anything but Vista.

In short, I wouldn't trust AMD to actually provide you proper Linux support. I'm sitting here at an AMD "netbook" (subnotebook really) without it. Indeed, this machine is really only properly supported under Vista; power saving doesn't work under Windows 7! That's actually an artifact of the video driver, though, which lags behind the Vista version. If I load the VGA driver then power saving works right. However, since AMD makes the whole chipset, I get to blame them for it no matter how you slice it.

Re:Time to move away from NVidia now? (1, Interesting)

Anonymous Coward | more than 3 years ago | (#34323074)

I am currently typing on a Gateway LT3103u, which has also been sold by Everex among others. It has a 1.2 GHz "Athlon L110" and AMD RS690M chipset, which in turn contains a Radeon R2xx chipset.

The GPU in rs690 is actually a 4xx variant, not 2xx. Are you using the power saving features described at the bottom of http://www.x.org/wiki/RadeonFeature ?

Linux drivers - stable?? (1, Insightful)

Anonymous Coward | more than 3 years ago | (#34320068)

Drivers that finally enable full capability of the hardware are a must, be that OSS or closed source.

nVidia has long-term support in their Linux drivers - they are the same as their Windows or OS X drivers, with just an added GPL glue layer. Are the AMD drivers going to be long term supported too? Stable??

To me,

    stable > long term support >>> OSS > closed-source

only because I'm not planning on debugging video drivers!

Re:Linux drivers - stable?? (2, Informative)

hedwards (940851) | more than 3 years ago | (#34320268)

Assuming that you're on a blessed platform. I remember waiting for nVidia to release their drivers for 64-bit Linux. It took a really long time from when I started waiting, and I waited a few years before getting an AMD64 system.

But now there is support for Windows, OSX and Linux. If your OS isn't on that list then they aren't providing you with anything, or even the ability to do it yourself without doing some real funky stuff with wrappers.

Re:Linux drivers - stable?? (1)

AvitarX (172628) | more than 3 years ago | (#34320548)

There was GPL glue to get them working on AMD64 pretty quickly, as I remember. It worked fine and was pretty easy.

I actually can't verify it as faster than 1 year, but it was available when I purchased my computer in Sept '04, with the first Clawhammer released Sept '03.

Not an official driver, just a little fragment to make the official one work (it was a patched NVIDIA installer, so easy to do).

Re:Linux drivers - stable?? (1)

CynicTheHedgehog (261139) | more than 3 years ago | (#34320522)

The AMD drivers AFAIK are supported almost entirely by the community. There are some devs at AMD that contribute to the drivers, and AMD has been releasing specs and documentation in increments. 3D is still not fully supported on most chipsets, so you need the FGLRX drivers for that, but for normal 2D desktop operations the radeon drivers are actually faster and more stable (fewer artifacts, etc.). At least, that's been my experience. The radeon driver also supports KMS (FGLRX does not) meaning it's the only option for Wayland.

There is also a lot of developer activity on the forums (Phoronix in particular) and you can really see progress being made literally from day to day. It looks like come Christmas/early next year the open source radeon 3D stuff will start catching up to FGLRX in the upstream releases (Xorg, Mesa, Gallium), but it will probably be a while before that finds its way into most distros.

If you're a KDE user, however, you'll probably want to stick with NVidia for a little while longer. Kwin compositing does not work well with FGLRX or the radeon driver and the KDE devs insist that it's the driver (whereas Compiz runs just fine on the same drivers).

Re:Linux drivers - stable?? (0)

Anonymous Coward | more than 3 years ago | (#34321858)

If you're a KDE user, however, you'll probably want to stick with NVidia for a little while longer. Kwin compositing does not work well with FGLRX or the radeon driver and the KDE devs insist that it's the driver (whereas Compiz runs just fine on the same drivers).

Compiz and KWin work in different ways: KWin uses Qt's OpenGL modules, and Compiz AFAIK uses GPU assembly directly.
The Qt OpenGL module isn't perfect, but I totally believe that radeon(hd)/fglrx are the faulty software here.

Re:Linux drivers - stable?? (2, Insightful)

koolfy (1213316) | more than 3 years ago | (#34320562)

Long term support is something that only exists in OSS ecosystems. No matter how long a company is going to try to support a piece of hardware, the community will support it longer.
It all comes down to "supporting old stuff does not bring new sales, therefore it is really bad in the long term" vs "I still use/want to use old stuff, therefore I want it to work, and as long as it fits me, I'll support it."

In the OSS community, the only hardware that's not supported is really the hardware that's not used. Hell, they even managed to support closed nVidia hardware without any help from nVidia (see the nouveau 2D/3D driver).

Also I'm more confident about OSS drivers being stable than closed source ones. Agreed, OSS ones right now are still a bit unfinished, but they really are working fine on r600-700 with classic Mesa. In fact I've been playing through all the Stalker games recently with decent performance.
Chances are, OSS drivers are good enough for the vast majority of you. Maybe hardcore gamers will bitch, but that's all.

Re:Linux drivers - stable?? (1)

fuzzyfuzzyfungus (1223518) | more than 3 years ago | (#34321536)

The situation is arguably a little more nuanced than that...

You can get long term support in closed ecosystems, even very closed ones(just ask the nice chap from IBM Mainframe sales...); but it'll cost you. Often a great deal and you had better be sure that the vendor is contractually on board; because they can, otherwise, pull the rug out from under you at their option(If they stop selling product X licenses/support contracts, you pretty much have to stop using product X. Game over. Copyright law. Have a nice day.).

OSS, by contrast, allocates support largely by popularity, with the option of using money to modify that allocation. If Project Y is popular, you might end up getting years of long term support for free. If it isn't, nobody can force you to drop it; but it might not be cheap to stick with it, if you need to make changes to keep using it(whether or not this is an issue varies widely: if you just need some internal OSS stack to boot in a VM, you can probably do that until the sun engulfs the inner planets for basically nothing. If you want someone to make Linux 2.2 run on your company's fleet of laptops so some creaky monstrosity can continue to be used, it'll take real money to silence the laughter of the needed kernel developers...)

Re:Linux drivers - stable?? (0)

Anonymous Coward | more than 3 years ago | (#34322628)

It all comes down to "supporting old stuff does not bring new sales, therefore is really bad in the long term"

Yes it does. When I purchase a piece of hardware, I want to be able to upgrade the software and have the hardware working until it is useless, broken, or completely obsolete (5-7 year cycle is normal). nVidia has been *very good* at providing legacy support. Their current drivers work on anything GeForce 6xxx and newer. That stuff is years and years old. Legacy drivers for new kernels keep the old hardware ticking...

Just looking at the list below, if I wanted to I could still use a RIVA TNT card. That was released in 1998. For myself, 12 years is more than enough support. But there are AMD motherboards being sold today that do not have supported graphics!

http://www.newegg.ca/Product/Product.aspx?Item=N82E16813186201
http://support.amd.com/us/gpudownload/linux/Legacy/Pages/radeon_linux.aspx?type=2.7&product=2.7.2.3.2&lang=English

So, if I want to run current X 7.5 or 7.6 or later, I'm SOL.

If AMD wants to support OSS drivers, they should write OSS drivers and not have parallel efforts. I'm very skeptical about AMD's display offerings. My experience in present and past are my only guides and I got burnt with ATI before. Currently I don't see them leaping ahead of nVidia in Linux/FreeBSD/Solaris support (heck, even if you pick just Linux).

But maybe I'm wrong and AMD will actually have OSS world-class display drivers. Not holding my breath though.

~$ apt-cache show nvidia-glx-legacy-96xx ....
  This legacy version is the last release that supports the following GPUs:
  GeForce2 Go [NV11], GeForce2 MX Integrated Graphics [NVCrush11],
  GeForce2 MX/MX 400 [NV11], GeForce2 MX200 [NV11DDR], GeForce3 [NV20],
  GeForce3 Ti 200 [NV20], GeForce3 Ti 500 [NV20], GeForce4 410 Go 16M [NV17],
  GeForce4 420 Go [NV17], GeForce4 420 Go 32M [NV17], GeForce4 440 Go [NV17],
  GeForce4 440 Go 64M [NV17], GeForce4 448 Go [NV18M], GeForce4 460 Go [NV17],
  GeForce4 488 Go [NV18M], GeForce4 MX 420 [NV17],
  GeForce4 MX 420 AGP 8x [NV18], GeForce4 MX 440 [NV17],
  GeForce4 MX 440 AGP 8x [NV18], GeForce4 MX 440SE [NV17],
  GeForce4 MX 440SE AGP 8x [NV18], GeForce4 MX 460 [NV17],
  GeForce4 MX 4000 [NV18], GeForce4 MX - nForce GPU [NV18], GeForce4 Ti [NV25],
  GeForce4 Ti 4200 [NV25], GeForce4 Ti 4200 AGP 8x [NV28],
  GeForce4 Ti 4200 Go AGP 8x [NV28], GeForce4 Ti 4400 [NV25],
  GeForce4 Ti 4600 [NV25], GeForce4 Ti 4800 [NV28], GeForce4 Ti 4800 SE [NV28],
  Quadro DCC [NV20DCC], Quadro NVS [NV17GL], Quadro NVS 50 PCI [NV18GL],
  Quadro NVS 280 SD AGP 8x [NV18GL], Quadro2 MXR/EX/Go [NV11GL],
  Quadro4 380 XGL [NV18GL], Quadro4 500 GoGL [NV17GL],
  Quadro4 550 XGL [NV17GL], Quadro4 580 XGL [NV18GL], Quadro4 700 XGL [NV25GL],
  Quadro4 750 XGL [NV25GL], Quadro4 780 XGL [NV28GL], Quadro4 900 XGL [NV25GL],
  Quadro4 980 XGL [NV28GL], Quadro4 Go700 [NV28GLM], NV40GL.

~$ apt-cache show nvidia-glx-legacy-71xx
  This legacy version is the last release that supports the following GPUs:
  RIVA TNT [NV4], RIVA TNT2 Model 64/Model 64 Pro [NV5M64],
  RIVA TNT2/TNT2 Pro [NV5], RIVA TNT2 Ultra [NV5], Aladdin TNT2 [NV5],
  Vanta/Vanta LT [NV6], GeForce 256 [NV10], GeForce 256 DDR [NV10DDR],
  GeForce2 GTS/Pro [NV15], GeForce2 Ti [NV15DDR],
  GeForce2 Ultra (Bladerunner) [NV15BR], Quadro [NV10GL], Quadro2 Pro [NV15GL].
  .

Cool (0)

Anonymous Coward | more than 3 years ago | (#34320290)

Now make my 5850 work on Debian squeeze or Ubuntu 10.10 without the former not working at all, and the latter making X use 100% CPU. Until then I'll stick to an OS that works.

Re:Cool (3, Informative)

SuricouRaven (1897204) | more than 3 years ago | (#34321144)

I've not got a 5850, but a close ATI card. I found that the drivers ubuntu installed were unstable and quite awkward in multi-monitor configurations, but the ones that I got straight from the ATI site worked very nicely. They are the same basic software, right down to the control panel layout, but the ones on the site are a few revisions further along and it shows. At least in the multi-monitor area.

Re:Cool (1)

larppaxyz (1333319) | more than 3 years ago | (#34322232)

You are the lucky one. I also have an ATI 48xx series card, and the last three (or four) releases from ATI only cause a hard lock when Xorg starts, no matter if I use the 'ubuntu' or ATI version (or another distribution for that matter). The open source drivers almost work, except videos look trashed and the fan spins at full speed all the time.

However, I also own another computer, a Lenovo Thinkpad U160 with the very common Intel i3 chipset. Sadly, it's also missing a working driver; the issue is related to the i915 driver.

Oh... and at work, I have a dualhead desktop with an old NVidia 6600. It used to work very well until I upgraded to Ubuntu 10.04 (and later to 10.10). Now the desktop is laggy as hell, close to unusable. The open source drivers have issues with dualhead.

Things used to work just few years ago...

Ontario Processor? (5, Funny)

Anonymous Coward | more than 3 years ago | (#34320340)

I went to buy an Ontario processor, but cheaped out -- I ended up with a Quebec processor. Now, I can't understand a thing it says, it never seems to do anything, and I keep having to give it money!

Re:Ontario Processor? (1)

jiteo (964572) | more than 3 years ago | (#34320486)

As a Quebecois (not born, but raised), I want to slap you while laughing ;)

Re:Ontario Processor? (0)

Anonymous Coward | more than 3 years ago | (#34321830)

See. Quebeckers need to qualify how they came to be. :D

Re:Ontario Processor? (0)

Anonymous Coward | more than 3 years ago | (#34321840)

Don't make me say this, but some of my best friends are Quebecois. I just couldn't resist... ;-)

Ontario ones are better than the cheap China ones (2, Funny)

Joe The Dragon (967727) | more than 3 years ago | (#34320644)

Ontario ones are better than the cheap China ones.

Re:Ontario Processor? (2, Funny)

jjohnson (62583) | more than 3 years ago | (#34323656)

Also, it periodically runs a system-wide poll to separate and form its own machine that stays in the same case, draws from the same power supply, and uses all the same resources :)

Re:Ontario Processor? (1)

zeitnot (155181) | more than 3 years ago | (#34324400)

I am a Quebecker you insensitive clod!

Fusion (1)

necro81 (917438) | more than 3 years ago | (#34320466)

Wait, I thought fusion was still (perpetually) a few decades away from viability.

Re:Fusion (1)

Noughmad (1044096) | more than 3 years ago | (#34320624)

We were all waiting for the years of "Fusion" and "Linux on the desktop", but instead we got "Linux on the Fusion".

Re:Fusion (1)

ArsonSmith (13997) | more than 3 years ago | (#34321110)

No, net power gain from fusion is still a few decades away.

3d Video cards have been sucking in power to create a small fusion reaction in their GPU for at least a few years now.

Re:Fusion (1)

blair1q (305137) | more than 3 years ago | (#34321240)

That's just what AMD marketing wants you to think. They chose the "Fusion" trademark so they could perpetually delay their products (they're already pushing 2 years late compared to the date they first announced after buying ATI).

Open Source Fusion Driver (0)

ATestR (1060586) | more than 3 years ago | (#34320652)

Darn... for a second there I thought I was going to finally be able to retrofit my car with a Mr. Fusion unit.

Take a step back, look at the big picture. (1)

VortexCortex (1117377) | more than 3 years ago | (#34320736)

Originally computers were huge proprietary things.
Now they are small and commonplace.

In the past software was written for specific hardware; now it's not (C is cross-platform compared to assembly, folks).
Games no longer draw graphics by directly reading and writing raster data into hardcoded "video memory" regions.

Abstraction layers (such as a graphics API + drivers) are a must in today's software environment. Why? To support cross-platform software development. Many of today's games sit on top of another huge abstraction layer, the game engine (such as Unreal or Source), and in doing so are more easily ported to multiple platforms.

The point is: Software development is rapidly moving from "Works on only one hardware/software platform" to "Works on many platforms, OSes, devices, etc". Eventually we will get to the point where any software can run on just about any hardware. The fast track to this destination is clearly Open Source. It's ridiculous to me to see hardware drivers lagging behind in the cross platform aspect when compared to cross platform open source projects like Firefox, Apache server, etc.

Additionally: An open source driver could have a few #ifdef blocks, etc, and compile/run on both Linux and Windows platforms (ok, more than a few, but why not release the source at least and get some free help?). The damn driver is not the product; The hardware is the product being sold. More platforms = more customers; no other argument really compares.

I'm glad to see ATI has seen a bit of the big picture. Now, if only we can get NV to realize that cross platform / OSS is good for everyone (including customers -- less vendor lock in).

Re:Take a step back, look at the big picture. (0)

Anonymous Coward | more than 3 years ago | (#34321062)

An open source driver would give details on hardware internals.

Re:Take a step back, look at the big picture. (1)

VortexCortex (1117377) | more than 3 years ago | (#34322204)

An open source driver would give details on hardware internals.

More so than the hardware itself? I think not. If you want me to see a movie I must be able to see the movie, ergo, it is impossible for you to keep me from recording the movie if you let me take the movie home.

If you let me take the hardware home, and the hardware is expected to function for me, then I will be able to operate and analyze the hardware and write my own damn driver which, you guessed it, will expose the same details as the Mfgr's driver would if it were open source... Hell, not giving me the driver source may cause me to discover MORE hardware details than would have been exposed otherwise.

Come on, think man... If you're selling me a machine, do you honestly expect me not to be able to tell how the machine works if I'm sufficiently curious?

It's not Closed Source Drivers or Epoxy Coated Circuit Boards that keep people from using your "innovations", it's the Patent and Copyright laws that do.

Re:Take a step back, look at the big picture. (1)

blair1q (305137) | more than 3 years ago | (#34321382)

Uh, software is still written to specific hardware. You may write it in C, but C doesn't determine the register mappings and semantics. *addr=value is still just mov [addr],$value

Re:Take a step back, look at the big picture. (2, Insightful)

VortexCortex (1117377) | more than 3 years ago | (#34322046)

Uh, software is still written to specific hardware. You may write it in C, but C doesn't determine the register mappings and semantics. *addr=value is still just mov [addr],$value

Yes, but largely NO!

I write my software in C. The same code compiles and runs on x64 and on x86. The COMPILER translates my cross-platform "*addr=value;" into the appropriate machine-level instructions. My C programs do not concern themselves with the specific register mappings and processor semantics; this has been abstracted away by the C compiler.

I agree that driver software may be written to the specific hardware, but the purpose of a driver is to abstract said "register mappings and semantics" so that the majority of other software (all other software EXCEPT the driver) doesn't have to worry about the "register mappings" and/or other "semantics".

Inline Assembly code is not cross platform, and in many cases a compiler can make better improvements than the assembly code in question does.

When is the last time you used a significant program that was written entirely in assembly?
Again, I must reiterate: Take a step back, look at the big picture.
You're focusing on the little edge part which may get cut off without significantly changing the picture at all.

Re:Take a step back, look at the big picture. (1)

blair1q (305137) | more than 3 years ago | (#34322888)

My C software programs do not concern themselves with the specific register mappings and processor semantics; this has been abstracted away by the C Compiler.

Your compiler knows jack about the registers on a video card.

If you're writing a video device driver, as here, you're going to be poking values into board-register addresses, streaming data in and out of specific port addresses, and that code isn't portable. It may seem very familiar from board to board, but unless the features you implement are completely trivial it's not. Even at the library or application level you'll have to deal with the different features of the cards, which means a whole new set of ioctls and ports and configuration and operation protocols. You only get true portability when you put a standard library like OpenGL in the interface between your application and the driver. The driver is still bespoke, but it only has to implement the features that OpenGL knows about, so its upper interface to OpenGL is standard, as is the lower interface of your app to OpenGL.

The only way to focus on the big picture is to let someone else write the software.

Re:Take a step back, look at the big picture. (1)

butalearner (1235200) | more than 3 years ago | (#34322740)

As an interesting contrast, software used to all be what we'd consider open source -- BSD-style open. The relatively few programmers around back then were mostly academics and openly shared code, until folks like Paul Allen and Bill Gates stepped in and decided they should be able to charge for Altair BASIC (see Bill Gates' Open Letter to Hobbyists). You could also see the stirrings of the open source rebellion, too, with the freely published Tiny BASIC design descriptions with full source code, which many groups implemented for many different architectures and sold for a small fee (as opposed to up to hundreds of dollars for Altair BASIC) independently.

Fusion for (light) servers? (1)

Sloppy (14984) | more than 3 years ago | (#34320848)

There's still a niche that isn't very well served, where these low-power Fusion CPUs appear they could kick some major ass: the always-on-24/7 lightly-loaded server. I'm currently using Athlon II xxxe for this, but I'd happily downgrade processing power in exchange for lower wattage.

Shit, Atom would be good enough, if the motherboards had enough SATA ports or slots for me to add SATA cards, but I never found any that did. Gimme a 9W or 18W processor on a board that I can somehow hook up 8 drives to, and that'll be my new mythbackend. If commflagging or transcoding takes a little longer, I just don't care; Atom 330 is nearly good enough for that anyway, and it looks like Fusion is better than Atom in every way.

I'd think just about every home or office would need at least one box like this, but according to the market, I'm wrong. WTF?

Re:Fusion for (light) servers? (1)

SuricouRaven (1897204) | more than 3 years ago | (#34321204)

What you describe is basically a modern NAS appliance: a low-power chip, often an Atom, running an x86 OS, with lots of SATA interfaces for drives. The only real difference between that and a low-power server is in the interface: the server runs a full OS and software stack, while the NAS runs a minimalist stack with a web interface for all configuration. I've got one of those NASs up in my loft, a QNAP brand. The thing runs Linux and Samba. It does have an option to SSH in, disabled by default.

Re:Fusion for (light) servers? (0)

Anonymous Coward | more than 3 years ago | (#34321592)

Ask and ye shall receive. Supermicro makes this handy number, the X7SPA-HF: dual-core Atom D510, Intel's latest die-shrunk Atom.

http://www.supermicro.com/products/motherboard/ATOM/ICH9/X7SPA.cfm?typ=H

An actual server board with server-class firmware and features: serial console, IPMI, remote management, dual Intel gigabit NICs.

Uses a more standard ICH9 southbridge than what's normally paired with a D510. More OSes support the ICH9, and generally do so a lot better (looking at you, FreeBSD). Also gives you 6 SATA ports, plus an x4 PCI-e slot (x16-sized; I guess you could put in a video card). You could add in a SATA card for more ports.

Uses SO-DIMMs, which is odd. Supports 4GB of RAM. I've heard these boards are picky about their memory, so stick to Supermicro's compatibility list.

Re:Fusion for (light) servers? (1)

amorsen (7485) | more than 3 years ago | (#34322764)

You get quite far with a single 2.5" disk these days, and that's easy to fit in every tiny server. 8 spinning disks is niche.

Re:Fusion for (light) servers? (0)

Anonymous Coward | more than 3 years ago | (#34324918)

Then why not just go with Sempron 145?

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103888

The thing has 45W peak power, but frequency scales down to sub-1GHz for low loads. The power consumption of a system with built-in nVidia 6100 graphics was 21W at idle and 65W at peak (cpuburn). It may be even lower with onboard ATI graphics (people say they are more efficient). Atom boards have shitty power consumption because of the chipset.

PS. Athlon II xxxE are very efficient if you enable cpu frequency scaling in the OS. You can do that in Linux and Windows. Then measure actual consumption between peak and idle. The difference will be mostly CPU multiplied by PS inefficiencies.

Anything missing? (1)

SanityInAnarchy (655584) | more than 3 years ago | (#34322414)

Last big announcement about an AMD code drop, there was still something missing, though I don't remember what. Features, performance, whatever, there was still something not there that was either present in proprietary AMD drivers or nVidia drivers.

Are we past that yet? Is it finally time to dump nVidia for AMD?

www.thinkpenguin.com www.openpc.com (0)

Anonymous Coward | more than 3 years ago | (#34323184)

If people want computers that aren't dependent on non-free drivers, firmware, or software, check out www.thinkpenguin.com and www.open-pc.com. They're the ones doing the work and making sure freedom is being pushed forward on the desktop. They're doing the most for free computing right now in actually putting out systems which aren't dependent on the non-free components. As much as I'd like to think AMD and ATI are doing something, they're only doing it if YOU demand it. Well, you have to demand it. Those behind ThinkPenguin have operations in three states as well, pushing free software on average consumers. It's why ThinkPenguin exists. Open-PC is a great idea too, with multi-national ties. FreeGeek is another great operation. Put your money where your mouth is and make sure your next computer isn't just running "Linux", but is also a free one.

Still not 100% open... (1)

jonwil (467024) | more than 3 years ago | (#34324746)

Let me know when I can buy a GPU where every single feature of the card (INCLUDING the on-board dedicated circuitry for decoding video) can be used in the open source drivers and then maybe I will care...
