
ARM VP To Keynote AMD Developer Conference

timothy posted more than 2 years ago | from the shades-of-things-to-come dept.


MojoKid writes "AMD is hosting its first AMD Fusion Developer Summit (AFDS) this summer, from June 13-16. The conference will focus on OpenCL and upcoming AMD Llano performance capabilities under various related usage models. One interesting twist is that the keynote address will be given by Jem Davies, currently ARM's VP of technology. To date, AMD's efforts to push OpenCL as a programming environment have been limited, particularly compared to the work NV has sunk into CUDA. With its profit margins and sales figures improving, AMD is apparently turning back to address the situation — and ARM's a natural ally. The attraction of OpenCL is that it can potentially be used to improve handheld device performance. AMD's explicit mention of ARM hints that there might be more than meets the eye to this conference as well."
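For readers unfamiliar with it, OpenCL expresses computation as data-parallel "kernels" executed once per work-item over an index range. A rough sketch of that execution model in plain Python (purely illustrative; the names `vector_add_kernel` and `enqueue` are invented here and are not the OpenCL API):

```python
# Illustrative sketch of OpenCL's data-parallel model: a "kernel" runs
# once per work-item, each identified by a global id. Pure Python, not
# the OpenCL API; all names here are invented for illustration.

def vector_add_kernel(gid, a, b, out):
    # Kernel body: each work-item handles exactly one element.
    out[gid] = a[gid] + b[gid]

def enqueue(kernel, global_size, *args):
    # The runtime launches one kernel instance per work-item
    # (on a GPU these run in parallel; here, a serial loop).
    for gid in range(global_size):
        kernel(gid, *args)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(vector_add_kernel, 4, a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

On a GPU the runtime spreads the work-items across hundreds of cores; the serial loop above only mimics the semantics.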

70 comments

Goodnite x86 (0, Interesting)

Anonymous Coward | more than 2 years ago | (#35951516)

This is the beginning of the end of an archaic architecture. We can finally see fair competition when it comes to silicon.

Re:Goodnite x86 (3, Informative)

ByOhTek (1181381) | more than 2 years ago | (#35951642)

I've been hearing this drivel for years.

It's beaten out other techs for a reason. ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.

Re:Goodnite x86 (-1)

Anonymous Coward | more than 2 years ago | (#35951914)

desktop?

good night sweet horse.

good night sweet buggy.

good night desktop.

Re:Goodnite x86 (2)

DarkOx (621550) | more than 2 years ago | (#35951920)

No, ARM will not replace x86 on the desktop. The point is to keep x86 away from other platforms. The desktop is what it is: a Swiss Army knife where a powerful CPU with a big feature set is called for, because the system will do a little of everything. It's also true that different feature sets are needed by different users, but packing it all into a single one-size-fits-all chip is the way to go, because the incremental cost of adding features some won't use is less than that of making a wider array of products. Electrical efficiency is desirable, but not at the cost of features or raw performance. Nothing but perhaps POWER will ever compete with x86 there.

x86 is ill suited to mobile devices for all the same reasons it's great on desktops and laptops, which can afford large, heavy batteries and are generally used where AC is available.

What AMD/Marvell/VIA/NVIDIA *need* to do is keep the old WinTel mindset and lock-in out of these new devices. Which they can, because those devices are more limited in scope, and most of the software for them is already cross-platform. It's a matter of keeping it that way, so the market remains open and competitive.

Re:Goodnite x86 (0)

Anonymous Coward | more than 2 years ago | (#35952860)

Haha, if the old RISC ISAs were competing head-to-head with x86 now, instead of in the 90s, I think things might be going differently. Alpha, MIPS, SPARC, et al had already lost in the desktop space before the MHz Myth died and the power wall reigned supreme.

Power dissipation is typically the limiting factor in chip design (in your desktop chips too). In a sense, electrical efficiency is critical everywhere. This may be an opportunity for ARM.

Re:Goodnite x86 (1)

ByOhTek (1181381) | more than 2 years ago | (#35955384)

I think the only one of those that would stand a chance is Alpha... *MAYBE* MIPS.

Sparc? Doubt it. It's nice for highly parallel tasks, but not so great for high-demand single-thread, which matches a lot of desktop apps.

Re:Goodnite x86 (1)

mcmonkey (96054) | more than 2 years ago | (#35956138)

RISC architecture is going to change everything.

(I couldn't resist)

Re:Goodnite x86 (1)

ByOhTek (1181381) | more than 2 years ago | (#35956180)

s/is going to change/has changed/

Given that, to my knowledge, all modern x86 CPUs are internally RISC, that substitution may be warranted.

However, there's nothing wrong with exposing the x86 architecture while internally using their own, separate instruction set: this lets the hardware perform standard optimizations, freeing programmers from having to implement them themselves. It also allows the internally executed instruction set to change, to improve performance, without breaking existing applications.

Re:Goodnite x86 (1)

bored (40072) | more than 2 years ago | (#35956140)

x86 is ill suited to mobile devices for all the same reasons its great on desktops and laptops that can afford large heavy batteries, and are generally used where AC is available.

This is complete BS. As ARM starts to match the performance of x86, the power draw is beginning to match too. If Intel sticks it out another generation or two, I'm betting Atom will absolutely dominate the power/performance curve, because there isn't anything fundamental in x86 that makes it draw more power. If anything, building an out-of-order ARM is harder, due to the fact that every single instruction can be predicated.
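For context, "predicated" refers to classic 32-bit ARM, where nearly every instruction carries a condition field and executes as a no-op when the condition fails, letting compilers flatten short branches into straight-line code. A conceptual, branch-free sketch in Python (illustrative only, not real ARM semantics):

```python
# Sketch of the idea behind predication: compute both candidate results,
# turn the condition into a 0/1 "flag", and let arithmetic select the
# outcome, so no branch is ever taken. Conceptual illustration only.

def branchless_max(a, b):
    cond = int(a > b)                 # predicate, like a condition flag
    return a * cond + b * (1 - cond)  # both "instructions" execute;
                                      # only one result is committed

print(branchless_max(3, 7))  # 7
```

In an out-of-order core, tracking which in-flight instructions are squashed by a failed predicate is one of the complications the poster alludes to.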

Intel has learned the integration lesson too, which should bring the platform power aspects under control within a generation or two.

The "legacy" portions of x86 are such a tiny portion of the overall die as to be nearly meaningless. ARM isn't exactly legacy-free either; plus, they continue to have a fairly fragmented market, due to a half dozen ABIs and probably ~100 different vendors producing different versions of the chips with different memory controllers/DMA engines/etc. An x86 SoC with graphics/IO/slow memory/etc. will be comparable when one gets built. ARM will continue to sell in huge numbers, but in order to absolutely control the marketplace they MUST have generational compatibility between devices. Otherwise, they run the risk that in any single generation they lose market share. This actually happens all the time; it's just that their market has so far been diverse enough that it doesn't matter: lose one vendor, gain another.

Finally, half the reason tablets/phones seem fast isn't that they have fast hardware, but rather a focus on building a system that works within their constraints, unlike Windows or general-purpose Linux. This is why no one particularly cares about x86 in a tablet: you have to customize the environment so much that the ABI compatibility x86 brings, alone, doesn't provide any value. No one wants a smartphone running Windows 7 (hell, a lot of people don't even want a desktop running Windows 7).

BTW: I have a fair number of ARM devices, including an OpenRD (BTW: 7 watts wall power at full load; compare with the CPU draw) which acts as the primary network server at my house. A few months ago I had this discussion with a coworker and measured the performance-per-watt numbers of a number of devices. The end result is that the latest Intel processors are quite literally about 10x faster than the best ARM devices you can buy, and the performance per watt is roughly comparable, even though the server is eating 150-300 watts (consider disks/etc.) and the ARMs are eating 10.

It's not that it is better technically (2)

dbIII (701233) | more than 2 years ago | (#35952092)

And the reason is that Intel had the resources to make more of what they were selling than anybody else, and other low-end players could get better sales by copying them instead of doing their own thing. Itanic was competing with Intel's core product and did not have a chance within Intel. Other places got bought out by companies that didn't see anything past Intel making a lot of money. I've got no idea what IBM is doing with Cell: try to buy something with it and they tell you to get something else.

Re:Goodnite x86 (2, Interesting)

horza (87255) | more than 2 years ago | (#35952250)

It's beaten out other techs for a reason. ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.

And what do you think this reason is?

Hint: it's not technical superiority.

Phillip.

Re:Goodnite x86 (1)

ByOhTek (1181381) | more than 2 years ago | (#35955412)

Adaptability and technological superiority for the tasks on a desktop.

So, yeah, it is in part, technological superiority.

Re:Goodnite x86 (1)

renoX (11677) | more than 2 years ago | (#35956550)

> ARM is not replacing x86 on the desktop any time soon, thank goodness. Maybe joining it, but not replacing it.

That's a beginning: when someone uses an iPad instead of a PC, they "replace" an x86 with an ARM.
Currently iPads and the like are mostly PC 'companions', but it wouldn't surprise me if they became PC replacements for many users.

Re:Goodnite x86 (1)

dicobalt (1536225) | more than 2 years ago | (#35951662)

People have tried and failed before. Repeatedly. It's not impossible but it is improbable. ARM sure has a whole lot of work to do if it is to go from being an anorexic featureless simpleton CPU into a flexible *and* powerful desktop and server CPU.

Re:Goodnite x86 (1)

Halo1 (136547) | more than 2 years ago | (#35952256)

Which features is the ARM architecture missing compared to x86?

Re:Goodnite x86 (1)

Anonymous Coward | more than 2 years ago | (#35952866)

Features of x86 that are currently missing in ARM? How about out-of-order execution, 64-bit operation, speed boost (some cores shut down to let other cores run faster), and a top-end speed around 3GHz just to name a few.

Of course the lack of those features lets it run cooler which makes the ARM processor ideal for low-power applications like cell phones.

Re:Goodnite x86 (1)

spirit of reason (989882) | more than 2 years ago | (#35952988)

Except for 64-bit operations, those features have nothing to do with x86 vs ARM. They're part of the microarchitecture. And by the way, the Cortex-A9 does support out-of-order execution.

Re:Goodnite x86 (2)

Halo1 (136547) | more than 2 years ago | (#35953132)

Features of x86 that are currently missing in ARM? How about out-of-order execution,

The Cortex-A9 is out-of-order.

64-bit operation,

They indeed don't have 64-bit ALU or memory-addressing support yet.

speed boost (some cores shut down to let other cores run faster),

That's unrelated to the architecture. And at least NVidia's Tegra dual-core CPUs shut down one of the two cores if it's not in use. I don't think they automatically overclock the other one when doing so, though.

and a top-end speed around 3GHz just to name a few.

Yes, in absolute performance per core they are still trailing x86. I was mainly reacting to the "anorexic featureless simpleton CPU" remark with my question though.

Of course the lack of those features lets it run cooler which makes the ARM processor ideal for low-power applications like cell phones.

And server farms [eetimes.com].

Re:Goodnite x86 (1)

Tolleman (606762) | more than 2 years ago | (#35953518)

Server farms? Kind of a bitch without ECC but I guess it works in some cases. Granted, what do I know, maybe they will be slapping ECC support into them.

Re:Goodnite x86 (1)

Halo1 (136547) | more than 2 years ago | (#35953974)

Server farms? Kind of a bitch without ECC but I guess it works in some cases. Granted, what do I know, maybe they will be slapping ECC support into them.

Why do you believe ARM doesn't support ECC right [mvdirona.com] now [google.com]?

Re:Goodnite x86 (0)

Anonymous Coward | more than 2 years ago | (#35954014)

I sure hope ARM supports ECC, or I'm not sure what all of those ECC bits are doing in the ARM designs I've worked on.

Re:Goodnite x86 (1)

Anonymous Coward | more than 2 years ago | (#35952938)

It's missing various addressing modes, some tiny 8-bit instructions, a separate I/O address space, and probably more. Depending on what you do, however, missing these features may itself be a feature. ;)

Re:Goodnite x86 (0)

Anonymous Coward | more than 2 years ago | (#35954504)

x86 compatibility. That's the killer. The other stuff people mentioned, while important for some duties, simply isn't the reason for x86 dominance. x86 is dominant because it was dominant: it's inertia and network effects, most particularly from the closed-source world. (I run Linux on an ancient PPC Mac laptop and it still works great, with the amazing battery life you want in a laptop; it's not our (open source) fault.)

Re:Goodnite x86 (1)

Halo1 (136547) | more than 2 years ago | (#35954600)

I agree that has been extremely important until now. As the success of Android and iOS devices demonstrates, though, it's getting much less so, even in the consumer space. Apple has already pulled off migrations from one architecture to another twice, quite successfully (although in those cases the increased performance of the new architecture was quite important). And in the server space, the underlying architecture is almost irrelevant.

Re:Goodnite x86 (1)

spirit of reason (989882) | more than 2 years ago | (#35958810)

I feel like x86 compatibility itself doesn't matter anymore either. The majority of users seem to depend on only a very small number of applications. You pretty much get all the average folk with a web browser, Flash, Microsoft Office, and maybe iTunes. Adobe, Microsoft, and Apple have demonstrated a willingness to work with whatever platforms are popular.

There are certainly large niches that matter too, like video games, but would companies/developers for those applications hold things up? I don't know.

Re:Goodnite x86 (2)

ByOhTek (1181381) | more than 2 years ago | (#35955468)

With the plethora of JIT compiled languages doing high-end tasks today, and the increasing number of cross platform/arch libraries, I'm not sure that x86 compatibility is such a killer.

Re:Goodnite x86 (1)

ByOhTek (1181381) | more than 2 years ago | (#35955448)

Actually, ARM wouldn't take much work to make a good server CPU.

Most servers are better served by a lot of mediocre or slightly below-mediocre cores than by one or a few heavier-duty cores. ARM's low power draw and high performance-per-watt ratio make it a very likely contender for the server market in the next couple of years, if it is developed properly.

For the end-user segment (desktops and notebooks), where low-thread brute force tends to be a more relevant factor, ARM isn't as good of a choice.
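The many-small-cores argument above can be made concrete with some back-of-the-envelope arithmetic (all figures invented for illustration):

```python
# Toy model: given a fixed power budget, compare total throughput from a
# few fast, power-hungry cores versus many slow, efficient ones.
# Every number here is invented purely for illustration.

def throughput_in_budget(power_budget_w, perf_per_core, watts_per_core):
    # How many cores fit in the envelope, and what they deliver together.
    cores = int(power_budget_w // watts_per_core)
    return cores * perf_per_core

# "Big" core: 10 work units at 25 W. "Small" core: 2 units at 1.5 W.
big = throughput_in_budget(100, 10.0, 25.0)   # 4 cores -> 40 units
small = throughput_in_budget(100, 2.0, 1.5)   # 66 cores -> 132 units
print(big, small)
```

Within the same 100 W, the small cores deliver over 3x the aggregate throughput, but any single request runs 5x slower, which is exactly why the trade-off favors throughput-bound servers and not desktops.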

Re:Goodnite x86 (0)

Anonymous Coward | more than 2 years ago | (#35951694)

I think we'll start to see a fundamental shift in computer design soon. Computers will become much more modular. Modern computers (I mean PCs more than phones, tablets and portable devices) are still very rigid in their design: one or more specific processors fit onto a motherboard that takes a specific type of memory and add-in cards.

I see motherboards becoming a kind of switch or bus, with multiple super-high-speed lanes connecting all kinds of components. There won't be specific connectors for CPU, memory or cards, just links to the bus. You'll be able to mix and match different types of CPU as you need. The OS will become what it should be: just a basic layer that allows access to the components.

This can only be OpenCL, nothing more (0)

Anonymous Coward | more than 2 years ago | (#35951570)

If it were anything more than a technical decision, they would have the head of the company give the announcement. The attendee is a VP of technology, which in all likelihood means a partnership or an adoption of some key technology... nothing more.

Proper Linux Support? (2)

0100010001010011 (652467) | more than 2 years ago | (#35951630)

How about spending a few engineering dollars and releasing GOOD well documented drivers? I'm a regular reader of the XBMC forums [xbmc.org] and anyone that wants to use Linux more or less needs to buy Nvidia hardware.

I'm not in the 'anti-closed binary' camp, I just want the best tool for the job. Nvidia provides great CUDA and VDPAU support and it more or less 'just works'. ATI & Intel decided to jump on the Linux bandwagon by opening up everything and so far it seems like the community really hasn't jumped on it. I paid money for your hardware, why not pay an engineer to write software I can actually use?

Say I go car shopping and the sales associate shows me two cars. One is completely built, works well enough and has good factory support, BUT I'm not allowed to modify it. The second one is actually just in a crate. It comes partially assembled... but don't worry: there is complete documentation for every single loose part, and instructions on how to put it together. And the two cars cost nearly the same.

I'm going to choose the first car. My time IS worth something, and I'd rather have something I can't modify but that works great as-is (NVidia's drivers) than something that really is useless unless I, or someone else, uses the documentation to build something (ATI). Especially when the hardware costs are nearly the same.

Re:Proper Linux Support? (2)

drinkypoo (153816) | more than 2 years ago | (#35951778)

I'd be satisfied if ATI would release enough information to support their hardware. I have a netbook based on R690M chipset and Athlon 64 L110 and the graphics only work correctly under Vista. They limit me to one suspend-resume cycle under Windows 7, suspend never resumes properly under Linux, and I get massive graphics corruption in Linux even with RenderAccel disabled. Further, power saving doesn't work properly anywhere but Vista; I have a five hour battery, get about 4:30 in Vista (no crap) but about 3:30 in Windows 7 and about 2:30 in Linux. AMD didn't bother to kick out the tiny bit of code needed for power saving in Linux.

Since I am a Linux user, I am concerned about what this says about AMD's commitment to Linux. Probably the smartest thing I could do is to put Vista back on the machine and sell it while it still has some value, but I keep hoping that the 'ati' driver for X will work with it one day. Since my chipset is supported by coreboot, if I can get JTAG access then maybe I can hack around the crap Gateway BIOS that disables AMD-V. In my book that makes it a Sempron 64, not an Athlon.

Re:Proper Linux Support? (1)

Have Brain Will Rent (1031664) | more than 2 years ago | (#35954474)

Wow, except for the Win7 stuff that sounds so much like my experience with my old HP notebook/tablet. I finally did go back to Vista on that.

I replaced the hard-drive with an ssd - which stopped the thing's exhaust from scorching me and extends the battery life a bit... now I have an overpriced, overweight but moderately nice to use in dim light ebook reader. In tablet mode a single page of a magazine like SciAm fits perfectly on the screen and is still readable so it pretty much eliminates scrolling.

It's interesting that your power consumption went up in Win7... good to know as I had been thinking of installing it to get out of using Vista.

Anyhow I'm not sure who to blame, HP, ATI, ... maybe a combination of several entities, but somebody sure screwed it up.

Re:Proper Linux Support? (0)

Anonymous Coward | more than 2 years ago | (#35954684)

Well, I can't comment on this 100%, but you should ask about it on freenode/#radeon or the mailinglists.

Are you sure the problem isn't simply that the manufacturer of the laptop produced rubbish?

I imagine their QA testing mostly consists of:

Test on Current Microsoft Platform That We Ship With It.
Move to Next Product.

Re:Proper Linux Support? (1)

drinkypoo (153816) | more than 2 years ago | (#35961418)

Are you sure the problem isn't simply that the manufacturer of the laptop produced rubbish?

Yeah, I'm sure, because it works 100% in Vista. Which is to say, even slower, but literally everything on the machine works great. The problem is not Gateway, IIRC this is really an Everex anyway. The problem is AMD. I knew better and bought something with ATI graphics and now I am paying the price. Based on my experience with Geode, though, I fully expected the processor to have good Linux support, which was not the case at all.

Between that and their lack of proper, timely support for K10 (hello, there's a thermal monitor in there and I would like to have been able to use it much earlier), I have to assume that AMD has zero commitment to Linux, and any bone they throw us is just to keep us running along behind so they can hitch us up to the sled if they need us. I may yet buy AMD processors in the future (I'm actually quite happy with the performance and the price of my Phenom II X3 720), but I will actually look at Intel combos next time, whereas for my last four homebuilt PCs I did not consider them for a second. However, AMD graphics are not eligible for inclusion in my computers, period, the end. I won't even think about a machine with integrated ATI any more, and that was the last thing I thought might be worth buying.

It's too bad I was wrong. Save yourself. Don't buy ATI.

Re:Proper Linux Support? (1, Informative)

h4rr4r (612664) | more than 2 years ago | (#35951798)

That second car does not come with all the docs, or someone would have built it. AMD leaves out anything relating to video acceleration, for example. This is to protect their Windows DRM, meaning that I, a Linux user, am suffering because of Windows DRM.

Re:Proper Linux Support? (1)

mattack2 (1165421) | more than 2 years ago | (#35957042)

Then can't you just buy a different video card (or different laptop with built in video from another company), and let the market decide?

Re:Proper Linux Support? (1)

TeknoHog (164938) | more than 2 years ago | (#35952018)

IMHO, AMD also provides decent binary drivers and programming tools. I like to reward AMD for their open-source attitude, but frankly, I am happy running their binaries for video decoding and number crunching, while waiting for the OSS drivers to improve.

Re:Proper Linux Support? (1)

h4rr4r (612664) | more than 2 years ago | (#35952732)

Decent Binaries?
Since when? Try using an ATI card to run games in wine, or do anything particularly OpenGL heavy and watch what happens.

Re:Proper Linux Support? (1)

petteyg359 (1847514) | more than 2 years ago | (#35953930)

I do watch what happens, sometimes for long periods of time. What happens? My game runs just fine, often even better than in Windows.

Re:Proper Linux Support? (1)

TeknoHog (164938) | more than 2 years ago | (#35955180)

Compared to a vast majority of binary software/drivers out there, AMD graphics drivers are awesome. Maybe Nvidia is simply better for gaming, in that case choose the best tool for the job. For certain integer-heavy number crunching applications, Radeons are currently the best by a wide margin (something like 5x faster at similar price and wattage).

Re:Proper Linux Support? (1)

Billly Gates (198444) | more than 2 years ago | (#35959104)

I am annoyed that (under Linux) support for video acceleration sucks or does not exist. Flash, and even just web browsing with Chrome and Firefox 4, is painful on that platform as a result.

I switched back to Windows 7 and use a VM for Linux programming for serverish things. Adobe maintains it will not support hardware acceleration at all for any Intel or ATI products, because the drivers are hacks and scripts and are not professional-grade like their MacOSX and Windows counterparts.

Under Windows 7 I like my ATI card.

Re:Proper Linux Support? (1)

Svartalf (2997) | more than 2 years ago | (#35957980)

Define OpenGL-heavy. The drivers typically fall flat on their face compared to NVidia's when your developer "oopsed" something on a shader or a call; basically, the drivers are less tolerant of coding errors than NVidia's. As for WINE... I don't know, it's been a while since I've tried doing much in it. I port titles after hours, so I tend not to rely on band-aids to get games to play... ( :-D )

Re:Proper Linux Support? (1)

LWATCDR (28044) | more than 2 years ago | (#35952414)

Funny, but the FOSS community has said for years that "if they just documented the chips, we would write the drivers." What it comes down to is money. Very few people buy hardware to run Linux on; most people buy hardware to run Windows on, and most resources go to where most profit comes from. From what I have heard, ATI's drivers have gotten much better lately.

Re:Proper Linux Support? (1)

Anonymous Coward | more than 2 years ago | (#35952900)

That's because AMD has only pretended to release documentation. They have released the most basic, mostly useless parts.
And as promised, the X.Org radeon team has already implemented that, plus a lot more, as can be seen here: http://xorg.freedesktop.org/wiki/RadeonFeature [freedesktop.org]

If they release stuff that actually documents the 3D and video engines, instead of just basic mode setting & co., then we'll fix the bits that are missing too.

But it's so nice of this whole thread, with all parent posts and most sibling posts, to just spew uninformed, ignorant bullshit...

Re:Proper Linux Support? (1)

Svartalf (2997) | more than 2 years ago | (#35958014)

Uh, no... It's more that they're trying to get the INFRASTRUCTURE in right, so that you have no avoidable bottlenecks in the rendering path. You're watching the devs work toward, in their couple of years at it so far, what NVidia and AMD have already had 10+ years to do. It's NOT like the old drivers, where you just needed to know how to submit vertices and textures to the rasterization engine quickly, as with the RagePRO, Rage128, and G200/400 cards. (I should know about BOTH classes of hardware... I was one of the UtahGLX devs, and I did work for one of the big two as a contractor years back, on the class of hardware we're talking about. Coding for modern 3D cards is NOT easy or simple.) Quite simply, the bulk of the stuff was made available; just not the compressed-video-stream decode parts, mainly because that requires knowing how to turn the DRM components of the whole path on and off, and they're constrained by contractual obligations NOT to give that out.

Re:Proper Linux Support? (1)

Junta (36770) | more than 2 years ago | (#35952770)

I use AMD Catalyst with XBMC. I have the VA-API implementation and it works all right.

Re:Proper Linux Support? (1)

Dr. Spork (142693) | more than 2 years ago | (#35952886)

It's attitudes like this that make companies give up on releasing things to the open-source community. In the case of ATI, we've been begging for years, and when we finally got what we wanted, we're like... "sure, you gave us lots of toilet paper, but our asses are still not wiped, so get on it!"

Intel (0)

Anonymous Coward | more than 2 years ago | (#35954096)

In the case of ATI, we've been begging for years, and when we finally got what we want..

The target moves. AMD/ATI gave us something better than Nvidia did, but Intel set a new minimum standard.

Everyone who thinks Sandy Bridge is just another slightly faster series of processors needs to catch up on the news. Something wonderful has happened. If you're a Linux guy and not drooling over the Core ix 2xxxK models, your intell (heh) is outdated.

Re:Proper Linux Support? (0)

Anonymous Coward | more than 2 years ago | (#35954104)

I'm not in the 'anti-closed binary' camp, I just want the best tool for the job. Nvidia provides great CUDA and VDPAU support and it more or less 'just works'. ATI & Intel decided to jump on the Linux bandwagon by opening up everything and so far it seems like the community really hasn't jumped on it. I paid money for your hardware, why not pay an engineer to write software I can actually use?

Because Linux has an insignificant number of users, and the extra sales they would get do not justify the cost?

Re:Proper Linux Support? (0)

Anonymous Coward | more than 2 years ago | (#35954260)

At least for the last couple of years, ATI's closed-source Linux drivers seem to be fine. As a bonus, the "anti-closed binary" camp gets open-source drivers too. So if you go AMD, you have decent closed-source drivers (at least in my experience and that of others I know) plus some not-that-great open-source drivers to appease the "purists". Not sure why you would need nVidia if you are not into CUDA. And since I am more of a developer than a gamer, I am really looking forward to good OpenCL tools, since its asymmetric-device abilities make it a really interesting dev platform, in contrast to the closed CUDA.

Re:Proper Linux Support? (1)

Have Brain Will Rent (1031664) | more than 2 years ago | (#35954952)

I have to say that is the general feeling I'm getting about a lot of stuff.

I want to play around a bit with GPU programming for scientific calculation and the feeling I am getting is that my choices are either Linux with NVidia or Windows with ATI (or NVidia).

It's pretty much the same with Linux itself. I talked the wife into using Linux instead of Windows on her desktop and netbook but that pretty much meant Ubuntu. I don't want to have to maintain two different flavors so that means I run Ubuntu too. My experience with Ubuntu has been one of feeling like an unpaid beta (or sometimes alpha) tester. Things keep changing, afaics often just for the sake of change.

Now the latest is the complete replacement of Gnome so my multi-screen desktop can function more like a phone??? This may make perfect sense for Canonical. And it may be fine for people with lots of time to be continually trying out new things - but I don't have that time to spare anymore. I don't have time to learn a new major metaphor every year, or to figure out how to undo things to get back my old desktop, or my old audio player, or my old mail client, or my old IM client, or whatever. I want to use my time on my actual goal(s) not overcoming or undoing unasked for changes.

So after 6-7 years of trying I'm considering new directions. For the time being I'm sticking with 10.04 then if that becomes more problematic I will either try one more flavor of Linux - probably straight Debian - or just surrender to the dark side and for all its ugliness go back to Windows.

Our time is worth something and we want it to be expended on our actual goals.

I buy AMD hardware because it is usually better bang for the buck and because I want to support the market underdog if possible. With GPU computing my solution was to get a system just to run Win7 on... because I don't have the necessary significant time to invest just to find out if I can successfully do GPU computing with the combination of ATI/AMD and Linux. But now I'm having to keep current with two OS's - that's not likely to feel time effective for very long.

Re:Proper Linux Support? (0)

Anonymous Coward | more than 2 years ago | (#35955800)

Hello,

you should normally be able to use AMD's OpenCL drivers on Linux, according to their docs. As far as I've read, it should be possible to do OpenCL on plain x86, so without a GPU (as it is required by the spec).

I'm no specialist on this, but I just tried to install the AMD OpenCL drivers on Linux. It's not so difficult, but I don't have a supported AMD card yet (mine is very old), so that maybe makes it a little more difficult. They even have Ubuntu packages, I believe.

Kind regards,

Michel

Re:Proper Linux Support? (1)

Svartalf (2997) | more than 2 years ago | (#35957956)

Considering that the remark about NVidia's drivers working well is a Your-Mileage-May-Vary-Considerably one (Fermi not being well supported under Nouveau, NVidia dropping 2D driver support and pointing people to Nouveau for initial bring-up, and select Fermi chipsets NOT being supported (the GT 440, for example)), you MIGHT just want to moderate your remarks on that score.

Re:Proper Linux Support? (1)

CAIMLAS (41445) | more than 2 years ago | (#35958216)

What if one car is completely built and works 'well enough' and has good factory support, but the other one offers 30% more performance, has all-wheel drive (VT), and more than twice the storage space (8-16GB RAM support), though it needs new glow plugs (you can start it, but only in warm weather)?

That's the dichotomy of Atom vs. Bobcat, not what you propose.

ARM and AMD (5, Informative)

EponymousCustard (1442693) | more than 2 years ago | (#35951744)

Speaking to EE Times during a discussion of ARM's first quarter financial results CEO Warren East said: "AMD is a successful company selling microprocessors. ARM is in the business of licensing microprocessor designs. It is perfectly natural that we should have been trying to sell microprocessor designs to AMD for about the last ten years. Hitherto we haven't been successful." East also said: "AMD has signaled they are going through a rethink of their strategy, and that must provide a heightened opportunity for ARM. They might use ARM microprocessors in the future and you've got to expect that we would be trying to persuade them of that." http://www.eetimes.com/electronics-news/4215518/ARM-working-on-AMD-to-drop-x86 [eetimes.com]

Re:ARM and AMD (1)

PeterKraus (1244558) | more than 2 years ago | (#35951932)

From the linked article:

AMD has lagged NV enormously when it comes to gaming or general application performance.

What?

Re:ARM and AMD (1)

h4rr4r (612664) | more than 2 years ago | (#35951966)

That makes no sense. I am setting up a new box and will have to pay more for a worse NV card to get the performance I want, since I run Linux.

All my other machines use intel graphics, but I want to game on that one.

Re:ARM and AMD (0)

Anonymous Coward | more than 2 years ago | (#35953236)

From all your whining above, you'd think AMD ate your children.

Re:ARM and AMD (1)

h4rr4r (612664) | more than 2 years ago | (#35954470)

I love AMD CPUs, and the next box will be an AMD CPU and an NVidia GPU. I need working drivers.

Re:ARM and AMD (0)

Anonymous Coward | more than 2 years ago | (#35955946)

AMD's Linux graphics drivers have greatly improved in the past couple years. When was the last time you used them?

Re:ARM and AMD (1)

owlstead (636356) | more than 2 years ago | (#35956124)

Last time, they were a good enough reason for me to get a new motherboard and nVidia graphics card. Boy, I really wish their drivers did not suck so much, since they've got the more interesting hardware. I had good 3D and good video, but they would just not work at the same time. Then you start to tinker, and before you know it you've got an X configuration that even I could not fix (and I've got some 17 years of experience). Their multi-monitor support wasn't up to par either, and that's stuff I *NEED* and expect to work to perfection by now.

Re:ARM and AMD (0)

Anonymous Coward | more than 2 years ago | (#35954634)

My Machine *is* my child, you insensitive clod!

Re:ARM and AMD (0)

Anonymous Coward | more than 2 years ago | (#35953494)

Google is betting on ARM... that's why they've released all these x86-specific trade secrets, like CityHash and Snappy. Having an x86 performance advantage won't help when they're using ARM, and in the meantime, getting people to write code and systems using these algorithms may make it harder for their competitors to switch to ARM later on.

2012 (2)

macson_g (1551397) | more than 2 years ago | (#35952160)

2012 will be the year of ARM on the desktop.