NVIDIA Releases Source To CUDA Compiler

Unknown Lamer posted more than 2 years ago | from the not-quite-there-yet dept.

Programming 89

An anonymous reader writes "NVIDIA has announced they have 'open-sourced' their new CUDA compiler so that their GPGPU platform can be brought to new architectures. NVIDIA's CUDA compiler is based upon LLVM. At the moment, though, they seem to be restricting access to the source code to 'qualified' individuals. The official press release implies wider access to the source will happen later. It so happens that a few days ago AMD opened their OpenCL backend and added initial support to the Free Software r600 driver."

Should work well with AMD 7000 series (2)

Ken_g6 (775014) | more than 2 years ago | (#38370658)

Based on what we know about the high-end AMD 7000 series, that it will forgo VLIW for separate threads, CUDA might actually work very well on that architecture. As long as the right 'qualified' individuals work on it.

Re:Should work well with AMD 7000 series (1)

GameboyRMH (1153867) | more than 2 years ago | (#38371752)

Doesn't this open source compiler still compile code that requires an Nvidia-proprietary hardware technology? Anyone who wants to run on non-Nvidia GPUs should just use OpenCL.

Re:Should work well with AMD 7000 series (3, Interesting)

GameboyRMH (1153867) | more than 2 years ago | (#38371796)

D'oh, NM, I RTFA'd:

Nvidia's CUDA compiler will be able to create code that supports more programming languages and will run on AMD and Intel processors, while previously it ran only on Nvidia's GPUs. The company made the announcement today at the GPU Technology Conference in Beijing.

Continuing to call it a CUDA compiler is a bit misleading then, isn't it?

GPU drivers (1)

Anonymous Coward | more than 2 years ago | (#38370666)

If they open-sourced their video drivers too, NVIDIA would be the clear card of choice on *nix systems.

Re:GPU drivers (-1)

Anonymous Coward | more than 2 years ago | (#38370756)

"Its shit but we want it more because its open source!" Actually does sound like the typical Linux user.

Re:GPU drivers (0)

Anonymous Coward | more than 2 years ago | (#38371740)

Yeah because ATI's really good about drivers. I've never had problems with my nvidia gear.

Re:GPU drivers (3, Interesting)

hairyfeet (841228) | more than 2 years ago | (#38372664)

Yes, please help proprietary hardware vendors by making sure nobody will support you, that's the ticket!

I mean what do you think other hardware vendors are gonna do? they are gonna look at what happened with AMD. AMD releases all the specs they can (there are some bits that are part of HDCP they don't have the right to release) and even go so far as to hire developers out of their own pocket to support the free drivers, and what does the community do? What do you see on every forum, including here? "LOL use nvidia".

Any company with a brain after seeing that would tell you and the rest of your little ungrateful "community" to fuck right off. So thanks, you are helping to ensure that FOSS stays the choice of hobbyist nerds and never gains any share, making sure guys like me have plenty of work, appreciate it pal.

Re:GPU drivers (1)

Anonymous Coward | more than 2 years ago | (#38376202)

That's because right now, the nVidia solution just works. The open source community is primarily made up of people who proclaim open source as the next coming, use the software, and contribute absolutely nothing meaningful back. Of the small percentage that does contribute, you have people lending advice on IRC and message boards, you have people writing documentation, you have people doing HCI work, you have people managing, and you have people contributing code. Of those contributing code, you are only going to have a small percentage that has the skill set and inclination to read specsheets and develop a driver from them, and out of those, you only have a small number who actually are doing so, rather than working on their own separate project.

Driver development for such a complex piece of hardware like a video card takes considerable time and effort. Releasing data on how to interface with your hardware doesn't mean a development team a hundred engineers strong will suddenly spring up overnight. Open source drivers are always preferable to closed source ones, because anyone can maintain them if their company abandons them, but that requires them to be functional in the first place. nVidia has spent a lot of time making their Linux drivers work well, while ATI largely left theirs to rot. Since AMD took over, they have been improving, but still have too many problems to be relied upon when purchasing hardware. Releasing the specs to the community was a nice gesture, but a largely empty one, as the open source drivers developed from them are still only of limited capability. At the end of the day, you have to use what works.

not related to open source (1)

dutchwhizzman (817898) | more than 2 years ago | (#38376358)

I despise Nvidia for not opening up their drivers, but I still use them on Linux and Windows. Why? Because their drivers don't crash my box half as often as the ATi drivers do if I use them the way I want to, or ATi simply doesn't do what I want. In the end, I much prefer open source, but I'll use whatever gets me the best results for my application. My computers are first and foremost tools, not political advocacy devices.

Blaming the lack of success for ATi on Linux desktops on the fact that they went open source simply doesn't hold. They were behind before they went open source, and even though they've improved greatly, they are still significantly behind Nvidia when it comes to Linux drivers.

Re:GPU drivers (1)

hufter (542690) | more than 2 years ago | (#38381472)

For most people it's all the same whether a driver is open source or not, as long as it works and you get it for free.
That said, I appreciate AMD opening up some of their secrets. The open source Radeon driver is a lot better than it was in the ATI days. However, the closed driver still gives you double the fps in games; the bad news is that it works so badly. Messes up the screen and stuff.
I recently bought a new nVidia-based card, partly because I was fed up with the proprietary Radeon driver not working. I gave up trying to install it on Fedora 16 or Mint 12.
GeForce is the only choice if you want fast graphics on Linux right now.

Re:GPU drivers (1)

jlehtira (655619) | more than 2 years ago | (#38382736)

ATI once bricked my Radeon laptop by suddenly making drivers that can't draw a single pixel on the mobile 9600. Okay, so maybe it was a bug, but they weren't in a hurry to fix it. Yes, I could have installed an older driver, but because of Linux, that would have also meant installing an old distribution with an old kernel. I needed new features and programs. And even while the ATI driver initially worked, it didn't support everything (dual screen in particular was hacky).

I'd be very happy to see AMD make stable and feature-perfect drivers (consistently) for Linux. But given their well-earned reputation and my personal hardships with Radeons, I'm not buying another ATI/AMD graphics card unless I see several years of flawless drivers from them (at least the kind of flawlessness that Nvidia offers).

So everybody "knows" that Nvidia is better for Linux, and not many people are left to find out if the drivers turn better. Too bad. They're in the grave they dug themselves.

Re:GPU drivers (1)

Anonymous Coward | more than 2 years ago | (#38382792)

Interesting that you add this little delusion but completely ignore that, immediately prior, ATI told Linux users to go fuck themselves and handed the entire Linux market to NVIDIA. Furthermore, unlike ATI, NVIDIA has been very good to the Linux community for a very, very long time now. And unlike what you present, ATI's effort reflects an attempt to grab market share away from NVIDIA rather than a good-faith effort to do "right" by the open source community. On top of that, NVIDIA has repeatedly stated they do not own all of the IP in their stack. So basically your argument is: fuck the ones who have always done well by you.

On top of that, ATI makes good hardware but their drivers suck...suck...suck... Well, okay, modern drivers only suck once rather than three times. They are still behind NVIDIA's drivers in quality. And unlike with NVIDIA, myself and several people I know have all had hardware in active use deprecated by ATI, one of which was a laptop. Which basically means a laptop that was in active use can no longer be upgraded without losing 3D support.

Sorry, but ATI can take their disingenuous marketing ploy and go fuck themselves. In the grand scheme of things, unlike what the delusional open source fanatics tell you, we don't need the source to the video drivers and so long as the drivers work well, binary blobs are perfectly acceptable. Welcome to the real world.

Re:GPU drivers (2)

marcosdumay (620877) | more than 2 years ago | (#38374876)

"It is shit, but will be supported and won't stop working after the kernel upgrades from 3.8.54-patch3 to 3.8.54-patch4."

Seems to be a sane option.

Re:GPU drivers (3, Informative)

Anonymous Coward | more than 2 years ago | (#38370772)

They can't, not fully at least. They have to keep details about tilt bits and other DRM and patented crap secret.

Re:GPU drivers (3, Informative)

Anonymous Coward | more than 2 years ago | (#38370840)

If it's patented, then it isn't a secret anymore, by virtue of the patent.

Re:GPU drivers (0, Troll)

jpate (1356395) | more than 2 years ago | (#38370978)

+1 idealist. If only...

Re:GPU drivers (3, Insightful)

fnj (64210) | more than 2 years ago | (#38371096)

Do you know something we don't? Is there some deficiency in our understanding of patents? "A patent [wikipedia.org] ... consists of a set of exclusive rights granted by a sovereign state to an inventor or their assignee for a limited period of time in exchange for the public disclosure of an invention" (emphasis added).

Re:GPU drivers (0)

Anonymous Coward | more than 2 years ago | (#38371348)

The problem is not so much the "patented" technologies as the "non-patented" ones: technologies licensed from third parties, and "secret" technologies.

As a wild guess, I think Nvidia has licensed technologies from third parties that are declared enemies of open source (Imagination Tech could be one of them), and thus Nvidia can't really open their technologies, not without breaking their contracts.

Re:GPU drivers (1)

fnj (64210) | more than 2 years ago | (#38372326)

Yes. Very valid point. I was only addressing the remark about patents.

Re:GPU drivers (0)

Anonymous Coward | more than 2 years ago | (#38371770)

I think his point is about companies filing generic, overbroad software patents without necessarily getting into implementation details, which does happen.

On the other hand plenty of graphics patents are very formula specific. Witness John Carmack having to change a shading algorithm before releasing the doom3 source.

Finally, just because an algorithm is patented, doesn't mean they've released implementation details for speeding up said algorithm. (Or maybe they released it for gen X card, but gen Y has a better way of doing it and they see no need to release the details for that...)

Re:GPU drivers (1)

fnj (64210) | more than 2 years ago | (#38372274)

If that does happen, it's a violation. Look, I'm strongly against the absurd fringe patenting of software, processes, business methods, and the like, and I'm not in favor of ANY patents under the present absurd system; but if they are going to exist in the first place, then outright violations should be shot down on sight.

Re:GPU drivers (2)

jpate (1356395) | more than 2 years ago | (#38371958)

That might be the intent, but for a lot of software patents, at least, the language is too vague and broad for things to work out this way. They're written with the intent of catching as many technologies as possible in the same net; disclosure is not a concern.

Re:GPU drivers (1)

fnj (64210) | more than 2 years ago | (#38372300)

I consider that practice a fundamental violation and agree that it should be stamped out.

Re:GPU drivers (1)

k_187 (61692) | more than 2 years ago | (#38372642)

Yup, any patent that includes a software step should be required to disclose the source code, at least as the best mode of the patent, if not as part of the claims (depending on how strict you want to be). When I'm elected Leader of the World, this will be like the 11th or 12th thing I'll do in office.

Re:GPU drivers (0)

Anonymous Coward | more than 2 years ago | (#38373524)

the 11th or 12th thing I'll do in office

Seriously... What's with you and 11 and 12?

Re:GPU drivers (0)

Anonymous Coward | more than 2 years ago | (#38373594)

The issue I think he's pointing to is that patents in the USA can be so general in description that one may not understand how a DRM scheme actually functions just by reading the patent.

Re:GPU drivers (1)

tepples (727027) | more than 2 years ago | (#38371830)

A single technology license can cover both a patented invention and related trade-secret know-how that is not patented. This know-how is how I understand "details about [...] patented crap", especially for patents that describe an invention in broad strokes. I imagine that patent licensors want NVIDIA to keep the secret parts secret too.

Re:GPU drivers (2)

bunratty (545641) | more than 2 years ago | (#38371578)

Without patents, there would be far more trade secrets. Patents make inventions public, not private. That's the whole purpose of patents: to promote the spread of ideas by making them public.

Re:GPU drivers (1)

avatar139 (918375) | more than 2 years ago | (#38372454)

Without patents, there would be far more trade secrets. Patents make inventions public, not private. That's the whole purpose of patents: to promote the spread of ideas by making them public.

Really? Wow, and here I thought they're solely a means for companies that don't actually produce anything except lawsuits to make money via protectionist licensing schemes! :P

Re:GPU drivers (1)

Stele (9443) | more than 2 years ago | (#38382382)

If they would open source video drivers too NVIDIA would be clear card of choice in *nix systems.

It already is. At least in the post production industry. You don't see any of us whining about the lack of open-source drivers. We just want to get stuff done with the most reliable drivers available.

So... (1)

AdamJS (2466928) | more than 2 years ago | (#38370780)

Does this mean eventually running CUDA applications on AMD GPUs?

Re:So... (1)

melonakos (948203) | more than 2 years ago | (#38371068)

My guess is there will be some academic projects, like Ocelot, that will take a stab at this. But I doubt it will be a better path than using OpenCL directly as supported by AMD/ATI.

Re:So... (4, Interesting)

fsckmnky (2505008) | more than 2 years ago | (#38371324)

imho ... OpenCL is a much better path, because it can execute code on a CPU as well as a GPU. It can even target FPGAs for executing the parallel operations on reconfigurable hardware, as well as sharing output paths with OpenGL for visualization.

The Nvidia driver (at least for Linux) currently seems to support only Nvidia GPUs as a target, but the AMD driver supports AMD GPUs as well as the host system's CPU. Again, on Linux at least, you can install both AMD's and Nvidia's drivers if you want to utilize your CPU (via the AMD driver) and an Nvidia GPU (via the Nvidia driver) at the same time, although there are some minor framework-related hoops to jump through to get parallel execution across multiple device platforms concurrently.
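For what it's worth, the "hoops" on Linux mostly come down to the OpenCL ICD (Installable Client Driver) loader: each vendor's driver registers itself with a small text file under /etc/OpenCL/vendors/ naming its implementation library, and applications linked against the loader then see every registered platform at once. A sketch (exact file names vary by driver and version, so treat these as illustrative):

```
# /etc/OpenCL/vendors/nvidia.icd   -- dropped in by the NVIDIA driver
libnvidia-opencl.so.1

# /etc/OpenCL/vendors/amdocl64.icd -- dropped in by the AMD APP SDK
libamdocl64.so
```

With both files present, a clGetPlatformIDs call reports two platforms, and the application picks CPU or GPU devices from each platform separately.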

These features seem to indicate native CUDA is pretty much a dead platform ( looking forward ).

From TFA: CUDA runs on x86 (3, Interesting)

tepples (727027) | more than 2 years ago | (#38371874)

OpenCL is a much better path, because it can execute code on a CPU as well as a GPU.

So can CUDA, according to a graphic in one of the featured articles [nvidia.com] : "NVIDIA C or C++, PGI Fortran, or new language support, through LLVM-based CUDA compiler, to NVIDIA GPUs, x86 CPUs, and new processor support."

Re:From TFA: CUDA runs on x86 (3, Interesting)

fsckmnky (2505008) | more than 2 years ago | (#38371994)

And as I said, the OpenCL driver I downloaded from Nvidia a week ago doesn't support CPUs as a target. This is not to say CUDA isn't capable of doing this at some point, or with some third-party addition; just that the capability currently doesn't exist, or at the very least isn't being provided by Nvidia.

Given the choice of a vendor-neutral platform or a vendor-supplied platform (where the odds of CUDA gaining support for AMD GPUs are near 0)... as a developer, which would you choose?

Hence my prediction: OpenCL will win (the hearts-and-minds contest). It's already won mine. I wouldn't touch CUDA for fear of being locked into Nvidia products, versus the open, broad support of OpenCL. I even ordered an AMD video card, so I can ditch the Nvidia-only driver entirely.

Re:From TFA: CUDA runs on x86 (2)

fsckmnky (2505008) | more than 2 years ago | (#38372184)

And now that I contemplate this more, this announcement seems like Nvidia's attempt to win back people like myself and convince everyone to use CUDA instead of OpenCL. Much like the DirectX vs. OpenGL thing Microsoft pulled off, but perhaps with less chance of success.

Re:From TFA: CUDA runs on x86 (0)

Anonymous Coward | more than 2 years ago | (#38381138)

Perhaps a better analogy would be Glide vs OpenGL (primarily because Glide sounds more car-related).

Re:From TFA: CUDA runs on x86 (0)

Anonymous Coward | more than 2 years ago | (#38381406)

A careful reading of TFA shows that the *next* release of CUDA *will be* released as source to NDAed individuals, and *that* release contains ability to run on CPUs.

But ability to run on CPUs really isn't that important; there's something of a performance disparity, not to mention the fact that GPGPU algorithms are typically much slower, FLOP for FLOP, than the equivalent CPU code. We already have CUDA for CPUs; it's called C.

Nonetheless, OpenCL or even DirectCompute is going to be a better target than CUDA for the reasons of lock-in you mention, just as everyone uses HLSL or glslang these days rather than Cg.

And please note nVidia's poor record with Cg. They "open-sourced" the 0.9 Cg compiler with "personal use only" restrictions and then failed to ever update it.

any word on a license? (3, Insightful)

Trepidity (597) | more than 2 years ago | (#38370810)

Despite the phrase "open-source", there seems to be a distinct lack of information about whether this is a "source is now available for inspection" type release, or actually under an open-source license, and if so, which one.

Still no news about the specific license (0)

Anonymous Coward | more than 2 years ago | (#38370822)

We can hope for GPL but it looks like it's unfortunately going to be some kind of BSD.

Re:Still no news about the specific license (0)

oPless (63249) | more than 2 years ago | (#38371008)

"unfortunately" ?
BSD > GPL in terms of freedom.

Re:Still no news about the specific license (0)

larry bagina (561269) | more than 2 years ago | (#38371102)

They should release it under the GPL license so an evil company like NVIDIA can't steal it and not give their changes back.

Re:Still no news about the specific license (0)

Anonymous Coward | more than 2 years ago | (#38371228)

Fuck you. If you don't want NVIDIA's (hypothetical) proprietary bastardized version, DON'T BLOODY USE IT!

There is no "steal it", because the original is still available, still under the same license, you can use that -- and get all the stuff _except_ what nvidia wrote, and the whole premise of copyright is that you get to control others' dissemination of what you wrote (for great profit!). If you disagree with that premise (as I do), then fight for the abolition of copyright, don't use it to strongarm companies into giving you source code.

Re:Still no news about the specific license (1)

fnj (64210) | more than 2 years ago | (#38371218)

This is the kind of disagreement in which neither side can ever hope to convince the other. Let's just say that you are free to do almost anything you want under a BSD license. That's an objective fact. You are not compelled to contribute back any improvements you may make in a work. Which one contributes more to global "freedom" is much too broad and slippery a concept to ever resolve unanimously.

Re:Still no news about the specific license (2, Interesting)

hedwards (940851) | more than 2 years ago | (#38371510)

OK, then how do you explain the rather large number of companies that give back to BSD projects? This anti-BSD FUD that the Linux and GPL camp seem to need to spread got old many, many years ago.

Without a permissive license the internet would have been greatly delayed as MS and the others would have had to develop their own TCP/IP stack from scratch.

Re:Still no news about the specific license (-1, Troll)

Dog-Cow (21281) | more than 2 years ago | (#38371710)

You are an idiot. Please fuck off and die. Or at least learn to comprehend. Then fuck off and die.

Re:Still no news about the specific license (0)

Anonymous Coward | more than 2 years ago | (#38372158)

I'm very sorry for flying off the handle there. But I am afraid of hurting my super smart/tough image so I'll just post it anonymous. Can I bake you some cookies?

sincerely

Dog-Cow

Re:Still no news about the specific license (0)

Anonymous Coward | more than 2 years ago | (#38375910)

I apologize too. Let's get a room then we can share the "cookies" together, ok?

Toodles,

hedwards

Re:Still no news about the specific license (0)

Anonymous Coward | more than 2 years ago | (#38376602)

Yes, let's. My "cookies" are delicious. 8pm sound ok?

giggles,

Dog-Cow

Re:Still no news about the specific license (1)

fnj (64210) | more than 2 years ago | (#38372376)

I believe that is consonant with my comment. Not everyone only does what they are compelled to do. I have BSD licensed some of my own code.

Re:Still no news about the specific license (1)

diego.viola (1104521) | more than 2 years ago | (#38376978)

That depends on your definition of freedom. I prefer freedom in the sense that I know the code will always remain free and available. As a developer and user I want my code to be always available, I also want contributors to never close what I made open in the first place. I prefer freedom in the sense of Free software, GPL and strong copy-left. AKA "Liberty or Death".

BSD brings uncertainty when it comes to having contributors closing the source code and not contributing back.

Re:Still no news about the specific license (1)

oPless (63249) | more than 2 years ago | (#38381344)

GPL has its place, I agree. But unfortunately the usage of it is often 'wrong', or malicious IMHO.

Re:Still no news about the specific license (1)

marcosdumay (620877) | more than 2 years ago | (#38374948)

BSD is the perfect license to apply to a layer of software that helps people talking to the hardware you sell.

If somebody wants to "steal" it, and make something great without sharing upstream, well, great for you, more people will buy your hardware.

No they haven't (5, Interesting)

PatDev (1344467) | more than 2 years ago | (#38370850)

Title is correct. From TFA, the summary appears wrong. It seems they are not open sourcing anything. To quote TFA

On December 13th, NVIDIA announced that it will open up the CUDA platform by releasing source code for the CUDA Compiler.

They will let you look at the code, and they might let you send patches back to them. Nowhere that I can find did NVIDIA promise anything along the lines of an open license, or even any license at all. This is more like a Microsoft shared-source deal, where you can look, but no rights or privileges are transferred to you.

That said, it would still be cool to see.

Re:No they haven't (0)

Anonymous Coward | more than 2 years ago | (#38370950)

Not sure if any developer should even look at the code given that it is not really open. Last thing a developer would want is to end up being accused of copying stuff from said code illegally.

Re:No they haven't (1)

gl4ss (559668) | more than 2 years ago | (#38371282)

Not sure if any developer should even look at the code given that it is not really open. Last thing a developer would want is to end up being accused of copying stuff from said code illegally.

Usually the company wouldn't admit to you having seen the code anyhow, if they considered it that secret, and NDAs usually go both ways.
You know why? Because usually the code is shit, and admitting you worked on the project would give the employee an edge when looking for future work; big companies don't like giving exact creds to people walking out, they like some creds on people walking in.

Getting "tainted" from looking at the code is pure scaredy-pants nonsense. The code won't stay relevant long enough for it to matter anyway; just don't keep a copy around. And from a future employer's viewpoint, it is cool if you did look at something once.

Of course, the number of people this particular code is interesting to is quite limited.

But I really have to wonder about the people who recommend never looking at any source at all for fear of getting tainted: do they think employers want noobs?

Re:No they haven't (-1)

Anonymous Coward | more than 2 years ago | (#38370956)

That said, it would still be cool to see.

Looking at this source would decrease your value as a programmer in the job market, as you would be "tainted" forever with Nvidia intellectual property. Your future employer would be liable for lawsuits from nvidia.

it's better to NOT look at it.

Re:No they haven't (1)

Anonymous Coward | more than 2 years ago | (#38371130)

What a load of absolute crap. By that logic, don't ever work for anyone or ever sign any NDA.

Looking at stuff is how programmers learn. The whole industry is built on residual knowledge. If I say I used to work at nVidia (which I did) on stuff that they want to look at themselves, then provided the non-compete clause is done, I am at a massive advantage compared to someone who has never looked at anything at all.

Stop trotting out this bullshit myth that you can't look at things if you later want a job.

Re:No they haven't (1)

fnj (64210) | more than 2 years ago | (#38371272)

It may appear so to some, but it may or may not be so in fact. We don't know whether they intend to open source it or not; not from this article, anyway. It certainly doesn't say it will NOT be open sourced, though.

Re:No they haven't (1)

Robert Zenz (1680268) | more than 2 years ago | (#38371320)

Open Source != Free Software.

Re:No they haven't (0)

Anonymous Coward | more than 2 years ago | (#38371556)

http://www.opensource.org/docs/osd

The people who coined the term Open Source. If you use the capital letters, you are talking about this. It's not exactly the same as Free Software, but Free Software is much closer to it than what NVIDIA has done.

Re:No they haven't (1)

Melkhior (169823) | more than 2 years ago | (#38372876)

Why is everybody thinking this is big news?
ftp://download.nvidia.com/CUDAOpen64/ [nvidia.com]
The previous compiler, based upon Open64, has been available in source form since CUDA 1.0. They (partially) switched to LLVM in 4.1, and they are also releasing that source code. They didn't have to, because unlike Open64, LLVM is not GPL; so it's nice of them, but it's not exactly earth-shattering news...

In related news (1)

StikyPad (445176) | more than 2 years ago | (#38370980)

Stikypad [sic] has "given away" all of his "money" to a "qualified individual."

Re:In related news (2)

chill (34294) | more than 2 years ago | (#38371200)

Ah, you're married, aren't you.

Re:In related news (1)

GameboyRMH (1153867) | more than 2 years ago | (#38371968)

No he's just started investing.

Re:In related news (1)

StikyPad (445176) | more than 2 years ago | (#38372078)

Lost at Monopoly.

Can someone tell me NVidia's business model? (1)

Brannon (221550) | more than 2 years ago | (#38371030)

Discrete graphics is going away, they seem to be leaning increasingly towards the HPC market but that is tiny compared to the consumer graphics market that their company was built on. I just don't see it. Anyone?

Re:Can someone tell me NVidia's business model? (2)

Ken_g6 (775014) | more than 2 years ago | (#38371082)

(Off topic): Tegra [wikipedia.org]? (An ARM chip with nVIDIA graphics.)

Re:Can someone tell me NVidia's business model? (4, Informative)

hardwareman (144389) | more than 2 years ago | (#38371268)

Well, they are making some of the best mobile/low-power solutions with the Tegra [nvidia.com] family of chipsets.

I also believe that it's still going to take some time before the integrated solutions (Intel IGP and AMD Fusion) are good enough to replace discrete graphics for gamers - where they are strong today.

Xbox 360 and Wii have integrated graphics (1)

tepples (727027) | more than 2 years ago | (#38372072)

They are making some of the best mobile/low-power solutions with the Tegra [nvidia.com] family of chip-sets.

That and nForce.

it's still going to take some time before the integrated solutions (Intel IGP and AMD Fusion) are good enough to replace discrete graphics for gamers

SWF games, such as those seen on Facebook, are targeted at PCs with Intel GMA (Graphics My Ass) IGPs. So to people for whom "gaming" means FarmVille and "upgrade" means buying a new PC, integrated graphics have replaced discrete.

Xbox 360 and Wii have "integrated graphics" by AMD in the sense that the GPU is on the northbridge. The 360's graphics are also "integrated" in the sense that all 512 MB of its RAM can be used as VRAM. I'm not very familiar with the PS3 architecture other than that half its 512 MB of RAM is exclusive to the NVIDIA RSX GPU, so I don't know what other functions the RSX does in addition to GPU operations. But at least for people who game on an Xbox 360 or Wii and do not game on the PC, integrated graphics have replaced discrete.

Re:Xbox 360 and Wii have integrated graphics (0)

Anonymous Coward | more than 2 years ago | (#38373036)

I'm not sure you could categorise the Wii's Hollywood package as a northbridge. Apart from the Hollywood GPU (including 3 MB of RAM) and 24 MB of the system RAM it contains a bunch of other stuff like the audio DSP and an ARM9 SoC that runs 'IOS', the basic operating system, controller of the main PPC processor and primary enforcer of security. I'd call it more like a southbridge but even then it does more and lacks the cohesion that I'd expect from something bearing that name. There are four names crammed onto that thing (Nintendo, ATI, BroadOn, NEC) for a reason...

For that matter, I'm not sure the Wii's GPU is really in the same league as other AMD GPUs. It's an upgrade of the Gamecube GPU, which was designed by a company of ex-SGI engineers who worked on the Nintendo 64. The company got acquired by ATI shortly before the release of the Gamecube and so I assume the design is more SGI-ish than AMD-ish.

Re:Xbox 360 and Wii have integrated graphics (1)

hairyfeet (841228) | more than 2 years ago | (#38373916)

If you are talking about chipsets my friend I'm afraid you are mistaken, as Nvidia got out of that business nearly two years ago; all they sell now is crappy old designs they had finished before they got out of the business. I know because I'm having a fricking devil of a time with a new board thanks to the fact the Nvidia chips are so old they don't support AHCI, and the last board did, so Windows doesn't want to boot off the Nvidia. Learned a valuable lesson though, two actually: number one, Nvidia boards are shit, and number two, just because a board manufacturer says a chip is supported don't mean shit until you see the damned thing fire. This board has Thuban clearly marked on their website under CPU but won't fire on anything bigger than a quad, ARGH!

As for the "farmville" crowd? If the chip doesn't support hardware accelerated Flash then it's shit. I should know because I have plenty of customers addicted to FB games, and if you want them to be happy give them a machine with hardware accelerated Flash; otherwise they are bitching about the game being jerky or slow. And as far as the consoles being "integrated"? Not even close, friend, as the RAM they use on the GPU (IIRC XDR RAM) has a bigger pipe and thus can run more data through than plain old DDR whatever. It's been a few years since I looked at the consoles so I might not have the terminology down, but I remember it wasn't just COTS RAM they used on the GPUs, which is why they don't have nearly as much as your average $50 discrete card.

Re:Can someone tell me NVidia's business model? (1)

yupa (751893) | more than 2 years ago | (#38374500)

You mean the Tegra 2 that lacks NEON support and has low memory bandwidth?

Re:Can someone tell me NVidia's business model? (1)

gupg (58086) | more than 2 years ago | (#38376918)

Discrete graphics is going away, they seem to be leaning increasingly towards the HPC market but that is tiny compared to the consumer graphics market that their company was built on. I just don't see it. Anyone?

Discrete GPU market is growing. See JPR's analyst reports http://jonpeddie.com/press-releases/details/embedded-graphics-processors-killing-off-igps-no-threat-to-discrete-gpus/ [jonpeddie.com]

here is the full report http://jonpeddie.com/download/media/slides/An_Analysis_of_the_GPU_Market.pdf [jonpeddie.com]

That is the funniest thing I've read in a long (1)

Brannon (221550) | more than 2 years ago | (#38385746)

long time.

I'm going to print it out and put it on the shelf next to:

    * "Buggy whip industry still growing with no end in sight"
    * "Refrigeration is no threat to the ice delivery business"
    * "Travel agents expect little competition from internet sales"

GPGPU Cold War finally ending? (1)

rcrodgers (1233228) | more than 2 years ago | (#38371086)

Does this mean that AMD/ATI and nVidia are finally recognizing that the only people really losing out in their cold war are their users? I'm traditionally an AMD/ATI customer but have been leaning towards getting an nVidia card for the CUDA support in Adobe's Creative Suite, but if this means that at some point in the future the Radeon HD 7000 series will support CUDA and will potentially accelerate CS, then I'll stick with it...

Open Source the Libraries (4, Interesting)

melonakos (948203) | more than 2 years ago | (#38371220)

IMO, open sourcing their GPU libraries would be a much bigger deal than only open sourcing the compiler. I would like to see CUBLAS, CUFFT, CUSPARSE, CURAND, etc all get opened up to the community.

The pain is not in compiling GPU code; rather, the pain is in writing good GPU code. The major difference between NVIDIA and AMD (and the major edge NVIDIA has over AMD) is not as much the compiler as it is the libraries.

Of course, I'm biased, because I work at AccelerEyes and we do GPU consulting with our freely available, but not open source, ArrayFire GPU library [accelereyes.com] , which has both CUDA and OpenCL versions.
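The point about libraries can be illustrated with a CPU-side analogy (Python here, since it assumes no GPU): the naive version compiles and runs just fine, but the tuned BLAS routine behind `numpy.dot` is where the performance actually lives, which is the same relationship CUBLAS has to a hand-rolled CUDA kernel. The sizes and code below are purely illustrative.

```python
import numpy as np

def naive_matmul(a, b):
    """Textbook triple-loop matrix multiply: trivially easy to compile,
    but orders of magnitude slower than a tuned BLAS kernel at real sizes."""
    n, m, k = len(a), len(b), len(b[0])
    out = [[0.0] * k for _ in range(n)]
    for i in range(n):
        for j in range(k):
            s = 0.0
            for p in range(m):
                s += a[i][p] * b[p][j]
            out[i][j] = s
    return out

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]

# Same math, but numpy dispatches to an optimized BLAS implementation,
# the CPU analogue of calling into CUBLAS instead of writing the kernel.
assert np.allclose(naive_matmul(a, b), np.dot(a, b))
```

Compiling the naive loop is the easy part; producing the tuned routine it competes against is years of work, which is the asymmetry the comment above is describing.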

Re:Open Source the Libraries (2)

melonakos (948203) | more than 2 years ago | (#38371472)

Also, OpenCL is not going anywhere, even if someone figured out how to get CUDA code to run well on ATI GPUs. In addition to many other reasons which are getting discussed in these comments, OpenCL is gaining a lot of traction with mobile GPU vendors too (e.g. ARM Mali, Imagination PowerVR, Qualcomm Adreno, etc).

Nvidia Driver (0)

Anonymous Coward | more than 2 years ago | (#38371420)

Is there any particular reason they are not making the driver open source? I have done programming with CUDA but it was troublesome to get the driver installed and working correctly. It would be much easier if it was just part of the modules distributed by default with any distribution. Wouldn't open sourcing the driver, which is free as in free beer anyway, encourage people to buy Nvidia's hardware?

Re:Nvidia Driver - trade secrets (1)

DCFusor (1763438) | more than 2 years ago | (#38374618)

A big part of the trade-secrets business in competing GPUs is how things are done in the details: how the problems are phrased or broken up between the CPU and GPU, and exactly what algorithms are used in each. No one patents this stuff; it's all trade secret. So no one wants to open source their drivers, where all this information lives. Stinks, but there it is.

Bitcoin (-1, Offtopic)

DriedClexler (814907) | more than 2 years ago | (#38371928)

But the question everyone wants to know is, will this allow programmers to write even better software for Bitcoin miners using AMD GPUs?

(By "everyone", I mean the community of Bitcoin miners.)

Re:Bitcoin (-1, Redundant)

DriedClexler (814907) | more than 2 years ago | (#38372172)

Redundant? What the fuck? Where are the other, earlier comments about implications for Bitcoin mining?

Running Linux in the CUDA Cores (1)

hoodofblack (957491) | more than 2 years ago | (#38372788)

I think it would be awesome if there was a way to run the kernel on the CUDA cores. I have a GeForce GTX 560 Ti with 384 cores; it would be awesome to use even half of those in the kernel. Even the GeForce 8800 has 112 cores. Wow, this could be awesome, using those for something productive. If you know of a Linux that will run on the CUDA cores I would be happy to know about it.

Re:Running Linux in the CUDA Cores (1)

Zulkis (839927) | more than 2 years ago | (#38374012)

This is not how GPU cores work. There are some efforts to offload tasks that the kernel does which are suitable for a GPU, like block encryption etc. (in general, everything that is parallel enough and can be streamed). For instance, there's AES acceleration: http://gpgpu.org/2011/05/04/kgpu-gpu-computing-in-linux-kernel [gpgpu.org]

ATI/AMD Driver Is Not Free Software (0)

Anonymous Coward | more than 2 years ago | (#38374444)

Subject says it all really. It includes binary blobs.

http://www.fsfla.org/svnwiki/selibre/linux-libre/

sitting idle (0)

Anonymous Coward | more than 2 years ago | (#38381984)

any way to offload the OpenVPN calculations to the GPU?
OpenVPN server is eating away at my intel atom 330 : (

Re:sitting idle (2)

Bengie (1121981) | more than 2 years ago | (#38385644)

You probably wouldn't gain anything. Passing data between your CPU and GPU has a high latency penalty, and OpenVPN processes small amounts of data at a time. You would probably need to buffer a few hundred KB before it would become worth it.

A GPU would be great for any large amounts of data, like a block device, but small packetized datastreams work best with super low latency instruction level acceleration.

My guess anyway.
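The intuition above can be sketched as a back-of-envelope break-even calculation: with a fixed per-transfer latency, offloading only pays once the batch is large enough. All the throughput and latency figures below are made-up assumptions for illustration, not measurements of any real CPU or GPU.

```python
def cpu_time(nbytes, cpu_rate):
    # Time to process nbytes entirely on the CPU (seconds).
    return nbytes / cpu_rate

def gpu_time(nbytes, gpu_rate, latency):
    # One fixed round-trip cost (PCIe transfer + kernel launch),
    # plus the faster GPU crunch time.
    return latency + nbytes / gpu_rate

def break_even_bytes(cpu_rate, gpu_rate, latency):
    """Smallest batch where offloading stops losing, from
    latency + n/gpu_rate = n/cpu_rate  =>  n = latency / (1/cpu_rate - 1/gpu_rate)."""
    return latency / (1.0 / cpu_rate - 1.0 / gpu_rate)

# Assumed figures: an Atom-class CPU encrypting ~50 MB/s, a GPU doing
# ~500 MB/s, and ~50 microseconds of round-trip overhead per offload.
cpu_rate, gpu_rate, latency = 50e6, 500e6, 50e-6

# A ~1.5 KB VPN packet is below break-even, so per-packet offload loses...
assert gpu_time(1500, gpu_rate, latency) > cpu_time(1500, cpu_rate)
# ...while a large buffered batch comes out well ahead.
assert gpu_time(1_000_000, gpu_rate, latency) < cpu_time(1_000_000, cpu_rate)

print(f"break-even around {break_even_bytes(cpu_rate, gpu_rate, latency):.0f} bytes")
```

In practice, overheads like pinned-memory setup and kernel launch scheduling push the worthwhile batch size well above the naive break-even point, which is consistent with the "few hundred KB" guess in the comment above.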
