
NVIDIA Open-Sources Tegra K1 Graphics Support

Soulskill posted about 6 months ago | from the increment-good-citizen-point-total dept.

Open Source

An anonymous reader writes "NVIDIA's next-generation Tegra K1 ARM processor now has open-source support for its Kepler-based graphics. NVIDIA decided to submit a large queue of patches to the open-source, reverse-engineered Nouveau project for supporting their ARM Kepler graphics with the open-source driver. The patches are still experimental but this is the first time NVIDIA has contributed open-source code to Nouveau."


66 comments

Danger (1, Funny)

For a Free Internet (1594621) | about 6 months ago | (#46131373)

The software and the hardwater can give you NSA cookies, which can intercept your phone dogs and cause corn.

Re:Danger (1)

rmdingler (1955220) | about 6 months ago | (#46131467)

If, though, it were Christmas, there is an infinitesimal chance your cookies would be appreciated by a red-suited big and tall shopper.

And if we imagine there's some unlikely rationale we can use to weave your incredibly less-than-statistically-significant chance of corn into the already convoluted story line, your comment is only slightly less rational.

Re:Danger (2)

Mister Liberty (769145) | about 6 months ago | (#46131797)

The word rational appears twice in your post, but it would have been good if it weren't preceded by
the word Christmas. As such, you can't be taken seriously and any use of the former word has
to be seen as clownesque -- which is good, since clowns can hardly stand in the way of a free
internet.

Re:Danger (0)

Anonymous Coward | about 6 months ago | (#46131923)

The word "rational" only appears once in his post, and it was preceded by the word "less". You should feel bad. Your barely cogent 40-column drivel is just a waste of time.

Re:Danger (0)

Anonymous Coward | about 6 months ago | (#46131661)

don't forget your sega cds too.

Clickity-click (3, Funny)

tepples (727027) | about 6 months ago | (#46131973)

NSA cookies

Are you saying I could harness the resources of the NSA to make cookies [wikia.com]? Is it more powerful than an antimatter condenser [wikia.com]?

Nvidia has NOTHING to lose at this stage (5, Interesting)

Anonymous Coward | about 6 months ago | (#46131415)

Tegra has been a horrid disappointment for Nvidia till now, and the competition in the ultra-mobile SoC market is ramping up at a terrifying rate.

-Tegra 1. The equivalent of Microsoft's Windows 1 and 2. If it ever existed, no one noticed.
-Tegra 2. Horribly late, missing NEON, and missing hardware acceleration for H.264 video decode. Used in devices only because Nvidia was forced to give it away.
-Tegra 3. First ARM SoC part from Nvidia worth using. Late, but good enough to still get some major contracts as a highish-end part.
-Tegra 4. Pretty much an unmitigated disaster. Late and expensive enough to lose the small progress Tegra 3 had made. Wrongly specced, so Nvidia had to announce the 4i.

-Tegra 5, renamed the K1. Built on the wrong process (not really Nvidia's fault -- TSMC and others have failed to make the shrink progress expected years ago when this part was first planned). Using the wrong ARM core (A15), so Nvidia had to announce a later version of the K1 that will come with Nvidia's own 64-bit ARM core. Of course, this means the first K1 is already obsolete, long before it is on sale. First Tegra with PC-class GPU cores, but not the NEW Maxwell GPU architecture Nvidia launches on the desktop in a few weeks' time (750 Ti). So, the GPU is also out of date before the K1 goes on sale.

The Tegra 5/K1 has a lot of graphic clout for an ARM SoC, BUT cannot use that power in a phone/normal tablet form factor. Therefore, Qualcomm and Apple will best the K1 in performance per Watt, once again.

So, Nvidia has zero (ZERO!!!!!) to lose by throwing out all the tech details of the K1 into the public arena. Intel pulls the same stunt with its laughably poor integrated GPUs on its current CPU chips. If you can't compete, make your documentation open-source in the hope this will boot-strap some extra business.

Re:Nvidia has NOTHING to lose at this stage (4, Insightful)

mrbluze (1034940) | about 6 months ago | (#46131461)

If you can't compete, make your documentation open-source in the hope this will boot-strap some extra business.

Too little too late. For YEARS we have been screaming for nvidia drivers that aren't buggy, closed and unstable, to the point of writing Nouveau, an open source hack (remarkably good but still crippled). Rot in hell, NVIDIA - I have wasted enough money on your hardware.

Re:Nvidia has NOTHING to lose at this stage (1)

Anonymous Coward | about 6 months ago | (#46131519)

NVIDIA Linux drivers are closed source and complex, but you must admit they have good quality.

They used to be far better than ATI/AMD proprietary stuff.
Only Intel writes good open-source graphics drivers, but their hardware was so limited that their drivers didn't need the complexity of serious OpenGL drivers.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46131585)

Sure, and with Intel closing in on the sub-5-watt envelope with open source drivers, and so much competition from other players, opening the drivers to at least make the chip mostly useful with open source drivers is a good way to differentiate themselves from the pack.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46131643)

But too little too late.

Re:Nvidia has NOTHING to lose at this stage (4, Interesting)

Anonymous Coward | about 6 months ago | (#46131649)

Stop spouting this rubbish. Intel's pretty decent performance-wise these days, and NVIDIA's drivers have always sucked no matter how much better the performance was.

NVIDIA's graphics drivers don't f'ing work without head pounding. I shouldn't have to f around with the terminal for hours to install a proprietary driver that only half works on a select set of distributions.

The days where Intel's graphics sucked are long over. It's not the 1990s. Intel's graphics are pretty good. Intel's 3rd generation graphics were decent. Almost comparable at the low end with NVIDIA. The newer 4th generation stuff is pretty impressive although unfortunately Iris Pro has been restricted to integrated CPUs and thus no socketed CPUs have it. As a result motherboard manufacturers have chosen to opt out in protest. Nobody ships an Intel Iris Pro mini itx motherboard. In fact there are very few Iris Pro systems. I have one of the very few that exist in fact. It's an ultrabook-like form factor 14” screen.

AMD's drivers still suck and they are still non-free despite the public relations stunt to “open” them.

While I hope this actually helps improve the free drivers for NVIDIA's graphics chips I'm doubtful. I'm not that familiar with these chips although I'm pretty sure they are targeted at and only available in cellular devices and similar. It won't help the desktop users.

Re:Nvidia has NOTHING to lose at this stage (-1)

Anonymous Coward | about 6 months ago | (#46132031)

Nonsense, nVidia's drivers have always worked much better than everyone else's. It's the retarded fucks from Red Hat and Canonical that fuck them up.

Not that you'd notice with your head implanted that far up your ass, though, but this is off topic for a chip that is aimed at the embedded market.

Re:Nvidia has NOTHING to lose at this stage (3, Interesting)

TheRaven64 (641858) | about 6 months ago | (#46133091)

The nVidia drivers on FreeBSD are pretty solid, but they got a poor reputation for their open source drivers in the early releases. I was running a room full of Linux boxes about 10 years ago, and they'd all kernel panic about once a day, typically while running nothing more strenuous graphically than the log-in screen, and always with a backtrace in the nVidia drivers. The open source ATi drivers of the same era (R200) were a lot slower, but were very stable.

nVidia also had that embarrassing incident where a crafted image could cause arbitrary code execution in the kernel, which turned out to be exploitable by just putting a picture on a web page, and they didn't fix it until about two years after they were first notified of it. For the last 4-5 years, though, their proprietary drivers have been pretty reasonable.

Re:Nvidia has NOTHING to lose at this stage (1, Insightful)

JDG1980 (2438906) | about 6 months ago | (#46132779)

The days where Intel's graphics sucked are long over. It's not the 1990s. Intel's graphics are pretty good. Intel's 3rd generation graphics were decent. Almost comparable at the low end with NVIDIA. The newer 4th generation stuff is pretty impressive although unfortunately Iris Pro has been restricted to integrated CPUs and thus no socketed CPUs have it. As a result motherboard manufacturers have chosen to opt out in protest. Nobody ships an Intel Iris Pro mini itx motherboard. In fact there are very few Iris Pro systems. I have one of the very few that exist in fact. It's an ultrabook-like form factor 14” screen.

Iris Pro is decent by laptop standards – reasonably competitive with Nvidia's mid-range discrete offerings (750M).

But on the desktop, even if you could get Iris Pro (which as you noted, you can't), it is decisively beaten by pretty much every graphics card over $100. You can't game at 1080p or use MadVR with maximum settings on Iris Pro.

To be competitive on the desktop, Intel needs something about as powerful as a Radeon HD 7850 or GeForce GTX 650 Ti Boost. As of now they aren't even close.

Re:Nvidia has NOTHING to lose at this stage (5, Interesting)

slacka (713188) | about 6 months ago | (#46133149)

it is decisively beaten by pretty much every graphics card over $100. You can't game at 1080p or use MadVR with maximum settings on Iris Pro.

To be competitive on the desktop, Intel needs something about as powerful as a Radeon HD 7850 or GeForce GTX 650 Ti Boost. As of now they aren't even close.

I built my desktop with a 3.4GHz Core i5-3570K Ivy Bridge. Anyone telling you the HD Graphics 4000 is "good enough" for gaming is full of shit. Even on my low-rez 1200x1080 monitor, most games struggled to get 30 FPS at anything above the lowest detail level. When I got into Dota 2, that was the final straw. I caved in and bought a Radeon HD 7850 for $150. The difference is night and day. Integrated graphics are still garbage, worthless for anything beyond Angry Birds.

I dual boot to Linux and have a decent Steam library. The only thing I'll give Intel is that they do make decent open source drivers that perform nearly as well in Linux as in Windows. The AMD open source drivers are terrible for gaming. They get 30-80% of the proprietary drivers' FPS and have major issues with micro-stuttering. And yes, I use the dev drivers from the edgy PPA along with all the tweaks like the SB backend. They still suck.

Wrong! (0)

Anonymous Coward | about 6 months ago | (#46142187)

They are good enough for use on a laptop. Sure, you will have to use a low resolution, but honestly -- how much of a loss is that on most of the games you play? Is it worth having to go from a $600 laptop to a $1200 "gaming" laptop?

no good without drivers. (0)

Anonymous Coward | about 6 months ago | (#46133333)

Using only open source drivers, the Iris Pro beats the Nvidia 680.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46134023)

To be competitive on the desktop, Intel needs something about as powerful as a Radeon HD 7850 or GeForce GTX 650 Ti Boost. As of now they aren't even close.

Then why in hell do so many people not bother upgrading their i3/i5/i7's with integrated GPU to a discrete card? Hell, to be competitive Intel only has to reach the performance of a 7450 or a goddamn GeForce 210. They don't need to push the boundaries the way nvidia/amd do, both searching for an ephemeral speed-up. Consider that Intel graphics are in all of the corporate PCs running Intel throughout the world. Very few of them need more than that, so I'd say they're beating the shit out of both AMD and Nvidia in the area where it fucking counts, while laughing all the way to the bank.

I'm running a 4-year-old Radeon 5670 with a meager 512M onboard that pushes over 1000 points according to Passmark. To upgrade this card, I need something hitting over 3k points for $100, not the Radeon 7770/GTX650Ti you suggested, as they only double things. The 7850 is over 1k points faster than the 650Ti, making me think you're an Nvidia fanboi. Me, I don't give a damn so long as it allows me to play my games such as GuildWars and Call To Power 2, neither of which pushes the CPU hard. Hell, I used to handle 1080p playback on an HD3000 and E6300 C2D, and for anyone who's not a gamer (company users = +50%) that's acceptable.

Fast Turtle - posting AC to preserve Mod Points

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46135815)

How is this shit upmodded? It contains no information or specifics whatsoever and is just an unsubstantiated rant.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46137151)

And I have never had an issue with Nvidia drivers, period. I have never pounded my head, or spent hours in a terminal trying to 'install' a driver. And it completely works on the distributions I have chosen to use. I have owned 3 different nvidia cards.

The 'it never works for me, so it all sucks' _rubbish_ you are tossing about in your post is pathetic. Perhaps learn to read documentation -- it only takes a few minutes -- follow the directions, and then you might not actually be pounding your head and 'f'-ing around in your terminal for 'hours'.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46141661)

wow, yer doing something wrong. I've NEVER had to "f'around" in the terminal to get nvidia's proprietary drivers to work. 670/780

Even the Crapalyst drivers didn't need "f'ing" around in the terminal either, as they either (a) worked or (b) phailwhaled* on the bits and bobs they needed to build in order to work. Always got lots of free video tearing with those, though, and f'tons of bugs that never got fixed, even when they'd feebly claim they were fixed, e.g. video tearing... (which is why I switched to nVidia primarily nowadays, although I did build a couple of complete AMD systems recently, and found that, lo and behold, ATI figured out how to unload the video driver and reload it under Windows instead of leaving me with a black screen and guessing as to when the install was finished... it still wants to reboot after install, though, whereas I noticed on coming back to nVidia a few years ago that even their reboot requirement/request was gone... unloading/reloading of the video driver under Windows always just worked for me with nVidia, even with their broken drivers). R9 280X

Intel drivers (bought a couple of Haswell-based notebooks last summer, 4600): again, they just work, and work as well as nVidia's.

The last time I had to f'around in the term to get stuff to work was probably way back in the late 90s/early 00s, or IOW really pre-ATI/nVidia for me, and in that case it was just all about getting X configured, i.e. monitor timings, etc. Oh, the joys of days gone by... (Of course this was the time when I'd still re-build kernels by hand, as IIRC every vendor was still shipping with the base i386 target... might've been the kernel default build as well, but it's been so long... I REALLY DO NOT miss that part of those days.)

OOPS, ALMOST forgot: yeah, I still had to f'around in the term until c. mid-00s, as notebooks tended to be poorly supported w/o manual intervention.

* Usually a result of a bug in the build/install script, so I guess ATI doesn't believe in QA either (not surprising given their poor-quality drivers). AMD still has a LONG way to go on the software front, as they still mostly suck at it.

next trick will be experimenting with an a10-7850k, oughta be "fun"

Re:Nvidia has NOTHING to lose at this stage (3, Informative)

Khyber (864651) | about 6 months ago | (#46131735)

" but you must admit they have good quality."

When a driver update kills the fan and thus kills the GPU by failing to cool it (320.18), do you call that good quality?

Really?

Re: Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46132103)

Yup, destroyed my perfectly well-working GTX 580 that would still be fast enough for me.
Went with AMD for two thirds of what the GTX would still retail for and got twice the performance now.

Re:Nvidia has NOTHING to lose at this stage (1)

ELCouz (1338259) | about 6 months ago | (#46133919)

See also the infamous GPU-frying NVIDIA driver, 196.75.

Re:Nvidia has NOTHING to lose at this stage (1)

Anne Thwacks (531696) | about 6 months ago | (#46132871)

you must admit they have good quality.

"good" evidently does not mean what you think it does.

Nvidia Linux drivers are diabolical, as in "work of the devil", and I am bitterly disappointed. The devil (and presumably Microsoft, if a different entity) may well be pleased.

Re: Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46131595)

The nvidia drivers work quite well. As for this code contribution, it is really quite small, and builds on the existing nouveau base, so it doesn't really expose any new hw details. Judging from the patch details, it is basically just a typical Kepler chip, which has already been reversed pretty well in nouveau.

Re:Nvidia has NOTHING to lose at this stage (1)

Tough Love (215404) | about 6 months ago | (#46131603)

Not at all. It's never too late to see the light and do the right thing.

Re:Nvidia has NOTHING to lose at this stage (1, Interesting)

LWATCDR (28044) | about 6 months ago | (#46131611)

So you are stuck with Intel with their FOSS drivers but terrible GPUs. AMD also was all proprietary and their FOSS drivers are not as feature rich as the closed source. So you are mad about them not supporting FOSS in the past... But they are doing it now and you are still mad.... Seems counterproductive to keep complaining after a company goes FOSS.

Re:Nvidia has NOTHING to lose at this stage (3, Insightful)

Arker (91948) | about 6 months ago | (#46131679)

"AMD also was all proprietary and their FOSS drivers are not as feature rich as the closed source."

Their Free drivers may not be as 'feature rich' but they're a heck of a lot more stable and compatible than the blobware.

I'm planning to buy new video hardware about the middle of the year and their chances of getting my money just went from 0 to... well to nonzero at least.

Re:Nvidia has NOTHING to lose at this stage (4, Informative)

tick-tock-atona (1145909) | about 6 months ago | (#46132133)

Don't get too excited. It's not like nvidia are actually opening up here:

The scope of this work is strictly limited to Tegra (although given the similarities desktop GPU support will certainly benefit from it indirectly), and we do not have any plan to work on user-space support. So do not uninstall that proprietary driver just yet. ;)

This is only about leveraging the hard work already done by nouveau hackers, in order to bring their embedded SoC product to market more quickly. There was no documentation dropped, and they're specifically rejecting the idea of desktop Linux support.

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46132181)

Why do you call the open source nouveau drivers a linux hack? It makes me think you don't understand what you're talking about.

Re:Nvidia has NOTHING to lose at this stage (1)

Anne Thwacks (531696) | about 6 months ago | (#46132887)

I don't know how either of us defines "hack", but "crashes every day" is not a symptom of a satisfactory piece of software.

Re:Nvidia has NOTHING to lose at this stage (1)

MMC Monster (602931) | about 6 months ago | (#46136489)

For Linux systems, NVIDIA was the worst graphics card manufacturer. Except, of course, for all their competition.

Now, with nouveau drivers starting to mature, things are getting better. I just installed Linux Mint 16, and for the first time since I started installing Linux systems with Nvidia graphics (probably in 2004?), I didn't install the Nvidia graphics driver and things work at an acceptable speed.

Thanks, nouveau developers, for your excellent work so far. I hope Nvidia gives you more support in the future.

Re:Nvidia has NOTHING to lose at this stage (1)

MacDork (560499) | about 6 months ago | (#46131753)

I'm actually pretty excited about this SoC. The problem with Tegra 4 is it is not available on anything but a couple of Chinese phones and those don't even come with the software-defined radio. The other problem is that it is not CUDA. K1 is CUDA, so presumably, I should be able to install the CUDA Toolkit along with something like rpud and take maximum advantage of those 192 cores. I'll definitely wait for the 64-bit variety and hope for a phone with >4GB of RAM. I'll probably get the 64-bit one. Hopefully there will be a phone with the SDR this time. It would be mind-blowingly awesome if they'd give the SDR some open source love too.
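
For anyone curious what "K1 is CUDA" buys you in practice, here is a minimal sketch of a device query using the standard CUDA runtime API -- the same calls used on desktop Kepler parts. Whether the stock Toolkit actually ships for a given K1 board is my assumption; the API calls and cudaDeviceProp fields are the standard documented ones.

<ecode>
/* Minimal sketch: list whatever CUDA devices the runtime exposes.
 * On a Kepler-class part such as the K1, each multiprocessor (SMX)
 * reported below contains 192 CUDA cores.
 * Build with nvcc, or with a C compiler linked against libcudart. */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        fprintf(stderr, "No CUDA device visible\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s (compute capability %d.%d)\n",
               i, prop.name, prop.major, prop.minor);
        printf("  Multiprocessors: %d\n", prop.multiProcessorCount);
        printf("  Global memory:   %zu MiB\n",
               (size_t)(prop.totalGlobalMem >> 20));
    }
    return 0;
}
</ecode>

Something like rpud would then sit on top of that same runtime, so the R side can only work once a query like this already succeeds.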

Re: Nvidia has NOTHING to lose at this stage (1)

deppman (2649539) | about 6 months ago | (#46132407)

The Linux drivers are far better than anything else. Look at the System76 hardware or any other preconfigured systems. All of them use nVidia as far as I can see. I have yet to find anyone using AMD. The reason is that nVidia Linux systems outperform AMD and are much less troublesome (e.g. expensive) to support. In Linux, nVidia > Intel > AMD > everything else, in my experience.

Re: Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46132607)

You are right in all but the K1 part. The performance of that is something most "gaming laptops" can only dream of. Performance per watt is not the main point, as the most important use case will be microconsoles. This part means that, e.g., the next Ouya can almost catch up to the "next gen" consoles.
Not trying to discuss the significance of the open-source thing from the article here; that's probably just a publicity stunt. Just saying you're extrapolating from previous Tegras in the wrong way. As you noted, with each generation they have been closing the gap between their initially shifty offering and the top players. The correct extrapolation is that the lines in the graph have finally crossed and they are in the lead now.

Re: Nvidia has NOTHING to lose at this stage (1)

Pinky's Brain (1158667) | about 6 months ago | (#46136735)

You mean last gen consoles ... current gen consoles are nearly an order of magnitude removed from the K1.

Re:Nvidia has NOTHING to lose at this stage (2)

JDG1980 (2438906) | about 6 months ago | (#46132697)

-Tegra 5, renamed the K1. Built on the wrong process (not really Nvidia's fault -- TSMC and others have failed to make the shrink progress expected years ago when this part was first planned). Using the wrong ARM core (A15), so Nvidia had to announce a later version of the K1 that will come with Nvidia's own 64-bit ARM core. Of course, this means the first K1 is already obsolete, long before it is on sale. First Tegra with PC-class GPU cores, but not the NEW Maxwell GPU architecture Nvidia launches on the desktop in a few weeks' time (750 Ti). So, the GPU is also out of date before the K1 goes on sale.

Who is building a phone or tablet with over 4GB of RAM? That's about the only circumstance where you'd need 64-bit ARM cores. In fact, with low-RAM systems, 64-bit can be a positive disadvantage, due to higher memory requirements (because pointers are double the size). There have been quite a few low-memory crashes [google.com] reported on the iPad Air, which has a 64-bit CPU but only 1GB of RAM. The iPad 4, in contrast, with its 32-bit CPU, doesn't seem to have these issues.
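
The pointer-size overhead is easy to demonstrate; a quick C sketch (the numbers in the comments are the typical ILP32 vs LP64 layouts, not measurements from any particular tablet):

<ecode>
/* Illustration of the "pointers are double the size" point: the same
 * list node costs roughly twice as much per element on a 64-bit (LP64)
 * build as on a 32-bit (ILP32) one, partly from the bigger pointer and
 * partly from alignment padding. */
#include <stdio.h>

struct node {
    int          value;  /* 4 bytes on both ABIs              */
    struct node *next;   /* 4 bytes on ILP32, 8 bytes on LP64 */
};

int main(void)
{
    /* Typically prints 4 and 8 on a 32-bit build, 8 and 16 on a 64-bit
     * one: the int gets padded out to the pointer's 8-byte alignment. */
    printf("sizeof(void *)      = %zu\n", sizeof(void *));
    printf("sizeof(struct node) = %zu\n", sizeof(struct node));
    return 0;
}
</ecode>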

And calling the GPU "out-of-date" is ridiculous. Maxwell hasn't even been released to the desktop yet – if you buy a Nvidia card today, no matter how expensive, it will be Kepler. More to the point, phone/tablet GPUs can only reasonably be compared against other phone/tablet GPUs, not the much larger and more power-hungry desktop offerings. Of course even a $100 desktop graphics card is probably going to decisively beat a tablet GPU, but so what? They are different devices (despite what Steve Ballmer seems to think), designed to do different things. If Kepler can compete on performance per watt with the other GPUs in the mobile space, then it's a viable product. The fact that something better exists on the desktop is neither here nor there.

Re:Nvidia has NOTHING to lose at this stage (2)

Mike Buddha (10734) | about 6 months ago | (#46132965)

64 bit is better because you use up all your cache memory in half the time. It's better when your cache is full, right?

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46134963)

There have been quite a few low-memory crashes [google.com] reported on the iPad Air, which has a 64-bit CPU but only 1GB of RAM. The iPad 4, in contrast, with its 32-bit CPU, doesn't seem to have these issues.

This is because any iOS apps not recompiled for 64-bit have to load their own copy of a 32-bit-to-64-bit software shim, which increases an app's memory requirements. If all app developers recompiled for the iOS 7 SDK then each app would have a 64-bit and a 32-bit version of the executable, and memory requirements on the 64-bit platforms would drop from where they currently are.

Since 1 February 2014 Apple requires each submitted app to be compiled against the iOS 7 SDK, so the situation will only improve from here on in.

Re:Nvidia has NOTHING to lose at this stage (1)

phoenix_rizzen (256998) | about 6 months ago | (#46141465)

ARMv8 supports both AArch32 (32-bit ISA) and AArch64 (64-bit ISA), similar to how AMD (and now Intel) CPUs support both x86 and amd64 ISAs.

Meaning, you can run a 32-bit OS on a 64-bit chip, and get access to all the improvements to the architecture, and it will run like a faster 32-bit chip.

Or, you can run a 64-bit OS on the 64-bit chip, and still run 32-bit apps, and get access to all the improvements to the architecture, and it will run like a 32-bit chip with access to a full 64-bit address space (for the OS, the apps are still limited to 4 GB each).

Or, you can run a 64-bit OS on the 64-bit chip and run 64-bit apps and get access to all the improvements to the architecture, including access to the full 64-bit address space within each app.

Or, you can mix and match the last two as needed. Which is what Apple is doing with their A7 SoC (64-bit CPU, 64-bit OS, mix of 32-bit and 64-bit apps).

There's a lot more to the ARMv8 architecture than just 64-bit-ness. There's a lot more memory bandwidth, there are a lot more registers, there's a lot of clean-up to the ISA, etc, etc, etc.

You don't need more than 4 GB of RAM to get improvements from running a 64-bit SoC. Just like you don't need 4 GB of RAM on the desktop to get improvements from running an AMD CPU in 64-bit mode with a 64-bit OS.
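
If you want to see which of those modes a given binary was built for, here is a small C sketch; __aarch64__ and __arm__ are the usual GCC/Clang predefined macros, and the register count in the message comes from the ARMv8 spec (the rest is just illustration):

<ecode>
/* Tiny illustration of AArch64 vs AArch32 builds on the same ARMv8 chip. */
#include <stdio.h>

int main(void)
{
#if defined(__aarch64__)
    puts("AArch64 build: A64 ISA, 31 general-purpose 64-bit registers");
#elif defined(__arm__)
    puts("AArch32 build: the 64-bit chip runs this like a fast 32-bit core");
#else
    puts("Not built for an ARM target");
#endif
    /* An 8-byte pointer means a per-process address space well beyond
     * 4 GB; a 4-byte pointer caps each process at 4 GB regardless of
     * what the OS itself can address. */
    printf("sizeof(void *) = %zu bytes\n", sizeof(void *));
    return 0;
}
</ecode>

Compile the same file with a 32-bit and a 64-bit ARM toolchain and you get the two flavours described above; an OS like iOS on the A7 simply runs both kinds of binary side by side.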

Re:Nvidia has NOTHING to lose at this stage (0)

Anonymous Coward | about 6 months ago | (#46132911)

If you can't compete, make your documentation open-source in the hope this will boot-strap some extra business.

I love how this implies that open-sourcing a driver automatically brings in more revenue... and yet they don't do it for their other solutions...

Re:Nvidia has NOTHING to lose at this stage (1)

edxwelch (600979) | about 6 months ago | (#46133383)

You're partially right: Tegra has been pretty poor so far. However, it's too early to make a call on Tegra 5 yet. There have been no independent benchmarks done on it. Also, we do not know which process it uses. It could well be 20nm; 20nm has been in test production since the start of 2013.

Re:Nvidia has NOTHING to lose at this stage (1)

edxwelch (600979) | about 6 months ago | (#46133443)

Sorry, forget that post. Indeed, the K1 is 28nm, not 20nm.

Someone, cut Linus's middle finger in half (-1)

Anonymous Coward | about 6 months ago | (#46131421)

Quick, someone, cut Linus's middle finger in half

Re:Someone, cut Linus's middle finger in half (0)

Anonymous Coward | about 6 months ago | (#46131481)

Why? His criticism was valid at the time. Even Linus can't see the future, you dumbass. Besides, maybe the fact that he called attention to the problem helped push Nvidia in the right direction.

Re: Someone, cut Linus's middle finger in half (0)

Anonymous Coward | about 6 months ago | (#46131483)

I'm sure Linus still hates Nvidia. Just because something has become open source does not make it even remotely usable.

The light bulb ... (1)

hackus (159037) | about 6 months ago | (#46131505)

finally pops on.

Re:The light bulb ... (0)

Anonymous Coward | about 6 months ago | (#46132839)

finally pops on.

A great result from Linus finally showing some spine. Years of rolling over waiting for his tummy to be tickled: no effect. Tell them to fuck off once and they finally start to act.

This is almost a rule for companies. You can't give things away for free without a reason. Although this isn't the greatest hardware, whoever achieved this in Nvidia has set them up to be able to do this again whenever they have a reason. Now all we have to do is give them that reason by only buying hardware with open source drivers.

trol7kore (-1)

Anonymous Coward | about 6 months ago | (#46131583)

Re:trol7kore (-1)

Anonymous Coward | about 6 months ago | (#46132049)

Pretty lame bro, I've added you to the wall of shame [goo.gl].

Finger power (0)

Tough Love (215404) | about 6 months ago | (#46131609)

See, if you get management's attention with a few precisely calibrated and executed gestures, stupidity can sometimes end.

Re:Finger power (1)

Tough Love (215404) | about 6 months ago | (#46131713)

Whoa, what have we got here, an Nvidia manager with mod points? Come on, admit it, closing your driver source when you're trying to sell hardware is just plain stupid, no two ways about it.

For all the naysayers out there. (4, Informative)

deviated_prevert (1146403) | about 6 months ago | (#46131715)

That jump all over how terrible Tegra SoCs are: the chips still power a crapload of cheap devices all over the planet. What opening up the source on these chips will do is make it easier for smaller companies to create Android and other OS-based devices for the expanding cheap-device market. What some here refuse to realize is that China is a have and have-not market. Those who cannot afford iPads and iPhones will go for the best cheap alternatives, and Samsung's products are not significantly cheaper than Apple's. The low-end stuff that they make could easily be blown out of the water by other companies that clone the iPad, the iPhone, and high-end Samsung products like the top-end Note series and Galaxy 4 phones. You can bet within a very short period of time there will be a flood of cheap knock-offs that do everything that these devices do, with just as much grunt, but much cheaper.

The high-end portable device market that is run by SoCs is undergoing the same thing the PC market went through: aggressive competition, and a patent portfolio will not adequately stop the production of knock-offs. This is most likely what NVidia is counting on happening; all they care about is selling a gazillion SoCs as fast as they can, just like everybody else that relies on sales of hardware for revenue. NVidia realizes that their SoCs are not going to make it into iPads, and Win8 RT is a complete bust, so they are instead taking a run at Samsung by opening up their software specs and making cheap but much more powerful versions of Android, and even things like Firefox OS and Ubuntu on ARM, a real possibility. No doubt this will make many more powerful cheap devices possible than what we currently see coming out of the east. This sort of game change is only to be expected; even if many would like to see NVIDIA FOD, they are in a position to change the game simply by not playing by the old closed-source software design rules that killed many manufacturers in the PC marketplace. My prediction is the next company to bite the dust will be Creative, unless someone like NVidia buys them out and teams up with someone like Lenovo to produce killer pro devices and the like, as well as consumer doodads. There will be a huge consolidation in the industry, and this time Microsoft and their so-called "hardware partners" could be left in the dust; perhaps NVidia sees the writing on the wall this time and is breaking free from Redmond's apron strings for a change.

Re:For all the naysayers out there. (0)

Anonymous Coward | about 6 months ago | (#46133279)

Why would a small company buy an aging Tegra 3, or a Tegra 5, for tens of dollars when Mediatek will sell them a comparable chip for 5?

Microsoft's loss (2, Informative)

EmperorOfCanada (1332175) | about 6 months ago | (#46131775)

One of the greatest strengths Microsoft has had was its library of drivers. Quite simply, most manufacturers would be foolish to make their drivers for anything but Windows first and foremost. Thus, when a company deployed its resources, it could ask whether it was better to spend some of them porting the drivers to things like Linux or to just put more effort into the Windows version. Thus at best the Linux version (if any) played second fiddle to Windows (or third, after Mac).

This resulted in Microsoft effectively having billions of dollars' worth of drivers that they didn't even have to pay for; a serious competitive advantage. But as many power users have moved over to Linux for various needs such as servers, rendering, and large-scale computing, certain classes of drivers have become valuable for hardware manufacturers to port properly (or to assist in the porting).

This won't kill Windows but it is a nice step toward leveling the playing field somewhat.

Re:Microsoft's loss (0)

Anonymous Coward | about 6 months ago | (#46132299)

No, actually, this will kill Windows. At least in the consumer space. Companies like Qualcomm and NVIDIA are developing their hardware and writing the Linux drivers simultaneously during development. When the part is finally ready, the drivers have already made progress. Only then do they even start caring about Windows on ARM. Simply put, Windows is 6 months to a year behind on hardware that frankly only has 1 year of shelf life before it's outdated and replaced.

Re: Microsoft's loss (0)

Anonymous Coward | about 6 months ago | (#46132683)

This doesn't even use the GPU for rendering; it's just a frame buffer on a niche platform. This won't even put a dent in Windows, which predominantly runs on x86, not ARM SoCs.

Re:Microsoft's loss (2)

Bert64 (520050) | about 6 months ago | (#46132913)

Microsoft don't have a library of drivers for ARM; in fact it's the other way round: Windows for ARM has an extremely limited set of hardware support, and Linux is far ahead because a lot of the drivers for x86 can be trivially recompiled.

Re:Microsoft's loss (1)

EmperorOfCanada (1332175) | about 6 months ago | (#46132963)

Ah, but the key is that the open source community needed to cook up the bulk of those drivers, often without the cooperation of the hardware manufacturer. Of late the manufacturers have a higher chance of either cooperating or actually cooking up their own OS drivers. The key is that MS could sit back and effectively let this asset build up; it would be sort of like if all the guide books, yellow pages, and travel websites only ever mentioned your hotel chain and ignored the others. You wouldn't get 100% of the business, but you would get most of it.

So with things like ARM they are in unfamiliar territory, especially where they traditionally had less to do with driver development, whereas for Linux drivers have been a long hard slog and multi-platform has generally been the rule, not the exception.

If I had to name Microsoft's top assets, I would say that market-share inertia has been their number one asset, with drivers as their number two. Things like brand and whatnot are far behind those two. As a long-time user of Mac and Linux, drivers have always been something that I have been aware of. Before buying any hardware I have generally checked to make sure that it would be fine.

Steam is only supporting NVIDIA, for a reason. (0)

Anonymous Coward | about 6 months ago | (#46131869)

Linux is a tree, it flows down to all distros.

deceptive title (5, Informative)

Gravis Zero (934156) | about 6 months ago | (#46131989)

The title should be "NVIDIA publishes Nouveau patches to support Tegra K1"

Nothing has been "open sourced", as it was never closed source to start with. It's all original code written specifically for Nouveau.

Re:deceptive title (1)

Bite The Pillow (3087109) | about 6 months ago | (#46132039)

If you feel like being a pedant, consider that it was closed until released.
Or, do you want to argue about whether software with an open source license, that sits on a disk inaccessible to anyone but the author, is open?
I understood that the code was released with an open license. Finding that it is just patches does not significantly deviate from that meaning, certainly not enough to deserve bold.

Re: deceptive title (1)

Anonymous Coward | about 6 months ago | (#46132161)

Perhaps you should realise that they were trying to distinguish between writing small patches to improve Nouveau with the intention of releasing those patches vs open sourcing existing code from their proprietary codebase.

Re:deceptive title (1)

Zimluura (2543412) | about 6 months ago | (#46133413)

this is probably a good way for nvidia to test the waters with regard to oss.

they probably have some brass that see it as:
"paying our engineers to write drivers, to give away for free to our enemies"

hopefully nothing will scare them off.
no one make any sudden moves! whisper! ...and try not to breathe too much!
