
Nvidia Releases Hardware-Accelerated Film Renderer

timothy posted more than 10 years ago | from the smart-use-of-hardware dept.

Graphics 251

snowtigger writes "The day we'll be doing movie rendering in hardware has come: Nvidia today released Gelato, a hardware rendering solution for movie production with some advanced rendering features: displacement, motion blur, raytracing, flexible shading and lighting, a C++ interface for plugins and integration, plus lots of other goodies used in television and movie production. It will be nice to see how this will compete against the software rendering solutions used today. And it runs under Linux too, so we might be seeing more Linux rendering clusters in the future =)" Gelato is proprietary (and pricey), which makes me wonder: is there any Free software capable of exploiting the general computing power of modern video cards?



Spelling... (4, Funny)

B4RSK (626870) | more than 10 years ago | (#8914523)

Gelsto is proprietary (and pricey), which makes me wonder: is there any Free software capable of exploiting the general computing power of modern video cards?

Gelato seems to be correct...

Nice to see some good out of BMRT/Exluna. (4, Informative)

ron_ivi (607351) | more than 10 years ago | (#8914684)

Renderman.org's summary of Exluna & BMRT [renderman.org] describes where much of this technology probably came from.

For those who don't remember, BMRT was a really cool RenderMan-based renderer that Pixar had some sort of love/hate relationship with. IIRC, they used it, yet they sued the company. In the end, nVidia bought them, though it wasn't clear why at the time.

Re:Nice to see some good out of BMRT/Exluna. (4, Informative)

ron_ivi (607351) | more than 10 years ago | (#8914761)

And a nice Siggraph [siggraph.org] presentation of some of the capabilities of BMRT.

Interestingly, BMRT was free as in $$$ but not as in Free Software. This was one of the first software packages that made me recognize how big this distinction is. (A free-as-in-Free-Software program would probably have continued on, as people could have coded around some of the disputed intellectual property; a free-as-in-$$$ program could be killed with the carrot and stick of a lawsuit and a buyout opportunity.)

Re:Nice to see some good out of BMRT/Exluna. (1, Informative)

Anonymous Coward | more than 10 years ago | (#8914829)

Parent wrote: "where much of this technology probably came from"

Indeed. NVidia's FAQ for this group [nvidia.com] says " It is the evolution of NVIDIA's acquisition of Exluna in 2002"

Try Blender3D (1)

pillendraaier (722295) | more than 10 years ago | (#8914917)

Try Blender3D (blender3d.org). I found out about it just a week ago, and I think it rules BIGTIME. It uses the accelerated OpenGL drivers of my nVIDIA chipset.

I like this... (5, Funny)

rjw57 (532004) | more than 10 years ago | (#8914526)

This is a reversion of the norm :) [from the page linked to in the story]:

Operating System

* RedHat Linux 7.2 or higher
* Windows XP (coming soon)

Re:I like this... (2, Interesting)

Mister Coffee (771513) | more than 10 years ago | (#8914544)

It does not happen often that a hardware manufacturer has Linux support before it has Windows support. At least I have never seen it before.

I think it'll start happening a lot more (3, Insightful)

PlatinumInitiate (768660) | more than 10 years ago | (#8914669)

Not only with hardware manufacturers/drivers, but also general software. ISVs are getting annoyed by Microsoft's dominance of the desktop market and, through that, its (heavy) influence on desktop software. It's not inconceivable that in a decade, Microsoft could control every aspect of the standard desktop PC and desktop software market. At the moment, some of the only really strong ISVs in their respective areas are Adobe, Corel, Intuit, Macromedia, Oracle, and a few specialized companies. Expect a big ISV push towards a "neutral" platform, like Linux or FreeBSD. Windows is too big to stop supporting, but ISVs would be smart to at least try to carve out a suitable alternative and avoid being completely dominated by Microsoft. If things don't go the way of creating a vendor-neutral platform, all that most ISVs might be able to hope for in a decade is being bought out by Microsoft or making deals with Microsoft.

Re:I like this... (0)

Anonymous Coward | more than 10 years ago | (#8914597)

Now that's more like it.

New Headline: (3, Funny)

Anonymous Coward | more than 10 years ago | (#8914634)

Windows XP's Achilles Heel Apparently Revealed


Re:I like this... (1)

arvindn (542080) | more than 10 years ago | (#8914707)

Naturally. If the product is intended to be cross platform right from the beginning, then the developers would prefer to work on linux and port it to windows rather than the other way round, so you can expect the linux version to be released slightly earlier. What is perhaps a little surprising is that they announced it before the Windows version was ready.

Re:I like this... (0)

Anonymous Coward | more than 10 years ago | (#8914794)

When thinking back to the earlier story about unsupported sound cards on Linux, does this mean that Windows XP is not ready for the desktop? ;)

3D graphics cards aren't relevant (2, Interesting)

Anonymous Coward | more than 10 years ago | (#8914536)

Sadly, the hardware acceleration that consumer 3D graphics cards do isn't useful for the high-quality renderings needed for film and television. The needs of games are just different, partially because of the need to render in realtime. So I doubt there's much scope for free software to make use of them for that purpose...

Re:3D graphics cards aren't relevant (3, Insightful)

niheuvel (570621) | more than 10 years ago | (#8914584)

Perhaps you should read the Nvidia FAQ? This topic is covered. From what I can tell, they don't use the GPU in the traditional way; they just use it as a co-processor.

Re:3D graphics cards aren't relevant (4, Interesting)

mcbridematt (544099) | more than 10 years ago | (#8914591)

But NVIDIA's Quadro lineup *IS* PCB-hacked consumer cards. Some PCI ID (or BIOS, for the NV3x cards) hacking can get you a Quadro out of a GeForce easily, minus the extra video memory present on the Quadros. I've done this heaps of times with my GeForce4 Ti 4200 8x (to a Quadro 780 XGL and even a 980 XGL), and I believe people have done it with the NV3x/FX cards as well.

This film renderer is different. It uses the GPU and CPU together as powerful floating-point processors (not sure if Gelato does anything more than that).

Re:3D graphics cards aren't relevant (4, Informative)

Oscaro (153645) | more than 10 years ago | (#8914596)

This is not really correct. The graphics cards Gelato uses are consumer hardware. This doesn't mean that the image is generated directly by the card! The 3D hardware is used as a specialized, fast, parallel calculation unit, especially for geometric calculations (matrix-per-vertex multiplication, essentially) and other work. This (of course) means that the rendering is NOT done in realtime.
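To make the "matrix-per-vertex multiplication" workload concrete, here is a minimal pure-Python sketch (illustrative only; the matrix, vertices, and helper names are invented for this example, and a real GPU applies the same transform across many vertices in parallel):

```python
# Illustrative sketch of the per-vertex transform workload a GPU
# parallelizes. Pure Python stand-in; the GPU runs this across many
# vertices simultaneously in its vertex pipelines.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# A translation matrix moving points by (1, 2, 3).
translate = [
    [1, 0, 0, 1],
    [0, 1, 0, 2],
    [0, 0, 1, 3],
    [0, 0, 0, 1],
]

vertices = [[0, 0, 0, 1], [1, 1, 1, 1]]  # homogeneous coordinates
transformed = [mat_vec(translate, v) for v in vertices]
print(transformed)  # [[1, 2, 3, 1], [2, 3, 4, 1]]
```

Scale this to millions of vertices per frame and the appeal of offloading it to dedicated parallel hardware becomes obvious.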

Re:3D graphics cards aren't relevant (4, Interesting)

WARM3CH (662028) | more than 10 years ago | (#8914598)

Actually, there have been reports of using such hardware to produce results similar to the high-end, software-based methods like those used in films. The trick is to break the job (typically the complex RenderMan shaders) into many passes and feed them to the graphics card to process. By many passes, I mean 100-200 passes. The outcome is something like rendering a frame in a few seconds (we're not talking about real-time rendering here), which is MUCH faster than the software-based approaches. The limit in the past was that the color representation inside the GPUs used a small number of bits per channel, and with lots of passes over the data, round-off errors would degrade the quality of the results. But now nVidia supports 32-bit floating-point representation for each color channel (i.e. 128 bits per pixel for RGBA!), and this brings back the idea of using the GPU with many passes to complete the job. Please note that in the film and TV business we're talking about large clusters of machines and weeks of rendering, and bringing that down to days with a smaller number of machines is very big progress.
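The round-off point is easy to demonstrate. A hedged Python sketch (the 0.004-per-pass "light contribution" is an invented illustration, not any real shader): quantizing a color channel to 8 bits between passes accumulates error that a floating-point pipeline avoids.

```python
# Illustrative only: accumulate many small per-pass contributions into one
# color channel, comparing full float precision against an 8-bit-per-channel
# store between passes (as on older GPUs).

def quantize8(x):
    """Snap a [0,1] value to the nearest of 256 representable 8-bit levels."""
    return round(x * 255) / 255

def accumulate(contribution, passes, quantized):
    value = 0.0
    for _ in range(passes):
        value += contribution
        if quantized:
            value = quantize8(value)  # old GPUs stored only 8 bits per channel
    return value

exact = accumulate(0.004, 200, quantized=False)
lossy = accumulate(0.004, 200, quantized=True)
print(exact, lossy)  # the 8-bit path drifts visibly after 200 passes
```

Each pass's sub-level contribution gets rounded away in the 8-bit path, so the error grows with the pass count, which is exactly why 100-200 pass schemes needed higher-precision channels.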

Re:3D graphics cards aren't relevant (2, Informative)

Anonymous Coward | more than 10 years ago | (#8914690)

Pinnacle have been doing graphics-card-assisted effects for a long time under Adobe Premiere; see

http://www.pinnaclesys.com/ProductPage_n.asp?Product_ID=19&Langue_ID=7

for details.

Re:3D graphics cards aren't relevant (0)

Anonymous Coward | more than 10 years ago | (#8914703)

Have a look at

http://www.pinnaclesys.com/ProductPage_n.asp?Product_ID=19&Langue_ID=7

Rendering artefacts between cards? (5, Interesting)

Anonymous Coward | more than 10 years ago | (#8914538)

The rumor on the street is that a Soho based SFX house tried this when they had a deadline that standard software rendering couldn't meet.

So they wrote an engine to do renderman->OpenGL and ran it across many boxes.

Problem was that they got random rendering artefacts by rendering on different cards - different colors etc, and couldn't figure out why.

When working on one box they got controlled results, but only had the power of one renderer.

Re:Rendering artefacts between cards? (4, Interesting)

XMunkki (533952) | more than 10 years ago | (#8914630)

Problem was that they got random rendering artefacts by rendering on different cards - different colors etc, and couldn't figure out why.

I have seen this problem in software renderers as well. The problem seemed to be that the rendering farm was running on different processors (some Intel, some AMD, at many different speeds and revisions), and one of them supposedly had a little difficulty with high-precision floating point and computed its images with a greenish tone. It took over a week to figure this one out.
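The underlying hazard is reproducible on any machine: floating-point addition is not associative, so hardware or compilers that order operations differently can disagree in the low bits, and per-pixel disagreements accumulated across a frame can tint the whole image. A minimal Python illustration:

```python
# Floating-point addition is not associative: two machines (or compiler
# builds) that sum shading terms in a different order can disagree in the
# low bits of every pixel.

a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c   # one summation order
right = a + (b + c)  # the same terms, grouped differently

print(left == right)  # False
print(left, right)    # 0.6000000000000001 vs 0.6
```

The difference here is one unit in the last place, but a renderer performs millions of such operations per pixel, so systematic ordering differences between machines can become visible.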

Re:Rendering artefacts between cards? (4, Interesting)

DrSkwid (118965) | more than 10 years ago | (#8914777)

I used to see that with 3ds4 as well when I was rendering this [couk.com]. One was a Pentium and one was a Pentium Pro.

Ah, those were the days. We were on a deadline and rendered it over Christmas. After four hours the disks would be full and it would be time to transfer everything to the DPS-PVR. I spent six days where I couldn't leave the room for more than four hours, sleep included. It was pretty wild!

VH1 viewers voted it the second-best CGI video of all time, behind Gabriel's Sledgehammer, so I guess it was worth it!

Re:Rendering artefacts between cards? (1)

Doh! (86796) | more than 10 years ago | (#8914692)

Gelato won't be doing real-time hardware rendering the way you would normally use OpenGL. It seeks to use the processing power of the GPU for calculations that would otherwise be done by the CPU. The goal here isn't fast, real-time output, but rather high-quality film-res output that can still take hours to render, so theoretically it's designed to give consistent results across all cards that support it.

Solution: Use one model of card, one driver rev (1)

Namarrgon (105036) | more than 10 years ago | (#8914770)

It's certainly possible that different hardware or even different drivers on the different machines doing the rendering can create subtle (or not-so-subtle) differences in each resulting frame, but standardising the hardware and drivers across machines should solve that completely.

Fab for machinima (4, Informative)

Paul Crowley (837) | more than 10 years ago | (#8914540)

For some possible applications, check out machinima.com [machinima.com] - film-making in real time using game engines.

This would be more useful (-1, Insightful)

Slashdot Hivemind (763065) | more than 10 years ago | (#8914542)

If you could get any decent 3D modelling software on Lunix. The choice of Blender, Blender or Blender just doesn't cut it compared to Hash, 3DS Max, Lightwave, Maya, Truespace or any other of the wide range of options available to Mac and Windows users.

Re:This would be more useful (1)

master0ne (655374) | more than 10 years ago | (#8914562)

Don't forget Blender for Windows as well.

Re:This would be more useful (-1)

Anonymous Coward | more than 10 years ago | (#8914564)

You clueless tool.

Maya is available for linux.

Re:This would be more useful (4, Informative)

BiggerIsBetter (682164) | more than 10 years ago | (#8914586)

Alias has Maya for Linux. Newtek has Lightwave rendering node software for Linux. There are a few other 3D packages like AC3D too.


Re:This would be more useful (3, Informative)

EnglishTim (9662) | more than 10 years ago | (#8914673)

Insightful my ass.

Maya, Houdini and XSI are all available for Linux, and they work well.

Re:This would be more useful (1)

SlightOverdose (689181) | more than 10 years ago | (#8914688)

Don't forget POVRay

Re:This would be more useful (2, Informative)

afd8856 (700296) | more than 10 years ago | (#8914804)

I hope you know that all the high-end 3D packages are available on Linux: XSI, Maya, Houdini, Real3d. And then you have some cool open source, like Wings3D, that can cover a lot of ground in the modeling field. Combine that with Blender & YafRay, and (theoretically) you have a complete open-source animation studio!

Eat some gelato (2, Informative)

SpikyTux (524666) | more than 10 years ago | (#8914546)

Gelato (Italian) == Ice cream

WRONG! (1)

terriblekarmanow tm (592883) | more than 10 years ago | (#8914678)

Gelato == man cream, not ice cream.

I'll never make that mistake again in Rome.

Re:Eat some gelato (1)

Vampyre_Dark (630787) | more than 10 years ago | (#8914725)

And gelato is what you turn into when you have too much. :)

Re:Eat some gelato (1)

piper-noiter (772438) | more than 10 years ago | (#8914781)

With a cost of $3200+ (license plus annuity), this better be the best damn ice cream I've ever consumed. I expect chocolate, caffeine, and it had better be able to rinse out the bowl and stick itself in the dishwasher. That's all I have to say.

Quick question... (3, Funny)

Noryungi (70322) | more than 10 years ago | (#8914549)

Is there any Free software capable of exploiting the general computing power of modern video cards?

Well, since they released "a C++ interface for plugins and integration" for Gelato (ice cream in Italian, btw), this probably means that free software can (and, eventually, will) support all these high-end functions... or am I completely wrong?

For instance, just imagine Blender with a Gelato plug-in for rendering... hmmmm... Now I understand why they named it "Gelato"...

Re:Quick question... (1)

sshtome (771249) | more than 10 years ago | (#8914592)

Am I wrong in thinking that Maya runs on Unices already?

Some folks just can't afford it (read: everyone), but I think it's good to have proprietary software on Linux. If you make 3D graphics for big money, you can afford to pay for the big dogs in 3D. And Linux is the smart choice for your render farm...

Yes, you or I get Blender. But you and I couldn't do the 3D graphics for Harry Potter 4.

Re:Quick question... (0)

Anonymous Coward | more than 10 years ago | (#8914729)

Maya was under Irix as far back as '96 --- this I know after doing no research as I ran Maya on an Indigo2 Extreme.

Don't you mean.. (1)

Viceice (462967) | more than 10 years ago | (#8914550)

Nvidia releses Hardware-Accelerated video renderer?

Re:Don't you mean.. (1)

DrSkwid (118965) | more than 10 years ago | (#8914805)


or even releases

btw. it's punishment

For low res, general computing power is too cheap (-1, Offtopic)

Sancho (17056) | more than 10 years ago | (#8914566)

Even a low end computer these days will be capable of rendering DVD quality video in a short time with many filters in place. I routinely use free tools (AVISynth being the biggest) as well as inexpensive encoders (TMPGEnc, and I want to try CCE Basic one of these days) to make DVDs from my video. I filter the hell out of my video, and it rarely takes more than 7x realtime on my Athlon XP 2800. Usually it takes much less time. And TMPGEnc is considered a fairly slow encoder. With a P-IV and CCE, I would probably cut that time in half, at least.
Using a higher-resolution source would certainly slow down the process, but is it really enough to warrant paying high prices for this sort of software? Probably not. If you are going for real quality, you'll probably spend much more time working out the processing for the video than you will spend encoding. If you just want something simple, you probably don't need this hardware, because for simple work, general computing power is definitely fast enough. Without filters, I encode at roughly half-realtime for Full-D1 DVD resolution, for example.

Re:For low res, general computing power is too che (-1, Troll)

imsabbel (611519) | more than 10 years ago | (#8914751)

Come back when you've acquired basic knowledge of what RENDERING means in this context. (I would call you a stupid DivX kid if it weren't for your user ID; you should know better.)

the problem is in the Bus (4, Insightful)

rexguo (555504) | more than 10 years ago | (#8914573)

The AGP bus has asymmetrical bandwidth: upstream to the video card is about 10x faster than downstream to the CPU. So you can dump tons of data onto the GPU, but you can't get the data back for further processing fast enough, which defeats the purpose.

Re:the problem is in the Bus (1)

BiggerIsBetter (682164) | more than 10 years ago | (#8914595)

Not if the purpose is to output to a recording or viewing device, but they're probably planning to use it with PCI-X anyway.

Re:the problem is in the Bus (5, Insightful)

snakeshands (602435) | more than 10 years ago | (#8914617)


The purpose might mostly be to show people why they need to run out and get PCI Express hardware; it completely addresses the asymmetry issue.

I'm guessing the main reason Gelato is spec'd to work on their current AGP kit is to encourage the appearance of really impressive benchmarks showing how much better performance is with PCI Express.

They have a good idea, and they're rolling it out at a good time, I think.

Some folks were trying to do stuff like this with PS2 game consoles, but I guess now they'll have more accessible toys to play with.

Re:the problem is in the Bus (-1, Troll)

Anonymous Coward | more than 10 years ago | (#8914721)

The PS2 console thing wasn't using the GPU. They were attempting to use one of the processors, which happens to be vector-based (the PS2 has something like 3 processors, and they're all a mess to code for; smart, huh?) and so has an amazing capability for math work.

Re:the problem is in the Bus (1)

WARM3CH (662028) | more than 10 years ago | (#8914620)

I believe the problem is not with the AGP bus, but rather with the GPUs, which were NOT designed in the first place to transfer anything back to memory. In normal 3D applications, you just feed the graphics card all sorts of data (textures, geometry, shaders...) and the result goes out through the VGA connector! You don't need to give anything back to the CPU or to memory. The GPU and the memory architecture of the graphics card are simply designed to receive data from the CPU at the highest possible speed. However, this doesn't mean you CAN'T get high bi-directional throughput over AGP if you design a special GPU for offline rendering.

Re:the problem is in the Bus (0)

Anonymous Coward | more than 10 years ago | (#8914717)

I know a researcher who is working on this problem.

What he does is take the video output, over DVI, and feed it back into a "homemade" PCI card with 2 FPGAs on it. Then he can benefit from the full bandwidth.

But it is quite pricey...

Re:the problem is in the Bus (1)

Divlje Jagode (710824) | more than 10 years ago | (#8914799)

That's for video processing (impressive nonetheless); what we're talking about is bus data transfer. The problem seems to be that you can push arrays (texture data) to the card's memory but can't get the processed data back anywhere near fast enough...

A couple of years back there was an article on /. about this. Some guy (team?) had designed a benchmark to show the asymmetrical throughput. There was a program you could download to check it out for yourself (Windows).

Re:the problem is in the Bus (1)

Trogre (513942) | more than 10 years ago | (#8914723)

If it's I/O bound, yes.

However, if the GPU can be left to crunch most of the time and return, say, a row of pixels at a time, I doubt the 1/10th-speed downstream AGP bus would be a big problem. For complex scenes, the input (textures, geometry, shaders) may well exceed the output (pixels) in terms of data.

Re:the problem is in the Bus (1)

Kris_J (10111) | more than 10 years ago | (#8914752)

Hello? Digital out? You can plug it into something other than an LCD monitor.

Not an issue, esp. for non-RT rendering (4, Insightful)

Namarrgon (105036) | more than 10 years ago | (#8914759)

You're right that the AGP port is asymmetric, but this is unlikely to be a bottleneck if they can do enough of the processing on the card.

For 3D rendering, especially non-realtime cinematic rendering, you have large source datasets - LOTS of geometry, huge textures, complex shaders - but a relatively small result. You also generally take long enough to render (seconds or even minutes, rather than fractions of a second) that the readback speed is not so much an issue.

Upload to the card is plenty fast enough (theoretical 2 GB/s, but achieved bandwidth is usually a lot less) to feed it the source data, if you're doing something intensive like global illumination (which will take a lot more time to render than the upload time). Readback speed (around 150 MB/s) is indeed a lot slower, but when your result is only e.g. 2048x1536x64 (FP16 OpenEXR format, 24 MB per image), you can typically read that back in 1/6 of a second. Not to say PCIe won't help, of course, in both cases.

Readback is more of an issue if you can't do a required processing stage on the GPU, and you have to retrieve the partially-complete image from the GPU, work on it, then send it back for more GPU processing etc, but with fairly generalised 32 bit float processing, you can usually get away with just using a different algorithm, even if it's less efficient, and keep it on the card.

Another issue might be running out of onboard RAM, but in most cases you can just dump source data instead & upload it again later.
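The readback arithmetic above can be worked through directly (illustrative Python, using the same figures quoted in this comment):

```python
# Worked version of the readback estimate: a 2048x1536 frame in FP16 OpenEXR
# (RGBA, 16-bit float per channel = 8 bytes per pixel) read back over AGP at
# roughly 150 MB/s.

width, height = 2048, 1536
bytes_per_pixel = 4 * 2                    # RGBA x 16-bit float
frame_bytes = width * height * bytes_per_pixel
frame_mb = frame_bytes / (1024 * 1024)     # 24.0 MB per image

readback_mb_per_s = 150                    # rough AGP readback figure
seconds = frame_mb / readback_mb_per_s
print(frame_mb, seconds)  # 24.0 MB, 0.16 s -- about 1/6 of a second
```

Against render times of seconds to minutes per frame, a 0.16 s readback is indeed negligible, which is the comment's point.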

Teh horror !!! (5, Insightful)

Anonymous Coward | more than 10 years ago | (#8914574)

"Gelsto is proprietary (and pricey)"

A company that wants to be paid for its work, weird!

You will see more, a lot more, of this for the Linux platform in the near future.

Software may be released with source code, but there's no way it will be released under the GPL; most ISVs can't make a living releasing their work under the GPL.

And please, the "but you can provide consulting services" argument is not valid; it doesn't work that way in the real world.

Re:Teh horror !!! (0)

Anonymous Coward | more than 10 years ago | (#8914696)

>And please the "but you can provide consulting
>services" argument is not valid, it dont work that
>way in the real world.

Indeed. If you are entering the product-support consulting business, the extra expense of creating your own product to support only pays off if the product is a whiz-bang success.

They could support themselves by banner ads on their home page. That is a tried and true business model.

Re:Teh horror !!! (2, Insightful)

Cobron (712518) | more than 10 years ago | (#8914713)

Still, I think it's a good idea to mention when software for Linux is proprietary.
That just saved me the trouble of clicking the link and checking it out ;) .

ps: I recently visited a project trying to "harness the power of GPUs". I think the project was something like seti/folding/ud/... but it tried to have all the calculations done by the GPU of your 3D card.
If someone knows what I'm talking about: please post it, I can't find it anymore ;)

Re:Teh horror !!! (1)

Blackknight (25168) | more than 10 years ago | (#8914743)

If I had mod points I'd mod you up. I'm getting tired of the slashdot mentality that everything has to be free. If a company makes a professional grade product, and there is demand for that product, they should be able to be rewarded for their efforts.

Re:Teh horror !!! (1, Insightful)

Anonymous Coward | more than 10 years ago | (#8914784)

maybe they can sell the hardware?

BURN!!!!!! (1)

Viceice (462967) | more than 10 years ago | (#8914587)

I wonder what market segment nVidia is gunning for. Are they after some of Discreet's market share, or trying to offer a hardware solution that will beat the crap out of Adobe After Effects?

It would be really cool to have a hardware solution with Combustion-like features for the price of After Effects.

Re:BURN!!!!!! (1)

TheOldFart (578597) | more than 10 years ago | (#8914665)

Why not both? After Effects will do what Burn does, but very slowly. Burn will do what After Effects does really fast, but it's very, very expensive. Combine these new hardware accelerators with some clever code and you get all that for a fraction of the cost.

Re:BURN!!!!!! (0)

Anonymous Coward | more than 10 years ago | (#8914953)

No, they want film studios to use Quadro GPUs instead of CPUs in their render farms.

'pricey' (5, Insightful)

neurosis101 (692250) | more than 10 years ago | (#8914589)

Um... it depends on what you're looking for/expect. This isn't intended for you to buy and use at home. It's more likely for smaller developers (big developers usually write their own... think Pixar). Professional-grade equipment is all expensive. The first common digital nonlinear editor was the Casablanca, and with an 8 GB SCSI drive it ran close to a grand when it was released. And that was just a single unit.

I bet the people who buy this are big-time architects who have a few machines set up to do renders for clients, want to do some additional effects for promo/confidence value, and likely already have people running that type of hardware.

Then again, all those Quadro users could be CAD people, and they've got no audience. =)

Re:'pricey' (1)

Dynedain (141758) | more than 10 years ago | (#8914681)

I work for a smaller graphics house that is part of an architecture firm and does mostly (but not all) architectural work.

This product competes against other rendering engines like Mental Ray, V-Ray, RenderMan, etc. And at $2750 per license it's definitely not for smaller developers or architects. There are plenty of other rendering engines out there that are significantly cheaper and don't require a video card that costs as much as an entire x86 render node.

Re:'pricey' (2, Interesting)

RupW (515653) | more than 10 years ago | (#8914765)

Professional grade equipment is all expensive.

No, you can get raytracing hardware for less than the software and a Quadro FX would cost you.

For example, there's the ART PURE P1800 card [artvps.com], a purpose-built raytracing accelerator. It's a mature product with an excellent feature set, speaks RenderMan, and has good integration with all the usual 3D packages. It's generally acknowledged as a very fast piece of kit with excellent image quality and plenty of quality/speed trade-off options. And if you've got a deeper wallet, they do much bigger network-appliance versions.

Re:'pricey' (1)

Tim C (15259) | more than 10 years ago | (#8914962)

Then again all those Quadro users could be CAD people and they've got no audience. =)

Not just CAD - I do server-side Java programming, and we've all recently been bought new PCs. The spec we went for included a Quadro FX 500; don't ask me why, it just did... (it was that, or a similar machine with a GeForce - I didn't make the choice)

I guess this is what BMRT has turned into... (1, Informative)

Anonymous Coward | more than 10 years ago | (#8914601)

It looks like this is what NVIDIA has done with BMRT since buying it: look at what has become of exluna.com [exluna.com]

If anyone's wondering, a couple of the last releases of BMRT (Blue Moon Rendering Tools) before NVIDIA pulled the plug on them are available here [unsw.edu.au]

General computing on graphics hardware (3, Informative)

attaka (739833) | more than 10 years ago | (#8914606)

I have been reading interesting stuff about this lately. Take a look at this Stanford project: BrookGPU [stanford.edu]

This might also be interesting: GPGPU [gpgpu.org] /Arvid
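For anyone curious what "general computing on graphics hardware" looks like in practice: BrookGPU's core idea is the stream programming model, where you write a pure function (a "kernel") that gets mapped element-wise over data streams, and the compiler turns it into GPU shader code. Here's a rough conceptual sketch, using NumPy on the CPU as a stand-in (illustrative only, not actual Brook code):

```python
# The stream model BrookGPU exposes, sketched with NumPy as a CPU stand-in.
# In Brook you would write something like:
#
#   kernel void saxpy(float a, float x<>, float y<>, out float result<>) {
#       result = a * x + y;
#   }
#
# i.e. a pure function mapped element-wise over streams. The NumPy
# equivalent expresses the same data-parallel operation:
import numpy as np

def saxpy(a, x, y):
    """Element-wise a*x + y -- the kind of kernel a GPU evaluates per element."""
    return a * x + y

x = np.arange(4, dtype=np.float32)   # input stream 1: [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)     # input stream 2: [1, 1, 1, 1]
print(saxpy(2.0, x, y))              # [1. 3. 5. 7.]
```

The point of the model is that each output element depends only on the corresponding input elements, which is exactly the constraint that lets a GPU evaluate them all in parallel.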

Re:General computing on graphics hardware (0)

Anonymous Coward | more than 10 years ago | (#8914714)

GPGPU...

Hmm...

So could you stack your computer up with multiple PCI video cards to effectively make a multiprocessor PC?

NOTE: I did *NOT* say a Beowulf cluster of video cards.

Re:General computing on graphics hardware (1)

BiggerIsBetter (682164) | more than 10 years ago | (#8914787)

Yes. I've done it with the BrookGPU library and two Nvidia cards (1 AGP, 1 PCI). The benchmark wasn't that much faster, though, presumably because the datasets weren't big enough to just leave the cards running; I suspect the main CPU was too busy feeding them to get any real work done.

Linux software (5, Informative)

HenchmenResources (763890) | more than 10 years ago | (#8914607)

Is there any Free software capable of exploiting the general computing power of modern video cards?

Take a look at the Jahshaka [jashaka.com] project. It is a real-time video editing suite, and the designers have been working with (and supposedly getting support from) Nvidia, so I would imagine they have, or certainly will have, access to these video cards. I can't imagine them not taking advantage of this technology.

The other nice thing is if memory serves me correctly this program is being designed to work on Windows, Linux and OS X, so good news all around.

Using GPU for signal processing (4, Informative)

PastaAnta (513349) | more than 10 years ago | (#8914628)

is there any Free software capable of exploiting the general computing power of modern video cards?

A quick Googling revealed the following:
- BrookGPU [stanford.edu]
- GPGPU [gpgpu.org]

Re:Using GPU for signal processing (1)

Kris_J (10111) | more than 10 years ago | (#8914762)

Is there a D.Net client for my GeForce4 yet?

ExLuna, take 2 :) (2, Interesting)

Anonymous Coward | more than 10 years ago | (#8914653)

Seems like the ex-Exluna staff (bought by NVidia) are going to kick PRMan's a$$ at the hardware level: they tried it at the software level with Entropy, but got sued into oblivion by Pixar. Now it's time for revenge?

M4 open GL VJtool. (3, Interesting)

kop (122772) | more than 10 years ago | (#8914660)

M4 [captainvideo.nl] is a free-as-in-beer movie player/VJ tool that uses the power of OpenGL to manipulate movies, images and text.


Free software is a product of a lifecycle (3, Insightful)

heironymouscoward (683461) | more than 10 years ago | (#8914679)

Software, like much technology, follows a classic cycle from rare/expensive to common/cheap as the knowledge and means required to build it get cheaper.

"Moore's Law" is simply the application of this general law to hardware. But it applies also to software.

Free software is an expression of this cycle: at the point where the individual price paid by a group of developers to collaborate on a project falls below some amount (which is some function of a commercial license cost), they will naturally tend to produce a free version.

This is my theory, anyhow.

We can use this theory to predict where and how free software will be developed: there must be a market (i.e. enough developers who need it to also make it) and the technology required to build it must be itself very cheap (what I'd call 'zero-price').

History is full of examples of this: every large scale free software domain is backed by technologies and tools that themselves have fallen into the zero-price domain.

Thus we can ask: what technology is needed to build products like Gelato, and how close is this coming to the zero-price domain?

Incidentally, a corollary of this theory is that _all_ software domains will eventually fall into the zero-price domain.

And a second corollary is that this process can be mapped and predicted to some extent.

Re:Free software is a product of a lifecycle (1)

tehcyder (746570) | more than 10 years ago | (#8914797)

Perhaps I'm not following, but doesn't this mean that Windows should be free by now?

Re:Free software is a product of a lifecycle (1)

heironymouscoward (683461) | more than 10 years ago | (#8914861)

Windows is a specific product, not a technology. The technology would be the operating system. An OS is built on a series of other technologies, most obviously compilers, networking, disk management systems, kernel models, etc.

Since these underlying technologies have been zero-priced since the 1980s (mainly thanks to Unix), the OS as a technology has indeed fallen into the zero-price domain as well.

In other words: a small team can today build a product that competes fairly well with Windows, using off-the-shelf technology. This was not true 10 years ago.

Windows is not free, but it obviously has found itself right in the middle of the zero-price zone.

GPU as 2nd processor (slightly offtopic) (2, Interesting)

CdBee (742846) | more than 10 years ago | (#8914709)

is there any Free software capable of exploiting the general computing power of modern video cards?

I expect that once it becomes clear that the GPU in a modern video card has serious processing power, someone will release a version of the SETI@Home [berkeley.edu] client that can use the rendering engine as a processor. Bearing in mind that most computers use their GPUs for a very small percentage of their logged-in life, I suspect there is real potential for putting them to work on distributed computing projects.

Absolutely (2, Informative)

TheFr00n (643304) | more than 10 years ago | (#8914735)

Check out www.jahshaka.com. It's an open source video compositing / FX package that leverages the 3D accelerator chip on your graphics card to do incredible things. This is one to watch, it's definitely going places.

You can download binaries for linux and windows (and MAC), and source tarballs are available for the savvy.

I know, it's not strictly a "renderer", but it employs many of the functions of a renderer to create realtime effects and transitions.

Little value... (3, Insightful)

winchester (265873) | more than 10 years ago | (#8914767)

Almost every FX house worth its salt in the CG business uses Pixar's RenderMan on UNIX or Linux machines. The reasons behind this choice are very simple.

RenderMan is proven technology and has been since the early '90s. It is well known, its results are predictable, and it is a fast renderer. Current production pipelines are also optimised for RenderMan.
UNIX and Linux are quite good when it comes to distributed environments (can anyone say render farm?) and handle large files well (think a 2k by 2k image file, or large RIB files).
And last but not least, RenderMan is available with a source-code license.

Hardware-accelerated film rendering is, in essence, nothing but processor operations, some memory to hold objects, and some I/O to read the source files and write out the film images. So please explain to me why a dedicated rendering device from Nvidia would be any better than your average UNIX or Linux machine. Correct: there aren't any advantages, only disadvantages (more expensive, proprietary hardware, unproven, etc.).

Re:Little value... (1)

Rotting (7243) | more than 10 years ago | (#8914907)

That is odd.

I am no expert in this matter, but a friend of mine has worked in the graphics industry (mostly on movies and TV shows) for several years now, and from what I have learned from him, a lot of modelling is done in 3DS Max, with a lot of work also done in Maya.

I could be wrong though :)

Re:Little value... (-1)

Anonymous Coward | more than 10 years ago | (#8914959)

Posting as anonymous coward because I've already modded in this story.

I think you need to find out the difference between modelling and rendering...

Free software ready indeed! (5, Informative)

Goeland86 (741690) | more than 10 years ago | (#8914813)

I think there is indeed free software for making movies and rendered animations using raytracing. First, Cinelerra can use a Linux cluster for movie rendering. Second, there's a whole bunch of modellers/raytracers out there that perform very well:
- POV-Ray, the oldest and most advanced, can run on a PVM cluster
- YafRay, relatively recent, can use an openMosix cluster for networked rendering
- Blender now integrates a raytracer AND exports to YafRay
Those are the four programs I know of and use, but there are more; I just haven't looked for more. So, yes, there is free software for movie rendering already!
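The operation all of these raytracers share at their core is intersecting a ray with scene geometry. A minimal sketch of the simplest case, ray vs. sphere (illustrative only, not any project's actual code):

```python
# Minimal sketch of the core test every raytracer performs: does a ray
# hit a sphere, and at what distance? (Quadratic solved for the nearest root.)
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest intersection, or None on a miss.
    `direction` is assumed to be a unit vector."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c           # a == 1 for a unit direction
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None      # intersections behind the ray don't count

# Ray from the origin along +z toward a unit sphere centred at (0, 0, 5):
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A production renderer runs millions of these tests per frame (against triangles and with acceleration structures), which is why the question of whether a GPU can help is interesting at all.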

math coprocessor (2, Insightful)

PineGreen (446635) | more than 10 years ago | (#8914827)

Instead of just using the native 3D engine in the GPU, as done in games, Gelato also uses the hardware as a second floating point processor.
Does this mean that I could eventually use my GeForce to do things like matrix inversion for me?
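In principle, yes, that's exactly the kind of workload meant. To be concrete about what the computation is (NumPy on the CPU as a stand-in here; no off-the-shelf GPU linear-algebra library is assumed to exist):

```python
# What "matrix inversion on a coprocessor" actually computes, shown with
# NumPy on the CPU. A GPU offload would do the same arithmetic in shaders.
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
A_inv = np.linalg.inv(A)

# The defining property of the inverse: A @ A_inv is the identity matrix.
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```

Whether the GPU wins depends on whether the arithmetic saved outweighs the cost of shipping the matrices across the bus, which for small matrices it likely wouldn't.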

round and round we go (2, Informative)

renderfarm (721357) | more than 10 years ago | (#8914845)

ART, OGL-assisted, now Gelato. Sure, there's a place for this, but how do I stick an FX card into my several hundred 1U racks, either physically or financially? Have you seen the size of these cards? Some vendors (Mental Images) are leveraging GPU power and have done the same with OpenGL for some time, but unless the GPU calls are made through the renderer, so you can hide behind a consistent API, it's a waste of the hard-earned time you spent getting your pipeline into shape in the first place. Long live the GPU, but I don't want to be aware of its presence. PS. I think ATI are actually the smart kids on the block but chose the wrong colours for their marketing hype. Maybe they can get their chips straight onto the motherboard (much smarter).

Re:round and round we go (1)

RupW (515653) | more than 10 years ago | (#8914895)

ART, OGL assisted, now gelato.

Do you mean this ART [art.co.uk] ? Is their stuff any good? I went to a lecture given by one of their tech guys a long time ago and it sounded pretty impressive.

Video Cards as Renderers (4, Interesting)

agby (303294) | more than 10 years ago | (#8914849)

I was under the impression that it's hard to use a video card for general computing tasks because of the way that AGP is designed. It's really good at shunting massive amounts of data into the card (textures, geometry, lighting, etc) but terrible at getting a good data rate back into the computer. They're designed to take a load of data, process it and push the output back to the screen, not the processor. This is the major reason, IMHO.
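Some back-of-the-envelope arithmetic makes the asymmetry concrete. The bandwidth figures below are ballpark assumptions for AGP-era hardware, not measured values:

```python
# Illustrative AGP asymmetry: uploading a frame's worth of data is fast,
# reading results back is not. Bandwidth numbers are rough assumptions.
frame_bytes = 2048 * 2048 * 4 * 4    # 2k x 2k pixels, 4 float32 channels
upload_bw   = 2.1e9                  # ~AGP 8x peak toward the card, bytes/s
readback_bw = 0.1e9                  # readback was typically far slower

print(frame_bytes / 2**20, "MiB per frame")                  # 64.0 MiB
print(frame_bytes / upload_bw * 1000, "ms to upload")        # ~32 ms
print(frame_bytes / readback_bw * 1000, "ms to read back")   # ~671 ms
```

With a readback path an order of magnitude slower than the upload path, any general-compute scheme has to keep intermediate results on the card and only read back final answers, which is precisely what makes GPU programming awkward for general tasks.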