
ILM Now Capable of Realtime CGI

chrisd posted more than 11 years ago | from the ilm-has-a-raleigh-mann-too dept.

Movies

Sandman1971 writes "According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed. Actors on the latest Star Wars film watch instant replays of their battles with CG characters. ILM CTO Cliff Plumer attributes this amazing leap to the increase in processing power and a migration from Silicon Graphics RISC-Unix workstations to Intel-based Dell systems running Linux."


262 comments


Errm... (4, Insightful)

bconway (63464) | more than 11 years ago | (#5750380)

According to the Sydney Morning Herald, special FX company ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.

Wouldn't realtime be WHILE the scene is filmed?

Re:Errm... (2, Interesting)

thona (556334) | more than 11 years ago | (#5750389)

Well, how can the ACTOR look at the scene WHILE he is playing it, without looking like he is looking at a scene? Also, the director is probably more concentrated on the screenplay.

Re:Errm... (0, Troll)

Alan Partridge (516639) | more than 11 years ago | (#5750415)

he could look at a monitor? People in TV do, you know.

Re:Errm... (0)

Anonymous Coward | more than 11 years ago | (#5750514)

Are you stupid or just acting like you are? How can an actor play a role and look at a monitor at the same time?

Re:Errm... (0, Troll)

Alan Partridge (516639) | more than 11 years ago | (#5750568)

how can a person walk and chew bubblegum at the same time?

you never seen a weather report?

Re:Errm... (1)

JJahn (657100) | more than 11 years ago | (#5750708)

Weather report people are standing there talking, dumbass. These are actors who are moving and must adopt the expression and feel of their character. It would not do to have them distracted.

Re:Errm... (1, Troll)

Alan Partridge (516639) | more than 11 years ago | (#5750784)

you don't know anything about anything, do you?

They're ACTORS, their job is to fucking handle distractions and make it look good. They are distracted on all sides by techies doing THEIR jobs, they're taking direction from the directors, they're hitting focus marks over and over again and they're crying on fucking cue.

Re:Errm... (2, Insightful)

sigep_ohio (115364) | more than 11 years ago | (#5750495)

'Also, the director is probably more concentrated on the screenplay.'

not if your name is George Lucas. then it is all about the eye-candy

Re:Errm... (0, Flamebait)

cyberlotnet (182742) | more than 11 years ago | (#5750395)

Think a minute you flame monkey, The actors are a bit busy doing there scenes, It would not make sense to have this viewable DURING filming, I would imaging it would be a bit disrupting at times to see your "cgi" lookalike doing the things your doing.

Re:Errm... (-1, Troll)

Alan Partridge (516639) | more than 11 years ago | (#5750430)

THEIR

THERE

THEY'RE

is this stuff not taught in American public schools anymore?

Re:Errm... (0, Offtopic)

sigep_ohio (115364) | more than 11 years ago | (#5750515)

nope. heck i got thru hi skool wit out lernen hows to reed or rite.

of course sometimes it is just easier to write something up and not worry too much about spelling or punctuation. it is the sentiment that matters, plus you obviously knew which version the author intended to use in the post.

Re:Errm... (-1, Offtopic)

cyberlotnet (182742) | more than 11 years ago | (#5750555)

Checking Faq
Checking Terms Of Service

Nope, Spelling is not required.

Checking posts by editors of slashdot.

Matter fact, Spelling is clearly optional.

Re:Errm... (0)

Anonymous Coward | more than 11 years ago | (#5750521)

doing the things your doing.

Not to mention "your" and "you're".

Spelling-impaired poster's sig: "If I had a nickle for every dupe on slashdot I would be rich!!"

And if I had a NICKEL for every spelling mistake, I'd be richer.

Re:Errm... (1)

say (191220) | more than 11 years ago | (#5750396)

ILM is now capable of doing realtime CGI, allowing actors and directors to see rough CGI immediately after a scene is filmed.

Wouldn't realtime be WHILE the scene is filmed?

Uh, as in "the actors are able to watch rough CGI while they are filming a scene"? That would be great ;) Holographic characters!

Re:Errm... (0)

Anonymous Coward | more than 11 years ago | (#5750398)

and won't this make filming take FOREVER? why wait for someone to import CGI into a film, making your workload 3x and cutting your actual product down to 1/4 of what it could be?

not only that, but what about on-location shots?

Re:Errm... (1)

Lt Wuff (319298) | more than 11 years ago | (#5750613)

Given that most movies run about 1/3 longer than they need to, wouldn't this be a good thing?

Re:Errm... (1)

Steeltoe (98226) | more than 11 years ago | (#5750447)

Wouldn't realtime be WHILE the scene is filmed?

As for the term 'realtime': without a reference to what it is realtime against, it could mean anything. That's just how language works; without a relation to something else, we have to assume something and hope the author intended what we assume.

Just goes to show the inadequacy of languages, and why so much confusion takes place in this world. No need to get upset about it, though.

Re:Errm... (5, Insightful)

UCRowerG (523510) | more than 11 years ago | (#5750463)

Technically, perhaps. I think this is a great tool for directors and actors. Instead of having to wait weeks/months to incorporate CGI and see the interaction, it can be done in minutes/hours, or as fast as the CGI people can splice things together. The director can give near-immediate feedback to the actor(s), which could really help the movie get done more quickly and at lower cost in the long run. Think about it: changing the expression/pose/color of a CGI character is fairly easy. Re-filming live actors, especially with live fx, can take much longer and be more expensive (salaries for actor, director, film crew... lighting, film, makeup, fx expenses).

Re:Errm... (1)

einer (459199) | more than 11 years ago | (#5750570)

I can't wait until we can wholesale replace Hollywood with a desktop box. No more listening to Tim Robbins flap his jaw about politics. No more waiting to replace the prima donna actress who just walked off the set because the production crew never ordered her Folgers high colonic. It's a director's dream (just like in S1m0ne, the Pacino movie no one saw).

There are drawbacks, however. Getting a half dog for a CG character would definitely throw up some flags. We'd have to update the Bible. No gay sex, and no sex with your computer... Especially gay computer sex, that's just right out. Whole new chapters would have to be added about how God turned George Lucas into a pillar of salt and smote ILM.

Maybe this isn't such a good idea.

Re:Errm... (5, Insightful)

jeffgreenberg (443923) | more than 11 years ago | (#5750571)

This is particularly important as they aren't using film.

With HD, Lucas is shooting actors on video... and now doing previsualization with the CG elements on set.

Did Liam look in the general direction of, but not AT the eyes of the CG character? Reshoot. etc. etc. etc.

Additionally a rough edit can be done off the video tap on set with the rough CG edit.

Unfortunately, this still means nothing without good acting, a good script, or alternate footage to make decisions from.

You make a film three times.

Once on the page, once while directing, and once in the edit. But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all.

Oh well, at least you can see what the giant CG creature looks like.

Re:Errm... (2, Insightful)

UCRowerG (523510) | more than 11 years ago | (#5750747)

"Once on the page, once while directing, and once in the edit. But if everything is so storyboarded and timed down to the moment that you can't have options, you can't discover anything in the edit at all."

I think this is exactly the point of this story. Before, a director would have to fit the CGI to the live action already filmed, or expend a *lot* more money bringing the actors, crew, etc. back to re-shoot the scene (several times). Now, a director can find a good "fit" for a scene almost immediately. It's like the CGI effects are almost the same as the real actors. Each can more easily react to the other.

Re:Errm... (2, Insightful)

mrtroy (640746) | more than 11 years ago | (#5750503)

No, adding the effects in "realtime" would still force you to rewind and watch it after.

That would be like saying videotaping isn't "realtime" since you have to rewind!

Jesus. It's CG and *NOT* CGI (-1)

Anonymous Coward | more than 11 years ago | (#5750755)

People need to get their acronyms straight.

That's nothing... (5, Funny)

say (191220) | more than 11 years ago | (#5750381)

...my webserver has been doing realtime CGI for years.
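The pun, for anyone who missed it: on a web server, CGI is the Common Gateway Interface - the server runs a script on each request and streams whatever it prints back to the browser. A minimal sketch of such a script in Python (the output text is made up):

#!/usr/bin/env python
# Minimal Common Gateway Interface script: the web server runs this on
# each request and returns whatever it prints to the browser.
print("Content-Type: text/html")
print()  # a blank line ends the HTTP headers
print("<html><body>Rendered in realtime!</body></html>")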

Re:That's nothing... (-1, Troll)

Anonymous Coward | more than 11 years ago | (#5750436)

Hello good sir! Please to stop being so gay! Thank you!

oh yeah (5, Funny)

Anonymous Coward | more than 11 years ago | (#5750658)

Post your server's url to slashdot, and we'll see just how realtime it is.

How long til... (5, Funny)

BeninOcala (664162) | more than 11 years ago | (#5750387)

We have Real-time CGI Porn?

How long... (1)

Raul654 (453029) | more than 11 years ago | (#5750581)

Till they beam it straight into my mind?

Re:How long til... (1)

NineNine (235196) | more than 11 years ago | (#5750590)

We have Real-time CGI Porn?


Almost there... we already have CGI porn, and we already have real-time porn... and let me tell ya', I've got 100 people hard at work on real-time CGI porn!

it'll be awhile. (1)

Multiple Sanchez (16336) | more than 11 years ago | (#5750717)

you're going to have to investigate the real thing.

Re:How long til... (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5750767)

Quite soon. Just look at what consumer-level GPUs can produce in real time [digital-daily.com].

Realtime (4, Funny)

Anonymous Coward | more than 11 years ago | (#5750390)

Maybe it IS realtime, but the actors just don't have the skill to watch themselves on a monitor WHILE acting, so they use the obvious 'I'll watch when I'm done' method.

What's the point about this? (4, Interesting)

cdemon6 (443233) | more than 11 years ago | (#5750393)

"Realtime CGI in Movie Quality" would be impressive, but:

"It's not at full resolution, but at least it gives them something to work with rather than working completely blind after each take."

Re:What's the point about this? (0)

Anonymous Coward | more than 11 years ago | (#5750458)

i think you just said what the point is. it gives them something to work with, instead of really not having a clue how the take looks until a few days/hours later. that way they can quickly decide if there needs to be a retake, and what changes need to be done

Re:What's the point about this? (1)

cdemon6 (443233) | more than 11 years ago | (#5750508)

but I do not consider a lo-res rendering preview *after* recording to be an impressive feature, that's what I meant :)

Serious Question (4, Interesting)

Anonymous Coward | more than 11 years ago | (#5750394)

With all the excitement over ILM using Linux I'm wondering exactly how many Hollywood visual effects studios use Linux.

nothing inherently special about dell/linux (5, Interesting)

AssFace (118098) | more than 11 years ago | (#5750399)

The way that is worded makes it sound as if the processing power of an Intel/Linux combination is superior - whereas it is really a matter of bang for the buck.

You can get more processing power with Intel/Linux since it is cheaper (I would imagine even more so with AMD) and easier to maintain - but not because it is inherently special or faster in any way.

I wonder if this will bring Silicon Graphics back toward Intel boxes - for a while they were okay with WinNT and Intel boxes, but then they dropped all of that - presumably for a higher profit margin and less hassle of maintaining multiple systems (also likely some break in business politics - perhaps someone at MS pissed someone off at SGI).

Re:nothing inherently special about dell/linux (0, Troll)

Alan Partridge (516639) | more than 11 years ago | (#5750454)

but the processing power of the Intel/Linux combination IS superior.

Go and have a look at SPEC.org if you don't believe me.

Re:nothing inherently special about dell/linux (2, Interesting)

AssFace (118098) | more than 11 years ago | (#5750490)

I can look there all day and all it will do is back up what I just said.

if you are talking about "running RISC chip A at 1GHz is this much slower doing this than were I to run it on chip B at 1GHz" - then that is totally different.
that is a useless benchmark - especially in terms of real world usage.

what is useful is exactly what I said in the first post - bang for the buck.
If you run Dell/Linux and you pay $500 for one entry level node, and your budget is $50K for this project, then you can have 500 nodes to crunch data on.
If you run SGI and pay $2000 for one entry level node and you have the same budget, then you are going to get more bang for the buck from the Dell/Intel/Linux combo.

But it isn't that Dell and Linux are somehow special - they are just cheap. SGI has plenty of solutions that kick the shit out of anything an Intel/Linux combo ever could - but they are cost prohibitive.

you can point to Spec.org all you want, but that won't change basic economic theory.

Re:nothing inherently special about dell/linux (0, Redundant)

AssFace (118098) | more than 11 years ago | (#5750501)

LOL

uhh - 100 nodes.

too early - need coffee
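The bang-for-the-buck argument, with the corrected node count, as a quick sketch: the prices come from the thread, while the per-node speeds are invented for illustration and generously assume the expensive node is twice as fast.

# Prices are the thread's hypotheticals; relative speeds are assumptions.
budget = 50_000
pc_cost, sgi_cost = 500, 2_000        # dollars per entry-level node
pc_speed, sgi_speed = 1.0, 2.0        # relative frames/hour per node (assumed)

pc_nodes = budget // pc_cost          # 100 nodes
sgi_nodes = budget // sgi_cost        # 25 nodes

print("PC cluster: ", pc_nodes * pc_speed)    # 100.0
print("SGI cluster:", sgi_nodes * sgi_speed)  # 50.0 - half the aggregate throughput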

Re:nothing inherently special about dell/linux (4, Interesting)

Alan Partridge (516639) | more than 11 years ago | (#5750554)

I take your point, but the fact remains that the FASTEST SGI workstation is treacly slow - in absolute terms - vs the fastest Intel-based 'station. ILM couldn't care a fuck how much it costs, they want cutting edge speed (hint - they didn't buy their previous solution based on cost, they bought it based on capability).

I work in TV, and I know first hand that SGI is losing out to commodity hardware running Linux, Windows and even to the Mac. SGI gear is just about hanging on thanks to discreet - but it's just a matter of time before an inferno for Intel product lands and a lot of Onyx racks hit eBay.

Unless, of course, SGI fights back...

Re:nothing inherently special about dell/linux (3, Interesting)

AssFace (118098) | more than 11 years ago | (#5750699)

I think we are "arguing" the same points here. But I'm pointing out the semantics of it and how it comes about - but in the end I think the end result is likely the same.

The article states that they upgraded their hardware and the new hardware is faster and cheaper than the prior hardware... uhhh, right - I'm pretty sure that is how the hardware world works.

Where you could argue that Linux has its edge is stated right in the article - it is the driver support. SGI doesn't support certain drivers, and for good reason - they want to push their own stuff. So if they want to work with new hardware - like the new NVidia chips for realtime rendering the same way SquareSoft did, then SGI isn't going to help.

Also, workstation speed is all relative - it depends on what you are doing on the particular workstation - are they slower at working with real-time video? are they slower at network filesharing? is their memory bandwidth too slow for the hardware to make full use of the processor?
To say it is too slow is a cop out - the hardware exists for a specific reason - SGI makes very action-specific workstations, and they are arguably useless outside of that realm.

And while it is a fantastic thing for you to throw around that you "work in TV", as if what you say is now backed by all of that business instead of just your opinion - then by my saying that I once worked at a special effects house, I should now have more weight in what I say, right?
I assure you that whether the effects house is SquareSoft, ILM, Digital Domain, or whatever - they all are businesses and have a single bottom line - they need to make money.
In order to make money, they won't ignore cost as you say. But it might look like that if they are rationalizing cost (a 100 node cluster of SGIs might be a million dollars, but a 200 node cluster of Alpha boxes might be 1.75million - they are spending more money, but they are getting a much faster overall cluster).

To argue over their workstations is silly in the end - the workstations are constantly being turned over at these places and nobody is ever satisfied with their performance. They don't really care if your workstation is top notch - what they care about is how fast the end product can be realized - if a faster workstation would result in that, then you get a faster one based on cost - but almost always, the entire focus of the drive of machines purchased is the rendering farm.
Even then, it hardly ever is truly purchased - it is a lease type deal since the turnover is so high.

I personally hated SGI when we worked with them and I much preferred the Intel boxes. So I'm not exactly standing up for SGI here; I mainly just thought the article was poorly written and should have called out the reason for the switch better, rather than just adding one more article to the Linux circle jerk.

Also I should note that I wrote SGI/Intel on WinNT up there - that is wrong - it was SGI/Alphas with WinNT. I would imagine that Intel and AMD now making the new 64bit chips will lead to a lot bigger jump over SGI.

Right... (1)

amcguinn (549297) | more than 11 years ago | (#5750741)

You're Alan Partridge, and you work in TV

I think you mean you used to work in TV

And even The Day Today's brilliant graphic effects weren't that advanced from a hardware standpoint...

Re:nothing inherently special about dell/linux (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5750562)

No, it won't back up what you said, because you're a stupid fucking sgi fanboy dumbass. On a one to one basis, intel cpu's outperform any cpu that SGI makes.

Fucking dumbass.

Even if intel's cost the same as the SGI's they'd be a better deal. The fact that they are cheaper just proves that SGI sucks total ass, and Intel is teabagging them just like I'm going to teabag you.

Re:nothing inherently special about dell/linux (1)

AssFace (118098) | more than 11 years ago | (#5750731)

I give you high points on style - any well-worded argument should include "ass", "fucking", and "teabagging" - and excellent use of "stupid" to help your cause. I initially felt I might be swayed, but then there is the whole factual basis of your argument, which is lacking... so I guess I would still, in the end, find that you are wrong.

That said, you did say "fucking dumbass" and I need to go look up the internet discussion rules, but I think that might trump any rational point I might have actually made - damn.

And if you read the other posts in this thread, you would actually see that I don't even like SGI, so I guess that rules out the "fanboy" part - but I'll give you "dumbass" - that one is dead on... I really am a dumbass.

further proof (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5750401)

that proprietary unix is dying

Re:further proof (5, Interesting)

Alan Partridge (516639) | more than 11 years ago | (#5750482)

not even close

further proof that commodity hardware is killing innovative companies like SGI, and a FREE UNIX is helping it happen.

Linux is great for a company like ILM which is stuffed full of coders who can adapt it to suit their needs, not so good for many other companies.

NVIDIA + Maya 5? (1, Interesting)

Anonymous Coward | more than 11 years ago | (#5750403)

Perhaps they're using GeForce FXs + the new hardware renderer in Maya 5. It sends geometry, shaders and lighting to the card for rendering, saving the finished frames to disk. This results in far faster rendering than making the CPU (not specialized for 3D operations) do all the work. Something like *twenty times* faster, give or take, depending on the scene's complexity. Cool stuff here, people!

Two Towers (5, Insightful)

alnya (513364) | more than 11 years ago | (#5750407)

In the Fellowship of the Ring DVD, Peter Jackson can clearly be seen watching Gollum on a monitor (low polygon, but Gollum nonetheless) performing the mo-cap that Andy Serkis is performing IN REAL TIME; as it is happening (not after).

So does this make this old news?

I dunno, I feel ILM has been behind the bleeding edge for some time now...

alnya

Re:Two Towers (3, Informative)

Zzootnik (179922) | more than 11 years ago | (#5750500)

No-no-no-no-no... that was Motion Capture.

Sensors on Andy S's body capturing movement data and feeding it into a computer... much like the mouse you're waving around right now...

Re:Two Towers (3, Informative)

chrisseaton (573490) | more than 11 years ago | (#5750667)

"Sensors on Andy S's Body capturing movement data and feeding it into a computer"

Yes... and then rendering the character on top of the real from using the motion capture info.

It's still realtime rendering.
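What realtime rendering off mo-cap boils down to, per frame: the captured joint angles go through forward kinematics to position the character's limbs, which are then drawn over the footage. A toy Python sketch of the kinematics step only - not Weta's actual pipeline:

import math

def forward_kinematics(angles, lengths, origin=(0.0, 0.0)):
    """2D joint chain: each captured angle is relative to the previous bone."""
    x, y = origin
    heading = 0.0
    points = [(x, y)]
    for angle, length in zip(angles, lengths):
        heading += angle
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# One mo-cap sample: shoulder, elbow and wrist angles in radians.
print(forward_kinematics([0.5, -0.3, 0.2], [10.0, 8.0, 3.0]))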

Yay! (2, Funny)

Plissken (666719) | more than 11 years ago | (#5750409)

This will probably help on release dates for movies.

We'll get to see Episode III sooner!

Re:Yay! (1, Funny)

sporty (27564) | more than 11 years ago | (#5750484)

Too bad you can't use that 3d technology in Duke Nuke'em forever.

Wait a minute...

[/joke]

Why the Pills? (1)

sweeney37 (325921) | more than 11 years ago | (#5750413)

Why the use of the Matrix logo? From everything I've gathered, the Wachowskis didn't use ILM...

Mike

Re:Why the Pills? (0)

mike_mgo (589966) | more than 11 years ago | (#5750533)

Going off topic here but...What's with all of the new logos on the topics page? Matrix and LOTR I understand, but about 6 different ones for computer/console games?

This has been coming for a while (2, Interesting)

ThundaGaiden (615019) | more than 11 years ago | (#5750416)

I always thought that with the current 3D cards coming out and the horsepower they can throw at things, they would eventually be able to do TV-quality 3D animation in real time.

Hopefully this is going to lead to a lot more 3D animated series on TV in the near future, and in time pick up from where Final Fantasy left off. I still think it was such a pity that film didn't get people into the cinema to watch it.

But I think the advances they made will pave the way for the future. Mainstream 3D anime, here we come :)

Re:This has been coming for a while (1)

sigep_ohio (115364) | more than 11 years ago | (#5750561)

actually Dreamworks and NBC are working on an all-CGI show for TV this fall (if I recall correctly).

Embedded journalists perhaps? (-1, Offtopic)

jkrise (535370) | more than 11 years ago | (#5750424)

Could this have helped our journos who covered the recent war? Maybe we could've watched it on the internet in almost real time?

Re:Embedded journalists perhaps? (1)

say (191220) | more than 11 years ago | (#5750441)

Or maybe we could just make the entire war in CGI :) Just artificial bombs, explosions and destruction. We would be able to make a real-time-war in almost no-time! Bush, Blair and Saddam would get their fun, and no lives would have to be taken.

Re:Embedded journalists perhaps? (1)

akadruid (606405) | more than 11 years ago | (#5750537)

Sit them all down to a game of Civ2.

Oh well (5, Funny)

stephenry (648792) | more than 11 years ago | (#5750433)

Its a pitty they haven't got one of those to write the script!

Steve.

Re:Oh well (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5750587)

SPELLCHECK = FALSE

Hrmm (5, Funny)

acehole (174372) | more than 11 years ago | (#5750435)

well, I guess they need to get that jar jar binks death scene juuuuust right.

Re:Hrmm (4, Funny)

docbrown42 (535974) | more than 11 years ago | (#5750494)

well, I guess they need to get that jar jar binks death scene juuuuust right.

Naw. The actors kept screwing up just so they could kill Jar Jar again...and again...and again. Given the chance, I think most fans would do the same thing.

Impressive, most impressive! (1)

count_dooku (448992) | more than 11 years ago | (#5750443)

Actors on the latest Star Wars film watch instant replays of their battles with CG characters.

That's even more impressive since they haven't started filming it yet!

--

Was I the only one (1)

Loosewire (628916) | more than 11 years ago | (#5750449)

that worried that
"to Intel-based Dell systems running"
would end with "Windows"?

Will this really improve movies? (2, Interesting)

PyrotekNX (548525) | more than 11 years ago | (#5750468)

It's hard to tell if this is anything more than a toy at this point. Marginal quality control is now possible. The time from pre-production to release might only differ by a few days.

The actors might be able to play their roles slightly better if they know what the final result will be. In movies like Episode II they were acting totally blind in front of a screen for most of the movie. Very little of it was actually built.

The biggest question is "When will we have it at home?"

Super news.. (1, Funny)

grub (11606) | more than 11 years ago | (#5750471)


I hope this brings even more realism and depth of character to Jar Jar.

Don't get so excited (3, Informative)

derrickh (157646) | more than 11 years ago | (#5750472)

The realtime images aren't -final- renders of the scene. They're just rough drafts. The scene still has to be rendered at full res/texture, which still takes hours per frame.

What ILM has is a supercharged 'preview' button. Just like when you switch to wireframe mode in Lightwave [newtek.com] or Maya [aliaswavefront.com] and see a 'realtime' preview of the animation you're working on. But I'm sure ILM's version looks a little bit better.

D
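Rough arithmetic behind the 'preview button' point, using assumed numbers (not ILM's): even a modest hours-per-frame cost keeps the render farm busy long after the realtime preview is over.

# All figures are illustrative assumptions, not ILM's real numbers.
fps = 24
shot_seconds = 10
frames = fps * shot_seconds           # 240 frames in the shot

hours_per_frame = 2.0                 # assumed full-res render cost
render_nodes = 100

farm_hours = frames * hours_per_frame / render_nodes
print(f"final render: {farm_hours:.1f} hours on the farm")   # 4.8
print(f"rough preview: {shot_seconds} seconds, i.e. realtime")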

Another nail in the SGI coffin (5, Interesting)

binaryDigit (557647) | more than 11 years ago | (#5750479)

Well, as more and more CGI houses move off of SGI (and on to whatever), they are only really left with their server business. It's really a shame to see a once-proud pioneer in the industry reduced to a mere shadow of their former selves, though I guess in this industry it's very common (e.g. DEC, Lotus, Compaq, etc). At this rate it's hard to even see them being around in 4 years; a definite takeover target.

ob /. comment:

SGI (aka Silicon Graphics Inc.) was found dead today at the age of 20. After being a high flyer in his youth, often seen hobnobbing with Hollywood's power elite, the latter years were not so kind and saw him in the throes of an identity crisis. Eventually his reliance on a small circle of friends was his undoing, as he was slowly replaced by more mainstream competitors. He will be sorely missed; while he was at the top, he was a role model for "cool" in the industry, and helped to usher in one of the most exciting (and abused) technology shifts in the motion picture/video entertainment industry since the advent of talkies and color.

The next innovation (5, Funny)

imadork (226897) | more than 11 years ago | (#5750488)

would be to develop a program that re-writes Lucas's inane dialogue in real time...

ILM? (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5750517)

So has anyone else been wondering why the Warchimp has been ignoring the US economy and focusing on threatening the rest of the world and alienating the US?

Well, now we have our answer - "Baghdad Blue" - the administration's current offer of jobs in law enforcement in Iraq.

So who wants to join this new pseudo-military? C'mon, the WarChimp wants to free up more troops so he can play war some more!

Slashdotters, Bush and his cronies have GOT to go.

Mod me down if you support killing innocent Iraqis.

What a scoop! (1)

Ciderx (524837) | more than 11 years ago | (#5750530)

Where'd they find this amazing scoop? Possibly from the entire featurette about the realtime generation of CGI on the DVD release of Attack of the Clones? Out circa November 2002...

Off Topic (-1, Offtopic)

mrtroy (640746) | more than 11 years ago | (#5750534)

I don't know about the rest of you Slashdotters, but I am quite happy to say that today at work we found Easter eggs hidden across the office! Now I am (in realtime) eating some of those bad boys.

What is it about foil-covered chocolate that I cannot resist, even in the early morning?

Realtime Plot help (1, Funny)

Anonymous Coward | more than 11 years ago | (#5750544)

Now if only that could help Lucas get realtime feedback that his plots suck.

A stunningly inaccurate article (4, Interesting)

sgi_admin (666721) | more than 11 years ago | (#5750549)

This is, largely, nonsense.

These images are *not* realtime! A PC is not capable of rendering a CGI screen, in realtime, and merging that, in realtime, with a video feed, and then displaying that, in *realtime*.

Say what you like about Linux, or high speed CPUs, or XXX vendor's high end GFX card - the architecture and the tools are physically incapable of this.

If you look at the extras on the LOTR:FOTR DVD set, you'll see people walking around with a camera on a stick. This *is* displaying real-time camera images, merged into a low-res, non-final-rendered scene of the Cave Troll fight in Moria.

A point of reference - the machines they are using for this are SGI Octanes. Not Octane2s, but Octanes.

They did that work around, what, 3 years ago? And the Octane, at that time, was only 3-4 years old.

Can anyone show me a PC from 1997 that can manage that? Anyone?

Despite the fact that the Octane is an ancient piece of kit, there is nothing from the PC world that can match its capabilities.

SGI have always been, and always will be, a niche player.

You would be a fool to buy expensive SGI kit for a renderfarm - buy Intel PCs with Linux. Similarly, you would be a fool to try to do realtime CGI with that same kit - that's a specialist task that calls for specialist skills.

This article does not show that SGI is dying, or that they're being thrown out of the GFX workstation market.

This article *does* confirm what is widely known - the once cutting edge ILM are now many years behind people like Weta Digital.

Throwing around "Linux" and "Intel replacing SGI" sound bytes to try and get some news coverage for a dated effects house isn't going to change that.
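The per-frame job both sides are arguing about is, in outline: render a CG layer, then alpha-merge it with the incoming video, all inside the frame budget. A minimal numpy sketch of the merge step alone (the rendering itself is the contested part):

import numpy as np

H, W = 480, 640
video = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)  # camera frame
cg    = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)  # rendered CG layer
alpha = np.random.rand(H, W, 1)                               # CG coverage matte

# Standard "over" composite: CG where alpha is high, video elsewhere.
composite = (alpha * cg + (1.0 - alpha) * video).astype(np.uint8)
print(composite.shape, "- budget per frame at 24 fps: %.1f ms" % (1000 / 24))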

Re:A stunningly inaccurate article (1)

Alan Partridge (516639) | more than 11 years ago | (#5750635)

you've never heard of virtual sets then? The BBC uses them every single day.

Re:A stunningly inaccurate article (1)

sgi_admin (666721) | more than 11 years ago | (#5750655)

you've never heard of virtual sets then? The BBC uses them every single day.

Indeed I have. I think you'll find they use a combination of Avid and SGI kit to display that.

Re:A stunningly inaccurate article (1)

Alan Partridge (516639) | more than 11 years ago | (#5750698)

they did (Avid? Their systems use a Mac or PC host), but not anymore. Look at (for example) ForA's Digiwarp product - Win 2K based. Video Toaster? Win 2K based. There are others, but these two are real and they work - and they're nothing to do with SGI.

Re:A stunningly inaccurate article (1)

sgi_admin (666721) | more than 11 years ago | (#5750746)

Avid? Their systems use a Mac or PC host

Avid's high-end systems run exclusively on SGI kit, and are capable of significantly more than the lower-end pre-prod stuff that runs on PCs and Macs.

I've yet to see any solution that can match the capabilities of a 7-year-old Octane in this area. The fact that you can cite software that claims to do a similar job is irrelevant. It's not in use by broadcast studios and effects houses, for the simple reason that it doesn't work.

Low tech solution (3, Funny)

dr.robotnik (205595) | more than 11 years ago | (#5750558)

Simply prop a big mirror against the wall, and Hey Presto! you can watch a virtual representation of the actors performing the scene in real time!

C'mon now.... (3, Interesting)

curtisk (191737) | more than 11 years ago | (#5750577)

This is pretty cool, no doubt, but I believe they were using this in a similar form for Two Towers (?)

I think it's funny that two recent articles call Linux immature or maturing, and that Novell gets bashed for calling Linux immature, but no one (as of this writing) makes mention of ILM saying the exact same thing.

ILM's move to Linux has been a long and gradual process. "Linux is still maturing," says Plumer. "When we started working with it a few years ago, it was still in its infancy and a lot of things we took for granted coming from (Silicon Graphics') IRIX operating system just weren't there - supportive drivers from different graphics cards and other things. "It took a while to mature, but right now it's going extremely well."

Ahhhh.... slashdot :)

Not new.. (2, Insightful)

Lord Grumbleduke (13642) | more than 11 years ago | (#5750579)

Pah - Jim Henson's Creature Shop, Weta and Framestore have been doing this sort of thing since long before ILM. Framestore did this for Dinotopia, Weta for Gollum, and JHC for a variety of different things - all too numerous to mention here.

Note to George (1)

EricWright (16803) | more than 11 years ago | (#5750591)

It's not the effects that suck, it's the scripts. Try spending some time on those instead.

Re:Note to George (0)

Anonymous Coward | more than 11 years ago | (#5750758)

Scripts? We need those?

I thought it was all about hi-tech special effects to "shock and awe" the viewing public. Why do we need a real plot?

Geez.

Real Realtime In My Dreams (1)

handy_vandal (606174) | more than 11 years ago | (#5750605)

When the audience can watch the finished scene, complete with CGI, as the actors are filming it -- now that would be realtime!

It's the video card, not the CPU.... (4, Insightful)

Faeton (522316) | more than 11 years ago | (#5750609)

Carmack himself [slashdot.org] (on Slashdot no less) has predicted this would come to pass, due to the increasingly feature-rich and faster video chipsets.

SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. Even Pixar thought the same [siliconinvestor.com]. Boy, were they wrong. You can set up a cheap-ass render farm for about $250k, taking up minimal space, that can do the same job as an SGI render farm that costs a cool $2 million (Shuttle SFF PC w/ 3GHz CPU + ATI 9700). Of course, there's still the software side.

Nvidia's GeForce FX and ATI's Radeon 9800 both contain features that, even through the marketing hype, have some real value to programmers out there. Just look at Doom 3. It will run well on some computers that are just 6 months old. Now, imagine taking 250 of them, as a Beowulf cluster!!1

Re:It's the video card, not the CPU.... (3, Informative)

_|()|\| (159991) | more than 11 years ago | (#5750791)

SGI laughed at the unassuming threat of the video chipsets, thinking that they would never be as fast as brute force. ... You can set up a cheap-ass render farm ... that can do the same job as a SGI render farm ... (Shuttle SFF PC w/ 3 gig CPU + ATI 9700)

A high-end video card is used in a workstation for content creation. Final rendering, however, is still done in software (i.e., by the CPU), whether it's LightWave, Mental Ray or RenderMan. Don't waste your money on a Radeon for your render node.

Re:It's the video card, not the CPU.... (1, Informative)

Anonymous Coward | more than 11 years ago | (#5750803)

Actually, the Radeon 9700/9800 supports only 24-bit floating point precision in the pixel shader and thus is unable to produce the same quality as the software renderers, which use 32-bit floating point precision.

The GeForce FX supports 32-bit floats, and the QuadroFX can be up to 20 times faster [nvidia.com] than the fastest CPU in Maya 5, which supports hardware rendering.
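The precision point can be illustrated in numpy. There is no 24-bit float type, so float16 stands in for the low-precision pixel shader and float32 for the software renderer; the low-precision accumulator visibly drifts:

import numpy as np

terms = np.full(10_000, 0.1)

lo = np.float16(0.0)
for t in terms:
    lo = np.float16(lo + np.float16(t))  # low-precision, shader-style math

hi = terms.astype(np.float32).sum()      # higher-precision, CPU-style math
print(float(lo), float(hi))              # the totals differ noticeably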

sweet (0)

ptrangerv8 (644515) | more than 11 years ago | (#5750626)

We're getting closer to the ability to go truly virtual, in realtime...

This world will be a scary (or cool) place in 25 years....

According to me... ;beer;

Open? (4, Interesting)

Diabolical (2110) | more than 11 years ago | (#5750659)

ILM developed its proprietary file format, OpenEXR

Hmm... I sense a trend in calling things open when they are actually closed. This is eroding the intended meaning of "Open" in front of file formats or products.

Re:Open? (4, Informative)

Kupek (75469) | more than 11 years ago | (#5750814)

It [openexr.net] was released under a modified BSD license [ilm.com] .
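The practical point of a format like OpenEXR is high-dynamic-range pixels: 16-bit "half" floats keep values above 1.0 (highlights, flames) that 8-bit formats clip. A numpy illustration of the storage idea, not the actual OpenEXR API:

import numpy as np

hdr = np.array([0.25, 1.0, 7.5, 300.0], dtype=np.float32)  # scene radiances

as_half = hdr.astype(np.float16)                # EXR-style half-float storage
as_8bit = np.clip(hdr * 255, 0, 255).astype(np.uint8)

print(as_half)   # all four values survive, including the 300.0 highlight
print(as_8bit)   # everything at or over 1.0 clips to 255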

Haven't we been doing this for years now? (0, Redundant)

10Ghz (453478) | more than 11 years ago | (#5750669)

Real-time CGI? Haven't we had that for years and years in 3D-accelerated games? The graphics in those sure seem real-time to me.

Had to be said... [Spaceballs ref] (5, Funny)

caveat (26803) | more than 11 years ago | (#5750750)

Dark Helmet - "What the hell am I looking at? When does this happen in the movie?"
Col Sandurz - "Now. You're looking at now, sir. Everything that happens now, is happening now."
Dark Helmet - "What happened to then?"
Col Sandurz - "We passed then?"
Dark Helmet - "When?"
Col Sandurz - "Just now. We're at now, now."
Dark Helmet - "Go back to then."
Col Sandurz - "When?"
Dark Helmet - "Now."
Col Sandurz - "Now?"
Dark Helmet - "Now."
Col Sandurz - "I can't."
Dark Helmet - "Why?"
Col Sandurz - "We missed it."
Dark Helmet - "When?"
Col Sandurz - "Just now."
Dark Helmet - "When will then be now?"
Col Sandurz - "Soon."

Rolls off the tongue... (-1)

Anonymous Coward | more than 11 years ago | (#5750753)


Dude! You're... uh.. rendering real time!

a good step forward (0)

Anonymous Coward | more than 11 years ago | (#5750762)

But when holograms are rendered in high res to interact with the actor, people will think "this was the beginning. We've come a long way due to these huge leaps in technology." If it makes actors more believable and the movie better, I'm all for it. If not, who cares.

Internal monologue (2, Funny)

Jonboy X (319895) | more than 11 years ago | (#5750768)


Intel-based Dell systems running Linux

So conflicted...Intel bad...Linux good...Dell ambivalent...

Thank God.... (1)

djupedal (584558) | more than 11 years ago | (#5750796)

...he didn't have real time rendering when he created the Earth and the Stars.

He might have changed his mind several times along the way, and we'd all be living inside a soap bubble right now.

Where/When can they do this? (1)

tekunokurato (531385) | more than 11 years ago | (#5750823)

How portable/scalable is this? I mean, if they had it during LOTR, they couldn't have used it when they were filming out in nature scenes, could they? A lot of stuff is done on sets, but a lot is also done in remote locations, and I'd think it would be seriously hindered under such circumstances.

Jack

If they were using the preemptive kernel (4, Funny)

teamhasnoi (554944) | more than 11 years ago | (#5750832)

they could watch the CGI *before* it happened.

Now that's Cost Savings!
