Graphics Software

The State of OpenGL

CowboyRobot writes "No longer vapor, but a true 3D-embedded engine, OpenGL is on the move. Pixar and others would love to be able to render their movies in realtime, and that desire has prompted the intended release of OpenGL 2.0, due in a few months. Khronos is now in charge of further extending OpenGL to cellphones and handheld gaming devices."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Yay, I get to play around with 2x2 pixel rendered characters on the cell now.
  • Damn them (Score:5, Insightful)

    by Anonymous Coward on Friday April 09, 2004 @11:36AM (#8815883)
    When will they figure out that OpenGL is not necessarily desirable in a cellular phone?

    I want business-class reliability, not the ability to rent subpar games on my cell phone for $5/month.

    When I'm on the phone all day for work, I want it to be there for important calls, not fizzle out after an hour because it's got a 640x480-pixel screen with 24-bit color.
    • Sun (Score:5, Funny)

      by Brando_Calrisean ( 755640 ) on Friday April 09, 2004 @11:41AM (#8815946)
      What about the ability to stick to-do notes on the BACK of your cellphone? Wouldn't that make mobile 3D worthwhile?!
    • Re:Damn them (Score:3, Interesting)

      No sh*t. Imagine the battery drain from a processor that can run OpenGL. Who needs that crap? I'm all for OpenGL as a 3D standard, but cellphones don't need 3D. Cellphones don't need games. Am I going to be ranting about cellphone batteries not lasting an hour, like I am with laptop batteries, in a year?

      Again, vote with your wallet.
      • Re:Damn them (Score:5, Insightful)

        by sbaker ( 47485 ) on Friday April 09, 2004 @12:17PM (#8816346) Homepage
        If there *is* going to be 3D on cellphones and PDAs, then I'd much prefer that they run a standard API rather than a non-standard one. Given that there really are only two 3D standards, I'd much rather it was OpenGL than Direct3D.

        So - *IF* we want 3D then we want OpenGL.

        But do we want 3D in cellphones?

        The supposed 'killer app' for 3D on cellphones is the idea of using the position-detecting capability of the phone - along with network access - to provide an annotated 3D map of your present location. Think of the navigation systems in cars - but in 3D - so you can find the elevator you need to get to a particular office in a big unfamiliar building, or find where you left your car in a multistory parking lot.

        Games will obviously use the technology too.

        I don't know whether this is important to people or not - but if 3D is happening, it should CERTAINLY be in OpenGL - initially a small subset - gradually improving to a full-blown implementation in every phone as the technology catches up.

        Personally, I'd be much happier with a last-generation basic phone that had 10x battery life and didn't lose service quite so easily.
        • by cookie_cutter ( 533841 ) on Friday April 09, 2004 @12:43PM (#8816618)
          Seriously, your suggestion just gave me an idea: if your 3D-capable cell phone has centimeter-resolution positioning information (not easy, I know), then you could use the screen as a "magic window" to see things that aren't physically there.

          Which could be your target as a glowing orb, or a character from a video game superimposed on the actual landscape, or the trail your friend took through the same city two years ago, or just some construct representing an interesting thing about your environment, or ...

          I think that would be a real killer app.

        • The supposed 'killer app' for 3D on cellphones is the idea of using the position-detecting capability of the phone - along with network access - to provide an annotated 3D map of your present location.

          That has got to be the stupidest idea I've ever heard. It's hard enough to get a map that shows STREETS accurately on a GPS, much less elevators inside of buildings in a 3D environment on a cellphone. Besides, how the hell do you navigate? It's hard enough with a mouse and a keyboard, much less with
    • Re:Damn them (Score:3, Insightful)

      by HeghmoH ( 13204 )
      So buy a phone with a black-and-white screen and long battery life. Nothing's stopping you.
    • Something doesn't have to be "needed". Isn't "wanted" good enough? I don't use a cell phone at all. But some people do. A lot of people don't use them for important things, or business things. They just like them. 3D games on a cell phone? Some people might like that. That's enough of a reason to make it available.
    • Re:Damn them (Score:5, Insightful)

      by Azghoul ( 25786 ) on Friday April 09, 2004 @12:15PM (#8816322) Homepage
      Hmm, funny, I don't remember you being declared the only person to own a cell phone.

      How about realizing that there are other users out there? How about realizing that teenagers (a gigantic market, by any measure) might WANT their phones to play games?

      Try to be a little less myopic next time, AC...
      • Stupid parents, for letting teenagers have game-enabled cellphones. When I was in HS, my brothers wanted pagers/beepers. They got them. Mom treated them as recall buttons. She'd let them roam, but when she wanted them back, she'd page each of them. If they didn't call her right then, no more pager.

        I could understand GPS enabled pagers for teenagers, but what parent would want to let a teenager have a glorified gameboy phone?
        • Re:Damn them (Score:2, Insightful)

          by Puff Daddy ( 678869 )
          Parents who want their kids to have a useful device instead of a glorified leash might want to get them a "glorified gameboy phone."

          Next time my car breaks down and I have to call for help I'll remember how stupid my parents were for getting me a phone instead of a pager.
    • Maybe OpenGL isn't desirable in a phone that you want to buy, but someone else might want it.
    • Re:Damn them (Score:5, Insightful)

      by stienman ( 51024 ) <adavis@@@ubasics...com> on Friday April 09, 2004 @12:32PM (#8816509) Homepage Journal
      1) You are not their target audience.

      2) Eventually cell phones, PDAs, computers, and entertainment devices (TiVo, etc.) will converge into one or two devices, one of which will be portable. This is one item on the continuum leading towards the ubiquitous always-on computing device.

      3) OpenGL on the cell phone is simply a way of saying, "OpenGL on any platform requiring 3D graphics." It's marketing. It may not be used heavily on cell phones, but perhaps a new HDTV format will allow for an OpenGL data stream to place products in pretaped shows for different areas (i.e., Midwest viewers see a CVS pharmacy, while Southeast viewers see an Eckerd). Having a pared-down implementation meant for little processors and low-resolution screens is an asset. Don't abuse the implementation if the idea can be generalized.

      -Adam
  • by Stiletto ( 12066 ) on Friday April 09, 2004 @11:37AM (#8815885)

    Although right now OpenGL is all that's out there for low-cost portable embedded 3D software, no one is going to develop with it until hardware support emerges. Who wants a handheld 3D mapping device that takes 10 seconds to redraw a frame using an ARM9 software renderer?
    • Exactly, that's the point of OpenGL ES.

      As it is only a subset of the OpenGL standard, trimmed for low-power/low-speed devices, it is (or will be) also fast in software. AFAIK there are also hardware renderers for OpenGL ES in the works.

      And remember:
      We had 3D games long before gigahertz PCs with 3D accelerators were out, and they ran only a tiny bit faster than 1 frame/second.
      • by arbitrary nickname ( 325162 ) on Friday April 09, 2004 @12:01PM (#8816168)
        It's *not* really designed for software implementations. This is a common misconception. It relies on depth buffers for sorting, which can be wasteful of memory bandwidth for software implementations (there are better alternatives in many cases: BSP trees, portals, bucket sorting).

        After having a look at the spec, OpenGL ES seems -1, Redundant. Why not just aim for full OpenGL, starting with a 'MiniGL/QuakeGL'-style implementation, of the sort which really got the ball rolling on the PC?

        However, I believe it does include fixed-point maths support - very useful for all the ARM-based devices out there with no FPU.
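
        Since fixed-point is the part many desktop programmers haven't touched in years, here is a minimal sketch of 16.16 fixed-point arithmetic, the format OpenGL ES's GLfixed type is defined around. The fx_* helpers are invented for illustration and are not part of any ES header:

          // Minimal 16.16 fixed-point sketch (C++); the fx_* names are made up
          // for illustration and are not part of any OpenGL ES header.
          #include <stdint.h>
          #include <stdio.h>

          typedef int32_t fixed16;   // 16 integer bits, 16 fractional bits

          inline fixed16 fx_from_float(float f) { return (fixed16)(f * 65536.0f); }
          inline float   fx_to_float(fixed16 x) { return x / 65536.0f; }

          // Multiply: widen to 64 bits so the intermediate product doesn't
          // overflow, then shift the extra 16 fractional bits back out.
          inline fixed16 fx_mul(fixed16 a, fixed16 b)
          {
              return (fixed16)(((int64_t)a * b) >> 16);
          }

          // Divide: pre-shift the numerator so the quotient keeps its fraction.
          inline fixed16 fx_div(fixed16 a, fixed16 b)
          {
              return (fixed16)(((int64_t)a << 16) / b);
          }

          int main()
          {
              fixed16 x = fx_from_float(1.5f);
              fixed16 y = fx_from_float(2.25f);
              printf("%f\n", fx_to_float(fx_mul(x, y)));   // prints 3.375000
              return 0;
          }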
        • The simplistic reason for this is that the full OGL spec, or even the "miniGL" drivers, are still waaay too complex for what is needed on these devices. Things like multitexturing, and anything that requires readback from the state, are especially costly. The point was that on these devices they don't want full OGL support. If they did, they wouldn't have even bothered starting this spec. It is to provide a spec that has a greatly reduced footprint (memory, API coverage, etc.) that allowed first-class 3D graphics
  • I hope so (Score:5, Interesting)

    by re-Verse ( 121709 ) on Friday April 09, 2004 @11:43AM (#8815968) Homepage Journal
    For so long, DirectX had to struggle and claw to keep up with OpenGL - and it did just that, while OpenGL sat mainly idle (well, John Carmack was a big help to it)... Now it seems the shoe is on the other foot, and OpenGL is going to have to move deftly to surpass DX9, and soon enough 10...

    I sincerely hope it happens. I wish developers felt more inclined to make their 3D engines GL-based rather than DX-based, so the day when I can play any game in Linux may actually arrive. Of course, we have to give massive amounts of respect to those who do make OpenGL platforms for their games (id, Epic), but what about those who feel DX is easier and more practical for what they do (Valve)?

    Maybe, if we're lucky, the Carmack will drop in to this discussion and tell us exactly what he thinks needs to happen to really make GL a reality for most games again.
    • Re:I hope so (Score:3, Interesting)

      by westlake ( 615356 )
      Carmack can still build an engine. But his game designs are frozen in the 'nineties. Which is likely to prove the stronger seller and have more impact on developers, Half-Life 2 or Doom 3?
  • by Pike ( 52876 ) on Friday April 09, 2004 @11:43AM (#8815972) Journal
    "Touch one hair on her head and I'll render you limb from limb!"
  • by BigBuckHunter ( 722855 ) on Friday April 09, 2004 @11:47AM (#8816016)
    Hopefully, this will prompt more developers to join efforts to create a feature rich gaming framework for *nix. SDL is a great start, but lags behind DirectX in a number of ways. I look forward to seeing this 2.0 release breathe new life/blood into this area of development.

    Thank you for your time,

    BBH
    • My friend is working on a multipurpose game engine, with the ability to "plug in" different graphics managers - so you can have the beauty of DirectX 9 on your Windows version, and seamlessly switch to OpenGL when you port it to Mac OS.

      Or should I say, when I port it to Mac OS, since that's my job. I wish I had the slightest idea how his engine worked... He has all sorts of complicated code that compiles fine on his x86, but is gcc-unfriendly. :-(
      • Hey, that's no worse than many open source projects that are GCC-friendly.... But it's a fucking bitch to try and use any other compiler, because the dumb fucks have used GCC-specific code, and ignored the C and C++ standards. Linux is one such example.
        • because the dumb fucks have used GCC-specific code, and ignored the C and C++ standards. Linux is one such example

          Actually the Linux source is pretty good about using gcc extensions only when necessary -- i.e., because the standard is lacking, not because they're "dumbfucks".

          For instance, gcc's extended "asm" syntax (parameter passing, constraints) is extremely important for the sort of low-level code a kernel needs sometimes [and, no, moving all assembly code into separate files is not an adequate repla
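
          For readers who haven't seen it, this is roughly what that extended asm syntax looks like: a toy x86 example for gcc, not something lifted from the kernel source, with invented function names.

            // Toy example of gcc extended asm (x86, 32-bit operands). The
            // "=r"/"r"/"0" constraints let gcc choose registers and track the
            // data flow, which a separate .s file can't express.
            #include <stdio.h>

            static inline unsigned int add_asm(unsigned int a, unsigned int b)
            {
                unsigned int result;
                __asm__ ("addl %1, %0"
                         : "=r" (result)            // output: any general register
                         : "r" (b), "0" (a));       // inputs: b anywhere, a shares %0
                return result;
            }

            int main(void)
            {
                printf("%u\n", add_asm(2, 3));      // prints 5
                return 0;
            }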
      • I am curious about what you mean by a "multipurpose game engine." Games are generally defined by genre: first-person shooter, role-playing, flight simulation, etc., and players tend to object to a one-size-fits-all solution.
        • Well it's designed to run any sort of game, and a number of different "plugins" (a C++ class inheriting from an abstract plugin class) allow specialization as the given genre demands. So far using this engine a lot of 2D games have been developed - a working DDR clone (albeit only with keyboard support and crummy graphics), a decent version of Asteroids (vector graphics, scrolling, camera zoom, camera rotation, minimap/radar, running at 60fps or better - Asteroids is the testbed for most of the engine's new
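
          For anyone curious what that "plugin" layering looks like in code, here is a rough, invented sketch: an abstract interface the engine talks to, with one concrete subclass per rendering backend. None of these names come from the engine being described.

            #include <stdio.h>

            // The engine only ever talks to this interface.
            class GraphicsPlugin {
            public:
                virtual ~GraphicsPlugin() {}
                virtual bool init(int width, int height) = 0;
                virtual void drawFrame() = 0;
            };

            // One concrete subclass per backend: this one would wrap OpenGL
            // calls; a Direct3D9Plugin would wrap IDirect3DDevice9 on Windows.
            // Stubbed out here so the sketch stands alone.
            class OpenGLPlugin : public GraphicsPlugin {
            public:
                bool init(int w, int h) { printf("GL context %dx%d\n", w, h); return true; }
                void drawFrame()        { /* glClear, draw calls, buffer swap */ }
            };

            int main()
            {
                GraphicsPlugin *gfx = new OpenGLPlugin;   // chosen per platform/config
                gfx->init(640, 480);
                gfx->drawFrame();
                delete gfx;
                return 0;
            }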
  • OpenGL 1.5 (Score:5, Interesting)

    by PlatinumInitiate ( 768660 ) on Friday April 09, 2004 @11:47AM (#8816017)

    OpenGL is used in the Torque engine [garagegames.com] alongside Direct3D (D3D on Windows, OpenGL on Mac and Linux). It would be great if OpenGL could eclipse Direct3D and become the premier 3D platform once again. Perhaps we will see this with the release of OpenGL 2.0, but for the last few years Direct3D has been slowly but surely catching up with, and then surpassing, the aging OpenGL standard.

    A lot of our customers demand Linux in their solutions (networked gaming terminals) to avoid the cost of licensing Windows XP Embedded for each machine, and the option so far has been to go the Mesa/OpenGL/SDL route (WineX is still too slow for what we do), which, while it has worked, is technically slightly inferior to our Windows equivalents. Hopefully OpenGL 2.0 will change this.


    • > OpenGL is used in the Torque engine alongside Direct3D (D3D on Windows, OpenGL on Mac and Linux).

      How well do Torque-based games run on Linux?

      • Re: OpenGL 1.5 (Score:2, Informative)

        by TypoNAM ( 695420 )
        I am a licensed indie Torque owner and I have to say it is a very impressive game engine when it comes to cross-platform game development. Not only does it run pretty smoothly, just like games such as Wolfenstein: ET and Tribes 2 (T2 was actually not really good on Linux compared to the Windows version), but its source code is actually gcc-friendly.

        About Tribes 2: its Linux port wasn't very good, it choked from time to time for no real reason, and the Torque engine doesn't have this problem. The real history of Torque is
      • Very well, actually.

        If you check out http://www.garagegames.com you will see that almost all of the Torque-based software products have native Linux and Mac versions.

        Try ThinkTanks. It's a pretty cool example.
  • my interest fwiw (Score:4, Informative)

    by rokzy ( 687636 ) on Friday April 09, 2004 @11:51AM (#8816074)
    I'm going to learn OpenGL in a few months, during the holidays before I start my PhD. I work with simulations of the Sun and use IDL to visualise the results, but I think it would be cool to have more "realistic" pictures, plus having hardware acceleration has benefits when dealing with a lot of data (IDL gets really slow for 2D simulations at resolutions above a few hundred squared).

    These simulations are done on Beowulf clusters (imagine that!), so I think OpenGL is the best choice (the only other API I know of being DirectX).
    • You should check out Celestia, an open source solar system visualization tool. It is available at http://www.shatters.net/celestia

      There is a Windows installer if you are running that and don't want to be hassled with compilation, or you can download the source from CVS to compile on either your Linux or Windows machine.
    • Re:my interest fwiw (Score:3, Informative)

      by wass ( 72082 )
      I've used IDL before, and I'd avoid doing any very intensive calculation in it (unless it was 'linkimage'd to an external C routine). Some of the specific IDL calculation routines are optimized (FFT for one), and I've FFT'd 64k arrays of floats, but that's about the limit.

      Anyway, since you use IDL on a Beowulf, I'm assuming they finally added multithreading. That's good; when I used it before my PhD (I'm doing condensed matter physics now, but used IDL at my research job prior to this) there was no multi

  • OpenGL 2 (Score:5, Informative)

    by woodhouse ( 625329 ) on Friday April 09, 2004 @11:59AM (#8816143) Homepage
    OpenGL 2.0 is not as exciting as the new major version number might indicate. Probably the most important new feature of OpenGL 2.0 was going to be the GLSL high-level shading language. However, in order to speed up its adoption by hardware companies, this was instead folded into the OpenGL 1.5 spec when it was announced last year; GLSL already has implementations from 3DLabs, ATI and nVidia. OpenGL 2.0 will still add some useful new features, but it won't be the world-shattering event that 3DLabs promised in their original proposals.
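
    For reference, this is roughly what the GLSL path looks like from C++ through the ARB extensions (GL_ARB_shader_objects / GL_ARB_fragment_shader) that carry the shading language in the 1.5 era. It's only a sketch: it assumes a current GL context and that the ARB entry points have been loaded through the usual extension mechanism.

      #define GL_GLEXT_PROTOTYPES      // assumes a GL/glext.h that declares the ARB calls
      #include <GL/gl.h>
      #include <GL/glext.h>

      // Trivial fragment shader: paint every fragment solid orange.
      static const char *fragSrc =
          "void main() {\n"
          "    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);\n"
          "}\n";

      GLhandleARB buildOrangeShader()
      {
          GLhandleARB shader = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
          glShaderSourceARB(shader, 1, &fragSrc, NULL);
          glCompileShaderARB(shader);

          GLhandleARB program = glCreateProgramObjectARB();
          glAttachObjectARB(program, shader);
          glLinkProgramARB(program);

          glUseProgramObjectARB(program);   // everything drawn now runs the shader
          return program;
      }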
  • On a related note.. (Score:5, Informative)

    by rkaa ( 162066 ) on Friday April 09, 2004 @12:17PM (#8816343)
    Jan 16th 2002: SGI transfers 3D graphics patents to MS [theregister.co.uk]
    Jul 09th 2002: Microsoft Claims IP Rights on Portions of OpenGL [slashdot.org]
    Jul 11th 2002: 3D graphics world shaken by patent claims [zdnet.co.uk]
    Jul 13th 2002: Microsoft patent claims may affect OpenGL [macworld.com]
    Mar 3rd 2003: Microsoft quits OpenGL board [theregister.co.uk]
  • The current handheld devices are not suitable for 2D/3D graphics, because their memory bandwidth is so poor that texture mapping will make the software crawl. I'll get excited when mobile devices get real 3D hardware acceleration. Even a 400MHz XScale doesn't cut the mustard if it spends all its time waiting for memory. I have been using OpenGL ES for over six months now...
  • by tommck ( 69750 ) on Friday April 09, 2004 @12:23PM (#8816422) Homepage
    My cousin's husband works for Sun and he said that the next version (1.5?) of Java will have Swing ported to OpenGL underpinnings... that way, even 2D apps will be MUCH faster.

    He said they're realizing 4X speed increases on plain old 2D apps.

    They're also working on making 3D game demos (some with 3rd parties) to demo that Java can actually now compete in the desktop game market...

    • Java and OGL would be an incredible combo. Basically, it'd be .NET, only cross-platform and without the kludge of Mono's reliance on Wine for P/Invoke. And without C# (shame there, C# is the bomb, but who knows... maybe it can be compiled for the JVM!).
      • You may want to have a look at JSR-231 then, which defines the official OpenGL bindings for Java. If you need something more immediate, check out the JOGL project, which is the baseline for the JSR. It can be found on the java.net site.
  • Most frames in Pixar movies are rendered using some form of ray-tracing. While it is possible to use vertex and fragment shaders in unconventional ways to do ray tracing, this is *not* what the OpenGL pipeline is designed for. Great for games, but ray-tracing will still be done using render farms (and not in real time).
    • by One Louder ( 595430 ) on Friday April 09, 2004 @01:08PM (#8816928)
      Uh, no. Pixar movies typically use the REYES micropolygon algorithm, with some assists from raytracing for certain effects as necessary, implemented within their Photorealistic Renderman (PRMan) product.

      The notion that Pixar would use OpenGL for final rendering if only it were fast enough comes up every time a new video card or GL enhancement comes along, and it just shows how little people understand about how Pixar actually makes its films. Oddly, Pixar really doesn't make this information much of a secret, and they'll even sell you the same software they use.

      • by Viking Coder ( 102287 ) on Friday April 09, 2004 @04:13PM (#8819513)
        Peercy and Olano [psu.edu] (Click on "PDF" in the upper right)

        Presentation [ibiblio.org]

        ASHLI [ati.com]

        GPGPU [gpgpu.org]

        More than Moore's Law [geek.com]

        Moore's law : still for wimps [extremetech.com]

        Using programmable graphics hardware (possibly through OpenGL) for final rendering is not that far off. (Definitely not in real-time, but as a more cost-effective way to do it, anyway.) Especially with the massive parallelism of rendering, and the fact that GPUs are far outpacing CPUs in terms of their speed and transistor counts.

        OpenGL is much more similar to micropolygon rendering (REYES) than it is to raytracing in the first place. The shaders are where you spend all of your time, anyway.

        Heck, do you think nVIDIA bought ExLuna (Larry Gritz, author of BMRT, and former Pixar employee) just for the fun of it?

        Software for translating from RenderMan Shading Language to Cg?

        And what about RenderMonkey supporting RenderMan?

        Do you even remember PixelFlow [unc.edu] from Pixar? Do you see the name Marc Olano on that paper? The same Marc Olano who talks about rendering on consumer-level graphics hardware? These things have far more in common than you seem to realize.
      • Interesting - is this REYES micropolygon algorithm in any way related to the old (late 80's?) computer graphic entitled "Road to Point Reyes" (I think that is right)?

        I remember seeing an image of that in an old computer graphics "coffee table"-type book back in high school - and you mentioning that popped it in my head...

    • Most frames in Pixar movies are rendered using some form of ray-tracing.

      Technically, no. RenderMan (the Pixar renderer) does not perform ray tracing. It uses a scanline renderer that is much faster than any ray tracer I've ever seen. They've been at this for literally decades, and are very good at it. Still, the most complex images in their movies can take many hours -- sometimes more than a day -- to render. The time-to-render doesn't seem to improve much from picture to picture because as computers get
  • Ahem... (Score:5, Interesting)

    by dustman ( 34626 ) <dleary.ttlc@net> on Friday April 09, 2004 @12:38PM (#8816559)
    No longer vapor, but a true 3D-embedded engine...

    Since when has OpenGL been vapor?
    • Really, I was wondering this myself. I mean, I've been playing OpenGL games for years now.

      In the article they kind of say what they mean, but the headline in the Slashdot article makes it seem like OpenGL is FINALLY being released and is no longer vapor.

      Me got cornfused....
  • by mark-t ( 151149 ) <markt.nerdflat@com> on Friday April 09, 2004 @12:44PM (#8816639) Journal
    Direct3D has a few advantages over OpenGL that OpenGL can't really address on its own.

    For one, Direct3D is integrated into the DirectX API, which handles a multitude of things (multimedia and game input devices, among others), so game developers are naturally drawn to it by the appeal of having so much work already done for them.

    OpenGL can't, and really shouldn't have to, address all these requirements, but that's just part of why there's been this ongoing struggle. SDL is a reasonable answer to portability while still accomplishing the integration that MS has achieved, but SDL isn't really as mainstream as OpenGL is.

    I've seen soap opera plots that were less convoluted than this mess.

    • by omicronish ( 750174 ) on Friday April 09, 2004 @01:10PM (#8816945)
      From coding experience, the integration is pretty much non-existent, or at least not very strong. APIs such as Direct3D and DirectSound have consistent styles, but they don't actually share much. It is possible to write an OpenGL application that uses DirectSound and DirectInput, like GLQuake.
    • I've seen soap opera plots that were less convoluted than this mess.

      given that soap opera plots are targeted at lobotomized cows barely aware enough to sign their name on a credit card application, that's not saying much...

    • Who says you can't write the graphics engine in OpenGL, and all the other modules (music, netcode, etc.) with DirectX?
    • > SDL is a reasonable answer to portability while still accomplishing the integration that MS has achieved, but SDL isn't really as mainstream as OpenGL is.

      SDL and OpenGL are not mutually exclusive. I have very successfully used SDL to handle joystick input, window creation, and sound output, with OpenGL for 3D. SDL in fact is designed to work this way, since SDL will create OpenGL rendering contexts for you when you create windows, and it handles fullscreen video modes far more easily than any other
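
      A bare-bones sketch of that division of labor, using the SDL 1.2-era API: SDL owns the window, the GL context and the event loop, while the drawing itself is plain OpenGL (window size and clear color here are arbitrary).

        #include <SDL/SDL.h>
        #include <GL/gl.h>

        int main(int argc, char *argv[])
        {
            if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_JOYSTICK) < 0)
                return 1;

            SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
            if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL))   // add SDL_FULLSCREEN if wanted
                return 1;

            bool running = true;
            while (running) {
                SDL_Event ev;
                while (SDL_PollEvent(&ev))                    // keyboard/joystick/quit events
                    if (ev.type == SDL_QUIT || ev.type == SDL_KEYDOWN)
                        running = false;

                glClearColor(0.1f, 0.2f, 0.3f, 1.0f);         // plain OpenGL from here on
                glClear(GL_COLOR_BUFFER_BIT);
                SDL_GL_SwapBuffers();                         // SDL handles the swap
            }

            SDL_Quit();
            return 0;
        }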
  • by Performer Guy ( 69820 ) on Friday April 09, 2004 @12:53PM (#8816744)
    The original article post seems to confuse different forms of OpenGL. OpenGL|ES is the embedded, stripped-down OpenGL for mobile and embedded systems. OpenGL 2.0 is just a proposal from 3DLabs and may never get off the drawing board. Most of the significant changes that OpenGL 2.0 introduced have been implemented and released either as extensions or as part of OpenGL 1.5, so it's just not clear if or when OpenGL 2.0 will actually arrive. There's a lot of resistance because 2.0 intended to throw some stuff out, and many of those developing, selling, and using OpenGL implementations think that's a REALLY bad idea. With OpenGL|ES there is already a version 1.0, and you can actually get it in several forms, from implementations that run on phones to wrappers around desktop OpenGL that emulate OpenGL|ES. OpenGL|ES version 1.1 is in development right now.
  • Cell Phones? (Score:2, Insightful)

    by muonzoo ( 106581 )
    " Khronos is now in charge of further extending OpenGL to cellphones and ..."

    Why oh why, for the love of a higher reasoning! Doesn't anyone make a Simple, Small, Functional mobile phone?!

    I don't WANT fancy crap in my phone. I want it to WORK. Good RF, Bluetooth, Multi-band radio (global GSM), EDGE, long battery life and iSync support.
    Where is _my_ phone?
    • That's fancy to me. My phone needs to do voice and Short Message Service (texting, for those in the US). I have one which does exactly that.
      I need one with a better battery life, though. Tri-band would be nice, but is not a must.
      Simple and functional.
  • The article says that the programmable features will be based on the direct-compile model, in which the compiler is included as part of the driver.

    Given the current less-than-good state of open source drivers for graphics chips, this may well mean that most of the useful (i.e., works with your hardware) compilers will only be available in the Linux world as part of tainted binary drivers. It seems pretty likely that vendors who believe that their current drivers contain deep secrets than open source would
    • FUD. Offtopic. Flamebait. Why does every discussion remotely related to graphics have to deteriorate into closed- versus open-source driver zealotry?

      Drivers are closed source now. Drivers will continue to be closed source in the future regardless of where the compiler lives. Having the compiler in the driver is the right decision.

      Don't like it? 3DLabs released the front end to their compiler. There is work being done in Mesa to support GLSL.

      From now on, all bitching about open versus closed driver

  • by Anubis333 ( 103791 ) on Friday April 09, 2004 @01:50PM (#8817416) Homepage
    As a TD who works in the computer graphics field, let me state that the technology required to render a Pixar film in 'real time' is far off, and the idea is ridiculous. Just because OpenGL looks better does not mean that it can support the shader functions that RenderMan utilizes, not to mention the fur and cloth APIs. Also, the majority of shots in movies aren't even single comps; they involve many rendered elements, which you still have to comp together. I'd be all for the guy talking about how OpenGL 2.0 will benefit artists by allowing them to get more feedback about the quality of the shot they're working on without preview renders, but thinking that OGL could replace final renders any time soon is wrong. Perhaps we are getting to a place where we could render the original Toy Story in real time and a general viewing audience might not know the difference. Perhaps. But I remember some really great PRMan shaders from that film that wouldn't be possible in the real-time version.
  • by Handpaper ( 566373 ) on Friday April 09, 2004 @02:43PM (#8818161)
    Am I the only person who thought that:
    "Over the next year or two, I think you're going to see a whole range of applications that use your graphics board as a supercomputer," Trevett says enthusiastically.
    was the most interesting part of the article?
    SETI@home [berkeley.edu], Finite Element Analysis [hks.com], and video recoding [exit1.org] are all areas which could benefit from the vector processing, matrix calculation, and/or huge register sizes provided by GPUs.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...