AMD-ATI Ships Radeon 2900 XT With 1GB Memory

MojoKid writes "Prior to AMD-ATI's Radeon HD 2000 series introduction, rumors circulated regarding an ultra-high-clocked ATI R600-based card that featured a large 1GB frame buffer. Some even went so far as to say the GPU would be clocked near 1GHz. When the R600 arrived in the form of the Radeon HD 2900 XT, it was outfitted with 'only' 512MB of frame buffer memory, and its GPU and memory clock speeds didn't come close to the numbers in those early rumors. Some of AMD's partners, however, have since decided to introduce R600-based products that do feature 1GB frame buffers, like the Diamond Viper HD 2900 XT 1GB, in both single-card and CrossFire configurations. At 2GHz DDR, the memory on the card is also clocked higher than AMD's reference designs, but the GPU remains clocked at 742MHz."
This discussion has been archived. No new comments can be posted.

  • But... (Score:5, Funny)

    by gQuigs ( 913879 ) on Saturday September 29, 2007 @01:10AM (#20791433) Homepage
    Does it run (on) Linux yet?
  • by User 956 ( 568564 ) on Saturday September 29, 2007 @01:11AM (#20791435) Homepage
    When the R600 arrived in the form of the Radeon HD 2900 XT, it was outfitted with 'only' 512MB of frame buffer memory and its GPU and memory clock speeds didn't come close to the numbers in those early rumors.

    Well, that's because when they tried to build the 1GB units, a loud voice was heard saying "We require more minerals", and production was blocked.
  • by Chas ( 5144 ) on Saturday September 29, 2007 @01:12AM (#20791439) Homepage Journal
    These cards are ridiculous. ESPECIALLY in Crossfire installs.

    Wow! Now that 4GB of main system memory I installed has been pared back down to a more manageable 2GB!

    WHEE!

    Until 64-bit becomes more mainstream, cards like this will only become more and more detrimental to the systems they're installed in.
    • So get the 64-bit Windows XP/Vista OS. Then you won't "lose" your precious "minerals".
    • These cards are ridiculous. ESPECIALLY in Crossfire installs.

      You mean like doubly so? Like the point that the money spent on the doubling the money is almost completely wasted, the money spent on doubling the graphics chips is almost completely wasted?
      • I mean "doubling the memory". Stupid sleepiness.
      • by Chas ( 5144 )
        No. I mean that since this memory has to be mapped within a 32-bit address space, you wind up wasting space that could be better allocated to system memory.

        Sure, for anything that remains strictly on the graphics card, it's great. But for anything else (functions besides raw graphics in a game, like AI, or non-gaming applications), stealing that address space degrades overall system performance.
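        A rough illustration of that arithmetic, as a minimal Python sketch (the aperture sizes here are illustrative assumptions; actual reservations vary by chipset, BIOS, and driver):

            # Why 32-bit systems "lose" RAM to PCI/MMIO apertures.
            # Aperture sizes below are assumed for illustration only.
            ADDRESS_SPACE = 4 * 2**30  # 4 GiB of 32-bit physical addresses
            GIB = 2**30

            def visible_ram(installed, mmio_reservations):
                """RAM the OS can still address once MMIO claims its share."""
                reserved = sum(mmio_reservations)
                return min(installed, ADDRESS_SPACE - reserved)

            # Two hypothetical 1 GiB cards in CrossFire, each mapping its full
            # frame buffer, plus ~0.5 GiB of chipset/other device apertures:
            usable = visible_ram(4 * GIB, [1 * GIB, 1 * GIB, GIB // 2])
            print(usable / GIB, "GiB usable of 4 GiB installed")  # -> 1.5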
    • Re: (Score:3, Interesting)

      by Doppler00 ( 534739 )
      Yeah, this is why I'm waiting before upgrading my computer. I need to see better 64-bit support in the future. I always plan on doubling everything at the next major upgrade: from 2GB -> 4GB, 2 cores -> 4 cores. Until there is operating system and application support, though, I don't think I'm going to go there.
      • Re: (Score:2, Insightful)

        by Colin Smith ( 2679 )
        Right, because Linux hasn't been 64-bit and running on SMP systems for years...

        Oh wait. You meant Windows. Sorry, I do apologize... Well, you'll have to wait out the traditional 3-year Microsoft lag behind the state of the art.
         
        • by Tim C ( 15259 )
          So has Windows; the 64-bit build of Win XP was released a couple of years ago.

          The problem isn't the OS, it's the hardware support. In my case, USB wi-fi dongles: none of the ones I have kicking about the place have driver support from the manufacturer (thanks for that, Netgear!), and I don't fancy trailing cat5 cabling through my house again.

          And yes, Linux is great, and yes, it was my primary desktop OS for a couple of years, but it simply doesn't support all the software I want to run (and yes, that includes…
          • Re: (Score:3, Insightful)

            by jlarocco ( 851450 )

            This is where open source trumps closed source, hands down.

            In the majority of cases, having an open source 32-bit driver almost automatically implies having a 64-bit driver. It's just a recompile. Yes, a lot of the time there will be bugs, but since developers are usually on "higher end" 64-bit systems, those bugs are usually fixed quickly.

            I'm running 64-bit Debian, and have never had a problem with drivers. My video card, sound card, firewire card, USB devices, network cards and printer all work

      • by Chas ( 5144 )
        With multi-core support, I don't think that's really going to change. You're not necessarily going to see a huge "performance boost" from massively parallel processing.

        However, you'll still have the luxury of running multiple processor intensive apps without bringing the whole system to a standstill.
      • by Lorkki ( 863577 )
        What are the specific problems with OS and application support you're having? Windows may be an issue if you still need applications with 16-bit components, or if you have bad luck with driver support. In Linux there's trouble with closed-source browser plugins, which can be partly alleviated with nspluginwrapper, although Java 6 can still be a pain. Other than that, the support is about as good as it can be expected to get, and I've been running the AMD64 builds of both Ubuntu and XP on my desktop since…
    • by mzs ( 595629 )
      If that is true, it is bad, and it is simply a limitation of Windows. For a long time x86 has offered PAE (Physical Address Extension), which gives 36 address lines. In fact PAE also enables the 2MB large-page size, which would be useful for a frame buffer. You map a reasonable window for the frame buffer, another sliding window for textures, and a small window for CSRs for each card, and still have plenty of VA space left over.

      On the PPC side, since the 745x at least, there has been support for Extended Addressing…
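      To make the sliding-window idea concrete, a toy model in Python (purely conceptual; a real driver does this with page-table updates, and the addresses below are made up):

          # Toy model of a sliding VA window over a large frame buffer:
          # a small fixed virtual range is re-pointed at different offsets
          # of a physical region too big to map all at once.
          MIB, GIB = 2**20, 2**30

          class SlidingWindow:
              def __init__(self, phys_base, phys_size, window_size):
                  self.phys_base = phys_base      # device memory start (physical)
                  self.phys_size = phys_size      # e.g. a 1 GiB frame buffer
                  self.window_size = window_size  # e.g. a 64 MiB VA window
                  self.offset = 0                 # current window position

              def slide_to(self, offset):
                  """Re-aim the window; with PAE this is a page-table update."""
                  assert offset + self.window_size <= self.phys_size
                  self.offset = offset

              def phys_addr(self, window_addr):
                  """Translate a window-relative address to a physical one."""
                  assert window_addr < self.window_size
                  return self.phys_base + self.offset + window_addr

          # Place the buffer above 4 GiB, reachable through PAE's 36 bits:
          win = SlidingWindow(phys_base=0x4_0000_0000,
                              phys_size=1 * GIB, window_size=64 * MIB)
          win.slide_to(512 * MIB)       # inspect the buffer's second half
          print(hex(win.phys_addr(0)))  # -> 0x420000000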
    • Seriously. What is with the lack of uptake on 64-bit? I mean, do they even sell mainstream machines with only 32-bit processors? (I really don't know.) It's been out for, let's see, over 4 YEARS.
      • My ULV Core Duo in my ultra-portable doesn't have 64-bit support. It's only a year old.
      • by drsmithy ( 35869 )

        Seriously. What is with the lack of uptake on 64-bit?

        Severe lack of motivation from the majority of consumers, to whom it offers no benefits (and, due to frequently buggy and/or nonexistent hardware and/or drivers, numerous disadvantages).

  • UEI++ (Score:2, Interesting)

    Hmmm. This might mod my Vista User Experience Index up to 3.0.
    • Re: (Score:3, Informative)

      Exactly how is this trolling? I got a new video card yesterday and it boosted my UEI from 1 to 2. Imagine my disappointment. I learned my lesson: say something remotely negative about Vista on /. and get downmodded.
  • by The-Pheon ( 65392 ) on Saturday September 29, 2007 @01:20AM (#20791477) Homepage
    With this new hardware, will I be able to run vim with some colors for syntax highlighting? :)

  • Useless! (Score:5, Insightful)

    by ynososiduts ( 1064782 ) on Saturday September 29, 2007 @01:33AM (#20791533)
    Unless you are running quad 32" screens at some insane resolution, there is no need for 1 GB of frame buffer RAM. I think this is more for the "OMG MI VIF CARD HAZ 1 GIGGBYTES OF MEMORYIES!11!" type.
    • Re: (Score:2, Informative)

      I take it you've never gamed at very high resolutions with ALL the eyecandy turned on.
      • Re: (Score:2, Informative)

        My 8800 GTS with 320 MB runs all games fine at 1680x1050 with max settings. That's pretty much one third of one gigabyte. I seriously doubt you need one, let alone two, gigabytes of video RAM.
        • Re: (Score:1, Interesting)

          Maybe in a DirectX 8.1 rendering path. No way you're getting a consistent 80+ fps playing TF2 or other DirectX 10-capable games.
        • Re: (Score:3, Informative)

          by MikShapi ( 681808 )
          I second that. I run an 8800GTS/320 on a triple 17'' 1280x1024 setup (using a Matrox TripleHead2Go Digital to split the DVI signal in 3). The card pushes out 3840x1024, which amounts to about 4MP, and it's been happy so far in Gothic, Oblivion, S.T.A.L.K.E.R. and a bunch of other titles, giving very reasonable frame rates with either all or practically all the graphics bells and whistles turned on.

          Memory doesn't make a card faster, except at REALLY insane resolutions (way higher than 4MP, I suspect) when you…
          • Re: (Score:3, Insightful)

            by mikkelm ( 1000451 )
            You're going to have to spill the beans on how you manage to run S.T.A.L.K.E.R. in a resolution like that, and how you manage to do it on that kind of hardware.

            With an 8800GTS/320, I myself, and almost all review sites, struggle at times to stay above 60FPS in 1024x768 with all the eyecandy on.
            • To quote myself, I said "giving very reasonable frame rates with either all or practically all the graphics bells and whistles turned on".
              S.T.A.L.K.E.R. had to have one of its heaviest GPU-resource-consuming options turned down to keep frame rates in the 20-30 range (which is what I consider playable for single-player games, where I prefer eyecandy to frames; multiplayer would go the other way).
              S.T.A.L.K.E.R. also displayed some warping on the side screens, showing it was not designed nor QA'd for unconventionally wide…
        • That's funny, because I was just contemplating replacing my 320MB card with a 640MB one, because I can make this card chug pretty hard in certain recent games. Mind you, I like my 16x AA+AF, so I'm probably taxing it harder than the more reasonable folks out there, but there is definitely a point to having more video RAM.

          We've had 256MB cards for a few years now for "normal" resolutions like 1024x768 and 1280x1024... but that's not hardcore anymore. Hardcore is SLI GTX'es (or HD2900s) driving a 30" Dell at 2560x1600.
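          For a rough sense of the numbers being argued here, a simplified Python estimate (it counts only raw color/depth buffers and ignores compression, alignment, and extra buffers, so real usage will differ):

              # Raw buffer memory at various resolutions, simplified:
              # 32-bit color + 32-bit depth/stencil, MSAA multiplies the
              # sample storage, plus one resolved 32-bit back buffer.
              def buffer_mib(width, height, msaa=1, bpp=4):
                  samples = width * height * msaa
                  color = samples * bpp
                  depth = samples * bpp
                  back = width * height * bpp
                  return (color + depth + back) / 2**20

              for w, h in [(1024, 768), (1680, 1050), (2560, 1600)]:
                  print(f"{w}x{h} @ 4xAA: ~{buffer_mib(w, h, msaa=4):.0f} MiB")
              # -> 27, 61 and 141 MiB: even 2560x1600 with 4xAA leaves most
              #    of a 1 GiB card free for textures and geometry.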
    • Comment removed (Score:4, Informative)

      by account_deleted ( 4530225 ) on Saturday September 29, 2007 @01:43AM (#20791577)
      Comment removed based on user account deletion
      • But see, there is no need for 1 GB of RAM for modern gaming. It's useless unless you are running games at big resolutions. Most people I've seen don't go any higher than 1600x1200.
        • Re: (Score:3, Informative)

          by Anonymous Coward
          You do realize that texture size is completely independent of screen resolution, right? And that you possibly have hundreds of textures loaded at once? And that they can't be stored compressed, because decompression would take too long?

          Basically, other than the framebuffer for what's actually displayed on screen, none of the graphics card memory is dependent on screen resolution.

          Anyway, this card isn't useful *now*. That's because video game producers target the cards that are widely available. 2 years from now you…
          • by Have Blue ( 616 )
            Actually, most modern cards can use certain compressed texture formats natively without decompressing them. I know the Xbox 360 can do that and its graphics hardware is 2 years old.
          • All modern cards are capable of processing compressed textures without having to decompress them. They actually result in both less memory usage and FASTER processing, at a slight cost in image quality (minor compression artefacts at the higher compression levels).
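            To put numbers on that, a quick Python check using the standard DXT compression rates (4 bits per texel for DXT1, 8 for DXT5; the 2048x2048 texture size is just an example):

                # Memory for one 2048x2048 texture with a full mip chain,
                # uncompressed RGBA8 versus the DXT formats GPUs sample
                # directly (no decompression pass needed).
                def mip_chain_texels(w, h):
                    total = 0
                    while True:
                        total += w * h
                        if w == 1 and h == 1:
                            return total
                        w, h = max(1, w // 2), max(1, h // 2)

                texels = mip_chain_texels(2048, 2048)
                for name, bits in [("RGBA8", 32), ("DXT5", 8), ("DXT1", 4)]:
                    print(f"{name}: {texels * bits / 8 / 2**20:.1f} MiB")
                # -> RGBA8 ~21.3 MiB, DXT5 ~5.3, DXT1 ~2.7: an 8x saving is
                #    why hundreds of textures can stay resident at once.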
      • Re: (Score:3, Insightful)

        Mod parent up. 'Framebuffer'? The article submitter's brain is stuck in about 1995.
    • Do the math. You don't need anywhere near 1GB for that.

      What you *do* need it for is texture and vertex data, but even then games aren't really going to use it - they're designed for current hardware.

      Nope, the only people who'll buy this are the ignorant with too much money*.

      - Not that there's any shortage of those.

      [*] ...and medical people who like to look at 3D textures from MRI scans - they can never get enough video memory. 1GB is only enough for a single 512x512x512 texture (see the quick check below).
      • You just proved my point. I was saying that most people only go as high as 1600x1200. I wasn't saying that it is only needed for people who run at 1600x1200.
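        The 512x512x512 figure above checks out, depending on the voxel format (the formats listed are typical choices, not anything specific to MRI tools):

            # Size of a single 512^3 volume texture at common voxel formats.
            def volume_mib(dim, bytes_per_voxel):
                return dim**3 * bytes_per_voxel / 2**20

            for fmt, bpv in [("8-bit luminance", 1), ("16-bit luminance", 2),
                             ("RGBA8", 4), ("RGBA16", 8)]:
                print(f"512^3 {fmt}: {volume_mib(512, bpv):.0f} MiB")
            # -> 128, 256, 512 and 1024 MiB: the wide RGBA16 case is
            #    exactly the 1GB quoted above.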
    • The memory on a video card is used for more than just simple frame buffering.

      Notice how some of the newer games see less performance degradation on some of the 640MB nVidia cards than on equivalently clocked 320MB versions of the same card.
    • If enough people will pay for it to create a sustainable market, it's needed (period). Not to mention I hear what sounds like an assumption that this is going to be targeted at the enthusiast market, side-stepping the high-end (and high-cost) graphics shops and their associated market. Unless we assumed Pixar ran all their workstations on... pixie dust?

      Anyway, if it finds a market, more power to it. Engineers do their jobs, people get paid. Welcome to the open market. (:
      • Re: (Score:3, Funny)

        by Anonymous Coward
        Hey, nice backwards smiley there. The smiley emoticon has been around for 25 years, and it looks like this: :)
    • Exactly. Looking at the benchmarks, there is no difference between the 512MB version of the 2900 XT and this new 1GB version. In fact, most of the benchmarks show the 1GB version performing slightly worse than the 512MB version for some reason...

      When I bought my 1900 XT several months ago, I decided to get the cheaper 256MB version instead of the 512MB version, because benchmarks showed there wasn't even a 5% difference in frame rates between the two cards. I play all my games at 1680x1050 with 4x AA and 8x…
    • How about one of these [digitaltigers.com]?

  • by Animats ( 122034 ) on Saturday September 29, 2007 @01:35AM (#20791543) Homepage

    Sounds useful for 3D animation work, where you need all that memory for textures. Remember, by the time players see a game, the textures have been "optimized"; stored at the minimum resolution that will do the job, and possibly with level of detail processing in the game engine. Developers and artists need to work with that data in its original, high-resolution form.

    • Re: (Score:3, Informative)

      by Runefox ( 905204 )
      Yeah... The FireGL has been doing that for several years. In fact, they have a 2GB version [amd.com] now, the V8650. Don't try it with games, though. Not going to work so well.
  • Ahh... (Score:5, Funny)

    by xx01dk ( 191137 ) on Saturday September 29, 2007 @01:50AM (#20791595)
    Question. Where are the ships? I wanted to read about video cards and ships. This article only half-delivers.
    • by xx01dk ( 191137 )
      ...just pokin' fun atcha shiny :)

      Anyhow I just read the review in the newest CPU and they gave it what could best be described as a "meh". Give it a few more iterations and then we might have a respectable competitor to the current top-shelf Nvidia offering, but of course by then there will be something better out...

  • by chrisl456 ( 699707 ) on Saturday September 29, 2007 @02:33AM (#20791697)
    Umm, not to sound like a tech jargon-nazi, but "frame buffer" to me has always meant just the part of video ram that "mirrors" what you see on screen. A 1GB frame buffer would give you 16384x16384x32bit color, so unless you're doing some kind of huge multi-screen setup, 1GB of frame buffer is a bit overkill. ;)
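    The arithmetic behind that 16384x16384 figure, for the record (assuming 4 bytes per pixel and a single buffer):

        # How big a square 32-bit-color frame buffer 1 GiB could hold.
        import math

        vram = 2**30                      # 1 GiB
        bytes_per_pixel = 4               # 32-bit color
        pixels = vram // bytes_per_pixel  # 268,435,456 pixels
        side = math.isqrt(pixels)
        print(f"{side} x {side}")         # -> 16384 x 16384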
  • I understand it if you were doing research of any sort that would exploit this hardware - assuming you use ALL of it, or can write the code to do so - since the more bandwidth you have, the faster the results, etc. I understand hardware like this being useful in that regard. I also understand it from the perspective of a software developer who may be developing with this hardware for a future product that will be released in a year or so, when this sort of hardware will be more standard and affordable.
    • by BarneyL ( 578636 )

      But I am sort of baffled by people who spend hundreds upon hundreds of dollars for something that they will not use the bandwidth for until next year or later and then the thing will be down in price anyway.

      At the extreme end of the scale it is bad value. However, if you do need a new graphics card, it often works out better to go towards the high end. I'd rather spend $300 on one card that keeps up with the latest games for two years than $180 on a mid-range card that will need to be updated with another…

  • The mobo will be a giant video card, and the CPU will reside on a board in one of the slots.
  • I was just wondering: since games often used system RAM when the graphics RAM was full, do any of you think it would be possible to go the opposite way, i.e. use gfx RAM as system RAM? It's a lot faster, and when you're just sitting there outside of a game it's not doing anything. Ultra-fast system cache FTW? Or am I just crazy? Is PCI-e too slow for that kind of stuff? Maybe with Vista's new driver model that allows GPU virtualization something like this could become true, but I really have no idea…
  • by Anonymous Coward
    I've had such a 1GB model from PowerColor for 2 months now, and since the last Linux driver release it runs flawlessly there too. Slashdot, late as always :/
  • bitchin (Score:5, Funny)

    by savuporo ( 658486 ) on Saturday September 29, 2007 @07:37AM (#20792471)
    All these years later, and its still no match for the original Bitchin' fast 3d! 2000 [carcino.gen.nz] Livin' la Video loca con Puerto Para Garficios Acelerados Gigante!
  • When will they release their promised 3D specifications for their GPUs?
  • then I'll be able to simply heat my entire house without turning the damn heaters on while running the Folding GPU client.

    Seriously though, we're already seeing problems with PSUs that can't deliver enough on the 12-volt rail (the 2900 XT needs 24 amps by itself), and now they want to push that up to 26-28 amps? So where is the power going to come from? The wicked fairy's curse?

    • "Seriously though, we're already seeing problems with PSU's that can't deliver enough on the 12volt rail "

      That limitation is a design choice. Beefier supplies are no problem to build.
    • by Ant P. ( 974313 )
      That's one reason I'm going with Intel's graphics in my next PC. The other reason being that good drivers already exist for it.
  • Considering how much address space MMIO takes up in 32-bit versions of Windows, one can only imagine some poor sap buying two of these and wondering why he only has 1.8GB of RAM available in Windows.

    "lawl"
  • At some point, if they want me to keep buying new video cards, they are going to have to upgrade my eyes. Elder Scrolls V will look better than Christmas morning, but watching movies in the theater will be all choppy and hard to hear. And setting up SLI will be even more painful than it was the first time.
  • Imagine a Beowulf cluster of these... Actually, with ATI's CTM technology, which allows these to be used as stream processors, you could get a lot of math done with a couple of these.
