ATI All-In-Wonder X1900 PCIe Review

ScuttleMonkey posted more than 8 years ago | from the keeping-me-poor dept.

An anonymous reader writes "ViperLair is currently running a closer look at ATI's newly released All-In-Wonder X1900 PCIe graphics card. The clock speeds and memory are pretty comparable to other cards available, but the reviewer warns that 'clock speeds do not always tell the whole story.' The review tests performance in Doom 3, UT 2004, Far Cry, Half-Life 2, and the 3DMark06 benchmarking tool." This release comes relatively quickly after the X1800 series, which was released just last October.

55 comments

I'm giving up on my All in Wonder (4, Informative)

Anonymous Coward | more than 8 years ago | (#14642990)

My next video card won't have any TV capture abilities. I got an MDP-130 HD capture card, and Comcast is now doing Analog Digital Simulcast (clear QAM) in my area, which means I can do straight digital captures of most major TV stations.

Re:I'm giving up on my All in Wonder (1)

SineOtter (901025) | more than 8 years ago | (#14643350)

After setting up a friend's (then new) 9700 All In Wonder, I'd have to agree. It did decently as a video card, but getting the video capture and TV tuning parts working with his cable was simply a pain, and even when it worked properly, the quality was subpar compared to the 15-year-old TV he had. A standalone tuner/capture card would probably have been the better choice, and cheaper!

Re:I'm giving up on my All in Wonder (2, Insightful)

Tlosk (761023) | more than 8 years ago | (#14643380)

I'd second the "cheaper" part too. I've had a tuner card that has seen me through three full systems and about seven video card upgrades and is still working great. In comparison, the 2MB video card, 64MB of memory, 56k modem, 4X CD-ROM, etc. that shared space with it when it was new have long since hit the dustbin.

geforce (-1, Flamebait)

Anonymous Coward | more than 8 years ago | (#14642993)

... still a geforce fan. :-P

Re:geforce (1)

Coneasfast (690509) | more than 8 years ago | (#14643009)

It's amazing how many people say they are 'fans' of one brand or the other. Not talking about you, but many people haven't even used the other brand because they've always been 'fans'.

They have a brand, say it works fine, and call themselves 'fans'. I find it interesting.

When I buy a card, I look at both ATI and NVIDIA, find the price range I want, and after going through reviews, buy the one with the best FPS/$.

Re:geforce (0)

Anonymous Coward | more than 8 years ago | (#14643053)

I agree with you. I am the original poster of this.

I have used ATI cards; I just find I have fewer problems with GeForce cards under Linux, which is my primary OS.

I also build computers for friends, etc., and I use the same process as you when shopping for them. If I can get a better deal with a GeForce, then GeForce it is. If I can get a better deal with an ATI card, then ATI it is... I suppose it also depends on what they are using it for. Gaming? Probably ATI. Home/office stuff? Generally, if the motherboard has built-in video, it's a GeForce chipset.

Re:geforce (2)

Billly Gates (198444) | more than 8 years ago | (#14643103)

There is a good reason: Linux support and a company's reputation. ATI has made some lousy drivers and products in the past. However, I find NVIDIA to be falling behind, and both their Linux and Windows drivers are not stable. I had to downgrade to an older HCL-certified Windows driver for my GeForce 6600 because strange things kept happening. SuSE has a big warning about their drivers too.

ATI now makes cards that are faster and have better effects and visuals than NVIDIA's, while their drivers are improving. Even on Linux they are getting decent.

This is why people buy on loyalty, and I wish I had purchased an ATI instead a few months ago.

Re:geforce (2, Insightful)

WhiteWolf666 (145211) | more than 8 years ago | (#14643155)

Meh....

I only run Linux. I used to be a huge fan of ATI. I still have a Radeon 9800 Pro, and an Xpress 200 integrated chipset.

Both are a HUGE pain in Linux, compared to Nvidia. Huge. Gigantic.

Their drivers have improved, yes. But they still suck quite a bit. At least they usually compile/install correctly now, but performance is crappy.

Re:geforce (0)

Anonymous Coward | more than 8 years ago | (#14643709)

Yeah, the drivers have gotten easier to compile, but the performance isn't as good as it used to be relative to the Windows drivers. I think I'm only getting about 10% more FPS in RTCW:ET and Q4 in Linux than I am in Windows... It used to be more like 15%.

Re:geforce (1)

WhiteWolf666 (145211) | more than 8 years ago | (#14650592)

You see more FPS in Linux than Windows on your ATI cards?

Have you tried any Cedega gaming? Do Cedega (Windows) games work properly?

Re:geforce (1)

Lord Kestrel (91395) | more than 8 years ago | (#14651806)

Depending on how recent the game you want to run is, it sometimes works very well, but often works well except for a few bits. For instance, hitting two mouse buttons at once in Guild Wars will make it crash, and Cedega doesn't seem to want to fix that (it's been like that for at least 3 months).

think about that (3, Insightful)

way2trivial (601132) | more than 8 years ago | (#14643106)

If $700 gets you 82 fps in a game, that's about $8.50 per frame.
If $200 gets you 40 fps, that's $5 per frame.
If a $10 card gets you 30 fps, that's roughly 33 cents per frame.

Are you really ONLY going to buy based on $/fps?

Enjoy your $10 card.
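
For anyone who wants to redo that arithmetic with their own numbers, here's a quick sketch; the (price, fps) pairs below are just the hypotheticals from this comment, not benchmark data:

```python
# Dollars-per-frame for a few hypothetical cards. The (price, fps) pairs
# are illustrative numbers from the comment above, not measured results.
cards = {
    "high-end ($700)":  (700, 82),
    "mid-range ($200)": (200, 40),
    "bargain ($10)":    (10, 30),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame")
```

Ranked purely by dollars per frame, the $10 card wins every time, which is exactly the point: the metric says nothing about whether the frame rate is actually playable.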

$500 (1, Flamebait)

Eightyford (893696) | more than 8 years ago | (#14643008)

Wow, and for the low price of only $500! Who actually buys these cards at this price? WoW only needs a GeForce 2, and those can be had for a little over 25 bucks.

Re:$500 (3, Insightful)

LurkerXXX (667952) | more than 8 years ago | (#14643032)

People with more disposable income than you, or people for whom gaming is a higher priority than it is for you.

Re:$500 (2, Informative)

d474 (695126) | more than 8 years ago | (#14643068)

"People with more disposable income than you, or people who have gaming as a higher priority in their life compared to other things than it is in yours."
IOW, guys without girlfriends.

Re:$500 (0)

Anonymous Coward | more than 8 years ago | (#14643170)

Errrr, I can assure you that it's quite possible to be both unable to get a girlfriend and simultaneously not care about gaming.

Re:$500 (1)

dunkelfalke (91624) | more than 8 years ago | (#14643378)

Have you ever considered that maybe, just maybe, some people play games other than World of Warcraft? And that some people don't like role-playing games at all? And that some people even prefer single-player games?

Re:$500 (1)

Eightyford (893696) | more than 8 years ago | (#14643589)

Have you ever considered that maybe, just maybe, some people play games other than World of Warcraft? And that some people don't like role-playing games at all? And that some people even prefer single-player games?

I just picked a fairly new game to illustrate my point. I don't think it matters whether it's a single-player or multiplayer game...

Re:$500 (0, Offtopic)

TheAntiCrust (620345) | more than 8 years ago | (#14643643)

You picked a game over a year old that wasn't a good judge of current hardware requirements WHEN IT CAME OUT. I'm also sure you didn't actually attempt playing WoW on a GeForce 2. My 2.8GHz machine with an ATI Radeon 9700 Pro had trouble with it. If you want to look at 'current hardware requirements', look at F.E.A.R., Battlefield 2, or heck, even Quake 4. That being said, I just bought myself a GeForce 7800 GT for $359 and got a free motherboard with it. Good deal for me.

But if you want to know who the crazies are, it is possible to buy two dual-GPU GeForce something-or-other cards for $800 each. So that's $1600 for the 'graphics card' alone.

Re:$500 (1)

Eightyford (893696) | more than 8 years ago | (#14643710)

But if you want to know who the crazies are, it is possible to buy two dual-GPU GeForce something-or-other cards for $800 each. So that's $1600 for the 'graphics card' alone.

That's pretty crazy! Doom 3 was probably the last hardcore 3D game I've played, but wouldn't the Xbox 360 provide a better gaming experience for a lot less money?

Re:$500 (1)

lucas teh geek (714343) | more than 8 years ago | (#14644524)

Well, that would depend on whether you're a diehard PC fan (or don't play anything but RTS/FPS games) or just someone who wants the best fun per dollar spent. Personally, I hate RTS games and I'm over most FPS games (how many times can you play the same storyless "shoot the bad guy, yuo teh win" type of game without getting tired of it?). Console gaming saves me cash AND cuts the last need for Windows on my PC.

Re:$500 (1)

dunkelfalke (91624) | more than 8 years ago | (#14643745)

Well, Deus Ex 2 (with the updated textures) is absolutely unplayable on a GeForce 3, and that game is old.

Re:$500 (0)

Anonymous Coward | more than 8 years ago | (#14643853)

1. Have you ever tried playing Battlefield 2 on a GeForce2?
2. Have you ever tried watching TV on a GeForce 2?
3. Have you ever tried both at the same time on a GeForce 2?

Primarily useless benchmarks... (4, Insightful)

Suddenly_Dead (656421) | more than 8 years ago | (#14643011)

The review didn't really test much that stressed the video card beyond Doom 3. A look at Half-Life 2 Lost Coast and/or some other HDR game(s) would have been far more useful than testing Unreal Tournament 2004, which the review admitted had more of a CPU bottleneck than anything else. They didn't do any overclock tests either, and the image quality tests are a little dubious.

Re:Primarily useless benchmarks... (1)

80 85 83 83 89 33 (819873) | more than 8 years ago | (#14643737)


Here is a list I compiled by checking out many different benchmarks. In general the faster cards are on top and the slower ones below. Since I am concentrating on affordable cards, I haven't placed many expensive cards above the NVIDIA 6600 GT and Radeon X1600 XT, so there are many high-end cards available now that are not on this list. If you see a few cards back-to-back with an equal sign (=) in front, they are very similar in performance to one another. (N = NVIDIA, R = ATI Radeon.)

N/A = discontinued or not available

FASTER CARDS
N 6800 Ultra $400-500
N 6800 GT
R X800 Pro ~$250

R X1600 XT ($170) best price/performance
N 6600 GT ($140) best price/performance
          MSI & BFG = quiet

R 9950 ultra
N 6800 LE (LE=slower)
R X700 XT (N/A)
R X700 PRO($125,135)
N 5900U/5950 Ultra($250)
R 9800 PRO(~$140)
N 6600 ???
=R 9700 pro
=R 9800 ($90??)
=N 5900/5950
R 9700 ($110)
R X1300 PRO($105)
N 5800 ultra
(3GHz)
N 5700 Ultra (N/A)
R 9500 Pro ($95 used)
yes it beats 9600pro!
=R 9600 pro/XT ($100)
=R X600 PRO/XT ($100)
N 6200 non-tc($70 AGP)
N 5600 pro/Ultra
N 5800
N 5700/5750
R 9800 SE(128 bit)
=N 5600
=R 9500/9550/9600
R X300 non-hc???
N 5700 LE (MINE)
N GF4 Ti 4600 !!!
N 5200 ULTRA
N 5600 XT (XT=lower)
R 9600 SE

This last group of expansion cards is equal to the current generation of integrated onboard graphics (***very slow***):

N 5200/5500
N PCX 5300
N 6200 Turbocache
R 9200
R X300 SE Hypermemory

By the way, these next three are the current generation of integrated onboard graphics chipsets, and they all have roughly equal 3D gaming performance:

-- Intel GMA 900/950
-- Geforce 6100/6150
-- ATI xpress 200

Re:Primarily useless benchmarks... (1)

stunt_penguin (906223) | more than 8 years ago | (#14644145)

Yeah, the games list is Doom 3, UT 2004, Far Cry, Half-Life 2, and the 3DMark06 benchmarking tool. Whatever happened to trying Battlefield 2 (a hugely detailed game when you scale things up), F.E.A.R. (reputedly tough on video hardware), or something like the latest Age of Empires game?

Half-Life 2 has been out for a year; there are tougher tests for a video card, like the Lost Coast expansion pack.

Another ATI (0, Flamebait)

canuck57 (662392) | more than 8 years ago | (#14643049)

Yawn... Why would someone in this day and age pay this much for a video card?

Another ATI-Money Mover. (0)

Anonymous Coward | more than 8 years ago | (#14643962)

"Yawn... Why would someone in this day and age pay this much for a video card?"

So I can get it cheaper a few years from now.

perhaps time for the older 1800 (1)

Billly Gates (198444) | more than 8 years ago | (#14643079)

My GeForce 6600 is dog slow on my Athlon XP 2400+ for Doom 3 and EverQuest 2.

It's like nothing is fast enough. After reading about the trillion or so polygons in Unreal 3, or whatever it's going to be called, I need a new card. The graphics are stunning [unrealtechnology.com], and I wonder if even the X1900 will be able to handle it.

Re:perhaps time for the older 1800 (1)

hazmat2k (911198) | more than 8 years ago | (#14643148)

That would be your 2400+ slowing you down, not the 6600.

Re:perhaps time for the older 1800 (1)

goldspider (445116) | more than 8 years ago | (#14644012)

I've heard conflicting accounts of where the bottleneck really is, at least for EQ2 anyway.

I also have a GF6600, with an Athlon 64 3200+ (2.0GHz) CPU. I've heard that the major bottleneck for EQ2 is the somewhat low VRAM on the card (128MB), but I also notice that the CPU is running at full capacity as well.

Any ideas? Upgrading either would cost roughly the same, and I want to make sure I pick the right one :) Thanks!

Re:perhaps time for the older 1800 (0)

Anonymous Coward | more than 8 years ago | (#14643252)

Yes, your XP CPU is the weakest link in this equation. It's probably not even a Barton (which has 512KB of L2 cache).

I am using a Barton XP 2600+ with a Ti4400, and UT2K4 with everything cranked to max at 1280x1024 runs just fine.

That said, I don't do Doom 3 or HL2, as I know the Ti4400 is going to poo itself.

UT2K7, on the other hand, is going to cost me $80 for the game and $1500 for a new PC with a whooparse video card.

I'm still waiting ... (2, Interesting)

AdamReyher (862525) | more than 8 years ago | (#14643099)

While the card's performance does take a nice step forward over the X1800, it's not really worth spending the extra $$ on. I'm still waiting for the "next generation" graphics card, something that really takes a large step forward. Still, it really comes down to the application developers and how they design their programs. Most can't use the full capabilities of the card, so we're still left in the dark.

- Adam

I'm waiting on my PVR system (1)

Gyorg_Lavode (520114) | more than 8 years ago | (#14643154)

I am waiting on my PVR system. Mainly I am now waiting for backend software that supports saving in a format I can play on any of my computers and that can be controlled from a thin client like the Windows Media Connect ones. It would also have to have the standard features: support for various cards, HD capability, widescreen, etc. I don't care about the OS as long as the features are there. Unfortunately, it seems that only Windows Media Center really does what I want, and it does not save to a widely usable format.

Re:I'm waiting on my PVR system (0)

Anonymous Coward | more than 8 years ago | (#14643765)

I'm a big fan of mediaportal.sf.net, but I've been primarily using gbpvr.com due to its overall stability.

GBpvr also supports the MVP from Hauppauge. The only downside is that you'll need one of the few tuner cards supported under GBpvr.

Sorry to hijack this thread.. 2 monitors? (0, Offtopic)

ExternalDingus (951990) | more than 8 years ago | (#14643266)

I am trying to figure out how to do this. I just got an eMachines T3410 that says it has an nVidia GeForce 6100 and a PCI Express slot available, and what I want to do is run 2 monitors on it. Do I have to get a PCI video card, or can I use an additional AGP card? I also want a TV tuner card. I am looking for the cheapest solution. The best deal, I think, is to get a PCI video card for about 60 bucks and an additional 40-dollar PCI tuner card at CompUSA. I would prefer to get a card that combines both things for cheaper than 100 bucks, but I can't find any.

Re:Sorry to hijack this thread.. 2 monitors? (1)

lenehey (920580) | more than 8 years ago | (#14643634)

PCI Express is not compatible with AGP or PCI -- it is totally different. You must get a video card that fits into the PCI Express slot, or a lower-end one that fits into one of the regular PCI slots.

Re:Sorry to hijack this thread.. 2 monitors? (1)

ExternalDingus (951990) | more than 8 years ago | (#14643713)

So do I have two AGP video card slots? Is this PCI Express slot open, or is the stock video card in that slot? Is there a way I can tell without opening it up?

Re:Sorry to hijack this thread.. 2 monitors? (1)

lenehey (920580) | more than 8 years ago | (#14644368)

Many motherboards have built-in video. You can tell from the outside of the box whether the video plug is on a card in a slot or not; if it is not in a slot, then the video is probably built into the motherboard. (You can also tell by checking the system BIOS, but that is a little trickier.) If your motherboard has a PCI Express slot, then you must get a video card to fit that slot (different from PCI). Some higher-end motherboards have two PCI Express slots, so that two video cards can be linked for higher performance, or used for multiple monitors. Some motherboards have both an AGP slot (for backward compatibility) and a PCI Express slot. Your user manual should tell you what you have. Opening up the box may not help if the slots are not labeled (they usually aren't) and you don't know how to tell one type from another.

48 pixel pipelines (1)

BrookHarty (9119) | more than 8 years ago | (#14643361)

Jumping from 16 to 48 pixel pipelines (1800 to 1900), one would expect a much better frame rate. But the nice thing is that this puts ATI back up with NVIDIA's GTX series.

Very nice card. The price is steep, but nice.

Re:48 pixel pipelines (4, Informative)

be-fan (61476) | more than 8 years ago | (#14643420)

They didn't jump from 16 to 48 pixel pipelines. The X1000 cards have a fairly non-traditional architecture. Instead of having a fixed set of pixel pipelines with fixed resources, they have a large shader array running a number of rendering threads, with ALUs assigned to each thread as necessary. The X1900 increases the number of shader units from 16 to 48, but both the X1800 and X1900 have 16 texture units and 16 raster-op units. So both cards can do 16 texture lookups per clock and commit 16 pixels to memory per clock. The extra ALUs in the X1900 come in handy for complex shaders, where the X1900 can do far more calculations per pixel than the X1800.
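
To put rough numbers on that, here is a minimal sketch of the theoretical throughput implied by those unit counts (the 500 MHz core clock is a nominal placeholder, not a spec; the point is that shader math triples while texture and raster throughput stay the same):

```python
# Theoretical per-second rates implied by the unit counts described above.
# The 500 MHz core clock is a placeholder; only the ratios matter here.
def rates(core_mhz, shader_alus, texture_units, rops):
    clock = core_mhz * 1e6
    return {
        "shader ops/s":    shader_alus * clock,
        "texel fetches/s": texture_units * clock,
        "pixels out/s":    rops * clock,
    }

x1800 = rates(500, shader_alus=16, texture_units=16, rops=16)
x1900 = rates(500, shader_alus=48, texture_units=16, rops=16)

for key in x1800:
    print(f"{key:16} X1800: {x1800[key]:.1e}   X1900: {x1900[key]:.1e}")
```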

Re:48 pixel pipelines (2)

BrookHarty (9119) | more than 8 years ago | (#14643538)

Then the graph is off in the article.

All-In-Wonder Comparison

                            X1900     X1800 XL   2006      X800 XL   X800 XT
PCI Express                 Yes       Yes        Yes       Yes       No
Core Clock (MHz)            500       500        450       400       500
Memory Clock (MHz)          480       500        400       490       500
Vertex Pipelines            8         8          2         6         6
Pixel Pipelines             48        16         4         16        16
Microtune Tuner             IC 2121   IC 2121    IC 2121   IC 2121   MT2050
Shader Model 3.0            Yes       Yes        Yes       No        No
Avivo, H.264 Acceleration   Yes       Yes        No        No        No

Re:48 pixel pipelines (-1)

Anonymous Coward | more than 8 years ago | (#14645179)

The architecture of the R5xx isn't all that revolutionary. They decoupled the TMUs from the pixel processors (the "pixel pipelines" are significantly smaller now) because they wanted to make more effective use of die real estate by having a different ratio of TMUs to pixel processors. There are still a fixed number of TMUs, ROPs, and pixel processors (which are distributed across a fixed number of cores). In turn, the pixel processors have a fixed number of ALUs and a branch processor. To reduce the cost of the branching required by SM 3.0, the batch scheduler and the register file are expanded to work on significantly smaller pixel batches. The batch scheduler is also expanded to deal with the decoupling of the TMUs from the pixel processors, but all in all it's just an evolutionary step necessary to deal with the growth in pixel shader complexity. It doesn't necessarily matter whether you have TMU or ROP parity if you aren't fillrate-bound.
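
To illustrate that last point, the extra ALUs only help when a shader is ALU-bound rather than texture-bound. A back-of-the-envelope sketch (the instruction mixes below are invented for illustration, not taken from any real shader):

```python
# Estimate whether a shader is ALU-bound or texture-bound on a part with
# decoupled TMUs. The instruction mixes are invented for illustration.
def bottleneck(alu_ops, tex_fetches, alu_units, tmus):
    alu_clocks = alu_ops / alu_units    # clocks needed if only ALUs mattered
    tex_clocks = tex_fetches / tmus     # clocks needed if only TMUs mattered
    return "ALU-bound" if alu_clocks > tex_clocks else "texture-bound"

# Long SM 3.0-style shader: lots of math per texture fetch.
print(bottleneck(alu_ops=48, tex_fetches=4, alu_units=48, tmus=16))   # ALU-bound
# Simple multitexturing pass: mostly fetches, little math.
print(bottleneck(alu_ops=4, tex_fetches=4, alu_units=48, tmus=16))    # texture-bound
```

With a fetch-heavy workload, adding ALUs doesn't move the bottleneck, which is why TMU/ROP parity between the X1800 and X1900 isn't automatically a problem.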

Only 47 comments on this product? (0)

Anonymous Coward | more than 8 years ago | (#14644223)

Time to sell your stock in ATI!