That was “interesting” (5, Interesting)

sebi (152185) | more than 9 years ago | (#9758927)

Reading that "interview" I can almost see the lawyer going over every answer and neutering it before it went out. Either that or Mr Desai is the most boring and lifeless fellow in the history of electronics.

Worthless read (3, Insightful)

Achoi77 (669484) | more than 9 years ago | (#9759037)

Heh. After reading that interview I get the feeling that this guy doesn't know anything about the product he is selling. Generic one-liner answers that dance around the questions with emphasis on market speak. Here's an excerpt:

GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.

Absolut (tm) Garbage!! Here's another, this time with the question:

* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.

Talk about trash!! A simple NO would have sufficed. Looks like he's made the most of his Business-for-dummies Manual. Man, why am I so angry over this?

Re:Worthless read (0)

Anonymous Coward | more than 9 years ago | (#9759330)

After reading TFA... I totally agree.

Also observe how in your second example there's no mention of ATI Radeon X800 Pro/XT in the question! Sheesh...

Re:Worthless read (1)

Hungus (585181) | more than 9 years ago | (#9759419)

Man, why am I so angry over this?
Umm... because it's still early and you haven't had enough caffeine yet?

Re:Worthless read (2, Insightful)

foidulus (743482) | more than 9 years ago | (#9759463)

Talk about trash!! A simple NO would have sufficed. Looks like he's made the most of his Business-for-dummies Manual. Man, why am I so angry over this?
Probably because he gets paid much more for spouting bs than any of us do for real work....

Re:That was “interesting” (2, Insightful)

Anonymous Coward | more than 9 years ago | (#9759087)

The interviewer is not any better. I was particularly annoyed at this exchange:

Mr Desai:
* We are the first chip with an onboard video processor

Interviewer's commentary:
(Note: Some previous ATI cards like their AIW models have shipped with the "Rage Theater" video encoder/decoder chip on the card. It was first intro'd in 1998, and I'm sure revised since then. Of course the current generation of GPUs has more advanced features.)

Now, how exactly is that comment relevant? Mr Desai claimed theirs was the first chip with an onboard video processor. Does the interviewer even understand the difference between a video card and its GPU?

Re:That was “interesting” (2, Interesting)

Peter Cooper (660482) | more than 9 years ago | (#9759182)

Clearly not, as the interviewer asked if nVidia would be introducing any retail cards for the Mac. nVidia doesn't make cards, but GPUs, as Mr. Dull-ai explained.

Re:That was “interesting” (0)

Anonymous Coward | more than 9 years ago | (#9759107)

you have to remember that in (most) really large companies, the sheer scale of the organisation makes people feel compelled to be impersonal in their social interaction (see www.intel.com for the perfect example). "Hey cool man!" kind of chat isn't going to look good in the eyes of his boss.

Re:That was “interesting” (1)

sebi (152185) | more than 9 years ago | (#9759142)

you have to remember that in (most) really large companies, the sheer scale of the organisation makes people feel compelled to be impersonal in their social interaction (see www.intel.com for the perfect example). "Hey cool man!" kind of chat isn't going to look good in the eyes of his boss.

Hence the mention of an imaginary (or not) lawyer responsible for going over all outbound communication. I know that there are good reasons for the practice, but one side effect is to make interviews like this completely useless. Why then do companies even bother to "talk" to interested consumers? Pump out some press releases and be done with it. The level of actual content couldn't be any lower.

Re:That was “interesting” (4, Funny)

Anonymous Coward | more than 9 years ago | (#9759113)

Mr Desai is the most boring and lifeless fellow in the history of electronics.

And that's saying a lot.

First ever first post (-1, Offtopic)

SABME (524360) | more than 9 years ago | (#9758930)

Time to abandon all hope ... I'm deliriously happy at the prospect of *finally* making the first post on a slashdot topic.

Re:First ever first post (0)

Audigy (552883) | more than 9 years ago | (#9758953)

I felt the same just now... until I learned of the time limit between clicking "reply" and "submit" ;)

Oh well... leave the idiotic first postings to the NGAA or whatever they call themselves. Insightful/Informative/Interesting first posts are much more... valuable.

Re:First ever first post (0)

Anonymous Coward | more than 9 years ago | (#9759082)

"Oh well... leave the idiotic first postings to the NGAA or whatever they call themselves. Insightful/Informative/Interesting first posts are much more... valuable."
  • That's the GNAA! Get it right, you opressor!

Re:First ever first post (0)

Anonymous Coward | more than 9 years ago | (#9759410)

... you mean oppressor.

Re:First ever first post (-1)

Anonymous Coward | more than 9 years ago | (#9758954)


And.... you failed it.

There's a rope in the closet to hang yourself with.

Set up (1)

OECD (639690) | more than 9 years ago | (#9758960)

Are these Cinema Displays essentially a dual-monitor-in-one setup (from the computer's POV, that is)?

(YFI, BTW)

Re:Set up (3, Informative)

angrist (787928) | more than 9 years ago | (#9759004)

Nope, they should each show up as a single monitor.
The "dual-link" label is misleading; it's merely an update to the DVI standard (like DVI-I, DVI-A, etc.) to allow for more data.

Re:Set up (1, Informative)

lordDallan (685707) | more than 9 years ago | (#9759151)

Actually, the issue is that there is too much data to drive the screen over one cable connection/channel (I don't know the right technical term) - so there are two DVI connectors for each screen (four on the card).

Only the 30 inch display requires the two connections per screen - so this card is really only for the 30 inch.

IANAE - so I have no idea if the card could ever be hacked to drive four displays - but that would be pretty cool.

Re:Set up (2, Informative)

angrist (787928) | more than 9 years ago | (#9759226)

You're wrong and you're right.

Yes, the issue is data throughput. DVI-D doesn't support high enough resolution.

But, the 30 inch display only needs ONE connector.
DVI-Dual Link is just a protocol/standard that allows that one connector to send twice the data of DVI-D. Think double density.

So... one card, two DVI-Dual Link Connectors, one display (including 30 inch) per connector.

Re:Set up (2, Interesting)

afidel (530433) | more than 9 years ago | (#9759295)

No, there are the same number of cables (2); it's just that more of the pins are actually used for digital data. You might run into a problem if a cheap cable was made assuming single-link DVI, but any cable which follows the spec should work fine. The interesting part is that there is no KVM capable of switching dual-link DVI, AFAIK.
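
To put rough numbers on the single- vs. dual-link distinction discussed in this thread, here is a minimal back-of-the-envelope sketch in Python. It assumes the DVI spec's 165 MHz per-link TMDS pixel-clock limit and a nominal 10% blanking overhead; real timings (e.g. reduced blanking) differ somewhat.

    # Estimate how many TMDS links a display mode needs, under the
    # assumptions above (165 MHz per link, ~10% blanking overhead).
    SINGLE_LINK_LIMIT_MHZ = 165  # DVI single-link pixel-clock ceiling

    def pixel_clock_mhz(width, height, refresh_hz, blanking=0.10):
        # Active pixels per second, padded for blanking intervals.
        return width * height * refresh_hz * (1 + blanking) / 1e6

    for w, h in [(1920, 1200), (2560, 1600)]:
        clock = pixel_clock_mhz(w, h, 60)
        links = 1 if clock <= SINGLE_LINK_LIMIT_MHZ else 2
        print(f"{w}x{h}@60Hz: ~{clock:.0f} MHz -> {links} TMDS link(s)")

Under these assumptions, the 23" display's 1920x1200 fits in one link at roughly 152 MHz, while the 30" display's 2560x1600 needs about 270 MHz - hence the dual links behind each connector.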

Man that card is HUGE! (4, Funny)

Gentoo Fan (643403) | more than 9 years ago | (#9758948)

Also, I liked this:

* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.


Translated: We'll release some actual numbers when we sell more of these mini-space heaters.

Re:Man that card is HUGE! (1)

Arminator (138868) | more than 9 years ago | (#9759323)

Better yet:
"As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves."

I'd translate it like this:
Our new card can beat any of those old (previous) ATI cards. (But the new ATI cards have the potential to open up a can of whoopass on this one, so we'll rely on benchmark-optimized results.)


It's costs... (4, Funny)

_PimpDaddy7_ (415866) | more than 9 years ago | (#9758955)

more than some PCs. Amazing!

From the site:
"The combination of a GeForce 6800 Ultra with a dual processor Power Mac G5 driving two 30-inch Apple Cinema HD Displays is the definitive tool for the creative professional."

Yes because I need 2 30" screens to watch Carrie Ann Moss on one screen and Natalie Portman on the other :)

Re:It's costs... (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#9759160)

You're not the typical audience. Most Mac users will be watching Graham Norton on one screen and have a naked picture of Leonardo Dicaprio on the other.

(Posting anonymous because I'm a Mac user myself, but couldn't help take the bait)

Re:It's costs... (0)

Anonymous Coward | more than 9 years ago | (#9759481)

Thanks for proving once again that browsing at +5 flamebait is the way to go.

Re:It's costs... (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#9759192)

It's not two DVI ports for two 30 inch screens. You need the two dvi ports to support ONE 30 inch cinema display's immense high resolution screen.

I had to read that twice.

Re:It's costs... (4, Informative)

Quobobo (709437) | more than 9 years ago | (#9759227)

Argh. No, it's not. There are 2 (two) dual-link DVI ports, each of which can drive 1 (one) 30 inch monitor. Take a look at the pictures from WWDC where they had a G5 driving two of those monitors.

Proud Owner (1, Flamebait)

Seth Finklestein (582901) | more than 9 years ago | (#9758959)

I've owned three Ultra DDL cards before to power my various high-resolution monitors. The major problem here is heat dissipation. Even with three industrial-grade heat sinks on the card, it still produces an additional 10 degrees Celsius of heat on top of the already-hot temperatures emitted by my Duron @2.3 GHz.

Furthermore, although Nvidia has previously made drivers available for Linux, their exclusivity agreement with Apple prevents them from releasing Linux for the Nvidia 6800 Ultra DDL Graphics Card.

As a result, I will boycott this card and the Apple that sponsors it.

Sincerely,
Seth Finklestein
Expert

Re:Proud Owner (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#9759035)

Furthermore, although Nvidia has previously made drivers available for Linux, their exclusivity agreement with Apple prevents them from releasing Linux for the Nvidia 6800 Ultra DDL Graphics Card.

Shit, you mean no more Winblows drivers, no more LimpDix drivers? No more OS/2 Warped, Sun Solartits support?

I too will boycott this card and the fruits that sponsors it.

Re:Proud Owner (1)

Peter Cooper (660482) | more than 9 years ago | (#9759255)

the already-hot temperatures emitted by my Duron @2.3 GHz.

You mean you pay out the ass for the latest video card and heatsinks, but you buy the cheapest CPU out there and push it to the very limits? That just sounds odd.

Re:Proud Owner (0)

Anonymous Coward | more than 9 years ago | (#9759403)

He might need lots of screen real estate but not so much CPU power. Many design jobs and other tasks are like that, you know...

And those Durons aren't half bad, mind you. Or maybe he just ran out of money halfway through his upgrade ;-)

Re:Proud Owner (0)

Anonymous Coward | more than 9 years ago | (#9759367)

Boo hoo. Why don't you use one of the 100's of graphics cards that work in your PC but not in my Mac.

Re:Proud Owner (1)

Hungus (585181) | more than 9 years ago | (#9759508)

Well I, for one, am not sorry to hear that:
"exclusivity agreement with Apple prevents them from releasing Linux for the Nvidia 6800 Ultra DDL Graphics Card."
I mean, I personally have Linux running on enough devices as it is; now, if they wanted to port NetBSD to the card, that would be different. Oh wait, did you mean to say that it prevents them from releasing Linux drivers for the card? Or did you really want to run Linux on the card itself?

you can't replace me (4, Funny)

nighty5 (615965) | more than 9 years ago | (#9758965)

It won't replace my S3 - 1 meg

Never..

Never......

Never !!!!

Re:you can't replace me (1)

rodac (580415) | more than 9 years ago | (#9759055)

You also have an S3? I love that chip.

Do you have it, as I have, on a VESA Local Bus card (high end), or are you using a low-end ISA card?

Re:you can't replace me (4, Funny)

skinfitz (564041) | more than 9 years ago | (#9759072)

I was going to write a long reply but then you wouldn't be able to read it all as it wouldn't fit on your screen.

Re:you can't replace me (0)

Anonymous Coward | more than 9 years ago | (#9759162)

I was going to write a long reply but then you wouldn't be able to read it all as it wouldn't fit on your screen.
Upgrade to a Matrox Millennium I tell you ;-)

Re:you can't replace me (1)

sv0f (197289) | more than 9 years ago | (#9759555)

I was going to write a long reply but then you wouldn't be able to read it all as it wouldn't fit on your screen.

Don't worry, Andrew Wiles is working on a fix. Should be available in about ten years.

Re:you can't replace me (1)

AKAImBatman (238306) | more than 9 years ago | (#9759315)

Hey, those things were actually great little 2D cards (and cheap too!). The reason why they fell out of favor was that their 3D support (in the form of the S3 Virge) was downright broken. I remember playing a "Virge Enhanced" version of Tomb Raider. None of the wall seams would line up, polygon "ghosts" kept appearing, and the overall game performance was indistinguishable from the software version. The end result was that *software rendering* made a better showing than the S3 Virge.

Wow, what useless responses... (2, Interesting)

Goronmon (652094) | more than 9 years ago | (#9758976)

As expected, they are much faster than previous generation products from ATI

That's basically like saying, "Hey, this new souped-up Mustang is much faster than a 1992 Taurus!"

I mean, it had better be a whole hell of a lot faster than the old cards for the huge premium you are paying right now.

Re:Wow, what useless responses... (3, Insightful)

Achoi77 (669484) | more than 9 years ago | (#9759111)

Well, there was one question that he did give a precise answer to:

* Does the GeForce 6800 Ultra DDL have a low-noise fan?

Yes, the GeForce 6800 Ultra DDL runs very quiet.

I think this was the only question he was capable of answering.

Cram that thing into an iMac?! (1)

angrist (787928) | more than 9 years ago | (#9758978)

All I know is that the 6800 won't fit in my iMac, or my (soon to arrive) PowerBook... Damn you, nVidia!

Slightly off topic: has anyone seen a way to upgrade the video card on an iMac (lamp type), even if it requires a new case?

Re:Cram that thing into an iMac?! (1)

Semantic Anomaly (705211) | more than 9 years ago | (#9759211)

Slightly off topic: has anyone seen a way to upgrade the video card on an iMac (lamp type), even if it requires a new case?

Yeah. Buy a new Mac.

Article Text: I'm AC 'cause I don't want the karma (3, Informative)

Anonymous Coward | more than 9 years ago | (#9758981)

Q & A with Nvidia on the Mac
Nvidia 6800 Ultra DDL Graphics card
Posted: 7/20/2004

Shortly after Apple announced the Mac Nvidia 6800 Ultra DDL card for the PowerMac G5s (which is required to drive the 30in Cinema Display), I sent a series of questions about the card to a contact at Nvidia. Yesterday I received the reply from Ujesh Desai, Nvidia's General Manager of Desktop GPUs. Although some questions didn't get as complete an answer as I hoped (often due to the fact that Apple controls OEM Mac Nvidia products), I appreciate his taking the time to reply.

* How does the NVIDIA GeForce 6800 Ultra DDL card for the Mac differ from the PC version (i.e. Does the PC version have dual link DVI?)

The GeForce 6800 Ultra DDL card was designed specifically for the Mac to provide two dual-link outputs to support Apple's displays.

* Does the Apple version of the GeForce 6800 Ultra GPU run at the same core/memory clock as the PC version?

The Apple cards run at 400/550, just like the GeForce 6800 Ultra GPU on the PC.
(Note: Some vendors' 6800 cards are clocked higher than the standard/reference design.)

* The GeForce 6800 Ultra for the PC has two Molex power connectors - does the Mac version source all the power from the G5's AGP Pro slot? (or does it have an aux power connector?)

There is an on-board power connector on the graphics card and the motherboard to provide power, so there is no need for an external power connector from the power supply.
(Although the only Mac 6800 photos I've seen are tiny, it appears there's a stub connector on the card that, I suspect, uses the DC power connector on the motherboard (28V or 24V usually) that's normally used for ADC display power, providing additional power (regulated down) for the 6800 card. That eliminates the need for the aux. (Molex) power supply connector(s) that the PC/standard 6800 card versions have.)

* Does the GeForce 6800 Ultra DDL have a low-noise fan?

Yes, the GeForce 6800 Ultra DDL runs very quiet.

* Will there ever be a control panel with 3D/GL/FSAA controls for the NVIDIA cards on the Mac platform? (ATI's retail Radeon cards (and OEM models with the 3rd party patch) have a '3D/GL overrides' feature - which is seen as a big plus by many end users.)

Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver.

* Regarding the previous question - if there's no chance of an Apple supplied NVIDIA card control panel (for advanced features/FSAA, etc.) - if a 3rd party wanted to do this, can NVIDIA provide some assistance?

Apple is our customer, so if this is something that they requested, then we would support it.

* There's been talk of previous NVIDIA cards taking a bigger than expected performance hit from using some types of shaders (on the Mac) - is this a concern with the GeForce 6800 Ultra DDL?

GeForce 6 Series GPUs have the most advanced shading engines on the market. Compared to previous generation parts, the vertex and pixel shader engines on GeForce 6 Series GPUs have been completely redesigned from the ground-up. Pixel Shader performance is 4 to 8 times faster. Vertex shader performance is twice as fast. Performance numbers have been outstanding.

* Will there be updated/new drivers for the GeForce 6800 Ultra?

Yes. Apple provides all the drivers for NVIDIA-based add-in cards. We supply them with the source code and they provide the final driver. Apple will control the release schedules for drivers that provide even more performance, features and image quality enhancements.

* Do you have any idea how performance compares on the Mac between the GeForce 6800 Ultra and the ATI 9800 Pro/XT card?

GeForce 6800 Ultra represents the largest leap forward in graphics performance in our company's history. As expected, they are much faster than previous generation products from ATI. We will let the benchmarks speak for themselves.

(Note: There are no Mac 6800 performance reviews currently, as it's not set to ship until late August. (I should have one for tests then.) Nvidia's 6800 reviews page lists many PC site reviews of various models of the 6800 - but most compare performance to the ATI 9800/XT model, not the faster/more advanced X800 series. One review that does have some (PC) 6800 Ultra vs. ATI X800 (XT/Platinum models) tests is this article at Anandtech, which in the limited tests run there shows similar performance. Of course there are no Mac ATI X800 series cards announced yet, although I'd bet they're in the works...)

* Platforms/drivers aside - how does the GeForce 6800 Ultra compare in design/features to the ATI X800 series?

ATI has spun an old architecture and have done a very good job of squeezing the last bit of untapped performance from it and the drivers. GeForce 6 is a new architecture and we will have a lot more headroom than they will. We always give significant performance increases after we have leveled out the stability of the new architecture. GeForce 6 should continue that trend. Other key differences include:

* We support shader model 3, they do not.
* We support 64-bit floating point blending and filtering, they don't.
* We are the first chip with an onboard video processor

(Note: Some previous ATI cards like their AIW models have shipped with the "Rage Theater" video encoder/decoder chip on the card. It was first intro'd in 1998, and I'm sure revised since then. Of course the current generation of GPUs has more advanced features.)

* This is an old question, but do you think there will ever be a retail Mac NVIDIA card? (i.e. Is the Apple contract an exclusive on the ROM code, etc. - or is there just no interest in PC graphics card mfrs to do a Mac product?)

We sell GPUs, we do not sell add-in cards. Apple has offered stand alone cards for sale on the Apple store in the past. We plan on continuing to offer products for the Mac platform through Apple.
(I was trying to get an answer as to whether the Apple contract is an exclusive, or if the reason there are no retail Mac Nvidia cards is that none of the (many) companies that have sold PC Nvidia GPU-based cards have any interest in selling a Mac retail product. (i.e. - Has any company approached Nvidia about licensing the Mac ROM, etc.?) Although Apple has from time to time sold OEM Mac Nvidia cards separately, the selection has been sparse and prices typically high. PC 6800 Ultra cards are currently priced similarly to the Mac 6800 Ultra, but the Mac model only works/fits in a PowerMac G5. And with lots of retail competition, PC versions typically drop in price over time.)

* Normally DVI (DVI-I) port cards have analog pins in the connector so that a DVI->VGA adapter can be used. Can a DVI to VGA adapter be used with the Mac 6800 Ultra DDL DVI ports? (Apple's currently limited info on the card does not mention this capability)

There is no technical reason why a DVI to VGA adapter would not work on this board.

I've written to contacts at ATI for their comments on future Mac graphics card models, such as an X800 based card, and for replies to any comments above, which I'll add to this page when received.

ATI's Response? (0)

TheShadowHawk (789754) | more than 9 years ago | (#9759022)

Does ATI have anything in the pipeline to combat this new card from nVidia?

I'd google it, but it's too late and I'm too lazy to bother..

Re:ATI's Response? (1)

rozz (766975) | more than 9 years ago | (#9759203)

I'd google it, but it's too late and I'm too lazy to bother..

me too.
(just in case you were counting on me)

Radeon X800 Series (1, Informative)

Anonymous Coward | more than 9 years ago | (#9759585)

For the Mac, dunno.

But Radeon X800 XT certainly matches (wins some, loses some) the GeForce 6800 Ultra in PC land. Reviews confirming this are in abundance.

But like I said, dunno about Macs and driving 30" screens through two dual-link DVI ports... Maybe not. I follow the developments in 3D hardware, and there haven't been any rumours or info about such a Radeon card (from ATI, Apple, or anybody else).

It would need four TMDS transmitters on board. Then again, the Evans & Sutherland four-way R300 card has eight ;-)

A dream... (0)

EMN13 (11493) | more than 9 years ago | (#9759024)

Not that it's in any livable price range; but two 30" displays connected to a card like that running Mac OS X - can you imagine that?
The beauty, the beauty... ;-)

Re:A dream... (1)

afidel (530433) | more than 9 years ago | (#9759258)

probably still comes in at less than a comparable Sun or SGI workstation without a display =)


Article Summary: (4, Funny)

Erwos (553607) | more than 9 years ago | (#9759056)

The 6800 DDL is just a 6800 that supports the new ADC. Apple releases the drivers, don't bitch at us if you don't like the drivers. No, we're not going to tell you about our contract with Apple. The X800 sucks.

Much faster to read, no PR speak to deal with.

-Erwos

If you meant... (4, Insightful)

daveschroeder (516195) | more than 9 years ago | (#9759305)

..."the new ACD", as in "the new 30" Apple Cinema Display", ok.

But if you actually meant ADC, or "Apple Display Connector", that is no longer used. With the new line of displays, Apple has (thankfully) gone back to standard DVI for the displays and for their future OEM video cards.

Re:If you meant... (0)

Anonymous Coward | more than 9 years ago | (#9759507)

I'm happy that Apple has decided to go with DVI, as that's what all the PCs use. But I really wish it had been the other way around, with the PC world moving to ADC. ADC was totally cool: DVI, power, and USB all through a single cable/connector.

I've got an old (now) 20" Cinema Display and the ADC to DVI adapter. It's great that with a new Cinema Display I won't need a hundred-dollar adapter; it just sucks that they had to go to the less useful tech with DVI.

Re:Article Summary: (1)

tonywong (96839) | more than 9 years ago | (#9759487)

I don't mean to be a pedant, but

ADC == Apple Display Connector

ACD == Apple Cinema Display

The new 6800 supports the latter, not the former (Dual DVI is the connection standard for the newest Apple systems).


usually good, but ... (4, Insightful)

for_usenet (550217) | more than 9 years ago | (#9759075)

I'm a heavy Mac user, and I read this site pretty much on a daily basis, as the guy responsible for it puts up a LOT of decent Mac hardware and software info. But this has got to be one of the most UNinformative, useless things he's posted. I know there's a desire for info about this card - but shouldn't we wait till some more detailed specs are released, or till someone has actual silicon so benchmarks can be run?

Yet another example of "no news" being news... As many other people have said, "Nothing to see here. Move along!!"

Promises promises (1, Informative)

Anonymous Coward | more than 9 years ago | (#9759076)

We always give significant performance increases after we have leveled out the stability of the new architecture. GeForce 6 should continue that trend.

They made the same promises regarding the NV30/NV35 series, and the shader performance NEVER approached the shader performance of the R300 series. Even Carmack was talking about potential scheduling efficiencies during the NV30 launch that never materialized.

ATI may have similar problems, as R500+ is going to the pool-of-ALUs approach, where software scheduling becomes paramount to delivering on the performance of the hardware.

Pretty sparse... (1)

dalamarian (741404) | more than 9 years ago | (#9759078)

I was really interested in this article till about the 2nd question. The responses are extremely limited and seem straight out of a standardized corporate script. But good job for trying to get some more facts!

Dual-Link DVI for PC? (1)

Renesis (646465) | more than 9 years ago | (#9759105)

Does anyone have any links for Dual-Link DVI cards for the PC?

I believe there are a couple of them out there, but I wanna run one of these 30" Cinema Displays on a PC, you see!

Re:Dual-Link DVI for PC? (1)

whitelines (715819) | more than 9 years ago | (#9759191)

There are quite a few dual-link DVI cards out there, but I don't know of any that can drive 2560 by 1600 through BOTH ports. The ones I have seen can only drive that kind of resolution ACROSS the ports...

Re:Dual-Link DVI for PC? (2, Informative)

badriram (699489) | more than 9 years ago | (#9759231)

I would look into the Matrox Parhelia series of cards. They are designed for high-end use in DV, CAD, GIS, etc.

Tom's Hardware (4, Informative)

pajamacore (613970) | more than 9 years ago | (#9759154)

There was actually a really great, informative article about the 6800 on Tom's Hardware [tomshardware.com] a few weeks ago.

"NVIDIA has seemingly pulled out all stops in an attempt to deliver cutting edge graphics with its GeForce 6800 Ultra. After gamers for too long have had to be content with mere incremental improvements to graphics performance, NVIDIA new card delivers a performance jump not seen for a long time. The device is also solidly engineered as well as insanely fast."

Wait... not a Motorola 6800... an NVidia 6800.... (3, Funny)

DeckerEgo (520694) | more than 9 years ago | (#9759174)

My brain kept thinking that they were talking about the old Motorola 6800 chipsets that Apple used nine years ago... not a GPU marketed as "6800"... I got so confused...

Wait - I sold those things nine years ago!?!? Damn I'm old.

Re:Wait... not a Motorola 6800... an NVidia 6800.. (1)

fyonn (115426) | more than 9 years ago | (#9759220)

68000, not 6800.

I had those chips powering my Amigas: a 7.14 MHz 68000 in my A500, a 14 MHz 68020 in my 1200, and a 50 MHz 68030/68882 on the Blizzard board for my 1200.

damn, I feel old too now

dave

Re:Wait... not a Motorola 6800... an NVidia 6800.. (1)

stanmann (602645) | more than 9 years ago | (#9759366)

And of course, stepping further back, there's the Apple II series running the 6800 knockoff, the 6502.

Re:Wait... not a Motorola 6800... an NVidia 6800.. (0)

Anonymous Coward | more than 9 years ago | (#9759300)

The Apple didn't use a Motorola 6800. They used a MOS 6502, which was sort of like a Motorola 6800, in the Apple II (or various incarnations of the 6502, but never a 6800). They then used a Motorola 68000 in the Lisa and Macintosh.

You think that's bad?!?!? (1)

Klar (522420) | more than 9 years ago | (#9759414)

LAST YEAR (2nd year of university) we learned to code in assembly using a Motorola 6800 board -- no lie! We had to pay $150 for the damn book, which looked photocopied... /bitter

Flamebait... (2, Funny)

Anita Coney (648748) | more than 9 years ago | (#9759177)

Can someone explain to me why a Mac would need such a powerful gaming card?!

Re:Flamebait... (1)

Semantic Anomaly (705211) | more than 9 years ago | (#9759270)

Can someone explain to me why a Mac would need such a powerful gaming card?!

To play the apple jigsaw puzzle game, of course.

Re:Flamebait... (2, Informative)

Have Blue (616) | more than 9 years ago | (#9759407)

I don't know why I'm replying to this but...

It's not just for gaming. Mac OS X's GUI can be accelerated by the GPU. 10.4 will also ship with video- and image-processing libraries that use the GPU.

And even if you don't care about gaming at all, this is the only card on any platform that supports the 30" cinema display, so if you want one of those you need the card anyway.

Doom 3. (0)

Anonymous Coward | more than 9 years ago | (#9759602)

That is, after all, the reason a PC would need such a powerful gaming card as well.

Mac user's number 1 hardware question (3, Interesting)

gsfprez (27403) | more than 9 years ago | (#9759216)

Why can't we buy and use "PC" video cards? What is it that makes vendors have to build the EPROMs differently for the Mac vs. Windows machines for the exact same card?

It reduces our choices and makes a $100 card cost $400.

Re:Mac user's number 1 hardware question (2, Informative)

mfago (514801) | more than 9 years ago | (#9759318)

Why can't we buy and use "PC" video cards? What is it that makes vendors have to build the EPROMs differently for the Mac vs. Windows machines for the exact same card?

Because x86 stores data backwards (the big/little-endian thing) compared to almost every other processor, including the PowerPC.

Thus the card firmware needs to be different...
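
A minimal Python illustration of the byte-order difference described above; this shows only the endianness point, not the separate ROM-format issue (x86 code vs. Open Firmware FCode) raised in a reply further down.

    # The same 32-bit value as laid out in memory on x86 (little-endian)
    # versus PowerPC (big-endian).
    import struct

    value = 0x12345678
    print(struct.pack('<I', value).hex())  # '78563412' - x86 byte order
    print(struct.pack('>I', value).hex())  # '12345678' - PowerPC byte order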

Re:Mac user's number 1 hardware question (2, Interesting)

Quobobo (709437) | more than 9 years ago | (#9759345)

Yeah, but Mac users are gouged on the prices for just a different firmware (hence the many guides to flashing "PC" video cards for use in Macs). It makes no sense to charge so much more for simply having some different data on the card.

little-endian is the "right way" (TM) (1)

mangu (126918) | more than 9 years ago | (#9759597)

Come on, calling little-endian "backwards" is flamebait...

Actually, there is a justification for little-endian, just like there is one for the British driving on the left. Casting values from 16 bits to 8 bits and vice versa on little-endian machines is automatic. In the old days of limited memory this was an advantage. (As for driving on the left side of the road, it came from horse riding: one mounts a horse from the left side.)
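
A minimal sketch of that casting point, again in Python: on a little-endian layout the low-order byte of a 16-bit value sits at the value's own starting address, so narrowing to 8 bits needs no pointer adjustment, while a big-endian layout would need an offset.

    # Truncating 0x1234 to 8 bits means keeping 0x34. On a little-endian
    # layout the byte at the value's own address already is 0x34; on a
    # big-endian layout it is the high byte, 0x12.
    import struct

    value = 0x1234
    le = struct.pack('<H', value)  # b'\x34\x12' - little-endian layout
    be = struct.pack('>H', value)  # b'\x12\x34' - big-endian layout
    print(hex(le[0]))  # 0x34 -> the truncated value, for free
    print(hex(be[0]))  # 0x12 -> the high byte; truncation needs an offset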

Re:Mac user's number 1 hardware question (0)

Anonymous Coward | more than 9 years ago | (#9759339)

Because PCs don't use platform-independent FCode drivers like the rest of the world; they store their ROMs on the card in Intel assembly. Thus PCI cards for x86 systems don't work on anything else without new ROMs.

Mac suks (-1, Flamebait)

Anonymous Coward | more than 9 years ago | (#9759230)

It puts the lotion on the skin.

Apple is dying: Sell stock now. (-1, Troll)

Anonymous Coward | more than 9 years ago | (#9759238)

Apple Computer, Inc. (AAPL), beset by angry creditors and faced with severe G5 production problems, is on the verge of bankruptcy and total collapse. Apple continues to nosedive into oblivion, as confirmed by industry watchers, investors, and, most painfully, by customers themselves.

As a recent study by Bank of America Securities puts it, Apple ekes out its existence by peddling new hardware to its existing customers; once those customers are satisfied, Apple will run out of steam [forbes.com] . If these disastrous financial forecasts aren't enough, one need only look to Netcraft for confirmation that Apple's market share among Web servers is slowly dwindling down to zero. The market share of Mac OS X is now eclipsed even by that of FreeBSD, another OS that is deeply imperiled.

But the abysmal server presence of OS X is the least of Apple's worries. Apple's most recent quarterly report indicates a death spiral of cash loss. Indeed, Apple has hemorrhaged some $276 million in the last quarter, while racking up a dizzying $2.4 billion in debt. Revenue from sales of the iPod, the portable music player that is barely keeping Apple afloat in this shipwreck of fiscal woe, declined dramatically, threatening to shrink further an already minuscule lifeline.

Likewise, sales of the eMac, iMac and Power Macintosh G5 lines continue to skid. Apple is unable to secure G5 processors in sufficient numbers to supply its customers with Power Macintosh G5 and iMac computers, as Steve Jobs himself recently admitted. The staggering decline in sales numbers confirms it: there is no doubt that one-time Apple customers, dismayed with the floundering ineptitude of their favorite company, have begun turning away in droves, seeking cheaper, faster hardware from manufacturers such as Dell.

Apple teeters on the precipice of doom, one step away from plummeting to its ultimate nadir of bankruptcy, chaos, and implosion. Wise investors will quickly dump AAPL stock and abandon the doomed company, now less than one year away from complete disintegration.

It's time to move to a new platform: Apple is dead.

Re:Apple is dying: Sell stock now. (3, Funny)

killproc (518431) | more than 9 years ago | (#9759374)

Apple teeters on the precipice of doom, one step away from plummeting to its ultimate nadir of bankruptcy, chaos, and implosion.

I thought the release date for the OSX version of Doom III was still up in the air...

A note from the author (4, Funny)

saddino (183491) | more than 9 years ago | (#9759259)

My answers were designed specifically to provide little information, so there is no need for criticism. The site provided questions and I supply them with answers, if more details are requested, then I would support it. Compared to previous generation interviews, I redesigned my answers from the ground up and I think my word count was outstanding. Yes, Apple provides the answers sometimes. We supply them with talking points and let our quotes speak for themselves. The guys at ATI do a good job of squeezing out interesting information during their interviews, but our answers have a lot more headroom. Other differences include:
  • I support my pants with suspenders and they do not.
  • I speak marketing-speak fluently, and they don't.
  • I am the first one to make my points using bullets.

I answer questions with no add-ins of emotion. There is no technical reason why I would answer otherwise.

Sincerely,
Ujesh Desai

PC??? (0)

Anonymous Coward | more than 9 years ago | (#9759283)

They should totally release that card for the PC market. If anything, it will let Apple sell some really nice and pricey displays; hell, the damned thing costs as much as, if not more than, most of the computers they sell.

Annoying marketing regression (0, Offtopic)

flakier (177415) | more than 9 years ago | (#9759302)

Great monitors, but I just noticed that the 30-inch isn't actually 30 inches! They've regressed to CRT marketing!

Come on, Apple, everybody else is sizing their LCDs by their viewable size; let's not go back to the asinine advertised vs. viewable screen size. http://www.apple.com/displays/specs.html [apple.com]