
S3 Jumps On GPGPU Bandwagon

Soulskill posted more than 5 years ago | from the trying-to-get-their-groove-back dept.


arcticstoat writes "It's sometimes easy to forget that the PC graphics market isn't owned by ATI and Nvidia, and the company that first gave us 3D acceleration, S3, is very much still around. So much so, in fact, that it's now even developed its own GPGPU technology. S3 claims that its Chrome 400 chips can accelerate gaming physics, HD video transcoding/encoding and photo enhancement. To demonstrate the latter, S3 has released a free download called S3FotoPro, which enhances color clarity and reduces haze in photos. However, the company hasn't yet revealed whether it plans to support OpenCL in the future." The Tech Report also points out that this could allow S3's parent company, VIA, to compete with Intel and AMD in graphics processing technology.

86 comments


not the first time we hear that S3 can compete... (5, Insightful)

squisher (212661) | more than 5 years ago | (#25424993)

This is definitely not the first time in recent years that we've heard S3 can compete with ATI and Nvidia again. As much as I'd like to see that, I certainly won't believe it until I see some decent independent benchmarks.

Re:not the first time we hear that S3 can compete. (2, Funny)

rhyder128k (1051042) | more than 5 years ago | (#25425225)

Hey, give them a chance. If their excellent 3D graphics chipsets are anything to go by, this could give you the power of a 386 processor ON YOUR DESKTOP! Imagine it: DooM running in practically real-time. This baby could render the POV-Ray teapot example in 3-4 minutes rather than the hours it would take on an older XT-class machine.

Re:not the first time we hear that S3 can compete. (1)

donaggie03 (769758) | more than 5 years ago | (#25425247)

Seriously, I've never heard of S3. Are they not sold at normal stores like Best Buy?

Re:not the first time we hear that S3 can compete. (1)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#25425299)

Typically not. Standalone S3 based cards do exist; but they survive primarily in the form of embedded video on VIA boards.

Re:not the first time we hear that S3 can compete. (2, Funny)

Anonymous Coward | more than 5 years ago | (#25425517)

normal stores like Best Buy?

America really has gone downhill.

Re:not the first time we hear that S3 can compete. (1)

cbiltcliffe (186293) | more than 5 years ago | (#25439597)

I was thinking the same thing.

If it's not a super-dooper-insta-matic-graphics-o-tron-buzzword-overclocker-massive-profit-maker-by-stupid-customers cheapo graphics card, Best Buy doesn't want anything to do with it.

Re:not the first time we hear that S3 can compete. (3, Informative)

OrangeTide (124937) | more than 5 years ago | (#25425523)

Long ago they used to be, back when ATI and Trident were big names in the video card business.

Re:not the first time we hear that S3 can compete. (2, Interesting)

sortius_nod (1080919) | more than 5 years ago | (#25427299)

ATi, Trident, Matrox, S3, the good old days... I remember when I worked in a computer shop, we used to burn through S3 Virge and S3 Trio cards like they were going out of fashion.

Unfortunately they were left for dead when people no longer needed a 2D card to go with their 3DFX card - the combo cards from Diamond were killer cards and removed the need for the usual S3 Virge/Trio or Trident.

Re:not the first time we hear that S3 can compete. (1)

mollymoo (202721) | more than 5 years ago | (#25425721)

I had an S3 Virge about 11 years ago; they were pretty popular back then. I don't recall hearing much about them since, so I assumed they'd gone the way of 3DFX.

Re:not the first time we hear that S3 can compete. (2, Informative)

aliquis (678370) | more than 5 years ago | (#25425787)

3Dfx got bought by Nvidia, so no.

Re:not the first time we hear that S3 can compete. (3, Informative)

TheRaven64 (641858) | more than 5 years ago | (#25426093)

And S3 got bought by VIA, so yes.

Re:not the first time we hear that S3 can compete. (1)

lysergic.acid (845423) | more than 5 years ago | (#25427325)

i thought in a recent /. interview [slashdot.org] with the VIA open source rep he said that VIA didn't own S3 (not entirely at least):

However, S3 Graphics is an entirely independent company and not a subsidiary of VIA. That basically means that VIA is holding some stock, but that's more or less all. They also promote their products together.

The S3 Graphics discrete GPU's are developed independently from the VIA integrated graphics, and they share no common hardware core or driver.

So since VIA has taken steps to become more supportive of Open Source and particularly Linux and Xorg, we will see improvement for the VIA integrated graphics products. This has no relationship to what S3 does for their products.

personally, i don't know too much about S3 (other than the fact that they were a popular name when i was playing Quake III and Unreal Tournament). but i do have great interest in VIA's product line, especially as they relate to PVR/HTPC applications. perhaps we'll finally see those cheap Chrome 4 + EPIA low-power multimedia platforms we've been promised [xbitlabs.com] .

Re:not the first time we hear that S3 can compete. (2, Interesting)

tibman (623933) | more than 5 years ago | (#25425803)

S3 is from the age of 3dfx cards and pre-GeForce Nvidia cards. I don't remember any of their cards being very successful, other than some late Savage cards, and even those weren't equal to the 3dfx, ATI, or Nvidia offerings.

I still have stuff with 3dfx logos.. i miss them :(

Re:not the first time we hear that S3 can compete. (1)

rhyder128k (1051042) | more than 5 years ago | (#25425969)

*Click* *Spinning Logo* = Time for some eye candy! Happy memories :-)

Re:not the first time we hear that S3 can compete. (2, Informative)

sortius_nod (1080919) | more than 5 years ago | (#25427327)

As I stated in a post further up - the Trio and Virge cards are what S3 made a killing on.

I actually remember a server board that basically required a Trio - other cards would cause the system to hang mid-use. They were great little cards, and you could even add expanded memory to them.

Re:not the first time we hear that S3 can compete. (0)

Anonymous Coward | more than 4 years ago | (#25430253)

The ViRGE & Trio cards were everywhere (& Microsoft Virtual PC still emulates an S3 Trio), and the SavageMX chips were in quite a few laptops at the turn of the century. The Savage chips were actually quite nice, full Open Source drivers and everything.

Re:not the first time we hear that S3 can compete. (1)

tubapro12 (896596) | more than 5 years ago | (#25426081)

I actually think I've seen some of their hardware before, but I didn't know they were still around, so I guess I'm in a similar boat to you.
S3 Graphics [wikipedia.org]

Re:not the first time we hear that S3 can compete. (0)

Anonymous Coward | more than 4 years ago | (#25432637)

It was one of the big players in the infancy of 3D-accelerating video cards for the desktop.
IIRC it was actually _the first_ vendor, back in the MS-DOS era, and Quake (one!) was the first application to use it.

Re:not the first time we hear that S3 can compete. (1)

Mr Z (6791) | more than 5 years ago | (#25436553)

My S3 ViRGE sitting in my Gateway 2000 G6 Pentium II-300 weeps.

Ah, what the heck am I talking 'bout. I kicked that turd and its 4GB HD to the curb years ago.

Re:not the first time we hear that S3 can compete. (0)

Anonymous Coward | more than 5 years ago | (#25425647)

I recently had to RMA my desktop video card due to some blown caps and in the time waiting for the replacement I ended up using the onboard S3 Chrome graphics on my motherboard for the first time. It's not very fast, as I expected, but it's much better than I thought it would be. It supports shader model 2.0 and can actually run a surprising number of pre-2005 games at playable framerates with various adjustments to resolution or details. It seems to do a better job than any integrated Intel graphics I have ever used, which isn't saying much, but it seems like it may be slightly better than the integrated Geforce 6150 in my laptop also.

Since the Chrome 400 series are discrete parts that are much more powerful than the Chrome9 HC IGP on my desktop motherboard, I would not be shocked to find out that they can compete with ATI's and Nvidia's mid-range cards. The S3 Chrome 440 GTX 256MB seems to be their highest-end card at $59, and they show it beating out a Radeon 3470 in 3DMark06.

I'm certainly not ready to give up my Nvidia cards yet, but it will be interesting to see if the gaming market can once again become a place of competition among more than just 2 companies.

Re:not the first time we hear that S3 can compete. (1)

BrentH (1154987) | more than 5 years ago | (#25425785)

While the press release doesn't make this terribly clear, I think it may be that this application automatically processes all image data and adjusts it as needed before sending it out to the monitor. So in a way it's an image 'normalizer', like the audio normalizers that make all sounds have about the same volume. This would actually make sense (if it's what I think it is), because it would mean, for example, that all your photos 'look good' without any touching up, ditto all your home movies, and ditto the rest. Not a bad idea, if this is the idea.
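
For the curious: if that guess is roughly right, the core operation is just a contrast stretch - find the darkest and brightest values in the frame and remap them onto the full range. Here is a rough CUDA-style sketch of the per-pixel rescale step, purely illustrative (all the names are mine, and the min/max scan is done naively on the host just to keep it short):

    // normalize.cu -- illustrative contrast-stretch "normalizer" for an
    // 8-bit grayscale frame: remap the observed [lo, hi] range onto [0, 255].
    __global__ void stretchLevels(unsigned char *pix, int n,
                                  unsigned char lo, unsigned char hi)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n || hi <= lo)
            return;
        float v = (pix[i] - lo) * 255.0f / (hi - lo);
        pix[i] = (unsigned char)fminf(fmaxf(v, 0.0f), 255.0f);
    }

    // Host sketch: scan the frame once for its min/max, then launch:
    //   unsigned char lo = 255, hi = 0;
    //   for (int i = 0; i < n; ++i) { if (img[i] < lo) lo = img[i]; if (img[i] > hi) hi = img[i]; }
    //   stretchLevels<<<(n + 255) / 256, 256>>>(d_pix, n, lo, hi);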

Yep (2, Interesting)

Sycraft-fu (314770) | more than 5 years ago | (#25425843)

First we need to see a video card that performs well. Seriously. The whole reason that nVidia (and ATi) cards can do well at GPGPU stuff is that they are fast at gaming stuff.

Gaming graphics are at their heart a whole lot of parallel single-precision floating point math. Thus, that is what modern video cards are good at calculating. The GPGPU idea was just someone saying "Hey, these things are amazingly fast at number crunching, and graphics aren't the only sort of thing this is relevant to. Let's get an API on there to let people use it as they wish."

Well, that worked out great, but the whole thing was predicated on good hardware. Since the hardware does its job very quickly, much quicker than a CPU can, it is worthwhile to use it for other things. That wouldn't be the case if the hardware were slow. If the hardware didn't really do anything faster than a CPU, well then why not just use a CPU? Easier to program for, more readily available and all that.

So if S3 wishes to be taken seriously for GPGPU, they need to show they can be serious for games first. Show some serious vertex/pixel crunching capability. If your card is capable of that, it should be capable of generalizing that to any parallel FP task (provided the API is there and the hardware is designed right). However if you lag ass at graphics, I'm not going to believe you are worth a shit for other stuff. Graphics are more or less the ideal case: Embarrassingly parallel, not much branching, etc. This is no surprise, GPUs were designed to do graphics well. However it also means, if you aren't good at graphics, you aren't good at GPGPU.
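
To make that concrete, here's a minimal sketch of the kind of embarrassingly parallel single-precision work being described, written as a CUDA-style SAXPY (y = a*x + y). It's illustrative only, not anything S3 ships; one thread per element, no data dependencies, hardly any branching:

    // saxpy.cu -- illustrative only: y = a*x + y over a million independent
    // elements, which is exactly the shape of work GPUs are built for.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *hx = new float[n], *hy = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc(&dx, bytes);
        cudaMalloc(&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);   // one thread per element
        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

        printf("y[0] = %f\n", hy[0]);                       // expect 4.0
        cudaFree(dx); cudaFree(dy);
        delete[] hx; delete[] hy;
        return 0;
    }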

Re:Yep (2, Interesting)

bendodge (998616) | more than 5 years ago | (#25428993)

Do games later.
For now, let's see some really small, low-power, low-heat video chips with enough power for HD video and basic 3D acceleration. If they do that and release documentation for Linux, they can pwn the netbook market. Guess what S3 appears to be aiming at?

Re:Yep (1)

hattig (47930) | more than 4 years ago | (#25430759)

The problem is that both NVIDIA and AMD have products for this market that also double as system chipsets and thus are quite cheap to use instead of discrete graphics. For example, see the recently announced 9400M from NVIDIA (although their older AMD chipsets also had the video decode capability) and AMD's 780G.

(If you distill AMD's solution, you will end up with an ATI Xilleon chip (although I think they sold this off to Broadcom recently) that is a SoC with the video decode unit that is used in AMD GPUs, a MIPS CPU and all the other stuff.)

Re:Yep (1)

paganizer (566360) | more than 4 years ago | (#25434509)

I just wish they would aim more resources at OpenGL optimization.

Mod up: agree. (1)

ReedYoung (1282222) | more than 5 years ago | (#25437457)

Gaming hardware is developed for performance first, with low power consumption as an afterthought. A separate market of buyers exists who prioritize compactness and low wattage. Why not a separate R&D model? Both have a legitimate place in computer hardware, but like you, I think S3 has different plans than trying to compete on ATI's & nVidia's terms. And I'm excited about the possibilities.

If they do that and release documentation for Linux, they can pwn the netbook market.

what (3, Insightful)

Brian Gordon (987471) | more than 5 years ago | (#25425001)

How is enhancing photos the business of a video card? That can be done in software at a perfectly acceptable speed without hardware acceleration.

Re:what (5, Informative)

Poltras (680608) | more than 5 years ago | (#25425037)

Not true. Some filters that took minutes in Adobe Photoshop CS2 take only half a second in CS4. Just doing a low-pass filter or a blur for noise reduction would, of course, be perfectly doable on a single CPU. But once you go professional, the time saved by using GPGPU is amazing, and it means you can see results in real time, so you can make adjustments much, much faster.
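
For what it's worth, a per-pixel filter like that blur maps onto a GPU almost trivially, since every output pixel can be computed independently. A rough CUDA-style sketch (my own names and layout, purely to show the data-parallel shape of the work):

    // box_blur.cu -- illustrative 3x3 box blur on an 8-bit grayscale image.
    // One thread per output pixel; each thread reads its 3x3 neighbourhood
    // and writes the average. Pixels off the edge are clamped.
    __global__ void boxBlur3x3(const unsigned char *in, unsigned char *out,
                               int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height)
            return;

        int sum = 0;
        for (int dy = -1; dy <= 1; ++dy) {
            for (int dx = -1; dx <= 1; ++dx) {
                int sx = min(max(x + dx, 0), width  - 1);   // clamp to edge
                int sy = min(max(y + dy, 0), height - 1);
                sum += in[sy * width + sx];
            }
        }
        out[y * width + x] = (unsigned char)(sum / 9);
    }

    // Host side (sketch): copy the image to device memory, then launch with
    //   dim3 block(16, 16);
    //   dim3 grid((width + 15) / 16, (height + 15) / 16);
    //   boxBlur3x3<<<grid, block>>>(d_in, d_out, width, height);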

Re:what (2, Informative)

DigiShaman (671371) | more than 5 years ago | (#25425079)

nVidia already has this market covered with the CUDA API. In fact, the new version of Photoshop is GPGPU-accelerated with nVidia cards that support CUDA.

Re:what (2, Funny)

slimjim8094 (941042) | more than 5 years ago | (#25425761)

I'm fairly sure that's what he said. He even specifically mentioned Photoshop.

Re:what (1)

sortius_nod (1080919) | more than 5 years ago | (#25427349)

Pretty much... oh noes, Nvidia wasn't mentioned.

Either someone is a bit too much of a pedant or we're seeing some fanboyism.

Re:what (1)

cbiltcliffe (186293) | more than 5 years ago | (#25439673)

Oh. There's a company already in the market? Well, forget it then. There's no need for a second company to produce a similar product.
Give everybody the rest of the day off, and we'll come up with some new stuff tomorrow.

</sarcasm>

Re:what (1)

aliquis (678370) | more than 5 years ago | (#25425861)

Except in Aperture, where the GPU acceleration is much slower than in Lightroom, thanks to Apple's retarded decision to use 128 MB of VRAM. Hurray for less choice.

Re:what (2, Funny)

Anonymous Coward | more than 5 years ago | (#25425041)

How is enhancing photos the business of a central processing unit? That can be done in hardware acceleration at incredible speeds without software.

Re:what (1, Funny)

larry bagina (561269) | more than 5 years ago | (#25425157)

how is enhancing photos the business of a computer? That can be done with professional equipment and professional lighting.

Re:what (2, Funny)

MadnessASAP (1052274) | more than 5 years ago | (#25425607)

How is enhancing photos the business of professional equipment and lighting? That can be done with a few lenses, a cheap pen light and some cardboard.

Re:what (1)

aliquis (678370) | more than 5 years ago | (#25425875)

Off-topic my ass, funny or insightful :D

Why ISN'T enhancing photos the business of the central processing unit? Does it matter how it's done as long as you get the results you want?

Anyway, all enhancements will still try to fix what the exposure didn't.

Re:what (2, Funny)

Fred_A (10934) | more than 5 years ago | (#25427495)

Or you can just squint a bit, or step back a little. Cheap and fast and used to work just as well.

S3 (3, Funny)

crawly (890914) | more than 5 years ago | (#25425007)

I vaguely remember them, and here I thought they had gone out of business.

Easy to forget (3, Informative)

Spatial (1235392) | more than 5 years ago | (#25425059)

It's sometimes easy to forget that the PC graphics market isn't owned by ATI and Nvidia

That's right. Intel owns it too.

Re:Easy to forget (3, Insightful)

binarylarry (1338699) | more than 5 years ago | (#25425671)

Yep, Intel has like 60% market share.

And they have the worst performance.

Sounds like a perfect takeover target for good old Microsoft, if I ever saw one.

Re:Easy to forget (4, Informative)

Teun (17872) | more than 5 years ago | (#25425771)

Yep, Intel has like 60% market share.

And they have the worst performance.

And they have some of the easiest Linux support.

Re:Easy to forget (0)

Anonymous Coward | more than 5 years ago | (#25426491)

And GPGPU stuff just flies with glcopytexsubimage.

Re:Easy to forget (2, Funny)

omuls are tasty (1321759) | more than 5 years ago | (#25426713)

And they have some of the easiest Linux support.

That surely explains their 60% market share

Re:Easy to forget (0, Troll)

800DeadCCs (996359) | more than 4 years ago | (#25430135)

Yep, Intel has like 60% market share.

And they have the worst performance.

And they have some of the easiest Linux support.

Isn't that like saying most pedos go for 11 - 13 year olds, but at least they use condoms?

Intel video is so bad it should be criminal... can we fine them $1000 for each gma950 unit they ship?... please?

Re:Easy to forget (1)

xZgf6xHx2uhoAj9D (1160707) | more than 4 years ago | (#25432963)

Isn't that like saying most pedos go for 11 - 13 year olds, but at least they use condoms?

No. Just...no. I'm pretty sure it's nothing like saying that.

VIA (4, Interesting)

blind biker (1066130) | more than 5 years ago | (#25425113)

Looks like VIA is really serious about this whole x86 business - they are the little (compared to Intel and AMD) thorn in the side to the big boys. With so many bald decisions regarding their own x86 roadmap, it's a miracle they're still around!

What I mean is: AMD has been on the razor's edge for many years already, always in danger of unprofitability due to the thin or sometimes non-existent margins it accepted in order to keep up with the top dog. And AMD has a substantial slice of the x86 market, definitely way bigger than VIA's. Imagine what sort of creative management it takes for VIA to stay competitive.

S3's role in VIA's x86 plans could be crucial. I can definitely see them helping VIA into the emerging netbook market. Cheap and low-power is what VIA and S3 are good at, and that's exactly what netbooks are all about.

Re:VIA (1)

blind biker (1066130) | more than 5 years ago | (#25425133)

With so many bald decisions regarding their own x86 roadmap, it's a miracle they're still around!

Uh.... I shouldn't be posting minutes after waking up from an afternoon slumber :o)
Obviously, I meant bold decisions! Oh well - at least it's a funny typo!

Re:VIA (2, Interesting)

PineGreen (446635) | more than 5 years ago | (#25425493)

Incidentally, I put together a completely fanless audio system based around a VIA Esther at 1 GHz (the only noise-producing thing is the hard drive; I used a quiet Barracuda, and even that gets annoying after a while). The little box soon expanded into a torrent downloader, file and web server, and it's an incredibly stable platform. The best $400 I ever spent (it's actually more stable than the 8-core Xeon monster worth $8000 from last year that I use at work).

Re:VIA (1)

fedcb22 (1215744) | more than 5 years ago | (#25425973)

$8000? That's a crap-load for a machine, even an 8-core Xeon. What have you got in there?

Re:VIA (1)

zippthorne (748122) | more than 5 years ago | (#25429171)

A small bag of gemstones.

Re:VIA (1)

PineGreen (446635) | more than 5 years ago | (#25437275)

$8000? Thats a crap lot for a machine, even a 8 core Xeon. What you got in there?

One of the few pleasures of academia is that you get whatever machine you want... 8 cores at 3 GHz (which was the max last year), 8 GB RAM, 2x 750 GB RAIDed, a 30-inch screen driven by a decent video card... it adds up pretty quickly. :)

Re:VIA (1)

TheRaven64 (641858) | more than 5 years ago | (#25425677)

VIA is also an ARM licensee, but I've not seen any products from them based on ARM designs. I don't know if this is because they only sell to OEMs, or if it's because they don't exist.

Re:VIA (1)

hattig (47930) | more than 4 years ago | (#25430803)

VIA may use ARMs as components within their other chips. It's not too hard to imagine an ARM inside their Firewire controllers, for example. ARMs are so small and efficient they get put everywhere you need some CPU functionality these days.

Re:VIA (1)

Graymalkin (13732) | more than 4 years ago | (#25432283)

And AMD has a substantial slice of the x86 market, definitely way bigger than VIA. Imagine what sort of creative management it takes for VIA to stay competitive.

If a company sells enough widgets to be profitable it doesn't matter if they're 90% or .01% of the widget market. If their revenues exceed their expenses they're making money. VIA and AMD don't have to dominate the CPU market to remain competitive or profitable. Whether or not they're actually profitable and competitive is what is important.

OpenCL (1)

1053r (903458) | more than 5 years ago | (#25425137)

The linked wikipedia article says that OpenCL will be introduced in OS X Snow Leopard. According to a post by DigiShaman above, Nvidia cards are GPGPU with their Cuda API. I don't know much about the technology, but maybe this is what Apple had in mind when they put Nvidia cards in all the new Macbooks?

Re:OpenCL (1)

Joseph Lam (61951) | more than 5 years ago | (#25426031)

I think the whole point of OpenCL is that one doesn't need to be concerned about the underlying hardware. It can be a discrete Nvidia/ATI GPU, a chipset-integrated GPU, or even a hybrid multi-core CPU with graphics acceleration cores. I've read that both Nvidia and ATI/AMD have claimed they will support OpenCL in the future. From Apple's point of view it's a matter of choosing a vendor who can provide them with the best (by Apple's own standards) hardware solution.

Re:OpenCL (0)

Anonymous Coward | more than 5 years ago | (#25439179)

Yes, blabla, the usual "build binary on system/bytecode" story.

And it will have a problem for the same reason that its normal programming counterparts also failed to achieve dominance: complex runtime dependencies, tremendous overengineering for simple uses, and the need to release easily reverse-engineerable code.

Re:OpenCL (1)

aliquis (678370) | more than 5 years ago | (#25427385)

Yeah, Nvidia supports Cuda on everything above the 6000-series I think.

Two other reasons are probably that:
a) It's faster.
b) Transgaming's Cider software doesn't support Intel graphics, and now with this one, EA's "Mac games" such as C&C 3, Spore and such will run on the MacBook as well.

Who is VIA anyway? (0, Insightful)

Anonymous Coward | more than 5 years ago | (#25425221)

All I know about VIA is all my past bad experiences with their chipsets, slow CPUs (but low-power, I guess) and other garbage. About 5 years ago I swore off VIA completely. It's weird, I have always thought the same bad things about S3, and I didn't even know they were VIA till now. Now all I need to hear is that VIA owns or is owned by Goldstar and/or Acer and the circle of crap will be complete.

Re:Who is VIA anyway? (1)

Pax681 (1002592) | more than 5 years ago | (#25425695)

I once had a machine with the dreaded S3 Virge in it... what a piece o' crapioca that was... the day I got my Voodoo card and put it in, what a difference! Then another Voodoo was put in for dual-card loveliness! As I recall there was a wee cable at the back, called a loopback cable, used to run the cards together. I think Nvidia bought out 3Dfx, who made the Voodoo, to grab the tech, and this formed the basis of SLI. After the Voodoo I went onto a Riva TNT! And what a difference. To me, though, S3 will always represent the low-end, shitty end of the market, and VIA's chipsets will always be shitty. Do VIA chipsets STILL have a problem with USB?


Potentially interesting (2, Interesting)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#25425349)

I can't say I'm wildly optimistic about the likely power of an S3 GPGPU setup, given the history of S3 GPUs. On the other hand, because their performance is likely to be somewhat mediocre, and they certainly don't have the marketshare or power of someone like NVIDIA, they are more likely to do things like release documentation in order to attract development for their platform. In general, the dominant player has the greatest incentive to go it alone, keep things proprietary, and generally try to leverage their power, while the second stringers are much more likely to be helpful in their attempt to build marketshare.

WTF? (1)

4D6963 (933028) | more than 5 years ago | (#25425515)

Is it me or is there "S3FotoPro Enhancement" in TFA looks like nothing different than mere contrast adjustment?

Re:WTF? (1)

4D6963 (933028) | more than 5 years ago | (#25425529)

Crap crap crap and an extra slice of triple crap, I meant their not there... Damnit, the things tiredness makes you type..


Re:WTF? (1)

Teun (17872) | more than 5 years ago | (#25425801)

I'd sooner compare it with Gamma adjustment.

But does it run on Linux? :)

Re:WTF? (1)

Joseph Lam (61951) | more than 5 years ago | (#25425943)

I think the program simply performs some basic automatic tuning such as contrast, saturation and sharpness adjustments. It is not supposed to be a replacement for Adobe Photoshop/Lightroom with tons of sophisticated filters (which are mostly meaningless without manually entered parameters anyway). It's simply a demo app to show the RELATIVE performance gain one can get from GPU hardware acceleration. Even though the stuff likely still runs faster on my Xeon PC without acceleration than on a low-power VIA box with an S3 GPU, the improvement can still be significant for users with lower-powered CPUs.
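
As a rough idea of what per-pixel "automatic tuning" amounts to, here's a hedged CUDA-style sketch of a combined contrast and saturation pass. It's purely illustrative, not how S3FotoPro actually works, and every name in it is made up:

    // autotune.cu -- illustrative per-pixel contrast and saturation adjustment
    // for an RGB image stored as interleaved floats in [0, 1]. One thread per
    // pixel; contrast pivots around mid-grey, saturation around the pixel's luma.
    __global__ void adjustPixels(float *rgb, int numPixels,
                                 float contrast, float saturation)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= numPixels)
            return;

        float r = rgb[3 * i + 0];
        float g = rgb[3 * i + 1];
        float b = rgb[3 * i + 2];

        // Contrast: scale the distance from mid-grey.
        r = 0.5f + (r - 0.5f) * contrast;
        g = 0.5f + (g - 0.5f) * contrast;
        b = 0.5f + (b - 0.5f) * contrast;

        // Saturation: blend between the pixel's luma and its colour.
        float luma = 0.299f * r + 0.587f * g + 0.114f * b;
        r = luma + (r - luma) * saturation;
        g = luma + (g - luma) * saturation;
        b = luma + (b - luma) * saturation;

        rgb[3 * i + 0] = fminf(fmaxf(r, 0.0f), 1.0f);   // clamp back to [0, 1]
        rgb[3 * i + 1] = fminf(fmaxf(g, 0.0f), 1.0f);
        rgb[3 * i + 2] = fminf(fmaxf(b, 0.0f), 1.0f);
    }

    // Launch sketch: adjustPixels<<<(numPixels + 255) / 256, 256>>>(d_rgb, numPixels, 1.1f, 1.2f);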

But does it run Linux? (1, Insightful)

Anonymous Coward | more than 5 years ago | (#25425539)

No, I'm not trying to be funny (or annoying, if you prefer).

I'm seriously asking.

So far, S3 has the worst performance and support for Linux. At least VIA is beginning to open their drivers:
http://linux.via.com.tw/support/downloadFiles.action

For me at least, even if S3 starts a video revolution, I will stick to NVidia, ATI and even Intel until I hear about decent Linux performance.

(Sorry about the link not being clickable. It seems that ACs can't post in HTML, and I don't need an account just yet)

Wasn't Nvidia and Diamond the first to bring us 3D? (1)

Jackie_Chan_Fan (730745) | more than 5 years ago | (#25425811)

If I remember correctly, it was Diamond that first brought us the Diamond EDGE cards, which used Nvidia's first 3D chip. It was the same chip used in the Sega Saturn.

I believe the S3 Virge came after. As far as I can remember... the Diamond EDGE PC cards were the first 3D accelerator cards. ... And that would make Nvidia first, no?

After the Edge flopped, Matrox gave us the Millenia cards, which had terrible 3D support. Voodoo then took the entire market for a while until Nvidia launched the TNT2, which was a great card for its time.

Voodoo held on to the lead for a while longer, but ultimately we all know what happened and Nvidia became king. ATI is still very solid, but Nvidia has better OpenGL support and their cards just perform a lot better in 3D animation production software.

Re:Wasn't Nvidia and Diamond the first to bring us (1)

Shark (78448) | more than 5 years ago | (#25429599)

Actually, my Matrox Impression Plus had 3D (OpenGL) acceleration. That was um... 1994?

The tricky thing then was to find *anything* but the three (crappy) Matrox-made sample games that would use it ;) It was a real killer in 3D Studio and CAD, though.

ATI Rage 3D (0)

Anonymous Coward | more than 5 years ago | (#25429933)

The original Rage chip is pretty old. I believe it was available on a VLB card and that win95 shipped with drivers for it.

Anyway, Sega designed their own chips for the Saturn. The early nVidia chip was only related in that it also rendered quadrilaterals rather than triangles.

I think the Matrox G200 and G400 chips were competitive: the G200 was a little faster than a Rage Pro, and the G400 was about as fast as a TNT2. I don't know when the S3 Virge showed up, but I know that it was used on an Amiga VGA board (the Cybervision 3D) and, like its PC counterpart, was notorious for having slower 2D performance than the S3 Trio64.

Re:ATI Rage 3D (1)

Jackie_Chan_Fan (730745) | more than 5 years ago | (#25456503)

You know what confused me about the Sega Saturn history is that I remember Diamond brought out the EDGE cards, which I swear came before the ATI Rage cards, but I would have to look that up to be sure... Anyway, I remember that Diamond advertised the Edge with Virtua Fighter from Sega. The Diamond Edge was the first Nvidia chip. That's where I got mixed up... but it turns out Sega did have a partnership with Nvidia during that period, and NV1 chips ended up in Sega's arcade boards.

That's where my confusion is... I found an interesting article on the whole situation at:

http://www.firingsquad.com/features/nv2/ [firingsquad.com]

After the Matrox Millenia (which was a great card, but a terrible 3D card at the time) I pretty much gave up on Matrox for 3D and jumped on the Voodoo bandwagon. Then I ended up with a dev kit of the TNT2 and I've pretty much relied on Nvidia since, although I've owned some of the ATI cards and still own a fairly new one in one of my PCs. Nvidia does OpenGL much better, and I need OpenGL.

Anyways... that article was an interesting refresher.

S3? (0)

Anonymous Coward | more than 5 years ago | (#25425989)

Solid Snake Simulation?

Re:S3? (0)

Anonymous Coward | more than 5 years ago | (#25426141)

No, Selection for Societal Sanity

Again, not GPGPU (0)

Anonymous Coward | more than 5 years ago | (#25426077)

The only proposed usage that is even remotely general-purpose is the physics thing, and even then if the physics capabilities are tuned for graphical processing rather than general vector handling, that may or may not qualify as GP.

Overall, this sounds like another misuse of the GPGPU moniker. GPGPU does not mean using GPU acceleration for non-game apps. It means using GPU like a CPU, i.e. for completely arbitrary purposes like implementing traditional data structures (sorting lists and that kind of thing) that have nothing to do with generating visuals and may not even be visualized data. That you can use GPUs for something other than games is nothing new. The only thing that is kind of new is that this functionality used to be the exclusive domain of high-end workstations and production-level graphics cards.

The next time you are about to post a story with "GPGPU" in it, ask yourself, can this technology be applied to a text editor? If not, it's probably not GPGPU.
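
For anyone who hasn't seen "sorting lists on a GPU" in practice, the textbook example is a bitonic sort, where every pass is just a big batch of independent compare-and-swaps. A hedged CUDA-style sketch of one such pass (standard algorithm, illustrative names, power-of-two array length assumed):

    // bitonic_step.cu -- one compare-and-swap pass of a bitonic sort.
    // Launched repeatedly from the host with varying (j, k); each thread
    // owns one element and conditionally swaps with its partner at index i ^ j.
    __global__ void bitonicStep(float *data, int j, int k)
    {
        unsigned int i = blockIdx.x * blockDim.x + threadIdx.x;
        unsigned int partner = i ^ j;
        if (partner <= i)
            return;                                  // only one thread of each pair swaps

        bool ascending = ((i & k) == 0);             // direction of this subsequence
        if ((ascending && data[i] > data[partner]) ||
            (!ascending && data[i] < data[partner])) {
            float tmp = data[i];
            data[i] = data[partner];
            data[partner] = tmp;
        }
    }

    // Host sketch, for n a power of two (>= 256 here) with one thread per element:
    //   for (int k = 2; k <= n; k <<= 1)
    //       for (int j = k >> 1; j > 0; j >>= 1)
    //           bitonicStep<<<n / 256, 256>>>(d_data, j, k);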

Does it compile and run C? (0)

Anonymous Coward | more than 5 years ago | (#25426089)

That's all great and all, but we do not use anything unless we can write stuff in C that will run on it.

Seriously - we recently started using nvidia chips because CUDA is very much C.

Also, we just started to use FPGAs, because we can write algorithms in C and compile them to HDL or a netlist (look up FPGAC on SourceForge).

On the software front - we just started looking into a website interface to our hardware because we ran into the Wt (http://www.webtoolkit.eu/wt) library. This way we can reuse our C/C++ knowledge and write websites.

So, again, does it run C?

Because if it doesn't, or if it's some weird, sorry excuse for C (think Brook or Objective-C), we are sticking with CUDA.

Re:Does it compile and run C? (0)

Anonymous Coward | more than 4 years ago | (#25431199)

Are you such shitty software engineers that you can't pick up another language like Verilog, or shock horror, assembler?

Pick the right tool for the job even if you need to learn how to use it first, don't try to use the tool you know when it isn't suitable.

My dream job? Working for S3! (1)

gozu (541069) | more than 5 years ago | (#25426105)

I want to work as a GPU designer for S3 and put my heart and soul into a product that will be laughably pathetic compared to nVidia and ATi's offerings.

I also want to fight two MMA champs at the same time, just so I can push my body to the limit and get utterly humiliated and destroyed anyways by two laughing guys drinking beer while they are beating me up.

Re:My dream job? Working for S3! (1)

jensend (71114) | more than 5 years ago | (#25426947)

I think you must mean you want to work for S3's driver team; it's the drivers that are laughable. S3's hardware, like many other things VIA and its subsidiaries have come up with (such as the Envy and the Nano), is good engineering.

S3 cards have done fairly well in benchmarks compared to parts from nV and AMD/ATI of the same classes (midrange and low end; S3 doesn't bother with the high end), an impressive achievement considering that 800-lb gorilla Intel has consistently failed to come anywhere close to the low-end parts from ATI and nV. IIRC, they also usually have a smaller silicon budget, lower power consumption, and narrower memory interfaces than their cards' ATI/nV competitors, which makes it even more impressive and shows that the designs have room to grow performance-wise.

However, VIA keeps dropping the ball in getting perfectly good products from the design-finished/announcement stage to the production and sales stage, and in doing the other things (driver development, marketing, going after design wins and the OEM market, doing what it takes to get developers to code with their products in mind, etc.) that are necessary for any product, no matter how well engineered, to really take off. With this tendency it is amazing (especially since their Intel and AMD chipset business dried up) that VIA is still surviving.

So, S3 & aspiring F/OSS developers ... (1)

ReedYoung (1282222) | more than 5 years ago | (#25437717)

... have a lot to offer and a lot of potential benefit to realize from one another: agree or disagree? If "agree," any reliable info on why S3/VIA have not thus far been more open to assistance from the Open Source community?

Without adoption of OpenCL by S3 and it's... (1)

tyrione (134248) | more than 5 years ago | (#25427509)

DOA.

Imagine an inexpensive netbook with (1)

electrogeist (1345919) | more than 5 years ago | (#25427719)

with HD video acceleration and decent 3D for compiz eye candy / simple games.

If they release good quality linux drivers I think S3 could make a name for themselves once again. The netbook and low-end pc market is growing. AFAIK even nVidia does not have any kind of HD video acceleration with their linux drivers.

Otherwise most windows geeks already have a preference of either nVidia or ATI, so this would largely go ignored and only end up in the laps of those who don't know the difference or who just got the latest hot deal.

my first thought (1)

mAriuZ (264339) | more than 5 years ago | (#25451495)

was that Amazon S3 was using GPGPU
