Slashdot: News for Nerds


VIA and NVIDIA Working Together For PC Design

Soulskill posted more than 6 years ago | from the teaming-up-before-intel-drops-the-atom-bomb dept.

Graphics 93

Vigile writes "With AMD buying up ATI and Intel working on their own discrete graphics core, it makes sense for NVIDIA and VIA to partner together. It might be surprising, though, that rather than see the rumors of NVIDIA buying VIA come true, the two companies instead agreed to 'partner' on creating a balanced PC design around VIA's Nano processor and NVIDIA's mid-range discrete graphics cards. During a press event in Taiwan, VIA showed Bioshock and Crysis running on the combined platform. They also took the time to introduce a revision to the mini-ITX standard, which Intel has adopted for Atom, that pushes an open hardware and software platform design rather than the ultra-controlled version that Intel is offering."


93 comments

once more... (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23692633)

the video game industry is the one pushing the development of computing!

Re:once more... (0)

Anonymous Coward | more than 6 years ago | (#23692641)

First Post is supposed to be reserved for Anonymous Cowards you insensitive clod!

Re:once more... (0)

Anonymous Coward | more than 6 years ago | (#23692841)

First post was claimed by an Anonymous Coward...

Re:once more... (3, Insightful)

sayfawa (1099071) | more than 6 years ago | (#23693015)

the porn industry is the one pushing the development of computing!

There, fixed.
What, you haven't been watching a lot of CG porn recently? I'm not alone.. am I? :)

Re:once more... (1)

stonecypher (118140) | more than 6 years ago | (#23693145)

What, you haven't been watching a lot of CG porn recently?
Not yet. The broken physics still bothers me. I can feel it getting closer, though.

Two years, if it keeps progressing as it has lately. Maybe less - surely the money will start ramping up soon.

Re:once more... (2, Interesting)

Smauler (915644) | more than 6 years ago | (#23693309)

The major problem with CGI pron is that Uncanny Valley [wikipedia.org] can take on a whole new meaning.

Re:once more... (0)

Anonymous Coward | more than 6 years ago | (#23695507)

Uncanny Glory Hole?

Re:once more... (1)

(Score.5, Interestin (865513) | more than 6 years ago | (#23698467)

The major problem with CGI pron is that Uncanny Valley can take on a whole new meaning.
If they stuck to modelling the physics of silicone instead of real flesh, that problem would be quickly solved.

Re:once more... (1)

ady1 (873490) | more than 6 years ago | (#23694105)

the porn industry is the one pushing the development of internet!

There, fixed.
And yeah, you are :P

Re:once more... (1)

negRo_slim (636783) | more than 6 years ago | (#23698451)

the porn industry is the one pushing the development of computing!
But has anyone ever found reliable data indicating the porn industry's revenues? I know everyone quotes billions, but I'd love to know just how powerful an industry it really is.

Re:once more... (1)

mabhatter654 (561290) | more than 6 years ago | (#23694351)

as opposed to Intel's corporate agenda that tanks gaming because "businesses" don't "need" it. Both AMD and Nvidia/VIA will put more balanced machines out there. I personally can't get over the Apple/Intel Macbook fiasco... that a $1200 Macbook (with the fastest dual core processors out there) plays games like WoW like crap... an old iBook will play better than Macbooks! But it's "cheaper" for Intel to hose its partners with integrated graphics, and eventually the game industry will change how they make games so they can run on 4-year-old Intel crapware.

If you have a good graphics chip (a modern, true DX9 card), you only need 1.5 GHz or so to run the average PC game. AI can be trimmed down, models can be made with fewer polys, physics can be simplified, but things like T&L just can't be done quickly by a general purpose CPU... you'd have to throw a whole 2 GHz CPU core at what a 3-year-old graphics card does... what a waste of resources.

Re:once more... (0)

Anonymous Coward | more than 6 years ago | (#23694827)

Apparently you don't play "the average modern" PC game. 1.5GHz will not cut it.

Re:once more... (1)

GigaplexNZ (1233886) | more than 6 years ago | (#23697605)

A Core 2 Duo running at around 1.5GHz is sufficient to run the average modern game if partnered with a capable graphics card.

Re:once more... (1)

negRo_slim (636783) | more than 6 years ago | (#23698529)

as opposed to Intel's corporate agenda that tanks gaming because "businesses" don't "need" it.
Exactly, businesses do not need advanced 3D acceleration. They are businesses; the bottom line is king (increased hardware costs, lost productivity, etc.).

Both AMD and Nvidia/VIA will put more balanced machines out there.
That is an incorrect statement, since none of those companies actually sells working "machines". Instead they develop and sell components, which, aside from VIA's, are all generally comparable with one another in terms of price and performance.

that a $1200 Macbook (with the fastest dual core processors out there) play games like WoW like crap
Well that's a given. Proper 3D acceleration is a must for any modern game. It should be noted my girlfriend has proven to be an able tank for my alt priest while playing on a 750MHz AMD Duron with 1GB of memory and an nVidia GeForce 6600LE AGP card, while I happily played my character to end game (name:mithrilvar server:moonrunner 70/war/a) on a 1GHz Intel Pentium 3 with 768MB RAM and a 6600LE video card.

But it's "cheaper" for Intel to hose it's partners with the integrated graphic and eventually the game industry will change how they make games so they can run on 4 year old intel crapware.
Well yes, it's cheaper, but that's the name of the game. Even the very latest IGPs (from nvidia/ati) are only 'passable' when it comes to current gaming trends. So what would be the point, when even the very best on offer pales in comparison to a dedicated card?

In regards to your theory that they are holding out for software developers to adapt to their slower hardware... well, that would be great if it were true, as it would let developers concentrate on gameplay. All of my favorite games run on dramatically older hardware (Civ3/4, SimCity 4, Serious Sam, Baldur's Gate, etc.) and I know I'd love to see more games of a similar caliber ;)

If you have a good graphics chip, you only need 1.5 GHz or so to run the average PC game if you have a modern true DX9 card in the box
I would say something rated 2.0GHz these days, as my experiences with an ATI X1800 XT (one of the fastest DX9 single-card solutions to have been released) have shown me. And dual core is a must. Of course, if you buy a nice DX10 card I'm sure you could get by with 1.5GHz, but with prices the way they are, why short yourself and leave your video card bottlenecked by the CPU?

Low watt, high performance? Seg fault (3, Insightful)

clowds (954575) | more than 6 years ago | (#23692649)

It would be grand to be able to buy a low watt, small box gaming machine that doesn't require 6 fans to keep it cool.

However, with the way things are at the moment in the PC gamespace, I'd be pretty cautious about expecting any decent performance, even with their Crysis and Bioshock demos.

I do miss the days when games had 128 multiplayer maps, ran well on cheap $200 video cards, and had more story rather than the shinies, but I guess that's progress for you. *sigh*

Re:Low watt, high performance? Seg fault (1)

PopeRatzo (965947) | more than 6 years ago | (#23692799)

Man, I've never paid $200 for a video card and I've played most of the big games that have come along.

I wouldn't be so sure nVidia and VIA aren't on to something here.

Re:Low watt, high performance? Seg fault (1)

skoaldipper (752281) | more than 6 years ago | (#23695455)

$250 for mine, and I can power a small submarine using my DVI connector.

Re:Low watt, high performance? Seg fault (1)

negRo_slim (636783) | more than 6 years ago | (#23698535)

$200 for mine, an ATI model... ran the auto-overclock feature in the drivers... it reported back 92°C. It's nice to be able to boil my ramen noodles while waiting for the next level to load.

Re:Low watt, high performance? Seg fault (1)

SacredByte (1122105) | more than 6 years ago | (#23696613)

I've never spent a single cent on a graphics card. I got my 8800 the old fashioned way -- trade/barter. And don't even get me started on soundcards -- TF2 FTW! All things considered, I think VGXPO last year was a very profitable experience for me.... Anyway, on the subject of older machines, I have a Dell Latitude D600 (2 GHz Dothan, 1GB, 160GB, 32MB ATi MR 9000), and the only part that really holds it back from running just about all games released up to about the middle of last year is the video card.

Re:Low watt, high performance? Seg fault (1)

ejecta (1167015) | more than 6 years ago | (#23698931)

After buying my first computer, I've never paid more than $40 for a video card, and that's in AUD! But then I am still happy playing Total Annihilation across the network with the Total Annihilation Works Project Mod.

Shiny graphics are a waste - gameplay is king imho.

Re:Low watt, high performance? Seg fault (1)

Sj0 (472011) | more than 6 years ago | (#23701327)

Man, I've never paid $200 for a video card and I've played most of the big games that have come along.

Seconded. I've always been able to play the latest games by taking advantage of the fact that you don't need the latest and greatest hardware. A $100 budget video card will give you all the features you need, at a fraction of what the same card cost a year or two earlier.

Re:Low watt, high performance? Seg fault (1)

Alwin Henseler (640539) | more than 6 years ago | (#23692865)

It would be grand to be able to buy a low watt, small box gaming machine that doesn't require 6 fans to keep it cool. However, with the way things are at the moment in the pc gamespace, I'd be pretty cautious expecting any decent performance, even with their Crysis and Bioshock demoes.
Who cares? The latest games are always written to 'barely' run on top-of-the-line consumer PCs. But who needs a box like that? For myself, I normally put together a box that is a year or so behind the latest tech, if not more. That way, you can run almost anything you want at a fraction of what it would cost to buy the latest & greatest. Right now I've got a box that is 2-3 years old, but it still meets minimum specs for Crysis and Bioshock. I don't expect either of those to run smoothly, but I could run them if I wanted to. Doom 3 runs fine (at 1600x1200), and any game released before that runs super-smooth.

Basically, you can buy a certain 'class of performance' to run a certain 'class of applications'. You can get it fast, cheap or small. Pick 2 of those. My current machine qualifies as 'fast enough, cheap', my next build will probably be 'how much gaming performance can I cram into a breadbox?'.

So yes, you can say "it won't run the latest games". But along the same lines you can say "yes, but it doesn't have 8 cores, 16 GB RAM and 10 TB disk space". Do you really need that? Of course not; you just pick what you can afford and best fits your needs.

Re:Low watt, high performance? Seg fault (1)

maxume (22995) | more than 6 years ago | (#23693069)

I'm pretty sure that anybody who has *anything* they would rather do with their money should be following your strategy. And I mean stuff like "buy a 45-year supply of non-perishable food", "get that pony you've always wanted" and other such nonsense.

I'm glad somebody is buying cutting edge computer systems and driving the technology forward, but not as glad as I am that it isn't me.

Re:Low watt, high performance? Seg fault (2, Interesting)

Smauler (915644) | more than 6 years ago | (#23693379)

Whenever I build a new PC for myself, I buy just below the best. My previous PC lasted me 5 years or so, with a Ti4200 (a powerhouse of a card, which will still outperform some modern 256MB cards). I'm hoping I can last as long with my current 8800GT, though I'm guessing it may be more like 4 years before I replace it.

That's the thing about buying a decent spec PC - if you buy well, you will not have to replace it for years, and you'll be able to play just about anything because everyone else is buying generic PCs which have crap graphics cards in them every year or two. YMMV, as always.

Re:Low watt, high performance? Seg fault (1)

KillerBob (217953) | more than 6 years ago | (#23693865)

*nods* I'm much the same as you are in that respect... but I've recently found my desires have changed as far as computing goes. Basically... there's exactly one computer game that I want to play any more, and its requirements are not even close to stunning. All it asks for, at a minimum, is an Intel 945 chipset, nVidia GeForce 6600, or ATI Radeon 9500 or better, coupled with 512MB of RAM, and an 800MHz P3 or Athlon. Show me *any* new computer that you can buy today which doesn't meet those specs. And it runs under Linux and MacOS.

So I've been slowly switching all of the computers I use over to laptops. And my gaming is done on the Nintendo Wii. With a console, there are two major advantages: you only have to spend $200 to buy one, and you know that for several years, you'll never have to upgrade. I buy a $500 laptop, install Linux, and I'm off to the races, with a great computer that'll do me for several years. And I'm no longer in the upgrade hell cycle that you see in PC gaming.

Re:Low watt, high performance? Seg fault (1)

mrchaotica (681592) | more than 6 years ago | (#23696001)

a ti4200 (a powerhouse of a card, which will still outperform some modern 256mb cards)

If and only if the games in question don't use fancy shaders and whatnot that the 4200 doesn't have hardware support for, that is.

Re:Low watt, high performance? Seg fault (1)

negRo_slim (636783) | more than 6 years ago | (#23698545)

Shaders? Z-buffers? Mip mapping? Back in my day we had to walk from system memory to frame buffer uphill and in the snow both ways!

Re:Low watt, high performance? Seg fault (1)

Smauler (915644) | more than 6 years ago | (#23699265)

I know, that's one of the main reasons I got a new PC. That said, the Ti4200 would run games like Doom 3 [digital-daily.com] (un?)happily, though I never got it. I wasn't going to just upgrade the graphics card because it was AGP, and the entire system was crap by that point anyway.

Re:Low watt, high performance? Seg fault (1)

mrchaotica (681592) | more than 6 years ago | (#23695973)

I do miss the days when games had 128 multiplayer maps, ran on cheap $200 video cards well and had more story

I run The Orange Box games and Oblivion on a ~$100 GeForce 8600 GT just fine at high settings and 1680x1050 resolution (although without antialiasing); what's the problem?

well duh! (0)

naz404 (1282810) | more than 6 years ago | (#23692651)

NVIDIAVIA would be one hell of an ugly name!

It makes that monstrosity Macradobia [adobe.com] look like Scarlett Johansson by comparison!

Re:well duh! (0)

naz404 (1282810) | more than 6 years ago | (#23692655)

It's almost as bad as Eych Tee Tee Pee Colon Slash Slash Slash Dot Dot Org!

Re:well duh! (0)

FurtiveGlancer (1274746) | more than 6 years ago | (#23692751)

NVIDIAVIA would be one hell of an ugly name!
Is it any worse than NVIADIA or VIANVIDA? One could use this situation to define "lose-lose proposition" on Wikipedia.

Re:well duh! (0)

Anonymous Coward | more than 6 years ago | (#23696759)

You guys would never make it in marketing. It would be nViVIA, of course.

Re:well duh! (0)

Anonymous Coward | more than 6 years ago | (#23692765)

NVia?


May the best chip win! (5, Interesting)

aceofspades1217 (1267996) | more than 6 years ago | (#23692657)

Competition can't hurt. Now that we have Intel, AMD/ATI, and Nvidia/VIA all throwing their hats in the ring, it will keep prices down and of course spur innovation, considering it's a race to find the best technology. Personally I would like to see Intel taken off its high horse, considering it delayed all their 45nm production just so they could sell out their older chips. Of course they were able to do that because AMD is so behind in the 45nm race.

So great, hopefully we will see some real progress and we can have affordable laptops with OK power. Right now most normal laptops have integrated chips (you can't really fit a video card into a normal laptop), and of course the integrated card is horrible. The integrated card (at least in my laptop) also sucks up all the power and cuts my battery life to a third. It also overheats.

So yea it would be great if we could have decent video processing on normal mass market laptops.

Good Luck and may the best chip win!

Re:May the best chip win! (1)

Cal Paterson (881180) | more than 6 years ago | (#23692833)

Was anyone saying that competition would hurt?

Re:May the best chip win! (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23692857)

No...but I just said it wouldn't :P

Considering mine was the first post, I'm pretty sure no one else said that.

I am just saying that any competition is good. Heck if Microsoft wanted to make a chip that would be cool :P All I'm saying is the more the marrier. Now if only we had competition between our telecoms.

Re:May the best chip win! (1)

Homer's Donuts (838704) | more than 6 years ago | (#23699561)

No...but I just said it wouldn't :P

Considering I was the first post pretty sure noone else said that.

I am just saying that any competition is good. Heck if Microsoft wanted to make a chip that would be cool :P All I'm saying is the more the marrier. Now if only we had competition between our telecoms.
Microsoft?

I wouldn't marrier!

Re:May the best chip win! (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23703543)

I was making a point. I mean, seriously, I would like it if any company started making chips, even the horrible, despicable, and evil Microsoft :P

THREE is not enough! (2, Insightful)

PopeRatzo (965947) | more than 6 years ago | (#23692919)

I don't know about you all, but I'm not sure three entities making all the processing hardware is enough.

Whenever I see these "strategic partnerships" which basically means "mergers so the DOJ won't notice", I think about what's happened to the airlines and the oil companies (oh and telecom). Going in different directions, they are, but the consumers are getting screwed all around when these big outfits team up.

Re:THREE is not enough! (2, Interesting)

aceofspades1217 (1267996) | more than 6 years ago | (#23693297)

I wouldn't really call this a merger. VIA is a processor maker and nvidia is a GPU maker; obviously Nvidia's all-in-one chip wouldn't be viable on its own. VIA makes much cheaper and lower-power chips, so for these purposes they have a leg up on Intel. The whole point is you don't really need (or want, for that matter) a fast, monster processor for a smart phone. But there could be obvious benefits to having a kickass video card... playing video. The same also applies to ultra-portable laptops. You're not going to be using it for gaming (but you can if you want), but it's quite possible that you're going to be watching video on your ultra-portable. Also the ability to output 1080p is insane.

Right now I just want to see Intel get off their high horse. Even though they haven't really done anything wrong and they are ahead for a reason (they made 45nm chips), I think that someone needs to keep Intel in check and make sure they know that the only way for them to stay ahead is to further their technology and provide good value (45nm chips are really good value :P).

And yes, three is not enough. It's never enough. Competition is good. You know why the margins on food are paper thin? Because there is a Publix, Wal-Mart, Target, Albertsons, and Winn-Dixie on every corner.

Re:THREE is not enough! (1)

Kjella (173770) | more than 6 years ago | (#23696993)

I don't know about you all, but I'm not sure three entities making all the processing hardware is enough.
Well, what would you like to do about it? It's not because of lack of innovation; the problem is rather that the tick-tocks going on cost billions and billions and billions. For one, smaller companies couldn't afford them, and even for a company big enough, the cost would have to be passed to the consumer as substantially increased prices. The only other option I see is compulsory licensing, which would bring a host of problems of its own, as they'd lose their biggest incentive to improve the hardware.

Besides, don't forget that they're also competing against themselves. While a few have had hardware failure, I have 10+ year old machines that still work and I doubt Intel would be very happy if I said that because of their saggy lack of innovation and exorbitant prices I won't buy another until 2020 at least. Honestly, if I could get full 1080p hardware acceleration in Linux before AMD runs out of steam I'd be set for quite some time even if all else goes to shit. I might have to hand in my geek certificate over this, but I'm actually very content with my computer now and don't see any particular reason to upgrade, which is quite a change from the past.

Re:THREE is not enough! (1)

PopeRatzo (965947) | more than 6 years ago | (#23702673)

the problem is rather that the tick-tocks going on cost billions and billions and billions.
So, you think that there are only three corporate entities in the world that can put together "billions and billions"?

The problem is that even if there were EIGHT companies making processing hardware today, by next Friday there would be three again because our system rewards consolidation at the expense of innovation and certainly against the best interest of the consumer.

The system is rigged in favor of the corporation over the consumer. Things will continue to get more fucked up as long as this imbalance exists. It has nothing to do with supply and demand, the free market or any other business school fantasy. It is strictly about the rule: Power Wants More.

Re:May the best chip win! (1)

renoX (11677) | more than 6 years ago | (#23693075)

>Competition can't hurt.

For users, no. But AMD hasn't been doing well for a long time, and investors don't like to lose money, so who knows how long they're still going to compete with Intel?

Re:May the best chip win! (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23693351)

Yes AMD needs to get their shit in gear. They are getting their ass kicked by Intel (and rightfully so). Their chips are still in the dark ages and they are just barely making quad cores and more importantly 45nm processors. I pity the fool who buys a 300 dollar 65nm processor.

Now if Nvidia and VIA get in the game and supplant AMD, or at least take a small niche, it will keep Intel on their toes... without competition, Intel could become like Microsoft: they could artificially set their prices.

Dark Ages? I don't think so. (1)

DrYak (748999) | more than 6 years ago | (#23696703)

Their chips are still in the dark ages and they are just barely making quad cores and more importantly 45nm processors.
Their chips aren't Dark Ages at all. There are a lot of clever tricks inside, the main one being moving the northbridge functions (the memory controller) into the CPU itself and using an open standard (HyperTransport) to communicate with the rest of the world, in addition to other technical feats such as split power planes, *real* quad cores, and an unganged-mode bus.

Intel is only catching up with that now (with their future QuickPath technology, and it's hardly an *open* standard, given the fights with nVidia over the right to implement it).

The main problem with AMD is that they only have brilliant ideas, but no advantage in fab processes. Intel is always one step ahead in the race for a smaller process.

(But still, being one process generation late and still having competitive offerings in the mid-range category shows that their ideas are indeed clever.)

Now if Nvidia and VIA get in the game and supplant AMD
That's going to be a sad day for open source supporters. NVidia was never open source friendly, and VIA is hardly in a position to influence them. And we already lost an open source supporter in the GPU maker battles a couple of years ago (3dfx was a big proponent of the DRI infrastructure, for example).

Re:May the best chip win! (1)

Kjella (173770) | more than 6 years ago | (#23693617)

Personally I would like to see Intel taken off its high horse, considering it delayed all their 45nm production just so they could sell out their older chips.
Umm, what? Yes, they've kept their prices high and 45nm only in the high end to get rid of 65nm stock, but Intel would like nothing more than to switch to 45nm as fast as possible. The chips get considerably smaller and thus cheaper to produce, which translates directly into higher margins. Plus they get all the premium of being alone in the high-end market, another good reason to keep them stocked. I'm sure there's a lot you can blame Intel for, but I don't think this is one of them...

Re:May the best chip win! (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23693697)

Well obviously they aren't making the 65nm ones (for good reason) but they wouldn't have done that to begin with if AMD had released its 45nm chips.

But that's not the point. I have Intel stuff; I love Intel. They compete hard, but I feel that now that AMD has slipped out of the consumer market, Intel is not getting as much competition. Without competition, Intel can sell their chips at whatever price they feel like. I just think someone needs to keep Intel on their toes.

And I'm not saying anything negative against intel. They are ahead not because they bought out someone or because they are a monopoly but because they have really good products and much better manufacturing techniques.

Re:May the best chip win! (1)

Z80xxc! (1111479) | more than 6 years ago | (#23694957)

The 45nm chips are not only the super-high-end chips. Sure, you don't exactly see Celerons at 45nm yet, but a Core 2 Duo E7200 can be had for $130, and that's 45nm. I'm sure that before long, most if not all Intel chips will be of the 45nm variety.

Open source imbalance (1)

DrYak (748999) | more than 6 years ago | (#23696621)

The problem is:

- Intel has always been a strong pusher for open source (see their graphics drivers as an example).

- AMD has too (AMD64 Linux was released before the actual processor, thanks to massive help from them, and thanks to Transmeta's code-morphing to help test before the chips came to life).
And since they acquired ATI, Radeons have seen a lot of open-source effort (before the acquisition it was mostly reverse engineering; now AMD is slowly releasing the necessary documentation so open source drivers can be written).

(In addition, ATI Radeon chips benefit from some technology available to AMD CPUs, like a smaller process and efficient low power, to keep them competitive.)

- But with VIA/nVidia the situation is reversed.
VIA is the one slowly catching up with the open source movement (they have started releasing documentation for their integrated graphics cores).
But the 500 lb gorilla interested in buying them is nVidia, who has done absolutely nothing to help the open-source movement. Project Nouveau still relies only on reverse engineering. If nVidia buys VIA, there's even a chance that they will choke off VIA's previous open source efforts.

So for players there might be some interest (a cheap, low-power machine that can still run Vista and play games, as touted recently by nVidia).
But for the penguin fans this might be a setback on the road to liberated drivers.

Re:Open source imbalance (1)

Insanity Defense (1232008) | more than 6 years ago | (#23699369)

If nVidia buys VIA maybe there's even a chance that they choke VIA's previous open source effort.

I don't think that is likely. It would be more likely to go the other way.

My information is not up to date, but I did some reading up on VIA a few years ago. They were then owned by a larger conglomerate that makes a lot of interlinking things: other branches make motherboards and the material used to encase CPUs and other integrated circuits, to name two that I recall. Collectively they are much more powerful than VIA is alone.

Re:May the best chip win! (1)

sweet_petunias_full_ (1091547) | more than 6 years ago | (#23698913)

In this corner: AMD/ATI with the next generation of integrated video with their first pass at pairing up the CPU/GPU and claiming a 3x speedup (over previous integrated graphics solutions, most likely). Their second pass, if they stick the two onto a single die, should be much more interesting, as it should cost even less, use less power, and, if they do enough of a redesign should actually go even faster. I'm keeping my fingers crossed that this combination will result in a fully open system.

In the next corner: VIA and Nvidia, who claimed that they have a big can of "whoop-ass" for Intel, but which, for the moment, sounds like it is empty except for a scrap of paper with the observation that they can put together a system with a small processor and a big video card that is usable enough for most things and can be sold cheaply. Alas, Nvidia can't open source their drivers, and this can only hold them back in the long term compared to AMD/ATI and Intel. As for laptops, I'm less optimistic about this team producing something with long battery life.

In the other corner is Intel, who can afford to sit on its perch and see what the market does before reacting. According to the benchmark results in the article, the Atom falls short in system performance compared to VIA's new offering. Intel could fix that by allowing the Atom to be paired with a faster graphics chip, but if it did, the Atom would surely eat into its profits in another market segment. Intel has opened its graphics drivers, but since its graphics solutions are low-performance by comparison, these don't reveal any tricks that nVidia or ATI would want to use (and maybe that's by design). However, they can't rest on their laurels forever; eventually the lack of GPU performance may come back to haunt them.

NVidia may be worried that if it opened its high-performance drivers, Intel and ATI would learn something. On the other hand, Intel must be worried about that can of "whoop-ass" if it comes in the form of low-end competition, the sort of competition it can't answer without throwing its segmenting scheme into disarray. If the first two partnerships both successfully attack the low end and undercut Intel's high-end offerings (and people do seem to be buying a lot of subnotebooks lately), Intel will have to cut prices and reduce profitability. In the long term, it'll have to try to take back the low end, but with what, we don't know yet.

OK, OK, OK!!! (2, Funny)

crhylove (205956) | more than 6 years ago | (#23692667)

You got my attention! Now price and availability? Is it multicore? Can I do this:

http://tech.slashdot.org/article.pl?sid=08/05/31/1633214&from=rss [slashdot.org]

?

Price and availability? Can we have a laptop that has a draw-on, Wacom-style touch screen where the keyboard is? I still want a keyboard, though; I guess I could use a docking station with a monitor. Can I get it EEE-sized and EEE-cheap and put 6 of 'em together in a custom Beowulf cluster that grows and shrinks as the various laptops enter and exit the house over WiFi N, or WiMAX, or whatever?

Re:OK, OK, OK!!! (1)

naz404 (1282810) | more than 6 years ago | (#23692693)

Actually, I hate the way processors are getting faster and faster and hungrier and hungrier.

I mean, all this speed is sheer abuse! Back then I was perfectly fine with 233MHz + 128MB RAM with Win98 and Diablo II!

All this speed is just a waste of battery life!

If they can give me a PIII-500 equivalent processor + 256MB RAM + 2GB hard drive + 640x480 + Ubuntu Lite + 10 hours battery life at $100-$200, that would be totally teh sweet!

Retrogames and emulators on Linux for TEH WIN!

Re:OK, OK, OK!!! (1)

Hektor_Troy (262592) | more than 6 years ago | (#23692723)

Why would you want to settle for 640x480? I like the rest of the idea, but honestly, 640x480 wasn't even fun back in the days of Win95.

Boost it to 1600x1200 or something like that, and it'd be a lot more comfortable to work with.

Think EEE form factor (1)

naz404 (1282810) | more than 6 years ago | (#23692783)

except smaller than EEE (but still with a true keyboard). Then toss in DOSBox [dosbox.com] , SCUMMVM [scummvm.org] , ZSNES [zsnes.com] ,

mix in a little WiFi capability for leeching off hotspots, and you now have a true hacker toy that you can lug anywhere!

PDAs and smartphones just don't cut it. They suck for doing stuff like coding and compiling your own programs.

Re:Think EEE form factor (1)

mrchaotica (681592) | more than 6 years ago | (#23696045)

Even on an EEE-size screen, I'd still want more pixels. It's 2008, for crying out loud; why can't we finally get 300DPI?!

Re:OK, OK, OK!!! (1)

pushing-robot (1037830) | more than 6 years ago | (#23692767)

Sounds like you just described the XO-1. [wikipedia.org]

Re:OK, OK, OK!!! (1)

gparent (1242548) | more than 6 years ago | (#23692815)

Yeah, let's stop technological advances right now!

Re:OK, OK, OK!!! (1)

crhylove (205956) | more than 6 years ago | (#23692875)

Every computer *I'VE* built for myself has been better than the last. My apologies if that is not the same result for everyone. I'm totally for cheap over options though, and if I could get an under $100 n64 emulator, with those new hi res textures and then play mario kart.... that would be worth $100 alone. But not $600, like I was offering for the supercomputer cluster of eee style notebooks that could also be 6 death match stations and 6 race cars. Oh yeah, and all work as perfect p2p skype video phones, too.

Re:OK, OK, OK!!! (1)

maxume (22995) | more than 6 years ago | (#23693107)

A low voltage core solo uses about 5 watts. At that point, the screen and other devices are going to factor much more heavily into battery consumption. The lower power core duos are around 15 watts, which still isn't going to destroy battery life.

So get a core solo and don't worry about the 4 watts that they might be able to save by gutting performance.
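
As a rough sketch of the point above (only the 5 W / 15 W CPU figures come from this comment; the battery capacity and screen/other draw are made-up assumptions for illustration):

```python
# Back-of-the-envelope battery life: hours = capacity (Wh) / total draw (W).
# CPU figures (5 W core solo, 15 W core duo) are from the comment above;
# the battery and "everything else" numbers are hypothetical.
BATTERY_WH = 50.0    # assumed battery capacity
OTHER_W = 10.0       # assumed screen + disk + wifi draw

for name, cpu_w in [("core solo", 5.0), ("core duo", 15.0)]:
    hours = BATTERY_WH / (cpu_w + OTHER_W)
    print(f"{name}: {hours:.1f} h")
```

On these (made-up) numbers, tripling the CPU's power budget costs a bit over an hour of runtime, because the screen and other devices dominate the total draw.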

Re:OK, OK, OK!!! (2, Informative)

Smauler (915644) | more than 6 years ago | (#23693437)

Actually, I hate the way processors are getting faster and faster and hungrier and hungrier.

Umm... I don't know if you've been following the processor market recently, but they're not. A lot of the recent advances have been about lower voltages, lower power consumption, and more cores. Pure processor-power increase has most definitely not been a feature of the recent processor market, at least not compared to the past. I mean, AMD released 2GHz+ processors 6 years ago....

Re:OK, OK, OK!!! (0)

Anonymous Coward | more than 6 years ago | (#23694071)

Yes. My last round of upgrades for the various computers around my home were done to reduce power usage. It turns out that the lower wattage systems also ran at about twice the speed of our old systems. The old systems were actually just fine for our usage, but the power issue gave me a good excuse for upgrading.

Using a Kill-a-Watt [smarthome.com] I calculated that, between the lower power consumption, the much improved sleep support, and the $0.23/kWh I pay for electricity, it will take about 8 months for my system to pay for itself and about 6 months for my wife's and son's systems to pay for themselves.
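
The payback arithmetic here is easy to reproduce; a minimal sketch (only the $0.23/kWh rate is taken from the comment - the wattages and hardware cost are hypothetical examples, not the poster's actual numbers):

```python
# Months until electricity savings cover the upgrade cost.
# Only the $0.23/kWh rate is from the comment; the rest is hypothetical.
RATE_PER_KWH = 0.23
HOURS_PER_MONTH = 24 * 30

def payback_months(old_watts, new_watts, cost_usd):
    saved_kwh = (old_watts - new_watts) / 1000 * HOURS_PER_MONTH
    return cost_usd / (saved_kwh * RATE_PER_KWH)

# e.g. a box dropping from 150 W to 60 W after $120 in new parts:
print(f"{payback_months(150, 60, 120):.1f} months")
```

This treats the machine as always-on; shorter duty cycles (or extra savings from better sleep support) shift the payback accordingly.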

Re:OK, OK, OK!!! (1)

toddestan (632714) | more than 6 years ago | (#23696165)

Did you ever consider turning the computers off when you aren't using them? When the computer is only running for 1/4-1/3 of the day, you'll find it uses a lot less power.

Re:OK, OK, OK!!! (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23693443)

Nowadays processors are becoming less important and video cards more important. Even the best games don't really need much CPU. Other than physics, the CPU is becoming less and less important; it is now secondary in high-end gaming PCs. You won't see much of any difference in gaming between a $1k quad core and a $200 45nm dual core, though you will see a huge difference between an 8600 GT and an 8800 GTS.

Re:OK, OK, OK!!! (1)

Aranykai (1053846) | more than 6 years ago | (#23692703)

Less... Caffeine... Please...

Can I get it like eee size and eee cheap and put 6 of 'em together in a custom beowulf cluster that grows and shrinks as the various laptops enter and exit the house over wifi N, or wimax, or whatever?
WTF would you do with a beowolf cluster of mini laptops on wireless? Folding@home that important to you?

Umm... (3, Funny)

FurtiveGlancer (1274746) | more than 6 years ago | (#23692715)

WTF would you do with a beowolf cluster of mini laptops on wireless?
Leak information like a sieve?

Re:OK, OK, OK!!! (1)

ozamosi (615254) | more than 6 years ago | (#23692759)

Crack WEP/WPA

Re: beowolf cluster of mini laptops on wireless (1)

naz404 (1282810) | more than 6 years ago | (#23692813)

Ever hear of Smart Dust? [wikipedia.org]

That's almost exactly what they're trying to achieve.

Re:OK, OK, OK!!! (2, Insightful)

crhylove (205956) | more than 6 years ago | (#23692849)

I was thinking POV ray Quake III, but sure, like good causes or whatever! Maybe install the Folding@home screensaver by default on all machines while I'm not playing POV ray Urban Terror. Thanks in advance, and if you can make 'em for $100 each, I'll take six advance orders.

PS Oh, can you install regular Urban Terror on each machine, too, so I have a 6 chair death match out of the box? Double Thanks in Advance. Might as well put a racing game on there too and bundle each with a dual analog stick. Thanks. Ok, I'll go $120 now.

Re:OK, OK, OK!!! (1)

drinkypoo (153816) | more than 6 years ago | (#23700587)

WTF would you do with a beowolf cluster of mini laptops on wireless?

With a beowulf cluster, where you have to rework your applications, or operate them on a job-submission basis at which point you might as well just use DQS? Nothing.

However, with a single-system-image cluster like MOSIX, and with binary compatibility across architectures, you could use the system to do all kinds of things transparently.

However again, my understanding is that OpenMOSIX is more or less over (I would love to be corrected.) And even if it isn't you need everything to be the same architecture... so again, this would work best if you were compiling to a virtual machine, not x86-Linux (which AFAIK is about the only place it runs, and on an old kernel to boot.)

Re:OK, OK, OK!!! (1)

mrchaotica (681592) | more than 6 years ago | (#23696031)

Can we have a laptop that has a draw on wacom style touch screen where the keyboard is?

Wouldn't it make more sense to put the Wacom digitizer where the screen is?

Also, I second the motion for a cheap, small tablet PC!

Re:OK, OK, OK!!! (1)

drinkypoo (153816) | more than 6 years ago | (#23700617)

It seems like the GP is either ignorant of the existence of the tablet PC, or just wants a giant PDA (a tablet PC with no keyboard). Maybe they want the clamshell form factor - personally, I'd want two screens in that case, but almost every use of two screens has turned out to be more trouble than it's worth. Me, I just want a giant PDA with some sort of optically-based multitouch and an easily replaceable ~1/4" thick sheet of Lexan over the top of the screen so that I can beat it up and then resurface it. (The only practical way to handle this is to give people a dramatic discount for returning the old piece and to have a cheap way to resurface it, so I don't expect it to happen, of course.) Guess I have to wait for construction diamond.

Re:OK, OK, OK!!! (1)

mrchaotica (681592) | more than 6 years ago | (#23700779)

It seems like the GP is either ignorant of the existence of the tablet PC, or just wants a giant PDA (tablet PC, no keyboard.)

I own a tablet PC, and no, I don't think he wants one in its current form, because no tablet PC is small or cheap. They're all expensive, and even the ones with small screens are absurdly thick and heavy. No, what he's asking for -- and what I'd want, too -- would be a convertible tablet (i.e., including a keyboard) with about a 6" by 8" 1024x768 screen, <= 2lbs, <= $500, <= 1/2" thick, and running Windows XP Tablet Edition (because Vista is too CPU-hungry, Linux doesn't have any tablet apps, and it should have a PC OS, not a PDA one).

And before you say "you're asking for too much!" what I want almost already existed years ago: namely, the Sharp Actius MM10 [wired.com] . You could literally add only a Wacom digitizer and a swiveling hinge to that, and it'd be perfect (but, of course, somewhat updated performance would be good too; maybe use an Intel Atom or Via chip -- whatever would be fastest while still keeping the price below $500).
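
For what it's worth, the 6" by 8", 1024x768 panel specced above works out to a fairly modest pixel density:

```python
import math

# DPI of the hypothetical 6" x 8", 1024x768 tablet screen described above:
# pixel diagonal divided by physical diagonal.
px_diag = math.hypot(1024, 768)   # 1280 pixels
in_diag = math.hypot(6, 8)        # 10 inches
print(f"{px_diag / in_diag:.0f} DPI")
```

That comes to 128 DPI - comfortably readable, but a long way from the 300 DPI wished for earlier in the thread.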

Re:OK, OK, OK!!! (1)

drinkypoo (153816) | more than 6 years ago | (#23700879)

I own a tablet PC, and no, I don't think he wants one in its current form, because no tablet PC is small or cheap.

But then what he wants is a "lighter, cheaper tablet PC" :P

I think the hinge is just a problem. If I need a keyboard I can carry a snazzy folding one in my pocket - I have big pockets. I already have a serial one for a Handspring Visor that I intend to hack for use with everything else (well, anything that has RS-232, anyway; they have Bluetooth ones these days). I want ultimate durability, and that would ideally mean something sealed in plastic. The battery would be external and would connect to the only electrical contacts, which would also have a cover (thus enabling the design's use in hospitals etc.). The charger could connect to the battery or charge without contact, the latter being my choice - the unit only needs an external battery so that it is replaceable.

I think Linux is fine for a pen-based system, at least potentially. At the very least I think that any question whose best answer is "run windows" is a very silly one.

Re:OK, OK, OK!!! (1)

mrchaotica (681592) | more than 6 years ago | (#23705779)

I think the hinge is just a problem. If I need a keyboard I can carry a snazzy folding one in my pocket - I have big pockets.

The problem with that is that it becomes harder to use it as a laptop: what do you prop up the screen with?

I think Linux is fine for a pen-based system, at least potentially.

"Potential" solutions are useless. I've actually tried using Linux on my tablet. The digitizer itself works great; that's not the problem. The problem is that there are no apps! There is no open-source continuous handwriting recognition, for example. It simply doesn't exist. There is no Free equivalent for OneNote. There is no clipping tool. The state-of-the-art in tablet apps for Linux is Xournal [wikipedia.org] , and not only does it have even fewer features than Windows Journal (such as lack of the aforementioned handwriting recognition), but it's buggy to boot!

The situation for tablets in Linux is pathetic. It's unusable. There's no point in it; forget about it. And it's not going to change any time soon, because what little interest in writing Free software for alternative input devices there is, is pretty much entirely focused on small touchscreen devices like PDAs and smartphones, which use an entirely different paradigm.

Um... what? (0)

Anonymous Coward | more than 6 years ago | (#23692749)

Doesn't AMD own VIA? I could swear that happened, like, four or five years ago. Then again, I could always be wrong; I haven't slept since Thursday.

*Swoons* (2, Interesting)

hyperz69 (1226464) | more than 6 years ago | (#23692851)

I am a big fan of the Nano. I think it has the potential to be huge if it lives up to half of its claims. I always cried thinking I would need to use the Chrome crap that VIA integrates. Nvidia graphics on a Nano platform: tiny little gaming boxes and notebooks. Dear lord, it's nerd heaven! Media centers for the poor! I am buzzing with glee just thinking of the possibilities. KUDOS!

Will nvidia make chipsets for via as via ones suck (1)

Joe The Dragon (967727) | more than 6 years ago | (#23692915)

Will Nvidia make chipsets for VIA, since VIA's own chipsets suck?

open source drivers? (1)

tritter (1294692) | more than 6 years ago | (#23693257)

Might there be a chance of finally getting open-source drivers from Nvidia now that they've teamed up with VIA?

Re:open source drivers? (1)

TeknoHog (164938) | more than 6 years ago | (#23693555)

Might there be a chance of finally getting open-source drivers from Nvidia now that they've teamed up with VIA?

You're kidding, right?

So far VIA has been terrible with Linux. I've had much better experiences with Nvidia, even with their closed-source drivers. This talk about VIA being open in comparison to Intel is funny, considering Intel has provided open-source Linux drivers for its hardware for years.

Re:open source drivers? (1)

dfries (466073) | more than 6 years ago | (#23694213)

This talk about VIA being open in comparison to Intel is funny, considering Intel has provided opensource Linux drivers for their hardware for years.

Have they? I always thought it was their publicly available hardware specifications that allowed anyone to write drivers. It was only when they got into the graphics chip business that they lacked documentation and had to provide drivers themselves. Now Intel is releasing the source to its graphics drivers and providing the graphics chipset documentation - a great example for others to follow.

Improve OLPC design and sell it for $200 (1)

bsharma (577257) | more than 6 years ago | (#23693393)

One thing that has puzzled me is why more companies aren't just copying the OLPC design - maybe enhancing the processor, memory, etc. - and selling it for, say, $200-$300. I don't think the OLPC foundation would say no to selling or sharing their design. Not that their design is some top secret anyway.

Re:Improve OLPC design and sell it for $200 (1)

GigaplexNZ (1233886) | more than 6 years ago | (#23697719)

What puzzles me is why you're puzzled about something not happening when it actually is happening. Take a look at the Asus Eee PC and the other competing laptops starting to come out.

Re:Improve OLPC design and sell it for $200 (1)

drinkypoo (153816) | more than 6 years ago | (#23700659)

The best feature of the XO is the Marvell-based WiFi (which currently does not have Free or even Open firmware) which implements a Mesh AP on a chip. This allows the device to operate as a repeater while draining the absolute minimum power. The only way we are going to achieve independence from the greedy corporations trying to milk the internet for every penny is if we make them irrelevant. The way to do that is to build an alternative internet. And the only cost-effective way to do that at the moment is with mesh networking over 802.11.

Anyone know of a product which consists of that Marvell chip and an ethernet interface, with a couple of external antenna connectors? Without the inclusion of the laptop computer, of course. So far it seems like the cheapest Mesh AP solution is a proper-version WRT54G with a new firmware image, but it's not very low-power. I'm looking for something that I can run off a small solar panel in occasionally cloudy conditions (with a small battery pack made up of NiCd or NiMH AAs for night operation). I live in the boonies, and something like that is exactly what is needed for a last-mile solution. Just plant them on ridge lines, and use directional antennas if necessary to boost the range. Way out in the BLM land etc. you could amplify the signal and use an antenna with some moderate gain.
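
Sizing the overnight AA pack for such a node is straightforward; here is a sketch with made-up numbers (the AP draw and night length are assumptions - only the ~2000 mAh at 1.2 V NiMH figure is a typical datasheet value):

```python
import math

# Cells needed to run a low-power mesh AP through the night.
# AP draw and night length are assumed; 1.2 V x 2.0 Ah is a typical NiMH AA.
AP_WATTS = 2.0        # assumed AP consumption
NIGHT_HOURS = 14.0    # assumed worst-case darkness plus cloud margin
AA_WH = 1.2 * 2.0     # 2.4 Wh per cell

needed_wh = AP_WATTS * NIGHT_HOURS        # 28 Wh
print(math.ceil(needed_wh / AA_WH))       # cells, before conversion losses
```

Real sizing would also budget for regulator losses and cold-weather capacity derating, so round up generously.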

he spelled it right... (1)

slashdotard (835129) | more than 6 years ago | (#23693503)

He used and spelled "discrete" correctly.

wow.

He used and spelled "discrete" correctly.

wow.

ummm... the story? yeah, sure.

Whatever.

He used and spelled "discrete" correctly.

I hate to be the devil's advocate... (0)

Anonymous Coward | more than 6 years ago | (#23693883)

...but does it run Crysi---SWEET JESUS!

VIA tries to redefine Open as "Ours" (4, Interesting)

Glasswire (302197) | more than 6 years ago | (#23694093)

Hmmm, the old Mini-ITX format had multiple vendors (VIA, Intel, others) using it, and right now the only vendor using Mini-ITX 2.0 is VIA-Nvidia. How is this more open? And in what sense was Intel making the old standard less open - other than jumping into that market and doing well?

BTW, I have to laugh at the sight of a Mini-ITX board with a relatively low-power VIA CPU having a huge, power-sucking Nvidia discrete GPU board on it. Surely anybody that cares about performance graphics is not using this category of board. Logically, Nvidia would do an integrated graphics chipset for the Mini-ITX format, but a PCI Express external card that quadruples the chassis height (and probably quadruples the power consumption of the board) is a joke. Ask embedded systems developers (still the main market for Mini-ITX systems) if this is really what they're looking for. VIA and Nvidia cobbled together a Frankenstein combination of technologies just to make the Atom look bad with irrelevant perf specs.

Re:VIA tries to redefine Open as "Ours" (1)

DrYak (748999) | more than 6 years ago | (#23696761)

Hmmm, the old Mini-ITX format had multiple vendors (VIa, Intel, others) using it and right now the only vendor using Mini-ITX 2.0 is VIA-NVidia. How is this more open?
By the fact that vendors wanting to implement it don't have to pay royalties to anybody. That is what "open" means. You don't necessarily need several vendors using it; you need vendors to face no big barrier to start using it.

And in what sense was Intel making the old standard less open -other than jumping into that market and doing well?
Here I agree with you. Intel's Mini-ITX has no fundamental reason to be less open (unless they're charging license fees for their improvements).

BTW, I have to laugh at the sight of a Mini-ITX board with a relatively low power VIA cpu having a huge, power sucking NVidia discrete GPU board on it.
Still, the whole platform is projected to cost less than any Intel offering able to run Vista.

And I presume the whole VIA/Nvidia collaboration is also due to Nvidia's interest in having access to lower-power technologies from VIA, the same way ATI's latest and next Radeons are benefiting from AMD's technology (smaller-process fabs, split power planes, etc...).

If some of the VIA technology is used, or joint agreements are made for smaller-process third-party fabs, VIA's onboard GeForce will probably be less power-hungry.

Re:VIA tries to redefine Open as "Ours" (1)

sweet_petunias_full_ (1091547) | more than 6 years ago | (#23698633)

"Intel's Mini-ITX has no fundamental reason of being less open..."

Maybe what they're referring to as far as "open" goes (but they're using the word loosely) is that Intel doesn't want people to use its cheap Atom chip to build systems that can compete in the same market segment as its more pricey offerings. That's what would keep an OEM from outfitting a system any way they want to.

This VIA/Nvidia system, which does have enough graphics performance to beat the Atom and (they claim) to play recent games, is basically a way to use Intel's own (stupid) rule to grab market share away. If Intel suddenly lifts the restriction, its obedient OEMs won't be able to react for a while; if it doesn't, it may have to drop prices on the low end of its more profitable segments.

It's a smart move on Nvidia's part. What isn't so smart is trying to confuse the word "open" with this tactic. Everyone knows that Nvidia's drivers aren't open source and that they're not likely to open them any time soon due to all the patents they had to use. They may have nice performance, but closed-source drivers and open systems don't mix. They may think they can ignore this due to customer ignorance, but I think it's going to catch up with them eventually. Personally, I wouldn't add a closed-source video card to my system even at half price - I just don't want to bother with something that works today but breaks with the next kernel upgrade.

Re:VIA tries to redefine Open as "Ours" (0)

Anonymous Coward | more than 6 years ago | (#23697237)

Bah, VIA will be fine.

Really, if those developers don't want it, they don't have to buy it. VIA will be releasing Nano+VX800 Mini-ITX boards, Nvidia will be working on its ARM processor devices (Tegra?), and Nvidia will help put some video bite into VIA's systems for those who really want it.

A lot of people like me want an HTPC that doesn't consume buckets of power. A Nano Mini-ITX system does that wonderfully. Now add in a discrete card, and you have something capable of actually gaming too. And for those who don't want Nvidia's crappy chipsets (if they make one for the Nano) or a PCIe x16 slot, you don't have to take it, bub.

It's more open because hopefully AMD will also move to Mini-ITX 2.0... I've only quickly read the article, but since there are so many improvements over "Mini-ITX 1", it's more of an upgrade, underlining at the same time that VIA wasn't kidding when it said it was going to open up.

Tiered computer systems coming (0)

Anonymous Coward | more than 6 years ago | (#23698273)

Depending on how you break it down, the future will be on a shifting scale from CPUs doing everything (Intel) to GPUs doing the grunt work (Nvidia), with AMD/ATI sitting in the middle. Intel-AMD-Nvidia.