
AMD Brings New Desktop Chips Down To 65W

Unknown Lamer posted more than 2 years ago | from the doing-more-with-less dept.


crookedvulture writes "AMD's new Llano-powered A-series APUs have had a difficult birth on the desktop. The first chips were saddled with a 100W power rating, making them look rather unattractive next to Intel's 65W parts. Now, AMD has rolled out a 65W version of Llano that's nearly as fast as its 100W predecessor despite drawing considerably less power under load. This A8-3800 APU doesn't skimp on integrated graphics, which is key to Llano's appeal. If you're not going to be using the built-in Radeon, the value proposition of AMD's latest desktop APUs looks a little suspect."


Amd also has better MB's for the price (2)

Joe_Dragon (2206452) | more than 2 years ago | (#37604844)

as you get more for your $ than with an Intel board.

Re:Amd also has better MB's for the price (-1, Troll)

Anonymous Coward | more than 2 years ago | (#37604988)

AMD is a third world company producing illegal knock-offs of American chips. They lack security features, making them popular with computer hackers. AMD also hosts the goatse [goatse.fr] website, and makes you gay.

Re:Amd also has better MB's for the price (0, Troll)

CelticWhisper (601755) | more than 2 years ago | (#37605120)

You say that now, but I saw you running Lunix the other day! And you weren't using AOL to get online either!

'security features' (1)

unity100 (970058) | more than 2 years ago | (#37605484)

As in the capability of a private corporation to lock down your computer remotely from somewhere on the internet. That's what Intel plans. Ah, and I forgot the built-in DRM to offer you 'better content delivery'.

Anyone who clamors for such 'features' translates to 'moron' in my dictionary.

Re:'security features' (0)

Anonymous Coward | more than 2 years ago | (#37606090)

What? vPro exists only [intel.com] on [intel.com] non-K CPUs. Additionally, using such "features" requires an optional driver [intel.com] installation.

Re:'security features' (1)

unity100 (970058) | more than 2 years ago | (#37606154)

Intel has declared that it intends to put DRM 'content delivery' in its future generations. It won't be optional.

Re:Amd also has better MB's for the price (1, Informative)

burning-toast (925667) | more than 2 years ago | (#37605230)

Performance desktop user here... Let me know when they start beating out the i5-2500K or i7-2600K performance-wise (even if the chip is more expensive!). I've got my i7-2600K running at 4.4GHz stable without playing with the voltages or running turbine-aircraft-engine coolers (in fact, the PC is almost silent). I can't think of any features I'm missing on my P67 rev2 board that would make me trade away the performance of the CPU I have, either. I love this chip!

I used to like AMD quite a lot (P4 era), but they started lagging behind big time once Intel ditched Netburst. They have been either a lot HOTTER or a lot SLOWER (or both) than Intel for what seems like years now, without really being able to recover. Intel's Sandy Bridge chips bend them over the barrel at the end of the day, too, given the reduced price points Intel is using for them. The only thing AMD could possibly give me is a higher core count for my dollars, but not necessarily a more efficient (HT on SB is really nice) or cooler-running chip; sadly, that is something they used to excel at.

Since I'm not interested in their integrated graphics, these product offerings hold minimal to no attraction for me.

Re:Amd also has better MB's for the price (2)

blair1q (305137) | more than 2 years ago | (#37605432)

Let me know when they start beating out the i5-2500K or i7-2600K

They may never do that, if they keep getting all excited about a part that has less than half the performance.

Looking at the price-performance chart from the summary, it's clear the i5-2500K is leaps and bounds better than any other chip currently available.

Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic. Especially when you check on Pricewatch and find them for $200 less than TechReport is listing.

My home box is getting unstable in scary ways. Memory is shrinking relative to newer apps' requirements, and I've failed multiple times to find compatible upgrades that don't barf on half of my boot attempts. And one of my RAID-0 discs has developed a boot failure that the BIOS reports at startup; it still operates well enough to get me into the OS, but it's only a matter of time before whatever that is metastasizes and the system is a brick.

So it's time to rebuild. My next system will have the OS on an SSD and my data partitions on a properly error-tolerant RAID instead of the RAID-0 I'm using now just for the speedup. And, of course, it will be run by something sweet from the Intel catalog. i7-bignumbersX, most likely. No sense chintzing on the CPU when it's the most important component and the rest of the parts will cost $1k or more.

And since there are rumors that Intel is flattening out its roadmap (no sense overspending when the competition is as lame as they are), anything built today will remain ego-boosting for longer than usual.

Re:Amd also has better MB's for the price (1)

QuantumRiff (120817) | more than 2 years ago | (#37607486)

Wait a minute... how the heck is your CPU the most important component? Seriously, get the 3rd or 4th fastest processor. Do you know how much time your CPU sits idle? Sure, an SSD will help some, but put in more RAM, a better video card... heck, get a nicer, bigger monitor so you can see more physical desktop. But the CPU? Unless you do video encoding for a living, you will never know the difference.

Re:Amd also has better MB's for the price (1)

swalve (1980968) | more than 2 years ago | (#37608156)

It's not how long it spends idle, it is how long it spends near 100% usage. As long as you can peg it, you can benefit from more processor.

Re:Amd also has better MB's for the price (1)

slashdotjunker (761391) | more than 2 years ago | (#37608772)

Since you seem to care vaguely about stability, you might want to know that none of Intel's current desktop chipsets support ECC memory. I'm speccing a system right now and it looks like I'm going to have to buy a non-Intel chip.

PS. If you still want an Intel, the Xeon E3 1200 series is the closest ECC-capable platform to a desktop one.

Re:Amd also has better MB's for the price (1)

petermgreen (876956) | more than 2 years ago | (#37609578)

Of course, people here are more performance- than price-conscious, so those i7-##0X jobs off to the right are drooltastic.

Except they aren't, because while they beat the 2600K in highly multithreaded tasks, they lose badly to it in tasks with four or fewer active threads. Afaict most desktop tasks have four or fewer active threads. Plus, by buying them you would be buying into a dying platform. So unless I really needed the features of the LGA1366/X58 platform, I would probably not buy one at this point.

And since there are rumors that Intel is flattening out its roadmap (no sense overspending when the competition is as lame as they are), anything built today will remain ego-boosting for longer than usual.

Today you have a choice between LGA1155, which has the Sandy Bridge cores and native SATA 6G, but is a mainstream platform with limited PCIe, dual-channel memory, and a maximum of four cores. Or you have LGA1366, which is a high-end platform with triple-channel memory, lots of PCIe, and the option of 4- or 6-core processors, but the cores are an older (slower) design and the chipset has no SATA 6G (motherboard vendors often add third-party SATA 6G chipsets, but apparently they often suck performance-wise).

Soon (apparently sometime in November) LGA2011 should be coming out, which will support chips with up to 8 of the latest-generation cores (though desktop chips will only go up to 6 cores), quad-channel memory, lots of PCIe, and SATA 6G in the chipset.

So if you are building a high-end system, I would strongly suggest waiting for LGA2011.

Re:Amd also has better MB's for the price (0)

Anonymous Coward | more than 2 years ago | (#37605498)

How is this marked troll? This is a very reasonable response to the article. Intel simply has no meaningful competition in the mid-high end of the market. This is definitely true with Sandy Bridge but was also true with Lynnfield and Bloomfield before it. This market is important as it is more profitable for the chip makers than the low-end offerings.

I also really don't see the market for AMD's APU offerings. While the integrated GPUs crush what Intel has to offer, they're still massively inferior to a mid to high-end dedicated graphics card. Yet, for 2D applications and HTPCs, the Intel onboard graphics are perfectly adequate.

I think part of the reason that Intel invests so little in the integrated graphics on their desktop chips is precisely because there is so little interest beyond basic 2D, HTPC and very light gaming usage. For anything else, users are going to buy a dedicated graphics card.

idiot. (1)

unity100 (970058) | more than 2 years ago | (#37605756)

I also really don't see the market for AMD's APU offerings. [...] I think part of the reason that Intel invests so little in the integrated graphics on their desktop chips is precisely because there is so little interest beyond basic 2D, HTPC and very light gaming usage. For anything else, users are going to buy a dedicated graphics card.

If you are unable to see that, then don't go deciding what is troll and what is not.

I just had to advise a couple of guild members because they had to upgrade their outdated hardware to be able to play SWTOR at full settings. They can't buy desktops due to mobility requirements, and they don't have the finances to shell out on a high-end gaming laptop.

There is that market for AMD's APU offerings. A low-end notebook can play StarCraft 2. The situation won't be too different on the desktop: people will readily shell out for CPU+GPU in one shot instead of paying almost 40% more for a CPU and dedicated graphics, and power users will buy the CPU/GPU and then buy 2 more cards to run 16x/4x/4x Crossfire, which would be 30% more expensive via the current CPU + 3 cards route.

Re:idiot. (1)

Billly Gates (198444) | more than 2 years ago | (#37606068)

Not just for games, either.

If you play with IE 9 and the IE 10/Windows 8 preview, you will see fluid scrolling with no flickering when you hit the up or down arrow keys. Go to www.google.com, do a generic video search, and try that if you have a nice card. Firefox 7 is catching up with video acceleration, but it still has slow scrolling and will get fast scrolling soon. Chrome will eventually have fast scrolling with a fully accelerated HTML 5 canvas too. This is where it will matter for AMD's APUs.

Average users, not gamers, will see transitional graphical effects with Metro applets, smooth video, and animations with Llano versus a more expensive Intel unit. I think a portable Llano would be a great tablet running Windows 8 for this reason next year. The CPU is no longer the bottleneck in modern systems; it is the hard drive and graphics.

portable llano in a tablet (1)

unity100 (970058) | more than 2 years ago | (#37606170)

Seems like a good idea. That can even make me seriously consider buying a tablet.

Re:idiot. (0)

Anonymous Coward | more than 2 years ago | (#37606508)

I doubt a Llano can run SWTOR on anything except low settings, even if the integrated graphics is better than most. Of course, it has not come out yet, and you are awesome if you got hold of the beta.

It is better than typical integrated graphics for video, though. If it runs the game on medium settings I will be highly impressed. I do know that on Atom netbooks, where the chip competes, you can forget about running any game made in the last 5 years :-)

Games on your HTPC (1)

tepples (727027) | more than 2 years ago | (#37606366)

While the integrated GPUs crush what Intel has to offer, they're still massively inferior to a mid to high-end dedicated graphics card. Yet, for 2D applications and HTPCs, the Intel onboard graphics are perfectly adequate.

True for non-gaming HTPCs. But if you want to plug in a gamepad and play some games on your HTPC that aren't MAME, you'll need something stronger than Intel's "Graphics My Ass" integrated GPU. That's why I recommend AMD for value systems: you're sure to end up with GeForce or Radeon graphics, which helps in case you do end up wanting to add a little gaming on the side.

Re:Games on your HTPC (0)

Anonymous Coward | more than 2 years ago | (#37607172)

True for non-gaming HTPCs. But if you want to plug in a gamepad and play some games on your HTPC that aren't MAME, you'll need something stronger than Intel's "Graphics My Ass" integrated GPU.

A little off-topic here, but can you give me some examples of games you can play on your HTPC with a gamepad that aren't MAME or any other emulated game console?

The only one I am aware of is Tux Racer, and I think that even for that you need a keyboard/mouse and a gamepad isn't sufficient, so it's not really an HTPC game.

Re:Games on your HTPC (1)

tepples (727027) | more than 2 years ago | (#37607244)

Anonymous Coward wrote:

can you give me some examples of games you can play on your HTPC with a gamepad that aren't MAME or any other emulated game console?

When I asked the same question, I got answers that I've collected here [pineight.com]. Also Lockjaw Tetromino Game [pineight.com] works with a gamepad, but that's simple enough to run well even on Intel.

Re:Games on your HTPC (0)

Anonymous Coward | more than 2 years ago | (#37609572)

I still play various incarnations of Fifa Football through Wine. Need For Speed is playable as well.

I also use a flightstick for games like Descent and Conflict FreeSpace (fs2_open). But nothing beats keyboard+mouse for RTS and RPG games (looking forward to Skyrim, although it will probably be a while before it runs on Wine).

Re:Amd also has better MB's for the price (0)

Anonymous Coward | more than 2 years ago | (#37605528)

Let me know how that awesome i5-2500K can run StarCraft, do hi-def Blu-ray video, or do fluid accelerated HTML 5 in the upcoming Metro apps with the built-in Intel video. Llano can do all of that in a $400 netbook.

You have dedicated graphics, like you said at the bottom? Most business users do not have that, nor do notebook users or anyone who is trying to watch their pennies on consumer electronics.

Even if you are a high-performance user with dedicated graphics, a high-end Llano or Bulldozer-class CPU/GPU combo will be able to do Crossfire. With Intel you won't get that extra kick in performance from adding a high-end video card like you can with an AMD CPU/GPU. Adding Crossfire with any Radeon card would be very sweet, if AMD can pull it off reasonably.

AMD is in an unfair spot: they do not have the cash for the same chip fabrication plants, and they buy older ones Intel abandons, so they need a better CPU design just to break even because they are on a larger process node. I find the APU approach very creative; many ARM processors also have integrated graphics. One of the issues with traditional integrated graphics is the long time it takes to interrupt the CPU to access the RAM. Since the GPU now shares the same memory controller as the CPU, it doesn't have those wait cycles. It is still bandwidth-hungry, but at least you should get much better performance. It is about time desktops caught up with consoles.

Re:Amd also has better MB's for the price (1)

toddestan (632714) | more than 2 years ago | (#37608114)

StarCraft? A Pentium 133 can play that game.

Re:Amd also has better MB's for the price (1)

DarkXale (1771414) | more than 2 years ago | (#37610280)

The HD3000 will whine if you start fiddling with the Shader setting in SC2 and setting it to upper values, but set that to low and you can pretty much set everything else to Ultra.

Re:Amd also has better MB's for the price (2)

dbIII (701233) | more than 2 years ago | (#37608064)

Performance server user here. That 48 core Supermicro AMD system from well over a year ago buries any of those Intel systems you are talking about. Intel now has parts with fewer cores but higher clock speed, at around three times the price.
Everything under serious development is being written as multithreaded if it isn't already. A fast core is pointless if it's switching context all the time to run something that would be on another core if you had more cores.

Re:Amd also has better MB's for the price (1)

serviscope_minor (664417) | more than 2 years ago | (#37609864)

that 48 core supermicro AMD system from well over a year

I love those machines. They are frankly awesome and astonishingly cheap and dense compared to the competition. The funny thing is that they beat most of the real specialist high density crazy-servers on CPU grunt per U and completely bury them in price. They're also similar in power (worse if you believe the vendor information, which I don't). The huge system image (mine are configured for 256G) is also really nice for certain kinds of problem.

I've recently been building a cluster (see my previous Ask /. about it), and in terms of absolute price, a cluster of X6 1100Ts in desktop motherboards beats the quad Supermicros by a factor of 3 or so. The money is saved by not having 4 very fast HT links per processor (giving up large system images), no server-class features (ILM), non-server-class cooling, and of course considerably less density.

Re:Amd also has better MB's for the price (1)

dbIII (701233) | more than 2 years ago | (#37610178)

One nice thing I could do with it (I only have one): when I had a braindead application that ran like crap because it was sorting on disk, I just fed it a 20GB ramdisk and got better than 100x the performance. Of course, a sane application would check how much memory was available before resorting to disk.
Fairly stupid software licensing practices made the thing far more cost-effective than a cluster (even shifting the licences onto the cluster of 8-core machines I have would cost more than the 48-core machine); everything else is a bonus.

Re:Amd also has better MB's for the price (0)

geekoid (135745) | more than 2 years ago | (#37605426)

For the price? Yeah, and Yugos are better cars than Mercedes, for the price.

Re:Amd also has better MB's for the price (0)

Anonymous Coward | more than 2 years ago | (#37605764)

I spent $200 and got:
Mushkin Enhanced Silverline Stiletto 8GB 2X4GB PC3-10666 DDR3-1333 9-9-9-24 Dual Channel Memory Kit $42.99
ASUS M5A78L-M LX mATX AM3+ DDR3 AMD 760G 1PCI-E16 2PCI-E1 1PCI SATA2 VGA GBLAN Motherboard $56.99
AMD Phenom II X2 555 Dual Core Processor AM3 3.2GHZ 6MB L3 Cache 80W 45NM Retail Box $93.04
(modded to a quad core with the ASUS BIOS core unlocker)

Beat that for value: an 8GB, 3.2GHz quad core for $200.

Re:Amd also has better MB's for the price (1)

beelsebob (529313) | more than 2 years ago | (#37609138)

Okay then...
Pentium G840 - $88.99
ASRock H61M-GE LGA1155 Intel H61 1PCI-E16 1PCI-E1 2PCI SATA2 VGA GBLAN Motherboard - $62.99
Team Elite 8GB 2x4GB PC3-10666 DDR3-1333 9-9-9-24 Dual Channel Memory Kit - $39.99

That puts me about a dollar cheaper, and as you can see, even if you succeed in unlocking your cores, it beats your system 9 times out of 10. If you don't, well, it demolishes your system:
http://www.anandtech.com/bench/Product/188?vs=405 [anandtech.com]
http://www.anandtech.com/bench/Product/121?vs=405 [anandtech.com]

Re:Amd also has better MB's for the price (1)

O('_')O_Bush (1162487) | more than 2 years ago | (#37607776)

Well, a better analogy would be a Corvette vs. a Ferrari. The Corvette isn't quite as fast and doesn't corner quite as well... but it's a lot more accessible to most people, and for everyday things it's more than enough and cheaper to maintain.

My Phenom II cost me $125 and doesn't perform much below some of the newer $300 i5s (<5% difference), and it smokes some of the older $300+ i7s. And the socket is forward-compatible with many of the newer Phenoms and with Bulldozer, coming out in a few months.

That's a hard bargain to beat.

Re:Amd also has better MB's for the price (1)

beelsebob (529313) | more than 2 years ago | (#37609156)

Your Phenom II (I'm guessing a 965 at $125) "isn't much lower performing" than some of the newer $300 (wait, no, $190 for an i5 2400) i5s; it's more like 30% slower than an i5 2400, generally: http://www.anandtech.com/bench/Product/102?vs=363 [anandtech.com]. But neither is an i3 2100 much lower performing. In fact, the i3 2100 will beat your CPU silly most places (http://www.anandtech.com/bench/Product/102?vs=289), and costs $125 too ;)

Re:Amd also has better MB's for the price (2)

hairyfeet (841228) | more than 2 years ago | (#37606980)

I'd add that the new Brazos netbooks, like this EEE I picked up [tigerdirect.com], SERIOUSLY rock hard. Frankly, after messing first with Atom netbooks (too damned slow) and then an MSI Wind with a dual-core Athlon (nice, but a little power hungry), I wasn't sure what to think when I got the Brazos, but it really is a sweet chip. I get about 6 hours on a battery in Windows 7 HP, or around 8 hours in Express Gate; even after running 5 hours solid it was cool to the touch; the Radeon 6310 GPU makes for some smooth HD video; oh, and it'll hold 8GB of RAM, which I can't wait to get here!

All in all, I really can't think of a bad word to say about Brazos. I've done multitrack editing in Audacity and a little light video editing, and so far, no matter how much I threw at it, it still felt nice and responsive, more like an Intel CULV than an Atom netbook. And who can't love a netbook with Bluetooth, a 320GB HDD, USB 3, HDMI, and a nice bright 12.1-inch screen that gets 6 hours on a 6-cell and, with the 8GB RAM upgrade, cost less than $340 shipped? I'd say AMD has a real winner on its hands with these APUs; they truly are nice chips.

Re:Amd also has better MB's for the price (0)

Anonymous Coward | more than 2 years ago | (#37607220)

I'd add that the new Brazos netbooks, like this EEE I picked up [tigerdirect.com], SERIOUSLY rock hard.

Looks nice, except for the seesaw touchpad buttons. I will never understand why ASUS uses that crap on their laptops all the time; that's the only reason I don't buy ASUS.

Re:Amd also has better MB's for the price (1)

Billly Gates (198444) | more than 2 years ago | (#37608186)

Go run Firefox on these with lots of addons and then tell me it is a good chip. :-)

That should be the end-all of benchmarks.

Re:Amd also has better MB's for the price (1)

hairyfeet (841228) | more than 2 years ago | (#37609310)

I quit using that bloated piggy of a browser when it reached FF 4, thanks ever so. I'd suggest you try Comodo Dragon [comodo.com] instead, Bill. It is based on Chromium, so it's nice and fast; you can opt in to their secure DNS (which doesn't affect the OS settings, just the browser), which I've found pretty much kills malware dead before it can even reach the system; all the good Chrome plugins like ABP, Readability, and ForecastFox work on it; and most importantly, it is NOT a bloated POS.

That said, that EEE supports 8GB of RAM, which is currently going for $37 on Amazon, so it's dirt cheap to max that puppy out and let FF leak away. So while I can't tell you how FF runs on it, I CAN tell you that editing 4 tracks while bouncing down to two in Audacity worked great; the only slowdown was when I was experimenting with some heavy effects (which will slow down even my quad desktop), but other than that it was sweet. Editing docs in Word 2K7 while listening to playback? Nice. Tron Legacy in MPC? Smooth as silk. I bet it would play some of my older games like Titan Quest well too, but who would want to game on a 12-inch screen?

All in all, the best $340 I've ever spent on PC gear. That Brazos APU gets really great battery life, and Express Gate is just wonderful! And where else are you gonna get a machine with HDMI, a 320GB HDD, USB 3, Bluetooth AND 8GB of RAM, all for less than $350 shipped? I love the hell out of the thing; it's just too damned handy.

Re:Amd also has better MB's for the price (0)

Anonymous Coward | more than 2 years ago | (#37611124)

Joe, why'd you make another account? Did you get tired of posting at -1 by default with http://slashdot.org/~Joe%20the%20Dragon [slashdot.org] ?

Want to know why everyone kept modding you down?

It's because you're subliterate, and reading your posts is painful at best.

"If" (4, Insightful)

Baloroth (2370816) | more than 2 years ago | (#37604892)

The whole point of these chips is the built-in Radeon, whether for GPU or GPGPU performance. I'm not even sure why you would compare it solely as a processor, and I'm quite sure that isn't a fair or reasonable comparison, nor one that anyone who might actually buy a Llano wants to make. For high performance, you'll get a dedicated card anyway. Anyone looking at this will use the integrated Radeon; that's the point.

nonononono (1)

unity100 (970058) | more than 2 years ago | (#37604948)

Get a Llano. Get a 6xxx Radeon. Then get ANOTHER 6xxx Radeon. You've got 3-way Crossfire.

You were speaking of performance?

Re:nonononono (1)

GigaplexNZ (1233886) | more than 2 years ago | (#37606786)

Crossfire generally doesn't scale well enough to make 3-way Crossfire worthwhile when you have 2 mid-to-high-end cards and one slower GPU. Also, the fastest discrete GPU that supports the hybrid Crossfire right now is the 6670, I believe.

PS: I'm a Llano owner (A4-3400).

Re:"If" (1)

adamchou (993073) | more than 2 years ago | (#37605658)

But if you look at the benchmarks, the Intel i3 beats the AMD in almost every gaming benchmark too. So what does the AMD chip have to offer if its supposedly superior on-die GPU can't even beat Intel's?

Re:"If" (1)

Billly Gates (198444) | more than 2 years ago | (#37605904)

Link?

For gaming benchmarks, I can get a $2,000 Core i7 Xeon with an integrated graphics chip, then set up a $499 Dell with just an i3 but throw in a Radeon 6950. Guess which computer will trounce the benchmarks by a very large margin?

The GPU is what matters in gaming, and in regular desktop usage too, with accelerated HTML 5 browsing and Metro around the corner. The CPU is less important. Also, like another slashdotter posted, you can always add a dedicated card and then Crossfire it with the CPU/GPU :-D ... now that is a great gaming rig for $600. Even if the results are mediocre at best, Crossfire can make up for it and make it a great system if I choose to put in a dedicated 68xx-grade ATI card.

AMDs are slower, but not by that much. If my 2.8GHz desktop is 15% slower, I do not care as much when I have a nice GPU and virtualization instructions to run VirtualBox, all in a system on sale for $699. The Intel equivalents were well over $1,000, and I wonder if the BIOS turned off the virtualization functions to force me to pay more money to run VMware or VirtualBox. I do not own a Llano, but if I were to buy a new laptop, I would certainly look into one. It has been proven that web browsing is slow on a 3.0GHz hyperthreaded system with a crappy GPU.

Re:"If" (1)

LordLimecat (1103839) | more than 2 years ago | (#37607658)

The Xeons don't have the high-performance Sandy Bridge GPU, so that's not terribly surprising. Only the desktop chips have the new Intel GPU.

Re:"If" (1)

beelsebob (529313) | more than 2 years ago | (#37609174)

Actually, an E3 Xeon whose model number ends in a 5 has an HD 3000.

Re:"If" (1)

LordLimecat (1103839) | more than 2 years ago | (#37613816)

I've been looking at the Xeon E3s, and Intel's knowledge base seems to indicate they lack the hardware GPU features.

For instance, look at the E3 1270 (link) [intel.com]. Under "Graphics specs", it says "no" to all of the graphics features, including "processor graphics".

I've been looking at these closely for the last few weeks, and it seems you specifically need a separate GPU chipset on the motherboard to handle the graphics, as the CPU will not do it.

Re:"If" (1)

hairyfeet (841228) | more than 2 years ago | (#37609362)

Don't forget two other advantages: OpenCL, and the fact that AMD is in the process of switching from VLIW to vector on its GPUs and APUs. One of the things the AMD devs have been working on is having the APU take over physics when you have a dedicated card, and once they switch over to vector the APU will become even more powerful. Frankly, for the mobile space I'd say the new E-350 is just about perfect: just the right mix of CPU and GPU for the stuff folks use a laptop for.

And IMHO the most important reason? So you don't reward douchebaggery and market manipulation. Intel has already admitted to rigging their compiler (which they still do, BTW, despite the settlement) and to massive bribes to all the OEMs for more than 7 years, and it cost ALL of us higher electricity bills when they foisted that stinking turd, the P4, upon us all. If you reward asshattery, it'll only get worse.

Personally, I believe in supporting a free market enough that when I found out about the BS at Intel, I switched to AMD exclusively for me and my customers. Honestly, ever since we went dual core, PCs have been "good enough" for the masses by a pretty large margin, and my customers just love being able to get more cores, more memory, more HDD space, more GPU: all around a hell of a lot more bang for the buck by going AMD. And I can be happy knowing I'm doing my own small part to ensure that we have competition instead of a monopoly, because I'm old enough to remember when Intel was the only game in town, and it sucked hard: high prices, crappy performance. The only thing that has kept Intel on its toes is AMD. Or has everyone forgotten how truly horrible Intel GPUs were before AMD bought ATI? The i815? The 945G? Power-hogging, lousy-performing POS chips all around.

Re:"If" (2)

Pulzar (81031) | more than 2 years ago | (#37606218)

In those tests, i3 is being tested with external graphics, compared to AMD with the same external graphics. Basically, it's a CPU vs CPU test. Which is pretty ridiculous because they are both targeted to users who will not buy external cards...

The actual i3 vs A8 tests with their associated graphics are tested later in the article here: http://techreport.com/articles.x/21730/8 [techreport.com] . The results aren't even close - AMD is more than playable, i3 is not.

Re:"If" (1)

cynyr (703126) | more than 2 years ago | (#37610604)

Find me a game that makes use of more than 2 cores...

Or better yet, do a "while re-encoding this 1080p source (link) using these ffmpeg/libx264 settings (link) on n-1 cores, here is the FPS of ${GAME}", or even simply "we started a virus scan and then decided to play ${GAME}".

Can we please move past the single- and dual-threaded benchmarks? Go look at the x264 encode times using all the cores on both chips; I'll wait... Yep, the AMD wins at a given price point. I don't know about you, but I usually have $X to spend on an upgrade, and I try to maximise my bang for the buck. When $X is 3/4 the cost of that shiny i7-${BIGNUMBER}X, it might as well not even exist.
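For what it's worth, here is a rough sketch of that "encode while you game" methodology in Python. The file names, the ffmpeg/libx264 settings, and the presence of ffmpeg on the PATH are all placeholders and assumptions on my part, not anything from the thread:

<ecode>
import os
import subprocess

# Leave one core free for the game, per the "n-1 cores" idea above.
threads = max(1, (os.cpu_count() or 2) - 1)

# Kick off the background x264 encode; source.mkv is a placeholder 1080p file.
encode = subprocess.Popen([
    "ffmpeg", "-y", "-i", "source.mkv",
    "-c:v", "libx264", "-preset", "medium",
    "-threads", str(threads),
    "encode_out.mkv",
])

# ... launch ${GAME} here and record its FPS while the encode runs ...

encode.wait()  # then compare encode wall time and in-game FPS across chips
</ecode>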

Re:"If" (0)

Anonymous Coward | more than 2 years ago | (#37610866)

When playing battlefield 3, all four cores of my i5-2500 jump to at least 50% utilization. In fact, the game even suggests a 4 core system to play. This is probably an example of an exception to the rule though.

Re:"If" (1)

UnknowingFool (672806) | more than 2 years ago | (#37605688)

Seriously. For the average consumer surfing the web and watching movies, built-in GPUs are more than adequate. They can even play casual games. The reduction in power puts it on par with the Core i5 now in most cases; however, Intel has lower-power Core i3s and at least one lower-power Core i5.

casual ? (1)

unity100 (970058) | more than 2 years ago | (#37605962)

Low-end Llano notebooks are able to play StarCraft 2.

Re:"If" (1)

Billly Gates (198444) | more than 2 years ago | (#37606830)

I disagree. Integrated GPUs have such high RAM latency that even hi-def videos have trouble keeping up.

The IE 10 platform preview and Firefox 7 show a big difference in performance, depending on the GPU, for sites and ads that take 100% CPU utilization. Metro will show this when Microsoft adds iOS-style graphical effects. Llano may not be super fast, but it is light-years ahead of regular integrated GPUs because it shares the CPU's RAM controller. If you are on a computer with a decent dedicated video card, fire up IE 9 (I know, blasphemy here on /.), do a Google video search, and hit the up and down arrow keys if you don't believe me.

See how smooth it is? That is the GPU doing 100% of the work. This would flicker like mad with the integrated controller on a Core i5, or under Chrome. Chrome will fix this by Chrome 16 or 17, so it is coming.

That is what Unity and I are talking about. A fast CPU doesn't make a difference for the average Joe, but a good GPU will. Llano will deliver a better experience for a fraction of the cost.

Re:"If" (1)

beelsebob (529313) | more than 2 years ago | (#37609168)

The thing I find strange about Llano is... who wants a low-end Radeon but can't make do with an HD 2000 or HD 3000? I can't think of anyone who actually wants a "real" graphics chip but doesn't want a *real* graphics chip on the desktop.

They look great for laptops, at low power usage, but for the desktop... really, no.

Re:"If" (1)

Sloppy (14984) | more than 2 years ago | (#37613294)

Who wants a low-end Radeon but can't make do with an HD 2000 or HD 3000?

That's my thinking too, but it turns out there is an answer. The niche I see for Llano is where someone is looking at the absolute dollars spent on the machine, combined with some minimum standard of performance for both the GPU and the CPU. That is, someone who doesn't want pre-Sandy Bridge Intel integrated graphics (the i965 isn't enough, even if the CPU is) or a weak CPU (ION's Atom isn't enough, even though the Nvidia 9400 is), so buying cheaper pre-2011 equipment is off the table.

If you look at it that way, I think the cheapest Llanos start at (roughly) $50 less than the cheapest HD2000s. If I wanted to move machines through Wal-Mart, I might build Llano computers. People buying those would end up pretty satisfied, even though they could have gotten better performance per dollar if they had spent just a few more dollars.

Whether this cheap stuff is profitable for AMD, I can't say. But keeping it cheap is the only way to sell it, precisely because once you decide to spend $120 on the CPU, Intel parts' value becomes fucking awesome.

You WILL use the built-in radeon (3, Insightful)

unity100 (970058) | more than 2 years ago | (#37604900)

http://pente.hubpages.com/hub/AMD-Fusion-APU-Processor-Specifications [hubpages.com]

For it's possible to play StarCraft 2 with that thing, even on a low-end portable, if it has the Llano.

In a desktop, you can even Crossfire it with its equivalent 6xxx card, thereby reaching major performance for a ridiculous price.

If you went the traditional route, you would need to get the CPU, then get a separate 6xxx-equivalent card, and then one more to do Crossfire.

Llano parts give you 1 good CPU and 1 good graphics card in one shot, and in the future they will be upgradeable: you will be able to upgrade both the CPU and the 'graphics card' of your rig by upgrading just 1 piece of hardware.

Re:You WILL use the built-in radeon (1)

ThunderBird89 (1293256) | more than 2 years ago | (#37604992)

A question regarding the Crossfire capability: does it automatically enable in, say, a laptop (specifically, an ASUS K53TA) with an A4 APU and a Radeon 6550? Or is it actually the part where ATI Control asks me which graphics core it should use for a given application?

Re:You WILL use the built-in radeon (3, Informative)

unity100 (970058) | more than 2 years ago | (#37605088)

If the board you have is Crossfire-capable and the generations match (it has to be in the XXYY range and the first XXes must match, from what I know, though exceptions are possible), ATI Catalyst Control Center will see that you have a Crossfire possibility and may auto-enable it. You may enable Crossfire or disable it. With Windows 7 and the Vision Control Center more customization may be possible; however, considering that hardware acceleration is used even for web page rendering in Firefox, you would probably leave it on all the time.

Re:You WILL use the built-in radeon (0)

Anonymous Coward | more than 2 years ago | (#37605628)

Another advantage not mentioned yet is that doing this should free up some RAM. It'll use the dedicated graphics memory on the additional card instead of just the motherboard memory. Not that it should matter that much, but it might make some graphics things refresh faster.

Re:You WILL use the built-in radeon (0)

Anonymous Coward | more than 2 years ago | (#37606306)

Llano gives you 1 (relatively) good graphics part, that's true. But the CPU part is mediocre at best, and that's what the summary is rightfully calling out. Also, the upgradeability cuts both ways: if you only need to improve either the CPU or the GPU, you have to upgrade (and pay for) both.

Re:You WILL use the built-in radeon (1)

unity100 (970058) | more than 2 years ago | (#37606590)

Games do not require a heavy CPU these days, much to the chagrin of many of us. In fact, they don't even require a very high-end GPU either, due to the console factor.

As a result, I'm doing mighty well with 2 x 5670s Crossfired, running everything at 1920x1200 with full detail settings in DX 11.

Why the hell wouldn't you use the GPU in the CPU? (1)

tyrione (134248) | more than 2 years ago | (#37605004)

With OpenCL 1.1 throughout, and more applications leveraging it even on Linux, it's rather clear that once the KDE and GNOME desktop environments catch up in some respects with OS X on using OpenCL for GPGPU, you will be using all that power without even knowing it. Games most certainly will be using it. GIMP, Blender, Inkscape, and more are rolling it into their products.
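If you want to check whether your own setup exposes the GPU to OpenCL at all, a minimal sketch like the following lists every device the runtime can see. It assumes the PyOpenCL bindings and a vendor OpenCL runtime are installed, which is my assumption, not something from the thread:

<ecode>
import pyopencl as cl

# Walk every OpenCL platform/device pair the installed runtime exposes;
# an APU should show up as a GPU-type device alongside the CPU.
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(platform.name, "|", device.name, "|",
              cl.device_type.to_string(device.type))
</ecode>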

Re:Why the hell wouldn't you use the GPU in the CP (1)

hedwards (940851) | more than 2 years ago | (#37605074)

There's that, but there's also dual GPUs which have been around for a while. I think Apple has offered dual GPU laptops for years now, where the big one would only get tapped for GPU intensive use, saving battery power.

A desktop isn't as sensitive to power use as a laptop is, but you could still conceivably cut down on the electrical bill and cooling costs.

Re:Why the hell wouldn't you use the GPU in the CP (0)

Anonymous Coward | more than 2 years ago | (#37606444)

GNOME 3 already refuses to run without modern hardware acceleration - sadly, this means GNOME 3 in VirtualBox or KVM is a no-go.

Re:Why the hell wouldn't you use the GPU in the CP (1)

bill_mcgonigle (4333) | more than 2 years ago | (#37607326)

Because the Linux drivers are no good on this. I've got a 3650 with Fedora 15, and most of the stuff works under Linux 3.0, but the video on my display is shifted up and to the left for no good reason, and tinkering with modelines didn't move the picture at all. I'm still using the CPU, but I put my nVidia card back in so I could use my display.

Re:Why the hell wouldn't you use the GPU in the CP (1)

tyrione (134248) | more than 2 years ago | (#37607722)

Compositing and leveraging the GPGPU are up to the desktop environment and the application. Only OS X has OpenCL/GCD system-wide and app-wide, because the compositor, Quartz, and the WindowServer all leverage OpenCL natively, working with a fully accelerated OpenGL 3.2 environment in 10.7. Linux is still sucking hind tit with OpenGL 1.4. It's only with KDE 4.7.x that OpenGL ES 2.0 bits are now being leveraged. At that rate it'll take years for Linux to catch up. Hopefully, with X moving to Wayland, the gap won't be so great.

Re:Why the hell wouldn't you use the GPU in the CP (0)

Anonymous Coward | more than 2 years ago | (#37613210)

In 2 years that built-in GPU won't be all that hot. Meanwhile, a 2 year old CPU is perfectly fine for any normal workload (as long as you have a recent video card for games).

LLANO? (0)

Anonymous Coward | more than 2 years ago | (#37605062)

LMFAO!

Re:LLANO? (1)

blair1q (305137) | more than 2 years ago | (#37605448)

Too right. I've been to Llano, and AMD picked the right name for a podunk part.

Using the built-in Radeon (4, Informative)

Anonymous Coward | more than 2 years ago | (#37605190)

Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now, so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been saying much about this, though. Odd.

Re:Using the built-in Radeon (0)

Anonymous Coward | more than 2 years ago | (#37605216)

The horsepower is right there. That they're planning on it is wise, but honestly, if you drop a Radeon core and forget about it, someone will fire it up, even if only on Linux.

Re:Using the built-in Radeon (1, Flamebait)

geekoid (135745) | more than 2 years ago | (#37605452)

Ah yes, posting AC dissolved all obligations~

What an untrustworthy piece of shit you are.

Re:Using the built-in Radeon (1)

MobileTatsu-NJG (946591) | more than 2 years ago | (#37606890)

It's a bummer this was modded down; I'm inclined to agree. If you promise to keep a secret, you should keep your promise.

Re:Using the built-in Radeon (1)

O('_')O_Bush (1162487) | more than 2 years ago | (#37607866)

Except he didn't say he promised to keep a secret and then broke it; he said he didn't know (i.e., they didn't tell him not to).

A focus-group member isn't part of the design or marketing committee, after all.

Re:Using the built-in Radeon (1)

MobileTatsu-NJG (946591) | more than 2 years ago | (#37608076)

Oh, well if he's unsure of his obligations then he's off the hook.

Impressive rebuttal, bro.

Re:Using the built-in Radeon (1)

hairyfeet (841228) | more than 2 years ago | (#37609514)

Uhhh... if it's a "secret" that everybody knows about [wikipedia.org], does it still count as a secret? The only difference is that instead of the GPU being on the board, it's on the die itself, hence why they are testing. But hybrid Crossfire has been around since the HD2xxx-series cards, so it isn't as if everyone didn't already know about it.

Re:Using the built-in Radeon (1)

ackthpt (218170) | more than 2 years ago | (#37605494)

Not sure if I'm supposed to spill the beans on this, but I'm an AC, dammit. I'm in their focus-group thing, and apparently they're working real hard on a Crossfire-like solution right now, so your "free" on-chip GPU isn't being wasted if you throw down for a discrete card. They haven't been saying much about this, though. Odd.

I figured they were trying to fly under the radar until they got to some point.

Re:Using the built-in Radeon (1)

Zuriel (1760072) | more than 2 years ago | (#37606126)

I figured they were trying to fly under the radar until they got to some point.

Google says they aren't doing a very [softpedia.com] good [wikipedia.org] job [softpedia.com].

Re:Using the built-in Radeon (1)

gozu (541069) | more than 2 years ago | (#37605530)

This is already known; I read about it a few weeks ago. You didn't spill any beans, I'm afraid.

Re:Using the built-in Radeon (1)

Baloroth (2370816) | more than 2 years ago | (#37605818)

Well, considering that I was wondering whether you could do precisely this with the Llano (and if not, why the hell not), I don't think this will be considered "leaking" any information. That, and they demoed a scaling programming language for using multiple GPUs several months ago, which is basically a similar idea. AMD could make massive inroads on Intel if they can get such a system working well.

Re:Using the built-in Radeon (1)

dabadab (126782) | more than 2 years ago | (#37609918)

Sorry, are you from the past?... The Dual Graphics option for Llano has been in the news basically since Llano's existence became known. It has also been featured in basically all the Llano reviews (like this one [anandtech.com] from June), so I am not sure what you mean by them "not saying much about this".

What about video codec support under linux? (1)

Jah-Wren Ryel (80510) | more than 2 years ago | (#37605490)

Forget 3D; what I'd like to know is how good the video codec support is under Linux. Specifically, de-interlacing and pulldown of 1080i video for MPEG-2, H.264, and VC-1. I'd really like to dump my Windows box, but so far the very best de-interlacing, in both quality and coverage, seems to be with Nvidia under Windows.

Re:What about video codec support under linux? (1)

UnknowingFool (672806) | more than 2 years ago | (#37605766)

For Linux, it appears that support is rather new. An SDK for XvBA [wikipedia.org] was released by AMD in February 2011. VAAPI (Intel) and VDPAU (nVidia) have been out longer, comparatively.

ATi/AMD still lagging looooong way behind (1)

Lead Butthead (321013) | more than 2 years ago | (#37606320)

My personal experience has been that with nVidia parts, their proprietary driver "just works" under Linux, even on occasions when it can't identify the part itself.

With ATi/AMD... not so much; more often than not, trying to install the proprietary driver is like pulling teeth out of a pitbull's mouth. Even if I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.

I don't know why ATi/AMD suck this hard, or why it takes so much effort to get anything they make to work. But frankly, as an end user, I shouldn't have to know or care. Stuff either "just works" or will not be considered for purchase.

Re:ATi/AMD still lagging looooong way behind (1)

Billly Gates (198444) | more than 2 years ago | (#37606574)

I ran Fedora 14 before the GNOME 3 fiasco and then switched to Windows 7. My system had an ATI 5750 and used Fedora's ATI proprietary drivers, and it worked fine. Fluid animations, great video support at 1080p, and it never crashed; it was a very stable system. I do admit I did not do gaming or CAD on it. I dual-booted to Windows 7 to run WoW or anything like that.

AMD has higher-quality hardware and better cards, in my opinion. Its drivers are always so-so and conservative compared to Nvidia's. I have had 2 nvidia chipsets and cards fail within the last 5 years, so I am sticking with ATI. But I do use Windows more than some slashdotters, so that makes a difference too.

whoa (1)

unity100 (970058) | more than 2 years ago | (#37607190)

I have had 2 nvidia chipsets and cards fail within the last 5

I just sold my 4-year-old dual-slot Sapphire 3870 to my friend's sister's family, and they are playing The Sims 3 as a family with that card.

Installation is fine, it's the stability (1)

Chris Burke (6130) | more than 2 years ago | (#37613064)

With ATi/AMD... not so much; more often than not, trying to install proprietary driver is like pulling teeth out of a pitbull's mouth. Even I get it to install, it only sort-of-kind-of works. Trying to uninstall it is downright insane.

Having recently switched to an ATI card, using Ubuntu, to these observations I say LOL no, yeah pretty much, and LOL no.

Installation is simple: System -> Additional Drivers -> Enable ATI proprietary drivers -> Reboot (this part sucks, but oh well).

Removal is the same procedure except the button says "Disable" instead of "Enable". There is absolutely nothing insane about it at all.

Now as far as the "works" part, that's a different issue... It mostly works, and when it works it works excellently, but then sometimes it just craps out on a game. Sometimes there's a workaround, sometimes not.

For example, Minecraft just crashes instantly when it tries to render the first frame of an actual level. Now that the main menu has a fuzzy rendering of a Minecraft level as the background, it crashes at the main menu.

I found a workaround though -- Install the ATI drivers, reboot to activate them, then remove the ATI drivers, then don't reboot. Then the game runs perfectly stably, though the desktop flickers through (especially if there are other windows underneath Minecraft).

So that's an annoying way to play the game.

I'm hoping that by the time I want to upgrade (or the card dies, which is why I replaced my old Nvidia card), the drivers have improved. Otherwise, it's probably back to Nvidia.

I like the idea of having actual choice in Linux video card vendor. :P

Re:What about video codec support under linux? (1)

Zuriel (1760072) | more than 2 years ago | (#37606386)

There's an SDK out *now*, but they're late to the party. No one's really interested in implementing a *third* API, so XvBA only gets used through the VAAPI --> XvBA wrapper. There's also a VAAPI --> VDPAU wrapper and direct VAAPI support for Intel IGPs, so the competition seems to be between VDPAU, for its relative maturity and polish, and VAAPI, for its wide support.

I don't believe VAAPI has *any* hardware-based deinterlacing yet.

On an unrelated note, why are we still doing interlacing for 1080p LCD panels? Surely we could do 1920x540 at 50 fps and stretch the image more easily than doing 1920x1080 at 25i and deinterlacing, if we really needed the extra frame rate and couldn't do 1920x1080 at 50p. And stretching progressive frames means no elaborate tricks to prevent fine detail from flickering.

Re:What about video codec support under linux? (0)

Anonymous Coward | more than 2 years ago | (#37607176)

I can tell you are in Europe. In the US (and other 60Hz countries), interlacing is often a lot more complicated than it is for you 50Hz folks.

FYI, interlaced VC-1 probably won't work at all under Linux, regardless of hardware. There is something about the bitstream (versus progressive VC-1) that requires as-yet-unwritten code. Last time I tried feeding an interlaced VC-1 stream to ffmpeg, just to demux (not re-encode) from m2ts to mkv, ffmpeg errored out saying it doesn't handle interlaced VC-1 at all. And since most open-source video projects use the ffmpeg libraries, nobody really supports it. Sometimes top-field-first works (like in VLC), but bottom-field-first practically never works.
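For reference, the demux-only operation described above is just a stream copy. A sketch like this reproduces it (input.m2ts is a placeholder file name); "-c copy" remuxes without re-encoding, and parsing the interlaced VC-1 bitstream is reportedly the step where ffmpeg bails out:

<ecode>
import subprocess

# Remux m2ts -> mkv without re-encoding: "-c copy" copies the streams as-is,
# but ffmpeg still has to parse the bitstream, which is where interlaced
# VC-1 support is missing.
subprocess.run(
    ["ffmpeg", "-i", "input.m2ts", "-c", "copy", "output.mkv"],
    check=True,
)
</ecode>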

Re:What about video codec support under linux? (1)

drinkypoo (153816) | more than 2 years ago | (#37608498)

FYI, interlaced vc1 probably won't even work at all under linux, regardless of hardware. There is something about the bitstream (versus progressive vc1) that requires unwritten code. Last time I tried feeding an interlaced vc1 stream to ffmpeg, just to demux (not re-encode) from m2ts to mkv, ffmpeg errored out saying it doesn't handled interlaced vc1 at all.

So have you tried using, say, VLC with VDPAU? VLC claims to use "mostly" its own mux/demux.

Re:What about video codec support under linux? (1)

cynyr (703126) | more than 2 years ago | (#37610630)

I was wondering the same myself. I tend to go for the 720p stuff over the 1080i for the same reasons; it scales up nicely.

Re:What about video codec support under linux? (1)

Tacvek (948259) | more than 2 years ago | (#37610926)

1080i@50Hz (i.e. 50 "half-frames" per second) effectively encodes more visual data than either 1080p@25Hz, or stretched 540p@50Hz.

It encodes more than 1080p 25Hz by including information sampled twice as frequently, leading to smoother motion.

It encodes more than stretched 540p@50Hz by way of discriminating between twice the number of vertical lines, and thus providing twice the apparent vertical resolution on still (or slow moving) objects.

Obviously it provides less visual data than 1080p@50Hz, but 1080i@50Hz does provide more than the 50% of effective visual data that one might naively expect.

(Of course, that all assumes uncompressed streams. Once we add compression, the bit rates start screwing with that analysis, exaggerating or collapsing some of the differences depending on which resolutions and rates we are comparing and which compression scheme is used.)
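To put rough numbers on that (raw luma samples only, ignoring chroma subsampling and compression; the little helper below is mine, not from the thread):

<ecode>
# Raw samples per second = width x height x (fields or frames) per second.
def sample_rate(width, height, hz):
    return width * height * hz

print(sample_rate(1920, 1080, 25))  # 1080p@25Hz:             51,840,000/s
print(sample_rate(1920,  540, 50))  # 1080i@50Hz (50 fields): 51,840,000/s
print(sample_rate(1920, 1080, 50))  # 1080p@50Hz:            103,680,000/s
</ecode>

So 1080i@50Hz carries exactly the same raw sample rate as 1080p@25Hz (and as stretched 540p@50Hz); the difference is in how those samples are split between temporal and vertical resolution, which is precisely the trade-off described above.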

The real question is why we bother to de-interlace the video when playing back on a computer, instead of simply drawing each "half-frame" on the appropriate lines alternately. When content is playing at regular speed, both should appear similar. (Obviously, when pausing or playing in slow motion we want a de-interlaced image, but in both of those cases we can afford to spend a bit more time calculating each de-interlaced frame.)

Actually this is bullshit (1)

unity100 (970058) | more than 2 years ago | (#37606084)

For a good dedicated graphics card, you can expect to shell out at least 150 extra watts under load.

If a desktop chip sports the equivalent of such a dedicated card and its entire power consumption is 100 watts, that is NOTHING compared to a separate CPU + separate graphics card combo; such a system would draw at minimum 200 watts. So 100 watts compared to that is nothing.

If you look at it in that light, 65-watt consumption becomes something phenomenal.
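As a back-of-the-envelope check on what that difference actually costs (the 4 hours/day under load and the $0.12/kWh electricity price are my assumptions, not figures from the thread):

<ecode>
# Annual electricity cost of a part drawing `watts` while under load.
def annual_cost(watts, hours_per_day=4.0, dollars_per_kwh=0.12):
    return watts / 1000.0 * hours_per_day * 365 * dollars_per_kwh

print(annual_cost(200) - annual_cost(100))  # ~$17.50/year: 100W vs 200W system
print(annual_cost(100) - annual_cost(65))   # ~$6.10/year more: 65W vs 100W chip
</ecode>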

Re:Actually this is bullshit (0)

Anonymous Coward | more than 2 years ago | (#37606232)

Get it? PHENOMenal?

I hate myself.

gah gah gah gah (1)

unity100 (970058) | more than 2 years ago | (#37606312)

Actually, I've found that a Phenom II 965 BE is much more than enough for over-serious gaming, with its unlocked overclock capacity and readily blazing performance. And I'm not even spending 150 watts on the entirety of my rig, which sports 2 Crossfired cards (5670s; entry-level, but quite silent and they still deliver), 3x 23cm fans, one 14cm fan, an external USB audio card (recording grade), a fan controller, 1 SSD + HDD, and 8GB of RAM. So your joke may not be so shitty.

Re:gah gah gah gah (0)

Anonymous Coward | more than 2 years ago | (#37607758)

150 watts under what load? I just finished building an "experimental" HTPC (1100T X6, 6670, 300W 80+ PSU, 8GB RAM, WD VelociRaptor and WD Black) that idles at 60W but consumes 200W at the wall at full load (tested with Prime95 for the CPU and FurMark for the GPU). I used the 1100T under the assumption that underclocking/undervolting the most powerful 125W AMD part would yield better results than the 65W Phenom 915e, but so far I'm estimating about 95W at 3.1GHz with 2 of the 6 cores disabled and turbo at 3.5GHz.

Agreed, this is total bullshit (1)

turing_m (1030530) | more than 2 years ago | (#37609830)

I also really don't see what the big deal is about TDP; all that determines is what sort of HSF you use. The important thing is the idle power, because that is where the CPU sits most of the time.

Actually this made me think again (1)

unity100 (970058) | more than 2 years ago | (#37606354)

I threw in my lot with a Bulldozer-capable board and 2 discrete Radeon cards for my most recent upgrade. Now I am thinking that if I had gone the Llano route and shoved in another 6xxx, the performance would be much better as things currently stand.

Its just $139 (1)

unity100 (970058) | more than 2 years ago | (#37606706)

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103942 [newegg.com]

It is one 4-core CPU and one decent graphics card in a single package, and it's just $139. You would need to shell out $139 for a decent graphics card alone if you went the external route.

And great reviews:

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103942 [newegg.com]

Looks a little suspect?! (2)

Cajun Hell (725246) | more than 2 years ago | (#37608096)

Is this a joke? The integrated graphics are the whole fucking point! If you don't want 'em, you can get a Phenom II (or maybe even an Athlon II) that uses less power and runs faster.

If you don't use it as a car, the Honda Civic isn't really all that great a value, comparing slightly unfavorably to Stone Ruination IPA in most video compression benchmarks.

45W (1)

jones_supa (887896) | more than 2 years ago | (#37610934)

I think those AMD 45W quad cores (6??e series) were pretty cool (both metaphorically and literally). Intel rarely makes big desktop processors in such a low TDP range.