
AMD Unveils Elite A-Series APUs With Enhanced Performance, Improved Efficiency

Soulskill posted about a year ago | from the warning:-graphic-content dept.


MojoKid writes "AMD has just announced a new family of Elite A-Series APUs for mobile applications, based on the architecture codenamed 'Richland.' These new APUs build upon last year's 'Trinity' architecture, by improving graphics and compute performance, enhancing power efficiency through the implementation of a new 'Hybrid Boost' mode which leverages on-die thermal sensors, and offering AMD-optimized applications meant to improve the user experience. AMD is unveiling a new visual identity as well, with updated logos and clearer language, in a bid to enhance the brand. At the top of the product stack now is the AMD A10-5750M, a 35 Watt, 3.5GHz quad-core processor with integrated Radeon HD 8650G graphics, 4MB of L2 cache and a DDR3-1866 capable memory interface. The low-end is comprised of dual-cores with Radeon HD 8400G series GPUs and a DDR3-1600 memory interface."


On-die thermal sensors (3, Insightful)

Byrel (1991884) | about a year ago | (#43155125)

That qualifies as one of those inventions that make you wonder why it had to be invented... The utility is quite obvious.

Re:On-die thermal sensors (1)

gagol (583737) | about a year ago | (#43155211)

It is my understanding that Intel has used them since at least the Core 2 Duo era... maybe this is a refined version, with multiple sensors and all...

Re:On-die thermal sensors (0, Flamebait)

PhunkySchtuff (208108) | about a year ago | (#43155281)

Well, back in 2006, Intel had thermal sensors. AMD didn't... Here were the results:
http://www.youtube.com/watch?v=hxSqCdT7xPY [youtube.com]
The Intel chip was a Pentium 4 and I'm pretty sure they weren't the first ones with thermal sensors...

Re:On-die thermal sensors (2)

Byrel (1991884) | about a year ago | (#43155325)

AAAHHH, they let the magic smoke out of the AMD! Of course it stopped working! UNFAIR TEST! ~

Re:On-die thermal sensors (3, Informative)

adolf (21054) | about a year ago | (#43155723)

That was 2001, not 2006.

Re:On-die thermal sensors (2)

Ironhandx (1762146) | about a year ago | (#43156447)

It was also the result of overclocking the CPU to all unholy hell and wasn't possible under normal operating conditions, even when yanking the heat sink off. You would damage the processor by yanking the heat sink, yes, but it wouldn't smoke more than a very small amount or in any way explode.

The thermal sensor was not included on-die because sensors were built into the heat sinks of the time for cooling-report purposes, and an on-die one was not seen as necessary... which it wasn't.

Re:On-die thermal sensors (1)

TheRaven64 (641858) | about a year ago | (#43157315)

It was a slightly contrived test, but it demonstrated a real difference. At the time, CPUs were expensive but most OEMs used the cheapest possible cooling system. The Athlon on my desk at work got through 4 CPUs because the lowest-bidder cooling fan kept getting blocked up with dust and baking the chip (it got through 3 motherboards because the chipset fan did the same thing). In contrast, if you did the same thing to a P4, it just ran really slowly. We actually had some problems with this, because we had a cluster of P4 machines. The cooling in the nodes would periodically fail but they'd keep going, just really slowly. It's a lot easier to identify a failed cluster node than the one that's running at a tenth normal speed and slowing the entire job down...

Re:On-die thermal sensors (1)

Ironhandx (1762146) | about a year ago | (#43157391)

That's actually one of the things I've always liked about AMD and ATI. They both tend to not fuck around about failing. If it's dead, it's dead, none of this halfway bull.

Just finished replacing an Nvidia 560 Ti with a spare 6870 I had kicking around... they should be around the same speed, except the 6870 was actually twice as fast, due to some issue or other that had cropped up in the Nvidia card. Because it didn't fail properly, it just hung on and made the owner think he had a pile of viruses and blame basically everyone in his family for his computer being slow, when his video card was just half-cooking itself on a daily basis.

I've had video cards and CPUs from all four of the major vendors over the years... with a few notable exceptions, the rule of thumb has been that the AMD processors and the ATI cards would simply FAIL when they failed, or at least give some VERY clear indications of a hardware problem (usually significant artifacting in the case of the video cards).

Meanwhile I've seen an nvidia card quite literally melt the fan onto the heat sink without the computer even shutting down. Guy came in complaining about viruses slowing his WoW down...

Same thing happened with a couple of the old P4 dells I repaired.

Re:On-die thermal sensors (1)

ardor (673957) | about a year ago | (#43158055)

That's actually one of the things I've always liked about AMD and ATI. They both tend to not fuck around about failing. If it's dead, it's dead, none of this halfway bull.

Except that an overheated CPU tends to damage the mainboard as well. With a temperature regulated CPU, this doesn't happen. Besides, if the *fan* breaks, I just want to have to replace that fan, not fan+CPU+mainboard.

Re:On-die thermal sensors (1)

Lonewolf666 (259450) | about a year ago | (#43158273)

Add a diagnostic program to show the temperature/throttling state of the CPU and you have the best of both worlds ;-)
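On Linux, a bare-bones version of that diagnostic can just read the kernel's thermal zone files. The sketch below is illustrative only: it assumes the usual sysfs path and that thermal_zone0 is actually the CPU, which varies from machine to machine.

    /* Minimal CPU temperature readout via Linux sysfs.
       The kernel reports the value in millidegrees Celsius. */
    #include <stdio.h>

    int main(void) {
        FILE *f = fopen("/sys/class/thermal/thermal_zone0/temp", "r");
        long millideg;

        if (!f) {
            perror("open thermal zone");
            return 1;
        }
        if (fscanf(f, "%ld", &millideg) == 1)
            printf("CPU temperature: %.1f C\n", millideg / 1000.0);
        fclose(f);
        return 0;
    }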

Re:On-die thermal sensors (1)

Ironhandx (1762146) | about a year ago | (#43160957)

A steadily overheating CPU is more likely to blow the main board than one that overheated badly once and fried. I've had overheating CPUs go through two mobos before the real problem was identified and the CPU was replaced.

I have the same issue with video cards, though the extreme heat that some older high-end ATI cards were designed to run at caused the problem without the card ever batting an eyelash. It bled heat into the mainboard, and eventually the mobo died after a month or so.

Re:On-die thermal sensors (0)

drinkypoo (153816) | about a year ago | (#43158323)

My last ATI GPU-based-card gave intermittent errors, would flake out in some games but not others, etc. The plural of anecdotal data is meaningless rambling.

Re:On-die thermal sensors (1)

serviscope_minor (664417) | about a year ago | (#43158237)

The cooling in the nodes would periodically fail but they'd keep going, just really slowly.

That's actually quite amusing (from the outside). I imagine it had you scratching your head for a while.

I do support the lack of sudden death though. I've rescued a few machines which were "really crap" and never worked properly. The first was a Dell laptop which was shipped without a heatsink! It would run fast for a minute, then really slowly and randomly switch off when it got too hot. Another was a Core 2 desktop which had a heatsink kind of flopping around.

Generally good for me that they could be fixed.

Re:On-die thermal sensors (1)

dreamchaser (49529) | about a year ago | (#43155217)

More to the point, they are still playing catch up with Intel by essentially implementing their own version of the latter's Turbo Boost.

Re:On-die thermal sensors (5, Informative)

r1348 (2567295) | about a year ago | (#43155279)

Not exactly; AMD has had single-core power boosts for quite some time now. This is a refined version that calculates the boost from real-time sensor data instead of conservative assumptions. So basically: the better you dissipate heat, the faster it goes.
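Neither vendor publishes its boost algorithm, but the difference described above can be sketched roughly like this - all numbers and the read_die_temp() helper are made up for illustration, not taken from AMD:

    /* Contrast between a fixed, worst-case boost bump and one that spends
       whatever thermal headroom the on-die sensor actually reports. */
    #include <stdio.h>

    #define BASE_MHZ      2500
    #define MAX_BOOST_MHZ 3500
    #define TJ_MAX_C      95.0

    /* Hypothetical stand-in for an on-die thermal sensor readout. */
    static double read_die_temp(void) { return 68.0; }

    /* Old style: assume the worst about cooling, always leave a big margin. */
    static int boost_conservative(void) {
        return BASE_MHZ + 300;               /* fixed, cautious bump */
    }

    /* Sensor-driven style: the cooler the die, the higher the clock. */
    static int boost_from_headroom(void) {
        double headroom = TJ_MAX_C - read_die_temp();   /* degrees to spare */
        int extra = (int)(headroom * 25.0);             /* ~25 MHz per spare degree */
        return (BASE_MHZ + extra > MAX_BOOST_MHZ) ? MAX_BOOST_MHZ : BASE_MHZ + extra;
    }

    int main(void) {
        printf("conservative boost : %d MHz\n", boost_conservative());
        printf("sensor-based boost : %d MHz\n", boost_from_headroom());
        return 0;
    }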

Re:On-die thermal sensors (5, Insightful)

dreamchaser (49529) | about a year ago | (#43155311)

They are both essentially dynamic overclocking, and both rely on thermal data. I'd say they are more alike than dissimilar. I'm not saying it's a bad thing that AMD has done this, but I'd much rather see IPC improvements than brute force attempts to lower the existing performance gap between the two vendors.

Re:On-die thermal sensors (1)

LordLimecat (1103839) | about a year ago | (#43159771)

If you can figure out how AMD can do that with a fraction of the budget and no in-house fab, I do believe they would have a job for you as CFO.

Re:On-die thermal sensors (0)

Anonymous Coward | about a year ago | (#43156331)

I've had 3.6GHz Intel dual-cores... fried them every time. Nice to see some real-world INTELLIGENCE being used in CPU manufacturing for once.

Re:On-die thermal sensors (0)

Anonymous Coward | about a year ago | (#43157021)

Having thermal sensors built-in is smart, but putting CPU cooling under software control is the WORST idea ever!

Just stick a microcontroller or something on the board to control fans and throttle back when necessary. I don't want to risk blowing up my secondhand laptop just by installing a generic Linux on it.
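That is roughly what the embedded controller in most laptops already does. A toy version of such a firmware loop might look like the following - the hardware accessors are stubbed out purely so the sketch compiles and runs; a real EC would poke actual registers.

    #include <stdint.h>
    #include <stdio.h>

    static uint8_t fake_temp = 45;                  /* stand-in for a sensor */
    static uint8_t ec_read_temp_c(void)   { return fake_temp; }
    static void ec_set_fan_pwm(uint8_t p)  { printf("temp %u C -> fan %u%%\n", fake_temp, p); }
    static void ec_assert_prochot(int on)  { if (on) printf("PROCHOT asserted\n"); }

    /* One pass of a firmware thermal loop: fan curve plus a hardware
       throttle line, with no help needed from the OS. */
    static void ec_thermal_step(void) {
        uint8_t t = ec_read_temp_c();

        if (t < 50)      ec_set_fan_pwm(20);
        else if (t < 70) ec_set_fan_pwm(60);
        else             ec_set_fan_pwm(100);

        ec_assert_prochot(t >= 95);   /* last-ditch protection above 95 C */
    }

    int main(void) {
        for (fake_temp = 45; fake_temp <= 100; fake_temp += 25)
            ec_thermal_step();
        return 0;
    }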

whats the spec benchmark ? (3, Interesting)

johnjones (14274) | about a year ago | (#43155339)

Seriously, show me numbers.

Also, 9.6W for decoding MPEG is pretty horrendous, but I'm guessing that's because they have to power the whole GPU rather than a small specialised unit.

Where is the benchmark?

regards

John Jones

Re:whats the spec benchmark ? (2)

gagol (583737) | about a year ago | (#43155481)

Your specialised unit will not work with the next standard. A software update will work with the GPU. Can't have everything.

Re:whats the spec benchmark ? (3, Informative)

Anonymous Coward | about a year ago | (#43155653)

Might want to read the table under the bar chart.
9.6W is at the system level (i.e. the whole netbook), while the APU silicon itself is consuming 2.923W. The rest of the system would be the LCD, SATA HDD, memory, WiFi, etc.

Not sure where your MPEG part comes from, as they didn't specify the encoding, only playback from HDD.

Re:whats the spec benchmark ? (3, Informative)

TubeSteak (669689) | about a year ago | (#43156049)

This was the first benchmark I found.
Keep in mind this new CPU is for mobile usage.
http://www.notebookcheck.net/AMD-A-Series-A10-5750M-Notebook-Processor.87797.0.html [notebookcheck.net]

First PCMark 7 benchmarks show a performance increase of around 10 percent over the A10-4600M (5750M: 2175 points, 4600M: 1965 points).
Thus, the A10-5750M would place roughly at the level of a Core i3-2330M (Sandy Bridge).

Notebook Check is pretty awesome.
If anyone knows of a better/equal website for laptop hardware, I'd like to know.

Re:whats the spec benchmark ? (-1, Troll)

Gothmolly (148874) | about a year ago | (#43158017)

You automatically lose for 'signing' your post with your name.

Re:whats the spec benchmark ? (1)

Anonymous Coward | about a year ago | (#43159261)

What the fuck did he lose? This is a discussion forum, not a game.

AMD even still relevant? (-1)

ArchieBunker (132337) | about a year ago | (#43155341)

Why does anyone buy AMD chips anymore? Their top of the line quad core barely keeps up against the mid range i5 chips.

Re:AMD even still relevant? (5, Insightful)

Khyber (864651) | about a year ago | (#43155377)

Only in single-threaded tasks. You get into multi-threaded and AMD begins to win outright.

Re:AMD even still relevant? (-1, Flamebait)

RightSaidFred99 (874576) | about a year ago | (#43155673)

Lulz. Yes, if you have 50 threads that each increment a counter then AMD _pwns_. Real world shit? Not so much.

Re:AMD even still relevant? (5, Informative)

Zuriel (1760072) | about a year ago | (#43155769)

You've never run a video transcode or compiled anything, have you?

I transcode Fraps recordings and upload them to YouTube, I transcode Blu-ray video for my Nexus 7, and my MythTV backend often has transcode and commflag jobs queued that could run in parallel with no performance loss if it had more cores. 7-Zip will happily multithread compression tasks across dozens of cores. None of that is particularly exotic.

When you say "real world shit" you're talking about games, right? Be aware that there are things other than World of Warcraft that will tax a CPU, and they aren't imaginary or hypothetical.

Re:AMD even still relevant? (1)

Luckyo (1726890) | about a year ago | (#43155841)

The vast majority of games do not even tax dual cores all that heavily, and quad cores are massive overkill for all but a small handful of titles. Optimizations a la BF3 are unnecessary for most games because their work just doesn't run well in parallel, and even BF3 doesn't scale all that well past 4 cores. Yours is a fringe case where you perform exactly one on-the-fly task that actually scales efficiently beyond a couple of CPU cores (video encoding). So his claim of "real world shit" stands. The overwhelming majority of software that exists, and will exist in the near future, does NOT scale well with core count. The most popular exceptions are graphics and video renderers, which are typically optimized to scale with GPU cores rather than CPU cores.

On the other hand, with AMD getting both the PS4 and the next Xbox as platforms featuring their APUs, this may change in a few years. But for now, essentially everyone outside a small handful of use cases will get far more performance from a CPU with a few powerful cores than from one with many weak ones.

Re:AMD even still relevant? (1)

aztracker1 (702135) | about a year ago | (#43156135)

I can say that for me, when running several server processes in the background for development, I get more for the dollar out of AMD... If I were building a gaming rig, I would probably lean towards a newer Core i5.

Re:AMD even still relevant? (2)

GigaplexNZ (1233886) | about a year ago | (#43156161)

Video transcoding, while not representative of the average computing task, is still definitely "real world shit", and much more so than simply incrementing counters.

Re:AMD even still relevant? (3, Interesting)

Zuriel (1760072) | about a year ago | (#43156423)

The vast majority of software that exists won't max out an AMD E-350 netbook chip. Put things in perspective here: we're talking about the minority of software that will actually tax a system.

A lot of programs are single-threaded or do almost all of their work in a single thread, and don't really benefit from more cores. Other programs scale almost linearly with the number of cores. I was only making the point that software that takes advantage of many cores isn't as rare as the great-grandparent seems to think. AMD's multi-threading advantage with its 8-core chips isn't just something AMD fanboys babble about; there are real benefits in real software that people actually use.
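Amdahl's law is the usual way to put numbers on that divide: speedup = 1 / ((1 - p) + p/n), where p is the fraction of the work that parallelizes and n is the core count. The 30% and 95% figures below are arbitrary examples, one for a mostly-serial program and one for a mostly-parallel one.

    #include <stdio.h>

    /* Amdahl's law: ideal speedup on n cores given parallel fraction p. */
    static double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void) {
        const int cores[] = {2, 4, 8};
        for (int i = 0; i < 3; i++) {
            int n = cores[i];
            printf("%d cores: 30%% parallel -> %.2fx, 95%% parallel -> %.2fx\n",
                   n, amdahl(0.30, n), amdahl(0.95, n));
        }
        return 0;
    }

A 95%-parallel job gets nearly 6x out of eight cores; a 30%-parallel one gets about 1.4x, which is why extra cores feel pointless for so much desktop software.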

Re:AMD even still relevant? (0)

Luckyo (1726890) | about a year ago | (#43156621)

I can testify that this is simply bullshit (as in a bold-faced lie). I have an E-450 notebook sitting 2 meters away from me as I type this, and the CPU is so weak that I routinely watch it max itself out on mundane everyday tasks. Even browsing the web has noticeable hiccups on page loads that I do not see on this Core i5 desktop; OpenHardwareMonitor shows both cores spike to 100% and sit there for a while as a page is processed.

I had similar issues with everything from LibreOffice Calc to software installers and updaters maxing out the CPU for a while on that laptop. I almost never see my desktop i5 maxed out. It's very noticeable, because OS responsiveness holds up on the i5 while doing the same things (updates, software installs); the workloads I run on both machines are very similar.

That said I have no complaints about the notebook itself, it cost me 350 EUR about a year ago and the main reason I bought it was to be able to play games like LoL and SC2 on the go from the battery for several hours. It does that well with both games. Most intel-based notebooks either can't match the battery life or can't match the GPU performance or both.

But the CPU is definitely the weak part of the package.

Re:AMD even still relevant? (2)

GauteL (29207) | about a year ago | (#43157413)

"I can testify that this is simply bullshit (as in a bold-faced lie)."

No. It was rhetoric. The grand parent picked the E-450 as an example. While it was admittedly a poor example, he/she did not mean for you to take this as a literal description of the E-450's prowess.

The point, which you missed, was that you don't need either a Core i5 or an AMD Bulldozer to surf the web and write documents. This point is true regardless of the poor choice of example. For the vast majority of software out there a $60 cpu will do just fine (think modern Pentium or high-end Celeron).

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43158501)

A second-gen Intel i3 feels about 10 times faster than an AMD E-350. No more AMD for me, thank you very much.

Re:AMD even still relevant? (1)

Herr Brush (639981) | about a year ago | (#43163815)

I can vouch for this. My HTPC has no problems playing 1080p video in WMC or XBMC, boots in 10secs, launches other progs super quick - all with a dual core Celeron G630. It would be a fine mid-range gaming rig too if I added a discrete video card.

Re:AMD even still relevant? (1)

Luckyo (1726890) | about a year ago | (#43168099)

But that's exactly why people like me buy the low-end APUs. A discrete video card means it won't run for long off the battery while actually playing games; APUs like the E-450 get long battery life while actually playing games.

Also, the E-450 cannot decode 1080p H.264 High Profile video in real time on the CPU without a lot of dropped frames; there isn't enough computing power. It can usually handle 720p in software unless it's Hi10P, in which case it may lag depending on the content (the decoding thread pushes the CPU to about 80% and up).

Obviously these problems don't exist when using the hardware decoding path, where the GPU could probably decode 4K H.264 High Profile video on the fly with a properly optimized decoder. But when you're stuck on the CPU-intensive software path, you're SOL.

Re:AMD even still relevant? (1)

Luckyo (1726890) | about a year ago | (#43168071)

The grandparent (you) picked the E-350, which is the same hardware as the E-450 but at a lower clock speed. You then proceeded to tell open lies about its CPU performance. I debunked them because I happen to own the higher-clocked version of the CPU you used in your example, and my experience with the system over the last year shows all of your bullshit to be just that - bullshit.

The E-450 is a great choice if you want to play games on battery for a while, because it has a great integrated GPU. For the energy it consumes, it's probably the best on the market. That's why I bought it.

But the CPU on that die is complete and utter trash. The main reason I bought it anyway is that the games I play are not CPU-intensive. I was expecting it to lag on installers and even web browsing (though honestly not as much as it ended up lagging). Overall I'm pretty satisfied with the machine regardless. But trying to sell any CPU in this series on its CPU performance is complete and utter bullshit, because the CPU is utter trash. If things like interface response times, fluid web browsing, and background updaters that don't cause interface lag are a must, AMD's low-power APUs are simply off the table; the CPU part of the package is so weak that it will fall short on all of the above.

The bullshit about Bulldozer has no place here either. The E-450 is a dual core with no multi-threading and none of the pseudo "one module is two cores" thing the Bulldozer architecture has. It's an exceptionally weak CPU paired with an exceptionally powerful integrated GPU, which is what sells the package. To pretend the CPU is anything other than utterly abysmal is to lie.

Re:AMD even still relevant? (1)

halltk1983 (855209) | about a year ago | (#43158617)

If Intel performs better, why didn't you go with one of their packages for the on-the-go gaming? Surely the Intel graphics part could have kept up? Or is it that, for the situations you use it in most often, the overall performance of the AMD APU is better? I see Flash max out a single core on just about any system I see it run on. Java does the same thing.

Have you considered adblock and flashblock?

Re:AMD even still relevant? (1)

Luckyo (1726890) | about a year ago | (#43167985)

Have you considered reading what you're replying to before launching into a knee-jerk "oh my god, he criticised the company I love" rant after the first sentence? I clearly stated in the third paragraph why I did not get Intel for that notebook.

P.S. The E-450 chokes even on Slashdot WITH the "no ads" option ticked. I'm looking at >10 second page loads as the norm on longer threads in Firefox, with both cores at 100% throughout. On my first-gen desktop i5, load times are barely noticeable and mostly network-bound; the CPU isn't anywhere near maxed at any point during loading.

In a way, it's funny and sad at the same time. The parent told a clear lie praising his favourite company's product and got modded up to 3. I debunked it, since I happen to own the factory-overclocked version of the CPU he used in his example and my experience with it is the exact opposite of his claims, and got modded down to zero for doing so. Fanboyism at its finest. I suppose it's fitting that others followed the example.

Re:AMD even still relevant? (3, Interesting)

kermidge (2221646) | about a year ago | (#43156879)

I'm not sure which to blame - I've no easy way to figure it out - but when I turn off WCG under BOINC (normally running full-time at 100%) and play Civ V under Crossover XI on 64-bit Linux with an AMD 1090, I can see all cores being used. Loads range from ~30% to max. I like seeing that I'm getting my money's worth out of those cores.

I've been glad to have run a Phenom quad, a Phenom II quad, and now the hexa-core off the same mobo because it saved me money.

Re:AMD even still relevant? (2)

Lonewolf666 (259450) | about a year ago | (#43158355)

In Crysis 3, the AMDs look good:
http://www.pcgameshardware.de/Crysis-3-PC-235317/Tests/Crysis-3-Test-CPU-Benchmark-1056578/ [pcgameshardware.de]
Only the really expensive Core i7-3960x (800 Euros or more on my side of the Atlantic) beats the FX-8350. And with a TDP of 130W, it is similar to the FX in heating your PC.

Other games are already moving towards being designed for more parallel processing, for instance X: Rebirth by Egosoft, currently in development. The CEO said in an interview that a quad core is recommended; they will have at least three main threads and several smaller ones.
So the days of well-scaling games may be closer than you think.

Re:AMD even still relevant? (1)

LordLimecat (1103839) | about a year ago | (#43159841)

The vast majority of game servers, and basically every bit of internet infrastructure run by a company with more than 100 employees, is virtualized, and those that aren't can still use the extra horsepower. Whether or not Apache is running in a VM, I'm sure it can use more cores.

You fail to realize that end users and their processor usage are pretty tiny compared with the server side you never see.

Re:AMD even still relevant? (1)

Luckyo (1726890) | about a year ago | (#43168005)

I care about as much about server performance when I'm on my home machine as a server admin cares about home-machine performance when configuring his newest blade rack to do whatever it is it will be doing. Why should it be any different? We were talking about home machine use here; are you perhaps posting in the wrong thread?

Re:AMD even still relevant? (1)

ardor (673957) | about a year ago | (#43158101)

Agreed. Example: when I build a rootfs using OpenEmbedded, parallelism is absolutely essential, because it builds thousands of packages. And to let it build in parallel efficiently, you need all these cores (of course), plus fast I/O so it does not become a bottleneck, and lots of RAM to avoid using swap and have a large disk cache. So, for a build server, I'd go with 4 to 8 cores, 16 GB RAM (you can get that for 100 bucks these days), a 7200RPM hard disk for package downloads and archives, and an SSD for the OS and the build work & staging directories.

Re:AMD even still relevant? (1)

LordLimecat (1103839) | about a year ago | (#43159809)

You miss the bigger picture. What do you suppose YouTube runs on? You think they're using physical servers?

How many threads do you suppose the physical boxes behind the YouTube clusters are running?

Re:AMD even still relevant? (1)

karnal (22275) | about a year ago | (#43155385)

I bought an AMD for my recent file server build: $120 for chip + mobo (and integrated graphics, who needs a video card?). 35 watt TDP, and most of its time is spent idling away.

I could have gone Intel - don't get me wrong - but a decent motherboard with all the features of the AMD board I bought runs closer to $150 by itself.

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155469)

integrated graphics, who needs a video card?

Video cards - all the rage in the '90s. Modern GPUs are used by gamers, 3D VFX artists, anyone doing serious image/video processing, and they're fairly useful on servers too! [sisoftware.co.uk]

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155711)

A server needs no graphics; it shouldn't even have a video connector. For local connections, use the serial or ILOM connector.

Re:AMD even still relevant? (3, Interesting)

Anonymous Coward | about a year ago | (#43155389)

We buy lots of them for our HPC cluster. We can get four 16-core CPUs in a 1U box. Each core is slower than an Intel core, but the price/performance ratio is better by a factor of two. Of course this only helps if your jobs are very parallelizable or you have lots of users (both of which apply to us).

Re:AMD even still relevant? (3, Insightful)

SQL Error (16383) | about a year ago | (#43155645)

Likewise for our database clusters. We use open-source software (MongoDB, Redis, Riak) so hardware cost matters - if you use Oracle or something like that, software costs dominate. If you want large 4-socket servers, AMD offers much better value than Intel. And if you want lots of small 1S servers, AMD wins again, because the E3 Xeons only support 32GB RAM.

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155767)

Factor of two? Please... I know Intel charges a premium, and I don't doubt there is some extra bang for the buck on the AMD side as you mention, but a factor of two is a bit of a ridiculous claim.

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155999)

Not really. When I bought my last laptop a couple of years ago, it was about 40% of the cost of the lowest-priced Intel alternative, and that was for a business laptop, not cheap Acer crap. What's more, it smoked my mother's school-provided laptop with similar specs but an Atom processor, which cost twice what mine did.

Re:AMD even still relevant? (2)

serviscope_minor (664417) | about a year ago | (#43157283)

Well, that's the thing.

If you're in the 4-socket space (40 cores Intel IIRC, 64 cores AMD), then you're probably in the market for something pretty parallelizable. Which means, of course, that AMD doesn't really have a disadvantage there.

Re:AMD even still relevant? (1)

del_diablo (1747634) | about a year ago | (#43155409)

The problem is that Intel only has a marginal advantage, and AMD's chipsets are more than strong enough on top of that. If AMD's chipsets were underpowered, sure, your advice would be sane. Also, as others point out, it only applies to single threads.

Re:AMD even still relevant? (5, Interesting)

router (28432) | about a year ago | (#43155601)

Because AMD, unlike Intel (nowadays), makes next-gen chips available for previous-gen motherboards, so total cost of ownership is substantially lower with AMD than with Intel; I got a 3x performance boost on a several-year-old system this way. Because their motherboards are cheaper and use normal RAM (RDRAM debacle, anyone?). Because Intel has tried and failed to screw the enthusiast consumer for decades (except for that Celeron 300 -> 450 thing, that rocked). Because their multithreaded performance is better, because their 8-core chips are cheaper, and some of us run an operating system and compute jobs that take full advantage of multiple cores. Because some of us _like_ AMD, and their continued existence means lower CPU prices for everyone.

Maybe that's why.

andy

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155955)

+1 to all that. I just built a new AMD PC with an FX-8350 and 32 GB of RAM. Very nice. I don't mind showing AMD a little love. None of these big corps are perfect, but I like that AMD is around to threaten Intel so Intel can get away with a little bit less.

Re:AMD even still relevant? (1)

aztracker1 (702135) | about a year ago | (#43156159)

Ditto - just upped to an FX-8350 with 32GB of RAM... runs plenty fast with several services and virtual machines running...

Re:AMD even still relevant? (2)

toygeek (473120) | about a year ago | (#43155995)

This x1000.

In 2010 I bought an AMD Phenom II X2 550 Black Edition and a motherboard I could afford. I ran it as a quad core (unlocked cores) at 3.3GHz until that motherboard died. The only replacement I could get in a pinch didn't have a southbridge that would unlock the extra cores, but I still ran it at 3.4GHz without issue.

Fast forward to 2013: I just bought a new motherboard with DDR3 RAM (8GB of it, double the 4 I had before) and unlocked my cores again. It's like a new computer.

I took the old mobo and RAM, bought a cheap AMD processor for $55 (Athlon II X2 at 3.4GHz, VERY serviceable!), and gave that to my son, who was still on a 3.2GHz P4. We're both happy. AMD is cost-effective for those of us who aren't made of money and still want decent computers. I love them for it.

Re:AMD even still relevant? (1)

drinkypoo (153816) | about a year ago | (#43155655)

I built my AMD PC ages ago, you insensitive clod! An Intel motherboard with the same feature set cost literally twice as much, so that was a big help in deciding. And the availability of much more powerful CPUs in the same socket cemented the deal. I spent $100 each on the motherboard and CPU. Some time later I spent another $100 and got a CPU with twice as many cores at the same clock rate without having to change anything - just dropped it in.

Also, quad core? Only four cores? I mean, I started out with three, but today that would be silly.

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155725)

Why does anyone buy AMD chips anymore? Their top of the line quad core barely keeps up against the mid range i5 chips.

So are you saying that no one buys mid range i5 chips anymore?

Performance/$ is king.

Re:AMD even still relevant? (0)

Anonymous Coward | about a year ago | (#43155947)

Their top-of-the-line 8350 struggles against the 3570K, depending on what application you're running.

Re:AMD even still relevant? (2)

GigaplexNZ (1233886) | about a year ago | (#43156219)

The 3570K is not mid-range, it's high-end. The only substantial difference between it and the top i7 chips is Hyper-Threading, and that isn't always useful; in some workloads it's actually a hindrance. The types of workloads that favour Hyper-Threading are generally the ones where the AMD chips are competitive with the i7 anyway.

Re:AMD even still relevant? (2)

Zuriel (1760072) | about a year ago | (#43155827)

AMD doesn't have anything that can compete with Intel's top end on single-thread performance, but in the mid-range and lower end of the desktop market their products are quite competitive. Building a basic computer for a relative, or an office PC that's never going to do anything more intensive than run Word and play YouTube videos? AMD's APUs offer quite a lot of power for not a lot of money. Not everyone has a use for an i7-3970X.

Re:AMD even still relevant? (2)

letherial (1302031) | about a year ago | (#43155931)

You should be glad people still buy AMD chips; they're the only true competition for Intel.

I buy AMD because well...fuck Intel.

In all honesty the performance difference is negligible these days, but I've had my quad-core AMD for a while; it's sturdy, it does what I need, and I can overclock it... I have never been disappointed in an AMD CPU. Intel, however... I've had a few shitty ones from them. I won't buy ATI though - I hate those video cards.

So just be thankful that the market is what it is; if Intel had a monopoly, you would be paying a lot more for a shittier CPU.

Re:AMD even still relevant? (2)

0123456 (636235) | about a year ago | (#43156355)

So just be thankful that the market is what it is; if Intel had a monopoly, you would be paying a lot more for a shittier CPU.

But Intel CPUs are cheaper today than they were when AMD's were objectively better, back in the Pentium 4 space-heater era.

AMD isn't their major competitor, ARM is.

Re:AMD even still relevant? (1)

UnknownSoldier (67820) | about a year ago | (#43156029)

Because why would you support their dishonesty when they pull shenanigans like intentionally crippling the run-time performance of code generated by their compiler when it runs on non-Intel hardware?
http://www.agner.org/optimize/blog/read.php?i=49#49 [agner.org]
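What Agner documents boils down to dispatching on the CPUID vendor string rather than on actual feature flags. A bare-bones illustration of that kind of check (not Intel's actual dispatcher), buildable with GCC or Clang:

    #include <cpuid.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 1;
        memcpy(vendor + 0, &ebx, 4);   /* vendor string is spread across */
        memcpy(vendor + 4, &edx, 4);   /* EBX, EDX, ECX for leaf 0       */
        memcpy(vendor + 8, &ecx, 4);

        /* Keying the choice on the vendor string (instead of on feature bits
           like SSE2/AVX) sends non-Intel CPUs down the slow path even when
           they support exactly the same instructions. */
        if (strcmp(vendor, "GenuineIntel") == 0)
            puts("fast, vectorised code path");
        else
            puts("generic fallback path");
        return 0;
    }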

And of course they put the disclaimer up as a GIF so text search engines won't find it.
http://software.intel.com/en-us/articles/optimization-notice/#opt-en [intel.com]

When Intel learns to respect their customers, then I'll respect and support them.

Re:AMD even still relevant? (1)

Rockoon (1252108) | about a year ago | (#43156037)

Their top of the line quad core barely keeps up against the mid range i5 chips.

For the cost of that mid-range i5, you can get an FX-8350 that will completely spank the fuck out of it.

Now stop being an irrational Intel fanboy who ignores price.

Re:AMD even still relevant? (1)

hobarrera (2008506) | about a year ago | (#43156353)

Because they're way cheaper, especially in the mid range and even the mid-high range. The price difference is also more noticeable in less developed countries.
In my case, Intel does offer better CPUs than my current AMD one, but they're all out of my budget.

This is great. (-1)

Anonymous Coward | about a year ago | (#43155477)

I'm so glad that SOMETHING good came out of the Obummer Administration.

It's nice that Obamailure was able to force AMD to come out with elite APUs, but the real question revolves around why Bush Jr didn't do it first. A question that truly boggles the mind. I'll bet it has something to do with the mind control device that the illuminati had implanted in his head. You almost never saw Bush Jr on tv with a tinfoil hat on, which is a clear sign that he was being manipulated by the Jews, which is probably why he couldn't instruct AMD to come out with the APU. The reason Baquack was able to give the order to AMD was because he IS one of the Zionist leaders of the illuminati. Sure, he pretends to be a muslim, but we all know the truth.

His goal is total domination of the world, and he is starting by putting so-called APUs in computers. But they aren't REALLY APUs, as described. Sure, they do part of what they say they do, but how many people know about the hidden chips within the chip that perform two additional functions: 1) allow the government to access everything on the computer's hard drives, and 2) allow the government to remotely hack into the brains of nearby users. Very recently, in February of 2009, a breakthrough in mind control technology was reached. It is no longer necessary to directly connect to a person's brain to read their thoughts. It can be done remotely with a highly specialized computer chip. A computer chip that AMD is going to produce.

Don't fall for their lies. Buy Intel, Buy NVIDIA, and Reject the Illuminati, and Reject the Slavery. And always wear a hat made out of tinfoil when operating your computer. When in public you can put tinfoil on the inside of a normal hat, and they will be none the wiser. Spread the word. Friends don't let friends buy AMD and become slaves to the American government. Oh yeah, it isn't just Americans whose minds they will be reading. All of you are in danger. The zionists will probably mod me down and delete this post, to hide the truth, but those of you who have managed to see this, you will be saved, and you will be able to protect yourself.

Oh, and if you see someone who has more than two epicanthal folds, they're probably one of them. I've already said too much. Goodbye, my friends. I have to go off the grid now.

Re:This is great. (1)

Spottywot (1910658) | about a year ago | (#43155533)

You're weird.

major CPU struggles (-1, Flamebait)

slashmydots (2189826) | about a year ago | (#43155855)

I know it's been a long, hard struggle over the last decade or more for chip companies, but I'm glad they're getting past that and just putting on the label WHAT THE DAMN FUCKING CHIP IS! I'm sick of having to boot up a laptop just to see what the fuck "AMD Vision" means. It's a real crap shoot between E1 APUs and Phenom X4 chips. If it has a fucking A6 in it, put a fucking A6 on the logo! Not "intel inside" or "powered by xxxxx" or "centrino" or just "AMD", or nothing, or a green AMD logo instead of red, or just "pentium" where sometimes the hologram means G-series and sometimes it means P6000 series, or "Core Duo" where it might actually be a single-core chip. Ugh. Is it really that fucking hard?

Make Mine A Double! (1)

SQL Error (16383) | about a year ago | (#43156183)

So, 4 cores at 2.5-3.5GHz, 384 shaders, and dual-channel DDR3-1866 RAM at 35W.

If AMD were to double everything they'd have a really nice 70W desktop chip. Not sure what the die size is for Richland, so a doubled chip might not be cost-effective - though the PS4 APU has 8 cores and 1152 shaders, so it's at least possible.

Re:Make Mine A Double! (0)

Anonymous Coward | about a year ago | (#43156283)

Yeah, that request falls apart once you account for double the I/O pins for memory and the added motherboard complexity.

Re:Make Mine A Double! (1)

SQL Error (16383) | about a year ago | (#43156755)

Socket 2011 has quad-channel memory off one die, and AMD's G34 server socket has quad-channel off two. Again, it's definitely possible.

With 768 shaders off just dual-channel DDR3 you'd be seriously bandwidth-starved, so I don't see it working well without four memory channels.
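The back-of-the-envelope numbers support that. Theoretical peak for one 64-bit DDR3-1866 channel is 1866 MT/s x 8 bytes, roughly 15 GB/s; the discrete-card comparison in the comment is approximate and from memory.

    #include <stdio.h>

    int main(void) {
        /* One 64-bit DDR3-1866 channel: 1866e6 transfers/s * 8 bytes. */
        const double per_channel_gbs = 1866e6 * 8.0 / 1e9;   /* ~14.9 GB/s */

        printf("dual channel: %.1f GB/s\n", 2 * per_channel_gbs);
        printf("quad channel: %.1f GB/s\n", 4 * per_channel_gbs);
        /* For comparison, a 128-bit GDDR5 card of the era (e.g. Radeon HD 7790)
           has roughly 96 GB/s all to itself. */
        return 0;
    }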

Re:Make Mine A Double! (1)

Lonewolf666 (259450) | about a year ago | (#43158477)

Since we're talking about a midrange (relatively cheap) product here, a huge socket like 2011 might be prohibitive in terms of cost.

So what about a PC mainboard in PS4 style, with 8GB of GDDR5 RAM?
The CPU and RAM might have to be soldered on, but it would solve the bandwidth problem, and 8GB of RAM seems adequate for most things you would do on a midrange PC...

Re:Make Mine A Double! (0)

drinkypoo (153816) | about a year ago | (#43158315)

If AMD were to double everything they'd have a really nice 70W desktop chip.

Or if you were to just buy a Phenom II X6 and a real GPU (preferably not from AMD) then you'd have a much better desktop system. APUs are for portables and nettops.

Power Efficiency and Beyond Ghz CPUs... (0)

Anonymous Coward | about a year ago | (#43157137)

... this association has always made me laugh.

35W ?? (1)

dywolf (2673597) | about a year ago | (#43158009)

35 watts? Really? I have an AMD APU in my HTPC and it pulls 130W. They managed to drop it by over 100?

Re:35W ?? (1)

SQL Error (16383) | about a year ago | (#43158161)

They're releasing the laptop chips first.

Re:35W ?? (1)

dywolf (2673597) | about a year ago | (#43159371)

Ahhh, OK, I missed that part. I would love to use a lower-power part in my HTPC (I would have preferred the slightly more powerful, yet 100W, 2nd-gen APU, but they were out of stock, so I took what I could get at the time, which was the first generation of them).
