
AMD Introduces New Opterons

Unknown Lamer posted about 2 years ago | from the not-dead-yet dept.


New submitter Lonewolf666 writes "According to SemiAccurate, AMD is introducing new Opterons that are essentially 'Opteron-ized versions of Vishera.' TDPs are lower than in their desktop products, and some of these chips may be interesting for people who want a cool and quiet PC." And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.


Xeon? (0, Flamebait)

Anonymous Coward | about 2 years ago | (#42192977)

So, still slower and hotter than the equivalent Xeon?

Posting from a 3yo AMD Phenom II... (-1)

Anonymous Coward | about 2 years ago | (#42192985)

Black Edition made this first post ;-p

Re:Posting from a 3yo AMD Phenom II... (3, Funny)

Anonymous Coward | about 2 years ago | (#42193311)

Foiled again by Intel

Not watching the trends? (3, Interesting)

erroneus (253617) | about 2 years ago | (#42193079)

I hope people are starting to sit up and take notice. The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself. Games are just about as good as they are going to get without new display technologies. The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that? How about "embedded systems"? Things that people don't want or need to reboot? The current versions of Windows are too bloated, too power- and memory-hungry to fit within that framework, so it'll have to be another OS. We know this because of the horrible failure "Netbook" computing has been. People wanted it, but expected it to run Windows. Windows couldn't really do it effectively. (I know... people are still doing it... I've still got two netbooks running XP and going strong... but is anyone selling XP?) Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

Think about what we are seeing.

Re:Not watching the trends? (4, Insightful)

kwerle (39371) | about 2 years ago | (#42193215)

Actually, all modern OSs do a fantastic job of taking advantage of multiple cores. It's the apps that fail to do so.

As for OSs that take advantage of low power CPUs, you only mention MS - who (I suppose) has done a good job of this with Windows RT on the Surface. And maybe even a good job with whatever the hell Windows Phones run. It's just that consumers have not liked the apps. Of course Apple and Google both have solid contenders in the embedded space.

So, as it always has been: "It's the applications, dummy."

What are you trying to get at?
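The OS-versus-apps distinction in the comment above can be sketched with a hypothetical CPU-bound workload (function names and chunk sizes are illustrative, not from any real benchmark): the kernel will happily schedule work across every core, but only if the application splits its work into schedulable units.

```python
import multiprocessing

def burn(n):
    # CPU-bound work with no I/O: sum of squares below n
    return sum(i * i for i in range(n))

def serial(chunks):
    # A single thread of execution: the OS can only ever run
    # this on one core at a time, no matter how many exist.
    return [burn(n) for n in chunks]

def parallel(chunks):
    # The same work split across worker processes: now the OS
    # scheduler can spread it over all available cores.
    with multiprocessing.Pool() as pool:
        return pool.map(burn, chunks)

if __name__ == "__main__":
    chunks = [200_000] * 4
    assert serial(chunks) == parallel(chunks)  # identical results either way
```

The results are identical; only the second version lets a multi-core machine finish sooner, which is the commenter's point: the scheduler was never the bottleneck, the work-splitting was.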

Re:Not watching the trends? (3, Insightful)

UnknowingFool (672806) | about 2 years ago | (#42193609)

I think he's saying that CPUs bought several years ago are good enough for most people and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software, like the OS, so that you need new processors.

This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release to entice consumers to upgrade. It is nearly the same model that Apple uses.

A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.

Re:Not watching the trends? (1)

dbIII (701233) | about 2 years ago | (#42199353)

I think he's saying that CPUs bought several years ago are good enough for most people

Most people won't be buying Xeons or Opterons, but for those that do, 64 cores and 128GB of memory for under $9k means a lot more hardware for much lower budgets than expected.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42193253)

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

Nonsense, I've had xorg demand 75% CPU time; if that isn't the OS taking full advantage of my hardware, nothing is.

Re:Not watching the trends? (1)

CajunArson (465943) | about 2 years ago | (#42194503)

One problem is that xorg is using 75%... but only of one core. Xorg is a notorious offender when it comes to a program that you'd think would be very well multi-threaded but is actually single-threaded (Firefox being another although that is gradually changing).

Re:Not watching the trends? (2)

temcat (873475) | about 2 years ago | (#42193299)

Speaking a bit offtopically about selling XP, I think that there is a single measure that would make the SW and HW markets much healthier without drastic changes to the copyright law: compulsory downgrade rights to the SW products that aren't sold anymore.

This is a reasonable compromise for consumers and vendors: the former get to use the software they want, the latter continue to get their money, just without the ability to create additional artificial scarcity.

Re:Not watching the trends? (1)

Jeng (926980) | about 2 years ago | (#42193499)

So you are saying that a company should be forced to sell software they no longer want to support?

What about that company's support costs? If the company is still selling the software, its customers are going to demand support, and the main reason a company stops selling a product is that the product's support costs were too damn high. So how does that get the company more money? All it gets the company is pissed-off customers and frazzled tech support.

Re:Not watching the trends? (1)

Anonymous Coward | about 2 years ago | (#42193783)

XP still has support; it's called DIY.

Re:Not watching the trends? (1)

Khyber (864651) | about 2 years ago | (#42199339)

Out comes the Linux Mantra, but for an MS product! Wow.

Re:Not watching the trends? (1)

temcat (873475) | about 2 years ago | (#42194265)

No, a company shouldn't be forced to support obsolete software; that would be unreasonable. But customers should have the right to obtain it legally with no support obligation from the vendor - and the company shouldn't get to restrict it artificially.

Re:Not watching the trends? (1)

Jeng (926980) | about 2 years ago | (#42196027)

In your post you stated:

This is a reasonable compromise for consumers and vendors: the former get to use the software they want, the latter continue to get their money

How are the vendors getting money in your solution?

Re:Not watching the trends? (2)

temcat (873475) | about 2 years ago | (#42196213)

The keyword is "downgrade": for example, when I buy a box copy of Windows 8 Pro, in my proposed solution I am legally entitled to install Windows 7, Vista, XP, 2000 Pro box or less expensive editions of these instead of the 8. Here, Microsoft gets the money from me for its latest product, and I get to legally use its comparable obsolete product that I find better.

Re:Not watching the trends? (2)

Jeng (926980) | about 2 years ago | (#42196555)

That is nice and reasonable as long as the consumer is aware that support will only be for the version they purchased.

Re:Not watching the trends? (4, Insightful)

hairyfeet (841228) | about 2 years ago | (#42199333)

How EXACTLY are they "restricting it artificially" pray tell? If you want an XP license today you can get software assurance, an MSDN, or just buy a retail copy, that is three different ways to get XP right there.

People simply aren't buying XP because more and more software has skipped Vista and moved straight to Win 7/8, so as XP gets ever longer in the tooth (Good Lord, it's over a decade old, folks, has patches on top of patches and doesn't even have support for more than 3GB of RAM) all the remaining XP machines out there are legacy boxes that people for one reason or another have just not decided to upgrade. This is understandable, as most people don't want to spend money upgrading some 7-year-old PC that requires more expensive parts than a new one, but that isn't any kind of restriction, that's just common sense.

And I'd be happy if you'd show us a reliable source (i.e., not a "ZOMFG M$ is gonna kill the servers and burn babies ZOMFG!" blog post) with a single quote from MSFT about killing the activation servers. The only quote I've seen from MSFT says that if they decide to kill the activation servers they will simply point the systems to a page where they can download a simple activation killer, no different than how you can still find the KBs and patches for Win2K at Microsoft; they just don't support it anymore, so if you run it you're on your own.

So I don't see how they can't obtain XP; there are several ways to buy it, and I'd be happy to provide links but I figure most can Google. Hell, you can even use WSUS Offline and set up your own XP (or 2K3, or Vista, or 7) update server and keep installing XP all you want. You just won't be getting support after Apr 2014, which, since we are talking about an OS that came out when the average system was a 400MHz PII with 64MB of RAM, is really not unreasonable and frankly longer than anybody else out there. Does Apple still support OS X 10.0? Does Linux still provide patches to Debian 2 or whatever was released in 2001? Nope, so I really don't get why anybody is having a fit over this. Hell, they gave it twice the lifespan of Win9X and have made 10 years of support standard on ALL of their OSes, not just Pro and Enterprise like before, but even Basic and Home, so I honestly don't see what is up with the teeth gnashing and double standards.

I mean, is anybody REALLY taking that brand new i3 or AMD quad with 4GB of RAM and slapping XP on it? Because if so they don't need more licenses, they need a CAT scan; you are crippling the system with a creaky old OS that was never made to run on the specs we have now, and it's just nuts. Even the $200 netbooks are several times faster than the workstations were back then; it's just pointless to use XP now for anything but legacy apps.

Re:Not watching the trends? (3, Interesting)

Attila Dimedici (1036002) | about 2 years ago | (#42193607)

I would go a different route. When you stop selling the software, it goes public domain. Of course if copyright still only extended for a reasonable length of time, Windows XP would be public domain.

Re:Not watching the trends? (3, Insightful)

erroneus (253617) | about 2 years ago | (#42193775)

Oh I completely dig that idea. If it is of no use to you (ie. you aren't selling it) then you have apparently exhausted its value to you as a business. It is now your responsibility under the contract of copyright, to release it to the public domain. But no. "The value" is maintained by keeping it away from the public in order to ensure that they keep buying the same things over and over and over again. This is a public abuse which could only be enabled by copyright law.

So copyright went from the right to copy and distribute to the right to take it away from the public and to withhold information, arts and technology.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42195405)

It's hotpatches and continued support that keep people updating to the latest version. Windows XP, free or not, is sunsetting/EoL.

If updates and patches didn't matter, wouldn't people still be running Windows 98SE? It's not public domain, but it's just about free on the second-hand market.

Re:Not watching the trends? (1)

erroneus (253617) | about 2 years ago | (#42195463)

1. Some people still are running Win98SE
2. If WinXP were free, people would likely kick in with their own fixes and updates

Also, if the copyright for Windows XP were turned over to the public domain, I think the source code would ALSO be made available somehow. They could try to keep it to themselves... you know they wouldn't be obligated to publish it anywhere as far as I know... not sure how the Library of Congress works with regards to all of that, but I get the feeling the code would get leaked somehow.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42195599)

1) Yes, but not all that many.

The code could get leaked, but I'd think if it ever would, it would have been already. As for a formal release, it seems really unlikely they'd actually be required to go through a source code release process- if they even have the full thing, along with the toolchain, still sitting around.

Think about the things that HAVE been public domain'd. Books, movies... they don't have to release the storyboards and such, you just have rights to the finished product and the ideas within that product.

Re:Not watching the trends? (1)

erroneus (253617) | about 2 years ago | (#42196463)

No, they don't... and it's just a thought.

But in general, I think when people are ready to give up on the new version of Windows, they will pirate an old one and/or move to another OS. The options are limited, but flavors of Linux like Ubuntu are extremely effective with new users. (I am not an Ubuntu fan. Not at all!)

I'm not predicting doom for Windows and certainly not setting a date. I am extrapolating the more numerous and recent failures Microsoft has had and they are huge. The primary cause is customer dissatisfaction. That's a hard thing for Microsoft to fight back against.

If some enterprising company came out with a technology that blends in with the existing AD networks currently entrenched in today's businesses, enables tablet and tablet-like machines (think tablets with keyboards), and enables support for Microsoft legacy deployments, things would change faster. The whole BYOD push that has been going on is a major driving factor and opportunity for Android. (Note that it's not for iOS *only* because Apple doesn't want to support [and be liable for] the enterprise.) And for executives, information retrieval is the big thing... information creation? Not so much. It would take longer for such things to trickle down to the cube farms.

The "in" is obvious and present. I think at this point, even if Microsoft came out with its own Android series of devices and gave it 100% compatibility with the AD networks and applications out there, people would still view it with great mistrust.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42198999)

And what would you do when the company decides to keep its obsolete product on sale, but with a sweet sweet price tag of say $4995? They'd be fulfilling the "on sale" requirement but nobody could actually buy the thing.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42194193)

Which would still cost Microsoft money to scrub it of software they buy and incorporate into their OS. It took HP a year to release the source of a mobile OS scrubbed of all proprietary drivers and source written with permission using proprietary information. Try suggesting something that doesn't cost the creator of software money and maybe it will happen.

Re:Not watching the trends? (1)

temcat (873475) | about 2 years ago | (#42194337)

That would be better, but it's too drastic a change with much less chance of ever being implemented. I called it a compromise for a reason, in spite of my sig, which conveys my ideal approach.

Re:Not watching the trends? (1)

Attila Dimedici (1036002) | about 2 years ago | (#42194825)

I partly wrote my response the way I did and where I did because of a previous response to your post that complained that your proposal would force companies to continue to support software after they considered it obsolete. My response is the answer to that, somewhat legitimate, objection to your original proposal.
Perhaps a good solution would be to implement yours, but give companies the option of just releasing old versions to public domain if they wish to avoid any support issues that they were afraid would accompany giving people downgrade rights.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42196135)

Then people would never stop selling software. Windows XP: now $10,000 a copy.

Re:Not watching the trends? (1)

Bengie (1121981) | about 2 years ago | (#42194107)

I want support for Linux 1.0. What do you mean I should just upgrade to 2.6+? I thought you said old versions should be supported.

Re:Not watching the trends? (2)

temcat (873475) | about 2 years ago | (#42194351)

I mean legal availability, not support. The former doesn't necessarily imply the latter.

Re:Not watching the trends? (1)

timeOday (582209) | about 2 years ago | (#42193377)

No, I think we have only exhausted the demands of what you might call simplistic computing - fragile algorithms that efficiently follow a usually fixed number of steps to transform their input into some determined output, like drawing a rectangle. Given any imperfect input, they simply explode. Nothing in nature works this way. Living things are more messy, rooted in pattern matching and open-ended searching. We still can't even really simulate a glass of water tipping over and spilling off the table. Computers that interact more naturalistically, such as IBM's Watson (the Jeopardy machine) and the highest-scoring image recognition algorithms [kurzweilai.net] consume massive (by today's standards) computing resources (at least to train, if not also to execute).

It would be a shame if naturalistic computing were stunted by lack of demand for more capable processors.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42193513)

>I've still got two netbooks running XP and going strong
A couple of years ago there may have been a reason to run XP on a netbook, but not anymore. Almost no sites require IE and lots of apps are moving to the cloud. Do yourself a favor and install a lightweight Linux distro (such as Debian with LXDE) and Google Chrome. You'll immediately notice the improvement in boot times, responsiveness, and browser speed. And you'll be a lot less susceptible to malware.

Re:Not watching the trends? (0)

Anonymous Coward | about 2 years ago | (#42194595)

Nice idea. There's a catch.

For instance, I buy and use mini-ITX boards with Intel's Atom CPU (D510 and D525), because the remote sites on which they operate are off-grid and powered via PV systems.

Unfortunately the apps these computers run are Windows-only. There are no suitable *nix equivalents.

Re:Not watching the trends? (0)

jones_supa (887896) | about 2 years ago | (#42195137)

A couple of years ago there may have been a reason to run XP on a netbook, but not anymore. Almost no sites require IE and lots of apps are moving to the cloud. Do yourself a favor and install a lightweight Linux distro (such as Debian with LXDE) and Google Chrome. You'll immediately notice the improvement in boot times, responsiveness, and browser speed. And you'll be a lot less susceptible to malware.

I would just install Windows 7. It runs as fast as XP, and there is more software available. Also, internet video (Flash, HTML5) is garbage in Linux and cannot be played smoothly on a netbook.

Re:Not watching the trends? (1)

Hal_Porter (817932) | about 2 years ago | (#42196461)

Also, if you bought a computer post-Windows Vista but pre-Windows 8, it came with 7 pre-installed.

Re:Not watching the trends? (0)

Kremmy (793693) | about 2 years ago | (#42197085)

7 will run as fast as XP when it has double the RAM and no applications being used. Once you start putting some load on it, the difference makes itself apparent quickly. Here's a logical fallacy for you: New Operating System faster than Old Operating System! (On new hardware, much of which can't be used as a basis of comparison with the old operating system because of the vendor's insistence on trying to cut off the driver flow for the older revision.)

I swear anyone who thinks 7 is fast has never used it alongside a solid XP install.

Re:Not watching the trends? (1)

AvitarX (172628) | about 2 years ago | (#42198235)

Or we're not CPU bound and have enough RAM.

I've found XP approaches useless without 1-2GB of RAM (due to apps, not the OS), and 7 without 2-4GB.

I find getting to the right explorer window when I'm working in multiple folders is far faster in 7 than XP (due to the new taskbar), as is dealing with non-zero-padded numbers in filenames.

Copying isn't faster in 7, but it is also safe to do unattended, as it won't halt a copy on error; it will do all the stuff it can do without input, asking questions later. This often allows the computer to do more work overnight while I sleep.

With few exceptions I find 7 much faster (netbooks possibly notwithstanding, due to the taller taskbar making it harder to see scaled images).

Re:Not watching the trends? (1)

WuphonsReach (684551) | about 2 years ago | (#42199461)

I swear anyone who thinks 7 is fast has never used it alongside a solid XP install.

I'll take that bet. Multiple instances of EVE Online running in XP and you were lucky to get 30-40 FPS on the primary window, with the background windows chugging along at 10-20.

Same hardware, Win7 64bit, now you're getting 60fps solid on the primary window and 30-45 on the secondary windows.

Also running Win7 on my 2007-era Thinkpad T61p. Runs nicer than WinXP did and I do a lot more with it.

(Win7 is pretty darned good. But I still might move to Ubuntu on the next laptop and run Win7 in a VM.)

Re:Not watching the trends? (3, Interesting)

serviscope_minor (664417) | about 2 years ago | (#42193519)

I hope people are starting to sit up and take notice.

???

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

Not sure I follow. Transistor density has kept on increasing. It's been a little slower recently, I think, but several manufacturers are now sub 30nm for a variety of different process types.

Games are just about as good as they are going to get without new display technologies.

Really? Seems unlikely.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

Are you sure? Have you looked at the recent CPU benchmarks? More and more programs are taking advantage of multiple cores. All sorts of things that people actually do, like file compression, web browsing, media transcoding. Certainly the things I do benefit from multiple cores.

We know this because of the horrible failure "Netbook" computing has been.

Netbook computing was fine until Microsoft moved quickly to kill it. Then the manufacturers seemed bent on suicide after that for inexplicable reasons. Oh, and Intel came up with bizarro licensing for the Atom restricting manufacturers, yet they haven't (with few exceptions) switched to the faster and cheaper Bobcat CPUs, which lack such bizarre licensing restrictions.

Why can't I buy a machine at the low price point and low weight of the EEE 900? That machine sold many millions. Netbooks used to be sub-1 kg in the beginning. Now the lightweight ones are 1.5 kg. What happened?

Venduhs are strange. Why did they drop all the high-res screens from laptops 10 years ago only to scramble to play catch-up after Apple decided to bring in high-res displays? Makes no sense.

That said, there's still a quite decent range of cheap netbook machines around, but they're just not as good as they were.

Oh bull (3, Interesting)

oGMo (379) | about 2 years ago | (#42193559)

The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

While software has been hampered by web "technology" over the last decade, we are hardly at the pinnacle of software and computing... it's more like the Dark Ages, actually. Some stuff is being done elsewhere (GPUs, mobile), but we're still mired in fundamentally stagnant and backwards principles on the desktop (and server, really).

Games are just about as good as they are going to get without new display technologies.

Laughable. Let's assume anything video-related counts as "new display technology," and that we certainly have a long way to go to realtime radiosity and raytracing at extremely high resolution on a mobile device - then toss in 3D for good measure - so that's a given. But in terms of gameplay, all the computing and RAM you can get can be eaten up for a very long while. Simulation in games, today, isn't anything like what it could be. If I can't build a city at the SimCity level, zoom in and rampage through it at the GTA level, walk up to each and every person on the street and learn their personal history and daily routines at an RPG level, then go into every structure and demolish it bit-by-bit with full soft-body dynamics, you've got quite a long way to go.

The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

This is true to some extent, but "resorting to multi-processor and multi-core" means the desktop isn't maxed out. The primary OS (and software) may not be taking advantage of these things, but they are there and we're far from done yet.

Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

Microsoft is irrelevant. They have been for a long time. They may not be going away anytime soon, but they've been irrelevant since Google used the web to effectively route technology around them (due to earlier attempted lock-in). Of course, this has resulted in aforementioned Dark Age of Software, but at least we're not stuck on one platform. We're at the point where Valve is looking to seriously move gaming away from Windows, and there are alternatives for everything else, so what happened before doesn't really apply to what can happen in the future.

Think about what we are seeing.

What we are seeing is ripe potential for a Computing Renaissance.

Re:Not watching the trends? (2)

nabsltd (1313397) | about 2 years ago | (#42194045)

So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that?

Racks and racks of servers.

Every watt saved in processor TDP can be doubled (or more) by the accompanying savings in cooling the building.
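That watt-doubling claim is roughly the data-center PUE (power usage effectiveness) argument; a back-of-the-envelope sketch, with an assumed PUE of 2.0 and illustrative socket counts (not figures from the article):

```python
def facility_watts_saved(cpu_watts_saved, sockets, pue=2.0):
    # PUE ~2.0 (an assumed, middling figure) means each watt drawn by
    # IT gear costs roughly one more watt in cooling and overhead, so
    # a TDP saving at the socket is doubled at the facility meter.
    return cpu_watts_saved * sockets * pue

# 10 W shaved per socket across 1,000 sockets:
print(facility_watts_saved(10, 1000))  # 20 kW saved at the meter
```

With a better-run facility (lower PUE) the multiplier shrinks, which is why "double (or more)" depends entirely on how inefficient the building's cooling is.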

Re:Not watching the trends? (5, Interesting)

hairyfeet (841228) | about 2 years ago | (#42198035)

Oh lord, not this again. First it was the netbook that was gonna kill "the big bad M$," then it was the tablet, then the phone, now you are gonna say embedded... really? Give it up, Sparky, nobody is giving up their desktops or laptops for some 1GHz ARM embedded in the TV, okay? Hell, one of the Apple fanbois tried giving up ALL X86 for a month, just one month, and using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.

The ONLY thing you got right is that PCs have gotten insanely powerful, but you know what? Computers have been insanely powerful for most of the decade, hasn't stopped people from buying them. What HAS stopped people from buying them is the fact we are in the midst of a global recession (I would argue depression, but whatever) so people simply aren't spending money they don't absolutely have to and with their desktops sporting triples and quads, and their laptops sporting duals capable of 1080p? They really don't have to.

But the simple fact is even with an economy in the shitter we are talking 300 MILLION plus computers being sold, and yes, nearly all of them running Windows. Why? Because that is where the software is. They don't want ersatz software, like Gimp for Photoshop or Tux Racer for DIRT; they want to use the billions of dollars worth of software they are sitting on, everything from Quickbooks to that God-awful EasyShare your grandma loves so much, and NONE of that shit is gonna run on some embedded ARM chip.

And I hate to break the news to ya, but ARM is about to slam face first into the thermal wall, just as X86 did half a decade ago. This is why ARM Holdings has been talking about "dark silicon" in its last several press releases, and why Nvidia is now up to FIVE, count 'em, five cores in its Tegra chips. Only ARM hit the thermal wall with a frankly shitty IPC, so they are throwing more cores at it, but as we saw with AMD there is only so much you can make up for IPC by throwing more cores at it, because most software today still doesn't thread for shit.
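The "only so much you can make up for IPC by throwing more cores at it" point is Amdahl's law. A small sketch (the 50% parallel fraction is an assumption chosen for illustration, standing in for software that "doesn't thread well"):

```python
def amdahl_speedup(p, n):
    # Amdahl's law: with a fraction p of the work parallelizable,
    # n cores give at most 1 / ((1 - p) + p / n) overall speedup.
    return 1.0 / ((1.0 - p) + p / n)

# With only half the work parallel (p = 0.5), piling on cores
# flatlines fast: even 16 cores can't double overall throughput.
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.5, n), 2))
```

That flatline is why weak per-core performance can't simply be papered over with core count, whether the vendor is AMD or an ARM licensee.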

So X86 isn't going anywhere, Windows will be back up once Ballmer's fat ass is thrown out of the big chair and the abortion known as Win 8 is replaced by a much better Win 9 (Star Trek rule in play), and people will continue to buy hundreds of millions of X86 units every year, just at a slower pace because grandma can't stress out that quad like she could that old P4. Does that mean ARM is gonna disappear? Nope, it means it's gonna have an insanely quick race to the bottom and several corps will go broke selling Android units, because by this time next year you'll have 7-inch dual-core tablets with Android 5 selling for $50 at the Big Lots, and just like X86 people will find they can't tell the difference between a dual core and a quad, so they'll get the cheaper unit. Look at the financials of the companies selling Android: Samsung is barely making a profit, as is HTC, and the rest are bleeding money.

But to say embedded is gonna take out X86 is as stupid as saying mopeds are gonna take out the trucking industry. They are completely different units built for completely different tasks and I have YET to see a single person, even one, replacing their X86 laptops and desktops for some cell phone chip. Sorry, ain't gonna happen.

Re:Not watching the trends? (1)

Crosshair84 (2598247) | about 2 years ago | (#42198399)

Hell, one of the Apple fanbois tried giving up ALL X86 for a month, just one month, and using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.

Do you have a link to that story? Sounds like an interesting read but my efforts at searching for it have failed.

Keep 'em Coming (5, Interesting)

corychristison (951993) | about 2 years ago | (#42193085)

AMD has huge advantages in the server market; I'm really surprised people are so stuck on Xeons.

You can't cram 64 Xeon cores into a 1U. Not to mention Intel is spotty on their hardware virtualization extensions.

Intel has the lead in power consumption, sure. But if you're looking into running anything Xen, KVM or VMware in production, the cost savings AMD brings to the table makes them a competitive contender.

I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models. Primary reason being 16 cores on one chip, at a lower power consumption than the 8-core Desktop model.

Re:Keep 'em Coming (-1)

Anaerin (905998) | about 2 years ago | (#42193251)

Unfortunately, at the moment AMD doesn't have the same single-core performance that Intel has. For computing tasks right now, 1 Intel core is "worth" around 2.4 AMD cores [bit-tech.net]. AMD needs to pull something pretty damned major out of the bag, as they're getting beaten hand over fist.

Re:Keep 'em Coming (1)

tibman (623933) | about 2 years ago | (#42193633)

That isn't what that link is saying. It's saying they are comparable.

Re:Keep 'em Coming (1)

Anaerin (905998) | about 2 years ago | (#42198207)

That link is saying that an 8-core AMD processor has comparable performance to an Intel 4-core i5 processor.

Re:Keep 'em Coming (2)

rgbrenner (317308) | about 2 years ago | (#42198955)

Did you actually read the article? It says the 8-core AMD FX-8350 has similar performance to the 4-core Intel Core i5-3570K. In most of the tests, AMD actually performed worse.

So where's the disconnect? If words are too hard for you, just go look at the graphs.

Re:Keep 'em Coming (1)

Rockoon (1252108) | about 2 years ago | (#42193793)

Do you work for Intel or something? The link never says what you are claiming.

Re:Keep 'em Coming (1)

dbIII (701233) | about 2 years ago | (#42199443)

Yes, but the Intel multi socket capable stuff with a lot of cores was also stuck on 2GHz a couple of months back, and I haven't heard of anything since. That means at the top end of highly parallel systems an Intel core is around the same capability as an AMD one, but a lot more expensive. The extra expense may be worth it if you want 80 cores in a box instead of 64, but that 80 core machine is close to ten times the cost of the AMD 64 core machine and the same clock speed per core. For some things there's more of a difference than just raw clock speed but for others there isn't much.

Re:Keep 'em Coming (-1)

Anonymous Coward | about 2 years ago | (#42193353)

You can't cram 64 XEON cores into a 1U.

Why would you ever have an absolute need to do that instead of opting for a 2U or 3U? Pizza boxes (1U) don't offer hot-swappable HD bays, making them less than ideal for your VMware server. If your rack is so crammed for space, maybe it's time to branch out to a larger rack or a second one. If that is not an option, my bet is your company can't afford a Mr. Slim, and that 1U crammed full of Opterons is going to fry in less than two years. If that's the case, a good white box offers much better airflow and cooling.

Re:Keep 'em Coming (4, Informative)

rgbrenner (317308) | about 2 years ago | (#42193481)

Pizza boxes (1U) don't offer hot swapable HD bays

supermicro would disagree with you

Re:Keep 'em Coming (1)

Anonymous Coward | about 2 years ago | (#42193833)

Supermicro is crazy. They'll cram 4 nodes in 2U [supermicro.com] with an Infiniband interconnect.

Re:Keep 'em Coming (2)

bill_mcgonigle (4333) | about 2 years ago | (#42194373)

supermicro would disagree with you

Not to diss Supermicro, they make nice boxes, but I've had hotswap 1U gear from other manufacturers for over a decade.

Methinks the GP just doesn't know the market.

Re:Keep 'em Coming (1)

nabsltd (1313397) | about 2 years ago | (#42194147)

Pizza boxes (1U) don't offer hot swapable HD bays, making them less than ideal for your VMware server.

Although having hot-swap bays in an ESX(i) server is nice, one of the whole points of the VMware infrastructure is that you can down a physical machine for maintenance without affecting any services.

You'd still have some sort of redundant disk (RAID-1 at least) on the ESX server. If a drive fails, you just migrate all the VMs to a different server, replace the failed drive, let the RAID rebuild while nothing is running on that machine (which means the rebuild can be given high priority and complete more quickly), and then bring the machine back into service. Then VMware DRS will eventually move VMs back onto the machine to load balance the cluster.

Re:Keep 'em Coming (1)

corychristison (951993) | about 2 years ago | (#42194631)

Hotswap absolutely exists in 1U. 4x 2TB HDDs in RAID-10 provide the best availability:cost:performance ratio. I have not fiddled with 3 or 4TB drives yet.

2U would probably be ideal in a 4-way setup, for air flow, indeed. But if you need number crunching, 1U offers better density.

Heat should not be an issue with proper active cooling.

Re:Keep 'em Coming (1)

dbIII (701233) | about 2 years ago | (#42199469)

Especially if you have two nodes side by side in a 1U box. Those things are like hairdryers so plenty of airflow comes at the expense of a bit of power for fans and a lot of noise.

Re:Keep 'em Coming (1)

Metal_Militia (1201049) | about 2 years ago | (#42195733)

ESXi can also come embedded on the motherboard, or installed on an internal USB drive or SD card.
Having hot-swappable hard disks, today, is not really necessary.

Re:Keep 'em Coming (0)

Anonymous Coward | about 2 years ago | (#42197441)

Unless you're in a datacenter and need the density...

HP Blades [amd.com]

Re:Keep 'em Coming (1)

rgbrenner (317308) | about 2 years ago | (#42193457)

Until recently, I've been buying 100% AMD for 15 years... but AMD is so far behind that for the first time, I bought several Intel-based servers.

Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.

Re:Keep 'em Coming (2)

h4rr4r (612664) | about 2 years ago | (#42193535)

AMD's advantage is lots of CPU cores for cheap.
For some workloads, that is worth it. I am using some for a VDI deployment. RAM is the limiting factor on how many desktops I can host, not CPU and not disk, because I went all SSD.

Re:Keep 'em Coming (2)

serviscope_minor (664417) | about 2 years ago | (#42194129)

RAM is the limiting factor on how many desktops I can host, not CPU and not disk

Surely in that case it's also worth going for AMD, since you also get excellent value in terms of DIMM sockets. If CPU really isn't the limiting factor, you could get a 4-way 6212 (are those the cheapest?). The processors are about £200 a pop, and you get 32 DIMM sockets, giving you up to 512GB of RAM using 16GB DIMMs. 16GB DIMMs are now at the point where they are sometimes cheaper per GB than 8GB ones.

Between the cheaper motherboards and the cheaper processors, you will save a lot of money compared to a high ram / low CPU Intel machine.
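The DIMM arithmetic above, sketched out (the socket and DIMM counts are the ones quoted in the comment; a 4-socket board with 8 DIMM slots per socket is assumed):

```python
# Sanity check of the 4-way Opteron RAM math above.
# Assumes a 4-socket board with 8 DIMM slots per socket (32 total).
sockets = 4
dimms_per_socket = 8
dimm_size_gb = 16

total_dimms = sockets * dimms_per_socket
total_ram_gb = total_dimms * dimm_size_gb
print(f"{total_dimms} DIMM slots -> up to {total_ram_gb} GB")  # 32 DIMM slots -> up to 512 GB
```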

Re:Keep 'em Coming (3, Insightful)

corychristison (951993) | about 2 years ago | (#42194533)

- Core density
- Virtualization extensions on all Opteron chips (and now most desktop chips, even the A6-4455M in my laptop)

Not all Xeons have hardware virtualization. Only some of the most expensive chips have it, and even then it can be spotty.

Bottom line: AMD wins in the virtualization/"cloud" market (and supercomputing).

Re:Keep 'em Coming (1)

corychristison (951993) | about 2 years ago | (#42194547)

Augh. Damn me for using my phone... That message should have been for parent.

Sorry about that.

Re:Keep 'em Coming (1)

rgbrenner (317308) | about 2 years ago | (#42198247)

- A core on an AMD system has about the same performance as a thread on a Xeon, so Opteron and Xeon are equivalent there.
- The E3, the entry-level and cheapest class of Xeon chips, has hardware virtualization. I don't think it's on every chip, but it's really not hard to pull up Intel's site and see if a specific chip supports it.

Re:Keep 'em Coming (1)

Mad Merlin (837387) | about 2 years ago | (#42198471)

Not all XEONs have hardware virtualization. Only some of the most expensive chips have it and even then, it can be spotty.

Not true. Every Xeon since 2006 has shipped with VT-x support. Look at the Xeon 5030 [intel.com] for example. Absolute bottom of the line ($150 at launch) from 2006, and it supports VT-x.

You're probably thinking of Intel's desktop line, where they do artificially hobble large swaths of their CPUs with respect to VT-x.

Re:Keep 'em Coming (2)

Jawnn (445279) | about 2 years ago | (#42194123)

Until recently, I've been buying 100% AMD for 15 years... but AMD is so far behind that for the first time, I bought several Intel-based servers.

Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.

That's easy: core density per dollar in the same n rack units.
For the same number of dollars, I can buy more real estate on which to run my virtualization stacks with AMD processors than with Intel processors. And the savings extend beyond the hardware, too: with more cores per socket, my VMware licensing costs (per core or per VM, however you want to break it out) can be much lower with AMD processors in those sockets. So cheaper CAPEX (hardware and license costs) and cheaper OPEX (support subscription costs) make AMD hardware a very attractive choice.

Re:Keep 'em Coming (0)

Anonymous Coward | about 2 years ago | (#42194389)

Except an AMD core is equivalent to a thread on Intel, performance-wise. So that negates your entire point.

Re:Keep 'em Coming (2)

serviscope_minor (664417) | about 2 years ago | (#42193797)

I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models.

Be careful. Strange workstation vendors have decided that if you're blowing $5k on a huge 2-socket workstation, then naturally you don't mind if it sounds like it's being powered by a gas turbine located inside the case. Oh, and the exhaust of the gas turbine is then fed into a flugelhorn or vuvuzela, just in case you are hard of hearing.

I really don't know why. I have purchased a 3-GPU machine which dissipated over 1kW at full load and wasn't particularly noisy, being water cooled. Actually, the best place I found to get high-powered, quiet workstations was a company specializing in gaming machines. So they arrive looking funny with LEDs and case cutouts, and have silly things like "Republic of Gamers" written on various parts, but they're quiet.

Re:Keep 'em Coming (2)

corychristison (951993) | about 2 years ago | (#42194465)

I'd probably go for a single-socket, 16-core Opteron on a Supermicro or Tyan standard ATX board I can plop into my existing chassis. Supermicro will need a breakout cable for the front panel buttons, but that's no big deal. In this situation I can fit most sized heatsinks; I just need to be sure it will fit on the socket.

Re:Keep 'em Coming (2)

lopgok (871111) | about 2 years ago | (#42194517)

Do your 'workstations' have Intel Xeon CPUs or AMD CPUs? Otherwise, it is quite unlikely that they have ECC memory, which is pretty much required of real workstations. Real workstations are reliable (and quiet). Just being quiet doesn't count.

Re:Keep 'em Coming (1)

serviscope_minor (664417) | about 2 years ago | (#42194675)

Do your 'workstations' have intel xeon cpus or amd cpus?

The loud workstation I'm referring to was an AMD one. The quiet GPU monster was an Intel Core i7, so it didn't have ECC, but neither did the 3 graphics cards which were being used for computation, so it was a bit of a wash, really. And it actually needed proper graphics cards, since it relied heavily on the texture sampling unit for the computations.

That's the other nice thing about AMD: you can get cheap machines with ECC, since the standard single-socket desktop CPUs and motherboards support it.

Re:Keep 'em Coming (1)

lopgok (871111) | about 2 years ago | (#42194927)

I heard the high-end Nvidia cards have ECC on their video memory. They found there were too many errors when doing computation...

Re:Keep 'em Coming (1)

serviscope_minor (664417) | about 2 years ago | (#42195001)

I think they do now. Or at least the computation cards do. I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units. Since the workload was image processing, that was necessary to get good performance. The computation cards tend not to be so good if you're doing graphics or graphics-like workloads as they dedicate more resources to floating point and fewer to pixel specific stuff.

Re:Keep 'em Coming (1)

tirefire (724526) | about 2 years ago | (#42195203)

I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units.

I recently bought a Radeon HD 7850 2GB card that I'm pretty sure has ECC. Cost was $200. It's pretty cool - if you overclock the memory too high, the card doesn't start crashing your games or anything, it just doesn't run any faster.

To my knowledge, all graphics cards with GDDR5 memory have ECC.

Re:Keep 'em Coming (1)

Bengie (1121981) | about 2 years ago | (#42194239)

Can you imagine the amount of heat 64 cores of Bulldozer in a 1U case would create?!

I've been looking at custom building a server at home, and in the 64GB-256GB RAM range with single or dual sockets and 8-32 threads, Intel wins in price, performance, and power consumption.

Choosing between an Intel Xeon (i5-class) 3.3GHz quad with HT and a 65W TDP, and an Opteron Bulldozer 4-module/8-core at 2.8GHz with a 125W TDP and lower IPC, for an almost identical price, isn't even a choice.

Re:Keep 'em Coming (0)

Anonymous Coward | about 2 years ago | (#42194749)

I just wish I could find decent AMD motherboards with SLAT support and a TPM chip. That way, I could build a couple of servers to stash in a back room, and if someone breaks in and steals them, the data on them would be protected by BitLocker [1].

[1]: Well, barring an intruder who manages to get the case off and dump the RAM while hot... However, few tweakers/crackheads are interested in anything but liberating items from a place so they can sell them and get their next high.

Re:Keep 'em Coming (1)

viperidaenz (2515578) | about 2 years ago | (#42195235)

It's a pity AMD's cores aren't complete cores. They come in pairs and share some resources with each other. Some kind of bastardization between Hyper-Threading and true multi-core.

AMD SUcks (5, Funny)

Anonymous Coward | about 2 years ago | (#42193107)

Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

Re:AMD SUcks (1, Insightful)

Anonymous Coward | about 2 years ago | (#42193171)

Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

Yea, because usually when a company has no competition they lower prices. Happens all the time.

Re:AMD SUcks (2)

Baloroth (2370816) | about 2 years ago | (#42193597)

Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

Yea, because usually when a company has no competition they lower prices. Happens all the time.

Woooosh!

Re:AMD SUcks (-1)

Anonymous Coward | about 2 years ago | (#42193243)

Your command of basic economics is astounding.

Re:AMD SUcks (1)

Anubis IV (1279820) | about 2 years ago | (#42193345)

If I read what you wrote as sarcasm intended to mean the opposite, you come off sounding a lot more intelligent. Intel does have a leg up at the moment, but everything after that in what you said is incorrect and misguided. Here's hoping competition stays alive, socketed CPUs stay around, and AMD has a long life ahead of it (one it earned, of course).

Re:AMD SUcks (2, Insightful)

Anaerin (905998) | about 2 years ago | (#42193361)

...competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

What? Competition drives innovation and lowers prices. It happened with AMD's Athlon killing the old Netburst P4s. It happened with x64 killing IA-64. Why would AMD leaving the market "let" Intel lower CPU prices?

Oh, I'm sorry, you're just a troll, without the possibility of reasonable discourse or fair and reasoned debate. Forgive my oversight.

Re:AMD SUcks (0)

Anonymous Coward | about 2 years ago | (#42194187)

You fail at sarcasm

Re:AMD SUcks (-1)

Anonymous Coward | about 2 years ago | (#42193451)

1/10

Re:AMD SUcks (1)

cpicon92 (1157705) | about 2 years ago | (#42193455)

I wish I had mod points. This is the funniest thing I've read on here in a while. It goes above and beyond trolling.

Re:AMD SUcks (1)

Jeng (926980) | about 2 years ago | (#42193717)

You got some nice bites there, but I think you could have trolled harder.

7/10

1.25v DDR3, but CPU efficiency... (2)

Kjella (173770) | about 2 years ago | (#42193349)

Okay, so they're the only x86 CPU offering 1.25V DDR3 support, but the difference between a pair of 1.25V and 1.5V DIMMs is around 4 W [tomshardware.com], and you can save 3 of those 4 W by moving to the commonly available 1.35V DDR3. Meanwhile, AMD keeps putting out 125W processors like the FX-8350 that don't really compete with a 77W processor like the i7-3770K, so this "major datacenter advantage" I think I'll file under "major wishful thinking". Not to mention you're investing in a platform with little future, since AMD wants to push ARM servers now. But I guess Intel has let AMD put a positive spin on continuing to deliver on old sockets.

Re:1.25v DDR3, but CPU efficiency... (4, Interesting)

serviscope_minor (664417) | about 2 years ago | (#42193711)

Huh?

The i7 3770K has a TDP of 95W. And the FX-8350 is a very good chip and much cheaper than the i7. The benchmarks relative to the i7 are all over the place: in most cases it sits somewhere between the i5 and i7, in some cases it is destroyed by the i7, and in others the reverse is true. The single-threaded performance is quite weak and usually substantially less than the i5's, but then the i5-to-i7 difference isn't enormous. The difference from FX-8350 to i7 seems to be around 20-50% in most cases.

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

Anyway.

This thread is about the Opteron processors, which are still (a) competing against SB, (b) benefit from substantially cheaper full system costs and (c) you aren't terribly sensitive to single thread performance if you're buying a 4 socket server.

Not to mention you're investing into a platform with little future

What does that even mean? It's all x86, so even if AMD vanishes tomorrow you can keep using the servers and then transition to Intel when you need new ones. The whole point of having more than one vendor is that, no matter what, you're not investing in a platform with no future.

Re:1.25v DDR3, but CPU efficiency... (1)

Bill Dimm (463823) | about 2 years ago | (#42194145)

The i7 3770K has a TDP of 95W

Intel's website says 77 W [intel.com] while various [techpowerup.com] websites [flyingsuicide.net] say the retail packaging says 95 W. Weird.

Re:1.25v DDR3, but CPU efficiency... (4, Interesting)

steveha (103154) | about 2 years ago | (#42194555)

The i7 3770K has a TDP of 95W.

I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

But according to NordicHardware [nordichardware.com] , in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)

Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

This is probably because Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates code that makes the compiled code run poorly on non-Intel processors. The code checks the CPU ID, and the code has two major branches: the good path, which Intel chips get to run, and the poor path, which other chips run.

http://www.agner.org/optimize/blog/read.php?i=49 [agner.org]
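The dispatch pattern Agner documents boils down, in spirit, to something like this (an illustrative Python sketch, not Intel's actual code; real dispatchers branch on the CPUID vendor string in compiled machine code, and these function names are stand-ins):

```python
# Illustrative sketch of vendor-based CPU dispatch, the pattern Agner
# Fog describes. A real compiler runtime does this via the CPUID
# instruction; here the vendor string is just hard-coded.
def get_vendor_string():
    # Hypothetical stand-in for the CPUID vendor ID,
    # e.g. "GenuineIntel" or "AuthenticAMD".
    return "AuthenticAMD"

def fast_path(data):
    return sum(data)            # stands in for the SSE/AVX-optimized path

def slow_path(data):
    total = 0                   # stands in for the generic fallback path
    for x in data:
        total += x
    return total

def dispatch(data):
    # The contentious part: branching on the vendor string rather than
    # on the feature flags the CPU actually reports.
    if get_vendor_string() == "GenuineIntel":
        return fast_path(data)
    return slow_path(data)

print(dispatch([1, 2, 3]))  # prints 6; a non-Intel vendor gets the slow path
```

Both paths compute the same result; only speed differs, which is why the behavior is hard to spot without benchmarking on both vendors.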

The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.

But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.

If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.

Re:1.25v DDR3, but CPU efficiency... (1)

Anonymous Coward | about 2 years ago | (#42194925)

The i7 3770K has a TDP of 95W.


AMEN!

Re:1.25v DDR3, but CPU efficiency... (0)

Anonymous Coward | about 2 years ago | (#42197217)

Real-world usage chiming in here: I'm under full load on an i7-3770 at the moment (transcoding with HandBrake), and according to the CPU hardware monitor, I've got 68-69W of draw to the chip package. That, by the way, is with all four cores Turbo-locked to 3.9GHz. (Asus boards have nice features.)

The plural of anecdote is not data...yada yada and all that.

Re:1.25v DDR3, but CPU efficiency... (1)

Lonewolf666 (259450) | about 2 years ago | (#42194943)

The Opterons in the new series have a TDP of 65W or 95W for the 8-core models, at the expense of being clocked lower than an FX-8350, but the performance per watt is still better than the FX-8350's.

Looking at the 4-core models, the 3350HE may be a worthy replacement for the Athlon II X4 910e I have in my current desktop:
four cores like the Athlon, 2.8 GHz clock speed where the Athlon had 2.6 GHz, and only 45W TDP versus the Athlon's 65W. In terms of pricing, the 3350HE seems to be similar to where the old Athlon was (right now, price comparisons in the EU come up blank for both).
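A crude MHz-per-watt comparison of those two chips (TDP used as a stand-in for real power draw, and IPC differences ignored, so treat it as a napkin number):

```python
# Crude clock-per-watt comparison of the two quad-core chips above,
# using rated TDP as a proxy for power draw.
chips = {
    "Opteron 3350HE":    {"ghz": 2.8, "tdp_w": 45},
    "Athlon II X4 910e": {"ghz": 2.6, "tdp_w": 65},
}
for name, c in chips.items():
    print(f"{name}: {c['ghz'] / c['tdp_w'] * 1000:.1f} MHz per watt")
```

That works out to roughly 62 MHz/W for the 3350HE against 40 MHz/W for the 910e, consistent with the comment's point about better performance per watt.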

Re:1.25v DDR3, but CPU efficiency... (1)

Kjella (173770) | about 2 years ago | (#42195399)

The i7 3770K has a TDP of 95W.

No it doesn't, but I guess if you don't have facts, use FUD. Intel has kept a "segment TDP" on the retail packaging because they want all Sandy/Ivy Bridge motherboards, coolers, etc. to support 95W processors (the maximum in the Sandy Bridge line), but the actual processor will never use more than 77W. This was explained here [nordichardware.com], but Intel's site and 99.9% of all reviews and online sites list it as a 77W processor. In fact the 95W figure is so rare that the only reason to bring it up, while ignoring all the other places that say differently, is to troll.

Re:1.25v DDR3, but CPU efficiency... (0)

Anonymous Coward | about 2 years ago | (#42195479)

*Not to mention you're investing into a platform with little future*

Kjella is presumably talking about the collateral for the processor, that is to say, the motherboard, and more specifically the socket.

Of course, I'm told by sysadmin friends that AMD historically has great socket stability across CPU generations, so I can't help but wonder if AMD could build this ARM CPU to plug directly into whatever socket is currently in use for x86. That would sure be one for the history books.

AES-NI? (0)

Anonymous Coward | about 2 years ago | (#42194515)

Does anybody know if these new Opterons support hardware-accelerated crypto? A quick Google search for "3320ee aes-ni" didn't turn anything up.

I read a while ago that AMD was adding the AES-NI instructions to their whole new line, but I want to confirm it for this processor.
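On Linux, one way to answer this for any given chip is to look at the CPUID feature flags the kernel exports in /proc/cpuinfo; AES-NI shows up as the "aes" flag. A small sketch (the helper function here is ours, not a standard API):

```python
# Check CPUID feature flags as exported by the Linux kernel in
# /proc/cpuinfo. "aes" is the flag the kernel uses for AES-NI.
import os

def has_cpu_flag(cpuinfo_text, flag):
    # The "flags" line lists one space-separated token per CPUID feature.
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return flag in line.split(":", 1)[1].split()
    return False

if os.path.exists("/proc/cpuinfo"):
    with open("/proc/cpuinfo") as f:
        print("AES-NI:", has_cpu_flag(f.read(), "aes"))
```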

AMD to become overpriced at some point? (1)

Kergan (780543) | about 2 years ago | (#42196057)

And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.

On a related note, I find it quite weird that Intel willfully forfeits its own future to motherboard makers. It makes little sense to me to let third parties you have absolutely no control over fix the price of the final product that your current customers (computer makers) end up buying, irrespective of the fact that Intel itself makes motherboards. I must be missing something besides the obvious (i.e. it's thinner, which incidentally ensures that AMD has to do this too for laptops). Slashdotter insights welcome...

And even if it does make actual sense... given that Intel's CPU market share is roughly twice as large as AMD's, won't motherboard makers get more economies of scale for Intel CPU motherboards, which AMD, socketed or not, simply won't be able to compete with in the long term?
