
Is ARM Ever Coming To the Desktop?

timothy posted about 2 years ago | from the my-arms-are-there-now dept.

Hardware 332

First time accepted submitter bingbangboom writes "Where are the ARM-powered desktops? I finally see some desktop models, but they are relegated to 'developer' models with USD 200+ price tags (Trim-Slice, etc.). Raspberry Pi seems to be the only thing that will be priced correctly, have the right amount of features, and may actually be released. Is the software side holding ARM desktops back? Everyone seems to be foaming at the mouth about anything with a touch interface, even on the Linux side. Or are manufacturers not wanting to bring the 'netbook effect' to their desktop sales? Are ARM-powered desktops destined to join the mythical smartbook?"



Tabtop momentum building (0, Interesting)

Anonymous Coward | about 2 years ago | (#37501452)

Waiting for the release of Tegra 3 Kal-El quad-core equipped Asus Transformer 2 tabtop as a low power general purpose device that I can keep powered on all the time to replace a much more power hungry x86 machine.

Re:Tabtop momentum building (2)

GameboyRMH (1153867) | about 2 years ago | (#37501564)

...then finally the device will become slim enough to have the keyboard built-in without pissing off even the trendiest of Starbucks-dwellers, and we would have come full circle back to the convertible laptop.

Re:Tabtop momentum building (5, Insightful)

hairyfeet (841228) | about 2 years ago | (#37501750)

I'm sorry but.....why? WTF would you want ARM on the desktop? Are you living in a mud hut in Zambundi and don't have any electricity to spare for a desktop?

Let's be honest, folks: the big selling point of ARM is how cheap it is on batteries. Well, guess what you do NOT need when you are inside? Why, that would be a battery! See that plug on the wall right in front of you?

Cycle for cycle, x86 stomps the living shit out of ARM; it just uses more power to do so than most mobiles can afford, because we haven't had a real breakthrough in battery tech in ages. Well, that and the fashionistas at Apple have made iSliver batteries the "in" thing, which means you have to power the thing on a battery the width of a Tic Tac. I don't care if you put 8 cores on the thing, a bottom o' the line AMD quad, even the low-power AMD quads, will stomp the living shit out of ARM. Drop in an i series and it isn't even funny how badly it gets stomped.

Like everything else, it is about using the right tool for the job. ARM royally kicks ass in mobile, embedded, and in places where you need a device that'll take milspec levels of abuse, because you can run it fanless. x86 kicks ass in desktop and laptop, where you want more performance and don't mind giving up some battery life for it. But ARM on the desktop makes about as much sense as stuffing an i series into your phone, that is, none at all. The majority of code out there is x86; even on Linux, x86 outnumbers ARM code by a pretty wide margin. So unless you just really really REALLY want the Droid version of Angry Birds on your desktop, it seems more than a little stupid to be running a mobile chip in a place where you are right beside a plug.

Re:Tabtop momentum building (3, Insightful)

MightyMartian (840721) | about 2 years ago | (#37501904)

The majority of code out there is x86, even on Linux x86 outnumbers ARM code by a pretty wide margin.

This is a bizarre claim, considering the majority of code out there is in C or in higher-level languages like Java, COBOL, C# and so on, so technically the processor architecture is irrelevant for most code.

As to Linux, there are small pieces of the kernel written in assembly, but these have been rewritten so Linux can run on a number of non-x86 platforms. The vast bulk of Linux and its userland tools are written in C, so the underlying architecture is irrelevant. Want to run Emacs on an ARM variant of Linux? Well, just bloody well compile it for that ARM processor.

Re:Tabtop momentum building (2)

icebraining (1313345) | about 2 years ago | (#37501998)

This is a bizarre claim, considering the majority of code out there is in C or in higher-level languages like Java, COBOL, C# and so on, so technically the processor architecture is irrelevant for most code.

That's fine for OSS (and Debian for example has decent support for ARM), but try to convince some publisher of some proprietary software you need to use to port it.

Re:Tabtop momentum building (4, Insightful)

jd142 (129673) | about 2 years ago | (#37501978)

I'm sorry but.....why? WTF would you want ARM on the desktop? Are you living in a mud hut in Zambundi and don't have any electricity to spare for a desktop?

Let's be honest, folks: the big selling point of ARM is how cheap it is on batteries. Well, guess what you do NOT need when you are inside? Why, that would be a battery! See that plug on the wall right in front of you?

You know, it's just possible some people might want to conserve electricity. Or even shave a couple of bucks off the old electricity bill. Just because you can use a resource doesn't mean you should. I have running water, but I don't just leave the faucet on all day in case I might want a glass of water.

I don't know, but if you had one of those little portable solar cells, could you just power an ARM laptop anywhere?

Re:Tabtop momentum building (3, Insightful)

wonkavader (605434) | about 2 years ago | (#37502146)

I want silence. COMPLETE silence.

I want a computer you can barely find, it's so small and unobtrusive.

I want a computer so cool it can be covered in papers and crap without me worrying about it overheating.

I want devices that are dirt cheap to buy and dirt cheap to run, because I want them in every room, on all the time.

I want ARM.

Re:Tabtop momentum building (1)

Nutria (679911) | about 2 years ago | (#37502382)

Or a Lenovo propped up on its docking station and connected to an external keyboard and LCD.

Nope (0)

Anonymous Coward | about 2 years ago | (#37501454)

No, they won't. All the claims of ARMageddon [tmrepository.com] have been bullshit every year it's been claimed (which is quite some time now). Most people will still prefer to get a higher powered x86 system even if they don't always use the power, plus the fact that most legacy software is x86 only and will pretty much never be ported to ARM is another reason that most won't switch.

Look on eBay (5, Informative)

jimicus (737525) | about 2 years ago | (#37501460)

Look on eBay for an Archimedes.

They're rapidly becoming a collector's item, but they were on the desktop in 1987.

Re:Look on eBay (2)

damburger (981828) | about 2 years ago | (#37501630)

They had a surprisingly modern-looking desktop that booted in seconds. It would be interesting to see where the platform would be today had it taken off in a big way.

Re:Look on eBay (3, Informative)

jimicus (737525) | about 2 years ago | (#37501714)

The OS itself is still around today [riscosopen.org], after a fashion. But time has not been kind.

IMV, a fast boot cannot compensate for a spectacular lack of features you'd expect to find in a modern OS: it's a single-user OS with co-operative rather than pre-emptive multi-tasking, and there's no protected memory or swap support.

Re:Look on eBay (2)

damburger (981828) | about 2 years ago | (#37501762)

I think it's safe to assume that, had it been developed as a mainstream OS in the intervening time, it would've gone to pre-emptive multitasking as soon as the hardware permitted it.

Most of my fond memories are of the interface: the consistent and effective use of three mouse buttons, the innovative save dialog, and the way in which applications were packaged (which, honestly, I know people thought was invented with Mac OS X).

Re:Look on eBay (5, Insightful)

mountaineer76 (941902) | about 2 years ago | (#37501902)

Totally agree; just compare RISC OS 2 or 3.1 to the equivalent Windows version in 1988 or so, and Acorn was streets ahead. I still have a 200 MHz RISC PC sitting next to my desk; it's a nippy little beast that boots in a few seconds....

Re:Look on eBay (3, Interesting)

Anonymous Coward | about 2 years ago | (#37502156)

I find it funny, yet sad, that people have forgotten that the A in ARM used to stand for Acorn. I was there, in Cambridge, during the time of the first ARM CPU's development. Friends of friends of the people who worked to create it. At the time it was by far the most powerful desktop, and its various OSes (RISC OS 2+.. Arthur was always a stop-gap OS :) ) were far advanced over everything else available at the time.

I still use a RISC PC today. Also, one of the best case designs ever - it's practically infinitely expandable!

Why? (3, Insightful)

dukeblue219 (212029) | about 2 years ago | (#37501464)

Seriously, what is the reason for having a desktop ARM computer? Power consumption? I don't think there's a very large market for people who will settle for tablet-like performance in order to save a few dollars a month at most on electricity compared to existing low power processors. People with power grid problems will want something that runs on a battery anyway, and a tablet/netbook makes more sense there.

Is it just for something fun to play with? Something small and portable? You can always get a small ARM tablet and hook up the HDMI to a monitor if it's the full size display and keyboard you're missing.

Not sure what touch interface has to do with anything. That could be just as easily implemented with any architecture, and it's maybe the ONE thing I agree with Steve Jobs about -- touch does NOT work as a viable input method for a desktop.

Re:Why? (1)

hedwards (940851) | about 2 years ago | (#37501522)

That was my thought; I have a Zacate processor in my laptop which would be far more suitable for a desktop than your average ARM processor. ARM would ultimately face the same uphill struggle for acceptance that Intel's Merced did when AMD whomped them with their AMD64 architecture. Changing instruction sets isn't easy to do, and the main reason that folks tolerate it with mobile processors is that it saves them so much scarce battery life. Switching to one of AMD's mobile offerings would pretty much eliminate that concern. My laptop, for instance, uses 25 W with everything maxed out. You're not likely to save enough power going with ARM on a desktop to make it worthwhile.

Re:Why? (2)

lkcl (517947) | about 2 years ago | (#37501718)

Zacate is 18 watts! That means you have to have a heatsink or heatpipe, a fan and other moving parts, as well as much larger power components. By contrast, with something like the NuSmart 2816, if you run it at 1.6GHz you can get away with 4 watts, and that's *including* the ECC 1066MHz DDR3 RAM. A voltage regulator for a stable 4-watt power supply is approximately a $0.50 part. It's a whole different ballgame. So for the EOMA initiative, we've set a 5-watt absolute maximum limit, and are sticking to it. http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org] Sadly, not even Intel's latest 1.2GHz 45nm CPU, the one that's designed for MeeGo, is suitable, because the CPU is 2.5 watts and the northbridge IC is 2 watts; whoops, there's not enough room to run the RAM ICs.

Even the Z510 shows that this northbridge-southbridge strategy is unacceptable. You *have* to go "totally integrated" in order to reach the required power target. Once Intel and AMD start doing that, then and only then will they produce a winner.
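The power-budget arithmetic in the parent comment can be spelled out. A minimal sketch, using only the figures quoted above (the 5 W ceiling is the EOMA limit mentioned; the point is how little headroom remains for RAM and everything else):

```python
# Power-budget check using the figures quoted in the parent comment:
# a 5 W module ceiling vs. Intel's CPU-plus-northbridge split.
BUDGET_W = 5.0       # EOMA absolute maximum, as stated above
cpu_w = 2.5          # quoted draw of the 1.2GHz 45nm Intel CPU
northbridge_w = 2.0  # quoted draw of its companion northbridge IC

headroom = BUDGET_W - (cpu_w + northbridge_w)
print(headroom)  # 0.5 W left over: not enough for the RAM ICs
```

A fully integrated SoC avoids the separate northbridge entirely, which is why it can fit under the same ceiling.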

Re:Why? (1)

hedwards (940851) | about 2 years ago | (#37501982)

Right, and the cost you pay is having to recompile and possibly rewrite every application that you want to use.

As I pointed out, Intel thought that they could introduce a 64-bit architecture that lacked support for 32-bit applications, and ended up being taken over AMD's lap for a spanking.

As for thermal dissipation, 18 watts isn't that much; with the right heat sink you can dissipate most CPUs using something like this: http://www.nofencomputer.com/eng/products/CR-100A.php [nofencomputer.com]
And ultimately, that's just at peak use, most of the time, the consumption is significantly lower than that.

Re:Why? (1)

obarthelemy (160321) | about 2 years ago | (#37502158)

I bought a passive E-350 board from Asus, and put it in a passive, well ventilated case vertically mounted behind my monitor. Temp was 60-65C, so I added a small and whiny fan.

Re:Why? (3, Interesting)

durrr (1316311) | about 2 years ago | (#37501824)

The Zacate is available for desktop setups. A motherboard with Zacate integrated goes for $100-$120; with RAM, PSU, HD and a shoebox for a chassis you get a decent Windows computer for $200.

That, however, is quite a bit in excess of the $35 the Raspberry Pi is supposed to sell for. At that price point you can almost start putting them everywhere before knowing why you're putting them everywhere.

Re:Why? (3, Insightful)

rve (4436) | about 2 years ago | (#37501550)

I have an ARM-based laptop. It's fanless; in fact, it has no moving parts at all other than the hinge of the screen, and goes for a day or two of regular use between recharges. I'm not convinced "the desktop" has much of a long-term future at this point... I think it will go the way of the workstation.

Re:Why? (2)

SerpentMage (13390) | about 2 years ago | (#37501604)

Yes, and everybody will run underpowered desktops in the quest to not have to hear the fan...

I run a desktop because I need the power. If it has an ARM in it, so be it. BUT I need the power. When I develop I am going to use multiple screens. When I run my trading software I need a desktop with multiple screens.

What people need to understand is that there is no solution fits all. Some people don't need a desktop, others do. Some people don't need a tablet, others do. Let's all get this straight we will have more choice, not less choice.

Re:Why? (4, Insightful)

Bert64 (520050) | about 2 years ago | (#37501676)

Exactly, no solution fits all... Your needs are specialised, so you will occupy a niche of people who will continue to buy highend workstations...

For the vast majority of people, computers became powerful enough for their requirements many years ago (aside from increasingly bloated software trying to mask that fact), and they are concerned about price, running cost (i.e. power usage), noise, and that the machine is not an eyesore. Even more so the companies who buy hundreds of desktops for their employees and don't want to buy a noisy, expensive, large and power-hungry workstation for someone whose sole business use for it is to write letters.

Re:Why? (3, Informative)

realityimpaired (1668397) | about 2 years ago | (#37501814)

There's no reason a Cortex A7 dual core @ 800MHz wouldn't be able to handle both of the tasks you listed with ease. It could even handle basic gaming if you have a discrete video card to handle the load. Most people don't do the kind of number crunching that a modern high-end desktop CPU would allow.

The gamer crowd, absolutely. I can fully understand why they would want a high end processor. Even games that aren't that graphics intensive, like Civilization, are very heavy on number crunching. The office crowd, however, could easily be serviced by a low end low power ARM CPU. I could easily replace my desktop with an ARM-powered nettop without adjusting my computing habits at all, and I'm already running a multi-head setup.

Re:Why? (0)

Anonymous Coward | about 2 years ago | (#37502250)

Not the GP, but developing involves more than just typing code into an editor. Compiling, simulation, and MATLAB-style software bring my "workstation" to its knees, and a dual-core 800MHz CPU is not going to cut it. Granted, most people don't need that kind of power, but the thing is that they *WANT* that kind of power. Just like how most people don't need a 300hp car, but they want it.

Re:Why? (0)

Anonymous Coward | about 2 years ago | (#37501932)

It implies that the desktop is going away. Development goes a little bit in reverse, and instead we have just tablets* that wirelessly connect to our audio equipment and HD TVs effectively doubling as monitors in our living room or in the study.

*of course, a keyboard is still a must for writing longer passages of text

Re:Why? (2)

dunkelfalke (91624) | about 2 years ago | (#37501946)

There is actually no need to. I've got a desktop computer with a Core 2 Duo E8400 CPU and a Radeon HD5770 graphics card, and the only fan in that system sits in the PSU (and it is almost silent due to its low RPM). The system is quite fast and the power consumption is moderate.
It is absolutely possible to build a silent and powerful desktop system.

Re:Why? (1)

rve (4436) | about 2 years ago | (#37502096)

You know, in the late 90's Sun, DEC, SGI and yes, Apple, were getting great profit margins on fantastically powerful graphical workstations. Some models sold for $100k or even much more. Switch to an underpowered Linux PC just to save $100k? Not me; I need multiple processor cores, 1600x1200 pixel 256-color displays, 100Mbit/s ethernet.
Of course, 5 years later, all of these companies had either abandoned the graphical workstation market or were struggling to survive. Today, an $800 laptop exceeds the specs above.

I believe that 10 or 15 years from now, it will be somewhat unusual to have what you would consider a full-powered desktop PC at home or at the office, and so will be doing development work for these no-longer-common devices.

Re:Why? (3, Informative)

SendBot (29932) | about 2 years ago | (#37501578)

... it's maybe the ONE thing I agree with Steve Jobs about -- touch does NOT work as a viable input method for a desktop.

He may have said that at some point, but you should know by now that Apple changes the kool-aid they serve every so often. He's even spoken at length about merging iOS concepts into desktop OS X.

I've copied some text straight from the Apple web site for the "Magic Trackpad" that makes touch sound like you're no longer cool without it:

"The new Magic Trackpad is the first Multi-Touch trackpad designed to work with your Mac desktop computer."
"And it supports a full set of gestures, giving you a whole new way to control and interact with what’s on your screen."
"Magic Trackpad gives you a whole new way to control what’s on your Mac desktop computer. When you perform gestures, you actually interact with what’s on your screen. You feel closer to your content, and moving around feels completely natural."

Re:Why? (0)

Anonymous Coward | about 2 years ago | (#37501596)

The OP, and Jobs, are referring to touch screens, not touch pads.

The OP is talking about red apples, not greenapplz (1)

SendBot (29932) | about 2 years ago | (#37501802)

The OP said "touch interface" and then refers to "touch" as an "input method". The trackpad is described as "multi-touch".

I stand by my statements and provide further evidence of "touch" making its way to the desktop. Look! It's an official apple support page about multi-touch gestures in OSX Lion, one of the big things they were promoting about it: http://support.apple.com/kb/HT4721 [apple.com]

If you're touching a trackpad (distinct from mousing, which is more using a stick to poke things instead of touching directly), what and where you are touching the interface is largely arbitrary. It's not that crazy to imagine a kinect-like desktop interface being common so that people can touch items in their desktop experience without smudging the screen up with fingerprints.

Re:The OP is talking about red apples, not greenap (4, Informative)

Sancho (17056) | about 2 years ago | (#37502002)

Smudging the screen isn't the problem. The problem is holding your arm up for long periods of time, or the repetitive motion of raising your arm up to touch the screen. That's not something most desk jockeys are going to be doing a lot. It's horrible for ergonomics.

A standalone touch pad doesn't have that problem.

Most phones are held in the hands with lowered arms, hence it's not a problem for those devices.

Hell, laptops were being sold with touch pads as the primary pointing interface. Not much different from a desktop, really.

I don't think any particular feature of touch pads was the perceived problem. But then, you seem prejudiced against Jobs, so my reply is likely pointless.

Re:The OP is talking about red apples, not greenap (1)

SendBot (29932) | about 2 years ago | (#37502346)

In a lot of ways, a touchpad is just a mouse by any other name. What makes them interesting are more recent developments that allow these "touch" conventions, for instance two-finger scrolling (which I *love*). I never suggested a poor ergonomic setup, nor would I. With a kinect-like thing, a user could just hover their hands over the keyboard and have little transparent hand avatars on the screen. It's the concept that's important, and debating flimsy hypothetical implementations completely misses the point.

I think Jobs is okay. Heck, I even like the guy. But I read between the lines and take what he says with a block of salt. Remember when iPod competitors started having video playback? He played a scene from Raiders of the Lost Ark to poke fun at them and say they were going to the wrong place. Now how many current iPods play video? All of them except the screenless shuffle I think?

I don't recall him saying that touch isn't a viable input method (and no one is providing any links here), but I'd believe that he'd say something like that only to be later contradicted by his own products, as evidenced by what I've quoted earlier.

Re:Why? (1)

Anonymous Crobar (1143477) | about 2 years ago | (#37501926)

I can't speak for the author, but I think he meant "touch screen" and not just "touch." Steve Jobs (among others) has criticized touch screen desktops because of the "gorilla arm" problem. [wikipedia.org] FWIW, I use a Magic Trackpad at work and it's good for gesturing - especially if you are already used to it, coming off a laptop or tablet.

Re:Why? (2)

lkcl (517947) | about 2 years ago | (#37501586)

Well, fortunately there's soon going to be things like the NuSmart 2816, which will have the best of both worlds: dual-core 1.6 to 2GHz, 4GB of ECC DDR3 1066MHz RAM... and only about 4 watts for a system (at the 1.6GHz speed).

i'm working towards getting these - and other such beefy low-power CPUs - plugged in to the EOMA initiative:
http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]
http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]

Re:Why? (2)

Gordonjcp (186804) | about 2 years ago | (#37501670)

Dual-core 1.6 to 2GHz, 4GB of ECC DDR3 1066MHz RAM... and only about 4 watts for a system (at the 1.6GHz speed).

That's going to be slower than an equivalent x86-based machine, though.

The thing is, it will be slower *but draw one tenth the power consumption*. I want ten of these on a board, with about eight times the processing power for the same power draw as an x86 solution.

Re:Why? (4, Informative)

Bert64 (520050) | about 2 years ago | (#37501640)

A few dollars a month for a desktop...
A few thousand dollars a month for an office full of desktops?

The average office worker doesn't do a lot with their computer, and has been doing much the same thing for years... The only thing stopping them from using 10 year old hardware is modern bloated software which is intentionally incompatible with older versions.

There's no reason that the average user's needs couldn't be fulfilled by a low power machine with equivalent processing power to a system from 10+ years ago, with power hungry x86 systems being relegated to the small niche of power users and certain classes of server.

(in short, watch what x86 did to Sparc/MIPS/Alpha/Power, attacked from below)

Re:Why? (1)

gl4ss (559668) | about 2 years ago | (#37501878)

The screen is going to consume a lot of power anyway.

Nothing is stopping offices from running 10-year-old software, either. Office '97 is enough to run an SMB business, proven in practice many times over.

Re:Why? (1)

Waffle Iron (339739) | about 2 years ago | (#37501992)

A few dollars a month for a desktop...
A few thousand dollars a month for an office full of desktops?

We have an "underpowered" Intel Atom-based mini desktop in the kitchen as a terminal, which is silent and uses negligible power.

While it would be usable for an office worker, it *is* noticeably slower than my standard Athlon computers. Just about every real-world web page or application launch takes several seconds longer than on the normal PCs. Over the course of a month, all these little delays would almost certainly add up to more lost productivity than the additional power cost, especially if the normal PCs had an effective sleep mode that actually gets used when the computer is idle.

(For example, if the average user does 100 things per day that take an extra 3 seconds each, that's 5 minutes per day, or more than an hour lost time over a typical month. Even with low-wage employees, with overhead that lost time could cost the employer well over $20/month, enough to buy power for a couple of standard PCs going full tilt 24/7.)
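The parenthetical arithmetic above checks out. A quick sketch with the same assumed inputs (the $15/hour loaded rate is an illustrative figure of my own, consistent with "well over $20/month"):

```python
# Back-of-envelope check of the lost-productivity estimate above.
# All inputs are assumptions from the comment, except the loaded
# hourly rate, which is an illustrative figure.
delays_per_day = 100       # small delays per worker per day
seconds_each = 3           # extra seconds per delay
workdays_per_month = 21    # typical workdays per month

minutes_per_day = delays_per_day * seconds_each / 60      # 5.0 minutes
hours_per_month = minutes_per_day * workdays_per_month / 60
cost = hours_per_month * 15.0  # assumed wage plus overhead, $/hour
print(minutes_per_day, round(hours_per_month, 2), round(cost, 2))
# -> 5.0 1.75 26.25
```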

Of course, you could try to convince every website operator in the world to recode their sites to not be full of bloated Javascript (starting with the site you're reading this on), but good luck with that.

Re:Why? (2)

bcrowell (177657) | about 2 years ago | (#37502062)

A few dollars a month for a desktop...A few thousand dollars a month for an office full of desktops? The average office worker doesn't do a lot with their computer, and has been doing much the same thing for years... The only thing stopping them from using 10 year old hardware is modern bloated software which is intentionally incompatible with older versions.

In principle I'm in sympathy with you, but in reality there are a lot of problems with your argument.

  • You're suggesting using 10-year-old x86 hardware in a medium-sized business environment. This is different from ARM hardware, which is what this discussion was originally about.
  • In this type of environment, the total cost of ownership probably consists of something like 50% support, 25% software licensing, 20% hardware, and 5% electricity. If you use ARM-based machines, the first thing I can guarantee you is that some people are going to complain that there's some piece of software they need in order to do their job, and it's not available on ARM. That means you're probably going to need a mixed x86/ARM hardware inventory. That's going to be massively more complex to support than pure x86.
  • Suppose instead that you keep a homogeneous x86 inventory, but you keep using machines as old as 10 years. Statistically, your business is probably running Windows. With that hardware mix, you're going to be forced to support lots of different versions of Windows, Office, etc. Again, this makes support more complex and expensive. Since support is the biggest chunk of your TCO, this isn't a good business decision.
  • When you use 10 year old hardware, you get all kinds of other issues coming in. E.g., a machine that old may not have a CD drive.

The truth is that hardware is cheap, and workers are expensive. It doesn't make sense to make your workers even 5% less productive in order to save some tiny amount of money on electricity.

What would really make sense these days for a medium to large business would be to stop paying $2000 for every machine and start supplying 80% of their users with new x86 machines in the $500 price range, on a 4-year replacement cycle. What I've observed where I work, however, is that this is difficult to do, for a variety of reasons. Workers who haven't had a hardware upgrade in 10 years feel like when their time comes to finally get an upgrade, this is their one big chance, and they're going to be stuck with their new machine for 10 years into the future -- so they argue for higher-end hardware. IT wants standardization of hardware to make their jobs easier, and since 20% of users do need higher-end hardware, you can't standardize on the low end. Psychologically, IT wants to work with shiny new toys.

Re:Why? (1)

nurb432 (527695) | about 2 years ago | (#37501662)

You can always get a small ARM tablet and hook up the HDMI to a monitor if it's the full size display and keyboard you're missing.

That is my thought ( and what I'm doing ), so really ARM is already ( back ) on the desktop.

One advantage to having it on your desktop is that you could have the same 'stuff' in your hand ( phone ) and desktop.

Re:Why? (1)

Anonymous Coward | about 2 years ago | (#37501924)

"Seriously, what is the reason for having a desktop ARM computer? Power consumption?"

No, the reason for wanting ARM on the desktop is efficiency: performance per unit of power consumption.
Why would you not want desktop computing power at 1/10 of the power consumption?
It surely would solve a whole lot of the thermal problems that today's desktop PC technology is facing.

The issue seems to be that while ARM is inherently more efficient than x86, the ARM world has been focusing on low-power applications, not on the highest possible performance. But there's no reason in principle why a (multi-core) ARM CPU that runs at 4GHz cannot be made, and no reason why it cannot be mated with high-speed RAM and a high-speed system bus.
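The efficiency framing above can be made concrete with a toy performance-per-watt calculation. The scores and wattages below are invented for illustration, not benchmarks of any real chips:

```python
# Toy performance-per-watt comparison. Numbers are invented to
# illustrate the metric, not to benchmark real hardware.
chips = {
    "x86 desktop": {"score": 10000, "watts": 95},
    "ARM SoC": {"score": 2500, "watts": 5},
}
for name, c in chips.items():
    # A chip can lose on raw throughput yet win on work done per watt.
    print(name, round(c["score"] / c["watts"], 1))
# -> x86 desktop 105.3
# -> ARM SoC 500.0
```

On these made-up numbers the x86 part is 4x faster outright, but the ARM part does nearly 5x the work per watt, which is exactly the trade-off the parent is arguing about.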

Not scaling to threads (1)

tepples (727027) | about 2 years ago | (#37502428)

Why would you not want desktop computing power at 1/10 of the power consumption?

First, because the existing application you want to use depends on clock rate and instructions per clock and doesn't scale to 8+ threads. Second, because the existing application you want to use hasn't been recompiled.

Re:Why? (1)

obarthelemy (160321) | about 2 years ago | (#37502144)

The question could be put backwards: what's the reason for having an x86 computer?
In my case, I don't game much anymore, I use OpenOffice... apart from dual-screen support, the Pi-B at $35 does everything I need, though more RAM would be nice. Actually getting 2 Pis for fake dual screen will turn out cheaper than my current nettop.
Once iOS and Android get their "big screen" interfaces right (and Android seems well on its way with 4.x), we can even take advantage of all the mobile apps out there, which are typically much cheaper and more accessible than their desktop versions.

Re:Why? (1)

Patch86 (1465427) | about 2 years ago | (#37502358)

I guess it depends on whether we're including "full-sized laptop" in our definition of "desktop" (which pains me, but it seems to be pretty common where the two categories are "desktop" and "mobile" (tablets and smartphones)).

If you include laptops, power consumption is important; it's either longer battery life or smaller, lighter batteries. That's why an ARM-powered iPad has a 10 hour battery life and is still slim and portable. I don't know about you, but I'd happily go for some of that with my office laptop too.

Re:Why? (1)

fruitbane (454488) | about 2 years ago | (#37502362)

Jobs and Co. at Apple DID introduce a touch-pad mouse, y'know... That's not too far off from a touch screen.

Archimedes (5, Interesting)

rve (4436) | about 2 years ago | (#37501482)

It was on the desktop first [wikipedia.org]. I was a kid, not terribly good with money, and it was expensive, so I just missed out on being an early adopter.

Yes, here is a link. (0)

Anonymous Coward | about 2 years ago | (#37501514)



Macs and Hackintoshes.

Re:Yes, here is a link. (1)

Lunix Nutcase (1092239) | about 2 years ago | (#37501520)

It's quite easy to figure out "why" they added that. They will probably be using the quad-core A9s in a future iPad.

When the desktop is superseded (2)

drinkypoo (153816) | about 2 years ago | (#37501516)

When you can get a quad core smartphone with a halfway decent GPU, who cares? The only real problem is the lack of memory. Dock your phone, use its display for status updates and compute on your TV... or monitor.

Re:When the desktop is superseded (1, Insightful)

Lunix Nutcase (1092239) | about 2 years ago | (#37501556)

The people who run tons of software that is x86 only and has no comparable ARM version? People who do work for which ARM is supremely under-powered even with a quad-core version? Even a low end i5 can blow away the fastest ARM processors. This quad-core version will close the gap some, but it will still be noticeably less performant.

Re:When the desktop is superseded (1)

Bert64 (520050) | about 2 years ago | (#37501704)

Sounds familiar...

Like the people who run tons of software that is (sparc|alpha|hppa|power|mips) only and has no comparable x86 version? People who do work for which x86 was supremely under-powered even with quad processors? Even a low end Alpha could blow away the fastest x86 processors.

History repeats itself, attack from below pushes the more powerful, more power hungry and more expensive architectures into small niches...

Re:When the desktop is superseded (1)

lkcl (517947) | about 2 years ago | (#37501600)

not if it's got 4gb of ECC DDR3 1066mhz RAM, it's not. the NuSmart 2816 is a 1.6 to 2ghz Dual-Core Cortex A9 with two versions - one 32-bit memory addressing and the other 64-bit. they're sampling, now. i'm working to get them plugged in to the EOMA initiative:
http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]
http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]

In netbooks (1)

X10 (186866) | about 2 years ago | (#37501538)

I don't see why I should have an ARM processor in my desktop pc, but I would love to have one in my netbook. It would boost battery life from two hours to ten - same as my Galaxy Tab.

Re:In netbooks (1)

fast turtle (1118037) | about 2 years ago | (#37501758)

Buy an AMD Fusion based laptop. Battery life exceeds 6 hours and can easily reach 8 hours of normal office use. With a bit of effort and thinking, they can actually exceed 10 hours of run time while still offering full compatibility with Win7 and all of your software.

Raspberry Pi pricing (0)

Anonymous Coward | about 2 years ago | (#37501542)

Was talking about the Raspberry Pi to several Arduino guys last week at the NYC Maker Faire. Most are convinced that they can only hit that price point because they are given free chips by the manufacturers. (This happens for education quite a bit, I'm told.)


Re:Raspberry Pi pricing (1)

sunderland56 (621843) | about 2 years ago | (#37501706)

An Atom-based CPU + motherboard costs roughly $75. The Raspberry is supposed to be $25/$35 but is nowhere near as capable (no hard disk interface, fewer USB ports, slower graphics). The Atom is a fully usable, if a bit slow, desktop; the Raspberry is brilliant for what it is designed to do, but would not be usable as your one and only computer.

If Raspberry added the components to make it usable, and charged market price for the CPU, then it would be $75 - and would still be slower than the Atom. So, what exactly is the point of an ARM desktop?

Foaming at the mouth? Really? (0)

Anonymous Coward | about 2 years ago | (#37501546)

Methinks this post is laced with a bit of projection [wikipedia.org] .

EOMA Initiative (5, Interesting)

lkcl (517947) | about 2 years ago | (#37501548)

it's a long story, but i've been working to get ARM-powered desktop machines and laptops into the hands of free software developers for some time.

one of the key problems is that the chinese and taiwanese factories have absolutely no software expertise whatsoever. some guy decides he got caught out by the USA and UK Governments placing embargos and tariffs on imported clothes a couple years back: his business was affected, so he goes "i know, i'll diversify, i'll make tablets, those are popular". so off he goes, he gets supplied with a GPL-violating Android OS right from the word "go" by a limited number of Chinese ODMs who are having a really hard time keeping hold of their software engineers, and it just goes downhill from there.

the other problem is, as can be seen from the insane amount of money spent by the openpandora group, that case-work for laptops etc. can well be in excess of $100,000. that means that anything like the "pegatron netbook" has to be bought in volumes of 250,000 and above in order for the R&D costs to be amortised over a reasonable period.

this is where the EOMA initiative comes in: http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]

by reversing everything on its head, and getting free software developers a modular architecture which _could_ be dropped into a mass-volume product, the tables are turned: those Chinese Factories can be supplied *by us* - Free Software Developers - with a completed ready-to-ship OS.

so, yes there's a board which is available that is similar in size and function to the pandaboard, origen exynos board, beagleboard, IMX53QSB etc., but unlike those boards, by complying to the EOMA/PCMCIA Open Standard it would be possible to literally drop that hardware-software combination straight into a mass-volume product, with the development effort of the required motherboard being nothing more than a low-cost 2 to 4 layer board that even KiCAD, Eagle or gEDA could do.

one key part of this strategy is to leverage arduino-like boards, like the leafpad Maple:
http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA/MiniEngineeringBoard [elinux.org]

anyway i think that's enough for one slashdot post. bit of background and some additional links, here:
http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]

Re:EOMA Initiative (2)

fast turtle (1118037) | about 2 years ago | (#37501720)

I'd think it would be more sensible to use an existing standard form factor such as M-ATX/ITX for the boards, due to the tooling already being available. Simply put, the only issue you'd have is what internal connectors (SATA/IDE/floppy/FireWire/USB) are needed along with the backplane ports (video, USB, Ethernet being the most obvious). Then design the board to provide the needed connectivity and be done with it. The main advantage is not reinventing the damn wheel and getting a standard board into the hands of the devs as quickly as possible. Hell, you could even go with surface mounting of the CPU and make it non-removable/upgradable to avoid incompatibility issues. Include a PCI-E slot for video, or just use one of the AMD/Nvidia mobile GPUs as used in laptops to solve that issue, and you'd have an ARM development machine in the hands of the devs within 6 months at the most.

Re:EOMA Initiative (4, Interesting)

lkcl (517947) | about 2 years ago | (#37502342)

ok - i'm pleased to see a response here: there are several points that are good, and some that, when you look closer, turn out to be unrealistic for mass-volume so-called "embedded" products.

1) the first is form-factor. this is great! yes, one of the options being considered is to have a standard Mini-ATX/ITX motherboard (into which an EOMA/PCMCIA CPU card can be plugged, at the back). there are several embedded companies that produce Mini ATX motherboards as standard, for their "modules", so it is not a new concept, it is in fact a proven one.

2) the second is connectors / interfaces. if you look right across the board at the very latest ARM processors coming out *right now*, you can count on the fingers of one hand the number of Cortex A8 and Cortex A9 systems (as well as Marvell's "ARM-compatible" range of processors) that have PCI-e.

i'll say that again.

the total number of modern ARM processors with even a 1x PCI-e interface is *below* 5 (five).

now there do exist some Cortex A8s (e.g. the OMAP35xx series) which have a HPI bus, onto which you could put a PCI-e "PHY" chip as it's called, but the total number of companies doing actual PCI-e "PHY" chips is, also, very very limited. typically, any company which has PCI-e PHY interface is a "Fabless Semi" company that gets bought up very very rapidly by the likes of Mentor Graphics, Synopsys and so on.

3) the third is the sheer overwhelming disparity between the ARM CPU's power consumption and the average PCI-e-based GPU's power consumption. the absolute ABSOLUTE lowest power consumption PCI-e-based GPU i could find is one from SiS, it's an older 65nm CMOS process, and if you ramp its speed down to the absolute lowest it will go without keeling over, it uses 6 watts. SIX watts!! you wanna connect a 6 watt GPU up to a 0.5 to 1.0 watt processor be my guest!

4) Multi-layer boards at ATX/ITX form-factor are expensive. if you have the CPU on-board the Motherboard (rather than on a separate card), you are then forced to have the most complex part (the CPU-to-RAM interface) push up the number of layers required for the *whole* motherboard. by contrast, if you do the CPU-plus-RAM as a separate tiny, tiny board, just presenting its interfaces (SATA, ETH, USB, I2C, RGB/TTL etc.) via a simple connector (e.g. PCMCIA 68-pin) then you've just saved a fortune on the cost of the main motherboard, because the main motherboard PCB can be done as an ultra-low-cost 4-layer or even, if you're really lucky or a very good designer, a 2-layer board.

5) The power requirements of standard PCs are 10 to 200x larger than is actually needed! 500 to 1000 watts i mean for fuck's sake that's just insane. these ARM processors, the fastest most powerful one available on the market right now (sampling) is the NuSmart 2816, and that uses _two_ watts (shock horror) at 2ghz. wow big fucking deal. why on god's green earth would you want to match a 2 watt CPU with a 1000 watt Power Supply?? the entire motherboard would probably need a big resistor just to draw enough current in order to convince the PSU that nothing's wrong! i'm not joking about that - i'm dead serious.

6) The level of integration on these so-called ARM "embedded" CPUs is so high that it's really not worth the effort. present a USB bus, present an Ethernet port, present an SATA socket, along with an HDMI out and maybe even VGA, you're done! ship the damn product out the door, it cost you $35 to make! don't believe that price? just look at the cost of the RaspberryPi - it's doable. the irony is that for the same $35 as the "upgraded" RaspberryPi you can get an 800mhz Cortex A9 (a single-core AML-8726-M). same price. and the same size. credit-card-sized. you have to ask yourself: why would you _want_ to fit a credit-card-sized computer into a 12 x 15 x 8in "Desktop" case?? :) why not fit it into a 4in x 5in x 0.75in box, instead?

so, whilst on the face of it, fitting into the "standard" - i'm going to go further than that, i'm going to call it "legacy"...

so whilst on the face of it, fitting into the "legacy" form-factor of M-ATX/ITX seems like a good idea, the whole paradigm is so anathema to the ARM embedded design that it's really not worth the effort. take a look at the riversimple wikipedia page, look up the phrase "Mass Decompounding". the same thing applies here, except with ICs and power budgets.

Re:EOMA Initiative (1)

Doc Ruby (173196) | about 2 years ago | (#37501994)

Are there any existing ARM PC boards that can take the forthcoming Xilinx Zynq CPUs? Nothing fancy - I want to run an Android desktop on it while I port my PIC embedded industrial controller firmware to Zynq/Android/FPGA. Any PC I can swap a Zynq into now, or that will be available in 2012?

Re:EOMA Initiative (1)

Doc Ruby (173196) | about 2 years ago | (#37502006)


Or rather, is anyone putting a Zynq [fpgajournal.com] into EOMA? What would it take for my HW lab to do so?

Re:EOMA Initiative (0)

Anonymous Coward | about 2 years ago | (#37502068)

GPL-violating Android OS

Looks like slander to me.

There's a problem with ARM computing? (2)

GameboyRMH (1153867) | about 2 years ago | (#37501572)

Nobody told me and I've been using my N900 the whole time with no problems. Why am I the last to learn these things!?

What's the problem with the TrimSlice? (5, Informative)

earls (1367951) | about 2 years ago | (#37501590)

Developer only? What is that nonsense? The TrimSlice ships with Ubuntu ready to use. ~$200 for the feature set is a steal, IMO. Not happy without a Dell logo or something? What's the problem with the TrimSlice?

Re:What's the problem with the TrimSlice? (1)

Anonymous Coward | about 2 years ago | (#37501958)

I could go for a little slice of trim right about now~

Re:What's the problem with the TrimSlice? (1)

olesk (211973) | about 2 years ago | (#37502170)

Developer only? What is that nonsense? The TrimSlice ships with Ubuntu ready to use. ~$200 for the feature set is a steal, IMO. Not happy without a Dell logo or something? What's the problem with the TrimSlice?

I bought one last week from here http://trimslice.com/web/ [trimslice.com]

Cheap, cool (as in not hot) and fairly reasonably priced for what you get. Waiting for mine right now...

Re:What's the problem with the TrimSlice? (1)

lkcl (517947) | about 2 years ago | (#37502404)

the first problem is that the cost of the whole system's components is actually, as the RaspberryPi shows, somewhere around the $40 mark. also, if you look closer at the Tegra 250 which is used in the TrimSlice, it doesn't have SATA-II. and as i mentioned in another post here, the number of modern ARM CPUs with PCI-e on the market is less than 5.

so you can't expand it (ARM systems aren't designed that way), and it happens not to have the _exact_ set of features which make it truly desirable to free software developers, which is why free software developers are plumping (reluctantly) for the OpenRD Ultimate, and wishing that there was something better.

right now, it's all a big big compromise. and that's why i started the EOMA initiative, so that those "compromises" do not impact on the development of a product: you can always go swap out the CPU card for something better when it comes along. http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]

there was a discussion of the features available (and desired) on debian-arm a few months back, let me give you a link to jump somewhere into the middle of that:
http://lists.debian.org/debian-arm/2011/08/msg00012.html [debian.org]

Search a little more, like the Efika (4, Informative)

jonsmirl (114798) | about 2 years ago | (#37501616)

http://www.genesi-usa.com/products/efika [genesi-usa.com]
Smarttop $129 thin client
Smartbook $199 laptop

They run Ubuntu and are based on the Freescale iMX51.
They are far more powerful than a Raspberry Pi.

Smarttop:
Freescale i.MX515 (ARM Cortex-A8 800MHz)
3D Graphics Processing Unit
WXGA display support (HDMI)
Multi-format HD video decoder and D1 video encoder (currently not supported by the included software)
8GB Internal SSD
10/100Mbit/s Ethernet
802.11 b/g/n WiFi
SDHC card reader
2 x USB 2.0 ports
Audio jack for headset
Built-in speaker

Smartbook:
10.1" TFT-LCD, 16:9 with LED backlight, 1024 x 600 resolution
Freescale i.MX515 (ARM Cortex-A8 800MHz)
3D Graphics Processing Unit
Multi-format High-Definition hardware video decoder
16GB Nand Flash
External MMC / SD card slot (up to SD v2.0 and MMC v4.2)
Internal MicroSD slot
802.11 b/g/n WiFi (with on/off switch)
Bluetooth 2.1 + EDR
2 x USB 2.0 ports
Phone jack for headset
Built-in 1.3MP video camera
Built-in microphone
Built-in stereo speaker

iMX515 - only 512mb RAM (1)

lkcl (517947) | about 2 years ago | (#37502152)

you need to look at those a little closer. 1) the iMX515 has a hard limit of 512mb RAM; 2) if you've never used a 10.1in 1024x600 LCD, you are in for a bit of a shock.

but yes, the efika mx is getting an upgrade - soon - to the 1ghz iMX53. and, also, i think genesi have been doing "dogfood eating" and have found, just as i told them they would, that 1024x600 LCDs are completely unusable. their developer, matt, treated my ideas like shit (i was approaching them to see if they'd like to collaborate on a faster, more flexible product). he told me there was no market for high-end ARM-based laptops. and then "oh look!" surprise, the next version of the efika laptop will have a 1280x768 LCD as standard. hmm... :)

Re:Search a little more, like the Efika (0)

Anonymous Coward | about 2 years ago | (#37502200)

The PandaBoard is likely decent enough for a desktop. (Use the hard-float Debian port.)

ARM desktop from the past (1)

Anonymous Coward | about 2 years ago | (#37501638)

I already have one (Archimedes) lurking around in the basement (http://en.wikipedia.org/wiki/Acorn_Archimedes)

There is a smartbook available (although not good) (2)

IYagami (136831) | about 2 years ago | (#37501736)

The Toshiba AC100

You can find a review at http://www.reghardware.com/2010/11/03/review_netbook_toshiba_ac100/ [reghardware.com]

"The beautifully designed and executed hardware is very close to my ideal netbook, and it's hardly an exaggeration to say that I'm heart-broken by Toshiba's cocked-up Android implementation. The best one can hope for is a firmware rescue from the open source community, although I wonder if the product will stay around long enough in these tablet-obsessed times for that to happen."

Re:There is a smartbook available (although not go (1)

lkcl (517947) | about 2 years ago | (#37502438)

unfortunately, this system illustrates why 1024x600 LCDs are undesirable. as does the genesi laptop with the same sized screen. other than the forced-installation of android, total non-upgradeability, inability to have 1gb of RAM and complete lack of interface for putting in an SATA SSD, the AC100 is actually very good. ok, in case you hadn't noticed, that was supposed to be ironic.

Yes, in about two months. (1)

Anonymous Coward | about 2 years ago | (#37501756)

Raspberry Pi [raspberrypi.org] . $35 for a 700MHz ARM with 256MB of RAM and 1080p HDMI output. More computer than most people need.

Re:Yes, in about two months. (0)

Anonymous Coward | about 2 years ago | (#37502124)

Isn't the Raspberry Pi the device with the Broadcom ARM soc without any public docs? The Raspberry Pi director does work for Broadcom [ycombinator.com] . Does Broadcom even have a single ARM device with open docs?

Re:Yes, in about two months. (1)

LuxuryYacht (229372) | about 2 years ago | (#37502202)

The Broadcom BCM2835 [broadcom.com] looks like the ARM SOC used by the Raspberry Pi [elinux.org] . There isn't a link for its data sheet that I could find. So it looks like just another closed hardware ARM design.

Re:Yes, in about two months. (0)

Anonymous Coward | about 2 years ago | (#37502334)

Who cares, as long as the kernel drivers and any necessary userspace tools are open source? Does it make a difference if the graphics card loads its firmware from a flash chip on the card or a binary blob on the SD-card? What documentation do you need?

Before you go saying that ARM is fast enough... (2)

CajunArson (465943) | about 2 years ago | (#37501778)

OK, the idea behind ARM is that it is "fast enough" for desktop and notebook PCs. Well, if that's the case, then a P4 is also "fast enough" and you should consider not buying anything newer.

Why am I saying that? Let's look at one benchmark that *is* multi-core ready and that Nvidia kindly ran on the upcoming Kal-El quad-core systems: Linpack.

Now I know Linpack is not a perfect benchmark, but it does do a decent job of showing off number-crunching power and it is multi-core capable and there are results from a wide range of architectures.

Here's a result from a 1.7 Ghz P4 system (see: http://www.roylongbottom.org.uk/linpack%20results.htm [roylongbottom.org.uk] )

CPU          MHz    Opt (MFlops)   Non-Opt (MFlops)
Pentium 4    1700   382.00         131.59

I think (but I'm not sure) that Opt means optimized (such as using SSE) and non-Opt is a minimal x86 implementation with no optimizations.

Now, here are Nvidia's results for its not-yet-on-the-market Kal-El Quad Core ARM at 1.0 Ghz:

Multi-threaded Linpack: 309 Mflops

See: http://www.xbitlabs.com/news/mobile/display/20110921142759_Nvidia_Unwraps_Performance_Benchmarks_of_Tegra_3_Kal_El.html [xbitlabs.com]

I'm going to assume that Nvidia will go out of its way to make sure the code is optimized for benchmarks that it posts as part of a marketing push.

So a QUAD CORE ARM architecture is still lagging behind a P4. The P4 does have a clock speed advantage, but that advantage is far too small to explain the performance gap once you remember the Nvidia chip has 4 cores against the P4's one.

Now, I'm not saying that Kal-El won't be awesome for use on tablets and smaller devices, but on a desktop or even a notebook, don't go around expecting miraculous performance.
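If you want a ballpark figure for your own machine, Linpack-style numbers are easy to approximate: a dense n-by-n solve costs about 2n^3/3 + 2n^2 floating-point operations, so timing a solve and dividing gives MFLOPS. Here's an illustrative sketch (not the real Linpack benchmark; `solve_dense` and `approx_mflops` are hypothetical helpers, and pure Python will be orders of magnitude slower than the optimized results quoted above):

```python
import random
import time

def solve_dense(a, b):
    """Naive Gaussian elimination with partial pivoting (illustrative only)."""
    n = len(b)
    a = [row[:] for row in a]  # work on copies
    b = b[:]
    for k in range(n):
        # pivot: bring up the row with the largest entry in column k
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back-substitution
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

def approx_mflops(n=150):
    random.seed(0)
    a = [[random.random() for _ in range(n)] for _ in range(n)]
    b = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    solve_dense(a, b)
    dt = time.perf_counter() - t0
    # classic Linpack flop-count convention for an n x n solve
    return ((2.0 / 3.0) * n ** 3 + 2.0 * n ** 2) / dt / 1e6

print(f"~{approx_mflops(120):.1f} MFLOPS (pure Python, n=120)")
```

The published results use optimized BLAS/SSE code, which is exactly why the "Opt" and "Non-Opt" columns differ so much.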

Re:Before you go saying that ARM is fast enough... (1)

CajunArson (465943) | about 2 years ago | (#37501798)

P.S. --> Since Slashdot formatting is whack, the above results should show the 1.7 Ghz P4 with a 382 Mflop Linpack score when "optimized" compared to Nvidia's published results of 309 Mflops.

Mathematically, the P4 has a 70% clockspeed advantage with only a 23.6% performance advantage, but remember how crappy the IPC on the old P4 was, and remember that the P4 is a single core CPU vs. Quad Cores for Nvidia.
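The arithmetic checks out (figures taken from the posts above; the per-core normalization is my own addition, and MFLOPS per MHz is of course only a crude proxy for IPC):

```python
# Sanity-check the percentages quoted above.
p4_mhz, p4_mflops = 1700, 382.0        # 1.7 GHz Pentium 4, "Opt" Linpack result
kalel_mhz, kalel_mflops = 1000, 309.0  # quad-core Kal-El at 1.0 GHz
kalel_cores = 4

clock_advantage = p4_mhz / kalel_mhz - 1       # 0.70, i.e. 70%
perf_advantage = p4_mflops / kalel_mflops - 1  # ~0.236, i.e. 23.6%

# Normalized per core and per MHz, the single P4 core still does roughly
# 2.9x the work per clock of one Kal-El core on this benchmark:
p4_per_core_mhz = p4_mflops / p4_mhz                           # ~0.225
kalel_per_core_mhz = kalel_mflops / (kalel_cores * kalel_mhz)  # ~0.077

print(f"{clock_advantage:.0%} clock advantage, {perf_advantage:.1%} perf advantage")
print(f"per-core, per-MHz ratio: {p4_per_core_mhz / kalel_per_core_mhz:.1f}x")
```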

Re:Before you go saying that ARM is fast enough... (1)

statusbar (314703) | about 2 years ago | (#37501880)

How much power does the 1.7 Ghz P4 system require vs the Kal-El Quad Core ARM at 1.0 Ghz?

Re:Before you go saying that ARM is fast enough... (0)

Anonymous Coward | about 2 years ago | (#37502038)

It's useless to ask; he can't hear you because of the cpu-fan.

Re:Before you go saying that ARM is fast enough... (0)

Anonymous Coward | about 2 years ago | (#37502166)

How much power does the 1.7 Ghz P4 system require vs the Kal-El Quad Core ARM at 1.0 Ghz?

Does it matter?? Seriously, he was comparing old tech with new tech. If you want a power comparison, compare at least somewhat modern tech with the Kal-El.

Let's take the SINGLE core AMD Sempron 145 @ 2.7GHz with a 45W TDP. It will clearly blow the P4 and Kal-El out of the water in performance. But you can even compare the power usage: at idle, it clocks down to 800MHz and uses under 2W.

For even lower power APUs (i.e. CPU + Radeon cores):

        http://en.wikipedia.org/wiki/Bobcat_(processor) [wikipedia.org]

Intel is competitive on power usage at idle. There is no reason to go to ARM to lower the power consumption of desktops and laptops.

Re:Before you go saying that ARM is fast enough... (1)

DuckDodgers (541817) | about 2 years ago | (#37502114)

2 points to consider:
1. The P4 uses something like 5-10 times as much power for the task. Considering that these days most people just surf the web and play simple games that can be programmed with Javascript and HTML5, that means ARM is good enough for a large minority of current PC users. It also means ARM is pretty good for a lot of laptops. At our company the software developers run IDEs and compilers on laptops, so beefy x86 CPUs are extremely useful. But everyone else is using a web browser and Microsoft Office; as long as the user interface remained the same, a switch to ARM wouldn't even be noticed.
2. ARM processors are going to improve. The nVidia Tegra 3 "Kal-El" processor is made on a 40nm process, with 4 cores plus one low-power core, a cap of 1.5 GHz, and a 12-core GPU. nVidia has announced future generations of its ARM processors with a 28nm process, 8 cores and a 64-core GPU, and no details on max GHz. It's likely that in as little as three years there will be ARM chips comparable to a mid-range Intel Core i5 of today, with much lower power draw. Then ARM will be sufficient for 75-90% of desktop machines.

I don't do video editing, Folding@Home, or Bitcoin mining. I'm sure if I did that would change my position on this, and for people that do I'm sure having a beefy CPU and top tier dedicated GPU is essential. In my experience, the AMD 6-core CPU I have at home and the Core i5 I have at work never top 100% use of one core plus 50% use of a second core. The rest are always idle.

Wrong subject (0)

Anonymous Coward | about 2 years ago | (#37501848)

I think people are looking at this completely wrong. How much does an ARM device cost? $600? $800? $1200? Most of that is the battery, screen and maybe the flash itself. When you look at the AppleTV ($99) you notice none of those parts are a consideration.

So how do you get everyone on the internet with their own identity? Easy: make the computer core (the CPU, OS, and primary storage) portable, something you can just plug into any other computing device (e.g. iPhone, desktop). When the next, more powerful version comes out, you migrate it to the new one without having to throw away all the more expensive parts (screen, batteries and secondary storage).

Of course this would cut into the profits of all the companies involved, but from a cost and ecological point of view it makes sense. Instead of a $1000 desktop or laptop, you have a $99 micro-desktop that you can take anywhere and plug into anything. "The cloud" promises this somewhat, but is unworkable if you're in places without the internet, or in countries that eavesdrop on everything you do. It largely solves the malware problem, since everything you put on the portable module is yours, and you don't have to constantly authorize and deauthorize each computer you work at.

Anyway. The ARM desktop, ideally, would be a 2" by 1" module that you can insert into an iMac-like computer screen to get a desktop, or an iPhone-like portable to get a laptop/smartphone. No syncing, no batteries, no wireless, no passwords (unless you set one up), all the hassle cut out, because the CPU, RAM, primary storage and OS are on the module. We're probably 2 years away from this being actually viable; mainly the cost of flash and the lack of appropriate GPU performance are what's holding things back.

Seeing an ARM desktop that's like a full x86 desktop in an ATX form factor is a non-starter, as that eliminates two of ARM's advantages: power and size. Intel has to cripple its CPUs down to the performance of a 10-year-old computer to get the power envelope within ARM's capability, and ARM still winds up being about as powerful as a 5-year-old computer. So ARM has at least a 2:1 performance-to-power benefit.

Acorn Archimedes - the first desktop ARM (2)

Nivag064 (904744) | about 2 years ago | (#37501854)

ARM got to the desktop years ago (1987, according to wikipedia), as the first computer to use the ARM chip was a desktop computer - the Acorn Archimedes!

I had one; it was a lovely computer, easy to program, with a GUI far in advance of its time.

Zynq-7000 PC? (1)

Doc Ruby (173196) | about 2 years ago | (#37501972)

What I want is a desktop Zynq-7000, the ARM Cortex-A9 CPU from Xilinx with a large embedded FPGA, running Android. My lab desktop, anyway. I want to port my embedded industrial control PIC code to it, perhaps targeting a soft PIC core in the FPGA (at first, then gradually porting sequential functions to Android processes). A desktop ARM/FPGA would be a great way to use the large universe of desktop apps to get the embedded PC to do what I want, even if I then repackage it as an embedded device (text LCD, minimum IO ports, no local desktop, etc).

The Zynq CPU itself is probably not shipping until 2012. But who's got a PC built on it in the pipeline? Who's got some other ARM PC that could take a Zynq popped into it with a minimum of electronics work?

They exist (2)

guruevi (827432) | about 2 years ago | (#37502032)

They just won't run MS Office, which is the biggest problem for most office workers. They are indeed currently at the developer and embedded stage. The problem is that occasionally you want a little more horsepower (even if it's just to play Flash games), so people buy a 'normal' computer. Also, there is no real support available and very little experience among your average sysadmins.

Once somebody starts doing it, the ball will get rolling. Even $200 is not bad, but once the Raspberry Pi runs a browser and e-mail, SSH, VNC, X and OpenOffice, and basically plugs into a display without too much trouble (or, even better, is embedded into a display), I will be deploying them in our shared computer spaces, because that's all they're for: connect to the cluster to run your jobs, check your e-mail and Facebook while you're waiting, occasionally copy something from or to a USB stick. All home directories are already on the network (NFS), so I don't really need much storage.

just my 2-cents (0)

nerdyalien (1182659) | about 2 years ago | (#37502066)

ARM is relatively new to pipelined and super-scalar architectures, and has yet to ship "out-of-order" execution. Assume it will take some more years to produce a highly efficient processor.

My guess as to why ARM hasn't reached the desktop yet: it can't handle complex multitasking. For example, how about viewing 20 web pages, HD video playback, downloading a multi-gig file at 1 Mbps, and applying PS touches in batch mode, all while compressing 1000+ files? My other guess is that ARM has yet to produce a CPU that can communicate with modern graphics cards, SSDs and other demanding I/O. They'd require additional transistors per processor to get around that, I guess.
