Hardware

Is ARM Ever Coming To the Desktop? 332

First time accepted submitter bingbangboom writes "Where are the ARM-powered desktops? I finally see some desktop models, but they are relegated to "developer" models with USD 200+ price tags (trimslice, etc.). Raspberry Pi seems to be the only thing that will be priced correctly, have the right amount of features, and may actually be released. Is the software side holding ARM desktops back? Everyone seems to be foaming at the mouth about anything with a touch interface, even on the Linux side. Or are manufacturers not wanting to bring the 'netbook effect' to their desktop sales? Are ARM-powered desktops destined to join the mythical smartbook?"
This discussion has been archived. No new comments can be posted.

  • Look on eBay (Score:5, Informative)

    by jimicus ( 737525 ) on Saturday September 24, 2011 @10:08AM (#37501460)

    Look on eBay for an Archimedes.

    They're rapidly becoming a collector's item, but they were on the desktop in 1987.

    • They had a surprisingly modern looking desktop, that booted in seconds. It would be interesting to see where the platform would be today had it taken off in a big way.
      • Re:Look on eBay (Score:4, Informative)

        by jimicus ( 737525 ) on Saturday September 24, 2011 @10:47AM (#37501714)

        The OS itself is still around today [riscosopen.org], after a fashion. But time has not been kind.

        IMV, a fast boot cannot compensate for a spectacular lack of features you'd expect to find in a modern OS. It's a single-user OS with co-operative rather than pre-emptive multi-tasking, and there's no protected memory or swap support.

        • I think it's safe to assume that, had it been developed as a mainstream OS in the intervening time, it would've moved to pre-emptive multitasking as soon as the hardware permitted it.

          Most of my fond memories are of the interface: the consistent and effective use of three mouse buttons, the innovative save dialog, and the way in which applications were packaged (which, honestly, I know people think was invented with Mac OS X).

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      I find it funny, yet sad, that people have forgotten that the A in ARM used to stand for Acorn. I was there, in Cambridge, during the time of the first ARM CPU's development; friends of friends of the people who worked to create it. At the time it was by far the most powerful desktop, and its various OSs (RISC OS 2+... Arthur was always a stop-gap OS :) ) were far in advance of everything else available at the time.

      I still use a RISC PC today. Also, one of the best case designs ever - it's practically in

    • BAFTA video: http://bcove.me/tux4wa2x [bcove.me] The part most pertaining to the current thread is at 14:32
    • I came here just to post this. It's not that ARM is ever coming to the desktop; it's already been and gone. By the way, am I mistaken in thinking some of the later RISC PCs used Intel ARM processors?

      It's also nice to see that someone still thinks back fondly to Archimedes machines as I do.

      • by jimicus ( 737525 )

        IIRC there was an upgrade based on a DEC StrongARM processor available for the RiscPC. Not sure if there was ever an Xscale upgrade but Castle Technology had a few systems built based on such a chip.

        Damn fast, they were. I used them a couple of times at university - I wish I'd known they were binning them, I'd have grabbed one. Came in one day and found the lab had been re-equipped.

  • Why? (Score:3, Insightful)

    by dukeblue219 ( 212029 ) <dukeblue219.aol@com> on Saturday September 24, 2011 @10:09AM (#37501464) Homepage

    Seriously, what is the reason for having a desktop ARM computer? Power consumption? I don't think there's a very large market for people who will settle for tablet-like performance in order to save a few dollars a month at most on electricity compared to existing low power processors. People with power grid problems will want something that runs on a battery anyway, and a tablet/netbook makes more sense there.

    Is it just for something fun to play with? Something small and portable? You can always get a small ARM tablet and hook up the HDMI to a monitor if it's the full size display and keyboard you're missing.

    Not sure what touch interface has to do with anything. That could be just as easily implemented with any architecture, and it's maybe the ONE thing I agree with Steve Jobs about -- touch does NOT work as a viable input method for a desktop.

    • That was my thought. I have a Zacate processor in my laptop, which would be far more suitable for a desktop than your average ARM processor. ARM would ultimately face the same uphill struggle for acceptance that Intel's Merced did when AMD whomped them with their AMD64 architecture. Changing instruction sets isn't easy to do, and the main reason folks tolerate it with mobile processors is that it saves them so much scarce battery life. Switching to one of AMD's mobile offerings would pretty much eliminate

      • by lkcl ( 517947 )

        Zacate is 18 watts! that means you have to have a heatsink or heatpipe, fan and other moving parts, as well as much larger power components. by contrast, with something like the NuSmart 2816, if you run it at 1.6ghz then you can get away with 4 watts, and that's *including* the ECC 1066mhz DDR3 RAM. a voltage regulator for a stable 4 watt power supply is approximately a $0.50 part. it's a whole different ballgame. so for the EOMA initiative, we've set a 5 watt absolute maximum limit, and are sticking t

        • Right, and the cost you pay is having to recompile and possibly rewrite every application that you want to use.

          As I pointed out, Intel thought that they could introduce a 64bit architecture that lacked support for 32bit applications and ended up being taken over AMD's lap for a spanking.

          As for thermal dissipation, 18 watts isn't that much, with the right heat sink you can dissipate most CPUs using something like this: http://www.nofencomputer.com/eng/products/CR-100A.php [nofencomputer.com]
          And ultimately, that's just at peak u

          • I bought a passive E-350 board from Asus, and put it in a passive, well ventilated case vertically mounted behind my monitor. Temp was 60-65C, so I added a small and whiny fan.

          • by Nutria ( 679911 )

            Right, and the cost you pay is having to recompile and possibly rewrite every application that you want to use.

            I guess you didn't know that Debian is a multi-CPU distro and that they aren't Gentoo?

            As I pointed out, Intel thought that they could introduce a 64bit architecture that lacked support for 32bit applications and ended up being taken over AMD's lap for a spanking.

            This is true only for the Windows world. (Which is admittedly 90% of the market...)

      • Re:Why? (Score:4, Interesting)

        by durrr ( 1316311 ) on Saturday September 24, 2011 @11:08AM (#37501824)
        The Zacate is available for desktop setups. A motherboard with Zacate integrated goes for $100-$120; with RAM, PSU, HD and a shoebox for a chassis you get a decent Windows computer for $200.

        That, however, is quite a bit in excess of the $35 the Raspberry Pi is supposed to sell for. At that price point you can almost start putting them everywhere before knowing why you're putting them everywhere.
    • Re:Why? (Score:4, Insightful)

      by rve ( 4436 ) on Saturday September 24, 2011 @10:21AM (#37501550)

      I have an ARM based laptop. It's fanless; in fact, it has no moving parts at all other than the hinge of the screen, and goes for a day or two of regular use between recharges. I'm not convinced "the desktop" has much of a long term future at this point... I think it will go the way of the workstation.

      • Yes, and everybody will run underpowered desktops in the quest to not have to hear the fan...

        I run a desktop because I need the power. If it has an ARM in it, so be it. BUT I need the power. When I develop I am going to use multiple screens. When I run my trading software I need a desktop with multiple screens.

        What people need to understand is that there is no solution fits all. Some people don't need a desktop, others do. Some people don't need a tablet, others do. Let's all get this straight we will have

        • Re:Why? (Score:5, Insightful)

          by Bert64 ( 520050 ) <bert@[ ]shdot.fi ... m ['sla' in gap]> on Saturday September 24, 2011 @10:40AM (#37501676) Homepage

          Exactly, no solution fits all... Your needs are specialised, so you will occupy a niche of people who will continue to buy highend workstations...

          For the vast majority of people, computers became powerful enough for their requirements many years ago (aside from increasingly bloated software trying to mask that fact), and they are concerned about price, running cost (ie power usage), noise and that the machine is not an eyesore. Even more so the companies who buy hundreds of desktops for their employees and don't want to buy a noisy, expensive, large and power-hungry workstation for someone whose sole business use for it is to write letters.

        • Re:Why? (Score:4, Informative)

          by realityimpaired ( 1668397 ) on Saturday September 24, 2011 @11:08AM (#37501814)

          There's no reason a Cortex A7 dual core @ 800MHz wouldn't be able to handle both of the tasks you listed with ease. It could even handle basic gaming if you have a discrete video card to handle the load. Most people don't do the kind of number crunching that a modern high end desktop CPU would allow.

          The gamer crowd, absolutely. I can fully understand why they would want a high end processor. Even games that aren't that graphics intensive, like Civilization, are very heavy on number crunching. The office crowd, however, could easily be serviced by a low end low power ARM CPU. I could easily replace my desktop with an ARM-powered nettop without adjusting my computing habits at all, and I'm already running a multi-head setup.

          • I mean, if you can run Quake III on Raspberry Pi [youtube.com], I see no reason why you can't solve most people's computing needs with a 1 GHz or less ARM processor. Now, you can't do video editing, but that's not what most people do with their computers. They type email, visit Facebook, type up a document, look at some pictures, or watch a movie. These computers are perfect for those uses.
        • There is actually no need to. I've got a desktop computer with Core2Duo E8400 CPU and a Radeon HD5770 graphics card and the only fan of that system sits in the PSU (and it is almost silent due to its low RPM). The system is quite fast and the power consumption is moderate.
          It is absolutely possible to build a silent and powerful desktop system.

        • by rve ( 4436 )

          You know, in the late 90's Sun, DEC, SGI and yes, Apple, were getting great profit margins on fantastically powerful graphical workstations. Some models sold for $100k or even much more. Switch to an underpowered Linux PC just to save $100k? Not me, I need multiple processor cores, 1600x1200-pixel 256-color displays, 100Mbit/s ethernet.
          Of course, 5 years later, all of these companies had either abandoned the graphical workstation market or were struggling to survive. Today, a $800 laptop exceeds the specs ab

    • Re:Why? (Score:4, Informative)

      by SendBot ( 29932 ) on Saturday September 24, 2011 @10:25AM (#37501578) Homepage Journal

      ... it's maybe the ONE thing I agree with Steve Jobs about -- touch does NOT work as a viable input method for a desktop.

      He may have said that at some point, but you should know by now that Apple changes the kool-aid they serve every so often. He's even spoken at length about merging iOS concepts into the desktop OSX.

      I've copied some text straight from the Apple web site for the "Magic Trackpad" that makes touch sound like you're no longer cool without it:

      "The new Magic Trackpad is the first Multi-Touch trackpad designed to work with your Mac desktop computer."
      "And it supports a full set of gestures, giving you a whole new way to control and interact with what’s on your screen."
      "Magic Trackpad gives you a whole new way to control what’s on your Mac desktop computer. When you perform gestures, you actually interact with what’s on your screen. You feel closer to your content, and moving around feels completely natural."

    • by lkcl ( 517947 )

      well, fortunately there's soon going to be things like the NuSmart 2816, which will have the best of both worlds: dual-core 1.6 to 2ghz, 4gb of ECC DDR3 1066mhz RAM... and only about 4 watts for a system (at the 1.6ghz speed).

      i'm working towards getting these - and other such beefy low-power CPUs - plugged in to the EOMA initiative:
      http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]
      http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]

      • Dual-Core 1.6 to 2ghz, 4gb of ECC DDR3 1033mhz RAM... and only about 4 watts for a system (at the 1.6ghz speed).

        That's going to be slower than an equivalent x86-based machine, though.

        The thing is, it will be slower *but draw one tenth the power*. I want ten of these on a board, with about eight times the processing power for the same power draw as an x86 solution.
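A rough back-of-the-envelope version of that claim, with hypothetical figures (the 1/10-power and 0.8x per-chip speed numbers are taken from the comment's premise, not measurements, and the scaling assumes a perfectly parallel workload):

```python
# Hypothetical: one x86 CPU vs. a board of ten ARM SoCs, where each
# ARM SoC draws 1/10 the power at 0.8x the single-chip performance.
x86_power_w, x86_perf = 40.0, 1.0       # assumed baseline figures

arm_power_w = x86_power_w / 10          # "one tenth the power"
arm_perf = 0.8 * x86_perf               # slower per chip

n = 10                                  # ARM SoCs on one board
board_power_w = n * arm_power_w         # total draw
board_perf = n * arm_perf               # ideal, embarrassingly-parallel scaling

print(board_power_w, board_perf)        # same power budget, ~8x the throughput
```

The catch, of course, is the "ideal scaling" line: only workloads that split cleanly across ten independent SoCs see anything like that 8x figure.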

    • Re:Why? (Score:5, Informative)

      by Bert64 ( 520050 ) <bert@[ ]shdot.fi ... m ['sla' in gap]> on Saturday September 24, 2011 @10:36AM (#37501640) Homepage

      A few dollars a month for a desktop...
      A few thousand dollars a month for an office full of desktops?

      The average office worker doesn't do a lot with their computer, and has been doing much the same thing for years... The only thing stopping them from using 10 year old hardware is modern bloated software which is intentionally incompatible with older versions.

      There's no reason that the average user's needs couldn't be fulfilled by a low power machine with equivalent processing power to a system from 10+ years ago, with power hungry x86 systems being relegated to the small niche of power users and certain classes of server.

      (in short, watch what x86 did to Sparc/MIPS/Alpha/Power, attacked from below)

      • by gl4ss ( 559668 )

        the screen is going to consume a lot of power anyway.

        nothing is stopping offices from running 10 year old sw either. office '97 is enough to run an smb business, proven in practice many times over.

      • A few dollars a month for a desktop...
        A few thousand dollars a month for an office full of desktops?

        We have an "underpowered" Intel Atom-based mini desktop in the kitchen as a terminal, which is silent and uses negligible power.

        While it would be usable for an office worker, it *is* noticeably slower than my standard Athlon computers. Just about every real-world web page or application launch takes several seconds longer than the normal PCs. Over the course of a month, all these little delays would almost certainly add up to more lost productivity than the additional power cost, especially if the normal PC

      • A few dollars a month for a desktop...A few thousand dollars a month for an office full of desktops? The average office worker doesn't do a lot with their computer, and has been doing much the same thing for years... The only thing stopping them from using 10 year old hardware is modern bloated software which is intentionally incompatible with older versions.

        In principle I'm in sympathy with you, but in reality there are a lot of problems with your argument.

        • You're suggesting using 10-year-old x86 hardware in a medium-sized business environment. This is different from ARM hardware, which is what this discussion was originally about.
        • In this type of environment, the total cost of ownership probably consists of something like 50% support, 25% software licensing, 20% hardware, and 5% electricity. If you use ARM-based machines, the first thing I can guarantee you i
    • by nurb432 ( 527695 )

      You can always get a small ARM tablet and hook up the HDMI to a monitor if it's the full size display and keyboard you're missing.

      That is my thought ( and what I'm doing ), so really ARM is already ( back ) on the desktop.

      One advantage to having it on your desktop is that you could have the same 'stuff' in your hand ( phone ) and desktop.

    • The question could be put backwards: what's the reason for having an x86 computer ?
      In my case, I don't game much anymore, I use OpenOffice... apart from dual-screen support, the Pi-B at $35 does everything I need, though more RAM would be nice. Actually getting 2 Pis for fake dual screen will turn out cheaper than my current nettop.
      Once iOS and Android get their "big screen" interfaces right (and Android seems well on its way with 4.x), we can even take advantage of all the mobile apps out there, which are

    • I guess it depends on whether we're including "full sized laptop" in our definition of "desktop" (which pains me, but it seems to be pretty common for the two categories to be "desktop" and "mobile" (tablets and smartphones)).

      If you include laptops, power consumption is important; it's either longer battery life or smaller, lighter batteries. That's why an ARM-powered iPad has a 10 hour battery life and is still slim and portable. I don't know about you, but I'd happily go for some of that with my office laptop too

    • Jobs and Co. at Apple DID introduce a touch-pad mouse, y'know... That's not too far off from a touch screen.

    • by YesIAmAScript ( 886271 ) on Saturday September 24, 2011 @12:44PM (#37502488)

      ARM licenses IP. Intel sells chips.

      If you license a core from ARM you can put it down on a chip, then put down your other logic (north/south bridge, interface logic like USB) on the same chip. Then you can end up with your entire system on a chip.

      With Intel you have to buy a CPU and a north/south bridge. If you want custom interfaces beyond that, that's more chips too.

      So the net effect is that the Intel-based system uses more chips and that means it costs more, uses more power and is larger. Using more power means you need to put in a larger power supply, that costs more. If it's battery-powered, that means it needs a larger battery, that costs more. Larger in and of itself makes something more expensive to make as it requires more materials. And then it being larger means it costs more to ship from where it is made to the customer. And then finally every increase in costs also means more increase in on-the-shelf price because you not only have to cover the higher costs, but the OEM and retail margins on the costs.
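That cascade compounds multiplicatively: every dollar of extra build cost gets marked up by OEM and retail margins before it reaches the shelf. A sketch with made-up numbers (all figures hypothetical, purely to show how the margins stack):

```python
# Hypothetical BOM difference: multi-chip x86 design vs. a single ARM SoC.
extra_chips = 4.00        # extra bridge/interface silicon
extra_psu = 1.50          # larger power supply or battery to feed it
extra_materials = 1.00    # bigger board and enclosure
extra_shipping = 0.50     # heavier, larger box to ship

bom_delta = extra_chips + extra_psu + extra_materials + extra_shipping

oem_margin = 1.30         # assumed OEM markup on its costs
retail_margin = 1.25      # assumed retail markup

shelf_delta = bom_delta * oem_margin * retail_margin
print(f"${bom_delta:.2f} extra to build -> ${shelf_delta:.2f} extra on the shelf")
```

With these (invented) margins, $7 of extra build cost becomes over $11 of shelf price, which is the comment's point: cost increases are amplified, not merely passed through.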

      The net effect is that ARM devices will be cheaper to buy and to run, and in the case of portable devices, sleeker too.

      This may not matter to some customers but to other customers lower costs means a lot.

      Performance is an issue. We have ARMs already in the pipe (dual-core ARM A15) which have sufficient power for most uses and ARM will certainly have even faster cores later.

      I see a strong future for ARM in laptops and in home computers. No, not in tower computers but those make up a shrinking part of the market already.

      Finally, as others have said, be careful about agreeing with Steve Jobs. He's a consummate liar. Just because he says he doesn't like touch for the desktop doesn't necessarily mean much. It means Apple doesn't deliver touch on the desktop today, but it doesn't necessarily mean anything more. Apple could flip on this at any time like on the video iPod.

  • Archimedes (Score:5, Interesting)

    by rve ( 4436 ) on Saturday September 24, 2011 @10:11AM (#37501482)

    It was on the desktop first [wikipedia.org]. I was a kid, not terribly good with money, and it was expensive, so I just missed out on being an early adopter.

  • When you can get a quad core smartphone with a halfway decent GPU, who cares? The only real problem is the lack of memory. Dock your phone, use its display for status updates and compute on your TV... or monitor.

    • Re: (Score:2, Insightful)

      The people who run tons of software that is x86 only and has no comparable ARM version? People who do work for which ARM is supremely under-powered even with a quad-core version? Even a low end i5 can blow away the fastest ARM processors. This quad-core version will close the gap some, but it will still be noticeably less performant.

      • by Bert64 ( 520050 )

        Sounds familiar...

        Like the people who run tons of software that is (sparc|alpha|hppa|power|mips) only and has no comparable x86 version? People who do work for which x86 was supremely under-powered even with quad processors? Even a low end Alpha could blow away the fastest x86 processors.

        History repeats itself, attack from below pushes the more powerful, more power hungry and more expensive architectures into small niches...

      • by Arlet ( 29997 ) on Saturday September 24, 2011 @01:06PM (#37502662)

        Making the situation even worse is the fact that there is a complete lack of standardization on the ARM platform, especially for all the peripherals. But even for the core itself there are many different variants. This can be an advantage for embedded developers, because it gives you lots of choices.

        For binary software vendors, it's a nightmare, because they would have to support all these different versions.
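For illustration, even just asking which ARM variant you're on is something a binary vendor has to handle explicitly; a minimal sketch using Python's standard library (the build table and its descriptions are hypothetical):

```python
import platform

def pick_build(machine: str) -> str:
    """Map a kernel-reported machine string to a shipped binary variant."""
    # Hypothetical table: one prebuilt binary per ARM architecture variant.
    builds = {
        "armv5tel": "soft-float build, no NEON",
        "armv6l":   "VFP build",
        "armv7l":   "Thumb-2 + NEON build",
    }
    return builds.get(machine, f"no prebuilt binary for {machine!r}")

# On an actual ARM board this reports e.g. 'armv5tel', 'armv6l' or 'armv7l'.
print(pick_build(platform.machine()))
```

And that only covers the CPU core; the peripheral differences the parent mentions (interrupt controllers, timers, buses) don't show up in the machine string at all.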

    • by lkcl ( 517947 )

      not if it's got 4gb of ECC DDR3 1066mhz RAM, it's not. the NuSmart 2816 is a 1.6 to 2ghz Dual-Core Cortex A9 with two versions - one 32-bit memory addressing and the other 64-bit. they're sampling, now. i'm working to get them plugged in to the EOMA initiative:
      http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]
      http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]

      • Sounds sexy, but I don't see it in a phone yet. Is it? And with 4GB? If so, that would be fantastic.

  • EOMA Initiative (Score:5, Interesting)

    by lkcl ( 517947 ) <lkcl@lkcl.net> on Saturday September 24, 2011 @10:21AM (#37501548) Homepage

    it's a long story, but i've been working to get ARM-powered desktop machines and laptops into the hands of free software developers for some time.

    one of the key problems is that the chinese and taiwanese factories have absolutely no software expertise whatsoever. some guy got caught out by the USA and UK Governments placing embargoes and tariffs on imported clothes a couple of years back: his business was affected, so he decides "i know, i'll diversify, i'll make tablets, those are popular". so off he goes, he gets supplied with a GPL-violating Android OS right from the word "go" by a limited number of Chinese ODMs who are having a really hard time keeping hold of their software engineers, and it just goes downhill from there.

    the other problem is, as can be seen from the insane amount of money spent by the openpandora group, that case-work for laptops etc. can well be in excess of $100,000. that means that anything like the "pegatron netbook" has to be bought in volumes of 250,000 and above in order for the R&D costs to be amortised over a reasonable period.

    this is where the EOMA initiative comes in: http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA [elinux.org]

    by reversing everything on its head, and getting free software developers a modular architecture which _could_ be dropped into a mass-volume product, the tables are turned: those Chinese Factories can be supplied *by us* - Free Software Developers - with a completed ready-to-ship OS.

    so, yes there's a board which is available that is similar in size and function to the pandaboard, origen exynos board, beagleboard, IMX53QSB etc., but unlike those boards, by complying to the EOMA/PCMCIA Open Standard it would be possible to literally drop that hardware-software combination straight into a mass-volume product, with the development effort of the required motherboard being nothing more than a low-cost 2 to 4 layer board that even KiCAD, Eagle or gEDA could do.

    one key part of this strategy is to leverage arduino-like boards, like the LeafLabs Maple:
    http://elinux.org/Embedded_Open_Modular_Architecture/PCMCIA/MiniEngineeringBoard [elinux.org]

    anyway i think that's enough for one slashdot post. bit of background and some additional links, here:
    http://www.openhardwaresummit.org/forum/viewtopic.php?f=5&t=502 [openhardwaresummit.org]

    • I'd think that it would be more sensible to use an existing standard form factor such as M-Atx/Itx for the boards due to the tooling already being available. Simply put, the only issue you'd have is what internal connectors (sata/ide/floppy/firewire/usb) are needed along with the backplane ports, video, usb, ethernet being the most obvious. Then design the board to provide the needed connectivity and be done with it. The main advantage is not reinventing the damn wheel and getting a standard board into the

      • Re:EOMA Initiative (Score:5, Interesting)

        by lkcl ( 517947 ) <lkcl@lkcl.net> on Saturday September 24, 2011 @12:30PM (#37502342) Homepage

        ok - i'm pleased to see a response here: there are several points that are good, and some, when you look closer, turn out to be unrealistic for mass-volume so-called "embedded" products.

        1) the first is form-factor. this is great! yes, one of the options being considered is to have a standard Mini-ATX/ITX motherboard (into which an EOMA/PCMCIA CPU card can be plugged, at the back). there are several embedded companies that produce Mini ATX motherboards as standard, for their "modules", so it is not a new concept, it is in fact a proven one.

        2) the second is connectors / interfaces. if you look right across the board at the very latest ARM processors coming out *right now*, you can count on the fingers of one hand the number of Cortex A8 and Cortex A9 systems (as well as Marvell's "ARM-compatible" range of processors) that have PCI-e.

        i'll say that again.

        the total number of modern ARM processors with even a 1x PCI-e interface is *below* 5 (five).

        now there do exist some Cortex A8s (e.g. the OMAP35xx series) which have a HPI bus, onto which you could put a PCI-e "PHY" chip as it's called, but the total number of companies doing actual PCI-e "PHY" chips is, also, very very limited. typically, any company which has PCI-e PHY interface is a "Fabless Semi" company that gets bought up very very rapidly by the likes of Mentor Graphics, Synopsys and so on.

        3) the third is the sheer overwhelming disparity between the ARM CPU's power consumption and the average PCI-e-based GPU's power consumption. the absolute ABSOLUTE lowest power consumption PCI-e-based GPU i could find is one from SiS, it's an older 65nm CMOS process, and if you ramp its speed down to the absolute lowest it will go without keeling over, it uses 6 watts. SIX watts!! you wanna connect a 6 watt GPU up to a 0.5 to 1.0 watt processor be my guest!

        4) Multi-layer boards at ATX/ITX form-factor are expensive. if you have the CPU on-board the Motherboard (rather than being on a separate card), you then are forced to have the most complex part - the CPU-to-RAM interface - push up the number of layers required for the *whole* motherboard. by contrast, if you do the CPU-plus-RAM as a separate tiny, tiny board, just presenting its interfaces (SATA, ETH, USB, I2C, RGB/TTL etc.) via a simple connector (e.g. PCMCIA 68-pin) then you've just saved a fortune on the cost of the main motherboard because the main motherboard PCB can be done as an ultra-low-cost 4 or even if you're really lucky or a very good designer as a 2 layer board.

        5) The power requirements of standard PCs are 10 to 200x larger than is actually needed! 500 to 1000 watts i mean for fuck's sake that's just insane. these ARM processors, the fastest most powerful one available on the market right now (sampling) is the NuSmart 2816, and that uses _two_ watts (shock horror) at 2ghz. wow big fucking deal. why on god's green earth would you want to match a 2 watt CPU with a 1000 watt Power Supply?? the entire motherboard would probably need a big resistor just to draw enough current in order to convince the PSU that nothing's wrong! i'm not joking about that - i'm dead serious.
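The "big resistor" remark isn't just rhetoric: many older ATX supplies need a minimum load on a rail to regulate properly. A quick Ohm's-law sketch, where the 1 A minimum-load figure is an assumption for illustration:

```python
# Hypothetical minimum-load dummy resistor for an ATX PSU's 5 V rail.
v_rail = 5.0       # volts
i_min = 1.0        # amps of assumed minimum load for stable regulation

r_ohms = v_rail / i_min      # Ohm's law: R = V / I
p_waste_w = v_rail * i_min   # power burned purely as heat

cpu_w = 2.0                  # the NuSmart 2816 figure quoted above
print(f"{r_ohms:.0f} ohm dummy load wastes {p_waste_w:.0f} W "
      f"vs {cpu_w:.0f} W for the CPU itself")
```

Under these assumed numbers the dummy load alone would burn more than twice the power of the 2-watt CPU it's propping up, which is exactly the mismatch the comment is complaining about.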

        6) The level of integration on these so-called ARM "embedded" CPUs is so high that it's really not worth the effort. present a USB bus, present an Ethernet port, present an SATA socket, along with an HDMI out and maybe even VGA, you're done! ship the damn product out the door, it cost you $35 to make! don't believe that price? just look at the cost of the RaspberryPi - it's doable. the irony is that for the $35 of the "upgraded" RaspberryPi you can get an 800mhz Cortex A9 (the single-core AML-8726-M) - same price, same size. credit-card-sized. you have to ask yourself: why would you _want_ to fit a credit-card-sized computer into a 12 x 15 x 8in "Desktop" case?? :) why not fit it into a 4in x 5in x 0.75in box, instead?

        so, whilst on the face of it, fitting into the "standard" - i'm going to go further than that i'm going to

    • Are there any existing ARM PC boards that can take the forthcoming Xilinx Zynq CPUs? Nothing fancy - I want to run an Android desktop on it while I port my PIC embedded industrial controller firmware to Zynq/Android/FPGA. Any PC I can swap a Zynq into now, or that will be available in 2012?

    • Are there any existing ARM PC boards that can take the forthcoming Xilinx Zynq CPUs? Nothing fancy - I want to run an Android desktop on it while I port my PIC embedded industrial controller firmware to Zynq/Android/FPGA. Any PC I can swap a Zynq into now, or that will be available in 2012?

      Or rather, is anyone putting a Zynq [fpgajournal.com] into EOMA? What would it take for my HW lab to do so?

  • Nobody told me and I've been using my N900 the whole time with no problems. Why am I the last to learn these things!?

    • Re: (Score:3, Insightful)

      by RR ( 64484 )
      I was aware of the N900. I still use an N800. I just didn't need an upgrade yet, and I was waiting for step 5 of the 5-step mass market process for Maemo (becoming concerned about bureaucratic interference when Meego became the company's strategy and they decided to dump Debian for Linux Foundation silliness), when the Elop Effect [blogs.com] happened. So, no, I don't think the N900 was successful.
  • by earls ( 1367951 ) on Saturday September 24, 2011 @10:27AM (#37501590)

    Developer only? What is that nonsense? The TrimSlice ships with Ubuntu ready to use. ~$200 for the feature set is a steal, IMO. Not happy without a Dell logo or something? What's the problem with the TrimSlice?

    • by lkcl ( 517947 )

      the first problem is that the cost of the whole system's components is actually, as the RaspberryPi shows, somewhere around the $40 mark. also, if you look closer at the Tegra 250 which is used in the TrimSlice, it doesn't have SATA-II. and as i mentioned in another post here, the number of modern ARM CPUs with PCI-e on the market is less than 5.

      so you can't expand it (ARM systems aren't designed that way), and it happens not to have the _exact_ set of features which make it truly desirable to free softwa

  • by jonsmirl ( 114798 ) on Saturday September 24, 2011 @10:31AM (#37501616) Homepage

    http://www.genesi-usa.com/products/efika [genesi-usa.com]
    Smarttop $129 thin client
    Smartbook $199 laptop

    They run Ubuntu and are based on the Freescale iMX51.
    They are far more powerful than a Raspberry PI.

    Smarttop:
    Freescale i.MX515 (ARM Cortex-A8 800MHz)
    3D Graphics Processing Unit
    WXGA display support (HDMI)
    Multi-format HD video decoder and D1 video encoder (currently not supported by the included software)
    512MB RAM
    8GB Internal SSD
    10/100Mbit/s Ethernet
    802.11 b/g/n WiFi
    SDHC card reader
    2 x USB 2.0 ports
    Audio jack for headset
    Built-in speaker

    Smartbook:
    10.1" TFT-LCD, 16:9 with LED backlight, 1024 x 600 resolution
    Freescale i.MX515 (ARM Cortex-A8 800MHz)
    3D Graphics Processing Unit
    Multi-format High-Definition hardware video decoder
    16GB Nand Flash
    External MMC / SD card slot (up to SD v2.0 and MMC v4.2)
    Internal MicroSD slot
    802.11 b/g/n WiFi (with on/off switch)
    Bluetooth 2.1 + EDR
    2 x USB 2.0 ports
    Phone jack for headset
    Built-in 1.3MP video camera
    Built-in microphone
    Built-in stereo speaker

    • You need to look at those a little closer: 1) the iMX515 has a hard limit of 512MB RAM; 2) if you've never used a 10.1in 1024x600 LCD, you are in for a bit of a shock.

      But yes, the Efika MX is getting an upgrade - soon - to the 1GHz iMX53. And I think Genesi have been doing some "dogfood eating" and have found, just as I told them it would be, that 1024x600 LCDs are completely unusable. Their developer, Matt, treated my ideas like shit (I was approaching them to see if they'd like to collaborate on a fas

  • Comment removed based on user account deletion
    • by lkcl ( 517947 )

      The TrimSlice is $200, is non-upgradeable and has no SATA-II interface (whereas the Raspberry Pi is $25 - a price discrepancy of 85%+, for no good reason). The Genesi products use an iMX515, which has a hard limit of 512MB RAM, and the laptop's maximum LCD resolution is 1024x600, which is completely unusable. Every single product out there right now has hard compromises that make it completely unsatisfactory for at least one market. This is why I'm doing the EOMA/PCMCIA initi

  • by IYagami ( 136831 ) on Saturday September 24, 2011 @10:53AM (#37501736)

    The Toshiba AC100

    You can find a review at http://www.reghardware.com/2010/11/03/review_netbook_toshiba_ac100/ [reghardware.com]

    "The beautifully designed and executed hardware is very close to my ideal netbook, and it's hardly an exaggeration to say that I'm heart-broken by Toshiba's cocked-up Android implementation. The best one can hope for is a firmware rescue from the open source community, although I wonder if the product will stay around long enough in these tablet-obsessed times for that to happen."

    • Or the Eee Pad Transformer which by all accounts is very good

    • by lkcl ( 517947 )

      Unfortunately, this system illustrates why 1024x600 LCDs are undesirable, as does the Genesi laptop with the same-sized screen. Other than the forced installation of Android, total non-upgradeability, inability to take 1GB of RAM and complete lack of an interface for an SATA SSD, the AC100 is actually very good. OK, in case you hadn't noticed, that was supposed to be ironic.

  • The NetWinder [netwinder.org] was based on the DEC/Intel StrongARM 110. They had quite a nice desktop working back in 1999, along with a large developer community [netwinder.org].

  • by CajunArson ( 465943 ) on Saturday September 24, 2011 @11:01AM (#37501778) Journal

    OK, the idea behind ARM is that it is "fast enough" for desktop and notebook PCs. Well, if that's the case, then a P4 is also "fast enough" and you should consider not buying anything newer.

    Why am I saying that? Let's look at one benchmark that *is* multi-core ready and that Nvidia kindly ran on the upcoming Kal-El quad-core systems: Linpack.

    Now I know Linpack is not a perfect benchmark, but it does do a decent job of showing off number-crunching power and it is multi-core capable and there are results from a wide range of architectures.

    Here's a result from a 1.7 Ghz P4 system (see: http://www.roylongbottom.org.uk/linpack%20results.htm [roylongbottom.org.uk])

    CPU        MHz   Opt (MFlops)  Non-Opt (MFlops)
    Pentium 4  1700  382.00        131.59

    I think (but I'm not sure) that Opt means optimized (such as using SSE) and non-Opt is a minimal x86 implementation with no optimizations.

    Now, here are Nvidia's results for its not-yet-on-the-market Kal-El Quad Core ARM at 1.0 Ghz:

    Multi-threaded Linpack: 309 Mflops

    See: http://www.xbitlabs.com/news/mobile/display/20110921142759_Nvidia_Unwraps_Performance_Benchmarks_of_Tegra_3_Kal_El.html [xbitlabs.com]

    I'm going to assume that Nvidia will go out of its way to make sure the code is optimized for benchmarks that it posts as part of a marketing push.

    So a QUAD CORE ARM architecture is still lagging behind a P4. The P4 does have a clock speed advantage, but its performance lead is much smaller than that clock advantage alone would predict, especially considering the Nvidia chip has 4 cores against the P4's single core.

    Now, I'm not saying that Kal-El won't be awesome for use on tablets and smaller devices, but on a desktop or even a notebook, don't go around expecting miraculous performance.

    • P.S. --> Since Slashdot formatting is whack, the above results should show the 1.7 Ghz P4 with a 382 Mflop Linpack score when "optimized" compared to Nvidia's published results of 309 Mflops.

      Mathematically, the P4 has a 70% clockspeed advantage with only a 23.6% performance advantage, but remember how crappy the IPC on the old P4 was, and remember that the P4 is a single core CPU vs. Quad Cores for Nvidia.
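      A quick sanity check of those ratios, using only the figures quoted above (the per-clock numbers are my own arithmetic, not from Nvidia's or Longbottom's pages):

```python
# Linpack figures quoted in this thread:
# P4 @ 1.7 GHz, optimized: 382 MFlops; quad-core Kal-El @ 1.0 GHz: 309 MFlops.
p4_mflops, p4_mhz = 382.0, 1700
kalel_mflops, kalel_mhz = 309.0, 1000

clock_advantage = p4_mhz / kalel_mhz - 1       # P4's clock speed edge
perf_advantage = p4_mflops / kalel_mflops - 1  # P4's Linpack edge

print(f"P4 clock advantage: {clock_advantage:.0%}")    # 70%
print(f"P4 Linpack advantage: {perf_advantage:.1%}")   # 23.6%

# Whole-chip throughput per MHz (remember Kal-El spreads this over 4 cores):
print(f"P4: {p4_mflops / p4_mhz:.3f} MFlops/MHz")      # 0.225
print(f"Kal-El: {kalel_mflops / kalel_mhz:.3f} MFlops/MHz")  # 0.309
```

      Per clock, the whole Kal-El chip actually outruns the P4; the P4's absolute lead comes entirely from its higher clock.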

    • How much power does the 1.7 Ghz P4 system require vs the Kal-El Quad Core ARM at 1.0 Ghz?

    • 2 points to consider:
      1. The P4 uses something like 5-10 times as much power for the task. Considering that these days most people just surf the web and play simple games that can be programmed with Javascript and HTML5, that means ARM is good enough for a large minority of current PC users. It also means ARM is pretty good for a lot of laptops. At our company the software developers run IDEs and compiler on laptops, so beefy x86 CPUs are extremely useful. But everyone else is using a web browser an
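      The perf-per-watt side of point 1 can be roughed out the same way. The wattage figures below are my own ballpark assumptions (a desktop P4 CPU is commonly cited in the 60-70W range, a tablet-class SoC in the low single digits), not measurements from the thread:

```python
# Perf-per-watt sketch under assumed (not measured) power draws.
p4_mflops, p4_watts = 382.0, 65.0       # assumed desktop P4 CPU power
kalel_mflops, kalel_watts = 309.0, 3.0  # assumed Kal-El SoC power

p4_eff = p4_mflops / p4_watts           # ~5.9 MFlops/W
kalel_eff = kalel_mflops / kalel_watts  # ~103 MFlops/W

print(f"ARM efficiency advantage: ~{kalel_eff / p4_eff:.0f}x")  # ~18x
```

      At the chip level the gap comes out far larger than the 5-10x whole-system figure above; the two are compatible once the rest of a desktop system (chipset, drives, PSU losses) is counted in.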
  • ARM got to the desktop years ago (1987, according to wikipedia), as the first computer to use the ARM chip was a desktop computer - the Acorn Archimedes!

    I had one, it was a lovely computer easy to program, and a GUI for in advance of its time.
  • What I want is a desktop Zynq-7000, the ARM A-9 CPU from Xilinx with a large embedded FPGA, running Android. My lab desktop, anyway. I want to port my embedded industrial control PIC code to it, perhaps targeting a soft PIC core in the FPGA (at first, then gradually porting sequential functions to Android processes). A desktop ARM/FPGA would be a great way to use the large universe of desktop apps to get the embedded PC to do what I want, even if I then repackage it as an embedded device (text LCD, minimum

  • by guruevi ( 827432 ) on Saturday September 24, 2011 @11:44AM (#37502032)

    They just won't run MS Office, which is the biggest problem for most office workers. They are indeed currently at the developer and embedded stage. The problem is that occasionally you want a little more horsepower (even if it's just to play Flash games), so people buy a 'normal' computer. There is also no real support available and very little experience among average sysadmins.

    Once somebody starts doing it, the ball will get rolling. Even $200 is not bad, but once the Raspberry Pi runs a browser, e-mail, SSH, VNC, X and OpenOffice, and basically plugs into a display without too much trouble (being embedded into a display would be even better), I will be deploying them in our shared computer spaces, because that's all those spaces are for: connect to the cluster to run your jobs, check your e-mail and Facebook while you're waiting, occasionally copy something from or to a USB stick. All home directories are already on the network (NFS), so I don't really need much storage.

  • by wavedeform ( 561378 ) on Saturday September 24, 2011 @01:55PM (#37503062)
    The Acorn Archimedes: http://en.wikipedia.org/wiki/Acorn_Archimedes [wikipedia.org] This was some sort of outgrowth of the BBC Micro - http://en.wikipedia.org/wiki/BBC_Micro [wikipedia.org]
  • by Alioth ( 221270 ) <no@spam> on Saturday September 24, 2011 @05:30PM (#37504452) Journal

    ARM was originally developed as a desktop CPU, and it was on the desktop - it's been and gone.

    ARM originally stood for Acorn RISC Machine. It was developed by Acorn because they couldn't find an adequate processor to follow on from the 6502. Many of the CISC chips of the time (mid 1980s) had very poor utilization of memory bandwidth and poor interrupt response. Steve Furber, one of the two people who developed the first ARM CPU, pointed out in a recent talk that the National Semi 32016 (IIRC), which they had been considering, had a multiply instruction that took over 100 clock cycles and could not be interrupted.

    They also wanted ARM to be low power, not to make their new line of desktop computers energy efficient particularly, but because they needed it to be cheap so the computers could be affordable. If they could get it under 1 watt, they could use plastic packaging instead of ceramic packaging which reduces the cost by an order of magnitude. They had no tools for estimating power, so they just designed *everything* on the chip for low power. When they got the first samples back from the fab, they were blown away when they found the chip consumed 0.1 watts - they had massively overachieved.

    We had the Acorn Archimedes in school. IIRC, it had an 8MHz ARM and it could emulate - in software - an IBM PC with VGA graphics faster than the original IBM PC ran. That's how much faster the ARM was at the time compared to anything else around. Without needing to be in a ceramic package.
