
Arduino Goes ARM

Soulskill posted more than 2 years ago | from the gil-hamilton-gets-a-new-partner dept.

Hardware Hacking 144

mikejuk writes "The whole world seems to be going in ARM's direction. The latest version of Windows 8 will run on ARM processors, Raspberry Pi is a $25 ARM-based machine, and now the open source Arduino platform has a new member — the ARM-based Arduino Due announced at the Maker Faire in New York. The Due makes use of Atmel's SAM3U ARM-based processor, which supports 32-bit instructions and runs at 96MHz. The Due will have 256KB of Flash, 50KB of SRAM, five SPI buses, two I2C interfaces, five serial ports, 16 12-bit analog inputs and more. This is much more powerful than the current Uno or Mega. However, it's not all gain — the 3.3V operating voltage and the different I/O ports are going to create some compatibility problems. Perhaps Intel should start to worry about the lower end of the processor world."


sweet? (1)

ThorGod (456163) | more than 2 years ago | (#37446796)

I dig this idea!

FTA (bold = my emphasis):

We’re going to be demoing the board and giving away some boards to a selected group of developers who will be invited to shape the platform while it’s being created. After Maker Faire, we will begin selling a small batch of Developer Edition boards on the Arduino store (store.arduino.cc) for members of the community who want to join the development effort. We plan a final and tested release by the end of 2011.

I can't wait to experiment with these puppies.

Re:sweet? (1)

hairyfeet (841228) | more than 2 years ago | (#37447840)

While I agree that having more ARM stuff for DIY is always of the cool, I really do wish the blogs and newspapers would quit this "It's the death of x86!" crap. It would be like saying "This killer new moped will get rid of trucks 4 evar!" Doesn't that sound dumb? Same thing here.

x86 has a place, ARM has a place. When you need to do some heavy number crunching, or want huge detailed 3D worlds, or need to deal with a crapload of data? Then x86 is your guy. Need to be ultra mobile, where every microwatt counts? Need something light and rugged that doesn't need fans, or mil-spec toughness for harsh conditions? Then ARM is your baby.

As a retailer I can tell you why sales of x86 have gone down: it's because for most folks what they have is more than good enough for the tasks they are doing. Just going to FB, checking webmail, and web surfing? Then frankly dual cores are overkill and anything above that is EXTREME overkill. Tablets are selling because they are new, they are hip, they are lightweight, and most importantly they don't have to compete with literally hundreds of millions of models and makes already built. Most folks now have at least two desktops, sometimes more, so unless another "killer app" comes along that makes those too weak to keep up, naturally growth will slow down. It's just that the OEMs got spoiled during the MHz wars, when changing your desktop every three years was pretty much required to get anything done.

But now things have settled down and each will have their niche. The x86 will be in the desktop at home and in the office, as well as netbooks and laptops when people need more power than they do battery life. ARM will have the cell phones, tablets, and embedded applications. Both have great strengths, both have great weaknesses, so can't we all just....get along?

Re:sweet? (-1)

Anonymous Coward | more than 2 years ago | (#37447936)

Fuck you you piece of shit part switcher. Nobody gives a shit what you think.

Re:sweet? (2)

Zerth (26112) | more than 2 years ago | (#37448152)

Um, this ARM chip is replacing an AVR microcontroller. No x86 here.

Re:sweet? (1)

hairyfeet (841228) | more than 2 years ago | (#37448630)

Nobody said there was. What I said is that every time some new ARM something comes out, the blogs explode with "It's the death of x86!" articles.

As someone who has actually helped my local college get into Arduino and FOSS (they do some cool rocketry stuff that little board fits nicely into), it is easy to see where each chip fits in: x86 for coding and processing the data, ARM for the ruggedness and power sipping while collecting the data.

Why all the blogs and journals have to act like this is a death match when the two go together like PB&J makes no sense, which is what I pointed out. Just let each chip do what it does best and let the market decide. I bet you'll see things stay pretty much as they are now, with x86 doing the number crunching and ARM doing the mobile jobs.

Re:sweet? (1)

element-o.p. (939033) | more than 2 years ago | (#37450008)

...they do some cool rocketry stuff which that little board fits nicely into...

Could I twist your arm into sharing? Pretty please? I've been looking at Arduino and Raspberry Pi for a few projects along those lines, and I'm really, really curious what others are doing (and how, of course).

Dual core is still really useful (0)

billstewart (78916) | more than 2 years ago | (#37448264)

If you're running Windows and one of your processes explodes, like Firefox seems to almost daily, and starts trying to burn up the entire CPU, it used to be a real problem, because the user interface became unresponsive. With dual-core, what usually happens is that one core gets burned up by Firefox, but the other one's available for other processes, like driving your UI, running mail, or killing Firefox.

Re:Dual core is still really useful (1)

optimism (2183618) | more than 2 years ago | (#37448786)

If you're running Windows and one of your processes explodes, like Firefox seems to almost daily, and starts trying to burn up the entire CPU, it used to be a real problem, because the user interface became unresponsive.

We run several single-core "home theater" machines for web browsing & videos, all running XPSP3 and Firefox, and I can't remember when we last had a hang like that.

Perhaps last winter or spring, and almost certainly caused by Flash, not Firefox.

Anyway, Ctrl-Alt-Del and force quit is a quick&easy solution. No need for multiple cores.

Re:Firefox crashing (1)

billstewart (78916) | more than 2 years ago | (#37449722)

It's probably Javascript. I'd normally blame Flash, but that's generally wrapped up in plugin-container.exe these days, and the CPU usage shows up under Firefox.

Re:Dual core is still really useful (1)

k31 (98145) | more than 2 years ago | (#37449926)

You are both right.

Flash (which runs within Firefox) can cause this behaviour, and it can make Celerons and such unresponsive. Single cores with hyperthreading and AMD-based machines seem less affected. Must be a Flash/Windows quirk.

Even Ctrl-Alt-Del may fail to get a response in a timely manner, if a process "explodes".

However, multiple cores are really more useful for transcoding and other "geeky" stuff. As more people rip their DVDs to save them from the scratches of their kids or the clumsiness of their friends, or just common wear and tear, those cores will keep getting more important.

Of course, there are business reasons for using them, too....

Re:Dual core is still really useful (1)

optimism (2183618) | more than 2 years ago | (#37450194)

However, multiple cores are really more useful for transcoding and other "geeky" stuff. As more people rip their DVDs to save them from the scratches of their kids or the clumsiness of their friends, or just common wear and tear, those cores will keep getting more important.

Of course, there are business reasons for using them, too....

Agree with the first sentence. Video transcoding to H264 is the most processor-intensive task that any consumer is likely to throw at their CPU today.

As for ripping DVDs, a Pentium III can do this, no sweat. It's an easy job for pretty much every consumer PC made in the last decade. Transcoding to "modern" codecs is the hard part.

Business reasons? Not for the vast majority of business users. Unless they are forced to use a Microsoft OS (eg Vista, Win 7) that is ~designed~ to overburden the hardware with no added benefit to the user.

Re:sweet? (2)

optimism (2183618) | more than 2 years ago | (#37448606)

X86 has a place, ARM has a place. When you need to do some heavy number crunching, or want huge detailed 3D worlds, or need to deal with a crapload of data? Then x86 is your guy.

If you have no idea what you need the processor to crunch, then yeah, a high-power general-purpose CPU is "your guy".

But if you think about it beforehand and realize that your device needs to do 3D rendering, or H264 video decompression, or any of a number of common high-horsepower tasks, you can augment the CPU with custom silicon that beats the stuffing out of a general CPU, with less power.

That is exactly what the current crop of handheld devices do. They have special-purpose silicon to do the heavy lifting for these tasks. Hence core CPU horsepower is less relevant for the vast majority of usage.

Re:sweet? (1)

Truekaiser (724672) | more than 2 years ago | (#37449948)

You're a tad bit off. The Cortex-A8 has the same performance as the Pentium III line. Considering the number of people who buy tablets, I would say that is just about all the computing power many people actually need. With the Cortex-A9 MP and A15 MP coming out, the performance of an ARM chip will be at parity, or damn close to it, with x86 on just about everything BUT gaming. In fact, looking at the OMAP4 (Texas Instruments) version of the A9, which is used widely today, and the future OMAP5, which will have the ability to use SATA 3 Gbps and DDR3 along with dual GPU graphics, I will go out on a limb here and state that, with the exception of the enthusiast gaming market, an OMAP5 will be a functional replacement for an x86 gaming rig.

“I can’t wait to experiment with these (0)

Anonymous Coward | more than 2 years ago | (#37448680)

That's what Sophie Wilson said.

(with apologies to the awesome Ms Wilson)

Atmel (0)

Anonymous Coward | more than 2 years ago | (#37446878)

Atmel is the manufacturer of the ARM processor for the Due, not Amtel. The misspelling is propagated from the original article.

For the uninformed like me.... (1)

allaunjsilverfox2 (882195) | more than 2 years ago | (#37446942)

Can someone explain the difference here? I mean, will this allow bigger / better projects? I'm not a programmer and only vaguely familiar with some of the projects done with Arduino. I love the idea of an upgrade as much as the next person, but I was wondering if someone could give me some context on what can be achieved here that couldn't be before?

Re:For the uninformed like me.... (1)

chispito (1870390) | more than 2 years ago | (#37447072)

They will have more:
processing power
RAM
program space
I/Os

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37447168)

And the ARM is a much cleaner architecture, so you don't have to mess around with stuff like putting strings in program memory and using special functions to access them, etc.

I can understand the 3.3V might be a problem, but I can't see why I/O would become a problem. The packages you can get ARMs in have so many more pins, it must be possible to come up with something that is compatible.

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37447762)

Is it 3.3V logic levels, or just that the chip runs on 3.3V? I'd imagine a little voltage divider would work for most things, no?

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37449904)

Is it 3.3V logic levels, or just that the chip runs on 3.3V? I'd imagine a little voltage divider would work for most things, no?

Yes, it would, at least for the inputs, which do not seem to be "5 Volt tolerant". The core voltage is 1.8 Volt for this microcontroller, and the I/O voltage may be 1.8 or 3.3 Volt, with a maximum of 3.3 + 0.3 Volt = 3.6 Volt. So a simple resistor divider is needed, although sometimes a 1K series resistor will suffice on many Atmel ARM controllers. For output signals you need receiving inputs with a Vih of at most 3.0 Volt; many HCT chips will manage that, and sometimes a weak pullup to 5V is all that is needed.
Most peripheral chips nowadays will work with 3V3 I/O anyway.

Re:For the uninformed like me.... (1)

petermgreen (876956) | more than 2 years ago | (#37450624)

I can't see why I/O would become a problem

Compatibility between 5V logic and 3.3V logic is a somewhat complex area. Some 5V devices can accept 3.3V signals and some 3.3V devices are tolerant of 5V signals, but many aren't (on both sides). So much of the time you end up having to build level translation circuits to interface 5V stuff to 3.3V stuff.

A further complication is that most I/O on microcontrollers has independent direction control for every pin. So if you want to make a fully general level translation circuit you have to 1: find a bidirectional buffer chip with individual direction lines (or build the equivalent of one out of tristate buffers) and 2: find enough I/O to control the direction of all those buffers. It's doable, but I'm not sure if it's doable at a cost (in both size and money) that would be acceptable to the Arduino guys. I certainly don't see any evidence of it in the photo in TFA.

Re:For the uninformed like me.... (3)

LoRdTAW (99712) | more than 2 years ago | (#37447190)

This isn't a replacement for the 8-bit ATmega-based Arduinos but a step up for those looking for more processing power. The Arduino toolkit and IDE are very user friendly, but the ATmega has its limits. Very small ROM and RAM is one problem; the others are low clock speeds and its 8-bit architecture. One could make a pretty powerful robotics controller or even a game console out of these boards.

A port of the Arduino toolkit to ARM had long been talked about and wanted by many. There are attempts like the Maple, but this is an official ARM Arduino board with official support in the Arduino toolchain.

All I know is I am glad that this has finally arrived. For a while I was using the mbed for playing with ARM project stuff. A great little development board, but it lacked an open and offline compiler. They also left out a lot of the I/O due to the DIP nature of the board. The one thing the mbed looks to have over the ARM Arduino is Ethernet, which the SAM3U appears to lack.

Re:For the uninformed like me.... (1)

stewbee (1019450) | more than 2 years ago | (#37447220)

Some things that come to mind....

The ADC samples with more bits (12 bits vs 8 bits). This would be important to someone who cared about getting better dynamic range from an analog signal; it is roughly a 6 dB improvement per bit (effectively a bit less when considering non-ideal things such as clock jitter). Possible applications would be software-defined radios.

Clock speed is faster than current Arduinos'. If you were running something that was computationally intense and had a small window to complete the computation, this would be beneficial. A project that I am working on now could benefit from this: I would like to do FFTs in a really short time span. This would help here.

While I am not familiar with the exact architecture here, if it has 32-bit address fields, it probably has 32-bit computational registers too. Again, this could be beneficial if you need precise computations in your application. When doing something like FFTs or digital filtering, round-off errors can be problematic with smaller registers. In general, more bits means less round-off error.

These would be the first things that come to mind for my applications. I am sure others will provide more, especially if they are more familiar with the architecture.

Re:For the uninformed like me.... (2)

Andy Dodd (701) | more than 2 years ago | (#37447490)

It does have 32-bit registers.

An example of a benefit of this is softPWM implementations: on AVRs, softPWM with greater than 8 bits of resolution is a real bitch, because once you go above 8 bits the mathematics slow down a LOT. I worked around this once by having a fast 8-bit PWM loop that was dithered every PWM cycle by an outer sigma-delta modulator loop. I would've been able to do straight softPWM a lot easier with 32-bit registers.

It's also clocked at 96 MHz, significantly higher than the 16 MHz of the MegaXX8s that Arduinos normally use.

Also, the ARM Cortex-M3 architecture has some nifty tricks for modifying individual bits of each register or memory location. With AVR, modifying a single bit requires a read-modify-write operation. In the CM3, each register and memory location is aliased to 8 memory locations, each representing one bit of the aliased location. If the Arduino IDE takes advantage of this trick, it means that you won't pay the heavy I/O penalties Arduino is (or at least was) notorious for.

Re:For the uninformed like me.... (1)

Anonymous Coward | more than 2 years ago | (#37450358)

With AVR, modifying a single bit requires a read-modify-write operation. In the CM3, each register and memory location is aliased to 8 memory locations, each representing one bit of the aliased location. If the Arduino IDE takes advantage of this trick, it means that you won't pay the heavy I/O penalties Arduino is (or at least was) notorious for.

That was entirely a function of Arduino software suckage, not AVR. The AVR instruction set has "sbi" and "cbi" instructions to set/clear a single bit in an I/O register. They take 2 cycles to execute on most AVRs, or one cycle on the newer faster ones.

I don't remember the exact details because it was years ago, but I looked into why Arduino was so bad and there was some kind of screwy ideological resistance to structuring their software stack so they could take advantage of lightweight IO on AVR. My overall impression of Arduino is that it's cool in that it enables some people to hack around with embedded CPUs who otherwise would face too sharp a learning curve to do anything useful in a short time, but oversold as the One True Way and also poorly engineered in some ways. YMMV

Re:For the uninformed like me.... (1)

rasmusbr (2186518) | more than 2 years ago | (#37447712)

The TL;DR of this post is basically self-driving robot: 8-bit MCU. Self-flying robot: 32-bit MCU. Don't mess up the analog stuff.

Of course the ARM has 32-bit registers everywhere. That's kind of the point. It also has 32-bit arithmetic units, including division I presume (but don't quote me on that). 32-bit arithmetic units will come in handy in a lot of advanced applications such as signal processing and control systems, where you're often working with more than 8 significant bits of data. (Caveat: as always, the analog input has to have a good enough SNR, and the input has to be low-pass filtered by an analog filter before sampling in order to avoid aliasing.)

The 8-bit AVRs only do 8-bit addition, subtraction and multiplication. Division by powers of two can be done rather easily by shifting data; other division has to be done by some kind of program. Programs tend to get large quickly when you implement a lot of processing.

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37448618)

ARM certainly has 32-bit registers. However, unlike the AVR (ATmega), it only has 16 of them.

The big difference to me appears to be DMA. CPU cycles are quite expensive on such slow chips, so the ability to bypass the CPU and do direct memory access is especially valuable here. The Atmel had 4 DMA channels; this ARM will have 23.

Furthermore, with this CPU comes USB 2 support. With all those I/Os, it would really be painful to try to shuffle the (processed) input out over a USB 1.1, 12 Mbps connection.

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37447314)

As a long-term embedded guy, a couple of things.

32-bit microcontrollers, especially ARM-based microcontrollers, are seriously squeezing the price point of higher-end 8-bit micros while offering an order of magnitude more power. The remaining advantages of 8-bit micros are lower power and simpler bare-metal development and toolchains. Small to medium power devices are much less expensive, which is important when you are building tens of thousands of units, and the development tools are traditionally cheaper.

However, for something like Arduino, I'm not sure that's really a win. With an ARM you can potentially develop a set of libraries and a toolchain/dev system that is install-and-go, but that offers good performance. Also, one other thing: Ethernet support. You don't get that with AVRs.

Re:For the uninformed like me.... (1)

Darinbob (1142669) | more than 2 years ago | (#37448332)

ARM is a full modern von Neumann 32-bit CPU (though maybe this version is stripped down w/o an MMU). AVR is a very small 8-bit Harvard architecture. (Harvard architecture confuses things since the instruction word size can be bigger than the memory word size, thus 8-bit registers but instructions are 16-bit.) The ARM version also has a much larger chunk of RAM, whereas the AVR was extremely limited (not sure what Arduino used, but some versions only have 256 bytes total).

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37449060)

Harvard architecture confuses things since the instruction word size can be bigger than the memory word size, thus 8-bit registers but instructions are 16-bit

I'm not sure what things are confused, but this sentence sure confused me. A von-Neumann processor could have 8-bit registers and 16-bit instructions. It would negate some of the efficiencies gained over a Harvard architecture, but wouldn't make it into one.

What Arduinos are about (2)

billstewart (78916) | more than 2 years ago | (#37448766)

After we're done slashdotting arduino.cc [arduino.cc], go take a look around. Arduino makes an open hardware and software design for an 8-bit microcontroller board with a bunch of pins for analog and digital input and output, with a friendly C-based integrated development environment. Even if you're an artist and not an electronics engineer, it's a friendly easy-learning-curve environment for building electronics that respond to sensors, taking technology that used to be opaque magic and turning it into transparent crafts you can understand.

Typical kinds of things people do with Arduinos are blink LEDs, use all sorts of input sensors for distance or temperature to control blinking LEDs, move servos or other motors, build simple robots, sew them into clothing so you can blink LEDs in time to music or when you wave your arms, turn on your lawn sprinklers when your plants are dry, that kind of stuff.

What this new release does is two main things - there's one new 8-bit board that's simpler, cheaper, and a bit more powerful, and there's another new board that has a 32-bit CPU and a lot more sensor I/O. The 8-bit designs are a somewhat limited programming environment (which is enough for a lot of things, and can be an intellectual puzzle if you like that sort of thing). The 32-bit design will let you do much more powerful projects, which may be especially useful for music or video, and it's still cheap and friendly.

Re:For the uninformed like me.... (0)

Anonymous Coward | more than 2 years ago | (#37449490)

I mean, will this allow bigger / better projects?

Yeah, both. It's exciting.

Audio and video are going to get better in people's projects as a result. Bit-banging external RAM, probably USB, and a lot of other interfaces without extra hardware between them and the Arduino.

People have been hooking up SD cards to Arduinos for a while, so it won't affect storage size much. But projects will be bigger, meaning a project that does several things instead of one or two.

One of the limitations I ran into a long time ago was controlling a 24-bit color TFT display directly that didn't have an interface IC. The only solution I had before was to build an interface between the Arduino and display using a CPLD. The CPLD handles video buffer RAM, generates H/V sync, and provides a drawing interface to the Arduino. Those extra parts and complexity won't be a bare necessity anymore. The new Arduino could do it through sheer processing power (software drawing).

Another problem was audio input and processing. The older ones didn't have a fast enough ADC, and very little could be done to an audio signal. For example, taking 16-bit stereo audio and converting it to PCM in real time wasn't possible before. Now, things like wireless audio are easier.

Re:For the uninformed like me.... (1)

HappyPsycho (1724746) | more than 2 years ago | (#37449732)

The biggest difference will most likely come from RAM. CPU speed will help with number crunching for some algorithms, but going from 2K to 50K of RAM will actually allow you to run a proper TCP/IP stack, store the image from a camera so you can process it immediately or save it for later use, or keep decent-sized data sets in memory for machine-learning algorithms.

It's a very good candidate for a central controller for a project. I doubt the chips will be as cheap to get as the ATmega chips, though.

Encouraging Overkill (3, Informative)

MrOctogon (865301) | more than 2 years ago | (#37446948)

As somebody who is looking for a more powerful prototyping platform on the cheap I look forward to this. But I would not use it for a majority of my hobby projects, which do not need a lot of this power.

Most Arduino projects only use a few I/O pins and very little processing power. Many hobby projects could be made with a much weaker PIC processor, and many could get by on the basic 8-pin PICs. Many people don't know that the simpler solutions exist, because they only see Arduino stuff all over the web. The full development board is way overkill.

Additionally, with current Arduino setups, it is fairly simple to make a clone around an ATmega chip. All parts are easily soldered through-hole, and the schematic is easy. With a 32-bit surface-mount chip, the schematic gets complex enough that most hobbyists will be scared off by the hard soldering and the crazy layouts. The open source, easy-to-clone nature of Arduino that made it what it is today is incompatible with the new high-end boards, and people will have to pay more for the official dev boards, or get something professionally fabbed.

Re:Encouraging Overkill (1)

drolli (522659) | more than 2 years ago | (#37447150)

Many, maybe most, but as soon as you want to do something which requires a little signal processing, you appreciate more computational power than the AVRs provide.

Re:Encouraging Overkill (1)

Viewsonic (584922) | more than 2 years ago | (#37447186)

Unfortunately, many of the sensors take up a whole bunch of pins. You get a simple LCD and over half your pins are used, depending on the model. This becomes an issue when you're trying to make something like a mobile robot with a touch LCD screen. Then, if you want fully articulated arms, you're going to need more pins for each servo. You could always offload them to their own processors, but I think there is a market for these uber-Arduinos that will let someone not worry about how many pins they have left.

I don't think this will harm your normal Arduino base. They are cheap ($6 for the chip itself) and will work for MOST projects people are doing.

There is.... (0)

Anonymous Coward | more than 2 years ago | (#37450080)

An I2C solution to every problem. My ideal Arduino would have 4 pins, 2 of which would be for the supply.

Re:Encouraging Overkill (0)

Anonymous Coward | more than 2 years ago | (#37447312)

Most arduino projects only use a few I/O pins and very little processing power. Many hobby projects could be made with a much weaker pic processor, and many could get by on the basic 8 pin pics. Many people don't know that the simpler solutions exist, because they only see arduino stuff all over the web. The full development board is way overkill.

PICs will do, yes, but there's also a whole slew of low-spec AVRs, keeping the same assembler (and compiler, if you're doing it that way) as the ATmega. I'd be a lot happier about the Arduino craze if, while killing off the PIC hobbyist sector (as it is), it got people implementing low-complexity projects on ATtinys instead. But it seems to make all beginning hobbyists want to throw an entire dev board into each project, which is just ridiculous.

Re:Encouraging Overkill (0)

Anonymous Coward | more than 2 years ago | (#37448154)

I'm seeing an increase in interest in the ATtiny chips. Oddly enough, I think it has everything to do with the cost of an Arduino dev board. $20-30 is a very inexpensive prototyping setup, but people can't afford to lose a whole board every time they make a little project.

My experience with them, using the MIT cores, was fantastic (aside from a small clock issue). They're very cheap, and I can even use the Uno as an ISP for the ATtinys. It works out great.

Re:Encouraging Overkill (1)

optimism (2183618) | more than 2 years ago | (#37449508)

$20-30 is a very inexpensive prototyping setup, but people can't afford to lose a whole board every time they make a little project

Why would they lose the board?

You can use the prototyping board to ~prototype~, then for the finished project use a bare ATmega328 with a couple of filter caps, a reset pullup resistor, and some headers. Maybe also an external crystal and caps if you need very fast timing (eg 250kbps serial). Total cost: $5 or less. Add a couple bucks if you want to buy ATmega328s pre-flashed with the Arduino bootloader instead of flashing them yourself.

One of the beautiful things about the Arduino ecosystem is that there are plenty of options available to hit your cost/effort target. Everything from the $30 Uno boards, to the $20 Pro SMD boards, to $10 DIY with preflashed chips and prefab PCBs (eg Boarduino), to $5 or less DIY with your own PCB or even dead-bug style.

The cost depends on how lazy you are and/or how many units you plan to make.

Re:Encouraging Overkill (0)

Anonymous Coward | more than 2 years ago | (#37450614)

You're making the same point I am. People use dev boards in finished projects all the time. But more and more, I'm seeing people use more appropriate chips and get their projects off the relatively expensive development boards. It's a good thing, and I think it's largely driven by cost... despite the fact that the Arduino boards are very reasonably priced.

Re:Encouraging Overkill (2)

Doc Ruby (173196) | more than 2 years ago | (#37448408)

Is there a tool that recompiles PIC code into Arduino code that either "just works" or takes only about 10% extra hand-retooling time by a skilled PIC/Arduino developer to finish porting?

Re:Encouraging Overkill (1)

Osgeld (1900440) | more than 2 years ago | (#37449968)

Just use an ATtiny, problem solved.

Re:Encouraging Overkill (1)

Doc Ruby (173196) | more than 2 years ago | (#37450368)

So if I have a lot of PIC code, and I want to add some features I can get already implemented in Arduino code, I can use an ATtiny chip and it will run both?

I don't think an ATtiny has the power to run my PIC18F code, let alone adding Arduino code to that.

Re:Encouraging Overkill (1)

Osgeld (1900440) | more than 2 years ago | (#37450790)

I totally lost you. I thought you were asking for a PIC to Arduino converter; I say fuck the PIC code and just use the Arduino software with an ATtiny. But now I have no idea what you're talking about.

Leonardo 32U4 vs. Smaller Atmel Chips (1)

billstewart (78916) | more than 2 years ago | (#37448568)

Arduino's current Uno design burns an entire ATmega chip converting USB to less-useful serial, matching what they used to do with a dedicated serial chip, and it's difficult to really use the USB through that interface. The new Arduino Leonardo follows on from that by using one of Atmel's newer chips that does USB functions as well as general processing on one chip, which makes it easier to do a lot of the functions they mentioned, like keyboard and mouse protocols. The catch is that all of Atmel's AVR chips with USB functions come in surface-mount packages, unlike the friendly DIPs the 168 and 328 used, or 8-pin DIPs like the 45 and 85. It's a much cleaner and more efficient design, but I've got mixed feelings about the loss of user-accessibility compared to popping a chip into the Arduino and reprogramming it.

Re:Encouraging Overkill (0)

Anonymous Coward | more than 2 years ago | (#37448762)

The oft-repeated "through-hole is easier" claim. Fine solder, a thin tip, and tweezers (or paste and a cheap eBay hot-air rework blower) make surface mount easier once you switch, and it keeps getting cheaper as through-hole parts are retired.

ok (1)

Osgeld (1900440) | more than 2 years ago | (#37446952)

There are dozens of ARM boards, some for as little as 10 bucks (ST, for example). I just don't have faith in this product.

Re:ok (1)

Doc Ruby (173196) | more than 2 years ago | (#37448436)

Because?

Re:ok (1)

Osgeld (1900440) | more than 2 years ago | (#37449436)

Because knowing the Arduino team, this is going to be over the $50 mark when it comes out; there is no indication that the chip is 5 V tolerant, and all the current 3rd-party shields are 5 V; the software is totally different, and once it's lumped into the Arduino crowd it's going to be a nightmare for libraries if not handled correctly; and frankly it does not have enough RAM to do anything except be an overkill bit twiddler.

Meanwhile, for 40 bucks you can get a PIC32 that will run BSD, or a half dozen other setups that run eLua, which is hands down 100x easier than Arduino wants to be. I like Arduino, I think it's great, but for 32-bit there are better solutions than shoehorning Arduino into it.

Why? (1)

Dunbal (464142) | more than 2 years ago | (#37446986)

Perhaps Intel should start to worry about the lower end of the processor world.

This has always belonged to the likes of Motorola and others. Comparing ARM to x86 is comparing apples and oranges. At some point people will wake up and realize that after reading a book and playing Angry Birds, tablets are rather limited. I know a lot of people suffering from buyer's remorse after buying tablets. They have their uses and their niche market, but once you make a tablet do everything a laptop or desktop does, you end up with a (bad) laptop or desktop.

Re:Why? (1)

bucky0 (229117) | more than 2 years ago | (#37447330)

So, this is me not knowing the field very well, but why wouldn't Intel need to be worried about these devices? A significant part of the x86 market is people running simple word-processing and web-browsing applications that don't demand a lot of CPU power.

Additionally (and it's probably worth a different thread), why doesn't Intel just release ARM processors? If Microsoft is releasing an ARM port of Windows 8, theoretically a lot of people (at least the big guys) will be porting their applications to ARM as well.

ocean spurned (1)

epine (68316) | more than 2 years ago | (#37447650)

why doesn't intel just release ARM processors?

Intel sold its XScale line to Marvell back in 2006. Is that what you mean? Besides, Itanium has better predication.

I recall maybe ten years ago people talking about how everything was going from C to just about anything else. But the reality is that C/C++ is still on the ground floor doing the heavy lifting.

I suppose if you're stranded on a ship in the doldrums, you shake your fist at the sky above rather than the ocean beneath, which makes perfect sense until someone follows up with the attention grabbing headline "Ocean Spurned".

I've used the SAM7XC512 before which was a pretty fat SOC with 128KB SRAM. This chip stacks up nicely (twice the speed, half the SRAM) and it really does have 5 SPI ports on the data sheet.

Re:Why? (0)

Anonymous Coward | more than 2 years ago | (#37447708)

Well, you do have to license the ARM core to release those chips.

Having said that, Intel used to do just this: it acquired the StrongARM business from DEC, and only sold it relatively recently (to Marvell).

Re:Why? (1)

Darinbob (1142669) | more than 2 years ago | (#37448458)

The comment about Intel is entirely unrelated to this Arduino version. It's still vastly too small to be used in a tablet or smartphone. ARM can be used for something much more powerful but not this particular board. You might find this CPU as just a peripheral on a full tablet.

Intel already does higher end ARM chips of the type you'd expect to see in a tablet. Intel is a big company, they don't only do x86 chips and PC motherboards.

Re:Why? (0)

Anonymous Coward | more than 2 years ago | (#37448050)

Tablets are great for consumers. People who make/modify things will still need PCs for a very long time to come.

Re:Why? (1)

optimism (2183618) | more than 2 years ago | (#37449676)

Perhaps Intel should start to worry about the lower end of the processor world.

This has always belonged to the likes of Motorola and others.

Except...an entire generation of engineers cut their teeth on the Intel 8051.

Does Intel still collect licensing fees from other manufacturers who used 8051 intellectual property? I dunno. But they definitely were a player in this space, directly or indirectly, at one point.

Raspberry Pi (1)

gregfortune (313889) | more than 2 years ago | (#37447022)

I don't believe Raspberry Pi is available yet (see http://www.raspberrypi.org/faqs/ [raspberrypi.org], first line), but I'm seriously considering picking one up when it's available. Arduino is definitely an amazing platform, but I just don't need something quite that low level. I'm currently toying with turning one of those cheap $10 "cameras" into a wireless surveillance cam. As far as ARM goes, I'm not deep enough into the hardware to care either way anyway.

RISC strikes again (0)

Anonymous Coward | more than 2 years ago | (#37447034)

I miss my Motorola 68k.

Re:RISC strikes again (0)

Anonymous Coward | more than 2 years ago | (#37447178)

I'm still waiting for Apple to go back to PowerPC...

Re:RISC strikes again (0)

Anonymous Coward | more than 2 years ago | (#37447320)

6502 FTW!

Re:RISC strikes again (1)

alexo (9335) | more than 2 years ago | (#37448136)

6502 FTW!

Personally, I found the 6809 more compelling at the time.

Re:RISC strikes again (0)

Anonymous Coward | more than 2 years ago | (#37450390)

I have here in my closet a fully functioning Commodore SuperPET, a dual 6502/6809. It switches to 6502 mode for IO and then back to 6809 for the operating system, Microware OS9. Maybe I should donate it to a museum.

Very nice, ideal for a lot of small projects. (0)

Anonymous Coward | more than 2 years ago | (#37447100)

This is very good news for the hobbyist community. While there are already a few ARM-based boards out there (LeafLabs Maple), they do not have a large community and are not compatible with all the Arduino stuff out there. An ARM-based Arduino fills the gap nicely between the small Arduinos and much more powerful boards like the BeagleBoard, which have very different power and space requirements.

I am very interested in what kind of operating system the Arduino team is using for the Due. While it makes sense to equip the smaller boards with only a bootloader, the Due could run a small realtime OS with networking, threads, and a file system, which would make larger applications easy to put together. Does anybody have information on the software running on the Due?

Re:Very nice, ideal for a lot of small projects. (1)

Osgeld (1900440) | more than 2 years ago | (#37447572)

It's just the IDE and wrapper functions at the moment. Besides, there is not really enough RAM on these to do much with an RTOS; maybe if it were doubled you could run BSD on it like the PIC32, or even Lua, but with 50-ish K, good luck.

And Intel isn't happy about it. (1)

gstrickler (920733) | more than 2 years ago | (#37447142)

Intel is making plenty of money, but they definitely see ARM as a long-term threat, which is part of the reason they've been focused on power consumption and performance per watt for the last 5 years.

Re:And Intel isn't happy about it. (1)

Arlet (29997) | more than 2 years ago | (#37447332)

Of course, this project uses a low-end ARM. It is far away from any market Intel is interested in.

The higher-end ARMs used in tablets and phones are monsters compared to this one, and not really interesting for hobby use anyway. They are usually only available in BGA packages and depend on external memory chips.

"Should start to worry"? (1)

goose-incarnated (1145029) | more than 2 years ago | (#37447164)

Really? They should start to worry only now? If I were them, I'd be right in the middle of worrying, not simply starting to do it.

Re:"Should start to worry"? (3, Insightful)

dbc (135354) | more than 2 years ago | (#37448774)

Sure, and Intel has been worrying for over 15 years. But here is the thing... the #1 thing that matters at Intel is gross margin per wafer. Intel fills its fabs, and runs them throttle to the firewall 24x7. Every project is ranked by gross margin per wafer... fall below the cut-off line, and you either buy fab from somebody like TSMC, or go find a different project to work on. The Intel Atom is a successful attempt to create a power efficient part that meets the gross margin per wafer test. Go look at the margins of the ARM makers. I'll bet it doesn't match Intel's.

I overheard a very interesting+insightful conversation among vendors at the ARMTech conference a year or so ago. "We are all just vendors of value-added flash. Look at the die photos. It's a whole lot of flash memory, with a little bit of logic around the margins for a processor and peripherals in order to differentiate our flash from the other guys' flash and add some value."

Intel is doing what makes business sense for Intel. But they are watching. And Intel, big as it is, can turn on a dime, and has enough fab capacity to pave over with silicon any competitor that gets in its boresight. That said, in the space where I work (embedded), ARM is taking over the world. It makes zero sense to use an 8-bit microcontroller just about anywhere anymore, when you can get an ARM in the same size package at nearly the same cost. Since flash dominates the die area in a microcontroller, 8-bit versus 32-bit logic is noise; it has less cost impact than the package. There are a lot of Cortex-M3 parts in 48-pin packages now that cost only slightly more than 8-bit parts. (I should point out that there is a huge difference between an ARM Cortex-M3 and a Cortex-A9, for instance an MMU.)

In the end, it comes down to MIPS and MFLOPS, and the die area and power required to do that much computation. When an ARM has enough functional units to match the MIPS and MFLOPS of an x86, it will take as much die area and power. At the complexity level of a Pentium IV, the added ugliness of the x86 instruction set is pretty much noise in the total die area and power. (In a past life I was an instruction decode and pipeline control logic design specialist; I can tell you that x86 instruction decode is as ugly as it comes, and in the age of out-of-order execution that almost doesn't matter, except that because of all that ugliness x86 code is freakishly dense, which means the same size I-cache holds a lot more useful code. When you toss in the fact that the ugliness also guarantees employment for instruction decode specialists, I'd call that a win. :)

Re:"Should start to worry"? (0)

Anonymous Coward | more than 2 years ago | (#37449100)

"At the complexity level of a Pentium IV, the added ugliness of the X86 instruction set is pretty much noise in the total die area and power."

Then you state "which means the same size I-cache holds a lot more useful code. When you toss in the fact that the ugliness is also guarantees employment for instruction decode specialists, I'd call that a win :)"

What are you trying to say? The first statement implies CISC adds inefficient, complex decode logic and adds to the total cost (area/power). Then you say the code resulting from CISC is dense, saving on I-cache. So which effect wins (less decode area/power, or denser code and therefore less I-cache)? I'd go with a basic RISC design à la ARM and just add more I-cache or microcode to make up for the lack of CISC density. But you are the expert here... explain?

Disruptive technology (0)

Anonymous Coward | more than 2 years ago | (#37447280)

Perhaps Intel should start to worry about the lower end of the processor world.

The classic thing that happens with disruptive technology is:

  • The technology isn't good enough for most of the established company's customers.
  • The technology gives benefits for some of the company's customers.
  • The company is willing to give up those customers because they are usually the least profitable.
  • The technology gets better and better.
  • The established company gets chased up market.
  • Finally, there is no market left for the established company.

Microsoft understands this, and they made darn sure that Linux wasn't allowed to establish a beachhead in netbooks. Intel could use the same tactics to fight off ARM processors. Both companies probably won't be able to fight off their respective disruptive technologies forever, though.

http://en.wikipedia.org/wiki/Disruptive_technology [wikipedia.org]

Re:Disruptive technology (1)

optimism (2183618) | more than 2 years ago | (#37450002)

Microsoft understands this and they made darn sure that Linux wasn't allowed to establish a beachhead in netbooks. Intel could use the same tactics to fight off ARM processors.

Apparently not, since the dominant smartphone (iphone) and tablet (ipad) have sold tens (hundreds?) of millions of units running ARM processors.

Coffee required (1)

MonsterTrimble (1205334) | more than 2 years ago | (#37447642)

five SPI buses

I read that as five spouses and thought: Fuck, what is it doing? Making a harem?!? Then I realized two things: We're talking about a mobo and second, I need more caffeine.

Re:Coffee required (1)

billstewart (78916) | more than 2 years ago | (#37448846)

That's SPIES, not spouses :-) Actually, no, it's not a mobo in the sense you're thinking of. It's a microcontroller board with a bunch of analog and digital inputs, though you can now support some higher-end input devices as well. SPI is a simple bus for chips to talk to each other, typically used for things like a D/A converter or accelerometer chip or EEPROM to talk to a controller chip.

Fuck (0)

Anonymous Coward | more than 2 years ago | (#37447806)

Why couldn't this have come along 3 months ago, before I had to develop a load of in-house dev boards for ! Dual I2C buses and a nice fast clock would have been an ideal off-the-shelf solution.

In related news I'm probably going to be buying 10 of these the day they are released, hopefully on the works account.

Bad for programmers (1)

Prune (557140) | more than 2 years ago | (#37447988)

ARM's relaxed/weak memory consistency model is not an issue for single-threaded programming, since the compiler takes care of you. But when you're writing parallel programs, care about performance, and use lock-free algorithms, ARM is a nightmare: you have to figure out the appropriate memory barriers without overusing them and killing performance. x86 has much more limited reordering, and even there it takes serious thought to make sure your lock-free code is correct.

fuck yea! (0)

Anonymous Coward | more than 2 years ago | (#37448204)

this is very very good news!

as someone who has run out of SRAM on a Mega2560 and was looking at the next Atmel chip up's evaluation board, this is the best news of the year so far....

Arduino in FPGA? (1)

Doc Ruby (173196) | more than 2 years ago | (#37448338)

Xilinx is marketing Zynq, a new line of chips containing ARM-9 with embedded FPGA peripheral [fpgajournal.com]. Arduino is remarkable because its HW design is open-source, and reproducible without a license. So is there an Arduino "IP core" that can be configured on a Zynq FPGA, and then Arduino code run on that?

How about using that Arduino in FPGA on ARM to replicate the functions of this new Arduino/ARM chip? Would code targeting the Arduino/ARM chip "just work" on the Arduino/FPGA/ARM chip?

Re:Arduino in FPGA? (0)

Anonymous Coward | more than 2 years ago | (#37448396)

ARM9 != Cortex A9

Re:Arduino in FPGA? (1)

mako1138 (837520) | more than 2 years ago | (#37448672)

An AVR core for FPGA already exists [gadgetfactory.net].

Re:Arduino in FPGA? (1)

Doc Ruby (173196) | more than 2 years ago | (#37449266)

That AVR8 core is designed to run on "Butterfly One", which is a Spartan 3E. Which I think means that it would give me an Arduino running on the Zynq's FPGA, right? Except I'm not sure the Zynq offers the analog inputs the AVR8 expects. If I added the missing HW to the Zynq, would the board support all Arduino SW?

Re:Arduino in FPGA? (1)

Zarquon (1778) | more than 2 years ago | (#37449438)

Atmel did something similar a few years back with their FPSLIC, but the tools and parts were very expensive and it's more or less dead. It looks like the Zynq has a similar problem... the lowest end part is $15+ for high volume sales.

Re:Arduino in FPGA? (1)

Doc Ruby (173196) | more than 2 years ago | (#37449642)

FPSLIC [design-reuse.com] was an 8bit AVR with FPGA up to 2300 cells. All the AVR did was manage the reconfig process. It doesn't look like an app on the AVR could actively invoke logic on the FPGA. The Zynq boots into Linux and leaves the FPGA as a peripheral for circuits to deliver data or interface with external HW. It can run Linux and its app processes on one core, leaving the other core "raw" for running processing directly, or embedded in the FPGA to optimize circuits factoring out ARM instructions from FPGA.

You think $15 is expensive for a dual-core ARM-A9 and a fat Artix FPGA, integrated in a fast AMBA bus? An ARM-A9 that can run Linux 2.6, and access the FPGA as a peripheral? In 28nm, with low power consumption? Even at $25 for hundreds-count it seems a great deal for the performance.

New competitor for the Digilent UNO32 (1)

subanark (937286) | more than 2 years ago | (#37448400)

The specs on this seem to be better than Digilent's UNO32:
Microchip® PIC32MX320F128 processor
80 MHz 32-bit MIPS
128K Flash, 16K SRAM
42 available I/O (I think that includes 16 analog)

The UNO32's price is also $25, and its software is based on Arduino's code. Its operating voltage is 3.3 V as well, like the announced board, instead of the Uno's 5.0 V.

Personally, we will probably stick with Digilent, as the company is about a 30-minute walk from where I work (which saves on shipping costs).

Why would Intel be worried? (1)

MobyDisk (75490) | more than 2 years ago | (#37448548)

I don't think Intel will be worried that something moved from one non-Intel platform to another non-Intel platform, especially in a market segment Intel does not compete in. Those Atmel/ARM chips run under $5 each, sometimes under $1. Intel's low-end Atom chips are around $50.

Re:Why would Intel be worried? (1)

Cute Fuzzy Bunny (2234232) | more than 2 years ago | (#37449236)

Seriously. Besides this being a zero-profit item, Intel is still getting over that horrible beating they got from the PowerPC challengers, aren't they?

Even if it does offer comparable capabilities and a price or performance benefit, there's too much inertia behind x86.

I mean, Google+ is arguably better than Facebook, so why isn't everyone on it instead of FB?

Oh, and let's all listen raptly to the people who hum the "I'd be happy doing web browsing and basic office stuff on a 95MHz chip!!1!" tune. Yeah, sure. That's why most people want a new computer after the old one has been kicking around for a few years. Going back to 1989 won't make anyone happy. People in third world countries wouldn't pee on it if it caught fire.

I think they mean Atmel's SAM3U (0)

Anonymous Coward | more than 2 years ago | (#37448570)

Typo in original article.

I'd prefer an NXP LPC3131 (1)

pem (1013437) | more than 2 years ago | (#37448710)

Twice as fast. Almost 4 times as much RAM. Cheaper.

The only downside is that it's BGA, but if somebody else puts it down on a board for me, that's sweet.

Hooray for 3.3v (3, Interesting)

Sleepy (4551) | more than 2 years ago | (#37449052)

mikejuk's submission paragraph states: "However, it's not all gain — the 3.3V operating voltage and the different I/O ports are going to create some compatibility problems. "

I respectfully disagree. Firstly, there are already a lot of 3.3 V based Arduinos on the market. I own a JeeNode (see JeeLabs in the EU, Modern Device in the USA). The JeeNode can run a 434 MHz wireless radio transceiver and temperature sensor for MONTHS on a single boost-converted AA battery. You could not do that at 5 V.
Adafruit has a popular tutorial on converting an Arduino Uno from 5 V to 3.3 V.

Almost all sensors these days are 3.3 V.

But most actuators (like stepper motors) require MORE than 5 V. Sure, there are some relays requiring a mere 5 V, and very few work on 3.3 V, but most relays require 6 V or higher. The usefulness of 5 V is diminishing; what you really want is just enough voltage to switch a transistor or relay.

(Some Arduino-compatible chips run great at 1.8 V, and some sensors do also... there will come a time when it may make sense to run at less than 3.3 V.)

I see Arduino more as a collection of standards and open hardware. There are dozens of Arduino designs, all of which vary slightly in electrical and physical (pinout, etc.) compatibility. But this too is a good thing: the Arduino platform is all about ADAPTABILITY.

Re:Hooray for 3.3v (0)

Anonymous Coward | more than 2 years ago | (#37449942)

Just a shame about all those pesky 5 V TTL devices then.....

Re:Hooray for 3.3v (0)

Anonymous Coward | more than 2 years ago | (#37450306)

You know TTL inputs work fine on a 3.3V CMOS output, right? And converting TTL outputs down is as simple as a voltage divider.

So you have to add a 5V rail, and a handful of resistors... cry me a river.

Re:Hooray for 3.3v (0)

Anonymous Coward | more than 2 years ago | (#37450674)

Because the whole point of microcontrollers is that you get to add a bunch of external passive components....

There's not a lot of point in having 50+ I/O pins if it means I'm going to have to add 100 resistors to use them as TTL inputs.

Pin spacing? (1)

riflemann (190895) | more than 2 years ago | (#37449974)

Does this have proper 0.1 inch (2.54 mm) pin spacing throughout?

The most annoying thing about the regular Arduino is the offset header that means you can't use standard protoboard for homemade shields.

Please tell me they have fixed this problem.

Re:Pin spacing? (1)

Osgeld (1900440) | more than 2 years ago | (#37450860)

Nope; fixing it would break compatibility with shields (which won't work here anyway, since this MCU is not 5 volt tolerant).

One problem (2)

pinkeen (1804300) | more than 2 years ago | (#37449976)

There is one problem with this. Yes, you can prototype on an Arduino very well, but suppose you've tested everything and want to deploy your project on a dedicated board. I can't see hobbyists soldering QFP packages and making 3+ layer PCBs at home. Compare that to soldering a DIP socket onto a possibly single-sided board.

Maybe "one project, one Arduino" isn't a problem for some. For me it's way too pricey, and adding a fairly big board with a lot of redundant circuitry to every project is not a good solution.

Next thing: what about the mods, Arduino forks, and fully compatible alternatives? There won't be so many now.