
Intel's Plans For X86 Android, Smartphones, and Tablets

samzenpus posted more than 2 years ago | from the sometime-down-the-road dept.


MrSeb writes "'Last week, Intel announced that it had added x86 optimizations to Android 4.0, Ice Cream Sandwich, but the text of the announcement and included quotes were vague and a bit contradictory given the open nature of Android development. After discussing the topic with Intel we've compiled a laundry list of the company's work in Gingerbread and ICS thus far, and offered a few of our own thoughts on what to expect in 2012 as far as x86-powered smartphones and tablets are concerned.' The main points: Intel isn't just a chip maker (it has oodles of software experience); Android's Native Development Kit now includes support for the x86 and MMX/SSE instruction sets and can be used to compile dual x86/ARM 'fat' binaries; and development tools like VTune and the Intel Graphics Performance Analyzer are on their way to Android."
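
On the NDK point: a "fat" build compiles the same native source once per target ABI and packages every flavor into one APK, so per-architecture tuning happens at compile time. A minimal sketch of what such a dual-target source can look like (illustrative only, not code from Intel's announcement):

    #include <stdio.h>

    /* Report which half of a dual x86/ARM "fat" NDK build we are running in.
     * The NDK compiles this same file once for each ABI in the build. */
    const char *build_abi(void) {
    #if defined(__i386__) || defined(__x86_64__)
        return "x86";      /* this compile may take MMX/SSE code paths */
    #elif defined(__arm__)
        return "ARM";      /* this compile may take NEON code paths */
    #else
        return "other";
    #endif
    }

    int main(void) {
        printf("running the %s flavor of this binary\n", build_abi());
        return 0;
    }

Selecting both targets is then a build setting (roughly, APP_ABI := armeabi x86 in Application.mk); the resulting per-ABI shared libraries all ship in the same package.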


First Post (-1, Offtopic)

Unclenefeesa (640611) | more than 2 years ago | (#38084504)

First Post yuuuuppiiii

Re:First Post (-1, Offtopic)

Unclenefeesa (640611) | more than 2 years ago | (#38084562)

Oh come on. I posted an actual comment right after this. The -1 hurt my karma. I was in such a good mood.
Either way, thank you for moderating :-)

Intel's Software Experience...Graphics (-1, Flamebait)

thejynxed (831517) | more than 2 years ago | (#38084526)

Google allowed them to mess with the graphics engine? OMFG, we'll end up with tablet devices that run 1990s-era graphics tech. One thing that Intel sucks hard at is graphics hardware and software. Letting them touch anything related to that in Android and Android devices is a freaking mistake.

Re:Intel's Software Experience...Graphics (0, Troll)

CajunArson (465943) | more than 2 years ago | (#38084588)

ARM chips using PowerVR Graphics: Amazing!
Intel chips using (literally the exact same) PowerVR Graphics: Intel Graphics Sux0rz!

Par for the course on this site.

Re:Intel's Software Experience...Graphics (5, Informative)

gl4ss (559668) | more than 2 years ago | (#38084632)

have you used intel graphics lately (stuff they're shipping in 2011)? it's like having a discrete mobile gpu from 2004.

but this article is not news of any kind. intel has had these plans out in public for years and years, android ndk has support for multiple targets. if they actually started shipping _that_ would be news.

Re:Intel's Software Experience...Graphics (0)

Anonymous Coward | more than 2 years ago | (#38084674)

have you used intel graphics lately (stuff they're shipping in 2011)? it's like having a discrete mobile gpu from 1994.

but this article is not news of any kind. intel has had these plans out in public for years and years, android ndk has support for multiple targets. if they actually started shipping _that_ would be news.

FTFY

Re:Intel's Software Experience...Graphics (2)

Mr Z (6791) | more than 2 years ago | (#38085686)

Hey, don't knock my Diamond Stealth 64! It's got VLB!

Re:Intel's Software Experience...Graphics (0)

Anonymous Coward | more than 2 years ago | (#38087788)

Can you read? GP mentioned that the graphics hardware is not made by Intel. Fuck, the summary doesn't even mention Intel graphics hardware (it mentions a software analyzing program) so this whole fucking thread is off topic. Retards.

Re:Intel's Software Experience...Graphics (1)

Guspaz (556486) | more than 2 years ago | (#38088328)

Intel's past use of PowerVR chips was at a time when smartphone screens were still pretty low-res, and the expectations of graphical performance on a smartphone were very different from what was expected on a notebook. Cedar Trail (their upcoming Atom product) is using a Series 5 chip (the 545) rather than a Series 5XT chip (like the PowerVR SGX543MP2 in the iPad 2 and iPhone, or the SGX543MP4 in the PlayStation Vita). The 545 is certainly an improvement over their previous single-core chips, but I doubt it will compare favourably to the multicore graphics solutions in modern smartphones and tablets. It's a very odd decision on Intel's part.

Anyhow, the failure of Intel's chips in the smartphone and tablet space has little to do with graphics, and more to do with the fact that Atom performs similarly to a Cortex A8 or A9, yet early Atom chips used two or three times as much power to do it. Intel has narrowed the gap quite a bit, although they're still not there yet. Their next-gen parts might achieve this, but much like other architectures have had trouble displacing Intel for the desktop/notebook crown, so too will Intel have trouble displacing ARM in the embedded space. Merely matching the performance-per-watt of ARM's chips isn't enough, because at that point people will ask, "What's the point of using Intel's chips over ARM? They perform the same but don't have as big an install base, so there's no advantage."

The only way Intel will get anywhere is by doing something BETTER than ARM, and they haven't managed that quite yet.

Re:Intel's Software Experience...Graphics (0)

Anonymous Coward | more than 2 years ago | (#38084624)

Give em a chance. Maybe they can add animated gif support to android...

Re:Intel's Software Experience...Graphics (1)

rrossman2 (844318) | more than 2 years ago | (#38085500)

Animated gifs do work (well, hit and miss) in the browser... for some reason sometimes they will play on failblog.org, other times they are just static images...

Re:Intel's Software Experience...Graphics (1)

oakgrove (845019) | more than 2 years ago | (#38088308)

Give em a chance. Maybe they can add animated gif support to android...

Dear God no

Re:Intel's Software Experience...Graphics (2)

ByOhTek (1181381) | more than 2 years ago | (#38084696)

Google allowed them to mess with the graphics engine? OMFG, we'll end up with tablet devices that run 1990s-era graphics tech.

Wow. I hadn't realized Intel's graphics offerings have improved to even that point.

At least it wasn't ATI/AMD, then it would be fast, but crash a lot...

Re:Intel's Software Experience...Graphics (1)

RulerOf (975607) | more than 2 years ago | (#38085484)

At least it wasn't ATI/AMD, then it would be fast, but crash a lot...

And then there would be the Android malware mining bitcoins, too!

Re:Intel's Software Experience...Graphics (2)

sarhjinian (94086) | more than 2 years ago | (#38084938)

Even if they develop their own graphics chip for tablet use, it'll a) probably be enough for what you'd do on a tablet (seriously: on a desktop PC, for anything except gaming, Intel's stuff is good enough), and b) it depends on how well the software's done, anyway (case in point: on many recent Linux distros, and again, unless you're gaming, Intel's chipsets provide a better overall experience than much more capable nVidia or ATI hardware).

Re:Intel's Software Experience...Graphics (1, Interesting)

hedwards (940851) | more than 2 years ago | (#38085986)

Intel's stuff is generally good, but it's expensive and I don't personally think we need to allow a foothold for the same sort of anti-competitive behavior that Intel is known for in the desktop/laptop processor market.

Re:Intel's Software Experience...Graphics (0)

Anonymous Coward | more than 2 years ago | (#38085806)

Don't forget Tizen. :)

x86 (5, Insightful)

Unclenefeesa (640611) | more than 2 years ago | (#38084528)

Since most x86 architecture and related hardware is getting smaller and most smartphones are getting bigger, they are bound to meet somewhere.
hmm, I guess it will be called a tablet or an i(ntel)Pad. ehm ehm

Re:x86 (1)

JoeMerchant (803320) | more than 2 years ago | (#38084702)

Can you say "Windows 8" for phone?

Re:x86 (4, Funny)

chill (34294) | more than 2 years ago | (#38084952)

Not while keeping a straight face, no.

Re:x86 (2)

davester666 (731373) | more than 2 years ago | (#38087842)

Didn't Ballmer just threaten everybody with something along the lines of "We'll always live in a Windows era"?

Or was it more of a warning?

Re:x86 (2)

ozmanjusri (601766) | more than 2 years ago | (#38084978)

I can say "Meh".

Is that close enough?

Re:x86 (1)

Locutus (9039) | more than 2 years ago | (#38085814)

if you say Win 8 really fast it sounds like Wait. Just saying.

LoB

Re:x86 (0)

JoeMerchant (803320) | more than 2 years ago | (#38086206)

Resistance is futile. I'd rather have had a Qt-Linux-Nokia phone in 2011, but I bet I'll be getting a Windows phone by 2013 - it's just too damn useful to ignore (as opposed to the iPod Touch-Phone that has been sweeping the Nation lately...)

Re:x86 (1)

oakgrove (845019) | more than 2 years ago | (#38087566)

Resistance is futile.

They said the same thing about Vista.

Re:x86 (1)

Luckyo (1726890) | more than 2 years ago | (#38087618)

And they were right, as Vista 1.1, also known as 7, won everyone's hearts.

Re:x86 (1)

oakgrove (845019) | more than 2 years ago | (#38087692)

No doubt Windows 7 is very close to the Vista codebase but what people were saying way back when (I was there) was that Vista in its present state was the future and you might as well get off of XP and get on Vista because "resistance is futile". That never happened. So, no, resistance was not futile. In this case it worked as MS got on the ball and delivered something that many people actually like.

Re:x86 (1)

Luckyo (1726890) | more than 2 years ago | (#38088124)

But resistance WAS futile. 7 was a repackaged Vista that retained most if not all of its flaws. It just offered a slightly different presentation under a different name.

UAC halting your entire system for pretty much everything? Still there. Programs breaking due to admin rights requirements on a machine where I specifically fucking want to have admin rights while running them? Still there. Inability to roll back to the classic menu? Still there. Incompatibility issues with older software? Still there. Massive memory hog? Still there.

Seven essentially reduced some of the annoyance brought by Vista a bit, and people who were used to Vista's annoyances jumped on 7. I understand that: if I lived in the poorest country of Africa, Libya would look like heaven to me too.
Of course, when you're living in a Western country, Libya is a poor shit hole, and moving from XP to 7 felt pretty fucking horrible. Still does, after ripping out and disabling pretty much every annoyance I could, including disabling UAC and Aero, and ripping out most of the new and retarded interface and replacing it with Classic Shell (props to those guys for doing Microsoft's job, and as a FOSS project to boot) and so on.

I still have to deal with some of the interface stuff they couldn't remove, like the huge spacing crap. I'm also hamstrung by 7's caching scheme, which slowed my daily operations by a very significant margin (I actually asked a friend to time me on this with the old and new machines, to see if it was just in my head, and it wasn't), with its absolutely useless "awesome and fast", as Microsoft advertised it, caching. Which it probably is, when you have well over 4 gigs of RAM and an SSD.
It's just that I have HDDs and just 4 gigs of RAM, and I see that I have in fact downgraded in terms of efficiency, in spite of having more RAM, a much faster CPU and GPU in the system, as well as faster (and obviously bigger) hard drives.

So yes, Vista won. Even really pissed people like me were forced to surrender and switch to Vista 1.1. I rest my case.

Re:x86 (4, Interesting)

Anonymous Coward | more than 2 years ago | (#38084748)

Given the choice, everyone who actually has to code for those CPUs (e.g. compiler makers) without a doubt prefers ARM over x86, simply because of how shit x86 is.
It's the Windows ME of machine code. It started out back in the DOS days and kept the cruft all the way to today, while piling more and more, bigger and bigger stuff on top, ending up with an upside-down pyramid held in balance by a billion wooden sticks.
And I know that even Intel itself couldn't stand it anymore. That's why they implemented that microcode solution with a RISC processor on the inside.
If only they would give us direct access to that core, but leave the microcode in there for 1-2 processor generations for legacy reasons.
Then nobody would willingly keep doing x86, and before those 2 generations were over, it would be locked away and forgotten.

I, for one, plan a 256-core ARM CPU as my next desktop system. (Yes, ARM cores are slower per clock cycle. But they are *a lot* more efficient and *a lot* cheaper too. [No, Atom does not count, unless you add that northbridge that's so big and gets so hot that 10/10 people looking at the mainboard think it's the actual CPU. Which is closer to the truth than Intel ever wants to admit.])

Re:x86 (2)

TheDarkMaster (1292526) | more than 2 years ago | (#38085452)

You forget too easily that many people depend on this legacy code to run software worth thousands or even millions of dollars. Just because the desktop in your mom's basement no longer needs it doesn't mean mankind doesn't need it anymore.

Re:x86 (1)

Anonymous Coward | more than 2 years ago | (#38085592)

Not on smartphones and tablets, they don't.

Re:x86 (1)

hedwards (940851) | more than 2 years ago | (#38086040)

Precisely. The reason AMD was able to succeed with its architecture over Intel's was that Intel's 64-bit architecture at the time required all software to be specially compiled to run on Merced, whereas AMD64 was backwards compatible: people could buy the chip and move to 64-bit when needed, or just run some applications in 32-bit mode.

In this case I have no idea why anybody other than Intel would think this is a good idea; the reason Intel previously used an ARM-based XScale processor was that x86 isn't suitable for this application.

Then run it in emulation (1)

tepples (727027) | more than 2 years ago | (#38086158)

You forget too easily that many people depend on this legacy code to run software of thousands or even millions of dollars.

Then keep your legacy code and run it in an emulator on an ARM CPU. The legacy code was probably written so long ago that it'd run as fast in a JIT emulator today as it did natively then.
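
As a point of reference, the core of any such emulator (before a JIT replaces hot paths with generated native code) is just a fetch-decode-execute loop. A toy sketch with a made-up four-opcode bytecode, nothing x86-specific:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy bytecode: 0=HALT, 1=PUSH next byte, 2=ADD top two, 3=PRINT top. */
    static void run(const uint8_t *code) {
        int stack[64], sp = 0;
        for (size_t pc = 0; ; ) {
            switch (code[pc++]) {
            case 0: return;                                   /* HALT  */
            case 1: stack[sp++] = code[pc++];         break;  /* PUSH  */
            case 2: sp--; stack[sp - 1] += stack[sp]; break;  /* ADD   */
            case 3: printf("%d\n", stack[sp - 1]);    break;  /* PRINT */
            }
        }
    }

    int main(void) {
        const uint8_t prog[] = { 1, 2, 1, 3, 2, 3, 0 };  /* 2 + 3: prints 5 */
        run(prog);
        return 0;
    }

A JIT emulator of the kind the comment describes would translate runs of guest instructions into host machine code once, then jump straight into the translation on later executions.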

Re:x86 (2)

oakgrove (845019) | more than 2 years ago | (#38087462)

But you're kind of missing the point. If ARM really is that much better than x86 (I don't really know, as I don't program at that level), then with the momentum it's gaining in mobile devices, ARM overtaking x86 is inevitable. I don't know a lot about x86_64 vs. ARM, but I do know that my Xoom outperforms my netbook, and it does it while generating no heat and with three times the battery life from a smaller battery. Look out, Intel.

Re:x86 (1)

TheDarkMaster (1292526) | more than 2 years ago | (#38088406)

The problem is not who is better... The problem is trying to port your "legacy", big and very expensive x86 code because some "genius" decided to simply drop legacy support in the hardware. Especially when you have a tight deadline to meet and the system cannot stop.

Re:x86 (2)

mlts (1038732) | more than 2 years ago | (#38086304)

What I would like to see is a CPU architecture that can have asymmetric cores:

When the machine is idle, one low-power core handles the OS idle functions while another handles the IP stack, another core handles I/O, and another handles the hypervisor aspect.

When the machine is running database stuff, first cores that are made for integer operations get used, then the FPUs and GPUs come in.

Flip to a game, and the cores that mainly are used as GPUs come into play.

Fire up a modeling task, and the FPU heavy cores take the load first.

All this while cores dedicated to AES and RSA deal with the hard disk encryption as well as SSL/TLS items.

As for instructions, I agree with you there. Intel knows that the x86 needs to go, but has to keep that architecture going for legacy reasons. Ideally the best solution would be an Itanium chip with a ton of registers (128 general, 128 FP, etc.) This makes operations easier because all the fetches can be done first, the registers used, then the results stuffed back into memory, making caching easier.

Even more ideal is putting the x86 emulation into hardware so operating systems that are legacy can run on that and a hypervisor, while programs using the new architecture can run optimally. It might even be good to put the hypervisor on the CPU.
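
No shipping CPU works quite like this, but the dispatch policy the comment imagines is easy to sketch: tag each workload with the execution unit it stresses and route it to a core pool of that type. A toy illustration, with every core type and task name hypothetical:

    #include <stdio.h>

    /* Hypothetical core specializations from the comment above. */
    typedef enum { CORE_LOW_POWER, CORE_INT, CORE_FP, CORE_GPU, CORE_CRYPTO } core_type;

    typedef struct {
        const char *name;
        core_type   needs;   /* which unit this workload stresses most */
    } task;

    static const char *pool_name(core_type t) {
        switch (t) {
        case CORE_LOW_POWER: return "low-power core (idle/OS housekeeping)";
        case CORE_INT:       return "integer-heavy core (database)";
        case CORE_FP:        return "FPU-heavy core (modeling)";
        case CORE_GPU:       return "GPU-like core (games)";
        case CORE_CRYPTO:    return "AES/RSA core (disk crypto, SSL/TLS)";
        }
        return "?";
    }

    int main(void) {
        task tasks[] = {
            { "os idle loop",  CORE_LOW_POWER },
            { "sql query",     CORE_INT },
            { "physics step",  CORE_FP },
            { "frame render",  CORE_GPU },
            { "tls handshake", CORE_CRYPTO },
        };
        for (int i = 0; i < 5; ++i)
            printf("%-13s -> %s\n", tasks[i].name, pool_name(tasks[i].needs));
        return 0;
    }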

Re:x86 (1)

inhuman_4 (1294516) | more than 2 years ago | (#38087096)

Mod Parent Up!

Very interesting idea. We are going to have to add more cores from now on to get more performance, so we might as well start specializing them for certain tasks. Your idea about x86 hardware emulation is especially interesting.

Re:x86 (1)

inhuman_4 (1294516) | more than 2 years ago | (#38087198)

Sorry to double reply, but now that I think about it, what you are describing sounds a hell of a lot like a mainframe on a chip. IBM mainframes have Multi-chip Modules [wikipedia.org] that are a lot like what you are describing.

Re:x86 (3, Informative)

tlhIngan (30335) | more than 2 years ago | (#38087210)

What i would like to see is a CPU architecture that can have asymmetric cores:

Similar to your design, the Tegra 3 ARM SoC does that. It has a quad-core A9 running at 1.5GHz or more, but it also has a "slow" core running at 600MHz or so. When things are idling, the slow core takes over and does the job while the hefty quadcores are powered off, saving tons of power.

Marvell I think also has a similar idea for their SoCs. And ARM's A15 design is supposed to incorporate that as well.

Re:x86 (1)

oakgrove (845019) | more than 2 years ago | (#38087548)

The only problem with splitting everything out like that is that many apps always have that one thread that can't be broken down any further and will saturate its core. A loop can only run so fast no matter how many cores you have, so any one program always has an absolute performance bottleneck. This isn't to say there's no merit to multi-core (the more the merrier, of course), but it isn't a panacea.
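
The textbook name for this bottleneck is Amdahl's law: if a fraction p of the work parallelizes, speedup on n cores is 1 / ((1 - p) + p / n). A quick numeric check of how hard the serial part bites:

    #include <stdio.h>

    /* Amdahl's law: speedup on n cores when fraction p of the work parallelizes. */
    static double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void) {
        /* Even at 90% parallel, the serial 10% caps speedup near 10x forever. */
        int cores[] = { 2, 4, 16, 256 };
        for (int i = 0; i < 4; ++i)
            printf("p=0.90, n=%3d -> %.2fx\n", cores[i], amdahl(0.90, cores[i]));
        return 0;
    }

This prints roughly 1.82x, 3.08x, 6.40x, and 9.66x: past a handful of cores, that one stubborn thread dominates.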

Re:x86 (0)

Anonymous Coward | more than 2 years ago | (#38087814)

Kernel makers prefer x86. Go read Linus' comments on ARM.

power consumption (1)

craftycoder (1851452) | more than 2 years ago | (#38084566)

I thought x86 was a power hog compared to ARM. That seems like a serious consideration for mobile devices to me. I'll be interested to see where this goes. In the meantime, x86 chips are going to have to get a lot cheaper to compete with ARM's prices.

Re:power consumption (5, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#38085200)

It is. The difference between an x86 and ARM core is around an order of magnitude at the moment for the same performance. But the difference between an x86 core and the display is another order of magnitude, so for devices that you mainly use with the screen on there isn't much difference between x86 and ARM in terms of overall power consumption. The difference in battery life between an ARM core at 200mW and an Intel core at 2W is very small when the display is using 10-20W. There are a few display technologies that are supposed to be hitting the market Real Soon Now that ought to make the difference between x86 and ARM a lot more apparent.
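
Plugging the comment's own figures into a back-of-envelope calculation shows why the display dominates while the screen is lit (the battery size below is a made-up example, and this models screen-on time only):

    #include <stdio.h>

    int main(void) {
        const double display_w  = 10.0;           /* display draw, per the comment */
        const double arm_w      = 0.2, x86_w = 2.0;
        const double battery_wh = 25.0;           /* hypothetical battery capacity */

        double arm_hours = battery_wh / (display_w + arm_w);
        double x86_hours = battery_wh / (display_w + x86_w);
        printf("ARM: %.2f h, x86: %.2f h (%.0f%% shorter)\n",
               arm_hours, x86_hours, 100.0 * (1.0 - x86_hours / arm_hours));
        return 0;
    }

With a 10 W display, a tenfold difference in CPU power costs only about 15% of screen-on battery life, which is the comment's point.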

Re:power consumption (4, Interesting)

craftycoder (1851452) | more than 2 years ago | (#38085550)

I'd mod up your post, but I want to reply instead. Are you suggesting that the display uses 50-100 times the power of an ARM chip (and therefore 5-10 times an x86)? If that is true, that is very interesting. I did not realize the display was such an outlier in the power consumption department...

Re:power consumption (1)

Hotweed Music (2017854) | more than 2 years ago | (#38085834)

If you have an Android (or possibly other type of) smartphone, the display usually uses around 60-80% of the battery. And that's on small displays designed for energy efficiency.

Re:power consumption (2)

pmontra (738736) | more than 2 years ago | (#38086900)

On my Samsung Galaxy S2 it's between 40 and 50%, so maybe the Super AMOLED display really is a power saver despite the 4.3" diagonal. And I use little WiFi and little 3G, only when I explicitly need the net, so the display's share could be even lower in a typical always-on scenario.

Re:power consumption (1)

tepples (727027) | more than 2 years ago | (#38086224)

Are you suggesting that the display uses 50-100 times the power of an ARM chip (and therefore 5-10 times an x86)?

Yes, and this is why an e-ink Kindle reader lasts so much longer on a charge than, say, a Kindle Fire tablet.

Re:power consumption (1)

Shatrat (855151) | more than 2 years ago | (#38086656)

It definitely is, but you also have to consider that it is usually OFF in the case of a phone.
An x86 Android tablet would make sense, since you could just turn it completely off when not in use, but an x86 phone would have a standby time shorter than your average summer blockbuster.

Re:power consumption (5, Informative)

Mr Z (6791) | more than 2 years ago | (#38085956)

Is the display really that much of a hog on a cell phone? Those numbers sound like laptop numbers, but I thought we were talking cell phones.

My phone has a battery that holds around 1300 mAh at 3.7v. That means I can draw 4.8W for 1 hour. If my phone's display really sucked down even 10W, then I wouldn't be able to have the display on for more than about 28 minutes total, which doesn't match my experience at all. I regularly browse the web from my phone for a half hour at a time, without making much of a dent in the battery.

A quick scan through this paper [usenix.org] suggests backlight power for the phone they analyzed tops out at 414mW, and the LCD display power ranges from 33.1mW to 74.2mW. If you drop the brightness back just a few notches, the total display power is around a quarter Watt or so, which sounds far more reasonable.
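
Spelling out the arithmetic behind those estimates (all numbers taken from the two paragraphs above):

    #include <stdio.h>

    int main(void) {
        const double mah = 1300.0, volts = 3.7;
        double wh = mah / 1000.0 * volts;      /* ~4.8 Wh of stored energy */
        printf("capacity: %.2f Wh\n", wh);
        /* at a laptop-class 10 W display: the roughly 28 minutes estimated above */
        printf("at 10 W:   %.0f minutes\n", wh / 10.0 * 60.0);
        /* at the ~0.25 W dimmed-phone-display figure from the cited paper */
        printf("at 0.25 W: %.1f hours\n", wh / 0.25);
        return 0;
    }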

I don't think Intel is standing still on power consumption. Their desktop CPUs are hogs, sure, but they can bring a lot of engineers to bear optimizing Atom-derived products. (We might get an early read from Knights Corner, actually, although I expect it to still be on the "hot" side. I'm waiting to hear more about it.) Also, ARM's latest high-end offerings (including the recently announced A15) aren't exactly as power-frugal as some of their past devices. In the next couple of years, I think the scatter plot of power vs. performance for ARM and x86 variants will show a definite overlap in the mix, with some x86s pulling less power than some ARMs.

Re:power consumption (1)

tepples (727027) | more than 2 years ago | (#38086246)

Is the display really that much of a hog on a cell phone?

Tablet screens draw four to ten times as much juice as smartphone screens because they have four to ten times the area.

Re:power consumption (1)

Mr Z (6791) | more than 2 years ago | (#38086580)

Yeah, I can see that. I guess "mobile device" doesn't just mean "mobile phone" these days.

Re:power consumption (2)

tycoex (1832784) | more than 2 years ago | (#38088192)

I have my phone screen on for about 2-3 hours per day due to bus rides. According to the Android battery tracking thing, my display uses up around 60-70% of my battery for the day, and this is on a Nexus S with an AMOLED screen that is supposed to use less battery than an LCD because it doesn't have to light up black pixels.

The screen really is huge when it comes to battery consumption.

Re:power consumption (1)

Mr Z (6791) | more than 2 years ago | (#38088420)

What are you doing on it during that time? The processor, baseband, and RF circuits also suck up a fair amount of juice. The PDF I linked above shows GSM consuming around 600mW during GPRS and WiFi consuming around 700mW when in use on the phone they analyzed. I'd expect other phones to be similar. 3G is supposedly much worse at draining batteries. Dunno about CDMA/LTE, but I would imagine they'd also be in the half-watt to one-watt range, to venture a first-order guess.

If you're just playing games, then it's just the CPU, RAM and display sopping the battery.

Re:power consumption (0)

Anonymous Coward | more than 2 years ago | (#38086424)

6W large displays were available last year.

http://www.coated.com/samsung-low-power-usb-lcd-display/

Re:power consumption (0)

Anonymous Coward | more than 2 years ago | (#38087160)

The difference in battery life between an ARM core at 200mW and an Intel core at 2W is very small when the display is using 10-20W.

You are forgetting that in a smartphone the screen is only on for a fraction of the time, which shrinks its share of the total power budget. For example, while the CPU runs 100% of the time the phone is turned on, the screen is automatically turned off after a few seconds of idle time, and even during phone calls. So if a screen is kept on for about 2 or 3 hours a day, its average power consumption tends to be between 1 and 2W, which is at the same level as your Intel core figure.

And even in this case, if we compare 0.2W + 2W = 2.2W with 2W + 2W = 4W, ARM cores still outperform Intel ones. And in smartphones this is what really matters.

Re:power consumption (2)

zealot (14660) | more than 2 years ago | (#38087750)

Despite what many other commenters will say, no, it isn't a power hog compared to ARM. Or at least it doesn't have to be. Intel/AMD/VIA don't yet offer processors that use as little power as ARM's (although some are pretty power/performance efficient depending on your workload), but they will within the next year for smartphones and tablets. On modern manufacturing processes the "x86 tax" becomes almost non-existent.

Debian (2)

mschoolbus (627182) | more than 2 years ago | (#38084642)

Just give me a debian build for my phone including dialer, messaging, etc..

Then I can play REAL games on my phone.. Or as real as they get in Linux!

Re:Debian (1)

isama (1537121) | more than 2 years ago | (#38084788)

then please give us the source to that dialer so we can port it to *bsd so we can have a real system.

Re:Debian (1)

mschoolbus (627182) | more than 2 years ago | (#38084826)

Then run *bsd [debian.org] on Debian.

Re:Debian (2)

znerk (1162519) | more than 2 years ago | (#38085258)

Just give me a debian build for my phone including dialer, messaging, etc..

Then I can play REAL games on my phone.. Or as real as they get in Linux!

Games aren't real on Linux? Yeah, PenguSpy [penguspy.com] and Linux Gamers [linux-gamers.net] don't have real games, really written for real Linux. You know, like Quake 4, Doom 3, Vendetta, and X3 - those aren't real games... oh, wait.

And nevermind that wine [winehq.org] actually works really well, nowadays, running many top games "flawlessly, out of the box", and tons more "run flawlessly with some special configuration" [winehq.org] .

Re:Debian (1)

mschoolbus (627182) | more than 2 years ago | (#38085490)

I don't know if you have ever used Wine for gaming, but I certainly wouldn't call it flawless, or even close. Running non-native games on a phone sounds awful!

Now name popular games for Linux not made by id Software or ported by Loki...

Re:Debian (1)

ifiwereasculptor (1870574) | more than 2 years ago | (#38086276)

Solitaire! Freecell! And maybe Chromium, mostly because people mistake it for the browser when using Synaptic.

Re:Debian (1)

oakgrove (845019) | more than 2 years ago | (#38087722)

I don't know if you ever used wine for gaming, but I certainly wouldn't call it flawless

I don't think he actually said that. What he said was that wine runs many games flawlessly not that wine itself is flawless. Subtle distinction but it is there.

If you're not a first-person shooter fan (1)

tepples (727027) | more than 2 years ago | (#38086340)

You know, like Quake 4, Doom 3, Vendetta, and X3

Vendetta [wikipedia.org] is from 1991. It's like pointing out that Mega Man X3 runs in a Super NES emulator: interesting, and probably fun for a while, but not what grandparent had in mind. As for Quake and Doom, can you recommend things other than first-person shooters that commonly get ported to Linux, especially well-praised E or E10+ rated game series?

And nevermind that wine actually works really well

Only on x86 phones. Most existing smartphones are ARM; let me know when Atom phones start to come out. And even if you stick to games from the Pentium 4 era, knowing that Atom is roughly comparable to a similarly clocked Pentium 4, you'll still have to work around copy protection measures that rely on the machine having an internal CD-ROM drive.

Re:Debian (1)

Aryden (1872756) | more than 2 years ago | (#38085262)

I'm running Ubuntu stacked on AmeriCandroid on my HD2

Re:Debian (0)

Anonymous Coward | more than 2 years ago | (#38086724)

Get an N900. You can actually boot Debian, although IIRC not all stuff is working properly, and nobody really cares to fix it.

Or more typically, boot Maemo, using its kernel, X server, etc., and chroot to a bog-standard Debian install. It's a bit of a hassle to tear out Maemo's one-window-at-a-time window manager (Matchbox) and GNOME-ish session stuff (Hildon), if you want those from Debian too, but eminently doable -- I did exactly that on my N810, running FVWM2 for several months. Or just leave them, and work with them -- they play reasonably nice with standard X programs, and rather well with GNOME programs.

Intel Softcores (4, Interesting)

inhuman_4 (1294516) | more than 2 years ago | (#38084686)

While it is always nice to hear about companies contributing to open source, I don't see there being a big demand for x86 Android. Who would use it? It's not low-power enough for most tablets/phones. And while the ability to run existing x86 apps is nice, they are mostly tied to Windows, which is also not likely to see much traction in the mobile space. So what is the point?

What I would like to see is Intel creating a SoC and softcore suite. Intel has some big advantages that they could use to seriously compete:
1) Lots of experience in chip design. I don't see why they can't create an ARM-Core competitor.
2) They can start from scratch. Unlike ARM, there is no need for legacy support or backward compatibility.
3) They have in-house designers for everything from graphics to wired and wireless chips. I don't see why they cannot design from this a whole suite of modules that work on their SoC platform.
4) They have (to my knowledge) the best chip fab plants in the world by a sizable margin. Die shrinks offer a great way to reduce power consumption.
5) They have produced great x86 compilers for years, so producing a new compiler for a new chip shouldn't be too difficult since they are already experienced with x86 and Itanium.
6) They have shown that they already know how to support Android.
7) They have the cash and business partners to make it work.

I'm not saying they are guaranteed to make big bucks. Fighting an entrenched ARM with wide industry support will be hugely difficult. But if any company can do it, it's Intel. Of course, this means they would have to get over the Itanic debacle and stop trying to shove x86 down the throat of every problem.

Re:Intel Softcores (1)

JasterBobaMereel (1102861) | more than 2 years ago | (#38084866)

Intel has loads of experience in getting the creaking x86 architecture to work in the modern world. ARM, however, is much, much newer and has far fewer layers of cruft. Intel has not shown the ability to throw all of that away and start from scratch (which is what we really need).

Re:Intel Softcores (1)

Brian Feldman (350) | more than 2 years ago | (#38084878)

It's not as if "x86" means much from an architectural standpoint. It is a choice of instruction set, and a good choice for new products given your (5) above: what has the better payoff, making a new instruction set or reusing an existing one that is supported exceedingly well? Intel's 386 and AMD's 64-bit conventions are common ground for many wildly different CPU architectures.

Re:Intel Softcores (1)

inhuman_4 (1294516) | more than 2 years ago | (#38085226)

It's not as if "x86" means much from an architectural standpoint.

But it does. Intel/AMD do a lot to make the architecture efficient, but they are significantly constrained by having to present x86 semantics at the highest level for compatibility.

First, there is a ton of legacy stuff in x86 that is just not needed, making the core larger and more power-hungry. Take a look at how the floating point works; it's just dreadful.

x86 is CISC when we know RISC is better. Intel/AMD do some tricks to make the core more RISC, but why not just cut out the middle man? Why bother with converting it at all?

The number of working registers is also determined by the instruction set. It is pretty obvious that x86 could use more.

Additionally, there are things like the status registers, various pointers, and the interrupt subsystem. It's just not pretty.

And most importantly for this application, the chip is not designed with SoC use in mind, so splitting it up or adding different parts will, I suspect, be much more difficult.

Re:Intel Softcores (5, Informative)

yoshman (1416413) | more than 2 years ago | (#38085496)

The mistake most people seem to make here is to compare ARM to IA32, when they should be comparing ARM to Intel64/AMD64 (x86_64) since even Atom can run 64-bit code these days.

Going to 64-bit does increase code size a bit, but one of the good things about x86/x86_64 code is that it is VERY dense. This document

http://www.csl.cornell.edu/~vince/papers/iccd09/iccd09_density.pdf [cornell.edu]

suggests that 64-bit x86 code is actually even denser than ARM-thumb code in most cases (which in turn is denser than "normal" ARM code).

High code density means more cache hits, which means better performance and lower power consumption.

x86_64 has the same number of integer registers as ARM: 16. Every single x86_64 CPU has support for SSE, which means that floating point operations can be (and are) handled by the 16 SSE registers instead of the old x87 FPU stack.

Fact is that the 64-bit specification for x86 fixed a large number of problems that the 32-bit specification had, making x86_64 a really good architecture without any significant flaws.
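
On the x87-vs-SSE point: the x86_64 ABI compiles ordinary floating point to SSE register operations rather than the x87 stack, and the same registers are reachable explicitly through compiler intrinsics. A small sketch:

    #include <xmmintrin.h>   /* SSE intrinsics; SSE is guaranteed on x86_64 */

    /* Scale four floats at once in a single XMM register; the x87 stack is
     * never touched. Plain scalar float math on x86_64 also compiles to
     * SSE instructions (addss/mulss) rather than x87. */
    void scale4(float *v, float k) {
        __m128 x = _mm_loadu_ps(v);          /* load 4 packed floats    */
        x = _mm_mul_ps(x, _mm_set1_ps(k));   /* multiply each lane by k */
        _mm_storeu_ps(v, x);                 /* store the results back  */
    }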

Re:Intel Softcores (1)

inhuman_4 (1294516) | more than 2 years ago | (#38087378)

Mod Parent Up!

Thanks for your post, very informative. I never considered the cache/code density aspect. I'm printing that PDF ASAP.

Re:Intel Softcores (4, Insightful)

Short Circuit (52384) | more than 2 years ago | (#38086102)

x86 is CISC when we know RISC is better. Intel/AMD do some tricks to make the core more RISC, but why not just cut out the middle man? Why bother with converting it at all?

Pull up a pillow and have a seat around ol' Grandpa Short Circuit. This may come as a shock to you.

Some programs still being sold and run on desktop computers today were compiled over ten years ago. Some programs still sold and run in x86 embedded environments were compiled twenty to thirty years ago. That's why x86 is still around.

x86 is still around for the same reason Windows is still around. It still runs binaries that are really, really old. In some cases (many, I expect), the source code for these binaries no longer exists, or the toolchain for building it is bitrotted. That's why x86 is still around.

Imagine some sci-fi horror film where everyone's forgotten how to maintain the vast infrastructure of their civilization, they just don't poke it because they don't want it to break. That's why x86 is still around.

Meanwhile, every year there are more long-lived applications built for the existing platform, with very little hope of being updated for newer platforms and processors; their binaries are likely to be running for another five or ten years.

Amusingly, open-source software has a clear advantage over closed source software in this arena. Several distributions are actively keeping software packages portable across CPU archs, and even portable across OS kernels. (Debian and Gentoo both support BSD foundations as well as Linux)

Re:Intel Softcores (1)

mlts (1038732) | more than 2 years ago | (#38086524)

I know some businesses which are still dependent on Windows 3.1 programs written in 1993-1994. When machine upgrade time came around, I ended up just P2V-ing their old boxes, sharing the application's document folders with the host OS, and to the end user, the creaky old application functions the same as anything else on Windows 7. To boot, if the creaky application gets corrupted, it only takes either a reloading of a snapshot, or grabbing an archive of the VM disk file to get back in business. (I also made sure images of the program's install media were stored with the VM for safety reasons.)

Even Apple, which will toss a port or a feature out the second they feel it isn't important, made the switch to x86, so the ability to run legacy apps is a major factor with machines these days.

Re:Intel Softcores (3, Informative)

Gaygirlie (1657131) | more than 2 years ago | (#38085232)

It's not as if "x86" means much from an architectural standpoint. It is a choice in instruction set and is a good choice for new products given your (5) above -- what's got better payoff, making a new instruction set or reusing an existing one that is supported exceedingly well? Intel's 386 and AMD's 64-bit conventions are common ground for many wildly different CPU architectures.

Actually yes, "x86" does mean a lot even from an architectural standpoint. For example it means you have to carry along all the instructions and their related mechanisms concerning 8086 Real Mode, and 80286 Extended Real Mode, plus all the horribly clumsy register types. That means you'll be wasting die space just to support stuff that isn't even used anymore, not to mention the time wasted on actual hardware design. With a completely new processor design you can just scrap all that, add much more flexible registers plus more of them, and get a more efficient CPU as a result. Every little bit of space saved is meaningful on a processor aimed for mobile devices, and it does help on desktops, too, if not as much.

scrap existing isa (0)

Anonymous Coward | more than 2 years ago | (#38085376)

The only benefit to native support of x86 is to run existing code* at high performance. If speed is not important, then emulate the instruction set (and then you will need to emulate the OS too, for Win32 compatibility, but that is beyond the scope of this).
If you create a brand new instruction set and offer it to the world, there is a real issue that no one will buy it and the market will go a different direction and buy a competitor's product.

*If it is Windows code, in my AC opinion, you are daft to want to run a program written 5+ years ago, as fast as it can be run, on a tablet. Novelty/hobbyists aside, who is going to buy a shiny tablet with (oh, say) Windows 8, install Office 2003 on it, and be pissed that it doesn't run as fast as it does on their desktop?

Re:Intel Softcores (1)

drinkypoo (153816) | more than 2 years ago | (#38088306)

Actually yes, "x86" does mean a lot even from an architectural standpoint. For example it means you have to carry along all the instructions and their related mechanisms concerning 8086 Real Mode, and 80286 Extended Real Mode, plus all the horribly clumsy register types.

OK, so your decoder has to be able to handle the micro-ops, and you've got to have the hardware on the chip somewhere to perform the operations. But you don't actually have to have ANY of the same hardware (aside from where there's really only one practical way to do things) because you're going to decompose the x86 instructions into micro-ops anyway.

Re:Intel Softcores (4, Informative)

TheRaven64 (641858) | more than 2 years ago | (#38085276)

What I would like to see is Intel creating a SoC and softcore suite

They did that, what, 18 months ago now? Total number of people who licensed it: zero. Why? Because x86 absolutely sucks for low power.

Lots of experience in chip design. I don't see why they can't create an ARM-Core competitor

Ah yes, all those massive commercial success stories that Intel has had when it tried to produce a non-x86 chip, like the iAPX 432, the i860, the Itanium. The closest they came was XScale, and they sold the team responsible for that to Marvell.

They can start from scratch. Unlike ARM there is no need to legacy support or backward compatibility.

Intel has two advantages over their competition: superior process technology and x86 compatibility. Your plan is that they should give up one of those?

They have produced great x86 compilers for years, so producing a new compiler for a new chip shouldn't be too difficult since they are already experienced with x86 and Itanium

Hahahaha! Spoken like someone who has never been involved with compiler design or spoken to any compiler writers. Tuning a compiler for a new architecture is not a trivial problem.

Re:Intel Softcores (1)

grumling (94709) | more than 2 years ago | (#38085364)

The ultimate nerd tablet would be nice... Triple boot Linux, Android or Windows depending on what you want to run.

Re:Intel Softcores (1)

pmontra (738736) | more than 2 years ago | (#38087094)

It would run all three of them at the same time: Android for answering calls and managing the small screen, Linux for the large screen you attach over HDMI, and Windows... Mmm, in my case it would be for testing sites with IE but not much else. For many people it would be for gaming.

Now for step 2 (2)

davidbrit2 (775091) | more than 2 years ago | (#38084714)

Add some x86 optimizations to the battery.

Re:Now for step 2 (4, Funny)

MysteriousPreacher (702266) | more than 2 years ago | (#38084944)

Does doubling its size count as an optimization?

They are hungry... (0)

Anonymous Coward | more than 2 years ago | (#38084852)

...Intel also wants a piece of the pie. Intel.Atom.Ant awayyy

Makes sense. (1)

sgt scrub (869860) | more than 2 years ago | (#38084922)

If you want a software platform to be able to build native code for your hardware, you add code to their software base.

Android Distributed on 802.15.4 (2)

Doc Ruby (173196) | more than 2 years ago | (#38085066)

What's more interesting to me are the Android@Home announcements (from Google I/O 2011) that Google is implementing its own networking stack (instead of ZigBee) on 802.15.4 [wikipedia.org]. 802.15.4 is a very low-power, low-level radio network, with cheap embedded microcontrollers that are often ARM. There's probably not enough power in a node's ARM to run Android, but some nodes could have extra power and extra ARM cores that do run Android.

Android's Java means in addition to network RPC, code can be straightforwardly programmed to safely migrate around the network for distributed local execution near the data, whether that's network metadata, sensor data, or just the power of massively parallel distribution. I wonder whether JavaSpaces or something like it (probably a very lite version) will find a fit in making cheap distributed networks represented in computational tuplespace. Distributed around one's home, office/classroom or car, or among one's clothing (daily worn watch/jacket/shoes/belt/keyring), or eventually merging among those personal spaces as they're either near or just related (linked by the Internet).

Intel's x86 architecture still has too much power consumption (and the legacy HW baggage that consumes it) to be a design win for this distributed architecture. By the time x86 is suitably low power, Android will probably have defined the space of these smart spaces, and the smart things in them.

FWIW, there are still few details of A@H, though supposedly there is a reference implementation (a network backbone embedded in LED bulbs). Anyone seen any specs, like whether it's really a SNAP/6LoWPAN hybrid, or which specific alternative Google is now pushing? Where to get the devkits (HW and SW)?

Oh Java to the rescue (2)

thammoud (193905) | more than 2 years ago | (#38085120)

Not very popular on /., but Android being Java-based will make life very easy for Intel to crack the mobile market. Most of the apps (sans native ones) will just work. It would have been almost impossible otherwise without some serious virtualization.

Re:Oh Java to the rescue (0)

Anonymous Coward | more than 2 years ago | (#38085224)

Java isn't special in this regard. Objective-C made it possible for many apps to "just work" when Apple moved from PowerPC to Intel.

The serious "virtualization" still needed is a port of Dalvik to x86, and a port of the Android runtime libraries to x86.

Re:Oh Java to the rescue (1)

oakgrove (845019) | more than 2 years ago | (#38088126)

still needed is a port of Dalvik to x86, and a port of the Android runtime libraries to x86.

This was done eons [android-x86.org] ago.

x86 Android? (0)

Anonymous Coward | more than 2 years ago | (#38085168)

LOL. Tits on a bull much?

Intel Software (1)

sunderland56 (621843) | more than 2 years ago | (#38085182)

Intel isn't just a chip maker (it has oodles of software experience)

Has Intel ever done any software other than to boost hardware sales?

Sure, they write lots of software, but they *are* just a chip maker.

Re:Intel Software (0)

Anonymous Coward | more than 2 years ago | (#38085436)

Intel itself is a hardware/foundry company, but they have acquired Wind River, which does a lot of embedded OS work.

Re:Intel Software (1)

TrueSpeed (576528) | more than 2 years ago | (#38085962)

Well, they make Havok, which is the most popular middleware physics engine in the world and is used by a majority of console games.

Re:Intel Software (0)

Anonymous Coward | more than 2 years ago | (#38087704)

Intel bought Havok. They didn't create it.

Easy virtualization on Android! (0)

Anonymous Coward | more than 2 years ago | (#38085210)

An x86 arch will mean that Android will be a great virtualization platform in the future. It's already being done with Android, but with x86... much, much easier!

VMware... where are you?

dual x86/ARM, 'fat' binaries - cool (1)

nurb432 (527695) | more than 2 years ago | (#38085898)

Yay... universal binaries again, like Apple had the foresight to do but then later quit. (No, that was not sarcasm, just disappointment.)

Would be an exercise in uselessnes (2)

Makawity (684480) | more than 2 years ago | (#38086486)

I bet everybody thinks about the Android Market and all the cool stuff there. Well, don't, unless your Android runs ARM.

I recently got my hands on an Android MIPS phone. Extremely frustrating experience -- two of every three downloads from the Market simply refuse to install, because they have some tiny snippet or library compiled to ARM native code. Unless Intel invests heavily in getting app developers to recompile their work for Android/x86, it will be barely usable outside of the base system.
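
Developers who must ship native code can at least degrade gracefully on ABIs they didn't build for, by loading the optional library at runtime and falling back to a portable path when it is missing. A sketch of that pattern (the library and symbol names below are made up):

    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*decode_fn)(const char *path);

    /* Try the optional native fast path; NULL means this ABI lacks it and
     * the caller should fall back to a portable (e.g. pure-Java) path. */
    static decode_fn load_fast_decoder(void) {
        void *lib = dlopen("libfastdecode.so", RTLD_NOW);   /* hypothetical lib */
        if (!lib) {
            fprintf(stderr, "no native decoder for this ABI: %s\n", dlerror());
            return NULL;
        }
        return (decode_fn)dlsym(lib, "fast_decode");        /* hypothetical symbol */
    }

Apps that bundle the native part as a hard requirement are exactly the ones the Market refuses to install on non-ARM devices.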

Re:Would be an exercise in uselessness (1)

oakgrove (845019) | more than 2 years ago | (#38088176)

Sadly this is true. I have the same experience when I try to install many Market apps in my Android virtual machine. Some work, many don't. To all current and potential Android devs: do it in Java if at all possible.

How about actually exposing the RISC architecture (1)

Deliveranc3 (629997) | more than 2 years ago | (#38088056)

It's not like Android is going to run on top of Windows.

Let the Linux kernel loose, Intel.