
Android ICS Will Require 16GB RAM To Compile

timothy posted about 3 years ago | from the all-that-dessert-makes-you-sluggish dept.

Android 357

ozmanjusri writes "New smartphones may be lightweight, compact objects, but their OSs are anything but. Ice Cream Sandwich will need workstations with no less than 16 GB RAM to build the source code, twice the amount Gingerbread needed. It will take 5 hours to compile on a dual quad-core 2+GHz workstation, and need 80GB disk space for all AOSP configs. Android developers are also being warned to be cautious of undocumented APIs: 'In almost every case, there's only one reason for leaving APIs undocumented: We're not sure that what we have now is the best solution, and we think we might have to improve it, and we're not prepared to make those commitments to testing and preservation. We're not claiming that they're "Private" or "Secret" — How could they be, when anyone in the world can discover them? We're also not claiming they're forbidden: If you use them, your code will compile and probably run.'"


Of Course. (4, Funny)

Frosty Piss (770223) | about 3 years ago | (#37813176)

Nobody will ever need more than 16GB...

Re:Of Course. (1)

hedwards (940851) | about 3 years ago | (#37813184)

16GB is an awful lot of RAM, I'm really curious as to what it is that they're doing that's going to require more RAM than most of these devices have in total storage space.

I get that optimizations take memory and that there are likely independent steps, but still 16GB of RAM?

Re:Of Course. (3, Informative)

lolcutusofbong (2041610) | about 3 years ago | (#37813200)

probably using the -pipe CFLAG.

Re:Of Course. (1)

iluvcapra (782887) | about 3 years ago | (#37813216)

16GB is an awful lot of RAM, I'm really curious as to what it is that they're doing that's going to require more RAM than most of these devices have in total storage space.

These are the hardware requirements to compile the complete AOSP Android system and platform, and not the requirements to merely develop an application on it.

Re:Of Course. (2)

a_n_d_e_r_s (136412) | about 3 years ago | (#37813382)

Looking at the article, the 16GB is because they compile the code in parallel and so need lots of memory. That gets the 5 hours of build time down to 25 minutes.

Re:Of Course. (0)

DigiShaman (671371) | about 3 years ago | (#37813464)

It will take 5 hours to compile on a dual quad-core 2+GHz workstation

To me, that sounds like it takes 5 hours after compiling the code in parallel. So if it was a single threaded compilation job, in theory, the task would take much much longer.

Re:Of Course. (5, Informative)

evilviper (135110) | about 3 years ago | (#37813712)

To me, that sounds like it takes 5 hours after compiling the code in parallel. So if it was a single threaded compilation job, in theory, the task would take much much longer.

Yes, it does SOUND that way... It's very "truthy" that way...

Relying on /. summaries just makes you look like an idiot when you're one quick and easy click away from the source. Surely, if you can't be bothered to put in that much effort, you must not have enough time to write up a response either...

Verbatim quote from TFA:
    "5+ hours of CPU time for a single build, 25+ minutes of wall time"
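The distinction quoted there is easy to see with the shell's `time` keyword: "real" is wall-clock time, while "user"+"sys" is CPU time summed across all cores, so a parallel job's CPU time can far exceed its wall time. A toy demonstration, nothing to do with the Android build itself:

```shell
# "real" = wall-clock time; "user"+"sys" = CPU time summed over cores.
# Four CPU-bound loops run in parallel, so on a multi-core machine the
# reported user time can be several times the elapsed real time.
burn() { i=0; while [ "$i" -lt 200000 ]; do i=$((i+1)); done; }

time ( burn & burn & burn & burn & wait )
```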

Re:Of Course. (0)

vanDrunen (1075573) | about 3 years ago | (#37813218)

16GB is nothing; just playing with large amounts of TNT in Minecraft requires that. An average server has 32+ GB when hosting some decent VMs. Windows Server with Exchange will eat up 16GB in no time as well.

16GB RAM and GCC optimization (2)

Zan Lynx (87672) | about 3 years ago | (#37813226)

This is a guess as to the reason.

One of the better ways to optimize C++ code for building with GCC is to put all of the source code into one big code file. Or you can build it as a few independent modules, but the code is still quite large. Then you build it with the O3 flags. In GCC, the amount of RAM and CPU used in an O3 compile goes up by quite a lot as the code size in a single module increases. I am not sure what the exact equation is but I think it's an exponential function.

This would easily explain the RAM and CPU usage.

Re:16GB RAM and GCC optimization (1)

AVonGauss (1001486) | about 3 years ago | (#37813294)

One of the better ways? A more accurate description would be: one of the lazier ways, one that often promotes continued inefficiency and bad design. Even with a massive disk cache, I can't think of any good reason it would take 16 GB to compile anything.

Re:16GB RAM and GCC optimization (3, Informative)

bucky0 (229117) | about 3 years ago | (#37813482)

No, you can perform better optimizations if you know, for instance, that a function can be inlined. You can't get that if some of the uses are in other compilation units.

Re:16GB RAM and GCC optimization (0)

AVonGauss (1001486) | about 3 years ago | (#37813666)

Inlining 12 GB (conservatively) worth of functions, that's the argument? Really?

Re:16GB RAM and GCC optimization (2)

exomondo (1725132) | about 3 years ago | (#37813528)

I can't think of any good reason that it would take 16 GB to compile anything.

Well, how much RAM would you think should be needed to compile Android? If you're compressing 5 hours of CPU time into ~25 mins of wall time, then obviously your parallel compiles are going to be chewing up a lot of RAM. If you reduced the number of parallel builds, it would reduce the amount of RAM required, and also take a lot longer.

Re:16GB RAM and GCC optimization (1)

AVonGauss (1001486) | about 3 years ago | (#37813678)

My "smart" answer would be: not much more than the host operating system itself needs. However, yes, RAM can affect the number of parallel compiles, but I think it's safe to say that in this case we are not optimally limiting the number of symbols and objects the compiler needs to work with at any given time.

Re:16GB RAM and GCC optimization (1)

exomondo (1725132) | about 3 years ago | (#37813798)

My "smart" answer would be not much more than the host operating system itself needs.

And just thrash your HDD? Yeah, you could do that. In fact, you could do that with everything and never need much more RAM than the host operating system; it would, of course, be extremely slow.

However, yes, RAM can affect the number of parallel compiles

Well no, in this case the amount of RAM required is affected by the number of parallel builds.

but I think its safe to say in this case we are not optimally limiting the number of symbols and objects that the compiler needs to work at any given time.

Conserve RAM or conserve time? I think the latter given the nature of the task.

Re:16GB RAM and GCC optimization (1)

shutdown -p now (807394) | about 3 years ago | (#37813656)

Think about how much time and memory a complete escape analysis for several hundred megabytes of C++ code would take.

Re:16GB RAM and GCC optimization (1)

AVonGauss (1001486) | about 3 years ago | (#37813690)

Is there a reason we would analyze several hundred megabytes of C++ code, at once?

Re:16GB RAM and GCC optimization (5, Informative)

Intropy (2009018) | about 3 years ago | (#37813820)

Yes, there certainly are. The most obvious reason is code optimization. If your target device is something relatively light on resources like a mobile phone, then you probably want to optimize very aggressively. All forms of optimization require context. For something like "false && statement" all the required context for optimizing away the statement is very nearby. Something like the return value optimization [wikipedia.org] needs to know about the entire function. So far we're considering the easy stuff. If you want to go all out and get into whole program optimization [wikipedia.org] then some optimizations cannot be guaranteed to be safe without knowing the entire program.

Now if "compile" refers to the entire build process, then we're also probably talking about some serious static analysis. Checking for things like "can this function ever throw?" or "is this code reachable?" or "is the memory allocated here always eventually freed?" also requires an awful lot of context to check. In the worst case each of these questions requires knowing all of the code to answer.

Re:16GB RAM and GCC optimization (1)

wmac1 (2478314) | about 3 years ago | (#37813782)

It isn't? I upgraded my PC to 4G of RAM just a few weeks ago.

BTW, I do a lot of software development and also academic simulation (using Matlab and my own simulation software, which handles hundreds of thousands of intelligent autonomous agents). I am a computer science researcher.

Re:16GB RAM and GCC optimization (1)

UnknownSoldier (67820) | about 3 years ago | (#37813816)

Maybe after you've actually worked on a professional C/C++ compiler you'll be able to point out BOTH the positives and negatives of Unity/Bulk Builds. Until then: just because YOU are ignorant of Unity/Bulk Builds and have obviously never used them doesn't mean other people would be willing to trade their 5 min compile + 2 min link times for 1+ hr builds using the inefficient precompiled-header approach.

Re:16GB RAM and GCC optimization (2)

Rich0 (548339) | about 3 years ago | (#37813452)

I run Gentoo and usually run make with -j5 on a tmpfs, and I've never managed to use even half of my 8GB RAM building anything from chromium to firefox to openoffice. And I certainly don't skimp on my CFLAGS.

Maybe if you build this thing on a tmpfs and run -j50 or something you'd need that kind of RAM, but seriously...

Plus, since parallel make tends to limit itself to a single module at a time in most build systems it is hard to get the parallelism to be all that high anyway.

I'll take them at their word, but I suspect that you'd be able to build android with a lot less than 16GB if you aren't running so highly parallelized. For starters I certainly don't have 8 cores to throw at it...
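The RAM/parallelism coupling described above can be made explicit when picking a -j value: cap it by both core count and available memory. A heuristic sketch (the ~1 GB-per-job figure is an assumption, not an AOSP number, and the /proc parsing is Linux-specific):

```shell
# Cap make's parallelism by CPU count AND by available RAM, since each
# concurrent compile job holds its own working set.
# (~1 GB per job is an assumed heuristic, not a measured AOSP figure.)
cores=$(nproc)
avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
per_job_kb=$((1024 * 1024))       # assume ~1 GB per compile job

jobs=$((avail_kb / per_job_kb))
[ "$jobs" -gt "$cores" ] && jobs=$cores
[ "$jobs" -lt 1 ] && jobs=1

echo "make -j$jobs"
```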

Re:Of Course. (4, Interesting)

fuzzyfuzzyfungus (1223518) | about 3 years ago | (#37813260)

I have to wonder if the 16GB "requirement" is more of a recommendation and/or a bunch of default settings that deliberately avoid the disk as much as possible, and keep as many cores as you can throw at the job busy by compiling every little bit and piece in parallel...

On the one hand, with 16GB of desktop/light-workstation RAM (4x4GB) only running around $100 (and the more workstation-friendly 2x8GB with ECC only twice that), it seems rather pointless to burn any developer time trying to optimize the RAM needs of building the entire OS. RAM is cheap.

On the other hand, I have to wonder what they could possibly be doing to the process of compiling what is basically a weird-but-not-unrecognizable linux distro to make it that RAM hungry.

Re:Of Course. (1)

OeLeWaPpErKe (412765) | about 3 years ago | (#37813286)

It's probably the amount of RAM required to have every line of source code + one device image file + gcc + java + linker + ... all in RAM at the same time.

It'll probably compile in 640K of RAM. It just won't be finished by Christmas.

Re:Of Course. (5, Interesting)

mjwx (966435) | about 3 years ago | (#37813304)

I have to wonder if the 16GB "requirement" is more of a recommendation and/or a bunch of default settings that deliberately avoid the disk as much as possible

I have to wonder if the 16 GB requirement is real.

Reading the blog linked in the summary, there is no source mentioned. The author completely fails to mention how they came across this information, and that's before even getting to the bad English (obviously not the author's first language).

I think I'll wait for a more trustworthy source to confirm or deny this.

Re:Of Course. (1)

SomePgmr (2021234) | about 3 years ago | (#37813416)

Indeed, and Google searches only return that blog post (with no sources), Slashdot, and Slashdot reposts. It's like Wikipedia-style circular references.

Re:Of Course. (3, Informative)

kidgenius (704962) | about 3 years ago | (#37813516)

Here's the original source over at Google Groups from JBQ http://groups.google.com/group/android-building/browse_thread/thread/3757b189f4e93df0?hl=en&pli=1 [google.com]

Re:Of Course. (5, Informative)

PopeRatzo (965947) | about 3 years ago | (#37813576)

And if you read that original source, you'll see that they are recommendations for building future development machines:

- 6GB of download.
- 25GB disk space to do a single build.
- 80GB disk space to build all AOSP configs at the same time.
- 16GB RAM recommended, more preferred, anything less will measurably benefit from using an SSD.
- 5+ hours of CPU time for a single build, 25+ minutes of wall time, as measured on my workstation (dual-E5620 i.e. 2x quad-core 2.4GHz HT, with 24GB of RAM, no SSD).

Re:Of Course. (1)

chrb (1083577) | about 3 years ago | (#37813718)

Looking at those specs, maybe it's about time to think about switching Android to a modular architecture. There is no reason the complete build needs to be made in one go. It's like running Gentoo and compiling the entire system from source when you really just want to upgrade one application. Or like the old OpenOffice days, when they bundled everything, so compiling it required compiling every library it depended on, plus Python and everything else that was embedded. Building and distributing a whole new system image and whacking it over a partition seems crazy in this day and age. What Android needs is something like .deb packaging and a proper package repository. Honestly, I'm surprised the Cyanogen guys haven't done it yet; it would make their job of getting rapid, incremental updates out much easier.

Re:Of Course. (0)

Anonymous Coward | about 3 years ago | (#37813654)

Agreed that it's not real (as mentioned elsewhere in these comments). The stated 8GB requirement for Gingerbread is also wrong. I build that on a VM allocated just 2GB of my system's 4GB of RAM.

Re:Of Course. (-1)

Anonymous Coward | about 3 years ago | (#37813208)

just like 640K is more memory than anyone will ever need on a computer?

Re:Of Course. (-1)

Anonymous Coward | about 3 years ago | (#37813244)

Thank you. Yes, that was the reference the OP was making.

Moores law... (1)

mulvane (692631) | about 3 years ago | (#37813188)

Next Android will need 32GB.... Then 64GB... Soon, Skynet level of resources....

Recommended, not required, right? (2)

BeforeCoffee (519489) | about 3 years ago | (#37813202)

16GB recommended, or else you'll be waiting quite a while. But you could build the thing in less RAM than that, right?

Re:Recommended, not required, right? (1)

XaXXon (202882) | about 3 years ago | (#37813404)

Yes, it can always swap out. The compiler has no visibility into whether the memory space it is executing in is actually mapped to physical ram.

But like other people said, it might take a really long time.

ps aux (1)

tepples (727027) | about 3 years ago | (#37813670)

The compiler has no visibility into whether the memory space it is executing in is actually mapped to physical ram.

If the compiler doesn't know its own resident size, then how do top and ps aux know it? I imagine that if a program detects that it's being swapped out, it might be able to adjust its CPU/memory tradeoffs at runtime.
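On Linux, at least, the numbers top and ps report come from /proc, and a process can read its own the same way. A Linux-specific sketch:

```shell
# Where top/ps get it: per-process counters live under /proc/<pid>/.
# VmRSS in /proc/<pid>/status is the resident set size, so a program
# could in principle poll its own and adapt its memory/CPU tradeoffs.
# ($$ is this shell's own PID; Linux-specific sketch.)
rss_kb=$(awk '/^VmRSS:/ {print $2}' "/proc/$$/status")
echo "resident set size: ${rss_kb} kB"
```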

Re:Recommended, not required, right? (0)

Anonymous Coward | about 3 years ago | (#37813408)

mkswap is your friend.

Sure... (1)

STratoHAKster (30309) | about 3 years ago | (#37813204)

This is of course if you don't edit out that -j64 in Google's main Makefile.

Well do you want (1)

Osgeld (1900440) | about 3 years ago | (#37813210)

A pile of ram, or a pile of time as it thrashes your hard disk? Ram is cheaper than time.

Re:Well do you want (1)

tepples (727027) | about 3 years ago | (#37813682)

Does compressing the swap file, as seen in Connectix RAM Doubler and compcache [google.com] , help any?

16 Gigabytes RAM costs $100 (0)

Anonymous Coward | about 3 years ago | (#37813220)

Seriously, you're complaining that you need $100 of RAM?

Re:16 Gigabytes RAM costs $100 (5, Funny)

Anonymous Coward | about 3 years ago | (#37813266)

...and a computer that can take 16 gigs of RAM.

I mean, you can spend $100 and buy a 16-inch horse cock dildo, but that doesn't mean you can shove the whole thing up your ass.

Re:16 Gigabytes RAM costs $100 (3, Funny)

Anonymous Coward | about 3 years ago | (#37813334)

I mean, you can spend $100 and buy a 16-inch horse cock dildo

I'll leave that to you. Interesting that you knew the price off the top of your head, though.

Re:16 Gigabytes RAM costs $100 (1)

petermgreen (876956) | about 3 years ago | (#37813362)

and a computer that can take 16 gigs of RAM.

Indeed, though support for 16GB is nothing unusual these days. If your desktop was made in the last couple of years and wasn't bottom of the barrel, it will probably support 16GB.

Re:16 Gigabytes RAM costs $100 (1)

Osgeld (1900440) | about 3 years ago | (#37813388)

Bingo. Dev workstations are not $199 Acers bought at K-mart.

Re:16 Gigabytes RAM costs $100 (1)

earls (1367951) | about 3 years ago | (#37813612)

Speak for yourself. I pay less and take more. Practice makes perfect.

Re:16 Gigabytes RAM costs $100 (1)

kelemvor4 (1980226) | about 3 years ago | (#37813270)

You beat me to it. It's not like 16GB is really asking that much for a dev workstation. I've got 24GB in mine currently.

Re:16 Gigabytes RAM costs $100 (0)

Anonymous Coward | about 3 years ago | (#37813384)

I wish.
Where I work (at a major semiconductor company) everyone is required to use laptops. Crappy Dell laptops: slow-ass 80GB hard drives, 4GB of RAM, running Windows XP. We might get Win7 in six months or so. Did I mention I need to compile Android every now and then? Under VirtualBox? Sigh. At least I think we'll get new laptops for Win7. We're a large international company, so they want to be able to put us on a plane to visit other offices and take our development machines with us. It would be so much easier if they gave us big honkin' Linux boxes to do our work on and a cheap Windows laptop to VPN in remotely when necessary.

Re:16 Gigabytes RAM costs $100 (1)

vanDrunen (1075573) | about 3 years ago | (#37813274)

Sucks if your motherboard doesn't support 16GB RAM or if you have a laptop.

Re:16 Gigabytes RAM costs $100 (2)

izomiac (815208) | about 3 years ago | (#37813492)

16 GB is far more than any desktop user should need, and most laptops simply cannot hold that much, so it's creating a sharp demarcation between user and developer. This is bad. You want your advanced users to naturally transition into becoming developers, and making your codebase inaccessible to them prevents that.

IOW, most people have suggestions for improvement for any tool they use. Ideally, it would be trivial for someone to download the source, modify it, recompile, test, and submit improvements. People start with simple things (e.g. misspelled words) and move to more advanced tasks as they gain familiarity. By requiring several hundred dollars of hardware and massive time investments, you are ensuring that users never become developers, just needy consumers whining about feature requests.

Re:16 Gigabytes RAM costs $100 (1)

0123456 (636235) | about 3 years ago | (#37813812)

16 GB is far more than any desktop user should need, and most laptops simply cannot hold that much, so it's creating a sharp demarcation between user and developer.

I have 8GB in my development system at work. With two copies of Eclipse sucking up a gigabyte each, if I try to compile my C++ software without shutting down the old version, I go over 8GB and start to swap.

16GB would definitely be beneficial; I put 16GB of 1.6GHz DDR3 in my home server when I built it earlier this year and it cost under $150. ECC RAM for the work system would cost more, but would probably pay for itself in a few weeks.

What Slashdot wants to know (0)

Anonymous Coward | about 3 years ago | (#37813276)

tabletroms
what are troms? and what do they have to do with tables?

Re:What Slashdot wants to know (0)

Anonymous Coward | about 3 years ago | (#37813420)

tabletroms

what are troms? and what do they have to do with tables?

It's Tablet Roms obviously.

So that's Google's master plan (5, Funny)

Anonymous Coward | about 3 years ago | (#37813280)

While Android will remain open-source, eventually it will require so much RAM/processing power/etc. to compile that only Google will have the computational resources available to compile it.

Clever!

Rookie question on debugging monster code bases (3, Interesting)

Twinbee (767046) | about 3 years ago | (#37813282)

Quick question for those with giant codebases such as this: how the heck do you test and debug the software with those kinds of lag times? Do you split everything up into smaller pieces or something? If so, then surely there are cases where you need to test something that requires EVERYTHING to be compiled. I can imagine such shot-in-the-dark scenarios being the stuff of pure nightmares.

Re:Rookie question on debugging monster code bases (0)

Anonymous Coward | about 3 years ago | (#37813336)

If they're using make, you only end up recompiling the bits you changed. If you really need to recompile everything, you're SOL of course :-P , but typically, you only need to recompile and test a small number of files.

Re:Rookie question on debugging monster code bases (0)

Anonymous Coward | about 3 years ago | (#37813392)

ccache is a compiler cache that operates on the preprocessed files, so assuming there are no defines in a header file that actually change anything, it can still speed things up by using the previously compiled object file.

Recompile *should* be much, much faster (3, Informative)

dwheeler (321049) | about 3 years ago | (#37813338)

Unless the build system is screwed up, recompiling after a change should be relatively fast. Usually source code is stored as lots of smaller files, and each file is compiled separately to produce a separate object file (e.g., .o). Then next time a rebuild is requested, the system should notice what changed, and only rebuild the needed parts. Some parts take the same time each time (e.g., a final link), but it shouldn't take anywhere near the same amount of time. There are lots of build tools, including make, cmake, and so on. If you use the venerable "make" tool, you might want to read Miller's "Recursive Make Considered Harmful": http://aegis.sourceforge.net/auug97.pdf [sourceforge.net] Cue the lovers and haters of "make", here :-).

Also distributed compiling (1)

Sycraft-fu (314770) | about 3 years ago | (#37813386)

In game programming, Incredibuild is a common tool for that. You run it on everyone's machine and it integrates with Visual Studio. Lets you reduce build time a ton since you have a lot of resources to use. Also tends to scale nicely as the larger the project, the more people working on it and thus the more computers available and so on. You can, of course, have dedicated servers just for compiling but many places don't bother, just having it use idle time from office systems as it is amazing how much that can add up to. Particularly since it is very rare for all devs to compile something at once.

Build battle scars? (5, Funny)

ben_kelley (234423) | about 3 years ago | (#37813568)

Unless the build system is screwed up, recompiling after a change should be relatively fast. Usually source code is stored as lots of smaller files [...] Then next time a rebuild is requested, the system should notice what changed, and only rebuild the needed parts.

I feel your pain brother.

Re:Recompile *should* be much, much faster (0)

Anonymous Coward | about 3 years ago | (#37813578)

That's mostly irrelevant. During development, rebuilding only what has changed saves time. But that's the ONLY time you do that. Any sort of official build is always from scratch.

Re:Rookie question on debugging monster code bases (0)

Anonymous Coward | about 3 years ago | (#37813348)

i remember when you could make world on an 80486 dx 75mhz with a 100 mb hdd and 64 mb of ram.

Re:Rookie question on debugging monster code bases (2)

tchuladdiass (174342) | about 3 years ago | (#37813378)

Back in my college days we had to submit a compilation job on the mainframe, then wait around for a couple of hours for someone to put the printout containing the results (or, more likely, a crash dump) into the appropriate mailbox slot. Then you had to wait your turn to submit a revised copy. (No, this wasn't that long ago: '89, '90, something like that. But the community college I went to still taught its Cobol & assembly classes on an older mainframe. 3270 terminals, though, no punch cards.)

But in the case of Android, remember that all the components are still separate -- you have the Dalvik VM, the Linux kernel, and libraries as probably the large components. So you can still debug any particular program module independently.

Re:Rookie question on debugging monster code bases (1)

epine (68316) | about 3 years ago | (#37813662)

No, this wasn't that long ago -- 89, 90, something like that

Seems pretty unlikely unless you were in a deep backwater. Interactive terminals became commonplace very early in the 1980s. It wasn't uncommon to work on a batch processing system until the mid 1980s, but not with results delivered on paper.

The development model you're talking about properly dates to the 1960s and into 1970s, in backwaters where the future had yet to penetrate. FFS, the Xerox Alto [wikipedia.org] was introduced in 1973.

Sixteen years later, you're still on a line printer development loop? I think your college needed a shot of Future Lube if your dates are accurate.

Re:Rookie question on debugging monster code bases (2)

Osgeld (1900440) | about 3 years ago | (#37813406)

As others have said, you don't recompile the entire thing because you changed one integer. But as others have not said: you really should be testing in smaller chunks. You are not perfect enough to vomit out something that takes 5 hours of CPU time (which on the given systems is about half an hour of real time) perfectly on the first try.

It's much easier to write a chunk and make sure it works than to write a freaking monster blob and go hunting for a chain reaction.

Re:Rookie question on debugging monster code bases (0)

Anonymous Coward | about 3 years ago | (#37813560)

What is this testing you speak of? Compile==Ship!

Re:Rookie question on debugging monster code bases (0)

Anonymous Coward | about 3 years ago | (#37813810)

1) Distributed build systems.
    Assuming your build consists of parallelizable components (nearly all builds do), just buy a build cluster and use that. There are systems out there that will autodetect dependencies and analyze serialization so you can get the build time down significantly. Nearly all cell phone manufacturers use the same distributed build technology. I work for the company that produces it, so I'll avoid naming it so I don't seem like a shill.

2) Incremental builds.
    Only build what changed. This works well if you aren't refactoring stuff frequently. It tends to fail pretty horribly if you change a common shared file or are frequently syncing a rapidly changing source tree.

3) Subbuilds.
    Only build the branch of the tree (and its dependencies) that you care about. If I'm working on the radio code, I don't care about any of the application changes that are happening. If I'm building an application interface, I generally don't care about any of the other application interfaces.

Something doesn't add up (1)

Anonymous Coward | about 3 years ago | (#37813308)

Ice Cream Sandwich will need workstations with no less than 16 GB RAM to build the source code, twice the amount Gingerbread needed.

2 * 2 = 16?

Gingerbread can be compiled with 2GB.

not true (4, Informative)

MrCrassic (994046) | about 3 years ago | (#37813324)

Here's what the article *actually* says:

16GB RAM recommended, more preferred, anything less will measurably benefit from using an SSD.

Emphasis mine. Still pretty beast, though.

Depends on how you look at it (5, Informative)

Sycraft-fu (314770) | about 3 years ago | (#37813412)

While it is a lot of RAM compared to what many systems have, it really isn't a big deal these days. 4GB DDR3 sticks are $25 or less each, and that's for high-quality RAM. Regular, consumer-grade LGA1155 boards support four of them, so for $100 you can have 16GB on a normal desktop system. The home system I type this on has 16GB for that reason. It was so cheap I decided "Why not?"

They actually can support more, with 8GB chips you can have 32GB on a standard desktop, but those are still expensive.

The enthusiast X79 LGA2011 boards coming out will have 8 sockets and thus handle 64GB. Of course, beyond that there are workstation boards, which cost a lot more, but not as much as you might first think.

At any rate, 16GB is now a "regular desktop" amount of RAM. Standard boards the likes of which you get in cheap ($1000 or less) towers support that much, and it only costs $100 to get. It is quite a realistic thing to require, for something high end.

Re:Depends on how you look at it (1)

Rich0 (548339) | about 3 years ago | (#37813474)

At any rate, 16GB is now a "regular desktop" amount of RAM.

Well, it is an amount of RAM you could cram into a brand-new regular desktop, but it certainly isn't something you'd find on an average desktop. I think I have two slots free in mine, so I could bump it up to 16GB, but that's $50 I don't really need to spend. I rarely use more than half of my RAM as it is, though the extra obviously helps with caching/etc.

Android has always been RAM-intensive, and it makes sense since you have no choice but to build an entire OS at once (not like you can dynamically link it to your desktop's libraries). Just building something like chromium takes a ton of CPU.

Still, not looking forward to this... :)

Re:Depends on how you look at it (1)

Lehk228 (705449) | about 3 years ago | (#37813490)

My ~1.5-year-old desktop has 8 gigs and an SSD, but I'm crazy like that.

Recompile only the parts that changed (1)

tepples (727027) | about 3 years ago | (#37813714)

Android has always been RAM-intensive, and it makes sense since you have no choice but to build an entire OS at once

Can't you do the initial build overnight, come back the next day, do more development, and recompile only the parts that changed?

Re:Depends on how you look at it (1)

KingMotley (944240) | about 3 years ago | (#37813608)

My nearly two-year-old desktop has 12GB of RAM and can take 24GB (i7 930-based system). My next build, which should happen in December or January, will likely have 32GB, expandable to 96GB (dual-socket X79).

5 hours of CPU time != 5 hours of wall time (1)

jlebar (1904578) | about 3 years ago | (#37813342)

From TFA:

5+ hours of CPU time for a single build, 25+ minutes of wall time, as measured on a workstation (dual-E5620 i.e. 2x quad-core 2.4GHz HT, with 24GB of RAM, no SSD).

Not so bad... (1)

msevior (145103) | about 3 years ago | (#37813374)

TFA says:

5+ hours of CPU time for a single build, 25+ minutes of wall time, as measured on a workstation (dual-E5620 i.e. 2x quad-core 2.4GHz HT, with 24GB of RAM, no SSD)

25 minutes of wall time is nothing for a first build. After that, rebuilds from changes in the source code will be trivial.

25 minutes to build a complete Linux distro is fantastic.
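
The two figures are consistent, too. A dual E5620 exposes 16 hardware threads, so 5 hours of CPU time spread across all of them works out to roughly 19 minutes of ideal wall time; the observed 25+ minutes implies about 75% parallel efficiency, which is plausible for a build with serialized link steps. The back-of-envelope arithmetic:

```python
cpu_hours = 5.0      # total CPU time for a full build, per TFA
threads = 16         # dual E5620: 2 sockets x 4 cores x 2-way HT

ideal_wall_min = cpu_hours * 60 / threads      # perfect scaling
print(round(ideal_wall_min, 2))                # 18.75 minutes

observed_wall_min = 25.0                       # per TFA
efficiency = ideal_wall_min / observed_wall_min
print(round(efficiency, 2))                    # 0.75
```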

shitty /. summary (5, Informative)

petermgreen (876956) | about 3 years ago | (#37813380)

TFA: "5+ hours of CPU time for a single build, 25+ minutes of wall time, as measured on a workstation (dual-E5620 i.e. 2x quad-core 2.4GHz HT, with 24GB of RAM, no SSD)."
/. Summary: "It will take 5 hours to compile on a dual quad-core 2+GHz workstation"

Re:shitty /. summary (0)

Anonymous Coward | about 3 years ago | (#37813444)

Seriously, which editor was asleep at the wheel for this one?

Re:shitty /. summary (1)

Anonymous Coward | about 3 years ago | (#37813572)

And so, an incorrect meme is born.

"I hear it takes 5 hours and 16GB to compile ICS"

Sigh. thanks /.

This has to be BS overstating.... (0)

Anonymous Coward | about 3 years ago | (#37813414)

I built Gingerbread on a 4-year-old Mac with 4GB. It took a while, but build times are being reported [google.com] to take ~25 mins.

Hmmm... (2)

damn_registrars (1103043) | about 3 years ago | (#37813418)

I was looking for something else to do on my old 16-CPU Itanium cluster with 64GB of shared RAM. I think I just found it...

Re:Hmmm... (1)

afabbro (33948) | about 3 years ago | (#37813438)

Android builds on Itanium?

Re:Hmmm... (0)

Anonymous Coward | about 3 years ago | (#37813536)

ever heard of cross compiling?

Re:Hmmm... (1)

Zan Lynx (87672) | about 3 years ago | (#37813538)

Cross compilers. Same thing as building it on a Xeon. Neither one is an ARM.

Why they can test if it takes 5 hours to compile (0)

Anonymous Coward | about 3 years ago | (#37813446)

http://google-engtools.blogspot.com/2011/06/build-in-cloud-accessing-source-code.html
Google does iterative builds, meaning small changes take minimal amount of time to compile. If you have to build from scratch though, it can still be a big headache...

Have I heard this before? (-1, Troll)

rickb928 (945187) | about 3 years ago | (#37813496)

"In almost every case, there's only one reason for leaving APIs undocumented: We're not sure that what we have now is the best solution, and we think we might have to improve it, and we're not prepared to make those commitments to testing and preservation. We're not claiming that they're "Private" or "Secret" — How could they be, when anyone in the world can discover them? We're also not claiming they're forbidden: If you use them, your code will compile and probably run."

Honestly. Undocumented APIs are either an oversight or deliberately not disclosed. So Google is not disclosing these. Fine. And they are explaining this with an extraordinary story about how they may or may not, will or won't... what?

Is it evil to be duplicitous? Or is Google just being lame?

Re:Have I heard this before? (0)

Anonymous Coward | about 3 years ago | (#37813600)

Oh hey, Java has undocumented APIs: everything in sun.*. glibc has undocumented functions too.

Just maybe android also has APIs that are implementation details too.

But no, it must be a conspiracy, or they're just a bunch of mouthbreathing derps. Everyone on slashdot is always a superior genus of being who could do better if they lowered their mighty selves to bother.

Re:Have I heard this before? (1)

blackraven14250 (902843) | about 3 years ago | (#37813632)

....can't you believe that they're deliberately undisclosed because they don't want to support them in any fashion, as they stated?

Re:Have I heard this before? (2)

darkonc (47285) | about 3 years ago | (#37813704)

If the undocumented API changes or disappears, be ready to either (1) change your code, or (2) emulate the old API. Nothing nefarious -- just too damned lazy to document something that might be unstable.

Human nature -- If you document it, people will expect it to be stable (no matter what you may say to the contrary). Undocumented API's have a built-in "we told you so" flavour to them.
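
One way to act on that advice is to route every use of an undocumented call through a single adapter of your own, with the emulation as the fallback, so a future release breaking the hidden API costs you one function rather than a codebase-wide hunt. A hedged sketch in Python (the hidden attribute name is invented for illustration; on Android the real equivalent would be Java reflection against a hidden class):

```python
def emulate_old_behavior(data):
    """Our own implementation of what the hidden helper used to do."""
    return sorted(data)

def stable_sort_adapter(module, data):
    """Use the undocumented helper if this build still ships it, else fall back."""
    hidden = getattr(module, "_hidden_sort", None)  # hypothetical undocumented name
    if callable(hidden):
        return hidden(data)
    return emulate_old_behavior(data)
```

If the hidden attribute vanishes in the next release, the adapter silently switches to the emulation, which is options (1) and (2) above collapsed into one place.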

I wonder... (1)

levicivita (1487751) | about 3 years ago | (#37813580)

...does a phone really require an OS of that complexity? Don't get me wrong, I have a current-generation Android smartphone I bought 2-3 months ago, 4G-enabled, it even has an HDMI out, and I completely understand that a modern smartphone is essentially a fully fledged computer.

That being said, it's still a phone. And in fact, it's horrible at being one. To redial the last number I have to press 3 buttons (1 physical, 2 virtual) and suffer through 3+ seconds of erratic lag. On my 4-year-old, boring but functional BlackBerry, it took under 1s, and all I had to do was press the same button twice. Hanging up, redialing, and going back to the home screen are all slow as molasses. Yes, I can browse the web, access my Google Docs, open PDFs, read books, play chess, watch YouTube HD, etc. etc.

The smart- part of the phone is great and a step forward; the -phone part is actually a big step back, in my opinion.

Re:I wonder... (1)

arth1 (260657) | about 3 years ago | (#37813724)

...does a phone really require an OS of that complexity?

It's not the OS complexity. It's the laziness of programmers, who want everything abstracted through seven layers until they can write single lines of code that, through black-box magic, expand into what they think they want done (not that they can ever know for sure).
And the poor compiler works overtime, because it has to redo the same work every time someone writes object.method().

There are reasons why so many of the kernel developers (including Linus Torvalds) are adamantly against using "higher level" languages for the Linux kernel. And it's not just because of size, but because the risks of hiding bugs combined with the nightmare of debugging the underlying code is so immense.

What's it doing in there? (1)

Animats (122034) | about 3 years ago | (#37813590)

What is it actually doing that needs 16GB of RAM to compile?

Finally, an answer! (0)

93 Escort Wagon (326346) | about 3 years ago | (#37813618)

Ice Cream Sandwich will need workstations with no less than 16 GB RAM to build the source code, twice the amount Gingerbread needed.

I've been wondering, for quite a while, exactly what patents Microsoft seemed to believe Android infringes upon - but those memory requirements are definitely Microsoft-esque!

And the Redmond folks have lots of prior art too...

So how long and what resources would win7 take? (1)

Anonymous Coward | about 3 years ago | (#37813734)

I know nothing about this sort of programming/compiling. :(

Windows (2)

file_reaper (1290016) | about 3 years ago | (#37813740)

I wonder how long a full compile of that takes...

Why? (2)

wisnoskij (1206448) | about 3 years ago | (#37813760)

Unless the entire program is in one gigantic 8-billion-billion-billion-line file, why would it need that many resources, or even be able to use 16GB of RAM?
Assuming it is like a normal program, wouldn't it just be a large collection of relatively small files that are compiled one after another (theoretically, with number-of-CPUs + 1 threads running and that many files being compiled concurrently as the optimal arrangement)?
And I just do not see how you could ever use up 16GB at any one time.
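
It isn't one giant file; it's the opposite problem. A parallel build runs one compiler process per hardware thread, each chewing on its own heavy C++ translation unit, and the peaks add up; big link steps spike higher still. Rough numbers (the per-process figures below are assumptions for illustration, not measurements from AOSP):

```python
parallel_jobs = 16       # e.g. make -j16 on a 16-thread workstation
peak_per_job_gb = 0.75   # assumed peak RSS of one heavy C++ compile job
link_spike_gb = 4.0      # assumed extra peak while linking a large binary

compile_phase_gb = parallel_jobs * peak_per_job_gb
print(compile_phase_gb)                    # 12.0 GB of concurrent compiler processes
print(compile_phase_gb + link_spike_gb)    # 16.0 GB when a big link overlaps
```

Under assumptions like these, a 16GB recommendation is just the price of keeping all 16 threads busy without swapping; build with make -j4 instead and far less RAM suffices, at the cost of wall time.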

Re:Why? (1)

JackAxe (689361) | about 3 years ago | (#37813814)

Because every byte needs a bite of an icecream sandwich. :)

A good reason to look at D instead of C++ (0)

Anonymous Coward | about 3 years ago | (#37813766)

The D programming language compiles much faster than C++; it was designed to be easier to lex and parse. Maybe, despite all the cool new features of the language (nice threading model, super-duper generic programming, optional garbage collection), something as mundane as compilation speed will be the "killer feature" that gets people to migrate to D?

If memory were still expensive... (3, Insightful)

JackAxe (689361) | about 3 years ago | (#37813788)

This article would be shocking, but considering that 16GB of memory -- especially the dual-channel DDR3 used for the i5 and consumer i7s -- is so cheap, less than $100, this article doesn't have any shock value. It's just informative: it's letting us know the 'recommended' memory and giving more nerds an excuse to add more RAM. That is, the NERDS who don't already have 24 gigs for their virtual machines. :P