



Facebook Puts 10,000 Blu-ray Discs In Low-Power Storage System

CTachyon Re:Write once? (153 comments)

Hmm, how do you backup the crypto keys?

In short: turtles all the way down.

Not 100% sure, I don't work on that system, but I strongly suspect it's by encrypting the crypto keys with a master symmetric key and replicating/backing up the encrypted ball-o-keys as needed. The master key itself lives in an HSM; backups of the HSM are handled with the usual HSM approach of "M of N physical smartcards".
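The "encrypt the keys, protect the master key" pattern described above is usually called key wrapping. Here's a minimal structural sketch in Python; the XOR-with-SHA-256 "cipher" is purely illustrative (a real system would use an AEAD such as AES-GCM, with the master key never leaving the HSM):

```python
import hashlib
import secrets

def keystream(master_key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy keystream derived from SHA-256. Illustration only, NOT real crypto."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(master_key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def wrap(master_key: bytes, data_key: bytes) -> tuple[bytes, bytes]:
    """Encrypt a per-batch data key under the master key."""
    nonce = secrets.token_bytes(16)
    ks = keystream(master_key, nonce, len(data_key))
    return nonce, bytes(a ^ b for a, b in zip(data_key, ks))

def unwrap(master_key: bytes, nonce: bytes, wrapped: bytes) -> bytes:
    """Recover the data key; only possible while the master key exists."""
    ks = keystream(master_key, nonce, len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, ks))

master = secrets.token_bytes(32)    # in practice this lives in the HSM
data_key = secrets.token_bytes(32)  # per-batch key
nonce, wrapped = wrap(master, data_key)
assert unwrap(master, nonce, wrapped) == data_key
```

The point of the structure: the wrapped "ball-o-keys" can be replicated and backed up freely, because it's useless without the master key.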

1 year, 2 days

Facebook Puts 10,000 Blu-ray Discs In Low-Power Storage System

CTachyon Re:Write once? (153 comments)

Anyone know if these burners are write-once drives?

If so, it pretty much guarantees that Facebook keeps a copy of your stuff forever, even if you "delete" it.

Where I work, we use large-scale tape backup (complete with robots), but tapes are so crappy that you basically have to treat them as write-once media anyway, so you have the same problems. (And tape drives are a consumable, but that's another story.) We solved this by encrypting each backup batch with a unique symmetric crypto key, and when a backup expires a cron job throws away the crypto key and marks the batch as "deleted" in our tape index. If all the batches present on a given tape end up deleted, only then do we bother to recall the physical tape from off-site storage and throw it out.

Has the bonus that we don't have to trust the security of our off-site storage provider.
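The scheme above is often called crypto-erasure (or crypto-shredding). A minimal sketch of the bookkeeping, with invented names; the real system is obviously tied to an actual tape index:

```python
import secrets

class TapeIndex:
    """Toy model of crypto-erasure: deleting a batch's key makes that batch
    unrecoverable even though its ciphertext is still sitting on tape."""

    def __init__(self):
        self.keys = {}       # batch_id -> symmetric key
        self.deleted = set()

    def new_batch(self, batch_id):
        self.keys[batch_id] = secrets.token_bytes(32)

    def expire_batch(self, batch_id):
        del self.keys[batch_id]   # the cron job's only real work
        self.deleted.add(batch_id)

    def tape_recyclable(self, batches_on_tape):
        """Recall the physical tape only once every batch on it is dead."""
        return all(b in self.deleted for b in batches_on_tape)

idx = TapeIndex()
for b in ("2014-01-01", "2014-01-02"):
    idx.new_batch(b)
idx.expire_batch("2014-01-01")
assert not idx.tape_recyclable(["2014-01-01", "2014-01-02"])
idx.expire_batch("2014-01-02")
assert idx.tape_recyclable(["2014-01-01", "2014-01-02"])
```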

1 year, 3 days

VC Likens Google Bus Backlash To Nazi Rampage

CTachyon Re:That's not what was said. (683 comments)

... Short answer is probably mainly because I've been unemployed for years since I walked out on a six-figure salary and a hard-walled office in the historic Xerox PARC when I left VMWare in January of 2009. Well, we'll set aside my educational two-month stint working at Wendy's, which truly was more rewarding in every way other than financially than working for VMWare or others.

Why did I do that? [...] The fact of the matter is that the 'average googler' works for a system of control.

Wait, you think tech companies are part of the "system of control" but McJobs aren't? I worked 5 years at Wal-mart and it made me want to scrub away the filth by the time I quit. Even as a lowly overnight shelf stocker, I was complicit in helping to operate one of the most exploitative companies on the face of the earth. Wal-mart has committed serious acts of economic devastation (e.g. classic monopoly abuses like dumping), has bribed and corrupted third-world governments, and has used monopsony power to price-pressure suppliers into cutting corners on product quality and into depressing workers' wages (sometimes to the point of looking the other way when suppliers use literal slave labor). And the Waltons have used their Wal-mart wealth to push an explicitly conservative (evangelical Christian, anti-abortion, anti-gay) political agenda.

I mean, fuck. VMWare mostly just overcharges for neat software you could get for free and convinces idiot CTOs to buy licenses for it. That's peanuts to Wal-mart. And even Wal-mart is barely a blip compared to some other companies I can name off the top of my head. And the fast food chains like McDonald's and Wendy's rank only slightly less evil than Wal-mart (no slavery AFAIK, but look at e.g. their quest to addict us to their food by soaking everything in sodium and saturated fat, or their recent pressure on the FDA to loosen regulations on use of diseased animals for human consumption).

On the package signing: you're doing a piss-poor job of convincing me that poor key hygiene at VMWare was nefarious NSA tinkering, as opposed to some VP with password "password1" handing down security decisions or some fool sysadmin who runs everything from a "sudo -i" shell because it's easier than learning how chmod works. Consider this: RSA, a company that sells security and only security, had their SecurID key material stolen 3 years ago because they were idiots and didn't air-gap the key material for their signature commercial product. The profit motive doesn't explain running one's business into the ground. Total ignorance of security practices (and crypto in particular) is just too common to blindly attribute every bad practice to NSA nefariousness. Even at a company like RSA, where we're 95% sure they did weaken security for NSA bribes. Frankly, it doesn't surprise me that other people perceived you as a loon -- you're not justifying your claims at all. "Snowden, therefore X" is not an argument.

1 year, 4 days

Global-Warming Skepticism Hits 6-Year High

CTachyon Re:Exactly 0% argue static climate (846 comments)

I posted about this on my G+ feed a while back; at some point, we went from being told about Global Warming to being warned about Climate Change.

The reason for that is that people equate "Global Warming" with "hot summers". That's bogus. The greenhouse effect isn't about direct sunlight; it prevents heat from escaping; therefore it affects low temperatures more than it affects high temperatures, and it affects winter more than it affects summer. The Arctic and Antarctic are the places that are changing the most drastically, and that's far removed from your average Joe's day to day "ermigahrd its sooo hot" experience.

But warming the poles more than the temperate latitudes evens out the temperature difference between them, and that has huge consequences from a weather standpoint. Temperature differences drive the jet streams; a polar jet stream is a 100mph~200mph river of air that circles the planet 5 miles up, and if you live in a temperate latitude (e.g. the US, Europe, China, south Australia) then a polar jet stream is responsible for everything nice about your weather. A polar jet stream blocks cold dry air from plunging equatorward (and warm moist air from surging poleward), and it also shepherds weather systems from west to east, forcing them to keep moving. Without a jet stream, weather would just sit in place for weeks or months at a time, causing droughts or flooding depending on whether a high pressure system or a low pressure system decided to set up shop over your head. (Either possibility is a disaster for agriculture and local ecology.)

But thanks to CO2-induced polar warming, the jet streams have been creeping equatorward a little bit each year and they've been weakening. With weaker jet streams, we can expect things like polar vortex plunges and balmy temperatures in Alaska and 15%-of-normal-rainfall droughts in California and 115 F heat waves in Australia to become regular occurrences. (These things are all happening right now, if you haven't been paying attention, and they're all a consequence of polar jet stream shenanigans, which are getting more common and more extreme as of late.)

Like the jet streams, ocean currents are also driven by temperature differences, so ocean currents will eventually start to shift if polar warming continues. That will have far-reaching consequences, because ocean currents determine evaporation rates and thus where precipitation falls, but ocean current changes are very hard to predict because we have so little data to work from. This hasn't really affected us yet, but the El Niño vs La Niña dichotomy (drought vs flooding; where you live determines which one brings which) gives a small taste of how much power the ocean has over the weather (and how big the effect will be once we do get our first permanent ocean current shifts). That awful The Day After Tomorrow film was mostly made of bogus-science-from-hell, but it was very loosely based on a real-world hypothesis that freshwater glacial melt could disrupt the thermohaline circulation that powers the Gulf Stream, the ocean current that keeps the UK and northern Europe warm. (The UK is at the same latitude as the Gulf of Alaska, suggesting it would be as cold as Alaska if the Gulf Stream were disrupted. The Gulf Stream weakened 30% from 1957 to 2005, which causes some concern.)

It's also worth noting that the changes are being buffered by the ocean, but that's not without consequence either. The ocean has been absorbing tremendous amounts of CO2, and that has seriously reduced CO2's greenhouse warming impact and bought us time before the Arctic temperature situation gets completely out of hand. But when CO2 dissolves in water it forms carbonic acid (H2CO3) -- that's why flat soda tastes disgusting: carbonation adds acidity (tartness) -- and the ocean is now acidifying so fast that coral reefs are dying en masse. The loss of biodiversity dominoes up the food chain to fish that human industry cares about. Coral reef biodiversity has also been a fruitful source of drug discovery ideas, so the pharmaceutical industry will suffer a bit from coral reef deaths too. And beyond coral, the falling ocean pH is also hurting shellfish operations because the water is now too acidic for baby oysters to mineralize their shells. The Pacific Northwest's oyster industry has started suffering from this problem in the last couple of years and is now resorting to artificial hatcheries to stay in business. Expect to see the global shellfish industry in dire straits within the next decade or two.

The situation is horrendously complicated. There's no way you could summarize it with two English words, no matter how pithy. But "Climate Change" is a little closer to the reality than "Global Warming".

1 year, 12 days

People Become More Utilitarian When They Face Moral Dilemmas In Virtual Reality

CTachyon Re:Utilitarianism is correct (146 comments)

Utilitarianism is false, because no human being can know how to globally maximize the good.

This is like saying "mathematics is false, because no human being can know if a statement should be an axiom or not". In both cases the subordinate "because" clause is trivially true, but not logically related to the independent clause it pretends to justify. Mathematics is a tool for generating models, some of which are useful for approximating how the real world behaves; utilitarianism is a subtool within mathematics that's appropriate for generating models of the part of reality we call "human morality".

They just believe they do, and then use "the end justifies the means" to commit atrocities.

Every proposed moral system has been used to justify at least an atrocity or two at some point: utilitarianism, deontology, moral relativism, moral absolutism, every goddamn religion you care to name — even Buddhism! (What the hell, right?) The truth is that people choose an action, then they justify their action by creating a post hoc story that rationalizes why the chosen action was Right, and it makes no sense to blame the justification instead of the choice.

Morality itself is a pattern in the brain that shapes what one chooses — how one resolves the balance between conflicting goals — and it's not actually an object-level belief that one can directly observe with conscious thought. If you give people books to read about object-level moral beliefs, the readers don't become more moral or less moral, they just get better at crafting post hoc justifications.

(Also, as it turns out, utilitarianism is not a great model for human behavior by itself, but it actually does pretty well if you extend it with uncertainty in the Bayesian sense. More so if you go the extra step and add causality to the model (fixing the edge cases that crop up in more naïve decision theories that treat actions as evidence). If the space of possible futures is small enough, you can even wrestle the conditional probabilities into submission, e.g. using Judea Pearl's causal networks, and get concrete answers that take that uncertainty into account — still a high bar, but more tractable than "noooo, it's not worth doing unless it's perfect". Many human behaviors that seem irrational in a Homo economicus utilitarian calculus suddenly look perfectly rational if you model the study participant as, say, a Pearlian estimator with a low computed probability for P(stranger will actually give $100 | stranger says they'll give $100 if I were to do X AND I counterfactual-do(X)).)
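The point about uncertainty can be made concrete with a toy expected-utility calculation (the numbers here are made up purely for illustration):

```python
def expected_utility(p_payout: float, payout: float, cost: float) -> float:
    """Expected value of doing X for a stranger who promises $payout,
    where p_payout is the estimated probability they actually pay."""
    return p_payout * payout - cost

# A naive Homo economicus model takes the promise at face value
# (P(payout) = 1), so refusing the offer looks "irrational":
assert expected_utility(1.0, 100.0, 20.0) > 0

# A Pearlian estimator with a low computed P(stranger pays | promise, do(X))
# can rationally refuse the exact same offer:
assert expected_utility(0.1, 100.0, 20.0) < 0
```

Same utility function, same offer; only the estimated conditional probability differs, which is enough to flip the "rational" choice.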

1 year, 23 days

Google Brings AmigaOS to Chrome Via Native Client Emulation

CTachyon Re:Lack of vision (157 comments)

Sometimes, Google just baffles me. The lack of direction in their product lines makes me shake my head.

We have several distinct software platforms:

1) Android. Development in XML with Java used as glue to hold everything together. Unless you don't. You can use standard C libraries and call the Linux kernel directly, bypassing the Dalvik Java VM.

2) Chrome browser. Development largely in javascript, again there are some obvious exceptions. Javascript is, of course, preferred because it's safer, so ChromeOS protects you by having everything done in Javascript. Except that it isn't.

3) ChromeOS. Kinda/Sorta like using the Chrome browser, except that it's not, because you are developing things that run as if they were actual clients. In Javascript. And of course, this too, is just as strictly enforced.

4) But let's not forget the 4th platform in the trio: Google's Go language is clearly a contender, and it's designed to replace C, except for a few bone-headed decisions like linking everything statically, resulting in enormous binaries. Because you really, really need to have the same library installed once for every app installed, because that way you get to recompile everything installed on your system any time a security update comes out for your favorite library. Except that, of course, there are exceptions here, too.

And most importantly, you cannot target all these platforms with any single codebase written in any language. It's like they are trying to make their product suite as difficult as just using products from multiple vendors anyway.

It's really quite simple. A lot of Google projects started from a handful of people going "you know what would be a cool idea?" and doing it with very little approval or red tape (the fabled 20% time). That's certainly the only explanation I can think of for Dart, at any rate.

Go is basically what you get when you hire a former Plan 9 developer, expose him to Google's internal hermetic build system (where a 100MiB binary is small), then let him build cool stuff to keep him from getting bored.

Disclaimer: I work at Google but do not speak for my employer. I don't work on any of the teams mentioned in your post. The information in this post is already available to the public in various places.

about a year ago

Google's Plan To Kill the Corporate Network

CTachyon Re:eh, Google no eat own dogfood? (308 comments)

I have a Chromebook and rely on SSH. There is native SSH via the Crosh terminal. It's even easier to import a key file into the Secure Shell app.

True, it does exist -- I've used it myself -- but it's incredibly awkward to use and has some real oddities when it comes to terminal emulation.

about a year ago

Google's Plan To Kill the Corporate Network

CTachyon Re:eh, Google no eat own dogfood? (308 comments)

Care to share the Distro of choice on those linux based non chromebook machines? Is it a free employee option ? Are there a set number of pre-approved distros? Is there a top-secret Google Gnu-Linux Distro that dispenses chocolates on the half hour?

Only Goobuntu is available. It's Ubuntu Precise Pangolin plus some light policy customization (internal base-install *.debs; some Puppet stuff).

about a year ago

Google's Plan To Kill the Corporate Network

CTachyon Re:eh, Google no eat own dogfood? (308 comments)

why use so many Apple computers when there's your own awesome Chromebook?

Google employee here (but I don't speak for my employer and I am basing this purely on anecdotal observation, not hard data).

I'm only familiar with my impressions from the engineering side, so I don't know much about the sales and marketing side of things, but nearly all of the engineers use Linux desktops (unless they're developing client software, like Chrome). Laptops are a different story. As a Bay Area-wide phenomenon, software engineers sure like their Macbooks, and this place is no exception. A few of us run Linux laptops, but my impression is that Macbooks outnumber Linux laptops plus Chromebooks combined. But the internal hardware requisition site is now offering the Pixel (indeed, recommending it instead of Macbooks), so this should change with time.

There's also the matter of hardware refresh cycles. The Pixel is not even a year old yet, and it hasn't been available for requisitions for its entire lifespan, so a good number of employees haven't yet had the chance to switch even if they want to. (Returned working laptops are refurbished and reused, so turning over the inventory will take longer than you might expect.) Also, lack of VPN or native SSH impeded the Chromebook's internal usefulness in the early days, but today hardly anything still requires VPN (and VPN works on Chromebooks now regardless) and the Secure Shell app is pretty workable (set it to "Open as Window" so that ^W goes to the terminal). And... well, the early Chromebooks had anemic hardware specs, which is not true of the Pixel.

about a year ago

Scientists Boost the "Will To Persevere" With Current To the Brain

CTachyon Re:Movie idea (127 comments)

You could make a film about a pile of dead body parts assembled into the form of a man being shocked by lightning and being given the will to live. You could even add some wanton violence and philosophical questions of existence to make the story interesting.

You mean Frank Henenlotter's 1990 masterpiece, Frankenhooker, of which Bill Murray said (and I quote) "if you see one movie this year, it should be Frankenhooker"?

about a year ago

App Detects Neo-Nazis Using Their Music

CTachyon Re:Freedom of thought (392 comments)

Uhhh...just FYI? Rohm and the SA leadership were pretty much ALL gay and Hitler and pals didn't have a problem with it until Rohm started talking about a "second revolution" because he thought "the little colonel" had betrayed the socialist part of national socialism, just FYI.

Hitler had a pretty firm "babies good, homosexuals bad" policy for the common folk. Rohm was a party insider long before Hitler was elected Chancellor; in general, Hitler was pretty willing to give special treatment to party insiders, even ones less senior than Rohm. Even so, I'm not aware of any other SA leaders who got a pass for the same reason; care to name names?

For that matter, Hitler's family doctor Eduard Bloch was Jewish, and he got special treatment too (only Jew in Linz with special protection from the Gestapo, notes Wikipedia). Adolf reportedly had quite the soft spot for him after he did everything he could to treat Klara Hitler's rather horrifically advanced breast cancer, despite her financial hardship. Basically, Hitler was a giant hypocrite who tried to ignore the brutality of his own policies by shielding only the people he cared about and could personally see suffering from them.

about a year ago

A Link Between Wormholes and Quantum Entanglement

CTachyon Re:Mysterious quantum mechanical connection? (186 comments)

I am not a physicist.

But I keep hearing that there is actually nothing mysterious about entanglement at all... Something along the lines of:

You post 2 envelopes containing cards in opposite directions, one with a printed letter A, the other card with the letter B.

At one destination, the envelope is opened to reveal the letter A. ... then through some mysterious quantum mechanical connection.... you know that the envelope at the remote destination contains the letter B.

And that's about all there is to entanglement....

Can any physicist confirm?

I'm not a physicist, just a well-read layman, but...

It is more mysterious than that, but if you go with the Many Worlds interpretation it's not much more mysterious.

Basically, if you entangle letters A and B and send them in opposite directions, you're really creating two universes corresponding to the two possibilities: universe P (A here, B there) and universe Q (B here, A there). If you open the envelope to reveal A, for instance, then that copy of you in universe P now knows it exists in universe P, and likewise for B and Q. But unlike in classical physics, universe P is not completely separated from universe Q. P and Q still exist as a single mathematical object, P-plus-Q, and you can manipulate that mathematical object in ways that don't make sense from a classical standpoint.

Basically, it all comes down to one small thing with big consequences. The real world is NOT described by classical probability (real numbers in the range [0,1] that sum to 1 across outcomes). Instead, the real world is described by quantum probability: complex amplitudes whose squared magnitudes (Re[x]^2 + Im[x]^2) sum to 1 across outcomes.

As it turns out, "system P-plus-Q has a 50% chance of P and a 50% chance of Q" is really saying "system P-plus-Q lies at a 45deg angle between the P axis and the Q axis". Starting from P-plus-Q, you can rotate 45deg in one direction to get orthogonal P (A always here), or you can rotate 45deg in the opposite direction to get orthogonal Q (B always here), thus deleting the history of whether A or B was "originally" here. (If P and Q were independent universes, this would decrease entropy and thus break the laws of physics.) Even more counterintuitively, you can rotate P-plus-Q by 15deg to get a 75% chance A is here and a 25% chance B is here (or vice versa, depending on which quadrant the starting angle was in). Circular rotations in 2-dimensional probability space are the thing that makes quantum probability different from classical probability, and thus the thing that distinguishes quantum physics from classical physics.
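The rotation arithmetic above is easy to check directly. A minimal sketch, restricted to real amplitudes (which is all this example needs):

```python
import math

def rotate(state, degrees):
    """Rotate a 2-amplitude state (amplitude on P axis, amplitude on Q axis)."""
    th = math.radians(degrees)
    p, q = state
    return (p * math.cos(th) - q * math.sin(th),
            p * math.sin(th) + q * math.cos(th))

def probabilities(state):
    """Born rule: observed probabilities are the squared amplitudes."""
    p, q = state
    return (p * p, q * q)

# Start on the P axis (A definitely here); rotate 45deg to the 50/50 state:
fifty_fifty = rotate((1.0, 0.0), 45)
assert all(abs(x - 0.5) < 1e-9 for x in probabilities(fifty_fifty))

# Rotate the 50/50 state by -15deg: now 30deg off the P axis, so
# cos(30deg)^2 = 75% chance of finding yourself in universe P:
tilted = rotate(fifty_fifty, -15)
assert abs(probabilities(tilted)[0] - 0.75) < 1e-9

# Rotating the 50/50 state back by -45deg "deletes the history" and
# restores certainty -- the classically impossible operation:
restored = rotate(fifty_fifty, -45)
assert abs(probabilities(restored)[0] - 1.0) < 1e-9
```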

Classically, A is either definitely here or definitely there, and until we open the envelope and look we are merely ignorant of which is the case. Classical physics is time-symmetric, and it therefore forbids randomness from being created or destroyed; classical probability actually measures ignorance of starting conditions. In a classical world obeying classical rules, you can't start from "50% A-here, 50% B-here" and transform it into "75% A-here, 25% B-here" without cheating. The required operation would be "flip a coin; if B is here and the coin lands heads, swap envelopes", and you can't carry that out without opening the envelope to check if B is here or not. Quantum physics is also time-symmetric and also forbids the creation and destruction of randomness, but quantum probability (also called "amplitude") is not a mere measure of ignorance. In the Many Worlds way of thinking, physics makes many copies of each possible universe, and the quantum amplitude determines how many copies of each universe to make. At 30deg off the P axis, cos(30deg)^2 = 75% of the copies are copies of universe P, and you experience this as a 75% probability of finding yourself in a universe with "A here, B there".

(Or something like that. It'll probably make more sense once we eliminate time from the equations. At the moment not even Many Worlds can help us wrap our heads around the fact that quantum entanglement works backward the same as it does forward. The equations as they stand today imply that many past-universes containing past-yous have precisely converged to become the present-universe containing present-you.)

One last complication. If the information of A's location spreads to more particles than A and B, then P and Q become more and more different, and as a consequence the quantum probability rules become harder and harder to distinguish from the classical ones. If you open the envelope and learn "A is here", for instance, then P now contains billions of particles that are different from Q (at the very least, the particles in your brain that make up your memory) and it now becomes impossible-ish to perform rotations on P-plus-Q, because you would need to find each particle that changed and rotate it individually. (Not truly impossible, but staggeringly impractical in the same sense that freezing a glass of room-temperature water by gripping each molecule individually to make it sit still is staggeringly impractical. And both are impractical for the same reason: entropy.)

When so many particles are involved that we can't merge the universes back together, we call the situation "decoherence", but it's really just "entanglement of too many things to keep track of". Entanglement itself isn't really that special; what's special is limiting the entanglement to a small group of particles that we can keep track of and manipulate as a group.

about a year ago

FDA Tells Google-Backed 23andMe To Halt DNA Test Service

CTachyon Re:Democracy? (371 comments)

Just under what legal theory before the FDA was poisoning people a legitimate business ?


Back in the U.S. robber-baron era (1870-1905) it used to be the case that it was your own fault if you put it in your mouth. It didn't matter if the seller marketed it as edible despite knowing or suspecting that the product was poisonous (such as radium water or formaldehyde-preserved milk). As the buyer you were supposed to know better, as summarized by the legal doctrine caveat emptor ("let the buyer beware"). It was only later that caveat emptor was _partially_ overturned by the invention of the "implied warranty", as formalized in the Uniform Commercial Code of 1952 (a model law adopted state by state, though the concept was kicking around decades earlier than that). In the absence of a warranty (explicit or otherwise), the seller had made no promise to the buyer about the product sold, and with no promise to break there was therefore no fraud on the seller's part. No fraud, therefore no wrong and no restitution: no wrongful death damages, no medical bill expenses, not even a "satisfaction or your money back" refund guarantee.

To this day, there's still quite a bit of caveat emptor in the law. For example, cigarette smoke is poisonous at the intended dosage, full stop. Habitual smoking of cigarettes is known to inactivate hemoglobin by way of carbon monoxide, to reduce lung capacity by accumulation of scar tissue, to damage the cardiovascular system by hardening the arterial walls, and to dramatically increase the risk of lung and other cancers. But despite their documented toxicity, to this day tobacco companies are not held liable for selling them. They have been sued several times, but generally for their advertising, and many of the advertising suits have been for ads that played up false benefits or downplayed real drawbacks -- i.e. they made a promise (implied warranty of fitness) that was then broken (fraud). But so long as the buyer is duly warned (no false advertising, the Surgeon General's Warning is present), the situation reverts to caveat emptor and it's again the buyer's own fault if they put poison in their mouth.

about a year ago

Google Chrome 31 Is Out: Web Payments, Portable Native Client

CTachyon Re:Security? (123 comments)

How they maintain security with C and C++ applets?

-- hendrik

NaCl (in its standard, non-Portable flavor) is essentially a bytecode that happens to be directly executable as machine code (either x86-64 or ARM). The bytecode can be statically verified to mathematically prove that the instructions obey certain rules (e.g. exactly one interpretation for any bytecode, execution only leaves the verified bytecode by calling trusted functions, can only read/write memory in the sandbox, cannot write to bytecode, etc.). As I understand it, PNaCl is similar to classic x86/ARM NaCl but trades fake bytecode for real bytecode (LLVM's intermediate representation, last I heard) and statically compiles it to native machine code after the bytecode verification step. Basically, in this scheme the verified C code can run at near-native speed, but it can only communicate with the world outside the sandbox by calling trusted functions that the enclosing app chooses to expose.
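To give a flavor of what "statically verified" means here, below is a toy verifier in Python. It checks two rules reminiscent of NaCl's actual x86 scheme (code is divided into fixed-size bundles, instructions may not straddle a bundle boundary, and control flow may only target bundle starts inside the sandbox). The instruction encoding here is invented for illustration; the real verifier works on machine code and enforces many more rules:

```python
BUNDLE = 32            # NaCl-style instruction bundle size, in bytes
SANDBOX_SIZE = 1 << 20 # toy sandbox: 1 MiB of code space

def verify(instructions):
    """Toy static verifier over (address, length, jump_target) triples,
    where jump_target is None for non-branch instructions."""
    for addr, length, target in instructions:
        end = addr + length
        # Rule 1: no instruction may straddle a bundle boundary, so every
        # byte has exactly one interpretation when decoding from a bundle start.
        if addr // BUNDLE != (end - 1) // BUNDLE:
            return False
        # Rule 2: control flow may only land on a bundle start in the sandbox.
        if target is not None:
            if target % BUNDLE != 0 or not (0 <= target < SANDBOX_SIZE):
                return False
    return True

ok = [(0, 4, None), (4, 2, 32), (32, 1, None)]
bad = [(30, 4, None)]  # 4-byte instruction straddling the 32-byte boundary
assert verify(ok)
assert not verify(bad)
```

The payoff of rules like these is that safety is proven once, before execution, rather than checked instruction-by-instruction at run time.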

Theoretically, Java ought to be just as strongly sandboxed as NaCl: Java code in a JVM sandbox can only call trusted functions that the JVM chooses to expose, too. But in practice the Java standard library exposes a ridiculously broad attack surface, giving sandboxed apps plenty of chances to exploit bugs and escape the sandbox. (For instance, java.lang.String is a final class today because folks discovered that you could subclass it to make it mutable, pass a sandbox-approved value to e.g. a file I/O function, then modify the value to a sandbox-forbidden value after the security check but before the OS system call.) Basically, Java's attack surface is broad and leaky because Java was designed for running embedded devices and servers, not for sandboxed applets downloaded from hostile sites on the Internet. Applets were a distant afterthought compared to Java's "let's write an OS for set-top cable boxes" origin.

In contrast with Java, Chrome's implementation of [P]NaCl only exposes the Pepper API, and the Pepper API was designed from the ground up to be called by sandboxed code fetched from a malicious website. Looking at the Pepper C API site, the attack surface seems... bigger... than I would have expected. But most of the functionality I see there is also exposed to JavaScript, where the code is every bit as hostile. Almost any "attack surface, WTF" argument would also argue against JavaScript and all modern web design. And if they're smart, one API is hopefully built on top of the other (plus a thunk layer made of machine-generated code), so that there's only one pool of security bugs to fix.

about a year ago

Support For NASA Spending Depends On Perception of Size of Space Agency Budget

CTachyon Easy solution: measure budgets in Iraq War Days (205 comments)

A repost of a Google+ post I wrote a year and some change ago:


From today forward, all federal government expenditures will be priced in "Iraq War Days" (IWD) or "Iraq War Years" (IWY). For quick reference:

  • MSL mission w/ Curiosity rover: 3.5 IWD
  • Cost of giving $10 to all 312M US citizens: 4.33 IWD
  • 2012 "General Science, Space and Technology" budget: 43.04 IWD
  • Cost of giving $100 to all 312M US citizens: 43.3 IWD
  • 2012 Welfare budget: 210.3 IWD (0.6 IWY)
      ~ Computed as 26% of the 2012 "Income Security" budget
      ~ Includes TANF (22%) welfare, SNAP (70%) and WIC (8%) food stamps
      ~ All ratios from 3rd party analysis of 2010 data; see "How much do we REALLY spend on Welfare?"
  • 2012 "Medicare" budget: 672.9 IWD (1.8 IWY)
  • Cost of giving $2250 to all 312M US citizens: 975 IWD (2.7 IWY)
  • 2012 "National Defense" budget: 994.9 IWD (2.7 IWY)
  • 2012 "Social Security" budget: 1081 IWD (3.0 IWY)
  • 2012 Total budget: 4986 IWD (13 IWY)

Source: "United States Federal budget, 2012" and "Mars Science Laboratory" pages on Wikipedia for budgets, google.com/publicdata for US population, National Priorities Project via "Cost of War" Wikipedia page for IWD exchange rate.
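For anyone who wants to extend the table, the implied exchange rate can be backed out from the list itself. The ~$2.5B MSL cost below is an assumption drawn from public reporting, not from the original post:

```python
# Back out the implied exchange rate from the MSL line above:
MSL_COST_USD = 2.5e9   # Curiosity mission, approximate public figure
MSL_COST_IWD = 3.5     # from the list above
USD_PER_IWD = MSL_COST_USD / MSL_COST_IWD  # roughly $700M+ per Iraq War Day

def to_iwd(usd: float) -> float:
    """Convert a dollar amount into Iraq War Days."""
    return usd / USD_PER_IWD

# Sanity check against another line in the list:
# $10 to each of 312M citizens should come out near 4.33 IWD.
assert abs(to_iwd(10 * 312e6) - 4.33) < 0.1
```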


Something I didn't note in my original post that's probably worth mentioning in passing: Social Security is huge, "bigger than the National Defense budget" huge, but it's basically self-funding because it's a retirement investment paid for by payroll taxes (modulo population bumps, e.g. the post-WW2 "baby boom"). Person A pays in, person A cashes out, theoretical net cost to taxpayers $0.

about a year ago

Boston Dynamics Wildcat Can Gallop — No Strings Attached

CTachyon Re:Government waste (257 comments)

So then the question becomes, could an actual fission reactor be designed small and powerful enough to power a car (or horse) -like vehicle?

Short version, no. There are no nuclear fuels with the right balance of properties to achieve that. Long version: go Wikipedia nuclear fission, fissile, and critical mass.

about a year ago

Congress Reaches Agreement ... On Helium

CTachyon Re:Dispensing our reserves? (255 comments)

[...] Meanwhile, engineers will continue to look at alternate cooling solutions, such as liquid hydrogen. [...]

This doesn't work. There's no viable substitute for helium, not even hydrogen. The reason helium is so useful is that it boils at 4 K (by far the coldest boiling point of any substance), remains liquid all the way down to absolute zero at standard pressure, and becomes superfluid at 2 K (the only bulk superfluid achievable on Earth).

The boiling point is important because that's how cryogenic cooling works: when you use a circulating liquid coolant, the temperature of the (coolant plus apparatus) system cannot exceed the boiling point of the coolant until the coolant has entirely boiled away, so you get a very consistent and predictable temperature (right up until the coolant is gone).

4 K is below the critical temperature of the most common materials for superconducting electromagnets: niobium-titanium (10 K, relatively cheap) and niobium-tin (18 K, highest known T_c for a traditional superconductor). Hydrogen is not a substitute, because it boils at 20 K; that's noticeably too warm for any traditional superconductor, and even if it weren't, superconductors can handle stronger magnetic fields the colder you chill them, so they'd be less useful in an MRI machine. And you can't chill hydrogen much colder than its boiling point before you hit its melting point, 14 K, at which point it stops circulating and becomes much less useful as a coolant.
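To make the margins concrete, here's a quick sketch comparing coolant boiling points against superconductor critical temperatures (rounded textbook values at standard pressure):

```python
# Rounded textbook values at standard pressure. A circulating coolant only
# works if it boils below the superconductor's critical temperature T_c.
COOLANT_BP_K = {"helium-4": 4.2, "hydrogen": 20.3, "nitrogen": 77.4}
TC_K = {"niobium-titanium": 10.0, "niobium-tin": 18.3}

def can_cool(coolant, superconductor):
    """True if the boiling coolant holds the magnet below its T_c."""
    return COOLANT_BP_K[coolant] < TC_K[superconductor]

for c in COOLANT_BP_K:
    for s in TC_K:
        verdict = "ok" if can_cool(c, s) else "too warm"
        print(f"{c} -> {s}: {verdict}")
```

Only helium-4 passes for both magnet materials; hydrogen and nitrogen boil too warm for either.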

The superfluidity is not quite as useful day to day, but it's used to study the behavior of other quantum mechanical systems, such as neutron star interiors, that we can't recreate in a lab. It also forms a rigorous analogy with superconductivity, especially in the case of fermionic He-3, so it gives us a chance to play with a bulk fluid that propagates fluid currents in the same way that superconductors propagate electrical currents. Nothing else can replace it for this purpose.

(Side note: helium is not a truly expendable resource. Of the helium present on Earth, not a single gram is left over from the formation of the solar system; Earth doesn't have the mass to retain helium in its atmosphere. All our helium comes from the alpha particle decay of heavier radioactive elements, like radon. When the alpha particles relax and become neutral helium gas, the gas is trapped by the same gas-impermeable rock formations that trap natural gas. However, the natural recharge rate from radioactive decay is much slower than the rate that we're extracting it and venting it, so if we don't curtail our waste we're going to run out regardless.)

about a year ago

Somebody Stole 7 Milliseconds From the Federal Reserve

CTachyon Re:I do not understand why this is a story (740 comments)

Trades were executed in Chicago before the change was announced in Washington D.C. in a relativistic physics sense.

Actually, in a relativistic physics sense, the trades in Chicago were outside the light cone of the Washington event (neither in the future cone nor in the past cone). That being said, since Washington and Chicago do not move at relativistic speeds with respect to each other, the trades still occurred at a later time than the announcement, even if there's no possible causality.

But the DC announcement was not in the past light cone for the Chicago trade. Therefore the information had not yet reached the Chicago public. That is the criterion being judged, not simultaneity. Insider trading, case closed.

(And even if we take the classical limit of c approaches infinity, are we really to believe that a trade conducted within single-digit milliseconds of the announcement was based on consideration of the contents of the announcement? There exist fully automated flash trading systems hooked up to news wire services, but AFAIK even those don't react quickly enough to explain the speed of this trade. Shakier conclusion, but still insider trading.)
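(Back-of-the-envelope, with the straight-line distance as my assumption: DC to Chicago is roughly 960 km as the photon flies, so even in vacuum the announcement couldn't reach Chicago in much under 3.2 ms, and real fiber routes are slower than that.)

```python
# Light-travel-time floor from Washington DC to Chicago.
# The ~960 km straight-line distance is a rough assumption on my part.
C = 299_792_458       # speed of light in vacuum, m/s
DISTANCE_M = 960_000  # approx. DC -> Chicago, straight line

t_ms = DISTANCE_M / C * 1000
print(f"{t_ms:.2f} ms")  # 3.20 ms: any Chicago trade sooner than this
                         # could not have been caused by the DC release
```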

about a year ago

Java Update Implements Whitelists To Combat 0-Day Hacks

CTachyon What the hell (55 comments)

Package your ruleset.xml into DeploymentRuleSet.jar

Packaging your ruleset allows the desktop administrator to apply cryptographic signatures [emphasis mine] and prevent users from overriding your policy. This requires usage of a trusted signing certificate. The easiest route to get a signature is to buy one from a certificate authority like Symantec/Verisign, Comodo, GoDaddy, or any other; [...]. The default certificate authority list contains about 80 authorities from which you may purchase a signing certificate [emphasis mine].

-- Introducing Deployment Rule Sets, Java Platform Group blog

Why in the name of the everliving fuck would anyone think this step was a good idea? The file is already located in a directory that can only be written by root (or Administrator, as OS appropriate). Why require a signature? This adds zero security. If you have root on the machine, you can add a self-signed CA to the trusted CA list anyway. Do they have a kickback arrangement with Verisign or something?

about a year ago

Seagate's Shingled Magnetic Recording Tech Boosts HDD Capacities to 5TB and Up

CTachyon Re:maintenance (195 comments)

Since you obviously know that a *file* can be fragmented, obviously you already know that a file doesn't have to be contiguously written.

Thus, you don't need to defragment it. The directory structure knows that the 'file' is in blocks 1-5, 8, 14.

As other people pointed out, disk seeks are most assuredly something to avoid on spinning media. But even when seeks are free, as they are on SSD, fragmentation still sucks and you should avoid it like you owe it money.

For one, some filesystems use run-length encoding for the list of blocks in a file. Basically, instead of recording "1, 2, 3, 4, 5, 8, 14", they notice the pattern and record "1-5, 8, 14" like you just did in your post. (The ext[234] family doesn't do this, but IIRC some of the post-ext2 up-and-comers use it.) RLE lets you inline more metadata directly in the inode without resorting to indirect blocks, which basically means you get your data with fewer round trips to the disk. (It might save you from needing to read a meta-meta-block to find the meta-blocks that tell you where the blocks are. Instead you can fit all the blocks in one meta-block and skip a round trip.)

For two, even filesystems on SSD that don't do RLE still suffer under fragmentation. Unfragmented files make it easy for the kernel I/O scheduler to coalesce those sequential block reads into big, happy multi-block SATA reads when you're streaming through the file. As before that means fragmentation = more round trips to the disk, but it also means fragmentation = spamming the SATA controller with more commands and spamming the CPU with more interrupt handlers for the command completions. (In other words, copying a big fragmented file slows down everything else on the computer, more so than copying a big un-fragmented file.)
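To illustrate the RLE idea, here's a minimal sketch (not how any particular filesystem actually lays out its metadata) that collapses a block list into (start, length) extents:

```python
def to_extents(blocks):
    """Collapse a sorted list of block numbers into (start, length) extents.

    A fully contiguous file collapses to a single extent; a fragmented
    file needs one extent per run, so it takes more metadata to describe.
    """
    extents = []
    for b in blocks:
        if extents and b == extents[-1][0] + extents[-1][1]:
            start, length = extents[-1]
            extents[-1] = (start, length + 1)  # extend the current run
        else:
            extents.append((b, 1))             # start a new run
    return extents

# The example from the parent post: blocks 1-5, then 8, then 14.
print(to_extents([1, 2, 3, 4, 5, 8, 14]))  # [(1, 5), (8, 1), (14, 1)]
```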

Disclaimer: I am not a filesystem designer, I just play one on Slashdot.

about a year ago



"80% Functional" Includes Junk DNA After All

CTachyon CTachyon writes  |  more than 2 years ago

CTachyon writes "Last week the ENCODE project published a suite of papers, which were announced to the press with a claim that 80% of the human genome is "functional". But according to Ars Technica's science editor John Timmer, himself a Ph.D. in Molecular and Cell Biology, most of what you read was wrong: in their papers, the ENCODE team redefined the word "functional" so that known junk DNA (such as dormant viruses and broken pseudogenes) would meet the definition; and what's more, Timmer accuses individual ENCODE scientists of fostering confusion, rather than clearly explaining the semantic bait-and-switch."

CTachyon CTachyon writes  |  more than 8 years ago

CTachyon writes "A friend at work is having the usual Windows trouble with viruses and Trojans. She has an anti-virus program on there of some sort (unknown vintage, neither McAfee nor Symantec/Norton), and while it cleaned up a good chunk of the mess, there's still at least one more lurking on her system.

As one of the resident computer 'experts' at work, she came to me for advice. Unfortunately for her, I'm out of the Windows loop since I jumped ship to Linux years ago. While the proper thing for her to do at this point would be to back up her important data and reinstall from her recovery CD, I no longer have the patience for Windows to do that for anyone I'm not sleeping with, and it's a bit over her level of expertise to handle herself. That pretty much leaves trying another AV program.

Thus the problem. I'm out of the loop, so I don't know what's good and what's not. I did manage to instill in her a proper fear of Symantec/Norton, but I don't really know what I should recommend instead. If all else fails, I vaguely recall that AVG is decent, and it's little-f free (big plus for her). Do any of my fellow Slashdotters have some better advice for her?"


