Google Faces Up To $5 Billion Fine From Competition Commission of India
Humanity faces a five hundred million billion trillion dollar loss of income due to premature extinction of the species.
School Tricks Pupils Into Installing a Root CA
Please, also don't act like you're the first person ever that this has happened to. It's been standard practice for at least the last 15 years I've been working IT in schools in the UK.
Your post is constructive right up to the phrase "the last 15 years", which apparently justifies how little your network reveals to the surveilled about the actual extent of the surveillance. They end up with software installed on their own equipment that they know little to nothing about, software that could open back doors to the device when used outside the school network if, by some extraordinary turn of events, it proves slightly less than 100% bulletproof in its coding, implementation, and deployment. Nothing ever goes wrong with WEP or SSL.
Would it damage the small little minds to know more about how this all became "bog standard" without so much as a public whimper? Probably. Does that mean your Slashdot post is filtered on your own school network? Probably.
In my world, forged SSL certificates should be clearly marked as such. There should even be a "forger identity" field and a "forger authority" field (containing the pertinent parental agreement UUID).
None of this would interfere whatsoever with your legal authority to protect your network or your success in achieving this protection. It would increase the awareness of the surveilled of what externalities they have actually taken on downstream of their agreement with you to allow you to do so.
The fact that you've been doing this for fifteen years already without any of this in place is a sad argument.
If this is the school's equipment, so that the school absorbs its own externalities of having badly-coded surveillance kits forcibly installed (I'm guessing the rock stars on that coding team were on the guaranteed forcible-installation side of the house), and the equipment is emblazoned with a giant warning "abandon privacy all ye who input here", there should still be a giant warning screen whenever a user tries to access a major financial institution (I'm told the government tracks the identities of these organizations), one which warns the user: "you are attempting to access a financial institution through a forged SSL root chain, which is potentially a far leakier pipe than regular SSL; are you really sure you want to do this?"
So you're justified in doing what you do, but you're also so damn sneaky about doing it, that fires spring up in public opinion when the least of what goes on is exposed to public discussion.
No need to hammer the state of affairs in the daily consciousness so that these public fires don't flare up. Because fifteen years.
My bank has a security mechanism where they show a set of images unique to my account so that I can detect impostor sites that entice me to enter my credentials where they shouldn't go (the impostor site doesn't know the unique images associated with each banking account). There really should be a law against these security fingerprint images being conveyed through a forged-certificate SSL proxy, no matter how legitimate the usage agreement. Once those images are scraped and laundered, one more safeguard we've been taught to trust is down the spiral tube.
If it's rational, necessary, and you're proud of it, do it out in the open as democracy conceptually demands, with plenty of loud warning signs where the externalities impose heightened risk.
Genomic Medicine, Finally
Too many people suffer and die from too many diseases that we more or less understand, but can't effectively treat.
Yes, this is what classical Greek rhetoric describes as a regressive mirage: the more you learn, the worse it gets, no matter how many diseases you cure along the way.
Here's the amazing thing. Understanding tends to outpace effective intervention. Any snooker player can tell you which ball on the table he'd really like to move next. It's rarely the ball he's presently shooting at. In Genomics, we're talking 30,000 balls on the snooker table, and the snooker table is gravity golf in a twenty dimensional space. Even with your trillion dollar Laplacian pool cue, you're struggling to pull off exactly the shot you want.
When I was young and we were on a long trip and the moon was hanging there on the horizon, I always wanted to go faster, so we could see the other side.
Then I got a little bit older. Perhaps a month older. And I thought to myself, "you know, there are reasons why this is probably not going to happen the way I want it to".
Bitcoin Inventor Satoshi Nakamoto Outed By Newsweek
The person who posted this comment is apparently a paranoid psychopath and you are effectively praising him.
Apparently, paranoid psychopathic trolls are tightly knit.
But no problem. It'll be +500 in another hour, +50,000 by tomorrow afternoon, and then wrap back around to zero, at which point he loses his quarter and his game is over.
Apple Refuses To Unlock Bequeathed iPad
If everything appears to be working smoothly between family members,
Didn't you watch a single episode of The Dukes of Hazzard growing up? The legitimate party is always the last to know, and by the time the penny drops they're one hell of a car ride away from interceding in the nick of time, bursting into the courtroom at the very moment the judge picks up the pen and says out loud to Boss Hogg and his henchmen in particular "these papers all seem to be in good order".
Ask Slashdot: What Software Can You Not Live Without?
MediaWiki. Before I created my note-taking wiki, my ideas went off in all directions.
I'm also pretty heavy into R/C/C++/zsh/ZFS/git right now.
Tim Cook: If You Don't Like Our Energy Policies, Don't Buy Apple Stock
Directors and officers of a corporation have a fiduciary duty to the stockholders to run the company in their interest.
When you study real people in controlled settings, their actual interests turn out to be far murkier and less consistent than we like to imagine.
There's no perfect way for management to pin-point the precise interests of their collective (and fluctuating) stockholders.
Rather than becoming slaves to opinion-poll rounding errors, perhaps management is wise to buffer this obligation by living like decent human beings, following thousands of years of human precedent before we got all hot and bothered and legalled-up over brittle inducements.
Wolfram Language Demo Impresses
For example, Boost is really sweet when you need to slam together a pile of code and have it working out of the gate with minimal fuss, but if performance is an issue, you can't use it.
Wow, that's just bizarre. I don't know where you get your misinformation, but it's an elite grade of batshit.
The whole point of Boost is that it maintains a certain amount of abstraction without boxing you into a performance corner. Were it not for those conflicting goals, the devilishness of its internal machinery could not be justified.
Template metaprogramming essentially involves expressions converting themselves to a symbolic representation that doesn't resolve into a concrete expression (by means of purely functional transformation at a quasi-syntactic level) until some final result is demanded, at which point the highest-performance code path can be selected based on the actual parameters (more specifically, often exploiting which parameters vary and which parameters are constant or nearly constant).
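Here's a minimal sketch of that delayed-resolution idea, with hypothetical `Vec`/`Sum` names of my own invention — not Boost's actual machinery, which is vastly more elaborate:

```cpp
#include <cstddef>
#include <vector>

namespace et {  // hypothetical sketch names, not Boost's actual machinery

// A node in the symbolic expression tree: holds references to its
// operands and does no arithmetic until indexed.
template <typename L, typename R>
struct Sum {
    const L& l;
    const R& r;
    double operator[](std::size_t i) const { return l[i] + r[i]; }
    std::size_t size() const { return l.size(); }
};

struct Vec {
    std::vector<double> data;
    explicit Vec(std::size_t n, double v = 0.0) : data(n, v) {}
    double& operator[](std::size_t i) { return data[i]; }
    double operator[](std::size_t i) const { return data[i]; }
    std::size_t size() const { return data.size(); }

    // Assignment is the moment "some final result is demanded": the
    // whole tree collapses into one fused loop, with no intermediate
    // vectors materialized for an expression like a + b + c.
    template <typename E>
    Vec& operator=(const E& e) {
        for (std::size_t i = 0; i < data.size(); ++i) data[i] = e[i];
        return *this;
    }
};

// operator+ builds a tree node instead of computing anything. The
// trailing decltype restricts it to indexable operands (SFINAE), so
// plain double + double still uses the built-in operator.
template <typename L, typename R>
auto operator+(const L& l, const R& r)
    -> decltype(l[0] + r[0], Sum<L, R>{l, r}) {
    return Sum<L, R>{l, r};
}

}  // namespace et
```

An expression like `a + b + b` never allocates a temporary vector; it builds a `Sum<Sum<Vec, Vec>, Vec>` tree that evaluates in a single fused loop only when assignment demands the final result.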
The problem with Boost is similar to what Knuth said about the problem with literate programming.
Literate programming demands a high proficiency with two different skills: formal reasoning and verbal expression. This shrinks the available pool of adherents and adopters. And worse, there's a terrible opportunity cost, because the people out there who have extremely high proficiency in both of these skills are in extremely high demand to take on central roles in large projects where they don't spend their hours bent over literate code.
The kind of environment where Boost can be best exploited for both its abstraction and its performance is going to be wonk-filled boiler-rooms at high-frequency trading companies where the cash, the talent, the commitment, and the project duration mesh together. Importantly, the project specification in these environments is often in continuous, long-term evolution as your firm chases whatever edge it thinks it might have in a chaotic, rapidly-shifting market environment. The month you spend poring over low-level optimization gets deployed for a whole week. The month you spend automating your Boost framework to achieve nearly the same performance becomes a permanent code asset (and a competitive asset whenever you find yourself needing once again to run that old play).
Boost is in that category where if you have to ask, you can't cut the mustard. The natural Boost programmers already know who they are. Few of these people toil in the public eye. That's not where this elite, double-barrel skillset tends to land.
The Wolfram language is impossible to assess based on this video. If your application depends on Wolfram "knowledge" how do you know it will continue to meet rigorous specifications the day after tomorrow?
Is there a public regression suite on the contained knowledge against which to assess whether your program is erected on firm or porous soil?
What guarantee does one have that its cleverness or performance characteristics will stay consistent when it matters most?
I suspect the killer application for WooL is prototyping the semantic web. The semantic web has been dragging its feet. Google and Facebook don't wish to become disintermediated. They have a foot on each side of this fence and their hands cupped over their testicles. Doesn't make for rapid progress.
The Achilles heel of search is that search returns results rather than models. Google is trying to split the difference by having search return interactions. It's an excellent paving stone on the road to a lucrative future purveying OOXML.
If ten minutes of coding within the Wolfram Language embarrasses Google search, we have a winner here of WuLing mammoth proportions.
Wolfram Language Demo Impresses
I think the point is that 1000 words can succinctly be described by a single picture by simply including those words in the picture.
No, you're thinking of "a picture is worth ten million bits".
Apple Drops Snow Leopard Security Updates, Doesn't Tell Anyone
Self reply: the other right.
Waiting waiting waiting for Slashdot alpha to allow my cowboy.
Apple Drops Snow Leopard Security Updates, Doesn't Tell Anyone
If you're still running 10.6 for some reason, your computer is either a low-end one from at least 7 years ago, or you've made an intentional choice to remain on 10.6 for some reason
It used to be that low IQ was failing to identify the continuation of some trivial numerical sequence on some trivial test. The new low IQ is use-case blindness, the inability to even hazard a guess at the myriad reasons other people live differently than you do. The ravening mob of blindness promulgators is ever with us. Pity.
Here's my story.
I bought my wife a second-generation Core Duo iMac, which I believe has never been upgraded from the original Leopard. I use this computer so rarely (about ten hours per year) that I can barely keep track of which leopard presently holds court.
The computer works—until some piece of software offers to "upgrade" itself, then restarts with a whole new user interface (I'm looking at you, iTunes). Then I'm constantly told the computer doesn't work any more, but the real problem is that she hasn't figured out where all the familiar functions were forcibly relocated.
I'm not willing to sit down at her desk and chase GUI tidbits from point A to point B, so I just told her "don't click upgrade". When something visibly breaks, then I'm willing to sit down and deal with it. Meanwhile I have enough sysadmin on my plate with my own Linux desktop, where I'm heavily invested in ZSH, and my FreeBSD server, where I'm making very heavy use of ZFS. This is where my neural matter wants to go.
I have a very low tolerance for having something trivial I've mastered at the autonomic level yanked back to the center of my attention. It took me close to a decade to cease seething about the relocation of the CTRL key in favour of a CAPS LOCK key that should have been ALT-NUMLOCK or, even better, CTRL-ALT-INSERT. FFS I can type ~50 wpm in ALL CAPS using the right shift key for six of my fingers, alternating to the left shift key for the other two. But guess what? The CAPS LOCK key is more prominent to my left pinkie than ENTER is to my right pinkie. If we normalize the utility of the ENTER key to 100, the utility of the CAPS LOCK key comes out around -1000.
The problem with most upgrades is that it's always more of this father-knows-best groupthink bullshit.
It's a huge project just to figure out what's going to change. The only recourse one has to all these unnecessary relearning cycles is to skip as many releases as humanly possible. I'd be thrilled if XP is the last Microsoft OS I learn how to use in this lifetime. I was an early adopter of Windows 2000 and I stayed there until 2000 went out of support. Later I ended up using XP in a different work environment and I can't name a single thing that improved, except that I had to disable a lot more bling for half a day. Long ago I held out on MSDOS until I could jump straight to Windows NT which I adopted within weeks of the Intel P6 becoming available. That was a real upgrade, one well worth reprogramming a decade of autonomic habits. I never used any of the shitshow 3.1/95/98 for more than the very occasional hour.
These upgrades change a lot of stuff for extremely dubious benefits. An upgrade is going from UFS to ZFS. That I can buy into. An upgrade is going from System 7 to OS X. That one I can even sell my wife on.
What I really want concerning these fairly useless system frobs is the semantic web: searchable metadata describing every user interface action that formerly existed and whether it still exists in the new version, plus a mapping to a more-or-less equivalent version, if such a thing has even been retained. Oh yes, Apple is good at silent castration. Ideally the OS would track which user interface functions have been regularly used, and list out all the things the upgraded user will be instantly forced to relearn. But no. It's sexy. No assistance offered in retraining for sexy. That's what sexy means, lover boy.
Until such a world exists, small upgrades require leaping over a trust quantum with the 2.7-kelvin cosmic background radiation as the only available heat source.
The other reason to upgrade her OS would be the occasion of replacing her old disk drive with a new SSD, which is long overdue, but there was a small roadblock. I got about 15 minutes into this project before I discovered that suction cups are involved.
Just what I need in my service kit: suction cups.
If it works, don't fix it.
If it wants to be a disposable appliance to the very core of its being, let it die in its birthday suit.
How Mobile Apps Are Reinventing the Worst of the Software Industry
I simply don't install applications on Android that ask for abusive permissions, which pretty much puts my phone back into the stone age. I don't need the project right now of rooting the phone, tweaking non-standard security settings, then wondering whether the next glitch is something I have to fix myself.
Net effect, so far as I'm concerned, is that the smart phone has not been invented yet.
I've always considered the Brights movement to be tragically misnamed (almost cringe-worthy) but at this point I'd have no problem carrying around a Bright phone where the device's intelligence was on my side for once.
Ray Kurzweil Talks Google's Big Plans For Artificial Intelligence
Kurzweil is probably a good deal less bright than Sir Isaac Newton, but also a good deal less crazy, his barmy extrapolation of the singularity notwithstanding. Clearly Google hired the man based on the smartest thing he's accomplished rather than the dumbest thing he espouses.
I've thought about this for a long time, and I'm only 99% convinced Kurzweil is wrong. He holds the record for the most ridiculous thing I've ever heard for which I maintain a non-zero sliver of belief. That said, extropian immortality sure as heck isn't life as we presently know it. Even if he's right, I'm not sure I give a damn about my xeno-species future extropian self.
What's left of me as I presently know myself would be just a little sliver of MSDOS buried somewhere deep inside Windows 8, though that might be just enough to properly enjoy hearing Raymond-prime mutter, "Oh, indeed, my original Raymond self, he was such a twit, wasn't he? Every so often I simulate his ego as a kind of Positronic CPU burn to keep my immortality in good working order, but only when the liquid helium is in copious supply."
His weirdest belief of all is that you can multiply something by a million and it gets a million times better, rather than, more aptly, just a million times different.
"Microsoft Killed My Pappy"
1. More standard compliant
The difference in standards compliance in modern browsers is a supermodel vs an air-brushed supermodel.
The difference between IE5/IE6 and other browsers of the era was a supermodel vs watching your own parents having sex.
One could say the same thing about Visual C++ as well.
The thing about Goldman Sachs is that you never get to ruin the economy twice in exactly the same way. There's relentless pressure to innovate concerning your grand malfeasance. It's so comforting to know that Goldman now goes to church on Sunday mornings and sings the sub-prime anthem.
Only a failed criminal tries the same scheme twice. The key is to make such obscene profits the first time that you can sit tight long enough for the apologists to paper over your track record before hatching your next plan.
The title character is a poor and fatherless teenager growing up in The Bronx. Billy and his friends are in awe of the flashy mobsters in the neighborhood. Dutch Schultz and Otto Berman, based on the real-life mobsters, hire Billy as a gofer and become mentors to him. The gangsters take Billy up to their upstate hideaway, where they are awaiting a trial. Schultz becomes a community leader and converts to Catholicism.
"Microsoft Killed My Pappy"
I won't mention which tech company
Mission accomplished. I have no idea what you're talking about, and I couldn't give a rat's ass.
Stack Overflow Could Explain Toyota Vehicles' Unintended Acceleration
100% perfection in any non-trivial thing (whether hard or soft) is impossible.
This is 100% semantic wankery, because triviality is circularly defined as the magic threshold beyond which bugs become inevitable.
Of course there's an implicit ego frame of reference, because we're all looking for an edge on the margin where the big money lives. They didn't call it a "space race" for nothing. It would be far more pertinent to observe about the human species that bragging rights come from body bags. That's just how we run our affairs on the larger political scales.
I can't stand the intellectual posing that ensues whenever someone espouses the culture of bug mitigation with extreme prejudice. Oh, nothing can ever be perfect—as if that's ever the human standard in anything. Part of this is IQ wanking: the notion that writing bug-free code requires superhuman feats of logical perfection. Successfully reasoning your way out of a wet paper bag has something to do with writing bug-free software, but it's a secondary term at most.
The real key to writing defect-minimized systems is a good understanding of human psychology and mental frailty, keeping notes, and constantly upping your game.
It's a rare piece of software that is more robust than the worst API it programs against. Even if the code behind that API is 100% bug-free, you're far from out of the woods, because the API can be designed in such a way as to delegate complexity uphill.
No doubt Opportunity was far from perfect, but it sure as hell sets a god-like bar compared to what passes for quality work in 98% of the make-a-buck sphere.
EFF Reports GHCQ and NSA Keeping Tabs On Wikileaks Visitors and Reporters
There are even patents filed which allow identification of an individual solely by this fingerprint.
The government is doing things for which there are even patents? Wow. I had no idea.
Geez, with IPv6 giving every single web client a distinct address, you'd think the NSA would be campaigning behind the scenes to have their carefully curated fat-pipe monopolists ramming IPv6 down our collective throats.
And damn, what a surprising patent, with only about a thousand years of prior art.
On 29 January 1697 Newton returned at 4pm from working at the Royal Mint and found in his post the problems that Bernoulli had sent to him directly: two copies of the printed paper containing the problems. Newton stayed up until 4am before arriving at the solutions; on the following day he sent a solution of them to Montague, then president of the Royal Society, for anonymous publication.
He announced that the curve required in the first problem must be a cycloid, and he gave a method of determining it. He also solved the second problem, and in so doing showed that by the same method other curves might be found which cut off three or more segments having similar properties.
Solutions were also obtained from Leibniz and the Marquis de l'Hôpital; and, although Newton's solution was anonymous, he was recognized by Bernoulli as its author; "tanquam ex ungue leonem" (we recognize the lion by his claw).
I guess that cuts both ways.
PS: Notice our fine Slashdot Classic buggering poor Mr l'Hôpital's name.
Are Bankers Paid Too Much? Are Technology CEOs?
If you need evidence of how valuable it is, merely look at our recent financial crisis when the flow of money froze up.
That's just about the dumbest thing I've ever heard.
I guess you don't recall the August 1981 air-traffic controllers' strike. Most of those wealthy bankers could be replaced by people being paid 10% as much, and after a few years we'd hardly notice the difference, except perhaps that the new lot wouldn't be nearly so adept at screwing the system over.
I guess if you were running NASA you'd pay a billion dollars per o-ring, because--gosh--look at what happens if it won't deform, and the size of the bill if we need to replace the dumb thing. Ten thousand parts at a billion dollars each sure adds up. When you think about it, with each o-ring protecting the safety of a ten-trillion dollar shuttle, a mere billion dollars per ring is a screaming deal, wouldn't you say?
Finance wasn't rocket science until the inmates figured out that astrobucks are a good living. It doesn't need to be rocket science any more than an o-ring needs to cost a billion dollars.
The controllers had Washington by the balls. Big mistake. The bankers have Washington by the carotid artery. We can therefore infer from this that bankers do more important, more productive, more difficult work. Or we can infer that bankers are better at poring over Gray's Anatomy if it serves their personal interests.
New 'pCell' Technology Could Bring Next Generation Speeds To 4G Networks
In my post above: ... wouldn't take long ...
Fingers too fast, or delay line from the Chomskian trace badly programmed.
Speech Errors as Linguistic Evidence
New 'pCell' Technology Could Bring Next Generation Speeds To 4G Networks
However, in order to get this to work well, you need the transmitted signal to be phase-aligned to within an appreciable fraction of a wavelength. ... Since we are around a gigahertz, that means that the phase of the carrier should be accurate to within a couple hundred picoseconds, max.
Programmable delay range: 3.2 ns to 14.8 ns in 10 ps increments in 2^10 discrete steps.
160 ps rise/fall, less than 2 ps RMS cycle-to-cycle jitter.
That gives you a spatial resolution of about 3 mm within a 3 m pixel on the fine delay; more if you also introduce a coarse delay line (in 10 ns increments). I think the Xilinx DCM gives a step delay on the order of 10 ps in 1024 discrete steps. You've now got 3 mm steps out to 3 km. Note that the linearity of these delay lines is not perfect, so there's some art to it (it's not a simple two-digit number in base 1000), but the worst-case step remains small. You might use two DCMs in series plus the CML Micrel to ensure uniform coverage (one Spartan 3E has four DCMs, IIRC). Actually, for a multi-channel base station, you'd need to fabricate an ASIC with a very large number of programmable delay lines, as I imagined it before reading TFA.
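Plugging those spec numbers in (my own back-of-envelope arithmetic, not anything from TFA; the 10 ns coarse step is the hypothetical one mentioned above):

```cpp
// Spatial reach of the delay-line steps quoted above: distance light
// travels per fine step, per full fine-line span, and per coarse step.
constexpr double c_mps = 2.998e8;                  // speed of light, m/s
constexpr double fine_step_s = 10e-12;             // 10 ps fine increment
constexpr double fine_span_s = 14.8e-9 - 3.2e-9;   // CML line range, 11.6 ns
constexpr double coarse_step_s = 10e-9;            // hypothetical coarse step

constexpr double fine_res_m    = c_mps * fine_step_s;    // ~3 mm per step
constexpr double fine_reach_m  = c_mps * fine_span_s;    // ~3.5 m "pixel"
constexpr double coarse_step_m = c_mps * coarse_step_s;  // ~3 m per step
```

Roughly 3 mm per fine step across a 3.5 m window, with the coarse line extending the window in ~3 m hops, which is where the "3 mm steps out to 3 km" figure comes from.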
If the phone is 150 m away from a cell transmitter, you can set up a ping-pong loop with a round-trip frequency of 1 MHz, where each end bats the pulse back as fast as possible.
Imagine the phone sends out a coloured packet and two or more base stations pong it back. The phone can ping back on the first received response, or the last, or the nth response in between. The fastest paths need to be artificially delayed until all paths are equal time. (With multi-pathing, the radio might be able to detect and measure more than one path length per base station.)
It wouldn't take long to achieve the coarse lock-on. Then it needs to be maintained during motion of the mobile end, plus changes in atmospheric conditions, or sway in the buildings you're bouncing off of, if you've used the loudest path instead of the quickest path. The timing fabric is quite doable. The delay line can be anywhere in the ping-pong circuit. The non-radio portions would ideally use fibre, as copper has a temperature-variable c that adds up quickly in the ps regime where lengths of 100 m are involved.
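A quick sanity check of the 1 MHz figure and the achievable ranging resolution (again my own arithmetic; the instantaneous-turnaround assumption is mine):

```cpp
// Timing budget for the ping-pong ranging loop sketched above.
// Assumes instantaneous turnaround at each end; real hardware adds a
// fixed latency that calibrates out of the difference measurement.
constexpr double c_mps = 2.998e8;          // speed of light, m/s
constexpr double range_m = 150.0;          // phone-to-base-station distance
constexpr double round_trip_s = 2.0 * range_m / c_mps;  // ~1.0 microsecond
constexpr double loop_hz = 1.0 / round_trip_s;          // ~1 MHz

// A 10 ps timing step on the round trip resolves ~1.5 mm of one-way path.
constexpr double one_way_res_m = c_mps * 10e-12 / 2.0;
```

The 300 m round trip takes about a microsecond, hence the 1 MHz loop; each 10 ps delay-line step on that loop then corresponds to roughly 1.5 mm of one-way path change.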
I can totally see this working, though radio systems at this level are astrobuck black magic.
The software-defined LTE phased array waveform simulation would be an interesting computational problem. They probably do the time extraction with DSP rather than actual delay lines. I'm wondering how much the upstream channel borks total throughput.
Maybe this is the Netflix special. Agility is always the last crow.