Mozilla To Support Public Key Pinning In Firefox 32
How about a stable 64-bit version for Windows,
There were stable builds for Windows. The problem was that people needed plugins that weren't available (because a 64-bit browser can't run 32-bit plugins without a thunk layer). Chrome did it because Chrome ships with the plugins recompiled for 64-bit (Google has the source code to Flash and all that).
It's the same reason why Microsoft actively discourages use of the 64-bit version of Office.
Though, other than being "64-bit", is there a real reason for having a 64-bit browser?
Google Testing Drone Delivery System: 'Project Wing'
Amazon recently announced it was getting into the advertisement business, and it beat out Google to acquire Twitch.
Pure speculation on my part, but I have to wonder if this is just Google's CEO trying to steal some of the spotlight away from Amazon?
The real reason Amazon scooped up Twitch and not Google was that Google was worried about antitrust issues if it bought Twitch.
Amazon didn't get Twitch because they paid more than Google (Google actually offered more); Google walked away worried. Twitch probably egged Amazon on to get more money out of them after Google went away.
Australian Consumer Watchdog Takes Valve To Court
Reading about the US, I really like the consumer protection laws in Germany. Everything is so much more consumer-friendly and open. Companies have to identify themselves (i.e., have an imprint), all taxes have to be included in prices, and if you buy something you have all kinds of rights (a two-week period to send stuff back/cancel contracts, a two-year warranty on physical items, and such) that cannot be taken away by a ToS.
It's such a different culture. US companies often struggle because they're used to the whole "corporations first" mindset.
On the flip side, you realize stuff in Europe is way more expensive, right? It's also why people in Australia complain that stuff is more expensive there too.
If you mandate that everything has a two-year warranty, then consider the next time anyone in North America asks you, "Do you want to buy an extended warranty?": you're forced to say "yes", because those two years are now built into the price of the unit itself. It doesn't matter that you'd normally say "no" and be done with it; you're forced to say "yes" and pay up.
Likewise, if you're forced to handle returns on digital items, well, don't be surprised when people either a) don't want to do business with you (see music/movies geoblocking), b) charge for the privilege (i.e., it costs more).
Now, Australia is a bit funny in that respect, because they want to encourage the practice of buying from other regions to get better pricing, to help drive down local prices. Yet at the same time, those overseas retailers don't have to accept returns or deal with Australian law (and the Australian representatives can easily say that since you didn't buy it in Australia, the law doesn't apply: if you want a refund, deal with the overseas store you got it from).
In fact, if you compare pricing, you'd find after warranties and embedded taxes, the price gap isn't as big as it once was.
Why Women Have No Time For Wikipedia
No doubt, history is filled with all kinds of evil misogyny, racism, and homophobia... and large swaths of the planet still have those problems, especially in the Islamic world. But all too often we lose sight of the truth that people are individual *actors*, not *objects*. Fighting the scourges of discrimination of various sorts doesn't lead to some predetermined statistical balance; it gives individual actors the *freedom* to make the choices they'd like. Sometimes, those free choices are lopsided, and that's *okay*.
The problem isn't whether a gender imbalance is inherently inappropriate; the question we should be asking is whether there's a systemic problem.
There's a fine line between "they don't want to do it" versus "they're being actively excluded from doing it".
So the question is - in all fields, is there something we're doing that prevents women from entering the tech field, or editing Wikipedia?
It could be something as simple as "women can't stand the immaturity of tech people" (given all the trolls and all that). In which case, the reason we don't have more women is systemic: we're all a bunch of immature idiots who cannot behave. Now, whether or not we think that's a problem is another issue altogether, but at least knowing it makes the "why" a lot clearer.
If the answer is instead "women just don't like tech" then fine, the imbalance will remain because we can't change personal preferences. We can ask perhaps why they don't like tech and it could be stuff like "don't want to sit in front of a screen all day" which is something we cannot change, and must accept.
That's the real question we should be asking - WHY is there an imbalance, and is it something we can potentially fix. If it isn't, then fine, we shouldn't bother trying - but at least we know. If it can be fixed, then perhaps we should look at ways to fix it.
If it's because of something stupid like "tech people are immature", it's a real problem we need to fix for many reasons, including simple respect: if you don't act like you deserve respect, don't be surprised when people don't give it. (Why do you think video games get the stereotype of teenage boys, despite the average gamer being over 35? Act like teenagers, and people will believe you are.)
Why Women Have No Time For Wikipedia
Have you edited Wikipedia lately? It's a fucking nightmare of committee-watched articles and instantaneous reversions.
There we go, the real reason.
I mean, face it, men are just more willing to be the trolls and make life miserable for each other. Women see that and avoid the whole issue altogether.
We saw it with that article on games vs. women. Women simply see what happens, conclude it's basically a bunch of horny teenagers with ragers going on, and steer clear to avoid the trouble. Wikipedia is the same; it's no better in the end.
Now, whether having women think that everyone who enjoys video games or edits Wikipedia is an immature teenager is a good thing or a bad thing, I don't know. It just makes the entire population look no better than construction workers who catcall women as they pass on the street. So much for intelligence, I guess?
Microsoft Dumps 1,500 Apps From Its Windows Store
They should crowdsource this. Simply mark new apps as being in a probationary period and give downloaders the option of tagging the app as misleading, malware, abuse of permissions, etc. It would greatly help their human staff find the bad apples quickly. Of course the same goes for Google and Apple.
The problem comes when the apps are ported.
Say I make ACoolApp on iOS, and it's so good that someone makes an Android version. Call it "AndroidCoolApp". Now, much richer from iOS sales, I decide to try my hand at Android development and port it to Android. Now what?
ACoolApp for Android is technically a "duplicate" of AndroidCoolApp, but AndroidCoolApp was a duplicate of ACoolApp to begin with.
It's happened a few times. And while it's true there are a few intentionally deceptive ones (search for 1Password and try to find the REAL one), there are also plenty where both apps are legitimately developed: someone sees a cool app on the other platform, decides the developer is "taking too long", and releases their own.
And that's the real problem - how do you properly draw the line between apps that are legitimate but happen to be similar because one inspired the other, and apps that are pure scamware and trying to undermine the original developer?
Hell, what if you make a Flappy Bird derivative that has some neat twists in it? Does your app get pulled because of all the others? (And face it, most people would download the app, run it for two seconds, and then mark it as a duplicate without really trying to play it.)
Limiting the Teaching of the Scientific Process In Ohio
Indeed. However, the Discovery Institute's chance of success depends entirely on obfuscating that goal. There's a lot more people who would support "intelligent design" as some sort of oppressed underdog "scientific theory" than who would support it as the blatant theocratic idea it really is.
Which is why it's called Intelligent Design. In fact, when they were converting from Creationism to Intelligent Design, they basically did a search-and-replace. And they left transitional fossils showing how "Creationism" evolved into "Intelligent Design", thanks to a messed-up search-and-replace.
(A transitional fossil is just that: if you have animal A and animal B, and you know B evolved from A, then there had to exist a creature in between A and B, called a transitional fossil, since evolution works on such timescales that many generations of creatures will have existed between then and now.)
Yes, there was evolution in the DI texts :).
It's too bad that more Americans believe in creationism than the great flood, since the latter is a lot more scientifically plausible than the other two ideas you mentioned. I mean, it's pretty clear that the "entire earth" didn't flood, but it may sure have seemed that way to somebody living in what is now the Black Sea about 7600 years ago.
True, there is evidence of it; however, there was unlikely to be an Ark. Maybe 40 days and 40 nights of rain, but that's about it.
Approximately 40% of Americans believe God created humans as-is. (The rest believe either humans evolved, or humans evolved with God providing a helping hand). And that percentage has remained fairly stable over the past 30+ years.
Limiting the Teaching of the Scientific Process In Ohio
On the bright side, framing the debate in those terms might help convince the kind of people who would argue that we should "respect all sides of the issue" (or some politically-correct BS like that) that these anti-scientific ideas really don't belong in science class after all. I think the lawmaker did us a favor and I'm optimistic that his plans will backfire.
It doesn't matter. The WHOLE reason we're having this debate is not about science. It's not even about creationism or "intelligent design" or however we "evolve" the term.
The Discovery Institute (the real organization behind all this) fundamentally believes that society went awry when we did the whole "separation of church and state" thing, and that religion in school meant students were better behaved and more obedient, and society as a whole was just better off.
So that's the real end goal - to get religion - or more correctly, Christianity, back into schools so everyone becomes a "good little Christian boy".
(Yes, it glosses over a LOT of things, like racial issues, the fact that there are more religions than just Christianity, etc).
Basically all of society's ills are the direct result of secularism and the pursuit of "things" (money, toys, stuff) instead of spirituality.
It's just that creationism is the wedge issue that can get them in the door the easiest since a lot more Americans believe in it (than say, a great flood happened, or that everything we see was made in a week a few thousand years ago). And once you're in the door, spreading the other beliefs becomes a lot easier.
Fermilab Begins Testing Holographic Universe Theory
Or Quantum theory. Ever notice how things are quantized (i.e., they come in discrete packets of stuff) rather than a continuous spectrum?
Or how once you get below a certain size, the rules of physics just seem to break down and it all becomes random?
Well, we hit the resolution limit of the simulation, and the quantum "foam" is the LSB of the simulation. Even in computing today (especially with floating point), you have to be careful about how you order your operations so you don't lose too many bits in the mantissa to computation error. Well, that's what the quantum world is: computation errors flipping the LSB around in random, unpredictable ways. We're able to guess at the likelihood of it being in a certain state because the simulation runs the same operations the same way (and loss of precision can generally be approximated). But it still loses precision during calculations, which is why the quantum world is statistical. A software upgrade to the simulation could change the way the least precise bit behaves, if they changed that part of the simulation's calculations.
So there you go. The resolution limit of the universe is h-bar, representing the limited precision of our simulation.
HP Recalls 6 Million Power Cables Over Fire Hazard
How do you fuck something like that up?
Separate assemblies: the folks who do the power supply are generally very good at it (including the IEC inlet the AC cord plugs into). The output end is typically just a header, and the cables are provided by a third party that specializes in making terminated cables (especially modern laptop cables, which can have several conductors and indicators), with the only requirement being that the power-supply end use a mating connector.
Though, cases and other stuff are also often done by someone else (the power supply manufacturer will often assemble it all together though).
And customers are stupid: they yank on cords, which stretches and breaks the wires, or they bend them tightly. All of it frays the insulation.
Apple has the same problem, and oftentimes if you take in a power adapter with a frayed end, they may replace it at reduced cost. More so if the machine it goes with is under AppleCare (and since they're all compatible with each other...).
Project Zero Exploits 'Unexploitable' Glibc Bug
Yes, ASLR somewhat works but is an afterthought. The ultimate solution would be to stop using computers which mix data and code adjacently, in other words get rid of the whole von Neumann computer architecture.
There are plenty of processors that are Harvard architecture out there (separate data/instruction memory). Though modern architectures do have a bit of Harvard in them (the separated instruction and data caches). And memory segmentation and permissions do help split code and data into separate areas.
The problem is that von Neumann makes computers extremely useful, because you're able to treat code as data. You can do fancy things like load a program off disk into memory and execute it, or load a program from a network device using any programmable protocol and run it. This only works because the OS treats the code text as data temporarily to load it off storage (local or otherwise) and into memory. (After all, loading a program into memory consists of reading the executable off disk like you'd read a regular data file, then running that code.) Heck, modern paging systems in an OS rely on it: reloading a memory page from disk doesn't care whether it's code or data. The OS just sets up a new memory page to hold the contents, finds the location on disk, and tells the disk driver to populate that memory with data; on completion, it re-executes the faulting instruction (or completes the pre-fetch).
Harvard architecture machines need to have a way to load their program information and pre-load data into memory, which is why traditionally they only run fixed program code (like DSP). Or have a von Neumann machine load the code into instruction RAM. (They're great for streaming and signals where the code doesn't change, but you're constantly passing data through the system)
Free Law Casebook Project Starts With IP Coursebook
While I agree that NC is generally misunderstood by lay licensors, and greatly more restrictive than most people realise, ND has a valuable place in the licensing suite.
For example, if you write an opinion piece, adding the ND clause will make sure that no-one can (legitimately) alter or distort the text, and use it to misrepresent the position you hold/held.
Otherwise, using ND for non-opinion works shows a certain amount of arrogance. It's effectively proclaiming "no one but myself could possibly make this any better".
Not really. Even under the most restrictive copyright (traditional "All Rights Reserved"), people still routinely distort and misrepresent your position. It's called "creative editing", and it can change the meaning completely.
If people want to misrepresent you, they're going to, regardless of whether you use ND or full copyright. And no, just because it's on the web doesn't mean it's not under full copyright: an author can legitimately post an opinion piece that is completely copyrighted (see editorials) yet freely readable. It's under copyright, so no one can legitimately alter or distort the text.
Oh, but what about fair comment and all the other fair use rules, you say? Guess what: they apply to CC works too, because just like copyleft, CC relies on copyright law to specify the minimum rights everyone has, including fair use, satire, etc. CC and other copyleft licenses simply grant more rights than copyright alone would, so you can ignore the CC license just fine; you'll just be held to the more restrictive default.
ND doesn't solve anything. It probably makes things worse, since it just means your work gets copied everywhere, whereas full copyright means your online post is the only legitimate copy and people should link to it as the original source. Those who would just re-host it and violate copyright law will continue to do so, regardless of "All Rights Reserved" or CC.
VMware Unveils Workplace Suite and NVIDIA Partnership For Chromebooks
Enough. No, I do not want "cloud" services, thanks. I want my good old desktop with local applications that don't need to be connected to the internet 24/7 to work; not everyone has a fiber connection available all the time for this.
So don't use it. Why does it have to be an either/or situation? If you need your desktop, continue using it.
This service is more for those who have a desktop only because they need to run something on it. You know, like how some people ran Windows just to play a video game. Or for one application they use infrequently but have to use.
Hell, this is practically the ideal situation for parents who basically neglect their PCs and whose machines you spend every Thanksgiving fixing. You replace the PC with a Chromebook (a locked-down web browser) and use a cloud desktop for the few things they need a desktop PC for.
It's like those who claimed tablets would replace desktops; Jobs was far more accurate in saying we'd always have desktops, even in the age of the "Post-PC".
3 Years In, a "B" For Tim Cook's Performance at Apple
It would never catch on because it doesn't support what existing micro-USB connectors do, and what other manufacturers already use. For example, there is no way to do uncompressed 1080p video over it, and phones were doing that three or four years ago, so they're not likely to drop back now. The cost of the Apple video solution is prohibitive as well, when an MHL adapter costs about £5.
Lightning doesn't seem to support USB peripherals either. Not sure if it is an inherent limitation of the design or just that Apple don't use them, but many Android devices can make use of USB flash drives, card readers, game controllers, keyboards, mice and the like.
micro USB connectors DO NOT DO VIDEO.
MHL and SlimPort and every other standard do. No, those connectors are not compatible with each other, but they do allow you to fit a micro-USB plug into them. They are not, however, micro USB. That would be like Apple inventing a new connector that you can use micro USB with: it just means the connector was made compatible, but if Apple puts FireWire/Thunderbolt/whatever on it, that doesn't mean micro USB inherits those properties.
USB peripherals are supported by Lightning just fine. You can connect cameras, memory cards, even USB DACs to an iOS device; you do need the "Camera Connection Kit", which converts your 30-pin or Lightning port into a USB host port, to which you can connect a camera, memory card reader, flash drive, or USB audio device. Or a keyboard, if you wanted.
And it's taken long enough for USB to get to the point where you can plug it in without caring about orientation. Micro USB isn't immune to this either: micro-AB ports accept either plug type because of their godawful design, and most devices should have been using micro-AB ports instead of just micro-B plus special adapters to make an A port. It's just that the user experience is so terrible, and it makes the port incompatible with MHL and SlimPort (which only work with micro-B cables).
Major Delays, Revamped Beta For Credit-Card Consolidating Gadget Coin
When you sign the back of your card, you're providing a template for forgery to anyone that happens to steal or find your card. I can understand why the credit card company would want you to do this, as a convincing forgery job on a signed sales receipt shifts liability from them to the consumer. However, as a consumer, I don't understand why you'd willingly buy in to such a system.
Because signing a credit card isn't for verification. It's for agreement of the terms and conditions.
Signing the back of your card is how you indicate that you agree to the terms of your cardholder agreement, in which your provider has spelled out how you pay them back, how they charge interest, what the interest rate is, billing, etc. If you don't sign the card and the merchant accepts it anyway, then they have to eat the loss, because you never agreed to the terms.
Likewise, signing the chit just means you agree to pay the amount shown in line with your agreement.
It's just contracts, in the end. The card signature shows you agree to the contract between you and the credit card provider. The chit signature shows you agree to the contract to pay the amount shown. If someone else forges your signature, that's fraud and you're not responsible. Likewise, if someone uses your credit card with their signature, that too is valid since it was signed under agreement.
There's nothing special about the signature. Banks routinely loan out lots of money without even a "reference signature" to compare to, yet they're still valid.
You're just signing to show you agreed to the presented terms.
If you look closely, the chits all say "Cardholder agrees to pay the amount shown per the terms of the cardholder agreement" which is what you're REALLY agreeing to.
Hackers Claim PlayStation Network Take-Down
Someone who has enough skill to use the Other OS function probably has enough skill to install CFW
Actually, CFW is freakishly easy to install. It's just an offline update.
No one uses OtherOS anymore. The reason you use CFW is to pirate games and all that. It always has been, ever since the OtherOS folks, pissed at losing the feature, hacked the PS3 to restore it. That ended up leaving a huge hole for everyone else to exploit, so there are more than a few ISO loaders and dumpers and all that.
Not sure about their status to play online, since I hear that Sony sends down a binary to run on them to report on the status (client-side trust), which I assume is pretty easy to fake after a few days.
Anyhow, it appears Xbox Live is back up; the best they could do was make it "intermittent". And only login was affected.
3 Years In, a "B" For Tim Cook's Performance at Apple
A CEO that gets it.
Tim Cook realizes he's not Steve Jobs. Steve Jobs was perhaps one of three people in the world who could be an asshole and still get results (the other two: Linus Torvalds and Theo de Raadt). Say what you want, but they're all assholes, and yet, mysteriously, they get results.
Everyone else who've tried, failed miserably.
And I'm sure Cook realizes it too: he's no Jobs, and being an asshole would destroy the company (most who try fail, which is why there are only three people in the world who could pull it off). He's got to be different, and if that means revamping the company from being under someone's thumb to how companies should be run, so be it.
Still, you do miss the odd Jobs-style flare-up. I mean, Ballmer had his chairs. Cook is just a bit... understated.
For Microsoft, $93B Abroad Means Avoiding $30B Tax Hit
It's almost like the editors wanted to publish a biased article or something. Scandalous.
Exactly. It's exactly the same thing Apple, Google and everyone else has.
Hell, in Apple's case, it's cheaper to borrow money in the US than to repatriate it. When Apple needed $17B, they took on debt against future US earnings, because it would cost them less to pay back that principal plus interest than to bring the money into the US from offshore (which I think would've meant close to $30B to get $17B they could spend). And Apple has very rarely taken on debt intentionally.
An unintentional side effect is that Apple, Microsoft and Google have to spend that money outside the US, so they hire developers and other staff outside the US as well.
The Tech Fixes the PS3 Still Needs, Eight Years On
If my PS3 breaks while they're still making them? I'm not sure I'd buy another. I'd just get a cheap 3D-capable Blu-Ray player and play SotN by other means.
You'd get better quality from a cheap 3D Blu-ray player these days: the PS3's HDMI output only supports half-resolution 3D and, in doing so, lossy audio, making it one of the least desirable 3D players out there.
3D over HDMI comes in four formats: side-by-side (SBS), top-and-bottom, line-interleaved, and frame-packed. The last carries two full-resolution frames (1920x1080 each) per output frame, while the others fit both eyes into a single 1080p frame. Side-by-side means each eye's frame is 960x1080 (losing half the horizontal resolution), top-and-bottom means they're 1920x540 (losing half the vertical resolution), and line-interleaved means every other line belongs to one eye, again losing half the vertical resolution.
Couple that with lossy audio (the PS3 can't do lossless audio in 3D mode, go figure), and it was a nice "how do you do" feature. The people who could use it, however, were generally people who had spent a lot of money on a nice system. Even today they still are, since 3D has practically disappeared from store shelves, relegated to a few high-end models; if you want it, you pay for it.
The Tech Fixes the PS3 Still Needs, Eight Years On
The removal of OtherOS didn't affect the average gamer, it only affected a very small group of people who installed Yellowdog Linux out of curiosity. I was one of those who did so -- a year later, I didn't particularly care that the feature was removed, because as everyone else who tried it discovered, OtherOS sucked. The hypervisor, which can't be worked around, locked out much of the hardware. Want to use it as a cool games emulator? Good idea! But since the hypervisor has always restricted the RSX, the PS3 runs much slower than your standard HTPC, and has almost no graphics acceleration.
It's only been recently that some exploits with specific hypervisor versions have allowed the Linux kernel to boot in "game mode," unlocking full graphics acceleration, but that's not a Sony feature and wasn't available through OtherOS.
OtherOS always sucked because Sony was scared it would lead to pirated games or homebrew games that competed with their own offerings, so they crippled it from the very start.
And you know what? It helped keep piracy at bay.
Here's one thing Microsoft learned on the original Xbox: when the interests of homebrewers and pirates align, you lose. It's why the Xbox 360 is locked down and, to this day, unbroken save for limited piracy hacks.
Sony had the same thing with OtherOS. Within six months of them removing OtherOS, the PS3's horrendously broken security system was breached, by people looking to run OtherOS! And what happened after that? The pirates came in and basically took over. It was so bad in the early days that you could still use PSN with a fully opened console (which led to the PSN shutdown a few months later). And these days you still can, since the complete console security system was breached: anything Sony tries has an element of "trust the client", which means it works for a few days, then fails as everyone learns how to spoof the response.
And perhaps another factor was Microsoft's "opening" of the Xbox 360 via XNA and Xbox Live Indie Games, where homebrewers could write games, play them, and even offer them for sale.
It's led to the Xbox 360 remaining unbreached: no "hacked" console can connect to Xbox Live without being detected, and the security of the software is such that it still only runs Microsoft's code.
So if you hacked your xbox, you could play pirated games, but never online.