Canon Printer Hacked To Run Doom Video Game
I really shouldn't be getting my tech news from sites that are basically a day behind BBC News.
Sapphire Glass Didn't Pass iPhone Drop Test According to Reports
Anecdotal or not, almost everyone I come into contact with who has an iPhone is either living with a smashed screen or had to take it back to Apple to get the screen replaced after smashing it.
I don't see anywhere near as many non-Apple phones that are smashed that easily.
Personally, maybe I'm just not as clumsy, but I've dropped my phone any number of times and even kicked it accidentally as I dropped it and smashed it into a wall... and it wasn't even scratched. I don't think I've ever managed to break a phone like that, and I've had some spectacular drops in the past (plastic covers and batteries flying all over the room, but just put it back together and it worked).
School Installs Biometric Fingerprint System For Cafeteria
I work in IT in English schools.
Welcome to a decade ago.
I've worked in several schools that have biometric library systems and the move to cashless canteens has been underway for years (I've never happened to work with one, but that's not because they aren't around).
It is sold as preventing bullying, stopping you having to pay for the cards, etc. The privacy implications came up 10-15 years ago. Nobody, especially parents, really cared.
Hell, five years ago, my daughter's creche had fingerprint entry (I refused to take part, mainly because I saw it as insecure given I could gummi-bear the reader and enter as whoever came in last, but I was apparently the first to complain).
Old news, people. It's already in schools all over the UK. There was minimal protest.
The State of ZFS On Linux
1) Yes, it's a general feature of RAIDs. Multiple devices are reading the data, and "fastest finger first" wins.
2) File server only dependent on your disk format, you mean? I happen to agree here but, if you're doing it at the FS level, then a standardised RAID layout (such as Linux md / LVM) is the same thing. The non-standard formats that tie you into hardware do so for a reason - hardware RAID provides something that no software RAID can: sheer speed. (Though, please note, I've happily run Linux software RAID on server-end hardware in production systems without any performance problems.)
3) 3 disks dying out of 11? RAID6+1 will actually do better (I think... I can't do the maths just now).
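For what it's worth, the simultaneous-failure part of the maths can be brute-forced. This toy enumeration (my own assumptions, not from the article) compares an 11-disk triple-parity layout against a 10-disk RAID6 plus one hot spare, and deliberately ignores rebuild windows - the scenario where the spare actually earns its keep:

```python
from itertools import combinations

# Toy model: 11 disks. Which simultaneous 3-disk failures are survivable?
#  - RAIDZ3 / triple parity over all 11 disks: any 3 failures are fine.
#  - RAID6 over 10 disks plus 1 hot spare: only 2 concurrent failures of
#    active disks are tolerated; the spare only helps once it has rebuilt.
disks = range(11)
spare = 10  # assume disk 10 is the hot spare in the RAID6+spare layout

def raidz3_survives(dead):
    return len(dead) <= 3

def raid6_spare_survives(dead):
    # a failure hitting the idle spare costs the array no redundancy
    active_failures = [d for d in dead if d != spare]
    return len(active_failures) <= 2

triples = list(combinations(disks, 3))
z3 = sum(raidz3_survives(t) for t in triples)
r6 = sum(raid6_spare_survives(t) for t in triples)
print(f"RAIDZ3 survives {z3}/{len(triples)} simultaneous triple failures")
print(f"RAID6+spare survives {r6}/{len(triples)}")
```

Once rebuild time enters the picture the comparison shifts, because the spare can restore redundancy before a third failure arrives - so for failures spread out over days, RAID6+spare closes most of that gap.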
ZFS is cool, don't get me wrong, but it's basically just a RAID filesystem. The Merkle-tree journalling trick just saves having to have a battery backup, but whether it works like that in real-life failures is another matter entirely.
California Tells Businesses: Stop Trying To Ban Consumer Reviews
I don't care how many 1-star reviews a place gets. You know what matters? How they respond to them.
I'd rather go to a place that replies politely to every negative review than one that ignores them entirely. And if a review is genuinely fake, a reply such as "We have no record of your stay, but we're sorry that you had trouble" speaks a thousand times more to what's actually happening than any amount of ignoring it.
Everywhere gets bad reviews. You cannot have perfection. What matters is how you deal with it when you fuck up.
5 Million Gmail Passwords Leaked, Google Says No Evidence Of Compromise
Same for me, same for my brother.
Someone's just collected 5m GMail addresses from somewhere.
To be honest, it's more likely that my address has been sold by a Google employee - there's no way I should be getting as much spam as I do to an address that's completely unadvertised and which is only the end-point of various domain forwarding.
Password compromise too? Just sounds like someone's collated all the compromised data from other websites etc. they could find, rather than hacked into GMail somehow.
Northwest Passage Exploration Ship Found
Yes, but HMS Please Don't Hurt Me doesn't have the same kind of ring to it.
Feds Say NSA "Bogeyman" Did Not Find Silk Road's Servers
Security software cannot fix stupidity.
In this case, one of the scripts on the Tor hidden service pulled in data from outside Tor and thus advertised its globally-addressable IP address.
Sure, they can improve their processes and pull that script and replace it with a Tor-compatible version - but Tor can't detect this kind of stupidity and fix it for you. If you're stupid enough to put your home address on a Tor service, there's nothing Tor can do about that either.
The most interesting thing about this story is that all the "Tor was somehow broken by an omnipotent government agency" nonsense actually boiled down to "Idiots were giving out their own IP over a Tor service providing illegal content" (which is more often than not the case - I'm not at all convinced that most countries actually have the talent and resources to do what people claim they can, let alone that they routinely do them).
This either proves how effective Tor is when used properly, or how ineffective the relevant agency is against it.
Honestly, I don't care too much about the detail. I don't support the illegal activity that this service was built upon. But I find it worrying that they were that stupid, and that it was that easy to "find" them, and also that the relevant agencies don't seem to have made much progress at all since the days of GCHQ.
All I see in the modern day is unbreakable maths stopping (or severely hindering) anyone but the most stupid people from being caught. I see that as both a good thing (encryption, etc. doing what it was designed to do, and implemented strongly) and a bad thing (our governments are still unable to stop such services because they don't have the talent to infiltrate them).
Ask Slashdot: Best Service To Digitize VHS Home Movies?
The analogue signal on a VHS tape corresponds to an exact (enough) representation of a PAL or NTSC signal, which you can capture in as much detail as you like but it will hardly vary.
The storage mechanism may be able to cope with more, but the actual useful data that could ever come down the cable is limited to a quite precise specification. As such, higher-resolution samples aren't going to help.
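This is just the sampling theorem in practice: a band-limited signal is completely determined once you sample above twice its bandwidth, and faster sampling recovers nothing new. A quick numpy sketch with a made-up periodic test signal (not real video data):

```python
import numpy as np

N1, N2 = 64, 256            # a "sufficient" rate and one 4x faster
n1 = np.arange(N1) / N1     # one period of the signal, N1 samples
n2 = np.arange(N2) / N2     # same period, N2 samples

def signal(t):
    # band-limited: highest component (12 cycles) is well under N1/2
    return (np.sin(2 * np.pi * 3 * t)
            + 0.5 * np.sin(2 * np.pi * 7 * t)
            + 0.25 * np.sin(2 * np.pi * 12 * t))

coarse = signal(n1)
fine = signal(n2)           # "capture in more detail"

# Reconstruct the fine sampling purely from the coarse one (FFT zero-pad)
X = np.fft.fft(coarse)
X2 = np.zeros(N2, dtype=complex)
X2[:N1 // 2] = X[:N1 // 2]
X2[-(N1 // 2):] = X[-(N1 // 2):]
reconstructed = np.real(np.fft.ifft(X2)) * (N2 / N1)

# the extra samples contained no information the coarse capture lacked
assert np.allclose(reconstructed, fine, atol=1e-8)
```

Real VHS capture has plenty of noise and timing wobble to worry about, but sampling resolution beyond the signal's specified bandwidth buys you nothing.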
Also, VHS isn't entirely analogue. It has a magnetic representation on tape that is - again - highly specified to enable readback.
As such, it's not akin to, say, photographic slides or negatives (but even they have a useful resolution beyond which we won't see any advantage in delving), but more akin to storing computer data on audio cassette - something which formats like TZX encompass entirely even if they do not store every single magnetic charge that may be on the tape.
The recording is laid down in helical stripes around the tape, by the way, and what goes into those stripes is tightly specified.
Facebook's Auto-Play Videos Chew Up Expensive Data Plans
Stop whinging about your browser allowing shit and treat data from the Internet as untrusted and unable to initiate actions without your explicit consent.
Ask Slashdot: Best Service To Digitize VHS Home Movies?
Does the original magnetic tape have those properties?
Unlikely, unless it's S-VHS, and even then I don't think so if it was recorded on any normal household camera (quoting the wiki: "In VHS, the chroma carrier is both severely bandlimited and rather noisy, a limitation that S-VHS does not address" - and they mention that S-VHS tapes were used to record 20-bit audio, but only if you were prepared to use several minutes of videotape for one minute of audio, so the chances that it recorded colour with even 8-bit accuracy are slim).
You have to think mathematically - significant digits. If the original only has 3 significant digits, there's absolutely NO POINT in handling it with anything beyond 3 significant digits. All you're preserving is error anyway.
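To make the "preserving error" point concrete, here's an illustrative simulation (my own made-up numbers): a measurement accurate to roughly three significant digits, stored once at full float64 precision and once rounded to four decimal digits. Both storages sit essentially the same distance from the truth:

```python
import numpy as np

rng = np.random.default_rng(42)
truth = rng.random(100_000)                      # the "real" scene
# a measurement accurate to roughly 3 significant digits (0.1% noise)
measured = truth * (1 + rng.normal(0, 1e-3, truth.size))

stored_full = measured                   # kept at full float64 precision
stored_rounded = np.round(measured, 4)   # kept with only 4 decimal digits

err_full = np.abs(stored_full - truth).mean()
err_rounded = np.abs(stored_rounded - truth).mean()

# throwing away the "extra" digits barely changes the distance to the truth
print(f"mean error, full precision:  {err_full:.2e}")
print(f"mean error, rounded storage: {err_rounded:.2e}")
assert abs(err_rounded - err_full) / err_full < 0.02
```

The extra digits were never signal; they were a high-fidelity record of the noise.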
You know what? Digitise it yourself if you're that worried. Get a capture card (good luck finding one that captures RAW), plug it into a high-end VHS player, stick it all in 32-bit PNG channels if you want. The end result will be insignificantly different from your original but will cost ORDERS OF MAGNITUDE more.
I'm with you on quality, I get that, and you want to get that stuff off tape sooner rather than later if it holds any kind of emotional significance to you (chances are, your holiday tapes from the 80's will never be played again once you're dead, and only a handful of times until then). But you're really trying to go too far because you've heard some things on audiophile/videophile websites and the like and think you have to do that.
You know what? The extra time spent with your family, and the extra money to follow the kid's hobbies, will more than make up for any theoretical loss in the MPEG encoding of some home movies. And, at the end of the day, so long as you can see who the people in the movie are and what's happening, who cares about the fine detail? You can't Bladerunner it back to 4K, so what you do now will not degrade in the future. And, chances are, what you do now is higher quality than anything on the original tapes anyway (unless you intend to capture the missing parts of the TV interlace somehow?).
Give it up. Buy a GBP20 adaptor from your local store. Buy a slide-and-film scanner while you're there. Have a night in with the family where you're all doing one job - scanning, sorting, cleaning, labelling, filing, archiving - and get everything you have in your archives digitised. Copy it to friends and family (who, honestly, really won't care but will be polite). Then forget about it until little Johnny is 18 and you want to embarrass him in front of his girlfriend.
IT Job Hiring Slumps
The age-old fallacy that what specifics you teach people has any correlation to their future careers.
If you're a programmer, the language does not matter. It's literally that simple. You could WRITE your own language if it came to it.
If you're not, then whether you learned some passing-fad language or one that went out with the Ark is hardly worth worrying about.
In the same way that all my science classes taught me that Pluto was a planet, all my CS classes taught me about languages from the 60's that aren't in use any more. Literally, by the time you get to the workplace the language does not matter. It's like a car mechanic who's repaired some Fords in the past... it won't help him much on the new Fords or on other models if he can't use the underlying skills instead of the rote teaching.
Course languages should not be chosen to suit employers, who - generally speaking - will be demanding something else by the time those students graduate. They should be chosen to promote understanding, completeness and practicality (I'm not saying we should all teach a language that doesn't exist outside of academia, for example). Just for the simple matter of students being able to obtain a compiler and get to grips with it at home, if nothing else.
But saying that business should dictate the languages taught is nonsensical. Things used in business are generally a BAD IDEA. We know they are. Because they are quick, cheap and dirty. That shouldn't be the basis of an education, especially when - as you hint at - it's the theory that matters.
For the record, I have been "officially" taught BBC BASIC, Visual Basic 3.0 and Java. And I have a degree in CS. Only one of those is close to a useful language any more, and that's the one being ridiculed in the previous article for its use in the world's most popular brand of smartphones nowadays. If anything keeps me in a job, it's C, SQL, and the ability to quickly read example code from any language (PHP, Ruby, Perl, VB, C#, you name it) and knock up something that works by knowing that they are all pretty much the same at the bottom.
Course languages have almost zero correlation to future success. Business is already suspicious of people who do a 3-year CS degree and then tell you they can program anything in Java. It honestly doesn't matter what the language is, so business shouldn't be dictating it.
IT Job Hiring Slumps
What employers demand should not dictate what universities teach.
Otherwise we are quite literally giving a bunch of people McDonald's certifications and telling them that's an education.
Carmack On Mobile VR Development
It's back to the age-old arguments.
You can have the speed of native code, if you deal with the problems of native code (device compatibility, non-portability, security, etc.)
Or you can have the security and cross-platformness of some standardised intermediary language that basically runs as a VM at a slightly slower pace.
To be honest, I think Carmack's best work is long behind him, but there's also a need to develop on appropriate devices. If you need a device to be that fast and powerful, then aiming at smartphones and tablets probably isn't the best idea at the moment - they aren't the cutting edge of the market and won't be for a long time. In the same way that aiming at the 286 probably wouldn't have been the best idea for Doom, or that aiming only for software rendering probably wasn't the best idea for Quake.
Get the code going and providing something people want, and they will either buy devices that can run it, or ask you what you need for a port to work. By the time you're finished, the speed you're after will be available in the next device to launch.
Isn't this how cutting-edge development has always worked? You had to have huge resources and powerful machines totally unlike those seen before to make the software run at the speed you need while you're developing it. Then a year after release, everyone has those devices and soon people are calling your code "old"?
Mushroom-Like Deep Sea Organism May Be New Branch of Life
Which will work fine right up until the point that one person is genetically un-human and then we'll have no end of arguments (and maybe even bloodshed) on our hands.
DNA is also a bit of a problem - are you talking mitochondrial DNA, etc.? Because you don't have "one" DNA sequence in your body. You have several thousand, minimum. Thus you are instantly several thousand species in a single individual, and actually the largest amount of DNA in you probably isn't "you", as such.
Out of the Warehouse: Climate Researchers Rescue Long-Lost Satellite Images
And until someone works out what we're supposed to do about it, we can all sit around and argue about whether or not we caused it. Like a bunch of people in a traffic accident swearing and shouting at each other and not one bothering to use the brakes. Sure, knowing it's us must lead us to find out why it's us, which might lead us to find out how we stop doing whatever-it-is.
Fact is, in EVERY discussion, every news story, every article, every paper I see, there's endless blame, "confirmation", etc. and yet not one bright idea about what to do about it.
Let's make it easy. I will happily take the assumption that we're somehow doing this. I'll even assume that how we THINK we're doing this is exactly how it's happening (that's not a given, by a long shot). And I'll assume that the catastrophic predictions are all correct.
Just quite what the fuck are we actually doing about it? What can we do about it? Does stopping doing those things actually hurt us more in the long run than the most dire of consequences otherwise (seriously, if we have to knock energy production down even a single order of magnitude, life changes forever for everyone on the planet)? What if all we can do is slow the change and not stop it? Is it then really worth all the huge, massive, political posturing, scientific research, bitching and arguing if all we can do is, say, buy ourselves an extra 10 years at ENORMOUS cost to our way of life?
Honestly, what kind of measures are we suggesting? How do we get international co-operation on those measures? What if we DON'T get international co-operation on those measures? How does that impact the average person, the average industry, the average production cycle? Are we going to have to abandon modern life and just-those-inventions that might save us (e.g. producing new large-scale energy projects) in order to survive at all?
I'm happy, as someone of a scientific mind, to entertain any amount of what-if's. But the ones that are never addressed boil down to "What if we're right?" And, to be honest, the answers I find from that are either scarier or more lacking than anything the doomsayers might chime in with about sea level rises, etc.
Whether or not I believe the evidence, the way it's handled, or the final conclusion, I still am interested in the suggested outcome. Because, in the AGW debate more than any, it seems to be a lot of political posturing to get someone to agree - to what? A complete absence of solutions. As such, I don't see the "profit" (intellectual or otherwise) in someone choosing any particular path beyond their own beliefs and this, more than anything, makes me question quite what we're hoping to get out of "winning" the argument.
Seriously. An utterly serious question. If we are right, what do we do, and how does that affect us? Because I believe (i.e. zero evidence) that, actually, the cure might be worse than the disease if it's this poorly researched and nobody's really got anything viable. If we can't point at something at say "If we spent more money on that, or researched this, or got those people to co-operate, it would solve the problem" then what - besides arguing about the cause - is ever going to change about the situation?
Wi-Fi Router Attack Only Requires a Single PIN Guess
Sorry, but maybe it would pay to Google things and keep up with security news sites occasionally. Sure, I'm a home user for the most part; my home connections aren't likely to be attacked.
But WPA-TKIP is fatally flawed and allows - while not revealing the password - replay attacks that enable packet injection and all kinds of other nasties. Some of this has been known about since 2008. Some of this is because WPA still uses the RC4 stream cipher (which is dead nowadays) in some situations too, whereas WPA2 uses AES.
Services such as CloudCracker also mean that anything not already a seriously complex passphrase is only a couple of hundred dollars away from complete compromise - and NOBODY at home has a passphrase that complex, as you normally have to give it to people (yourself included!).
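The reason cracking costs real money rather than being free is that WPA/WPA2 personal deliberately makes each guess expensive: the pre-shared key is derived via PBKDF2-HMAC-SHA1 over 4096 iterations, salted with the SSID. A quick sketch of the per-guess cost on a single CPU core (SSID and candidate passphrases are invented; timings obviously vary by machine):

```python
import hashlib
import time

def wpa_psk(passphrase: str, ssid: str) -> bytes:
    # WPA/WPA2 personal: PMK = PBKDF2-HMAC-SHA1(passphrase, ssid, 4096, 32)
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(),
                               ssid.encode(), 4096, 32)

start = time.perf_counter()
guesses = 200
for i in range(guesses):
    wpa_psk(f"candidate-{i}", "MyHomeNetwork")
elapsed = time.perf_counter() - start

per_guess = elapsed / guesses
print(f"~{per_guess * 1e3:.2f} ms per guess on this CPU")
# an 8-character random lowercase passphrase, at this single-core rate:
keyspace = 26 ** 8
print(f"worst case: {keyspace * per_guess / 86400 / 365:.0f} years")
```

Dedicated GPU rigs run this orders of magnitude faster - which is exactly why a weak or dictionary passphrase falls for a couple of hundred dollars, while a long random one stays out of reach.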
WPA / TKIP are thus dead. WPA2 / AES have measures against such things. And WPA2 hardware is old-hat now and it's been available for years. There's no excuse to still be lingering on WPA, and WEP is just asking for it - it's actually quicker to crack WEP even casually than it is to piss about asking people for their passphrase (have done it to several friends who told me they were "secure"). WPA's life is, to put it bluntly, limited at best.
Guest network - I have no need of one. I certainly have no need of one I have to turn on and off all the time. So it stays off. With modern 3G, the chances of anyone wanting to join your wireless are entirely minimal, but a lot of home routers that offer guest Wifi have associated vulnerabilities or are commercial services I have no desire to offer (BT-FON etc.).
And there are three usable channels on 802.11g. Three. Ignore the 13 you might appear to be given in the router config; they overlap. Thus, chances are that in any suburban environment you are already picking up a ton of other networks that overlap yours. Kill off the guest networks, whatever the channel, or move to 5GHz (which is still pretty dead, but liable to get a lot busier over time).
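The three-channel claim is easy to verify from the 2.4 GHz channel plan: centres sit 5 MHz apart, but each transmission occupies roughly 22 MHz (the classic DSSS width the spacing was designed around), so only three mutually clear channels fit in 1-13:

```python
# 2.4 GHz Wi-Fi channel centres: channel 1 = 2412 MHz, then 5 MHz steps
centre = {ch: 2412 + 5 * (ch - 1) for ch in range(1, 14)}
WIDTH = 22  # MHz, classic 802.11b/g DSSS channel width

def overlaps(a, b):
    # two transmissions collide if their centres are closer than one width
    return abs(centre[a] - centre[b]) < WIDTH

# greedily pick the lowest channels that don't overlap anything chosen
clear = []
for ch in range(1, 14):
    if all(not overlaps(ch, c) for c in clear):
        clear.append(ch)

print(clear)
```

The result is the familiar 1 / 6 / 11 set - everything else is stepping on a neighbour.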
And I VPN all my wireless. The added ping is about 1ms on normal hardware (and, no, I don't have particularly high-end equipment on the VPN side - usually some old crappy desktop running Linux). You can test this quite simply with even the simplest ping to Google, using the Linux tools that show sub-ms pings as proper floats. VPN costs are extremely minimal. Gaming is NOT affected any more than anything else. In fact, bulk downloads/uploads are liable to see more of a delay than tiny regular packets.
Wi-Fi Router Attack Only Requires a Single PIN Guess
I didn't personally use Wifi until it had been in place, with an encryption system that had proven itself, for a number of years before I trusted my networks and data to it.
WEP was broken, so I reset the clock. WPA was compromised, so I reset the clock. Only WPA2 has proved difficult to break, because it "simplified" the problem: using real, proven encryption schemes rather than making one up as we go along.
Common bloody sense.
RAYA: Real-time Audio Engine Simulation In Quake
Realistic sound has been around, as people point out, since the Aureal days. Now, to be honest, it should be baked into every engine and tied to your textures (soft textures absorb sound, shiny textures reflect sound, etc.).
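As an illustrative sketch of the idea (made-up coefficients and function names, not any real engine's API), tying echo behaviour to a surface's material can be as simple as an absorption lookup per texture:

```python
# made-up absorption coefficients per "texture" material
# (0 = perfect acoustic mirror, 1 = completely dead surface)
ABSORPTION = {"carpet": 0.6, "concrete": 0.05, "curtain": 0.8, "tile": 0.02}

SPEED_OF_SOUND = 343.0  # m/s

def reflection(amplitude, distance_to_wall, material):
    """First-order echo off one surface: delay plus material loss."""
    path = 2 * distance_to_wall           # there and back
    delay = path / SPEED_OF_SOUND         # seconds until the echo arrives
    loss = 1.0 - ABSORPTION[material]     # fraction the surface gives back
    spread = 1.0 / (1.0 + path)           # crude distance falloff
    return amplitude * loss * spread, delay

echo_tile, d1 = reflection(1.0, 5.0, "tile")
echo_curtain, d2 = reflection(1.0, 5.0, "curtain")
print(f"tile echo {echo_tile:.3f}, curtain echo {echo_curtain:.3f}, "
      f"both after {d1 * 1000:.1f} ms")
```

Same room geometry, very different acoustic result - which is exactly the "shiny reflects, soft absorbs" behaviour the artist already encoded in the texture choice.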
The fact that it isn't means one of a few things: it's too expensive (which I can't believe nowadays), it adds too much cost to development time (but surely hand-tweaking those sounds for echo etc. is more costly than just putting in a pure sound and letting the engine modify it as necessary), people just don't notice that much, or the patent field is too heavy.
Take things like TF2, HL, CS, etc. They all share the same engine. They are all 3D open environments. It is vital to know where shots etc. are coming from in order to play properly. But we don't see such audio tricks. That, to me, suggests they aren't necessary, or certainly aren't worth the time.
And, to be honest, I watched "ray-traced Quake" what, ten years ago? That tech still isn't used in modern games, for the above reasons. It's doable but expensive, the development time is costly, the effect isn't that much different from pure cheating on the 3D drawing, and it's not in any of the major game engines. That suggests the value of such things is minimal.
And, to be honest, the realistic-"ness" of a game lasts the first few minutes of unboxing and then that's it. What destroys your immersion from then on is crappy plot, unrealistic capabilities, and AI that still - to this day - sucks. Fire gun, run around corner, wait for the idiots to pile round. The "better" ones might well throw a grenade, but once you know that, you take account of that, and that's the AI beaten. To "win", the AI has to have reactions infinitely better than yours and outnumber/outgun you. Think about the average FPS game - there are several THOUSAND bad guys. And you. And though you might get stuck occasionally, you will win. You can use first-aid kits; they can't. You can lure them into traps; they can't (unless scripted). You can sit and wait them out. You can guess where they will walk next; they forget about you one second after they stop seeing you. It's ludicrous.
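A toy illustration of the gap (invented names, not from any real engine): give an enemy even a short-term memory and it searches your last known position instead of forgetting you the moment you break line of sight:

```python
from dataclasses import dataclass

@dataclass
class Enemy:
    memory_seconds: float = 8.0     # how long we keep hunting after contact
    last_seen: "tuple | None" = None
    since_seen: float = 1e9

    def update(self, dt, player_visible, player_pos):
        if player_visible:
            self.last_seen = player_pos
            self.since_seen = 0.0
            return "ATTACK"
        self.since_seen += dt
        if self.last_seen and self.since_seen < self.memory_seconds:
            return f"SEARCH near {self.last_seen}"   # don't just give up
        return "PATROL"

e = Enemy()
print(e.update(0.1, True, (10, 4)))    # spotted: ATTACK
print(e.update(0.1, False, None))      # lost sight: SEARCH near (10, 4)
for _ in range(100):                   # ~10 seconds pass with no contact
    state = e.update(0.1, False, None)
print(state)                           # memory expired: back to PATROL
```

That's a dozen lines, and it already beats the run-around-the-corner exploit; the hard part isn't the tech, it's someone deciding it matters.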
Please stop wasting our game industry by reinventing tech we've had for decades and could put in any game, given time. Let's try and make a game with one, single, scary opponent (and maybe some NPC's to fill in the gaps). A Matrix-like game, for example. Agents are few and far between, maybe one per real player. There is only one that's a real threat. And there's you. And a world that you can both use to your advantage.
When humans play humans you HAVE to have the same numbers on both sides. When humans play AI, you HAVE to be vastly outnumbered.
I'd much rather Half-Life 3 had intelligent enemies who will choose to camp the chokepoints and not be lured out, than some fancy water effect or proper audio reflections or whatever.
You're not telling me that, with the CPU/GPU available nowadays, we couldn't make a Quake 1 opponent that - with the same programmed reaction times, capabilities, and facilities available to it as a human player - would be a serious threat. I'd rather play that than yet another "look how shiny" kind of game.
Wi-Fi Router Attack Only Requires a Single PIN Guess
Is it just me that hates shit on my router?
- WPS (a.k.a. turn your massive password into a four-digit number): turned off on every router I've ever used, since day one of installation.
- UPnP (a.k.a. let anything open any port to anywhere without authentication): turned off on every router I've ever used, since day one of installation.
- WPA/WEP (a.k.a. half-arsed encryption that we never really thought through): turned off on every router I've ever used, since day one of installation.
- Guest networks (a.k.a. let random strangers use your Internet connection without you knowing): turned off on every router I've ever used, since day one of installation.
- Remote administration (a.k.a. let random strangers on the Internet sit and brute-force your passwords with no way to tell it's happening): turned off on every router I've ever used, since day one of installation.
And, in fact, on anything BUT my actual wireless router of choice (e.g. any Internet router supplied by my ISP):
- wireless (a.k.a. give people another way into my network and hinder all my other - wanted - wifi connections by flooding the airwaves): turned off on every router I've ever used, since day one of installation.
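On the WPS point, the arithmetic is worth spelling out: the PIN is nominally eight digits, but the eighth is a checksum of the first seven, and the protocol confirms the two halves independently - so an attacker needs at most 10^4 + 10^3 = 11,000 guesses, not 10^8. The checksum below is the published WPS algorithm:

```python
def wps_checksum(first7: int) -> int:
    """Checksum digit for a 7-digit WPS PIN body (weights 3,1,3,1,...)."""
    accum = 0
    for i, d in enumerate(f"{first7:07d}"):
        accum += int(d) * (3 if i % 2 == 0 else 1)
    return (10 - accum % 10) % 10

# only 10**7 PINs are valid at all (the last digit is determined)...
valid_pins = 10 ** 7
# ...and the handshake leaks whether each half was right separately:
first_half_guesses = 10 ** 4     # digits 1-4
second_half_guesses = 10 ** 3    # digits 5-7 (the 8th is the checksum)
worst_case = first_half_guesses + second_half_guesses
print(f"{valid_pins} valid PINs, but only {worst_case} guesses needed")
```

Eleven thousand online attempts is hours of work, not years - which is why "turn WPS off" is the only sane advice.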
Seriously, people, just turn this shit off. And layer VPN over the top of it, if you can. There's zero impact in always VPN'ing over your wireless connection to a machine that has a fixed line to your actual Internet connection. Then even if WPA2 is broken, you're still secure. And yes, you can game. I've done it with OpenVPN for years - for EVERY packet that goes over the wireless.
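For anyone wanting to try the same setup, a minimal OpenVPN client config sketch - the address, port, and file names are placeholders you'd substitute for your own, and it assumes an OpenVPN server already running on the wired side:

```
client
dev tun
proto udp
# wired-side VPN box (placeholder address and port)
remote 192.168.1.2 1194
resolv-retry infinite
nobind
persist-key
persist-tun
ca ca.crt
cert laptop.crt
key laptop.key
remote-cert-tls server
cipher AES-256-CBC
verb 3
```

Route your wireless machine's traffic through the tunnel and the Wi-Fi link carries nothing but encrypted packets, whatever happens to WPA2.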
Wireless is the leaky, draughty hole of your network. Seal that fucker up and treat it like an Internet connection, even to your local network.