


Automation Coming To Restaurants, But Not Because of Minimum Wage Hikes

ledow Automated restaurant (161 comments)

I have said for many years that, with an appropriate restaurant-savvy partner, I'd like to open an automated restaurant: in-table PCs to order things, with card-readers.

I don't want to have to wait for the waiter to come over before I can order a drink. I might have driven a long way and be gasping with thirst before I care about a menu. Press, press, done before I've even taken my coat off.

I want to see the whole menu. The ingredients. A picture. The price. The associated special offers. Does it have pepper on it? A fully interactive menu would be great, and not be covered in the gravy-stains of the last patron, or have bits scribbled out on it. Plus, when something is no longer available, bam, you can't order it. I could even press the "I have an allergy button" and see if anything is incompatible with that without relying on the waiter to run back and forth to the kitchen.

I might want to tip one member of staff, but not know their name (or they happen to have finished their shift by then). Press "tip", select staff member photo (or select "All staff"), type in a reason, swipe card, done. And no arguments over who I intended it for.

I might well want to pay for my own stuff and not have to wait for the end of the meal and argue with friends. Or order a slice of cake to take home as a last minute thought after I've paid. Or split the bill via various common calculations. Or even tag five items as what John has to pay and let him pay that off the bill because he has to leave early. Press, press, swipe. Done.
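Those "common calculations" needn't be anything clever. A rough sketch of the two cases above, in Python (my own illustration; the function names are made up) - working in integer pence so no penny is ever lost to rounding:

```python
# Sketch of bill-splitting logic. Work in integer pence so that no
# penny is ever lost to floating-point rounding.

def split_evenly(total_pence: int, diners: int) -> list[int]:
    """Divide a bill so the shares differ by at most a penny."""
    base, spare = divmod(total_pence, diners)
    return [base + (1 if i < spare else 0) for i in range(diners)]

def split_tagged(items: dict[str, int], tagged: set[str]) -> tuple[int, int]:
    """Return (the tagged diner's share, everyone else's)."""
    theirs = sum(p for name, p in items.items() if name in tagged)
    return theirs, sum(items.values()) - theirs

# A 100.00 bill between three people: shares differ by at most 1p.
assert split_evenly(10_000, 3) == [3334, 3333, 3333]
# Nothing is ever lost or invented, whatever the totals.
assert sum(split_evenly(9_999, 4)) == 9_999
```

Tag John's five items, call `split_tagged`, let him pay his share, and the remainder stays on the table's bill.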

I might well desire a human to talk to, if something cocks up. A big green help button lights up the table and summons a waiter, much like airplane call buttons. The waiter still has to be around to shuttle things from the kitchen, and this way seems easier - and politer - than having to flag him down as he passes with a table full of plates. Press, done.

I might well decide to change the order mid-flow. So long as the kitchen hasn't started on it yet, why not? Until the order's locked in, I can alter it. And I can even "lock" certain portions if one person at the table wants the starter now while the others only want mains and want to argue over it. Press, press, done.

I might want to pay first, or pay once I've eaten everything. I can choose.

I might want to buy some wifi access, or get a code for the toilet (I disagree with limiting toilets to paying customers only, except on an honesty agreement, but some places do just that and your receipt contains your code for the toilet), or donate to the charity associated with the restaurant, or buy the chef's recipe book. Press, press, swipe, done.

I might want to move tables mid-order, or take my drinks outside. Press, press, done and the waiters and kitchen automatically know where I am.

The back-end? The waiters still wait. The bar tabs are still on the EPOS. The kitchen still gets a ticket about what table wants what. And those that want manual service press one button.

We've already automated every part of the experience but the customer's.

about half an hour ago

FTDI Removes Driver From Windows Update That Bricked Cloned Chips

ledow Re:Computer Misuse Act 1990 (187 comments)

"As many many people have said the right and legal thing was to simply stop working and post a message to the user that the chip is a counterfeit/clone."

As lots of OBD2 software does if you don't use a genuine ELM327 chip.

about an hour ago

FTDI Removes Driver From Windows Update That Bricked Cloned Chips

ledow Re:It's in the license! (187 comments)

By the same token, if some bloke down the pub gives me a Windows key, shouldn't Microsoft allow it to activate?

It doesn't work like that.

Unfortunately, there's a difference between having a driver that won't drive a counterfeit chip, and one that actively "breaks" counterfeit chips.

In the same way that Microsoft are quite entitled to refuse to activate illegal copies of Windows, but they aren't entitled to take it upon themselves to format your hard drives when they find them.

about an hour ago

German Publishers Capitulate, Let Google Post News Snippets

ledow Maybe.... (94 comments)

So, maybe losing all your content visibility on Google was worse than them publishing a small article headline?

So, maybe, just maybe, Google's exposure was actually to your advantage?

So maybe you've been biting the hand that feeds you?

If the threat of Google doing EXACTLY what you ask for (taking your content off their site) is enough to make you back down, maybe your original intention was something other than was stated?

Maybe you just wanted a free payment?

And maybe Google weren't being so evil in the first place?


Austin Airport Tracks Cell Phones To Measure Security Line Wait

ledow Sigh. (161 comments)

Erm... how do you think the traffic apps work on your satnav?

They ask you to "anonymously" contribute statistics, they phone home over 3G to service centres, which spot traffic moving slowly (speed and position are easy for a satnav to determine), mark those roads with appropriate average speeds, and then transmit that out to everyone with traffic services.

Sure, they use roadside monitors and other things as well but the "HD" traffic you might get from any large satnav provider uses exactly the same technology.

The question is not whether this is worrying data to collect, but exactly what portion of the collected data needs to be retained. If they are hashing the MACs really quickly and then discarding the original MAC data, and only keeping MAC-hash and position data, then there's nothing to worry about.
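"Hash quickly, discard the MAC" could be as little as this (a sketch of my own; the salt rotation is an assumption, not anything the airport has described) - the salt matters because the MAC address space is small enough to brute-force a plain hash:

```python
# Sketch: hash each observed MAC with a rotating salt, keep only the
# hash. The raw MAC is never stored, and without the salt the small
# MAC address space can't simply be brute-forced back out of the hash.
import hashlib
import secrets

DAILY_SALT = secrets.token_bytes(16)  # rotated daily; old salts discarded

def anonymise(mac: str) -> str:
    """Return a salted hash of the MAC address."""
    return hashlib.sha256(DAILY_SALT + mac.encode()).hexdigest()

# The same device hashes consistently, so wait times can be measured...
assert anonymise("aa:bb:cc:dd:ee:ff") == anonymise("aa:bb:cc:dd:ee:ff")
# ...while distinct devices stay distinct.
assert anonymise("aa:bb:cc:dd:ee:ff") != anonymise("11:22:33:44:55:66")
```

Store only (hash, position, timestamp) and the queue-time statistics fall out without anyone holding identifying data.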

Or, you know, you could write an inflammatory article about a technology that every satnav, every shopping mall, and even festival organisers have been using for years.


Cisco Fixes Three-Year-Old Telnet Flaw In Security Appliances

ledow Telnet (59 comments)

Is it just me that wasn't even aware that telnet had an encrypted mode (let alone a horribly-broken one)?

Not been an issue as I always switch it off unless the device is entirely in-house (and, there, someone sniffing the packets is much more of a problem than the fact they might end up with a device password by doing so).

Honestly, we just need to kill this "protocol".


Machine Learning Expert Michael Jordan On the Delusions of Big Data

ledow Re:I disagree. (140 comments)

I'm not scared by the maths. That is working back from a series of 2D images to reconstruct a 3D model, with appropriate error. It's horribly complex, but it's not anything more than a time-saving calculation. It isn't a new realm of science (mathematical or otherwise).

And, again, even the example images in the introduction of the book belie the actual capabilities. The mathematics of 3D geometry are complex, yes, but well-known. Reversing them is difficult, yes, but again well-known - with appropriate error.

Taking enough photographs to be able to identify points (edge-detection, heuristics, manual placement...) in several of those photographs and thus form a correlation between the images to allow you to form a volumetric object is DAMN HARD. I have no doubt.

But it cannot extrapolate the window frame hidden behind another object in a 2D painting, as that book's introductory images suggest. Computer vision is notorious in this area for making undeliverable promises. The point-clouds that result have to be cleansed and interpreted, and information not given to the computer cannot be inferred (of course... why would it? But that's the credibility of the claims at stake).

Taking one example from the book, where a 2D painting is converted to a 3D scene: Sure, the window-frame that's obscured by a foreground object probably DOES extend symmetrically and with the same colour but you cannot know that - and hence the error creeps back in again unaccounted for by having humans "fix" things that the computer can't.

Yes, it saves time if you want to get a 3D sculpture into your computer, or recreate a crime-scene from evidence, but it requires tweaking and a lot of human work - it's back into the realms of the time-saving tool, rather than a whole new paradigm of (as the article is originally about) machine learning and automated extrapolation. The acid-test is how admissible this stuff would be in court, and though a lot of it would be provable, the error margins would need to be stated and then it's not as clear-cut as first impressions might give.

CV is a horribly complex task that performs all kinds of useful functions. But it isn't, and can't yet be, anything beyond a tool that speeds up human calculations. I guarantee that even an average artist would be able to recreate that scene in 3D to a greater degree of accuracy than a computer could (I actually have a personal like for those "we've layered a 2D image over a sidewalk/car to make it look like a black-hole, or that the car isn't there" etc. images).

And, again, it's the usefulness that's limited in scope, and the automation that's only doing the legwork for a human-led interpretation.

CV is maths. That's the end of it (don't be insulted... similarly, quantum physics is "just maths"). Horribly complex maths, with associated error. It gives us useful answers when we apply it. But, as the article is wont to point out, we need to apply it. Or design something that will apply it in a particular circumstance.

This is vastly different from the claims that the CV industry makes, and from those illustrations they choose to adorn their books. Hence why CV comes up in the topic of machine learning. The machine isn't learning, it isn't thinking, it isn't extrapolating, it isn't guessing, it's doing lots of maths very fast that we could do if we had the time. Thus the usefulness extends only so far as a human is willing to work out how to apply it.

And, at the end of the day, when you want to scan in a 3D structure, chances are that some laser distance-based measurement is more accurate and less easily "misinterpreted" by the computer than anything it might get from someone running a camera around it. That's why most of those 3D reconstruction projects make the point-cloud with a laser measuring device first, not rely on the interpretation of a 2D image to infer it.


Machine Learning Expert Michael Jordan On the Delusions of Big Data

ledow Re:I disagree. (140 comments)

The problem with computer vision is not that it's not useful, but that it's sold as a complete solution comparable to a human.

In reality, it's only used where it doesn't really matter.

OCR - mistakes are corrected by spellcheckers or humans afterwards.

Mail systems - sure, there are postcode errors, but they result in a slight delay, not a catastrophe of the system.

Structure from motion - fair enough, but it's not "accurate" and most of that kind of work isn't to do with CV as much as actual laser measurements etc.

Photo stitching - I'd be hard pushed to see this as more than a toy. It's like a Photoshop filter. Sure, it's useful, but we could live without it or do it manually. Its biggest use is probably in mapping, where it's a time-saver and not much else. It doesn't work miracles.

Number plate recognition - well-defined formats on tuned cameras aimed at the right point, and I guarantee there are still errors. The systems I've been sold in the past claim 95% accuracy at best. Like OCR, if the number plate is read slightly wrongly, there are fallbacks before you issue a fine to someone based on the image.

Face detection is a joke in terms of accuracy. If we're talking about biometric logon, it's still a joke. If we're talking about working out if there's a face in-shot, still a joke. And, again, not put to serious use.

QR scanners - that I'll give you. But it's more to do with old barcode technology that we had 20 years ago, and a very well defined (and very error-correcting) format.

Pick-and-place rarely relies on vision only. There are much better ways of making sure something is aligned that don't come down to CV (and, again, they usually involve actually measuring rather than just looking).

I'll give you medical imaging - things like MRI and microscopy are greatly enhanced with CV, and the only industry I know where a friend with a CV doctorate has been hired. Counting luminescent genes / cells is a task easily done by CV. Because, again, accuracy is not key. I can also refer you to my girlfriend who works in this field (not CV) and will show you how many times the most expensive CV-using machine in the hospital can get it catastrophically wrong and hence there's a human to double-check.

CV is, hence, a tool. Used properly, you can save a human time. That's the extent of it. Used improperly, or relied upon to do the work all by itself, it's actually not so good.

I'm sorry to attack your field of study; it's a difficult and complex area, as I know myself, being a mathematician who adores coding theory (i.e. I can tell you how/why a QR code works even if large portions of the image are broken, or how Voyager is able to keep communicating despite interference of an unbelievable magnitude).
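To give a flavour of that coding theory: a toy Hamming(7,4) code - far simpler than the Reed-Solomon codes QR and Voyager actually use, but the same principle - corrects any single flipped bit out of seven. A self-contained sketch:

```python
# Hamming(7,4): encode 4 data bits as 7 bits (3 parity bits), and
# correct any single flipped bit. The same idea, scaled up enormously,
# is what lets QR codes and Voyager survive damaged or noisy data.

def hamming74_encode(d):
    """Encode 4 data bits as codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 1, 0, 1]
code = hamming74_encode(word)
code[0] ^= 1                         # flip any single bit in transit
assert hamming74_decode(code) == word  # the data still comes back intact
```

There's no "understanding" anywhere in it - just parity arithmetic, which is rather the point.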

The problem is that, like AI, practical applications run into tool-time (saving a human having to do a laborious repetitive task, helping that task along, but not able to replace the human in the long run or operate entirely unsupervised). Meanwhile, the headlines are telling us that we've invented "yet-another-human-brain", which are so vastly untrue as to be truly laughable.

What you have is an expertise in image manipulation. That's all CV is. You can manipulate the image to be more easily read by a computer, which can then extract some of the information it's after. How the machine deals with that, or how your manipulations cope with different scenarios, requires either a constrained environment (QR codes, number plates) or constant human intervention.

Yet it's sold as something that "thinks" or "sees" (and thus interprets the image) like we do. It's not.

The CV expert I know has code in an ATM-like machine in one of the southern American countries. It recognises dollar bills, and things like that. Useful? Yes. Perfect? No. Intelligent? Far from it. From what I can tell, most of the system is things like edge detection (i.e. image manipulation via a matrix, not unlike every Photoshop-compatible filter going back 20 years), derived heuristics and error-margins.
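That "image manipulation via a matrix" is literally a small kernel convolved over the image, exactly like a Photoshop filter. A minimal sketch with the classic horizontal Sobel kernel, on plain lists so nothing is hidden:

```python
# Edge detection as a matrix pass: convolve a 3x3 kernel over the
# grayscale image. This is the classic horizontal Sobel kernel, which
# responds strongly wherever brightness changes left-to-right.

SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def convolve(image, kernel):
    """Apply a 3x3 kernel to the interior pixels of a grayscale image."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(
                kernel[j][i] * image[y + j - 1][x + i - 1]
                for j in range(3) for i in range(3)
            )
    return out

# A dark-to-bright vertical boundary...
image = [[0, 0, 10, 10]] * 4
edges = convolve(image, SOBEL_X)
# ...shows up as a strong response along the boundary columns.
assert edges[1][1] == 40 and edges[2][2] == 40
```

No thinking, no seeing - multiply, add, repeat, and then a human (or a pile of heuristics) decides what the strong responses mean.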

Hence, "computer vision" is really a misnomer, where "Photoshopping an image to make it easier to read" is probably closer.


Xerox Alto Source Code Released To Public

ledow Re:It would be interesting (121 comments)

Er... Windows 3.11 had the same minimum spec as Windows 3.1... 2Mb RAM. And a 15Mb hard disk. So the point still stands.

And I have personally contributed to a project that brought Linux networking and TONS of extra features that we'd have died for in the 3.11 era to a single, bootable, 1.44Mb floppy disk.

Sure, Windows 95 upped the ante, but in terms of what you were given was it really that much of an advance? That's where things started to go downhill if anything... networking stack, yes. Firewalling of any kind? No.

And Windows 95: "To achieve optimal performance, Microsoft recommends an Intel 80486 or compatible microprocessor with at least 8 MB of RAM.".

I think you're forgetting how much you could get done in 2Mb of RAM. Hell, Windows 95 can't even boot if you have 512Mb, it was never designed to have that much RAM EVER. I'm just not sure there was ever a feature worth quite that amount of system resources - at this moment in time, my Bluetooth tray icon takes more RAM than Windows 3.1 needed to load everything. I can't see the justification for that at all.

CPU speed, yes, devices nowadays shove data through them a LOT faster than they ever used to so you need to be able to keep up. Disk space, possibly. But RAM usage? Why should a Bluetooth icon take more RAM than an entire former OS?

2 days ago

Xerox Alto Source Code Released To Public

ledow Re:It would be interesting (121 comments)

The chances of the code even compiling any more are slim. Let alone the required hardware and devices being present in a PC.

You're looking at a full emulation environment, which would kill all the performance anyway. It'd still fly on a modern PC, even so, but I can remember entire games fitting in 16Kb of RAM and Windows graphical interfaces that you needed to upgrade to 2Mb of RAM in order to run.

Of course they'd be fast on modern architecture. But they won't run directly. And by the time you got them to run, you could have written a basic GUI that did the same in a language of your choice.

The problem is not that we should be running ancient systems as they were back then. It's asking ourselves why Windows needs a Gig of RAM in order to even boot properly, when the user "experience" of such is a desktop background bitmap and a clicky button in the corner. Windows 3.1 could do that in 2Mb of RAM.

2 days ago

More Eye Candy Coming To Windows 10

ledow Sigh (209 comments)

Because what I want in an enterprise-class operating system, what I desire more than anything else, what I cannot live without, what my users are crying out for, what I will pay good money just to have... ... is more shit jumping out at me on the screen for no good reason.

Gimme WinFS and we'll talk. Gimme complete application isolation and I'll think about it. Otherwise, honestly, you're just papering over the cracks.

3 days ago

As Prison Population Sinks, Jails Are a Steal

ledow Prison (407 comments)

Sounds like returning to the norm, from a foreigner's perspective.

"The incarceration rate in the United States of America is the highest in the world."

Something like EIGHT TIMES what it is in Europe, from what this page says:

Land of the free, indeed...

about a week ago

Making Best Use of Data Center Space: Density Vs. Isolation

ledow Re:Simple (56 comments)

I have just put in a Blade / VM configuration at a school (don't ask what they were running before, you don't want to know).

Our DR plan is that we have an identical rack at another location with blades / storage / VM's / etc. on hot-standby.

Our DDR (double-disaster recovery!) plan is to restore the VM's we have to somewhere else, e.g. cloud provider, if something prevents us operating on that plan.

The worries I have are that storage is integrated into the blade server (a SPOF on its own, but at least we have multiple blade servers mirroring that data), and that we are relying on a single network to join them.

The DDR plan is literally there for "we can't get on site" scenarios, and involves spinning up copies of instances on an entirely separate network, including external numbering. It's not a big deal for us, we are merely a tiny school, but if even we're thinking of that and seeing those SPOF's, you'd think someone writing their article into Slashdot would see that too.

All the hardware in the world is useless if that fibre going into the IT office breaks, or a "single" RAID card falls over (or the RAID even degrades, affecting performance). It seems pretty obvious. Two of everything, minimum. And thus two ways to get to everything, minimum.

If you can't count two or more of everything, then you can't (in theory) safely smash one of anything and continue. Whether that's a blade server, power cord, network switch, wall socket, building generator, or whatever, it's the same. And it's blindingly obvious why that is.

about a week ago

Making Best Use of Data Center Space: Density Vs. Isolation

ledow Simple (56 comments)

Put all your eggs in one basket.
Then make sure you have copies of that basket.

If you're really worried, put half the eggs in one basket and half in another.

We need an article for this?

Hyper-V High Availability Cluster. It's right there in Windows Server. Other OS's have similar capabilities.

Virtualise everything (there are a lot more advantages than mere consolidation - you have to LOVE the boot-time on a VM server as it doesn't have to mess about in the BIOS or spin up the disks from their BIOS-compatible modes, etc.), then make sure you replicate that to your failover sites / hardware.

about a week ago

Court Rules Parents May Be Liable For What Their Kids Post On Facebook

ledow Re:7th grade? (323 comments)

In the UK:

The age of legal responsibility can be as low as 10. James Bulger's killers, for example. Held personally liable for their actions. This is the "old enough to know" law.

Contract-signing is 16. So a "contract" with Facebook is null and void, as Facebook never bothered to check the user was over 16. Facebook should terminate the account as soon as they are made aware of it, as they are providing service on a void contract.

Financial responsibility, parental responsibility to ensure they are in education, employment or training, and an awful lot of other responsibilities to a child last until they are 18.

In the US, it's a bit different. Hilariously, in the UK, you can legally be married, have sex, have children, drive a car, smoke, drink and sign a contract (hopefully not all at the same time) while still being under parental responsibility because you're not 18.

about a week ago

Flight Attendants Want Stricter Gadget Rules Reinstated

ledow Re:Simple solution: bring cookies. (405 comments)

I live in a country with guaranteed minimum wage.

If the guy sweeping the street doesn't get tips, I don't see why the waitress he sweeps past every morning should. They both received a guaranteed minimum. If that's not enough, there's a reason to campaign for minimum wage rises. Not to tip, charity-like, out of sympathy.

If you're going to tip on the basis of hard work, or sympathy for their plight, tip nurses, tip doctors, tip policemen (you can't, but that's another matter), tip firefighters, tip sewer workers, tip the guy that sweeps the streets cleaner than any other. Don't tip because you feel sorry they are in such a bad job with an employer who doesn't appreciate them.

You say yourself: "tips are often used to allow employers to underpay people"

If you didn't tip, they'd have to pay a proper wage.

about a week ago

Positive Ebola Test In Second Texas Health Worker

ledow Re:Just tell me (463 comments)

In comparison, a spike through the head is probably MORE common and almost certainly more deadly. Should we be avoiding anything spiky at all whatsoever? No. Just be careful around spiky things.

about two weeks ago

Positive Ebola Test In Second Texas Health Worker

ledow Re:Just tell me (463 comments)

If you have to ask others if it's time to panic, then it's not.

P.S. Even if others say it IS time to panic, expand your definition of "others" carefully. I have been variously told that I have swine flu, that eggs are killer-bacteria in a shell, that bird flu is going to wipe us all out, the seas are rising, the sky is falling, etc. etc. etc.

If you have to ask, it's not important. If you have to choose who you ask to get the answer you want, it's even less important.

about two weeks ago

Where Intel Processors Fail At Math (Again)

ledow Re:Exact mathematical value isn't the ideal (239 comments)

One instruction does not necessarily cause a problem. However, errors build and multiply. Anyone serious about their calculations also calculates the potential error as they go. If an answer doesn't have a "+/- X%" attached to it, then it should have.

As such, for most uses an error a few places past the decimal point isn't a factor in a single calculation. But the error will build if you're not careful. It might not make much difference to the length of a matchstick in a matchstick factory, but it will surely hurt when you're laying the foundations for a skyscraper that ends up weighing 10% more than you thought.

As such, fsin is fine for games, 3D visualisations, home calculations, etc. But that small error will build with every transform, every manipulation of the data, etc., and if that matters to you, then it's going to hurt. Floating-point, especially, is something to be handled with care. It affects currency formats (believe it or not) and calculations with an inherent error already. Integer calculations? Most of those are perfectly safe. But floating-point errors build up. It might not make much of a difference at first, but at 2 BILLION instructions per second on each chip, you're talking one hell of a mess on anything that needs to be accurate (which is where mistakes tend to matter most).
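You can watch the build-up happen. 0.1 has no exact binary representation, so each addition carries a tiny error, and a million of them add up; error-aware summation (here Python's math.fsum, as one example of the "calculate the error as you go" approach) keeps it in check:

```python
# One representation error is negligible; a million accumulated ones
# are not. 0.1 is not exactly representable in binary floating-point.
import math

naive = 0.0
for _ in range(1_000_000):
    naive += 0.1          # each addition rounds, and the errors pile up

compensated = math.fsum([0.1] * 1_000_000)  # compensated summation

# The naive running total has drifted visibly away from 100000...
assert abs(naive - 100_000.0) > 1e-8
# ...while the compensated sum stays within a rounding error of it.
assert abs(compensated - 100_000.0) < 1e-8
```

Same instructions, same data; the only difference is whether anyone bothered to account for the error.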

about two weeks ago

