
Inside The Twisted Mind of Bruce Schneier

Soulskill posted more than 6 years ago | from the it's-dark-in-here dept.

Security 208

I Don't Believe in Imaginary Property writes "Bruce Schneier has an essay on the mind of security professionals like himself, and why it's something that can't easily be taught. Many people simply don't see security threats or the potential ways in which things can be abused because they don't intend to abuse them. But security pros, even those who don't abuse what they find, have a different way of looking at things. They always try to figure out all the angles or how someone could beat the system. In one of his examples, Bruce talks about how, after buying one of Uncle Milton's Ant Farms, he was enamored with the idea that they would mail a tube of live ants to anyone you asked them to. Schneier's article was inspired by a University of Washington course in which the professor is attempting to teach the 'security mindset.' Students taking the course have been encouraged to post security reviews on a class blog."


208 comments


Destructive mindset (5, Insightful)

wces423 (686579) | more than 6 years ago | (#22817166)

This article just confirms my belief that a good security professional needs to have a destructive mindset. You need to feel the urge to abuse a system as soon as you have seen it. I was not good at it, so I quit security research to join development!

Re:Destructive mindset (0, Troll)

Anonymous Coward | more than 6 years ago | (#22817284)

This article confirms to me that Bruce Schneier is an egomaniac, without much good reason. What has he ever done? He wrote a crappy-as-hell book that people like because they don't understand it. He did blowfish, which is one of a MILLION symmetric crypto systems, which btw, is totally easier than public key crypto, and he did a recent entry to the AES contest, but it wasn't his work alone, it was with other people, and it turned out to suck anyway if you read the reviews of it.

Re:Destructive mindset (3, Insightful)

andy666 (666062) | more than 6 years ago | (#22817332)

Yes, but more like "ooooh look at the dark and deep mind of Bruce Schneier, he is so brilliant." He's so dramatic about it. Jesus, a lot of people do security, why does he think he understands all of them? It's another branch of computer science - not being James Bond. In fact I went into security after college because of the allure, but in fact the daily things that have to be done are not that glamorous, and have little to do with his strange psychological theories. And I agree, the book is overrated.

Re:Destructive mindset (4, Funny)

iamdrscience (541136) | more than 6 years ago | (#22817350)

You two should be careful about criticizing Bruce Schneier. His fists are tattooed with "Bob" and "Alice" and if you get on his bad side, he'll exchange keys all over your face.

Re:Destructive mindset (4, Funny)

cbart387 (1192883) | more than 6 years ago | (#22817498)

Even if you're not 'Eve'?

Re:Destructive mindset (5, Funny)

Anonymous Coward | more than 6 years ago | (#22817802)

Most people use passwords. Some people use passphrases. Bruce Schneier uses an epic passpoem, detailing the life and works of seven mythical Norse heroes.

Hashes collide because they're swerving to avoid Bruce Schneier.

And more:
http://geekz.co.uk/schneierfacts/ [geekz.co.uk]
http://geekz.co.uk/schneierfacts/facts/top [geekz.co.uk]

Re:Destructive mindset (4, Insightful)

qbzzt (11136) | more than 6 years ago | (#22817516)

In fact I went into security after college because of the allure, but in fact the daily things that have to be done are not that glamorous, and have little to do with his strange psychological theories.

Implementing security procedures is not at all glamorous, and does not require more than understanding the system to which they apply. Writing security procedures in such a way that they will be difficult to abuse requires a twisted mind. Doing it correctly, so the procedures properly balance security and availability, requires a mind that is twisted and straight at the same time.

Re:Destructive mindset (2, Insightful)

macslas'hole (1173441) | more than 6 years ago | (#22818672)

Security and crypto are not branches of computer science. They both existed before CS and are widely applied outside of CS.

not being James Bond ... I went into security after college ... not that glamorous
You sound bitter. Life's a bitch, and then you die. (This being /. you can skip the "marry one" part) Get over it.

Re:Destructive mindset (1)

Oktober Sunset (838224) | more than 6 years ago | (#22818770)

He sounds like Butters when he becomes Professor Chaos.

Re:Destructive mindset (1)

strider44 (650833) | more than 6 years ago | (#22817370)

He did blowfish, which is one of a MILLION symmetric crypto systems, which btw, is totally easier than public key crypto

Don't comment when that statement so obviously shows you have only the faintest idea about cryptography.

Re:Destructive mindset (0)

Anonymous Coward | more than 6 years ago | (#22817378)

It's true that symmetric cryptography is not trivial, but public key wasn't invented until the 1970s. It is HARDER.

Re:Destructive mindset (1, Interesting)

strider44 (650833) | more than 6 years ago | (#22817462)

Why does being invented later mean that it's harder? Usually it goes the other way around - people find better and easier ways of doing things.

For an example of how hard symmetric key cryptography is, consider this: the session key exchange algorithm in most common use (Diffie-Hellman) was published in 1976. The public key algorithm most commonly in use now (RSA) was published in 1977, and an equivalent scheme had already been devised at GCHQ in 1973. Neither has been broken. The current standard symmetric algorithm (AES) wasn't selected until 2000, in large part because so many earlier designs had been weakened or broken. There are dozens of attacks against symmetric algorithms and almost none against public key cryptography. While symmetric cryptography isn't nearly as hard as hashing, it's still pretty damn hard.

(also, RSA can be implemented in about five lines of code. Not quite as easy for AES)
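For instance, a toy textbook-RSA sketch in Python (hypothetical numbers, no padding, nothing anyone should ever deploy) really does fit in a handful of lines:

    # Toy textbook RSA (illustration only: tiny primes, no padding, not constant-time).
    p, q, e = 61, 53, 17                 # small primes and a public exponent, chosen for the example
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                  # private exponent = modular inverse of e (Python 3.8+)
    ciphertext = pow(42, e, n)           # "encrypt" the message 42
    assert pow(ciphertext, d, n) == 42   # decrypting recovers the message

Of course, everything that makes real RSA deployments hard (key generation, padding, side channels) is exactly what that brevity leaves out.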

Re:Destructive mindset (1)

Splab (574204) | more than 6 years ago | (#22817588)

(also, RSA can be implemented in about five lines of code. Not quite as easy for AES)


Could we please please PLEASE stop talking about programs in terms of lines of code?? It makes no sense! You can't just claim something is a one-liner - I can create and populate a huge database with one line of code, just remove all line breaks and voila.

Even when you limit it to say "it's 5 function calls" or something like that, it still makes no sense: one of those function calls could be to libEverything, some godforsaken huge library that does everything...

Re:Destructive mindset (1)

somersault (912633) | more than 6 years ago | (#22817678)

Fine. Uh.. RSA can be implemented in.. 8452 operations and using only 3 registers!

Re:Destructive mindset (1)

Watson Ladd (955755) | more than 6 years ago | (#22817958)

What keysize? Is this program length or execution time? What about mems of computation?

Re:Destructive mindset (0)

Anonymous Coward | more than 6 years ago | (#22817682)

I can create and populate a huge database with one line of code, just remove all line breaks and voila.
And there goes your bragging rights. :/

Re:Destructive mindset (4, Insightful)

Anonymous Coward | more than 6 years ago | (#22817606)

At least he has accomplished something notable, which is a heck of a lot more than can be said for an anonymous post criticizing said noteworthiness.

Re:Destructive mindset (5, Informative)

mattpalmer1086 (707360) | more than 6 years ago | (#22817710)

Symmetric crypto easier than public key? Are you kidding? Public key is based on simple one-way math functions. It's easy to prove it's secure (with certain assumptions about not being able to solve hard problems, like discrete logs or factoring large numbers). If the maths is solid, you've got a good encryption algorithm. If the single hard maths problem isn't cracked, you're safe. Job done.

I could probably invent a reasonable public key algorithm with a maths textbook to hand - but no way could I invent a good symmetric crypto algorithm. Symmetric crypto relies on scrambling things up in a way it can't be unscrambled easily. You have to know a *lot* about cryptanalysis to even begin designing one, and you can still become vulnerable to a surprise attack. There is no general way of mathematically proving that how you are doing the scrambling is secure in any way - only that it is resistant to all the known attacks so far.
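To make the "scrambling" point concrete, here is a toy Feistel-network sketch in Python (a hypothetical example, not a real cipher). The structure guarantees that running the rounds in reverse decrypts; it says nothing about whether the round function resists cryptanalysis, which is where all the real difficulty lives:

    # Toy Feistel network: encrypt/decrypt a block split into two 32-bit halves.
    def round_function(half, round_key):
        # Deliberately simplistic mixing; a real cipher's F is where the cryptanalysis happens.
        return ((half * 2654435761) ^ round_key) & 0xFFFFFFFF

    def feistel_encrypt(left, right, round_keys):
        for k in round_keys:
            left, right = right, left ^ round_function(right, k)
        return left, right

    def feistel_decrypt(left, right, round_keys):
        for k in reversed(round_keys):               # same rounds, reverse key order
            left, right = right ^ round_function(left, k), left
        return left, right

    keys = [0x1234, 0xBEEF, 0xCAFE, 0x0042]
    assert feistel_decrypt(*feistel_encrypt(7, 99, keys), keys) == (7, 99)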

Re:Destructive mindset (0)

Anonymous Coward | more than 6 years ago | (#22818242)

[...] with certain assumptions about not being able to solve hard problems, like discrete logs or factoring large numbers [...]

[...] There is no general way of mathematically proving that how you are doing the scrambling is secure in any way - only that it is resistant to all the known attacks so far.

It's really interesting how you spin the same thing as "totally secure" for asymmetric cryptography and "totally insecure" for the symmetric kind.

Re:Destructive mindset (0)

Anonymous Coward | more than 6 years ago | (#22818392)

The simplest crypto is also the strongest, and it's symmetric: one-time pads.

As for symmetric key strength vs asymmetric key strength, given the same bit length for each, symmetric keys are FAR stronger than asymmetric ones, because with a symmetric key every bit is fair game, while with an asymmetric key only the bits that encode valid primes in the underlying math are usable.

The only reason asymmetric keys are so popular is that you don't have to communicate the decryption key in advance. Security vs usability.

Admiral Beotch

Re:Destructive mindset (1, Interesting)

Anonymous Coward | more than 6 years ago | (#22818462)

You are only half right. The concept, the algorithm, and the math may be correct, but that does *not* mean the product is secure. One of the key problems with cryptography, indeed with most security, is in its implementation. There is a famous quote from Donald Knuth that goes something like this: "Beware of bugs in the above code; I have only proved it correct, not tried it." A proof does not make a program correct in practice. So often programmers make small but fundamental mistakes that compromise the security of the concept/algorithm. This is part of what Bruce Schneier is talking about - how is it compromised. It is this issue that makes your first assumption wrong as well: "I could probably invent a reasonable public key algorithm with a maths textbook to hand [...]".

Re:Destructive mindset (1)

hal9000(jr) (316943) | more than 6 years ago | (#22818648)

It's easy to prove it's secure (with certain assumptions about not being able to solve hard problems, like discrete logs or factoring large numbers) ...snip... I could probably invent a reasonable public key algorithm with a maths textbook to hand - but no way could I invent a good symmetric crypto algorithm.

First, to make a strong crypto algorithm, you have to prove your assumptions are strong. The caveat with asymmetric key crypto based on factoring is that today, factoring the product of two large primes is a difficult problem. But that doesn't mean a more efficient factoring method won't be discovered tomorrow.

I bet you can't just whip up a new asymmetric key algorithm with a math textbook. Talk about arrogance. The reason there are so few good crypto systems is that creating an algorithm that is sufficiently strong is difficult. Hell, creating a pseudo-random number generator is difficult. So maybe you are a genius. If so, then I challenge you to come up with a new asymmetric algorithm based on a math problem other than factoring, and have it assessed by the crypto community. You can patent it and make millions off licensing.

Re:Destructive mindset (1)

mattpalmer1086 (707360) | more than 6 years ago | (#22819084)

I apologise if I seemed arrogant; I wasn't claiming any great intelligence for myself. I maintain that pretty much *anyone*, with a reasonable understanding of maths and the principles of public key ciphers, would find it possible to design a workable (not necessarily efficient or pragmatic) public key cipher, but most people wouldn't even know where to start with a symmetric cipher.

The reason there aren't more public key ciphers is not a shortage of hard maths problems to pick from, but that the ciphers we have are sufficient, well studied and well understood, and security people are quite conservative. In fact, much of what differentiates them comes down to pragmatic factors, like the computational power required to implement them, or whether they are standardised and already in widespread use.

Symmetric crypto, on the other hand, is not based on a simple mathematical problem you can find in a textbook. It is based on "scrambling things up" - a vague concept - in a way that you hope is hard to unscramble, and which is highly dependent on the key. But this isn't based on any single underlying mathematical problem or theory. You can't state that the security of your scrambling is the same as the difficulty of doing x. It takes a lot more understanding and skill to design a symmetric cipher that people can even begin to have any faith in, regardless of how pragmatic it may be to implement.

Re:Destructive mindset (2, Informative)

sdaemon (25357) | more than 6 years ago | (#22818882)

Actually, a one-time pad is an excellently secure symmetric cipher, the strength of which is dependent only upon the randomness of the pad (and the mechanism for distributing copies of the pad to the various parties who require it).

You have to distribute copies of a secure symmetric key anyway. Distributing copies of an OTP is no different.

Re:Destructive mindset (1)

sdaemon (25357) | more than 6 years ago | (#22818904)

Er, and I meant to add "and a good OTP algorithm is simple and can be written in a couple lines of code with an XOR operation in there somewhere."
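Something like this minimal Python sketch, for instance, assuming you already have a truly random, secretly shared, never-reused pad (which is the part that's actually hard):

    import secrets

    # One-time pad: XOR the message with a random pad at least as long as the message.
    def otp_xor(message, pad):
        assert len(pad) >= len(message), "pad must be at least as long as the message"
        return bytes(m ^ p for m, p in zip(message, pad))

    pad = secrets.token_bytes(32)                         # shared secretly in advance, used once
    ciphertext = otp_xor(b"attack at dawn", pad)
    assert otp_xor(ciphertext, pad) == b"attack at dawn"  # XOR is its own inverse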

Re:Destructive mindset (1)

lightman_wg (1238048) | more than 6 years ago | (#22817810)

He did blowfish, which is one of a MILLION symmetric crypto systems, which btw, is totally easier than public key crypto
I don't get this statement. Easier than public key crypto? Different, yes, but I don't get the point you're making.

Re:Destructive mindset (1)

macslas'hole (1173441) | more than 6 years ago | (#22818558)

What has he ever done ?
He outs crap crypto every month on his blog. Are you in his doghouse?

one of a MILLION symmetric crypto systems
and approximately 999,900 of those are utter crap. Good crypto is not easy; if you think it is, you probably are in his doghouse.

Re:Destructive mindset (1)

strider44 (650833) | more than 6 years ago | (#22817414)

Why does it have to be destructive? It's not so much the urge to abuse the system, it's more the urge to see what it's capable of, even the things not intended by the creator.

Re:Destructive mindset (2, Funny)

SL Baur (19540) | more than 6 years ago | (#22817488)

Bruce talks about how, after buying one of Uncle Milton's Ant Farms, he was enamored with the idea that they would mail a tube of live ants to anyone you asked them to.
I had the board game when I was very young. I also remember the spanking I got when I brought a container of ants into the house. Dad, they can't get out! Ouch!

Re:Destructive mindset (3, Insightful)

Registered Coward v2 (447531) | more than 6 years ago | (#22817636)

This article just confirms my belief that a good security professional needs to have a destructive mindset. You need to feel the urge to abuse a system as soon as you have seen it. I was not good at it, so I quit security research to join development!

I would not say a destructive mindset but rather an inquisitive one - that asks "What possibilities does this open up and how can I use this to other ends?"

The challenge is to turn that mindset to productive, rather than destructive ends.

Speaking as one who has done that work: a little paranoia is a good thing as well, because some people really are out to get you (and even more are just plain stupid enough to do a dumb thing).

Re:Destructive mindset (4, Insightful)

cardpuncher (713057) | more than 6 years ago | (#22817680)

I think it's got more to do with awareness and analysis than destructiveness.

I remember some years ago now gently trying to persuade a colleague that it was inappropriate to have forwarded the infamous Craig Shergold [wikipedia.org] chain e-mail. Despite widespread publicity, the colleague absolutely refused to believe that there could be anything amiss and insisted I was being mean and cruel to deny the child (even by then cured and in his late teens) his "dying wish" and denounced my callousness to other co-workers.

There's an advertisement for an animal welfare organisation on British TV at present with pictures of pathetic looking dogs who have been badly beaten ("it's the worst case I've ever seen" says the voice-over) or "used as an ashtray". Finally, at the end of the advertisement the confession, "these are not real cases" - followed with a demand for money anyway, now the viewers have been "softened up".

Being a sucker for a sob-story isn't "constructive"; knowing that it can be exploited for social engineering isn't "destructive" - unless you regard human gullibility as a positive trait - though it sure can make you unpopular!

Re:Destructive mindset (1)

Naughty Bob (1004174) | more than 6 years ago | (#22817750)

This article just confirms my belief that a good security professional needs to have a destructive mindset
As in 'Set a thief to catch a poacher turned gamekeeper'.

Re:Destructive mindset (1)

jellomizer (103300) | more than 6 years ago | (#22818234)

I think "urge to abuse" is to strong of a phrase. You don't need to feel the need to do it wrong but you do realize ways around things, I see these things all the time that are security nightmares. But I don't feel any urge to try them myself. Because I realize yea it is a security problem but it also makes my life more convient. You need to get a fair balance between the two.

Re:Destructive mindset (2, Insightful)

analog_line (465182) | more than 6 years ago | (#22818714)

I would agree. I've got the "security mindset". I used to work in security on the consulting side, trying to fix up people's stuff. I thought about getting into research, but the culture of the security community at the time (right before 9/11) drove me away before I could: a kind of self-hating trifecta of ex-military intelligence grunts looking with disdain at anyone who didn't come out of the armed services, genius technical boffins with all the interpersonal skills of Rain Man, and wild-eyed "Information must be free, damn the consequences" ideologues. Since I don't fit into any one of those stereotypes, I made a lot more enemies than friends (though I did make plenty of friends, and there are many exceptions to the rule), and decided, once it was nigh impossible to find work after 9/11, that a change in direction wasn't such a bad thing after all.

Now I don't make nearly as much money, but I'm both a lot happier, and my work is a lot more helpful than it was when I was a part of the "security community". Working with little companies, a security mindset can go a very long way. I don't worry about intrusion detection or policy enforcement, or privileges, or password strength, or encryption keys even a quarter as much as I had to before. Not when no one I deal with has a backup system that actually backs anything up (if they have a backup system at all) when I first walk in the door, or when a simple switch of web browsers or e-mail clients will eliminate the lion's share of reasonable attack vectors into their network. Not when they don't understand the concept of patching their operating system. Not when what they really need is a hands-on explanation of what exactly a phishing e-mail is, what one looks like, and what not to do.

Not that the more complicated stuff doesn't ever come up, because it does, and often I bring it up. I've set up a lot of VPNs lately, stopping people from what they had been doing, which was exposing their file servers directly to the outside world with no encryption or really ANYTHING other than bad passwords stopping entry. Passwords are a big pet peeve of mine. So many of my customers have passwords that so many people know, or that are trivial to guess, that they've started prefacing a new password with "I know you're going to hate me" when they tell me it's something every employee who has ever worked there knows, including the ones that hate the owner's guts. However, I choose to see that as a glass half full. They may not be doing the right thing, but THEY KNOW they're not doing the right thing, and have chosen to continue doing it a different way. Before I showed up, spoke to them in language they understood and took the time to explain how things work, the jargon and fearmongering of the public infosec community (including antivirus software companies) helped them nil. Maybe that kind of stuff works better in bigger organizations (heck, maybe it's the only thing that has any effect in big organizations). Perhaps that's why I couldn't handle bigger organizations and have found a lot more success with the personal touch.

Disappointing (3, Interesting)

CRCulver (715279) | more than 6 years ago | (#22817170)

It seems like ever since Bruce left the cryptography world for general security consulting, he's become less interested in giving useful advice and more interested in self-aggrandizement.

I have to agree (1, Interesting)

2.7182 (819680) | more than 6 years ago | (#22817310)

I used to look forward to reading what he had to say - in the 1990's. Now when I see these articles about what the almighty Bruce Schneier says I cringe. He did some decent work, but I think the main reason for his high profile comes from a book which was essentially a derivative of several other classic tomes in cryptography, like Stinson. For me, he has become the Dvorak of security.

Re:I have to agree (5, Insightful)

Rainer (42222) | more than 6 years ago | (#22817364)

I used to look forward to reading what he had to say - in the 1990's. Now when I see these articles about what the almighty Bruce Schneier says I cringe.

You cringe because he keeps saying the same things over and over again.

He keeps saying the same things over and over again because people keep making the same dumb mistakes over and over again.

Re:Disappointing (2, Insightful)

Feminist-Mom (816033) | more than 6 years ago | (#22817354)

I think it's all about having written a fat book at the right time and place. I can't stand it, but all my friends have it - yet none of them have read it! It has become the book you have on your shelf to look cool. Another Bruce Schneier pronouncement about his superiority, ah yes... It would be nice if there were more articles here about current developments in cryptography. I've heard more than enough from Schneier. There are MANY other interesting security people out there to read, who aren't confused.

Re:Disappointing (1)

mattpalmer1086 (707360) | more than 6 years ago | (#22817668)

Well, I know what you mean about that. I've got Knuth on my bookshelf, and I can honestly say I don't look at it very often! Pure pose value for me ;) I assume you're talking about "Applied Cryptography" - but I do read Schneier's books - I've got most of them, and I like 'em. What other security people would you recommend?

Re:Disappointing (5, Insightful)

call-me-kenneth (1249496) | more than 6 years ago | (#22817374)

Tell you what, when you've written a book that gives a tenth of the useful advice, interesting information and insightful analysis of a single issue of CryptoGram [counterpane.com] , come back and tell us about it. Until then, your words serve only to make you look bad.

Re:Disappointing (5, Insightful)

mattpalmer1086 (707360) | more than 6 years ago | (#22817650)

I would say quite the opposite. I think it's well documented that Mr Schneier used to think that cryptography would solve all our security woes, and then he realised this was only a small part of the picture. You may have preferred him when he was all gung-ho on the deeply technical and fascinating aspects of crypto - I love that stuff too - but you are not his audience anymore.

Things that you may think are obvious are just not to most people. He's trying to reach normal people, business leaders, politicians - people who don't get it, or still think security is just boring techy stuff that doesn't work very well. He's trying to show it's also a mindset, a way of seeing the world, that anyone can understand. I think he's doing pretty good, but again, we are not his primary audience.

Re:Disappointing (1)

dpilot (134227) | more than 6 years ago | (#22817798)

Even if it's as bad as you say, he's still more interesting to read/hear than Donald Trump, the *real* king of self-aggrandizing.

Open network ? (2, Interesting)

davro (539320) | more than 6 years ago | (#22817188)

I couldn't help but wonder how you reconcile your security mindset with an open wireless network at home. A while ago you proposed an open network in the name of politeness http://www.schneier.com/blog/archives/2008/01/my_open_wireles.html [schneier.com]

Re:Open network ? (5, Insightful)

muridae (966931) | more than 6 years ago | (#22817258)

Okay, I'm not Bruce, but I'll explain. If I open my wireless network, I know it's open. I can secure the computers behind it with the knowledge that the wireless system is wide open. This is not really any different than securing the whole internal network against internet-based problems. And, on the off chance that he really does have a single AP/router combo with the other computers connected directly to it, then the computers all need to be secured. How does this differ from securing a laptop that you use while traveling, connecting to whatever unsecured wireless signal you can pick up, except that you have to do it to all the devices involved?

So, let's say you keep your wireless system closed. What happens when someone cracks the encryption key and gets access anyway? What happens when an internet botnet gets turned on your router because someone found a vulnerability in it? Lots of people kept secured computers before home routers and NAT became a real necessity. Doing so hasn't really gotten that much tougher. Just more constant.

My real guess, though, is that he keeps the wireless and wired networks separated. Internet->wifi AP ->wired router+NAT+firewall-> computers. Given that he's a pro, the wifi AP and wired router might not even be connected to each other at all.

Re:Open network ? (3, Funny)

jonadab (583620) | more than 6 years ago | (#22818314)

> If I open my wireless network, I know it's open. I can secure the computers
> behind with the knowledge that the wireless system is wide open.

You're thinking like an engineer: "How is this supposed to work?"

Try thinking like an enemy: "How could this be exploited to harm Bruce Schneier?"

The most obvious thing is to get a rental car, drive it through some mud until the plates aren't legible, and sit across the street from the guy's house and use his wireless network for... nefarious purposes. Sending spam via his ISP's mail server? Peer-to-peer child porn? Attacking government networks in a way that's likely to get noticed? So many possibilities.

And sitting across the street in a car for a while is only the really obvious attack. There are much more interesting things that can be done over the long term.

Re:Open network ? (1)

iamdrscience (541136) | more than 6 years ago | (#22817270)

Way to repost a comment from the article page.

In security (3, Interesting)

Z00L00K (682162) | more than 6 years ago | (#22817232)

It's not necessarily about having a destructive mindset, but about having a great deal of imagination and some paranoia.

Such a personality may be disastrous in many other cases but works well when it comes to security work.

And remember that most computer viruses in the beginning weren't really malicious - they were just there "because I can". Even those cases have to be taken into account by security people.

Re:In security (1)

the 99th penguin (1453) | more than 6 years ago | (#22817266)

Sounds like you're describing the hacker mindset, but in a security context, which seems pretty fitting. It does make sense if you think of a security expert as a hacker, someone that sees something and thinks "hm, wonder if I could do this with it?".

paranoia yes ..... (2, Insightful)

taniwha (70410) | more than 6 years ago | (#22817300)

I do crypto for a living .... my bank really, really wants me to use their web banking service - but I have a dilemma - is it safe? If I try to break their security to test them, a couple of things might happen: if it's any good they'll catch me and I might go to jail .... if it's crap there's no point in me using their service. So I can't win, and can't use their service.

Re:paranoia yes ..... (1)

Bill Wong (583178) | more than 6 years ago | (#22817878)

The problem is that even if you actively choose not to use their system, security problems in their account creation or account login can still render your account wide open to whoever out there might try to hack it. So you really might as well use the service, if only to be able to create a baseline and from there on passively monitor it yourself to check for problems. Some banks are at least trying to get better in that regard - Bank of America, for instance, has been offering two-factor authentication that uses a cellphone as a keyfob - but whether that's secure at all is anyone's guess. Also, if their service is crap, you could always use an aggregator like Yodlee, but I suppose that's its own security risk in and of itself.

Re:paranoia yes ..... (1)

Lobster Quadrille (965591) | more than 6 years ago | (#22818140)

This is the problem that's been giving me issues lately too.

Every online payment application I have available to me, including my (very large) ISP's web interface, my student loan, my utility bill, my home loan, and my bank, has at least one serious xss, session fixation, or SQL injection hole. I've informed them about the problems, and not one has made an effort to fix the issue.

They have all, however, failed to remove the text from their respective web sites saying:

We have information systems that collect and store customer information in addition to systems that store our own business records. These systems have different types of security as appropriate for the information stored.

We maintain physical, electronic, and procedural safeguards that comply with federal regulations to guard your nonpublic personal information.
What's a gray hat to do?

If I make such things public, they pursue legal action against me. Posting anonymously may or may not help, but they still know that I know about the holes, and it wouldn't be hard to put 2 and 2 together.

Re:paranoia yes ..... (1)

Hatta (162192) | more than 6 years ago | (#22818456)

If I make such things public, they pursue legal action against me.

I suggest you talk to a lawyer. If you can demonstrate that their security really is inappropriate for the information stored on their systems, you might have action for negligence. Beat them to the punch.

Re:paranoia yes ..... (1)

Lennie (16154) | more than 6 years ago | (#22818366)

I have the same problem. From what I've seen of my current bank, they aren't so good, so I'm switching. As a user of the system you can see so much more, so I'll re-evaluate after I've switched.

Re:paranoia yes ..... (2, Interesting)

TheRaven64 (641858) | more than 6 years ago | (#22818542)

Have you contacted the bank and asked if they would be interested in you performing a free evaluation of their network security? Send them your credentials as a security professional and say that you are willing to give them a documented appraisal for free since you are a customer and the security of their system affects you. If they say no, then publish their refusal online somewhere, and approach another bank. If they say no, add them to the list. Start sending the list to consumer groups and mainstream media publications. Then contact another bank.

If you're in the USA, then good luck finding one that's even remotely secure. My US bank has such a laughable concept of security that I'm very glad I don't keep much money on that side of the pond.

Re:In security (5, Insightful)

v1 (525388) | more than 6 years ago | (#22817718)

I take the third view. I believe you need the ability to (forgive the overused phrase) "think different". 100% of what we do every day in life is based on a world of assumptions. To be a good security researcher requires distancing yourself from the assumptions, breaking out of the ruts in the road, and trying different things. The majority of security holes exist because the developers and defenders are making the same assumptions as everyone else. Buffer overflows are the classic example, and we still see them constantly even though they've been recognized for years as a major security risk.

I did in-house beta testing for a time, and used to really piss off the developers because I had a knack for knowing what they weren't planning for. I wasn't so much looking for security holes as for ways to crash the app (many of which were probably exploitable). A classic I heard of was someone submitting a bug report for "program crashes when it says Press Any Key and you press the letter A". The developer called her back to his cubicle: why did you press "A"??? She said her name was Alice, and it said press ANY KEY, so she hit "A". "But you're not SUPPOSED to hit A, you're SUPPOSED to hit the space bar!" At which point the other developer stood up from his cubicle and said "Oh? I thought it meant RETURN?" This perfectly illustrates how persistent assumptions are in coding. Not only are they all making assumptions, they aren't even making the same assumptions.

That's the sort of testing I did: deleting the last element in a list, selecting all in an empty list, saving a form before completing it, entering a 200-character filename on save - taking advantage of the assumption that the user knew what they were doing and would not ask the program to do something certain to produce undesirable results.

Re:In security (3, Funny)

TheRaven64 (641858) | more than 6 years ago | (#22818564)

I used to know a tester who would always hit control-alt-delete when told to press any key to continue. The company changed the messages to 'press almost any key to continue' after a while. Of course, that then confused real users who wondered which keys they weren't allowed to press...

yawn, another "superior geek" complex (-1, Troll)

Anonymous Coward | more than 6 years ago | (#22817252)

Documented in humanity for the past 4000 years is the concept of Divine Right: the idea that some person or group of people is inherently superior, as if given a gift by God. This grants them privileges - money, power, adoration, whatever.

What Schneier is doing here is posting another advocacy piece for Divine Right, at the start of the 21st century.

Get over it. You're just another human. You choose to use your brain in one way, that's all. Don't assume that just because others aren't doing what you are, it's because they can't. Worse, don't tell people who might aspire to something that they don't have a hope. That's just vile.

Re:yawn, another "superior geek" complex (2, Interesting)

TheP4st (1164315) | more than 6 years ago | (#22817296)

RTFA! "There's nothing magical about this particular university class; anyone can exercise his security mindset simply by trying to look at the world from an attacker's perspective."

Ant farms are nothing. (2, Interesting)

evanbd (210358) | more than 6 years ago | (#22817272)

You can get a port-a-potty delivered without ever providing positive identification. You don't even have to pay for it until it shows up, and they'll happily deliver while you're at work. They're quite used to people preparing to have renovations done by contractors.

Of course, I would never decide someone else needed a port-a-potty on their front lawn. But, much like the ants, it's something you can't help but notice if you have the right mindset.

Re:Ant farms are nothing. (1)

Nullav (1053766) | more than 6 years ago | (#22819012)

So you're the one flooding hapless neighborhoods and honest businesses with toilets!

Is this mindset really special? (4, Insightful)

badzilla (50355) | more than 6 years ago | (#22817290)

Anyone can do what Bruce implies only "special security people" can do. It's just that most people don't because there is no incentive to. You might as well announce that your special security mindset has noticed how easy it would be to go into restaurants and put poison in the salt shakers. Hell they are wide open! What were the salt shaker designers thinking of! But of course normal people are just not interested in doing that.

Good engineering (3, Insightful)

TheLink (130905) | more than 6 years ago | (#22817298)

"This kind of thinking is not natural for most people. It's not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail"

In my opinion, good engineering involves thinking that things _will_ eventually fail, how it can be made to fail _safely_ if possible and figuring out what the acceptable risk is given the cost. Modern engineers don't normally design stuff to last for 1000 years (some of it might last that long - distribution curves and all that).

Re:Good engineering (0)

Anonymous Coward | more than 6 years ago | (#22817526)

Most engineers don't design things to last even 10 years, let alone 1000 - if you're talking about consumer goods. Heck, even 1 year would be a dream come true with some crap. Gotta keep the monkeys buying again and again...

Re:Good engineering (1)

tsjaikdus (940791) | more than 6 years ago | (#22817590)

>> In my opinion, good engineering involves thinking that things _will_ eventually fail

Most engineering is done by engineers who have just left school. They are fast and they don't think. When the design is ticked off on the project leader's Excel sheet, it's time to move on to the next open issue. Nobody cares that the design does not fit or does not work at all. Engineering is about ticking off open issues.

Re:Good engineering (0)

Anonymous Coward | more than 6 years ago | (#22817812)

This is why it's called software "engineering".

Re:Good engineering (2, Insightful)

Hatta (162192) | more than 6 years ago | (#22818324)

In my opinion, good engineering involves thinking that things _will_ eventually fail, how it can be made to fail _safely_ if possible and figuring out what the acceptable risk is given the cost.

Murphy [wikipedia.org] was an engineer after all.

Re:Good engineering (0)

Anonymous Coward | more than 6 years ago | (#22818532)

That's because modern engineers work for companies that need to continue to make money. They need things to wear out and be replaced in order to sell more stuff to people and keep up "growth."

Roman engineers were soldiers who were out to build an empire for the Eternal City - and that meant "last for ever." Hell, Segovia in Spain still uses the aqueduct the Romans built!

Re:Good engineering (1)

celafon (935595) | more than 6 years ago | (#22818954)

I would go further.

Long gone are the days when security was something that "others" would make sure was there. Being a developer now involves making the code/system/architecture/whatever SECURE, and you have to train this particular way of looking at things even while building them. There are no magic devices that will secure the system for you!

Bruce Schneier Facts (1)

tangent3 (449222) | more than 6 years ago | (#22817326)

Bruce Schneier Facts [geekz.co.uk]
The last time someone tried to look into Bruce Schneier's twisted mind, the Big Bang happened

You're damn right, most people don't get it! (5, Interesting)

MikeRT (947531) | more than 6 years ago | (#22817422)

My instincts on this are more of "how would a criminal or terrorist behave in this setting?", because I grew up in a law enforcement family (both parents plus extended family). I've made a few "regular people" upset in the past by pointing out the idiocy of their evacuation plans to them in pointed detail. One example comes from high school, when the school shootings were just starting to disappear from the news.

Our school gets a bomb threat, and the teachers and administrators are freaked out. They move us all, I kid you not, to the football field, where we are fenced in by chain-link fence, about 1/3 of which is covered by barbed wire. So I point out to my history teacher, one of the only genuinely intelligent public school teachers I have ever met, that we have been corralled into an enclosed area surrounded by prime sniper positions (there were many points where a shooter with a .30-06 and a few mags could have unloaded with impunity), and that, ironically, if there were a bomb and the person who planted it were clever, they'd have put it under the bleachers where about 200-300 of us were sitting.

He nodded his head in agreement that, were this a real threat, we'd probably be fucked because of our administrators' plan, but the one or two regular teachers not far away who overheard acted like I was the real danger for pointing out what should have been obvious about that plan. Me? I'd have called in the buses and shipped everyone off the property to safety right away.

Re:You're damn right, most people don't get it! (3, Insightful)

LaskoVortex (1153471) | more than 6 years ago | (#22817458)

I'd have called in the buses and shipped everyone off the property to safety right away.

And then the snipers would shoot them as they were packed like sardines into the buses. Me, I would pull one of 50 cards with random "evacuation plans" out of a hat and do what it said on the card. I'd include an "ignore the bomb threat" card in there as well.

Re:You're damn right, most people don't get it! (3, Insightful)

autocracy (192714) | more than 6 years ago | (#22817568)

That's basically the answer a secure response would have to come down to. The constant issue in the grandparent's scenario is that the same thing will always happen: call in a threat, watch them load the buses... then bomb the buses the next time.

Much like the pre-2001 response of "we'll sit and wait for the hijacking to end," bomb threats are dealt with as if the threat is honest. Once somebody has a case of a bomb under a bleacher to remember, we may act differently.

Security tends to be reflexive.

Re:You're damn right, most people don't get it! (4, Insightful)

remahl (698283) | more than 6 years ago | (#22817528)

No need to call in the busses. Just tell everyone that they may go home for the day. They will disperse randomly in every direction, quicker than any school administrator can administer their movements and in ways that no terrorist can predict.

Re:You're damn right, most people don't get it! (1)

smooth wombat (796938) | more than 6 years ago | (#22817758)

They will disperse randomly in every direction ... in ways that no terrorist can predict.

Generally, there are only two ways in and out of a school parking lot. Using the DC snipers as a template:

1) Call in bomb threat to evacuate students
2) Administrators let students go home immediately rather than putting them on buses
3) As first car approaches exit, sniper shoots driver, disabling first vehicle and blocking exit
4) Repeat with second sniper at second entrance
5) Wait for students behind stuck vehicles to get out of cars or simply shoot at students inside vehicles
6) . . . .
7) Profit! (from publicity)

Does what I just said now make me a terrorist or someone who should be involved with security issues?

Re:You're damn right, most people don't get it! (1)

Torvaun (1040898) | more than 6 years ago | (#22817960)

Someone who should be involved with security issues. Note that this does not preclude terrorism.

Re:You're damn right, most people don't get it! (1)

The New Andy (873493) | more than 6 years ago | (#22817932)

You now have another problem, though: a student has an exam they don't want to take, so they call the school from a payphone and fake a bomb threat. Weee, no exam.

Given that most school bomb threats are fake, and there are lots of people who would like to fake them, the problem is a bit trickier than just avoiding getting students blown up.

Article leaves out cost benefit analysis (5, Insightful)

MyNameIsFred (543994) | more than 6 years ago | (#22817522)

While I agree with many points of the article - specifically that a security professional must have an unusual mindset - I am troubled that the examples leave out the cost-benefit analysis. As an example, the article correctly points out the vulnerability in picking up "your car" from a service department: all you need is a last name, no ID. This is an obvious vulnerability. On the other hand, the service department is motivated to make the process as streamlined as possible for its customers. Demanding IDs, etc., will slow down the process, and the more cumbersome the process, the more likely customers are to use a competitor. Therefore, they have to weigh the security of the cars against the cost of losing customers.

I am reminded of the time I test drove a new car. All the dealership wanted was a photocopy of my driver's license, and they let me drive the car off the lot for an extended test drive. Since driver's licenses are relatively easy to fake, I wondered how often cars get stolen. I asked, and was told they are stolen on occasion, but insurance covers it. My point: they did the cost-benefit analysis and decided on an insecure method.

Re:Article leaves out cost benefit analysis (1)

remahl (698283) | more than 6 years ago | (#22817790)

The article recognizes that there is a cost-benefit tradeoff in the car dealership example. The point is that there will be no analysis unless someone sees that there may be a problem in the first place:

The rest of the blog post speculates on how someone could steal a car by exploiting this security vulnerability, and whether it makes sense for the dealership to have this lax security. You can quibble with the analysis -- I'm curious about the liability that the dealership has, and whether their insurance would cover any losses -- but that's all domain expertise. The important point is to notice, and then question, the security in the first place.

Re:Article leaves out cost benefit analysis (1)

CBravo (35450) | more than 6 years ago | (#22818082)

It would be easy to photocopy your driver's licence and see if the person who is collecting the car matches the photo on the licence. Right? Though that doesn't prove you have not already picked up the car.

All these 'problems' should be stated as 'engineering requirements'.

Re:Article leaves out cost benefit analysis (0)

Anonymous Coward | more than 6 years ago | (#22818590)

Indeed, but the cost-effective method still IS the insecure one. This isn't a problem if the one making the security decision bears its costs, but it does mean that when there are externalities involved, we need to ensure that those who make the decisions have a good financial reason to make the right ones.

In the service department example, that means the service department needs to be held responsible if a car gets stolen this way.

What is Bruce trying to make us think ? (1)

Alain Williams (2972) | more than 6 years ago | (#22817558)

Turn his reasoning on his article: how can we subvert it? Was the message he gave a real one, or was he trying to make us believe something - and for whose benefit?

Seriously: I agree with a lot of what he has to say. I am amazed at the number of programmers who do not follow Henry Spencer's 6th commandment for C programmers - check function return codes - and simply assume everything will work correctly.

If something can go wrong - it will, and often at the most inconvenient time.
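The commandment targets C's error-return idiom, but the same failure mode exists in any language where a call can fail quietly. A hedged Python illustration (the parse_port helpers are hypothetical):

    import re

    def parse_port_unchecked(line):
        # Assumes the match always succeeds; raises AttributeError on e.g. "host=web01".
        return int(re.match(r"port=(\d+)", line).group(1))

    def parse_port_checked(line):
        match = re.match(r"port=(\d+)", line)
        if match is None:                # check the "return code" before using the result
            return None
        return int(match.group(1))

    print(parse_port_checked("port=8080"))   # 8080
    print(parse_port_checked("host=web01"))  # None, instead of a crash one line later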

I do this all the time (1)

ledow (319597) | more than 6 years ago | (#22817564)

I do this all the time... I actually am quite surprised at the number of everyday things that have such simple flaws.

In the hospital waiting for my wife the other day, I watched a mailwoman with a big trolley full of mail, sorted into departments, insert several people's medical records into the trolley and then walk off out of sight through locked doors (which were opened for her after she tapped the glass and stood to one side), leaving the mail unattended. It wouldn't take much to a) gain access to the baby ward that is supposed to be secure by posing as a mailwoman, or b) steal someone's medical records just by knowing they were in hospital that day and one department they would have to visit.

The other, from working in schools, comes from the Tesco Computers For Schools voucher scheme. For every £10 spent in a supermarket, customers get a flimsy paper voucher that they can give to the schools (only schools) who, when they have a few thousand, can trade them in for a free computer or computer hardware. Most people just throw them away, and I actually collect hundreds from the floor outside shops on my way home.

First, the vouchers are simply printed pieces of paper - there isn't any security on them at all. The only "barcode" is always "1234567890X", every voucher is identical, and it's just cheap, bog-standard paper. Secondly, schools can collect amazing numbers of vouchers just by running campaigns or by collecting harder, so a school might collect 5 vouchers one year and 50,000 the next. Thirdly, the vouchers are *not counted* at the other end. They are weighed approximately (if at all - I don't believe Tesco actually weighs millions of vouchers each year and worries about the accuracy). I always wonder how much it would cost to print, say, 10,000 identical vouchers of your own to the same standard, compared to the cost of a video-editing PC and lots of educational software supplied with it. Or to try your luck by declaring false numbers of vouchers and thereby learn how accurately they measure (yes, you TELL Tesco how many you have - you can even do that online - and then send them off later to be "verified").

That, and working in IT in a school means I'm always looking for ways into the building, past staff, into the computer systems, etc. Some schools are amazingly lax while others are like Fort Knox.

Re:I do this all the time (1)

cbart387 (1192883) | more than 6 years ago | (#22817662)

My favorite is when companies allow you to verify stuff over the phone. For example: my parents have POA (power of attorney) for my grandparents. I forget which company it was, but they weren't accepting it when my mom needed to verify some information, so they wanted my grandfather to give his consent. That was fixed by my father calling and pretending he was my grandfather. Great!

Re:I do this all the time (1)

petes_PoV (912422) | more than 6 years ago | (#22818514)

The other, from working in schools, comes from the Tesco Computers For Schools voucher scheme

The thing about this promotion is that giving away computers to schools is something Tesco could afford to do, for free, any time they chose. The whole idea of making people collect worthless pieces of paper and go through the charade of giving them to schools, who then redeem them, is merely a marketing exercise to promote loyalty to the store and to make the donors feel good about themselves; it's certainly not a charitable activity.

Therefore I would say that they have the level of security exactly right. The product (here, the voucher, not the computer) is worthless, so there's no merit in trying to protect it. I would expect that any effort Tesco made to trace back or validate the authenticity of a voucher would add considerably to the cost of the marketing programme and is therefore not worthwhile.

Securing things with value has some worth (provided the security costs are commensurate with the risks and value involved). Adding security just for the hell of it, or "because we can" is a pointless exercise that only adds to costs and overheads.

How about risk management? (2, Insightful)

tansualpcan (1259978) | more than 6 years ago | (#22817640)

I have written a long long reply to his article at my blog [blogspot.com] (no ads, etc.)
Short summary:
In my opinion, security in real life is not about "what can go wrong". It is about "how often and how much can it go wrong, and am I prepared to handle those cases". In short, it is more about calculating risks accurately and knowing when to take them.

There's a fine line (3, Interesting)

petes_PoV (912422) | more than 6 years ago | (#22817664)

between being "security conscious" and being completely paranoid. When it boils down to it, there's risk involved in everything we do. Nothing is completely secure and there's always a chance that something will go wrong.

Sadly the world we live in today has massively overestimated the possibility of problems and hugely inflated the effects they will have (in the tiny percentage of occasions when they happen). I think this is a side-effect of improved communications: we all get to hear about the one-in-a-million disaster stories, but never about all the other times, when everything goes right. This leads us to think that problems are more common than they actually are.

The great thing about being a security professional is that you can never be proved wrong. If you point out a security hole and it is never exploited, no-one will say you're wrong - just that it hasn't been exploited yet. If we believed everything these guys say, no-one would ever do anything, as we'd all be too scared. Personally I think we should avoid the obvious problems, get on with our lives and accept that on a few, very few, occasions we might have to spend a little time sorting out a problem.

Re:There's a fine line (1)

dpilot (134227) | more than 6 years ago | (#22817924)

But this becomes a good lead-in to point out the findings of the 9/11 Commission.

The "fault" was assigned as a "failure of imagination." Yet in the very center of the whole investigation was the NSA, the folks that are *supposed* to be, as you say, completely paranoid. These are the people who are supposed to see an array of dots and connect them all into a pattern - that's their job. They're supposed to think about the possibilities of a hijacked airplane loaded with fuel, and what you do to mitigate the risk. They're supposed to see briefings titled, "Bin Laden determined to attack inside US" and start thinking.

We had the wrong mindset in the job.

But that's OK, they've been promoted out of that position.

Re:There's a fine line (1)

Torvaun (1040898) | more than 6 years ago | (#22818020)

Security professionals can be proven wrong, all it takes is someone listening to them. Suppose a security professional stated that guns were dangerous, and outlawing guns would make people safer. Then someone outlaws guns, and voila, the reverse happens. The security professional has just been proven wrong.

Re:There's a fine line (1)

petes_PoV (912422) | more than 6 years ago | (#22818304)

Yes, you're right in that example. However, my experience (somewhat limited, I admit) is that security professionals and others tend to make statements such as "X is a security risk" rather than "if you do X, Y will happen".

They seem to have learned the "weasel words" - might, could, may, etc. - and pepper their prognostications with them. As a consequence you can't nail them down to a definitive, quantifiable statement.

I'd like to ask the guy who wrote about being able to mail tubes of live ants (from the original article) exactly how many instances of this have occurred. While he is right that it's possible, my point is that possible does not mean probable. Even if someone did send you a tube of ants, or ordered you a porta-potty: well, so what? It's not exactly life-threatening, and it would only take a phone call, or a quick FLUSH, to resolve the problem - barely worth considering, much less writing about.

Re:There's a fine line (2, Informative)

grassy_knoll (412409) | more than 6 years ago | (#22818500)

It seems that when many people consider risk, they don't consider the probability of something happening, only the possibility.

Consider the National Safety Council's Odds of Dying [nsc.org] page. According to them, one has a 1 in 73,085 chance of dying in a motorcycle accident, while there's a 1 in 19,216 chance of dying in a motor vehicle accident as a car occupant.

However, motorcycles are perceived (at least by people I know, obviously a small sample) as riskier because "people die riding those". Obviously that happens, but not to the same extent as people dying in car accidents.

Since many people drive every day, driving is a routine activity they don't seem to associate with risk; your average person doesn't rate the risk very high even though, by those odds, it's statistically more dangerous.
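
For what it's worth, here is a quick sketch that turns the lifetime odds quoted above into comparable probabilities (treating the NSC figures as population-wide lifetime odds, which is what that page reports):

    # Lifetime odds quoted in the parent comment (NSC "Odds of Dying" figures).
    odds = {
        "motorcycle rider death": 73085,   # 1 in 73,085
        "car occupant death":     19216,   # 1 in 19,216
    }

    for cause, one_in in odds.items():
        p = 1.0 / one_in
        print(f"{cause:24s} 1 in {one_in:,}  (p = {p:.6f})")

    ratio = odds["motorcycle rider death"] / odds["car occupant death"]
    print(f"By these population-wide odds, dying as a car occupant is about "
          f"{ratio:.1f}x more likely - mostly because far more people ride "
          f"in cars; a per-mile comparison would look very different.")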

Re:There's a fine line (1, Informative)

Anonymous Coward | more than 6 years ago | (#22819088)

You do realise that that's pretty much what Bruce is saying, too, don't you?

Or, well, I guess you don't. Did you RTFA? For that matter, do you read Bruce's blog? I'm not saying you need to do either - it's fine if you didn't or don't - but you should not pass judgement on him if you don't.

Developers: Put On Your Hacker Hat! (2)

curmudgeon99 (1040054) | more than 6 years ago | (#22817708)

Over the Christmas holidays, when work is always slow, I have a long-standing habit of putting on my hacker hat and seeing what our vulnerabilities are. I think every developer owes it to their sanity to do this regularly. You will find plenty of opportunities for SQL injection - no matter how careful your developers are - along with cross-site scripting and a bunch of other holes. You do not want to be in a conference room some day explaining to your boss's boss why your program allowed a hacker to gain access to the company's systems through your app. This is a no-brainer.
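
To make the SQL injection point concrete, here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table and the attacker's input are of course hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

    user_input = "nobody' OR '1'='1"   # attacker-controlled value

    # Vulnerable: user input concatenated straight into the SQL string.
    unsafe = "SELECT name FROM users WHERE name = '" + user_input + "'"
    print(conn.execute(unsafe).fetchall())   # -> every row comes back

    # Safer: a parameterized query; the driver treats the value as data, not SQL.
    safe = "SELECT name FROM users WHERE name = ?"
    print(conn.execute(safe, (user_input,)).fetchall())  # -> no rows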

The necessary human element (3, Insightful)

dpilot (134227) | more than 6 years ago | (#22817756)

One example used in the article was picking up a car from the repair shop with just a last name.

Where I get my car serviced, I know both guys who might be behind the desk, and they both know me, my wife, and my son. They won't hand over the car keys on just a last name. Which brings it all back to a frequent point in Bruce's writings - all the security razzle-dazzle in the world doesn't make a bit of difference compared to a knowledgeable person in the right spot.

Re:The necessary human element (1)

QuantumRiff (120817) | more than 6 years ago | (#22818560)

Where I get my car serviced, I know both guys who might be behind the desk, and they both know me, my wife, and son.

But not everyone drives a Ford!

Good engineers look for failure too. (2, Insightful)

argent (18001) | more than 6 years ago | (#22818030)

Good engineers need to look for how things can fail, too. They need to look for small parts that children may swallow, weak latches that can allow lids to fall open, weak load-bearing structures... how the environment can make their products fail. They need to look for how things can be made to fail, as well, because the hostile human element is always part of the environment... the same factors that make someone a good engineer make them a good security expert.

The problem isn't that good security professionals have a different mindset from good engineers; it's that both good security professionals and good engineers are rarer than people think, and that engineers are not as often held responsible for how their products fail when someone gains an advantage by deliberately making them fail.

As in many other areas of life, I try to ask myself, WWFD? What Would Feynman Do?

HERO Sys (1)

ChristTrekker (91442) | more than 6 years ago | (#22818076)

I described this as "PsychLim: naive to criminal mindset, -10" on a Champions character I played back in the day.

Scripts (2, Insightful)

Hatta (162192) | more than 6 years ago | (#22818098)

Speaking of security analysis, there are scripts from 9 different domains on that page, none of which are required to read the article. WTF. Thank god for NoScript.

Making money by breaching security isn't easy (3, Insightful)

dpbsmith (263124) | more than 6 years ago | (#22818576)

Without disagreeing with anything at all in the article, I'd like to raise the point that an awful lot of things have no security, or very porous security.

What saves society is three things.

First, mischief and curiosity aren't powerful enough motivators to create a real problem. I don't know whether Schneier ever sent live ants to strangers... or how many Slashdot readers will try it... but most likely not very many will.

Second, for most security holes it is difficult to think of a way to make money from the exploits.

Third, even if you can make money, it's even more difficult to find a way that will make significant amounts of money and to repeat the exploit often enough to make a living wage, without being caught.

Case in point: newspaper vending boxes which allow you to pay for one newspaper and access a whole stack of them. If you have a "security mindset" (or even if you don't), it occurs to you that you could pay for one and take two... or ten... or the whole stack. And, indeed, you can. The problem is that it doesn't benefit you to get more than one newspaper. So, can you take two and sell the extra? Maybe. Net profit $0.50. Could you take the entire stack out of the machine and dress up as a street vendor and sell them on a street corner? Maybe. Net profit $25. Could you do it more than half-a-dozen times? Probably not.

How about self-checkout lines in supermarkets? You can buy produce at them, and the produce isn't bar-coded. So, you can buy orange bell peppers at $3.99 a pound, put them on the scanner scale, and enter the code for green peppers at $1.69 a pound. Most supermarkets seem to rely on someone at a nearby counter keeping an eye on the self-checkout lanes while doing other things, and they don't usually come over unless a customer calls or the machine goes into an error state. Again, it's hard to see how you can make money, rather than saving a little on your grocery bill... and if you managed to do this to the extent that you were stealing hundreds of dollars, I think your chances of being detected would be high. (I'm thinking of people who got caught recently pasting barcodes for two-dollar items over things like boom-boxes and DVD players...).
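
The repeatability point can be put as back-of-the-envelope arithmetic; every number below is invented purely for illustration:

    # Crude exploit-economics sketch: each run earns a small profit but
    # carries an independent chance of getting caught, which forfeits the
    # takings and adds a penalty. All figures are made up.
    def expected_take(profit_per_run, runs, p_caught_per_run, penalty):
        p_never_caught = (1 - p_caught_per_run) ** runs
        return profit_per_run * runs * p_never_caught - penalty * (1 - p_never_caught)

    # The newspaper-stack hustle: $25 a time, tried six times, a 10% chance
    # per attempt of being noticed, and a $500 fine if you are.
    print(expected_take(profit_per_run=25, runs=6,
                        p_caught_per_run=0.10, penalty=500))
    # Comes out negative - which is roughly why nobody makes a living at it.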

Also the auditor mindset (1)

baomike (143457) | more than 6 years ago | (#22819082)

When investigating or examining internal controls over cash, you use the same idea.
How can you get money out of the system?
If you can find it, so can someone else.
