
Schneier: Security Awareness Training 'a Waste of Time'

Soulskill posted about a year and a half ago | from the only-trust-users-to-be-users dept.

Security 284

An anonymous reader writes "Security guru Bruce Schneier contends that money spent on user awareness training could be better spent and that the real failings lie in security design. 'The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on,' Schneier writes in a blog post on Dark Reading. He says organizations should invest in security training for developers. He goes on, '... computer security is an abstract benefit that gets in the way of enjoying the Internet. Good practices might protect me from a theoretical attack at some time in the future, but they’re a bother right now, and I have more fun things to think about. This is the same trick Facebook uses to get people to give away their privacy. No one reads through new privacy policies; it's much easier to just click "OK" and start chatting with your friends. In short: Security is never salient.'"


Obligatory car analogy (5, Insightful)

qbast (1265706) | about a year and a half ago | (#43221781)

It demonstrates that the car industry has failed. We should be designing cars that don't need seatbelts and don't care if the driver decides to slam into a tree at 100 km/h. The whole concept of safe driving is just an abstract benefit that gets in the way of enjoying driving.

Re:Obligatory car analogy (0, Flamebait)

Cat_Herder_GoatRoper (2491400) | about a year and a half ago | (#43221799)

Employee awareness training is inexpensive and I bet "Security guru Bruce Schneier" will provide training to your developers that is not inexpensive.

Re:Obligatory car analogy (3, Insightful)

jewens (993139) | about a year ago | (#43222047)

The training itself may be inexpensive, but the lost time for "all" employees forced to take the course/class/lecture is not. Not to mention the burden on your staff of tracking attendance, compliance, etc.

Re:Obligatory car analogy (5, Insightful)

philip.paradis (2580427) | about a year ago | (#43222171)

Bruce is right. In many environments, information awareness training is an attempt to solve the problem at entirely the wrong end of the failure chain, and is frequently ineffective. It may be difficult to hear for some, but the fact is that such training simply doesn't have a great track record of producing significant overall gains in organizational security, largely owing to the difficulty of mitigating widespread stupidity on the part of human operators. Most companies are not wholly staffed by information security experts, and any perceived near term security gains following training sessions quickly erode as employees revert back to an attitude of "I just want to do X, Y, and Z, and I'm too busy to keep thinking about those scary stories portrayed in last week's training."

Even military environments suffer from these training challenges. The difference in a military unit is the very real possibility of going to prison for merely mishandling cryptographic material by accident. On the "low" end of the punishment scale, there are more than a few senior enlisted military comms folks out of a job because of such process failures. I served with one such person.

It's worth noting in closing that you might want to spend a bit of time looking into who Bruce Schneier [wikipedia.org] is before framing him in any additional snarky quote marks. To say this is a man who typically knows what he's talking about is an understatement.

Re:Obligatory car analogy (0)

Anonymous Coward | about a year ago | (#43222431)

... you really don't know who Bruce Schneier IS, do you?

Re:Obligatory car analogy (5, Interesting)

hairyfeet (841228) | about a year ago | (#43222493)

Sorry, I've been in PC retail for nearly 25 years and I can tell you training the grunts? NEVER works. Now training the IT staff? Sure, send 'em to Blackhat, pay for security classes, those ARE good investments that will see a return, but Sally the secretary, who sees the PC as a magic black box that lets her do her work? Sorry, but it's gonna go in one ear and out the other.

It would be like trying to teach me how to rebuild cars. I don't like cars, never cared about what model I drove, I just don't give a damn as long as it gets me from A to B, and THAT is how many of your employees see the PC. They don't want to know about the thing, couldn't care less what it's doing as long as they can get their work done and punch out. They haven't the slightest interest in PCs, and if you don't have any desire to really learn? Not gonna stick.

So I have to agree that paying to train the regular staff is just a waste of time and energy. Much better to make sure you have well-trained IT staff who can minimize the risk your end users pose, because frankly you are just wasting your breath when you try to teach somebody who doesn't care about PCs how to securely use one.

Re:Obligatory car analogy (3, Insightful)

PopeRatzo (965947) | about a year ago | (#43222567)

Employee awareness training is inexpensive and I bet "Security guru Bruce Schneier" will provide training to your developers that is not inexpensive.

Training only goes so far. Even the best-trained user will make a mistake.

"Oh, I didn't mean to click that".

Re:Obligatory car analogy (0)

Anonymous Coward | about a year and a half ago | (#43221803)

Automatic braking, collision avoidance systems, completely automatic cars, ...

Yea, I'd say that's actually the direction we're heading.

Re:Obligatory car analogy (1)

Anonymous Coward | about a year and a half ago | (#43221825)

but would you download a car?

Re:Obligatory car analogy (5, Insightful)

mwvdlee (775178) | about a year and a half ago | (#43221871)

To stay closer to the original analogy...

Would you drive a car randomly left by the side of the road with big stickers on it saying "You may be eligible to win $1mln if you drive this car!!!" (paid for by Soylent Green Corp.)?

Re:Obligatory car analogy (2)

JustOK (667959) | about a year ago | (#43222539)

Only if the blinkers were on.

Re:Obligatory car analogy (1)

BrokenHalo (565198) | about a year ago | (#43222293)

Yea, I'd say that's actually the direction we're heading.

True, which is why we need to step back a bit. Yet another car analogy might apply here. We would all be safer drivers if we were strapped to the front of our cars like Aztec sacrifices [a virtual beer for anyone cool enough to spot that reference]. Similarly, an appropriate modicum of paranoia in our online behavior would prevent a bucketload of grief.

Re:Obligatory car analogy (2)

Sique (173459) | about a year ago | (#43222423)

But Pontiac was an Ottawa chief and not an Aztec.

Re:Obligatory car analogy (5, Insightful)

DMUTPeregrine (612791) | about a year ago | (#43221915)

No, he's saying that we should be adding seat belts and anti-lock brakes and eventually self-driving cars to eliminate the need for the user to focus on safety while driving. He's arguing that the safety should be built into the system, not reliant on the judgement of the user. That's the exact opposite of your example.

Re:Obligatory car analogy (1)

Anonymous Coward | about a year ago | (#43221997)

self-driving cars

We already have those. It's called a bus, and it's a very effective way of getting people to buy a non-self-driving car.

Re:Obligatory car analogy (1)

Ardyvee (2447206) | about a year ago | (#43222079)

Except self-driving cars remove the fun of driving (not the fun of moving around, but you know, some people like to drive themselves). But yeah, you're right. That's what the guy meant if you were to make an obligatory car analogy.

Re:Obligatory car analogy (4, Funny)

mcgrew (92797) | about a year ago | (#43222197)

What's so fun about driving? That's like saying a Roomba takes the fun out of sweeping the floor.

Re:Obligatory car analogy (1)

JustOK (667959) | about a year ago | (#43222553)

Some androids enjoy ironing.

Re:Obligatory car analogy (4, Interesting)

dinfinity (2300094) | about a year ago | (#43222225)

No. TFS is a terrible representation of TFA.

This is a more fitting excerpt:

The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to their folk beliefs of security, rather than forcing them to learn new ones.

Even though TFA is pretty crappy itself with its myriad of bad analogies, the idea of trying to craft effective simplified 'folksy' models makes sense. My favourite metaphor for internet security is regarding the internet as a square in a foreign city center. It gets the message of what to trust and what not across a lot better than trying to explain Javascript, cross-site scripting, or what an executable is.

In addition to this approach to raising security awareness, a case is (sort of) made for designing systems to support users in security related decisions in a way consistent with the above. I'd say that a green colored address bar in a browser is an example of how to do it the right way and the blanket statement 'this file may harm your computer' one of how to do it the wrong way.

Re:Obligatory car analogy (4, Insightful)

Anonymous Coward | about a year ago | (#43221993)

Driving a car is a far more focused task, with more salient dangers. Even without safety training people understand that driving erratically, or at high speeds can be dangerous. Using a computer or the internet is more like watching TV or reading an article and determining if what you're watching is fact or fiction; it requires judgement and motivation.

Given that many adults don't have these skills, and, importantly, that the effects of failure extend beyond the individual involved, what Schneier is proposing makes sense.
 

Re:Obligatory car analogy (0)

Anonymous Coward | about a year ago | (#43222185)

What? The fuel tank is installed such that it will blow up in almost any type of accident? Invest in more 'security awareness' of the drivers!!
Such was the mentality of the car industry before independent safety tests became mandatory.

Re:Obligatory car analogy (1)

philip.paradis (2580427) | about a year ago | (#43222213)

Here's a better car analogy. You're driving down the street on four bald tires, and a guy driving a truck for a tire shop happens to pull up next to you at a red light. The guy remarks on your crap tires, and now you have two choices. You can listen to him because he probably knows what he's talking about when he tells you you're running a serious risk of dying on the highway when one of those tires fails catastrophically, or irrationally ignore him because you perceive that he's just trying to sell you something.

All the driving skills and seat belts in the world won't beat physics when one of those tires blows out at 80 mph and you flip over the median into an oncoming semi.

Re:Obligatory car analogy (2)

LaggedOnUser (1856626) | about a year ago | (#43222365)

Have you forgotten about air bags? They are there precisely so that you don't have to remember to use your seat belt...

Re:Obligatory car analogy (1)

Anonymous Coward | about a year ago | (#43222455)

Your car analogy is completely off. The correct one would be: Bruce Schneier says it's better to spend money on putting seatbelts in cars, on making cars that the user can't push past the safety limit, that cannot be driven off-road, etc., than to spend money on educating drivers to drive safely.

The analogy is also not apt because driving a car recklessly endangers the driver's life, and even absent an immediate threat of death, the driver may be arrested by police for not following the rules. No police force will arrest you for not using strong passwords, and there is no threat to life. So the approach to computer security must be different, unless we are prepared to create and pay for a "strong password police force", or write computer security practices into law.

Re:Obligatory car analogy (1)

milkasing (857326) | about a year ago | (#43222503)

That. Every system can only do so much. Ultimately, even the best-designed system depends on people doing the right thing and accepting changes that make the system more secure.
What use is it if you build a closed environment with restricted access and two-factor authentication, if some CxO gives his RSA token and password to his unvetted summer intern for some trivial task without supervision?
Is security awareness training the be-all and end-all of IT security? Of course not. But frankly, it is a trivial part of a security budget and it does have real benefits.

Well, duh.. (2, Insightful)

Anonymous Coward | about a year and a half ago | (#43221793)

Users can screw up because they are just as human as you. So live with it. Design around it. Make it safe regardless.

I've only been saying that since, mwah, 1999 or so.

Policies are OK, but rules that assume perfect compliance in order to work are really only there to cloak the failure to engineer some fault tolerance into the system architecture and UI design. Glad someone finally caught on.

Re:Well, duh.. (1)

3.5 stripes (578410) | about a year ago | (#43221957)

Sure, humans can screw up, but can't the people doing the engineering make mistakes as well?

Most software designers don't leave security holes in their software by design, one would hope.

Re:Well, duh.. (4, Insightful)

ATMAvatar (648864) | about a year ago | (#43222657)

They don't intentionally do so, sure. However, most software designers are not trained to develop secure software, are not paid to develop secure software, and in fact, would probably get a heated talking-to by management if they spent the extra time to make their software secure without explicit instructions to do so.

Re:Well, duh.. (1)

BrokenHalo (565198) | about a year ago | (#43222363)

I've only been saying that since, mwah, 1999 or so.

1999? That was only the day before yesterday. Get off my lawn, whappersnipper. ;)

Obligatory quote (4, Insightful)

Krneki (1192201) | about a year and a half ago | (#43221815)

A common mistake that people make when trying to design something completely foolproof was to underestimate the ingenuity of complete fools.

Re:Obligatory quote (2)

gblackwo (1087063) | about a year ago | (#43222217)

-Douglas Adams (And I do not believe the original quote was past tense)

Re:Obligatory quote (0)

Anonymous Coward | about a year ago | (#43222249)

Security awareness = nerd religion

Re:Obligatory quote (1)

Anonymous Coward | about a year ago | (#43222265)

I fought the fool, and the fool won!

Re:Obligatory quote (1)

Big Hairy Ian (1155547) | about a year ago | (#43222625)

Good old Douglas Adams! What's to stop some idiot whom we now force into using a strong password from writing it on a Post-it note and sticking it to a monitor!

Invalid comparison (5, Insightful)

Aethedor (973725) | about a year and a half ago | (#43221835)

He's comparing security with health and driving to 'prove his point'. Security is not the same as health or driving, so any conclusion drawn from such a comparison is a false one.

Second, you don't have to choose between completely ignoring security awareness training and spending lots and lots of money and time on it. There is a very good choice somewhere in between. I agree with him that information systems have to be secure and shouldn't offer dangerous actions, but no matter how secure you make your information system, it will all fail if the user has no clue about what he or she is doing. And giving employees a basic level of security awareness doesn't have to cost a lot of money, but will still help you prevent a lot of trouble.

You're missing the point (1)

Anonymous Coward | about a year ago | (#43221963)

His examples are about forms of security or safety in different areas, but that really isn't that important for the point he's making, which is not about the type of technology involved, but about human behaviour. If we recognise short or medium term consequences we're far more likely to adjust our behaviour than if the consequences only affect us in the long term or if the link between cause and effect isn't clear. With the current state of IT the link is unclear, so training people will not be very effective. Energy is better spent on adapting the technology to the limitations humans have.

Re:You're missing the point (2)

Electricity Likes Me (1098643) | about a year ago | (#43222575)

Also, in computer security there are a lot of false-flag-type attacks going on: these days, something tends not to look obviously unsafe, but winds up being so (browsing the web "safely" shouldn't even be a problem; when you get down to it, the browser should be keeping things thoroughly "on the web only").

I totally agree with Bruce here (2)

tlambert (566799) | about a year and a half ago | (#43221859)

I totally agree with Bruce here

We should be designing systems that won't let users choose lousy passwords

It reduces the search space I have to look at in order to brute force things, and that's a good thing...
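The joke has real arithmetic behind it. A back-of-envelope Python sketch (the character-class sizes and the 8-character length are illustrative assumptions) of how much of the search space a "must contain an upper, a digit, and a symbol" rule hands the attacker:

```python
from itertools import combinations

LOWER, UPPER, DIGIT, SYMBOL = 26, 26, 10, 32
ALL = LOWER + UPPER + DIGIT + SYMBOL  # 94 printable ASCII characters

def total(length):
    """Search space with no composition rules."""
    return ALL ** length

def with_required_classes(length):
    """Passwords containing at least one upper, one digit, and one symbol,
    counted by inclusion-exclusion over the classes that may be missing."""
    classes = (UPPER, DIGIT, SYMBOL)
    count = 0
    for k in range(len(classes) + 1):
        for missing in combinations(classes, k):
            count += (-1) ** k * (ALL - sum(missing)) ** length
    return count

print(f"all 8-char passwords:    {total(8):.3e}")
print(f"policy-compliant subset: {with_required_classes(8):.3e}")
print(f"space eliminated:        {1 - with_required_classes(8) / total(8):.1%}")
```

Under these assumptions the composition rule excludes roughly half of all 8-character passwords, so an attacker who knows the policy can skip them.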

Re:I totally agree with Bruce here (3, Insightful)

iapetus (24050) | about a year ago | (#43221999)

Sorry, but your approach is inefficient. Since the system now requires users to choose passwords that aren't memorable (and probably to change them regularly as well) a large number of them will have them written down on post-it notes stuck to their monitors. That reduces the search space even more. :D

Re:I totally agree with Bruce here (4, Insightful)

Loki_666 (824073) | about a year ago | (#43222403)

Damn my lack of mod points today. +1

Force users to choose complex passwords and they write them down, or they learn what the minimum requirement is and create something stupidly simple anyway. Or they constantly forget their complex passwords and bug the admins for a reset every 5 minutes. The final variant is that they use the same complex password for all systems. It's fairly secure against brute force or random guessing, but once a hacker has one password, he has them all... one password to rule them all, etc.

I've used systems with ridiculous requirements where I've not been able to remember an hour later what the hell I used. Something like requiring at least one capital, one number, one punctuation mark, no more than 2 consecutive identical characters, and no fewer than 12 characters. I ended up with something like this: Aabbaabbaabb1!
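A rough back-of-envelope comparison (the attacker model and word-list size here are assumptions, not measurements) of why a pattern like the one above fares worse than a longer random passphrase:

```python
import math

# If an attacker knows the "short repeated chunk + digit + symbol" habit,
# the effective choices are tiny; modeled (very roughly) as one 4-letter
# lowercase chunk times one digit times one symbol.
patterned_bits = math.log2(26 ** 4 * 10 * 32)

# Four words drawn uniformly at random from a 7776-word diceware-style list.
passphrase_bits = math.log2(7776 ** 4)

print(f"patterned policy password: ~{patterned_bits:.0f} bits")
print(f"four random words:         ~{passphrase_bits:.0f} bits")
```

The patterned string satisfies the policy yet carries well under half the entropy of the passphrase, assuming the attacker guesses the pattern.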

Re:I totally agree with Bruce here (1)

BrokenHalo (565198) | about a year ago | (#43222463)

Since the system now requires users to choose passwords that aren't memorable

Not really. There are lots of ways of constructing a strong password while still making it memorable. For instance, one can take one of the (arguably) most memorable opening lines in a novel:

It was the afternoon of my eighty-first birthday, and I was in bed with my catamite when Ali announced that the archbishop had come to see me.
(Anthony Burgess, Earthly Powers)

...and use that as Iwt4om81stb,aIwibwmcwAattahc2cm.

This is not one of my passwords, and it does seem pretty cumbersome, but I just pulled it out for effect. If any computer is sufficiently literary to deduce that password, it well and truly deserves the privilege of accessing my data.
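The technique above can be sketched mechanically; the rules here (first letter of each word, keep sentence punctuation) are one illustrative variant, not the poster's exact scheme:

```python
def phrase_to_password(phrase):
    """First character of each word, keeping any sentence punctuation."""
    out = []
    for token in phrase.split():
        word = token.strip(",.;:!?")
        if word:
            out.append(word[0])
        # keep punctuation attached to the word as extra symbols
        out.extend(ch for ch in token if ch in ",.;:!?")
    return "".join(out)

print(phrase_to_password(
    "It was the afternoon of my eighty-first birthday, and I was in bed."))
# -> Iwtaomeb,aIwib.
```

Case and punctuation carry over from the source sentence for free, which is what makes the result both mixed-character and memorable.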

Tick the box exercise for auditors (5, Insightful)

Anonymous Coward | about a year and a half ago | (#43221869)

Security Awareness training is a tick the box exercise most companies do to get auditors off their back.

Apparently, users are supposed to be "trained to recognise phishing emails and other Internet frauds". IT has enough trouble these days trying to recognise them, and somehow our ordinary users are supposed to recognise them too?

Users have to be "trained to pick good passwords". The system should be designed to prevent users from picking bad passwords in the first place.

Users should be advised to "pick strong passwords and change them regularly". Two contradictory demands: no one can remember a new complex password that changes regularly unless they write it down. Oh, and users should be told "not to write down passwords".

Awareness training is pushed because there are a number of so-called "security consultants" who have no real technical skills, yet have made a living pushing this snake oil. Unfortunately they are also good self-promoters and have the ear of regulators and auditors.

If you are relying on security awareness to protect your infrastructure, you're screwed. Most users don't care, and even those who do care cannot possibly be expected to remain aware of the myriad of threats that exist. Often, their attempts to remain secure achieve the opposite purpose ("I heard you tell me email was insecure, so I use dropbox now to transmit files to customers").

What galls me most is that I have to spend part of my IT budget this year on this stupid notion because auditors expect it. That means I have to cut back on the security projects that make a real difference.

Re:Tick the box exercise for auditors (0)

Anonymous Coward | about a year ago | (#43221925)

So how do you change the way a corporation works so that it can't be social engineered by a hacker but still work well for legitimate users and customers? And do it all without providing any "security awareness training" to the employees.

Re:Tick the box exercise for auditors (1)

Anonymous Coward | about a year ago | (#43222061)

So how do you change the way a corporation works so that it can't be social engineered by a hacker but still work well for legitimate users and customers? And do it all without providing any "security awareness training" to the employees.

Think about the ways social engineering occurs. If you have, say, a call centre operator who gives out sensitive information to someone who is not who they say they are, there may be two failings:
1. They have deviated from the security procedure, or
2. The security procedures were insufficient.

In the case of (1), retraining. In the case of (2) redesign. Expecting the call centre operator to think back to some "social engineering" training is a ridiculous control notion, you instead tell them not to ever deviate from the security procedure.

If someone receives an email that has managed to evade spam and AV detection, and they then click through to a third-party website, which again evades AV detection - whose fault is that? Do we expect users to identify sophisticated phishing sites? If you're relying on users to do that, at best you have an extremely weak control. Instead, we should be pushing the AV vendors to do better and come up with new solutions, we should be investing in technologies such as DLP and deep packet inspection, and we should be investing in technologies that detect password misuse or the unauthorised install of software on a user's PC.

Re:Tick the box exercise for auditors (1)

Raumkraut (518382) | about a year ago | (#43222269)

Think about the ways social engineering occurs. If you have, say, a call centre operator who gives out sensitive information to someone who is not who they say they are, there may be two failings:
1. They have deviated from the security procedure, or
2. The security procedures were insufficient.

In the case of (1), retraining. In the case of (2) redesign. Expecting the call centre operator to think back to some "social engineering" training is a ridiculous control notion, you instead tell them not to ever deviate from the security procedure.

But surely, if it is even possible for (1) to occur, it is a failure of the system storing the data; that sensitive information was disclosed without the security procedures being enforced.

Potentially unwanted programs (1)

tepples (727027) | about a year ago | (#43222537)

we should be investigating in technologies that detect password misuse or the unauthorised install of software on a user's PC.

Unauthorized by whom? There are plenty of tools for web development, remote assistance, and accessibility that show up as "potentially unwanted programs" in certain spyware checkers. A web server could have been installed by a web developer testing his own web application, or it could have been installed by an intruder to serve up kid porn.

Re:Tick the box exercise for auditors (1)

babywhiz (781786) | about a year ago | (#43222597)

"Expecting the call centre operator to think back to some "social engineering" training..."

If the training for Social Engineering stopped at the call centre, then the training plan is flawed to begin with. Everyone in IT better already be familiar with Social Engineering tactics and better know how to recognize them without thinking twice. That is just part of the job. If it's not where you work, then it should be. We require people that work in our IT Dept to know how to spot most Social Engineering attempts, and have read at least one Kevin Mitnick book. We work in Aerospace manufacturing, so we have to keep a closer eye on WHO has access to what.

The hard core, effective training at that point should be at the receptionist, the person that is answering the phone when the user presses 0 on their phone, and the people allowed to open the door for someone on the outside. That is the very first line of defense for Social Engineering attempts.

Kids these days....Get off my lawn.....

Re:Tick the box exercise for auditors (2)

rbrightwell (932570) | about a year ago | (#43221969)

You said: "Users should be advised to 'pick strong passwords and change them regularly'. Two contradictory statements, no-one can remember a new complex password that changes regularly unless they write it down."

I di$agr33WithY0uWh0leH3art3dly&&.

Re:Tick the box exercise for auditors (0)

Anonymous Coward | about a year ago | (#43222029)

You really think that replacing E with 3 and O with 0 makes passphrases stronger?

Re:Tick the box exercise for auditors (1)

rbrightwell (932570) | about a year ago | (#43222073)

It helps, but if that's all you noticed about my password then you failed at understanding that it strikes a balance between memorable and secure. This password includes numbers, symbols, upper and lower case, and is longer than most while still easy to remember.

Re:Tick the box exercise for auditors (0, Insightful)

Anonymous Coward | about a year ago | (#43222097)

di$agr33WithY0uWh0leH3art3dly&&.

You think this is memorable? Take a typical company where users are forced to change a password every 30 days.

They have to remember a new passphrase.
They have to remember that the start of words are capitalised, except the first word.
They have to remember to turn s into $, e into 3, etc. In case you're wondering, this is a basic substitution that password crackers find trivial to reverse, so you haven't really added two extra character sets - it's security theatre.
They have to remember to add "&&." at the end.

As the saying goes, you've created a password that is hard for a user to remember, and easy for a computer to guess.
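A small sketch of why those substitutions buy so little: a cracker simply expands each dictionary word through the same table. The table and word below are illustrative, not a real cracking ruleset:

```python
from itertools import product

# Classic leetspeak substitutions; each letter maps to its allowed variants.
SUBS = {"a": "a@4", "e": "e3", "i": "i1!", "o": "o0", "s": "s$5"}

def leet_variants(word):
    """Every combination of the substitutions above for one word."""
    choices = [SUBS.get(ch, ch) for ch in word.lower()]
    return ["".join(p) for p in product(*choices)]

variants = leet_variants("disagree")
print(len(variants))  # a small multiplier per dictionary word, not a new alphabet
```

A word like "disagree" yields only about a hundred variants, so the substitutions multiply the dictionary by a small constant rather than adding real character-set entropy.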

Re:Tick the box exercise for auditors (1)

rbrightwell (932570) | about a year ago | (#43222437)

My point is that you can make up a few rules which you can remember, think of a *long* phrase which you can remember, and have a passphrase which is easy to recall and better than 99% of the passwords in use today. This is good enough for most password situations.

And yes... I do find that password easy to remember.

Do you have a better solution for memorable passwords?

Re:Tick the box exercise for auditors (1)

BrokenHalo (565198) | about a year ago | (#43222593)

I have my doubts about the usefulness of changing passwords often. If your password is sufficiently strong, and there is no likelihood of inadvertently passing it on to a third party, you shouldn't need to change it that frequently. If the password can be scraped from the target system at any time, then you're fucked in any case. This is simply a reason to have an array of memorable (but strong) passwords for ranges of sites, where the smallest range is for critical things like banking(*) and the largest range is for inconsequential sites like Slashdot. ;)

* Ironically, banks always seem (at least here in .au) to run servers that support only the weakest passwords, limiting the number of characters and not accepting punctuation or whitespace.

Re:Tick the box exercise for auditors (0)

jewens (993139) | about a year ago | (#43222065)

What galls me most is I have to spend part of my IT budget this year spending money on this stupid notion because it is expected by auditors. This means I have to cut back on the security projects that make a real difference.

Have you tried getting your human resources department to add it to their list of recurring mandatory-for-employment training, along with sexual harassment, EEO, and all the other CYA events they are expected to cover?

Re:Tick the box exercise for auditors (1)

deoxyribonucleose (993319) | about a year ago | (#43222069)

Security engineering and awareness training aren't mutually exclusive: what's needed is a pragmatic balance between the two. Never try to use technology to solve people problems.

For instance, fraud detection is something people always will have an edge in, thanks to several millennia of social evolutionary pressures. But they won't be infallible, and will be more efficient if technology can filter out the worst distractions. Neither is complete without the other. The question is where we get the most bang for the buck realistically, and there Bruce has several points, without having shown the utter futility of any kind of end user training.

Re:Tick the box exercise for auditors (1)

Anonymous Coward | about a year ago | (#43222167)

The weakest link in any system is the user. You can't make it secure without educating them.

Re:Tick the box exercise for auditors (1)

Sique (173459) | about a year ago | (#43222481)

But you can reduce the user's influence by designing systems more securely.

If your input field doesn't accept ); anymore, the probability of a user launching an SQL injection attack, intentionally or inadvertently, drops drastically.

If you replace the door to the secure vault with a man trap, inadvertently leaving the vault door open can't happen anymore, and neither can tailgating.

You rightly identified the user as the weakest link, but your solution is what Bruce Schneier disputes.
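A minimal sketch of the design-level alternative to filtering characters like ); out of input fields: bind user input as a parameter so the database never parses it as SQL (sqlite3 stands in for any database here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

hostile = "alice'); DROP TABLE users; --"

# The ? placeholder binds the value: the payload is compared as a plain
# string, never parsed as SQL, so no characters need to be filtered.
rows = conn.execute("SELECT role FROM users WHERE name = ?", (hostile,)).fetchall()
print(rows)  # [] -- no match, and the users table still exists

rows = conn.execute("SELECT role FROM users WHERE name = ?", ("alice",)).fetchall()
print(rows)  # [('admin',)]
```

With this design the user's input simply cannot become code, which is the kind of system-level fix Schneier argues should replace reliance on user behaviour.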

Re:Tick the box exercise for auditors (1)

thegarbz (1787294) | about a year ago | (#43222207)

I'm interested to know how you design a system that works around the weakest link in security being the user. Every system yet envisaged has been designed to authorise the user. Most attacks aren't attacks on the security mechanism itself, but on the common sense of the user not to let others in.

The only way around this is to prevent the user from being able to log someone else in, and that typically happens at incredible inconvenience to the user, e.g. tying his login to the specific IP address of his computer.

Re:Tick the box exercise for auditors (1)

mcgrew (92797) | about a year ago | (#43222223)

Oh, users should be told "not to write down passwords".

I disagree, they should pick a strong password, write it down, and keep it somewhere secure, like their wallet.

Re:Tick the box exercise for auditors (2)

bickerdyke (670000) | about a year ago | (#43222227)

Apparently, users are supposed to be "trained to recognise phishing emails and other Internet frauds". IT has enough trouble these days trying to recognise them, and somehow our ordinary users are supposed to recognise them too?

That's because your users should have the one thing that the best malware filter/firewall/virus scanner hasn't: Common sense!


Re:Tick the box exercise for auditors (1)

vulcan1701 (1245624) | about a year ago | (#43222335)

Security awareness is a part of a security infrastructure. Since it is on the surface, and apparently everyone must do it, it gets the most exposure. In reality, it is a minor part, designed to protect against the potential unknown: zero-days, social engineering, or unintended privilege escalation.

An AUP along with three 5-to-10-minute videos covering external storage, phishing and social engineering should be sufficient.

Re:Tick the box exercise for auditors (0)

Anonymous Coward | about a year ago | (#43222405)

Users should be advised to "pick strong passwords and change them regularly". Two contradictory statements, no-one can remember a new complex password that changes regularly unless they write it down. Oh, users should be told "not to write down passwords".

This. The idea that randomly changing your password is by any means a good thing is absurd. Has your password been brute forced? Then you're hosed - the damage has already been done, and an aggressor of any talent will have your shiny new password delivered into their waiting hands. Has it not been brute forced? Then you might have just made it easier to brute force.

Can it be brute forced? That's the real question, isn't it? Programmers who allow unlimited brute-force attempts (without the attacker compromising a box and grabbing the actual encrypted/hashed passwords out of the requisite file or database) should be taken out to the nearest field and shot. Or demoted to home and small office support. It depends on which qualifies as a war crime. Go with that one.

Let's not talk about how stupid pass 'words' are in the first place. It's 2013; there's no excuse for me to not be able to type an entire sentence into the goddamned form. Correct horse, verily. Argh, I'll stop ranting now.
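The throttling demanded above is not hard to sketch. Below is a hypothetical per-account exponential-backoff throttle in Python (names and parameters are invented; it is in-memory only, and a real service would persist state and also throttle per source IP):

```python
import time

class LoginThrottle:
    """Per-account throttle: each failure doubles the lockout window."""

    def __init__(self, base_delay=1.0, max_delay=3600.0):
        self.base_delay = base_delay
        self.max_delay = max_delay
        self.failures = {}  # username -> (fail_count, last_failure_time)

    def allowed(self, username, now=None):
        now = time.time() if now is None else now
        count, last = self.failures.get(username, (0, 0.0))
        if count == 0:
            return True
        # 1 s after the first failure, 2 s after the second, 4 s, ... capped.
        delay = min(self.base_delay * 2 ** (count - 1), self.max_delay)
        return now - last >= delay

    def record_failure(self, username, now=None):
        now = time.time() if now is None else now
        count, _ = self.failures.get(username, (0, 0.0))
        self.failures[username] = (count + 1, now)

    def record_success(self, username):
        self.failures.pop(username, None)

throttle = LoginThrottle()
throttle.record_failure("bob", now=0.0)
throttle.record_failure("bob", now=0.1)
print(throttle.allowed("bob", now=0.2))  # False: still inside the 2 s window
print(throttle.allowed("bob", now=5.0))  # True: the window has elapsed
```

Exponential backoff barely inconveniences a legitimate user who fat-fingers a password twice, but makes an online brute-force run at geological speed.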

Not news (3, Informative)

Tom (822) | about a year and a half ago | (#43221873)

Nice to hear it from someone with a big name. I'm an IT security specialist, giving talks every now and then, and I've basically been saying the same for years now. It is one of the topics where I face the fiercest opposition, usually from (big surprise) consultants and other people who offer security awareness training.

I've been doing this for so long that I can sum it up in one sentence by now: if security awareness training worked, don't you think we would be seeing SOME effect after doing it for 20 years?

Of course, I am exaggerating a bit to make the point. I do think that training to make users familiar with specific security protocols is useful. I don't think general security awareness is. There is a plethora of reasons why it's a failure, from the context-specific nature of the human mind to the abstract level, but the main reason is that we have enough experience to show that it really is a waste of time and resources. Putting the same amount of money and effort into almost any other security program is going to give you a better ROI.

Re:Not news (1)

gmack (197796) | about a year ago | (#43221977)

While I agree that the system should do what it can to prevent intrusions and bad passwords, there are some things that users are just going to have to know not to do such as not writing their passwords on a sticky note or replying to some random email with their bank login or social security number.

Re:Not news (1)

Bongo (13261) | about a year ago | (#43222123)

Maybe the picture is this: user awareness is the very last line of defence. If the terrorist is on the plane and armed, the passengers are the last line. But it was the failure of everything before that point that's to blame. Gee, we really should increase passenger awareness of how to spot terrorists -- he has a big beard, no wait he doesn't have a beard, no wait he's dressed ordinarily but is reaching into his bag, no wait he's taking off his shoe, no wait he's actually a she and young, etc.

We all know there are "bad guys" out there. And we can alert people about specific attacks occurring today. "A man in a blue T-shirt is walking around; police say he has a record and is probably looking to steal equipment." People will listen to that. But indeed, general vague "you should be security trained" is not much use, it seems; you have to tell people exactly what they can and can't do, and that list is too long and complicated and keeps growing.

Re:Not news (4, Insightful)

serviscope_minor (664417) | about a year ago | (#43222347)

Nice to hear it from someone with a big name. I'm an IT security specialist, giving talks every now and then, and I've basically been saying the same for years now. It is one of the topics where I face the fiercest opposition, usually from (big surprise) consultants and other people who offer security awareness training.

Of course, I am exaggerating a bit to make the point. I do think that training to make users familiar with specific security protocols is useful. I don't think general security awareness is. There is a plethora of reasons why it's a failure, from the context-specific nature of the human mind to the abstract level, but the main reason is that we have enough experience to show that it really is a waste of time and resources. Putting the same amount of money and effort into almost any other security program is going to give you a better ROI.

I am honestly surprised by this. I really do not see how you can avoid security awareness training.

Forcing the users to pick non-lousy passwords is simply not enough if the users will happily respond to an email from email.admin@scamsite.ru (Re: YO'RE ACCOUNT IS SOON EXPiRE!1) with their username, password, SSN, date of birth and random security questions.

OK, that's a bit of an exaggeration, but users do happily respond to really poor phishing attacks and will tell their password to someone they assume is an email admin because the email comes from an account with admin in the name.

Security is as much a social problem as a technical one, and you simply cannot ignore the social aspect. And for that people have to have some understanding of basic security protocols: e.g. the admins will never ask for your password.

In fact, I would go as far as to say that security is very much a social problem. Technology will only get you half way. If your system is not easily hackable from the outside, you have reached the minimum standard. The trouble is that "social engineering" is really easy.

Even if you switch to 2-factor authentication it won't help enough: if the user believes that an admin has contacted them, then they will do ANYTHING to help that admin and will even follow detailed instructions to bypass as much security as possible. For some reason people being scammed are way better at following instructions than people who aren't.

As someone else quoted earlier: never underestimate the ingenuity of complete fools.


As Bruce Salienty Puts It (-1)

Anonymous Coward | about a year ago | (#43221905)

Beauty I'd always missed
With these eyes before
Just what the truth is
I can't say anymore

Now, some try to tell me
Thoughts they cannot defend
Just what you want to be
You will be in the end

Which is another way to say
Security is what you make it
Draw a circle in the sand
Round and round

NOON lunch time GONE

Fine to a certain point... (1)

gnalre (323830) | about a year ago | (#43221923)

While I agree with him to a certain point, there is a limit to how far security can be imposed on a user. Security always introduces overhead to doing a job. A user will accept that up to a point if the reason is explained; however, there is a point where putting more onerous security restrictions on a user becomes counterproductive.

For example, say the IT policy is that passwords must be changed every week, be 80% different, be a combination of letters, numbers, upper and lower case, and cannot contain any part of your user ID. That sounds safe; however, it makes it a great burden for users to generate and remember passwords. So what happens? They write them down and security is compromised.

Using the car analogy, the reason that driving is safer now is that the work of driving safely is hidden. Users do not need to work to drive safely; items like anti-lock brakes mean that users are safer without additional workload. What we need in security is ways to make things secure while at the same time reducing the effort to stay secure, for example biometrics.

One example is spam. Spam has basically been defeated not by putting more onus on the email reader but by having better spam detection, which means 99% of the time users are not even aware spam has arrived.

Re:Fine to a certain point... (0)

Anonymous Coward | about a year ago | (#43222387)

The problem with your argument about spam is that 1% gets through, and of that about 1% gets a response. It costs the spammer about $0.00 to send each email, and they send millions. If they get a handful of suckers then they are ahead.

Security is like that. Attempt to engineer the user 100 times and invariably 1 of them will bite. Try 1000000 times and that's a fair number of bites.

Yes but no (1)

Clovert Agent (87154) | about a year ago | (#43221933)

I think I understand his point, and I agree in part, but I also disagree. I think security awareness is good, but I think relying on it is bad.

First of all, I think there will always be situations where the security technology fails - social engineering is an obvious example - and ultimately the final barrier is the security smarts of the target. Anything which raises that barrier, even a little, is a good thing. The question, obviously, is whether the benefit is worth the cost of the training.

And secondly, I think in general that making people more aware is always good. People are way too trusting, and that covers the gamut from clicking dodgy attachments to falling for Ponzi schemes. I think it's good to teach people to question more, to think critically, and to be risk-aware. And by "teach people" I mean "starting in primary school".

Security training is more than systems (2)

Chrisq (894406) | about a year ago | (#43221939)

It's about things like the call-centre operator who gets a call saying "Can I check my balance ... yes, here are the details ... and while you're on, can you tell me my wife's balance too?" It's about the shop assistant in a phone shop who has someone asking for a replacement for a phone they just flushed down the toilet -- they're desperate, miles from home and have no ID on them, but expect an urgent call from their aunt in hospital so need a replacement on the same account. It's about the middle-level IT manager who gets a call from a very annoyed board director who says his password doesn't work and you'd better reset it now or heads will roll. It's about not lending your access card to a visitor so they can go to the canteen because you are too busy to take a break.

Security training is very important, but it shouldn't concentrate solely on systems.

Re:Security training is more than systems (1)

L4t3r4lu5 (1216702) | about a year ago | (#43222257)

It's about the middle-level IT manager who gets a call from a very annoyed board director who says his password doesn't work and you'd better reset it now or heads will roll.

Similar situation in a previous job: I was a tech for a secondary (high) school. The Headteacher (Principal) called while off site and asked for the local admin password for the laptop as he'd forgotten the password he'd set on the user account he was given. I, being an employee, gave it to him and thought nothing of it.

The next day I explain the situation to the network manager and he goes MENTAL at me about data security and all manner of other policies, stating that the local admin password was also used in other places and it was a massive problem to reveal it.

He didn't like me pointing out that password reuse was far worse than allowing local admin access to the machine (it was a loaner and would be re-imaged before being sent out again). I was performance-managed out of that job.

TL;DR: We could all do with taking our own advice once in a while and ensuring that what we give as instruction passes basic sanity checking for everyday use. Otherwise systems will be subverted or compromised just to get the job done.

The worst thing (4, Insightful)

drolli (522659) | about a year ago | (#43221955)

is that many companies are too lazy to even get the most fundamental things right. Why on earth would you not distribute your own CA for your internal web services? Do you really want to train your employees that clicking the "accept certificate" button is an everyday thing to do? Why don't you manage the security settings so that "content from an unknown source" is not "content from your own file server"? How the hell should the office assistant know that this is dangerous and theoretically unusual if in everyday work the instruction says to accept it several times per day? Why would you enable macros in office documents for no reason and not sign the document?

All security training hints like "be careful when opening attachments from unknown sources" are annihilated if you train your employees every day to do the exact opposite thing, namely by constructing workflows and selecting toolsets which require exactly that.

My 2 cents on this

a) If there is a "do not use/do x" in your security education, then something is wrong. The right way is "use/do y"

b) Construct your standard processes in a way that your users/employees can work securely *AND* efficiently.

c) If there are new tools and your users demand them, keep an open ear! Note to management: reserve some budget for it. If users find Dropbox an efficient service, the right way is not to forbid it but to ask yourself why you can't provide any decent file sharing on your own servers.

Re:The worst thing (2)

gnalre (323830) | about a year ago | (#43222053)

James Lyne [sophos.com] once said that he changed the standard security certificate dialog to say "by clicking this you kill 1000 kittens".

No one raised an issue, not even IT.

Which goes to show how pointless the dialog is and how little it actually adds to security.

Re:The worst thing (1)

drolli (522659) | about a year ago | (#43222311)

The dialog is pointless because nobody does it right. People would pretty quickly learn that it does not kill 1000 kittens on average.

Correct would be to write: one in a hundred times, clicking on this will cause a malware infection. If it does, the IT department will send 1000 dead kittens via in-house mail to your desk. That's 10 kittens on average per click. Good luck.

I am sure that after burying the desk of some office assistant under dead kittens once or twice and posting it on the company's homepage, you would have your employees' attention.

Re:The worst thing (1)

bickerdyke (670000) | about a year ago | (#43222307)

d) if you set up security policies, ENFORCE THEM!

Or hire "security aware" people and trust them.

That's related to your point where employees are used to processing files from untrusted sources, but receive training not to do so.

Tools are a good example of that. Two out of three companies I worked for had a whitelisted set of tools you were allowed to install. It never contained the full set of tools you needed to do your work, nor the newest versions. So you were completely left in the dark about whether you were allowed to accept this auto-update or not.

The third company went along the lines of: we've hired expert developers, they all grew up with PCs and have their own machines at home -- who if not them should be trusted to know what tools they need and to discern useful tools from BonzoBuddies?

So make up your mind. Set good, enforceable rules that work without exceptions (and go all the way and invest in a software deployment system!) or train & trust the users' judgement.

Re:The worst thing (1)

packrat0x (798359) | about a year ago | (#43222457)

You need both methods.

Two out of three companies I worked for had a whitelisted set of tools you were allowed to install. It never contained the full set of tools you needed to do your work, nor the newest versions. So you were completely left in the dark about whether you were allowed to accept this auto-update or not.

This is the setup for employees who do not handle files from the outside world and only need internal networks.

The third company went along the lines of: we've hired expert developers, they all grew up with PCs and have their own machines at home -- who if not them should be trusted to know what tools they need and to discern useful tools from BonzoBuddies?

This is the setup for employees who regularly work with outside files.

Re:The worst thing (0)

Anonymous Coward | about a year ago | (#43222621)

You're right. But note that you are not mixing the concepts for a single employee.

That balance between usability and security (1)

erroneus (253617) | about a year ago | (#43222001)

These are invariably give and take.

People simply need to be smarter. They aren't. No amount of precautions that do not inhibit functionality will help. People want to do what they want to do. The weak link is almost always the people, and you can't control them with computers. You can limit what they do, but then you're encroaching on usability.

Re:That balance between usability and security (0)

Anonymous Coward | about a year ago | (#43222031)

Best personal example of the system needing to stop users from picking bad passwords: every computer where I work has a post-it note with the group login because the employees don't want to make the effort to remember it. The password? The company's three-letter initials plus 123 -- so if we were IBM it'd be "ibm123". I tear these down every time I see them, but they keep coming back. In one case a manager had stickers ordered to be affixed to the monitors so he wouldn't have to deal with "I can't remember the password" calls from the night shift.

somebody needs training (1)

Anonymous Coward | about a year ago | (#43222083)

The end user training may be a waste, but we definitely need security training for management. More than once I have implemented systems that require strong passwords, hash those passwords, and perform strict certificate validation only to have the customer demand no password requirements, clear-text stored passwords, and lax certificate checking because they are lazy and their IT people are incompetent.
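For reference, the "hash those passwords" part that customers keep vetoing is only a few lines with the Python standard library. A sketch using PBKDF2 (a production system might prefer bcrypt or Argon2, and would encode salt and iteration count into one stored string):

```python
import hashlib
import hmac
import os

def hash_password(password, *, iterations=600_000):
    """Store the salt and iteration count alongside the digest, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password, salt, iterations, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # compare_digest avoids leaking where the mismatch occurs via timing
    return hmac.compare_digest(candidate, digest)

salt, iters, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, iters, digest))  # True
print(verify_password("ibm123", salt, iters, digest))                        # False
```

The per-user random salt and the deliberately slow derivation are exactly what make a stolen password file expensive to brute force offline.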

Exactly correct (2)

AdmV0rl0n (98366) | about a year ago | (#43222131)

He is correct. User training is largely a waste of time, and both in development, and deployment, the systems are not designed or setup for security. So yes, users clicking a link is not safe, and it should be. Users opening an application and reading data should be safe, but isn't.

These problems have to be engineered out. They cannot be socially controlled out; the audience has neither the inclination, the knowledge, nor the interest to resolve this. And even after training, once it's established how you've trained your monkeys, a new method will be devised that undoes the training.

The whole industry is still in its infancy. It's building bridges made of cardboard, without any form of certification or regime. This will only be resolved when it becomes apparent that software providers cannot ship things like 'our software cannot be held accountable for anything, have a nice day'. Nobody in the world making bridges gets away with 'if this bridge falls down, we are not accountable'.

The Adobe and Java scenario is exactly like this. Both are wholly unaccountable, and yet frankly directly responsible for perhaps billions upon billions of dollars of data loss, theft, security breaches, and so on.

There is no _fundamental_ reason why people should even bother to make their software secure -- so they only apply a baseline effort to the task. Until this is addressed, the lather, rinse, repeat cycle will continue. And it's actually why the security landscape is degrading. Things like Metasploit may have seemed to help, but fundamentally the white-hat hacking and info security folks have ultimately not helped. They're only highlighting how bad things are, putting guns in hands that should not have them, and making things globally worse. The vendors have not changed very much.

in a perfect world..... (1)

Blaisun (2763413) | about a year ago | (#43222149)

This is a rose-colored-glasses view. If everything was perfectly designed, perfectly implemented, and used by knowledgeable users, sure, that might work. We live in an imperfect world run by lowest-bidder-wins, quickest-product-to-market, "good enough" security to not get us sued. This will not change. As long as the focus of a product is to make someone money, it will only be done "good enough", with the focus being minimum invested for maximum return. I believe the security of products is getting better all the time. But the majority of the time, the weakest link in the chain of security is the user. Why do you think social engineering is so widely used? Simple really... BECAUSE IT IS EFFECTIVE. Why go through all the effort necessary to exploit a system when a simple phone call can net you the same result? Technical security can only protect you so far... you have to involve users in your security plan or you are simply keeping your head in the sand.

Shit or get off the pot Bruce... (1)

RocketRabbit (830691) | about a year ago | (#43222151)

What's this we shit? Why don't you practice what you preach and design a system from the ground up with enhanced security in mind?

I mean, it's not as if you are saying anything that hasn't been said for what, almost 20 years now. I know all the flitting back and forth to conferences and whatnot is exhausting, but to me, Bruce, you are becoming more of a P.T. Barnum of security and crypto every day. Self-promotion and loud noises / flashy things, but in the end it's all just rehashes of what other people said before.

But What Can You Do? (-1)

Anonymous Coward | about a year ago | (#43222161)

In the streets there's no wrong and no right
Buy your kicks from the man in white
Feels all right
Then you see the girls with the dresses so tight
Who give you love if the price is right
Feels all right
Power pleasure in your nose tonight
It's not reality
It's just a fantasy
Feels all right

Back at work
I get paid for nothing
Feels all right

Whatever happened to Yusuf Islam?

Because software can stop human stupidity. (0)

Anonymous Coward | about a year ago | (#43222195)

By far, the weakest link in almost any programme nowadays is the user. Most times people claim their account has been "hacked", it's because they fell for a phishing scam. I'm sure some companies are doing it hideously wrong, but if a couple of hours of training stops even one user from giving up their password, it could save hundreds of hours of work, never mind all of the other wonderful legal consequences.

But no, let's just give up entirely, and let the scammers win.

Completely useless? (1)

Dereck1701 (1922824) | about a year ago | (#43222241)

You'll never make a computer system completely idiot-proof; a more impressive idiot will ALWAYS come along. "Security awareness training", or at least some pamphlet handed out to the departments, is only going to help. While it is very true that the primary focus should be on securing the system as much as possible, letting the users know some of the simple rules to follow to help keep it secure is always a plus.

Don't use passwords (1)

zaax (637433) | about a year ago | (#43222247)

The main failing of passwords is passwords themselves -- so get rid of them. Simples.

So in a perfect world we wouldn't need it? (1)

Njovich (553857) | about a year ago | (#43222259)

I read the points he is making, and I respect Schneier, especially in terms of the work he did earlier.

He makes some interesting discussion points, but it mostly seems to boil down to this: we have to fix things from an engineering perspective and let the rules of thumb about security spread by osmosis.

I would say that, while there are still gains to be made at the engineering level, for many organizations serious about security the low-hanging fruit has mostly already been taken care of. Going further would often require complete reorganizations of the way they work, all their applications, and their network infrastructure. That is simply not an option on the table right now for most organizations. Also, a lot of the current weaknesses come with at least some level of social engineering. Making sure people report phishing attempts to the right people, pointing out coworkers not wearing their badges, or keeping security basics in mind -- these are things people will only do if they believe they have some importance. A proper security awareness training can give them that. Yes, you cannot teach everything there, but if done properly, you will give people more willingness to do something. And that can make a lot of difference when you deal with advanced threats (I hate the APT term, but have yet to come up with something more appropriate).

Yes, 95% of security awareness trainings suck, but let's face it, 95% of everything sucks. That doesn't mean there is nothing useful to convey.

Then again, Schneier has always seemed a bit out of touch with the reality in organizations, so it's amusing to read this from him (from TFA):

To those who think that training users in security is a good idea, I want to ask: "Have you ever met an actual user?" They're not experts, and we can’t expect them to become experts.

Well, I have met actual users, and I would say they could learn some basics like 'don't give your password in exchange for a chocolate bar offered by the coughing guy in the trench coat in the parking lot'.

Partially Disagree (1)

Evil W1zard (832703) | about a year ago | (#43222277)

Security training is a necessity, but it's almost always done incorrectly. As much as it shocks us, there are still hordes of workers who have no idea what spearphishing is or why anti-virus doesn't wholly protect their computer. My belief is that once a year, and at the employee's start date, you have an online brief going over basic security and what to look for, reinforce the fact that the network and individual systems are monitored, and let them know what the penalties can be for not practicing what they are learning. You make it so they have to answer a question every 2 or so slides so they can't just click through, and then the kicker: if they don't pass, they don't get to take the test again. Everyone who fails has to go to an in-person briefing with security and corporate leadership. Guarantee more attention is paid to the content when there's the possibility of looking like a dummy in front of the bosses (and yes, I know the bosses will probably fail too...).

And of course everyone should agree that better security implementation within systems, networks, apps, processes, etc. should be accomplished. That's a no-brainer. But by no means should we just disregard trying to ensure that the user base -- who have never heard of half the shit talked about on Slashdot -- have some kind of basic knowledge of what can go wrong when they open furry_kittens.flv on their work machine...

How to get along with the job-creation rhetoric? (0)

Anonymous Coward | about a year ago | (#43222291)

Schneier is right, but such wise advice doesn't play well with the industry rhetoric of "creating jobs". It's harder and less lucrative to train good developers than to send cops out on patrol and have users buy silly security patchwork. The problems of the security industry are buried deep in its own reasoning and opportunistic behaviour.

Really? (0)

Anonymous Coward | about a year ago | (#43222321)

But some people really are security idiots who click on phishing sites!

Targeted Ads at their best (3, Funny)

JSC (9187) | about a year ago | (#43222337)

And what do I see just to the right of the lead-in about how Bruce Schneier says security awareness training is a waste of time? An ad for Kevin Mitnick's Security Awareness Training.

It's all about presentation (1)

Quakeulf (2650167) | about a year ago | (#43222433)

It's all about how you present the security awareness. Start by asking a simple question: "Do you care about your profile/account/access?" Then keep it simple from there. Just one or two one-line paragraphs or bullet points, or a video lasting max 30 seconds. Use emotions and feelings and pack it all up with kittens and upbeat indie music. That is how you get it into the skulls of the mediocre masses.

If we could just..... (2)

danskal (878841) | about a year ago | (#43222447)

This point of view smacks of "if we just worked a bit harder/longer we'll be able to build a perfectly secure system".

It ain't gonna happen. Not for a system as sprawling as the Internet, not for a system with requirements as complex as an operating system's.

The more you know about security, the easier it seems to do what is required to improve security - but you have to have very tight control of platforms to be able to follow through on implementing that security. And tight control prevents innovation. Often, security reduces the usefulness of a product.

Convincing everyone in the IT world that they need to spend $ on educating developers and implementing security features is an insurmountable task - and even if you manage it, you still won't be done, because the security issues we understand now and have fixes for are only a subset of all security issues. New types of holes will be found continuously.

Of course, end user training might still be a waste of money - I can't deny that.

Yes, and no. (1)

asdf7890 (1518587) | about a year ago | (#43222487)

systems that don't care what links a user clicks on

Definitely. As far as is possible we should stop users accidentally doing something stupid by making sure they can only do the right things. This is not always practical, though, as for a start there are factors outside our control (for the password example, we can't control how the user might store and potentially distribute their credentials in other services (password managers) or in the real world (bits of paper)).

systems that won't let users choose lousy passwords

I can't see a way that could be implemented which is not essentially an attempt to enumerate the bad, which is never a good idea. And even if it worked for the most part, some of the things that make lousy passwords are again well out of our control: there is no way "don't use the same credentials for everything" can be enforced in software.

Security awareness is a lot more than just properly managing passwords and such -- there are real-world interactions that users need to be aware of, so some training is definitely needed no matter how close to perfect the security in your applications is.
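To be fair, a partial filter is still feasible even if full enumeration of badness isn't: a minimum length plus a lookup against known-bad passwords, NIST-style. A toy sketch in Python (the blocklist here is a tiny stand-in for a real breached-password corpus, and, as noted above, cross-site reuse stays invisible to any server-side check):

```python
# Stand-in for a real breached-password list (e.g. millions of entries).
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "ibm123"}

def password_acceptable(password, blocklist=COMMON_PASSWORDS, min_length=12):
    """Reject only what the server can actually see: too short, or known-bad.
    Credential reuse across sites remains a training topic, not a code one."""
    if len(password) < min_length:
        return False
    if password.lower() in blocklist:
        return False
    return True

print(password_acceptable("letmein"))                       # False: short and common
print(password_acceptable("correct horse battery staple"))  # True: a long passphrase
```

Note this enforces a floor, not a ceiling: it can't guarantee a good password, only screen out the worst ones.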

does not cover social engineering... (0)

Anonymous Coward | about a year ago | (#43222605)

Sure, I agree the system should prevent the user from picking bad passwords or clicking phishing links... but what about social engineering attacks? That's way more of an issue.

Wall to wall Schneier (0)

Anonymous Coward | about a year ago | (#43222645)

I've seen Schneier all over the place recently - does he have a new book coming out or something? Why is he suddenly so visible?
