
Ask Slashdot: How Can We Create a Culture of Secure Behavior?

Soulskill posted about 6 months ago | from the start-giving-$50-citations-for-bad-passwords dept.

Security 169

An anonymous reader writes "Despite the high news coverage that large breaches receive, and despite tales told by their friends about losing their laptops for a few days while a malware infection is cleared up, employees generally believe they are immune to security risks. They think those types of things happen to other, less careful people. Training users how to properly create and store strong passwords, and putting measures in place that tell individuals the password they've created is 'weak' can help change behavior. But how do we embed this training in our culture?"
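
The "tell the user their password is weak" part is easy to prototype. Here is a minimal sketch in Python of entropy-based feedback at password-creation time; the thresholds and the character-class model are illustrative assumptions, not any particular product's rules, and a real checker would also reject dictionary words and common patterns.

import math
import string

def estimate_entropy_bits(password: str) -> float:
    """Rough entropy estimate: log2(charset size) times length.
    Ignores dictionary words and patterns, so it overestimates strength
    for passwords like 'Password1!'."""
    pools = [
        (string.ascii_lowercase, 26),
        (string.ascii_uppercase, 26),
        (string.digits, 10),
        (string.punctuation, len(string.punctuation)),
    ]
    charset = sum(size for chars, size in pools if any(c in chars for c in password))
    if charset == 0:
        return 0.0
    return len(password) * math.log2(charset)

def feedback(password: str) -> str:
    # Cutoffs are assumptions for illustration only.
    bits = estimate_entropy_bits(password)
    if bits < 40:
        return "weak"
    if bits < 60:
        return "fair"
    return "strong"

if __name__ == "__main__":
    for pw in ["waffle", "wafflebunny", "correct horse battery staple"]:
        print(pw, "->", feedback(pw))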


This approach has gone nowhere for years (5, Insightful)

Anrego (830717) | about 6 months ago | (#46816973)

Users are gonna do stupid things when it comes to security. Trying to fix that is a noble goal, but good luck.

The direction we need to keep going towards is idiot proofing. Assume the user will screw up and mitigate or eliminate the impact.

Re:This approach has gone nowhere for years (1)

Anonymous Coward | about 6 months ago | (#46817079)

Amen. And it is not just about idiot users either. It is basic human psychology. We are all wired to do insecure things at times. We need to engineer around this vulnerability.

Re:This approach has gone nowhere for years (2)

Geoffrey.landis (926948) | about 6 months ago | (#46817899)

In general, this is because IT departments are dictatorial about forcing users to do "security" requirements that do little or nothing to improve security.

Re:This approach has gone nowhere for years (4, Informative)

lgw (121541) | about 6 months ago | (#46817937)

Preach it! You cannot try to fix a software problem by fixing the users. Requirements for strong passwords have no place in modern security. A 4-digit PIN works great for my ATM card, because of the combination of:
* Two-factor auth
* Good, fast system for repudiation and reclamation
* Many, many back-end processes in place to limit harm

Is your IT system set up this way? Why not? Two-factor auth is easy, off-the-shelf stuff these days. Sharply limit password tries before account lockout, and abandon any thought of strong passwords, changing passwords, and so on - all of that is accomplished by the certs (and rotation thereof) on the second factor. The user's password is just there to make it OK if the second factor is stolen, during the time before the user reports it.

Everyone's "real" password is crypto-strong, because there's a properly generated cert involved, rotated at IT's discretion with no burden on the user. But people only need to remember something easy, just something that would take more than 3 tries to guess.
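
A rough sketch of the "few guesses, then lockout" half of that scheme, in Python. The account structure, the 3-attempt limit, and the PBKDF2 parameters are illustrative assumptions, not a description of any real system; the point is only that the password has to survive a handful of guesses when a second factor carries the real cryptographic weight.

import hashlib
import hmac
import os
from dataclasses import dataclass

MAX_ATTEMPTS = 3  # the password only needs to resist a few guesses

@dataclass
class Account:
    salt: bytes
    password_hash: bytes
    failed_attempts: int = 0
    locked: bool = False

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 from the standard library; iteration count is illustrative.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def make_account(password: str) -> Account:
    salt = os.urandom(16)
    return Account(salt=salt, password_hash=hash_password(password, salt))

def try_login(account: Account, password: str, second_factor_ok: bool) -> bool:
    """The second factor (cert, token, etc.) is checked first; the password
    just has to hold up for a handful of tries before lockout."""
    if account.locked or not second_factor_ok:
        return False
    if hmac.compare_digest(hash_password(password, account.salt), account.password_hash):
        account.failed_attempts = 0
        return True
    account.failed_attempts += 1
    if account.failed_attempts >= MAX_ATTEMPTS:
        account.locked = True
    return False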

Re:This approach has gone nowhere for years (4, Interesting)

PRMan (959735) | about 6 months ago | (#46818169)

How many ATM heists and skimmers have there been over the past 10 years? I'd hardly say it's working WELL.

Re:This approach has gone nowhere for years (3, Interesting)

lgw (121541) | about 6 months ago | (#46818317)

It's working quite well. The cost of all that is very low on the scale of the banks and that's what matters. It's simply not about "0 incidents", it's about limiting the damage to little enough that it's not important.

Partly that depends on the bank, of course, as some are total dicks about it if your card gets skimmed, but that's a customer service problem. Detecting the problem, limiting the cost, and so on are all important systems that banks take seriously. And the banks are gradually making systemic, low cost changes to reduce the ease of skimming, or of hacking an ATM, but they're not in a hurry as it's just not that expensive of a problem (how many ATM heists to equal a single mortgage default?). More importantly, they're not trying to fix their customers!
 

Forget idiotproofing, how about licensing (0)

Anonymous Coward | about 6 months ago | (#46817123)

Cars are not idiot proof, but we require that people be licensed and pass a test to drive them.

Of course it will be the death-blow of the free-and-easy interwebs that we love so much, what with them pesky net-cops passing out tickets for unsafe behavior.

Sigh... every frontier has seen its freedoms fade as the masses trounce forward. I suppose that this was inevitable.

Unless... unless... we could just freaking expect people to not act like total asshats, follow some simple rules and accept that they are going to get mugged if they do not follow the rules...

naw, that could never happen

Re:Forget idiotproofing, how about licensing (0)

Anonymous Coward | about 6 months ago | (#46817231)

Because reasons [penny-arcade.com]

Also, this entire "logged in on the front page, not logged in on the comments page" sucks major ass.

Re:Forget idiotproofing, how about licensing (1)

Anrego (830717) | about 6 months ago | (#46817561)

The problem with that analogy is that we still have car accidents, many of which are serious.

Re:Forget idiotproofing, how about licensing (0)

Anonymous Coward | about 6 months ago | (#46818177)

Cars are much, much easier to use than computers, and we require months of training. After all, cars have only one purpose: carrying people and stuff around, and only a few controls are required to operate them.

Would you rather have users train for years before they can use computers? (Not programming, but merely using.) Sorry, but that ship sailed long ago.

Re:Forget idiotproofing, how about licensing (1)

pr0fessor (1940368) | about 6 months ago | (#46818329)

I have never heard anyone say

I had a download that was just flying when a kid on a bicycle came out of nowhere and I had to crash my computer to avoid hitting him.

Firefox crashed, my wrist was broken in two places and I got a concussion, I was lucky compared to the guy with internet explorer.

Re:This approach has gone nowhere for years (2)

drakaan (688386) | about 6 months ago | (#46817125)

Seconded. The people that understand the risks generally don't represent a problem, but the people that don't understand them often also don't benefit from an explanation in a way that would change their behavior. Computers are not magic, but many people believe that they are. They also believe that antivirus software catches every single bad thing before it happens.

Re:This approach has gone nowhere for years (2, Insightful)

Anonymous Coward | about 6 months ago | (#46817519)

It's not that. Most people know that data breaches happen, like the Target one that was all over the news a bit ago.

The problem is that the security advocates make (seemingly) random behavioral demands that awkwardly often do not actually enhance security if followed. (I'm thinking of the entropy-neutral "strong password" dogmas)

When you make a system change that affects other employees, let them know why. When you propose a policy change for security purposes, defend it in front of a crowd of those affected. If you missed the trend, treat the other employees as equals (even if you don't believe they are) and explain why you are changing the firewall to block bittorrent at work or whatever change you have in mind.

Mod parent up. (2)

khasim (1285) | about 6 months ago | (#46817571)

The people that understand the risks generally don't represent a problem, but the people that don't understand them often also don't benefit from an explanation in a way that would change their behavior.

And in the corporate world there is the problem of status. People higher on the hierarchy do not like being told that they cannot do something by people lower on the hierarchy.

And if something goes wrong then it is YOUR fault because "security" was YOUR responsibility.

Computers are not magic, but many people believe that they are.

The problem there is that software has all the problems of a magical system. If you do A, B and C and then expect D to happen ... maybe it will, maybe it won't. Had you previously done X, Y or Z without rebooting?

There was a CAD program that had a problem with memory fragmentation. Even if you closed the previous files, eventually you ran out of contiguous memory and then your computer would complain about "issues" when you tried to open a file larger than your available contiguous memory. So first thing in the morning everything was fine. But around lunchtime things got weird. And the weirdness wasn't evenly distributed. On Monday, Alice would have a problem but Bob would work fine. On Tuesday Bob would have a problem but Alice would be fine. Etc. .....

And that was a problem that I could diagnose. There are hundreds more where all I can say is "perform the rite of reboot", and only open the app you're having trouble with right now, and let me know if it's still having trouble... my god, what are all those apps loading on start-up?

Re:This approach has gone nowhere for years (2)

jovius (974690) | about 6 months ago | (#46817429)

Exactly. What helps is a step by step process which doesn't allow any missteps, and which guides on the way. Encryption is perceived as sorcery; something summoned by the high priests. Even a shortcut key combination and a password is too much. Strong passwords are hideous monsters from the netherworld anyway. The concepts are too complicated. They need to be hidden away or in some way built in. Maybe a key analogy would work, something like the final key [cyberstalker.dk] or similar setup.

Anyway, the process should function as a learning platform for all. In the corporate world the security culture is often found only in the IT department proper, and everybody else is more than happy to throw out their responsibility for the matter, because the days are too busy anyway. What is needed is a common vision of what IT security is and why it is so important. If users know why it matters, the process becomes natural.

Re:This approach has gone nowhere for years (0)

Anonymous Coward | about 6 months ago | (#46818099)

>The direction we need to keep going towards is idiot proofing.

No, this is equally doomed to fail for reasons that can be pithily explained by two quotes:

"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots."

and

“If you make a system simple enough that an idiot could use it, only an idiot would want to use it.”

Re:This approach has gone nowhere for years (1)

whit3 (318913) | about 6 months ago | (#46818129)

Truly, it is foolish to think millions of 'users' can be handed the security problem and advised to take action individually.

We should all cringe in horror when we hear that millions of nontechnical users are being encouraged to 'take the problem seriously'. It's like asking all the residents of an apartment building to safety-check the steam boiler (probably only one or two will want to tighten the relief valve spring).

There have been attempts to 'take the problem seriously' with draconian legal sentences: that, too, is doomed. The law moves too slowly, and relies on things, like electronic documentation, that can be SO easily corrupted.

There have been attempts to 'take the problem seriously' with proliferation of passwords, and password-generating rules and password replacement schedules, and by moving controls into obscure places (what port do YOU open for SMTP?), which entirely miss the target of security, because the poor user has to write those things down (I know I have to!).

Instead, we should be building institutional watchers and code (walls, if necessary, and alarms, and a few traps) to deal with such issues. Sadly, government and commercial interests aren't good for personal computer security; we need OTHER institutions.

If only there was a template for this (4, Funny)

Krishnoid (984597) | about 6 months ago | (#46816977)

Perhaps we could take the lead from government departments already tasked with maintaining security, hold on, let me google this ... I'm finding 'Transportation Security Administration' and 'National Security Agency'. That should be a good start.

Good morale, perhaps? (2, Interesting)

Anonymous Coward | about 6 months ago | (#46817025)

In my experience, a company with high employee morale has people who will tend to listen and follow security procedures, even when it might be time consuming. Even small things like stopping someone who slips past a door without badging in, or asking who someone is who is in a building without some ID.

With poor morale, there isn't much incentive for people to bother with security. I've seen companies try to save money by offshoring... then lose a lot more due to breaches than they would have spent by keeping existing talent in house.

Re:Good morale, perhaps? (3, Insightful)

bhcompy (1877290) | about 6 months ago | (#46817511)

Time consuming = won't do it. I've got enough things to worry about with all the bullshit administrative tasks I have to do to accomplish my non-administrative job. Give me security that doesn't force me to do more work, like encrypting my drive, single badge identification (no separate key fobs for doors I should have access to anyways), automatically encrypting my attachments, forcing me to change my password every 30 days, forcing me to have different passwords for different resources because password requirements are different (some requiring special characters, some not allowing special characters), forcing me to change my passwords for different resources at different intervals, etc.

Re:Good morale, perhaps? (1)

bhcompy (1877290) | about 6 months ago | (#46817525)

err, fucked that one up good. All instances of forcing should be prefaced with "Not".

Start with the software developers and type safety (0)

Anonymous Coward | about 6 months ago | (#46817033)

We can start by making the software developers use type safe languages (Ada is one such example) so we have fewer of these problems to deal with in the first place.

Using C is irresponsible when better alternatives exist.

yeah, lemme see where was that in the requirments (5, Insightful)

Anonymous Coward | about 6 months ago | (#46817167)

Sure, just what devs need: more users, who never requested a feature in the first place, coming in and demanding that a particular language be used in the implementation because they read an article about how it's 'more secure'

Welcome to my nightmare, this rarely works out well

And for the inevitable, 'why didn't you make it secure in the first place' comment

fuck you, fuck you, fuck you and your childish, 'I changed my mind, I don't want it fast, I don't want it cheap, I want you to read my mind and know the future and give me something that I can't break because I am a fucking idiot... and I need it tomorrow' attitude that makes everything somebody else's fault

Re:yeah, lemme see where was that in the requirmen (1)

sinucus (85222) | about 6 months ago | (#46817303)

Where the hell are my mod points??!! I'd mod you up to 9000 if I could.

Re:yeah, lemme see where was that in the requirmen (0)

Anonymous Coward | about 6 months ago | (#46818159)

make the SOBs sign a finalized specification and then throw it in their face when they get stupid, saying, "OK, you can have the new stuff you now demand if you tell me what in this spec you don't want or how much more time you will give me". If they won't sign a spec or won't deal with the realities of mission creep, get their asses kicked from above.

Re:Start with the software developers and type saf (0)

Anonymous Coward | about 6 months ago | (#46817319)

Heartbleed would simply not have happened if OpenSSL was written in Ada or another type safe language.

That is an extremely convincing argument for abandoning C-style languages in favour of type safe ones when writing core libraries.

Re:Start with the software developers and type saf (1)

Minwee (522556) | about 6 months ago | (#46817415)

Heartbleed would simply not have happened if OpenSSL was written in Ada or another type safe language.

Right you are. Heartbleed happened because everybody was _using_ OpenSSL. Fix that and the problem goes away.

Re:Start with the software developers and type saf (1)

BreakBad (2955249) | about 6 months ago | (#46817775)

I used the same approach by requiring my users to tattoo their passwords on their foreheads. Eventually my user base dropped to almost zero... but for those who stayed I did see an interesting trend. Passwords like %uS*32Ldi# started prevailing, because passwords like wafflebunny make for an embarrassing tattoo.

Strong passwords == useless (1)

Dr. Crash (237179) | about 6 months ago | (#46817041)

Strong passwords are useless - well, they're useful only against a brute-force attack and that's not the big threat anymore. A 64-character password is worth nothing against a phishing attack, and is worse than nothing if you have to write it down.

Maybe the cure is to have the incoming mail server destroy all clickable links (or point them at an internal "you will need to navigate to that URL manually" warning page), and simply delete anything executable.
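
A toy sketch of that link-defanging idea, assuming an HTML message body and a hypothetical internal warning page; the URL, the regex, and the attribute name are illustrative only, and a production filter would use a real HTML parser rather than a regex.

import re

WARNING_PAGE = "https://intranet.example.com/manual-navigation-warning"  # hypothetical internal page

LINK_RE = re.compile(r'href\s*=\s*"(?P<url>https?://[^"]+)"', re.IGNORECASE)

def defang_links(html_body: str) -> str:
    """Point every link at an internal warning page and stash the original
    URL in an attribute, so the user has to retype it deliberately."""
    def replace(match: re.Match) -> str:
        original = match.group("url")
        return f'href="{WARNING_PAGE}" data-original-url="{original}"'
    return LINK_RE.sub(replace, html_body)

if __name__ == "__main__":
    sample = '<a href="http://evil.example.net/login">Reset your password</a>'
    print(defang_links(sample))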

Re:Strong passwords == useless (1)

jythie (914043) | about 6 months ago | (#46817287)

After that you need to cure customer support too since that is a common social engineering target. In fact you might have to wipe out tech support in general...

Re:Strong passwords == useless (1)

mlts (1038732) | about 6 months ago | (#46817347)

I've wondered about wider adoption of CAC-like cards for logging in, where the card reader (or even better, access tokens that work with a USB port) is standard on all new computers. This way, a host has a list of public keys for authorized users, rather than sensitive passwords (even if stored as salted hashes). Malware would then have to generate bogus signatures/decryptions with the user's access token, which is a lot more intrusive than just slurping a typed-in password.

Of course, this is a double-edged sword against anonymity, so this isn't a perfect solution. However, for SSO in a company, it might be useful.
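
The heart of that model is challenge-response against stored public keys instead of stored password hashes. A bare-bones sketch, assuming the third-party Python 'cryptography' package is available; the key pair is generated in software here purely for the demo, whereas on a real CAC or USB token the private key never leaves the hardware.

import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Host side: authorized users' public keys instead of password hashes.
authorized_keys = {}

def enroll(username: str):
    """Generate a key pair for the demo; normally this lives on the token."""
    private_key = Ed25519PrivateKey.generate()
    authorized_keys[username] = private_key.public_key()
    return private_key

def login(username: str, private_key) -> bool:
    """Host issues a random challenge; the token signs it; host verifies."""
    challenge = os.urandom(32)
    signature = private_key.sign(challenge)  # would happen inside the token
    try:
        authorized_keys[username].verify(signature, challenge)
        return True
    except (KeyError, InvalidSignature):
        return False

if __name__ == "__main__":
    alices_token = enroll("alice")
    print("alice logs in:", login("alice", alices_token))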

How is a password written down "worse than nothing (1)

Anonymous Coward | about 6 months ago | (#46818157)

Suppose I have a private office with a lockable door, do not anticipate being targeted for physical espionage, and personally know everyone who has keys (except the janitorial staff). How is writing 'horse correct battery staple' on a sticky note and putting it under the keyboard worse than leaving the password empty? This is exactly as effective as memorizing "348Chj#(hf.4%!g'; DROP TABLE Students; 'fh2^*Hcvbmmz" at preventing anyone who does not have access to my office from accessing my computer.

I worked in the CS division of a US National Lab last summer - yes, people there have left their laptops alone in a conference room while they go pee, and come back to find someone attacking their machine. We were under advisement to always, always, always lock screen if you're away. If we are worried about casual espionage attempts, I'll keep the sticky note in my wallet.

If you wish to invoke a scenario where either my home will be burglarized and/or I will be physically attacked so someone can steal my credentials, or my computer will be physically attacked and compromised, then we're past the point where storing the password only in my neural engrams is sufficient, so the argument is moot.

Start early on with training and rules (2, Interesting)

Anonymous Coward | about 6 months ago | (#46817061)

While it may seem draconian, the best way I've found is to start from the ground up with recurring training. Make the training mandatory, but unobtrusive, and ensure you get people to sign that they understand the rules. You'd be surprised just how much of a difference you get from anyone if you have a piece of paper with their signature on it; there just isn't the same value in an emailed "ok, I got it".

There is a delicate balance between security and convenience, so you need to make sure that whatever you do to your end users doesn't bother them too much. Having purely random passwords is sure to get them to write it down and stick it under their keyboard. Having too loose of passwords is what will get you on the front page. However, if you can give them some leeway while maintaining some length and complexity in the passwords (i.e. pointers on using passphrases or self-made acronyms), you can go a long way. You might make a game out of your training too, give out some cheap prizes like lollipops or something, for various categories of passwords that the users create as part of the training. Who can make the best 24 character password? Who can make the funniest 12 character? etc... Engage them, give them something to remember, but hold them accountable for their (lack of) actions as well.
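
On the passphrase pointer: generating one takes a few lines of Python. This sketch uses a deliberately tiny wordlist, so it is illustration only; a real generator would draw from a large list (diceware-style) to get enough entropy.

import secrets

# Tiny illustrative wordlist; a real deployment would use thousands of words.
WORDS = ["waffle", "bunny", "granite", "orbit", "velvet", "mustard",
         "harbor", "pixel", "cobalt", "thistle", "lantern", "quartz"]

def passphrase(words: int = 4, sep: str = " ") -> str:
    """Pick words with a cryptographically strong RNG, not random.choice."""
    return sep.join(secrets.choice(WORDS) for _ in range(words))

if __name__ == "__main__":
    print(passphrase())  # e.g. "cobalt waffle lantern orbit"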

Wrong question (2)

blue trane (110704) | about 6 months ago | (#46817067)

How can we create a culture where there is no incentive to hack or steal?

Re:Wrong question (0)

Anonymous Coward | about 6 months ago | (#46817127)

Invent the holodeck, that's the only way everyone can have anything.

Re:Wrong question (1)

tqk (413719) | about 6 months ago | (#46818413)

Invent the holodeck, that's the only way everyone can have anything.

Idiot. That buggy piece of crap damned near destroyed the ship almost every time it was used. They should've spaced the twit who came up with the idea before they had a chance to implement it.

Re:Wrong question (0)

Anonymous Coward | about 6 months ago | (#46817245)

First you create a cybernetic platform restricted by Asimov's Three Laws of Robotics. Then you upload Ray Kurzweil's consciousness and launch it into space with instructions to return to Earth once it's found the means by which to reproduce organically. By the time that's happened, man will have been purged from the biosphere, and whatever's left and has evolved into the next iteration of Gaia might be able to support such a concept.

In the mean time, the discussion's about imposing secure behavior onto an inherently insecure kludge that would have to be rebuilt from the ground up in order to facilitate such a culture. Either that or technology must become a religion where everyone's a priest. Where's the fun in that? Religions require someone to dominate.

Re:Wrong question (0)

Anonymous Coward | about 6 months ago | (#46817295)

First, you kill all the people... problem solved!

Re:Wrong question (0)

Anonymous Coward | about 6 months ago | (#46817467)

You let us know when you find a solution. In the meantime, the rest of us will work on looking for a solution to a problem that can actually be solved.

Re:Wrong question (2)

jbmartin6 (1232050) | about 6 months ago | (#46817573)

You are right, this is the better question. Why do we have a world where a few pieces of information that are effectively public have any sort of value? I have to tell my address, phone number, SSN, and so on to every bank, doctor, potential employer, landlord, and so on. Yet we continue to delude ourselves that somehow the information is going to remain secret. Well, 30+ years of "the bad guys are winning" shows that keeping (essentially) public information secret just isn't going to happen.

Look at it from another perspective. Since I am not liable for false charges on my credit card, I don't care much at all about keeping that number secret. It is the bank's problem, not mine. (I suppose if I just posted it here on /. the law might be different though, since that is an intentional thing). And frankly, if we look at the numbers, not caring very much might be the best strategy. So the Target breach involved say, 200 million people. How many were impacted by anything more than some false credit charges? The banks paid the price for their failure to implement a better system, card holders are not liable for those charges.

What this story is saying is, why don't billions of people change their behavior instead of a few dozen financial institutions? I think we can see why the banks want us to ask questions like in the article. I am asking why we don't ask a few banks to change their process instead of an unrealistic expectation that human nature is going to change.

Yes, I understand I made a few oversimplifications and left some stuff out. Only for brevity, I assure you, I think the core point is still solid.

Stop using passwords (0)

flightmaker (1844046) | about 6 months ago | (#46817083)

It's high time we stopped using the term 'password'. Those in the know realise by now that a word or words is no good.

I'd like to suggest replacing the term with 'passcode'. For those who still use passwords, it might encourage them to cease and desist. Or maybe not, but it would surely be worth a try.

Re:Stop using passwords (0)

Anonymous Coward | about 6 months ago | (#46817413)

Depends on the application. A password like the one you use to log into your home Windows box can be weak. There's no Internet-facing "door" that uses that password.

Re:Stop using passwords (1)

Minwee (522556) | about 6 months ago | (#46817471)

I used to use passwords like "love", "sex", "secret" and "god", but now that we have switched to passcodes I just use "12345".

A Well-Informed Workforce is Key. (4, Insightful)

Sight Training (3626165) | about 6 months ago | (#46817089)

This is a great question, and one that plagues businesses of all sizes. Based on our experience writing security training and consulting companies on the best ways to plug the security holes in their organizations, it comes down to three things:

1) Spelling it out: A proactive approach to security awareness includes open lines of communication, telling employees exactly what sorts of things to look out for. One major mistake that corporations often make is assuming too much: mainly, assuming that their employees know how to identify malicious situations over the phone or through email. Instead, spell out the situations that may trip them up, either through policies or training.

2) Repeat, repeat, repeat: Even in companies that make a concerted effort to raise security awareness among workers, there is a tendency to backslide into comfortable complacency unless the danger is kept at the forefront of their minds. This doesn't have to be onerous for management or irritating to employees, since there are so many effective ways to make security awareness a part of a worker's daily experience. E-newsletters, security briefs, and clever, eye-catching security awareness campaigns are a few ideas.

3) Create a culture of teamwork: Often, corporate environments in large companies use impersonal policies to "teach," hoping to generate desirable behaviors with a "Don't think, just do" mentality. This approach makes employees feel like a tiny cog in a huge machine, a piece not worthy of more than minimal information. Smart employers give employees more credit. An attitude of inclusion should permeate every policy, every training campaign, and every common area. A real "good guys vs. bad guys" attitude makes everyone feel like part of a team that is working toward the common goal of security.

It's a challenge (1)

Stéphane V (3594053) | about 6 months ago | (#46817091)

Good luck with that 40-year-old secretary who still holds old behaviors at heart. Computers have good memories; people have crappy, shitty memories. That's why they tend to use words or something similar to what they know instead of gibberish from a random password generator for their security. I've seen people in high places who hold sensitive info that could easily get a person killed if it leaked, and they still used weak passwords... I've tried to tell them everything I can about good behavior, and it's a difficult challenge.

Make computers harder to use. (1)

Nutria (679911) | about 6 months ago | (#46817095)

Then the people who don't deeply care about using computers properly won't use them except for boring business stuff, and then we can replace Windows with z/OS or OpenVMS and all those PCs with terminals.

Eh, kind of wrong (0)

Anonymous Coward | about 6 months ago | (#46817099)

Security isn't a destination! It's a journey! Often with potholes, tornadoes, zombies, and other obstructions along the way.

You want your environment secure? Implement rigorous security training for all personnel, and make sure your admins are on their game! I.e. pay them enough to warrant the time that's required for that type of time/knowledge investment across your entire enterprise.

Are your Admins following the bug reports, ON EVERYTHING, or actively searching for firewall or software holes?

Answer: Onagawa Nuclear Station (0)

Anonymous Coward | about 6 months ago | (#46817141)

Come on, if this has not been posted to Slashdot recently, it should have been.

http://www.nucnet.org/all-the-news/2014/03/17/safety-culture-protected-japan-s-onagawa-nuclear-station-researchers-say

How Can We Create a Culture of Secure Behavior? (0)

Anonymous Coward | about 6 months ago | (#46817147)

...by monitoring everything, duh!

The technology has to change (1)

petes_PoV (912422) | about 6 months ago | (#46817163)

Security is a pain. It slows you down. It gets in the way. It makes you jump through hoops and it is inconvenient. If I had to spend as much time unlocking my front door as I do logging into some websites (ones that don't even contain any information I value), I'd probably leave it open a lot more often.

So until the software (or hardware) necessary to make systems more secure improves a great deal, people won't use it. I can't say what the benchmark is for user tolerance / acceptance, but if I had to guess I'd say it was about 1 second of "automatic" activity, zero intellectual input and one simple mechanical movement. Implement that and you've probably invented computer security.

Re:The technology has to change (2)

mlts (1038732) | about 6 months ago | (#46817365)

Sometimes good security isn't a pain. Had client certificates been used more often, or had websites simply asked the user to PGP/gpg sign a blob of text to log in, passwords would be less critical.

With a client cert, almost all authentication troubles go away. However, client certs are troublesome for users to manage (have to remember the key's password as well as copy the private key to every device in advance), so it comes at a cost, although if people got as used to it as they are used to the like button, it wouldn't be that much of a speedbump.
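
The server half of "sign a blob of text to log in" can be as simple as shelling out to gpg. A minimal sketch, assuming gpg is installed and the user's public key has already been imported into the server's keyring; the client would have produced the blob with "gpg --clearsign" on the challenge the server handed out.

import subprocess
import tempfile

def verify_signed_challenge(clearsigned_text: str) -> bool:
    """Server side: verify a clearsigned challenge against keys already
    imported into the server's gpg keyring. Returns True only if gpg
    reports a good signature (exit status 0)."""
    with tempfile.NamedTemporaryFile("w", suffix=".asc", delete=False) as f:
        f.write(clearsigned_text)
        path = f.name
    result = subprocess.run(["gpg", "--verify", path], capture_output=True)
    return result.returncode == 0

A real login flow would also check that the signed text actually contains the fresh challenge it issued, so an old signature can't be replayed.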

Hahahahahah (1)

argStyopa (232550) | about 6 months ago | (#46817165)

People can't be bothered to take moderate, reasonable precautions with their own LIFE-PRESERVING behaviors, you think that they're going to be motivated to change their behaviors because some tech has to fart around with their laptop for 3 days re-imaging it?

Seriously, people need to stop assuming that humans aren't just hairless primates with a knack for tools and language.

You don't. (2)

DaveV1.0 (203135) | about 6 months ago | (#46817169)

People still drink and drive, smoke, do drugs, and have unsafe sex despite years and sometimes decades of having admonitions against all of those things embedded in our culture. Why? Because people still "think those types of things happen to other, less careful people." It is human nature, hubris, and magical thinking all rolled into one.

Re:You don't. (4, Insightful)

Bonker (243350) | about 6 months ago | (#46818241)

An important caveat to this line of thought is that GOOD education DOES work to prevent risk behaviors.

A blanket 'Just Say No' campaign like the one run by Nancy Reagan in the 1980s did more harm than good because, when a lot of the kids who had it force-fed to them for a decade grew up and discovered that marijuana didn't immediately kill you or turn you into a junkie, many of them threw out the entirety of 'Drugs are bad, m'kay?' and went on their merry way destroying their bodies with harsher and harsher drugs.

However, kids who had it explained to them what drugs really did to a person's body, and which drugs were more addictive and which less, were, and are, less likely to actually do those drugs.

The same is true of sex education. It's been shown, with frequently tragic consequences, that 'Abstinence Only' education usually makes the teen pregnancy and STD situation worse in places where it's taught. However, more complete sex education that explains pregnancy, STDs, and all the other associated risks that go along with sex causes a notable decline in teen pregnancy and STDs, and an actual increase in the average age at which teens start having sex.

I have found the same line of logic to be true with IT security. If you make a point of explaining the whys and wherefores, perhaps going so far as to make an interesting, engaging education program, the people who are your 'risk vectors' decrease, as do the number of security incidents you have to deal with.

No, you never can completely eliminate the problem. However, by offering education that is interesting, complete, and that doesn't treat the recipient as an idiot, you can dramatically reduce the problem.

You can't. (5, Insightful)

bravecanadian (638315) | about 6 months ago | (#46817175)

As long as there is incentive to skip security and get things done.

ie. let the nerds in IT worry about security - I'll worry about selling/making/doing and getting my bonus.

So technically I guess you could do something to foster this sort of secure behaviour but it won't happen because the powers that be don't give a shit.

So yeah, you can't.

The Solution (0)

Anonymous Coward | about 6 months ago | (#46817189)

If we execute anyone who has more than one security issue, artificial selection will fix this for us in future generations.

Right now we're probably genetically predisposed to risk taking (men more than women).

You can't. (0)

Anonymous Coward | about 6 months ago | (#46817205)

It would require eradicating laziness, ignorance, and plain old stupidity. Manage that and life would be such a paradise that you probably wouldn't need security in the first place.

Read what you wrote (1)

tomhath (637240) | about 6 months ago | (#46817237)

Despite the high news coverage that large breaches receive, and despite tales told by their friends about losing their laptops for a few days while a malware infection is cleared up, employees generally believe they are immune to security risks. They think those types of things happen to other, less careful people.

Untrained users are not the cause of large breaches. Malware infections happen to even the most careful users. In other words, training users and trying to change your company's culture won't make a significant difference.

Encrypt the laptop before a user can touch it. Make sure a decent virus scanner is running (and keep your fingers crossed). Get well trained sysadmins who see their job as keeping your network and servers as secure as reasonably possible.

Re:Read what you wrote (5, Insightful)

mlts (1038732) | about 6 months ago | (#46817495)

If I had to give five general things a company could do, they would be similar to what the parent stated:

1: First and foremost... separate and isolate. Finance should be isolated from everything else, with a Citrix or TS server so people working there can browse the web with the browsing well separated from critical assets. If a breach does occur, it will be limited in scope.

2: Laptop encryption is trivial. BitLocker [1] and the AD infrastructure for key recovery are a must-have. Depending on the level of paranoia, AD policy can be set to auto-encrypt USB drives, so a dropped thumbdrive doesn't mean a massive data breach. In fact, it would be wise to have BitLocker on all desktops as well, so repurposing of the machines is easy -- just a simple format or clean command in diskpart.exe.

3: Backups. Often overlooked, but a humble tape drive can mean the difference between a quick restore versus paying some guy out of Russia a lot of BitCoins. Disk arrays != backup because one command (blkdiscard for example) can render all backed up data gone in seconds.

4: A clear chain of command. This way, someone can't hack a VoIP connection and browbeat some lackey into giving up critical access or knowledge about internal networking.

5: Active pen-testing, from a guy running a script on boxes all the way to actual blackhats using everything at their disposal, including sending people on site in coveralls and fake badges to get in.

[1]: Yes, TrueCrypt is a good utility, but this is the enterprise where recoverability is as important as security.

Re:Read what you wrote (3, Informative)

Xaedalus (1192463) | about 6 months ago | (#46817929)

I work in Tape, and I can tell you that I've run into sysadmins and CTOs who have overlooked #3 (particularly with their belief in cheap disk arrays) to their sorrow. Tape is boring old tech, but it's damn near bulletproof in saving the bacon every damn time something goes wrong and a restore needs to occur. Ethernet with NAS boxes my ass; you need a tape library in there somewhere to completely ensure that your company doesn't go down permanently after the inevitable rogue wave of human stupidity hits your network.

Re:Read what you wrote (0)

Anonymous Coward | about 6 months ago | (#46818323)

A couple years ago, a colleague of mine worked at a company that swallowed the "yes, all backups should go to the SAN" Kool-Aid. They happily tossed their perfectly working LTO-5 silos and went fully with online disk with replication. Well, one goof by one of their SAN admins that purged all the logical drives ended that illusion. All the RAID-6 protection with hot spares, online drive checking and autotiering didn't mean squat. Replication meant that the erase commands were echoed asynchronously across the WAN to the remote site, so the data was trashed in two locations. Thankfully, this was "just" backup data and the SAN admin quickly took snapshots of everything as some form of backup, but if a production machine went down or was corrupted, there would be no way to recover.

With tapes, erasing all media -can- be done. However, part of backup 101 is offsites, or even just keeping a set out of the silo and offline. This way, if someone zeroes a media set/backup pool/whatever the backup utility calls it, the data can still be reconstructed. Just the fact that tapes have a read/write switch can help mitigate a remote attack trying to zero all data. One can't just delete a backup set and have all the data on the offsite tapes magically disappear. It might be a PITA to reindex the data, but it is still recoverable, especially if a catalog backup is sent offsite as well.

Personally, I think some more players should jump into the tape industry. A consumer-level tape drive would be very useful, especially one that has enough RAM to buffer and not "shoe-shine" when plugged into a USB 3.x connection. However, until people realize HDDs are not backup media, this likely won't happen.

How Can We Create a Culture of Secure Behavior? (1)

Rob the Bold (788862) | about 6 months ago | (#46817285)

Same way as every other behavior: reward desired behavior and/or punish undesired behavior.

Re:How Can We Create a Culture of Secure Behavior? (3, Interesting)

bill_mcgonigle (4333) | about 6 months ago | (#46817641)

Or more succinctly: incentives matter. What incentive does an employee have to keep data secret? Will he be demoted in rank and lose pay if he does something stupid?

What incentives do companies have to maintain a secure infrastructure? Will their insurance policy hold them liable if they do not?

I'm just in the middle of polishing up a puppet module to deploy a bunch of new certs on my infrastructure. My incentive is that my reputation looks pretty bad if I advise clients to be secure but my own infrastructure is not up to snuff. That's really an incentive to avoid lost opportunities, I suppose.

Google is talking about ranking pages that are secure higher. Another very wise incentive.

Let's keep this ball rolling: what other incentives can we offer or explain?

People guard against old threats (1)

jfdavis668 (1414919) | about 6 months ago | (#46817291)

People are used to guarding against security threats, but are always defending against old ones. By the time you get everyone trained in defending against one threat, the attackers have already moved on to a new one. The only way to defend yourself is to have a small group of people who can anticipate or react to the ever-changing threat and have them defend everyone else. Unless you are primarily interested in security, you will never focus on preventing new attack avenues.

Yeah, blame the victim (2)

Animats (122034) | about 6 months ago | (#46817299)

Users are not the problem any more. Crap code is the problem.

C is the source of buffer overflows. Microsoft is the source of autorun problems, or "if it's executable, run it". PHP is the source of most SQL injection problems. Vendor-installed backdoors are the source of most router vulnerabilities. None of these are end-user problems.

By making it easy (0)

Anonymous Coward | about 6 months ago | (#46817349)

Right now there's a lot of commandline work that goes into making something secure.

Some people.... (0)

Anonymous Coward | about 6 months ago | (#46817373)

Some people won't respect fire until they get burned.

when you start firing (0)

Anonymous Coward | about 6 months ago | (#46817431)

When you start firing people for not following security policies; that is the only way. Look at what happens now: you get a virus, and what happens? IT fixes your computer and you are maybe slightly inconvenienced for a short time while you wait for them to replace or re-image it. The whole time, you and your manager get to yell at IT for not fixing it fast enough, even though it was caused by you clicking on that email from the Nigerian prince for the 5th time this month. There need to be individual consequences or nothing will change.

Re:when you start firing (4, Insightful)

Anrego (830717) | about 6 months ago | (#46817747)

This requires security to be a priority over whatever that user is doing. In most cases, it's not. The job of IT is to keep the system running and support the people doing the things that the company actually cares about (buying widgets, making widgets, selling widgets, whatever). When IT folk get ideas of grandeur and images of violators of their well defined policy being given the boot, it never ends well.

Much as it sucks, I think the onus is on us to build software and systems that the user can't screw up. People clicking links and attachments? Filter out all links and attachments save for whitelisted senders. Careless with their password? Time for a 2-factor system where the hacker on the other end of the phone doesn't have easy access to one of the factors. Spearphishing becoming a problem? Implement something that makes it really obvious an email is from an outside source (and don't make it a big paragraph, just a simple large-font "THIS EMAIL WAS SENT FROM SOMEONE OUTSIDE OF THIS COMPANY" at the top).
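
A toy version of that external-sender banner, done at the mail gateway with the Python standard library email module. The internal domain, the banner text, and the plain-text-only handling are assumptions for illustration, not how any particular gateway product does it.

from email import message_from_string
from email.message import Message

INTERNAL_DOMAIN = "example.com"  # assumed internal domain
BANNER = "THIS EMAIL WAS SENT FROM SOMEONE OUTSIDE OF THIS COMPANY\n\n"

def tag_external_mail(raw_message: str) -> Message:
    """Prepend a banner to plain-text mail whose From address is external."""
    msg = message_from_string(raw_message)
    sender = msg.get("From", "")
    if not sender.rstrip(">").endswith("@" + INTERNAL_DOMAIN):
        if not msg.is_multipart() and msg.get_content_type() == "text/plain":
            msg.set_payload(BANNER + msg.get_payload())
    return msg

if __name__ == "__main__":
    raw = "From: prince@lagos.example.net\nSubject: urgent\n\nPlease wire funds."
    print(tag_external_mail(raw))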

Re:when you start firing (1)

RobertLTux (260313) | about 6 months ago | (#46818149)

"Time for a 2 factor system where the hacker on the other end of the phone doesn't have easy access to one of the factors. "

This is where dial-backs come in handy.
The way it works is:

1. You get a call from "Joe Smith in the Texas Office".
2. You tell "Joe" you're going to dial him back and to give you Line 3 when you call.
3. You use your phone list to dial him back, and
4. Joe gives you Line 3 (this is from a key string list) AND you, as instructed, give him Line "5".
5. You then continue with business.

Or video phones, with the system doing face recognition on both parties.

*They* are immune to security risks? (0)

Anonymous Coward | about 6 months ago | (#46817455)

Well, they are immune. It's their employer's computer that's at risk, not themselves. I couldn't give a rat's ass for my employers' computers. Securing that is the employer's problem, not mine. I'm not forgetting about identity theft and my personal information. There's no way I'll tell somebody else's computer (with nebulous security regimes designed by others) anything personal about myself. I'm not a fool.

Many of those nitwits won't even let me install less sucky web browsers, so fsck 'em! Live by the sword, die by the sword. They can consider it an expensive education on their part, until they smarten up.

Password strength is overrated (4, Interesting)

Tony Isaac (1301187) | about 6 months ago | (#46817459)

In my 25 years working in IT, none of my passwords, weak or strong, have ever been hacked. Even my teenage sons, who have no idea about password strength, or site security, have never been hacked. And I doubt YOU can point to a single instance of someone hacking YOUR password.

Does password hacking happen? Yes, of course. Should we be careful? Yes. But there are much greater dangers, such as malware (which you no doubt HAVE had a personal brush with).

So if we need to put up with annoying security measures, let's at least focus on the more relevant dangers, rather than forcing us all to write down our passwords and stick them to the bottom of our keyboards!

Re:Password strength is overrated (0)

Anonymous Coward | about 6 months ago | (#46817685)

My wife's throwaway email address had its pretty weak password (it was a 6-digit number) hacked last year.

Re:Password strength is overrated (0)

Anonymous Coward | about 6 months ago | (#46818273)

Oh, and I've never seen malware in my life (been using PCs since MS-DOS 2.11).

Teach the benefit - a system that keeps working (1)

raymorris (2726007) | about 6 months ago | (#46817485)

I've recently learned a new definition of security, one that's a little bit different from what I'd thought about before.

A secure system is a system that continues to work as expected, even in the face of unexpected events.

Users like a system that works the way they expect. They don't like crashes, endless popups, and systems slowed to a halt by malware.
So teach them the benefits they can expect. You can have a fast, trouble-free computer by doing x, y, and z. Clicking on "virus alerts" makes your computer slow and prone to crashing. Opening unexpected PDF files causes a huge hassle of needing to change your passwords and all that mess.

Re:Teach the benefit - a system that keeps working (0)

Anonymous Coward | about 6 months ago | (#46817957)

A secure system is a system that continues to work as expected, even in the face of unexpected events.

After decades of BSOD, what fraction of computer users have ever seen such a system?

The 20% who have used something other than Windows (1)

raymorris (2726007) | about 6 months ago | (#46818405)

Our company, for example, uses Linux and measures uptime in years. Machines are rebooted for CPU and kernel upgrades and that's about it. Hard drive upgrades don't require a reboot, and they sure as heck don't crash. One machine had a bad memory module that caused a crash. We don't have users or software that crashes.

Re:Teach the benefit - a system that keeps working (1)

Imagix (695350) | about 6 months ago | (#46818225)

That's not security (well, not the security that the rest of this thread is posting about). That's resiliency.

sure it is - open a malicious attachment, things s (1)

raymorris (2726007) | about 6 months ago | (#46818425)

That definition absolutely includes what this thread is about. TFA talked mostly about malicious email attachments. When you open one, things stop working right. The discussion has talked about poor passwords. When your poorly chosen password is cracked, things stop working right. Using a good passphrase helps keep things working the way you expect them to work.

Losing battle? (1)

scoticus (1303689) | about 6 months ago | (#46817503)

For a company of decent size, having some sort of mandatory training may be in the realm of possibility, but good luck with all of the small businesses (20 employees or so) out there. My company provides IT services to these types of businesses, mostly medical practices. There is no way to do anything other than individual, one-on-one training, and then only after something has already gone wrong. The owners don't want to pay for our time, and the staff are simply too damn busy to deal with it. This could just be a medical office thing, but I doubt it. It seems like simply being a "business" is itself a hindrance to instilling safe habits. At least with my home user clients, I have the time to educate them in a way that resonates. Back when I was in school, "computer class" was typing, a little BASIC, and that's about it. I wonder if there is anything in the current curriculum regarding safe surfing and proper security practices?

Long story short (2)

Charliemopps (1157495) | about 6 months ago | (#46817507)

A number of years ago I worked for a large (global) company that wanted to make their new ticketing system secure. So they implemented a new password standard for the system that required a 35-character password, reset every 30 days, with at least 5 non-alphanumeric characters. The result? Within a week everyone in my department had their passwords written on a post-it note stuck to their monitor. The biggest problem with network security is usually the network security department.

Use common-sense 2-factor authentication that's not too difficult for your users to comply with and they WILL comply. Make it overly complex and hard for the average non-tech person to understand and your own people will undermine all of your security efforts. Publicly fire any employee that violates your simple rules and it will quickly become apparent that adhering to those easy-to-follow rules is worth the effort.
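
For the common-sense two-factor piece: the six-digit codes from the usual authenticator apps are just RFC 6238 TOTP, which fits in a few lines of standard-library Python. The shared secret below is a placeholder; a real deployment would provision a secret per user and store it server-side.

import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32, at_time=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1, 30-second steps)."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int((at_time if at_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_base32, submitted, window=1):
    """Accept codes from the current step plus/minus `window` steps of clock drift."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_base32, now + i * 30), submitted)
               for i in range(-window, window + 1))

if __name__ == "__main__":
    SECRET = "JBSWY3DPEHPK3PXP"  # placeholder secret, not a real credential
    print(totp(SECRET), verify(SECRET, totp(SECRET)))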

People will care when they have something to lose (0)

Anonymous Coward | about 6 months ago | (#46817579)

if they don't.

Penalize negligence, just like we do IRL.

Basic Training in Computer Use (1)

X!0mbarg (470366) | about 6 months ago | (#46817585)

Unless people have some training or background, they will proceed blindly along until something actually makes them pay attention.

Start with such basics in high school, or even earlier than that. Explain (and test their understanding of) things like strong vs. weak passwords, and simple security procedures. E-mail safety tips. Good file management practices. Even basics like how to take care of a keyboard and/or pointing device would go fairly well in such a course.

Oh. Almost forgot: MAKE IT MANDATORY! Nobody gets to use the school computers/labs (even office staff) if they don't show proficiency. No personal systems should be allowed access to the school network without a valid certificate either, lest they infect the whole thing from their own carrier box. Ban those who violate the practices and cause problems. Make them responsible for what they caused, and make them sit through the repair procedures with a technician as an additional education in what happens and what has to be done to fix things; otherwise, no forgiveness, and therefore no regained access. Give them a sense of what they are avoiding, and even what to do to fix a problem on their own system, should they get afflicted at home.

Start 'em young, and train them in the ways of the system. The results will be worth the effort.

Seriously: If people don't show they are responsible enough to use the school (or company) systems, they have no business accessing them, and probably shouldn't be working there in any capacity.

Re:Basic Training in Computer Use (1)

Anonymous Coward | about 6 months ago | (#46817781)

MAKE IT MANDATORY! Nobody gets to use the school computers/labs (even Office Staff) if they don't show proficiency.

I agree. They should apply these same rules to all parts of life. Did you wash your hands when you came into the restaurant? No? Slap the food out of their hands and throw them out!

Did you buckle your seatbelt in the taxi? No? Throw them off into the gutter!

Did your dog just shit on the grass where kids play? Looks like poochie is getting Ol' Yeller'd!

It's only when every aspect of our lives is subject to draconian absolutism imposed by every other person's personal bugaboos that we can really be safe from irresponsible people.

culture of paranoia (1)

Khashishi (775369) | about 6 months ago | (#46817659)

It's not known exactly how to instill a culture of paranoia, but one idea is to subject employees to traumatic experiences involving police and/or gangsters.

Not passwords (4, Insightful)

Todd Knarr (15451) | about 6 months ago | (#46817719)

First off, stop worrying about passwords. Most malware doesn't get into systems by way of an attacker cracking passwords. It comes in in ways that bypass passwords entirely, either by getting a user to run it or by getting the user to give the attacker their password.

Second, look at your management culture. Do you expect your employees to routinely click on links in e-mail? Look for things like HR or IT sending e-mails that instruct people to follow links they've provided, or "secure" or "encrypted" e-mail systems that store the messages on Web servers and expect your employees to use a link to get at the contents of the "secure" or "encrypted" message. If you find such things, realize that you're training your employees to be insecure, because you're training them to expect to do as a normal part of their job exactly what the malware will need them to do to infect their systems. Start by removing such things from your management culture. If you need encrypted e-mail, do it within your own e-mail system so that users never need to follow links to read encrypted or secured e-mail. Outlook and Exchange offer this directly. If you need to give employees links to internal web applications or documents, create a Web page or site with a directory of links and train your employees to use a bookmark in their browser to access that site and navigate to the appropriate section where you'll put all the new links they need.

Third, look at your IT policies. Not the ones you wrote, the ones you expect employees to follow. If your policy manuals say "No user-installed software." but your actual policies require users to get and install software from outside, you have a problem. It can be as innocuous as sending zipped archives while not having a program to handle them pre-installed on user computers. It can be as pervasive as not having your IT able to support the myriad of tools your developers need, most of which will by definition not be the kind of thing most desktops would need. But every time you have a situation where what you expect of your employees requires software you didn't pre-install on their systems and where it'd negatively impact an employee's job performance and more importantly their performance evaluations if they refused to install that needed software themselves, you're creating security problems. Sit down and decide how you're going to address this, then address it. It can be as simple as a page of "approved" links to sites you know are safe and where employees can get all that useful software that gets used every day.

Fourth, evaluate your software update policies and IT budget and staffing. If your IT department doesn't have the staff or the budget to monitor the vendors of all the software in use in your organization, test changes and push updates out to your desktops and servers, you need to re-evaluate your IT budget and staffing levels. You need to get most updates installed within 30 days of their release, and you need to be able to get major critical security updates analyzed, tested and deployed within 24 hours. Your IT staff can't do that if security updates are a side item they're expected to handle in between doing everything else. If management wants security to be a priority, they need to back up their words with the resources and budget departments need to make it a priority.

Yes, a lot of that comes back to management. Attitudes towards security come from the top. More importantly, they come from what those at the top do and expect rather than from what they say.

Re:Not passwords (1)

ThatsDrDangerToYou (3480047) | about 6 months ago | (#46818341)

Fourth, evaluate your software update policies and IT budget and staffing.

LOLS! What is this "IT budget" of which you speak? Staffing?!

I worked for a series of startups, and at the last place the CEO was like "Wtf am I paying $10k a year for with this IT management company?" Hilarity ensued. :-|

Internal phishing attacks (0)

Anonymous Coward | about 6 months ago | (#46817829)

My employer periodically sends out convincing phishing attacks to employees. You click the link, you get a clear reminder that the world is unsafe. It doesn't address every concern, of course, but it helps keep security in people's conscious mental mix.
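For illustration only (not how the parent's employer necessarily does it), a bare-bones phishing-simulation landing page can be a few lines of Python: the simulated e-mails carry links such as http://trainer.example.internal:8000/t?u=<token>, and clicking logs the token and shows a training reminder. The host, port, and token scheme are all assumptions:

    # Sketch: log clicks on simulated phishing links and show a reminder page.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs
    import logging

    logging.basicConfig(filename="phish_clicks.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")

    REMINDER = (b"<html><body><h1>This was a simulated phishing e-mail.</h1>"
                b"<p>Report suspicious links to IT instead of clicking them.</p>"
                b"</body></html>")

    class ClickHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Record which recipient's token was clicked and from where.
            token = parse_qs(urlparse(self.path).query).get("u", ["unknown"])[0]
            logging.info("click token=%s from=%s", token, self.client_address[0])
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(REMINDER)

    if __name__ == "__main__":
        HTTPServer(("", 8000), ClickHandler).serve_forever()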

EDUCATE END USERS (especially Windows) (0)

Anonymous Coward | about 6 months ago | (#46817839)

Since it's MOST used worldwide on PC's & Servers combined: A good read (by "yours truly" that actually got me PAID for it no less - "the Lord works in mysterious ways") -> http://www.bing.com/search?q=%... [bing.com]

* It uses a HIGHLY ESTEEMED tool http://www.computerworld.com/s... [computerworld.com]

(Whose makers have taken a few of MY suggestions to improve it no less)

CIS Tool actually makes it "fun" to do (in a nerdy kind of way) - almost like a performance benchmark software does, albeit, for security instead!

It works!

APK

P.S.=> CIS Tool is also MULTI-PLATFORM capable (not just for Windows users, but also *NIX variants of many kinds as well)...

... apk

Re:EDUCATE END USERS (especially Windows) (1)

TrollingForHostFiles (3613155) | about 6 months ago | (#46818055)

APK tells
Nothing but lies
'Cept when he tries
To spamvertise

BURMA SHAVE

TrollingForHostsFiles = Zontar the Mindless (0)

Anonymous Coward | about 6 months ago | (#46818267)

What lies, Zontar the Mindless? See January 2008 winners http://techtalk.pcpitstop.com/... [pcpitstop.com]

* ... & as-per-YOUR-usual, vs. myself? "EAT YOUR WORDS..."

I see you haven't managed to eat them ALL yet (lol) after your failed attempt @ libeling me -> http://mobile.slashdot.org/com... [slashdot.org]

APK

P.S.=> As to my subject-line, for anyone's that curious on that account? See here (Zontar admits TrollingForHostsFiles is HIS sockpuppet, SEVERAL times) -> http://slashdot.org/comments.p... [slashdot.org] (What a TOTALLY reprehensible little scumbag that Zontar the Mindless is...)

... apk

LART the offenders? (1)

rainer_d (115765) | about 6 months ago | (#46817913)

Well, you have to start somewhere, right?

Do you want to? (1)

LainTouko (926420) | about 6 months ago | (#46817989)

The first question is not actually how you can create such a culture, but whether it's actually a good thing in the first place. You seriously need to evaluate this. One of the primary means of being secure is not trusting others. But trusting others is an incredibly useful tool to get things done, and it may be worth taking the security hit. Stand on a crowded railway platform, and you're trusting so many people, each of whom could push you off and kill you so easily, without even thinking about it. Without trust, society itself would be impossible.

So, for example, if everyone believed they were immune to the security risk of terrorism, that would very obviously be a good thing for society. There have been security-economics analyses of various measures recommended by security people who thought their users were fools who just wouldn't listen, and those analyses established that the users who ignored the measures were completely right: the cost of implementing them was hundreds of times greater than the benefit of preventing the attacks they were effective against.

A security professional who thinks doing things securely must always be a priority just because that's his field, instead of taking the time to gain a more holistic understanding of the situation, deserves to be ignored.

Incentives (0)

Anonymous Coward | about 6 months ago | (#46818153)

Sue them for negligence when circumventing security actually results in damages. If I get fired for following a security procedure and missing my deadline, but don't get fired when I skip it, make the deadline, and infect the network...

culture is easy (0)

Anonymous Coward | about 6 months ago | (#46818163)

Culture is easy; it's the implementation that's hard.

Every single day people make a value judgment about what they should do: do I slack off and post on Slashdot, or do I finish my assignment? If I can get away with being lazy and not finishing my assignment, I'll be lazy and procrastinate. If I value discipline and the satisfaction of hard work and a job well done, I'll finish it early. And so on.

Culture is simply a single word that sums up what a community of people value and do not value. So the key to embedding something in your organization's culture is to MAKE it valuable, through a system of rewards and punishments or some other method; essentially, test your user group's security habits, reward those who do well, and lightly punish or retrain those who do badly.

The problem with that method is it takes time and resources to implement a program like this, so you will likely need some higher up approval to do so. Culture in an organization like a company usually comes from the top, so you need a higher up as your champion, because that higher up will create a policy that grants you the power to give out rewards and punishments etc.

Higher-ups in an organization are usually concerned with efficiency, which typically means cost. So what I would do is write a proposal for a higher-up explaining the costs and risks of bad security habits (quantify the risks in dollars if you can), and outline a program that would encourage good habits along with what that program would cost the organization in hours and dollars. Sold right, they will grant you the power and authority to implement the program, and if they champion it to the organization as a whole, the others will fall into line.

The key, though, is still value. Once you understand the costs and risks of a security breach, are they high enough to warrant a program ensuring proper security protocols? Value is absolutely key.
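To illustrate the "quantify the risks in dollars" point above, here is a tiny Python sketch of an annualized-loss-expectancy comparison (ALE = single-loss expectancy × annual rate of occurrence); every figure in it is a made-up assumption, not data from anywhere:

    # Sketch: compare the expected yearly benefit of a security-awareness
    # program with its cost, using made-up numbers.
    sle = 250000          # assumed cost of one typical incident, in dollars
    aro = 0.4             # assumed incidents per year without the program
    reduction = 0.5       # assumed fraction of incidents the program prevents
    program_cost = 40000  # assumed yearly cost of the program

    ale_before = sle * aro
    ale_after = sle * aro * (1 - reduction)
    benefit = ale_before - ale_after

    print("ALE without program: ${:,.0f} per year".format(ale_before))
    print("ALE with program:    ${:,.0f} per year".format(ale_after))
    print("Expected benefit:    ${:,.0f} per year vs. cost ${:,.0f}".format(benefit, program_cost))
    print("Worth doing on these numbers" if benefit > program_cost
          else "Not worth doing on these numbers")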

You can't. (1)

seebs (15766) | about 6 months ago | (#46818195)

1. It's annoying.
2. Most people don't think like that.

People are not built for that kind of caution.

Best Approach (1)

MrKaos (858439) | about 6 months ago | (#46818223)

I'll probably be modded down for this but the most effective way is to pwn the users to show them that they are merely bitches that any moderately skilled geek can defraud completely. Since they only learn from being fucked over, being fucked over is the only way they learn - otherwise you are just considered to be paranoid.

Repeat this for every user you meet and add the strange looks you get from them when you do things a secure way.

Just Remember (1)

naris (830549) | about 6 months ago | (#46818253)

If your users can do their job, then obviously IT Security is not doing theirs and stricter security policies are required!

FLEE! FLEE FOR YOUR LIVES!!! (0)

Anonymous Coward | about 6 months ago | (#46818265)

A culture of secure behavior is a culture of paranoid, suspicious minds.

We can't go on together, with suspicious minds, and we can't build our dreams on suspicious minds. We're caught in a trap, and I can't walk out. ...because I love you too much baby.

"Strong" Passwords are not the answer (2)

naris (830549) | about 6 months ago | (#46818367)

Requiring users to change their password often and requiring long, "strong" passwords that are difficult to memorize is not the answer to better security. It results in people writing their password down someplace convenient for them (and for any nefarious people around). This is well demonstrated in the movie "WarGames", where the main character finds the school's password taped inside a desk and alters his and his friend's grades. It also trains users, and the help desk, to expect frequent password resets, which makes the actual passwords irrelevant to security: all a nefarious person has to do to gain access to the system is convince the help desk that they are an employee who needs their password changed.