All the buzz is about an article in Ars Technica, Why passwords have never been weaker - and crackers have never been stronger, by Dan Goodin. It's an eye-opening piece about how the state of the art in password cracking has been greatly advanced by the intelligence gained from recent hacks and password-file disclosures at LinkedIn, eHarmony, Battle.net and others.
Everyone "knows" they should be using strong passwords - yet passwords like 123456, password and Monkey are routinely on the top of the list of the most used passwords on hacked sites. Similarly, everyone "knows" they should not reuse passwords between sites (is your Facebook password the same as your online banking password???). But the article refers to a 2007 Microsoft study which reports that the average web user has accounts on 25 different sites, but protected by only 6.5 unique passwords!
Here's my take...
This is a great article. I think this is well done and pretty much supports what I’ve been saying for years.
In the old days, when a site was hacked, the encrypted password file would be grabbed. If the site did a good job of encrypting their passwords, then it was hard to get those passwords. These new analysis capabilities make it easier to guess those passwords - essentially creating "rainbow tables on steroids".
Thus, if a site gets hacked, it's game over. On unhacked sites, the attacker's main advantage is the ability to try reused passwords and password schemes.
Note, and this is important: when they refer to password cracking, they mean first stealing the encrypted password file from a hacked site and then figuring out the passwords from it. It doesn't mean directly cracking the passwords of accounts on unhacked sites. [more here]
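To make "cracking" concrete, here's a minimal sketch of that offline attack. It assumes a leaked file of unsalted SHA-1 hashes (the LinkedIn dump was unsalted SHA-1); the tiny wordlist and "leaked" hashes below are made up for illustration, and real crackers throw GPU rigs, huge wordlists and mangling rules at the problem:

```python
# Toy illustration of offline password cracking: the attacker already has the
# stolen hash file and simply hashes guesses until one matches.
# Assumes unsalted SHA-1 hashes; the hashes and wordlist here are made up.
import hashlib

def sha1_hex(password: str) -> str:
    return hashlib.sha1(password.encode("utf-8")).hexdigest()

# Pretend this came from a hacked site's database dump.
leaked_hashes = {
    sha1_hex("123456"),
    sha1_hex("monkey"),
    sha1_hex("correct horse battery staple"),
}

# The attacker's wordlist -- in practice, millions of entries plus mangling rules.
wordlist = ["password", "123456", "letmein", "monkey"]

for guess in wordlist:
    if sha1_hex(guess) in leaked_hashes:
        print(f"cracked: {guess}")
```

The weak, common passwords fall immediately; the long random one never appears in any wordlist. That asymmetry is the whole point of the controls below.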
Here are the main issues/points:
- People need too many passwords. So they engage in two dangerous practices – choosing weak (easy to remember) passwords and reusing passwords.
- What constitutes a weak or strong password has changed… the bar has been raised.
- Site security at too many sites is bad. It has always been, more or less, true that a hack of our neighbor's site can affect us; these cracking advances make it even more true.
These are the ONLY effective controls for passwords:
- Choose really, really strong passwords - single-use passwords (one per site) are best. In fact, humans should not choose their own passwords. They should be long, machine-generated and stored in an encrypted vault tool (see the sketch after this list).
- Users must never reuse passwords between sites. They also must not reuse password schemes between sites unless they are very, very good.
- Sites must have a retry lockout.
- This isn't really a control but a response: if our site gets hacked, then all passwords must be changed immediately.
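To show what I mean by long, machine-generated passwords, here's a minimal sketch using Python's secrets module. The 24-character length and character set are just illustrative choices - a vault tool does exactly this for you and remembers the result so you don't have to:

```python
# Minimal sketch of a machine-generated password: long, random gibberish that
# a human never memorizes -- it lives in the encrypted vault instead.
import secrets
import string

def generate_password(length: int = 24) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # e.g. a 24-character string, unique per site
```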
Password expiration is not a control at all. It encourages reuse of passwords between sites. Also, changing from one guessable password to another is not useful.
If your organization wants to get serious about mitigating this risk then:
- Roll out a vault to all users and encourage use at home.
- Only allow the vault to generate passwords. They will be long and gibberish.
- Consider implementing one-time passwords via hard or soft tokens, or tools like Google Authenticator (I use this and it's awesome) - see the sketch after this list.
- Ensure all systems have retry lockouts (you should do this anyway).
- Expiration adds no value, but if passwords are unknown to the user through the use of vaults then it also does no harm.
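For the curious, here's a rough sketch of the TOTP scheme (RFC 6238) that Google Authenticator implements: the site and your phone share a secret, and both derive a short code from the current 30-second time window. The base32 secret below is a made-up example, not a real one:

```python
# Rough sketch of TOTP (RFC 6238), the scheme behind Google Authenticator.
# Server and phone share a secret; both compute the same 6-digit code from
# the current 30-second window, so the code is useless moments later.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # current time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret; output changes every 30 seconds
```

Even if a password is phished or cracked from a stolen hash file, the attacker still needs the current code, which is why this pairs so well with the vault approach above.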
How do you manage passwords? Do you use a vault? Do you have policies at work around password creation, storage and reuse?