How should I ethically approach user password storage for later plaintext retrieval?

Posted 2018-12-31 15:55

As I continue to build more and more websites and web applications, I am often asked to store users' passwords in a way that they can be retrieved if/when the user has an issue (either to email a forgotten password link, walk them through it over the phone, etc.). When I can, I fight bitterly against this practice, and I do a lot of ‘extra’ programming to make password resets and administrative assistance possible without storing their actual password.

When I can’t fight it (or can’t win) then I always encode the password in some way so that it, at least, isn’t stored as plaintext in the database—though I am aware that if my DB gets hacked it wouldn't take much for the culprit to crack the passwords, so that makes me uncomfortable.

In a perfect world folks would update passwords frequently and not duplicate them across many different sites—unfortunately I know MANY people that have the same work/home/email/bank password, and have even freely given it to me when they need assistance. I don’t want to be the one responsible for their financial demise if my DB security procedures fail for some reason.

Morally and ethically I feel responsible for protecting what can be, for some users, their livelihood even if they are treating it with much less respect. I am certain that there are many avenues to approach and arguments to be made for salting hashes and different encoding options, but is there a single ‘best practice’ when you have to store them? In almost all cases I am using PHP and MySQL if that makes any difference in the way I should handle the specifics.

Additional Information for Bounty

I want to clarify that I know this is not something you want to have to do and that in most cases refusal to do so is best. I am, however, not looking for a lecture on the merits of taking this approach; I am looking for the best steps to take if you do take this approach.

In a note below I made the point that websites geared largely toward the elderly, mentally challenged, or very young can become confusing for people when they are asked to perform a secure password recovery routine. Though we may find it simple and mundane, in those cases some users need the extra assistance of either having a service tech help them into the system or having the password emailed/displayed directly to them.

In such systems the attrition rate from these demographics could hobble the application if users were not given this level of access assistance, so please answer with such a setup in mind.

Thanks to Everyone

This has been a fun question with lots of debate and I have enjoyed it. In the end I selected an answer that both retains password security (I will not have to keep plain text or recoverable passwords) and makes it possible for the user base I specified to log into a system without the major drawbacks I have found from normal password recovery.

As always there were about 5 answers that I would like to have marked as correct for different reasons, but I had to choose the best one--all the rest got a +1. Thanks everyone!

Also, thanks to everyone in the Stack community who voted for this question and/or marked it as a favorite. I take hitting 100 up votes as a compliment and hope that this discussion has helped someone else with the same concern that I had.

26 Answers
何处买醉
Answer #2 · 2018-12-31 16:19

Salt and hash the user's password as normal. When logging the user in, accept the user's password (after salting/hashing), but also accept a literal match between what the user entered and the stored value.

This allows the user to enter their secret password, but also allows them to enter the salted/hashed version of their password, which is what someone would read from the database.

Basically, make the salted/hashed password be also a "plain-text" password.
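
A minimal PHP sketch of that check, assuming a hypothetical `users` table with a `password_hash` column populated by `password_hash()`; the schema and function name are illustrative only, not from the answer:

```php
<?php
// Sketch only: accept either the real password or the stored hash itself.
// Assumes a PDO connection in $pdo and a hypothetical `users` table.
function login(PDO $pdo, string $username, string $input): bool
{
    $stmt = $pdo->prepare('SELECT password_hash FROM users WHERE username = ?');
    $stmt->execute([$username]);
    $storedHash = $stmt->fetchColumn();
    if ($storedHash === false) {
        return false; // unknown user
    }

    // Normal case: the user typed their actual password.
    if (password_verify($input, $storedHash)) {
        return true;
    }

    // The answer's twist: also accept the literal stored hash, read from the
    // database, as a "plain-text" credential (e.g. read out by a support tech).
    return hash_equals($storedHash, $input);
}
```

`hash_equals()` is used for the literal comparison to avoid a timing side channel when comparing the two strings.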

不流泪的眼
Answer #3 · 2018-12-31 16:21

Sorry, but as long as you have some way to decode their password, there's no way it's going to be secure. Fight it bitterly, and if you lose, CYA.

泛滥B
Answer #4 · 2018-12-31 16:24

Imagine someone has commissioned a large building to be built - a bar, let's say - and the following conversation takes place:

Architect: For a building of this size and capacity, you will need fire exits here, here, and here.
Client: No, that's too complicated and expensive to maintain, I don't want any side doors or back doors.
Architect: Sir, fire exits are not optional, they are required as per the city's fire code.
Client: I'm not paying you to argue. Just do what I asked.

Does the architect then ask how to ethically build this building without fire exits?

In the building and engineering industry, the conversation is most likely to end like this:

Architect: This building cannot be built without fire exits. You can go to any other licensed professional and he will tell you the same thing. I'm leaving now; call me back when you are ready to cooperate.

Computer programming may not be a licensed profession, but people often seem to wonder why our profession doesn't get the same respect as a civil or mechanical engineer - well, look no further. Those professions, when handed garbage (or outright dangerous) requirements, will simply refuse. They know it is not an excuse to say, "well, I did my best, but he insisted, and I've gotta do what he says." They could lose their license for that excuse.

I don't know whether or not you or your clients are part of any publicly-traded company, but storing passwords in any recoverable form would cause you to fail several different types of security audits. The issue is not how difficult it would be for some "hacker" who got access to your database to recover the passwords. The vast majority of security threats are internal. What you need to protect against is some disgruntled employee walking off with all the passwords and selling them to the highest bidder. Using asymmetrical encryption and storing the private key in a separate database does absolutely nothing to prevent this scenario; there's always going to be someone with access to the private database, and that's a serious security risk.

There is no ethical or responsible way to store passwords in a recoverable form. Period.

只若初见
Answer #5 · 2018-12-31 16:24

If you can't just reject the requirement to store recoverable passwords, how about this as your counter-argument.

We can either properly hash passwords and build a reset mechanism for the users, or we can remove all personally identifiable information from the system. You can use an email address to set up user preferences, but that's about it. Use a cookie to automatically pull preferences on future visits and throw the data away after a reasonable period.
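
A rough sketch of the cookie idea, assuming preferences are keyed by a random token rather than an account; the cookie name, token format, and 90-day lifetime are arbitrary choices, not part of the answer:

```php
<?php
// Sketch: anonymous preference storage with no password at all.
// A random token in a cookie identifies the visitor's preferences.
function getOrCreatePrefsToken(): string
{
    if (isset($_COOKIE['prefs_token']) && preg_match('/^[a-f0-9]{64}$/', $_COOKIE['prefs_token'])) {
        $token = $_COOKIE['prefs_token'];
    } else {
        $token = bin2hex(random_bytes(32));
    }
    // Refresh the cookie; server-side rows older than ~90 days can be purged by a cron job.
    setcookie('prefs_token', $token, [
        'expires'  => time() + 90 * 86400,
        'httponly' => true,
        'secure'   => true,
        'samesite' => 'Lax',
    ]);
    return $token;
}
```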

The one option that is often overlooked with password policy is whether a password is really even needed. If the only thing your password policy does is cause customer service calls, maybe you can get rid of it.

伤终究还是伤i
Answer #6 · 2018-12-31 16:24

Do the users really need to recover (e.g. be told) what the password they forgot was, or do they simply need to be able to get onto the system? If what they really want is a password to logon, why not have a routine that simply changes the old password (whatever it is) to a new password that the support person can give to the person that lost his password?

I have worked with systems that do exactly this. The support person has no way of knowing what the current password is, but can reset it to a new value. Of course all such resets should be logged somewhere and good practice would be to generate an email to the user telling him that the password has been reset.
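
In PHP, that kind of support-driven reset might look roughly like the sketch below; the table layout, `must_change_password` flag, and logging destination are assumptions for illustration:

```php
<?php
// Sketch: support staff can reset a password they cannot read.
function supportResetPassword(PDO $pdo, int $userId, string $supportAgent): string
{
    // Generate a temporary password to read to the user or email to them.
    $tempPassword = bin2hex(random_bytes(8)); // 16 hex characters

    $hash = password_hash($tempPassword, PASSWORD_DEFAULT);
    $pdo->prepare('UPDATE users SET password_hash = ?, must_change_password = 1 WHERE id = ?')
        ->execute([$hash, $userId]);

    // Log the reset and notify the user so resets cannot happen silently.
    error_log(sprintf('Password reset for user %d by support agent %s', $userId, $supportAgent));

    return $tempPassword; // shown once to the support agent, never stored in the clear
}
```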

Another possibility is to have two simultaneous passwords permitting access to an account. One is the "normal" password that the user manages and the other is like a skeleton/master key that is known by the support staff only and is the same for all users. That way when a user has a problem the support person can log in to the account with the master key and help the user change his password to whatever. Needless to say, all logins with the master key should be logged by the system as well. As an extra measure, whenever the master key is used you could validate the support person's credentials as well.
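
Purely to illustrate the dual-credential idea described above (not an endorsement of it), a login check could look something like this, with the master key's hash loaded from protected configuration rather than the user table:

```php
<?php
// Sketch: a login that accepts the user's own password or a support "master key".
// $masterKeyHash is assumed to come from protected configuration, not the database.
function loginWithMasterKey(string $input, string $userHash, string $masterKeyHash, string $username): bool
{
    if (password_verify($input, $userHash)) {
        return true;
    }
    if (password_verify($input, $masterKeyHash)) {
        // Every master-key login must be logged and reviewed.
        error_log("Master-key login used for account {$username}");
        return true;
    }
    return false;
}
```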

-EDIT- In response to the comments about not having a master key: I agree that it is bad just as I believe it is bad to allow anyone other than the user to have access to the user's account. If you look at the question, the whole premise is that the customer mandated a highly compromised security environment.

A master key need not be as bad as would first seem. I used to work at a defense plant where they perceived the need for the mainframe computer operator to have "special access" on certain occasions. They simply put the special password in a sealed envelope and taped it to the operator's desk. To use the password (which the operator did not know) he had to open the envelope. At each change of shift one of the jobs of the shift supervisor was to see if the envelope had been opened and if so immediately have the password changed (by another department) and the new password was put in a new envelope and the process started all over again. The operator would be questioned as to why he had opened it and the incident would be documented for the record.

While this is not a procedure that I would design, it did work and provided for excellent accountability. Everything was logged and reviewed, plus all the operators had DOD secret clearances and we never had any abuses.

Because of the review and oversight, all the operators knew that if they misused the privilege of opening the envelope they were subject to immediate dismissal and possible criminal prosecution.

So I guess the real answer is that if one wants to do things right, one hires people they can trust, does background checks, and exercises proper management oversight and accountability.

But then again, if this poor fellow's client had good management they wouldn't have asked for such a security-compromised solution in the first place, now would they?

残风、尘缘若梦
Answer #7 · 2018-12-31 16:25

Pursuant to the comment I made on the question:
One important point has been very glossed over by nearly everyone... My initial reaction was very similar to @Michael Brooks, till I realized, like @stefanw, that the issue here is broken requirements, but these are what they are.
But then, it occurred to me that that might not even be the case! The missing point here is the unspoken value of the application's assets. Simply speaking, for a low-value system, a fully secure authentication mechanism, with all the process involved, would be overkill, and the wrong security choice.
Obviously, for a bank, the "best practices" are a must, and there is no way to ethically violate CWE-257. But it's easy to think of low value systems where it's just not worth it (but a simple password is still required).

It's important to remember, true security expertise is in finding appropriate tradeoffs, NOT in dogmatically spouting the "Best Practices" that anyone can read online.

As such, I suggest another solution:
Depending on the value of the system, and ONLY IF the system is appropriately low-value with no "expensive" asset (the identity itself, included), AND there are valid business requirements that make proper process impossible (or sufficiently difficult/expensive), AND the client is made aware of all the caveats...
Then it could be appropriate to simply allow reversible encryption, with no special hoops to jump through.
I am stopping just short of saying not to bother with encryption at all, because it is very simple/cheap to implement (even considering passable key management), and it DOES provide SOME protection (more than the cost of implementing it). Also, it's worth looking at how to provide the user with the original password, whether via email, displaying it on the screen, etc.
Since the assumption here is that the value of the stolen password (even in aggregate) is quite low, any of these solutions can be valid.


Since there is a lively discussion going on, actually SEVERAL lively discussions, in the different posts and separate comment threads, I will add some clarifications, and respond to some of the very good points that have been raised elsewhere here.

To start, I think it's clear to everyone here that allowing the user's original password to be retrieved, is Bad Practice, and generally Not A Good Idea. That is not at all under dispute...
Further, I will emphasize that in many, nay MOST, situations - it's really wrong, even foul, nasty, AND ugly.

However, the crux of the question is around the principle, IS there any situation where it might not be necessary to forbid this, and if so, how to do so in the most correct manner appropriate to the situation.

Now, as @Thomas, @sfussenegger and a few others mentioned, the only proper way to answer that question is to do a thorough risk analysis of any given (or hypothetical) situation, to understand what's at stake, how much it's worth to protect, and what other mitigations are in play to afford that protection.
No, it is NOT a buzzword, this is one of the basic, most important tools for a real-live security professional. Best practices are good up to a point (usually as guidelines for the inexperienced and the hacks), after that point thoughtful risk analysis takes over.

Y'know, it's funny - I always considered myself one of the security fanatics, and somehow I'm on the opposite side of those so-called "Security Experts"... Well, truth is - because I'm a fanatic, and an actual real-life security expert - I do not believe in spouting "Best Practice" dogma (or CWEs) WITHOUT that all-important risk analysis.
"Beware the security zealot who is quick to apply everything in their tool belt without knowing what the actual issue is they are defending against. More security doesn’t necessarily equate to good security."
Risk analysis, and true security fanatics, would point to a smarter, value/risk-based tradeoff, based on risk, potential loss, possible threats, complementary mitigations, etc. Any "Security Expert" who cannot point to sound risk analysis as the basis for their recommendations, or support logical tradeoffs, but would instead prefer to spout dogma and CWEs without even understanding how to perform a risk analysis, is naught but a Security Hack, and their Expertise is not worth the toilet paper it's printed on.

Indeed, that is how we get the ridiculousness that is Airport Security.

But before we talk about the appropriate tradeoffs to make in THIS SITUATION, let's take a look at the apparent risks (apparent, because we don't have all the background information on this situation, we are all hypothesizing - since the question is what hypothetical situation might there be...)
Let's assume a LOW-VALUE system, yet not so trivial that it's public access - the system owner wants to prevent casual impersonation, yet "high" security is not as paramount as ease of use. (Yes, it is a legitimate tradeoff to ACCEPT the risk that any proficient script-kiddie can hack the site... Wait, isn't APT in vogue now...?)
Just for example, let's say I'm arranging a simple site for a large family gathering, allowing everyone to brainstorm on where we want to go on our camping trip this year. I'm less worried about some anonymous hacker, or even Cousin Fred squeezing in repeated suggestions to go back to Lake Wantanamanabikiliki, as I am about Aunt Erma not being able to log on when she needs to. Now, Aunt Erma, being a nuclear physicist, isn't very good at remembering passwords, or even with using computers at all... So I want to remove all possible friction for her. Again, I'm NOT worried about hacks, I just don't want silly mistakes of a wrong login - I want to know who is coming, and what they want.

Anyway.
So what are our main risks here, if we symmetrically encrypt passwords, instead of using a one-way hash?

  • Impersonating users? No, I've already accepted that risk, not interesting.
  • Evil administrator? Well, maybe... But again, I don't care if someone can impersonate another user, INTERNAL or no... and anyway a malicious admin is gonna get your password no matter what - if your admin's gone bad, it's game over anyway.
  • Another issue that's been raised, is the identity is actually shared between several systems. Ah! This is a very interesting risk, that requires a closer look.
    Let me start by asserting that it's not the actual identity that's shared, rather the proof, or the authentication credential. Okay, since a shared password will effectively allow me entrance to another system (say, my bank account, or gmail), this is effectively the same identity, so it's just semantics... Except that it's not. Identity is managed separately by each system, in this scenario (though there might be third party id systems, such as OAuth - still, it's separate from the identity in this system - more on this later).
    As such, the core point of risk here, is that the user will willingly input his (same) password into several different systems - and now, I (the admin) or any other hacker of my site will have access to Aunt Erma's passwords for the nuclear missile site.

Hmmm.

Does anything here seem off to you?

It should.

Let's start with the fact that protecting the nuclear missiles system is not my responsibility, I'm just building a frakkin family outing site (for MY family). So whose responsibility IS it? Umm... How about the nuclear missiles system? Duh.
Second, if I wanted to steal someone's password (someone who is known to repeatedly use the same password between secure sites, and not-so-secure ones) - why would I bother hacking your site? Or struggling with your symmetric encryption? Goshdarnitall, I can just put up my own simple website, have users sign up to receive VERY IMPORTANT NEWS about whatever they want... Puffo Presto, I "stole" their passwords.

Yes, user education always does come back to bite us in the hienie, doesn't it?
And there's nothing you can do about that... Even if you WERE to hash their passwords on your site, and do everything else the TSA can think of, you added protection to their password NOT ONE WHIT, if they're going to keep promiscuously sticking their passwords into every site they bump into. Don't EVEN bother trying.

Put another way, You don't own their passwords, so stop trying to act like you do.

So, my Dear Security Experts, as an old lady used to ask for Wendy's, "WHERE's the risk?"

Another few points, in answer to some issues raised above:

  • CWE is not a law, or regulation, or even a standard. It is a collection of common weaknesses, i.e. the inverse of "Best Practices".
  • The issue of shared identity is an actual problem, but misunderstood (or misrepresented) by the naysayers here. It is an issue of sharing the identity in and of itself(!), NOT about cracking the passwords on low-value systems. If you're sharing a password between a low-value and a high-value system, the problem is already there!
  • By the by, the previous point would actually point AGAINST using OAuth and the like for both these low-value systems, and the high-value banking systems.
  • I know it was just an example, but (sadly) the FBI systems are not really the most secured around. Not quite like your cat's blog's servers, but nor do they surpass some of the more secure banks.
  • Split knowledge, or dual control, of encryption keys does NOT happen just in the military; in fact, PCI-DSS now requires this from basically all merchants, so it's not really so far out there anymore (IF the value justifies it).
  • To all those who are complaining that questions like these are what makes the developer profession look so bad: it is answers like those, that make the security profession look even worse. Again, business-focused risk analysis is what is required, otherwise you make yourself useless. In addition to being wrong.
  • I guess this is why it's not a good idea to just take a regular developer and drop more security responsibilities on him, without training to think differently, and to look for the correct tradeoffs. No offense, to those of you here, I'm all for it - but more training is in order.

Whew. What a long post...
But to answer your original question, @Shane:

  • Explain to the customer the proper way to do things.
  • If he still insists, explain some more, insist, argue. Throw a tantrum, if needed.
  • Explain the BUSINESS RISK to him. Details are good, figures are better, a live demo is usually best.
  • IF HE STILL insists, AND presents valid business reasons - it's time for you to do a judgement call:
    Is this site low-to-no-value? Is it really a valid business case? Is it good enough for you? Are there no other risks you can consider that would outweigh valid business reasons? (And of course, is the client NOT a malicious site, but that's duh.)
    If so, just go right ahead. It's not worth the effort, friction, and lost usage (in this hypothetical situation) to put the necessary process in place. Any other decision (again, in this situation) is a bad tradeoff.

So, bottom line, and an actual answer - encrypt it with a simple symmetrical algorithm, protect the encryption key with strong ACLs and preferably DPAPI or the like, document it and have the client (someone senior enough to make that decision) sign off on it.
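
As a hedged illustration of that bottom line in PHP (DPAPI is Windows-specific, so this sketch simply assumes the key lives in a tightly permissioned file outside the web root; every name here is an assumption, and key management remains the hard part):

```php
<?php
// Sketch: reversible password storage with AES-256-GCM via OpenSSL.
// The key file path and its ACLs are assumptions for illustration.
const KEY_FILE = '/etc/myapp/password-key.bin'; // 32 random bytes, readable only by the app user

function encryptPassword(string $plain): string
{
    $key    = file_get_contents(KEY_FILE);
    $iv     = random_bytes(12); // 96-bit nonce, recommended size for GCM
    $tag    = '';
    $cipher = openssl_encrypt($plain, 'aes-256-gcm', $key, OPENSSL_RAW_DATA, $iv, $tag);
    return base64_encode($iv . $tag . $cipher);
}

function decryptPassword(string $stored)
{
    $key  = file_get_contents(KEY_FILE);
    $raw  = base64_decode($stored);
    $iv   = substr($raw, 0, 12);
    $tag  = substr($raw, 12, 16);
    $data = substr($raw, 28);
    // Returns false if the ciphertext or tag has been tampered with.
    return openssl_decrypt($data, 'aes-256-gcm', $key, OPENSSL_RAW_DATA, $iv, $tag);
}
```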

查看更多