[messaging] Let's run a usability study (was Useability of public-key fingerprints)
infinity0 at pwned.gg
Tue Apr 1 07:33:26 PDT 2014
On 30/03/14 23:44, Bernard Tyers - ei8fdb wrote:
> Hi there,
> I recently subscribed to the list. A friend mentioned this thread, and I'd like to pitch in.
> I'm a user researcher/interaction designer. I did my MSc thesis on non-technical users' adoption of OTR in instant messaging, specifically on their mental models (how someone thinks something works, based on (a) previous experience and/or (b) second-hand correct/incorrect knowledge).
> I looked specifically at The Guardian Project's Gibberbot (now ChatSecure for Android) and the iOS ChatSecure client by Chris Ballinger.
> I focused on journos and human rights defenders.
> IMO, IM and e-mail are essentially human interactions, supported by computers.
> I'd argue cryptography is just computers interacting: one computer exchanging messages which are validated somehow by the other computer. These are highly formalized conversations.
> Humans understand conversations, once they are explained in a way that's familiar to them.
I'm not sure it's necessary to explain all the details of the conversation, even if you can describe it in more natural language; users don't need that detail to use the tool.
What is vital is to explain the interface between the human and the computer - e.g. key validity, when encryption is active, what the likely reasons for errors are, and how to fix them in a secure way.
Preferably you'd also explain the security properties that a tool achieves, but this can get technical - I find that the security properties of OTR are just about explainable in casual conversation, but explaining Pond would take more time.
> I can totally agree with the comment made about the Cryptocat blog post where the user said the word fingerprint terrified them.
> Two of my participants (both journos who worked in “tricky” situations and were not “digital security literate”) found the word scary - one said, “Can my iPhone read my fingerprint?” (this was before the faux fingerprint reader), while the other said, “Fingerprint? What the hell does that mean?”.
> When I interviewed the participants of my study, there was a mixture of good structural mental models and bad functional mental models, and vice-versa.
> Two other participants were involved in scenarios where they had to validate people's identities, sometimes remotely (one situation: different parties in London, Syria, and Bahrain).
> When I questioned them further, they explained their human procedures for validating the identities of the other parties. I mapped their procedure in a flow diagram, and it closely mirrored that of OTR; however, they had third parties (I defined them as “proxies”) whom they trusted implicitly.
> While most had heard of OTR in some shape or form, they had varying depths of mental models, some correct and some totally incorrect.
> The majority of them did not instinctively compare the fingerprints when the software gave them the opportunity.
> Some found the OTR SMP an accessible option as it was a model they understood - “secret answer to a question”.
> Others thought the fingerprint was “more secure” (I didn’t get a chance to probe further).
I agree that "fingerprint" is unnecessary jargon. I generally try to explain a PGP "key" in terms of a social network profile, and having to validate the information on that profile (numbers, addresses, personal data), as well as the fact that the profile/account is controlled by the person you think it is. (Actually using PGP for email is another topic that I deal with separately.)
One major UX issue with the OTR shared-secret method is that people think they can short-cut it by asking the question *over the as-yet-unverified OTR channel*. (There was a thread, I think on otr-users or otr-dev, where I suggested extra wording that made it clear the latter was not safe, but it was too long to put in an actual UI.) You must have encountered this in your research - what are your thoughts for dealing with this?
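As an aside, the "secret answer to a question" model your participants found accessible can be sketched in a few lines. This is a toy illustration of the *idea* only, not the real OTR SMP: real SMP is a zero-knowledge protocol that also resists offline dictionary attacks, whereas this sketch just binds a Diffie-Hellman session key to the answer and compares hash commitments, and is only safe if the answer has high entropy. The group parameters and question/answer values below are made up for illustration.

```python
import hashlib
import secrets

# Toy parameters (illustrative only; real protocols use vetted groups).
P = 2**127 - 1          # a small Mersenne prime as the modulus
G = 5                   # a toy generator

def dh_keypair():
    """Ephemeral Diffie-Hellman keypair over the toy group."""
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

def commitment(shared_key: int, answer: str) -> bytes:
    """Bind the session key and the secret answer together in a hash."""
    data = shared_key.to_bytes(16, "big") + answer.encode()
    return hashlib.sha256(data).digest()

# Alice and Bob each hold an answer to the agreed question.
alice_priv, alice_pub = dh_keypair()
bob_priv, bob_pub = dh_keypair()

# Both sides derive the same session key from the DH exchange.
alice_key = pow(bob_pub, alice_priv, P)
bob_key = pow(alice_pub, bob_priv, P)

# Each side sends only its commitment; they match iff the answers match,
# without the answer itself ever crossing the wire.
match = commitment(alice_key, "tuesday") == commitment(bob_key, "tuesday")
mismatch = commitment(alice_key, "tuesday") == commitment(bob_key, "monday")
print(match, mismatch)  # True False
```

Note that in the real SMP, a mismatch reveals nothing about either party's answer even to an attacker who can guess candidates offline; that is exactly the property this sketch lacks.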
> One conclusion I made was that non-technical people CAN understand OTR - the issue is that OTR is implemented in user software in ways they CANNOT understand (or find difficult): jargon, cryptographic terms, overly complicated language.
> If you are looking for a user research/usability tester to be involved in the test I’d like to offer my time.
> If anyone would like some more information on my thesis, I am trying to write up some short papers/discussions about what I found.
> All the best,
I'd be interested in what you think of this idea: https://moderncrypto.org/mail-archive/messaging/2014/000171.html I don't think that we should get rid of (the idea of) fingerprints / shared-secrets (though we can explain them using more natural language), at least until we have another way of achieving strong levels of validity. (TOFU does not achieve this.)
Validity is a logistical problem, and using clever cryptography to reduce the amount of verification needed is good, but only solves half of the problem. Even if we solve the CA problem, there still has to be a CA to do the actual verification. How do we know they are doing their job correctly? If we had a tool that managed the verification process (the idea that I just linked), then we could have more confidence, because it cuts out a lot of potential human sources of error. As a bonus, it would make it easier for individuals to act as CAs for their friends.
Why is it a given assumption that people "will not check fingerprints"? People exchange phone numbers all the time. On the other hand, massive marketing campaigns have changed people's behaviour too. (Do people think it's *immoral* to encourage more people to verify fingerprints?) Software is another way to change people's behaviour, and software that lowers the cost of a task will encourage more people to do that task.
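On the phone-number analogy: part of lowering the cost of verification is presentational. A minimal sketch, assuming only that short grouped digits are easier to read aloud and compare than one long hex string (the key bytes and group sizes here are made up for illustration, not how any particular client renders fingerprints):

```python
import hashlib

def friendly_fingerprint(key_bytes: bytes, groups: int = 8) -> str:
    """Hash the key material and render the digest as short,
    space-separated hex groups, phone-number style."""
    digest = hashlib.sha256(key_bytes).hexdigest().upper()
    chunk = 4
    parts = [digest[i:i + chunk] for i in range(0, groups * chunk, chunk)]
    return " ".join(parts)

fp = friendly_fingerprint(b"example public key material")
print(fp)  # eight space-separated groups of 4 hex digits
```

Reading such a string aloud over a phone call is roughly the same effort as exchanging two phone numbers, which is the behaviour the paragraph above argues people already perform routinely.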