[messaging] Multiple devices and key synchronization: some thoughts
carlo von lynX
lynX at i.know.you.are.psyced.org
Sun Jan 4 01:57:51 PST 2015
On Sat, Jan 03, 2015 at 08:15:06AM -0800, Joseph Bonneau wrote:
> On Fri, Jan 2, 2015 at 1:47 AM, carlo von lynX <
> lynX at i.know.you.are.psyced.org> wrote:
> > Consider also the possibility that market logic may not work out as
> > it never has in the past two decades since we "won" the crypto wars.
> > If we let people always take the decision and opt for easy solutions
> > humanity may never experience a secure Internet as they will always
> > pick a compromised solution and mass surveillance will go on, to
> > the detriment of democracy. Consider the possibility that the only
> > way to create an Internet that respects the principles of democratic
> > constitutions could be to put certain basic requirements of end-to-end
> > security into law. http://youbroketheinternet.org/legislation/ is
> > about that, a law proposal for obligatory encryption.
> I agree that a market failure often exists in which users genuinely want a
> higher level of security, but are unable to achieve it because they can't
> tell the difference between secure and insecure products (or secure and
> insecure behaviors) and so they default to insecure products and behaviors
> because they are usually easier. Essentially, this is a lemons market
> (although not technically, since there is usually not a price difference
> but a convenience one). This was proposed for information security at least
> 13 years ago in the original papers on security economics and has been
> widely discussed since then.
Yes, that is problem number one, but there is more.
> I think this is a helpful framing, and there are many actions to try to
> reduce information asymmetry. For example, things like the EFF Scorecard
> attempt to inform more users that certain products aren't secure, as well
> as to try to convince large Internet companies not to tarnish their brand
> with weak products. Libertarian/soft paternalism can also be helpful,
> in which users are nudged to better decisions through secure defaults.
> However, I think it's also possible (and indeed common) to make a design
> error by assuming all users have the same values as we do, or would "if
> only they knew" and therefore we should try to force them into a high level
> of security.
> Personally, I think many users' desire for end-to-end security ends well
> short of printing backup codes or running a pairing protocol that prevents
> them from instantly using a new device. If this is required to use multiple
> devices, I'm worried that the result will be a large number of users
> signing up for some new cloud service which manages a single private key
> for them and lets them fetch their messages from any device (using
> passwords and HTTPS), at which point end-to-end security is gone.
>  https://www.acsac.org/2001/papers/110.pdf
>  https://en.wikipedia.org/wiki/Soft_paternalism
You are making an assumption here that I think needs to be focused
on, and that we should be fully aware of. You think of these tools
as a helmet that a user chooses to put on in order to hurt herself less.
I think each time my girlfriend speaks to her old childhood friends
over Facebook, telling them about our relationship, she is harming
my rights and my privacy - and there is nothing I can do about it
but get down on my knees and explain to her how much this hurts me.
So the tools we should be working on shouldn't be helmets; they
should be thought of like car headlights. You turn them on not
only to be able to see the road yourself, but also to avoid causing
harm to others.
How many countries on Earth have introduced traffic lights, seat
belts and vehicle lighting as a non-mandatory free choice for the
citizen? As far as I know, when the behavior of individuals affects
many other people around them, the measures necessary to ensure
everybody's safety are put into law.
Laws are especially good for business, because unethical companies
no longer have an advantage on the market over the ethical ones.
So if you would like to contribute to a company with an ethos, you
should be all for policy-making that makes the privacy-abuse business
model illegal and re-opens business on the Internet to actual paid
services - equally for all.
Messaging is among the most politically relevant technologies of
this age - more so than nuclear or genetic technology, IMHO. We
developers of these technologies must be very aware of the political
implications of every little design decision we make. It would be
irresponsible to try to separate these aspects. So please consider
that the "consumer choice" paradigm may just be totally wrong.