[noise] Post-Quantum Noise with New Hope

Trevor Perrin trevp at trevp.net
Sun Jul 17 04:56:18 PDT 2016


On Sat, Jul 16, 2016 at 11:41 PM, Rhys Weatherley <rhys.weatherley at gmail.com
> wrote:

> On Sat, Jul 16, 2016 at 8:34 PM, Trevor Perrin <trevp at trevp.net> wrote:
>
>>  * Maybe we should clarify which tokens are data and which are
>> computations in the notation, e.g. capitalization?
>>
>
>> Noise_XX(s, rs):                 Noise_XXekem(s, rs):
>>   -> e                             -> e, ekem1, s
>>   <- e, DHEE, s, DHSE              <- e, ekem2, DHEE, EKEM21, s, DHSE
>>   -> s, DHSE                       -> s, DHSE
>>
>
> Arrgh!  My handshake is shouting at me! :-)
>

Some people found it confusing that the patterns combine message fields
("e", "s") with pure computation steps ("dh**").

If we keep adding more tokens that confusion might grow, which is why I was
thinking about notation to distinguish them.  But I'm not sure about this.


I think at that point it might be simpler to run two Noise_XX() handshakes
> in parallel and mix them with SSK's.  Or just switch completely to a
> post-quantum "XX" and drop the classical handshake.
>

I'm not sure how much confidence we have in current PQ schemes resisting
*classical* cryptanalysis, much less PQ cryptanalysis.  So I'm opposed to
deploying current PQ encryption schemes by themselves.


How about we apply some KISS?  The odd one out is ekem: there is a need for
> adding another ephemeral-only exchange for obtaining extra forward secrecy
> from a different algorithm.
>
> Noise_XXf(s, rs):
>     -> e, f
>     <- e, f, dhee, dhff, s, dhse
>     -> s, dhse
>
> Where "f" stands for "forward secrecy ephemeral".
>

That's not bad, though if we're pushing for simple patterns, just
"doubling" the DH functions deserves a closer look.  I.e., for NewHope:

"e" = sends an unencrypted DH value, then sends a (potentially encrypted)
NewHope value.  The first occurrence of "e" sends NewHope message #1, the
next "e" by the other party sends NewHope message #2.

"dhee" = calls MixKey() on the DH output, then calls MixKey() on the
NewHope output.
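As a rough sketch of that "doubling" (Python; mix_key here is a simplified
stand-in for Noise's HKDF-based MixKey, and the DH / NewHope outputs are just
opaque byte strings -- all names are hypothetical, not real library calls):

```python
import hashlib
import hmac

def mix_key(chaining_key: bytes, input_key_material: bytes) -> bytes:
    # Simplified stand-in for Noise's MixKey (the real one runs HKDF and
    # also derives transport key material; here we just chain an HMAC).
    return hmac.new(chaining_key, input_key_material, hashlib.sha256).digest()

def doubled_dhee(chaining_key: bytes, dh_output: bytes,
                 newhope_output: bytes) -> bytes:
    # "dhee" under the doubling approach: MixKey on the DH output,
    # then MixKey on the NewHope output.
    chaining_key = mix_key(chaining_key, dh_output)
    chaining_key = mix_key(chaining_key, newhope_output)
    return chaining_key

# Both parties arrive at the same chaining key as long as they agree
# on both shared secrets.
ck = doubled_dhee(b"\x00" * 32, b"dh shared secret", b"newhope shared secret")
```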

This loses some flexibility.  For example, with your separate tokens, we
could do the regular dhee *before* sending the second f, and encrypt f:

Noise_XXf(s, rs):
    -> e, f
    <- e, dhee, f, dhff, s, dhse
    -> s, dhse

But encrypting the second NewHope value doesn't seem that useful - it
doesn't make the protocol indistinguishable from random, because the first
NewHope value is still in the clear.


To apply the "doubling" approach to post-quantum auth:

 * PQ signature schemes exist.  Ignoring PQ, a "sig" token that signs h
with s is an obvious extension to Noise, though there are things to think
about (e.g., do we allow a static elliptic curve key to be used for both DH
and signatures?).

Anyways, a "sig" transformation would convert "dhse" -> "sig":

 Noise_XX(s, rs):                 Noise_XXsig(s, rs):
  -> e                             -> e
  <- e, dhee, s, dhse              <- e, dhee, s, sig
  -> s, dhse                       -> s, sig

So we could "double" the "s" token to contain a regular and PQ public key,
and double the "sig" token to contain a regular signature, followed by a PQ
signature.
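Concretely, the doubled "sig" token might look like this (a hypothetical
sketch: keyed hashes stand in for real signature schemes, which would be
something like an elliptic-curve signature plus a PQ signature scheme):

```python
import hashlib

def fake_sign(sk: bytes, h: bytes, label: bytes) -> bytes:
    # Placeholder "signature" over the handshake hash h; a real
    # implementation would call an actual signature scheme here.
    return hashlib.sha256(label + sk + h).digest()

def doubled_sig(classical_sk: bytes, pq_sk: bytes, h: bytes) -> bytes:
    # Doubled "sig" token: a regular signature over h, followed by a
    # PQ signature over h, concatenated on the wire.
    return (fake_sign(classical_sk, h, b"classical") +
            fake_sign(pq_sk, h, b"pq"))

payload = doubled_sig(b"classical key", b"pq key", b"handshake hash")
```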


 * For schemes like NTRU Prime where we can encrypt to a static public key,
we could double "dhes" so that it calls MixKey() on the DH output, then
sends an NTRU Prime #2 message and calls MixKey() on the NTRU Prime
output.  Instead of XX, we'd use a pattern like this:

    -> e
    <- e, dhee, s
    -> s, dhes
    <- dhes
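A doubled "dhes" could be sketched along the same lines (hypothetical names
again; kem_encap is a toy stand-in for NTRU Prime encapsulation to the
peer's static key, and mix_key is the same simplified MixKey stand-in):

```python
import hashlib
import hmac
import os

def mix_key(chaining_key: bytes, ikm: bytes) -> bytes:
    # Simplified stand-in for Noise's HKDF-based MixKey.
    return hmac.new(chaining_key, ikm, hashlib.sha256).digest()

def kem_encap(static_pq_pubkey: bytes):
    # Toy stand-in for KEM encapsulation: returns a ciphertext to send
    # and the shared secret.  Not a real KEM.
    secret = os.urandom(32)
    ciphertext = hashlib.sha256(static_pq_pubkey + secret).digest()
    return ciphertext, secret

def doubled_dhes(chaining_key: bytes, dh_output: bytes,
                 static_pq_pubkey: bytes):
    # "dhes" doubled: MixKey on the DH output, then encapsulate to the
    # peer's static PQ key and MixKey on the KEM shared secret.
    chaining_key = mix_key(chaining_key, dh_output)
    ciphertext, kem_secret = kem_encap(static_pq_pubkey)
    chaining_key = mix_key(chaining_key, kem_secret)
    return chaining_key, ciphertext
```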

Note that this pattern is worse than XX, since XX can authenticate "faster":

    -> e
    <- e, dhee, s, dhse
    -> s, dhse

So having separate tokens allows combining the faster authentication from
XX with slower PQ authentication, which was my "skem" proposal:

Noise_XXskem(s, rs):
  -> e, ekem1
  <- e, ekem2, DHEE, EKEM21, s, DHSE, skem1
  -> skem2, SKEM21, s, DHSE, skem1
  <- skem2, SKEM21

But that's a lot more complicated...

So for PQ auth, the doubling approach loses some real flexibility in the
auth case, since it constrains us to using "regular" auth that complies
with whatever limits the PQ scheme has.

OTOH, PQ forward-secrecy is the more immediate goal, so maybe it's better
to use doubling to just keep things simple for now, and hope that more
flexible PQ schemes are invented if/when PQ auth becomes important?


Trevor