[curves] Choosing an extra-strength curve

Johannes Merkle johannes.merkle at secunet.com
Tue May 6 09:57:03 PDT 2014

Hi Trevor,

> I'm not qualified to assess this, so I'll look to people like
> Bernstein, Lange, Hamburg, etc.
> Bernstein and Lange don't seem to think that's important [1]:
> "Special primes help index calculus, but the point of ECC has always
> been to avoid index calculus. All of the SafeCurves requirements can
> be met by special primes."

This is absolutely true with respect to current cryptanalytic approaches. My point is that we sometimes fail to
anticipate future developments. It is purely hypothetical, but it is nevertheless plausible to regard any "simplifying
structure", e.g. the possibility of representing the prime as a sum of a few powers of small integers, as a potential
starting point for new attacks. The approach of the ECC Brainpool was to be on the safe side, just in case special
primes turn out to be susceptible to new efficient attacks. Nobody can assess how likely this is, and personally I
think it will not happen; but considering the devastating impact of a sub-exponential attack, we should not ignore
such a possibility entirely.
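
To illustrate what "sum of a few powers of small integers" means in practice, here is a small sketch (my own, not from the original discussion) comparing the Curve25519 field prime with the pseudo-random brainpoolP256r1 prime; the Brainpool constant is quoted from RFC 5639, and the non-adjacent-form weight is used as a rough sparseness measure:

```python
# Curve25519 field prime: expressible with very few powers of two.
p25519 = 2**255 - 19
assert p25519 == 0x7FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFED

# brainpoolP256r1 field prime (RFC 5639): derived pseudo-randomly,
# so it has no comparably short representation.
p_bp256 = 0xA9FB57DBA1EEA9BC3E660A909D838D726E3BF623D52620282013481D1F6E5377

def signed_digit_weight(n):
    """Number of non-zero digits in the non-adjacent form (NAF) of n,
    a rough measure of how 'special' (sparse) the number is."""
    w = 0
    while n:
        if n & 1:
            w += 1
            n -= 2 - (n & 3)  # signed digit is +1 or -1
        n >>= 1
    return w

print(signed_digit_weight(p25519))   # -> 4: 2^255 - 2^4 - 2^2 + 2^0
print(signed_digit_weight(p_bp256))  # around a third of the bit length
```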

> Mike agrees that random primes might protect against future
> cryptanalysis, but points out they bring a substantial cost [2]: "a
> random field would be at least twice as slow".
> If that's true, I think you'd expect a random-prime curve to be about
> the same speed as a curve 1.3x the size (2 ^ 1/2.6).  So a 384-bit
> random-prime curve would be about as slow as a fast-prime 500-bit
> curve, but would have a nominal security level of 192 bits instead of
> 250.
> So I guess this is a tradeoff between different strategies for adding
> margin against cryptanalysis?

You seem to imply that new attacks can be countered by increasing the bit length by 33%. But if such an attack has
sub-exponential running time (and not too large constants), even 512 bits might be insufficient.
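
The rough numbers behind both points can be checked directly; note that the cost exponent 2.6 and the GNFS-style L-notation constants below are my own illustrative assumptions, not measurements:

```python
import math

# 1) The quoted equivalence: if a random prime makes the curve 2x slower and
#    cost grows like bits^2.6, a 2x slowdown equals a size factor of 2^(1/2.6).
size_factor = 2 ** (1 / 2.6)
print(round(384 * size_factor))  # ~501: a 384-bit random-prime curve costs
                                 # about as much as a ~500-bit fast-prime one

# 2) Generic (Pollard rho) ECDLP security is bits/2.
print(512 // 2)  # 256-bit security for a 512-bit curve

# 3) A hypothetical sub-exponential attack with GNFS-like parameters,
#    L_p[1/3, c] = exp(c * (ln p)^(1/3) * (ln ln p)^(2/3)), in bits of work:
def L_bits(bits, c=1.923, alpha=1/3):
    lnp = bits * math.log(2)
    return c * lnp**alpha * math.log(lnp)**(1 - alpha) / math.log(2)

print(round(L_bits(512)))  # ~64 bits of work: 512-bit keys would not help
```

Under these (assumed) parameters, a sub-exponential break would reduce a 512-bit curve from 256-bit to roughly 64-bit security, which is why simply growing the key size is not an adequate hedge.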

For instance, consider the discrete logarithm in finite fields. For performance reasons, GF(2^n) has sometimes been
suggested. However, the recent attacks by Joux et al. have shown that many, if not most, of these fields can be broken
up to bit lengths of 8000 (the analysis is extremely complicated, so there is no complete picture yet).

IMHO it is wise to have fast curves for applications that need high performance, but also to have, as a backup,
conservatively chosen curves that avoid simplifying structures. This second class of curves may be considerably slower,
but there are plenty of application scenarios where crypto speed doesn't really matter. Hey, most TLS servers prefer
RSA cipher suites with 2048-bit keys, which are much slower than most Brainpool curves.

> Do you think 2x slower is accurate?  (Or do you have performance
> numbers on Brainpool or similar curves I could add to the spreadsheet
> [3])?

Unfortunately not. I am not a developer, so I can't test it myself. Maybe I can find someone to do it for me.

