[curves] Review of NIST workshop

Michael Hamburg mike at shiftleft.org
Thu Jun 11 13:58:43 PDT 2015


Efficient ephemeral elliptic curve cryptographic keys

You might imagine that there is a precomputation attack on elliptic curves, so that you’d rather create a new curve per session.  But it’s really slow to count points.  Possible solution: use curves with complex multiplication, and use the CM method to generate them.  But this is still a bit slow, so maybe precompute several classes of curves and put them in a table, and you can instantiate a random curve from the class very quickly.

Dan Bernstein’s criticism: this makes unlikely-sounding assumptions about how powerful the precomputation attack is.  Otherwise you’re better off spending your perf budget on a larger curve, instead of some ephemeral but pretty special CM curve.



A random zoo: sloth, unicorn and trx

Goal: design a publicly-trustable randomization procedure for lotteries, EC generation, etc.  Suppose you have a pre-agreed program to generate the curve or whatever; it just needs a seed.

Design a function which cannot be evaluated quickly, even with very fast hardware.  For example, sequential hash evaluation, or (to enable quickly checkable certificates) something involving modular square roots.
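To make the square-root idea concrete, here’s a toy sketch (mine, not the paper’s exact construction): with p = 3 (mod 4), each square root is a full exponentiation that depends on the previous output, but the whole chain can be checked with cheap squarings.

    p = 2**127 - 1  # a Mersenne prime with p = 3 (mod 4); illustration only

    def step(x):
        # For x != 0, exactly one of x, -x is a quadratic residue mod p;
        # take the square root of whichever one is.  Each step costs a
        # full exponentiation and depends on the previous output.
        if pow(x, (p - 1) // 2, p) != 1:
            x = p - x
        return pow(x, (p + 1) // 4, p)

    def slow(x0, t):
        # t inherently sequential square roots.
        x = x0 % p
        for _ in range(t):
            x = step(x)
        return x

    def verify(x0, y, t):
        # Each squaring undoes one step up to sign, so t cheap squarings
        # land back on +/- x0.
        for _ in range(t):
            y = y * y % p
        return y in (x0 % p, (-x0) % p)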

Protocol: a party asks for public input at publicly announced times.  It also chooses a random seed itself, which it commits to.  After gathering the input, the party immediately publishes it (within seconds, though possibly only as a hash) and begins computing the slow function.  The output of the slow function is used as a seed to the pre-agreed process.  To attack this, you’d need to be able to evaluate the slow function within the “immediate publication” window.
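The flow would look something like this (names and parameters are mine, hypothetical, reusing slow() and verify() from the sketch above):

    import hashlib, os

    def H(*parts):
        h = hashlib.sha256()
        for s in parts:
            h.update(s)
        return h.digest()

    my_seed = os.urandom(32)
    commitment = H(b"commit", my_seed)   # published before gathering starts

    # At the announced time, gather the public input, then publish
    # everything within seconds:
    public_input = b"...lottery draws, stock prices, tweets..."
    combined = H(commitment, my_seed, public_input)

    # Then grind for hours:
    T = 10**6  # delay parameter; must far exceed the publication window
    seed = slow(int.from_bytes(combined, "big"), T)

    # Everyone checks that the commitment opens to my_seed, recomputes
    # combined, and runs the cheap verify() on the published output.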

This seems like a reasonable idea to me, but you’d have to be more conservative than the paper suggests, since custom hardware can (e.g.) take modular square roots significantly faster than a CPU.




Adobe Digital Standards and Elliptic Curve Cryptography

Symantec’s view on the current state of ECC

These papers discuss current deployment of ECC.  The main lesson seems to be: adoption moves very slowly, because many web servers need to support XP clients, or clients on random tiny embedded platforms from N years ago.  This will be that much worse once there is crypto in your light switches.



An Efficient Certificate Format for ECC

A stripped down X.509 subset for tiny devices.  The effort goes mostly into reducing size rather than making parsing simpler.  It slightly simplifies parsing, perhaps, but there is still the nightmare of ASN.1 DER.



Vehicle to Vehicle Safety Application using Elliptic Curve PKI

Description of the emerging V2V standards.  Vehicles will sign messages with their positions.  Certs rotate regularly to protect anonymity.  Each vehicle will have several thousand ephemeral certs (lasting a few hours each?) connected by a hash chain, which will enable revoking them all at once if a car’s security chip is discovered to be compromised.  Multiple agencies need to collaborate to track down wrongdoers.  Requires high-speed ECC implementations in cheap hardware.  Uses NIST P256; Ed25519 got nixed for some reason.
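The hash-chain trick is roughly this, as I understood it (a toy sketch, not the actual standard’s key derivation):

    import hashlib

    def derive_cert_ids(root, n):
        # Each link in the chain seeds one short-lived pseudonym cert.
        # To outsiders the per-cert values look unrelated, so the certs
        # are unlinkable; but publishing root revokes all n at once,
        # since anyone can recompute the chain and match it to certs.
        ids, x = [], root
        for _ in range(n):
            x = hashlib.sha256(x).digest()
            ids.append(x)
        return ids

    chain = derive_cert_ids(b"secret per-vehicle root", 3000)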




Panel: ECC in industry.  I didn’t take notes on this.  IIRC it was similar to the Symantec talk, with discussions about how ECC adoption moves slowly.  Also, we don’t want each nation to have its own pet curve; it would be better if we could get international buy-in.



Requirements for Elliptic Curves for High-Assurance Applications

Prefer Brainpool, because it’s what’s in German smartcards right now.  A cofactor of 1 makes checks simpler; a cofactor has other advantages, but requires rewriting all the algorithms, which is hard in hardware.  Random primes make blinding easier.  They don’t see twist security as important.
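The blinding point, as I understand the argument: the standard countermeasure computes (k + r*n)P instead of kP for a fresh random r, which gives the same point because nP is the identity.  A minimal sketch (my illustration, not their implementation):

    import secrets

    def blinded_scalar(k, n, blind_bits=64):
        # k' = k + r*n yields the same point as k, but the scalar fed to
        # the ladder differs on every run.
        r = secrets.randbits(blind_bits)
        return k + r * n

With a Brainpool-style random prime, the group order n has no special shape, so the high bits of the blinded scalar look random.  When n is very close to a power of two, as with the fast special primes, the top bits of k + r*n are nearly r itself, so the blinding hides less.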



Diversity and transparency in ECC

There are many desiderata for the new curves.  The most important is that dlog must be hard.  Others include prime order; a random prime p vs. a fast one; and genericity along many axes (twist security or not, CM discriminant, embedding degree, small coefficients or random ones, etc).

The existing curves (NIST, Brainpool, etc) and academic proposals (NUMS, Curve25519/41417, Goldilocks, etc) sit at opposite ends of the spectrum on several of these points.  Perhaps it is worth having two standards.




A brief discussion on selecting new elliptic curves

Basically a defense and recap of the NUMS paper.  Suggests that the plan should be to standardize a generation mechanism: bit length -> curve.  Then you run it at your favorite bit lengths.  The perf and simplicity losses from this are claimed to be small enough that having a unified technique for all levels is better.  Also, this is a good strategy to put the curves beyond reproach.
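For flavor, here’s a toy version of the prime-selection half of such a procedure (NOT the actual NUMS algorithm; real generation also pins down the curve coefficient, which needs point counting):

    import random

    def is_probable_prime(n, rounds=40):
        # Miller-Rabin with random bases.
        if n < 2:
            return False
        for q in (2, 3, 5, 7, 11, 13):
            if n % q == 0:
                return n == q
        d, s = n - 1, 0
        while d % 2 == 0:
            d, s = d // 2, s + 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = x * x % n
                if x == n - 1:
                    break
            else:
                return False
        return True

    def rigid_prime(bits):
        # Smallest odd c with p = 2^bits - c prime and p = 3 (mod 4).
        c = 1
        while True:
            p = 2**bits - c
            if p % 4 == 3 and is_probable_prime(p):
                return p
            c += 2  # keep c odd so p stays odd

    # rigid_prime(256) should give 2^256 - 189, matching the NUMS prime
    # at that size, if I’ve remembered their criteria right.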



Simplicity

A Bernstein rant about why NIST curves and standards are bad, and why Montgomery curves are better.



Panel: Selection criteria for new standardized curves

Lochter bemoaning the damage that BADA55 and SafeCurves have done to the industry, by causing potential users to back off from ECC.

Lochter and Costello: trust is more important than speed.  Speed is not very important: the differences are small, and most EC is fast enough most of the time.  Speed may also be ephemeral, since platforms change.  And using it as a criterion lets talented implementors pull the wool over people’s eyes.

Bernstein (from the audience) and Lange: arguments for why performance is a good and important and stable metric.

Lots of Bernstein and Lange vs Lochter and Costello.  Flori is quieter.


Cheers,
— Mike



> On Jun 11, 2015, at 4:22 PM, Trevor Perrin <trevp at trevp.net> wrote:
> 
> If anyone wants to write a summary of the NIST workshop day 1, that
> would be good content for curves list...
> 
> Trevor


