# [curves] Isogeny patterns among Edwards curves

Michael Hamburg mike at shiftleft.org
Thu Jan 30 13:37:49 PST 2014

```
On Jan 30, 2014, at 11:38 AM, Watson Ladd <watsonbladd at gmail.com> wrote:

> On Thu, Jan 30, 2014 at 9:08 AM, Robert Ransom <rransom.8774 at gmail.com> wrote:
> There is also a bigger problem: each encoding and decoding doubles the
> point. This is fine for ECDH, but makes signatures ugly.

At worst you can recover P or P+(point of order 2) with two square roots, effectively backing out of the isogeny rather than completing it.  This might be more expensive than decompressing and doubling, but it’s not horrible.  Since you’re going to check the equation (r*G + s*PubKey) ?= NoncePoint, the recipient only has to decompress the public key, not the NoncePoint.  After compressing, they can check the equation with no expensive ops, because they’ll have an Edwards point as (xz,yz,z) and the NoncePoint is supposed to be x/y = xz/yz.

If the PubKey is a q-torsion point, which it should be, you could instead check r*G + (s/4 mod q)*4*PubKey ?= NoncePoint.  The fastest implementations will do something like that anyway using the isogeny to the twisted Edwards curve E(-1,d-1), since that’s about 5% faster and has slightly simpler formulas.

Compare this to a coordinate+signbit encoding, where to check the sign bit (not suuuuuper necessary, but a good idea for strictness) you have to do a division at the end.  Or if the sign bit is a Legendre symbol, you have to compute that symbol.  So in total, signature checking is not much more complicated or expensive with the x/y encoding than it is with coordinate+signbit.

But you’re right, it’s not all pretty.  If I can’t find a way to avoid the extra square root or Legendre symbol, then it will be more expensive for batch verification.  For that, you can check 4*r*G + s*4*PubKey ?= 4*NoncePoint.  So it doesn’t hurt that the points get doubled or quadrupled by the cheaper version of decompression.

Likewise, in the Elligator PAKE, you have to compute wire+elligator, and eventually serialize to wire again, which (as things stand) costs at least 1 square root, 1 Legendre and 1 inversion in addition to the elligator and the scalarmul.  This is compared to just a square root and an inversion for a coordinate+signbit representation.

So, maybe it's not good enough *yet*, because it adds moderate slowdowns to batch-verify and Elligator PAKE.  But it doesn’t really slow down ECDH, signing or single-verify, which is pretty good for a trick I’ve been working on since yesterday.

> I think any encoding should be able to represent the entire curve,
> without nasty problems like this.

Aside from performance concerns, do you think it’s actually a problem if the encoding fails to faithfully convey points, by destroying the difference between P and P+(point of order 2)?

> Yes, it's surmountable, but what
> does this idea have over a regularly compressed Edwards point?

What this has vs a coordinate+signbit compressed Edwards point is that you can use the Montgomery ladder without destroying the sign bit, and without paying more than a few muls extra to compute it.  This is nice for a protocol which wants to do a scalar mul but cares about the sign that comes out afterwards.  Dragonfly does exactly this.  I know that neither of us loves Dragonfly, but there may be other protocols that care about this sort of thing.

What’s more, this is the first suggestion for a unified point format which might reduce complexity instead of increasing it.  But only if I can work out a nice decompression algorithm.

Also, it might dodge patents, but that probably doesn’t matter if the patents are expiring in July anyway.

Cheers,
— Mike

```
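Editor's note: the inversion-free final check Mike describes (comparing the projective Edwards output (xz, yz, z) of the scalar multiplication against the wire value x/y) can be sketched as below. This is a minimal algebra demonstration only; the prime is the 25519 field size, but the coordinates and Z-scale are made-up illustration values, not a real curve point.

```python
# After computing R = r*G + s*PubKey in projective Edwards coordinates
# (X, Y, Z) with x = X/Z and y = Y/Z, the wire encoding u = x/y equals
# X/Y (the Z cancels), so the verifier needs one field mul, no inversion.

p = 2**255 - 19  # field size; any odd prime works for this algebra


def check(X, Y, u, p):
    """Compare projective (X, Y, *) against wire value u = x/y without inverting."""
    return (X - u * Y) % p == 0


# Stand-in for a scalar-mul output: affine (x, y) scaled by an arbitrary Z.
x, y, Z = 4, 9, 7                    # made-up affine coords and projective scale
u = x * pow(y, -1, p) % p            # honest wire encoding u = x/y
X, Y = x * Z % p, y * Z % p          # projective representative (xZ, yZ, Z)

assert check(X, Y, u, p)             # Z cancels: X/Y == x/y
assert not check(X, Y, (u + 1) % p, p)
```

The same cross-multiplication check is what makes the compressed NoncePoint comparison free of expensive operations in single verification.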
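Editor's note: the r*G + (s/4 mod q)*4*PubKey trick above relies only on scalar arithmetic mod the group order q. A minimal sanity check, modeling a q-torsion point by an integer mod q (the point value 42 and the scalar are hypothetical stand-ins):

```python
# On a q-torsion point P, (s/4 mod q) * (4*P) == s*P, because the scalars
# agree mod q. q below is the prime order of the ed25519 subgroup.

q = 2**252 + 27742317777372353535851937790883648493


def mul(scalar, point_int, q):
    # Stand-in for scalar multiplication on a q-torsion point,
    # modeled as multiplication of integers mod q.
    return scalar * point_int % q


s = 123456789        # hypothetical signature scalar
P = 42               # hypothetical point, modeled as an integer mod q

lhs = mul(s * pow(4, -1, q) % q, mul(4, P, q), q)  # (s/4 mod q) * (4*P)
rhs = mul(s, P, q)                                 # s*P
assert lhs == rhs
```

This is why the fastest implementations can push the public key through the 4-isogeny to E(-1,d-1) and fold the factor of 4 into the scalar.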