[curves] pure-python Ed25519 library for review
Brian Warner
warner at lothar.com
Tue Apr 7 11:55:18 PDT 2015
Hi folks.. I could use some extra eyeballs on a pure-python library I
put together to do Ed25519 operations:
https://github.com/warner/python-pure25519
Of course it's very much not constant-time, and a lot slower than a C
implementation. But a pure-python library is, in practice, much easier
to depend upon than one that requires a C compiler.
And it's not unusably slow. If you run "python setup.py speed" in that
tree, you'll get the speed-test results. On my machine (2.6GHz i7), I'm
getting SUPERCOP-compatible Ed25519 sign/verify times of 2.8ms and
10.5ms respectively.
I'm writing this to support a pure-python SPAKE2 library, for which I'm
seeing each phase of the protocol take about 5-8ms.
I'm especially looking for feedback on the arbitrary_element() function,
which provides SPAKE2 with the unknown-discrete-log group elements U and
V (or M and N depending on which paper you read). That function is
paraphrased here:
https://github.com/warner/python-pure25519/blob/master/pure25519/basic.py#L261
import hashlib, binascii, itertools

def arbitrary_element(seed):  # unknown DL
    # oversized hash of the seed, reduced mod Q, gives a candidate y-coordinate
    hseed = hashlib.sha512(seed).digest()
    y = int(binascii.hexlify(hseed), 16) % Q
    for plus in itertools.count(0):
        y_plus = (y + plus) % Q
        x = xrecover(y_plus)
        Pa = [x, y_plus]
        if not isoncurve(Pa):
            continue
        P = ElementOfUnknownGroup(xform_affine_to_extended(Pa))
        P8 = P.scalarmult(8)  # clear the cofactor
        if is_extended_zero(P8.XYTZ):
            continue
        assert is_extended_zero(P8.scalarmult(L).XYTZ)
        return Element(P8.XYTZ)
    # never reached
("Q" is 2**255-19, and xrecover() does what you'd expect)
What this code is doing and why:
* oversized hash (sha512) and reduction, to avoid any significant bias.
I'm pretty sure this doesn't matter for SPAKE2, but it seemed
appropriate. Should I remove this? Does Elligator avoid bias?
* test isoncurve(P), increment-and-try-again if not. I find that about
50% of Y-coords result in not-on-curve points. I assume these points
are actually on the "twist", and that some protocols can run faster by
ignoring this fact, but I don't know enough to safely do the same.
* multiply by 8 to force any points of order 2*L/4*L/8*L into the proper
order=L subgroup. It also forces points of order 1/2/4/8 into the
identity element (zero). (A short worked justification of this step
follows the list.)
* test against zero, which rejects points of order 1/2/4/8. I'm not sure
if I need to be worried about these: I suspect they're vanishingly
unlikely to happen. The full version has a large comment about my
probably-flawed beliefs of how common these points are, and I think
there are at most 8 of them.
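As the worked justification promised above for the multiply-by-8 step: the
full curve group has order 8*L, so for any on-curve point P we have
L*(8*P) = (8*L)*P = identity, i.e. 8*P always lands in a subgroup of order
dividing L. If P itself had order 1, 2, 4, or 8, then 8*P is the identity,
which is exactly the case the zero test catches.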
The rest of that file defines addition and multiplication functions
(using "extended" coordinates, XYTZ) and some object-oriented wrappers
to make application code easier/safer. In the process of testing, I
found a need for that ElementOfUnknownGroup, to exercise point math on
things like the identity and the point of order 2. Most applications
would stick to the correct-group-order Element class instead.
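For context, the addition those wrappers build on is the standard a=-1
twisted-Edwards extended-coordinates formula ("add-2008-hwcd-3" in the
Explicit-Formulas Database). A sketch of the general shape, using an
(X, Y, Z, T) ordering of my own here rather than the repo's exact code:

    # unified point addition in extended coordinates, where x = X/Z,
    # y = Y/Z, T = X*Y/Z, and all arithmetic is mod Q; d is the Ed25519
    # curve constant. Textbook formula, not necessarily verbatim basic.py.
    def extended_add(P1, P2):
        X1, Y1, Z1, T1 = P1
        X2, Y2, Z2, T2 = P2
        A = ((Y1 - X1) * (Y2 - X2)) % Q
        B = ((Y1 + X1) * (Y2 + X2)) % Q
        C = (T1 * 2 * d * T2) % Q
        D = (Z1 * 2 * Z2) % Q
        E, F, G, H = B - A, D - C, D + C, B + A
        return ((E*F) % Q, (G*H) % Q, (F*G) % Q, (E*H) % Q)  # (X3, Y3, Z3, T3)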
The repo includes a compatible implementation of Ed25519 signatures, a
demo DH-key-agreement routine (functionally equivalent to Curve25519 but
not interoperable with it), and a SPAKE2 implementation that I intend to
use in another project I'll be asking y'all to review shortly.
The bytes_to_element() function does full is-this-in-the-right-group
point validation, which slows things down by 2-4ms. It'd be nice to
remove that for the protocols that were designed to not need it
(Ed25519/Curve25519 clear the cofactor in other ways), but I don't
understand the issues well enough to feel confident removing it. Plus, I
don't want to leave a trap for later users of the library. Perhaps I'll
add a function named bytes_to_8_times_element() that multiplies instead
of validating.
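Something like this, perhaps (decode_affine() is a made-up name for "parse
32 bytes into an affine [x, y] pair"; the other helpers are the ones from
basic.py quoted above):

    # hypothetical sketch: accept any encoded on-curve point, skip the
    # expensive order-L validation, and clear the cofactor by multiplying
    # by 8 instead
    def bytes_to_8_times_element(b):
        Pa = decode_affine(b)            # hypothetical decoder, not in the repo
        if not isoncurve(Pa):
            raise ValueError("point not on curve")
        P = ElementOfUnknownGroup(xform_affine_to_extended(Pa))
        P8 = P.scalarmult(8)             # kills any order-1/2/4/8 component
        return Element(P8.XYTZ)

Any protocol using it would have to account for the extra factor of 8 on
the other side, which is what the SPAKE2 variant below is getting at.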
An interesting side discussion would be how/where to speed up SPAKE2 with
these same tricks. Maybe instead of:
  X=G*a+U*pw, Y=G*b+V*pw, Z1=(Y-V*pw)*a, Z2=(X-U*pw)*b
you'd do:
  X=G*a+U*pw, Y=G*b+V*pw, Z1=(Y*8-V*pw*8)*a, Z2=(X*8-U*pw*8)*b
(both sides then compute Z1 = Z2 = 8*a*b*G, so the shared value still
matches, just scaled by the cofactor).
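A quick toy check of that variant, using the additive group of integers mod
a prime as a stand-in for the curve group (purely illustrative, not part of
the library):

    import secrets
    p = 2**61 - 1                 # a Mersenne prime; "group" is integers mod p
    G, U, V = 2, 3, 5             # arbitrary stand-in elements
    a, b, pw = (secrets.randbelow(p) for _ in range(3))
    X = (a*G + pw*U) % p
    Y = (b*G + pw*V) % p
    Z1 = (a * (8*Y - 8*pw*V)) % p
    Z2 = (b * (8*X - 8*pw*U)) % p
    assert Z1 == Z2 == (8*a*b*G) % p   # both sides agree, scaled by 8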
Anyways, any and all feedback is welcome. Let me know what you think!
cheers,
-Brian