
[cryptography] How big a speedup through storage?

On Fri, Jun 20, 2014 at 3:23 PM, Jeffrey Goldberg <[email protected]> wrote:
> On 2014-06-19, at 10:42 PM, Lodewijk andré de la porte <[email protected]> wrote:
>> With common algorithms, how much would a LOT of storage help?
> Well, with an unimaginable amount of storage it is possible to shave a few bits off of AES.
> As Bogdanov, Khovratovich, and Rechberger say in "Biclique Cryptanalysis of the Full AES" (ASIACRYPT 2011) [PDF at http://research.microsoft.com/en-us/projects/cryptanalysis/aesbc.pdf ]:
> "This approach for 8-round AES-128 yields a key recovery with computational complexity about 2^125.34, data complexity 2^88, memory complexity 2^8, and success probability 1."
> It's that 2^88 that requires a LOT of storage. I'm not sure if that 2^88 is in bits or AES blocks, but let's assume bits. Facebook is said to store about 2^62 bits, so we are looking at something 2^26 times larger than Facebook's data storage.
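A quick sanity check on those powers of two (assuming, as above, that the 2^88 data complexity is in bits, and taking the thread's ~2^62-bit figure for Facebook's storage):

```python
# Scale check for the quoted figures. Assumptions: the 2^88 data
# complexity is measured in bits, and Facebook holds ~2^62 bits.
data_complexity_bits = 2**88
facebook_bits = 2**62

ratio = data_complexity_bits // facebook_bits
assert ratio == 2**26          # about 67 million Facebooks

bytes_needed = data_complexity_bits // 8        # 2^85 bytes
print(f"ratio: {ratio:,}")                      # 67,108,864
print(f"storage: {bytes_needed // 2**80} YiB")  # 32 YiB
```

So even under the favorable "bits" reading, the attack's data complexity is 32 yobibytes, roughly 67 million times Facebook's estimated holdings.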

And that's only 8 rounds of the full 10; a lot more to go.

>> I know this one organization that seems to be building an ominous observation storage facility,
> Any (reliable) estimates on how big?

I believe the square footage is public; if not, guesstimate from parking spaces etc.
in JYA's sat photos. Then fill it to the brim with nothing but 6TB drives, less some
space for racks, aisles, power, and network, at say 50% better density than industry
best [1]. That's your physical upper bound.
$10M in drives at consumer pricing will get you a raw 177PB, or 236PB at double
the space and power. Or $1B for 17EB. Budget is an issue.
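The budget figures above are easy to reproduce. A sketch, assuming a consumer price of roughly $340 per 6TB drive (my assumption, chosen to match the numbers in this post), and ignoring racks, power, and redundancy:

```python
# Back-of-envelope raw capacity from a drive budget. The ~$340 price
# per 6 TB consumer drive is an assumption, not a quoted figure.
DRIVE_TB = 6
DRIVE_PRICE_USD = 340

def raw_capacity_pb(budget_usd):
    """Raw (unformatted, no-redundancy) capacity in decimal PB."""
    drives = budget_usd // DRIVE_PRICE_USD
    return drives * DRIVE_TB / 1000   # TB -> PB

print(f"$10M: {raw_capacity_pb(10_000_000):.0f} PB")          # ~176 PB
print(f"$1B:  {raw_capacity_pb(1_000_000_000)/1000:.1f} EB")  # ~17.6 EB
```

That lands within a percent of the 177PB and 17EB figures; a slightly lower per-drive price closes the gap.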

Give it a shot on paper, best estimate wins...

[1] If all you care about is storage, you plug drives into tiny custom
storage-fabric ASICs and present giant block devices at the end of each row or
room, not into bulky servers. Commodity CPUs have a 64-bit address space; ZFS
covers that. Or go custom access/compute on your data as well.