- cross-posted to:
- [email protected]
- protonprivacy
- [email protected]
I could imagine a tool that makes cloud storage act like a remote hard drive, with sectors and everything, where these “sectors” are just small binary files.
You have software locally that is set up to track your files and work out how they map to the remote sectors. When a file is updated, or new ones are added, it shuffles things around efficiently to keep the number of remote updates to a minimum, then uploads or adds only the required sector files. This way a tiny edit to a 4 GB local file would only require a tiny upload to the server instead of resending a new encrypted copy of the entire 4 GB file.
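Roughly like this, as a minimal sketch. The 4 MiB sector size and the `upload_sector` function are made-up placeholders, not any real provider’s API; the “local DB” is just a JSON manifest mapping sector index to hash:

```python
import hashlib
import json
from pathlib import Path

SECTOR_SIZE = 4 * 1024 * 1024  # 4 MiB per remote "sector" file (arbitrary choice)

def upload_sector(name: str, data: bytes) -> None:
    """Placeholder for the real cloud upload call."""
    print(f"uploading {name} ({len(data)} bytes)")

def sync_file(path: Path, manifest_path: Path) -> None:
    # The manifest is the local DB: sector index -> hash of its last-synced contents.
    manifest = json.loads(manifest_path.read_text()) if manifest_path.exists() else {}
    with path.open("rb") as f:
        index = 0
        while sector := f.read(SECTOR_SIZE):
            digest = hashlib.sha256(sector).hexdigest()
            if manifest.get(str(index)) != digest:
                # Only sectors whose contents changed since the last sync go up.
                upload_sector(f"{path.name}.{index:08d}", sector)
                manifest[str(index)] = digest
            index += 1
    manifest_path.write_text(json.dumps(manifest))
```

With 4 MiB sectors, a one-byte edit to a 4 GB file re-uploads one sector instead of ~1,000.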
Not only are the little sector files all encrypted with a key known only to you, the file structure in this system doesn’t even make any sense to anyone but you.
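A sketch of that part too, assuming the third-party `cryptography` package. Each sector is encrypted with a key that never leaves your machine, and the remote names are HMACs of (path, sector index), so the server only sees random-looking blobs with random-looking names:

```python
import hashlib
import hmac
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # stored only on your machine

def encrypt_sector(plaintext: bytes) -> bytes:
    # Fresh random nonce per sector, prepended so each blob is self-contained.
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def remote_name(local_path: str, index: int) -> str:
    # The server sees only these opaque identifiers, never paths or indexes.
    # (A real tool would derive separate keys for encryption and naming, e.g. via HKDF.)
    return hmac.new(key, f"{local_path}:{index}".encode(), hashlib.sha256).hexdigest()
```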
However, if you lose your home PC and the file-structure DB, the cloud copy becomes absolutely useless, even if you had a backup of the key.
Something like this surely already exists. Maybe there are even cloud storage providers that offer hard-drive-like access to a block of data instead of being file-based.
EDIT: Turns out that’s what Proton Drive does. Kind of.
They say it’s client-side, but the hashes that control the ordering must be stored on the server, or else you couldn’t easily download the file on another device. And I wonder if it’s still efficient if you make an edit in the middle of the file. Does it need to send the full 4 GB all over again? Even having to resend 2 GB would be a lot.
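For what it’s worth, the standard answer to mid-file inserts is content-defined chunking (the rsync/Borg/FastCDC family) rather than fixed-size blocks: chunk boundaries are picked by a rolling hash of the data itself, so inserting a few bytes only disturbs the chunks around the edit instead of shifting every block after it. I don’t know whether Proton Drive does this; here is a toy gear-hash chunker just to demonstrate the effect:

```python
import hashlib
import random

random.seed(42)
GEAR = [random.getrandbits(32) for _ in range(256)]  # fixed random table, one entry per byte value
MASK = (1 << 13) - 1   # cut where the low 13 bits are zero -> ~8 KiB average chunks
MIN_CHUNK = 2048

def chunks(data: bytes):
    h, start = 0, 0
    for i, b in enumerate(data):
        # Gear hash: each left shift ages a byte out of the 32-bit state,
        # so the boundary decision depends only on the last ~32 bytes.
        h = ((h << 1) + GEAR[b]) & 0xFFFFFFFF
        if i - start >= MIN_CHUNK and (h & MASK) == 0:
            yield data[start : i + 1]
            start = i + 1
    if start < len(data):
        yield data[start:]

original = random.randbytes(1 << 20)                          # ~1 MiB of data (Python 3.9+)
edited = original[:500_000] + b"INSERT" + original[500_000:]  # 6 bytes added mid-file

old = {hashlib.sha256(c).hexdigest() for c in chunks(original)}
changed = [c for c in chunks(edited) if hashlib.sha256(c).hexdigest() not in old]
print(f"chunks to re-upload: {len(changed)} of ~{(1 << 20) // 8192}")
```

Only the chunk or two around the insertion point fails to match; everything downstream realigns because boundaries depend on content, not on byte offsets. With fixed-size blocks, the same 6-byte insert would invalidate every block after the edit, which for an edit near the start of a 4 GB file means resending nearly all of it.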