Daniel Bauer wrote on 25-04-2016 7:25:
Just to give feedback about my "homework":
- all cloud services that are easy to find (say, the top 25) use their own proprietary clients, which means you give full root access to some company. It's like giving the keys to your house and car to somebody you don't know. In fact, it means giving complete access to bank accounts, credit cards, tax declarations, bookkeeping, client lists and private stuff.
Even with "client-side encryption", the proprietary, closed-source program still has root access plus an internet connection, and that is such a high risk that I really wonder why anybody uses such programs.
Thank you for your feedback. That was what I was implying as well. Maybe we can one day design a modular framework in which the clients that have the internet access and the sync logic perform the upload, acting as plugins to another application that is just the GUI to such a backup system. This GUI would then use an encryption solution of your choice. The APIs would be such that only encrypted data (or unencrypted, if you prefer) is ever presented to the network component.

It would require a minimum feature set that takes care of increments and versioning, and perhaps a study of proprietary solutions to find the right feature set. In a zero-knowledge solution, increments are only possible if they are created prior to encryption and stored on the server as separate files. Most encryption schemes also produce different ciphertext each time the same file is encrypted, which prevents rsync's delta-transfer optimisations from working. I mention this because updating large archives is a challenge in this sense: any such scheme requires availability of the original archive, or at least information about its contents.

If you are truly going to sync with the remote host, the protocol the platform uses would need access to timestamps and file sizes at the very least, as well as filenames. Could be interesting to design it.
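To make the "increments created prior to encryption" idea concrete, here is a rough sketch using GNU tar's listed-incremental mode, with openssl standing in for whatever cipher tool you prefer. All paths and the passphrase are made up for illustration, not a real setup:

```shell
#!/bin/sh
# Increments are produced locally, *then* each one is encrypted as a
# separate file, so the server only ever receives ciphertext.
set -e
mkdir -p demo/data backups
echo "first" > demo/data/a.txt

# Level 0: full archive; the snapshot file records what was dumped.
tar -c -g backups/snapshot -f backups/level0.tar -C demo data

echo "second" > demo/data/b.txt

# Level 1: only what changed since the snapshot was last updated.
tar -c -g backups/snapshot -f backups/level1.tar -C demo data

# Encrypt each increment as its own file; only the *.enc files
# would ever be uploaded.
for f in backups/level0.tar backups/level1.tar; do
    openssl enc -aes-256-cbc -pbkdf2 -pass pass:demo-only \
        -in "$f" -out "$f.enc"
done

# Restoring means decrypting and extracting level 0 first, then every
# increment in order.
mkdir -p restore
tar -x -g /dev/null -f backups/level0.tar -C restore
tar -x -g /dev/null -f backups/level1.tar -C restore
```

Note that the restore side is exactly the "tools to recreate the original archive" problem: you have to fetch and replay every increment in sequence, so the snapshot bookkeeping has to be part of the framework.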
- The only reasonable solution (in my eyes) is to rent pure disk space somewhere (I found, for example, https://www.hetzner.de/gb/hosting/storagebox/bx40 ) and to upload self-encrypted files. I guess a program like https://cryptomator.org/ can make that task easier.
Yeah, that seems the only option at present.
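For what it's worth, the workflow could look roughly like this. openssl stands in for Cryptomator or your preferred tool, and "remote-box" is a local directory standing in for the rented box so the sketch is self-contained; against a real storage box the last step would be an rsync or scp over ssh (hostname and paths made up):

```shell
#!/bin/sh
# "Pure disk space + self-encrypted files": encrypt on the client,
# upload only ciphertext.
set -e
mkdir -p photos remote-box
echo "raw image data" > photos/shoot-001.raw

# Client-side encryption: the plaintext never leaves this machine.
for f in photos/*.raw; do
    openssl enc -aes-256-cbc -pbkdf2 -pass pass:demo-only \
        -in "$f" -out "$f.enc"
done

# "Upload" the ciphertext only; for a real box this would be e.g.
#   rsync -av photos/*.enc user@host:backup/
cp photos/*.enc remote-box/
```

The point is simply that the provider only ever stores opaque blobs; the keys and the plaintext stay with you.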
I didn't go deeper into the matter because in the end I decided not to use any online storage. Not because of a lack of offers, but because my files are too large. I was dreaming of an online solution to back up photo shootings, which are between 5 and 40 GB. If my math is right, with my current internet connection the upload of 1 GB would take about 215 minutes (the speed test says I have 0.62 Mbit/s upstream; Telefonica, Spain), and any upload would take far longer than my laptop's battery lasts. I admit, I took the long way round to come to this conclusion :-)
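For the record, the calculation, assuming 1 GB = 10^9 bytes:

```shell
# 1 GB = 8e9 bits; at 0.62 Mbit/s upstream the transfer takes:
awk 'BEGIN { printf "%.0f minutes\n", 8e9 / (0.62e6 * 60) }'
# prints "215 minutes"
```

So a 40 GB shooting would be roughly six days of continuous uploading.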
I've had the same problem. What makes it harder is that you cannot rsync-update an encrypted archive. For photo shoots that's maybe no problem; for real system backups or changing datasets, a big problem. It means you either maintain the backup locally as well, prior to encryption, and create differences/increments, or do the same by keeping only a summary contents file, such as tar -g's snapshot file. But then you also need the tools to recreate the original archive from the increments. Anyway, not exactly what I wanted to write, but I'm a bit too sick to be thinking of essential features right now ;-). Regards.
--
To unsubscribe, e-mail: opensuse+unsubscribe@opensuse.org
To contact the owner, e-mail: opensuse+owner@opensuse.org