Comment by coreylane
16 hours ago
Rclone has been so useful over the years that I built a fully managed service on top of it, specifically for moving data between cloud storage providers: https://dataraven.io/
My goal is to smooth out some of the operational rough edges I've seen companies deal with when using the tool:
- Team workspaces with role-based access control
- Event notifications & webhooks – Alerts on transfer failure or resource changes via Slack, Teams, Discord, etc. (see the sketch after this list)
- Centralized log storage
- Vault integrations – Connect 1Password, Doppler, or Infisical for zero-knowledge credential handling (no more plain text files with credentials)
- 10 Gbps connected infrastructure (Pro tier) – High-throughput Linux systems for large transfers
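A note on the webhook alerts above: under the hood they're just JSON POSTs to whatever incoming-webhook URL you configure. A minimal sketch of a Slack-style failure alert (placeholder URL and a hypothetical helper, not DataRaven's actual code):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// notifyFailure posts a transfer-failure alert to a Slack-style incoming
// webhook. The URL is a placeholder; real incoming-webhook URLs are
// issued per workspace and channel.
func notifyFailure(webhookURL, job, reason string) error {
	payload, err := json.Marshal(map[string]string{
		"text": fmt.Sprintf("Transfer %q failed: %s", job, reason),
	})
	if err != nil {
		return err
	}
	resp, err := http.Post(webhookURL, "application/json", bytes.NewReader(payload))
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("webhook returned %s", resp.Status)
	}
	return nil
}

func main() {
	_ = notifyFailure("https://hooks.slack.com/services/XXX/YYY/ZZZ", "dropbox-to-s3 nightly", "quota exceeded")
}
```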
I hope that you sponsor the rclone project given that it’s the core of your business! I couldn’t find any indication online that you do give back to the project. I hope I’m wrong.
I'm certainly planning on sponsoring the project as soon as possible, but so far I have zero paying customers; hopefully that will change soon.
First thing that popped into my mind is that your free plan is crazy generous. Cut it out.
That's just creepy and hella presumptuous.
Yeah, I've seen this pop up in FOSS a lot lately and I don't like it.
I certainly hope you give back to projects you use too.
Gifts do not confer obligation. If you give me a screwdriver and I use it to run my electrical installation service business, I don’t owe you a payment.
This idea that one must “give back” after receiving a gift freely given is simply silly.
Yes but thank-yous are always good. Making sure the project sticks around is just smart.
If your neighbor kept baking and giving you cookies, to the point where you were wrapping and reselling them at the market, don't you think you should do something for them in return?
Me too!
How do you deal with how poorly rclone handles rate limits? It doesn't honor Dropbox's Retry-After header and just applies an exponential backoff that, in my migrations, has resulted in pauses of days.
I've adjusted threads and the various other controls rclone offers, but I still feel like I'm not seeing its true potential, because the second it hits a rate limit I can all but guarantee that job will have to be restarted with new settings.
> doesn't honor Dropbox's Retry-After header
That hasn't been true for more than 8 years now.
Source: https://github.com/rclone/rclone/blob/9abf9d38c0b80094302281...
And the PR adding it: https://github.com/rclone/rclone/pull/2622
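For anyone wondering what honoring Retry-After actually means in practice, here's a minimal sketch (an illustration of the pattern only, not rclone's actual code, which is in the linked source) of preferring the server's Retry-After value over a computed exponential backoff:

```go
package main

import (
	"fmt"
	"math"
	"net/http"
	"strconv"
	"time"
)

// retryDelay decides how long to sleep before the next attempt: if the
// 429/503 response carries a Retry-After header, trust the server;
// otherwise fall back to capped exponential backoff.
func retryDelay(resp *http.Response, attempt int) time.Duration {
	if resp != nil {
		if ra := resp.Header.Get("Retry-After"); ra != "" {
			// Retry-After can be a number of seconds...
			if secs, err := strconv.Atoi(ra); err == nil {
				return time.Duration(secs) * time.Second
			}
			// ...or an HTTP date.
			if t, err := http.ParseTime(ra); err == nil {
				return time.Until(t)
			}
		}
	}
	// Fallback: 1s, 2s, 4s, ... capped at 64s.
	backoff := time.Duration(math.Pow(2, float64(attempt))) * time.Second
	if backoff > 64*time.Second {
		backoff = 64 * time.Second
	}
	return backoff
}

func main() {
	// Simulate a 429 whose Retry-After asks for 30 seconds.
	resp := &http.Response{Header: http.Header{"Retry-After": []string{"30"}}}
	fmt.Println(retryDelay(nil, 3))  // 8s: pure exponential backoff
	fmt.Println(retryDelay(resp, 3)) // 30s: the server's value wins
}
```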
Interesting. I’ve been through 4 large transfers to Dropbox in the last 3 years and never once has it honored that header.
I honestly haven't used it with Dropbox before. Have you tried adjusting the --tpslimit 12 --tpslimit-burst 0 flags? Are you creating a dedicated API key for the transfer? Rate limits may vary between Plus and Advanced plans. forum.rclone.org is quite active; you may want to post more details there.
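For context, --tpslimit is just client-side pacing of API transactions. A rough sketch of the idea, using golang.org/x/time/rate rather than rclone's actual pacer (and a burst of 1, since that limiter needs at least one token):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate"
)

func main() {
	// Roughly what --tpslimit 12 asks for: at most 12 API calls per
	// second, smoothed by a token bucket so calls stay evenly spaced.
	limiter := rate.NewLimiter(rate.Limit(12), 1)
	ctx := context.Background()

	start := time.Now()
	for i := 0; i < 24; i++ {
		if err := limiter.Wait(ctx); err != nil {
			panic(err)
		}
		// Placeholder for the real Dropbox API call.
		fmt.Printf("call %2d at %v\n", i, time.Since(start).Round(10*time.Millisecond))
	}
}
```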
I have made a dedicated API application and I have adjusted the tps flags. I've scanned the forums a few times but have yet to inquire there.
I had been thinking about this service for a long time, especially something supporting transforms and indexing for backups. Great job spinning it up.
Thanks! 1. Are you thinking of something like the AWS Data Firehose transform feature, where pandas or something can run inline? https://docs.aws.amazon.com/firehose/latest/dev/data-transfo...
2. Do you have an example of what indexed backups would look like? I'm thinking of macOS Time Machine, where each backup only contains deltas from the last backup. Or am I completely off?
For transforms, the concept would be user-friendly processing, like downcoding video & photos, compressing PDFs & text files, and filtering out temporary or wasteful files. Something like Airtable for backups, with a GUI workflow editor offering common processing jobs.
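To make that concrete, here's a rough sketch of the kind of per-file step I mean (hypothetical helper, just dropping temp files and gzip-compressing text files; downcoding video or photos would slot in behind the same shape of function):

```go
package main

import (
	"compress/gzip"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
)

// transformFile is a hypothetical per-file step in a backup pipeline:
// temp files are dropped, plain-text files are gzip-compressed into
// dstDir, and everything else would be copied through unchanged.
func transformFile(srcPath, dstDir string) error {
	name := filepath.Base(srcPath)

	// Filter out wasteful files entirely.
	if strings.HasSuffix(name, ".tmp") || name == ".DS_Store" {
		fmt.Println("skipping", name)
		return nil
	}

	// Compress text files before they hit object storage.
	if strings.HasSuffix(name, ".txt") || strings.HasSuffix(name, ".log") {
		src, err := os.Open(srcPath)
		if err != nil {
			return err
		}
		defer src.Close()

		dst, err := os.Create(filepath.Join(dstDir, name+".gz"))
		if err != nil {
			return err
		}
		defer dst.Close()

		zw := gzip.NewWriter(dst)
		if _, err := io.Copy(zw, src); err != nil {
			return err
		}
		return zw.Close()
	}

	// Anything else: pass through untouched (copy omitted for brevity).
	fmt.Println("passing through", name)
	return nil
}

func main() {
	_ = transformFile("notes.txt", os.TempDir())
}
```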
For indexing, full-text indexing of backups to allow record retrieval based on keyword or date, e.g. “images in Los Angeles before 2010” or “tax records from 2015”. If possible, low-resolution thumbnails of the backed-up files to make retrieval easier.
I think #1 (transforms) would be more generally useful for cross-cloud applications, and #2 is catered more toward backups.
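And for #2, a bare-bones sketch of the keyword-plus-date retrieval I'm picturing (hypothetical index record; the location and thumbnail parts would need EXIF or content extraction feeding the keywords):

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// Entry is a hypothetical index record kept alongside each backed-up file.
type Entry struct {
	Path     string
	Keywords []string // e.g. pulled from EXIF, OCR, or file contents
	ModTime  time.Time
}

// query returns entries matching a keyword that were modified before the
// cutoff, i.e. roughly "tax records from before 2016".
func query(index []Entry, keyword string, before time.Time) []Entry {
	var out []Entry
	for _, e := range index {
		if !e.ModTime.Before(before) {
			continue
		}
		for _, k := range e.Keywords {
			if strings.EqualFold(k, keyword) {
				out = append(out, e)
				break
			}
		}
	}
	return out
}

func main() {
	index := []Entry{
		{Path: "backups/2015/1040.pdf", Keywords: []string{"tax", "irs"}, ModTime: time.Date(2015, 4, 10, 0, 0, 0, 0, time.UTC)},
		{Path: "backups/2019/receipt.jpg", Keywords: []string{"tax"}, ModTime: time.Date(2019, 3, 2, 0, 0, 0, 0, time.UTC)},
	}
	for _, e := range query(index, "tax", time.Date(2016, 1, 1, 0, 0, 0, 0, time.UTC)) {
		fmt.Println(e.Path) // only the 2015 record matches
	}
}
```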