I’ve been using it on a number of personal Linux machines for a while now, and have had zero problems… and it’s saved my skin several times.
No issues at all with memory or humungous cache files (I’m looking at you, Arq!)
My only nitpick is the ghastly UX of the web tool. Yes, it works, but… ick. You use it just often enough that you can figure it out, but then forget it again within a couple of days…
Actual homepage is https://duplicacy.com/
I also do my backups with it and it has seemed to work fine. I haven’t yet dared to try restoring anything, so it’s good to hear it has helped you.
When it comes to backup tools, Duplicacy seemed like the only free (for personal use) tool that didn’t have a history of surprise data corruptions. So it was an easy choice.
I can’t find any requirements for local resources, or comparisons to other tools to put it into perspective. I won’t name other tools, but I’ve seen local caches growing to tens of GBs and backup processes OOMing when there are too many files to scan/back up. This seems to be a common problem with Go: the GC doesn’t allow easy control over memory management and suffers from fragmentation. This is especially an issue on smaller machines (e.g. RPi, lower-tier VPS and cloud instances).
PS: I have to point out the author’s choice of license for the project. Good for them. One thing to improve would be making it consistent and clearer across both LICENSE.txt and the website.