You can run the Resilio Sync docker container on each machine (Linux), or the regular Windows/macOS clients. I find the docker solution elegant and very simple to replicate on other Linux clients.
From my docker-compose.yml file:
resilio-sync:
  image: linuxserver/resilio-sync
  container_name: resilio-sync
  environment:
    - PUID=1000
    - PGID=1000
    - TZ=America/Toronto
    - UMASK_SET=022 # optional
  volumes:
    - /docker/resilio-sync/config:/config
    - /docker/resilio-sync/cache:/downloads
    - /home/pi/sync:/sync
  ports:
    - 8888:8888
    - 55555:55555
  restart: unless-stopped
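For context, this service block nests under `services:` in a complete Compose file; if you're starting from scratch, the skeleton looks roughly like this (the `version` line depends on your docker-compose release):

```yaml
version: "2.1"   # or "3"; matches your docker-compose version
services:
  resilio-sync:
    image: linuxserver/resilio-sync
    # ... rest of the service block from above ...
```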
Let me explain the volumes:
volumes:
  - /docker/resilio-sync/config:/config # main program dir; config is stored on the local disk, outside the container
  - /docker/resilio-sync/cache:/downloads # cache dir for partial downloads
  - /home/pi/sync:/sync # shared folders get stored here; create "myshare" and its path becomes "/home/pi/sync/myshare"
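To make the container-to-host mapping concrete, here is a small illustrative sketch (a hypothetical helper, not part of Resilio Sync or Docker) that resolves a path seen inside the container to its location on the Pi, given the mounts above:

```python
# Illustrative only: resolve a container-side path to its host-side
# location, given the host:container volume mounts from the compose file.
MOUNTS = {
    "/config": "/docker/resilio-sync/config",
    "/downloads": "/docker/resilio-sync/cache",
    "/sync": "/home/pi/sync",
}

def host_path(container_path: str) -> str:
    """Translate a path inside the container to the corresponding host path."""
    for cpath, hpath in MOUNTS.items():
        if container_path == cpath or container_path.startswith(cpath + "/"):
            return hpath + container_path[len(cpath):]
    raise ValueError(f"no volume mount covers {container_path}")

# A share named "myshare" lives at /sync/myshare inside the container,
# which maps to /home/pi/sync/myshare on the Pi:
print(host_path("/sync/myshare"))  # -> /home/pi/sync/myshare
```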
Why not run an rsync cron job to synchronize/mirror to your RAID array, or something such as rsnapshot? In my case, backup is accomplished using rsnapshot, which saves everything to a software RAID array daily. So all my backups are automated and go back up to four months at any point.
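As a sketch of what such an rsnapshot setup might look like (the paths and retention counts below are my assumptions, not from the original setup; note that fields in rsnapshot.conf must be separated by tabs, not spaces):

```
# /etc/rsnapshot.conf excerpt (hypothetical paths)
snapshot_root	/mnt/raid/rsnapshot/
retain	daily	7
retain	weekly	4
retain	monthly	4
backup	/home/pi/sync/	pi-sync/
```

Cron then runs `rsnapshot daily`, `rsnapshot weekly`, and `rsnapshot monthly` on their own schedules; with a monthly retention of 4, the oldest snapshot on the array is roughly four months old.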
Multiple users: you can do this with the Family key; you could then decide on the folder type that would be shared across your users. I have a single-user case with multiple Windows, Linux and Android clients, so my always-on Pi hosts a shared folder.
I am not sure about removing the source files once you have moved them to your RAID array... Perhaps someone else can comment on that?