Best practice for archiving files from ownCloud Server

Hello everyone,

I’m running ownCloud Server 10.1.0 on a dedicated LAMP server. I would like to write a script that automatically archives certain files from users (based on conditions I will set). My idea is a bash script that loops over all files in certain directories and, if the conditions are met, moves the files out of ownCloud and sends them to AWS for storage. Afterwards, it would run sudo -u www-data php occ files:scan --path="user_id/files/path" in each directory where files were moved, so that ownCloud rescans and stops showing users files that no longer exist.

Does this seem like the right way of achieving what I want? Do you have any ideas on how I could do it differently?

All ideas welcome :slight_smile:

Thanks a lot!

Marc

So I ended up doing what I said:
- writing a shell script that loops over certain folders of a user inside ownCloud's main data folder (i.e. /owncloud/data/user_id/files/folder/)
- it sends the matching folders to an AWS S3 bucket
- afterwards, it runs sudo -u www-data php occ files:scan --path="user_id/files/path" on each folder where files were moved (so that ownCloud stops showing users the files that were moved)
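For anyone landing here later, a minimal sketch of such a script, assuming a hypothetical user id, folder, S3 bucket, and an age-based condition (all placeholders for your own rules), could look like this. DRY_RUN=1 (the default) only prints the commands instead of running them:

```shell
#!/usr/bin/env bash
# Sketch of the archiving script described above. Bucket name, user id,
# folder, and age cutoff are placeholders (assumptions).
# DRY_RUN=1 (default) prints each command instead of executing it.
set -u

archive_old_files() {
  local data_dir="$1" user_id="$2" subdir="$3" bucket="$4" max_age_days="$5"
  local dry_run="${DRY_RUN:-1}" f
  # Find files under the user's folder older than the cutoff.
  while IFS= read -r -d '' f; do
    if [ "$dry_run" = "1" ]; then
      echo "would run: aws s3 cp $f $bucket/${f#"$data_dir"/}"
      echo "would run: rm $f"
    else
      aws s3 cp "$f" "$bucket/${f#"$data_dir"/}" && rm -- "$f"
    fi
  done < <(find "$data_dir/$user_id/$subdir" -type f -mtime +"$max_age_days" -print0)

  # Rescan only the affected path so ownCloud stops listing the moved files.
  if [ "$dry_run" = "1" ]; then
    echo "would run: sudo -u www-data php occ files:scan --path=$user_id/$subdir"
  else
    sudo -u www-data php occ files:scan --path="$user_id/$subdir"
  fi
}

# Example (dry run, hypothetical names):
# archive_old_files /var/www/owncloud/data alice files/old s3://archive-bucket 365
```

Note that the rescan is limited to the affected path, which is much cheaper than rescanning the whole account.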

And it works like a charm :slight_smile:

Hey,

from what I have read in the past, occ files:scan should only be used for fixing temporary issues, and not as a regular mechanism for adding files to or removing files from ownCloud.

Hey @tom42,

thanks for your reply. So do you see any other way of achieving this without using occ files:scan?

Maybe you can download the file via WebDAV and delete it afterwards. That way I think you don't need to run occ files:scan over all files at all, since ownCloud will notice itself that a file got deleted.
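A sketch of that approach with curl, assuming ownCloud 10's WebDAV endpoint (remote.php/dav/files/USER/...) and placeholder server name, credentials, and paths:

```shell
#!/usr/bin/env bash
# Sketch: fetch a file over ownCloud's WebDAV endpoint, then DELETE it on the
# server. Because the deletion goes through WebDAV, ownCloud updates its own
# file cache and no occ files:scan is needed. Server, user, password, and
# paths below are placeholders.
set -u

webdav_archive() {
  local base="$1" user="$2" pass="$3" remote_path="$4" local_dest="$5"
  local url="$base/remote.php/dav/files/$user/$remote_path"
  # -f turns HTTP errors into a non-zero exit code, so we never delete a
  # file we failed to download.
  curl -fsS -u "$user:$pass" "$url" -o "$local_dest" || return 1
  curl -fsS -u "$user:$pass" -X DELETE "$url"
}

# Example (hypothetical names):
# webdav_archive https://cloud.example.com alice secret docs/old.pdf /tmp/old.pdf
```

The download-then-delete ordering matters: the file is only removed from the server after it has been safely fetched.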

I will definitely have a look at that!
Thanks

Hi! Just joined today. I'm doing the inverse: I'm mirroring a Git repo to ownCloud, and the DavFS performance is problematic. Initially I tried to dump a 1 GB folder with about 3,000 files via rsync to DavFS, and after a hundred files or so DavFS hangs. This is with the DavFS mount point on the ownCloud server itself. In addition, unmounting the DavFS folder triggers a DavFS cache flush that can take minutes for thousands of files.

I wrote a Python script that mounts and unmounts the DavFS partition every 30 seconds to flush the cache in small chunks. It is instructive to tail the Apache logs of the ownCloud server and watch all the HTTP operations DavFS incurs per file. In any case, given these performance characteristics, I plan to run continuous periodic cron backups of small deltas, so that each backup pushes fewer files through DavFS.
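For reference, that periodic-remount workaround can be sketched in shell as well. The mount point, interval, and cycle count are assumptions, and the loop is bounded here just to keep the example self-contained:

```shell
#!/usr/bin/env bash
# Sketch of the periodic-remount workaround described above: unmounting a
# davfs2 mount blocks until its write-back cache has been flushed to the
# server, so remounting every 30 seconds flushes the cache in small chunks
# instead of one long flush at the end. Mount point, interval, and cycle
# count are assumptions.
set -u

flush_davfs_cache() {
  local mnt="$1"
  # umount returns only after davfs2 has pushed its cached writes upstream.
  umount "$mnt" && mount "$mnt"
}

remount_loop() {
  local mnt="$1" interval="$2" cycles="$3" i
  for ((i = 0; i < cycles; i++)); do
    sleep "$interval"
    flush_davfs_cache "$mnt" || return 1
  done
}

# Example (would normally run for the duration of the transfer):
# remount_loop /mnt/owncloud 30 120   # flush every 30 s for an hour
```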
