I have been an ownCloud user for a long time, but I am stuck on a requirement.
To transfer very large files (up to 100GB), I wanted to use upload-only public links.
The main goal is that nothing has to be logged in on the client side. I adapted the temporary-file and PHP configuration.
That works, but our problem is the loss of the client's connection: it is impossible to resume the upload.
Is this feature possible? On reconnection, OC reports that a file with that name is already present. Am I missing a configuration option?
Steps to reproduce
- Create a public link (upload only).
- Transfer a file and lose the connection.
- The transfer starts again from the beginning.
ownCloud version: (see ownCloud admin page)
Updated from an older ownCloud or fresh install:
Logs: no errors have been found.
There are a few settings that have to be set in PHP and ownCloud for big file uploads to work smoothly.

In php.ini:

```
max_execution_time = 6000
max_input_time = 6000
memory_limit = 1G ; make sure your system has enough memory available
sys_temp_dir = "/tmp"
upload_tmp_dir = "/tmp"
session.gc_maxlifetime = 6000
```

In ownCloud's config.php:

```
'dav.chunk_base_dir' => '/tmp/uploads',
```
But be aware that you'll need 200GB of free disk space to accommodate a 100GB upload: the data lives once in /tmp and then again in /path/to/owncloud/data, because the chunks in /tmp are only deleted once the file has been put together.
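As a quick sanity check for that rule of thumb, here is a small shell sketch (the 100GB figure and the `/tmp` path are just the example values from above; POSIX `df`/`awk` assumed):

```shell
#!/bin/sh
# Check whether /tmp alone could hold the chunks of a 100 GB upload.
need_kb=$((100 * 1024 * 1024))                     # 100 GB expressed in KiB
avail_kb=$(df -Pk /tmp | awk 'NR==2 {print $4}')   # available KiB on /tmp

echo "available in /tmp: ${avail_kb} KiB (need ${need_kb} KiB for the chunks)"
if [ "${avail_kb}" -ge "${need_kb}" ]; then
    echo "/tmp has room for the chunks"
else
    echo "/tmp is too small for the chunks"
fi
```

The same check would apply to the data directory, since the assembled file needs a second copy there.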
However, if the client loses the connection for too long, the upload is lost: the browser recognizes a timeout on the upload, and I haven't been able to get such an upload resumed once the timeout is hit. If you restart the upload, the already uploaded chunks are not recognized. The sync client, however, will mostly resume the upload (in my tests I lost a couple of GB of already uploaded data, which had to be re-uploaded).
You could open a feature request for upload resuming in the web UI, but I doubt that would be fruitful, as a new web UI is currently in development: https://github.com/owncloud/phoenix
Thank you for that answer.
I had seen these PHP settings, but I did not modify session.gc_maxlifetime, nor did I add 'dav.chunk_base_dir' => '/tmp/uploads'.
In my test I disconnected the virtual network interface of the virtual machine hosting OC, so the timeout was probably reached. On reconnection, OC offers to replace the existing file. The only solution would be to use the OC client or a WebDAV client.
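For the WebDAV route, a minimal sketch of what a scripted upload could look like (the server name, token, and file name are hypothetical placeholders; ownCloud exposes public link shares over WebDAV at `public.php/webdav`, with the share token acting as the username):

```shell
#!/bin/sh
# Hypothetical placeholders -- replace with your server and the token
# taken from your public link URL (https://SERVER/s/TOKEN).
SERVER="https://cloud.example.com"
TOKEN="AbCdEfGh12345"
FILE="big-backup.img"

# The share token is the username; the password is empty, or the link
# password if the share is protected. Print the command instead of
# running it, since the server above is only a placeholder:
echo curl -u "${TOKEN}:" -T "${FILE}" "${SERVER}/public.php/webdav/${FILE}"
```

Whether an interrupted transfer can be resumed this way still depends on the client; a plain `curl -T` restarts from the beginning.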
I did not know that a new UI was in development; that is wonderful news. I hope upload resumption is planned, as it would be a great feature. When uploading files from clients, it is difficult to ask them to install a client and authenticate for transfers that can last for days.