I installed an ownCloud server in a clean VM running Ubuntu 20.04 and enabled the Google Drive external storage feature.
Files stored directly on the server download without any problem via public link sharing, but unfortunately the same is not true for the ones from Google Drive. They sync correctly, but I have these issues:
The download is very slow: a 1 KB text file takes around 10 seconds before the download confirmation appears (the dialog asking whether to open or save the file).
Large files (several GB) cannot be downloaded at all.
For example, with a 1 GB file the browser keeps waiting (the tab icon spins) and after a while simply gives up (the icon stops) without downloading anything.
There is nothing about this in data/owncloud.log.
I don’t know whether this is a technical limitation or whether I need to tweak something in the configuration.
Does anybody know how to fix this?
ownCloud needs to download the file from Google Drive before sending it to you. Depending on the connectivity between ownCloud and Google Drive, this can take some time.
We also have to take into account the size of the file (the bigger the file, the longer it takes to reach ownCloud). For a 1 KB file this shouldn’t be a problem, but it’s something to check for big files.
Google documents have to be exported first: a conversion to “.odt” and other open formats so those files can be downloaded at all. This runs on Google’s side and delays the transfer from Google to ownCloud until the conversion is complete.
As said, all of this happens between the moment you click the download button in ownCloud and the moment ownCloud can start serving you the file (when the browser displays the download confirmation dialog).
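If you want to see where that time goes, curl’s timing variables separate the wait before the first byte (the fetch/export happening on the remote side) from the transfer itself. A minimal sketch — the share URL below is a placeholder, substitute your own public link:

```shell
# Placeholder URL -- replace with your own public share link.
# time_starttransfer = delay before the first byte arrives
#                      (the remote fetch/export described above)
# time_total         = complete download time
curl -o /dev/null -s \
  -w 'first byte after: %{time_starttransfer}s\ntotal: %{time_total}s\n' \
  'https://owncloud.example.com/index.php/s/SHARETOKEN/download'
```

A large gap between `time_starttransfer` and a short remaining transfer would point at the ownCloud-to-Google leg rather than your own connection.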
This is a known issue and points to a problem in the Google library. So far, the only workaround is to raise the PHP memory limit so the whole file fits into memory.
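For reference, that limit is the `memory_limit` directive in the php.ini used by your web server — the exact path varies by setup (on Ubuntu 20.04 with php-fpm it is typically `/etc/php/7.4/fpm/php.ini`; verify with `phpinfo()` which file is actually loaded):

```ini
; Raise memory_limit so the largest expected file fits in memory.
; The value below is only an example; -1 removes the limit entirely.
memory_limit = 2G
```

After changing it, restart the PHP service (e.g. `sudo systemctl restart php7.4-fpm`, assuming php-fpm) for the new value to take effect.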
I tried the Dropbox external storage and it’s definitely faster than Google, so it may indeed be a limitation of the Drive library.
Downloading the 1 KB file is now very fast and the 1 GB one takes around a minute to start, but I noticed that the limitations are still the ones you mentioned:
As for raising the PHP memory limit: I monitored RAM and disk space with htop and df while downloading several big files. RAM consumption barely moves, while the temporary files are written to disk. So I don’t think raising the PHP memory limit plays a relevant role in solving this issue.
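For anyone repeating this measurement, a one-shot snapshot taken repeatedly during the download shows the same thing — the temp path below is a common default and an assumption; your ownCloud may buffer elsewhere:

```shell
# Snapshot of RAM and temp-disk usage; rerun it (or wrap in `watch -n 1`)
# while the download is in progress.
free -m        # RAM usage: stays roughly flat during the download
df -h /tmp     # disk usage: grows while the temporary file is written
```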
The scenario is the following: a server storing huge files on external storage (Dropbox), allowing users to download them by public link.
Wouldn’t the best solution for this scenario be to host the files directly on the machine?
Anyway, if there is a suggestion or workaround that still allows using external storage in this scenario, it would be great to know.
I guess this is for the Dropbox connector. In general, none of the external storage connectors (except Google Drive) require raising the memory limit, regardless of the size of the file you’re downloading.
Having the files on the same host as ownCloud is always the best option. If you have reasons to keep the files elsewhere (more space, access from different places, etc.), the closer the two machines are, the better.
A common setup is a NAS on the same network holding the files, with ownCloud connected to it via SFTP or SMB. From within the network you can manage the files directly on the NAS, while from outside you access them through ownCloud’s external storage (the NAS itself isn’t expected to be reachable from outside).
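Such a mount can also be created from the command line with the occ tool instead of the web UI. A sketch only — the mount point, host, share path and credentials below are placeholders for your own setup, and the web server user may differ:

```shell
# Hypothetical example: mount a NAS over SFTP as external storage.
# "/nas", the host, root path and credentials are all placeholders.
sudo -u www-data php occ files_external:create /nas sftp password::password \
  -c host=192.168.1.50 -c root=/share -c user=ocuser -c password=secret

# Verify the mount was registered
sudo -u www-data php occ files_external:list
```

Run it from the ownCloud installation directory; `files_external:list` should then show the new mount with its configuration.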