ownCloud on shared hosting with a low PHP max execution time (60 s)

Hi everybody,

I'm trying to run ownCloud on my shared hosting. Unfortunately, the allowed PHP max execution time is only 60 s.
I'm unable to upload files larger than roughly 3-6 MB, depending on the connection speed, with different clients (iOS, Windows, macOS; latest stable versions).
The clients report "connection loss" or "connection timeout".
The web interface works with larger files (tested up to 35 MB).
To me it looks like PHP times out on the server side, which leads to the connection loss on the client side.

Is this a plausible explanation?
and
Is there anything I can do about it, or is a PHP max execution time of 60 s simply not enough for running ownCloud?

If so, it would be a good idea to mention that in the documentation and spare people some trial and error.

Thanks for any help,

Nils

P.S.
The ownCloud version is 9.1.4.

Your problem looks more like something is wrong with chunking. Uploaded files are split into chunks, which are uploaded separately and merged again on the server. The chunk size is about 5 MB, so it looks like the merging process is failing. You can try increasing the chunk size to find out whether this is related:
https://doc.owncloud.org/desktop/2.3/advancedusage.html

To get the chunking feature working, check your log files for errors. Sometimes you don't have sufficient permissions on the temp folder; in that case you can put your own temp folder into your document root (https://github.com/owncloud/core/blob/master/config/config.sample.php#L1175-L1183).
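The linked lines describe the tempdirectory option. A minimal sketch of what that could look like in config/config.php (the path is just a placeholder for a writable folder on your webspace):

<?php
// config/config.php (excerpt) - add 'tempdirectory' to the existing
// $CONFIG array; the path below is only an example.
$CONFIG = array (
  // ... your existing settings ...
  'tempdirectory' => '/path/to/docroot/owncloud-tmp',
);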

The critical part is updates. An environment with full control is preferable, and a shared environment like yours is certainly limited. However, you can use ownCloud on hosted environments; the installation just shouldn't get too big. Unfortunately, it is hard to give precise numbers. If you mainly use calendar and contacts and sync only a few files for a single user, it should be fine. That could also extend to a couple of users, but as usage increases, be prepared to move to a different environment at some point.

Hi Nils,

I had to deal with similar restrictions on my Strato webspace and a rather slow DSL connection.

I ran into interruptions during uploads with the ownCloud client.

Setting the environment variable OWNCLOUD_CHUNK_SIZE=<size in bytes> to reduce the chunk size to smaller parts helped me solve the problem.
See documentation: https://github.com/ajasja/client/blob/env_doc/doc/envvars.rst
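For example, on Linux you could start the client like this (1,048,576 bytes = 1 MiB; the binary name owncloud is the usual one, but it may differ on your platform):

# set a 1 MiB chunk size for this session, then start the sync client
export OWNCLOUD_CHUNK_SIZE=1048576
owncloud &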

The environment variable might behave similarly, or even identically, to the chunkSize setting in the client's config file...

Regards

Ralf

Thanks for the replies!

@tflidd
Chunking works fine. I tried it with various files up to 260 MB on a relatively fast VDSL connection (50 Mbit down / 10 Mbit up); the chunks are 10 MB each, and merging works flawlessly.
The chunk size just doesn't adapt to the network speed and the PHP execution time.

I'm quite sure my problems are caused by the max execution time not being taken into account by the chunking algorithm (if that's even possible in the current setup).
During the Easter holiday I was limited to a symmetric 1 Mbit connection. In my office I also have only 1 Mbit upload (don't ask! Sometimes I have to go home or use LTE for time-critical uploads).
There I am not able to upload files bigger than roughly 4.5 MB. In the iOS client the files just upload forever without ever appearing.
The desktop client creates countless "name.suffix-ID.part" files (visible via FTP).
At home, with the fast internet connection, everything works fine.

If I use my fast home network and limit the maximum upload bandwidth in the client, I get the same problems I'm having with slow internet connections.
(Upload limited to 120 kByte/s -> upload stops between 6.7 and 7.6 MB; upload limited to 60 kByte/s -> roughly 4.2 MB.)
That fits the 60 s limit quite well: at 120 kByte/s the client can transfer about 120 kByte/s x 60 s = 7.2 MB before the script is killed, and at 60 kByte/s about 3.6 MB, which matches the observed cut-off points.

It would help to be able to set a global maximum chunk size on the server side.
Even better, the clients could take the PHP max execution time into account in relation to the connection speed.
Would it be a good idea to request this, or is it such a narrow use case that it doesn't justify the work?
At the very least, the documentation could mention a minimum requirement for the PHP max execution time.

@rkneisle
I read about setting the chunk size on the client side.
Unfortunately I can't set the chunk size for the iOS client, which is one of my main use cases.
As a workaround I will probably use WebDAV (the hoster's real WebDAV, not the ownCloud PHP WebDAV, which has the same limitations) from the iOS side and set the chunk size in the desktop clients...

It's not only the execution time. I use "only" 30 s but can run uploads that take much longer. For large uploads, you can find a few options that limit the upload size: https://doc.owncloud.org/server/9.1/admin_manual/configuration_files/big_file_upload_configuration.html
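The options on that page are the standard PHP upload limits. A sketch of what raising them could look like in php.ini (the values are only examples; on shared hosting you may have to use a .user.ini instead, and the hoster may cap them anyway):

; php.ini / .user.ini (excerpt) - example values, adjust to your needs
upload_max_filesize = 512M
post_max_size = 512M
max_input_time = 3600
max_execution_time = 3600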

It's already complicated for people with full access to their system; on hosted environments it can be very difficult or even impossible to run ownCloud properly.

I too have uploads that take longer than 60 s. Chunking works fine when the connection is fast enough. The problem only appears when a single chunk cannot be uploaded within the limited timeframe.
Regarding the link about big files: I read that, but in my case the biggest file to expect will probably be around 200 MB.

Actually, I solved my problem in the following way:
In the desktop clients I set the maximum chunk size to 1 MB. For my use case that's convenient, and I shouldn't run into problems.
As the iOS client doesn't allow modifying the default chunk size, for big iOS uploads I set up an external folder with WebDAV access (not the ownCloud WebDAV implementation, as that would run into the same time constraints on my webspace) and connected that folder as an external folder within ownCloud.
Now I can upload to this folder via WebDAV, and ownCloud will scan it for changes...
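For reference, mounting such a local folder as external storage can also be done with occ, if the hoster lets you run it at all (a sketch with placeholder mount point and path; the external storage admin page in the web UI works just as well, and older servers may lack these occ commands):

# hypothetical example: mount a local folder and trigger a rescan
php occ files_external:create /WebdavUploads local null::null -c datadir=/path/on/webspace/uploads
php occ files:scan --all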

If I understand correctly, the desktop client (updated to 2.3.1; I realized I had an older one) adapts the chunk size towards a certain target transfer time, which is 60 s by default.
The client starts with a default chunk size of 10 MB, measures the time needed for the upload, and then adjusts the chunk size accordingly within specified limits (1 MB to 100 MB).
The problem was that in certain environments the first chunk was never transferred completely.
Setting the chunk size to 1 MB just sets the starting point for dynamic chunking, so it will adapt to the network from there.
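So the adaptation presumably boils down to something like this (a hypothetical sketch based on the behaviour described above, not the actual client code; all names are made up):

// Hypothetical sketch of target-duration chunk sizing (C++17).
// Scale the chunk so that one upload takes ~targetSec, clamped to limits.
#include <algorithm>
#include <cstdint>

int64_t nextChunkSize(int64_t currentSize, double lastUploadSec,
                      double targetSec = 60.0,
                      int64_t minSize = 1000000,      // 1 MB lower limit
                      int64_t maxSize = 100000000)    // 100 MB upper limit
{
    if (lastUploadSec <= 0.0)
        return currentSize;  // no measurement yet, keep the default
    double scaled = currentSize * (targetSec / lastUploadSec);
    return std::clamp(static_cast<int64_t>(scaled), minSize, maxSize);
}

With a 60 s server limit and a slow link, the initial 10 MB chunk already exceeds what can be transferred in time, which matches the failures above.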

And actually, with an up-to-date client it's easier to set the default chunk size:
https://doc.owncloud.org/desktop/2.3/advancedusage.html
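Per that page, the setting goes into the client's owncloud.cfg. A sketch (chunkSize is the documented key; the three dynamic-chunking keys below are assumptions and may only exist in newer clients):

[General]
chunkSize=1000000
minChunkSize=1000000
maxChunkSize=100000000
targetChunkUploadDuration=60000

chunkSize and the limits are in bytes; targetChunkUploadDuration would be the 60 s target expressed in milliseconds.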

One question:
Can anyone point me to the dynamic chunking algorithm in the source code?

Update:
I just confirmed with my hoster:
My uploads timing out is clearly connected to PHP scripts running into the max_execution_time limit.
That this only affects certain users in shared hosting environments is probably due to different server setups.

Depending on the combination of PHP installation and web server, the upload itself will or will not count towards the running time of the PHP script.
E.g. in my case the hoster uses a combination of Apache and PHP-FPM, which leads to the timer running during the upload.
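If the hoster permits per-directory overrides at all, a .user.ini in the ownCloud web root is the usual way to raise the limit under PHP-FPM/FastCGI (a sketch; many hosters enforce a hard cap, e.g. via request_terminate_timeout in the FPM pool, which this cannot override):

; .user.ini in the ownCloud web root - example values
max_execution_time = 3600
max_input_time = 3600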

The default from the docs is Apache + mod_php; PHP-FPM is a bit tricky in Apache setups. For more serious use, I would consider moving to an environment where you have more control.