File upload issue over 10MB

I have a file upload issue: every upload larger than 10 MB fails. The ownCloud version is 10.13.3.
I have already reviewed the official documentation on big file uploads and configured PHP to accept up to 500 MB:

    upload_max_filesize = "500M"
    post_max_size = "500M"

In case .htaccess overrides php.ini, I have set the limits there as well:

    php_value upload_max_filesize 513M
    php_value post_max_size 513M
    php_value memory_limit 512M
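One thing worth noting: `php_value` directives in `.htaccess` only take effect when PHP runs as an Apache module (mod_php); under PHP-FPM they are silently ignored, and the limits have to be set in `php.ini` or in the FPM pool configuration instead. A sketch of the pool variant, where the file path is an assumption that depends on your distribution and PHP version:

```ini
; /etc/php/7.4/fpm/pool.d/www.conf (path is an assumption)
; these override php.ini for this pool only
php_admin_value[upload_max_filesize] = 500M
php_admin_value[post_max_size] = 500M
php_admin_value[memory_limit] = 512M
```

After changing the pool configuration, the PHP-FPM service needs a restart for the new limits to apply.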

The reverse proxy in front of ownCloud is Traefik. Are there any known limitations? I have also tried configuring the buffering middleware, which did not help:

- "traefik.http.middlewares.limit.buffering.maxRequestBodyBytes=262144000"
- "traefik.http.middlewares.limit.buffering.memRequestBodyBytes=262144000"
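For anyone comparing setups: in Traefik, defining a buffering middleware by itself has no effect until it is attached to the router that serves the application. A sketch of the relevant docker-compose labels, where `owncloud` is an assumed router name:

```yaml
labels:
  - "traefik.http.middlewares.limit.buffering.maxRequestBodyBytes=262144000"
  - "traefik.http.middlewares.limit.buffering.memRequestBodyBytes=262144000"
  # the middleware only applies once the router references it:
  - "traefik.http.routers.owncloud.middlewares=limit"
```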

The following is the error message in data/owncloud.log:

{
  "reqId": "Rx9eJMYGUgtMNXWqPtZj",
  "level": 4,
  "time": "2023-12-19T11:22:09+00:00",
  "remoteAddr": "172.18.0.3",
  "user": "x@x.com",
  "app": "webdav",
  "method": "MKCOL",
  "url": "/remote.php/dav/uploads/x%40x.com/web-file-upload-4cb03c292f37d082a7f7fd8c4bb1fbda-1702984928968",
  "message": "Exception: Call to a member function getPath() on bool: {\"Exception\":\"Error\",\"Message\":\"Call to a member function getPath() on bool\",\"Code\":0,\"Trace\":\"#0 \\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Connector\\/Sabre\\/Directory.php(88): OCA\\DAV\\Connector\\Sabre\\Node->__construct()\n#1 \\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Upload\\/UploadHome.php(97): OCA\\DAV\\Connector\\Sabre\\Directory->__construct()\n#2 \\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Upload\\/UploadHome.php(52): OCA\\DAV\\Upload\\UploadHome->impl()\n#3 \\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Upload\\/UploadHome.php(62): OCA\\DAV\\Upload\\UploadHome->getChild()\n#4 \\/var\\/www\\/owncloud\\/lib\\/composer\\/sabre\\/dav\\/lib\\/DAV\\/Tree.php(111): OCA\\DAV\\Upload\\UploadHome->childExists()\n#5 \\/var\\/www\\/owncloud\\/lib\\/composer\\/sabre\\/dav\\/lib\\/DAVACL\\/Plugin.php(834): Sabre\\DAV\\Tree->nodeExists()\n#6 \\/var\\/www\\/owncloud\\/lib\\/composer\\/sabre\\/event\\/lib\\/WildcardEmitterTrait.php(89): Sabre\\DAVACL\\Plugin->beforeMethod()\n#7 \\/var\\/www\\/owncloud\\/lib\\/composer\\/sabre\\/dav\\/lib\\/DAV\\/Server.php(456): Sabre\\DAV\\Server->emit()\n#8 \\/var\\/www\\/owncloud\\/lib\\/composer\\/sabre\\/dav\\/lib\\/DAV\\/Server.php(253): Sabre\\DAV\\Server->invokeMethod()\n#9 \\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Server.php(348): Sabre\\DAV\\Server->start()\n#10 \\/var\\/www\\/owncloud\\/apps\\/dav\\/appinfo\\/v2\\/remote.php(31): OCA\\DAV\\Server->exec()\n#11 \\/var\\/www\\/owncloud\\/remote.php(165): require_once(/var/www/ownclo...)\n#12 {main}\",\"File\":\"\\/var\\/www\\/owncloud\\/apps\\/dav\\/lib\\/Connector\\/Sabre\\/Node.php\",\"Line\":84}"
}

The primary storage is Files Primary S3 on AWS. No size limits are configured on the AWS bucket.

Has anyone experienced the same issue?

The issue was a missing parameter in config.php. From the documentation I expected dav.chunk_base_dir to default to the configured data directory, but that is not the case. After explicitly setting it to a folder inside the data directory, uploads of bigger files work.
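For reference, the fix is a single entry in config.php; the path below is only an example and depends on where your data directory lives:

```php
<?php
// config/config.php (fragment); the chunk directory path is an example
$CONFIG = [
  // ... existing configuration ...
  'dav.chunk_base_dir' => '/var/www/owncloud/data/dav-chunks',
];
```

Make sure the directory exists and is writable by the web server user before retrying the upload.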
