I’ve read the FAQ item on upload problems and the docs on big file upload config.
Here is my setup and what I have done so far: CentOS 7, PHP 5.4.16, ownCloud 9.1.5, nginx + php-fpm.
/tmp is a 1G tmpfs, so I want uploads to go to /var/tmp instead; /var/tmp lives on the /var volume, which is its own virtual disk and much larger.
Changed the following in my /etc/php-fpm.d/owncloud.conf:
env[TMP] = /var/tmp
env[TMPDIR] = /var/tmp
env[TEMP] = /var/tmp
php_admin_value[open_basedir] = "/dev/urandom:/var/www/html/owncloud:/var/www/html/owncloud/config:/var/www/html/owncloud/apps:/tmp:/var/tmp"
php_value[upload_max_filesize] = 20G
php_value[post_max_size] = 20G
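For reference, this is roughly how I verified the pool config was actually picked up after editing it (a sketch: the probe filename and web root are just examples, and the ownCloud nginx rules may need an exception to serve an extra .php file):

```shell
# Syntax-check the FPM config, then restart so the pool re-reads it
php-fpm -t
systemctl restart php-fpm

# Temporary probe served through FPM, so it reflects the worker's settings
# (probe.php is an example name; delete it right after checking)
echo '<?php var_dump(ini_get("upload_max_filesize"), sys_get_temp_dir());' \
  > /var/www/html/owncloud/probe.php
curl -s http://localhost/probe.php
rm /var/www/html/owncloud/probe.php
```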
Created /etc/php.d/upload_dir.ini:
upload_tmp_dir=/var/tmp
sys_temp_dir=/var/tmp
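A quick way I checked that this ini file is being parsed at all (on CentOS the CLI and FPM share /etc/php.d, but the FPM worker can still differ, so phpinfo() through the web server is the real test):

```shell
# Confirm the ini file shows up among the parsed files,
# and that the CLI sees both temp-dir settings
php --ini | grep upload_dir
php -i | grep -E 'upload_tmp_dir|sys_temp_dir'
```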
Set the following in the ownCloud nginx config:
client_max_body_size 10G;
client_body_timeout 3600s;
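To see where the bytes actually land mid-upload, I also watched the temp directories and mounts while a large upload was in flight (paths here match my layout; note that nginx spools request bodies to its own client_body_temp_path first, whose default location can differ per build):

```shell
# Refresh every 2s during an upload: do temp files grow in /var/tmp,
# and which filesystem is actually filling up?
watch -n2 'ls -lh /var/tmp; df -h /var /tmp'
```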
But I don’t think any of that is the problem. phpinfo() does show /var/tmp as the new upload dir, yet when I upload a large file I can watch my RAM being exhausted until it’s all gone. So if I upload anything larger than my RAM can hold, I get an error saying “Den uppladdade filen var endast delvis uppladdad”, which translates to “The uploaded file was only partially uploaded”.
As soon as the large file finishes uploading, or fails to upload, all my RAM is given back to free.
I have also tried adding RAM; 1.7G was free when the error last came up.
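Since the memory comes straight back afterwards, I started checking whether the “exhausted” RAM is really in use or just reclaimable page cache; MemAvailable is the number that matters (a quick check, assuming a Linux kernel new enough to report it, which CentOS 7 is):

```shell
# If MemAvailable stays high while "free" drops during the upload,
# the kernel is only filling reclaimable page cache, not running out of RAM
grep -E 'MemFree|MemAvailable|^Cached' /proc/meminfo
```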
Why isn’t PHP using the disk? Using RAM for this seems very impractical.