php - nginx HTTP Error 500 when retrieving 1 million rows


Using nginx 1.7 and PHP (Laravel framework), retrieving 1+ million rows from a PostgreSQL database causes an error 500 to be displayed within 10 seconds:

HTTP Error 500 (Internal Server Error): An unexpected condition was encountered while the server was attempting to fulfill the request.

I've set the timeouts longer and reloaded nginx, but it's not working. Which settings prevent the timeout?

sites-enabled

location ~ \.php$ {
    fastcgi_pass unix:/var/run/php5-fpm.sock;
    fastcgi_index index.php;
    include fastcgi_params;
    fastcgi_read_timeout 6000;
    fastcgi_send_timeout 6000;
    client_body_timeout 6000;
    send_timeout 6000;
    proxy_read_timeout 6000;
}

To explain my cryptic comment, and to supply a possible answer:

This does not look like a timeout; a timeout should make nginx issue a 504 Gateway Timeout error. An HTTP error 500 means something actually went wrong, i.e.:

  • PHP (Laravel) ran out of memory
  • the PostgreSQL connection aborted
  • the FastCGI process gave up the ghost

So you have three places to check: the FastCGI logs, the Laravel/PHP error log, and possibly PostgreSQL (not likely, since PostgreSQL should not have issues dealing with millions of rows; at worst there might be memory issues).

Directly loading the page, without passing through nginx, ought to give a more informative error anyway, if you don't want to, or can't, check the logs.
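
If you go that route, here is a minimal sketch of forcing PHP to surface the underlying error (assuming you can edit the entry script; in Laravel the framework-level equivalent is setting APP_DEBUG=true in .env):

    <?php
    // Hypothetical debugging snippet: place at the very top of public/index.php
    // so that fatal errors (e.g. "Allowed memory size of N bytes exhausted")
    // are printed instead of a bare 500 page. Remove it once you have the error.
    ini_set('display_errors', '1');
    ini_set('display_startup_errors', '1');
    error_reporting(E_ALL);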

My money is on a memory error in the PHP layer, or failing that, on a resource (memory and/or CPU) exhaustion issue in FastCGI.
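
One hypothetical way to confirm that suspicion is to log the peak memory usage right after the big query runs and compare it against memory_limit:

    <?php
    // Assumed placement: directly after the existing 1M-row query in the controller.
    // If the request dies before this line is logged, or the peak is close to
    // memory_limit, the PHP layer is the culprit.
    error_log(sprintf('Peak memory: %.1f MB', memory_get_peak_usage(true) / 1048576));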

If so, you can solve the issue by allowing a higher memory footprint for PHP processes in php.ini; or better, redesign the process so it doesn't pull in all that data at once (you surely don't display 1 million rows; maybe you're doing processing in PHP that would be better done at the PostgreSQL level?).
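
A rough sketch of the chunked approach with Laravel's query builder (the table and column names here are made up, and the chunk size is arbitrary):

    <?php
    use Illuminate\Support\Facades\DB;

    // Instead of ->get(), which materialises every row in memory at once,
    // process the result set in fixed-size batches. chunk() pages through
    // the table, so only 1000 rows are held in memory at any moment.
    DB::table('measurements')              // hypothetical table name
        ->orderBy('id')                    // chunk() needs a stable ordering
        ->chunk(1000, function ($rows) {
            foreach ($rows as $row) {
                // aggregate, export or transform here rather than
                // sending all rows back to the browser
            }
        });

If you only need a quick fix, raising memory_limit in php.ini (for example memory_limit = 512M) buys some headroom, but it does not scale; Laravel's cursor() method, which streams rows one at a time, is another alternative when a per-batch callback does not fit.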

