python - Django website performance slowdown


Alright, I want to explain a small Django issue I'm having trouble getting around.

The problem

I have a small website with a couple of pages that display lists of database records. The website is an internal render farm monitor for my company, and it has perhaps a dozen or two active connections at a time, never more than 50.

The problem is that I have three update services that cause a real performance hit when they are turned on.

The update services are each Python scripts that do the following (a rough sketch follows the list):

  1. Use urllib2 to make an HTTP request to a URL.
  2. Wait for the response.
  3. Print a success message with a timestamp to a log.
  4. Wait 10 seconds, and start again.
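
In Python 2 terms (since urllib2 is a Python 2 module), each service boils down to a loop like this; the URL and log message are placeholders, not the real code:

    import time
    import urllib2
    from datetime import datetime

    UPDATE_URL = "http://webgrid/updateTasks/"  # hypothetical update URL

    while True:
        response = urllib2.urlopen(UPDATE_URL)   # 1. make the HTTP request
        response.read()                          # 2. wait for (and consume) the response
        print "%s success" % datetime.now()      # 3. log a timestamped success message
        time.sleep(10)                           # 4. wait 10 seconds, then repeat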

The URLs I send requests to cause the Django website to poll an external service and read the new data into our Django database. The URLs look like this:

When these update services are turned on (especially updateTasks), it can take over 10 seconds for http://webgrid/ to start loading for normal users.
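
For reference, the views behind those URLs do something roughly like this; the view name, model, and external-service helper are made up for illustration, since the real code isn't shown here:

    # views.py -- hypothetical sketch of one update view. While this runs,
    # a web server worker is busy and can't serve normal page requests.
    from django.http import HttpResponse

    from myapp.models import Task                      # hypothetical model
    from myapp.services import poll_external_service   # hypothetical slow network call

    def update_tasks(request):
        for item in poll_external_service():            # talk to the external farm service
            Task.objects.update_or_create(              # write the fresh data into the database
                name=item["name"],
                defaults={"status": item["status"]},
            )
        return HttpResponse("OK")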

The setup

Django 1.8, deployed with gunicorn v18.

The main gunicorn service runs with these arguments (split into a list for easier reading):

    <path_to_python> <path_to_gunicorn> \
        -b localhost:80001 \
        -u farmer \
        -t 600 \
        -g <company_name> \
        --max-requests 10000 \
        -n bb_webgrid \
        -w 17 \
        -p /var/run/gunicorn_bb_webgrid.pid \
        -d \
        --log-file /xfs/gridengine/bbgrid_log/bb_webgrid.log \
        bb_webgrid.wsgi:application

Apache config for the site:

    <VirtualHost *:80>
        ServerName webgrid.<internal_company_url>
        ServerAlias webgrid

        SetEnv force-proxy-request-1.0 1

        DocumentRoot /xfs/gridengine/bb_webgrid/www
        CustomLog logs/webgrid_access.log combined
        ErrorLog logs/webgrid_error.log
        #LogLevel warn

        <Directory "/xfs/gridengine/bb_webgrid/www">
            AllowOverride
        </Directory>

        WSGIDaemonProcess webgrid processes=17 threads=17
        WSGIProcessGroup webgrid
    </VirtualHost>

Answer

This kind of thing shouldn't be done online: hitting a URL that routes to a view unnecessarily ties up the web server and stops it from doing its real job, which is responding to user requests.

Instead, do it out of band. A quick and easy way is to write a Django management command; that way you can call your model methods from a command-line script. You can then point a cron job, or whatever it is, at these commands, rather than calling a separate Python script that hits a URL on your site.
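
A minimal sketch of such a command, with made-up app, command, and helper names:

    # myapp/management/commands/update_tasks.py  (hypothetical app and command names)
    from django.core.management.base import BaseCommand

    from myapp.services import poll_and_store_tasks  # hypothetical helper doing the actual update


    class Command(BaseCommand):
        help = "Poll the external render farm service and store the new data."

        def handle(self, *args, **options):
            # Runs in its own process, so the web server isn't tied up.
            poll_and_store_tasks()
            self.stdout.write("tasks updated")

Cron (or whatever scheduler you use) would then run python manage.py update_tasks directly, instead of a script that requests a URL on the site.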

An alternative is to use Celery, which is a system for doing long-running asynchronous tasks. It has its own scheduling system, which could replace your cron jobs completely.
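
A minimal sketch of that approach, assuming a working broker and reusing the hypothetical helper from above:

    # tasks.py -- minimal Celery sketch; the broker URL and task body are assumptions
    from celery import Celery

    from myapp.services import poll_and_store_tasks  # hypothetical helper

    app = Celery("webgrid", broker="redis://localhost:6379/0")

    @app.task
    def update_tasks():
        # Runs in a Celery worker process, not in the web server.
        poll_and_store_tasks()

    # Schedule it every 10 seconds with Celery beat (Celery 4+ config name;
    # older versions use CELERYBEAT_SCHEDULE), replacing the polling scripts.
    app.conf.beat_schedule = {
        "update-tasks-every-10-seconds": {
            "task": "tasks.update_tasks",
            "schedule": 10.0,
        },
    }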

