tornado - Should Celery sub-processes show up on ps aux | less?


I'm using supervisord to run Celery on a Tornado server (note: not tcelery, since the server isn't using async features yet), with 3 workers: w1, w2, and w3, each with a concurrency of 10. They are managed via supervisor by adding the following to /etc/supervisord.conf:

[program:sendgrid_gateway_server]
command=sudo python main.py -o runserver
numprocs=1
directory=/home/ubuntu/sendgrid_gateway/sendgrid-gateway
stdout_logfile=/home/ubuntu/sendgrid_gateway/sendgrid-gateway/logs/server_log.txt
autostart=true
autorestart=true
user=root

[program:sendgrid_gateway_server_w1]
command=celery worker -A tasks --loglevel=info --concurrency=10 -n w1
numprocs=1
directory=/home/ubuntu/sendgrid_gateway/sendgrid-gateway
stdout_logfile=/home/ubuntu/sendgrid_gateway/sendgrid-gateway/logs/w1_log.txt
autostart=true
autorestart=true
user=root

[program:sendgrid_gateway_server_w2]
command=celery worker -A tasks --loglevel=info --concurrency=10 -n w2
numprocs=1
directory=/home/ubuntu/sendgrid_gateway/sendgrid-gateway
stdout_logfile=/home/ubuntu/sendgrid_gateway/sendgrid-gateway/logs/w2_log.txt
autostart=true
autorestart=true
user=root

[program:sendgrid_gateway_server_w3]
command=celery worker -A tasks --loglevel=info --concurrency=10 -n w3
numprocs=1
directory=/home/ubuntu/sendgrid_gateway/sendgrid-gateway
stdout_logfile=/home/ubuntu/sendgrid_gateway/sendgrid-gateway/logs/w3_log.txt
autostart=true
autorestart=true
user=root
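For context, tasks is the Celery app module that the workers load via -A tasks. A minimal sketch of its shape (the broker URL and the task itself are placeholders, not my real code):

    # tasks.py - minimal sketch; the broker URL and the task body are placeholders
    from celery import Celery

    app = Celery('tasks', broker='amqp://guest@localhost//')

    @app.task
    def send_mail(to, subject, body):
        # the real task hands the message off to SendGrid
        ...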

The first [program] block is the main Python application that runs Tornado. The next 3 are (obviously) the Celery workers. What worries me is that when I run "supervisorctl start all", 30+ processes show up in the list:

root 2547 0.0 0.0 40848 1672 ? S 13:40 0:00 sudo python main.py -o runserver
root 2548 0.2 1.9 176140 33020 ? Sl 13:40 0:04 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2549 0.0 2.1 196848 35632 ? S 13:40 0:01 python main.py -o runserver
root 2560 0.2 1.9 176140 33016 ? Sl 13:40 0:03 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2561 0.2 1.9 176140 33020 ? Sl 13:40 0:03 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2581 0.0 1.6 175144 28616 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2582 0.0 1.6 175144 28624 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2583 0.0 1.6 175144 28628 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2584 0.0 1.6 175144 28628 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2585 0.0 1.6 175144 28628 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2586 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2587 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2589 0.0 1.6 175144 28636 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2590 0.0 1.6 175144 28644 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2591 0.0 1.6 175144 28640 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w3
root 2595 0.0 1.6 175144 28612 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2596 0.0 1.6 175144 28624 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2597 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2598 0.0 1.6 175144 28620 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2599 0.0 1.6 175144 28620 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2600 0.0 1.6 175144 28620 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2601 0.0 1.6 175144 28624 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2602 0.0 1.6 175144 28636 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2603 0.0 1.6 175144 28628 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2604 0.0 1.6 175144 28636 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2605 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2608 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2609 0.0 1.6 175144 28628 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2610 0.0 1.6 175144 28640 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2611 0.0 1.6 175144 28640 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2612 0.0 1.6 175144 28632 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2613 0.0 1.6 175144 28648 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2614 0.0 1.6 175144 28644 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w1
root 2616 0.0 1.6 175144 28640 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2
root 2617 0.0 1.6 175144 28636 ? S 13:40 0:00 /usr/bin/python /usr/local/bin/celery worker -A tasks --loglevel=info --concurrency=10 -n w2

That's 30 Celery processes, plus a few others (not quite sure why those are there...). I was under the impression that unnecessary processes should terminate after a task has finished. Is that the case, or am I loony?
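For what it's worth, the parent/child relationship is visible with pstree (using 2561, the w1 parent PID from the listing above):

    pstree -p 2561

which shows the w1 worker with its 10 children hanging off it.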

Thanks in advance.

Yes, it should show those processes. Celery's default prefork pool forks --concurrency child processes per worker, so 3 workers with a concurrency of 10 gives exactly the 30 children you see; the children are long-lived and get reused between tasks rather than terminated after each one. You might want to use the stopasgroup=true and killasgroup=true options under the [program] configurations to stop the child processes all at once, or else they may keep running after you have run the stop [programname] command from supervisorctl.
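For example, the w1 block from the question would become the following; the two group options at the end are the only additions, everything else is unchanged:

    [program:sendgrid_gateway_server_w1]
    command=celery worker -A tasks --loglevel=info --concurrency=10 -n w1
    numprocs=1
    directory=/home/ubuntu/sendgrid_gateway/sendgrid-gateway
    stdout_logfile=/home/ubuntu/sendgrid_gateway/sendgrid-gateway/logs/w1_log.txt
    autostart=true
    autorestart=true
    user=root
    ; send stop/kill signals to the whole process group, so the 10 pool
    ; children go down together with the parent worker process
    stopasgroup=true
    killasgroup=true

With that in place, "supervisorctl stop sendgrid_gateway_server_w1" takes down the parent and all of its children in one shot instead of leaving orphans behind.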

