Fails in the same way on multiple celery versions:
celery version 3.1.7 billiard version 3.3.0.23
celery version 3.1.25 billiard version 3.3.0.23
celery worker -l debug --autoscale=50,4 -Ofair -Q tower_scheduler,tower_broadcast_all,tower,localhost -n celery@localhost
Happens sporadically when running our nightly 8-hour integration tests on Ubuntu 14.04 and 16.04. Does not happen on RHEL 7 or CentOS 7. We are working on a smaller set of reproduction steps.
The celery master process and a particular worker both block on the same file descriptor, performing a read(). The worker has finished a job and is ready for more work. The parent is "locked up", blocking on the read() system call shared by the worker.
Restarting celery "fixes" the issue. More surgically, sending a SIGUSR1 to the master process "fixes" the issue by breaking it out of the read() system call. The child then returns from the read and seems to process a/the pending message. The master process does NOT try to read() from the PIPE again.
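For reference, a minimal sketch of that surgical workaround (equivalent to kill -USR1 <pid> from a shell; the pid is an assumed example value):
import os, signal
master_pid = 12345  # assumed example: pid of the stuck celery MainProcess
os.kill(master_pid, signal.SIGUSR1)  # interrupts the blocked read() in the master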
https://github.com/celery/celery/blob/3.1/celery/concurrency/asynpool.py#L220 is where we are hanging. This is a call to __read__, gotten from https://github.com/celery/billiard/blob/3.3/Modules/_billiard/multiprocessing.c#L202
This read is blocking and non-asynchronous. I don't really understand how this code isn't susceptible to a deadlock/infinite-blocking scenario: the read is called in such a way that it can block forever if the child dies.
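To illustrate the failure mode we suspect, here is a minimal, self-contained sketch (not celery code) showing that a blocking read() on a pipe never sees EOF while any inherited copy of the write end is still open in another process:
import os, select, time

r, w = os.pipe()  # both ends are blocking by default
pid = os.fork()
if pid == 0:
    # Child inherits both ends and "leaks" the write end by never
    # closing it, standing in for a stuck or dying worker.
    time.sleep(5)
    os._exit(0)

os.close(w)  # parent closes its own write end
# No EOF arrives: the child's inherited copy of w keeps the pipe's open
# file description alive, so a blocking read(r, 4) here would hang until
# the child exits. Poll with a timeout to demonstrate without hanging.
readable, _, _ = select.select([r], [], [], 1.0)
print('readable after 1s?', bool(readable))  # -> False
os.waitpid(pid, 0)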
We are now trying to recreate the problem with CELERY_RDBSIG="1" set so that we can jump into a remote debug session when the deadlock occurs.
Any advice would be helpful.
From an OS perspective, I can't reason about how this could occur; a write() is observed after the SIGUSR1.
I have the same issue here: the workers look like they're running, but they aren't processing messages.
I'm running on Python 3.6 and using Celery 4.1.0
@codeadict are you able to strace the two processes (worker and parent) to see if they are both stuck on the same file descriptor?
You can verify it's the same fd with sudo lsof | grep 53r, where 53 is the number of the file descriptor the processes are blocked on (the r suffix is lsof's read-mode flag).
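If lsof isn't available, a rough Python equivalent (a sketch; the pipe inode comes from ls -l /proc/<pid>/fd/<fd>, e.g. pipe:[548796]) is to scan /proc for fds pointing at the same pipe inode:
import glob
import os

def pipe_holders(inode):
    # Return (pid, fd) pairs for every fd in /proc that points at pipe:[inode].
    target = 'pipe:[%d]' % inode
    found = []
    for link in glob.glob('/proc/[0-9]*/fd/*'):
        try:
            if os.readlink(link) == target:
                _, _, pid, _, fd = link.split('/')
                found.append((int(pid), int(fd)))
        except OSError:
            continue  # process or fd disappeared while scanning
    return found

print(pipe_holders(548796))  # example inode number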
@codeadict as @chrismeyersfsu mentioned, you can strace the master celery process and see if it's stuck reading on a pipe:
$ sudo strace -p <pid-of-celery-master-process> -s 100
Process X attached
read(53
If you have python-dbg installed w/ Python symbols, you can also use gdb to see where the master is stuck:
$ gdb python -p <pid-of-master-process>
...
(gdb) py-list
216 # header
217
218 while Hr < 4:
219 try:
220 n = __read__(
>221 fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
https://github.com/celery/celery/blob/v3.1.17/celery/concurrency/asynpool.py#L219
Same problem.
celery==3.1.23, billiard==3.3.0.23
Workers started with options: --time-limit=7200 --autoscale=64,2 --maxtasksperchild=1
python 2.7.3
After a few hours celery 'freezes'.
I can see that the master and children wait forever in read() on pipes (on different pipes, incidentally).
So, this is master process:
root@vm1:~# strace -p 41840
Process 41840 attached - interrupt to quit
read(50, ^C <unfinished ...>
Process 41840 detached
root@vm1:~# sudo lsof | grep 50r
python 41840 root 50r FIFO 0,8 0t0 548796 pipe
python 46237 root 50r CHR 1,3 0t0 7604 /dev/null
python 46304 root 50r FIFO 0,8 0t0 548796 pipe
root@vm1:~# ls -lh /proc/41840/fd/50
lr-x------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/50 -> pipe:[548796]
root@vm1:~# (find /proc -type l | xargs ls -l | fgrep 'pipe:[548796]') 2>/dev/null
lr-x------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/50 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/51 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/41840/task/41840/fd/50 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/41840/task/41840/fd/51 -> pipe:[548796]
l-wx------ 1 root root 64 Oct 20 10:02 /proc/46237/fd/51 -> pipe:[548796]
l-wx------ 1 root root 64 Oct 20 10:07 /proc/46237/task/46237/fd/51 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46304/fd/50 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46304/fd/51 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/46304/task/46304/fd/50 -> pipe:[548796]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/46304/task/46304/fd/51 -> pipe:[548796]
And one of the children:
root@vm1:~# strace -p 46304
Process 46304 attached - interrupt to quit
read(16, ^C <unfinished ...>
Process 46304 detached
root@vm1:~# ls -lh /proc/46304/fd/16
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46304/fd/16 -> pipe:[482281]
root@vm1:~# (find /proc -type l | xargs ls -l | fgrep 'pipe:[482281]') 2>/dev/null
lr-x------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/16 -> pipe:[482281]
l-wx------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/17 -> pipe:[482281]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/41840/task/41840/fd/16 -> pipe:[482281]
l-wx------ 1 root root 64 Oct 20 10:07 /proc/41840/task/41840/fd/17 -> pipe:[482281]
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46237/fd/16 -> pipe:[482281]
l-wx------ 1 root root 64 Oct 20 10:02 /proc/46237/fd/17 -> pipe:[482281]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/46237/task/46237/fd/16 -> pipe:[482281]
l-wx------ 1 root root 64 Oct 20 10:07 /proc/46237/task/46237/fd/17 -> pipe:[482281]
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46304/fd/16 -> pipe:[482281]
lr-x------ 1 root root 64 Oct 20 10:07 /proc/46304/task/46304/fd/16 -> pipe:[482281]
root@vm1:~#
And this is the current 'frozen' frame in the master process:
Thread 0x7efc1c101700
('Frame: ', 63347776)
File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/opt/waf/python/lib/python2.7/site-packages/celery/__main__.py", line 54, in <module>
main()
File "/opt/waf/python/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
main()
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
cmd.execute_from_commandline(argv)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 793, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/base.py", line 311, in execute_from_commandline
return self.handle_argv(self.prog_name, argv[1:])
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 785, in handle_argv
return self.execute(command, argv)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 717, in execute
).run_from_argv(self.prog_name, argv[1:], command=argv[0])
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/worker.py", line 179, in run_from_argv
return self(*args, **options)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/base.py", line 274, in __call__
ret = self.run(*args, **kwargs)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/worker.py", line 212, in run
state_db=self.node_format(state_db, hostname), **kwargs
File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in start
self.blueprint.start(self)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 374, in start
return self.obj.start()
File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/consumer.py", line 279, in start
blueprint.start(self)
File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent)
File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/consumer.py", line 838, in start
c.loop(*c.loop_args())
File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/loops.py", line 76, in asynloop
next(loop)
File "/opt/waf/python/lib/python2.7/site-packages/kombu/async/hub.py", line 340, in create_loop
cb(*cbargs)
File "/opt/waf/python/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 279, in on_result_readable
next(it)
File "/opt/waf/python/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 221, in _recv_message
fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
()
@harlov @codeadict @chrismeyersfsu Did you find any solution or workaround?
No. We are upgrading to celery 4. We will run our same set of tests that invokes this error and see if we can recreate it on celery 4.
@korycins we removed --autoscale and --maxtasksperchild; all okay so far.
I guess the ongoing respawning of workers with autoscale or maxtasksperchild may cause this situation.
@chrismeyersfsu I have the same issue with celery 4.1, so I guess switching to v4.1 will not help you.
Previously I used only the --autoscale option. Workers hung from time to time (around once per week), but when I added maxtasksperchild=20, workers hung around once per hour. I thought the problem could be with maxtasksperchild, so I removed it.
I created a custom autoscaler to dynamically scale up/down based on free resources, and the same issue occurs, so I guess the problem is connected with respawning workers (as @harlov said).
can anyone verify the issue on top of latest master? not sure if it's already fixed with the latest commit.
@auvipy which commit(s) do you suspect might resolve this? I'm currently able to reproduce this issue on celery 4.1.0, billiard 3.5.0.3 from PyPI.
I am not sure, so could you please try the master branch? If it still exists after 4.2 is released, work will be carried out to fix it.
@auvipy we could give it a _try_, but it may be hard to say for sure - _personally_, we only seem to encounter this hang once every few _weeks_, so it's pretty hard to narrow down.
@ryanpetrello Try adding max-tasks-per-child and setting it to 1. http://docs.celeryproject.org/en/latest/userguide/workers.html#max-tasks-per-child-setting
This should increase the probability of the issue occurring.
Same issue; I cannot reproduce it right now. I thought it was a configuration issue, since a restart just un-hangs the worker, but it looks like it happens from time to time.
Just chiming in to say we also run into this issue. celery 4.1.0, kombu 4.1.0, billiard 3.5.0.3.
Command line: /usr/bin/python /usr/local/bin/celery worker --app myapp.celery -B --loglevel INFO --autoscale=5,1 --max-tasks-per-child=50 --max-memory-per-child=1000000 -O fair
With respect to blocking on file descriptors, we see the same as @harlov above: the master is in a deadlock with one of its workers.
Sending SIGUSR1 to the celery master "unsticks" the process and dumps the stacktrace below. What is suspicious is that in a function called "on_result_readable" it gets stuck on a read. This feels like a classic race condition: poll() returns readable, but the read() blocks. This can happen, which is why it is always recommended to put the FDs in non-blocking mode to make sure nothing goes wrong.
Relevant message on LKML: https://lkml.org/lkml/2011/6/18/105
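For context, the usual defensive pattern for this race (a sketch, not celery's actual code) is to keep the fd non-blocking and treat EAGAIN as "poll lied, try again later" instead of letting the event loop block:
import errno
import os

def drain(fd):
    # Read whatever is available from a non-blocking fd without ever blocking.
    chunks = []
    while True:
        try:
            chunk = os.read(fd, 4096)
        except OSError as exc:
            if exc.errno == errno.EAGAIN:
                break  # spurious readability from poll(); retry on the next event
            raise
        if not chunk:
            break  # genuine EOF: all writers closed
        chunks.append(chunk)
    return b''.join(chunks)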
Not sure whether it is easy to set the descriptors non-blocking. We have an environment with many celery masters and there are always a few stuck, so if it's a simple change we may be able to test it quickly.
MainThread [636/660]
=================================================
File "/usr/local/bin/celery", line 11, in <module>
sys.exit(main())
File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 14, in main
_main()
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 326, in main
cmd.execute_from_commandline(argv)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 488, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 281, in execute_from_commandline
return self.handle_argv(self.prog_name, argv[1:])
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 480, in handle_argv
return self.execute(command, argv)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 412, in execute
).run_from_argv(self.prog_name, argv[1:], command=argv[0])
File "/usr/local/lib/python2.7/dist-packages/celery/bin/worker.py", line 221, in run_from_argv
return self(*args, **options)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 244, in __call__
ret = self.run(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/celery/bin/worker.py", line 256, in run
worker.start()
File "/usr/local/lib/python2.7/dist-packages/celery/worker/worker.py", line 203, in start
self.blueprint.start(self)
File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 370, in start
return self.obj.start()
File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer/consumer.py", line 320, in start
blueprint.start(self)
File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer/consumer.py", line 596, in start
c.loop(*c.loop_args())
File "/usr/local/lib/python2.7/dist-packages/celery/worker/loops.py", line 88, in asynloop
next(loop)
File "/usr/local/lib/python2.7/dist-packages/kombu/async/hub.py", line 354, in create_loop
cb(*cbargs)
File "/usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py", line 292, in on_result_readable
next(it)
File "/usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py", line 235, in _recv_message
fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
File "/usr/local/lib/python2.7/dist-packages/celery/apps/worker.py", line 349, in cry_handler
safe_say(cry())
File "/usr/local/lib/python2.7/dist-packages/celery/utils/debug.py", line 193, in cry
traceback.print_stack(frame, file=out)
=================================================
LOCAL VARIABLES
=================================================
{'P': <functools.partial object at 0x7f29da0f1c58>,
'frame': <frame object at 0x37137b0>,
'out': <vine.five.WhateverIO object at 0x7f29da14e150>,
'sep': u'=================================================',
'sepchr': u'=',
'seplen': 49,
'thread': <_MainThread(MainThread, started 139818014390016)>,
'threading': <module 'threading' from '/usr/lib/python2.7/threading.pyc'>,
'tid': 139818014390016,
'tmap': {139818014390016: <_MainThread(MainThread, started 139818014390016)>}}
Just an update. We spent some time testing things and trying to reproduce it.
I verified the file descriptors normally used by celery, and they are all correctly marked non-blocking. So the question is: how does the code block if every descriptor is non-blocking?
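For anyone who wants to repeat that verification, a small sketch that lists any of the current process's descriptors still in blocking mode (run inside the master, e.g. from a debug session):
import fcntl
import os

def blocking_fds():
    # Return fds of this process that are NOT marked O_NONBLOCK.
    out = []
    for name in os.listdir('/proc/self/fd'):
        fd = int(name)
        try:
            flags = fcntl.fcntl(fd, fcntl.F_GETFL)
        except OSError:
            continue  # fd closed between listdir and fcntl
        if not flags & os.O_NONBLOCK:
            out.append(fd)
    return out

print(blocking_fds())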
The track I'm following now is that the epoll() object is getting confused about the descriptors it's supposed to be watching. If you strace celery you get lots of output like:
14659 14:32:56.665937 epoll_ctl(4, EPOLL_CTL_ADD, 43, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=43, u64=14879199376994992171}}) = -1 EEXIST (File exists)
14659 14:32:56.667084 epoll_ctl(4, EPOLL_CTL_ADD, 16, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=16, u64=14879199376994992144}}) = -1 EEXIST (File exists)
14659 14:32:56.667278 epoll_ctl(4, EPOLL_CTL_ADD, 26, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=26, u64=14879199376994992154}}) = -1 EEXIST (File exists)
14659 14:32:56.667440 epoll_ctl(4, EPOLL_CTL_ADD, 23, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=23, u64=14879199376994992151}}) = -1 EEXIST (File exists)
14659 14:32:56.667607 epoll_ctl(4, EPOLL_CTL_DEL, 37, 0x7ffd31c17f60) = -1 ENOENT (No such file or directory)
14659 14:32:56.667804 epoll_ctl(4, EPOLL_CTL_DEL, 38, 0x7ffd31c17f60) = -1 ENOENT (No such file or directory)
It's entirely plausible that at some point a file descriptor that once was used for one thing gets used for something else, the hub gets confused and calls the wrong method and boom!
Ok, so I've found the problem, and I have strace output capturing it failing. And it's very subtle and to do with the following completely non-obvious 'feature' of the epoll() interface. From the manpage:
Q6
Will closing a file descriptor cause it to be removed from all epoll sets automatically?
A6
Yes, but be aware of the following point. A file descriptor is a reference to an open file description (see open(2)). Whenever a file descriptor is duplicated via dup(2), dup2(2), fcntl(2) F_DUPFD, or fork(2), a new file descriptor referring to the same open file description is created. An open file description continues to exist until all file descriptors referring to it have been closed. A file descriptor is removed from an epoll set only after all the file descriptors referring to the underlying open file description have been closed (or before if the file descriptor is explicitly removed using epoll_ctl(2) EPOLL_CTL_DEL). This means that even after a file descriptor that is part of an epoll set has been closed, events may be reported for that file descriptor if other file descriptors referring to the same underlying file description remain open.
What is happening is that a file descriptor is being used by celery for communication, the file descriptor leaks to a worker, the master closes the file descriptor and then opens another for the sentinel _which is not marked non-blocking_. The child process dies, causing the file descriptor to be marked as readable; the master tries to read it and blocks.
Yes, epoll() returns readability for a file descriptor that is no longer open in this process.
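This is easy to demonstrate in isolation with a few lines of Python (a sketch of the manpage behaviour quoted above, not celery code):
import os
import select

r, w = os.pipe()
ep = select.epoll()
ep.register(r, select.EPOLLIN | select.EPOLLERR | select.EPOLLHUP)

r2 = os.dup(r)  # a second fd referencing the same open file description
os.close(r)     # close the registered fd; the description survives via r2

os.close(w)     # hang up the write end
# epoll still reports an event for the fd number we already closed:
print(ep.poll(1))  # -> [(r, select.EPOLLHUP)], with r no longer open here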
Here I'm pasting captured strace output for just fd 49 as evidence (apologies for the multi-line output, strace was monitoring multiple processes):
7417 15:04:00.001511 pipe( <unfinished ...>
7417 15:04:00.001646 <... pipe resumed> [49, 50]) = 0
7417 15:04:00.001727 fcntl(49, F_GETFL <unfinished ...>
7417 15:04:00.002091 <... fcntl resumed> ) = 0 (flags O_RDONLY)
7417 15:04:00.002165 fcntl(49, F_SETFL, O_RDONLY|O_NONBLOCK <unfinished ...>
7417 15:04:00.002222 <... fcntl resumed> ) = 0
7417 15:04:00.003052 fcntl(49, F_GETFL <unfinished ...>
7417 15:04:00.003195 <... fcntl resumed> ) = 0x800 (flags O_RDONLY|O_NONBLOCK)
...
7417 15:04:00.237131 fcntl(49, F_GETFL) = 0x800 (flags O_RDONLY|O_NONBLOCK)
7417 15:04:00.237363 epoll_ctl(4, EPOLL_CTL_ADD, 49, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=49, u64=12923050543437316145}} <unfinished ...>
7417 15:04:00.237478 <... epoll_ctl resumed> ) = 0
...
7417 15:04:00.274482 read(49, <unfinished ...>
7417 15:04:00.274558 <... read resumed> "\0\0\0\25", 4) = 4
7417 15:04:00.274670 read(49, <unfinished ...>
7417 15:04:00.274734 <... read resumed> "(I15\n(I9395\ntp0\ntp1\n.", 21) = 21
7417 15:04:00.274851 epoll_ctl(4, EPOLL_CTL_ADD, 49, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=49, u64=12923050543437316145}} <unfinished ...>
7417 15:04:00.274970 <... epoll_ctl resumed> ) = -1 EEXIST (File exists)
...
... from here fd 49 is used normally for a while, have not included all the output ...
...
7417 15:04:52.871646 epoll_wait(4, [{EPOLLIN, {u32=49, u64=12923050543437316145}}, {EPOLLIN, {u32=33, u64=12923050543437316129}}, {EPOLLOUT, {u32=48, u64=12923050543437316144}}, {EPOLLOUT, {u32=8, u64=12923050543437316104}}], 1023, 155) = 4
7417 15:04:52.871888 read(49, "\0\0\0\31", 4) = 4
7417 15:04:52.872079 read(49, "(I4\n(I9400\nI155\ntp0\ntp1\n.", 25) = 25
... here it uses a normal poll() once?
7417 15:04:52.891818 poll([{fd=49, events=POLLERR}], 1, 10 <unfinished ...>
7417 15:04:52.902239 <... poll resumed> ) = 0 (Timeout)
7417 15:04:52.913780 close(49 <unfinished ...>
7417 15:04:52.914020 <... close resumed> ) = 0
... here we create a new pipe for a child process
7417 15:04:53.148195 pipe([49, 50]) = 0
7417 15:04:53.148996 clone(child_stack=NULL, flags=CLONE_CHILD_CLEARTID|CLONE_CHILD_SETTID|SIGCHLD, child_tidptr=0x7f2dee7f79d0) = 9482
... the write end got closed, now we dup it. This is the sentinel fd, so it is not marked non-blocking
7417 15:04:53.155409 dup(49) = 50
7417 15:04:53.156030 epoll_ctl(4, EPOLL_CTL_ADD, 50, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=50, u64=12923050543437316146}}) = 0
... and here it breaks
7417 15:04:53.776974 epoll_wait(4, <unfinished ...>
7417 15:04:53.785786 <... epoll_wait resumed> [{EPOLLHUP, {u32=49, u64=12923050543437316145}}], 1023, 1878) = 1
7417 15:04:53.785876 --- SIGCHLD {si_signo=SIGCHLD, si_code=CLD_EXITED, si_pid=9481, si_uid=905, si_status=155, si_utime=4, si_stime=3} ---
7417 15:04:53.786073 read(49, <unfinished ...>
... discontinuity ...
7417 15:09:24.338109 <... read resumed> 0x23f4f10, 4) = ? ERESTARTSYS (To be restarted if SA_RESTART is set)
7417 15:09:24.345507 epoll_ctl(4, EPOLL_CTL_DEL, 49, 0x7ffeecbf7660) = -1 ENOENT (No such file or directory)
Note how epoll() admits that fd 49 is no longer in its fdset, but it returned the number nonetheless.
The way to fix this is to ensure that every file descriptor monitored by epoll() is closed in the worker. Currently workers close the file descriptors they don't want that correspond to themselves, but they don't touch the file descriptors they inherit that relate to _other_ workers. Workarounds are to mark the sentinel fd as non-blocking too, or simply to use poll() instead of epoll(). I'm testing the latter locally.
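The first workaround amounts to something like this (a sketch; where exactly the sentinel fd would get this treatment in billiard/celery is the open question):
import fcntl
import os

def set_nonblocking(fd):
    # Force O_NONBLOCK so a spurious readability event yields EAGAIN
    # instead of a read() that hangs the master forever.
    flags = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)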
Ok, as it turns out, using poll() doesn't solve the problem, as the actual issue is that a file descriptor is being closed while still registered with the hub. Once you realise that, it is easy to figure out exactly where it goes wrong, and the following hack fixes this issue for us. Without it we could reproduce the issue within an hour or two; with it we haven't been able to trigger it yet. The hack is not the correct fix; it really requires someone who knows the code better to write a good fix.
--- /usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py~ 2018-03-16 01:10:04.000000000 +0000
+++ /usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py 2018-03-16 12:08:58.058686003 +0000
@@ -735,6 +735,7 @@
except KeyError:
pass
self.on_inqueue_close = on_inqueue_close
+ self.hub_remove = hub_remove
def schedule_writes(ready_fds, total_write_count=[0]):
# Schedule write operation to ready file descriptor.
@@ -1241,6 +1242,7 @@
if queue:
for sock in (queue._reader, queue._writer):
if not sock.closed:
+ self.hub_remove(sock)
try:
sock.close()
except (IOError, OSError):
It looks like the write queue fd is properly removed from the hub (a few lines up), but the read queue fd is not.
No updates here?:(
@auvipy any thoughts on @kleptog's analysis above at https://github.com/celery/celery/issues/4185#issuecomment-373755206?
sorry, I don't have enough time right now to investigate the issue
@kleptog thanks for so much legwork on this.
It seems like on_inqueue_close _itself_ calls hub_remove(fd), specifically on L744, similar to your example; and on_inqueue_close is called in destroy_queues() on L1247, but only for queues[0].
So is this statement actually accurate?
The hack is not the correct fix, but it really requires someone who knows the code better to write a good fix.
I'm not knowledgeable enough about celery internals to know why the code is currently _only_ doing this for queues[0]. Might be worth a pull request w/ your diff, at least.
IIRC it was indeed the other queue that was being left out, so maybe it's a matter of doing an inqueue_close on the other queue? I won't have the opportunity to test it for a while though.
@auvipy I'm sorry to be a nag, and I can sympathize with the often thankless task of doing free work on an open source project 😞 - I work on several in my spare time, and another one for my day job.
Do you know if you or any other celery maintainer has had a chance to take a peek at this one? We've got a notable number of celery users in this thread who have been encountering this debilitating issue since it was reported last summer, and even a few who have pitched in and possibly identified the area of code that's the culprit. Is there any sort of ETA on when this bug might be looked at?
@ryanpetrello @kleptog I've had a brief look into this code, here are some thoughts - this is all new so nothing definitive I'm sorry.
queues in this context is an (inq, outq, synq) tuple, as returned by AsynPool.create_process_queues(). So queues[0] refers specifically to the inq, and it makes sense that a function called on_inqueue_close would be called only for that.
So right now, the only thing that's being passed to hub_remove() is the inq._writer, not its _reader, and not the outq or synq writers or readers, and ONLY when it's a key in fileno_to_inq whose value is proc.
I don't have much insight into this. But there are also fileno_to_outq and fileno_to_synq dicts. Should we be checking their contents before calling hub_remove() on those other handles? Or should we be unconditionally hub_remove-ing everything? Maybe we should remove the hub_remove() call from on_inqueue_close() and just call it in destroy_queues() as in the diff above?
@kleptog Thank you for the amazing work. We've had this issue for a while, and it had the unfortunate effect on our system that prefetched tasks were being dropped. I found the same root cause (workers and slaves hanging on FDs), but I couldn't find the deadlock condition. Applying your fix worked, and our stress test then effortlessly processed ~50k tasks without hanging.
@AlexHill said:
Should we be checking their contents before calling hub_remove() on those other handles? Or should we be unconditionally hub_removeing everything? Maybe we should remove the hub_remove() call from on_inqueue_close() and just call it in destroy_queues() as in the diff above?
Very simply, you must remove a file descriptor from the hub before closing it, because closing will not remove it from the hub for you. So your last suggestion is actually pretty good, since it is more "obviously correct" and I'm a big fan of that.
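In other words, the invariant is roughly this (an illustrative sketch; hub stands in for kombu's Hub, and the names here are not celery's actual helpers):
def close_monitored_fd(hub, sock):
    # Unregister from the event loop BEFORE closing: epoll keeps reporting
    # events for a closed fd while dup()/fork() copies of it survive elsewhere.
    try:
        hub.remove(sock)
    except (KeyError, ValueError):
        pass  # not registered; nothing to do
    try:
        sock.close()
    except (IOError, OSError):
        pass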
Using strace I'm seeing the following:
epoll_ctl(5, EPOLL_CTL_DEL, 16, 0x7ffe54f25410) = -1 ENOENT (No such file or directory)
epoll_ctl(5, EPOLL_CTL_DEL, 21, 0x7ffe54f25410) = -1 ENOENT (No such file or directory)
fcntl(26, F_GETFL) = 0x2 (flags O_RDWR)
fcntl(26, F_SETFL, O_RDWR|O_NONBLOCK) = 0
recvfrom(26, 0x7f42ca4409e4, 7, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)
Is this related to this issue?
Thanks
maybe it helps to diagnose the issue with the following link: http://memos.ml/
I'm having the same problem since upgrading to Ubuntu 16
I am facing the same issue on 4.2.1.
try configuring celery with the socket timeout option, as follows:
celery.config_from_object({'broker_transport_options': {'socket_timeout': 30}})
I had the same issue. However, after applying the patch posted above, the issue is resolved.
I also have this issue, is there a pull request with @kleptog 's fix or are people using a fork until this is resolved?
Our issue was resolved after adding tcp socket keepalive and timeout (using redis)
BROKER_TRANSPORT_OPTIONS = {'socket_timeout': 60,
'socket_connect_timeout': 120,
'socket_keepalive': True,
'socket_keepalive_options': {socket.TCP_KEEPIDLE: 60,
socket.TCP_KEEPINTVL: 60,
socket.TCP_KEEPCNT: 3}}
@linar-jether : what is this socket in socket.TCP_KEEPIDLE?
@nishith107 import socket
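Putting the two together, a complete version of that configuration might look like this (a sketch; the timeouts are the values quoted above, and the broker URL is an assumption):
import socket

from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')
app.conf.broker_transport_options = {
    'socket_timeout': 60,
    'socket_connect_timeout': 120,
    'socket_keepalive': True,
    'socket_keepalive_options': {
        socket.TCP_KEEPIDLE: 60,   # start probing after 60s of idleness
        socket.TCP_KEEPINTVL: 60,  # probe interval in seconds
        socket.TCP_KEEPCNT: 3,     # drop the connection after 3 failed probes
    },
}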
Changing the redis broker's config is almost certainly unrelated to the issue in this thread, which is about getting stuck on a blocking read on the master-child communication pipe.
Right, it's possible that tinkering w/ broker settings may introduce enough jitter to _avoid_ the underlying issue, but the master-child IPC hang is a distinct bug.
@auvipy this was originally reported one year ago today. Myself and others have left detailed information and analysis of where the code seems to hang, including strace output where we've narrowed it down to a few lines of code. Somebody has even posted a link to a patch (https://github.com/celery/celery/issues/4185#issuecomment-373755206) that people seem to be using and deploying (in production?) as a workaround. Many people involved in this comment thread still seem to encounter this regularly across a wide range of celery versions and broker setups.
I appreciate this is an open source project, and everyone involved is a volunteer, so I'm not meaning to endlessly nag. Do you know if _anyone_ has made an effort at investigating or diagnosing this critical issue?
It would be easier for us to fix the issue if one of us faced it in production. Since some of you used a workaround to get rid of this issue, why not send a PR with appropriate test cases? celery has issues which are 5-6 years old too :(
trust me, it's really tough to cope with all the issues popping up. please take charge and send a fix, since some of you are facing this in production, so it would be easier for you to test the proposed changes in a real environment.
@linar-jether Thanks, this fix worked for me :)
could you guys check this PR https://github.com/celery/celery/pull/4997? if this fixes your issue we might merge it.
I will try this branch. It will take a little while for me to know if it's sorted as I haven't found a way to force it to occur.
Any news on testing this bug fix?
I have not had the issue for over 2 weeks now. I'm not 100% sure yet, but I think the suggested PR has fixed it for me.
We just deployed the 4.2 branch w/ this patch applied as a custom wheel on 3 client machines... it's in production now. This bug was stalling out workers once a week for months. We'll see if it prevents the problem.
so I am merging the PR https://github.com/celery/celery/pull/4997 - please keep us posted with your production issues here, and if anybody has any unit/integration test suggestions, please proceed with a PR.
@ryanpetrello wondering, did you deploy and test this patch?
@AvnerCohen because of this and a few other issues, we ended up just solving our problem without using celery (we were able to accomplish what we needed with plain old kombu). It sounds like others have had luck with this patch, though!
So far we're not aware of any more problems since we deployed in production. I think it fixed it.
awesome thanks!
@auvipy When should we expect a formal release including this patch?
my target is before Christmas
Thanks @kleptog for some great debugging!
I've tried all the suggested fixes, up to running Celery and friends from master, but as soon as a worker encounters a ConnectionResetError it refuses to process any more tasks. Restarting RabbitMQ causes the stuck workers to process tasks again. Naturally this is not a good solution but it does point me in a direction to debug further.
We are using this patch but still encounter this issue once in a while: a worker successfully finishes its job, while the main process locks up waiting for results.
I'm using celery==4.2.1, python==3.6.7, Ubuntu 16.04, and Redis as the backend.
I used supervisor to start celery with command:
celery -A project worker -l info --autoscale=10,2
It still gets stuck sometimes.
@mikolaje what works for me is to set the tcp-keepalive in redis.conf to resolve the deadlock.
Did anyone find a good way to monitor for stuck workers?
We are still seeing this behavior with the code from master. I can't say whether it is improved or not, but my initial feeling is that, at least for our use case, we are still getting stuck workers.
The question is how to monitor the individual workers. Right now all we can do is monitor at the queue level (messages present and not picked up), but I would like to catch this much earlier by seeing when a worker is no longer taking work.
What I have noticed is that the celery "status" command returns OK regardless, and flower also shows that all is good.
I've tried all the suggested fixes, up to running Celery and friends from master, but as soon as a worker encounters a ConnectionResetError it refuses to process any more tasks. Restarting RabbitMQ causes the stuck workers to process tasks again. Naturally this is not a good solution but it does point me in a direction to debug further.
@hedleyroos, you may also have this problem #4895? We're running Celery v4.1.x to avoid that problem.
@kingb yes I do have that problem. I wanted to avoid downgrading to 4.1 and switched to Redis as the broker, but still encountered the same issue. Luckily Redis has a tcp-keepalive option that I set to 60 seconds. This means Redis will terminate what it considers stale connections, and this in turn leads to the worker breaking out of the deadlock and consuming tasks again.
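For reference, the corresponding directive in redis.conf is a single line (60 here matches the value described above):
tcp-keepalive 60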
I see the freeze with celery==4.2.1, billiard==3.5.0.5, kombu==4.2.2.post1, amqp==2.4.0, Python 3.6.8, RabbitMQ, MongoDB v4.
@tonal I'm using the same versions as you and sometimes I get this error. No retries or anything like that; tasks get stuck in RECEIVED status.
It doesn't happen all the time, but it happens every few hours or every few days, and I have to purge the messages and restart the service.
4.3? 4.4rc2?
No, celery==4.2.1, billiard==3.5.0.5, kombu==4.2.2.post1, amqp==2.4.0,
@auvipy still seeing it die after ~1 hour with 4.3. If it's any help, it stops succeeding/failing tasks and just keeps receiving them, then silently fails, while the celery process remains running.
/code # celery status
celery@celery-worker: OK
1 node online.
Can someone assure me that it is really solved in 4.3? I have to make the upgrade decision based on this.
Hi all,
This isn't fixed in 4.3. I'm using this version in a project, and my workers still get stuck (after showing "missing heartbeat" messages).
Has anyone circumvented this issue with 4.3? I noticed the code is significantly different from the workaround/patch above, so I guess there was an attempt to fix the issue, but it doesn't seem fixed.
Thanks!
I can confirm that I have the same issue in Celery 4.3 (Python 3.7.3).
could you try celery==4.4.0rc3?
@auvipy tried that as well, didn't help. just FYI I'm using gevent
Any chance this issue could be re-opened? It sounds like people are still encountering it on the most recent versions of celery.
I am facing same issue. Python2.7, Celery4.1.0. I use SQS broker. Celery processes tasks for sometime and it abruptly blocks after a while.
you should try the newer releases.
We haven't been able to reproduce this issue so someone will have to debug it for us and if they can, provide a fix for this issue.
We see this daily in production. We never managed to recreate it in a dev or debug-enabled environment.
Any suggestions on what sort of info we can provide that would help?
I suggest using something like python-hunter and/or python-manhole.
The problem is likely in the master process.
Alternatively, use strace to see which socket/pipe is being read or written and try to deduce from there.
I can come to your offices next week if you wish. Let's discuss this over the phone.
Thanks, will try this. Just as a general example, this is how the stuck processes look from a ps point of view:
username 14182 1 1 03:12 ? 00:02:28 [celeryd: workername@hostname:MainProcess] -active- (moshe)
username 15487 14182 0 03:15 ? 00:00:12 [celeryd: workername@hostname:ForkPoolWorker-5]
username 15766 14182 0 03:18 ? 00:00:12 [celeryd: workername@hostname:ForkPoolWorker-14]
username 16666 14182 0 03:52 ? 00:00:10 [celeryd: workername@hostname:ForkPoolWorker-43]
username 20253 14182 65 06:45 ? 00:00:24 [celeryd: workername@hostname:ForkPoolWorker-110]
Concurrency is set to 4, with "max-tasks-per-child" set, so that we get restarts every X messages.
As can clearly be seen, while 20253 is still active (and the worker counter keeps increasing), 15487 died early on and is stuck with:
$ strace -p 15487
strace: Process 15487 attached
futex(0x2b36980, FUTEX_WAIT_PRIVATE, 0, NULL
The master is still active, and messages are still being sent to the one worker that is still active.
Just leaving this here in case it helps someone else. It's not clear if I had the exact same issue, but my workers were also hanging after processing several hundred/thousand tasks. I was launching Celery as a service using:
celery multi start w1 w2 -A app.celery --time-limit=21600 --concurrency=4
After unsuccessfully troubleshooting workers getting stuck with that, I changed the command to a plain:
celery worker -A app.celery --time-limit=21600 --concurrency=4
And this fixed it for me! When I ran Celery this way manually, I could see tasks coming in and running, with tasks occasionally getting stuck (and then the worker being restarted properly). I think the former invocation was not properly restarting workers after a task failed. The other change I made was to set a lower time limit on those tasks that would be run hundreds/thousands of times, so when a worker inevitably got stuck, the main worker process would be able to kill it and restart a new one fairly quickly, before all of them got stuck.
I don't know enough of the celery internals to debug this issue effectively, but if someone can give some guidance or places to look I can investigate it on our end when the issue occurs.
I cannot recreate this issue on demand, but it does occur over time at one of our sites. From what I have been able to track down, the issue occurs when there is a network drop between the worker and the RabbitMQ server while the workers are under load. The failure essentially locks the main process out of taking any new jobs.
If the system is not under load, everything continues to work properly. It has to be a combination of load on the worker and a network drop at the same time. This has been confirmed in two ways: first, a secondary staging system with no load at the time sees the mingle failures but continues to function properly afterwards. Second, we run multiple workers per box, and only workers that are running jobs fail. If the load is light and not all main processes are processing jobs, then only some of the main processes will fail.
Our launch command:
celery worker -n XXX-%I.${CELERY_HOST_NAME} \
-A XXX.app --pidfile=/run/celery/XXX-%I.pid \
--autoscale=10,3 \
--max-tasks-per-child 1\
-Q YYYl \
--loglevel=INFO --logfile=/var/log/XXX/XXX-%I${CELERY_PROCESS_INDEX_SEPERATOR}.log --soft-time-limit=7200 --time-limit
Log output:
{"event": "mingle: sync with 54 nodes", "logger": "celery.worker.consumer.mingle", "level": "info", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "mingle: processing reply from celery@XXXX", "logger": "celery.worker.consumer.mingle", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
...
{"event": "mingle: processing reply from celery@XXX", "logger": "celery.worker.consumer.mingle", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "mingle: sync complete", "logger": "celery.worker.consumer.mingle", "level": "info", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Gossip", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "using channel_id: 2", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Channel open", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:25", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Tasks", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:25", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Control", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "using channel_id: 3", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Channel open", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting event loop", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Worker: Hub.register Autoscaler...", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Worker: Hub.register Pool...", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "basic.qos: prefetch_count->3", "logger": "kombu.common", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[ec8092ac-9b3c-4081-b11e-2d93aeac3bf1] ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', {'lang': 'py', 'task': 'XXX', 'id': 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"xxx}\", 'origin': 'gen49433@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[4e3acf3a-c0ec-4be3-8add-3e76d047ec09] ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', {'lang': 'py', 'task': 'XXX', 'id': '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"{'xxx}\", 'origin': 'gen49438@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[d5a6ada9-b9f5-435d-84cb-4dd6d7e21020] ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', {'lang': 'py', 'task': 'XXX', 'id': 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"{xxx}\", 'origin': 'gen49438@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "celery@XXX left", "logger": "celery.worker.consumer.gossip", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[4e3acf3a-c0ec-4be3-8add-3e76d047ec09] pid:111546", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[ec8092ac-9b3c-4081-b11e-2d93aeac3bf1] pid:111544", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[d5a6ada9-b9f5-435d-84cb-4dd6d7e21020] pid:111538", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "missed heartbeat from celery@XXX", "logger": "celery.worker.consumer.gossip", "level": "info", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
...
{"event": "missed heartbeat from celery@XXXX", "logger": "celery.worker.consumer.gossip", "level": "info", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec22d68>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec1c5f8>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec1c860>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "heartbeat_tick : for connection ebf9e86b69a347aaae81a92712b65eb9", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "heartbeat_tick : Prev sent/recv: None/None, now - 31/490, monotonic - 4090172.991839465, last_heartbeat_sent - 4090172.991832662, heartbeat int. - 60 for connection ebf9e86b69a347aaae81a92712b65eb9", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
py-bt stack trace of master process:
<built-in method read of module object at remote 0x7fb3e61f33b8>
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 64, in __read__
chunk = read(fd, size)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 241, in _recv_message
fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
<built-in method next of module object at remote 0x7fb3e627ac28>
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 298, in on_result_readable
next(it)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 362, in create_loop
cb(*cbargs)
<built-in method next of module object at remote 0x7fb3e627ac28>
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py", line 91, in asynloop
next(loop)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 596, in start
c.loop(*c.loop_args())
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 318, in start
blueprint.start(self)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 369, in start
return self.obj.start()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/worker.py", line 205, in start
self.blueprint.start(self)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 258, in run
worker.start()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 252, in __call__
ret = self.run(*args, **kwargs)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 223, in run_from_argv
return self(*args, **options)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 420, in execute
).run_from_argv(self.prog_name, argv[1:], command=argv[0])
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 488, in handle_argv
return self.execute(command, argv)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 298, in execute_from_commandline
return self.handle_argv(self.prog_name, argv[1:])
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 496, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 322, in main
cmd.execute_from_commandline(argv)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/__main__.py", line 16, in main
_main()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery", line 8, in <module>
sys.exit(main())
bt stack trace of master process
#0 0x00007fb3e5c2e6e0 in __read_nocancel () from /lib64/libpthread.so.0
#1 0x0000000000542a85 in _Py_read (fd=fd@entry=56, buf=0x7fb3cecbad70, count=count@entry=4) at Python/fileutils.c:1407
#2 0x000000000054b1a9 in os_read_impl (module=0x0, length=4, fd=56) at ./Modules/posixmodule.c:8043
#3 os_read (module=module@entry=<module at remote 0x7fb3e61f33b8>, args=args@entry=0x7fb3ceb7f3b8, nargs=<optimized out>)
at ./Modules/clinic/posixmodule.c.h:3625
#4 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x7fb3ceb7f3b8,
self=<module at remote 0x7fb3e61f33b8>, method=0x8c7ec0 <posix_methods+2848>) at Objects/call.c:651
#5 _PyCFunction_FastCallKeywords (func=<built-in method read of module object at remote 0x7fb3e61f33b8>,
args=args@entry=0x7fb3ceb7f3b8, nargs=nargs@entry=2, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#6 0x00000000004280bd in call_function (kwnames=0x0, oparg=2, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#7 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#8 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3ceb7f218, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 64, in __read__ (fd=56, buf=<_io.BytesIO at remote 0x7fb3ceca0f10>, size=4, read=<built-in method read of module object at remote 0x7fb3e61f33b8>)) at Python/ceval.c:547
#9 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf537420>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x347a2b8, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf3074f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='__read__',
qualname='__read__') at Python/ceval.c:3930
#10 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#11 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#12 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#13 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cebfb138) at Objects/genobject.c:221
#14 gen_iternext (gen=0x7fb3cebfb138) at Objects/genobject.c:542
#15 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x3481f10,
nargs=<optimized out>) at Python/bltinmodule.c:1426
#16 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x3481f10,
self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#17 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>, args=args@entry=0x3481f10,
nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#18 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#19 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#20 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x3481d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 298, in on_result_readable (fileno=56, it=<generator at remote 0x7fb3cebfb138>))
at Python/ceval.c:547
#21 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2ed300>,
globals=globals@entry={'__name__': 'celery.concurrency.asynpool', '__doc__': "Version of multiprocessing.Pool using Async I/O.\n\n.. note::\n\n This module will be moved soon, so don't use it directly.\n\nThis is a non-blocking version of :class:`multiprocessing.Pool`.\n\nThis code deals with three major challenges:\n\n#. Starting up child processes and keeping them running.\n#. Sending jobs to the processes and receiving results back.\n#. Safely shutting down this system.\n", '__package__': 'celery.concurrency', '__loader__': <SourceFileLoader(name='celery.concurrency.asynpool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py') at remote 0x7fb3cf2db828>, '__spec__': <ModuleSpec(name='celery.concurrency.asynpool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py', loader_state=None, submodule_search_locations=N...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7fb3cec173a8, argcount=argcount@entry=1, kwnames=kwnames@entry=0x0,
kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
closure=(<cell at remote 0x7fb3ceb5f9a8>, <cell at remote 0x7fb3ceb5f9d8>, <cell at remote 0x7fb3ceb5fa08>, <cell at remote 0x7fb3ceb5fa38>, <cell at remote 0x7fb3ceb5fb58>), name='on_result_readable',
qualname='ResultHandler._make_process_result.<locals>.on_result_readable') at Python/ceval.c:3930
#22 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cec260d0>, args=args@entry=0x7fb3cec173a8,
nargs=1, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#23 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cec260d0>, args=args@entry=(56,),
kwargs=kwargs@entry=0x0) at Objects/call.c:226
#24 0x0000000000422479 in do_call_core (kwdict=0x0, callargs=(56,), func=<function at remote 0x7fb3cec260d0>) at Python/ceval.c:4645
#25 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#26 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cf275de0) at Objects/genobject.c:221
#27 gen_iternext (gen=0x7fb3cf275de0) at Objects/genobject.c:542
#28 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
nargs=<optimized out>) at Python/bltinmodule.c:1426
#29 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x347abd0,
self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#30 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#31 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#32 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#33 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x347a9d8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py, line 91, in asynloop (obj=<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.l...(truncated)) at Python/ceval.c:547
#34 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2c6d20>,
globals=globals@entry={'__name__': 'celery.worker.loops', '__doc__': 'The consumers highly-optimized inner loop.', '__package__': 'celery.worker', '__loader__': <SourceFileLoader(name='celery.worker.loops', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py') at remote 0x7fb3cf2cd3c8>, '__spec__': <ModuleSpec(name='celery.worker.loops', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/__pycache__/loops.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2cd400>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', '__cached_...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7fb3cebfba38, argcount=argcount@entry=9, kwnames=kwnames@entry=0x0,
kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x7fb3cf2cd5d8, defcount=1, kwdefs=0x0,
closure=0x0, name='asynloop', qualname='asynloop') at Python/ceval.c:3930
#35 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cf2b1c80>, args=args@entry=0x7fb3cebfba38,
nargs=9, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#36 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cf2b1c80>,
args=args@entry=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
kwargs=kwargs@entry=0x0) at Objects/call.c:226
#37 0x0000000000422479 in do_call_core (kwdict=0x0,
callargs=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
func=<function at remote 0x7fb3cf2b1c80>) at Python/ceval.c:4645
#38 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#39 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#40 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#41 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#42 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#43 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#44 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#45 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#46 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#47 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#48 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#49 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#50 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#51 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#52 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#53 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#54 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#55 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#56 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#57 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#58 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#59 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#60 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#61 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#62 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#63 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x2c6b3a8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 258, in run (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at ...(truncated)) at Python/ceval.c:547
#64 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3d89706f0>,
globals=globals@entry={'__name__': 'celery.bin.worker', '__doc__': "Program used to start a Celery worker instance.\n\nThe :program:`celery worker` command (previously known as ``celeryd``)\n\n.. program:: celery worker\n\n.. seealso::\n\n See :ref:`preload-options`.\n\n.. cmdoption:: -c, --concurrency\n\n Number of child processes processing the queue. The default\n is the number of CPUs available on your system.\n\n.. cmdoption:: -P, --pool\n\n Pool implementation:\n\n prefork (default), eventlet, gevent or solo.\n\n.. cmdoption:: -n, --hostname\n\n Set custom hostname (e.g., 'w1@%%h'). Expands: %%h (hostname),\n %%n (name) and %%d, (domain).\n\n.. cmdoption:: -B, --beat\n\n Also run the `celery beat` periodic task scheduler. Please note that\n there must only be one instance of this service.\n\n .. note::\n\n ``-B`` is meant to be used for development purposes. For production\n environment, you need to start :program:`celery beat` separately.\n\n.. cmdoption:: -Q, --queues\n\n L...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7ffdbea26e60, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c6b830,
kwargs=kwargs@entry=0x2c6b838, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x7fb3d89cf8d0, defcount=9, kwdefs=0x0,
closure=0x0, name='run', qualname='worker.run') at Python/ceval.c:3930
#65 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d8979620>,
args=args@entry=0x7ffdbea26e60, nargs=nargs@entry=1,
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#66 0x000000000043e749 in _PyObject_FastCallDict (
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea26e60, callable=<function at remote 0x7fb3d8979620>) at Objects/call.c:98
#67 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3d8979620>, obj=<optimized out>, args=(),
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
at Objects/call.c:904
#68 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3e6238d08>, args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#69 0x0000000000422479 in do_call_core (
kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(), func=<method at remote 0x7fb3e6238d08>) at Python/ceval.c:4645
#70 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#71 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d5414a48, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 252, in __call__ (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread....(truncated)) at Python/ceval.c:547
#72 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3e6172300>,
globals=globals@entry={'__name__': 'celery.bin.base', '__doc__': 'Base command-line interface.', '__package__': 'celery.bin', '__loader__': <SourceFileLoader(name='celery.bin.base', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py') at remote 0x7fb3e6154748>, '__spec__': <ModuleSpec(name='celery.bin.base', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/__pycache__/base.cpython-37.pyc', _initializing=False) at remote 0x7fb3e6154780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7ffdbea271d0, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c69670,
kwargs=kwargs@entry=0x2c69678, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
closure=0x0, name='__call__', qualname='Command.__call__') at Python/ceval.c:3930
#73 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d89a1a60>,
args=args@entry=0x7ffdbea271d0, nargs=nargs@entry=1,
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#74 0x000000000043e749 in _PyObject_FastCallDict (
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea271d0, callable=<function at remote 0x7fb3d89a1a60>) at Objects/call.c:98
#75 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3d89a1a60>,
obj=obj@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:904
#76 0x0000000000494665 in slot_tp_call (
self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated), args=(),
kwds={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
at Objects/typeobject.c:6379
#77 0x000000000043c638 in PyObject_Call (
callable=callable@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#78 0x0000000000422479 in do_call_core (
kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(),
func=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated)) at Python/ceval.c:4645
#79 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#80 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d7a13808, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 223, in run_from_argv (prog_name='celery', argv=['-n', 'ae-worker-3.%h', '-A', 'ae_worker.app', '--pidfile=/run/celery/ae-worker-3.pid', '--autoscale=10,3', '--max-tasks-per-child', '1', '-Q', 'bgl', '--loglevel=INFO', '--logfile=/var/log/ae-worker/ae-worker-3%I.log', '--soft-time-limit=7200', '--time-limit=7300', '-Ofair', '--max-tasks-per-child', '1', '-E'], command='worker', options={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child':...(truncated)) at Python/ceval.c:547
#81 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8970300>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x7fb3d8a12f78, kwargs=kwargs@entry=0x7fb3d54143f0, kwcount=1, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d8973660, defcount=defcount@entry=2, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='run_from_argv', qualname='worker.run_from_argv') at Python/ceval.c:3930
#82 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#83 0x0000000000428c25 in call_function (kwnames=('command',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
at Python/ceval.c:4616
#84 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#85 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d5414248, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 420, in execute (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<...(truncated)) at Python/ceval.c:547
#86 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d89f7540>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x2957570, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d89a54f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='execute',
qualname='CeleryCommand.execute') at Python/ceval.c:3930
#87 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#88 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#89 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#90 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x29573b8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 488, in handle_argv (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_...(truncated)) at Python/ceval.c:547
#91 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a045d0>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x7fb3d897c1f0, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='handle_argv',
qualname='CeleryCommand.handle_argv') at Python/ceval.c:3930
#92 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#93 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#94 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#95 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d897c048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 298, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _f...(truncated)) at Python/ceval.c:547
#96 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3e6172420>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3e60f0f18, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3e6174bc0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='execute_from_commandline', qualname='Command.execute_from_commandline') at Python/ceval.c:3930
#97 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#98 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#99 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#100 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3e60f0d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 496, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, ...(truncated)) at Python/ceval.c:547
#101 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a06780>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x274a150, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d89a55a0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
closure=closure@entry=(<cell at remote 0x7fb3d895e5e8>,), name='execute_from_commandline',
qualname='CeleryCommand.execute_from_commandline') at Python/ceval.c:3930
#102 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#103 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#104 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#105 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#106 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#107 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#108 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#109 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=0, globals=<optimized out>)
at Objects/call.c:283
#110 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#111 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#112 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#113 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x26c8708, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery, line 8, in <module> ()) at Python/ceval.c:547
#114 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3de9fb300>,
globals=globals@entry=<unknown at remote 0x7fb3de9e4648>,
locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
args=args@entry=0x0, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0, kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0,
kwstep=kwstep@entry=2, defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name=name@entry=0x0, qualname=qualname@entry=0x0) at Python/ceval.c:3930
#115 0x00000000004ebbc0 in PyEval_EvalCodeEx (closure=0x0, kwdefs=0x0, defcount=0, defs=0x0, kwcount=0, kws=0x0, argcount=0,
args=0x0,
locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
globals=globals@entry=<unknown at remote 0x7fb3de9e4648>, _co=_co@entry=<code at remote 0x7fb3de9fb300>) at Python/ceval.c:3959
#116 PyEval_EvalCode (co=co@entry=<code at remote 0x7fb3de9fb300>,
globals=globals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
locals=locals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}) at Python/ceval.c:524
#117 0x00000000005265f8 in run_mod (arena=0x7fb3e6282078, flags=0x7ffdbea285f0,
locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
filename='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', mod=0x26c1cc8)
at Python/pythonrun.c:1035
#118 PyRun_FileExFlags (fp=0x26f05c0, filename_str=<optimized out>, start=<optimized out>,
globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}, closeit=1, flags=0x7ffdbea285f0) at Python/pythonrun.c:988
#119 0x00000000005267dd in PyRun_SimpleFileExFlags (fp=0x26f05c0, filename=<optimized out>, closeit=1, flags=0x7ffdbea285f0)
at Python/pythonrun.c:429
#120 0x000000000042fdfd in pymain_run_file (p_cf=0x7ffdbea285f0,
filename=0x2655d10 L"/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery",
fp=0x26f05c0) at Modules/main.c:427
#121 pymain_run_filename (cf=0x7ffdbea285f0, pymain=0x7ffdbea286d0) at Modules/main.c:1627
#122 pymain_run_python (pymain=0x7ffdbea286d0) at Modules/main.c:2877
#123 pymain_main (pymain=pymain@entry=0x7ffdbea286d0) at Modules/main.c:3038
#124 0x00000000004300cd in _Py_UnixMain (argc=<optimized out>, argv=<optimized out>) at Modules/main.c:3073
#125 0x00007fb3e516c495 in __libc_start_main () from /lib64/libc.so.6
#126 0x0000000000429df4 in _start ()
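Reading the master's trace: frame #20 shows it parked in on_result_readable (celery/concurrency/asynpool.py line 298, fileno=56), i.e. a blocking read() of a worker's result pipe from inside the hub loop. The hang itself is plain pipe semantics: a blocking read() only returns when data arrives, when every copy of the write end is closed, or when a signal interrupts it. A minimal sketch of that, plain os/fork and nothing Celery-specific (this script hangs by design):

import os
import signal

r, w = os.pipe()          # blocking fds, like the pool's result pipe
if os.fork() == 0:        # child: inherits the write end but never writes
    os.close(r)
    signal.pause()        # keep w open forever without writing anything
else:
    os.close(w)           # parent closes its own copy of the write end...
    os.read(r, 4)         # ...yet still blocks: the child's copy of w is open

So as long as any process holds a write end open but never writes, the reader sits in read() indefinitely; only EOF or a signal gets it out.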
py-bt stack trace of forked process:
Traceback (most recent call first):
<built-in method read of module object at remote 0x7fb3e61f33b8>
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 422, in _recv
chunk = read(handle, remaining)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 456, in _recv_bytes
buf = self._recv(4)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 243, in recv_bytes
buf = self._recv_bytes(maxlength)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/queues.py", line 355, in get_payload
return self._reader.recv_bytes()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 445, in _recv
return True, loads(get_payload())
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 473, in receive
ready, req = _receive(1.0)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 351, in workloop
req = wait_for_job()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 292, in __call__
sys.exit(self.workloop(pid=pid))
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 114, in run
self._target(*self._args, **self._kwargs)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 327, in _bootstrap
self.run()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py", line 79, in _launch
code = process_obj._bootstrap()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py", line 24, in __init__
self._launch(process_obj)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/context.py", line 333, in _Popen
return Popen(process_obj)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 124, in start
self._popen = self._Popen(self)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1158, in _create_worker_process
w.start()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 445, in _create_worker_process
return super(AsynPool, self)._create_worker_process(i)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1328, in _repopulate_pool
self._create_worker_process(self._avail_index())
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1343, in _maintain_pool
self._repopulate_pool(joined)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1351, in maintain_pool
self._maintain_pool()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 130, in _reschedules
return fun(*args, **kwargs)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 68, in __call__
return self.fun(*self.args, **self.kwargs)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 145, in fire_timers
entry()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 301, in create_loop
poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
<built-in method next of module object at remote 0x7fb3e627ac28>
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py", line 91, in asynloop
next(loop)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 596, in start
c.loop(*c.loop_args())
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 318, in start
blueprint.start(self)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 369, in start
return self.obj.start()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/worker.py", line 205, in start
self.blueprint.start(self)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 258, in run
worker.start()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 252, in __call__
ret = self.run(*args, **kwargs)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 223, in run_from_argv
return self(*args, **options)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 420, in execute
).run_from_argv(self.prog_name, argv[1:], command=argv[0])
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 488, in handle_argv
return self.execute(command, argv)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 298, in execute_from_commandline
return self.handle_argv(self.prog_name, argv[1:])
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 496, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 322, in main
cmd.execute_from_commandline(argv)
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/__main__.py", line 16, in main
_main()
File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery", line 8, in <module>
sys.exit(main())
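The forked worker is the mirror image: it is blocked in Connection._recv(), waiting for the 4-byte length header of the next job frame on its inqueue (handle=12 in the C frames below), inside workloop's wait_for_job(). As I read billiard/connection.py, traffic on these pipes is length-prefixed, roughly like the sketch below (recv_exact and recv_frame are my own names for illustration; the real _recv_bytes also handles an extended-size header):

import os
import struct

def recv_exact(fd, n):
    # Loop until exactly n bytes arrive; the os.read() here is the call
    # both traces are blocked in.
    buf = b''
    while len(buf) < n:
        chunk = os.read(fd, n - len(buf))
        if not chunk:
            raise EOFError('all write ends closed')
        buf += chunk
    return buf

def recv_frame(fd):
    size, = struct.unpack('!i', recv_exact(fd, 4))  # 4-byte length header
    return recv_exact(fd, size)                     # then the payload

There is no timeout on that header read, so the two traces taken together look like a classic cycle: the master blocks reading a result this worker will never send, while the worker blocks reading a job the blocked master will never write.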
bt stack trace of forked process:
#0 0x00007fb3e5c2e6e0 in __read_nocancel () from /lib64/libpthread.so.0
#1 0x0000000000542a85 in _Py_read (fd=fd@entry=12, buf=0x7fb3cf30b410, count=count@entry=4) at Python/fileutils.c:1407
#2 0x000000000054b1a9 in os_read_impl (module=0x0, length=4, fd=12) at ./Modules/posixmodule.c:8043
#3 os_read (module=module@entry=<module at remote 0x7fb3e61f33b8>, args=args@entry=0x3873df0, nargs=<optimized out>)
at ./Modules/clinic/posixmodule.c.h:3625
#4 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x3873df0,
self=<module at remote 0x7fb3e61f33b8>, method=0x8c7ec0 <posix_methods+2848>) at Objects/call.c:651
#5 _PyCFunction_FastCallKeywords (func=<built-in method read of module object at remote 0x7fb3e61f33b8>, args=args@entry=0x3873df0,
nargs=nargs@entry=2, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#6 0x00000000004280bd in call_function (kwnames=0x0, oparg=2, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#7 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#8 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x3873c38, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 422, in _recv (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, size=4, read=<built-in method read of module object at remote 0x7fb3e61f33b8>, buf=<_io.BytesIO at remote 0x7fb3cecccba0>, handle=12, remaining=4)) at Python/ceval.c:547
#9 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf53ff60>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec4f1e8, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf54e488, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='_recv',
qualname='Connection._recv') at Python/ceval.c:3930
#10 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#11 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#12 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#13 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3cec4f048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 456, in _recv_bytes (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, maxsize=None)) at Python/ceval.c:547
#14 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf5420c0>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec87a78, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf543108, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='_recv_bytes', qualname='Connection._recv_bytes') at Python/ceval.c:3930
#15 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#16 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#17 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#18 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3cec878e0, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 243, in recv_bytes (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, maxlength=None)) at Python/ceval.c:547
#19 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf53f300>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=1, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec9e9e8, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf538bf8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='recv_bytes', qualname='_ConnectionBase.recv_bytes') at Python/ceval.c:3930
#20 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#21 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#22 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#23 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#24 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#25 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#26 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#27 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3cec87728, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 445, in _recv (timeout=<float at remote 0x7fb3d01417c8>, loads=<function at remote 0x7fb3cf5d72f0>)) at Python/ceval.c:547
#28 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf598780>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=1, kwnames=0x0, kwargs=kwargs@entry=0x3346fa0, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3ced2c878, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
closure=closure@entry=(<cell at remote 0x7fb3cec534f8>,), name='_recv', qualname='Worker._make_recv_method.<locals>._recv')
at Python/ceval.c:3930
#29 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#30 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#31 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#32 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x3346df8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 473, in receive (debug=<function at remote 0x7fb3d970b0d0>)) at Python/ceval.c:547
#33 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf598a50>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=0, kwnames=0x0, kwargs=kwargs@entry=0x386d2c0, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cec222c8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
closure=closure@entry=(<cell at remote 0x7fb3ceb62708>, <cell at remote 0x7fb3ceb62fd8>), name='receive',
qualname='Worker._make_protected_receive.<locals>.receive') at Python/ceval.c:3930
#34 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#35 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#36 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#37 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x386d058, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 351, in workloop (debug=<function at remote 0x7fb3d970b0d0>, now=<built-in method monotonic of module object at remote 0x7fb3de980d68>, pid=111793, put=<method at remote 0x7fb3cecf55c8>, inqW_fd=13, synqW_fd=None, maxtasks=1, max_memory_per_child=0, prepare_result=<method at remote 0x7fb3cec86548>, wait_for_job=<function at remote 0x7fb3cec7b268>, wait_for_syn=<function at remote 0x7fb3cec3ff28>, completed=0)) at Python/ceval.c:547
#38 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf5985d0>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=1, kwnames=0x7fb3cf599060, kwargs=kwargs@entry=0x3490068, kwcount=1, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf536f90, defcount=defcount@entry=3, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='workloop',
qualname='Worker.workloop') at Python/ceval.c:3930
#39 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#40 0x0000000000428c25 in call_function (kwnames=('pid',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
at Python/ceval.c:4616
#41 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#42 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x348feb8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 292, in __call__ (self=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898...(truncated)) at Python/ceval.c:547
#43 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf598270>,
globals=globals@entry={'__name__': 'billiard.pool', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.pool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py') at remote 0x7fb3cf5916d8>, '__spec__': <ModuleSpec(name='billiard.pool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/pool.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf591710>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVE...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7ffdbea23e20, argcount=argcount@entry=1, kwnames=kwnames@entry=0x0,
kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0, closure=0x0,
name='__call__', qualname='Worker.__call__') at Python/ceval.c:3930
#44 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf53db70>,
args=args@entry=0x7ffdbea23e20, nargs=nargs@entry=1, kwargs=kwargs@entry={}) at Objects/call.c:376
#45 0x000000000043e749 in _PyObject_FastCallDict (kwargs={}, nargs=1, args=0x7ffdbea23e20,
callable=<function at remote 0x7fb3cf53db70>) at Objects/call.c:98
#46 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf53db70>,
obj=obj@entry=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated),
args=args@entry=(), kwargs=kwargs@entry={}) at Objects/call.c:904
#47 0x0000000000494665 in slot_tp_call (
self=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated), args=(), kwds={})
at Objects/typeobject.c:6379
#48 0x000000000043c638 in PyObject_Call (
callable=callable@entry=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated),
args=args@entry=(), kwargs=kwargs@entry={}) at Objects/call.c:245
#49 0x0000000000422479 in do_call_core (kwdict={}, callargs=(),
func=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated)) at Python/ceval.c:4645
#50 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#51 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#52 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#53 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#54 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#55 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#56 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#57 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#58 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#59 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#60 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#61 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#62 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#63 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf1f5b70, args=<optimized out>, args@entry=0x7ffdbea24610,
nargs=nargs@entry=2,
globals=globals@entry={'__name__': 'billiard.popen_fork', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.popen_fork', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py') at remote 0x7fb3cf201748>, '__spec__': <ModuleSpec(name='billiard.popen_fork', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/popen_fork.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf201780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated))
at Objects/call.c:283
#64 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf205c80>,
args=args@entry=0x7ffdbea24610, nargs=nargs@entry=2, kwargs=kwargs@entry=0x0) at Objects/call.c:322
#65 0x000000000043e749 in _PyObject_FastCallDict (kwargs=0x0, nargs=2, args=0x7ffdbea24610,
callable=<function at remote 0x7fb3cf205c80>) at Objects/call.c:98
#66 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf205c80>,
obj=obj@entry=<Popen(returncode=None, pid=0) at remote 0x7fb3cec22da0>,
args=args@entry=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated),
kwargs=kwargs@entry=0x0) at Objects/call.c:904
#67 0x000000000048f059 in slot_tp_init (self=<Popen(returncode=None, pid=0) at remote 0x7fb3cec22da0>,
args=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated), kwds=0x0)
at Objects/typeobject.c:6613
#68 0x000000000048a332 in type_call (type=<optimized out>, type@entry=0x336fdb8,
args=args@entry=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated),
kwds=kwds@entry=0x0) at Objects/typeobject.c:949
#69 0x000000000043b861 in _PyObject_FastCallKeywords (callable=callable@entry=<type at remote 0x336fdb8>, stack=<optimized out>,
nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:199
#70 0x0000000000424098 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4619
#71 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#72 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#73 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#74 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#75 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#76 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#77 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#78 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#79 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#80 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#81 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#82 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#83 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#84 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3cf1f0570, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 445, in _create_worker_process (self=<AsynPool(sched_strategy=4, synack=False, _queues={(<_SimpleQueue(_reader=<Connection(_handle=31, _readable=True, _writable=False) at remote 0x7fb3cecdecf8>, _writer=<Connection(_handle=32, _readable=False, _writable=True) at remote 0x7fb3cecde2b0>, _poll=<method at remote 0x7fb3cf23cec8>, _rlock=None, _wlock=None) at remote 0x7fb3ced47748>, <_SimpleQueue(_reader=<Connection(_handle=33, _readable=True, _writable=False) at remote 0x7fb3cecde2e8>, _writer=<Connection(_handle=34, _readable=False, _writable=True) at remote 0x7fb3cecde828>, _poll=<method at remote 0x7fb3ceb4a188>, _rlock=None, _wlock=None) at remote 0x7fb3cecde160>, None): <ForkProcess(_identity=(3731,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _pope...(truncated)) at Python/ceval.c:547
#85 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf2ed8a0>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3ced34988, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0,
closure=closure@entry=(<cell at remote 0x7fb3cf2e5eb8>,), name='_create_worker_process',
qualname='AsynPool._create_worker_process') at Python/ceval.c:3930
#86 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#87 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#88 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#89 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#90 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#91 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#92 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#93 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#94 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#95 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#96 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#97 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf5299c0, args=<optimized out>, args@entry=0x7ffdbea25400,
nargs=nargs@entry=1,
globals=globals@entry={'__name__': 'billiard.pool', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.pool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py') at remote 0x7fb3cf5916d8>, '__spec__': <ModuleSpec(name='billiard.pool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/pool.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf591710>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVE...(truncated))
at Objects/call.c:283
#98 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf2d61e0>,
args=args@entry=0x7ffdbea25400, nargs=nargs@entry=1, kwargs=kwargs@entry={}) at Objects/call.c:322
#99 0x000000000043e749 in _PyObject_FastCallDict (kwargs={}, nargs=1, args=0x7ffdbea25400,
callable=<function at remote 0x7fb3cf2d61e0>) at Objects/call.c:98
#100 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3cf2d61e0>, obj=<optimized out>, args=(), kwargs={})
at Objects/call.c:904
#101 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3cecd5048>, args=args@entry=(),
kwargs=kwargs@entry={}) at Objects/call.c:245
#102 0x0000000000422479 in do_call_core (kwdict={}, callargs=(), func=<method at remote 0x7fb3cecd5048>) at Python/ceval.c:4645
#103 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#104 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x347e4f8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py, line 130, in _reschedules (args=(), kwargs={}, last=None, now=<float at remote 0x7fb3ceb52108>, lsince=<float at remote 0x7fb3d01419f0>)) at Python/ceval.c:547
#105 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2f05d0>,
globals=globals@entry={'__name__': 'kombu.asynchronous.timer', '__doc__': 'Timer scheduling Python callbacks.', '__package__': 'kombu.asynchronous', '__loader__': <SourceFileLoader(name='kombu.asynchronous.timer', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py') at remote 0x7fb3cf2e3c50>, '__spec__': <ModuleSpec(name='kombu.asynchronous.timer', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/__pycache__/timer.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2e3c88>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/as...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7fb3e6246060, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0,
kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
closure=(<cell at remote 0x7fb3cec65be8>, <cell at remote 0x7fb3cec30d08>, <cell at remote 0x7fb3cec30d38>, <cell at remote 0x7fb3cec30d98>, <cell at remote 0x7fb3cec30e28>), name='maintain_pool', qualname='Pool.maintain_pool') at Python/ceval.c:3930
#106 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cec6a620>, args=args@entry=0x7fb3e6246060,
nargs=0, kwargs=kwargs@entry={}) at Objects/call.c:376
#107 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cec6a620>, args=args@entry=(),
kwargs=kwargs@entry={}) at Objects/call.c:226
#108 0x0000000000422479 in do_call_core (kwdict={}, callargs=(), func=<function at remote 0x7fb3cec6a620>) at Python/ceval.c:4645
#109 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#110 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf2e69c0, args=<optimized out>, args@entry=0x7ffdbea259c0,
nargs=nargs@entry=1,
globals=globals@entry={'__name__': 'kombu.asynchronous.timer', '__doc__': 'Timer scheduling Python callbacks.', '__package__': 'kombu.asynchronous', '__loader__': <SourceFileLoader(name='kombu.asynchronous.timer', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py') at remote 0x7fb3cf2e3c50>, '__spec__': <ModuleSpec(name='kombu.asynchronous.timer', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/__pycache__/timer.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2e3c88>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/as...(truncated))
at Objects/call.c:283
#111 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf2ee730>,
args=args@entry=0x7ffdbea259c0, nargs=nargs@entry=1, kwargs=kwargs@entry=0x0) at Objects/call.c:322
#112 0x000000000043e749 in _PyObject_FastCallDict (kwargs=0x0, nargs=1, args=0x7ffdbea259c0,
callable=<function at remote 0x7fb3cf2ee730>) at Objects/call.c:98
#113 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf2ee730>,
obj=obj@entry=<Entry at remote 0x7fb3cebb3768>, args=args@entry=(), kwargs=kwargs@entry=0x0) at Objects/call.c:904
#114 0x0000000000494665 in slot_tp_call (self=self@entry=<Entry at remote 0x7fb3cebb3768>, args=args@entry=(), kwds=kwds@entry=0x0)
at Objects/typeobject.c:6379
#115 0x000000000043b861 in _PyObject_FastCallKeywords (callable=callable@entry=<Entry at remote 0x7fb3cebb3768>,
stack=<optimized out>, nargs=nargs@entry=0, kwnames=kwnames@entry=0x0) at Objects/call.c:199
#116 0x0000000000424098 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4619
#117 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#118 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x3480618, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py, line 145, in fire_timers (self=<Hub(timer=<Timer(max_interval=<float at remote 0x7fb3d0141a08>, on_error=None, _queue=[<scheduled at remote 0x7fb3cebf0d18>, <scheduled at remote 0x7fb3cebf0a48>, <scheduled at remote 0x7fb3cecea3b8>, <scheduled at remote 0x7fb3ceceabd8>, <scheduled at remote 0x7fb3ceb60368>, <scheduled at remote 0x7fb3cec05d18>, <scheduled at remote 0x7fb3ceb606d8>, <scheduled at remote 0x7fb3cec8f7c8>, <scheduled at remote 0x7fb3cec8f688>]) at remote 0x7fb3cf2b04e0>, readers={71: (<method at remote 0x7fb3cec80b48>, (<Connection(_connection_id='ed843206bfeb424292688d00071ac41e', authentication=(<AMQPLAIN(username='aep', password='F8P7uYpFbxyWQSJF4psFc52P6') at remote 0x7fb3ceb98a20>,), client_properties={'product': 'py-amqp', 'product_version': '2.5.1', 'capabilities': {'consumer_cancel_notify': True, 'connection.blocked': True, ...(truncated)) at Python/ceval.c:547
#119 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf550780>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=1, kwnames=0x7fb3cf2dfd80, kwargs=kwargs@entry=0x347c960, kwcount=1, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3cf549ab0, defcount=defcount@entry=4, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='fire_timers', qualname='Hub.fire_timers') at Python/ceval.c:3930
#120 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#121 0x0000000000428c25 in call_function (kwnames=('propagate',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
at Python/ceval.c:4616
#122 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#123 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cf275de0) at Objects/genobject.c:221
#124 gen_iternext (gen=0x7fb3cf275de0) at Objects/genobject.c:542
#125 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
nargs=<optimized out>) at Python/bltinmodule.c:1426
#126 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x347abd0,
self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#127 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>,
args=args@entry=0x347abd0, nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#128 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#129 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#130 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x347a9d8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py, line 91, in asynloop (obj=<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.l...(truncated)) at Python/ceval.c:547
#131 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2c6d20>,
globals=globals@entry={'__name__': 'celery.worker.loops', '__doc__': 'The consumers highly-optimized inner loop.', '__package__': 'celery.worker', '__loader__': <SourceFileLoader(name='celery.worker.loops', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py') at remote 0x7fb3cf2cd3c8>, '__spec__': <ModuleSpec(name='celery.worker.loops', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/__pycache__/loops.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2cd400>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', '__cached_...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7fb3cebfba38, argcount=argcount@entry=9, kwnames=kwnames@entry=0x0,
kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x7fb3cf2cd5d8, defcount=1, kwdefs=0x0,
closure=0x0, name='asynloop', qualname='asynloop') at Python/ceval.c:3930
#132 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cf2b1c80>, args=args@entry=0x7fb3cebfba38,
nargs=9, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#133 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cf2b1c80>,
args=args@entry=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
kwargs=kwargs@entry=0x0) at Objects/call.c:226
#134 0x0000000000422479 in do_call_core (kwdict=0x0,
callargs=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
func=<function at remote 0x7fb3cf2b1c80>) at Python/ceval.c:4645
#135 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#136 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#137 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#138 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#139 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#140 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#141 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#142 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#143 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#144 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#145 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#146 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#147 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#148 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#149 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#150 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#151 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#152 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
at Objects/call.c:283
#153 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#154 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#155 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#156 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#157 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#158 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#159 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#160 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x2c6b3a8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 258, in run (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at ...(truncated)) at Python/ceval.c:547
#161 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3d89706f0>,
globals=globals@entry={'__name__': 'celery.bin.worker', '__doc__': "Program used to start a Celery worker instance.\n\nThe :program:`celery worker` command (previously known as ``celeryd``)\n\n.. program:: celery worker\n\n.. seealso::\n\n See :ref:`preload-options`.\n\n.. cmdoption:: -c, --concurrency\n\n Number of child processes processing the queue. The default\n is the number of CPUs available on your system.\n\n.. cmdoption:: -P, --pool\n\n Pool implementation:\n\n prefork (default), eventlet, gevent or solo.\n\n.. cmdoption:: -n, --hostname\n\n Set custom hostname (e.g., 'w1@%%h'). Expands: %%h (hostname),\n %%n (name) and %%d, (domain).\n\n.. cmdoption:: -B, --beat\n\n Also run the `celery beat` periodic task scheduler. Please note that\n there must only be one instance of this service.\n\n .. note::\n\n ``-B`` is meant to be used for development purposes. For production\n environment, you need to start :program:`celery beat` separately.\n\n.. cmdoption:: -Q, --queues\n\n L...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7ffdbea26e60, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c6b830,
kwargs=kwargs@entry=0x2c6b838, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x7fb3d89cf8d0, defcount=9, kwdefs=0x0,
closure=0x0, name='run', qualname='worker.run') at Python/ceval.c:3930
#162 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d8979620>,
args=args@entry=0x7ffdbea26e60, nargs=nargs@entry=1,
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#163 0x000000000043e749 in _PyObject_FastCallDict (
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea26e60, callable=<function at remote 0x7fb3d8979620>) at Objects/call.c:98
#164 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3d8979620>, obj=<optimized out>, args=(),
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
at Objects/call.c:904
#165 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3e6238d08>, args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#166 0x0000000000422479 in do_call_core (
kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(), func=<method at remote 0x7fb3e6238d08>) at Python/ceval.c:4645
#167 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#168 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d5414a48, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 252, in __call__ (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread....(truncated)) at Python/ceval.c:547
#169 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3e6172300>,
globals=globals@entry={'__name__': 'celery.bin.base', '__doc__': 'Base command-line interface.', '__package__': 'celery.bin', '__loader__': <SourceFileLoader(name='celery.bin.base', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py') at remote 0x7fb3e6154748>, '__spec__': <ModuleSpec(name='celery.bin.base', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/__pycache__/base.cpython-37.pyc', _initializing=False) at remote 0x7fb3e6154780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated),
locals=locals@entry=0x0, args=args@entry=0x7ffdbea271d0, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c69670,
kwargs=kwargs@entry=0x2c69678, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
closure=0x0, name='__call__', qualname='Command.__call__') at Python/ceval.c:3930
#170 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d89a1a60>,
args=args@entry=0x7ffdbea271d0, nargs=nargs@entry=1,
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#171 0x000000000043e749 in _PyObject_FastCallDict (
kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea271d0, callable=<function at remote 0x7fb3d89a1a60>) at Objects/call.c:98
#172 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3d89a1a60>,
obj=obj@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:904
#173 0x0000000000494665 in slot_tp_call (
self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated), args=(),
kwds={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
at Objects/typeobject.c:6379
#174 0x000000000043c638 in PyObject_Call (
callable=callable@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
args=args@entry=(),
kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#175 0x0000000000422479 in do_call_core (
kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(),
func=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated)) at Python/ceval.c:4645
#176 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#177 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d7a13808, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 223, in run_from_argv (prog_name='celery', argv=['-n', 'ae-worker-3.%h', '-A', 'ae_worker.app', '--pidfile=/run/celery/ae-worker-3.pid', '--autoscale=10,3', '--max-tasks-per-child', '1', '-Q', 'bgl', '--loglevel=INFO', '--logfile=/var/log/ae-worker/ae-worker-3%I.log', '--soft-time-limit=7200', '--time-limit=7300', '-Ofair', '--max-tasks-per-child', '1', '-E'], command='worker', options={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child':...(truncated)) at Python/ceval.c:547
#178 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8970300>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x7fb3d8a12f78, kwargs=kwargs@entry=0x7fb3d54143f0, kwcount=1, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d8973660, defcount=defcount@entry=2, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='run_from_argv', qualname='worker.run_from_argv') at Python/ceval.c:3930
#179 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#180 0x0000000000428c25 in call_function (kwnames=('command',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
at Python/ceval.c:4616
#181 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#182 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d5414248, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 420, in execute (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<...(truncated)) at Python/ceval.c:547
#183 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d89f7540>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x2957570, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d89a54f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='execute',
qualname='CeleryCommand.execute') at Python/ceval.c:3930
#184 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#185 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#186 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#187 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x29573b8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 488, in handle_argv (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_...(truncated)) at Python/ceval.c:547
#188 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a045d0>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x7fb3d897c1f0, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='handle_argv',
qualname='CeleryCommand.handle_argv') at Python/ceval.c:3930
#189 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#190 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#191 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#192 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3d897c048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 298, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _f...(truncated)) at Python/ceval.c:547
#193 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3e6172420>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3e60f0f18, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3e6174bc0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name='execute_from_commandline', qualname='Command.execute_from_commandline') at Python/ceval.c:3930
#194 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#195 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#196 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#197 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x7fb3e60f0d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 496, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, ...(truncated)) at Python/ceval.c:547
#198 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a06780>, globals=<optimized out>, locals=locals@entry=0x0,
args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x274a150, kwcount=0, kwstep=kwstep@entry=1,
defs=defs@entry=0x7fb3d89a55a0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
closure=closure@entry=(<cell at remote 0x7fb3d895e5e8>,), name='execute_from_commandline',
qualname='CeleryCommand.execute_from_commandline') at Python/ceval.c:3930
#199 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:433
#200 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#201 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#202 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
at Objects/call.c:283
#203 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#204 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#205 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#206 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=0, globals=<optimized out>)
at Objects/call.c:283
#207 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
kwnames=<optimized out>) at Objects/call.c:415
#208 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#209 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#210 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
f=Frame 0x26c8708, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery, line 8, in <module> ()) at Python/ceval.c:547
#211 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3de9fb300>,
globals=globals@entry=<unknown at remote 0x7fb3de9e4648>,
locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
args=args@entry=0x0, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0, kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0,
kwstep=kwstep@entry=2, defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
name=name@entry=0x0, qualname=qualname@entry=0x0) at Python/ceval.c:3930
#212 0x00000000004ebbc0 in PyEval_EvalCodeEx (closure=0x0, kwdefs=0x0, defcount=0, defs=0x0, kwcount=0, kws=0x0, argcount=0,
args=0x0,
locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
globals=globals@entry=<unknown at remote 0x7fb3de9e4648>, _co=_co@entry=<code at remote 0x7fb3de9fb300>) at Python/ceval.c:3959
#213 PyEval_EvalCode (co=co@entry=<code at remote 0x7fb3de9fb300>,
globals=globals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
locals=locals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}) at Python/ceval.c:524
#214 0x00000000005265f8 in run_mod (arena=0x7fb3e6282078, flags=0x7ffdbea285f0,
locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
filename='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', mod=0x26c1cc8)
at Python/pythonrun.c:1035
#215 PyRun_FileExFlags (fp=0x26f05c0, filename_str=<optimized out>, start=<optimized out>,
globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}, closeit=1, flags=0x7ffdbea285f0) at Python/pythonrun.c:988
#216 0x00000000005267dd in PyRun_SimpleFileExFlags (fp=0x26f05c0, filename=<optimized out>, closeit=1, flags=0x7ffdbea285f0)
at Python/pythonrun.c:429
#217 0x000000000042fdfd in pymain_run_file (p_cf=0x7ffdbea285f0,
filename=0x2655d10 L"/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery",
fp=0x26f05c0) at Modules/main.c:427
#218 pymain_run_filename (cf=0x7ffdbea285f0, pymain=0x7ffdbea286d0) at Modules/main.c:1627
#219 pymain_run_python (pymain=0x7ffdbea286d0) at Modules/main.c:2877
#220 pymain_main (pymain=pymain@entry=0x7ffdbea286d0) at Modules/main.c:3038
#221 0x00000000004300cd in _Py_UnixMain (argc=<optimized out>, argv=<optimized out>) at Modules/main.c:3073
#222 0x00007fb3e516c495 in __libc_start_main () from /lib64/libc.so.6
#223 0x0000000000429df4 in _start ()
Dependencies:
celery==4.3.0
kombu==4.6.5
billiard==3.6.1.0
amqp==2.5.1
python==3.7.3
can you give celery==4.4.0rc4 a try?
It will take me a week to vet it in staging, but I will try and get some of our load running against it in production.
@wakemaster39: Did this help?
Also curious if this seems to have fixed it, @wakemaster39: celery==4.4.0rc5?
I got pulled into solving different issues over the past couple of weeks, and the site we are having issues with is going offline over the holidays, so I don't want to make any changes there. I am vetting it in staging for the next week and a half, in hopes of deploying to prod in the new year to get some load against it.
Do we know if this problem is specific to celery workers and not other parts of celery like beat? Ergo, if I switch to something like go-celery and continue using celery.beat, does that theoretically fix my problem? Or have people experienced problems with beat stalling out for whatever reason too?
In my case it's not even beat that dies, it's celery worker threads deadlocking. I fixed it for me by adding a timeout, but every once in a while a thread still just stalls. It happened far more often when I had some database transactions that weren't atomic; fixing that fixed some of my headache.
tl;dr: it's not beat, but the worker that "randomly hangs on IPC"
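The commenter above says "adding a timeout" without specifying which one; assuming they mean Celery's task time limits (that is an assumption, they do not say), a minimal sketch would look like this:

from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')  # hypothetical app name and broker URL

# Recycle tasks that run too long instead of letting a hung pool process
# block forever; both settings are standard Celery 4.x configuration keys.
app.conf.task_soft_time_limit = 7000  # raises SoftTimeLimitExceeded inside the task
app.conf.task_time_limit = 7200       # hard limit: the pool process is killed and replaced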
Seen similar things running celery==4.4.0. Beat is running fine, but celery gets stuck for some reason. Possibly a deadlock, but very hard to say.
I had a similar experience to @vincent31337. I believe this is avoided by not using the celery multi start command.
I am facing the same issue. Python 2.7, Celery 4.1.0, SQS broker. Celery processes tasks for some time and then abruptly blocks.
you should try the newer releases.
It's fixed after upgrading.
I would like to know if it's still the case with celery>=4.4.x
I am dealing with this same issue:
python 3.8
Django 3.0.3
celery 4.4.0
redis 3.4.1 (for broker and results)
I'm running five workers. I can run five tasks that complete, then the workers hang on read([different pipe value for each]. After that, tasks are accepted and show up in Inspect(app=app).active(); however, their worker_pid is not one of the pids of the workers. Oftentimes the pids are low values like 50, and using ps -p [low pid] I do not see those pids actually running. I haven't seen that mentioned yet in this thread.
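For that symptom, here is a small diagnostic sketch (app name and broker URL are hypothetical) that compares each active task's reported worker_pid against PIDs that actually exist:

import subprocess
from celery import Celery

app = Celery('proj', broker='redis://localhost:6379/0')  # hypothetical

# inspect().active() returns {worker_name: [task_info, ...]}; each task_info
# carries the 'worker_pid' of the pool process that claimed the task.
for worker, tasks in (app.control.inspect().active() or {}).items():
    for task in tasks:
        pid = task.get('worker_pid')
        alive = subprocess.run(['ps', '-p', str(pid)],
                               capture_output=True).returncode == 0
        print(worker, task['id'], pid, 'alive' if alive else 'MISSING')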
even with celery==4.4.2 right?
We are facing exactly the same issue, with celery==4.3 as well as the latest celery==4.4.2.
We are using RabbitMQ as the backend (broker and results).
kombu==4.6.8
celery==4.4.2
amqp==2.5.2
billiard==3.6.3.0
librabbitmq==2.0.0
Python 3.7.4
celery is stuck on read and is not able to process any other tasks.
$ sudo strace -p 8553
strace: Process 8553 attached
read(10, ^Cstrace: Process 8553 detached
<detached ...>
$ sudo strace -p 8557
strace: Process 8557 attached
read(14, ^Cstrace: Process 8557 detached
<detached ...>
$ sudo strace -p 8561
strace: Process 8561 attached
read(18, ^Cstrace: Process 8561 detached
<detached ...>
One example lsof output for process 8553:
celery 8553 ubuntu 10r FIFO 0,10 0t0 43059 pipe
FWIW we seem to have hit the same issue with a raw use of billiard==3.6.3.0 (independently of celery):
parent:
(gdb) py-bt
Traceback (most recent call first):
<built-in method acquire of _thread.lock object at remote 0x7f838895e620>
File "/usr/lib/python3.6/threading.py", line 295, in wait
waiter.acquire()
File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 1957, in next
self._cond.wait(timeout)
File "xxx.py", line 1063, in main
for res in self.pool.imap_unordered(_partial, itertools.islice(a_generator, nb_processes)):
...
stuck child process:
0x00007f8393cc11c4 in __GI___libc_read (fd=21, buf=0x7f838892c578, nbytes=4) at ../sysdeps/unix/sysv/linux/read.c:27
Traceback (most recent call first):
<built-in method read of module object at remote 0x7f8392d3de58>
File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 422, in _recv
chunk = read(handle, remaining)
File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 456, in _recv_bytes
buf = self._recv(4)
File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 243, in recv_bytes
buf = self._recv_bytes(maxlength)
File "/usr/local/lib/python3.6/dist-packages/billiard/queues.py", line 395, in get_payload
return self._reader.recv_bytes()
File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 445, in _recv
return True, loads(get_payload())
...
This is the same stack as the py-bt stack trace of the forked process in https://github.com/celery/celery/issues/4185#issuecomment-554828564
All other child processes are waiting in semlock:
Traceback (most recent call first):
<built-in method __enter__ of _multiprocessing.SemLock object at remote 0x7f838896ff10>
File "/usr/local/lib/python3.6/dist-packages/billiard/synchronize.py", line 116, in __enter__
return self._semlock.__enter__()
File "/usr/local/lib/python3.6/dist-packages/billiard/queues.py", line 394, in get_payload
with self._rlock:
File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 445, in _recv
return True, loads(get_payload())
...
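For reference, a minimal sketch of the usage pattern shown in the parent traceback above; work_item and the generator are hypothetical stand-ins for _partial and a_generator:

import itertools
from billiard.pool import Pool

def work_item(x):  # stand-in for _partial; must be picklable, hence top-level
    return x * x

def main(nb_processes=4):
    pool = Pool(processes=nb_processes)
    gen = iter(range(1000))  # stand-in for a_generator
    try:
        # The parent blocks in IMapUnorderedIterator.next() -> Condition.wait()
        # while the stuck child sits in a blocking read() on its inqueue pipe.
        for res in pool.imap_unordered(work_item, itertools.islice(gen, nb_processes)):
            print(res)
    finally:
        pool.close()
        pool.join()

if __name__ == '__main__':
    main()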
I observed the issue of workers going stale on several instances of OpenWISP.
celery==4.4.7
kombu==4.6.11
billiard==3.6.3.0
Command:
celery -A openwisp2 worker -l info --pool=gevent --concurrency=10 -Ofair
The only difference I can see between a working worker and a non-working one is that the worker which is not processing any tasks shows this in the strace output:
select(22, [21], [], [21], {tv_sec=0, tv_usec=0}) = 0 (Timeout)
Even when the worker is processing tasks, I see these two types of lines in the strace output:
epoll_ctl(4, EPOLL_CTL_ADD, 20, {EPOLLIN, {u32=20, u64=12884901908}}) = -1 EEXIST (File exists)
recvfrom(20, 0x377a460, 65536, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)
Therefore I don't believe these last two log lines point to the underlying issue; since they appear whether or not the worker is processing tasks, they are not symptoms of it.
# strace -p 81
strace: Process 81 attached
read(24, ^Cstrace: Process 81 detached
<detached ...>
# lsof -p 81 | grep 24
celery 81 root mem REG 252,1 535124 /usr/local/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so (path dev=0,112)
celery 81 root 24r FIFO 0,13 0t0 311875006 pipe
python version: 3.6
celery==4.4.7
kombu==4.6.11
billiard==3.6.3.0
Hello,
I've been dealing with the same problem here:
celery inspect report:
software -> celery:4.3.0 (rhubarb) kombu:4.6.4 py:3.6.6
billiard:3.6.1.0 py-amqp:2.5.1
platform -> system:Linux arch:64bit
kernel version:3.10.0-1062.18.1.el7.x86_64 imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:amqp results:amqp
broker_url: 'amqp://nextgen:********@*:5672/ng_vhost'
result_backend: 'amqp'
task_queues:
(<unbound Queue default -> <unbound Exchange default(direct)> -> default>,
<unbound Queue runForecasts -> <unbound Exchange runForecasts(direct)> -> runForecasts>,
<unbound Queue runEnsembles -> <unbound Exchange runEnsembles(direct)> -> runEnsembles>)
task_default_queue: 'default'
task_default_exchange: 'default'
task_default_routing_key: '********'
broker_heartbeat: None
task_acks_late: True
worker_prefetch_multiplier: 1
CELERYD_OPTS:
--time-limit=3600 -c:worker1 3 -Q:worker1 runForecasts -c:worker2 2 -Q:worker2 runEnsembles -c:worker3 2 -Q:worker3 default
Only the runForecasts queue seems to get stuck, and here's the strace output for the child processes:
$ sudo strace -p 17426
strace: Process 17426 attached
read(28, ^Cstrace: Process 17426 detached
<detached ...>
$ sudo ls -l /proc/17426/fd/28
lr-x------. 1 wind wind 64 Nov 10 22:45 /proc/17426/fd/28 -> pipe:[284399927]
I found the pipe that it's stuck on, but don't know how to debug further. Would appreciate it if anyone can shed some light on this.
# strace -p 32
strace: Process 32 attached
epoll_wait(7, [], 1023, 787) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 4050) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [{EPOLLIN, {u32=15, u64=139972984176655}}], 1023, 275) = 1
ioctl(15, FIONBIO, [1]) = 0
recvfrom(15, "\10\0\0\0\0\0\0", 7, 0, NULL, NULL) = 7
recvfrom(15, "\316", 1, 0, NULL, NULL) = 1
ioctl(15, FIONBIO, [0]) = 0
ioctl(15, FIONBIO, [1]) = 0
recvfrom(15, 0x7f4e96911aa0, 7, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)
ioctl(15, FIONBIO, [0]) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [], 1023, 19) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [], 1023, 670) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 5000) = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 5000) = 0
lrwx------. 1 root root 64 Dec 3 08:21 7 -> 'anon_inode:[eventpoll]'
celery==4.4.7
worker args: --autoscale and --max-tasks-per-child
I will test this. I use --autoscale 20,1 and --max-tasks-per-child 1.
The task:
from celery import shared_task

@shared_task
def echo():
    print(1)
# wait for the processes to scale down, then run the next batch
for i in range(100):
    echo.delay()

# one large batch of calls
for i in range(10000):
    echo.delay()
I got another error:
[2020-12-03 18:03:42,137: CRITICAL/MainProcess] Unrecoverable error: AssertionError()
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/celery/worker/worker.py", line 208, in start
self.blueprint.start(self)
File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 369, in start
return self.obj.start()
File "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 318, in start
blueprint.start(self)
File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 599, in start
c.loop(*c.loop_args())
File "/usr/local/lib/python3.6/site-packages/celery/worker/loops.py", line 83, in asynloop
next(loop)
File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/hub.py", line 303, in create_loop
poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/hub.py", line 145, in fire_timers
entry()
File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/timer.py", line 68, in __call__
return self.fun(*self.args, **self.kwargs)
File "/usr/local/lib/python3.6/site-packages/celery/concurrency/asynpool.py", line 636, in verify_process_alive
assert proc.outqR_fd in hub.readers
AssertionError
Following https://github.com/celery/celery/issues/4185#issuecomment-694179347, I reviewed billiard/pool.py line 1957 and found:
self._cond = threading.Condition(threading.Lock())
......
self._lost_worker_timeout = lost_worker_timeout
......
def next(self, timeout=None):
with self._cond:
......
self._cond.wait(timeout)
......
When next() is called with timeout=None, self._cond.wait() blocks with no timeout. Is it stuck at this point?
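To illustrate the concern, a minimal standalone sketch (not billiard's actual code) of how Condition.wait() behaves with and without a timeout:

import threading

cond = threading.Condition(threading.Lock())
ready = False

def wait_for_result(timeout=None):
    with cond:
        while not ready:
            # With timeout=None this blocks until another thread calls
            # notify(); if the producer (a pool worker) dies first, we wait
            # forever. With a timeout, wait() returns False on expiry, so
            # the caller gets a chance to detect a lost worker.
            if not cond.wait(timeout):
                raise TimeoutError('no result; the worker may be lost')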
For those facing this, can you please try https://github.com/celery/billiard/pull/325?
https://github.com/celery/billiard/pull/325 may not work.
And I give up. For now, I use --pool threads instead.
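For anyone trying the same workaround, the thread pool is selected on the worker command line (application name is hypothetical):
celery -A proj worker --pool threads --concurrency 10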
Most helpful comment
Ok, as it turns out, using poll() doesn't solve the problem, as the actual issue is that a file descriptor is being closed while still registered with the hub. Once you realise that, it is easy to figure out exactly where it goes wrong, and the following hack fixes this issue for us. Without it we could reproduce the issue within an hour or two; with it we haven't been able to trigger it yet. The hack is not the correct fix; it really requires someone who knows the code better to write a good fix.
It looks like the write queue fd is properly removed from the hub (a few lines up), but the read queue fd is not.
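A minimal sketch of the kind of hack being described, assuming a kombu Hub exposing readers and remove(fd) and a pool process exposing outqR_fd (the attribute named in the AssertionError traceback above); the function name is hypothetical and this is an illustration, not the actual patch from the thread:

def on_process_cleanup(hub, proc):
    # The write-queue fd is already removed from the hub a few lines up;
    # mirror that for the read-queue fd so the hub never keeps polling a
    # descriptor that has been closed and possibly reused.
    try:
        if proc.outqR_fd in hub.readers:
            hub.remove(proc.outqR_fd)
    except (KeyError, OSError):
        pass  # the fd may already be unregistered or closed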