Hi,
I would like to somehow capture the data dumped by my job onto stdout. Is there a way to access it from the Job object?
I am spawning new processes using Popen in my job function.
Thanks,
Anand
You can. Take a look at the custom worker classes here
http://python-rq.org/docs/workers/#custom-worker-classes
You can implement your own output-capturing worker, for example by overriding the execute_job method here:
/rq/worker.py@master#L585-L594
This is more tricky though as you have to capture the output of a forked process.
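Short of subclassing the worker, a simpler user-side sketch (my own workaround, not part of RQ) captures Python-level prints inside the job function itself with contextlib.redirect_stdout. Note that this does not catch output from child processes started with Popen, since those write to the real file descriptor; the render function below is a made-up example:

```python
import contextlib
import functools
import io

def capture_output(func):
    """Capture everything the wrapped function prints to stdout.

    In a real RQ job you could store the captured text on the job
    (e.g. in job.meta via rq.get_current_job()); here the sketch simply
    returns it alongside the function's result so it stays runnable.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            result = func(*args, **kwargs)
        return result, buf.getvalue()
    return wrapper

@capture_output
def render():
    # Hypothetical job function.
    print("rendering frame 1")
    return 42
```

Because the capture happens inside the work horse process, this sidesteps the fork problem mentioned above, at the cost of only seeing output printed from Python code.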
@anandsaha Have you found a solution for this?
I think it would be great if this were standard behavior: the output of a job is automatically attached to Job objects, similar to the return value in job.result.
Would this be hard to implement?
I agree this would be a great feature. It would streamline the process of integrating RQ into backend applications using existing shell tools, for example.
+1 it would be awesome to have this!
Can't you use logging to redirect your output to stdout?
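To illustrate the logging suggestion: if the job logs instead of printing, its output goes wherever a handler points (the render function and logger name below are made up for the example):

```python
import io
import logging

def render(handler=None):
    """Hypothetical job that logs its progress instead of printing."""
    log = logging.getLogger("render")
    log.setLevel(logging.INFO)
    if handler is not None:
        log.addHandler(handler)
    log.info("rendering started")
    log.info("rendering finished")

# Route the job's log output into an in-memory buffer; a StreamHandler
# on sys.stdout would show it in the worker's console instead.
buf = io.StringIO()
render(logging.StreamHandler(buf))
```

This makes the output visible in the worker's console, but it still is not attached to the Job object, which is what the feature request is about.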
+1
Yes, I think capturing job outputs in “job.output” is a good idea. I’d welcome a PR for this.
I solved this in a hacky way:

1. Manually set the job ID and initialize the output in job.meta when starting the job:

    job = q.enqueue(render, job_id=jobID)
    job.meta['output'] = 'init'
    job.save_meta()

2. Fetch and update job.meta['output'] by job ID while the subprocess is running:

    proc = subprocess.Popen([cmd],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT,
                            universal_newlines=True)
    while proc.poll() is None:
        line = proc.stdout.readline()
        if line:
            # Process output here
            job = Job.fetch(jobID, connection=redis)
            job.meta['output'] = line
            job.save_meta()

3. Define a function that reads job.meta['output'] and call it.
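The polling loop in step 2 can be sketched as a standalone generator that streams a subprocess's output line by line; the Python one-liner below just stands in for the real command:

```python
import subprocess
import sys

def stream_output(cmd):
    """Run cmd and yield each line of its combined stdout/stderr as it appears."""
    proc = subprocess.Popen(cmd,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT,
                            universal_newlines=True)
    for line in proc.stdout:
        yield line.rstrip("\n")
    proc.wait()

# Stand-in command that prints two lines, one after the other.
lines = list(stream_output([sys.executable, "-c", "print('frame 1'); print('frame 2')"]))
```

In the real setup, each yielded line would be written to job.meta['output'] and saved with job.save_meta(), as in step 2.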
Using this method, are you still able to watch the output of the currently executing job by issuing 'rq worker'?