Rq: Capturing stdout output of jobs

Created on 4 Oct 2017  ·  9 Comments  ·  Source: rq/rq

Hi,

I would like to somehow capture the data dumped by my job onto stdout. Is there a way to access it from the Job object?

I am spawning new processes using Popen in my job function.

Thanks,
Anand

All 9 comments

You can. Take a look at the custom worker classes here:
http://python-rq.org/docs/workers/#custom-worker-classes

For example, you can implement your own output-capturing worker by overriding the execute_job method here:
/rq/worker.py@master#L585-L594

This is trickier, though, as you have to capture the output of a forked process.
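
For the non-forking case the core mechanism is simple: redirect sys.stdout around the job call and store the captured text (e.g. in job.meta). A minimal sketch of that mechanism — run_with_captured_stdout is a hypothetical helper, not part of RQ:

```python
import io
from contextlib import redirect_stdout

def run_with_captured_stdout(func, *args, **kwargs):
    """Call func and return (result, everything it printed to stdout).

    Hypothetical helper, not part of RQ. A custom worker based on
    SimpleWorker (which runs jobs in-process, without forking) could
    wrap the job call like this and store the captured text in job.meta.
    Note: this does NOT capture output of child processes spawned with
    Popen, since those write to the real file descriptor, not to the
    Python-level sys.stdout object.
    """
    buf = io.StringIO()
    with redirect_stdout(buf):
        result = func(*args, **kwargs)
    return result, buf.getvalue()
```

With the default forking Worker, the redirection would have to happen inside the work horse process (or inside the job function itself), which is what makes the forked case tricky.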

@anandsaha Have you found a solution for this?
I think it would be great if this were standard behavior: the output of a job is automatically attached to Job objects, similar to the return value in job.result.
Would this be hard to implement?

I agree this would be a great feature. It would streamline the process of integrating RQ into backend applications using existing shell tools, for example.

+1 it would be awesome to have this!

Can't you use logging to redirect your output to stdout?
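
A sketch of that suggestion, assuming you control the job code — the logger name and format here are arbitrary choices, not RQ conventions:

```python
import logging
import sys

def make_stdout_logger(name="rq.job", stream=sys.stdout):
    # Build a logger that writes to the worker's stdout, so each line
    # shows up in the terminal where `rq worker` is running. Hypothetical
    # helper; the name and format are illustrative, not RQ conventions.
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    handler = logging.StreamHandler(stream)
    handler.setFormatter(logging.Formatter("%(asctime)s %(name)s: %(message)s"))
    logger.addHandler(handler)
    logger.propagate = False
    return logger
```

Inside the job function you would then call log = make_stdout_logger() once and use log.info(...) instead of print. This makes the output visible in the worker's terminal, but it still is not attached to the Job object.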

+1

Yes, I think capturing job outputs in “job.output” is a good idea. I’d welcome a PR for this.

I solved this in a rather crude way:

1. Manually set the job ID and initialize the output in job.meta when starting the job:

job = q.enqueue(render, job_id=jobID)
job.meta['output'] = 'init'
job.save_meta()

2. Fetch the job by its ID and update job.meta['output'] while the subprocess is running:

proc = subprocess.Popen(
    [cmd],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True)

while proc.poll() is None:
    line = proc.stdout.readline()
    if line:
        # Process output here
        job = Job.fetch(jobID, connection=redis)
        job.meta['output'] = line
        job.save_meta()

3. Define a function that reads job.meta['output'] and call it to poll the output.
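
Step 3 could look like the sketch below; get_job_output is a hypothetical name, and refresh() re-reads the job's data (including meta) from Redis:

```python
def get_job_output(job):
    # Hypothetical helper for step 3: re-read the job's state from
    # Redis and return the last line stored by the polling loop above.
    # `job` is an rq.job.Job, typically obtained with
    # Job.fetch(jobID, connection=redis).
    job.refresh()
    return job.meta.get('output', '')
```

Note that this scheme only keeps the most recent line; to keep the full output you would append to a list or string in job.meta instead of overwriting it.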


Using this method, are you still able to watch the output of the currently executing job by issuing 'rq worker'?
