Rq: Allow custom logging configuration

Created on 20 Jan 2013 · 14 comments · Source: rq/rq

Currently there are a few ways to configure Worker's exception handling behavior:

  1. By passing --sentry-dsn to rqworker (assuming that you want to use Sentry)
  2. By writing a custom script and passing your own exception handler to Worker(exc_handler=foo) (a minimal sketch follows this list)
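
For illustration, here is a hedged sketch of option 2, using the exc_handler keyword named above; the handler signature (job, exc_type, exc_value, traceback) follows RQ's custom exception handler pattern, and report_error is just an example name:

#!/usr/bin/env python
# Sketch of a worker script with a custom exception handler.
import sys
from rq import Connection, Queue, Worker

def report_error(job, exc_type, exc_value, traceback):
    # Log or report the failure; returning True lets RQ's
    # remaining handlers run as usual.
    print('job %s failed: %s' % (job.id, exc_value))
    return True

with Connection():
    w = Worker([Queue(name) for name in (sys.argv[1:] or ['default'])],
               exc_handler=report_error)
    w.work()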

Now that RQ uses Python's logging, should we also allow the user to more easily configure logging when running rqworker? Python allows logging configuration to be loaded from a file, so we could add a --logging-config=logging.conf option to rqworker.
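
For illustration only, a sketch of what loading such a file could look like, assuming a standard fileConfig-style file (the logging.conf name is just an example):

# Hypothetical sketch: load a logging configuration file before the worker starts,
# roughly what a --logging-config option would do under the hood.
import logging
import logging.config

logging.config.fileConfig('logging.conf', disable_existing_loggers=False)
logging.getLogger('rq.worker').info('logging configured from file')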

All 14 comments

That's great, I didn't even know that was possible. Please go ahead and add this!

Hi,

How can I use this great feature? Is it on the roadmap now?

It's definitely a feature we want to have in RQ, but I don't think there's anyone working on this at the moment. Patches welcome :)

I made a first commit on this feature.
You can see the diff here (https://github.com/bicycle1885/rq/compare/feature-logging-config-dev).
It isn't ready for a pull request yet because it contains some development and debugging code.

Now we need to make a decision about the precedence of logging-level options.
rqworker already has --verbose and --quiet options, which can conflict with the logging level set in a configuration file.
In this commit, the --verbose and --quiet options override the root logger's level.
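
For illustration only, one way that precedence could be expressed (a sketch, not the code from the linked branch; setup_logging is a hypothetical helper):

import logging
import logging.config

def setup_logging(config_file=None, verbose=False, quiet=False):
    # Load the user's configuration file first, if one was given.
    if config_file:
        logging.config.fileConfig(config_file)
    else:
        logging.basicConfig(level=logging.INFO)
    # Then let the CLI flags win on the root logger's level.
    if verbose:
        logging.getLogger().setLevel(logging.DEBUG)
    elif quiet:
        logging.getLogger().setLevel(logging.WARNING)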

What do you think?

+1. What's the status on this issue?

+1. Would this be added?

If not, is there a workaround to use a custom logger from the logging module?

My workaround is to add from init_logging import logging to worker.py and to create an init_logging.py as follows:

from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division

import logging
import os
import sys

__author__ = 'calvin'


class LoggerWriter:
    """File-like object that forwards writes to a logger, one line at a time."""

    def __init__(self, logger, level, pipe):
        self.logger = logger
        self.level = level
        self.buffer = ''
        self.pipe = pipe

    def write(self, message):
        # Buffer partial lines and emit one log record per complete line.
        message = self.buffer + message
        lines = message.split('\n')
        self.buffer = lines[-1]
        for line in lines[:-1]:
            self.logger.log(self.level, line)

    def flush(self):
        if self.buffer != '':
            self.logger.log(self.level, self.buffer)
        self.buffer = ''

    def isatty(self):
        # Delegate to the wrapped stream.
        return self.pipe.isatty()


# Pick the level from the environment, defaulting to INFO.
log_level = os.environ.get('LOG_LEVEL', 'INFO')
numeric_level = getattr(logging, log_level.upper(), None)
if not isinstance(numeric_level, int):
    raise ValueError('Invalid log level: %s' % log_level)
logging.basicConfig(level=numeric_level, format="%(asctime)s %(levelname)s %(message)s")

# Redirect stdout/stderr through the root logger so print output ends up in the log.
if not isinstance(sys.stdout, LoggerWriter):
    sys.stdout = LoggerWriter(logging.getLogger(), logging.INFO, sys.stdout)
if not isinstance(sys.stderr, LoggerWriter):
    sys.stderr = LoggerWriter(logging.getLogger(), logging.WARNING, sys.stderr)
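
For completeness, the worker.py side of this workaround only needs to import the module before RQ starts; a hypothetical minimal version:

#!/usr/bin/env python
# Importing init_logging configures logging (and redirects stdout/stderr)
# as a side effect before the worker runs.
import sys
from init_logging import logging  # noqa: F401
from rq import Connection, Worker

with Connection():
    Worker(sys.argv[1:] or ['default']).work()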

This works pretty well, but I get two entries per message in the log file, since RQ writes to stdout as well as to the logging module.

I found a much simpler version. Just supply your own worker.py:

#!/usr/bin/env python
import sys
from rq import Connection, Worker
import logging
# this example uses python-json-logger
from pythonjsonlogger import jsonlogger

# get the worker's logger
logger = logging.getLogger("rq.worker")

# attach whatever handler/formatter you want
logHandler = logging.StreamHandler()
formatter = jsonlogger.JsonFormatter()
logHandler.setFormatter(formatter)
logger.addHandler(logHandler)
logger.error('YEAH!')  # quick check that the handler is wired up

# Provide queue names to listen to as arguments to this script,
# similar to rq worker
with Connection():
    qs = sys.argv[1:] or ['default']

    w = Worker(qs)
    w.work()
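
It can then be started like the stock worker, e.g. python worker.py high default, with the queue names coming from sys.argv as noted in the comments above.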

As far as I know, no one is working on this feature. It would be great if you could make a PR to add --logging-config though :)

Hey, thanks for the response. I edited the issue, and this seems like a good enough solution to me.

@hnykda how do you start your own worker.py?

https://python-rq.org/docs/workers/#custom-worker-classes

import logging
from rq import Worker

# configure your logging format here

class YourWorker(Worker):
    pass
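
Assuming the class lives somewhere importable (say mymodule.py; the name is just an example), the stock CLI can be pointed at it with rq worker --worker-class mymodule.YourWorker, as described in the linked docs, so the module-level logging configuration runs before the worker starts.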

@hnykda Thank you for your response!
Could you take a look at my question https://github.com/rq/rq/issues/1088?

+1 on this feature
