Asked 2015-09-18

I added a custom logging handler to my Django application that writes log entries to the database. Now I want this custom logging to be added to Celery's logging as well.

import logging

from my_app.models import MyModel  # the model that stores log entries


class DbLogHandler(logging.Handler):  # inherit from logging.Handler
    def __init__(self):
        # run the regular Handler __init__
        logging.Handler.__init__(self)
        self.entries = []
        logging.debug("*****************[DB] INIT db handler")

    def emit(self, record):
        logging.debug("*****************[DB] called emit on db handler")
        try:
            revision_instance = getattr(record, 'revision', None)
            if revision_instance is None:
                return
            # instantiate the model, but defer the DB write until flush()
            log_entry = MyModel(name=record.name,
                                log_level_name=record.levelname,
                                message=record.msg,
                                module=record.module,
                                func_name=record.funcName,
                                line_no=record.lineno,
                                exception=record.exc_text,
                                revision=revision_instance)
            self.entries.append(log_entry)
        except Exception as ex:
            print(ex)

    def flush(self):
        if self.entries:
            MyModel.objects.bulk_create(self.entries)
            logging.info("[+] Successfully flushed {0:d} log entries to "
                         "the DB".format(len(self.entries)))
            # clear the buffer so a later flush() does not re-insert
            self.entries = []
        else:
            logging.info("[*] No log entries for DB logger")
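To make the buffer-then-flush behaviour of this handler easy to see, here is a minimal, self-contained sketch of the same pattern using only the standard library; a plain list stands in for `MyModel.objects.bulk_create`, so nothing here touches Django:

```python
import logging


class BufferingHandler(logging.Handler):
    """Minimal stand-in for DbLogHandler: buffer records, persist on flush()."""

    def __init__(self):
        logging.Handler.__init__(self)
        self.entries = []   # pending records, like DbLogHandler.entries
        self.flushed = []   # stand-in for rows written via bulk_create

    def emit(self, record):
        # The real handler builds a MyModel instance here instead
        self.entries.append(self.format(record))

    def flush(self):
        if self.entries:
            # Stand-in for MyModel.objects.bulk_create(self.entries)
            self.flushed.extend(self.entries)
            self.entries = []


handler = BufferingHandler()
logger = logging.getLogger("demo.dbhandler")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.info("first")
logger.info("second")
# Nothing is "persisted" until flush() is called explicitly:
print(handler.flushed)  # []
handler.flush()
print(handler.flushed)  # ['first', 'second']
```

This also shows why a worker that never calls `flush()` (or that dies before doing so) produces no rows at all, even though `emit()` ran.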

The handler is used correctly when I call a function directly, say by running a management command. In production, however, the entry point will be a Celery task. My understanding is that Celery has its own logging machinery. What I am trying to do, but cannot get to work, is to add my db handler to Celery's logging, so that all Celery logs are also sent to the DbLogHandler.

This is what I tried to do, in my_app.celery_logging.logger:

import logging

import celery
from celery.utils.log import get_task_logger


class CeleryAdapter(logging.LoggerAdapter):
    """Adapter to add current task context to "extra" log fields."""

    def process(self, msg, kwargs):
        if not celery.current_task:
            return msg, kwargs

        kwargs = kwargs.copy()
        kwargs.setdefault('extra', {})['celery'] = \
            vars(celery.current_task.request)
        return msg, kwargs

from django.conf import settings

from my_app.db_logging.db_logger import DbLogHandler


def task_logger(name):
    """
    Return a custom celery task logger that will also log to the db.

    We need to add the db handler explicitly, otherwise it is not picked
    up by celery.

    Also, we wrap the logger in a CeleryAdapter to provide some extra
    celery-related context in the logging messages.
    """
    # first get the default celery task logger
    log = get_task_logger(name)

    # if configured, add the db-log handler explicitly to the celery task
    # logger
    handlers = settings.LOGGING.get('handlers', {})
    db_handler_dict = handlers.get('db', None)
    if (db_handler_dict is not None and
            db_handler_dict != settings.NULL_HANDLER_PARAMS):
        # addHandler() expects a Handler instance, not a config dict
        db_handler = DbLogHandler()
        db_handler.setLevel(logging.DEBUG)
        log.addHandler(db_handler)

    # wrap the logger in a CeleryAdapter to add some celery-specific
    # context to the logs
    return CeleryAdapter(log, {})
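A note on the pattern above: `Logger.addHandler()` takes an already-instantiated `logging.Handler` object, while a dict like `{'class': ..., 'level': ...}` is only meaningful inside `logging.config.dictConfig()`. A small stdlib-only sketch of the two mechanisms, using `StreamHandler` as a stand-in for `DbLogHandler`:

```python
import logging
import logging.config

# dictConfig consumes handler *descriptions* (the class is a dotted path)
logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'db': {'class': 'logging.StreamHandler', 'level': 'DEBUG'},
    },
    'loggers': {
        'demo.dictconfig': {'handlers': ['db'], 'level': 'DEBUG'},
    },
})

# addHandler(), by contrast, takes an already-built Handler instance
log = logging.getLogger('demo.addhandler')
handler = logging.StreamHandler()
handler.setLevel(logging.DEBUG)
log.addHandler(handler)

print(len(logging.getLogger('demo.dictconfig').handlers))  # 1
print(len(log.handlers))  # 1
```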

And then, finally, in my tasks.py:

from my_app.celery_logging.logger import task_logger 
logger = task_logger(__name__) 

But from this point on it is a world of pain. I cannot even describe exactly what is happening. When I start the worker and look at the celery log output, I can see that my db-logger is actually being invoked, but celery seems to lose its worker processes:

[2015-09-18 10:30:57,158: INFO/MainProcess] [*] No log entries for DB logger 
Raven is not configured (logging is disabled). Please see the documentation for more information. 
2015-09-18 10:30:58,659 raven.contrib.django.client.DjangoClient INFO Raven is not configured (logging is disabled). Please see the documentation for more information. 
[2015-09-18 10:30:59,155: DEBUG/MainProcess] | Worker: Preparing bootsteps. 
[2015-09-18 10:30:59,157: DEBUG/MainProcess] | Worker: Building graph... 
[2015-09-18 10:30:59,158: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoscaler, Autoreloader, StateDB, Beat, Consumer} 
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Preparing bootsteps. 
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Building graph... 
[2015-09-18 10:30:59,164: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Tasks, Control, Gossip, Agent, Heart, event loop} 
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Hub 
[2015-09-18 10:30:59,167: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Pool 
[2015-09-18 10:30:59,173: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:30:59,173: DEBUG/MainProcess] | Worker: Starting Consumer 
[2015-09-18 10:30:59,174: DEBUG/MainProcess] | Consumer: Starting Connection 
[2015-09-18 10:30:59,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672// 
[2015-09-18 10:30:59,180: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:30:59,180: DEBUG/MainProcess] | Consumer: Starting Events 
[2015-09-18 10:30:59,188: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:30:59,188: DEBUG/MainProcess] | Consumer: Starting Mingle 
[2015-09-18 10:30:59,188: INFO/MainProcess] mingle: searching for neighbors 
[2015-09-18 10:31:00,196: INFO/MainProcess] mingle: all alone 
[2015-09-18 10:31:00,196: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:31:00,197: DEBUG/MainProcess] | Consumer: Starting Tasks 
[2015-09-18 10:31:00,203: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:31:00,204: DEBUG/MainProcess] | Consumer: Starting Control 
[2015-09-18 10:31:00,207: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:31:00,208: DEBUG/MainProcess] | Consumer: Starting Gossip 
[2015-09-18 10:31:00,211: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:31:00,211: DEBUG/MainProcess] | Consumer: Starting Heart 
[2015-09-18 10:31:00,212: DEBUG/MainProcess] ^-- substep ok 
[2015-09-18 10:31:00,212: DEBUG/MainProcess] | Consumer: Starting event loop 
[2015-09-18 10:31:00,213: WARNING/MainProcess] [email protected] ready. 
[2015-09-18 10:31:00,213: DEBUG/MainProcess] | Worker: Hub.register Pool... 
[2015-09-18 10:31:00,255: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',) 
Traceback (most recent call last):
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 278, in start
    blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 821, in start
    c.loop(*c.loop_args())
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/loops.py", line 48, in asynloop
    raise WorkerLostError('Could not start worker processes')

And when a celery task is invoked, I no longer see any logs at all.

Are you sure it is not an exception that makes the worker exit? – patrys

The worker is lost right after I start the server, so all workers should just be waiting for tasks. – LarsVegas

I am asking because celery starts with task discovery, which causes your task module to be imported and `get_task_logger(...)` to be called, and that code seems to access `settings` without importing it first. – patrys

Answer

Set worker_hijack_root_logger to False in the configuration, and customize the logging yourself.

link
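For reference, a sketch of what that could look like. Note that on Celery 3.x (the version in the question) the setting is spelled `CELERYD_HIJACK_ROOT_LOGGER`; `worker_hijack_root_logger` is the lowercase Celery 4+ name. Connecting to the `setup_logging` signal is the usual hook for installing your own configuration; the use of `settings.LOGGING` here is an assumption about how the project defines its handlers:

```python
# celeryconfig sketch: keep celery from replacing the root logger
CELERYD_HIJACK_ROOT_LOGGER = False   # Celery 3.x name
# worker_hijack_root_logger = False  # Celery 4+ name

from celery.signals import setup_logging


@setup_logging.connect
def configure_logging(**kwargs):
    # install your own logging config (including the db handler) instead
    # of celery's default setup; assumes Django settings define LOGGING
    import logging.config
    from django.conf import settings
    logging.config.dictConfig(settings.LOGGING)
```

With the hijack disabled and `setup_logging` connected, celery leaves your handlers in place instead of reconfiguring the root logger at worker startup.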

Please post an actual answer here and use the link as a reference. – Sachith