Request ID propagation for ASGI apps

Overview


ASGI Correlation ID middleware

Middleware for reading correlation IDs from incoming HTTP request headers and making them available in your application logs.

By default, the middleware loads correlation IDs from the Correlation-ID HTTP header, but the header name can be configured. If you're on a platform like Heroku, for example, you probably want to set the header name to X-Request-ID instead.

In addition to adding correlation IDs to logs, the middleware supports propagating correlation IDs to Sentry events and Celery tasks. See the relevant sections below for more details.


Installation

pip install asgi-correlation-id

Setting up the middleware

Adding the middleware

The middleware can be added like this:

app = FastAPI(middleware=[Middleware(CorrelationIdMiddleware)])

or like this:

app = FastAPI()
app.add_middleware(CorrelationIdMiddleware)

For Starlette apps, just substitute FastAPI with Starlette in the example.
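
For example, a minimal Starlette setup could look like this:

from starlette.applications import Starlette
from starlette.middleware import Middleware

from asgi_correlation_id import CorrelationIdMiddleware

app = Starlette(middleware=[Middleware(CorrelationIdMiddleware)])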

Middleware settings

The middleware has a few settings. These are the defaults:

class CorrelationIdMiddleware(
    header_name='Correlation-ID',
    validate_guid=True,
    uuid_length=32,
)

Each individual setting is described below:

header_name

The HTTP header key to read IDs from.

In addition to Correlation-ID, another popular choice of header name is X-Request-ID. Among other things, this is the standard header for request IDs on Heroku.

Defaults to Correlation-ID.

validate_guid

By default, the middleware validates correlation IDs as valid UUIDs. If turned off, any string will be accepted.

An invalid header is discarded, and a fresh UUID is generated in its place.

Defaults to True.

uuid_length

Lets you optionally trim the length of correlation IDs. Probably not needed in most cases, but for, e.g., local development, having 32-character UUIDs in every log line can be excessive.

Defaults to 32.
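
Putting these settings together, a configuration that reads Heroku-style request IDs and trims IDs during local development could look like this (the specific values are only illustrative):

from fastapi import FastAPI

from asgi_correlation_id import CorrelationIdMiddleware

app = FastAPI()
app.add_middleware(
    CorrelationIdMiddleware,
    header_name='X-Request-ID',  # read Heroku's request ID header
    validate_guid=True,          # discard invalid IDs and generate fresh UUIDs
    uuid_length=8,               # trim IDs for less cluttered local log output
)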

Configuring logging

To benefit from the middleware, you'll want to configure your logging setup to include the correlation ID in some form. That way, logs can be correlated to a single request, which is largely the point of the middleware.

To log the correlation ID, you simply have to add the log filter supplied by the package to your logging configuration.

If your logging config looks something like this:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

You simply have to make these changes:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
+   'filters': {
+       'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
+   },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
+           'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
+           'filters': ['correlation_id'],
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

And your log output should go from this:

INFO ... project.views This is a DRF view log, and should have a GUID.
WARNING ... project.services.file Some warning in a function
INFO ... project.views This is a DRF view log, and should have a GUID.
INFO ... project.views This is a DRF view log, and should have a GUID.
WARNING ... project.services.file Some warning in a function
WARNING ... project.services.file Some warning in a function

to this:

INFO ... [773fa6885e03493498077a273d1b7f2d] project.views This is a DRF view log, and should have a GUID.
WARNING ... [773fa6885e03493498077a273d1b7f2d] project.services.file Some warning in a function
INFO ... [0d1c3919e46e4cd2b2f4ac9a187a8ea1] project.views This is a DRF view log, and should have a GUID.
INFO ... [99d44111e9174c5a9494275aa7f28858] project.views This is a DRF view log, and should have a GUID.
WARNING ... [0d1c3919e46e4cd2b2f4ac9a187a8ea1] project.services.file Some warning in a function
WARNING ... [99d44111e9174c5a9494275aa7f28858] project.services.file Some warning in a function
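
If you'd rather wire this up in code than through dictConfig, a minimal sketch could look like the following, assuming the asgi_correlation_id.CorrelationIdFilter class referenced in the v2.0.0 migration guide further down:

import logging

from asgi_correlation_id import CorrelationIdFilter

# Attach the package's filter to a handler, so %(correlation_id)s is available to the formatter.
handler = logging.StreamHandler()
handler.addFilter(CorrelationIdFilter(uuid_length=32))
handler.setFormatter(logging.Formatter('%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s'))

logger = logging.getLogger('my_project')
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)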

Setting up Celery support

Features

What we call "Celery support" is really two distinct features. If you don't care about the details and just want IDs in your logs, feel free to skip ahead to the implementation section.

Feature 1: Passing a correlation ID to a Celery worker

If your app spawns background tasks from HTTP requests, it's pretty useful to be able to discern which request spawned which task.

Celery already has a concept of a correlation_id, but unfortunately it's not something users can hook into. Instead, we can use the before_task_publish signal to pass our correlation ID to the receiving worker via the task headers.
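
To illustrate the idea (a simplified sketch, not the package's actual hook), a before_task_publish receiver could copy the current correlation ID into the outgoing task headers like this:

from celery.signals import before_task_publish

from asgi_correlation_id import correlation_id  # context variable exported by the package

@before_task_publish.connect
def transfer_correlation_id(headers=None, **kwargs) -> None:
    # Sketch only: store the current correlation ID in the task headers, under the same
    # key the worker-side hook reads it from ('CORRELATION_ID' by default).
    cid = correlation_id.get()
    if cid and headers is not None:
        headers['CORRELATION_ID'] = cid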

Feature 2: Keeping track of what spawned a Celery worker

In the case of an HTTP request spawning a background task, we have full information about the sequence of events.

But what happens if that background task spawns more background tasks, or retries and rejections are added to the mix? As soon as more than one task is spawned, the correlation ID is reduced to an "origin ID" - the ID of the HTTP request that spawned the first worker.

In the same way correlation IDs are nice because they connect logs to a single HTTP request, we would like something that gives us the sequence of events when things get complicated. For this purpose, the package tracks two extra IDs:

  • The worker current_id, a generated UUID unique to each new worker process
  • The worker parent_id, which is either the correlation ID (when a background task was spawned from an HTTP request) or the current_id of the worker process that spawned the current worker process.

So to summarize:

  • correlation_id: The ID for the originating HTTP request
  • current_id: The ID of the current worker process
  • parent_id: The ID of the former HTTP/worker process (None if there was none, as in the case of scheduled tasks)

Adding Celery event hooks

Setting up the event hooks is simple: just import configure_celery and run it during startup.

from fastapi import FastAPI

from asgi_correlation_id import configure_celery

app = FastAPI()

app.add_event_handler('startup', configure_celery)

You can look over the event hooks here.

Celery event hook settings

The setup function has a few settings. These are the defaults:

def configure_celery(
        uuid_length=32,
        log_parent=True,
        parent_header='CELERY_PARENT_ID',
        correlation_id_header='CORRELATION_ID',
) -> None:

Each individual setting is described below:

  • uuid_length

    Lets you optionally trim the length of IDs. Probably not needed in most cases, but for, e.g., local development, having 32-character UUIDs in every log line can be excessive.

    Defaults to 32.

  • log_parent

    If `False`, only the correlation ID will be passed to worker processes spawned by HTTP requests - nothing else.

    Defaults to True.

  • correlation_id_header

    The task-header key the correlation ID is stored in. Works like `parent_header` below, but for the correlation ID itself.

    Defaults to CORRELATION_ID.

  • parent_header

    The task-header key the `parent_id` value is stored in. There's no need to change this unless you have other Celery signal functions interacting with task headers.

    Defaults to CELERY_PARENT_ID.
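
For example, trimming IDs during local development could look like this (the values are only illustrative):

from asgi_correlation_id import configure_celery

configure_celery(
    uuid_length=8,    # shorter IDs in local log output
    log_parent=True,  # keep parent/current ID tracking
)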

Configuring Celery logging

If this is your logging config:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
    },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'filters': ['correlation_id'],
            'formatter': 'web',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': ['web'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

You simply need to add these lines of code to log the current_id and parent_id:

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
+       'celery_tracing': {'()': 'asgi_correlation_id.CeleryTracingIdsFilter'},
    },
    'formatters': {
        'web': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s ... [%(correlation_id)s] %(name)s %(message)s',
        },
+       'celery': {
+           'class': 'logging.Formatter',
+           'datefmt': '%H:%M:%S',
+           'format': '%(levelname)s ... [%(correlation_id)s] [%(celery_parent_id)s-%(celery_current_id)s] %(name)s %(message)s',
+       },
    },
    'handlers': {
        'web': {
            'class': 'logging.StreamHandler',
            'filters': ['correlation_id'],
            'formatter': 'web',
        },
+       'celery': {
+           'class': 'logging.StreamHandler',
+           'filters': ['correlation_id', 'celery_tracing'],
+           'formatter': 'celery',
+       },
    },
    'loggers': {
        'my_project': {
+           'handlers': ['celery' if any('celery' in i for i in sys.argv) else 'web'],
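+           # NOTE: the sys.argv check above assumes 'import sys' at the top of this settings module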
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

At this point, it might make more sense to use a JSON formatter, since the logs will become pretty cluttered. Using the uuid_length settings for local development can also be useful. For example:

{
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'correlation_id': {'()': 'asgi_correlation_id.CorrelationIdFilter'},
        'celery_tracing': {'()': 'asgi_correlation_id.CeleryTracingIdsFilter'},
    },
    'formatters': {
        'dev': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': '%(levelname)s:\t\b%(asctime)s %(name)s:%(lineno)d [%(correlation_id)s] %(message)s',
        },
        'dev-celery': {
            'class': 'logging.Formatter',
            'datefmt': '%H:%M:%S',
            'format': (
                '%(levelname)s:\t\b%(asctime)s %(name)s:%(lineno)d [%(correlation_id)s]'
                ' [%(celery_parent_id)s-%(celery_current_id)s] %(message)s'
            ),
        },
        'json': {
            '()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
            'format': """
                asctime: %(asctime)s
                created: %(created)f
                filename: %(filename)s
                funcName: %(funcName)s
                levelname: %(levelname)s
                level: %(levelname)s
                levelno: %(levelno)s
                lineno: %(lineno)d
                message: %(message)s
                module: %(module)s
                msec: %(msecs)d
                name: %(name)s
                pathname: %(pathname)s
                process: %(process)d
                processName: %(processName)s
                relativeCreated: %(relativeCreated)d
                thread: %(thread)d
                threadName: %(threadName)s
                exc_info: %(exc_info)s
                correlation-id: %(correlation_id)s
                celery-current-id: %(celery_current_id)s
                celery-parent-id: %(celery_parent_id)s
            """,
            'datefmt': '%Y-%m-%d %H:%M:%S',
        },
    },
    'handlers': {
        'dev': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id'],
            'formatter': 'dev',
        },
        'dev-celery': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id', 'celery_tracing'],
            'formatter': 'dev-celery',
        },
        'json': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',
            'filters': ['correlation_id'],
            'formatter': 'json',
        },
    },
    'loggers': {
        'my_project': {
            'handlers': [
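                # NOTE: 'settings' and 'sys' are assumed to be imported in the surrounding settings module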
                'json' if settings.ENVIRONMENT != 'dev'
                else 'dev-celery' if any('celery' in i for i in sys.argv)
                else 'dev'
            ],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}

Sentry support

If your project has sentry-sdk installed, correlation IDs are automatically added to Sentry events as a transaction_id. This is also the case for Celery tasks, if Celery support is enabled.

See this blogpost for more details.
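
As a rough illustration of what the integration does (a sketch, not the package's exact code), tagging the current scope yourself could look like this:

import sentry_sdk

from asgi_correlation_id import correlation_id

def tag_sentry_event() -> None:
    # Sketch only: attach the current correlation ID to subsequent Sentry events.
    cid = correlation_id.get()
    if cid:
        sentry_sdk.set_tag('transaction_id', cid)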

Comments
  • ID generator for celery extension

    Refer to #50

    Please review and I can address any requested changes :)

    Added customizable generator options to celery extension to better match capabilities of the correlation id middleware

    • Added optional "header_key" param to asgi_correlation_id.extensions.celery.load_correlation_id
    • Added optional "generator" param to asgi_correlation_id.extensions.celery.load_correlation_id
    • Added optional "generator" param to asgi_correlation_id.extensions.celery.load_celery_current_and_parent_ids
    opened by dapryor 16
  • Avoid using log filter to modify/enrich log record

    The current implementation provides log filter classes which can be used to enrich the log records with correlation IDs. This is a surprising use of a filter since it has a side effect and modifies a log record. For example, when attaching the filter to one handler, another handler may see the modified log record.

    According to the documentation (just below https://docs.python.org/3/library/logging.html?highlight=logrecord#logging.LogRecord.getMessage), the intended pattern for custom log record generation is to use a modified log record factory function like this:

    old_factory = logging.getLogRecordFactory()
    
    def record_factory(*args, **kwargs):
        record = old_factory(*args, **kwargs)
        record.custom_attribute = 0xdecafbad
        return record
    
    logging.setLogRecordFactory(record_factory)
    

    The only annoying thing here is that this one-off setting of the factory cannot be done using logging.config.

    Short of asking app developers to bootstrap the record factory manually, there may be an alternative whereby the ASGI middleware could be responsible for this. When using a class-based middleware implementation, could the middleware's __init__() constructor set the log record factory?

    opened by faph 11
  • The headers with same key are overwritten

    Starlette responses add cookies as headers with the same key: self.raw_headers.append((b"set-cookie", cookie_val.encode("latin-1")))

    In your middleware, handle_outgoing_request overwrites headers that share a key, keeping only the last value.

            async def handle_outgoing_request(message: Message) -> None:
                if message['type'] == 'http.response.start':
                    headers = {k.decode(): v.decode() for (k, v) in message['headers']}
                    headers[self.header_name] = correlation_id.get()
                    headers['Access-Control-Expose-Headers'] = self.header_name
                    response_headers = Headers(headers=headers)
                    message['headers'] = response_headers.raw
                await send(message)
    
    opened by lakshaythareja 11
  • correlation_id.get got empty in fastapi exception_handler

    I want to use correlation_id.get() to get a trace_id in my exception_handler, but it comes back empty. Is there anything wrong?

    full code:

    import uvicorn
    
    from asgi_correlation_id import CorrelationIdMiddleware, correlation_id
    from fastapi import FastAPI, Request, status
    from fastapi.responses import JSONResponse
    
    app = FastAPI()
    
    @app.exception_handler(Exception)
    async def unicorn_exception_handler(request: Request, exception: Exception):
        trace_id = correlation_id.get() or 'ccc'
        return JSONResponse(
            status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
            content={"code": config.errcode.INSIDE_ERROR, "msg": "error.", "data": [], "trace": trace_id}
        )
    
    app.add_middleware(
        CorrelationIdMiddleware,
        header_name="X-Trace"
    )
    
    @app.get("/items")
    async def read_item(request: Request):
        trace_id = correlation_id.get()
        print(trace_id)
        raise AttributeError
        return {"item_id": "100", "trace": trace_id}
    
    if __name__ == '__main__':
        uvicorn.run(app='main:app', host="0.0.0.0", port=8000, reload=True)
    

    request:

    curl   -XGET  http://<server-ip>:8000/items
    

    response:

    Http Code: 500
    {"code":99999,"msg":"error.","data":[],"trace":"ccc"}
    

    Thanks

    opened by liyongzhezz 9
  • Correlation-ID header doesn't propagate to outgoing requests.

    As in a typical microservice architecture, I want the correlation ID to be automatically propagated to outbound requests made from one API to other APIs. I'm using the HTTPX AsyncClient for this. But unless I explicitly add the header to the client object, the other API services don't receive it and create their own IDs, which ultimately defeats the purpose of having a correlation ID.

    opened by prashantk1220 7
  • After asgi-correlation-id is used, the log cannot be printed when creating fastapi app

    I configured loguru in fastapi according to the example, and the code is as follows.

    import logging
    import sys
    
    import uvicorn
    from asgi_correlation_id import CorrelationIdMiddleware
    from asgi_correlation_id.context import correlation_id
    from fastapi import FastAPI
    from loguru import logger
    
    logger.remove()
    
    
    def correlation_id_filter(record):
        record['correlation_id'] = correlation_id.get()
        return record['correlation_id']
    
    
    fmt = "<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | <level>{level: <8}</level> | <red> {correlation_id} </red> | <cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> - <level>{message}</level>"
    
    logger.add(sys.stderr, format=fmt, level=logging.DEBUG, filter=correlation_id_filter)
    
    
    def start_app():
        logger.info('start app....')
        print('start app....')
    
    
    app = FastAPI()
    
    app.add_middleware(CorrelationIdMiddleware)
    
    app.add_event_handler('startup', start_app)
    
    
    @app.get('/')
    def index():
        logger.info(f"Request with id ")
        return 'OK'
    
    
    if __name__ == '__main__':
        uvicorn.run(app='mainw:app', host='0.0.0.0', port=8000, reload=False, debug=True)
    

    After the project starts, logger does not print the log message in start_app(); only print() produces output.

    /Users/slowchen/.virtualenvs/fastapi/bin/python /Users/slowchen/workspace/FastApi/tianya-fastapi/app/mainw.py
    INFO:     Will watch for changes in these directories: ['/Users/slowchen/workspace/FastApi/tianya-fastapi/app']
    INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
    INFO:     Started reloader process [31627] using statreload
    INFO:     Started server process [31629]
    INFO:     Waiting for application startup.
    INFO:     Application startup complete.
    start app....
    INFO:     127.0.0.1:52057 - "GET / HTTP/1.1" 200 OK
    2022-03-12 13:43:30.186 | INFO     |  048a508fb369425fb25a63631899b99c  | mainw:index:37 - Request with id 
    

    How can I configure it so the logger also prints correctly during FastAPI startup and when adding the middleware?

    opened by slowchen 7
  • Correlation header not present in response for 500 errors

    Responses that are returned without error, or responses that are returned when an HTTPException is raised, include the correlation header name and the correlation id.

    However, when an unhandled exception is raised, for example with this code:

    @app.get("/")
    async def root():
        a = 1/0
        return {"message": "Hello!"}
    

    then a 500 error is returned to the client, and the correlation header is not included.

    It seems like it would be especially nice to have the correlation id included in the response headers for such cases.

    opened by brki 7
  • Can you please give an example of how to add parent id and current process id in celery tasks for loguru?

    I use loguru for logging and saw your example for adding the correlation ID to requests. But I am unable to add the parent and current process IDs for Celery tasks in loguru. Can you help?

    opened by lakshaythareja 6
  • How to get Correlation ID inside the view function in FastAPI?

    Hey author, thanks for the amazing work! Our team is planning to onboard this library in our projects. I have been trying to retrieve the created correlation ID inside a view function in FastAPI but I am not able to query it somehow. I have tried querying the request.headers and request.body(), and am still not able to find it. How can I get that?

    opened by raghavsikaria 6
  • How to configure asgi-correlation with loguru?

    Hello there!

    Loguru seems to be getting more popular as time goes by. I've tried to set up a very minimal example to get your library working with it, but it seems it can't find the correlation ID.

    I've tried several things, and right now I'm doing this:

    import logging
    import sys
    
    import uvicorn
    from asgi_correlation_id import CorrelationIdMiddleware
    from fastapi import FastAPI
    from loguru import logger
    
    app = FastAPI()
    
    app.add_middleware(CorrelationIdMiddleware)
    
    # remove default handler
    logger.remove()
    
    fmt = "[{time}] [application_name] [{extra[correlation_id]}] [{level}] - {name}:{function}:{line} - {message}"
    logger.add(sys.stderr, format=fmt, level=logging.DEBUG)
    
    @app.get('/')
    def index():
        logger.info(f"Request with id ")
        return 'OK'
    
    
    
    if __name__ == '__main__':
        uvicorn.run(app)
    
    
    

    And I get:

    Record was: {'elapsed': datetime.timedelta(seconds=1, microseconds=1880), 'exception': None, 'extra': {}, 'file': (name='another_main.py', path='C:\\Users\\User\\PycharmProjects\\x\\y\\another_main.py'), 'function': 'index', 'level': (name='INFO', no=20, icon='ℹ️'), 'line': 21, 'message': 'Request with id ', 'module': 'another_main', 'name': '__main__', 'process': (id=3412, name='MainProcess'), 'thread': (id=9164, name='asyncio_0'), 'time': datetime(2021, 11, 17, 15, 7, 54, 443182, tzinfo=datetime.timezone(datetime.timedelta(seconds=3600), 'Paris, Madrid'))}
    Traceback (most recent call last):
      File "C:\Users\User\PycharmProjects\x\venv\lib\site-packages\loguru\_handler.py", line 155, in emit
        formatted = precomputed_format.format_map(formatter_record)
    KeyError: 'correlation_id'
    --- End of logging error ---
    

    Got any ideas? Thanks!

    opened by sorasful 6
  • How to pass correlation_id to tasks executed in a multithreaded environment?

    EDIT: Changed the name of the issue for better searchability; you can find the solution to the question here


    Hey there!

    I feel pretty stupid asking this question, but can you explain to me how I should create my logger instance to have a correlation_id?

    Currently I create my logger at the top of a router file:

    import logging
    from fastapi import APIRouter, HTTPException
    
    LOG = logging.getLogger(__name__)
    
    router = APIRouter(prefix="/my/route", responses={404: {"description": "Not found"}})
    
    @router.get("/")
    def handler():
       LOG.info("Hello!")
    

    And I get

    [2022-07-19T20:37:48] INFO [None] path.to.module | Hello
    

    when my logging configuration is as follows:

        "formatters": {
            "default": {
                "format": "[%(asctime)s] %(levelname)s [%(correlation_id)s] %(name)s | %(message)s",
                "datefmt": "%Y-%m-%dT%H:%M:%S",
            }
        },
    
        app.add_middleware(
            CorrelationIdMiddleware,
            header_name='X-Request-ID',
            generator=lambda: uuid4().hex,
            validator=is_valid_uuid4,
            transformer=lambda a: a,
        )
     
    

    -- I would like to have my correlation_id show up in my log like so:

    [2022-07-19T20:37:48] INFO [8fe9728a] path.to.module | Hello
    

    I can't find anything about this in either the Starlette or FastAPI documentation. It's like everybody knows this and it's not worth mentioning 🤔

    Can you give me an example of how I should get a logger instance to have the request id show up?

    Thanks for your help!

    opened by philippefutureboy 5
Releases(v3.2.1)
  • v3.2.1(Nov 18, 2022)

    What's Changed

    • chore: allow newer starlette versions by @JonasKs in https://github.com/snok/asgi-correlation-id/pull/58
      • thanks to @greenape for his contributions in #56

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.2.0...v3.2.1

  • v3.2.0(Oct 14, 2022)

    What's Changed

    • Contextvars added to package __init__ by @Bobronium in https://github.com/snok/asgi-correlation-id/pull/54

    New Contributors

    • @Bobronium made their first contribution in https://github.com/snok/asgi-correlation-id/pull/54

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.1.0...v3.2.0

  • v3.1.0(Sep 29, 2022)

    What's Changed

    • docs: Add docs for how to integrate with saq by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/47
    • chore: Update workflows by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/48
    • chore: Upgrade to Poetry 1.2.0 by @sondrelg in https://github.com/snok/asgi-correlation-id/pull/49
    • feat: Add ability to specify celery-integration generated IDs and fix celery log filter signature inconsistency by @dapryor in https://github.com/snok/asgi-correlation-id/pull/51

    New Contributors

    • @dapryor made their first contribution in https://github.com/snok/asgi-correlation-id/pull/51

    Full Changelog: https://github.com/snok/asgi-correlation-id/compare/v3.0.1...v3.1.0

  • v3.0.1(Jul 27, 2022)

  • v3.0.0(May 18, 2022)

    Changes

    Breaking changes

    • Reworked the middleware settings (#39)

    • Reworded a warning logger (https://github.com/snok/asgi-correlation-id/commit/1883b31b0b115b2e6706c03f9bf94fadfeebca7a). This could potentially break log filters or monitoring dashboards, though it's probably a non-issue for most.

    Migration guide

    The validate_header_as_uuid middleware argument was removed.

    If your project uses validate_header_as_uuid=False, this is how the middleware configuration should change:

    app.add_middleware(
        CorrelationIdMiddleware,
        header_name='X-Request-ID',
    -    validate_header_as_uuid=False
    +    validator=None,
    )
    

    Otherwise, just make sure to remove validate_header_as_uuid if used.

    Read more about the new configuration options here.

  • v3.0.0a1(May 16, 2022)

  • v2.0.0(Apr 30, 2022)

    v2.0.0 release

    Changes

    Breaking changes

    • Drops Python 3.6
    • Old log filter factories were removed. All users will need to follow the migration guide below to upgrade.

    Non-breaking changes

    • Adds 3.11 support

    Migration guide

    The celery_tracing_id_filter and correlation_id_filter callables have been removed in the latest release.

    To upgrade, change from this log filter implementation:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
            'correlation_id': {'()': correlation_id_filter(uuid_length=32)},
            'celery_tracing': {'()': celery_tracing_id_filter(uuid_length=32)},
        },
        ...
    }
    

    To this one:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
            'correlation_id': {
                '()': 'asgi_correlation_id.CorrelationIdFilter',
                'uuid_length': 32,
            },
            'celery_tracing': {
                '()': 'asgi_correlation_id.CeleryTracingIdsFilter',
                'uuid_length': 32,
            },
        },
        ...
    }
    

    When upgrading a project which only implemented correlation_id_filter, you should expect this diff:

    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'filters': {
    -        'correlation_id': {'()': correlation_id_filter(uuid_length=32)},
    +        'correlation_id': {
    +            '()': 'asgi_correlation_id.CorrelationIdFilter',
    +            'uuid_length': 32,
    +        },
        },
        ...
    }
    

    See the repository README for updated documentation.

  • v1.1.4(Mar 22, 2022)

  • v1.1.3(Mar 21, 2022)

  • v1.1.2(Nov 30, 2021)

  • v1.1.1(Nov 20, 2021)

  • v1.1.0(Nov 9, 2021)

  • v1.0.1(Nov 2, 2021)

  • v1.0.0(Nov 1, 2021)
