Python 3.8+  ·  MIT License  ·  Minimal Dependencies

Secure. Structured.
Production-Ready Logging.

A logging library for data professionals and developers who need reliable, secure logging with minimal setup — built on Python's standard logging module.

python — app.py
# Secure, structured logging in two lines
from pylogshield import get_logger
logger = get_logger("api", enable_json=True, use_queue=True)

# Sensitive fields masked automatically
logger.info({"user": "alice", "password": "s3cr3t"}, mask=True)
# → {"timestamp":"2026-04-17T10:22:01Z","level":"INFO","message":{"user":"alice","password":"***"}}

# Context propagates through the entire request
with logger.context(request_id="req-8f2c", user_id=42):
    logger.info("Order processed")
# → {"level":"INFO","request_id":"req-8f2c","user_id":42,"message":"Order processed"}
11 Features
2 Lightweight Deps
3.8+ Python
MIT License



Why PyLogShield?

PyLogShield extends Python's standard logging module with the production-ready features commonly needed in data engineering and application development, without added complexity. Its only third-party dependencies are rich and typer.

Sensitive Data Masking

Automatically masks passwords, tokens, API keys, and custom fields. Never accidentally leak credentials in your logs again.

Python
logger.info({"password": "secret"}, mask=True)
# Output: {"password": "***"}
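Conceptually, masking is a recursive walk over the payload that replaces values under sensitive keys. The sketch below is illustrative only; SENSITIVE and mask_payload are hypothetical names, not PyLogShield's API:

```python
# Illustrative sketch of recursive field masking
# (not PyLogShield's actual implementation).
SENSITIVE = {"password", "token", "api_key", "secret"}

def mask_payload(payload):
    """Return a copy of payload with sensitive values replaced by '***'."""
    if isinstance(payload, dict):
        return {
            key: "***" if key.lower() in SENSITIVE else mask_payload(value)
            for key, value in payload.items()
        }
    if isinstance(payload, list):
        return [mask_payload(item) for item in payload]
    return payload

print(mask_payload({"user": "alice", "password": "s3cr3t"}))
# → {'user': 'alice', 'password': '***'}
```

Because the walk recurses into nested dicts and lists, credentials buried deep in a payload are masked as well.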

Rate Limiting

Prevent log flooding by suppressing duplicate messages within a configurable time window.

Python
logger = get_logger("app", rate_limit_seconds=2.0)
logger.info("Retry")  # Logged
logger.info("Retry")  # Suppressed (within 2s)
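The same behavior can be approximated with a stock `logging.Filter` that remembers when each message was last allowed through. This is a sketch of the idea, not PyLogShield's implementation:

```python
import logging
import time

class RateLimitFilter(logging.Filter):
    """Suppress a message if the same text was logged within the window.
    Illustrative sketch only, not PyLogShield's implementation."""

    def __init__(self, window_seconds=2.0):
        super().__init__()
        self.window = window_seconds
        self._last_seen = {}  # message text -> monotonic time last logged

    def filter(self, record):
        now = time.monotonic()
        last = self._last_seen.get(record.msg)
        if last is not None and (now - last) < self.window:
            return False  # duplicate inside the window: suppress
        self._last_seen[record.msg] = now
        return True

logger = logging.getLogger("ratelimit_demo")
logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.INFO)
logger.addFilter(RateLimitFilter(2.0))
logger.info("Retry")  # logged
logger.info("Retry")  # suppressed (within 2s)
```

Distinct messages each get their own window, so unrelated log lines are never suppressed.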

JSON Formatting

Structured JSON with ISO 8601 timestamps. Ready for ELK, Splunk, CloudWatch, and Datadog.

Python
logger = get_logger("app", enable_json=True)
logger.info("Started")
# {"timestamp": "...", "level": "INFO", ...}
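With plain stdlib logging, this takes a custom Formatter. A minimal sketch of one JSON object per line with an ISO 8601 UTC timestamp (JsonFormatter is a hypothetical name, not PyLogShield's formatter):

```python
import json
import logging
from datetime import datetime, timezone

class JsonFormatter(logging.Formatter):
    """Emit one JSON object per record with an ISO 8601 UTC timestamp.
    Illustrative sketch, not PyLogShield's actual formatter."""

    def format(self, record):
        return json.dumps({
            "timestamp": datetime.fromtimestamp(record.created, tz=timezone.utc)
                         .isoformat().replace("+00:00", "Z"),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("json_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("Started")
# e.g. {"timestamp": "2026-04-17T10:22:01Z", "level": "INFO", ...}
```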

Log Rotation

Automatically rotate log files based on size with configurable backup counts.

Python
logger = get_logger("app",
    rotate_file=True,
    rotate_max_bytes=5_000_000)
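Size-based rotation with backups maps onto the stdlib's RotatingFileHandler. A rough stdlib equivalent, with a temporary path chosen purely for the example:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Rotate at ~5 MB, keeping 3 numbered backup files (app.log.1, .2, .3).
log_path = os.path.join(tempfile.mkdtemp(), "app.log")
handler = RotatingFileHandler(log_path, maxBytes=5_000_000, backupCount=3)

logger = logging.getLogger("rotation_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("This line goes to app.log, rotated by size")
```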

Async Logging

Offload logging to a background thread via QueueHandler. Non-blocking for high-throughput apps.

Python
logger = get_logger("app", use_queue=True)
# Non-blocking log writes
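The stdlib equivalent wires a QueueHandler to a QueueListener running on a background thread. A sketch, assuming an in-memory queue.Queue and a StringIO sink so the output is easy to inspect:

```python
import io
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

# The calling thread only enqueues the record; a background thread
# owned by the listener performs the actual (possibly slow) I/O.
buffer = io.StringIO()
log_queue = queue.Queue(maxsize=50_000)
listener = QueueListener(log_queue, logging.StreamHandler(buffer))
listener.start()

logger = logging.getLogger("queue_demo")
logger.addHandler(QueueHandler(log_queue))
logger.setLevel(logging.INFO)
logger.info("Non-blocking log write")  # enqueue and return immediately

listener.stop()  # drain the queue and join the background thread
```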

CLI Log Viewer

View and follow logs from the command line with rich formatting and level filtering.

Bash
pylogshield follow -f app.log -l ERROR

Context Propagation

Inject structured fields into every log within a block — thread-safe and asyncio-safe via contextvars.

Python
with log_context(request_id="abc", user_id=42):
    logger.info("Processing")
# JSON output includes request_id and user_id

FastAPI Middleware

Automatically inject request_id, HTTP method, path, and client IP into every log during a request.

Python
app.add_middleware(PyLogShieldMiddleware, logger=logger)
# Every log in a request carries request context
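In ASGI terms, such middleware attaches per-request fields to the connection scope before calling the inner app. A dependency-free sketch of the idea; the class and field names are hypothetical, not PyLogShield's actual middleware:

```python
import asyncio
import uuid

class RequestContextMiddleware:
    """Attach per-request fields that downstream log calls can pick up.
    Illustrative ASGI sketch, not PyLogShield's implementation."""

    def __init__(self, app):
        self.app = app

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http":
            scope.setdefault("state", {}).update(
                request_id=str(uuid.uuid4()),
                method=scope["method"],
                path=scope["path"],
            )
        await self.app(scope, receive, send)

captured = {}

async def app(scope, receive, send):
    # A downstream handler sees the injected request context.
    captured.update(scope["state"])

asyncio.run(RequestContextMiddleware(app)(
    {"type": "http", "method": "GET", "path": "/orders"}, None, None))
print(captured["method"], captured["path"])  # GET /orders
```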

Quick Start

Installation

Bash
pip install pylogshield

Basic Usage

Python
from pylogshield import get_logger

# Create a logger
logger = get_logger("my_app", log_level="INFO")

# Standard logging
logger.info("Application started")
logger.warning("Low memory")
logger.error("Connection failed")

# Log with sensitive data masking
logger.info({
    "user": "john",
    "api_key": "sk-1234567890"
}, mask=True)
# Output: {"user": "john", "api_key": "***"}

Production Configuration

Python
from pylogshield import get_logger, add_sensitive_fields

# Add custom sensitive fields
add_sensitive_fields(["ssn", "credit_card"])

# Create a production-ready logger
logger = get_logger(
    "production_app",
    log_level="INFO",
    enable_json=True,            # Structured JSON output
    rotate_file=True,            # Auto-rotate logs
    rotate_max_bytes=10_000_000, # 10 MB per file
    rate_limit_seconds=0.5,      # Prevent flooding
    use_queue=True,              # Async logging
    queue_maxsize=50_000,        # Cap queue memory
    enable_metrics=True,         # Track log stats
    enable_context=True,         # Structured context injection
)

logger.info("Production logger ready")

Feature Comparison

| Feature | Standard Logging | PyLogShield |
| --- | --- | --- |
| Basic logging | ✅ | ✅ |
| Sensitive data masking | ❌ | ✅ |
| Rate limiting | ❌ | ✅ |
| JSON formatting | Manual setup | Built-in |
| Log rotation | Separate handler | Integrated |
| Async logging | Manual setup | One flag |
| CLI viewer | ❌ | ✅ |
| Metrics | ❌ | ✅ |
| Context propagation | ❌ | ✅ |
| FastAPI middleware | ❌ | ✅ |
| Cloud credential scrubbing | ❌ | ✅ |

Architecture

PyLogShield wraps every log call in a processing pipeline before handing off to your configured output handlers.

flowchart LR
    APP(["Your Application\nlogger.info(msg, mask=True)"])

    subgraph PIPELINE ["Processing Pipeline"]
        direction TB
        A["🔒 Sensitive Data Masking"]
        B["🚦 Rate Limiter"]
        C["🧵 Context Injection"]
        D["☁️ Cloud Credential Scrubber"]
        A --> B --> C --> D
    end

    subgraph OUTPUT ["Output Handlers"]
        direction TB
        H1["Console / Rich"]
        H2["File / Rotating File"]
        H3["JSON Formatter"]
        H4["Async Queue → Background Thread"]
        H5["Metrics Tracker"]
    end

    APP --> PIPELINE --> OUTPUT
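As a toy model, each pipeline stage can be a function that transforms the record or returns None to drop it; the stages and names below are illustrative, not PyLogShield internals:

```python
# Toy model of the processing pipeline: each stage transforms the
# record dict or returns None to drop it (illustrative only).
def mask_stage(record):
    if record.get("password"):
        record["password"] = "***"
    return record

def rate_limit_stage(record, _seen=set()):
    key = str(record)
    if key in _seen:
        return None  # duplicate: drop the record
    _seen.add(key)
    return record

PIPELINE = [mask_stage, rate_limit_stage]

def process(record):
    for stage in PIPELINE:
        record = stage(record)
        if record is None:
            return None  # a stage dropped the record; stop early
    return record

print(process({"message": "login", "password": "s3cr3t"}))
# → {'message': 'login', 'password': '***'}
```

Only records that survive every stage reach the output handlers, which is why a suppressed duplicate never touches the file or console.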

Next Steps

Getting Started

Learn how to install and configure PyLogShield for your project.

Installation Guide

Usage Guide

Explore all features with detailed examples.

Basic Usage

Recipes

End-to-end examples: FastAPI service, data pipelines, async workers, testing.

Recipes & Cookbooks

API Reference

Complete API documentation with all parameters and options.

API Reference


Contributing

All contributions are welcome! If you have a suggestion that would make this better, please fork the repo and create a pull request.

View on GitHub Report an Issue