Secure. Structured.
Production-Ready Logging.
A logging library for data professionals and developers who need reliable, secure logging with minimal setup — built on Python's standard logging module.
Why PyLogShield?¶
PyLogShield extends Python's standard logging module with production-ready features commonly needed in data engineering and application development — without extra complexity. Its only dependencies are rich and typer.
Sensitive Data Masking¶
Automatically masks passwords, tokens, API keys, and custom fields. Never accidentally leak credentials in your logs again.
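The idea behind masking can be sketched with a plain stdlib logging.Filter — the field set and the "***" replacement below are illustrative, not PyLogShield's actual internals:

```python
import logging

# Illustrative set of keys to mask; PyLogShield also supports custom fields.
SENSITIVE_FIELDS = {"password", "token", "api_key"}

class MaskingFilter(logging.Filter):
    """Replace values of sensitive keys when the log message is a dict."""

    def filter(self, record):
        if isinstance(record.msg, dict):
            record.msg = {
                k: "***" if k in SENSITIVE_FIELDS else v
                for k, v in record.msg.items()
            }
        return True  # never drop the record, only rewrite it
```

Attached via `logger.addFilter(MaskingFilter())`, this rewrites dict payloads before any handler sees them.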
Rate Limiting¶
Prevent log flooding by suppressing duplicate messages within a configurable time window.
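Duplicate suppression can be approximated with a stdlib filter that remembers when each message was last emitted — a minimal sketch of the idea behind the `rate_limit_seconds` option, not the library's code:

```python
import logging
import time

class RateLimitFilter(logging.Filter):
    """Drop repeats of the same message inside a time window."""

    def __init__(self, window=0.5):
        super().__init__()
        self.window = window
        self._last_seen = {}  # rendered message -> timestamp of last emission

    def filter(self, record):
        now = time.monotonic()
        key = record.getMessage()
        last = self._last_seen.get(key)
        if last is not None and now - last < self.window:
            return False  # duplicate inside the window: suppress
        self._last_seen[key] = now
        return True
```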
JSON Formatting¶
Structured JSON with ISO 8601 timestamps. Ready for ELK, Splunk, CloudWatch, and Datadog.
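A structured formatter of this kind can be built on the stdlib alone; the field names below are an assumption for illustration, not PyLogShield's exact schema:

```python
import json
import logging
from datetime import datetime, timezone

class JsonFormatter(logging.Formatter):
    """Emit one JSON object per record with an ISO 8601 UTC timestamp."""

    def format(self, record):
        payload = {
            "timestamp": datetime.fromtimestamp(
                record.created, tz=timezone.utc
            ).isoformat(),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        return json.dumps(payload)
```

One JSON object per line is exactly what log shippers for ELK, Splunk, CloudWatch, and Datadog expect.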
Log Rotation¶
Automatically rotate log files based on size with configurable backup counts.
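Size-based rotation is a stdlib capability (`logging.handlers.RotatingFileHandler`) that options like `rotate_file` and `rotate_max_bytes` presumably wrap; a self-contained sketch:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Write to a temp directory so the example is self-contained.
log_path = os.path.join(tempfile.mkdtemp(), "app.log")

# Roll over at 10 MB, keeping up to 5 old files (app.log.1 ... app.log.5).
handler = RotatingFileHandler(log_path, maxBytes=10_000_000, backupCount=5)
logger = logging.getLogger("rotating_demo")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("this line goes to a size-rotated file")
handler.flush()
```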
Async Logging¶
Offload logging to a background thread via QueueHandler. Non-blocking for high-throughput apps.
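The stdlib pattern the `use_queue` flag is presumably built on pairs a QueueHandler (cheap enqueue on the caller's thread) with a QueueListener (real I/O on a background thread):

```python
import io
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

log_queue = queue.Queue(maxsize=50_000)  # cap queue memory

# The listener's background thread does the actual writing;
# io.StringIO stands in for a real console or file stream here.
stream = io.StringIO()
listener = QueueListener(log_queue, logging.StreamHandler(stream))
listener.start()

logger = logging.getLogger("async_demo")
logger.setLevel(logging.INFO)
logger.addHandler(QueueHandler(log_queue))

logger.info("non-blocking log call")  # only enqueues, never blocks on I/O

listener.stop()  # drains the queue before the thread exits
```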
CLI Log Viewer¶
View and follow logs from the command line with rich formatting and level filtering.
Context Propagation¶
Inject structured fields into every log within a block — thread-safe and asyncio-safe via contextvars.
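The mechanism can be sketched with contextvars and a filter; the `log_context` name below is hypothetical, not necessarily PyLogShield's API:

```python
import contextvars
import logging
from contextlib import contextmanager

_ctx = contextvars.ContextVar("log_context", default={})

@contextmanager
def log_context(**fields):
    """Merge fields into the logging context for the enclosed block."""
    token = _ctx.set({**_ctx.get(), **fields})
    try:
        yield
    finally:
        _ctx.reset(token)  # restore the previous context on exit

class ContextFilter(logging.Filter):
    """Copy the current context fields onto every record."""

    def filter(self, record):
        for key, value in _ctx.get().items():
            setattr(record, key, value)
        return True
```

Because contextvars are isolated per thread and per asyncio task, concurrent requests never see each other's fields.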
FastAPI Middleware¶
Automatically inject request_id, HTTP method, path, and client IP into every log during a request.
Quick Start¶
Installation¶
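Assuming the PyPI package name matches the import name:

```shell
pip install pylogshield
```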
Basic Usage¶
from pylogshield import get_logger
# Create a logger
logger = get_logger("my_app", log_level="INFO")
# Standard logging
logger.info("Application started")
logger.warning("Low memory")
logger.error("Connection failed")
# Log with sensitive data masking
logger.info({
    "user": "john",
    "api_key": "sk-1234567890"
}, mask=True)
# Output: {"user": "john", "api_key": "***"}
Production Configuration¶
from pylogshield import get_logger, add_sensitive_fields
# Add custom sensitive fields
add_sensitive_fields(["ssn", "credit_card"])
# Create a production-ready logger
logger = get_logger(
    "production_app",
    log_level="INFO",
    enable_json=True,             # Structured JSON output
    rotate_file=True,             # Auto-rotate logs
    rotate_max_bytes=10_000_000,  # 10 MB per file
    rate_limit_seconds=0.5,       # Prevent flooding
    use_queue=True,               # Async logging
    queue_maxsize=50_000,         # Cap queue memory
    enable_metrics=True,          # Track log stats
    enable_context=True,          # Structured context injection
)
logger.info("Production logger ready")
Feature Comparison¶
| Feature | Standard Logging | PyLogShield |
|---|---|---|
| Basic logging | ✓ | ✓ |
| Sensitive data masking | ✗ | ✓ |
| Rate limiting | ✗ | ✓ |
| JSON formatting | Manual setup | Built-in |
| Log rotation | Separate handler | Integrated |
| Async logging | Manual setup | One flag |
| CLI viewer | ✗ | ✓ |
| Metrics | ✗ | ✓ |
| Context propagation | ✗ | ✓ |
| FastAPI middleware | ✗ | ✓ |
| Cloud credential scrubbing | ✗ | ✓ |
Architecture¶
PyLogShield wraps every log call in a processing pipeline before handing off to your configured output handlers.
flowchart LR
    APP(["Your Application\nlogger.info(msg, mask=True)"])
    subgraph PIPELINE ["Processing Pipeline"]
        direction TB
        A["🔒 Sensitive Data Masking"]
        B["🚦 Rate Limiter"]
        C["🧵 Context Injection"]
        D["☁️ Cloud Credential Scrubber"]
        A --> B --> C --> D
    end
    subgraph OUTPUT ["Output Handlers"]
        direction TB
        H1["Console / Rich"]
        H2["File / Rotating File"]
        H3["JSON Formatter"]
        H4["Async Queue → Background Thread"]
        H5["Metrics Tracker"]
    end
    APP --> PIPELINE --> OUTPUT
Next Steps¶
Getting Started¶
Learn how to install and configure PyLogShield for your project.
Recipes¶
End-to-end examples: FastAPI service, data pipelines, async workers, testing.
Contributing¶
All contributions are welcome! If you have a suggestion that would make this better, please fork the repo and create a pull request.