Logprep: The Swiss Army Knife for Logs
This is the documentation for Logprep, the Swiss Army knife for logs. It provides tools for:
- collection of logs from various sources
- normalization via different processors
- shipping to different data lake targets
- generation of events for load testing
- pseudonymization and depseudonymization of fields in log data to comply with the GDPR

And it is written in Python!
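These pieces come together in a single pipeline configuration file. The following is a minimal, illustrative sketch assuming a Kafka input, one Dissector processor, and an Opensearch output; the key and connector type names here are assumptions, so check them against the Configuration section before use:

```yaml
# Illustrative Logprep pipeline configuration.
# Key and type names (confluentkafka_input, dissector, opensearch_output)
# are assumptions; consult the Configuration section for the real schema.
input:
  kafka_in:
    type: confluentkafka_input
    topic: raw-logs
pipeline:
  - dissector:
      type: dissector
      rules:
        - rules/dissector/
output:
  opensearch_out:
    type: opensearch_output
    hosts:
      - localhost:9200
```

The general shape, an input connector, an ordered list of processors, and one or more output connectors, mirrors the Input, Processors, and Output chapters listed below.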
- Installation
- User Manual
- Configuration
- Configuration File Structure
- Input
- Output
- Processors
- Amides
- Calculator
- Clusterer
- Concatenator
- DatetimeExtractor
- Deleter
- Decoder
- Dissector
- DomainLabelExtractor
- DomainResolver
- Dropper
- FieldManager
- GenericAdder
- GenericResolver
- GeoipEnricher
- Grokker
- IpInformer
- KeyChecker
- Labeler
- ListComparison
- PreDetector
- Pseudonymizer
- Replacer
- Requester
- SelectiveExtractor
- StringSplitter
- TemplateReplacer
- Timestamper
- TimestampDiffer
- Rules
- Getters
- Metrics
- YAML Tags
- Development
- Architecture
- Implementing a new Connector
- Implementing a new Processor
- Registering a new Component
- Testing
- Profiling & Benchmarking
- Processor Case Examples
- Lucene regex filter
- Concatenator
- Calculator
- Dissector
- FieldManager
- Generic Adder
- Grokker
- GeoipEnricher
- IpInformer
- KeyChecker
- Requester
- StringSplitter
- Timestamper
- TimestampDiffer
- Pseudonymization and Depseudonymization
- Pseudonymizer Processor
- Usage of the EventMetaData Class
- Pipeline Example
- Demonstration of the Event States
- Usage of the EventClass - demonstration on concrete class LogEvent
- Specialized Event Classes: ErrorEvent, PseudonymEvent, SreEvent
- The Event Backlog
- Usage of Opensearch Output Connector with Event Objects
- Usage of ConfluentKafka Output Connector with Event Objects
- Usage of Processors with Event Objects
- Input Connectors -> ConfluentKafka Input
- Start Logprep programmatically
- Example Deployments