Ingestion architecture
Data source integration is a core capability of the platform. It allows any data source to be integrated regardless of its location: inside a client's private network, hosted publicly on the Internet, sending events directly from a cloud service, running on a public cloud provider, or residing within a 5G telecommunications core.
The platform is designed with a 5 + 1 layer architecture, in which each layer provides services to the layers immediately above and below it. This approach makes it possible to build robust log shipping pipelines with comprehensive error handling, mechanisms specialised for each layer, and clear abstraction between layers.

Below are detailed descriptions of the roles and mechanisms associated with each layer within the ingestion architecture:
1. Local collection
· Local log collection on customer premises
· Local persistency
· Encryption & compression
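As a rough illustration of this layer, the sketch below shows a minimal local collector that tails a log file, compresses batches of events, and persists them to a local spool directory so that nothing is lost if the uplink is down. The file paths and batch naming are hypothetical, and encryption in transit is left to the shipping layer in this sketch.

```python
import gzip
import json
import time
from pathlib import Path

# Hypothetical locations; real deployments configure these per data source.
LOG_FILE = Path("/var/log/app/events.log")
SPOOL_DIR = Path("/var/spool/collector")

def persist_batch(lines: list[str]) -> Path:
    """Compress a batch of raw log lines and persist it locally.

    Local persistency means events survive network outages: the batch
    stays on disk until the shipping layer confirms delivery.
    """
    SPOOL_DIR.mkdir(parents=True, exist_ok=True)
    batch_path = SPOOL_DIR / f"batch-{int(time.time() * 1000)}.json.gz"
    payload = json.dumps({"source": str(LOG_FILE), "events": lines})
    with gzip.open(batch_path, "wt", encoding="utf-8") as fh:
        fh.write(payload)
    return batch_path

def collect(batch_size: int = 500) -> None:
    """Read the log file and spool events in compressed batches."""
    buffer: list[str] = []
    with LOG_FILE.open("r", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            buffer.append(line.rstrip("\n"))
            if len(buffer) >= batch_size:
                persist_batch(buffer)
                buffer.clear()
    if buffer:
        persist_batch(buffer)
```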
2. Shipping
· Shipping over a single TCP connection for all of a customer's data sources (see the sketch below)
· Asymmetric encryption, ensuring confidentiality in transit
· Compression, reducing bandwidth usage
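The following sketch illustrates the shipping idea under the assumption of a TLS-wrapped TCP socket: every spooled batch, from every local data source, is multiplexed over the same outbound connection, so only one port has to be opened towards the platform. The endpoint host and port are hypothetical.

```python
import socket
import ssl
import struct
from pathlib import Path

# Hypothetical ingestion endpoint; the real address is provided per tenant.
ENDPOINT = ("ingest.example-platform.com", 6514)
SPOOL_DIR = Path("/var/spool/collector")

def ship_spool() -> None:
    """Ship every spooled batch over one encrypted TCP connection.

    TLS provides confidentiality: the server authenticates with its
    certificate and the session key is negotiated asymmetrically.
    The batches are already gzip-compressed by the collection layer.
    """
    context = ssl.create_default_context()
    with socket.create_connection(ENDPOINT) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=ENDPOINT[0]) as tls:
            for batch_path in sorted(SPOOL_DIR.glob("batch-*.json.gz")):
                payload = batch_path.read_bytes()
                # Length-prefixed framing so individual batches can be
                # delimited on the single shared stream.
                tls.sendall(struct.pack(">I", len(payload)) + payload)
                batch_path.unlink()  # remove only after a successful send
```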
3. Processing
· Parsing of source formats
· Normalization to standardized field names
· Enrichment & event contextualization
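As a simplified illustration of normalization, the sketch below maps vendor-specific field names in a JSON event onto standardized names and adds ingestion metadata. The field names on both sides are examples only, not the platform's actual schema.

```python
import json
from datetime import datetime, timezone

# Example mapping from one vendor's field names to standardized names.
# The target names here are illustrative, not the platform's real schema.
FIELD_MAP = {
    "src": "source.ip",
    "dst": "destination.ip",
    "msg": "event.message",
    "sev": "event.severity",
}

def normalize(raw_line: str) -> dict:
    """Parse a raw JSON event and rename fields to standardized names."""
    raw = json.loads(raw_line)
    event = {FIELD_MAP.get(key, key): value for key, value in raw.items()}
    # Contextualize the event with ingestion metadata.
    event["event.ingested"] = datetime.now(timezone.utc).isoformat()
    return event

# Example usage:
# normalize('{"src": "10.0.0.5", "dst": "8.8.8.8", "msg": "dns query", "sev": 3}')
```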
4. Enrichment
· Matching against Threat Intelligence
· Contextualization
· Frameworks
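The following sketch shows one way threat intelligence matching can work: indicator values from the normalized event are looked up in an indicator set and, on a hit, the event is tagged with the matching context. The indicator feed and tag names are hypothetical.

```python
# Hypothetical indicator set; in practice this would be refreshed
# continuously from threat intelligence feeds.
MALICIOUS_IPS = {
    "203.0.113.10": {"feed": "example-feed", "category": "c2-server"},
}

def enrich_with_threat_intel(event: dict) -> dict:
    """Tag the event when one of its IP fields matches a known indicator."""
    for field in ("source.ip", "destination.ip"):
        indicator = MALICIOUS_IPS.get(event.get(field, ""))
        if indicator:
            event["threat.matched_field"] = field
            event["threat.feed"] = indicator["feed"]
            event["threat.category"] = indicator["category"]
    return event
```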
5. Data Lake
· Storage in the Data Lake
· Second layer of detection use cases based on AI models
· Correlation & business analytics
· Visualizations & dashboarding
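As a simplified illustration of correlation over stored events, the sketch below flags source IPs that produce many failed-login events within a short window. The threshold, window, and field names are illustrative only, not a detection rule shipped with the platform.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def correlate_failed_logins(events: list[dict],
                            threshold: int = 5,
                            window: timedelta = timedelta(minutes=10)) -> list[str]:
    """Return source IPs with `threshold` or more failed logins inside `window`."""
    by_ip: dict[str, list[datetime]] = defaultdict(list)
    for event in events:
        if event.get("event.action") == "login-failure":
            ts = datetime.fromisoformat(event["event.ingested"])
            by_ip[event.get("source.ip", "unknown")].append(ts)

    suspicious = []
    for ip, timestamps in by_ip.items():
        timestamps.sort()
        # Slide a window of `threshold` consecutive events and check its span.
        for i in range(len(timestamps) - threshold + 1):
            if timestamps[i + threshold - 1] - timestamps[i] <= window:
                suspicious.append(ip)
                break
    return suspicious
```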
6. Archiving
· Long-term retention
· Restoration to live storage if required
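A minimal sketch of the archiving idea, assuming for simplicity that hot and archived data are plain directories (real deployments typically use object storage and lifecycle policies): batches older than the hot retention period are moved to long-term storage, and can be restored to live storage on demand.

```python
import shutil
import time
from pathlib import Path

# Hypothetical locations and retention period for hot and archived data.
HOT_DIR = Path("/data/lake/hot")
ARCHIVE_DIR = Path("/data/lake/archive")
HOT_RETENTION_SECONDS = 90 * 24 * 3600  # e.g. 90 days of hot data

def archive_old_batches() -> None:
    """Move batches past the hot retention period to long-term storage."""
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    cutoff = time.time() - HOT_RETENTION_SECONDS
    for batch in HOT_DIR.glob("*.json.gz"):
        if batch.stat().st_mtime < cutoff:
            shutil.move(str(batch), ARCHIVE_DIR / batch.name)

def restore_to_live(batch_name: str) -> None:
    """Bring an archived batch back into live storage, e.g. for an investigation."""
    shutil.move(str(ARCHIVE_DIR / batch_name), HOT_DIR / batch_name)
```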
Data source integration is managed from the Integrations → Data Sources section. This section provides the tools for integrating data sources, including:
· Stream log ingestion from sources located within clients' on-premises networks
· Cloud services on the Internet that send their log streams directly to the platform
· Public cloud platforms such as AWS, Azure, and Google Cloud Platform
