When setting up a pipeline to send logs from a specific source to Observability Pipelines, you often need to decide how to process and manage those logs.
Questions such as the following might come up:
- Which logs from this source are important?
- Which logs can safely be dropped?
- Should repetitive logs be sampled?
- Which fields should be parsed or formatted for the destination?
Making these decisions typically requires coordination across multiple teams and detailed knowledge of each log source.
Observability Pipelines Packs provide predefined configurations to help you make these decisions quickly and consistently. Packs apply Datadog-recommended best practices for specific log sources such as Akamai, AWS CloudTrail, Cloudflare, Fastly, Palo Alto Firewall, and Zscaler.
### What Packs do
Each Pack includes a source-specific configuration that defines:

- **Fields that can safely be removed** to reduce payload size
- **Logs that can be dropped**, such as duplicate events or health checks
- **Logs that should be retained or parsed**, such as errors or security detections
- **Formatting and normalization rules** to align logs across different destinations and environments
By using Packs, you can apply consistent parsing, filtering, and routing logic for each log source without creating configurations manually.
### Why use Packs
Packs help teams:
- **Reduce ingestion volume and costs** by filtering or sampling repetitive, low-value events
- **Maintain consistency** in parsing and field mapping across environments and destinations
- **Accelerate setup** by applying ready-to-use configurations for common sources