Pipelines Overview
Pipelines provide fine-grained control over log event processing. The feature has two parts: topics, which filter events, and pipelines, which process the filtered events through a series of steps. By mixing and rearranging pipeline steps, you control how your events are transformed before being sent to their destinations.
Getting Started: Navigate to Pipelines in RunReveal to create your first topic and pipeline, or manage existing ones.
Pipeline Architecture
Source → Topics (Filter) → Pipeline (Process) → Destination
Event Flow
- Events arrive from your configured sources
- Topics filter events based on preconditions
- Pipelines process filtered events through steps
- Events are sent to their final destinations
Topics: Event Filtering
Topics apply filters to select subsets of events before they enter a pipeline. They use preconditions to determine which events get routed to specific pipelines.
Topic Preconditions
A topic's precondition can match on:
- Source-based: Route events from specific sources (e.g., webhook sources)
- Field-based: Route events based on field values
- Custom criteria: Any combination of filtering conditions
Managing Topics
To manage your topics, navigate to the Pipelines Page.
Topic Evaluation Order:
- Events are evaluated against topics from top to bottom
- Events that don’t match any custom topics go to the default RunReveal pipeline
- Topics can be reordered by dragging them up/down in the list
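The evaluation order above can be sketched in a few lines of Python. This is an illustrative model, not RunReveal's implementation; the topic names and event field names (sourceType, eventName) are hypothetical.

```python
def route(event, topics):
    """Evaluate topics top to bottom; the first matching topic wins."""
    for topic in topics:
        if topic["precondition"](event):
            return topic["name"]
    return "default"  # unmatched events fall through to the default RunReveal pipeline

# Hypothetical topics in list order; reordering them changes routing.
topics = [
    {"name": "webhooks", "precondition": lambda e: e.get("sourceType") == "webhook"},
    {"name": "auth", "precondition": lambda e: e.get("eventName", "").startswith("login")},
]

# A webhook login event hits "webhooks" first, even though "auth" would also match.
print(route({"sourceType": "webhook", "eventName": "login.ok"}, topics))  # webhooks
print(route({"eventName": "ping"}, topics))  # default
```

Because the first match wins, placing a broad topic above a narrow one will starve the narrow one of events.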
Creating a Topic
1) Start Topic Creation
Click the “Create Topic” button to open the topic creation wizard.

2) Configure Topic Details
Each topic needs a name and a precondition. The precondition determines which subset of events will be routed to your topic.

Example: The precondition above matches any events coming from webhook sources.
3) Choose Pipeline Configuration
Configure where your matching events will be processed:

Options:
- Use an existing pipeline: Reuse pipelines across multiple topics
- Create a new pipeline: Start with a fresh pipeline from scratch
- Copy from an existing pipeline: Build on top of existing processing
4) Configure Pipeline Steps
You’re brought to the pipeline editor to configure how events are processed.

Click Complete to finish the wizard and set up your new resources.
Pipelines: Event Processing
Pipelines define how events are processed before being sent to their final destinations. They consist of steps that are evaluated in order from top to bottom. Each step includes a function to apply and a precondition that selects which events the step applies to.
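The step model described above, each step being a precondition plus a function, can be sketched as follows. This is a simplified illustration; the enrichment step and field names are hypothetical.

```python
def run_pipeline(event, steps):
    """Evaluate steps top to bottom; a step's function runs only
    when its precondition matches the event."""
    for precondition, function in steps:
        if precondition(event):
            event = function(event)
    return event

# Hypothetical step: tag any event that carries a source IP field.
steps = [
    (lambda e: "srcIP" in e, lambda e: {**e, "tagged": True}),
]

print(run_pipeline({"srcIP": "198.51.100.7"}, steps))  # tagged
print(run_pipeline({"eventName": "noop"}, steps))      # unchanged
```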
Available Step Types
The pipeline editor lists the step types you can add; Drop steps are covered in detail below.
Creating a Pipeline
1) Access Pipeline Editor
Click “Add Pipeline” or edit an existing pipeline to open the pipeline editor.

2) Build Your Pipeline
- Left column: Your pipeline steps
- Right column: Available steps to add
- Drag and drop: Add steps from right to left column
- Reorder: Drag steps up/down to change evaluation order
3) Configure Step Preconditions
Each step can have preconditions to determine which events it applies to.
Shared Pipeline Warning: When editing a pipeline shared between multiple topics, you'll be prompted to unlock it first. Changes affect every topic that uses the pipeline, so proceed with caution.
Drop Steps: Detailed Configuration
Drop steps prevent matching events from being sent to destinations and from being indexed in the UI. They use preconditions to determine which events to drop.
How Drop Steps Work
Drop steps are evaluated in order from top to bottom. Once a step matches an event, that event is dropped and won’t be processed by subsequent steps or sent to destinations.
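The first-match-drops behavior can be modeled as a short sketch, assuming each drop step is just a predicate over the event (field names here mirror the use cases later in this guide and are illustrative):

```python
import re

def apply_drop_steps(event, steps):
    """Drop steps run top to bottom; the first match removes the event entirely."""
    for matches in steps:
        if matches(event):
            return None  # dropped: not indexed in the UI and not sent to destinations
    return event

# Hypothetical drop step: match health-check style event names.
steps = [
    lambda e: re.search(r"health|heartbeat|ping", e.get("eventName", "")) is not None,
]

print(apply_drop_steps({"eventName": "heartbeat.check"}, steps))  # None (dropped)
print(apply_drop_steps({"eventName": "user.login"}, steps))       # passes through
```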
Precondition Types and Targeting Options
Field Targeting Options
Standard Fields
- rawLog - Entire raw log content
- normalized.eventTime - Processed event timestamp
- normalized.eventName - Processed event name
- normalized.actor.email - Actor's email address
- normalized.src.ip - Source IP address
- normalized.service.name - Service name
GJSON Path Targeting
For complex JSON structures, use GJSON paths:
Example:
- Field: events.0.parameters.#(name="client_id").value
- Type: exact
- Value: 115520757898459258175
GJSON Path Syntax:
- events.0 - First element in events array
- parameters.#(name="client_id") - Find parameter where name equals "client_id"
- .value - Extract the value field
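GJSON is a Go path library, but the lookup above is easy to replicate with Python's stdlib json module to see what each path segment does. The document below is a minimal sample shaped like the example (sample values only):

```python
import json

# Minimal document shaped like the example above.
doc = json.loads(
    '{"events":[{"parameters":['
    '{"name":"client_id","value":"115520757898459258175"},'
    '{"name":"app_name","value":"Example App"}]}]}'
)

# events.0 -> first element of the events array
first_event = doc["events"][0]
# parameters.#(name="client_id") -> first parameter whose name is "client_id"
param = next(p for p in first_event["parameters"] if p["name"] == "client_id")
# .value -> extract the value field
print(param["value"])  # 115520757898459258175
```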
Common Drop Step Use Cases
Drop by Application Name
Remove events from specific applications that generate noise or aren’t relevant for security analysis.
Field: rawLog
Type: regex
Value: "app_name","value":"Material Security"
Drop by Client ID
Filter out events from specific client applications using GJSON path targeting.
Field: events.0.parameters.#(name="client_id").value
Type: exact
Value: 115520757898459258175
Drop Health Check Events
Remove automated health check and monitoring events that don’t provide security value.
Field: normalized.eventName
Type: regex
Value: health|heartbeat|ping
Drop Internal Traffic
Exclude internal network traffic from security analysis to focus on external threats.
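A cidrMatch precondition against the RFC 1918 range 10.0.0.0/8 can be modeled with Python's stdlib ipaddress module; this is a sketch of the matching semantics, not RunReveal's implementation:

```python
import ipaddress

def cidr_match(ip, cidr):
    """True when the address falls inside the CIDR range."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network(cidr)

print(cidr_match("10.42.7.9", "10.0.0.0/8"))    # True: internal, would be dropped
print(cidr_match("203.0.113.5", "10.0.0.0/8"))  # False: external, kept for analysis
```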
Field: normalized.src.ip
Type: cidrMatch
Value: 10.0.0.0/8
Important Considerations
⚠️ Source-Specific Behavior
Google Workspace Integration: Processes logs through Google Admin SDK, which changes the JSON structure of rawLog. Use GJSON paths or regex on rawLog for reliable targeting. Field paths may not match original audit log structure.
Other Sources:
- Most sources preserve original log structure
- Standard field targeting works as expected
- Check source documentation for specific behavior
⚠️ Performance Considerations
- Regex Performance: Complex regex patterns can impact performance
- Field Path Complexity: Deep GJSON paths are slower than simple field access
- Order Matters: Place drop steps early in pipelines to reduce processing overhead
⚠️ Precondition Evaluation
- Top-to-Bottom: Steps are evaluated in order
- First Match Wins: Once a step matches, subsequent steps may not execute
- Empty Results: Failed preconditions return empty results, not errors
Best Practices
Pipeline Design
- Place drop steps early in the pipeline to reduce processing overhead
- Use specific preconditions to avoid over-dropping
- Test with sample data before deploying
Field Selection
- Use rawLog for complex pattern matching
- Use normalized fields for standard data
- Use GJSON paths for nested JSON structures
Regex Patterns
- Escape special characters: \" for quotes
- Use anchors when needed: ^pattern$ for exact matches
- Test patterns with sample data
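Testing patterns against sample data is straightforward with Python's re module (the same regex semantics generally apply; the sample strings below are made up for illustration):

```python
import re

# Unanchored patterns match anywhere in the field:
assert re.search(r"ping", "shipping notice")
# Anchors restrict the match to the exact field value:
assert re.search(r"^ping$", "ping")
assert not re.search(r"^ping$", "shipping notice")
# \" escapes quotes when the pattern is written inside a double-quoted string:
pattern = "\"app_name\""
assert re.search(pattern, '{"app_name":"Example"}')
print("all patterns behaved as expected")
```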
Monitoring
- Monitor pipeline metrics for dropped events
- Set up alerts for unexpected drop rates
- Review dropped events periodically
Example: Complete Drop Configuration
Step Name: Drop Material Security Events
Step Type: Drop
Precondition Field: rawLog
Precondition Type: regex
Precondition Value: "app_name","value":"Material Security"
This configuration will drop all events containing the Material Security app name anywhere in the rawLog content, providing a robust solution for filtering out unwanted audit events from your Google Workspace integration.
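You can verify a precondition like this one before deploying it by running the regex against a sample rawLog. The fragment below is a hypothetical, simplified stand-in for a Google Workspace audit event:

```python
import re

# Hypothetical rawLog fragment shaped like a Google Workspace audit event.
raw_log = (
    '{"events":[{"parameters":['
    '{"name":"app_name","value":"Material Security"}]}]}'
)
pattern = r'"app_name","value":"Material Security"'
print(bool(re.search(pattern, raw_log)))  # True: this event would be dropped
```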