Implementing Splunk Data Stream Processor (DSP) 1.2
Upcoming Classes
Online
Instructor-led online training
Location | Mar 2021 | Apr 2021 | May 2021 | Jun 2021 | Jul 2021 | Aug 2021 | Sep 2021 |
---|---|---|---|---|---|---|---|
AMER Eastern Time - Virtual | Mar 8 – Mar 11 | Apr 5 – Apr 8 | May 11 – May 14 | Jun 14 – Jun 17 | Jul 13 – Jul 16 | Aug 2 – Aug 5 | |
AMER Pacific Time - Virtual | Mar 22 – Mar 25 | Apr 26 – Apr 29 | May 24 – May 27 | Jun 28 – Jul 1 | | | |
EMEA UK Time - Virtual | Mar 29 – Apr 1 | Apr 26 – Apr 29 | May 24 – May 27 | Jun 28 – Jul 1 | Jul 26 – Jul 29 | Aug 23 – Aug 26 | |
APAC Singapore - Virtual | Mar 30 – Apr 2 | | | | Jul 20 – Jul 23 | | |
Summary
This 4-day course is designed for experienced Splunk administrators who are new to Splunk DSP. This hands-on class teaches the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases. It covers installation, source and sink configuration, pipeline design and backup, and monitoring a DSP environment.
Description
- Introduction to Splunk DSP
- Deploying a DSP cluster
- Configuring Splunk Sources and Sinks
- Building Pipelines - Basics
- Building Pipelines - Intermediate
- Building Pipelines - Advanced
- Working with 3rd-party Sources and Sinks
- Working with Metrics and Traces
- Streaming ML Plugin
- Monitoring DSP Environment
Objectives
Module 1 – Introduction to DSP
- Review Splunk deployment options and challenges
- Describe the purpose and value of Splunk DSP
- Define DSP concepts and terminology
Module 2 – Deploying a DSP Cluster
- List DSP core components and system requirements
- Describe installation options and steps
- Check DSP service status
- Navigate the DSP UI
- Use the scloud command-line tool
Module 3 – Prepping Sources and Sinks
- Ingest data with the DSP Ingest REST API service (see the sketch after this list)
- Configure DSP source connections for Splunk data
- Configure DSP sink connections for Splunk indexers
- Create Splunk-to-Splunk pass-through pipelines
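For orientation, below is a minimal Python sketch of pushing a batch of events to the DSP Ingest REST API. The port, endpoint path, and event field names are assumptions modeled on a typical DSP 1.2 lab deployment, and the bearer token is whatever your identity setup returns (e.g. after `scloud login`); verify all of them against your environment's Ingest REST API documentation.

```python
# Minimal sketch (assumptions noted inline): POST events to the DSP Ingest REST API.
import json
import time

import requests

DSP_HOST = "dsp.example.com"          # hypothetical hostname
TOKEN = "<bearer-token>"              # e.g. the token obtained after scloud login

events = [
    {
        "body": "user login succeeded",
        "sourcetype": "demo:auth",
        "source": "ingest-sketch",
        "host": "app01",
        "timestamp": int(time.time() * 1000),   # assumed epoch milliseconds; verify the unit
        "attributes": {"index": "main"},        # optional metadata map (hypothetical contents)
    }
]

resp = requests.post(
    # Assumed gateway port and path; check your DSP Ingest REST API docs.
    f"https://{DSP_HOST}:31000/default/ingest/v1beta2/events",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    data=json.dumps(events),
    verify=False,   # lab-only: many DSP lab clusters use self-signed certificates
)
resp.raise_for_status()
print(resp.status_code, resp.text)
```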
Module 4 – Building Pipelines - Basics
- Describe the basic elements of a DSP pipeline
- Create data pipelines with the DSP canvas and SPL2
- List DSP pipeline commands
- Use scalar functions to convert data types and schema
- Filter and route data to multiple sinks (a conceptual sketch follows this list)
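In DSP, filtering and routing are expressed with SPL2 and branching on the pipeline canvas. As a conceptual analogy only (plain Python, not SPL2), the sketch below routes each record to every "sink" whose predicate matches it.

```python
# Conceptual analogy of filter-and-route: each record goes to every sink whose predicate matches.
from typing import Callable, Dict, List

Record = Dict[str, object]

def route(records: List[Record], routes: Dict[str, Callable[[Record], bool]]) -> Dict[str, List[Record]]:
    """Collect records into per-sink lists based on each sink's predicate."""
    sinks: Dict[str, List[Record]] = {name: [] for name in routes}
    for rec in records:
        for name, predicate in routes.items():
            if predicate(rec):
                sinks[name].append(rec)
    return sinks

records = [
    {"sourcetype": "access_combined", "status": 500},
    {"sourcetype": "access_combined", "status": 200},
    {"sourcetype": "syslog", "severity": "err"},
]

sinks = route(
    records,
    {
        "errors_index": lambda r: r.get("status") == 500 or r.get("severity") == "err",
        "everything_index": lambda r: True,   # pass-through branch
    },
)
print({name: len(recs) for name, recs in sinks.items()})
```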
Module 5 – Building Pipelines - Intermediate
- Manipulate pipeline options (a conceptual sketch follows this list):
  - Extract
  - Transform
  - Obfuscate
  - Reduce payload
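These manipulations are performed with eval and scalar functions in an SPL2 pipeline; the plain-Python sketch below is only a conceptual analogy of the extract, obfuscate, and reduce-payload steps (the salt value and field names are hypothetical).

```python
# Conceptual analogy of extract / obfuscate / reduce-payload steps in a pipeline.
import hashlib
import re

event = {"body": "user=alice action=login src=10.0.0.5", "raw_payload": "...large blob..."}

# Extract: pull key=value pairs out of the body.
fields = dict(re.findall(r"(\w+)=(\S+)", event["body"]))

# Obfuscate: replace the username with a salted hash (hypothetical salt).
salt = "training-salt"
fields["user"] = hashlib.sha256((salt + fields["user"]).encode()).hexdigest()[:12]

# Reduce payload: keep only the fields downstream consumers need.
slim_event = {k: fields[k] for k in ("user", "action", "src")}
print(slim_event)
```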
Module 6 – Building Pipelines - Advanced
- Review Splunk lookups
- Enrich data with DSP lookups (a conceptual sketch follows this list)
- Populate KV Store lookups from DSP streams
- Manipulate pipeline options:
  - Aggregate
  - Conditional trigger
- Introduce the DSP Plugins SDK
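DSP performs enrichment with its own lookup support (the CSV and KV Store cases are covered in this module). As a plain-Python analogy only, the sketch below joins incoming events against an in-memory table; the table contents are made up for illustration.

```python
# Conceptual analogy of lookup enrichment: join events against a reference table.
lookup_table = {                      # e.g. loaded from a CSV file or a KV Store collection
    "10.0.0.5": {"site": "nyc-dc1", "owner": "netops"},
    "10.0.0.9": {"site": "lon-dc2", "owner": "webteam"},
}

events = [
    {"src": "10.0.0.5", "action": "login"},
    {"src": "192.168.1.7", "action": "login"},
]

for event in events:
    # Add the matching lookup fields, or defaults when no match exists.
    event.update(lookup_table.get(event["src"], {"site": "unknown", "owner": "unknown"}))

print(events)
```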
Module 7 – Working with 3rd-party Sources and Sinks
- Read data from and write data to pub-sub systems such as Kafka (see the producer sketch after this list)
- List sources supported by the collect service
- Transform and normalize data from Kafka
- Write to S3
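To exercise a Kafka source connection in a lab, something has to be producing on the topic. The sketch below uses the kafka-python package for that producer side; the broker address and topic name are assumptions for a local test setup, not values from this course.

```python
# Minimal sketch: produce JSON records onto a Kafka topic that a DSP Kafka
# source connection could then consume. Requires: pip install kafka-python
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                       # assumption: local lab broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"), # serialize dicts as JSON bytes
)

for i in range(5):
    record = {"event_id": i, "ts": time.time(), "msg": "hello from kafka"}
    producer.send("dsp-demo-topic", value=record)             # hypothetical topic name

producer.flush()
producer.close()
```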
Module 8 – Working with Metrics and Traces
- Onboard observability data (logs, metrics, and traces) into DSP
- Transform metric data for Splunk indexers and Splunk SignalFx (see the sketch after this list)
- Transform trace data for Splunk Infrastructure Monitoring
- Route metric data to Splunk indexers and SignalFx
- Send trace data to Splunk SignalFx
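At its core, the metric transformation reshapes a log-style event into a datapoint of name, value, timestamp, and dimensions. The plain-Python sketch below illustrates only that reshaping; the exact field names DSP, Splunk metrics indexes, and SignalFx expect should be taken from their documentation.

```python
# Conceptual analogy of a metrics transformation: log-style event -> datapoint.
log_event = {
    "host": "web01",
    "body": "cpu_idle=87.5",
    "sourcetype": "collectd",
    "timestamp": 1615200000000,   # epoch milliseconds
}

name, value = log_event["body"].split("=")
datapoint = {
    "metric": name,
    "value": float(value),
    "timestamp": log_event["timestamp"],
    "dimensions": {"host": log_event["host"], "sourcetype": log_event["sourcetype"]},
}
print(datapoint)
```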
Module 9 – Streaming ML Plugin
- Describe the advantages of using the DSP Streaming ML plugin (see the sketch after this list)
- Enable the Streaming ML plugin in DSP
- List the DSP Streaming ML functions
- Practice DSP ML algorithms with the ML datagen
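The Streaming ML functions themselves ship with the DSP plugin, but the general idea of scoring records online, one at a time, can be illustrated without it. The sketch below is a plain-Python running z-score using Welford's update; it is an analogy only, not a DSP Streaming ML function.

```python
# Conceptual analogy of streaming scoring: maintain running mean/variance and
# score each new value as it arrives (Welford's online update).
import math
import random

count, mean, m2 = 0, 0.0, 0.0

def update_and_score(x: float) -> float:
    """Update the running statistics with x and return its z-score."""
    global count, mean, m2
    count += 1
    delta = x - mean
    mean += delta / count
    m2 += delta * (x - mean)
    std = math.sqrt(m2 / count) if count > 1 and m2 > 0 else 1.0
    return (x - mean) / std

random.seed(0)
stream = [random.gauss(100, 5) for _ in range(200)] + [160.0]  # final value is an outlier
for value in stream:
    z = update_and_score(value)

print(f"z-score of final value: {z:.1f}")   # noticeably large for the outlier
```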
Module 10 – Monitoring DSP Environment
- Back up DSP pipelines
- Monitor DSP environment
- Describe steps to isolate DSP service issues
- Scale DSP
- Replace DSP master node
- Upgrade DSP cluster