Implementing Splunk Data Stream Processor (DSP) 1.1

Upcoming Classes

Online

Instructor-led online training

  • AMER Eastern Time - Virtual: Nov 16 – Nov 19, 2020; Dec 7 – Dec 10, 2020; Jan 4 – Jan 7, 2021
  • EMEA UK Time - Virtual: Jan 18 – Jan 21, 2021
  • AMER Pacific Time - Virtual: Jan 19 – Jan 22, 2021

Summary

This 4-day course is designed for experienced Splunk administrators who are new to the Splunk Data Stream Processor (DSP). This hands-on class covers the fundamentals of deploying a DSP cluster and designing pipelines for core use cases: installation, source and sink configuration, pipeline design and backup, and monitoring of a DSP environment.

Description

  • Introduction to Splunk DSP
  • Deploying a DSP cluster
  • Prepping Sources and Sinks
  • Building Pipelines - Basics
  • Building Pipelines - Deep Dive
  • Working with 3rd-Party Data Feeds
  • Working with Metric Data
  • Monitoring DSP Environment

Duration

4 Days

Objectives

Module 1 – Introduction to DSP

  • Review Splunk deployment options and challenges
  • Describe the purpose and value of Splunk DSP
  • Define DSP concepts and terminologies

Module 2 – Deploying a DSP Cluster

  • List DSP core components and system requirements
  • Describe installation options and steps
  • Check DSP service status
  • Navigate the DSP UI
  • Use scloud

Module 3 – Prepping Sources and Sinks

  • Ingest data with DSP REST API services (see the example after this list)
  • Configure DSP source connections for Splunk data
  • Configure DSP sink connections for Splunk indexers
  • Create Splunk-to-Splunk pass-through pipelines
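
As a rough illustration of the REST ingest objective above, the Python sketch below posts a small batch of events to a DSP Ingest REST API endpoint. The host name, port, endpoint path, bearer-token handling, and event field names are assumptions made for the example, not course material; verify them against your own DSP 1.1 deployment.

    # Hypothetical sketch: send a batch of events to the DSP Ingest REST API.
    # The endpoint path, port, and token below are placeholders/assumptions.
    import json
    import requests

    DSP_HOST = "dsp.example.com"                 # placeholder DSP host
    INGEST_URL = f"https://{DSP_HOST}:31000/default/ingest/v1beta2/events"  # assumed path
    BEARER_TOKEN = "REPLACE_WITH_TOKEN"          # e.g. obtained via the scloud CLI

    # Field names below approximate the DSP event schema; verify for your version.
    events = [
        {
            "body": "user login succeeded",
            "sourcetype": "demo:auth",
            "source": "rest-ingest-demo",
            "attributes": {"env": "lab"},
        }
    ]

    resp = requests.post(
        INGEST_URL,
        headers={
            "Authorization": f"Bearer {BEARER_TOKEN}",
            "Content-Type": "application/json",
        },
        data=json.dumps(events),
        verify=False,  # lab clusters often use self-signed certs; enable verification in production
    )
    resp.raise_for_status()
    print(resp.status_code, resp.text)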

Module 4 – Building Pipelines - Basics

  • Describe the basic elements of a DSP pipeline
  • Create data pipelines with the DSP canvas and SPL2
  • List DSP pipeline commands
  • Use scalar functions to convert data types and schema
  • Filter and route data to multiple sinks

Module 5 – Building Pipelines - Deep Dive

  • Manipulate pipeline data:
    • Extract
    • Transform
    • Obfuscate
    • Aggregate and trigger conditionally

Module 6 – Working with 3rd-Party Data Feeds

  • Read data from and write data to pub/sub systems such as Kafka (see the sketch after this list)
  • List sources supported with the collect service
  • Transform and normalize data from Kafka
  • Write data to S3
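
To exercise the Kafka objective above outside of class, a simple producer can publish test records to a topic that a DSP Kafka source connection then reads. This is a minimal sketch using the third-party kafka-python package (not part of the course); the broker address and topic name are placeholders, and the DSP-side connection setup is not shown.

    # Hypothetical sketch: publish sample JSON records to a Kafka topic that a
    # DSP Kafka source connection could read from. Requires: pip install kafka-python
    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers=["kafka.example.com:9092"],   # placeholder broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    for i in range(5):
        record = {"event_id": i, "message": f"sample event {i}", "env": "lab"}
        producer.send("dsp-demo-topic", value=record)   # placeholder topic name

    producer.flush()
    producer.close()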

Module 7 – Working with Metric Data

  • Onboard metric data into DSP
  • Transform metric data for Splunk indexers and SignalFx (see the sketch after this list)
  • Send metric data to Splunk indexers
  • Send metric data to Splunk SignalFx
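
The transformation objective above amounts to reshaping a metric record (name, value, dimensions) into the form each destination expects. The Python sketch below is a simplified approximation for illustration only; the HEC-style metric event and SignalFx-style datapoint shapes shown are assumptions, not course material, so verify field names against the destinations you actually use.

    # Hypothetical sketch: reshape one generic metric record for two destinations.
    # Both output shapes are simplified approximations.
    import time

    def to_splunk_metric(name, value, dimensions):
        """Approximate shape of a Splunk metrics-index event sent via HEC."""
        fields = {"metric_name": name, "_value": value}
        fields.update(dimensions)            # dimensions become indexed fields
        return {"time": time.time(), "event": "metric", "fields": fields}

    def to_signalfx_datapoint(name, value, dimensions):
        """Approximate shape of a SignalFx gauge datapoint."""
        return {
            "gauge": [
                {
                    "metric": name,
                    "value": value,
                    "dimensions": dimensions,
                    "timestamp": int(time.time() * 1000),   # milliseconds
                }
            ]
        }

    record = {"name": "cpu.util", "value": 42.5, "dims": {"host": "web-01", "env": "lab"}}
    print(to_splunk_metric(record["name"], record["value"], record["dims"]))
    print(to_signalfx_datapoint(record["name"], record["value"], record["dims"]))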

Module 8 – Monitoring DSP Environment

  • Back up DSP pipelines
  • Monitor DSP environment
  • Describe steps to isolate DSP service issues
  • Scale DSP
  • Replace DSP master node
  • Upgrade DSP cluster

Prerequisites

Required:

  • Splunk Enterprise System Administration
  • Splunk Enterprise Data Administration

Nice to have:

  • Working knowledge of:
    • Distributed system architectures
    • Apache Kafka (user level)
    • Apache Flink (user level)
    • Kubernetes (admin level)

Onsite Training

For groups of three or more

Request Quote

Don't see a date that works for you?

Request Class