Introduction

Confluent Cloud is a resilient, scalable streaming data service based on Apache Kafka®, delivered as a fully managed service. Confluent Cloud provides a web interface and a local command line interface. You can manage cluster resources, settings, and billing with the web interface.

Use case(s)

User Stories

  • As a pipeline developer, I would like to stream data from Confluent Cloud.
  • As a pipeline developer, I would like to capture records that are not delivered downstream for analysis.

Plugin Type

  • Batch Source
  • Batch Sink 
  • Real-time Source
  • Real-time Sink
  • Action
  • Post-Run Action
  • Aggregate
  • Join
  • Spark Model
  • Spark Compute

Configurables

The following fields must be configurable for the plugin. The plugin should be created as a wrapper on HTTPSink, with the additional attributes required for the Splunk HTTP Event Collector.


User Facing Name | Type | Description | Constraints | Macro Enabled?
URL | String | Required. The URL to post data to. | | Yes
HEC Token | String | Required. The value of the token created for authentication to Splunk. | |
Authentication Type | Select | The mechanism used to pass the HEC token to Splunk. See the authentication details below. | Basic Authentication, HTTP Authentication, Query String (Splunk Cloud only) |
Batch Size | Number (with upper bound) | The number of messages to batch before sending. | > 0, default 1 (no batching) | Yes
Format | Select | The format to send the message in. JSON formats the entire input record as JSON and sends it as the payload. Form converts the input message to a query string and sends it in the payload. Custom sends the contents of the Request Body field. | JSON, Form, Custom |
Request Body | String | Optional request body. Only required if the Custom format is specified. | | Yes
Content Type | String | Used to specify the Content-Type header. | | Yes
Channel Identifier Header | KeyValue | Required when the request includes raw events. Must be set to a unique channel identifier (a GUID). See the channel identifier details below. | | Yes
Should Follow Redirects? | Toggle | Whether to automatically follow redirects. Defaults to true. | true, false |
Number of Retries | Select | The number of times the request should be retried if it fails. Defaults to 3. | 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 |
Connect Timeout | String | The time in milliseconds to wait for a connection. Set to 0 for infinite. Defaults to 60000 (1 minute). | |
Read Timeout | String | The time in milliseconds to wait for a read. Set to 0 for infinite. Defaults to 60000 (1 minute). | |
Use Proxy | Toggle | Whether to use an HTTP proxy to connect to the system. Defaults to false. | true, false |
Proxy URI | String | The proxy URI. | |
Proxy Username | String | The proxy username. | |
Proxy Password | String | The proxy password. | |

Authentication Type details

  • Basic Authentication. Example: -u "x:<hec_token>"
  • HTTP Authentication. Example: "Authorization: Splunk <hec_token>"
  • Query String (Splunk Cloud only). Example: ?token=<hec_token>

Prerequisites for Query String authentication

  • Query string authentication must also be enabled on a per-token basis. On your Splunk server, request Splunk Support to edit the file at $SPLUNK_HOME/etc/apps/splunk_httpinput/local/inputs.conf. Your tokens are listed by name in this file, in the form http://<token_name>.
  • Within the stanza for each token for which you want to enable query string authentication, add the following setting (or change the existing setting, if applicable): allowQueryStringAuth = true

Channel Identifier Header details

If your request includes raw events, you must include an X-Splunk-Request-Channel header field in the request, and it must be set to a unique channel identifier (a GUID). For example:

curl https://http-inputs-<customer>.splunkcloud.com/services/collector/raw -H "X-Splunk-Request-Channel: FE0ECFAD-13D5-401B-847D-77833BD77131" -H "Authorization: Splunk BD274822-96AA-4DA6-90EC-18940FB2414C" -d '<raw data string>' -v

Alternatively, the X-Splunk-Request-Channel header field can be sent as a URL query parameter, as shown here:

curl https://http-inputs-<customer>.splunkcloud.com/services/collector/raw?channel=FE0ECFAD-13D5-401B-847D-77833BD77131 -H "Authorization: Splunk BD274822-96AA-4DA6-90EC-18940FB2414C" -d '<raw data string>' -v

References

https://docs.splunk.com/Documentation/Splunk/7.1.1/Data/FormateventsforHTTPEventCollector
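
The Splunk documentation referenced above describes the event envelope expected by the /services/collector endpoint. As a rough illustration of what the JSON format could produce for a single input record, the sketch below builds one such envelope; the record fields (ts, level, message) and the sourcetype value are hypothetical, and the payload shape follows the referenced doc rather than any behavior defined for this plugin.

public class HecEventPayloadSketch {

  // Wraps a pre-serialized record under the "event" key, with an optional sourcetype.
  static String buildEventEnvelope(String recordJson, String sourcetype) {
    return "{\"sourcetype\": \"" + sourcetype + "\", \"event\": " + recordJson + "}";
  }

  public static void main(String[] args) {
    String record = "{\"ts\": \"2020-01-01T00:00:00Z\", \"level\": \"INFO\", \"message\": \"pipeline record\"}";
    // Prints one HEC event envelope, e.g. {"sourcetype": "_json", "event": {...}}
    System.out.println(buildEventEnvelope(record, "_json"));
  }
}

With a Batch Size greater than 1, several such envelopes would presumably be concatenated into a single request body before posting; that is an assumption about the implementation rather than behavior documented here.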

Design / Implementation Tips

  • Tip #1
  • Tip #2

Design

Approach(es)

Properties

Security

Limitation(s)

Future Work

  • Some future work – HYDRATOR-99999
  • Another future work – HYDRATOR-99999

Test Case(s)

  • Test case #1
  • Test case #2

Sample Pipeline

Please attach one or more sample pipeline(s) and associated data. 

Pipeline #1

Pipeline #2



Checklist

  • User stories documented 
  • User stories reviewed 
  • Design documented 
  • Design reviewed 
  • Feature merged 
  • Examples and guides 
  • Integration tests 
  • Documentation for feature 
  • Short video demonstrating the feature