Tuesday, June 6, 2023

Ingest VPC flow logs into Splunk using Amazon Kinesis Data Firehose


In September 2017, during the annual Splunk .conf, Splunk and AWS jointly announced Amazon Kinesis Data Firehose integration to support Splunk Enterprise and Splunk Cloud as a delivery destination. This native integration between Splunk Enterprise, Splunk Cloud, and Kinesis Data Firehose is designed to make AWS data ingestion setup seamless, while offering a secure and fault-tolerant delivery mechanism. We want to enable you to monitor and analyze machine data from any source and use it to deliver operational intelligence and optimize IT, security, and business performance.

With Kinesis Data Firehose, you can use a fully managed, reliable, and scalable data streaming solution to Splunk. In September 2022, AWS announced a new Amazon Virtual Private Cloud (Amazon VPC) feature that lets you create VPC flow logs that send the flow log data directly to Kinesis Data Firehose as a destination. Previously, you could send VPC flow logs only to Amazon CloudWatch Logs or Amazon Simple Storage Service (Amazon S3) before they were ingested by other AWS or Partner tools. In this post, we show you how to use this feature to set up VPC flow logs for ingestion into Splunk using Kinesis Data Firehose.

Overview of solution

We deploy the following architecture to ingest data into Splunk.

We create a VPC flow log in an existing VPC to send the flow log data to a Kinesis Data Firehose delivery stream. This delivery stream has an AWS Lambda function enabled for data transformation and has destination settings that point to the Splunk endpoint along with an HTTP Event Collector (HEC) token.

Prerequisites

Before you begin, make sure that you have the following prerequisites:

  • AWS account – If you don’t have an AWS account, you can create one. For more information, see Setting Up for Amazon Kinesis Data Firehose.
  • Splunk Add-on for AWS – Make sure you install the Splunk Add-on for AWS app from Splunkbase in your Splunk deployment. This app provides the required source type and event type mappings for AWS machine data.
  • HEC token – In your Splunk deployment, set up an HEC token with the source type aws:cloudwatchlogs:vpcflow (a quick way to verify the token is sketched after this list).
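
Before you wire up Firehose, you may want to confirm that the HEC token accepts events on the raw endpoint. The following is a minimal sketch using Python and the requests library; the host name, token, channel GUID, and sample flow log line are placeholders, not values from this walkthrough.

    # Minimal sketch: post one sample raw event to HEC to confirm the token works.
    # Host, token, and channel GUID are placeholders; 8088 is the default HEC port.
    import requests

    HEC_URL = "https://splunk.example.com:8088/services/collector/raw"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"     # the token created above
    CHANNEL = "11111111-2222-3333-4444-555555555555"       # any GUID; the raw endpoint expects a channel

    sample_flow_log = (
        "2 123456789012 eni-0123456789abcdef0 10.0.0.5 10.0.0.7 "
        "443 49152 6 10 840 1606851600 1606851660 ACCEPT OK"
    )

    response = requests.post(
        HEC_URL,
        params={"channel": CHANNEL, "sourcetype": "aws:cloudwatchlogs:vpcflow"},
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=sample_flow_log,
        verify=True,  # point this at your CA bundle if Splunk uses a private certificate
    )
    print(response.status_code, response.text)

A 200 response with a "Success" message indicates the token and source type are ready for Firehose to use.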

Create the transformation Lambda function

Integrating VPC flow logs with Kinesis Data Firehose requires a Lambda function to transform the flow log data. The data that VPC Flow Logs sends to the delivery stream is encoded as JSON records. However, Splunk expects raw flow log data. Therefore, when you create the delivery stream, you enable data transformation and configure a Lambda function to transform the flow log data to raw format. Kinesis Data Firehose then sends the data in raw format to Splunk.
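
The published application described next handles this transformation for you. Purely to illustrate the idea, here is a minimal sketch of a Firehose transformation handler, assuming each incoming record is a JSON object whose values are the flow log fields in order; it is not the code of the splunk-firehose-flowlogs-processor application.

    # Illustrative Firehose data transformation handler (not the published application's code).
    # Assumes each record's payload is a JSON object whose values are flow log fields in order.
    import base64
    import json

    def lambda_handler(event, context):
        output = []
        for record in event["records"]:
            payload = json.loads(base64.b64decode(record["data"]))
            # Join the JSON field values back into a space-separated raw flow log line.
            raw_line = " ".join(str(value) for value in payload.values()) + "\n"
            output.append({
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(raw_line.encode("utf-8")).decode("utf-8"),
            })
        return {"records": output}

The handler follows the standard Firehose transformation contract: it receives base64-encoded records and returns each record ID with a result status and the transformed, re-encoded payload.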

You can deploy this transformation Lambda function as a serverless application from the AWS Serverless Application Repository on the Lambda console. The name of this application is splunk-firehose-flowlogs-processor.

After it’s deployed, you can see a Lambda function and an AWS Identity and Access Management (IAM) role deployed on the console. Note the physical ID of the Lambda function; you use it when you create the Firehose delivery stream in the next step.
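
If you prefer to deploy the application programmatically instead of through the Lambda console, the following is a rough boto3 sketch; the application ARN and stack name are placeholders you would replace with the values shown on the application's page in the AWS Serverless Application Repository.

    # Rough sketch: deploy the serverless application with boto3.
    # The ApplicationId ARN and stack name below are placeholders.
    import boto3

    serverlessrepo = boto3.client("serverlessrepo")
    cloudformation = boto3.client("cloudformation")

    change_set = serverlessrepo.create_cloud_formation_change_set(
        ApplicationId="arn:aws:serverlessrepo:us-east-1:111122223333:applications/splunk-firehose-flowlogs-processor",
        StackName="splunk-firehose-flowlogs-processor",
        Capabilities=["CAPABILITY_IAM"],
    )

    # Wait for the change set to finish creating, then execute it.
    cloudformation.get_waiter("change_set_create_complete").wait(ChangeSetName=change_set["ChangeSetId"])
    cloudformation.execute_change_set(ChangeSetName=change_set["ChangeSetId"])
    print("Deploying stack:", change_set["StackId"])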

Create a Kinesis Data Firehose delivery stream

In this step, you create a Kinesis Data Firehose delivery stream to receive the VPC flow log data and deliver that data to Splunk. If you prefer to script this step, a boto3 sketch of the equivalent API call follows the console walkthrough.

  1. On the Kinesis Data Firehose console, create a new delivery stream.
  2. For Source, choose Direct PUT.
  3. For Destination, choose Splunk.
  4. For Delivery stream name, enter a name (for example, VPCtoSplunkStream).
  5. In the Transform records section, for Data transformation, select Enabled.
  6. For AWS Lambda function, choose Browse.
  7. Select the function you created earlier by looking for its physical ID.
  8. Choose Choose.
  9. In the Destination settings section, for Splunk cluster endpoint, enter your endpoint. If you’re using a Splunk Cloud endpoint, refer to Configure Amazon Kinesis Firehose to send data to the Splunk platform for the different Splunk cluster endpoint values.
  10. For Splunk endpoint type, select Raw endpoint.
  11. For Authentication token, enter the value of the Splunk HEC token that you created as a prerequisite.
  12. In the Backup settings section, for Source record backup in Amazon S3, select Failed events only so that you only save the data that fails to be ingested into Splunk.
  13. For S3 backup bucket, enter the path to an S3 bucket.
  14. Complete creating your delivery stream.

The creation process might take a few minutes to complete.
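
As an alternative to the console steps above, here is a boto3 sketch of the same delivery stream configuration. All ARNs, the endpoint URL, and the HEC token are placeholders, and the sketch assumes a Firehose IAM role already exists that can invoke the transformation Lambda function and write to the backup S3 bucket.

    # Sketch of creating the delivery stream with boto3; all ARNs, the endpoint, and
    # the HEC token are placeholders for your own resources.
    import boto3

    firehose = boto3.client("firehose")

    firehose.create_delivery_stream(
        DeliveryStreamName="VPCtoSplunkStream",
        DeliveryStreamType="DirectPut",
        SplunkDestinationConfiguration={
            "HECEndpoint": "https://http-inputs-firehose-example.splunkcloud.com:443",
            "HECEndpointType": "Raw",
            "HECToken": "00000000-0000-0000-0000-000000000000",
            "S3BackupMode": "FailedEventsOnly",
            "S3Configuration": {
                "RoleARN": "arn:aws:iam::111122223333:role/firehose-splunk-role",
                "BucketARN": "arn:aws:s3:::vpc-flow-logs-backup-bucket",
            },
            "ProcessingConfiguration": {
                "Enabled": True,
                "Processors": [
                    {
                        "Type": "Lambda",
                        "Parameters": [
                            {
                                "ParameterName": "LambdaArn",
                                "ParameterValue": "arn:aws:lambda:us-east-1:111122223333:function:splunk-firehose-flowlogs-processor",
                            }
                        ],
                    }
                ],
            },
        },
    )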

Create a VPC flow log

In this final step, you create a VPC flow log with Kinesis Data Firehose as the destination type. A boto3 sketch of the equivalent API call follows these steps.

  1. On the Amazon VPC console, choose Your VPCs.
  2. Select the VPC for which to create the flow log.
  3. On the Actions menu, choose Create flow log.
  4. Provide the required settings for Filter:
    1. If you want to filter the flow logs, select Accept traffic or Reject traffic.
    2. Select All if you need all the information sent to Splunk.
  5. For Maximum aggregation interval, select a suitable interval for your use case. Select the minimum setting of 1 minute if you need the flow log data to be available for near-real-time analysis in Splunk.
  6. For Destination, select Send to Kinesis Firehose in the same account if the delivery stream is set up in the same account where you create the VPC flow logs. If you want to send the data to a different account, refer to Publish flow logs to Kinesis Data Firehose.
  7. For Log record format, if you leave it at AWS default format, the flow logs are sent in version 2 format. Alternatively, you can specify which fields you need captured and sent to Splunk. For more information on the log format and available fields, refer to Flow log records.
  8. Review all the parameters and create the flow log. Within a few minutes, you should be able to see the data in Splunk.
  9. Open your Splunk console and navigate to the Search tab of the Search & Reporting app.
  10. Run the following SPL query to look at sample VPC flow log records:
    index=<index name> sourcetype="aws:cloudwatchlogs:vpcflow"
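
If you would rather create the flow log programmatically, the following boto3 sketch shows the equivalent call; the VPC ID and delivery stream ARN are placeholders for the resources you created earlier.

    # Sketch of creating the VPC flow log with boto3. VPC ID and delivery stream ARN
    # are placeholders for your own resources.
    import boto3

    ec2 = boto3.client("ec2")

    ec2.create_flow_logs(
        ResourceType="VPC",
        ResourceIds=["vpc-0123456789abcdef0"],
        TrafficType="ALL",  # or "ACCEPT" / "REJECT" to filter
        LogDestinationType="kinesis-data-firehose",
        LogDestination="arn:aws:firehose:us-east-1:111122223333:deliverystream/VPCtoSplunkStream",
        MaxAggregationInterval=60,  # 1 minute, for near-real-time analysis in Splunk
    )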

Clean up

To avoid incurring future charges, delete the resources you created in the following order:

  1. Delete the VPC flow log.
  2. Delete the Kinesis Data Firehose delivery stream.
  3. Delete the serverless application to delete the transformation Lambda function.
  4. If you created a new VPC and new resources in the VPC, delete the resources and then the VPC.

Conclusion

You can use VPC flow log data in multiple Splunk solutions, like the Splunk App for AWS Security Dashboards for traffic analysis or Splunk Security Essentials, which uses the data to provide deeper insights into the security posture of your AWS environment. Using Kinesis Data Firehose to send VPC flow log data into Splunk provides many benefits. This managed service can automatically scale to meet the data demand and provide near-real-time data analysis. Try out this quick and hassle-free way of sending your VPC flow logs to Splunk Enterprise or Splunk Cloud Platform using Kinesis Data Firehose.

You can deploy this solution today in your AWS account by following the Kinesis Data Firehose Immersion Day Lab for Splunk.


About the author

Ranjit Kalidasan is a Senior Solutions Architect with Amazon Web Services based in Boston, Massachusetts. He is a Partner Solutions Architect helping security ISV partners co-build and co-market solutions with AWS. He brings over 20 years of experience in information technology, helping global customers implement complex solutions for security and analytics. You can connect with Ranjit on LinkedIn.
