This guide walks through creating an Amazon Kinesis Data Firehose delivery stream that delivers data to Amazon S3, from console setup through sending records from your applications.

Step 1: Open the Amazon Kinesis Console
- Log in to your AWS Management Console.
- Navigate to Amazon Kinesis under the “Analytics” section.
Step 2: Create a Firehose Delivery Stream
- In the Kinesis console, choose Create delivery stream.
- Provide a Delivery stream name (e.g., my-firehose-stream).
Step 3: Select the Source
Choose a data source:
- Direct PUT: Firehose receives data directly from your applications.
- Kinesis Data Stream: Use an existing Kinesis Data Stream as the source.
Step 4: Configure the Destination
Select the destination where Firehose will deliver the data. In this case, choose Amazon S3.
S3 Destination Settings:
- S3 Bucket: Select an existing bucket or create a new one (e.g., my-firehose-data-bucket).
- Prefix: Define a prefix to organize the data in the S3 bucket (e.g., firehose-data/).
- Error Output Prefix: Optionally define a prefix for failed records (e.g., firehose-errors/).
Step 5: Configure Data Transformation (Optional)
- If you want to process or transform the data before delivery:
  - Enable Data Transformation.
  - Specify a Lambda function for transforming incoming data (a minimal handler sketch follows after this list).
- If not, you can skip this step.
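When transformation is enabled, Firehose invokes your Lambda function with a batch of base64-encoded records and expects each record back with its recordId, a result status, and re-encoded data. The handler below is a minimal sketch of that contract; the added processed field is purely illustrative, not part of any required schema.

import base64
import json

def lambda_handler(event, context):
    # Firehose passes records as base64-encoded payloads in event['records'].
    output = []
    for record in event['records']:
        payload = json.loads(base64.b64decode(record['data']))
        payload['processed'] = True  # illustrative transformation only
        output.append({
            'recordId': record['recordId'],
            'result': 'Ok',  # 'Dropped' and 'ProcessingFailed' are the other valid statuses
            'data': base64.b64encode(
                (json.dumps(payload) + '\n').encode('utf-8')
            ).decode('utf-8'),
        })
    return {'records': output}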
Step 6: Configure Buffer and Compression Settings
- Buffer Size and Interval:
  - Define the buffer size (default is 5 MB).
  - Define the buffer interval (default is 300 seconds).
- Compression:
  - Choose a compression format (e.g., GZIP, Snappy, or leave it as None).
Step 7: Enable Data Encryption (Optional)
- Enable server-side encryption if required.
- Choose an AWS Key Management Service (KMS) key.
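For a Direct PUT stream, server-side encryption can also be turned on programmatically after the stream exists. A minimal boto3 sketch; the KMS key ARN below is a placeholder, not a real key:

import boto3

firehose = boto3.client('firehose', region_name='us-east-1')

# Enable SSE on the delivery stream with a customer-managed KMS key.
firehose.start_delivery_stream_encryption(
    DeliveryStreamName='my-firehose-stream',
    DeliveryStreamEncryptionConfigurationInput={
        'KeyType': 'CUSTOMER_MANAGED_CMK',
        'KeyARN': 'arn:aws:kms:us-east-1:123456789012:key/example-key-id',  # placeholder
    },
)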
Step 8: Review IAM Permissions
Kinesis Firehose requires permission to write to your S3 bucket. Ensure:
- An IAM role is automatically created for Firehose.
- The IAM role has the AmazonS3FullAccess policy or a custom policy granting access to the specified bucket (a sample custom policy sketch follows below).
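As a rough guide, a custom policy scoped to the bucket can be attached with boto3 as shown below. The role name, policy name, and bucket are placeholders; the listed actions are the S3 permissions Firehose needs for delivery.

import json
import boto3

iam = boto3.client('iam')

bucket = 'my-firehose-data-bucket'  # placeholder bucket name
policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        'Action': [
            's3:AbortMultipartUpload',
            's3:GetBucketLocation',
            's3:GetObject',
            's3:ListBucket',
            's3:ListBucketMultipartUploads',
            's3:PutObject',
        ],
        'Resource': [
            f'arn:aws:s3:::{bucket}',
            f'arn:aws:s3:::{bucket}/*',
        ],
    }],
}

# Attach the inline policy to the Firehose delivery role (placeholder names).
iam.put_role_policy(
    RoleName='my-firehose-delivery-role',
    PolicyName='firehose-s3-delivery',
    PolicyDocument=json.dumps(policy),
)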
Step 9: Review and Create
- Review all the configurations.
- Choose Create delivery stream.
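If you prefer to script the setup instead of using the console, the configuration from Steps 3 through 7 can be expressed in a single boto3 create_delivery_stream call. A minimal sketch; the role and bucket ARNs are placeholders for your own account:

import boto3

firehose = boto3.client('firehose', region_name='us-east-1')

# Create a Direct PUT delivery stream writing to S3 with the settings from earlier steps.
firehose.create_delivery_stream(
    DeliveryStreamName='my-firehose-stream',
    DeliveryStreamType='DirectPut',  # use 'KinesisStreamAsSource' for a Kinesis Data Stream source
    ExtendedS3DestinationConfiguration={
        'RoleARN': 'arn:aws:iam::123456789012:role/my-firehose-delivery-role',  # placeholder
        'BucketARN': 'arn:aws:s3:::my-firehose-data-bucket',                    # placeholder
        'Prefix': 'firehose-data/',
        'ErrorOutputPrefix': 'firehose-errors/',
        'BufferingHints': {'SizeInMBs': 5, 'IntervalInSeconds': 300},
        'CompressionFormat': 'GZIP',
    },
)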
Step 10: Test the Delivery Stream
- Go to the newly created delivery stream.
- Use the Test with demo data feature to send sample records.
- Verify that the records are delivered to the S3 bucket.
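After the buffer interval has elapsed, you can confirm delivery by listing the objects Firehose wrote under the configured prefix, for example:

import boto3

s3 = boto3.client('s3')

# List objects written by Firehose under the configured prefix.
response = s3.list_objects_v2(
    Bucket='my-firehose-data-bucket',
    Prefix='firehose-data/',
)
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'])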
Step 11: Integrate with Your Applications
Use the AWS SDK, CLI, or API to send data to the Firehose delivery stream. For example:
Python Example with boto3:
import boto3

# Create a Firehose client in the region where the delivery stream lives.
client = boto3.client('firehose', region_name='us-east-1')

# Send a single newline-delimited JSON record to the delivery stream.
response = client.put_record(
    DeliveryStreamName='my-firehose-stream',
    Record={
        'Data': b'{"field1": "value1", "field2": "value2"}\n'
    }
)
print(response)
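For higher throughput you can also send several records in one call with put_record_batch. A short sketch with illustrative records:

import boto3

client = boto3.client('firehose', region_name='us-east-1')

# Send multiple records in a single request (up to 500 per call).
records = [
    {'Data': b'{"field1": "value1"}\n'},
    {'Data': b'{"field1": "value2"}\n'},
]
response = client.put_record_batch(
    DeliveryStreamName='my-firehose-stream',
    Records=records,
)

# FailedPutCount reports how many records were rejected and should be retried.
print(response['FailedPutCount'])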






