Stream data into Snowflake using Amazon Data Firehose and Snowflake Snowpipe Streaming

Amazon Data Firehose (Firehose) now offers direct integration with Snowflake Snowpipe Streaming. Firehose enables customers to reliably capture, transform, and deliver data streams into Amazon S3, Amazon Redshift, Splunk, and other destinations for analytics. With this new feature, customers can stream clickstream, application, and AWS service logs from multiple sources, including Kinesis Data Streams, to Snowflake. With a few clicks, customers can set up a Firehose stream to deliver data to Snowflake. Firehose automatically scales to stream gigabytes of data, and records are available in Snowflake within seconds.

Snowflake offers two options for loading data into Snowflake tables: Snowpipe and Snowpipe Streaming. With Snowpipe, customers load data from files in micro-batches, which requires aggregating streams into batches, writing them to interim storage, and then loading them into Snowflake. This multi-step process adds several minutes of latency and increases cost. Snowpipe Streaming instead writes rows of data directly into tables. Through its integration with Snowpipe Streaming, Firehose delivers each record as soon as it is available, so data can be queried in Snowflake within seconds. As a result, customers reduce the cost, latency, and complexity of delivering streams into Snowflake.

To learn more and get started, visit the Amazon Data Firehose [documentation](https://docs.aws.amazon.com/firehose/latest/dev/what-is-this-service.html), [pricing](https://aws.amazon.com/kinesis/data-firehose/pricing/), and [console](https://console.aws.amazon.com/firehose/home).
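
For teams that prefer scripting the setup over the console, the sketch below shows roughly what creating a Firehose stream with a Snowflake destination looks like using boto3. The stream name, Snowflake account URL, credentials, database/schema/table names, S3 backup bucket, and IAM role ARNs are placeholders, and the exact `SnowflakeDestinationConfiguration` fields should be verified against the Firehose API reference.

```python
# Minimal sketch: create a Firehose stream that delivers to Snowflake via
# Snowpipe Streaming, then write a record to it. All identifiers below
# (names, ARNs, account URL, credentials) are hypothetical placeholders.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="clickstream-to-snowflake",   # placeholder name
    DeliveryStreamType="DirectPut",                  # or KinesisStreamAsSource
    SnowflakeDestinationConfiguration={
        "AccountUrl": "https://example-account.snowflakecomputing.com",
        "User": "FIREHOSE_USER",
        "PrivateKey": "<PEM-encoded private key>",   # key-pair authentication
        "Database": "ANALYTICS",
        "Schema": "PUBLIC",
        "Table": "CLICKSTREAM_EVENTS",
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "S3BackupMode": "FailedDataOnly",
        "S3Configuration": {                         # backup location for failed records
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::example-firehose-backup",
        },
    },
)

# Once the stream is ACTIVE, records sent with PutRecord / PutRecordBatch are
# delivered to the Snowflake table row by row through Snowpipe Streaming.
firehose.put_record(
    DeliveryStreamName="clickstream-to-snowflake",
    Record={"Data": b'{"event": "page_view", "user_id": 42}\n'},
)
```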