Druid vs Snowflake (2024)

A detailed comparison

Compare Druid vs Snowflake by the following set of categories: Architecture, Scalability, Performance, and Use cases.

Druid vs Snowflake - Architecture

The biggest differences among cloud data warehouses are whether they separate storage and compute, how much they isolate data and compute, and which clouds they can run on.

Druid provides the ability to handle fast ingest and high concurrency. Custom sizing and cluster tuning are required to balance the compute, memory, and storage needs of each process within Druid and to provide high concurrency. Druid clusters can be grown by adding nodes, with automatic rebalancing of the storage segments assigned to each node. Self-hosted Druid on Kubernetes is an option that users leverage to simplify scaling. Additionally, cloud-based managed Druid offerings are being rolled out; however, these managed offerings are limited in scale, and scaling is not granular.

Snowflake was one of the first decoupled storage and compute architectures, making it the first to offer nearly unlimited compute scale, workload isolation, and horizontal user scalability. It runs on AWS, Azure, and GCP. It is multi-tenant by nature, running over shared resources, and requires you to move data out of your VPC and into the Snowflake cloud. “Virtual Private Snowflake” (VPS), its highest-priced tier, runs a dedicated, isolated version of Snowflake. Its virtual warehouses are T-shirt sized along an XS/S/M…/4XL axis, where each discrete T-shirt size is bundled with fixed hardware properties that are abstracted from the user. Snowflake has recently added support for Snowflake-managed Iceberg tables.
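
As a concrete illustration of the T-shirt sizing model, here is a minimal sketch that creates and resizes a virtual warehouse through the snowflake-connector-python package; the credentials and the warehouse name are placeholders, not anything prescribed by Snowflake.

```python
# Sketch: creating and resizing a Snowflake virtual warehouse.
# Credentials and the warehouse name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="my_user",
    password="my_password",
)
cur = conn.cursor()

# Create a small (XS) warehouse that suspends itself when idle.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
      WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE
""")

# Move up a T-shirt size for a heavier workload; the hardware behind
# each size is fixed and abstracted away from the user.
cur.execute("ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'MEDIUM'")
```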

Druid vs Snowflake - Scalability

There are three big differences among data warehouses and query engines that limit scalability: decoupled storage and compute, dedicated resources, and continuous ingestion.

Druid provides the ability to handle fast ingest and high concurrency. Custom sizing and cluster tuning are required to balance the compute, memory, and storage needs of each process within Druid and to provide high concurrency. Druid clusters can be grown by adding nodes, with automatic rebalancing of the storage segments assigned to each node.

Snowflake scales very well, both for data volumes and for query concurrency. The decoupled storage/compute architecture supports resizing clusters without downtime and, in addition, supports horizontal auto-scaling for higher query concurrency during peak hours.
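
The resize-without-downtime and horizontal auto-scaling behavior can be sketched as follows, again with snowflake-connector-python; the credentials, warehouse name, and cluster counts are placeholder assumptions, and multi-cluster warehouses require an Enterprise-or-above edition.

```python
# Sketch: vertical resize plus horizontal auto-scaling of a Snowflake warehouse.
# Credentials, warehouse name, and cluster counts are placeholders;
# multi-cluster warehouses require Enterprise edition or above.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="my_user", password="my_password")
cur = conn.cursor()

# Vertical resize: in-flight queries finish on the old size, new queries use the new one.
cur.execute("ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE'")

# Horizontal auto-scaling: spin up to 4 clusters under peak concurrency,
# then shrink back to 1 as the query queue drains.
cur.execute("""
    ALTER WAREHOUSE reporting_wh SET
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 4
      SCALING_POLICY = 'STANDARD'
""")
```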

Druid vs Snowflake - Performance

Performance is the biggest challenge with most data warehouses today. While decoupled storage and compute architectures improved scalability and simplified administration, for most data warehouses they introduced two bottlenecks: storage and compute. Most modern cloud data warehouses fetch entire partitions over the network instead of fetching only the specific data needed for each query. While many invest in caching, most do not invest heavily in query optimization. Most vendors also have not improved continuous ingestion or semi-structured data analytics performance, both of which are needed for operational and customer-facing use cases.

Druid provides high performance through its columnar storage format, parallel processing, bitmap indexes, and roll-ups. Druid, however, recommends a denormalized data model for performance. Join operations in Druid are a relatively new feature with various limitations, especially when large datasets need to be joined.
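
The roll-up behavior can be illustrated with a minimal batch ingestion spec submitted to the Druid Overlord; the datasource, column names, file paths, and Overlord URL below are placeholder assumptions, not part of any particular deployment.

```python
# Sketch: a Druid batch ingestion spec with roll-up enabled, so raw events are
# pre-aggregated to hourly granularity at ingest time. Datasource, columns,
# paths, and the Overlord URL are placeholders.
import requests

ingestion_spec = {
    "type": "index_parallel",
    "spec": {
        "dataSchema": {
            "dataSource": "page_events_rollup",
            "timestampSpec": {"column": "ts", "format": "iso"},
            "dimensionsSpec": {"dimensions": ["country", "page"]},
            # Metrics are pre-aggregated per (hour, country, page), so queries
            # scan far fewer rows than the raw event stream.
            "metricsSpec": [
                {"type": "count", "name": "events"},
                {"type": "longSum", "name": "bytes_sum", "fieldName": "bytes"},
            ],
            "granularitySpec": {
                "segmentGranularity": "day",
                "queryGranularity": "hour",
                "rollup": True,
            },
        },
        "ioConfig": {
            "type": "index_parallel",
            "inputSource": {"type": "local", "baseDir": "/data", "filter": "events-*.json"},
            "inputFormat": {"type": "json"},
        },
    },
}

# Submit the task to the Overlord; the response contains the task id.
resp = requests.post("http://localhost:8081/druid/indexer/v1/task", json=ingestion_spec, timeout=30)
resp.raise_for_status()
print(resp.json())
```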

Snowflake typically comes out on top for most queries in public TPC-based benchmarks when compared to BigQuery and Redshift, but only marginally. Its micro-partition storage approach effectively scans less data compared to larger partitions. The ability to isolate workloads over the decoupled storage and compute architecture lets you avoid competition for resources compared to multi-tenant shared-resource solutions, and increasing warehouse sizes can often enhance performance (at a higher price), though not always linearly. Snowflake’s recently released “Search optimization service” delivers index-like behavior for point queries, but comes at an additional cost.
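
For the point-query pattern mentioned above, a hedged sketch of enabling the search optimization service on a table; the table and column names are placeholders, and the service is billed separately.

```python
# Sketch: enabling Snowflake's search optimization service so highly selective
# point lookups get index-like access paths. Table and column names are
# placeholders; the service adds storage and compute cost.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="my_user", password="my_password")
cur = conn.cursor()

# Register a specific column with the search optimization service.
cur.execute("ALTER TABLE events ADD SEARCH OPTIMIZATION ON EQUALITY(user_id)")

# A selective point query that can benefit once the service has built its structures.
cur.execute("SELECT * FROM events WHERE user_id = %s", ("u-12345",))
print(cur.fetchmany(10))
```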

Druid vs Snowflake - Use cases

There are a host of different analytics use cases that can be supported by a data warehouse. Look at your legacy technologies and their workloads, as well as possible new use cases, and figure out which ones you will need to support in the next few years.

Druid is designed as an OLAP engine that provides fast access to aggregations run against large volumes of data. Druid is typically used for customer-facing analytics and streaming data processing. It is usually deployed as an add-on alongside data warehousing products that are efficient at scaling, joining, and filtering large volumes of data; it is not a suitable option as a data warehouse replacement.
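
The kind of customer-facing aggregation Druid targets can be sketched as a Druid SQL query issued over the HTTP API; the Router URL, datasource, and column names are placeholders carried over from the ingestion sketch above.

```python
# Sketch: a dashboard-style aggregation issued against Druid's SQL endpoint.
# The Router URL, datasource, and column names are placeholders.
import requests

query = """
    SELECT
      TIME_FLOOR(__time, 'PT1H') AS hour,
      country,
      SUM(events) AS events,
      SUM(bytes_sum) AS bytes
    FROM page_events_rollup
    WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '1' DAY
    GROUP BY 1, 2
    ORDER BY 1
"""

resp = requests.post(
    "http://localhost:8888/druid/v2/sql/",
    json={"query": query, "resultFormat": "object"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json():
    print(row)
```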

Snowflake is a well-rounded, general-purpose cloud data warehouse that can also span beyond traditional BI and analytics use cases into ad-hoc and ML use cases. Thanks to the flexible decoupled storage and compute architecture, which allows you to isolate and control the amount of compute per workload, it’s possible to tackle a broad spectrum of workloads. However, like its close siblings Redshift and BigQuery, it struggles to deliver low-latency query performance at scale, making it a lesser fit for operational use cases and customer-facing data apps.


Talk to a Firebolt solution architect