New research from UC Berkeley identifies four key characteristics of how AI agents interact with databases, all revolving around agentic speculation: agents formulate hypotheses and lean on the underlying database to find answers. Well-performing agentic systems often have many agents collaborating on a single problem, so you can easily have hundreds of parallel agents asking similar questions. Firebolt was built for exactly these patterns.
Firebolt integrates seamlessly with the tools and protocols that power today's AI applications.

Native PostgreSQL dialect and wire protocol support for seamless integration with existing AI tools and frameworks.
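As a concrete illustration of the wire-protocol point, the sketch below connects with the standard psycopg Postgres driver. The host, credentials, and events table are placeholders, not real Firebolt values.

```python
# Minimal sketch: talking to a Postgres-wire-compatible endpoint with psycopg 3.
# The connection parameters and table below are placeholders, not real Firebolt values.
import psycopg

conn_info = {
    "host": "your-firebolt-endpoint.example.com",  # hypothetical endpoint
    "port": 5432,
    "user": "agent_service",
    "password": "********",
    "dbname": "analytics",
}

with psycopg.connect(**conn_info) as conn:
    with conn.cursor() as cur:
        # Any Postgres-dialect SQL an agent or framework emits can be sent as-is.
        cur.execute("SELECT event_type, count(*) FROM events GROUP BY event_type")
        for event_type, cnt in cur.fetchall():
            print(event_type, cnt)
```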

Built-in support for Apache Iceberg, enabling efficient data lakehouse architectures for AI workloads.
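A sketch of what an Iceberg read could look like through SQL. The READ_ICEBERG table-valued function name and the S3 path are assumptions made for this illustration; consult the Firebolt documentation for the exact syntax.

```python
# Illustrative only: querying an Iceberg table through SQL.
# The READ_ICEBERG table-valued function name and the S3 path are assumptions
# made for this sketch; check the docs for the exact syntax.
import psycopg

ICEBERG_QUERY = """
    SELECT user_id, action, event_time
    FROM READ_ICEBERG(location => 's3://my-lakehouse/events/')  -- assumed function name
    WHERE event_time > now() - INTERVAL '1 day'
"""

with psycopg.connect("host=your-firebolt-endpoint.example.com dbname=analytics") as conn:
    rows = conn.execute(ICEBERG_QUERY).fetchall()
    print(len(rows), "rows read from the lakehouse")
```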

Model Context Protocol server for direct AI model integration and context sharing.
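To show the shape of an MCP integration, here is a minimal client sketch using the official mcp Python SDK. The firebolt-mcp-server command and its environment variables are assumptions for illustration.

```python
# Minimal MCP client sketch using the official `mcp` Python SDK.
# The server command and env vars are assumptions for illustration only.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="firebolt-mcp-server",  # hypothetical launcher name
    env={"FIREBOLT_CLIENT_ID": "...", "FIREBOLT_CLIENT_SECRET": "..."},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # These are the database tools an AI model can call through MCP.
            print([t.name for t in tools.tools])

asyncio.run(main())
```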

Agent-to-Agent communication protocol for sophisticated multi-agent AI systems.

Native support for LangChain and other popular AI frameworks for rapid development.
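A sketch of the LangChain path, leaning on the Postgres compatibility above. SQLDatabase.from_uri is a standard LangChain utility; the connection URI and table names are placeholders.

```python
# Sketch: exposing a Postgres-wire-compatible endpoint to LangChain.
# The URI and table name are placeholders, not real Firebolt values.
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri(
    "postgresql+psycopg2://agent_service:********@your-firebolt-endpoint.example.com:5432/analytics"
)

# An agent (or a human) can now introspect the schema and run SQL through one object.
print(db.get_usable_table_names())
print(db.run("SELECT count(*) FROM events"))
```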
Built-in support for embeddings and semi-structured data.

The fastest analytical query engine in the world, optimized for AI workload patterns.
High-volume data ingestion capabilities that keep pace with the fastest data generation rates.
Purpose-built architecture that handles thousands of concurrent AI agent queries without degradation; a client-side sketch of this access pattern follows below.
Work on the freshest data available, ensuring AI agents make decisions on current information.
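As a client-side illustration of the concurrency point above, this sketch fans out many similar agent probes over a thread pool. The endpoint, credentials, and events table are placeholders, not real Firebolt values.

```python
# Client-side sketch: many agents probing in parallel with similar queries.
# Endpoint, credentials, and table are placeholders for illustration.
from concurrent.futures import ThreadPoolExecutor
import psycopg

CONNINFO = (
    "host=your-firebolt-endpoint.example.com dbname=analytics "
    "user=agent_service password=********"
)

def probe(region: str) -> tuple[str, int]:
    # In production you would reuse connections from a pool instead of
    # opening one per probe; this keeps the sketch self-contained.
    with psycopg.connect(CONNINFO) as conn:
        row = conn.execute(
            "SELECT count(*) FROM events "
            "WHERE region = %s AND event_time > now() - INTERVAL '1 hour'",
            (region,),
        ).fetchone()
        return region, row[0]

regions = [f"region-{i}" for i in range(100)]  # stand-in for 100 speculating agents
with ThreadPoolExecutor(max_workers=32) as pool:
    for region, cnt in pool.map(probe, regions):
        print(region, cnt)
```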
The foundational capabilities required to support autonomous AI agents in production environments
Postgres-compatible SQL dialect to help agents build queries
Postgres-compatible wire protocol for integration with existing tools
Support for AI ecosystem protocols and frameworks - MCP, A2A, LangChain
Integration with the LLM provider of the user's choice
Fits into the Python ecosystem: from the inside with Python UDFs, from the outside with pandas DataFrame-like APIs
Hybrid search: a mix of symbolic queries (SQL) and semantic search (vector similarity), as sketched after this list
Vector search indexes for approximate nearest neighbor (ANN) lookups, crucial for real-time agent loop performance
A planner that can handle complex, machine-generated SQL
Approximate query results and sampling to serve agent probes
Agentic memory: the agent needs a place to persist beliefs, goals, plans, and past interactions; fast DML and explicit transactions provide it (see the transaction sketch after this list)
Flexible schemas: agents often deal with heterogeneous and evolving data (text, embeddings, structured facts), so the database must handle both structured (SQL tables) and unstructured (documents, vectors, JSON) data
Temporal/versioned data: built on copy-on-write and zero-copy-clone mechanisms, allowing low-cost branching and fast transaction rollback
Multi-agent state sharing: When multiple agents act on the same data, the database must handle concurrency, isolation, and conflict resolution
Task queues & workflows: Agents often need transactional coordination between steps
Provenance & lineage: Ability to track where knowledge came from
Access control: Fine-grained security (agents may have different capabilities/roles)
Audit logs: Required for debugging autonomous behavior
Low-latency queries: Since agent loops often run synchronously with user requests, retrieval must be sub-100ms
Horizontal scale: Agents can generate and consume large volumes of data quickly
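To make the hybrid-search item above concrete, the sketch below combines a symbolic SQL filter with vector-similarity ranking. The documents table, the embed() helper, and the VECTOR_COSINE_DISTANCE function name are assumptions for illustration; check the current documentation for the exact vector functions.

```python
# Hybrid search sketch: symbolic SQL predicates + vector-similarity ranking.
# The documents table, embed() helper, and VECTOR_COSINE_DISTANCE function name
# are assumptions for illustration.
import psycopg

def embed(text: str) -> list[float]:
    # Stand-in for a call to your embedding model of choice.
    return [0.0] * 768

HYBRID_QUERY = """
    SELECT doc_id, title
    FROM documents
    WHERE category = %(category)s                                -- symbolic predicate
      AND published_at > now() - INTERVAL '90 days'
    ORDER BY VECTOR_COSINE_DISTANCE(embedding, %(query_vec)s)   -- semantic ranking (assumed fn)
    LIMIT 10
"""

with psycopg.connect("host=your-firebolt-endpoint.example.com dbname=analytics") as conn:
    rows = conn.execute(
        HYBRID_QUERY,
        {"category": "support_tickets", "query_vec": embed("billing error after upgrade")},
    ).fetchall()
    for doc_id, title in rows:
        print(doc_id, title)
```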
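And for the agentic-memory item, a minimal sketch that persists a belief and a plan step in one explicit transaction, so partial writes never become visible to other agents. The agent_beliefs and agent_plan_steps tables are assumed for illustration.

```python
# Agentic memory sketch: persist a belief and a plan step atomically.
# The agent_beliefs / agent_plan_steps tables are assumed for illustration.
import json
import psycopg

with psycopg.connect("host=your-firebolt-endpoint.example.com dbname=agent_state") as conn:
    with conn.transaction():  # explicit transaction: both writes commit or neither does
        conn.execute(
            "INSERT INTO agent_beliefs (agent_id, belief, confidence) VALUES (%s, %s, %s)",
            ("agent-42", json.dumps({"churn_risk_segment": "enterprise"}), 0.8),
        )
        conn.execute(
            "INSERT INTO agent_plan_steps (agent_id, step, status) VALUES (%s, %s, %s)",
            ("agent-42", "query churn drivers for enterprise segment", "pending"),
        )
```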