Kognitio is a mature SQL engine for your Hadoop cluster or your data warehouse. With massively concurrent, high-throughput ANSI SQL, unlock the insight in your big data.
Kognitio was built for massively parallel query processing of big data using SQL. Here’s how it works.
Kognitio operates as a distributed, scalable cluster, from a single server to more than a thousand individual nodes, in a true shared-nothing architecture. You choose the most efficient data model and distribution across the cluster, and Kognitio redistributes data as queries demand. Scale up or scale down as you need.
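The core idea behind shared-nothing distribution can be sketched in a few lines. This is an illustrative model only, not Kognitio's implementation: rows are assigned to nodes by hashing a chosen distribution key, so each node owns a disjoint slice of the data.

```python
# Minimal sketch (illustrative, not Kognitio's code): hash-distribute rows
# across a shared-nothing cluster so each node owns a disjoint slice.
from collections import defaultdict

def distribute(rows, key, num_nodes):
    """Assign each row to a node by hashing its distribution key."""
    nodes = defaultdict(list)
    for row in rows:
        node_id = hash(row[key]) % num_nodes
        nodes[node_id].append(row)
    return nodes

rows = [{"id": i, "region": r} for i, r in enumerate(["EU", "US", "APAC"] * 4)]
placement = distribute(rows, "region", num_nodes=3)
# Rows sharing a key land on the same node, so joins and aggregations on
# that key need no cross-node traffic; redistributing on a different key
# is simply a re-run of the same function with a new key.
```

Choosing the distribution key well is what makes "the most efficient data model and distribution" matter: co-locating the rows a query joins on keeps work local to each node.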
In-memory processing, not caching
Kognitio is built to run in-memory, with both data and query processing operating directly in RAM. With no reliance on disk, there is no risk of legacy I/O bottlenecks.
Every CPU cycle utilized
Kognitio dynamically deploys CPU capacity across the entire cluster to process that moment’s workload most efficiently, whether splitting a single complex query across many cores or processing many thousands of queries simultaneously.
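Splitting a single query across many cores follows a map-and-merge pattern. A minimal sketch, not Kognitio's code: one aggregate query is broken into per-partition partials that run concurrently, then merged. A thread pool stands in here for the per-core workers a real engine would use.

```python
# Illustrative sketch: split one aggregate query into per-partition
# partials, run them concurrently, then merge the results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    """The per-core fragment of SELECT SUM(amount): aggregate one slice."""
    return sum(row["amount"] for row in partition)

data = [{"amount": a} for a in range(1, 101)]    # amounts 1 + 2 + ... + 100
partitions = [data[i::4] for i in range(4)]      # four disjoint slices

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, partitions))

total = sum(partials)   # the merge step: combine per-core results
# total == 5050, the same answer a serial scan would produce
```

The same pattern works in the other direction: with many small queries in flight, each partial becomes a whole query and the pool is shared across them.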
Real-time machine code generation
During query planning, Kognitio generates custom machine code to squeeze every last cycle out of every core.
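The shape of the technique can be shown with a toy example. This is not Kognitio's code generator: real engines emit native machine code, while this sketch compiles a filter expression to Python bytecode once at plan time, so evaluation per row avoids re-interpreting the expression text.

```python
# Sketch of query-time code generation (illustrative only): compile a
# WHERE-clause-like expression into a callable once, at plan time,
# instead of interpreting it for every row.
def compile_predicate(expr):
    """Turn a textual filter expression into a fast callable."""
    code = compile(f"lambda row: ({expr})", "<query-plan>", "eval")
    return eval(code, {"__builtins__": {}})

pred = compile_predicate("row['price'] > 10 and row['qty'] < 5")
rows = [
    {"price": 12, "qty": 3},
    {"price": 8, "qty": 1},
    {"price": 20, "qty": 9},
]
matches = [r for r in rows if pred(r)]   # only the first row qualifies
```

Generating the evaluation code per query is what lets an engine specialise for the exact expressions, types, and data layout of that query rather than paying for a general-purpose interpreter on every row.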
Every query completes
Some query engines fail part-way through a query if they run out of resources. Kognitio adjusts its plan as the query runs, dynamically redirecting resources to make sure every query completes.
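The principle of re-planning under resource pressure can be sketched as follows. This is assumed behaviour for illustration, not Kognitio's internals: when a step exceeds its memory budget, the work is split into smaller pieces and retried rather than failing the whole query.

```python
# Illustrative sketch: instead of aborting when a step exceeds its memory
# budget, re-plan with smaller partitions and merge the partial results.
def run_step(rows, budget):
    """One execution step with a hard memory budget (rows as a proxy)."""
    if len(rows) > budget:
        raise MemoryError("step exceeded its memory budget")
    return sum(rows)

def run_query(rows, budget):
    """Aggregate all rows; on resource pressure, split and retry."""
    try:
        return run_step(rows, budget)
    except MemoryError:
        mid = len(rows) // 2
        # Re-plan: process the data in two smaller pieces and merge.
        return run_query(rows[:mid], budget) + run_query(rows[mid:], budget)

result = run_query(list(range(1, 101)), budget=30)
# result == 5050: the query completes despite the tight budget
```

The answer is identical to what the original plan would have produced; only the execution strategy changed mid-flight.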