Kognitio puts standard SQL on your Hadoop cluster or in front of your data warehouse. With massively concurrent, high-throughput, real-time SQL, you can unlock the insight in your big data.
Kognitio is built from the ground up for massively parallel query processing. Here’s how we do it.
Kognitio operates as a distributed, scalable cluster, from a single server to more than a thousand individual nodes, in a truly shared-nothing architecture. You choose the most efficient data model and distribution across the cluster, and Kognitio redistributes data as queries demand. Scale up or down as you need.
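The shared-nothing idea above can be sketched in a few lines. This is an illustrative model only, not Kognitio's implementation: rows are hashed on a chosen distribution column so each node owns a disjoint slice of the data, and re-running the distribution with a different key column models redistributing data for a new query pattern. The node count and column names are hypothetical.

```python
# Hypothetical sketch of hash distribution in a shared-nothing cluster.
from collections import defaultdict
import hashlib

NODES = 4  # assumed cluster size for illustration

def node_for(key: str, nodes: int = NODES) -> int:
    """Map a distribution key to a node by hashing its value."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % nodes

def distribute(rows, key_col):
    """Partition rows across nodes by the chosen distribution column."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[node_for(str(row[key_col]))].append(row)
    return partitions

rows = [{"customer": f"c{i}", "amount": i * 10} for i in range(8)]
parts = distribute(rows, "customer")
# Each row lands on exactly one node, so nodes can scan their
# slices in parallel without coordinating over shared storage.
```

Because each node owns its partition outright, queries that filter or join on the distribution key can run entirely locally, which is what makes the architecture scale to a thousand nodes.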
In-memory processing, not caching
Kognitio is built to run in-memory, with both data and query processes operating directly in RAM. With no reliance on disk, there’s no risk of legacy I/O bottlenecks.
Every CPU cycle utilized
Kognitio dynamically deploys CPU capacity across the entire cluster to process that moment’s workload most efficiently, whether splitting a single complex query across many cores or processing many thousands of queries simultaneously.
Real-time machine code generation
During query planning, Kognitio generates custom machine code to squeeze every last cycle out of every core.
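The principle behind plan-time code generation can be shown with a toy example. This is not Kognitio's mechanism (which emits actual machine code); it is a hedged sketch using Python's `compile`/`exec` to build a specialized predicate function for a hypothetical WHERE clause, so the per-row work is a compiled function call rather than generic interpretation.

```python
# Illustrative sketch: generating specialized code for a filter at plan time.
def compile_filter(column: str, op: str, literal):
    """Build and compile a predicate function for `column <op> literal`."""
    src = f"def _pred(row):\n    return row[{column!r}] {op} {literal!r}"
    namespace = {}
    exec(compile(src, "<query-plan>", "exec"), namespace)
    return namespace["_pred"]

# Specialize once per query, then apply cheaply to every row.
pred = compile_filter("amount", ">", 50)
rows = [{"amount": a} for a in (10, 60, 90)]
print([r["amount"] for r in rows if pred(r)])  # [60, 90]
```

Generating the predicate once and reusing it for millions of rows is the point: the cost of specialization is paid at planning time, not per row.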
Every query completes
Some query engines will fail partway through a query if they run out of resources. Kognitio adjusts its plan as the query runs, dynamically redirecting resources to make sure every query completes.