A prerequisite for good scale-out is software that supports “Massively Parallel Processing” (MPP).
Kognitio was written from scratch as an MPP solution. It allows enormously powerful data-analysis platforms to be built from scalable, commodity, industry-standard servers by efficiently harnessing very large amounts of CPU power.
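To make the MPP idea concrete, the sketch below shows the general pattern in miniature: one aggregation is split into independent shards, each shard is processed by a separate worker in parallel, and a cheap final step combines the partial results. This is a conceptual illustration only, not Kognitio code; the function and variable names (`partial_sum`, `parallel_total`) are hypothetical.

```python
# Conceptual MPP sketch (not Kognitio code): split one aggregation
# across several worker processes, then combine the partial results.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker computes its share of the answer independently.
    return sum(chunk)

def parallel_total(values, workers=4):
    # Crude "sharding": deal the rows out round-robin, one shard per worker.
    chunks = [values[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # The final combine step is cheap relative to the parallel scan.
    return sum(partials)

if __name__ == "__main__":
    # Gives the same answer as sum(range(1_000_000)), but the scan
    # is spread across all the workers.
    print(parallel_total(list(range(1_000_000))))
```

A real MPP engine applies the same shard-process-combine pattern across many servers rather than processes on one machine, and to SQL operations rather than a single Python function.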
Kognitio sits between where the data is stored (“the persistence layer”) and the end-user tools, reports and applications (“the consumption layer”). It lets users easily pull very large amounts of data from existing persistence systems into high-speed computer memory and apply massive amounts of processing power to it, so that complex analytical questions can be answered interactively, regardless of how big the data is. The persistence layer can be a traditional disk-based data warehouse product, an operational system, Kognitio’s optional internal disk subsystem, a distributed parallel file system such as Hadoop, or cloud-based storage such as Amazon S3, Microsoft Azure WASB or Azure ADLS.
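The load-once, query-many pattern described above can be sketched as follows. This is a toy illustration of the architecture, not Kognitio's API: SQLite's in-memory mode stands in for the in-memory engine, a Python list stands in for the persistence layer, and the table and column names are hypothetical.

```python
# Conceptual sketch (not Kognitio's API): pull data from a persistence
# layer into an in-memory store once, then answer repeated analytical
# queries entirely from RAM.
import sqlite3

# "Persistence layer": a plain list standing in for a disk-based
# warehouse, Hadoop file system, or cloud object store.
source_rows = [("widgets", 120), ("gadgets", 340), ("widgets", 75)]

# Load the data once into high-speed memory (an in-memory SQLite database).
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE sales (product TEXT, amount INTEGER)")
mem.executemany("INSERT INTO sales VALUES (?, ?)", source_rows)

# Interactive analytical queries now run against the in-memory copy,
# never touching the slower persistence layer.
total_by_product = dict(
    mem.execute("SELECT product, SUM(amount) FROM sales GROUP BY product")
)
```

The design point is that the expensive step (reading from persistent storage) happens once, while the interactive steps (the analytical queries users actually wait on) are served from memory.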
Kognitio’s in-memory analytical platform can handle even the very largest data sets, scaling out across arrays of low-cost, industry-standard servers in much the same way that Hadoop does for the “big data” storage and processing problem.
With Kognitio, business users can continue to use their preferred front-end applications and visualization tools, e.g. Tableau, Qlik, Power BI and MicroStrategy, even when working with very large data sets. Kognitio is currently fully supported by Tableau, MicroStrategy and Power BI; Qlik Sense can work with Kognitio via its Direct Query and ODAG functionality.
The Kognitio Analytical Platform software can be deployed on a standalone compute cluster or on an existing Hadoop cluster, either on premises or in the cloud. All built from the same source code, the software is currently available in three forms: