IDC reports for Q1 2014 a 6.9% reduction in revenue for the disk storage systems market and a 25% spending cut in high-end storage solutions.
So has the flood of Big Data eased? Are the data lakes now big enough? Very unlikely. More likely is a shift in emphasis from storage to utilization: ultimately business teams have to show some value from accruing all this data, and those expensive (and rare) data scientists have to yield insights and move on to the analytical production line – insight after insight after insight…
This means that analysis of the data takes the focus and starts to acquire some of the spend, typically on CPUs, memory and software – disk sub-systems are fine for coarse processing, but they hold back the high-frequency, iterative, complex analytics that is ultimately required for fine-detail processing. Pull qualified, filtered sets of data from the data lake into large RAM and let loose with as many CPU cores as possible to run the analytical production line – day in, day out – whether SQL processing or massively parallel R or Python.
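The pattern described above – filter a qualified subset out of a much larger dataset, hold it in RAM, then fan the analytical work out across every available CPU core – can be sketched in plain Python with the standard library. This is an illustrative sketch only, not Kognitio's implementation: the "lake", the qualification filter and the per-chunk analytic are all hypothetical stand-ins.

```python
# Illustrative sketch of in-memory, multi-core analytics:
# pull a filtered set into RAM, then process it across all CPU cores.
# The dataset, filter and analytic below are hypothetical placeholders.
from multiprocessing import Pool, cpu_count

def analyse_chunk(chunk):
    # Placeholder for a complex, iterative analytic;
    # here it is just a sum of squares over the chunk.
    return sum(x * x for x in chunk)

def run_pipeline():
    # Stand-in for the data lake: a large sequence we qualify in memory.
    lake = range(1_000_000)
    qualified = [x for x in lake if x % 2 == 0]  # hypothetical filter

    # One chunk per core; fan out, then combine the partial results.
    n = cpu_count()
    chunks = [qualified[i::n] for i in range(n)]
    with Pool(n) as pool:
        partials = pool.map(analyse_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    print(run_pipeline())
```

In a real deployment the per-chunk function would be the expensive, iterative part, and the parallel result matches the sequential one exactly – the fan-out only changes where the work runs, not what it computes.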
Sounds easy, but scale-out in-memory solutions running complex analytics are tough to build. As Kognitio will attest, it takes a lot of engineering to build a stable analytical platform that can be quickly deployed and easily harnessed for ad-hoc analytics that graduate seamlessly onto a high-throughput analytical production line.
So if you’re thinking less about storage and more about how to do scale-out analytical processing today, then contact us – we’ve been delivering MPP analytical software that tightly integrates with Hadoop and other storage solutions for many years.