Big data and business professionals are gathering in New York this week to hear about the latest innovations in the sector and learn how they can get more out of their information at the Strata + Hadoop World conference.
The event is a sellout, and thousands of data scientists and IT executives will take in three days of tutorials, sessions and keynote speeches between October 28th and 30th. Among the presenters will be Kognitio, a sponsor of the event.
As well as exhibiting our in-memory computing solutions, which help businesses conduct real-time analytics and greatly improve the speed of data processing, we will have big data advocate and chief innovation officer Paul Groom leading a session. It takes place on Wednesday (October 30th) and will explain why companies should be looking at in-memory as a way to make the most of Hadoop.
Understanding how best to implement this technology will be essential for any data-intensive businesses in the coming years. With the general availability release of version 2 of Hadoop this month, the platform is now ready for widespread adoption into production environments. As a result, organizations that are unable to gain insight from the information they gather will be at a disadvantage compared with those that understand how to make their data work for them and are able to get results quickly.
This means companies need to intelligently engineer Hadoop into their existing, mature business intelligence environments in order to support agile business practices, Mr Groom says. He adds that at present, many business users move too slowly and adopt new tools too far behind the curve to keep up with the innovation in the Hadoop community.
Mr Groom's session will explain how businesses can protect Hadoop clusters from the demands of business intelligence use, as well as how they can ensure their system is able to cope with the most complex queries.
One key point to understand is that Hadoop does not exist in a vacuum. Many key applications in today's business environment speak SQL, so it is vital that these are able to communicate with Hadoop solutions.
"A modern data platform will act as a Logical Data Warehouse to 'bridge' Hadoop with the legacy systems and connect it to existing visualization and dashboard tools for reporting and advanced analytics," Mr Groom said. Adding Hadoop to an IT network should be seen as an evolution of capabilities and building on what is already in place, rather than trying to "reinvent the wheel".
His session at Strata + Hadoop World New York will also seek to help businesses understand how they can extend Hadoop with in-memory processing in order to get faster results. This is increasingly important for many organizations, as end-users are now demanding high performance, with expectations for Google-like query speeds.
This is not something a standard Hadoop deployment can meet, as it does not keep every core fully occupied with useful work. In-memory processing can, however, as this approach drives all cores at full utilization to complete queries in seconds rather than minutes or hours.
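The core-utilization point can be sketched in a few lines. This is a generic illustration, not Kognitio's engine: it partitions an in-memory dataset and fans a simple aggregation out across every available core, the way an in-memory engine keeps all cores busy rather than leaving them idle. Function names and the workload are made up for the example.

```python
from multiprocessing import Pool
import os

def partial_sum(chunk):
    # Each worker scans its own in-memory partition independently.
    return sum(chunk)

def parallel_total(data, workers=None):
    # Default to one worker per available core.
    workers = workers or os.cpu_count() or 1
    size = max(1, len(data) // workers)
    # Split the data into roughly equal partitions.
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        # All cores aggregate their partitions in parallel,
        # then the partial results are combined.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    values = list(range(1_000_000))
    print(parallel_total(values))  # 499999500000
```

The same scatter-then-combine shape underlies most in-memory analytics: the query finishes when the slowest partition does, so evenly loaded cores translate directly into lower latency.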