With information volumes growing all the time and more firms recognizing the importance of being able to access, evaluate and act on data quickly, businesses will have to answer key questions about which technologies to adopt in order to make the most of this new environment.

One issue that will be high on the agenda for a large number of chief information officers is whether to use traditional data warehousing tools or in-memory platforms. Fierce Big Data noted that with so many new innovations and buzzwords being thrown around in this area, the result is a high degree of confusion among some enterprises.

Therefore, it will be important for firms to understand what options will be best for their requirements. Kognitio's chief innovation officer Paul Groom explained to the publication that in-memory is typically well-suited for situations where information needs to be accessed repeatedly, at high speed.

He said: "In-memory is relevant for data that is under the microscope, data that is being analyzed in detail with many complex methods, such that there is high-frequency of access and low latency has increasing value."

While a traditional query may touch a piece of data only once or twice, the more complex analytics tools that enterprises increasingly use to handle large quantities of information can touch the same data hundreds or even thousands of times, depending on the algorithm being run.
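As a rough illustration of that difference in access patterns (not a description of any vendor's implementation), the toy Python sketch below contrasts a single-pass query with an iterative algorithm that re-reads the whole dataset on every iteration; the dataset size, cluster count and iteration count are purely hypothetical.

```python
import numpy as np

# Illustrative only: a single-pass query versus an iterative analytic
# (a toy k-means loop) that rescans the same data on every iteration.

data = np.random.rand(100_000, 3)     # dataset held in memory

# "Traditional" query: each row is touched once
total = data[:, 0].sum()

# Iterative analytics: every iteration is another full scan of the data
centroids = data[np.random.choice(len(data), 4, replace=False)]
for _ in range(100):                  # 100 iterations = 100 full passes
    # assign every row to its nearest centroid (touches all rows)
    labels = np.argmin(
        np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2),
        axis=1,
    )
    # recompute centroids (touches all rows again)
    centroids = np.array([data[labels == k].mean(axis=0) for k in range(4)])
```

When data is accessed this repeatedly, the cost of reaching it on each pass dominates, which is why low-latency, memory-resident access pays off.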

Therefore, if firms need the fastest, most up-to-date results, in-memory analytics is likely to be a much more effective solution than traditional alternatives.

However, organizations will need to make sure that what they are getting is a genuine in-memory solution. Fierce Big Data noted that confusion surrounding the definition of the term means many vendors are repackaging older technologies as having in-memory capability.

In many cases, companies are offering extended cache solutions as in-memory, but this is a misrepresentation that businesses need to be wary of.

Mr Groom explained that with cache-based systems, every input/output operation must include code that asks whether a page of data is in cache.

"If the answer is no, it has to issue an expensive request to 'fetch data from disk to cache' and this may have a ripple effect as the cache algorithm has to say, 'do I have space, if not, shuffle data,'" he stated.

In-memory, on the other hand, guarantees direct access to any byte of data, which removes extra code, resulting in a shorter path, fewer overheads and faster performance.
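By way of contrast with the cache sketch above, the equivalent read in a genuinely in-memory system reduces to a direct lookup. Again, this is an illustrative assumption rather than a description of a particular product.

```python
import numpy as np

# The whole dataset lives in RAM, so any element is reached by direct
# indexing: no cache check, no disk fetch, no eviction logic.
data = np.random.rand(1_000_000)

def read_value(i):
    return data[i]                      # direct access to any element

value = read_value(123_456)
```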

He added that businesses should not think of in-memory as a storage solution. Storage solutions require steps such as partitioning and indexing to guarantee performance, all of which have to be set up, which can be time-consuming and add to a firm's expenses.

Mr Groom said: "In-memory allows data models to be experimented with on the fly as you can afford to throw away and start again many times over in a day with no consequences – try doing that in a traditional platform."