Last week, efforts to make big data analytics a less fragmented and more accessible discipline took a step forward with the publication of a first draft of the National Institute of Standards and Technology (NIST)'s Big Data Interoperability Framework.

The set of documents comprises nine volumes on subjects ranging from definitions and taxonomies, through potential use cases, to security and privacy concerns.

Between now and May 21st, the US federal agency will seek comments from commercial, academic and government stakeholders on how well the framework serves the requirements of their own diverse big data projects, before submitting it to the international standards community for further development.

In a statement issued to coincide with the launch, NIST pointed to the potential scientific implications of big data as a key impetus for standardizing the discipline, thereby making it accessible to researchers without the resources necessary to build and integrate their own analytics systems from scratch.

"The availability of vast data resources carries the potential to answer questions previously out of reach," the statement read.

"Questions such as: how do we reliably detect a potential pandemic early enough to intervene? Can we predict the properties of new materials even before they've been synthesized? How can we reverse the current advantage of the attacker over the defender in guarding against cyber security threats?"

However, before big data can be harnessed to those ends, the standards body believes greater agreement must be reached on a number of fundamental questions as to its precise nature.

This will include coming to a consensus on "the attributes that define big data solutions", as well as those that differentiate big data from the data environments that businesses, researchers and public sector organizations have used in the past.

It also means improving understanding and awareness of the "central scientific, technological and standardization challenges" that have to date slowed or stopped the adoption of game-changing analytics systems.

Wo Chang, NIST's digital data advisor, explained in the statement: "One of NIST's Big Data goals was to develop a reference architecture that is vendor-neutral, and technology and infrastructure-agnostic, to enable data scientists to perform analytics processing for their given data sources without worrying about the underlying computing environment."

Confusion and low levels of confidence and understanding are regularly highlighted as key barriers to big data deployments today, underlining the importance of standardization even as many stakeholders begin to accept that adoption is an inevitability and a competitive necessity.

For example, a recent survey from SnapLogic found that while 52 per cent of large US enterprises are investing in big data for customer analytics, the same proportion feel it is too soon to say how these solutions will integrate effectively with their existing information management infrastructure.

"Our survey shows there's a good bit of indecision right now when it comes to big data plans and technologies," said Darren Cunningham, the firm's marketing vice president. "At the same time, the results show strong interest for using big data to achieve business goals."