‘Cognitive storage’ aims to cut the cost of big data

6 April 2016

One of the key challenges for any organisation embarking on a big data project will be ensuring that costs are kept under control – something that is not always easy to do when firms are collecting and storing huge amounts of information.

To tackle this issue, IBM has revealed it is working on a new method for automatically classifying information, so that the most relevant data is always readily available.

Known as 'cognitive storage', the solution involves assigning a value to incoming data, determining which data should reside on which type of media, what levels of data protection should apply and what policies should be set for the retention and lifecycle of different classes of data, Computer Weekly reports.

IBM researcher Giovanni Cherubini explained the most obvious answer to the challenge of handling large amounts of data while keeping costs low is to have tiers of storage – such as flash and tape solutions – with the most important data held on the fastest media.

The machine learning tool aims to assess the value of data and direct it to the most appropriate tier by studying metadata, analysing access patterns and learning from the changing context of data use.

IBM researcher Vinodh Venkatesan added: "Administrators would help train the learning system by providing sample files and labelling types of data as having different value."
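
As a rough illustration of how such a system might work, the sketch below trains a simple classifier on admin-labelled sample files and maps the predicted value class to a storage tier. It is a minimal sketch only, not IBM's implementation: the metadata features, labels and tier mapping are invented for the example.

```python
# Minimal sketch of value-based tiering, not IBM's implementation.
# The metadata features and admin-supplied labels are hypothetical.
from sklearn.ensemble import RandomForestClassifier

# Each sample file: [size_mb, days_since_last_access, accesses_last_30d, is_transactional]
training_files = [
    [0.5, 1, 120, 1],    # frequently used transactional record
    [2.0, 3, 45, 1],
    [150.0, 200, 0, 0],  # old sensor dump, rarely touched
    [80.0, 400, 1, 0],
    [1.2, 10, 15, 0],    # routine email archive
    [0.8, 30, 2, 0],
]
labels = ["high", "high", "low", "low", "medium", "medium"]  # admin-assigned value classes

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(training_files, labels)

TIER_FOR_VALUE = {"high": "flash", "medium": "disk", "low": "tape"}

def place(file_features):
    """Predict a value class for incoming data and map it to a storage tier."""
    value = model.predict([file_features])[0]
    return TIER_FOR_VALUE[value]

print(place([0.3, 0, 200, 1]))    # hot transactional data -> likely 'flash'
print(place([500.0, 365, 0, 0]))  # cold archive data -> likely 'tape'
```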

The challenge for business users is that they will have a large variety of data – from business-critical transactional data to emails, machine sensor data and more – so it will be essential that any cognitive storage system is able to categorise all of it correctly.

Mr Venkatesan said: "For an enterprise, there are ‘must keep’ classes of data and these could be set to be of permanently high value. But that is a small proportion in an enterprise. 

"The rest, the majority, which cannot necessarily be manually set, can be handled by cognitive storage – such as big data-type information and sensor information that might have value if analysed."

How big data is helping transform the logistics industry

30 March 2016

By some estimates, e-commerce now accounts for more than half of all retail sales, making it something no business can afford to ignore if it wants to be successful.

But despite the huge range of innovations that have improved the sector over recent years, the final part of the process – physically getting goods from a business into the hands of the consumer – has remained largely unchanged. Although offerings like click-and-collect give consumers more choice, for many people it is still a matter of sitting around waiting for a courier to show up.

And for the logistics industry itself, it is this part of the process – the so-called 'last mile' between local distribution centres and the customer's home – where the challenges lie, as this is typically the slowest and least cost-effective part of its operations.

However, this is starting to change as more organisations adopt big data analytics and the Internet of Things to give them more insight into where they can make improvements.

Speaking at a supply chain conference recently, Matthias Winkenbach, director of the Massachusetts Institute of Technology’s (MIT's) Megacity Logistics Lab, said these innovations can be a powerful resource for the sector, if businesses are able to effectively harness them.

The Wall Street Journal reports that one of the big challenges is that, while companies have large amounts of data available to them, they often don't know what to do with it, or even understand what it is telling them. But the team at the MIT Megacity Logistics Lab is looking to change this, working with companies such as Anheuser-Busch and Brazilian e-commerce firm B2W to examine what lessons can be learned from last-mile analytics.

For instance, Dr Winkenbach noted that while data-collecting tools can be used to track the progress of delivery vehicles and identify patterns in delivery times in order to better inform route planning, they can also provide "transactional data" that gives a clearer picture of what happens between a delivery truck and a customer's doorstep.

He explained that many shippers want to know why some drop-offs take longer than others, a question that was hard to answer in the past as there was very little data available other than that provided by the truck itself.

But advanced geospatial information reveals that longer drop-offs tend to occur in the most densely populated parts of a city, where many people live in high-rise apartments. Dr Winkenbach said this indicates that delivery workers are struggling to find parking, having to walk farther after they park and climbing stairs when they arrive.
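
A simple way to surface this kind of pattern is to band delivery stops by local population density and compare average drop-off times across the bands. The sketch below is purely illustrative: the column names and figures are made up, and it is not MIT's actual pipeline.

```python
# Illustrative only: hypothetical stop-level data, not MIT's actual analysis.
import pandas as pd

stops = pd.DataFrame({
    "stop_id":          [1, 2, 3, 4, 5, 6],
    "drop_off_minutes": [4.0, 11.5, 3.5, 9.0, 13.0, 5.0],
    "people_per_sq_km": [1200, 14000, 900, 9500, 16000, 2500],
})

# Band each stop by population density, then compare average service times.
stops["density_band"] = pd.cut(
    stops["people_per_sq_km"],
    bins=[0, 3000, 10000, float("inf")],
    labels=["low", "medium", "high"],
)
print(stops.groupby("density_band", observed=False)["drop_off_minutes"].mean())
```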

It can also help logistics providers get a better picture of consumer behaviour, such as pinpointing customers who are typically not at home during delivery hours. This is not usually taken into account in route planning, but it can greatly affect the efficiency of operations.

The WSJ notes this is all information that can be used to help create more efficient routes, inform training programmes and determine the most suitable delivery vehicles. For example, the data may show that multiple short-route deliveries on smaller vehicles, including bicycles, make more sense than bulk deliveries in large trucks.

The result should be more effective delivery routes that not only save money for logistics firms, but also allow consumers to get their goods faster, improving satisfaction for buyers and retailers alike.

Streaming analytics to be key big data priority for 2016

25 March 2016

Improved streaming analytics capabilities, self-service solutions and advanced in-memory tools will be among the key big data analytics technologies businesses will be investigating throughout 2016, a new report has found.

A study by Forrester Research notes that the development of big data technology is reaching a new stage this year, as organisations aim to embed the solutions into the applications that power their operations.

InformationWeek reports that the main priorities for 2016 indicate a change in businesses' attitudes and approach to big data. In previous years, companies were still struggling to get to grips with the technology and how they could extend their existing solutions to incorporate big data.

Now, however, there is a much higher level of confidence in the tools available. Brian Hopkins, research vice-president and principal analyst at Forrester, said: "Forrester has seen an explosion in client adoption of big data since we first wrote about it in 2011. For example, the number of firms implementing streaming analytics, a key leading-edge big data technology, more than doubled between 2012 and 2015."

The research firm defines streaming analytics software as tools that are capable of filtering, aggregating, enriching and analysing a high throughput of data from multiple disparate live data sources, in any data format. By identifying both simple and complex patterns, such tools enable businesses to get a real-time picture of their operations, detect urgent situations and automate immediate actions.

Increased demand for tools such as Apache Spark illustrates the challenges businesses are facing when it comes to processing and analysing data in real time, the report continued.
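
To make the idea concrete, the sketch below shows what a small streaming job of this kind could look like using Spark's DStream API: it parses a live feed, keeps only urgent readings and maintains a rolling one-minute count per source. The socket source, event format and threshold are assumptions for illustration, not a reference implementation.

```python
# Minimal Spark Streaming sketch (DStream API): filter, enrich and aggregate
# a live feed in short micro-batches. The socket source and event format
# (sensor_id,reading) are assumptions made for this example.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext("local[2]", "StreamingAnalyticsSketch")
ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

lines = ssc.socketTextStream("localhost", 9999)
readings = lines.map(lambda line: line.split(","))    # parse "sensor_id,reading"
hot = readings.filter(lambda r: float(r[1]) > 90.0)   # keep urgent readings only
counts = hot.map(lambda r: (r[0], 1)) \
            .reduceByKeyAndWindow(lambda a, b: a + b, None,
                                  windowDuration=60, slideDuration=10)
counts.pprint()  # near-real-time picture: alerts per sensor over the last minute

ssc.start()
ssc.awaitTermination()
```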

Elsewhere, newer, more business-focused goals for big data analytics will see higher demand for advanced analytics solutions that make use of in-memory computing and data preparation tools.

"To keep pushing revenue growth and digital customer experience transformation, big data technology is expanding its scope," wrote Hopkins. "It must also address the scale, speed, and integration requirements necessary to embed insights into the very fabric of next-generation, insights-driven businesses."

Overall, the report found that many organisations are set to make significant investments into big data this year, with more than six out of ten firms in Europe and North America planning to have systems in place by the end of 2016.

However, many organisations are said to be at a crossroads when it comes to making decisions on their big data deployments. Although many of Forrester's clients are looking to invest in technology for real-time data processing and user and customer self-service, they are often unsure of which option to choose amid a crowded field of open source and commercial vendors.

Converged approaches to data ‘among key big data trends’ for 2016

22 March 2016

A move away from centralised data storage approaches, converged analytics platforms and a greater focus on value and quality will be among the key trends facing the big data industry in 2016.

This is according to co-founder and chief executive of MapR John Schroeder, who wrote in an article for IT Pro Portal that as big data analytics has moved beyond a buzzword to become an essential part of many organisations' strategy, it is transforming the enterprise computing environment.

However, this is an area that's constantly evolving. "With many new innovative technologies on the horizon, not to mention a particularly noisy marketplace, differentiating between what is hype and what is just around the corner can be challenging," Mr Schroeder noted.

Therefore, he highlighted several key trends that all businesses looking to improve their big data analytics capabilities will have to consider in 2016.

One of the key areas of focus will be an effort to develop more converged analytics environments. Mr Schroeder said that in the past, it has been accepted best practice to keep operational and analytic systems in separate business applications, in order to prevent analytic workloads from disrupting operational processing.

But this attitude is changing as new tools emerge that can use in-memory data solutions to perform both online transaction processing (OLTP) and online analytical processing (OLAP) without the requirement for data duplication.

"In 2016, converged approaches will become more mainstream as leading organisations reap the benefits of combining production workloads with analytics in response to changing customer preferences, competitive pressures, and business conditions," the MapR chief executive stated. This convergence will also speed up the 'data to action' cycle and removes much of the latency between analytical processes its impact on business performance.

Mr Schroeder also forecast that 2016 will see a shift away from centralised workload and processing models towards more distributed solutions. One reason for this will be to better deal with the challenges of managing multiple devices, data centres and global use cases across multiple locations.

Changes to overseas data security and protection rules brought about by the nullification of the EU-US Safe Harbor agreement will also dictate how companies store, share and process large quantities of data. With Safe Harbor 2.0 on the horizon and set to bring in new restrictions, global companies will need to re-evaluate their approach to cross-border data storage, which will in turn affect their analytics activities.

Elsewhere, it was predicted that 2016 will see the market focusing far less on the "bells and whistles" of the latest products, and more on established solutions that have proven business value.

"This year, organisations will recognise the attraction of a product that results in a tangible business impact, rather than on raw big data technologies – which, while promising an exciting new way of working, really just cloud the issues at hand," Mr Schroeder said.

Ultimately, vendors that are able to demonstrate quality will win out in 2016 as businesses demand proven, stable solutions to meet their requirements for better operational efficiency. 

"Now more than ever, an organisation's competitive stance relies on its ability to leverage data to drive business results. That's easier said than done when it’s pouring in from every origin imaginable," Mr Schroeder said.

4 in 10 businesses to adopt IoT solutions this year

17 March 2016

The number of companies adopting Internet of Things (IoT) technology is set to see significant growth in the coming years, which is likely to place additional pressure on businesses to develop solutions to make the best use of the data they gather.

IoT is a technology that often goes hand in hand with big data analytics, due to the large volume and variety of real-time information connected devices are capable of collecting. And for a growing number of organisations, the tools are set to play a pivotal role in how they operate.

Research from Gartner revealed that less than a third of companies (29 per cent) are currently using IoT. However, it also found that an additional 14 per cent are planning to implement the technology within the next 12 months.

Overall, almost two-thirds of businesses (64 per cent) agreed that they will deploy IoT eventually, while just nine per cent of respondents stated the solutions will have no relevance to their sector.

However, adoption rates varied widely across sectors, with Gartner noting that heavy industries such as utilities, oil and gas, and manufacturing are leading the way. By contrast, more service-oriented businesses are still lagging.

Chet Geschickter, research director at Gartner, stated there are two key reasons why only a third of businesses have currently implemented IoT. He said: "The first set of hurdles are business-related. Many organisations have yet to establish a clear picture of what benefits the IoT can deliver, or have not yet invested the time to develop ideas for how to apply IoT to their business."

The second problem lies within organisations themselves, as many companies lack the skills and leadership needed to make their initiatives a success.

"2016 will be a very big year for IoT adoption. We are starting to see a wide range of IoT use cases across virtually all industries. But the big challenge now is demonstrating return on investment," Mr Geschickter said.

In many cases, turning details gathered from IoT devices into actionable results will demand a strong big data solution that is able to quickly and effectively process this information and translate it into a useable format.

Hadoop and Spark ‘key big data platforms’ in the UK

14 March 2016

Hadoop continues to lead the way as the preferred big data analytics platform for organisations in the UK, but Spark is starting to make inroads into its dominance.

This is according to recent research by Computing magazine, which found almost six out of ten respondents (59 per cent) believed their company will be using Hadoop as its primary analytical tool in 18 months' time.

This compares to 17 per cent who named Spark as the way forward for their business, while Kinesis (seven per cent), Storm (four per cent) and Flink (two per cent) received lower levels of interest. One in four IT professionals stated their business would be using another solution for big data processing.

However, the research found that more advanced organisations – described as those businesses that are leading the way when it comes to adopting and using technology to drive change – were more likely to favour Spark over Hadoop, suggesting that it is catching up.

Computing did offer a note of caution, observing that many businesses use Spark and Hadoop in conjunction with one another, so it may well be the case that even as interest in Spark grows, Hadoop is unlikely to be replaced any time soon. However, for the purposes of the survey, respondents were asked to choose only one processing platform, in order to see which are having the most impact on professionals' thinking.

Interviews conducted by Computing also saw Spark come up frequently, with the speed of the solution a commonly cited benefit. One chief technology officer noted that although it is much easier to find people with experience and understanding of Hadoop, tools such as Spark and Storm are "much more attractive and faster".

As the capabilities of Spark have grown, it has also become more attractive to companies with needs for both batch and real-time processing. One data scientist Computing spoke to noted that if users are looking to deploy new solutions, they will increasingly turn straight to Spark, rather than use tools such as MapReduce.

Insurance sector ‘to increase focus’ on big data

10 March 2016

Insurance providers in the UK are expected to rely more heavily on big data analytics solutions this year in order to provide more personalised offers to customers.

Research conducted by Teradata revealed that more than four-fifths of companies with a turnover of over £500 million (82 per cent) will be prioritising this technology in 2016. Smaller firms are still lagging behind in this area, with just 46 per cent of providers with under £500 million in turnover agreeing.

Despite the growing interest, there is still work to be done in some areas in order for the UK insurance sector to catch up with those in other countries. It was noted that, on average, 76 per cent of large firms in France and Germany are able to 'fully deploy' consumer data in order to make use of analytics, compared with just 63 per cent of British companies.

However, the UK was said to be ahead when it comes to incorporating technology such as the Internet of Things into its big data strategies. This is an area that is set to be particularly important to the insurance sector, with tools such as telematics increasingly being used in motor insurance to provide quotes that reflect a person's driving habits.
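
In outline, telematics-based pricing turns raw driving events into a behaviour score and then scales the quoted premium. The sketch below is a hypothetical example only: the event types, weights and discount cap are invented for illustration and do not reflect any insurer's actual model.

```python
# Hypothetical telematics sketch: score driving behaviour from sensor-derived
# events and adjust a quote. The weights and event types are invented.
def driving_score(harsh_brakes_per_100km, speeding_minutes_per_100km, night_share):
    """Return a 0-100 score; higher means lower-risk driving."""
    penalty = (4.0 * harsh_brakes_per_100km
               + 1.5 * speeding_minutes_per_100km
               + 20.0 * night_share)
    return max(0.0, 100.0 - penalty)

def personalised_premium(base_premium, score):
    """Scale a base premium: up to 25% off for the safest drivers."""
    discount = 0.25 * (score / 100.0)
    return round(base_premium * (1.0 - discount), 2)

score = driving_score(harsh_brakes_per_100km=2, speeding_minutes_per_100km=6, night_share=0.1)
print(score, personalised_premium(600.0, score))  # e.g. 81.0 478.5
```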

Three-quarters of companies in the UK described themselves as "very well equipped" to exploit this.

As well as analysing customer behaviour and preferences in order to provide more personalised quotes and offers, nearly three-quarters of large insurers (73 per cent) stated that they will be using big data to help tackle underwriting fraud.

However, insurers have been warned they need to be cautious in their use of consumer data, in order to avoid falling foul of privacy regulations and to avoid alienating their customers.

Computer Weekly reports that at a discussion event in the City of London, chief executive of civil liberties pressure group Big Brother Watch Renate Samson noted that the trend towards personalisation will not be allowed under the terms of the European Union's forthcoming General Data Protection Regulation, which will prohibit profiling or predicting on the basis of behaviour, attitudes or preferences.

She added: "People feel creeped out having their social media activity or web browsing watched. If an insurance or other financial services company comes to me offering a service and I realise they've been looking at my Facebook or Twitter, they will come a cropper."

Getting people engaged with the use of big data has also proved a challenge for insurance providers in the US, despite the potential savings that individuals stand to make as a result of allowing the use of technology such as telematics.

According to figures from US insurer Progressive, reported in a recent article in the Wall Street Journal, around 80 per cent of its customers would qualify for a discount on their premiums through the use of IoT solutions that monitor their driving behaviour. However, just a quarter of consumers have signed up for this, with a further 40 per cent stating they would never give their consent for this type of tracking.

Big data ‘to add £322bn’ to UK economy by 2020

7 March 2016

The value of big data analytics and Internet of Things (IoT) technology has been highlighted by a new report that forecast the solutions will add £322 billion to the UK's economy alone over the rest of the decade.

The paper is entitled 'The value of big data and the Internet of Things to the UK economy' and was published by the Centre for Economic and Business Research (Cebr) and SAS. It noted the figure is twice the size of the combined budget for education, healthcare and defence for 2014-15 and more than one-fifth of the UK's net public debt for that financial year.

Big data alone is expected to contribute an average of £40 billion a year to the UK economy between 2015 and 2020, and will be worth around 2.2 per cent of the country's gross domestic product by the end of the forecast period.

Manufacturing will be one of the big winners from this, with the sector expected to see a £57 billion boost between 2015 and 2020 as a direct result of big data. This is expected to be driven by the diversity of firms in the industry and the variety of areas in which efficiency gains can be achieved through the use of big data analytics.

For example, the study suggested it could lead to improvements in supply chain management and enhancements in customer intelligence.

By 2020, two-thirds of UK businesses (67 per cent) are expected to have adopted big data solutions, up from 56 per cent last year. The technology will be particularly prevalent in retail banking, with 81 per cent of companies in this sector deploying solutions.

IoT is set for a similar boom, with the adoption rate increasing from 30 per cent in 2015 to 43 per cent by 2020.

Chief executive of Cebr Graham Brough said: "Collecting and storing data is only the beginning. It is the application of analytics that allows the UK to harness the benefits of big data and the IoT. Our research finds that the majority of firms have implemented between one and three big data analytics solutions."

However, he added that the key to success will be not only making sure these tools are extracting maximum insight, but also ensuring that firms are able to turn those insights into business actions.

"IoT is earlier in its lifecycle, and will provide more data for analysis in areas that may be new to analytics, reinforcing the potential benefits to the UK economy," Mr Brough said.

The most common reason given for adopting big data analytics tools was to gain better insight into customer behaviour. More than two-fifths (42 per cent) of organisations surveyed stated that they use big data for this purpose.

A similar proportion of businesses (39 per cent) will be turning to IoT solutions in order to cut costs and gain insight into operational data, the report continued.

Joe Biden: Big data could assist cancer research

29 February 2016

Big data is making a huge difference to a number of companies and industries, allowing professionals to handle data more effectively than ever before and gather fascinating new insights. 

As the number of devices capable of connecting online increases, it is vital that companies find ways of interpreting data that can help to improve their products and services. Sectors such as marketing, retail and insurance have already reaped the benefits of big data. 

However, one area that is perhaps under-appreciated is how the technology could help to improve research into diseases, particularly cancer. US vice-president Joe Biden recently met healthcare specialists in Utah and explained how a better approach to sharing information will be necessary for new treatments to be realised.

How has big data helped to treat cancer?

Speaking to The Spectrum, Mr Biden emphasised how big data is helping to trace genetic and environmental factors that influence the disease, giving practitioners new knowledge that can be used to enhance their understanding. 

Mary Beckerle, Huntsman Institute CEO, told the news provider: “Half of those folks who succumb to cancer succumb to a cancer that could have been prevented. I think there's a really important emphasis to treat cancers that could have been prevented.”

Mr Biden has been visiting hundreds of doctors in a bid to improve federal engagement on curing cancer. He visited Duke University earlier this month and is set to make an appearance at the University of California San Francisco today (February 29th).

The work is part of the White House's cancer "moonshot" initiative, which aims to accelerate progress towards curing cancer. As well as providing $1 billion (£721 million) towards research, the government is hoping to generate new ideas for cancer treatment specialists across the country.

President Barack Obama is asking Congress for $755 million for cancer research in the coming budget, which would be on top of the $195 million already given the green light by officials last year. 

What developments could occur in the coming years?

Big data is still growing in the market and many industries should be able to improve how they manage information and increase the quality of their products and services. 

In recent years, devices such as laptops, tablets and mobiles have all become key technologies for businesses as employees adopt more flexible ways of working and, with big data now introduced by many companies, there is more information for managers to decipher than ever before. 

As analytics technologies improve, companies can easily identify new ways of making money and producing better products and services. Big data allows organisations to easily gain insights from complex statistics and ensures that information is being used as effectively as possible. 

For causes such as cancer research, where thousands of documents have been created for organisations to look at, big data makes it easier than ever before to collate and understand information. Without it, sorting through large numbers of files and trying to extract insights from them can be a painstaking process. 

Big data allows qualitative factors in investment management to be quantified

29 February 2016

The potential uses for big data seem almost limitless, and that applies to the investment management industry as much as any other area. A new report shows that factors that were traditionally considered qualitative can now be quantified through this very handy new area of expertise.

Big Data & Investment Management: The Potential To Quantify Traditionally Qualitative Factors, which has been produced by Citi Business Advisory Services does what it says on the tin. Potential is the most important word here, as so far investment managers have not embraced big data wholeheartedly, but that could all be about to change.

It is something that every investment manager in the land should be clamouring to address, as it could give them a competitive advantage. Having an information edge could put them ahead of those using traditional analytic techniques, reports Value Walk.

There are two main reasons for this: big data increases the amount of data that can be included in investment models, and it improves the speed at which that data can be processed. Both of these elements could be particularly vital for investment managers moving forward.

If a third persuasive argument were to be added, it would be that the variety of data that can be analysed has come on in leaps and bounds in recent years. This is a direct result of the growth of the internet, the advancement of social media and the Internet of Things, which can provide sensory readouts from physical objects.

In the process of these additional content sources being developed, a certain amount of datafication has occurred. If this is a word you have not yet come across – and why should you have, as it's a fairly new addition to the vocabulary of the subject – consulting the report may help. It defines the term as "the ability to render into data many aspects of the world that have never been quantified before".

With the volume of data, its variety and the speed at which it can be analysed all improving, an acceleration in the innovation of systematic trading models is expected. Things are likely to move on far more quickly than they have at any time in the last ten years. At the same time, the quantitative investment space is braced for similar steps forward, as the gap between traditional quantitative norms and modern qualitative research becomes smaller.

While a number of firms were surveyed in the making of the report, several pointed out that some of these changes remain purely aspirational at present. There are a number of obstacles that are getting in the way of big data adoption for investment managers, but should these be overcome, the future could certainly look bright.
