New programming language ‘boosts big data speeds fourfold’

22 September 2016

A new programming language developed by researchers at the Massachusetts Institute of Technology (MIT) is claimed to increase the speed of big data processing by up to four times.

Called Milk, the language allows application developers to manage memory more efficiently in programs that deal with scattered data points in large data sets. In tests conducted using several common algorithms, programs written in the new language were shown to be four times as fast as those written in existing languages, but the researchers behind the language believe that further work will result in even larger gains.

Milk is intended to address one of the biggest barriers to the successful implementation of big data analytics – how efficiently programs gather the relevant data.

MIT explained that traditional memory management is based on the 'principle of locality' – that is, if a program requires a certain piece of data stored in a specific location, it is also likely to need the neighbouring data, so it will fetch this at the same time.

However, this assumption no longer applies in the era of big data, where programs frequently require scattered chunks of data that are stored arbitrarily across huge data sets. Since fetching data from their main memory banks is the major performance bottleneck in today’s computer chips, having to do this more frequently can lead to major performance issues.

Vladimir Kiriansky, a PhD student in electrical engineering and computer science and first author on the new paper, explained that returning to the main memory bank for each piece of information is highly inefficient. He said: "It's as if every time you want a spoonful of cereal, you open the fridge, open the milk carton, pour a spoonful of milk, close the carton, and put it back in the fridge."

The new programming language aims to overcome this limitation through the use of batch processing, by adding a few commands to OpenMP, an extension of languages such as C and Fortran that makes it easier to write code for multicore processors. 
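For readers unfamiliar with OpenMP, the snippet below is a minimal, standard OpenMP example in C – not Milk code – showing the annotation style involved: a single pragma is enough to spread a loop across cores. The Milk commands described above take the same form of lightweight directives added around loops.

/* A minimal standard OpenMP example in C (not Milk): one pragma parallelises
 * the loop across cores and safely combines each core's partial sum.
 * Compile with, for example, gcc -fopenmp; without OpenMP support the pragma
 * is simply ignored and the program still runs on a single core. */
#include <stdio.h>

int main(void)
{
    double total = 0.0;

    #pragma omp parallel for reduction(+:total)
    for (int i = 0; i < 1000000; i++)
        total += (double)i;

    printf("total = %f\n", total);
    return 0;
}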

When using the language, a programmer adds a few lines of code around any instruction that iterates through a large data collection looking for a comparatively small number of items. Milk's compiler then works out how to manage memory accordingly.

Using Milk, when a core needs a piece of data, instead of requesting it – and any adjacent data – from main memory, it adds the data item's address to a locally stored list of addresses. When this list is long enough, all the chip's cores pool their lists, group together those addresses that are near each other and redistribute them to the cores. This means that each core requests only data items that it knows it needs and that can be retrieved efficiently.
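The underlying idea can be illustrated with a small, hand-rolled C sketch. This is not Milk code and not its real interface – it simply shows the principle the article describes: record the addresses (here, array indices) of the items you need, group nearby ones together by sorting, and only then perform the fetches, so that accesses to neighbouring memory locations happen back to back.

/* Illustrative sketch only – not Milk's actual syntax. It contrasts fetching
 * scattered items one by one with first collecting and sorting their indices
 * so that neighbouring memory locations are visited together; Milk's compiler
 * and runtime automate an analogous gather-and-group step across cores. */
#include <stdio.h>
#include <stdlib.h>

#define TABLE_SIZE 1000000
#define LOOKUPS    10000

static int cmp_index(const void *a, const void *b)
{
    size_t x = *(const size_t *)a, y = *(const size_t *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    static double table[TABLE_SIZE];   /* the large data set                 */
    size_t wanted[LOOKUPS];            /* scattered items the program needs  */
    double naive_sum = 0.0, batched_sum = 0.0;

    for (size_t i = 0; i < TABLE_SIZE; i++) table[i] = (double)i;
    for (size_t i = 0; i < LOOKUPS; i++)    wanted[i] = (size_t)rand() % TABLE_SIZE;

    /* Naive approach: fetch each scattered item as soon as it is wanted,
     * paying a near-random trip to main memory every time. */
    for (size_t i = 0; i < LOOKUPS; i++)
        naive_sum += table[wanted[i]];

    /* Batched approach: record the addresses first, sort them so nearby
     * locations end up adjacent, then do the fetches in that order. */
    qsort(wanted, LOOKUPS, sizeof wanted[0], cmp_index);
    for (size_t i = 0; i < LOOKUPS; i++)
        batched_sum += table[wanted[i]];

    printf("naive sum = %f, batched sum = %f\n", naive_sum, batched_sum);
    return 0;
}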

Matei Zaharia, an assistant professor of computer science at Stanford University, noted that although many of today's applications are highly data-intensive, the gap in performance between memory and CPU means they are not able to fully utilise current hardware.

"Milk helps to address this gap by optimising memory access in common programming constructs. The work combines detailed knowledge about the design of memory controllers with knowledge about compilers to implement good optimisations for current hardware," he added.

Hadoop and NoSQL drive big data boom

20 September 2016

Investments in technologies such as Hadoop and NoSQL will underpin much of the growth in the big data analytics market in the coming years, with non-relational solutions set to increase at around twice the rate of the sector as a whole.

This is according to a new report from Forrester Research, which found that over the next five years, big data will grow at a rate of around 13 per cent a year. However, NoSQL is set for a compound annual growth rate of 25 per cent over the period between 2016 and 2021, while the projected figure for Hadoop is 32.9 per cent.
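As a rough illustration of what those compound annual growth rates imply (our own back-of-the-envelope arithmetic, not figures from the report), compounding the yearly rate over the five-year window gives:

\[
\text{value}_{2021} = \text{value}_{2016}\,(1+r)^{5}, \qquad
1.13^{5} \approx 1.84, \quad 1.25^{5} \approx 3.05, \quad 1.329^{5} \approx 4.15
\]

In other words, a 32.9 per cent CAGR corresponds to the Hadoop market roughly quadrupling over the period, against slightly less than a doubling for the big data market as a whole.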

It also noted that this year, some 30 per cent of organisations have implemented Hadoop, up from 26 per cent in 2015. Meanwhile, 41 per cent of professionals stated they had already implemented NoSQL or were expanding its use.

A further 20 per cent expect to undertake a NoSQL deployment in the next year. The report observed this is proving to be particularly useful for applications such as ecommerce and graph databases, while open source options can help companies reduce the cost of their big data initiatives.

Jennifer Adams, the Forrester analyst who authored the report, said: "Five years ago, big data was still a buzzword, but today, it's a standard business practice."

She added one of the key reasons for Hadoop's robust forecast is its ability to run data-intensive applications that legacy solutions would be unable to handle. For example, the report highlighted Arizona State University's Hortonworks cluster, which is used to store four petabytes of cancer genome data, as one scenario where Hadoop is breaking down barriers to research.

The need to manage huge volumes of data is also a key driver for NoSQL implementations, Ms Adams stated. For instance, eBay uses a MongoDB deployment that is able to store up to one billion live listings, while PayPal uses Couchbase to handle databases of one billion documents.

Cloud computing is another area set for growth, and is a major factor driving interest in Hadoop. The report noted: "Hadoop in the cloud allows the analysis of more data using cheaper infrastructure and enables faster advanced analytics."

As a result, the number of organisations using a cloud-based service to store unstructured data has increased from 29 per cent in 2015 to 35 per cent this year.

Elsewhere, a separate report from Forrester has identified 15 emerging technologies that are set to have a huge impact on the world over the next five years, and it is clear that being able to effectively analyse data will be critical if businesses are to take full advantage of many of them.

For example, the Internet of Things was named by Forrester as one of the top five innovations that will change the world by 2021. It noted this will drive "new levels of customer insight and engagement" by the end of the forecast period, but some firms will need to undergo organisational change in order to make the most of this.

Augmented and virtual reality, intelligent agents, artificial intelligence and hybrid wireless made up the rest of the research firm's top five.

How big data helps keep trains running on time

15 September 2016

Siemens – the largest engineering company in Europe – is trying to improve punctuality on Germany’s rail network by utilising big data analysis and predictive maintenance.

The most recent data from the European Commission, published in 2014, reveals that just over three-quarters (78.3 per cent) of the country's long-distance trains arrived on time – a figure only Portugal and Lithuania fell below.

But it is hoped that the introduction of big data analytics will change this, the Financial Times reports. A group of Siemens employees have fitted hundreds of sensors to the trains, which relay data back to the engineers about the condition of the locomotive’s parts.

By combining two industrial disciplines – big data and predictive maintenance – the firm is able to find out what needs to be repaired or replaced before any delays are caused. This means punctuality could return to Germany's rail network, helping it climb the rankings and keep commuters happy.

So far, this experiment has seen all but one of the 2,300 journeys monitored by Siemens in Spain arrive less than five minutes late. This has pushed the punctuality rate up from 89.9 per cent to 99.98 per cent, beating leader Finland's score of 95.4 per cent.

This is not a new idea for Siemens, as it realised back in 2014 that the Internet of Things could help it provide customers with more than just hardware, as it paired together sensors and connected devices.

From there, the company established a digital hub alongside its train manufacturing site in Allach, near Munich, where a group of specialists analyse the data generated by the sensors on the trains being monitored by the firm.

The group looks out for patterns or anomalies that could point to an issue on board one of the trains in its fleet. If something does need to be replaced, the specialists make sure it is done during regular maintenance checks, rather than causing disruption to regular services.
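As a purely illustrative sketch of that kind of anomaly flagging – assuming a simple rolling-average threshold rather than Siemens' actual analytics – a detector over a stream of sensor readings might look like this in C:

/* Toy anomaly detector: flag any reading that deviates from the rolling mean
 * of the previous WINDOW readings by more than THRESHOLD standard deviations.
 * A generic sketch for illustration, not Siemens' production method. */
#include <math.h>
#include <stdio.h>

#define WINDOW    20
#define THRESHOLD 3.0

static int is_anomaly(const double *history, int n, double reading)
{
    if (n < WINDOW) return 0;                    /* not enough history yet */
    double mean = 0.0, var = 0.0;
    for (int i = n - WINDOW; i < n; i++) mean += history[i];
    mean /= WINDOW;
    for (int i = n - WINDOW; i < n; i++) var += (history[i] - mean) * (history[i] - mean);
    double sd = sqrt(var / WINDOW);
    return sd > 0.0 && fabs(reading - mean) > THRESHOLD * sd;
}

int main(void)
{
    /* Simulated bearing-temperature readings with one obvious spike. */
    double readings[] = { 40.1, 40.3, 39.9, 40.2, 40.0, 40.4, 40.1, 39.8,
                          40.2, 40.3, 40.0, 39.9, 40.1, 40.2, 40.0, 40.3,
                          40.1, 40.2, 39.9, 40.0, 40.1, 40.2, 55.7 };
    int n = (int)(sizeof readings / sizeof readings[0]);

    for (int i = 0; i < n; i++)
        if (is_anomaly(readings, i, readings[i]))
            printf("reading %d (%.1f) looks anomalous - schedule a check\n", i, readings[i]);
    return 0;
}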

Renfe – the Spanish rail network that has partnered with Siemens – is so confident that the system works that it is offering commuters a refund if the service between Madrid and Barcelona is late by more than 15 minutes.

Gerhard Kress, head of Siemens' Mobility Data Services Centre, believes that the most important thing for rail networks is to avoid breakdowns, as just one can have a ripple effect and cause several services to then be delayed.
Mr Kress believes that his team has got the knowhow to make big data work hard to keep the trains running on time. He added: “We are essentially building on the know-how that Siemens has developed over the years for other types of applications, namely in healthcare and gas turbine operations.

“We have also massively invested in building our team. All our scientists not only have PhDs in data science, machine learning or mathematics, but also a background in mechanical engineering.”

Shop Direct highlights big data’s impact as profits rise

13 September 2016

Online retailer Shop Direct has highlighted its investments in big data and machine learning as among the reasons for its recent success, as it unveiled revenue of £1.86 billion for the first full year since the implementation of these technologies.

Computing magazine reports that the company, which runs brands including Littlewoods.com and Very.co.uk, reported profits of £150.4 million for the period, an increase of 43 per cent year-on-year.

Alex Baldock, chief executive of the group, said this success has been down to a greater focus on new technology, and it is something the firm will continue to pursue in the coming years.

“This was the year our investments in technology really started to pay off," he said. "We've made big strides in m-commerce, big data and personalisation. But there's a lot more to play for in these areas."

For instance, he highlighted artificial intelligence as a solution that can "change the game" for the company when it comes to how it uses data. Mr Baldock said that Shop Direct has already begun to deploy this technology and is serious about taking it much further.

In its report, Shop Direct highlighted how it uses machine learning technology to improve its offerings, such as delivering personalised recommendations to its customers based on their buying habits.

“Driven by machine learning, the group is now personalising more customer touchpoints than ever, from customer emails and off-site advertising to homepage content, on-site sort orders, top navigation menus and product recommendations deeper within the shopping journey," the company stated.

It has also begun trialling more personalised services using this data, in order to build "deeper relationships" with its customers.

Shop Direct has made big data analytics a key part of its business since it abandoned its traditional print catalogue-based business in January 2015 in favour of a completely digital offering.

Earlier this year, Neil Chandler, chief executive of financial services at the company, explained to Computing how it has spent six years transforming itself from a catalogue firm into a "world-class" leader in digital retail.

Its efforts include a personalised sort order tool, which compiles a list of suggested products based on a user's history and ensures these appear at the top of the user's search results page.

This is something that's particularly important as more of the firm's customers switch to mobile devices, where space is at a premium. Mr Chandler explained: "On mobile, people aren't going to keep swiping down if they are looking for a black dress and there are 100 to choose from – they'll probably see nine at best.

"So the aim is to work out how analytics can help to curate and show the best nine black dresses for the customer that are in stock, that match the fashion preferences and are in the right price range."

Tools such as this will be hugely valuable in the coming years, as Shop Direct's results indicate there has been a 46 per cent increase in the number of orders placed via smartphones in the last year, while the company's apps have been downloaded over a million times across Android and iOS.

Software to dominate as big data market grows to $72 billion

12 September 2016

This year is set to see the big data analytics market grow into a sector worth $46 billion (£34.6 billion) in revenues, as the technology continues to become commonplace across businesses of all kinds.

According to predictions from SNS Research, the rise is set to continue for the coming years, with business spending forecast to reach $72 billion by the end of 2020, as firms invest in big data hardware, professional services and software.

The company has predicted a compound annual growth rate (CAGR) of around 12 per cent over the next four years, with more than $7 billion in spending expected to shift from hardware to software by the end of the decade.

This suggests that the challenges big data has previously faced – such as privacy concerns and organisational resistance – are a thing of the past.

The report – entitled 'Big Data Market: 2016 – 2030: Opportunities, Challenges, Strategies, Industry Verticals and Forecasts' – attributes the growth to several different factors, rather than just a single component.

SNS Research provided a detailed assessment of the big data ecosystem, covering elements including key market drivers, investment potential, vertical market opportunities, value chain, future roadmaps and data from the vendor markets.

It noted: "Amid the proliferation of real time data from sources such as mobile devices, web, social media, sensors, log files and transactional applications, big data has found a host of vertical market applications, ranging from fraud detection to scientific research and development."

While the last few months have been relatively quiet, this is no indication that the market is slowing down. SNS Research suggests that, instead, it reflects the sector establishing big data analytics as a crucial element for businesses.

Big data is no longer in its infancy, meaning there are no longer a slew of groundbreaking product releases hitting the news. Instead, vendors are having to take a new approach that focuses on the details.

Stanford offers guidance on big data ethics for education sector

7 September 2016

Stanford University has launched a new resource that aims to offer guidance and set out best practices for the responsible use of big data in the higher education sector.

The project saw researchers from the university and Ithaka S+R, a nonprofit education consulting firm, bring together 70 experts from academia, government, nonprofits and the commercial education technology industry to debate some of the biggest issues surrounding the use of data in education, and how these could be tackled.

The result is a new website, Responsible Use of Student Data in Higher Education, which launched on September 6th and aims to clarify the often unclear rules surrounding what can and cannot be done with this technology.

Martin Kurzweil, director of the educational transformation program at Ithaka S+R, explained that many educational institutions are currently worried about potential issues with over-reach when it comes to using personal student data in their research, which leads to much of the potential of this information going untapped.

Many colleges and universities are therefore restricting researchers' access to student data as they are unsure how to remain compliant with pre-existing data protection laws, while at the same time, professors and students are freely downloading apps or using online education services that generate usable data, often without their schools’ knowledge.

"A lot of players are moving in to fill those gaps and it's not always clear how they’re using student data," Mr Kurzweil continued.

Therefore, the new resource should provide colleges and universities with answers to the ethical questions they face when dealing with big data analytics.

There are four central ideas that underpin the guidelines. The first is that all parties in the higher education sector, including students and technology vendors, recognise that data collection is a joint venture that needs clearly defined goals and limits.

Secondly, students must be well informed about what data is being collected and analysed, and be allowed to appeal if they feel analytics processes lead to misinformation.

The third principle emphasises that schools have an obligation to use data-driven insights to improve their teaching, while the fourth establishes that education is about opening up opportunities for students, not closing them.

Mitchell Stevens, a sociologist and associate professor at Stanford Graduate School of Education, stated: "We're standing under a waterfall, feasting on information that's never existed before. All of this data has the power to redefine higher education." 

However, while the goal of researchers is to use big data to deliver a 'deeply personalised' learning environment that both keeps struggling students from dropping out and helps star performers excel, concerns have been expressed about the potential for information gathered from students to be misused.

As well as worries that sensitive data will be sold to third parties or stolen, there are fears that big data could have a negative impact on students' progress. For instance, if information derived from big data indicates that certain student profiles struggle in a core course, could students fitting this profile be prevented from taking the class or encouraged to take a different path, based solely on big data insight?

US organizations ‘insufficiently prepared for IoT security risks’

1 September 2016

US organizations are ramping up their investment in Internet of Things (IoT)-enabled devices at the moment – but do not always seem to be taking the security precautions necessary to defend against the advanced range of threats the technology can bring about.

This is according to a new report from security solutions provider Tripwire, which surveyed more than 220 information security professionals attending the 2016 Black Hat USA conference, finding that many companies may not have adequately prepared themselves for the new technology paradigm that the IoT represents.

When asked whether their organizations had prepared for the security risks associated with IoT devices, only 30 per cent responded in the affirmative, with 37 per cent saying they had not but intended to do so soon, while a further 27 per cent replied with a simple "no". Additionally, five per cent of those polled simply said they were not concerned about IoT security risks.

This is despite the fact that 78 per cent of respondents to the survey said they were worried about the weaponization of IoT devices for use in DDoS attacks – events that can severely impact the running of a business and create significant risk of reputational damage.

Dwayne Melancon, chief technology officer and vice-president of research and development at Tripwire, said: "The large number of easily compromised devices will require a new approach if we are to secure our critical networks. Organizations must respond with low-cost, automated and highly resilient methods to successfully manage the security risk of these devices at scale."

This lax attitude to IoT security is being observed even at a time when only ten per cent of companies say they do not expect the number of IoT devices on their networks to increase in 2017. By contrast, 21 per cent expect to see this number increase by up to ten per cent, while 22 per cent are anticipating a rise of at least 20 per cent, 19 per cent expect an increase of 30 per cent and nine per cent forecast a rise of 40 per cent. Meanwhile, 18 per cent of respondents said their number of IoT-connected systems will surge by at least 50 per cent.

Tripwire's report shows that this rapid growth is not always being accompanied by proper monitoring of the technology. When asked if their organization accurately tracks the number of IoT devices on their network, only 34 per cent gave a positive response, compared to 52 per cent who responded negatively and 15 per cent who said they did not know.

Tim Erlin, director of IT security and risk strategy for Tripwire, said: "The IoT presents a clear weak spot for an increasing number of information security organizations. As an industry, we need to address the security basics with the growing number of IoT devices in corporate networks.

"By ensuring these devices are securely configured, patched for vulnerabilities and being monitored consistently, we will go a long way in limiting the risks introduced."

CSA unveils big data security best practice guidelines

31 August 2016

The Cloud Security Alliance (CSA) has unveiled 100 best practices for cloud users to take note of when gathering and handling big data in a bid to improve cyber security on a global scale.

It has split its recommendations into ten categories, with ten key points to follow in each one. The first of these centres on the importance of securing computation in distributed programming frameworks to prevent information leaks and to improve compliance, while the second focuses on improving the general security of non-relational data stores.

Thirdly, the CSA wants to see organisations improving the way that they store and log big data, as without adequate encryption methods, data is placed at significant risk of a breach.

Improving endpoint validation and monitoring security and compliance in real time are two more areas that the alliance has provided ten recommendations apiece for, as are ensuring user privacy and big data cryptography. In terms of the latter, the report discusses how advances in this area will benefit businesses in the future, suggesting that they begin following cryptography best practices now wherever possible to ensure they are prepared and one step ahead of their competitors when the time comes.

The eighth and ninth categories on the CSA's list concern granular access – granting users access to data at the most fine-grained level possible – and granular audits.

Lastly, data provenance has been named as a key area of focus by the CSA, but is something that can only be achieved by tightening operations in each of the nine aforementioned categories. When discussing big data, provenance refers to having a thorough record of all of the people who have access to manage an organisation's big data to allow the security of the information to be monitored accordingly.

Commenting on the report, J R Santos, executive vice-president of research at the CSA, stated: "This is an important initiative for the cloud community, as new security challenges have arisen from the coupling of big data with public cloud environments. 

"As big data expands through streaming cloud technology, traditional security mechanisms tailored to secure small-scale static data on firewalled and semi-isolated networks are inadequate."

The CSA wants all businesses to be adopting the same practices when it comes to protecting big data in order to make them less vulnerable to attacks and to ensure a strong level of security across the board. With big data continuing to increase in importance and more and more firms beginning to harness the insights it can deliver, it is vital that companies start implementing best practice protection measures sooner rather than later.

Mr Santos concluded: "Security and privacy issues are magnified by this volume, variety and velocity of big data. This handbook serves as a comprehensive list of best practices for companies to use when securing big data."

US govt turns to big data to fight security breaches

30 August 2016

A significant majority of federal US government agencies have adopted big data analytics tools in order to improve their cyber security capabilities, a new report has found.

The 'Navigating the Cybersecurity Equation' study, conducted by MeriTalk on behalf of Cloudera, revealed more than four-fifths of organisations (81 per cent) are using this technology in some form. Some 53 per cent of agencies reported big data has a role to play in their overall cyber security strategy, while 28 per cent stated they were using it in a more limited capacity.

However, many federal agencies are still struggling to keep up with the threats posed by hackers and other cyber criminals. Nearly six out of ten respondents (59 per cent) reported their organisation deals with a cyber security incident at least once a month due to their inability to effectively analyse the data they have available.

One of the most common problems is that agencies find it difficult to manage these resources. Some 88 per cent of respondents admitted they struggle to turn their data into useful cyber security intelligence, while almost half (49 per cent) said the sheer volume of information is overwhelming. 

Meanwhile, one in three agencies (33 per cent) said they don't have the right systems in place to gather the data they need, and 30 per cent found that by the time the information makes it to cyber security managers, it has become outdated.

Efforts to make the most of data are also hampered by budget issues, a lack of internal skills, privacy concerns and a lack of management support and awareness of such projects. As a result, more than 40 per cent of data gathered by federal agencies goes unanalysed.

Rocky DeStefano, cyber security subject matter expert at Cloudera, stated that as both internal and external cyber security threats are evolving on a daily basis, it is vital that government agencies are able to unlock the power of data in order to combat these dangers.

"Agencies need complete visibility into the data across their enterprise," he said. "These teams also need the ability to flexibly analyse that data in a meaningful timeframe so they can detect advanced threats quickly, identify the impact and reduce the associated risk."

Those agencies that are able to effectively adopt big data analytics tools have seen significant improvements in their cyber security defences. Some 84 per cent of big data users reported that their agency had successfully used the technology to prevent an attack, while 90 per cent have seen a decline in overall breaches.

These successes mean that agencies are keen to increase their investments in big data technology. Some 94 per cent of respondents stated their agency has plans to invest in big data analytics in the next two years, with key areas for improvement including technology infrastructure (61 per cent), hardware (52 per cent), and business intelligence tools/analytics (52 per cent).

Steve O'Keeffe, founder of MeriTalk, said: "Agencies face a perfect storm of cybersecurity threats. When you're headed into troubled waters, you need a weather forecast. Big data provides agencies with the visibility to ensure they don't end up the river without a paddle."

Lack of skills holding back adoption of big data

25 August 2016

Key technology projects such as big data analytics, Internet of Things (IoT) deployments and artificial intelligence (AI) are being held back because businesses lack the skills to properly implement them.

This is the finding of a new report by Capita and Cisco, which revealed that while there is widespread awareness of the importance of these solutions, there is a strong disconnect between this awareness and actual implementation.

For example, nine out of ten respondents (90 per cent) agreed that big data analytics is relevant to their business. However, just 39 per cent stated they were currently utilising this technology.

One reason for this is that nearly two-thirds of businesses (64 per cent) did not have the skills to recognise how they could use big data within their company.

A similar pattern was seen for IoT. Although 70 per cent of IT decision-makers stated their business could benefit from this technology, 71 per cent said they did not have the skills within their organisation to identify any opportunities for growth afforded by it.

What's more, 80 per cent of respondents said they did not have the skills to capitalise on the data they received from IoT. As a result, just 30 per cent said this technology was being implemented in their business.

One of the biggest disconnects was related to the deployment of AI technologies. Only eight per cent stated this is being adopted in their own company, even though half of businesses agreed this would be relevant to them. This may be because 80 per cent of respondents don't have the skills to implement or keep up to date with trends and developments in AI.

Adam Jarvis, managing director at Capita Technology Solutions, said: "It's clear that there are several important, technology-led trends which have the capacity to transform the way business is done.

"Whilst it is encouraging that levels of awareness around the strategic benefits of those trends are high, these results suggest more needs to be done to support businesses and help them close what is a substantial skills gap."

When it comes to big data, the difficulty of getting this technology to operate alongside legacy infrastructure was named as the biggest barrier to adoption. Other issues named by IT decision-makers included cost and data governance questions.

Meanwhile, the risk of security breaches was named as the biggest hurdle to IoT adoption, followed by data governance and overcoming problems created by adapting legacy IT systems.

"Without the necessary skills and infrastructure needed to implement trends such as IoT and big data, businesses across the board will suffer long-term competitive disadvantage; it is up to us as an industry to find the best and right ways to deliver that support," Mr Jarvis continued.
