Stanford offers guidance on big data ethics for education sector

7 September 2016

Stanford University has launched a new resource that aims to offer guidance and set out best practices for the responsible use of big data in the higher education sector.

The project saw researchers from the university and Ithaka S+R, a nonprofit education consulting firm, bring together 70 experts from academia, government, nonprofits and the commercial education technology industry to debate some of the biggest issues surrounding the use of data in education, and how these could be tackled.

The result is a new website, Responsible Use of Student Data in Higher Education, which launched on September 6th and aims to clarify the often unclear rules surrounding what can and cannot be done with this technology.

Martin Kurzweil, director of the educational transformation program at Ithaka S+R, explained that many educational institutions are currently worried about potential issues with over-reach when it comes to using personal student data in their research, which leads to much of the potential of this information going untapped.

Many colleges and universities are therefore restricting researchers' access to student data as they are unsure how to remain compliant with pre-existing data protection laws, while at the same time, professors and students are freely downloading apps or using online education services that generate usable data, often without their schools’ knowledge.

"A lot of players are moving in to fill those gaps and it's not always clear how they’re using student data," Mr Kurzweil continued.

Therefore, the new resource should provide colleges and universities with answers to the ethical questions they face when dealing with big data analytics.

There are four central ideas that underpin the guidelines. The first is that all parties in the higher education sector, including students and technology vendors, recognise that data collection is a joint venture that needs clearly defined goals and limits.

Secondly, students must be well informed about what data is being collected and analysed, and be allowed to appeal if they feel analytics processes lead to misinformation.

The third principle emphasises that schools have an obligation to use data-driven insights to improve their teaching, while the fourth establishes that education is about opening up opportunities for students, not closing them.

Mitchell Stevens, a sociologist and associate professor at Stanford Graduate School of Education, stated: "We're standing under a waterfall, feasting on information that's never existed before. All of this data has the power to redefine higher education." 

However, while the goal of researchers is to use big data to deliver a 'deeply personalised' learning environment that both keeps struggling students from dropping out and helps star performers excel, concerns have been expressed about the potential for information gathered from students to be misused.

As well as worries that sensitive data will be sold to third parties or stolen, there are fears that big data could have a negative impact on students' progress. For instance, if information derived from big data indicates that certain student profiles struggle in a core course, could students fitting this profile be prevented from taking the class or encouraged to take a different path, based solely on big data insight?

US organizations ‘insufficiently prepared for IoT security risks’

1 September 2016

US organizations are ramping up their investment in Internet of Things (IoT)-enabled devices at the moment – but do not always seem to be taking the security precautions necessary to defend against the advanced range of threats the technology can bring about.

This is according to a new report from security solutions provider Tripwire, which surveyed more than 220 information security professionals attending the 2016 Black Hat USA conference, finding that many companies may not have adequately prepared themselves for the new technology paradigm that the IoT represents.

When asked whether their organizations had prepared for the security risks associated with IoT devices, only 30 per cent responded in the affirmative, with 37 per cent saying they had not but intended to do so soon, while a further 27 per cent replied with a simple "no". Additionally, five per cent of those polled simply said they were not concerned about IoT security risks.

This is despite the fact that 78 per cent of respondents to the survey said they were worried about the weaponization of IoT devices for use in DDoS attacks – events that can severely impact the running of a business and create significant risk of reputational damage.

Dwayne Melancon, chief technology officer and vice-president of research and development at Tripwire, said: "The large number of easily compromised devices will require a new approach if we are to secure our critical networks. Organizations must respond with low-cost, automated and highly resilient methods to successfully manage the security risk of these devices at scale."

This lax attitude to IoT security is being observed even at a time when only ten per cent of companies say they do not expect the number of IoT devices on their networks to increase in 2017. By contrast, 21 per cent expect to see this number increase by up to ten per cent, while 22 per cent are anticipating a rise of at least 20 per cent, 19 per cent expect an increase of 30 per cent and nine per cent forecast a rise of 40 per cent. Meanwhile, 18 per cent of respondents said their number of IoT-connected systems will surge by at least 50 per cent.

Tripwire's report shows that this rapid growth is not always being accompanied by proper monitoring of the technology. When asked if their organization accurately tracks the number of IoT devices on their network, only 34 per cent gave a positive response, compared to 52 per cent who responded negatively and 15 per cent who said they did not know.

Tim Erlin, director of IT security and risk strategy for Tripwire, said: "The IoT presents a clear weak spot for an increasing number of information security organizations. As an industry, we need to address the security basics with the growing number of IoT devices in corporate networks.

"By ensuring these devices are securely configured, patched for vulnerabilities and being monitored consistently, we will go a long way in limiting the risks introduced."

CSA unveils big data security best practice guidelines

31 August 2016

The Cloud Security Alliance (CSA) has unveiled 100 best practices for cloud users to take note of when gathering and handling big data in a bid to improve cyber security on a global scale.

It has split its recommendations into ten categories, with ten key points to follow in each one. The first of these centres around the importance of securing computation in distributed programming frameworks to prevent information leaks and to improve compliance, while the second focuses on improving the general security of non-relational data stores.

Thirdly, the CSA wants to see organisations improving the way that they store and log big data, as without adequate encryption methods, data is placed at significant risk of a breach.
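To illustrate the kind of encryption at rest the CSA is calling for, the snippet below is a minimal sketch of encrypting a record before it is written to storage. It uses the third-party Python cryptography package; the field names and file name are illustrative and not taken from the CSA handbook.

```python
# Minimal sketch: encrypt a record before it is written to a data store.
# Requires the third-party "cryptography" package (pip install cryptography).
# The record fields and file name are illustrative, not from the CSA report.
import json
from cryptography.fernet import Fernet

# In practice the key would come from a key management service, not be
# generated alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"customer_id": "A1234", "balance": 1050.25}
ciphertext = cipher.encrypt(json.dumps(record).encode("utf-8"))

with open("record.enc", "wb") as fh:
    fh.write(ciphertext)

# Reading the record back requires the same key.
with open("record.enc", "rb") as fh:
    restored = json.loads(cipher.decrypt(fh.read()).decode("utf-8"))
assert restored == record
```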

Improving endpoint validation and monitoring security and compliance in real time are two more areas that the alliance has provided ten recommendations apiece for, as are ensuring user privacy and big data cryptography. In terms of the latter, the report discusses how advances in this area will benefit businesses in the future, suggesting that they begin following cryptography best practices now wherever possible to ensure they are prepared and one step ahead of their competitors when the time comes.

The eighth and ninth categories on the CSA's list concern granular access, which means granting users access to data at the most fine-grained level possible, as well as granular audits.

Lastly, data provenance has been named as a key area of focus by the CSA, but is something that can only be achieved by tightening operations in each of the nine aforementioned categories. When discussing big data, provenance refers to keeping a thorough record of everyone with access to manage an organisation's big data, so that the security of the information can be monitored accordingly.
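As a rough illustration of what such a provenance record could look like in practice, the following sketch appends access events to a simple log file. The structure and field names are assumptions for illustration, not part of the CSA guidance.

```python
# Illustrative sketch of a data provenance log: an append-only record of who
# touched a dataset and when, so access can be reviewed later. The file name
# and fields are hypothetical, not taken from the CSA handbook.
import csv
from datetime import datetime, timezone

LOG_FILE = "provenance_log.csv"

def record_access(user: str, dataset: str, action: str) -> None:
    """Append one access event to the provenance log."""
    with open(LOG_FILE, "a", newline="") as fh:
        csv.writer(fh).writerow(
            [datetime.now(timezone.utc).isoformat(), user, dataset, action]
        )

record_access("analyst_01", "sales_2016", "read")
record_access("etl_service", "sales_2016", "update")
```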

Commenting on the report, J R Santos, executive vice-president of research at the CSA, stated: "This is an important initiative for the cloud community, as new security challenges have arisen from the coupling of big data with public cloud environments. 

"As big data expands through streaming cloud technology, traditional security mechanisms tailored to secure small-scale static data on firewalled and semi-isolated networks are inadequate."

The CSA wants all businesses to be adopting the same practices when it comes to protecting big data in order to make them less vulnerable to attacks and to ensure a strong level of security across the board. With big data continuing to increase in importance and more and more firms beginning to harness the insights it can deliver, it is vital that companies start implementing best practice protection measures sooner rather than later.

Mr Santos concluded: "Security and privacy issues are magnified by this volume, variety and velocity of big data. This handbook serves as a comprehensive list of best practices for companies to use when securing big data."

US govt turns to big data to fight security breaches

30 August 2016

A significant majority of federal US government agencies have adopted big data analytics tools in order to improve their cyber security capabilities, a new report has found.

The 'Navigating the Cybersecurity Equation' study, conducted by MeriTalk on behalf of Cloudera, revealed more than four-fifths of organisations (81 per cent) are using this technology in some form. Some 53 per cent of agencies reported big data has a role to play in their overall cyber security strategy, while 28 per cent stated they were using it in a more limited capacity.

However, many federal agencies are still struggling to keep up with the threats posed by hackers and other cyber criminals. Nearly six out of ten respondents (59 per cent) reported their organisation deals with a cyber security incident at least once a month due to their inability to effectively analyse the data they have available.

One of the most common problems is that agencies find it difficult to manage these resources. Some 88 per cent of respondents admitted they struggle to turn their data into useful cyber security intelligence, while almost half (49 per cent) said the sheer volume of information is overwhelming. 

Meanwhile, one in three agencies (33 per cent) said they don't have the right systems in place to gather the data they need, and 30 per cent found that by the time the information makes it to cyber security managers, it has become outdated.

Efforts to make the most of data are also hampered by budget issues, a lack of internal skills, privacy concerns and a lack of management support and awareness of such projects. As a result, more than 40 per cent of data gathered by federal agencies goes unanalysed.

Rocky DeStefano, cyber security subject matter expert at Cloudera, stated that as both internal and external cyber security threats are evolving on a daily basis, it is vital that government agencies are able to unlock the power of data in order to combat these dangers.

"Agencies need complete visibility into the data across their enterprise," he said. "These teams also need the ability to flexibly analyse that data in a meaningful timeframe so they can detect advanced threats quickly, identify the impact and reduce the associated risk."

Those agencies that are able to effectively adopt big data analytics tools have seen significant improvements in their cyber security defences. Some 84 per cent of big data users reported that their agency had successfully used the technology to prevent an attack, while 90 per cent have seen a decline in overall breaches.

These successes mean that agencies are keen to increase their investments in big data technology. Some 94 per cent of respondents stated their agency has plans to invest in big data analytics in the next two years, with key areas for improvement including technology infrastructure (61 per cent), hardware (52 per cent), and business intelligence tools/analytics (52 per cent).

Steve O'Keeffe, founder of MeriTalk, said: "Agencies face a perfect storm of cybersecurity threats. When you're headed into troubled waters, you need a weather forecast. Big data provides agencies with the visibility to ensure they don't end up the river without a paddle."

Lack of skills holding back adoption of big data

25 August 2016

Key technology projects such as big data analytics, Internet of Things (IoT) deployments and artificial intelligence (AI) are being held back because businesses lack the skills to properly implement them.

This is the finding of a new report by Capita and Cisco, which revealed that while there is widespread awareness of the importance of these technologies, there is a strong disconnect between that awareness and actual implementation.

For example, nine out of ten respondents (90 per cent) agreed that big data analytics is relevant to their business. However, just 39 per cent stated they were currently utilising this technology.

One reason for this is that nearly two-thirds of businesses (64 per cent) did not have the skills to recognise how they could use big data within their company.

A similar pattern was seen for IoT. Although 70 per cent of IT decision-makers stated their business could benefit from this technology, 71 per cent said they did not have the skills within their organisation to identify any opportunities for growth afforded by it.

What's more, 80 per cent of respondents said they did not have the skills to capitalise on the data they received from IoT. As a result, just 30 per cent said this technology was being implemented in their business.

One of the biggest disconnects was related to the deployment of AI technologies. Only eight per cent stated this is being adopted in their own company, even though half of businesses agreed this would be relevant to them. This may be because 80 per cent of respondents don't have the skills to implement or keep up to date with trends and developments in AI.

Adam Jarvis, managing director at Capita Technology Solutions, said: "It's clear that there are several important, technology-led trends which have the capacity to transform the way business is done.

"Whilst it is encouraging that levels of awareness around the strategic benefits of those trends are high, these results suggest more needs to be done to support businesses and help them close what is a substantial skills gap."

When it comes to big data, the difficulty of getting this technology to operate alongside legacy infrastructure was named as the biggest barrier to adoption. Other issues named by IT decision-makers included cost and data governance questions.

Meanwhile, the risk of security breaches was named as the biggest hurdle to IoT adoption, followed by data governance and overcoming problems created by adapting legacy IT systems.

"Without the necessary skills and infrastructure needed to implement trends such as IoT and big data, businesses across the board will suffer long-term competitive disadvantage; it is up to us as an industry to find the best and right ways to deliver that support," Mr Jarvis continued.

IoT ‘to change focus’ of big data plans

19 August 2016

The emergence of the Internet of Things (IoT) as a key technology for many businesses will lead to a significant change in how organisations approach their big data analytics operations.

This is according to a new report from Machina Research, which stated there will be an increasing focus on predictive and prescriptive analytics in order to assist with business decision-making as firms try to make the most of the available data.

The company said that as millions of connected devices come online and provide real-time details about what is going on in the physical world, businesses will look to shift the goals of their analytics activities from examining what has happened to asking what is likely to happen.

Although more traditional activities such as historical and descriptive analytics will still have their place, the real value in the coming years will come from being able to accurately foresee opportunities and threats before they become readily apparent to competitors.
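The difference between the two approaches can be shown with a toy example: descriptive analytics summarises readings that have already arrived, while predictive analytics fits a trend to them and extrapolates forward. The figures below are invented and not drawn from the Machina Research report.

```python
# Toy illustration of the shift described above: descriptive analytics asks
# what has happened, predictive analytics asks what is likely to happen next.
# The readings are made-up hourly sensor counts.
import numpy as np

readings = np.array([102, 110, 123, 131, 144, 152])

# Descriptive: summarise what has already happened.
print("mean so far:", readings.mean())

# Predictive: fit a linear trend and extrapolate one step ahead.
hours = np.arange(len(readings))
slope, intercept = np.polyfit(hours, readings, 1)
print("forecast for next hour:", slope * len(readings) + intercept)
```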

Author of the report Emil Berthelsen, principal analyst at Machina Research, observed: "One of the more significant developments as part of, and in parallel to, developments in IoT, is the approach of two different 'waves' in data management – big data and fast data."

He explained both of these are characterised by high speed and large scale, and the combination of the two has led to significant changes in how businesses interact with data, resulting in new requirements for data management.

"The landscape of IoT data and analytics is certainly evolving and will include a new age of machine learning, augmented insights and managed autonomy, as well as a new set of enabling technologies and data governance tools," Machina Research stated.

By 2020, it is estimated that IoT-equipped gadgets and sensors will make up around half of all connected devices. According to a study by Cisco, this will equate to some 12.2 billion items.

Meanwhile, Machina Research has estimated that revenues from IoT are set to exceed $3 trillion in 2025, compared with just $750 billion last year.

Many firms still struggling to secure cloud-based data

9 August 2016

For many businesses, cloud computing presents a great opportunity to make the most of big data, as the technology allows them to access storage and processing resources that may otherwise be beyond their reach.

But if they are going down this route, they must take steps to ensure any sensitive data they transfer to the cloud is secure – and this is something that is proving to be a challenge for a large number of firms.

A recent study conducted by Gemalto and the Ponemon Institute found that although 73 per cent of enterprises currently regard cloud-based platforms as important to their current operations, fewer than half of IT security professionals are confident in the security of their solutions.

Some 54 per cent of respondents did not agree their companies have a proactive approach to managing security, or ensuring that their cloud solutions comply with privacy and data protection regulations.

Additionally, 56 per cent did not agree their organisation is careful about sharing sensitive information in the cloud with third parties such as business partners, contractors and vendors.

This is despite the fact that a growing amount of data, ranging from customer information to payment records, is stored or processed in the cloud. In 2014, 53 per cent of businesses held customer data in the cloud, but this has increased to 62 per cent today. The majority of respondents (53 per cent) also consider this information to be most at risk.

Dr Larry Ponemon, chairman and founder of the Ponemon Institute, said that cloud security "continues to be a challenge for companies, especially in dealing with the complexity of privacy and data protection regulations".

He added: "To ensure compliance, it is important for companies to consider deploying such technologies as encryption, tokenisation or other cryptographic solutions to secure sensitive data transferred and stored in the cloud."

This is something many firms are currently failing to do. The survey showed only a third of businesses (34 per cent) using Software-as-a-Service solutions currently encrypt or tokenise sensitive data that is being transferred to the cloud.
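One way such tokenisation can work is sketched below: a sensitive field is replaced with an opaque token before the record is sent to a cloud service, and the mapping back to the real value stays on-premises. The field names and vault structure are illustrative assumptions rather than a reference to any particular product.

```python
# Hedged sketch of field-level tokenisation before data leaves for a SaaS
# provider: the sensitive value is swapped for a random token and the mapping
# is kept locally. All names here are illustrative.
import secrets

token_vault = {}  # token -> real value, kept on-premises

def tokenise(value: str) -> str:
    """Replace a sensitive value with an opaque token and remember the mapping."""
    token = "tok_" + secrets.token_hex(8)
    token_vault[token] = value
    return token

customer = {"name": "J. Smith", "card_number": "4111111111111111"}
outbound = {**customer, "card_number": tokenise(customer["card_number"])}

# 'outbound' is what would be sent to the cloud service; the vault never leaves.
print(outbound)
```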

One common concern was that conventional security practices do not apply when dealing with the cloud, which means firms may have to adapt their approach to activities such as big data analytics. Seven out of ten firms (70 per cent) cited this as a challenge, while 69 per cent stated the fact they cannot directly inspect cloud providers for security compliance is a problem.

Jason Hart, vice-president and chief technology officer for data protection at Gemalto, commented that although organisations have embraced the cost and flexibility benefits of the cloud, it is clear that many businesses are still struggling to maintain control of their data in this environment.

"It's quite obvious security measures are not keeping pace because the cloud challenges traditional approaches of protecting data when it was just stored on the network," he continued. "It is an issue that can only be solved with a data-centric approach in which IT organisations can uniformly protect customer and corporate information across the dozens of cloud-based services their employees and internal departments rely on every day."

How big data could transform the roads of the future

8 August 2016

While big data is fast becoming an integral tool in the business world, its potential reach is far wider, with the capacity to affect the day-to-day lives of people all over the world.

For example, big data is playing a key role in the development of so-called smart cities, where almost every aspect of life, from waste disposal to law enforcement, will be managed by integrated technology solutions that are dependent on big data to function. 

Key to the development of true smart cities will be the evolution of more efficient transport systems, which make getting from A to B quicker and easier. Big data will be vital to making this future a reality and great strides have already been made.

Intelligent traffic

Traffic congestion is one of the biggest problems in cities across the world. Time-consuming and pollution-generating, it has a negative impact on both the environment and city residents' quality of life. There is an economic effect too, with a study from INRIX and the Centre for Economics and Business Research forecasting congestion will cost the UK economy £307 billion between 2013 and 2030.

The growth of the Internet of Things and machine-to-machine technology presents solutions to these problems. According to Statista, more than one in ten cars on the road (12 per cent) now possess connected technology, and this figure is expected to reach 22 per cent by 2020. With such technology in place, it is possible for vehicles to share and receive real-time data on road and traffic conditions. This can then be processed to provide information that can be used to manage traffic more effectively.

Theoretically, the exchange of data will make it possible to monitor the number of cars in a certain area and divert approaching motorists towards a different route when capacity is reached. Trials to this effect have already taken place in the US and China.
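A highly simplified sketch of that diversion logic might look as follows, with invented capacity thresholds and route names standing in for real road network data.

```python
# Simplified sketch of the diversion idea described above: count connected
# vehicles reporting from a road segment and suggest an alternative route once
# a capacity threshold is reached. Thresholds and routes are invented.
SEGMENT_CAPACITY = {"A40_east": 400, "ring_road_north": 600}
ALTERNATIVE = {"A40_east": "ring_road_north"}

def route_advice(segment: str, vehicles_reporting: int) -> str:
    """Return the segment to recommend to approaching drivers."""
    if vehicles_reporting >= SEGMENT_CAPACITY[segment]:
        return ALTERNATIVE.get(segment, segment)
    return segment

print(route_advice("A40_east", 450))  # -> "ring_road_north"
```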

However, progress is still required before this vision of truly smart traffic can be achieved. Hussein Dia, associate professor at Swinburne University of Technology, Melbourne, recently discussed the issue with Raconteur.

"While decision-makers and city leaders are recognising the role of data analytics in ‘sweating of assets’ and providing innovative solutions to meet demand, deployment at a global scale is still in its infancy," he stated.

"As the amount of data about current and future travel demands increases in the connected world, so the possibility of better analytics increases. In order to have real benefit, though, predictive analytics for transport as a whole is required," Prof Dia added.

Self-driving cars

Should data-driven traffic management become a reality, the next step in the development of truly smart cities will be autonomous vehicles. Progress has already been made in this area, with Google having covered close to 2.5 million miles in test drives through its self-driving car project. Tesla, meanwhile, has adopted more of a piecemeal approach and has already released the 'autopilot' software update, which allows its vehicles to drive, change lanes and adjust speed autonomously. However, the car must already be moving at a consistent speed and have maps data of the surrounding area before these features can be engaged.

If genuinely autonomous cars are to become a reality, big data will be crucial, providing the real-time information these vehicles will require to navigate the roads in the safest and most efficient manner possible. 

4 in 10 manufacturing firms experimenting with big data architecture

4 August 2016

Just over four in ten businesses in the high-value manufacturing sector are currently experimenting with big data architecture.

This is according to a new study carried out by the Alan Turing Institute and Warwick Analytics, the results of which have been shared with The Manufacturer. It found that 41 per cent of businesses in the sector are currently at the experimentation stage with big data architecture, looking into how they can utilise this technology and the value it can provide. However, adoption is expected to accelerate in the near future, with the number of businesses still experimenting forecast to be only 11 per cent come 2019.

The study revealed there is a lack of clarity about some aspects of big data among manufacturers, with half of the companies surveyed unable to clearly understand the difference between business intelligence, big data analytics, and predictive analytics.

When it comes to technical barriers to adoption of big data analytics, having data spread across a number of systems that can prove difficult to combine was rated as the top factor. Concerns about the quality of data and the difficulty of cleaning it were ranked as the second biggest challenge. Other barriers mentioned ranged from a lack of data to having too much information, while some respondents believe data analytics is simply too difficult to understand.

A number of business challenges were also highlighted as obstacles to adoption, such as a lack of internal sponsorship, a shortage of specific data analysis skills and not having an effective business case for the technology. Despite these concerns, the majority of respondents see the value in big data analytics, with 92 per cent saying they believe it can drive a business improvement of more than ten per cent.

"The ability to extract meaningful insights about products; processes; production; yield; maintenance, and other manufacturing functions, as well as the ability to make decisions and take proactive action – when it matters – can deliver tremendous growth and profitability result," the report stated.

"Manufacturers have tremendous potential to generate value from the use of large datasets, integrating data across the extended enterprise and applying advanced analytical techniques to raise their productivity both by increasing efficiency and improving the quality of their products. However, the reality is that very few of today’s manufacturers are close to this vision yet," it added.

The study highlighted a number of key benefits that data analytics can deliver for manufacturing businesses. These include improving quality by providing a firm foundation from which the root cause of problems can be identified and making production more effective. Other advantages are faster time-to-launch, forecasting maintenance needs and improving supply chain operations.

Big data tech makes inroads into UK public sector

3 August 2016

Big data technology is beginning to make a mark within the UK public sector.

Both HM Revenue and Customs (HMRC) and the Home Office are now using Hadoop to help manage their data needs, according to a report from Computer Weekly. Both organisations are using commercial distributors of the software, with HMRC utilising Cloudera and the Home Office working with Hortonworks.

The former spent some £7.4 million on Cloudera at the beginning of last year before investing almost another £1 million earlier in 2016.

A HMRC spokesperson told Computer Weekly: "HMRC has built an enterprise data hub – a powerful central repository for all of its data, which will help it to personalise services to customers and strengthen its compliance work. HMRC will be able to store and analyse data using a mix of open source and closed source tools, and commodity hardware, representing better value for money for taxpayers."

The spokesperson added that use of Hadoop will allow for greater operational efficiency and a level of analytical capacity that has not been available to it in the past.
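As a rough idea of what querying such a Hadoop-based data hub can look like, the sketch below reads records from HDFS with PySpark and aggregates them with SQL. The paths, column names and data are assumptions for illustration and do not describe HMRC's actual setup.

```python
# A minimal PySpark sketch of the kind of query a Hadoop-based data hub makes
# possible: reading records from HDFS and aggregating them with SQL.
# Paths and column names are hypothetical, not HMRC's real configuration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-hub-example").getOrCreate()

returns = spark.read.parquet("hdfs:///data_hub/returns/2016/")
returns.createOrReplaceTempView("returns")

summary = spark.sql(
    "SELECT region, COUNT(*) AS filings FROM returns GROUP BY region"
)
summary.show()
```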

Records show the Home Office spent £53,000 with Hortonworks in August 2014 and a further £61,000 two months later. It is reported that the organisation is using Hadoop to connect the various databases it currently relies on.

Regarding the use of big data technology in the public sector, Cloudera's vice-president for northern Europe, Stephen Line, said full-scale adoption will take time.

"Government, like a lot of old industry, has to go through that digital transformation, modernising its digital architecture, breaking down those silos. The UK is not necessarily behind or ahead particularly," he commented.

Other UK public sector bodies now utilising big data technology include the National Crime Agency and Office for National Statistics. 

Earlier this year, an independent report commissioned by the UK government called for the nation's public sector to improve its use of data. Professor Sir Charlie Bean, a former deputy governor of the Bank of England, compiled the document, which said improvements need to be made to ensure that accurate economic statistics on the digital economy can be captured.

Prof Bean said: "We need to be candid about the limitations of UK economic statistics. The UK was one of the original pioneers of national accounting. We need to take economic statistics back to the future or we risk missing out an important part of the modern economy from official figures."

Among his recommendations were for the establishment of two new centres to better measure the UK economy and unlock the "treasure trove of big data available – especially in the public sector".

Prof Bean called on the Office for National Statistics to become innovative enough to provide the kind of data the country needs.
