Blog

Chief data officers ‘essential’ to big data success

13 Dec 2016 | Posted By: admin
Image credit: iStockphoto/emyerson

Organisations that invest in skilled executives to manage their big data analytics projects are better placed to succeed in this area than those that do not, a new report has indicated.

A study of US federal agencies conducted by MeriTalk and ViON Corporation revealed that almost all these bodies (92 per cent) use big data to some degree. However, the majority (58 per cent) graded the effectiveness of their data management strategy as C or worse.

Therefore, having the right personnel on hand to control the direction of such projects will be invaluable. The study found that 88 per cent of organisations with a chief data officer (CDO) leading these efforts report that these executives have had a positive impact on … Read more


Don’t delete big data, companies urged

06 Dec 2016 | Posted By: admin

Companies performing ad-hoc big data analytics operations have been reminded of the importance of keeping the data used in the process after it is completed.

Speaking at an IT Leaders Forum organised by Computing.com, Alex Chen, IBM's director of file, object storage and big data flash, explained that businesses may need to refer back to this information at a later date. This may be to meet regulatory requirements, or simply because people want to investigate what happened and why a particular decision was taken.

At the moment, many organisations are still in the early adoption stage when it comes to big data, which means they may be performing a large number of experimental and ad-hoc analyses as they … Read more


Using external scripts to create a table with random test data

02 Dec 2016 | Posted By: admin

As a database tester, you sometimes need to create tables with randomised data for testing. This could be because security and privacy issues prevent the use of production data, or because it helps in testing robustness against unusual characters, integer ranges and so on. There are various ways to achieve this. One flexible method is to use Kognitio's external scripting feature.

External scripts are Kognitio's mechanism for passing data to and from scripting environments such as Python, Bash and R; they are described in section 10 of the Kognitio Guide (the documentation can be downloaded from http://www.kognitio.com/forums/viewtopic.php?f=2&t=3/).

If external scripting for Python is already enabled in a Kognitio system, then the following SQL will create a … Read more
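The SQL in the original post is truncated here, but the idea behind the script body can still be sketched. As an illustration only (the function name, column layout and value ranges below are assumptions, not the post's actual code), a Python generator of random test rows might look like this; in an external script the rows would be written to stdout for the database to load:

```python
import csv
import random
import string
import sys

def random_rows(n, seed=None):
    """Generate n rows of (id, name, score) with randomised values."""
    rng = random.Random(seed)  # a seed makes test data reproducible
    rows = []
    for i in range(n):
        # Random 8-letter name: exercises string handling and odd characters
        name = "".join(rng.choice(string.ascii_letters) for _ in range(8))
        # Signed integer: exercises negative and positive ranges
        score = rng.randint(-1000, 1000)
        rows.append((i, name, score))
    return rows

if __name__ == "__main__":
    # Emit plain CSV to stdout, the usual way an external script
    # hands rows back to the database; this also works for local testing.
    writer = csv.writer(sys.stdout)
    writer.writerows(random_rows(10, seed=42))
```

Passing a seed gives repeatable data across runs, which is useful when a test needs to be re-run against identical input; omitting it gives fresh random data each time.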


Harvard seeks to tackle big data storage challenges

01 Dec 2016 | Posted By: admin

With a growing number of companies looking to expand their big data analytics operations in the coming years, one key consequence of this will be an explosion in the amounts of data that businesses will have to store.

Therefore, finding cost-effective solutions for this will be essential if such initiatives are to be successful. While turning to technologies such as cloud computing could be the answer for many businesses today, as data volumes continue to grow at an exponential rate, new and improved solutions may be required.

This is why developers at Harvard University have been working to develop new infrastructure that is able to cope with this influx of information and support critical research taking place throughout the institution.… Read more


Financial services firms to embrace real-time analytics

30 Nov 2016 | Posted By: admin

A growing number of companies in the financial services sector are set to upgrade their big data analytics initiatives to include real-time solutions, a new report has claimed.

A study by TABB Group noted there is an increasing understanding in the sector that the value of a given piece of data can be lost almost immediately as it becomes outdated. Therefore, capital markets firms are turning to real-time analytics for activities including risk management, compliance, consumer metrics and turning insight into revenue.

Monica Summerville, author of the report, noted that simply having data is no longer useful, and that traditional approaches to analytics, such as data warehousing and batch-led processing, no longer apply.

In today's environment, firms … Read more


How HelloFresh embraced Hadoop

28 Nov 2016 | Posted By: admin

As businesses grow, it becomes more critical for them to have a solution that will effectively handle the increasing amounts of data they generate. However, one problem many organisations encounter as they expand is that tools that were adequate when first built cannot scale along with the company.

This was the problem facing Berlin-based home meal delivery firm HelloFresh. The five-year-old firm has expanded rapidly and now delivers more than 7.5 million meals a month to 800,000 subscribers in multiple countries. Therefore, it found itself quickly outgrowing the custom-made business intelligence system it had long relied on, and needed a new solution.

In a recent interview with InformationWeek, chief technology officer at the company … Read more


UK regulator cautions insurers on big data

24 Nov 2016 | Posted By: admin

The head of the Financial Conduct Authority (FCA) has reminded insurance providers of the need to be careful in their use of big data to ensure some customers are not unfairly penalised.

Speaking at the Association of British Insurers' annual conference, chief executive of the regulator Andrew Bailey noted the ability to capture and convert information into insight has led to a "revolution" in how businesses approach data. However, he cautioned that there need to be boundaries on how this is used to ensure that the technology serves everyone effectively.

The use of big data can allow insurers to determine premiums for consumers at a much more individual level, rather than pooling them into wider risk groups. This puts more … Read more


How Tesco is diving into the data lake

23 Nov 2016 | Posted By: admin

An effective big data analytics solution is now an essential requirement for any large business that wishes to be successful in today's competitive environment, regardless of what sector they are in.

However, one part of the economy that particularly stands to benefit from this technology is retail. These firms have a longstanding tradition of gathering and utilising customer data, so the ability to gain greater insight from the information they already have will play a key role in their decision-making.

One company that has always been at the forefront of this is UK supermarket Tesco. It was noted by Forbes that the company was one of the first brands to track customer activity through the use of its loyalty cards, … Read more


Providing good problem reports

21 Nov 2016 | Posted By: admin
Image credit: iStockphoto/cifotart

Why bother creating a good problem report?

The better the problem report is, the faster your problem is likely to be solved.

In the absence of a good report, whoever is trying to resolve your problem will probably come back to you and ask for more information. The more iterations of this nature there are, the more time will elapse before the problem is understood and can be resolved.

 

Where to create problem reports

All users of Kognitio software can report problems by posting on the community forums at http://kognitio.com/forums – remember these are visible to everyone, so don’t post confidential information.

Alternatively, customers paying for support can:

  • create a case on the support portal
  • email the appropriate helpdesk
… Read more


NIH highlights use of big data in disease research

21 Nov 2016 | Posted By: admin
Image credit: iStockphoto/kentoh

The US National Institutes of Health (NIH) has highlighted the importance of big data in helping track infectious disease outbreaks and formulating response plans.

In a study published as a supplement in the Journal of Infectious Diseases, the body observed that data derived from sources ranging from electronic health records to social media has the potential to provide much more detailed and timely information about outbreaks than traditional surveillance techniques.

Existing methods are typically based on laboratory tests and other data gathered by public health institutions, but these have a range of issues. The NIH noted they are expensive, slow to produce results and do not provide adequate data at a local level to set up effective monitoring.

Big data … Read more

