
10 Reasons Why Big Data And Analytics Projects Fail


Big data is a big deal. According to recent research, 64% of organisations surveyed in 2013 had already purchased or were planning to invest in big data systems, compared to 58% surveyed in 2012. With a gamut of companies diving into their data to minimise customer churn, analyse financial risk and improve customer experience, the chances of failure also increase.

According to other published figures, 92% of companies that dive into analytics are stuck in neutral, and most of these fail in the long run.

What are the main reasons these projects fail? AIM spoke to various business leaders in the space and lists the top 10 reasons.

  1.    Poor Data Quality:

Poor data quality and accuracy are a major obstacle to the success of a company’s analytics efforts. Most analysts feel that the data provided to them is inaccurate or incomplete. According to Gartner research, 52% of data users say they often turn to third-party service providers to help fill the gaps in the data provided. Data quality remains critical, yet very few organisations identify the issue and take a proactive approach to addressing it; even a quick profiling pass, as in the sketch below, can surface the worst problems early.
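The following is a minimal sketch of such a profiling pass in Python with pandas; the file name and column names (customers.csv, monthly_revenue) are hypothetical placeholders, not a prescribed schema.

```python
import pandas as pd

# Hypothetical customer extract; the file and column names are illustrative only.
df = pd.read_csv("customers.csv")

# Share of missing values per column.
missing_share = df.isna().mean()

# Exact duplicate records.
duplicate_rows = int(df.duplicated().sum())

# Implausible values, e.g. negative monthly revenue.
negative_revenue = int((df["monthly_revenue"] < 0).sum())

print("Missing-value share per column:")
print(missing_share.sort_values(ascending=False))
print("Duplicate rows:", duplicate_rows)
print("Rows with negative monthly_revenue:", negative_revenue)
```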

  2.    The Structure of Data:

Inaccurate, outdated and incomplete data is bad for business, especially in terms of profitability and competitive advantage. Hence the pressing need for data to be structured so it can be put to best use. Another data-related problem is that companies often manage data at a local level (e.g. a department or location). This leads to the creation of ‘information silos’ in which data is redundantly stored, managed and processed. To sum up, poor-quality data that is not identified and corrected can have a significant negative economic impact on an organisation, with implications ranging from dissatisfied customers and increased running costs to inefficient decision-making, lower performance and reduced employee job satisfaction.

  3.    Complex Models:

Data scientists often go after complex model building when a simple model can be just as good, and at times even superior. There is always a tendency to complicate the problem statement or come up with solutions that have a needlessly complex build. This not only takes focus away from the big picture but also digresses from what the right path to a solution might be; a simple baseline, compared honestly against the complex alternative, often settles the question (see the sketch below).
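As a rough illustration of that comparison, the sketch below uses scikit-learn on a synthetic dataset (an assumption, standing in for real project data) to evaluate a plain logistic regression against a gradient-boosting model under the same cross-validation; on many tabular problems the gap turns out to be small.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a real tabular problem.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

simple_model = LogisticRegression(max_iter=1000)
complex_model = GradientBoostingClassifier(random_state=0)

# Same 5-fold cross-validation for both models.
print("Logistic regression accuracy:", cross_val_score(simple_model, X, y, cv=5).mean())
print("Gradient boosting accuracy:  ", cross_val_score(complex_model, X, y, cv=5).mean())
```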

  4.    Asking the Wrong Questions / Lack of Business Objectives:

Companies often start with an overly ambitious project and fail to tackle it. More importantly, companies lack clear objectives or are simply asking the wrong questions. A classic example is Google’s ‘Flu Trends’, an initiative launched by Google to predict flu epidemics. The question the company asked was: “When will the next flu epidemic hit North America?” When the data was analysed, however, the model missed the 2009 epidemic and later over-predicted flu activity. It was later speculated that had the researchers instead asked, “What do the frequency and number of flu trends tell us?”, the results might have been more accurate.

  5.    Poor Management Overview:

Big data projects succeed when they are not “isolated projects” but are central to how the company plans to use its data. Placing shifting strategic priorities and ideologies above the data is a mistake organisations often make. According to the Gartner survey, 62% of business leaders said they tend to trust their gut regardless of what the data conveys, and 61% said real-world insight tops hard analytics when making decisions. Top-level management must have a clear overview and plan ahead for how data will be put to work.

  6.    Results Not Reproducible:

There is growing alarm about results that cannot be reproduced or reused. Explanations include increased levels of scrutiny, the complexity of experiments and weak statistics. An analysis may be poorly designed or use inappropriate statistical methods, but one must also understand that analytics should not be a one-time exercise: results that cannot be reproduced or acted upon are of no use, only time, money and energy wasted. Simple habits, such as fixing random seeds and recording the environment of each run, go a long way (see the sketch below).
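One possible convention for this is shown in the minimal Python sketch below; the seed value, metadata fields and the run_metadata.json file name are assumptions for illustration, not a standard.

```python
import json
import random
import sys

import numpy as np

SEED = 42
random.seed(SEED)     # Python's built-in RNG
np.random.seed(SEED)  # NumPy's global RNG

# Record the environment alongside the results so the run can be repeated later.
run_metadata = {
    "python_version": sys.version,
    "numpy_version": np.__version__,
    "seed": SEED,
}

with open("run_metadata.json", "w") as f:
    json.dump(run_metadata, f, indent=2)
```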

  7.    Lack of Infrastructure:

Companies often attempt to solve big data problems using traditional data technologies, in which case failure is more often than not certain. Interrogating non-traditional data sets through traditional means is a very common problem. Despite the major contributions of big data technology across all facets of society, big data management has caused headaches for most data centres and data-management teams. Big data storage brings new challenges as data types and content become more complicated and grow in volume.

  8.    Lack of Expenditure:

Lack of budget and issues with technology are also top impediments to achieving a company’s data strategy. While business use of data is growing, investment in technology is falling behind. According to another survey, only 41% of companies use predictive models and/or advanced analytical and forecasting techniques. Technological advancement comes at a cost, and companies must understand the need for upgrades.

  9.    Missing Timeline:

Time it! Results produced long after they were needed are another reason for failure, particularly among smaller companies. Discipline and better time management must be instilled from the beginning. Don’t wait until your results are redundant.

  10.    Lack of Skilled Manpower:

Last on our list but still the most vital: too many big data projects fail due to a lack of skilled manpower. A critical element is having a team that brings the right talent on board. 66% of analytics leaders in India believe that ‘unavailability of analytics talent’ is the major challenge they face.

 
