Anne Fisher just published a nice piece titled “Why Big Data Isn’t Paying Off for Companies (Yet)” where she describes research from the American Institute of CPAs (AICPA) regarding big data initiatives within organizations around the world. You can jump over to the study itself, but Anne does a very good job describing the findings in her article.
One of the key findings from that study is that there are three main roadblocks to big data success. They are:
Being in too big of a hurry
Trying to start too ‘big’
Not incorporating corporate culture into big data initiatives
In my work with clients on big data, that third roadblock is the one that tends to cause the most problems. Sure, many companies want to throw big data at all their problems (or is it throw all their problems at big data?) but it doesn’t take long for them to realize they need to start small and go slow to make sure they know what they’re doing within the big data world.
Roadblock #3 is the one that gets most companies into trouble, especially those that are very bureaucratic. These bureaucratic companies tend to attract people who like to hold onto information because it makes them feel more ‘powerful’ or in control. If you implement big data systems and processes correctly, your organization should become much more information rich, which is something that bureaucrats don’t really like. Therefore, it is imperative to big data success that organizations remove the bureaucrats and bring in a more open, sharing culture. Fisher and the AICPA agree:
Busting up the bureaucracy, so information can flow quickly to the right people, requires a kind of manager that the AICPA study refers to as “integrative thinkers.” The relatively few organizations that are making profitable use of big data have hired or cultivated executives Thomas describes as “collaborative leaders who can see horizontally across their whole organization and connect the dots.”
Are you thinking about corporate culture in your big data initiatives?
A newly released report from IDC titled “Worldwide Big Data Technology and Services Forecast, 2015–2019” predicts that we will continue to see strong growth over the next four years for big data services, software and infrastructure. According to IDC, these three areas will grow at the following compound annual growth rates (CAGR):
Infrastructure: 21.7% CAGR.
Software: 26.2% CAGR.
Services: 22.7% CAGR.
With these growth rates, IDC forecasts that the annual spend for big data services, software and infrastructure will be $48.6 billion in 2019, which works out to roughly 23% compound annual growth between now and 2019.
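As a quick sanity check on those numbers (a sketch only; the blended ~23% rate and the back-of-the-envelope 2015 base are my arithmetic, not figures taken from the IDC report):

```python
def cagr_projection(base, rate, years):
    """Project an annual spend figure forward at a compound annual growth rate."""
    return base * (1 + rate) ** years

spend_2019 = 48.6    # $ billions in 2019, per the IDC forecast
blended_cagr = 0.23  # the roughly 23% overall growth rate cited above

# Back out the 2015 spend implied by the 2019 forecast and the blended CAGR
implied_2015 = spend_2019 / (1 + blended_cagr) ** 4
print(round(implied_2015, 1))  # → 21.2 ($ billions)
```

In other words, a ~23% CAGR means spend more than doubles over the four-year window, which is consistent with a 2019 forecast in the high-$40-billion range from a low-$20-billion starting point.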
Based on those predicted rates of growth in big data services, software and infrastructure, it looks like we still have plenty of time on Gartner’s Hype Cycle before we get to the ‘plateau of productivity’ within most organizations.
A PwC / Iron Mountain survey paints a similar picture:
4% of respondents are able to extract full value from the data they have
36% of respondents lack the tools and/or skills necessary to extract value from their data
66% of respondents obtain little to no benefit from their data
25% of respondents do not see any value in the data they have and don’t believe it would add value to any decision-making process.
I was pointed to the PwC / Iron Mountain report via a CIO article titled “Study reveals that most companies are failing at big data”. In that article, the author uses the data from this report to claim that companies are failing at big data. I don’t think that’s what the data is saying at all…I think the survey responses show that companies are trying to figure out how to ‘do’ big data but very few have really got it under control. That lines up well with my experience working with clients, as do the survey results in the report.
This survey (and others) tells me that we are still very early in the lifecycle (or hype cycle) of big data. Lots of people are talking about it and lots of people/companies have extreme expectations about what they can do with big data, but few have really figured out how to make big data work for them. According to Gartner’s Hype Cycle, I think we are still in the up-cycle between ‘technology trigger’ and ‘peak of inflated expectations’ for most organizations.
Are companies failing at big data? Sure…but I think that’s just because most companies are still very early in the learning cycle for big data. Give it some time and we’ll see these survey results change.
One of the highlights is extremely important. It relates to the importance of finding actionable insights from big data and sharing those insights in a way that the organization can use. The quote is:
I think you need three ingredients. You need data, you need the right ways to combine the data and extract features from that data, and then the third ingredient is the ability to analyze the data and bring together the analysis results in a way that provides these insights and these measurable actions. … [They] need to be able to know what actions [they] need to execute in response to these analytics….
So many times when talking about big data, we get ourselves so wrapped up in the technologies, processes and systems that we forget to think about the real reason we are even working with data in the first place. Data is near worthless until it is analyzed. Sure, you can put some ‘value’ on data but unless you turn that data into information (and then into knowledge) you’re doing nothing more than being a data hoarder.
So…to my data hoarding friends I say: Don’t focus on the data, focus on what the data tells you. I don’t care if you have 1 GB of data or 1 PB of data, if you can’t turn that data into information and then knowledge, your data initiatives aren’t going to succeed. Additionally, if you can’t communicate the insights gained from your analysis, you’re missing out on the real value in data analysis.
Big data doesn’t have to be used to solve big problems. Big data sure makes it easier to solve big problems, but you can just as easily use big data and data analytics to solve ‘smaller’ issues.
In fact, if your organization is just getting started in the world of big data, it makes sense to find a few smaller problems to try to solve. These small problems allow you to tweak your systems and processes to make sure you are gathering, storing and analyzing data correctly. These small problems also let you build up the appropriate skills within your teams to ensure when the big problems come along, your teams are ready to handle them.
Wouldn’t you feel better about your big data initiatives if you could ‘prove’ that the systems, processes and people were working effectively and giving output that can be believed? Most people would…and that’s why most organizations should start with these small projects. There’s nothing worse than getting six months (or a year) down the big data road and realizing your data has been collected and stored in a way that makes it difficult to know whether that data is correct and ‘clean’.
Just as important as making sure your systems, processes and people are working effectively is making sure your organization is ready to accept the outcomes of any big data analysis. There’s nothing worse than spending time and money only to realize that your organization (or certain people within your organization) isn’t willing or able to accept the outcome of work performed analyzing the company’s data.
Starting small with big data lets you and your organization get comfortable with the entire process of collecting, storing, analyzing and reporting. Big data doesn’t have to require big projects, big budgets or big teams…especially when starting out.
When you read about big data and/or data analytics projects and systems, it is rare that you also read about communicating the outcome of those projects. Without the ability to communicate the results of any analysis to the broader business, most big data / analytics projects are doomed to mediocrity…or even failure.
The quantitative mind is a great one. It is one that I’m very familiar with and one that I wholeheartedly support. The ability to take a data set, analyze that data and create new information and knowledge from that data is an extremely important skill for people and organizations to have.
Just as important is the skill to be able to convert the outcome of any quantitative analysis into something that is easily digestible by people throughout an organization.
Take, for example, the world of academia. There are many really smart people performing research within universities and research facilities. These people conduct research and then publish the outcomes of that research in academic journals to share their new-found knowledge with others.
Have you ever picked up an academic journal/article? These articles are generally well-written and delivered in formal academic styles but they aren’t exactly ‘easy reading’. They are meant to be used for academic reporting within academic circles. They are also used within industry, but most practitioners who read these journals and articles have education and experience similar to that of the folks who are writing and publishing them.
What happens when a finance manager picks up the Journal of Finance paper titled “Determinants of Corporate Borrowing”? Will they easily understand what the paper is trying to communicate? Let’s take a look at a portion of the abstract of the paper:
Many corporate assets, particularly growth opportunities, can be viewed as call options. The value of such ‘real options’ depends on discretionary future investment by the firm. Issuing risky debt reduces the present market value of a firm holding real options by inducing a suboptimal investment strategy or by forcing the firm and its creditors to bear the costs of avoiding the suboptimal strategy. The paper predicts that corporate borrowing is inversely related to the proportion of market value accounted for by real options. It also rationalizes other aspects of corporate borrowing behavior, for example the practice of matching maturities of assets and debt liabilities.
I would argue that anyone – given enough time – could understand what that paragraph is trying to communicate, but in the fast-paced world of business, does anyone really have time to sit down and study this paper? I doubt it. Most will call up a consultant and ask for help in better understanding the optimal approach to corporate debt. What is that consultant going to do? She will take her experience as a consultant (and in finance/banking), study the business, the literature and best practices and then make a recommendation to the business on what they should do. If the consultant is any good, these recommendations will be provided in an easy-to-understand document that can be implemented effectively within the organization.
The same approach needs to be taken with data analytics. We can’t just throw a spreadsheet or chart over the wall at the business and expect them to understand what the data is telling them or what they should do with that data. I see a lot of this these days though. A company will implement a new big data project, perform some analysis of the data and then provide the output of the analysis in pretty charts and tables, but very rarely are there deep, meaningful discussions about what that data is really telling the business and/or what the business should do based on the data analysis.
Now, you may say that good data scientists / analysts already do this…and you’d be right. But not everyone is a great analyst, nor is communication a skill set that most organizations are hiring for these days. When I talk to clients about big data, they talk about the need to get the best hardware, software and analytical skills…but they rarely talk about the need to find great communicators.
Companies regularly spend millions of dollars on the ‘hard’ costs for big data and data analytics. They’ve even begun spending a good deal of money on the ‘soft’ costs, getting their people the best training available so they can become the best data analysts possible, but it is rare that they spend much money on communications training.
If you want to be a great data scientist, become a great communicator and storyteller. As a data scientist, if you can’t communicate in a way that is informative and useful to the business, the work you do in the ‘quant’ world isn’t that valuable to the company. The same can be said to the business in general – if you want a great data analytics culture, build a great communications culture. You can’t have one without the other.
Eric D. Brown, D.Sc. is a technology consultant, investor and entrepreneur with an interest in using technology and data to solve real-world business problems. He currently runs his own consulting practice focused on helping organizations use their data more efficiently. Additionally, he is the Chief Information Officer of Sundial Capital Research, publisher of sentimenTrader.
Eric received his Doctor of Science (D.Sc.) in Information Systems in 2014 with a dissertation titled “Analysis of Twitter Messages for Sentiment and Insight for use in Stock Market Decision Making”. His research interests are currently in the areas of decision support, data science, big data, natural language processing, sentiment analysis and social media analysis. In recent years, he has combined sentiment analysis, natural language processing and big data approaches to build innovative systems and strategies to solve interesting problems. You can read some of his research here: Eric D. Brown on ResearchGate
In addition, he is an entrepreneur who has launched a few companies, the most recent being a company focused on providing data analytics and visualization services to the financial markets.