Big data is old news. You could say it got started back in 18,000 B.C.E., when humans started using tally sticks to track trading activity and inventory. Along the way, we’ve also seen the development of the abacus in 2400 B.C.E., and the statistical analysis experiment conducted by John Graunt in 1663 to try to curb the spread of bubonic plague. And in 1865, Richard Millar Devens brought us the first use of the term “business intelligence” in the Encyclopedia of Commercial and Business Anecdotes, all milestones cited by author Bernard Marr.
Fast forward to today: nearly half of the world's population uses the Internet (usage is at 75% in the U.S.), and the consulting firm Excelacom estimates that the following happens online in a single minute:
• 347,222 new Tweets
• 701,389 Facebook logins
• 2.78 million videos viewed on YouTube
• 527,760 photos shared on Snapchat
• 150 million emails sent
• 51,000 app downloads from Apple
“Many scholars think of that as ‘exhaust’ — just as cars move and leave exhaust behind, every time an interaction online occurs, a lot of data is left behind,” says Dr. Jagdip Singh,
professor of design and innovation and co-director of the MSM-Business Analytics Program at the Weatherhead School of Management at Case Western Reserve University. “With larger amounts of data it makes the task of finding insight more difficult and more urgent because we believe in the ‘exhaust’ lies the insights that can make the lives of humans better.”
Rapid pace of change. Abundance of data in a variety of forms, including photos, video and text-based information. Growing complexity of data, generated quickly and requiring response in real time. All of these challenges have been matched by new technology and a growing sophistication in big data analytics and data science, increasing the capacity of humans and machines to make sense of data and allowing more organizations of all sizes to ride the big data wave.
“Data decision-making is becoming mainstream in more types of companies and more functions,” says Dr. Rakesh Niraj, a professor of marketing at the Weatherhead School of Management who works alongside Singh as co-director of the MSM-Business Analytics Program. “The technology for keeping, collecting and accessing data has become democratized.”
Experts describe the major market forces impacting the big data climate as the “Four V’s of Big Data”:
• Volume. According to IBM’s Big Data & Analytics Hub, every day we create 2.5 quintillion bytes of data, and 90% of today’s data has been created in just the last two years.
• Variety. Data is originating from a wide and growing variety of sources, and 90% of it, according to IBM, is unstructured data like tweets, photos, customer purchase histories and customer service calls.
• Velocity. Excelacom’s statistics above illustrate how quickly data is being generated and transmitted; IBM estimates that by 2018, global internet traffic will reach 50,000 gigabytes per second.
• Veracity. Quick, abundant and varied is only useful if the data is also reliable. IBM estimates that $3.1 trillion is lost in the U.S. economy each year due to poor data quality.
Each of these presents both challenges and opportunities, making best practices in data analytics and access to top talent more crucial than ever.
“To use modern technologies and algorithms that can find the pearls, sort through the noise and do it in a timely, effective manner, that is no doubt the next frontier,” says Gerard Daher, president and CEO of Speedeon Data, a Cleveland-based direct marketing and data services company that he co-founded in 2008.
Getting big insights out of big data
Having access to big data and actually using it properly are quite different things. Marr, who authored “Big Data: Using SMART Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance” and “Big Data in Practice: How 45 Successful Companies Used Big Data Analytics to Deliver Extraordinary Results,” writes that many companies are “data rich but insight poor.”
He encourages businesses to think less about “big” data and more about “smart” data: “The value of the data is not the data itself — it’s what you do with it,” he writes. “Why go to all the time and trouble collecting data that you won’t or can’t use to deliver business insights? You must focus on the things that matter the most otherwise you’ll drown in data. Data is a strategic asset but it’s only valuable if it’s used constructively and appropriately to deliver results.” To that end, the SMART approach to big data Marr espouses begins with the maxim “Start with strategy” — in other words, determining the results the company hopes to achieve before even touching the data. He follows up that SMART approach with Measure analytics and data, Apply analytics, Report results and Transform business.
In Singh’s work with students, “I talk about problems first, analysis second. … In a typical data-based decision, one can analyze the data and find hundreds of interesting insights … but what companies want are insights which are stronger, better and more useful. To get that high quality of insights, students need to understand and contextualize problems … Problems must come first.”
EMC Corporation’s 2014 study “The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things” found that only 22% of information in the digital universe was considered useful data, and less than 5% was actually analyzed. Looking ahead to 2020, EMC predicted that more than 35% will be considered useful data — thanks in part to the growth of the Internet of Things — “but it will be up to businesses to put this data to use.” Meanwhile, a 2016 survey of 150 executives at large U.S. firms by tech consulting firm Attivio found that although 94% felt their corporate big data strategies are headed in the right direction, only 23% felt their companies were “extremely successful” in leveraging big data for decision-making, with another 39% responding “very successful.”
Consider one example Niraj cites — the use of retailer loyalty cards. Many have used such cards for decades, “but until about seven or eight years ago, a large amount of data was collected but not really used. … there are companies that have spent a lot of money getting data and are still struggling to do something with it.”
For example, the grocery chain Tesco, a $54 billion UK-based company operating in 11 countries, started its loyalty card program in the mid-1990s and was an early adopter of using the program for data analytics. The data Tesco collected through its Clubcard was used to determine that a 10-degree rise in outdoor temperature corresponded to a 300% increase in barbecue meat purchases, 45% more lettuce and 50% more coleslaw. These insights helped the company optimize efficiencies while also reducing food loss and waste.
It’s not just for Amazon anymore
It’s easy to see how Tesco, Amazon or Netflix can put big data analytics to use to identify and serve current or prospective customers. But are there lessons to be learned from the big guys for small and mid-sized businesses, startups or non-profits?
“The playing field has absolutely been leveled in terms of the ease of getting up and running and the ease of sophistication about how you analyze your business,” says Jared Blank, the senior vice president of data analysis and insights at Bluecore, a New York-based marketing startup serving retailers. “That has been an under-appreciated change in the last 10 years.”
A small business may not have the resources to hire a data scientist or invest in buying data, but Blank insists that companies can still put data analysis to work in ways they might not immediately think about. For example, a small business might traditionally structure its sales team around prospect geography or company size. A little creative thinking and smart data use might reveal a more effective approach: Blank suggests the company could instead sort its customers by what they chose as their first product. It could be that customers who bought Product X first turn out to be more valuable than those who purchased Product Y first.
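Blank's first-product idea amounts to a simple group-and-compare analysis that even a small business could run in a few lines of code. The sketch below is a hypothetical illustration, not Bluecore's method; the customer records and revenue figures are invented for the example.

```python
# Hypothetical sketch: segment customers by the first product they bought,
# then compare average lifetime revenue per segment. All data is invented.
from collections import defaultdict

# (customer_id, first_product_purchased, lifetime_revenue)
customers = [
    ("c1", "Product X", 1200.0),
    ("c2", "Product X", 950.0),
    ("c3", "Product Y", 300.0),
    ("c4", "Product Y", 410.0),
    ("c5", "Product X", 1100.0),
]

# Collect lifetime revenues under each first-product segment
revenue_by_segment = defaultdict(list)
for _, first_product, revenue in customers:
    revenue_by_segment[first_product].append(revenue)

# Average lifetime revenue by first product purchased
avg_by_first = {
    product: sum(revenues) / len(revenues)
    for product, revenues in revenue_by_segment.items()
}
print(avg_by_first)
```

With these made-up numbers, customers whose first purchase was Product X average far more lifetime revenue than those who started with Product Y — exactly the kind of signal Blank argues a company can find in data it already has.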
“It involves brainstorming about what factors we know about prospects and clients, and throwing away some of the assumptions we’ve had about our business,” Blank says. “You need to sit down in a room and determine, what are the variables that will drive the outcome I want?” That could be data a company already has but isn’t tapping into. Or it could be data that can be easily collected with a simple policy change — for instance, asking customer service representatives to capture just one additional fact from clients during calls for entry into its CRM.
“A lot of this comes down to a cultural change in the organization,” Blank says. “Let people know you are testing these hypotheses. Share the data after you’ve done analyses, whether you’re proven right or proven wrong. People will start to get excited about it.”
Sometimes that culture change has to start at the top. Niraj says some managers can view big data as a threat to their leadership strategies. “Managers should be willing to sometimes be superseded by the data,” he says. “In the real world, management and data have to work together to get to things supported by the data. Once you see that data improves your decision, and is not a threat to your way of thinking and managing, you can use more and more of it.”
Singh cites university business schools as a rich resource for companies that want to build their big data analytics capabilities. Companies get the assistance of budding data analysts and data scientists, but the students also get an invaluable opportunity to work with what Singh calls “messy, ill-developed and unstructured data.”
“In the past, much instruction has been around problems carefully developed to give clarity to outcomes,” he says. “But the data that the world generates is messy — it’s missing variables, it’s not properly matched, some of the data is out of line with the metrics. Employers are saying, bring them to work with messy data, not clean data.”
Predictions for the future? Unpredictability
“Predictions for the long-term in big data analytics isn’t very useful, since things are changing in such fast, unpredictable ways,” says Singh. “What we need is the capacity to work with the unpredictable. Over-planning won’t work. You have to have short windows of experimenting … and be flexible and agile to take advantage of what happens in six months.”
As software used to analyze big data becomes increasingly open source, he says, the playing field for data use among all sorts of organizations will continue to be leveled. “We want to train people who don’t have huge machinery and big investments to mobilize the power of big data,” says Singh. “We are turning around the infrastructure to make it simpler, more available for people on the front lines.”
Daher agrees: “The proliferation of data has given smaller businesses the ability to be nimble. … Companies no doubt will need to evolve more rapidly.”
The size of the digital universe will double every two years, predicts EMC’s 2014 study, adding up to a 10-fold increase between 2013 and 2020. The study also estimated that the amount of data touched by the cloud will double from 20% in 2013 to 40% in 2020, and that emerging markets will account for 60% of data — a flip from 2013 numbers, when mature markets accounted for 60%. The number of connected devices will swell to 32 billion, representing 10% of the world’s data.
Investment in big data projects will swell as well — Forrester Research’s September 2016 forecast predicts that the big data tech market will grow three times faster than the overall tech market, with the biggest increases in adoption in pharmaceutical, transportation, and primary production industries.
The continued growth of data scientists on company payrolls will also increase the effective use of data. “The data science field is only starting out right now,” says Eeshan Srivastava, senior marketing analyst for Overstock.com, whose job straddles data science and marketing analytics. “Initially there wasn’t much experience in how to actually interpret that information to benefit a business. Now people have started learning to use the information very quickly. There are a lot of libraries coming out that are making a data scientist’s job much more easy … now they can plug and play and start getting results.”
A 2012 article in Harvard Business Review proclaimed data science to be “the sexiest job of the 21st century,” yet research by McKinsey Global Institute predicts that by 2018, the U.S. could face a shortage of up to 190,000 people “with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use big data to make effective decisions.”
Blank also predicts a continuation of the trend in democratization of data. “We’ll see a lot more tools in B2B and B2C that automate the data science tools,” says Blank. “The world is heading in the direction where companies will be able to ask more complex questions without having to hire Ph.D.s.”
Consumer experience is where Daher expects to see the most significant advances in the future — reducing the interactions that some consumers find creepy (like ads that follow them throughout their online life) and increasing the interactions that make their lives better. “We will be knowing and anticipating how to make a consumer’s life more enjoyable, getting rid of the noise and optimizing the consumer’s experience,” he says. “At some point, it will be the consumer guiding this because there is so much data available.”