Big data analytics is dead - long live data analytics, says CTO of Exasol
The big data world is changing
Big data is now a mainstream term, no longer just hype and a buzzword. However, continuing to focus on the term ‘big data’ misses the point: simply having and collecting more data is no longer enough. We have moved to a place where business-oriented ‘data strategies’ should be, indeed need to be, the focus.
The term ‘big data’, in its original sense of Hadoop and unstructured data, has begun to fade away. The big data revolution is evolving, and the way businesses extract value needs to evolve too: it needs to increase in sophistication and it needs a clear, joined-up strategy.
Today, Hadoop and co. are used to process and store the data, which still leaves the question of where to analyse it. This is the critical bit: storing data simply for fear of missing out is no longer an acceptable strategy for early adopters, and data is driving real value for organisations with the sophistication to put it to business use. Too often, ‘data lakes’ turn into passive ‘data reservoirs’ with their intrinsic value left untapped. To use a crude analogy for the new prevailing paradigm, it's not the size of your data but how you use it that counts.
This is why organisations must adopt new data strategies to stay ahead of the competition. Only companies that work their data hard for insights are able to optimise their business, open new routes to market or create innovative services and revenue streams. With the sheer variety of sectors and organisations looking to use data to enhance their business, there are almost as many strategies as there are companies; the only overarching philosophy that successful data-centric companies all subscribe to is pragmatism. This has led to companies building new data competence centres outside the IT department, reporting directly to the C-level (the CFO, the COO or directly to the CEO), shaking off the old conventional wisdom and bringing data analytics into the heart of the business.
In general terms, US companies are several years ahead of their UK counterparts when it comes to un-siloing their data analytics technology. Much of this can be put down to the Silicon Valley ethos of ‘moving fast and breaking things’. By moving first, many large American companies have been able to spread the benefits of data analysis throughout their organisations, while others have only just begun to grapple with what their data-gathering methods entail. This does not mean that other countries are far behind, as the culture that spurred this process is specific to industries rather than nations. There is no denying, however, that having multiple stakeholders take part in data projects across disparate business departments has accelerated the decentralisation of data analytics.
These exciting new technologies, springing up to address specific needs, are also replacing the ill-suited big software suites from large vendors such as Oracle or IBM. There are data projects that specialise in analysing complex graph structures, text sentiment analysis tools for making sense of your customers' support emails, and in-memory databases that give you fast access to your data analysis; each of these was built for specific requirements, in line with a specific data strategy.
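To give a flavour of what such specialised tooling does, here is a minimal, self-contained sketch of lexicon-based sentiment scoring for support emails. It is a toy illustration only, not a reference to any particular vendor's product; the word lists, threshold and sample emails are assumptions made purely for the example.

```python
# Toy lexicon-based sentiment scoring for customer support emails.
# Illustrative only: real sentiment tools use trained models and far
# richer language handling than this simple word-counting approach.

POSITIVE = {"great", "thanks", "helpful", "resolved", "happy", "excellent"}
NEGATIVE = {"broken", "refund", "angry", "waiting", "unacceptable", "cancel"}

def sentiment_score(email_text: str) -> float:
    """Return a score in [-1, 1]; negative values suggest an unhappy customer."""
    words = email_text.lower().split()
    pos = sum(word.strip(".,!?") in POSITIVE for word in words)
    neg = sum(word.strip(".,!?") in NEGATIVE for word in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Example: flag emails that likely need urgent attention.
emails = [
    "Thanks, the issue is resolved and the support was excellent!",
    "I have been waiting two weeks, this is unacceptable - I want a refund.",
]
for email in emails:
    score = sentiment_score(email)
    flag = "ESCALATE" if score < 0 else "OK"
    print(f"{flag:8s} score={score:+.2f}  {email[:50]}")
```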
Today, a heterogeneous, agile data ecosystem allows companies to do unprecedented things with data, opening up a whole new space to become better in business by using automated, predictive and prescriptive processes rather than just producing reports about the past. The measure of success for data projects has shifted firmly away from how much data is gathered to how quickly that data can be turned into useful information, a shift that is only accelerating with the democratisation of data and its much lower barrier to entry.
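To make the contrast between backward-looking reporting and predictive use of data concrete, here is a minimal sketch: the same monthly sales figures are first summarised as a historical report and then extrapolated one period ahead with a simple linear trend. The figures and the trend model are assumptions chosen purely for illustration.

```python
# Contrast: a descriptive report of past sales vs. a simple predictive step.
# The monthly figures and the linear-trend model are illustrative assumptions.

monthly_sales = [120, 132, 128, 141, 150, 158]  # last six months

# Descriptive: a report about the past.
print(f"Total: {sum(monthly_sales)}, "
      f"average: {sum(monthly_sales) / len(monthly_sales):.1f}")

# Predictive: fit a straight line (least squares) and project the next month.
n = len(monthly_sales)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(monthly_sales) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, monthly_sales)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
forecast = slope * n + intercept
print(f"Projected next month: {forecast:.1f}")
```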
A few sectors have already come out the other side of the big data hype cycle. The e-commerce and retail sectors have always been front-runners in adoption: they were successfully applying these techniques in the late 1990s, under the broad banner of ‘business intelligence’ or ‘data mining’. Any large organisation with millions of customers that is used to processing huge amounts of data has had an easy time stealing a march on those that aren't, which is why such organisations have been enthusiastic early adopters of one of the biggest game changers of all: AI integration.
Now that companies have gathered these frightening amounts of data, they have hit the size threshold needed to build effective machine learning models. This is why Facebook's face recognition is better than anything else on the planet - even humans. The company is sitting on petabytes of pictures and profile information, and having that much to work with, along with the right tools, has pushed the boundaries of what analytics is capable of.
New technologies such as artificial intelligence, in-memory databases, key/value stores, graph databases, stream processing tools and many more shift the question from technical limitations to the smart application of these technologies to create innovative insights from all kinds of data sources. As a consequence, the speed of adoption will keep increasing, and data will more and more often form the bedrock of companies' wider business strategies.
Mathias Golombek has been a member of Exasol's executive board since January 2014. In his role as chief technology officer he is responsible for all technical areas in the company, ranging from development, operations and technical support to professional consulting.