Different data types, in different formats and from different sources—what’s an analyst to do? This blog argues that it’s not the size of your data, but what you do with it that counts!
Data behaving badly
Posing an interesting question about data and its use has become de rigueur in the IT industry. As IT becomes ever more complex, with data points being collected all over an organization, businesses are struggling to get a grip on the situation.
The myriad applications and internal systems we use today, each doing its own thing in its own native format - some in the cloud and some on-premises - mean more and more data of varying quality and standards.
Salesforce, Netsuite, SAP, Workday, Hubspot and Marketo, for example, all handle data in their own way and present very specific challenges to integrators and analysts alike.
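To make the point concrete, here is a minimal sketch, in Python, of the mapping work every one of these integrations ends up repeating. The field names are invented for illustration rather than taken from any vendor's real schema: two exports arrive in two different shapes, and both have to be normalized into a single common record before anyone can analyze them together.

    # Hypothetical exports - field names are made up for illustration,
    # not any vendor's real schema.
    from datetime import datetime

    crm_export = [
        {"FirstName": "Ada", "LastName": "Lovelace",
         "Email": "Ada@Example.com", "CreatedDate": "2017-03-01T09:30:00Z"},
    ]
    marketing_export = [
        {"email_address": "grace@example.com", "full_name": "Grace Hopper",
         "created_at": "01/03/2017"},
    ]

    def normalize_crm(rec):
        # Map the CRM-style shape onto one common contact record.
        return {"email": rec["Email"].lower(),
                "name": f'{rec["FirstName"]} {rec["LastName"]}',
                "created": datetime.strptime(rec["CreatedDate"],
                                             "%Y-%m-%dT%H:%M:%SZ").date(),
                "source": "crm"}

    def normalize_marketing(rec):
        # Same target shape, entirely different source fields and date format.
        return {"email": rec["email_address"].lower(),
                "name": rec["full_name"],
                "created": datetime.strptime(rec["created_at"],
                                             "%d/%m/%Y").date(),
                "source": "marketing"}

    contacts = ([normalize_crm(r) for r in crm_export] +
                [normalize_marketing(r) for r in marketing_export])
    print(contacts)

Multiply that by every system, field and date format in the business and the scale of the integration burden becomes clear.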
Your IT department will be trying to solve these problems with manual quick fixes to link systems, but nearly half (48 percent) of respondents in a recent Vanson Bourne research study said they want to get rid of these compromised shortcuts within their organization. What's more, nearly a third (31 percent) said they had experienced errors, data loss, or privacy violations in their organization as a direct result of manual moves or ad-hoc coding to integrate data.
Disconnected data dilemma
Disconnected data can be measured in hard costs and missed business opportunities. The same Vanson Bourne study found that organizations in the US and UK are losing $140bn each year to disconnected data. Nearly half of the companies in the study (47 percent) believe disconnected data is negatively impacting their organization's ability to innovate, develop new products and services, and get them to market quickly. Their ability to engage, support, and meet the needs of customers (46 percent) suffers too.
A rigid data architecture and strict methods of processing data often create drawn-out lead times to get meaningful information from this critical asset. Data tantalizingly tempts business leaders with the allure of untapped fortunes, just waiting to be dipped into, if only they could get at it and use it efficiently and effectively. Add the geographical challenges of an international business and/or a regulated industry, and things get even more difficult.
Stifling data architectures
Whilst all this may or may not be interesting, isn't data just data after all? It's not about the quantity, and even data quality issues can be solved; it's about what you do with it and how flexible your approach is to leveraging what you have. Our data strategy, and the decisions we make today, will either hold us back and restrict us for years to come or make life much easier. Cost and speed to market can be cut drastically, with data as the key to doing this rather than the thing that gets in the way.
Now, an agile start-up with a data lake type architecture can pose a question to their data scientist in the morning and get a report back, complete with data insights, that same afternoon! Companies with more rigid systems, however, are being driven by their data and the limitations of their internal systems or architecture. They are losing control of their own operations, sometimes due to the complexities imposed by regulation, but more likely because of how data is being used and/or a lack of the related skills.
Instead of these companies driving their data to get what they need, their data has taken on a rigid personality of its own. They may not have the right combination of people, technology and architecture to help themselves. What they may have instead is a big storage tank collecting more and more data, which on its own is not very helpful. And accessing that data is not straightforward: Vanson Bourne highlights this trend, with 41 percent of respondents reporting that critical company data is trapped in legacy systems that cannot be accessed or linked to cloud services.
Data + Work = Results!
Business analysts are trying to cope as best they can, but the methods and ways of looking at data may need to change before things get better. Some vendors have tried to capitalize on the confusion by selling packaged analytics solutions that claim to serve up the information business leaders need, without doing the important and hard data work first. I wonder whether they're really solving a problem or just creating a bigger one. Aren't these new analytics and data services just adding another layer of detachment, at a time when that is the last thing that's needed? The hard work still has to be done by someone, and it will take a few more breakthroughs in AI before machines can do it for us. In the meantime, experts are needed - merely providing access to the data is not enough; you need people who can interpret it the right way, not just draw pleasant charts.
Surely it's better to roll up your sleeves, grab your data and get down and dirty with it - that is, if you can get hold of it in the first place. If you can get access to the data, the challenge becomes one of having the right tools to process it. Begin by manipulating and transforming your data into something useful, so you can use it in new, meaningful and helpful ways. Modern interpretations of the data warehouse are now adapting to allow more agile ways of working. This movement is being led by the data lake concept, which turns traditional data processing on its head and opens up analyses that can be completed in a fraction of the time they once took - if they could have been done at all - as the short sketch below illustrates. But all this requires deep, joined-up thinking across the organization.
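As a small illustration of that reversal, the sketch below assumes a toy "lake": raw events are landed as JSON lines with no upfront modeling, and structure is applied only at read time, when a specific question is asked. The directory layout, field names and the question itself are invented for the example.

    # Schema-on-read sketch: land raw events first, apply structure only
    # when a question is asked. Paths and fields are illustrative.
    import json
    from pathlib import Path

    lake = Path("lake/events")              # raw zone: data lands as-is
    lake.mkdir(parents=True, exist_ok=True)
    (lake / "2017-06-01.jsonl").write_text(
        '{"user": "ada", "action": "signup", "value": null}\n'
        '{"user": "grace", "action": "purchase", "value": 120.0}\n'
    )

    def read_events(keep):
        # Structure and filtering are applied per question, at read time.
        for path in lake.glob("*.jsonl"):
            for line in path.read_text().splitlines():
                event = json.loads(line)
                if keep(event):
                    yield event

    # This morning's question: how much revenue did purchases bring in?
    revenue = sum(e["value"] or 0.0
                  for e in read_events(lambda e: e["action"] == "purchase"))
    print(f"Purchase revenue: {revenue}")

The point is not the handful of lines of Python but the ordering: the data is captured before anyone decides what it must look like, so a new question does not require re-engineering the pipeline first.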
Business analysts, data scientists, data engineers, data architects, software developers and business users are all coming together to better understand the company data at their disposal and their data strategy. Yes, there will always be more data to worry about, but quantity is not what determines how useful your data will be. It is, always has been and always will be, what you do with your data and how you handle it that makes all the difference. For this is where the true untapped value lies.