In our previous posts we discussed the major trends that are shaping the future of corporate services and explored some practical immediate actions that corporate services can take.
In this post we dive beneath the surface and examine how key changes in Data and Analytics – a topic that cuts across several of the major trends previously discussed – impact the way corporate services operates.
Big Data has been a commonplace term for the last 10 years, and yet rarely has such an over-hyped term failed so completely to live up to expectations. The promise of bigger, better and broader insights failed to materialise for most organisations (with tech and media companies the notable exceptions), and most people went about their business. Part of the problem was that data resided in on-premise systems or hard-to-scale data centres, coupled with poor database structures that made “big data” hard to achieve.
The rise of Cloud (and, more importantly, people’s acceptance of it) is now starting to transform how we approach data, and with many scalable platforms available on the market, data has never been easier to obtain, categorise and model. We also anticipate a growing need to incorporate external data sets alongside internal information, with the realisation that some insights are driven as much by what is happening in the outside world as by what is happening within the organisation.
We still need good data governance, master data management and the like to be a key part of any organisation’s structure, but we are now starting to realise the promise of Big Data and proper BI – empowering corporate services to tackle the challenges they face effectively, because decisions can be based on better, more accessible data available at the right time.
A few years ago, people used to refer to a situation called “Spreadsheet Hell”: organisations drowning in thousands of spreadsheets being emailed around, stored on servers and generally making a nuisance of themselves. Many organisations are now starting to see the equivalent with dashboards, with too many dashboards being published and an increasingly complex set of self-serve options making it difficult for end users to find the reports they need. The most common root causes are poor design of the dashboards themselves, and poor governance processes around how reports are generated and published.
Well-designed dashboards should enable better decisions at every level of an organisation, incorporating KPI hierarchies and, importantly, helping managers and operators to uncover underlying issues and identify the actions that need to be taken. Now that visualisation tools are becoming almost as ubiquitous as spreadsheets once were, and as people get better at designing and building them (and the tools themselves keep improving), we will see fully connected dashboards become a core component of any corporate function’s armoury, allowing a seamless journey across linked dashboards, drilldowns and connected data sets.
If you find yourself having to read across several dashboards to reach an answer, ask yourself how your current dashboards could be better designed to achieve these connected insights across corporate services, and how your library of dashboards could be simplified to make the right dashboard easy to find in the first place. Decades ago, Excel was ‘just’ a spreadsheet tool used by the majority for presenting financial data and performing simple calculations; as people became better at designing and building tools within Excel, these spreadsheets matured into complex models and decision support tools. We will see the same evolution occurring with dashboards as we once did with spreadsheets.
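To make the idea of KPI hierarchies and drill-downs concrete, here is a minimal, library-free sketch. The structure and names (`kpi_tree`, `roll_up`, the cost categories and figures) are entirely illustrative, not drawn from any specific BI tool – the point is simply that a connected dashboard works off one consistent hierarchy, so every drill-down level rolls up to the same totals.

```python
# Hypothetical KPI hierarchy a connected dashboard could drill into.
# Leaf values are illustrative monthly costs; all names are invented.
kpi_tree = {
    "Operating Cost": {                                   # board-level KPI
        "HR Cost": {"Recruitment": 120, "Payroll Admin": 80},
        "Finance Cost": {"Accounts Payable": 150, "Reporting": 50},
    }
}

def roll_up(node):
    """Sum leaf values so every dashboard level shows a consistent total."""
    if isinstance(node, dict):
        return sum(roll_up(child) for child in node.values())
    return node

# A drill-down is then just a walk from the total towards the leaves:
total = roll_up(kpi_tree["Operating Cost"])               # 400
hr_cost = roll_up(kpi_tree["Operating Cost"]["HR Cost"])  # 200
```

Because each level is computed from the same underlying tree, the board view and the operational view can never disagree – which is exactly the property a well-governed library of linked dashboards is trying to achieve.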
Many organisations are now investing in Data Science capabilities in the hope that this will generate even more commercial insight and provide a competitive advantage – almost to the point of wanting a team of data scientists even if the business isn’t sure what to do with them! These black-box tools are not only difficult to wire up to the data; it is also difficult to know whether you have the right box in the first place. Whilst the results can be amazing, the effort and maintenance involved in getting them can prove too much for most people.
We see this changing – in fact, it already is. These black boxes are now being incorporated into reporting tools, meaning that our reports can include statistically or machine-generated forecasts without extensive data science work in the background. These black boxes even have names that are becoming more and more familiar, such as Prophet, XGBoost, TensorFlow and ARIMA-X.
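A minimal, library-free sketch gives a feel for what these embedded forecasters do. The example below uses simple exponential smoothing – far cruder than Prophet or an ARIMA model, and the function name and figures are invented for illustration – but the experience from the reporting tool’s side is the same: history goes in, a forecast comes out, and no data scientist is involved.

```python
# Illustrative stand-in for an embedded forecasting "black box".
# Real reporting tools would call out to Prophet, ARIMA, etc. instead.
def exp_smooth_forecast(history, alpha=0.5, periods=3):
    """Smooth the series, then project the final smoothed level forward."""
    level = history[0]
    for value in history[1:]:
        level = alpha * value + (1 - alpha) * level  # blend new data with trend
    return [level] * periods  # flat projection of the last smoothed level

monthly_sales = [100, 110, 105, 120]  # made-up historical figures
print(exp_smooth_forecast(monthly_sales, alpha=0.5, periods=3))
```

The value of the embedded versions is not the sophistication of any one model, but that the forecast appears directly in the report a manager already reads.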
BI is the practice of bringing information to the people who need it most, whether operators or decision makers. The rise of data visualisation tools has done away with pages of static reports and fixed analysis, but these interactive tools come at a price: an overload of information that often lacks insight, where finding the underlying trends can be like finding a needle in a haystack. It can also be difficult for people to layer commentary on top of the dashboards, making it hard to share insights across the organisation.
We are now seeing vendors incorporate artificial intelligence into visualisation tools that not only hunt out insights in the data but are now able to write commentary alongside them as well. Whilst these tools are at an early stage in their development lifecycle, it is not hard to imagine them becoming more and more sophisticated, forcing the business analyst to continue their evolution: from data downloader and spreadsheet preparer (before data warehouses and dashboards), to insight analyst and commentator (today’s role), to action analyst acting on insights produced by AI tools. With the rise of embedded workflow tools as well, how long will it be before AI finds the underlying issues and then tells the business what to do about them?
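At the simplest end, auto-generated commentary is just a rule that turns a metric movement into a sentence. The sketch below is hypothetical – real vendor features use far more sophisticated natural-language generation, and the function, thresholds and figures here are invented – but it illustrates the basic step from number to narrative.

```python
# Toy illustration of rule-based commentary generation.
# Thresholds and wording are arbitrary choices for this sketch.
def commentary(metric, current, previous):
    """Turn a period-on-period movement into a readable sentence."""
    change = (current - previous) / previous * 100
    direction = "up" if change > 0 else "down"
    note = f"{metric} is {direction} {abs(change):.1f}% versus the prior period"
    if abs(change) >= 10:  # arbitrary materiality threshold
        note += " - a significant movement worth investigating"
    return note + "."

print(commentary("Revenue", 1.15e6, 1.0e6))
```

The interesting part of the vendors’ work is not this sentence template but deciding *which* of thousands of metric movements deserves a sentence at all – which is where the AI comes in.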
While we are not there yet, augmented reality is not far away. And it isn’t going to be just about Pokémon GO; it will add extra dimensions to how we interact with data. We are already seeing start-ups provide fully immersive Virtual Reality environments where individuals can interact with data in ways impossible on a two-dimensional screen, and, more importantly, these environments allow teams of people to explore the data together remotely.
Whilst fully immersive virtual reality may be a step too far for everyday business users, it is not too difficult to imagine wearable tech interacting with your dashboard of choice to provide a depth of insight never seen before. The prospect of groups of people sharing the same augmented experience will take team working and even customer interactions to a whole new level!
There is an underlying assumption behind all of these points: that your data is in a good place and set up for success. The reality is that everything we’ve discussed here depends on having your data in the right place and the right structure to take full advantage of the developments happening in this space. When we work with our clients, the first task is to ensure there is a clear vision and strategy across the organisation for how analytics should be part of day-to-day decision making, closely followed by a data discovery piece to understand the data you have, how it is organised and structured, and how your data should be modelled (both from a data architecture perspective and within the data warehouses and data stores that you have).
Once these principles have been established, the rest of the journey involves collaboration between business users, IT and the data analytics community to be clear on the objectives for the analytics and how the tools will become embedded within day-to-day operations and decision making. This collaboration is essential not only to ensure successful adoption of new tools and ideas, but also to progress on your journey to becoming a data-driven organisation.