"Trust takes years to build and seconds to break," as the saying goes. This certainly applies to the smart society of the future, in which Data & Analytics play an essential role. Let's be clear about one thing: analyzing data for all sorts of applications is a positive development that will make our lives easier and enables us to make better decisions faster. But if we want to feel at ease in our smart society, we need a new trust model. Privacy and cyber security already dominate the public debate, but ethical issues are equally important if not more so. For instance, how do we integrate checks and balances into the systems that control your and my data. These are key questions for a smart future.
Data analysis is not new, but its scope, accessibility and especially its impact have increased exponentially in recent years due to technological developments. Sometimes the boundary between 'cool' and 'creepy' isn't clear-cut. Data analysis allows doctors to predict if and when an individual will develop a certain disease. Loans are granted based on the applicant's social media profile and network. Recently there was a fuss about cameras in billboards that counted the number of passersby, and currently there is a discussion about algorithms that can predict depression in first-year students. The automated guided vehicle will be one big algorithm that will have to deal with situations involving ethical choices: common sense dictates that no one will be in doubt when the choice is between accepting material damage and avoiding a child playing in the street.
Politicians, lawyers and ethicists are lagging behind these technological developments. Still, a clear framework of standards and values is no less necessary. The navigation system in a vehicle shows that this is more than just a detail. To get you quickly from point A to point B, a navigation system must meet a number of requirements: the quality of the map data must be good, the algorithm must take special circumstances into account (rerouting, for example) and the route advised must serve the interests of the driver. The latter is certainly not the case if the algorithm has a preference for suggesting routes that pass by a certain brand of gas station.
As this example shows, there is a risk that companies and governments won't always act in the interests of their customers or of society as a whole. What's more, such conduct is contrary to what society expects from data analysis these days. People want to know that their details are correct, used ethically, in a way they understand, by organizations they trust and for purposes they approve of. And above all, they want to know when something goes wrong. Our society is currently struggling with these questions, and rightfully so; answers must be found. It goes without saying that this trust cannot be imposed by policy makers or businesses.
We do not claim to have all the answers. However, based on our experience, we believe a trust model should be built on four building blocks: quality, effectiveness, integrity and flexibility. The quality of the data, and of the processes used to collect that data, is decisive. Ensuring quality requires quality controls, which will certainly demand additional effort. A recent KPMG study shows that less than half of organizations worldwide currently carry out such checks.
The effectiveness of the data analysis will be an essential factor in strengthening trust. In practice this means that the data must, first and foremost, provide added value for the organization as well as for the end user. Measuring effectiveness is extremely complex: the gap between raw data and the final application is significant.
Next, integrity will require further attention, and this is a new area: is everything in line with legislation? How transparent are the processes? Are ethical boundaries being crossed? Take, for example, the automated guided vehicle: who answers for the ethical choices built into its algorithm? External regulators will play a crucial role here.
And finally, the flexibility of data analysis systems must be taken into account. Because these systems constantly evolve, a trust model will need to anticipate extremely rapid developments: technological change, new legislation, cyber security threats and social change.
Strengthening our trust in data analysis requires investment and a willingness to enter into long-term commitments, and this in a rapidly changing environment. Social interests must be at the center, and there must be a willingness both to work closely with regulatory authorities and to be held publicly accountable. To be successful, this must happen step by step and in consultation with the various stakeholders. The real 'profit' to be made, and it will be invaluable, is the strengthening of social trust in a smart society. If we fail to earn that trust, we will be left with the road, but no way to move forward on it.