In the past few months, Facebook has been caught in a storm of accusations about the way it has managed users’ data. The issue is not that it has stored and aggregated huge amounts of this data, but how it has used, shared, and managed it. Data ethics are essential if we are to benefit from the opportunities arising from the likes of artificial intelligence and machine learning, while also acting as a digitally responsible society.
The incident, as uncomfortable and personal as it may be for the people responsible, is not unique. Not only are the amounts and types of data collected across all industries increasing at a rapid pace, but their actual use and management is hidden in long and very complex terms-and-conditions documents. Consumers rely on transparency but often get the opposite. That does not change the fact that data is the most promising “product” of the future.
IDC expects much of the data economy’s potential to empower vendors, public institutions, and citizens/consumers in their decision making to materialize in the next few years, driven by major developments in digital platforms, advanced analytics, cognitive computing, and AI. With these tools, the vast amounts of data generated by endpoints can be gathered, analyzed, and turned into value. Digital platforms are evolving rapidly and becoming ever more sophisticated. At a basic level, these products connect devices, collect and manage vast amounts of data, and surface new insights into vendors’ back-end systems or those of third parties. Their ability to support the development of new applications that can underpin better and faster decision making is crucial. There is just one issue that must be addressed before the “new Nirvana” of data can be realized: the concept of ethics and trust.
The data economy is based on one essential pillar — trust. I need to trust you to handle my data securely and privately. I need to trust that you don’t gather more than I have given you permission to gather. And in the end, I need to trust that the data is used only for the purposes we have agreed on. If you deliberately or even unwittingly breach our mutual understanding, then you will lose my loyalty and, maybe, also the most valuable thing of all — my personal data. So, to stay in business, governments, vendors, and others that tap into or create data lakes must apply a strict form of data ethics.
Ethics will become the new parameter for competitive advantage, and only companies with the highest moral standards and a governance framework to support them will succeed. Data is the new holy grail in the frenzied race for market share and potential global domination. It is no longer about how you gather and store data; it is about how you handle it. You can be ethical and transparent, or you can be out of business.
To find out more about the data ethics of today and tomorrow, contact Research Director Jonas Knudsen, a member of the IDC Future of Work practice and lead of IDC Health Insights.