The question of how to manage personal data and delineate between the individual and the corporate has been forced centre stage by technological development. Global Relay President and General Counsel Shannon Rogers takes a look at what brought us here and where we need to go.
“But it’s clear now that we didn’t do enough to prevent these tools from being used for harm, as well. And that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake. And I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”
Mark Zuckerberg, CEO of Facebook, testifying before a joint session of the US Senate’s Commerce and Judiciary committees on Facebook’s failure to protect users’ personal data from election interference, April 10, 2018.
Throughout the second decade of this century, personal data belonging to millions of Facebook users was collected without their consent by Cambridge Analytica, a UK consulting firm. This data was gathered through the third-party app “This is Your Digital Life,” which built psychological profiles of its users. It also aggregated the personal data of its users’ Facebook friends, via Facebook’s Open Graph platform. The result was 87m profiles available for use in political advertising. Cambridge Analytica then used the data to provide analytical assistance to the 2016 presidential campaigns of Ted Cruz and Donald Trump. It was also accused of interfering with the UK’s Brexit referendum, although investigations found this activity to be insignificant.
New focus on data ownership
Information on this data abuse came to light in 2018, when a former Cambridge Analytica employee disclosed details about the company’s data harvesting and political tampering in interviews with the Guardian and New York Times. In May 2018, Cambridge Analytica was forced to file for Chapter 7 bankruptcy. And in July 2019, the Federal Trade Commission announced a $5bn fine for Facebook’s privacy violations; the tech giant would also pay a £500,000 fine that October to the UK Information Commissioner’s Office.
Zuckerberg’s testimony to Congress was the culmination of a series of events that helped consumers everywhere understand that there was real value in their personal data. There had been a steadily rising tide of data protection legislation until that point, but most corporates and even consumers had treated it with relative indifference. Enter Facebook, Cambridge Analytica, and Zuckerberg to capture our attention. Suddenly there was a new focus on data capture, storage, its (mis)use, and its ownership.
Every keystroke made while shopping online, interacting with social media, or contacting a client through a corporate network creates a data (and metadata) trail that can be captured and used to build a data profile. Such profiles are susceptible to manipulation in ways that infringe on data privacy and pose serious risks to a person’s private information, especially when data is collected without their consent or knowledge. Facebook exposed how vulnerable our personal data can be.
At the turn of the century, data and the rights attached to it were barely considered when contracts and agreements were being negotiated. Some sophisticated corporates would refer to the data protection laws, but that was the extent of their relevance at the time.
The first shift in emphasis towards protecting people’s data autonomy and privacy was driven by regulatory change. Two significant milestones illustrate this. The first was the US state regulation 201 CMR 17.00, part of Massachusetts General Laws Chapter 93H, which came into effect in March 2010. The second was the grandparent of all data regulation (especially from a personal perspective): the EU’s General Data Protection Regulation (GDPR), which took effect in May 2018. The Massachusetts regulations spawned equivalents across the US at different times, while GDPR created seismic change that set a new standard for data governance globally, and has since been the blueprint for many nations that updated their own regulations.
One of the root causes of GDPR’s development was the uncomfortable mixing of personal and corporate data. The emergence of personal email services such as AOL, Yahoo, and Hotmail offered a new channel through which people could communicate with their clients. These communications were deemed the property of the corporate. However, executives would be working for a bank one day and emailing their clients from a personal email address; the next day they would move to a competitor and be able to access the same book of contacts using their personal email.
Equally problematic was the fact that personal dialogue, data, and extremely sensitive information were being exchanged and retained in corporate email. Some corporates had started to offer their own proprietary email networks for both internal and external contact, and employees had an implicit faith that these networks provided some basic level of privacy and security. But that was not the reality.
After this realization, it took about five years for corporates and individuals to educate, train, and adjust their habits to acceptably separate the personal and working worlds.
Complexity for corporate compliance
Technological innovation waits for no one, and messaging devices and formats continued to progress while this was happening. BlackBerry was the gadget of choice (containerized to achieve separation), instant messaging (IM) arrived, and then the smartphone became ubiquitous.
The route to compliance only got more complex for the corporate. GDPR was the natural response to this changing world and arrived in time for everyone to apply some sort of governance map to their use, retention, and treatment of data.
But innovation continued apace while the evolution of the regs remained in its infancy. Just as corporates thought it was safe to go forward, social media emerged to muddy the water.
The fundamental question that all of this posed for corporates was how to create an effective ring-fence in their communications, using policy and technology to protect and control both corporate and personal data appropriately. Every effort must be made to ensure that employees’ and customers’ personal and confidential information does not enter the corporate data pool and pollute it.
These changes have altered the obligations, practice, risks, and, ultimately, the rights of both corporates and individuals. The balance of power has definitely shifted towards the individual, for whom the new regulations provide a sword rather than a shield. Compliance has become much more expensive for organizations. Data deletion requests are onerous, and enforcement is more common and can be severe (data protection authorities in the EU can pursue the greater of €20m or 4 percent of annual global turnover).
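That penalty ceiling is a simple calculation: the greater of €20m or 4 percent of annual global turnover. A minimal sketch, using hypothetical turnover figures for illustration:

```python
def max_gdpr_fine_eur(annual_global_turnover_eur: float) -> float:
    """Upper bound on a GDPR administrative fine for the most serious
    infringements: the greater of EUR 20m or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A firm with EUR 2bn in turnover faces a ceiling of EUR 80m,
# while a firm with EUR 100m in turnover still faces the EUR 20m floor.
print(max_gdpr_fine_eur(2_000_000_000))  # 80000000.0
print(max_gdpr_fine_eur(100_000_000))    # 20000000.0
```

The 4 percent figure overtakes the fixed floor once turnover exceeds €500m, which is why the largest global firms face the steepest exposure.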
Firms in highly regulated industries, such as finance, have to balance their obligation to stringently monitor certain potentially high-risk employees, such as traders, with their need to comply with these heightened new data privacy regulations, which can be at odds with requirements to tackle market abuse. Sometimes they need to predict which watchdog might have a sharper bite.
The prevalence of personal mobile phone use presents an unsolved dilemma. Corporate data must be kept separate and secure from personal data, but the employer no longer controls the originating telephone number and device. Employees do not want to carry two phones or share a telephone number. Virtual phone numbers have come to the fore, paired with a container that holds corporate apps to achieve the required separation. The complexity has been magnified by the introduction of encryption, which adds an extra layer of protection but removes immediate transparency.
It all centers around data ownership, data control, the right to privacy, and whether the corporation has true authority to delete corporate data on its employees’ personal devices.
There is heightened awareness of what is at stake now. High-profile enforcement by data protection authorities, turf wars between Big Techs, and a larger-scale movement at the government level have exposed most to the power of analytics, the new role of artificial intelligence, and the value of data and metadata.
A clear demonstration of this magnified attention is Google being fined $56.6m in June 2020 by France’s top administrative court for breaching European Union online privacy rules. Meanwhile, Facebook is building a lawsuit over Apple’s iOS 14.5 update, which gives users control over which apps can track their data. And 2021 has delivered renewed impetus from governments worldwide for national sovereignty over their data.
Monetizing the data effectively
Innovative progress intensifies the need for appropriate regulation and further complicates the relationship between employer and employee. There is now an incentive to leverage the employee’s relationships and behavior on social media networks to improve brand visibility and expand the corporate’s reach. This temptation pushes the corporate into a position where revenue potential might drive it across this faint boundary and into the personal domain.
Corporates are only just beginning to understand how effectively personal data can be monetized, and that opportunity now lies before everyone. Consent is one mechanism used by those whose business model demands it; Google Maps users are one example. Despite all the regulation, this is open to abuse, and it irks many consumers, who have no choice but to give their consent when they have to use a monopoly provider. More parallels like this will present themselves in the corporate/employee bargain over the use and abuse of personal data.
Corporates need to treat data governance in the same way they do other business principles and regulatory standards that they adhere to, and hold sacrosanct. Their approach should reflect the culture of compliance that they adopt across all parts of their business. Data privacy is a challenge that is not going away; corporates could be on a dangerous, slippery slope if they don’t address their access to private data and more powerful technology from the top down and embed it throughout the whole business.