Targeted advertising relies upon a user’s individual habits online: she submits information on a retail site, then later visits a news site and finds those very same items, or similar ones, wedged between the stories on her screen. This is how targeted advertising functions: information that we largely deem uninteresting or trivial is used, by a larger group of online advertisers, to create a menu of objects that our information feeds back to us as future advertisements. We are shown what these advertisers deem we ought to see.
While online advertisements might seem harmless to most (what harm can be done by advertisers showing me the same blender in an advert today that I purposefully searched for yesterday?), the reality is that much more than my preference in blender types or speeds is recorded in the metadata these advertisers collect on me. Or, as Adam Rauh explains, “If data is the new oil, then metadata is the refinery; without it, you have no way of knowing or utilizing what you have.”
Metadata is created from our online searches and interactions with websites, which rely on information from our browsing habits to ensure that the advertisements we are shown are aimed specifically at us. When the advertisements refer to home restoration or electric automobiles, this might seem like a harmless mechanism. Will my human rights be abused if smaller cars are shown to me because I am a woman, while my male friends are shown advertisements for jeeps? Probably not. But if buying a jeep qualifies the buyer for a lower APR (Annual Percentage Rate), then yes, there is an ethical question afoot.
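The selection-and-exclusion mechanism at issue here can be sketched as a toy filter. Everything in this sketch is hypothetical (the attribute names, the ads, and the function are illustrations, not any real ad platform’s system); it only shows how an “exclude” rule quietly removes whole groups from an ad’s audience:

```python
# Toy sketch of attribute-based ad targeting (all names hypothetical).
# An ad is shown only if the user matches its "include" rules and
# falls into none of its "exclude" categories.

def eligible_ads(ads, user):
    """Return names of ads this user would be allowed to see."""
    shown = []
    for ad in ads:
        include = ad.get("include", {})
        exclude = ad.get("exclude", {})
        # Exclusion wins: a user in any excluded category never sees the ad.
        if any(user.get(attr) in values for attr, values in exclude.items()):
            continue
        # Otherwise the user must match every inclusion rule.
        if all(user.get(attr) in values for attr, values in include.items()):
            shown.append(ad["name"])
    return shown

ads = [
    {"name": "compact-car",  "include": {"sex": {"F"}}, "exclude": {}},
    {"name": "jeep-low-apr", "include": {},             "exclude": {"sex": {"F"}}},
]

print(eligible_ads(ads, {"sex": "F"}))  # ['compact-car']
print(eligible_ads(ads, {"sex": "M"}))  # ['jeep-low-apr']
```

In this toy example, the hypothetical lower-APR jeep ad is simply invisible to the excluded group, which is the ethical problem: the people filtered out never learn that an offer existed at all.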
And there have been concerns about targeted advertising, specifically the mechanisms used by Facebook, which has access to incredible amounts of personal data. Categories such as “ethnic affinity” were used by Facebook-affiliated marketing companies to exclude users on the basis of age, sex, postal code, and various other targeting categories, including ethnicity. Not only has this potential for online discrimination been well known and documented, but there is now a legal precedent for how Facebook handles targeted marketing, thanks in large part to the in-depth reporting from ProPublica and The New York Times in 2017, which demonstrated how the social-media giant allowed its advertisers to exclude people from seeing housing ads based on their “ethnic affinity.”
Since this exposé was published, two years of litigation and advocacy ended last month as civil-rights and labor groups reached historic legal agreements with Facebook. Under the terms of these settlements, Facebook has agreed to end the marketing practices that previously allowed creditors, landlords, employers, and similar marketers to explicitly target, and even exclude, people based on categories such as ethnicity and sex. What came out of the investigations was that not only were older people excluded from housing searches, but women were also excluded from job searches. These are serious abuses in how metadata is used to keep information out of the reach of more than half the population.
Are there good uses of data that don’t rely on processing the data into metadata? Sure. Many companies ethically process user information and private data for internal business purposes, as when our information is used by our online banking systems or by our employers. For instance, WorldERP, LLC is a software development company that sets out to revolutionize Contract Lifecycle Management (CLM) with its ‘Hyper-Converged’ ContractFX™ solution. CEO Glenn Summerfield explains, “Effective Contract Lifecycle Management extends far beyond creating, managing, and storing contract documents and information. It seamlessly integrates many applications into one, providing a panoramic view of every contract element to include forecasting future risks.” As long as the information collated from the data is kept internal to any one company’s practices and is not shared with third parties to be sourced out as metadata, we can safely assume that the data is used ethically.
Last month’s settlement with Facebook is good news for civil- and human-rights advocates, who have shown that corporations that traffic in data can be pressured into respecting long-standing laws established to protect users’ civil and human rights. The bigger question is how users can become more aware of how their data is handled, and even gain access to public mechanisms that let them test whether online marketing is discriminating against, or unethically targeted toward, any one group, lest we face a future of job adverts in which only young, white men see the advertisements.