Shadow Contact Information

The Newest Threat to Targeted Advertising?

Targeted advertising is the quintessential form of digital marketing in an era where cellphones, social media, and instant communication deliver content faster than ever before. This rapid exchange of information has necessitated innovations in advertising, such as advertisers contracting directly with Facebook and other large social media companies. However, a recent shift in public perception around the privacy of digital data has called into question the safeguards and procedures advertisers use when processing or exchanging personally identifiable information (PII).

A recent study led by Professor Alan Mislove and conducted with researchers from Northeastern and Princeton Universities uncovered unsettling practices in how Facebook collects personal contact information. Their study focused primarily on a concept they coined “shadow contact information”: personal contact data that a user never willingly handed over to Facebook, whether when creating a new profile or when setting up two-factor authentication. The latter is especially ironic, since two-factor authentication was conceived as an additional safeguard to protect personal accounts and data in the first place. Their method consisted of running test ads on Facebook against multiple sample accounts, using Facebook’s custom audience feature to establish a baseline of testable behavior.
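
To make the methodology concrete, a rough sketch of the kind of matching the custom audience feature performs might look like the following. This is illustrative Python only; it is neither the researchers’ code nor Facebook’s actual API, and every account name and identifier in it is hypothetical.

    # Illustrative sketch of custom-audience matching; not Facebook's API or the
    # researchers' code. All names and identifiers here are hypothetical.
    import hashlib

    def hash_identifier(identifier: str) -> str:
        # Audience uploads are typically matched on hashed, normalized identifiers
        # (for example, a SHA-256 digest of a lowercased email or a phone number).
        cleaned = identifier.strip().lower().replace("-", "").replace(" ", "")
        return hashlib.sha256(cleaned.encode("utf-8")).hexdigest()

    # The platform's view: each sample account and every identifier linked to it.
    sample_accounts = {
        "test_account_1": {hash_identifier("test1@example.com")},
        "test_account_2": {hash_identifier("test2@example.com")},
    }

    def ad_reaches(audience_upload: list[str], account: str) -> bool:
        # An ad can be delivered to an account if any identifier linked to that
        # account appears in the advertiser's uploaded custom audience.
        audience = {hash_identifier(x) for x in audience_upload}
        return bool(audience & sample_accounts[account])

    # Baseline test: target one known identifier and confirm that only the
    # matching sample account is reachable.
    print(ad_reaches(["test2@example.com"], "test_account_2"))  # True
    print(ad_reaches(["test2@example.com"], "test_account_1"))  # False

With a baseline like this in place, the researchers could vary which identifiers an ad targets and observe which of their sample accounts the ad actually reaches.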

The essence of their experiment surrounding shadow contact information is captured in the following example: User 1 signs up for Facebook and uploads their personal contact list in order to find potential friends and other users. User 1’s contact list includes email addresses or phone numbers associated with User 2 that User 2 deliberately withheld, having provided only a single phone number or email when they originally signed up for Facebook. Mislove’s study verified that Facebook will make the contact information in User 1’s list, including User 2’s, available for ad targeting, even though it was never provided to Facebook nor intended for consumption by its third-party advertisers. This phenomenon strips individual users of the ability to keep their contact information away from unwanted advertisers. As Gizmodo’s analysis of the research put it, “we own our contact books”, but that does not change the fact that User 2 would have no knowledge of this exchange of personally identifiable information from User 1 to Facebook to a variety of independent advertisers.
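
Sketched in the same hypothetical terms, the User 1 / User 2 scenario plays out roughly as follows. Again, this is an assumption-laden illustration of the behavior described above, not Facebook’s implementation; the accounts and numbers are invented.

    # Hypothetical walk-through of the User 1 / User 2 example above; purely
    # illustrative, not Facebook's implementation.
    import hashlib

    def h(identifier: str) -> str:
        return hashlib.sha256(identifier.strip().lower().encode("utf-8")).hexdigest()

    # User 2 signs up and deliberately provides only one email address.
    profiles = {"user_2": {h("user2@example.com")}}

    # User 1 uploads a contact book to find friends. It happens to include a
    # phone number for User 2 that User 2 never gave to the platform.
    uploaded_contacts = {"user_2": ["user2@example.com", "+1 555 0199"]}

    # The platform links the extra ("shadow") identifier to User 2's profile.
    for person, contacts in uploaded_contacts.items():
        profiles.setdefault(person, set()).update(h(c) for c in contacts)

    # An advertiser later uploads a custom audience containing that phone number.
    advertiser_audience = {h("+1 555 0199")}

    # The ad now reaches User 2 through contact information User 2 never shared.
    print(bool(advertiser_audience & profiles["user_2"]))  # True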

As far as two-factor authentication is concerned, Facebook practices a similar, though considerably more involved, process (spelled out extensively in Mislove’s full research), prompting similar philosophical questions about the ethics of collecting security information, such as phone numbers provided for two-factor authentication, and then turning around and offloading that “security information” so advertisers can connect themselves to users more efficiently. Facebook is a private enterprise, and one that has already broken the mold by building the largest and most profitable social network on a free, non-subscription business model. They can only do so much to fend off the public backlash around data privacy in the wake of the Cambridge Analytica scandal, but their explanation for the unethical practices associated with shadow contact information reads as highly insufficient. Facebook has put forward little more than blanket statements about “using information people provide to create a more personalized user experience, including more relevant ads”. As detailed in Kashmir Hill’s article, Facebook PR representatives deny that shadow contact information is used directly for targeted advertising. Furthermore, Facebook has repeatedly denied requests, notably from UK users, to disclose the purpose of this collection. Yet acknowledging the advertising component of the data collection undercuts the validity of those earlier denials.

The marketing implications of shadow contact information are largely unknown, as Facebook, Pinterest, Twitter, and the other social media giants the researchers tested as part of the study have not formally disclosed its existence. The ethical questions attached to using this new form of data remain, though. Does collecting two-factor authentication information with the intent to profit from it undermine its purpose as a security safeguard? For creative agencies such as Hero Creative, these questions will only become more important, because jeopardizing digital security has material, real-world consequences. Targeted advertising may ensure users receive “more relevant ads”, but the continual hand-off of user data between various parties increases the risk of that data being stolen or breached. Additionally, does maintaining profit margins and business success through unethical advertising practices delegitimize a customer-oriented business model? Consumer trust is seriously at risk if large corporations like Facebook knowingly mislead users while hiding behind data privacy laws that do not require transparency about new forms of data.