Today’s column is written by Jim Kaskade, CEO at Janrain.
When the EU’s landmark General Data Protection Regulation (GDPR) went into effect last year, it, among other things, expanded the definition of personally identifiable information (PII) to include data related to IP address, biometrics, physical devices, location, race, ethnicity, religion and sexual orientation.
With more control over their privacy, EU residents should theoretically see fewer “creepy” ads if they refuse brands permission to use this highly personal data, as well as traditional identifiers such as name, address, birthdate, Social Security number or financial info. With more guardrails in place, it appeared that EU residents had the tools to limit what advertisers could learn about them.
But despite GDPR marking a significant step forward for digital consumer privacy, citizens may soon discover that it doesn’t protect them as thoroughly as they first anticipated. The reason: Just about everything is PII.
Brands can still glean a lot from the information they collect from users who remain nominally “anonymous.” That Spotify playlist of ’80s songs, for example, may offer clues to a listener’s age. A news feed could tip off political leanings or ethnic identity, while those Netflix action movies and documentaries might suggest the account holder’s gender.
Taken together, this derived data can potentially fill out a good deal of a personal profile, regardless of how little the user disclosed explicitly. Add in a few basic demographic pieces of information and advertisers have all the data they need to complete the picture before targeting consumers with ads that go beyond their surface-level interests.
In the United States, companies can still capture users’ locations, device IDs and other information that is classified as PII under GDPR without users’ permission. But the California Consumer Privacy Act (CCPA), which goes into effect in 2020, defines PII similarly to GDPR. It was enacted to keep an even tougher grassroots ballot initiative from reaching voters.
Regardless of future legislation, US-based advertisers and marketers must recognize the growing consumer awareness around privacy and realize it’s not just data security breaches that cause brand damage. Just as problematic is anything that results in a free-for-all for end-user information without the customer’s blessing.
True, advertisers are under pressure to take location, social media posts, app browsing, call logs, device IDs, IP addresses and other data that can still be derived without a green light from customers and turn it into invaluable insight. Consumers demand personalization, and brands are rewarded handsomely when they get it right. But the advertising industry should temper its urge to aggressively collect, buy, sell and trade customer data in 2019 and prepare early for the looming CCPA and potential federal legislation.
One way to ensure customer comfort and build trust in a company’s data practices is to embed principles of privacy by design, which GDPR now formally mandates in the EU, into every design, operational process and offering that touches consumers. For example, a footwear company can follow privacy by design by proactively assuming that fashion preferences (color, patterns), specifications (size, shoe type) and other personal details (shipping address) are not to be stored by default after a custom-made sneaker is delivered. No action is required on the part of the consumer. Personal information is only stored and used with explicit consent in context: “Would you like us to save this information for future orders?”
Marketers understandably shudder at the thought of tossing such valuable data that can be used to tailor future communications, but privacy by design seeks to create “win-win” or “give-to-get” scenarios. In this case, the footwear company will obtain the consumer’s explicit permission to send messages that only contain relevant offers related to the customized shoes upon completion of the transaction.
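The privacy-by-design default described above can be made concrete in code. The sketch below is a minimal, hypothetical illustration of the footwear example (all class and field names are assumptions, not any real system): personal details are discarded after fulfillment unless the customer has explicitly opted in.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class OrderDetails:
    """Personal details captured for a single custom order."""
    shipping_address: str
    shoe_size: str
    color_preference: str

class PrivacyByDesignStore:
    """Retains order details only when the customer explicitly opts in.

    Privacy is the default setting: with no action from the consumer,
    nothing is persisted after the order is fulfilled.
    """

    def __init__(self) -> None:
        self._saved: Dict[str, OrderDetails] = {}

    def complete_order(self, customer_id: str, details: OrderDetails,
                       consented_to_save: bool) -> None:
        # Fulfillment would happen here; afterward, details are kept
        # only if the customer answered yes to the in-context prompt
        # ("Would you like us to save this information for future orders?").
        if consented_to_save:
            self._saved[customer_id] = details
        # Otherwise the details simply go out of scope — never persisted.

    def saved_details(self, customer_id: str) -> Optional[OrderDetails]:
        return self._saved.get(customer_id)
```

Used this way, the "win-win" is explicit: a customer who declines loses nothing, while one who consents gets faster reorders and the brand earns permission for relevant follow-up offers.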
Another way to build trust is to leverage emerging best practices such as former Ontario information and privacy commissioner Ann Cavoukian’s groundbreaking Privacy by Design framework. This seminal white paper, which heavily influenced GDPR, is much more than a checklist of features needed to ensure consumer privacy and data security. It shows how to weave privacy (and the security it requires) deeply into the fabric of an organization, including its overall mindset and supporting business processes, not just its technology specifications.
But the simplest advice may be for marketers to “put themselves in their customers’ shoes” (no pun intended) and use data the way those customers would want, even if it results in pulling back on certain ad or email campaigns or cutting down a target list. Marketers will need to identify points in the customer journey where there’s a logical value exchange for consent to use personal data.
For example, an apparel company licensed by a major professional sports league may ask permission to send details about jersey offers when the customer is browsing an online catalog days before all-star weekend. An airline could ask permission to send deals for amenities related to a passenger’s flight via mobile phone or email. In each exchange, consent is earned in the context of the customer journey, which builds trust between consumer and brand. These exchanges can start when a browser is anonymous and continue well after a consumer becomes a registered user – ideally until they are a lifetime high-value customer who advocates for the brand by sharing with others in their network.
Marketers should tell customers what data they are collecting, what it is being used for and, most important, what’s in it for the customer. People tend not to mind relevant ads, targeted messages or personalized content when they have been informed and given an explanation for why they will be receiving them. Brands shouldn’t surprise their customers by sending a text out of the blue as they walk by a storefront, because many will be more than a little creeped out.
Marketers, advertisers and ad tech companies don’t yet need to earn the right to use every piece of customer data, but they will have to eventually. It would serve them well to learn how to work with their customers within the parameters of future privacy legislation. Those who act sooner will earn brand trust well ahead of their less proactive peers.