Data Privacy Week Q&A with Arielle Garcia, ASG Solutions

25th January 2024
Neutronian Q&A Series

Welcome to the very first mini-interview in our new Q&A Series. This blog series will feature various industry thought leaders and share their perspectives, opinions and predictions. Up first is Arielle Garcia, founder of ASG Solutions and former Chief Privacy Officer, UM Worldwide. 

It’s no secret that things move fast in the world of advertising, and the past year was no exception. In 2023, we saw several new state privacy laws go into effect, AI became a hot topic among the general population, and calls for heightened protections for children’s privacy and health data grew. 2024 is off to a fast-paced start too. January brought us stage one of Chrome’s cookie deprecation plans, along with the FTC’s first-of-its-kind location data settlement with X-Mode. It’s no wonder companies feel overwhelmed keeping up with regulatory requirements to stay compliant.

As old school as it sounds, sometimes going back to basics is the answer. Read on to hear from Arielle Garcia about what she’s seeing in the industry, what she thinks needs to be done, and how marketers should be thinking about consumer privacy.

Q: What do you believe are the biggest challenges facing the marketing and advertising industry in 2024?

A: The obvious answer to this question (for the past several years) would draw attention to the complexity of the regulatory environment, heightened enforcement activity, and accelerating signal loss.

In my view, it is more useful to understand these as downstream symptoms of a deeper challenge: a trust crisis born of a digital ecosystem and data economy that has become fundamentally imbalanced, unaccountable, and unfair.

So the real challenge isn’t navigating the imminent and ongoing disruption; it’s how we restore trust and create a healthier market.

How do we become more respectful of the expectations of our customers, and how do we position ourselves such that our growth is aligned with meeting those expectations, instead of beholden to the vision and the models designed and defined by big tech?

Through this lens, we can understand privacy laws as a reaction to the industry’s failures to treat consumers with care and respect – or to consider inadvertent harms to individual safety, liberty and equity enabled by commercial data practices. Practically speaking, this lens allows us to understand, for example, that solutions (ID-based, cohort, or otherwise) that are not underpinned by transparency, choice, and accountability are no more “durable” than their predecessors.

This lens lets us acknowledge that certain data – e.g. health, precise location – is inherently sensitive, and its collection and use should be limited to mitigate risk of harm to people.

We can understand the broader regulatory environment – like the Digital Services Act and Digital Markets Act in Europe – as an attempt to restore balance by creating transparency and choice for businesses and for people, while securing institutional and societal trust and safety.

Through this lens, uncertainty turns to optimism, as marketers and the broader ecosystem stand to benefit by not only embracing, but leading the change – asking the right questions, and establishing new standards that enable sustainable growth.

Q: We’ve had plenty of warning that the cookie apocalypse was coming – do you think most organizations are ready? What are you hearing?

A: As with the first question, there are two sides to this: near-term, tactical readiness, and long-term, sustainable strategy. The answer to both is: not really.

On the first point, everyone is at a different stage of readiness. Marketers will spend this year continuing to refine their strategies for sustaining addressability and attribution – acquainting themselves with Google’s Privacy Sandbox, testing identity solutions, wading through the fragmented Retail Media environment, exploring clean rooms, upleveling their 1st party data and customer experience strategies, and the like – all while needing to color within the shape-shifting lines of an equally complicated landscape of legal and regulatory obligations. Publishers, facing continued revenue pressure and falling referral traffic, will be looking for ways to appease buyer appetite and sustain audience monetization – also amidst regulatory change, while generative AI acceleration raises old hurdles and creates new ones.

Again, however, there’s a difference between merely mitigating disruption to sustain the status quo – which is what most seem to be focused on as 3rd party cookie deprecation nears – and shifting to more respectful and responsible data practices and partnerships. The latter is still largely treated as an afterthought, or “someone else’s job” – e.g. privacy is seen as predominantly a legal or compliance issue, and there’s deference to the platforms on what constitutes durability.

As laws and regulations increase the obligations on marketers to understand their data flows, evaluate their activities, and assess their partners, there is a natural opportunity to align near-term needs and long-term strategies – one that requires cross-functional engagement (e.g. marketing, privacy, legal, IT).

Q: If you could give one piece of privacy-related advice to a marketer – what would it be and why?

A: My advice – and really this is applicable to everyone in the ecosystem – is to begin incorporating this people-first ethos when evaluating business and investment decisions, whether related to product, partnership, strategy, or activation.

It’s easy to see privacy and the broader regulatory shifts as disconnected from or – worse – as an impediment to marketing and business objectives. In reality, marketers and their customers are both ill-served by the status quo and the false promises of the relentless pursuit of precision.

While this holds true for many reasons, one of the most striking examples I’ve seen comes from a comment a young woman submitted to the FTC during its 2022 request for comment on the Commercial Surveillance rulemaking.

She says, “As a teenager, I have grown up with the internet my whole life. My whole life since [I got a smartphone] I’ve had data collected on me. As a younger individual, I was confused as to why it felt like social media [companies]…needed all of this private information on me. Or, even as a young woman, I use an app to track my menstruation. It’s disturbing to know that they have this data on my reproductive health, why do they need it?

“I understand that they use our data for advertisement purposes but there’s been plenty of times where I click on a feature where it says it will stop suggesting ads to me that I don’t want but it does it anyways! It worries me, what will my life be like as an adult?”

This one comment from an ordinary citizen breaks through the illusion that personalization and precision necessarily mean greater effectiveness.  It points out not only the diminishing return awaiting a brand that crosses the line between relevance and exploitation, but also how this can be compounded by platforms’ failure to honor choices or provide effective controls. This is what happens when we lose sight of the human element. Spend is wasted. Trust is fractured. The platforms make money either way. 

Focus on strengthening legitimate relationships with your customers, built on trust and fairness – and partnerships that adhere to these same principles. Engage with your audience authentically and respectfully – reach them in environments that are trusted, with data and insights they’d expect you (and your partners, where relevant) to have and use.  

That requires taking a more intentional and thoughtful approach to defining audience and media strategy, but also to stewarding your investment – understanding your partners and their practices.

Thank you for reading the first post in our new Q&A Series. We look forward to sharing more perspectives, opinions, and discussions with other industry thought leaders.


 
