By Mathew Ratty
In the digital advertising ecosystem, reaching the right audience has become crucial for campaign success. Advertisers are therefore harnessing data-driven approaches to deliver highly targeted advertisements, ensuring the message reaches the intended audience. However, as competition has surged, they need new techniques to gain an edge in the market. In this context, Google’s Performance Max (PMax) emerged as a game changer, using AI to optimise the performance of advertisers’ campaigns in real time, resulting in more conversions and value.

However, recent revelations have brought to light that Google’s PMax AI might have inadvertently displayed ads to children on YouTube. This unintended consequence could potentially lead to companies tracking children’s online activities downstream for remarketing purposes, raising significant concerns regarding data privacy. Additionally, this situation has triggered questions about potential violations of the Children’s Online Privacy Protection Act (COPPA), a federal privacy law designed to safeguard children’s online information.

For advertisers aiming to connect with authentic audiences and boost their return on ad spend (ROAS), this development is indeed worrisome. The inadvertent collection of data they did not intend to gather presents a troubling scenario, which may undermine their efforts to maintain ethical data practices and effective audience targeting.

Decoding data: a double-edged sword

The aim of a data-driven approach has been to target a genuine audience and convert them into customers. However, when using a third-party tool such as Google’s PMax, advertisers may not get the level of transparency required to operate at full potential. Furthermore, as data is collected and retained automatically from a platform, the AI’s behaviour can shift owing to irrelevant data, resulting in advertisements being shown to a different audience than the intended target group. As a result, advertisers can experience wasted ad budgets.

Also, in a scenario where children click on an advertisement on YouTube placed by PMax, they are redirected to websites where their data is collected. As there is no filter on the type of data being collected, it blends with genuine user data, diluting its richness. By collecting children’s data, advertisers might also be unintentionally breaching privacy laws and become liable to face repercussions. This not only carries ethical and legal implications but also undermines the efforts of marketers. According to research by Cisco, 33% of users have terminated relationships with companies over data privacy concerns. Therefore, the need of the hour is a solution that prevents such unintended data propagation.

Data collection filter: need of the hour

As discussed, indiscriminate data collection carries potential pitfalls, and in this regard a data collection filter (DCF) can be a viable solution. TrafficGuard’s DCF is a decision-making solution that can block post-click collection from inappropriate users, so that advertisers never collect, store, or track data they do not intend to use or that is potentially tainted from a compliance perspective. It acts as the first link in the website’s data chain, deciding which data about incoming visitors is passed on to the marketing and analytics tools downstream.

It automatically filters out data from visitors who arrive from a platform but do not engage with the website, collecting data only when a visitor meaningfully engages with the site and can be treated as a genuine user. This is advantageous for advertisers because visitors experience no difference with the DCF in place, and it does not stop anyone from visiting the website. It simply prevents data from visitors who do not reach a confidence threshold from being gathered by an advertiser’s downstream marketing and analytics tools, which effectively improves data quality.

Effectively integrating a data collection filter can significantly mitigate the risk of inadvertently collecting visitor data that advertisers wish to avoid, providing a safeguard against unintentional legal violations. The strategic use of a DCF also yields a tangible advantage by enabling advertisers to elevate their return on ad spend (ROAS) through a retargeting pool free of irrelevant users.

By harmonising this modern solution alongside tools like PMax or any other AI-driven platform, advertisers can harness data to its fullest potential, magnifying campaign impact while adhering diligently to stringent data privacy regulations.

The author is the co-founder and CEO of TrafficGuard
