Today, Adalytics released a report claiming that DoubleVerify failed to block or filter ads from running alongside adult pornography and, potentially, online child sexual abuse material (CSAM) on the website imgBB.com. We find this content abhorrent, and a core part of our mission is to ensure that advertiser dollars never support it.

While we were not given access to their data prior to publication — and we have consistently seen them misrepresent how our tags work and how client campaigns are executed — we have reviewed initial coverage about the research and want to correct the record while outlining steps that all verification providers should adopt. As always, our number one priority is to ensure the highest level of media safety for our customers. We take any perceived issues seriously and seek to address them immediately, collaboratively and transparently.

In Brief

  • The report claims that ads appeared beside objectionable and illegal content on a specific site, imgBB.com.
  • The site has a small advertising footprint, with an even smaller number of DV-measured ads — 0.000047% of the total media transactions we measured.
  • Customers using DV’s pre-bid and post-bid controls benefited from multiple layers of protection; DV’s blocking controls alone prevented tens of thousands of ads from appearing on the site in the past 30 days.
  • DV has taken immediate additional measures to block this site and affiliated sites for our customers, while we conduct our review.
  • DV has strict policies and processes in place to ensure illegal content is handled in accordance with the law, and we have regularly worked with law enforcement to assist in previous investigations concerning the online ecosystem.
  • Adalytics has a track record of publishing inaccurate reports, and we are actively reviewing their latest claims.

The Site: imgBB.com

The site featured in Adalytics’ report is imgBB.com, along with its sister domain ibb.co, which redirects to imgBB.com. ImgBB is a widely used image-hosting platform. According to SimilarWeb, the site — including traffic from ibb.co — receives nearly 21 million visits per month and is supported mostly through subscriptions, in addition to some level of advertising on its free tier.

Based on our data from the past 30 days, DV customer ad impressions on imgBB.com accounted for just 0.000047% of the total media transactions we measured, with only a limited number of clients running ads on the site. 

While the impression volume for our customers on this site was very small, we take this issue seriously. The vast majority of these ads appeared alongside neutral content, in large part due to the pre-bid controls used by many of DV’s customers. At the same time, DV has blocked tens of thousands of ads from serving on imgBB.com, enabling clients who choose to run on this site to avoid unsafe or unsuitable content through our classification and enforcement.

CSAM

Any content that is illegal or sexually exploitative is abhorrent and should not be available online. Any instances of CSAM should also be immediately reported to the appropriate authorities.

Importantly, there is no data in the Adalytics report that explicitly indicates DV client ads appeared alongside CSAM. At DV, we have strict policies and processes in place to ensure CSAM is handled in accordance with the law. Upon learning of the claims in Adalytics’ report, DV immediately contacted law enforcement to offer our assistance in identifying and addressing any illegal content. DV has regularly worked with law enforcement to assist in previous investigations concerning the online ecosystem.

Adult Pornography

DV’s classification system — designed for both pre-bid filtration and post-bid measurement — prioritizes site and page classification based on impression volume. (As an ad technology company, we cannot reasonably measure every URL or page on the internet at scale when ad impressions are extremely low.) Regardless of content category, classification is triggered once a page reaches a certain level of ad traffic. If a researcher identifies content on a URL, it does not necessarily mean the page has met the threshold required for classification; in that case, the page remains unclassified. However, customers using DV’s Authentic Brand Suitability (ABS) pre-bid solution can still choose to block unclassified pages, and the majority of ABS users leverage this safeguard today.
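As a purely illustrative sketch of the logic described above (the threshold value, function names, and categories below are hypothetical and do not reflect DV's actual systems), an impression-volume threshold and a block-unclassified setting might interact in a pre-bid decision as follows:

```python
# Hypothetical sketch only: names, thresholds, and categories are illustrative,
# not DV's actual implementation.
from typing import Optional

CLASSIFICATION_THRESHOLD = 1000  # assumed minimum ad traffic before a page is classified

def classify_page(url: str, page_impressions: int, known_categories: dict) -> Optional[str]:
    """Return a content category once a page has enough ad traffic, else None (unclassified)."""
    if page_impressions < CLASSIFICATION_THRESHOLD:
        return None  # below the assumed threshold: the page has not been classified yet
    return known_categories.get(url)  # e.g. "Adult & Sexual"

def pre_bid_decision(category: Optional[str], blocked_categories: set, block_unclassified: bool) -> str:
    """Decide whether to avoid a bid opportunity before an ad is bought."""
    if category is None:
        # Unclassified page: advertisers can opt in to avoiding these entirely.
        return "avoid" if block_unclassified else "allow"
    return "avoid" if category in blocked_categories else "allow"

# A low-traffic page is unclassified; it is avoided only when the client enables that safeguard.
category = classify_page("https://example.com/page", page_impressions=40, known_categories={})
print(pre_bid_decision(category, blocked_categories={"Adult & Sexual"}, block_unclassified=True))   # avoid
print(pre_bid_decision(category, blocked_categories={"Adult & Sexual"}, block_unclassified=False))  # allow
```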

For pages on imgBB.com that received sufficient traffic for classification, DV applied a range of content categories. Some pages contained pornographic material and were appropriately classified as “Adult & Sexual.” This classification appears in post-bid reporting and allows for pre-bid filtration when enabled. DV strongly advises all clients to activate pre-bid controls to prevent ads from running on inappropriate content.

In multiple instances, Adalytics has presented the presence of a DV tag on a page as evidence that our technology failed to block an ad. In reality, this is a misrepresentation of how verification works. The presence of a tag does not necessarily mean an ad was served due to a failure in our system — it may simply indicate that post-bid measurement was in place rather than pre-bid avoidance or blocking. Additionally, ad placement always follows client settings — meaning the appearance of an ad on a classified page may align with an advertiser’s specific campaign goals and configurations.
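To illustrate that distinction (again with hypothetical mode names and settings, not DV's actual tag or product configuration), the same tag on a page can mean very different things depending on how a campaign was set up:

```python
# Hypothetical sketch: the campaign modes and settings shown here are assumptions,
# not DV's actual tag or product configuration.
from dataclasses import dataclass, field

@dataclass
class CampaignSettings:
    mode: str                                   # "measure_only" reports placements; "block" also enforces avoidance
    blocked_categories: set = field(default_factory=set)

def interpret_tag(settings: CampaignSettings, page_category: str) -> str:
    """What the presence of a verification tag on a page implies for this campaign."""
    if settings.mode == "measure_only":
        # The tag fired to record where the ad ran; no blocking was configured.
        return "measured placement (post-bid reporting only)"
    if page_category in settings.blocked_categories:
        return "blocking was configured for this category"
    return "placement permitted by the advertiser's own settings"

print(interpret_tag(CampaignSettings("measure_only", {"Adult & Sexual"}), "Adult & Sexual"))
print(interpret_tag(CampaignSettings("block", {"Adult & Sexual"}), "News"))
```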

Additional Steps

In light of this report’s claims, DV is conducting an additional comprehensive review of ad-supported image-hosting sites on the open web that are within our system — even those with very small ad impression volumes — and placing them under stricter classification standards. Additionally, we are defining a mechanism to block anonymous, profile-based image-hosting sites at scale. We will share our findings with customers to help inform their brand safety strategies as they evaluate their campaigns and DV’s content classifications and brand suitability controls.

As another measure to drive greater clarity around potential content quality issues on sites like imgBB.com, we are creating a standalone, app- and site-level avoidance category dedicated specifically to protecting clients from advertising on peer-to-peer (P2P) sharing and streaming domains and apps that could be abused to host or distribute illegal content in violation of those sites’ policies. P2P file sharing is the distribution of digital media, such as software, videos, music and images, through an informal network of users who upload and download files. This will provide clients with greater clarity, granularity, and protection across campaigns.

Finally, while we are not a CSAM detection company, we believe we are in a position to continue to help prevent its potential monetization. To that end, we will work with third-party organizations and partners within the industry to combat this issue, in an effort to foster greater transparency and trust across the entire digital ecosystem.