The danger of disinformation needs a new collaboration

This article was first published in The Hill.

Surrounded by a seemingly endless swirl of digital media, today’s consumers are often left to face the dangers of disinformation without the knowledge or tools to combat them. To tackle this growing challenge, we must embrace technology solutions that help consumers verify the validity and truthfulness of digital content. Such a solution will require strong collaboration among government, media, and technology companies, who together can help create a consumer base that isn’t so easily misled by the content it consumes.

That’s why we at Adobe strongly support the Deepfake Task Force Act (S. 2559), which would establish a National Deepfake and Digital Provenance Task Force composed of members from the private sector, the federal government, and academia to address the problem of disinformation. We’re encouraged to see government leading the charge and bringing together the nation’s collective expertise to find a solution, one that we feel must be grounded in technology and championed across industries.

While disinformation has existed for centuries, those wishing to spread it have recently taken advantage of social media and easy-to-use editing technologies to do so at an alarming pace. In the last year alone, disinformation has eroded confidence in elections and spread deadly untruths about the effectiveness of COVID-19 vaccines.

And as artificial intelligence (AI) continues to advance, it will become even easier to manipulate all types of media — and even more difficult to detect manipulation when it occurs. Think altered photos, videos, audio — all with the intent to mislead.

The Deepfake Task Force Act focuses on an important aspect of the solution to disinformation: digital content provenance. The bill defines this as “the verifiable chronology of the origin and history of a piece of digital content, such as an image, video, audio recording, or electronic document.” In other words, the ability to see where a piece of content came from and what happened to it along the way. The stated goal of the Task Force is to explore how the development and implementation of digital provenance standards could assist in verifying online information and reducing the proliferation and impact of disinformation.

This is the same approach we took at Adobe when we founded the Content Authenticity Initiative, a provenance-based tool that attaches tamper-evident attribution data (name, location, and edit history) to a piece of media, allowing creative professionals to get credit for their work while empowering consumers with a new level of transparency about what they’re seeing online.
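To make the idea of tamper-evident attribution concrete, here is a minimal illustrative sketch, in Python, of how attribution data might be bound to a piece of media so that any later alteration becomes detectable. The field names, the signing key, and the HMAC scheme are assumptions chosen for brevity; the actual Content Authenticity Initiative specification is more involved.

```python
# A toy illustration of tamper-evident provenance data, not the actual
# Content Authenticity Initiative format. Field names and the signing
# scheme are assumptions made for the sake of a short example.
import hashlib
import hmac
import json

SECRET_KEY = b"publisher-signing-key"  # stand-in for a real private signing key


def make_manifest(media_bytes: bytes, author: str, location: str, edits: list[str]) -> dict:
    """Bundle attribution data with a hash of the media, then sign the bundle."""
    claim = {
        "media_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "author": author,
        "location": location,
        "edit_history": edits,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def verify_manifest(media_bytes: bytes, manifest: dict) -> bool:
    """Return True only if neither the media nor its attribution record was altered."""
    payload = json.dumps(manifest["claim"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    signature_ok = hmac.compare_digest(expected, manifest["signature"])
    media_ok = manifest["claim"]["media_sha256"] == hashlib.sha256(media_bytes).hexdigest()
    return signature_ok and media_ok


if __name__ == "__main__":
    photo = b"...raw image bytes..."
    manifest = make_manifest(photo, "Jane Photographer", "Washington, D.C.",
                             ["cropped", "adjusted exposure"])
    print(verify_manifest(photo, manifest))                # True: content matches its record
    print(verify_manifest(photo + b"edited", manifest))    # False: any change is detectable
```

The point of the design is that the consumer does not have to trust a platform’s label; the attribution travels with the content, and any mismatch between the media and its signed record is visible to whoever checks it.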

A provenance-based solution removes some of the hurdles social media platforms face in dealing with disinformation, hurdles that detection tools alone cannot clear. As we’ve seen from current practice, curating content with “disinformation” labels is often futile: by the time a post is labeled as such, millions of users have already seen it. Blocking or removing content altogether is also problematic, as it could erode users’ trust in platforms to uphold free speech. One mistake and no one will trust your judgment again.

But with provenance technology, the decision-making is left up to the consumer — not an editor, not social media platforms, not the government. An empowered consumer base could identify disinformation based on the characteristics of the content before it spreads, without waiting for an intermediary to label it. So rather than taking on the impractical (and, quite frankly, impossible) task of catching every bad actor, a provenance-based solution creates a place for good actors to be trusted. And it provides a critical backstop if AI-based detection tools cannot keep up with AI-based creations.

Tech companies have an important role to play here. We have vast networks of users and intimate knowledge of how our tools are used to create and share content. Therefore, we’re also the place where a provenance-based solution should exist. But we can’t do it alone.

For a provenance-based solution to truly work, we need investment and research initiatives to continue finding the best ways to provide these tools to consumers. We need government action to encourage the entire ecosystem to incorporate them. And we need industry-wide standards that companies can follow. Most importantly, we need educational efforts to ingrain an awareness of disinformation into the public’s core understanding of media and the internet.

There is much work ahead, but the creation of a National Deepfake and Digital Provenance Task Force is a crucial step in joining the knowledge, perspectives, and influence of the private and public sectors to address these challenges. We encourage the Senate to take up and pass this bill so we can work together in our collective fight against the dangers of disinformation.

Dana Rao is executive vice president, general counsel and corporate secretary at Adobe. He leads Adobe’s efforts around content authenticity.
