Each issue, The Progressive poses one question to a panel of expert voices—writers, thinkers, politicians, artists, and others who help shape the national conversation. For our June/July 2024 issue, we asked: What can people do to combat disinformation in the 2024 election season?
Walter Scheirer
Professor of engineering at the University of Notre Dame, author of A History of Fake Things on the Internet
In this age of the Internet, older communications media still exist, and they tend to be more trustworthy sources of information than social media. This is because the news that emanates from them is more often than not professionally produced, well-researched, and informed by reliable sources. One can turn to television, radio, or print: media whose demise has been greatly exaggerated and which still deliver high-quality reporting to the public.
Long-form journalism tends to be the best, since the extra time it takes to produce means the end product is more thoughtful and extensively fact-checked. Of course, sometimes the professional news media get things wrong, too, and excessive partisanship can tilt reporting away from the facts. It’s a good idea to check the reporting on a story from multiple sources to get the gist and to filter out potential misreporting. In a high-tech world, stay low-tech for your news.
Oren Etzioni
Former CEO of the Allen Institute for AI
We are facing a potential tsunami of disinformation in the run-up to the 2024 election. I’m particularly concerned about AI-enabled deepfakes. For this reason, I founded TrueMedia.org, a nonprofit offering a free tool for automatically assessing deepfakes and media manipulation. The tool analyzes social media posts containing video, images, or audio to determine whether this content has been manipulated using AI.
I encourage each of you to access this tool and share it widely. But technology alone is insufficient. Education is also crucial—people must understand the deepfake threat and question any information that seems too good (or bad) to be true. Media literacy skills are vital in today’s information landscape. With the right tools, knowledge, and regulations, we can safeguard our elections and protect the truth in 2024 and beyond.
Calli Schroeder
Senior Counsel and Global Privacy Counsel at the Electronic Privacy Information Center
While individuals can and should combat disinformation by fact-checking content before they spread it and by calling out known disinformation, digital platforms and AI developers have a much greater responsibility and opportunity to address the problem at its source. Advances in AI mean that deepfake images, video, and audio of politicians can be far too convincing for individuals to detect on their own.
AI developers need to build guardrails into their systems that prevent AI-generated content featuring political candidates from being created during elections. Digital platforms can allow disinformation to spread virtually unchecked, sometimes assisted by algorithms that promote disinformation when it generates engagement. These platforms should implement trust verification for fact-checked information or label disinformation to help combat its impact.
Brenda Victoria Castillo
President and CEO of the National Hispanic Media Coalition
We need to educate the people around us, in our everyday lives, about how to check sources and diversify the news we take in nearly every minute of the day. We have to encourage folks to step into the uncomfortable and to question the validity of what they encounter as active consumers. These days it's common to come across a manipulated image, a fabricated story, or an AI-generated deepfake on social media.
We have to take responsibility as a community to say, "No, you can't manipulate us anymore." There's no one coming to save us; the big online platforms are only out for viral advertising money, and our government is moving too slowly to truly prevent harm. It's up to us to educate our families, to make a difference, and to defend the truth.