Warner Says Russian Facebook Influencers Improved at Masking; Cornyn Fears More
It’s clear Russian adversaries have gotten better at masking social media influence campaigns, Senate Intelligence Committee Vice Chairman Mark Warner, D-Va., told reporters after a hearing on foreign interference. A day earlier, Facebook announced it removed 32 pages and accounts potentially linked to Russian disinformation efforts there and on Instagram. Like Warner, Facebook suggested the account holders, who weren't identified as Russian, are using more sophisticated methods (see 1807310067) to manipulate the platforms than those seen in the malicious activity during the 2016 election.
Warner told reporters “one of the most remarkable things” from Wednesday’s hearing was a comment from Graphika CEO John Kelly. He testified there are 25 to 30 times more bot-generated social media posts a day than genuine political posts, suggesting platforms have been overrun with disinformation.
Facebook’s announcement is “the tip of the iceberg,” Sen. John Cornyn, R-Texas, told us: “We’ve entered into a new era where social media can be used as a tool by nations, to be used by disgruntled employees, can be used by somebody who wants to drive down a stock price and short sell stock. It’s a whole new world, and we’ve got to figure it out.”
Warner questioned whether users have the right to know where content originates, especially when account holders are misrepresenting location, and whether users have the right to see if content is user- or bot-generated. Warner introduced the Honest Ads Act (S-1989), which would expand disclosure requirements for political ads on social media platforms.
Facebook's notification is an example of what all platforms should be doing more often, said Laura Rosenberger, German Marshall Fund's U.S. Alliance for Securing Democracy director. Transparency and exposure of these accounts are paramount, she said.
Chairman Richard Burr, R-N.C., called foreign social media influence a “national security issue” because 60 percent of the U.S. uses Facebook. Warner said in opening remarks he’s been “hard” on social media platforms, which at their best are a symbol of American potential.
New Knowledge Research Director Renee DiResta expects social media influencers to exploit more advanced technology like artificial intelligence. Private platforms need to be held accountable to ensure they are doing all they can to defend American values and democratic systems, she said. Oxford Internet Institute Director Philip Howard testified the time for industry self-regulation is over: Targeted campaigns are exploiting social media algorithms, fostering polarization, political rumormongering and voter discouragement through lies about polling access.
Burr said this strategy is rooted in the same sentiment as Cold War-era Russia: If it’s bad for the U.S., it’s good for Russia. Sen. Jim Risch, R-Idaho, said it’s an enormous, “if not impossible,” challenge to distinguish Russian bots from Americans, who have the right to spread whatever information they want. Rand Corp. Senior Behavioral Scientist Todd Helmus said it’s going to require constant research from platforms and interest groups to develop new techniques to combat this issue.