DOJ Exploring Enforcement Potential for AI Discrimination
DOJ is weighing the enforcement potential around AI-related discriminatory practices, Assistant Attorney General for Civil Rights Kristen Clarke said Tuesday. The Civil Rights Division is bolstering AI enforcement, education, outreach and interagency coordination, Clarke told an NTIA listening session. AI has the potential to perpetuate historical discrimination against marginalized communities through the use of web browsing activity, social media and app usage, she said: Consumer data is fed into “specialized algorithms” used to make critical decisions affecting everyday lives, including employment, banking, law enforcement, housing, credit and more. She called the intersection of AI and civil rights a “rapidly evolving policy and legislative landscape.”
The agency is considering submitting statements of interest in pending litigation, Clarke said. The division is connecting with agencies to develop AI ethical frameworks and guidelines. Policy or legislative solutions could offer approaches to addressing AI’s potential for discrimination, she said: DOJ is reviewing whether guidance on “algorithmic fairness and use of AI may be necessary and effective.”
The FTC should launch an FTC Act Section 18 rulemaking on data abuse, Commissioner Rebecca Kelly Slaughter said. A rulemaking, which she has recommended in the past (see 2107280061), could inform the congressional debate through an open public record, she said. The agency needs to target algorithmic “snake oil,” especially face-scanning technology, she said.
Slaughter noted most FTC enforcement is brought under its general authority using Section 5, which bars unfair or deceptive acts. She urged the agency to get creative in exploring authority under the Fair Credit Reporting and Equal Credit Opportunity acts. Poorly designed algorithms facilitate discriminatory harms through faulty inputs, faulty conclusions and a failure to audit or test those algorithms for discriminatory output, she said.
Sen. Ron Wyden, D-Ore., drew attention to his surveillance-related privacy legislation at a Cato Institute event Tuesday. The Fourth Amendment Is Not for Sale Act (see 2104210053) would change language that allows data brokers to sell Americans’ personal information to law enforcement and intelligence agencies without Foreign Intelligence Surveillance Act court oversight. Wyden said it would stop the government from “using its credit card” to “erase the constitutional rights of Americans.” The primary targets are telecom companies that sell consumer data to “sleazy” data brokers, which in turn share the data with bounty hunters, car salesmen and police, he said.
An Electronic Communications Privacy Act loophole lets companies like Verizon, AT&T, Google and Facebook share consumer metadata with third parties. “Once a tech or telecom company sells our information to somebody else, it’s basically open season,” Wyden said. “The data broker gets this information, and that information has no protection under ECPA or the third party doctrine. Shady middlemen are essentially above all regulation to sell our personal information to the government without any court order.” He called it a “shameless” run around the Fourth Amendment.