US Senate Bill Re-Introduces Suspicious Activity Reports for Social Media

Another challenge to Section 230 of the Communications Decency Act, which protects tech platforms from liability for various forms of content posted on them, has re-emerged with bipartisan support. It takes a page from the Bank Secrecy Act (BSA), but rather than requiring banks to file Suspicious Activity Reports (SARs), the bill would force tech companies to file “Suspicious Transmission Activity Reports” (STARs) for “illegal activity” on their platforms.

This week, Sens. Joe Manchin of West Virginia and John Cornyn of Texas reintroduced their “See Something, Say Something Online Act,” which would force tech companies “to report suspicious activity to law enforcement, similar to the way that banks are required to report suspicious transactions over $10,000 or others that might signal criminal activity.”

According to a summary document from Manchin’s office, companies are “largely shielded from liability for the actions taken by individuals on their platforms, lacking incentives to clean up illicit activity. Even when they do take action, they often just delete the data rather than turning it over to the appropriate authorities, making it more difficult for law enforcement to go after bad actors online. It is past time to hold these sites accountable, and for them to say something when they see something online.”

But many questions remain about why such a bill is needed, as well as concerns over what actions could fall under its broad umbrella and what data would be collected.

Anne Fauvre-Willis is COO at Oasis Labs, a company that focuses on data privacy. She says this is a prime example of a bill with good intentions in theory but costly implications in practice.

“I understand regulators want to put more onus on tech companies to protect their users, but this does the opposite,” said Fauvre-Willis in an email. “It violates individuals’ right to privacy and removes them from any sense of control of their data in an undeliberate way.”

No STARs? No Section 230 protections

The bill would create a system “similar to the Bank Secrecy Act by authorizing the creation of an office within the Department of Justice (DOJ) to act as the clearinghouse for these reports, similar to the Financial Crimes Enforcement Network (FinCEN) within the Department of Treasury,” according to a press release from Manchin’s office. 

The re-introduced version raises the threshold for what must be reported to “serious crimes,” which the release identifies as drug sales, hate crimes, murder or terrorism, in order to “ensure that users’ privacy remains safe.”

Read more: FinCEN Encourages Banks to Share Customer Information With Each Other

Tech companies would have to send STARs within 30 days of becoming aware of any such information. “Suspicious transmissions” could include a wide array of material, including a “public or private post, message, comment, tag, transaction, or any other user-generated content or transmission that commits, facilitates, incites, promotes, or otherwise assists the commission of a major crime.”

If companies choose not to file, they would be stripped of Section 230 protections, with the likely end result that they would be sued into oblivion.

By threatening to remove Section 230 protections for noncompliance, the bill makes filing STARs mandatory in practice if not in name. To continue to exist, companies would be forced to further transgress upon users’ data privacy.

STARs would be accompanied by a host of personal information associated with the post’s originator. 

They would include the name, location and identity information given to the platform; the time, origin and destination of the transmission; and any relevant text, information and metadata related to it. It’s not clear how wide or narrow that relevant information could be. Entities filing STARs would have to keep them on record for five years.

A blanket gag order means the targets of STARs would not be informed about them, and the reports would not be subject to Freedom of Information Act (FOIA) requests.

Additionally, the bill calls for the creation of an office within the DOJ to manage these reports, as well as a centralized online resource that any member of the public could use to report suspicious activity related to “major crimes” to law enforcement.

“With an overly broad definition of reporting ‘suspicious activity,’ the bill completely ignores consumer privacy protections and defaults to a world where the government knows best,” said Fauvre-Willis. 

“In practice what this means is that, if passed, companies would have to pass along large swaths of data that may be relevant but also very much may not be. This data could include sensitive information about individuals including emails, age, social security numbers and who knows what else.”

How STARs create a data honeypot

Compelling companies to regularly divulge personal information tied to the billions of posts, messages, tags and other actions people take every day seems like a sure way to create a massive honeypot of personal data, one with troubling implications.

“The ‘see something, say something’ approach has been thoroughly debunked in the offline context – as leading to invasions of privacy while not advancing public safety – and it would be even more negative in the context of online platforms,” said Nadine Strossen, a law professor at New York Law School and former president of the ACLU.

The bill specifically outlines the creation of a centralized online resource where people (anyone, seemingly) could report suspicious activity. Whether tech companies would then have to provide personal information on users who had been reported by members of the public is an open question the 11-page bill fails to address.

Read more: How FinCEN Became a Honeypot for Sensitive Personal Data

“Creating a clearinghouse for this data in a centralized system run by the federal government seems fraught for security risk,” said Fauvre-Willis. “Holding sensitive data is no easy task, and sharing it in a way that is safe and protected, even harder. And once the government has this data what will they do with it? This bill feels fraught with challenges and half-thinking.”

The avalanche of sensitive data this might produce could become a succulent honeypot for anyone interested in using it in ways limited only by the extent of their imagination.

“It’s creating a facility for the public to report bad tweets,” said Jerry Brito, the executive director of Coin Center, in a phone call. “Have you seen Twitter?”

Strossen said the legislation would also encourage and empower anyone to wreak havoc on particular users or platforms, simply by filing a STAR. 

“Given the vague, broad descriptions of ‘suspicious activity,’ which turn on subjective judgments, a limitless array of posts could be claimed to fit within them,” she said in an email. “People could weaponize this law to make life miserable for anyone from political opponents, to economic competitors, to individuals they dislike.”

Free speech, data privacy and decentralization

More broadly, Strossen said, “Plausible arguments can be made that this law violates platform users’ free speech and privacy rights, because the federal government deputizes platforms to monitor and disclose detailed information about their users’ communications.”

“Government can’t do an end-run around constitutional constraints on its own actions by forcing platforms to engage in spying and censorship that the government wouldn’t be permitted to engage in directly.”

Not only would the bill seemingly require companies to monitor direct messages they might not otherwise, it also discourages the adoption of end-to-end encryption. Such encryption would prevent companies from seeing into the messages individuals send, which could feasibly make them unable to comply with STAR filing requirements.

“What that means is that Twitter has to be searching, constantly monitoring your DMs for suspicious stuff,” said Brito. “And then informing on it. That’s problematic for all the reasons you can imagine.”

Read more: Google Down: The Perils of Centralization

Brito thinks the reaction among tech companies would actually be to move toward encryption, as Apple and WhatsApp have done, though he doesn’t think the term “private” in the bill refers specifically to encrypted communications.

“They’re going to say, ‘All of the communications that we provide on our platforms are end-to-end encrypted and so we can’t see into our customers’ communications,’” he said. “And then the government’s going to come back by saying, ‘Okay, we need a backdoor then.’ So that’s one thing. The other thing is it’s going to push folks towards decentralization.”

In decentralized systems, there isn’t one centralized body (or company) that can unilaterally decide to adhere to such regulation and begin to surveil users’ communications. 

The impending data deluge: Who is asking for this?

The BSA, from which this act borrows heavily, has led compliance officers to file SARs on anything that might conceivably create liability for their financial institutions.

As a result, banks have been filing more and more SARs, with the number nearly doubling over the last decade.

As a financial compliance lawyer described in an earlier interview, financial institutions have increasingly engaged in defensive SAR filing, turning what was once a thoughtful process into a box-checking exercise. Essentially, banks file large numbers of SARs to protect themselves from liability or fines for potential noncompliance with the BSA.

It’s hard to imagine this bill playing out any differently, only with STARs instead.

Brito also questioned whether the potential deluge of information is something law enforcement even wants. As the number of SARs has risen, FinCEN has shrunk, leaving relatively few people to analyze all the reports that come in and potentially limiting the quality of the intelligence the agency is trying to gather.

“Did the sponsors of this bill talk to law enforcement?” he asked. “Because as a result of this they could very well get tens of thousands of reports for whenever anybody uses the word bomb, for example, like ‘that club was the bomb.’ That doesn’t help them and they’re going to have to go through them all.”

This also doesn’t take into account that Facebook and other social media platforms already have compliance teams that work closely with law enforcement on these sorts of issues. Facebook and Instagram report and take down millions of instances of child pornography annually, for example. 

“Who is this meant to cover that isn’t already doing this today?” said Brito.

Squashing competition

For all the consternation around big tech and the antitrust legislation being rolled out to address it, yet another side effect of this bill would be to hamper the ability of other tech companies to compete with the already dominant platforms.

“As with any such burdensome regulation, another adverse impact would be to further entrench the already dominant online platforms, such as Facebook and Google, and to raise further barriers to entry for new, small companies,” said Strossen. “The giants have the resources to contend with the regulatory requirements, but their potential competitors do not.”

Content moderation itself is a tall task, one that requires resources, systems and attention. Creating additional obstacles, as this bill does, would sharply increase the upfront costs of getting into the game at all and provide myriad reasons why someone shouldn’t.

“This bill, like many that seek to regulate the internet before it, has the indirect effect of hurting small startups and entrepreneurs more than anything,” said Fauvre-Willis. “The more these bills go into action, the greater moat large companies have against small innovators. Facebook and Google can hire lawyers and teams to manage this process if they need to. An early stage company cannot. This has the unintended consequence of stifling innovation as a result.”
