. . . from the not-really-encrypted dept . . .
Karl Bode, Nov 12, 2020 | Original Techdirt article here.
In many ways, Zoom is an incredible success story. A relative unknown before the pandemic, the company saw its userbase explode from 10 million to 300 million users worldwide as of last April.
One problem: like so many modern tech companies, Zoom’s security and privacy practices weren’t up to snuff. Researchers found that the company’s "end-to-end encryption" didn’t actually exist. The company also came under fire for features that let employers track employees’ attention levels, and for sharing data with Facebook in ways that weren’t revealed in its privacy policies.
While the company has taken great strides to address most of these problems, it received a bit of a wrist slap from the FTC this week for misleading marketing and "a series of deceptive and unfair practices that undermined the security of its users."
A settlement (pdf) and related announcement make it clear that the company repeatedly misled consumers with its marketing, particularly on the issue of end-to-end encryption:
"In reality, Zoom maintained the cryptographic keys that could allow Zoom to access the content of its customers’ meetings, and secured its Zoom Meetings, in part, with a lower level of encryption than promised. Zoom’s misleading claims gave users a false sense of security, especially for those who used the company’s platform to discuss sensitive topics such as health and financial information.
The FTC also criticized Zoom for storing some meeting recordings unencrypted in the cloud for up to two months, despite marketing claims that meetings would be encrypted immediately following session completion. The agency also criticized Zoom for bypassing Safari malware detection when it installed ZoomOpener web server software as part of a Mac desktop application update in July 2018:
"Without the ZoomOpener web server, the Safari browser would have provided users with a warning box, prior to launching the Zoom app, that asked users if they wanted to launch the app. The complaint alleges that Zoom did not implement any offsetting measures to protect users’ security, and increased users’ risk of remote video surveillance by strangers. The software remained on users’ computers even after they deleted the Zoom app, and would automatically reinstall the Zoom app—without any user action—in certain circumstances."
The settlement itself isn’t much of one. As part of it, Zoom simply has to "establish and implement a comprehensive security program" and adhere to "a prohibition on privacy and security misrepresentations," stuff the company insists it has already done.
The settlement doesn’t come with any meaningful financial penalties or consumer compensation, leading some dissenting Democratic commissioners, like Commissioner Rebecca Kelly Slaughter, to argue it wasn’t really much of a settlement at all:
"Zoom is not required to offer redress, refunds, or even notice to its customers that material claims regarding the security of its services were false. This failure of the proposed settlement does a disservice to Zoom’s customers, and substantially limits the deterrence value of the case."
Again, Zoom should be applauded for taking many concrete steps to improve things since reports first surfaced that its privacy and security standards weren’t up to snuff. But it’s not clear that the FTC settlement, arriving late to the party and "requiring" the company to do a bunch of things it had already accomplished, really acts as much of a deterrent for the long line of companies that phone in their privacy and security standards. Especially when most of them get far less (if any) attention for similar behavior, in part because the FTC routinely lacks the resources to seriously police privacy at any real scale.