Nonprofit connects tech whistleblowers to improve AI oversight

A new nonprofit called Psst is building a platform that connects tech whistleblowers who share similar concerns, aiming to strengthen their collective voice when they report AI safety issues and other tech industry problems. The initiative addresses growing concern about the rapid, largely unregulated development of AI systems, where traditional oversight mechanisms have proven inadequate and individual employees face significant barriers to speaking out.
How it works: Psst stores encrypted information from tech workers who feel uneasy about their company’s practices, then connects individuals with similar complaints to help them present stronger cases as a group.
- The platform currently relies on Psst employees to match related complaints by hand, but the nonprofit is developing an algorithm to automate the process (see the sketch after this list).
- Pro bono lawyers are involved at every step to ensure potential whistleblowers understand their rights and legal protections.
- Workers can remain anonymous and keep their jobs while still sharing critical information with journalists, Congress members, or regulatory agencies.
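How such a matching system might work in code: Psst has not published its design, so the following is a minimal illustrative sketch, not the organization's actual implementation. It assumes each complaint is encrypted with a key the worker keeps, and that matching runs on coarse, non-identifying tags rather than on the complaint text itself; the names (Tip, submit_tip, find_matches) and the keyword-overlap (Jaccard) heuristic are all assumptions for illustration:

```python
# A minimal sketch of the escrow-and-match idea described above. Psst's real
# storage scheme and matching algorithm are not public; the Tip structure,
# submit_tip, find_matches, and the Jaccard-overlap heuristic are assumptions.
from dataclasses import dataclass
from cryptography.fernet import Fernet  # pip install cryptography

@dataclass
class Tip:
    company: str
    ciphertext: bytes    # complaint text, encrypted at rest
    keywords: set[str]   # coarse, non-identifying tags used for matching

VAULT: list[Tip] = []  # stand-in for the platform's encrypted store

def submit_tip(company: str, complaint: str, tags: set[str]) -> bytes:
    """Encrypt a complaint, store only the ciphertext, return the key."""
    key = Fernet.generate_key()
    VAULT.append(Tip(company, Fernet(key).encrypt(complaint.encode()), tags))
    return key  # held by the worker or their counsel, never by the vault

def find_matches(tip: Tip, threshold: float = 0.5) -> list[Tip]:
    """Flag stored tips about the same company with enough tag overlap."""
    matches = []
    for other in VAULT:
        if other is tip or other.company != tip.company:
            continue
        union = tip.keywords | other.keywords
        if union and len(tip.keywords & other.keywords) / len(union) >= threshold:
            matches.append(other)
    return matches
```

The key property in this sketch is that the vault never holds decryption keys, so even a matched group of tips stays unreadable until each worker chooses to come forward.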
Why this matters: With no federal AI regulation and only limited state oversight, a regulatory vacuum has formed in which whistleblowers are one of the few remaining mechanisms for monitoring potential AI harms.
- “You’ve got [an industry] that’s possibly more powerful than the government developing a technology at breakneck speed without guardrails and workers who may not have good options for how to raise alarms,” co-founder Jennifer Gibson told Semafor.
- AI systems are becoming increasingly opaque, even as massive investments and White House expectations pressure companies to present rosy portrayals of their technologies.
The barriers whistleblowers face: Tech companies have intensified efforts to prevent information leaks, making it harder for employees to share concerns about potential harms.
- Meta recently fired about 20 employees for sharing internal information with journalists and sued a former executive over her memoir.
- Companies are increasingly compartmentalizing information, preventing individual employees from seeing the complete picture of potential issues.
- Monster tech salaries raise the personal financial stakes for speaking out, while whistleblower protections remain unclear and often apply only when companies break specific laws that predate AI’s rise.
Key details: Founded last year, Psst operates with five employees and a network of lawyers who work pro bono but can collect contingency fees if workers pursue legal action and win monetary awards.
- The nonprofit receives funding from a private tech foundation and individual donors.
- It built its first prototype on Hush Line, an open-source tip line, and is now developing an updated version with a university tech lab.
- By pooling reports from multiple workers, the group makes it harder to trace a disclosure back to any single employee, protecting identities better than solo reporting.
What experts are saying: Former Biden administration official Suresh Venkatasubramanian emphasized the critical need for insider perspectives given current regulatory gaps.
- “We’re still at the very early stages of being able to understand how a lot of these systems work, and that’s critical information needed to understand their effectiveness,” he told Semafor.
- “In the absence of any ability to regulate these systems, we don’t have too many mechanisms to look under the hood. Whistleblowers [are] one way to get some insight.”
The broader context: Research from the Netherlands shows that most whistleblowers suffer severe anxiety, depression, and other mental health challenges, making support systems crucial for those considering speaking out.
- Psst joins organizations like SecureDrop and The Signals Network in supporting tech industry transparency, but its matching approach distinguishes it from existing platforms.
- While these platforms can enhance technology safety, experts note that companies still need robust internal reporting mechanisms to address employee concerns before they escalate to whistleblowing.