April 16-30 Newsletter
Hi all,
Thanks for bearing with me. Family duties pulled me offline over the weekend, but I’m back with the key tech-policy headlines you might’ve missed from April 16–30. Let’s jump in!
Court Undermines FCC’s Privacy Enforcement Authority: A federal appeals court delivered a blow to the Federal Communications Commission’s privacy enforcement. On April 17th, the U.S. Fifth Circuit voided a $57 million fine against AT&T for unauthorized sharing of customer location data. The court ruled that the FCC’s process violated AT&T’s Seventh Amendment right to a jury trial, citing the Supreme Court’s SEC v. Jarkesy decision. In essence, the judges found that the FCC cannot unilaterally levy such fines without giving companies their day in court. The decision not only voids AT&T’s penalty but throws into question the FCC’s broader forfeiture authority. The FCC had fined major carriers over $200 million in total for selling customer location data, but those actions are now under a legal cloud.
WDTM (Why does this matter)? This ruling sharply limits the FCC’s ability to punish telecom privacy violations under its current process. Without a fix, carriers might feel emboldened, knowing the FCC faces hurdles to enforce privacy rules. The decision has prompted calls for the FCC (and Congress) to reform enforcement practices to ensure due process while still holding bad actors accountable. In the meantime, the FCC’s privacy policing powers remain in limbo, potentially weakening consumer data protections.
Texas AG Seeks Revival of State Social Media Restrictions: Texas is pushing back against a court order blocking its new law aimed at protecting minors on social media. State Attorney General Ken Paxton filed an appeal in the Fifth Circuit to lift an injunction on the Securing Children Online through Parental Empowerment (SCOPE) Act (HB 18). The law would require social platforms to verify users’ ages and filter “harmful” content for anyone under 18—including posts that “promote” or “glorify” eating disorders, self-harm, substance abuse, or sexual exploitation. It also bars targeted ads to minors without parental consent. A federal judge in Austin had blocked these provisions as overly broad, vague, and likely unconstitutional on First Amendment grounds. Paxton argues the law is clear and necessary for child safety, urging the appellate court to let the state enforce the rules.
WDTM?: This legal battle is part of a larger national trend as states seek to rein in social media risks for teens. Texas’s law—like similar measures in Utah and elsewhere—tests the balance between protecting children online and upholding free speech rights. The outcome could set an important precedent. If Texas prevails, social media companies may face a wave of state-level age gating and content regulations. If the injunction stands, it reinforces constitutional limits on state control over online content. The case also highlights growing political pressure on Big Tech to address mental health and safety issues for younger users.
Congress Fast-Tracks Controversial ‘Take It Down’ Deepfakes Bill: Lawmakers passed the Take It Down Act, a bill intended to curb the spread of non-consensual deepfake pornography and other exploitative content online. The legislation imposes a duty-of-care on platforms to swiftly remove flagged intimate images and deepfakes, backed by harsh criminal penalties for violators. However, digital rights advocates have sounded alarms: even the Cyber Civil Rights Initiative—normally a supporter of revenge-porn laws—says it “cannot support this bill” due to vagueness and overbreadth that could undermine lawful speech. Critics note the bill is ripe for abuse; in fact, President Donald Trump openly cheered the Act and “bragged about how he plans to abuse” its takedown provisions against content he dislikes.
WDTM?: Platforms now face a 48‑hour clock: remove a flagged intimate image or deepfake on time or face penalties. While the law helps protect victims of non-consensual deepfakes, its vague wording is problematic. The language on what counts as “exploitative” is so broad that it may give people, especially politicians, sweeping power to demand content takedowns. Smaller sites will probably over‑censor to dodge liability, and the worst offenders may just migrate to offshore servers. Personally, until Congress tightens the scope and builds in stronger speech safeguards, I’m siding with the critics: the bill, as written, is too broad.
International Sting Shuts Down Stolen Data Marketplaces: The U.S. Department of Justice, working with Europol and others, dismantled two major cybercrime marketplaces known as “Cracked” and “Nulled” that traded in stolen login credentials, personal data, and hacking tools. In a coordinated operation spanning multiple countries, authorities seized servers and domain names for the sites. DOJ officials also unsealed charges against an administrator of Nulled. The crackdown is part of an international effort to disrupt the underground economy of data breaches and prevent “sextortion” and identity theft crimes facilitated by these platforms.
WDTM?: This is a big win against the online black market for stolen data. Without a doubt, there are many more out there, but at least shutting these down means fewer tools for criminals—and fewer victims. It also indicates an uptick in international collaboration to combat cybercrime. Recently, countries have started aligning more closely on cybercrime—especially after the UN adopted its first cybercrime treaty.
Federal Workforce Cuts Spur Worries Over Self-Driving Safety Oversight: Recent federal workforce reductions are raising alarms about conflicts of interest in tech regulation. DOGE ordered agency downsizing that ousted several National Highway Traffic Safety Administration (NHTSA) experts focused on autonomous vehicle safety. NHTSA, which is actively investigating Tesla’s Autopilot and self-driving tech, lost about 4% of its staff in February, including key members of its Vehicle Automation Safety team. Critics argue the cuts could weaken oversight just as Tesla is pushing to deploy “millions of robotaxis.” Given Musk’s dual role as a government adviser and Tesla CEO, observers see a “potential conflict of interest”—the firings could remove “regulatory hurdles” to Tesla’s plans, as one report noted. Even a Tesla manager has expressed that NHTSA needs more resources, not fewer, to ensure vehicle safety.
WDTM?: The tension between the pro-industry push for efficiency and the need for robust, independent regulation of emerging technologies is rearing its head here. Musk is influencing the rules while his company stands to benefit. Gutting NHTSA’s autonomous vehicle oversight team right as Tesla accelerates robotaxi deployment undermines the very checks meant to keep the public safe. It raises a critical question: when the regulators are gone, who’s left to ask hard questions about safety, accountability, and algorithmic error in self-driving systems?
Elon Musk Signals Departure from DOGE: Elon Musk announced plans to step back from his role in the Trump administration's Department of Government Efficiency (DOGE). While the White House indicated this move was anticipated, the timing coincides with a significant 71% drop in Tesla's profits, attributed to global protests and boycotts over Musk's government involvement. As a "special government employee," Musk is limited to 130 days of service per year, and reducing his involvement could extend his tenure. The exact timeline of his departure remains unclear.
WDTM?: As the primary architect and advocate of DOGE, Musk's exit leaves a leadership vacuum that raises questions about the program's future direction and effectiveness. While Musk has claimed that DOGE saved the government $160 billion, independent analyses suggest that these savings may be overstated and that the aggressive cost-cutting measures have led to diminished public services and morale within federal agencies. The implications of Musk's departure are multifaceted. Firstly, it challenges the sustainability of DOGE's objectives without its most prominent leader. Secondly, it prompts a reevaluation of the role that private sector leaders should play in public governance, especially when their approaches may not align with the complexities of governmental operations. Finally, it underscores the need for transparent and accountable leadership in initiatives that significantly impact public services and federal employees. In the absence of Musk's leadership, DOGE's future will depend on the ability of remaining officials to adapt the initiative's goals to the realities of public administration.
Big Tech Accused of Watering Down EU’s AI Code of Practice: A new report revealed that major tech firms, including Google, Meta, and Microsoft, significantly influenced the drafting process of the EU’s AI Code of Practice—softening several of its core commitments. Originally designed to set clear ethical boundaries for AI development ahead of the AI Act’s enforcement, the voluntary Code has reportedly lost provisions around transparency, data access, and human oversight due to industry pressure. Instead of mandatory disclosures or safeguards, the final version relies more heavily on self-reporting and generalized ethical principles. Civil society groups say they were sidelined in negotiations, and several have urged EU institutions to reconsider how industry consultations are conducted. EU regulators say the Code was always intended as a “living document,” but trust in the process has clearly eroded.
WDTM? This raises alarms about regulatory capture—when the entities meant to be regulated gain outsized influence over the process. In a moment where governments worldwide are scrambling to catch up to AI’s risks, the EU was seen as a leader in setting ethical AI standards. If that leadership is eroded by corporate pressure, other governments may follow suit, prioritizing industry cooperation over enforceable protections.
Russia Touts Drone Production While Admitting Frontline Shortages: Russia claimed to have produced over 1.5 million drones in 2024, but officials recently admitted the military still faces critical shortages on the Ukrainian front. Despite the flashy production figures, shortages are affecting frontline operations, particularly as Ukraine continues leveraging drone swarms to target Russian supply lines and infrastructure. Reports suggest Russian forces are turning to motorcycles and civilian vehicles as low-cost, evasive alternatives. There’s also growing evidence that Russia’s domestic drone manufacturing is struggling to scale under sanctions and resource constraints.
WDTM?: Drones have become the dominant force on the battlefield, and they are responsible for a significant proportion of casualties. For instance, Ukraine's strategic use of drones has allowed it to reclaim positions and target Russian infrastructure effectively. But President Putin's acknowledgment of these shortages highlights the strain on Russia's defense industry. It shows that the sanctions on Russia are working. On its own, the drone shortage won’t decide the war, but it knocks a chunk out of Russia's edge and gives Kyiv breathing room to keep pressing.
Yale New Haven Health Breach Exposes 5.5 Million Records: Yale New Haven Health reported a data breach, detected on March 8, 2025 and disclosed on April 11, 2025, affecting 5.5 million patient records, making it one of the most significant healthcare data breaches in recent history.
WDTM?: Healthcare data breaches are escalating in both frequency and severity. In March 2025 alone, 18 significant breaches were reported, each affecting over 10,000 individuals. I believe this trend highlights the urgent need for enhanced cybersecurity measures within the healthcare sector to protect patient data and maintain the integrity of healthcare services. The public is also losing confidence. Patients are questioning whether hospitals and insurers can keep their most private information safe, especially as breaches are hitting not just hospitals, but also third-party vendors and cloud providers. Some lawmakers, like Senators Ron Wyden and Mark Warner, have made strong calls for stricter breach reporting rules and minimum cybersecurity standards for healthcare providers. If this pattern holds, expect legal pressure on hospitals to rethink how they store and secure patient data.