GasCope
Sam Altman to Policymakers: AI Is Coming for Your Jobs (And Your Private Keys)

OpenAI CEO Sam Altman is telling U.S. policymakers to get their act together on AI regulation before things get weird. In a chat with Axios, the guy who basically invented ChatGPT warned that artificial intelligence is no longer science fiction — it's writing code and doing research that used to need whole teams of humans. The next generation of models will be even crazier, potentially helping scientists crack major discoveries and letting one person do the work of an entire company. That's great for progress, maybe less great for the rest of us.

In the crypto world, some industry veterans are already seeing the cracks form. Ledger CTO Charles Guillemet told CoinDesk that AI tools are making it cheaper and easier to find software vulnerabilities. Exploits that used to take months of reverse-engineering can now be pulled off in seconds with the right prompts. Last year, the crypto industry lost over $1.4 billion to hacks. Guillemet thinks that number could keep climbing. Plus, developers are increasingly relying on AI-generated code, which might introduce new bugs at scale. His solution? Stronger defenses like mathematically verified code, hardware wallets that keep private keys offline, and accepting that systems will fail. Nothing like a good cold wallet to keep your keys away from AI overlords, we guess.

Altman also flagged that AI could speed up drug discovery but also enable nastier cyberattacks and make harmful biological research easier. He thinks these threats could emerge within a year, which means government, tech firms, and security groups need to get on the same page ASAP. "We're not that far away from a world where there are incredibly capable open-source models that are very good at biology," he said. "The need for society to be resilient to terrorist groups using these models to try to create novel pathogens is no longer a theoretical thing." Nothing like existential biosecurity risks to brighten your Tuesday afternoon.

He also warned that a "world-shaking cyberattack" could drop as early as this year. Avoiding that will require, in his words, "a tremendous amount of work." Altman framed OpenAI's policy ideas as a starting point, basically a nudge to get the conversation going on how to handle systems that learn fast and operate across many fields. Using AI to defend against these attacks is also part of the plan. It's giving "letting the fox guard the henhouse" but make it tech bro.

On the idea of nationalizing OpenAI, Altman wasn't thrilled. His take: the biggest argument against it is that the U.S. needs to achieve "superintelligence" before rival nations do, and doing that as a government project would probably be a disaster. That said, he still thinks AI companies need to work closely with the U.S. government. Given his stake in OpenAI, his framing of both the urgency of regulation and the role of private companies in managing these risks could influence the firm's competitive position. Curious timing, that. Very curious.

Altman also sees energy as an area where things will move fast, since more processing power could keep costs down as AI demand spikes. And he's already seeing labor shifts: a programmer in 2026 works differently than one from 2025. His vision? AI becomes a utility, like electricity, embedded everywhere, while the cost of basic intelligence drops and only top-tier systems stay expensive. "You will have this personal super assistant running in the cloud. If you use it a lot or use it at high levels of intelligence you'll have a higher bill one month and if you use it less, you'll have a lower bill." So basically, AI will eventually come with a monthly subscription tier. Fun. Can't wait for premium AI and AI Basic, the latter of which will definitely hallucinate your tax returns.

"It's incredibly important that people building AI are high integrity, trustworthy people," Altman said. We'll see about that.

Publisher: gascope.com
Updated: Apr 6, 2026, 22:51 UTC

Disclaimer: This content is for information and entertainment purposes only. It does not constitute financial, investment, legal, or tax advice. Always do your own research and consult with qualified professionals before making any financial decisions.

See our Terms of Service, Privacy Policy, and Editorial Policy.