
What happened on April 15
On April 15, 2026, Snap Inc. cut 1,000 jobs — 16% of its total workforce — and gave a reason no other major corporation had ever stated so starkly: 65% of the company's code is now generated by artificial intelligence.
In a memo to staff, CEO Evan Spiegel wrote: "Rapid advancements in artificial intelligence enable our teams to reduce repetitive work, increase velocity, and better support our community, partners, and advertisers. We have already witnessed small squads leveraging AI tools to drive meaningful progress across several important initiatives."
By the afternoon, Snap's stock had risen nearly 8%. That sequence of events — mass layoffs followed by market celebration — is not new in the tech industry. But something about what happened at Snap feels different. Because this time, the company didn't just say it was cutting costs. It said exactly what was replacing the people it was letting go.
Snap expects annual cost reductions of more than $500 million by the second half of 2026. One thousand people paid for that number with their livelihoods.
This is not just Snap
Snap is the headline. The context is bigger.
So far in 2026, there have been over 96,000 layoffs in tech. At least four major companies have explicitly cited increased AI investment as the reason — Oracle, Amazon, Meta, and now Snap. Meta CEO Mark Zuckerberg called 2026 "the year that AI starts to dramatically change the way that we work," adding: "We're starting to see projects that used to require big teams now be accomplished by a single very talented person."
According to data from layoff tracking firms, 47.9% of the 96,000-plus tech layoffs in 2026 so far are a result of AI automation. Snap has laid off software engineers, machine learning engineers, data scientists, product managers, and even distinguished engineers and directors. The strategy is clear: eliminate execution roles, keep senior architects, and invest in AI infrastructure.
Google has already acknowledged that 75% of its new code is AI-generated. Snap is at 65%. These companies are moving in the same direction at similar speeds — and both are restructuring their human workforces in response.
The jobs being eliminated are not entry-level or incidental. They are the jobs that generations of students studied, competed, and prepared for. Computer science degrees. Software engineering careers. Data science roles. The traditional pipeline from school to stable tech employment — the one that parents still quote when advising their children — is being restructured in real time.
What kinds of roles are disappearing — and what is surviving
Most exposed are coders, QA engineers, and level-one support staff. The roles proving durable are those that depend on human judgment: architects, security and compliance engineers, and AI infrastructure engineers. To stay relevant, developers need to build skills in security, compliance, and cross-functional product thinking — areas where AI does not yet excel.
This is a precise and important distinction. AI is not eliminating the need for human intelligence. It is eliminating the need for human execution of defined, repeatable tasks. The programmer who writes the same kind of function thousands of times can be replaced. The engineer who decides what to build, why it matters, and whether the AI's output can be trusted — that person is more valuable than ever.
A Stanford study shows a 20% drop in the number of software developers between the ages of 22 and 25 since 2022. The disruption is not hitting the middle of the workforce. It is hitting the entry point — the moment when today's school students will step into their careers.
This is the generation in your classrooms right now.
What this means for schools — honestly
There is a version of this story that schools tell themselves to avoid discomfort: AI will create new jobs to replace the ones it eliminates. This has been true of previous technological revolutions, and it may be true again. But it has never been true immediately. And the children entering the workforce in the next five to ten years will arrive into a transition, not a stable new equilibrium.
The honest question for every school leader reading this is not "are we teaching AI?" The question is whether the education you are providing gives children the capacities that AI cannot replicate — and whether those capacities are being deliberately, systematically developed, or assumed.
Judgment. The ability to evaluate AI output, catch its errors, understand its limitations, and take responsibility for decisions made with its assistance. This is what kept the Perseverance rover operators in the loop even when Claude was planning the route. It is what keeps senior engineers employed even as AI writes 65% of the code. It is a capacity built over years of being asked to think, not just produce.
Curiosity. Not the performed curiosity of a student who asks questions to please a teacher, but the genuine, restless kind that drives a person to understand why something works, not just how to use it. This is the capacity that makes a person want to go further than the AI's output — to question it, improve it, and imagine what it could not.
Adaptability. The ability to learn new tools, new contexts, and new problems continuously — without waiting for a curriculum to catch up. The children who thrive in the next decade will not be the ones who learned the right things in school. They will be the ones who learned how to learn, and kept doing it.
These are not abstract virtues. They are the specific outcomes of a school designed to develop the whole child — not just the child who can pass an exam. They are what Cypher is designed to cultivate: an AI companion that asks the student to think, not think for them. They are what NEO builds through hands-on innovation labs where children work with real AI tools and face real problems. They are what the AI Bootcamp and Agentic AI for Kids programmes are structured around — not AI literacy as a subject, but AI capability as a practice.
The number that belongs in every school's strategy conversation
65% of new code at Snap. Written by AI. Yesterday, Google acknowledged 75%.
When the majority of a company's technical output is generated by AI, the humans who remain are not there to produce. They are there to direct, evaluate, and take responsibility. That is a completely different job description from the one that has defined tech employment for thirty years. And it requires a completely different kind of preparation.
The school that graduates a child who can use AI confidently, question it honestly, and direct it purposefully is not preparing that child for a job that still exists. It is preparing them for the one that is just beginning to take shape — and that will define the economy your students inherit.
Snap said it plainly: "small squads using AI tools" can now do the work previously handled by larger engineering teams. The squad that will lead is the one where every member knows how to think — not just how to type.