Editor: Mia Shah-Dand
Highlights
US visa policy drives more uncertainty.
Is the “Learn to code” movement dead?
Does AI coding really improve productivity?
The rise of “workslop.”
The dark side of “vibe coding.”
LLMs are short-circuiting. Is it time to redefine intelligence?
Beyond Large Language Models? New approaches?

US Visa Policy Uncertainties Drive Tech Talent Toward New Hubs
A nation long associated with sending its best talent to the US can now think more seriously about how to keep those workers closer to home. Because surely the stress of living near the in-laws is less of a burden than the worry of whether you’ll still have a visa tomorrow.
In 2012, Douglas Rushkoff, a media theorist who writes regularly for CNN.com and the author of “Program or Be Programmed: Ten Commands for a Digital Age,” said that it’s time Americans begin treating computer code the way we do the alphabet or arithmetic.
Code is the stuff that makes computer programs work – the list of commands that tells a word processor, a website, a video game, or an airplane navigation system what to do. That’s all software is: lines of code, written by people.
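To make that concrete, here is a minimal, purely illustrative sketch (our example, not from Rushkoff’s article) of what a few lines of code look like:

```python
# A few "lines of code, written by people": instructions that tell the
# computer exactly what to do -- here, counting the words in a sentence.
sentence = "Program or be programmed."
words = sentence.split()      # break the text into individual words
print(f"{len(words)} words")  # prints "4 words"
```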
Getting with the Program: Code Year
“It also has a recreational appeal; programming has become one of those manly arts that fit the rubric of Being Handy, like fixing a bicycle or building a deck. (Code Year isn’t aimed just at men; it trumpets support from Girl Develop It.) The Code Year campaign also taps into deeper feelings of inadequacy, much like those Charles Atlas ads in the back of comic books, only it’s about building up your mental muscles. If you can code, the implicit promise is that you will not be wiped out by the enormous waves of digital change sweeping through our economy and society.”
Code.org was launched in January 2013 by Iranian-American brothers Hadi Partovi and Ali Partovi, as a non-profit focused on making computer programming more accessible. In late February 2013, a month after launch, they released a video featuring Mark Zuckerberg, Bill Gates, Jack Dorsey, and other programmers and entrepreneurs on the importance of learning how to code.
Two weeks after the video’s release, TechCrunch reported that it had gone viral and received a lot of positive attention. Partovi raised about $10 million for Code.org from various tech companies and tech company founders.
Mark Zuckerberg, Bill Gates, President Obama, and a slew of celebrities are somehow part of an operation called Code.org, which promotes teaching kids to code. Supposedly coding is good for you and the initiative might actually get the young folks interested in computer science.
I see it as a ploy to sell more computers to schools. For the most part, middle schoolers—and even high schoolers—do not need to be taking computer science classes. These kids can’t even balance a checkbook or write in cursive, and they don’t get out nearly enough as it is.
Sam Altman’s Bold Warning: Learn AI or Get Left Behind in Tech’s Biggest Shift
The tech goalposts keep moving. Earlier this year, OpenAI CEO Sam Altman said that students should learn and master AI tools because coding in the workforce is quickly being automated. In an interview with Stratechery’s Ben Thompson, Altman said that being good at AI tools today is comparable to being good at coding when he was in high school: back then, coding was the tactical skill to have, and facility with AI tools has become the newer version of that. Many tech executives are now also using AI tools for coding.
Big Tech is spending tens of billions of dollars on AI infrastructure in 2025 alone, and companies from Meta to Microsoft are using AI to write and review code. At Meta’s LlamaCon conference in April 2025, CEO Mark Zuckerberg indicated that AI will take over half of the company’s software development within the next year. About 30% of new code at Google and Microsoft is AI-generated. In a sit-down chat with Zuckerberg, Microsoft CEO Satya Nadella noted that the exact percentage of code produced by AI varies based on the programming language: AI generates “fantastic” Python code, he said, but its C++ abilities are “not that great.”
Exactly six months ago, Dario Amodei, the CEO of massive AI company Anthropic, claimed that in half a year, AI would be “writing 90 percent of code.” And that was the worst-case scenario; in just three months, he predicted, we could hit a place where “essentially all” code is written by AI.
Anthropic launches Claude 4.5, touts better abilities, targets business customers
Anthropic unveiled the Claude 4.5 AI model on Monday, saying the newest version can code for longer uninterrupted stretches and handle finance and scientific tasks better, as the startup pushes deeper into enterprise AI.
OpenAI’s first-half revenue rises 16% to about $4.3 billion, The Information reports
OpenAI generated around $4.3 billion in revenue in the first half of 2025, about 16% more than it generated all of last year, The Information reported on Monday, citing financial disclosures to shareholders.
OpenAI said it burned $2.5 billion, in large part due to its research and development costs for developing artificial intelligence and for running ChatGPT, the report added.
Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity
When developers are allowed to use AI tools, they take 19% longer to complete issues—a significant slowdown that goes against developer beliefs and expert forecasts. This gap between perception and reality is striking: developers expected AI to speed them up by 24%, and even after experiencing the slowdown, they still believed AI had sped them up by 20%.
Office workers everywhere are awash in “workslop,” the term researchers have coined for AI-generated content that looks like it completes a task at work: pretty on paper, well organized and neatly formatted, but lacking substance on a closer read. It often becomes a headache for the person receiving the work.
Vibe-coded build system NX gets hacked, steals vibe-coders’ crypto
NX is build software. You write your code on your laptop, you press “build”, it runs NX, and you get a built version you can put onto your web server. If you could hack NX, you could hit a lot of projects. NX proudly declares itself “An AI-first build platform”. And that’s also how good NX is at security. So, nobody should be very surprised that NX got hacked to steal its users’ crypto wallets.
Author/Editor: Hessie Jones
Video snippet of the conversation between Sam Altman and David Deutsch, author of “The Beginning of Infinity,” who argues that AI cannot be transformed into superintelligence.
LLMs Are Short-Circuiting. Is it Time to Redefine Intelligence?
The culmination of Hessie Jones’ interviews with Mounir Shita, Gary Marcus, and Marc Fawzi on where LLMs fall short on the path to surpassing human intelligence.
The Three Laws of Intelligence
Mounir Shita is a 30-year veteran AGI researcher and founder of EraNova Global, a physics-first program on the nature of intelligence. This article from Shita’s Substack dives into the three laws of intelligence derived from his book, “Theory of General Intelligence.”
The Book of Why by Judea Pearl
Mounir Shita references Judea Pearl’s work and fundamentals of cause and effect, necessary to achieve AGI.
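For readers unfamiliar with Pearl, the core formal idea, stated here in standard causal-inference notation rather than quoted from Shita, is that observing a variable is not the same as intervening on it; the backdoor adjustment recovers the interventional distribution from observational data when a set of variables z blocks all confounding paths:

```latex
% Seeing vs. doing (Pearl): conditioning differs from intervention.
P(y \mid x) \;\neq\; P(y \mid \mathrm{do}(x))
% Backdoor adjustment: if z blocks all confounding (backdoor) paths,
P(y \mid \mathrm{do}(x)) \;=\; \sum_{z} P(y \mid x, z)\, P(z)
```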
Mounir Shita also references Jeff Hawkins’ book, “On Intelligence.” “The book explains Hawkins’ memory-prediction framework theory of the brain and describes some of its consequences.”
Marc Fawzi writes under the pseudonym Nakamato Damacy on Substack. “Uncoded is the founder-led journal of Understory — a platform and app designed as a community infrastructure for ethical AI and critical thinking.”
Architecture for a Reliable and Coherent World Model
Marc Fawzi’s notes detailing the layers required to build AI models. He provides a view on why LLMs are not, by themselves, the answer, and then offers a “Practical AGI Milestone: Layer Alignment at Scale.”
Inference Scaling and the Log-x Chart
Toby Ord is a senior researcher at Oxford University’s AI Governance Initiative. In his blog post, “Inference Scaling and the Log-x Chart,” he revisits two charts first introduced last year alongside OpenAI’s o3 model. The left-hand chart shows the results of scaling training compute, and the right-hand chart the results of scaling inference compute. He concludes that “the compute (and thus the financial costs and energy use) needs to go up exponentially in order to keep making constant progress.”
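To see why a log-x axis implies exponential costs, here is a minimal sketch assuming a toy linear-in-log model; the coefficients are invented for illustration and are not Ord’s or OpenAI’s numbers:

```python
import math

# Toy model: benchmark score is linear in log10(compute), which is what a
# straight line on a log-x chart depicts. Coefficients are hypothetical.
a, b = 20.0, 15.0  # 20% at one compute unit, +15 points per 10x compute

def score(compute: float) -> float:
    """Score under the toy linear-in-log model."""
    return a + b * math.log10(compute)

def compute_needed(target: float) -> float:
    """Invert the model: compute required to reach a target score."""
    return 10 ** ((target - a) / b)

for target in (50, 65, 80):
    print(f"score {target}: {compute_needed(target):,.0f} compute units")
# Each constant +15-point gain multiplies the required compute by 10:
# 100, then 1,000, then 10,000 units.
```

Constant vertical progress on such a chart requires multiplicative, i.e. exponential, growth on the horizontal axis, which is exactly Ord’s point about cost and energy.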
The Cost of Compute: A $7 trillion race to scale data centers
McKinsey recently reported that by 2030, “data centers are projected to require $6.7 trillion worldwide to keep pace with the demand for compute power.” The capital expenditure for AI-ready data centers alone is projected to be $5.2 trillion.