Code snippet showing ‘Keep Building!’ printed in multiple programming languages including Python, Java, JavaScript, and C++.
Technical Insights

How to Become a Multilingual Coder: AI makes it easy to code in any programming language — especially if you know just one.

Even though I’m a much better Python developer than JavaScript developer, with AI assistance, I’ve been writing a lot of JavaScript code recently.
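For instance, a single chat-completion call can port a small function from a language you know into one you don’t. A minimal sketch, assuming the OpenAI Python SDK and an API key in the environment; the model name and prompt wording are illustrative, and any capable code model would do.

```python
# Minimal sketch: asking an LLM to port a small Python function to JavaScript.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name and prompt wording are illustrative, not prescriptive.
from openai import OpenAI

client = OpenAI()

python_source = '''
def greet(name):
    return f"Keep Building, {name}!"
'''

response = client.chat.completions.create(
    model="gpt-4o",  # any capable code model works here
    messages=[
        {"role": "system", "content": "You are a careful code translator."},
        {"role": "user",
         "content": f"Translate this Python function to idiomatic JavaScript:\n{python_source}"},
    ],
)

print(response.choices[0].message.content)
```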
Cartoon of two coworkers coding; one struggles with evaluations, the other iterates quickly through model updates and test cases.
Technical Insights

We Iterate on Models. We Can Iterate on Evals, Too: Building automated evals doesn’t need to be a huge investment. Start with a few quick-and-dirty examples and iterate!

I’ve noticed that many GenAI application projects add automated evaluations (evals) of the system’s output later than they should, and rely on humans to manually examine and judge outputs for longer than they should.
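A first eval can be as small as a handful of hard-coded examples and a string check. In the sketch below, `generate` is a placeholder for whatever calls your model, and the test cases are made up; the point is to get a pass rate you can start iterating on.

```python
# A quick-and-dirty eval harness: a handful of examples and a simple check.
# `generate` stands in for whatever function calls your model; the test
# cases and the substring check are placeholders to refine over time.

def generate(prompt: str) -> str:
    raise NotImplementedError("Call your LLM or application here.")

EXAMPLES = [
    ("What is the capital of France?", "Paris"),
    ("Summarize: The meeting moved to 3pm.", "3pm"),
]

def run_evals() -> float:
    passed = 0
    for prompt, expected in EXAMPLES:
        output = generate(prompt)
        if expected.lower() in output.lower():
            passed += 1
        else:
            print(f"FAIL: {prompt!r} -> {output!r}")
    return passed / len(EXAMPLES)

if __name__ == "__main__":
    print(f"Pass rate: {run_evals():.0%}")
```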
Cartoon of a relaxed man saying “Relax! I’m lazy prompting!” while lounging under a beach umbrella near a stressed coworker at a desk.
Technical Insights

The Benefits of Lazy Prompting: You don’t always need to provide context when prompting a large language model. A quick prompt can be enough.

Contrary to standard prompting advice that you should give LLMs the context they need to succeed, I find it’s sometimes faster to be lazy and dash off a quick, imprecise prompt and see what happens.
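As a toy illustration (the `ask_llm` helper below is a placeholder, not a real API), a lazy prompt might be nothing more than an error message pasted in verbatim:

```python
# Illustrative only: a "lazy" prompt is often just the raw artifact
# (here, an error message) pasted in with no added context.
# `ask_llm` is a placeholder for your preferred chat API.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("Send the prompt to your LLM of choice.")

lazy_prompt = """
TypeError: unsupported operand type(s) for +: 'int' and 'str'
"""

# No instructions, no code snippet, no statement of intent; many models
# will still infer that you want a diagnosis and a likely fix.
print(ask_llm(lazy_prompt))
```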
Cartoon of a man playing violin saying “I’m fine-tuning!” while a woman at her desk covers her ears, replying “Did you try prompting?”
Technical Insights

When to Fine-Tune — and When Not To: Many teams that fine-tune their models would be better off prompting or using agentic workflows. Here's how to decide.

Fine-tuning small language models has been gaining traction over the past half year.
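One alternative worth testing before fine-tuning is few-shot prompting: put a handful of labeled examples directly in the prompt and see whether that already meets your quality bar. A hypothetical sketch, with a placeholder `ask_llm` helper and made-up ticket categories:

```python
# Before fine-tuning, check whether few-shot prompting is already good enough.
# `ask_llm` is a placeholder for any chat completion call; the examples and
# labels are invented for illustration.

def ask_llm(prompt: str) -> str:
    raise NotImplementedError("Call your LLM here.")

FEW_SHOT = """Classify the support ticket as 'billing', 'bug', or 'other'.

Ticket: I was charged twice this month.
Label: billing

Ticket: The export button crashes the app.
Label: bug

Ticket: {ticket}
Label:"""

print(ask_llm(FEW_SHOT.format(ticket="Can I change my invoice address?")))
```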
Illustration of a programmer at a computer displaying PyTorch code, while a smiling colleague gives a thumbs-up in approval.
Technical Insights

Learn the Language of Software: AI won’t kill programming. There has never been a better time to start coding.

Some people today are discouraging others from learning programming on the grounds that AI will automate it.
Diagram of an RQ-Transformer speech system with Helium and Depth Transformers for audio processing.
Technical Insights

Wait Your Turn! Conversation by Voice Versus Text: Text interactions require taking turns, but voices may interrupt or overlap. Here’s how AI is evolving for voice interactions.

Continuing our discussion on the Voice Stack, I’d like to explore an area that today’s voice-based systems mostly struggle with: Voice Activity Detection (VAD) and the turn-taking paradigm of communication.
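A crude way to see what a VAD has to do is to threshold short-term frame energy and declare the end of a turn after a long enough run of silent frames. The sketch below is a toy version under those assumptions; production systems use trained VAD models and also have to handle interruptions (barge-in). The frame length, threshold, and silence window are arbitrary values.

```python
# Toy Voice Activity Detection: flag frames whose short-term energy exceeds
# a threshold, then treat a sustained run of silent frames as end-of-turn.
# Real systems use trained VAD models; the numbers here are arbitrary.
import numpy as np

def frame_energies(audio: np.ndarray, frame_len: int = 320) -> np.ndarray:
    """Mean energy per fixed-length frame of a mono float waveform."""
    n_frames = len(audio) // frame_len
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
    return (frames ** 2).mean(axis=1)

def end_of_turn(audio: np.ndarray, threshold: float = 1e-4,
                silence_frames: int = 25) -> bool:
    """True if the last `silence_frames` frames all fall below the threshold."""
    energies = frame_energies(audio)
    return len(energies) >= silence_frames and bool(
        (energies[-silence_frames:] < threshold).all()
    )
```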
Diagram comparing direct audio generation with a foundation model vs. a voice pipeline using STT, LLM, and TTS.
Technical Insights

What I’ve Learned Building Voice Applications: Best practices for building apps based on AI’s evolving voice-in, voice-out stack.

The Voice Stack is improving rapidly. Systems that interact with users via speaking and listening will drive many new applications.
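The diagram above contrasts two architectures: generating audio directly with a foundation model versus a pipeline of speech-to-text, an LLM, and text-to-speech. The pipeline variant can be sketched as three stages wired in sequence; the function names below are placeholders for whichever STT, LLM, and TTS providers you choose. In practice, much of the engineering effort goes into keeping the combined latency of these stages low enough to feel conversational.

```python
# Minimal sketch of the STT -> LLM -> TTS pipeline. Each stage is a
# placeholder; swap in the providers or models you actually use.

def speech_to_text(audio: bytes) -> str:
    raise NotImplementedError("e.g., a hosted STT API or a local Whisper model")

def llm_respond(text: str) -> str:
    raise NotImplementedError("your chat model, with whatever system prompt you need")

def text_to_speech(text: str) -> bytes:
    raise NotImplementedError("your TTS provider of choice")

def handle_turn(audio_in: bytes) -> bytes:
    """One conversational turn: transcribe, respond, synthesize."""
    transcript = speech_to_text(audio_in)
    reply = llm_respond(transcript)
    return text_to_speech(reply)
```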
“Responsible AI” written on a wall, with “Safety” crossed out in blue paint.
Technical Insights

The Difference Between “AI Safety” and “Responsible AI”: Talk about “AI safety” obscures an important point; AI isn't inherently unsafe. Instead, let’s talk about “responsible AI.”

At the Artificial Intelligence Action Summit in Paris this week, U.S. Vice President J.D. Vance said, “I’m not here to talk about AI safety.
Technical Insights

Three Takeaways from DeepSeek’s Big Week: Innovations by China’s AI powerhouse DeepSeek highlight major shifts in the international scene.

The buzz over DeepSeek this week crystallized, for many people, a few important trends that have been happening in plain sight.
Illustration of tech tools like OpenAI, MongoDB, Heroku, and Python with Andrew Ng working on a laptop
Technical Insights

My AI-Assisted Software Development Stack: The software development stack is evolving fast. Here are some things to consider as you choose components.

Using AI-assisted coding to build software prototypes is an important way to quickly explore many ideas and invent new things.
Andrew Ng celebrating and wishing a Happy New Year 2025 with sparklers.
Technical Insights

New Opportunities for the New Year: AI-assisted coding lets you prototype applications quickly and easily. Go forth and build!

Despite having worked on AI since I was a teenager, I’m now more excited than ever about what we can do with it, especially in building AI applications.
Graph showing cross-validation accuracy vs. number of features for raw and whitened inputs.
Technical Insights

Focus on the Future, Learn From the Past: 15 years ago, the idea of scaling up deep learning was controversial — but it was right. Keep your eyes open for such ideas in 2025.

I’m thrilled that former students and postdocs of mine won both of this year’s NeurIPS Test of Time Paper Awards.
Cartoon showing people stuck in wet concrete, with a person saying ‘You asked for a concrete idea!’
Technical Insights

Best Practices for AI Product Management: Generative AI is making it possible to build new kinds of applications in new ways. Here are emerging best practices for AI product managers.

AI Product Management is evolving rapidly. The growth of generative AI and AI-based developer tools has created numerous opportunities to build AI applications.
AI ecosystem layers: applications, orchestration, foundational models, cloud, and semiconductors.
Technical Insights

The Falling Cost of Building AI Applications: Big AI’s huge investments in foundation models enable developers to build AI applications at very low cost.

There’s a lingering misconception that building with generative AI is expensive.
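A quick back-of-the-envelope calculation shows why. The price and usage numbers below are placeholders rather than any provider’s actual rates, but the shape of the arithmetic is what matters: token costs per user are typically a fraction of a cent per day.

```python
# Back-of-the-envelope token economics. The price below is a hypothetical
# placeholder (check your provider's current rate card); the point is that
# per-user LLM costs are often tiny compared with the rest of the stack.

price_per_million_output_tokens = 0.60   # USD, illustrative only
tokens_per_response = 500
responses_per_user_per_day = 20

daily_cost_per_user = (
    responses_per_user_per_day
    * tokens_per_response
    * price_per_million_output_tokens
    / 1_000_000
)
print(f"~${daily_cost_per_user:.4f} per user per day")  # ~$0.0060
```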
Two people reading in bed, one with a book on library functions and a head labeled with AI layers.
Technical Insights

AI Is Part of Your Online Audience: Some webpages are written not for humans but for large language models to read. Developers can benefit by keeping the LLM audience in mind.

A small number of people are posting text online that’s intended for direct consumption not by humans, but by LLMs (large language models).