In today’s Five-Minute Friday episode, I’ll cover the five biggest takeaways from the 2025 edition of the renowned AI Index Report, which was published a few weeks ago by Stanford University’s Institute for Human-Centered AI. Every year this popular report — often called the “State of AI” report — covers the biggest technical advances, new achievements in benchmarking, investment flowing into AI and more. You’ll find a link to the colossal full report in the show notes; today’s episode covers the five most essential items.
AI-Powered Virtual Reality: The Future of Education and Entertainment, with Mary Spio
In today's episode, the deep-space engineer and visionary entrepreneur Mary Spio takes us on a journey into the A.I.-powered virtual reality that is transforming education, entertainment and more.
Mary:
Is CEO and CTO of CEEK INC, a platform pioneering A.I.-powered virtual-reality experiences featuring the likes of Lady Gaga, Bon Jovi and Dwyane Wade.
Holds 10+ patents across A.I., digital cinema, spatial audio, and extended-reality technologies.
Was a deep space engineer at Boeing and, before ever even going to university, was a satellite technician for the US Air Force.
Her innovations have been used by Xbox, Lucasfilm, and Universal Music Group.
Holds a Master's in Electrical Engineering, Computer Science and Innovation Management from the Georgia Institute of Technology.
Today’s episode is fascinating and relatively high-level and should be of interest to any listener.
In today’s episode, Mary details:
How a childhood in Ghana during a military coup led to a career as a deep space engineer and A.I. entrepreneur.
The neuroscience of how VR training can create memories that are indistinguishable from reality in your brain.
The shocking discovery about why VR headsets were making women violently ill (and how Mary fixed it).
How A.I. music is revolutionizing the industry and giving artists unexpected new powers.
How blockchain verification might be our only defense against an impending tsunami of A.I.-generated deepfakes.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Teams of Agents: The Next Frontier in AI Collaboration, with Mike Pell
Special episode for you today (filmed in front of a live audience!) with the inventor and exceptional communicator, Mike Pell. Hear his vision for the way teams of A.I. agents will change work and life for the better.
Today’s episode features a session I hosted a couple weeks ago in Brooklyn at the inaugural "A.I. & Creativity Summit", which was run by Artist and the Machine. It was an excellent full-day event on a gloriously sunny day.
My guest for an on-stage conversation in front of a live audience was the extraordinary Mike Pell:
Inventor of the PDF and Adobe Acrobat.
Director of The Microsoft Garage, a global innovation program.
Holds over 20 US Patents.
Author of three books.
Today’s episode is entertaining, optimistic and forward-looking and will be of interest to any listener of my podcast.
In the episode, Mike details:
Why A.I. agents are like an exoskeleton that gives you capabilities you never had time to master.
The coming shift from passive to proactive A.I. that will interject like a trusted coworker.
Why he believes we're "getting to the good part" in the A.I. revolution and what that means for the future of work.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Multi-Agent Teams, Quantum Computing and the Future of Work, with Dell’s Global CTO John Roese
Colossal guest today: Dell's global CTO and Chief A.I. Officer John Roese, who eloquently describes mind-blowingly fascinating topics including multi-agent teams in enterprises and quantum computing.
John Roese:
Is Global CTO and Chief A.I. Officer at Dell Technologies, the giant Texas-based corporation with over 100,000 employees and $88 billion in revenue in 2024.
Responsible for Dell’s future-looking technology strategy and accelerating A.I. adoption for Dell and its customers.
With an unreal career stretching back several decades, he was previously Global CTO at EMC, Global CTO at Nortel and CTO at Broadcom, amongst many other top roles at world-leading tech companies, board memberships and deep involvement with the private equity and venture capital ecosystems.
Holds a degree in Electrical and Computer Engineering from the University of New Hampshire.
Despite John being such a deep technical expert, today’s episode stays relatively high level and so should be of great value to any listener.
In today’s episode, John details:
How Dell narrowed 800 generative A.I. ideas down to 8 high-impact projects.
What "Proof-of-Concept Prison" is and his strategy for escaping it.
Where multi-agent teams will make the biggest impacts in enterprises first.
The unexpected way A.I. is creating more construction jobs than jobs in any other sector, as well as the new careers that will emerge in the coming years because of A.I.
How quantum computing and A.I. advances are entangled in a way that will dramatically change the future.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
In Case You Missed It in April 2025
We had a record number of guests on my podcast in April — and they were spectacular! In today's "In Case You Missed It" episode, hear the best parts of my conversations with each of them.
The specific conversation highlights included in today's episode are:
Sama Bali from NVIDIA and Logan Lawler from Dell Technologies fill us in on the AI software stack on NVIDIA GPUs, including libraries like CUDA (a quick Python check of that stack follows this list).
Continuing on the A.I. hardware topic, Emily Webber details Amazon Web Services (AWS)'s own A.I. accelerator chips.
Zerve AI's co-founder Dr. Greg Michaelson describes how data scientists can deploy A.I. models to production without needing to call on an engineering team.
Kai Beckmann, CEO of Merck KGaA's semiconductor business, describes intricate details of the semiconductors that make A.I. systems hum.
Finally, Shirish Gupta explains his "A-I-P-C" framework for finding out if you should be using edge compute for local A.I. inference instead of relying on cloud compute.
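If you'd like to poke at that NVIDIA stack yourself, here's a minimal sketch — assuming a CUDA-enabled PyTorch install — that confirms the GPU and CUDA toolkit are visible from Python:

```python
# Minimal sketch: verify the NVIDIA software stack is visible from Python.
# Assumes a CUDA-enabled PyTorch build; otherwise it falls back to CPU.
import torch

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    print("CUDA version PyTorch was built against:", torch.version.cuda)
    # Tiny matrix multiply on the GPU to confirm end-to-end execution.
    x = torch.randn(1024, 1024, device="cuda")
    print("Result lives on:", (x @ x).device)
else:
    print("No CUDA-capable GPU detected; running on CPU.")
```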
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Python Polars: The Definitive Guide, with Jeroen Janssens and Thijs Nieuwdorp
Today's episode on Polars is in equal parts hilarious and informative with Jeroen and Thijs, who co-authored the brand-new O'Reilly book "Python Polars: The Definitive Guide". Enjoy this one!
More on Dr. Jeroen Janssens:
• Senior Developer Relations Engineer at Posit PBC (iconic creators of RStudio and much more).
• Previously, was Senior Machine Learning Engineer at Xomnia.
• Wrote the invaluable O’Reilly book "Data Science at the Command Line".
• Holds a PhD in machine learning from Tilburg University.
...and on Thijs Nieuwdorp:
• Lead Data Scientist at Xomnia, the largest Dutch data and A.I. consulting company.
• Holds a degree in A.I. from Radboud University.
Today’s episode will be particularly appealing to hands-on data science, machine learning and A.I. practitioners, but Jeroen and Thijs are tremendous storytellers and frankly very funny, so this episode can probably be enjoyed by anyone interested in data and A.I.
In today’s episode, Jeroen and Thijs detail:
• Why pandas users are rapidly switching to Polars for dataframe operations in Python (see the short sketch after this list).
• The inside story of how O'Reilly rejected four book proposals on Polars before accepting the fifth.
• The moment when an innocuous GitHub pull request forced a complete rewrite of an entire book chapter.
• A previously secret collaboration with NVIDIA and Dell that revealed remarkable GPU acceleration benchmarks by Polars.
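For a flavor of why that switch is happening, here's a minimal sketch of Polars' lazy, expression-based API (the toy data is made up; the pattern is the point):

```python
# Minimal sketch of Polars' expression API with lazy evaluation: build a
# query plan, let the optimizer rearrange it, then collect the result.
import polars as pl

df = pl.DataFrame({
    "city": ["Amsterdam", "Rotterdam", "Amsterdam", "Utrecht"],
    "temp_c": [18.5, 17.9, 21.0, 19.2],
})

result = (
    df.lazy()
      .filter(pl.col("temp_c") > 18)
      .group_by("city")
      .agg(pl.col("temp_c").mean().alias("avg_temp_c"))
      .collect()
)
print(result)
```

Unlike eager pandas code, the lazy plan lets Polars optimize and parallelize the whole query before any data is touched.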
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Model Context Protocol (MCP) and Why Everyone’s Talking About It
Today we're diving into Model Context Protocol, or MCP – the hot topic taking the AI world by storm in early 2025.
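To make the idea concrete, here's a minimal sketch of an MCP server using the official Python SDK's FastMCP helper; the server name and word_count tool are purely illustrative, and the SDK's details may have evolved since this episode.

```python
# Minimal sketch of an MCP server (pip install mcp). The server name and the
# tool below are illustrative; any MCP-aware client can discover and call it.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

if __name__ == "__main__":
    # By default the server communicates over stdio, the transport a local
    # MCP client (such as a desktop AI assistant) typically launches it with.
    mcp.run()
```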
Blackwell GPUs Are Now Available at Your Desk, with Sama Bali and Logan Lawler
Today's charming and complementary guests — Sama Bali from NVIDIA and Logan Lawler from Dell — make for an extra fun episode on the powerful new Blackwell GPUs... now available at your desk!
More on Sama:
Is an A.I. Solutions leader at NVIDIA who specializes in bringing A.I. products to market.
Prior to NVIDIA, held a Machine Learning Solutions role at Amazon Web Services (AWS).
Focused on educating data scientists and developers on A.I. innovations and implementing them effectively in enterprises.
Holds a Master's in Engineering Management from San José State University.
More on Logan:
Leads Dell Pro Max A.I. Solutions (if you haven’t heard of Pro Max before, we’ll cover that in this episode!)
Over his sixteen-year tenure at Dell Technologies, has held positions across merchandising, services, marketing and e-commerce.
Holds an MBA in management from Texas State University.
Today’s episode will be particularly appealing to hands-on data science, machine learning and A.I. practitioners but it isn’t especially technical and so can be enjoyed by anyone!
In today’s episode, Sama and Logan detail:
Why data scientists are camping out at 6AM to attend NVIDIA's GTC conference.
The killer specs of NVIDIA’s next-generation Blackwell GPUs.
How Dell and NVIDIA have joined forces to bring server-level A.I. power right to your desktop.
How microservices are revolutionizing A.I. development and deployment (sketched below).
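As a rough illustration of that last point: inference microservices such as NVIDIA's NIM containers typically expose an OpenAI-compatible endpoint, so calling one can look like the hedged sketch below (the base URL and model name are placeholders for whatever you've deployed).

```python
# Hedged sketch: querying a locally hosted, OpenAI-compatible inference
# microservice (e.g. an NVIDIA NIM container). URL and model are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="my-deployed-llm",  # placeholder identifier for the hosted model
    messages=[{"role": "user", "content": "In one line, what is a GPU microservice?"}],
)
print(response.choices[0].message.content)
```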
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
40x Hotter Than the Sun: The ASML Machines That Make AI Chips
Today we're diving into something absolutely critical to the future of artificial intelligence that you might never have thought about before: the machines that make AI chips possible.
Beyond GPUs: The Power of Custom AI Accelerators, with Emily Webber
The mind-blowing A.I. capabilities of recent years are made possible by vast quantities of specialized A.I.-accelerator chips. Today, AWS's (brilliant, amusing and Zen!) Emily Webber explains how these chips work.
Emily:
• Is a Principal Solutions Architect in the elite Annapurna Labs ML service team that is part of Amazon Web Services (AWS).
• Works directly on the Trainium and Inferentia hardware accelerators (for, respectively, training and making inferences with A.I. models) — a brief compilation sketch follows this list.
• Also works on the NKI (Neuron Kernel Interface) that acts as a bare-metal language and compiler for programming AWS instances that use Trainium and Inferentia chips.
• Wrote a book on pretraining foundation models.
• Spent six years developing distributed systems for customers on Amazon’s cloud-based ML platform SageMaker.
• Leads the Neuron Data Science community and heads the technical side of the “Build On Trainium” program — a $110m credit-investment program for academic researchers.
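For a concrete flavor of targeting these chips from Python — using the Neuron SDK's PyTorch integration rather than the lower-level NKI Emily works on — here's a minimal, hedged sketch. It assumes a Neuron-equipped instance (e.g. Inf2 or Trn1) with torch_neuronx installed.

```python
# Hedged sketch: ahead-of-time compiling a small PyTorch model for AWS
# Inferentia/Trainium via torch_neuronx. Assumes a Neuron-equipped instance.
import torch
import torch_neuronx

model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).eval()

example_input = torch.randn(1, 128)

# Compile to a Neuron-executable graph and save it for later serving.
neuron_model = torch_neuronx.trace(model, example_input)
torch.jit.save(neuron_model, "model_neuron.pt")

print(neuron_model(example_input).shape)  # torch.Size([1, 10])
```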
Today’s episode is on the technical side and will appeal to anyone who’s keen to understand the relationship between today’s gigantic A.I. models and the hardware they run on.
In today’s episode, Emily details:
• The little-known story of how Annapurna Labs revolutionized cloud computing.
• What it takes to design hardware that can efficiently train and deploy models with billions of parameters.
• How Trainium2 became the most powerful A.I. chip on AWS.
• Why AWS is investing $110 million worth of compute credits in academic AI research.
• How meditation and Buddhist practice can enhance your focus and problem-solving abilities in tech.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Manus, DeepSeek and China’s AI Boom
Today, we're diving into the fascinating AI boom that's been sweeping across China since early 2025, examining what this means for the global AI landscape and markets.
Serverless, Parallel, and AI-Assisted: The Future of Data Science is Here, with Zerve’s Dr. Greg Michaelson
What are "code nodes" and "RAG DAGs"? Listen to today's episode with the highly technical (but also highly hilarious) Dr. Greg Michaelson to get a glimpse into the future of data science and A.I. model development.
Greg:
Is a Co-Founder of Zerve AI, a super-cool platform for developing and delivering A.I. products that launched to the public on this very podcast a little over a year ago.
Previously spent 7 years as DataRobot’s Chief Customer Officer and 4 years as Senior Director of Analytics & Research for Travelers.
Was a Baptist pastor while he obtained his PhD in Applied Statistics!
Today’s episode is on the technical side and so will appeal most to hands-on practitioners like data scientists, AI/ML engineers and software developers… but Greg is such an engaging communicator that anyone interested in how the practice of data science is rapidly being revolutionized may enjoy it.
In it, Greg details:
How Zerve's collaborative, graph-based coding environment has matured over the past year, including their revolutionary 'Fleet' feature (in beta) that allows massive parallelization of code execution without additional cost.
How AI assistants are changing the coding experience by helping build, edit, and connect your data science projects.
Why the rise of LLMs might spell trouble for many SaaS businesses as building in-house solutions becomes increasingly viable.
The innovative ways companies are using retrieval-augmented generation (RAG) to create more powerful A.I. applications (a minimal RAG sketch follows this list).
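To ground that last point, here's a minimal sketch of the retrieval step at the heart of RAG — the embed() function is a stand-in for a real embedding model, and the documents are made up:

```python
# Minimal RAG retrieval sketch: embed documents, embed the query, rank by
# cosine similarity, and paste the winners into the prompt sent to an LLM.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; a real system would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)

docs = [
    "Zerve provides a graph-based environment for data science work.",
    "Retrieval-augmented generation grounds LLM answers in your own documents.",
    "Polars is a fast dataframe library written in Rust.",
]
doc_vecs = np.stack([embed(d) for d in docs])

query = "How does RAG reduce hallucinations?"
scores = doc_vecs @ embed(query)  # cosine similarity (vectors are unit-norm)
top_docs = [docs[i] for i in np.argsort(scores)[::-1][:2]]

prompt = "Answer using only this context:\n" + "\n".join(top_docs) + f"\n\nQuestion: {query}"
print(prompt)  # in a real application, this prompt would be sent to an LLM
```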
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
In Case You Missed It in March 2025
We had absolutely killer guests and killer conversations on my podcast in March. This isn't bluster; I learned a ton from Andriy, Richmond, Natalie and Varun... Today's episode features all the best highlights!
The specific conversation highlights included in today's episode are:
The mega-bestselling author of "The 100-Page Machine Learning Book" (and now "The 100-Page Language Models Book"!) Dr. Andriy Burkov on the missing piece of AGI: Why LLMs can't plan or self-reflect.
Relatedly, the fascinating and exceptionally well-spoken Natalie Monbiot contrasted artificial intelligence with the human variety, detailing what makes us unique.
The charismatic software engineer Richmond Alake (of MongoDB) explained his "A.I. Stack" concept and how you can leverage it to build better A.I. applications.
Former Google Gemini engineer Varun Godbole provided a helpful overview of his guide to neural network design, the (freely available!) "Deep Learning Tuning Playbook".
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
The Neural Processing Units Bringing AI to PCs, with Shirish Gupta
In many situations, it's impractical (or even impossible!) to have A.I. executed in the cloud. In today's episode, Shirish Gupta details when to run A.I. locally and how Neural Processing Units (NPUs) make it practical.
Today's episode is about efficiently designing and deploying AI applications that run on the edge. Our guide on that journey is SuperDataScience Podcast fan, Shirish! Here's more on him:
• Has spent more than two decades working for the global technology juggernaut, Dell Technologies, in their Austin, Texas headquarters.
• Has held senior systems engineering, quality engineering and field engineering roles.
• For the past three years, has been Director of AI Product Management for Dell’s PC Group.
• Holds a Master’s in Mechanical Engineering from the University of Maryland.
Today’s episode should appeal to anyone who is involved with or interested in real-world A.I. applications.
In this episode, Shirish details:
• What Neural Processing Units (NPUs) are and why they're transforming A.I. on edge devices.
• Four clear, compelling reasons to consider moving AI workloads from the cloud to your local device.
• The "A.I. PC" revolution that's bringing A.I. acceleration to everyday laptops and workstations.
• What kinds of Large Language Models are best-suited to local inference on A.I. PCs (see the brief sketch after this list).
• How Dell's Pro A.I. Studio toolkit will drastically reduce enterprise A.I. deployment time.
• Plenty of real-life A.I. PC examples, including how a healthcare provider achieved physician-level accuracy with a custom vision model.
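As a rough sketch of that local-inference idea, here's what running a compact open-weights language model on your own machine can look like with Hugging Face Transformers; the model name is just an illustrative small model, and an NPU-specific runtime would slot in where the generic device selection sits here.

```python
# Hedged sketch: local inference with a small language model via Transformers.
# The model name is illustrative; NPU-specific runtimes (e.g. ONNX Runtime
# execution providers) would replace the generic device selection below.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # illustrative compact model
    device_map="auto",                    # CPU, GPU or other local hardware
)

result = generator(
    "In one sentence, why run A.I. inference locally instead of in the cloud?",
    max_new_tokens=60,
)
print(result[0]["generated_text"])
```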
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Hugging Face’s smolagents: Agentic AI in Python Made Easy
Today, we’re diving into Hugging Face’s smolagents – a new development that gives AI models more autonomy. Hugging Face, the open-source AI powerhouse behind technologies like Transformers, has now turned its attention to AI agents – programs where AI models can plan and execute tasks on their own – and their latest library smolagents makes building these agents simpler than ever. In this short episode, I’ll break down what smolagents are, how they work, and why they’re a big deal for developers, businesses, and researchers alike.
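Here's a minimal sketch of what a smolagents "code agent" looks like, based on the library's quickstart around the time of this episode (class names such as HfApiModel may have been renamed in later releases):

```python
# Minimal smolagents sketch (pip install smolagents). The agent writes and
# executes Python code to answer the query, optionally calling its tools.
from smolagents import CodeAgent, DuckDuckGoSearchTool, HfApiModel

agent = CodeAgent(
    tools=[DuckDuckGoSearchTool()],  # let the agent search the web if needed
    model=HfApiModel(),              # default Hugging Face Inference API model
)

answer = agent.run("How many seconds are there in a non-leap year?")
print(answer)
```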
How Semiconductors Are Made (And Fuel the AI Boom), with Kai Beckmann
Today's episode is an important one on the hardware that underlies all computing and is fueling the A.I. boom. It’s hard to imagine a better guest than Kai Beckmann for this essential topic.
Kai:
• Is a Member of the Executive Board of Merck KGaA, Darmstadt, Germany (a 350-year-old firm that’s the world's oldest chemical and pharmaceutical company, with more than 62,000 employees across 60 countries).
• Having worked at the gigantic firm for over 35 years, he’s been CEO of their Electronics business for the past eight years.
• Under his leadership, Merck KGaA develops cutting-edge, materials-based solutions and equipment for leading chip companies — 99% of electronic devices contain one of their products 🤯
• A leading speaker within the semiconductor industry, he’s an expert in material-based semiconductor solutions, A.I., digitalization, and change management.
Today’s episode will be of interest to anyone looking to understand the hardware that all of computing and data science depend on. In it, Kai details:
• How materials from one company are found in virtually every electronic device on the planet.
• How A.I. is being used to develop materials that power... more A.I.
• His vinyl-record analogy for understanding computer-chip manufacturing.
• The impact that scaled-up, stable quantum computing will have on society.
• How a neuromorphic chip might someday run on the power of a low-wattage light bulb while matching human brain capabilities.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
How AI is Transforming Baseball (with Lessons For All of Us)
Baseball has always been a game of numbers. For decades, teams have pored over stats like batting averages and ERAs to gain an edge. But in recent years, artificial intelligence has taken baseball analytics to new heights. In today’s episode, we’ll explore how AI is revolutionizing baseball – from scouting and player performance to in-game strategy and even fan experience – and what that means for the future of sports and other industries.
Become Your Best Self Through AI Augmentation — feat. Natalie Monbiot
The deep-thinking and highly articulate Natalie Monbiot returns to my podcast today for a can't-miss episode (one of my favorite convos ever) on how A.I. will overhaul our lives, our work, our society in the coming years.
More on Natalie:
Through her consultancy, Virtual Human Economy, she advises startups like Wizly and investment firms like Blue Tulip Ventures on virtual humans and A.I. clones.
Was previously Head of Strategy at Hour One, a leading virtual-human video-generation startup.
Regularly speaks at the world's largest conferences, including Web Summit and SXSW.
Holds a Master's in Modern Languages and Literature from the University of Oxford.
Today’s fascinating episode will be of great interest to all listeners. In it, Natalie details:
How A.I. is making us dumber — and what we can do about it.
Why the "virtual human economy" could be the next evolution of human civilization.
The two states of being humans are seeking (and how A.I. could help us achieve them).
Why focusing on merely 10x’ing our capabilities misses the much bigger opportunity of A.I.
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
Microsoft’s “Majorana 1” Chip Brings Quantum ML Closer
Microsoft’s Majorana 1 is a newly unveiled quantum computing chip that marks a major breakthrough in the quest for practical quantum computers. It’s the world’s first quantum processor built on a so-called Topological Core architecture – meaning it uses topological qubits (based on exotic Majorana particles that I’ll dig into more shortly) instead of the fragile qubits found in today’s machines. Microsoft believes this innovation could accelerate the timeline for solving real-world, industrial-scale problems with quantum computing from “decades” to just a few years.
NoSQL Is Ideal for AI Applications, with MongoDB’s Richmond Alake
In today's episode (#871), I'm joined by the gifted writer, speaker and ML developer Richmond Alake, who details what NoSQL databases are and why they're ideally suited for A.I. applications.
Richmond:
Is Staff Developer Advocate for AI and Machine Learning at MongoDB, a huge publicly listed database company with over 5,000 employees and over a billion dollars in annual revenue.
With Andrew Ng, he co-developed the DeepLearning.AI course “Prompt Compression and Query Optimization” that has been undertaken by over 13,000 people since its release last year.
Has delivered his courses on Coursera, DataCamp, and O'Reilly.
Authored 200+ technical articles with over a million total views, including as a writer for NVIDIA.
Previously held roles as an ML Architect, Computer Vision Engineer and Web Developer at a range of London-based companies.
Holds a Master’s in computer vision, machine learning and robotics from The University of Surrey in the UK.
Today's episode (filmed in-person at MongoDB's London HQ!) will appeal most to hands-on practitioners like data scientists, ML engineers and software developers, but Richmond does a stellar job of introducing technical concepts so any interested listener should enjoy the episode.
In today’s episode, Richmond details:
How NoSQL databases like MongoDB differ from relational, SQL-style databases.
Why NoSQL databases like MongoDB are particularly well-suited for developing modern A.I. applications, including Agentic A.I. applications.
How MongoDB incorporates a native vector database, making it particularly well-suited to RAG (retrieval-augmented generation) — see the sketch after this list.
Why 2025 marks the beginning of the "multi-era" that will transform how we build A.I. systems.
His powerful framework for building winning A.I. strategies in today's hyper-competitive landscape.
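For a concrete sense of that native vector search, here's a hedged PyMongo sketch of the retrieval half of a RAG pipeline on MongoDB Atlas; the connection string, database, collection and index names are placeholders, and embed() stands in for a real embedding model.

```python
# Hedged sketch: retrieving RAG context with MongoDB Atlas Vector Search via
# PyMongo. Connection string, names and the embed() helper are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster.example.mongodb.net")
collection = client["kb"]["articles"]

def embed(text: str) -> list[float]:
    """Placeholder: call a real embedding model here."""
    return [0.0] * 1536  # stand-in vector with the index's dimensionality

results = collection.aggregate([
    {
        "$vectorSearch": {
            "index": "articles_vector_index",   # vector index defined in Atlas
            "path": "embedding",                # field holding document vectors
            "queryVector": embed("How do agentic A.I. apps use memory?"),
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "text": 1, "_id": 0}},
])

for doc in results:
    print(doc["title"])  # top matches would be pasted into the LLM prompt
```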
The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.