Jon Krohn

NPUs vs GPUs vs CPUs for Local AI Workloads, with Dell’s Ish Shah and Shirish Gupta

Added on September 9, 2025 by Jon Krohn.

Double the laughs in today's episode, with *two* hardware experts from Dell joining me to explain when you should process A.I. workloads locally with a CPU, GPU, or Neural Processing Unit (NPU).

Guest #1, Ishan Shah:

  • Technologist in the Office of the CTO for Dell Technologies’ Client Solutions Group.

  • Previously a founding member of Dell's Chief A.I. Office.

  • Holds an MBA from the Massachusetts Institute of Technology.

Guest #2, Shirish Gupta:

  • Director of A.I. Product Management at Dell, where he's been for 20+ years!

  • Holds a Master's in Engineering from the University of Maryland.

Today's episode will appeal to all hands-on A.I. practitioners as well as anyone who makes hardware decisions for A.I. practitioners. It's also simply a ton of fun to listen to — Ish and Shirish play off each other very amusingly :)

The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
