Jon Krohn


Dragon Hatchling: The Missing Link Between Transformers and the Brain, with Adrian Kosowski

Added on October 8, 2025 by Jon Krohn.

What do dragons, macarons and potato latkes have in common? They've sparked a revolutionary model that's poised to replace the Transformer. Today, Adrian Kosowski reveals this big breakthrough.

Adrian:

• Chief Scientific Officer and co-founder of Pathway.

• Theoretical computer scientist, quantum physicist and mathematician.

• Earned his PhD at 20 years old and went on to serve as a tenured researcher at Inria at 23 and associate professor at École Polytechnique.

• Has authored more than 100 scientific papers spanning graph algorithms, distributed systems, quantum information and A.I.

In today's highly technical episode, Adrian demonstrates how the Pathway team devised the Baby Dragon Hatchling (BDH) architecture, which allows attention in LLMs to function more like the biological brain. This is revolutionary because, relative to the today-ubiquitous Transformer architecture, BDH allows:

• Reasoning to generalize across more complex and extended reasoning patterns, approximating a more human-like approach to problem-solving.

• Time, compute and cost savings through sparse activation at inference time.

• Greater interpretability.
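To give a flavor of why sparse activation saves inference-time compute, here is a minimal, generic sketch (this is an illustration of the general principle, not the actual BDH mechanism, whose details Adrian covers in the episode): when a nonlinearity leaves only a small fraction of a layer's units nonzero, the next layer's matrix multiply only needs the rows of the weight matrix corresponding to active units.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: 4096 hidden units, where a thresholded, ReLU-style
# nonlinearity leaves only a small fraction of units active (nonzero).
hidden = rng.standard_normal(4096)
activations = np.maximum(hidden - 2.0, 0.0)  # thresholding induces sparsity

active = np.flatnonzero(activations)
sparsity = 1.0 - active.size / activations.size

# Dense path: multiply by the full 4096 x 512 weight matrix.
W = rng.standard_normal((4096, 512))
dense_out = activations @ W

# Sparse path: zero activations contribute nothing, so only the rows
# of W for active units are needed -- compute scales with the number
# of active units rather than the full layer width.
sparse_out = activations[active] @ W[active]

assert np.allclose(dense_out, sparse_out)
print(f"fraction of inactive units: {sparsity:.2%}")
```

With a threshold like the one above, well over 90% of units are inactive, so the sparse path touches only a few percent of the weight rows while producing the same output.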

The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.
