Conversation with Loup Ventures

https://loupventures.com/026-gordon-wilson/


Top 3 Takeaways


  1. Training deep learning algorithms is expensive.
  2. To understand the brain, you need to build one.
  3. Modern computing hardware doesn’t have the parallelism and energy efficiency of the brain.


Show Notes


  • [1:12] Building a processor for brain math.
  • [2:40] The cost of artificial neural networks.
  • [3:36] What is “brain-inspired hardware”?
  • [4:50] Nanowires and memristors.
  • [6:25] Cross-disciplinary chip design.
  • [7:30] Size of the brain vs. size of artificial neural networks.
  • [9:05] Research vs. development.
  • [12:00] Bridging brain science and AI.
  • [13:54] Neuromorphics vs. GPUs.
  • [18:00] Chips on the market.
  • [20:40] Go-to-market: matrix multiplication.
  • [22:22] Cost and energy of Rain’s hardware.
  • [23:33] Does chip design impact software development?
  • [24:08] Fusing training and inference.
  • [26:26] Wide learning vs. deep learning.
  • [29:30] Sparse learning.
  • [32:10] Gordon’s book recommendations.


Selected Links