Virtual Seminar
US Mountain Time
Speaker: Kiel Howe (Minerva Schools at KGI)

Swimming Without a Lifeguard in the Ocean of Ideas: Abstraction & Unsupervised Deep Learning

Abstract: Deep learning models are known to effectively discover useful abstractions (latent features) of input data in their hidden layers. Unsupervised deep learning techniques have been used to systematically discover these latent features in unlabeled data, and several groups have made progress on methods for discovering hierarchical latent feature spaces. Such hierarchical latent feature spaces can improve interpretability and performance in downstream supervised learning tasks, and they play an important role in deep generative models. I will give an overview of some of these existing techniques for finding hierarchical latent feature spaces and present my recent work on another approach. I'll also briefly speculate about how such representations may play a role in building a bridge between knowledge representations in neural and symbolic learning systems. The talk will be given dubbed from the original language of machine learning into English, with subtitles provided for native speakers of statistical physics, information theory, and tensor networks / graphical models.
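The abstract does not describe the speaker's specific approach. As a rough, hedged illustration of what "discovering latent features in unlabeled data" can look like in practice, the sketch below trains a minimal autoencoder on synthetic unlabeled data; the toy data, architecture, and hyperparameters are all assumptions chosen for illustration and are not the methods discussed in the talk.

```python
# Minimal sketch (assumptions: toy Gaussian-cluster data, a small
# autoencoder, PyTorch). Illustrates unsupervised discovery of a
# low-dimensional latent feature space; NOT the speaker's method.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Unlabeled toy data: 1000 points in 20 dimensions drawn around
# 3 hidden cluster centers (the latent structure to be rediscovered).
centers = torch.randn(3, 20)
assignments = torch.randint(0, 3, (1000,))     # never shown to the model
x = centers[assignments] + 0.1 * torch.randn(1000, 20)

class AutoEncoder(nn.Module):
    def __init__(self, dim_in=20, dim_latent=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 16), nn.ReLU(),
                                     nn.Linear(16, dim_latent))
        self.decoder = nn.Sequential(nn.Linear(dim_latent, 16), nn.ReLU(),
                                     nn.Linear(16, dim_in))

    def forward(self, batch):
        z = self.encoder(batch)                 # latent features
        return self.decoder(z), z

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(500):
    recon, z = model(x)
    loss = nn.functional.mse_loss(recon, x)     # purely unsupervised objective
    opt.zero_grad()
    loss.backward()
    opt.step()

# The 2-D latent codes typically separate the 3 hidden clusters,
# even though no labels were used during training.
print("final reconstruction loss:", loss.item())
```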

Speaker Bio: Kiel is interested in how his brain, other human brains, and computer brains learn. He is an educator at Minerva Schools at KGI and has helped innovate how students learn in virtual classrooms since before virtual classrooms were trending. He obtained his Ph.D. in theoretical physics from Stanford University in 2015, focusing on writing mathematical poetry about subatomic particles, and his postdoctoral research at Fermilab included work on applications of quantum computing and deep learning to high energy physics.

Purpose: Research Collaboration
SFI Host: Melanie Mitchell
