What I Learned from My First Podcast on Deep Learning

Connor Shorten
2 min read · Jun 5, 2019

I have been blogging on Medium for a while now and only recently came around to the idea of making YouTube videos. I am an aspiring Deep Learning research scientist, and my blog posts and videos reflect my interest in reading research papers. I also hope these videos provide some value to other researchers who want a high-level overview of a paper before diving in, given that these papers can be difficult and time-consuming to read and understand.

A friend of mine whom I met through the blogosphere recently invited me to talk with him on his podcast. After seeing how easy it was to set up a remote interview simply by recording a Skype call, I was inspired to try it myself.

I was extremely lucky to speak with Edward Dixon, a Data Scientist at Intel, for my first podcast episode. We exchanged emails to pick a research paper to anchor our conversation and settled on Stochastic Weight Averaging. We initially discussed Stochastic Weight Averaging in the context of Neural Loss Surfaces, Model Ensembles, and Model Compression. The conversation then led to a very interesting discussion of Model Distillation and learning in a teacher-student setting, and I got to share some of my thoughts on the new VQ-VAE-2 model as a generative approach to Data Augmentation.
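For readers who have not seen the paper, here is a minimal sketch of the core idea behind Stochastic Weight Averaging: keep a running average of the weights that SGD visits late in training and use that average at test time. The toy model, training loop, and averaging schedule below are placeholder assumptions for illustration, not the exact setup from the paper or from our conversation.

```python
# Minimal sketch of Stochastic Weight Averaging (SWA) in PyTorch.
# The model, data, and hyperparameters are assumed placeholders.
import copy
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))  # toy model (assumed)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

swa_model = copy.deepcopy(model)  # will hold the running average of weights
n_averaged = 0

for epoch in range(100):          # assumed training length
    for _ in range(10):           # stand-in for iterating over a real data loader
        x = torch.randn(32, 10)
        y = torch.randint(0, 2, (32,))
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()

    if epoch >= 75:               # start averaging late in training, as the paper suggests
        n_averaged += 1
        with torch.no_grad():
            for p_swa, p in zip(swa_model.parameters(), model.parameters()):
                # incremental running mean: w_swa <- w_swa + (w - w_swa) / n
                p_swa += (p - p_swa) / n_averaged

# swa_model now holds the averaged weights; in practice you would also
# recompute BatchNorm statistics on the training data before evaluating it.
```

The averaged solution tends to sit in a flatter, more central region of the loss surface than any single SGD iterate, which is what connects SWA to the discussion of neural loss surfaces and ensembles.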

This interview opened my eyes to the power of podcasting. With modern software tools, it is easy to record and upload a podcast of reasonable quality. I am extremely excited to continue with this idea and connect with more experts in Artificial Intelligence and Deep Learning. Please check out the podcast episode!
