Join our Podcast Community on Discord
Episodes
Episode 1 - "On the Mathematical Foundations of Theoretical Statistics" (1922), R. A. Fisher
Episode 2 - "Application of the Logistic Function to Bio-Assay" (1944), Joseph Berkson
Episode 3 - "A Mathematical Theory of Communication" (1948), C. E. Shannon - Part 1
Episode 4 - "A Mathematical Theory of Communication" (1948), C. E. Shannon - Part 2
Episode 5 - "A Mathematical Theory of Communication" (1948), C. E. Shannon - Part 3
Episode 6 - "On the Problem of the Most Efficient Tests of Statistical Hypotheses" (1933), Neyman and Pearson
Episode 7 - "The Use of Multiple Measurements in Taxonomic Problems" (1936), Sir Ronald Fisher
Episode 8 - The Turing test: "Computing Machinery and Intelligence", Mind (1950), Alan Turing
Episode 9 - "The Unreasonable Effectiveness of Mathematics in the Natural Sciences" (1960), Eugene Wigner
Episode 10 - The original principal component analysis (PCA) paper by Harold Hotelling (1935)
Episode 11 - The original Perceptron paper by Frank Rosenblatt (1958)
Episode 12 - Kolmogorov complexity paper review (1965) - Part 1
Episode 13 - Kolmogorov complexity paper review (1965) - Part 2
Episode 14 - The original k-means algorithm paper review (1957)
Episode 15 - The First Decision Tree Algorithm (1963)
Episode 16 - The First Stochastic Descent Algorithm (1952)
Tutorials
AI and Machine Learning
Probability & Statistics
Paper summaries
Also available at
Apple Podcasts
About
We, the hosts, are motivated by a shared interest in bridging historical mathematical methods with modern data science practices.
We aim to create content that explores the origins and foundational concepts of statistics and machine learning, areas we feel are underrepresented on YouTube and in other existing podcasts.
Why Historical Papers?
We believe that understanding historical papers is essential for grasping how statistical concepts evolved and why they remain relevant to contemporary practice. Historical context shows where the field has been and where it is heading, and foundational works like Fisher's paper provide essential insight into modern data science methodology.
Hosts
Daniel Aronovic, MSc
Daniel is a data scientist and technology leader with a strong background in engineering and machine learning. He is currently the CTO at Dataflint, where he oversees the company's technological strategy and development. Previously, as CTO of Vocalis Health, he set the strategic technological vision and led the development of the company's core technologies, and as Head of Data at DualBird he worked on accelerating data analytics and data engineering. His academic background includes an MSc in Electrical Engineering and Physics from the Technion, as well as an MA in Philosophy.
Dr. Mike Erlihson
Mike holds a Ph.D. in mathematics from the Technion in Israel and has spent the past 20 years working in various algorithmic roles across different companies. He currently leads an AI team at a small startup. Mike's experience includes computer vision, signal processing, neural networks, and natural language processing (NLP). He has also taught courses at Ben-Gurion University and has a strong academic inclination, having read and reviewed countless papers in machine learning and deep learning.
Dr. Nir Regev
Nir Regev has spent the last 26 years in industry, focusing on algorithm design, signal processing, statistical signal processing, machine learning, and deep learning. He is a founder of alephzero.ai, through which he works on numerous projects involving computer vision, radar, and LiDAR data processing. He is also a professor at Cal Poly Pomona, teaching in the EE department, and regularly publishes original AI/ML content on his website, drnirregev.com. Nir holds a Ph.D. in Electrical Engineering from Ben-Gurion University in Israel.