Rosenblatt's Perceptron: What Can Neural Networks Do For Us?
Counting Sand
English - November 30, 2021 - 31 minutes - 29.3 MB - ★★★★★ - 23 ratings
In any discussion of artificial intelligence and machine learning today, artificial neural networks are bound to come up. What are artificial neural networks, how have they developed, and what are they poised to do in the future? Host Angelo Kastroulis dives into the history, compares them to biological systems that they are meant to mimic, and talks about how hard problems like this one need to be handled carefully.
Angelo begins with a discussion of how biological neural networks help make our brain a powerful computer of complexity. He then talks about how artificial neural networks recruit the same structures and connections to create artificial intelligence. To understand what we mean by artificial intelligence, Angelo explains how the Turing Test works and how Turing’s work forms a foundation for modern AI.
He then discusses other early pioneers in this work, namely Frank Rosenblatt, who worked on models that could learn, which he called "perceptrons." Angelo then relates the history of how this work was criticized by Marvin Minsky and Seymour Papert, and how mistakes in their own work set the potential advances of artificial neural networks back by about two decades.
Using image recognition as a case study, Angelo ends the episode by talking about various approaches' benefits and drawbacks to illustrate what we can do with artificial neural networks today.
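Rosenblatt's perceptron and the XOR limitation that Minsky and Papert highlighted can be sketched in a few lines of code. This is an illustrative example, not code from the episode: weights are nudged toward each misclassified example (Rosenblatt's learning rule), which converges for linearly separable functions like AND but can never succeed on XOR, since no single line separates its classes.

```python
def train_perceptron(samples, epochs=20, lr=1.0):
    """Train a two-input perceptron. samples: list of ((x1, x2), label), labels 0/1."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred  # -1, 0, or +1
            # Rosenblatt's rule: move the decision boundary toward mistakes.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print(all(predict(w, b, x) == y for x, y in AND))  # True: AND is linearly separable

w, b = train_perceptron(XOR)
print(all(predict(w, b, x) == y for x, y in XOR))  # False: XOR is not
```

A single perceptron fails on XOR no matter how long it trains; stacking perceptrons into layers overcomes this, which is the path that eventually led to the deep networks discussed in the episode.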
Citations
Hebb, D.O. (1949). The organization of behavior: A neuropsychological theory. New York: Wiley.
Minsky, M. (1954). Theory of neural-analog reinforcement systems and its application to the brain-model problem. Doctoral dissertation. Princeton: Princeton University.
Minsky, M. and Papert, S. (1969). Perceptrons: An introduction to computational geometry. Cambridge: MIT Press.
Rosenblatt, F. (1957). "The perceptron: A perceiving and recognizing automaton." Buffalo: Cornell Aeronautical Laboratory, Inc. (Accessible at https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)
Rosenblatt, F. (1962). Principles of neurodynamics: Perceptrons and the theory of brain mechanisms. Washington, D.C.: Spartan Books.
Turing, A. (1950, October). "Computing machinery and intelligence," Mind, LIX: 236, pp. 433–460. https://doi.org/10.1093/mind/LIX.236.433
Further Reading
Warren McCulloch and the McCulloch-Pitts Neuron
Church-Turing Thesis
XOR or Exclusive or
Host: Angelo Kastroulis
Executive Producer: Kerri Patterson; Producer: Leslie Jennings Rowley; Communications Strategist: Albert Perrotta; Audio Engineer: Ryan Thompson
Music: All Things Grow by Oliver Worth
© 2021, Carrera Group