BACK-PROPAGATION
(Or "backpropagation") A learning algorithm for modifying a feed-forward neural network which minimises a continuous "error function" or "objective function".

Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the value of the error function on subsequent tests of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently.

Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer, yielding in just two traversals of the network (once forward, and once back) both the difference between the desired and actual output, and the derivatives of this difference with respect to the connection weights.
By Denis Howe
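The two-traversal idea above can be sketched in plain Python. Everything concrete here is an illustrative assumption, not part of the entry: a 2-2-1 network, sigmoid activations, a squared-error objective, a learning rate of 0.5, and the logical-OR truth table as training data.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A tiny 2-2-1 feed-forward network with randomly initialised weights.
n_in, n_hid = 2, 2
w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
w2 = [random.uniform(-1, 1) for _ in range(n_hid)]
b2 = 0.0

def forward(x):
    """Forward traversal: compute hidden and output activations."""
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(n_in)) + b1[j])
         for j in range(n_hid)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(n_hid)) + b2)
    return h, y

def train_step(x, t, lr=0.5):
    """One forward and one backward traversal, then a gradient step."""
    global b2
    h, y = forward(x)
    err = 0.5 * (y - t) ** 2  # squared-error objective for this input
    # Backward traversal: the chain rule yields the derivative of the
    # error with respect to every connection weight.
    d_out = (y - t) * y * (1 - y)
    d_hid = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
    # Gradient descent: move each weight against its derivative.
    for j in range(n_hid):
        for i in range(n_in):
            w1[j][i] -= lr * d_hid[j] * x[i]
        b1[j] -= lr * d_hid[j]
        w2[j] -= lr * d_out * h[j]
    b2 -= lr * d_out
    return err

# Illustrative training set: the logical-OR truth table.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

loss_before = sum(train_step(x, t, lr=0.0) for x, t in data)  # lr=0: no update
for _ in range(2000):
    for x, t in data:
        train_step(x, t)
loss_after = sum(0.5 * (forward(x)[1] - t) ** 2 for x, t in data)
```

Each call to `train_step` performs exactly the two traversals the entry describes, and repeating it drives the error function down (`loss_after` ends well below `loss_before`).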