backpropagation


back·prop·a·ga·tion

(băk′prŏp′ə-gā′shən) n. A common method of training a neural network in which the network's output is compared to the desired output and the resulting error is propagated backward through the network to adjust its weights, repeating until the difference between the two is minimized.
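The compare-and-adjust loop in the definition can be sketched for the simplest possible network, a single linear unit trained by gradient descent; the data, learning rate, and variable names below are illustrative, not part of the definition.

```python
# Minimal sketch of the backpropagation idea on a one-unit "network":
# forward pass computes the output, the error against the desired output
# is propagated backward (here, just the chain rule on a squared-error
# loss), and the weights are nudged to shrink that error.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # points on the line y = 2x + 1
w, b = 0.0, 0.0   # initial weights
lr = 0.1          # learning rate

for _ in range(2000):
    for x, y in data:
        y_hat = w * x + b      # forward pass: system output
        err = y_hat - y        # compare to desired output
        # backward pass: gradients of loss L = err**2 w.r.t. w and b
        w -= lr * 2 * err * x
        b -= lr * 2 * err

print(w, b)  # w and b approach 2.0 and 1.0
```

Real networks have many layers, so the backward pass applies the chain rule layer by layer, but the loop is the same: forward, compare, propagate the error back, adjust.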