Automatic Differentiation with TensorFlow — Topic 64 of Machine Learning Foundations (Segment 3 of Subject 3, "Limits & Derivatives")
Computer Science: How does automatic differentiation work?
In mathematics and computer algebra, automatic differentiation (AD), also called algorithmic differentiation or computational differentiation, is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations and elementary functions. By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program.
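The chain-rule mechanics described above can be sketched with forward-mode AD using dual numbers, where each value carries its derivative and every elementary operation propagates both. This `Dual` class is a hypothetical illustration written for this note, not TensorFlow's API (TensorFlow's `tf.GradientTape` implements reverse-mode AD, which applies the same chain-rule idea in the opposite direction):

```python
import math


class Dual:
    """A value paired with its derivative with respect to one chosen input."""

    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (f + g)' = f' + g'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (f * g)' = f' g + f g'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def sin(x):
    # Chain rule for an elementary function: (sin f)' = cos(f) * f'
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)


# Differentiate f(x) = x**2 + sin(x) at x = 2 by seeding dx/dx = 1;
# the derivative falls out of the ordinary sequence of operations.
x = Dual(2.0, 1.0)
y = x * x + sin(x)
print(y.value)  # f(2)  = 4 + sin(2)
print(y.deriv)  # f'(2) = 4 + cos(2)
```

Because each overloaded operation does only a constant amount of extra arithmetic, evaluating `y.deriv` costs a small constant factor more than evaluating `f` itself, and the result is exact to working precision, exactly as the paragraph above states.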