
Video encyclopedia

Theorems

Popular in this category (1,760)

Science

Pythagorean theorem

In mathematics, the Pythagorean theorem, also known as Pythagoras' theorem, is a fundamental relation in Euclidean geometry among the three sides of a right triangle. It states that the square of the hypotenuse is equal to the sum of the squares of the other two sides. The theorem can be written as an equation relating the lengths of the sides a, b and c, often called the "Pythagorean equation":
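With c denoting the length of the hypotenuse and a and b the lengths of the other two sides, the standard form is

$$a^{2} + b^{2} = c^{2}.$$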

Science

Bayes' theorem

In probability theory and statistics, Bayes’ theorem describes the probability of an event, based on prior knowledge of conditions that might be related to the event. For example, if cancer is related to age, then, using Bayes’ theorem, a person’s age can be used to more accurately assess the probability that they have cancer, compared to the assessment of the probability of cancer made without knowledge of the person's age.
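In standard notation, with A and B events and P(B) ≠ 0 (symbols not defined in the passage above), the theorem is usually stated as

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.$$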

Science

Euler's formula

Euler's formula, named after Leonhard Euler, is a mathematical formula in complex analysis that establishes the fundamental relationship between the trigonometric functions and the complex exponential function. Euler's formula states that, for any real number x:
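$$e^{ix} = \cos x + i \sin x,$$

where e is the base of the natural logarithm and i is the imaginary unit.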

Science

Central limit theorem

In probability theory, the central limit theorem (CLT) establishes that, in some situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.
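In its classical form, for independent, identically distributed variables X_1, X_2, … with mean μ and finite variance σ² (notation not used in the passage above), the theorem says that the standardized sample mean converges in distribution to a standard normal distribution:

$$\sqrt{n}\,\frac{\bar{X}_n - \mu}{\sigma} \;\xrightarrow{d}\; \mathcal{N}(0, 1).$$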

Science

Binomial theorem

In elementary algebra, the binomial theorem describes the algebraic expansion of powers of a binomial. According to the theorem, it is possible to expand the polynomial (x + y)^n into a sum involving terms of the form a x^b y^c, where the exponents b and c are nonnegative integers with b + c = n, and the coefficient a of each term is a specific positive integer depending on n and b. For example, for n = 4:
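$$(x + y)^{4} = x^{4} + 4x^{3}y + 6x^{2}y^{2} + 4xy^{3} + y^{4}.$$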

Science

Nash equilibrium

In game theory, the Nash equilibrium, named after American mathematician John Forbes Nash Jr., is a solution concept of a non-cooperative game involving two or more players in which each player is assumed to know the equilibrium strategies of the other players, and no player has anything to gain by changing only their own strategy. If each player has chosen a strategy and no player can benefit by changing strategies while the other players keep theirs unchanged, then the current set of strategy choices and the corresponding payoffs constitutes a Nash equilibrium. The Nash equilibrium is one of the foundational concepts in game theory. Whether players actually play the Nash equilibrium of a game can be tested using experimental economics methods.
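Formally, writing u_i for player i's payoff and s_{-i} for the strategies of the other players (notation not used in the passage above), a strategy profile s* is a Nash equilibrium if, for every player i and every alternative strategy s_i,

$$u_i(s_i^{*}, s_{-i}^{*}) \;\geq\; u_i(s_i, s_{-i}^{*}).$$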

Science

Chebyshev's inequality

In probability theory, Chebyshev's inequality guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k^2 of the distribution's values can be more than k standard deviations away from the mean. In statistics, the rule about the range of standard deviations around the mean is often called Chebyshev's theorem. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
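For a random variable X with mean μ and standard deviation σ (notation not used in the passage above), the inequality reads

$$P\big(|X - \mu| \geq k\sigma\big) \;\leq\; \frac{1}{k^{2}} \qquad \text{for any } k > 0.$$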

Science

De Morgan's laws

In propositional logic and boolean algebra, De Morgan's laws are a pair of transformation rules that are both valid rules of inference. They are named after Augustus De Morgan, a 19th-century British mathematician. The rules allow the expression of conjunctions and disjunctions purely in terms of each other via negation.
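In the notation of propositional logic, with ¬ for negation, ∧ for conjunction, and ∨ for disjunction, the two rules are

$$\neg (P \land Q) \equiv \neg P \lor \neg Q, \qquad \neg (P \lor Q) \equiv \neg P \land \neg Q.$$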

Science

Fermat's Last Theorem

In number theory, Fermat's Last Theorem states that no three positive integers a, b, and c satisfy the equation a^n + b^n = c^n for any integer value of n greater than 2. The cases n = 1 and n = 2 have been known to have infinitely many solutions since antiquity.

Science

Euler's identity

In mathematics, Euler's identity is the equality
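$$e^{i\pi} + 1 = 0,$$

where e is Euler's number (the base of the natural logarithm), i is the imaginary unit satisfying i^2 = −1, and π is the ratio of a circle's circumference to its diameter.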

Science

Law of large numbers

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.
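In symbols, if X_1, X_2, … are independent, identically distributed variables with expected value μ and X̄_n denotes their sample mean (notation not used in the passage above), the law asserts that

$$\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \;\longrightarrow\; \mu \quad \text{as } n \to \infty.$$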

Science

Gaussian integral

The Gaussian integral, also known as the Euler–Poisson integral, is the integral of the Gaussian function e^(−x^2) over the entire real line. It is named after the German mathematician Carl Friedrich Gauss. The integral is:
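$$\int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi}.$$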

Science

L'Hôpital's rule

In mathematics, and more specifically in calculus, L'Hôpital's rule or L'Hospital's rule uses derivatives to help evaluate limits involving indeterminate forms. Application of the rule often converts an indeterminate form to an expression that can be evaluated by substitution, allowing easier evaluation of the limit. The rule is named after the 17th-century French mathematician Guillaume de l'Hôpital. Although the contribution of the rule is often attributed to L'Hôpital, the theorem was first introduced to L'Hôpital in 1694 by the Swiss mathematician Johann Bernoulli.
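In a common form of the rule (symbols not used in the passage above): if f and g are differentiable near a point c, the limits of f(x) and g(x) as x approaches c are both 0 or both ±∞, g′(x) ≠ 0 near c, and the limit on the right below exists, then

$$\lim_{x \to c} \frac{f(x)}{g(x)} = \lim_{x \to c} \frac{f'(x)}{g'(x)}.$$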

Science

Law of cosines

In trigonometry, the law of cosines relates the lengths of the sides of a triangle to the cosine of one of its angles. For a triangle with side lengths a, b, and c, where γ denotes the angle opposite the side of length c, the law of cosines states:
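$$c^{2} = a^{2} + b^{2} - 2ab\cos\gamma.$$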

Science

Stirling's approximation

In mathematics, Stirling's approximation is an approximation for factorials. It is a good approximation, leading to accurate results even for small values of n. It is named after James Stirling, though it was first stated by Abraham de Moivre.
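In its most common form (notation not used in the passage above), the approximation states that

$$n! \;\sim\; \sqrt{2\pi n}\,\left(\frac{n}{e}\right)^{n},$$

meaning that the ratio of the two sides tends to 1 as n → ∞.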