Theory of Limits

The original formulation of the calculus by its originators Isaac Newton, Gottfried Leibniz, Jacob and Johann Bernoulli, Guillaume de l'Hôpital, and others rested on the assertion that there could be quantities that are infinitesimal, meaning that they are vanishingly small, yet such that an infinite number of such quantities added together would yield a finite number. While such a notion did permit the ancients to deduce formulas for some classical mathematical objects, for example the circumference and the area of a circle, it was poorly formulated, which led to paradoxes and philosophical disputes.

Infinitesimals were banished from the realm of analysis in the 19th century with the introduction of the concept of the limit by Bernhard Bolzano in 1817, and with its refinement into the modern form of the method by Augustin-Louis Cauchy and later by Karl Weierstrass.

A sketch of the theory of limits

We are used to the notion of a function as a rule that maps each real number $x$ to one and only one real number $f(x)$. In mathematical analysis we are concerned with the behavior of functions in the vicinity of points where they are defined.

For example, it is easy to see that the functions $\sin x$ and $x$ are both equal to zero at $x = 0$, but it is not so clear as to what the value of

$$\frac{\sin x}{x}$$

is at $x = 0$. Both $\sin x$ and $x$ are zero at $x = 0$, but we know that we cannot simply perform the division, because division by zero is not defined in mathematics. Thus we need an approach where we examine functions in a neighborhood of the point of interest. This approach is called the Theory of Limits.
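To make this concrete, here is a minimal numerical sketch in Python (our own illustration, not part of the original article): it evaluates the quotient $\sin x / x$ for a sequence of values of $x$ shrinking toward zero. Although the quotient is undefined at $x = 0$ itself, the printed values settle toward 1, which is exactly the behavior the theory of limits is designed to capture.

    import math

    # Evaluate sin(x)/x for x approaching 0 from the right.
    # The quotient is undefined at x = 0, but the values
    # printed here approach 1 as x shrinks.
    for k in range(1, 8):
        x = 10.0 ** (-k)
        print(f"x = {x:.0e}   sin(x)/x = {math.sin(x) / x:.12f}")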

Formal definition of the limit of a function

We say that a real number $L$ is the limit of a real-valued function $f(x)$ at $x = a$ if for every $\varepsilon > 0$ there exists $\delta > 0$ such that $|f(x) - L| < \varepsilon$ whenever $0 < |x - a| < \delta$.
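As a simple worked example (added here for illustration, not from the original text), consider $f(x) = 3x$ at $a = 2$ with claimed limit $L = 6$. Given any $\varepsilon > 0$, choose $\delta = \varepsilon / 3$. Then whenever $0 < |x - 2| < \delta$,

$$|f(x) - 6| = |3x - 6| = 3\,|x - 2| < 3\delta = \varepsilon,$$

so the definition is satisfied and $\lim_{x \to 2} 3x = 6$.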

The fundamental notion is that $\varepsilon$ constrains the range of the function, whereas $\delta$ constrains the domain of $f$. These constraints are in the form of open sets. Because $\varepsilon$ is arbitrary, the open sets defined by $\varepsilon$ and $\delta$ may be made as small as we desire. In effect, this formal definition is like a microscope of infinite resolution, allowing us to examine the behavior of $f(x)$ at any length scale. As we reduce the value of $\varepsilon$ or $\delta$, we examine the function at ever finer scales near the point of interest.
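The following Python sketch (again our own illustration, assuming the limit $\lim_{x \to 0} \sin x / x = 1$ from the earlier example) mimics this microscope: for each tolerance $\varepsilon$ it halves a candidate $\delta$ until every sampled $x$ with $0 < x \le \delta$ satisfies $|\sin x / x - 1| < \varepsilon$. The smaller the $\varepsilon$, the smaller the $\delta$ that is required.

    import math

    def within_tolerance(eps, delta, samples=1000):
        """Check |sin(x)/x - 1| < eps on sampled x in (0, delta]."""
        for i in range(1, samples + 1):
            x = delta * i / samples  # sample points in (0, delta]
            if abs(math.sin(x) / x - 1.0) >= eps:
                return False
        return True  # sin(x)/x is even, so (−delta, 0) behaves the same

    for eps in (1e-1, 1e-2, 1e-3, 1e-4):
        delta = 1.0
        # Shrink delta until the epsilon-constraint on the range holds.
        while not within_tolerance(eps, delta):
            delta /= 2.0
        print(f"eps = {eps:.0e}  ->  delta = {delta:.6f} works")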

We use the notation $\lim_{x \to a} f(x) = L$ to indicate that $L$ is the limit of $f(x)$ as $x$ approaches $a$.