Typically, high school mathematics study ends at calculus. While not the epitome of all mathematics, it does, in my opinion, deserve a little time in the limelight. Thus, in a series of posts from now into mid-February, I want to delve into the history of calculus.

• Part 1 – Limits
• Part 2 – Derivatives
• Part 3 – Integrals
• Part 4 – Applications

Let us begin.

A study of limits must precede a study of calculus. Calculus concerns itself with the calculation and application of derivatives and integrals, which find their formal definitions in the concept of a limit.

The works of Sir Isaac Newton and Gottfried Leibniz, mathematicians and philosophers of the Scientific Revolution, prepared the way for future analysts to formalize limits. They, most will assert, discovered calculus.

In Newton and Leibniz’s day, scientists could very well calculate the average slope of a curve between two points, but did not know how to calculate the exact slope at an individual point on a curve, i.e., the slope of the tangent line at that point.

To find a slope, one divides the change in y between two points by the change in x between those same two points. As the distance between the two points shrinks, drawing closer and closer to 0, representing an ever-smaller change in x, the calculated slope approaches the actual slope of the curve at that point.
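We can watch this happen numerically. Below is a quick sketch (the function and values are my own illustration, not from the original text): the slope of the secant line through $f(x) = x^2$ at $x = 1$ closes in on the tangent slope, 2, as the second point draws nearer.

```python
def f(x):
    return x ** 2

def secant_slope(f, x, h):
    """Slope between (x, f(x)) and (x + h, f(x + h)):
    change in y divided by change in x."""
    return (f(x + h) - f(x)) / h

# As h shrinks toward 0, the secant slope approaches the tangent slope at x = 1.
for h in [1.0, 0.1, 0.01, 0.001]:
    print(h, secant_slope(f, 1.0, h))
```

Running this, the printed slopes march from 3.0 down toward 2, the exact slope of the tangent line at $x = 1$.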

Newton’s and Leibniz’s formulation of differential calculus depended on the concept of an infinitesimal: a non-zero quantity of negligible size. Think of it as a positive quantity greater than zero yet smaller than every positive real number, or a negative quantity less than zero yet greater than every negative real number. Accepting infinitesimals allowed them to carry out their computations and define derivatives.

Generations of mathematicians solved equations with the infinitesimals approach to calculus, but the idea bothered many. In the 19th century, Augustin-Louis Cauchy, Karl Weierstrass, and Bernhard Riemann reformulated calculus in terms of limits, rather than infinitesimals.

A limit gives the value a function or sequence approaches as its input approaches some point. Rather than relying on negligible quantities, limits allowed mathematicians to solve equations with real, full-bodied numbers.

A limit concerns a function’s behavior around a point, not its value at the point itself. Take the graph of the function f(x) below, for example.

The limit as x approaches 2 is 4. Although f(2) does not, in fact, exist, as denoted by the open circle at that point, the function f(x) approaches that value from both the left and the right, so we say that $\lim_{x \rightarrow 2} f(x) = 4$.
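A function with exactly this behavior is easy to write down, and a short sketch makes the idea concrete (this particular function is my own stand-in for the graph, not necessarily the one pictured): $g(x) = \frac{x^2 - 4}{x - 2}$ is undefined at $x = 2$, yet its values approach 4 from both sides.

```python
def g(x):
    # Undefined at x = 2 (division by zero), just like the open circle
    # on the graph, but well-behaved everywhere around it.
    return (x ** 2 - 4) / (x - 2)

# Approach x = 2 from the left and from the right:
for x in [1.9, 1.99, 1.999, 2.001, 2.01, 2.1]:
    print(x, g(x))
```

Evaluating g at 2 itself raises a `ZeroDivisionError`, but the printed values close in on 4 from both directions, which is exactly what the limit captures.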

In the next couple of posts, we’ll realize the importance of limits in understanding derivatives and integrals.
