Suppose a function \(f\) has a minimum at a point \(c\) in some open interval \(I\text{.}\) If \(f\) is differentiable at \(c\) then \(f'(c) = 0\text{.}\)
The proof of this theorem is accessible and conceptually relevant. The result should seem very credible at an intuitive level. If \(f'(c) < 0\) then moving a little to the right, from \(c\) to \(c + \varepsilon\text{,}\) should produce a smaller value of \(f\text{,}\) which cannot happen at a minimum. Likewise, if \(f'(c) > 0\) then moving a little to the left should produce a smaller value. This is the most intuitive justification we could write down, though not exactly airtight.
Here is a more airtight argument. Because \(f\) is differentiable at \(c\text{,}\) the one-sided derivatives exist and are equal. The derivative from the right is \(\lim_{x \to c^+} \frac{f(x) - f(c)}{x-c}\text{;}\) because \(c\) is a minimum, the denominator of this fraction is positive and the numerator is nonnegative. The limit of nonnegative numbers is nonnegative, hence \(f'(c_+) \geq 0\text{;}\) see Figure 7.6. Similarly, \(f'(c_-)\) is a limit in which each term is nonpositive, thus \(f'(c_-) \leq 0\text{.}\) For these to be equal, both must equal zero. This finishes the proof.
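For reference, the same argument condensed into symbols, using the one-sided notation above, amounts to the pair of inequalities
\begin{equation*}
f'(c_+) = \lim_{x \to c^+} \frac{f(x) - f(c)}{x-c} \geq 0
\qquad \text{and} \qquad
f'(c_-) = \lim_{x \to c^-} \frac{f(x) - f(c)}{x-c} \leq 0\text{,}
\end{equation*}
and differentiability at \(c\) forces \(f'(c_+) = f'(c_-) = f'(c) = 0\text{.}\)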
Suppose \(f\) is differentiable on \([a,b]\) (derivatives at the endpoints are one-sided). If the minimum of \(f\) on this interval occurs at the left endpoint, can you conclude that the one-sided derivative there is zero?
The conclusion that the derivative must equal zero at an endpoint minimum is not necessarily true! Nevertheless, everyone’s favorite procedure for finding minima is to set \(f'\) equal to zero. Why does this work, or rather, when does this work?
From Theorem 7.4, if \(f\) is defined and continuous on a closed interval \([a,b]\text{,}\) then indeed \(f\) has to have a minimum and a maximum somewhere on \([a,b]\text{.}\) We can use Theorem 7.5 to find where minima and maxima don’t occur: if \(a < c < b\) and \(f'(c) \neq 0\text{,}\) then definitely the minimum does not occur at \(c\text{.}\) Where can it be then? What’s left is the point \(a\text{,}\) the point \(b\text{,}\) every point where \(f'\) is zero, and every point where \(f'\) does not exist. An identical argument shows the same is true for the maximum. Summing up:
Suppose \(f\) is continuous on \([a,b]\) and differentiable everywhere on \((a,b)\) except for a finite number of points \(c_1, \ldots, c_k\text{.}\) Then the minimum value of \(f\) on \([a,b]\) occurs at one or more of the points \(\{ a, b, c_1, \ldots , c_k , \mbox{ anywhere } f' = 0 \}\text{,}\) and nowhere else. The maximum also occurs at one or more of these points and nowhere else.
Being differentiable except at a finite number of points (call them \(c_0,\ldots,c_k\)) is sometimes called being piecewise differentiable, because the function is differentiable in pieces, the pieces being the intervals \((c_0, c_1) , (c_1, c_2), \ldots , (c_{k-1} , c_k)\text{.}\)
Let \(f\) be the “sawtooth” function shown in Figure 7.10, defined by letting \(f(x)\) be the distance from \(x\) to the nearest integer, either \(\lfloor x \rfloor\) or \(\lceil x \rceil\text{.}\)
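If you would like to experiment with this function on a computer, here is one possible way to write it down; this is only a sketch, and the name sawtooth is a label chosen here rather than notation from the text.

```python
import math

def sawtooth(x):
    """Distance from x to the nearest integer, either floor(x) or ceil(x)."""
    return min(x - math.floor(x), math.ceil(x) - x)

# The graph climbs from 0 at each integer up to 1/2 at each half-integer:
print(sawtooth(2.0), sawtooth(2.25), sawtooth(2.5), sawtooth(2.75))   # 0.0 0.25 0.5 0.25
```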
You can write Theorem 7.8 as a procedure if you want. Even if you’re looking only for the minimum or only for the maximum, the procedure is the same, so it will find both.
First, list the candidate points from Theorem 7.8: the endpoints \(a\) and \(b\text{,}\) every point where \(f' = 0\text{,}\) and every point where \(f'\) does not exist. Then, for every point \(x\) on this list, compute \(f(x)\text{;}\) the greatest value on this second list (the output list) will be the maximum and the least will be the minimum.
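Written as a short program, the procedure might look like the sketch below; the function name and the illustration with \(|x|\) are made up here, not taken from the text.

```python
def extrema_from_candidates(f, candidates):
    """Evaluate f at each candidate point (the endpoints, the zeros of f',
    and the points where f' does not exist) and read off both extremes."""
    values = [(x, f(x)) for x in candidates]            # the "output list"
    minimum = min(values, key=lambda pair: pair[1])
    maximum = max(values, key=lambda pair: pair[1])
    return minimum, maximum

# Illustration: f(x) = |x| on [-1, 2].  f' is never zero and fails to exist at 0,
# so the candidate list is the two endpoints together with 0.
print(extrema_from_candidates(abs, [-1, 0, 2]))         # ((0, 0), (2, 2))
```

The minimum value \(0\) occurs at \(x = 0\) and the maximum value \(2\) at \(x = 2\text{,}\) exactly the points the theorem allows.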
Find the maximum of \(f(x) := 5x - x^2\) on the interval \([1,3]\text{;}\) see the figure at the right. Computing \(f'(x) = 5-2x\) and setting it equal to zero we see that \(f'(x) = 0\) precisely when \(x = 2.5\text{.}\) There are no points where \(f'\) is undefined, so our list consists of just the one point plus the two endpoints: \(\{ 1, 2.5, 3 \}\text{.}\) Checking the values of \(f\) there produces \(4, 6.25, 6\text{.}\) The maximum is the greatest of these, occurring at \(x = 2.5\text{.}\)
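To double-check the arithmetic by machine, here is a sketch using the sympy library (my choice of tool, not the text’s); it solves \(f'(x) = 0\) and evaluates \(f\) at the three candidate points.

```python
import sympy as sp

x = sp.symbols('x')
f = 5*x - x**2                                    # the function from the example

critical = sp.solve(sp.diff(f, x), x)             # solves 5 - 2x = 0, giving [5/2]
candidates = [1] + critical + [3]                 # endpoints plus the critical point
print([(c, f.subs(x, c)) for c in candidates])    # [(1, 4), (5/2, 25/4), (3, 6)]
```

The largest of the three values is \(25/4 = 6.25\text{,}\) occurring at \(x = 5/2\text{,}\) which matches the computation above.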
Here are some other things you may find when you use Theorem 7.8. Match each of these verbal descriptions to the role of \(x\) in one of the four pictures in Figure 7.14.
This example represents a scenario where you make a donation in bitcoin to enter a virtual tourist attraction and you want to spend as little as possible. You have 1 bitcoin, so that’s the maximum you can donate; donations can be any positive real number but zero is not allowed.
The interpretation is clear: no matter how little you donate, you could have donated less. Mathematically, this clarifies the need for a closed interval in Theorem 7.4.
Recall that wherever \(f\) has a second derivative, if \(f'' \neq 0\) then the sign of \(f''\) determines the concavity of \(f\text{.}\) If \(f''(x) > 0\) then \(f\) is concave upward and if \(f''(x) < 0\) then \(f\) is concave downward. At a point where \(f' = 0\text{,}\) if we know the concavity, we know whether \(f\) has a local maximum or local minimum: concave upward gives a local minimum, and concave downward gives a local maximum.
What are the extrema of the function \(f(x) := x^2 + 1/x\) on the interval \((0,2)\text{?}\) The only critical point is where \(f'(x) = 2x - 1/x^2 = 0\text{,}\) hence \(x = \sqrt[3]{1/2}\text{.}\) Here, \(f''(x) = 2 + 2/x^3 > 0\text{,}\) therefore this is a local minimum. There are no local maxima. This means \(f\) has no global maximum on \((0,2)\text{.}\) It may have a global minimum, and indeed, Figure 7.18 shows that the local minimum at \(x = \sqrt[3]{1/2}\) is the global minimum. In your homework you will get some more tools for arguing whether a local extremum on a non-closed interval is a global extremum.
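The same computation can be sketched in sympy (again my choice of tool, not the text’s), confirming the critical point and the sign of \(f''\) there.

```python
import sympy as sp

x = sp.symbols('x', positive=True)        # only 0 < x < 2 matters here
f = x**2 + 1/x

fprime = sp.diff(f, x)                    # 2*x - 1/x**2
critical = sp.solve(fprime, x)            # one real root, the cube root of 1/2
fsecond = sp.diff(f, x, 2)                # 2 + 2/x**3

c = critical[0]
print(sp.N(c))                            # about 0.7937, matching (1/2)**(1/3)
print(sp.simplify(fsecond.subs(x, c)))    # 6, which is positive: a local minimum
```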