Partial Differentiation Classroom
60.1 Partial Derivatives Intro
Partial differentiation extends the concept of differentiation to functions of multiple variables. While an ordinary derivative measures the rate of change of a function of a single variable, a partial derivative measures the rate of change of a function of two or more independent variables with respect to one of them, holding the others constant.
For a function $f(x, y)$, the partial derivative with respect to $x$, denoted as $\frac{\partial f}{\partial x}$ or $f_x$, is found by treating $y$ as a constant and differentiating with respect to $x$. Similarly, the partial derivative with respect to $y$, denoted as $\frac{\partial f}{\partial y}$ or $f_y$, is found by treating $x$ as a constant and differentiating with respect to $y$.
60.2 First-Order Partial Derivatives
To find the first-order partial derivatives of a function $f(x, y, z, ...)$, you differentiate with respect to one variable while holding all other variables constant. Any term that does not contain the differentiation variable is treated as a constant and differentiates to zero, while in terms that mix variables (such as the product $xy$), the other variables act as constant coefficients.
For example, if $f(x,y) = x^3 + 2xy + y^2$, then:
- To find $\frac{\partial f}{\partial x}$: Treat $y$ as a constant. $\frac{\partial f}{\partial x} = 3x^2 + 2y + 0 = 3x^2 + 2y$.
- To find $\frac{\partial f}{\partial y}$: Treat $x$ as a constant. $\frac{\partial f}{\partial y} = 0 + 2x + 2y = 2x + 2y$.
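As a quick numerical check of the example above, the partial derivatives can be approximated by central differences, perturbing one variable while holding the other fixed (a sketch; the helper names and step size `h` are choices made here for illustration):

```python
def f(x, y):
    return x**3 + 2*x*y + y**2

def partial_x(f, x, y, h=1e-5):
    # Central difference in x, holding y fixed.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-5):
    # Central difference in y, holding x fixed.
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# At (x, y) = (2, 3): f_x = 3(2)^2 + 2(3) = 18 and f_y = 2(2) + 2(3) = 10.
print(partial_x(f, 2, 3))  # close to 18
print(partial_y(f, 2, 3))  # close to 10
```

The numerical values agree with the analytic formulas $3x^2 + 2y$ and $2x + 2y$ up to the truncation error of the finite-difference scheme.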
60.3 Second-Order Partial Derivatives
Second-order partial derivatives involve differentiating a partial derivative with respect to another variable (or the same variable again). There are four possible second-order partial derivatives for a function $f(x, y)$:
- $f_{xx}$ or $\frac{\partial^2 f}{\partial x^2}$: Differentiate $f_x$ with respect to $x$.
- $f_{yy}$ or $\frac{\partial^2 f}{\partial y^2}$: Differentiate $f_y$ with respect to $y$.
- $f_{xy}$ or $\frac{\partial^2 f}{\partial y \partial x}$: Differentiate $f_x$ with respect to $y$. (Mixed partial)
- $f_{yx}$ or $\frac{\partial^2 f}{\partial x \partial y}$: Differentiate $f_y$ with respect to $x$. (Mixed partial)
Clairaut's Theorem (also called Schwarz's Theorem) states that if the mixed partial derivatives $f_{xy}$ and $f_{yx}$ are both continuous on an open disk containing a point, then they are equal at that point: $f_{xy} = f_{yx}$.
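Clairaut's Theorem can be illustrated numerically by nesting central differences in each order; for the example $f(x,y) = x^3 + 2xy + y^2$, both orders should give $f_{xy} = f_{yx} = 2$ (a sketch; the function, helper names, and step size are illustrative choices):

```python
def f(x, y):
    return x**3 + 2*x*y + y**2

def partial_x(g, x, y, h=1e-4):
    return (g(x + h, y) - g(x - h, y)) / (2 * h)

def partial_y(g, x, y, h=1e-4):
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

def f_xy(x, y):
    # Differentiate f_x with respect to y.
    return partial_y(lambda u, v: partial_x(f, u, v), x, y)

def f_yx(x, y):
    # Differentiate f_y with respect to x.
    return partial_x(lambda u, v: partial_y(f, u, v), x, y)

# Both mixed partials are 2 for this f, regardless of the order.
print(f_xy(2.0, 3.0), f_yx(2.0, 3.0))
```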
61. Total Differential & Changes
61.1 Total Differential
For a function of multiple variables, the total differential represents the total change in the function due to small changes in all of its independent variables. For a function $z = f(x, y)$, the total differential $dz$ is given by:
$$ dz = \frac{\partial f}{\partial x}dx + \frac{\partial f}{\partial y}dy $$
Where $dx$ and $dy$ represent infinitesimal changes in $x$ and $y$, respectively.
61.2 Rates of Change (Multi-variable)
When a multi-variable function depends on variables that themselves depend on another single variable (e.g., time $t$), we can find the total rate of change of the function with respect to that single variable using the chain rule. For $z = f(x, y)$, where $x = x(t)$ and $y = y(t)$, the rate of change of $z$ with respect to $t$ is:
$$ \frac{dz}{dt} = \frac{\partial z}{\partial x}\frac{dx}{dt} + \frac{\partial z}{\partial y}\frac{dy}{dt} $$
This generalizes to functions of more variables as well.
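The chain rule formula can be checked on a small example chosen here for illustration: with $z = x^2 y$, $x = t^2$, and $y = t^3$, substitution gives $z = t^7$ directly, so $dz/dt = 7t^6$ should agree with the chain-rule computation:

```python
def dz_dt(t):
    x, y = t**2, t**3
    zx, zy = 2*x*y, x**2          # partial derivatives of z = x^2 * y
    dxdt, dydt = 2*t, 3*t**2      # derivatives of x(t) and y(t)
    # Chain rule: dz/dt = z_x * dx/dt + z_y * dy/dt
    return zx*dxdt + zy*dydt

# Direct substitution gives z = t^7, so dz/dt = 7 t^6; at t = 2 both equal 448.
print(dz_dt(2))  # 448
```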
61.3 Small Changes (Multi-variable)
The total differential can also be used to approximate the change in a function's value for small, finite changes in its independent variables. This is particularly useful for estimation. For a function $f(x, y)$ with small changes $\Delta x$ and $\Delta y$ from an initial point $(x_0, y_0)$, the approximate change in $f$, denoted $\Delta f$, is:
$$ \Delta f \approx df = \frac{\partial f}{\partial x}(x_0, y_0)\Delta x + \frac{\partial f}{\partial y}(x_0, y_0)\Delta y $$
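As an illustrative estimate (the function and the point are chosen here as an example), consider approximating the change in $f(x, y) = x^2 y$ as $(x, y)$ moves from $(2, 3)$ to $(2.01, 2.98)$:

```python
def f(x, y):
    return x**2 * y

x0, y0 = 2.0, 3.0
dx, dy = 0.01, -0.02

fx = 2 * x0 * y0   # f_x(2, 3) = 12
fy = x0**2         # f_y(2, 3) = 4

# Linear approximation: df = f_x dx + f_y dy
approx = fx*dx + fy*dy
exact = f(x0 + dx, y0 + dy) - f(x0, y0)
print(approx, exact)  # approx is 0.04; exact is about 0.0395
```

The linear estimate $0.04$ is within about $0.0005$ of the true change, which is typical accuracy when $\Delta x$ and $\Delta y$ are small.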
62. Maxima, Minima & Saddle Points (2 Var)
62.1 Functions of Two Variables
Just as with functions of a single variable, we can find local maxima, local minima, and saddle points for functions of two variables, $f(x, y)$. These points correspond to "peaks," "valleys," and "passes" on the surface defined by $z = f(x, y)$.
A critical point $(a, b)$ of $f(x, y)$ is a point where $f_x(a, b) = 0$ and $f_y(a, b) = 0$, or where at least one of the partial derivatives does not exist.
62.2 Identifying Max, Min, Saddle
To classify a critical point $(a, b)$ for a function $f(x, y)$, we use the Second Derivative Test (or D-Test). This test involves calculating the second-order partial derivatives and forming the discriminant $D$ (also known as the Hessian determinant):
$$ D(x, y) = f_{xx}(x, y)f_{yy}(x, y) - [f_{xy}(x, y)]^2 $$
Then, evaluate $D(a, b)$ and $f_{xx}(a, b)$ at each critical point:
- If $D(a, b) > 0$ and $f_{xx}(a, b) > 0$, then $f$ has a local minimum at $(a, b)$.
- If $D(a, b) > 0$ and $f_{xx}(a, b) < 0$, then $f$ has a local maximum at $(a, b)$.
- If $D(a, b) < 0$, then $f$ has a saddle point at $(a, b)$.
- If $D(a, b) = 0$, the test is inconclusive, and further analysis is needed.
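The four cases above can be sketched as a small helper that takes the already-evaluated values $D(a, b)$ and $f_{xx}(a, b)$ (an illustrative sketch, not tied to any particular function):

```python
def classify(D, fxx):
    # Second Derivative Test at a critical point, given D(a, b) and f_xx(a, b).
    if D > 0 and fxx > 0:
        return "local minimum"
    if D > 0 and fxx < 0:
        return "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"   # D == 0: the test gives no information

print(classify(12, 6))   # local minimum
print(classify(-12, 6))  # saddle point
```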
62.3 Procedure for Max, Min, Saddle
Here's a step-by-step procedure to find and classify local extrema and saddle points for a function $f(x, y)$:
1. Find the first-order partial derivatives $f_x(x, y)$ and $f_y(x, y)$.
2. Set both partial derivatives to zero ($f_x = 0$ and $f_y = 0$) and solve the system of equations to find all critical points $(a, b)$.
3. Find the second-order partial derivatives $f_{xx}(x, y)$, $f_{yy}(x, y)$, and $f_{xy}(x, y)$.
4. Calculate the discriminant $D(x, y) = f_{xx}f_{yy} - (f_{xy})^2$.
5. For each critical point $(a, b)$ found in step 2, evaluate $D(a, b)$ and $f_{xx}(a, b)$ and use the Second Derivative Test to classify the point.
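The whole procedure can be worked through for the example $f(x, y) = x^3 - 3x + y^2$ (a function chosen here because its critical points can be found by hand):

```python
# f(x, y) = x^3 - 3x + y^2
# f_x = 3x^2 - 3 = 0  and  f_y = 2y = 0  give critical points (1, 0), (-1, 0).

def classify_critical_points():
    critical_points = [(1.0, 0.0), (-1.0, 0.0)]
    results = {}
    for a, b in critical_points:
        # Second-order partials: f_xx = 6x, f_yy = 2, f_xy = 0.
        fxx, fyy, fxy = 6*a, 2.0, 0.0
        # Discriminant D = f_xx f_yy - (f_xy)^2 = 12x.
        D = fxx*fyy - fxy**2
        # Second Derivative Test.
        if D > 0 and fxx > 0:
            results[(a, b)] = "local minimum"
        elif D > 0 and fxx < 0:
            results[(a, b)] = "local maximum"
        elif D < 0:
            results[(a, b)] = "saddle point"
        else:
            results[(a, b)] = "inconclusive"
    return results

print(classify_critical_points())
# {(1.0, 0.0): 'local minimum', (-1.0, 0.0): 'saddle point'}
```

At $(1, 0)$, $D = 12 > 0$ and $f_{xx} = 6 > 0$, so $f$ has a local minimum there; at $(-1, 0)$, $D = -12 < 0$, so $f$ has a saddle point.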
62.4 Max/Min/Saddle Problems
Let's apply the procedure to some common problems.
62.5 Advanced Max/Min/Saddle Problems
These problems may involve more complex algebraic solutions for critical points or more involved second derivative calculations.