THE MATHEMATICS OF
OPTIMIZATION
From Nicholson and Snyder, Microeconomic Theory: Basic
Principles and Extensions, 10th Edition, Chapter 2
The Mathematics of Optimization
Many economic theories begin with the
assumption that an economic agent is seeking
to find the optimal value of some function
consumers seek to maximize utility
firms seek to maximize profit
Maximization of a Function of One Variable
Example: Profit maximization
π = f(q)
[Figure: the profit curve π = f(q) over Quantity; maximum profits of π* occur at q*]
3
Maximization of a Function of One Variable
The manager will likely try to vary q to see where
the maximum profit occurs
an increase from q1 to q2 leads to a rise in π
[Figure: π = f(q) rising from q1 through q2 toward the maximum at q*; horizontal axis: Quantity]
4
Maximization of a Function of One Variable
If output is increased beyond q*, profit will decline
an increase from q* to q3 leads to a drop in π
[Figure: π = f(q) falling from its maximum at q* to q3; horizontal axis: Quantity]
5
Derivatives
The derivative of π = f(q) is the limit of Δπ/Δq
for very small changes in q:

dπ/dq = df/dq = lim(h→0) [f(q1 + h) - f(q1)]/h
The value of this ratio depends on the value
of q1
6
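The difference-quotient definition above can be checked numerically. A minimal sketch, assuming the profit function π = 1000q - 5q² that appears later in this chapter:

```python
# Numerical check of the derivative definition: for small h,
# [f(q1 + h) - f(q1)] / h approaches d(pi)/dq.
# Assumes the profit function pi = 1000q - 5q^2 used later in the chapter.

def profit(q):
    return 1000 * q - 5 * q ** 2

def difference_quotient(f, q1, h=1e-6):
    """[f(q1 + h) - f(q1)] / h, an approximation to f'(q1)."""
    return (f(q1 + h) - f(q1)) / h

# Analytically d(pi)/dq = 1000 - 10q, so the slope at q1 = 50 is 500.
slope = difference_quotient(profit, 50)
print(round(slope, 2))  # 500.0
```

The value of the quotient depends on the point q1 chosen, exactly as the slide notes.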
Value of a Derivative at a Point
The evaluation of the derivative at the point
q = q1 can be denoted

dπ/dq |q=q1

In our previous example,

dπ/dq |q=q1 > 0

dπ/dq |q=q3 < 0

dπ/dq |q=q* = 0
7
First Order Condition for a Maximum
For a function of one variable to attain its maximum
value at some point, the derivative at that point must
be zero
dπ/dq |q=q* = 0
Second Order Conditions
The first order condition (dπ/dq = 0) is a
necessary condition for a maximum, but it
is not a sufficient condition
If the profit function were U-shaped,
the first order condition would result
in q* being chosen, and π would
be minimized
[Figure: a U-shaped profit curve π = f(q) with its minimum at q*; horizontal axis: Quantity]
9
Second Order Conditions
This must mean that, in order for q* to be the
optimum,
dπ/dq > 0 for q < q*

and

dπ/dq < 0 for q > q*

Therefore, at q*, dπ/dq must be decreasing
10
Second Derivatives
The derivative of a derivative is called a
second derivative
The second derivative can be denoted by

d²π/dq²  or  d²f/dq²  or  f''(q)
11
Second Order Condition
The second order condition to represent a
(local) maximum is
d²π/dq² |q=q* = f''(q) |q=q* < 0
12
Rules for Finding Derivatives
1. If b is a constant, then db/dx = 0

2. If b is a constant, then d[b·f(x)]/dx = b·f'(x)

3. If b is a constant, then d(x^b)/dx = b·x^(b-1)

4. d(ln x)/dx = 1/x
13
Rules for Finding Derivatives
5. d(a^x)/dx = a^x·ln a
for any constant a
a special case of this rule is d(e^x)/dx = e^x
14
Rules for Finding Derivatives
Suppose that f(x) and g(x) are two functions
of x and that f'(x) and g'(x) exist
Then

6. d[f(x) + g(x)]/dx = f'(x) + g'(x)

7. d[f(x)·g(x)]/dx = f(x)·g'(x) + f'(x)·g(x)
15
Rules for Finding Derivatives
8. d[f(x)/g(x)]/dx = [f'(x)·g(x) - f(x)·g'(x)]/[g(x)]²

provided that g(x) ≠ 0
16
Rules for Finding Derivatives
If y = f(x) and x = g(z) and if both f'(x) and g'(z)
exist, then:

9. dy/dz = (dy/dx)·(dx/dz) = (df/dx)·(dg/dz)
This is called the chain rule. The chain rule allows
us to study how one variable (z) affects another
variable (y) through its influence on some
intermediate variable (x)
17
Rules for Finding Derivatives
Some examples of the chain rule include
10. d(e^(ax))/dx = [d(e^(ax))/d(ax)]·[d(ax)/dx] = e^(ax)·a = a·e^(ax)

11. d[ln(ax)]/dx = {d[ln(ax)]/d(ax)}·[d(ax)/dx] = [1/(ax)]·a = 1/x

12. d[ln(x²)]/dx = {d[ln(x²)]/d(x²)}·[d(x²)/dx] = (1/x²)·2x = 2/x
18
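Rules 10-12 can be spot-checked against central finite differences. A sketch with the illustrative values a = 3 and x = 2 (not from the text):

```python
import math

# Numerical spot-check of the chain-rule examples 10-12.
# Compares each analytic derivative to a central finite difference
# at the illustrative point x = 2 with constant a = 3.

def diff(f, x, h=1e-7):
    """Central difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

a, x = 3.0, 2.0
checks = [
    (diff(lambda t: math.exp(a * t), x), a * math.exp(a * x)),  # rule 10
    (diff(lambda t: math.log(a * t), x), 1 / x),                # rule 11
    (diff(lambda t: math.log(t ** 2), x), 2 / x),               # rule 12
]
for approx, exact in checks:
    assert abs(approx - exact) < 1e-3 * max(1.0, abs(exact))
print("all chain-rule examples check out")
```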
Example of Profit Maximization
Suppose that the relationship between profit and
output is
π = 1,000q - 5q²
The first order condition for a maximum is
dπ/dq = 1,000 - 10q = 0
q* = 100
Since the second derivative is always -10,
q* = 100 is a global maximum
19
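The profit example can be verified in a few lines; the sketch below applies the first- and second-order conditions from the slide:

```python
# The profit-maximization example worked numerically: pi = 1000q - 5q^2.

def profit(q):
    return 1000 * q - 5 * q ** 2

# FOC: d(pi)/dq = 1000 - 10q = 0  =>  q* = 100
q_star = 1000 / 10
# SOC: d2(pi)/dq2 = -10 < 0 everywhere, so q* is a global maximum.
assert profit(q_star) > profit(q_star - 1)
assert profit(q_star) > profit(q_star + 1)
print(q_star, profit(q_star))  # 100.0 50000.0
```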
Functions of Several Variables
Most goals of economic agents depend on several
variables
trade-offs must be made
The dependence of one variable (y) on a series of
other variables (x1, x2,…, xn) is denoted by

y = f(x1, x2,…, xn)
20
Partial Derivatives
The partial derivative of y with respect to x1 is
denoted by

∂y/∂x1  or  ∂f/∂x1  or  fx1  or  f1

It is understood that in calculating the partial
derivative, all of the other x's are held constant
21
Partial Derivatives
A more formal definition of the partial
derivative is
∂f/∂x1 |x2,…,xn = lim(h→0) [f(x1 + h, x2,…, xn) - f(x1, x2,…, xn)]/h
22
Calculating Partial Derivatives
1. If y = f(x1, x2) = a·x1² + b·x1·x2 + c·x2², then

∂f/∂x1 = f1 = 2a·x1 + b·x2
and
∂f/∂x2 = f2 = b·x1 + 2c·x2

2. If y = f(x1, x2) = e^(a·x1 + b·x2), then

∂f/∂x1 = f1 = a·e^(a·x1 + b·x2)
and
∂f/∂x2 = f2 = b·e^(a·x1 + b·x2)
23
Calculating Partial Derivatives
3. If y = f(x1, x2) = a·ln x1 + b·ln x2, then

∂f/∂x1 = f1 = a/x1
and
∂f/∂x2 = f2 = b/x2
24
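Partial differentiation holds the other arguments constant; a sketch checking example 1 against central differences (the values of a, b, c, x1, x2 below are illustrative, not from the text):

```python
# Check example 1, y = a*x1^2 + b*x1*x2 + c*x2^2, against finite
# differences that hold the other argument fixed.

a, b, c = 1.0, 2.0, 3.0      # illustrative coefficients
x1, x2 = 1.5, 2.5            # illustrative evaluation point

def f(x1, x2):
    return a * x1 ** 2 + b * x1 * x2 + c * x2 ** 2

h = 1e-6
f1 = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)  # x2 held constant
f2 = (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)  # x1 held constant

assert abs(f1 - (2 * a * x1 + b * x2)) < 1e-4   # f1 = 2a*x1 + b*x2
assert abs(f2 - (b * x1 + 2 * c * x2)) < 1e-4   # f2 = b*x1 + 2c*x2
print(round(f1, 4), round(f2, 4))
```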
Partial Derivatives
Partial derivatives are the mathematical
expression of the ceteris paribus
assumption
show how changes in one variable affect some
outcome when other influences are held
constant
25
Partial Derivatives
We must be concerned with how variables
are measured
if q represents the quantity of gasoline
demanded (measured in billions of liters) and p
represents the price in dollars per liter, then
∂q/∂p will measure the change in demand (in
billions of liters per year) for a dollar per liter
change in price
26
Elasticity
Elasticities measure the proportional effect
of a change in one variable on another
unit free
The elasticity of y with respect to x is

ey,x = (Δy/y)/(Δx/x) = (Δy/Δx)·(x/y) = (∂y/∂x)·(x/y)
27
Elasticity and Functional Form
Suppose that
y = a + bx + other terms
In this case,
ey,x = (∂y/∂x)·(x/y) = b·(x/y) = bx/(a + bx)

ey,x is not constant
it is important to note the point at which the
elasticity is to be computed
28
Elasticity and Functional Form
Suppose that
y = axb
In this case,
ey,x = (∂y/∂x)·(x/y) = ab·x^(b-1)·[x/(a·x^b)] = b
29
Elasticity and Functional Form
Suppose that
ln y = ln a + b ln x
In this case,
ey,x = (∂y/∂x)·(x/y) = ∂(ln y)/∂(ln x) = b
Elasticities can be calculated through
logarithmic differentiation
30
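The contrast between the linear and constant-elasticity forms can be sketched numerically (the parameter values a = 2, b = 3 are illustrative):

```python
# Elasticity e_{y,x} = (dy/dx) * (x/y), computed via a central
# difference, for the two functional forms on the preceding slides.

a, b = 2.0, 3.0              # illustrative parameters

def elasticity(f, x, h=1e-7):
    y = f(x)
    dydx = (f(x + h) - f(x - h)) / (2 * h)
    return dydx * x / y

linear = lambda x: a + b * x       # e varies with x: b*x/(a + b*x)
power = lambda x: a * x ** b       # e = b at every x

assert abs(elasticity(linear, 1.0) - 3 / 5) < 1e-5
assert abs(elasticity(linear, 2.0) - 6 / 8) < 1e-5
assert abs(elasticity(power, 1.0) - b) < 1e-3
assert abs(elasticity(power, 5.0) - b) < 1e-3
print("elasticity checks pass")
```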
Second-Order Partial Derivatives
The partial derivative of a partial derivative is called a
second-order partial derivative
fij = ∂(∂f/∂xi)/∂xj = ∂²f/∂xj∂xi
31
Young's Theorem
Under general conditions, the order in which
partial differentiation is conducted to evaluate
second-order partial derivatives does not
matter
fij = fji
32
Use of Second-Order Partials
Second-order partials play an important role
in many economic theories
One of the most important is a variable's
own second-order partial, fii
shows how the marginal influence of xi on
y (∂y/∂xi) changes as the value of xi increases
a value of fii < 0 indicates diminishing marginal
effectiveness
33
Maximization: Several Variables
Suppose that y = f(x1, x2,…, xn)
If all x's are varied by a small amount, the total
effect on y will be

dy = (∂f/∂x1)dx1 + (∂f/∂x2)dx2 + … + (∂f/∂xn)dxn

dy = f1dx1 + f2dx2 + … + fndxn
This expression is the total differential
34
First-Order Condition for a
Maximum (or Minimum)
A necessary condition for a maximum (or minimum) of the
function f(x1, x2,…, xn) is that dy = 0 for any combination of
small changes in the x's
The only way for this to be true is if

f1 = f2 = … = fn = 0
A point where this condition holds is called a critical point
35
Finding a Maximum
Suppose that y is a function of x1 and x2
y = -(x1 - 1)² - (x2 - 2)² + 10
y = -x1² + 2x1 - x2² + 4x2 + 5
First-order conditions imply that

∂y/∂x1 = -2x1 + 2 = 0
∂y/∂x2 = -2x2 + 4 = 0

OR

x1* = 1
x2* = 2
36
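The two-variable example can be checked by confirming that y falls for small moves away from (1, 2); a minimal sketch:

```python
# The example y = -(x1 - 1)^2 - (x2 - 2)^2 + 10, solved by the
# first-order conditions: (x1*, x2*) = (1, 2).

def y(x1, x2):
    return -(x1 - 1) ** 2 - (x2 - 2) ** 2 + 10

x1_star, x2_star = 1.0, 2.0
y_star = y(x1_star, x2_star)      # value at the critical point

# Any small move in any direction lowers y, so this is a maximum.
for dx1, dx2 in [(0.1, 0), (-0.1, 0), (0, 0.1), (0, -0.1), (0.1, 0.1)]:
    assert y(x1_star + dx1, x2_star + dx2) < y_star
print(y_star)  # 10.0
```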
Implicit Function Theorem
It may not always be possible to solve implicit
functions of the form g(x,y)=0 for unique
explicit functions of the form y = f(x)
mathematicians have derived the necessary
conditions
in many economic applications, these conditions are
the same as the second-order conditions for a
maximum (or minimum)
37
Derivatives of implicit functions
Implicit function: f(x, y) = 0
Total differential: 0 = fx·dx + fy·dy

dy/dx = -fx/fy

Hence, the implicit derivative dy/dx can be found
as the negative of the ratio of partial derivatives
of the implicit function.
38
Production Possibility Frontier
Earlier example: 2x² + y² = 225
Can be rewritten: f(x, y) = 2x² + y² - 225 = 0
Because fx = 4x and fy = 2y, the opportunity cost trade-off
between x and y is
dy/dx = -fx/fy = -4x/(2y) = -2x/y
39
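The implicit-derivative formula for the frontier can be compared with direct differentiation along the curve. A sketch at the illustrative point x = 5:

```python
import math

# Implicit differentiation on the frontier 2x^2 + y^2 = 225:
# dy/dx = -fx/fy = -2x/y, checked against a direct numeric slope.

def y_of_x(x):
    return math.sqrt(225 - 2 * x ** 2)   # upper branch of the frontier

x = 5.0
y = y_of_x(x)

# Slope from the implicit-function rule:
slope_implicit = -2 * x / y
# Slope from differencing directly along the frontier:
h = 1e-7
slope_numeric = (y_of_x(x + h) - y_of_x(x - h)) / (2 * h)

assert abs(slope_implicit - slope_numeric) < 1e-5
print(round(slope_implicit, 4))
```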
The Envelope Theorem
The envelope theorem concerns how the
optimal value for a particular function changes
when a parameter of the function changes
This is easiest to see by using an example
40
The Envelope Theorem
Suppose that y is a function of x
y = -x² + ax
For different values of a, this function
represents a family of inverted parabolas
If a is assigned a specific value, then y
becomes a function of x only and the value of x
that maximizes y can be calculated
41
The Envelope Theorem
Value of a   Value of x* = a/2   Value of y* = a²/4
    0               0                  0
    1              1/2                1/4
    2               1                  1
    3              3/2                9/4
    4               2                  4
    5              5/2               25/4
    6               3                  9

Optimal Values of x and y for alternative values of a
42
The Envelope Theorem
[Figure: y* = f(a), the value function traced out as a varies]
As a increases,
the maximal value
for y (y*) increases
The relationship
between a and y*
is quadratic
43
The Envelope Theorem
Suppose we are interested in how y* changes
as a changes
There are two ways we can do this
calculate the slope of y* = f(a) directly
hold x constant at its optimal value and calculate
∂y/∂a directly
44
The Envelope Theorem
To calculate the slope of the function, we must solve
for the optimal value of x for any value of a
dy/dx = -2x + a = 0

x* = a/2

Substituting, we get
y* = -(x*)² + a(x*) = -(a/2)² + a(a/2)
y* = -a²/4 + a²/2 = a²/4
45
The Envelope Theorem
Therefore,
dy*/da = 2a/4 = a/2 = x*
But, we can save time by using the envelope
theorem
for small changes in a, dy*/da can be computed by
holding x at x* and calculating ∂y/∂a directly from y
46
The Envelope Shortcut
For small changes in a, dy*/da can be computed
by holding x at x* and calculating ∂y/∂a directly
from y
∂y/∂a = x
Holding x = x*,
∂y/∂a = x* = a/2
This is the same result found earlier!
47
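Both routes to dy*/da can be compared numerically; a sketch at the illustrative value a = 3:

```python
# Envelope theorem for y = -x^2 + a*x: y*(a) = a^2/4, so
# dy*/da = a/2, which equals x*(a) = dy/da evaluated at x = x*.

def y(x, a):
    return -x ** 2 + a * x

def y_star(a):
    x_star = a / 2            # from dy/dx = -2x + a = 0
    return y(x_star, a)

a, h = 3.0, 1e-6
# Route 1: differentiate the value function y*(a) directly.
direct = (y_star(a + h) - y_star(a - h)) / (2 * h)
# Route 2: envelope shortcut, dy/da = x held at x* = a/2.
shortcut = a / 2

assert abs(direct - shortcut) < 1e-6
print(round(direct, 4))  # 1.5
```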
The Envelope Theorem: Summary
The envelope theorem states that the change in
the optimal value of a function with respect to a
parameter of that function can be found by
partially differentiating the objective function
while holding x (or several xs) at its optimal
value
dy * y
{x x *(a )}
da a
48
The Envelope Theorem: Many Variables
The envelope theorem can be extended to the
case where y is a function of several variables
y = f(x1,…, xn, a)
Finding an optimal value for y would consist of
solving n first-order equations
∂y/∂xi = 0 (i = 1,…, n)
49
The Envelope Theorem
Optimal values for these x's would be
determined as functions of a:

x1* = x1*(a),
x2* = x2*(a),
…,
xn* = xn*(a).
50
The Envelope Theorem
Substituting into the original objective function
yields an expression for the optimal value of y
(y*)
y* = f[x1*(a), x2*(a),…, xn*(a), a]
Differentiating yields

dy*/da = (∂f/∂x1)(dx1/da) + (∂f/∂x2)(dx2/da) + … + (∂f/∂xn)(dxn/da) + ∂f/∂a
51
The Envelope Theorem
Because of the first-order conditions, all terms
except ∂f/∂a are equal to zero if the x's are at
their optimal values
Therefore,

dy*/da = ∂f/∂a {x = x*(a)}
52
The Envelope Theorem
Example:
y = -(x1 - 1)² - (x2 - 2)² + 10
x1* = 1, x2* = 2, y* = 10
Instead of 10, use the parameter a:
y = f(x1, x2, a) = -(x1 - 1)² - (x2 - 2)² + a
In this case, the optimal values of x1 and x2 do not depend
on a. So
y* = a
dy*/da = 1
53
Constrained Maximization
What if all values for the x's are not feasible?
the values of x may all have to be positive
a consumer's choices are limited by the amount
of purchasing power available
One method used to solve constrained
maximization problems is the Lagrangian
multiplier method
54
Lagrangian Multiplier Method
Suppose that we wish to find the values of
x1, x2,…, xn that maximize
y = f(x1, x2,…, xn)
subject to a constraint that permits only
certain values of the x's to be used
g(x1, x2,…, xn) = 0
55
Lagrangian Multiplier Method
The Lagrangian multiplier method starts
with setting up the expression
L = f(x1, x2,…, xn) + λg(x1, x2,…, xn)
where λ is an additional variable called a
Lagrangian multiplier
When the constraint holds, L = f because
g(x1, x2,…, xn) = 0
56
Lagrangian Multiplier Method
First-Order Conditions
∂L/∂x1 = f1 + λg1 = 0
∂L/∂x2 = f2 + λg2 = 0
…
∂L/∂xn = fn + λgn = 0
∂L/∂λ = g(x1, x2,…, xn) = 0
57
Lagrangian Multiplier Method
The first-order conditions can generally be
solved for x1, x2,…, xn and λ
The solution will have two properties:
the x's will obey the constraint
these x's will make the value of L (and therefore
f ) as large as possible
58
Lagrangian Multiplier Method
The Lagrangian multiplier (λ) has an important
economic interpretation
The first-order conditions imply that
f1/(-g1) = f2/(-g2) = … = fn/(-gn) = λ
the numerators above measure the marginal benefit that
one more unit of xi will have for the function f
the denominators reflect the added burden on the
constraint of using more xi
59
Lagrangian Multiplier Method
At the optimal choices for the x's, the ratio of
the marginal benefit of increasing xi to the
marginal cost of increasing xi should be the
same for every x
λ is the common cost-benefit ratio for all of
the x's
λ = (marginal benefit of xi)/(marginal cost of xi)
60
Lagrangian Multiplier Method
If the constraint were relaxed slightly, it would not
matter which x is changed
The Lagrangian multiplier provides a measure of
how the relaxation in the constraint will affect the
value of y
λ provides a shadow price for the constraint
61
Lagrangian Multiplier Method
A high value of λ indicates that y could be increased
substantially by relaxing the constraint
each x has a high cost-benefit ratio
A low value of λ indicates that there is not much to
be gained by relaxing the constraint
λ = 0 implies that the constraint is not binding
62
Duality
Any constrained maximization problem has
associated with it a dual problem in
constrained minimization that focuses
attention on the constraints in the original
problem
63
Duality
Individuals maximize utility subject to a
budget constraint
dual problem: individuals minimize the
expenditure needed to achieve a given level of
utility
Firms minimize the cost of inputs to produce
a given level of output
dual problem: firms maximize output for a given
cost of inputs purchased
64
Constrained Maximization
Suppose a farmer had a certain length of fence
(P) and wished to enclose the largest possible
rectangular shape
Let x be the length of one side
Let y be the length of the other side
Problem: choose x and y so as to maximize the
area (A = xy) subject to the constraint that the
perimeter is fixed at P = 2x + 2y
65
Constrained Maximization
Setting up the Lagrangian multiplier
L = xy + λ(P - 2x - 2y)
The first-order conditions for a maximum are
∂L/∂x = y - 2λ = 0
∂L/∂y = x - 2λ = 0
∂L/∂λ = P - 2x - 2y = 0
66
Constrained Maximization
Since y/2 = x/2 = λ, x must be equal to y
the field should be square
x and y should be chosen so that the ratio of
marginal benefits to marginal costs is the
same
Since x = y and y = 2λ, we can use the
constraint to show that
x = y = P/4
λ = P/8
67
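The fencing problem and the interpretation of λ can be checked numerically; a sketch with an illustrative perimeter P = 400:

```python
# Farmer's fencing problem: maximize A = x*y subject to P = 2x + 2y.
# The first-order conditions give x = y = P/4 and lambda = P/8.

P = 400.0                        # illustrative perimeter
x_star = y_star = P / 4          # 100
lam = P / 8                      # shadow price of an extra unit of fence
A_star = x_star * y_star         # 10000

# Perturb along the constraint: lengthening x by dx forces y down by dx.
for dx in (1.0, -1.0, 5.0):
    assert (x_star + dx) * (y_star - dx) < A_star

# lambda approximates dA*/dP: the area with perimeter P + 1 is ((P+1)/4)^2.
dA = ((P + 1) / 4) ** 2 - A_star
assert abs(dA - lam) < 0.1
print(x_star, lam, A_star)  # 100.0 50.0 10000.0
```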
Constrained Maximization
Interpretation of the Lagrangian multiplier
if the farmer were interested in knowing how much
more field could be fenced by adding an extra yard
of fence, λ suggests that he could find out by
dividing the present perimeter (P) by 8
thus, the Lagrangian multiplier provides
information about the implicit value of the
constraint
68
Constrained Maximization
Dual problem: choose x and y to minimize the
amount of fence required to surround the field
minimize P = 2x + 2y subject to A = xy
Setting up the Lagrangian:
LD = 2x + 2y + λD(A - xy)
69
Constrained Maximization
First-order conditions:
∂LD/∂x = 2 - λD·y = 0
∂LD/∂y = 2 - λD·x = 0
∂LD/∂λD = A - xy = 0
Solving, we get
x = y = A^(1/2)
The Lagrangian multiplier λD = 2A^(-1/2)
70
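The dual problem can be checked the same way; a sketch with an illustrative area A = 10,000:

```python
# Dual fencing problem: minimize P = 2x + 2y subject to A = x*y.
# The first-order conditions give x = y = A^(1/2) and lambda_D = 2*A^(-1/2).

A = 10000.0                       # illustrative required area
x_star = y_star = A ** 0.5        # 100
lam_d = 2 * A ** -0.5             # shadow price of requiring one more unit of area
P_star = 2 * x_star + 2 * y_star  # 400

# Other rectangles with the same area need more fence than the square:
for x in (50.0, 80.0, 125.0):
    y = A / x
    assert 2 * x + 2 * y > P_star

# lambda_D approximates dP*/dA:
dP = 4 * (A + 1) ** 0.5 - P_star
assert abs(dP - lam_d) < 1e-4
print(x_star, P_star)
```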
Envelope Theorem & Constrained
Maximization
Suppose that we want to maximize
y = f(x1,…, xn; a)
subject to the constraint
g(x1,…, xn; a) = 0
One way to solve would be to set up the
Lagrangian expression and solve the first-order
conditions
Envelope Theorem & Constrained
Maximization
Alternatively, it can be shown that
dy*/da = ∂L/∂a(x1*,…, xn*; a)
The change in the maximal value of y that
results when a changes can be found by
partially differentiating L and evaluating the
partial derivative at the optimal point
72
Inequality Constraints
In some economic problems the constraints
need not hold exactly
For example, suppose we seek to maximize y
= f(x1, x2) subject to
g(x1, x2) ≥ 0,
x1 ≥ 0, and
x2 ≥ 0
73
Inequality Constraints
One way to solve this problem is to introduce
three new variables (a, b, and c) that convert
the inequalities into equalities
To ensure that the inequalities continue to
hold, we will square these new variables,
ensuring that their values are nonnegative
74
Inequality Constraints
g(x1, x2) - a² = 0;
x1 - b² = 0; and
x2 - c² = 0
Any solution that obeys these three equality
constraints will also obey the inequality
constraints
75
Inequality Constraints
We can set up the Lagrangian
L = f(x1, x2) + λ1[g(x1, x2) - a²] + λ2[x1 - b²] + λ3[x2 - c²]
This will lead to eight first-order conditions
76
Inequality Constraints
∂L/∂x1 = f1 + λ1g1 + λ2 = 0
∂L/∂x2 = f2 + λ1g2 + λ3 = 0
∂L/∂a = -2aλ1 = 0
∂L/∂b = -2bλ2 = 0
∂L/∂c = -2cλ3 = 0
∂L/∂λ1 = g(x1, x2) - a² = 0
∂L/∂λ2 = x1 - b² = 0
∂L/∂λ3 = x2 - c² = 0
77
Inequality Constraints
According to the third condition, either a = 0 or
λ1 = 0
if a = 0, the constraint g(x1, x2) ≥ 0 holds exactly
if λ1 = 0, the availability of some slackness in
the constraint implies that its value to the
objective function is 0
Similar complementary slackness
relationships also hold for x1 and x2
78
Inequality Constraints
These results are sometimes called Kuhn-Tucker conditions
they show that solutions to optimization
problems involving inequality constraints will
differ from similar problems involving equality
constraints in rather simple ways
we cannot go wrong by working primarily with
constraints involving equalities
79
Second Order Conditions: Functions of One Variable
Let y = f(x)
A necessary condition for a maximum is that
dy/dx = f'(x) = 0
To ensure that the point is a maximum, y must
be decreasing for movements away from it
80
Second Order Conditions: Functions of One Variable
The total differential measures the change in y
dy = f'(x) dx
To be at a maximum, dy must be decreasing
for small increases in x
To see the changes in dy, we must use the
second derivative of y
81
Second Order Conditions: Functions of One Variable
d²y = {d[f'(x)dx]/dx}·dx = f''(x)dx·dx = f''(x)dx²

Note that d²y < 0 implies that f''(x)dx² < 0
Since dx² must be positive, f''(x) < 0
This means that the function f must have a
concave shape at the critical point
82
Second Order Conditions: Functions of Two Variables
Suppose that y = f(x1, x2)
First order conditions for a maximum are
∂y/∂x1 = f1 = 0
∂y/∂x2 = f2 = 0
To ensure that the point is a maximum, y must
diminish for movements in any direction away
from the critical point
83
Second Order Conditions: Functions of Two Variables
The slope in the x1 direction (f1) must be
diminishing at the critical point
The slope in the x2 direction (f2) must be
diminishing at the critical point
But, conditions must also be placed on the
cross-partial derivative (f12 = f21) to ensure that dy is
decreasing for all movements through the critical
point
84
Second Order Conditions: Functions of Two Variables
The total differential of y is given by
dy = f1 dx1 + f2 dx2
The differential of that function is
d²y = (f11dx1 + f12dx2)dx1 + (f21dx1 + f22dx2)dx2
d²y = f11dx1² + f12dx2dx1 + f21dx1dx2 + f22dx2²
By Young's theorem, f12 = f21 and
d²y = f11dx1² + 2f12dx1dx2 + f22dx2²
85
Second Order Conditions: Functions of Two Variables
d²y = f11dx1² + 2f12dx1dx2 + f22dx2²
For this equation to be unambiguously negative
for any change in the x's, f11 and f22 must be
negative
If dx2 = 0, then d²y = f11dx1²
for d²y < 0, f11 < 0
If dx1 = 0, then d²y = f22dx2²
for d²y < 0, f22 < 0
86
Second Order Conditions: Functions of Two Variables
d²y = f11dx1² + 2f12dx1dx2 + f22dx2²
If neither dx1 nor dx2 is zero, then d²y will be
unambiguously negative only if
f11f22 - f12² > 0
the second partial derivatives (f11 and f22) must be
sufficiently negative so that they outweigh any
possible perverse effects from the cross-partial
derivatives (f12 = f21)
87
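For the earlier example y = -(x1 - 1)² - (x2 - 2)² + 10, these conditions are easy to verify; a minimal sketch:

```python
# Second-order check for y = -(x1 - 1)^2 - (x2 - 2)^2 + 10:
# the second-order partials are constant, f11 = f22 = -2 and f12 = 0.

f11, f22, f12 = -2.0, -2.0, 0.0

assert f11 < 0 and f22 < 0           # own second partials negative
assert f11 * f22 - f12 ** 2 > 0      # f11*f22 - f12^2 = 4 > 0
print("second-order conditions for a maximum are satisfied")
```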
Constrained Maximization
Suppose we want to choose x1 and x2 to
maximize
y = f(x1, x2)
subject to the linear constraint
c - b1x1 - b2x2 = 0
We can set up the Lagrangian
L = f(x1, x2) + λ(c - b1x1 - b2x2)
88
Constrained Maximization
The first-order conditions are
f1 - λb1 = 0
f2 - λb2 = 0
c - b1x1 - b2x2 = 0
To ensure we have a maximum, we must
use the second total differential
d²y = f11dx1² + 2f12dx1dx2 + f22dx2²
89
Constrained Maximization
Only the values of x1 and x2 that satisfy the
constraint can be considered valid alternatives
to the critical point
Thus, we must calculate the total differential of
the constraint
-b1 dx1 - b2 dx2 = 0
dx2 = -(b1/b2)dx1
These are the allowable relative changes in x1
and x2
90
Constrained Maximization
Because the first-order conditions imply that
f1/f2 = b1/b2, we can substitute and get
dx2 = -(f1/f2) dx1
Since
d²y = f11dx1² + 2f12dx1dx2 + f22dx2²
we can substitute for dx2 and get
d²y = f11dx1² - 2f12(f1/f2)dx1² + f22(f1²/f2²)dx1²
91
Constrained Maximization
Combining terms and rearranging,
d²y = [f11f2² - 2f12f1f2 + f22f1²]·(dx1²/f2²)
Therefore, for d²y < 0, it must be true that
f11f2² - 2f12f1f2 + f22f1² < 0
This equation characterizes a set of functions
termed quasi-concave functions
any two points within the set can be joined by a
line contained completely in the set
92
Concave and Quasi-Concave
Functions
The differences between concave and quasi-concave
functions can be illustrated with the function
y = f(x1, x2) = (x1x2)^k
where the xs take on only positive values and
k can take on a variety of positive values
93
Concave and Quasi-Concave
Functions
No matter what value k takes, this function is
quasi-concave
Whether or not the function is concave
depends on the value of k
if k < 0.5, the function is concave
if k > 0.5, the function is convex
94
Homogeneous Functions
A function f(x1, x2,…, xn) is said to be
homogeneous of degree k if
f(tx1, tx2,…, txn) = t^k·f(x1, x2,…, xn)
when a function is homogeneous of degree one, a
doubling of all of its arguments doubles the value
of the function itself
when a function is homogeneous of degree zero,
a doubling of all of its arguments leaves the value
of the function unchanged
95
Homogeneous Functions
If a function is homogeneous of degree k, the
partial derivatives of the function will be
homogeneous of degree k-1
96
Euler's Theorem
If we differentiate the definition for
homogeneity with respect to the
proportionality factor t, we get
k·t^(k-1)·f(x1,…, xn) = x1·f1(tx1,…, txn) + … + xn·fn(tx1,…, txn)
This relationship is called Euler's theorem
97
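Euler's theorem (at t = 1) and homogeneity itself can be checked for an illustrative function f(x1, x2) = x1^a · x2^b, which is homogeneous of degree k = a + b:

```python
# Euler's theorem at t = 1: k * f(x1, x2) = x1*f1 + x2*f2 for a function
# homogeneous of degree k. The exponents a, b and the point (x1, x2)
# are illustrative choices, not from the text.

a, b = 0.3, 0.7
x1, x2 = 2.0, 5.0
k = a + b                              # degree of homogeneity

def f(x1, x2):
    return x1 ** a * x2 ** b

f1 = a * x1 ** (a - 1) * x2 ** b       # partial with respect to x1
f2 = b * x1 ** a * x2 ** (b - 1)       # partial with respect to x2

assert abs(k * f(x1, x2) - (x1 * f1 + x2 * f2)) < 1e-9   # Euler's theorem
t = 3.0
assert abs(f(t * x1, t * x2) - t ** k * f(x1, x2)) < 1e-9  # homogeneity
print("Euler's theorem holds")
```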
Euler's Theorem
Euler's theorem shows that, for homogeneous
functions, there is a definite relationship
between the values of the function and the
values of its partial derivatives
98
Homothetic Functions
A homothetic function is one that is formed by
taking a monotonic transformation of a
homogeneous function
they do not possess the homogeneity properties of
their underlying functions
99
Homothetic Functions
For both homogeneous and homothetic
functions, the implicit trade-offs among the
variables in the function depend only on the
ratios of those variables, not on their absolute
values
100
Homothetic Functions
Suppose we are examining the simple, two-variable
implicit function f(x, y) = 0
The implicit trade-off between x and y for a
two-variable function is
dy/dx = -fx/fy
If we assume f is homogeneous of degree k,
its partial derivatives will be homogeneous of
degree k-1
101
Homothetic Functions
The implicit trade-off between x and y is

dy/dx = -[t^(k-1)·fx(tx, ty)]/[t^(k-1)·fy(tx, ty)] = -fx(tx, ty)/fy(tx, ty)

If t = 1/y,

dy/dx = -[F'·fx(x/y, 1)]/[F'·fy(x/y, 1)] = -fx(x/y, 1)/fy(x/y, 1)
102
Homothetic Functions
The trade-off is unaffected by the monotonic
transformation and remains a function only of
the ratio x to y
103
Important Points to Note:
Using mathematics provides a convenient,
short-hand way for economists to develop
their models
implications of various economic assumptions
can be studied in a simplified setting through
the use of such mathematical tools
104
Important Points to Note:
Derivatives are often used in economics
because economists are interested in how
marginal changes in one variable affect
another
partial derivatives incorporate the ceteris
paribus assumption used in most economic
models
105
Important Points to Note:
The mathematics of optimization is an
important tool for the development of
models that assume that economic agents
rationally pursue some goal
the first-order condition for a maximum
requires that all partial derivatives equal zero
106
Important Points to Note:
Most economic optimization problems
involve constraints on the choices that
agents can make
the first-order conditions for a maximum
suggest that each activity be operated at a
level at which the ratio of the marginal
benefit of the activity to its marginal cost
is the same for all activities
107
Important Points to Note:
The Lagrangian multiplier is used to help
solve constrained maximization problems
the Lagrangian multiplier can be interpreted as
the implicit value (shadow price) of the
constraint
108
Important Points to Note:
The implicit function theorem illustrates
the dependence of the choices that result
from an optimization problem on the
parameters of that problem
109
Important Points to Note:
The envelope theorem examines how
optimal choices will change as the
problem's parameters change
Some optimization problems may
involve constraints that are inequalities
rather than equalities
110
Important Points to Note:
First-order conditions are necessary but
not sufficient for ensuring a maximum or
minimum
second-order conditions that describe the
curvature of the function must be checked
111
Important Points to Note:
Certain types of functions occur in many
economic problems
quasi-concave functions obey the second-order
conditions of constrained maximum
or minimum problems when the constraints
are linear
homothetic functions have the property that
implicit trade-offs among the variables
depend only on the ratios of these variables
112