Optimizers - Building a Parameterized Model Notes

The document provides an overview of optimizers, explaining their role in finding minimum values of functions and building parameterized models based on data. It includes Python examples for implementing optimizers to minimize error in fitting lines and polynomials to data. Additionally, it discusses the importance of convex problems and error metrics in model fitting.


Lesson 01-08 - Optimizers - Building a Parameterized Model

● What is an optimizer?
○ Find minimum values of functions
○ Build parameterized models based on data
○ Refine allocations to stocks in portfolios
● How to use an optimizer?
○ Provide a function to minimize
○ Provide an initial guess
○ Call the optimizer
● Python example
○ Import pandas, matplotlib, and numpy as normal
○ Import scipy.optimize as spo
○ def f(x):
■ y = (x - 1.5) ** 2 + 0.5
■ print("x = {}, y = {}".format(x, y))
■ return y
○ def test_run():
■ x_guess = 2.0
■ min_result = spo.minimize(f, x_guess, method='SLSQP', options={'disp': True})
■ print("Minima found at:")
■ print("x = {}, y = {}".format(min_result.x, min_result.fun))
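The bullets above assemble into a short runnable script (a sketch; the print inside f is omitted here to keep the optimizer's trace short):

```python
import scipy.optimize as spo

def f(x):
    """Parabola with its minimum at x = 1.5, y = 0.5."""
    return (x - 1.5) ** 2 + 0.5

x_guess = 2.0
min_result = spo.minimize(f, x_guess, method='SLSQP', options={'disp': True})
print("Minima found at:")
print("x = {}, y = {}".format(min_result.x, min_result.fun))
```

With the initial guess of 2.0, the optimizer steps downhill until it reports a minimum near x = 1.5.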
● Convex problems
○ A real-valued function f(x) defined on an interval is called convex if the line
segment between any two points on the graph of the function lies above the
graph
○ Steps
■ Choose two points and draw a line between them
■ Convex if the line is above the graph when connecting the two points
■ Must also have exactly one local minimum, which is also the global minimum
■ Cannot have any flat regions
○ Minima are easier to find for convex problems than for non-convex problems
○ Can also find the minima of multidimensional graphs
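The same spo.minimize call extends to multiple dimensions: pass a vector guess and a function of a vector. A minimal sketch, using a made-up convex bowl in two dimensions (the function and its minimum location are assumptions for illustration, not from the lesson):

```python
import numpy as np
import scipy.optimize as spo

def f(x):
    # Hypothetical convex bowl with its minimum at (1.5, -0.5)
    return (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2 + 0.5

guess = np.float32([2.0, 2.0])
result = spo.minimize(f, guess, method='SLSQP')
print(result.x)  # approaches [1.5, -0.5]
```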
● Building a parameterized model
○ If you are trying to find a line of best fit, you want to minimize the error (meaning
the distance from each of the points to the line of best fit)
○ You have to find the correct values for m and b in the y=mx+b formula in order to
do this
○ A good error metric would be one of the following
■ Σ |eᵢ| (sum of absolute errors)
■ Σ eᵢ² (sum of squared errors)
■ Simply summing the raw errors doesn't work, as some of them may be
negative and cancel out; taking the absolute value or the square solves that
problem.
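A quick numeric illustration of why the raw sum fails, using made-up residuals:

```python
import numpy as np

errors = np.array([2.0, -3.0, 1.0])  # hypothetical residuals e_i

naive = np.sum(errors)             # negatives cancel: 0.0, falsely suggesting a perfect fit
abs_err = np.sum(np.abs(errors))   # sum of absolute errors: 6.0
sq_err = np.sum(errors ** 2)       # sum of squared errors: 14.0
```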
● Python code:
○ Import pandas, matplotlib, numpy, and scipy.optimize as normal
○ def error(line, data): -> error function
■ Subtract the line model's estimate at each point from the actual data value
at that point, then square it (metric: sum of squared y-axis differences)
● err = np.sum((data[:, 1] - (line[0] * data[:, 0] + line[1])) ** 2)
● return err (returns the error as a single real value)
○ def fit_line(data, error_func):
■ Generate initial guess for line model
● l = np.float32([0, np.mean(data[:, 1])])
○ Slope = 0, intercept = mean of the y values
■ Plot initial guess (optional)
● x_ends = np.float32([-5, 5])
● plt.plot(x_ends, l[0] * x_ends + l[1], 'm--', linewidth=2.0, label="Initial Guess")
■ Call optimizer to minimize error function
● result = spo.minimize(error_func, l, args=(data,), method='SLSQP', options={'disp': True})
● return result.x
○ def test_run():
■ Define original line
● l_orig = np.float32([4, 2])
● print("Original Line: C0 = {}, C1 = {}".format(l_orig[0], l_orig[1]))
● x_orig = np.linspace(0, 10, 21)
● y_orig = l_orig[0] * x_orig + l_orig[1]
● plt.plot(x_orig, y_orig, 'b-', linewidth=2.0, label="Original Line")
■ Generate noisy data points
● noise_sigma = 3.0
● noise = np.random.normal(0, noise_sigma, y_orig.shape)
● data = np.asarray([x_orig, y_orig + noise]).T
● plt.plot(data[:, 0], data[:, 1], 'go', label="Data Points")
■ Try to find a line that fits this data
● l_fit = fit_line(data, error)
● print("Fitted Line: C0 = {}, C1 = {}".format(l_fit[0], l_fit[1]))
● plt.plot(data[:, 0], l_fit[0] * data[:, 0] + l_fit[1], 'r--', linewidth=2.0, label="Fitted Line")
■ Add a legend and show the plot
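Put together without the plotting calls, the line-fitting steps above run as a single script (a sketch; the seed is added here for reproducibility and is not in the notes):

```python
import numpy as np
import scipy.optimize as spo

def error(line, data):
    """Sum of squared y-axis differences between the data and a line model."""
    return np.sum((data[:, 1] - (line[0] * data[:, 0] + line[1])) ** 2)

def fit_line(data, error_func):
    # Initial guess: slope = 0, intercept = mean of the y values
    l = np.float32([0, np.mean(data[:, 1])])
    result = spo.minimize(error_func, l, args=(data,), method='SLSQP')
    return result.x

np.random.seed(0)  # for reproducibility (assumption, not in the notes)

# Generate noisy samples around a known line y = 4x + 2
l_orig = np.float32([4, 2])
x_orig = np.linspace(0, 10, 21)
y_orig = l_orig[0] * x_orig + l_orig[1]
noise = np.random.normal(0, 3.0, y_orig.shape)
data = np.asarray([x_orig, y_orig + noise]).T

l_fit = fit_line(data, error)
print("Fitted Line: C0 = {}, C1 = {}".format(l_fit[0], l_fit[1]))
```

The recovered slope and intercept land close to the original (4, 2), off only by the noise.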
● Python code for polynomials
○ Import pandas, matplotlib, numpy, and scipy.optimize as normal
○ def error_poly(C, data): -> error function
■ Subtract the polynomial model's estimate at each point from the actual data
value at that point, then square it (metric: sum of squared y-axis differences)
● err = np.sum((data[:, 1] - np.polyval(C, data[:, 0])) ** 2)
● return err (returns the error as a single real value)
○ def fit_poly(data, error_func, degree=3):
■ Generate initial guess for polynomial model (all coefficients = 1)
● Cguess = np.poly1d(np.ones(degree + 1, dtype=np.float32))
■ Plot initial guess (optional)
● x = np.linspace(-5, 5, 21)
● plt.plot(x, np.polyval(Cguess, x), 'm--', linewidth=2.0, label="Initial Guess")
■ Call optimizer to minimize error function
● result = spo.minimize(error_func, Cguess, args=(data,), method='SLSQP', options={'disp': True})
● return np.poly1d(result.x)
○ Everything else is the same as you would do for a fitted line
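The polynomial version runs the same way; a sketch without the plotting calls, fitting noisy samples of a made-up cubic (the coefficients, seed, and x-range here are assumptions for illustration):

```python
import numpy as np
import scipy.optimize as spo

def error_poly(C, data):
    """Sum of squared y-axis differences between the data and a polynomial model."""
    return np.sum((data[:, 1] - np.polyval(C, data[:, 0])) ** 2)

def fit_poly(data, error_func, degree=3):
    # Initial guess: all coefficients = 1
    Cguess = np.poly1d(np.ones(degree + 1, dtype=np.float32))
    result = spo.minimize(error_func, Cguess, args=(data,), method='SLSQP')
    return np.poly1d(result.x)

np.random.seed(0)  # for reproducibility (assumption, not in the notes)

# Noisy samples of a hypothetical cubic
C_orig = np.float32([1.5, -10, -5, 60])
x = np.linspace(-5, 5, 21)
y = np.polyval(C_orig, x) + np.random.normal(0, 3.0, x.shape)
data = np.asarray([x, y]).T

C_fit = fit_poly(data, error_poly, degree=3)
print(C_fit.coeffs)
```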
● You can use this
○ On functions besides polynomials
○ To model stock prices
○ To optimize a portfolio
