Optimize piecewise linear function

Eugine Kang
Aug 26, 2017

Non-linear least squares

Ordinary least squares (OLS) is used to estimate the parameters of a linear model. However, the underlying relationship will not always be linear; it can be more complex.

The goal of non-linear least squares is the same as in OLS: minimize the sum of squared errors. By setting the partial derivative of the objective S with respect to each parameter β equal to zero, we find a local minimum of S.
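For reference, the objective S mentioned above and the condition on its partial derivatives can be written as

$$
S(\beta) = \sum_{i=1}^{n} \left[ y_i - f(x_i, \beta) \right]^2,
\qquad
\frac{\partial S}{\partial \beta_j}
  = -2 \sum_{i=1}^{n} \left[ y_i - f(x_i, \beta) \right]
      \frac{\partial f(x_i, \beta)}{\partial \beta_j} = 0,
$$

where f(x, β) is the non-linear model and β is the vector of parameters.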

Advantages of non-linear least squares

  1. Fits a broad range of functions
  2. Gives good estimates from relatively small amounts of data
  3. Well-developed theory for computing confidence and prediction intervals

Disadvantages

  1. Parameters are estimated with iterative optimization procedures
  2. Starting values must be provided by the user (see the sketch after this list)
  3. Bad starting values can lead to convergence to a local minimum instead of the global minimum
  4. Strong sensitivity to outliers
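To make points 1–3 concrete, here is a minimal sketch using scipy.optimize.curve_fit; the exponential-decay model, the generated data, and the starting values are illustrative assumptions, not taken from the article.

```python
import numpy as np
from scipy.optimize import curve_fit

# hypothetical non-linear model: exponential decay with an offset
def model(x, a, b, c):
    return a * np.exp(-b * x) + c

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 50)
y = model(x, 2.5, 1.3, 0.5) + rng.normal(0, 0.1, x.size)

# the fit is iterative and needs user-provided starting values (p0);
# poor starting values can leave the optimizer in a local minimum
p0 = [1.0, 1.0, 1.0]
params, cov = curve_fit(model, x, y, p0=p0)
print(params)  # estimates of a, b, c
```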

How would you optimize a piecewise linear function in Python?

The difficult part is writing a function that constrains each piece to be continuous. In the approach used here, the y-intercept of the third piece is not a separate parameter: the second and third pieces share a y-intercept, which allows continuity between those pieces. However, full continuity is not reached.
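The original snippet is not reproduced here, so what follows is a minimal sketch of the kind of parameterization the paragraph describes, assuming the two breakpoints are known and fixed: the second and third pieces share the intercept b2, so the third piece's y-intercept is not a separate parameter. The breakpoints, data, and starting values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# assumed (hypothetical) breakpoints, treated here as known constants
X0, X1 = 3.0, 7.0

def piecewise_linear(x, b1, b2, k1, k2, k3):
    """Three linear pieces split at the fixed breakpoints X0 and X1.

    The second and third pieces share the intercept b2, so the third
    piece's y-intercept is not a separate parameter. As noted above,
    this ties the pieces together but does not give full continuity.
    """
    conds = [x < X0, (x >= X0) & (x < X1), x >= X1]
    funcs = [lambda x: k1 * x + b1,
             lambda x: k2 * x + b2,
             lambda x: k3 * x + b2]   # shared intercept b2
    return np.piecewise(x, conds, funcs)

# hypothetical data: a continuous piecewise-linear truth plus noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
truth = np.where(x < 3, 2 * x, np.where(x < 7, 6 + 0.5 * (x - 3), 8 - (x - 7)))
y = truth + rng.normal(0, 0.3, x.size)

p0 = [0.0, 5.0, 1.0, 1.0, 1.0]        # user-provided starting values
params, _ = curve_fit(piecewise_linear, x, y, p0=p0)
print(params)                          # b1, b2, k1, k2, k3
```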

Ordinary Least Squares Iteration

Another approach to achieving full continuity is to use OLS with constraints on the y-intercept. The method starts with one interval and fits a linear model to it, then fits linear models to the adjacent intervals that are constrained to be continuous with it. At the end we have n different models for the n intervals, and we pick the best fit as the one with the smallest sum of squared errors.
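As a rough sketch of this idea, assuming the breakpoints are given and the fit proceeds from left to right with each segment pinned to the previous segment's endpoint (the function name and data below are illustrative):

```python
import numpy as np

def fit_segments_continuous(x, y, breakpoints):
    """Fit one line per interval; each segment after the first is an
    OLS fit constrained to pass through the previous segment's endpoint.

    Returns the list of (slope, intercept) pairs and the total SSE.
    """
    edges = np.concatenate(([x.min()], np.asarray(breakpoints, float), [x.max()]))
    models, sse = [], 0.0
    anchor = None                      # point the next segment must pass through
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        xs, ys = x[mask], y[mask]
        if anchor is None:
            slope, intercept = np.polyfit(xs, ys, 1)   # first interval: plain OLS
        else:
            ax, ay = anchor
            # OLS slope with the intercept fixed by the anchor point
            slope = np.sum((xs - ax) * (ys - ay)) / np.sum((xs - ax) ** 2)
            intercept = ay - slope * ax
        models.append((slope, intercept))
        sse += np.sum((ys - (slope * xs + intercept)) ** 2)
        anchor = (hi, slope * hi + intercept)          # endpoint for the next segment
    return models, sse

# illustrative usage with hypothetical breakpoints
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x < 3, 2 * x, np.where(x < 7, 6 + 0.5 * (x - 3), 8 - (x - 7))) \
    + rng.normal(0, 0.3, x.size)
models, sse = fit_segments_continuous(x, y, breakpoints=[3.0, 7.0])
print(models, sse)
```

One reading of the n-models comparison described above is to repeat this pass starting from different intervals and keep the fit with the smallest total sum of squared errors.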

(to be continued)
