Polynomial regression

From Wikipedia, the free encyclopedia
[Figure: A cubic polynomial regression fit to a simulated data set. The confidence band is a 95% simultaneous confidence band constructed using the Scheffé approach.]

In statistics, polynomial regression is a form of regression analysis in which the relationship between the independent variable x and the dependent variable y is modeled as a polynomial in x. Polynomial regression fits a nonlinear relationship between the value of x and the corresponding conditional mean of y, denoted E(y | x). Although polynomial regression fits a nonlinear model to the data, as a statistical estimation problem it is linear, in the sense that the regression function E(y | x) is linear in the unknown parameters that are estimated from the data. Thus, polynomial regression is a special case of multiple linear regression.

The explanatory (independent) variables resulting from the polynomial expansion of the "baseline" variables are known as higher-degree terms. Such variables are also used in classification settings.[1]

History

Polynomial regression models are usually fit using the method of least squares. The least-squares method minimizes the variance of the unbiased estimators of the coefficients, under the conditions of the Gauss–Markov theorem. The least-squares method was published in 1805 by Legendre and in 1809 by Gauss. The first design of an experiment for polynomial regression appeared in an 1815 paper of Gergonne.[2][3] In the twentieth century, polynomial regression played an important role in the development of regression analysis, with a greater emphasis on issues of design and inference.[4] More recently, the use of polynomial models has been complemented by other methods, with non-polynomial models having advantages for some classes of problems.[citation needed]

Definition and example

The goal of regression analysis is to model the expected value of a dependent variable y in terms of the value of an independent variable (or vector of independent variables) x. In simple linear regression, the model

$$y = \beta_0 + \beta_1 x + \varepsilon,$$

is used, where ε is an unobserved random error with mean zero conditioned on a scalar variable x. In this model, for each unit increase in the value of x, the conditional expectation of y increases by β1 units.

In many settings, such a linear relationship may not hold. For example, if we are modeling the yield of a chemical synthesis in terms of the temperature at which the synthesis takes place, we may find that the yield improves by a different amount for each unit increase in temperature. Or we may find that the yield decreases with increasing temperature (but only over a certain range of temperatures) and increases with increasing temperature over a different range. In this case, we might propose a quadratic model of the form

$$y = \beta_0 + \beta_1 x + \beta_2 x^2 + \varepsilon.$$

In this model, when the temperature is increased from x to x + 1 units, the expected yield changes by $\beta_1 + \beta_2(2x+1)$. (This follows by differencing the regression formula at x + 1 and x, as shown below.) For infinitesimal changes in x, the effect on y is given by the derivative of the regression function with respect to x: $\beta_1 + 2\beta_2 x$. The fact that the change in yield depends on x is what makes the relationship between x and y nonlinear even though the model is linear in the parameters to be estimated.
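
To see where the unit-change expression comes from, difference the quadratic conditional mean at x + 1 and x:

$$\operatorname{E}(y \mid x+1) - \operatorname{E}(y \mid x) = \beta_1\bigl[(x+1) - x\bigr] + \beta_2\bigl[(x+1)^2 - x^2\bigr] = \beta_1 + \beta_2(2x+1).$$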

In general, we can model the expected value of y as an nth-degree polynomial, yielding the general polynomial regression model

$$y = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 x^3 + \cdots + \beta_n x^n + \varepsilon.$$

Conveniently, these models are all linear from the point of view of estimation, since the regression function is linear in terms of the unknown parameters β0, β1, .... Therefore, for least squares analysis, the computational and inferential problems of polynomial regression can be completely addressed using the techniques of multiple regression. This is done by treating x, x2, ... as being distinct independent variables in a multiple regression model.
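
As a concrete illustration, the following minimal sketch (assuming NumPy is available; the data are simulated for illustration only and are not from the source) fits a quadratic by treating x and x² as two distinct independent variables in an ordinary least-squares multiple regression:

```python
import numpy as np

# Simulated data from a quadratic relationship plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x - 0.3 * x**2 + rng.normal(scale=1.0, size=x.size)

# Design matrix with columns 1, x, x^2: each power of x is simply
# another regressor from the point of view of estimation.
X = np.column_stack([np.ones_like(x), x, x**2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # estimates of (beta_0, beta_1, beta_2)
```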

Matrix form and calculation of estimates

The polynomial regression model

$$y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \cdots + \beta_m x_i^m + \varepsilon_i \quad (i = 1, 2, \dots, n)$$

can be expressed in matrix form in terms of a design matrix $\mathbf{X}$, a response vector $\vec{y}$, a parameter vector $\vec{\beta}$, and a vector $\vec{\varepsilon}$ of random errors. The i-th row of $\mathbf{X}$ and $\vec{y}$ will contain the x and y value for the i-th data sample. Then the model can be written as a system of linear equations:

$$\begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} 1 & x_1 & x_1^2 & \dots & x_1^m \\ 1 & x_2 & x_2^2 & \dots & x_2^m \\ 1 & x_3 & x_3^2 & \dots & x_3^m \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \dots & x_n^m \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \vdots \\ \beta_m \end{bmatrix} + \begin{bmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \varepsilon_3 \\ \vdots \\ \varepsilon_n \end{bmatrix},$$

which, in pure matrix notation, is written as

$$\vec{y} = \mathbf{X}\vec{\beta} + \vec{\varepsilon}.$$

The vector of estimated polynomial regression coefficients (using ordinary least squares estimation) is

$$\widehat{\vec{\beta}} = (\mathbf{X}^{\mathsf{T}}\mathbf{X})^{-1}\,\mathbf{X}^{\mathsf{T}}\vec{y},$$

assuming m < n, which is required for the matrix to be invertible; then, since $\mathbf{X}$ is a Vandermonde matrix, the invertibility condition is guaranteed to hold if all the $x_i$ values are distinct. This is the unique least-squares solution.
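
A compact numerical sketch of this closed form, assuming NumPy (the helper name polyfit_ols and the sample data are illustrative, not from the source). np.linalg.solve is used on the normal equations rather than forming the inverse explicitly, for numerical stability:

```python
import numpy as np

def polyfit_ols(x, y, m):
    # Vandermonde design matrix with columns 1, x, ..., x^m.
    X = np.vander(x, m + 1, increasing=True)
    # Solve the normal equations (X^T X) beta = X^T y.
    return np.linalg.solve(X.T @ X, X.T @ y)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 5.2, 10.0, 17.1])
print(polyfit_ols(x, y, 2))  # quadratic fit: m = 2 < n = 5
```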

Expanded formulas

The above matrix equations describe the behavior of polynomial regression well. However, to implement polynomial regression in practice for a set of (x, y) point pairs, more detail is useful. The matrix equations below for the polynomial coefficients, expanded from regression theory without derivation, are easily implemented.[5][6][7]

$$\begin{bmatrix} \sum_{i=1}^{n} x_i^0 & \sum_{i=1}^{n} x_i^1 & \sum_{i=1}^{n} x_i^2 & \cdots & \sum_{i=1}^{n} x_i^m \\ \sum_{i=1}^{n} x_i^1 & \sum_{i=1}^{n} x_i^2 & \sum_{i=1}^{n} x_i^3 & \cdots & \sum_{i=1}^{n} x_i^{m+1} \\ \sum_{i=1}^{n} x_i^2 & \sum_{i=1}^{n} x_i^3 & \sum_{i=1}^{n} x_i^4 & \cdots & \sum_{i=1}^{n} x_i^{m+2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum_{i=1}^{n} x_i^m & \sum_{i=1}^{n} x_i^{m+1} & \sum_{i=1}^{n} x_i^{m+2} & \dots & \sum_{i=1}^{n} x_i^{2m} \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \\ \vdots \\ \beta_m \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n} y_i x_i^0 \\ \sum_{i=1}^{n} y_i x_i^1 \\ \sum_{i=1}^{n} y_i x_i^2 \\ \vdots \\ \sum_{i=1}^{n} y_i x_i^m \end{bmatrix}$$

After solving the above system of linear equations for $\beta_0$ through $\beta_m$, the regression polynomial may be constructed as follows:

$$\widehat{y} = \beta_0 x^0 + \beta_1 x^1 + \beta_2 x^2 + \cdots + \beta_m x^m$$

where:

  • $n$ = number of $(x_i, y_i)$ variable pairs in the data
  • $m$ = order of the polynomial to be used for regression
  • $\beta_0, \ldots, \beta_m$ = polynomial coefficient for each corresponding power of $x$
  • $\widehat{y}$ = estimated value of $y$ based on the polynomial regression calculations
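
The expanded equations translate almost line-for-line into code. A sketch assuming NumPy (the helper name poly_coefficients and the data are hypothetical): entry (j, k) of the left-hand matrix is the sum of x_i^(j+k), and entry j of the right-hand side is the sum of y_i · x_i^j, for j, k = 0, ..., m.

```python
import numpy as np

def poly_coefficients(x, y, m):
    powers = np.arange(m + 1)
    # Left-hand matrix of summed powers of x.
    A = np.array([[np.sum(x ** (p + q)) for q in powers] for p in powers])
    # Right-hand vector of summed y * x^p terms.
    b = np.array([np.sum(y * x ** p) for p in powers])
    return np.linalg.solve(A, b)  # beta_0, ..., beta_m

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.8, 9.3, 16.2, 24.9])
beta = poly_coefficients(x, y, 2)
y_hat = sum(b * x**p for p, b in enumerate(beta))  # fitted values
```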

Interpretation

Although polynomial regression is technically a special case of multiple linear regression, the interpretation of a fitted polynomial regression model requires a somewhat different perspective. It is often difficult to interpret the individual coefficients in a polynomial regression fit, since the underlying monomials can be highly correlated. For example, x and x2 have correlation around 0.97 when x is uniformly distributed on the interval (0, 1). Although the correlation can be reduced by using orthogonal polynomials, it is generally more informative to consider the fitted regression function as a whole. Point-wise or simultaneous confidence bands can then be used to provide a sense of the uncertainty in the estimate of the regression function.
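
A quick numerical check of that figure, assuming NumPy (the exact correlation for x uniform on (0, 1) works out to $\sqrt{15}/4 \approx 0.968$):

```python
import numpy as np

# Monte Carlo estimate of corr(x, x^2) for x ~ Uniform(0, 1).
x = np.random.default_rng(0).uniform(0.0, 1.0, 1_000_000)
print(np.corrcoef(x, x**2)[0, 1])  # ≈ 0.968
```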

Alternative approaches

Polynomial regression is one example of regression analysis using basis functions to model a functional relationship between two quantities. More specifically, it replaces $x \in \mathbb{R}^{d_x}$ in linear regression with a polynomial basis $\varphi(x) \in \mathbb{R}^{d_\varphi}$, e.g. $[1, x] \mathbin{\stackrel{\varphi}{\rightarrow}} [1, x, x^2, \ldots, x^d]$. A drawback of polynomial bases is that the basis functions are "non-local", meaning that the fitted value of y at a given value x = x0 depends strongly on data values with x far from x0.[8] In modern statistics, polynomial basis functions are used along with new basis functions, such as splines, radial basis functions, and wavelets. These families of basis functions offer a more parsimonious fit for many types of data.

The goal of polynomial regression is to model a non-linear relationship between the independent and dependent variables (technically, between the independent variable and the conditional mean of the dependent variable). This is similar to the goal of nonparametric regression, which aims to capture non-linear regression relationships. Therefore, non-parametric regression approaches such as smoothing can be useful alternatives to polynomial regression. Some of these methods make use of a localized form of classical polynomial regression.[9] An advantage of traditional polynomial regression is that the inferential framework of multiple regression can be used (this also holds when using other families of basis functions such as splines).

A final alternative is to use kernelized models such as support vector regression with a polynomial kernel.
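
A hedged sketch of this alternative, assuming scikit-learn is installed (the hyperparameters and data are illustrative only): support vector regression with a polynomial kernel captures the non-linear relationship without constructing the polynomial features explicitly.

```python
import numpy as np
from sklearn.svm import SVR

# Simulated quadratic relationship.
x = np.linspace(0.0, 10.0, 50).reshape(-1, 1)
y = 1.0 + 2.0 * x.ravel() - 0.3 * x.ravel() ** 2

# Polynomial kernel of degree 2; C controls regularization strength.
model = SVR(kernel="poly", degree=2, C=100.0)
model.fit(x, y)
print(model.predict([[5.0]]))  # prediction at x = 5
```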

If residuals have unequal variance, a weighted least squares estimator may be used to account for that.[10]
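
A minimal sketch of that estimator, assuming NumPy; the helper name wls_polyfit and the weights are hypothetical, and in practice the weights would come from an estimate of the residual variances. It solves the weighted normal equations $(\mathbf{X}^{\mathsf{T}}\mathbf{W}\mathbf{X})\vec{\beta} = \mathbf{X}^{\mathsf{T}}\mathbf{W}\vec{y}$:

```python
import numpy as np

def wls_polyfit(x, y, w, m):
    X = np.vander(x, m + 1, increasing=True)
    XtW = X.T * w  # equivalent to X.T @ diag(w), without forming diag(w)
    return np.linalg.solve(XtW @ X, XtW @ y)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.2, 4.9, 10.5])
w = np.array([1.0, 1.0, 0.5, 0.25])  # down-weight noisier observations
print(wls_polyfit(x, y, w, 2))
```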

Notes

  • Microsoft Excel makes use of polynomial regression when fitting a trendline to data points on an X Y scatter plot.[11]

References

  1. ^ Yin-Wen Chang; Cho-Jui Hsieh; Kai-Wei Chang; Michael Ringgaard; Chih-Jen Lin (2010). "Training and testing low-degree polynomial data mappings via linear SVM". Journal of Machine Learning Research. 11: 1471–1490.
  2. ^ Gergonne, J. D. (November 1974) [1815]. "The application of the method of least squares to the interpolation of sequences". Historia Mathematica. 1 (4) (Translated by Ralph St. John and S. M. Stigler from the 1815 French ed.): 439–447. doi:10.1016/0315-0860(74)90034-2.
  3. ^ Stigler, Stephen M. (November 1974). "Gergonne's 1815 paper on the design and analysis of polynomial regression experiments". Historia Mathematica. 1 (4): 431–439. doi:10.1016/0315-0860(74)90033-0.
  4. ^ Smith, Kirstine (1918). "On the Standard Deviations of Adjusted and Interpolated Values of an Observed Polynomial Function and its Constants and the Guidance They Give Towards a Proper Choice of the Distribution of the Observations". Biometrika. 12 (1/2): 1–85. doi:10.2307/2331929. JSTOR 2331929.
  5. ^ Muthukrishnan, Gowri (17 Jun 2018). "Maths behind Polynomial regression". Retrieved 30 Jan 2024.
  6. ^ "Mathematics of Polynomial Regression". Polynomial Regression, A PHP regression class.
  7. ^ Devore, Jay L. (1995). Probability and Statistics for Engineering and the Sciences (4th ed.). US: Brooks/Cole Publishing Company. pp. 539–542. ISBN 0-534-24264-2.
  8. ^ Such "non-local" behavior is a property of analytic functions that are not constant (everywhere), and has been widely discussed in statistics.
  9. ^ Fan, Jianqing (1996). Local Polynomial Modelling and Its Applications: From linear regression to nonlinear regression. Monographs on Statistics and Applied Probability. Chapman & Hall/CRC. ISBN 978-0-412-98321-4.
  10. ^ Conte, S.D.; De Boor, C. (2018). Elementary Numerical Analysis: An Algorithmic Approach. Classics in Applied Mathematics. Society for Industrial and Applied Mathematics. p. 259. ISBN 978-1-61197-520-8. Retrieved 2020-08-28.
  11. ^ Stevenson, Christopher. "Tutorial: Polynomial Regression in Excel". facultystaff.richmond.edu. Retrieved 22 January 2017.