Isotonic regression

From Wikipedia, the free encyclopedia
An example of isotonic regression (solid red line) compared to linear regression on the same data, both fit to minimize the mean squared error. The free-form property of isotonic regression means the line can be steeper where the data are steeper; the isotonicity constraint means the line does not decrease.

In statistics and numerical analysis, isotonic regression or monotonic regression is the technique of fitting a free-form line to a sequence of observations such that the fitted line is non-decreasing (or non-increasing) everywhere, and lies as close to the observations as possible.

Applications


Isotonic regression has applications in statistical inference. For example, one might use it to fit an isotonic curve to the means of some set of experimental results when an increase in those means according to some particular ordering is expected. A benefit of isotonic regression is that it is not constrained by any functional form, such as the linearity imposed by linear regression, as long as the function is monotonically increasing.

Another application is nonmetric multidimensional scaling,[1] where a low-dimensional embedding for data points is sought such that the order of distances between points in the embedding matches the order of dissimilarity between points. Isotonic regression is used iteratively to fit ideal distances to preserve relative dissimilarity order.

Isotonic regression is also used in probabilistic classification to calibrate the predicted probabilities of supervised machine learning models.[2]

Isotonic regression for the simply ordered case with univariate x, y has been applied to estimating continuous dose-response relationships in fields such as anesthesiology and toxicology. Narrowly speaking, isotonic regression only provides point estimates at observed values of x. Estimation of the complete dose-response curve without any additional assumptions is usually done via linear interpolation between the point estimates.[3]

Software for computing isotone (monotonic) regression has been developed for R,[4][5][6] Stata, and Python.[7]
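As a brief illustration of the Python implementation cited above, scikit-learn's `IsotonicRegression` fits the monotone least-squares curve and interpolates linearly for new inputs. The data here are made up for the sketch:

```python
# A minimal example using scikit-learn's IsotonicRegression; the data are
# made up, with one monotonicity violation at x = 3.
import numpy as np
from sklearn.isotonic import IsotonicRegression

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 3.0, 2.0, 4.0, 5.0])

iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
y_hat = iso.fit_transform(x, y)  # fitted values at the observed x
print(y_hat)  # the violating pair (3, 2) is pooled to (2.5, 2.5)

# Predictions at new x interpolate linearly between the fitted points,
# clipping to the end values outside the observed range.
print(iso.predict([2.5, 10.0]))
```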

Problem statement and algorithms


Let (x_1, y_1), …, (x_n, y_n) be a given set of observations, where the y_i ∈ ℝ and the x_i fall in some partially ordered set. For generality, each observation (x_i, y_i) may be given a weight w_i ≥ 0, although commonly w_i = 1 for all i.

Isotonic regression seeks a weighted least-squares fit ŷ_i ≈ y_i for all i, subject to the constraint that ŷ_i ≤ ŷ_j whenever x_i ≤ x_j. This gives the following quadratic program (QP) in the variables ŷ_1, …, ŷ_n:

min Σ_{i=1}^{n} w_i (ŷ_i − y_i)²   subject to   ŷ_i ≤ ŷ_j for all (i, j) ∈ E

where E = {(i, j) : x_i ≤ x_j} specifies the partial ordering of the observed inputs x_i (and may be regarded as the set of edges of some directed acyclic graph (DAG) with vertices 1, 2, …, n). Problems of this form may be solved by generic quadratic programming techniques.
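To make the QP formulation concrete, it can be handed directly to a general-purpose solver. This sketch uses SciPy's SLSQP with a small made-up data set and a made-up partial order E; the dedicated algorithms described next are far more efficient in practice:

```python
# Solving the isotonic QP with a generic constrained solver (illustrative
# only). The data and the partial order E are made up for this sketch.
import numpy as np
from scipy.optimize import minimize

y = np.array([3.0, 1.0, 2.0, 5.0])
w = np.ones_like(y)
E = [(0, 1), (0, 2), (1, 3), (2, 3)]  # pairs (i, j) with x_i <= x_j (a small DAG)

objective = lambda yh: np.sum(w * (yh - y) ** 2)
# One inequality constraint yh[j] - yh[i] >= 0 per ordered pair (i, j)
constraints = [{"type": "ineq", "fun": (lambda yh, i=i, j=j: yh[j] - yh[i])}
               for (i, j) in E]

res = minimize(objective, y.copy(), constraints=constraints)
print(np.round(res.x, 4))  # the violating values are pooled to a common level
```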

In the usual setting where the x_i values fall in a totally ordered set such as ℝ, we may assume WLOG that the observations have been sorted so that x_1 ≤ x_2 ≤ ⋯ ≤ x_n, and take E = {(i, i+1) : 1 ≤ i < n}. In this case, a simple iterative algorithm for solving the quadratic program is the pool adjacent violators algorithm. Best and Chakravarti[8] studied the problem as an active set identification problem and proposed a primal algorithm. These two algorithms can be seen as each other's dual, and both have a computational complexity of O(n) on already sorted data.[8]
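A minimal sketch of the pool adjacent violators algorithm for data already sorted by x: scan left to right, and whenever a new value violates monotonicity against the block before it, merge the two blocks into their weighted mean, repeating until the block sequence is non-decreasing.

```python
# A sketch of the pool adjacent violators algorithm (PAVA) for data already
# sorted by x; runs in O(n) using a stack of blocks.
def pava(y, w=None):
    if w is None:
        w = [1.0] * len(y)
    blocks = []  # each block: [weighted mean, total weight, observation count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge while the last two blocks violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    # Expand blocks back to one fitted value per observation
    out = []
    for m, _, c in blocks:
        out.extend([m] * c)
    return out

print(pava([1.0, 3.0, 2.0, 4.0, 5.0]))  # → [1.0, 2.5, 2.5, 4.0, 5.0]
```

Each observation is merged at most once per pass of the stack, which is what gives the linear running time on sorted data.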

To complete the isotonic regression task, we may then choose any non-decreasing function f(x) such that f(x_i) = ŷ_i for all i. Any such function obviously solves

min_f Σ_{i=1}^{n} w_i (f(x_i) − y_i)²   subject to f being non-decreasing

and can be used to predict the y values for new values of x. A common choice when x_i ∈ ℝ would be to interpolate linearly between the points (x_i, ŷ_i), as illustrated in the figure, yielding a continuous piecewise-linear function:

f(x) = ŷ_1                                               if x ≤ x_1
f(x) = ŷ_i + ((x − x_i) / (x_{i+1} − x_i)) · (ŷ_{i+1} − ŷ_i)   if x_i ≤ x ≤ x_{i+1}
f(x) = ŷ_n                                               if x ≥ x_n
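This piecewise-linear interpolant matches the behavior of NumPy's `np.interp`, which also clamps to the end values outside [x_1, x_n]; the fitted values below are made up:

```python
# The piecewise-linear f above, via np.interp (which clamps to the first
# and last fitted values outside [x_1, x_n], matching the boundary cases).
import numpy as np

x_fit = np.array([1.0, 2.0, 3.0, 4.0])
y_fit = np.array([1.0, 2.5, 2.5, 4.0])  # isotonic fitted values at x_fit

f = lambda x_new: np.interp(x_new, x_fit, y_fit)
print(f(np.array([0.0, 2.5, 5.0])))  # clamped left, interpolated, clamped right
```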

Centered isotonic regression


As this article's first figure shows, in the presence of monotonicity violations the resulting interpolated curve will have flat (constant) intervals. In dose-response applications it is usually known that f(x) is not only monotone but also smooth. The flat intervals are incompatible with f(x)'s assumed shape, and can be shown to be biased. A simple improvement for such applications, named centered isotonic regression (CIR), was developed by Oron and Flournoy and shown to substantially reduce estimation error for both dose-response and dose-finding applications.[9] Both CIR and the standard isotonic regression for the univariate, simply ordered case are implemented in the R package "cir".[4] This package also provides analytical confidence-interval estimates.
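A rough sketch of CIR's central idea, as described in the cited paper: each flat stretch produced by pooling is collapsed to a single point at the weight-averaged x of the stretch, and the curve is then interpolated between those points. The boundary handling in the "cir" package is more careful than this sketch, and the data are made up:

```python
# A rough sketch of the central idea behind centered isotonic regression:
# each flat (pooled) stretch of the isotonic fit is represented by a single
# point at its weighted-average x, removing the flat intervals after
# interpolation. Boundary handling in the cited paper is more careful.
import numpy as np

def cir_points(x, y_hat, w=None):
    x, y_hat = np.asarray(x, float), np.asarray(y_hat, float)
    w = np.ones_like(x) if w is None else np.asarray(w, float)
    xs, ys = [], []
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and y_hat[j + 1] == y_hat[i]:
            j += 1  # extend over the flat stretch
        block = slice(i, j + 1)
        xs.append(np.average(x[block], weights=w[block]))  # center of the stretch
        ys.append(y_hat[i])
        i = j + 1
    return np.array(xs), np.array(ys)

# The flat stretch at x = 2, 3 (fitted value 2.5) becomes one point at x = 2.5
xs, ys = cir_points([1.0, 2.0, 3.0, 4.0], [1.0, 2.5, 2.5, 4.0])
print(xs, ys)
```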

References

  1. ^ Kruskal, J. B. (1964). "Nonmetric Multidimensional Scaling: A numerical method". Psychometrika. 29 (2): 115–129. doi:10.1007/BF02289694. S2CID 11709679.
  2. ^ Niculescu-Mizil, Alexandru; Caruana, Rich (2005). "Predicting good probabilities with supervised learning". In De Raedt, Luc; Wrobel, Stefan (eds.). Proceedings of the Twenty-Second International Conference on Machine Learning (ICML 2005), Bonn, Germany, August 7–11, 2005. ACM International Conference Proceeding Series. Vol. 119. Association for Computing Machinery. pp. 625–632. doi:10.1145/1102351.1102430.
  3. ^ Stylianou, MP; Flournoy, N (2002). "Dose finding using the biased coin up-and-down design and isotonic regression". Biometrics. 58 (1): 171–177. doi:10.1111/j.0006-341x.2002.00171.x. PMID 11890313. S2CID 8743090.
  4. ^ a b Oron, Assaf. "Package 'cir'". CRAN. R Foundation for Statistical Computing. Retrieved 26 December 2020.
  5. ^ Leeuw, Jan de; Hornik, Kurt; Mair, Patrick (2009). "Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods". Journal of Statistical Software. 32 (5): 1–24. doi:10.18637/jss.v032.i05. ISSN 1548-7660.
  6. ^ Xu, Zhipeng; Sun, Chenkai; Karunakaran, Aman. "Package UniIsoRegression" (PDF). CRAN. R Foundation for Statistical Computing. Retrieved 29 October 2021.
  7. ^ Pedregosa, Fabian; et al. (2011). "Scikit-learn: Machine learning in Python". Journal of Machine Learning Research. 12: 2825–2830. arXiv:1201.0490. Bibcode:2011JMLR...12.2825P.
  8. ^ a b Best, Michael J.; Chakravarti, Nilotpal (1990). "Active set algorithms for isotonic regression; A unifying framework". Mathematical Programming. 47 (1–3): 425–439. doi:10.1007/bf01580873. ISSN 0025-5610. S2CID 31879613.
  9. ^ Oron, AP; Flournoy, N (2017). "Centered Isotonic Regression: Point and Interval Estimation for Dose-Response Studies". Statistics in Biopharmaceutical Research. 9 (3): 258–267. arXiv:1701.05964. doi:10.1080/19466315.2017.1286256. S2CID 88521189.

Further reading

Wikibooks has a book on the topic of: Isotonic regression