
On Monotone Regression
Berwin A Turlach (UWA), 29 August 2011


Why do we want the monotonicity constraint?
The underlying phenomenon can be monotone. For example:
- growth curve data, e.g. tumour growth
- monotonic transformation of data, e.g. in a regression or ANOVA situation
- forensic science, e.g. predicting age from a maturity score (created from teeth sizes)
- nonparametric calibration
- dose-response curves


Selective review
Much work has been done on constrained nonparametric smoothing:
- some using kernel smoothing methods
  Friedman and Tibshirani (1984), Mammen (1991), Marron et al. (1997), Fisher et al. (1997), Mammen et al. (2001), Hall and Huang (2001), Dette et al. (2006), Dette and Pilz (2006)
- but mainly in the splines literature
  ◮ using linear or quadratic programming approaches
    Dierckx (1980), Wright and Wegman (1980), Micchelli et al. (1985), Irvine et al. (1986), Villalobos and Wahba (1987), Elfving and Andersson (1988), Ramsay (1988), Fritsch (1990), Schmidt and Scholz (1990), Schwetlick and Kunert (1993), Tantiyaswasdikul and Woodroofe (1994), He and Shi (1998), Turlach (2005)
  ◮ using semi-indefinite programming approaches
    Wang (2008), Wang and Li (2008)
  ◮ via the penalty term
    Ramsay (1998), Heckman and Ramsay (2000)


Selective review (ctd)
  ◮ via boosting techniques
    Leitenstorfer and Tutz (2007), Tutz and Leitenstorfer (2007)
  ◮ via Bayesian approaches
    Ramgopal et al. (1993), Lavine and Mockus (1995), Perron and Mengersen (2001), Holmes and Heard (2003), Neelon and Dunson (2004), Brezger and Steiner (2008), Shively et al. (2009), Hazelton and Turlach (2011), Meyer et al. (2011)
  ◮ for some theory see
    Utreras (1985), Mammen and Thomas-Agnan (1999), Meyer (2008)


The big question
To smooth or not to smooth, that is the question:
Whether 'tis nobler in the mind to let the data speak for themselves,
or to fit parametric models to them, and by fitting miss some features?


Polynomials
Polynomials are usually parameterised as follows:

    p(x) = \beta_0 + \beta_1 x + \beta_2 x^2 + \cdots + \beta_p x^p

- This is linear in its parameters
- Parameters can be estimated using standard linear regression techniques
- Nothing difficult here!
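Before imposing monotonicity, note that the unconstrained fit is just ordinary least squares on the monomial basis. A minimal sketch (the simulated data and the chosen degree are illustrative assumptions, not the talk's data):

```python
import numpy as np

# Ordinary least squares fit of an unconstrained degree-p polynomial.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 20 * x**3 - 5 * x + rng.normal(scale=3.0, size=x.size)   # illustrative data

degree = 5
X = np.vander(x, degree + 1, increasing=True)   # columns 1, x, x^2, ..., x^p
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # beta[k] estimates beta_k
print(beta)
```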


Example: monotone polynomial fit to simulated data
[Figure 3: Monotone polynomial (degree 5) fit; scatter plot of y against x, x in [-1, 1]]


Comparison: unconstrained polynomial and monotone polynomial
[Figure 4: Comparison of the standard polynomial fit and the monotone polynomial fit]


The Problem and existing methods
How to fit a monotone polynomial to data?
- Ramsay (1998) wrote: "Attempting to impose monotonicity on polynomials quickly becomes unpleasant"
- Elphinstone (1983) provided formulations of monotone polynomials
- Hawkins (1994) fitted monotone polynomials to data by introducing horizontal inflection points (HIPS)


Formulations of monotone polynomials
Elphinstone provided this formulation:

    p(x) = d + a \int_0^x \prod_{i=1}^{K} \{ (c_i^2 + b_i^2) + 2 b_i t + t^2 \} \, dt

where 2K + 1 is the degree of the polynomial, the b_i and c_i are arbitrary real values, d is the intercept, and a is a scaling parameter that also determines whether the polynomial is monotonically increasing or decreasing.
This has first derivative

    p'(x) = a \prod_{i=1}^{K} \{ (c_i^2 + b_i^2) + 2 b_i x + x^2 \} = a \prod_{i=1}^{K} \{ c_i^2 + (b_i + x)^2 \}

Both c_i^2 and (b_i + x)^2 are non-negative, so p'(x) never changes sign: hence monotonicity.
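Numerically, this product-integral form is convenient to work with: expand p'(t)/a into ordinary polynomial coefficients and integrate term by term. A minimal sketch of that idea (the function name and argument layout are my own, not from the talk):

```python
import numpy as np
from numpy.polynomial import polynomial as P

def elphinstone_poly(x, a, d, b, c):
    """Evaluate p(x) = d + a * int_0^x prod_i [(c_i^2 + b_i^2) + 2 b_i t + t^2] dt.

    b and c are length-K sequences, so the polynomial has degree 2K + 1.
    """
    deriv = np.array([1.0])                      # coefficients of p'(t)/a, lowest order first
    for bi, ci in zip(b, c):
        deriv = P.polymul(deriv, [ci**2 + bi**2, 2.0 * bi, 1.0])
    antideriv = P.polyint(deriv)                 # term-wise antiderivative, equal to 0 at t = 0
    return d + a * P.polyval(x, antideriv)

# p'(x) = a * prod_i [c_i^2 + (b_i + x)^2] >= 0 for a >= 0, so the values are non-decreasing:
print(elphinstone_poly(np.linspace(-2.0, 2.0, 5), a=1.0, d=0.0, b=[0.5], c=[1.0]))
```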


Problems with formulation
Elphinstone's second formulation was used by Hawkins (1994) and Heinzmann (2008):

    p(x) = d + a \int_0^x \prod_{i=1}^{K} \{ 1 + 2 b_i t + (b_i^2 + c_i^2) t^2 \} \, dt

This has two minor problems:
- We cannot express p(x) = x^p in this manner (here p'(0) = a, so any fit with a ≠ 0 has non-zero slope at the origin)
- The c_i^2 term has the potential to cause problems in optimisation routines


Alternative formulation
Penttila (2006; personal communication):

    p(x) = d + a \int_0^x \prod_{i=1}^{K} \{ b_i^2 + 2 b_i t + (1 + c_i^2) t^2 \} \, dt

- p(x) = x^p can be expressed using this formulation (e.g. setting all b_i = 0 and d = 0 yields a multiple of x^{2K+1})
Using Penttila's formulation our objective function becomes

    RSS = \sum_{j=1}^{n} \left[ y_j - \left( d + a \int_0^{x_j} \prod_{i=1}^{K} \{ b_i^2 + 2 b_i t + (1 + c_i^2) t^2 \} \, dt \right) \right]^2


Objective function - cubic example
For the cubic we have

    RSS = \sum_{j=1}^{n} \left[ y_j - ( \beta_0 + \beta_1 x_j + \beta_2 x_j^2 + \beta_3 x_j^3 ) \right]^2

When we impose the monotonicity constraint this becomes:

    RSS = \sum_{j=1}^{n} \left[ y_j - \left( d + a b_1^2 x_j + a b_1 x_j^2 + a (1 + c_1^2) \frac{x_j^3}{3} \right) \right]^2

Not too difficult, but non-linear in the parameters: needs non-linear optimisation routines
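The reparameterised cubic can be checked symbolically; a small sketch, assuming sympy is available (the symbol names are illustrative):

```python
import sympy as sp

# Verify the K = 1 (cubic) case by integrating p'(t) = b1^2 + 2*b1*t + (1 + c1^2)*t^2 from 0 to x.
t, x, a, d, b1, c1 = sp.symbols('t x a d b1 c1', real=True)
p = d + a * sp.integrate(b1**2 + 2*b1*t + (1 + c1**2)*t**2, (t, 0, x))
print(sp.expand(p))
# a*b1**2*x + a*b1*x**2 + a*c1**2*x**3/3 + a*x**3/3 + d,
# i.e. d + a*b1^2*x + a*b1*x^2 + a*(1 + c1^2)*x^3/3, as in the RSS above.
```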


Objective function K > 1?
For K = 2 this polynomial becomes more complex:

    p(x) = d + a \Big\{ b_1^2 b_2^2 x + b_1 b_2 (b_1 + b_2) x^2
                      + \tfrac{1}{3} \big[ 4 b_1 b_2 + b_2^2 (1 + c_1^2) + b_1^2 (1 + c_2^2) \big] x^3
                      + \tfrac{1}{2} \big( b_1 + b_2 + b_2 c_1^2 + b_1 c_2^2 \big) x^4
                      + \tfrac{1}{5} (1 + c_1^2)(1 + c_2^2) x^5 \Big\}

and hence the non-linear optimisation becomes more complex and computationally intensive!


Minimization of the objective function
Considered several different optimisation routines, including:
- non-derivative based algorithms
- coordinate descent
- block coordinate descent
- Levenberg–Marquardt
After simulations, concluded that Levenberg–Marquardt was the most suitable.
Also looked at a modification using constrained c_i (c_i ≥ 0) instead of c_i^2.
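For illustration, a minimal self-contained sketch of this route, fitting the Penttila parameterisation by Levenberg-Marquardt via scipy.optimize.least_squares; the simulated data, starting values and function names are assumptions for the example, not the implementation used in the talk:

```python
import numpy as np
from numpy.polynomial import polynomial as P
from scipy.optimize import least_squares

def penttila_poly(x, a, d, b, c):
    """p(x) = d + a * int_0^x prod_i [b_i^2 + 2 b_i t + (1 + c_i^2) t^2] dt."""
    deriv = np.array([1.0])
    for bi, ci in zip(b, c):
        deriv = P.polymul(deriv, [bi**2, 2.0 * bi, 1.0 + ci**2])
    return d + a * P.polyval(x, P.polyint(deriv))

def residuals(params, x, y, K):
    # params = (a, d, b_1, ..., b_K, c_1, ..., c_K)
    a, d = params[0], params[1]
    b, c = params[2:2 + K], params[2 + K:2 + 2 * K]
    return y - penttila_poly(x, a, d, b, c)

K = 2                                             # fit a monotone polynomial of degree 2K + 1 = 5
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 60)
y = 20 * x**3 + 5 * x + rng.normal(scale=3.0, size=x.size)

start = np.concatenate(([1.0, y.mean()], np.full(2 * K, 0.1)))
fit = least_squares(residuals, start, args=(x, y, K), method='lm')
print("RSS:", 2 * fit.cost)                       # least_squares reports cost = 0.5 * RSS
print("parameters:", fit.x)
```

Since the objective is non-convex in (a, d, b, c), restarting from several initial vectors and keeping the smallest RSS is a sensible guard against the local minima discussed on the following slides.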


Some simulated and real data
Considered 4 data sets.
[Figure 5: Example data sets, four scatter plots of y against x]


Simulation results — summary

Table 1: RSS for the EHH formulation (Elphinstone's second formulation, as used by Hawkins and Heinzmann)

Data set            K    c² parameterisation    constrained c ≥ 0
Data 1 (Hawkins)    1    886.32757              323.69448
                    2     63.60159               63.60159
                    3     44.94751               41.99854
                    4     41.90708               41.90566
Data 2 (Ramsay)     1    348.78796              348.78796
                    2    351.11169              138.79971
                    3    351.85594               40.68825
                    4    352.22267               40.50511
Data 3 (SimD0)      1      0.002000               0.002000
                    2      0.001469               0.001469
                    3      0.001399               0.001421
                    4      0.001414               0.001402
Data 4 (SimD2)      1     25.67706               25.67706
                    2     25.40864               25.40864
                    3     11.44768               11.24709
                    4     13.91453               11.08235

Many times the c² parameterisation converges to a local minimum.


Simulation results — summary (ctd)

Table 2: RSS focussing on the Penttila formulation

Data set            K          EHH    Elphinstone     Penttila
Data 1 (Hawkins)    1    323.69448      323.69448    323.69448
                    2     63.60159       63.60159     63.60159
                    3     41.99854       41.99854     41.99854
                    4     41.90566       41.90566     41.90566
Data 2 (Ramsay)     1    348.78796      348.78796    348.78796
                    2    138.79971      138.79971    138.79971
                    3     40.68825       40.68825     40.68825
                    4     40.50511       40.5051      40.68825
Data 3 (SimD0)      1      0.002000       0.002000     0.002000
                    2      0.001469       0.001469     0.001469
                    3      0.001421       0.001399     0.001422
                    4      0.001402       0.001401     0.001422
Data 4 (SimD2)      1     25.67706       25.67706     25.67706
                    2     25.40864       25.40864     25.40864
                    3     11.24709       11.24709     11.24709
                    4     11.08235       11.08235     11.87866

The Penttila formulation fails to converge in some instances.


Simulation results — summary (ctd)

Table 3: RSS comparing Elphinstone's two formulations

Data set            K          EHH    Elphinstone
Data 1 (Hawkins)    1    323.69448      323.69448
                    2     63.60159       63.60159
                    3     41.99854       41.99854
                    4     41.90566       41.90566
Data 2 (Ramsay)     1    348.78796      348.78796
                    2    138.79971      138.79971
                    3     40.68825       40.68825
                    4     40.50511       40.5051
Data 3 (SimD0)      1      0.002000       0.002000
                    2      0.001469       0.001469
                    3      0.001421       0.001399
                    4      0.001402       0.001401
Data 4 (SimD2)      1     25.67706       25.67706
                    2     25.40864       25.40864
                    3     11.24709       11.24709
                    4     11.08235       11.08235

Elphinstone's original formulation performs marginally better.


Back to the forensic science problem — cubic
[Figure 6: Forensic science data, K = 1. Age (years) against maturity score (standardised), with the fitted cubic monotone polynomial; RSS = 277.5]


Back to the forensic science problem — quintic
[Figure 7: Forensic science data, K = 2. Age (years) against maturity score (standardised), with the fitted quintic monotone polynomial; RSS = 273.6]


Back to the forensic science problem — septic
[Figure 8: Forensic science data, K = 3. Age (years) against maturity score (standardised), with the fitted septic monotone polynomial; RSS = 271.8]


Back to the forensic science problem
[Figure 9: Forensic science data, K = 1:3. Age (years) against maturity score (standardised), with the three fits shown together]


References
Brezger, A. and Steiner, W.J. (2008). Monotonic regression based on Bayesian P-splines: An application to estimating price response functions from store-level scanner data, Journal of Business & Economic Statistics, 26(1):90–104
Dette, H., Neumeyer, N., and Pilz, K.F. (2006). A simple nonparametric estimator of a strictly monotone regression function, Bernoulli, 12(3):469–490
Dette, H. and Pilz, K.F. (2006). A comparative study of monotone nonparametric kernel estimates, Journal of Statistical Computation and Simulation, 76(1):41–56
Dierckx, P. (1980). An algorithm for cubic spline fitting with convexity constraints, Computing, 24:349–371
Elfving, T. and Andersson, L.E. (1988). An algorithm for computing constrained smoothing spline functions, Numerische Mathematik, 52:583–595
Elphinstone, C. (1983). A target distribution model for non-parametric density estimation, Communications in Statistics - Theory and Methods, 12(2):161–198
Fisher, N.I., Hall, P., Turlach, B.A., and Watson, G.S. (1997). On the estimation of a convex set from noisy data on its support function, Journal of the American Statistical Association, 92(437):84–91, doi:10.2307/2291452


References (ctd)
Friedman, J.H. and Tibshirani, R. (1984). The monotone smoothing of scatterplots, Technometrics, 26(3):243–250
Fritsch, F.N. (1990). Monotone piecewise cubic data fitting, in J.C. Mason and M.G. Cox, editors, Algorithms for Approximation II, Chapman and Hall, London, pages 99–106
Hall, P. and Huang, L.S. (2001). Nonparametric kernel regression subject to monotonicity constraints, The Annals of Statistics, 29(3):624–647
Hawkins, D.M. (1994). Fitting monotonic polynomials to data, Computational Statistics, 9(3):233–247
Hazelton, M.L. and Turlach, B.A. (2011). Semiparametric regression with shape-constrained penalized splines, Computational Statistics & Data Analysis, 55(10):2871–2879, doi:10.1016/j.csda.2011.04.018
He, X. and Shi, P. (1998). Monotone B-spline smoothing, Journal of the American Statistical Association, 93(442):643–650
Heckman, N. and Ramsay, J.O. (2000). Penalized regression with model-based penalties, Canadian Journal of Statistics, 28:241–258


References (ctd)
Heinzmann, D. (2008). A filtered polynomial approach to density estimation, Computational Statistics, 23(3):343–360, doi:10.1007/s00180-007-0070-z
Holmes, C.C. and Heard, N.A. (2003). Generalized monotonic regression using random change points, Statistics in Medicine, 22:623–638
Irvine, L.D., Marin, S.P., and Smith, P.W. (1986). Constrained interpolation and smoothing, Constructive Approximation, 2(2):129–151
Lavine, M. and Mockus, M. (1995). A nonparametric Bayes method for isotonic regression, Journal of Statistical Planning and Inference, 46:235–248
Leitenstorfer, F. and Tutz, G. (2007). Generalized monotonic regression based on B-splines with an application to air pollution data, Biostatistics, 8(3):654–673
Mammen, E. (1991). Estimating a smooth monotone regression function, The Annals of Statistics, 19(2):724–740
Mammen, E., Marron, J.S., Turlach, B.A., and Wand, M.P. (2001). A general projection framework for constrained smoothing, Statistical Science, 16(3):232–248, doi:10.1214/ss/1009213727
Mammen, E. and Thomas-Agnan, C. (1999). Smoothing splines and shape restrictions, Scandinavian Journal of Statistics, 26:239–252


References (ctd)
Marron, J.S., Turlach, B.A., and Wand, M.P. (1997). Local polynomial smoothing under qualitative constraints, in L. Billard and N.I. Fisher, editors, Graph-Image-Vision, volume 28 of Computing Science and Statistics, Interface Foundation of North America, Inc., Fairfax Station, VA 22039–7460, pages 647–652
Meyer, M.C. (2008). Inference using shape-restricted regression splines, The Annals of Applied Statistics, 2(3):1013–1033
Meyer, M.C., Hackstadt, A.J., and Hoeting, J.A. (2011). Bayesian estimation and inference for generalised partial linear models using shape-restricted splines, Journal of Nonparametric Statistics, to appear
Micchelli, C.A., Smith, P.W., Swetits, J., and Ward, J.D. (1985). Constrained L_p approximation, Constructive Approximation, 1(1):93–102
Neelon, B. and Dunson, D.B. (2004). Bayesian isotonic regression and trend analysis, Biometrics, 60:398–406
Perron, F. and Mengersen, K. (2001). Bayesian nonparametric modeling using mixtures of triangular distributions, Biometrics, 57:518–528


References (ctd)
Ramgopal, P., Laud, P., and Smith, A. (1993). Nonparametric Bayesian bioassay with prior constraints on the shape of the potency curve, Biometrika, 80:489–498
Ramsay, J.O. (1988). Monotone regression splines in action (with discussion), Statistical Science, 3(4):425–461
Ramsay, J.O. (1998). Estimating smooth monotone functions, Journal of the Royal Statistical Society, Series B, 60(2):365–375
Schmidt, J.W. and Scholz, I. (1990). A dual algorithm for convex-concave data smoothing by cubic C²-splines, Numerische Mathematik, 57:333–350
Schwetlick, H. and Kunert, V. (1993). Spline smoothing under constraints on derivatives, BIT, 33:512–528
Shively, T.S., Sager, T.W., and Walker, S.G. (2009). A Bayesian approach to non-parametric monotone function estimation, Journal of the Royal Statistical Society, Series B, 71(1):159–175
Tantiyaswasdikul, C. and Woodroofe, M.B. (1994). Isotonic smoothing splines under sequential designs, Journal of Statistical Planning and Inference, 38:75–88
Turlach, B.A. (2005). Shape constrained smoothing using smoothing splines, Computational Statistics, 20(1):81–104, doi:10.1007/BF02736124


References (ctd)
Tutz, G. and Leitenstorfer, F. (2007). Generalized smooth monotonic regression in additive modeling, Journal of Computational and Graphical Statistics, 16(1):165–188
Utreras, F.I. (1985). Smoothing noisy data under monotonicity constraints: Existence, characterization and convergence rates, Numerische Mathematik, 47:611–625
Villalobos, M. and Wahba, G. (1987). Inequality-constrained multivariate smoothing splines with application to the estimation of posterior probabilities, Journal of the American Statistical Association, 82(397):239–248
Wang, X. (2008). Bayesian free-knot monotone cubic spline regression, Journal of Computational and Graphical Statistics, 17(2):373–387
Wang, X. and Li, F. (2008). Isotonic smoothing spline regression, Journal of Computational and Graphical Statistics, 17(1):21–37
Wright, I. and Wegman, E. (1980). Isotonic, convex and related splines, The Annals of Statistics, 8:1023–1035
