B0687
Title: Some rates of convergence in unlinked monotone regression
Authors: Fadoua Balabdaoui - ETH Zurich (Switzerland) [presenting]
Abstract: The so-called univariate unlinked regression is considered when the unknown regression curve is monotone. In standard monotone regression, one observes a pair $(X, Y)$ where a response $Y$ is linked to a covariate $X$ through the model $Y = m_0(X) + \epsilon$, with $m_0$ the (unknown) monotone regression function and $\epsilon$ the unobserved error (assumed to be independent of $X$). In the unlinked regression setting, one only observes a vector of realizations of the response $Y$ and a vector of realizations of the covariate $X$, where $Y$ is now only known to have the same distribution as $m_0(X) + \epsilon$. Despite this, it is still possible to derive a consistent non-parametric estimator of $m_0$ under the assumption of monotonicity of $m_0$ and knowledge of the distribution of the noise. We establish an upper bound on the rate of convergence of such an estimator under minimal assumptions on the distribution of the covariate $X$. We discuss extensions to the case in which the distribution of the noise is unknown. We develop a gradient-descent-based algorithm for its computation, and we demonstrate its use on synthetic data.
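
To make the setting concrete, the following is a minimal, purely illustrative sketch, not the estimator or algorithm of the paper: it fits a monotone curve on synthetic unlinked data by gradient descent, assuming a known Gaussian noise distribution, a monotone candidate parameterized through cumulative softplus increments, and an empirical 1-Wasserstein objective matching the distribution of $m(X) + \epsilon$ to that of $Y$. All parameterization and tuning choices below are assumptions made for illustration only.

# Hypothetical sketch of unlinked monotone regression via gradient descent.
# NOT the authors' algorithm: parameterization, loss, and step sizes are
# illustrative assumptions only.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
k1, k2, k3, k4 = jax.random.split(key, 4)

# Synthetic data: monotone truth m0, Gaussian noise with KNOWN distribution.
# Y is drawn from an independent copy of X, so (X, Y) pairs are "unlinked".
n = 500
m0 = lambda x: x**3                       # true monotone regression function
X = jax.random.uniform(k1, (n,), minval=-1.0, maxval=1.0)
Y = m0(jax.random.uniform(k2, (n,), minval=-1.0, maxval=1.0)) \
    + 0.1 * jax.random.normal(k3, (n,))

# Monotone candidate: values at the sorted X's, built as a cumulative sum of
# non-negative (softplus) increments so monotonicity holds by construction.
Xs = jnp.sort(X)
def m_values(params):
    base, incr = params
    return base + jnp.cumsum(jax.nn.softplus(incr))

# Loss: empirical 1-Wasserstein distance between the sample m(X_i) + eps_i
# (eps_i simulated from the known noise law) and the sample of Y.
eps = 0.1 * jax.random.normal(k4, (n,))
def loss(params):
    fitted = m_values(params) + eps
    return jnp.mean(jnp.abs(jnp.sort(fitted) - jnp.sort(Y)))

params = (jnp.array(-1.0), jnp.zeros(n) - 5.0)  # start near a flat, small curve
grad = jax.jit(jax.grad(loss))

lr = 0.05
for _ in range(2000):
    g = grad(params)
    params = tuple(p - lr * gp for p, gp in zip(params, g))

print("final loss:", float(loss(params)))
# m_values(params)[i] is the fitted estimate of m0 at Xs[i].

The distribution-matching objective reflects the fact that, in the unlinked setting, only the law of $Y$ is informative about $m_0(X) + \epsilon$; monotonicity of the candidate is what ties the matched values back to the covariate through the order of the $X_i$.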