Victor Akinwande, Megan Macgregor, et al.
IJCAI 2024
Prequential model selection and delete-one cross-validation are data-driven methodologies for choosing between rival models on the basis of their predictive abilities. For a given set of observations, the predictive ability of a model is measured by the model's accumulated prediction error and by the model's average out-of-sample prediction error, respectively, for prequential model selection and for cross-validation. In this paper, given i.i.d. observations, we propose nonparametric regression estimators, based on neural networks, that select the number of "hidden units" (or "neurons") using either prequential model selection or delete-one cross-validation. As our main contributions: (i) we establish rates of convergence for the integrated mean-squared errors in estimating the regression function using "off-line" or "batch" versions of the proposed estimators and (ii) we establish rates of convergence for the time-averaged expected prediction errors in using "on-line" versions of the proposed estimators. We also present computer simulations (i) empirically validating the proposed estimators and (ii) empirically comparing the proposed estimators with certain novel prequential and cross-validated "mixture" regression estimators.
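To make the two selection criteria concrete, here is a minimal sketch (not the paper's estimators or theory) of choosing the number of hidden units k by (a) prequential accumulated prediction error and (b) delete-one (leave-one-out) cross-validation. It uses scikit-learn's MLPRegressor as a stand-in single-hidden-layer network; the synthetic data, candidate widths, warm-up length, and training settings are illustrative assumptions.

```python
# Sketch: pick the number of hidden units k by prequential accumulated
# prediction error vs. delete-one cross-validation (illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n = 60
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)  # noisy regression data

def fit_net(k, X_tr, y_tr):
    """Fit a single-hidden-layer network with k hidden units."""
    net = MLPRegressor(hidden_layer_sizes=(k,), max_iter=2000, random_state=0)
    return net.fit(X_tr, y_tr)

def prequential_error(k, warmup=10):
    """Accumulated squared error: predict each new observation from the past only."""
    err = 0.0
    for t in range(warmup, n):
        net = fit_net(k, X[:t], y[:t])
        err += (net.predict(X[t:t + 1])[0] - y[t]) ** 2
    return err

def loo_error(k):
    """Average out-of-sample squared error under delete-one cross-validation."""
    errs = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        net = fit_net(k, X[train_idx], y[train_idx])
        errs.append((net.predict(X[test_idx])[0] - y[test_idx[0]]) ** 2)
    return float(np.mean(errs))

candidates = [1, 2, 4, 8, 16]           # candidate numbers of hidden units
k_preq = min(candidates, key=prequential_error)
k_loo = min(candidates, key=loo_error)
print("prequential choice:", k_preq, " delete-one CV choice:", k_loo)
```

Both criteria score each candidate width only on predictions made without seeing the target point: prequentially via a growing "past" sample, and in cross-validation via refits that omit one observation at a time.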
Ella Barkan, Ibrahim Siddiqui, et al.
Computational and Structural Biotechnology Journal
Merve Unuvar, Yurdaer Doganata, et al.
CLOUD 2014
Fahiem Bacchus, Joseph Y. Halpern, et al.
IJCAI 1995