Automated Machine Learning: Methods, Systems, Challenges
Every machine learning system has hyperparameters, and the most basic task in automated machine learning (AutoML) is to automatically set these hyperparameters to optimize performance. Recent deep neural networks in particular depend crucially on a wide range of hyperparameter choices about the neural network's architecture, regularization, and optimization. Automated hyperparameter optimization (HPO) has several important use cases; for example, it can reduce the human effort necessary for applying machine learning, which is particularly important in the context of AutoML.

Solving Eq. 1.1 with one of the techniques described in the rest of this chapter usually requires fitting the machine learning algorithm A with multiple hyperparameter vectors λ_t. Instead of using the argmin operator over these, it is possible either to construct an ensemble (which aims to minimize the loss for a given validation protocol) or to integrate out all the hyperparameters (if the model under consideration is a probabilistic model). We refer to Guyon et al. [50] and the references therein for a comparison of frequentist and Bayesian model selection.

Choosing only a single hyperparameter configuration can be wasteful when HPO has identified many good configurations, and combining them in an ensemble can improve performance [109]. This is particularly useful in AutoML systems with a large configuration space (e.g., in FMS or CASH), where good configurations can be very diverse, which increases the potential gains from ensembling [4, 19, 31, 34]. To further improve performance, Automatic Frankensteining [155] uses HPO to train a stacking model [156] on the outputs of the models found with HPO; the second-level models are then combined using a traditional ensembling strategy.
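The idea of combining good configurations found by HPO into an ensemble can be sketched with greedy ensemble selection in the style of Caruana et al., which is one common way AutoML systems build such ensembles. The sketch below uses synthetic validation data and hypothetical model predictions; it is an illustration of the principle, not the implementation used by any particular system.

```python
import numpy as np

def greedy_ensemble_selection(val_preds, y_val, n_iters=5):
    """Repeatedly add (with replacement) the model whose inclusion most
    reduces the mean-squared error of the averaged ensemble prediction
    on the validation set."""
    ensemble_sum = np.zeros_like(y_val, dtype=float)
    chosen = []
    for step in range(1, n_iters + 1):
        # Loss of the ensemble if each candidate model were added next.
        losses = [np.mean(((ensemble_sum + p) / step - y_val) ** 2)
                  for p in val_preds]
        best = int(np.argmin(losses))
        chosen.append(best)
        ensemble_sum += val_preds[best]
    return chosen, ensemble_sum / n_iters

# Toy stand-ins for validation predictions of three HPO-found configurations,
# differing only in how noisy they are.
rng = np.random.default_rng(0)
y_val = rng.normal(size=50)
val_preds = [y_val + rng.normal(scale=s, size=50) for s in (0.3, 0.5, 1.0)]

chosen, ens_pred = greedy_ensemble_selection(val_preds, y_val, n_iters=5)
```

Because the greedy step may always re-add the current best model (leaving the averaged prediction unchanged), the final ensemble's validation loss never exceeds that of the best single configuration.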
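The stacking idea behind Automatic Frankensteining [155, 156] can likewise be illustrated in miniature: the validation predictions of the level-1 models found by HPO become input features for a level-2 model. The sketch below uses a plain least-squares linear combiner on synthetic data purely to show the mechanism; the actual method trains richer second-level models and then ensembles them.

```python
import numpy as np

# Hypothetical out-of-sample validation predictions from three models
# found by HPO (level-1 outputs), stacked column-wise as features.
rng = np.random.default_rng(1)
y_val = rng.normal(size=80)
P = np.column_stack(
    [y_val + rng.normal(scale=s, size=80) for s in (0.4, 0.6, 0.9)]
)

# Level-2 model: fit linear stacking weights by least squares on the
# level-1 validation outputs, then combine the predictions.
w, *_ = np.linalg.lstsq(P, y_val, rcond=None)
stacked = P @ w
```

Since each individual model corresponds to a unit-weight vector inside the least-squares search space, the stacked combination can never do worse on the fitting data than the best single level-1 model.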