Download a PDF of the paper titled Classification with Deep Neural Networks and Logistic Loss, by Zihan Zhang and 2 other authors

Abstract: Deep neural networks (DNNs) trained with the logistic loss (i.e., the cross entropy loss) have made impressive advancements in various binary classification tasks. However, generalization analysis for binary classification with DNNs and logistic loss remains scarce. The unboundedness of the target function for the logistic loss is the main obstacle to deriving satisfactory generalization bounds. In this paper, we aim to fill this gap by establishing a novel and elegant oracle-type inequality, which enables us to deal with the boundedness restriction of the target function, and using it to derive sharp convergence rates for fully connected ReLU DNN classifiers trained with the logistic loss. In particular, we obtain optimal convergence rates (up to log factors) only requiring the Hölder smoothness of the conditional class probability $\eta$ of data. Moreover, we consider a compositional assumption that requires $\eta$ to be the composition of several vector-valued functions of which each component function is either a maximum value function or a Hölder smooth function only depending on a small number of its input variables. Under this assumption, we derive optimal convergence rates (up to log factors) which are independent of the input dimension of data. This result explains why DNN classifiers can perform well in practical high-dimensional classification problems. Besides the novel oracle-type inequality, the sharp convergence rates given in our paper also owe to a tight error bound for approximating the natural logarithm function near zero (where it is unbounded) by ReLU DNNs. In addition, we justify our claims for the optimality of rates by proving corresponding minimax lower bounds.
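The abstract mentions a tight error bound for approximating the natural logarithm near zero by ReLU DNNs. As a hypothetical numerical illustration (not the paper's actual construction): any continuous piecewise-linear function can be represented exactly by a one-hidden-layer ReLU network, so interpolating log on a grid that refines geometrically toward zero keeps the uniform error small even though log blows up there. The grid size, interval, and helper names below are illustrative choices, not taken from the paper.

```python
import math

def relu(x):
    return max(x, 0.0)

def pwl_log(ts):
    """ReLU-network form of the piecewise-linear interpolant of log.

    A continuous piecewise-linear function with breakpoints
    t_1 < ... < t_n can be written as
        f(t) = y_1 + a_1*(t - t_1) + sum_k d_k * relu(t - t_k),
    where d_k is the slope change at interior breakpoint t_k,
    i.e. a one-hidden-layer ReLU network.
    """
    ys = [math.log(t) for t in ts]
    slopes = [(ys[i + 1] - ys[i]) / (ts[i + 1] - ts[i])
              for i in range(len(ts) - 1)]
    a1 = slopes[0]
    deltas = [slopes[i] - slopes[i - 1] for i in range(1, len(slopes))]

    def f(t):
        out = ys[0] + a1 * (t - ts[0])
        for tk, dk in zip(ts[1:-1], deltas):
            out += dk * relu(t - tk)
        return out

    return f

eps = 1e-3            # stay away from the singularity at 0
n = 64                # number of breakpoints (ReLU units + linear part)
# geometric grid: spacing shrinks toward zero, where log varies fastest
ts = [eps * (1.0 / eps) ** (i / (n - 1)) for i in range(n)]
f = pwl_log(ts)

# uniform error of the ReLU approximant on a fine test grid in [eps, 1]
err = max(abs(f(t) - math.log(t))
          for t in (eps + (1 - eps) * i / 9999 for i in range(10000)))
print(err)  # small uniform error despite only ~64 ReLU units
```

On a geometric grid the ratio between consecutive breakpoints is constant, so the per-interval interpolation error of log (whose second derivative is $-1/t^2$) is roughly constant across intervals; a uniform grid would waste units far from zero and still do poorly near it.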