
NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION

Schmidt-Hieber, Johannes

The Annals of statistics, 2020-08, Vol.48 (4), p.1875-1897 [Peer-reviewed journal]

Hayward: Institute of Mathematical Statistics

Full text available

  • Title:
    NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION
  • Author: Schmidt-Hieber, Johannes
  • Subjects: Activation ; Architecture ; Artificial neural networks ; Composition ; Computer architecture ; Constraint modelling ; Convergence ; Estimators ; Minimax technique ; Multilayers ; Neural networks ; Parameters ; Regression analysis ; Regression models ; Studies ; Wavelet transforms
  • Is part of: The Annals of statistics, 2020-08, Vol.48 (4), p.1875-1897
  • Description: Consider the multivariate nonparametric regression model. It is shown that estimators based on sparsely connected deep neural networks with the ReLU activation function and a properly chosen network architecture achieve the minimax rates of convergence (up to log n factors) under a general composition assumption on the regression function. The framework includes many well-studied structural constraints such as (generalized) additive models. While there is a lot of flexibility in the network architecture, the tuning parameter is the sparsity of the network. Specifically, we consider large networks whose number of potential network parameters exceeds the sample size. The analysis gives some insights into why multilayer feedforward neural networks perform well in practice. Interestingly, for the ReLU activation function the depth (number of layers) of the neural network architecture plays an important role, and our theory suggests that for nonparametric regression, scaling the network depth with the sample size is natural. It is also shown that under the composition assumption wavelet estimators can only achieve suboptimal rates.
  • Publisher: Hayward: Institute of Mathematical Statistics
  • Language: English
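The description's central object, a sparsely connected feedforward ReLU network whose potential parameter count exceeds the sample size while only s weights are nonzero, can be illustrated with a toy NumPy sketch. This is not the paper's estimator or proof apparatus; the architecture dimensions, the seed, and the hard-thresholding step used to enforce sparsity are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """ReLU activation sigma(x) = max(x, 0), applied componentwise."""
    return np.maximum(x, 0.0)

def forward(x, weights, biases):
    """Feedforward pass: hidden layers use ReLU, the output layer is linear."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    return weights[-1] @ h + biases[-1]

# Illustrative architecture: input dim 4, three hidden layers of width 16, scalar output.
# Total potential weight count here is 4*16 + 16*16 + 16*16 + 16*1 = 592.
dims = [4, 16, 16, 16, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(dims[:-1], dims[1:])]
biases = [rng.standard_normal(m) for m in dims[1:]]

# Enforce sparsity s: keep only the s largest-magnitude weights, zero the rest
# (hard thresholding, an assumed stand-in for the sparsity constraint in the abstract).
s = 100
all_w = np.concatenate([W.ravel() for W in weights])
thresh = np.sort(np.abs(all_w))[-s]  # magnitude of the s-th largest weight
weights = [np.where(np.abs(W) >= thresh, W, 0.0) for W in weights]

nonzero = sum(int((W != 0).sum()) for W in weights)
y = forward(rng.standard_normal(4), weights, biases)
```

Here the network retains its full depth and width (the "large network" of the abstract) while the active parameter count is controlled by the single tuning parameter s.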
