Decoupled differentiable graph neural architecture search

Chen, Jiamin ; Gao, Jianliang ; Wu, Zhenpeng ; Al-Sabri, Raeed ; Oloulade, Babatounde Moctard

Information Sciences, 2024-07, Vol.673, p.120700, Article 120700 [Peer-reviewed journal]

Elsevier Inc

  • Subjects: Decoupled differentiable optimization ; Graph neural architecture search ; Graph neural network ; Supernet pruning
  • Description: Differentiable graph neural architecture search (GNAS) designs high-performing graph neural networks (GNNs) automatically and efficiently for different graph data distributions. Given a GNN search space containing multiple candidate GNN component operations, the differentiable GNAS method builds a mixed supernet in which learnable architecture parameters multiply the candidate operations. Once the mixed supernet is optimized, it is pruned according to the best architecture parameters to efficiently identify the optimal GNN architecture in the search space. However, the multiplicative relationship between the architecture parameters and the candidate operations introduces a coupled optimization bias into the weight optimization of the supernet's candidate operations, degrading differentiable GNAS performance. To solve the coupled optimization bias of previous differentiable GNAS methods, we propose the Decoupled Differentiable Graph Neural Architecture Search (D2GNAS). It uses the Gumbel distribution as a bridge to decouple the weight optimization of the supernet's candidate operations from the architecture parameters, yielding a decoupled differentiable GNN architecture sampler. The sampler selects promising GNN architectures by treating the architecture parameters as sampling probabilities, and is further optimized through validation gradients derived from the sampled architectures. Simultaneously, D2GNAS builds a single-path supernet with a pruning strategy that progressively compresses the supernet to further improve search efficiency. We conduct extensive experiments on multiple benchmark graphs. The experimental findings demonstrate that D2GNAS outperforms all established baseline methods, both manual GNNs and GNAS methods, in terms of performance. Additionally, D2GNAS has lower time complexity than previous differentiable GNAS methods; on the fair GNN search space it achieves an average 5x efficiency improvement. Code is available at https://github.com/AutoMachine0/D2GNAS.
  • Language: English
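The decoupled sampling idea in the abstract — treating architecture parameters as sampling probabilities via the Gumbel distribution to select a single candidate operation, then progressively pruning the weakest candidate — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the candidate operation names, the single-component search space, and the pruning rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate operations for one GNN component slot
# (illustrative names; the actual search space is defined in the paper).
candidates = ["gcn", "gat", "sage", "gin"]

# Learnable architecture parameters (treated as logits of sampling
# probabilities); random values stand in for learned ones here.
alpha = rng.normal(size=len(candidates))

def gumbel_max_sample(logits, rng):
    """Sample one candidate index with the Gumbel-max trick:
    argmax(logits + Gumbel noise) follows the softmax(logits) distribution,
    so sampling replaces the weighted mixture of all candidates."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    return int(np.argmax(logits + gumbel))

# Decoupled step: only the single sampled operation's weights would be
# trained this step, so operation weights never mix with architecture
# parameters multiplicatively.
idx = gumbel_max_sample(alpha, rng)
sampled_op = candidates[idx]

# Progressive pruning sketch: drop the candidate with the lowest
# architecture parameter to shrink the single-path supernet.
keep = np.argsort(alpha)[1:]
pruned = [candidates[i] for i in sorted(keep)]
```

In this sketch the Gumbel-max trick makes the forward pass use exactly one candidate per step, which is the sense in which weight training is decoupled from the architecture parameters; the mixed-supernet approach would instead sum all candidates weighted by softmax(alpha).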
