Neural Architecture Search Task

From GM-RKB

A Neural Architecture Search Task is a model search task for neural network models that automates the design of an artificial neural network's architecture, optimizing it for a given task.



References

2024

  • (Wikipedia, 2024) ⇒ https://en.wikipedia.org/wiki/Neural_architecture_search Retrieved:2024-3-4.
    • Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANN), a widely used model in the field of machine learning. NAS has been used to design networks that are on par with or outperform hand-designed architectures.[1] Methods for NAS can be categorized according to the search space, search strategy, and performance estimation strategy used:[2]
      • The search space defines the type(s) of ANN that can be designed and optimized.
      • The search strategy defines the approach used to explore the search space.
      • The performance estimation strategy evaluates the performance of a possible ANN from its design (without constructing and training it).
    • NAS is closely related to hyperparameter optimization [3] and meta-learning and is a subfield of automated machine learning (AutoML).
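The three components above can be made concrete with a minimal sketch. The following is an illustration only, not any published NAS method: the search space, the candidate architectures, and the `estimate_performance` scoring stub are all hypothetical stand-ins (a real system would train each candidate network, or a cheap proxy for it, and return validation accuracy). The search strategy shown is plain random search.

```python
import random

# Hypothetical search space: each candidate architecture is described
# by a depth, a layer width, and an activation function.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Search strategy: sample uniformly at random from the search space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def estimate_performance(arch):
    """Performance estimation stub. A real NAS system would construct and
    train the network (or use a proxy such as a few training epochs) and
    return validation accuracy; this made-up score is for illustration only."""
    return arch["depth"] * 0.01 + arch["width"] * 0.0001

def random_search(n_trials=20, seed=0):
    """Explore the search space and keep the best-scoring architecture."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = estimate_performance(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, score)
```

More sophisticated search strategies (reinforcement learning, evolutionary algorithms, gradient-based methods) replace `sample_architecture`'s uniform sampling with a policy that is updated from the scores of previously evaluated candidates.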

Footnotes

  1. Barret Zoph, and Quoc V. Le (2016). "Neural Architecture Search with Reinforcement Learning". arXiv:1611.01578.
  2. Thomas Elsken, Jan Hendrik Metzen, and Frank Hutter (2019). "Neural Architecture Search: A Survey". Journal of Machine Learning Research, 20(55): 1–21.
  3. Matthias Feurer, and Frank Hutter (2019). "Hyperparameter Optimization". In: AutoML: Methods, Systems, Challenges, pages 3–38. Springer.