Neural Architecture Search (NAS) is the process of discovering the best architecture for a neural network to use for a specific task. In the past, programmers had to manually tweak neural networks to learn what works well. NAS automates that process, allowing artificial intelligence (AI) systems to discover more complex architectures.

In essence, NAS uses various tools and methods to test and evaluate huge numbers of candidate architectures, relying on a search strategy to choose the one that best meets the programmer’s objectives.

NAS uses optimization-based algorithms to find the ideal solution to a specific problem from a large pool of candidates.

When Did Neural Architecture Search Originate?

NAS, part of the broader push to automate machine learning (ML) and deep learning problem solving, first came to light in 2016, when Zoph et al. and Baker et al. used reinforcement learning algorithms to create state-of-the-art architectures for image recognition and language modeling. Their work considerably boosted NAS research.

How Does Neural Architecture Search Work?

NAS starts from a search space, the set of candidate architectures that could solve a specific problem; the bigger the search space, the more possible solutions there are to explore. The programmer then crafts a search strategy with criteria for selecting architectures from that space. The search begins when the strategy picks an architecture to test using the performance estimation strategy, which evaluates how well that architecture solves the problem. The evaluation results are fed back to the search strategy, which uses them to pick the next candidate. This loop continues over the available architectures until the ideal solution is found or an adequate fitness rating is reached.
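To make that loop concrete, here’s a minimal sketch in Python. Everything in it is a hypothetical placeholder: the search space is a toy grid of depth, width, and activation choices, and evaluate() fakes the score a real system would get by training each candidate and measuring its validation accuracy.

```python
import random

# Minimal sketch of the NAS loop (hypothetical toy example).
# Search space: every combination of depth, width, and activation.
SEARCH_SPACE = [
    {"layers": d, "units": w, "activation": a}
    for d in (2, 4, 8)
    for w in (32, 64, 128)
    for a in ("relu", "tanh")
]

def evaluate(arch):
    """Performance estimation stand-in.

    A real system would train (or partially train) the candidate
    network and return its validation accuracy; here we fake a score.
    """
    return random.random()

def search(space, fitness_target=0.95, budget=50):
    """Search strategy: sample candidates, keep the best, and stop
    once adequate fitness is reached or the budget runs out."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = random.choice(space)       # strategy picks a candidate
        score = evaluate(arch)            # performance estimation
        if score > best_score:            # result fed back to the strategy
            best_arch, best_score = arch, score
        if best_score >= fitness_target:  # adequate fitness rating reached
            break
    return best_arch, best_score

best, fitness = search(SEARCH_SPACE)
print(f"best architecture: {best} (fitness {fitness:.3f})")
```

Swapping out how search() picks its next candidate (here, uniformly at random) is precisely what distinguishes the strategies covered later in this article.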

Here’s a diagram depicting NAS.

[Diagram: Neural Architecture Search]
Source: https://medium.com/digital-catapult/neural-architecture-search-the-foundations-a6cc85f7562

What Is Neural Architecture Search For?

NAS is used to explore vast numbers of potential solutions to problems of varying complexity, which can make the process very computationally expensive. The larger the search space, the more architectures there are to test, train, and evaluate. As a result, the process requires considerable resources and can sometimes take days to find a suitable model.

Does Neural Architecture Search Have Limitations?

As a relatively new technology, NAS has certain limitations, including:

  • Since NAS explores multitudes of potential solutions to problems of varying complexity, performing it is very computationally expensive.
  • NAS requires a lot of resources and can take days to find good models.
  • Since NAS typically evaluates candidate models on training data, it isn’t easy to know how they will perform on real-world data.
  • Creating search and performance estimation strategies is still done manually. This task requires several rounds of fine-tuning, making NAS time-consuming even before testing starts.
  • Some NAS models aren’t robust yet and can be hard to train.

What Are Some Applications of Neural Architecture Search?

To date, we’ve seen some popular NAS applications, including:

  • Image recognition and classification, the task NAS was first demonstrated on
  • Language modeling, another of the original benchmark tasks
  • Object detection and image segmentation

What Are the Different Neural Architecture Search Methods?

Various search strategies can be used to explore the space of candidate architectures, including:

  • Random search: As the name suggests, the AI picks architectures at random, evaluating one after another until adequate fitness is reached.
  • Bayesian optimization: The evaluation starts with the architecture most likely to work on the problem. The AI then chooses each subsequent candidate based on how effective the previous ones were, making this strategy less expensive to run than blind testing.
  • Evolutionary methods: This strategy keeps a population of architectures and repeatedly mutates the fittest ones to breed the next generation (see the first sketch after this list).
  • Reinforcement learning: In this strategy, a controller gets rewarded for proposing architectures that perform well and penalized for proposing ones that don’t, hence the term “reinforcement.”
  • Gradient-based methods: This strategy treats architecture choices as continuous parameters the AI can adjust by gradient descent, homing in on the ideal solution faster with each step (see the second sketch below).
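
As promised in the list above, here’s a hedged sketch of an evolutionary strategy in the same toy setting as the earlier example; random_arch(), mutate(), and evaluate() are hypothetical stand-ins for real architecture sampling, perturbation, and training.

```python
import random

# Toy evolutionary search strategy (hypothetical example):
# keep a population of architectures, score them, and breed
# the next generation by mutating the fittest survivors.
DEPTHS, WIDTHS = (2, 4, 8), (32, 64, 128)

def random_arch():
    return {"layers": random.choice(DEPTHS), "units": random.choice(WIDTHS)}

def mutate(arch):
    """Randomly perturb one attribute of a parent architecture."""
    child = dict(arch)
    key = random.choice(list(child))
    child[key] = random.choice(DEPTHS if key == "layers" else WIDTHS)
    return child

def evaluate(arch):
    # Stand-in for training and validating the candidate network.
    return random.random()

def evolve(generations=10, pop_size=8, survivors=2):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[:survivors]           # selection
        population = parents + [
            mutate(random.choice(parents))     # mutation
            for _ in range(pop_size - survivors)
        ]
    return max(population, key=evaluate)

print("best architecture:", evolve())
```

And here’s a toy illustration of the gradient-based idea, loosely in the spirit of continuous-relaxation approaches: instead of picking one operation outright, the AI keeps a softmax-weighted mix of operations and nudges the weights downhill. The two candidate ops and their fixed losses are invented for the example.

```python
import math

# Gradient-based sketch: architecture parameters (alpha) weight a
# softmax mixture of candidate ops; gradient descent on the mixed
# loss shifts weight toward the better op. Losses are made up.
OP_LOSSES = {"conv3x3": 0.2, "skip": 0.7}   # pretend validation losses

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

alpha = [0.0, 0.0]                 # one architecture parameter per op
losses = list(OP_LOSSES.values())
lr = 0.5

for step in range(100):
    w = softmax(alpha)
    mixed = sum(wi * li for wi, li in zip(w, losses))    # relaxed loss
    # d(mixed)/d(alpha_k) for a softmax mixture: w_k * (loss_k - mixed)
    grads = [wi * (li - mixed) for wi, li in zip(w, losses)]
    alpha = [a - lr * g for a, g in zip(alpha, grads)]

w = softmax(alpha)
chosen = max(zip(OP_LOSSES, w), key=lambda t: t[1])[0]
print("final weights:", dict(zip(OP_LOSSES, (round(x, 3) for x in w))))
print("chosen op:", chosen)
```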

Given the complexity and resource-hogging nature of NAS, Microsoft has been working on Project Petridish, an effort to speed the process up while minimizing resource usage.