
This article was automatically translated from the original Turkish version.

Model: NASNet
Year: June 23, 2018
Developer: Google Brain
Base Component: Normal and Reduction Cells
Variants: NASNet-A, NASNet-B, NASNet-C

NASNet (Neural Architecture Search Network) is a family of neural networks constructed using the Neural Architecture Search (NAS) approach, which enables the automatic design of deep neural network architectures without human intervention. Developed by Google Brain, NASNet models have achieved performance levels surpassing traditional hand-designed architectures. This architecture has been specifically optimized to achieve high accuracy in visual tasks such as classification and object detection.

Neural Architecture Search (NAS) Approach

Neural Architecture Search (NAS) is a method that aims to enable machine learning algorithms to automatically discover the optimal neural network structure for a given task. The NASNet architecture is a successful application of this approach.

Search Space

In NASNet, the search process is performed not on the entire network directly, but on smaller building blocks called cells. These building blocks are then connected together to form larger models.

  • Normal cell: Processes input data without changing its spatial dimensions.
  • Reduction cell: Reduces the spatial size of the feature map, thereby increasing the level of abstraction.
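The contrast between the two cell types can be illustrated with a minimal Keras sketch. The layer choices here (separable convolutions, a residual add) are hypothetical simplifications, not the exact operations found by the search:

```python
import tensorflow as tf
from tensorflow.keras import layers

def normal_cell(x, filters):
    """Keeps spatial dimensions: stride-1 branches plus a residual add."""
    branch = layers.SeparableConv2D(filters, 3, padding="same", activation="relu")(x)
    skip = layers.Conv2D(filters, 1, padding="same")(x)  # match channel count
    return layers.add([branch, skip])

def reduction_cell(x, filters):
    """Halves spatial dimensions: every branch uses stride 2."""
    branch = layers.SeparableConv2D(filters, 3, strides=2, padding="same", activation="relu")(x)
    pooled = layers.Conv2D(filters, 1, strides=2, padding="same")(x)
    return layers.add([branch, pooled])

inputs = tf.keras.Input(shape=(32, 32, 3))
x = normal_cell(inputs, 16)   # spatial size stays 32x32
x = reduction_cell(x, 32)     # spatial size drops to 16x16
model = tf.keras.Model(inputs, x)
```

Stacking alternating runs of normal cells with occasional reduction cells in this way yields the characteristic NASNet macro-structure.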

Optimization of NASNet

For the search process, reinforcement learning or evolutionary algorithms are commonly used; NASNet itself employed a recurrent controller network trained with policy gradients to propose candidate cells. The search is performed on a small, low-resolution proxy dataset (CIFAR-10), and the resulting cell structures are then transferred to larger, higher-resolution models.
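The skeleton of such a search can be sketched as follows. The operation list mirrors the candidate operations in the NASNet search space, but `proxy_score` is a placeholder: in the real search, each sampled cell is trained on CIFAR-10 and its validation accuracy serves as the reward signal.

```python
import random

random.seed(0)

# Candidate operations drawn from the NASNet-style search space.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "avg_pool_3x3", "max_pool_3x3", "identity"]

def sample_cell(num_blocks=5):
    """Each block picks two earlier states as inputs and applies one op to each."""
    cell = []
    for i in range(num_blocks):
        inputs = [random.randrange(i + 2) for _ in range(2)]  # states 0..i+1 exist so far
        ops = [random.choice(OPS) for _ in range(2)]
        cell.append((inputs, ops))
    return cell

def proxy_score(cell):
    # Placeholder reward: the real search trains the candidate on the
    # proxy task and returns its validation accuracy.
    return random.random()

# Random search stand-in; NASNet's controller learns which cells to propose.
best = max((sample_cell() for _ in range(100)), key=proxy_score)
```

This stand-in uses random search for brevity; the controller in the actual method learns to bias sampling toward high-reward cells.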

Figure: Architecture of the best convolutional cells (NASNet-A) with B = 5 blocks identified on CIFAR-10.

In the NASNet architecture, the cells discovered during the search are replicated numerous times to construct a deep network architecture.

Structure of the NASNet Architecture

NASNet models are scalable to different sizes and computational resource requirements, making them suitable for both mobile devices and high-capacity servers.

NASNet-A, NASNet-B, and NASNet-C

  • NASNet-A: Contains the most successful cell architectures and is typically used for large datasets such as ImageNet.
  • NASNet-B: Represents the second-best cell architecture, positioned between variants A and C, balancing performance and efficiency.
  • NASNet-C: Recommended for scenarios requiring lower computational cost.
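Of these, only the NASNet-A variants were released as reference implementations in Keras, as `NASNetLarge` (for server-scale workloads) and `NASNetMobile` (for constrained devices). A minimal instantiation:

```python
import tensorflow as tf

# weights=None builds the architecture without downloading ImageNet weights;
# pass weights="imagenet" to get the pretrained model instead.
mobile = tf.keras.applications.NASNetMobile(weights=None)

print(mobile.count_params())  # roughly 5 million parameters
```

`NASNetLarge` is built the same way but expects 331x331 inputs and has roughly 89 million parameters.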

Transferability

The cells discovered in NASNet are designed to be transferable across different datasets. For example, cell structures searched on the CIFAR-10 dataset can be successfully applied to larger datasets such as ImageNet.
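In practice this transferability is exploited via fine-tuning. A hypothetical sketch, reusing `NASNetMobile` as a frozen feature extractor with a new head for a 10-class target dataset (the class count and optimizer are illustrative choices):

```python
import tensorflow as tf
from tensorflow.keras import layers

# weights=None keeps the example self-contained; in a real transfer-learning
# setup you would pass weights="imagenet" to reuse the pretrained features.
base = tf.keras.applications.NASNetMobile(
    weights=None,
    include_top=False,
    input_shape=(224, 224, 3),
)
base.trainable = False  # freeze the transferred cells; train only the head

model = tf.keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),  # new head for 10 target classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

After the head converges, some or all of the base cells can be unfrozen for further fine-tuning at a lower learning rate.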

Advantages of NASNet

  • Performance: On ImageNet, the NASNet-A model achieved a top-1 accuracy of 82.7 percent.
  • Automation: Superior architecture design without human intervention.
  • Adaptability: Scalable structure adaptable to mobile and cloud environments.

Applications

  • Image classification
  • Object detection (particularly on the COCO dataset)
  • Segmentation
  • Research platforms for automated architecture design

Author Information

Author: Kaan Gümele, December 9, 2025 at 6:42 AM
