
This article was automatically translated from the original Turkish version.


ResNet (Residual Network)

Model: ResNet
Date: November 10, 2015
Developer: Kaiming He and team
Base Component: Skip connections
Success: ImageNet 2015 Championship
Variants: ResNet-18, ResNet-34, ResNet-50, ResNet-101, ResNet-152

ResNet (Residual Network), developed to address one of the most significant problems in deep neural network architectures—“degradation of training performance as the number of layers increases”—was introduced in 2015 by Kaiming He and his team. This architecture enables much deeper networks to be trained more efficiently and successfully by incorporating residual connections in addition to conventional layer structures. It revolutionized the field of deep learning by achieving high accuracy in tasks such as image classification.

Residual Learning Mechanism

At the core of the ResNet architecture are residual connections (skip connections), which transmit not only the direct output of each layer to the next but also the original input itself forward. This design allows the model to learn only the residual changes needed, rather than the full transformation.

Residual Blocks

Residual blocks are the fundamental building units of the ResNet architecture. Each block takes an input vector x and produces a transformed version F(x). Unlike in classical networks, the input is directly added to this transformation:

y = F(x) + x


This enables the model to learn transformations close to zero more easily. As a result, the vanishing gradient problem that typically arises as network depth increases is significantly mitigated.
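The effect of the identity shortcut can be sketched in a few lines of plain NumPy. Here a dense layer stands in for the block's convolutional layers, and the function name `residual_block` is illustrative, not from the original paper:

```python
import numpy as np

def residual_block(x, weights, bias):
    # F(x): a simple linear map standing in for the block's conv layers
    f_x = weights @ x + bias
    # Skip connection: add the original input back to the transformation
    return f_x + x

x = np.array([1.0, 2.0, 3.0])
# When the learned transformation is (near) zero, the block simply
# passes its input through: y = F(x) + x = 0 + x = x
zero_w = np.zeros((3, 3))
zero_b = np.zeros(3)
y = residual_block(x, zero_w, zero_b)
print(np.allclose(y, x))  # True
```

This is why learning an identity mapping is easy for a residual block: the weights only need to drift toward zero, rather than having to reproduce the input exactly through a full transformation.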

Simple Residual Block Structure

A typical residual block consists of the following components:

  • Two consecutive convolutional layers (Conv2D),
  • Batch normalization (BN) and ReLU activation function following each convolutional layer,
  • A residual connection at the end.
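The components listed above can be combined into a minimal forward-pass sketch. For brevity this uses NumPy with dense layers in place of Conv2D and omits batch normalization; the name `simple_residual_block` is illustrative:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def simple_residual_block(x, w1, w2):
    # First "convolution" (a dense layer stands in for Conv2D), then ReLU;
    # batch normalization is omitted to keep the sketch short
    h = relu(w1 @ x)
    # Second "convolution"
    h = w2 @ h
    # Skip connection: add the original input, then the final activation
    return relu(h + x)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
w1 = rng.standard_normal((4, 4)) * 0.1
w2 = rng.standard_normal((4, 4)) * 0.1
y = simple_residual_block(x, w1, w2)
print(y.shape)  # (4,)
```

Note that the addition requires F(x) and x to have the same shape; in real ResNets, a 1x1 convolution on the shortcut path handles the cases where the spatial size or channel count changes.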


Residual Block Structure (Credit: Dive into Deep Learning)

Depth and Variants

The ResNet architecture has been implemented at various depths. The most well-known variants are:

  • ResNet-18 and ResNet-34: Shallower networks that use basic residual blocks.
  • ResNet-50, ResNet-101 and ResNet-152: Deeper models that incorporate special block structures known as “bottleneck blocks”.

Bottleneck Blocks

These blocks consist of three layers designed to reduce the number of parameters and computational cost:

  1. 1x1 convolution (dimension reduction),
  2. 3x3 convolution (feature extraction),
  3. 1x1 convolution (dimension restoration).

This structure enhances the efficiency of deep models. In the ResNet-50 architecture, layers are grouped into blocks supported by residual connections, and these blocks are repeated as depth increases.
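The saving from the bottleneck design can be checked with simple arithmetic. The sketch below counts convolution weights only (ignoring biases and batch-norm parameters) for a block operating on 256 channels with an assumed internal width of 64, the ratio used in the original ResNet-50:

```python
# Weight counts for one block at 256 channels, comparing a basic block
# (two 3x3 convolutions at full width) with a bottleneck block
# (1x1 reduce -> 3x3 at reduced width -> 1x1 restore).

channels = 256
bottleneck = 64  # reduced dimension inside the bottleneck (assumption: 4x reduction)

# Basic block: two 3x3 convolutions at full width
basic = 2 * (3 * 3 * channels * channels)

# Bottleneck block: three layers, with the expensive 3x3 at reduced width
bn_params = (1 * 1 * channels * bottleneck
             + 3 * 3 * bottleneck * bottleneck
             + 1 * 1 * bottleneck * channels)

print(basic)      # 1179648
print(bn_params)  # 69632
```

Even with an extra layer, the bottleneck block uses roughly 17x fewer weights here, which is what makes depths like 101 and 152 layers practical.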


ResNet-50 Model Architecture (Credit:

Applications and Achievements

ResNet has been successfully applied to numerous computer vision tasks, with image classification being the most prominent. It demonstrated high accuracy and efficient training performance by winning first place in the ImageNet competition in 2015. Furthermore, the ResNet architecture has served as the foundation for many subsequent models, such as ResNeXt and DenseNet. Even in today's transformer-based models, the residual connection structure is actively incorporated.

Author Information

Kaan Gümele, December 11, 2025 at 8:00 AM


Contents

  • Residual Learning Mechanism

    • Residual Blocks

      • Simple Residual Block Structure

  • Depth and Variants

    • Bottleneck Blocks

  • Applications and Achievements
