
A Comparative Study of Transformer and Convolutional Neural Network Architectures


Nguyễn Trung Hiếu

Network intrusion detection has become increasingly critical in cybersecurity as cyber threats continue to evolve in sophistication and frequency. This paper presents a comprehensive comparative study between Transformer-based architectures and Convolutional Neural Networks (CNNs) for network intrusion detection using the NSL-KDD dataset. We implement and evaluate both architectures on a multi-class classification task involving 40 different types of network attacks and normal traffic across 119 features. Our experimental results demonstrate that while the Transformer model achieves slightly higher accuracy (71.22% vs 70.73%), the CNN model shows superior F1-score performance (0.6325 vs 0.6116) and significantly better computational efficiency with 8.7× faster training time (203.31s vs 1761.22s). The study provides insights into the effectiveness of attention mechanisms versus convolutional architectures for capturing complex patterns in network traffic data, while also analyzing computational efficiency and practical deployment considerations. Our findings contribute to the advancement of AI-driven cybersecurity solutions and provide guidance for selecting appropriate deep learning architectures for network intrusion detection systems.
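The abstract reports a split result: the Transformer wins on accuracy while the CNN wins on F1-score. Such a split is typical under the class imbalance found in NSL-KDD, where a few traffic classes dominate and many attack types are rare — macro-averaged F1 weights every class equally, while accuracy is dominated by the majority class. A minimal pure-Python sketch with hypothetical labels (not the paper's data or models) showing how the two metrics can rank models differently:

```python
# Toy illustration: under class imbalance (as in NSL-KDD, where some attack
# types are rare), one model can win on accuracy yet lose on macro F1.
# The labels and predictions below are hypothetical, for illustration only.

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred):
    # Unweighted mean of per-class F1, so rare classes count equally.
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)

y_true = ["normal"] * 8 + ["attack"] * 2          # 80/20 class imbalance

model_a = ["normal"] * 10                          # ignores the rare class
model_b = ["normal"] * 5 + ["attack"] * 5          # catches attacks, more false alarms

print(accuracy(y_true, model_a), macro_f1(y_true, model_a))  # 0.8, ~0.444
print(accuracy(y_true, model_b), macro_f1(y_true, model_b))  # 0.7, ~0.670
```

Model A has the higher accuracy (0.8 vs 0.7) but the lower macro F1 (≈0.444 vs ≈0.670), mirroring the accuracy/F1 disagreement between the two architectures reported above.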

Published in:



Publisher:

Location:


Keywords:

Network Security, Intrusion Detection, Deep Learning, Transformer, CNN, NSL-KDD, Cybersecurity