International paper
A Survey on Methods of Applying Transformers to Non-NLP Applications
Nguyễn Trung Hiếu
Transformers play a key role in the success of today's large language models, and the success of the architecture is impressive. In this survey we examine the adaptability of Transformer models to fields other than natural language processing (NLP), thereby gaining insight into the strengths of Transformer models in processing various types of data. Although initially introduced for NLP, Transformers are increasingly used to solve problems involving images, signal analysis, time series, and the processing and modeling of spatial data, among others. However, many questions about applying Transformers in such domains remain open: whether the fit is natural; whether they are really effective; whether Transformers are a universal architecture for everything; what innovative techniques have been used to apply Transformers outside NLP; and what challenges remain to be solved. Our survey gradually clarifies these aspects. We also identify some basic principles for adapting Transformers to non-NLP applications, while recognizing their limitations and directions for improvement.
Related papers
FA-Net: A Dual-Branch Attention Architecture for Extracting Fine-Grained Anatomical Features of Wood
Ma Công Thành
TinyCDAE: Lightweight Convolutional Denoising Autoencoders for Real-Time Image Denoising on Resource-Constrained IoT Devices
Nguyễn Trọng Huân
An Object Detection Framework Based on Relationship Between Objects in an Open Vocabulary Using Owl-VIT And RelTransformer
Nguyễn Thị Nguyệt
Estimation of External Government Debt Thresholds: The Case of Vietnam
Đặng Thị Huyền Anh