
A Survey on Methods of Applying Transformers to Non-NLP Applications

Nguyễn Trung Hiếu

Transformers play a key role in the success of today's large language models, and the architecture's success is impressive. In this survey we examine the adaptability of Transformer models to fields other than natural language processing (NLP), giving insight into their strength in handling diverse types of data. Although initially introduced for NLP, Transformers are increasingly used to solve problems involving images, signal analysis, time series, and the processing and modeling of spatial data. However, many questions about applying Transformers in such domains remain open: whether these adaptations are natural fits; whether they are truly effective; whether Transformers are a universal architecture for everything; what innovative techniques have been used to apply Transformers outside NLP; and what challenges remain to be solved. Our survey gradually clarifies these aspects. We also identify some basic principles for adapting Transformers to non-NLP applications, while recognizing their limitations and directions for improvement.
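The basic adaptation principle the abstract refers to is treating non-text data as a token sequence. The sketch below is not from the paper itself; it is a minimal illustration, in PyTorch, of the patch embedding used by Vision Transformers (one of the paper's keywords): an image is cut into fixed-size patches, each projected to a vector, yielding a sequence a standard Transformer encoder can consume. The class name PatchEmbedding and the dimensions shown are illustrative assumptions.

    import torch
    import torch.nn as nn

    class PatchEmbedding(nn.Module):
        """Turn an image into a token sequence: split it into fixed-size
        patches and project each patch to an embedding vector, as in ViT."""
        def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):
            super().__init__()
            # A strided convolution is equivalent to cutting non-overlapping
            # patches and applying one shared linear projection to each.
            self.proj = nn.Conv2d(in_chans, embed_dim,
                                  kernel_size=patch_size, stride=patch_size)

        def forward(self, x):                    # x: (B, C, H, W)
            x = self.proj(x)                     # (B, D, H/P, W/P)
            return x.flatten(2).transpose(1, 2)  # (B, N, D): N patch tokens

    # A 224x224 RGB image becomes 196 tokens of dimension 768, which a
    # standard nn.TransformerEncoder can process like a sentence of words.
    tokens = PatchEmbedding()(torch.randn(1, 3, 224, 224))
    print(tokens.shape)  # torch.Size([1, 196, 768])

The same recipe generalizes to the other domains the survey covers: time series and signals can be sliced into windows and projected the same way, after which the attention layers themselves need no modification.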

Published in:


Date published:

DOI:


Publisher:

Location:


Keywords:

Vision Transformer (ViT), Self-Attention Mechanism, Data Efficiency, Time Series Forecasting, Adversarial Robustness, Multimodal Representation.