PTIT Knowledge Portal

Attentive Deep Neural Networks for Legal Document Retrieval

Ha Thanh Nguyen, Manh Kien Phi, Ngô Xuân Bách, Vu Tran, Le Minh Nguyen, Từ Minh Phương

Legal text retrieval serves as a key component in a wide range of legal text processing tasks such as legal question answering, legal case entailment, and statute law retrieval. The performance of legal text retrieval depends, to a large extent, on the representation of text, both queries and legal documents. Based on good representations, a legal text retrieval model can effectively match a query to its relevant documents. Because legal documents often contain long articles of which only some parts are relevant to a query, representing such documents is quite a challenge for existing models. In this paper, we study the use of attentive neural network-based text representation for statute law document retrieval. We propose a general approach using deep neural networks with attention mechanisms. Building on this approach, we develop two hierarchical architectures with sparse attention to represent long sentences and articles, which we name Attentive CNN and Paraformer. The methods are evaluated on datasets of different sizes and characteristics in English, Japanese, and Vietnamese. Experimental results show that: i) attentive neural methods substantially outperform non-neural methods in retrieval performance across datasets and languages; ii) pretrained transformer-based models achieve better accuracy on small datasets at the cost of high computational complexity, while the lighter-weight Attentive CNN achieves better accuracy on large datasets; and iii) our proposed Paraformer outperforms state-of-the-art methods on the COLIEE dataset, achieving the highest recall and F2 scores in the top-N retrieval task.
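The hierarchical idea described above (attention-weighted pooling of word vectors into sentence vectors, then of sentence vectors into an article vector) can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the context vector, dimensions, and function names are all hypothetical, and real models would learn these parameters and use sparse attention over much longer inputs.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_pool(vectors, context):
    # score each row against a (hypothetical, normally learned) context
    # vector, then return the attention-weighted sum of the rows
    scores = softmax(vectors @ context)
    return scores @ vectors

def encode_article(sentences, context):
    # level 1: pool each sentence's word vectors into a sentence vector
    sent_vecs = np.stack([attentive_pool(s, context) for s in sentences])
    # level 2: pool sentence vectors into a single article vector
    return attentive_pool(sent_vecs, context)

# toy usage: 3 sentences of 5 word vectors each, dimension 8
rng = np.random.default_rng(0)
sentences = [rng.normal(size=(5, 8)) for _ in range(3)]
context = rng.normal(size=8)
article_vec = encode_article(sentences, context)  # shape (8,)
```

A query encoded the same way could then be matched against article vectors by cosine similarity; the attention weights let the article representation emphasize the parts most related to the context.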

Published in:

Artificial Intelligence and Law

Publication date:

2023


Publisher:

Springer

Keywords:

Legal text retrieval, deep neural networks, hierarchical representation, global attention
