An Empirical Study on Cross-Lingual and Cross-Domain Transfer for Legal Judgment Prediction

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission
Abstract: Cross-lingual transfer learning has proven useful in a variety of NLP tasks, but it is understudied in the context of legal NLP, and not at all on Legal Judgment Prediction (LJP). We explore transfer learning techniques for LJP using the trilingual Swiss-Judgment-Prediction (SJP) dataset, which includes cases written in three languages (German, French, Italian). We find that Cross-Lingual Transfer (CLT) improves the overall results across languages, especially when we augment the dataset with machine-translated versions of the original documents, resulting in a $3\times$ larger training corpus. We then analyze the effect of cross-domain and cross-regional transfer, i.e., training a model across domains (legal areas) or regions. We find that in both settings (legal areas, origin regions), models trained across all groups perform better overall and also improve results in the worst-case scenarios. Finally, we report improved results when we ambitiously apply cross-jurisdiction transfer, augmenting our dataset with Indian legal cases originally written in English.
Paper Type: short
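
For readers who want a concrete starting point, the sketch below shows cross-lingual fine-tuning of a multilingual encoder on the pooled trilingual SJP data. It is a minimal illustration, not the paper's method: the Hugging Face dataset id `rcds/swiss_judgment_prediction`, its `all_languages` configuration, the `text`/`label` column names, and the `xlm-roberta-base` checkpoint are all assumptions; the paper's actual models, hyperparameters, and machine-translation augmentation pipeline are not reproduced here.

```python
# Minimal sketch: cross-lingual fine-tuning on Swiss-Judgment-Prediction (SJP).
# Assumptions (not taken from the paper): dataset id "rcds/swiss_judgment_prediction"
# with an "all_languages" config pooling German/French/Italian cases, "text"/"label"
# columns, train/validation splits, and the "xlm-roberta-base" checkpoint.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

MODEL_NAME = "xlm-roberta-base"  # assumed multilingual baseline, not the paper's exact model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Pool all three languages for cross-lingual training (config name is assumed).
dataset = load_dataset("rcds/swiss_judgment_prediction", "all_languages")

def tokenize(batch):
    # Truncate long case facts to the encoder's maximum input length.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="sjp-xlmr",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```

Machine-translated augmentation, as described in the abstract, would amount to concatenating translated copies of each case into the training split before fine-tuning; that preprocessing step is omitted above.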