Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain

Item Type Journal paper
Abstract

In this article, we explore the potential of transformer-based language models (LMs) to correctly represent normative statements in the legal domain, taking tax law as our use case. In our experiment, we use a variety of LMs as bases for both word- and sentence-based clusterers that are then evaluated on a small, expert-compiled test set consisting of real-world samples from tax law research literature that can be clearly assigned to one of four normative theories. The results of the experiment show that clusterers based on sentence-BERT embeddings deliver the most promising results. Building on this main experiment, we make first attempts at using the best-performing models in a bootstrapping loop to build classifiers that map normative claims onto one of these four normative theories.

Authors Gubelmann, Reto; Handschuh, Siegfried & Hongler, Peter
Language English
Subjects computer science
social sciences
law
HSG Classification contribution to scientific community
HSG Profile Area LS - Business Enterprise - Law, Innovation and Risk
Refereed No
Date 26 August 2021
Publisher arXiv
Place of Publication https://arxiv.org/abs/2108.11215
Number arXiv:2108.11215
Depositing User Prof. Dr. Peter Hongler
Date Deposited 27 Aug 2021 12:02
Last Modified 27 Aug 2021 12:12
URI: https://www.alexandria.unisg.ch/publications/264176

Download

2108.11215-3.pdf

Download (577kB)

Citation

Gubelmann, Reto; Handschuh, Siegfried & Hongler, Peter (2021) Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain. (arXiv:2108.11215).
