
Challenges And Thrills Of Legal Arguments

Pallaprolu Anurag, Vaidya Radha, Attawar Aditya Swaroop. arXiv 2020

[Paper]    
Attention Mechanism, BERT, Model Architecture, Pretraining Methods, Transformer

State-of-the-art attention-based models, mostly centered around the transformer architecture, solve the problem of sequence-to-sequence translation using so-called scaled dot-product attention. While this technique is highly effective for estimating inter-token attention, it does not address inter-sequence attention in conversation-like scenarios. We propose an extension, HumBERT, that attempts to perform continuous contextual argument generation using locally trained transformers.
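The scaled dot-product attention the abstract refers to is the standard transformer formulation, softmax(QK^T / sqrt(d_k))V. A minimal NumPy sketch of that operation (not the paper's HumBERT extension; the function name and toy shapes here are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_queries, n_keys) similarity scores
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a convex combination of V rows

# toy example: 3 query tokens attending over 4 key/value tokens, d_k = 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

This computes attention between tokens of a single pair of sequences; the inter-sequence attention the paper targets would operate one level up, across whole turns of a conversation.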

Similar Work