pith. machine review for the scientific record.

GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model

2 Pith papers cite this work. Polarity classification is still indexing.

fields: cs.CL (1) · cs.LG (1)

years: 2023 (1) · 2021 (1)

representative citing papers

BloombergGPT: A Large Language Model for Finance

cs.LG · 2023-03-30 · conditional · novelty 6.0

BloombergGPT is a 50B-parameter LLM trained on a mixed dataset of 708B tokens of financial and general-purpose text; it outperforms prior models on financial benchmarks while preserving general LLM performance.

Showing 2 of 2 citing papers.