
GPT-J - AI Tool

GPT-3 Democratized. A 6B parameter open-source version of GPT-3

About GPT-J

GPT-J is an open-source alternative to OpenAI's GPT-3. The model is trained on the Pile and is available for use with Mesh Transformer JAX. Now, thanks to EleutherAI, anyone can download and use this 6B-parameter version of GPT-3.
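For example, the released weights can be loaded through the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the "EleutherAI/gpt-j-6b" checkpoint on the Hugging Face Hub and enough memory to hold the 6B-parameter model (roughly 24 GB in float32):

```python
# Minimal sketch: text generation with GPT-J-6B via Hugging Face transformers.
# Assumes the "EleutherAI/gpt-j-6b" checkpoint and ~24 GB of RAM in float32.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6b")

prompt = "EleutherAI is a grassroots collective of researchers working on"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample up to 50 new tokens continuing the prompt.
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```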

EleutherAI is also the creator of GPT-Neo.

GPT-J-6B performs nearly on par with the 6.7B-parameter GPT-3 (Curie) on a variety of zero-shot downstream tasks.

Zero-Shot Evaluations

Models roughly sorted by performance, or by FLOPs if not available.

| Model | Training FLOPs | LAMBADA PPL ↓ | LAMBADA Acc ↑ | Winogrande ↑ | Hellaswag ↑ | PIQA ↑ | Dataset Size (GB) |
|---|---|---|---|---|---|---|---|
| Chance | 0 | ~a lot | ~0% | 50% | 25% | 25% | 0 |
| GPT-3-Ada‡ | ----- | 9.95 | 51.6% | 52.9% | 43.4% | 70.5% | ----- |
| GPT-2-1.5B | ----- | 10.63 | 51.21% | 59.4% | 50.9% | 70.8% | 40 |
| GPTNeo-1.3B‡ | 3.0e21 | 7.50 | 57.2% | 55.0% | 48.9% | 71.1% | 825 |
| Megatron-2.5B* | 2.4e21 | ----- | 61.7% | ----- | ----- | ----- | 174 |
| GPTNeo-2.7B‡ | 6.8e21 | 5.63 | 62.2% | 56.5% | 55.8% | 73.0% | 825 |
| GPT-3-1.3B*‡ | 2.4e21 | 5.44 | 63.6% | 58.7% | 54.7% | 75.1% | ~800 |
| GPT-3-Babbage‡ | ----- | 5.58 | 62.4% | 59.0% | 54.5% | 75.5% | ----- |
| Megatron-8.3B* | 7.8e21 | ----- | 66.5% | ----- | ----- | ----- | 174 |
| GPT-3-2.7B*‡ | 4.8e21 | 4.60 | 67.1% | 62.3% | 62.8% | 75.6% | ~800 |
| Megatron-11B† | 1.0e22 | ----- | ----- | ----- | ----- | ----- | 161 |
| **GPT-J-6B** | 1.5e22 | 3.99 | 69.7% | 65.3% | 66.1% | 76.5% | 825 |
| GPT-3-6.7B*‡ | 1.2e22 | 4.00 | 70.3% | 64.5% | 67.4% | 78.0% | ~800 |
| GPT-3-Curie‡ | ----- | 4.00 | 69.3% | 65.6% | 68.5% | 77.9% | ----- |
| GPT-3-13B*‡ | 2.3e22 | 3.56 | 72.5% | 67.9% | 70.9% | 78.5% | ~800 |
| GPT-3-175B*‡ | 3.1e23 | 3.00 | 76.2% | 70.2% | 78.9% | 81.0% | ~800 |
| GPT-3-Davinci‡ | ----- | 3.0 | 75% | 72% | 78% | 80% | ----- |

* represents evaluation numbers reported by their respective authors; all other numbers were obtained by running the lm-evaluation-harness, either with the released weights or with API access. Due to subtle implementation differences, as well as differences in zero-shot task framing, these might not be directly comparable. See this blog post for more details.

† The Megatron-11B model provides no comparable metrics, and several implementations using the released weights do not reproduce its generation quality and evaluations (see 1, 2, 3); thus, evaluation was not attempted.

‡ These models were trained on data that contains possible test-set contamination. The OpenAI GPT-3 models failed to deduplicate training data against certain test sets, while the GPT-Neo models, as well as this one, were trained on the Pile, which has not been deduplicated against any test sets.
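For reference, zero-shot numbers like those above can be reproduced approximately with EleutherAI's lm-evaluation-harness. The snippet below is a hedged sketch using the harness's Python API; the task names and the `simple_evaluate` signature reflect recent harness versions (v0.4+) and are assumptions, not the original evaluation setup behind the table:

```python
# Hedged sketch: zero-shot evaluation of GPT-J-6B with lm-evaluation-harness.
# Task names and the simple_evaluate() signature follow recent harness
# releases and may differ from the runs that produced the table above.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                   # Hugging Face model backend
    model_args="pretrained=EleutherAI/gpt-j-6b",  # checkpoint to score
    tasks=["lambada_openai", "winogrande", "hellaswag", "piqa"],
    num_fewshot=0,                                # zero-shot, matching the table
)

for task, metrics in results["results"].items():
    print(task, metrics)
```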

Source: https://github.com/kingoflolz/mesh-transformer-jax/blob/master/README.md

"Unlock Your Social Media Success: Discover More Powerful AI Tools Today!"

unlock now