What is GPT-J?
GPT-J is an NLP (Natural Language Processing) model trained with 6 billion parameters. In other words, GPT-J is an artificial intelligence specialized in understanding human language, so that it can both understand and generate text.
Why is GPT-J called GPT-J?
GPT-J is so called because “GPT” stands for “Generative Pre-trained Transformer”, a type of artificial intelligence architecture (in the next section we will see why that matters). As for the “J”, I have never found an official explanation, but after much reading I assume it comes from “JAX”, an open-source framework for building artificial intelligence, created by Google and freely available under the Apache license.
Who created GPT-J and why?
It was created by Ben Wang and Aran Komatsuzaki using Google’s JAX framework, and trained on “The Pile” from EleutherAI, a collective dedicated to the development of free and open-source artificial intelligence. The Pile is a dataset of more than 825GB of text.
The objective of GPT-J is to compete with the company that led the AI sector in text generation at the time (and still does): OpenAI and its GPT models, the most famous being GPT-3. Although OpenAI’s models are open to the public through paid API calls, they are neither open source nor free.
What is GPT-J used for?
GPT-J is used to generate text, and has a multitude of applications, from generating content for apps or websites to classifying text for use inside applications. For example, it can be used to determine whether a text contains insults, or to work out what a customer wants from a message.
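One way to get a text-generation model like GPT-J to classify text is few-shot prompting: show it a handful of labeled examples and let it complete the label for a new message. Below is a minimal sketch of building such a prompt; the helper name, labels, and example messages are all illustrative, not from any real dataset.

```python
# Hypothetical helper: builds a few-shot prompt so that a text-generation
# model such as GPT-J can act as a classifier by completing the last line.

def build_classification_prompt(examples, text):
    """Turn labeled examples plus a new text into a completion prompt.

    examples: list of (message, label) pairs shown to the model.
    text: the message we want the model to classify.
    """
    lines = []
    for sample, label in examples:
        lines.append(f"Message: {sample}\nLabel: {label}")
    # The model is expected to complete the final "Label:" line.
    lines.append(f"Message: {text}\nLabel:")
    return "\n\n".join(lines)


if __name__ == "__main__":
    examples = [
        ("You are an idiot", "insult"),
        ("Could you send me the invoice?", "request"),
    ]
    print(build_classification_prompt(examples, "I need help with my order"))
```

The prompt is then sent to the model, and the first word it generates after the final “Label:” is taken as the predicted class.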
It can also be used for research, teaching and for building new applications on top of it, since it is open source and free.
The best of GPT-J
It is said that the strength of GPT-J when it comes to generating text is not creativity but logic; in particular, it is especially good at programming.
Is GPT-J state of the art?
GPT-J was state-of-the-art technology at the time. It was never the most powerful language model overall, but it was the most powerful open-source language model, which is no small thing: it meant it was the most powerful model openly available to programmers. It does not surpass or even come close to the full GPT-3, although it does outperform the smaller versions of GPT-3 trained with a similar number of parameters.
That is now behind us: through EleutherAI, a more powerful text-generation model has since been released, GPT-NeoX. And more recently, BLOOM.
How and where to use GPT-J
Nowadays it can be used on a multitude of websites that serve the API for free, or for a fee if we make heavy use of it. We can also download and run the model ourselves, although this requires large computing resources, as is currently the case with all AI of this type.
- HuggingFace: GPT-J can be used both visually and through an API for free; for massive use there are paid plans.
- GooseAI: an API service company for artificial intelligence; GPT-J is currently one of the models it serves.
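As a concrete illustration of the API route, here is a minimal sketch of preparing a call to GPT-J through the HuggingFace Inference API using only the standard library. The endpoint and payload shape follow HuggingFace’s public Inference API conventions; `HF_TOKEN` is a placeholder for your own API token, and the actual network call is left out of the example.

```python
# Sketch: building an HTTP request for GPT-J on the HuggingFace
# Inference API. Assumes the public text-generation payload format;
# replace HF_TOKEN with a real token before sending.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/EleutherAI/gpt-j-6b"


def build_request(prompt, token, max_new_tokens=50):
    """Build (but do not send) a POST request for text generation."""
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


if __name__ == "__main__":
    req = build_request("Once upon a time", "HF_TOKEN")
    print(req.full_url)  # sending it would need a valid token
```

To actually send the request, `urllib.request.urlopen(req)` returns a JSON response containing the generated text; the same payload works with any client library.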