GPT-2

Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence created by OpenAI. GPT-2 translates text, answers questions, summarizes passages, and generates text output on a level that, while sometimes indistinguishable from that of humans, can become repetitive or nonsensical when generating long passages.

It was not specifically trained to perform any of these tasks; its ability to do so is an extension of its general capacity to accurately predict the next item in an arbitrary sequence.
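
To make next-item prediction concrete, here is a minimal sketch that asks a pre-trained GPT-2 for its single most likely next token. It assumes the Hugging Face transformers and torch Python packages, which are not part of the original OpenAI release; "gpt2" names the smallest (124M-parameter) published checkpoint.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # Load the smallest pre-trained GPT-2 checkpoint (an assumption:
    # "gpt2" is the Hugging Face hub name, not OpenAI's own release).
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Score every vocabulary entry as a candidate continuation of the prompt.
    inputs = tokenizer("The capital of France is", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (batch, sequence, vocabulary)

    # The highest-scoring entry at the last position is the predicted next token.
    next_token_id = logits[0, -1].argmax()
    print(tokenizer.decode(next_token_id))

Generating a long passage is this step applied repeatedly: each predicted token is appended to the input and the model is queried again, one token at a time.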

Project Background

  • Category: Artificial Intelligence
  • Author: OpenAI
  • Initial Release: 2019
  • Type: Transformer Language Model
  • License: Modified MIT License
  • Contains: Pre-trained models in four sizes (124M, 355M, 774M, and 1.5B parameters)
  • GitHub: openai/gpt-2 (about 15k stars, 14 contributors)
  • Runs On: Windows
  • Twitter: /gpt2_bot

Features

  • Powerful Natural Language Processing (NLP)
  • Comes with pre-trained pipelines (illustrated in the sketch after this list)
  • Incorporates neural network models for tagging, parsing, text classification, entity recognition, and more
  • Includes language-specific rules for tokenization
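
As an illustration of the pre-trained pipelines listed above, the sketch below generates text with GPT-2 through the Hugging Face pipeline API. This is an assumption about tooling: the original OpenAI repository ships its own TensorFlow sampling scripts rather than this interface.

    from transformers import pipeline

    # Build a ready-to-use text-generation pipeline backed by pre-trained GPT-2.
    generator = pipeline("text-generation", model="gpt2")

    # Sample a short continuation of the prompt.
    result = generator("GPT-2 is a language model that", max_new_tokens=25)
    print(result[0]["generated_text"])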