BERT
BERT (Bidirectional Encoder Representations from Transformers) is a neural network-based technique for natural language processing pre-training. In simpler words, it helps search engines contextualize content so it can be understood and matched with queries more accurately. The release advances the natural language processing toolkit by providing both code and pre-trained models.
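To make the "pre-trained model" part concrete, here is a minimal sketch of pulling contextual embeddings from a BERT checkpoint. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is named on this page; the original release ships its own TensorFlow code.

```python
# A minimal sketch (assumed: Hugging Face `transformers` and the
# bert-base-uncased checkpoint; the original release is TensorFlow-based).
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Every token gets a vector that depends on its surrounding words,
# which is what lets a search engine match content to queries by meaning.
encoding = tokenizer("She sat by the river bank.", return_tensors="pt")
outputs = model(**encoding)

# One contextual vector per token: (1, num_tokens, 768) for BERT (Base).
print(outputs.last_hidden_state.shape)
```

Because the vector for "bank" here differs from the one it would receive in "bank deposit", downstream systems can disambiguate by context rather than by keywords alone.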
Project Background
- Project: Natural Language Processing Transformer
- Author: Jacob Devlin et al. (Google)
- Initial Release: 2018
- Type: Machine Learning, Transformer Language Model
- License: Apache License 2.0
- Contains: BERT (Base) and BERT (Large)
- GitHub: /bert (29.5k stars, 28 contributors)
- Twitter: None
Applications
- Intended for environments with restricted computational resources.
- Effective in the context of knowledge distillation.
- Natural language processing tasks.
- Next sentence prediction (see the sketch after this list).
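As an illustration of the last item, here is a hedged sketch of next sentence prediction, again assuming the Hugging Face transformers API rather than the original TensorFlow code: the model scores whether a second sentence plausibly follows the first.

```python
# A sketch of next sentence prediction with a pre-trained BERT head,
# assuming the Hugging Face `transformers` API (not part of this page).
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

first = "The storm knocked out power across the city."
second = "Crews worked overnight to restore electricity."

encoding = tokenizer(first, second, return_tensors="pt")
logits = model(**encoding).logits

# Index 0 = "second sentence follows the first", index 1 = "random sentence".
probs = torch.softmax(logits, dim=-1)
print(f"P(is next): {probs[0, 0].item():.3f}")
```

During pre-training, BERT learns this objective on sentence pairs, half of which are true continuations and half random, which is one reason it captures relationships that span sentence boundaries.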