OPT-350M is a pre-trained transformer language model from Meta AI, released for text generation and open research. It belongs to the OPT series, which spans models from 125M to 175B parameters. The model was trained with a causal language modeling objective on a large corpus of predominantly English text, with some non-English data included. It is intended for prompting on downstream tasks and for text generation, and can be fine-tuned for specific applications. The model is available on Hugging Face and is compatible with frameworks such as PyTorch and TensorFlow. Because it was trained on largely unfiltered internet data, it can reproduce biases present in that data and has known limitations in terms of safety and output diversity. Even so, it is a valuable resource for researchers studying large language models and their potential applications.
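To make the causal language modeling objective concrete, here is a minimal sketch in plain Python. It does not use OPT itself; the tiny bigram probability table (`TABLE`, `bigram_probs`) is a hypothetical stand-in for a model, used only to show how the training loss is computed: each token is predicted from its prefix, and the loss is the average negative log-likelihood of the true next tokens.

```python
import math

def causal_lm_loss(tokens, next_token_probs):
    """Average negative log-likelihood of each token given its prefix.

    next_token_probs(prefix) returns a dict mapping candidate next
    tokens to their predicted probabilities.
    """
    total = 0.0
    for i in range(1, len(tokens)):
        prefix = tokens[:i]
        # Probability the "model" assigns to the actual next token;
        # a tiny floor avoids log(0) for unseen tokens.
        p = next_token_probs(prefix).get(tokens[i], 1e-12)
        total += -math.log(p)
    return total / (len(tokens) - 1)

# Hypothetical bigram "model": next-token probabilities depend only on
# the last token of the prefix. Real models like OPT condition on the
# entire prefix via self-attention.
TABLE = {
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"ran": 1.0},
}

def bigram_probs(prefix):
    return TABLE.get(prefix[-1], {})

loss = causal_lm_loss(["the", "cat", "sat"], bigram_probs)
# (-log 0.5 + -log 1.0) / 2 ≈ 0.3466
```

During training, this loss is minimized over the whole corpus; at generation time the same next-token distribution is sampled from instead, which is what "text generation" above refers to.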