Instructions for using bigscience/T0pp with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use bigscience/T0pp with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")
```

- Notebooks
- Google Colab
- Kaggle
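The loading snippet above can be extended into a full inference call. The sketch below assumes `transformers` and PyTorch are installed; note that the T0pp checkpoint is roughly 44.5 GB, so actually running it requires substantial memory, which is why the heavy steps are kept behind a main guard here. The example prompt is the sentiment question from the T0 model card.

```python
# Minimal inference sketch for bigscience/T0pp (assumes `transformers` and
# PyTorch are installed; the checkpoint is ~44.5 GB, so loading it needs
# substantial RAM or VRAM).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

PROMPT = (
    "Is this review positive or negative? "
    "Review: this is the best cast iron skillet you will ever buy"
)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
    model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")

    # T0pp is an encoder-decoder (seq2seq) model: tokenize the prompt,
    # generate from the decoder, then decode the output ids back to text.
    inputs = tokenizer(PROMPT, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because T0pp was trained on natural-language prompts, queries are phrased as plain questions or instructions rather than task-specific formats.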
- Xet hash: 228ca232705213364d102e23ac286307d6e2184c5ee47c4b99947bce81eef6fc
- Size of remote file: 44.5 GB
- SHA256: 60d41124474b722422688c42c1f38bc9fea9c275fe7e2d9f816b80dc29c6a8fd
Xet stores large files efficiently inside Git by splitting them into unique content-defined chunks, which accelerates uploads and downloads.