Instructions for using google/mobilebert-uncased with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use google/mobilebert-uncased with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = AutoModelForPreTraining.from_pretrained("google/mobilebert-uncased")
```
- Notebooks
- Google Colab
- Kaggle
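Since the model card's discussion mentions the fill-mask task, a common way to try the checkpoint is through the Transformers `pipeline` API. This is a minimal sketch (it downloads the checkpoint on first run); the example sentence is illustrative, not from the original page:

```python
# Minimal fill-mask usage of google/mobilebert-uncased via the pipeline API.
from transformers import pipeline

# Downloads the checkpoint from the Hub on first use.
fill_mask = pipeline("fill-mask", model="google/mobilebert-uncased")

# Each prediction is a dict with the filled token and its score.
predictions = fill_mask("Paris is the [MASK] of France.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidate tokens for the `[MASK]` position, ranked by probability.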
Discussion #5: Add Core ML conversion (opened by hikaruaohara)
Core ML conversion, task=fill-mask, precision=float32