Instructions to use ammaradel/PSU-LLaMA-Inference with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Adapters

How to use ammaradel/PSU-LLaMA-Inference with Adapters:

```python
from adapters import AutoAdapterModel

# "undefined" is the page's unfilled placeholder for the base model checkpoint;
# replace it with the base model this adapter was trained on
model = AutoAdapterModel.from_pretrained("undefined")
model.load_adapter("ammaradel/PSU-LLaMA-Inference", set_active=True)
```

- Notebooks
- Google Colab
- Kaggle
- Xet hash: 6e99c08c94594c12feeb5ffcfabe2c1ba6b9d03c7427baef29cb6c67658531e7
- Size of remote file: 16.8 MB
- SHA256: 9264b79d10f4c5f012fde9dc1962f4a80504948f8547f8bf36fdfe57cda6b62b
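After downloading the file, you can check it against the SHA256 listed above using Python's standard `hashlib` module. The filename in the comment is an assumption for illustration; substitute the actual path of your download.

```python
import hashlib

def sha256_of(path: str, block_size: int = 1 << 20) -> str:
    """Stream the file in 1 MB blocks so large files never load fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            h.update(block)
    return h.hexdigest()

# Compare against the SHA256 listed above, e.g. (filename is an assumption):
# sha256_of("adapter_model.bin") == "9264b79d10f4c5f012fde9dc1962f4a80504948f8547f8bf36fdfe57cda6b62b"
```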
Xet efficiently stores large files inside Git, intelligently splitting files into unique chunks and accelerating uploads and downloads.
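The idea behind chunk-based storage can be illustrated with a toy content-defined chunker: a rolling hash decides where chunk boundaries fall, so identical regions of data always produce identical chunks, and each distinct chunk only needs to be stored once. This is a minimal sketch of the general technique, not Xet's actual algorithm or parameters.

```python
def chunk(data: bytes, mask: int = 0x3F, min_size: int = 16) -> list[bytes]:
    """Cut a chunk wherever a simple rolling hash hits a boundary value.

    Toy sketch of content-defined chunking; NOT Xet's real algorithm.
    """
    chunks: list[bytes] = []
    start, h = 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) ^ b) & 0xFFFFFFFF  # toy rolling hash over the bytes
        # boundary: low bits of the hash are zero and the chunk is big enough
        if (h & mask) == 0 and (i + 1 - start) >= min_size:
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])  # trailing partial chunk
    return chunks

data = b"hello world " * 100
chunks = chunk(data)
assert b"".join(chunks) == data  # chunking is lossless
unique = set(chunks)             # distinct chunks are stored only once
```

Because boundaries depend on content rather than fixed offsets, inserting bytes near the start of a file only changes the chunks around the edit, leaving the rest deduplicated.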