How to use zilongpa/BUCSSA_assistant with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zilongpa/BUCSSA_assistant", dtype="auto")
```
Model Card for zilongpa/BUCSSA_assistant
Visit here for the model repository.
- This is only a proof-of-concept model!
- A bilingual (Chinese/English) assistant model fine-tuned by the BUCSSA Technology Department to answer frequently asked questions from incoming Terriers.
- Special thanks to all members of the Technology Department and eBoard of BUCSSA for providing the QA pairs used during fine-tuning with frozen base parameters.
Technical Specifications
Model Architecture and Objective
- Architecture: Qwen2
- Objective: Text generation
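Since the base architecture is Qwen2, the fine-tune most likely expects Qwen's ChatML-style chat format. The sketch below is an illustration only: the card does not document this model's template, and in practice the tokenizer's `apply_chat_template` method should be used rather than building the string by hand.

```python
# Illustration: Qwen2-family models commonly use the ChatML chat format.
# Whether this fine-tune expects exactly this template is an assumption;
# prefer tokenizer.apply_chat_template(...) in real code.
def chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts as a ChatML prompt string."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages]
    parts.append("<|im_start|>assistant\n")  # cue the model to respond
    return "".join(parts)

prompt = chatml_prompt([{"role": "user", "content": "什么是BUCSSA？"}])
print(prompt)
```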
Compute Infrastructure
Hardware
- NVIDIA RTX 4090 (24 GB VRAM)
Software
- AutoGPTQ
- vLLM
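Given the software listed above, one plausible deployment path is vLLM's OpenAI-compatible server. This is a sketch, not a documented recipe: the `--quantization gptq` flag assumes the published weights are GPTQ-quantized, which the AutoGPTQ mention suggests but the card does not confirm.

```shell
# Sketch: serve the model with vLLM's OpenAI-compatible server (assumed setup).
# Drop --quantization gptq if the repository hosts full-precision weights.
vllm serve zilongpa/BUCSSA_assistant --dtype auto --quantization gptq --port 8000
```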