Welcome to the final part of our series on training and publishing your own Large Language Model with Hugging Face!
If you're landing here directly, I recommend first checking out the earlier posts:
- Part 1: Getting Started – Installing Hugging Face, running a model, and creating a dataset.
- Part 2: Fine-Tuning Your Model – Training your own custom model step by step.
In this post, we'll cover the most exciting part: publishing your model to the Hugging Face Hub and sharing it with others.
What You'll Learn in This Post
- How to log in to Hugging Face Hub
- How to push your trained model to your profile
- How to add model cards (documentation)
- How to create a simple demo using Hugging Face Spaces
- How to share your model with others
By the end, your model will be online, accessible to others, and even usable in apps!
Step 1: Log In to Hugging Face Hub
First, install the CLI if you haven't already:

```bash
pip install huggingface_hub
```

Then log in:

```bash
huggingface-cli login
```
This will ask for your Hugging Face access token. You can get it from your Hugging Face settings.
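If you'd rather stay in Python than use the CLI, `huggingface_hub` also exposes a `login()` helper. A minimal sketch, assuming you have exported your token in the `HF_TOKEN` environment variable beforehand:

```python
import os

from huggingface_hub import login

# Read the access token from the environment instead of typing it
# interactively; skip the login step if no token is set.
token = os.environ.get("HF_TOKEN")
if token:
    login(token=token)
```

Keeping the token in an environment variable avoids hard-coding a secret in your scripts.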
Step 2: Push Your Model to the Hub
From Part 2, you already have your model saved locally (in the `my_custom_model` directory). Let's upload it:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the model and tokenizer you saved in Part 2
model = AutoModelForCausalLM.from_pretrained("my_custom_model")
tokenizer = AutoTokenizer.from_pretrained("my_custom_model")

# Upload both to your profile on the Hub
model_name = "my-username/my-custom-model"
model.push_to_hub(model_name)
tokenizer.push_to_hub(model_name)
```
Now your model is live on your Hugging Face profile!
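As an alternative to `push_to_hub`, you can upload the saved directory in one call with `HfApi`. A minimal sketch, assuming the local folder from Part 2 is named `my_custom_model`:

```python
from huggingface_hub import HfApi

api = HfApi()
# Create the repo if it does not exist yet (no-op otherwise)
api.create_repo("my-username/my-custom-model", exist_ok=True)
# Upload every file in the local model directory to the Hub repo
api.upload_folder(
    folder_path="my_custom_model",
    repo_id="my-username/my-custom-model",
    repo_type="model",
)
```

This is handy when the model is no longer in memory and you only have the files on disk.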
Step 3: Add a Model Card
When someone visits your model page, they'll see a Model Card. This is like documentation for your model. You can edit it directly in the Hugging Face web UI.
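If you'd rather keep the card in code than edit it in the browser, `huggingface_hub` can push one from Python via the `ModelCard` class. A minimal sketch (the card text here is only a placeholder):

```python
from huggingface_hub import ModelCard

# Build a card from a Markdown string and publish it as the
# repo's README.md on the Hub
card = ModelCard("""
# My Custom Model

This is a fine-tuned GPT-2 model trained on my own dataset.
""")
card.push_to_hub("my-username/my-custom-model")
```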
Things to include:
- What your model does
- What dataset you trained on
- Limitations or biases
- Example usage code
A simple starter model card:
# My Custom Model
This is a fine-tuned GPT-2 model trained on my own dataset.
## How to Use
```python
from transformers import pipeline

generator = pipeline("text-generation", model="my-username/my-custom-model")
print(generator("Hello world", max_length=50))
```
Step 4: Create a Demo with Hugging Face Spaces
Want others to try your model in their browser? Hugging Face Spaces lets you build small web apps.
- Go to Hugging Face Spaces
- Click New Space
- Choose Gradio or Streamlit as your app framework
An example `app.py` using Gradio:
```python
import gradio as gr
from transformers import pipeline

generator = pipeline("text-generation", model="my-username/my-custom-model")

def generate_text(prompt):
    return generator(prompt, max_length=100, num_return_sequences=1)[0]["generated_text"]

demo = gr.Interface(fn=generate_text, inputs="text", outputs="text")
demo.launch()
```
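A detail that often trips people up: a Gradio Space installs extra Python packages from a `requirements.txt` file at the root of the Space repo (Gradio itself is preinstalled). For the `app.py` above, that file would look something like:

```text
transformers
torch
```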
Now your model has an interactive demo anyone can use!
Step 5: Share Your Model
Congratulations – your model is now live! You can share the Hugging Face link with:
- Your teammates
- Your research community
- Or embed it into your own apps
Wrap-Up
In this post, you:
- Logged in to Hugging Face Hub
- Published your model online
- Wrote a model card
- Created an interactive demo with Spaces
That's it! You now know how to train, fine-tune, and publish your own LLM using Hugging Face.
This 3-part series showed you the full journey from zero to sharing your AI model with the world.