⚙️ Install Tools: transformers, datasets, huggingface_hub, accelerate (Optional: gradio)

Now that you know what your assistant will do, it’s time to set up your open-source AI toolkit. Hugging Face provides easy-to-use Python libraries for every step: loading models, working with data, fine-tuning, and sharing your work.


Essential Python Packages

Here’s what each tool does (a short usage sketch follows the list):

  • transformers: Load pre-trained large language models (LLMs) and run text generation or Q&A.

  • datasets: Easily load, process, and share training or test data.

  • huggingface_hub: Upload your models or datasets to the 🤗 Hub and download community models.

  • accelerate: Simplify training across CPUs, GPUs, or multiple devices.

  • gradio (optional): Build a simple web interface for chatting with your assistant.
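
To get a feel for how these libraries fit together, here is a minimal sketch you can run once the packages are installed (Step 2 below). The model (distilgpt2) and dataset (imdb) names are only illustrative choices, not requirements of this guide:

```python
from transformers import pipeline   # high-level API for running a pre-trained model
from datasets import load_dataset   # one-line access to datasets on the Hub

# distilgpt2 and imdb are illustrative; any compatible model/dataset works here.
generator = pipeline("text-generation", model="distilgpt2")
result = generator("My AI assistant can", max_new_tokens=20)
print(result[0]["generated_text"])

reviews = load_dataset("imdb", split="train[:100]")  # first 100 training examples
print(reviews[0]["text"][:200])                      # peek at one example
```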


🐍 Step 1: Create a Virtual Environment

Using a virtual environment keeps your project’s dependencies isolated from the rest of your system.

```bash
# Create and activate a virtual environment (Linux/Mac)
python -m venv venv
source venv/bin/activate
```

```bash
# On Windows
python -m venv venv
venv\Scripts\activate
```

🗂️ Step 2: Install the Packages

Install all necessary packages in one command:
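
```bash
pip install transformers datasets huggingface_hub accelerate
```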

If you want an interactive chat UI:
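
```bash
pip install gradio
```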


🔐 Step 3: Log in to the Hugging Face Hub

Sign up for a free Hugging Face account at https://huggingface.co if you don’t have one.
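
Then log in from your terminal using the CLI that ships with huggingface_hub:

```bash
huggingface-cli login
```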

This securely stores your API token so you can download or upload models.


✅ Quick Check

Verify that everything is installed:
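
For example, run a short Python script that imports each library and prints a confirmation (the filename quick_check.py is just a suggestion):

```python
# quick_check.py - confirm that the core libraries import and report their versions
import transformers
import datasets
import huggingface_hub
import accelerate

print("transformers:", transformers.__version__)
print("datasets:", datasets.__version__)
print("huggingface_hub:", huggingface_hub.__version__)
print("accelerate:", accelerate.__version__)
print("All set!")
```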

If you see All set!, you’re ready to move on.


➡️ Next: You’ll pick a foundation model and run your first prompt in Chapter 2!

