
Guide: How to Run Stable Beluga on a Computer with 32GB RAM

Stable Beluga is a modern AI-powered image generation tool. In this guide, we show you, step by step, how to run Stable Beluga on a computer with 32GB RAM.

Prerequisites

Before starting the installation, make sure your system meets the following requirements:

  • At least 32GB of RAM
  • An NVIDIA graphics card (recommended; you can run on the CPU, but generation is much slower)
  • Python 3.8 or later
  • Git
  • Enough free disk space for the model file and the generated images

Installing Dependencies

On Windows

  1. Install graphics card drivers:

    • Download the latest drivers for your graphics card from the official NVIDIA website.
    • Follow the manufacturer's instructions to complete the installation.
  2. Install Python:

    • Download and install Python 3.8 or later from the official python.org website.
    • Make sure to check the "Add Python to PATH" option during installation.
  3. Install Git:

    • Download and install Git from the official git-scm.com website.

On Linux (Ubuntu 20.04 LTS)

  1. Update the system:

    sudo apt update && sudo apt upgrade -y
    
  2. Install NVIDIA drivers:

    sudo apt install nvidia-driver-470
    sudo reboot
    
  3. Install Python and Git:

    sudo apt install python3 python3-pip git
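
After installation, you can confirm that the driver and the tools are available (on Windows, run the same checks in a new command prompt, using python instead of python3):

    nvidia-smi          # should list your GPU and the driver version
    python3 --version   # should report Python 3.8 or later
    git --version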
    

Cloning the Stable Beluga Repository

  1. Open the terminal (or command prompt in Windows).
  2. Run the following commands:
    git clone https://github.com/stability-ai/stable-beluga.git
    cd stable-beluga
    

Installing Python Dependencies

  1. Create and activate a virtual environment:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
    
  2. Install the required packages:

    pip install -r requirements.txt
    

Configuring Stable Beluga

  1. Copy the configuration file:

    cp config.example.yaml config.yaml
    
  2. Edit the config.yaml file in a text editor, e.g., VS Code:

    model:
      name: "stable-beluga"
      path: "models/stable-beluga.safetensors"
    device: "cuda"  # Use "cpu" if you don't have a graphics card
    precision: "fp16"  # You can change to "fp32" if you have enough memory
    

Downloading the Model

  1. Download the Stable Beluga model from the official repository:
    wget https://example.com/models/stable-beluga.safetensors -P models/
    
    (Replace the URL with the current model's address. On Windows, you can simply download the file in a browser and save it to the models/ folder.)
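
Once the download finishes, confirm that the file is really in the models/ directory and that its path matches the one set in config.yaml:

    ls -lh models/stable-beluga.safetensors   # on Windows: dir models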

Running Stable Beluga

  1. Run the main script:

    python main.py
    
  2. If you encounter memory issues, try reducing the batch size in the config.yaml file:

    batch_size: 1  # Default is 4, reduce if needed
    

Usage Examples

Generating an Image

from stable_beluga import StableBeluga

# Load the model with the settings from config.yaml
model = StableBeluga.from_config("config.yaml")

# Generate a single image from a text prompt and save it to disk
prompt = "Astronaut riding a horse on Mars"
image = model.generate(prompt)
image.save("output.png")

Generating a Series of Images

prompts = [
    "Sunset over the ocean",
    "Forest in autumn",
    "City skyline at night"
]

# Reuses the model instance created in the previous example;
# one PNG is written per prompt, with spaces replaced by underscores
for prompt in prompts:
    image = model.generate(prompt)
    image.save(f"{prompt.replace(' ', '_')}.png")

Memory Optimization

If you encounter memory issues, try the following solutions:

  1. Reduce the batch size:

    batch_size: 1
    
  2. Make sure you are using half precision (fp16), which needs roughly half the memory of fp32:

    precision: "fp16"
    
  3. Disable gradients:

    enable_gradients: false
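
To see how much GPU memory is actually in use while you experiment with these settings, PyTorch exposes simple counters. The sketch below uses standard torch.cuda helpers and assumes the same environment as the usage examples:

    import torch

    if torch.cuda.is_available():
        free, total = torch.cuda.mem_get_info()    # bytes free / total on the current device
        allocated = torch.cuda.memory_allocated()  # bytes currently held by tensors
        print(f"Allocated: {allocated / 1e9:.2f} GB, "
              f"free: {free / 1e9:.2f} GB of {total / 1e9:.2f} GB")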
    

Troubleshooting

GPU Memory Error

If you receive a GPU memory error, try:

  1. Reducing the output resolution:

    output_size: [512, 512]  # Reduce to [256, 256] if needed
    
  2. Using a smaller model:

    model:
      name: "stable-beluga-small"
    

Dependency Error

If you encounter dependency issues, try:

  1. Updating pip:

    pip install --upgrade pip
    
  2. Installing packages manually:

    pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113
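
After reinstalling, a one-line check confirms that PyTorch imports cleanly and can see your GPU:

    python -c "import torch; print(torch.__version__, torch.cuda.is_available())"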
    

Summary

In this guide, we have shown you how to run Stable Beluga on a computer with 32GB RAM. Remember that memory optimization may be necessary depending on your hardware. With these steps, you should be able to generate high-quality images using Stable Beluga.

If you have additional questions or encounter issues, visit the official Stable Beluga forum or GitHub repository.
