Guide: How to Run Stable Beluga on a Computer with 32GB RAM
Stable Beluga is a modern AI-powered tool for image generation. In this guide, we will show you how to run Stable Beluga on a computer with 32GB RAM, step by step.
Prerequisites
Before starting the installation, make sure your system meets the following requirements:
- Operating System: Windows 10/11 or Linux (Ubuntu 20.04 LTS recommended)
- Processor: Intel i7 or newer / AMD Ryzen 7 or newer
- RAM: 32GB
- Graphics Card: NVIDIA RTX 2060 or newer (with at least 8GB GPU memory)
- Disk Space: 50GB of free space
Installing Dependencies
On Windows
- Install graphics card drivers:
  - Download the latest drivers for your graphics card from the official NVIDIA website.
  - Follow the manufacturer's instructions to complete the installation.
- Install Python:
  - Download and install Python 3.8 or later from the official python.org website.
  - Make sure to check the "Add Python to PATH" option during installation.
- Install Git:
  - Download and install Git from the official git-scm.com website.
On Linux (Ubuntu 20.04 LTS)
- Update the system:

  ```bash
  sudo apt update && sudo apt upgrade -y
  ```

- Install NVIDIA drivers:

  ```bash
  sudo apt install nvidia-driver-470
  sudo reboot
  ```

- Install Python and Git:

  ```bash
  sudo apt install python3 python3-pip git
  ```
Cloning the Stable Beluga Repository
- Open the terminal (or command prompt in Windows).
- Run the following commands:

  ```bash
  git clone https://github.com/stability-ai/stable-beluga.git
  cd stable-beluga
  ```
Installing Python Dependencies
- Create and activate a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```
Configuring Stable Beluga
- Copy the example configuration file:

  ```bash
  cp config.example.yaml config.yaml
  ```

- Edit the `config.yaml` file in a text editor, e.g., VS Code:

  ```yaml
  model:
    name: "stable-beluga"
    path: "models/stable-beluga.safetensors"
    device: "cuda"       # Use "cpu" if you don't have a graphics card
    precision: "fp16"    # You can change to "fp32" if you have enough memory
  ```
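Typos in `config.yaml` (an unrecognized device or precision value) are a common source of startup errors. The following sanity check is our own illustration, not part of Stable Beluga: the `validate_model_config` helper is hypothetical, and in a real script you would load the dict from `config.yaml` with PyYAML's `yaml.safe_load` rather than writing it inline as done here to keep the sketch dependency-free.

```python
# Hypothetical sanity check for the settings shown in config.yaml.
VALID_DEVICES = {"cuda", "cpu"}
VALID_PRECISIONS = {"fp16", "fp32"}

def validate_model_config(cfg: dict) -> list[str]:
    """Return a list of problems found in the 'model' section (empty list = OK)."""
    problems = []
    model = cfg.get("model", {})
    if not model.get("path", "").endswith(".safetensors"):
        problems.append("model.path should point to a .safetensors file")
    if model.get("device") not in VALID_DEVICES:
        problems.append("model.device must be 'cuda' or 'cpu'")
    if model.get("precision") not in VALID_PRECISIONS:
        problems.append("model.precision must be 'fp16' or 'fp32'")
    return problems

config = {
    "model": {
        "name": "stable-beluga",
        "path": "models/stable-beluga.safetensors",
        "device": "cuda",
        "precision": "fp16",
    }
}
print(validate_model_config(config))  # [] means the config looks consistent
```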
Downloading the Model
- Download the Stable Beluga model from the official repository (replace the URL with the current model's address):

  ```bash
  wget https://example.com/models/stable-beluga.safetensors -P models/
  ```
Running Stable Beluga
- Run the main script:

  ```bash
  python main.py
  ```

- If you encounter memory issues, try reducing the batch size in the `config.yaml` file:

  ```yaml
  batch_size: 1  # Default is 4, reduce if needed
  ```
Usage Examples
Generating an Image
```python
from stable_beluga import StableBeluga

model = StableBeluga.from_config("config.yaml")
prompt = "Astronaut riding a horse on Mars"
image = model.generate(prompt)
image.save("output.png")
```
Generating a Series of Images
```python
prompts = [
    "Sunset over the ocean",
    "Forest in autumn",
    "City skyline at night",
]

for prompt in prompts:
    image = model.generate(prompt)
    image.save(f"{prompt.replace(' ', '_')}.png")
```
Memory Optimization
If you encounter memory issues, try the following solutions:
- Reduce the batch size:

  ```yaml
  batch_size: 1
  ```

- Make sure you are using fp16 precision (it needs half the memory of fp32):

  ```yaml
  precision: "fp16"
  ```

- Disable gradients (they are not needed for inference):

  ```yaml
  enable_gradients: false
  ```
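To see why batch size and precision matter, here is a back-of-the-envelope calculation (our own illustration, not Stable Beluga code). It sizes only the raw output tensor; model weights and intermediate activations make real usage considerably higher, so the numbers are purely illustrative of how the two settings scale memory:

```python
def image_tensor_mb(batch_size: int, width: int, height: int,
                    channels: int = 3, bytes_per_value: float = 2.0) -> float:
    """Rough size of one image batch tensor in MB.
    bytes_per_value: 2.0 for fp16, 4.0 for fp32."""
    return batch_size * width * height * channels * bytes_per_value / (1024 ** 2)

# Effect of the two settings on a 512x512 output:
print(image_tensor_mb(4, 512, 512, bytes_per_value=4.0))  # fp32, batch 4 -> 12.0
print(image_tensor_mb(1, 512, 512, bytes_per_value=2.0))  # fp16, batch 1 -> 1.5
```

Going from fp32 with a batch of 4 to fp16 with a batch of 1 cuts this particular tensor by a factor of 8.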
Troubleshooting
GPU Memory Error
If you receive a GPU memory error, try:
- Reducing the output resolution:

  ```yaml
  output_size: [512, 512]  # Reduce to [256, 256] if needed
  ```

- Using a smaller model:

  ```yaml
  model:
    name: "stable-beluga-small"
  ```
Dependency Error
If you encounter dependency issues, try:
- Updating pip:

  ```bash
  pip install --upgrade pip
  ```

- Installing packages manually:

  ```bash
  pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113
  ```
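To confirm which packages are actually importable after installation, a small stdlib-only check can help. This helper is our own sketch, and the package list below is an assumption based on the install command above; `requirements.txt` remains the authoritative list:

```python
import importlib.util

def missing_packages(names: list[str]) -> list[str]:
    """Return the packages from `names` that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Assumed package list; check requirements.txt for the real one.
required = ["torch", "torchvision", "torchaudio"]
print(missing_packages(required))  # [] means everything is installed
```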
Summary
In this guide, we have shown you how to run Stable Beluga on a computer with 32GB RAM. Remember that memory optimization may be necessary depending on your hardware. With these steps, you should be able to generate high-quality images using Stable Beluga.
If you have additional questions or encounter issues, visit the official Stable Beluga forum or GitHub repository.