
DDPM (Denoising Diffusion Probabilistic Models)

Installation Guide

Follow these steps to set up and run the DDPM project:

  1. Clone the Repository

    • git clone https://github.com/ArintonAkos/diffusion-model/

  2. Install Dependencies

    • Install the required dependencies by running:
      pip install -r requirements.txt
      
    • For detailed PyTorch installation instructions, see PyTorch Get Started (https://pytorch.org/get-started/locally/).

  3. Running the Code

    • Execute main.py from the command line with the desired parameters. Options include:
      • -A, --all: Run setup, pre-train, and full-train steps.
      • -S, --setup: Run setup steps (visualize and data-analysis).
      • -P, --pre-train: Run pre-train steps (overfit-test and lr-find).
      • -F, --full-train: Run full-train steps (train and evaluate).
      • --model-type: Specify the model type (required). Choices: ['classification', 'ddpm'].
      • -N, --new-run: Create a new run ID.
      • --run-id: Specify a custom run ID.
      • --visualize: Visualize the dataset.
      • --data-analysis: Analyze the dataset.
      • --overfit-test: Perform an overfitting test.
      • --lr-find: Find an optimal learning rate.
      • --train: Train the model on the full dataset.
      • --evaluate: Evaluate the model on the validation set.
      • --federated: Enable federated training.
      • --model-path: Specify the path to a pre-trained model.
      • --dataset-name: Specify the dataset name.
      • --config-file: Specify the path to the configuration file. Default: 'configs/config.yml'.
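The option list above maps naturally onto an `argparse` parser. The sketch below mirrors the documented flags; it is an illustration of how `main.py`'s CLI might be wired, not the repository's actual code (help text, defaults, and internal names may differ).

```python
import argparse

def build_parser():
    """Sketch of the documented CLI; the real parser in main.py may differ."""
    p = argparse.ArgumentParser(description="DDPM training pipeline")
    p.add_argument("-A", "--all", action="store_true",
                   help="run setup, pre-train, and full-train steps")
    p.add_argument("-S", "--setup", action="store_true",
                   help="run setup steps (visualize and data-analysis)")
    p.add_argument("-P", "--pre-train", action="store_true",
                   help="run pre-train steps (overfit-test and lr-find)")
    p.add_argument("-F", "--full-train", action="store_true",
                   help="run full-train steps (train and evaluate)")
    p.add_argument("--model-type", required=True,
                   choices=["classification", "ddpm"],
                   help="model type to run")
    p.add_argument("-N", "--new-run", action="store_true",
                   help="create a new run ID")
    p.add_argument("--run-id", help="custom run ID")
    # Individual step / mode toggles, all simple boolean flags.
    for flag in ("--visualize", "--data-analysis", "--overfit-test",
                 "--lr-find", "--train", "--evaluate", "--federated"):
        p.add_argument(flag, action="store_true")
    p.add_argument("--model-path", help="path to a pre-trained model")
    p.add_argument("--dataset-name", help="dataset name")
    p.add_argument("--config-file", default="configs/config.yml",
                   help="path to the configuration file")
    return p
```

Note that because `--model-type` is required, every invocation must include it.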

Example command to train the model: python main.py --model-type ddpm --train --new-run --config-file configs/homogeneous.yml --federated

Configuration

Edit the configuration file configs/config.yml to set parameters such as batch size and learning rate to suit your requirements.
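A hypothetical sketch of such a file is shown below; only batch size and learning rate are mentioned in this README, so every key name here is an assumption, and the authoritative schema is whatever configs/config.yml in the repository actually defines.

```yaml
# Hypothetical configs/config.yml fragment -- key names are illustrative only.
batch_size: 64          # batch size, mentioned above
learning_rate: 0.0002   # learning rate, mentioned above
```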

Contributions

Contributions to this project are welcome. Please follow the standard Git workflow: fork the repository, clone it, create a feature branch, commit your changes, and open a pull request.


References

For a more in-depth understanding and background of DDPMs, see the original paper: Ho, Jain, and Abbeel, "Denoising Diffusion Probabilistic Models" (NeurIPS 2020), https://arxiv.org/abs/2006.11239.
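The core forward process from that paper can be sketched in a few lines: noise is added to a clean sample x_0 in closed form, x_t = sqrt(ᾱ_t)·x_0 + sqrt(1 − ᾱ_t)·ε with ε ~ N(0, I). The snippet below is a minimal NumPy illustration of this equation, not the repository's implementation (which uses PyTorch).

```python
import numpy as np

def forward_diffuse(x0, t, betas, rng=None):
    """Sample x_t ~ q(x_t | x_0) in closed form (Ho et al., 2020, Eq. 4).

    x0: clean sample, t: timestep index, betas: noise schedule array.
    Returns (x_t, noise) so the noise target is available for training.
    """
    rng = rng or np.random.default_rng(0)
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)[t]          # cumulative product up to step t
    noise = rng.standard_normal(x0.shape)      # epsilon ~ N(0, I)
    xt = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise
    return xt, noise
```

With a typical linear schedule (beta from 1e-4 to 0.02 over 1000 steps), x_t is nearly the clean sample at t = 0 and nearly pure noise at t = 999, which is exactly the behavior the reverse (denoising) model is trained to invert.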
