
About Diffusion timestep annealing #65

Open
Kirito-Ausna opened this issue Sep 17, 2024 · 1 comment

Comments


Kirito-Ausna commented Sep 17, 2024

Hi! Thanks for open-sourcing DreamCraft3D. I am studying your paper and code. When I looked for the code corresponding to the "Diffusion timestep annealing" part of the paper, I found that the implementation actually samples timesteps randomly, which confused me. Have I missed something important? I noticed the time-prior branch, but the prior is set to None, and I was not very clear about this part of the code:

if self.cfg.time_prior is not None:
    # Find the first index where the accumulated prior weight
    # exceeds the current training progress ratio.
    time_index = torch.where(
        (self.time_prior_acc_weights - current_step_ratio) > 0
    )[0][0]
    # Snap to whichever of the two neighboring indices is closer
    # to the current progress ratio.
    if time_index == 0 or torch.abs(
        self.time_prior_acc_weights[time_index] - current_step_ratio
    ) < torch.abs(
        self.time_prior_acc_weights[time_index - 1] - current_step_ratio
    ):
        t = self.num_train_timesteps - time_index
    else:
        t = self.num_train_timesteps - time_index + 1
    # Keep t inside the configured sampling range and broadcast per batch.
    t = torch.clip(t, self.min_step, self.max_step + 1)
    t = torch.full((batch_size,), t, dtype=torch.long, device=self.device)

Is this the implementation of DreamTime? Could you share some insights with me? Many thanks!
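For context, the general DreamTime-style idea the snippet above implements (map training progress to a deterministic timestep via a time prior) can be sketched as follows. This is a minimal illustration only: the shape of the weight function and the hyper-parameter values (`m1`, `m2`, `s1`, `s2`) are assumptions for demonstration, not the repository's defaults.

```python
import torch

NUM_TRAIN_TIMESTEPS = 1000  # illustrative; matches typical SD schedulers

def time_prior_weights(m1=800, m2=500, s1=300, s2=100):
    # Hypothetical piecewise time prior W(t) over t = 1..T:
    # Gaussian falloff above m1, flat plateau on [m2, m1],
    # Gaussian falloff below m2. All values are assumptions.
    t = torch.arange(1, NUM_TRAIN_TIMESTEPS + 1, dtype=torch.float32)
    w = torch.where(
        t > m1,
        torch.exp(-((t - m1) ** 2) / (2 * s1 ** 2)),
        torch.where(
            t < m2,
            torch.exp(-((t - m2) ** 2) / (2 * s2 ** 2)),
            torch.ones_like(t),
        ),
    )
    return w / w.sum()  # normalize to a probability distribution

def annealed_timestep(current_step_ratio, weights):
    # Choose t so that the prior mass above t matches training progress:
    # early in training -> large t (coarse structure),
    # late in training -> small t (fine details).
    acc = torch.cumsum(weights.flip(0), dim=0)  # mass from t = T downward
    idx = int(torch.searchsorted(acc, torch.tensor([current_step_ratio]))[0])
    idx = min(idx, NUM_TRAIN_TIMESTEPS - 1)
    return NUM_TRAIN_TIMESTEPS - idx

w = time_prior_weights()
t_early = annealed_timestep(0.1, w)  # large timestep early in training
t_late = annealed_timestep(0.9, w)   # small timestep late in training
```

The snippet in the issue does the same lookup against precomputed `time_prior_acc_weights`, then clips `t` into `[min_step, max_step]` and broadcasts it over the batch.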

@Kirito-Ausna (Author)

Now I understand that this part is the implementation of DreamTime, but could you share the default hyper-parameter settings with us?
