File: stable-diffusion-webui/modules/sub_quadratic_attention.py
Commit 2489252099 (brkirch): torch.empty can create issues; use torch.zeros
On MPS, passing a tensor created with `torch.empty()` as the input to `torch.baddbmm()` can cause NaNs in the returned tensor, even though `beta=0` means the input's values should never contribute to the result. With a tensor of shape [1, 1, 1] there is a negligible performance difference between `torch.empty()` and `torch.zeros()` anyway, so it is better to just use `torch.zeros()` here and avoid creating issues unnecessarily.
2023-08-13 10:06:25 -04:00
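
A minimal sketch of the pattern the commit addresses, assuming the attention scores are computed with `torch.baddbmm` and `beta=0` as in sub_quadratic_attention.py; the function and variable names below are illustrative, not the file's actual code:

```python
import torch

def attention_scores(query: torch.Tensor, key: torch.Tensor, scale: float) -> torch.Tensor:
    # baddbmm computes beta * input + alpha * (query @ key^T).
    # With beta=0 the `input` tensor's values should be irrelevant,
    # but on MPS an uninitialized torch.empty() input can still leak
    # NaNs into the result, so a zero-filled tensor is used instead.
    return torch.baddbmm(
        torch.zeros(1, 1, 1, device=query.device, dtype=query.dtype),  # was torch.empty(1, 1, 1, ...)
        query,
        key.transpose(1, 2),
        alpha=scale,
        beta=0,
    )
```

For example, `attention_scores(torch.randn(2, 8, 64), torch.randn(2, 16, 64), 64 ** -0.5)` would return a (2, 8, 16) score tensor; the [1, 1, 1] input broadcasts over that shape.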
