When using ComfyUI with an AMD GPU and PyTorch ROCm, installing custom nodes via ComfyUI-Manager (or manually via pip) can silently replace your ROCm-enabled PyTorch with the default CUDA version.
This happens because:
- Custom nodes list `torch` as a dependency in their `requirements.txt`
- pip resolves this to the default PyPI `torch` package (the CUDA build)
- Your ROCm torch gets overwritten without warning
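To catch this early, you can check which PyTorch build is currently installed after adding a custom node. The snippet below is a minimal sketch (not part of ComfyUI itself); it only inspects the locally installed `torch` package to tell a ROCm build apart from a CUDA build:

```python
# check_torch_build.py - verify which PyTorch build is currently installed.
# A minimal sketch: it distinguishes a ROCm build from a CUDA build so a
# silent overwrite by a custom-node install can be spotted right away.
import torch

print(f"torch version:       {torch.__version__}")   # ROCm wheels carry a "+rocmX.Y" suffix
print(f"HIP (ROCm) version:  {torch.version.hip}")   # None on CUDA builds
print(f"CUDA version:        {torch.version.cuda}")  # None on ROCm builds

if torch.version.hip is not None:
    print("OK: this is a ROCm build of PyTorch.")
else:
    print("WARNING: this is not a ROCm build - a custom node install "
          "may have replaced it with the default (CUDA) wheel.")
```

If the check fails, reinstalling from the PyTorch ROCm wheel index (e.g. `pip install torch --index-url https://download.pytorch.org/whl/rocm6.2`, adjusting the ROCm version to match your installation) restores the correct build.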