terribleperson 8 hours ago

It's pretty crazy that a 6900XT/6950XT aren't supported.

bavell 1 hour ago
Eh, YMMV. I was using ROCm for minor AI things as far back as 2023 on an "unsupported" 6750 XT [0]. Even trained some LoRAs. Mostly the issues were how many libs were CUDA-only.
[0] https://news.ycombinator.com/item?id=43207015
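(For readers wondering how this works in practice: the comment doesn't spell it out, but the commonly reported trick for running ROCm on officially unsupported RDNA2 cards like the 6750 XT is to override the gfx target ROCm sees. A minimal sketch, assuming an RDNA2 card whose native gfx1031 target is unsupported:)

```shell
# Widely reported workaround, not official AMD guidance: the 6700/6750 XT
# identify as gfx1031, which many ROCm builds reject. Overriding to the
# supported gfx1030 target lets ROCm kernels load on these cards.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# With a ROCm build of PyTorch installed, the GPU is then exposed through
# the usual CUDA-style API surface, e.g.:
#   python3 -c "import torch; print(torch.cuda.is_available())"
```

Whether a given library works after that still depends on it having ROCm support at all, which matches the commenter's point that the main obstacle was CUDA-only libraries.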