- cross-posted to:
- [email protected]
- [email protected]
But in all fairness, it’s really llama.cpp that supports AMD.
Now looking forward to the Vulkan support!
I’ve been using it with a 6800 for a few months now; all it needs is a few env vars (see the sketch below).
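
A minimal sketch of what that setup can look like, assuming a llama.cpp server binary and a local GGUF model (both paths and flags here are placeholders). The override values are the ones commonly reported for RDNA2 cards like the RX 6800; the exact settings depend on your card and ROCm version.

```python
import os
import subprocess

# Copy the current environment and add the ROCm-related overrides.
env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # report the card as gfx1030 to ROCm
env["HIP_VISIBLE_DEVICES"] = "0"            # use the first AMD GPU

# Launch the server with the model fully offloaded to the GPU.
# "./llama-server", "model.gguf", and the layer count are assumptions for illustration.
subprocess.run(
    ["./llama-server", "-m", "model.gguf", "-ngl", "99"],
    env=env,
    check=True,
)
```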