- Updated llama.cpp Python scripts to `bc098c3` (adds support for Qwen3, Llama 4, etc.)
- Updated requirements and performed general maintenance
- UI fixes in AutoGGUF
- Updated backend selection box to sort by newest version
- Fixed the log information box inserting newlines on open and during autoscroll
- Modified task deletion behavior
- Fixed logging for task cancellation and deletion
- Updated README information
- Updated the copyright year in the LICENSE file to 2025
- Bundled llama.cpp license text in the About menu to maintain MIT license compliance
- Updated the llama.cpp and gguf Python library and scripts
- Adjusted monitoring intervals from 0.2s to 0.5s
- Updated Python requirements to the latest compatible versions
- Added new HF to GGUF conversion types: `tq1_0` and `tq2_0` (see the sketch after this list)
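
For context, `tq1_0` and `tq2_0` are ternary quantization types that correspond to `--outtype` choices in llama.cpp's `convert_hf_to_gguf.py`. Below is a minimal sketch of invoking that script directly from Python; the script location, model directory, and output filename are placeholders for illustration, not AutoGGUF's actual internals.

```python
# Hypothetical example: drive the bundled llama.cpp conversion script with
# one of the new ternary output types. All paths below are placeholders.
import subprocess
import sys

subprocess.run(
    [
        sys.executable,
        "convert_hf_to_gguf.py",          # llama.cpp HF -> GGUF conversion script
        "/path/to/hf-model",              # directory containing the Hugging Face model
        "--outfile", "model-tq1_0.gguf",  # output GGUF file
        "--outtype", "tq1_0",             # new ternary type; "tq2_0" is also accepted
    ],
    check=True,  # raise if the conversion fails
)
```

In the AutoGGUF UI, these simply appear as two additional output type options for HF to GGUF conversion.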
Happy New Year 🎉!