LLM Farm news

v1.4.0 beta

  • llama.cpp updated to b3982
  • Added RAG support for PDF documents (a rough sketch of the idea follows after this list)
  • Chat settings UI improvements
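
The note above only names the feature, but the general shape of retrieval-augmented generation over a PDF is easy to sketch. The Swift code below is an illustration of that shape, not LLM Farm's implementation: it extracts the document text with PDFKit, splits it into chunks, ranks the chunks against the question, and places the best matches in the prompt. The embed(_:) function is a stand-in for a real embedding model.

    import Foundation
    import PDFKit

    // Stand-in embedding: a real app would use an embedding model
    // (for example through llama.cpp). A bag-of-words hash keeps this
    // sketch self-contained and runnable.
    func embed(_ text: String) -> [Double] {
        var v = [Double](repeating: 0, count: 64)
        for word in text.lowercased().split(separator: " ") {
            v[Int(word.hashValue.magnitude % 64)] += 1
        }
        return v
    }

    // Cosine similarity between two embedding vectors.
    func cosine(_ a: [Double], _ b: [Double]) -> Double {
        var dot = 0.0, na = 0.0, nb = 0.0
        for i in 0..<min(a.count, b.count) {
            dot += a[i] * b[i]
            na += a[i] * a[i]
            nb += b[i] * b[i]
        }
        return (na == 0 || nb == 0) ? 0 : dot / (sqrt(na) * sqrt(nb))
    }

    // Pull the text out of the PDF, split it into fixed-size chunks,
    // and keep the chunks most similar to the question.
    func retrieve(question: String, pdfURL: URL, chunkSize: Int = 500, topK: Int = 3) -> [String] {
        guard let doc = PDFDocument(url: pdfURL), let text = doc.string else { return [] }
        var chunks: [String] = []
        var start = text.startIndex
        while start < text.endIndex {
            let end = text.index(start, offsetBy: chunkSize, limitedBy: text.endIndex) ?? text.endIndex
            chunks.append(String(text[start..<end]))
            start = end
        }
        let q = embed(question)
        let ranked = chunks.map { ($0, cosine(embed($0), q)) }.sorted { $0.1 > $1.1 }
        return ranked.prefix(topK).map { $0.0 }
    }

    // The retrieved excerpts go into the prompt ahead of the question.
    func augmentedPrompt(question: String, context: [String]) -> String {
        let excerpts = context.joined(separator: "\n---\n")
        return "Answer using the following excerpts.\n\n" + excerpts + "\n\nQuestion: \(question)"
    }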

v1.3.9

  • llama.cpp updated to b3837
  • Added support for Llama 3.2 and 3.1, RWKV, MiniCPM (2.5, 2.6, 3), and Chameleon models

v1.3.4

  • llama.cpp updated to b3445
  • Added support for Gemma 2, T5, JAIS, BitNet, GLM (3, 4), and Mistral Nemo
  • Added OpenELM support

v1.3.0

  • llama.cpp updated to b3190
  • Added support for DeepSeek-V2 and GPT-NeoX (Pythia and others)
  • Added support for Markdown formatting

v1.2.5

  • Save/load context state: the dialog with the model can now be continued even after the app is reopened (a minimal sketch of the idea follows after this list)
  • Chat settings such as sampling are now applied without reloading the chat.
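
The release note does not show how the state is stored. As a minimal sketch of the idea (illustrative names, not LLM Farm's code), the Swift snippet below persists the visible message history together with an opaque snapshot of the llama.cpp context state, and restores both when the chat is reopened; llama.cpp exposes such a snapshot through its state API, and here it is treated simply as raw bytes.

    import Foundation

    // A saved chat: the visible messages plus an opaque snapshot of the
    // llama.cpp context state, treated here as raw bytes.
    struct SavedChat: Codable {
        var messages: [String]
        var contextState: Data
    }

    // Illustrative location inside the app's Documents directory.
    func stateURL(chatID: String) -> URL {
        FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("\(chatID).chatstate")
    }

    // Persist the chat so the dialog can continue after the app restarts.
    func save(chat: SavedChat, chatID: String) throws {
        let data = try JSONEncoder().encode(chat)
        try data.write(to: stateURL(chatID: chatID), options: .atomic)
    }

    // Restore it, or return nil if nothing has been saved yet.
    func load(chatID: String) -> SavedChat? {
        guard let data = try? Data(contentsOf: stateURL(chatID: chatID)) else { return nil }
        return try? JSONDecoder().decode(SavedChat.self, from: data)
    }

On reopening, the restored blob would be handed back to llama.cpp before generation continues, so the earlier conversation does not have to be re-evaluated.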

v1.2.0

  • Added Shortcuts support
  • llama.cpp updated to b2864
  • Added Llama 3 instruct template
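
For reference, the Llama 3 instruct format wraps every turn in header tokens and terminates it with <|eot_id|>. The small Swift helper below builds a prompt in that format; it is a sketch of the format itself, not LLM Farm's template engine.

    // Builds a prompt in the Llama 3 instruct format:
    // <|begin_of_text|><|start_header_id|>{role}<|end_header_id|>\n\n{content}<|eot_id|> ...
    func llama3Prompt(system: String, turns: [(role: String, content: String)]) -> String {
        var prompt = "<|begin_of_text|>"
        prompt += "<|start_header_id|>system<|end_header_id|>\n\n\(system)<|eot_id|>"
        for turn in turns {
            prompt += "<|start_header_id|>\(turn.role)<|end_header_id|>\n\n\(turn.content)<|eot_id|>"
        }
        // Generation continues from an open assistant header.
        prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
        return prompt
    }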

v1.1.1

  • Added Gemma template and download link
  • Added warning about creating a chat without a selected model
  • Fixed some bugs

v1.1.0

  • llama.cpp updated to b2717
  • Added support for Phi-3, Mamba (CPU only), Gemma, StarCoder2, GritLM, Command-R, MobileVLM_V2, and Qwen2-MoE models
  • Added support for IQ1_S, IQ2_S, IQ2_M, IQ3_S, IQ4_NL, and IQ4_XS quantization

v1.0.1

  • Fixed some bugs that could cause the application to crash
  • Added the ability to hide the keyboard; to do this, tap anywhere in the chat window (a common SwiftUI approach is sketched below)
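
A common SwiftUI way to get this behaviour (not necessarily how LLM Farm implements it) is to ask the current first responder to resign when the chat area is tapped:

    import SwiftUI
    import UIKit

    extension UIApplication {
        // Asks whichever view currently owns the keyboard to give it up.
        func endEditing() {
            sendAction(#selector(UIResponder.resignFirstResponder), to: nil, from: nil, for: nil)
        }
    }

    struct ChatView: View {
        @State private var draft = ""
        var body: some View {
            VStack {
                ScrollView { Text("…messages…") }
                TextField("Message", text: $draft)
            }
            // Tapping anywhere in the chat area hides the keyboard.
            .onTapGesture { UIApplication.shared.endEditing() }
        }
    }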