llama.cpp integration in Unreal Engine 5


I have managed to integrate the llama.cpp project into UE5 to load large language models (Meta Llama 3.1) and generate text dynamically. The main improvement is that response generation is now substantially faster than in previous versions. Specific advances:

  • Integration of llama.cpp into UE5, which allows the LLM to run locally (no internet connection needed, and conversations stay private).
  • LLM execution supports four speed settings: Normal, Medium, High, and Real Time (the default option; only the CPU is used).
  • You can choose the sampler: Top-K or Mirostat 2.
  • The android remembers past conversations.
  • The interface to configure the android is finished.
  • The startup interface is finished.
  • The main interaction interface was updated.
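Since the devlog doesn't show the sampling code, here is a small self-contained sketch of what the Top-K sampler does conceptually. The `top_k_candidates` helper is hypothetical (it is not llama.cpp's API, which provides its own sampler chain): only the k highest-scoring tokens survive as candidates before one is drawn.

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>

// Illustrative Top-K filtering (hypothetical helper, not llama.cpp's API):
// keep only the indices of the k highest logits, ordered by descending score.
// Everything outside the top k is discarded before the final token is drawn.
std::vector<std::size_t> top_k_candidates(const std::vector<float>& logits,
                                          std::size_t k) {
    std::vector<std::size_t> idx(logits.size());
    for (std::size_t i = 0; i < idx.size(); ++i) idx[i] = i;

    k = std::min(k, idx.size());
    // Partial sort: only the first k positions need to be ordered.
    std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
                      [&](std::size_t a, std::size_t b) {
                          return logits[a] > logits[b];
                      });
    idx.resize(k);
    return idx;
}
```

Mirostat 2 works differently: instead of a fixed cutoff, it adapts the candidate set each step to keep the output's surprise (perplexity) near a target value, which tends to give more stable-sounding text over long generations.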

Next steps: a state machine for animations, localization into different languages, an auto-save option, etc.

Note: I have managed to apply a workaround to completely decensor the LLM; however, it seems to make the LLM more repetitive, so more experimentation is needed.
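The workaround itself isn't described here, but a common knob for taming repetition in llama.cpp-style sampling is a repeat penalty. The sketch below is a generic illustration of that idea, not the author's actual fix; the `apply_repeat_penalty` helper is my own hypothetical version. It dampens the logits of recently generated tokens so they are less likely to be picked again.

```cpp
#include <vector>
#include <cstdint>
#include <cstddef>

// Illustrative repeat penalty (hypothetical helper, not the author's
// workaround): for each recently generated token, shrink a positive logit
// by dividing it, and push a negative logit further down by multiplying it.
// A penalty > 1.0 discourages repeating recent tokens.
void apply_repeat_penalty(std::vector<float>& logits,
                          const std::vector<int32_t>& recent_tokens,
                          float penalty) {
    for (int32_t tok : recent_tokens) {
        if (tok < 0 || static_cast<std::size_t>(tok) >= logits.size())
            continue;  // skip tokens outside the vocabulary range
        float& l = logits[static_cast<std::size_t>(tok)];
        l = (l > 0.0f) ? l / penalty : l * penalty;
    }
}
```

Tuning this value (llama.cpp exposes a similar parameter in its sampling options) is one of the first things worth experimenting with when a model starts looping.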



Comments


Good progress. I look forward to what's next.

Thanks! Although these are things I've done before, it's interesting to see how new problems keep appearing. It's slow, but it's progressing! Regards