Setting Up

In this section, you will set up your environment for developing LLM applications with the MBASE Inference library.

By the end of this section, you will have:

  • Set up your system.

  • Installed the llama.cpp library.

  • Installed the MBASE library.

  • Created an empty project and queried your hardware information using the MBASE library.

We will start by reviewing the system requirements, then point you to the download page for quick installation. After that, we will prepare the environment for SDK development, optionally compile from source, and finish by creating an empty project.