Creating an Empty Project¶
About¶
On this page, you will create an empty CMake project and link the MBASE inference library into it.
Setting up CMake¶
Now, in your working directory, create a directory named ‘mbase_simple_project’ and go into it:

mkdir mbase_simple_project
cd mbase_simple_project

In that directory, using your favorite IDE or the terminal, create the following two files:

CMakeLists.txt
main.cpp

touch CMakeLists.txt
touch main.cpp
Now configure your CMakeLists.txt by declaring the minimum required CMake version and the project:
cmake_minimum_required(VERSION 3.15...3.31)
project("mbase_simple_project" LANGUAGES CXX)
Then create an executable target named simple_project from main.cpp:
add_executable(simple_project main.cpp)
Next, set the C++ standard to 17 (any standard of 17 or higher is fine), then find the MBASE libraries and link the std and inference libraries. Because mbase-std and mbase-inference are imported targets, linking them also propagates their include directories:
find_package(mbase.libs REQUIRED COMPONENTS std inference)
target_compile_features(simple_project PUBLIC cxx_std_17)
target_link_libraries(simple_project PRIVATE mbase-std mbase-inference)
After these steps, your CMakeLists.txt should look like this:
cmake_minimum_required(VERSION 3.15...3.31)
project("mbase_simple_project" LANGUAGES CXX)
add_executable(simple_project main.cpp)
find_package(mbase.libs REQUIRED COMPONENTS std inference)
target_compile_features(simple_project PUBLIC cxx_std_17)
target_link_libraries(simple_project PRIVATE mbase-std mbase-inference)
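At this point you can already configure the project to verify that CMake can locate the MBASE package. A minimal sketch, assuming MBASE is installed under a non-default prefix (/path/to/mbase is a placeholder for your actual install location; drop the CMAKE_PREFIX_PATH option if MBASE is installed system-wide):

cmake -B build -DCMAKE_PREFIX_PATH=/path/to/mbase

If find_package succeeds, configuration finishes without errors; otherwise CMake will report that mbase.libs could not be found.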
Setting up main.cpp¶
With the CMake configuration in place, we will write a small “hello world”: using the inference library, we will print the names of the devices available on the system to confirm that everything works and the libraries are linked properly. Here is how we do it:
#include <mbase/inference/inf_device_desc.h>
#include <mbase/vector.h>
#include <iostream>

int main()
{
    // Query the devices that are available for inference on this system
    mbase::vector<mbase::InfDeviceDescription> deviceDesc = mbase::inf_query_devices();
    for(mbase::vector<mbase::InfDeviceDescription>::iterator It = deviceDesc.begin(); It != deviceDesc.end(); ++It)
    {
        // Print a human-readable description of each device
        std::cout << It->get_device_description() << std::endl;
    }
    return 0;
}
After you build and run this program, it will list the devices available on your system for inference.
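If you are working from the terminal, a typical build-and-run sequence looks like this (a sketch assuming the build directory from the configuration step above; on multi-config generators such as Visual Studio, the executable lands in a configuration subdirectory like build/Debug):

cmake --build build
./build/simple_project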
Important
If you have an NVIDIA GPU and it is not being displayed by the program, make sure you have installed the necessary drivers and built the llama.cpp library with the proper configuration.
Refer to: Setting up
For bundled install: Download
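As a quick sanity check (nvidia-smi is a general NVIDIA driver tool, not part of MBASE), you can confirm that the driver is installed and the GPU is visible:

nvidia-smi

If this command fails or does not list your GPU, fix the driver installation first, then revisit the llama.cpp build configuration.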