Agent Smith is a high-performance, modular C++23 framework for building LLM-powered agents.
Designed with composability and safety in mind, it provides a clean, dependency-injectable architecture for orchestrating LLM interactions, managing conversation history, and natively executing tools/functions.
## Features

- **Error handling built on `std::expected`** (via `mw::E<T>`), ensuring robust network and execution error propagation without crashing your application.
- **`Task<T>` coroutine wrappers**, ready for integration into modern async event loops.
- **`libmw`** for everyday HTTP networking and data handling, alongside `nlohmann/json` and `spdlog`.

## Installation

Agent Smith is designed to be consumed as a CMake library. The easiest way to add it to your project is via `FetchContent`:
```cmake
include(FetchContent)

FetchContent_Declare(
  agent_smith
  GIT_REPOSITORY https://github.com/YourOrg/agent-smith.git
  # GIT_TAG main
)
FetchContent_MakeAvailable(agent_smith)

add_executable(my_agent src/main.cpp)
target_link_libraries(my_agent PRIVATE agent_smith)
```
## Quick Start

Here is a minimal example of how to configure an OpenAI-compatible client, define a custom tool, and run an agent loop.
```cpp
#include <iostream>
#include <memory>

#include <agent.hpp>
#include <llm_client.hpp>
#include <memory.hpp>
#include <tool.hpp>

// 1. Define a custom tool
class MyCalculator : public Tool {
public:
    std::string name() const override { return "calculator"; }
    std::string description() const override { return "Performs basic addition."; }

    nlohmann::json parametersSchema() const override {
        return R"({
            "type": "object",
            "properties": {
                "a": {"type": "number"},
                "b": {"type": "number"}
            },
            "required": ["a", "b"]
        })"_json;
    }

    Task<mw::E<std::string>> execute(const nlohmann::json& args) override {
        double a = args["a"];
        double b = args["b"];
        co_return std::to_string(a + b);
    }
};

// Simple blocking runner for coroutines
void runTask(Task<mw::E<std::string>> task) {
    auto res = task.get();
    if (res.has_value()) {
        std::cout << "Agent: " << res.value() << "\n";
    } else {
        std::cout << "Error: " << mw::errorMsg(res.error()) << "\n";
    }
}

int main() {
    // 2. Set up the infrastructure
    auto client = std::make_unique<OpenAiClient>("your-api-key");
    auto memory = std::make_unique<InMemoryMemory>();

    ToolRegistry registry;
    registry.registerTool(std::make_unique<MyCalculator>());

    // 3. Instantiate the agent
    Agent agent(std::move(client), std::move(memory), registry);
    agent.allowTool("calculator");

    Skill math_skill{
        "math_bot",
        "You are a helpful math bot. Use your tools to answer questions.",
        {"calculator"}
    };
    agent.activateSkill(math_skill);

    // 4. Run the conversational loop
    runTask(agent.run("What is 42 + 8?"));
    return 0;
}
```
## CLI

The repository also builds an interactive CLI executable (`agent_smith_cli`) out of the box, which serves as a great reference implementation.
```bash
# Build the CLI and tests
cmake -S . -B build
cmake --build build -j

# Run the CLI against a custom local endpoint
./build/agent_smith_cli --api-key "dummy" --endpoint "http://localhost:8080/v1" --model "local-model"
```