# Agent Smith 🕶️
**Agent Smith** is a high-performance, modular C++23 framework for building LLM-powered agents.
Designed with composability and safety in mind, it provides a clean, dependency-injectable architecture for orchestrating LLM interactions, managing conversation history, and natively executing tools/functions.
## Key Features
* **Modular Architecture:** Swap out LLM clients, memory storage backends, and tool registries effortlessly.
* **Native Tool Calling:** Fully supports structured JSON tool calling, paving the way for seamless Model Context Protocol (MCP) integrations.
* **Exception-Free Error Handling:** Built heavily around C++23 `std::expected` (via `mw::E<T>`), ensuring robust network and execution error propagation without crashing your application.
* **Async-Ready:** Core loops and LLM executions return `Task<T>` coroutine wrappers, ready for integration into modern async event loops.
* **Lightweight Foundation:** Powered by `libmw` for HTTP networking and data handling, alongside `nlohmann/json` and `spdlog`.
## Integration
Agent Smith is designed to be consumed as a CMake library. The easiest way to add it to your project is via `FetchContent`.
### CMakeLists.txt
```cmake
include(FetchContent)
FetchContent_Declare(
    agent_smith
    GIT_REPOSITORY https://github.com/YourOrg/agent-smith.git
    # GIT_TAG main  # pin a tag or commit for reproducible builds
)
FetchContent_MakeAvailable(agent_smith)
add_executable(my_agent src/main.cpp)
target_link_libraries(my_agent PRIVATE agent_smith)
```
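If you vendor the repository instead (for example as a git submodule), `add_subdirectory` works as well. The checkout path below is an assumption; adjust it to wherever the sources live in your tree:

```cmake
# Assumes the repository is checked out at third_party/agent-smith
add_subdirectory(third_party/agent-smith)

add_executable(my_agent src/main.cpp)
target_link_libraries(my_agent PRIVATE agent_smith)
```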
## Quick Start Example
Here is a minimal example of how to configure an OpenAI-compatible client, define a custom tool, and run an agent loop.
```cpp
#include <iostream>
#include <memory>

#include <agent.hpp>
#include <llm_client.hpp>
#include <memory.hpp>
#include <tool.hpp>

// 1. Define a custom tool
class MyCalculator : public Tool {
public:
    std::string name() const override { return "calculator"; }
    std::string description() const override { return "Performs basic addition."; }

    nlohmann::json parametersSchema() const override {
        return R"({
            "type": "object",
            "properties": {
                "a": {"type": "number"},
                "b": {"type": "number"}
            },
            "required": ["a", "b"]
        })"_json;
    }

    Task<mw::E<std::string>> execute(const nlohmann::json& args) override {
        // .at() throws on missing keys; the schema above marks both as required.
        double a = args.at("a").get<double>();
        double b = args.at("b").get<double>();
        co_return std::to_string(a + b);
    }
};

// Simple blocking runner for coroutines
void runTask(Task<mw::E<std::string>> task) {
    auto res = task.get();
    if(res.has_value())
        std::cout << "Agent: " << res.value() << "\n";
    else
        std::cout << "Error: " << mw::errorMsg(res.error()) << "\n";
}

int main() {
    // 2. Set up the infrastructure
    auto client = std::make_unique<OpenAiClient>("your-api-key");
    auto memory = std::make_unique<InMemoryMemory>();
    ToolRegistry registry;
    registry.registerTool(std::make_unique<MyCalculator>());

    // 3. Instantiate the agent and activate a skill
    Agent agent(std::move(client), std::move(memory), registry);
    agent.allowTool("calculator");

    Skill math_skill{
        "math_bot",
        "You are a helpful math bot. Use your tools to answer questions.",
        {"calculator"}
    };
    agent.activateSkill(math_skill);

    // 4. Run the conversational loop
    runTask(agent.run("What is 42 + 8?"));
    return 0;
}
```
## Built-in CLI
The repository also builds an interactive CLI executable (`agent_smith_cli`) out of the box, which doubles as a reference implementation.
```bash
# Build the CLI and tests
cmake -S . -B build
cmake --build build -j
# Run the CLI using a custom local endpoint
./build/agent_smith_cli --api-key "dummy" --endpoint "http://localhost:8080/v1" --model "local-model"
```
## Dependencies
* **C++ Standard:** C++23
* **Build System:** CMake >= 3.24
* [libmw](https://github.com/MetroWind/libmw) (HTTP client, SQLite, core utils)
* [nlohmann/json](https://github.com/nlohmann/json)
* [spdlog](https://github.com/gabime/spdlog)
* [cxxopts](https://github.com/jarro2783/cxxopts) (CLI only)
* [GoogleTest](https://github.com/google/googletest) (Tests only)