llama_cpp_sdk v0.1.0 (incubating)

First concrete self-hosted inference backend for llama.cpp, owning llama-server boot specs, readiness probes, stop semantics, and endpoint publication through…
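
The description names a few concrete responsibilities. As a rough illustration of what "owning llama-server boot specs" involves, the sketch below launches llama-server as an external OS process from Elixir via an Erlang port. This is not the SDK's API; the binary path and model path are placeholders, and only the -m/--host/--port flags belong to llama-server itself.

# Illustrative only -- not LlamaCppSdk's API. Boots llama-server as an
# external OS process via an Erlang port. The binary and model paths
# are placeholders; -m/--host/--port are llama-server's own flags.
port =
  Port.open(
    {:spawn_executable, "/usr/local/bin/llama-server"},
    [
      :binary,
      :exit_status,
      args: ["-m", "/models/model.gguf", "--host", "127.0.0.1", "--port", "8080"]
    ]
  )

# Server output arrives as {^port, {:data, chunk}} messages; a crash or
# clean shutdown shows up as {^port, {:exit_status, code}}, which is the
# raw material for the stop semantics the description mentions.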

Stats & Info

  • Latest release: v0.1.0
  • Maintainers: nshkrdotcom

License

MIT

Installation

Add llama_cpp_sdk to your list of dependencies in mix.exs:

def deps do
  [
    {:llama_cpp_sdk, "~> 0.1.0"}
  ]
end

Then run:

mix deps.get
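
Once the dependency compiles, one way to confirm a locally running llama-server is reachable is to poll its /health endpoint, which llama-server exposes. The module below is only a sketch using :httpc from OTP's standard library; the function names and the 127.0.0.1:8080 address are assumptions, not part of llama_cpp_sdk.

# A hand-rolled readiness probe against llama-server's /health endpoint,
# using only :httpc from OTP. Not part of llama_cpp_sdk; the names and
# the 127.0.0.1:8080 address are assumptions.
defmodule ReadinessProbe do
  @health_url ~c"http://127.0.0.1:8080/health"

  # Poll until llama-server answers 200 OK or attempts run out.
  def await_ready(attempts \\ 30, interval_ms \\ 500)

  def await_ready(0, _interval_ms), do: {:error, :timeout}

  def await_ready(attempts, interval_ms) do
    # :httpc requires the :inets application; starting it is idempotent.
    {:ok, _} = Application.ensure_all_started(:inets)

    case :httpc.request(:get, {@health_url, []}, [], []) do
      {:ok, {{_, 200, _}, _headers, _body}} ->
        :ok

      _not_ready ->
        Process.sleep(interval_ms)
        await_ready(attempts - 1, interval_ms)
    end
  end
end

Calling ReadinessProbe.await_ready() then returns :ok as soon as the server reports healthy, or {:error, :timeout} after the attempts are exhausted.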

Resources