For anyone interested in building applications that use LLMs in Roc, I have a new package for you!
roc-openrouter :rocket:
This provides an easy-to-use interface for interacting with the OpenRouter.ai API, and includes features like:
- `[INST]`, `<<SYS>>`, and `<s>` tags for models with llama-style fine-tuning.
- LLM parameters such as `temperature`, `topP`, `topK`, `repetitionPenalty`, etc.

Prompting an LLM with the auto router is as easy as:
```roc
client = Chat.initClient { apiKey }
messages = Chat.appendUserMessage [] "Hello, computer!"
response = Http.send! (Chat.buildRequest client messages)
when Chat.decodeTopMessageChoice response.body is
    Ok message -> Stdout.line! message.content
    Err _ -> Stdout.line! "Problem decoding API response"
```
Hey y'all! I've got a big addition to roc-openrouter live as of today! The package now supports LLM tool use. This means that when using a model that supports tool usage (such as gpt-4o), the model can now call Roc functions and use the results in its answers.
To see this in action check out this example.
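To give a rough sense of the shape of tool calling, here is a hypothetical sketch. The names `Tools.buildTool`, the record fields, and the handler wiring are illustrative assumptions, not the package's actual API; the linked example shows the real usage.

```roc
# Hypothetical sketch -- names and signatures are assumptions, not the
# actual roc-openrouter API. See the linked example for real usage.

# Define a Roc function the model may call, along with a description so
# the model knows when to invoke it:
greetTool = Tools.buildTool {
    name: "greet",
    description: "Greet a person by name",
    parameters: [],
    handler: \name -> Task.ok "Hello, $(name)!",
}

# The general flow: attach the tool to the request, and when the model
# responds with a tool call instead of text, run the handler and send
# the result back so the model can finish its answer.
```

The key idea is that the package decodes the model's tool-call response, dispatches to your Roc handler, and appends the result as a new message in the conversation before re-sending the request.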
While this API is functional, and I'm pretty happy with it as is, I think it could still use a little refinement to simplify the process. As always, I'm open to feedback and pull requests!
EDIT: Updated the API by using module params to allow the Tools module to make API requests. This allowed me to streamline the tool calling process significantly.
yay! happy to see module params in use :smiley:
Thanks for all your work on module params! That’s a feature I’ve been looking forward to ever since it was first proposed.
Last updated: Jul 06 2025 at 12:14 UTC