Stream: show and tell

Topic: OpenRouter package for using LLMs in Roc


view this post on Zulip Ian McLerran (May 16 2024 at 20:15):

For anyone interested in building applications that use LLMs in Roc, I have a new package for you!

roc-openrouter :rocket:

This provides an easy-to-use interface for interacting with the OpenRouter.ai API, along with a number of convenience features.

view this post on Zulip Ian McLerran (May 16 2024 at 21:04):

Prompting an LLM with the auto router is as easy as:

# Create a client configured with your OpenRouter API key
client = Chat.initClient { apiKey }
# Start the conversation with a single user message
messages = Chat.appendUserMessage [] "Hello, computer!"
# Build the chat completion request and send it
response = Http.send! (Chat.buildRequest client messages)
# Decode the model's top message choice and print it, or report a decode failure
when Chat.decodeTopMessageChoice response.body is
    Ok message -> Stdout.line! message.content
    Err _ -> Stdout.line! "Problem decoding API response"

view this post on Zulip Ian McLerran (Sep 19 2024 at 19:09):

Hey y'all! I've got a big addition to roc-openrouter live as of today! The package now supports LLM tool use. This means that when using a model that supports tool calling (such as gpt-4o), the model can now call Roc functions and use the results in its answers.

To see this in action, check out this example.
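
As a rough sketch of the flow (the names here, like Tools.buildTool and Tools.dispatchToolCalls, are illustrative rather than the package's exact API): describe a tool, attach it to the request, and hand any tool call in the response back to the matching Roc function so the model can use its result.

# Hypothetical sketch only -- Tools.buildTool, Chat.buildRequestWithTools, and
# Tools.dispatchToolCalls are illustrative names, not necessarily the real API.
utcNowTool = Tools.buildTool "utcNow" "Get the current UTC time" []

# The Roc function the model may call (argument parsing omitted for brevity)
utcNow = \_args -> Utc.now! |> Utc.toMillisSinceEpoch |> Num.toStr

messages = Chat.appendUserMessage [] "What time is it right now?"
response = Http.send! (Chat.buildRequestWithTools client [utcNowTool] messages)

# If the model replied with a tool call, run the matching Roc function, append
# its result to the conversation, and send a follow-up request so the model
# can finish its answer in natural language.
updatedMessages = Tools.dispatchToolCalls! response [("utcNow", utcNow)]

The linked example shows the package's actual API for this.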

While this API is functional and I'm pretty happy with it as is, I think it could still use a little refinement to simplify the tool calling process. As always, I'm open to feedback and pull requests!

EDIT: Updated the API by using module params to allow the Tools module to make API requests. This allowed me to streamline the tool calling process significantly.
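
For anyone who hasn't used module params yet: a module can declare values it needs from its importer, so the Tools module can be handed the platform's HTTP function at import time. A minimal sketch, with hypothetical names (the actual parameter the package takes may differ):

# Hypothetical sketch of module params -- the parameter name sendHttpReq and
# the exposed function handleToolCalls are illustrative, not the exact code.

# In the package's Tools module, declare the value the importer must supply:
module { sendHttpReq } -> [handleToolCalls]

# In the application, pass the platform's HTTP function in at import time:
import Tools { sendHttpReq: Http.send }

That is what lets the Tools module send follow-up requests itself instead of pushing that loop onto the caller.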

view this post on Zulip Agus Zubiaga (Sep 20 2024 at 14:35):

yay! happy to see module params in use :smiley:

view this post on Zulip Ian McLerran (Sep 20 2024 at 15:07):

Thanks for all your work on module params! That’s a feature I’ve been looking forward to ever since it was first proposed.

