Now that you’ve deployed your first provider and confirmed it’s working, you can connect it to an LLM like ChatGPT. In this guide, you’ll learn how to build a chat-enabled app that automatically handles tool calls from your Metorial providers.
What you’ll learn:
- How to use a Metorial provider
- How to use the Metorial SDKs

Prerequisites:
- Complete the Introduction guide
- Complete the Deploying your first provider guide
- Complete the Testing your first provider guide
5. Loop & Handle Tool Calls
- Send `messages` to OpenAI, passing the tools.
- If the assistant response contains `tool_calls`, invoke each one.
- Append both the tool call requests and their results to `messages`.
- Repeat until the assistant’s response has no more `tool_calls`.
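The steps above can be sketched as a simple loop. This is a minimal illustration, not the Metorial SDK: `call_model` and `run_tool` are hypothetical stand-ins (in a real app, `call_model` would be an OpenAI chat-completions request with `tools=...`, and `run_tool` would dispatch the call to your Metorial provider). Here they are stubbed so the loop runs on its own:

```python
import json

def call_model(messages):
    # Stand-in for the LLM call (assumption, not a real API).
    # It requests a hypothetical `add` tool once, then answers.
    if not any(m["role"] == "tool" for m in messages):
        return {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_1",
                "function": {"name": "add",
                             "arguments": json.dumps({"a": 2, "b": 3})},
            }],
        }
    return {"role": "assistant", "content": "The sum is 5.", "tool_calls": None}

def run_tool(name, arguments):
    # Stand-in for invoking a tool on your provider (assumption).
    args = json.loads(arguments)
    if name == "add":
        return str(args["a"] + args["b"])
    raise ValueError(f"unknown tool: {name}")

def chat_loop(messages):
    while True:
        reply = call_model(messages)
        messages.append(reply)            # keep the tool-call request in history
        if not reply.get("tool_calls"):
            return reply["content"]       # no more tool calls: final answer
        for call in reply["tool_calls"]:
            result = run_tool(call["function"]["name"],
                              call["function"]["arguments"])
            messages.append({             # append each tool result as well
                "role": "tool",
                "tool_call_id": call["id"],
                "content": result,
            })

answer = chat_loop([{"role": "user", "content": "What is 2 + 3?"}])
print(answer)  # -> The sum is 5.
```

The key detail is that both the assistant message containing `tool_calls` and the matching `tool` result messages are appended to the history before the next model call, so the model can see what it asked for and what came back.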
What’s Next?
You now have a production-ready provider to use in your AI apps. Next, you’ll learn about the observability tooling available.

Next Up: How to monitor your provider and tool calls
Learn how to use the observability & logging features.