Composio equips your AI agents & LLMs with 100+ high-quality integrations via function calling
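In practice, "function calling" means the model returns a structured tool invocation that your application executes and feeds back. A minimal sketch of that loop with the OpenAI Python SDK follows; the get_weather tool and its stand-in result are hypothetical illustrations, not part of Composio's API.

```python
# Generic function-calling loop with the OpenAI Python SDK.
# The get_weather tool and its stand-in result are hypothetical examples.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Tokyo?"}]
response = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)

# If the model chose to call the tool, execute it and send the result back.
tool_call = response.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)
result = {"city": args["city"], "forecast": "sunny"}  # stand-in for a real API call

messages.append(response.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": json.dumps(result),
})
final = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
print(final.choices[0].message.content)
```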
Harness LLMs with Multi-Agent Programming
AI agent microservice
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
LLM abstractions that aren't obstructions
ReCall: Learning to Reason with Tool Call for LLMs via Reinforcement Learning
HumanLayer enables AI agents to communicate with humans in tool-based and async workflows. Guarantee human oversight of high-stakes function calls with approval workflows across Slack, email, and more. Bring your LLM and framework of choice and start giving your AI agents safe access to the world. Agentic workflows, human in the loop, tool calling
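A minimal sketch of that approval pattern, not HumanLayer's actual API; the request_approval prompt here stands in for a real Slack or email round trip.

```python
# Human-in-the-loop gate around a high-stakes tool call.
# request_approval is a hypothetical placeholder: a real deployment would
# post to Slack/email and block until a human responds.

def request_approval(action: str, args: dict) -> bool:
    # Placeholder: prompt on stdin instead of Slack/email.
    answer = input(f"Approve {action} with {args}? [y/N] ")
    return answer.strip().lower() == "y"

def delete_user(user_id: str) -> str:
    return f"user {user_id} deleted"

def guarded_call(action: str, args: dict) -> str:
    tools = {"delete_user": delete_user}
    if not request_approval(action, args):
        return "rejected by human reviewer"
    return tools[action](**args)

if __name__ == "__main__":
    print(guarded_call("delete_user", {"user_id": "42"}))
```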
Command Your World with Voice
kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It lets users chat with LLMs, execute structured function calls, and get structured output. It also works with models that are not fine-tuned for JSON output and function calls.
[ICLR'25] BigCodeBench: Benchmarking Code Generation Towards AGI
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. It implements OpenAI-compatible API endpoints, enabling seamless integration with existing OpenAI SDK clients while leveraging the power of local ML inference.
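Because the endpoints are OpenAI-compatible, an existing OpenAI SDK client only needs its base URL changed. A minimal sketch, assuming the server is running locally; the port and model id below are illustrative, check the server's docs.

```python
# Point the standard OpenAI Python SDK at a local OpenAI-compatible server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:10240/v1",  # assumed local endpoint
    api_key="not-needed",                  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="mlx-community/Llama-3.2-3B-Instruct-4bit",  # example model id
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```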
ACI.dev is the open source platform that connects your AI agents to 600+ tool integrations with multi-tenant auth, granular permissions, and access through direct function calling or a unified MCP server.
Conversational voice AI agents
GPT-4-level function-calling models for real-world tool-use cases
Action library for AI agents