Search results for the keyword "short response distillation": 23 results in total.
A Model Context Protocol (MCP) implementation enabling communication between an LLM and SQL Server. Allows users to query SQL databases using natural language and get structured SQL responses.
OmniLLM: A Model Context Protocol (MCP) server that enables Claude to access and integrate responses from multiple LLMs including ChatGPT, Azure OpenAI, and Google Gemini, creating a unified AI knowledge hub.
ProjectDocHelper is an MCP (Model Context Protocol) server designed to automatically generate project documentation and make it accessible to AI development tools like Cursor through MCP, thereby improving AI-assisted development.
An MCP server that enhances AI responses with real-time search results via Higress ai-search.
🔍 A Model Context Protocol (MCP) server providing unified access to multiple search engines (Tavily, Brave, Kagi), AI tools (Perplexity, FastGPT), and content processing services (Jina AI, Kagi). Combines these services behind a single interface.
An MCP server that provides tools for retrieving and processing documentation through vector search, either locally or hosted, enabling AI assistants to augment their responses with relevant documentation context.
Short and sweet example MCP server / client implementation for Tools, Resources and Prompts.
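To give a concrete picture of what a minimal Tools / Resources / Prompts MCP server looks like, here is a small sketch using the official MCP Python SDK's `FastMCP` helper; the server name `demo-server` and the `add` / `greeting` / `review_code` handlers are illustrative placeholders, not code from any project listed here.

```python
# Minimal MCP server sketch (assumes the official MCP Python SDK: pip install "mcp[cli]").
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")  # hypothetical server name

@mcp.tool()
def add(a: int, b: int) -> int:
    """Tool: add two integers and return the sum."""
    return a + b

@mcp.resource("greeting://{name}")
def greeting(name: str) -> str:
    """Resource: return a greeting addressed to `name`."""
    return f"Hello, {name}!"

@mcp.prompt()
def review_code(code: str) -> str:
    """Prompt template: ask the model to review a code snippet."""
    return f"Please review this code:\n\n{code}"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```

An MCP host (such as Claude Desktop or Cursor) pointed at this script would then discover the tool, resource, and prompt and invoke them over the protocol.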
🧠 MCP server implementing RAT (Retrieval Augmented Thinking) - combines DeepSeek's reasoning with GPT-4/Claude/Mistral responses, maintaining conversation context between interactions.
MCP Server for the Fillout.io API, enabling form management, response handling, and analytics.
An MCP server implementation that provides tools for retrieving and processing documentation through vector search, enabling AI assistants to augment their responses with relevant documentation context.
A Python-based Model Context Protocol (MCP) client and server that enables LLMs to send notifications via Telegram and receive user responses.
An MCP server that enables communication with users through Telegram. This server provides a tool to ask questions to users and receive their responses via a Telegram bot.
MCP Server to connect your MCP host to V0.dev directly
MCP Server Shortcut Project
An open-source tool for automated creation of short video clips, with MCP support and a README translated into Russian.
minion-agent is an open-source project that gives developers a new framework for building AI agents. Its core value is that it elegantly solves the problem of framework fragmentation: to build an AI agent today, developers have to switch between frameworks such as OpenAI, LangChain, Google AI, and SmolaAgents, since each has its own strengths and limitations, and this is one of the main obstacles in current AI agent development. minion-agent addresses this by providing…
ContextGem: effortless LLM extraction from documents. ContextGem is a free, open-source LLM framework that lets you extract structured data and insights from documents with minimal code. 💎 Why choose ContextGem…
KrillinAI: an AI video translation and dubbing tool with minimal deployment. It offers a one-stop pipeline covering video download, audio extraction, transcription, text segmentation, translation, alignment, and final synthesis in formats suited to mainstream platforms such as Douyin, Bilibili, Xiaohongshu, WeChat Channels, and Kuaishou. Built on large AI models, it provides professional-grade translation and one-click deployment of the entire workflow, and can generate videos formatted for Douyin, Xiaohongshu, Bilibili, WeChat Channels, TikTok, YouTube Shorts, and other platforms.
ViLAMP (VIdeo-LAnguage Model with Mixed Precision) is a vision-language model jointly developed by Ant Group and Renmin University of China for efficiently processing long-form video. Using a mixed-precision strategy, it keeps high-precision analysis for key frames while significantly reducing compute cost and improving processing efficiency. ViLAMP performs strongly on multiple video-understanding benchmarks and shows clear advantages on long-video understanding tasks; it can process videos of up to 10,000 frames (about 3 hours) on a single A100 GPU.
🚀🤖 Crawl4AI: an open-source, LLM-friendly web crawler and scraper. Crawl4AI is the #1 trending repository on GitHub, actively maintained by a vibrant community. It delivers blazing-fast, AI-ready web crawling tailored for LLMs, AI agents, and data pipelines. Open-source, flexible, and built for real-time performance, Crawl4AI gives developers unmatched speed, precision, and ease of deployment. ✨ Check out the latest update, v0.6.0 🎉
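For orientation, basic Crawl4AI usage looks roughly like the sketch below; it assumes the `crawl4ai` package's `AsyncWebCrawler` API and uses `https://example.com` as a stand-in URL, so consult the project's v0.6.0 README for the authoritative interface.

```python
# Rough Crawl4AI usage sketch (assumes the crawl4ai package's AsyncWebCrawler API).
import asyncio
from crawl4ai import AsyncWebCrawler

async def main() -> None:
    async with AsyncWebCrawler() as crawler:
        # Crawl one page and print the LLM-ready Markdown it produces.
        result = await crawler.arun(url="https://example.com")
        print(result.markdown)

if __name__ == "__main__":
    asyncio.run(main())
```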
MeanFlow is a new generative model whose biggest highlight is that it breaks away from the traditional training paradigm entirely: no pretraining, distillation, or curriculum learning is required, and generation is completed in a single function evaluation (1-NFE). MeanFlow reaches a 3.43 FID score on ImageNet 256×256, SOTA performance for a model trained from scratch. Figure 1 (top): one-step generation results, trained from scratch on ImageNet 256×256.
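As background on why one function evaluation can suffice, MeanFlow trains a network to predict the average velocity of a flow over an interval rather than the instantaneous velocity; the following is a hedged sketch of the quantities involved, with notation reconstructed from my reading of the paper rather than quoted from it:

$$u(z_t, r, t) \triangleq \frac{1}{t-r}\int_r^t v(z_\tau, \tau)\,d\tau,$$

and differentiating $(t-r)\,u(z_t, r, t) = \int_r^t v(z_\tau, \tau)\,d\tau$ with respect to $t$ along the trajectory yields the identity

$$u(z_t, r, t) = v(z_t, t) - (t-r)\,\frac{d}{dt}\,u(z_t, r, t),$$

which supplies a training target without any pretrained teacher or distillation stage. At sampling time a single call to the learned $u_\theta$ over the full interval maps noise to a sample in one step, which is the 1-NFE property the entry refers to.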