Commit
Merge branch 'main' into feat/add-start-scripts-everything-server
apappascs authored Jan 19, 2025
2 parents 49088d0 + 01b5cd5 commit 71da7b2
Showing 21 changed files with 171 additions and 130 deletions.
11 changes: 10 additions & 1 deletion .github/workflows/release.yml
@@ -67,6 +67,7 @@ jobs:
needs: [create-metadata]
if: ${{ needs.create-metadata.outputs.npm_packages != '[]' || needs.create-metadata.outputs.pypi_packages != '[]' }}
runs-on: ubuntu-latest
environment: release
outputs:
changes_made: ${{ steps.commit.outputs.changes_made }}
steps:
@@ -170,7 +171,7 @@ jobs:
working-directory: src/${{ matrix.package }}
run: |
VERSION=$(jq -r .version package.json)
if npm view --json | jq --arg version "$VERSION" '[.[]][0].versions | contains([$version])'; then
if npm view --json | jq -e --arg version "$VERSION" '[.[]][0].versions | contains([$version])'; then
echo "Version $VERSION already exists on npm"
exit 1
fi
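
The added `-e` flag changes the semantics of this check: without it, jq exits 0 whenever it parses its input, so the `if` would fire regardless of the boolean result; with `-e`, the exit status follows the last output value (non-zero for `false` or `null`), so the step only fails when the version really is in the published list. A rough Python equivalent of the same check, sketched against the public npm registry API (package name and version are placeholders, not values taken from this workflow):

```python
# Sketch: is `version` already published for `package` on the npm registry?
# Assumes the public registry endpoint; not part of the workflow itself.
import json
import sys
import urllib.parse
import urllib.request


def version_exists(package: str, version: str) -> bool:
    """Return True if `version` is already published for `package`."""
    # Scoped package names need their "/" encoded, e.g. @scope%2Fname.
    url = f"https://registry.npmjs.org/{urllib.parse.quote(package, safe='@')}"
    with urllib.request.urlopen(url) as resp:
        metadata = json.load(resp)
    return version in metadata.get("versions", {})


if __name__ == "__main__":
    if version_exists("@modelcontextprotocol/server-everything", "0.6.2"):  # placeholders
        print("Version already exists on npm", file=sys.stderr)
        sys.exit(1)
```
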
@@ -210,3 +211,11 @@ jobs:
gh release create "$VERSION" \
--title "Release $VERSION" \
--notes-file RELEASE_NOTES.md
- name: Docker MCP images
uses: peter-evans/repository-dispatch@v3
with:
token: ${{ secrets.DOCKER_TOKEN }}
repository: docker/labs-ai-tools-for-devs
event-type: build-mcp-images
client-payload: '{"ref": "${{ needs.create-metadata.outputs.version }}"}'
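
The new `Docker MCP images` step notifies docker/labs-ai-tools-for-devs via a `repository_dispatch` event; the receiving repository is expected to run a workflow that listens for the `build-mcp-images` event type. Under the hood this is a single GitHub REST API call, sketched below in Python with placeholder repository, payload, and token values (the action handles this for the workflow):

```python
# Sketch of the repository_dispatch request that peter-evans/repository-dispatch
# sends on the workflow's behalf; the values below are placeholders.
import json
import os
import urllib.request


def send_repository_dispatch(repo: str, event_type: str, client_payload: dict, token: str) -> None:
    """POST a repository_dispatch event to the target repository."""
    request = urllib.request.Request(
        url=f"https://api.github.com/repos/{repo}/dispatches",
        data=json.dumps({"event_type": event_type, "client_payload": client_payload}).encode(),
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    # GitHub answers 204 No Content when the event is accepted.
    urllib.request.urlopen(request)


if __name__ == "__main__":
    send_repository_dispatch(
        repo="docker/labs-ai-tools-for-devs",
        event_type="build-mcp-images",
        client_payload={"ref": "2025.1.19"},  # hypothetical version string
        token=os.environ["DOCKER_TOKEN"],     # assumed to hold a token with access to the target repo
    )
```
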
10 changes: 9 additions & 1 deletion README.md
@@ -49,6 +49,7 @@ Official integrations are maintained by companies building production ready MCP
- <img height="12" width="12" src="https://www.meilisearch.com/favicon.ico" alt="Meilisearch Logo" /> **[Meilisearch](https://github.com/meilisearch/meilisearch-mcp)** - Interact & query with Meilisearch (Full-text & semantic search API)
- <img height="12" width="12" src="https://metoro.io/static/images/logos/Metoro.svg" /> **[Metoro](https://github.com/metoro-io/metoro-mcp-server)** - Query and interact with kubernetes environments monitored by Metoro
- <img height="12" width="12" src="https://www.motherduck.com/favicon.ico" alt="MotherDuck Logo" /> **[MotherDuck](https://github.com/motherduckdb/mcp-server-motherduck)** - Query and analyze data with MotherDuck and local DuckDB
- <img height="12" width="12" src="https://needle-ai.com/images/needle-logo-orange-2-rounded.png" alt="Needle AI Logo" /> **[Needle](https://github.com/needle-ai/needle-mcp)** - Production-ready RAG out of the box to search and retrieve data from your own documents.
- <img height="12" width="12" src="https://neo4j.com/favicon.ico" alt="Neo4j Logo" /> **[Neo4j](https://github.com/neo4j-contrib/mcp-neo4j/)** - Neo4j graph database server (schema + read/write-cypher) and separate graph database backed memory
- **[Neon](https://github.com/neondatabase/mcp-server-neon)** - Interact with the Neon serverless Postgres platform
- <img height="12" width="12" src="https://qdrant.tech/img/brand-resources-logos/logomark.svg" /> **[Qdrant](https://github.com/qdrant/mcp-server-qdrant/)** - Implement semantic memory layer on top of the Qdrant vector search engine
@@ -72,15 +73,19 @@ A growing set of community-developed and maintained servers demonstrates various
- **[Atlassian](https://github.com/sooperset/mcp-atlassian)** - Interact with Atlassian Cloud products (Confluence and Jira) including searching/reading Confluence spaces/pages, accessing Jira issues, and project metadata.
- **[BigQuery](https://github.com/LucasHild/mcp-server-bigquery)** (by LucasHild) - This server enables LLMs to inspect database schemas and execute queries on BigQuery.
- **[BigQuery](https://github.com/ergut/mcp-bigquery-server)** (by ergut) - Server implementation for Google BigQuery integration that enables direct BigQuery database access and querying capabilities
- **[ChatMCP](https://github.com/AI-QL/chat-mcp)** – An Open Source Cross-platform GUI Desktop application compatible with Linux, macOS, and Windows, enabling seamless interaction with MCP servers across dynamically selectable LLMs, by **[AIQL](https://github.com/AI-QL)**
- **[ChatSum](https://github.com/mcpso/mcp-server-chatsum)** - Query and summarize chat messages with an LLM, by [mcpso](https://mcp.so)
- **[Chroma](https://github.com/privetin/chroma)** - Vector database server for semantic document search and metadata filtering, built on Chroma
- **[Cloudinary](https://github.com/felores/cloudinary-mcp-server)** - Cloudinary Model Context Protocol Server to upload media to Cloudinary and get back the media link and details.
- **[cognee-mcp](https://github.com/topoteretes/cognee-mcp-server)** - GraphRAG memory server with customizable ingestion, data processing and search
- **[coin_api_mcp](https://github.com/longmans/coin_api_mcp)** - Provides access to [coinmarketcap](https://coinmarketcap.com/) cryptocurrency data.
- **[Contentful-mcp](https://github.com/ivo-toby/contentful-mcp)** - Read, update, delete, publish content in your [Contentful](https://contentful.com) space(s) from this MCP Server.
- **[Data Exploration](https://github.com/reading-plus-ai/mcp-server-data-exploration)** - MCP server for autonomous data exploration on .csv-based datasets, providing intelligent insights with minimal effort. NOTE: Will execute arbitrary Python code on your machine, please use with caution!
- **[Dataset Viewer](https://github.com/privetin/dataset-viewer)** - Browse and analyze Hugging Face datasets with features like search, filtering, statistics, and data export
- **[DevRev](https://github.com/kpsunil97/devrev-mcp-server)** - An MCP server to integrate with DevRev APIs to search through your DevRev Knowledge Graph where objects can be imported from different sources listed [here](https://devrev.ai/docs/import#available-sources).
- **[Dify](https://github.com/YanxingLiu/dify-mcp-server)** - A simple implementation of an MCP server for dify workflows.
- **[Docker](https://github.com/ckreiling/mcp-server-docker)** - Integrate with Docker to manage containers, images, volumes, and networks.
- **[Drupal](https://github.com/Omedia/mcp-server-drupal)** - Server for interacting with [Drupal](https://www.drupal.org/project/mcp) using STDIO transport layer.
- **[Elasticsearch](https://github.com/cr7258/elasticsearch-mcp-server)** - MCP server implementation that provides Elasticsearch interaction.
- **[Fetch](https://github.com/zcaceres/fetch-mcp)** - A server that flexibly fetches HTML, JSON, Markdown, or plaintext.
- **[FireCrawl](https://github.com/vrknetha/mcp-server-firecrawl)** - Advanced web scraping with JavaScript rendering, PDF support, and smart rate limiting
Expand All @@ -104,16 +109,18 @@ A growing set of community-developed and maintained servers demonstrates various
- **[MySQL](https://github.com/benborla/mcp-server-mysql)** (by benborla) - MySQL database integration in NodeJS with configurable access controls and schema inspection
- **[MySQL](https://github.com/designcomputer/mysql_mcp_server)** (by DesignComputer) - MySQL database integration in Python with configurable access controls and schema inspection
- **[NS Travel Information](https://github.com/r-huijts/ns-mcp-server)** - Access Dutch Railways (NS) real-time train travel information and disruptions through the official NS API.
- **[Needle](https://github.com/JANHMS/needle-mcp)** - Production-ready RAG out of the box to search and retrieve data from your own documents.
- **[Notion](https://github.com/suekou/mcp-notion-server)** (by suekou) - Interact with Notion API.
- **[Notion](https://github.com/v-3/notion-server)** (by v-3) - Notion MCP integration. Search, Read, Update, and Create pages through Claude chat.
- **[oatpp-mcp](https://github.com/oatpp/oatpp-mcp)** - C++ MCP integration for Oat++. Use [Oat++](https://oatpp.io) to build MCP servers.
- **[Obsidian Markdown Notes](https://github.com/calclavia/mcp-obsidian)** - Read and search through your Obsidian vault or any directory containing Markdown notes
- **[OpenAPI](https://github.com/snaggle-ai/openapi-mcp-server)** - Interact with [OpenAPI](https://www.openapis.org/) APIs.
- **[OpenCTI](https://github.com/Spathodea-Network/opencti-mcp)** - Interact with OpenCTI platform to retrieve threat intelligence data including reports, indicators, malware and threat actors.
- **[OpenRPC](https://github.com/shanejonas/openrpc-mpc-server)** - Interact with and discover JSON-RPC APIs via [OpenRPC](https://open-rpc.org).
- **[Pandoc](https://github.com/vivekVells/mcp-pandoc)** - MCP server for seamless document format conversion using Pandoc, supporting Markdown, HTML, and plain text, with other formats like PDF, csv and docx in development.
- **[Pinecone](https://github.com/sirmews/mcp-pinecone)** - MCP server for searching and uploading records to Pinecone. Allows for simple RAG features, leveraging Pinecone's Inference API.
- **[Placid.app](https://github.com/felores/placid-mcp-server)** - Generate image and video creatives using Placid.app templates
- **[Playwright](https://github.com/executeautomation/mcp-playwright)** - This MCP server helps you run browser automation and web scraping using Playwright
- **[Postman](https://github.com/shannonlal/mcp-postman)** - MCP server for running Postman collections locally via Newman. Allows for simple collection execution and reports whether the collection passed all of its tests.
- **[RAG Web Browser](https://github.com/apify/mcp-server-rag-web-browser)** - An MCP server for Apify's RAG Web Browser Actor to perform web searches, scrape URLs, and return content in Markdown.
- **[Rememberizer AI](https://github.com/skydeckai/mcp-server-rememberizer)** - An MCP server designed for interacting with the Rememberizer data source, facilitating enhanced knowledge retrieval.
- **[Salesforce MCP](https://github.com/smn2gnt/MCP-Salesforce)** - Interact with Salesforce Data and Metadata
@@ -152,6 +159,7 @@ Additional resources on MCP.
- **[mcp-get](https://mcp-get.com)** - Command line tool for installing and managing MCP servers by **[Michael Latman](https://github.com/michaellatman)**
- **[mcp-manager](https://github.com/zueai/mcp-manager)** - Simple Web UI to install and manage MCP servers for Claude Desktop by **[Zue](https://github.com/zueai)**
- **[MCPHub](https://github.com/Jeamee/MCPHub-Desktop)** – An Open Source MacOS & Windows GUI Desktop app for discovering, installing and managing MCP servers by **[Jeamee](https://github.com/jeamee)**
- **[mcp.run](https://mcp.run)** - A hosted registry and control plane to install & run secure + portable MCP Servers.
- **[Open-Sourced MCP Servers Directory](https://github.com/chatmcp/mcp-directory)** - A curated list of MCP servers by **[mcpso](https://mcp.so)**
- **[PulseMCP](https://www.pulsemcp.com)** ([API](https://www.pulsemcp.com/api)) - Community hub & weekly newsletter for discovering MCP servers, clients, articles, and news by **[Tadas Antanavicius](https://github.com/tadasant)**, **[Mike Coughlin](https://github.com/macoughl)**, and **[Ravina Patel](https://github.com/ravinahp)**
- **[r/mcp](https://www.reddit.com/r/mcp)** – A Reddit community dedicated to MCP by **[Frank Fiegel](https://github.com/punkpeye)**
20 changes: 13 additions & 7 deletions package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion src/aws-kb-retrieval-server/Dockerfile
@@ -1,4 +1,4 @@
FROM node:22.12-alpine as builder
FROM node:22.12-alpine AS builder

COPY src/aws-kb-retrieval-server /app
COPY tsconfig.json /tsconfig.json
2 changes: 1 addition & 1 deletion src/brave-search/Dockerfile
@@ -1,4 +1,4 @@
FROM node:22.12-alpine as builder
FROM node:22.12-alpine AS builder

# Must be entire project because `prepare` script is run during `npm install` and requires all files.
COPY src/brave-search /app
2 changes: 1 addition & 1 deletion src/everart/Dockerfile
@@ -1,4 +1,4 @@
FROM node:22.12-alpine as builder
FROM node:22.12-alpine AS builder

COPY src/everart /app
COPY tsconfig.json /tsconfig.json
2 changes: 1 addition & 1 deletion src/everything/Dockerfile
@@ -1,4 +1,4 @@
FROM node:22.12-alpine as builder
FROM node:22.12-alpine AS builder

COPY src/everything /app
COPY tsconfig.json /tsconfig.json
2 changes: 1 addition & 1 deletion src/fetch/pyproject.toml
@@ -17,7 +17,7 @@ classifiers = [
]
dependencies = [
"markdownify>=0.13.1",
"mcp>=1.0.0",
"mcp>=1.1.3",
"protego>=0.3.1",
"pydantic>=2.0.0",
"readabilipy>=0.2.0",
41 changes: 21 additions & 20 deletions src/fetch/src/mcp_server_fetch/server.py
@@ -7,6 +7,7 @@
from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import (
ErrorData,
GetPromptResult,
Prompt,
PromptArgument,
@@ -79,15 +80,15 @@ async def check_may_autonomously_fetch_url(url: str, user_agent: str) -> None:
headers={"User-Agent": user_agent},
)
except HTTPError:
raise McpError(
INTERNAL_ERROR,
f"Failed to fetch robots.txt {robot_txt_url} due to a connection issue",
)
raise McpError(ErrorData(
code=INTERNAL_ERROR,
message=f"Failed to fetch robots.txt {robot_txt_url} due to a connection issue",
))
if response.status_code in (401, 403):
raise McpError(
INTERNAL_ERROR,
f"When fetching robots.txt ({robot_txt_url}), received status {response.status_code} so assuming that autonomous fetching is not allowed, the user can try manually fetching by using the fetch prompt",
)
raise McpError(ErrorData(
code=INTERNAL_ERROR,
message=f"When fetching robots.txt ({robot_txt_url}), received status {response.status_code} so assuming that autonomous fetching is not allowed, the user can try manually fetching by using the fetch prompt",
))
elif 400 <= response.status_code < 500:
return
robot_txt = response.text
@@ -96,15 +97,15 @@ async def check_may_autonomously_fetch_url(url: str, user_agent: str) -> None:
)
robot_parser = Protego.parse(processed_robot_txt)
if not robot_parser.can_fetch(str(url), user_agent):
raise McpError(
INTERNAL_ERROR,
f"The sites robots.txt ({robot_txt_url}), specifies that autonomous fetching of this page is not allowed, "
raise McpError(ErrorData(
code=INTERNAL_ERROR,
message=f"The sites robots.txt ({robot_txt_url}), specifies that autonomous fetching of this page is not allowed, "
f"<useragent>{user_agent}</useragent>\n"
f"<url>{url}</url>"
f"<robots>\n{robot_txt}\n</robots>\n"
f"The assistant must let the user know that it failed to view the page. The assistant may provide further guidance based on the above information.\n"
f"The assistant can tell the user that they can try manually fetching the page by using the fetch prompt within their UI.",
)
))


async def fetch_url(
Expand All @@ -124,12 +125,12 @@ async def fetch_url(
timeout=30,
)
except HTTPError as e:
raise McpError(INTERNAL_ERROR, f"Failed to fetch {url}: {e!r}")
raise McpError(ErrorData(code=INTERNAL_ERROR, message=f"Failed to fetch {url}: {e!r}"))
if response.status_code >= 400:
raise McpError(
INTERNAL_ERROR,
f"Failed to fetch {url} - status code {response.status_code}",
)
raise McpError(ErrorData(
code=INTERNAL_ERROR,
message=f"Failed to fetch {url} - status code {response.status_code}",
))

page_raw = response.text

@@ -221,11 +222,11 @@ async def call_tool(name, arguments: dict) -> list[TextContent]:
try:
args = Fetch(**arguments)
except ValueError as e:
raise McpError(INVALID_PARAMS, str(e))
raise McpError(ErrorData(code=INVALID_PARAMS, message=str(e)))

url = str(args.url)
if not url:
raise McpError(INVALID_PARAMS, "URL is required")
raise McpError(ErrorData(code=INVALID_PARAMS, message="URL is required"))

if not ignore_robots_txt:
await check_may_autonomously_fetch_url(url, user_agent_autonomous)
@@ -253,7 +254,7 @@ async def call_tool(name, arguments: dict) -> list[TextContent]:
@server.get_prompt()
async def get_prompt(name: str, arguments: dict | None) -> GetPromptResult:
if not arguments or "url" not in arguments:
raise McpError(INVALID_PARAMS, "URL is required")
raise McpError(ErrorData(code=INVALID_PARAMS, message="URL is required"))

url = arguments["url"]

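
The server.py changes track the `mcp>=1.1.3` bump in src/fetch/pyproject.toml: `McpError` is no longer raised with positional code and message arguments but with a single `ErrorData` object, which is why `ErrorData` joins the `mcp.types` imports above. A minimal sketch of the new pattern (the import path for `McpError` follows the mcp Python SDK and may differ between versions):

```python
# Sketch of the post-1.1.3 error-raising pattern used throughout server.py.
from mcp.shared.exceptions import McpError  # assumed import path
from mcp.types import INVALID_PARAMS, ErrorData


def require_url(arguments: dict | None) -> str:
    """Return the requested URL or raise a structured MCP error."""
    if not arguments or "url" not in arguments:
        # Old style: McpError(INVALID_PARAMS, "URL is required")
        # New style: wrap code and message in an ErrorData payload.
        raise McpError(ErrorData(code=INVALID_PARAMS, message="URL is required"))
    return str(arguments["url"])
```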