Building a Custom MCP Server for PostgreSQL
Author: Igor Drobiazko
In one of our previous posts, we introduced the Model Context Protocol (MCP) and showed how to integrate a large language model (LLM), such as Claude, with a PostgreSQL database using a ready-to-use MCP server. This integration empowers AI systems to interact with external data sources in a consistent, secure, and transparent manner.
But what if your project demands more customization, tighter security, or specialized behavior that the default server doesn't provide?
In this post, we’ll walk through how to build your own MCP server for PostgreSQL from scratch—explaining both why you might want to and how to do it.
Why You Might Want to Build a Custom MCP Server
The open-source PostgreSQL MCP server is a great starting point, but it may not fit every use case. Here’s why you might consider writing your own:
1. Precise Query Control and Access Permissions
The default server exposes general SQL execution capabilities. If your data is sensitive or if you need to strictly control what the model can query, building a custom MCP server lets you:
- Whitelist specific tables or columns
- Allow only SELECT operations
- Enforce row-level access control
- Sanitize or redact sensitive data in responses
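As a minimal sketch of the whitelisting idea, a custom server can check every requested table and column against an allow-list before building any SQL. The table and column names below are illustrative, not part of the server we build later:

```python
# Sketch: an allow-list of queryable tables and columns (names are examples)
ALLOWED_COLUMNS = {
    "employees": {"first_name", "last_name", "hire_date"},
    "departments": {"dept_name"},
}

def is_column_allowed(table: str, column: str) -> bool:
    """Return True only if both the table and the column are on the allow-list."""
    return column in ALLOWED_COLUMNS.get(table, set())
```

A tool would call such a check before interpolating any identifier into a query and refuse the request otherwise.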
2. Custom API-Like Behaviors Instead of Raw SQL
You might want to expose higher-level operations rather than raw SQL. For example, instead of letting the model ask:
SELECT AVG(salary) FROM employees;

you can offer:

{
  "function": "getAverageSalary",
  "arguments": {}
}

This way, you define the behavior behind the scenes.
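One way to sketch this pattern: the server exposes a named operation whose SQL is fixed server-side, so the model never supplies raw SQL. The `run_query` helper here is a hypothetical stand-in for whatever database function your server uses:

```python
from typing import Any, Callable, Dict, List

# Sketch: a fixed-query operation instead of raw SQL access.
# `run_query` is a stand-in for your database helper (hypothetical name).
def make_get_average_salary(
    run_query: Callable[[str], List[Dict[str, Any]]]
) -> Callable[[], List[Dict[str, Any]]]:
    def get_average_salary() -> List[Dict[str, Any]]:
        # The SQL is fixed on the server side; the model only invokes the name.
        return run_query("SELECT AVG(salary) AS avg_salary FROM employees.salaries")
    return get_average_salary
```

Because the query text lives in server code, you can change, audit, or optimize it without touching the model-facing interface.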
3. Integrate External Systems and Hybrid Workflows
Sometimes, data doesn’t live in PostgreSQL alone. You may want your MCP server to:
- Combine PostgreSQL with REST API responses
- Cache results
- Perform audits or logging
- Validate data before returning it
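For instance, result caching can be sketched with a small time-to-live cache wrapped around whatever query helper you use. All names here are our own, illustrative choices:

```python
import time
from typing import Any, Callable, Dict, List, Tuple

# Sketch: a tiny TTL cache for query results (illustrative only)
_cache: Dict[str, Tuple[float, List[Dict[str, Any]]]] = {}
TTL_SECONDS = 60.0

def cached_query(
    query: str, fetch: Callable[[str], List[Dict[str, Any]]]
) -> List[Dict[str, Any]]:
    """Return a cached result if it is still fresh, otherwise call fetch(query)."""
    now = time.monotonic()
    entry = _cache.get(query)
    if entry is not None and now - entry[0] < TTL_SECONDS:
        return entry[1]
    result = fetch(query)
    _cache[query] = (now, result)
    return result
```

The same wrapper shape works for the other ideas above: logging, auditing, or validating results before they are returned to the model.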
4. Enhanced Security and Regulatory Compliance
Organizations with strict security requirements (e.g., healthcare or finance) often need full control over their AI integration layers. Writing your own MCP server helps ensure you meet compliance standards like GDPR or HIPAA.
Step-by-Step: How to Build an MCP Server
We are going to use the official MCP Python SDK to build our server and uv for managing the Python project.
Project Setup
For this walkthrough, we'll use uv, a Python project management tool. You're free to use another manager, but uv simplifies setup for this example. Begin by initializing your project:
uv init custom-postgres-mcp
cd custom-postgres-mcp

Then, add the MCP SDK:

uv add "mcp[cli]"

Now that our project is set up, we are ready to implement our MCP server. Let's explore how to do it.
Implementing the server
Let's create a file server.py in the root directory of our new project and paste the following code into it.
import os
import psycopg2
from typing import List, Dict, Any, Optional
from mcp.server.fastmcp import FastMCP

# Database connection URL, read from the environment (the variable name and
# the fallback are our choice; adjust them to your setup)
CONNECTION_URL = os.environ.get("DATABASE_URL", "postgresql://localhost/employees")

mcp = FastMCP("Own Postgres MCP Server")

if __name__ == "__main__":
    mcp.run(transport='stdio')

Here's what this initial setup does:
- Imports the required libraries
- Creates an MCP server instance
- Launches it using the standard input/output transport (stdio), enabling local interaction
The stdio transport lets your MCP server communicate via standard input and output streams, which is ideal for CLI tools or local development environments. While MCP also supports transports like Server-Sent Events (SSE) and HTTP streaming, we'll focus only on stdio here. For other transport methods, consult the official docs.
Implementing a tool without parameters
So far our MCP server is quite empty: it does not provide any tools yet. Let's change that by implementing our first tool, which lists the tables in our employees database. The following example demonstrates how to achieve that.
# Add a PostgreSQL tool to list tables
@mcp.tool()
def list_tables() -> List[Dict[str, Any]]:
    """List tables from the employees PostgreSQL database.

    Returns:
        List of dictionaries with keys:
        - table_schema: The schema name
        - table_name: The table name
    """
    # Static SQL query to list tables
    query = """SELECT
        table_schema, table_name
    FROM
        information_schema.tables
    WHERE
        table_schema='employees'
    ORDER BY
        table_schema, table_name"""
    results = execute_query(query)
    return results
MCP tools are registered via the @mcp.tool() decorator. The MCP system uses the function's signature and docstring to automatically configure the tool:
- Function name becomes the tool's identifier
- Parameters define its input schema
- Return type specifies the expected result format
- Docstring serves as inline documentation and is used for tool discovery and runtime introspection
Our tool is implemented by querying all table names in the employees schema from the information_schema.tables table. The above tool relies on a helper function, execute_query, which we've separated for reuse across tools. Here’s how it works:
def execute_query(query: str) -> List[Dict[str, Any]]:
    """Execute a SQL query against the PostgreSQL database.

    Args:
        query: SQL query to execute

    Returns:
        List of dictionaries containing the query results
    """
    try:
        # Connect to PostgreSQL using the connection URL
        conn = psycopg2.connect(CONNECTION_URL)
        # Create a cursor and execute the query
        with conn.cursor() as cursor:
            cursor.execute(query)
            # Get column names
            columns = [desc[0] for desc in cursor.description] if cursor.description else []
            # Fetch all results and convert them to a list of dictionaries
            results = []
            for row in cursor.fetchall():
                results.append(dict(zip(columns, row)))
            return results
    except Exception as e:
        return [{"error": str(e)}]
    finally:
        if 'conn' in locals() and conn is not None:
            conn.close()

Now that you've learned how to implement a tool without parameters, let's implement another tool with an input parameter.
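To see the row-shaping step of the helper in isolation: the column names from `cursor.description` are zipped with each row tuple to produce one dictionary per row. The values below are made-up sample data:

```python
# Illustration of the row-to-dictionary conversion used in execute_query:
# column names are paired element-wise with each row tuple via zip()
columns = ["table_schema", "table_name"]
rows = [("employees", "salaries"), ("employees", "titles")]
results = [dict(zip(columns, row)) for row in rows]
```

This keyed format is friendlier for an LLM client than bare tuples, since each value arrives labeled with its column name.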
Implementing a tool with parameters
Now let's implement another tool that will describe a given table. The tool takes a table name as parameter and returns a list of column definitions.
@mcp.tool()
def get_table_columns(table_name: str) -> List[Dict[str, Any]]:
    """Get column information for a specific table in the employees PostgreSQL database.

    Args:
        table_name: The name of the table to get column information for

    Returns:
        List of dictionaries with keys:
        - table_schema: The schema name
        - table_name: The table name
        - column_name: The column name
        - data_type: The data type of the column
        - is_nullable: Whether the column allows NULL values ('YES' or 'NO')
    """
    # Reject suspicious table names: the value is interpolated into the query
    # string below, so unchecked input would allow SQL injection
    if not table_name.isidentifier():
        return [{"error": f"Invalid table name: {table_name}"}]
    # SQL query to get column information for a specific table
    query = f"""SELECT
        table_schema,
        table_name,
        column_name,
        data_type,
        is_nullable
    FROM
        information_schema.columns
    WHERE
        table_schema='employees' AND table_name='{table_name}'"""
    results = execute_query(query)
    return results

The tool above is named get_table_columns and takes a table_name parameter. This table name is used to fetch the column descriptions from information_schema.columns and return them to the client. As you can see, the implementation is very similar to the tool we implemented before. It differs only in the specification of the input parameter, and the @mcp.tool() decorator does the job for us by registering the tool with the proper configuration in the MCP server.
Now let's add our last tool, which takes an arbitrary SQL statement and executes it against our database. This tool is required to allow our users to ask the LLM questions about our database, like we did in our previous post.
@mcp.tool()
def execute_sql(query: str) -> List[Dict[str, Any]]:
    """Execute an arbitrary SQL query against the employees PostgreSQL database.

    Args:
        query: The SQL query to execute

    Returns:
        List of dictionaries containing the query results
    """
    results = execute_query(query)
    return results

You can enhance safety by validating queries, e.g. blocking DROP, DELETE without WHERE, or long-running statements, to prevent misuse or accidental data loss.
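A sketch of that validation idea follows. The rules are examples only, not a complete SQL firewall; a production server should use a real SQL parser or, better, connect with a read-only database role:

```python
import re
from typing import Optional

# Sketch: reject obviously dangerous statements before execution
# (illustrative rules, not an exhaustive safety check)
def rejection_reason(query: str) -> Optional[str]:
    """Return a human-readable reason to block the query, or None to allow it."""
    q = query.strip().lower()
    if re.match(r"^(drop|truncate|alter)\b", q):
        return "DDL statements are not allowed"
    if re.match(r"^delete\b", q) and " where " not in q:
        return "DELETE without a WHERE clause is not allowed"
    return None
```

Such a check would run at the top of execute_sql, returning the reason as an error entry instead of executing the statement.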
MCP Inspector
Now let’s test the MCP server using the MCP Inspector tool. Run this command:
uv run mcp dev server.py

Once executed, you should see something like this in your terminal.
uv run mcp dev server.py
Starting MCP inspector...
⚙️ Proxy server listening on localhost:6277
🔑 Session token: cc70523e1a2517b15ff530c47640b9e65e7d6f361f784e08add9e586d4738c7d
Use this token to authenticate requests or set DANGEROUSLY_OMIT_AUTH=true to disable auth
🚀 MCP Inspector is up and running at:
http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=cc70523e1a2517b15ff530c47640b9e65e7d6f361f784e08add9e586d4738c7d

If successful, the terminal will display a URL that opens the inspector UI in your browser. If it doesn't open automatically, copy and paste it manually.

In the MCP Inspector UI:
- Set Transport Type to STDIO
- Command: uv
- Arguments: run server.py
Then, click Connect. If connected without errors, you should be able to see a screen like the following one.

Once connected:
- Click the Tools tab
- Click List Tools
- Select a tool (e.g., list_tables)
- If applicable, enter input values
- Click Run Tool to execute and view the output
The tool should be executed and the output presented to you. The following screenshot demonstrates the result of running the tool list_tables.

🎉 Congratulations! You've built and tested your first custom MCP server. You now have full control over how your LLM interacts with your PostgreSQL database—securely, flexibly, and with custom business logic.
Real-World Example: Connecting Claude Desktop to Your Custom MCP Server
Now let's use our MCP server in a real life scenario. Suppose you want to provide your LLM access to your company's employee database to enable data analysis about employees - perhaps to generate reports, answer HR questions, or identify workforce trends. We'll use Claude Desktop as our LLM client and connect it to your employee database through a PostgreSQL MCP server.
In our previous post, we covered how to use MCP servers in Claude Desktop in detail. The following steps assume you have read that post, so we won't repeat the details here.
Update your claude_desktop_config.json to point to your custom MCP server:
{
  "mcpServers": {
    "postgres": {
      "command": "/Users/igor/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/igor/dev/custom-postgres-mcp",
        "run",
        "server.py"
      ]
    }
  }
}

Use absolute paths for reliability: in some cases, relative paths or shorthand such as a bare uv may not be resolved correctly by Claude Desktop, so we recommend full paths.
Let's review the configuration above in detail. It tells Claude Desktop to start our MCP server with the following command:

uv --directory /Users/igor/dev/custom-postgres-mcp run server.py

So we tell uv to run our server.py file in the specified directory.
Wrapping Up
In this post, we explored how to build a custom MCP server for PostgreSQL from the ground up. We discussed why you might need a custom server—from fine-grained permission control and custom business logic to multi-system integration and regulatory compliance. Then, we walked through a step-by-step implementation using the mcp Python SDK and uv, adding tools for:
- Listing database tables
- Inspecting column metadata
- Executing custom SQL queries
You also learned how to test your MCP server using MCP Inspector and how to integrate it with Claude Desktop for real-world usage.
By creating your own MCP server, you gain full control over how your LLM interacts with your data—ensuring security, transparency, and flexibility tailored to your needs.
Whether you're building internal analytics tools or deploying AI assistants in regulated industries, this approach puts you in the driver’s seat.
