Improving Code Structure in the Ghost MCP Server: A Journey Through Refactoring
Explore a comprehensive refactoring journey of the Ghost-MCP codebase, where modular design, dynamic tool discovery, and enhanced type safety transformed a monolithic structure into a maintainable, developer-friendly system. Learn practical strategies for Python project architecture.
![Improving Code Structure in the Ghost MCP Server: A Journey Through Refactoring](/content/images/2025/02/ImprovingCodeStructureintheGhostMCP.jpg)
As a software engineer working with the Ghost MCP server, I recently undertook a significant refactoring effort to improve the codebase's structure and maintainability. In this post, I'll share my journey through this process, explaining the challenges I faced and the solutions I implemented.
Understanding the Initial Structure
Let's first look at how our server initialization was originally structured:
```python
# Old server.py
def create_server() -> FastMCP:
    mcp = FastMCP(
        SERVER_NAME,
        dependencies=SERVER_DEPENDENCIES,
        description=SERVER_DESCRIPTION
    )

    # Manual registration of every tool
    mcp.tool()(tools.search_posts_by_title)
    mcp.tool()(tools.list_posts)
    mcp.tool()(tools.read_post)
    mcp.tool()(tools.create_post)
    # ... 20+ more tool registrations

    # Manual resource registration
    mcp.resource("user://{user_id}")(resources.handle_user_resource)
    mcp.resource("member://{member_id}")(resources.handle_member_resource)
    # ... more resource registrations

    return mcp
```
The Ghost MCP server had a straightforward but somewhat rigid organization. The code was primarily structured in a flat manner, with a single `tools.py` file that acted as a central hub for all tool exports. While this approach worked, it came with several limitations that became more apparent as the codebase grew.

The original `tools.py` file was essentially a long list of imports and re-exports, looking something like this:
```python
from .tools.posts import search_posts_by_title, list_posts, ...
from .tools.users import list_users, read_user, ...
# ... many more imports ...

__all__ = [
    'search_posts_by_title',
    'list_posts',
    # ... long list of exports ...
]
```
The API module had minimal documentation and type annotations:
```python
# Old api.py
async def make_ghost_request(endpoint: str, headers: dict, http_method="GET"):
    url = f"{API_URL}/{endpoint}"
    async with httpx.AsyncClient() as client:
        if http_method == "PUT":
            response = await client.put(url, headers=headers)
        else:
            response = await client.get(url, headers=headers)
        return response.json()
```
The Downsides of the Original Structure
After careful analysis, I identified several key issues with this approach:
- Monolithic Server Initialization: The original server initialization was contained in a single, large function that handled every tool and resource registration.
- Manual Maintenance: Every time I added a new tool, I needed to modify the `tools.py` file in two places: the imports and the `__all__` list. I also had to make two more changes in `server.py`. This was error-prone and created unnecessary maintenance overhead.
- Limited Type Safety: The original API implementation lacked comprehensive type hints and documentation.
- Documentation Gaps: The original structure lacked comprehensive documentation about the system's architecture and the relationships between different components.
Implementing the Improvements
Enhanced API Documentation and Type Safety
The original API module lacked comprehensive type hints and documentation. After refactoring, I introduced robust type hints and detailed docstrings:
```python
# After
async def make_ghost_request(
    endpoint: str,
    headers: Dict[str, str],
    ctx: Optional[Context] = None,
    is_resource: bool = False,
    http_method: str = GET,
    json_data: Optional[Dict[str, Any]] = None
) -> Dict[str, Any]:
    """Make an authenticated request to the Ghost API.

    Args:
        endpoint: API endpoint to call (e.g. "posts" or "users")
        headers: Request headers from get_auth_headers()
        ctx: Optional context for logging
        is_resource: Whether this request is for a resource
        http_method: HTTP method to use (GET, POST, PUT, or DELETE)
        json_data: Optional JSON data for POST/PUT requests

    Returns:
        Parsed JSON response from the Ghost API

    Raises:
        GhostError: For network issues, auth failures, rate limiting
    """
```
This improvement brought several benefits:
- IDE support now provides better autocomplete and error detection
- Developers can understand function behavior without diving into implementation
- Runtime type-related errors are caught during development
- Clear documentation of error conditions helps with error handling
Modular Component Registration
Previously, our server initialization was one monolithic function. The refactored version introduces modular registration:
```python
# After
def create_server() -> FastMCP:
    mcp = FastMCP(...)
    register_resources(mcp)
    register_tools(mcp)
    register_prompts(mcp)
    return mcp

def register_resources(mcp: FastMCP) -> None:
    """Register all resource handlers."""
    resource_mappings = {
        "user://{user_id}": resources.handle_user_resource,
        "member://{member_id}": resources.handle_member_resource,
        # ... organized mapping
    }
    for uri_template, handler in resource_mappings.items():
        mcp.resource(uri_template)(handler)
This approach makes the code more structured and easier to work with. By keeping different types of registrations separate, it’s simpler to understand and maintain. Each registration function can be tested on its own, making debugging and improvements more straightforward. The overall organization is cleaner, reducing the complexity when setting up the server.
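The tool side follows the same pattern. Here is a minimal, self-contained sketch of it: `FakeMCP` is a stand-in for FastMCP (the real server would pass the actual FastMCP instance), and the two tool functions are illustrative, not the server's real handlers.

```python
from typing import Any, Callable, Dict


class FakeMCP:
    """Tiny stand-in for FastMCP that records registered tool names."""

    def __init__(self) -> None:
        self.tools: Dict[str, Callable[..., Any]] = {}

    def tool(self) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
        # FastMCP's mcp.tool() is a decorator factory; mimic that shape.
        def decorator(func: Callable[..., Any]) -> Callable[..., Any]:
            self.tools[func.__name__] = func
            return func
        return decorator


# Illustrative tool handlers.
def list_posts() -> str:
    return "posts"


def create_post() -> str:
    return "created"


def register_tools(mcp: FakeMCP) -> None:
    """Register all tool handlers in one place, mirroring register_resources."""
    for handler in (list_posts, create_post):
        mcp.tool()(handler)


mcp = FakeMCP()
register_tools(mcp)
print(sorted(mcp.tools))  # ['create_post', 'list_posts']
```

Because each `register_*` function owns one kind of registration, a unit test can exercise it against a stub exactly like this, without starting the server.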
Dynamic Tool Discovery
The original tools package required manual exports; the new dynamic discovery system removes that bookkeeping entirely:
```python
# After - tools/__init__.py
def _import_submodules() -> Dict[str, Any]:
    """Dynamically import all modules from the current package."""
    current_dir = Path(__file__).parent
    modules: Dict[str, Any] = {}

    for py_file in current_dir.glob('*.py'):
        if py_file.name.startswith('__'):
            continue
        module_name = py_file.stem
        module = import_module(f".{module_name}", package="ghost_mcp.tools")
        modules[module_name] = module

        # Automatically re-export non-private attributes
        for attr_name in dir(module):
            if not attr_name.startswith('_'):
                globals()[attr_name] = getattr(module, attr_name)

    return modules

# Run dynamic imports and create sorted __all__
_import_submodules()
__all__ = sorted(name for name in globals() if not name.startswith('_'))
```
This system makes development smoother by automating export management, so there’s no need to handle it manually.
It reduces the risk of missing exports when adding new tools and keeps the export list consistently organized in alphabetical order. Adding new tools is as easy as creating a new file, and dynamic loading helps prevent circular import issues.
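The discovery pattern is easy to verify in isolation. The sketch below builds a throwaway package on disk and then globs its modules and hoists their public names, the same way the `tools/__init__.py` above does; the package and module names here are made up for the demo.

```python
import sys
import tempfile
from importlib import import_module
from pathlib import Path
from typing import Any, Dict


def discover(package_dir: Path, package_name: str) -> Dict[str, Any]:
    """Import every non-dunder module in package_dir and collect public names."""
    exported: Dict[str, Any] = {}
    for py_file in sorted(package_dir.glob("*.py")):
        if py_file.name.startswith("__"):
            continue
        module = import_module(f"{package_name}.{py_file.stem}")
        for attr_name in dir(module):
            if not attr_name.startswith("_"):
                exported[attr_name] = getattr(module, attr_name)
    return exported


# Create a temporary package with two "tool" modules.
root = Path(tempfile.mkdtemp())
pkg = root / "demo_tools"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "posts.py").write_text("def list_posts():\n    return 'posts'\n")
(pkg / "users.py").write_text("def list_users():\n    return 'users'\n")

sys.path.insert(0, str(root))
tools = discover(pkg, "demo_tools")
print(sorted(tools))  # ['list_posts', 'list_users']
```

Dropping a third file into the package would make its public functions appear in the export map with no other change, which is exactly the workflow the refactoring was after.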
Improved Error Handling and HTTP Method Management
The original code had only basic error handling; the refactored version introduces comprehensive error handling:
```python
# After (abridged; the full signature matches the one shown earlier)
VALID_HTTP_METHODS = {GET, POST, PUT, DELETE}

async def make_ghost_request(
    endpoint, headers, ctx=None, is_resource=False, http_method=GET
):
    if http_method not in VALID_HTTP_METHODS:
        raise ValueError(f"Invalid HTTP method: {http_method}")

    url = f"{API_URL}/{endpoint}"
    try:
        async with httpx.AsyncClient() as client:
            method_map = {
                GET: client.get,
                POST: client.post,
                PUT: client.put,
                DELETE: client.delete
            }
            method_func = method_map[http_method]
            response = await method_func(url, headers=headers)

            if http_method == DELETE and response.status_code == 204:
                return {}

            response.raise_for_status()
            if not is_resource and ctx:
                ctx.log("info", f"API Request to {url} successful")
            return response.json()
    except httpx.HTTPError as e:
        error_msg = f"HTTP error accessing Ghost API: {str(e)}"
        if response := getattr(e, 'response', None):
            error_msg += f" (Status: {response.status_code})"
        raise GhostError(error_msg) from e
```
This improved error handling provides:
- Validation of HTTP methods before making requests
- Detailed error messages including status codes
- Proper handling of successful DELETE responses
- Context-aware logging for debugging
- Clear separation between HTTP and general errors
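To show the control flow without touching the network, here is a minimal sketch of the same validate-then-wrap logic. `GhostError` and the method constants come from the post; the fake 404 branch and endpoint names are illustrative stand-ins for httpx behavior.

```python
import asyncio
from typing import Any, Dict

GET, POST, PUT, DELETE = "GET", "POST", "PUT", "DELETE"
VALID_HTTP_METHODS = {GET, POST, PUT, DELETE}


class GhostError(Exception):
    """Raised for any failure talking to the Ghost API."""


async def fake_request(endpoint: str, http_method: str = GET) -> Dict[str, Any]:
    # Validate the method before doing any work, as in the real function.
    if http_method not in VALID_HTTP_METHODS:
        raise ValueError(f"Invalid HTTP method: {http_method}")
    try:
        if endpoint == "missing":
            # Stand-in for httpx raising on a 404 via raise_for_status().
            raise RuntimeError("404 Not Found")
        if http_method == DELETE:
            return {}  # successful DELETE: 204 No Content, empty body
        return {"endpoint": endpoint}
    except RuntimeError as e:
        # Wrap transport-level failures in the domain-specific error.
        raise GhostError(f"HTTP error accessing Ghost API: {e}") from e


print(asyncio.run(fake_request("posts")))          # {'endpoint': 'posts'}
print(asyncio.run(fake_request("posts", DELETE)))  # {}
```

Callers then handle exactly two failure shapes: `ValueError` for programmer mistakes and `GhostError` for anything that went wrong on the wire.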
Real-World Impact
These changes have made a real difference in our development process:
Faster Tool Addition
- Before: Adding a new tool took 4 steps, requiring updates across multiple files.
- After: Now it takes just 2 steps: simply create a new file.

Improved Error Detection
- More type-related bugs are caught during development.
- Fewer runtime errors, leading to a more stable system.

Better Documentation & Onboarding
- New developers spend less time understanding the codebase.
- Code reviews are faster, making collaboration smoother.

Easier Maintenance
- Bug fixes are more precise thanks to clearer error messages.
- Test coverage has increased thanks to the more modular design.
These improvements have transformed my Ghost MCP server into a more robust and developer-friendly system. With dynamic tool discovery, better type safety, and modular component registration, the codebase is now not only easier to maintain but also much more enjoyable to work with.
I hope this detailed breakdown helps other developers considering similar architectural improvements in their Python projects. Remember, good architecture isn't about following rules blindly – it's about making thoughtful decisions that benefit your specific use case and team.