Prototyping an MCP Server for Geocodio

August 05, 2025

MCP is so hot right now, so I spent a Hack Friday building a server that lets AI assistants enrich location data with Congressional districts, Census demographics, and more. The result? A glimpse into how we might all be using APIs in the future.

MCP is so hot right now.

I mean, everyone's talking about the Model Context Protocol and for good reason. MCP is an open protocol that lets you connect AI tools to various data sources in a standardized way. No more copy-pasting API calls into chat windows or building bespoke integrations for every project.

Every Friday here at Geocodio, we dedicate a few hours to experiment with new tech and build prototypes. We call it Hack Friday. This felt like the perfect time to build an MCP server that would make our data append features super easy to use with AI assistants. Let's dive in...

Why MCP + Geocodio?

One of the things I love about MCP is how it turns complex integrations into simple tools. Instead of explaining to an AI assistant how to call our API, format the request, handle errors, and parse the response, you just... use a tool. And it maps neatly onto what our customers actually need: enriching location data with political, demographic, and geographic information.

The protocol handles all the plumbing. Your AI assistant gets clean, typed tools it can use, and you get consistent, reliable data enrichment. No more debugging malformed API calls at 2 AM.
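To make that contrast concrete, here's roughly what the manual path looks like: a hand-rolled request to our geocoding endpoint with a field append, plus the parsing and error handling around it. Treat this as an illustrative sketch; the endpoint version and the exact response shape are simplified from memory.

// Illustrative sketch of the manual path: call the geocoding endpoint
// directly, ask for the 'cd' (Congressional district) append, then dig
// the district out of the response. Endpoint version and response shape
// are simplified here.
async function lookupDistrictByHand(address: string): Promise<string | undefined> {
  const url = new URL("https://api.geocod.io/v1.7/geocode");
  url.searchParams.set("q", address);
  url.searchParams.set("fields", "cd");
  url.searchParams.set("api_key", process.env.GEOCODIO_API_KEY ?? "");

  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Geocoding request failed with status ${response.status}`);
  }

  const data = await response.json();
  return data.results?.[0]?.fields?.congressional_districts?.[0]?.name;
}

With an MCP server in place, that whole block collapses into a tool call the assistant makes on its own.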

Building the Prototype

Setting up an MCP server is surprisingly straightforward. Here's the core structure I ended up with:

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import Geocodio from "geocodio-library-node";

class DataAppendServer {
  private server: McpServer;
  private geocodio: Geocodio;

  constructor(apiKey: string) {
    this.geocodio = new Geocodio(apiKey);
    this.server = new McpServer(
      { name: "mcp-data-append-server", version: "1.0.0" },
      { capabilities: { tools: {} } }
    );
    this.setupTools();
  }

  async start(): Promise<void> {
    const transport = new StdioServerTransport();
    await this.server.connect(transport);
    console.error("MCP Data Append Server running on stdio");
  }
}

The beauty here is that MCP handles all the communication protocol stuff. I just needed to focus on implementing the actual tools that would be useful for location data enrichment.

Implementing Tools

Let's take a look at how I implemented the Congressional district lookup tool. This was pretty easy to hook up:

private setupTools(): void {
  this.server.tool(
    "get_congressional_district",
    {
      address: z.string().optional()
        .describe("The address to get congressional district for"),
      latitude: z.number().min(-90).max(90).optional()
        .describe("The latitude coordinate"),
      longitude: z.number().min(-180).max(180).optional()
        .describe("The longitude coordinate"),
    },
    async ({ address, latitude, longitude }) => {
      if (!address && (!latitude || !longitude)) {
        throw new Error("Either address or latitude/longitude coordinates are required");
      }

      const input = address || `${latitude},${longitude}`;
      const result = await this.geocodio.geocode(input, ['cd'], 1);

      return this.formatResponse(result);
    }
  );
}

What I really like about this approach is the schema validation with Zod. The tool parameters are typed and validated automatically, and the descriptions show up in the AI assistant's interface. Super clean.
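One piece that isn't shown above is formatResponse. It isn't doing anything clever; here's a minimal sketch of what it might look like, assuming the tool simply hands the raw Geocodio result back as JSON text in the content shape MCP tool results expect:

// Lives on DataAppendServer alongside setupTools(). A minimal sketch:
// wrap the raw Geocodio result as a JSON text block, which is the
// content shape MCP tool results use.
private formatResponse(result: unknown) {
  return {
    content: [
      {
        type: "text" as const,
        text: JSON.stringify(result, null, 2),
      },
    ],
  };
}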

Demo Time

Alright, let's see this thing in action...

Basic Address Enrichment

[MCP Demo 1]

Multi-Field Location Enrichment

[MCP Demo 2]

The Development Experience

Building this prototype was a pretty smooth experience. The whole thing came together in just a few hours during Hack Friday, which honestly surprised me. The MCP SDK does a lot of heavy lifting: handling the stdio transport, managing tool registration, and dealing with all the protocol details. I could focus on what actually mattered: making useful tools for location data enrichment.
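The multi-field enrichment you saw in the second demo follows exactly the same pattern as the Congressional district tool. Here's a hedged sketch of what such a tool might look like; the tool name and the specific field identifiers are my illustration, not necessarily what the prototype shipped with:

// Hypothetical sibling of get_congressional_district, registered inside
// setupTools(). The field identifiers are examples of Geocodio data
// appends and may not match the prototype exactly.
this.server.tool(
  "enrich_location",
  {
    address: z.string().describe("The address to enrich"),
    fields: z.array(z.string())
      .default(["census", "timezone"])
      .describe("Geocodio data append fields to include"),
  },
  async ({ address, fields }) => {
    const result = await this.geocodio.geocode(address, fields, 1);
    return this.formatResponse(result);
  }
);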

One thing I did need was proper error handling. Geocodio's API can fail for various reasons: invalid API keys, rate limiting, or just bad addresses. The MCP protocol surfaces errors gracefully, so inside the tools all I needed to do was throw meaningful error messages. The one place I fail hard is a missing API key at startup:

if (!apiKey) {
  console.error("Error: GEOCODIO_API_KEY environment variable is required");
  process.exit(1);
}
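For context, that check runs in the server's entry point before anything else. A minimal sketch of what the rest of that entry point might look like, assuming the DataAppendServer class from earlier (the main() wrapper here is my own framing):

// Hypothetical entry point wrapping the class shown earlier.
async function main(): Promise<void> {
  const apiKey = process.env.GEOCODIO_API_KEY;
  if (!apiKey) {
    console.error("Error: GEOCODIO_API_KEY environment variable is required");
    process.exit(1);
  }

  const server = new DataAppendServer(apiKey);
  await server.start();
}

main().catch((error) => {
  console.error("Fatal error:", error);
  process.exit(1);
});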

The schema validation turned out to be super helpful too. By using Zod schemas with good descriptions, the AI assistant knows exactly what parameters each tool expects. No more guessing about whether it's "lat" or "latitude" or "lat_coord".

Lessons from the Experiment

Building this prototype really got me thinking about API accessibility in the age of AI assistants. We spend so much time crafting beautiful REST APIs, writing comprehensive docs, building SDKs... but what if the future is AI assistants calling our APIs for users?

The MCP approach feels like a glimpse of that future. Instead of teaching users how to construct API calls, we give their AI assistants the tools they need. The assistant handles the implementation details while users just describe what they want.

What surprised me most was how little code it took. A few hours on a Hack Friday, and I had a working integration that made our data append features feel native to Claude. No more context switching, no more looking up field names, no more debugging JSON responses.

This experiment has me rethinking how we might want to present our APIs going forward. Should we be building MCP servers alongside our traditional SDKs? Is this how developers will want to interact with services like ours in the future?

I don't have all the answers yet, but I'm super intrigued by the possibilities. The fact that I've already been using this internally for our own data analysis says something about the potential here.

For now, this remains a fun Hack Friday experiment. But who knows... maybe this is a preview of how we'll all be working with APIs in a few years.
