Creating Per-Project MCP Servers

I must confess, first and foremost, that I am not a fan of MCP as a protocol: I find it overly complex (why shunt that much JSON around and waste tokens parsing it?), badly designed for application scenarios (it is completely redundant if you have good Swagger specs for your APIs, not to mention poorly secured), and it has generated so much hype that I instinctively shied away until the dust settled.

But a few months ago I found a (genius) minimalist stdio MCP server (in bash, of all things), turned it into a Python library (with both synchronous and asyncio flavors) and decided to build a couple of tools to extend GitHub Copilot inside VS Code.

And then one day I built another tool for another workspace. And another. And after three or so, a pattern emerged: even though VS Code provides Copilot with local filesystem and code structure information, I often needed specialist tools to help with things like:

  • Looking up snyk vulnerabilities in package.json files
  • Validating internal wiki links in Markdown files
  • Converting old Textile markup to Markdown
  • Adding or updating YAML front-matter in blog posts
  • Bulk-renaming (or linting/formatting) files according to some pattern

And I only needed those tools in one workspace at a time, so having a zillion tools available all the time was pointless (and confused the LLM).

So I started including simple task-specific servers in my repositories.
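That sounds more elaborate than it is. Here is a deliberately dumbed-down sketch of what one of these servers boils down to; this is not umcp itself, just an illustration of the protocol handshake with a single hypothetical tool, using nothing but the standard library:

#!/usr/bin/env python3
"""Illustrative single-tool stdio MCP server (not umcp itself)."""
import json
import sys

TOOLS = [{
    "name": "check_links",
    "description": "Validate internal wiki links in a Markdown file.",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}]

def check_links(path: str) -> str:
    # Placeholder: the real tool would parse the file and resolve each link.
    return f"checked {path}: no broken links"

def handle(msg: dict):
    method = msg.get("method")
    if method == "initialize":
        return {
            "protocolVersion": "2024-11-05",
            "capabilities": {"tools": {}},
            "serverInfo": {"name": "wiki-helper", "version": "0.1"},
        }
    if method == "tools/list":
        return {"tools": TOOLS}
    if method == "tools/call":  # single tool, so the "name" param is ignored here
        result = check_links(**msg["params"].get("arguments", {}))
        return {"content": [{"type": "text", "text": result}]}
    return None  # notifications (e.g. notifications/initialized) get no reply

for line in sys.stdin:  # one JSON-RPC message per line
    if not line.strip():
        continue
    msg = json.loads(line)
    result = handle(msg)
    if "id" in msg and result is not None:
        reply = {"jsonrpc": "2.0", "id": msg["id"], "result": result}
        print(json.dumps(reply), flush=True)

That is essentially the whole dance: an initialize handshake, tool discovery and tool calls, exchanged as newline-delimited JSON-RPC over stdin/stdout.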

Configuring VS Code

As it happens, this very site is a good example. The git repository I use for the content has a wiki-helper server that makes it easy for me to check internal page links and perform chores like converting old Textile markup or adding data to YAML files when publishing a post.
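The Textile conversion, for instance, is mostly regular expressions all the way down. A trimmed-down, hypothetical version of that tool's core would look something like this (it only handles headings, links, bold and emphasis):

import re

H_RE = re.compile(r"^h([1-6])\.\s+(.*)$", re.M)      # h2. Title -> ## Title
LINK_RE = re.compile(r'"([^"]+)":(\S+)')             # "text":url -> [text](url)
BOLD_RE = re.compile(r"(?<!\w)\*([^*\n]+)\*(?!\w)")  # *bold* -> **bold**
EM_RE = re.compile(r"(?<!\w)_([^_\n]+)_(?!\w)")      # _em_ -> *em*

def textile_to_markdown(text: str) -> str:
    """Convert a small subset of Textile markup to Markdown."""
    text = H_RE.sub(lambda m: "#" * int(m.group(1)) + " " + m.group(2), text)
    text = LINK_RE.sub(r"[\1](\2)", text)
    text = BOLD_RE.sub(r"**\1**", text)
    text = EM_RE.sub(r"*\1*", text)
    return text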

VS Code stores its workspace preferences in a .vscode folder inside your repository, and the Copilot extension in particular gets its starting context from inside .github, so right now things look like this:

.github
└── copilot-instructions.md
.vscode
├── extensions.json
├── mcp.json
└── settings.json
tools
├── wiki_mcp.py
└── umcp.py

Since I do not need a virtual environment for this particular repository, I can just dump umcp.py alongside my server, and configure mcp.json to invoke it like this:

cat .vscode/mcp.json
{
    "servers": {
        "wiki-helper": {
            "type": "stdio",
            // Invoke the interpreter directly; the script path is the first argument.
            "command": "python3",
            "args": [
                "tools/wiki_mcp.py"
            ],
            // Explicitly point WIKI_ROOT at the workspace (the script also has a sane default).
            "env": {
                "WIKI_ROOT": "${workspaceFolder}/space"
            }
        }
    }
}

This is actually one of my simplest servers (it mostly uses regexps, Path and a few more standard library functions), but I also have other projects that require specific libraries, so I use uv to run the server in those.
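In those cases the server script carries PEP 723 inline metadata, so uv can resolve its dependencies on the fly:

# /// script
# dependencies = ["pyyaml", "httpx"]
# ///

...and the mcp.json entry just swaps the command (foobar-helper and its dependencies above are hypothetical):

{
    "servers": {
        "foobar-helper": {
            "type": "stdio",
            "command": "uv",
            "args": ["run", "tools/foobar_mcp.py"]
        }
    }
}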

Also, besides including in copilot-instructions.md a short description of what the project is, what coding conventions and tooling I'm using, etc., I typically add something like "use tools from the foobar server to do these tasks".
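As a made-up example of what that excerpt might look like:

Use the tools from the `wiki-helper` MCP server to validate internal links,
convert legacy Textile markup and update YAML front matter; do not attempt
to do those chores by editing files directly.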

Configuring Zed

Configuring Zed is very similar, since it too allows you to have per-workspace settings:

 cat .zed/settings.json 
{
  "context_servers": {
    "wiki-helper": {
      "source": "custom",
      "command": "python3",
      "args": ["tools/wiki_mcp.py"],
      "env": {}
    }
  }
}

However, Zed does not seem to have an easy way to pass the workspace root as a variable, so I've had to hack that into the server itself. If anyone knows how to do that, please let me know.
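The hack itself is trivial; assuming the editor spawns the server with the workspace as its working directory (which is what it appears to do), the server can simply fall back to that:

import os
from pathlib import Path

# Use WIKI_ROOT when the editor can pass it (VS Code);
# otherwise assume we were spawned inside the workspace (Zed).
WIKI_ROOT = Path(os.environ.get("WIKI_ROOT", Path.cwd() / "space")).resolve()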

Additionally, Zed is rather picky about MCP protocol versions (as of this writing it actually errors out if the server reports anything other than 2025-03-26 or 2024-11-05, which is a bit too limiting).

Either way, this approach has proven highly effective for maintaining per-project tooling. It not only automates repetitive chores but also keeps LLM usage (and cost) down, since gpt-4o or gpt-5-mini can drive these tools just fine and I can avoid more expensive models like gpt-5 or Claude, which incur premium charges in GitHub Copilot.

In particular, having a tool to check internal wiki links has been a godsend, since I can now just prompt Copilot to "reformat this post according to repository patterns" before publishing and be reasonably sure that I won't have broken links (which is a pet peeve of mine):

Here's a recent example

The fact that I can also use Copilot to help me write the servers themselves with full context of what the project is about and how files are laid out is just icing on the cake, even if it smacks of self-improving AI.

Notes for September 22-28

It was a moderately exciting week work-wise (in a positive way), but a recurrence of the highly disruptive habit people have of booking meetings the very next day or early the day after (even when any sort of effective work would take a day or so to yield finished results) made it hard to, well, do anything at all…

But there were a few things of note:

Health

Over the past semester I’ve been gradually increasing the amount of daily exercise I aim for, and I’m getting new minor aches and pains every day that seem to stem directly from a continuous (but moderate) exercise streak. Between a compact treadmill I got at the end of January and a few other tricks, I slowly nudged myself to the point where I now need to pause work mid-morning and take a brisk walk, so that’s a good milestone to keep track of.

Hardware

I got a LattePanda Iota in the mail along with a bunch of add-on boards, and initial impressions are great—I’ve put up a very short video on it and am putting the board through its paces. In the process I’m realizing things take four times as long if I have to capture video, which is one of the reasons it’s taken me this long to get even moderately serious about YouTube.

Video Editing

I am stubbornly pursuing two approaches to video editing—a cross-platform approach using as a desktop editing and compositing tool (which is OK except that video stabilization and audio editing are too much of a manual process to be enjoyable) and, simultaneously, trying to use my iPad as a video editing station.

The latter is mostly winning solely because I can do it quietly on the couch (and bed) in the evenings (and nights), but I am constantly switching apps to figure out how best to manage media, edit voice-overs, add simple titles that I can have some control over, etc.

This has meant experimenting with various video editing techniques as well, and right now is squarely trouncing Final Cut Pro on account of its image stabilization and its flexible, editable titles that weren't designed by hippies with Victorian influences, even if its iPad UX was apparently designed for ants and I can't seem to record a voice-over directly in it (which is a pain).

Homelab

My , so I spent a couple of not-so-entertaining hours rejiggering my Cloudflare tunnels and discovered selkies, which had flown under my radar until now. It works OK atop a tunnel (to the point where I can use this container image to run remotely without a lot of fuss), but it’s still slower than RDP and for some reason the Remmina image has a weird runaway CPU usage bug.

I used it as a rather roundabout way to share my Linux desktops in video calls since (to my continuous frustration) Cloudflare’s RDP web client still doesn’t work with xrdp, so I spent a while trying to figure out a fix (to no avail yet). So right now I’m rebuilding my bastion to run… Windows (don’t ask).

As an encore, I with an arguably better “roaming” setup, the effects of which are only noticeable if you’re doing video conferencing as you walk around the house to check if all the windows are shuttered (ask me how I know).

Coding

I refactored one of my projects to use sqlite-vector instead of sqlite-vec with reasonable success, but that one’s still hampered by the need to use fastembed, and that is just dog slow on a CPU-only setup.

Notes for September 15-21

A rather hectic week as work ramps up again and I start to progressively lose control of my calendar, but I’ve managed to slowly accrete some notes.


Notes on the Liquid Glass Tsunami

Following my little saga with the , I upgraded a few of my Apple devices, including one of my Macs, to the final release versions of all the “26” operating systems, and… It’s even worse than I thought.


Notes for September 8-14

I’m now fully back to work, so there hasn’t been any free time for anything but finishing overdue posts. I have, however, managed to sneak in a few leisurely half-hours in the mornings reading work e-mail from my balcony before my calls start, which has been a great way to enjoy the lingering summertime.


The Cudy AX3000 Wi-Fi 6 System (with OpenWRT)

As I’ve been writing about or , I’ve recently upgraded my Wi-Fi after an attempt to use ISP-provided equipment to replace my remarkably long-lasting (and extremely reliable) base stations.


The Chuwi AuBox 8745HS

As regular readers will know, I am quite fond of the various Ryzen APUs that have hit the market over the past couple of years, and I take a look at them whenever I can, since they have proven to be quite popular options, partly because of their high core counts and partly because of their increasingly powerful iGPUs.


The iPhone 17 Event

This time I actually forgot to watch the live event.


Notes for September 1-7

Summer break is now completely over, so I did my usual Summer “cleansing”—disabling notifications from annoying apps, unsubscribing from a few more online services, ditching a half dozen YouTube channels, and (surprisingly) keeping my Twitter/X account afloat. I also poked at BlueSky with a metaphorical stick, only to find it very much alive.

