If you’re a developer, you’ve probably heard about MCP — the Model Context Protocol. It’s the hottest thing in AI, and posts and videos about it are appearing everywhere. Maybe you’ve even looked at the documentation, nodded along with the technical explanations (“it’s a protocol for connecting AI models to tools”), and thought, “But why would I want to use this instead of just calling tools directly? Why go through some protocol layer when I could access them myself?”
At any rate, that was my reaction. The MCP documentation was clear about what MCP did, but I couldn’t see why I should care. Then I stumbled across a blog post by Daniela Petruzalek about making your computer respond like the Enterprise computer from Star Trek, and everything clicked. Let me show you the moment MCP went from confusing concept to indispensable tool.
This post is a text version of a YouTube video I published recently. If you’d rather watch the video, here it is:
If you’d rather read the details, please continue. 🙂
The SQL Barrier
The blog post recommended OSQuery, a powerful open source system monitoring tool you can install on a Mac. You can query just about anything about your system state — CPU usage, memory consumption, network connections, temperature sensors, running processes, you name it. There’s just one catch: you have to write SQL to use it.
Say I want to know why my fan is running. I need to know that OSQuery maintains a temperature_sensors table and its columns, so I can write something like:
SELECT name, celsius
FROM temperature_sensors
WHERE celsius > 70;

Then I need to cross-reference that with processes using the CPU, which means knowing about the processes table and writing more SQL. It’s powerful, but it requires me to remember table names, column names, and the right syntax for OSQuery (which is a superset of SQLite, in case you were wondering), so there’s a learning curve involved.
But if I wrap OSQuery with an MCP service, I can connect it to an AI client and just ask, “Why is my fan running?” in plain English. The LLM translates my question into the right SQL queries for OSQuery, runs them through the service, and converts the answers (which come back as JSON) into conversational English. It’s like having a systems expert who speaks English, SQL, and JSON (not to mention OSQuery) at your disposal.
This is one of the key reasons why MCP matters. It’s not about replacing your tools — it’s about making them speak your language. The AI becomes your translator between human intent and technical syntax.

Avoiding The Docker Tax
Once I understood this pattern, I started seeing MCP opportunities everywhere. Take GitHub integration, which is one of the classic examples in the MCP documentation.
The official GitHub MCP service exists, but it suggests using Docker to run it. When you integrate it into something like Claude Desktop, you have to embed your personal access token right inside the config file in plain text.
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "GITHUB_PERSONAL_ACCESS_TOKEN",
        "mcp/github"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}

That gives me two issues:
- Docker can be a resource hog
- I don’t want my personal access token sitting in a plain text config file
(Note: Docker recently introduced its own Docker MCP Toolkit, which lets you manage MCP servers and their secrets through Docker itself. That avoids embedding the token in plain text but still, of course, runs through Docker.)
So instead of using their service, I wrapped the local gh command (the GitHub CLI, which I installed using Homebrew) inside my own MCP service. Now I can do all my authentication right at the command line. No secrets in config files, and no Docker required.
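The wrapper itself doesn’t need much. Here’s a minimal sketch of the idea using Spring AI’s @Tool annotation and Java’s ProcessBuilder (the class and method names are my illustration here, not necessarily what’s in the repo):

import org.springframework.ai.tool.annotation.Tool;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

@Service
public class GitHubService {

    @Tool(description = """
            Run a GitHub CLI (gh) command and return its output.
            The argument is everything after 'gh', for example:
            'pr list', 'issue list', or 'repo view'""")
    public String executeGhCommand(String arguments) {
        // naive whitespace split; fine for a sketch, not for quoted arguments
        List<String> command = new ArrayList<>(List.of("gh"));
        command.addAll(List.of(arguments.trim().split("\\s+")));
        try {
            // gh reuses the credentials from 'gh auth login',
            // so no token ever appears in a config file
            Process process = new ProcessBuilder(command)
                    .redirectErrorStream(true)
                    .start();
            String output = new String(process.getInputStream().readAllBytes());
            process.waitFor();
            return output;
        } catch (IOException | InterruptedException e) {
            return "Error running gh: " + e.getMessage();
        }
    }
}

A real version should probably whitelist subcommands (you don’t want the LLM improvising a gh repo delete), but the shape is just: receive a string, shell out, return the output.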
For example, I can ask: “What are the last five commits on my OSQuery MCP server project?” The LLM translates that into the right gh commands, runs them locally with my existing authentication, and formats the results nicely.

No Docker, no exposed tokens, just clean local execution with a simple interface.
Building MCP Services with Spring AI
Here’s where it gets interesting for Java developers. I implemented my MCP services using Spring AI, which provides starters for both MCP servers and MCP clients.

If you already know Spring, you can use all the regular Spring patterns and mechanisms. The key for MCP is the @Tool annotation:
@Tool("Get system temperature information")
public String getTemperatureInfo() {
// call executeOsquery with the proper SQL
}
@Tool(description = """
Execute osquery SQL queries to inspect system state.
Query processes, users, network connections, and other OS data.
Example: SELECT name, pid FROM processes""")
public String executeOsquery(String sql) {
// ...
}Each method annotated with @Tool gets exposed to clients as an available MCP tool (function). The LLM reads the description and parameter info and invokes the method when appropriate.
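To publish those tools over MCP, Spring AI’s server starter just needs a ToolCallbackProvider bean that wraps the service. A sketch, assuming the methods above live in a class called OsqueryService:

import org.springframework.ai.tool.ToolCallbackProvider;
import org.springframework.ai.tool.method.MethodToolCallbackProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class McpServerConfig {

    // every @Tool-annotated method on the service becomes an MCP tool
    @Bean
    public ToolCallbackProvider osqueryTools(OsqueryService osqueryService) {
        return MethodToolCallbackProvider.builder()
                .toolObjects(osqueryService)
                .build();
    }
}

For a stdio-based server, you’ll also typically set spring.main.web-application-type=none and spring.main.banner-mode=off in application.properties so that nothing but protocol messages reaches standard output.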
The architecture for creating your own MCP service wrapper is straightforward:
- Write a Spring AI service exposing as many useful methods as you want with @Tool annotations.
- Authenticate through normal Spring mechanisms.
- Build the service (in Java, an executable jar file) so your MCP client can access it.
- Configure the client to access the service, and you’re done.
You get normal dependency injection, configuration management, and all the enterprise patterns you’re used to in Spring. Spring AI handles all the MCP protocol conversions under the hood, and returns responses in a readable form.
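The client side (the last step in the list above) looks a lot like the Docker example earlier, except it launches your jar. In Claude Desktop, the entry is something like this, where the server name and jar path are placeholders:

{
  "mcpServers": {
    "osquery": {
      "command": "java",
      "args": [
        "-jar",
        "/path/to/osquery-mcp-server.jar"
      ]
    }
  }
}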
Going Full Star Trek
Remember that blog post about the Enterprise computer? I figured, what the heck, I’ll build a Spring AI-based client with a JavaFX front-end and throw in voice command translation.

I have code that records from my microphone, transcribes the audio into English text, figures out what commands to invoke, and displays the results. I can literally hold down a button and say: “Computer, run a level one diagnostic.”
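The transcription step is only a few lines with Spring AI’s OpenAI audio support. Here’s a simplified sketch of the idea (not the exact code from the repo):

import org.springframework.ai.audio.transcription.AudioTranscriptionPrompt;
import org.springframework.ai.audio.transcription.AudioTranscriptionResponse;
import org.springframework.ai.openai.OpenAiAudioTranscriptionModel;
import org.springframework.core.io.FileSystemResource;
import org.springframework.stereotype.Service;

import java.io.File;

@Service
public class VoiceService {

    private final OpenAiAudioTranscriptionModel transcriptionModel;

    public VoiceService(OpenAiAudioTranscriptionModel transcriptionModel) {
        this.transcriptionModel = transcriptionModel;
    }

    // turn the recorded audio file into English text for the chat model
    public String transcribe(File audioFile) {
        AudioTranscriptionResponse response = transcriptionModel.call(
                new AudioTranscriptionPrompt(new FileSystemResource(audioFile)));
        return response.getResult().getOutput();
    }
}

From there, the transcribed text goes to a chat model with the MCP tools registered, which decides which commands to invoke.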
It works. It’s not terribly elegant (and, like always, my UI could use some work), but I am literally talking to my computer like it’s on the Enterprise and it responds with actual system information.
(I did not add in voice capabilities for the UI to read the response back to me. I don’t think I want that wall of text read out loud. I might, however, have the LLM summarize it and read me the summary. We’ll see.)
When to Write an MCP Wrapper
MCP shines when you need to bridge natural language and technical interfaces. You’re putting a conversational face on complicated tools. Look for tools in your workflow that require translation between what you want and the technical syntax:
- Complex command line utilities
- SQL databases with complicated queries
- APIs with dozens of parameters
- Configuration systems
- Build tools
Anything that could benefit from a conversation in front of it is a good candidate. If you find yourself thinking, “How do I call this? What are the details? I wish the computer would just go ahead and invoke it for me,” you’ve found a good fit for MCP.
So, what have we learned?
MCP isn’t about replacing your tools or your development job. It’s about making those tools speak your language by putting a friendly, conversational interface in front of them. Instead of learning each new API parameter and command-line flag, you focus on what you want to accomplish and let the AI handle the translation. That’s the real breakthrough with MCP: it’s not just some protocol, it’s a way to make the tools in your toolchain understand you.
MCP changed how I think about AI integration. It’s not about the AI doing the work — it’s about the AI understanding what I want and translating that into the technical details I’d otherwise have to remember or look up.
If you’d like to access the code for all these applications, here are my GitHub repositories referenced above:
- https://github.com/kousen/gh_mcp_server (gh wrapper)
- https://github.com/kousen/OsqueryMcpServer (OSQuery wrapper) (Naming consistency? What’s that? Never heard of it)
- https://github.com/kousen/starfleet-voice-interface (Spring Boot MCP client with OpenAI voice transcription and a JavaFX front end)
Good luck! For more information on this and related topics, see my free weekly Tales from the jar side newsletter and YouTube channel.