Trying out MCP and my thoughts on it
Over the last few days, I have been trying out the Model Context Protocol, testing it with Claude Desktop. It’s amazing and I personally love it. Model Context Protocol (MCP) is an open protocol designed to standardize how AI applications, particularly those using Large Language Models (LLMs), interact with external data sources and tools. It’s like a USB-C port for AI applications.
For example, I have the Claude Desktop application. Normally, I can ask questions to Claude’s models, but they do not have access to the files and folders on my computer. Currently, Claude does not have internet access either. But there are tools that you can provide, or you can create your own tools (Claude Doc, OpenAI Doc) and provide them to LLMs. That way, LLMs have better context and are able to answer more accurately. If you still haven’t tried it, I would suggest you go look at the docs and try it out.
For my use case and testing, I use the Claude Desktop application as my client; there are quite a number of other client apps as well. You can check them out and try them. Currently, most of the client apps are coding editors. But in the future, I am quite sure there will be more standalone apps that do everything for you, be it coding, browsing, computer use… Literally everything, in my opinion.
I created two CLI applications using Claude with the MCP Filesystem server. One is a Golang port of the https://github.com/simonw/files-to-prompt Python package. The other one, https://github.com/setkyar/cmcp, is a CLI app that lets you configure Claude MCP servers from the command line. I find it quite amazing. I know I should probably be doing this kind of coding via code editors that support MCP, but I tried it this way to understand what the experience is like with the Claude app. Also, I don’t have to spend extra money because I am a Claude Pro user.
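For context, configuring MCP servers for Claude Desktop (which is what cmcp automates) comes down to editing a `claude_desktop_config.json` file that lists the servers the app should launch. A minimal entry for the Filesystem server looks roughly like this; the directory path is a placeholder you would replace with your own:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/Documents"
      ]
    }
  }
}
```

Claude Desktop starts each listed server as a subprocess and talks to it over stdio, so adding a server is just adding another entry here.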
If you are a developer, you should check out how to create MCP clients and servers, especially servers, as they are what provide context and tools to the LLMs. For the client side, I think all the main LLM applications will have their own app with MCP integration. Of course, there will also be standalone apps that integrate with popular LLMs, like Goose did. So far, I don’t think there is a really good client app that does everything for you. I believe there will be more soon.
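To get a feel for what a server actually speaks before diving into the SDKs, it helps to know that MCP messages are JSON-RPC 2.0. Below is a minimal sketch of the request a client sends to invoke a server tool; the tool name `read_file` and its arguments are made-up placeholders, not part of any specific server:

```python
import json


def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool invocation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# Hypothetical call: ask a filesystem-style server to read a file.
req = make_tool_call(1, "read_file", {"path": "notes.txt"})
print(json.dumps(req, indent=2))
```

The server replies with a matching `id` and a result payload, which is what makes servers from different authors interchangeable: the client never needs to know anything tool-specific beyond this envelope.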
What I am imagining for LLMs and MCP is something like the following. On your computer, there will be a standalone app. You might run your own MCP servers locally based on your own needs, since they are standardized. There will also be quite a lot of MCP servers that you can connect to via API. Let’s say you want to book a hotel: you should be able to do so via an LLM app, which communicates with the hotel’s website via MCP. This is just one example, but I think you get what I mean. Goose is close to what I was imagining, but the experience could be more seamless.

If you want an LLM to use your computer, that should be doable as well, using MCP with tools specialized in computer use. Or, if you want something like OpenAI Operator, there should be a service (an MCP server) that allows it. You might have to pay extra since you are using VMs.
Personally, I think LLMs + MCP servers are a step toward AGI. Of course, this is just my own thought and speculation. Let me know what you think!