Troubleshooting the Gemini CLI "Error discovering prompts from github: MCP error -32601: prompts not supported"
Introduction
Hey guys! Today, we're diving deep into a specific error that some of you might have encountered while using the `gemini-cli`. This error, `Error discovering prompts from github: MCP error -32601: prompts not supported`, can be a bit perplexing, but don't worry, we're going to break it down and explore potential solutions. This article covers the issue, its root cause, and practical fixes, from the technical details to the configuration tweaks, so you can get back to your AI projects without a hitch. Let's jump right in and demystify this error together!
What's the Issue? Unpacking the Error Message
So, you've fired up your `gemini-cli` and bam! You're greeted with this message: `Error discovering prompts from github: MCP error -32601: prompts not supported`. What does it even mean? This error appeared after the merge of PR #4828, which added support for loading MCP server prompts as slash commands in the CLI. It surfaces during the startup of `gemini` and indicates a problem when the CLI tries to fetch prompts from a GitHub MCP (Model Context Protocol) server. The key phrase, "prompts not supported," tells us that the specific MCP server being accessed doesn't provide prompts at all.

The `gemini-cli` attempts to discover prompts from each configured MCP server via the `discoverPrompts` function in `mcp-client.ts`, which sends a request to the server using the `prompts/list` method. If the server doesn't support this method, it returns an error, leading to the dreaded message. This is a crucial first observation: the issue is not necessarily a bug in the `gemini-cli` itself, but a mismatch between the CLI's expectations and the capabilities of the MCP server it's talking to. Understanding that distinction is key to finding the right solution, whether it's adjusting the configuration, updating the server, or modifying the CLI's error handling.

The error message is emitted from this snippet in `packages/core/src/tools/mcp-client.ts`:
```typescript
export async function discoverPrompts(
  mcpServerName: string,
  mcpClient: Client,
  promptRegistry: PromptRegistry,
): Promise<void> {
  try {
    const response = await mcpClient.request(
      { method: 'prompts/list', params: {} },
      ListPromptsResultSchema,
    );
    // ...
  } catch (error) {
    // It's okay if this fails, not all servers will have prompts.
    // Don't log an error if the method is not found, which is a common case.
    if (
      error instanceof Error &&
      !error.message?.includes('Method not found')
    ) {
      console.error(
        `Error discovering prompts from ${mcpServerName}: ${getErrorMessage(
          error,
        )}`,
      );
    }
  }
}
```
The Configuration Culprit
The specific configuration in your `~/.gemini/settings.json` file is often the culprit. If you're using the following:
```json
"mcpServers": {
  "github": {
    "httpUrl": "https://api.githubcopilot.com/mcp/x/repos/readonly",
    "headers": {
      "Authorization": "..."
    },
    "timeout": 5000
  }
},
```
You're likely to encounter this error. Notice the `httpUrl`: `https://api.githubcopilot.com/mcp/x/repos/readonly`. The `repos` part of that endpoint is the key: the `repos/readonly` MCP server doesn't include any predefined prompts, which is why the error surfaces. Switch the `httpUrl` to `https://api.githubcopilot.com/mcp/x/issues/readonly` (note the `issues` part) and the error disappears, because the `issues/readonly` MCP server does include predefined prompts. This distinction is the root cause: the `gemini-cli` is attempting to discover prompts from a server that simply doesn't support that functionality, so the error isn't a bug in the CLI but a mismatch between the CLI's expectations and the server's capabilities. The `mcpServers` entry in `~/.gemini/settings.json` determines which MCP server the CLI talks to, and therefore whether this error is triggered, so it's worth configuring it to match the features the server actually supports.
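If you don't specifically need the `repos/readonly` toolset, one workaround is simply to point the CLI at an endpoint that does advertise prompts. Here's a minimal sketch of the adjusted `~/.gemini/settings.json` entry, assuming the rest of your configuration stays the same (keep your existing `Authorization` header value):

```json
"mcpServers": {
  "github": {
    "httpUrl": "https://api.githubcopilot.com/mcp/x/issues/readonly",
    "headers": {
      "Authorization": "..."
    },
    "timeout": 5000
  }
},
```

With this endpoint, startup should proceed without the prompt-discovery error, since `issues/readonly` supports the `prompts/list` method.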
Diving Deeper: Understanding MCP Servers and Prompts
To fully grasp this, we need to understand MCP servers and prompts. MCP, or Model Context Protocol, is a way for tools to interact with language models. Think of it as a standardized language for communication. MCP servers, like the ones from GitHub Copilot, offer specific functionalities. Some servers, like the issues/readonly
server, come with predefined prompts. These prompts are essentially instructions or starting points for the language model. The repos/readonly
server, on the other hand, doesn't have these prompts. This difference is fundamental to why the error occurs. The gemini-cli
, by default, tries to discover prompts from the configured MCP server. When it hits the repos/readonly
server, which lacks prompt support, the server responds with the -32601
error code, indicating that the requested method (prompts/list
) is not supported. This is in line with the MCP specification, which states that servers supporting prompts must declare this capability during initialization. The repos/readonly
server doesn't declare this capability, hence the error. This deep dive into MCP servers and prompts clarifies that the error is a direct consequence of the server's design and capabilities, rather than a flaw in the CLI's logic. It also underscores the importance of understanding the specific features and limitations of the MCP server being used, to avoid such errors and ensure a smooth interaction with the language model.
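To make the exchange concrete, here's a small TypeScript sketch of the interaction described above. The `handleRequest` function and the capability set are hypothetical stand-ins for the server side, not real MCP SDK code; the `-32601` code itself is the standard JSON-RPC 2.0 "method not found" error that this MCP error is built on:

```typescript
// Hypothetical sketch: how a server that never declared the "prompts"
// capability answers a prompts/list request over JSON-RPC 2.0.

type JsonRpcRequest = {
  jsonrpc: '2.0';
  id: number;
  method: string;
  params?: object;
};

type JsonRpcResponse =
  | { jsonrpc: '2.0'; id: number; result: object }
  | { jsonrpc: '2.0'; id: number; error: { code: number; message: string } };

// Simulated server-side dispatch: reject methods outside the declared capabilities.
function handleRequest(
  req: JsonRpcRequest,
  capabilities: Set<string>,
): JsonRpcResponse {
  if (req.method === 'prompts/list' && !capabilities.has('prompts')) {
    // -32601 is the standard JSON-RPC "Method not found" code.
    return {
      jsonrpc: '2.0',
      id: req.id,
      error: { code: -32601, message: 'prompts not supported' },
    };
  }
  return { jsonrpc: '2.0', id: req.id, result: { prompts: [] } };
}

// A repos/readonly-style server declares tools but no prompts.
const response = handleRequest(
  { jsonrpc: '2.0', id: 1, method: 'prompts/list', params: {} },
  new Set(['tools']),
);
const errorCode = 'error' in response ? response.error.code : null;
console.log(errorCode); // -32601
```

The same request against a set containing `'prompts'` would take the `result` branch instead, which is exactly the difference between the `issues/readonly` and `repos/readonly` endpoints.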
Proposed Solutions: Tackling the Error
Now, let's talk solutions. There are a couple of ways we can handle this. First up, a simple fix would be to expand the error check in the discoverPrompts
function. Currently, the code ignores errors that include 'Method not found'
. We could add another condition to also ignore MCP error -32601: prompts not supported
. This would effectively silence the error message, but it's more of a band-aid solution. While this approach is quick and easy to implement, it doesn't address the underlying issue. It simply suppresses the error message, which might mask other potential problems in the future. Furthermore, it doesn't align with the principle of making the CLI as informative as possible. Error messages, when properly handled, can provide valuable insights into the system's behavior and help users diagnose and resolve issues. By simply ignoring the error, we lose this opportunity. Therefore, while this solution might be tempting for its simplicity, it's not the most robust or user-friendly approach in the long run.
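As a rough sketch of that band-aid (the helper name `shouldLogPromptDiscoveryError` is mine, not from the CLI source), the condition inside the `catch` block could be widened like this:

```typescript
// Hypothetical widened filter for the catch block in discoverPrompts():
// skip logging both for 'Method not found' and for the -32601
// "prompts not supported" message from servers without prompt support.
function shouldLogPromptDiscoveryError(error: unknown): boolean {
  if (!(error instanceof Error)) {
    return false; // the original code only logs Error instances
  }
  const message = error.message ?? '';
  return (
    !message.includes('Method not found') &&
    !message.includes('prompts not supported')
  );
}

console.log(
  shouldLogPromptDiscoveryError(
    new Error('MCP error -32601: prompts not supported'),
  ),
); // false: suppressed
console.log(shouldLogPromptDiscoveryError(new Error('connection refused'))); // true: still logged
```

Note how genuinely unexpected errors still get logged; only the known "this server has no prompts" case is swallowed.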
A more robust solution is to check the server's capabilities before attempting to discover prompts. The `mcpClient` has a `getServerCapabilities()` method that tells us what the server supports, so we can verify that the `prompts` capability is present before calling `mcpClient.request({ method: 'prompts/list', params: {} }, ...)`.

This check could live inside the `discoverPrompts()` function, or even earlier in `connectAndDiscover()`, which calls `discoverPrompts()`. By checking capabilities up front, the CLI only attempts prompt discovery against servers that actually support it, so the error never occurs in the first place. This follows the best practice of preventing errors rather than merely reacting to them, and it lets the CLI adapt dynamically to whatever capabilities each MCP server advertises. It's a more elegant and scalable solution that contributes to the overall robustness and user-friendliness of the `gemini-cli`.
Code Example: Checking Server Capabilities
Here's a snippet illustrating how you might implement the second solution:
```typescript
async function discoverPrompts(
  mcpServerName: string,
  mcpClient: Client,
  promptRegistry: PromptRegistry,
): Promise<void> {
  const serverCapabilities = mcpClient.getServerCapabilities();
  if (serverCapabilities?.prompts) {
    try {
      const response = await mcpClient.request(
        { method: 'prompts/list', params: {} },
        ListPromptsResultSchema,
      );
      // ...
    } catch (error) {
      // ...
    }
  }
}
```
This code first retrieves the server's capabilities using `mcpClient.getServerCapabilities()`, then checks whether the `prompts` capability is present. Only if it is does the code request the list of prompts. This ensures the CLI only tries to discover prompts from servers that actually support them, preventing the error and making the CLI more robust.
Delving into the GitHub MCP Server URLs
For those curious, the GitHub MCP Server URLs are listed in the Remote MCP Toolsets documentation. The key difference between the repos/readonly
and issues/readonly
endpoints lies in their functionality. The issues
endpoint includes predefined prompts, while the repos
endpoint doesn't. This is because the issues
endpoint is designed to handle issue-related tasks, which often benefit from predefined prompts for common workflows like assigning agents or fixing issues. The repos
endpoint, on the other hand, focuses on repository-level operations, which may not require the same level of prompt support. Looking at the GitHub MCP server code (github/github-mcp-server
), you can see how the issues
toolset is explicitly configured with prompts:
```go
issues := toolsets.NewToolset("issues", "GitHub Issues related tools").
	....AddPrompts(
		toolsets.NewServerPrompt(AssignCodingAgentPrompt(t)),
		toolsets.NewServerPrompt(IssueToFixWorkflowPrompt(t)),
	)
```
This snippet shows that the `issues` toolset is explicitly built with prompts, which explains why the `issues/readonly` server supports the `prompts/list` method while the `repos/readonly` server does not. It reinforces the conclusion that the error is a consequence of the server's design and capabilities rather than a flaw in the `gemini-cli`.
Expected Behavior: A Smooth Experience
Ideally, you shouldn't see this error message at all. The `gemini-cli` should either handle the absence of prompt support gracefully or, better yet, check for it beforehand. A well-designed CLI should be informative without being alarming, providing clear and actionable feedback rather than cluttering the console with messages that, while technically accurate, are confusing to users unfamiliar with the intricacies of MCP servers and prompt capabilities. Suppressing the error, or preventing it from occurring in the first place, keeps the focus on the user's task rather than on technical noise, in line with the broader goal of making AI tools accessible and user-friendly.
Client Information: Your Setup
For reference, here's the client information provided in the original issue:
```
╭────────────╮
│  > /about  │
╰────────────╯

╭───────────────────────────────────────────────────────────────────────╮
│                                                                       │
│  About Gemini CLI                                                     │
│                                                                       │
│  CLI Version    0.1.15-nightly.250801.6f7beb41                        │
│  Git Commit     6f7beb41 (local modifications)                        │
│  Model          gemini-2.5-pro                                        │
│  Sandbox        no sandbox                                            │
│  OS             linux                                                 │
│  Auth Method    OAuth                                                 │
│                                                                       │
╰───────────────────────────────────────────────────────────────────────╯
```
This information is helpful for debugging and reproducing the issue: it records the CLI version, Git commit, model, operating system, and authentication method, all of which can influence the CLI's behavior. Here the user was on a nightly build running on Linux with OAuth authentication. While none of this points directly at the root cause, it provides valuable context for developers investigating the issue, and it's a good reminder that detailed client information in bug reports can significantly speed up the debugging process.
Login Information: Authentication Details
The user was signed in via `Login with Google` using a Gmail account with a Google AI Pro subscription. While the authentication method isn't directly related to the error, it's useful context: a successful Google login with an active subscription tells us the error is likely not an authentication problem, which lets us focus on the MCP server configuration and prompt discovery logic instead. Providing this level of detail in bug reports helps narrow the scope of an issue and makes it easier for developers to identify the root cause.
Conclusion: Wrapping Up the Error Discovery
So, there you have it! The `Error discovering prompts from github: MCP error -32601: prompts not supported` error can be a bit of a head-scratcher, but understanding the interplay between the `gemini-cli` and MCP servers is the key. By checking server capabilities before attempting to discover prompts, the CLI can avoid the error entirely. And remember: the `repos/readonly` server doesn't support prompts, while the `issues/readonly` server does, so choosing the right endpoint for your needs matters. With the causes and fixes covered here, you should be well equipped to tackle this error and keep your `gemini-cli` sessions running smoothly.

If you're still running into issues, don't hesitate to consult the documentation or reach out to the community for help. Happy coding, guys!