1. What is Serena?
Serena is a tool that translates natural language instructions (everyday conversational language) into commands that can be executed in a terminal.
In short, it’s like setting up a dedicated “AI assistant server” locally that understands your PC environment and project status.
This eliminates the need to copy and paste code or explain the project’s background every time you collaborate with AI agents like Claude or GPT. Through Serena, the AI can directly understand the files and code in your local environment, providing more context-aware and accurate support.
2. Advantages of Introducing Serena
By introducing Serena, development and terminal operations become faster, safer, and more intuitive.
Aspect | Benefit | Description |
---|---|---|
🔒 Privacy | Securely collaborate with AI without sending code externally | Even if company regulations or security policies make it difficult to send code to external services, you can use it with peace of mind as everything is completed within your local environment. |
📚 Continuous Knowledge | The AI “remembers” the entire project structure | Once you load a project, the AI will respond based on its understanding of the relationships between files and the overall picture, dramatically improving the accuracy of the conversation. |
💸 Token Savings | Significantly reduce communication costs with the AI | Since you no longer need to send large amounts of code or files to the AI every time, you can greatly save on API usage fees (token consumption). |
⚡ Improved Work Efficiency | Intuitively operate commands with natural language | Just by saying things like “run the tests for `main.py`” or “delete unnecessary Docker images,” it can generate and execute the appropriate commands. |
🧠 Improved Response Accuracy | Reduces the AI’s irrelevant answers due to lack of context | Since the AI always has a grasp of the entire project, it prevents conversations from becoming misaligned due to “insufficient context.” |
3. Installation and Execution
Based on the procedure described in `serena.txt`, I have organized the general installation method.
Prerequisites
- Python: 3.11 (3.12 is not supported)
- uv: A fast Python package manager
- Git: Used to download the source code
Steps
Step 1: Install uv
If you haven’t installed `uv` yet, run the following command.
curl -LsSf https://astral.sh/uv/install.sh | sh
Step 2: Run Serena (the easiest way is to use `uvx`)
Using `uvx` automates cloning the repository and creating a virtual environment, allowing you to start Serena directly.
uvx --from git+https://github.com/oraios/serena serena-mcp-server
✅ Summary of the Phenomenon
- `uv`/`uvx` is installed in `~/.local/bin/`
- But when you call `uvx` in the shell, it says “command not found”
- Cause: that directory is not on the `PATH`
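The failure mode above boils down to a simple string check. As a sketch in POSIX shell (the `path_contains` helper is illustrative, not part of uv), this shows how to test whether a directory is listed in a PATH-like string:

```shell
# Hypothetical helper: is directory $1 listed in the colon-separated string $2?
path_contains() {
  case ":$2:" in
    *":$1:"*) return 0 ;;  # present as a whole entry
    *)        return 1 ;;  # absent, so the shell cannot find binaries there
  esac
}

# Demo with a fixed PATH-like string rather than the live $PATH:
if path_contains "/home/user/.local/bin" "/usr/bin:/home/user/.local/bin:/bin"; then
  echo "found"
else
  echo "missing"
fi
```

To check your real environment, pass `"$PATH"` as the second argument.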
🔧 Solution: Temporarily Apply the PATH (to run it immediately)
Copy and paste the following as is and execute it:
export PATH="$HOME/.local/bin:$PATH"
Then, again:
uvx --from git+https://github.com/oraios/serena serena start-mcp-server
💡 This should make `uvx` recognizable.
🔁 Permanent Solution (to make it available on next login)
If you are using `bash`:
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
If you are using `zsh`:
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
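Note that running the `echo ... >> ~/.bashrc` line more than once leaves duplicate entries. As a sketch (using a throwaway file in `/tmp` instead of your real rc file), a `grep` guard makes the append idempotent:

```shell
# Demo rc file so this sketch does not touch the real ~/.bashrc
RC=/tmp/demo_rc
: > "$RC"   # start from an empty file

LINE='export PATH="$HOME/.local/bin:$PATH"'

# Append only if the exact line is not already present
grep -qxF "$LINE" "$RC" || echo "$LINE" >> "$RC"
grep -qxF "$LINE" "$RC" || echo "$LINE" >> "$RC"   # second run is a no-op
```

In real use, swap `/tmp/demo_rc` for `~/.bashrc` or `~/.zshrc`.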
✅ Final Confirmation:
If `which uvx` succeeds, it will show:
/home/mamu/.local/bin/uvx
🎯 If you want to paste the one-liner for everything
export PATH="$HOME/.local/bin:$PATH"
uvx --from git+https://github.com/oraios/serena serena start-mcp-server
However, as things stand:
🔴 Serena started with `uvx --from ...` is running in the foreground
This means that if you close the terminal, Serena will also stop.
✅ Solution: How to make it run in the background
By using one of the following, you can start Serena in the background.
Overview
This explains three ways to run Serena stably in the background using UV. Please choose the most suitable method according to your needs.
🚀 Method 1: Use nohup + & (Easy)
Features
- Easiest: Run immediately with a one-line command
- Log Output: Save output to a file
- Terminal Independent: Continues to run even if you close the terminal
How to Run
# Start Serena in the background
nohup uvx --from git+https://github.com/oraios/serena serena start-mcp-server > ~/serena.log 2>&1 &
Checking Status and Monitoring Logs
# Check the process
ps aux | grep serena
# Monitor logs in real-time
tail -f ~/serena.log
# Check the entire log
cat ~/serena.log
How to Terminate the Process
# Check the process ID
ps aux | grep serena
# Terminate the process (replace [PID])
kill [PID]
# Or force kill
pkill -f serena
Pros and Cons
✅ Pros
- Can be executed immediately without configuration
- Logs are saved automatically
- Simple and easy to understand
❌ Cons
- Manual process management
- No automatic recovery in case of abnormal termination
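Since the main drawback is manual process management, a small wrapper that records the PID softens it. This is a sketch with hypothetical names (`start_bg`, `/tmp/serena_demo.*`), demonstrated with `sleep` instead of the real server:

```shell
LOG=/tmp/serena_demo.log
PIDFILE=/tmp/serena_demo.pid

# Start any command detached from the terminal and remember its PID
start_bg() {
  nohup "$@" > "$LOG" 2>&1 &
  echo $! > "$PIDFILE"
}

# Stop the remembered process and clean up the PID file
stop_bg() {
  kill "$(cat "$PIDFILE")" 2>/dev/null
  rm -f "$PIDFILE"
}

# Placeholder command; in real use this would be
# uvx --from git+https://github.com/oraios/serena serena start-mcp-server
start_bg sleep 30
```

With this, stopping the server is `stop_bg` instead of hunting for the PID with `ps aux | grep serena`.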
🖥️ Method 2: Keep it running with tmux (Convenient)
Features
- Session Management: Connect and disconnect at any time
- For Debugging: Check logs in real-time
- Flexibility: Execute commands as needed
Creating and Running a tmux Session
# Create a new tmux session named "serena"
tmux new -s serena
# Start Serena within the session
uvx --from git+https://github.com/oraios/serena serena start-mcp-server
📌 Important: Detaching from the Session
Once Serena has started, use the following key combination to exit the session:
1. Press Ctrl + b
2. Immediately press d
Result: You will return to the terminal, but Serena will continue to run in the background within tmux.
Session Management Commands
# Check the list of sessions
tmux list-sessions
# Reconnect to the session
tmux attach -t serena
# Kill the session (Serena will also terminate)
tmux kill-session -t serena
# Check all tmux sessions
tmux ls
If you use screen (an alternative to tmux)
# Create a screen session
screen -S serena
# Start Serena
uvx --from git+https://github.com/oraios/serena serena start-mcp-server
# Detach: Ctrl + A -> D
# Reconnect
screen -r serena
Pros and Cons
✅ Pros
- You can return to the session at any time to check the logs
- Easy debugging and troubleshooting
- You can manage multiple servers in different sessions
❌ Cons
- You need to learn the basic operations of tmux/screen
- You need to restart it manually when the system reboots
⚙️ Method 3: Automatic startup with a systemd unit (for production)
Features
- Fully Automated: Starts automatically at system startup
- Fault Recovery: Automatic restart in case of abnormal termination
- Log Management: Centralized log management with `journalctl`
- Production Use: Recommended method for server environments
Creating a systemd Service File
# Create the service file
sudo nano /etc/systemd/system/serena.service
Service File Contents
[Unit]
Description=Serena MCP Server
After=network.target
Wants=network.target
[Service]
Type=simple
User=your_username
Group=your_username
WorkingDirectory=/home/your_username
Environment=PATH=/home/your_username/.local/bin:/usr/local/bin:/usr/bin:/bin
ExecStart=/home/your_username/.local/bin/uvx --from git+https://github.com/oraios/serena serena start-mcp-server
Restart=always
RestartSec=10
StandardOutput=journal
StandardError=journal
[Install]
WantedBy=multi-user.target
Enabling and Managing the Service
# Reload systemd to read the new file
sudo systemctl daemon-reload
# Enable the service (set to start automatically)
sudo systemctl enable serena.service
# Start the service
sudo systemctl start serena.service
# Check the service status
sudo systemctl status serena.service
# Check the logs
sudo journalctl -u serena.service -f
# Stop the service
sudo systemctl stop serena.service
# Disable the service
sudo systemctl disable serena.service
Configuration Notes
- Change `your_username` to your actual username
- Check the path to `uvx` (you can find it with `which uvx`)
- Set file permissions appropriately
Pros and Cons
✅ Pros
- Automatic start at system boot
- Automatic restart on abnormal termination
- Standard Linux service management
- Centralized log management
❌ Cons
- Configuration is a bit complicated
- Requires root privileges
🎯 Recommended Method by Use Case
Development/Test Environment
tmux/screen method (Method 2)
- Easy to debug
- Can be stopped and restarted immediately as needed
Personal Regular Use
nohup method (Method 1)
- Easy to set up
- Sufficient for daily use
Server/Production Environment
systemd method (Method 3)
- Highest stability and reliability
- With automatic recovery function
With the combination of UV and Serena, flexible background execution according to the application is possible. We recommend trying the nohup method first, and then migrating to tmux or systemd as needed.
✍️ Personal Recommendation
If you are… | Recommendation |
---|---|
Just want to try it | ✅ nohup + & |
Want to continue using it for development | ✅ tmux |
Want to run it in production/permanently | ✅ systemd |
(Alternative) If you want to download and run it locally
If you want to manage the source code locally, follow these steps:
- Clone the repository:
git clone https://github.com/oraios/serena.git
cd serena
- Start Serena:
uv run serena-mcp-server
I used tmux, but there are the following points to note.
1. For Ubuntu
- `tmux` is not installed by default (even on Ubuntu 25.04).
- You need to install it before using it:
sudo apt update
sudo apt install tmux
- After installation, you can use it immediately (a configuration file is not necessary).
2. For Windows
`tmux` does not run natively on Windows alone. You will need to use one of the following methods.
Method A: Use within WSL (Windows Subsystem for Linux)
- Enter WSL’s Ubuntu (or another distribution) and run `sudo apt install tmux` as above
- The procedure and usage are exactly the same as on Linux
Method B: Use with Git Bash, MSYS2, etc.
- `tmux` is often provided as a package in these environments as well
- However, it may not be fully compatible (especially key operations)
Method C: Use Windows Terminal + an SSH connection
- Do not install `tmux` on local Windows; instead, start `tmux` on a remote Linux machine (VPS or WSL)
- In practice, this causes the fewest problems
💡 Practically, it is best for Windows users to run `tmux` on WSL or a remote Linux. The reasons are:
- `tmux` itself is a terminal multiplexer for Unix-like systems
- On native Windows, it may conflict with the GUI and its behavior may become unstable
🪟 Windows Version
Complete guide to starting UV commands in the background
Important Notes for Windows
tmux does not work on Windows alone! It is only available on Unix-like OSs such as Linux or macOS. To use Linux-based tools on Windows, you need WSL or Git Bash.
🚀 PowerShell + Start-Process
The simplest method using Windows standard PowerShell
Start-Process powershell -ArgumentList "-Command", "uvx --from git+https://github.com/oraios/serena serena start-mcp-server" -WindowStyle Hidden
Pros
- Windows standard function
- No additional tools required
- Can be executed immediately
Cons
- Difficult to check logs
- Complicated process management
🛠️ WSL + tmux (Recommended)
Utilize Linux tools within the Windows Subsystem for Linux
# Enter WSL
wsl
# Create tmux session
tmux new -s serena
# Execute UV command
uvx --from git+https://github.com/oraios/serena serena start-mcp-server
# Detach: Ctrl + b -> d
Pros
- Same as Linux environment
- Excellent stability
- High functionality
Cons
- WSL setup required
- Learning cost
⚙️ Task Scheduler
Automatic startup using Windows standard Task Scheduler
- Win + R -> `taskschd.msc`
- “Create Basic Task”
- Trigger: When the computer starts
- Action: Start a program
- Program: `uvx`
- Arguments: `--from git+https://... serena start-mcp-server`
Pros
- Windows standard
- Supports automatic startup
- GUI settings
Cons
- Complicated settings
- Difficult to debug
🌐 Git Bash + nohup-style
Execute Linux-style commands using Git Bash
# Execute in Git Bash
uvx --from git+https://github.com/oraios/serena serena start-mcp-server &
# Check process
ps aux | grep serena
Pros
- Linux-style operation
- Lightweight
- Easy
Cons
- Limited functionality
- Issues with stability
🏆 Best Practices on Windows
Development/Test Environment WSL + tmux is the strongest combination
Production/Server Environment Stable operation with Task Scheduler + PowerShell
Easy Testing Immediate execution with PowerShell Start-Process
💡 Pro Advice
It is strongly recommended that Windows developers introduce WSL. In modern web development, checking the operation in a Linux environment is essential. Modern tools like UV also operate more stably in a WSL environment.
4. More Advanced Usage
Indexing a Project
To get the most out of Serena on a large project, it is recommended to index the code in advance (analyze and organize the structure). This will further improve the AI’s response speed and accuracy.
# Example: Indexing /path/to/your/project
uvx --from git+https://github.com/oraios/serena index-project /path/to/your/project
Integration with Client Tools
Serena’s capabilities are maximized by integrating it with clients that support MCP (Model Context Protocol), such as Claude Desktop. By specifying the Serena server in the client tool’s settings, the AI can interact directly with your local environment.
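For Claude Desktop specifically, MCP servers are registered in its `claude_desktop_config.json`. A minimal sketch (the exact file location and schema may vary by version, and the command form mirrors the one used elsewhere in this article; treat it as an assumption, not the definitive configuration):

```json
{
  "mcpServers": {
    "serena": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/oraios/serena",
        "serena-mcp-server"
      ]
    }
  }
}
```

After saving the file, restart Claude Desktop so it picks up the new server entry.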
For Windows
This command uses the Windows package manager `winget` to install Astral’s `uv`.
winget install --id astral-sh.uv
🔽 What this command installs
Item | Content |
---|---|
uv main body | ✅ Installed |
uvx | ✅ Automatically included (v0.1.17 or later) |
Addition to environment variables | ✅ Usually %USERPROFILE%\.cargo\bin or %LOCALAPPDATA%\Programs\uv is automatically added to the PATH (but depends on PC settings) |
5. Recommended for people like this
- Developers who handle large codebases
- People who find it troublesome to copy and paste long code to the AI every time
- People who value security and privacy and want to complete their work in a local environment
- People who often feel that “the context is insufficient” in conversations with AI
- People who want to lower the learning cost of terminal operations
6. Actual Usage Examples
The following is an example of starting Serena and integrating it with Claude Desktop.
Example 1: Asking about the intent of the code
You (to Claude):
Tell me the purpose of the `clean_text()` function in serena://project/src/utils/parse_input.py
Claude (in collaboration with Serena):
The `clean_text()` function is a function that removes special symbols from the input string and returns normalized text.
Specifically, it performs the following processing:
- Unicode normalization (NFKC)
- Removal of spaces and newlines
- Conversion of HTML entities
Example 2: Grasping the project structure
You:
Briefly explain the directory structure of this project
Claude:
This project has the following structure:
- `src/`: Main application code
- `routes/`: API endpoint definitions
- `models/`: ORM definitions related to the database
- `tests/`: Unit tests
- `config/`: Configuration files
- `.serena/`: Project metadata for Serena
Example 3: Acting as a substitute for terminal operations
You:
I want to delete unused containers and images in Docker
Claude:
You can achieve your goal by executing the following commands:
docker container prune -f
docker image prune -a -f
Execute?
💡 Tip: Try it with a small project first
If you try to use Serena with a large project from the beginning, it may take time to configure and set up. It is smoother to try it with something like the following first:
- A small Python CLI tool
- A web app with only HTML + JS
- A Flask app in a single directory, etc.
7. Example of Integration with ChatGPT (Serena + ChatGPT)
Like Claude, Serena can also be used with ChatGPT. In particular, if you configure it so that ChatGPT can access Serena via MCP (Model Context Protocol), ChatGPT will be able to access your local project and provide accurate assistance.
Example 1: Asking how to use a function
You (to ChatGPT):
Explain the recursive behavior of the `fibonacci()` function in serena://project/src/math/calc.py
ChatGPT (referencing via Serena):
`fibonacci(n)` returns 1 if n is 1 or 2, otherwise it recursively calculates `fibonacci(n-1) + fibonacci(n-2)`.
This is based on the definition of the Fibonacci sequence, but as n gets larger, the number of recursions increases exponentially, making it inefficient.
It can be improved with memoization, etc.
Example 2: Advice based on the entire code
You:
This project uses Flask, but where should I start to rewrite it in FastAPI?
ChatGPT:
To migrate from Flask to FastAPI, the following changes are required:
1. Rewrite the entry point in `app.py` to FastAPI syntax
2. Replace `@app.route()` with `@app.get()` or `@app.post()`
3. Change references to `request.args` and `request.json` to FastAPI's `Request` or parameter types
4. Add `fastapi` and `uvicorn` to `requirements.txt`
According to the analysis from Serena, the main routing definitions are consolidated in `src/routes/api.py`, so it would be smooth to start the migration from there.
Example 3: Asking for the location of a file
You:
Where is the code that generates JWT tokens in this project?
ChatGPT:
The JWT token generation process is in `auth/jwt_utils.py`.
The function `generate_jwt(user_id: int)` is the one, and it creates the token using `pyjwt.encode(...)`.
💡 Features of the combination of ChatGPT and Serena
Feature | Effect |
---|---|
🔍 Code search and explanation | Identifies necessary functions and files from your own codebase |
🧠 Continuous context retention | Returns more accurate answers that depend on past questions |
🛡️ Security | Serena provides alternative data without sending code directly to OpenAI’s API |
🧩 ChatGPT’s knowledge + local reality | Fusion of general knowledge and specific knowledge about your own code is possible |
🙋‍♂️ Additional useful points
ChatGPT is good at natural questions such as “teach me with examples” and “put it in a table format,” and since it answers based on specific data via Serena, it can also streamline document generation and code reviews.
✅ Prerequisite: ChatGPT alone cannot connect to Serena
The official ChatGPT from OpenAI (web version and mobile app) has no way to directly access the local Serena.
What can be integrated with Serena are ChatGPT clients that support MCP (Model Context Protocol).
🔧 How to connect ChatGPT and Serena
Method 1: Use ChatGPT Desktop (unofficial app)
This is currently the easiest method.
Procedure overview:
- Install ChatGPT Desktop
- Open `config.json`
- Set Serena as the MCP server
- Access the local project from ChatGPT using `serena://`
🔌 config.json setting example (ChatGPT Desktop)
Edit the configuration file as follows (usually at `~/.chatgpt/config.json`):
{
"mcpServers": {
"serena": {
"command": "/home/username/.local/bin/uv",
"args": [
"run",
"--directory",
"/home/username/serena",
"serena-mcp-server"
]
}
}
}
💡 Note: On Windows, it may be at `C:\Users\username\AppData\Roaming\chatgpt\config.json`.
✅ Usage example in ChatGPT (in conversation)
- Start Serena (or have ChatGPT start it automatically on startup)
- Talk to ChatGPT like this:
Explain log_error() in serena://project/src/utils/logger.py
Then, ChatGPT will make an MCP call via Serena locally, analyze the corresponding file, and answer.
❗ Notes
Note | Description |
---|---|
Not official ChatGPT | The Desktop version is a third-party OSS client |
Cannot be used with OpenAI API | Cannot be used with the web version or GPT-4 via API |
Python 3.11 required | The Serena side needs to run in a Python 3.11 environment |
📦 Supplement: It is also possible to use Serena + Claude + ChatGPT together
By using multiple MCP clients (Claude Desktop, ChatGPT Desktop) in parallel, the image is that Serena functions as a hub for local analysis by AI.
🧪 Do you really want to use it with the ChatGPT Web version?
It’s not impossible, but an indirect method like the following is required:
- Output the information generated by Serena to a file
- Upload or copy and paste it to ChatGPT
- Pretend to refer to the file with your own prompt
But honestly, that doesn’t bring out the true value of Serena.
✅ Summary
Question | Answer |
---|---|
Can ChatGPT connect to Serena? | ✅ Possible using ChatGPT Desktop (unofficial) |
With the web version of ChatGPT? | ❌ Cannot connect |
Where do I configure it? | Write the command and path in config.json |
How to start the Serena side? | Start serena-mcp-server with uv or uvx |
🧠 So, how does that improve the conversation with ChatGPT?
✅ Answer: It improves when you have a conversation through a Serena-compatible client. In other words, think of it as “ChatGPT’s performance is boosted” when the following two conditions are met:
Condition | Content |
---|---|
✅ Serena is running | serena-mcp-server is running in the background |
✅ Using an MCP-compatible ChatGPT client | e.g., ChatGPT Desktop (unofficial), etc. |
🔍 What improves (ChatGPT × Serena)
before (normal) | after (via Serena) |
---|---|
An off-target answer to “What is this function doing?” due to lack of context | An accurate explanation after reading the local code |
Need to paste the code every time | Just use the serena:// path (super time-saving) |
Troublesome to tell the location of the file | ChatGPT has a grasp of the structure, so it guides you accurately |
Mechanical answers | More developer-centric and understanding advice is possible |
🚫 Conversely, it has no effect on the web version of ChatGPT
The one you are talking to right now (ChatGPT Web version) has no way to directly access your local environment or Serena.
Therefore, even if you run Serena right here, nothing will change.
🧪 If you want to try it
A setup like the following is best:
- Install `uv` and Serena
- Introduce ChatGPT Desktop (official GitHub)
- Set the Serena startup command in `config.json`
- Start ChatGPT Desktop
- Ask questions in the `serena://...` format!
✨ Example 🗣 In ChatGPT Desktop:
It seems there is a bug in fetch_data() in serena://project/app/main.py, take a look
🤖 ChatGPT:
The check for the HTTP status code is missing in `fetch_data()`. The exception may be being swallowed.
📦 Summary: Even if you install Serena, you need a “window” to connect to
Element | Role |
---|---|
Serena | A local AI server that analyzes and retains your project |
ChatGPT Desktop | A “connecting window” between ChatGPT and Serena |
ChatGPT main body | Conversational AI. However, it does not have local information |
uvx startup command | The key to calling Serena |
🎯 “If I’m showing a GitHub repository, isn’t Codex (GPT-4) better than Serena?”
I think you have this question. Yes, at first glance, that’s right — but the answer changes depending on the application and purpose.
✅ Prerequisite comparison: Serena vs Codex (ChatGPT)
Comparison item | Serena (MCP) | Codex / ChatGPT (GPT-4) |
---|---|---|
Local environment code | ✅ Direct access possible | ❌ Manual copy-paste or upload |
Handling of GitHub repositories | ✅ Clone -> Analysis possible | ✅ Partial analysis possible with Copilot |
Privacy | ✅ Local completion | ❌ Sent to OpenAI |
Multi-file support | ✅ Grasps the whole and cross-searches | ⚠️ Context limit applies (length cap) |
Continuity (in conversation) | ✅ Remembers the file structure and converses | ❌ Answers only with the context of the moment |
Autocompletion/writing assistance | ❌ Does not complete | ✅ Very excellent (Copilot) |
Learning cost | Somewhat high (MCP client required) | Low (can be used as is) |
🔍 3 patterns for showing the contents of GitHub to AI
Method | Example | Suitable case |
---|---|---|
🔹 Copy-paste or attach a zip to Codex (GPT-4) | “Look at this repository” | When you want one-off advice |
🔸 GitHub Codespaces + Copilot | AI completion with VSCode integration | To accelerate the coding work itself |
🟢 Clone GitHub locally -> Index with Serena | Instruct with `serena://src/app.py` etc. | When privacy is important or you want to talk to the AI for a long time |
🧠 When you should choose Serena
- 🚫 The repository on GitHub is private or pre-release
- 🧩 Complex configuration that requires a design review across multiple files
- 💬 You want to refactor the code while conversing with ChatGPT or Claude over a long period of time
- 🛡️ You are uneasy about uploading code to OpenAI
✋ On the other hand, when Codex (ChatGPT) is sufficient
- 🚀 Public repository and it’s okay to show the whole thing roughly
- ⏳ You want to request a review or code correction in a short time
- 🧪 Light requests such as “just improve this function”
- 🤖 Copilot’s completion is sufficient
💡 Conclusion: It’s easy to understand if you separate it like this
Purpose | Optimal means |
---|---|
Short-term GitHub code diagnosis | ✅ Codex (Web version GPT-4) is OK |
Private project management | ✅ Serena + ChatGPT Desktop or Claude |
Completion and code generation | ✅ Copilot (ChatGPT cannot complete) |
Want to “operate the project by conversation” for a long time | ✅ Turn AI into an in-house pair programmer with Serena |
🔧 If you are hesitant…
🌱 Try with Codex first, and if you feel its limitations, introduce Serena -> This is the most stress-free way to proceed.
✅ How will things change with the advent of ChatGPT Code series?
1. Evolution of the integrated environment
What OpenAI is preparing, like “ChatGPT Code,” aims for the following:
- 🧠 Understand the project structure
- 💬 Edit and create code while conversing on the spot
- 🧪 Perform everything from execution to testing in one go
-> A future where the “context memory” and “cross-file understanding” that Serena was responsible for will be natively absorbed by the ChatGPT main body.
🔄 Serena vs ChatGPT Code series: Role change
Function | Now (Serena + GPT) | Future (ChatGPT Code) |
---|---|---|
Understanding of local structure | ✅ Supplemented with Serena | ✅ Towards native support? |
File operations/execution | ❌ External tools required | ✅ To be integrated |
Code refactoring through conversation | ✅ Precisely with Serena | ✅ Completed with just ChatGPT? |
Security | ✅ Local completion | ⚠️ Selection required as it is cloud-based |
🧠 What is the value of Serena that will remain?
Even if ChatGPT evolves in the future, a local dedicated agent like Serena may have the following value:
Remaining value | Description |
---|---|
🔒 Development sites that emphasize privacy | Can be completed only locally |
🧩 Want to use it as a base for your own AI integration | Can be used when you want to integrate with Claude or other LLMs |
🛠 Users who want to extend AI functions | To host their own MCP functions |
🐧 Linux/CLI-oriented developers | Suitable for lightweight & flexible operation |
🔚 Conclusion: What to do when ChatGPT Code becomes full-fledged?
- ✅ For those who write code, the future will probably be easier with ChatGPT Code
- ❗ However, if security is important or local completion is necessary, Serena will also remain
8. Example of Integration with Gemini CLI
Gemini CLI integrates with Serena in a slightly different way than clients that directly support the MCP protocol. Gemini CLI functions as a “smart assistant that understands the user’s natural language instructions and generates and executes the optimal Serena commands.”
Special notation like `serena://` is not required. Just by describing what you want to do, Gemini CLI will assemble and propose the appropriate `uvx ... serena ...` command.
Image of conversation with Gemini CLI
Example 1: Indexing a project
You:
Have Serena read the entire structure of this project.
Gemini CLI:
Yes, I will index the current directory with Serena. I will execute the following command.
uvx --from git+https://github.com/oraios/serena index-project .
Example 2: Questions about code
You:
Ask Serena what the `clean_text` function in `src/utils/parse_input.py` is doing.
Gemini CLI:
Understood. I will inquire with Serena to analyze the purpose of the function.
uvx --from git+https://github.com/oraios/serena query --request "What is the purpose of the clean_text function in src/utils/parse_input.py?"
(Execution result from Serena) The `clean_text` function removes special symbols from the input string and returns normalized text…
Example 3: Command generation
You:
I want to delete all unused Docker images and containers, can you tell me the command?
Gemini CLI:
I will inquire with Serena for a safe deletion command.
uvx --from git+https://github.com/oraios/serena query --request "generate a command to delete all unused docker images and containers"
(Execution result from Serena) You can execute it with the following command:
docker container prune -f && docker image prune -a -f
Execute this command?
Merits of integrating with Gemini CLI
Feature | Merit |
---|---|
🧠 Natural Language Interface | You don’t need to remember Serena’s complicated command system. Just talk about what you want to do. |
🔐 Safe Execution | The command generated by Gemini CLI always asks for user confirmation before execution. Prevents unintended operations. |
🛠️ Fusion with other tools | A series of tasks such as checking differences with `git`, listing files with `ls`, and analyzing code with Serena can all be completed in one seamless conversation. |
Context Understanding | Gemini CLI understands the current directory and the context of past conversations, so it can generate accurate Serena commands with fewer instructions. |
In this way, by using Gemini CLI as a front end, it becomes possible to draw out the powerful functions of Serena more easily and safely.
Let’s actually try it.
✅ Practical Example
A) Registering while it is already running:
uvx --from git+https://github.com/oraios/serena serena project index /home/mamu/gemini
B) If you want to start and register all at once:
uvx --from git+https://github.com/oraios/serena serena start-mcp-server --project /home/mamu/gemini --context ide-assistant
✅ Confirmation Method
- `gemini` appears in the output of `ls ~/.serena/projects`
- A line like the following appears in the log:
Available projects:
- gemini
💡 Why is this command necessary?
When Serena communicates with an LLM (such as Claude or ChatGPT), information on “which project to process” is essential, and if this is not registered, it will respond with “more information is needed to communicate with the server.”
However, an error occurred.
Project configuration auto-generation failed after 0.004 seconds
Traceback (most recent call last):
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/bin/serena", line 12, in <module>
sys.exit(top_level())
~~~~~~~~~^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 1442, in __call__
return self.main(*args, **kwargs)
~~~~~~~~~^^^^^^^^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 1363, in main
rv = self.invoke(ctx)
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 1830, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 1830, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 1226, in invoke
return ctx.invoke(self.callback, **ctx.params)
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/click/core.py", line 794, in invoke
return callback(*args, **kwargs)
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/serena/cli.py", line 434, in index
ProjectCommands._index_project(project, log_level)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/serena/cli.py", line 447, in _index_project
proj = Project.load(os.path.abspath(project))
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/serena/project.py", line 62, in load
project_config = ProjectConfig.load(project_root, autogenerate=autogenerate)
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/serena/config/serena_config.py", line 259, in load
return cls.autogenerate(project_root)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^
File "/home/mamu/.cache/uv/archive-v0/MqCQ5wrQ_K5PEzhGkR_n8/lib/python3.13/site-packages/serena/config/serena_config.py", line 198, in autogenerate
raise ValueError(
...<8 lines>...
)
ValueError: No source files found in /home/mamu/gemini
To use Serena with this project, you need to either:
1. Add source files in one of the supported languages (Python, JavaScript/TypeScript, Java, C#, Rust, Go, Ruby, C++, PHP)
2. Create a project configuration file manually at:
/home/mamu/gemini/.serena/project.yml
Example project.yml:
project_name: gemini
language: python # or typescript, java, csharp, rust, go, ruby, cpp, php
Upon investigation, as of the current time (August 2025), Serena does not support .bat (Windows batch) or .ps1 (PowerShell) scripts.
🔍 Languages supported by Serena (as of August 2025)
As stated in Serena’s [project.yml template] and error messages, the supported languages are as follows:
Language | Extension |
---|---|
Python | .py |
TypeScript / JavaScript | .ts , .js |
Java | .java |
C# | .cs |
Rust | .rs |
Go | .go |
Ruby | .rb |
C++ | .cpp , .hpp , etc. |
PHP | .php |

❌ Not supported
Type | Extension |
---|---|
Windows batch script | .bat |
PowerShell script | .ps1 |
Shell script (partial support) | .sh (limited) |
Markdown/text file | .md , .txt (premise of combining with code) |
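Before writing a config by hand, it can help to confirm the diagnosis: the “No source files found” error fires when none of the supported extensions exist in the folder. The sketch below illustrates this check; the /tmp/serena-check path and the tool.ps1 / app.py file names are made-up examples, not from this project.

```shell
# Sketch: check whether a folder contains any source files Serena can index.
# /tmp/serena-check, tool.ps1 and app.py are hypothetical examples.
d=/tmp/serena-check
mkdir -p "$d"
touch "$d/tool.ps1" "$d/app.py"
# Same extension list as the supported-languages table above:
find "$d" -type f \( -name '*.py' -o -name '*.ts' -o -name '*.js' \
  -o -name '*.java' -o -name '*.cs' -o -name '*.rs' -o -name '*.go' \
  -o -name '*.rb' -o -name '*.cpp' -o -name '*.php' \)
# Only app.py is listed; a folder holding only .ps1/.bat files prints
# nothing, which is exactly the auto-generation failure case above.
```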
✅ How to support them?
Method 1: Manually write a project configuration file and also target .ps1, etc.
Although Serena does not formally target .ps1 files for analysis, you can configure it to include them as read targets. Try creating a project.yml like the following:
project_name: gemini
language: python # dummy (can be anything)
include:
- "**/*.ps1"
- "**/*.bat"
Then re-index:
uvx --from git+https://github.com/oraios/serena serena project index /home/mamu/gemini
With this setting, Serena will read the contents of these files, although it will not parse their syntax.
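The steps above can be scripted in one go. A minimal sketch, assuming a project at /tmp/gemini-demo (an example path; substitute your own project root):

```shell
# Write the project.yml described above, then display it to verify.
PROJ=/tmp/gemini-demo             # example path, not from the article
mkdir -p "$PROJ/.serena"
cat > "$PROJ/.serena/project.yml" <<'EOF'
project_name: gemini
language: python  # dummy (can be anything)
include:
- "**/*.ps1"
- "**/*.bat"
EOF
cat "$PROJ/.serena/project.yml"
```

After this, re-run the serena project index command against the same path.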
Let’s actually start gemini and check.
╭────────────────────────────────────────────────────────────────────────╮
│ > Ask Serena for the purpose of all the files in the project. │
╰────────────────────────────────────────────────────────────────────────╯
✦ Understood. I will investigate and explain the purpose of all files in the project. First, I will list the files in the project.
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✔ ReadFolder . │
│ │
│ Listed 3 item(s). │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ Understood. I have confirmed the files in the project. I will now read the contents of each file to grasp its purpose. I will start with CompactWSLVHD.bat.
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✔ ReadManyFiles Will attempt to read and concatenate files using patterns: `/home/mamu/gemini/CompactWSL… │
│ │
│ ### ReadManyFiles Result (Target Dir: `/home/mamu/gemini`) │
│ │
│ Successfully read and concatenated content from **3 file(s)**. │
│ │
│ **Processed Files:** │
│ - `.serena/project.yml` │
│ - `CompactWSLVHD.bat` │
│ - `CompactWSLVHD.ps1` │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ Write-Host "Press any key to exit..."
$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
Thank you for waiting. The purpose of each file is as follows.
* CompactWSLVHD.bat: This is a batch file for optimizing the virtual hard disk (VHDX) file of WSL (Windows Subsystem for Linux). When you specify the path of the VHDX file, it optimizes the file using the diskpart command.
* CompactWSLVHD.ps1: Like CompactWSLVHD.bat, this is a PowerShell script for optimizing the VHDX file of WSL. This one automatically detects the path of the VHDX file, and even if the file is already in use, it includes a function to release it once and then execute the optimization.
* .serena/project.yml: This is the configuration file for this project itself. It defines that the project name is gemini, the language is python, and that .ps1 and .bat files are included in the project.
In summary, this project seems to be a set of tools for efficiently managing the virtual disk of WSL.
From what I can see in this output:
✅ It’s using Serena properly!
🔍 Solid evidence
The output in your interaction log, such as the following, is characteristic of processing by the Serena MCP server:
✔ ReadFolder .
✔ ReadManyFiles Will attempt to read and concatenate files using patterns
This is evidence that Serena MCP is enumerating the files in the project folder, reading the contents one by one, and passing them to an AI such as ChatGPT or Claude.
The output from Serena such as “ReadManyFiles” and “Processed Files” is an input preparation log for passing to the AI.
✨ The actual flow
- You ask the AI a question (“I want to know the purpose of all files”)
- The AI sends a ReadFolder request to Serena via MCP
- Serena recognizes the .ps1, .bat, and .yml files and summarizes their contents
- The AI receives the result and returns a summary, analysis, and explanation
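Under the hood, MCP messages are JSON-RPC 2.0. Purely for illustration, a tools/call request for the ReadFolder step might look like the following; the tool name and arguments are assumptions inferred from the log above, not Serena's documented schema.

```shell
# Write an illustrative MCP tool-call message, then validate the JSON.
cat > /tmp/mcp_call.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": { "name": "ReadFolder", "arguments": { "path": "." } }
}
EOF
python3 -m json.tool /tmp/mcp_call.json
```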
In other words, the current interaction is:
🔧 It has become an ideal usage where Serena (backend) + AI (frontend) are integrated!
🔁 To master it further
What you want to do | Example question (to AI) |
---|---|
Visualize the processing flow | “Please illustrate the processing of CompactWSLVHD.ps1 step by step” |
Investigate the cause of a bug | “Are there any bugs in this .ps1 ? Is there any code that is problematic in terms of WSL specifications?” |
Cross-platform support | “Can you make this code work on Mac and Linux environments as well?” |
I want to refactor | “Please redesign CompactWSLVHD.ps1 to be more readable” |
🔧 Supplement: What Serena does for you
Function | Description |
---|---|
ReadFolder | Grasps the project structure |
ReadFile / ReadManyFiles | Passes the contents of the file to the AI |
Indexing | Extracts symbols based on syntax (functions, classes) (not applicable to .ps1 this time) |
Cache management | Reuses the information of files that have been read once |
🧠 Serena is a “static analysis + context retention” type
In other words:
🔸 Languages that have a parser (such as Python) are “understood with meaning”
🔸 Unsupported files (such as .ps1) are “treated as just text”
Therefore, Serena cannot analyze and understand the meaning of the code in .ps1 or .bat files, but it can serve as a route for passing their contents to the AI.
✍️ Summary: The best thing you can do now
- Save the following configuration file to /home/mamu/gemini/.serena/project.yml:
project_name: gemini
language: python # temporary designation (has no meaning)
include:
- "**/*.ps1"
- "**/*.bat"
- Re-run indexing:
uvx --from git+https://github.com/oraios/serena serena project index /home/mamu/gemini
The result is as follows:
Indexing symbols in project /home/mamu/gemini…
Timeout waiting for Pyright analysis completion, proceeding anyway
Indexing: 0it [00:00, ?it/s]
Symbols saved to /home/mamu/gemini/.serena/cache/python/document_symbols_cache_v23-06-25.pkl




Serena: The Complete AI Coding Agent Guide (2025-08-02 Edition)
GitHub Repository: https://github.com/oraios/serena
Since the GitHub README is written in English, the following is a summary of its content.
TL;DR: Just run uvx --from git+https://github.com/oraios/serena serena start-mcp-server and pass the URL or command to an MCP-compatible client (e.g., Claude Desktop). Then, index your project with serena project index and tell the AI to “read Serena’s initial instructions” to get it up and running immediately.
0. About This Document
- This guide has been updated by comparing the original guide with the latest README (as of 2025-08-02), adding differences, notes, and practical know-how.
- It reflects changes in the command structure (e.g., serena-mcp-server -> serena start-mcp-server), provisional support for Python 3.12, SSE mode, and the advantages of Docker.
1. What is Serena?
Serena is an open-source code understanding and editing server compliant with the Model Context Protocol (MCP). It allows LLMs to invoke IDE-like symbolic search and refactoring operations, upgrading existing AIs like Claude Code and ChatGPT to be “fully context-equipped.”
- Free & OSS – Can be operated at zero cost by combining with the free tiers of commercial LLMs.
- Privacy-First – Utilize AI without sending your code out of your local environment.
- LLMO-Ready – Integrates with other frameworks like Gaia and Agno.
2. Recent Major Updates (July-August 2025)
Date | Change | Impact |
---|---|---|
2025-07-25 | Transition to subcommand structure: serena-mcp-server -> serena start-mcp-server | Existing scripts need modification. |
2025-07-23 | Command unification (old index-project is deprecated) | Aliases still work but will be removed in the future. |
2025-07-18 | SSE mode (--transport sse ) added | Allows for a persistent server via HTTP. |
2025-07-10 | Python 3.12 experimental support | Be aware of LSP compatibility issues. |
2025-07-05 | Docker image ghcr.io/oraios/serena:latest published | Avoids installing LS/dependencies on the host. |
3. Five Advantages of Adopting Serena
Aspect | Benefit | Notes |
---|---|---|
🔒 Privacy | AI collaboration without sending code to external APIs | Clears internal company policies without needing a VPN. |
📚 Continuous Knowledge | “Read once is enough” even for huge codebases through indexing | Completed by running serena project index . |
💸 Token Savings | Don’t paste the same code every time | Up to 90% reduction in some cases. |
⚡ Work Efficiency | Just say “run the tests for main.py “ | Auto-generates CLI commands like ChatOps. |
🧠 Response Accuracy | Drastically reduces off-target answers due to lack of context | Refactoring suggestions are on par with an IDE. |
4. Installation & Startup
4.1 Recommended: One-shot startup with uvx
# Run the latest version of Serena
uvx --from git+https://github.com/oraios/serena \
serena start-mcp-server
4.2 Local Clone + uv
# 1. Get the repository
git clone https://github.com/oraios/serena.git
cd serena
# 2. Start the server
uv run serena start-mcp-server
4.3 Docker (Experimental)
docker run --rm -i --network host \
-v /path/to/projects:/workspaces/projects \
ghcr.io/oraios/serena:latest \
serena start-mcp-server --transport stdio
Point: The language servers and Python are self-contained within the container, so it’s safe and doesn’t pollute the host.
4.4 SSE Mode (Persistent HTTP Server)
uv run serena start-mcp-server --transport sse --port 9121
Specify http://localhost:9121 on the client side.
5. Indexing a Project
For large repositories, the initial tool call can be slow, so do this beforehand:
serena project index /abs/path/to/your/project
The project will be auto-detected if you pass the --directory option or run the command from the project root.
6. Client Integration Recipes
6.1 Claude Desktop
In File › Settings › Developer › MCP Servers, add the following:
{
"mcpServers": {
"serena": {
"command": "/abs/path/to/uvx",
"args": ["--from", "git+https://github.com/oraios/serena", "serena", "start-mcp-server"]
}
}
}
6.2 VS Code / Cursor
- Install the “Cline” extension.
- In settings, specify the MCP Server Command as above.
- Command Palette -> Cline: Connect
7. Known Limitations & Workarounds (as of 2025-08-02)
Language | Status | Notes |
---|---|---|
Python | ✅ Stable | Also supports symbolic analysis of PyPI libraries. |
JavaScript/TypeScript | ⚠️ Unstable | Issues with pointer resolution on the LSP side. Refactoring is limited. |
Go / Rust | 🚧 Experimental | Depends on gopls and rust-analyzer settings. |
C/C++ | ⏳ Not supported | The dependent toolchain is heavy and awaiting verification. |
8. Python Version Compatibility
Version | Operation | Notes |
---|---|---|
3.11 | ✅ Fully supported | Officially recommended. |
3.12 | ⚠️ Experimental | Some LSPs fail to build. Virtual environment recommended. |
≤3.10 | ❌ Not supported | Due to lack of support for modern type hints. |
9. Frequently Asked Questions (FAQ)
Q. Is a GPU required?
A. Serena itself runs on the CPU. If a GPU is required on the LLM side (e.g., local Llama), consider it separately.
Q. Can you give an example of token savings?
A. For a project with over 300 files, sending 50KB each time was reduced to 4KB or less after indexing. Calculated with the monthly limit of Claude Pro, this is a reduction of about 92%.
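The arithmetic behind that figure is straightforward: going from 50KB per request down to 4KB is a (50 − 4) / 50 reduction.

```shell
# Quick check of the claimed savings: 50 KB before indexing vs. 4 KB after.
python3 -c "print(f'{(50 - 4) / 50:.0%} reduction')"   # prints "92% reduction"
```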
Q. Does it work on Windows?
A. Yes, uvx also works on Windows. Pay attention to the path separator (C:\path\to\uvx).
10. Recommended References
Maintainer’s Note: Serena is still under active development. Always keep up to date with git pull --ff-only and check the change history to avoid breakage.