An ultra-lightweight, high-performance, on-premise chat solution designed for secure communication within local networks. All traffic stays inside the LAN, keeping message data private and out of reach of external networks.
- Massive Scalability: Engineered to handle over 6,000 concurrent connections using optimized multithreading.
- Zero-Internet Dependency: Works entirely within your LAN, preventing external data interception.
- Dockerized Deployment: Includes a Docker configuration so the server can run anywhere Docker is available.
- Strict JSON Protocol: A standardized messaging format used to communicate between all architecture components.
- Interactive Endpoints: Built-in server commands for listing users, broadcasting, and server-time synchronization.
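To illustrate how such built-in commands could be wired into the server, here is a minimal, hypothetical dispatcher. The command keywords (`list`, `time`, `broadcast`) and the `registry` mapping are illustrative assumptions, not the project's actual identifiers.

```python
import json
import time


def handle_command(packet: dict, registry: dict) -> str:
    """Hypothetical dispatcher: return a reply for a recognized server command,
    assuming the client puts the command keyword in the packet's "msg" field."""
    command = packet.get("msg", "").strip()
    if command == "list":
        # List the names of all currently registered users.
        return ", ".join(registry.keys())
    if command == "time":
        # Server-time synchronization: report the server clock.
        return time.strftime("%Y-%m-%d %H:%M:%S")
    if command.startswith("broadcast "):
        # Broadcast the remainder of the message to every connected client.
        text = command[len("broadcast "):]
        for name, conn in registry.items():
            conn.sendall(json.dumps({"from": {"name": "server", "ip": "0.0.0.0"},
                                     "to": name, "msg": text}).encode())
        return "broadcast sent"
    return "unknown command"
```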
The project implements a Client-Server architecture: every client connects only to the central server, forming a Star Topology.
- Server: A multithreaded TCP engine that manages a client registry and routes JSON packets (see the sketch after this list).
- Client: Utilizes two independent threads to handle simultaneous message sending and receiving.
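A minimal sketch of that thread-per-client server pattern, assuming a plain `socket`/`threading`/`json` stack; the port, buffer size, and registry layout are illustrative and not taken from the project's code.

```python
import json
import socket
import threading

clients = {}                     # name -> connection socket (the client registry)
clients_lock = threading.Lock()


def handle_client(conn):
    """Serve one client on its own thread: register it, then route its packets."""
    name = None
    try:
        # The first packet is assumed to be the registration handshake.
        packet = json.loads(conn.recv(4096).decode())
        name = packet["from"]["name"]
        with clients_lock:
            clients[name] = conn
        while True:
            data = conn.recv(4096)
            if not data:
                break
            target_name = json.loads(data.decode())["to"]
            with clients_lock:
                target = clients.get(target_name)
            if target:
                target.sendall(data)          # forward the JSON packet unchanged
    finally:
        if name is not None:
            with clients_lock:
                clients.pop(name, None)
        conn.close()


def serve(host="0.0.0.0", port=5000):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen()
    while True:
        conn, _addr = server.accept()
        # One daemon thread per connection keeps the accept loop responsive.
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()


if __name__ == "__main__":
    serve()
```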
- Language: Python for rapid development and extensive library support.
- Core Libraries: `socket` for networking, `threading` for concurrency, and `json` for data management.
- DevOps: Docker for containerization and GitHub Actions for CI/CD.
- Clone the repo:
git clone https://github.com/gizano/multithreaded-python-chat.git
cd multithreaded-python-chat
- Launch the server:
docker-compose up -d
- Start the Server:
python server/server_main.py
- Launch the Client:
python client/client_main.py
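For reference, a minimal sketch of the two-thread client design described in the architecture section: one thread blocks on the socket for incoming packets while the main thread reads user input and sends. The host, port, user name, and prompts are assumptions for illustration only.

```python
import json
import socket
import threading

HOST, PORT = "192.168.1.10", 5000   # example LAN address, not the project's default
NAME = "alice"                      # example user name


def receive_loop(sock):
    """Receiver thread: print every packet the server forwards to us."""
    while True:
        data = sock.recv(4096)
        if not data:
            break
        packet = json.loads(data.decode())
        print(f'{packet["from"]["name"]}: {packet["msg"]}')


sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((HOST, PORT))
local_ip = sock.getsockname()[0]

# Registration handshake, then start the independent receive thread.
sock.sendall(json.dumps({"from": {"name": NAME, "ip": local_ip},
                         "to": "server", "msg": "register"}).encode())
threading.Thread(target=receive_loop, args=(sock,), daemon=True).start()

# The sending loop runs on the main thread, concurrently with the receiver.
while True:
    recipient = input("to> ")
    text = input("msg> ")
    sock.sendall(json.dumps({"from": {"name": NAME, "ip": local_ip},
                             "to": recipient, "msg": text}).encode())
```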
Stability is our priority. We used AI-generated test suites to push the server to its limits.
- Stress Test: Simulates 50 bots performing handshake and random messaging.
- Breakpoint Test: Confirmed hardware-bound stability at 6,000+ active users.
Technical Note: The "TCP coalescing" issue was addressed using the `time` library to prevent packets from merging during registration.
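A sketch of what one stress-test bot might look like, assuming a hypothetical test host/port and bot naming scheme; the short `time.sleep` after the handshake mirrors the delay-based technique mentioned in the technical note, so the registration packet is not coalesced with the first chat message.

```python
import json
import random
import socket
import string
import threading
import time

HOST, PORT = "127.0.0.1", 5000   # assumed test server address
BOT_COUNT = 50                   # matches the 50-bot stress scenario


def run_bot(bot_id):
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((HOST, PORT))
    name = f"bot{bot_id}"
    # Handshake, then pause briefly so the registration packet gets its own
    # TCP segment instead of merging with the first chat message.
    sock.sendall(json.dumps({"from": {"name": name, "ip": "127.0.0.1"},
                             "to": "server", "msg": "register"}).encode())
    time.sleep(0.1)
    for _ in range(20):
        text = "".join(random.choices(string.ascii_letters, k=16))
        target = f"bot{random.randrange(BOT_COUNT)}"
        sock.sendall(json.dumps({"from": {"name": name, "ip": "127.0.0.1"},
                                 "to": target, "msg": text}).encode())
        time.sleep(random.uniform(0.05, 0.5))
    sock.close()


threads = [threading.Thread(target=run_bot, args=(i,)) for i in range(BOT_COUNT)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```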
All communication must adhere to the following JSON schema:
{
"from": { "name": "...", "ip": "..." },
"to": "...",
"msg": "..."
}
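For example, a packet from a hypothetical user "alice" to "bob" would be built and serialized like this before being written to the socket (names and addresses are illustrative):

```python
import json

packet = {
    "from": {"name": "alice", "ip": "192.168.1.42"},
    "to": "bob",
    "msg": "Hello from the LAN!",
}

# Every packet is serialized to JSON text before being sent over TCP.
wire_data = json.dumps(packet).encode("utf-8")
```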