A multi-threaded server can handle multiple client requests at the same time, improving responsiveness and throughput. Unlike a single-threaded server, which serves one client at a time, a multi-threaded server creates a new thread for each connection, so clients are handled concurrently.
Why Multi-Threaded Servers?
✔ Handle multiple clients concurrently
✔ Improve performance in high-traffic scenarios
✔ Reduce response time for each client
1. Single-Threaded vs Multi-Threaded Servers
Single-Threaded Server (Blocking)
A basic single-threaded server handles one client at a time; while it is busy, new connections wait in the listen backlog.
```python
import socket

# Create a TCP socket and start listening
server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.bind(("localhost", 8080))
server_socket.listen(5)
print("Server listening on port 8080...")

while True:
    # accept() blocks until a client connects
    client_socket, addr = server_socket.accept()
    print(f"Connected to {addr}")
    data = client_socket.recv(1024).decode()
    print(f"Received: {data}")
    client_socket.send("Hello from server!".encode())
    client_socket.close()  # the next client is accepted only after this one is done
```
Problem: while one client is being served, every other client has to wait.
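To make the blocking visible, you could add a short `time.sleep()` inside the server loop and run a simple client like the sketch below from two terminals at once; the host, port, and message are just the values used in the example above.

```python
# Minimal test client sketch for the single-threaded example above.
import socket

client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client_socket.connect(("localhost", 8080))
client_socket.send("Hello from client!".encode())
reply = client_socket.recv(1024).decode()
print(f"Server replied: {reply}")
client_socket.close()
```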
2. Multi-Threaded Server (Handling Multiple Clients)
A multi-threaded server creates a new thread for each client, handling multiple requests in parallel.
Using the `threading` Module
```python
import socket
import threading

def handle_client(client_socket, addr):
    # Each connection is served entirely inside its own thread
    print(f"Connected to {addr}")
    data = client_socket.recv(1024).decode()
    print(f"Received: {data}")
    client_socket.send("Hello from server!".encode())
    client_socket.close()

server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.bind(("localhost", 8080))
server_socket.listen(5)
print("Server listening on port 8080...")

while True:
    client_socket, addr = server_socket.accept()
    # Spawn a new thread per client so accept() can run again immediately
    client_thread = threading.Thread(target=handle_client, args=(client_socket, addr))
    client_thread.start()
```
✔ Each client request runs in a separate thread.
✔ Improves responsiveness of the server.
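One way to exercise the threaded server is to open several connections at once. The sketch below is an illustrative test client only; the helper name `make_request` and the connection count of 5 are arbitrary choices, not part of the server code above.

```python
import socket
import threading

def make_request(i):
    # Each thread opens its own connection to the threaded server
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(("localhost", 8080))
    s.send(f"Hello from client {i}!".encode())
    print(f"Client {i} got: {s.recv(1024).decode()}")
    s.close()

threads = [threading.Thread(target=make_request, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```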
3. Multi-Threaded Server with ThreadPool
Instead of creating a new thread for every request (which becomes inefficient under heavy load), use `ThreadPoolExecutor` to manage a fixed pool of worker threads.
```python
import socket
import concurrent.futures

def handle_client(client_socket, addr):
    print(f"Connected to {addr}")
    data = client_socket.recv(1024).decode()
    print(f"Received: {data}")
    client_socket.send("Hello from server!".encode())
    client_socket.close()

server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_socket.bind(("localhost", 8080))
server_socket.listen(5)
print("Server listening on port 8080...")

# A fixed pool of 10 worker threads serves all connections
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    while True:
        client_socket, addr = server_socket.accept()
        # submit() queues the connection; a free worker picks it up
        executor.submit(handle_client, client_socket, addr)
```
✔ Efficient resource management with a fixed number of threads.
✔ Prevents overhead from excessive thread creation.
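As a standalone aside (independent of the server code; the task function, the sleep, and the counts are made up purely for illustration), the following sketch shows that the pool reuses a fixed set of worker threads instead of creating one per task.

```python
import concurrent.futures
import threading
import time

def task(n):
    time.sleep(0.1)
    return f"task {n} ran on {threading.current_thread().name}"

with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
    for result in executor.map(task, range(6)):
        print(result)  # only three distinct worker-thread names appear
```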
4. Multi-Threaded TCP Server (Using `socketserver`)
Python's `socketserver` module provides `ThreadingMixIn`, which turns a blocking `TCPServer` into a multi-threaded server; the same mix-in also underlies the standard library's threaded HTTP servers.
```python
import socketserver

class ThreadedTCPServer(socketserver.ThreadingMixIn, socketserver.TCPServer):
    pass  # ThreadingMixIn dispatches each request to a new thread

class RequestHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # self.request is the client socket for this connection
        data = self.request.recv(1024).decode()
        print(f"Received: {data}")
        self.request.sendall("Hello from server!".encode())

server = ThreadedTCPServer(("localhost", 8080), RequestHandler)
with server:
    server.serve_forever()
```
✔ ThreadingMixIn automatically handles each request in a new thread.
✔ Useful for scalable TCP or HTTP servers.
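For an HTTP counterpart, the standard library's `http.server.ThreadingHTTPServer` (Python 3.7+) builds on the same `ThreadingMixIn`. The sketch below is a minimal illustration; the handler class name and response body are arbitrary.

```python
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send a plain-text response to every GET request
        body = b"Hello from threaded HTTP server!"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Each incoming request is handled in its own thread
with ThreadingHTTPServer(("localhost", 8080), Handler) as httpd:
    httpd.serve_forever()
```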
5. Multi-Threaded Flask Server
Flask's built-in development server can handle multiple API requests concurrently when threading is enabled.
```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def home():
    return "Hello from Multi-threaded Flask Server!"

if __name__ == '__main__':
    # threaded=True lets the development server handle each request in its own thread
    app.run(host="0.0.0.0", port=8080, threaded=True)
```
✔ `threaded=True` enables multi-threading in Flask's development server.
✔ Allows concurrent requests in web applications.
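A quick way to check concurrent handling is to fire several requests at once. The sketch below is only a test idea: it assumes the third-party `requests` package is installed and the Flask app above is running on port 8080; the `fetch` helper and the request count are arbitrary.

```python
import concurrent.futures
import requests

def fetch(i):
    # Each call opens its own HTTP request to the Flask server
    response = requests.get("http://localhost:8080/")
    return i, response.status_code, response.text

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    for i, status, text in executor.map(fetch, range(5)):
        print(f"request {i}: {status} {text}")
```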
6. Multi-Threaded Server with WebSockets
A multi-threaded WebSocket server allows real-time communication.
Using `websockets` with `threading`
```python
import asyncio
import threading
import websockets

# Note: older releases of the websockets library pass a `path` argument to the
# handler; recent releases pass only the connection object.
async def handle_client(websocket, path):
    async for message in websocket:
        print(f"Received: {message}")
        await websocket.send(f"Server received: {message}")

def start_server():
    # Run the asyncio event loop in this background thread
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    server = websockets.serve(handle_client, "localhost", 8765)
    loop.run_until_complete(server)
    loop.run_forever()

server_thread = threading.Thread(target=start_server)
server_thread.start()
```
✔ Allows real-time chat & notifications over WebSockets.
✔ The event loop serves many clients asynchronously while running in a background thread.
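A matching client can use the same `websockets` library. The sketch below assumes the server above is already running on port 8765; the message text is arbitrary.

```python
import asyncio
import websockets

async def main():
    # Connect, send one message, and print the echoed reply
    async with websockets.connect("ws://localhost:8765") as websocket:
        await websocket.send("Hello, server!")
        reply = await websocket.recv()
        print(f"Server replied: {reply}")

asyncio.run(main())
```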
7. Asynchronous Server with `asyncio` (Async/Await)
Instead of threads, `asyncio` handles many clients concurrently on a single event loop, without blocking.
```python
import asyncio

async def handle_client(reader, writer):
    # Each connection gets its own coroutine, not its own thread
    data = await reader.read(100)
    message = data.decode()
    print(f"Received: {message}")
    writer.write("Hello from server!".encode())
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_client, "localhost", 8080)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```
✔ Uses async/await instead of threads.
✔ Highly scalable for thousands of connections.
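A matching client can use asyncio streams as well. The sketch below assumes the asyncio server above is running on port 8080; the message text is arbitrary.

```python
import asyncio

async def main():
    # Open a connection, send a message, and read the reply
    reader, writer = await asyncio.open_connection("localhost", 8080)
    writer.write("Hello from client!".encode())
    await writer.drain()
    reply = await reader.read(100)
    print(f"Server replied: {reply.decode()}")
    writer.close()
    await writer.wait_closed()

asyncio.run(main())
```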
8. Comparing Multi-Threading vs Asyncio
| Feature | Multi-Threading | Asyncio (Non-blocking) |
|---|---|---|
| Performance | Moderate | High (efficient) |
| Complexity | Easy | More complex |
| Concurrency | Limited by threads | Handles thousands of tasks |
| Best For | I/O-bound tasks | I/O-bound tasks with high scalability |
Use Multi-Threading when handling a moderate number of concurrent requests (e.g., web APIs, small servers).
Use Asyncio when handling thousands of concurrent connections (e.g., WebSockets, live streaming).