I am programming a proxy server in Python. When offering it as a service to others, a lot of connections will obviously be made. How would you handle that many connections efficiently?
My current solution is to create a thread for each connection a person makes, but sooner or later this will lead to context-switching overhead, and that will be a problem. A ThreadPool isn't really a solution either, because the connections can be long-lived, so they would block and tie up more and more workers over time.
So what would be the solution? A single person can easily create 200 threads / connections.
Here is my current accept loop in Python:
def acceptConnections(self):
    logger.info("HTTP Awaiting new connections...")
    while self.running:
        try:
            if Debug:
                logger.debug("Awaiting connection")
            sock, address = self.__proxy_sock.accept()
            if address[0] in self.blockedIPs:
                sock.close()  # close the blocked client and keep accepting;
                continue      # a bare `return` here would stop the whole loop
            if Debug:
                logger.debug(f"connection request: {address}")
            threading.Thread(target=self.__handleConnection, args=(sock, address[0])).start()
        except KeyboardInterrupt as e:
            logger.debug(f"Closing thread: {e}")
            break
Right now I am just gathering ideas. I asked ChatGPT and it gave me an async solution, but that would require recoding everything, which is a lot of work. ThreadPools aren't a solution either (maybe I am wrong), as described above.
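For context, this is roughly the shape of the asyncio version that was suggested. It is only a minimal sketch; handle_connection, BLOCKED_IPS, and the host/port are placeholders, and the real proxy logic (read the request, connect upstream, pipe data both ways) would replace the comment:

import asyncio

BLOCKED_IPS = set()  # placeholder for the real block list

async def handle_connection(reader, writer):
    # one coroutine per connection instead of one thread per connection
    addr = writer.get_extra_info("peername")
    if addr[0] in BLOCKED_IPS:
        writer.close()
        await writer.wait_closed()
        return
    # ... actual proxy logic would go here ...
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_connection, "0.0.0.0", 8080)
    async with server:
        await server.serve_forever()

# asyncio.run(main())

The point is that thousands of idle long-lived connections cost almost nothing here, because they are just suspended coroutines waiting on the event loop rather than parked OS threads.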
Because this should be a service for many other people to use, the program should be able to handle thousands of concurrent connections without much lag.
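One middle ground I am considering, which avoids both thread-per-connection and a full asyncio rewrite, is a single-threaded event loop built on the stdlib selectors module. This is only a sketch under my own assumptions; the echo in handle() stands in for real proxy logic, and all names here are placeholders:

import selectors
import socket

sel = selectors.DefaultSelector()

def accept(server_sock):
    # new client: register it for read events on the same selector
    sock, addr = server_sock.accept()
    sock.setblocking(False)
    sel.register(sock, selectors.EVENT_READ, handle)

def handle(sock):
    data = sock.recv(4096)
    if data:
        # placeholder: echo; a proxy would forward to the upstream socket
        sock.sendall(data)
    else:
        # empty read means the peer closed the connection
        sel.unregister(sock)
        sock.close()

def make_server(host, port):
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen()
    server.setblocking(False)
    sel.register(server, selectors.EVENT_READ, accept)
    return server

def loop():
    # one thread multiplexes every socket instead of one thread per socket
    while True:
        for key, _ in sel.select():
            key.data(key.fileobj)

This is essentially what asyncio does under the hood, so it might be a stepping stone: the existing blocking handlers would still need to become non-blocking callbacks, but the accept loop itself barely changes.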