Python HTTP server with multithreaded response handlers
I'm trying to set up a Python server that handles POST requests. When a request arrives, do_POST starts a new thread with self and the data; the thread then does its work and writes the result back through the handler object it received. Here is what I have so far:

```python
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer

class HTTPHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers['Content-Length'])
        data = self.rfile.read(length)
        Resolver(self, data).start()
        return
```
Then, in the resolver module, I do:

```python
import threading

class Resolver(threading.Thread):
    def __init__(self, http, number):
        threading.Thread.__init__(self)
        self.http = http
        self.number = number + "!"

    def run(self):
        self.http.send_response(200)
        self.http.send_header('Content-Type', 'text/html')
        self.http.send_header('Content-Length', str(len(self.number)))
        self.http.end_headers()
        # send the reply body
        self.http.wfile.write(self.number)
        return
```
Of course this is just an example, not the complete program; I'm still in the testing phase. It has to run on a weak platform (at the moment, a Raspberry Pi), so I'm looking for a performant solution. Any suggestions?
The problem is that BaseHTTPRequestHandler expects you to be done with the request by the time you return from do_POST. That isn't clear from the documentation, but it's obvious if you look at the source of handle_one_request, the method that calls your method:

```python
mname = 'do_' + self.command
# ...
method = getattr(self, mname)
method()
self.wfile.flush()  # actually send the response if not already done.
```

If you dig deeper, you'll see that, as you'd expect, the code assumes it can close or reuse the connection as soon as it finishes handling the request.

So, you can't use BaseHTTPRequestHandler this way.
You can, of course, write your own handler implementation instead. To a large extent, the stuff in BaseHTTPServer is meant as sample code more than as a powerful, efficient, robust, and flexible framework (which is why the docs link straight to the source).
Alternatively, instead of trying to create a thread per request, create a thread per connection. The ThreadingMixIn class makes that easy.
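Here is a minimal sketch of the thread-per-connection approach. It uses the Python 3 module names (`http.server` and `socketserver`; in Python 2 these were `BaseHTTPServer` and `SocketServer`), and the handler name `EchoHandler` and the `"!"`-appending behavior are just stand-ins mirroring the question's code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from socketserver import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    """Each incoming connection is handled in its own thread."""
    daemon_threads = True  # don't block interpreter exit on open connections

class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Do all the work (even slow work) right here; the mixin has
        # already moved this whole connection off the main thread.
        length = int(self.headers['Content-Length'])
        body = self.rfile.read(length) + b'!'
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the console quiet

# Usage: ThreadedHTTPServer(('', 8000), EchoHandler).serve_forever()
```

Because the whole request is handled before `do_POST` returns, this stays within the contract that `handle_one_request` expects, while still keeping slow handlers from blocking other connections.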
But an even better solution is to use a better framework, like Twisted or Tornado, or to use a web server that does the threading for you and calls your code via WSGI.
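To illustrate the WSGI suggestion, here is a sketch of the same echo logic as a WSGI application. The app itself knows nothing about threads; any WSGI server (mod_wsgi, uWSGI, gunicorn with a threaded worker, etc.) can run it concurrently. The `"!"` suffix again just mirrors the question's code:

```python
def app(environ, start_response):
    # Read the POST body, as the question's do_POST does.
    length = int(environ.get('CONTENT_LENGTH') or 0)
    data = environ['wsgi.input'].read(length)
    body = data + b'!'
    start_response('200 OK', [('Content-Type', 'text/html'),
                              ('Content-Length', str(len(body)))])
    # A WSGI app returns an iterable of byte strings.
    return [body]
```

For quick local testing you can serve it with the standard library's `wsgiref.simple_server.make_server(host, port, app)`, then switch to a production server later without changing the application code.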