I’ve been playing with asyncore and 2.2’s new generator model. Here’s the idea:
Instead of passing events to callbacks, pass information to a single handle function (a generator) via instance attributes. When the handler needs more data from the network, use yield to pass control back to the asyncore dispatcher.
Does it work? You bet.
from __future__ import generators  # required for yield in Python 2.2

import string

from YieldAsyncore import *

class httpClient(YieldDispatcherWithSend):

    def __init__(self, host, port, path):
        self.host = host
        self.path = path
        YieldDispatcherWithSend.__init__(self, host, port)

    def handle(self):
        # send request
        self.send(
            "GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (self.path, self.host)
            )
        yield OK
        # look for response header
        while self.data:
            try:
                i = string.index(self.data, "\r\n\r\n")
            except ValueError:
                yield MORE # need more data
            else:
                print self, repr(self.data[:i])
                self.data = self.data[i+4:]
                if not self.data:
                    yield OK
                break
        # process response body
        while self.data:
            print self, len(self.data)
            yield OK
        print self, "done"
It’s not co-routines, but it’s pretty damn close!
(In case it’s not obvious, the trick here is that you can create any number of httpClient instances, and let them all run in parallel.)
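The YieldAsyncore module itself isn't shown above, so here is a minimal sketch of what a YieldDispatcherWithSend might look like. The details are guesses, not the original implementation: in particular, the sketch assumes OK and MORE are simple markers, where MORE asks the dispatcher to keep the current self.data and append the next chunk to it, while OK asks it to replace self.data with the next chunk before resuming the generator.

    # YieldAsyncore.py -- a hypothetical sketch, not the original module
    import asyncore
    import socket

    OK   = 0    # handler is done with self.data; replace it next time
    MORE = 1    # handler needs more data appended to self.data

    class YieldDispatcherWithSend(asyncore.dispatcher_with_send):

        def __init__(self, host, port):
            asyncore.dispatcher_with_send.__init__(self)
            self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
            self.connect((host, port))
            self.data = ""
            self.state = OK
            self.handler = self.handle() # create (but don't run) the generator

        def step(self):
            # resume the handler; it runs until its next yield
            try:
                self.state = self.handler.next()
            except StopIteration:
                self.close()

        def handle_connect(self):
            self.step() # first run: the handler sends its request

        def handle_read(self):
            chunk = self.recv(8192)
            if self.state == MORE:
                self.data = self.data + chunk # extend the buffer
            else:
                self.data = chunk # hand over a fresh chunk
            self.step()

        def handle_close(self):
            self.data = "" # end of stream: let the handler run to completion
            self.step()    # step() closes the socket when the generator ends
            self.close()   # ...and make sure it's closed even if it didn't

With something like that in place, running several transfers "in parallel" on a single thread is just a matter of creating the instances and entering the loop (the hosts and paths here are only placeholders):

    httpClient("www.python.org", 80, "/")
    httpClient("www.python.org", 80, "/doc/")
    asyncore.loop()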
asyncore and generators are both Python features that relate to asynchronous programming, but they are used in different contexts and are implemented quite differently.
asyncore: asyncore is a module in the Python standard library that provides an asynchronous socket server and client implementation. It is based on the non-blocking I/O model, allowing you to handle multiple socket connections efficiently, and it uses callbacks and event-driven programming to manage I/O operations.
With asyncore, you define a subclass of the asyncore.dispatcher class and override its methods to handle events such as a connection being established, data arriving, or the connection closing. The asyncore.loop() function then enters the event loop, which continuously polls for events and invokes the appropriate methods.
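For comparison with the generator-based version above, a bare-bones callback-style client might look something like the sketch below; the host and path are placeholders, and the class follows the usual asyncore pattern of a write buffer plus one callback per event.

    import asyncore
    import socket

    class CallbackHTTPClient(asyncore.dispatcher):
        # plain asyncore: every event arrives in a separate callback

        def __init__(self, host, port, path):
            asyncore.dispatcher.__init__(self)
            self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
            self.connect((host, port))
            self.buffer = "GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)

        def handle_connect(self):
            pass # connection established; writable() takes it from here

        def writable(self):
            return len(self.buffer) > 0 # only ask for write events while sending

        def handle_write(self):
            sent = self.send(self.buffer)
            self.buffer = self.buffer[sent:]

        def handle_read(self):
            print self, len(self.recv(8192)) # data arrived

        def handle_close(self):
            print self, "done"
            self.close()

    CallbackHTTPClient("www.python.org", 80, "/")
    asyncore.loop() # poll for events and dispatch to the callbacks above

Note how any parsing state (for example, "have we seen the end of the headers yet?") has to live in instance attributes and be re-checked on every handle_read call; keeping that state implicitly in a suspended generator is exactly the bookkeeping the yield-based version above avoids.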