Description
Hi,
I've been thinking a bit about how we could implement the DataLoader pattern in v3 while still running in multi-threaded mode. Since v3 does not support Syrus's Promise library, we need to come up with a story for batching in async mode as well as in multi-threaded environments. There are many libraries that do not support `asyncio`, and there are many cases where it does not make sense to go fully async.
As far as I understand, the only way to batch resolver calls from a single frame of execution would be to use `loop.call_soon`. But since `asyncio` is not threadsafe, that means we would need to run a separate event loop in each worker thread. We would need to wrap the `graphql` call with something like this:
```python
def run_batched_query(...):
    loop = asyncio.new_event_loop()
    try:
        # graphql() returns a coroutine in v3, so the loop drives it to completion
        return loop.run_until_complete(graphql(...))
    finally:
        loop.close()
```
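To make the batching side more concrete, here is a rough toy sketch of what I have in mind with `loop.call_soon`: loads issued within one frame of execution get queued, and a single flush callback resolves them all in one batch. None of these names (`TinyBatchLoader`, `run_toy_query`, the fake batch function) come from graphql-core; they are just placeholders for illustration.

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor


class TinyBatchLoader:
    """Placeholder loader: queues keys loaded in one frame of execution
    and resolves them together in a single flush scheduled via call_soon."""

    def __init__(self, batch_fn):
        self._batch_fn = batch_fn
        self._queue = []  # pending (key, future) pairs
        self._flush_scheduled = False

    def load(self, key):
        loop = asyncio.get_running_loop()
        future = loop.create_future()
        self._queue.append((key, future))
        if not self._flush_scheduled:
            self._flush_scheduled = True
            # defer the flush until the current frame has queued all its keys
            loop.call_soon(self._flush)
        return future

    def _flush(self):
        queue, self._queue = self._queue, []
        self._flush_scheduled = False
        results = self._batch_fn([key for key, _ in queue])
        for (_, future), result in zip(queue, results):
            future.set_result(result)


async def resolve_many(loader, keys):
    # stands in for several resolvers calling load() in the same frame
    return await asyncio.gather(*(loader.load(key) for key in keys))


def run_toy_query(keys):
    # one private event loop per worker thread, since asyncio is not threadsafe
    loop = asyncio.new_event_loop()
    try:
        loader = TinyBatchLoader(lambda ks: [f"value:{k}" for k in ks])
        return loop.run_until_complete(resolve_many(loader, keys))
    finally:
        loop.close()


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=2) as pool:
        for batch in pool.map(run_toy_query, [["a", "b"], ["c", "d"]]):
            print(batch)  # each thread batches its own loads in one flush
```

The `run_toy_query` wrapper plays the same role as `run_batched_query` above, just with a dummy resolver instead of a real `graphql` call.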
Is that completely crazy? If so, do you see a less hacky way? I'm not very familiar with `asyncio`, so I would love to get feedback.
Cheers