A library of composable Python executors.
This library is intended for use with the `concurrent.futures` module. It includes a collection of `Executor` implementations that extend the behavior of `Future` objects:
- Futures with implicit retry
- Futures with implicit cancel on executor shutdown
- Futures with implicit cancel after timeout
- Futures with transformed output values
- Futures resolved by a caller-provided polling function
- Throttle the number of futures running at once
- Synchronous executor
- Bridge `concurrent.futures` with `asyncio`
- Convenience API for creating executors
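To illustrate the "transformed output values" idea, here is a minimal stdlib-only sketch of a map-style executor; `MapExecutor` and its methods here are hypothetical illustrations, not the library's implementation:

```python
from concurrent.futures import ThreadPoolExecutor

class MapExecutor:
    """Wraps a delegate executor so every future's result passes through fn."""
    def __init__(self, delegate, fn):
        self._delegate = delegate
        self._fn = fn

    def submit(self, task, *args, **kwargs):
        # Run the original task, then transform its result before the
        # future resolves.
        return self._delegate.submit(
            lambda: self._fn(task(*args, **kwargs)))

    def shutdown(self, wait=True):
        self._delegate.shutdown(wait=wait)

executor = MapExecutor(ThreadPoolExecutor(max_workers=2), lambda x: x * 10)
future = executor.submit(lambda: 4)
print(future.result())  # 40
executor.shutdown()
```

The caller keeps the familiar `submit`/`result` workflow; the transformation happens inside the future, which is the composition style the library builds on.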
See the API documentation for detailed information on usage.
This example combines the map and retry executors to create futures for HTTP requests running concurrently, decoding JSON responses within the future and retrying on error.
```python
import requests
from concurrent.futures import as_completed
from more_executors import Executors

def get_json(response):
    response.raise_for_status()
    return (response.url, response.json())

def fetch_urls(urls):
    # Configure an executor:
    # - run up to 4 requests concurrently, in separate threads
    # - run get_json on each response
    # - retry up to several minutes on any errors
    executor = Executors.\
        thread_pool(max_workers=4).\
        with_map(get_json).\
        with_retry()

    # Submit requests for each given URL
    futures = [executor.submit(requests.get, url)
               for url in urls]

    # Futures API works as normal; we can block on the completed
    # futures and map/retry happens implicitly
    for future in as_completed(futures):
        (url, data) = future.result()
        do_something(url, data)
```
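The implicit retry above can be approximated with the standard library alone. This is a hypothetical sketch, not the library's implementation; `submit_with_retry` and its `attempts`/`delay` parameters are illustrative names:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def submit_with_retry(executor, fn, *args, attempts=3, delay=0.01):
    """Submit fn, retrying inside the future on any exception."""
    def run():
        for attempt in range(attempts):
            try:
                return fn(*args)
            except Exception:
                if attempt == attempts - 1:
                    raise  # out of attempts: let the future fail
                time.sleep(delay)  # back off before retrying
    return executor.submit(run)

calls = {"n": 0}
def flaky():
    # Fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "ok"

with ThreadPoolExecutor(max_workers=1) as pool:
    result = submit_with_retry(pool, flaky)

print(result.result())  # ok
```

The point of the library's approach is that this logic lives in the executor rather than at each call site, so callers keep using plain `submit`.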
Changelog:

- Improved RetryPolicy API
- Fixed a race condition leading to RetryExecutor hangs
- Added `logger` argument to each executor
- Introduced ThrottleExecutor
- Fixed missing long_description in package
- Revised TimeoutExecutor concept to "cancel after timeout"
- Introduced AsyncioExecutor
- Introduced TimeoutExecutor
- Use monotonic clock in RetryExecutor
- Avoid some uninterruptible sleeps on Python 2.x
- Minor improvements to logging
This library is available under the GPLv3 license.