Support async file types in `files={}` and `content=...` #1620
Hi, I'm interested in this issue, and I also found it in the `._content.encode_content()` function. The first thing that needs to be done is to detect the trio and anyio types. Could I have a try at getting it done? :D |
Ohhh, like you said, it's a multipart issue; it seems aiohttp handles multipart as well. |
Hey @meist0731, are you still working on this issue? If yes, can we work on it together? It seems like an interesting problem. |
Yep! I'm still working on this. It's my honor to work together with you :DDD I'm going to sleep soon, and tomorrow I will share the materials I've found so far. |
@meist0731 cheers, are you on Discord? It'll be easier to work together. |
Gotcha! I have discord, wait a sec, bro. |
Hey, I've sent the invitation :D |
@tomchristie how do you recommend we proceed with this issue? Can you explain where to start? |
I've tried these APIs, as below:
The multipart upload:
The problems here are, for example: this function accepts AsyncIterable objects but cannot actually iterate over them, since it is not an async function. |
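To illustrate that problem with a plain-Python sketch (not httpx internals, just the general shape of the issue): a synchronous generator cannot consume an AsyncIterable, because `async for` is only valid inside `async def`.

```python
import asyncio
from typing import AsyncIterable, AsyncIterator, Iterable, Iterator


async def async_chunks() -> AsyncIterator[bytes]:
    # Stands in for an async file or stream yielding body chunks.
    yield b"hello "
    yield b"world"


def sync_encode(chunks: Iterable[bytes]) -> Iterator[bytes]:
    # A plain generator can only use a synchronous `for`; handing it an
    # AsyncIterable fails at iteration time with
    # "TypeError: 'async_generator' object is not iterable".
    for chunk in chunks:
        yield chunk


async def async_encode(chunks: AsyncIterable[bytes]) -> AsyncIterator[bytes]:
    # The async variant can consume async generators (and async files).
    async for chunk in chunks:
        yield chunk


async def main() -> None:
    print(b"".join([chunk async for chunk in async_encode(async_chunks())]))


asyncio.run(main())
```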
The first problem has an existing discussion #1704 (comment) |
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
Argh, stale, wontfix, nooo 😱 ! Just want to make sure: uploading a file and data as multipart is still not supported by the async client, right? I'm getting an error with code like the following:

```python
from uuid import uuid4

import httpx
from aiofiles import open as aopen


async def send() -> httpx.Response:
    async with aopen("somefile.zip", "rb") as fp, httpx.AsyncClient() as client:
        files = {"content": ("somefile.zip", fp, "application/octet-stream")}
        data_to_send = {}  # the form fields sent alongside the file
        return await client.post(
            "http://localhost:8888",
            data=data_to_send,
            files=files,
            follow_redirects=False,
            headers={"Content-Type": f"multipart/form-data; boundary={uuid4().hex}"},
        )
```
|
@pawamoy No fix was implemented AFAIK. This seems like an issue stalebot closed due to lack of activity, rather than us deciding it shouldn't be acted upon. I guess we can reopen (stalebot would come back in a few months), and any attempts towards supporting the interfaces described in the OP (trio, anyio, aiofiles) would be welcome! |
While the issue is not resolved, I'm using the following monkey-patch; maybe it will be helpful:

```python
"""
This is a workaround monkey-patch for https://github.com/encode/httpx/issues/1620

If you need to upload an async stream as a multipart `files` argument, apply this patch
and wrap the stream with `AsyncStreamWrapper`::

    httpx_monkeypatch.apply()
    ...
    known_size = 42
    stream = await get_async_bytes_iterator_somehow_with_known_size(known_size)
    await client.post(
        'https://www.example.com',
        files={'upload': AsyncStreamWrapper(stream, known_size)},
    )
"""
import typing as t
from asyncio import StreamReader

from httpx import _content
from httpx._multipart import FileField
from httpx._multipart import MultipartStream
from httpx._types import RequestFiles


class AsyncStreamWrapper:
    def __init__(self, stream: t.Union[t.AsyncIterator[bytes], StreamReader], size: int):
        self.stream = stream
        self.size = size


class AsyncAwareMultipartStream(MultipartStream):
    def __init__(self, data: dict, files: RequestFiles, boundary: t.Optional[bytes] = None) -> None:
        super().__init__(data, files, boundary)
        # Async streams have no synchronous length, so report the declared size instead.
        for field in self.fields:
            if isinstance(field, FileField) and isinstance(field.file, AsyncStreamWrapper):
                field.get_length = lambda f=field: len(f.render_headers()) + f.file.size  # type: ignore # noqa: E501

    async def __aiter__(self) -> t.AsyncIterator[bytes]:
        for field in self.fields:
            yield b'--%s\r\n' % self.boundary
            if isinstance(field, FileField) and isinstance(field.file, AsyncStreamWrapper):
                yield field.render_headers()
                async for chunk in field.file.stream:
                    yield chunk
            else:
                for chunk in field.render():
                    yield chunk
            yield b'\r\n'
        yield b'--%s--\r\n' % self.boundary


def apply():
    _content.MultipartStream = AsyncAwareMultipartStream
```
|
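For example, here is an assumed usage sketch (not part of the patch above): bridging an `aiofiles` handle into `AsyncStreamWrapper`, taking the size from the filesystem. The module name `httpx_monkeypatch`, the URL, the chunk size, and the field name are placeholders.

```python
import os

import aiofiles
import httpx

import httpx_monkeypatch  # the patch above, saved as httpx_monkeypatch.py
from httpx_monkeypatch import AsyncStreamWrapper


async def upload(url: str, path: str) -> httpx.Response:
    httpx_monkeypatch.apply()
    size = os.path.getsize(path)

    async def chunks():
        # Bridge the async file into the async byte iterator the wrapper expects.
        async with aiofiles.open(path, "rb") as f:
            while data := await f.read(64 * 1024):
                yield data

    async with httpx.AsyncClient() as client:
        return await client.post(url, files={"upload": AsyncStreamWrapper(chunks(), size)})
```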
Has there been any progress on this by any chance? |
I am also using [...]. It says [...] |
If anyone is invested in making this happen I can make the time to guide a pull request through. |
I solved this problem for FastAPI. When reading an uploaded file from a form, FastAPI wraps a SpooledTemporaryFile in an async interface. To access the file with httpx, the async interface doesn't fit, but you can use the old-fashioned way: just swap the async wrapper for the underlying synchronous file, as sketched below.
|
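A minimal sketch of that workaround, assuming FastAPI's `UploadFile`; the endpoint path, target URL, and field name are illustrative, not taken from the original comment.

```python
import httpx
from fastapi import FastAPI, File, UploadFile

app = FastAPI()


@app.post("/forward")
async def forward(upload: UploadFile = File(...)) -> dict:
    # `upload.file` is the underlying SpooledTemporaryFile: a plain synchronous
    # file object that httpx can read for a multipart upload, unlike the async
    # UploadFile wrapper itself.
    files = {"file": (upload.filename, upload.file, upload.content_type)}
    async with httpx.AsyncClient() as client:
        response = await client.post("http://localhost:8888/target", files=files)
    return {"status_code": response.status_code}
```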
A monkey patch showing a possible solution (also cancels #1706 and covers #2399): https://gist.github.com/yayahuman/db06718ffdf8a9b66e133e29d7d7965f

And possible type annotations:

```python
from abc import abstractmethod
from typing import AnyStr, AsyncIterable, Iterable, Protocol, Union  # 3.8+


class Reader(Protocol[AnyStr]):
    __slots__ = ()

    @abstractmethod
    def read(self, size: int = -1) -> AnyStr:
        raise NotImplementedError


class AsyncReader(Protocol[AnyStr]):
    __slots__ = ()

    @abstractmethod
    async def read(self, size: int = -1) -> AnyStr:
        raise NotImplementedError


FileContent = Union[
    str,
    bytes,
    Iterable[str],
    Iterable[bytes],
    AsyncIterable[str],
    AsyncIterable[bytes],
    Reader[str],
    Reader[bytes],
    AsyncReader[str],
    AsyncReader[bytes],
]

RequestContent = FileContent
```
|
@tomchristie, would my monkey-patch approach be acceptable? |
Let me help guide this conversation a bit more clearly. |
What's the status of this ticket? There were a couple of people actively working toward a PR several years ago, and it's unclear what happened with that work. Since then, the conversation has revolved entirely around monkeypatches, temporary fixes, and workarounds. Is anyone actively looking at a proper solution within httpx itself? I am also in need of using this feature. |
OK, thanks. Since that comment was a year ago, I guess the answer is that this ticket is low priority and will not be done by the maintainer. |
I see. It would be better if you could give more context/details about your train of thought on this content case. Thanks~~ @tomchristie |
I'd suggest a good approach would be that I reciprocate effort by guiding contributors through the process.
|
We ought to support the following cases.
Raw upload content from an async file interface:
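For the `content=...` case, here is a hedged sketch of how this can already be bridged today, since `content=` accepts an async byte iterator; the goal described here is to be able to pass the async file object directly instead. The file path and chunk size are placeholders, and `aiofiles` is one possible async file library.

```python
import aiofiles
import httpx


async def file_chunks(path: str, chunk_size: int = 64 * 1024):
    # Bridge an async file into the AsyncIterable[bytes] that `content=` accepts.
    async with aiofiles.open(path, "rb") as f:
        while chunk := await f.read(chunk_size):
            yield chunk


async def upload_raw(url: str) -> httpx.Response:
    async with httpx.AsyncClient() as client:
        return await client.post(url, content=file_chunks("example.bin"))
```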
Multipart file upload from an async file interface:
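And a hedged sketch of the multipart usage this issue asks for (not currently supported), assuming trio's async file API; the file name and URL are placeholders.

```python
import httpx
import trio


async def upload_multipart(url: str) -> httpx.Response:
    async with await trio.open_file("example.bin", "rb") as f:
        async with httpx.AsyncClient() as client:
            # The feature request: accept an async file object in `files=`.
            return await client.post(url, files={"upload": f})
```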
We probably want to ensure that we're supporting both trio and anyio (which have the same interfaces), and perhaps also `aiofiles`. So, e.g., also supporting the following...
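For instance, a sketch under the assumption that the same call shape should work across libraries (file name and URL are illustrative):

```python
import anyio
import httpx


async def upload_with_anyio(url: str) -> httpx.Response:
    async with await anyio.open_file("example.bin", "rb") as f:
        async with httpx.AsyncClient() as client:
            # Same shape as the trio example above, using anyio's async file API;
            # the issue asks for this to be supported directly.
            return await client.post(url, content=f)
```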
The `content=...` case is a little simpler than the `data=...` case, since it really just needs an async variant of `peek_filelike_length`, and a minor update to the `._content.encode_content()` function.

Also fiddly is what the type annotations ought to look like.
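For illustration, an async variant of `peek_filelike_length` might look roughly like the sketch below. This is an assumption about one possible shape, not httpx's actual implementation; it only relies on awaitable `tell()`/`seek()` methods, which trio, anyio, and aiofiles all provide.

```python
import os
import typing


async def peek_async_filelike_length(stream: typing.Any) -> typing.Optional[int]:
    """Return the number of bytes remaining in an async file-like object, if knowable."""
    try:
        offset = await stream.tell()
        end = await stream.seek(0, os.SEEK_END)
        await stream.seek(offset)  # restore the original position
    except (AttributeError, OSError, TypeError):
        return None
    return end - offset
```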