
Unbound response error #726

Open
ddematheu opened this issue Mar 13, 2024 · 13 comments
Assignees
silentworks
Labels
needs repro The issue could not be reproduced

Comments

@ddematheu

Bug report

Describe the bug

The library is throwing an UnboundLocalError for the variable response.

To Reproduce

Call: response = supabase.storage.from_(bucket_name).download(file_path)

Full traceback

File "/app/supabase_utils/download_utils.py", line 39, in download_with_retry
response = supabase.storage.from_(bucket_name).download(file_path)
File "/usr/local/lib/python3.10/site-packages/storage3/_sync/file_api.py", line 342, in download
response = self._request(
File "/usr/local/lib/python3.10/site-packages/storage3/_sync/file_api.py", line 50, in _request
resp = response.json()
UnboundLocalError: local variable 'response' referenced before assignment
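
In the meantime, here is the caller-side retry wrapper I use as a workaround. This is only a rough sketch of the download_with_retry helper from the traceback above; the retry parameters and the StorageException import path are assumptions, so adjust them for your storage3 version.

import time

from storage3.utils import StorageException  # import path may differ between storage3 versions

def download_with_retry(supabase, bucket_name, file_path, attempts=3, delay=1.0):
    """Retry the download when the client raises the UnboundLocalError
    described above or an ordinary StorageException."""
    last_exc = None
    for _ in range(attempts):
        try:
            return supabase.storage.from_(bucket_name).download(file_path)
        except (UnboundLocalError, StorageException) as exc:
            # UnboundLocalError is the symptom of the bug in storage3's _request;
            # StorageException covers normal HTTP failures.
            last_exc = exc
            time.sleep(delay)
    raise last_exc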

Expected behavior

No unbound error.

@ddematheu ddematheu added the bug Something isn't working label Mar 13, 2024
@Zeulni

Zeulni commented Mar 17, 2024

I have a similar issue with the same call response = supabase.storage.from_(bucket_name).download(file_path)

Suddenly, without any change, I get the error "ERROR: cannot access local variable 'response' where it is not associated with a value"

@ddematheu
Author

Same here. No change on my side, and I started seeing the error.

@Abullity

Abullity commented May 8, 2024

I have a similar issue.

Here's my code:

res = await get_async_client().storage.from_(PDF_BUCKET).upload(
    file=file,
    path=filename,
    file_options={"content-type": "application/pdf"}
)

return res.json()

This exception was raised because I was trying to upload a PDF larger than 50 MB; the error handling, however, is not robust. Here's the code responsible for the exception:

in storage3/_async/file_api.py:50

async def _request(
        self,
        method: RequestMethod,
        url: str,
        headers: Optional[dict[str, Any]] = None,
        json: Optional[dict[Any, Any]] = None,
        files: Optional[Any] = None,
        **kwargs: Any,
    ) -> Response:
        try:
            response = await self._client.request(
                method, url, headers=headers or {}, json=json, files=files, **kwargs
            )
            response.raise_for_status()
        except HTTPError:
            try:
                resp = response.json()  # <--- this line is responsible for the error 
                raise StorageException({**resp, "statusCode": response.status_code})
            except JSONDecodeError:
                raise StorageException({"statusCode": response.status_code})

        return response

@silentworks
Contributor

Unable to reproduce this. Please provide a reproducible repository with the issue.

@silentworks silentworks added needs repro The issue could not be reproduced and removed bug Something isn't working labels Jun 25, 2024
@silentworks silentworks self-assigned this Jun 25, 2024
@ddematheu
Author

@silentworks

The issue is transient, as it requires the request to fail, so it's hard to provide a repository that reproduces it.

My question to you would be: is there any case where self._client.request could fail without returning a response?

Clearly there is some edge case where the request fails without returning a response that can then be checked.
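
One candidate I can think of (just going off httpx's exception hierarchy, not verified against the library): transport errors such as ReadTimeout and ConnectError are subclasses of httpx.HTTPError, so the except HTTPError: in _request also catches failures where self._client.request never returned anything, leaving response unassigned. Quick check:

import httpx

# httpx exception hierarchy:
# ReadTimeout -> TimeoutException -> TransportError -> RequestError -> HTTPError
print(issubclass(httpx.ReadTimeout, httpx.HTTPError))      # True
print(issubclass(httpx.ConnectError, httpx.HTTPError))     # True
# Only HTTPStatusError (raised by raise_for_status) guarantees that a
# response object is attached to the exception.
print(issubclass(httpx.HTTPStatusError, httpx.HTTPError))  # True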

@ddematheu ddematheu reopened this Jul 4, 2024
@silentworks
Contributor

@ddematheu there aren't any cases that I know of or have come across myself personally. I will leave this issue open until someone can provide such an edge-case where this happens, but until then there isn't much I can do to fix it from the library's perspective.

@danielbichuetti

I will try to reproduce it in the upcoming week. When Storage returns 520 errors, response is unbound here too.

@ddematheu
Author

Today has been particularly bad; I'm seeing a lot of these come back. It's not consistent.

@supabase supabase deleted a comment from furiousteabag Jul 23, 2024
@ddematheu
Author

Seeing an HTTP read timeout:

Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
yield
File "/usr/local/lib/python3.10/site-packages/httpx/_transports/default.py", line 233, in handle_request
resp = self._pool.handle_request(req)
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
raise exc from None
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
response = connection.handle_request(
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
return self._connection.handle_request(request)
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 143, in handle_request
raise exc
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 113, in handle_request
) = self._receive_response_headers(**kwargs)
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 186, in _receive_response_headers
event = self._receive_event(timeout=timeout)
File "/usr/local/lib/python3.10/site-packages/httpcore/_sync/http11.py", line 224, in _receive_event
data = self._network_stream.read(
File "/usr/local/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 124, in read
with map_exceptions(exc_map):
File "/usr/local/lib/python3.10/contextlib.py", line 153, in exit
self.gen.throw(typ, value, traceback)
File "/usr/local/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ReadTimeout: The read operation timed out

While handling that exception, it hits the issue:
File "/usr/local/lib/python3.10/site-packages/storage3/_sync/file_api.py", line 67, in create_signed_upload_url
response = self._request("POST", f"/object/upload/sign/{_path}")
File "/usr/local/lib/python3.10/site-packages/storage3/_sync/file_api.py", line 50, in _request
resp = response.json()
UnboundLocalError: local variable 'response' referenced before assignment

@silentworks I'm not sure how to repro the timeout itself, but when the timeout happens, the other issue happens as well.
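
If it helps with a repro: forcing a tiny read timeout on a plain httpx client against the storage endpoint produces the same ReadTimeout as in the stack above, which is then what _request mishandles. This is only a rough sketch; the URL, key, bucket and path are placeholders:

import httpx

STORAGE_URL = "https://<project-ref>.supabase.co/storage/v1"  # placeholder
HEADERS = {"apikey": "<anon-key>", "Authorization": "Bearer <anon-key>"}  # placeholder

# A near-zero read timeout makes the request raise httpx.ReadTimeout,
# the same transport error shown in the traceback above.
client = httpx.Client(
    base_url=STORAGE_URL,
    headers=HEADERS,
    timeout=httpx.Timeout(5.0, read=0.001),
)
try:
    # Same endpoint that create_signed_upload_url hits in the traceback.
    client.post("/object/upload/sign/<bucket>/<path>")
except httpx.ReadTimeout as exc:
    print("ReadTimeout raised, as in the traceback:", exc)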

@silentworks
Contributor

@ddematheu can you give more info, like the size/type of the file you are trying to upload and which methods this happens with?

@xiaoland

I have the same problem on supabase 2.5.0 when using storage upload. Nothing changed on my side, but it suddenly stopped working. I restarted the terminal and Python a few times, but the error persists.

@Jaycmarques

Jaycmarques commented Sep 20, 2024

I have a similar issue, with the same code and the same exception in storage3's _request handler that @Abullity posted above.

You may try to modify the _request method to ensure that the response variable is only accessed if it has been assigned a value. You can do this by initializing response to None and checking if it’s None before trying to access it:

async def _request(
    self,
    method: RequestMethod,
    url: str,
    headers: Optional[dict[str, Any]] = None,
    json: Optional[dict[Any, Any]] = None,
    files: Optional[Any] = None,
    **kwargs: Any,
) -> Response:
    response = None
    try:
        response = await self._client.request(
            method, url, headers=headers or {}, json=json, files=files, **kwargs
        )
        response.raise_for_status()
    except HTTPError:
        if response is not None:
            try:
                resp = response.json()  # Safe to access response now
                raise StorageException({**resp, "statusCode": response.status_code})
            except JSONDecodeError:
                raise StorageException({"statusCode": response.status_code})
        else:
            raise StorageException({"error": "Request failed without a response."})

    return response
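
An alternative sketch of the same fix (not the library's actual code): since httpx's HTTPStatusError always carries the response, while the other HTTPError subclasses are transport failures with no response at all, the two cases can be separated instead of checking for None. This assumes from httpx import HTTPError, HTTPStatusError alongside the existing imports; the body of _request would then become:

    try:
        response = await self._client.request(
            method, url, headers=headers or {}, json=json, files=files, **kwargs
        )
        response.raise_for_status()
    except HTTPStatusError as exc:
        # raise_for_status() failed: the response is attached to the exception.
        try:
            resp = exc.response.json()
            raise StorageException({**resp, "statusCode": exc.response.status_code})
        except JSONDecodeError:
            raise StorageException({"statusCode": exc.response.status_code})
    except HTTPError as exc:
        # Transport-level failure (timeout, connection reset, ...): no response exists.
        raise StorageException({"statusCode": None, "error": str(exc)})

    return response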

@Unkn0wn0x

I ran into the same issue too. I currently work around it with Boto3, the AWS S3 client itself, since Supabase allows direct connections to the S3-compatible storage endpoint. The SDK has many benefits, such as automatic retries and, in my opinion, faster processing (upload time) with less overhead.

Replace abc123def456 with your Supabase project ref. The project ref (Project ID), anon key, storage endpoint, and region can all be found in your project's dashboard settings.

config.py

import os

class Config:

    # Supabase settings
    SUPABASE_URL: str                       = os.getenv('SUPABASE_URL') # https://abc123def456.supabase.co
    SUPABASE_KEY: str                       = os.getenv('SUPABASE_KEY') # your anon key
    SUPABASE_PROJECT_REF: str               = os.getenv('SUPABASE_PROJECT_REF') # abc123def456
    SUPABASE_STORAGE_REGION: str            = os.getenv('SUPABASE_REGION') # "eu-central-1"
    SUPABASE_STORAGE_ENDPOINT_URL: str      = f"{os.getenv('SUPABASE_URL')}/storage/v1/s3"
    SUPABASE_STORAGE_SIGNED_URL_EXPIRY: int = 60 # 60s
    SUPABASE_STORAGE_CACHE_CONTROL: int     = 604800 # 7d

config = Config()

So here in short:

storage.py

import boto3
from .config import config

def get_storage_client(auth_header: str):
    return boto3.client(
        's3',
        aws_access_key_id=str(config.SUPABASE_PROJECT_REF),
        aws_secret_access_key=str(config.SUPABASE_KEY),
        aws_session_token=str(auth_header.replace("Bearer ", "")),
        region_name=str(config.SUPABASE_STORAGE_REGION),
        endpoint_url=str(config.SUPABASE_STORAGE_ENDPOINT_URL),
    )

Now you can do something like:
yourlogic.py

import requests  # used below to fetch the signed URL
from .storage import get_storage_client

# Get Auth header (JWT token); `request` here is your framework's incoming request object
auth_header = request.headers.get('Authorization')

# Get S3 client
s3_client = get_storage_client(auth_header)

# Upload file to storage
# See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
with open('localfile.png', 'rb') as file:
    s3_client.upload_fileobj(
        file,
        bucket_id,
        f"my/file/path/example.png",
        ExtraArgs={
            "ContentType": f"image/png",
            "CacheControl": str(config.SUPABASE_STORAGE_CACHE_CONTROL),
            "Metadata": {
                "upsert": "true"
            }
        }
    )

# Download file from storage
# See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-example-download-file.html
with open('localfile.png', 'wb') as f:
    s3_client.download_fileobj(bucket_id, file_path, f)

# Create signed URLs
# See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-presigned-urls.html
url = s3_client.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': bucket_id,
        'Key': file_path # my/file/path/example.png
    },
    ExpiresIn=int(config.SUPABASE_STORAGE_SIGNED_URL_EXPIRY)  # ExpiresIn is documented as an int (seconds)
)

# Fire the request and store the response
response = requests.get(url)


Also, you're able to upload directly with an S3 access key (service role) if you create one. The method above shows the handling for a user (client) with their access token.
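
For the service-role route, the setup is the same boto3 client, just with the S3 access key pair you create in the Supabase dashboard instead of the project ref / anon key / user JWT. The key values below are placeholders:

import boto3
from .config import config

# Placeholders: create an S3 access key pair for your project in the Supabase dashboard.
S3_ACCESS_KEY_ID = "<your-s3-access-key-id>"
S3_SECRET_ACCESS_KEY = "<your-s3-secret-access-key>"

# Service-role client: no per-user session token required.
s3_admin_client = boto3.client(
    's3',
    aws_access_key_id=S3_ACCESS_KEY_ID,
    aws_secret_access_key=S3_SECRET_ACCESS_KEY,
    region_name=config.SUPABASE_STORAGE_REGION,
    endpoint_url=config.SUPABASE_STORAGE_ENDPOINT_URL,
)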

Hope it helps some of you guys.
