What happened: I'm facing an issue when using the deltalake lib to save and load data to/from Azure Blob Storage. Sometimes I get the following error:
DatasetError: Failed while saving data to data set CustomDeltaTableDataset(file_example).
Failed to parse parquet: Parquet error: AsyncChunkReader::get_bytes error:
Generic MicrosoftAzure error: Error after 10 retries in 2.196683949s, max_retries:10,
retry_timeout:180s, source:error sending request for url
(https://<address>/file.parquet):
error trying to connect: dns error: failed to lookup address information: Name or service not known
What you expected to happen: I expected to load the data from the Delta table and convert it to a Pandas DataFrame without any errors.
How to reproduce it:
from deltalake import DeltaTable

datalake_info = {
    'account_name': <account>,
    'client_id': <cli_id>,
    'tenant_id': <tenant_id>,
    'client_secret': <secret>,
    'timeout': '100000s'
}

# Load data from the delta table
dt = DeltaTable("abfs://<azure_address>", storage_options=datalake_info)
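For reference, the calls that actually hit storage look roughly like the sketch below (the DataFrame contents and the append mode are placeholders I added here; in our pipeline the reads/writes go through a custom Kedro dataset, as the error message shows):

import pandas as pd
from deltalake import write_deltalake

# Read side: materialise the Delta table opened above into a Pandas DataFrame
df = dt.to_pandas()

# Save side: append rows to the same table (placeholder data, not from the real pipeline)
new_rows = pd.DataFrame({"col": [1, 2, 3]})
write_deltalake("abfs://<azure_address>", new_rows, mode="append", storage_options=datalake_info)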
Environment
Delta-rs version: 0.16.0
More details: I was looking for a parameter like max_retries to pass via storage_options, but couldn't find anything related. Does anyone know a solution or workaround for this issue? I didn't find a matching option in the docs: https://docs.rs/object_store/latest/object_store/azure/enum.AzureConfigKey.html
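As a stopgap I'm considering a caller-side retry around the whole load. This is only a workaround sketch (the function name, attempt count, backoff, and the string match on the DNS message are all my own choices, not anything exposed by delta-rs):

import time
from deltalake import DeltaTable

def load_with_retry(uri, storage_options, attempts=5, base_delay=2.0):
    """Reopen and re-read the table when the transient DNS error bubbles up."""
    last_exc = None
    for attempt in range(attempts):
        try:
            dt = DeltaTable(uri, storage_options=storage_options)
            return dt.to_pandas()
        except Exception as exc:  # broad on purpose: the failure surfaces as a generic error
            if "failed to lookup address information" not in str(exc):
                raise  # not the transient DNS problem, so fail fast
            last_exc = exc
            time.sleep(base_delay * 2 ** attempt)  # simple exponential backoff
    raise last_exc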