Release v0.4.1 (#93)
* Fixing overwrite integration tests
([#92](#92)). The integration
tests for the `overwrite` feature have been extended to better exercise
write operations. Two new variables, `catalog` and `schema`, are resolved
through the `env_or_skip` function and used in the `save_table` method,
which is now invoked twice against the same table: once with the `append`
mode and once with the `overwrite` mode. After each call, the table
contents are fetched and verified using the updated `Row` class, whose
fields were renamed from `name` and `id` to `first` and `second`. This
confirms that the `overwrite` feature behaves correctly during
integration tests.
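
A minimal sketch of such a test, assuming pytest fixtures named `sql_backend` and `env_or_skip` and a `save_table(full_name, rows, klass, mode=...)` backend API; the fixture names, environment-variable names, and the `Foo` dataclass are illustrative rather than taken from the repository:

```python
from dataclasses import dataclass


@dataclass
class Foo:
    first: str
    second: bool


def test_overwrite(sql_backend, env_or_skip):
    # Resolve the target catalog and schema from the environment,
    # skipping the test when they are not configured.
    catalog = env_or_skip("TEST_CATALOG")
    schema = env_or_skip("TEST_SCHEMA")
    full_name = f"{catalog}.{schema}.foo"

    # The first call appends a row; the second replaces the table contents.
    sql_backend.save_table(full_name, [Foo("xyz", True)], Foo, mode="append")
    sql_backend.save_table(full_name, [Foo("abc", False)], Foo, mode="overwrite")

    # After the overwrite, only the row from the second call should remain.
    rows = list(sql_backend.fetch(f"SELECT first, second FROM {full_name}"))
    assert len(rows) == 1
    assert rows[0].first == "abc"
    assert rows[0].second is False
```

The essential check is that the second, `overwrite` call leaves only its own rows behind rather than appending to the rows written by the first call.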
nfx authored Apr 12, 2024
1 parent 9a3517b commit 5782b23
Showing 2 changed files with 6 additions and 1 deletion.
CHANGELOG.md: 5 additions & 0 deletions
@@ -1,5 +1,10 @@
# Version changelog

## 0.4.1

* Fixing overwrite integration tests ([#92](https://github.com/databrickslabs/lsql/issues/92)). The integration tests for the `overwrite` feature have been extended to better exercise write operations. Two new variables, `catalog` and `schema`, are resolved through the `env_or_skip` function and used in the `save_table` method, which is now invoked twice against the same table: once with the `append` mode and once with the `overwrite` mode. After each call, the table contents are fetched and verified using the updated `Row` class, whose fields were renamed from `name` and `id` to `first` and `second`. This confirms that the `overwrite` feature behaves correctly during integration tests.


## 0.4.0

* Added catalog and schema parameters to execute and fetch ([#90](https://github.com/databrickslabs/lsql/issues/90)). In this release, we have added optional `catalog` and `schema` parameters to the `execute` and `fetch` methods in the `SqlBackend` abstract base class, allowing for more flexibility when executing SQL statements in specific catalogs and schemas. These updates include new method signatures and their respective implementations in the `SparkSqlBackend` and `DatabricksSqlBackend` classes. The new parameters control the catalog and schema used by the `SparkSession` instance in the `SparkSqlBackend` class and the `SqlClient` instance in the `DatabricksSqlBackend` class. This enhancement enables better functionality in multi-catalog and multi-schema environments. Additionally, this change comes with unit tests and integration tests to ensure proper functionality. The new parameters can be used when calling the `execute` and `fetch` methods. For example, with a `SparkSqlBackend` instance `spark_backend`, you can execute a SQL statement in a specific catalog and schema with the following code: `spark_backend.execute("SELECT * FROM my_table", catalog="my_catalog", schema="my_schema")`. Similarly, the `fetch` method can also be used with the new parameters.
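
The snippet below expands that example into a hedged usage sketch; `spark_backend` is assumed to be an already-constructed Spark-based `SqlBackend` instance, and the catalog, schema, and table names are placeholders:

```python
# Run a statement against a specific catalog and schema.
spark_backend.execute(
    "CREATE TABLE IF NOT EXISTS my_table (id INT, name STRING)",
    catalog="my_catalog",
    schema="my_schema",
)

# fetch() is routed through the same catalog and schema and yields result rows.
for row in spark_backend.fetch(
    "SELECT id, name FROM my_table",
    catalog="my_catalog",
    schema="my_schema",
):
    print(row)
```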
src/databricks/labs/lsql/__about__.py: 1 addition & 1 deletion
@@ -1 +1 @@
__version__ = "0.4.0"
__version__ = "0.4.1"
