
incompatible with pytest 7.2.0 #92

Open
ENM1989 opened this issue Jan 3, 2023 · 2 comments · May be fixed by #96

Comments


ENM1989 commented Jan 3, 2023

When I run `pytest --flake8` I get the following error:

AttributeError: 'Application' object has no attribute 'parse_preliminary_options'

The issue is in the `check_file` function.
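A quick way to see whether the API pytest-flake8 depends on is still present is a runtime check. This is a hedged sketch (the helper name is mine, not part of either package); it only assumes flake8's real `flake8.main.application.Application` class, whose `parse_preliminary_options` method was removed in flake8 6.0:

```python
# Sketch: detect whether the flake8 internals that pytest-flake8's
# check_file relies on still exist in the installed flake8 version.
try:
    from flake8.main.application import Application
except ImportError:
    Application = None  # flake8 is not installed at all

def plugin_api_available() -> bool:
    """True if Application still has parse_preliminary_options
    (present in flake8 < 6.0, removed in 6.0, which triggers the
    AttributeError reported in this issue)."""
    if Application is None:
        return False
    return hasattr(Application, "parse_preliminary_options")
```

On an environment with flake8 >= 6.0 this returns False, matching the traceback above.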

@tkutcher

Think the problem is actually with flake8 6.0.0. Looks like in the latest commit they removed that function from the Application class that pytest-flake8 uses:

See diff at

PyCQA/flake8@0d667a7#diff-eb63c667faddfa67b8a2a6e6c3e8a85e34ffca80baddde0f8202e2ece75a52a4L203

It also does not seem to play nicely with flake8 5.0, as pytest-flake8 uses a "ConfigFileFinder" class that was removed there.

For me, things work (with many DeprecationWarnings) with pytest 7.2.1, flake8 4.0.1, and pytest-flake8 1.1.1; broken for flake8 >= 5.0.
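The version boundary reported above can be captured in a tiny guard. A hypothetical helper (the function name is mine), encoding only what this thread reports — pytest-flake8 1.1.1 works with flake8 4.0.1 and is broken for flake8 >= 5.0:

```python
# Hypothetical compatibility guard based on the reports in this thread:
# pytest-flake8 1.1.1 works with flake8 4.x, breaks with flake8 >= 5.0.
def flake8_supported(version: str) -> bool:
    """Return True if pytest-flake8 is expected to work with this flake8."""
    major = int(version.split(".")[0])
    return major < 5
```

For example, `flake8_supported("4.0.1")` is True while `flake8_supported("6.0.0")` is False.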

living180 added a commit to living180/git-cola that referenced this issue Feb 14, 2023
pytest-flake8 has been broken for over a year with flake8 >= 5.0
[1][2][3], and has not had any commits since March 2022.  Drop in
favor of simply invoking flake8 separately outside of pytest.  Ensure
that "make check" now invokes flake8.

[1] tholo/pytest-flake8#83
[2] tholo/pytest-flake8#87
[3] tholo/pytest-flake8#92

Signed-off-by: Daniel Harding <[email protected]>
living180 added a commit to living180/git-cola that referenced this issue Feb 14, 2023
pytest-flake8 has been broken for over a year with pytest >= 7.0 [1] and
more than six months with flake8 >= 5.0 [2][3], and has not had any
commits since March 2022.  Drop in favor of simply invoking flake8
separately outside of pytest.  Ensure that "make check" now invokes
flake8.

[1] tholo/pytest-flake8#83
[2] tholo/pytest-flake8#87
[3] tholo/pytest-flake8#92

Signed-off-by: Daniel Harding <[email protected]>
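Projects following the same approach (dropping the plugin and linting separately) can also stop pytest from loading pytest-flake8 without uninstalling it. A sketch; the plugin's registered entry-point name `flake8` is an assumption:

```ini
# pytest.ini (sketch): don't load the pytest-flake8 plugin; run the
# linter as its own step instead, e.g. `flake8 .` after `pytest`.
[pytest]
addopts = -p no:flake8
```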
xen0l added a commit to xen0l/aws-gate that referenced this issue Mar 12, 2023
Et7f3 added a commit to Et7f3/pytest-flake8 that referenced this issue Apr 18, 2023
Et7f3 linked a pull request Apr 18, 2023 that will close this issue
justinGilmer added a commit to PingThingsIO/btrdb-python that referenced this issue May 18, 2023
pytest-flake8 seems to have issues with the later versions of flake8

tholo/pytest-flake8#92
justinGilmer added a commit to PingThingsIO/btrdb-python that referenced this issue Jun 6, 2023
* Release v5.15.0

* update protobuf to v4.22.3

* Add threaded streamset calls

Using concurrent.futures.ThreadPoolExecutor

* Blacken code

* Update for failing tests

* Ignore flake8 as part of testing

pytest-flake8 seems to have issues with the later versions of flake8

tholo/pytest-flake8#92

* Update .gitignore

* Update ignore and remove extra print.

* Remove idea folder (pycharm)

---------

Co-authored-by: David Konigsberg <[email protected]>
Co-authored-by: Jeff Lin <[email protected]>
justinGilmer added a commit to PingThingsIO/btrdb-python that referenced this issue Jun 6, 2023
youngale-pingthings added a commit to PingThingsIO/btrdb-python that referenced this issue Jul 20, 2023
justinGilmer added a commit to PingThingsIO/btrdb-python that referenced this issue Sep 25, 2023
gentooboontoo added a commit to Kozea/pygal that referenced this issue Nov 19, 2023

Tiger-zzZ commented Dec 13, 2023

> Think the problem is actually with flake8 6.0.0. Looks like in the latest commit they actually removed that function from the Application class that pytest-flake8 uses:
>
> See diff at
>
> PyCQA/flake8@0d667a7#diff-eb63c667faddfa67b8a2a6e6c3e8a85e34ffca80baddde0f8202e2ece75a52a4L203
>
> Also does not seem to play with flake8 5.0 as it uses a "ConfigFileFinder" class that was removed
>
> For me things work (with many DeprecationWarnings) with pytest 7.2.1, flake8 4.0.1, and pytest-flake8 1.1.1, broken for flake8 >= 5.0

Thanks, it worked for me. I ran the checks using flake8==6.1.0 and encountered failures, but they passed with version 4.0.1. This is my current environment:

Python 3.10.7
pytest 7.2.0
flake8 4.0.1
