
Commit

update supported python versions
barrust committed Dec 29, 2023
1 parent b81f177 commit 685cde7
Showing 3 changed files with 102 additions and 74 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/python-package.yml
@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
python-version: ['3.8', '3.9', '3.10', '3.11', '3.12']

steps:
- uses: actions/checkout@v3
151 changes: 91 additions & 60 deletions CHANGELOG.md
@@ -1,161 +1,192 @@
# PyProbables Changelog

### Version 0.5.9

* Add `py.typed` files so that mypy will find type annotations
* Drop support for python `3.6` and `3.7`

### Version 0.5.8

* Make the `mmap` utility class windows compatible; see [PR #160](https://github.com/barrust/pyprobables/pull/106); Thanks [@leonhma](https://github.com/leonhma)

### Version 0.5.7

* Update Build System and update project metadata
* Better support for `resolve_path` in passed filenames
* Remove Python 3.5 support
* Pylint inspired updates

### Version 0.5.6

* Bloom Filters:
  * Fix for `ValueError` exception when using `estimate_elements()` when all bits are set (see the sketch below)
* Add Citation file
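
A minimal sketch of how `estimate_elements()` is typically exercised, assuming the standard `BloomFilter` constructor arguments (the sizing values are illustrative, not from this commit):

```python
from probables import BloomFilter

# Hypothetical sizing; real parameters depend on the workload.
blm = BloomFilter(est_elements=1000, false_positive_rate=0.05)
for i in range(500):
    blm.add(f"element-{i}")

# estimate_elements() infers how many distinct items were added from the
# number of set bits; the 0.5.6 fix covers the edge case of all bits set.
print(blm.estimate_elements())
```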

### Version 0.5.5

* Bloom Filters:
  * Re-implemented the entire Bloom Filter data structure to reduce complexity and code duplication
* Removed unused imports
* Removed unnecessary casts
* Pylint Requested Style Changes:
  * Use python 3 `super()`
  * Use python 3 classes
  * Remove use of temporary variables if possible and still clear

### Version 0.5.4

* All Probabilistic Data Structures:
  * Added ability to load each `frombytes()` (see the sketch below)
  * Updated underlying data structures of number based lists to be more space and time efficient; see [Issue #60](https://github.com/barrust/pyprobables/issues/60)
* Cuckoo Filters:
  * Added `fingerprint_size_bits` property
  * Added `error_rate` property
  * Added ability to initialize based on error rate
* Simplified typing
* Ensure all `filepaths` can be `str` or `Path`
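
A sketch of the `frombytes()` round trip, assuming `bytes(filter)` yields the serialized buffer and `frombytes()` is exposed as a classmethod (illustrative only):

```python
from probables import BloomFilter

blm = BloomFilter(est_elements=100, false_positive_rate=0.01)
blm.add("pyprobables")

# Assumption: bytes(...) yields the serialized filter and frombytes() is
# the matching classmethod for rebuilding it.
raw = bytes(blm)
restored = BloomFilter.frombytes(raw)
print(restored.check("pyprobables"))  # expected: True
```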

### Version 0.5.3

* Additional type hinting
* Improved format parsing and serialization; [see PR#81](https://github.com/barrust/pyprobables/pull/81). Thanks [@KOLANICH](https://github.com/KOLANICH)
* Bloom Filters
  * Added `export_to_hex` functionality for Bloom Filters on Disk (see the sketch below)
  * Export as C header (**\*.h**) for Bloom Filters on Disk and Counting Bloom Filters
* Added support for more input types for exporting and loading of saved files
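
The hex path can be sketched with the in-memory filter, assuming `export_hex()` and the `hex_string` constructor argument; per the entry above, the on-disk variant gains the same capability:

```python
from probables import BloomFilter

blm = BloomFilter(est_elements=100, false_positive_rate=0.01)
blm.add("probables")

hex_repr = blm.export_hex()               # hex string holding the full filter
clone = BloomFilter(hex_string=hex_repr)  # assumption: hex_string kwarg reloads it
print(clone.check("probables"))           # expected: True
```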


### Version 0.5.2

* Add ability to hash bytes along with strings
* Make all tests files individually executable from the CLI. Thanks [@KOLANICH](https://github.com/KOLANICH)
* Added type hints

### Version 0.5.1

* Bloom Filter:
  * Export as a C header (**\*.h**)
* Count-Min Sketch
  * Add join/merge functionality (see the sketch below)
* Moved testing to use `NamedTemporaryFile` for file based tests
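
A sketch of the Count-Min Sketch merge, assuming the operation is exposed as `join()` and that both sketches share identical `width` and `depth` (values are illustrative):

```python
from probables import CountMinSketch

left = CountMinSketch(width=1000, depth=5)
right = CountMinSketch(width=1000, depth=5)

left.add("shared-key")
right.add("shared-key")
right.add("right-only")

# Assumption: join() folds the counts of `right` into `left` in place and
# requires both sketches to have the same width and depth.
left.join(right)
print(left.check("shared-key"))  # expected: about 2
print(left.check("right-only"))  # expected: about 1
```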

### Version 0.5.0

* ***BACKWARD INCOMPATIBLE CHANGES***
  * **NOTE:** Breaks backwards compatibility with previously exported blooms, counting-blooms, cuckoo filter, or count-min-sketch files using the default hash!
  * Update to the FNV_1a hash function
  * Simplified the default hash to use a seed value
* Ensure passing of depth to hashing function when using `hash_with_depth_int` or `hash_with_depth_bytes`

## Version 0.4.1

* Resolve [issue 57](https://github.com/barrust/pyprobables/issues/57) where false positive rate not stored / used the same in some instances

## Version 0.4.0

* Remove **Python 2.7** support

### Version 0.3.2

* Fix `RotatingBloomFilter` to keep information on number of elements inserted when exported and loaded. [see PR #50](https://github.com/barrust/pyprobables/pull/50) Thanks [@dvolker48](https://github.com/volker48)

### Version 0.3.1

* Add additional `__slots__`
* Very minor improvement to the hashing algorithm

### Version 0.3.0

* Bloom Filters:
  * Import/Export of Expanding and Rotating Bloom Filters
  * Fix for importing standard Bloom Filters

### Version 0.2.6

* Bloom Filters:
  * Addition of a Rotating Bloom Filter

### Version 0.2.5

* Bloom Filters:
  * Addition of an Expanding Bloom Filter
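
A sketch of the Expanding Bloom Filter, assuming it grows automatically once the estimated capacity is exceeded and exposes an `expansions` counter:

```python
from probables import ExpandingBloomFilter

# Deliberately tiny so an expansion is triggered quickly (illustrative).
ebf = ExpandingBloomFilter(est_elements=10, false_positive_rate=0.05)
for i in range(100):
    ebf.add(f"key-{i}")

print(ebf.check("key-3"))  # expected: True
print(ebf.expansions)      # assumed property: number of times it grew
```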

### Version 0.2.0

* Use `__slots__`

### Version 0.1.4

* Drop support for python 3.3
* Ensure passing parameters correctly to parent classes

### Version 0.1.3

* Better parameter validation
* Cuckoo Filters:
  * Support passing different hash function
  * Support for different fingerprint size
* Utility to help generate valid hashing strategies using decorators (see the sketch below):
  * `hash_with_depth_bytes`
  * `hash_with_depth_int`
* Updated documentation
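
A sketch of a custom hashing strategy built with one of these decorators; it assumes the decorators live in `probables.hashes` and wrap a `(key, depth)` function that returns a single integer, producing the list-of-hashes interface the filters expect:

```python
import hashlib

from probables import BloomFilter
from probables.hashes import hash_with_depth_int


@hash_with_depth_int
def sha256_hash(key, depth=1):
    # Return a single integer hash for (key, depth); the decorator is
    # assumed to expand this into the list of hashes the filters expect.
    return int(hashlib.sha256(key.encode("utf-8")).hexdigest(), 16)


blm = BloomFilter(est_elements=100, false_positive_rate=0.01,
                  hash_function=sha256_hash)
blm.add("custom-hash")
print(blm.check("custom-hash"))  # expected: True
```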

### Version 0.1.2

* Counting Cuckoo Filter
  * Basic functionality: add, remove, check
  * Expand
  * Import / Export
* Fix and tests for utility functions
* Fix package build

### Version 0.1.1

* CuckooFilter
  * Import / Export functionality
  * Enforce single insertion per key
  * Auto expand when insertion failure OR when called to do so (settable)
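
A sketch of the behaviors above; the constructor argument names (`capacity`, `bucket_size`, `auto_expand`) reflect the later API and are assumptions here:

```python
from probables import CuckooFilter

# Hypothetical sizing; auto_expand lets the filter grow on insertion
# failure instead of raising an error (argument names are assumptions).
cko = CuckooFilter(capacity=100, bucket_size=4, auto_expand=True)

cko.add("cuckoo-key")
print(cko.check("cuckoo-key"))  # expected: True

cko.remove("cuckoo-key")
print(cko.check("cuckoo-key"))  # expected: False
```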

### Version 0.1.0

* Cuckoo Filter
  * Added basic Cuckoo Filter code

### Version 0.0.8

* Counting Bloom Filter
  * Estimate unique elements added
  * Union
  * Intersection
  * Jaccard Index
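
A sketch of these counting-bloom set operations, assuming the methods are named `union()`, `intersection()`, and `jaccard_index()` and that both filters are identically configured (sizing is illustrative):

```python
from probables import CountingBloomFilter

# Two identically configured counting blooms (illustrative sizing).
cbf_a = CountingBloomFilter(est_elements=100, false_positive_rate=0.05)
cbf_b = CountingBloomFilter(est_elements=100, false_positive_rate=0.05)

for key in ("alpha", "beta"):
    cbf_a.add(key)
for key in ("beta", "gamma"):
    cbf_b.add(key)

print(cbf_a.check("alpha"))        # approximate count, expected: 1
print(cbf_a.jaccard_index(cbf_b))  # similarity score between the filters
both = cbf_a.intersection(cbf_b)   # assumed to return a new filter
print(both.check("beta"))          # expected: non-zero
```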

### Version 0.0.7

* Counting Bloom Filter
  * Fix counting bloom hex export / import
  * Fix for overflow issue in counting bloom export
  * Added ability to remove from counting bloom
* Count-Min Sketch
  * Fix for not recording large numbers of inserts and deletions correctly

### Version 0.0.6

* Probabilistic data structures added:
  * Counting Bloom Filter
* Minor code clean-up
* Re-factored Bloom Filters

### Version 0.0.5

* Better on-line documentation
* Changed access to some public functions

### Version 0.0.4

* Probabilistic data structures:
  * Bloom Filter
  * Bloom Filter (on disk)
  * Count-Min Sketch
  * Count-Mean Sketch
  * Count-Mean-Min Sketch
  * Heavy Hitters
  * Stream Threshold
* Import and export of each
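
A sketch touching a few of the structures listed above; the argument names (`est_elements`, `width`, `depth`, `num_hitters`) reflect the later API and are assumptions relative to this early release:

```python
from probables import BloomFilter, CountMinSketch, HeavyHitters

blm = BloomFilter(est_elements=1000, false_positive_rate=0.01)
blm.add("bloom-key")
print(blm.check("bloom-key"))  # expected: True

cms = CountMinSketch(width=1000, depth=5)
cms.add("cms-key")
print(cms.check("cms-key"))    # approximate count, expected: 1

# HeavyHitters tracks the most frequently seen keys.
hh = HeavyHitters(num_hitters=5, width=1000, depth=5)
for _ in range(10):
    hh.add("hot-key")
print(hh.heavy_hitters)        # expected to include "hot-key"
```
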
23 changes: 10 additions & 13 deletions pyproject.toml
@@ -1,8 +1,8 @@
[project]
name = "pyprobables"
dynamic = ["version"]
authors = [{name = "Tyler Barrus", email = "[email protected]"}]
license = {text = "MIT"}
authors = [{ name = "Tyler Barrus", email = "[email protected]" }]
license = { text = "MIT" }
description = "Probabilistic data structures in python"
keywords = [
"python",
@@ -28,34 +28,31 @@ classifiers = [
"License :: OSI Approved :: MIT License",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
]
requires-python = ">=3.6"

[tool.setuptools.dynamic]
version = {attr = "probables.__version__"}
version = { attr = "probables.__version__" }

[project.urls]
Homepage = "https://github.com/barrust/pyprobables"
Bug-tracker = "https://github.com/barrust/pyprobables/issues"
Documentation = "https://pyprobables.readthedocs.io/"

[tool.poetry]
packages = [
{include = "probables"},
]
packages = [{ include = "probables" }]

[tool.poetry.dev-dependencies]
pre-commit = {version = ">=2.18.1", python = "^3.6.1"}
black = {version = "^20.8b1", python = "^3.6"}
isort = {version = "^5.6.4", python = "^3.6"}
pytest = {version = "^6.1.1", python = "^3.6"}
flake8 = {version = "^3.6.0", python = "^3.6"}
pre-commit = { version = ">=2.18.1", python = "^3.6.1" }
black = { version = "^20.8b1", python = "^3.6" }
isort = { version = "^5.6.4", python = "^3.6" }
pytest = { version = "^6.1.1", python = "^3.6" }
flake8 = { version = "^3.6.0", python = "^3.6" }

[tool.setuptools.packages.find]
include = ["probables", "probables.*"]
