├── py_cachify
├── py.typed
├── _backend
│ ├── __init__.py
│ ├── _types
│ │ ├── __init__.py
│ │ ├── _reset_wrap.py
│ │ ├── _lock_wrap.py
│ │ └── _common.py
│ ├── _logger.py
│ ├── _constants.py
│ ├── _exceptions.py
│ ├── _clients.py
│ ├── _helpers.py
│ ├── _cached.py
│ └── _lib.py
└── __init__.py
├── docs
├── scripts
│ ├── __init__.py
│ └── hooks.py
├── img
│ ├── logo.png
│ ├── project-header.png
│ ├── type-annotations.png
│ ├── type-annotations-2.png
│ ├── type-annotations-3.png
│ └── logo-icon.svg
├── tutorial
│ ├── index.md
│ ├── initial-setup
│ │ ├── install.md
│ │ └── initialization.md
│ ├── locks
│ │ ├── locks-intro.md
│ │ ├── lock-parameters.md
│ │ ├── simple-locks.md
│ │ ├── lock-as-decorator.md
│ │ └── lock-methods.md
│ ├── cached-decorator
│ │ ├── dynamic-cache-keys.md
│ │ ├── first-steps.md
│ │ ├── specifying-ttl-and-encoder-decoder.md
│ │ └── reset-attribute.md
│ └── once-decorator
│ │ └── index.md
├── help
│ ├── contribution.md
│ └── help.md
├── llm-full.md
├── examples.md
├── reference
│ ├── once.md
│ ├── lock.md
│ └── cached.md
├── release-notes.md
└── index.md
├── .readthedocs.yml
├── tests
├── conftest.py
├── test_merge.py
├── test_once_decorator.py
├── test_locks.py
├── test_helpers.py
├── test_lock_decorator.py
└── test_cached.py
├── .github
├── dependabot.yml
├── pull_request_template.md
└── workflows
│ ├── build-and-publish.yml
│ └── checks.yml
├── sonar-project.properties
├── integration_tests
├── conftest.py
├── test_unpicklable_cache.py
├── test_decorators.py
└── test_clients_isolation_and_ttl.py
├── LICENSE
├── mkdocs.yml
├── pyproject.toml
├── .gitignore
└── README.md
/py_cachify/py.typed:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/docs/scripts/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/py_cachify/_backend/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_types/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/docs/img/logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EzyGang/py-cachify/HEAD/docs/img/logo.png
--------------------------------------------------------------------------------
/docs/img/project-header.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EzyGang/py-cachify/HEAD/docs/img/project-header.png
--------------------------------------------------------------------------------
/py_cachify/_backend/_logger.py:
--------------------------------------------------------------------------------
1 | import logging
2 |
3 |
4 | logger = logging.getLogger('py-cachify')
5 |
--------------------------------------------------------------------------------
/docs/img/type-annotations.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EzyGang/py-cachify/HEAD/docs/img/type-annotations.png
--------------------------------------------------------------------------------
/docs/img/type-annotations-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EzyGang/py-cachify/HEAD/docs/img/type-annotations-2.png
--------------------------------------------------------------------------------
/docs/img/type-annotations-3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EzyGang/py-cachify/HEAD/docs/img/type-annotations-3.png
--------------------------------------------------------------------------------
/py_cachify/_backend/_constants.py:
--------------------------------------------------------------------------------
1 | from typing import Literal
2 |
3 |
4 | OperationPostfix = Literal['once', 'cached', 'lock']
5 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_exceptions.py:
--------------------------------------------------------------------------------
1 | class CachifyInitError(Exception):
2 | pass
3 |
4 |
5 | class CachifyLockError(Exception):
6 | pass
7 |
--------------------------------------------------------------------------------
/.readthedocs.yml:
--------------------------------------------------------------------------------
1 | version: 2
2 | formats: all
3 | mkdocs:
4 | fail_on_warning: false
5 | configuration: mkdocs.yml
6 |
7 | python:
8 | install:
9 | - requirements: docs/requirements.txt
10 |
11 | build:
12 | os: ubuntu-22.04
13 | tools:
14 | python: "3.12"
15 |
--------------------------------------------------------------------------------
/tests/conftest.py:
--------------------------------------------------------------------------------
1 | import pytest
2 |
3 | import py_cachify._backend._lib
4 | from py_cachify import init_cachify
5 |
6 |
7 | @pytest.fixture(scope='function')
8 | def init_cachify_fixture():
9 | init_cachify()
10 | yield
11 | py_cachify._backend._lib._cachify._sync_client._cache = {}
12 | py_cachify._backend._lib._cachify = None
13 |
--------------------------------------------------------------------------------
/docs/tutorial/index.md:
--------------------------------------------------------------------------------
1 | # Learn
2 |
3 | This is the Py-Cachify tutorial and user guide.
4 |
5 | It covers everything the package provides, with basic examples, explanations, and the most common ways to use py-cachify.
6 |
7 | If you are upgrading from 2.x, we recommend reading the [3.0.0 release notes](../release-notes.md#300) first to understand the new features and behavior changes before diving into the tutorial.
8 |
9 |
--------------------------------------------------------------------------------
/.github/dependabot.yml:
--------------------------------------------------------------------------------
1 | # To get started with Dependabot version updates, you'll need to specify which
2 | # package ecosystems to update and where the package manifests are located.
3 | # Please see the documentation for all configuration options:
4 | # https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
5 |
6 | version: 2
7 |
8 | updates:
9 | - package-ecosystem: "uv"
10 | directory: "/"
11 | schedule:
12 | interval: "weekly"
13 |
--------------------------------------------------------------------------------
/.github/pull_request_template.md:
--------------------------------------------------------------------------------
1 | # Description
2 | _Please include a summary of the changes you made in this PR_
3 |
4 |
5 | # Checklist:
6 | _If any of these are not checked, please leave a succinct reason below_
7 |
8 | **I have ...**
9 | - [ ] tested the code
10 | - [ ] added & updated automated tests
11 | - [ ] performed a self-review of my own code
12 | - [ ] checked the new code does not generate new warnings
13 |
14 | # Additional Notes for Reviewers
15 | _Callout for anything noteworthy_
16 |
--------------------------------------------------------------------------------
/docs/tutorial/initial-setup/install.md:
--------------------------------------------------------------------------------
1 | # Installation
2 |
3 | Before starting, create a project directory, and then create a **virtual environment** in it to install the packages.
4 |
5 | Install via pip:
6 |
7 | ```bash
8 | $ pip install py-cachify
9 |
10 | ---> 100%
11 | Successfully installed py-cachify
12 | ```
13 |
14 | Or if you use poetry:
15 |
16 | ```bash
17 | $ poetry add py-cachify
18 |
19 | ---> 100%
20 | Using version * for py-cachify
21 | Successfully installed py-cachify
22 | ```
23 |
--------------------------------------------------------------------------------
/sonar-project.properties:
--------------------------------------------------------------------------------
1 | sonar.projectKey=EzyGang_py-cachify
2 | sonar.organization=ezygang
3 |
4 | # This is the name and version displayed in the SonarCloud UI.
5 | sonar.projectName=py-cachify
6 | sonar.projectVersion=3.0.1
7 |
8 |
9 | # Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows.
10 | sonar.sources=./py_cachify
11 | sonar.python.version=3
12 | sonar.python.coverage.reportPaths=./coverage.xml
13 |
14 | # Encoding of the source code. Default is default system encoding
15 | #sonar.sourceEncoding=UTF-8
16 |
--------------------------------------------------------------------------------
/integration_tests/conftest.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | import redis
3 |
4 | from py_cachify import Cachify, init_cachify
5 |
6 |
7 | @pytest.fixture(autouse=True)
8 | def init_cachify_fixture() -> None:
9 | init_cachify(
10 | sync_client=redis.from_url(url='redis://localhost:6379/0'),
11 | async_client=redis.asyncio.from_url(url='redis://localhost:6379/1'),
12 | )
13 |
14 |
15 | @pytest.fixture
16 | def cachify_local_in_memory_client() -> Cachify:
17 | return init_cachify(
18 | is_global=False,
19 | )
20 |
21 |
22 | @pytest.fixture
23 | def cachify_local_redis_second() -> Cachify:
24 | return init_cachify(
25 | is_global=False,
26 | sync_client=redis.from_url(url='redis://localhost:6379/2'),
27 | async_client=redis.asyncio.from_url(url='redis://localhost:6379/3'),
28 | )
29 |
--------------------------------------------------------------------------------
/docs/scripts/hooks.py:
--------------------------------------------------------------------------------
1 | import re
2 |
3 | from bs4 import BeautifulSoup
4 | from mkdocs import plugins
5 | from mkdocs.config.defaults import MkDocsConfig
6 | from paginate import Page
7 |
8 |
9 | regex = re.compile(r'(///([a-zA-Z_@\(\)]+)///)')
10 |
11 |
12 | @plugins.event_priority(-100)
13 | def on_post_page(output_content: str, page: Page, config: MkDocsConfig) -> str:
14 | soup = BeautifulSoup(output_content, 'html.parser')
15 | aria_tags = soup.find_all(lambda tag: 'aria-label' in tag.attrs)
16 | for at in aria_tags:
17 | at.attrs['aria-label'] = re.sub(
18 | regex,
19 | r'\2',
20 | at.attrs['aria-label'],
21 | )
22 |
23 | output_content = str(soup)
24 | return re.sub(
25 | regex,
26 | r'\2',
27 | output_content,
28 | )
29 |
--------------------------------------------------------------------------------
/docs/tutorial/locks/locks-intro.md:
--------------------------------------------------------------------------------
1 | # Introduction to Locks (Mutex)
2 |
3 | In simple terms, a lock, also known as a mutex,
4 | is like an electronic door lock that lets only one person into a room at a time;
5 | in code, it ensures that a given piece of code
6 | runs only once at any given moment. This prevents data inconsistencies and race conditions.
7 | `py-cachify` provides tools for creating and managing these locks, so you can keep your logic safe and organized.
8 |
9 | ## py-cachify's locks
10 |
11 | This tutorial will show you how to use the locks provided by `py-cachify`, what parameters they accept,
12 | and showcase some common use cases.
13 |
14 | Note: py-cachify's main focus is to provide a convenient way to use distributed locks; it is in no way a replacement for Python's built-in ones.
15 | Distributed locks are used heavily in web development, particularly once scaling comes into play
16 | and synchronization problems and race conditions start to surface.
17 |
18 |
19 | ## What's next
20 |
21 | We will dive deeper and look at some examples.
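
To build intuition first, here is a toy, dictionary-backed sketch of the "set if not exists" acquire pattern that distributed locks are commonly built on. This is a simplified illustration only, not py-cachify's actual implementation, which targets shared backends such as Redis and makes the check-and-set atomic:

```python
# Toy sketch of the "set if not exists" acquire pattern behind distributed
# locks. Illustrative only -- not py-cachify's actual implementation.
store: dict = {}  # stand-in for a shared backend such as Redis

def acquire(key: str) -> bool:
    # Only the first caller for a given key wins; concurrent callers are
    # rejected until the holder releases. Real implementations make this
    # check-and-set atomic (e.g. Redis SET with the NX flag).
    if key in store:
        return False
    store[key] = True
    return True

def release(key: str) -> None:
    store.pop(key, None)

print(acquire('sync-job'))  # True - the first caller takes the lock
print(acquire('sync-job'))  # False - a concurrent caller is turned away
release('sync-job')
print(acquire('sync-job'))  # True - available again after release
```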
--------------------------------------------------------------------------------
/docs/help/contribution.md:
--------------------------------------------------------------------------------
1 | # Contribution Guidelines
2 |
3 | ## Contribution
4 |
5 | We welcome contributions from the community! Below is a quick guide on how to contribute:
6 |
7 | 1. **Fork the Repository**
8 | - Go to the py-cachify GitHub page and click on "Fork."
9 |
10 | 2. **Clone Your Fork**
11 | - Clone your forked repository to your local machine.
12 |
13 | 3. **Create a New Branch**
14 | - Create a new branch for your feature or fix.
15 |
16 | 4. **Make Changes and Commit**
17 | - Implement your changes and commit them with a clear message.
18 |
19 | 5. **Push to Your Fork**
20 | - Push your changes to your fork on GitHub.
21 |
22 | 6. **Open a Pull Request**
23 | - Navigate to the original repository and open a pull request. Describe your changes and why they are beneficial.
24 |
25 | We appreciate your contributions and look forward to collaborating with you!
26 |
27 | ## Thank You!
28 |
29 | Your support and contributions make a difference. Let’s build an amazing package together!
30 |
--------------------------------------------------------------------------------
/py_cachify/__init__.py:
--------------------------------------------------------------------------------
1 | from ._backend._cached import cached as cached
2 | from ._backend._exceptions import CachifyInitError as CachifyInitError
3 | from ._backend._exceptions import CachifyLockError as CachifyLockError
4 | from ._backend._lib import Cachify as Cachify
5 | from ._backend._lib import init_cachify as init_cachify
6 | from ._backend._lock import lock as lock
7 | from ._backend._lock import once as once
8 | from ._backend._types._common import AsyncClient as AsyncClient
9 | from ._backend._types._common import Decoder as Decoder
10 | from ._backend._types._common import Encoder as Encoder
11 | from ._backend._types._common import SyncClient as SyncClient
12 | from ._backend._types._lock_wrap import AsyncLockWrappedF as AsyncLockWrappedF
13 | from ._backend._types._lock_wrap import SyncLockWrappedF as SyncLockWrappedF
14 | from ._backend._types._lock_wrap import WrappedFunctionLock as WrappedFunctionLock
15 |
16 |
17 | try:
18 | from importlib.metadata import version
19 |
20 | __version__ = version('py-cachify')
21 | except ModuleNotFoundError:
22 | __version__ = f'No version available for {__name__}'
23 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2024-2025 EzyGang
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/.github/workflows/build-and-publish.yml:
--------------------------------------------------------------------------------
1 | name: Build and Publish
2 |
3 | on: workflow_dispatch
4 | permissions:
5 | id-token: write
6 | jobs:
7 | build-and-publish:
8 | name: Build & Upload Package
9 | runs-on: ubuntu-latest
10 | environment:
11 | name: release
12 | steps:
13 | - uses: actions/checkout@v5
14 | - name: Install uv
15 | uses: astral-sh/setup-uv@v7
16 | with:
17 | version: "latest"
18 | activate-environment: true
19 | enable-cache: true
20 | - name: "Set up Python"
21 | uses: actions/setup-python@v6
22 | with:
23 | python-version-file: "pyproject.toml"
24 | - name: Install Dependencies
25 | run: |
26 | uv sync --all-extras --all-groups
27 | - name: build
28 | run: uv build
29 | - name: Publish to PyPi
30 | uses: pypa/gh-action-pypi-publish@v1.13.0
31 | with:
32 | verbose: true
33 | print-hash: true
34 | - name: Sign published artifacts
35 | uses: sigstore/gh-action-sigstore-python@v3.0.0
36 | with:
37 | inputs: ./dist/*.tar.gz ./dist/*.whl
38 | release-signing-artifacts: true
39 |
--------------------------------------------------------------------------------
/docs/help/help.md:
--------------------------------------------------------------------------------
1 | # Help Py-Cachify Package Grow and Evolve
2 |
3 | Thank you for your interest in py-cachify!
4 | Your support is crucial for the growth and improvement of this project.
5 |
6 | Here are a few ways you can help:
7 | ## Ways to Support
8 |
9 | 1. **Try It Out**
10 | - Download py-cachify from PyPI and test it out in your projects. Your feedback is invaluable!
11 |
12 | 2. **Star on GitHub**
13 | - If you find py-cachify helpful, please consider starring the repository on GitHub.
14 | This not only shows your appreciation but also helps others discover the package.
15 |
16 | 3. **Share It**
17 | - Spread the word! Share your experiences and the benefits of using py-cachify with your community on social media, forums, or blogs.
18 |
19 | 4. **Report Issues**
20 | - If you encounter any issues or have questions, please check our Issues page on GitHub
21 | where you can report bugs, ask questions, or suggest features.
22 |
23 | ## Contribution guidelines
24 |
25 | Do you have a wonderful idea or want to help fix an issue?
26 |
27 | Go to the [contribution guide](./contribution.md).
--------------------------------------------------------------------------------
/py_cachify/_backend/_types/_reset_wrap.py:
--------------------------------------------------------------------------------
1 | from collections.abc import Awaitable
2 | from typing import Callable, TypeVar, Union
3 |
4 | from typing_extensions import ParamSpec, Protocol, overload
5 |
6 |
7 | _R = TypeVar('_R')
8 | _P = ParamSpec('_P')
9 |
10 |
11 | class AsyncResetWrappedF(Protocol[_P, _R]):
12 | __wrapped__: Callable[_P, Awaitable[_R]] # pragma: no cover
13 |
14 | async def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _R: ... # pragma: no cover
15 |
16 | async def reset(self, *args: _P.args, **kwargs: _P.kwargs) -> None: ... # pragma: no cover
17 |
18 |
19 | class SyncResetWrappedF(Protocol[_P, _R]):
20 | __wrapped__: Callable[_P, _R] # pragma: no cover
21 |
22 | def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _R: ... # pragma: no cover
23 |
24 | def reset(self, *args: _P.args, **kwargs: _P.kwargs) -> None: ... # pragma: no cover
25 |
26 |
27 | class WrappedFunctionReset(Protocol):
28 | @overload
29 | def __call__(self, _func: Callable[_P, Awaitable[_R]], /) -> AsyncResetWrappedF[_P, _R]: ... # type: ignore[overload-overlap]
30 |
31 | @overload
32 | def __call__(self, _func: Callable[_P, _R], /) -> SyncResetWrappedF[_P, _R]: ...
33 |
34 | def __call__( # pragma: no cover
35 | self,
36 | _func: Union[
37 | Callable[_P, Awaitable[_R]],
38 | Callable[_P, _R],
39 | ],
40 | ) -> Union[
41 | AsyncResetWrappedF[_P, _R],
42 | SyncResetWrappedF[_P, _R],
43 | ]: ...
44 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_types/_lock_wrap.py:
--------------------------------------------------------------------------------
1 | from collections.abc import Awaitable
2 | from typing import Callable, Union
3 |
4 | from typing_extensions import ParamSpec, Protocol, TypeVar, overload
5 |
6 |
7 | _R = TypeVar('_R')
8 | _P = ParamSpec('_P')
9 |
10 |
11 | class AsyncLockWrappedF(Protocol[_P, _R]):
12 | __wrapped__: Callable[_P, Awaitable[_R]] # pragma: no cover
13 |
14 | async def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _R: ... # pragma: no cover
15 |
16 | async def is_locked(self, *args: _P.args, **kwargs: _P.kwargs) -> bool: ... # pragma: no cover
17 |
18 | async def release(self, *args: _P.args, **kwargs: _P.kwargs) -> None: ... # pragma: no cover
19 |
20 |
21 | class SyncLockWrappedF(Protocol[_P, _R]):
22 | __wrapped__: Callable[_P, _R] # pragma: no cover
23 |
24 | def __call__(self, *args: _P.args, **kwargs: _P.kwargs) -> _R: ... # pragma: no cover
25 |
26 | def is_locked(self, *args: _P.args, **kwargs: _P.kwargs) -> bool: ... # pragma: no cover
27 |
28 | def release(self, *args: _P.args, **kwargs: _P.kwargs) -> None: ... # pragma: no cover
29 |
30 |
31 | class WrappedFunctionLock(Protocol):
32 | @overload
33 | def __call__(self, _func: Callable[_P, Awaitable[_R]]) -> AsyncLockWrappedF[_P, _R]: ... # type: ignore[overload-overlap]
34 |
35 | @overload
36 | def __call__(self, _func: Callable[_P, _R]) -> SyncLockWrappedF[_P, _R]: ...
37 |
38 | def __call__( # pragma: no cover
39 | self,
40 | _func: Union[
41 | Callable[_P, Awaitable[_R]],
42 | Callable[_P, _R],
43 | ],
44 | ) -> Union[
45 | AsyncLockWrappedF[_P, _R],
46 | SyncLockWrappedF[_P, _R],
47 | ]: ...
48 |
--------------------------------------------------------------------------------
/integration_tests/test_unpicklable_cache.py:
--------------------------------------------------------------------------------
1 | import sys
2 |
3 | import pytest
4 | from pytest_mock import MockerFixture
5 | from typing_extensions import Any
6 |
7 | from py_cachify import cached
8 |
9 |
10 | class UnpicklableClass:
11 | def __init__(self, arg1: str, arg2: str) -> None:
12 | self.arg1 = arg1
13 | self.arg2 = arg2
14 |
15 | def __eq__(self, other: 'UnpicklableClass') -> bool:
16 | return self.arg1 == other.arg1 and self.arg2 == other.arg2
17 |
18 | def __reduce__(self):
19 | raise TypeError('This class is not picklable')
20 |
21 |
22 | def create_unpicklable_class(arg1: str, arg2: str) -> UnpicklableClass:
23 | return UnpicklableClass(arg1=arg1, arg2=arg2)
24 |
25 |
26 | def test_cached_decorator_without_encoder():
27 | wrapped_create = cached(key='test_create_unpicklable-{arg1}-{arg2}')(create_unpicklable_class)
28 |
29 | with pytest.raises(TypeError, match='This class is not picklable'):
30 | wrapped_create('arg1', 'arg2')
31 |
32 |
33 | def test_cached_decorator_with_encoder_decoder(mocker: MockerFixture):
34 | def encoder(val: UnpicklableClass) -> dict[str, Any]:
35 | return {'arg1': val.arg1, 'arg2': val.arg2}
36 |
37 | def decoder(val: dict[str, Any]) -> UnpicklableClass:
38 | return UnpicklableClass(**val)
39 |
40 | spy_on_create = mocker.spy(sys.modules[__name__], 'create_unpicklable_class')
41 |
42 | wrapped_create = cached(key='test_create_unpicklable_dec_enc-{arg1}-{arg2}', enc_dec=(encoder, decoder))(
43 | create_unpicklable_class
44 | )
45 |
46 | res_1 = wrapped_create('test1', 'test2')
47 | res_2 = wrapped_create('test1', 'test2')
48 |
49 | assert res_1 == res_2
50 | spy_on_create.assert_called_once_with('test1', 'test2')
51 |
--------------------------------------------------------------------------------
/tests/test_merge.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import sys
3 | from concurrent.futures import ThreadPoolExecutor, as_completed
4 | from time import sleep
5 |
6 | import pytest
7 | from pytest_mock import MockerFixture
8 |
9 | from py_cachify._backend._cached import cached
10 | from py_cachify._backend._lock import once
11 |
12 |
13 | def sync_function(arg1: int, arg2: int) -> int:
14 | sleep(1)
15 | return arg1 + arg2
16 |
17 |
18 | async def async_function(arg1: int, arg2: int) -> int:
19 | await asyncio.sleep(1)
20 | return arg1 + arg2
21 |
22 |
23 | def test_cached_once_merge(init_cachify_fixture, mocker: MockerFixture):
24 | spy = mocker.spy(sys.modules[__name__], 'sync_function')
25 | sync_function_wrapped = cached(key='test_key')(sync_function)
26 | once_wrapped = once(key='test_key')(sync_function_wrapped)
27 |
28 | with ThreadPoolExecutor(max_workers=2) as e:
29 | futures = [
30 | e.submit(once_wrapped, arg1=3, arg2=4),
31 | e.submit(lambda: sleep(0.1) or once_wrapped(arg1=3, arg2=4)),
32 | ]
33 |
34 | result = once_wrapped(3, 4)
35 |
36 | results = [res.result() for res in as_completed(futures)]
37 | assert None in results
38 | assert results.count(7) == 1
39 | assert result == 7
40 | assert spy.call_count == 1
41 |
42 |
43 | @pytest.mark.asyncio
44 | async def test_cached_once_merge_async(init_cachify_fixture, mocker: MockerFixture):
45 | spy = mocker.spy(sys.modules[__name__], 'async_function')
46 | async_function_wrapped = cached(key='test_key')(async_function)
47 | once_wrapped = once(key='test_key')(async_function_wrapped)
48 |
49 | results = await asyncio.gather(once_wrapped(3, 4), once_wrapped(3, 4))
50 | result = await once_wrapped(3, 4)
51 |
52 | assert None in results
53 | assert results.count(7) == 1
54 | assert result == 7
55 | assert spy.call_count == 1
56 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_types/_common.py:
--------------------------------------------------------------------------------
1 | from collections.abc import Awaitable
2 | from typing import TYPE_CHECKING, Any, Callable, Optional, Protocol, Union
3 |
4 | from typing_extensions import TypeAlias
5 |
6 |
7 | if TYPE_CHECKING:
8 | from .._lib import CachifyClient
9 |
10 |
11 | Encoder: TypeAlias = Callable[[Any], Any]
12 | Decoder: TypeAlias = Callable[[Any], Any]
13 |
14 |
15 | class AsyncClient(Protocol):
16 | def get(self, name: str) -> Awaitable[Optional[Any]]:
17 | raise NotImplementedError
18 |
19 | def delete(self, *names: str) -> Awaitable[Any]:
20 | raise NotImplementedError
21 |
22 | def set(
23 | self,
24 | name: str,
25 | value: Any,
26 | ex: Union[int, None] = None,
27 | nx: bool = False,
28 | ) -> Awaitable[Any]:
29 | raise NotImplementedError
30 |
31 |
32 | class SyncClient(Protocol):
33 | def get(self, name: str) -> Optional[Any]:
34 | raise NotImplementedError
35 |
36 | def delete(self, *names: str) -> Any:
37 | raise NotImplementedError
38 |
39 | def set(
40 | self,
41 | name: str,
42 | value: Any,
43 | ex: Union[int, None] = None,
44 | nx: bool = False,
45 | ) -> Any:
46 | raise NotImplementedError
47 |
48 |
49 | class UnsetType:
50 | def __bool__(self) -> bool:
51 | return False
52 |
53 |
54 | UNSET = UnsetType()
55 |
56 |
57 | class LockProtocolBase(Protocol):
58 | _key: str
59 | _nowait: bool
60 | _timeout: Optional[Union[int, float]]
61 | _exp: Union[Optional[int], UnsetType]
62 |
63 | @staticmethod
64 | def _raise_if_cached(
65 | is_already_cached: bool, key: str, do_raise: bool = True, do_log: bool = True
66 | ) -> None: ... # pragma: no cover
67 |
68 | @property
69 | def _cachify(self) -> 'CachifyClient': ... # pragma: no cover
70 |
71 | def _calc_stop_at(self) -> float: ... # pragma: no cover
72 |
73 | def _get_ttl(self) -> Optional[int]: ... # pragma: no cover
74 |
--------------------------------------------------------------------------------
/docs/llm-full.md:
--------------------------------------------------------------------------------
1 | ---
2 | hide:
3 | - toc
4 | - navigation
5 | ---
6 | # LLM Full Documentation View
7 |
8 | This page programmatically includes all other documentation pages.
9 | It is intended for tools and LLMs that want a single-page view of the entire documentation
10 | while keeping each source file as the single source of truth.
11 |
12 | > NOTE: This page is built using the `include-markdown` plugin.
13 | > Each section below inlines the contents of an existing documentation file using Jinja-style directives.
14 |
15 | ---
16 |
17 | ## Top-level
18 |
19 | {% include-markdown "index.md" %}
20 |
21 | {% include-markdown "examples.md" %}
22 | ---
23 |
24 | ## Tutorial
25 |
26 | {% include-markdown "tutorial/index.md" %}
27 |
28 | ### Initial Setup
29 |
30 | {% include-markdown "tutorial/initial-setup/install.md" %}
31 |
32 | {% include-markdown "tutorial/initial-setup/initialization.md" %}
33 |
34 | ### Cached Decorator
35 |
36 | {% include-markdown "tutorial/cached-decorator/first-steps.md" %}
37 |
38 | {% include-markdown "tutorial/cached-decorator/dynamic-cache-keys.md" %}
39 |
40 | {% include-markdown "tutorial/cached-decorator/specifying-ttl-and-encoder-decoder.md" %}
41 |
42 | {% include-markdown "tutorial/cached-decorator/reset-attribute.md" %}
43 |
44 | ### Locks
45 |
46 | {% include-markdown "tutorial/locks/locks-intro.md" %}
47 |
48 | {% include-markdown "tutorial/locks/simple-locks.md" %}
49 |
50 | {% include-markdown "tutorial/locks/lock-parameters.md" %}
51 |
52 | {% include-markdown "tutorial/locks/lock-methods.md" %}
53 |
54 | {% include-markdown "tutorial/locks/lock-as-decorator.md" %}
55 |
56 | ### Once Decorator
57 |
58 | {% include-markdown "tutorial/once-decorator/index.md" %}
59 |
60 | ---
61 |
62 | ## API Reference
63 |
64 | {% include-markdown "reference/init.md" %}
65 |
66 | {% include-markdown "reference/cached.md" %}
67 |
68 | {% include-markdown "reference/lock.md" %}
69 |
70 | {% include-markdown "reference/once.md" %}
71 |
72 | ---
73 |
74 | ## Help & Contribution
75 |
76 | {% include-markdown "help/help.md" %}
77 |
78 | {% include-markdown "help/contribution.md" %}
79 | {% include-markdown "release-notes.md" %}
80 |
81 |
82 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_clients.py:
--------------------------------------------------------------------------------
1 | import threading
2 | import time
3 | from typing import Any, Optional, Union
4 |
5 |
6 | class MemoryCache:
7 | def __init__(self) -> None:
8 | self._cache: dict[str, tuple[Any, Union[float, None]]] = {}
9 | self._lock = threading.RLock()
10 |
11 | def set(self, name: str, value: Any, ex: Union[int, None] = None, nx: bool = False) -> Optional[bool]:
12 | """
13 | Set a value with optional NX semantics.
14 |
15 | - If nx is False: behaves like a normal set, always overwriting the value.
16 | Returns None to mirror the fact that some backends don't return a meaningful value.
17 | - If nx is True: only set the value if the key is absent or expired.
18 | Returns True if the value was set, False if the key already exists and is not expired.
19 | """
20 | if not nx:
21 | self._cache[name] = value, ex and time.time() + ex
22 | return None
23 |
24 | # NX path: need atomic check+set
25 | with self._lock:
26 | existing = self._cache.get(name)
27 | if existing is not None:
28 | _, exp_at = existing
29 | if exp_at is None or exp_at > time.time():
30 | return False
31 |
32 | self._cache[name] = value, ex and time.time() + ex
33 | return True
34 |
35 | def get(self, name: str) -> Optional[Any]:
36 | val, exp_at = self._cache.get(name, (None, None))
37 | if not exp_at or exp_at > time.time():
38 | return val
39 |
40 | self.delete(name)
41 | return None
42 |
43 | def delete(self, *names: str) -> None:
44 | for key in names:
45 | if key not in self._cache:
46 | continue
47 |
48 | del self._cache[key]
49 |
50 |
51 | class AsyncWrapper:
52 | def __init__(self, cache: MemoryCache) -> None:
53 | self._cache = cache
54 |
55 | async def get(self, name: str) -> Optional[Any]:
56 | return self._cache.get(name=name)
57 |
58 | async def delete(self, *names: str) -> Any:
59 | self._cache.delete(*names)
60 |
61 | async def set(self, name: str, value: Any, ex: Union[int, None] = None, nx: bool = False) -> Optional[Any]:
62 | return self._cache.set(name=name, value=value, ex=ex, nx=nx)
63 |
--------------------------------------------------------------------------------
/docs/tutorial/cached-decorator/dynamic-cache-keys.md:
--------------------------------------------------------------------------------
1 | # Cached - Dynamic cache key arguments
2 |
3 | In this tutorial, we will continue from the previous example and customize the key in the decorator.
4 |
5 | The full code will look like this:
6 |
7 | ```python
8 | import asyncio
9 |
10 | from py_cachify import init_cachify, cached
11 |
12 |
13 | # here we are initializing py-cachify to use an in-memory cache
14 | # for global decorators like @cached, @lock, @once
15 | init_cachify()
16 |
17 |
18 | # notice that we now have {a} and {b} in the cache key
19 | @cached(key='sum_two-{a}-{b}')
20 | async def sum_two(a: int, b: int) -> int:
21 | # Let's put print here to see what was the function called with
22 | print(f'Called with {a} {b}')
23 | return a + b
24 |
25 |
26 | async def main() -> None:
27 | # Call the function first time with (5, 5)
28 | print(f'First call result: {await sum_two(5, 5)}')
29 |
30 | # And we will call it again to make sure it's not called but the result is the same
31 | print(f'Second call result: {await sum_two(5, 5)}')
32 |
33 | # Now we will call it with different args to make sure the function is indeed called for another set of arguments
34 | print(f'Third call result: {await sum_two(5, 10)}')
35 |
36 |
37 | if __name__ == '__main__':
38 | asyncio.run(main())
39 | ```
40 |
41 | > Note: in more advanced scenarios you can also create a dedicated instance with `init_cachify(is_global=False)` and use `instance.cached(...)` instead of the global `@cached`. The dynamic key rules shown here work the same way for both global and instance-based usage.
42 |
43 |
44 | ## Understanding what has changed
45 |
46 | As you can see, we now have `{a}` and `{b}` inside our key.
47 | This allows py-cachify to dynamically craft a cache key for the function the decorator is applied to.
48 |
49 | This way it will cache the result for each set of arguments instead of creating just one key.
50 |
51 | Note that in this example the key `'sum_two-{}-{}'` would have the same effect.
52 | Unnamed (positional) placeholders are supported so that dynamic cache keys can be built even for functions that accept `*args, **kwargs` as their arguments.
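To build intuition for how these placeholders are resolved, here is a minimal, stdlib-only sketch of the idea: bind the call's arguments to the function's signature, then format the key template with them. Note that `make_key` is a hypothetical helper for illustration only, not py-cachify's actual API:

```python
import inspect
from typing import Any, Callable


def make_key(template: str, func: Callable[..., Any], *args: Any, **kwargs: Any) -> str:
    # Bind the call arguments to the function's signature so both
    # positional and named placeholders can be filled in.
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()
    return template.format(*bound.args, **bound.arguments)


def sum_two(a: int, b: int) -> int:
    return a + b


print(make_key('sum_two-{a}-{b}', sum_two, 5, 10))  # sum_two-5-10
print(make_key('sum_two-{}-{}', sum_two, 5, 10))    # sum_two-5-10
```

Either template produces the same key for the same call, which is why named and positional placeholders are interchangeable here.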
53 |
54 | We have also modified our main function to showcase the introduced changes.
55 |
56 | ## Let's run our code
57 |
58 | After running the example:
59 |
60 | ```bash
61 | # Run our example
62 | $ python main.py
63 |
64 | # The output will be
65 | Called with 5 5
66 | First call result: 10
67 | Second call result: 10
68 | Called with 5 10
69 | Third call result: 15
70 |
71 | ```
72 |
73 | As you can see, the function result is being cached based on the arguments provided.
74 |
75 | ## What's next
76 |
77 | In the next chapter, we'll learn what other parameters the `@cached()` decorator accepts.
78 |
--------------------------------------------------------------------------------
/integration_tests/test_decorators.py:
--------------------------------------------------------------------------------
1 | from threading import Thread
2 | from time import sleep
3 |
4 | import pytest
5 |
6 | from py_cachify import CachifyLockError, cached, once
7 |
8 |
9 | def test_once_decorator():
10 | @once(key='test_key-{arg1}', return_on_locked='IF_LOCKED')
11 | def _sync_function(arg1, arg2):
12 | sleep(2)
13 | return arg1 + arg2
14 |
15 | thread = Thread(target=_sync_function, args=(1, 2))
16 | thread.start()
17 | sleep(0.1)
18 | result = _sync_function(1, 2)
19 |
20 | assert 'IF_LOCKED' == result
21 |
22 |
23 | def test_once_decorator_raises():
24 | @once(key='test_key-{arg1}-{arg2}', raise_on_locked=True)
25 | def _sync_function(arg1, arg2):
26 | sleep(2)
27 | return arg1 + arg2
28 |
29 | thread = Thread(target=_sync_function, args=(1, 2))
30 | thread.start()
31 | sleep(0.1)
32 | with pytest.raises(CachifyLockError):
33 | _sync_function(1, 2)
34 |
35 |
36 | @pytest.mark.asyncio
37 | async def test_once_decorator_async_function():
38 | @once(key='async_test_key-{arg1}-{arg2}', return_on_locked='IF_LOCKED')
39 | async def _async_function(arg1, arg2, initial=True):
40 | res = None
41 | if initial:
42 | res = await _async_function(arg1, arg2, initial=False)
43 | return arg1 + arg2, res
44 |
45 | results = await _async_function(3, 4)
46 | assert 'IF_LOCKED' in results
47 | assert 7 in results
48 |
49 |
50 | @pytest.mark.asyncio
51 | async def test_async_once_decorator_raise_on_locked(init_cachify_fixture):
52 | @once(key='async_test_key-{arg1}-{arg2}', raise_on_locked=True)
53 | async def _async_function(arg1: int, arg2: int) -> int:
54 | await _async_function(arg1, arg2)
55 | return arg1 + arg2
56 |
57 | with pytest.raises(CachifyLockError):
58 | await _async_function(3, 4)
59 |
60 |
61 | def test_cached_decorator_sync_function():
62 | @cached(key='test_key')
63 | def _sync_function_wrapped(arg1, arg2):
64 | return arg1 + arg2
65 |
66 | result = _sync_function_wrapped(3, 4)
67 | result_2 = _sync_function_wrapped(10, 20)
68 | _sync_function_wrapped.reset(3, 4)
69 | result_3 = _sync_function_wrapped(10, 20)
70 |
71 | assert result == 7
72 | assert result_2 == 7
73 | assert result_3 == 30
74 |
75 |
76 | @pytest.mark.asyncio
77 | async def test_cached_decorator_async_function():
78 | @cached(key='test_key_{arg1}')
79 | async def _async_function_wrapped(arg1, arg2):
80 | return arg1 + arg2
81 |
82 | result = await _async_function_wrapped(3, 4)
83 | result_2 = await _async_function_wrapped(3, 20)
84 | await _async_function_wrapped.reset(3, 4)
85 | result_3 = await _async_function_wrapped(10, 20)
86 |
87 | assert result == 7
88 | assert result_2 == 7
89 | assert result_3 == 30
90 |
--------------------------------------------------------------------------------
/docs/tutorial/locks/lock-parameters.md:
--------------------------------------------------------------------------------
1 | # Lock - Lock Parameters in Py-Cachify
2 |
3 | ## Parameters
4 |
5 | Here, we will detail the various parameters that you can
6 | configure when creating a lock and how to use them effectively.
7 |
8 | ### Explanation of Parameters
9 |
10 | 1. **key**:
11 | - This is a mandatory parameter that uniquely identifies the lock. Each operation you wish to manage with a lock should have a unique key.
12 |
13 | 2. **nowait**:
14 |    - Setting `nowait=True` (the default) means that if the lock is already held elsewhere, the current attempt won't wait and will immediately raise a `CachifyLockError`.
15 | - If `nowait=False`, your lock will wait until the lock becomes available, up to the duration specified by `timeout`.
16 |
17 | 3. **timeout**:
18 |    - Use this parameter to specify how long (in seconds) to wait to acquire the lock. If the lock does not become available within this time, a `CachifyLockError` is raised.
19 |    - `timeout` only takes effect when `nowait` is `False`.
20 | 4. **exp**:
21 | - This parameter sets an expiration time (in seconds) for the lock. After this time, the lock will automatically be released, regardless of whether the operation has been completed.
22 | - This can help to prevent deadlocks in cases where an app may fail to release the lock due to an error or abrupt termination.
23 |
24 |
25 | ## Some examples
26 |
27 | Let's write the example showcasing every parameter and then go through the output to understand what is happening:
28 |
29 | ```python
30 | import asyncio
31 |
32 | from py_cachify import init_cachify, lock
33 |
34 |
35 | # Initialize py-cachify to use in-memory cache
36 | init_cachify()
37 |
38 |
39 | async def main() -> None:
40 | example_lock = lock(key='example-lock', nowait=False, timeout=4, exp=2)
41 |
42 | async with example_lock:
43 | print('This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds')
44 |
45 | async with example_lock:
46 | print('This code is acquiring the same lock under the previous one.')
47 |
48 |
49 | if __name__ == '__main__':
50 | asyncio.run(main())
51 | ```
52 |
53 | After running the example:
54 |
55 | ```bash
56 | $ python main.py
57 |
58 | # The output will be
59 | This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds
60 | example-lock is already locked!
61 | example-lock is already locked!
62 | example-lock is already locked!
63 | example-lock is already locked!
64 | This code is acquiring the same lock under the previous one.
65 | ```
66 |
67 | As you can see, we got no errors: the lock expired after 2 seconds, and the timeout (the maximum time to wait when acquiring a lock) is set to 4 seconds,
68 | which is long enough to outlast the expiration of the first acquisition.
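One way to picture how `nowait=False` interacts with `timeout` is a simple polling loop: keep retrying the acquire until it succeeds or the timeout elapses. This is a stdlib-only sketch of the concept, not py-cachify's actual implementation; `acquire_with_timeout` and `try_acquire` are hypothetical names:

```python
import time
from typing import Callable


def acquire_with_timeout(try_acquire: Callable[[], bool], timeout: float, poll_interval: float = 0.1) -> bool:
    # Retry the non-blocking acquire until it succeeds or `timeout` seconds pass.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if try_acquire():
            return True
        time.sleep(poll_interval)
    return False


# Simulate a lock that frees up after 0.2 seconds; a 1-second timeout is enough.
free_at = time.monotonic() + 0.2
print(acquire_with_timeout(lambda: time.monotonic() >= free_at, timeout=1.0, poll_interval=0.05))  # True
```

In the example above the same logic plays out: the 2-second expiration frees the lock well within the 4-second wait budget.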
69 |
70 | ## What's next
71 |
72 | Next, we'll see what methods lock objects have in py-cachify.
73 |
--------------------------------------------------------------------------------
/docs/tutorial/initial-setup/initialization.md:
--------------------------------------------------------------------------------
1 | # Initializing the library
2 |
3 | ## Description
4 |
5 |
6 |
7 | First, to start working with the library, you will have to initialize it by using the provided `init_cachify` function for global usage, or create one or more dedicated instances when you need isolated caches:
8 |
9 |
10 | ```python
11 | from py_cachify import init_cachify
12 |
13 |
14 |
15 | # Configure the global Cachify instance used by top-level decorators
16 | init_cachify()
17 | ```
18 |
19 |
20 | By default, the global client uses an **in-memory** cache.
21 |
22 |
23 |
24 | ⚠ In-memory cache details
25 |
26 | The in-memory cache is not suitable for any sort of serious application, since every Python process uses its own memory,
27 | so caching/locking won't work as expected across processes. Be careful with it and make sure it fits your particular use case:
28 | a simple single-process script will probably be fine with an in-memory cache, but a multi-worker FastAPI app won't work as expected.
29 |
30 |
31 |
32 | If you want to use Redis:
33 | ```python
34 | from py_cachify import init_cachify
35 | from redis.asyncio import from_url as async_from_url
36 | from redis import from_url
37 |
38 |
39 | # Example: configure global Cachify with Redis for both sync and async flows
40 | init_cachify(
41 | sync_client=from_url(redis_url),
42 | async_client=async_from_url(redis_url),
43 | default_cache_ttl=300,
44 | )
45 | ```
46 | Normally you wouldn't have to use both sync and async clients, since an application usually works in a single mode, i.e. sync or async. You can pass only `sync_client` **or** only `async_client` if that matches your usage, or both if you want sync and async code paths to share the same backend. The `default_cache_ttl` parameter lets you configure a global default TTL (in seconds) that is used by `@cached` when `ttl` is omitted.
47 |
48 |
49 | Once the global client is initialized you can use everything that the library provides straight up without being worried about managing the cache yourself.
50 |
51 |
52 | ❗ If you have not called `init_cachify` with `is_global=True` at least once, using the global decorators (`cached`, `lock`, `once`) will raise `CachifyInitError` at runtime. Instance-based usage via `init_cachify(is_global=False)` does not depend on this global initialization and can be used independently.
53 |
54 |
55 | ## Additional info on initialization
56 |
57 | The clients are not the only thing that this function accepts. You can also configure `default_cache_ttl`, `default_lock_expiration`, prefixes, and whether a particular call should register a global client or return a dedicated instance. Make sure to check out the **[Detailed initialization reference](../../reference/init.md)** for the full list of options and defaulting rules.
58 |
59 |
60 |
61 | ## What's next
62 |
63 |
64 | Next, we'll learn about the `@cached()` decorator and how to use it, including how it interacts with `default_cache_ttl` and how to use it both with the global decorators and with dedicated `Cachify` instances.
65 |
66 |
--------------------------------------------------------------------------------
/docs/tutorial/once-decorator/index.md:
--------------------------------------------------------------------------------
1 | # Once - Decorator for background tasks
2 |
3 | ## Description
4 |
5 |
6 | The `@once` decorator is a convenience wrapper around the same distributed locking mechanism used by `lock`, but tailored for “only one run at a time” semantics on a given key.
7 |
8 | `once` can come in handy when you have a lot of background tasks, which usually are powered by `celery`, `darq`, `taskiq`, or `dramatiq`.
9 |
10 | ## Theoretical example
11 |
12 | Let's say we have some sort of a spawner task, which spawns a lot of small ones.
13 | Like, for example, the spawner gets all the orders in progress and submits a task for each one to check the status on it.
14 |
15 | It could look like this:
16 |
17 | ```python
18 | from celery import shared_task
19 |
20 | # This is scheduled to run every 5 minutes
21 | @shared_task()
22 | def check_in_progress_orders() -> None:
23 | orders = ... # hit the database and get all orders
24 | [check_order.s(order_id=order.id).delay() for order in orders]
25 |
26 |
27 | # This is being spawned from the previous one
28 | @shared_task()
29 | def check_order(order_id: UUID) -> None:
30 |     ...  # check the order progress, update state, save
31 |
32 | ```
33 |
34 | So in this scenario, we don't care about the results of each task, but we DO care that we never run two tasks for the same `order_id` concurrently,
35 | since that could break things.
36 |
37 |
38 | This is where `@once` could come in handy: it will make sure that only one task is being run at the same time for a given `order_id`, and all subsequent tasks on the same `order_id` will exit early while at least one task is running.
39 |
40 |
41 | The full code will look like this:
42 |
43 | ```python
44 | from py_cachify import once
45 | from celery import shared_task
46 |
47 | # This is scheduled to run every 5 minutes
48 | @shared_task()
49 | def check_in_progress_orders() -> None:
50 | orders = ... # hit the database and get all orders
51 | [check_order.s(order_id=order.id).delay() for order in orders]
52 |
53 |
54 | # This is spawned from the previous one
55 | @shared_task()
56 | @once(key='check_order-{order_id}', raise_on_locked=False, return_on_locked=None) # raise_on_locked and return_on_locked can be omitted (those values are defaults)
57 | def check_order(order_id: UUID) -> None:
58 |     # check the order progress, update state, save
59 |     pass
60 |
61 | ```
62 |
63 | This will make sure you won't run into multiple update tasks running at the same time for one order.
64 |
65 | ## What's next
66 |
67 | You can always check the full reference for once [here](../../reference/once.md).
68 |
69 | ## Conclusion
70 |
71 |
72 | This concludes the tutorial for py-cachify.
73 |
74 | We have covered the basics of the package and glanced over common use cases.
75 | Caching and locking are common topics, yet they are always unique to the specifics of the app and the tasks the programmer wants to solve.
76 |
77 |
78 |
79 | Py-Cachify tries to help you cover your specific cases by giving you lock- and once-based tools built on the same underlying mechanism, so you can adapt them to your needs without bloating your codebase.
80 |
81 |
82 | The full API reference can be found [here](../../reference/init.md).
83 |
--------------------------------------------------------------------------------
/docs/tutorial/locks/simple-locks.md:
--------------------------------------------------------------------------------
1 | # Lock - Getting Started with Locks in Py-Cachify
2 |
3 | ## Starting slow
4 |
5 | Let's write the following code:
6 |
7 | ```python
8 | import asyncio
9 |
10 | from py_cachify import init_cachify, lock
11 |
12 |
13 | # here we initialize py-cachify to use an in-memory cache, as usual
14 | init_cachify()
15 |
16 |
17 | async def main() -> None:
18 |
19 | # this is a sync lock
20 | with lock(key='cool-sync-lock'):
21 | print('this code is locked')
22 |
23 | # and this is an async lock
24 | async with lock(key='cool-async-lock'):
25 | print('this code is locked, but using async cache')
26 |
27 |
28 | if __name__ == '__main__':
29 | asyncio.run(main())
30 | ```
31 |
32 | If we run the example:
33 |
34 |
35 | ```bash
36 | # Run our example
37 | $ python main.py
38 |
39 | # The output will be
40 | this code is locked
41 | this code is locked, but using async cache
42 | ```
43 |
44 | As you can see, we just had both of our prints printed out without any exceptions.
45 |
46 | Notice how we utilized both sync and async context managers from a single lock object;
47 | this lets `py-cachify` support the `lock` in any environment your application might run in (sync or async),
48 | without splitting it into, for example, `async_lock` and `sync_lock`.
49 |
50 | From now on we will do everything in async, but you can also follow the tutorial writing the same sync code :)
51 |
52 | ## Let's break it
53 |
54 |
55 | Now, we'll adjust our previous example:
56 |
57 | ```python
58 | import asyncio
59 |
60 | from py_cachify import init_cachify, lock
61 |
62 |
63 | # here we initialize py-cachify to use an in-memory cache, as usual
64 | init_cachify()
65 |
66 |
67 | async def main() -> None:
68 |
69 | # and this is an async lock
70 | async with lock(key='cool-async-lock'):
71 | print('this code is locked and will be executed')
72 |
73 | async with lock(key='cool-async-lock'):
74 | print('we are attempting to acquire a new lock with the same key and will not make it to this print')
75 |
76 |
77 | if __name__ == '__main__':
78 | asyncio.run(main())
79 | ```
80 |
81 | After running this piece:
82 |
83 |
84 | ```bash
85 | $ python main.py
86 | # The output will be
87 |
88 | this code is locked and will be executed
89 | cool-async-lock is already locked! # this is a .warning from log
90 |
91 | # traceback of an error
92 | Traceback (most recent call last):
93 | ...
94 | File "/py_cachify/backend/lock.py", line 199, in _raise_if_cached
95 | raise CachifyLockError(msg)
96 | py_cachify.backend.exceptions.CachifyLockError: cool-async-lock is already locked!
97 | ```
98 |
99 | And as expected, at the line where we try to acquire a lock with the same name, we get an error telling us that this key is already locked.
100 |
101 | This was a showcase of the most basic behaviour of locks (or mutexes); everything else is built on top of this concept.
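The same core behaviour can be sketched with the standard library: a lock that either acquires immediately or fails with an error. `NoWaitLock` below is a hypothetical single-process illustration of the concept; py-cachify's lock stores its state in the configured cache backend instead, which is what makes it work across processes:

```python
import threading


class NoWaitLock:
    """Acquire immediately or raise: a toy model of nowait lock semantics."""

    def __init__(self) -> None:
        self._lock = threading.Lock()

    def __enter__(self) -> 'NoWaitLock':
        if not self._lock.acquire(blocking=False):
            raise RuntimeError('already locked!')
        return self

    def __exit__(self, *exc) -> None:
        self._lock.release()


lk = NoWaitLock()
with lk:
    try:
        with lk:  # second acquisition of the same lock fails immediately
            pass
    except RuntimeError as e:
        print(e)  # already locked!
```

Just like in the py-cachify example, the second acquisition fails while the first is still held, and the lock is released again once the outer block exits.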
102 |
103 | ## What's next
104 |
105 | Next, we'll see what parameters the `lock` object has and what cases we can cover with their help.
--------------------------------------------------------------------------------
/docs/tutorial/locks/lock-as-decorator.md:
--------------------------------------------------------------------------------
1 | # Lock - Using Lock as a Decorator
2 |
3 | ## Parameters and Methods
4 |
5 | You can also use the `lock` that py-cachify has as a decorator.
6 |
7 | It accepts the same parameters as a normal `lock` and also automatically detects which function it is being applied to
8 | (sync or async) and uses the correct wrapper.
9 |
10 | ## Differences with regular usage
11 |
12 | - The first difference (and advantage) is that when using `lock` as a decorator you can create **dynamic** cache keys (same as in `cached` decorator).
13 | - The second is that, since the decorator *knows* what type of function it is being applied to, there is no need to attach both `is_alocked` and `is_locked` to the wrapped function; it only attaches `is_locked(*args, **kwargs)`, which is of the same kind as the wrapped function (i.e. sync or async).
14 |
15 | ## Examples
16 |
17 | Let's write some code that showcases all the methods with default lock params.
18 |
19 | ```python
20 | import asyncio
21 |
22 | from py_cachify import init_cachify, lock, CachifyLockError
23 |
24 |
25 | # Initialize py-cachify to use in-memory cache
26 | init_cachify()
27 |
28 |
29 | # Function that is wrapped in a lock and just sleeps for certain amount of time
30 | @lock(key='sleep_for_lock-{arg}', nowait=True)
31 | async def sleep_for(arg: int) -> None:
32 | await asyncio.sleep(arg)
33 |
34 |
35 | async def main() -> None:
36 | # Calling a function with an arg=3
37 | _ = asyncio.create_task(sleep_for(3))
38 | await asyncio.sleep(0.1)
39 |
40 | # Checking if the arg 3 call is locked (should be locked)
41 | print(f'Sleep for is locked for argument 3: {await sleep_for.is_locked(3)}')
42 | # Checking if the arg 4 call is locked (should not be locked)
43 | print(f'Sleep for is locked for argument 4: {await sleep_for.is_locked(4)}')
44 |
45 | task = asyncio.create_task(sleep_for(5))
46 | await asyncio.sleep(0.1)
47 | # Checking if our call with arg=5 is locked
48 | print(f'Sleep for is locked for argument 5: {await sleep_for.is_locked(5)}')
49 | # Forcefully release a lock
50 | await sleep_for.release(5)
51 | # Doing a second check - shouldn't be locked now
52 | print(f'Sleep for is locked for argument 5: {await sleep_for.is_locked(5)}')
53 | await task
54 |
55 | # Trying to run 2 tasks with the same argument (and catching the exception)
56 | try:
57 | await asyncio.gather(sleep_for(1), sleep_for(1))
58 | except CachifyLockError as e:
59 | print(f'Exception: {e}')
60 |
61 |
62 | if __name__ == '__main__':
63 | asyncio.run(main())
64 | ```
65 |
66 | After running the example:
67 |
68 | ```bash
69 | $ python main.py
70 |
71 | # The output
72 | Sleep for is locked for argument 3: True
73 | Sleep for is locked for argument 4: False
74 | Sleep for is locked for argument 5: True
75 | Sleep for is locked for argument 5: False
76 | sleep_for_lock-1 is already locked!
77 | Exception: sleep_for_lock-1 is already locked!
78 | ```
79 |
80 | Here we tried to showcase all the flexibility you have when wrapping functions with the `lock`.
81 |
82 | ## Conclusion
83 |
84 | This concludes our tutorial for the `lock` that py-cachify provides.
85 |
86 | The full API reference can be found [here](../../reference/lock.md).
--------------------------------------------------------------------------------
/docs/tutorial/cached-decorator/first-steps.md:
--------------------------------------------------------------------------------
1 | # Cached - First steps with py-cachify
2 |
3 | Judging by the package's name, py-cachify provides cache-based utilities, so let's start by doing some simple caching :)
4 |
5 |
6 | Py-Cachify is a thin, backend-agnostic wrapper over your cache client (for example Redis or DragonflyDB), giving you a clean decorator-based API instead of manually wiring get/set logic.
7 |
8 | The initialization details can be found [here](../initial-setup/initialization.md).
9 |
10 |
11 |
12 | For the sake of all the examples here, we will use the in-memory cache and an async environment, but everything will be the same for the sync one. In more advanced scenarios you can also create dedicated `Cachify` instances with `init_cachify(is_global=False)` for per-module or per-subsystem caches instead of relying only on the global decorators.
13 |
14 |
15 | ## Function to cache
16 |
17 | Let's start by creating a function that we are about to cache:
18 |
19 | ```python
20 | async def sum_two(a: int, b: int) -> int:
21 | # Let's put print here to see what was the function called with
22 | print(f'Called with {a} {b}')
23 | return a + b
24 | ```
25 |
26 | So this function takes two integers and returns their sum.
27 |
28 | ## Introducing py-cachify
29 |
30 | To cache a function all we have to do is wrap the function in the provided `@cached()` decorator.
31 |
32 | Also, we'll implement a simple main function to run our example, the full code will look something like this:
33 |
34 | ```python
35 | import asyncio
36 |
37 | from py_cachify import init_cachify, cached
38 |
39 |
40 | # here we are initializing py-cachify to use an in-memory cache
41 | init_cachify()
42 |
43 |
44 | @cached(key='sum_two')
45 | async def sum_two(a: int, b: int) -> int:
46 | # Let's put print here to see what was the function called with
47 | print(f'Called with {a} {b}')
48 | return a + b
49 |
50 |
51 | async def main() -> None:
52 | print(f'First call result: {await sum_two(5, 5)}')
53 | print(f'Second call result: {await sum_two(5, 5)}')
54 |
55 |
56 | if __name__ == '__main__':
57 | asyncio.run(main())
58 | ```
59 |
60 |
61 | ## Running the example
62 |
63 | Now, let's run the example above.
64 |
65 |
66 | ```bash
67 | # Run our example
68 | $ python main.py
69 |
70 | # The output should be
71 | Called with 5 5
72 | First call result: 10
73 | Second call result: 10
74 | ```
75 |
76 | So as you can see, the function result has been successfully cached on the first call,
77 | and the second call to the function did not invoke an actual implementation and got its result from cache.
78 |
79 |
80 | ## Type annotations
81 |
82 | Py-Cachify is **fully** type annotated. This enhances the developer experience and lets your IDE keep doing the work it is supposed to do.
83 |
84 | As an example, our wrapped function keeps all its type annotations and lets you keep writing the code comfortably.
85 |
86 | 
87 |
88 | And another example, in this case, our LSP gives us the warning that we have forgotten to `await` the async function.
89 |
90 | 
91 |
92 | ## What's next
93 |
94 | Next, we will utilize the dynamic cache key to create cache results based on function arguments.
95 |
--------------------------------------------------------------------------------
/mkdocs.yml:
--------------------------------------------------------------------------------
1 | site_name: Py-Cachify
2 | site_description: Py-Cachify. Cache and locks made easy. Fully type annotated. 100% coverage.
3 | repo_name: EzyGang/py-cachify
4 | repo_url: https://github.com/EzyGang/py-cachify
5 |
6 | theme:
7 | name: material
8 | logo: img/logo-icon.svg
9 | favicon: img/logo.png
10 | palette:
11 | - media: "(prefers-color-scheme: light)"
12 | scheme: default
13 | primary: black
14 | accent: cyan
15 | toggle:
16 | icon: material/lightbulb
17 | name: Switch to dark mode
18 | - media: "(prefers-color-scheme: dark)"
19 | scheme: slate
20 | primary: black
21 | accent: cyan
22 | toggle:
23 | icon: material/lightbulb-outline
24 | name: Switch to light mode
25 | icon:
26 | repo: fontawesome/brands/github
27 | features:
28 | - content.code.annotate
29 | - content.code.copy
30 | # - content.code.select
31 | - content.footnote.tooltips
32 | - content.tabs.link
33 | - content.tooltips
34 | - navigation.footer
35 | - navigation.indexes
36 | - navigation.path
37 | - navigation.tabs
38 | - navigation.tabs.sticky
39 | - navigation.top
40 | - navigation.tracking
41 | - search.highlight
42 | - search.share
43 | - search.suggest
44 | - toc.follow
45 |
46 | markdown_extensions:
47 | # Python Markdown
48 | abbr:
49 | attr_list:
50 | footnotes:
51 | md_in_html:
52 | tables:
53 | toc:
54 | permalink: true
55 |
56 | # Python Markdown Extensions
57 | pymdownx.betterem:
58 | smart_enable: all
59 | pymdownx.caret:
60 | pymdownx.highlight:
61 | line_spans: __span
62 | pymdownx.inlinehilite:
63 | pymdownx.keys:
64 | pymdownx.mark:
65 | pymdownx.superfences:
66 | custom_fences:
67 | - name: mermaid
68 | class: mermaid
69 | format: !!python/name:pymdownx.superfences.fence_code_format
70 | pymdownx.tilde:
71 |
72 | # pymdownx blocks
73 | pymdownx.blocks.admonition:
74 | types:
75 | - note
76 | - attention
77 | - caution
78 | - danger
79 | - error
80 | - tip
81 | - hint
82 | - warning
83 | # Custom types
84 | - info
85 | pymdownx.blocks.details:
86 | pymdownx.blocks.tab:
87 | alternate_style: True
88 |
89 | plugins:
90 | # Material for MkDocs
91 | - search
92 | - termynal
93 | - include-markdown
94 | nav:
95 | - Introduction: "index.md"
96 | - Tutorial - User Guide:
97 | - tutorial/index.md
98 | - Initial setup:
99 | - tutorial/initial-setup/install.md
100 | - tutorial/initial-setup/initialization.md
101 | - Cached (///@cached()/// decorator):
102 | - tutorial/cached-decorator/first-steps.md
103 | - tutorial/cached-decorator/dynamic-cache-keys.md
104 | - tutorial/cached-decorator/specifying-ttl-and-encoder-decoder.md
105 | - Cached - Manually resetting cache with ///reset()/// method: tutorial/cached-decorator/reset-attribute.md
106 | - Lock (///lock()/// context manager and decorator):
107 | - tutorial/locks/locks-intro.md
108 | - tutorial/locks/simple-locks.md
109 | - tutorial/locks/lock-parameters.md
110 | - tutorial/locks/lock-methods.md
111 | - tutorial/locks/lock-as-decorator.md
112 | - Once (///@once()/// decorator):
113 | - tutorial/once-decorator/index.md
114 | - API Reference:
115 | - reference/init.md
116 | - reference/cached.md
117 | - reference/lock.md
118 | - reference/once.md
119 | - Examples: examples.md
120 | - Contribution & Help:
121 | - help/help.md
122 | - help/contribution.md
123 | - Release Notes: release-notes.md
124 | - LLM Full: llm-full.md
125 |
126 | hooks:
127 | - docs/scripts/hooks.py
128 |
--------------------------------------------------------------------------------
/tests/test_once_decorator.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | from concurrent.futures import ThreadPoolExecutor, as_completed
3 | from time import sleep
4 |
5 | import pytest
6 |
7 | from py_cachify import CachifyLockError, once
8 |
9 |
10 | def test_once_decorator_sync_function(init_cachify_fixture):
11 | @once(key='test_key-{arg1}-{arg2}')
12 | def sync_function(arg1, arg2):
13 | sleep(1)
14 | return arg1 + arg2
15 |
16 | with ThreadPoolExecutor(max_workers=2) as e:
17 | futures = [e.submit(sync_function, arg1=3, arg2=4), e.submit(sync_function, arg1=3, arg2=4)]
18 |
19 | results = [res.result() for res in as_completed(futures)]
20 | assert None in results
21 | assert 7 in results
22 |
23 |
24 | @pytest.mark.asyncio
25 | async def test_once_decorator_async_function(init_cachify_fixture):
26 | @once(key='test_key-{arg1}-{arg2}')
27 | async def async_function(arg1, arg2):
28 | await asyncio.sleep(1)
29 | return arg1 + arg2
30 |
31 | results = await asyncio.gather(async_function(3, 4), async_function(3, 4))
32 | assert None in results
33 | assert 7 in results
34 |
35 |
36 | def test_once_decorator_raise_on_locked(init_cachify_fixture):
37 | @once(key='test_key-{arg1}-{arg2}', raise_on_locked=True)
38 | def sync_function(arg1, arg2):
39 | sleep(1)
40 | return arg1 + arg2
41 |
42 | with ThreadPoolExecutor(max_workers=2) as e:
43 | futures = [e.submit(sync_function, arg1=3, arg2=4), e.submit(sync_function, arg1=3, arg2=4)]
44 |
45 | with pytest.raises(CachifyLockError):
46 | [res.result() for res in as_completed(futures)]
47 |
48 |
49 | @pytest.mark.asyncio
50 | async def test_async_once_decorator_raise_on_locked(init_cachify_fixture):
51 | @once(key='test_key-{arg1}-{arg2}', raise_on_locked=True)
52 | async def async_function(arg1: int, arg2: int) -> int:
53 | await asyncio.sleep(1)
54 | return arg1 + arg2
55 |
56 | with pytest.raises(CachifyLockError):
57 | await asyncio.gather(async_function(3, 4), async_function(3, 4))
58 |
59 |
60 | def test_once_decorator_return_on_locked_sync(init_cachify_fixture):
61 | to_return = 'test'
62 |
63 | @once(key='test_key-{arg1}', return_on_locked=to_return)
64 | def sync_function(arg1, arg2):
65 | sleep(1)
66 | return arg1 + arg2
67 |
68 | with ThreadPoolExecutor(max_workers=2) as e:
69 | futures = [e.submit(sync_function, arg1=3, arg2=4), e.submit(sync_function, arg1=3, arg2=4)]
70 |
71 | results = [res.result() for res in as_completed(futures)]
72 | assert to_return in results
73 | assert 7 in results
74 |
75 |
76 | @pytest.mark.asyncio
77 | async def test_once_decorator_return_on_locked_async(init_cachify_fixture):
78 | to_return = 'test'
79 |
80 | @once(key='test_key-{arg1}', return_on_locked=to_return)
81 | async def async_function(arg1, arg2):
82 | await asyncio.sleep(1)
83 | return arg1 + arg2
84 |
85 | results = await asyncio.gather(async_function(3, 4), async_function(3, 4))
86 | assert to_return in results
87 | assert 7 in results
88 |
89 |
90 | def test_once_wrapped_async_function_has_release_and_is_locked_callables_attached(init_cachify_fixture):
91 | @once(key='test')
92 | async def async_function(arg1: int, arg2: int) -> None:
93 | return None
94 |
95 | assert hasattr(async_function, 'release')
96 | assert asyncio.iscoroutinefunction(async_function.release)
97 |
98 | assert hasattr(async_function, 'is_locked')
99 | assert asyncio.iscoroutinefunction(async_function.is_locked)
100 |
101 |
102 | def test_once_wrapped_function_has_release_and_is_locked_callables_attached(init_cachify_fixture):
103 | @once(key='test')
104 | def sync_function() -> None: ...
105 |
106 | assert hasattr(sync_function, 'release')
107 | assert not asyncio.iscoroutinefunction(sync_function.release)
108 | assert callable(sync_function.release)
109 |
110 | assert hasattr(sync_function, 'is_locked')
111 | assert not asyncio.iscoroutinefunction(sync_function.is_locked)
112 | assert callable(sync_function.is_locked)
113 |
--------------------------------------------------------------------------------
/docs/tutorial/locks/lock-methods.md:
--------------------------------------------------------------------------------
1 | # Lock - Lock Methods in Py-Cachify
2 |
3 | Parameters are not the only thing the lock in py-cachify provides.
4 | There are also a couple of handy methods.
5 |
6 | ## ///is_locked()/// and ///is_alocked()///
7 |
8 | The `is_locked()` method checks whether the lock associated with the specified key is currently held.
9 | The `is_alocked()` method is the asynchronous counterpart of `is_locked()`: it performs the same check but is designed for use in an async context.
10 |
11 | Both methods return a `bool`.
12 |
13 | ## Example for ///is_alocked()///
14 |
15 | Let's modify the previous example a little bit:
16 |
17 | ```python
18 | import asyncio
19 |
20 | from py_cachify import init_cachify, lock
21 |
22 |
23 | # Initialize py-cachify to use in-memory cache
24 | init_cachify()
25 |
26 |
27 | async def main() -> None:
28 | example_lock = lock(key='example-lock', nowait=False, timeout=4, exp=2)
29 |
30 | async with example_lock:
31 | print('This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds')
32 | while await example_lock.is_alocked():
33 | print('Lock is still active! Waiting...')
34 | await asyncio.sleep(1)
35 |
36 | print('Lock has been released')
37 | async with example_lock:
38 | print('Acquire the same lock again')
39 |
40 |
41 | if __name__ == '__main__':
42 | asyncio.run(main())
43 | ```
44 |
45 | After running the example:
46 |
47 |
48 | ```bash
49 | $ python main.py
50 |
51 | # The output will be
52 | This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds
53 | Lock is still active! Waiting...
54 | Lock is still active! Waiting...
55 | Lock has been released
56 | Acquire the same lock again
57 | ```
58 |
59 | As you can see, we checked whether the lock had been released inside a `while` loop before reacquiring it.
60 |
61 | Remember that these are distributed locks: in a real app you could check whether the lock is held from another process or even another machine!
62 |
63 |
64 | ## ///release()/// and ///arelease()///
65 |
66 | The method `release()` releases the lock associated with the given key.
67 | It is called internally when a lock context manager exits.
68 |
69 | The method `arelease()` is similar to `release()` but is used in an asynchronous context.
70 |
71 | ## Modifying the example
72 |
73 | We'll introduce some small changes to the previous code:
74 |
75 | ```python
76 | import asyncio
77 |
78 | from py_cachify import init_cachify, lock
79 |
80 |
81 | # Initialize py-cachify to use in-memory cache
82 | init_cachify()
83 |
84 |
85 | async def main() -> None:
86 | example_lock = lock(key='example-lock', nowait=False, timeout=4, exp=2)
87 |
88 | async with example_lock:
89 | print('This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds')
90 |
91 | await example_lock.arelease()
92 |
93 | print(f'Is the lock currently locked: {await example_lock.is_alocked()}')
94 | async with example_lock:
95 | print('Acquire the same lock again')
96 |
97 |
98 | if __name__ == '__main__':
99 | asyncio.run(main())
100 | ```
101 |
102 | After running the example:
103 |
104 |
105 | ```bash
106 | $ python main.py
107 |
108 | # The output
109 | This code is executing under a lock with a timeout of 4 seconds and expiration set to 2 seconds
110 | Is the lock currently locked: False
111 | Acquire the same lock again
112 | ```
113 |
114 | This time we forcefully released the lock instead of relying on a `while` loop to wait for it to expire.
115 |
116 | ## Conclusion
117 |
118 | Understanding these methods allows for better management of locks within your applications.
119 | Depending on your application’s architecture (sync vs. async), you'll choose between the synchronous and asynchronous methods to check lock status or release locks after use.
120 | This ensures that resources are managed efficiently and that concurrently executed code does not produce race conditions or inconsistent data.
121 |
122 | ## What's next
123 |
124 | Next, we'll see how we can use `lock` as a decorator and the ✨magic✨ that py-cachify does.
--------------------------------------------------------------------------------
/.github/workflows/checks.yml:
--------------------------------------------------------------------------------
1 | name: Pre-build Checks & Tests
2 | concurrency:
3 | group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
4 | cancel-in-progress: true
5 |
6 | on:
7 | push:
8 | branches: [main]
9 | pull_request:
10 | types: [opened, synchronize, reopened]
11 | branches: [main, v/*, rc/*]
12 |
13 | env:
14 | UV_FROZEN: 1
15 |
16 | jobs:
17 | unit-tests:
18 | name: Unit Tests
19 | runs-on: ubuntu-latest
20 | strategy:
21 | fail-fast: false
22 | matrix:
23 | python: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"]
24 | steps:
25 | - name: Install uv
26 | uses: astral-sh/setup-uv@v7
27 | with:
28 | version: "latest"
29 | python-version: ${{ matrix.python }}
30 | activate-environment: true
31 | enable-cache: true
32 | - uses: actions/setup-python@v6
33 | with:
34 | python-version: ${{ matrix.python }}
35 | allow-prereleases: true
36 | - uses: actions/checkout@v5
37 | with:
38 | fetch-depth: 0
39 | - name: Install Dependencies
40 | run: |
41 | uv sync --no-group docs
42 | - name: Execute Tests
43 | run: uv run task tests
44 | - name: Rename file
45 | env:
46 | VER: ${{ matrix.python }}
47 | run: mv ./coverage.xml ./coverage-$VER.xml
48 | - uses: actions/upload-artifact@v4
49 | with:
50 | name: coverage-${{ matrix.python }}.xml
51 | path: coverage-${{ matrix.python }}.xml
52 | - name: Coveralls
53 | uses: coverallsapp/github-action@v2.3.0
54 | with:
55 | coverage-reporter-version: v0.6.14
56 | file: coverage-${{ matrix.python }}.xml
57 | lint:
58 | name: Code Lint
59 | runs-on: ubuntu-latest
60 | strategy:
61 | fail-fast: false
62 | matrix:
63 | python: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"]
64 | steps:
65 | - name: Install uv
66 | uses: astral-sh/setup-uv@v7
67 | with:
68 | version: "latest"
69 | python-version: ${{ matrix.python }}
70 | activate-environment: true
71 | enable-cache: true
72 | - uses: actions/setup-python@v6
73 | with:
74 | python-version: ${{ matrix.python }}
75 | allow-prereleases: true
76 | - uses: actions/checkout@v5
77 | with:
78 | fetch-depth: 0
79 | - name: Install Dependencies
80 | run: |
81 | uv sync --no-group docs
82 | - name: Run ruff
83 | run: uv run task ruff-lint
84 | - name: Run mypy
85 | run: uv run task mypy-lint
86 | sonar:
87 | name: SonarCloud
88 | runs-on: ubuntu-latest
89 | needs: unit-tests
90 | steps:
91 | - uses: actions/checkout@v5
92 | with:
93 | fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis
94 | - uses: actions/download-artifact@v5
95 | with:
96 | name: coverage-3.14.xml
97 | - name: Rename file
98 | run: mv ./coverage{-3.14,}.xml
99 | - name: SonarCloud Scan
100 | uses: SonarSource/sonarqube-scan-action@v7.0.0
101 | env:
102 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
103 | SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
104 | integration-tests:
105 | name: Integration tests
106 | runs-on: ubuntu-latest
107 | strategy:
108 | fail-fast: false
109 | matrix:
110 | python: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"]
111 | services:
112 | redis:
113 | image: redis
114 | options: >-
115 | --health-cmd "redis-cli ping"
116 | --health-interval 10s
117 | --health-timeout 5s
118 | --health-retries 5
119 | ports:
120 | - 6379:6379
121 | steps:
122 | - name: Install uv
123 | uses: astral-sh/setup-uv@v7
124 | with:
125 | version: "latest"
126 | python-version: ${{ matrix.python }}
127 | activate-environment: true
128 | enable-cache: true
129 | - uses: actions/setup-python@v6
130 | with:
131 | python-version: ${{ matrix.python }}
132 | allow-prereleases: true
133 | - uses: actions/checkout@v5
134 | with:
135 | fetch-depth: 0
136 | - name: Install Dependencies
137 | run: |
138 | uv sync --no-group docs
139 | - name: Execute Tests
140 | run: uv run task integration-tests
141 | check:
142 | name: Check CI Green
143 | if: always()
144 | needs:
145 | - sonar
146 | - lint
147 | - unit-tests
148 | - integration-tests
149 | runs-on: ubuntu-latest
150 | steps:
151 | - uses: re-actors/alls-green@v1.2.2
152 | with:
153 | jobs: ${{ toJSON(needs) }}
154 |
--------------------------------------------------------------------------------
/docs/img/logo-icon.svg:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/docs/examples.md:
--------------------------------------------------------------------------------
1 | # Examples
2 |
3 | Here's a small list of possible usage applications.
4 |
5 | Remember to call `init_cachify` first (for the global decorators), or create a local instance with `init_cachify(is_global=False)` when you need an isolated cache.
6 |
7 | ## ///cached/// decorator (global usage)
8 |
9 | ```python
10 | from py_cachify import cached, init_cachify
11 |
12 | # Configure global Cachify instance for top-level decorators
13 | init_cachify()
14 |
15 | @cached(key='example_key', ttl=60)
16 | def expensive_function(x: int) -> int:
17 | print('Executing expensive operation...')
18 | return x ** 2
19 |
20 |
21 | @cached(key='example_async_function-{arg_a}-{arg_b}')
22 | async def async_expensive_function(arg_a: int, arg_b: int) -> int:
23 | print('Executing async expensive operation...')
24 | return arg_a + arg_b
25 |
26 | # Reset the cache for a specific call
27 | expensive_function.reset(10)
28 | ```
29 |
30 | ## ///cached/// with default_cache_ttl
31 |
32 | ```python
33 | from py_cachify import cached, init_cachify
34 |
35 | # Configure a global default TTL of 300 seconds
36 | init_cachify(default_cache_ttl=300)
37 |
38 | # Uses default_cache_ttl=300 because ttl is omitted
39 | @cached(key='profile-{user_id}')
40 | def get_profile(user_id: int) -> dict:
41 | ...
42 |
43 | # Never expires, even though default_cache_ttl is set
44 | @cached(key='feature-flags', ttl=None)
45 | def get_feature_flags() -> dict:
46 | ...
47 | ```
48 |
49 | ## ///cached/// with instance-based usage
50 |
51 | ```python
52 | from py_cachify import init_cachify
53 |
54 | # Create a dedicated instance that does not affect the global client
55 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-', default_cache_ttl=10)
56 |
57 | @local_cachify.cached(key='local-expensive-{x}')
58 | def local_expensive_function(x: int) -> int:
59 | print('Executing local expensive operation...')
60 | return x ** 3
61 | ```
62 |
63 | ## ///cached/// multi-layer usage
64 |
65 | ```python
66 | from py_cachify import cached, init_cachify
67 |
68 | # Global initialization (used by the top-level @cached)
69 | init_cachify(default_cache_ttl=60)
70 |
71 | # Local instance that adds a shorter TTL on top of the global cache
72 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-', default_cache_ttl=5)
73 |
74 | @local_cachify.cached(key='local-expensive-{x}') # outer, short-lived layer
75 | @cached(key='global-expensive-{x}') # inner, longer-lived layer
76 | def expensive(x: int) -> int:
77 | print('Executing expensive operation...')
78 | return x * 10
79 |
80 | # Reset both layers for a given argument
81 | expensive.reset(42)
82 | ```
83 |
84 |
85 | ## ///cached/// decorator with encoder/decoder
86 | ```python
87 | from py_cachify import cached
88 |
89 |
90 | def encoder(val: 'UnpicklableClass') -> dict:
91 | return {'arg1': val.arg1, 'arg2': val.arg2}
92 |
93 |
94 | def decoder(val: dict) -> 'UnpicklableClass':
95 | return UnpicklableClass(**val)
96 |
97 |
98 | @cached(key='create_unpicklable_class-{arg1}-{arg2}', enc_dec=(encoder, decoder))
99 | def create_unpicklable_class(arg1: str, arg2: str) -> 'UnpicklableClass':
100 | return UnpicklableClass(arg1=arg1, arg2=arg2)
101 | ```
102 |
103 | ## ///lock/// as a context manager
104 |
105 | ```python
106 | from py_cachify import init_cachify, lock
107 |
108 | # Ensure global client is initialized for locks
109 | init_cachify()
110 |
111 | # Use it within an asynchronous context
112 | async with lock('resource_key'):
113 | # Your critical section here
114 | print('Critical section code')
115 |
116 |
117 | # Use it within a synchronous context
118 | with lock('resource_key'):
119 | # Your critical section here
120 | print('Critical section code')
121 | ```
122 |
123 | ## ///lock/// as a decorator
124 |
125 | ```python
126 | from py_cachify import init_cachify, lock
127 |
128 | # Initialize once at app startup
129 | init_cachify()
130 |
131 | @lock(key='critical_function_lock-{arg}', nowait=False, timeout=10)
132 | def critical_function(arg: int) -> None:
133 | # critical code
134 | ...
135 | ```
136 |
137 | ## ///once/// decorator
138 |
139 | ```python
140 | from datetime import date
141 | from time import sleep
142 |
143 | from py_cachify import once
144 |
145 |
146 | @once(key='long_running_function')
147 | async def long_running_function() -> str:
148 | # Executing long-running operation...
149 | ...
150 |
151 |
152 | @once(key='create-transactions-{for_date.year}-{for_date.month}-{for_date.day}')
153 | def create_transactions(for_date: date) -> None:
154 | # Creating...
155 | ...
156 |
157 |
158 | @once(key='another_long_running_task', return_on_locked='In progress')
159 | def another_long_running_function() -> str:
160 | sleep(10)
161 | return 'Completed'
162 |
163 |
164 | @once(key='exception_if_more_than_one_is_running', raise_on_locked=True)
165 | def one_more_long_running_function() -> None:
166 | # Executing
167 | ...
168 | ```
169 |
--------------------------------------------------------------------------------
/integration_tests/test_clients_isolation_and_ttl.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from time import sleep
3 |
4 | import pytest
5 | from pytest_mock import MockerFixture
6 |
7 | from py_cachify import cached
8 |
9 |
10 | def _sync_func(x: int) -> int:
11 | return x * 10
12 |
13 |
14 | async def _async_func(x: int) -> int:
15 | return x * 10
16 |
17 |
18 | def test_sync_global_vs_local_redis_isolation(
19 | cachify_local_redis_second,
20 | mocker: MockerFixture,
21 | ) -> None:
22 | spy = mocker.spy(sys.modules[__name__], '_sync_func')
23 |
24 | global_wrapped = cached(key='client_isolation-{x}')(_sync_func)
25 | local_wrapped = cachify_local_redis_second.cached(key='client_isolation-{x}')(_sync_func)
26 |
27 | res1 = global_wrapped(1)
28 | res2 = global_wrapped(1)
29 |
30 | assert res1 == 10
31 | assert res2 == 10
32 | assert spy.call_count == 1
33 |
34 | res3 = local_wrapped(1)
35 | res4 = local_wrapped(1)
36 |
37 | assert res3 == 10
38 | assert res4 == 10
39 | assert spy.call_count == 2
40 |
41 |
42 | @pytest.mark.asyncio
43 | async def test_async_global_vs_local_redis_isolation(
44 | cachify_local_redis_second,
45 | mocker: MockerFixture,
46 | ) -> None:
47 | spy = mocker.spy(sys.modules[__name__], '_async_func')
48 |
49 | global_wrapped = cached(key='client_isolation-async-{x}')(_async_func)
50 | local_wrapped = cachify_local_redis_second.cached(key='client_isolation-async-{x}')(_async_func)
51 |
52 | res1 = await global_wrapped(2)
53 | res2 = await global_wrapped(2)
54 |
55 | assert res1 == 20
56 | assert res2 == 20
57 | assert spy.call_count == 1
58 |
59 | res3 = await local_wrapped(2)
60 | res4 = await local_wrapped(2)
61 |
62 | assert res3 == 20
63 | assert res4 == 20
64 | assert spy.call_count == 2
65 |
66 |
67 | def test_ttl_global_vs_local_redis(
68 | cachify_local_redis_second,
69 | mocker: MockerFixture,
70 | ) -> None:
71 | spy = mocker.spy(sys.modules[__name__], '_sync_func')
72 |
73 | global_wrapped = cached(key='ttl-test-{x}', ttl=1)(_sync_func)
74 | local_wrapped = cachify_local_redis_second.cached(key='ttl-test-{x}', ttl=5)(_sync_func)
75 |
76 | res1 = global_wrapped(3)
77 | res2 = local_wrapped(3)
78 |
79 | assert res1 == 30
80 | assert res2 == 30
81 | assert spy.call_count == 2
82 |
83 | sleep(2)
84 |
85 | res3 = global_wrapped(3)
86 | res4 = local_wrapped(3)
87 |
88 | assert res3 == 30
89 | assert res4 == 30
90 | assert spy.call_count == 3
91 |
92 |
93 | def test_sync_multilayer_global_redis_inner_local_inmemory_outer_ttl(
94 | cachify_local_in_memory_client,
95 | mocker: MockerFixture,
96 | ) -> None:
97 | spy = mocker.spy(sys.modules[__name__], '_sync_func')
98 |
99 | inner = cached(key='multilayer-{x}', ttl=10)(_sync_func)
100 | outer = cachify_local_in_memory_client.cached(key='multilayer-{x}', ttl=1)(inner)
101 |
102 | res1 = outer(4)
103 | assert res1 == 40
104 | assert spy.call_count == 1
105 |
106 | res2 = outer(4)
107 | assert res2 == 40
108 | assert spy.call_count == 1
109 |
110 | sleep(2)
111 |
112 | res3 = outer(4)
113 | assert res3 == 40
114 | assert spy.call_count == 1
115 |
116 | sleep(9)
117 |
118 | res4 = outer(4)
119 | assert res4 == 40
120 | assert spy.call_count == 2
121 |
122 |
123 | @pytest.mark.asyncio
124 | async def test_async_multilayer_global_redis_inner_local_inmemory_outer_ttl(
125 | cachify_local_in_memory_client,
126 | mocker: MockerFixture,
127 | ) -> None:
128 | spy = mocker.spy(sys.modules[__name__], '_async_func')
129 |
130 | inner = cached(key='multilayer-async-{x}', ttl=10)(_async_func)
131 | outer = cachify_local_in_memory_client.cached(key='multilayer-async-{x}', ttl=1)(inner)
132 |
133 | res1 = await outer(5)
134 | assert res1 == 50
135 | assert spy.call_count == 1
136 |
137 | res2 = await outer(5)
138 | assert res2 == 50
139 | assert spy.call_count == 1
140 |
141 | sleep(2)
142 |
143 | res3 = await outer(5)
144 | assert res3 == 50
145 | assert spy.call_count == 1
146 |
147 | sleep(9)
148 |
149 | res4 = await outer(5)
150 | assert res4 == 50
151 | assert spy.call_count == 2
152 |
153 |
154 | def test_ttl_global_vs_local_inmemory(
155 | cachify_local_in_memory_client,
156 | mocker: MockerFixture,
157 | ) -> None:
158 | spy = mocker.spy(sys.modules[__name__], '_sync_func')
159 |
160 | global_wrapped = cached(key='ttl-test-mem-{x}', ttl=1)(_sync_func)
161 | local_wrapped = cachify_local_in_memory_client.cached(key='ttl-test-mem-{x}', ttl=5)(_sync_func)
162 |
163 | res1 = global_wrapped(6)
164 | res2 = local_wrapped(6)
165 |
166 | assert res1 == 60
167 | assert res2 == 60
168 | assert spy.call_count == 2
169 |
170 | sleep(2)
171 |
172 | res3 = global_wrapped(6)
173 | res4 = local_wrapped(6)
174 |
175 | assert res3 == 60
176 | assert res4 == 60
177 | assert spy.call_count == 3
178 |
--------------------------------------------------------------------------------
/py_cachify/_backend/_helpers.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import inspect
3 | from collections.abc import Awaitable
4 | from typing import Any, Callable, TypeVar, Union
5 |
6 | from typing_extensions import ParamSpec, TypeIs
7 |
8 | from ._constants import OperationPostfix
9 | from ._lib import CachifyClient
10 | from ._logger import logger
11 | from ._types._common import Decoder, Encoder
12 |
13 |
14 | _R = TypeVar('_R', covariant=True)
15 | _P = ParamSpec('_P')
16 | _S = TypeVar('_S')
17 |
18 |
19 | def _call_original(
20 | _pyc_original_func: Union[Callable[..., Any], None], _pyc_method_name: str, *args: Any, **kwargs: Any
21 | ) -> Any:
22 | if not _pyc_original_func:
23 | return
24 |
25 | orig_method = getattr(_pyc_original_func, _pyc_method_name, None)
26 | if not orig_method or not callable(orig_method):
27 | return
28 |
29 | try:
30 | return orig_method(*args, **kwargs)
31 | except Exception as e:
32 | logger.debug(f'Error calling original {_pyc_method_name}: {e}')
33 |
34 | return None
35 |
36 |
37 | async def _acall_original(
38 | _pyc_original_func: Union[Callable[..., Awaitable[Any]], None], _pyc_method_name: str, *args: Any, **kwargs: Any
39 | ) -> Any:
40 | if not _pyc_original_func:
41 | return
42 |
43 | orig_method = getattr(_pyc_original_func, _pyc_method_name, None)
44 | if not orig_method or not is_coroutine(orig_method):
45 | return
46 |
47 | try:
48 | return await orig_method(*args, **kwargs)
49 | except Exception as e:
50 | logger.debug(f'Error calling original {_pyc_method_name}: {e}')
51 |
52 | return None
53 |
54 |
55 | def get_full_key_from_signature(
56 | bound_args: inspect.BoundArguments,
57 | key: str,
58 | operation_postfix: OperationPostfix,
59 | ) -> str:
60 | bound_args.apply_defaults()
61 | _args_repr = f'{bound_args}'
62 |
63 | args_dict = bound_args.arguments
64 | args: tuple[Any, ...] = args_dict.pop('args', ())
65 | kwargs: dict[str, Any] = args_dict.pop('kwargs', {})
66 | kwargs.update(args_dict)
67 |
68 | try:
69 | return f'{key.format(*args, **kwargs)}-{operation_postfix}'
70 | except (IndexError, KeyError):
71 | raise ValueError(f'Arguments in a key({key}) do not match function signature params({_args_repr})') from None
72 |
73 |
74 | def is_coroutine(
75 | func: Union[Callable[_P, Awaitable[_R]], Callable[_P, _R]],
76 | ) -> TypeIs[Callable[_P, Awaitable[_R]]]:
77 | return asyncio.iscoroutinefunction(func)
78 |
79 |
80 | def encode_decode_value(encoder_decoder: Union[Encoder, Decoder, None], val: Any) -> Any:
81 | if not encoder_decoder:
82 | return val
83 |
84 | return encoder_decoder(val)
85 |
86 |
87 | def reset(
88 | *args: Any,
89 | _pyc_key: str,
90 | _pyc_signature: inspect.Signature,
91 | _pyc_operation_postfix: OperationPostfix,
92 | _pyc_original_func: Union[Callable[..., Any], None],
93 | _pyc_client_provider: Callable[..., CachifyClient],
94 | **kwargs: Any,
95 | ) -> None:
96 | client = _pyc_client_provider()
97 | _key = get_full_key_from_signature(
98 | bound_args=_pyc_signature.bind(*args, **kwargs),
99 | key=_pyc_key,
100 | operation_postfix=_pyc_operation_postfix,
101 | )
102 |
103 | client.delete(key=_key)
104 |
105 | _call_original(_pyc_original_func, 'reset', *args, **kwargs)
106 |
107 |
108 | async def a_reset(
109 | *args: Any,
110 | _pyc_key: str,
111 | _pyc_signature: inspect.Signature,
112 | _pyc_operation_postfix: OperationPostfix,
113 | _pyc_original_func: Union[Callable[..., Awaitable[Any]], None],
114 | _pyc_client_provider: Callable[..., CachifyClient],
115 | **kwargs: Any,
116 | ) -> None:
117 | client = _pyc_client_provider()
118 | _key = get_full_key_from_signature(
119 | bound_args=_pyc_signature.bind(*args, **kwargs), key=_pyc_key, operation_postfix=_pyc_operation_postfix
120 | )
121 |
122 | await client.a_delete(key=_key)
123 |
124 | await _acall_original(_pyc_original_func, 'reset', *args, **kwargs)
125 |
126 |
127 | async def is_alocked(
128 | *args: Any,
129 | _pyc_key: str,
130 | _pyc_signature: inspect.Signature,
131 | _pyc_operation_postfix: OperationPostfix,
132 | _pyc_original_func: Union[Callable[..., Awaitable[Any]], None],
133 | _pyc_client_provider: Callable[..., CachifyClient],
134 | **kwargs: Any,
135 | ) -> bool:
136 | client = _pyc_client_provider()
137 | _key = get_full_key_from_signature(
138 | bound_args=_pyc_signature.bind(*args, **kwargs), key=_pyc_key, operation_postfix=_pyc_operation_postfix
139 | )
140 |
141 | if bool(await client.a_get(key=_key)):
142 | return True
143 |
144 | return await _acall_original(_pyc_original_func, 'is_locked', *args, **kwargs) or False
145 |
146 |
147 | def is_locked(
148 | *args: Any,
149 | _pyc_key: str,
150 | _pyc_signature: inspect.Signature,
151 | _pyc_operation_postfix: OperationPostfix,
152 | _pyc_original_func: Union[Callable[..., Any], None],
153 | _pyc_client_provider: Callable[..., CachifyClient],
154 | **kwargs: Any,
155 | ) -> bool:
156 | client = _pyc_client_provider()
157 | _key = get_full_key_from_signature(
158 | bound_args=_pyc_signature.bind(*args, **kwargs), key=_pyc_key, operation_postfix=_pyc_operation_postfix
159 | )
160 |
161 | if bool(client.get(key=_key)):
162 | return True
163 |
164 | return _call_original(_pyc_original_func, 'is_locked', *args, **kwargs) or False
165 |
--------------------------------------------------------------------------------
/tests/test_locks.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | from asyncio import sleep as asleep
3 | from contextlib import nullcontext
4 | from threading import Thread
5 | from time import sleep
6 |
7 | import pytest
8 |
9 | from py_cachify import CachifyLockError, init_cachify, lock
10 | from py_cachify._backend._lib import CachifyClient
11 | from py_cachify._backend._types._common import UNSET
12 |
13 |
14 | lock_obj = lock(key='test')
15 |
16 |
17 | @pytest.mark.asyncio
18 | async def test_async_lock(init_cachify_fixture):
19 | async def async_operation():
20 | async with lock('lock'):
21 | return None
22 |
23 | await async_operation()
24 |
25 |
26 | @pytest.mark.asyncio
27 | async def test_async_lock_already_locked(init_cachify_fixture):
28 | key = 'lock'
29 |
30 | async def async_operation():
31 | async with lock(key):
32 | async with lock(key):
33 | pass
34 |
35 | with pytest.raises(CachifyLockError, match=f'{key} is already locked!'):
36 | await async_operation()
37 |
38 |
39 | def test_lock(init_cachify_fixture):
40 | def sync_operation():
41 | with lock('lock'):
42 | pass
43 |
44 | sync_operation()
45 |
46 |
47 | def test_lock_already_locked(init_cachify_fixture):
48 | key = 'lock'
49 |
50 | def sync_operation():
51 | with lock(key):
52 | with lock(key):
53 | pass
54 |
55 | with pytest.raises(CachifyLockError, match=f'{key} is already locked!'):
56 | sync_operation()
57 |
58 |
59 | @pytest.mark.parametrize(
60 | 'exp,timeout,expectation', [(1, 2, nullcontext(None)), (2, 1, pytest.raises(CachifyLockError))]
61 | )
62 | def test_waiting_lock(init_cachify_fixture, exp, timeout, expectation):
63 | key = 'lock'
64 |
65 | def sync_operation():
66 | with lock(key=key, exp=exp):
67 | with lock(key=key, nowait=False, timeout=timeout):
68 | return None
69 |
70 | with expectation as e:
71 | assert sync_operation() == e
72 |
73 |
74 | @pytest.mark.asyncio
75 | @pytest.mark.parametrize(
76 | 'exp,timeout,expectation', [(1, 2, nullcontext(None)), (2, 1, pytest.raises(CachifyLockError))]
77 | )
78 | async def test_waiting_lock_async(init_cachify_fixture, exp, timeout, expectation):
79 | key = 'lock'
80 |
81 | async def async_operation():
82 | async with lock(key=key, exp=exp):
83 | async with lock(key=key, nowait=False, timeout=timeout):
84 | return None
85 |
86 | with expectation as e:
87 | assert await async_operation() == e
88 |
89 |
90 | def test_lock_cachify_returns_cachify_instance(init_cachify_fixture):
91 | assert isinstance(lock_obj._cachify, CachifyClient)
92 | assert lock_obj._cachify is not None
93 |
94 |
95 | def test_lock_recreate_cm_returns_self():
96 | assert lock_obj._recreate_cm() is lock_obj
97 |
98 |
99 | @pytest.mark.parametrize('timeout,expected', [(None, float('inf')), (10, 20.0)])
100 | def test_lock_calc_stop_at(mocker, timeout, expected):
101 | new_lock = lock('test', timeout=timeout)
102 | mocker.patch('time.time', return_value=10.0)
103 |
104 | assert new_lock._calc_stop_at() == expected
105 |
106 |
107 | @pytest.mark.parametrize(
108 | 'default_expiration,exp,expected',
109 | [
110 | (None, UNSET, 30),
111 | (60, UNSET, 60),
112 | (30, 60, 60),
113 | (30, None, None),
114 | ],
115 | )
116 | def test_lock_get_ttl(init_cachify_fixture, default_expiration, exp, expected):
117 | init_dict = {'default_lock_expiration': default_expiration} if default_expiration is not None else {}
118 |
119 | init_cachify(**init_dict)
120 |
121 | lock_obj = lock('test', exp=exp)
122 |
123 | assert lock_obj._get_ttl() == expected
124 |
125 |
126 | @pytest.mark.parametrize(
127 | 'is_already_locked,key,do_raise,expectation',
128 | [
129 | (True, 'test', False, nullcontext(None)),
130 | (True, 'test', True, pytest.raises(CachifyLockError)),
131 | (False, 'test', True, nullcontext(None)),
132 | ],
133 | )
134 | def test_lock_raise_if_cached(mocker, is_already_locked, key, do_raise, expectation):
135 | patch_log = mocker.patch('py_cachify._backend._logger.logger.debug')
136 |
137 | with expectation:
138 | lock._raise_if_cached(
139 | is_already_cached=is_already_locked,
140 | key=key,
141 | do_raise=do_raise,
142 | )
143 | if is_already_locked is True:
144 | patch_log.assert_called_once_with(f'{key} is already locked!')
145 |
146 |
147 | def test_unset_type_bool():
148 | assert bool(UNSET) is False
149 |
150 |
151 | @pytest.mark.parametrize(
152 | 'sleep_time,expected',
153 | [
154 | (1, True),
155 | (0, False),
156 | ],
157 | )
158 | def test_is_locked_on_lock_obj(init_cachify_fixture, sleep_time, expected):
159 | test_lock = lock('test')
160 |
161 | def sync_function():
162 | with test_lock:
163 | sleep(sleep_time)
164 |
165 | thread = Thread(target=sync_function)
166 | thread.start()
167 | sleep(0.3)
168 |
169 | assert test_lock.is_locked() is expected
170 |
171 |
172 | @pytest.mark.parametrize(
173 | 'sleep_time,expected',
174 | [
175 | (1, True),
176 | (0, False),
177 | ],
178 | )
179 | async def test_is_locked_on_lock_obj_async(init_cachify_fixture, sleep_time, expected):
180 | test_lock = lock('test')
181 |
182 | async def async_function():
183 | async with test_lock:
184 | await asleep(sleep_time)
185 |
186 | task = asyncio.create_task(async_function())
187 |
188 | await asleep(0.2)
189 |
190 | assert await test_lock.is_alocked() is expected
191 |
192 | await task
193 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [project]
2 | name = "py-cachify"
3 | description = "Distributed caching and locking at hand"
4 | version = "3.0.1"
5 | authors = [{ name = "Galtozzy", email = "galtozzy+git@gmail.com" }]
6 | requires-python = ">=3.9,<4.0"
7 | readme = "README.md"
8 | license = "MIT"
9 | classifiers = [
10 | "Intended Audience :: Information Technology",
11 | "Intended Audience :: System Administrators",
12 | "Intended Audience :: Developers",
13 | "Operating System :: OS Independent",
14 | "Programming Language :: Python :: 3",
15 | "Programming Language :: Python",
16 | "Topic :: Software Development :: Libraries :: Python Modules",
17 | "Topic :: Software Development :: Libraries",
18 | "Topic :: Software Development",
19 | "Typing :: Typed",
20 | "Development Status :: 5 - Production/Stable",
21 | "Programming Language :: Python :: 3 :: Only",
22 | "Programming Language :: Python :: 3.9",
23 | "Programming Language :: Python :: 3.10",
24 | "Programming Language :: Python :: 3.11",
25 | "Programming Language :: Python :: 3.12",
26 | "Programming Language :: Python :: 3.13",
27 | "Programming Language :: Python :: 3.14",
28 | "License :: OSI Approved :: MIT License",
29 | ]
30 | dependencies = [
31 | "typing-extensions>=4.15.0",
32 | ]
33 |
34 | [project.urls]
35 | Homepage = "https://github.com/EzyGang/py-cachify"
36 | Repository = "https://github.com/EzyGang/py-cachify"
37 |
38 | [build-system]
39 | requires = ["hatchling"]
40 | build-backend = "hatchling.build"
41 |
42 | [dependency-groups]
43 | dev = [
44 | "pytest>=8.3.2,<9",
45 | "pytest-mock>=3.14.0,<4",
46 | "pytest-cov>=7.0.0,<8",
47 | "taskipy>=1.13.0,<2",
48 | "mypy>=1.11.1,<2",
49 | "coverage>=7.6.1,<8",
50 | "pytest-asyncio>=1.0.0,<2",
51 | "pytest-socket>=0.7.0,<0.8",
52 | "basedpyright>=1.35.0,<2",
53 | "ruff>=0.9.10",
54 | ]
55 | docs = [
56 | "mkdocs>=1.6.0,<2",
57 | "mkdocs-material>=9.5.28,<10",
58 | "pygments>=2.18.0,<3",
59 | "termynal>=0.13,<0.14 ; python_version >= '3.8.1' and python_version < '4.0'",
60 | "beautifulsoup4>=4.12.3,<5",
61 | "mkdocs-include-markdown-plugin>=7.2.0",
62 | ]
63 | integration = ["redis>=5.0.7,<6"]
64 |
65 | [tool.uv]
66 | default-groups = [
67 | "dev",
68 | "docs",
69 | "integration",
70 | ]
71 |
72 | [tool.hatch.build.targets.sdist]
73 | include = [
74 | "py_cachify",
75 | "py_cachify/py.typed",
76 | "README.md",
77 | ]
78 |
79 | [tool.hatch.build.targets.wheel]
80 | include = [
81 | "py_cachify",
82 | "py_cachify/py.typed",
83 | "README.md",
84 | ]
85 |
86 | [tool.taskipy.tasks]
87 | format-and-lint = "task ruff && task mypy-lint"
88 |
89 | ruff = "ruff format ./py_cachify/ ./tests/ ./integration_tests/ ./docs/scripts/ && ruff check ./py_cachify/ ./tests/ ./integration_tests/ ./docs/scripts/ --fix --unsafe-fixes"
90 | tests = "PYTHONPATH=. pytest tests/ -vvv"
91 | integration-tests = "PYTHONPATH=. pytest integration_tests/ --no-cov"
92 |
93 | docs-dev = "mkdocs serve"
94 | ruff-lint = "ruff check ./py_cachify"
95 | mypy-lint = "mypy --install-types --non-interactive ./py_cachify/"
96 |
97 | [tool.ruff]
98 | line-length = 120
99 | extend-exclude = ["site-packages", "*.pyi"]
100 | target-version = 'py39'
101 |
102 | [tool.ruff.lint]
103 | fixable = ["ALL"]
104 | unfixable = []
105 | select = [
106 | "E", # pycodestyle errors
107 | "W", # pycodestyle warnings
108 | "F", # pyflakes
109 | "I", # isort
110 | "C", # flake8-comprehensions
111 | "B", # flake8-bugbear
112 | "UP", # pyupgrade
113 | ]
114 | ignore = [
115 | "B008", # do not perform function calls in argument defaults
116 | "C901", # too complex
117 | "B010", # do not rewrite setattr()
118 | ]
119 |
120 | [tool.ruff.format]
121 | quote-style = 'single'
122 | indent-style = 'space'
123 | line-ending = 'auto'
124 | skip-magic-trailing-comma = false
125 | exclude = ['*.pyi']
126 |
127 | [tool.ruff.lint.mccabe]
128 | max-complexity = 6
129 |
130 | [tool.ruff.lint.isort]
131 | split-on-trailing-comma = false
132 | lines-after-imports = 2
133 | known-first-party = ["py_cachify"]
134 |
135 | [tool.ruff.lint.pyupgrade]
136 | keep-runtime-typing = true
137 |
138 | [tool.mypy]
139 | # Mypy configuration:
140 | # https://mypy.readthedocs.io/en/latest/config_file.html
141 | strict = true
142 | pretty = true
143 |
144 | exclude = ['test_']
145 |
146 | [[tool.mypy.overrides]]
147 | module = "tests.*"
148 | ignore_errors = true
149 |
150 | [tool.pytest.ini_options]
151 | minversion = "6.0"
152 | addopts = [
153 | '--strict-markers',
154 | '--strict-config',
155 | '--allow-unix-socket',
156 | '--allow-hosts=127.0.0.1,127.0.1.1,::1',
157 | '--tb=short',
158 | '--cov=py_cachify/',
159 | '--cov-branch',
160 | '--cov-report=term-missing:skip-covered',
161 | '--cov-report=xml',
162 | '--cov-fail-under=100',
163 | ]
164 | asyncio_mode = "auto"
165 | python_files = "test*.py"
166 | # Directories that are not visited by pytest collector:
167 | norecursedirs = "*.egg .eggs dist build docs .tox .git __pycache__ config docker etc"
168 | testpaths = ["tests"]
169 |
170 | [tool.coverage.run]
171 | # Coverage configuration:
172 | # https://coverage.readthedocs.io/en/latest/config.html
173 | omit = []
174 | concurrency = ['thread']
175 |
176 | [tool.coverage.report]
177 | omit = []
178 | exclude_lines = [
179 | 'pragma: no cover',
180 | '@overload',
181 | 'SyncOrAsync',
182 | '@abstract',
183 | 'def __repr__',
184 | 'raise AssertionError',
185 | 'raise NotImplementedError',
186 | 'if __name__ == .__main__.:',
187 | '__all__',
188 | 'if TYPE_CHECKING:',
189 | 'except ModuleNotFoundError:',
190 | ]
191 |
192 |
193 | [tool.pyright]
194 | include = ["py_cachify"]
195 | exclude = ["tests"]
196 | reportAny = false
197 | typeCheckingMode = "strict"
198 | pythonVersion = "3.9"
199 | executionEnvironments = [
200 | { root = "py_cachify" },
201 | { root = "tests" }
202 | ]
203 |
--------------------------------------------------------------------------------
/tests/test_helpers.py:
--------------------------------------------------------------------------------
1 | import inspect
2 | import re
3 |
4 | import pytest
5 | from pytest_mock import MockerFixture
6 |
7 | from py_cachify._backend._helpers import (
8 | _acall_original,
9 | _call_original,
10 | a_reset,
11 | get_full_key_from_signature,
12 | is_alocked,
13 | is_locked,
14 | reset,
15 | )
16 | from py_cachify._backend._lib import get_cachify_client
17 |
18 |
19 | def method_with_args_kwargs_args(*args, **kwargs) -> None:
20 | pass
21 |
22 |
23 | @pytest.fixture
24 | def args_kwargs_signature():
25 | return inspect.signature(method_with_args_kwargs_args)
26 |
27 |
28 | def test_get_full_key_valid_arguments(args_kwargs_signature):
29 | bound_args = args_kwargs_signature.bind('value1', 'value2', arg3='value3')
30 | result = get_full_key_from_signature(bound_args, 'key_{}_{}_{arg3}', operation_postfix='cached')
31 | assert result == 'key_value1_value2_value3-cached'
32 |
33 |
34 | def test_get_full_key_invalid_key_format(args_kwargs_signature):
35 | bound_args = args_kwargs_signature.bind('value1', 'value2')
36 | bound_args.apply_defaults()
37 |
38 | with pytest.raises(
39 | ValueError,
40 | match=re.escape(f'Arguments in a key(key_{{}}_{{}}_{{}}) do not match function signature params({bound_args})'),
41 | ):
42 | get_full_key_from_signature(bound_args, 'key_{}_{}_{}', operation_postfix='cached')
43 |
44 |
45 | def test_get_full_key_empty_key_and_arguments(args_kwargs_signature):
46 | bound_args = args_kwargs_signature.bind()
47 | result = get_full_key_from_signature(bound_args, 'key_with_no_args', operation_postfix='cached')
48 | assert result == 'key_with_no_args-cached'
49 |
50 |
51 | def test_get_full_key_mixed_placeholders(args_kwargs_signature):
52 | bound_args = args_kwargs_signature.bind('value1', 'value2', arg3='value3')
53 | bound_args.apply_defaults()
54 |
55 | with pytest.raises(
56 | ValueError,
57 | match=re.escape(
58 | 'Arguments in a key(key_{}_{}_{}_{invalid_arg}) ' + f'do not match function signature params({bound_args})'
59 | ),
60 | ):
61 | _ = get_full_key_from_signature(bound_args, 'key_{}_{}_{}_{invalid_arg}', operation_postfix='cached')
62 |
63 |
64 | def test_reset_calls_delete_with_key(init_cachify_fixture, args_kwargs_signature, mocker: MockerFixture):
65 | mock = mocker.patch('py_cachify._backend._lib.CachifyClient.delete')
66 |
67 | reset(
68 | 'val1',
69 | 'val2',
70 | arg3='val3',
71 | _pyc_key='key_{}_{}_{arg3}',
72 | _pyc_signature=args_kwargs_signature,
73 | _pyc_operation_postfix='cached',
74 | _pyc_original_func=None,
75 | _pyc_client_provider=get_cachify_client,
76 | )
77 |
78 | mock.assert_called_once_with(key='key_val1_val2_val3-cached')
79 |
80 |
81 | @pytest.mark.asyncio
82 | async def test_a_reset_calls_delete_with_key(init_cachify_fixture, args_kwargs_signature, mocker: MockerFixture):
83 | mock = mocker.patch('py_cachify._backend._lib.CachifyClient.a_delete')
84 |
85 | await a_reset(
86 | 'val1',
87 | 'val2',
88 | arg3='val3',
89 | _pyc_key='key_{}_{}_{arg3}',
90 | _pyc_signature=args_kwargs_signature,
91 | _pyc_operation_postfix='cached',
92 | _pyc_original_func=None,
93 | _pyc_client_provider=get_cachify_client,
94 | )
95 |
96 | mock.assert_called_once_with(key='key_val1_val2_val3-cached')
97 |
98 |
99 | @pytest.mark.asyncio
100 | @pytest.mark.parametrize('val', [0, 1])
101 | async def test_is_alocked_accesses_a_get_with_key(
102 | init_cachify_fixture, args_kwargs_signature, mocker: MockerFixture, val
103 | ):
104 | mock = mocker.patch('py_cachify._backend._lib.CachifyClient.a_get', return_value=val)
105 |
106 | res = await is_alocked(
107 | 'val1',
108 | 'val2',
109 | arg3='val3',
110 | _pyc_key='key_{}_{}_{arg3}',
111 | _pyc_signature=args_kwargs_signature,
112 | _pyc_operation_postfix='cached',
113 | _pyc_original_func=None,
114 | _pyc_client_provider=get_cachify_client,
115 | )
116 |
117 | mock.assert_called_once_with(key='key_val1_val2_val3-cached')
118 | assert res is bool(val)
119 |
120 |
121 | def test_call_original_logs_debug_on_exception(mocker: MockerFixture):
122 | class Obj:
123 | def reset(self, *args, **kwargs):
124 | raise ValueError('boom')
125 |
126 | obj = Obj()
127 | log_mock = mocker.patch('py_cachify._backend._helpers.logger')
128 |
129 | result = _call_original(obj, 'reset', 1, kw=2)
130 |
131 | assert result is None
132 | log_mock.debug.assert_called_once()
133 | assert 'Error calling original reset' in log_mock.debug.call_args.args[0]
134 |
135 |
136 | @pytest.mark.asyncio
137 | async def test_acall_original_logs_debug_on_exception(mocker: MockerFixture):
138 | class Obj:
139 | async def reset(self, *args, **kwargs):
140 | raise ValueError('boom')
141 |
142 | obj = Obj()
143 | log_mock = mocker.patch('py_cachify._backend._helpers.logger')
144 |
145 | result = await _acall_original(obj, 'reset', 1, kw=2)
146 |
147 | assert result is None
148 | log_mock.debug.assert_called_once()
149 | assert 'Error calling original reset' in log_mock.debug.call_args.args[0]
150 |
151 |
152 | @pytest.mark.parametrize('val', [0, 1])
153 | def test_is_locked_accesses_get_with_key(init_cachify_fixture, args_kwargs_signature, mocker: MockerFixture, val):
154 | mock = mocker.patch('py_cachify._backend._lib.CachifyClient.get', return_value=val)
155 |
156 | res = is_locked(
157 | 'val1',
158 | 'val2',
159 | arg3='val3',
160 | _pyc_key='key_{}_{}_{arg3}',
161 | _pyc_signature=args_kwargs_signature,
162 | _pyc_operation_postfix='cached',
163 | _pyc_original_func=None,
164 | _pyc_client_provider=get_cachify_client,
165 | )
166 |
167 | mock.assert_called_once_with(key='key_val1_val2_val3-cached')
168 | assert res is bool(val)
169 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea
2 | .ipynb_checkpoints
3 | .mypy_cache
4 | .vscode
5 | __pycache__
6 | .pytest_cache
7 | htmlcov
8 | dist
9 | site
10 | .coverage
11 | coverage.json
12 | coverage.xml
13 | .netlify
14 | test.db
15 | log.txt
16 | Pipfile.lock
17 | env3.*
18 | env
19 | docs_build
20 | site_build
21 | venv
22 | docs.zip
23 | archive.zip
24 | .zed
25 | .zed/*
26 | # vim temporary files
27 | *~
28 | .*.sw?
29 | .cache
30 |
31 | # macOS
32 | .DS_Store
33 |
34 |
35 | ### Python template
36 | # Byte-compiled / optimized / DLL files
37 | __pycache__/
38 | *.py[cod]
39 | *$py.class
40 |
41 | # C extensions
42 | *.so
43 |
44 | # Distribution / packaging
45 |
46 | .Python
47 | develop-eggs/
48 | dist/
49 | downloads/
50 | eggs/
51 | .eggs/
52 | lib64/
53 | parts/
54 | sdist/
55 | var/
56 | wheels/
57 | *.egg-info/
58 | .installed.cfg
59 | *.egg
60 | pyrightconfig.json
61 | .zed
62 |
63 | # PyInstaller
64 | # Usually these files are written by a python script from a template
65 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
66 | *.manifest
67 | *.spec
68 |
69 | # Installer logs
70 | pip-log.txt
71 | pip-delete-this-directory.txt
72 |
73 | # Unit test / coverage reports
74 | htmlcov/
75 | test-reports/
76 | .tox/
77 | .coverage.*
78 | nosetests.xml
79 | *.cover
80 | .hypothesis/
81 |
82 | # Translations
83 | *.mo
84 | *.pot
85 |
86 | # Django stuff:
87 | staticfiles/
88 |
89 | # Sphinx documentation
90 | docs/_build/
91 |
92 | # PyBuilder
93 | target/
94 |
95 | # pyenv
96 | .python-version
97 |
98 | # celery beat schedule file
99 | celerybeat-schedule
100 |
101 | # Environments
102 | .venv
103 | venv/
104 | ENV/
105 |
106 | # Rope project settings
107 | .ropeproject
108 |
109 | # mkdocs documentation
110 | /site
111 |
112 | # mypy
113 | .mypy_cache/
114 |
115 | # ruff
116 | .ruff_cache
117 |
118 | ### Node template
119 | # Logs
120 | logs
121 | *.log
122 | npm-debug.log*
123 | yarn-debug.log*
124 | yarn-error.log*
125 |
126 | # Runtime data
127 | pids
128 | *.pid
129 | *.seed
130 | *.pid.lock
131 |
132 | # Directory for instrumented libs generated by jscoverage/JSCover
133 | lib-cov
134 |
135 | # Coverage directory used by tools like istanbul
136 | coverage
137 |
138 | # nyc test coverage
139 | .nyc_output
140 |
141 | # Bower dependency directory (https://bower.io/)
142 | bower_components
143 |
144 | # node-waf configuration
145 | .lock-wscript
146 |
147 | # Compiled binary addons (http://nodejs.org/api/addons.html)
148 | build/Release
149 |
150 | # Dependency directories
151 | node_modules/
152 | jspm_packages/
153 |
154 | # Typescript v1 declaration files
155 | typings/
156 |
157 | # Optional npm cache directory
158 | .npm
159 |
160 | # Optional eslint cache
161 | .eslintcache
162 |
163 | # Optional REPL history
164 | .node_repl_history
165 |
166 | # Output of 'npm pack'
167 | *.tgz
168 |
169 | # Yarn Integrity file
170 | .yarn-integrity
171 |
172 |
173 | ### Linux template
174 | # temporary files which can be created if a process still has a handle open of a deleted file
175 | .fuse_hidden*
176 |
177 | # KDE directory preferences
178 | .directory
179 |
180 | # Linux trash folder which might appear on any partition or disk
181 | .Trash-*
182 |
183 | # .nfs files are created when an open file is removed but is still being accessed
184 | .nfs*
185 |
186 |
187 | ### VisualStudioCode template
188 | .vscode/*
189 |
190 | .idea/*
191 |
192 | # CMake
193 | cmake-build-debug/
194 |
195 | ## File-based project format:
196 | *.iws
197 |
198 | ## Plugin-specific files:
199 |
200 | # IntelliJ
201 | out/
202 |
203 | # mpeltonen/sbt-idea plugin
204 | .idea_modules/
205 |
206 | # JIRA plugin
207 | atlassian-ide-plugin.xml
208 |
209 | # Crashlytics plugin (for Android Studio and IntelliJ)
210 | com_crashlytics_export_strings.xml
211 | crashlytics.properties
212 | crashlytics-build.properties
213 | fabric.properties
214 |
215 | ### SublimeText template
216 | # Cache files for Sublime Text
217 | *.tmlanguage.cache
218 | *.tmPreferences.cache
219 | *.stTheme.cache
220 |
221 | # Workspace files are user-specific
222 | *.sublime-workspace
223 |
224 | # Project files should be checked into the repository, unless a significant
225 | # proportion of contributors will probably not be using Sublime Text
226 | # *.sublime-project
227 |
228 | # SFTP configuration file
229 | sftp-config.json
230 |
231 | # Package control specific files
232 | Package Control.last-run
233 | Package Control.ca-list
234 | Package Control.ca-bundle
235 | Package Control.system-ca-bundle
236 | Package Control.cache/
237 | Package Control.ca-certs/
238 | Package Control.merged-ca-bundle
239 | Package Control.user-ca-bundle
240 | oscrypto-ca-bundle.crt
241 | bh_unicode_properties.cache
242 |
243 | # Sublime-github package stores a github token in this file
244 | # https://packagecontrol.io/packages/sublime-github
245 | GitHub.sublime-settings
246 |
247 | ### Emacs template
248 | .\#*
249 |
250 | ### Vim template
251 | # Swap
252 | [._]*.s[a-v][a-z]
253 | [._]*.sw[a-p]
254 | [._]s[a-v][a-z]
255 | [._]sw[a-p]
256 |
257 | # Session
258 | Session.vim
259 |
260 | # Temporary
261 | .netrwhist
262 |
263 | # Auto-generated tag files
264 | tags
265 |
266 | .pytest_cache/
267 | .env
268 | src/settings/.env
269 | .envs/*
270 | !.envs/.local/
271 |
272 | backups/
273 | *.sql
274 | *.sql.gz
275 | **/*.sqlite3
276 |
277 | ### Jupyter Notebook
278 |
279 | **/.ipynb_checkpoints
280 | jupyter/
281 |
282 | ### Rest Idea Envs
283 | *.env.json
284 |
285 | etc/
286 | docker-compose.local.*
287 |
288 | *.iml
289 |
290 |
291 | *htmlcov*
292 |
293 | *.pyc
294 | db.sqlite3
295 | media
296 |
297 | # Backup files
298 | *.bak
299 |
300 | # User-specific stuff
301 | .idea/
302 |
303 | # Distribution / packaging
304 | .Python build/
305 |
306 | # celery
307 | celerybeat-schedule.*
308 |
309 | # SageMath parsed files
310 | *.sage.py
311 |
312 | env.bak/
313 | venv.bak/
314 |
315 | *.sublime-project
316 |
317 | # Package control specific files Package
318 | Control.last-run
319 | Control.ca-list
320 | Control.ca-bundle
321 | Control.system-ca-bundle
322 |
323 | # Visual Studio Code #
324 | .vscode/
325 | .history
326 |
--------------------------------------------------------------------------------
/docs/reference/once.md:
--------------------------------------------------------------------------------
1 | # API Reference for ///@once()/// Decorator
2 |
3 | ## Overview
4 |
5 | The `once` decorator ensures that a decorated function can only be called once at a time based on a specified key.
6 | It can be applied to both synchronous and asynchronous functions, facilitating locking mechanisms to prevent concurrent executions.
7 | Internally it reuses the same distributed locking mechanism as `lock`, relying on an underlying cache client that supports atomic "set-if-not-exists" (`nx`) semantics.
8 |
9 |
10 | There are two main ways to use `once` with py-cachify:
11 |
12 | - Via the **global** `once` decorator exported from `py_cachify`, which relies on a globally initialized client.
13 | - Via **instance-based** `once` decorators obtained from a `Cachify` object created by `init_cachify(is_global=False)`.
14 |
15 | ---
16 |
17 | ## Function: ///once()///
18 |
19 | ### Description
20 | The `once` decorator takes a key to manage function calls,
21 | ensuring that only one invocation of the wrapped function occurs at a time.
22 | If the function is called while it is still locked, it can either raise an exception or return a predefined value depending on the parameters.
23 |
24 | ### Parameters
25 |
26 | | Parameter | Type | Description |
27 | |---------------------|---------------------------------|---------------------------------------------------------------------------------------------------------------|
28 | | `key` | `str` | The key used to identify the lock for the function. |
29 | | `raise_on_locked` | `bool`, optional | If `True`, raises an exception (`CachifyLockError`) when the function call is already locked. Defaults to `False`. |
30 | | `return_on_locked` | `Any`, optional | The value to return when the function is already locked. Defaults to `None`. |
31 |
32 | ### Returns
33 | - `WrappedFunctionLock`: A wrapped function (either synchronous or asynchronous) with additional methods attached for lock management, specifically:
34 | - `is_locked(*args, **kwargs)`: Method to check if the function is currently locked.
35 | - `release(*args, **kwargs)`: Method to release the lock.
36 |
37 |
38 | - **If the wrapped function is called while locked**:
39 | - If `raise_on_locked` is `True`: A `CachifyLockError` exception is raised.
40 | - If `return_on_locked` is specified: The decorator returns the specified value instead of invoking the function.
41 | - If neither is provided, the call is simply skipped and the default `None` is returned.
42 |
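The branching above can be sketched with a toy, single-process analogue of `once`. Note the heavy assumptions: `threading.Lock` stands in for the real cache-backed distributed lock, `RuntimeError` stands in for `CachifyLockError`, and `once_sketch` is an illustrative name, not py-cachify's implementation:

```python
import threading
from functools import wraps


def once_sketch(raise_on_locked=False, return_on_locked=None):
    # Toy stand-in for @once: a process-local gate instead of a cache-backed lock.
    gate = threading.Lock()

    def deco(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            if not gate.acquire(blocking=False):
                if raise_on_locked:
                    raise RuntimeError('already running')  # py-cachify raises CachifyLockError here
                return return_on_locked
            try:
                return func(*args, **kwargs)
            finally:
                gate.release()

        return wrapper

    return deco


@once_sketch(return_on_locked='busy')
def recurse(depth=0):
    # The nested call below arrives while the gate is still held
    if depth == 0:
        return recurse(depth=1)
    return 'ran'


print(recurse())  # 'busy' -> the locked branch returned return_on_locked
```

The re-entrant call makes the locked branch deterministic to observe without spawning threads.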
43 |
44 | ### Usage Example
45 |
46 | ```python
47 | from py_cachify import once
48 |
49 | @once('my_function_lock', raise_on_locked=True)
50 | def my_function():
51 | # Critical section of code goes here
52 | return 'Function executed'
53 |
54 | @once('my_async_function_lock-{arg}', return_on_locked='Function already running')
55 | async def my_async_function(arg: str):
56 | # Critical section of async code goes here
57 | return 'Async function executed'
58 | ```
59 |
60 | ### Instance-based Usage
61 |
62 | If you need multiple, independent "once" semantics (for example, per module or subsystem), you can create dedicated `Cachify` instances via `init_cachify(is_global=False)` and use their `once` method instead of the global decorator:
63 |
64 | ```python
65 | from py_cachify import init_cachify
66 |
67 | # Create a dedicated instance that does not affect the global client
68 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-')
69 |
70 | @local_cachify.once('local-once-{task_id}')
71 | def local_task(task_id: str) -> None:
72 | # This function will be guarded by the local instance
73 | ...
74 | ```
75 |
76 | - Global `@once(...)` uses the client configured by a global `init_cachify()` call.
77 | - `@local_cachify.once(...)` uses a client that is completely independent from the global one.
78 |
79 | ### Releasing the once lock or checking if it is locked
80 |
81 | ```python
82 |
83 | await my_async_function.is_locked(arg='arg-value')
84 |
85 | await my_async_function.release(arg='arg-value')
86 | ```
87 |
88 | The same pattern applies to instance-based usage:
89 |
90 | ```python
91 | await local_task.is_locked(task_id='42')
92 | await local_task.release(task_id='42')
93 | ```
94 |
95 | ### Note
96 | - If py-cachify is not initialized through `init_cachify` with `is_global=True`, using the global `once` decorator will raise a `CachifyInitError`.
97 | - `Cachify` instances created with `is_global=False` do not depend on global initialization and can be used independently.
98 | - The correctness of `once` in concurrent or distributed environments depends on the underlying cache client providing an atomic "set-if-not-exists" (`nx`) operation (see the initialization reference and custom client section for details). Redis/DragonflyDB support this by default.
99 |
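For intuition, the `nx` contract that note relies on can be mimicked with a tiny in-memory client. This is an illustrative sketch only; `TinyClient` is a hypothetical name and not py-cachify's client API:

```python
import threading


class TinyClient:
    # Minimal in-memory sketch of "set-if-not-exists" (nx) semantics.
    def __init__(self):
        self._store = {}
        self._mutex = threading.Lock()

    def set(self, key, val, nx=False):
        with self._mutex:
            if nx and key in self._store:
                return None  # key already set -> acquisition fails atomically
            self._store[key] = val
            return True


client = TinyClient()
print(client.set('job-lock', 1, nx=True))  # True: first caller acquires the lock
print(client.set('job-lock', 1, nx=True))  # None: second caller is rejected
```

The whole point of `nx` is that the existence check and the write happen as one atomic step, so two concurrent callers can never both "win".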
100 |
101 |
102 | ### Type Hints Remark (Decorator only application)
103 |
104 | Currently, Python's type hints have limitations in fully capturing a function's
105 | original signature when transitioning to a protocol-based callable in a decorator,
106 | particularly for methods (i.e., those that include `self`).
107 | `ParamSpec` can effectively handle argument and keyword types for functions
108 | but doesn't translate well to methods within protocols like `WrappedFunctionLock`.
109 | I'm staying updated on this issue and recommend checking the following resources
110 | for more insights into ongoing discussions and proposed solutions:
111 |
112 | - [Typeshed Pull Request #11662](https://github.com/python/typeshed/pull/11662)
113 | - [Mypy Pull Request #17123](https://github.com/python/mypy/pull/17123)
114 | - [Python Discussion on Allowing Self-Binding for Generic ParamSpec](https://discuss.python.org/t/allow-self-binding-for-generic-paramspec/50948)
115 |
116 | Once any developments occur, I will quickly update the source code to incorporate the changes.
--------------------------------------------------------------------------------
/py_cachify/_backend/_cached.py:
--------------------------------------------------------------------------------
1 | import inspect
2 | from collections.abc import Awaitable
3 | from functools import partial, wraps
4 | from typing import Callable, Optional, TypeVar, Union, cast, overload
5 |
6 | from typing_extensions import ParamSpec
7 |
8 | from ._helpers import a_reset, encode_decode_value, get_full_key_from_signature, is_coroutine, reset
9 | from ._lib import CachifyClient, get_cachify_client
10 | from ._types._common import UNSET, Decoder, Encoder, UnsetType
11 | from ._types._reset_wrap import AsyncResetWrappedF, SyncResetWrappedF, WrappedFunctionReset
12 |
13 |
14 | _R = TypeVar('_R')
15 | _P = ParamSpec('_P')
16 | _S = TypeVar('_S')
17 |
18 |
19 | def cached(
20 | key: str,
21 | ttl: Union[Optional[int], UnsetType] = UNSET,
22 | enc_dec: Union[tuple[Encoder, Decoder], None] = None,
23 | ) -> WrappedFunctionReset:
24 | """
25 | Decorator that caches the result of a function based on the specified key, time-to-live (ttl),
26 | and encoding/decoding functions.
27 |
28 | Args:
29 | key (str): The key used to identify the cached result, could be a format string.
30 | ttl (Union[int, None, UnsetType], optional): The time-to-live for the cached result.
31 | If UNSET (default), the current cachify client's default_cache_ttl is used.
32 | If None, means indefinitely.
33 | enc_dec (Union[Tuple[Encoder, Decoder], None], optional): The encoding and decoding functions for the cached value.
34 | Defaults to None.
35 |
36 | Returns:
37 | WrappedFunctionReset: Either a synchronous or asynchronous function with reset method attached to it,
38 | reset(*args, **kwargs) matches the type of original function, accepts the same argument,
39 | and could be used to reset the cache.
40 | """
41 |
42 | return _cached_impl(key=key, ttl=ttl, enc_dec=enc_dec, client_provider=get_cachify_client)
43 |
44 |
45 | def _cached_impl(
46 | key: str,
47 | ttl: Union[Optional[int], UnsetType] = UNSET,
48 | enc_dec: Union[tuple[Encoder, Decoder], None] = None,
49 | client_provider: Callable[[], CachifyClient] = get_cachify_client,
50 | ) -> WrappedFunctionReset:
51 | @overload
52 | def _cached_inner( # type: ignore[overload-overlap]
53 | _func: Callable[_P, Awaitable[_R]],
54 | ) -> AsyncResetWrappedF[_P, _R]: ...
55 |
56 | @overload
57 | def _cached_inner(
58 | _func: Callable[_P, _R],
59 | ) -> SyncResetWrappedF[_P, _R]: ...
60 |
61 | def _cached_inner(
62 | _func: Union[Callable[_P, Awaitable[_R]], Callable[_P, _R]],
63 | ) -> Union[AsyncResetWrappedF[_P, _R], SyncResetWrappedF[_P, _R]]:
64 | signature = inspect.signature(_func)
65 |
66 | enc, dec = None, None
67 | if enc_dec is not None:
68 | enc, dec = enc_dec
69 |
70 | def _resolve_ttl(client: CachifyClient) -> Optional[int]:
71 | if isinstance(ttl, UnsetType):
72 | return client.default_cache_ttl
73 | return ttl
74 |
75 | if is_coroutine(_func):
76 | _awaitable_func = _func
77 |
78 | @wraps(_awaitable_func)
79 | async def _async_wrapper(*args: _P.args, **kwargs: _P.kwargs) -> _R:
80 | cachify_client = client_provider()
81 | _key = get_full_key_from_signature(
82 | bound_args=signature.bind(*args, **kwargs), key=key, operation_postfix='cached'
83 | )
84 | if (val := await cachify_client.a_get(key=_key)) is not None:
85 | return cast(_R, encode_decode_value(encoder_decoder=dec, val=val))
86 |
87 | res = await _awaitable_func(*args, **kwargs)
88 |
89 | await cachify_client.a_set(
90 | key=_key,
91 | val=encode_decode_value(encoder_decoder=enc, val=res),
92 | ttl=_resolve_ttl(cachify_client),
93 | )
94 | return res
95 |
96 | setattr(
97 | _async_wrapper,
98 | 'reset',
99 | partial(
100 | a_reset,
101 | _pyc_signature=signature,
102 | _pyc_key=key,
103 | _pyc_operation_postfix='cached',
104 | _pyc_original_func=_awaitable_func,
105 | _pyc_client_provider=client_provider,
106 | ),
107 | )
108 |
109 | return cast(AsyncResetWrappedF[_P, _R], cast(object, _async_wrapper))
110 | else:
111 | _sync_func = cast(Callable[_P, _R], _func) # type: ignore[redundant-cast]
112 |
113 | @wraps(_sync_func)
114 | def _sync_wrapper(*args: _P.args, **kwargs: _P.kwargs) -> _R:
115 | cachify_client = client_provider()
116 | _key = get_full_key_from_signature(
117 | bound_args=signature.bind(*args, **kwargs), key=key, operation_postfix='cached'
118 | )
119 | if (val := cachify_client.get(key=_key)) is not None:
120 | return cast(_R, encode_decode_value(encoder_decoder=dec, val=val))
121 |
122 | res = _sync_func(*args, **kwargs)
123 |
124 | cachify_client.set(
125 | key=_key,
126 | val=encode_decode_value(encoder_decoder=enc, val=res),
127 | ttl=_resolve_ttl(cachify_client),
128 | )
129 | return res
130 |
131 | setattr(
132 | _sync_wrapper,
133 | 'reset',
134 | partial(
135 | reset,
136 | _pyc_signature=signature,
137 | _pyc_key=key,
138 | _pyc_operation_postfix='cached',
139 | _pyc_original_func=_sync_func,
140 | _pyc_client_provider=client_provider,
141 | ),
142 | )
143 |
144 | return cast(SyncResetWrappedF[_P, _R], cast(object, _sync_wrapper))
145 |
146 | return cast(WrappedFunctionReset, cast(object, _cached_inner))
147 |
--------------------------------------------------------------------------------
/docs/tutorial/cached-decorator/specifying-ttl-and-encoder-decoder.md:
--------------------------------------------------------------------------------
1 | # Cached - Providing a ttl (time-to-live) and custom encoder/decoder
2 |
3 |
4 | ## Explanation
5 |
6 | Sometimes you don't need to cache a function result indefinitely; you may only want to cache it for, say, a day (a common case for web apps).
7 |
8 | Py-Cachify has you covered with an optional `ttl` param you can pass to the decorator.
9 | This value is passed down to the cache client and usually specifies how long the cached value will live, in seconds.
10 |
11 |
12 | In addition to per-decorator `ttl`, you can also configure a global or instance-level `default_cache_ttl` via `init_cachify`. When `ttl` is omitted on `@cached`, that `default_cache_ttl` is used; when you pass `ttl=None`, the value is stored without expiration even if `default_cache_ttl` is configured; when you pass an explicit integer `ttl`, it overrides any default.
13 |
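The three cases can be sketched as a small resolution function mirroring the behaviour described above (`_Unset`/`resolve_ttl` are stand-ins for this sketch, not py-cachify's actual objects):

```python
from typing import Optional, Union


class _Unset:
    pass  # sentinel standing in for py-cachify's internal UNSET marker


UNSET = _Unset()


def resolve_ttl(ttl: Union[Optional[int], _Unset], default_cache_ttl: Optional[int]) -> Optional[int]:
    if isinstance(ttl, _Unset):
        return default_cache_ttl  # ttl omitted -> configured default applies
    return ttl  # explicit None (no expiration) or explicit int (override)


print(resolve_ttl(UNSET, 10))  # 10
print(resolve_ttl(None, 10))   # None
print(resolve_ttl(1, 10))      # 1
```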
14 | ## Let's see it in action
15 |
16 |
17 | ```python
18 | import asyncio
19 |
20 | from py_cachify import init_cachify, cached
21 |
22 |
23 | # here we are initializing py-cachify to use an in-memory cache
24 |
25 | # and setting a default_cache_ttl that will be used when ttl is omitted
26 | init_cachify(default_cache_ttl=10)
27 |
28 |
29 | # notice the ttl param; it will cache the result for one second, overriding default_cache_ttl
30 | @cached(key='sum_two-{a}-{b}', ttl=1)
31 | async def sum_two(a: int, b: int) -> int:
32 | # Let's put print here to see what was the function called with
33 | print(f'Called with {a} {b}')
34 | return a + b
35 |
36 |
37 | async def main() -> None:
38 | # Call the function first time with (5, 5)
39 |     print(f'First call result: {await sum_two(5, 5)}')
40 |
41 | # Let's wait for 2 seconds
42 | await asyncio.sleep(2)
43 |
44 | # And we will call it again to check what will happen
45 |
46 | print(f'Second call result: {await sum_two(5, 5)}')
47 |
48 |
49 |
50 | if __name__ == '__main__':
51 | asyncio.run(main())
52 | ```
53 |
54 |
55 | The only changes we introduced are removing the third call, adding the sleep, and providing a `ttl` param that overrides the configured `default_cache_ttl`.
56 |
57 |
58 | After running the example:
59 |
60 | ```console
61 | // Run our example
62 | $ python main.py
63 |
64 | // The output will be
65 | Called with 5 5
66 | First call result: 10
67 | Called with 5 5
68 | Second call result: 10
69 |
70 | ```
71 |
72 |
73 | As you can see, the cache expired (after the 1-second `ttl`) and allowed the function to be called again. If we had omitted `ttl` entirely, the `default_cache_ttl=10` configured in `init_cachify` would have been used instead.
74 |
75 |
76 | ## Encoders/Decoders
77 |
78 | `ttl` is not the only param that `@cached()` has available.
79 | There is also an `enc_dec` which accepts a tuple of `(Encoder, Decoder)`,
80 | those being the methods that are going to be applied to the function result on caching and retrieving the cache value.
81 |
82 | The required signature is `Callable[[Any], Any]`.
83 | But keep in mind that results should be picklable, since py-cachify uses pickle before passing the value to the cache backend.
84 |
85 |
86 | ℹ Why it was introduced
87 |
88 | The main reason is sometimes you have to cache something, that is not picklable by default.
89 |
90 | Even though the cases are rare, we decided to support it since it doesn't hurt to have it when it's needed :)
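As a concrete, stdlib-only sketch of the unpicklable case: a generator is a typical result pickle rejects, and an encoder/decoder pair is enough to make it cache-friendly (`to_cacheable`/`from_cacheable` are hypothetical helper names for this illustration):

```python
import pickle


def numbers():
    yield from range(3)


# Plain pickling of a generator fails:
try:
    pickle.dumps(numbers())
except TypeError as exc:
    print('cannot pickle:', exc)


def to_cacheable(val):
    return list(val)  # materialize the generator before caching


def from_cacheable(raw):
    return iter(raw)  # hand back an iterator on the way out


blob = pickle.dumps(to_cacheable(numbers()))  # now picklable
print(list(from_cacheable(pickle.loads(blob))))  # [0, 1, 2]
```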
91 |
92 |
93 |
94 |
95 | ## Introducing `enc_dec`
96 |
97 | Usually the provided encoder and decoder are supposed to work in tandem and not change the output value at all
98 | (since the encoder does something, and then the decoder reverts it back).
99 | But for the sake of our demonstration, we'll break that principle.
100 |
101 | We'll introduce the following functions:
102 |
103 | ```python
104 |
105 | # our encoder will multiply the result by 2
106 | def encoder(val: int) -> int:
107 | return val * 2
108 |
109 |
110 | # and our decoder will do the multiplication by 3
111 | def decoder(val: int) -> int:
112 | return val * 3
113 | ```
114 |
115 | Now, as a result, the final output should be multiplied by 6.
116 |
117 | All we have to do now is modify our `@cached()` decorator params to look like this:
118 |
119 | ```python
120 | @cached(key='sum_two-{a}-{b}', enc_dec=(encoder, decoder))
121 | async def sum_two(a: int, b: int) -> int:
122 | # Let's put print here to see what was the function called with
123 | print(f'Called with {a} {b}')
124 | return a + b
125 | ```
126 |
127 |
128 |
129 | ℹ Full file preview
130 | ```python
131 | import asyncio
132 |
133 | from py_cachify import init_cachify, cached
134 |
135 |
136 | # here we are initializing py-cachify to use an in-memory cache
137 | init_cachify()
138 |
139 |
140 | # our encoder will multiply the result by 2
141 | def encoder(val: int) -> int:
142 | return val * 2
143 |
144 |
145 | # and our decoder will do the multiplication by 3
146 | def decoder(val: int) -> int:
147 | return val * 3
148 |
149 |
150 | # enc_dec is provided
151 | @cached(key='sum_two-{a}-{b}', enc_dec=(encoder, decoder))
152 | async def sum_two(a: int, b: int) -> int:
153 | # Let's put print here to see what was the function called with
154 | print(f'Called with {a} {b}')
155 | return a + b
156 |
157 |
158 | async def main() -> None:
159 | # Call the function first time with (5, 5), this is where the encoder will be applied before setting cache value
160 | print(f'First call result: {await sum_two(5, 5)}')
161 |
162 | # Calling the function again with the same arguments to make decoder do its job on retrieving value from cache
163 | print(f'Second call result: {await sum_two(5, 5)}')
164 |
165 |
166 | if __name__ == '__main__':
167 | asyncio.run(main())
168 | ```
169 |
170 |
171 |
172 |
173 | ## Running the code
174 |
175 | After running the file we've crafted, we should get the following output:
176 |
177 |
178 | ```bash
179 | # Run our example
180 | $ python main.py
181 |
183 | # The output will be
183 | Called with 5 5
184 | First call result: 10
185 | Second call result: 60
186 |
187 | ```
188 |
189 | As you can see, the second call result was 60, which is 6 times the original value.
190 |
191 |
192 | ## What's next
193 |
194 | We'll see some magic that py-cachify does on a function wrap and learn how to manually reset a cache.
195 |
--------------------------------------------------------------------------------
/docs/tutorial/cached-decorator/reset-attribute.md:
--------------------------------------------------------------------------------
1 | # Cached - Manually resetting cache with ///reset()/// method
2 |
3 | ## How to
4 |
5 | Now it's time to see some ✨magic✨ happen.
6 |
7 | You might have wondered:
8 |
9 | What if I need to manually reset the cache on something I have cached using the `@cached` decorator?
10 | Do I have to go all the way to my actual cache client and do the reset myself? How can I reset a dynamic key with certain arguments?
11 |
12 | Don't worry, py-cachify has got you covered.
13 |
14 | ## Introducing ///reset()///
15 |
16 | Every time you wrap a function with one of py-cachify's decorators, a method is attached to the function being wrapped.
17 |
18 | The attached method also matches the original function's sync/async nature: if the function is async, the `reset` method will be async, and vice versa for a sync function.
19 |
20 | `reset()` **has the same signature** as your declared function, this way you can easily reset even the dynamic key with no issues.
21 |
22 | ## Changing our example
23 |
24 | Let's modify the code we ran previously in the dynamic keys introduction:
25 |
26 | ```python
27 | import asyncio
28 |
29 | from py_cachify import init_cachify, cached
30 |
31 |
32 | # here we are initializing py-cachify to use an in-memory cache
33 | init_cachify()
34 |
35 |
36 | # nothing is changing in declaration
37 | @cached(key='sum_two-{a}-{b}')
38 | async def sum_two(a: int, b: int) -> int:
39 | # Let's put print here to see what was the function called with
40 | print(f'Called with {a} {b}')
41 | return a + b
42 |
43 |
44 | async def main() -> None:
45 | # Call the function first time with (5, 5)
46 | print(f'First call result: {await sum_two(5, 5)}')
47 |
48 | # Let's try resetting the cache for this specific call
49 | await sum_two.reset(a=5, b=5)
50 |
51 | # And then call the function again to see what will happen
52 | print(f'Second call result: {await sum_two(5, 5)}')
53 |
54 |
55 | if __name__ == '__main__':
56 | asyncio.run(main())
57 | ```
58 |
59 | We have added the reset call for a specific signature.
60 |
61 | Let's now run it and see the output:
64 |
65 | ```bash
66 | # Run our example
67 | $ python main.py
68 |
69 | # The output will be
70 | Called with 5 5
71 | First call result: 10
72 | Called with 5 5
73 | Second call result: 10
74 |
75 | ```
76 |
77 | And you can see that the cache has been reset between the two calls we have.
78 |
79 | ## Instance-based reset
80 |
81 | So far we have only used the **global** `@cached` decorator that relies on the globally initialized client.
82 |
83 | In more advanced scenarios you might want a dedicated cache instance (for example, for a specific module or subsystem) that you can reset independently from the global one. For that, you can create a local `Cachify` instance using `init_cachify(is_global=False)` and call `reset()` on the wrapped function in exactly the same way.
84 |
85 | ```python
86 | import asyncio
87 |
88 | from py_cachify import init_cachify
89 |
90 |
91 | # global initialization for the top-level decorators
92 | init_cachify()
93 |
94 |
95 | # local instance that does NOT touch the global client
96 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-')
97 |
98 |
99 | @local_cachify.cached(key='local-sum_two-{a}-{b}')
100 | async def local_sum_two(a: int, b: int) -> int:
101 | print(f'LOCAL called with {a} {b}')
102 | return a + b
103 |
104 |
105 | async def main() -> None:
106 | print(f'First local call: {await local_sum_two(1, 2)}')
107 | print(f'Second local call: {await local_sum_two(1, 2)}')
108 |
109 | # Reset only the local cache entry for these arguments
110 | await local_sum_two.reset(a=1, b=2)
111 |
112 | print(f'Third local call after reset: {await local_sum_two(1, 2)}')
113 |
114 |
115 | if __name__ == '__main__':
116 | asyncio.run(main())
117 | ```
118 |
119 | Here:
120 |
121 | - `local_sum_two` uses the dedicated instance configured via `local_cachify`.
122 | - `local_sum_two.reset(...)` operates only on that instance’s cache and has no effect on any globally cached functions.
123 | - The method signature is still the same as the original function.
124 |
125 | ## Multi-layer reset
126 |
127 | You can also create **multi-layer** caching by stacking a local instance’s `cached` decorator on top of the global `@cached`. In that case, calling `reset()` on the stacked function will clear both layers for the given arguments.
128 |
129 | ```python
130 | import asyncio
131 |
132 | from py_cachify import init_cachify, cached
133 |
134 |
135 | # global initialization for the top-level decorators
136 | init_cachify()
137 |
138 |
139 | # local instance providing a short-lived layer over the global cache
140 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-')
141 |
142 |
143 | @local_cachify.cached(key='local-sum_two-{a}-{b}', ttl=5)
144 | @cached(key='sum_two-{a}-{b}', ttl=60)
145 | async def sum_two(a: int, b: int) -> int:
146 | print(f'GLOBAL called with {a} {b}')
147 | return a + b
148 |
149 |
150 | async def main() -> None:
151 | # First call: computes and populates both inner and outer caches
152 | print(f'First layered call: {await sum_two(2, 3)}')
153 | # Second call: hits outer cache only, no extra prints
154 | print(f'Second layered call: {await sum_two(2, 3)}')
155 |
156 | # Reset both local and global layers for these args
157 | await sum_two.reset(a=2, b=3)
158 |
159 | # After reset, both caches are clear for (2, 3), so the inner function is executed again
160 | print(f'Third layered call after reset: {await sum_two(2, 3)}')
161 |
162 |
163 | if __name__ == '__main__':
164 | asyncio.run(main())
165 | ```
166 |
167 | This pattern lets you compose multiple caches with different TTLs or backends while keeping the `reset()` API simple and predictable.
168 |
169 | ## Type annotations
170 |
171 | The `reset()` method has the same signature as the original function, which allows your IDE to help you with inline hints and errors:
172 |
173 | 
174 |
175 | ## Conclusion
176 |
177 | This concludes our tutorial for the `@cached()` decorator.
178 |
179 | An important behavioral note to keep in mind:
180 |
181 | - When you do not pass `ttl` to `@cached`, the effective TTL is taken from the configured `default_cache_ttl` (if any), and if both are omitted the value is stored without expiration; passing `ttl=None` always forces “no expiration”, even when a default TTL exists.
182 |
183 |
184 | Next, we'll learn about the locks and a handy decorator that will help you incorporate locking logic without a headache.
185 |
--------------------------------------------------------------------------------
/docs/reference/lock.md:
--------------------------------------------------------------------------------
1 | # API Reference for ///lock()///
2 |
3 | ## Overview
4 |
5 | The `lock` module provides a mechanism for managing locking within synchronous and asynchronous contexts.
6 |
7 | The main class, `lock`, combines both synchronous and asynchronous locking operations and relies on an underlying cache client that supports atomic "set-if-not-exists" (`nx`) semantics for correct distributed locking behavior.
8 |
9 |
10 | There are two main ways to use locking with py-cachify:
11 |
12 | - Via the **global** `lock` factory exported from `py_cachify`, which relies on a globally initialized client.
13 | - Via **instance-based** locking obtained from a `Cachify` object created by `init_cachify(is_global=False)`.
14 |
15 |
16 | ## Class: ///lock///
17 |
18 | ### Description
19 | The `lock` class manages locks using a specified key, with options for waiting and expiration.
20 | It can be used in both synchronous and asynchronous contexts.
21 |
22 | ### Parameters
23 |
24 | | Parameter | Type | Description |
25 | |-----------|---------------------------------|---------------------------------------------------------------------------------------------------|
26 | | `key` | `str` | The key used to identify the lock. |
27 | | `nowait` | `bool`, optional | If `True`, do not wait for the lock to be released and raise immediately. Defaults to `True`. |
28 | | `timeout` | `Union[int, float]`, optional | Time in seconds to wait for the lock if `nowait` is `False`. Defaults to `None`. |
29 | | `exp`     | `Union[int, None]`, optional | Expiration time for the lock. Defaults to `UNSET` and falls back to the global setting in cachify. |
30 |
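To illustrate how `nowait` and `timeout` interact, here is a rough sketch of a waiting acquire loop over a dict-backed store. This is not py-cachify's implementation; `acquire`, `release`, and `TinyLockError` are hypothetical names, and a real backend would perform the set atomically (see the `nx` semantics below):

```python
import time
from typing import Union


class TinyLockError(Exception):
    pass


_store: dict[str, bool] = {}  # stand-in for a real cache backend


def acquire(key: str, nowait: bool = True, timeout: Union[int, float, None] = None) -> None:
    """Try to take the lock, honoring nowait/timeout as described above."""
    deadline = None if timeout is None else time.monotonic() + timeout
    while True:
        if not _store.get(key):  # a real client would do this atomically with set(nx=True)
            _store[key] = True
            return
        if nowait:
            # nowait=True: raise immediately instead of waiting
            raise TinyLockError(f'{key} is already locked')
        if deadline is not None and time.monotonic() >= deadline:
            # nowait=False with a timeout: give up once the deadline passes
            raise TinyLockError(f'timed out waiting for {key}')
        time.sleep(0.01)


def release(key: str) -> None:
    _store.pop(key, None)
```

With `nowait=False` and `timeout=None`, the loop would wait indefinitely for the lock to be released.
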
31 | ### Methods
32 |
33 | - `__enter__() -> Self`
34 | - Acquire a lock for the specified key as a context manager, synchronous.
35 |
36 | - `release() -> None`
37 | - Release the lock that is currently being held, synchronous.
38 |
39 | - `is_locked() -> bool`
40 | - Check if the lock is currently held, synchronous.
41 |
42 | - `__aenter__() -> Self`
43 | - Async version of `__enter__` to acquire a lock as an async context manager.
44 |
45 | - `arelease() -> None`
46 | - Release the lock that is currently held, asynchronously.
47 |
48 | - `is_alocked() -> bool`
49 | - Check if the lock is currently held asynchronously.
50 |
51 | - as a `decorator`
52 | - Decorator to acquire a lock for the wrapped function on call, for both synchronous and asynchronous functions.
53 | - Attaches the following methods to the wrapped function:
54 | - `is_locked(*args, **kwargs)`: Check if the function is currently locked.
55 | - `release(*args, **kwargs)`: Release the lock associated with the function.
56 |
57 | ## Error Handling
58 |
59 | - **`CachifyLockError`**: Raised when an operation on a lock is invalid or a lock cannot be acquired.
60 |
61 | ## Backend Requirements and `nx` Semantics
62 |
63 | The correctness of `lock` (and decorators built on top of it) depends on the underlying cache client providing an atomic "set-if-not-exists" operation via an `nx` flag:
64 |
65 | - When `nx=False`, a `set` call should behave like a normal upsert and overwrite existing values.
66 | - When `nx=True`, a `set` call must atomically set the value **only if** the key does not already exist, and return a truthy indication on success and a falsy indication otherwise.
67 |
68 | Built-in clients implement this behavior and use it to acquire and release locks safely. Custom clients should follow the same contract as documented in the initialization reference to ensure that locks behave correctly in concurrent and distributed scenarios.
69 |
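As a concrete illustration, a minimal custom sync client honoring this contract might look like the sketch below. The method names (`get`/`set`/`delete`) and the `ex` parameter mirror redis-py-style clients and are an assumption here; check the initialization reference for the exact required interface:

```python
from typing import Any, Optional


class DictClient:
    """Toy in-memory client implementing set-if-not-exists (nx) semantics."""

    def __init__(self) -> None:
        self._data: dict[str, Any] = {}

    def get(self, name: str) -> Any:
        return self._data.get(name)

    def set(self, name: str, value: Any, ex: Optional[int] = None, nx: bool = False) -> Any:
        if nx and name in self._data:
            return None  # falsy: key already exists, lock not acquired
        self._data[name] = value  # note: this toy client ignores `ex` (expiration)
        return True  # truthy: value was set

    def delete(self, name: str) -> None:
        self._data.pop(name, None)
```

A lock acquire maps to `set(key, ..., nx=True)` succeeding, and a release maps to `delete(key)`. Note that in a multi-threaded or distributed setting a plain dict check-then-set like this is not atomic; real backends (Redis, DragonflyDB) provide atomicity server-side.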
70 |
71 | ## Usage Example
72 |
73 | ```python
74 | from py_cachify import lock
75 |
76 | @lock('my_lock_key-{arg}', nowait=True)
77 | def my_function(arg: str) -> None:
78 | # Critical section of code goes here
79 | pass
80 |
81 |
82 | with lock('my_lock_key'):
83 | # Critical section of code goes here
84 | pass
85 |
86 | async with lock('my_async_lock_key'):
87 | # Critical section of async code goes here
88 | pass
89 |
90 | ```
91 |
92 |
93 | By using the `lock` class, you'll ensure that your function calls are properly synchronized, preventing race conditions in shared resources.
94 |
95 | ### Instance-based usage
96 |
97 | If you need multiple, independent locking backends (for example, per module or subsystem), you can create dedicated `Cachify` instances via `init_cachify(is_global=False)` and use their `lock` method instead of the global factory:
98 |
99 | ```python
100 | from py_cachify import init_cachify
101 |
102 | # Create a dedicated instance that does not affect the global client
103 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-')
104 |
105 | local_lock = local_cachify.lock(key='local-lock-{name}')
106 |
107 | with local_lock:
108 | # Critical section protected by the local instance
109 | ...
110 | ```
111 |
112 | - Global `lock(...)` uses the client configured by a global `init_cachify()` call.
113 | - `local_cachify.lock(...)` uses a client that is completely independent from the global one.
114 |
115 |
116 | ### Releasing the Lock or checking whether it's locked or not
117 | ```python
118 | my_function.is_locked(arg='arg-value') # returns bool
119 |
120 | my_function.release(arg='arg-value') # forcefully releases the lock
121 | ```
122 |
123 | ### Note
124 |
125 |
126 | - If py-cachify is not initialized through `init_cachify` with `is_global=True`, using the global `lock` factory or decorators will raise a `CachifyInitError`.
127 | - `Cachify` instances created with `is_global=False` do not depend on global initialization and can be used independently.
128 |
129 |
130 | ### Type Hints Remark (Decorator only application)
131 |
132 | Currently, Python's type hints have limitations in fully capturing a function's
133 | original signature when transitioning to a protocol-based callable in a decorator,
134 | particularly for methods (i.e., those that include `self`).
135 | `ParamSpec` can effectively handle argument and keyword types for functions
136 | but doesn't translate well to methods within protocols like `WrappedFunctionLock`.
137 | I'm staying updated on this issue and recommend checking the following resources
138 | for more insights into ongoing discussions and proposed solutions:
139 |
140 | - [Typeshed Pull Request #11662](https://github.com/python/typeshed/pull/11662)
141 | - [Mypy Pull Request #17123](https://github.com/python/mypy/pull/17123)
142 | - [Python Discussion on Allowing Self-Binding for Generic ParamSpec](https://discuss.python.org/t/allow-self-binding-for-generic-paramspec/50948)
143 |
144 | Once any developments occur, I will quickly update the source code to incorporate the changes.
--------------------------------------------------------------------------------
/docs/reference/cached.md:
--------------------------------------------------------------------------------
1 | # API Reference for ///@cached()/// Decorator
2 |
3 | ## Overview
4 |
5 | The `cached` decorator provides a caching mechanism that stores the result of a function based on a specified key, time-to-live (TTL), and optional encoding/decoding functions. It can be applied to both synchronous and asynchronous functions, facilitating quick access to previously computed results. This includes respecting a configurable `default_cache_ttl` when no explicit `ttl` is provided.
6 |
7 | ---
8 |
9 | ## Function: ///cached///
10 |
11 | ### Description
12 |
13 | The `cached` decorator caches the results of a function execution using a unique key. If the function is called again with the same key before the TTL expires, the cached result is returned instead of re-executing the function. This is particularly useful for expensive computations or IO-bound tasks.
14 |
15 | There are two main ways to use caching with py-cachify:
16 |
17 | - Via the **global** `cached` decorator exported from `py_cachify`, which relies on a globally initialized client.
18 | - Via **instance-based** decorators obtained from a `Cachify` object created by `init_cachify(is_global=False)`.
19 |
20 | ### Parameters
21 |
22 | | Parameter | Type | Description |
23 | |---------------------|---------------------------------|---------------------------------------------------------------------------------------------------------------|
24 | | `key` | `str` | The key used to identify the cached result, which can utilize formatted strings to create dynamic keys. (i.e. `key='my_key-{func_arg}'`) |
26 | | `ttl` | `Union[int, None]`, optional | Time-to-live (seconds) for the cached result. If omitted, the decorator uses the cache client's `default_cache_ttl` (configured via `init_cachify`). If `ttl` is `None`, the value is stored without expiration. If `ttl` is an integer, that value is used directly and overrides any `default_cache_ttl`. |
27 | | `enc_dec` | `Union[Tuple[Encoder, Decoder], None]`, optional | A tuple containing the encoding and decoding functions for the cached value. Defaults to `None`, which means that no encoding or decoding functions will be applied. |
28 |
29 |
30 | ### Default TTL behavior
31 |
32 | The effective TTL for a cached value is determined as follows (higher items take precedence over lower ones):
33 |
34 | 1. If you pass an explicit integer, for example `@cached(..., ttl=30)`, that TTL is used and overrides any `default_cache_ttl`.
35 | 2. If you pass `ttl=None`, the cache entry is stored **without expiration** (infinite TTL in most backends), even if `default_cache_ttl` is configured.
36 | 3. If you omit `ttl` entirely, the decorator will fall back to the underlying client's `default_cache_ttl`:
37 | - `default_cache_ttl` is configured via `init_cachify(default_cache_ttl=...)` for both global and instance-based usage.
38 | - If `default_cache_ttl` is `None` (the default), omitting `ttl` behaves like “no expiration”.
39 |
40 | This lets you define a global or instance-specific default TTL once and only override it where needed. When you do not configure `default_cache_ttl` at all (leaving it as `None`) and also omit `ttl` on the decorator, the behavior is the same as in previous versions of py-cachify: cached values are stored without expiration by default.
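
The precedence rules above can be summarized as a tiny decision function. This is a sketch of the logic only, not the library's internals; `effective_ttl` and the `UNSET` sentinel are hypothetical names:

```python
from typing import Optional, Union

UNSET = object()  # sentinel meaning "ttl was not passed at all"


def effective_ttl(
    ttl: Union[int, None, object] = UNSET,
    default_cache_ttl: Optional[int] = None,
) -> Optional[int]:
    """Return the TTL that would apply; None means 'no expiration'."""
    if ttl is UNSET:
        return default_cache_ttl  # fall back to init_cachify's default (may be None)
    return ttl  # explicit int overrides; explicit None forces no expiration


print(effective_ttl(ttl=30, default_cache_ttl=60))    # 30: explicit int wins
print(effective_ttl(ttl=None, default_cache_ttl=60))  # None: no expiration, despite the default
print(effective_ttl(default_cache_ttl=60))            # 60: omitted ttl falls back to the default
print(effective_ttl())                                # None: nothing configured
```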
41 |
42 | ### Returns
43 |
44 | - `WrappedFunctionReset`: A wrapped function (either synchronous or asynchronous) with an additional `reset` method attached for cache management. The `reset(*args, **kwargs)` method allows the user to manually reset the cache for the function using the same key.
45 |
46 | ### Method Behavior
47 |
48 | 1. **For Synchronous Functions**:
49 | - Checks if a cached value exists for the provided key.
50 | - If the cached value exists, it returns the decoded value.
51 | - If not, it executes the function, caches the result (after encoding, if specified), and then returns the result.
52 |
53 | 2. **For Asynchronous Functions**:
54 | - Similar checks are performed in an asynchronous context using `await`.
55 | - The caching behavior mirrors the synchronous version.
56 |
57 | ### Global Usage Example
58 |
59 | ```python
60 | from py_cachify import cached, init_cachify
61 |
62 |
63 | # Configure a default cache TTL of 60 seconds for all cached values
64 | init_cachify(default_cache_ttl=60)
65 |
66 |
67 | @cached('my_cache_key')
68 | def compute_expensive_operation(param: int) -> int:
69 | # Uses default_cache_ttl=60 as TTL
70 | return param * 2
71 |
72 |
73 | @cached('my_async_cache_key-{param}', ttl=30)
74 | async def fetch_data(param: int) -> dict:
75 | # Overrides the default and uses ttl=30
76 | return {'data': param}
77 | ```
78 |
79 | ### Instance-based Usage
80 |
81 | If you need multiple independent caches (for example, per module or subsystem), you can create dedicated `Cachify` instances via `init_cachify(is_global=False)` and use their `cached` method instead of the global decorator.
82 |
83 | ```python
84 | from py_cachify import init_cachify
85 |
86 | # Create a dedicated instance that does not affect the global client
87 | # and set a default TTL of 300 seconds for this instance
88 | local_cachify = init_cachify(is_global=False, prefix='LOCAL-', default_cache_ttl=300)
89 | @local_cachify.cached(key='local-{x}-{y}')
90 | def local_sum(x: int, y: int) -> int:
91 | # Uses the instance-level default_cache_ttl=300
92 | return x + y
93 | ```
94 |
95 | - `@cached(...)` (global) uses the client configured by a global `init_cachify()` call.
96 | - `@local_cachify.cached(...)` uses a client that is completely independent from the global one.
97 |
98 | ### Multi-layer Usage
99 |
100 | It is possible to layer caches by stacking `cached` decorators (for example, a global cache inside a local instance cache).
101 |
102 | ```python
103 | from py_cachify import cached, init_cachify
104 |
105 | # Global initialization
106 | init_cachify()
107 |
108 | # Local instance with a shorter TTL that wraps the global one
109 | local = init_cachify(is_global=False, prefix='LOCAL-')
110 |
111 | @local.cached(key='local-expensive-{x}', ttl=5)
112 | @cached(key='expensive-{x}', ttl=60)
113 | def expensive(x: int) -> int:
114 | return x * 10
115 | ```
116 |
117 | In this scenario:
118 |
119 | - The **outer** cache (local instance) provides a short-lived layer over the **inner** global cache.
120 | - This can be useful for adding an in-memory cache on top of a Redis/Dragonfly cache to further speed up execution (for example, for hard-to-refactor N+1 processing).
121 | - Calling `expensive.reset(x)` will:
122 | - Clear the local cache entry for that call.
123 | - Attempt to call `reset` on the inner cached layer as well, if present, so both layers are cleared for that key.
124 |
125 | This makes multi-layer setups behave intuitively when resetting cached values.
126 |
127 | ### Resetting the Cache
128 |
129 | You can reset the cache for either a synchronous or asynchronous function by calling the `reset` method attached to the wrapped function.
130 |
131 | ```python
132 | # Reset cache for a synchronous function
133 | compute_expensive_operation.reset()
134 |
135 | # Reset cache for an asynchronous function
136 | await fetch_data.reset(param='param-value')
137 | ```
138 |
139 | For instance-based usage, the pattern is the same:
140 |
141 | ```python
142 | local_sum.reset(x=1, y=2)
143 | ```
144 |
145 | ### Notes
146 |
147 | - Ensure that both the serialization and deserialization functions defined in `enc_dec` are efficient to preserve optimal performance.
148 | - If py-cachify is not initialized through `init_cachify` with `is_global=True`, using the global `@cached` decorator will raise a `CachifyInitError` at runtime.
149 | - `Cachify` instances created with `is_global=False` do not depend on global initialization and can be used independently.
150 |
151 | ### Type Hints Remark
152 |
153 | Currently, Python's type hints have limitations in fully capturing a function's original signature when transitioning to a protocol-based callable in a decorator, particularly for methods (i.e., those that include `self`). `ParamSpec` can effectively handle argument and keyword types for functions but doesn't translate well to methods within protocols like `WrappedFunctionReset`. I'm staying updated on this issue and recommend checking the following resources for more insights into ongoing discussions and proposed solutions:
154 |
155 | - [Typeshed Pull Request #11662](https://github.com/python/typeshed/pull/11662)
156 | - [Mypy Pull Request #17123](https://github.com/python/mypy/pull/17123)
157 | - [Python Discussion on Allowing Self-Binding for Generic ParamSpec](https://discuss.python.org/t/allow-self-binding-for-generic-paramspec/50948)
158 |
159 | Once any developments occur, I will quickly update the source code to incorporate the changes.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
32 |
33 | ---
34 |
35 | **Documentation**: https://py-cachify.readthedocs.io/latest/
36 |
37 | **Source Code**: https://github.com/EzyGang/py-cachify
38 |
39 | **FastAPI Integration Guide**: Repo
40 |
41 | ---
42 |
43 | **Py-Cachify** is a robust library tailored for developers looking to enhance their Python applications with elegant caching and locking mechanisms.
44 | Whether you're building synchronous or asynchronous applications, Py-Cachify has you covered!
45 | It acts as a thin, backend-agnostic wrapper over your favorite cache client, letting you focus on business logic instead of juggling low-level get/set calls.
46 |
47 |
48 | ## Key Features:
49 | - **Flexible Caching**: Effortlessly cache your function results, dramatically reducing execution time for expensive computations and I/O-bound tasks.
50 | Utilize customizable keys and time-to-live (TTL) parameters.
51 |
52 | - **Distributed Locks**: Ensure safe concurrent operation of functions with distributed locks.
53 | Prevent race conditions and manage shared resources effectively across both sync and async contexts.
54 |
55 | - **Backend Agnostic**: Easily integrate with different cache backends.
56 | Choose between in-memory, Redis, DragonflyDB, or any custom backend that adheres to the provided client interfaces.
57 |
58 | - **Decorators for Ease**: Use intuitive decorators like `@cached()` and `@lock()` to wrap your functions,
59 | maintain clean code, and benefit from automatic cache management.
60 |
61 | - **Type Safety & Documentation**: Fully type-annotated for enhanced IDE support and readability,
62 | featuring comprehensive documentation and examples to guide you through various use cases.
63 |
64 | - **Production Ready**: With 100% test coverage and usage in multiple commercial projects,
65 | Py-Cachify is trusted for production environments, ensuring reliability and stability for your applications.
66 |
67 | ---
68 |
69 | ## Table of Contents
70 |
71 | - [Installation](#installation)
72 | - [How to use](#how-to-use)
73 | - [Basic examples](#basic-examples)
74 | - [Contributing](#contributing)
75 | - [License](#license)
76 |
77 | ## Installation
78 |
79 |
80 | ```bash
81 | $ pip install py-cachify
82 |
83 | ---> 100%
84 | Successfully installed py-cachify
85 | ```
86 |
87 | ## How to use
88 |
89 | You can read more in-depth tutorials [here](https://py-cachify.readthedocs.io/latest/tutorial/).
90 |
91 | First, to start working with the library, you will have to initialize it by using the provided `init_cachify` function:
92 | ```python
93 | from py_cachify import init_cachify
94 |
95 | init_cachify()
96 | ```
97 |
98 | This call:
99 |
100 | - Configures the **global** client used by the top-level decorators: `cached`, `lock`, and `once`.
101 | - Returns a `Cachify` instance, but you don't have to use it if you only work with the global decorators.
102 | - Uses an in-memory cache by default (both for sync and async usage).
103 |
104 |
105 | If you want to use Redis:
106 | ```python
107 | from py_cachify import init_cachify
108 | from redis.asyncio import from_url as async_from_url
109 | from redis import from_url
110 |
111 |
112 | # Example: configure global cachify with Redis for both sync and async flows
113 | init_cachify(
114 | sync_client=from_url(redis_url),
115 | async_client=async_from_url(redis_url),
116 | )
117 | ```
118 | Normally you won't need both sync and async clients, since an application usually works in a single mode, i.e., sync or async. You can pass only `sync_client` **or** only `async_client` if that matches your usage.
119 |
120 |
121 | Once initialized you can use everything that the library provides straight up without being worried about managing the cache yourself.
122 |
123 |
124 | ❗ If py-cachify hasn't been initialized via `init_cachify` with `is_global=True` at least once, using the global decorators (`cached`, `lock`, `once`) will raise `CachifyInitError` at runtime.
125 |
126 | You can also create **dedicated instances** without touching the global client:
127 |
128 | ```python
129 | from py_cachify import init_cachify
130 |
131 | # Global initialization for the top-level decorators
132 | init_cachify()
133 |
134 | # Local instance that does NOT touch the global client
135 | local_cache = init_cachify(is_global=False, prefix='LOCAL-')
136 |
137 | @local_cache.cached(key='local-{x}')
138 | def compute_local(x: int) -> int:
139 | return x * 2
140 | ```
141 |
142 |
143 | ## Basic examples
144 |
145 |
146 | ### Caching
147 |
148 |
149 | Caching with the `@cached` decorator, utilizing the flexibility of a dynamic key:
150 |
151 |
152 | ```python
153 | # Cache the result of the following function with dynamic key
154 | @cached(key='sum_two-{a}-{b}')
155 | async def sum_two(a: int, b: int) -> int:
156 | # Let's put print here to see what was the function called with
157 | print(f'Called with {a} {b}')
158 | return a + b
159 |
160 |
161 | # Reset the cache for the call with arguments a=1, b=2
162 | await sum_two.reset(a=1, b=2)
163 | ```
164 |
165 | ### Multi-layer Usage
166 |
167 | It is possible to layer caches by stacking `cached` decorators (for example, a global cache inside a local instance cache).
168 |
169 | ```python
170 | from py_cachify import cached, init_cachify
171 |
172 | # Global initialization for the top-level decorators
173 | init_cachify()
174 |
175 | # Local instance with a shorter TTL that wraps the global one
176 | local = init_cachify(is_global=False, prefix='LOCAL-')
177 |
178 | @local.cached(key='local-expensive-{x}', ttl=5)
179 | @cached(key='expensive-{x}', ttl=60)
180 | def expensive(x: int) -> int:
181 | return x * 10
182 | ```
183 |
184 | Read more about `@cached` [here](https://py-cachify.readthedocs.io/latest/reference/cached/).
185 |
186 | ### Locking
187 |
188 | Locking through context manager:
189 |
190 | ```python
191 | from py_cachify import lock
192 |
193 |
194 | async_lock = lock('resource_key')
195 | # Use it within an asynchronous context
196 | async with async_lock:
197 | # Your critical section here
198 | print('Critical section code')
199 |
200 | # Check if it's locked
201 | await async_lock.is_alocked()
202 |
203 | # Forcefully release
204 | await async_lock.arelease()
205 |
206 | # Use it within a synchronous context
207 | with lock('resource_key'):
208 | # Your critical section here
209 | print('Critical section code')
210 | ```
211 |
212 | Locking via decorator:
213 |
214 | ```python
215 |
216 | from py_cachify import lock
217 |
218 | @lock(key='critical_function_lock-{arg}', nowait=False, timeout=10)
219 | async def critical_function(arg: int) -> None:
220 |     ...  # critical code
221 |
222 |
223 | # Check if it's locked for arg=5
224 | await critical_function.is_locked(arg=5)
225 |
226 | # Forcefully release for arg=5
227 | await critical_function.release(arg=5)
228 | ```
229 |
230 | Read more about `lock` [here](https://py-cachify.readthedocs.io/latest/reference/lock/).
231 |
232 | For a more detailed tutorial visit [Tutorial](https://py-cachify.readthedocs.io/latest/tutorial/) or [full API reference](https://py-cachify.readthedocs.io/latest/reference).
233 |
234 | ## Contributing
235 |
236 | If you'd like to contribute, please first discuss the changes using Issues, and then don't hesitate to shoot a PR which will be reviewed shortly.
237 |
238 | ## License
239 |
240 | This project is licensed under the MIT License - see the [LICENSE](https://github.com/EzyGang/py-cachify/blob/main/LICENSE) file for details.
241 |
--------------------------------------------------------------------------------
/docs/release-notes.md:
--------------------------------------------------------------------------------
1 | # Release Notes
2 |
3 | ## [3.0.0](https://github.com/EzyGang/py-cachify/releases/tag/v3.0.0)
4 |
5 | In short, 3.0.0 focuses on:
6 |
7 | - Instance-based usage (`Cachify`) and multiple independent caches per app.
8 | - Stronger locking semantics backed by atomic `nx` support in cache clients.
9 | - A configurable `default_cache_ttl` with clearer TTL precedence rules.
10 | - Cleanup of long-deprecated aliases and stricter type checking on modern Python versions.
11 | - Documentation updates and improvements, including a separate page for use with agentic systems and LLMs.
12 |
13 | ### Features & Enhancements
14 |
15 | #### **Multiple cachify instances per app**:
16 | - `init_cachify` now supports `is_global: bool = True` and returns a `Cachify` instance.
17 | - When `is_global=True` (default), `init_cachify` configures the global client used by top-level `cached`, `lock`, and `once` and returns a `Cachify` instance backed by that client.
18 | - When `is_global=False`, `init_cachify` does **not** modify the global client and instead returns an independent `Cachify` instance exposing:
19 | - `Cachify.cached(...)`
20 | - `Cachify.lock(...)`
21 | - `Cachify.once(...)`
22 |
23 | #### **New public `Cachify` type**:
24 | - `Cachify` is now publicly exported from `py_cachify`.
25 | - It provides a convenient, instance-scoped API over the same high-level decorators:
26 | - `@Cachify.cached(...)`
27 | - `@Cachify.lock(...)`
28 | - `@Cachify.once(...)`
29 | - All instance methods share the same semantics as the corresponding top-level decorators, but are bound to a specific client/prefix.
30 |
31 | #### **Improved reset and lock-query semantics in helpers**:
32 | - The helper functions `reset`, `a_reset`, `is_locked`, and `is_alocked` have been reworked to:
33 |   - Accept internal parameters (`_pyc_key`, `_pyc_signature`, `_pyc_operation_postfix`, `_pyc_original_func`, `_pyc_client_provider`) so they are fully aware of which client and which wrapped function they operate on, and to prevent collisions with user-defined function args and kwargs.
34 |
35 | #### **Configurable default cache TTL**:
36 | - `init_cachify` and `Cachify` now accept an optional `default_cache_ttl` parameter.
37 | - If a `@cached` decorator does **not** specify `ttl`, the `default_cache_ttl` of the underlying client is used as the fallback.
38 | - Passing `ttl=None` to `@cached` now explicitly means “no expiration”, even if `default_cache_ttl` is set.
39 | - Effective TTL precedence:
40 | 1. If `@cached(ttl=...)` is provided, that value is used.
41 | 2. Else, if the client has `default_cache_ttl` set, that value is used.
42 | 3. Else, entries are stored without expiration.
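The three precedence rules above can be sketched as a small resolver. This is an illustrative model, not py-cachify's actual internals; the sentinel and function names are hypothetical.

```python
# Illustrative sketch of the TTL precedence rules described above.
# `_UNSET` and `effective_ttl` are hypothetical names, not py-cachify's
# real internals.

_UNSET = object()  # distinguishes "ttl not passed" from an explicit ttl=None


def effective_ttl(decorator_ttl=_UNSET, default_cache_ttl=None):
    """Resolve the TTL a cache entry would be stored with."""
    if decorator_ttl is not _UNSET:
        # 1. An explicit ttl on @cached wins; ttl=None means "no expiration".
        return decorator_ttl
    if default_cache_ttl is not None:
        # 2. Otherwise fall back to the client's default_cache_ttl.
        return default_cache_ttl
    # 3. Otherwise, store without expiration.
    return None


print(effective_ttl(decorator_ttl=30, default_cache_ttl=120))    # 30
print(effective_ttl(default_cache_ttl=120))                      # 120
print(effective_ttl(decorator_ttl=None, default_cache_ttl=120))  # None
```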
43 |
44 | #### **Stronger lock correctness with atomic `nx` support**:
45 | - Lock acquisition and the `once` decorator now rely on an atomic “set-if-not-exists” (`nx`) operation provided by the underlying cache client.
46 | - Built-in clients (in-memory, Redis examples) have been updated to implement `set(..., nx=True)` semantics for lock keys.
47 | - This significantly reduces race conditions in concurrent environments and makes lock behavior more predictable.
48 |
49 | #### **Multi-layer caching support**:
50 | - Thanks to the helper changes and the instance-scoped API, it is now straightforward to stack multiple `cached` decorators, for example:
51 | - A global cache with a long TTL; and
52 | - A local instance cache with a shorter TTL on top of it.
53 | - Calling `reset(*args, **kwargs)` on the outermost wrapper will:
54 | - Clear that wrapper’s cache entry; and
55 | - Attempt to call `reset` on the inner wrapper(s), if they expose such a method, so the entire “stack” is reset for the given arguments.
56 | - This pattern is documented in the updated `cached` reference and tutorial.
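The cascading reset described above can be modeled with a toy wrapper. The class and attribute names below are illustrative only, not py-cachify's real code; the point is that the outer layer clears its own entry and then delegates to an inner `reset` when one exists.

```python
# Toy model of cascading reset across stacked cache layers.
# `FakeCachedWrapper` is a hypothetical stand-in, not py-cachify's wrapper.

class FakeCachedWrapper:
    def __init__(self, func, name):
        self._func = func
        self._cache = {}
        self.name = name
        self.reset_log = []  # records which argument tuples were reset

    def __call__(self, *args):
        if args not in self._cache:
            self._cache[args] = self._func(*args)
        return self._cache[args]

    def reset(self, *args):
        # Clear this layer's entry...
        self._cache.pop(args, None)
        self.reset_log.append(args)
        # ...then cascade to the inner wrapper if it exposes reset().
        inner_reset = getattr(self._func, 'reset', None)
        if callable(inner_reset):
            inner_reset(*args)


inner = FakeCachedWrapper(lambda x: x * 10, 'global')  # long-TTL layer
outer = FakeCachedWrapper(inner, 'local')              # short-TTL layer

outer(3)        # populates both layers
outer.reset(3)  # clears the outer entry, then cascades to the inner one
```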
57 |
58 | #### **Stricter typing and tooling**:
59 | - Python baseline bumped to **3.9+**.
60 | - Core types updated to use `collections.abc.Awaitable` and built-in generics (`dict[...]`, `tuple[...]`, etc.).
61 | - `typing-extensions` dependency bumped (>=4.15.0) and `basedpyright` configuration added for strict type checking on the `py_cachify` package.
62 |
63 | ### Breaking Changes
64 |
65 | #### **Deprecated aliases removed**:
66 | - The following deprecated functions, announced in 2.0.0 as scheduled for removal in 3.0.0, have now been removed:
67 | - `async_cached`
68 | - `sync_cached`
69 | - `async_once`
70 | - `sync_once`
71 | - Use the unified decorators instead:
72 | - `cached` for both sync and async caching.
73 | - `once` for both sync and async “once at a time” locking.
74 |
75 | #### **Python 3.8 support dropped**:
76 | - The supported Python versions are now 3.9–3.14.
77 | - Python 3.8 is no longer supported and has been removed from the classifiers and test matrix.
78 |
79 | ### Notes on Migration from 2.x to 3.0.0
80 |
81 | #### If you implemented (used) a custom cache client:
82 | - Ensure your client supports the atomic "set-if-not-exists" semantics used by locks and `once`.
83 | - Concretely, the client should implement a `set(key, value, ttl=None, nx=False)` (or equivalent) method where:
84 | - `nx=False` behaves like a normal set; and
85 | - `nx=True` only sets the value if the key does *not* already exist, returning an appropriate success indicator.
86 | - Without `nx` semantics, lock and `once` behavior may no longer be correct in 3.0.0.
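A minimal in-memory client sketch with the `nx` semantics described above might look like the following. This is illustrative only; check py-cachify's client protocol for the exact signature your implementation must match.

```python
# Minimal in-memory client sketch implementing set(..., nx=True).
# Illustrative only -- match py-cachify's actual client protocol when
# implementing your own client.
import time


class InMemoryClient:
    def __init__(self):
        self._store = {}  # key -> (value, expires_at | None)

    def set(self, key, value, ttl=None, nx=False):
        now = time.monotonic()
        current = self._store.get(key)
        # nx=True: refuse to set if the key exists and has not expired.
        if nx and current is not None and (current[1] is None or current[1] > now):
            return False
        self._store[key] = (value, now + ttl if ttl is not None else None)
        return True

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and expires_at <= time.monotonic():
            del self._store[key]  # lazily evict expired entries
            return None
        return value
```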
87 |
88 | #### If you only used:
89 | - `init_cachify(...)`,
90 | - `cached`,
91 | - `lock`,
92 | - `once`,
93 | and **did not** use any of the deprecated aliases or internal APIs, you should be able to upgrade with no code changes.
94 | #### If you used any of the deprecated aliases:
95 | - Replace:
96 | - `sync_cached` / `async_cached` with `cached` (it works for both sync and async).
97 | - `sync_once` / `async_once` with `once`.
98 |
99 | ## [2.0.10](https://github.com/EzyGang/py-cachify/releases/tag/v2.0.10)
100 |
101 | ### Features & Enhancements
102 |
103 | - Default log level is now DEBUG
104 | - Dependencies bump
105 |
106 | ## [2.0.9](https://github.com/EzyGang/py-cachify/releases/tag/v2.0.9)
107 |
108 | ### Features & Enhancements
109 |
110 | - Better error message on the mismatch of key format params and function arguments
111 |
112 | ### Bugfixes
113 |
114 | - Fix: default arguments not being respected when crafting the cache key
115 |
116 | ## [2.0.7](https://github.com/EzyGang/py-cachify/releases/tag/v2.0.7)
117 |
118 | ### Features & Enhancements
119 |
120 | - Bump dependencies
121 | - Add Python 3.13 Support
122 |
123 | ## [2.0.4](https://github.com/EzyGang/py-cachify/releases/tag/v2.0.4)
124 |
125 | ### Features & Enhancements
126 |
127 | - Bump dependencies
128 | - Better README and Docs
129 |
130 | ## [2.0.0](https://github.com/EzyGang/py-cachify/releases/tag/v2.0.0)
131 |
132 | ### Features & Enhancements
133 | - **Lock improvements**: Locks are now way more versatile and support new parameters like:
134 | - Whether to wait for the lock to expire or not (`nowait`, boolean)
135 |   - Timeouts for how long it should try to acquire a lock (`timeout`, int | float | None)
136 | - Expiration param to prevent deadlocks (`exp`, int | None)
137 | - When using lock as a decorator or using `once` decorator two methods are being added to the wrapped function:
138 | - `is_locked(*args, **kwargs)` - to check whether the lock is acquired or not
139 | - `release(*args, **kwargs)` - to forcefully release a lock.
140 |
141 | - More info can be found [here](./reference/lock.md).
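A toy sketch of how `nowait` and `timeout` interact during acquisition, using a plain set in place of a cache backend; `exp` (lock expiration) is handled by the backend and omitted here. This is not py-cachify's actual acquisition loop, and the function name is hypothetical.

```python
# Hypothetical sketch of the lock parameter semantics: nowait fails
# fast, while timeout bounds how long acquisition keeps retrying.
import time


def try_acquire(held: set, key: str, nowait: bool = True,
                timeout=None, poll: float = 0.01) -> bool:
    deadline = None if timeout is None else time.monotonic() + timeout
    while True:
        if key not in held:
            held.add(key)  # lock was free: take it
            return True
        if nowait:
            return False  # don't wait at all
        if deadline is not None and time.monotonic() >= deadline:
            return False  # waited up to `timeout`, give up
        time.sleep(poll)  # retry until the lock frees up or we time out
```

For example, a second `try_acquire(held, 'key')` call with the default `nowait=True` returns `False` immediately, while `nowait=False, timeout=10` keeps polling for up to ten seconds.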
142 |
143 | - **File layout improved**: All internal files have been made private, helping LSPs and IDEs
144 | provide better import locations for the features py-cachify provides.
145 |
146 | - **Type annotations now feature TypeIs & Protocols**: Updated type annotations now provide even better IDE support,
147 | making it easier to write better code. They expose all methods attached to decorated functions right in your editor.
148 |
149 | - **Additional tests were added**
150 |
151 | - **`cached` decorator improvements**: There is now a new method attached to the wrapped functions called
152 | `reset(*args, **kwargs)` to allow for quick cache resets.
153 | - More info can be found [here](./reference/cached.md).
154 |
155 | - **Bump dependencies**
156 |
157 | ### Breaking Changes
158 | - **async_lock**: Async lock has been removed; replace it with `lock`, which now works in both sync and async contexts.
159 | - **import locations**: since files were renamed and moved around quite a bit,
160 | some import locations may not work after the 2.0.0 release, so I recommend re-importing the functions you use to make sure they still resolve in your project.
161 | ### Deprecations
162 | - **async_once, sync_once, async_cached, sync_cached**: These are now deprecated and scheduled for removal in 3.0.0
163 | (all of those methods are just aliases for `cached` and `once`).
164 |
165 | ### Miscellaneous
166 | - **Documentation**: Documentation was refactored and greatly improved.
167 |
168 | I recommend checking out **[full API reference](reference/init.md)** to get familiar with changes and new features.
169 |
170 | ## [1.1.2](https://github.com/EzyGang/py-cachify/releases/tag/v1.1.2)
171 |
172 | ### Features & Enhancements
173 | - **Bump dependencies**
174 | - **Docs update to include info on `init_cachify` `prefix` parameter**
175 |
176 |
177 | ## [1.1.0](https://github.com/EzyGang/py-cachify/releases/tag/v1.1.0)
178 | ### Features & Enhancements
179 | - **Custom encoders/decoders for the `cached` decorator**: `enc_dec` parameter introduced on the `cached` decorator.
180 |
181 | ### Miscellaneous
182 | - **Documentation update**
183 |
--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 |