├── .github ├── CONTRIBUTING.md ├── FUNDING.yml ├── ISSUE_TEMPLATE │ ├── bug_report.md │ └── feature_request.md ├── actions │ └── test-coverage │ │ └── action.yml ├── dependabot.yml ├── release-drafter.yml └── workflows │ ├── publish.yml │ └── test.yml ├── .gitignore ├── .readthedocs.yaml ├── CODE_OF_CONDUCT.md ├── LICENSE ├── README.md ├── SECURITY.md ├── docs ├── changelog.md ├── commands.md ├── configuration.md ├── drt-model.md ├── index.md ├── installation.md ├── media │ ├── add-args.jpg │ └── add-scheduled-job.jpg ├── requirements.txt └── usage.md ├── mkdocs.yml ├── poetry.lock ├── pyproject.toml ├── scheduler ├── __init__.py ├── admin │ ├── __init__.py │ ├── job.py │ └── redis_models.py ├── apps.py ├── decorators.py ├── management │ ├── __init__.py │ └── commands │ │ ├── __init__.py │ │ ├── delete_failed_executions.py │ │ ├── export.py │ │ ├── import.py │ │ ├── rqstats.py │ │ ├── rqworker.py │ │ └── run_job.py ├── migrations │ ├── 0001_initial_squashed_0005_added_result_ttl.py │ ├── 0002_alter_cronjob_id_alter_repeatablejob_id_and_more.py │ ├── 0003_auto_20220329_2107.py │ ├── 0004_cronjob_at_front_repeatablejob_at_front_and_more.py │ ├── 0005_alter_cronjob_at_front_alter_repeatablejob_at_front_and_more.py │ ├── 0006_auto_20230118_1640.py │ ├── 0007_add_result_ttl.py │ ├── 0008_rename_str_val_jobarg_val_and_more.py │ ├── 0009_alter_jobarg_arg_type_alter_jobarg_val_and_more.py │ ├── 0010_queue.py │ ├── 0011_worker_alter_queue_options_alter_cronjob_at_front_and_more.py │ ├── 0012_alter_cronjob_name_alter_repeatablejob_name_and_more.py │ ├── 0013_alter_cronjob_queue_alter_repeatablejob_queue_and_more.py │ └── __init__.py ├── models │ ├── __init__.py │ ├── args.py │ ├── queue.py │ ├── scheduled_job.py │ └── worker.py ├── py.typed ├── queues.py ├── rq_classes.py ├── settings.py ├── templates │ └── admin │ │ └── scheduler │ │ ├── change_form.html │ │ ├── change_list.html │ │ ├── confirm_action.html │ │ ├── job_detail.html │ │ ├── jobs-list.partial.html │ │ ├── jobs.html │ │ ├── queue_workers.html │ │ ├── scheduler_base.html │ │ ├── single_job_action.html │ │ ├── stats.html │ │ ├── worker_details.html │ │ ├── workers-list.partial.html │ │ └── workers.html ├── templatetags │ ├── __init__.py │ └── scheduler_tags.py ├── tests │ ├── __init__.py │ ├── jobs.py │ ├── test_internals.py │ ├── test_job_arg_models.py │ ├── test_job_decorator.py │ ├── test_mgmt_cmds.py │ ├── test_models.py │ ├── test_redis_models.py │ ├── test_settings.py │ ├── test_views.py │ ├── test_worker.py │ └── testtools.py ├── tools.py ├── urls.py └── views.py └── testproject ├── manage.py └── testproject ├── __init__.py ├── settings.py ├── urls.py ├── views.py └── wsgi.py /.github/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # Contributing to django-rq-scheduler 4 | 5 | First off, thanks for taking the time to contribute! ❤️ 6 | 7 | All types of contributions are encouraged and valued. See the [Table of Contents](#table-of-contents) for different ways 8 | to help and details about how this project handles them. Please make sure to read the relevant section before making 9 | your contribution. It will make it a lot easier for us maintainers and smooth out the experience for all involved. The 10 | community looks forward to your contributions. 🎉 11 | 12 | > And if you like the project, but just don't have time to contribute, that's fine. 
There are other easy ways to support 13 | > the project and show your appreciation, which we would also be very happy about: 14 | > - Star the project 15 | > - Tweet about it 16 | > - Refer this project in your project's readme 17 | > - Mention the project at local meetups and tell your friends/colleagues 18 | 19 | 20 | 21 | ## Table of Contents 22 | 23 | - [Code of Conduct](#code-of-conduct) 24 | - [I Have a Question](#i-have-a-question) 25 | - [I Want To Contribute](#i-want-to-contribute) 26 | - [Reporting Bugs](#reporting-bugs) 27 | - [Suggesting Enhancements](#suggesting-enhancements) 28 | - [Your First Code Contribution](#your-first-code-contribution) 29 | - [Improving The Documentation](#improving-the-documentation) 30 | - [Style guides](#style-guides) 31 | - [Commit Messages](#commit-messages) 32 | - [Join The Project Team](#join-the-project-team) 33 | 34 | ## Code of Conduct 35 | 36 | This project and everyone participating in it is governed by the 37 | [django-rq-scheduler Code of Conduct](https://github.com/dsoftwareinc/django-rq-scheduler/blob/main/CODE_OF_CONDUCT.md). 38 | By participating, you are expected to uphold this code. Please report unacceptable behavior 39 | to . 40 | 41 | ## I Have a Question 42 | 43 | > If you want to ask a question, we assume that you have read the 44 | > available [Documentation](https://github.com/dsoftwareinc/django-rq-scheduler). 45 | 46 | Before you ask a question, it is best to search for 47 | existing [Issues](https://github.com/dsoftwareinc/django-rq-scheduler/issues) that might help you. In case you have 48 | found a suitable issue and still need clarification, you can write your question in this issue. It is also advisable to 49 | search the internet for answers first. 50 | 51 | If you then still feel the need to ask a question and need clarification, we recommend the following: 52 | 53 | - Open an [Issue](https://github.com/dsoftwareinc/django-rq-scheduler/issues/new). 54 | - Provide as much context as you can about what you're running into. 55 | - Provide project and platform versions (nodejs, npm, etc), depending on what seems relevant. 56 | 57 | We will then take care of the issue as soon as possible. 58 | 59 | 73 | 74 | ## I Want To Contribute 75 | 76 | > ### Legal Notice 77 | > When contributing to this project, you must agree that you have authored 100% of the content, that you have the 78 | > necessary rights to the content and that the content you contribute may be provided under the project license. 79 | 80 | ### Reporting Bugs 81 | 82 | 83 | 84 | #### Before Submitting a Bug Report 85 | 86 | A good bug report shouldn't leave others needing to chase you up for more information. Therefore, we ask you to 87 | investigate carefully, collect information and describe the issue in detail in your report. Please complete the 88 | following steps in advance to help us fix any potential bug as fast as possible. 89 | 90 | - Make sure that you are using the latest version. 91 | - Determine if your bug is really a bug and not an error on your side e.g. using incompatible environment 92 | components/versions (Make sure that you have read 93 | the [documentation](https://github.com/dsoftwareinc/django-rq-scheduler). If you are looking for support, you might 94 | want to check [this section](#i-have-a-question)). 
95 | - To see if other users have experienced (and potentially already solved) the same issue you are having, check if there 96 | is not already a bug report existing for your bug or error in 97 | the [bug tracker](https://github.com/dsoftwareinc/django-rq-scheduler/issues?q=label%3Abug). 98 | - Also make sure to search the internet (including Stack Overflow) to see if users outside the GitHub community have 99 | discussed the issue. 100 | - Collect information about the bug: 101 | - Stack trace (Traceback) 102 | - OS, Platform and Version (Windows, Linux, macOS, x86, ARM) 103 | - Version of the interpreter, compiler, SDK, runtime environment, package manager, depending on what seems relevant. 104 | - Possibly your input and the output 105 | - Can you reliably reproduce the issue? And can you also reproduce it with older versions? 106 | 107 | 108 | 109 | #### How Do I Submit a Good Bug Report? 110 | 111 | > You must never report security related issues, vulnerabilities or bugs including sensitive information to the issue 112 | > tracker, or elsewhere in public. Instead, sensitive bugs must be sent by email to . 113 | 114 | 115 | We use GitHub issues to track bugs and errors. If you run into an issue with the project: 116 | 117 | - Open an [Issue](https://github.com/dsoftwareinc/django-rq-scheduler/issues/new). (Since we can't be sure at this point 118 | whether it is a bug or not, we ask you not to talk about a bug yet and not to label the issue.) 119 | - Explain the behavior you would expect and the actual behavior. 120 | - Please provide as much context as possible and describe the *reproduction steps* that someone else can follow to 121 | recreate the issue on their own. This usually includes your code. For good bug reports you should isolate the problem 122 | and create a reduced test case. 123 | - Provide the information you collected in the previous section. 124 | 125 | Once it's filed: 126 | 127 | - The project team will label the issue accordingly. 128 | - A team member will try to reproduce the issue with your provided steps. If there are no reproduction steps or no 129 | obvious way to reproduce the issue, the team will ask you for those steps and mark the issue as `needs-repro`. Bugs 130 | with the `needs-repro` tag will not be addressed until they are reproduced. 131 | - If the team is able to reproduce the issue, it will be marked `needs-fix`, as well as possibly other tags (such 132 | as `critical`), and the issue will be left to be [implemented by someone](#your-first-code-contribution). 133 | 134 | 135 | 136 | ### Suggesting Enhancements 137 | 138 | This section guides you through submitting an enhancement suggestion for django-rq-scheduler, **including completely new 139 | features and minor improvements to existing functionality**. Following these guidelines will help maintainers and the 140 | community to understand your suggestion and find related suggestions. 141 | 142 | 143 | 144 | #### Before Submitting an Enhancement 145 | 146 | - Make sure that you are using the latest version. 147 | - Read the [documentation](https://github.com/dsoftwareinc/django-rq-scheduler) carefully and find out if the 148 | functionality is already covered, maybe by an individual configuration. 149 | - Perform a [search](https://github.com/dsoftwareinc/django-rq-scheduler/issues) to see if the enhancement has already 150 | been suggested. If it has, add a comment to the existing issue instead of opening a new one. 
151 | - Find out whether your idea fits with the scope and aims of the project. It's up to you to make a strong case to 152 | convince the project's developers of the merits of this feature. Keep in mind that we want features that will be 153 | useful to the majority of our users and not just a small subset. If you're just targeting a minority of users, 154 | consider writing an add-on/plugin library. 155 | 156 | 157 | 158 | #### How Do I Submit a Good Enhancement Suggestion? 159 | 160 | Enhancement suggestions are tracked as [GitHub issues](https://github.com/dsoftwareinc/django-rq-scheduler/issues). 161 | 162 | - Use a **clear and descriptive title** for the issue to identify the suggestion. 163 | - Provide a **step-by-step description of the suggested enhancement** in as much detail as possible. 164 | - **Describe the current behavior** and **explain which behavior you expected to see instead** and why. At this point 165 | you can also tell which alternatives do not work for you. 166 | - You may want to **include screenshots and animated GIFs** which help you demonstrate the steps or point out the part 167 | the suggestion relates to. You can use [this tool](https://www.cockos.com/licecap/) to record GIFs on macOS 168 | and Windows, and [this tool](https://github.com/colinkeenan/silentcast) 169 | or [this tool](https://github.com/GNOME/byzanz) on 170 | Linux. 171 | - **Explain why this enhancement would be useful** to most django-rq-scheduler users. You may also want to point out 172 | other projects that solved it better and could serve as inspiration. 173 | 174 | 175 | 176 | ### Your First Code Contribution 177 | 178 | 182 | 183 | ### Improving The Documentation 184 | 185 | 189 | 190 | ## Style guides 191 | 192 | ### Commit Messages 193 | 194 | Taken from [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/): 195 | 196 | ``` 197 | <type>[optional scope]: <description> 198 | 199 | [optional body] 200 | 201 | [optional footer(s)] 202 | ``` 203 | 204 | The commit contains the following structural elements, to communicate intent to the consumers of your library: 205 | 206 | * `fix:` a commit of the type fix patches a bug in your codebase (this correlates with `PATCH` in Semantic Versioning). 207 | * `feat:` a commit of the type feat introduces a new feature to the codebase (this correlates with `MINOR` in Semantic 208 | Versioning). 209 | * `BREAKING CHANGE:` a commit that has a footer BREAKING CHANGE:, or appends a ! after the type/scope, introduces a 210 | breaking API change (correlating with MAJOR in Semantic Versioning). A BREAKING CHANGE can be part of commits of any 211 | type. 212 | * types other than `fix:` and `feat:` are allowed, for example @commitlint/config-conventional (based on the Angular 213 | convention) recommends `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`, and others. 214 | * footers other than `BREAKING CHANGE: <description>` may be provided and follow a convention similar to 215 | [git trailer format](https://git-scm.com/docs/git-interpret-trailers). 216 | 217 | Additional types are not mandated by the Conventional Commits specification, and have no implicit effect in Semantic 218 | Versioning (unless they include a BREAKING CHANGE). A scope may be provided to a commit’s type, to provide additional 219 | contextual information and is contained within parentheses, e.g., feat(parser): add ability to parse arrays.
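For illustration, a commit message following this convention might look like the example below (the scope, description, and trailer are made up for this sketch):

```
feat(scheduler): add ability to pause repeatable jobs

Allow a repeatable job to be paused from the Django admin without deleting it.

Refs: #123
```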
220 | 221 | ## Join The Project Team 222 | 223 | 224 | 225 | 226 | 227 | ## Attribution 228 | 229 | This guide is based on the **contributing-gen**. [Make your own](https://github.com/bttger/contributing-gen)! 230 | -------------------------------------------------------------------------------- /.github/FUNDING.yml: -------------------------------------------------------------------------------- 1 | github: cunla 2 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: bug 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | A clear and concise description of what the bug is. 12 | 13 | **To Reproduce** 14 | Steps to reproduce the behavior: 15 | 1. Go to '...' 16 | 2. Click on '....' 17 | 3. Scroll down to '....' 18 | 4. See error 19 | 20 | **Expected behavior** 21 | A clear and concise description of what you expected to happen. 22 | 23 | **Screenshots** 24 | If applicable, add screenshots to help explain your problem. 25 | 26 | **Desktop (please complete the following information):** 27 | - OS: [e.g. iOS] 28 | - python version 29 | - django version 30 | - django-rq version 31 | - requirements.txt? 32 | 33 | **Additional context** 34 | Add any other context about the problem here. 35 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: enhancement 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here. 21 | -------------------------------------------------------------------------------- /.github/actions/test-coverage/action.yml: -------------------------------------------------------------------------------- 1 | name: Run Tests with coverage 2 | description: 'Run tests with coverage and publish results to PR' 3 | inputs: 4 | pythonVer: 5 | description: 'python version' 6 | required: true 7 | djangoVer: 8 | description: 'django version' 9 | required: true 10 | repoToken: 11 | description: 'Token for PR comment' 12 | required: true 13 | outputs: 14 | coverage: 15 | description: "Coverage" 16 | value: ${{ steps.json-report.outputs.coverage }} 17 | runs: 18 | using: "composite" 19 | steps: 20 | - name: Run Tests with coverage 21 | shell: bash 22 | run: | 23 | cd testproject 24 | poetry run coverage run manage.py test scheduler 25 | - name: Coverage report 26 | id: coverage_report 27 | shell: bash 28 | run: | 29 | mv testproject/.coverage . 
30 | echo 'REPORT<> $GITHUB_ENV 31 | poetry run coverage report >> $GITHUB_ENV 32 | echo 'EOF' >> $GITHUB_ENV 33 | - name: json report 34 | id: json-report 35 | shell: bash 36 | run: | 37 | poetry run coverage json 38 | echo "COVERAGE=$(jq '.totals.percent_covered_display|tonumber' coverage.json)" >> $GITHUB_ENV 39 | - uses: mshick/add-pr-comment@v2 40 | if: ${{ github.event_name == 'pull_request' }} 41 | with: 42 | message: | 43 | Coverage report python v${{ inputs.pythonVer }} django v${{ inputs.djangoVer }} 44 | ``` 45 | ${{ env.REPORT }} 46 | ``` 47 | repo-token: ${{ inputs.repoToken }} 48 | repo-token-user-login: 'github-actions[bot]' 49 | allow-repeats: true 50 | -------------------------------------------------------------------------------- /.github/dependabot.yml: -------------------------------------------------------------------------------- 1 | # To get started with Dependabot version updates, you'll need to specify which 2 | # package ecosystems to update and where the package manifests are located. 3 | # Please see the documentation for all configuration options: 4 | # https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates 5 | 6 | version: 2 7 | updates: 8 | - package-ecosystem: "pip" 9 | directory: "/" 10 | schedule: 11 | interval: "daily" 12 | -------------------------------------------------------------------------------- /.github/release-drafter.yml: -------------------------------------------------------------------------------- 1 | name-template: 'v$RESOLVED_VERSION 🌈' 2 | tag-template: 'v$RESOLVED_VERSION' 3 | categories: 4 | - title: '🚀 Features' 5 | labels: 6 | - 'feature' 7 | - 'enhancement' 8 | - title: '🐛 Bug Fixes' 9 | labels: 10 | - 'fix' 11 | - 'bugfix' 12 | - 'bug' 13 | - title: '🧰 Maintenance' 14 | label: 'chore' 15 | change-template: '- $TITLE @$AUTHOR (#$NUMBER)' 16 | change-title-escapes: '\<*_&' # You can add # and @ to disable mentions, and add ` to disable code blocks. 17 | version-resolver: 18 | major: 19 | labels: 20 | - 'major' 21 | minor: 22 | labels: 23 | - 'minor' 24 | patch: 25 | labels: 26 | - 'patch' 27 | default: patch 28 | template: | 29 | ## Changes 30 | 31 | $CHANGES -------------------------------------------------------------------------------- /.github/workflows/publish.yml: -------------------------------------------------------------------------------- 1 | # This workflow will upload a Python Package using Twine when a release is created 2 | # For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries 3 | 4 | # This workflow uses actions that are not certified by GitHub. 5 | # They are provided by a third-party and are governed by 6 | # separate terms of service, privacy policy, and support 7 | # documentation. 
8 | 9 | name: Upload Python Package 10 | 11 | on: 12 | release: 13 | types: [published] 14 | 15 | jobs: 16 | deploy: 17 | runs-on: ubuntu-latest 18 | steps: 19 | - uses: actions/checkout@v3 20 | - name: Set up Python 21 | uses: actions/setup-python@v4 22 | with: 23 | python-version: '3.11' 24 | cache-dependency-path: poetry.lock 25 | - name: Install dependencies 26 | run: | 27 | python -m pip install --upgrade pip 28 | pip install build 29 | - name: Build package 30 | run: python -m build 31 | - name: Publish package 32 | uses: pypa/gh-action-pypi-publish@release/v1 33 | with: 34 | user: __token__ 35 | password: ${{ secrets.PYPI_API_TOKEN }} 36 | -------------------------------------------------------------------------------- /.github/workflows/test.yml: -------------------------------------------------------------------------------- 1 | name: Django CI 2 | 3 | on: 4 | pull_request_target: 5 | branches: 6 | - master 7 | push: 8 | branches: 9 | - master 10 | workflow_dispatch: 11 | 12 | jobs: 13 | flake8: 14 | runs-on: ubuntu-latest 15 | name: "flake8 on code" 16 | steps: 17 | - uses: actions/checkout@v3 18 | - name: Set up Python ${{ matrix.python-version }} 19 | uses: actions/setup-python@v4 20 | with: 21 | python-version: "3.11" 22 | cache-dependency-path: poetry.lock 23 | - name: Install poetry and dependencies 24 | shell: bash 25 | run: | 26 | python -m pip --quiet install poetry 27 | echo "$HOME/.poetry/bin" >> $GITHUB_PATH 28 | poetry install 29 | - name: Run flake8 30 | shell: bash 31 | run: | 32 | poetry run flake8 scheduler/ 33 | testFakeRedis: 34 | needs: [ 'flake8' ] 35 | runs-on: ubuntu-latest 36 | strategy: 37 | max-parallel: 6 38 | matrix: 39 | python-version: [ '3.9', '3.10', '3.11' ] 40 | django-version: [ '3.2.19', '4.2.1' ] 41 | outputs: 42 | version: ${{ steps.getVersion.outputs.VERSION }} 43 | steps: 44 | - uses: actions/checkout@v3 45 | - name: Set up Python ${{ matrix.python-version }} 46 | uses: actions/setup-python@v4 47 | with: 48 | python-version: ${{ matrix.python-version }} 49 | cache-dependency-path: poetry.lock 50 | - name: Install poetry and dependencies 51 | shell: bash 52 | run: | 53 | python -m pip --quiet install poetry 54 | echo "$HOME/.poetry/bin" >> $GITHUB_PATH 55 | poetry install -E yaml 56 | poetry run pip install django==${{ matrix.django-version }} 57 | 58 | - name: Get version 59 | id: getVersion 60 | shell: bash 61 | run: | 62 | VERSION=$(poetry version -s --no-ansi -n) 63 | echo "VERSION=$VERSION" >> $GITHUB_OUTPUT 64 | - name: Run Tests without coverage 65 | run: | 66 | cd testproject 67 | export FAKEREDIS=True 68 | poetry run python manage.py test scheduler 69 | 70 | testRedis: 71 | needs: [ 'flake8' ] 72 | runs-on: ubuntu-latest 73 | strategy: 74 | max-parallel: 6 75 | matrix: 76 | python-version: [ '3.9', '3.10', '3.11' ] 77 | django-version: [ '3.2.19', '4.2.1' ] 78 | include: 79 | - python-version: '3.11' 80 | django-version: '4.2.1' 81 | coverage: yes 82 | services: 83 | redis: 84 | image: redis:7.0.7 85 | ports: 86 | - 6379:6379 87 | options: >- 88 | --health-cmd "redis-cli ping" 89 | --health-interval 10s 90 | --health-timeout 5s 91 | --health-retries 5 92 | outputs: 93 | version: ${{ steps.getVersion.outputs.VERSION }} 94 | steps: 95 | - uses: actions/checkout@v3 96 | - name: Set up Python ${{ matrix.python-version }} 97 | uses: actions/setup-python@v4 98 | with: 99 | python-version: ${{ matrix.python-version }} 100 | cache-dependency-path: poetry.lock 101 | - name: Install poetry and dependencies 102 | shell: bash 103 | run: | 104 | 
python -m pip --quiet install poetry 105 | echo "$HOME/.poetry/bin" >> $GITHUB_PATH 106 | poetry install -E yaml 107 | poetry run pip install django==${{ matrix.django-version }} 108 | 109 | - name: Get version 110 | id: getVersion 111 | shell: bash 112 | run: | 113 | VERSION=$(poetry version -s --no-ansi -n) 114 | echo "VERSION=$VERSION" >> $GITHUB_OUTPUT 115 | - name: Run Tests without coverage 116 | if: ${{ matrix.coverage != 'yes' }} 117 | run: | 118 | cd testproject 119 | poetry run python manage.py test scheduler 120 | # Steps for coverage check 121 | - name: Run tests with coverage 122 | uses: ./.github/actions/test-coverage 123 | if: ${{ matrix.coverage == 'yes' }} 124 | with: 125 | pythonVer: ${{ matrix.python-version }} 126 | djangoVer: ${{ matrix.django-version }} 127 | repoToken: ${{ secrets.GITHUB_TOKEN }} 128 | - name: Create coverage badge 129 | if: ${{ matrix.coverage == 'yes' && github.event_name == 'push' }} 130 | uses: schneegans/dynamic-badges-action@v1.6.0 131 | with: 132 | auth: ${{ secrets.GIST_SECRET }} 133 | gistID: b756396efb895f0e34558c980f1ca0c7 134 | filename: django-rq-scheduler-4.json 135 | label: coverage 136 | message: ${{ env.COVERAGE }}% 137 | color: green 138 | # Prepare a draft release for GitHub Releases page for the manual verification 139 | # If accepted and published, release workflow would be triggered 140 | update_release_draft: 141 | permissions: 142 | # write permission is required to create a GitHub release 143 | contents: write 144 | # write permission is required for auto-labeler 145 | # otherwise, read permission is required at least 146 | pull-requests: write 147 | needs: testRedis 148 | runs-on: ubuntu-latest 149 | steps: 150 | - uses: release-drafter/release-drafter@v5 151 | env: 152 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 153 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__/ 2 | *.py[cod] 3 | *$py.class 4 | 5 | *.so 6 | 7 | .Python 8 | .venv/ 9 | docker-compose.yml 10 | env/ 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | *.egg-info/ 23 | .installed.cfg 24 | *.egg 25 | *.manifest 26 | *.spec 27 | 28 | pip-log.txt 29 | pip-delete-this-directory.txt 30 | htmlcov/ 31 | .tox/ 32 | .coverage 33 | .coverage.* 34 | .cache 35 | nosetests.xml 36 | coverage.xml 37 | *,cover 38 | .hypothesis/ 39 | *.mo 40 | *.pot 41 | *.log 42 | docs/_build/ 43 | target/ 44 | 45 | .ipynb_checkpoints 46 | .idea 47 | *.sqlite3 48 | .DS_Store 49 | *.iml 50 | -------------------------------------------------------------------------------- /.readthedocs.yaml: -------------------------------------------------------------------------------- 1 | version: 2 2 | build: 3 | os: "ubuntu-20.04" 4 | tools: 5 | python: "3.11" 6 | 7 | mkdocs: 8 | configuration: mkdocs.yml 9 | fail_on_warning: false 10 | 11 | python: 12 | install: 13 | - requirements: docs/requirements.txt 14 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Code of Conduct - django-rq-scheduler 2 | 3 | ## Our Pledge 4 | 5 | In the interest of fostering an open and welcoming environment, we as 6 | contributors and maintainers pledge to make participation in our project and 7 | our community a harassment-free experience for everyone, 
regardless of age, body 8 | size, disability, ethnicity, sex characteristics, gender identity and expression, 9 | level of experience, education, socioeconomic status, nationality, personal 10 | appearance, race, religion, or sexual identity and orientation. 11 | 12 | ## Our Standards 13 | 14 | Examples of behavior that contributes to a positive environment for our 15 | community include: 16 | 17 | * Demonstrating empathy and kindness toward other people 18 | * Being respectful of differing opinions, viewpoints, and experiences 19 | * Giving and gracefully accepting constructive feedback 20 | * Accepting responsibility and apologizing to those affected by our mistakes, 21 | and learning from the experience 22 | * Focusing on what is best not just for us as individuals, but for the 23 | overall community 24 | 25 | Examples of unacceptable behavior include: 26 | 27 | * The use of sexualized language or imagery, and sexual attention or 28 | advances 29 | * Trolling, insulting or derogatory comments, and personal or political attacks 30 | * Public or private harassment 31 | * Publishing others' private information, such as a physical or email 32 | address, without their explicit permission 33 | * Other conduct which could reasonably be considered inappropriate in a 34 | professional setting 35 | 36 | ## Our Responsibilities 37 | 38 | Project maintainers are responsible for clarifying and enforcing our standards of 39 | acceptable behavior and will take appropriate and fair corrective action in 40 | response to any behavior that they deem inappropriate, 41 | threatening, offensive, or harmful. 42 | 43 | Project maintainers have the right and responsibility to remove, edit, or reject 44 | comments, commits, code, wiki edits, issues, and other contributions that are 45 | not aligned to this Code of Conduct, and will 46 | communicate reasons for moderation decisions when appropriate. 47 | 48 | ## Scope 49 | 50 | This Code of Conduct applies within all community spaces, and also applies when 51 | an individual is officially representing the community in public spaces. 52 | Examples of representing our community include using an official e-mail address, 53 | posting via an official social media account, or acting as an appointed 54 | representative at an online or offline event. 55 | 56 | ## Enforcement 57 | 58 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 59 | reported to the community leaders responsible for enforcement at . 60 | All complaints will be reviewed and investigated promptly and fairly. 61 | 62 | All community leaders are obligated to respect the privacy and security of the 63 | reporter of any incident. 64 | 65 | ## Enforcement Guidelines 66 | 67 | Community leaders will follow these Community Impact Guidelines in determining 68 | the consequences for any action they deem in violation of this Code of Conduct: 69 | 70 | ### 1. Correction 71 | 72 | **Community Impact**: Use of inappropriate language or other behavior deemed 73 | unprofessional or unwelcome in the community. 74 | 75 | **Consequence**: A private, written warning from community leaders, providing 76 | clarity around the nature of the violation and an explanation of why the 77 | behavior was inappropriate. A public apology may be requested. 78 | 79 | ### 2. Warning 80 | 81 | **Community Impact**: A violation through a single incident or series 82 | of actions. 83 | 84 | **Consequence**: A warning with consequences for continued behavior. 
No 85 | interaction with the people involved, including unsolicited interaction with 86 | those enforcing the Code of Conduct, for a specified period of time. This 87 | includes avoiding interactions in community spaces as well as external channels 88 | like social media. Violating these terms may lead to a temporary or 89 | permanent ban. 90 | 91 | ### 3. Temporary Ban 92 | 93 | **Community Impact**: A serious violation of community standards, including 94 | sustained inappropriate behavior. 95 | 96 | **Consequence**: A temporary ban from any sort of interaction or public 97 | communication with the community for a specified period of time. No public or 98 | private interaction with the people involved, including unsolicited interaction 99 | with those enforcing the Code of Conduct, is allowed during this period. 100 | Violating these terms may lead to a permanent ban. 101 | 102 | ### 4. Permanent Ban 103 | 104 | **Community Impact**: Demonstrating a pattern of violation of community 105 | standards, including sustained inappropriate behavior, harassment of an 106 | individual, or aggression toward or disparagement of classes of individuals. 107 | 108 | **Consequence**: A permanent ban from any sort of public interaction within 109 | the community. 110 | 111 | ## Attribution 112 | 113 | This Code of Conduct is adapted from the [Contributor Covenant](https://contributor-covenant.org/), version 114 | [1.4](https://www.contributor-covenant.org/version/1/4/code-of-conduct/code_of_conduct.md) and 115 | [2.0](https://www.contributor-covenant.org/version/2/0/code_of_conduct/code_of_conduct.md), 116 | and was generated by [contributing-gen](https://github.com/bttger/contributing-gen). -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2022- Daniel Moran, (Before 2016 - iStrategyLabs, LLC) 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NON INFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | Django RQ Scheduler 2 | =================== 3 | This project has been migrated to [django-tasks-scheduler](https://github.com/dsoftwareinc/django-tasks-scheduler). 
4 | -------------------------------------------------------------------------------- /SECURITY.md: -------------------------------------------------------------------------------- 1 | # Security Policy 2 | 3 | ## Supported Versions 4 | 5 | | Version | Supported | 6 | |-------------|--------------------| 7 | | 2023.latest | :white_check_mark: | 8 | 9 | ## Reporting a Vulnerability 10 | 11 | To report a security vulnerability, please use the 12 | [Tidelift security contact](https://tidelift.com/security). 13 | Tidelift will coordinate the fix and disclosure. -------------------------------------------------------------------------------- /docs/changelog.md: -------------------------------------------------------------------------------- 1 | # Changelog 2 | 3 | ## v2023.6.1 🌈 4 | 5 | ### 🚀 Features 6 | 7 | ### 🐛 Bug Fixes 8 | 9 | * Minor fix on export model 10 | 11 | ### 🧰 Maintenance 12 | 13 | * Simplified admin + model code significantly 14 | * Remove need for django-model-utils dependency 15 | * Fix GitHub actions 16 | * Fix security issues 17 | * Clean direct redis connection calls from views.py 18 | 19 | ## v2023.6.0 🌈 20 | 21 | ### 🚀 Features 22 | 23 | * Extend the queue name to 255 characters @gavaig (#138) 24 | 25 | ### 🧰 Maintenance 26 | 27 | * Update dependencies 28 | * Create SECURITY.md @cunla (#125) 29 | 30 | ## v2023.5.0 🌈 31 | 32 | ### 🚀 Breaking changes 33 | 34 | * Remove django-rq dependency 35 | * Remove threaded scheduler support 36 | 37 | ### 🚀 Features 38 | 39 | * Migrate all required features from django-rq: 40 | * management commands to create worker (rqworker), stats, etc. 41 | * admin view of queues 42 | * admin view for workers. 43 | * admin views are significantly more informative. 44 | * job-ids and worker-ids are more informative. 45 | * Added ability to cancel ongoing job. 46 | * job executions inline in each job. 47 | 48 | ### 🚀 Roadmap 49 | 50 | * Merge all scheduled jobs to one model 51 | 52 | ## v2023.4.0 🌈 53 | 54 | ### 🚀 Features 55 | 56 | * Add management commands to export and import models. 57 | * Add Run Now @gabriels1234 (#106) 58 | 59 | ### 🧰 Maintenance 60 | 61 | * Bump poetry from 1.4.0 to 1.4.1 @dependabot (#107) 62 | * Bump flake8-pyproject from 1.2.2 to 1.2.3 @dependabot (#110) 63 | * Bump fakeredis from 2.10.1 to 2.10.2 @dependabot (#111) 64 | * Bump coverage from 7.2.1 to 7.2.2 @dependabot (#104) 65 | 66 | ## v2023.3.2 🌈 67 | 68 | * Add missing migration 69 | 70 | ## v2023.3.1 🌈 71 | 72 | * Fix: error on django-admin when internal scheduler is off 73 | 74 | ## v2023.3.0 🌈 75 | 76 | ### 🐛 Bug Fixes 77 | 78 | * fixed validation of callable field @mazhor90 (#93) 79 | 80 | ## v2023.2.0 🌈 81 | 82 | ### 🚀 Features 83 | 84 | * Start working on documentation on https://django-rq-scheduler.readthedocs.io/en/latest/ 85 | 86 | ### 🐛 Bug Fixes 87 | 88 | * Hotfix new cron @gabriels1234 (#79) 89 | * Make at_front nullable @cunla (#77) 90 | 91 | ## v2023.3.0 🌈 92 | 93 | ### 🚀 Breaking changes 94 | 95 | * Remove rq-scheduler dependency 96 | 97 | ### 🚀 Features 98 | 99 | * Add support for scheduling at front of the queue #65 100 | -------------------------------------------------------------------------------- /docs/commands.md: -------------------------------------------------------------------------------- 1 | # Management commands 2 | 3 | ## rqworker 4 | 5 | Create a new worker with a scheduler for specific queues by order of priority. 6 | If no queues are specified, will run on default queue only. 
7 | 8 | All queues must have the same redis settings on `SCHEDULER_QUEUES`. 9 | 10 | ```shell 11 | python manage.py rqworker queue1 queue2 queue3 12 | 13 | ``` 14 | 15 | ## export 16 | 17 | Export all scheduled jobs from django db to json/yaml format. 18 | 19 | ```shell 20 | python manage.py export -o {yaml,json} 21 | ``` 22 | 23 | Result should be (for json): 24 | 25 | ```json 26 | [ 27 | { 28 | "model": "ScheduledJob", 29 | "name": "Scheduled Job 1", 30 | "callable": "scheduler.tests.test_job", 31 | "callable_args": [ 32 | { 33 | "arg_type": "datetime", 34 | "val": "2022-02-01" 35 | } 36 | ], 37 | "callable_kwargs": [], 38 | "enabled": true, 39 | "queue": "default", 40 | "at_front": false, 41 | "timeout": null, 42 | "result_ttl": null, 43 | "scheduled_time": "2023-02-21T14:06:06" 44 | }, 45 | ... 46 | ] 47 | ``` 48 | 49 | ## import 50 | 51 | A json/yaml that was exported using the `export` command 52 | can be imported to django. 53 | 54 | - Specify the source file using `--filename` or take it from the standard input (default). 55 | - Reset all scheduled jobs in the database before importing using `-r`/`--reset`. 56 | - Update existing jobs for names that are found using `-u`/`--update`. 57 | 58 | ```shell 59 | python manage.py import -f {yaml,json} --filename {SOURCE-FILE} 60 | ``` 61 | 62 | ## run_job 63 | 64 | Run a method in a queue immediately. 65 | 66 | ```shell 67 | python manage.py run_job {callable} {callable args ...} 68 | ``` 69 | 70 | ## delete failed jobs 71 | 72 | Run this to empty failed jobs registry from a queue. 73 | 74 | ```shell 75 | python manage.py delete_failed_jobs 76 | ``` 77 | -------------------------------------------------------------------------------- /docs/configuration.md: -------------------------------------------------------------------------------- 1 | # Configure your django-rq-scheduler 2 | 3 | ## settings.py 4 | 5 | All default settings for scheduler can be in one dictionary in `settings.py`: 6 | 7 | ```python 8 | SCHEDULER_CONFIG = { 9 | 'EXECUTIONS_IN_PAGE': 20, 10 | 'DEFAULT_RESULT_TTL': 500, 11 | 'DEFAULT_TIMEOUT': 300, # 5 minutes 12 | 'SCHEDULER_INTERVAL': 10, # 10 seconds 13 | } 14 | SCHEDULER_QUEUES = { 15 | 'default': { 16 | 'HOST': 'localhost', 17 | 'PORT': 6379, 18 | 'DB': 0, 19 | 'USERNAME': 'some-user', 20 | 'PASSWORD': 'some-password', 21 | 'DEFAULT_TIMEOUT': 360, 22 | 'REDIS_CLIENT_KWARGS': { # Eventual additional Redis connection arguments 23 | 'ssl_cert_reqs': None, 24 | }, 25 | }, 26 | 'high': { 27 | 'URL': os.getenv('REDISTOGO_URL', 'redis://localhost:6379/0'), # If you're on Heroku 28 | 'DEFAULT_TIMEOUT': 500, 29 | }, 30 | 'low': { 31 | 'HOST': 'localhost', 32 | 'PORT': 6379, 33 | 'DB': 0, 34 | } 35 | } 36 | ``` 37 | 38 | ### SCHEDULER_CONFIG: `EXECUTIONS_IN_PAGE` 39 | 40 | Number of job executions to show in a page in a ScheduledJob admin view. 41 | 42 | Default: `20`. 43 | 44 | ### SCHEDULER_CONFIG: `DEFAULT_RESULT_TTL` 45 | 46 | Default time to live for job execution result. 47 | 48 | Default: `600` (10 minutes). 49 | 50 | ### SCHEDULER_CONFIG: `DEFAULT_TIMEOUT` 51 | 52 | Default timeout for job when it is not mentioned in queue. 53 | Default: `300` (5 minutes). 54 | 55 | ### SCHEDULER_CONFIG: `SCHEDULER_INTERVAL` 56 | 57 | Default scheduler interval, a scheduler is a subprocess of a worker and 58 | will check which job executions are pending. 59 | 60 | Default: `10` (10 seconds). 61 | 62 | ### `SCHEDULER_QUEUES` 63 | 64 | You can configure the queues to work with. 
65 | That way you can have different workers listening to different queues. 66 | 67 | Different queues can use different redis servers/connections. 68 | -------------------------------------------------------------------------------- /docs/drt-model.md: -------------------------------------------------------------------------------- 1 | # Worker related flows 2 | 3 | Running `python manage.py startworker --name 'X' --queues high default low` 4 | 5 | ## Register new worker for queues 6 | ```mermaid 7 | sequenceDiagram 8 | autonumber 9 | 10 | participant worker as WorkerProcess 11 | 12 | participant qlist as QueueHash
name -> key 13 | participant wlist as WorkerList 14 | participant wkey as WorkerKey 15 | participant queue as QueueKey 16 | participant job as JobHash 17 | 18 | 19 | note over worker,qlist: Checking sanity 20 | 21 | break when a queue-name in the args is not in queue-list 22 | worker ->>+ qlist: Query queue names 23 | qlist -->>- worker: All queue names 24 | worker ->> worker: check that queue names exists in the system 25 | end 26 | 27 | note over worker,wkey: register 28 | worker ->> wkey: Create workerKey with all info (new id, queues, status) 29 | worker ->> wlist: Add new worker to list, last heartbeat set to now() 30 | ``` 31 | 32 | ## Work (execute jobs on queues) 33 | 34 | ```mermaid 35 | sequenceDiagram 36 | autonumber 37 | 38 | participant worker as WorkerProcess 39 | 40 | participant qlist as QueueHash
name -> key 41 | participant wlist as WorkerList 42 | participant wkey as WorkerKey 43 | participant queue as QueueKey 44 | participant job as JobHash 45 | 46 | loop Until death 47 | worker ->> wlist: Update last heartbeat 48 | note over worker,job: Find next job 49 | 50 | loop over queueKeys until job to run is found or all queues are empty 51 | worker ->>+ queue: get next job id and remove it or None (zrange+zpop) 52 | queue -->>- worker: job id / nothing 53 | end 54 | 55 | note over worker,job: Execute job or sleep 56 | critical [job is found] 57 | worker ->> wkey: Update worker status to busy 58 | worker ->>+ job: query job data 59 | job -->>- worker: job data 60 | 61 | worker ->> job: update job status to running 62 | worker ->> worker: execute job 63 | worker ->> job: update job status to done/failed 64 | worker ->> wkey: Update worker status to idle 65 | option No job pending 66 | worker ->> worker: sleep 67 | end 68 | end 69 | ``` 70 | 71 | # Scheduler flows 72 | -------------------------------------------------------------------------------- /docs/index.md: -------------------------------------------------------------------------------- 1 | # Django RQ Scheduler 2 | 3 | !!! Info 4 | 5 | This Project is migrated to and maintained in [django-tasks-scheduler](https://github.com/dsoftwareinc/django-tasks-scheduler). 6 | Please migrate your package usage to django-tasks-scheduler. 7 | 8 | [![Django CI](https://github.com/dsoftwareinc/django-rq-scheduler/actions/workflows/test.yml/badge.svg)](https://github.com/dsoftwareinc/django-rq-scheduler/actions/workflows/test.yml) 9 | ![badge](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/cunla/b756396efb895f0e34558c980f1ca0c7/raw/django-rq-scheduler-4.json) 10 | [![badge](https://img.shields.io/pypi/dm/django-rq-scheduler)](https://pypi.org/project/django-rq-scheduler/) 11 | [![Open Source Helpers](https://www.codetriage.com/dsoftwareinc/django-rq-scheduler/badges/users.svg)](https://www.codetriage.com/dsoftwareinc/django-rq-scheduler) 12 | 13 | --- 14 | 15 | A database backed job scheduler for Django RQ. 16 | This allows remembering scheduled jobs, their parameters, etc. 17 | 18 | !!! Info 19 | 20 | Starting v2023.5.0, django-rq-scheduler does not require django-rq to run. 21 | Most features from django-rq are now implemented on this package. 22 | 23 | It is recommended you use the import/export management commands to 24 | save your database. 25 | 26 | ## Terminology 27 | 28 | ### Queue 29 | 30 | A queue of messages between processes (main django-app process and worker usually). 31 | This is implemented in `rq` package. 32 | 33 | * A queue contains multiple registries for scheduled jobs, finished jobs, failed jobs, etc. 34 | 35 | ### Worker 36 | 37 | A process listening to one or more queues **for jobs to be executed**, and executing jobs queued to be 38 | executed. 39 | 40 | ### Scheduler 41 | 42 | A process listening to one or more queues for **jobs to be scheduled for execution**, and schedule them 43 | to be executed by a worker. 44 | 45 | This is a sub-process of worker. 46 | 47 | ### Queued Job Execution 48 | 49 | Once a worker listening to the queue becomes available, 50 | the job will be executed 51 | 52 | ### Scheduled Job Execution 53 | 54 | A scheduler checking the queue periodically will check 55 | whether the time the job should be executed has come, and if so, it will queue it. 56 | 57 | * A job is considered scheduled if it is queued to be executed, or scheduled to be executed. 
58 | * If there is no scheduler, the job will not be queued to run. 59 | 60 | ### Scheduled Job 61 | 62 | django models storing information about jobs. So it is possible to schedule using 63 | django-admin and track their status. 64 | 65 | There are 3 types of scheduled job. 66 | 67 | * `Scheduled Job` - Run a job once, on a specific time (can be immediate). 68 | * `Repeatable Job` - Run a job multiple times (limited number of times or infinite times) based on an interval 69 | * `Cron Job` - Run a job multiple times (limited number of times or infinite times) based on a cron string 70 | 71 | Scheduled jobs are scheduled when the django application starts, and after a scheduled job is executed. 72 | 73 | ## Scheduler sequence diagram 74 | 75 | ```mermaid 76 | sequenceDiagram 77 | autonumber 78 | box Worker 79 | participant scheduler as Scheduler Process 80 | end 81 | box Redis queue 82 | participant queue as Queue 83 | participant schedule as Queue scheduled jobs 84 | end 85 | loop Scheduler process - loop forever 86 | scheduler ->> schedule: check whether there are jobs that should be scheduled for execution 87 | critical there are jobs that are scheduled to be executed 88 | scheduler ->> schedule: remove jobs to be scheduled 89 | scheduler ->> queue: enqueue jobs to be executed 90 | end 91 | scheduler ->> scheduler: sleep interval (See SCHEDULER_INTERVAL) 92 | end 93 | ``` 94 | 95 | ## Worker sequence diagram 96 | 97 | ```mermaid 98 | sequenceDiagram 99 | autonumber 100 | box Worker 101 | participant worker as Worker Process 102 | end 103 | box Redis queue 104 | participant queue as Queue 105 | participant finished as Queue finished jobs 106 | participant failed as Queue failed jobs 107 | end 108 | loop Worker process - loop forever 109 | worker ->>+ queue: get the first job to be executed 110 | queue -->>- worker: A job to be executed or nothing 111 | critical There is a job to be executed 112 | worker ->> queue: Remove job from queue 113 | worker ->> worker: Execute job 114 | critical Job ended successfully 115 | worker ->> finished: Write job result 116 | option Job ended unsuccessfully 117 | worker ->> failed: Write job result 118 | end 119 | option No job to be executed 120 | worker ->> worker: sleep 121 | end 122 | end 123 | ``` 124 | 125 | --- 126 | 127 | ## Reporting issues or Features requests 128 | 129 | Please report issues via [GitHub Issues](https://github.com/dsoftwareinc/django-rq-scheduler/issues) . 130 | 131 | --- 132 | 133 | ## Acknowledgements 134 | 135 | Based on original [django-rq-scheduler](https://github.com/isl-x/django-rq-scheduler) - Now supports Django 4.0. 136 | 137 | A lot of django-admin views and their tests were adopted from [django-rq](https://github.com/rq/django-rq). 138 | -------------------------------------------------------------------------------- /docs/installation.md: -------------------------------------------------------------------------------- 1 | # Installation 2 | 3 | 1. Use pip to install: 4 | ```shell 5 | pip install django-rq-scheduler 6 | ``` 7 | 8 | 2. In `settings.py`, add `scheduler` to `INSTALLED_APPS`: 9 | ```python 10 | INSTALLED_APPS = [ 11 | # ... 12 | 'scheduler', 13 | # ... 14 | ] 15 | ``` 16 | 17 | 3. Configure your queues. 
18 | Add at least one Redis Queue to your `settings.py`: 19 | ```python 20 | import os 21 | SCHEDULER_QUEUES = { 22 | 'default': { 23 | 'HOST': 'localhost', 24 | 'PORT': 6379, 25 | 'DB': 0, 26 | 'USERNAME': 'some-user', 27 | 'PASSWORD': 'some-password', 28 | 'DEFAULT_TIMEOUT': 360, 29 | 'REDIS_CLIENT_KWARGS': { # Eventual additional Redis connection arguments 30 | 'ssl_cert_reqs': None, 31 | }, 32 | }, 33 | 'with-sentinel': { 34 | 'SENTINELS': [('localhost', 26736), ('localhost', 26737)], 35 | 'MASTER_NAME': 'redismaster', 36 | 'DB': 0, 37 | # Redis username/password 38 | 'USERNAME': 'redis-user', 39 | 'PASSWORD': 'secret', 40 | 'SOCKET_TIMEOUT': 0.3, 41 | 'CONNECTION_KWARGS': { # Eventual additional Redis connection arguments 42 | 'ssl': True 43 | }, 44 | 'SENTINEL_KWARGS': { # Eventual Sentinel connection arguments 45 | # If Sentinel also has auth, username/password can be passed here 46 | 'username': 'sentinel-user', 47 | 'password': 'secret', 48 | }, 49 | }, 50 | 'high': { 51 | 'URL': os.getenv('REDISTOGO_URL', 'redis://localhost:6379/0'), # If you're on Heroku 52 | 'DEFAULT_TIMEOUT': 500, 53 | }, 54 | 'low': { 55 | 'HOST': 'localhost', 56 | 'PORT': 6379, 57 | 'DB': 0, 58 | } 59 | } 60 | ``` 61 | 62 | 4. Optional: Configure default values for queuing jobs from code: 63 | ```python 64 | RQ = { 65 | 'DEFAULT_RESULT_TTL': 360, 66 | 'DEFAULT_TIMEOUT': 60, 67 | } 68 | ``` 69 | 70 | 5. Add `scheduler.urls` to your django application `urls.py`: 71 | ```python 72 | from django.urls import path, include 73 | 74 | urlpatterns = [ 75 | # ... 76 | path('scheduler/', include('scheduler.urls')), 77 | ] 78 | ``` 79 | 80 | 6. Run migrations to generate django models 81 | ```shell 82 | python manage.py migrate 83 | ``` 84 | 85 | -------------------------------------------------------------------------------- /docs/media/add-args.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/docs/media/add-args.jpg -------------------------------------------------------------------------------- /docs/media/add-scheduled-job.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/docs/media/add-scheduled-job.jpg -------------------------------------------------------------------------------- /docs/requirements.txt: -------------------------------------------------------------------------------- 1 | mkdocs==1.5.2 2 | mkdocs-material==9.2.6 3 | -------------------------------------------------------------------------------- /docs/usage.md: -------------------------------------------------------------------------------- 1 | # Usage 2 | 3 | ## Enqueue jobs from code 4 | 5 | ```python 6 | from scheduler import job 7 | 8 | 9 | @job 10 | def long_running_func(): 11 | pass 12 | 13 | 14 | long_running_func.delay() # Enqueue function in "default" queue 15 | ``` 16 | 17 | Specifying the queue where the job should be queued: 18 | 19 | ```python 20 | @job('high') 21 | def long_running_func(): 22 | pass 23 | 24 | 25 | long_running_func.delay() # Enqueue function in "high" queue 26 | ``` 27 | 28 | You can pass in any arguments that RQ's job decorator accepts: 29 | 30 | ```python 31 | from scheduler import job 32 | 33 | 34 | @job('default', timeout=3600) 35 | def long_running_func(): 36 | pass 37 | 38 | 39 | long_running_func.delay() # Enqueue function 
with a timeout of 3600 seconds. 40 | ``` 41 | 42 | You can set in `settings.py` a default value for `DEFAULT_RESULT_TTL` and `DEFAULT_TIMEOUT`. 43 | 44 | ```python 45 | # settings.py 46 | RQ = { 47 | 'DEFAULT_RESULT_TTL': 360, 48 | 'DEFAULT_TIMEOUT': 60, 49 | } 50 | ``` 51 | 52 | ## Scheduling a job Through django-admin 53 | 54 | * Sign in to the Django Admin site (e.g., http://localhost:8000/admin/) and locate the 55 | **Django RQ Scheduler** section. 56 | * Click on the **Add** link for the type of job you want to add (`Scheduled Job` - run once, `Repeatable Job` - run 57 | multiple times, `Cron Job` - Run based on cron schedule). 58 | * Enter a unique name for the job in the **Name** field. 59 | * In the **Callable** field, enter a Python dot notation path to the method that defines the job. For the example 60 | above, that would be `myapp.jobs.count` 61 | * Choose your **Queue**. 62 | The queues listed are defined in your app `settings.py` under `SCHEDULER_QUEUES`. 63 | * Enter the time in UTC the job is to be executed in the **Scheduled time** field. 64 | 65 | ![](media/add-scheduled-job.jpg) 66 | 67 | #### Optional fields: 68 | 69 | * Select whether the job should take priority over existing queued jobs when it is queued (jobs waiting to be executed) 70 | by using **at front**. 71 | * **Timeout** specifies the maximum time in seconds the job is allowed to run. blank value means it can run forever. 72 | * **Result TTL** (Time to live): The time to live value (in seconds) of the job result. 73 | - `-1`: Result never expires, you should delete jobs manually. 74 | - `0`: Result gets deleted immediately. 75 | - `n` (where `n > 0`) : Result expires after n seconds. 76 | 77 | Once you are done, click **Save** and your job will be persisted to django database. 78 | 79 | ### Support for arguments for jobs 80 | 81 | django-rq-scheduler supports scheduling jobs calling methods with arguments, as well as arguments that should be 82 | calculated in runtime. 83 | 84 | ![](media/add-args.jpg) 85 | 86 | ### Scheduled Job - run once 87 | 88 | No additional steps required. 89 | 90 | ### Repeatable Job - Run a job multiple time based on interval 91 | 92 | Additional fields required: 93 | 94 | * Enter an **Interval**, and choose the **Interval unit**. This will calculate the time before the function is called 95 | again. 96 | * In the **Repeat** field, enter the number of time the job is to be run. Leaving the field empty, means the job will 97 | be scheduled to run forever. 98 | 99 | ### Cron Job - Run a job multiple time based on cron 100 | 101 | Additional fields required: 102 | 103 | * In the **Repeat** field, enter the number of time the job is to be run. Leaving the field empty, means the job will be 104 | scheduled to run forever. 105 | * In the **cron string** field, enter a cron string describing how often the job should run. 106 | 107 | ### Scheduled Job - run once 108 | 109 | No additional steps required. 110 | 111 | ### Repeatable Job - Run a job multiple time based on interval 112 | 113 | Additional fields required: 114 | 115 | * Enter an **Interval**, and choose the **Interval unit**. This will calculate the time before the function is called 116 | again. 117 | * In the **Repeat** field, enter the number of time the job is to be run. Leaving the field empty, means the job will 118 | be scheduled to run forever. 
119 | 120 | ### Cron Job - Run a job multiple time based on cron 121 | 122 | Additional fields required: 123 | 124 | * In the **Repeat** field, enter the number of time the job is to be run. Leaving the field empty, means the job will be 125 | scheduled to run forever. 126 | * In the **cron string** field, enter a cron string describing how often the job should run. 127 | 128 | ## Enqueue jobs through command line 129 | 130 | It is possible to queue a job to be executed from the command line 131 | using django management command: 132 | 133 | ```shell 134 | python manage.py run_job -q {queue} -t {timeout} -r {result_ttl} {callable} {args} 135 | ``` 136 | 137 | ## Running a worker 138 | 139 | Create a worker to execute queued jobs on specific queues using: 140 | 141 | ```shell 142 | python manage.py rqworker [queues ...] 143 | ``` 144 | 145 | ### Running multiple workers as unix/linux services using systemd 146 | 147 | You can have multiple workers running as system services. 148 | In order to have multiple rqworkers, edit the `/etc/systemd/system/rqworker@.service` 149 | file, make sure it ends with `@.service`, the following is example: 150 | 151 | ```ini 152 | # /etc/systemd/system/rqworker@.service 153 | [Unit] 154 | Description = rqworker daemon 155 | After = network.target 156 | 157 | [Service] 158 | WorkingDirectory = {{ path_to_your_project_folder } } 159 | ExecStart = /home/ubuntu/.virtualenv/{ { your_virtualenv } }/bin/python \ 160 | {{ path_to_your_project_folder } }/manage.py \ 161 | rqworker high default low 162 | # Optional 163 | # {{user to run rqworker as}} 164 | User = ubuntu 165 | # {{group to run rqworker as}} 166 | Group = www-data 167 | # Redirect logs to syslog 168 | StandardOutput = syslog 169 | StandardError = syslog 170 | SyslogIdentifier = rqworker 171 | Environment = OBJC_DISABLE_INITIALIZE_FORK_SAFETY = YES 172 | Environment = LC_ALL = en_US.UTF-8 173 | Environment = LANG = en_US.UTF-8 174 | 175 | [Install] 176 | WantedBy = multi-user.target 177 | ``` 178 | 179 | After you are done editing the file, reload the settings and start the new workers: 180 | ```shell 181 | sudo systemctl daemon-reload 182 | sudo systemctl start rqworker@{1..3} 183 | ``` 184 | 185 | You can target a specific worker using its number: 186 | ```shell 187 | sudo systemctl stop rqworker@2 188 | ``` -------------------------------------------------------------------------------- /mkdocs.yml: -------------------------------------------------------------------------------- 1 | --- 2 | site_name: django-rq-scheduler 3 | site_author: Daniel Moran 4 | site_description: >- 5 | Documentation for django-rq-scheduler django library 6 | # Repository 7 | repo_name: dsoftwareinc/django-rq-scheduler 8 | repo_url: https://github.com/dsoftwareinc/django-rq-scheduler 9 | 10 | # Copyright 11 | copyright: Copyright © 2022 - 2023 Daniel Moran 12 | 13 | extra: 14 | generator: false 15 | analytics: 16 | provider: google 17 | property: G-GJBJBKXT19 18 | 19 | markdown_extensions: 20 | - abbr 21 | - admonition 22 | - attr_list 23 | - def_list 24 | - footnotes 25 | - md_in_html 26 | - pymdownx.arithmatex: 27 | generic: true 28 | - pymdownx.betterem: 29 | smart_enable: all 30 | - pymdownx.caret 31 | - pymdownx.details 32 | - pymdownx.emoji: 33 | emoji_generator: !!python/name:materialx.emoji.to_svg 34 | emoji_index: !!python/name:materialx.emoji.twemoji 35 | - pymdownx.highlight: 36 | anchor_linenums: true 37 | - pymdownx.inlinehilite 38 | - pymdownx.keys 39 | - pymdownx.magiclink: 40 | repo_url_shorthand: true 41 | 
user: dsoftware-inc 42 | repo: django-rq-scheduler 43 | - pymdownx.mark 44 | - pymdownx.smartsymbols 45 | - pymdownx.superfences: 46 | custom_fences: 47 | - name: mermaid 48 | class: mermaid 49 | format: !!python/name:pymdownx.superfences.fence_code_format 50 | - pymdownx.tabbed: 51 | alternate_style: true 52 | - pymdownx.tasklist: 53 | custom_checkbox: true 54 | - pymdownx.tilde 55 | - toc: 56 | permalink: true 57 | toc_depth: 3 58 | 59 | 60 | theme: 61 | name: material 62 | palette: 63 | - scheme: default 64 | primary: indigo 65 | accent: indigo 66 | toggle: 67 | icon: material/brightness-7 68 | name: Switch to dark mode 69 | - scheme: slate 70 | primary: indigo 71 | accent: indigo 72 | toggle: 73 | icon: material/brightness-4 74 | name: Switch to light mode 75 | features: 76 | # - announce.dismiss 77 | - content.action.edit 78 | - content.action.view 79 | - content.code.annotate 80 | - content.code.copy 81 | # - content.tabs.link 82 | - content.tooltips 83 | # - header.autohide 84 | # - navigation.expand 85 | - navigation.footer 86 | - navigation.indexes 87 | # - navigation.instant 88 | # - navigation.prune 89 | - navigation.sections 90 | # - navigation.tabs.sticky 91 | - navigation.tracking 92 | - search.highlight 93 | - search.share 94 | - search.suggest 95 | - toc.follow 96 | # - toc.integrate 97 | highlightjs: true 98 | hljs_languages: 99 | - yaml 100 | - django 101 | 102 | nav: 103 | - Home: index.md 104 | - Installation: installation.md 105 | - Configuration: configuration.md 106 | - Usage: usage.md 107 | - Management commands: commands.md 108 | - Change log: changelog.md 109 | 110 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [build-system] 2 | requires = ["poetry-core"] 3 | build-backend = "poetry.core.masonry.api" 4 | 5 | [tool.poetry] 6 | name = "django-rq-scheduler" 7 | packages = [ 8 | { include = "scheduler" }, 9 | ] 10 | version = "2023.9.0" 11 | description = "An async job scheduler for django using redis" 12 | readme = "README.md" 13 | keywords = ["redis", "django", "background-jobs", "job-queue", "task-queue", "redis-queue", "scheduled-jobs"] 14 | authors = [ 15 | "Daniel Moran ", 16 | ] 17 | maintainers = [ 18 | "Daniel Moran ", 19 | ] 20 | license = "MIT" 21 | classifiers = [ 22 | 'Development Status :: 5 - Production/Stable', 23 | 'Environment :: Web Environment', 24 | 'Intended Audience :: Developers', 25 | 'License :: OSI Approved :: MIT License', 26 | 'Operating System :: OS Independent', 27 | 'Programming Language :: Python', 28 | 'Programming Language :: Python :: 3.9', 29 | 'Programming Language :: Python :: 3.10', 30 | 'Programming Language :: Python :: 3.11', 31 | 'Framework :: Django', 32 | 'Framework :: Django :: 4', 33 | 'Framework :: Django :: 4.0', 34 | 'Framework :: Django :: 4.1', 35 | 'Framework :: Django :: 4.2', 36 | 'Framework :: Django :: 3', 37 | 'Framework :: Django :: 3.2', 38 | ] 39 | homepage = "https://github.com/dsoftwareinc/django-redis-scheduler" 40 | documentation = "https://django-rq-scheduler.readthedocs.io/en/latest/" 41 | 42 | [tool.poetry.urls] 43 | "Bug Tracker" = "https://github.com/dsoftwareinc/django-rq-scheduler/issues" 44 | "Funding" = "https://github.com/sponsors/cunla" 45 | 46 | [tool.poetry.dependencies] 47 | python = "^3.9" 48 | django = ">=3.2" 49 | croniter = "^1.4" 50 | click = "^8.1" 51 | rq = "^1.15" 52 | pyyaml = { version = "^6.0", optional = true } 53 | 54 | 
[tool.poetry.dev-dependencies] 55 | poetry = "^1.5" 56 | coverage = "^7" 57 | fakeredis = { version = "^2.15", extras = ['lua'] } 58 | Flake8-pyproject = "^1.2" 59 | 60 | [tool.poetry.extras] 61 | yaml = ["pyyaml"] 62 | 63 | [tool.flake8] 64 | max-line-length = 119 65 | exclude = [ 66 | 'scheduler/migrations', 67 | ] -------------------------------------------------------------------------------- /scheduler/__init__.py: -------------------------------------------------------------------------------- 1 | import importlib.metadata 2 | 3 | __version__ = importlib.metadata.version('django-rq-scheduler') 4 | 5 | from .decorators import job # noqa: F401 6 | -------------------------------------------------------------------------------- /scheduler/admin/__init__.py: -------------------------------------------------------------------------------- 1 | from .job import JobAdmin # noqa: F401 2 | from .redis_models import QueueAdmin, WorkerAdmin # noqa: F401 3 | -------------------------------------------------------------------------------- /scheduler/admin/job.py: -------------------------------------------------------------------------------- 1 | import redis 2 | from django.contrib import admin, messages 3 | from django.contrib.contenttypes.admin import GenericStackedInline 4 | from django.utils.translation import gettext_lazy as _ 5 | 6 | from scheduler import tools 7 | from scheduler.models import CronJob, JobArg, JobKwarg, RepeatableJob, ScheduledJob 8 | from scheduler.settings import SCHEDULER_CONFIG, logger 9 | from scheduler.tools import get_job_executions 10 | 11 | 12 | class HiddenMixin(object): 13 | class Media: 14 | js = ['admin/js/jquery.init.js', ] 15 | 16 | 17 | class JobArgInline(HiddenMixin, GenericStackedInline): 18 | model = JobArg 19 | extra = 0 20 | fieldsets = ( 21 | (None, { 22 | 'fields': (('arg_type', 'val',),), 23 | }), 24 | ) 25 | 26 | 27 | class JobKwargInline(HiddenMixin, GenericStackedInline): 28 | model = JobKwarg 29 | extra = 0 30 | fieldsets = ( 31 | (None, { 32 | 'fields': (('key',), ('arg_type', 'val',),), 33 | }), 34 | ) 35 | 36 | 37 | @admin.register(CronJob, ScheduledJob, RepeatableJob) 38 | class JobAdmin(admin.ModelAdmin): 39 | LIST_DISPLAY_EXTRA = dict( 40 | CronJob=('cron_string', 'next_run',), 41 | ScheduledJob=('scheduled_time',), 42 | RepeatableJob=('scheduled_time', 'interval_display'), 43 | ) 44 | FIELDSET_EXTRA = dict( 45 | CronJob=('cron_string', 'repeat', 'timeout', 'result_ttl',), 46 | ScheduledJob=('scheduled_time', 'timeout', 'result_ttl'), 47 | RepeatableJob=('scheduled_time', ('interval', 'interval_unit',), 'repeat', 'timeout', 'result_ttl',), 48 | ) 49 | """BaseJob admin class""" 50 | save_on_top = True 51 | change_form_template = 'admin/scheduler/change_form.html' 52 | actions = ['disable_selected', 'enable_selected', 'enqueue_job_now', ] 53 | inlines = [JobArgInline, JobKwargInline, ] 54 | list_filter = ('enabled',) 55 | list_display = ('enabled', 'name', 'job_id', 'function_string', 'is_scheduled', 'queue',) 56 | list_display_links = ('name',) 57 | readonly_fields = ('job_id',) 58 | fieldsets = ( 59 | (None, { 60 | 'fields': ('name', 'callable', 'enabled', 'at_front',), 61 | }), 62 | (_('RQ Settings'), { 63 | 'fields': ('queue', 'job_id',), 64 | }), 65 | ) 66 | 67 | def get_list_display(self, request): 68 | if self.model.__name__ not in JobAdmin.LIST_DISPLAY_EXTRA: 69 | raise ValueError(f'Unrecognized model {self.model}') 70 | return JobAdmin.list_display + JobAdmin.LIST_DISPLAY_EXTRA[self.model.__name__] 71 | 72 | def get_fieldsets(self, 
request, obj=None): 73 | if self.model.__name__ not in JobAdmin.FIELDSET_EXTRA: 74 | raise ValueError(f'Unrecognized model {self.model}') 75 | return JobAdmin.fieldsets + ((_('Scheduling'), { 76 | 'fields': JobAdmin.FIELDSET_EXTRA[self.model.__name__], 77 | }),) 78 | 79 | @admin.display(description='Next run') 80 | def next_run(self, o: CronJob): 81 | return tools.get_next_cron_time(o.cron_string) 82 | 83 | def change_view(self, request, object_id, form_url='', extra_context=None): 84 | extra = extra_context or {} 85 | obj = self.get_object(request, object_id) 86 | try: 87 | execution_list = get_job_executions(obj.queue, obj) 88 | except redis.ConnectionError as e: 89 | logger.warn(f'Could not get job executions: {e}') 90 | execution_list = list() 91 | paginator = self.get_paginator(request, execution_list, SCHEDULER_CONFIG['EXECUTIONS_IN_PAGE']) 92 | page_number = request.GET.get('p', 1) 93 | page_obj = paginator.get_page(page_number) 94 | page_range = paginator.get_elided_page_range(page_obj.number) 95 | 96 | extra.update({ 97 | "pagination_required": paginator.count > SCHEDULER_CONFIG['EXECUTIONS_IN_PAGE'], 98 | 'executions': page_obj, 99 | 'page_range': page_range, 100 | 'page_var': 'p', 101 | }) 102 | 103 | return super(JobAdmin, self).change_view( 104 | request, object_id, form_url, extra_context=extra) 105 | 106 | def delete_queryset(self, request, queryset): 107 | for job in queryset: 108 | job.unschedule() 109 | super(JobAdmin, self).delete_queryset(request, queryset) 110 | 111 | def delete_model(self, request, obj): 112 | obj.unschedule() 113 | super(JobAdmin, self).delete_model(request, obj) 114 | 115 | @admin.action(description=_("Disable selected %(verbose_name_plural)s"), permissions=('change',)) 116 | def disable_selected(self, request, queryset): 117 | rows_updated = 0 118 | for obj in queryset.filter(enabled=True).iterator(): 119 | obj.enabled = False 120 | obj.unschedule() 121 | rows_updated += 1 122 | 123 | message_bit = "1 job was" if rows_updated == 1 else f"{rows_updated} jobs were" 124 | 125 | level = messages.WARNING if not rows_updated else messages.INFO 126 | self.message_user(request, f"{message_bit} successfully disabled and unscheduled.", level=level) 127 | 128 | @admin.action(description=_("Enable selected %(verbose_name_plural)s"), permissions=('change',)) 129 | def enable_selected(self, request, queryset): 130 | rows_updated = 0 131 | for obj in queryset.filter(enabled=False).iterator(): 132 | obj.enabled = True 133 | obj.save() 134 | rows_updated += 1 135 | 136 | message_bit = "1 job was" if rows_updated == 1 else f"{rows_updated} jobs were" 137 | level = messages.WARNING if not rows_updated else messages.INFO 138 | self.message_user(request, f"{message_bit} successfully enabled and scheduled.", level=level) 139 | 140 | @admin.action(description="Enqueue now", permissions=('change',)) 141 | def enqueue_job_now(self, request, queryset): 142 | job_names = [] 143 | for job in queryset: 144 | job.enqueue_to_run() 145 | job_names.append(job.name) 146 | self.message_user(request, f"The following jobs have been enqueued: {', '.join(job_names)}", ) 147 | -------------------------------------------------------------------------------- /scheduler/admin/redis_models.py: -------------------------------------------------------------------------------- 1 | from django.contrib import admin 2 | 3 | from scheduler import views 4 | from scheduler.models import Queue 5 | from scheduler.models.worker import Worker 6 | 7 | 8 | class ImmutableAdmin(admin.ModelAdmin): 9 | def 
has_add_permission(self, request): 10 | return False # Hide the admin "+ Add" link for Queues 11 | 12 | def has_change_permission(self, request, obj=None): 13 | return True 14 | 15 | def has_module_permission(self, request): 16 | """ 17 | return True if the given request has any permission in the given 18 | app label. 19 | 20 | Can be overridden by the user in subclasses. In such case it should 21 | return True if the given request has permission to view the module on 22 | the admin index page and access the module's index page. Overriding it 23 | does not restrict access to the add, change or delete views. Use 24 | `ModelAdmin.has_(add|change|delete)_permission` for that. 25 | """ 26 | return request.user.has_module_perms('django-rq-scheduler') 27 | 28 | 29 | @admin.register(Queue) 30 | class QueueAdmin(ImmutableAdmin): 31 | """Admin View for queues""" 32 | 33 | def changelist_view(self, request, extra_context=None): 34 | """The 'change list' admin view for this model.""" 35 | return views.stats(request) 36 | 37 | 38 | @admin.register(Worker) 39 | class WorkerAdmin(ImmutableAdmin): 40 | """Admin View for workers""" 41 | 42 | def changelist_view(self, request, extra_context=None): 43 | """The 'change list' admin view for this model.""" 44 | return views.workers(request) 45 | -------------------------------------------------------------------------------- /scheduler/apps.py: -------------------------------------------------------------------------------- 1 | from django.apps import AppConfig 2 | from django.utils.translation import gettext_lazy as _ 3 | 4 | 5 | class SchedulerConfig(AppConfig): 6 | default_auto_field = 'django.db.models.AutoField' 7 | name = 'scheduler' 8 | verbose_name = _('Django RQ Scheduler') 9 | 10 | def ready(self): 11 | pass 12 | -------------------------------------------------------------------------------- /scheduler/decorators.py: -------------------------------------------------------------------------------- 1 | from scheduler import settings 2 | from .queues import get_queue, QueueNotFoundError 3 | from .rq_classes import rq_job_decorator 4 | 5 | 6 | def job(*args, **kwargs): 7 | """ 8 | The same as rq package's job decorator, but it automatically works out 9 | the ``connection`` argument from SCHEDULER_QUEUES. 10 | 11 | And also, it allows simplified ``@job`` syntax to put job into 12 | default queue. 
13 | 14 | """ 15 | if len(args) == 0: 16 | func = None 17 | queue = 'default' 18 | else: 19 | if callable(args[0]): 20 | func = args[0] 21 | queue = 'default' 22 | else: 23 | func = None 24 | queue = args[0] 25 | args = args[1:] 26 | 27 | if isinstance(queue, str): 28 | try: 29 | queue = get_queue(queue) 30 | if 'connection' not in kwargs: 31 | kwargs['connection'] = queue.connection 32 | except KeyError: 33 | raise QueueNotFoundError(f'Queue {queue} does not exist') 34 | 35 | config = settings.SCHEDULER_CONFIG 36 | 37 | kwargs.setdefault('result_ttl', config.get('DEFAULT_RESULT_TTL')) 38 | kwargs.setdefault('timeout', config.get('DEFAULT_TIMEOUT')) 39 | 40 | decorator = rq_job_decorator(queue, *args, **kwargs) 41 | if func: 42 | return decorator(func) 43 | return decorator 44 | -------------------------------------------------------------------------------- /scheduler/management/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/management/__init__.py -------------------------------------------------------------------------------- /scheduler/management/commands/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/management/commands/__init__.py -------------------------------------------------------------------------------- /scheduler/management/commands/delete_failed_executions.py: -------------------------------------------------------------------------------- 1 | import click 2 | from django.core.management.base import BaseCommand 3 | 4 | from scheduler.queues import get_queue 5 | from scheduler.rq_classes import JobExecution 6 | 7 | 8 | class Command(BaseCommand): 9 | help = 'Delete failed jobs from Django queue.' 10 | 11 | def add_arguments(self, parser): 12 | parser.add_argument( 13 | '--queue', '-q', dest='queue', default='default', 14 | help='Specify the queue [default]') 15 | parser.add_argument('-f', '--func', help='optional job function name, e.g. 
"app.tasks.func"') 16 | parser.add_argument('--dry-run', action='store_true', help='Do not actually delete failed jobs') 17 | 18 | def handle(self, *args, **options): 19 | queue = get_queue(options.get('queue', 'default')) 20 | job_ids = queue.failed_job_registry.get_job_ids() 21 | jobs = [JobExecution.fetch(job_id, connection=queue.connection) for job_id in job_ids] 22 | func_name = options.get('func', None) 23 | if func_name is not None: 24 | jobs = [job for job in jobs if job.func_name == func_name] 25 | dry_run = options.get('dry_run', False) 26 | click.echo(f'Found {len(jobs)} failed jobs') 27 | for job in jobs: 28 | click.echo(f'Deleting {job.id}') 29 | if not dry_run: 30 | job.delete() 31 | click.echo(f'Deleted {len(jobs)} failed jobs') 32 | -------------------------------------------------------------------------------- /scheduler/management/commands/export.py: -------------------------------------------------------------------------------- 1 | import sys 2 | 3 | import click 4 | from django.apps import apps 5 | from django.core.management.base import BaseCommand 6 | 7 | from scheduler.tools import MODEL_NAMES 8 | 9 | 10 | class Command(BaseCommand): 11 | """ 12 | Export all scheduled jobs 13 | """ 14 | help = __doc__ 15 | 16 | def add_arguments(self, parser): 17 | parser.add_argument( 18 | '-o', '--output', 19 | action='store', 20 | choices=['json', 'yaml'], 21 | default='json', 22 | dest='format', 23 | help='format of output', 24 | ) 25 | 26 | parser.add_argument( 27 | '-e', '--enabled', 28 | action='store_true', 29 | dest='enabled', 30 | help='Export only enabled jobs', 31 | ) 32 | parser.add_argument( 33 | '-f', '--filename', 34 | action='store', 35 | dest='filename', 36 | help='File name to load (otherwise writes to standard output)', 37 | ) 38 | 39 | def handle(self, *args, **options): 40 | file = open(options.get('filename'), 'w') if options.get("filename") else sys.stdout 41 | res = list() 42 | for model_name in MODEL_NAMES: 43 | model = apps.get_model(app_label='scheduler', model_name=model_name) 44 | jobs = model.objects.all() 45 | if options.get('enabled'): 46 | jobs = jobs.filter(enabled=True) 47 | for job in jobs: 48 | res.append(job.to_dict()) 49 | 50 | if options.get("format") == 'json': 51 | import json 52 | click.echo(json.dumps(res, indent=2), file=file) 53 | return 54 | 55 | if options.get("format") == 'yaml': 56 | try: 57 | import yaml 58 | except ImportError: 59 | click.echo("Aborting. 
LibYAML is not installed.") 60 | exit(1) 61 | # Disable YAML alias 62 | yaml.Dumper.ignore_aliases = lambda *x: True 63 | click.echo(yaml.dump(res, default_flow_style=False), file=file) 64 | return 65 | -------------------------------------------------------------------------------- /scheduler/management/commands/import.py: -------------------------------------------------------------------------------- 1 | import sys 2 | 3 | import click 4 | from django.apps import apps 5 | from django.conf import settings 6 | from django.contrib.contenttypes.models import ContentType 7 | from django.core.management.base import BaseCommand 8 | from django.utils import timezone 9 | 10 | from scheduler.models import JobArg, JobKwarg 11 | from scheduler.tools import MODEL_NAMES 12 | 13 | 14 | def create_job_from_dict(job_dict, update): 15 | model = apps.get_model(app_label='scheduler', model_name=job_dict['model']) 16 | existing_job = model.objects.filter(name=job_dict['name']).first() 17 | if existing_job: 18 | if update: 19 | click.echo(f'Found existing job "{existing_job}, removing it to be reinserted"') 20 | existing_job.delete() 21 | else: 22 | click.echo(f'Found existing job "{existing_job}", skipping') 23 | return 24 | kwargs = dict(job_dict) 25 | del kwargs['model'] 26 | del kwargs['callable_args'] 27 | del kwargs['callable_kwargs'] 28 | if kwargs.get('scheduled_time', None): 29 | kwargs['scheduled_time'] = timezone.datetime.fromisoformat(kwargs['scheduled_time']) 30 | if not settings.USE_TZ: 31 | kwargs['scheduled_time'] = timezone.make_naive(kwargs['scheduled_time']) 32 | model_fields = set(map(lambda field: field.attname, model._meta.get_fields())) 33 | keys_to_ignore = list(filter(lambda k: k not in model_fields, kwargs.keys())) 34 | for k in keys_to_ignore: 35 | del kwargs[k] 36 | scheduled_job = model.objects.create(**kwargs) 37 | click.echo(f'Created job {scheduled_job}') 38 | content_type = ContentType.objects.get_for_model(scheduled_job) 39 | 40 | for arg in job_dict['callable_args']: 41 | JobArg.objects.create( 42 | content_type=content_type, object_id=scheduled_job.id, **arg, ) 43 | for arg in job_dict['callable_kwargs']: 44 | JobKwarg.objects.create( 45 | content_type=content_type, object_id=scheduled_job.id, **arg, ) 46 | 47 | 48 | class Command(BaseCommand): 49 | """ 50 | Import scheduled jobs 51 | """ 52 | help = __doc__ 53 | 54 | def add_arguments(self, parser): 55 | parser.add_argument( 56 | '-f', '--format', 57 | action='store', 58 | choices=['json', 'yaml'], 59 | default='json', 60 | dest='format', 61 | help='format of input', 62 | ) 63 | parser.add_argument( 64 | '--filename', 65 | action='store', 66 | dest='filename', 67 | help='File name to load (otherwise loads from standard input)', 68 | ) 69 | 70 | parser.add_argument( 71 | '-r', '--reset', 72 | action='store_true', 73 | dest='reset', 74 | help='Remove all currently scheduled jobs before importing', 75 | ) 76 | parser.add_argument( 77 | '-u', '--update', 78 | action='store_true', 79 | dest='update', 80 | help='Update existing records', 81 | ) 82 | 83 | def handle(self, *args, **options): 84 | file = open(options.get('filename')) if options.get("filename") else sys.stdin 85 | jobs = list() 86 | if options.get("format") == 'json': 87 | import json 88 | try: 89 | jobs = json.load(file) 90 | except json.decoder.JSONDecodeError: 91 | click.echo('Error decoding json', err=True) 92 | exit(1) 93 | elif options.get("format") == 'yaml': 94 | try: 95 | import yaml 96 | except ImportError: 97 | click.echo("Aborting. 
LibYAML is not installed.") 98 | exit(1) 99 | # Disable YAML alias 100 | yaml.Dumper.ignore_aliases = lambda *x: True 101 | jobs = yaml.load(file, yaml.SafeLoader) 102 | 103 | if options.get('reset'): 104 | for model_name in MODEL_NAMES: 105 | model = apps.get_model(app_label='scheduler', model_name=model_name) 106 | model.objects.all().delete() 107 | 108 | for job in jobs: 109 | create_job_from_dict(job, update=options.get('update')) 110 | -------------------------------------------------------------------------------- /scheduler/management/commands/rqstats.py: -------------------------------------------------------------------------------- 1 | import time 2 | 3 | import click 4 | from django.core.management.base import BaseCommand 5 | from scheduler.views import get_statistics 6 | 7 | ANSI_LIGHT_GREEN = "\033[1;32m" 8 | ANSI_LIGHT_WHITE = "\033[1;37m" 9 | ANSI_RESET = "\033[0m" 10 | 11 | KEYS = ('jobs', 'started_jobs', 'deferred_jobs', 'finished_jobs', 'canceled_jobs', 'workers') 12 | 13 | 14 | class Command(BaseCommand): 15 | """ 16 | Print statistics 17 | """ 18 | help = __doc__ 19 | 20 | def __init__(self, *args, **kwargs): 21 | super(Command, self).__init__(*args, **kwargs) 22 | self.table_width = 80 23 | self.interval = None 24 | 25 | def add_arguments(self, parser): 26 | parser.add_argument( 27 | '-j', '--json', action='store_true', dest='json', 28 | help='Output statistics as JSON', ) 29 | 30 | parser.add_argument( 31 | '-y', '--yaml', action='store_true', dest='yaml', 32 | help='Output statistics as YAML', 33 | ) 34 | 35 | parser.add_argument( 36 | '-i', '--interval', dest='interval', type=float, 37 | help='Poll statistics every N seconds', 38 | ) 39 | 40 | def _print_separator(self): 41 | click.echo('-' * self.table_width) 42 | 43 | def _print_stats_dashboard(self, statistics, prev_stats=None): 44 | if self.interval: 45 | click.clear() 46 | click.echo() 47 | click.echo("Django-Scheduler CLI Dashboard") 48 | click.echo() 49 | self._print_separator() 50 | click.echo(f'| {"Name":<16} | Queued | Active | Deferred |' 51 | f' Finished |' 52 | f' Canceled |' 53 | f' Workers |') 54 | self._print_separator() 55 | for ind, queue in enumerate(statistics["queues"]): 56 | vals = list((queue[k] for k in KEYS)) 57 | # Deal with colors 58 | if prev_stats and len(prev_stats['queues']) > ind: 59 | prev = prev_stats["queues"][ind] 60 | prev_vals = (prev[k] for k in KEYS) 61 | colors = [ANSI_LIGHT_GREEN 62 | if vals[i] != prev_vals[i] else ANSI_LIGHT_WHITE 63 | for i in range(len(prev_vals)) 64 | ] 65 | else: 66 | colors = [ANSI_LIGHT_WHITE for _ in range(len(vals))] 67 | to_print = ' | '.join([f'{colors[i]}{vals[i]:9}{ANSI_RESET}' for i in range(len(vals))]) 68 | click.echo(f'| {queue["name"]:<16} | {to_print} |', color=True) 69 | 70 | self._print_separator() 71 | 72 | if self.interval: 73 | click.echo() 74 | click.echo("Press 'Ctrl+c' to quit") 75 | 76 | def handle(self, *args, **options): 77 | 78 | if options.get("json"): 79 | import json 80 | click.secho(json.dumps(get_statistics(), indent=2), ) 81 | return 82 | 83 | if options.get("yaml"): 84 | try: 85 | import yaml 86 | except ImportError: 87 | click.secho("Aborting. 
yaml not supported", err=True, fg='red') 88 | return 89 | 90 | click.secho(yaml.dump(get_statistics(), default_flow_style=False), ) 91 | return 92 | 93 | self.interval = options.get("interval") 94 | 95 | if not self.interval or self.interval < 0: 96 | self._print_stats_dashboard(get_statistics()) 97 | return 98 | 99 | try: 100 | prev = None 101 | while True: 102 | statistics = get_statistics() 103 | self._print_stats_dashboard(statistics, prev) 104 | prev = statistics 105 | time.sleep(self.interval) 106 | except KeyboardInterrupt: 107 | pass 108 | -------------------------------------------------------------------------------- /scheduler/management/commands/rqworker.py: -------------------------------------------------------------------------------- 1 | import logging 2 | import os 3 | import sys 4 | 5 | import click 6 | from django.core.management.base import BaseCommand 7 | from django.db import connections 8 | from redis.exceptions import ConnectionError 9 | from rq.logutils import setup_loghandlers 10 | 11 | from scheduler.tools import create_worker 12 | 13 | VERBOSITY_TO_LOG_LEVEL = { 14 | 0: logging.CRITICAL, 15 | 1: logging.WARNING, 16 | 2: logging.INFO, 17 | 3: logging.DEBUG, 18 | } 19 | 20 | 21 | def reset_db_connections(): 22 | for c in connections.all(): 23 | c.close() 24 | 25 | 26 | class Command(BaseCommand): 27 | """ 28 | Runs RQ workers on specified queues. Note that all queues passed into a 29 | single rqworker command must share the same connection. 30 | 31 | Example usage: 32 | python manage.py rqworker high medium low 33 | """ 34 | 35 | args = '' 36 | 37 | def add_arguments(self, parser): 38 | parser.add_argument('--pid', action='store', dest='pidfile', 39 | default=None, help='file to write the worker`s pid into') 40 | parser.add_argument('--burst', action='store_true', dest='burst', 41 | default=False, help='Run worker in burst mode') 42 | parser.add_argument('--name', action='store', dest='name', 43 | default=None, help='Name of the worker') 44 | parser.add_argument('--worker-ttl', action='store', type=int, dest='worker_ttl', default=420, 45 | help='Default worker timeout to be used') 46 | parser.add_argument('--max-jobs', action='store', default=None, dest='max_jobs', type=int, 47 | help='Maximum number of jobs to execute before terminating worker') 48 | parser.add_argument('--fork-job-execution', action='store', default=True, dest='fork_job_execution', type=bool, 49 | help='Fork job execution to another process') 50 | parser.add_argument( 51 | 'queues', nargs='*', type=str, 52 | help='The queues to work on, separated by space, all queues should be using the same redis') 53 | 54 | def handle(self, **options): 55 | queues = options.get('queues', []) 56 | if not queues: 57 | queues = ['default', ] 58 | click.echo(f'Starting worker for queues {queues}') 59 | pidfile = options.get('pidfile') 60 | if pidfile: 61 | with open(os.path.expanduser(pidfile), "w") as fp: 62 | fp.write(str(os.getpid())) 63 | 64 | # Verbosity is defined by default in BaseCommand for all commands 65 | verbosity = options.get('verbosity', 1) 66 | log_level = VERBOSITY_TO_LOG_LEVEL.get(verbosity, logging.INFO) 67 | setup_loghandlers(log_level) 68 | 69 | try: 70 | # Instantiate a worker 71 | w = create_worker( 72 | *queues, 73 | name=options['name'], 74 | default_worker_ttl=options['worker_ttl'], 75 | fork_job_execution=options['fork_job_execution'], ) 76 | 77 | # Close any opened DB connection before any fork 78 | reset_db_connections() 79 | 80 | w.work(burst=options.get('burst', False), 81 | 
logging_level=log_level, 82 | max_jobs=options['max_jobs'], ) 83 | except ConnectionError as e: 84 | click.echo(str(e), err=True) 85 | sys.exit(1) 86 | -------------------------------------------------------------------------------- /scheduler/management/commands/run_job.py: -------------------------------------------------------------------------------- 1 | import click 2 | from django.core.management.base import BaseCommand 3 | 4 | from scheduler.queues import get_queue 5 | 6 | 7 | class Command(BaseCommand): 8 | """ 9 | Queues the function given with the first argument with the 10 | parameters given with the rest of the argument list. 11 | """ 12 | help = __doc__ 13 | args = '' 14 | 15 | def add_arguments(self, parser): 16 | parser.add_argument( 17 | '--queue', '-q', dest='queue', default='default', 18 | help='Specify the queue [default]') 19 | parser.add_argument( 20 | '--timeout', '-t', type=int, dest='timeout', 21 | help='A timeout in seconds') 22 | parser.add_argument( 23 | '--result-ttl', '-r', type=int, dest='result_ttl', 24 | help='Time to store job results in seconds') 25 | parser.add_argument('callable', help='Method to call', ) 26 | parser.add_argument('args', nargs='*', help='Args for callable') 27 | 28 | def handle(self, **options): 29 | verbosity = int(options.get('verbosity', 1)) 30 | timeout = options.get('timeout') 31 | result_ttl = options.get('result_ttl') 32 | queue = get_queue(options.get('queue')) 33 | func = options.get('callable') 34 | args = options.get('args') 35 | job = queue.enqueue_call(func, args=args, timeout=timeout, result_ttl=result_ttl) 36 | if verbosity: 37 | click.echo(f'Job {job.id} created') 38 | -------------------------------------------------------------------------------- /scheduler/migrations/0001_initial_squashed_0005_added_result_ttl.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.0.1 on 2022-01-06 20:34 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | # replaces = [('scheduler', '0001_initial'), ('scheduler', '0002_add_timeout'), 8 | # ('scheduler', '0003_remove_queue_choices'), ('scheduler', '0004_add_cron_jobs'), 9 | # ('scheduler', '0005_added_result_ttl')] 10 | 11 | dependencies = [ 12 | ] 13 | 14 | operations = [ 15 | migrations.CreateModel( 16 | name='CronJob', 17 | fields=[ 18 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 19 | ('created', models.DateTimeField(auto_now_add=True, verbose_name='created')), 20 | ('modified', models.DateTimeField(auto_now=True, verbose_name='modified')), 21 | ('name', models.CharField(max_length=128, unique=True, verbose_name='name')), 22 | ('callable', models.CharField(max_length=2048, verbose_name='callable')), 23 | ('enabled', models.BooleanField(default=True, verbose_name='enabled')), 24 | ('queue', models.CharField(max_length=16, verbose_name='queue')), 25 | ('job_id', 26 | models.CharField(blank=True, editable=False, max_length=128, null=True, verbose_name='job id')), 27 | ('timeout', models.IntegerField(blank=True, 28 | help_text="Timeout specifies the maximum runtime, in seconds, for the job before it'll be considered 'lost'. 
Blank uses the default timeout.", 29 | null=True, verbose_name='timeout')), 30 | ('repeat', models.PositiveIntegerField(blank=True, null=True, verbose_name='repeat')), 31 | ('cron_string', 32 | models.CharField(help_text='Define the schedule in a crontab like syntax.', max_length=64, 33 | verbose_name='cron string')), 34 | ], 35 | options={ 36 | 'ordering': ('name',), 37 | 'verbose_name': 'Cron Job', 38 | 'verbose_name_plural': 'Cron Jobs', 39 | }, 40 | ), 41 | migrations.CreateModel( 42 | name='RepeatableJob', 43 | fields=[ 44 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 45 | ('created', models.DateTimeField(auto_now_add=True, verbose_name='created')), 46 | ('modified', models.DateTimeField(auto_now=True, verbose_name='modified')), 47 | ('name', models.CharField(max_length=128, unique=True, verbose_name='name')), 48 | ('callable', models.CharField(max_length=2048, verbose_name='callable')), 49 | ('enabled', models.BooleanField(default=True, verbose_name='enabled')), 50 | ('queue', models.CharField(max_length=16, verbose_name='queue')), 51 | ('job_id', 52 | models.CharField(blank=True, editable=False, max_length=128, null=True, verbose_name='job id')), 53 | ('scheduled_time', models.DateTimeField(verbose_name='scheduled time')), 54 | ('interval', models.PositiveIntegerField(verbose_name='interval')), 55 | ('interval_unit', models.CharField( 56 | choices=[('minutes', 'minutes'), ('hours', 'hours'), ('days', 'days'), ('weeks', 'weeks')], 57 | default='hours', max_length=12, verbose_name='interval unit')), 58 | ('repeat', models.PositiveIntegerField(blank=True, null=True, verbose_name='repeat')), 59 | ('timeout', models.IntegerField(blank=True, 60 | help_text="Timeout specifies the maximum runtime, in seconds, for the job before it'll be considered 'lost'. Blank uses the default timeout.", 61 | null=True, verbose_name='timeout')), 62 | ('result_ttl', models.IntegerField(blank=True, 63 | help_text='The TTL value (in seconds) of the job result. -1: Result never expires, you should delete jobs manually. 0: Result gets deleted immediately. >0: Result expires after n seconds.', 64 | null=True, verbose_name='result ttl')), 65 | ], 66 | options={ 67 | 'ordering': ('name',), 68 | 'verbose_name': 'Repeatable Job', 69 | 'verbose_name_plural': 'Repeatable Jobs', 70 | }, 71 | ), 72 | migrations.CreateModel( 73 | name='ScheduledJob', 74 | fields=[ 75 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 76 | ('created', models.DateTimeField(auto_now_add=True, verbose_name='created')), 77 | ('modified', models.DateTimeField(auto_now=True, verbose_name='modified')), 78 | ('name', models.CharField(max_length=128, unique=True, verbose_name='name')), 79 | ('callable', models.CharField(max_length=2048, verbose_name='callable')), 80 | ('enabled', models.BooleanField(default=True, verbose_name='enabled')), 81 | ('queue', models.CharField(max_length=16, verbose_name='queue')), 82 | ('job_id', 83 | models.CharField(blank=True, editable=False, max_length=128, null=True, verbose_name='job id')), 84 | ('scheduled_time', models.DateTimeField(verbose_name='scheduled time')), 85 | ('timeout', models.IntegerField(blank=True, 86 | help_text="Timeout specifies the maximum runtime, in seconds, for the job before it'll be considered 'lost'. 
Blank uses the default timeout.", 87 | null=True, verbose_name='timeout')), 88 | ('result_ttl', models.IntegerField(blank=True, 89 | help_text='The TTL value (in seconds) of the job result. -1: Result never expires, you should delete jobs manually. 0: Result gets deleted immediately. >0: Result expires after n seconds.', 90 | null=True, verbose_name='result ttl')), 91 | ], 92 | options={ 93 | 'ordering': ('name',), 94 | 'verbose_name': 'Scheduled Job', 95 | 'verbose_name_plural': 'Scheduled Jobs', 96 | }, 97 | ), 98 | ] 99 | -------------------------------------------------------------------------------- /scheduler/migrations/0002_alter_cronjob_id_alter_repeatablejob_id_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.0.1 on 2022-01-06 20:40 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0001_initial_squashed_0005_added_result_ttl'), 9 | ] 10 | 11 | operations = [ 12 | migrations.AlterField( 13 | model_name='cronjob', 14 | name='id', 15 | field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 16 | ), 17 | migrations.AlterField( 18 | model_name='repeatablejob', 19 | name='id', 20 | field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 21 | ), 22 | migrations.AlterField( 23 | model_name='scheduledjob', 24 | name='id', 25 | field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 26 | ), 27 | ] 28 | -------------------------------------------------------------------------------- /scheduler/migrations/0003_auto_20220329_2107.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 3.2.12 on 2022-03-29 21:07 2 | 3 | import django.db.models.deletion 4 | from django.db import migrations, models 5 | 6 | 7 | class Migration(migrations.Migration): 8 | dependencies = [ 9 | ('contenttypes', '0002_remove_content_type_name'), 10 | ('scheduler', '0002_alter_cronjob_id_alter_repeatablejob_id_and_more'), 11 | ] 12 | 13 | operations = [ 14 | migrations.AlterField( 15 | model_name='cronjob', 16 | name='cron_string', 17 | field=models.CharField(help_text='Define the schedule in a crontab like syntax. 
Times are in UTC.', 18 | max_length=64, verbose_name='cron string'), 19 | ), 20 | migrations.AlterField( 21 | model_name='cronjob', 22 | name='id', 23 | field=models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 24 | ), 25 | migrations.AlterField( 26 | model_name='repeatablejob', 27 | name='id', 28 | field=models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 29 | ), 30 | migrations.AlterField( 31 | model_name='repeatablejob', 32 | name='interval_unit', 33 | field=models.CharField( 34 | choices=[('seconds', 'seconds'), ('minutes', 'minutes'), ('hours', 'hours'), ('days', 'days'), 35 | ('weeks', 'weeks')], default='hours', max_length=12, verbose_name='interval unit'), 36 | ), 37 | migrations.AlterField( 38 | model_name='scheduledjob', 39 | name='id', 40 | field=models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'), 41 | ), 42 | migrations.CreateModel( 43 | name='JobKwarg', 44 | fields=[ 45 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 46 | ('arg_type', models.CharField( 47 | choices=[('str_val', 'string'), ('int_val', 'int'), ('bool_val', 'boolean'), 48 | ('datetime_val', 'Datetime')], default='str_val', max_length=12, 49 | verbose_name='Argument Type')), 50 | ('str_val', models.CharField(blank=True, max_length=255, verbose_name='String Value')), 51 | ('int_val', models.IntegerField(blank=True, null=True, verbose_name='Int Value')), 52 | ('bool_val', models.BooleanField(default=False, verbose_name='Boolean Value')), 53 | ('datetime_val', models.DateTimeField(blank=True, null=True, verbose_name='Datetime Value')), 54 | ('object_id', models.PositiveIntegerField()), 55 | ('key', models.CharField(max_length=255)), 56 | ('content_type', 57 | models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')), 58 | ], 59 | options={ 60 | 'ordering': ['id'], 61 | 'abstract': False, 62 | }, 63 | ), 64 | migrations.CreateModel( 65 | name='JobArg', 66 | fields=[ 67 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 68 | ('arg_type', models.CharField( 69 | choices=[('str_val', 'string'), ('int_val', 'int'), ('bool_val', 'boolean'), 70 | ('datetime_val', 'Datetime')], default='str_val', max_length=12, 71 | verbose_name='Argument Type')), 72 | ('str_val', models.CharField(blank=True, max_length=255, verbose_name='String Value')), 73 | ('int_val', models.IntegerField(blank=True, null=True, verbose_name='Int Value')), 74 | ('bool_val', models.BooleanField(default=False, verbose_name='Boolean Value')), 75 | ('datetime_val', models.DateTimeField(blank=True, null=True, verbose_name='Datetime Value')), 76 | ('object_id', models.PositiveIntegerField()), 77 | ('content_type', 78 | models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.contenttype')), 79 | ], 80 | options={ 81 | 'ordering': ['id'], 82 | 'abstract': False, 83 | }, 84 | ), 85 | ] 86 | -------------------------------------------------------------------------------- /scheduler/migrations/0004_cronjob_at_front_repeatablejob_at_front_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.4 on 2022-12-18 18:47 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0003_auto_20220329_2107'), 9 | ] 10 | 11 | operations = [ 12 | 
migrations.AddField( 13 | model_name='cronjob', 14 | name='at_front', 15 | field=models.BooleanField(default=False, verbose_name='At front'), 16 | ), 17 | migrations.AddField( 18 | model_name='repeatablejob', 19 | name='at_front', 20 | field=models.BooleanField(default=False, verbose_name='At front'), 21 | ), 22 | migrations.AddField( 23 | model_name='scheduledjob', 24 | name='at_front', 25 | field=models.BooleanField(default=False, verbose_name='At front'), 26 | ), 27 | ] 28 | -------------------------------------------------------------------------------- /scheduler/migrations/0005_alter_cronjob_at_front_alter_repeatablejob_at_front_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.5 on 2023-01-13 22:35 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0004_cronjob_at_front_repeatablejob_at_front_and_more'), 9 | ] 10 | 11 | operations = [ 12 | migrations.AlterField( 13 | model_name='cronjob', 14 | name='at_front', 15 | field=models.BooleanField(blank=True, default=False, null=True, verbose_name='At front'), 16 | ), 17 | migrations.AlterField( 18 | model_name='repeatablejob', 19 | name='at_front', 20 | field=models.BooleanField(blank=True, default=False, null=True, verbose_name='At front'), 21 | ), 22 | migrations.AlterField( 23 | model_name='scheduledjob', 24 | name='at_front', 25 | field=models.BooleanField(blank=True, default=False, null=True, verbose_name='At front'), 26 | ), 27 | ] 28 | -------------------------------------------------------------------------------- /scheduler/migrations/0006_auto_20230118_1640.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.5 on 2023-01-18 16:40 2 | 3 | from django.db import migrations 4 | 5 | 6 | def forwards_func(apps, schema_editor): 7 | pass 8 | 9 | 10 | def reverse_func(apps, schema_editor): 11 | # forwards_func() creates two Country instances, 12 | # so reverse_func() should delete them. 13 | cronjob_model = apps.get_model(app_label='scheduler', model_name='CronJob') 14 | db_alias = schema_editor.connection.alias 15 | cronjob_model.objects.using(db_alias).filter(name='Job scheduling jobs').delete() 16 | 17 | 18 | class Migration(migrations.Migration): 19 | dependencies = [ 20 | ('scheduler', '0005_alter_cronjob_at_front_alter_repeatablejob_at_front_and_more'), 21 | ] 22 | 23 | operations = [ 24 | migrations.RunPython(forwards_func, reverse_func), 25 | ] 26 | -------------------------------------------------------------------------------- /scheduler/migrations/0007_add_result_ttl.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.5 on 2023-01-20 22:32 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0006_auto_20230118_1640'), 9 | ] 10 | 11 | operations = [ 12 | migrations.AddField( 13 | model_name='cronjob', 14 | name='result_ttl', 15 | field=models.IntegerField(blank=True, 16 | help_text='The TTL value (in seconds) of the job result. -1: Result never expires, you should delete jobs manually. 0: Result gets deleted immediately. 
>0: Result expires after n seconds.', 17 | null=True, verbose_name='result ttl'), 18 | ), 19 | migrations.AlterField( 20 | model_name='cronjob', 21 | name='queue', 22 | field=models.CharField(help_text='Queue name', max_length=16, verbose_name='queue'), 23 | ), 24 | migrations.AlterField( 25 | model_name='repeatablejob', 26 | name='queue', 27 | field=models.CharField(help_text='Queue name', max_length=16, verbose_name='queue'), 28 | ), 29 | migrations.AlterField( 30 | model_name='scheduledjob', 31 | name='queue', 32 | field=models.CharField(help_text='Queue name', max_length=16, verbose_name='queue'), 33 | ), 34 | ] 35 | -------------------------------------------------------------------------------- /scheduler/migrations/0008_rename_str_val_jobarg_val_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.7 on 2023-02-19 22:12 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0007_add_result_ttl'), 9 | ] 10 | 11 | operations = [ 12 | migrations.RenameField( 13 | model_name='jobarg', 14 | old_name='str_val', 15 | new_name='val', 16 | ), 17 | migrations.RenameField( 18 | model_name='jobkwarg', 19 | old_name='str_val', 20 | new_name='val', 21 | ), 22 | migrations.RemoveField( 23 | model_name='jobarg', 24 | name='bool_val', 25 | ), 26 | migrations.RemoveField( 27 | model_name='jobarg', 28 | name='datetime_val', 29 | ), 30 | migrations.RemoveField( 31 | model_name='jobarg', 32 | name='int_val', 33 | ), 34 | migrations.RemoveField( 35 | model_name='jobkwarg', 36 | name='bool_val', 37 | ), 38 | migrations.RemoveField( 39 | model_name='jobkwarg', 40 | name='datetime_val', 41 | ), 42 | migrations.RemoveField( 43 | model_name='jobkwarg', 44 | name='int_val', 45 | ), 46 | migrations.AlterField( 47 | model_name='jobarg', 48 | name='arg_type', 49 | field=models.CharField(choices=[('str_val', 'string'), ('int_val', 'int'), ('bool_val', 'boolean'), 50 | ('datetime_val', 'Datetime'), ('callable', 'Callable')], default='str_val', 51 | max_length=12, verbose_name='Argument Type'), 52 | ), 53 | migrations.AlterField( 54 | model_name='jobkwarg', 55 | name='arg_type', 56 | field=models.CharField(choices=[('str_val', 'string'), ('int_val', 'int'), ('bool_val', 'boolean'), 57 | ('datetime_val', 'Datetime'), ('callable', 'Callable')], default='str_val', 58 | max_length=12, verbose_name='Argument Type'), 59 | ), 60 | ] 61 | -------------------------------------------------------------------------------- /scheduler/migrations/0009_alter_jobarg_arg_type_alter_jobarg_val_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.7 on 2023-03-12 19:53 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0008_rename_str_val_jobarg_val_and_more'), 9 | ] 10 | 11 | operations = [ 12 | migrations.AlterField( 13 | model_name='jobarg', 14 | name='arg_type', 15 | field=models.CharField( 16 | choices=[('str', 'string'), ('int', 'int'), ('bool', 'boolean'), ('datetime', 'datetime'), 17 | ('callable', 'callable')], default='str', max_length=12, verbose_name='Argument Type'), 18 | ), 19 | migrations.AlterField( 20 | model_name='jobarg', 21 | name='val', 22 | field=models.CharField(blank=True, max_length=255, verbose_name='Argument Value'), 23 | ), 24 | migrations.AlterField( 25 | model_name='jobkwarg', 26 | name='arg_type', 
27 | field=models.CharField( 28 | choices=[('str', 'string'), ('int', 'int'), ('bool', 'boolean'), ('datetime', 'datetime'), 29 | ('callable', 'callable')], default='str', max_length=12, verbose_name='Argument Type'), 30 | ), 31 | migrations.AlterField( 32 | model_name='jobkwarg', 33 | name='val', 34 | field=models.CharField(blank=True, max_length=255, verbose_name='Argument Value'), 35 | ), 36 | ] 37 | -------------------------------------------------------------------------------- /scheduler/migrations/0010_queue.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.7 on 2023-03-16 20:59 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0009_alter_jobarg_arg_type_alter_jobarg_val_and_more'), 9 | ] 10 | 11 | operations = [ 12 | migrations.CreateModel( 13 | name='Queue', 14 | fields=[ 15 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 16 | ], 17 | options={ 18 | 'permissions': [['view', 'Access admin page']], 19 | 'managed': False, 20 | 'default_permissions': (), 21 | }, 22 | ), 23 | ] 24 | -------------------------------------------------------------------------------- /scheduler/migrations/0011_worker_alter_queue_options_alter_cronjob_at_front_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.1.7 on 2023-04-03 00:46 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0010_queue'), 9 | ] 10 | 11 | operations = [ 12 | migrations.CreateModel( 13 | name='Worker', 14 | fields=[ 15 | ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), 16 | ], 17 | options={ 18 | 'verbose_name_plural': ' Workers', 19 | 'permissions': [['view', 'Access admin page']], 20 | 'managed': False, 21 | 'default_permissions': (), 22 | }, 23 | ), 24 | migrations.AlterModelOptions( 25 | name='queue', 26 | options={'default_permissions': (), 'managed': False, 'permissions': [['view', 'Access admin page']], 27 | 'verbose_name_plural': ' Queues'}, 28 | ), 29 | migrations.AlterField( 30 | model_name='cronjob', 31 | name='at_front', 32 | field=models.BooleanField(blank=True, default=False, 33 | help_text='When queuing the job, add it in the front of the queue', null=True, 34 | verbose_name='At front'), 35 | ), 36 | migrations.AlterField( 37 | model_name='cronjob', 38 | name='cron_string', 39 | field=models.CharField( 40 | help_text='Define the schedule in a crontab like syntax.\n Times are in UTC. Use crontab.guru to create a cron string.', 41 | max_length=64, verbose_name='cron string'), 42 | ), 43 | migrations.AlterField( 44 | model_name='cronjob', 45 | name='enabled', 46 | field=models.BooleanField(default=True, 47 | help_text='Should job be scheduled? 
This field is useful to keep past jobs that should no longer be scheduled', 48 | verbose_name='enabled'), 49 | ), 50 | migrations.AlterField( 51 | model_name='cronjob', 52 | name='job_id', 53 | field=models.CharField(blank=True, editable=False, help_text='Current job_id on queue', max_length=128, 54 | null=True, verbose_name='job id'), 55 | ), 56 | migrations.AlterField( 57 | model_name='cronjob', 58 | name='queue', 59 | field=models.CharField(choices=[('default', 'default'), ('low', 'low'), ('high', 'high')], 60 | help_text='Queue name', max_length=16, verbose_name='queue'), 61 | ), 62 | migrations.AlterField( 63 | model_name='cronjob', 64 | name='repeat', 65 | field=models.PositiveIntegerField(blank=True, 66 | help_text='Number of times to run the job. Leaving this blank means it will run forever.', 67 | null=True, verbose_name='repeat'), 68 | ), 69 | migrations.AlterField( 70 | model_name='cronjob', 71 | name='result_ttl', 72 | field=models.IntegerField(blank=True, 73 | help_text='The TTL value (in seconds) of the job result.
\n -1: Result never expires, you should delete jobs manually.
\n 0: Result gets deleted immediately.
\n >0: Result expires after n seconds.', 74 | null=True, verbose_name='result ttl'), 75 | ), 76 | migrations.AlterField( 77 | model_name='repeatablejob', 78 | name='at_front', 79 | field=models.BooleanField(blank=True, default=False, 80 | help_text='When queuing the job, add it in the front of the queue', null=True, 81 | verbose_name='At front'), 82 | ), 83 | migrations.AlterField( 84 | model_name='repeatablejob', 85 | name='enabled', 86 | field=models.BooleanField(default=True, 87 | help_text='Should job be scheduled? This field is useful to keep past jobs that should no longer be scheduled', 88 | verbose_name='enabled'), 89 | ), 90 | migrations.AlterField( 91 | model_name='repeatablejob', 92 | name='job_id', 93 | field=models.CharField(blank=True, editable=False, help_text='Current job_id on queue', max_length=128, 94 | null=True, verbose_name='job id'), 95 | ), 96 | migrations.AlterField( 97 | model_name='repeatablejob', 98 | name='queue', 99 | field=models.CharField(choices=[('default', 'default'), ('low', 'low'), ('high', 'high')], 100 | help_text='Queue name', max_length=16, verbose_name='queue'), 101 | ), 102 | migrations.AlterField( 103 | model_name='repeatablejob', 104 | name='repeat', 105 | field=models.PositiveIntegerField(blank=True, 106 | help_text='Number of times to run the job. Leaving this blank means it will run forever.', 107 | null=True, verbose_name='repeat'), 108 | ), 109 | migrations.AlterField( 110 | model_name='repeatablejob', 111 | name='result_ttl', 112 | field=models.IntegerField(blank=True, 113 | help_text='The TTL value (in seconds) of the job result.
\n -1: Result never expires, you should delete jobs manually.
\n 0: Result gets deleted immediately.
\n >0: Result expires after n seconds.', 114 | null=True, verbose_name='result ttl'), 115 | ), 116 | migrations.AlterField( 117 | model_name='scheduledjob', 118 | name='at_front', 119 | field=models.BooleanField(blank=True, default=False, 120 | help_text='When queuing the job, add it in the front of the queue', null=True, 121 | verbose_name='At front'), 122 | ), 123 | migrations.AlterField( 124 | model_name='scheduledjob', 125 | name='enabled', 126 | field=models.BooleanField(default=True, 127 | help_text='Should job be scheduled? This field is useful to keep past jobs that should no longer be scheduled', 128 | verbose_name='enabled'), 129 | ), 130 | migrations.AlterField( 131 | model_name='scheduledjob', 132 | name='job_id', 133 | field=models.CharField(blank=True, editable=False, help_text='Current job_id on queue', max_length=128, 134 | null=True, verbose_name='job id'), 135 | ), 136 | migrations.AlterField( 137 | model_name='scheduledjob', 138 | name='queue', 139 | field=models.CharField(choices=[('default', 'default'), ('low', 'low'), ('high', 'high')], 140 | help_text='Queue name', max_length=16, verbose_name='queue'), 141 | ), 142 | migrations.AlterField( 143 | model_name='scheduledjob', 144 | name='result_ttl', 145 | field=models.IntegerField(blank=True, 146 | help_text='The TTL value (in seconds) of the job result.
\n -1: Result never expires, you should delete jobs manually.
\n 0: Result gets deleted immediately.
\n >0: Result expires after n seconds.', 147 | null=True, verbose_name='result ttl'), 148 | ), 149 | ] 150 | -------------------------------------------------------------------------------- /scheduler/migrations/0012_alter_cronjob_name_alter_repeatablejob_name_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.2 on 2023-04-18 19:08 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | dependencies = [ 8 | ('scheduler', '0011_worker_alter_queue_options_alter_cronjob_at_front_and_more'), 9 | ] 10 | 11 | operations = [ 12 | migrations.AlterField( 13 | model_name='cronjob', 14 | name='name', 15 | field=models.CharField(help_text='Name of the job.', max_length=128, unique=True, verbose_name='name'), 16 | ), 17 | migrations.AlterField( 18 | model_name='repeatablejob', 19 | name='name', 20 | field=models.CharField(help_text='Name of the job.', max_length=128, unique=True, verbose_name='name'), 21 | ), 22 | migrations.AlterField( 23 | model_name='scheduledjob', 24 | name='name', 25 | field=models.CharField(help_text='Name of the job.', max_length=128, unique=True, verbose_name='name'), 26 | ), 27 | ] 28 | -------------------------------------------------------------------------------- /scheduler/migrations/0013_alter_cronjob_queue_alter_repeatablejob_queue_and_more.py: -------------------------------------------------------------------------------- 1 | # Generated by Django 4.2.1 on 2023-05-11 16:40 2 | 3 | from django.db import migrations, models 4 | 5 | 6 | class Migration(migrations.Migration): 7 | 8 | dependencies = [ 9 | ("scheduler", "0012_alter_cronjob_name_alter_repeatablejob_name_and_more") 10 | ] 11 | 12 | operations = [ 13 | migrations.AlterField( 14 | model_name="cronjob", 15 | name="queue", 16 | field=models.CharField( 17 | choices=[("default", "default"), ("low", "low"), ("high", "high")], 18 | help_text="Queue name", 19 | max_length=255, 20 | verbose_name="queue", 21 | ), 22 | ), 23 | migrations.AlterField( 24 | model_name="repeatablejob", 25 | name="queue", 26 | field=models.CharField( 27 | choices=[("default", "default"), ("low", "low"), ("high", "high")], 28 | help_text="Queue name", 29 | max_length=255, 30 | verbose_name="queue", 31 | ), 32 | ), 33 | migrations.AlterField( 34 | model_name="scheduledjob", 35 | name="queue", 36 | field=models.CharField( 37 | choices=[("default", "default"), ("low", "low"), ("high", "high")], 38 | help_text="Queue name", 39 | max_length=255, 40 | verbose_name="queue", 41 | ), 42 | ), 43 | ] 44 | -------------------------------------------------------------------------------- /scheduler/migrations/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/migrations/__init__.py -------------------------------------------------------------------------------- /scheduler/models/__init__.py: -------------------------------------------------------------------------------- 1 | from .args import JobKwarg, JobArg, BaseJobArg # noqa: F401 2 | from .queue import Queue # noqa: F401 3 | from .scheduled_job import BaseJob, ScheduledJob, RepeatableJob, CronJob # noqa: F401 4 | -------------------------------------------------------------------------------- /scheduler/models/args.py: -------------------------------------------------------------------------------- 1 | from datetime import 
datetime 2 | from typing import Callable 3 | 4 | from django.contrib.contenttypes.fields import GenericForeignKey 5 | from django.contrib.contenttypes.models import ContentType 6 | from django.core.exceptions import ValidationError 7 | from django.db import models 8 | from django.utils.translation import gettext_lazy as _ 9 | 10 | from scheduler import tools 11 | 12 | ARG_TYPE_TYPES_DICT = { 13 | 'str': str, 14 | 'int': int, 15 | 'bool': bool, 16 | 'datetime': datetime, 17 | 'callable': Callable, 18 | } 19 | 20 | 21 | class BaseJobArg(models.Model): 22 | class ArgType(models.TextChoices): 23 | STR = 'str', _('string') 24 | INT = 'int', _('int') 25 | BOOL = 'bool', _('boolean') 26 | DATETIME = 'datetime', _('datetime') 27 | CALLABLE = 'callable', _('callable') 28 | 29 | arg_type = models.CharField( 30 | _('Argument Type'), max_length=12, choices=ArgType.choices, default=ArgType.STR, 31 | ) 32 | val = models.CharField(_('Argument Value'), blank=True, max_length=255) 33 | content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE) 34 | object_id = models.PositiveIntegerField() 35 | content_object = GenericForeignKey() 36 | 37 | def clean(self): 38 | if self.arg_type not in ARG_TYPE_TYPES_DICT: 39 | raise ValidationError({ 40 | 'arg_type': ValidationError( 41 | _(f'Could not parse {self.arg_type}, options are: {ARG_TYPE_TYPES_DICT.keys()}'), code='invalid') 42 | }) 43 | try: 44 | if self.arg_type == 'callable': 45 | tools.callable_func(self.val) 46 | elif self.arg_type == 'datetime': 47 | datetime.fromisoformat(self.val) 48 | elif self.arg_type == 'bool': 49 | if self.val.lower() not in {'true', 'false'}: 50 | raise ValidationError 51 | elif self.arg_type == 'int': 52 | int(self.val) 53 | except Exception: 54 | raise ValidationError({ 55 | 'arg_type': ValidationError( 56 | _(f'Could not parse {self.val} as {self.arg_type}'), code='invalid') 57 | }) 58 | 59 | def save(self, **kwargs): 60 | super(BaseJobArg, self).save(**kwargs) 61 | self.content_object.save() 62 | 63 | def delete(self, **kwargs): 64 | super(BaseJobArg, self).delete(**kwargs) 65 | self.content_object.save() 66 | 67 | def value(self): 68 | if self.arg_type == 'callable': 69 | res = tools.callable_func(self.val)() 70 | elif self.arg_type == 'datetime': 71 | res = datetime.fromisoformat(self.val) 72 | elif self.arg_type == 'bool': 73 | res = self.val.lower() == 'true' 74 | else: 75 | res = ARG_TYPE_TYPES_DICT[self.arg_type](self.val) 76 | return res 77 | 78 | class Meta: 79 | abstract = True 80 | ordering = ['id'] 81 | 82 | 83 | class JobArg(BaseJobArg): 84 | def __str__(self): 85 | return f'JobArg[arg_type={self.arg_type},value={self.value()}]' 86 | 87 | 88 | class JobKwarg(BaseJobArg): 89 | key = models.CharField(max_length=255) 90 | 91 | def __str__(self): 92 | key, value = self.value() 93 | return f'JobKwarg[key={key},arg_type={self.arg_type},value={self.val}]' 94 | 95 | def value(self): 96 | return self.key, super(JobKwarg, self).value() 97 | -------------------------------------------------------------------------------- /scheduler/models/queue.py: -------------------------------------------------------------------------------- 1 | from django.db import models 2 | 3 | 4 | class Queue(models.Model): 5 | """Placeholder model with no database table, but with django admin page 6 | and contenttype permission""" 7 | 8 | class Meta: 9 | managed = False # not in Django's database 10 | default_permissions = () 11 | permissions = [['view', 'Access admin page']] 12 | verbose_name_plural = " Queues" 13 | 
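Editor's note on the model above: `BaseJobArg` persists every argument as a string (`val`) plus a type tag (`arg_type`), and `value()` converts it back to a Python object on access. A minimal standalone sketch of that conversion — with the `callable` case omitted, since it goes through `scheduler.tools.callable_func` — purely for illustration; this helper is not part of the package API:

```python
# Illustrative sketch of the arg_type -> Python value mapping used by
# BaseJobArg.value() above; not part of the package API.
from datetime import datetime


def parse_arg(arg_type: str, val: str):
    """Convert the stored string `val` into a Python object based on `arg_type`."""
    if arg_type == 'str':
        return val
    if arg_type == 'int':
        return int(val)
    if arg_type == 'bool':
        # clean() only accepts 'true'/'false' (case-insensitive), so this is safe
        return val.lower() == 'true'
    if arg_type == 'datetime':
        return datetime.fromisoformat(val)
    raise ValueError(f'Unsupported arg_type: {arg_type}')


print(parse_arg('int', '42'))                        # 42
print(parse_arg('bool', 'True'))                     # True
print(parse_arg('datetime', '2023-05-11T16:40:00'))  # 2023-05-11 16:40:00
```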
-------------------------------------------------------------------------------- /scheduler/models/worker.py: -------------------------------------------------------------------------------- 1 | from django.db import models 2 | 3 | 4 | class Worker(models.Model): 5 | """Placeholder model with no database table, but with django admin page 6 | and contenttype permission""" 7 | 8 | class Meta: 9 | managed = False # not in Django's database 10 | default_permissions = () 11 | permissions = [['view', 'Access admin page']] 12 | verbose_name_plural = " Workers" 13 | -------------------------------------------------------------------------------- /scheduler/py.typed: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/py.typed -------------------------------------------------------------------------------- /scheduler/queues.py: -------------------------------------------------------------------------------- 1 | from typing import List, Dict 2 | 3 | import redis 4 | from redis.sentinel import Sentinel 5 | 6 | from .rq_classes import JobExecution, DjangoQueue, DjangoWorker 7 | from .settings import get_config 8 | from .settings import logger 9 | 10 | _CONNECTION_PARAMS = { 11 | 'URL', 12 | 'DB', 13 | 'USE_REDIS_CACHE', 14 | 'UNIX_SOCKET_PATH', 15 | 'HOST', 16 | 'PORT', 17 | 'PASSWORD', 18 | 'SENTINELS', 19 | 'MASTER_NAME', 20 | 'SOCKET_TIMEOUT', 21 | 'SSL', 22 | 'CONNECTION_KWARGS', 23 | } 24 | 25 | 26 | class QueueNotFoundError(Exception): 27 | pass 28 | 29 | 30 | def _get_redis_connection(config, use_strict_redis=False): 31 | """ 32 | Returns a redis connection from a connection config 33 | """ 34 | if get_config('FAKEREDIS'): 35 | import fakeredis 36 | redis_cls = fakeredis.FakeRedis if use_strict_redis else fakeredis.FakeStrictRedis 37 | else: 38 | redis_cls = redis.StrictRedis if use_strict_redis else redis.Redis 39 | logger.debug(f'Getting connection for {config}') 40 | if 'URL' in config: 41 | if config.get('SSL') or config.get('URL').startswith('rediss://'): 42 | return redis_cls.from_url( 43 | config['URL'], 44 | db=config.get('DB'), 45 | ssl_cert_reqs=config.get('SSL_CERT_REQS', 'required'), 46 | ) 47 | else: 48 | return redis_cls.from_url( 49 | config['URL'], 50 | db=config.get('DB'), 51 | ) 52 | if 'UNIX_SOCKET_PATH' in config: 53 | return redis_cls(unix_socket_path=config['UNIX_SOCKET_PATH'], db=config['DB']) 54 | 55 | if 'SENTINELS' in config: 56 | connection_kwargs = { 57 | 'db': config.get('DB'), 58 | 'password': config.get('PASSWORD'), 59 | 'username': config.get('USERNAME'), 60 | 'socket_timeout': config.get('SOCKET_TIMEOUT'), 61 | } 62 | connection_kwargs.update(config.get('CONNECTION_KWARGS', {})) 63 | sentinel_kwargs = config.get('SENTINEL_KWARGS', {}) 64 | sentinel = Sentinel(config['SENTINELS'], sentinel_kwargs=sentinel_kwargs, **connection_kwargs) 65 | return sentinel.master_for( 66 | service_name=config['MASTER_NAME'], 67 | redis_class=redis_cls, 68 | ) 69 | 70 | return redis_cls( 71 | host=config['HOST'], 72 | port=config['PORT'], 73 | db=config.get('DB', 0), 74 | username=config.get('USERNAME', None), 75 | password=config.get('PASSWORD'), 76 | ssl=config.get('SSL', False), 77 | ssl_cert_reqs=config.get('SSL_CERT_REQS', 'required'), 78 | **config.get('REDIS_CLIENT_KWARGS', {}) 79 | ) 80 | 81 | 82 | def get_connection(queue_settings, use_strict_redis=False): 83 | """Returns a Redis connection to use based on parameters in 
SCHEDULER_QUEUES 84 | """ 85 | return _get_redis_connection(queue_settings, use_strict_redis) 86 | 87 | 88 | def get_queue( 89 | name='default', 90 | default_timeout=None, is_async=None, 91 | autocommit=None, 92 | connection=None, 93 | **kwargs 94 | ) -> DjangoQueue: 95 | """Returns an DjangoQueue using parameters defined in `SCHEDULER_QUEUES` 96 | """ 97 | from .settings import QUEUES 98 | if name not in QUEUES: 99 | raise QueueNotFoundError(f'Queue {name} not found, queues={QUEUES.keys()}') 100 | queue_settings = QUEUES[name] 101 | if is_async is None: 102 | is_async = queue_settings.get('ASYNC', True) 103 | 104 | if default_timeout is None: 105 | default_timeout = queue_settings.get('DEFAULT_TIMEOUT') 106 | if connection is None: 107 | connection = get_connection(queue_settings) 108 | return DjangoQueue( 109 | name, 110 | default_timeout=default_timeout, 111 | connection=connection, 112 | is_async=is_async, 113 | autocommit=autocommit, 114 | **kwargs 115 | ) 116 | 117 | 118 | def get_all_workers(): 119 | from .settings import QUEUES 120 | workers = set() 121 | for queue_name in QUEUES: 122 | connection = get_connection(QUEUES[queue_name]) 123 | try: 124 | curr_workers = set(DjangoWorker.all(connection=connection)) 125 | workers.update(curr_workers) 126 | except redis.ConnectionError as e: 127 | logger.error(f'Could not connect for queue {queue_name}: {e}') 128 | return workers 129 | 130 | 131 | def _queues_share_connection_params(q1_params: Dict, q2_params: Dict): 132 | """Check that both queues share the same connection parameters 133 | """ 134 | return all( 135 | ((p not in q1_params and p not in q2_params) 136 | or (q1_params.get(p, None) == q2_params.get(p, None))) 137 | for p in _CONNECTION_PARAMS) 138 | 139 | 140 | def get_queues(*queue_names, **kwargs) -> List[DjangoQueue]: 141 | """Return queue instances from specified queue names. 142 | All instances must use the same Redis connection. 143 | """ 144 | from .settings import QUEUES 145 | 146 | kwargs['job_class'] = JobExecution 147 | queue_params = QUEUES[queue_names[0]] 148 | queues = [get_queue(queue_names[0], **kwargs)] 149 | # perform consistency checks while building return list 150 | for name in queue_names[1:]: 151 | if not _queues_share_connection_params(queue_params, QUEUES[name]): 152 | raise ValueError( 153 | f'Queues must have the same redis connection. 
"{name}" and' 154 | f' "{queue_names[0]}" have different connections') 155 | queue = get_queue(name, **kwargs) 156 | queues.append(queue) 157 | 158 | return queues 159 | -------------------------------------------------------------------------------- /scheduler/rq_classes.py: -------------------------------------------------------------------------------- 1 | from typing import List, Any, Optional, Union 2 | 3 | import django 4 | from django.apps import apps 5 | from redis import Redis 6 | from redis.client import Pipeline 7 | from rq import Worker 8 | from rq.command import send_stop_job_command 9 | from rq.decorators import job 10 | from rq.exceptions import InvalidJobOperation 11 | from rq.job import Job, JobStatus 12 | from rq.job import get_current_job # noqa 13 | from rq.queue import Queue, logger 14 | from rq.registry import ( 15 | DeferredJobRegistry, FailedJobRegistry, FinishedJobRegistry, 16 | ScheduledJobRegistry, StartedJobRegistry, CanceledJobRegistry, BaseRegistry, 17 | ) 18 | from rq.scheduler import RQScheduler 19 | from rq.worker import WorkerStatus 20 | 21 | from scheduler import settings 22 | 23 | MODEL_NAMES = ['ScheduledJob', 'RepeatableJob', 'CronJob'] 24 | 25 | rq_job_decorator = job 26 | ExecutionStatus = JobStatus 27 | InvalidJobOperation = InvalidJobOperation 28 | 29 | 30 | def as_text(v: Union[bytes, str]) -> Optional[str]: 31 | """Converts a bytes value to a string using `utf-8`. 32 | 33 | :param v: The value (bytes or string) 34 | :raises: ValueError: If the value is not bytes or string 35 | :returns: Either the decoded string or None 36 | """ 37 | if v is None: 38 | return None 39 | elif isinstance(v, bytes): 40 | return v.decode('utf-8') 41 | elif isinstance(v, str): 42 | return v 43 | else: 44 | raise ValueError('Unknown type %r' % type(v)) 45 | 46 | 47 | def compact(lst: List[Any]) -> List[Any]: 48 | """Remove `None` values from an iterable object. 49 | :param lst: A list (or list-like) object 50 | :returns: The list without None values 51 | """ 52 | return [item for item in lst if item is not None] 53 | 54 | 55 | class JobExecution(Job): 56 | def __eq__(self, other): 57 | return isinstance(other, Job) and self.id == other.id 58 | 59 | @property 60 | def is_scheduled_job(self): 61 | return self.meta.get('scheduled_job_id', None) is not None 62 | 63 | def is_execution_of(self, scheduled_job): 64 | return (self.meta.get('job_type', None) == scheduled_job.JOB_TYPE 65 | and self.meta.get('scheduled_job_id', None) == scheduled_job.id) 66 | 67 | def stop_execution(self, connection: Redis): 68 | send_stop_job_command(connection, self.id) 69 | 70 | 71 | class DjangoWorker(Worker): 72 | def __init__(self, *args, **kwargs): 73 | self.fork_job_execution = kwargs.pop('fork_job_execution', True) 74 | kwargs['job_class'] = JobExecution 75 | kwargs['queue_class'] = DjangoQueue 76 | super(DjangoWorker, self).__init__(*args, **kwargs) 77 | 78 | def __eq__(self, other): 79 | return (isinstance(other, Worker) 80 | and self.key == other.key 81 | and self.name == other.name) 82 | 83 | def __hash__(self): 84 | return hash((self.name, self.key, ','.join(self.queue_names()))) 85 | 86 | def __str__(self): 87 | return f"{self.name}/{','.join(self.queue_names())}" 88 | 89 | def _start_scheduler( 90 | self, 91 | burst: bool = False, 92 | logging_level: str = "INFO", 93 | date_format: str = '%H:%M:%S', 94 | log_format: str = '%(asctime)s %(message)s', 95 | ) -> None: 96 | """Starts the scheduler process. 
97 | This is specifically designed to be run by the worker when running the `work()` method. 98 | Instantiates the DjangoScheduler and tries to acquire a lock. 99 | If the lock is acquired, start scheduler. 100 | If worker is on burst mode just enqueues scheduled jobs and quits, 101 | otherwise, starts the scheduler in a separate process. 102 | 103 | 104 | :param burst (bool, optional): Whether to work on burst mode. Defaults to False. 105 | :param logging_level (str, optional): Logging level to use. Defaults to "INFO". 106 | :param date_format (str, optional): Date Format. Defaults to DEFAULT_LOGGING_DATE_FORMAT. 107 | :param log_format (str, optional): Log Format. Defaults to DEFAULT_LOGGING_FORMAT. 108 | """ 109 | self.scheduler = DjangoScheduler( 110 | self.queues, 111 | connection=self.connection, 112 | logging_level=logging_level, 113 | date_format=date_format, 114 | log_format=log_format, 115 | serializer=self.serializer, 116 | ) 117 | self.scheduler.acquire_locks() 118 | if self.scheduler.acquired_locks: 119 | if burst: 120 | self.scheduler.enqueue_scheduled_jobs() 121 | self.scheduler.release_locks() 122 | else: 123 | proc = self.scheduler.start() 124 | self._set_property('scheduler_pid', proc.pid) 125 | 126 | def execute_job(self, job: 'Job', queue: 'Queue'): 127 | if self.fork_job_execution: 128 | super(DjangoWorker, self).execute_job(job, queue) 129 | else: 130 | self.set_state(WorkerStatus.BUSY) 131 | self.perform_job(job, queue) 132 | self.set_state(WorkerStatus.IDLE) 133 | 134 | def work(self, **kwargs) -> bool: 135 | kwargs.setdefault('with_scheduler', True) 136 | return super(DjangoWorker, self).work(**kwargs) 137 | 138 | def _set_property(self, prop_name: str, val, pipeline: Optional[Pipeline] = None): 139 | connection = pipeline if pipeline is not None else self.connection 140 | if val is None: 141 | connection.hdel(self.key, prop_name) 142 | else: 143 | connection.hset(self.key, prop_name, val) 144 | 145 | def _get_property(self, prop_name: str, pipeline: Optional[Pipeline] = None): 146 | connection = pipeline if pipeline is not None else self.connection 147 | return as_text(connection.hget(self.key, prop_name)) 148 | 149 | def scheduler_pid(self) -> int: 150 | pid = self.connection.get(RQScheduler.get_locking_key(self.queues[0].name)) 151 | return int(pid.decode()) if pid is not None else None 152 | 153 | 154 | class DjangoQueue(Queue): 155 | REGISTRIES = dict( 156 | finished='finished_job_registry', 157 | failed='failed_job_registry', 158 | scheduled='scheduled_job_registry', 159 | started='started_job_registry', 160 | deferred='deferred_job_registry', 161 | canceled='canceled_job_registry', 162 | ) 163 | """ 164 | A subclass of RQ's QUEUE that allows jobs to be stored temporarily to be 165 | enqueued later at the end of Django's request/response cycle. 
166 | """ 167 | 168 | def __init__(self, *args, **kwargs): 169 | kwargs['job_class'] = JobExecution 170 | super(DjangoQueue, self).__init__(*args, **kwargs) 171 | 172 | def get_registry(self, name: str) -> Union[None, BaseRegistry, 'DjangoQueue']: 173 | name = name.lower() 174 | if name == 'queued': 175 | return self 176 | elif name in DjangoQueue.REGISTRIES: 177 | return getattr(self, DjangoQueue.REGISTRIES[name]) 178 | return None 179 | 180 | @property 181 | def finished_job_registry(self): 182 | return FinishedJobRegistry(self.name, self.connection) 183 | 184 | @property 185 | def started_job_registry(self): 186 | return StartedJobRegistry(self.name, self.connection, job_class=JobExecution, ) 187 | 188 | @property 189 | def deferred_job_registry(self): 190 | return DeferredJobRegistry(self.name, self.connection, job_class=JobExecution, ) 191 | 192 | @property 193 | def failed_job_registry(self): 194 | return FailedJobRegistry(self.name, self.connection, job_class=JobExecution, ) 195 | 196 | @property 197 | def scheduled_job_registry(self): 198 | return ScheduledJobRegistry(self.name, self.connection, job_class=JobExecution, ) 199 | 200 | @property 201 | def canceled_job_registry(self): 202 | return CanceledJobRegistry(self.name, self.connection, job_class=JobExecution, ) 203 | 204 | def get_all_job_ids(self) -> List[str]: 205 | res = list() 206 | res.extend(self.get_job_ids()) 207 | res.extend(self.finished_job_registry.get_job_ids()) 208 | res.extend(self.started_job_registry.get_job_ids()) 209 | res.extend(self.deferred_job_registry.get_job_ids()) 210 | res.extend(self.failed_job_registry.get_job_ids()) 211 | res.extend(self.scheduled_job_registry.get_job_ids()) 212 | res.extend(self.canceled_job_registry.get_job_ids()) 213 | return res 214 | 215 | def get_all_jobs(self): 216 | job_ids = self.get_all_job_ids() 217 | return compact([self.fetch_job(job_id) for job_id in job_ids]) 218 | 219 | def clean_registries(self): 220 | self.started_job_registry.cleanup() 221 | self.failed_job_registry.cleanup() 222 | self.finished_job_registry.cleanup() 223 | 224 | def remove_job_id(self, job_id: str): 225 | self.connection.lrem(self.key, 0, job_id) 226 | 227 | def last_job_id(self): 228 | return self.connection.lindex(self.key, 0) 229 | 230 | 231 | class DjangoScheduler(RQScheduler): 232 | def __init__(self, *args, **kwargs): 233 | kwargs.setdefault('interval', settings.SCHEDULER_CONFIG['SCHEDULER_INTERVAL']) 234 | super(DjangoScheduler, self).__init__(*args, **kwargs) 235 | 236 | @staticmethod 237 | def reschedule_all_jobs(): 238 | logger.debug("Rescheduling all jobs") 239 | for model_name in MODEL_NAMES: 240 | model = apps.get_model(app_label='scheduler', model_name=model_name) 241 | enabled_jobs = model.objects.filter(enabled=True) 242 | unscheduled_jobs = filter(lambda j: j.ready_for_schedule(), enabled_jobs) 243 | for item in unscheduled_jobs: 244 | logger.debug(f"Rescheduling {str(item)}") 245 | item.save() 246 | 247 | def work(self): 248 | django.setup() 249 | super(DjangoScheduler, self).work() 250 | 251 | def enqueue_scheduled_jobs(self): 252 | self.reschedule_all_jobs() 253 | super(DjangoScheduler, self).enqueue_scheduled_jobs() 254 | -------------------------------------------------------------------------------- /scheduler/settings.py: -------------------------------------------------------------------------------- 1 | import logging 2 | 3 | from django.conf import settings 4 | from django.core.exceptions import ImproperlyConfigured 5 | 6 | logger = logging.getLogger("scheduler") 7 | 
8 | QUEUES = dict() 9 | SCHEDULER_CONFIG = dict() 10 | 11 | 12 | def conf_settings(): 13 | global QUEUES 14 | global SCHEDULER_CONFIG 15 | 16 | QUEUES = getattr(settings, 'SCHEDULER_QUEUES', None) 17 | if QUEUES is None: 18 | logger.warning('Configuration using RQ_QUEUES is deprecated. Use SCHEDULER_QUEUES instead') 19 | QUEUES = getattr(settings, 'RQ_QUEUES', None) 20 | if QUEUES is None: 21 | raise ImproperlyConfigured("You have to define SCHEDULER_QUEUES in settings.py") 22 | 23 | SCHEDULER_CONFIG = { 24 | 'EXECUTIONS_IN_PAGE': 20, 25 | 'DEFAULT_RESULT_TTL': 600, # 10 minutes 26 | 'DEFAULT_TIMEOUT': 300, # 5 minutes 27 | 'SCHEDULER_INTERVAL': 10, # 10 seconds 28 | 'FAKEREDIS': False, # For testing purposes 29 | } 30 | user_settings = getattr(settings, 'SCHEDULER_CONFIG', {}) 31 | SCHEDULER_CONFIG.update(user_settings) 32 | 33 | 34 | conf_settings() 35 | 36 | 37 | def get_config(key: str, default=None): 38 | return SCHEDULER_CONFIG.get(key, None) 39 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/change_form.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/change_form.html" %} 2 | {% load i18n %} 3 | 4 | {% block after_related_objects %} 5 | {% include 'admin/scheduler/jobs-list.partial.html' %} 6 | {% endblock %} -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/change_list.html: -------------------------------------------------------------------------------- 1 | {% extends 'admin/change_list.html' %} 2 | {% load scheduler_tags %} 3 | 4 | {% block object-tools %} 5 | {{ block.super }} 6 | {% endblock %} -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/confirm_action.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | {% load scheduler_tags %} 3 | 4 | {% block breadcrumbs %} 5 | 11 | {% endblock %} 12 | 13 | {% block content_title %}

Are you sure?

{% endblock %} 14 | 15 | {% block content %} 16 |
17 |

18 | Are you sure you want to {{ action|capfirst }} the {{ total_jobs }} selected jobs from 19 | {{ queue.name }} 20 | ? These jobs are selected: 21 |

22 |
    23 | {% for job in jobs %} 24 |
  • 25 | {{ job.id }} 26 | {{ job | show_func_name }} 27 |
  • 28 | {% endfor %} 29 |
30 |
31 | {% csrf_token %} 32 |
33 | {% for job in jobs %} 34 | 35 | {% endfor %} 36 | 37 | 38 | 39 |
40 |
41 |
42 | {% endblock %} 43 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/job_detail.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | {% load static scheduler_tags %} 3 | 4 | {% load static %} 5 | 6 | {% block title %}Job {{ job.id }} {{ block.super }}{% endblock %} 7 | 8 | {% block breadcrumbs %} 9 | 15 | {% endblock %} 16 | 17 | {% block content_title %} 18 |

Job {{ job.id }} 19 | {% if job.is_scheduled_job %} 20 | 21 | Link to scheduled job 22 | 23 | {% endif %} 24 |

25 | {% endblock %} 26 | 27 | {% block content %} 28 |
29 |
30 |
31 | 32 |
{{ job.origin }}
33 |
34 | 35 | 36 |
37 | 38 |
{{ job.timeout }}
39 |
40 | 41 |
42 | 43 |
{{ job.result_ttl }}
44 |
45 | 46 |
47 | 48 |
{{ job.created_at|date:"Y-m-d, H:i:s"|default:"-" }}
49 |
50 | 51 | 52 |
53 | 54 |
{{ job.enqueued_at|date:"Y-m-d, H:i:s"|default:"-" }}
55 |
56 | 57 |
58 | 59 |
{{ job.started_at|date:"Y-m-d, H:i:s"|default:"-" }}
60 |
61 | 62 |
63 | 64 |
{{ job.ended_at|date:"Y-m-d, H:i:s"|default:"-" }}
65 |
66 | 67 | 68 |
69 | 70 |
{{ job.get_status }}
71 |
72 | 73 |
74 | 75 |
76 | {% if data_is_valid %} 77 | {{ job.func_name }}( 78 | {% if job.args %} 79 | {% for arg in job.args %} 80 | {{ arg|force_escape }}, 81 | {% endfor %} 82 | {% endif %} 83 | {% for key, value in job.kwargs.items %} 84 | {{ key }}={{ value|force_escape }}, 85 | {% endfor %}) 86 | {% else %} 87 | Unpickling Error 88 | {% endif %} 89 |
90 |
91 |
92 | 93 |
{{ job | show_func_name }}
94 |
95 | 96 |
97 | 98 |
99 | {% for k in job.meta %} 100 |
101 | 102 |
{{ job.meta | get_item:k }}
103 |
104 | {% endfor %} 105 |
106 |
107 | 108 | 109 |
110 |
111 | 112 |
113 | {% if dependency_id %} 114 | 115 | {{ dependency_id }} 116 | 117 | {% endif %} 118 |
119 |
120 |
121 | {% if exc_info %} 122 |
123 | 124 |
125 |
{% if job.exc_info %}{{ job.exc_info|linebreaks }}{% endif %}
126 |
127 | 128 |
129 | {% endif %} 130 | 131 |
132 | 133 |
{{ job.result | default:'-' }}
134 |
135 |
136 | 137 | 138 |
139 | 143 | {% if job.is_started %} 144 | 150 | {% endif %} 151 | {% if job.is_failed %} 152 | 158 | {% endif %} 159 | {% if not job.is_queued and not job.is_failed %} 160 | 166 | {% endif %} 167 |
168 |
169 | 170 | 171 |
172 | 173 | {% for result in job.results %} 174 |

Result {{ result.id }}

175 | 212 | {% endfor %} 213 |
214 | 215 | 216 | 217 | {% endblock %} 218 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/jobs-list.partial.html: -------------------------------------------------------------------------------- 1 | {% load scheduler_tags i18n %} 2 | {% if not add %} 3 |
4 |

Job executions

5 |
6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | {% for exec in executions %} 21 | 22 | 25 | 28 | 31 | 34 | 37 | 40 | 43 | 46 | 47 | {% endfor %} 48 | 49 |
IDSTATUSCreated atEnqueued atStarted atRan forWorker nameResult
23 | {{ exec.id }} 24 | 26 | {{ exec|job_status }} 27 | 29 | {{ exec.created_at|date:"Y-m-d, H:i:s"|default:"-" }} 30 | 32 | {{ exec.enqueued_at|date:"Y-m-d, H:i:s"|default:"-" }} 33 | 35 | {{ exec.started_at|date:"Y-m-d, H:i:s"|default:"-" }} 36 | 38 | {{ exec|job_runtime }} 39 | 41 | {{ exec.worker_name|default:"-" }} 42 | 44 | {{ exec|job_result|default:"-" }} 45 |
50 |
51 |

52 | {% if pagination_required %} 53 | {% for i in page_range %} 54 | {% if i == executions.paginator.ELLIPSIS %} 55 | {{ executions.paginator.ELLIPSIS }} 56 | {% elif i == executions.number %} 57 | {{ i }} 58 | {% else %} 59 | {{ i }} 61 | {% endif %} 62 | {% endfor %} 63 | {{ executions.paginator.count }} {% blocktranslate count counter=executions.paginator.count %}entry 64 | {% plural %}entries{% endblocktranslate %} 65 | {% endif %} 66 |

67 |
68 | {% endif %} -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/jobs.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | 3 | {% load static scheduler_tags %} 4 | 5 | {% block title %}{{ job_status }} Jobs in {{ queue.name }} {{ block.super }}{% endblock %} 6 | 7 | {% block extrastyle %} 8 | {{ block.super }} 9 | 10 | {% endblock %} 11 | 12 | 13 | {% block breadcrumbs %} 14 | 19 | {% endblock %} 20 | 21 | {% block content_title %}

{{ job_status }} jobs in {{ queue.name }}

{% endblock %} 22 | 23 | {% block content %} 24 | 25 |
26 | 33 |
34 |
35 | {% csrf_token %} 36 |
37 | 47 | 49 |
50 |
51 | 52 | 53 | 54 | 61 | 64 | 67 | 70 | 73 | 76 | 79 | 82 | 85 | {% block extra_columns %} 86 | {% endblock extra_columns %} 87 | 88 | 89 | 90 | {% for job in jobs %} 91 | 92 | 97 | 102 | 105 | 108 | 111 | 114 | 115 | 116 | 117 | {% block extra_columns_values %} 118 | {% endblock extra_columns_values %} 119 | 120 | {% endfor %} 121 | 122 |
55 |
56 | 58 |
59 |
60 |
62 |
ID
63 |
65 |
Created
66 |
68 |
Scheduled
69 |
71 |
Enqueued
72 |
74 |
Ended
75 |
77 |
Status
78 |
80 |
Callable
81 |
83 |
Worker
84 |
93 | 96 | 98 | 99 | {{ job.id }} 100 | 101 | 103 | {{ job.created_at|date:"Y-m-d, H:i:s" }} 104 | 106 | {{ job|job_scheduled_time:queue|date:"Y-m-d, H:i:s" }} 107 | 109 | {{ job.enqueued_at|date:"Y-m-d, H:i:s" }} 110 | 112 | {{ job.ended_at|date:"Y-m-d, H:i:s" }} 113 | {{ job.get_status }}{{ job|show_func_name }}{{ job.worker_name|default:'-' }}
123 |
124 |

125 | {% for p in page_range %} 126 | {% if p == page %} 127 | {{ p }} 128 | {% elif forloop.last %} 129 | {{ p }} 130 | {% else %} 131 | {{ p }} 132 | {% endif %} 133 | {% endfor %} 134 | {{ num_jobs }} jobs 135 |

136 |
137 |
138 |
139 | 140 | {% endblock %} 141 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/queue_workers.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | 3 | {% load static scheduler_tags l10n %} 4 | 5 | {% block title %}Workers in {{ queue.name }} {{ block.super }}{% endblock %} 6 | 7 | {% block extrastyle %} 8 | {{ block.super }} 9 | 10 | {% endblock %} 11 | 12 | 13 | {% block breadcrumbs %} 14 | 19 | {% endblock %} 20 | 21 | {% block content_title %}

Workers in {{ queue.name }}

{% endblock %} 22 | 23 | {% block content %} 24 | 25 |
26 |
27 | {% include 'admin/scheduler/workers-list.partial.html' %} 28 |
29 |
30 | 31 | {% endblock %} 32 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/scheduler_base.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/base_site.html" %} 2 | {% load scheduler_tags %} 3 | 4 | {% load static %} 5 | 6 | {% block extrastyle %} 7 | {{ block.super }} 8 | 17 | 18 | {% endblock %} 19 | 20 | {% block extrahead %} 21 | {{ block.super }} 22 | 23 | {% endblock %} -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/single_job_action.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | {% load scheduler_tags %} 3 | 4 | {% block breadcrumbs %} 5 | 12 | {% endblock %} 13 | 14 | {% block content_title %}

Are you sure?

{% endblock %} 15 | 16 | {% block content %} 17 | 18 |
19 |

20 | Are you sure you want to {{ action }} 21 | 22 | {{ job.id }} ({{ job|show_func_name }}) 23 | 24 | from 25 | {{ queue.name }}? 26 | This action can not be undone. 27 |
28 | {% if job.is_scheduled_job %} 29 | Note: This scheduled job will be scheduled again if it is enabled 30 | {% endif %} 31 |

32 |
33 | {% csrf_token %} 34 |
35 | 36 |
37 |
38 |
39 | 40 | {% endblock %} 41 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/stats.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/base_site.html" %} 2 | 3 | {% block title %}Queues {{ block.super }}{% endblock %} 4 | 5 | {% block extrastyle %} 6 | {{ block.super }} 7 | 10 | {% endblock %} 11 | 12 | {% block content_title %}

RQ Queues

{% endblock %} 13 | 14 | {% block breadcrumbs %} 15 | 19 | {% endblock %} 20 | 21 | {% block content %} 22 | 23 |
24 | 25 |
26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | {% if queue.scheduler_pid is not False %} 43 | 44 | {% endif %} 45 | 46 | 47 | 48 | {% for queue in queues %} 49 | 50 | 55 | 60 | 61 | 66 | 71 | 76 | 81 | 86 | 91 | 95 | 96 | 97 | 98 | {% if queue.scheduler_pid is not False %} 99 | 100 | {% endif %} 101 | 102 | {% endfor %} 103 | 104 |
NameQueued JobsOldest Queued JobActive JobsDeferred JobsFinished JobsFailed JobsScheduled JobsCanceled JobsWorkersHostPortDBScheduler PID
51 | 52 | {{ queue.name }} 53 | 54 | 56 | 57 | {{ queue.jobs }} 58 | 59 | {{ queue.oldest_job_timestamp }} 62 | 63 | {{ queue.started_jobs }} 64 | 65 | 67 | 68 | {{ queue.deferred_jobs }} 69 | 70 | 72 | 73 | {{ queue.finished_jobs }} 74 | 75 | 77 | 78 | {{ queue.failed_jobs }} 79 | 80 | 82 | 83 | {{ queue.scheduled_jobs }} 84 | 85 | 87 | 88 | {{ queue.canceled_jobs }} 89 | 90 | 92 | {{ queue.workers }} 93 | 94 | {{ queue.connection_kwargs.host }}{{ queue.connection_kwargs.port }}{{ queue.connection_kwargs.db }}{{ queue.scheduler_pid|default_if_none:"Inactive" }}
105 |
106 | View as JSON 107 |
108 |
109 | 110 | {% endblock %} 111 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/worker_details.html: -------------------------------------------------------------------------------- 1 | {% extends 'admin/scheduler/scheduler_base.html' %} 2 | 3 | {% block breadcrumbs %} 4 | 9 | {% endblock %} 10 | 11 | {% block content_title %}

Worker Info

{% endblock %} 12 | 13 | {% block content %} 14 |
15 |
16 |
17 |
18 | 19 |
{{ worker.name }}
20 |
21 |
22 | 23 |
24 |
25 | 26 |
{{ worker.get_state }}
27 |
28 |
29 | 30 |
31 |
32 | 33 |
{{ worker.birth_date|date:"Y-m-d, H:i:s" }}
34 |
35 |
36 | 37 |
38 |
39 | 40 |
{{ queue_names }}
41 |
42 |
43 | 44 |
45 |
46 | 47 |
{{ worker.pid }}
48 |
49 |
50 | 51 | 52 |
53 |
54 | 55 |
56 | {% if job %} 57 | {{ job.func_name }} 58 | ({{ job.id }}) 59 | {% else %} 60 | No current job 61 | {% endif %} 62 |
63 |
64 |
65 | 66 | 67 |
68 |
69 | 70 |
{{ worker.successful_job_count|default:0 }}
71 |
72 |
73 | 74 |
75 |
76 | 77 |
{{ worker.failed_job_count|default:0 }}
78 |
79 |
80 | 81 |
82 |
83 | 84 |
{{ total_working_time|default:0.0 }}ms
85 |
86 |
87 |
88 | {% include 'admin/scheduler/jobs-list.partial.html' %} 89 |
90 | 91 | {% endblock %} 92 | -------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/workers-list.partial.html: -------------------------------------------------------------------------------- 1 | {% load scheduler_tags %} 2 | {% load l10n %} 3 |
4 | 5 | 6 | 7 | 10 | 13 | 16 | 19 | 22 | 25 | 28 | 31 | 34 | 35 | 36 | 37 | {% for worker in workers %} 38 | 39 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | {% endfor %} 54 | 55 |
8 |
Name
9 |
11 |
State
12 |
14 |
Birth
15 |
17 |
Hostname
18 |
20 |
PID
21 |
23 |
Working time
24 |
26 |
Successful jobs
27 |
29 |
Failed jobs
30 |
32 |
Scheduler
33 |
40 | 41 | {{ worker.name }} 42 | 43 | {{ worker.get_state }}{{ worker.birth_date|date:"Y-m-d, H:i:s" }}{{ worker.hostname }}{{ worker.pid|unlocalize }}{{ worker.total_working_time|default:0|floatformat }}secs{{ worker.successful_job_count|default:0 }}{{ worker.failed_job_count|default:0 }}{{ worker | worker_scheduler_pid }}
56 |
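Editor's note: the Scheduler column in the partial above is filled by the `worker_scheduler_pid` template filter, which calls `DjangoWorker.scheduler_pid()` (see `rq_classes.py` earlier); the PID is read back from the RQ scheduler lock key of the worker's first queue. A rough standalone sketch of that lookup, assuming a plain `redis` client and that `rq` is installed — illustrative only, not the filter itself:

```python
# Illustrative sketch mirroring DjangoWorker.scheduler_pid(); not the template filter.
from typing import Optional

from redis import Redis
from rq.scheduler import RQScheduler


def scheduler_pid_for_queue(connection: Redis, queue_name: str) -> Optional[int]:
    """Return the PID stored under the RQ scheduler lock key, if a scheduler holds it."""
    pid = connection.get(RQScheduler.get_locking_key(queue_name))
    return int(pid.decode()) if pid is not None else None


# Example usage (assumes a local Redis instance):
# print(scheduler_pid_for_queue(Redis(), 'default'))
```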
-------------------------------------------------------------------------------- /scheduler/templates/admin/scheduler/workers.html: -------------------------------------------------------------------------------- 1 | {% extends "admin/scheduler/scheduler_base.html" %} 2 | 3 | {% load static scheduler_tags l10n %} 4 | 5 | {% block title %}Workers in {{ queue.name }} {{ block.super }}{% endblock %} 6 | 7 | {% block extrastyle %} 8 | {{ block.super }} 9 | 10 | {% endblock %} 11 | 12 | 13 | {% block breadcrumbs %} 14 | 18 | {% endblock %} 19 | 20 | {% block content_title %}

Django Workers

{% endblock %} 21 | 22 | {% block content %} 23 | 24 |
25 |
26 |
27 | {% csrf_token %} 28 | {% include 'admin/scheduler/workers-list.partial.html' %} 29 |
30 |
31 |
32 | 33 | {% endblock %} 34 | -------------------------------------------------------------------------------- /scheduler/templatetags/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/templatetags/__init__.py -------------------------------------------------------------------------------- /scheduler/templatetags/scheduler_tags.py: -------------------------------------------------------------------------------- 1 | from typing import Dict 2 | 3 | from django import template 4 | from django.utils.safestring import mark_safe 5 | 6 | from scheduler.rq_classes import JobExecution, DjangoQueue 7 | from scheduler.tools import get_scheduled_job 8 | 9 | register = template.Library() 10 | 11 | 12 | @register.filter 13 | def show_func_name(rq_job: JobExecution) -> str: 14 | try: 15 | res = rq_job.func_name 16 | if res == 'scheduler.tools.run_job': 17 | job = get_scheduled_job(*rq_job.args) 18 | res = job.function_string() 19 | return mark_safe(res) 20 | except Exception as e: 21 | return repr(e) 22 | 23 | 24 | @register.filter 25 | def get_item(dictionary: Dict, key): 26 | return dictionary.get(key) 27 | 28 | 29 | @register.filter 30 | def scheduled_job(job: JobExecution): 31 | django_scheduled_job = get_scheduled_job(*job.args) 32 | return django_scheduled_job.get_absolute_url() 33 | 34 | 35 | @register.filter 36 | def worker_scheduler_pid(worker): 37 | return worker.scheduler_pid() 38 | 39 | 40 | @register.filter 41 | def job_result(job: JobExecution): 42 | result = job.latest_result() 43 | return result.type.name.capitalize() if result else None 44 | 45 | 46 | @register.filter 47 | def job_status(job: JobExecution): 48 | result = job.get_status() 49 | return result.capitalize() 50 | 51 | 52 | @register.filter 53 | def job_runtime(job: JobExecution): 54 | ended_at = job.ended_at 55 | if ended_at: 56 | runtime = job.ended_at - job.started_at 57 | return f'{int(runtime.microseconds / 1000)}ms' 58 | elif job.started_at: 59 | return "Still running" 60 | else: 61 | return "-" 62 | 63 | 64 | @register.filter 65 | def job_scheduled_time(job: JobExecution, queue: DjangoQueue): 66 | try: 67 | return queue.scheduled_job_registry.get_scheduled_time(job.id) 68 | except Exception: 69 | return None 70 | -------------------------------------------------------------------------------- /scheduler/tests/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/scheduler/tests/__init__.py -------------------------------------------------------------------------------- /scheduler/tests/jobs.py: -------------------------------------------------------------------------------- 1 | from time import sleep 2 | 3 | _counter = 0 4 | 5 | 6 | def arg_callable(): 7 | global _counter 8 | _counter += 1 9 | return _counter 10 | 11 | 12 | def test_args_kwargs(*args, **kwargs): 13 | func = "test_args_kwargs({})" 14 | args_list = [repr(arg) for arg in args] 15 | kwargs_list = [f'{k}={v}' for (k, v) in kwargs.items()] 16 | return func.format(', '.join(args_list + kwargs_list)) 17 | 18 | 19 | def long_job(): 20 | sleep(10) 21 | 22 | 23 | test_non_callable = 'I am a teapot' 24 | 25 | 26 | def failing_job(): 27 | raise ValueError 28 | 29 | 30 | def test_job(): 31 | return 1 + 1 32 | 
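Editor's note as a quick reference for the tests that follow: in `scheduler/tests/jobs.py` above, `test_args_kwargs` simply renders its positional and keyword arguments into a call string (positional args via `repr`, kwargs as `key=value`), and `arg_callable` returns a module-level counter that increments on every call. A small usage sketch, assuming the test modules are importable:

```python
from scheduler.tests.jobs import arg_callable, test_args_kwargs

print(test_args_kwargs(1, 'a', key='value'))
# -> "test_args_kwargs(1, 'a', key=value)"

print(arg_callable(), arg_callable())
# -> 1 2  (the module-level counter increments on each call)
```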
-------------------------------------------------------------------------------- /scheduler/tests/test_internals.py: -------------------------------------------------------------------------------- 1 | from datetime import timedelta 2 | 3 | from django.utils import timezone 4 | 5 | from scheduler.models import ScheduledJob 6 | from scheduler.tests.testtools import SchedulerBaseCase, job_factory 7 | from scheduler.tools import get_scheduled_job 8 | 9 | 10 | class TestInternals(SchedulerBaseCase): 11 | def test_get_scheduled_job(self): 12 | job = job_factory(ScheduledJob, scheduled_time=timezone.now() - timedelta(hours=1)) 13 | self.assertEqual(job, get_scheduled_job(job.JOB_TYPE, job.id)) 14 | with self.assertRaises(ValueError): 15 | get_scheduled_job(job.JOB_TYPE, job.id + 1) 16 | with self.assertRaises(ValueError): 17 | get_scheduled_job('UNKNOWN_JOBTYPE', job.id) 18 | -------------------------------------------------------------------------------- /scheduler/tests/test_job_arg_models.py: -------------------------------------------------------------------------------- 1 | from django.core.exceptions import ValidationError 2 | from django.test import TestCase 3 | from django.utils import timezone 4 | 5 | from scheduler.models import JobArg, JobKwarg 6 | from .jobs import arg_callable 7 | from .testtools import jobarg_factory 8 | 9 | 10 | class TestAllJobArg(TestCase): 11 | JobArgClass = JobArg 12 | 13 | def test_bad_arg_type(self): 14 | arg = jobarg_factory(self.JobArgClass, arg_type='bad_arg_type', val='something') 15 | with self.assertRaises(ValidationError): 16 | arg.clean() 17 | 18 | def test_clean_one_value_invalid_str_int(self): 19 | arg = jobarg_factory(self.JobArgClass, arg_type='int', val='not blank', ) 20 | with self.assertRaises(ValidationError): 21 | arg.clean() 22 | 23 | def test_clean_callable_invalid(self): 24 | arg = jobarg_factory(self.JobArgClass, arg_type='callable', val='bad_callable', ) 25 | with self.assertRaises(ValidationError): 26 | arg.clean() 27 | 28 | def test_clean_datetime_invalid(self): 29 | arg = jobarg_factory(self.JobArgClass, arg_type='datetime', val='bad datetime', ) 30 | with self.assertRaises(ValidationError): 31 | arg.clean() 32 | 33 | def test_clean_bool_invalid(self): 34 | arg = jobarg_factory(self.JobArgClass, arg_type='bool', val='bad bool', ) 35 | with self.assertRaises(ValidationError): 36 | arg.clean() 37 | 38 | def test_clean_int_invalid(self): 39 | arg = jobarg_factory(self.JobArgClass, arg_type='int', val='str') 40 | with self.assertRaises(ValidationError): 41 | arg.clean() 42 | 43 | def test_str_clean(self): 44 | arg = jobarg_factory(self.JobArgClass, val='something') 45 | self.assertIsNone(arg.clean()) 46 | 47 | 48 | class TestJobArg(TestCase): 49 | JobArgClass = JobArg 50 | 51 | def test_str(self): 52 | arg = jobarg_factory(self.JobArgClass) 53 | self.assertEqual( 54 | f'JobArg[arg_type={arg.arg_type},value={arg.value()}]', str(arg)) 55 | 56 | def test_value(self): 57 | arg = jobarg_factory(self.JobArgClass, arg_type='str', val='something') 58 | self.assertEqual(arg.value(), 'something') 59 | 60 | def test__str__str_val(self): 61 | arg = jobarg_factory(self.JobArgClass, arg_type='str', val='something') 62 | self.assertEqual('something', str(arg.value())) 63 | 64 | def test__str__int_val(self): 65 | arg = jobarg_factory(self.JobArgClass, arg_type='int', val='1') 66 | self.assertEqual('1', str(arg.value())) 67 | 68 | def test__str__datetime_val(self): 69 | _time = timezone.now() 70 | arg = jobarg_factory(self.JobArgClass, 
arg_type='datetime', val=str(_time)) 71 | self.assertEqual(str(_time), str(arg.value())) 72 | 73 | def test__str__bool_val(self): 74 | arg = jobarg_factory(self.JobArgClass, arg_type='bool', val='True') 75 | self.assertEqual('True', str(arg.value())) 76 | 77 | def test__repr__str_val(self): 78 | arg = jobarg_factory(self.JobArgClass, arg_type='str', val='something') 79 | self.assertEqual("'something'", repr(arg.value())) 80 | 81 | def test__repr__int_val(self): 82 | arg = jobarg_factory(self.JobArgClass, arg_type='int', val='1') 83 | self.assertEqual('1', repr(arg.value())) 84 | 85 | def test__repr__datetime_val(self): 86 | _time = timezone.now() 87 | arg = jobarg_factory(self.JobArgClass, arg_type='datetime', val=str(_time)) 88 | self.assertEqual(repr(_time), repr(arg.value())) 89 | 90 | def test__repr__bool_val(self): 91 | arg = jobarg_factory(self.JobArgClass, arg_type='bool', val='False') 92 | self.assertEqual('False', repr(arg.value())) 93 | 94 | def test_callable_arg_type__clean(self): 95 | method = arg_callable 96 | arg = jobarg_factory( 97 | self.JobArgClass, arg_type='callable', 98 | val=f'{method.__module__}.{method.__name__}', ) 99 | self.assertIsNone(arg.clean()) 100 | self.assertEqual(1, arg.value()) 101 | self.assertEqual(2, arg.value()) 102 | 103 | 104 | class TestJobKwarg(TestAllJobArg): 105 | JobArgClass = JobKwarg 106 | 107 | def test_str(self): 108 | arg = jobarg_factory(self.JobArgClass) 109 | self.assertEqual( 110 | f'JobKwarg[key={arg.key},arg_type={arg.arg_type},value={arg.val}]', str(arg)) 111 | 112 | def test_value(self): 113 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='str', val='value') 114 | self.assertEqual(kwarg.value(), ('key', 'value')) 115 | 116 | def test__str__str_val(self): 117 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='str', val='something') 118 | self.assertEqual('JobKwarg[key=key,arg_type=str,value=something]', str(kwarg)) 119 | 120 | def test__str__int_val(self): 121 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='int', val=1) 122 | self.assertEqual('JobKwarg[key=key,arg_type=int,value=1]', str(kwarg)) 123 | 124 | def test__str__datetime_val(self): 125 | _time = timezone.now() 126 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='datetime', val=str(_time)) 127 | self.assertEqual(f'JobKwarg[key=key,arg_type=datetime,value={_time}]', str(kwarg)) 128 | 129 | def test__str__bool_val(self): 130 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='bool', val='True') 131 | self.assertEqual('JobKwarg[key=key,arg_type=bool,value=True]', str(kwarg)) 132 | 133 | def test__repr__str_val(self): 134 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='str', val='something') 135 | self.assertEqual("('key', 'something')", repr(kwarg.value())) 136 | 137 | def test__repr__int_val(self): 138 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='int', val='1') 139 | self.assertEqual("('key', 1)", repr(kwarg.value())) 140 | 141 | def test__repr__datetime_val(self): 142 | _time = timezone.now() 143 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='datetime', val=str(_time)) 144 | self.assertEqual("('key', {})".format(repr(_time)), repr(kwarg.value())) 145 | 146 | def test__repr__bool_val(self): 147 | kwarg = jobarg_factory(self.JobArgClass, key='key', arg_type='bool', val='True') 148 | self.assertEqual("('key', True)", repr(kwarg.value())) 149 | -------------------------------------------------------------------------------- 
/scheduler/tests/test_job_decorator.py: -------------------------------------------------------------------------------- 1 | import time 2 | 3 | from django.test import TestCase 4 | 5 | from scheduler import job, settings 6 | from . import test_settings # noqa 7 | from ..queues import get_queue, QueueNotFoundError 8 | 9 | 10 | @job 11 | def test_job(): 12 | time.sleep(1) 13 | return 1 + 1 14 | 15 | 16 | @job('django_rq_scheduler_test') 17 | def test_job_diff_queue(): 18 | time.sleep(1) 19 | return 1 + 1 20 | 21 | 22 | @job(timeout=1) 23 | def test_job_timeout(): 24 | time.sleep(1) 25 | return 1 + 1 26 | 27 | 28 | @job(result_ttl=1) 29 | def test_job_result_ttl(): 30 | return 1 + 1 31 | 32 | 33 | class JobDecoratorTest(TestCase): 34 | def setUp(self) -> None: 35 | get_queue('default').connection.flushall() 36 | 37 | def test_job_decorator_no_params(self): 38 | test_job.delay() 39 | config = settings.SCHEDULER_CONFIG 40 | self._assert_job_with_func_and_props( 41 | 'default', test_job, config.get('DEFAULT_RESULT_TTL'), config.get('DEFAULT_TIMEOUT')) 42 | 43 | def test_job_decorator_timeout(self): 44 | test_job_timeout.delay() 45 | config = settings.SCHEDULER_CONFIG 46 | self._assert_job_with_func_and_props( 47 | 'default', test_job_timeout, config.get('DEFAULT_RESULT_TTL'), 1) 48 | 49 | def test_job_decorator_result_ttl(self): 50 | test_job_result_ttl.delay() 51 | config = settings.SCHEDULER_CONFIG 52 | self._assert_job_with_func_and_props( 53 | 'default', test_job_result_ttl, 1, config.get('DEFAULT_TIMEOUT')) 54 | 55 | def test_job_decorator_different_queue(self): 56 | test_job_diff_queue.delay() 57 | config = settings.SCHEDULER_CONFIG 58 | self._assert_job_with_func_and_props( 59 | 'django_rq_scheduler_test', 60 | test_job_diff_queue, 61 | config.get('DEFAULT_RESULT_TTL'), 62 | config.get('DEFAULT_TIMEOUT')) 63 | 64 | def _assert_job_with_func_and_props( 65 | self, queue_name, 66 | expected_func, 67 | expected_result_ttl, 68 | expected_timeout): 69 | queue = get_queue(queue_name) 70 | jobs = queue.get_jobs() 71 | self.assertEqual(1, len(jobs)) 72 | 73 | j = jobs[0] 74 | self.assertEqual(j.func, expected_func) 75 | self.assertEqual(j.result_ttl, expected_result_ttl) 76 | self.assertEqual(j.timeout, expected_timeout) 77 | 78 | def test_job_decorator_bad_queue(self): 79 | with self.assertRaises(QueueNotFoundError): 80 | @job('bad-queue') 81 | def test_job_bad_queue(): 82 | time.sleep(1) 83 | return 1 + 1 84 | -------------------------------------------------------------------------------- /scheduler/tests/test_mgmt_cmds.py: -------------------------------------------------------------------------------- 1 | import json 2 | import os 3 | import tempfile 4 | from unittest import mock 5 | 6 | import yaml 7 | from django.core.management import call_command 8 | from django.test import TestCase 9 | 10 | from scheduler.models import ScheduledJob, RepeatableJob 11 | from scheduler.queues import get_queue 12 | from scheduler.tests.jobs import failing_job, test_job 13 | from scheduler.tests.testtools import job_factory 14 | from . 
import test_settings # noqa 15 | from .test_views import BaseTestCase 16 | from ..tools import create_worker 17 | 18 | 19 | class RqworkerTestCase(TestCase): 20 | 21 | def test_rqworker__no_queues_params(self): 22 | queue = get_queue('default') 23 | 24 | # enqueue some jobs that will fail 25 | jobs = [] 26 | job_ids = [] 27 | for _ in range(0, 3): 28 | job = queue.enqueue(failing_job) 29 | jobs.append(job) 30 | job_ids.append(job.id) 31 | 32 | # Create a worker to execute these jobs 33 | call_command('rqworker', fork_job_execution=False, burst=True) 34 | 35 | # check if all jobs are really failed 36 | for job in jobs: 37 | self.assertTrue(job.is_failed) 38 | 39 | def test_rqworker__run_jobs(self): 40 | queue = get_queue('default') 41 | 42 | # enqueue some jobs that will fail 43 | jobs = [] 44 | job_ids = [] 45 | for _ in range(0, 3): 46 | job = queue.enqueue(failing_job) 47 | jobs.append(job) 48 | job_ids.append(job.id) 49 | 50 | # Create a worker to execute these jobs 51 | call_command('rqworker', 'default', fork_job_execution=False, burst=True) 52 | 53 | # check if all jobs are really failed 54 | for job in jobs: 55 | self.assertTrue(job.is_failed) 56 | 57 | def test_rqworker__worker_with_two_queues(self): 58 | queue = get_queue('default') 59 | queue2 = get_queue('django_rq_scheduler_test') 60 | 61 | # enqueue some jobs that will fail 62 | jobs = [] 63 | job_ids = [] 64 | for _ in range(0, 3): 65 | job = queue.enqueue(failing_job) 66 | jobs.append(job) 67 | job_ids.append(job.id) 68 | job = queue2.enqueue(failing_job) 69 | jobs.append(job) 70 | job_ids.append(job.id) 71 | 72 | # Create a worker to execute these jobs 73 | call_command('rqworker', 'default', 'django_rq_scheduler_test', fork_job_execution=False, burst=True) 74 | 75 | # check if all jobs are really failed 76 | for job in jobs: 77 | self.assertTrue(job.is_failed) 78 | 79 | def test_rqworker__worker_with_one_queue__does_not_perform_other_queue_job(self): 80 | queue = get_queue('default') 81 | queue2 = get_queue('django_rq_scheduler_test') 82 | 83 | job = queue.enqueue(failing_job) 84 | other_job = queue2.enqueue(failing_job) 85 | 86 | # Create a worker to execute these jobs 87 | call_command('rqworker', 'default', fork_job_execution=False, burst=True) 88 | # assert 89 | self.assertTrue(job.is_failed) 90 | self.assertTrue(other_job.is_queued) 91 | 92 | 93 | class RqstatsTest(TestCase): 94 | def test_rqstats__does_not_fail(self): 95 | call_command('rqstats', '-j') 96 | call_command('rqstats', '-y') 97 | call_command('rqstats') 98 | 99 | 100 | class DeleteFailedExecutionsTest(BaseTestCase): 101 | def test_delete_failed_executions__delete_jobs(self): 102 | queue = get_queue('default') 103 | call_command('delete_failed_executions', queue='default') 104 | queue.enqueue(failing_job) 105 | worker = create_worker('default') 106 | worker.work(burst=True) 107 | self.assertEqual(1, len(queue.failed_job_registry)) 108 | call_command('delete_failed_executions', queue='default') 109 | self.assertEqual(0, len(queue.failed_job_registry)) 110 | 111 | 112 | class RunJobTest(TestCase): 113 | def test_run_job__should_schedule_job(self): 114 | queue = get_queue('default') 115 | queue.empty() 116 | func_name = f'{test_job.__module__}.{test_job.__name__}' 117 | # act 118 | call_command('run_job', func_name, queue='default') 119 | # assert 120 | job_list = queue.get_jobs() 121 | self.assertEqual(1, len(job_list)) 122 | self.assertEqual(func_name + '()', job_list[0].get_call_string()) 123 | 124 | 125 | class ExportTest(TestCase): 126 | def setUp(self) 
-> None: 127 | self.tmpfile = tempfile.NamedTemporaryFile() 128 | 129 | def tearDown(self) -> None: 130 | os.remove(self.tmpfile.name) 131 | 132 | def test_export__should_export_job(self): 133 | jobs = list() 134 | jobs.append(job_factory(ScheduledJob, enabled=True)) 135 | jobs.append(job_factory(RepeatableJob, enabled=True)) 136 | 137 | # act 138 | call_command('export', filename=self.tmpfile.name) 139 | # assert 140 | result = json.load(self.tmpfile) 141 | self.assertEqual(len(jobs), len(result)) 142 | self.assertEqual(result[0], jobs[0].to_dict()) 143 | self.assertEqual(result[1], jobs[1].to_dict()) 144 | 145 | def test_export__should_export_enabled_jobs_only(self): 146 | jobs = list() 147 | jobs.append(job_factory(ScheduledJob, enabled=True)) 148 | jobs.append(job_factory(RepeatableJob, enabled=False)) 149 | 150 | # act 151 | call_command('export', filename=self.tmpfile.name, enabled=True) 152 | # assert 153 | result = json.load(self.tmpfile) 154 | self.assertEqual(len(jobs) - 1, len(result)) 155 | self.assertEqual(result[0], jobs[0].to_dict()) 156 | 157 | def test_export__should_export_job_yaml_without_yaml_lib(self): 158 | jobs = list() 159 | jobs.append(job_factory(ScheduledJob, enabled=True)) 160 | jobs.append(job_factory(RepeatableJob, enabled=True)) 161 | 162 | # act 163 | with mock.patch.dict('sys.modules', {'yaml': None}): 164 | with self.assertRaises(SystemExit) as cm: 165 | call_command('export', filename=self.tmpfile.name, format='yaml') 166 | self.assertEqual(cm.exception.code, 1) 167 | 168 | def test_export__should_export_job_yaml_green(self): 169 | jobs = list() 170 | jobs.append(job_factory(ScheduledJob, enabled=True)) 171 | jobs.append(job_factory(RepeatableJob, enabled=True)) 172 | 173 | # act 174 | call_command('export', filename=self.tmpfile.name, format='yaml') 175 | # assert 176 | result = yaml.load(self.tmpfile, yaml.SafeLoader) 177 | self.assertEqual(len(jobs), len(result)) 178 | self.assertEqual(result[0], jobs[0].to_dict()) 179 | self.assertEqual(result[1], jobs[1].to_dict()) 180 | 181 | 182 | class ImportTest(TestCase): 183 | def setUp(self) -> None: 184 | self.tmpfile = tempfile.NamedTemporaryFile(mode='w') 185 | 186 | def tearDown(self) -> None: 187 | os.remove(self.tmpfile.name) 188 | 189 | def test_import__should_schedule_job(self): 190 | jobs = list() 191 | jobs.append(job_factory(ScheduledJob, enabled=True, instance_only=True)) 192 | jobs.append(job_factory(RepeatableJob, enabled=True, instance_only=True)) 193 | res = json.dumps([j.to_dict() for j in jobs]) 194 | self.tmpfile.write(res) 195 | self.tmpfile.flush() 196 | # act 197 | call_command('import', filename=self.tmpfile.name) 198 | # assert 199 | self.assertEqual(1, ScheduledJob.objects.count()) 200 | db_job = ScheduledJob.objects.first() 201 | attrs = ['name', 'queue', 'callable', 'enabled', 'timeout'] 202 | for attr in attrs: 203 | self.assertEqual(getattr(jobs[0], attr), getattr(db_job, attr)) 204 | 205 | def test_import__should_schedule_job_yaml(self): 206 | jobs = list() 207 | jobs.append(job_factory(ScheduledJob, enabled=True, instance_only=True)) 208 | jobs.append(job_factory(RepeatableJob, enabled=True, instance_only=True)) 209 | res = yaml.dump([j.to_dict() for j in jobs], default_flow_style=False) 210 | self.tmpfile.write(res) 211 | self.tmpfile.flush() 212 | # act 213 | call_command('import', filename=self.tmpfile.name, format='yaml') 214 | # assert 215 | self.assertEqual(1, ScheduledJob.objects.count()) 216 | db_job = ScheduledJob.objects.first() 217 | attrs = ['name', 'queue', 
'callable', 'enabled', 'timeout'] 218 | for attr in attrs: 219 | self.assertEqual(getattr(jobs[0], attr), getattr(db_job, attr)) 220 | 221 | def test_import__should_schedule_job_yaml_without_yaml_lib(self): 222 | jobs = list() 223 | jobs.append(job_factory(ScheduledJob, enabled=True, instance_only=True)) 224 | jobs.append(job_factory(RepeatableJob, enabled=True, instance_only=True)) 225 | res = yaml.dump([j.to_dict() for j in jobs], default_flow_style=False) 226 | self.tmpfile.write(res) 227 | self.tmpfile.flush() 228 | # act 229 | with mock.patch.dict('sys.modules', {'yaml': None}): 230 | with self.assertRaises(SystemExit) as cm: 231 | call_command('import', filename=self.tmpfile.name, format='yaml') 232 | self.assertEqual(cm.exception.code, 1) 233 | 234 | def test_import__should_schedule_job_reset(self): 235 | jobs = list() 236 | job_factory(ScheduledJob, enabled=True) 237 | job_factory(ScheduledJob, enabled=True) 238 | jobs.append(job_factory(ScheduledJob, enabled=True)) 239 | jobs.append(job_factory(RepeatableJob, enabled=True, instance_only=True)) 240 | res = json.dumps([j.to_dict() for j in jobs]) 241 | self.tmpfile.write(res) 242 | self.tmpfile.flush() 243 | # act 244 | call_command('import', filename=self.tmpfile.name, reset=True, ) 245 | # assert 246 | self.assertEqual(1, ScheduledJob.objects.count()) 247 | db_job = ScheduledJob.objects.first() 248 | attrs = ['name', 'queue', 'callable', 'enabled', 'timeout'] 249 | for attr in attrs: 250 | self.assertEqual(getattr(jobs[0], attr), getattr(db_job, attr)) 251 | self.assertEqual(1, RepeatableJob.objects.count()) 252 | db_job = RepeatableJob.objects.first() 253 | attrs = ['name', 'queue', 'callable', 'enabled', 'timeout'] 254 | for attr in attrs: 255 | self.assertEqual(getattr(jobs[1], attr), getattr(db_job, attr)) 256 | 257 | def test_import__should_schedule_job_update_existing(self): 258 | jobs = list() 259 | job_factory(ScheduledJob, enabled=True) 260 | jobs.append(job_factory(ScheduledJob, enabled=True)) 261 | res = json.dumps([j.to_dict() for j in jobs]) 262 | self.tmpfile.write(res) 263 | self.tmpfile.flush() 264 | # act 265 | call_command('import', filename=self.tmpfile.name, update=True, ) 266 | # assert 267 | self.assertEqual(2, ScheduledJob.objects.count()) 268 | db_job = ScheduledJob.objects.get(name=jobs[0].name) 269 | self.assertNotEqual(jobs[0].id, db_job.id) 270 | attrs = ['name', 'queue', 'callable', 'enabled', 'timeout'] 271 | for attr in attrs: 272 | self.assertEqual(getattr(jobs[0], attr), getattr(db_job, attr)) 273 | 274 | def test_import__should_schedule_job_without_update_existing(self): 275 | jobs = list() 276 | job_factory(ScheduledJob, enabled=True) 277 | jobs.append(job_factory(ScheduledJob, enabled=True)) 278 | res = json.dumps([j.to_dict() for j in jobs]) 279 | self.tmpfile.write(res) 280 | self.tmpfile.flush() 281 | # act 282 | call_command('import', filename=self.tmpfile.name, ) 283 | # assert 284 | self.assertEqual(2, ScheduledJob.objects.count()) 285 | db_job = ScheduledJob.objects.get(name=jobs[0].name) 286 | attrs = ['id', 'name', 'queue', 'callable', 'enabled', 'timeout'] 287 | for attr in attrs: 288 | self.assertEqual(getattr(jobs[0], attr), getattr(db_job, attr)) 289 | -------------------------------------------------------------------------------- /scheduler/tests/test_redis_models.py: -------------------------------------------------------------------------------- 1 | from django.urls import reverse 2 | 3 | from scheduler.tests.testtools import SchedulerBaseCase 4 | 5 | 6 | class 
TestWorkerAdmin(SchedulerBaseCase): 7 | def test_admin_list_view(self): 8 | # arrange 9 | self.client.login(username='admin', password='admin') 10 | model = 'worker' 11 | url = reverse(f'admin:scheduler_{model}_changelist') 12 | 13 | # act 14 | res = self.client.get(url) 15 | # assert 16 | self.assertEqual(200, res.status_code) 17 | 18 | 19 | class TestQueueAdmin(SchedulerBaseCase): 20 | def test_admin_list_view(self): 21 | # arrange 22 | self.client.login(username='admin', password='admin') 23 | model = 'queue' 24 | url = reverse(f'admin:scheduler_{model}_changelist') 25 | 26 | # act 27 | res = self.client.get(url) 28 | # assert 29 | self.assertEqual(200, res.status_code) 30 | -------------------------------------------------------------------------------- /scheduler/tests/test_settings.py: -------------------------------------------------------------------------------- 1 | import os 2 | 3 | from django.conf import settings 4 | 5 | from scheduler.settings import conf_settings 6 | 7 | settings.SCHEDULER_QUEUES = { 8 | 'default': {'HOST': 'localhost', 'PORT': 6379, 'DB': 0, 'DEFAULT_TIMEOUT': 500}, 9 | 'test': { 10 | 'HOST': 'localhost', 11 | 'PORT': 1, 12 | 'DB': 1, 13 | }, 14 | 'sentinel': { 15 | 'SENTINELS': [('localhost', 26736), ('localhost', 26737)], 16 | 'MASTER_NAME': 'testmaster', 17 | 'DB': 1, 18 | 'USERNAME': 'redis-user', 19 | 'PASSWORD': 'secret', 20 | 'SOCKET_TIMEOUT': 10, 21 | 'SENTINEL_KWARGS': {}, 22 | }, 23 | 'test1': { 24 | 'HOST': 'localhost', 25 | 'PORT': 1, 26 | 'DB': 1, 27 | 'DEFAULT_TIMEOUT': 400, 28 | 'QUEUE_CLASS': 'django_rq.tests.fixtures.DummyQueue', 29 | }, 30 | 'test2': { 31 | 'HOST': 'localhost', 32 | 'PORT': 1, 33 | 'DB': 1, 34 | }, 35 | 'test3': { 36 | 'HOST': 'localhost', 37 | 'PORT': 6379, 38 | 'DB': 1, 39 | }, 40 | 'async': { 41 | 'HOST': 'localhost', 42 | 'PORT': 6379, 43 | 'DB': 1, 44 | 'ASYNC': False, 45 | }, 46 | 'url': { 47 | 'URL': 'redis://username:password@host:1234/', 48 | 'DB': 4, 49 | }, 50 | 'url_with_db': { 51 | 'URL': 'redis://username:password@host:1234/5', 52 | }, 53 | 'url_default_db': { 54 | 'URL': 'redis://username:password@host:1234', 55 | }, 56 | 'django_rq_scheduler_test': { 57 | 'HOST': 'localhost', 58 | 'PORT': 6379, 59 | 'DB': 0, 60 | }, 61 | 'scheduler_scheduler_active_test': { 62 | 'HOST': 'localhost', 63 | 'PORT': 6379, 64 | 'DB': 0, 65 | 'ASYNC': False, 66 | }, 67 | 'scheduler_scheduler_inactive_test': { 68 | 'HOST': 'localhost', 69 | 'PORT': 6379, 70 | 'DB': 0, 71 | 'ASYNC': False, 72 | }, 73 | 'worker_scheduler_active_test': { 74 | 'HOST': 'localhost', 75 | 'PORT': 6379, 76 | 'DB': 0, 77 | 'ASYNC': False, 78 | }, 79 | 'worker_scheduler_inactive_test': { 80 | 'HOST': 'localhost', 81 | 'PORT': 6379, 82 | 'DB': 0, 83 | 'ASYNC': False, 84 | }, 85 | 'django_rq_scheduler_test2': { 86 | 'HOST': 'localhost', 87 | 'PORT': 6379, 88 | 'DB': 0, 89 | }, 90 | 'test_scheduler': { 91 | 'HOST': 'localhost', 92 | 'PORT': 6379, 93 | 'DB': 0, 94 | 'DEFAULT_TIMEOUT': 400, 95 | }, 96 | } 97 | settings.SCHEDULER_CONFIG = dict( 98 | FAKEREDIS=(os.getenv('FAKEREDIS', 'False') == 'True'), 99 | ) 100 | conf_settings() 101 | -------------------------------------------------------------------------------- /scheduler/tests/test_worker.py: -------------------------------------------------------------------------------- 1 | import os 2 | import uuid 3 | 4 | from scheduler.tests.testtools import SchedulerBaseCase 5 | from scheduler.tools import create_worker 6 | from . import test_settings # noqa 7 | from .. 
import settings 8 | 9 | 10 | class TestWorker(SchedulerBaseCase): 11 | def test_create_worker__two_workers_same_queue(self): 12 | worker1 = create_worker('default', 'django_rq_scheduler_test') 13 | worker1.register_birth() 14 | worker2 = create_worker('default') 15 | worker2.register_birth() 16 | hostname = os.uname()[1] 17 | self.assertEqual(f'{hostname}-worker.1', worker1.name) 18 | self.assertEqual(f'{hostname}-worker.2', worker2.name) 19 | 20 | def test_create_worker__worker_with_queues_different_connection(self): 21 | with self.assertRaises(ValueError): 22 | create_worker('default', 'test1') 23 | 24 | def test_create_worker__with_name(self): 25 | name = uuid.uuid4().hex 26 | worker1 = create_worker('default', name=name) 27 | self.assertEqual(name, worker1.name) 28 | 29 | def test_create_worker__with_name_containing_slash(self): 30 | name = uuid.uuid4().hex[-4:] + '/' + uuid.uuid4().hex[-4:] 31 | worker1 = create_worker('default', name=name) 32 | self.assertEqual(name.replace('/', '.'), worker1.name) 33 | 34 | def test_create_worker__scheduler_interval(self): 35 | prev = settings.SCHEDULER_CONFIG['SCHEDULER_INTERVAL'] 36 | settings.SCHEDULER_CONFIG['SCHEDULER_INTERVAL'] = 1 37 | worker = create_worker('default') 38 | worker.work(burst=True) 39 | self.assertEqual(worker.scheduler.interval, 1) 40 | settings.SCHEDULER_CONFIG['SCHEDULER_INTERVAL'] = prev 41 | -------------------------------------------------------------------------------- /scheduler/tests/testtools.py: -------------------------------------------------------------------------------- 1 | from datetime import timedelta 2 | 3 | from django.contrib.auth.models import User 4 | from django.contrib.contenttypes.models import ContentType 5 | from django.contrib.messages import get_messages 6 | from django.test import Client, TestCase 7 | from django.utils import timezone 8 | 9 | from scheduler import settings 10 | from scheduler.models import CronJob, JobKwarg, RepeatableJob, ScheduledJob, BaseJob 11 | from scheduler.queues import get_queue 12 | 13 | 14 | def assert_message_in_response(response, message): 15 | messages = [m.message for m in get_messages(response.wsgi_request)] 16 | assert message in messages, f'expected "{message}" in {messages}' 17 | 18 | 19 | def sequence_gen(): 20 | n = 1 21 | while True: 22 | yield n 23 | n += 1 24 | 25 | 26 | seq = sequence_gen() 27 | 28 | 29 | def job_factory(cls, instance_only=False, **kwargs): 30 | values = dict( 31 | name='Scheduled Job %d' % next(seq), 32 | job_id=None, 33 | queue=list(settings.QUEUES.keys())[0], 34 | callable='scheduler.tests.jobs.test_job', 35 | enabled=True, 36 | timeout=None) 37 | if cls == ScheduledJob: 38 | values.update(dict( 39 | result_ttl=None, 40 | scheduled_time=timezone.now() + timedelta(days=1), )) 41 | elif cls == RepeatableJob: 42 | values.update(dict( 43 | result_ttl=None, 44 | interval=1, 45 | interval_unit='hours', 46 | repeat=None, 47 | scheduled_time=timezone.now() + timedelta(days=1), )) 48 | elif cls == CronJob: 49 | values.update(dict(cron_string="0 0 * * *", repeat=None, )) 50 | values.update(kwargs) 51 | if instance_only: 52 | instance = cls(**values) 53 | else: 54 | instance = cls.objects.create(**values) 55 | return instance 56 | 57 | 58 | def jobarg_factory(cls, **kwargs): 59 | content_object = kwargs.pop('content_object', None) 60 | if content_object is None: 61 | content_object = job_factory(ScheduledJob) 62 | values = dict( 63 | arg_type='str', 64 | val='', 65 | object_id=content_object.id, 66 | 
content_type=ContentType.objects.get_for_model(content_object), 67 | content_object=content_object, 68 | ) 69 | if cls == JobKwarg: 70 | values['key'] = 'key%d' % next(seq) 71 | values.update(kwargs) 72 | instance = cls.objects.create(**values) 73 | return instance 74 | 75 | 76 | def _get_job_from_scheduled_registry(django_job: BaseJob): 77 | jobs_to_schedule = django_job.rqueue.scheduled_job_registry.get_job_ids() 78 | entry = next(i for i in jobs_to_schedule if i == django_job.job_id) 79 | return django_job.rqueue.fetch_job(entry) 80 | 81 | 82 | def _get_executions(django_job: BaseJob): 83 | job_ids = django_job.rqueue.get_all_job_ids() 84 | return list(filter( 85 | lambda j: j.is_execution_of(django_job), 86 | map(lambda jid: django_job.rqueue.fetch_job(jid), job_ids))) 87 | 88 | 89 | class SchedulerBaseCase(TestCase): 90 | @classmethod 91 | def setUpTestData(cls) -> None: 92 | super().setUpTestData() 93 | try: 94 | User.objects.create_superuser('admin', 'admin@a.com', 'admin') 95 | except Exception: 96 | pass 97 | cls.client = Client() 98 | 99 | def setUp(self) -> None: 100 | super(SchedulerBaseCase, self).setUp() 101 | queue = get_queue('default') 102 | queue.empty() 103 | 104 | def tearDown(self) -> None: 105 | super(SchedulerBaseCase, self).tearDown() 106 | queue = get_queue('default') 107 | queue.empty() 108 | 109 | @classmethod 110 | def setUpClass(cls): 111 | super(SchedulerBaseCase, cls).setUpClass() 112 | queue = get_queue('default') 113 | queue.connection.flushall() 114 | -------------------------------------------------------------------------------- /scheduler/tools.py: -------------------------------------------------------------------------------- 1 | import importlib 2 | import os 3 | 4 | import croniter 5 | from django.apps import apps 6 | from django.templatetags.tz import utc 7 | from django.utils import timezone 8 | 9 | from scheduler.queues import get_queues, logger, get_queue 10 | from scheduler.rq_classes import DjangoWorker, MODEL_NAMES 11 | from scheduler.settings import get_config 12 | 13 | 14 | def callable_func(callable_str: str): 15 | path = callable_str.split('.') 16 | module = importlib.import_module('.'.join(path[:-1])) 17 | func = getattr(module, path[-1]) 18 | if callable(func) is False: 19 | raise TypeError("'{}' is not callable".format(callable_str)) 20 | return func 21 | 22 | 23 | def get_next_cron_time(cron_string) -> timezone.datetime: 24 | """Calculate the next scheduled time by creating a crontab object 25 | with a cron string""" 26 | now = timezone.now() 27 | itr = croniter.croniter(cron_string, now) 28 | return utc(itr.get_next(timezone.datetime)) 29 | 30 | 31 | def get_scheduled_job(task_model: str, task_id: int): 32 | if task_model not in MODEL_NAMES: 33 | raise ValueError(f'Job Model {task_model} does not exist, choices are {MODEL_NAMES}') 34 | model = apps.get_model(app_label='scheduler', model_name=task_model) 35 | task = model.objects.filter(id=task_id).first() 36 | if task is None: 37 | raise ValueError(f'Job {task_model}:{task_id} does not exist') 38 | return task 39 | 40 | 41 | def run_job(task_model: str, task_id: int): 42 | """Run a scheduled job 43 | """ 44 | scheduled_job = get_scheduled_job(task_model, task_id) 45 | logger.debug(f'Running task {str(scheduled_job)}') 46 | args = scheduled_job.parse_args() 47 | kwargs = scheduled_job.parse_kwargs() 48 | res = scheduled_job.callable_func()(*args, **kwargs) 49 | return res 50 | 51 | 52 | def _calc_worker_name(existing_worker_names): 53 | hostname = os.uname()[1] 54 | c = 1 55 | 
worker_name = f'{hostname}-worker.{c}' 56 | while worker_name in existing_worker_names: 57 | c += 1 58 | worker_name = f'{hostname}-worker.{c}' 59 | return worker_name 60 | 61 | 62 | def create_worker(*queue_names, **kwargs): 63 | """ 64 | Returns a Django worker for all queues or specified ones. 65 | """ 66 | 67 | queues = get_queues(*queue_names) 68 | existing_workers = DjangoWorker.all(connection=queues[0].connection) 69 | existing_worker_names = set(map(lambda w: w.name, existing_workers)) 70 | kwargs['fork_job_execution'] = not get_config('FAKEREDIS') 71 | if kwargs.get('name', None) is None: 72 | kwargs['name'] = _calc_worker_name(existing_worker_names) 73 | 74 | kwargs['name'] = kwargs['name'].replace('/', '.') 75 | worker = DjangoWorker(queues, connection=queues[0].connection, **kwargs) 76 | return worker 77 | 78 | 79 | def get_job_executions(queue_name, scheduled_job): 80 | queue = get_queue(queue_name) 81 | job_list = queue.get_all_jobs() 82 | res = list(filter(lambda j: j.is_execution_of(scheduled_job), job_list)) 83 | return res 84 | -------------------------------------------------------------------------------- /scheduler/urls.py: -------------------------------------------------------------------------------- 1 | from django.urls import path 2 | 3 | from . import views 4 | 5 | urlpatterns = [ 6 | path('queues/', views.stats, name='queues_home'), 7 | path('queues/stats.json', views.stats_json, name='queues_home_json'), 8 | path('queues//workers/', views.queue_workers, name='queue_workers'), 9 | path('queues///jobs', views.jobs_view, name='queue_registry_jobs'), 10 | path('queues///empty/', views.clear_queue_registry, name='queue_clear'), 11 | path('queues///requeue-all/', views.requeue_all, name='queue_requeue_all'), 12 | path('queues//confirm-action/', views.confirm_action, name='queue_confirm_action'), 13 | path('queues//actions/', views.actions, name='queue_actions'), 14 | ] 15 | 16 | urlpatterns += [ 17 | path('workers/', views.workers, name='workers_home'), 18 | path('workers//', views.worker_details, name='worker_details'), 19 | path('jobs//', views.job_detail, name='job_details'), 20 | path('jobs///', views.job_action, name='queue_job_action'), 21 | ] 22 | -------------------------------------------------------------------------------- /testproject/manage.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | import os 3 | import sys 4 | 5 | if __name__ == "__main__": 6 | os.environ.setdefault("DJANGO_SETTINGS_MODULE", "testproject.settings") 7 | 8 | from django.core.management import execute_from_command_line 9 | 10 | execute_from_command_line(sys.argv) 11 | -------------------------------------------------------------------------------- /testproject/testproject/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dsoftwareinc/django-rq-scheduler/d37a9149c770bde88f5eb60ba03402c4a722f81c/testproject/testproject/__init__.py -------------------------------------------------------------------------------- /testproject/testproject/settings.py: -------------------------------------------------------------------------------- 1 | """ 2 | Django settings for testproject project. 3 | 4 | Generated by 'django-admin startproject' using Django 1.9.3. 
5 | 6 | For more information on this file, see 7 | https://docs.djangoproject.com/en/1.9/topics/settings/ 8 | 9 | For the full list of settings and their values, see 10 | https://docs.djangoproject.com/en/1.9/ref/settings/ 11 | """ 12 | 13 | import os 14 | 15 | import django 16 | from fakeredis import FakeStrictRedis, FakeRedis, FakeServer, FakeConnection 17 | 18 | BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) 19 | 20 | # Quick-start development settings - unsuitable for production 21 | # See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/ 22 | 23 | # SECURITY WARNING: keep the secret key used in production secret! 24 | SECRET_KEY = 'h0_r$4#4@hgdxy_r0*n8+$(wf0&ie9&4-=(d394n!bo=9rt+85' 25 | 26 | # SECURITY WARNING: don't run with debug turned on in production! 27 | DEBUG = True 28 | 29 | ALLOWED_HOSTS = [] 30 | 31 | # Application definition 32 | 33 | INSTALLED_APPS = [ 34 | 'django.contrib.admin', 35 | 'django.contrib.auth', 36 | 'django.contrib.contenttypes', 37 | 'django.contrib.sessions', 38 | 'django.contrib.messages', 39 | 'django.contrib.staticfiles', 40 | 'scheduler', 41 | ] 42 | 43 | MIDDLEWARE = [ 44 | 'django.middleware.security.SecurityMiddleware', 45 | 'django.contrib.sessions.middleware.SessionMiddleware', 46 | 'django.middleware.common.CommonMiddleware', 47 | 'django.middleware.csrf.CsrfViewMiddleware', 48 | 'django.contrib.auth.middleware.AuthenticationMiddleware', 49 | 'django.contrib.messages.middleware.MessageMiddleware', 50 | 'django.middleware.clickjacking.XFrameOptionsMiddleware', 51 | ] 52 | 53 | ROOT_URLCONF = 'testproject.urls' 54 | if django.VERSION > (4, 0): 55 | CACHES = { 56 | 'default': { 57 | 'BACKEND': 'django.core.cache.backends.redis.RedisCache', 58 | 'LOCATION': [ 59 | 'redis://127.0.0.1:6379', # leader 60 | ], 61 | 'OPTIONS': { 62 | 'connection_class': FakeConnection 63 | } 64 | } 65 | } 66 | TEMPLATES = [ 67 | { 68 | 'BACKEND': 'django.template.backends.django.DjangoTemplates', 69 | 'DIRS': [], 70 | 'APP_DIRS': True, 71 | 'OPTIONS': { 72 | 'context_processors': [ 73 | 'django.template.context_processors.debug', 74 | 'django.template.context_processors.request', 75 | 'django.contrib.auth.context_processors.auth', 76 | 'django.contrib.messages.context_processors.messages', 77 | ], 78 | }, 79 | }, 80 | ] 81 | 82 | WSGI_APPLICATION = 'testproject.wsgi.application' 83 | 84 | # Database 85 | # https://docs.djangoproject.com/en/1.9/ref/settings/#databases 86 | 87 | DATABASES = { 88 | 'default': { 89 | 'ENGINE': 'django.db.backends.sqlite3', 90 | 'NAME': os.path.join(BASE_DIR, 'db.sqlite3'), 91 | } 92 | } 93 | 94 | # Password validation 95 | # https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators 96 | 97 | AUTH_PASSWORD_VALIDATORS = [ 98 | { 99 | 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator', 100 | }, 101 | { 102 | 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator', 103 | }, 104 | { 105 | 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator', 106 | }, 107 | { 108 | 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator', 109 | }, 110 | ] 111 | 112 | # Internationalization 113 | # https://docs.djangoproject.com/en/1.9/topics/i18n/ 114 | 115 | LANGUAGE_CODE = 'en-us' 116 | 117 | TIME_ZONE = 'UTC' 118 | 119 | USE_I18N = True 120 | 121 | USE_L10N = True 122 | 123 | USE_TZ = False 124 | 125 | # Static files (CSS, JavaScript, Images) 126 | # https://docs.djangoproject.com/en/1.9/howto/static-files/ 127 | 128 
| STATIC_URL = '/static/' 129 | RQ_QUEUES = { 130 | 'default': { 131 | 'URL': 'redis://localhost:6379/0', 132 | }, 133 | 'low': { 134 | 'URL': 'redis://localhost:6379/0', 135 | }, 136 | 'high': { 137 | 'URL': 'redis://localhost:6379/1', 138 | }, 139 | } 140 | DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField' 141 | 142 | LOGGING = { 143 | 'version': 1, 144 | 'disable_existing_loggers': False, 145 | 'formatters': { 146 | 'verbose': { 147 | 'format': '{levelname} {asctime} {module} {process:d} {thread:d} {message}', 148 | 'style': '{', 149 | }, 150 | 'simple': { 151 | 'format': '%(asctime)s %(levelname)s %(name)s.%(funcName)s:%(lineno)s- %(message)s', 152 | }, 153 | }, 154 | 'handlers': { 155 | 'console': { 156 | 'level': 'DEBUG', 157 | 'class': 'logging.StreamHandler', 158 | 'formatter': 'simple' 159 | }, 160 | }, 161 | 'root': { 162 | 'handlers': ['console'], 163 | 'level': 'DEBUG', 164 | }, 165 | 'loggers': { 166 | 'scheduler': { 167 | 'handlers': ['console'], 168 | 'level': 'INFO', 169 | }, 170 | }, 171 | } 172 | -------------------------------------------------------------------------------- /testproject/testproject/urls.py: -------------------------------------------------------------------------------- 1 | """testproject URL Configuration 2 | 3 | The `urlpatterns` list routes URLs to views. For more information please see: 4 | https://docs.djangoproject.com/en/1.9/topics/http/urls/ 5 | Examples: 6 | Function views 7 | 1. Add an import: from my_app import views 8 | 2. Add a URL to urlpatterns: url(r'^$', views.home, name='home') 9 | Class-based views 10 | 1. Add an import: from other_app.views import Home 11 | 2. Add a URL to urlpatterns: url(r'^$', Home.as_view(), name='home') 12 | Including another URLconf 13 | 1. Import the `include()` function: from django.conf.urls import url, include 14 | 2. Add a URL to urlpatterns: url(r'^blog/', include('blog.urls')) 15 | """ 16 | from django.contrib import admin 17 | from django.urls import path, include 18 | 19 | from . import views 20 | 21 | urlpatterns = [ 22 | path('admin/', admin.site.urls), 23 | path('scheduler/', include('scheduler.urls')), 24 | path('test-view/', views.my_view, ), 25 | ] 26 | -------------------------------------------------------------------------------- /testproject/testproject/views.py: -------------------------------------------------------------------------------- 1 | from django.http.response import HttpResponse 2 | from django.views.decorators.cache import cache_page 3 | 4 | 5 | @cache_page(timeout=500) 6 | def my_view(request): 7 | return HttpResponse("Yeah") 8 | -------------------------------------------------------------------------------- /testproject/testproject/wsgi.py: -------------------------------------------------------------------------------- 1 | """ 2 | WSGI config for testproject project. 3 | 4 | It exposes the WSGI callable as a module-level variable named ``application``. 5 | 6 | For more information on this file, see 7 | https://docs.djangoproject.com/en/1.9/howto/deployment/wsgi/ 8 | """ 9 | 10 | import os 11 | 12 | from django.core.wsgi import get_wsgi_application 13 | 14 | os.environ.setdefault("DJANGO_SETTINGS_MODULE", "testproject.settings") 15 | 16 | application = get_wsgi_application() 17 | --------------------------------------------------------------------------------
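
Taken together, the files above show the small surface a consumer touches: a dotted callable path resolved by scheduler.tools.callable_func, a job model row (see job_factory in scheduler/tests/testtools.py), and a worker built with scheduler.tools.create_worker (exercised in scheduler/tests/test_worker.py). The snippet below is a minimal sketch, not a file from the repository: it assumes a Django project configured like testproject (with 'scheduler' in INSTALLED_APPS and a reachable 'default' queue), is meant to be run from a Django shell, and the callable path myapp.jobs.send_report is hypothetical.

from datetime import timedelta

from django.utils import timezone

from scheduler.models import ScheduledJob
from scheduler.tools import create_worker

# Register a one-off job. `callable` is a dotted path; it is resolved via
# scheduler.tools.callable_func only when the job actually executes.
job = ScheduledJob.objects.create(
    name='nightly-report',                      # human-readable job name
    queue='default',                            # must match a configured queue
    callable='myapp.jobs.send_report',          # hypothetical module path
    enabled=True,
    scheduled_time=timezone.now() + timedelta(minutes=5),
)

# Build a worker bound to the 'default' queue and drain it once (burst mode),
# mirroring how scheduler/tests/test_worker.py drives create_worker().
worker = create_worker('default')
worker.work(burst=True)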