├── .gitignore ├── CHANGES.md ├── README.md ├── celerybeatredis ├── __init__.py ├── decoder.py ├── exceptions.py ├── globals.py ├── schedulers.py └── task.py ├── setup.py └── test ├── celeryconfig.py └── tasks.py /.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | .* 3 | -------------------------------------------------------------------------------- /CHANGES.md: -------------------------------------------------------------------------------- 1 | v0.0.1 initial release 2 | 3 | v0.0.2 Remove a task from the schedule when its entry is manually deleted from redis. 4 | 5 | v0.0.6 fix: tick error 6 | 7 | v0.0.7 load entries from CELERYBEAT_SCHEDULE thanks @sibson 8 | Prefix as Parameter thanks @asmodehn 9 | 10 | v0.0.8 - v0.1.1 no new features 11 | 12 | v0.1.2 celerybeat-redis can now run on multiple nodes: only one is active, the other nodes stand by 13 | v0.1.3 support python3 14 | v0.1.5 Refactor for celery 3.1 (thanks asmodehn), add a TTL for the redis lock 15 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Project Status 2 | 3 | Because of a busy work schedule I don't have much time for this project any more, and one major reason is that I am now programming in Java... 4 | So I won't push any new releases from now on. If someone is interested in it, please fork and maintain your own version. 5 | Sorry for the inconvenience. 6 | 7 | # celerybeat-redis 8 | 9 | It is modified from celerybeat-mongo (https://github.com/zakird/celerybeat-mongo). 10 | 11 | See the changelog in [CHANGES.md](./CHANGES.md) 12 | 13 | This is a Celery Beat Scheduler (http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html) 14 | that stores both the schedules themselves and their status 15 | information in a backend Redis database. 16 | 17 | # Features 18 | 19 | 1. Full-featured celery-beat scheduler 20 | 2. Dynamically add/remove/modify tasks 21 | 3. Support for multiple instances via an Active-Standby model 22 | 23 | # Installation 24 | 25 | It can be installed by 26 | installing the celerybeat-redis Python egg: 27 | 28 | # pip install celerybeat-redis 29 | 30 | and by specifying the scheduler when running Celery Beat, e.g. 31 | 32 | $ celery beat -S celerybeatredis.schedulers.RedisScheduler 33 | 34 | # Configuration 35 | 36 | Settings for the scheduler are defined in your celery configuration file, 37 | similar to how other aspects of Celery are configured: 38 | 39 | CELERY_REDIS_SCHEDULER_URL = "redis://localhost:6379/1" 40 | CELERY_REDIS_SCHEDULER_KEY_PREFIX = 'tasks:meta:' 41 | 42 | You must make sure these two values are configured. `CELERY_REDIS_SCHEDULER_URL` 43 | points to the Redis database used to store tasks. `CELERY_REDIS_SCHEDULER_KEY_PREFIX` is used to generate 44 | the keys in redis. The keys will be of the form 45 | 46 | tasks:meta:task-name-here 47 | tasks:meta:test-fib-every-3s 48 | 49 | There is also an optional setting 50 | 51 | CELERY_REDIS_SCHEDULER_LOCK_TTL = 30 52 | 53 | This value determines how long the redis scheduler will hold on to its lock, 54 | which prevents multiple beat instances from running at the same time. However, 55 | in some cases -- such as a hard crash -- celery beat will not be able to clean 56 | up after itself and release the lock. Therefore, the lock is given a 57 | configurable time-to-live, after which it will expire and be released, if it is 58 | not renewed by the beat instance that acquired it. If said beat instance does 59 | die, another instance will be able to pick up the baton, as it were, and run 60 | instead. 61 |
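Given the two settings above, you can check what the scheduler has stored by listing the keys under the configured prefix. A minimal sketch using redis-py (assuming the example settings shown above; the actual key names depend on your task names):

```python
import redis

# value of CELERY_REDIS_SCHEDULER_URL from the configuration above
rdb = redis.StrictRedis.from_url("redis://localhost:6379/1")

# every schedule entry lives under the configured key prefix
for key in rdb.keys('tasks:meta:*'):
    print(key)  # e.g. b'tasks:meta:test-fib-every-3s'
```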
62 | # Quickstart 63 | 64 | After installing and configuring as described above, you can give it a try with the test project: cd into the test directory and start a worker with: 65 | 66 | $ celery worker -A tasks -l info 67 | 68 | then start beat with: 69 | 70 | $ celery beat -S celerybeatredis.schedulers.RedisScheduler 71 | 72 | celerybeat-redis will load the entries from celeryconfig.py 73 | 74 | # Detailed Configuration 75 | 76 | There are two ways to add a periodic task: 77 | 78 | ## Add in celeryconfig.py 79 | 80 | Celery provides a `CELERYBEAT_SCHEDULE` entry in the config file; when 81 | celerybeat-redis starts with such a config, it will load those tasks into redis and create 82 | them as celerybeat-redis tasks. 83 | 84 | It's perfect for a quick test. 85 | 86 | ## Manually add to Redis 87 | 88 | You can create a task by inserting the appropriate data into redis, as described below: 89 | 90 | Schedules can be manipulated in the Redis database through 91 | direct database manipulation. There exist two types of schedules, 92 | interval and crontab. 93 | 94 | ```json 95 | { 96 | "name" : "interval test schedule", 97 | "task" : "task-name-goes-here", 98 | "enabled" : true, 99 | "interval" : { 100 | "every" : 5, 101 | "period" : "minutes" 102 | }, 103 | "args" : [ 104 | "param1", 105 | "param2" 106 | ], 107 | "kwargs" : { 108 | "max_targets" : 100 109 | }, 110 | "total_run_count" : 5, 111 | "last_run_at" : { 112 | "__type__": "datetime", 113 | "year": 2014, 114 | "month": 8, 115 | "day": 30, 116 | "hour": 8, 117 | "minute": 10, 118 | "second": 6, 119 | "microsecond": 667 120 | } 121 | } 122 | ``` 123 | 124 | The following example from the Celery User Guide::Periodic Tasks 125 | ```python 126 | CELERYBEAT_SCHEDULE = { 127 | 'interval-test-schedule': { 128 | 'task': 'tasks.add', 129 | 'schedule': timedelta(seconds=30), 130 | 'args': ('param1', 'param2') 131 | } 132 | } 133 | ``` 134 | 135 | becomes the following: 136 | ```json 137 | { 138 | "name" : "interval test schedule", 139 | "task" : "tasks.add", 140 | "enabled" : true, 141 | "interval" : { 142 | "every" : 30, 143 | "period" : "seconds" 144 | }, 145 | "args" : [ 146 | "param1", 147 | "param2" 148 | ], 149 | "kwargs" : { 150 | "max_targets" : 100 151 | }, 152 | "total_run_count": 5, 153 | "last_run_at" : { 154 | "__type__": "datetime", 155 | "year": 2014, 156 | "month": 8, 157 | "day": 30, 158 | "hour": 8, 159 | "minute": 10, 160 | "second": 6, 161 | "microsecond": 667 162 | } 163 | } 164 | ``` 165 | 166 | The following fields are required when defining new tasks: name, task, crontab || interval, 167 | and enabled. 168 | `total_run_count` and `last_run_at` are maintained by the 169 | scheduler and should not be externally manipulated. 170 |
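For example, such a task could be added by hand with redis-py. A minimal sketch (the key and schedule values are illustrative; the scheduler-maintained fields are omitted, and the name is kept identical to the redis key, which is how the scheduler stores it):

```python
import json
import redis

rdb = redis.StrictRedis.from_url("redis://localhost:6379/1")

key = 'tasks:meta:interval-test-schedule'  # key prefix + task name
task = {
    "name": key,  # the task name should correspond to the redis key
    "task": "tasks.add",
    "enabled": True,
    "interval": {"every": 30, "period": "seconds"},
    "args": ["param1", "param2"],
    "kwargs": {},
}
rdb.set(key, json.dumps(task))
```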
171 | The following crontab example from the Celery User Guide::Periodic Tasks 172 | (see: http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#crontab-schedules) 173 | 174 | ```python 175 | CELERYBEAT_SCHEDULE = { 176 | # Executes every Monday morning at 7:30 A.M. 177 | 'add-every-monday-morning': { 178 | 'task': 'tasks.add', 179 | 'schedule': crontab(hour=7, minute=30, day_of_week=1), 180 | 'args': (16, 16), 181 | }, 182 | } 183 | ``` 184 | 185 | becomes: 186 | 187 | ```json 188 | { 189 | "name" : "add-every-monday-morning", 190 | "task" : "tasks.add", 191 | "enabled" : true, 192 | "crontab" : { 193 | "minute" : "30", 194 | "hour" : "7", 195 | "day_of_week" : "1", 196 | "day_of_month" : "*", 197 | "month_of_year" : "*" 198 | }, 199 | "args" : [ 200 | "16", 201 | "16" 202 | ], 203 | "kwargs" : {}, 204 | "total_run_count" : 1, 205 | "last_run_at" : { 206 | "__type__": "datetime", 207 | "year": 2014, 208 | "month": 8, 209 | "day": 30, 210 | "hour": 8, 211 | "minute": 10, 212 | "second": 6, 213 | "microsecond": 667 214 | } 215 | } 216 | ``` 217 | 218 | # Deploy multiple nodes 219 | 220 | The original celery beat doesn't support a multiple-node deployment: several beat instances 221 | would send the same tasks repeatedly and make the workers execute them more than once. celerybeat-redis 222 | uses a redis lock to deal with this. Only one node runs at a time; the other nodes 223 | keep ticking at the minimal task interval, and if the active node goes down, the next node that 224 | ticks will acquire the lock and continue to run. 225 | 226 | WARNING: this is an experimental feature; it needs more testing and is not production ready at 227 | this time. 228 |
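The lock-based active/standby behaviour can be pictured with a short sketch. This is not the project's actual implementation, just the general shape of a redis lock with a TTL (the lock key name and timings here are illustrative):

```python
import time
import redis

rdb = redis.StrictRedis.from_url("redis://localhost:6379/1")
# A lock with a TTL: if the active node dies without releasing it,
# the lock expires on its own and a standby node can acquire it.
lock = rdb.lock('celerybeatredis:lock', timeout=30)

while not lock.acquire(blocking=False):
    time.sleep(3)  # standby: keep retrying at a small interval

try:
    # active node: run the beat loop here, renewing the lock before the TTL expires
    pass
finally:
    lock.release()
```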
-------------------------------------------------------------------------------- /celerybeatredis/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .task import PeriodicTask, Crontab, Interval 4 | from .schedulers import RedisScheduler, RedisScheduleEntry 5 | 6 | __all__ = [ 7 | 'PeriodicTask', 8 | 'Crontab', 9 | 'Interval', 10 | 'RedisScheduler', 11 | 'RedisScheduleEntry' 12 | ] -------------------------------------------------------------------------------- /celerybeatredis/decoder.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # coding: utf-8 3 | # Date: 14/8/31 4 | # Author: konglx 5 | # File: 6 | # Description: 7 | from datetime import datetime 8 | 9 | try: 10 | import simplejson as json 11 | except ImportError: 12 | import json 13 | 14 | 15 | class DateTimeDecoder(json.JSONDecoder): 16 | def __init__(self, *args, **kargs): 17 | json.JSONDecoder.__init__(self, object_hook=self.dict_to_object, 18 | *args, **kargs) 19 | 20 | def dict_to_object(self, d): 21 | if '__type__' not in d: 22 | return d 23 | 24 | type_name = d.pop('__type__') 25 | try: 26 | dateobj = datetime(**d) 27 | return dateobj 28 | except (TypeError, ValueError): # not a datetime dict after all: restore it and return it unchanged 29 | d['__type__'] = type_name 30 | return d 31 | 32 | 33 | class DateTimeEncoder(json.JSONEncoder): 34 | """ Instead of letting the default encoder convert datetime to string, 35 | convert datetime objects into a dict, which can be decoded by the 36 | DateTimeDecoder 37 | """ 38 | 39 | def default(self, obj): 40 | if isinstance(obj, datetime): 41 | return { 42 | '__type__': 'datetime', 43 | 'year': obj.year, 44 | 'month': obj.month, 45 | 'day': obj.day, 46 | 'hour': obj.hour, 47 | 'minute': obj.minute, 48 | 'second': obj.second, 49 | 'microsecond': obj.microsecond, 50 | } 51 | else: 52 | return json.JSONEncoder.default(self, obj) 53 | -------------------------------------------------------------------------------- /celerybeatredis/exceptions.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # Copyright 2014 Kong Luoxing 3 | 4 | # Licensed under the Apache License, Version 2.0 (the 'License'); you may not 5 | # use this file except in compliance with the License. You may obtain a copy 6 | # of the License at http://www.apache.org/licenses/LICENSE-2.0 7 | 8 | 9 | class ValidationError(Exception): 10 | pass 11 | 12 | 13 | class TaskTypeError(Exception): 14 | pass 15 | -------------------------------------------------------------------------------- /celerybeatredis/globals.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # Copyright 2014 Kong Luoxing 3 | 4 | # Licensed under the Apache License, Version 2.0 (the 'License'); you may not 5 | # use this file except in compliance with the License. You may obtain a copy 6 | # of the License at http://www.apache.org/licenses/LICENSE-2.0 7 | 8 | import sys 9 | from redis.client import StrictRedis 10 | from celery import current_app 11 | from celery.utils.log import get_logger 12 | 13 | ADD_ENTRY_ERROR = """\ 14 | 15 | Couldn't add entry %r to redis schedule: %r. Contents: %r 16 | """ 17 | 18 | logger = get_logger(__name__) 19 | 20 | PY3 = sys.version_info >= (3, 0) 21 | 22 | default_encoding = 'utf-8' 23 | def bytes_to_str(s): 24 | # decode bytes on Python 3; return everything else unchanged (including Python 2 str) 25 | if PY3 and isinstance(s, bytes): 26 | return s.decode(default_encoding) 27 | return s 28 | 29 | def str_to_bytes(s): 30 | # encode text on Python 3; return everything else unchanged 31 | if PY3 and isinstance(s, str): 32 | return s.encode(default_encoding) 33 | return s 34 | -------------------------------------------------------------------------------- /celerybeatredis/schedulers.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # Copyright 2014 Kong Luoxing 3 | 4 | # Licensed under the Apache License, Version 2.0 (the 'License'); you may not 5 | # use this file except in compliance with the License.
You may obtain a copy 6 | # of the License at http://www.apache.org/licenses/LICENSE-2.0 7 | import datetime 8 | import logging 9 | from functools import partial 10 | # we don't need simplejson, the builtin json module is good enough 11 | import json 12 | from copy import deepcopy 13 | 14 | from celery.beat import Scheduler, ScheduleEntry 15 | from redis import StrictRedis 16 | from celery import current_app 17 | import kombu.utils 18 | import celery.schedules 19 | from redis.exceptions import LockError 20 | 21 | from .task import PeriodicTask 22 | from .exceptions import ValidationError, TaskTypeError 23 | from .decoder import DateTimeDecoder, DateTimeEncoder 24 | from .globals import logger 25 | 26 | 27 | class RedisScheduleEntry(object): 28 | """ 29 | The Schedule Entry class is mainly here to handle the celery dependency injection 30 | and delegates everything to a PeriodicTask instance. 31 | It follows the Adapter Design pattern, trivially implemented in python with __getattr__ and __setattr__. 32 | This is similar to https://github.com/celery/django-celery/blob/master/djcelery/schedulers.py 33 | that uses SQLAlchemy DBModels as delegates. 34 | """ 35 | 36 | def __init__(self, name=None, task=None, enabled=True, last_run_at=None, 37 | total_run_count=None, schedule=None, args=(), kwargs=None, 38 | options=None, app=None, **extrakwargs): 39 | 40 | # defaults (MUST NOT call self here - or loop __getattr__ forever) 41 | app = app or current_app 42 | # Setting a default time a bit before now to not miss a task that was just added. 43 | last_run_at = last_run_at or app.now() - datetime.timedelta( 44 | seconds=app.conf.CELERYBEAT_MAX_LOOP_INTERVAL) 45 | 46 | # using periodic task as delegate 47 | object.__setattr__(self, '_task', PeriodicTask( 48 | # Note : for compatibility with celery methods, the name of the task is actually the key in the redis DB. 49 | # For extra fancy fields (like a human-readable name), you can leverage extrakwargs. 50 | name=name, 51 | task=task, 52 | enabled=enabled, 53 | schedule=schedule, # TODO : sensible default here ?
54 | args=args, 55 | kwargs=kwargs or {}, 56 | options=options or {}, 57 | last_run_at=last_run_at, 58 | total_run_count=total_run_count or 0, 59 | **extrakwargs 60 | )) 61 | 62 | # 63 | # Initializing members here and not in delegate 64 | # 65 | 66 | # The app is kept here (PeriodicTask should not need it) 67 | object.__setattr__(self, 'app', app) 68 | 69 | # automatic delegation to PeriodicTask (easy delegate) 70 | def __getattr__(self, attr): 71 | return getattr(self._task, attr) 72 | 73 | def __setattr__(self, attr, value): 74 | # We set the attribute in the task delegate if available 75 | if hasattr(self, '_task') and hasattr(self._task, attr): 76 | setattr(self._task, attr, value) 77 | return 78 | # else we raise 79 | raise AttributeError( 80 | "Attribute {attr} not found in {tasktype}".format(attr=attr, 81 | tasktype=type(self._task))) 82 | 83 | # 84 | # Overrides schedule accessors in PeriodicTask to store dict in json but retrieve proper celery schedules 85 | # 86 | def get_schedule(self): 87 | if {'every', 'period'}.issubset(self._task.schedule.keys()): 88 | return celery.schedules.schedule( 89 | datetime.timedelta( 90 | **{self._task.schedule['period']: self._task.schedule['every']}), 91 | self.app) 92 | elif {'minute', 'hour', 'day_of_week', 'day_of_month', 'month_of_year'}.issubset( 93 | self._task.schedule.keys()): 94 | return celery.schedules.crontab(minute=self._task.schedule['minute'], 95 | hour=self._task.schedule['hour'], 96 | day_of_week=self._task.schedule['day_of_week'], 97 | day_of_month=self._task.schedule['day_of_month'], 98 | month_of_year=self._task.schedule['month_of_year'], 99 | app=self.app) 100 | else: 101 | raise TaskTypeError('Existing Task schedule type not recognized') 102 | 103 | def set_schedule(self, schedule): 104 | if isinstance(schedule, celery.schedules.schedule): 105 | # TODO : unify this with Interval in PeriodicTask 106 | self._task.schedule = { 107 | 'every': max(schedule.run_every.total_seconds(), 0), 108 | 'period': 'seconds' 109 | } 110 | elif isinstance(schedule, celery.schedules.crontab): 111 | # TODO : unify this with Crontab in PeriodicTask 112 | self._task.schedule = { 113 | 'minute': schedule._orig_minute, 114 | 'hour': schedule._orig_hour, 115 | 'day_of_week': schedule._orig_day_of_week, 116 | 'day_of_month': schedule._orig_day_of_month, 117 | 'month_of_year': schedule._orig_month_of_year 118 | } 119 | else: 120 | raise TaskTypeError('New Task schedule type not recognized') 121 | 122 | schedule = property(get_schedule, set_schedule) 123 | 124 | # 125 | # Overloading ScheduleEntry methods 126 | # 127 | def is_due(self): 128 | """See :meth:`~celery.schedule.schedule.is_due`.""" 129 | due = self.schedule.is_due(self.last_run_at) 130 | 131 | logger.debug('task {0} due : {1}'.format(self.name, due)) 132 | if not self.enabled: 133 | logger.info('task {0} is disabled. not triggered.'.format(self.name)) 134 | # if the task is disabled, we always return false, but the time that 135 | # it is next due is returned as usual 136 | return celery.schedules.schedstate(is_due=False, next=due[1]) 137 | 138 | return due 139 | 140 | def __repr__(self): 141 | return ' sync() will need to save it) 296 | new_entry = super(RedisScheduler, self).reserve(entry) 297 | # Need to store the key of the entry, because the entry may change in the mean time. 
298 | self._dirty.add(new_entry.name) 299 | return new_entry 300 | 301 | def sync(self): 302 | logger.info('Writing modified entries...') 303 | _tried = set() 304 | try: 305 | while self._dirty: 306 | name = self._dirty.pop() 307 | _tried.add(name) 308 | # Saving the entry back into the Redis DB. 309 | self.rdb.set(name, self.schedule[name].jsondump()) 310 | except Exception as exc: 311 | # retry later 312 | self._dirty |= _tried 313 | logger.error('Error while sync: %r', exc, exc_info=1) 314 | 315 | def close(self): 316 | try: 317 | self._lock.release() 318 | except LockError: 319 | pass 320 | self.sync() 321 | 322 | def __del__(self): 323 | # celery beat creates the Scheduler twice (the first instance is created and then destroyed), so we also need to release the lock here 324 | try: 325 | self._lock.release() 326 | except LockError: 327 | pass 328 | 329 | @property 330 | def info(self): 331 | return ' . db -> {self.schedule_url}'.format(self=self) 332 | -------------------------------------------------------------------------------- /celerybeatredis/task.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # Copyright 2014 Kong Luoxing 3 | 4 | # Licensed under the Apache License, Version 2.0 (the 'License'); you may not 5 | # use this file except in compliance with the License. You may obtain a copy 6 | # of the License at http://www.apache.org/licenses/LICENSE-2.0 7 | import datetime 8 | from copy import deepcopy 9 | from redis import StrictRedis 10 | import celery 11 | 12 | try: 13 | import simplejson as json 14 | except ImportError: 15 | import json 16 | 17 | from .decoder import DateTimeDecoder, DateTimeEncoder 18 | from .exceptions import TaskTypeError 19 | from .globals import bytes_to_str, default_encoding, logger 20 | 21 | 22 | class Interval(object): 23 | 24 | def __init__(self, every, period='seconds'): 25 | self.every = every 26 | # could be seconds, minutes, hours 27 | self.period = period 28 | 29 | @property 30 | def period_singular(self): 31 | return self.period[:-1] 32 | 33 | def __unicode__(self): 34 | if self.every == 1: 35 | return 'every {0.period_singular}'.format(self) 36 | return 'every {0.every} {0.period}'.format(self) 37 | 38 | 39 | class Crontab(object): 40 | 41 | def __init__(self, minute=0, hour=0, day_of_week=None, day_of_month=None, month_of_year=None): 42 | self.minute = minute 43 | self.hour = hour 44 | self.day_of_week = day_of_week or '*' 45 | self.day_of_month = day_of_month or '*' 46 | self.month_of_year = month_of_year or '*' 47 | 48 | def __unicode__(self): 49 | rfield = lambda f: f and str(f).replace(' ', '') or '*' 50 | return '{0} {1} {2} {3} {4} (m/h/d/dM/MY)'.format( 51 | rfield(self.minute), rfield(self.hour), rfield(self.day_of_week), 52 | rfield(self.day_of_month), rfield(self.month_of_year), 53 | ) 54 | 55 | 56 | class PeriodicTask(object): 57 | """ 58 | Represents a periodic task. 59 | This follows the celery.beat.ScheduleEntry class design. 60 | However it is independent of any celery import, so that any client library can import this module 61 | and use it to manipulate periodic tasks in a Redis database, without worrying about all the celery imports. 62 | Should follow the SQLAlchemy DBModel design.
63 | These are used as delegates from https://github.com/celery/django-celery/blob/master/djcelery/schedulers.py 64 | """ 65 | name = None 66 | task = None 67 | 68 | data = None 69 | 70 | args = [] 71 | kwargs = {} 72 | options = {} 73 | 74 | enabled = True 75 | 76 | # datetime 77 | last_run_at = None 78 | last_task_id = None 79 | 80 | total_run_count = 0 81 | 82 | # Follow the celery.beat.ScheduleEntry.__init__() signature as much as possible 83 | def __init__(self, name, task, schedule, key=None, enabled=True, args=(), kwargs=None, options=None, 84 | last_run_at=None, last_task_id=None, total_run_count=None, **extrakwargs): 85 | """ 86 | :param name: name of the task ( = redis key ) 87 | :param task: taskname ( as in celery : python function name ) 88 | :param schedule: the schedule. may also be a dict with all the schedule content 89 | :param key: the redis key of the task ( defaults to the name ) 90 | :param enabled: whether this task is enabled or not 91 | :param args: args for the task 92 | :param kwargs: kwargs for the task 93 | :param options: options for the task 94 | :param last_run_at: last time the task was run 95 | :param last_task_id: the last id of the task triggered by this periodic task 96 | :param total_run_count: total number of times the task was run 97 | :return: 98 | """ 99 | 100 | self.task = task 101 | self.enabled = enabled 102 | 103 | # Using schedule property conversion 104 | # logger.warn("Schedule in Task init {s}".format(s=schedule)) 105 | self.schedule = schedule 106 | 107 | self.args = args 108 | self.kwargs = kwargs or {} 109 | self.options = options or {} 110 | 111 | self.last_run_at = last_run_at 112 | self.last_task_id = last_task_id 113 | self.total_run_count = total_run_count 114 | 115 | self.name = name 116 | self.key = key if key else self.name 117 | self.delete_key = 'deleted:' + bytes_to_str(self.key) 118 | 119 | self.running = False 120 | 121 | # storing extra arguments (might be useful to have other args depending on the application) 122 | for elem in extrakwargs.keys(): 123 | setattr(self, elem, extrakwargs[elem]) 124 | 125 | @staticmethod 126 | def get_all_as_dict(rdb, key_prefix): 127 | """get all of the tasks; for best performance with a large amount of tasks, return a generator 128 | """ 129 | 130 | tasks = rdb.keys(key_prefix + '*') 131 | for task_key in tasks: 132 | try: 133 | dct = json.loads(bytes_to_str(rdb.get(task_key)), cls=DateTimeDecoder, encoding=default_encoding) 134 | # task name should always correspond to the key in redis to avoid 135 | # issues arising when saving keys - we want to add information to 136 | # the current key, not create a new key 137 | # logger.warning('json task {0}'.format(dct)) 138 | yield task_key, dct 139 | except ValueError: # handling bad json format by ignoring the task (JSONDecodeError is a subclass of ValueError) 140 | logger.warning('ERROR Reading json task at %s', task_key) 141 | 142 | def _next_instance(self, last_run_at): 143 | """Return a new instance of the same class, but with 144 | its date and count fields updated. 145 | The current instance is left unmodified; the returned copy carries 146 | the incremented total_run_count.""" 147 | return self.__class__(**dict( 148 | self, 149 | last_run_at=last_run_at, 150 | total_run_count=self.total_run_count + 1, 151 | )) 152 | __next__ = next = _next_instance # for 2to3 153 | 154 | def jsondump(self): 155 | # must do a deepcopy using our custom iterator to choose what to save (matching the external view) 156 | self_dict = deepcopy({k: v for k, v in iter(self) if v is not None}) 157 | return
json.dumps(self_dict, cls=DateTimeEncoder) 158 | 159 | def update(self, other): 160 | """ 161 | Update values from another task. 162 | This is used to dynamically update a periodic task from edited redis values. 163 | Does not update "non-editable" fields (last_run_at, last_task_id, total_run_count). 164 | Extra arguments will be updated (they are considered editable). 165 | """ 166 | otherdict = dict(other.__dict__) # copy it, so the other task is not modified. note : the schedule property is not part of the dict. 167 | otherdict.pop('last_run_at', None) 168 | otherdict.pop('last_task_id', None) 169 | otherdict.pop('total_run_count', None) 170 | self.__dict__.update(otherdict) 171 | 172 | def __repr__(self): 173 | return '<PeriodicTask {0}: {1}(*{2}, **{3}) options: {4} schedule: {5}>'.format( 174 | self.name, self.task, self.args, 175 | self.kwargs, self.options, self.schedule, 176 | ) 177 | 178 | def __unicode__(self): 179 | fmt = '{0.name}: {0.schedule}' 180 | return fmt.format(self) 181 | 182 | def get_schedule(self): 183 | """ 184 | schedule Interval / Crontab -> dict 185 | :return: 186 | """ 187 | return vars(self.data) 188 | 189 | def set_schedule(self, schedule): 190 | """ 191 | schedule dict -> Interval / Crontab if needed 192 | :return: 193 | """ 194 | if schedule is None: 195 | pass 196 | elif isinstance(schedule, Interval) or isinstance(schedule, Crontab): 197 | self.data = schedule 198 | elif isinstance(schedule, celery.schedules.crontab): 199 | self.data = Crontab(schedule.minute, schedule.hour, 200 | schedule.day_of_week, schedule.day_of_month, schedule.month_of_year) 201 | elif isinstance(schedule, celery.schedules.schedule) or isinstance(schedule, datetime.timedelta): 202 | # a raw timedelta needs total_seconds() (timedelta.seconds ignores the days part); celery schedules expose .seconds directly 203 | self.data = Interval(schedule.total_seconds() if isinstance(schedule, datetime.timedelta) else schedule.seconds) 204 | else: 205 | schedule_inst = None 206 | for s in [Interval, Crontab]: 207 | try: 208 | schedule_inst = s(**schedule) 209 | except TypeError as typexc: 210 | logger.warn("Create schedule failed.
{}".format(schedule.__class__)) 211 | pass 212 | 213 | if schedule_inst is None: 214 | raise TaskTypeError("Schedule {s} didn't match Crontab or Interval type".format(s=schedule)) 215 | else: 216 | self.data = schedule_inst 217 | 218 | schedule = property(get_schedule, set_schedule) 219 | 220 | def __iter__(self): 221 | """ 222 | We iterate on our members a little bit specially: 223 | => data is hidden and schedule is shown instead 224 | => rdb is hidden 225 | :return: 226 | """ 227 | for k, v in vars(self).items(): # items() rather than iteritems(), so this also works on python 3 228 | if k == 'data': 229 | yield 'schedule', self.schedule 230 | else: # we can expose everything else 231 | yield k, v 232 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | from setuptools import setup 2 | 3 | setup( 4 | name = "celerybeat-redis", 5 | description = "A Celery Beat Scheduler that uses Redis to store both schedule definitions and status information", 6 | version = "0.1.5", 7 | license = "Apache License, Version 2.0", 8 | author = "Kong Luoxing", 9 | author_email = "kong.luoxing@gmail.com", 10 | url = 'https://github.com/kongluoxing/celerybeatredis', 11 | maintainer = "Kong Luoxing", 12 | maintainer_email = "kong.luoxing@gmail.com", 13 | 14 | keywords = "python celery beat redis", 15 | 16 | packages = [ 17 | "celerybeatredis" 18 | ], 19 | 20 | install_requires=[ 21 | 'setuptools', 22 | 'redis>=2.10.3', 23 | 'celery>=3.1.16' 24 | ] 25 | 26 | ) 27 | -------------------------------------------------------------------------------- /test/celeryconfig.py: -------------------------------------------------------------------------------- 1 | from datetime import timedelta 2 | CELERY_REDIS_SCHEDULER_URL = "redis://localhost:6379/0" 3 | BROKER_URL = "redis://localhost:6379/0" 4 | CELERY_REDIS_SCHEDULER_KEY_PREFIX = 'tasks:meta:' 5 | CELERYBEAT_SCHEDULE = { 6 | 'add-every-3-seconds': { 7 | 'task': 'tasks.add', 8 | 'schedule': timedelta(seconds=3), 9 | 'args': (16, 16) 10 | }, 11 | } 12 | -------------------------------------------------------------------------------- /test/tasks.py: -------------------------------------------------------------------------------- 1 | from celery import Celery 2 | import time 3 | app = Celery('tasks') 4 | 5 | @app.task 6 | def add(x, y): 7 | time.sleep(3) 8 | return x + y 9 | --------------------------------------------------------------------------------