├── .gitignore ├── MANIFEST.in ├── README.rst ├── config.yml ├── examples ├── config0-simple-sources-list.yml ├── config1-overwrite-source-handler-watcher.yml ├── config2-yet-another-sources-format.yml ├── config3-custom-pipe.yml └── config4-custom-pipe-with-options.yml ├── logdog ├── __init__.py ├── app.py ├── cli.py ├── compat.py ├── core │ ├── __init__.py │ ├── base_role.py │ ├── config.py │ ├── log.py │ ├── msg.py │ ├── path.py │ ├── policies.py │ ├── register.py │ └── utils │ │ ├── __init__.py │ │ └── six.py ├── default_config.py ├── roles │ ├── __init__.py │ ├── collectors │ │ ├── __init__.py │ │ └── base.py │ ├── connectors │ │ ├── __init__.py │ │ ├── base.py │ │ └── zmq_tunnel.py │ ├── formatters │ │ ├── __init__.py │ │ ├── base.py │ │ └── fmtr.py │ ├── forwarders │ │ ├── __init__.py │ │ ├── base.py │ │ ├── broadcast.py │ │ └── round_robin.py │ ├── parsers │ │ ├── __init__.py │ │ ├── base.py │ │ └── regex.py │ ├── pipes │ │ ├── __init__.py │ │ ├── base.py │ │ └── pipe.py │ ├── pollers │ │ ├── __init__.py │ │ ├── base.py │ │ └── file_watcher.py │ ├── processors │ │ ├── __init__.py │ │ ├── base.py │ │ └── stripper.py │ └── viewers │ │ ├── __init__.py │ │ ├── base.py │ │ ├── console.py │ │ ├── null.py │ │ └── webui │ │ ├── Gruntfile.js │ │ ├── __init__.py │ │ ├── assets │ │ ├── css │ │ │ └── app.css │ │ ├── html │ │ │ └── index.html │ │ └── js │ │ │ └── app.js │ │ ├── bower.json │ │ ├── package.json │ │ ├── static │ │ ├── css │ │ │ └── app.css │ │ ├── img │ │ │ └── icons │ │ │ │ └── navigation │ │ │ │ └── svg │ │ │ │ └── production │ │ │ │ └── ic_menu_18px.svg │ │ └── js │ │ │ └── app.js │ │ └── web.py └── version.py ├── pytest.ini ├── requirements.txt ├── scripts └── logdog ├── setup.py ├── test-requirements.txt ├── tests ├── conftest.py └── core │ ├── test_config.py │ └── test_msg.py └── tox.ini /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | 
*.py[cod] 4 | 5 | # C extensions 6 | *.so 7 | 8 | # Distribution / packaging 9 | .Python 10 | env/ 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | lib/ 17 | lib64/ 18 | parts/ 19 | sdist/ 20 | var/ 21 | *.egg-info/ 22 | .installed.cfg 23 | *.egg 24 | .eggs/ 25 | 26 | # PyInstaller 27 | # Usually these files are written by a python script from a template 28 | # before PyInstaller builds the exe, so as to inject date/other info into it. 29 | *.manifest 30 | *.spec 31 | MANIFEST 32 | 33 | # Installer logs 34 | pip-log.txt 35 | pip-delete-this-directory.txt 36 | 37 | # Unit test / coverage reports 38 | htmlcov/ 39 | .tox/ 40 | .coverage 41 | .cache 42 | nosetests.xml 43 | coverage.xml 44 | 45 | # Translations 46 | *.mo 47 | *.pot 48 | 49 | # Django stuff: 50 | *.log 51 | 52 | # Sphinx documentation 53 | docs/_build/ 54 | 55 | # PyBuilder 56 | target/ 57 | 58 | # PyCharm 59 | .idea/ 60 | 61 | # Vagrant 62 | .vagrant/ 63 | 64 | # tmp static 65 | bower_components/ 66 | node_modules/ 67 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include README.rst 2 | include MANIFEST.in 3 | include MANIFEST 4 | include requirements.txt 5 | include test-requirements.txt 6 | include setup.py 7 | recursive-include logdog * 8 | recursive-include docs * 9 | recursive-include examples * 10 | global-exclude *.pyc 11 | global-exclude *.sw* 12 | prune logdog/roles/viewers/webui/bower_components 13 | prune logdog/roles/viewers/webui/node_modules -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | python-logdog 2 | ------------- 3 | 4 | distributed log tail viewer 5 | 6 | Why?
7 | 8 | - tail log files and forward them to a web UI at runtime 9 | - dynamically parse and (pre)process logs 10 | - aggregate and collect logs 11 | - alerting 12 | 13 | 14 | Quick start 15 | =========== 16 | 17 | Running a simple one-host configuration: 18 | 19 | .. code-block:: bash 20 | 21 | $ pip install logdog 22 | 23 | Help output: 24 | 25 | .. code-block:: bash 26 | 27 | $ logdog --help 28 | logdog command line interface 29 | 30 | Usage: 31 | logdog [<pipe-namespace>...] [options] 32 | logdog (-h | --help) 33 | logdog --version 34 | 35 | Arguments: 36 | <pipe-namespace>  One or more pipe namespaces to be run 37 | 38 | Options: 39 | -h --help Show this screen 40 | --version Show version 41 | -v --verbose Run in verbose mode 42 | -l --log-level=<level> Set internal logging level [default: INFO] 43 | -f --log-format=<format> Set internal logging format [default: quiet] 44 | -c --config=<config> Configuration file (yaml config) 45 | -s --sources=<sources> Force specify files to be watched 46 | -H --handler=<handler> Force set handler for all sources 47 | (e.g. --handler=viewers.console) 48 | --reset-indices Remove current indices (will reset watching state) 49 | 50 | 51 | Prepare a config file: 52 | 53 | .. code-block:: yaml 54 | 55 | # config.yml 56 | --- 57 | sources: 58 | - /var/log/*.log 59 | - /var/log/*/*.log 60 | - /var/log/syslog 61 | 62 | Please see `default_config.py`_ 63 | 64 | .. _default_config.py: logdog/default_config.py 65 | 66 | Start watching: 67 | 68 | .. code-block:: bash 69 | 70 | $ logdog --config=config.yml 71 | 72 | You can run the watching and viewing parts separately: 73 | 74 | .. code-block:: bash 75 | 76 | $ logdog watch --config=config.yml 77 | # another console 78 | $ logdog view --config=config.yml 79 | 80 | 81 | Config 82 | ====== 83 | 84 | YAML is used as the main configuration format. 85 | 86 | Default config: 87 | 88 | .. code-block:: yaml 89 | 90 | --- 91 | sources: 92 | # 93 | - /var/log/*.log 94 | - /var/log/*/*.log 95 | - /var/log/syslog 96 | 97 | ``sources`` is a list of target files/logs to watch.
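Internally, each ``sources`` entry is treated as a glob pattern: it is expanded and then filtered down to regular files (see ``load_sources`` in ``logdog/app.py``). A minimal self-contained sketch of that expansion — the temporary directory and file names below are invented for illustration:

```python
import glob
import os
import tempfile
from itertools import chain

# Create a throwaway directory with a few fake logs so the example is
# reproducible (names are illustrative only).
tmp = tempfile.mkdtemp()
for name in ('app.log', 'db.log', 'notes.txt'):
    open(os.path.join(tmp, name), 'w').close()

# Same expansion logdog applies to each `sources` entry:
# glob-expand every pattern, then keep regular files only.
patterns = [os.path.join(tmp, '*.log')]  # analogous to /var/log/*.log
files = set(filter(os.path.isfile,
                   chain.from_iterable(glob.glob(p) for p in patterns)))
print(sorted(os.path.basename(f) for f in files))
# → ['app.log', 'db.log']
```

Note that only files existing at expansion time are picked up — detecting files created later is still on the TODO list.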
Alternatively, this section can be rewritten in the following way: 98 | 99 | .. code-block:: yaml 100 | 101 | --- 102 | sources: 103 | - /var/log/*.log: pipes.to-web 104 | - /var/log/*/*.log: 105 | handler: pipes.to-web 106 | # ^ note: 4 spaces of indentation 107 | # with 2 spaces it would be parsed as two keys of the list item: 108 | # {'/var/log/*/*.log': None, 109 | # 'handler': 'pipes.to-web'} 110 | # but it must be {'/var/log/*/*.log': {'handler': 'pipes.to-web'}} 111 | - /var/log/syslog: {handler: pipes.to-web} 112 | 113 | 114 | A pipe is a sequence of steps to process / parse / forward / collect log messages. 115 | ``pipes.to-web`` is a predefined pipe (see `default_config.py`_). 116 | 117 | 118 | Full ``sources`` format: 119 | 120 | .. code-block:: none 121 | 122 | --- 123 | sources: 124 | - (path | search pattern) 125 | # or (`handler`, `watcher`, `meta` are optional) 126 | - (path | search pattern): 127 | handler: handler-name # default pipes.to-web 128 | watcher: watcher-name # default pollers.file-watcher 129 | meta: a-dictionary-containing-any-meta-info # e.g. {tags: [tag1, tag2]} 130 | # or 131 | - (path | search pattern): handler-name 132 | # or 133 | - (path | search pattern): {handler: pipes.to-web} 134 | # or 135 | - (path | search pattern): {watcher: pollers.custom-file-poller} 136 | # or 137 | - (path | search pattern): {meta: {tags: [log]}} 138 | 139 | 140 | Example 1: 141 | 142 | .. 
code-block:: yaml 143 | 144 | --- 145 | sources: 146 | - /var/log/syslog: {handler: pipes.to-web, meta: {tags: [syslog]}} 147 | # or 148 | - /var/log/syslog2: 149 | handler: pipes.to-web 150 | meta: 151 | tags: 152 | - syslog 153 | 154 | 155 | Builtins 156 | ======== 157 | 158 | Predefined configs: 159 | 160 | ``pipes``: 161 | 162 | - `pipes.to-web` - defines a simple flow (strip -> zmq localhost:7789 -> zmq *:7789 -> webui) 163 | 164 | ``viewers``: 165 | 166 | - `viewers.null` - does nothing with incoming data 167 | - `viewers.console` - prints incoming log messages to stdout 168 | - `viewers.webui` - forwards all incoming messages to all connected clients using websockets 169 | 170 | ``connectors``: 171 | 172 | - `connectors.zmq-tunnel` - allows creating arbitrary zmq sockets to push/pull data 173 | 174 | For more details see `default_config.py`_. 175 | 176 | Screenshots 177 | =========== 178 | 179 | .. image:: http://i.imgur.com/B4JQ57T.png 180 | 181 | 182 | TODO 183 | ==== 184 | 185 | - cover with tests 186 | - detect new files 187 | - colorize logs 188 | - add documentation 189 | - zmq connectors 190 | - mongodb collector 191 | - webui storages 192 | - webui filtering / searching 193 | - implement `--validate-config` 194 | -------------------------------------------------------------------------------- /config.yml: -------------------------------------------------------------------------------- 1 | --- 2 | sources: 3 | - /var/log/*.log: pipes.custom 4 | - /var/log/*/*.log: 5 | handler: pipes.custom 6 | - /var/log/syslog: {handler: pipes.custom} 7 | 8 | pipes: 9 | custom: 10 | - watch processors.stripper 11 | - watch parsers.regex: 12 | regex: (?P<message>.*) 13 | - view formatters.formatter 14 | - view viewers.console 15 | 16 | 17 | custom2: 18 | cls: tmp.paths -------------------------------------------------------------------------------- /examples/config0-simple-sources-list.yml: -------------------------------------------------------------------------------- 1 | --- 2 
| # Simple configuration file. 3 | # There's a list of files to be watched. 4 | # Default handling pipe will be used for all found files/logs. 5 | sources: 6 | - /var/log/*.log 7 | - /var/log/*/*.log 8 | - /var/log/syslog -------------------------------------------------------------------------------- /examples/config1-overwrite-source-handler-watcher.yml: -------------------------------------------------------------------------------- 1 | --- 2 | sources: 3 | - /var/log/*.log: 4 | handler: pipes.to-web 5 | watcher: pollers.file-watcher 6 | meta: 7 | tags: 8 | - common-logs 9 | - /var/log/*/*.log 10 | - /var/log/syslog -------------------------------------------------------------------------------- /examples/config2-yet-another-sources-format.yml: -------------------------------------------------------------------------------- 1 | --- 2 | sources: 3 | - /var/log/*.log: pipes.to-web 4 | - /var/log/*/*.log: 5 | handler: pipes.to-web 6 | - /var/log/syslog: {handler: pipes.to-web} 7 | -------------------------------------------------------------------------------- /examples/config3-custom-pipe.yml: -------------------------------------------------------------------------------- 1 | --- 2 | sources: 3 | - /var/log/*.log: pipes.custom 4 | - /var/log/*/*.log: 5 | handler: pipes.custom 6 | - /var/log/syslog: {handler: pipes.custom} 7 | 8 | pipes: 9 | custom: 10 | - watch processors.stripper 11 | - watch connectors.zmq-tunnel@sender: 12 | connect: 13 | - tcp://collector.domain.org:5555 14 | - tcp://localhost:5555 15 | socket: PUSH 16 | - view connectors.zmq-tunnel@receiver: 17 | bind: tcp://*:5555 18 | socket: PULL 19 | - view formatters.formatter 20 | - view viewers.console -------------------------------------------------------------------------------- /examples/config4-custom-pipe-with-options.yml: -------------------------------------------------------------------------------- 1 | --- 2 | sources: 3 | - /var/log/*.log 4 | - /var/log/*/*.log 5 | - /var/log/syslog 6 | 7 | 
options: 8 | sources: 9 | default_handler: pipes.custom 10 | 11 | pipes: 12 | custom: 13 | - watch processors.stripper 14 | - watch connectors.zmq-tunnel@sender: 15 | connect: 16 | - tcp://collector.domain.org:5555 17 | - tcp://localhost:5555 18 | socket: PUSH 19 | - view connectors.zmq-tunnel@receiver: 20 | bind: tcp://*:5555 21 | socket: PULL 22 | - view formatters.formatter 23 | - view viewers.console -------------------------------------------------------------------------------- /logdog/__init__.py: -------------------------------------------------------------------------------- 1 | from logdog.version import __version__, __author__, __email__ 2 | -------------------------------------------------------------------------------- /logdog/app.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import glob 4 | from itertools import chain 5 | import logging 6 | import os 7 | from tornado import gen 8 | 9 | from logdog.core.config import Config, handle_as_list 10 | from logdog.core.path import Path 11 | from logdog.core.register import Register 12 | 13 | 14 | logger = logging.getLogger(__name__) 15 | 16 | 17 | class Application(object): 18 | 19 | def __init__(self, active_namespaces, config, io_loop=None, 20 | force_handler=None, force_sources=None, reset_indices=False): 21 | from tornado.ioloop import IOLoop 22 | self.io_loop = io_loop or IOLoop.current() 23 | self.config = config 24 | self.force_handler = force_handler 25 | self.force_sources = force_sources 26 | self.namespaces = (Config.namespace_default,) 27 | self.active_namespaces = active_namespaces or [Config.namespace_default] 28 | self._pipes = {} 29 | self._sources = {} 30 | self.register = Register(index_file=config.options.sources.index_file, reset=reset_indices) 31 | 32 | logger.debug('[%s] Active namespaces: %s', self, ', '.join(self.active_namespaces)) 33 | 34 | def __str__(self): 35 | return 'APP' 
36 | 37 | @gen.coroutine 38 | def load_sources(self): 39 | flat_files = set() 40 | flat_patterns = set() 41 | 42 | sources = self.config.sources 43 | if self.force_sources: 44 | self.force_sources = self.force_sources.split(':') 45 | sources = handle_as_list(self.force_sources) 46 | 47 | for source, conf in sources: 48 | if not isinstance(source, (list, tuple)): 49 | source = [source] 50 | else: 51 | source = list(source) 52 | 53 | if isinstance(conf, basestring): 54 | conf = Config(handler=conf) 55 | elif not conf: 56 | conf = Config(handler=self.config.options.sources.default_handler) 57 | elif isinstance(conf, dict): 58 | conf.setdefault('handler', self.config.options.sources.default_handler) 59 | else: 60 | logger.warning('[APP] Weird config for "%s" (will be skipped).', ', '.join(source)) 61 | continue 62 | 63 | if self.force_handler: 64 | conf['handler'] = self.force_handler 65 | 66 | intersection_patterns = flat_patterns.intersection(source) 67 | if intersection_patterns: 68 | logger.warning('[APP] Duplicate source patterns: %s (will be skipped).', 69 | ', '.join(intersection_patterns)) 70 | source = list(set(source).difference(intersection_patterns)) 71 | 72 | source.sort() 73 | source = tuple(source) 74 | 75 | files = chain(*map(glob.glob, source)) 76 | files = set(filter(os.path.isfile, files)) 77 | 78 | intersection_files = flat_files.intersection(files) 79 | if intersection_files: 80 | logger.warning('[APP] Your source patterns have intersections.' 
81 | ' The following files appeared in several groups: %s', 82 | ', '.join(intersection_files)) 83 | 84 | # remove from wider pattern 85 | for source_, (_, files_) in self._sources.items(): 86 | intersection_ = files_.intersection(files) 87 | if intersection_ and len(files_) > len(files): 88 | files_.difference_update(intersection_) 89 | logger.warning('[APP] "%s" will not be a part of "%s" group.', 90 | ', '.join(intersection_), ', '.join(source_)) 91 | elif intersection_: 92 | files.difference_update(intersection_) 93 | logger.warning('[APP] "%s" will not be a part of "%s" group.', 94 | ', '.join(intersection_), ', '.join(source)) 95 | 96 | flat_files.update(files) 97 | flat_patterns.update(source) 98 | self._sources[source] = (conf, files) 99 | 100 | @gen.coroutine 101 | def construct_pipes(self): 102 | default_watcher = self.config.options.sources.default_watcher 103 | for conf, files in self._sources.itervalues(): 104 | conf['app'] = self 105 | conf['parent'] = self 106 | 107 | for f in files: 108 | watcher = self.config.find_and_construct_class( 109 | name=conf.get('watcher', default_watcher), kwargs=conf) 110 | pipe = self.config.find_and_construct_class(name=conf['handler'], kwargs=conf) 111 | 112 | try: 113 | path = self.register.get_path(f) 114 | except KeyError: 115 | path = Path(f, 0, None) 116 | 117 | watcher.set_input(path) 118 | watcher.set_output(pipe) 119 | pipe.link_methods() 120 | watcher.link_methods() 121 | 122 | self._pipes[f] = watcher, pipe 123 | 124 | @gen.coroutine 125 | def _init(self): 126 | yield self.load_sources() 127 | yield self.construct_pipes() 128 | 129 | pipes = [p for _, p in self._pipes.itervalues()] 130 | watchers = [w for w, _ in self._pipes.itervalues()] 131 | yield [i.start() for i in chain(pipes, watchers)] 132 | 133 | def run(self): 134 | self.io_loop.add_callback(self._init) 135 | self.io_loop.start() 136 | -------------------------------------------------------------------------------- /logdog/cli.py: 
-------------------------------------------------------------------------------- 1 | """logdog command line interface 2 | 3 | Usage: 4 | logdog [<pipe-namespace>...] [options] 5 | logdog (-h | --help) 6 | logdog --version 7 | 8 | Arguments: 9 | <pipe-namespace>  One or more pipe namespaces to be run 10 | 11 | Options: 12 | -h --help Show this screen 13 | --version Show version 14 | -v --verbose Run in verbose mode 15 | -l --log-level=<level> Set internal logging level [default: INFO] 16 | -f --log-format=<format> Set internal logging format [default: quiet] 17 | -c --config=<config> Configuration file (yaml config) 18 | -s --sources=<sources> Force specify files to be watched 19 | -H --handler=<handler> Force set handler for all sources 20 | (e.g. --handler=viewers.console) 21 | --reset-indices Remove current indices (will reset watching state) 22 | """ 23 | from __future__ import absolute_import, unicode_literals 24 | 25 | from docopt import docopt 26 | from logdog.app import Application 27 | from logdog.core.config import ConfigLoader 28 | from logdog.core.log import configure_logging 29 | from logdog.version import __version__ 30 | 31 | 32 | def main(): 33 | arguments = docopt(__doc__, version='logdog {}'.format(__version__)) 34 | 35 | log_level = arguments.get('--log-level').upper() 36 | log_format = log_custom_format = arguments.get('--log-format') 37 | 38 | if log_format not in ('verbose', 'quiet'): 39 | log_format = 'custom' 40 | 41 | if arguments.get('--verbose'): 42 | log_level = 'DEBUG' 43 | log_format = 'verbose' 44 | 45 | configure_logging(log_level, log_format, log_custom_format) 46 | 47 | config_path = arguments.get('--config') 48 | loader = ConfigLoader(path=config_path) 49 | config = loader.load_config(default_only=not config_path) 50 | 51 | Application( 52 | active_namespaces=arguments.get('<pipe-namespace>'), 53 | config=config, 54 | force_handler=arguments.get('--handler'), 55 | force_sources=arguments.get('--sources'), 56 | reset_indices=arguments.get('--reset-indices'), 57 | ).run() 58 | 59 | 60 | if __name__ == '__main__': 61 | 
main() -------------------------------------------------------------------------------- /logdog/compat.py: -------------------------------------------------------------------------------- 1 | from logdog.core.utils.six import * 2 | 3 | 4 | __all__ = ['import_object', 'text_type'] 5 | 6 | 7 | # tornado import_object 8 | from tornado.util import import_object as _import_object 9 | if PY2: 10 | import_object = lambda name: _import_object(str(name)) 11 | if PY3: 12 | import_object = _import_object 13 | -------------------------------------------------------------------------------- /logdog/core/__init__.py: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /logdog/core/base_role.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from tornado import gen 5 | from tornado.concurrent import Future 6 | 7 | from .config import Config 8 | from .msg import Msg 9 | from .utils import mark_as_proxy_method, is_proxy, simple_oid 10 | 11 | 12 | logger = logging.getLogger(__name__) 13 | 14 | 15 | class BaseRole(object): 16 | defaults = Config( 17 | unique=False, # means the object will be a singleton / shared 18 | start_delay=0, 19 | ) 20 | _singleton_cache = {} 21 | 22 | @classmethod 23 | def factory(cls, *args, **kwargs): 24 | unique = kwargs.get('unique', False) or cls.defaults.unique 25 | if unique: 26 | # if unique, then behave like a singleton 27 | key = cls.__singleton_key__(args, kwargs) 28 | if key in cls._singleton_cache: 29 | return cls._singleton_cache[key] 30 | new_obj = cls(*args, **kwargs) 31 | cls._singleton_cache[key] = new_obj 32 | return new_obj 33 | 34 | return cls(*args, **kwargs) 35 | 36 | @classmethod 37 | def __singleton_key__(cls, passed_args, passed_kwargs): 38 | return '{}.{}:{}'.format(cls.__module__, 
cls.__name__, passed_kwargs.get('config_name', 'unknown')) 40 | 41 | @property 42 | def is_unique(self): 43 | return self.config.unique 44 | 45 | @property 46 | def is_active(self): 47 | return self.started \ 48 | or self.config.namespace_default in self.namespaces \ 49 | or self.config.namespace_default in self.app.active_namespaces \ 50 | or bool(set(self.namespaces).intersection(self.app.active_namespaces)) 51 | 52 | def __init__(self, *items, **config): 53 | self._oid = simple_oid() 54 | 55 | self.app = config.pop('app') 56 | self.parent = config.pop('parent') 57 | self.namespaces = config.pop('namespaces', None) or self.app.namespaces 58 | self.config = self.defaults.copy_and_update(config) 59 | self.items = items 60 | 61 | self._started = self._unique_start_lock = False 62 | self._started_futures = [] 63 | self._stopped_futures = [] 64 | self.input = self.output = None 65 | self.send = getattr(self, 'send', None) 66 | self._forward = getattr(self, '_forward', None) 67 | 68 | def __str__(self): 69 | return '{}:{}'.format(self.__class__.__name__, self.parent) 70 | 71 | @property 72 | def active_items(self): 73 | return [i for i in self.items if i.is_active] 74 | 75 | def construct_subrole(self, name, conf): 76 | if isinstance(conf, (list, tuple)): 77 | conf = {'*': conf} 78 | elif conf is None: 79 | conf = {} 80 | conf['app'] = self.app 81 | conf['parent'] = self 82 | return self.app.config.find_and_construct_class(name=name, kwargs=conf) 83 | 84 | @property 85 | def started(self): 86 | return self._started 87 | 88 | @started.setter 89 | def started(self, value): 90 | self._started = bool(value) 91 | futures = self._started_futures if self.started else self._stopped_futures 92 | for future in futures: 93 | future.set_result(self._started) 94 | if self._started: 95 | logger.debug('[%s] Started.', self) 96 | 97 | def wait_for_start(self): 98 | f = Future() 99 | if not self.started: 100 | self._started_futures.append(f) 101 | else: 102 | f.set_result(True) 103 | 
return f 104 | 105 | def wait_for_stop(self): 106 | f = Future() 107 | if not self.started: 108 | self._stopped_futures.append(f) 109 | else: 110 | f.set_result(False) 111 | return f 112 | 113 | def link_methods(self): 114 | if getattr(self, 'send', None) is None: 115 | self.send = self.get_send_method() 116 | 117 | if getattr(self, '_forward', None) is None: 118 | self._forward = self.get_forward_method() 119 | 120 | @mark_as_proxy_method 121 | def _input_forwarder(self, data): 122 | return self.output.send(data) 123 | 124 | def get_send_method(self): 125 | return getattr(self, 'on_recv', self._input_forwarder) 126 | 127 | def get_forward_method(self): 128 | method = self._input_forwarder 129 | obj = self.output 130 | 131 | while is_proxy(method): 132 | if obj: 133 | method = obj.send 134 | obj = obj.output 135 | else: 136 | break 137 | 138 | return method 139 | 140 | def set_input(self, obj): 141 | self.input = obj 142 | 143 | def set_output(self, obj): 144 | self.output = obj 145 | 146 | def _prepare_message_meta(self, **extra): 147 | meta = self.config.get('meta') 148 | if meta: 149 | extra.update(meta) 150 | extra.setdefault('host', None) 151 | return extra 152 | 153 | def _prepare_message(self, data): 154 | return Msg( 155 | message=data, 156 | source=None, 157 | meta=self._prepare_message_meta() 158 | ) 159 | 160 | def _pre_start(self): 161 | pass 162 | 163 | def _post_stop(self): 164 | pass 165 | 166 | @gen.coroutine 167 | def start(self): 168 | need_to_skip_start = self.started or not self.is_active 169 | if not need_to_skip_start and self.is_unique and self._unique_start_lock: 170 | need_to_skip_start = True 171 | 172 | if not need_to_skip_start: 173 | if self.is_unique: 174 | self._unique_start_lock = True 175 | 176 | if self.config.start_delay: 177 | yield gen.sleep(self.config.start_delay) 178 | 179 | logger.debug('[%s] Starting...', self) 180 | if hasattr(self.output, 'wait_for_start'): 181 | yield gen.maybe_future(self.output.wait_for_start()) 182 | 
yield gen.maybe_future(self._pre_start()) 183 | 184 | # mark obj as 'started' 185 | self.started = True 186 | 187 | # notify that the obj will be reused for other pipes in case of uniqueness 188 | if self.is_unique: 189 | logger.debug('[%s] Started in a shared mode.', self) 190 | 191 | @gen.coroutine 192 | def stop(self): 193 | if self.started: 194 | logger.debug('[%s] Stopping...', self) 195 | if hasattr(self.input, 'wait_for_stop'): 196 | yield self.input.wait_for_stop() 197 | self.started = False 198 | yield gen.maybe_future(self._post_stop()) 199 | -------------------------------------------------------------------------------- /logdog/core/config.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import copy 4 | import logging 5 | import os 6 | 7 | from tornado.util import ObjectDict 8 | from yaml import load 9 | try: 10 | from yaml import CLoader as Loader 11 | except ImportError: 12 | from yaml import Loader 13 | 14 | from logdog.default_config import config as logdog_default_config 15 | from logdog.compat import text_type, import_object 16 | 17 | 18 | logger = logging.getLogger(__name__) 19 | 20 | 21 | class ConfigError(Exception): 22 | pass 23 | 24 | _unset = object() 25 | 26 | 27 | def handle_as_list(conf): 28 | if isinstance(conf, dict): 29 | return conf.items() 30 | if isinstance(conf, (list, tuple)): 31 | l = [] 32 | for item in conf: 33 | if isinstance(item, basestring): 34 | l.append((item, None)) 35 | elif isinstance(item, dict) and len(item) == 1: 36 | l.append(item.items()[0]) 37 | elif isinstance(item, (list, tuple)) and item: 38 | l.append((item[0], item[1:] if len(item) > 2 else item[1])) 39 | else: 40 | l.append((item, None)) 41 | return l 42 | return [(conf, None)] 43 | 44 | 45 | def is_importable(key): 46 | return key in ('cls',) 47 | 48 | 49 | class Config(ObjectDict): 50 | namespace_delimiter = ' ' 51 | namespace_default = '*' 52 | 
subconfig_namespace_delimiter = '@' 53 | 54 | _classes_cache = {} 55 | _conf_path_cache = {} 56 | 57 | def copy(self): 58 | return copy.copy(self) 59 | 60 | def deepcopy(self): 61 | return copy.deepcopy(self) 62 | 63 | def copy_and_update(self, *args, **config): 64 | conf = self.deepcopy() 65 | for a in args: 66 | if isinstance(a, dict): 67 | conf.update(a) 68 | conf.update(config) 69 | return conf 70 | 71 | __call__ = copy_and_update 72 | 73 | def walk_conf(self, path, default=_unset): 74 | _cache = self._conf_path_cache.get(path) 75 | if _cache is not None: 76 | return copy.deepcopy(_cache) 77 | 78 | target = self 79 | for part in path.split('.'): 80 | try: 81 | target = target[part] 82 | except (KeyError, IndexError): 83 | if default is not _unset: 84 | return default 85 | raise 86 | 87 | self._conf_path_cache[path] = target 88 | return copy.deepcopy(target) 89 | 90 | def find_conf(self, name, fallback=None): 91 | name = name.strip().lstrip(self.namespace_delimiter).rstrip(self.subconfig_namespace_delimiter) 92 | 93 | namespace = self.namespace_default 94 | if self.namespace_delimiter in name: 95 | namespace, tmp, name = name.partition(self.namespace_delimiter) 96 | 97 | extra_conf = None 98 | if self.subconfig_namespace_delimiter in name: 99 | name, tmp, extra_conf = name.rpartition(self.subconfig_namespace_delimiter) 100 | extra_conf_relative = '{}.{}{}'.format(name, self.subconfig_namespace_delimiter, extra_conf) 101 | extra_conf = self.walk_conf(extra_conf_relative, default=None) 102 | 103 | try: 104 | target = self.walk_conf(name) 105 | except KeyError: 106 | if fallback: 107 | return self.find_conf(name=fallback) 108 | else: 109 | raise 110 | 111 | if isinstance(target, (list, tuple)): 112 | target = handle_as_list(target) 113 | 114 | elif extra_conf and isinstance(target, dict) and isinstance(extra_conf, dict): 115 | target.update(extra_conf) 116 | 117 | return target, name, namespace.split(',') 118 | 119 | def find_class(self, name, fallback=None): 120 | 
initial_name = name 121 | _cache = self._classes_cache.get(initial_name) 122 | if _cache is not None: 123 | return copy.deepcopy(_cache) 124 | 125 | conf, name, namespaces = self.find_conf(name, fallback=None) 126 | if isinstance(conf, basestring): 127 | conf = {'cls': conf} 128 | elif isinstance(conf, (list, tuple)): 129 | conf = {'*': conf} 130 | 131 | try: 132 | cls = conf.pop('cls') 133 | except KeyError: 134 | if name != 'default': 135 | fallback = fallback if fallback else ''.join(name.rpartition('.')[:-1] + ('default',)) 136 | cls, f_name, f_conf, f_ns = self.find_class(name=fallback) 137 | f_conf.update(conf) 138 | conf = f_conf 139 | namespaces = list(set(namespaces).union(f_ns)) 140 | else: 141 | raise 142 | 143 | # sanitize conf 144 | for k in conf.keys(): 145 | if k.startswith(self.subconfig_namespace_delimiter): 146 | conf.pop(k) 147 | 148 | if isinstance(cls, basestring): 149 | try: 150 | cls = import_object(cls) 151 | except ImportError: 152 | logger.warning('Could not import "%s".', cls) 153 | 154 | if 'namespaces' in conf: 155 | if tuple(namespaces) == (self.namespace_default,): 156 | namespaces = tuple(conf['namespaces']) 157 | else: 158 | namespaces = tuple(set(namespaces).union(conf['namespaces'])) 159 | 160 | self._classes_cache[initial_name] = cls, name, conf, tuple(namespaces) 161 | return cls, name, copy.deepcopy(conf), tuple(namespaces) 162 | 163 | def find_and_construct_class(self, name, fallback=None, args=(), kwargs=None): 164 | found_cls, name, defaults, ns = self.find_class(name=name, fallback=fallback) 165 | 166 | if kwargs: 167 | defaults.update(kwargs) 168 | 169 | kw = defaults 170 | kw.update(kw.pop('**', {})) 171 | kw['namespaces'] = set(kw.get('namespaces', ())).union(ns) 172 | kw['config_name'] = name 173 | 174 | args = kw.pop('*', args) 175 | 176 | if hasattr(found_cls, 'factory'): 177 | found_cls = found_cls.factory 178 | 179 | args = handle_as_list(args) 180 | 181 | if callable(found_cls): 182 | return found_cls(*args, **kw) 
183 | raise ConfigError('{} is not a class. Cannot be imported or called.'.format(found_cls)) 184 | 185 | 186 | class ConfigLoader(object): 187 | 188 | def __init__(self, path): 189 | self._path = path 190 | 191 | def _load_user_config(self): 192 | if not os.path.exists(self._path): 193 | raise ConfigError('[{}] Config file does not exist.'.format(self._path)) 194 | with open(self._path, 'r') as f: 195 | return load(f, Loader=Loader) 196 | 197 | def _merge_configs(self, config, updater): 198 | if isinstance(config, dict): 199 | if isinstance(updater, dict): 200 | missing_keys = set(updater).difference(config) 201 | existing_keys = set(updater).intersection(config) 202 | 203 | cfg = config.copy() 204 | for k in missing_keys: 205 | cfg[k] = updater[k] 206 | 207 | for k in existing_keys: 208 | cfg[k] = self._merge_configs(cfg[k], updater[k]) 209 | 210 | return cfg 211 | 212 | return updater 213 | 214 | def load_config(self, default_only=False): 215 | user_config = self._load_user_config() if not default_only else Config() 216 | default_config = copy.deepcopy(logdog_default_config) 217 | 218 | if not isinstance(user_config, dict): 219 | raise ConfigError('[{}] Invalid config file: must have \"key: value\" format.'.format(self._path)) 220 | 221 | def walk(cfg, key=None): 222 | if isinstance(cfg, dict): 223 | return Config((k, walk(v, key=k)) for k, v in cfg.items()) 224 | if isinstance(cfg, (tuple, list)): 225 | return map(walk, cfg) 226 | if isinstance(cfg, (str, text_type)): 227 | ret = cfg.format(default=default_config) 228 | if is_importable(key): 229 | try: 230 | ret = import_object(ret) 231 | except ImportError: 232 | logger.debug('[%s] Faced non-importable path: \"%s: %s\".', self._path, key, ret) 233 | return ret 234 | return cfg 235 | 236 | default_config = walk(default_config) 237 | user_config = walk(user_config) 238 | 239 | conf = self._merge_configs(default_config, user_config) 240 | conf.sources = handle_as_list(conf.sources) 241 | 242 | return conf 
-------------------------------------------------------------------------------- /logdog/core/log.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging.config 4 | 5 | 6 | def configure_logging(log_level, log_format, log_custom_format=None): 7 | log_custom_format = log_custom_format or log_format 8 | logging.config.dictConfig({ 9 | 'version': 1, 10 | 'disable_existing_loggers': True, 11 | 'formatters': { 12 | 'quiet': { 13 | 'format': '%(asctime)s [%(levelname)s] %(message)s' 14 | }, 15 | 'verbose': { 16 | 'format': 'PID%(process)d %(asctime)s [%(levelname)s:%(name)s:%(lineno)d] %(message)s' 17 | }, 18 | 'custom': { 19 | 'format': log_custom_format 20 | } 21 | }, 22 | 'handlers': { 23 | 'console': { 24 | 'level': 'DEBUG', 25 | 'class': 'logging.StreamHandler', 26 | 'formatter': log_format 27 | } 28 | }, 29 | 'loggers': { 30 | 'logdog': { 31 | 'handlers': ['console'], 32 | 'level': log_level, 33 | 'propagate': False, 34 | }, 35 | 'tornado': { 36 | 'handlers': ['console'], 37 | 'level': log_level, 38 | 'propagate': False, 39 | }, 40 | 'zmq': { 41 | 'handlers': ['console'], 42 | 'level': log_level, 43 | 'propagate': False, 44 | }, 45 | } 46 | }) 47 | 48 | -------------------------------------------------------------------------------- /logdog/core/msg.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import json 4 | from tornado.escape import to_unicode 5 | import umsgpack 6 | 7 | 8 | class Msg(object): 9 | __slots__ = ( 10 | 'message', 11 | 'source', 12 | 'meta', 13 | 'version', 14 | ) 15 | 16 | def __init__(self, message, source, meta=None, version=1): 17 | self.message = to_unicode(message) 18 | self.source = to_unicode(source) 19 | self.meta = meta 20 | self.version = version 21 | 22 | def __str__(self): 23 | return self.message 24 | 25 | def 
update_message(self, message): 26 | self.message = to_unicode(message) 27 | 28 | def update_meta(self, d): 29 | if self.meta is None: 30 | self.meta = {} 31 | self.meta.update(d) 32 | 33 | def serialize(self): 34 | return { 35 | 'msg': self.message, 36 | 'src': self.source, 37 | 'meta': self.meta, 38 | '_v': self.version, 39 | } 40 | 41 | @classmethod 42 | def deserialize(cls, data): 43 | return cls(message=data.get('msg'), 44 | source=data.get('src'), 45 | meta=data.get('meta'), 46 | version=data.get('_v', 1)) 47 | 48 | def serialize_json(self): 49 | return json.dumps(self.serialize()) 50 | 51 | @classmethod 52 | def deserialize_json(cls, data): 53 | return cls.deserialize(json.loads(data)) 54 | 55 | def serialize_jsonb(self): 56 | return umsgpack.dumps(self.serialize()) 57 | 58 | @classmethod 59 | def deserialize_jsonb(cls, data): 60 | return cls.deserialize(umsgpack.loads(data)) 61 | -------------------------------------------------------------------------------- /logdog/core/path.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | import os 5 | 6 | 7 | logger = logging.getLogger(__name__) 8 | 9 | 10 | class Path(object): 11 | __slots__ = ('name', 'stat', '_prev_stat', 'offset', '_f', '_last_read_line') 12 | 13 | def __init__(self, name, offset, stat): 14 | self.name = name 15 | self.stat = self._prev_stat = stat 16 | self.offset = offset 17 | 18 | self._f = self._last_read_line = None 19 | 20 | def __str__(self): 21 | return self.name 22 | 23 | def __getstate__(self): 24 | return { 25 | 'name': self.name, 26 | 'stat': tuple(self.stat), 27 | '_prev_stat': tuple(self._prev_stat), 28 | 'offset': self.offset, 29 | '_last_read_line': self._last_read_line, 30 | '_f': None, 31 | } 32 | 33 | def __setstate__(self, value): 34 | self.name = value['name'] 35 | self.stat = os.stat_result(value['stat']) if value['stat'] else None 36 | self.offset = 
value['offset'] 37 | self._prev_stat = os.stat_result(value['_prev_stat']) if value['_prev_stat'] else None 38 | self._last_read_line = value['_last_read_line'] 39 | self._f = None 40 | 41 | def open(self): 42 | self._f = open(self.name, 'r') 43 | self._f.seek(self.offset) 44 | self._prev_stat = self.stat 45 | self.stat = os.stat(self.name) 46 | 47 | def close(self): 48 | self._f.close() 49 | 50 | def reopen(self): 51 | self.close() 52 | self.open() 53 | 54 | def check_stat(self): 55 | """Is called if we have nothing to read""" 56 | self._prev_stat = self.stat 57 | self.stat = os.stat(self.name) 58 | 59 | if not self.is_same_file(self._prev_stat, self.stat): 60 | logger.debug('[FILE:%s] New file detected by this path. Re-opening...', self.name) 61 | self.offset = 0 62 | self.reopen() 63 | 64 | elif self.is_file_truncated(self._prev_stat, self.stat): 65 | logger.debug('[FILE:%s] Seems, file was truncated. Re-opening...', self.name) 66 | self.offset = 0 67 | self.reopen() 68 | 69 | @staticmethod 70 | def is_same_file(prev_stat, cur_stat): 71 | return prev_stat.st_ino == cur_stat.st_ino and prev_stat.st_dev == cur_stat.st_dev 72 | 73 | @staticmethod 74 | def is_file_truncated(prev_stat, cur_stat): 75 | return prev_stat.st_size > cur_stat.st_size 76 | 77 | def read_line(self): 78 | self._last_read_line = (_, line) = (self.offset, self._f.readline()) 79 | self.offset = self._f.tell() 80 | return line 81 | -------------------------------------------------------------------------------- /logdog/core/policies.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | import time 5 | from tornado import gen 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class DefaultSleepPolicy(object): 12 | BASE_SLEEP_INTERVAL = 0.5 13 | MAX_SLEEP_INTERVAL = 3 * 60. 
14 | SLEEP_REPEAT_COUNT = 10 15 | 16 | def __init__(self, sleep_interval=BASE_SLEEP_INTERVAL, 17 | max_sleep_interval=MAX_SLEEP_INTERVAL, 18 | sleep_repeat_count=SLEEP_REPEAT_COUNT, **kwargs): 19 | 20 | self._base_sleep_interval = self._sleep_interval = sleep_interval 21 | self._max_sleep_interval = max_sleep_interval 22 | self._sleep_repeat_count = self._base_sleep_repeat_count = sleep_repeat_count 23 | 24 | def _next_sleep_interval(self): 25 | if not self._sleep_repeat_count: 26 | self._sleep_repeat_count = self._base_sleep_repeat_count 27 | self._sleep_interval *= 2 28 | if self._sleep_interval > self._max_sleep_interval: 29 | self._sleep_interval = self._max_sleep_interval 30 | return 31 | self._sleep_repeat_count -= 1 32 | 33 | def sleep(self): 34 | f = gen.sleep(self._sleep_interval) 35 | f.add_done_callback(lambda _: self._next_sleep_interval()) 36 | return f 37 | 38 | @property 39 | def cur_interval(self): 40 | return self._sleep_interval 41 | 42 | def reset(self): 43 | self._sleep_interval = self._base_sleep_interval 44 | self._sleep_repeat_count = self._base_sleep_repeat_count 45 | 46 | 47 | class GreedyPolicy(object): 48 | MAX_COUNT = 100 49 | TIMEOUT = 10.0 50 | 51 | def __init__(self, max_count=MAX_COUNT, timeout=TIMEOUT, **kwargs): 52 | 53 | self._max_count = max_count 54 | self._count = 0 55 | self._timeout = timeout 56 | self._time = time.time() 57 | 58 | def need_to_wait(self): 59 | self._count += 1 60 | return (self._count >= self._max_count 61 | or self._time + self._timeout < time.time()) 62 | 63 | def wait(self): 64 | self._count = 0 65 | self._time = time.time() 66 | return gen.moment 67 | -------------------------------------------------------------------------------- /logdog/core/register.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import anydbm 4 | import json 5 | import os 6 | from logdog.core.path import Path 7 | 8 | 9 | class 
Register(object): 10 | 11 | def __init__(self, index_file, reset=False): 12 | self._index_file = os.path.expandvars(os.path.expanduser(index_file)) 13 | if not os.path.exists(os.path.dirname(self._index_file)): 14 | os.makedirs(os.path.dirname(self._index_file)) 15 | self._reg = anydbm.open(self._index_file, 'c') 16 | if reset: 17 | for key in self._reg.keys(): del self._reg[key]  # dbm objects have no clear() 18 | 19 | def set(self, key, val): 20 | self._reg[str(key)] = str(val) 21 | 22 | def get(self, key): 23 | return self._reg[key] 24 | 25 | def __getitem__(self, item): 26 | return self._reg[item] 27 | 28 | def get_path(self, name): 29 | path = Path('', 0, None) 30 | path.__setstate__(json.loads(self.get(name))) 31 | return path 32 | 33 | def set_path(self, path_obj): 34 | self.set(path_obj.name, json.dumps(path_obj.__getstate__())) -------------------------------------------------------------------------------- /logdog/core/utils/__init__.py: 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from string import ascii_letters 5 | import time 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | def mark_as_proxy_method(func): 12 | func.is_proxy_method = True 13 | return func 14 | 15 | 16 | def is_proxy(func): 17 | return getattr(func, 'is_proxy_method', False) 18 | 19 | 20 | def simple_oid(): 21 | oid = [] 22 | x = int(time.time() * 10000) % 10 ** 8 23 | n = len(ascii_letters) 24 | while x: 25 | oid.insert(0, ascii_letters[x % n]) 26 | x //= n  # floor division: keeps x an int on Python 3 27 | return ''.join(oid) 28 | -------------------------------------------------------------------------------- /logdog/core/utils/six.py: 1 | """Utilities for writing code that runs on Python 2 and 3""" 2 | 3 | # Copyright (c) 2010-2015 Benjamin Peterson 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated 
documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights 8 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | # copies of the Software, and to permit persons to whom the Software is 10 | # furnished to do so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | 23 | from __future__ import absolute_import 24 | 25 | import functools 26 | import itertools 27 | import operator 28 | import sys 29 | import types 30 | 31 | __author__ = "Benjamin Peterson " 32 | __version__ = "1.9.0" 33 | 34 | 35 | # Useful for very coarse version differentiation. 36 | PY2 = sys.version_info[0] == 2 37 | PY3 = sys.version_info[0] == 3 38 | 39 | if PY3: 40 | string_types = str, 41 | integer_types = int, 42 | class_types = type, 43 | text_type = str 44 | binary_type = bytes 45 | 46 | MAXSIZE = sys.maxsize 47 | else: 48 | string_types = basestring, 49 | integer_types = (int, long) 50 | class_types = (type, types.ClassType) 51 | text_type = unicode 52 | binary_type = str 53 | 54 | if sys.platform.startswith("java"): 55 | # Jython always uses 32 bits. 56 | MAXSIZE = int((1 << 31) - 1) 57 | else: 58 | # It's possible to have sizeof(long) != sizeof(Py_ssize_t). 
59 | class X(object): 60 | def __len__(self): 61 | return 1 << 31 62 | try: 63 | len(X()) 64 | except OverflowError: 65 | # 32-bit 66 | MAXSIZE = int((1 << 31) - 1) 67 | else: 68 | # 64-bit 69 | MAXSIZE = int((1 << 63) - 1) 70 | del X 71 | 72 | 73 | def _add_doc(func, doc): 74 | """Add documentation to a function.""" 75 | func.__doc__ = doc 76 | 77 | 78 | def _import_module(name): 79 | """Import module, returning the module after the last dot.""" 80 | __import__(name) 81 | return sys.modules[name] 82 | 83 | 84 | class _LazyDescr(object): 85 | 86 | def __init__(self, name): 87 | self.name = name 88 | 89 | def __get__(self, obj, tp): 90 | result = self._resolve() 91 | setattr(obj, self.name, result) # Invokes __set__. 92 | try: 93 | # This is a bit ugly, but it avoids running this again by 94 | # removing this descriptor. 95 | delattr(obj.__class__, self.name) 96 | except AttributeError: 97 | pass 98 | return result 99 | 100 | 101 | class MovedModule(_LazyDescr): 102 | 103 | def __init__(self, name, old, new=None): 104 | super(MovedModule, self).__init__(name) 105 | if PY3: 106 | if new is None: 107 | new = name 108 | self.mod = new 109 | else: 110 | self.mod = old 111 | 112 | def _resolve(self): 113 | return _import_module(self.mod) 114 | 115 | def __getattr__(self, attr): 116 | _module = self._resolve() 117 | value = getattr(_module, attr) 118 | setattr(self, attr, value) 119 | return value 120 | 121 | 122 | class _LazyModule(types.ModuleType): 123 | 124 | def __init__(self, name): 125 | super(_LazyModule, self).__init__(name) 126 | self.__doc__ = self.__class__.__doc__ 127 | 128 | def __dir__(self): 129 | attrs = ["__doc__", "__name__"] 130 | attrs += [attr.name for attr in self._moved_attributes] 131 | return attrs 132 | 133 | # Subclasses should override this 134 | _moved_attributes = [] 135 | 136 | 137 | class MovedAttribute(_LazyDescr): 138 | 139 | def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None): 140 | super(MovedAttribute, 
self).__init__(name) 141 | if PY3: 142 | if new_mod is None: 143 | new_mod = name 144 | self.mod = new_mod 145 | if new_attr is None: 146 | if old_attr is None: 147 | new_attr = name 148 | else: 149 | new_attr = old_attr 150 | self.attr = new_attr 151 | else: 152 | self.mod = old_mod 153 | if old_attr is None: 154 | old_attr = name 155 | self.attr = old_attr 156 | 157 | def _resolve(self): 158 | module = _import_module(self.mod) 159 | return getattr(module, self.attr) 160 | 161 | 162 | class _SixMetaPathImporter(object): 163 | """ 164 | A meta path importer to import six.moves and its submodules. 165 | 166 | This class implements a PEP302 finder and loader. It should be compatible 167 | with Python 2.5 and all existing versions of Python3 168 | """ 169 | def __init__(self, six_module_name): 170 | self.name = six_module_name 171 | self.known_modules = {} 172 | 173 | def _add_module(self, mod, *fullnames): 174 | for fullname in fullnames: 175 | self.known_modules[self.name + "." + fullname] = mod 176 | 177 | def _get_module(self, fullname): 178 | return self.known_modules[self.name + "." + fullname] 179 | 180 | def find_module(self, fullname, path=None): 181 | if fullname in self.known_modules: 182 | return self 183 | return None 184 | 185 | def __get_module(self, fullname): 186 | try: 187 | return self.known_modules[fullname] 188 | except KeyError: 189 | raise ImportError("This loader does not know module " + fullname) 190 | 191 | def load_module(self, fullname): 192 | try: 193 | # in case of a reload 194 | return sys.modules[fullname] 195 | except KeyError: 196 | pass 197 | mod = self.__get_module(fullname) 198 | if isinstance(mod, MovedModule): 199 | mod = mod._resolve() 200 | else: 201 | mod.__loader__ = self 202 | sys.modules[fullname] = mod 203 | return mod 204 | 205 | def is_package(self, fullname): 206 | """ 207 | Return true, if the named module is a package. 
208 | 209 | We need this method to get correct spec objects with 210 | Python 3.4 (see PEP451) 211 | """ 212 | return hasattr(self.__get_module(fullname), "__path__") 213 | 214 | def get_code(self, fullname): 215 | """Return None 216 | 217 | Required, if is_package is implemented""" 218 | self.__get_module(fullname) # eventually raises ImportError 219 | return None 220 | get_source = get_code # same as get_code 221 | 222 | _importer = _SixMetaPathImporter(__name__) 223 | 224 | 225 | class _MovedItems(_LazyModule): 226 | """Lazy loading of moved objects""" 227 | __path__ = [] # mark as package 228 | 229 | 230 | _moved_attributes = [ 231 | MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"), 232 | MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"), 233 | MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"), 234 | MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"), 235 | MovedAttribute("intern", "__builtin__", "sys"), 236 | MovedAttribute("map", "itertools", "builtins", "imap", "map"), 237 | MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"), 238 | MovedAttribute("reload_module", "__builtin__", "imp", "reload"), 239 | MovedAttribute("reduce", "__builtin__", "functools"), 240 | MovedAttribute("shlex_quote", "pipes", "shlex", "quote"), 241 | MovedAttribute("StringIO", "StringIO", "io"), 242 | MovedAttribute("UserDict", "UserDict", "collections"), 243 | MovedAttribute("UserList", "UserList", "collections"), 244 | MovedAttribute("UserString", "UserString", "collections"), 245 | MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"), 246 | MovedAttribute("zip", "itertools", "builtins", "izip", "zip"), 247 | MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"), 248 | 249 | MovedModule("builtins", "__builtin__"), 250 | MovedModule("configparser", "ConfigParser"), 251 | MovedModule("copyreg", "copy_reg"), 252 | 
MovedModule("dbm_gnu", "gdbm", "dbm.gnu"), 253 | MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"), 254 | MovedModule("http_cookiejar", "cookielib", "http.cookiejar"), 255 | MovedModule("http_cookies", "Cookie", "http.cookies"), 256 | MovedModule("html_entities", "htmlentitydefs", "html.entities"), 257 | MovedModule("html_parser", "HTMLParser", "html.parser"), 258 | MovedModule("http_client", "httplib", "http.client"), 259 | MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"), 260 | MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"), 261 | MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"), 262 | MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), 263 | MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"), 264 | MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"), 265 | MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"), 266 | MovedModule("cPickle", "cPickle", "pickle"), 267 | MovedModule("queue", "Queue"), 268 | MovedModule("reprlib", "repr"), 269 | MovedModule("socketserver", "SocketServer"), 270 | MovedModule("_thread", "thread", "_thread"), 271 | MovedModule("tkinter", "Tkinter"), 272 | MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"), 273 | MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"), 274 | MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"), 275 | MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"), 276 | MovedModule("tkinter_tix", "Tix", "tkinter.tix"), 277 | MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"), 278 | MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"), 279 | MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"), 280 | MovedModule("tkinter_colorchooser", "tkColorChooser", 281 | "tkinter.colorchooser"), 282 | MovedModule("tkinter_commondialog", "tkCommonDialog", 283 | "tkinter.commondialog"), 284 | 
MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"), 285 | MovedModule("tkinter_font", "tkFont", "tkinter.font"), 286 | MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"), 287 | MovedModule("tkinter_tksimpledialog", "tkSimpleDialog", 288 | "tkinter.simpledialog"), 289 | MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"), 290 | MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"), 291 | MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"), 292 | MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"), 293 | MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"), 294 | MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"), 295 | MovedModule("winreg", "_winreg"), 296 | ] 297 | for attr in _moved_attributes: 298 | setattr(_MovedItems, attr.name, attr) 299 | if isinstance(attr, MovedModule): 300 | _importer._add_module(attr, "moves." + attr.name) 301 | del attr 302 | 303 | _MovedItems._moved_attributes = _moved_attributes 304 | 305 | moves = _MovedItems(__name__ + ".moves") 306 | _importer._add_module(moves, "moves") 307 | 308 | 309 | class Module_six_moves_urllib_parse(_LazyModule): 310 | """Lazy loading of moved objects in six.moves.urllib_parse""" 311 | 312 | 313 | _urllib_parse_moved_attributes = [ 314 | MovedAttribute("ParseResult", "urlparse", "urllib.parse"), 315 | MovedAttribute("SplitResult", "urlparse", "urllib.parse"), 316 | MovedAttribute("parse_qs", "urlparse", "urllib.parse"), 317 | MovedAttribute("parse_qsl", "urlparse", "urllib.parse"), 318 | MovedAttribute("urldefrag", "urlparse", "urllib.parse"), 319 | MovedAttribute("urljoin", "urlparse", "urllib.parse"), 320 | MovedAttribute("urlparse", "urlparse", "urllib.parse"), 321 | MovedAttribute("urlsplit", "urlparse", "urllib.parse"), 322 | MovedAttribute("urlunparse", "urlparse", "urllib.parse"), 323 | MovedAttribute("urlunsplit", "urlparse", 
"urllib.parse"), 324 | MovedAttribute("quote", "urllib", "urllib.parse"), 325 | MovedAttribute("quote_plus", "urllib", "urllib.parse"), 326 | MovedAttribute("unquote", "urllib", "urllib.parse"), 327 | MovedAttribute("unquote_plus", "urllib", "urllib.parse"), 328 | MovedAttribute("urlencode", "urllib", "urllib.parse"), 329 | MovedAttribute("splitquery", "urllib", "urllib.parse"), 330 | MovedAttribute("splittag", "urllib", "urllib.parse"), 331 | MovedAttribute("splituser", "urllib", "urllib.parse"), 332 | MovedAttribute("uses_fragment", "urlparse", "urllib.parse"), 333 | MovedAttribute("uses_netloc", "urlparse", "urllib.parse"), 334 | MovedAttribute("uses_params", "urlparse", "urllib.parse"), 335 | MovedAttribute("uses_query", "urlparse", "urllib.parse"), 336 | MovedAttribute("uses_relative", "urlparse", "urllib.parse"), 337 | ] 338 | for attr in _urllib_parse_moved_attributes: 339 | setattr(Module_six_moves_urllib_parse, attr.name, attr) 340 | del attr 341 | 342 | Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes 343 | 344 | _importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"), 345 | "moves.urllib_parse", "moves.urllib.parse") 346 | 347 | 348 | class Module_six_moves_urllib_error(_LazyModule): 349 | """Lazy loading of moved objects in six.moves.urllib_error""" 350 | 351 | 352 | _urllib_error_moved_attributes = [ 353 | MovedAttribute("URLError", "urllib2", "urllib.error"), 354 | MovedAttribute("HTTPError", "urllib2", "urllib.error"), 355 | MovedAttribute("ContentTooShortError", "urllib", "urllib.error"), 356 | ] 357 | for attr in _urllib_error_moved_attributes: 358 | setattr(Module_six_moves_urllib_error, attr.name, attr) 359 | del attr 360 | 361 | Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes 362 | 363 | _importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"), 364 | "moves.urllib_error", "moves.urllib.error") 365 | 366 | 367 | class 
Module_six_moves_urllib_request(_LazyModule): 368 | """Lazy loading of moved objects in six.moves.urllib_request""" 369 | 370 | 371 | _urllib_request_moved_attributes = [ 372 | MovedAttribute("urlopen", "urllib2", "urllib.request"), 373 | MovedAttribute("install_opener", "urllib2", "urllib.request"), 374 | MovedAttribute("build_opener", "urllib2", "urllib.request"), 375 | MovedAttribute("pathname2url", "urllib", "urllib.request"), 376 | MovedAttribute("url2pathname", "urllib", "urllib.request"), 377 | MovedAttribute("getproxies", "urllib", "urllib.request"), 378 | MovedAttribute("Request", "urllib2", "urllib.request"), 379 | MovedAttribute("OpenerDirector", "urllib2", "urllib.request"), 380 | MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"), 381 | MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"), 382 | MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"), 383 | MovedAttribute("ProxyHandler", "urllib2", "urllib.request"), 384 | MovedAttribute("BaseHandler", "urllib2", "urllib.request"), 385 | MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"), 386 | MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"), 387 | MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"), 388 | MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"), 389 | MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"), 390 | MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"), 391 | MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"), 392 | MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"), 393 | MovedAttribute("HTTPHandler", "urllib2", "urllib.request"), 394 | MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"), 395 | MovedAttribute("FileHandler", "urllib2", "urllib.request"), 396 | MovedAttribute("FTPHandler", "urllib2", "urllib.request"), 397 | MovedAttribute("CacheFTPHandler", "urllib2", 
"urllib.request"), 398 | MovedAttribute("UnknownHandler", "urllib2", "urllib.request"), 399 | MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"), 400 | MovedAttribute("urlretrieve", "urllib", "urllib.request"), 401 | MovedAttribute("urlcleanup", "urllib", "urllib.request"), 402 | MovedAttribute("URLopener", "urllib", "urllib.request"), 403 | MovedAttribute("FancyURLopener", "urllib", "urllib.request"), 404 | MovedAttribute("proxy_bypass", "urllib", "urllib.request"), 405 | ] 406 | for attr in _urllib_request_moved_attributes: 407 | setattr(Module_six_moves_urllib_request, attr.name, attr) 408 | del attr 409 | 410 | Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes 411 | 412 | _importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"), 413 | "moves.urllib_request", "moves.urllib.request") 414 | 415 | 416 | class Module_six_moves_urllib_response(_LazyModule): 417 | """Lazy loading of moved objects in six.moves.urllib_response""" 418 | 419 | 420 | _urllib_response_moved_attributes = [ 421 | MovedAttribute("addbase", "urllib", "urllib.response"), 422 | MovedAttribute("addclosehook", "urllib", "urllib.response"), 423 | MovedAttribute("addinfo", "urllib", "urllib.response"), 424 | MovedAttribute("addinfourl", "urllib", "urllib.response"), 425 | ] 426 | for attr in _urllib_response_moved_attributes: 427 | setattr(Module_six_moves_urllib_response, attr.name, attr) 428 | del attr 429 | 430 | Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes 431 | 432 | _importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"), 433 | "moves.urllib_response", "moves.urllib.response") 434 | 435 | 436 | class Module_six_moves_urllib_robotparser(_LazyModule): 437 | """Lazy loading of moved objects in six.moves.urllib_robotparser""" 438 | 439 | 440 | _urllib_robotparser_moved_attributes = [ 441 | MovedAttribute("RobotFileParser", "robotparser", 
"urllib.robotparser"), 442 | ] 443 | for attr in _urllib_robotparser_moved_attributes: 444 | setattr(Module_six_moves_urllib_robotparser, attr.name, attr) 445 | del attr 446 | 447 | Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes 448 | 449 | _importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"), 450 | "moves.urllib_robotparser", "moves.urllib.robotparser") 451 | 452 | 453 | class Module_six_moves_urllib(types.ModuleType): 454 | """Create a six.moves.urllib namespace that resembles the Python 3 namespace""" 455 | __path__ = [] # mark as package 456 | parse = _importer._get_module("moves.urllib_parse") 457 | error = _importer._get_module("moves.urllib_error") 458 | request = _importer._get_module("moves.urllib_request") 459 | response = _importer._get_module("moves.urllib_response") 460 | robotparser = _importer._get_module("moves.urllib_robotparser") 461 | 462 | def __dir__(self): 463 | return ['parse', 'error', 'request', 'response', 'robotparser'] 464 | 465 | _importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"), 466 | "moves.urllib") 467 | 468 | 469 | def add_move(move): 470 | """Add an item to six.moves.""" 471 | setattr(_MovedItems, move.name, move) 472 | 473 | 474 | def remove_move(name): 475 | """Remove item from six.moves.""" 476 | try: 477 | delattr(_MovedItems, name) 478 | except AttributeError: 479 | try: 480 | del moves.__dict__[name] 481 | except KeyError: 482 | raise AttributeError("no such move, %r" % (name,)) 483 | 484 | 485 | if PY3: 486 | _meth_func = "__func__" 487 | _meth_self = "__self__" 488 | 489 | _func_closure = "__closure__" 490 | _func_code = "__code__" 491 | _func_defaults = "__defaults__" 492 | _func_globals = "__globals__" 493 | else: 494 | _meth_func = "im_func" 495 | _meth_self = "im_self" 496 | 497 | _func_closure = "func_closure" 498 | _func_code = "func_code" 499 | _func_defaults = "func_defaults" 500 | _func_globals = 
"func_globals" 501 | 502 | 503 | try: 504 | advance_iterator = next 505 | except NameError: 506 | def advance_iterator(it): 507 | return it.next() 508 | next = advance_iterator 509 | 510 | 511 | try: 512 | callable = callable 513 | except NameError: 514 | def callable(obj): 515 | return any("__call__" in klass.__dict__ for klass in type(obj).__mro__) 516 | 517 | 518 | if PY3: 519 | def get_unbound_function(unbound): 520 | return unbound 521 | 522 | create_bound_method = types.MethodType 523 | 524 | Iterator = object 525 | else: 526 | def get_unbound_function(unbound): 527 | return unbound.im_func 528 | 529 | def create_bound_method(func, obj): 530 | return types.MethodType(func, obj, obj.__class__) 531 | 532 | class Iterator(object): 533 | 534 | def next(self): 535 | return type(self).__next__(self) 536 | 537 | callable = callable 538 | _add_doc(get_unbound_function, 539 | """Get the function out of a possibly unbound function""") 540 | 541 | 542 | get_method_function = operator.attrgetter(_meth_func) 543 | get_method_self = operator.attrgetter(_meth_self) 544 | get_function_closure = operator.attrgetter(_func_closure) 545 | get_function_code = operator.attrgetter(_func_code) 546 | get_function_defaults = operator.attrgetter(_func_defaults) 547 | get_function_globals = operator.attrgetter(_func_globals) 548 | 549 | 550 | if PY3: 551 | def iterkeys(d, **kw): 552 | return iter(d.keys(**kw)) 553 | 554 | def itervalues(d, **kw): 555 | return iter(d.values(**kw)) 556 | 557 | def iteritems(d, **kw): 558 | return iter(d.items(**kw)) 559 | 560 | def iterlists(d, **kw): 561 | return iter(d.lists(**kw)) 562 | 563 | viewkeys = operator.methodcaller("keys") 564 | 565 | viewvalues = operator.methodcaller("values") 566 | 567 | viewitems = operator.methodcaller("items") 568 | else: 569 | def iterkeys(d, **kw): 570 | return iter(d.iterkeys(**kw)) 571 | 572 | def itervalues(d, **kw): 573 | return iter(d.itervalues(**kw)) 574 | 575 | def iteritems(d, **kw): 576 | return 
iter(d.iteritems(**kw)) 577 | 578 | def iterlists(d, **kw): 579 | return iter(d.iterlists(**kw)) 580 | 581 | viewkeys = operator.methodcaller("viewkeys") 582 | 583 | viewvalues = operator.methodcaller("viewvalues") 584 | 585 | viewitems = operator.methodcaller("viewitems") 586 | 587 | _add_doc(iterkeys, "Return an iterator over the keys of a dictionary.") 588 | _add_doc(itervalues, "Return an iterator over the values of a dictionary.") 589 | _add_doc(iteritems, 590 | "Return an iterator over the (key, value) pairs of a dictionary.") 591 | _add_doc(iterlists, 592 | "Return an iterator over the (key, [values]) pairs of a dictionary.") 593 | 594 | 595 | if PY3: 596 | def b(s): 597 | return s.encode("latin-1") 598 | def u(s): 599 | return s 600 | unichr = chr 601 | if sys.version_info[1] <= 1: 602 | def int2byte(i): 603 | return bytes((i,)) 604 | else: 605 | # This is about 2x faster than the implementation above on 3.2+ 606 | int2byte = operator.methodcaller("to_bytes", 1, "big") 607 | byte2int = operator.itemgetter(0) 608 | indexbytes = operator.getitem 609 | iterbytes = iter 610 | import io 611 | StringIO = io.StringIO 612 | BytesIO = io.BytesIO 613 | _assertCountEqual = "assertCountEqual" 614 | _assertRaisesRegex = "assertRaisesRegex" 615 | _assertRegex = "assertRegex" 616 | else: 617 | def b(s): 618 | return s 619 | # Workaround for standalone backslash 620 | def u(s): 621 | return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape") 622 | unichr = unichr 623 | int2byte = chr 624 | def byte2int(bs): 625 | return ord(bs[0]) 626 | def indexbytes(buf, i): 627 | return ord(buf[i]) 628 | iterbytes = functools.partial(itertools.imap, ord) 629 | import StringIO 630 | StringIO = BytesIO = StringIO.StringIO 631 | _assertCountEqual = "assertItemsEqual" 632 | _assertRaisesRegex = "assertRaisesRegexp" 633 | _assertRegex = "assertRegexpMatches" 634 | _add_doc(b, """Byte literal""") 635 | _add_doc(u, """Text literal""") 636 | 637 | 638 | def assertCountEqual(self, *args, 
**kwargs): 639 | return getattr(self, _assertCountEqual)(*args, **kwargs) 640 | 641 | 642 | def assertRaisesRegex(self, *args, **kwargs): 643 | return getattr(self, _assertRaisesRegex)(*args, **kwargs) 644 | 645 | 646 | def assertRegex(self, *args, **kwargs): 647 | return getattr(self, _assertRegex)(*args, **kwargs) 648 | 649 | 650 | if PY3: 651 | exec_ = getattr(moves.builtins, "exec") 652 | 653 | 654 | def reraise(tp, value, tb=None): 655 | if value is None: 656 | value = tp() 657 | if value.__traceback__ is not tb: 658 | raise value.with_traceback(tb) 659 | raise value 660 | 661 | else: 662 | def exec_(_code_, _globs_=None, _locs_=None): 663 | """Execute code in a namespace.""" 664 | if _globs_ is None: 665 | frame = sys._getframe(1) 666 | _globs_ = frame.f_globals 667 | if _locs_ is None: 668 | _locs_ = frame.f_locals 669 | del frame 670 | elif _locs_ is None: 671 | _locs_ = _globs_ 672 | exec("""exec _code_ in _globs_, _locs_""") 673 | 674 | 675 | exec_("""def reraise(tp, value, tb=None): 676 | raise tp, value, tb 677 | """) 678 | 679 | 680 | if sys.version_info[:2] == (3, 2): 681 | exec_("""def raise_from(value, from_value): 682 | if from_value is None: 683 | raise value 684 | raise value from from_value 685 | """) 686 | elif sys.version_info[:2] > (3, 2): 687 | exec_("""def raise_from(value, from_value): 688 | raise value from from_value 689 | """) 690 | else: 691 | def raise_from(value, from_value): 692 | raise value 693 | 694 | 695 | print_ = getattr(moves.builtins, "print", None) 696 | if print_ is None: 697 | def print_(*args, **kwargs): 698 | """The new-style print function for Python 2.4 and 2.5.""" 699 | fp = kwargs.pop("file", sys.stdout) 700 | if fp is None: 701 | return 702 | def write(data): 703 | if not isinstance(data, basestring): 704 | data = str(data) 705 | # If the file has an encoding, encode unicode with it. 
706 | if (isinstance(fp, file) and 707 | isinstance(data, unicode) and 708 | fp.encoding is not None): 709 | errors = getattr(fp, "errors", None) 710 | if errors is None: 711 | errors = "strict" 712 | data = data.encode(fp.encoding, errors) 713 | fp.write(data) 714 | want_unicode = False 715 | sep = kwargs.pop("sep", None) 716 | if sep is not None: 717 | if isinstance(sep, unicode): 718 | want_unicode = True 719 | elif not isinstance(sep, str): 720 | raise TypeError("sep must be None or a string") 721 | end = kwargs.pop("end", None) 722 | if end is not None: 723 | if isinstance(end, unicode): 724 | want_unicode = True 725 | elif not isinstance(end, str): 726 | raise TypeError("end must be None or a string") 727 | if kwargs: 728 | raise TypeError("invalid keyword arguments to print()") 729 | if not want_unicode: 730 | for arg in args: 731 | if isinstance(arg, unicode): 732 | want_unicode = True 733 | break 734 | if want_unicode: 735 | newline = unicode("\n") 736 | space = unicode(" ") 737 | else: 738 | newline = "\n" 739 | space = " " 740 | if sep is None: 741 | sep = space 742 | if end is None: 743 | end = newline 744 | for i, arg in enumerate(args): 745 | if i: 746 | write(sep) 747 | write(arg) 748 | write(end) 749 | if sys.version_info[:2] < (3, 3): 750 | _print = print_ 751 | def print_(*args, **kwargs): 752 | fp = kwargs.get("file", sys.stdout) 753 | flush = kwargs.pop("flush", False) 754 | _print(*args, **kwargs) 755 | if flush and fp is not None: 756 | fp.flush() 757 | 758 | _add_doc(reraise, """Reraise an exception.""") 759 | 760 | if sys.version_info[0:2] < (3, 4): 761 | def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS, 762 | updated=functools.WRAPPER_UPDATES): 763 | def wrapper(f): 764 | f = functools.wraps(wrapped, assigned, updated)(f) 765 | f.__wrapped__ = wrapped 766 | return f 767 | return wrapper 768 | else: 769 | wraps = functools.wraps 770 | 771 | def with_metaclass(meta, *bases): 772 | """Create a base class with a metaclass.""" 773 | # 
This requires a bit of explanation: the basic idea is to make a dummy 774 | # metaclass for one level of class instantiation that replaces itself with 775 | # the actual metaclass. 776 | class metaclass(meta): 777 | def __new__(cls, name, this_bases, d): 778 | return meta(name, bases, d) 779 | return type.__new__(metaclass, 'temporary_class', (), {}) 780 | 781 | 782 | def add_metaclass(metaclass): 783 | """Class decorator for creating a class with a metaclass.""" 784 | def wrapper(cls): 785 | orig_vars = cls.__dict__.copy() 786 | slots = orig_vars.get('__slots__') 787 | if slots is not None: 788 | if isinstance(slots, str): 789 | slots = [slots] 790 | for slots_var in slots: 791 | orig_vars.pop(slots_var) 792 | orig_vars.pop('__dict__', None) 793 | orig_vars.pop('__weakref__', None) 794 | return metaclass(cls.__name__, cls.__bases__, orig_vars) 795 | return wrapper 796 | 797 | 798 | def python_2_unicode_compatible(klass): 799 | """ 800 | A decorator that defines __unicode__ and __str__ methods under Python 2. 801 | Under Python 3 it does nothing. 802 | 803 | To support Python 2 and 3 with a single code base, define a __str__ method 804 | returning text and apply this decorator to the class. 805 | """ 806 | if PY2: 807 | if '__str__' not in klass.__dict__: 808 | raise ValueError("@python_2_unicode_compatible cannot be applied " 809 | "to %s because it doesn't define __str__()." % 810 | klass.__name__) 811 | klass.__unicode__ = klass.__str__ 812 | klass.__str__ = lambda self: self.__unicode__().encode('utf-8') 813 | return klass 814 | 815 | 816 | # Complete the moves implementation. 817 | # This code is at the end of this module to speed up module loading. 818 | # Turn this module into a package. 
819 | __path__ = [] # required for PEP 302 and PEP 451 820 | __package__ = __name__ # see PEP 366 @ReservedAssignment 821 | if globals().get("__spec__") is not None: 822 | __spec__.submodule_search_locations = [] # PEP 451 @UndefinedVariable 823 | # Remove other six meta path importers, since they cause problems. This can 824 | # happen if six is removed from sys.modules and then reloaded. (Setuptools does 825 | # this for some reason.) 826 | if sys.meta_path: 827 | for i, importer in enumerate(sys.meta_path): 828 | # Here's some real nastiness: Another "instance" of the six module might 829 | # be floating around. Therefore, we can't use isinstance() to check for 830 | # the six meta path importer, since the other six instance will have 831 | # inserted an importer with different class. 832 | if (type(importer).__name__ == "_SixMetaPathImporter" and 833 | importer.name == __name__): 834 | del sys.meta_path[i] 835 | break 836 | del i, importer 837 | # Finally, add the importer to the meta path import hook. 838 | sys.meta_path.append(_importer) 839 | 840 | 841 | ### Additional customizations for Django ### 842 | 843 | if PY3: 844 | memoryview = memoryview 845 | buffer_types = (bytes, bytearray, memoryview) 846 | else: 847 | # memoryview and buffer are not strictly equivalent, but should be fine for 848 | # django core usage (mainly BinaryField). However, Jython doesn't support 849 | # buffer (see http://bugs.jython.org/issue1521), so we have to be careful. 850 | if sys.platform.startswith('java'): 851 | memoryview = memoryview 852 | else: 853 | memoryview = buffer 854 | buffer_types = (bytearray, memoryview) -------------------------------------------------------------------------------- /logdog/default_config.py: -------------------------------------------------------------------------------- 1 | # Default config. 2 | # options/properties must be valid python identifiers, e.g. "default_pipe" 3 | # names can be any string, e.g. 
"file->webui" (excluding ' ', '@') 4 | from __future__ import absolute_import, unicode_literals 5 | 6 | 7 | config = { 8 | # Sources format 9 | # YAML 10 | # sources: 11 | # - (path | search pattern) 12 | # # or 13 | # # `handler`, `watcher`, `meta` are optional 14 | # - (path | search pattern): 15 | # handler: handler-name # default pipes.to-web 16 | # watcher: watcher-name # default pollers.file-watcher 17 | # meta: a-dictionary-containing-any-meta-info # e.g. {tags: [tag1, tag2]} 18 | # # or 19 | # - (path | search pattern): handler-name 20 | # # or 21 | # - (path | search pattern): {handler: pipes.to-web} 22 | # # or 23 | # - (path | search pattern): {watcher: poller.custom-file-poller} 24 | # # or 25 | # - (path | search pattern): {meta: {tags: [log]}} 26 | 27 | 'sources': { 28 | '/var/log/*.log': {'handler': 'pipes.to-web'}, 29 | '/var/log/*/*.log': {'handler': 'pipes.to-web'}, 30 | '/var/log/syslog': 'pipes.to-web', 31 | }, 32 | 33 | # Pipes format 34 | # YAML 35 | # pipes: 36 | # pipe-name: 37 | # - [namespace,...] pipe-object # processors.stripper 38 | # - ... 
# ns processors.stripper 39 | # - ...config # ns1,ns2 processors.stripper 40 | 41 | 'pipes': { 42 | 'default': 'logdog.roles.pipes.Pipe', 43 | 44 | 'to-web': [ 45 | 'watch processors.stripper', 46 | 'watch connectors.zmq-tunnel@sender', 47 | 'view connectors.zmq-tunnel@receiver', 48 | 'view viewers.webui', 49 | ], 50 | 51 | 'experiment-x001': { 52 | 'cls': 'logdog.roles.pipes.Pipe', 53 | '*': [ 54 | 'watch processors.stripper', 55 | {'forwarders.broadcast': [ 56 | 'watch viewers.console', 57 | {'pipes.default': [ 58 | {'watch connectors.zmq-tunnel@sender': {'connect': 'tcp://localhost:7789'}}, 59 | {'view connectors.zmq-tunnel@receiver': {'bind': 'tcp://*:7789'}}, 60 | {'view forwarders.round-robin': [ 61 | 'view viewers.webui', 62 | 'view viewers.null', 63 | ]} 64 | ]}, 65 | ]}, 66 | ], 67 | } 68 | }, 69 | 70 | 'options': { 71 | 'sources': { 72 | 'default_handler': 'pipes.to-web', 73 | 'default_watcher': 'pollers.file-watcher', 74 | 'index_file': '~/.config/logdog/sources-index.idx' 75 | } 76 | }, 77 | 78 | # pollers 79 | 'pollers': { 80 | 'file-watcher': { 81 | 'cls': 'logdog.roles.pollers.FileWatcher', 82 | 'namespaces': ['watch'], 83 | }, 84 | }, 85 | 86 | # collectors 87 | 'collectors': {}, 88 | 89 | # processors / parsers / formatters 90 | 'processors': { # transform message 91 | 'stripper': {'cls': 'logdog.roles.processors.Stripper'}, 92 | }, 93 | 'parsers': { # extract meta 94 | 'regex': {'cls': 'logdog.roles.parsers.Regex', 'regex': r''}, 95 | }, 96 | 'formatters': { # add to meta formatted representation of message 97 | 'formatter': {'cls': 'logdog.roles.formatters.Formatter'}, 98 | }, 99 | 100 | # forwarders 101 | 'forwarders': { 102 | 'broadcast': 'logdog.roles.forwarders.Broadcast', 103 | 'round-robin': 'logdog.roles.forwarders.RoundRobin', 104 | }, 105 | 106 | # connectors 107 | 'connectors': { 108 | 'zmq-tunnel': { 109 | 'cls': 'logdog.roles.connectors.ZMQTunnel', 110 | '@sender': {'socket': 'PUSH', 'connect': ['tcp://localhost:45457']}, 111 | 
'@receiver': {'socket': 'PULL', 'bind': ['tcp://*:45457']}, 112 | } 113 | }, 114 | 115 | # viewers 116 | 'viewers': { 117 | 'default': 'logdog.roles.viewers.Null', 118 | 119 | 'webui': { 120 | 'cls': 'logdog.roles.viewers.WebUI', 121 | 'port': 8888, 122 | 'debug': False, 123 | }, 124 | 'console': { 125 | 'cls': 'logdog.roles.viewers.Console', 126 | 'redirect_to': 'stdout', 127 | }, 128 | 'null': 'logdog.roles.viewers.Null', 129 | }, 130 | 131 | # utils 132 | 'utils': { 133 | 'policies': { 134 | 'growing-sleep': { 135 | 'cls': 'logdog.core.policies.DefaultSleepPolicy' 136 | }, 137 | 'mostly-greedy': { 138 | 'cls': 'logdog.core.policies.GreedyPolicy' 139 | } 140 | } 141 | } 142 | } 143 | -------------------------------------------------------------------------------- /logdog/roles/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/miphreal/python-logdog/44194b199f1906a156aa15bb97a40bb8b13f3d52/logdog/roles/__init__.py -------------------------------------------------------------------------------- /logdog/roles/collectors/__init__.py: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /logdog/roles/collectors/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from collections import deque 4 | import logging 5 | from tornado import gen 6 | 7 | from logdog.core.base_role import BaseRole 8 | 9 | 10 | logger = logging.getLogger(__name__) 11 | 12 | 13 | class BaseCollector(BaseRole): 14 | defaults = BaseRole.defaults.inherit( 15 | buffer_size=1024, 16 | ) 17 | 18 | def __init__(self, *args, **kwargs): 19 | super(BaseCollector, self).__init__(*args, **kwargs) 20 | self._buffer = deque(maxlen=self.config.buffer_size) 21 | self._wait_for_writable_buf = None 22 | 23 | def 
_check_buffer_renew_write_policy(self): 24 | write_available = len(self._buffer) <= self.config.buffer_size / 2 25 | if write_available and self._wait_for_writable_buf and not self._wait_for_writable_buf.done(): 26 | self._wait_for_writable_buf.set_result(None) 27 | 28 | def _check_buffer_write_policy(self): 29 | return len(self._buffer) < self.config.buffer_size 30 | 31 | def _wait_for_buffer_writability(self): 32 | self._wait_for_writable_buf = gen.Future() 33 | return self._wait_for_writable_buf 34 | 35 | @gen.coroutine 36 | def on_recv(self, data): 37 | if not self._check_buffer_write_policy(): 38 | logger.info('[%s] Buffer is full. Waiting until buffer is available...', self) 39 | yield self._wait_for_buffer_writability() 40 | self._buffer.append(data) 41 | 42 | def pop(self): 43 | try: 44 | data = self._buffer.popleft() 45 | except IndexError: 46 | return 47 | 48 | self._check_buffer_renew_write_policy() 49 | return data 50 | 51 | -------------------------------------------------------------------------------- /logdog/roles/connectors/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .zmq_tunnel import ZMQTunnel 4 | -------------------------------------------------------------------------------- /logdog/roles/connectors/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from logdog.core.base_role import BaseRole 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class BaseConnector(BaseRole): 12 | pass -------------------------------------------------------------------------------- /logdog/roles/connectors/zmq_tunnel.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from tornado import gen 4 | from tornado.concurrent 
import is_future 5 | import zmq 6 | import zmq.eventloop.ioloop 7 | from zmq.eventloop.zmqstream import ZMQStream 8 | 9 | from logdog.core.msg import Msg 10 | from .base import BaseConnector 11 | 12 | 13 | zmq.eventloop.ioloop.install() 14 | 15 | 16 | class ZMQTunnel(BaseConnector): 17 | defaults = BaseConnector.defaults( 18 | unique=True, 19 | connect=(), 20 | bind=(), 21 | socket=None, 22 | ) 23 | 24 | def __init__(self, *args, **kwargs): 25 | if not isinstance(kwargs.get('bind', ()), (list, tuple)): 26 | kwargs['bind'] = (kwargs['bind'],) 27 | if not isinstance(kwargs.get('connect', ()), (list, tuple)): 28 | kwargs['connect'] = (kwargs['connect'],) 29 | 30 | self.stream = self.socket = None 31 | self.ctx = zmq.Context() 32 | 33 | super(ZMQTunnel, self).__init__(*args, **kwargs) 34 | 35 | def __str__(self): 36 | return 'ZMQ-TUNNEL:{}:{}'.format(self.config.socket, 37 | ','.join(self.config.bind) or 38 | ','.join(self.config.connect) or 'None') 39 | 40 | @classmethod 41 | def __singleton_key__(cls, passed_args, passed_kwargs): 42 | key = super(ZMQTunnel, cls).__singleton_key__(passed_args, passed_kwargs) 43 | connect = passed_kwargs.get('connect', cls.defaults.connect) or () 44 | if not isinstance(connect, (list, tuple)): 45 | connect = [connect] 46 | bind = passed_kwargs.get('bind', cls.defaults.bind) or () 47 | if not isinstance(bind, (list, tuple)): 48 | bind = [bind] 49 | return '{key}::socket-type={socket_type}::bind={bind}::connect={connect}'.format( 50 | key=key, 51 | socket_type=passed_kwargs.get('socket', cls.defaults.socket), 52 | connect=','.join(sorted(connect)) or 'None', 53 | bind=','.join(sorted(bind)) or 'None', 54 | ) 55 | 56 | def _pre_start(self): 57 | self.socket = self.ctx.socket(getattr(zmq, self.config.socket)) 58 | 59 | if 'bind' in self.config: 60 | for addr in self.config.bind: 61 | self.socket.bind(addr) 62 | 63 | if 'connect' in self.config: 64 | for addr in self.config.connect: 65 | self.socket.connect(addr) 66 | 67 | self.stream = 
ZMQStream(self.socket, io_loop=self.app.io_loop) 68 | self.stream.on_recv(self.on_recv) 69 | 70 | @gen.coroutine 71 | def on_recv(self, data): 72 | data = Msg.deserialize_jsonb(data[0]) 73 | ret = self._forward(data) 74 | if is_future(ret): 75 | yield ret 76 | 77 | def send(self, data): 78 | if self.started: 79 | data = data.serialize_jsonb() 80 | return self.stream.send(msg=data) -------------------------------------------------------------------------------- /logdog/roles/formatters/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .fmtr import Formatter -------------------------------------------------------------------------------- /logdog/roles/formatters/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from logdog.core.base_role import BaseRole 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class BaseFormatter(BaseRole): 12 | pass -------------------------------------------------------------------------------- /logdog/roles/formatters/fmtr.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from .base import BaseFormatter 4 | 5 | 6 | class Formatter(BaseFormatter): 7 | defaults = BaseFormatter.defaults.copy_and_update( 8 | format='{msg.source!s}: {msg.message!s}', 9 | format_meta_name=None, # None means 'replace msg.message' 10 | ) 11 | 12 | def __init__(self, *args, **kwargs): 13 | super(Formatter, self).__init__(*args, **kwargs) 14 | self.format = self.config.format 15 | self.format_meta_name = self.config.format_meta_name 16 | 17 | def __str__(self): 18 | return 'FORMATTER' 19 | 20 | def on_recv(self, data): 21 | msg = self.format.format(msg=data) 22 | if self.format_meta_name is not None: 23 | 
data.update_meta({self.format_meta_name: msg}) 24 | else: 25 | data.update_message(msg) 26 | return self.output.send(data) -------------------------------------------------------------------------------- /logdog/roles/forwarders/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .broadcast import Broadcast 4 | from .round_robin import RoundRobin -------------------------------------------------------------------------------- /logdog/roles/forwarders/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from tornado import gen 5 | 6 | from logdog.core.base_role import BaseRole 7 | 8 | 9 | logger = logging.getLogger(__name__) 10 | 11 | 12 | class BaseForwarder(BaseRole): 13 | 14 | def __init__(self, *args, **kwargs): 15 | super(BaseForwarder, self).__init__(*args, **kwargs) 16 | self.items = [self.construct_subrole(name, conf) for name, conf in self.items] 17 | self.link_methods() 18 | for i in self.items: 19 | i.link_methods() 20 | 21 | def __str__(self): 22 | return 'FORWARDER:{}'.format(self._oid) 23 | 24 | @gen.coroutine 25 | def start(self): 26 | if not self.started: 27 | yield [i.start() for i in self.items] 28 | yield super(BaseForwarder, self).start() 29 | 30 | @gen.coroutine 31 | def stop(self): 32 | if self.started: 33 | yield [i.stop() for i in self.items] 34 | yield super(BaseForwarder, self).stop() 35 | 36 | -------------------------------------------------------------------------------- /logdog/roles/forwarders/broadcast.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from tornado import gen 4 | 5 | from .base import BaseForwarder 6 | 7 | 8 | class Broadcast(BaseForwarder): 9 | def __str__(self): 10 | return 
'BROADCAST:{}'.format(self.input) 11 | 12 | @gen.coroutine 13 | def _input_forwarder(self, data): 14 | return [gen.maybe_future(i.send(data)) for i in self.items] -------------------------------------------------------------------------------- /logdog/roles/forwarders/round_robin.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from tornado import gen 4 | 5 | from .base import BaseForwarder 6 | 7 | 8 | class RoundRobin(BaseForwarder): 9 | 10 | def __init__(self, *args, **kwargs): 11 | super(RoundRobin, self).__init__(*args, **kwargs) 12 | self._curr_target = 0 13 | 14 | def __str__(self): 15 | return 'ROUND-ROBIN:{}'.format(self.input) 16 | 17 | @gen.coroutine 18 | def _input_forwarder(self, data): 19 | if self.items: 20 | ret = self.items[self._curr_target].send(data) 21 | self._curr_target = (self._curr_target + 1) % len(self.items) 22 | if gen.is_future(ret): 23 | yield gen.maybe_future(ret) -------------------------------------------------------------------------------- /logdog/roles/parsers/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .regex import Regex -------------------------------------------------------------------------------- /logdog/roles/parsers/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from logdog.core.base_role import BaseRole 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class BaseParser(BaseRole): 12 | pass -------------------------------------------------------------------------------- /logdog/roles/parsers/regex.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import re 4 | from 
.base import BaseParser 5 | 6 | 7 | class Regex(BaseParser): 8 | defaults = BaseParser.defaults.copy_and_update( 9 | regex=r'', 10 | ) 11 | 12 | def __init__(self, *args, **kwargs): 13 | super(Regex, self).__init__(*args, **kwargs) 14 | self.re = re.compile(self.config.regex) 15 | 16 | def __str__(self): 17 | return 'REGEX-EXTRACTOR' 18 | 19 | def on_recv(self, data): 20 | match = self.re.search(data.message) 21 | if match: 22 | data.update_meta(match.groupdict()) 23 | return self.output.send(data) -------------------------------------------------------------------------------- /logdog/roles/pipes/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .pipe import Pipe -------------------------------------------------------------------------------- /logdog/roles/pipes/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from tornado import gen 5 | 6 | from logdog.core.base_role import BaseRole 7 | 8 | 9 | logger = logging.getLogger(__name__) 10 | 11 | 12 | class BasePipe(BaseRole): 13 | defaults = BaseRole.defaults.copy_and_update( 14 | unique=True 15 | ) 16 | 17 | def __init__(self, *args, **kwargs): 18 | super(BasePipe, self).__init__(*args, **kwargs) 19 | self.items = [self.construct_subrole(name, conf) for name, conf in self.items] 20 | self._link_pipe_objects() 21 | if self.items: 22 | self.items[-1].set_output(None) 23 | 24 | def __str__(self): 25 | return 'PIPE:{}'.format(self._oid) 26 | 27 | def _is_valid_pipe(self): 28 | active_pipe = self.active_items 29 | in_ = active_pipe[0] if active_pipe else None 30 | for p in active_pipe[1:]: 31 | if in_.output is not p: 32 | logger.warning('[%s] Lost connectivity in the pipe. Check the configuration. 
%s -x-> %s', self, in_, p) 33 | return False 34 | in_ = p 35 | return True 36 | 37 | @gen.coroutine 38 | def start(self): 39 | if not self.started and self._is_valid_pipe(): 40 | logger.debug('[%s] Starting %s', self, ' -> '.join(map(str, self.items))) 41 | yield [po.start() for po in reversed(self.items)] 42 | yield super(BasePipe, self).start() 43 | 44 | @gen.coroutine 45 | def stop(self): 46 | if self.started: 47 | yield [po.stop() for po in self.items] 48 | yield super(BasePipe, self).stop() 49 | 50 | def _link_pipe_objects(self): 51 | active_pipe = self.active_items 52 | obj_first = active_pipe[0] if active_pipe else None 53 | for obj_next in active_pipe[1:]: 54 | obj_first.set_output(obj_next) 55 | obj_next.set_input(obj_first) 56 | obj_first = obj_next 57 | 58 | for obj in reversed(active_pipe): 59 | obj.link_methods() 60 | 61 | def set_input(self, obj): 62 | logger.debug('[%s] Pipe initialized with %s.', self, obj) 63 | if self.active_items: 64 | self.active_items[0].set_input(obj) 65 | 66 | def set_output(self, obj): 67 | if self.active_items: 68 | self.active_items[-1].set_output(obj) 69 | 70 | def _input_forwarder(self, data): 71 | if self.active_items: 72 | self._input_forwarder = self.active_items[0].send 73 | return self._input_forwarder(data) -------------------------------------------------------------------------------- /logdog/roles/pipes/pipe.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from .base import BasePipe 4 | 5 | 6 | class Pipe(BasePipe): 7 | pass -------------------------------------------------------------------------------- /logdog/roles/pollers/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .file_watcher import FileWatcher -------------------------------------------------------------------------------- 
/logdog/roles/pollers/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from tornado import gen 5 | 6 | from logdog.core.utils import mark_as_proxy_method 7 | from logdog.core.base_role import BaseRole 8 | 9 | 10 | logger = logging.getLogger(__name__) 11 | 12 | 13 | class BasePoller(BaseRole): 14 | def poll(self): 15 | # Straightforward implementation 16 | while self.started: 17 | self.send(self.input.pop()) 18 | 19 | @mark_as_proxy_method 20 | def on_recv(self, data): 21 | return self.output.send(data) 22 | 23 | @gen.coroutine 24 | def start(self): 25 | yield super(BasePoller, self).start() 26 | if self.started: 27 | yield self.poll() 28 | -------------------------------------------------------------------------------- /logdog/roles/pollers/file_watcher.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | from tornado import gen 5 | from tornado.concurrent import is_future 6 | 7 | from .base import BasePoller 8 | 9 | 10 | logger = logging.getLogger(__name__) 11 | 12 | 13 | class FileWatcher(BasePoller): 14 | defaults = BasePoller.defaults.copy_and_update( 15 | poll_sleep_policy='utils.policies.growing-sleep', 16 | greedy_policy='utils.policies.mostly-greedy', 17 | greedy_file_reading=True, 18 | start_delay=2.5, 19 | ) 20 | 21 | def __init__(self, *args, **kwargs): 22 | super(FileWatcher, self).__init__(*args, **kwargs) 23 | self.poll_sleep_policy = self.app.config.find_and_construct_class(name=self.config.poll_sleep_policy) 24 | self.greedy_policy = self.app.config.find_and_construct_class(name=self.config.greedy_policy) 25 | 26 | def __str__(self): 27 | return 'WATCHER:{!s}'.format(self.input) 28 | 29 | def _prepare_message(self, data): 30 | msg = super(FileWatcher, self)._prepare_message(data) 31 | msg.source = 
getattr(self.input, 'name', msg.source) 32 | return msg 33 | 34 | @gen.coroutine 35 | def poll(self): 36 | greedy_file_reading_enabled = self.config.greedy_file_reading 37 | 38 | try: 39 | self.input.open() 40 | logger.info('[%s] Opened %s.', self, self.input) 41 | except IOError as e: 42 | logger.error('[%s] Stopping watching because of errors (%s)...', self, e) 43 | self.started = False 44 | yield self.stop() 45 | return 46 | 47 | while self.started: 48 | data = self.input.read_line() 49 | 50 | if data: 51 | data = self._prepare_message(data) 52 | ret = self._forward(data) 53 | if is_future(ret): 54 | yield ret 55 | 56 | self.poll_sleep_policy.reset() 57 | 58 | if greedy_file_reading_enabled and self.greedy_policy.need_to_wait(): 59 | yield self.greedy_policy.wait() 60 | 61 | else: 62 | self.input.check_stat() 63 | self.app.register.set_path(self.input) 64 | logger.debug('[%s] Sleep on watching %ss.', self, self.poll_sleep_policy.cur_interval) 65 | yield self.poll_sleep_policy.sleep() -------------------------------------------------------------------------------- /logdog/roles/processors/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .stripper import Stripper -------------------------------------------------------------------------------- /logdog/roles/processors/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from logdog.core.base_role import BaseRole 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class BaseProcessor(BaseRole): 12 | 13 | def on_recv(self, data): 14 | self.output.send(data) -------------------------------------------------------------------------------- /logdog/roles/processors/stripper.py: -------------------------------------------------------------------------------- 1 | from __future__ import 
absolute_import, unicode_literals 2 | 3 | from .base import BaseProcessor 4 | 5 | 6 | class Stripper(BaseProcessor): 7 | 8 | def __str__(self): 9 | return 'STRIPPER' 10 | 11 | def on_recv(self, data): 12 | data.update_message(data.message.strip()) 13 | return self.output.send(data) -------------------------------------------------------------------------------- /logdog/roles/viewers/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | 3 | from .console import Console 4 | from .null import Null 5 | from .webui import WebUI -------------------------------------------------------------------------------- /logdog/roles/viewers/base.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from logdog.core.base_role import BaseRole 6 | 7 | 8 | logger = logging.getLogger(__name__) 9 | 10 | 11 | class BaseViewer(BaseRole): 12 | 13 | def on_recv(self, data): 14 | logger.debug(data) -------------------------------------------------------------------------------- /logdog/roles/viewers/console.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | import os 5 | import sys 6 | 7 | from .base import BaseViewer 8 | 9 | 10 | logger = logging.getLogger(__name__) 11 | 12 | 13 | class Console(BaseViewer): 14 | defaults = BaseViewer.defaults.copy_and_update( 15 | redirect_to='stdout' 16 | ) 17 | 18 | def __init__(self, *args, **kwargs): 19 | super(Console, self).__init__(*args, **kwargs) 20 | self._output = getattr(sys, self.config.redirect_to) 21 | 22 | def _input_forwarder(self, data): 23 | # Write to the stream resolved from `redirect_to` (previously this
wrote to sys.stdout directly, silently ignoring the configured stream). 24 | self._output.write(data.message) 25 | if not data.message.endswith(os.linesep): 26 | self._output.write(os.linesep) 27 | 28 | on_recv = _input_forwarder
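The Console viewer's line-ending handling above can be sketched in isolation: write the message to the target stream and append the platform line separator only when the message does not already end with one, so lines read from files (which usually keep their trailing newline) are not doubled. This is a minimal standalone sketch; `write_message` is an illustrative name, not part of logdog:

```python
import io
import os


def write_message(stream, message):
    """Write a log line, ensuring exactly one trailing line separator.

    Mirrors the Console viewer's logic: the separator is appended only
    when the message does not already end with os.linesep.
    """
    stream.write(message)
    if not message.endswith(os.linesep):
        stream.write(os.linesep)


# Usage: an in-memory stream stands in for sys.stdout/sys.stderr.
buf = io.StringIO()
write_message(buf, "plain message")            # separator added
write_message(buf, "terminated" + os.linesep)  # separator kept as-is
```

The same check is why the viewer can sit after pipes that strip messages (Stripper) as well as after raw file pollers whose lines still carry their newline.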
-------------------------------------------------------------------------------- /logdog/roles/viewers/null.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | from .base import BaseViewer 4 | 5 | 6 | class Null(BaseViewer): 7 | def _input_forwarder(self, data): 8 | pass 9 | 10 | on_recv = _input_forwarder -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/Gruntfile.js: -------------------------------------------------------------------------------- 1 | module.exports = function (grunt) { 2 | 'use strict'; 3 | 4 | var appJs = [ 5 | '<%= buildPath %>/js/jquery.js', 6 | '<%= buildPath %>/js/angular.js', 7 | '<%= buildPath %>/js/angular-animate.js', 8 | '<%= buildPath %>/js/angular-aria.js', 9 | '<%= buildPath %>/js/angular-ui-router.js', 10 | '<%= buildPath %>/js/angular-material.js', 11 | 12 | '<%= srcPath %>/js/app.js', 13 | '<%= srcPath %>/js/**/*.js' 14 | ]; 15 | 16 | var appCss = [ 17 | '<%= buildPath %>/css/angular-material.css', 18 | 19 | '<%= srcPath %>/css/app.css' 20 | ]; 21 | 22 | grunt.initConfig({ 23 | pkg: grunt.file.readJSON('package.json'), 24 | buildPath: 'static', 25 | srcPath: 'assets', 26 | 27 | clean: { 28 | options: { 29 | force: true 30 | }, 31 | build: ['<%= buildPath %>'], 32 | postMin: [ 33 | '<%= buildPath %>/js/jquery.js', 34 | '<%= buildPath %>/js/angular.js', 35 | '<%= buildPath %>/js/angular-animate.js', 36 | '<%= buildPath %>/js/angular-aria.js', 37 | '<%= buildPath %>/js/angular-ui-router.js', 38 | '<%= buildPath %>/js/angular-material.js', 39 | 40 | '<%= buildPath %>/css/angular-material.css' 41 | ] 42 | }, 43 | 44 | copy: { 45 | base: { 46 | files: [ 47 | {expand: true, cwd: '<%= srcPath %>', src: ['fonts/**'], dest: '<%= buildPath %>/'}, 48 | {expand: true, 49 | cwd: 'bower_components/jquery/dist', 50 | src: ['jquery.js'], dest: '<%= buildPath %>/js'}, 51 | {expand: true, 52 
| cwd: 'bower_components/angular', 53 | src: ['angular.js'], dest: '<%= buildPath %>/js'}, 54 | {expand: true, 55 | cwd: 'bower_components/angular-animate', 56 | src: ['angular-animate.js'], dest: '<%= buildPath %>/js'}, 57 | {expand: true, 58 | cwd: 'bower_components/angular-aria', 59 | src: ['angular-aria.js'], dest: '<%= buildPath %>/js'}, 60 | {expand: true, 61 | cwd: 'bower_components/angular-ui-router/release', 62 | src: ['angular-ui-router.js'], dest: '<%= buildPath %>/js'}, 63 | {expand: true, 64 | cwd: 'bower_components/angular-material', 65 | src: ['angular-material.js'], dest: '<%= buildPath %>/js'}, 66 | 67 | {expand: true, 68 | cwd: 'bower_components/angular-material', 69 | src: ['angular-material.css'], dest: '<%= buildPath %>/css'}, 70 | 71 | {expand: true, 72 | cwd: 'bower_components/material-design-icons', 73 | src: ['navigation/svg/production/ic_menu_18px.svg'], dest: '<%= buildPath %>/img/icons'} 74 | ] 75 | } 76 | }, 77 | 78 | concat: { 79 | dev: { 80 | files: { 81 | '<%= buildPath %>/css/app.css': appCss, 82 | '<%= buildPath %>/js/app.js': appJs 83 | } 84 | } 85 | }, 86 | 87 | uglify: { 88 | prod: { 89 | options: { 90 | mangle: true, 91 | except: ['jQuery', 'angular'], 92 | sourceMap: false, 93 | sourceMapName: '<%= buildPath %>/js/app.js.map' 94 | }, 95 | files: { 96 | '<%= buildPath %>/js/app.js': appJs 97 | } 98 | } 99 | }, 100 | 101 | cssmin: { 102 | options: { 103 | shorthandCompacting: false, 104 | roundingPrecision: -1, 105 | sourceMap: false 106 | }, 107 | prod: { 108 | files: { 109 | '<%= buildPath %>/css/app.css': appCss 110 | } 111 | } 112 | }, 113 | 114 | watch: { 115 | configFiles: { 116 | files: ['Gruntfile.js'], 117 | tasks: ['dev'], 118 | options: { 119 | reload: true 120 | } 121 | }, 122 | javascript: { 123 | files: ['<%= srcPath %>/**/*.js'], 124 | tasks: ['dev'] 125 | }, 126 | less: { 127 | files: '<%= srcPath %>/**/*.less', 128 | tasks: ['dev'] 129 | }, 130 | templates: { 131 | files: ['<%= srcPath %>/**'], 132 | tasks: 
['dev'] 133 | } 134 | } 135 | }); 136 | 137 | grunt.loadNpmTasks('grunt-contrib-clean'); 138 | grunt.loadNpmTasks('grunt-contrib-copy'); 139 | grunt.loadNpmTasks('grunt-contrib-concat'); 140 | grunt.loadNpmTasks('grunt-contrib-uglify'); 141 | grunt.loadNpmTasks('grunt-contrib-cssmin'); 142 | grunt.loadNpmTasks('grunt-contrib-watch'); 143 | 144 | var dev = [ 145 | 'clean:build', 146 | 'copy:base', 147 | 148 | 'concat:dev' 149 | ]; 150 | var prod = [ 151 | 'clean:build', 152 | 'copy:base', 153 | 154 | 'uglify:prod', 155 | 'cssmin:prod', 156 | 'clean:postMin' 157 | ]; 158 | 159 | grunt.registerTask('dev', dev); 160 | grunt.registerTask('prod', prod); 161 | }; 162 | -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | import os 5 | from tornado import gen 6 | from tornado.web import Application, url 7 | 8 | from ..base import BaseViewer 9 | 10 | 11 | logger = logging.getLogger(__name__) 12 | 13 | 14 | class WebApp(Application): 15 | def __init__(self, *args, **kwargs): 16 | from .web import IndexHandler, LogWebSocket 17 | handlers = [ 18 | url(r'/', IndexHandler, name='logdog.index'), 19 | url(r'/ws/logs', LogWebSocket, name='logdog.logs-ws'), 20 | ] 21 | self.ws = [] 22 | kwargs['handlers'] = handlers 23 | super(WebApp, self).__init__(*args, **kwargs) 24 | 25 | 26 | class WebUI(BaseViewer): 27 | defaults = BaseViewer.defaults( 28 | unique=True, 29 | port=8888, 30 | address='', 31 | autoreload=False, 32 | debug=False, 33 | static_path=os.path.join(os.path.dirname(__file__), 'static'), 34 | template_path=os.path.join(os.path.dirname(__file__), 'assets/html'), 35 | ) 36 | 37 | def __init__(self, *args, **kwargs): 38 | super(WebUI, self).__init__(*args, **kwargs) 39 | self._web_app = WebApp(**self.config) 40 | 41 | def __str__(self): 42 | 
return 'WEBUI:http://{}:{}'.format(self.config.address or '0.0.0.0', self.config.port) 43 | 44 | @classmethod 45 | def __singleton_key__(cls, passed_args, passed_kwargs): 46 | key = super(WebUI, cls).__singleton_key__(passed_args, passed_kwargs) 47 | return '{key}:address={address}:port={port}'.format( 48 | key=key, 49 | address=passed_kwargs.get('address', cls.defaults.address), 50 | port=passed_kwargs.get('port', cls.defaults.port) 51 | ) 52 | 53 | @gen.coroutine 54 | def _pre_start(self): 55 | logger.info('[%s] Starting %s:%s...', self, self.config.address or '0.0.0.0', self.config.port) 56 | try: 57 | self._web_app.listen(address=self.config.address, port=self.config.port) 58 | except Exception as e: 59 | logger.exception(e) 60 | 61 | def on_recv(self, data): 62 | if self.started: 63 | for ws in self._web_app.ws: 64 | ws.write_message(data.serialize_json()) 65 | -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/assets/css/app.css: -------------------------------------------------------------------------------- 1 | .log-view { 2 | height: 696px; 3 | overflow: auto; 4 | } 5 | 6 | .log-view pre { 7 | background-color: inherit; 8 | border: inherit; 9 | margin: 0; 10 | min-height: 690px; 11 | /*-webkit-box-shadow: inset 0 0 34px -10px rgba(224,224,224,1);*/ 12 | /*-moz-box-shadow: inset 0 0 34px -10px rgba(224,224,224,1);*/ 13 | /*box-shadow: inset 0 0 34px -10px rgba(224,224,224,1);*/ 14 | } 15 | 16 | .loading-icon { 17 | font-size: 24px; 18 | color: rgba(92, 158, 126, 0.12); 19 | 20 | display: inline-block; 21 | -webkit-animation: spin 1.5s infinite linear; 22 | -moz-animation: spin 1.5s infinite linear; 23 | animation: spin 1.5s infinite linear; 24 | 25 | top: 10px; 26 | left: 25px; 27 | position: absolute; 28 | } 29 | 30 | @-webkit-keyframes spin { 31 | 0% {transform: rotate(360deg) rotateY(180deg)} 32 | 100% {transform: rotate(0deg) rotateY(180deg)} 33 | } 34 | 35 | .ctl-panel-bottom { 36 | 37 | } 38 | 
.ctl-panel-bottom .togglebutton label { 39 | margin: 0; 40 | } 41 | .ctl-panel-bottom .togglebutton { 42 | margin: 0 0 0 4px; 43 | position: relative; 44 | top: -1px; 45 | } 46 | .ctl-panel-bottom i { 47 | position: relative; 48 | top: 6px; 49 | } 50 | 51 | .pointer { 52 | cursor: pointer; 53 | } -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/assets/html/index.html: -------------------------------------------------------------------------------- <!-- The original index.html markup did not survive this dump; only stray text fragments remain. Recoverable pieces: the page title "logdog" and toolbar controls bound to [[ isRunning ? 'pause' : 'resume' ]] and "clear" (handled by LogViewCtl in assets/js/app.js). -->
-------------------------------------------------------------------------------- /logdog/roles/viewers/webui/assets/js/app.js: -------------------------------------------------------------------------------- 1 | (function () { 2 | 'use strict'; 3 | 4 | var entityMap = { 5 | "&": "&amp;", 6 | "<": "&lt;", 7 | ">": "&gt;", 8 | '"': '&quot;', 9 | "'": '&#39;', 10 | "/": '&#x2F;' 11 | }; 12 | 13 | function escapeHtml(string) { 14 | return String(string).replace(/[&<>"'\/]/g, function (s) { 15 | return entityMap[s]; 16 | }); 17 | } 18 | 19 | var app = angular.module('logdog.app', [ 20 | 'ngMaterial', 21 | 'ui.router' 22 | ], 23 | ['$interpolateProvider', function ($interpolateProvider) { 24 | $interpolateProvider.startSymbol('[['); 25 | $interpolateProvider.endSymbol(']]'); 26 | }]); 27 | 28 | app.config(['$mdThemingProvider', function ($mdThemingProvider) { 29 | $mdThemingProvider.theme('default') 30 | .primaryPalette('indigo') 31 | .accentPalette('deep-purple') 32 | .warnPalette('red') 33 | .backgroundPalette('grey'); 34 | }]); 35 | 36 | app.directive('logViewer', ['$timeout', function($timeout){ 37 | return { 38 | restrict: 'A', 39 | scope: {socket: '=', scrollToEndEnabled: '=', logViewerApi: '='}, 40 | templateUrl: 'partial/log-viewer.html', 41 | link: function (scope, element, attrs) { 42 | var logContainer = element.find('#id-log-lines'); 43 | var el = element[0]; 44 | var buffer = []; 45 | var emptyCounter = 0; 46 | var isRunning = true; 47 | 48 | scope.logViewerApi = scope.logViewerApi || {}; 49 | var api = scope.logViewerApi; 50 | api.clear = function () { 51 | logContainer[0].innerHTML = ""; 52 | }; 53 | api.pause = function () { 54 | scope.socket.pause(); 55 | isRunning = false; 56 | }; 57 | api.resume = function () { 58 | scope.socket.resume(); 59 | isRunning = true; 60 | }; 61 | 62 | scope.socket.on(scope.socket.events.MSG, function(event){ 63 | if (isRunning) { 64 | buffer.push(JSON.parse(event.data)); 65 | } 66 | }); 67 | 68 | function
scrollToEnd () { 69 | if (scope.scrollToEndEnabled) { 70 | el.scrollTop = el.scrollHeight; 71 | } 72 | } 73 | 74 | function formatMessage (message, noEscape) { 75 | if (message && (typeof message.msg !== 'undefined')) { 76 | var msg = message.msg; 77 | msg = noEscape ? msg : escapeHtml(msg); 78 | return ( 79 | '
' 80 | + msg.trimRight() 81 | + '
' 82 | ); 83 | } 84 | return message; 85 | } 86 | 87 | function appendLogContainer (buf, noEscape) { 88 | if (angular.isArray(buf)) { 89 | angular.forEach(buf, appendLogContainer); 90 | } else { 91 | try { 92 | logContainer.append(formatMessage(buf, noEscape)); 93 | } catch (e) { 94 | logContainer.append(buf); 95 | console.log(e); 96 | } 97 | } 98 | } 99 | 100 | var dumpInterval = 100; 101 | function dumpLog () { 102 | if (buffer.length) { 103 | var buf = buffer; 104 | buffer = []; 105 | emptyCounter = 60 * 1000 / dumpInterval; 106 | appendLogContainer(buf); 107 | scrollToEnd(); 108 | } else { 109 | if (emptyCounter > 0) { 110 | emptyCounter--; 111 | if (emptyCounter == 0) { 112 | var msg = '
[' 113 | + new Date() 114 | + '] New messages will be below.

'; 115 | appendLogContainer(msg, true); 116 | scrollToEnd(); 117 | } 118 | } 119 | } 120 | $timeout(dumpLog, dumpInterval); 121 | } 122 | 123 | 124 | function _activate () { 125 | $timeout(dumpLog, dumpInterval); 126 | } 127 | 128 | _activate(); 129 | } 130 | } 131 | }]); 132 | 133 | app.service('Socket', ['$timeout', function ($timeout) { 134 | return function (url, reconnectDelay, maxReconnectDelay) { 135 | var callbacks = { 136 | 'open': [], 137 | 'message': [], 138 | 'error': [], 139 | 'close': [] 140 | }; 141 | 142 | var events = { 143 | 'OPEN': 'open', 144 | 'MSG': 'message', 145 | 'ERR': 'error', 146 | 'CLOSE': 'close' 147 | }; 148 | 149 | var pause = false; 150 | 151 | var handleEvent = function (type, event) { 152 | if (pause) return; 153 | angular.forEach(callbacks[type], function (callback) { 154 | callback(event); 155 | }); 156 | }; 157 | 158 | var ws = null; 159 | 160 | reconnectDelay = reconnectDelay || 1000; 161 | maxReconnectDelay = maxReconnectDelay || 32000; 162 | function initSocket() { 163 | ws = new WebSocket(url); 164 | 165 | ws.onopen = function (event) { 166 | console.log('socket open'); 167 | handleEvent(events.OPEN, event); 168 | reconnectDelay = 1000; 169 | }; 170 | 171 | ws.onmessage = function (event) { 172 | console.log('socket message'); 173 | handleEvent(events.MSG, event); 174 | }; 175 | 176 | ws.onerror = function (event) { 177 | console.log('socket error'); 178 | handleEvent(events.ERR, event); 179 | }; 180 | 181 | ws.onclose = function (event) { 182 | console.log('socket close'); 183 | console.log('trying to reconnect... Next try in ' + reconnectDelay + 'ms'); 184 | handleEvent(events.CLOSE, event); 185 | $timeout(initSocket, reconnectDelay); 186 | reconnectDelay *= 2; 187 | reconnectDelay = reconnectDelay > maxReconnectDelay ? 
maxReconnectDelay : reconnectDelay; 188 | }; 189 | 190 | return ws; 191 | } 192 | 193 | return { 194 | 'events': events, 195 | 'ws': initSocket(), 196 | 'on': function(eventType, cb){ 197 | callbacks[eventType].push(cb); 198 | }, 199 | 'off': function(eventType){ 200 | delete callbacks[eventType]; 201 | }, 202 | 'pause': function(){pause = true;}, 203 | 'resume': function(){pause = false;} 204 | } 205 | 206 | }; 207 | }]); 208 | 209 | app.controller('LogViewCtl', ['$scope', 'Socket', function($scope, Socket){ 210 | $scope.socket = Socket('ws://' + location.host + '/ws/logs'); 211 | $scope.scrollToEndEnabled = true; 212 | $scope.isRunning = true; 213 | $scope.logViewerApi = {}; 214 | $scope.onStartStop = function () { 215 | $scope.isRunning = !$scope.isRunning; 216 | $scope.logViewerApi[$scope.isRunning ? 'resume' : 'pause'](); 217 | }; 218 | $scope.onClear = function () { 219 | $scope.logViewerApi.clear && $scope.logViewerApi.clear(); 220 | }; 221 | }]); 222 | })(); -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/bower.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "logdog.webui", 3 | "version": "0.1", 4 | "homepage": "https://github.com/miphreal/python-logdog", 5 | "authors": [ 6 | "Evgeny Lychkovsky " 7 | ], 8 | "description": "runtime logs viewer", 9 | "license": "MIT", 10 | "ignore": [ 11 | "**/.*", 12 | "node_modules", 13 | "bower_components", 14 | "test", 15 | "tests" 16 | ], 17 | "dependencies": { 18 | "jquery": "~2.1.3", 19 | "angular": "~1.3.15", 20 | "angular-ui-router": "~0.2.13", 21 | "angular-material": "~0.8.3", 22 | "material-design-icons": "~1.0.1" 23 | } 24 | } 25 | -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "logdog.webui", 3 | "version": "0.1.0", 4 | 
"homepage": "https://github.com/miphreal/python-logdog", 5 | "authors": [ 6 | "Evgeny Lychkovsky " 7 | ], 8 | "description": "runtime logs viewer", 9 | "license": "MIT", 10 | "ignore": [ 11 | "**/.*", 12 | "node_modules", 13 | "bower_components", 14 | "test", 15 | "tests" 16 | ], 17 | "dependencies": { 18 | "grunt": "^0.4.5", 19 | "grunt-contrib-clean": "^0.6.0", 20 | "grunt-contrib-concat": "^0.5.0", 21 | "grunt-contrib-copy": "^0.7.0", 22 | "grunt-contrib-cssmin": "^0.12.1", 23 | "grunt-contrib-uglify": "^0.7.0", 24 | "grunt-contrib-watch": "^0.6.1" 25 | } 26 | } 27 | -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/static/img/icons/navigation/svg/production/ic_menu_18px.svg: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /logdog/roles/viewers/webui/web.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import, unicode_literals 2 | 3 | import logging 4 | 5 | from tornado import gen 6 | from tornado.web import RequestHandler 7 | from tornado.websocket import WebSocketHandler 8 | 9 | 10 | logger = logging.getLogger(__name__) 11 | 12 | 13 | class IndexHandler(RequestHandler): 14 | 15 | def get(self): 16 | self.render('index.html') 17 | 18 | 19 | class LogWebSocket(WebSocketHandler): 20 | 21 | @gen.coroutine 22 | def open(self): 23 | logger.info('Client connected.') 24 | self.application.ws.append(self) 25 | 26 | def on_close(self): 27 | logger.debug('Connection closed.') 28 | self.application.ws.remove(self) -------------------------------------------------------------------------------- /logdog/version.py: -------------------------------------------------------------------------------- 1 | from __future__ import unicode_literals 2 | 3 | 4 | MAJOR_VERSION = 0 5 | MINOR_VERSION = 2 6 | PATCH_VERSION = 11 7 | version = 
(MAJOR_VERSION, MINOR_VERSION, PATCH_VERSION) 8 | 9 | __version__ = '.'.join(map(str, version)) 10 | __author__ = 'Evgeny Lychkovsky' 11 | __email__ = 'miphreal@gamil.com' 12 | -------------------------------------------------------------------------------- /pytest.ini: -------------------------------------------------------------------------------- 1 | [pytest] 2 | addopts = 3 | 4 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | pyzmq>=14.5.0 2 | tornado>=4.1 3 | u-msgpack-python>=2.0 4 | docopt>=0.6.2 5 | PyYAML>=3.11 -------------------------------------------------------------------------------- /scripts/logdog: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | from logdog.cli import main 3 | 4 | main() -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | import os 3 | from setuptools import find_packages, setup 4 | 5 | from logdog import ( 6 | __version__ as version, 7 | __author__ as author, 8 | __email__ as author_email) 9 | 10 | 11 | requires = [ 12 | r.strip() for r in 13 | open(os.path.join(os.getcwd(), 'requirements.txt')).readlines() 14 | ] 15 | 16 | setup( 17 | name='logdog', 18 | version=version, 19 | description='logs watching + webui', 20 | author=author, 21 | author_email=author_email, 22 | url='https://github.com/miphreal/python-logdog/', 23 | packages=find_packages(exclude=['tests']), 24 | scripts=['scripts/logdog'], 25 | install_requires=requires, 26 | package_data={'logdog': ['roles/viewers/webui/assets/html/index.html', 27 | 'roles/viewers/webui/static/img/icons/navigation/svg/production/ic_menu_18px.svg', 28 | 'roles/viewers/webui/static/css/app.css', 29 | 'roles/viewers/webui/static/js/app.js']} 30 
| ) -------------------------------------------------------------------------------- /test-requirements.txt: -------------------------------------------------------------------------------- 1 | pytest>=2.7.0 2 | PyHamcrest>=1.8.2 3 | fake-factory>=0.5.0 4 | tox>=1.9.2 -------------------------------------------------------------------------------- /tests/conftest.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/miphreal/python-logdog/44194b199f1906a156aa15bb97a40bb8b13f3d52/tests/conftest.py -------------------------------------------------------------------------------- /tests/core/test_config.py: -------------------------------------------------------------------------------- 1 | # coding=utf-8 2 | from hamcrest import * 3 | import pytest 4 | 5 | from logdog.core.config import ConfigLoader, Config, ConfigError 6 | 7 | 8 | @pytest.fixture() 9 | def config_path(tmpdir): 10 | p = tmpdir.join('config.yml') 11 | return p.strpath 12 | 13 | 14 | @pytest.mark.usefixtures('config_path') 15 | class TestConfigLoader(object): 16 | 17 | @pytest.fixture(autouse=True) 18 | def config_path_init(self, config_path): 19 | self.config_path = config_path 20 | 21 | def test_load_defaults(self): 22 | cl = ConfigLoader(path=None) 23 | conf = cl.load_config(default_only=True) 24 | 25 | assert_that(conf, all_of( 26 | # data is accessible both ways (attribute or key) 27 | has_key('sources'), has_property('sources'), 28 | has_key('pipes'), has_property('pipes'), 29 | has_key('options'), has_property('options'), 30 | has_key('pollers'), has_property('pollers'), 31 | has_key('collectors'), has_property('collectors'), 32 | has_key('processors'), has_property('processors'), 33 | has_key('formatters'), has_property('formatters'), 34 | has_key('forwarders'), has_property('forwarders'), 35 | has_key('connectors'), has_property('connectors'), 36 | has_key('viewers'), has_property('viewers'), 37 | has_key('utils'),
has_property('utils'), 38 | )) 39 | 40 | def test_load_not_existing_file(self): 41 | cl = ConfigLoader(path=self.config_path) 42 | assert_that( 43 | calling(cl.load_config), 44 | raises(ConfigError, r'^\[.*\] Config file does not exist\.$')) 45 | 46 | def test_load_not_valid_file(self): 47 | with open(self.config_path, 'w') as f: 48 | f.write('') 49 | cl = ConfigLoader(path=self.config_path) 50 | assert_that( 51 | calling(cl.load_config), 52 | raises(ConfigError, r'^\[[^\]]*\] Invalid config file: must have "key: value" format\.$')) 53 | -------------------------------------------------------------------------------- /tests/core/test_msg.py: -------------------------------------------------------------------------------- 1 | # coding=utf-8 2 | from __future__ import unicode_literals 3 | from hamcrest import * 4 | 5 | from logdog.core.msg import Msg 6 | 7 | 8 | class TestMsg(object): 9 | 10 | def test_init(self): 11 | m = Msg(message='log entry', source='/tmp/x') 12 | assert_that(m.message, equal_to('log entry')) 13 | 14 | def test_init_unicode(self): 15 | m = Msg(message='сообщение лога', source='/tmp/x') 16 | assert_that(m.message, equal_to('сообщение лога')) 17 | 18 | def test_update_message(self): 19 | m = Msg(message='log entry', source='/tmp/x') 20 | m.update_message('new msg') 21 | assert_that(m.message, equal_to('new msg')) 22 | 23 | def test_updated_meta(self): 24 | m = Msg(message='log entry', source='/tmp/x') 25 | m.update_meta({'tags': ('syslog', 'auth')}) 26 | assert_that(m.meta, has_entry('tags', contains('syslog', 'auth'))) 27 | 28 | def test_serialize(self): 29 | m = Msg(message='log entry', source='/tmp/x', meta={'tags': ['syslog']}) 30 | assert_that(m.serialize(), all_of( 31 | instance_of(dict), 32 | has_entries({ 33 | 'msg': 'log entry', 34 | 'src': '/tmp/x', 35 | 'meta': {'tags': ['syslog']}, 36 | '_v': 1, 37 | }) 38 | )) 39 | 40 | def test_deserialize(self): 41 | m = Msg.deserialize({ 42 | 'msg': 'log entry', 43 | 'src': '/tmp/x', 44 | 'meta': 
{'tags': ['syslog']}, 45 | '_v': 1, 46 | }) 47 | assert_that(m, has_properties( 48 | message='log entry', 49 | source='/tmp/x', 50 | meta={'tags': ['syslog']}, 51 | version=1 52 | )) 53 | 54 | def test_serialize_json(self): 55 | m = Msg(message='log entry', source='/tmp/x', meta={'tags': ['syslog']}) 56 | assert_that(m.serialize_json(), contains_string('log entry')) 57 | m = Msg(message='сообщение', source='/tmp/x', meta={'tags': ['syslog']}) 58 | m.serialize_json() 59 | 60 | def test_deserialize_json(self): 61 | assert_that(Msg.deserialize_json('{"msg": "log entry"}'), 62 | has_property('message', equal_to('log entry'))) 63 | assert_that(Msg.deserialize_json('{"msg": "log entry","src":"/tmp/x","meta":null,"_v":1}'), 64 | has_properties( 65 | message=equal_to('log entry'), source=equal_to('/tmp/x'), 66 | meta=None, version=1)) 67 | assert_that(Msg.deserialize_json('{"msg":"сообщение","src":"/tmp/файл","meta":null,"_v":1}'), 68 | has_properties( 69 | message=equal_to('сообщение'), source=equal_to('/tmp/файл'), 70 | meta=None, version=1)) 71 | 72 | def test_serialize_jsonb(self): 73 | m = Msg(message='log entry', source='/tmp/x', meta={'tags': ['syslog']}) 74 | m.serialize_jsonb() 75 | m = Msg(message='сообщение', source='/tmp/файл', meta={'tags': ['syslog']}) 76 | m.serialize_jsonb() 77 | 78 | def test_deserialize_jsonb(self): 79 | m0 = Msg(message='log entry', source='/tmp/x', meta={'tags': ['syslog']}) 80 | serialized_message = m0.serialize_jsonb() 81 | m1 = Msg.deserialize_jsonb(serialized_message) 82 | 83 | assert_that(m1, has_properties( 84 | message=equal_to(m0.message), 85 | source=equal_to(m0.source), 86 | meta=equal_to(m0.meta), 87 | version=equal_to(m0.version) 88 | )) -------------------------------------------------------------------------------- /tox.ini: -------------------------------------------------------------------------------- 1 | [tox] 2 | envlist = py27, py34, pypy 3 | 4 | [testenv] 5 | commands = py.test tests 6 | deps = 7 | pytest 8 | 
fake-factory 9 | PyHamcrest 10 | --------------------------------------------------------------------------------
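The `Msg` tests in tests/core/test_msg.py pin down a small wire format: `msg`, `src`, `meta`, and a `_v` schema version. A minimal sketch of an equivalent JSON round-trip under that format (a simplified stand-in written from the tested contract, not logdog's actual `Msg` implementation, which also supports a msgpack-based `serialize_jsonb`):

```python
import json


class Msg(object):
    """Stand-in reproducing the wire format asserted by test_msg.py:
    keys 'msg', 'src', 'meta', and schema version '_v'."""
    version = 1

    def __init__(self, message, source, meta=None):
        self.message = message
        self.source = source
        self.meta = meta

    def serialize(self):
        return {'msg': self.message, 'src': self.source,
                'meta': self.meta, '_v': self.version}

    def serialize_json(self):
        return json.dumps(self.serialize())

    @classmethod
    def deserialize(cls, data):
        m = cls(message=data['msg'], source=data['src'], meta=data.get('meta'))
        m.version = data.get('_v', 1)
        return m

    @classmethod
    def deserialize_json(cls, raw):
        return cls.deserialize(json.loads(raw))


m = Msg('log entry', '/tmp/x', meta={'tags': ['syslog']})
round_tripped = Msg.deserialize_json(m.serialize_json())
print(round_tripped.message)  # -> 'log entry'
```

Carrying the `_v` version on every message lets a forwarder and a remote viewer running different logdog releases negotiate or reject payloads instead of silently misparsing them.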