├── binder
├── apt.txt
├── runtime.txt
├── requirements.txt
└── postBuild
├── tributary
├── streaming
│ ├── base.py
│ ├── control
│ │ ├── __init__.py
│ │ ├── utils.py
│ │ └── control.py
│ ├── calculations
│ │ ├── utils.py
│ │ ├── __init__.py
│ │ ├── basket.py
│ │ └── finance.py
│ ├── __init__.py
│ ├── output
│ │ ├── __init__.py
│ │ ├── text.py
│ │ ├── file.py
│ │ ├── postgres.py
│ │ ├── kafka.py
│ │ ├── socketio.py
│ │ └── email.py
│ ├── input
│ │ ├── __init__.py
│ │ ├── file.py
│ │ ├── sse.py
│ │ ├── process.py
│ │ ├── postgres.py
│ │ ├── socketio.py
│ │ └── kafka.py
│ ├── dd3.py
│ ├── serialize.py
│ └── graph.py
├── tests
│ ├── __init__.py
│ ├── lazy
│ │ ├── __init__.py
│ │ ├── input
│ │ │ └── __init__.py
│ │ ├── control
│ │ │ ├── __init__.py
│ │ │ └── test_if_lazy.py
│ │ ├── output
│ │ │ ├── __init__.py
│ │ │ └── test_output_lazy.py
│ │ ├── calculations
│ │ │ ├── test_rolling_lazy.py
│ │ │ ├── test_basket_lazy.py
│ │ │ └── test_finance_lazy.py
│ │ ├── test_declarative_lazy.py
│ │ ├── test_fix_value_lazy.py
│ │ ├── test_tolerance_lazy.py
│ │ ├── test_method_lazy.py
│ │ ├── test_tweaks_lazy.py
│ │ ├── test_utils_lazy.py
│ │ └── test_graph_class_lazy.py
│ ├── utils
│ │ ├── __init__.py
│ │ ├── test_extract_parameters.py
│ │ └── test_lazy_to_streaming.py
│ ├── functional
│ │ ├── __init__.py
│ │ ├── test_output_func.py
│ │ ├── test_utils_func.py
│ │ ├── test_input_func.py
│ │ └── test_functional.py
│ ├── streaming
│ │ ├── __init__.py
│ │ ├── input
│ │ │ ├── __init__.py
│ │ │ ├── test_sse_streaming.py
│ │ │ ├── test_postgres_streaming.py
│ │ │ ├── test_file_data.json
│ │ │ ├── test_file_data.csv
│ │ │ ├── test_http_streaming.py
│ │ │ ├── test_process_streaming.py
│ │ │ └── test_input_streaming.py
│ │ ├── control
│ │ │ ├── __init__.py
│ │ │ └── test_if_streaming.py
│ │ ├── output
│ │ │ ├── __init__.py
│ │ │ ├── test_sse_streaming.py
│ │ │ ├── test_socketio_streaming.py
│ │ │ ├── test_kafka_streaming.py
│ │ │ ├── test_postgres_streaming.py
│ │ │ ├── test_file_streaming.py
│ │ │ ├── test_ws_streaming.py
│ │ │ ├── test_output_streaming.py
│ │ │ └── test_http_streaming.py
│ │ ├── calculations
│ │ │ ├── __init__.py
│ │ │ ├── test_calculations_streaming.py
│ │ │ ├── test_basket_streaming.py
│ │ │ ├── test_finance_streaming.py
│ │ │ └── test_rolling_streaming.py
│ │ ├── echo.py
│ │ ├── test_timing_streaming.py
│ │ └── test_streaming.py
│ ├── symbolic
│ │ ├── __init__.py
│ │ └── test_symbolic.py
│ ├── test_base.py
│ └── test_data
│ │ └── ohlc.csv
├── ts
│ └── __init__.py
├── functional
│ ├── output.py
│ ├── utils.py
│ └── __init__.py
├── incubating
│ ├── __init__.py
│ └── ast_experiments.py
├── lazy
│ ├── input
│ │ └── __init__.py
│ ├── control
│ │ ├── __init__.py
│ │ ├── utils.py
│ │ └── control.py
│ ├── calculations
│ │ ├── utils.py
│ │ ├── __init__.py
│ │ ├── rolling.py
│ │ ├── basket.py
│ │ └── finance.py
│ ├── output
│ │ └── __init__.py
│ ├── base.py
│ ├── __init__.py
│ ├── dd3.py
│ ├── decorator.py
│ ├── utils.py
│ └── graph.py
├── _version.py
├── __init__.py
├── thread.py
├── base.py
├── symbolic
│ └── __init__.py
├── parser
│ └── __init__.py
└── utils.py
├── docs
├── icon.ai
├── img
│ ├── nem.png
│ ├── icon.png
│ ├── example.gif
│ ├── example1.gif
│ ├── lazy
│ │ ├── dagred3.gif
│ │ ├── example1.png
│ │ └── example2.png
│ └── streaming
│ │ ├── sio.png
│ │ ├── ws.png
│ │ ├── http.png
│ │ ├── kafka.png
│ │ ├── dagred3.gif
│ │ ├── example1.png
│ │ ├── example2.png
│ │ ├── example3.png
│ │ └── example4.png
├── examples
│ ├── lazy
│ │ ├── output_18_0.png
│ │ ├── output_23_0.png
│ │ ├── output_8_0.svg
│ │ └── output_6_0.svg
│ ├── streaming
│ │ ├── output_23_0.png
│ │ ├── output_26_0.png
│ │ ├── output_30_0.png
│ │ ├── output_3_0.svg
│ │ └── output_13_0.svg
│ └── autodiff
│ │ ├── autodiff.md
│ │ └── output_15_0.svg
├── Makefile
└── make.bat
├── ci
├── scripts
│ ├── init.sql
│ └── server.js
└── docker-compose.yml
├── pyproject.toml
├── .gitattributes
├── .github
├── dependabot.yml
├── ISSUE_TEMPLATE
│ ├── feature_request.md
│ └── bug_report.md
├── workflows
│ └── build.yml
└── CODE_OF_CONDUCT.md
├── MANIFEST.in
├── examples
├── streaming_kafka.ipynb
├── streaming_dagred3.ipynb
├── autodifferentiation.ipynb
├── lazy_excel.ipynb
├── lazy_dagred3.ipynb
├── pipeline_stream.ipynb
├── lazy_blackscholes_autodiff.ipynb
└── pipeline_ws_http_sio.ipynb
├── setup.cfg
├── CONTRIBUTING.md
├── Makefile
├── setup.py
├── .gitignore
├── CATALOG.md
└── README.md
/binder/apt.txt:
--------------------------------------------------------------------------------
1 | graphviz-dev
--------------------------------------------------------------------------------
/binder/runtime.txt:
--------------------------------------------------------------------------------
1 | python-3.7
--------------------------------------------------------------------------------
/tributary/streaming/base.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/ts/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/functional/output.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/incubating/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/lazy/input/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/utils/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/functional/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/input/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/symbolic/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/control/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/output/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/_version.py:
--------------------------------------------------------------------------------
1 | __version__ = "0.2.2"
2 |
--------------------------------------------------------------------------------
/tributary/tests/functional/test_output_func.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/functional/test_utils_func.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/control/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/calculations/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_sse_streaming.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_sse_streaming.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/tributary/lazy/control/__init__.py:
--------------------------------------------------------------------------------
1 | from .control import If
2 |
--------------------------------------------------------------------------------
/tributary/streaming/control/__init__.py:
--------------------------------------------------------------------------------
1 | from .control import If
2 |
--------------------------------------------------------------------------------
/tributary/lazy/control/utils.py:
--------------------------------------------------------------------------------
1 | _CONTROL_GRAPHVIZSHAPE = "ellipse"
2 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/calculations/test_calculations_streaming.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/docs/icon.ai:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/icon.ai
--------------------------------------------------------------------------------
/tributary/streaming/control/utils.py:
--------------------------------------------------------------------------------
1 | _CONTROL_GRAPHVIZSHAPE = "ellipse"
2 |
--------------------------------------------------------------------------------
/docs/img/nem.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/nem.png
--------------------------------------------------------------------------------
/tributary/lazy/calculations/utils.py:
--------------------------------------------------------------------------------
1 | _CALCULATIONS_GRAPHVIZSHAPE = "ellipse"
2 |
--------------------------------------------------------------------------------
/ci/scripts/init.sql:
--------------------------------------------------------------------------------
1 | CREATE TABLE test(col1 INT);
2 | INSERT INTO test(col1) VALUES(1);
--------------------------------------------------------------------------------
/docs/img/icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/icon.png
--------------------------------------------------------------------------------
/tributary/lazy/output/__init__.py:
--------------------------------------------------------------------------------
1 | from .output import Graph, Print, GraphViz, Dagre
2 |
--------------------------------------------------------------------------------
/docs/img/example.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/example.gif
--------------------------------------------------------------------------------
/docs/img/example1.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/example1.gif
--------------------------------------------------------------------------------
/tributary/tests/lazy/calculations/test_rolling_lazy.py:
--------------------------------------------------------------------------------
1 | class TestRolling:
2 | pass
3 |
--------------------------------------------------------------------------------
/docs/img/lazy/dagred3.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/lazy/dagred3.gif
--------------------------------------------------------------------------------
/docs/img/lazy/example1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/lazy/example1.png
--------------------------------------------------------------------------------
/docs/img/lazy/example2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/lazy/example2.png
--------------------------------------------------------------------------------
/docs/img/streaming/sio.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/sio.png
--------------------------------------------------------------------------------
/docs/img/streaming/ws.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/ws.png
--------------------------------------------------------------------------------
/docs/img/streaming/http.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/http.png
--------------------------------------------------------------------------------
/docs/img/streaming/kafka.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/kafka.png
--------------------------------------------------------------------------------
/docs/img/streaming/dagred3.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/dagred3.gif
--------------------------------------------------------------------------------
/docs/img/streaming/example1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/example1.png
--------------------------------------------------------------------------------
/docs/img/streaming/example2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/example2.png
--------------------------------------------------------------------------------
/docs/img/streaming/example3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/example3.png
--------------------------------------------------------------------------------
/docs/img/streaming/example4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/img/streaming/example4.png
--------------------------------------------------------------------------------
/tributary/lazy/base.py:
--------------------------------------------------------------------------------
1 | from .graph import LazyGraph
2 | from .node import Node
3 | from .decorator import node
4 |
--------------------------------------------------------------------------------
/docs/examples/lazy/output_18_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/examples/lazy/output_18_0.png
--------------------------------------------------------------------------------
/docs/examples/lazy/output_23_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/examples/lazy/output_23_0.png
--------------------------------------------------------------------------------
/docs/examples/streaming/output_23_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/examples/streaming/output_23_0.png
--------------------------------------------------------------------------------
/docs/examples/streaming/output_26_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/examples/streaming/output_26_0.png
--------------------------------------------------------------------------------
/docs/examples/streaming/output_30_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/1kbgz/tributary/HEAD/docs/examples/streaming/output_30_0.png
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [build-system]
2 | # Minimum requirements for the build system to execute.
3 | requires = ["setuptools", "wheel"]
4 |
--------------------------------------------------------------------------------
/tributary/streaming/calculations/utils.py:
--------------------------------------------------------------------------------
1 | _CALCULATIONS_GRAPHVIZSHAPE = "ellipse"
2 |
3 |
4 | def _raise(e):
5 | raise e
6 |
--------------------------------------------------------------------------------
/tributary/lazy/calculations/__init__.py:
--------------------------------------------------------------------------------
1 | from .finance import *
2 | from .ops import *
3 | from .rolling import *
4 | from .basket import *
5 |
--------------------------------------------------------------------------------
/tributary/streaming/calculations/__init__.py:
--------------------------------------------------------------------------------
1 | from .basket import *
2 | from .finance import *
3 | from .ops import *
4 | from .rolling import *
5 |
--------------------------------------------------------------------------------
/.gitattributes:
--------------------------------------------------------------------------------
1 | examples/* linguist-documentation
2 | docs/* linguist-documentation
3 | tests/notebooks/* linguist-documentation
4 | * text=auto eol=lf
5 |
--------------------------------------------------------------------------------
/.github/dependabot.yml:
--------------------------------------------------------------------------------
1 | version: 2
2 | updates:
3 | - package-ecosystem: "github-actions"
4 | directory: "/"
5 | schedule:
6 | interval: "weekly"
7 |
--------------------------------------------------------------------------------
/tributary/tests/test_base.py:
--------------------------------------------------------------------------------
1 | class TestBase:
2 | def test_1(self):
3 | from tributary.base import StreamNone
4 |
5 | assert not StreamNone()
6 |
--------------------------------------------------------------------------------
/binder/requirements.txt:
--------------------------------------------------------------------------------
1 | ipydagred3==0.2.5
2 | ipyregulartable==0.1.2
3 | ipywidgets==7.5.1
4 | jupyterlab==2.2.10
5 | pyEX==0.2.5
6 | requests==2.31.0
7 | tributary==0.1.4
--------------------------------------------------------------------------------
/tributary/lazy/__init__.py:
--------------------------------------------------------------------------------
1 | from .base import LazyGraph, Node as LazyNode, node
2 | from .calculations import *
3 | from .control import *
4 | from .input import *
5 | from .output import *
6 | from .utils import *
7 |
--------------------------------------------------------------------------------
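A minimal usage sketch for the lazy API exported above, following the patterns in tributary/tests/lazy/ (Node, setValue, and the overloaded + all appear in those tests): a lazy node recomputes derived values only when it is re-evaluated after an upstream change.

import tributary.lazy as tl

n = tl.Node(value=1)   # leaf node holding a value
z = n + 5              # derived node, evaluated lazily

print(z())             # 6
n.setValue(2)          # update the upstream value
print(z())             # 7 on re-evaluation
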
/tributary/__init__.py:
--------------------------------------------------------------------------------
1 | from ._version import __version__ # noqa: F401, E402
2 |
3 | from .lazy import LazyGraph, LazyNode, node # noqa: F401, E402
4 | from .streaming import * # noqa: F401, F403, E402
5 | from .utils import LazyToStreaming # noqa: F401, E402
6 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_declarative_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as t
2 |
3 |
4 | class TestDeclarative:
5 | def test_simple_declarative(self):
6 | n = t.Node(value=1)
7 | z = n + 5
8 | assert z() == 6
9 | n.setValue(2)
10 | assert z() == 7
11 |
--------------------------------------------------------------------------------
/binder/postBuild:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | EXTENSIONS="
3 | @jupyter-widgets/jupyterlab-manager@2.0.0
4 | ipydagred3@0.2.5
5 | ipyregulartable@0.1.2
6 | "
7 |
8 | jupyter labextension install $EXTENSIONS --no-build
9 | jupyter lab build --dev-build=False --minimize=False
10 |
11 | jupyter serverextension enable --py jupyterlab
12 |
--------------------------------------------------------------------------------
/tributary/tests/functional/test_input_func.py:
--------------------------------------------------------------------------------
1 | import time
2 | import tributary.functional as t
3 |
4 |
5 | class TestInput:
6 | def test_http(self):
7 | t.pipeline(
8 | [t.http], ["callback"], [{"url": "https://www.google.com"}], on_data=print
9 | )
10 | time.sleep(2)
11 | t.stop()
12 |
--------------------------------------------------------------------------------
/ci/scripts/server.js:
--------------------------------------------------------------------------------
1 | const io = require("socket.io"), server = io.listen(8069);
2 |
3 | server.on("connection", (socket) => {
4 | console.info(`Connection with id=${socket.id}`);
5 | socket.on("*", (event) => {
6 | console.log(event);
7 | })
8 | socket.on("disconnect", () => {
9 | console.info(`Disconnected id=${socket.id}`);
10 | });
11 | });
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_fix_value_lazy.py:
--------------------------------------------------------------------------------
1 | import random
2 |
3 | import tributary.lazy as t
4 |
5 |
6 | class TestCallableLock:
7 | def test_callable_lock(self):
8 | n = t.Node(value=random.random, dynamic=True)
9 |
10 | x = n()
11 | assert n() != x
12 |
13 | n.setValue(5)
14 | assert n() == 5
15 |
16 | n.unlock()
17 | assert n() != 5
18 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/echo.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | import sys
3 | import os.path
4 |
5 | sys.path.insert(
6 | 0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../../../"))
7 | )
8 |
9 |
10 | if __name__ == "__main__":
11 | import tributary.streaming as ts
12 |
13 | def _json(val):
14 | return JSON.dumps(val)
15 |
16 | ts.run(ts.Console(json=True).apply(_json).print())
17 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_tolerance_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as t
2 |
3 |
4 | class TestTolerance:
5 | def test_tolerance(self):
6 | n = t.Node(value=1.0)
7 | n2 = n + 1
8 |
9 | assert n() == 1.0
10 | assert n2() == 2.0
11 |
12 | n.setValue(1.0000000000000001)
13 | assert n2.isDirty() is False
14 |
15 | n.setValue(1.0001)
16 | assert n2.isDirty() is True
17 |
--------------------------------------------------------------------------------
/tributary/streaming/__init__.py:
--------------------------------------------------------------------------------
1 | from ..base import StreamEnd, StreamNone, StreamRepeat
2 | from .calculations import *
3 | from .control import *
4 | from .graph import StreamingGraph
5 | from .input import *
6 | from .node import Node as StreamingNode
7 | from .output import *
8 | from .scheduler import Scheduler
9 | from .utils import *
10 |
11 |
12 | def run(node, blocking=True, **kwargs):
13 | graph = node.constructGraph()
14 | kwargs["blocking"] = blocking
15 | return graph.run(**kwargs)
16 |
--------------------------------------------------------------------------------
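A short sketch of the run() helper defined above, mirroring the usage in tributary/tests/streaming/ (Func wraps a generator source, Print echoes and collects values): the graph is constructed from the terminal node and run to completion, returning the collected output.

import tributary.streaming as ts

def source():
    yield 1
    yield 2
    yield 3

# Build a tiny graph: generator source -> print sink, then run it.
print(ts.run(ts.Print(ts.Func(source))))  # [1, 2, 3]
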
/tributary/tests/lazy/control/test_if_lazy.py:
--------------------------------------------------------------------------------
1 | def func():
2 | yield True
3 | yield False
4 | yield True
5 |
6 |
7 | class TestConditional:
8 | def test_if(self):
9 | import tributary.lazy as tl
10 |
11 | x = tl.Node(value=func)
12 | y = tl.Node(1)
13 | z = tl.Node(-1)
14 |
15 | out = tl.If(x, y, z)
16 |
17 | print(out.graph())
18 | assert out() == 1
19 | assert out() == -1
20 | assert out() == 1
21 | assert out() == 1
22 | assert out() == 1
23 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/output/test_output_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as tl
2 |
3 |
4 | class TestOutput:
5 | def test_graphviz(self):
6 | def func():
7 | return 1
8 |
9 | assert tl.GraphViz(tl.Node(value=func, dynamic=True))
10 |
11 | def test_dagre(self):
12 | def func():
13 | return 1
14 |
15 | assert tl.Dagre(tl.Node(value=func, dynamic=True))
16 |
17 | def test_multiple_graph(self):
18 | n = tl.Node(5)
19 | n1 = n + 5
20 | n2 = n + 4
21 | assert tl.Graph([n1, n2]) is not None
22 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/control/test_if_streaming.py:
--------------------------------------------------------------------------------
1 | def conditionals():
2 | yield True
3 | yield False
4 | yield True
5 |
6 |
7 | def if_stream():
8 | yield 1
9 | yield -1
10 | yield 3
11 |
12 |
13 | def else_stream():
14 | yield -1
15 | yield 2
16 | yield -1
17 |
18 |
19 | class TestConditional:
20 | def test_if(self):
21 | import tributary.streaming as ts
22 |
23 | assert ts.run(
24 | ts.Print(
25 | ts.If(ts.Func(conditionals), ts.Func(if_stream), ts.Func(else_stream))
26 | )
27 | ) == [1, 2, 3]
28 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_socketio_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pytest
3 | import time
4 |
5 |
6 | class TestSocketIO:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
11 | def test_socketio(self):
12 | """Test socketio streaming"""
13 |
14 | def func():
15 | yield "a"
16 | yield "b"
17 | yield "c"
18 |
19 | out = ts.SocketIOSink(ts.Func(func), url="http://localhost:8069")
20 | assert ts.run(out) == ["a", "b", "c"]
21 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_kafka_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pytest
3 | import time
4 |
5 |
6 | class TestKafka:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
11 | def test_kafka(self):
12 | """Test streaming with Kafka"""
13 |
14 | def func():
15 | yield "a"
16 | yield "b"
17 | yield "c"
18 |
19 | out = ts.KafkaSink(ts.Func(func), servers="localhost:9092", topic="tributary")
20 | assert ts.run(out) == ["a", "b", "c"]
21 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_postgres_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pytest
3 | import time
4 |
5 |
6 | class TestPostgres:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
11 | def test_pg(self):
12 | query = ["SELECT * FROM test"]
13 | out = ts.PostgresSource(
14 | queries=query,
15 | user="postgres",
16 | database="postgres",
17 | password="test",
18 | host="localhost:5432",
19 | )
20 | assert len(ts.run(out)) != 0
21 |
--------------------------------------------------------------------------------
/MANIFEST.in:
--------------------------------------------------------------------------------
1 | graft tributary
2 | include LICENSE
3 | include README.md
4 | include CODE_OF_CONDUCT.md
5 | include CONTRIBUTING.md
6 |
7 |
8 | include setup.cfg
9 | include pyproject.toml
10 | include .bumpversion.cfg
11 | include Makefile
12 |
13 | graft tributary/tests
14 | exclude tributary/tests/streaming/output/test_file_data.json
15 |
16 | # Get rid of docs, ci, and binder
17 | prune ci
18 | prune docs
19 | prune binder
20 | prune examples
21 | exclude CATALOG.md
22 |
23 |
24 | # Patterns to exclude from any directory
25 | global-exclude *~
26 | global-exclude *.pyc
27 | global-exclude *.pyo
28 | global-exclude .git
29 | global-exclude .ipynb_checkpoints
30 | global-exclude .DS_Store
31 |
--------------------------------------------------------------------------------
/docs/Makefile:
--------------------------------------------------------------------------------
1 | # Minimal makefile for Sphinx documentation
2 | #
3 |
4 | # You can set these variables from the command line.
5 | SPHINXOPTS =
6 | SPHINXBUILD = sphinx-build
7 | SPHINXPROJ = tributary
8 | SOURCEDIR = .
9 | BUILDDIR = _build
10 |
11 | # Put it first so that "make" without argument is like "make help".
12 | help:
13 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
14 |
15 | .PHONY: help Makefile
16 |
17 | # Catch-all target: route all unknown targets to Sphinx using the new
18 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
19 | %: Makefile
20 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/feature_request.md:
--------------------------------------------------------------------------------
1 | ---
2 | name: Feature request
3 | about: Suggest an idea for this project
4 | title: ''
5 | labels: ''
6 | assignees: ''
7 |
8 | ---
9 |
10 | **Is your feature request related to a problem? Please describe.**
11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
12 |
13 | **Describe the solution you'd like**
14 | A clear and concise description of what you want to happen.
15 |
16 | **Describe alternatives you've considered**
17 | A clear and concise description of any alternative solutions or features you've considered.
18 |
19 | **Additional context**
20 | Add any other context or screenshots about the feature request here.
21 |
--------------------------------------------------------------------------------
/tributary/thread.py:
--------------------------------------------------------------------------------
1 | from .base import StreamNone
2 | from gevent import spawn
3 |
4 |
5 | def run(target, timeout=1):
6 | """Helper for running a thread
7 |
8 | Args:
9 | target (function): function to run on a thread
10 | timeout (int): how long to wait for target to return
11 | Returns:
12 | data: result of the function
13 | """
14 | last = None
15 | done = False
16 |
17 | while not done:
18 | g = spawn(target)
19 | g.join(timeout)
20 |
21 | while not g.successful():
22 | yield StreamNone(last)
23 |
24 | last = g.value
25 | if last is None:
26 | done = True
27 | else:
28 | yield last
29 |
--------------------------------------------------------------------------------
/tributary/streaming/calculations/basket.py:
--------------------------------------------------------------------------------
1 | from .ops import unary
2 | from ..node import Node
3 |
4 |
5 | Len = unary((lambda x: len(x),), name="Len")
6 | CountBasket = unary((lambda x: len(x),), name="Count")
7 | MaxBasket = unary((lambda x: max(x),), name="Max")
8 | MinBasket = unary((lambda x: min(x),), name="Min")
9 | SumBasket = unary((lambda x: sum(x),), name="Sum")
10 | AverageBasket = unary((lambda x: sum(x) / len(x),), name="Average")
11 | MeanBasket = AverageBasket
12 |
13 | Node.__len__ = Len
14 | Node.len = Len
15 | Node.countBasket = CountBasket
16 | Node.minBasket = MinBasket
17 | Node.maxBasket = MaxBasket
18 | Node.sumBasket = SumBasket
19 | Node.averageBasket = AverageBasket
20 | Node.meanBasket = AverageBasket
21 |
--------------------------------------------------------------------------------
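A sketch mirroring tributary/tests/streaming/calculations/test_basket_streaming.py: each basket operator reduces the list emitted on a tick to a single value.

import tributary.streaming as ts

def baskets():
    yield [1]
    yield [1, 2]
    yield [1, 2, 3]

print(ts.run(ts.MaxBasket(ts.Func(baskets))))      # [1, 2, 3]
print(ts.run(ts.AverageBasket(ts.Func(baskets))))  # [1, 1.5, 2]
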
/tributary/streaming/output/__init__.py:
--------------------------------------------------------------------------------
1 | from .email import Email as EmailSink
2 | from .file import File as FileSink
3 | from .http import HTTP as HTTPSink
4 | from .http import HTTPServer as HTTPServerSink
5 | from .kafka import Kafka as KafkaSink
6 | from .output import Collect, Dagre
7 | from .output import Func as FuncOutput
8 | from .output import Queue as QueueSink
9 | from .output import Graph, GraphViz, Logging, Perspective, PPrint, Print
10 | from .postgres import Postgres as PostgresSink
11 | from .socketio import SocketIO as SocketIOSink
12 | from .sse import SSE as SSESink
13 | from .text import TextMessage as TextMessageSink
14 | from .ws import WebSocket as WebSocketSink
15 | from .ws import WebSocketServer as WebSocketServerSink
16 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_file_data.json:
--------------------------------------------------------------------------------
1 | {"A":0.5202935277,"B":-0.1342029162,"C":-0.4063935124,"D":0.7683930309}
2 | {"A":2.4391663877,"B":-0.4531601906,"C":0.5504786851,"D":0.325227506}
3 | {"A":0.3086543475,"B":-0.4699086094,"C":0.030693103,"D":1.4332804114}
4 | {"A":1.0297827499,"B":-1.5501301415,"C":2.3629809104,"D":1.6914496686}
5 | {"A":-0.884150846,"B":-0.103603736,"C":-2.0112105601,"D":0.4678144406}
6 | {"A":-0.5774053104,"B":0.3085597903,"C":0.4117180955,"D":-1.2195844306}
7 | {"A":0.3986720589,"B":-0.7892302474,"C":-0.6073367673,"D":-0.0705578578}
8 | {"A":0.6379109681,"B":-2.8962345925,"C":0.5189689327,"D":0.4448039862}
9 | {"A":0.4370833908,"B":0.7522062612,"C":0.0451172889,"D":-0.8192326352}
10 | {"A":1.1929752083,"B":0.0977595985,"C":0.4772077299,"D":-0.148081604}
--------------------------------------------------------------------------------
/tributary/streaming/input/__init__.py:
--------------------------------------------------------------------------------
1 | from .file import File
2 | from .file import File as FileSource
3 | from .http import HTTP
4 | from .http import HTTP as HTTPSource
5 | from .http import HTTPServer
6 | from .http import HTTPServer as HTTPServerSource
7 | from .input import *
8 | from .kafka import Kafka
9 | from .kafka import Kafka as KafkaSource
10 | from .postgres import Postgres
11 | from .postgres import Postgres as PostgresSource
12 | from .process import SubprocessSource
13 | from .socketio import SocketIO
14 | from .socketio import SocketIO as SocketIOSource
15 | from .sse import SSE
16 | from .sse import SSE as SSESource
17 | from .ws import WebSocket
18 | from .ws import WebSocket as WebSocketSource
19 | from .ws import WebSocketServer
20 | from .ws import WebSocketServer as WebSocketServerSource
21 |
--------------------------------------------------------------------------------
/tributary/lazy/dd3.py:
--------------------------------------------------------------------------------
1 | class _DagreD3Mixin(object):
2 | def _greendd3g(self):
3 | if self._dd3g:
4 | self._dd3g.setNode(
5 | self._name, tooltip=str(self.value()), style="fill: #0f0"
6 | )
7 |
8 | def _yellowdd3g(self):
9 | if self._dd3g:
10 | self._dd3g.setNode(
11 | self._name, tooltip=str(self.value()), style="fill: #ff0"
12 | )
13 |
14 | def _reddd3g(self):
15 | if self._dd3g:
16 | self._dd3g.setNode(
17 | self._name, tooltip=str(self.value()), style="fill: #f00"
18 | )
19 |
20 | def _whited3g(self):
21 | if self._dd3g:
22 | self._dd3g.setNode(
23 | self._name, tooltip=str(self.value()), style="fill: #fff"
24 | )
25 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_method_lazy.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | import tributary.lazy as tl
3 |
4 |
5 | class ExamplePoint(tl.LazyGraph):
6 | def __init__(self):
7 | self._now = datetime.now()
8 | self._curve = {self._now: 1, -1: 2}
9 | super().__init__()
10 |
11 | @tl.node()
12 | def now(self):
13 | return self._now
14 |
15 | @tl.node()
16 | def asof(self, date=None):
17 | if date is None:
18 | # return last
19 | return self._curve[-1]
20 | return self._curve[date]
21 |
22 |
23 | class TestMethod:
24 | def test_args_kwargs(self):
25 | pt = ExamplePoint()
26 |         # FIXME changed during lazy rewrite, no idea what I was testing originally
27 | assert pt.asof(date=None) == 2
28 | assert pt.asof(date=pt.now()) == 1
29 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_postgres_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pytest
3 | import time
4 |
5 |
6 | class TestPostgres:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
11 | def test_pg(self):
12 | def func():
13 | yield 1
14 | yield 2
15 | yield 3
16 |
17 | def parser(data):
18 | return ["INSERT INTO test(col1) VALUES ({});".format(data)]
19 |
20 | out = ts.PostgresSink(
21 | ts.Func(func),
22 | query_parser=parser,
23 | user="postgres",
24 | database="postgres",
25 | password="test",
26 | host="localhost:5432",
27 | )
28 | assert len(ts.run(out)) == 3
29 |
--------------------------------------------------------------------------------
/tributary/tests/functional/test_functional.py:
--------------------------------------------------------------------------------
1 | import pytest
2 | import random
3 | import time
4 |
5 |
6 | class TestFunctional:
7 | def setup_method(self):
8 | time.sleep(1)
9 |
10 | @pytest.mark.skipif("os.name == 'nt'")
11 | def test_general(self):
12 | import tributary.functional as t
13 |
14 | def func1(on_data):
15 | x = 0
16 | while x < 5:
17 | on_data({"a": random.random(), "b": random.randint(0, 1000), "x": x})
18 | time.sleep(0.01)
19 | x = x + 1
20 |
21 | def func2(data, callback):
22 | callback(
23 | [{"a": data["a"] * 1000, "b": data["b"], "c": "AAPL", "x": data["x"]}]
24 | )
25 |
26 | t.pipeline([func1, func2], ["on_data", "callback"], on_data=lambda x: None)
27 | time.sleep(1)
28 | t.stop()
29 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_file_streaming.py:
--------------------------------------------------------------------------------
1 | import os
2 | import time
3 | import tributary.streaming as ts
4 |
5 |
6 | class TestFile:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | def test_file(self):
11 | file = os.path.abspath(
12 | os.path.join(os.path.dirname(__file__), "test_file_data.json")
13 | )
14 | if os.path.exists(file):
15 | os.remove(file)
16 |
17 | def func():
18 | yield 1
19 | yield 2
20 | yield 3
21 | yield 4
22 |
23 | def read_file(file):
24 | with open(file, "r") as fp:
25 | data = fp.read()
26 | return [int(x) for x in data]
27 |
28 | # Test that output is equal to what is read (generalized)
29 | out = ts.FileSink(ts.Func(func), filename=file, json=True)
30 | assert ts.run(out) == read_file(file)
31 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_ws_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pytest
3 | import time
4 |
5 |
6 | class TestWebSocket:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
11 | def test_websocket(self):
12 | """Test websocket streaming"""
13 |
14 | def func():
15 | yield "x"
16 | yield "y"
17 | yield "z"
18 |
19 | out = ts.WebSocketSink(ts.Func(func), url="ws://localhost:8080", response=True)
20 | assert len(ts.run(out)) == 3
21 |
22 | def test_websocket_server(self):
23 | """Test websocket server"""
24 |
25 | def func():
26 | yield "x"
27 | yield "y"
28 | yield "z"
29 |
30 | out = ts.WebSocketServerSink(ts.Func(func), port=1234)
31 | assert len(ts.run(out)) == 3
32 |
--------------------------------------------------------------------------------
/docs/make.bat:
--------------------------------------------------------------------------------
1 | @ECHO OFF
2 |
3 | pushd %~dp0
4 |
5 | REM Command file for Sphinx documentation
6 |
7 | if "%SPHINXBUILD%" == "" (
8 | set SPHINXBUILD=sphinx-build
9 | )
10 | set SOURCEDIR=.
11 | set BUILDDIR=_build
12 | set SPHINXPROJ=tributary
13 |
14 | if "%1" == "" goto help
15 |
16 | %SPHINXBUILD% >NUL 2>NUL
17 | if errorlevel 9009 (
18 | echo.
19 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
20 | echo.installed, then set the SPHINXBUILD environment variable to point
21 | echo.to the full path of the 'sphinx-build' executable. Alternatively you
22 | echo.may add the Sphinx directory to PATH.
23 | echo.
24 | echo.If you don't have Sphinx installed, grab it from
25 | echo.http://sphinx-doc.org/
26 | exit /b 1
27 | )
28 |
29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
30 | goto end
31 |
32 | :help
33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
34 |
35 | :end
36 | popd
37 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_file_data.csv:
--------------------------------------------------------------------------------
1 | index,A,B,C,D
2 | 2000-01-03,0.026648148571715664,-0.7590154652035094,-0.6494555758721114,1.0184532376246929
3 | 2000-01-04,0.2897691505750989,1.3344806935428815,-1.9651363204561283,-1.6548605987228506
4 | 2000-01-05,-0.5199971756398427,-0.6066183328449831,0.27484079250032406,-0.5610753099845647
5 | 2000-01-06,-0.7126635676927439,1.3122428262804113,2.0336462799165558,0.2175065284752955
6 | 2000-01-07,-0.214971513714637,1.4109329849564642,-0.5792064572755115,0.8086350446039877
7 | 2000-01-10,-0.9888172578604721,-0.007570311807930511,1.083522227060552,0.04844857753597059
8 | 2000-01-11,-0.6152506968139281,-0.6379012668254349,-0.5545751885777795,0.5088485784751581
9 | 2000-01-12,1.273575794521754,0.35551042218088097,0.5514389291813603,-0.22008717208580486
10 | 2000-01-13,-1.077382391129198,0.32078439306326434,0.19722919088791147,1.8449109581947052
11 | 2000-01-14,-1.5834800188615588,1.7435795115972916,0.7906579301597301,-2.7621283579713527
--------------------------------------------------------------------------------
/tributary/tests/streaming/calculations/test_basket_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 |
3 |
4 | def func4():
5 | yield [1]
6 | yield [1, 2]
7 | yield [1, 2, 3]
8 |
9 |
10 | class TestBasket:
11 | def test_Len(self):
12 | t = ts.Timer(func4, count=3)
13 | out = ts.Len(t)
14 | assert ts.run(out) == [1, 2, 3]
15 |
16 | def test_Count(self):
17 | t = ts.Timer(func4, count=3)
18 | out = ts.CountBasket(t)
19 | assert ts.run(out) == [1, 2, 3]
20 |
21 | def test_Max(self):
22 | t = ts.Timer(func4, count=3)
23 | out = ts.MaxBasket(t)
24 | assert ts.run(out) == [1, 2, 3]
25 |
26 | def test_Min(self):
27 | t = ts.Timer(func4, count=3)
28 | out = ts.MinBasket(t)
29 | assert ts.run(out) == [1, 1, 1]
30 |
31 | def test_Average(self):
32 | t = ts.Timer(func4, count=3)
33 | out = ts.AverageBasket(t)
34 | assert ts.run(out) == [1, 1.5, 2]
35 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/test_timing_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 |
3 |
4 | def func():
5 | yield 1
6 | yield 2
7 | yield 3
8 | yield 4
9 | yield 5
10 | yield 6
11 |
12 |
13 | def func2():
14 | yield 1
15 | yield ts.StreamNone()
16 | yield 10
17 | yield ts.StreamNone()
18 | yield 100
19 | yield 1000
20 |
21 |
22 | class TestTiming:
23 | def test_normal(self):
24 | out = ts.Func(func) + ts.Func(func2)
25 | assert ts.run(out) == [2, 12, 103, 1004]
26 |
27 | def test_drop(self):
28 | out = ts.Func(func, drop=True) + ts.Func(func2)
29 | assert ts.run(out) == [2, 12, 104, 1006]
30 |
31 | def test_replace(self):
32 | out = ts.Func(func, replace=True) + ts.Func(func2)
33 | assert ts.run(out) == [2, 13, 105, 1006]
34 |
35 | def test_repeat(self):
36 | out = ts.Func(func) + ts.Func(func2, repeat=True)
37 | assert ts.run(out) == [2, 3, 13, 14, 105, 1006]
38 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_output_streaming.py:
--------------------------------------------------------------------------------
1 | import time
2 | import tributary.streaming as ts
3 |
4 |
5 | class TestOutput:
6 | def setup_method(self):
7 | time.sleep(0.5)
8 |
9 | def test_print(self):
10 | val = 0
11 |
12 | def func():
13 | nonlocal val
14 | val += 1
15 | return val
16 |
17 | assert ts.run(ts.Print(ts.Timer(func, count=2))) == [1, 2]
18 |
19 | def test_print_generator(self):
20 | def func():
21 | yield 1
22 | yield 2
23 | yield 3
24 | yield 4
25 | yield 5
26 |
27 | assert ts.run(ts.Print(ts.Timer(func, count=2))) == [1, 2]
28 |
29 | def test_graphviz(self):
30 | def func():
31 | yield 1
32 |
33 | assert ts.GraphViz(ts.Print(ts.Timer(func, count=2)))
34 |
35 | def test_dagre(self):
36 | def func():
37 | yield 1
38 |
39 | assert ts.Dagre(ts.Print(ts.Timer(func, count=2)))
40 |
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE/bug_report.md:
--------------------------------------------------------------------------------
1 | ---
2 | name: Bug report
3 | about: Create a report to help us improve
4 | title: ''
5 | labels: ''
6 | assignees: ''
7 |
8 | ---
9 |
10 | **Describe the bug**
11 | A clear and concise description of what the bug is.
12 |
13 | **To Reproduce**
14 | Steps to reproduce the behavior:
15 | 1. Go to '...'
16 | 2. Click on '....'
17 | 3. Scroll down to '....'
18 | 4. See error
19 |
20 | **Expected behavior**
21 | A clear and concise description of what you expected to happen.
22 |
23 | **Screenshots**
24 | If applicable, add screenshots to help explain your problem.
25 |
26 | **Desktop (please complete the following information):**
27 | - OS: [e.g. iOS]
28 | - Browser [e.g. chrome, safari]
29 | - Version [e.g. 22]
30 |
31 | **Smartphone (please complete the following information):**
32 | - Device: [e.g. iPhone6]
33 | - OS: [e.g. iOS8.1]
34 | - Browser [e.g. stock browser, safari]
35 | - Version [e.g. 22]
36 |
37 | **Additional context**
38 | Add any other context about the problem here.
39 |
--------------------------------------------------------------------------------
/tributary/streaming/input/file.py:
--------------------------------------------------------------------------------
1 | import aiofiles
2 | import json as JSON
3 | from .input import Func
4 |
5 |
6 | class File(Func):
7 | """Open up a file and yield back lines in the file
8 |
9 | Args:
10 | filename (str): filename to read
11 | json (bool): load file line as json
12 | """
13 |
14 | def __init__(self, filename, json=False, csv=False):
15 | assert not (json and csv)
16 |
17 | async def _file(filename=filename, json=json, csv=csv):
18 | if csv:
19 | async with aiofiles.open(filename) as f:
20 | async for line in f:
21 | yield line.strip().split(",")
22 | else:
23 | async with aiofiles.open(filename) as f:
24 | async for line in f:
25 | if json:
26 | yield JSON.loads(line)
27 | else:
28 | yield line
29 |
30 | super().__init__(func=_file)
31 | self._name = "File"
32 |
--------------------------------------------------------------------------------
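A brief sketch of the source above, assuming a newline-delimited JSON file in the format of tributary/tests/streaming/input/test_file_data.json; the path is a placeholder, and FileSource is the alias exported from tributary.streaming.input.

import tributary.streaming as ts

# Each line of the file becomes one tick; json=True parses it into a dict.
rows = ts.run(ts.FileSource("test_file_data.json", json=True))
print(rows[0]["A"])
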
/tributary/streaming/output/text.py:
--------------------------------------------------------------------------------
1 | from twilio.rest import Client
2 |
3 | from ..node import Node
4 | from .output import Func
5 |
6 |
7 | class TextMessage(Func):
8 | """Send a text message
9 |
10 | Args:
11 | node (Node): input stream
12 | to (str): phone number/s to send to
13 | from_ (str): phone number to send from
14 | twilio (dict): twilio account info kwargs for twilio.rest.Client
15 | """
16 |
17 | def __init__(
18 | self,
19 | node,
20 | to,
21 | from_,
22 | twilio,
23 | ):
24 | self._twilio_client = Client(**twilio)
25 |
26 | async def _send(
27 | message,
28 | to=to,
29 | from_=from_,
30 | twilio=twilio,
31 | ):
32 | r = self._twilio_client.messages.create(to=to, from_=from_, body=message)
33 |
34 | return r, message
35 |
36 | super().__init__(func=_send, inputs=1)
37 | self._name = "TextMessage"
38 | node >> self
39 |
40 |
41 | Node.textMessage = TextMessage
42 |
--------------------------------------------------------------------------------
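A hypothetical wiring of the sink above; the phone numbers and credentials are placeholders, and the twilio dict is unpacked directly into twilio.rest.Client as shown in the constructor.

import tributary.streaming as ts

def alerts():
    yield "price crossed threshold"

sink = ts.TextMessageSink(
    ts.Func(alerts),
    to="+15550100",     # placeholder recipient
    from_="+15550199",  # placeholder sender
    twilio={"username": "ACXXXX", "password": "auth-token"},  # placeholder credentials
)
# ts.run(sink)  # would send one message per streamed value
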
/tributary/streaming/output/file.py:
--------------------------------------------------------------------------------
1 | import aiofiles
2 | import json as JSON
3 | from .output import Func
4 | from ..node import Node
5 |
6 |
7 | class File(Func):
8 | """Open up a file and write lines to the file
9 |
10 | Args:
11 | node (Node): input stream
12 | filename (str): filename to write
13 | json (bool): write file line as json
14 | csv (bool): write file line as csv
15 | """
16 |
17 | def __init__(self, node, filename="", json=False, csv=False):
18 | async def _file(data):
19 | if csv:
20 | async with aiofiles.open(filename, "w") as f:
21 | await f.write(",".join(data))
22 | else:
23 | async with aiofiles.open(filename, mode="a") as f:
24 | if json:
25 | await f.write(JSON.dumps(data))
26 | else:
27 | await f.write(data)
28 | return data
29 |
30 | super().__init__(func=_file, name="File", inputs=1)
31 | node >> self
32 |
33 |
34 | Node.file = File
35 |
--------------------------------------------------------------------------------
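A sketch following tributary/tests/streaming/output/test_file_streaming.py: every value flowing through the sink is appended to the file, serialized as JSON; the filename is a placeholder.

import tributary.streaming as ts

def values():
    yield 1
    yield 2
    yield 3

out = ts.FileSink(ts.Func(values), filename="out.json", json=True)
print(ts.run(out))  # [1, 2, 3], with "123" appended to out.json
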
/tributary/streaming/input/sse.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | from aiohttp_sse_client import client as sse_client
3 | from .input import Func
4 |
5 |
6 | class SSE(Func):
7 | """Connect to SSE url and yield results
8 |
9 | Args:
10 | url (str): url to connect to
11 | json (bool): load http content data as json
12 | wrap (bool): wrap result in a list
13 | field (str): field to index result by
14 | """
15 |
16 | def __init__(self, url, json=False, wrap=False, field=None):
17 | async def _req(url=url, json=json, wrap=wrap, field=field):
18 | async with sse_client.EventSource(url) as event_source:
19 | async for event in event_source:
20 | data = event.data
21 |
22 | if json:
23 | data = JSON.loads(data)
24 | if field:
25 | data = data[field]
26 | if wrap:
27 | data = [data]
28 | yield data
29 |
30 | super().__init__(func=_req)
31 | self._name = "SSE"
32 |
--------------------------------------------------------------------------------
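A hedged sketch of the SSE source above; the URL is a placeholder for any server-sent-events endpoint, and json=True parses each event's data field.

import tributary.streaming as ts

source = ts.SSESource("http://localhost:8080/events", json=True)
out = ts.Print(source)
# ts.run(out)  # streams events until the connection closes
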
/tributary/tests/streaming/input/test_http_streaming.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import pytest
3 | import requests
4 | import time
5 | import tributary.streaming as ts
6 |
7 |
8 | class TestHttp:
9 | def setup_method(self):
10 | time.sleep(0.5)
11 |
12 | def test_http(self):
13 | out = ts.Print(ts.HTTPSource(url="https://google.com"))
14 | ret = ts.run(out)
15 | assert len(ret) == 1
16 |
17 | @pytest.mark.skipif('os.name == "nt" or os.environ.get("CI")')
18 | def test_http_server(self):
19 | ss = ts.HTTPServerSource(json=True, host="127.0.0.1", port=12345)
20 | w = ts.Window(ss)
21 | out = ts.run(w, blocking=False)
22 | try:
23 | time.sleep(5)
24 | _ = requests.post("http://127.0.0.1:12345/", json={"test": 1, "test2": 2})
25 | time.sleep(5)
26 | assert w._accum == [{"test": 1, "test2": 2}]
27 | finally:
28 | asyncio.set_event_loop(asyncio.new_event_loop())
29 | try:
30 | out.stop()
31 | except RuntimeError:
32 | pass
33 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_process_streaming.py:
--------------------------------------------------------------------------------
1 | import os.path
2 | import pytest
3 | import time
4 |
5 | import tributary.streaming as ts
6 |
7 | vals = [
8 | "__init__.py",
9 | "__pycache__",
10 | "test_file_data.csv",
11 | "test_file_data.json",
12 | "test_file_streaming.py",
13 | "test_http_streaming.py",
14 | "test_input_streaming.py",
15 | "test_postgres_streaming.py",
16 | "test_process_streaming.py",
17 | "test_sse_streaming.py",
18 | ]
19 |
20 |
21 | class TestProcess:
22 | def setup_method(self):
23 | time.sleep(0.5)
24 |
25 | @pytest.mark.skipif("'--cov=tributary' in sys.argv")
26 | def test_process(self):
27 | path = os.path.dirname(__file__)
28 | ret = ts.run(ts.SubprocessSource("ls {}".format(path)))
29 | print(ret)
30 | assert ret == vals
31 |
32 | @pytest.mark.skipif("'--cov=tributary' in sys.argv")
33 | def test_process_one_off(self):
34 | path = os.path.dirname(__file__)
35 | ret = ts.run(ts.SubprocessSource("ls {}".format(path), one_off=True))
36 | print(ret)
37 | assert ret == ["\n".join(vals) + "\n"]
38 |
--------------------------------------------------------------------------------
/examples/streaming_kafka.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary as t"
10 | ]
11 | },
12 | {
13 | "cell_type": "code",
14 | "execution_count": 2,
15 | "metadata": {},
16 | "outputs": [],
17 | "source": [
18 | "random = t.Random()\n",
19 | "k1 = t.KafkaSink(random, servers='localhost:9092', topic='sample_topic')"
20 | ]
21 | },
22 | {
23 | "cell_type": "code",
24 | "execution_count": null,
25 | "metadata": {},
26 | "outputs": [],
27 | "source": []
28 | }
29 | ],
30 | "metadata": {
31 | "kernelspec": {
32 | "display_name": "Python 3",
33 | "language": "python",
34 | "name": "python3"
35 | },
36 | "language_info": {
37 | "codemirror_mode": {
38 | "name": "ipython",
39 | "version": 3
40 | },
41 | "file_extension": ".py",
42 | "mimetype": "text/x-python",
43 | "name": "python",
44 | "nbconvert_exporter": "python",
45 | "pygments_lexer": "ipython3",
46 | "version": "3.7.0"
47 | }
48 | },
49 | "nbformat": 4,
50 | "nbformat_minor": 2
51 | }
52 |
--------------------------------------------------------------------------------
/setup.cfg:
--------------------------------------------------------------------------------
1 | [bumpversion]
2 | current_version = 0.2.2
3 | commit = True
4 | tag = False
5 |
6 | [bdist_wheel]
7 | universal = 1
8 |
9 | [metadata]
10 | description_file = README.md
11 | long_description_content_type = text/markdown
12 |
13 | [flake8]
14 | ignore = E203, W503
15 | max-line-length = 200
16 | per-file-ignores =
17 | tributary/lazy/__init__.py:F401, F403
18 | tributary/lazy/base.py:F401
19 | tributary/lazy/calculations/__init__.py:F401, F403
20 | tributary/lazy/control/__init__.py:F401
21 | tributary/lazy/input/__init__.py:F401, F403
22 | tributary/lazy/output/__init__.py:F401
23 | tributary/streaming/__init__.py:F401, F403
24 | tributary/streaming/calculations/__init__.py:F401, F403
25 | tributary/streaming/control/__init__.py:F401
26 | tributary/streaming/input/__init__.py:F401, F403
27 | tributary/streaming/output/__init__.py:F401
28 |
29 | [bumpversion:file:tributary/_version.py]
30 | search = __version__ = "{current_version}"
31 | replace = __version__ = "{new_version}"
32 |
33 | [bumpversion:file:docs/conf.py]
34 | search = version = "{current_version}"
35 | replace = version = "{new_version}"
36 |
37 | [check-manifest]
38 | ignore =
39 | tributary/tests/**
40 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/calculations/test_basket_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as tl
2 |
3 |
4 | def func4():
5 | yield [1]
6 | yield [1, 2]
7 | yield [1, 2, 3]
8 |
9 |
10 | class TestBasket:
11 | def test_Len(self):
12 | t = tl.Node(value=func4)
13 | out = tl.Len(t)
14 | assert out() == 1
15 | assert out() == 2
16 | assert out() == 3
17 |
18 | def test_Count(self):
19 | t = tl.Node(value=func4)
20 | out = tl.CountBasket(t)
21 | assert out() == 1
22 | assert out() == 2
23 | assert out() == 3
24 |
25 | def test_Max(self):
26 | t = tl.Node(value=func4)
27 | out = tl.MaxBasket(t)
28 | assert out() == 1
29 | assert out() == 2
30 | assert out() == 3
31 |
32 | def test_Min(self):
33 | t = tl.Node(value=func4)
34 | out = tl.MinBasket(t)
35 | assert out() == 1
36 | assert out() == 1
37 | assert out() == 1
38 |
39 | def test_Average(self):
40 | t = tl.Node(value=func4)
41 | out = tl.AverageBasket(t)
42 | assert out() == 1
43 | assert out() == 1.5
44 | assert out() == 2
45 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_tweaks_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as tl
2 |
3 |
4 | class TestLazyTweaks:
5 | def test_tweaks(self):
6 | n = tl.Node(name="Test", value=5)
7 | n2 = n + 1
8 |
9 | # base operation
10 | print(n2())
11 | assert n2() == 6
12 |
13 | print(n2(1))
14 | assert not n2.isDirty()
15 |
16 | # tweaking operation applied to `n`
17 | assert n2(1) == 2
18 | assert not n2.isDirty()
19 |
20 | # not permanently set
21 | assert n2() == 6
22 | assert not n2.isDirty()
23 |
24 | def test_tweaks_dirtiness_and_parent(self):
25 | n1 = tl.Node(value=1, name="n1")
26 | n2 = n1 + 2
27 | n3 = n2 + 4
28 |
29 | assert n3({n1: -1}) == 5
30 | assert n3() == 7
31 | assert n3({n2: 2}) == 6
32 | assert n3(2) == 6
33 | assert n3(2, 2) == 4
34 | assert n3({n1: -1}) == 5
35 | assert n3({n1: -1}) == 5
36 | assert n3() == 7
37 | assert n3({n2: 2}) == 6
38 | assert n3(2) == 6
39 | assert n3(2, 2) == 4
40 | assert n3({n1: -1}) == 5
41 | assert n3() == 7
42 | assert n3() == 7
43 | assert n3() == 7
44 |
--------------------------------------------------------------------------------
/tributary/functional/utils.py:
--------------------------------------------------------------------------------
1 | def split(data, callback1, callback2):
2 | """Pass data to 2 callbacks
3 |
4 | Args:
5 | data (any): data to pass to both callbacks
6 | callback1 (callable): first function to call
7 | callback2 (callable): second function to call
8 | """
9 | return (callback1(data), callback2(data))
10 |
11 |
12 | def map(data, *callbacks):
13 | """Pass data to multiple callbacks
14 |
15 | Args:
16 | data (any): data to pass to all callbacks
17 | callbacks (tuple): callbacks to pass data to
18 | """
19 | return (callback(data) for callback in callbacks)
20 |
21 |
22 | def merge(data1, data2, callback):
23 | """Merge two data sources into one callback
24 |
25 | Args:
26 | data1 (any): first data to pass to callback
27 | data2 (any): second data to pass to callback
28 | callback (callable): callback to pass data to
29 | """
30 | return callback((data1, data2))
31 |
32 |
33 | def reduce(callback, *datas):
34 | """Merge multiple data sources into one callback
35 |
36 | Args:
37 | callback (callable): callback to pass data to
38 | datas (tuple): data to pass to callback
39 | """
40 | return callback((data for data in datas))
41 |
--------------------------------------------------------------------------------
/tributary/streaming/input/process.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | from .input import Func
3 |
4 |
5 | class SubprocessSource(Func):
6 | """Open up a subprocess and yield the results as they come
7 |
8 | Args:
9 |         command (str): command to run (if one_off is True, yield the full output once instead of streaming line by line)
10 | """
11 |
12 | def __init__(self, command, one_off=False):
13 | async def _proc(command=command, one_off=one_off):
14 | proc = await asyncio.create_subprocess_shell(
15 | command, stdout=asyncio.subprocess.PIPE
16 | )
17 |
18 | if one_off:
19 | stdout, _ = await proc.communicate()
20 |
21 | if stdout:
22 | stdout = stdout.decode()
23 | yield stdout
24 |
25 | else:
26 | while proc.returncode is None:
27 | done = False
28 |
29 | while not done:
30 | val = await asyncio.create_task(proc.stdout.readline())
31 | val = val.decode().strip()
32 |
33 | if val == "":
34 | done = True
35 | break
36 |
37 | yield val
38 |
39 | super().__init__(func=_proc)
40 | self._name = "Proc"
41 |
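A minimal usage sketch mirroring the tests above; the shell command is an arbitrary POSIX example and the printed outputs are illustrative:

```python
import tributary.streaming as ts

# stream stdout line by line until the process exits
lines = ts.run(ts.SubprocessSource("echo one && echo two"))
print(lines)  # e.g. ["one", "two"]

# one_off=True emits the whole stdout as a single element instead
blob = ts.run(ts.SubprocessSource("echo one && echo two", one_off=True))
print(blob)   # e.g. ["one\ntwo\n"]
```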
--------------------------------------------------------------------------------
/tributary/streaming/control/control.py:
--------------------------------------------------------------------------------
1 | from .utils import _CONTROL_GRAPHVIZSHAPE
2 | from ..node import Node
3 | from ...base import TributaryException
4 |
5 |
6 | def If(if_node, satisfied_node, unsatisfied_node=None, *elseifs):
7 | """Node to return satisfied if if_node else unsatisfied
8 |
9 | Args:
10 | if_node (Node): input stream of booleans
11 |         satisfied_node (Node): input stream to return if True
12 |         unsatisfied_node (Node): input stream to return if False
13 | elseifs (Tuple(Node)): input stream of boolean/value pairs to evaluate as else if statements
14 | """
15 | if len(elseifs) % 2 != 0:
16 | raise TributaryException("Else ifs must be in pairs")
17 |
18 | if elseifs:
19 | # TODO else ifs
20 | raise NotImplementedError()
21 |
22 | def func(conditional, if_val, else_val):
23 | # TODO else ifs
24 | if conditional:
25 | return if_val
26 | return else_val
27 |
28 | ret = Node(
29 | func=func,
30 | func_kwargs=None,
31 | name="If",
32 | inputs=3,
33 | graphvizshape=_CONTROL_GRAPHVIZSHAPE,
34 | )
35 | ret.set("_count", 0)
36 | if_node >> ret
37 | satisfied_node >> ret
38 | unsatisfied_node >> ret
39 | return ret
40 |
41 |
42 | Node.if_ = If
43 |
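A minimal sketch of wiring `If` with three streams, importing it from the defining module above; the expected output assumes the three curves tick in lockstep:

```python
import tributary.streaming as ts
from tributary.streaming.control.control import If

cond = ts.Curve([True, False, True])
when_true = ts.Curve([1, 2, 3])
when_false = ts.Curve([-1, -2, -3])

out = If(cond, when_true, when_false)
print(ts.run(out))  # expected [1, -2, 3]
```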
--------------------------------------------------------------------------------
/ci/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: '2'
2 | services:
3 | # zookeeper:
4 | # image: timkpaine/zookeeper:latest # wurstmeister/zookeeper
5 | # ports:
6 | # - "2181:2181"
7 | # kafka:
8 | # image: timkpaine/kafka:latest # wurstmeister/kafka
9 | # ports:
10 | # - "9092:9092"
11 | # environment:
12 | # KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
13 | # KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
14 | # KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
15 | # KAFKA_CREATE_TOPICS: "tributary:1:1"
16 | # volumes:
17 | #     - /var/run/docker.sock:/var/run/docker.sock
18 |
19 | echo-server:
20 | image: timkpaine/echo-server:latest # jmalloc/echo-server
21 | ports:
22 | - 12345:8080
23 |
24 | # socketio-server:
25 | # image: timkpaine/docker-socketio:latest # voduytuan/docker-socketio
26 | # ports:
27 | # - 8069:8069
28 | # volumes:
29 | # - ./scripts/server.js:/srv/app.js
30 |
31 | # postgres:
32 | # image: timkpaine/postgres:latest # postgres
33 | # environment:
34 | # - POSTGRES_PASSWORD=test
35 | # volumes:
36 | # - ./scripts/init.sql:/docker-entrypoint-initdb.d/init.sql
37 | # ports:
38 | # - "5432:5432"
39 |
40 | # redis:
41 | # image: timkpaine/redis:latest # redis
42 | # ports:
43 | # - "6379:6379"
44 |
--------------------------------------------------------------------------------
/tributary/lazy/calculations/rolling.py:
--------------------------------------------------------------------------------
1 | from ..node import Node
2 |
3 | # def RollingCount(node):
4 | # def func(node=node):
5 | # return node()
6 | # # make new node
7 | # ret = node._gennode('RollingCount({})'.format(node._name), func, [node])
8 | # return ret
9 |
10 |
11 | def RollingCount(node):
12 | raise NotImplementedError()
13 |
14 |
15 | def RollingMax(node):
16 | raise NotImplementedError()
17 |
18 |
19 | def RollingMin(node):
20 | raise NotImplementedError()
21 |
22 |
23 | def RollingSum(node):
24 | raise NotImplementedError()
25 |
26 |
27 | def RollingAverage(node):
28 | raise NotImplementedError()
29 |
30 |
31 | def SMA(node, window_width=10, full_only=False):
32 | raise NotImplementedError()
33 |
34 |
35 | def EMA(node, window_width=10, full_only=False, alpha=None, adjust=False):
36 | raise NotImplementedError()
37 |
38 |
39 | def Last(node):
40 | raise NotImplementedError()
41 |
42 |
43 | def First(node):
44 | raise NotImplementedError()
45 |
46 |
47 | def Diff(node):
48 | raise NotImplementedError()
49 |
50 |
51 | Node.rollingCount = RollingCount
52 | Node.rollingMin = RollingMin
53 | Node.rollingMax = RollingMax
54 | Node.rollingSum = RollingSum
55 | Node.rollingAverage = RollingAverage
56 | Node.diff = Diff
57 | Node.sma = SMA
58 | Node.ema = EMA
59 | Node.last = Last
60 | Node.first = First
61 |
--------------------------------------------------------------------------------
/tributary/lazy/calculations/basket.py:
--------------------------------------------------------------------------------
1 | from .ops import unary
2 | from ..node import Node
3 |
4 |
5 | def Len(self):
6 | """Compute len(n) for node n"""
7 | return unary(self, "len({})".format(self._name_no_id), (lambda x: len(x)))
8 |
9 |
10 | def CountBasket(self):
11 | """Compute len(n) for node n"""
12 | return unary(self, "count({})".format(self._name_no_id), (lambda x: len(x)))
13 |
14 |
15 | def MaxBasket(self):
16 | """Compute max(n) for node n"""
17 | return unary(self, "max({})".format(self._name_no_id), (lambda x: max(x)))
18 |
19 |
20 | def MinBasket(self):
21 | """Compute max(n) for node n"""
22 | return unary(self, "min({})".format(self._name_no_id), (lambda x: min(x)))
23 |
24 |
25 | def SumBasket(self):
26 | """Compute sum(n) for node n"""
27 | return unary(self, "sum({})".format(self._name_no_id), (lambda x: sum(x)))
28 |
29 |
30 | def AverageBasket(self):
31 | """Compute mean(n) for node n"""
32 | return unary(
33 | self,
34 | "average({})".format(self._name_no_id),
35 | (lambda x: sum(x) / len(x)),
36 | )
37 |
38 |
39 | MeanBasket = AverageBasket
40 |
41 | Node.__len__ = Len
42 | Node.len = Len
43 | Node.countBasket = CountBasket
44 | Node.minBasket = MinBasket
45 | Node.maxBasket = MaxBasket
46 | Node.sumBasket = SumBasket
47 | Node.averageBasket = AverageBasket
48 | Node.meanBasket = MeanBasket
49 |
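A short sketch of the basket operations on a list-valued lazy node, using the same entry points as the tests earlier in this listing:

```python
import tributary.lazy as tl

prices = tl.Node(value=[3, 1, 2])

assert tl.Len(prices)() == 3
assert tl.MaxBasket(prices)() == 3
assert tl.MinBasket(prices)() == 1
assert tl.AverageBasket(prices)() == 2

# the same operations are also attached as Node methods
assert prices.sumBasket()() == 6
```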
--------------------------------------------------------------------------------
/tributary/streaming/output/postgres.py:
--------------------------------------------------------------------------------
1 | import asyncpg
2 | from .output import Func
3 | from ..node import Node
4 |
5 |
6 | class Postgres(Func):
7 | """Connects to Postgres and executes queries
8 |
9 | Args:
10 | node (Node): input tributary
11 | user (str): postgres user
12 | password (str): postgres password
13 | database (str): postgres database
14 | host (str): postgres host
15 |         query_parser (callable): convert input node data into a list of SQL queries
16 | """
17 |
18 | def __init__(self, node, user, password, database, host, query_parser):
19 | async def _send(
20 | data,
21 | query_parser=query_parser,
22 | user=user,
23 | password=password,
24 | database=database,
25 | host=host,
26 | ):
27 | conn = await asyncpg.connect(
28 | user=user,
29 | password=password,
30 | database=database,
31 | host=host.split(":")[0],
32 | port=host.split(":")[1],
33 | )
34 | queries = query_parser(data)
35 | for q in queries:
36 | await conn.execute(q)
37 |
38 | await conn.close()
39 | return data
40 |
41 | super().__init__(func=_send, name="PostgresSink", inputs=1)
42 | node >> self
43 |
44 |
45 | Node.postgres = Postgres
46 |
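A minimal sketch of the sink; the table name, credentials and host are placeholders for illustration (the docker-compose file later in this listing uses password `test`):

```python
import tributary.streaming as ts
from tributary.streaming.output.postgres import Postgres as PostgresSink

def to_queries(value):
    # hypothetical table "ticks" -- adapt to your schema
    return ["INSERT INTO ticks (value) VALUES ({})".format(value)]

source = ts.Curve([1, 2, 3])
sink = PostgresSink(source, user="postgres", password="test",
                    database="postgres", host="localhost:5432",
                    query_parser=to_queries)
ts.run(sink)  # executes one INSERT per streamed value
```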
--------------------------------------------------------------------------------
/tributary/lazy/control/control.py:
--------------------------------------------------------------------------------
1 | from .utils import _CONTROL_GRAPHVIZSHAPE
2 | from ..node import Node
3 | from ...base import TributaryException
4 |
5 |
6 | def If(if_node, satisfied_node, unsatisfied_node=None, *elseifs):
7 | """Node to return satisfied if if_node else unsatisfied
8 |
9 | Args:
10 | if_node (Node): input booleans
11 |         satisfied_node (Node): input to return if True
12 |         unsatisfied_node (Node): input to return if False
13 | elseifs (Tuple(Node)): input of boolean/value pairs to evaluate as else if statements
14 | """
15 | if len(elseifs) % 2 != 0:
16 | raise TributaryException("Else ifs must be in pairs")
17 |
18 | if elseifs:
19 | # TODO else ifs
20 | raise NotImplementedError()
21 |
22 | def func(cond, if_, else_=None):
23 | # TODO else ifs
24 | if cond:
25 | return if_
26 | return else_ if else_ is not None else None
27 |
28 | if isinstance(if_node._self_reference, Node):
29 | return if_node._gennode(
30 | "If",
31 | func,
32 | [if_node, satisfied_node, unsatisfied_node],
33 | graphvizshape=_CONTROL_GRAPHVIZSHAPE,
34 | )
35 | return if_node._gennode(
36 | "If",
37 | func,
38 | [if_node, satisfied_node, unsatisfied_node],
39 | graphvizshape=_CONTROL_GRAPHVIZSHAPE,
40 | )
41 |
42 |
43 | Node.if_ = If
44 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/output/test_http_streaming.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import pytest
3 | import requests
4 | import time
5 | import tributary.streaming as ts
6 |
7 |
8 | class TestHttp:
9 | def setup_method(self):
10 | time.sleep(0.5)
11 |
12 | @pytest.mark.skipif("int(os.environ.get('TRIBUTARY_SKIP_DOCKER_TESTS', '1'))")
13 | def test_http(self):
14 | """Test http server"""
15 |
16 | def func():
17 | yield "x"
18 | yield "y"
19 | yield "z"
20 |
21 | out = ts.HTTPSink(ts.Func(func), url="http://localhost:8080")
22 | assert len(ts.run(out)) == 3
23 |
24 | @pytest.mark.skipif('os.name == "nt"')
25 | def test_http_server(self):
26 | inp = ts.Random(interval=1, count=2)
27 | ss = ts.HTTPServerSink(inp, json=True, port=12346)
28 | w = ts.Window(ss)
29 | out = ts.run(w, blocking=False)
30 |
31 | try:
32 | time.sleep(1.5)
33 | resp = requests.get("http://127.0.0.1:12346/")
34 | print(resp.json())
35 | time.sleep(1)
36 | resp = requests.get("http://127.0.0.1:12346/")
37 | print(resp.json())
38 | time.sleep(2)
39 | print(w._accum)
40 | assert len(w._accum) == 2
41 | finally:
42 | asyncio.set_event_loop(asyncio.new_event_loop())
43 | try:
44 | out.stop()
45 | except RuntimeError:
46 | pass
47 |
--------------------------------------------------------------------------------
/tributary/lazy/decorator.py:
--------------------------------------------------------------------------------
1 | from .node import Node
2 | from ..utils import _either_type
3 | from ..parser import ( # noqa: F401
4 | pprintCode,
5 | pprintAst,
6 | parseASTForMethod,
7 | getClassAttributesUsedInMethod,
8 | addAttributeDepsToMethodSignature,
9 | Transformer,
10 | )
11 |
12 |
13 | @_either_type
14 | def node(meth, dynamic=False):
15 | """Convert a method into a lazy node"""
16 |
17 | # parse out function
18 | root, mod = parseASTForMethod(meth)
19 |
20 | # grab accessed attributes
21 | ads = getClassAttributesUsedInMethod(root)
22 |
23 | # # add them to signature
24 | # addAttributeDepsToMethodSignature(root, ads)
25 |
26 | # # modify
27 | # new_root = Transformer().visit(root)
28 |
29 | # # remove decorators
30 | # new_root.decorator_list = []
31 |
32 | # # replace in module
33 | # mod.body[0] = new_root
34 |
35 | # # pprintAst(new_root)
36 | # # pprintCode(mod)
37 | # meth = exec(compile(mod, meth.__code__.co_filename, "exec"), meth.__globals__)
38 | # print(type(meth))
39 | # import ipdb; ipdb.set_trace()
40 |
41 | new_node = Node(value=meth)
42 | new_node._nodes_to_bind = ads
43 |
44 | # if new_node._callable_is_method:
45 | # ret = lambda self, *args, **kwargs: new_node._bind( # noqa: E731
46 | # self, *args, **kwargs
47 | # )
48 | # else:
49 | # ret = lambda *args, **kwargs: new_node._bind( # noqa: E731
50 | # None, *args, **kwargs
51 | # )
52 | # ret._node_wrapper = new_node
53 | # return ret
54 |
55 | return new_node
56 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_utils_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as tl
2 | from datetime import datetime
3 | from time import sleep
4 |
5 |
6 | def func():
7 | yield 1
8 | yield 2
9 |
10 |
11 | class TestUtils:
12 | def test_expire(self):
13 | n = tl.Node(value=5)
14 | n2 = n + 1
15 | sec = datetime.now().second
16 | out = tl.Expire(n2, second=(sec + 2) % 60)
17 |
18 | # assert initial value
19 | assert out() == 6
20 |
21 | # set new value
22 | n.setValue(6)
23 |
24 | # continue to use old value until 2+ seconds elapsed
25 | assert out() == 6
26 |
27 | sleep(3)
28 | assert out() == 7
29 |
30 | def test_interval(self):
31 | n = tl.Node(value=5)
32 | n2 = n + 1
33 | out = tl.Interval(n2, seconds=2)
34 |
35 |         # assert initial value
36 |         assert out() == 6
37 |
38 |         # set new value
39 | n.setValue(6)
40 |
41 | # continue to use old value until 2+ seconds elapsed
42 | assert out() == 6
43 |
44 | sleep(3)
45 | assert out() == 7
46 |
47 | def test_window_any_size(self):
48 | n = tl.Window(tl.Node(value=func))
49 |
50 | assert n() == [1]
51 | assert n() == [1, 2]
52 |
53 | def test_window_fixed_size(self):
54 | n = tl.Window(tl.Node(value=func), size=2)
55 | assert n() == [1]
56 | assert n() == [1, 2]
57 |
58 | def test_window_fixed_size_full_only(self):
59 | n = tl.Window(tl.Node(value=func), size=2, full_only=True)
60 | assert n() is None
61 | assert n() == [1, 2]
62 |
--------------------------------------------------------------------------------
/tributary/streaming/output/kafka.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | from aiokafka import AIOKafkaProducer
3 | from .output import Func
4 | from ..node import Node
5 |
6 |
7 | class Kafka(Func):
8 | """Connect to kafka server and send data
9 |
10 | Args:
11 | node (Node): input tributary
12 | servers (list): kafka bootstrap servers
13 | topic (str): kafka topic to connect to
14 |         json (bool): serialize input data to json before sending
15 |         wrap (bool): wrap result in a list
16 |         producer_kwargs (dict): additional keyword arguments forwarded to AIOKafkaProducer
17 | """
18 |
19 | def __init__(
20 | self, node, servers="", topic="", json=False, wrap=False, **producer_kwargs
21 | ):
22 | async def _send(data, topic=topic, json=json, wrap=wrap):
23 | if self._producer is None:
24 | self._producer = AIOKafkaProducer(
25 | bootstrap_servers=servers, **producer_kwargs
26 | )
27 |
28 | # Get cluster layout and initial topic/partition leadership information
29 | await self._producer.start()
30 |
31 | if wrap:
32 | data = [data]
33 |
34 | if json:
35 | data = JSON.dumps(data)
36 |
37 | # Produce message
38 | await self._producer.send_and_wait(topic, data.encode("utf-8"))
39 | return data
40 |
41 | # # Wait for all pending messages to be delivered or expire.
42 | # await producer.stop()
43 | super().__init__(func=_send, name="Kafka", inputs=1)
44 | node >> self
45 | self.set("_producer", None)
46 |
47 |
48 | Node.kafka = Kafka
49 |
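A minimal sketch assuming a broker reachable at localhost:9092 and a `tributary` topic (matching the commented-out service in ci/docker-compose.yml); the class is imported from the module above and aliased for clarity:

```python
import tributary.streaming as ts
from tributary.streaming.output.kafka import Kafka as KafkaSink

data = ts.Curve([{"px": 100.0}, {"px": 100.5}])
sink = KafkaSink(data, servers="localhost:9092", topic="tributary", json=True)
ts.run(sink)  # serializes each dict to JSON and publishes it to the topic
```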
--------------------------------------------------------------------------------
/tributary/streaming/output/socketio.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | from socketIO_client_nexus import SocketIO as SIO
3 | from urllib.parse import urlparse
4 | from .output import Func
5 | from ..node import Node
6 |
7 |
8 | class SocketIO(Func):
9 | """Connect to socketIO server and send updates
10 |
11 | Args:
12 | node (Node): input stream
13 | url (str): url to connect to
14 | channel (str): socketio channel to connect through
15 | field (str): field to index result by
16 | sendinit (list): data to send on socketio connection open
17 |         json (bool): load input data as json
18 |         wrap (bool): wrap result in a list
19 |         interval (int): socketio wait interval
20 | """
21 |
22 | def __init__(
23 | self,
24 | node,
25 | url,
26 | channel="",
27 | field="",
28 | sendinit=None,
29 | json=False,
30 | wrap=False,
31 | interval=1,
32 | ):
33 | o = urlparse(url)
34 | socketIO = SIO(o.scheme + "://" + o.netloc, o.port)
35 |
36 | if sendinit:
37 | socketIO.emit(sendinit)
38 |
39 | def _sio(data, field=field, json=json, wrap=wrap, interval=interval):
40 | if json:
41 | data = JSON.loads(data)
42 |
43 | if field:
44 | data = data[field]
45 |
46 | if wrap:
47 | data = [data]
48 |
49 | socketIO.emit(data)
50 | socketIO.wait(seconds=interval)
51 | return data
52 |
53 | super().__init__(func=_sio, name="SocketIO", inputs=1)
54 | node >> self
55 |
56 |
57 | Node.socketio = SocketIO
58 |
--------------------------------------------------------------------------------
/tributary/streaming/dd3.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 |
3 | _DD3_TRANSITION_DELAY = 0.25  # delay between node-state updates so the
4 | # dagre-d3 transition is visible, i.e. not too fast
5 |
6 |
7 | def set_transition_delay(amt):
8 | global _DD3_TRANSITION_DELAY
9 | _DD3_TRANSITION_DELAY = amt
10 |
11 |
12 | class _DagreD3Mixin(object):
13 | # ***********************
14 | # Dagre D3 integration
15 | # ***********************
16 |
17 | async def _startdd3g(self):
18 | """represent start of calculation with a dd3 node"""
19 | if self._dd3g: # disable if not installed/enabled as it incurs a delay
20 | self._dd3g.setNode(self._name, tooltip=str(self._last), style="fill: #0f0")
21 | await asyncio.sleep(_DD3_TRANSITION_DELAY)
22 |
23 | async def _waitdd3g(self):
24 | """represent a node waiting for its input to tick"""
25 | if self._dd3g: # disable if not installed/enabled as it incurs a delay
26 | self._dd3g.setNode(self._name, tooltip=str(self._last), style="fill: #ff0")
27 | await asyncio.sleep(_DD3_TRANSITION_DELAY)
28 |
29 | async def _finishdd3g(self):
30 | """represent a node that has finished its calculation"""
31 | if self._dd3g: # disable if not installed/enabled as it incurs a delay
32 | self._dd3g.setNode(self._name, tooltip=str(self._last), style="fill: #f00")
33 | await asyncio.sleep(_DD3_TRANSITION_DELAY)
34 |
35 | async def _enddd3g(self):
36 | """represent a node that has finished all calculations"""
37 | if self._dd3g: # disable if not installed/enabled as it incurs a delay
38 | self._dd3g.setNode(self._name, tooltip=str(self._last), style="fill: #fff")
39 | await asyncio.sleep(_DD3_TRANSITION_DELAY)
40 |
41 | # ***********************
42 |
--------------------------------------------------------------------------------
/tributary/streaming/input/postgres.py:
--------------------------------------------------------------------------------
1 | import asyncpg
2 | import asyncio
3 | from .input import Func
4 |
5 |
6 | class Postgres(Func):
7 | """Connects to Postgres and yields result of query
8 |
9 | Args:
10 | user (str): postgres user
11 | password (str): postgres password
12 | database (str): postgres database
13 | host (str): postgres host
14 |         queries (list): list of queries to execute
15 |         interval (int): seconds to wait between query executions
16 |         repeat (int): number of times to repeat the queries (negative repeats forever)
17 | """
18 |
19 | def __init__(self, user, password, database, host, queries, repeat=1, interval=1):
20 | async def _send(
21 | queries=queries,
22 | repeat=int(repeat),
23 | interval=int(interval),
24 | user=user,
25 | password=password,
26 | database=database,
27 | host=host,
28 | ):
29 | conn = await asyncpg.connect(
30 | user=user,
31 | password=password,
32 | database=database,
33 | host=host.split(":")[0],
34 | port=host.split(":")[1],
35 | )
36 | count = 0 if repeat >= 0 else float("-inf")
37 | while count < repeat:
38 | data = []
39 | for query in queries:
40 | values = await conn.fetch(query)
41 | data.extend([list(x.items()) for x in values])
42 | yield data
43 |
44 | if interval:
45 | await asyncio.sleep(interval)
46 |
47 | if repeat >= 0:
48 | count += 1
49 |
50 | await conn.close()
51 |
52 | super().__init__(func=_send)
53 | self._name = "Postgres"
54 |
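A minimal sketch of the source; the credentials, host and query are placeholders, and the class is imported from the module above and aliased:

```python
import tributary.streaming as ts
from tributary.streaming.input.postgres import Postgres as PostgresSource

src = PostgresSource(user="postgres", password="test", database="postgres",
                     host="localhost:5432", queries=["SELECT 1 AS one"],
                     repeat=2, interval=1)
rows = ts.run(ts.Print(src, "rows:"))  # two batches of fetched rows
```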
--------------------------------------------------------------------------------
/tributary/streaming/calculations/finance.py:
--------------------------------------------------------------------------------
1 | from ..node import Node
2 | from ..utils import Reduce
3 | from ...base import StreamNone
4 |
5 |
6 | def RSI(node, period=14):
7 | """Relative Strength Index
8 |
9 | Args:
10 | node (Node): input data
11 | period (int): RSI period
12 | Returns:
13 | Node: stream of RSI calculations
14 | """
15 |
16 | def _filter(up=True):
17 | def filter(val, up=up):
18 | if val is None:
19 | return StreamNone()
20 |
21 | if up:
22 | if val > 0:
23 | return val
24 | else:
25 | return 0
26 | if val < 0:
27 | return abs(val)
28 | return 0
29 |
30 | return filter
31 |
32 | diff = node.diff()
33 |
34 | ups = diff.apply(_filter(up=True)).ema(
35 | window_width=period, alpha=1 / period, adjust=True
36 | )
37 | downs = diff.apply(_filter(up=False)).ema(
38 | window_width=period, alpha=1 / period, adjust=True
39 | )
40 |
41 | RS = ups / downs
42 |
43 | rsi = 100 - (100 / (1 + RS))
44 | return rsi.abs()
45 |
46 |
47 | def MACD(node, period_fast=12, period_slow=26, signal=9):
48 | """Moving Average Convergence/Divergence
49 |
50 | Args:
51 | node (Node): input data
52 | period_fast (int): Fast moving average period
53 | period_slow (int): Slow moving average period
54 | signal (int): MACD moving average period
55 | Returns:
56 | Node: node that emits tuple of (macd, macd_signal)
57 | """
58 | fast = node.ema(window_width=period_fast)
59 | slow = node.ema(window_width=period_slow)
60 | macd = fast - slow
61 | signal = macd.ema(window_width=signal)
62 |
63 | return Reduce(macd, signal)
64 |
65 |
66 | Node.rsi = RSI
67 | Node.macd = MACD
68 |
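A short sketch of both indicators on a synthetic price curve; the numbers are illustrative only, and the output shapes follow the streaming finance tests earlier in this listing:

```python
import tributary.streaming as ts

prices = [100.0, 101.2, 100.7, 102.4, 103.1, 102.0, 104.5, 105.2,
          104.8, 106.0, 105.5, 107.1, 106.4, 108.0, 107.2, 109.3]

rsi_values = ts.run(ts.Print(ts.RSI(ts.Curve(prices), period=14), "rsi:"))
macd_values = ts.run(ts.MACD(ts.Curve(prices)))  # list of (macd, signal_line) pairs
```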
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing
2 |
3 | Thank you for your interest in contributing to tributary!
4 |
5 | ## Reporting bugs, feature requests, etc.
6 |
7 | To report bugs, request new features, or similar, please open an issue on the GitHub
8 | repository.
9 |
10 | A good bug report includes:
11 |
12 | - Expected behavior
13 | - Actual behavior
14 | - Steps to reproduce (preferably as minimal as possible)
15 |
16 | ## Minor changes, typos etc.
17 |
18 | Minor changes can be contributed by navigating to the relevant files on the GitHub repository
19 | and clicking the "edit file" icon. By following the instructions on the page, you should be able to
20 | create a pull-request proposing your changes. A repository maintainer will then review your changes
21 | and either merge them, propose modifications, or reject them (with a reason for
22 | the rejection).
23 |
24 | ## Setting up a development environment
25 |
26 | If you want to help resolve an issue by making some changes that are larger than that covered by the above paragraph, it is recommended that you:
27 |
28 | - Fork the repository on Github
29 | - Clone your fork to your computer
30 | - Run the following commands inside the cloned repository:
31 | - `pip install -e .[dev]` - This will install the Python package in development
32 | mode.
33 | - Validate the install by running the tests:
34 | - `py.test` - This command will run the Python tests.
35 | - `flake8 tributary` - This command will run the Python linters.
36 |
37 | Once you have such a development setup, you should:
38 |
39 | - Make the changes you consider necessary
40 | - Run the tests to ensure that your changes do not break anything
41 | - If you add new code, preferably write one or more tests to check that your code works as expected.
42 | - Commit your changes and publish the branch to your GitHub repo.
43 | - Open a pull-request (PR) back to the main repo on GitHub.
44 |
--------------------------------------------------------------------------------
/tributary/streaming/input/socketio.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | from socketIO_client_nexus import SocketIO as SIO
3 | from urllib.parse import urlparse
4 |
5 | from .input import Func
6 |
7 |
8 | class SocketIO(Func):
9 | """Connect to socketIO server and yield back results
10 |
11 | Args:
12 | url (str): url to connect to
13 | channel (str): socketio channel to connect through
14 | field (str): field to index result by
15 | sendinit (list): data to send on socketio connection open
16 |         json (bool): load socketio data as json
17 |         wrap (bool): wrap result in a list
18 |         interval (int): socketio wait interval
19 | """
20 |
21 | def __init__(
22 | self,
23 | url,
24 | channel="",
25 | field="",
26 | sendinit=None,
27 | json=False,
28 | wrap=False,
29 | interval=1,
30 | ):
31 | o = urlparse(url)
32 | socketIO = SIO(o.scheme + "://" + o.netloc, o.port)
33 | if sendinit:
34 | socketIO.emit(sendinit)
35 |
36 | async def _sio(
37 | url=url,
38 | channel=channel,
39 | field=field,
40 | json=json,
41 | wrap=wrap,
42 | interval=interval,
43 | ):
44 | while True:
45 | _data = []
46 | socketIO.on(channel, lambda data: _data.append(data))
47 | socketIO.wait(seconds=interval)
48 | for msg in _data:
49 | # FIXME clear _data
50 | if json:
51 | msg = JSON.loads(msg)
52 |
53 | if field:
54 | msg = msg[field]
55 |
56 | if wrap:
57 | msg = [msg]
58 |
59 | yield msg
60 |
61 | super().__init__(func=_sio)
62 | self._name = "SocketIO"
63 |
--------------------------------------------------------------------------------
/tributary/incubating/ast_experiments.py:
--------------------------------------------------------------------------------
1 | import random
2 |
3 | import tributary.lazy as t
4 |
5 | from tributary.parser import (
6 | parseASTForMethod,
7 | pprintAst,
8 | getClassAttributesUsedInMethod,
9 | addAttributeDepsToMethodSignature,
10 | Transformer,
11 | pprintCode,
12 | )
13 |
14 | blerg = t.Node(value=5)
15 |
16 |
17 | class Func4(t.LazyGraph):
18 | @t.node()
19 | def func1(self):
20 | return self.func2() + 1
21 |
22 | @t.node(dynamic=True)
23 | def func2(self):
24 | return random.random()
25 |
26 |
27 | class Func5(t.LazyGraph):
28 | def __init__(self):
29 | super().__init__()
30 | self.x = self.node(name="x", value=None)
31 |
32 | def reset(self):
33 | self.x = None
34 |
35 | @t.node()
36 | def z(self):
37 | return self.x() | self.y() | blerg
38 |
39 | def zz(self):
40 | return self.x | self.y()
41 |
42 | @t.node()
43 | def y(self):
44 | return 10
45 |
46 |
47 | root, mod = parseASTForMethod(Func5.zz)
48 | pprintAst(root)
49 |
50 | ads = getClassAttributesUsedInMethod(root)
51 | # print("attribute deps:", ads)
52 | # print("***Args***")
53 | # print(root.args.posonlyargs)
54 | # print(root.args.args)
55 | # print(root.args.kwonlyargs)
56 | # print(root.args.kw_defaults)
57 | # print(root.args.defaults)
58 | # print("***")
59 |
60 | addAttributeDepsToMethodSignature(root, ads)
61 |
62 | new_root = Transformer().visit(root)
63 | mod.body[0] = new_root
64 | pprintCode(mod)
65 |
66 | # names = sorted({node.id for node in ast.walk(root) if isinstance(node, ast.Name)})
67 | # print(names)
68 |
69 | # f = Func5()
70 | # print(f.z.graph())
71 | # print(f.z())
72 | # # assert f.z()() == 10
73 | # # assert f.x() is None
74 |
75 | # # f.x = 5
76 |
77 | # # assert f.x() == 5
78 | # # assert f.z()() == 5
79 |
80 | # # f.reset()
81 |
82 | # # assert f.x() is None
83 | # # assert f.z()() == 10
84 |
--------------------------------------------------------------------------------
/tributary/streaming/input/kafka.py:
--------------------------------------------------------------------------------
1 | import json as JSON
2 | from .input import Func
3 |
4 |
5 | class Kafka(Func):
6 | """Connect to kafka server and yield back results
7 |
8 | Args:
9 | servers (list): kafka bootstrap servers
10 | group (str): kafka group id
11 | topics (list): list of kafka topics to connect to
12 | json (bool): load input data as json
13 | wrap (bool): wrap result in a list
14 | interval (int): kafka poll interval
15 | """
16 |
17 | def __init__(
18 | self,
19 | servers,
20 | group,
21 | topics,
22 | json=False,
23 | wrap=False,
24 | interval=1,
25 | **consumer_kwargs
26 | ):
27 | from aiokafka import AIOKafkaConsumer
28 |
29 | self._consumer = None
30 |
31 | if not isinstance(topics, (list, tuple)):
32 | topics = [topics]
33 |
34 | async def _listen(
35 | servers=servers,
36 | group=group,
37 | topics=topics,
38 | json=json,
39 | wrap=wrap,
40 | interval=interval,
41 | ):
42 | if self._consumer is None:
43 | self._consumer = AIOKafkaConsumer(
44 | *topics,
45 | bootstrap_servers=servers,
46 | group_id=group,
47 | **consumer_kwargs
48 | )
49 |
50 |             # Get cluster layout and join the configured consumer group
51 | await self._consumer.start()
52 |
53 | async for msg in self._consumer:
54 | # Consume messages
55 | # msg.topic, msg.partition, msg.offset, msg.key, msg.value, msg.timestamp
56 |
57 | if json:
58 | msg.value = JSON.loads(msg.value)
59 | if wrap:
60 | msg.value = [msg.value]
61 | yield msg
62 |
63 | # Will leave consumer group; perform autocommit if enabled.
64 | await self._consumer.stop()
65 |
66 | super().__init__(func=_listen)
67 | self._name = "Kafka"
68 |
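A minimal consumption sketch assuming a broker at localhost:9092 and a `tributary` topic; the group id is arbitrary. Because the consumer loop is open-ended, the non-blocking run/stop pattern from the HTTP tests earlier in this listing is reused (the stop call may need the same RuntimeError guard):

```python
import tributary.streaming as ts
from tributary.streaming.input.kafka import Kafka as KafkaSource

msgs = KafkaSource(servers="localhost:9092", group="demo-group",
                   topics="tributary", json=True)
out = ts.run(ts.Print(msgs, "kafka:"), blocking=False)

# ... consume for a while, then tear the graph down
out.stop()
```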
--------------------------------------------------------------------------------
/tributary/tests/streaming/calculations/test_finance_streaming.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | import superstore
3 | import tributary.streaming as ts
4 |
5 |
6 | class TestFinance:
7 | def test_rsi(self):
8 | vals = pd.DataFrame(superstore.getTimeSeriesData(20))
9 | adjust = False
10 | period = 14
11 | delta = vals["A"].diff().shift(-1)
12 | up, down = delta.copy(), delta.copy()
13 | up[up < 0] = 0
14 | down[down > 0] = 0
15 | _gain = up.ewm(alpha=1.0 / period, adjust=adjust).mean()
16 | _loss = down.abs().ewm(alpha=1.0 / period, adjust=adjust).mean()
17 | RS = _gain / _loss
18 | rsi = pd.Series(100 - (100 / (1 + RS))).values
19 |
20 | curve = ts.Curve(vals["A"].tolist())
21 | ret = ts.run(ts.Print(ts.RSI(curve, period=period), "rsi:"))
22 | for x, y in zip(ret, rsi):
23 | if pd.isna(y):
24 | continue
25 | print("({}, {})".format(x, y))
26 | assert abs(x - y) < 0.001
27 |
28 | def test_macd(self):
29 | vals = pd.DataFrame(superstore.getTimeSeriesData(20))
30 | period_fast = 12
31 | period_slow = 26
32 | signal = 9
33 | adjust = False
34 |
35 | EMA_fast = pd.Series(
36 | vals["A"].ewm(ignore_na=False, span=period_fast, adjust=adjust).mean(),
37 | name="EMA_fast",
38 | )
39 | EMA_slow = pd.Series(
40 | vals["A"].ewm(ignore_na=False, span=period_slow, adjust=adjust).mean(),
41 | name="EMA_slow",
42 | )
43 | MACD = pd.Series(EMA_fast - EMA_slow, name="MACD")
44 | MACD_signal = pd.Series(
45 | MACD.ewm(ignore_na=False, span=signal, adjust=adjust).mean(), name="SIGNAL"
46 | )
47 |
48 | expected = pd.concat([MACD, MACD_signal], axis=1).values
49 |
50 | curve = ts.Curve(vals["A"].tolist())
51 | ret = ts.run(ts.MACD(curve).print("macd:"))
52 |
53 | for i, (macd, signal) in enumerate(ret):
54 | assert abs(expected[i][0] - macd) < 0.001
55 | assert abs(expected[i][1] - signal) < 0.001
56 |
--------------------------------------------------------------------------------
/.github/workflows/build.yml:
--------------------------------------------------------------------------------
1 | name: Build Status
2 |
3 | on:
4 | push:
5 | branches:
6 | - main
7 | tags:
8 | - v*
9 | paths-ignore:
10 | - docs/
11 | - CATALOG.md
12 | - CONTRIBUTING.md
13 | - LICENSE
14 | - README.md
15 | pull_request:
16 | workflow_dispatch:
17 |
18 | concurrency:
19 | group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
20 | cancel-in-progress: true
21 |
22 | permissions:
23 | checks: write
24 | pull-requests: write
25 |
26 | jobs:
27 | build:
28 | runs-on: ${{ matrix.os }}
29 |
30 | strategy:
31 | matrix:
32 | os: [ubuntu-latest, macos-latest, windows-latest]
33 | python-version: [3.9]
34 | event-name: [push]
35 |
36 | steps:
37 | - uses: actions/checkout@v6
38 |
39 | - name: Set up Python ${{ matrix.python-version }}
40 | uses: actions/setup-python@v6
41 | with:
42 | python-version: ${{ matrix.python-version }}
43 | cache: 'pip'
44 | cache-dependency-path: 'setup.py'
45 |
46 | - name: Install dependencies
47 | run: |
48 | make develop
49 |
50 | - name: Lint
51 | run: |
52 | make lint
53 |
54 | - name: Checks
55 | run: |
56 | make checks
57 | if: ${{ github.event_name == matrix.event-name || matrix.os == 'ubuntu-latest' }}
58 |
59 | - name: Test
60 | run: |
61 | make testsci
62 | if: ${{ matrix.os != 'ubuntu-latest' }}
63 |
64 | - name: Test
65 | run: |
66 | make dockerup && make testsci && make dockerdown
67 | if: ${{ matrix.os == 'ubuntu-latest' }}
68 |
69 | - name: Upload test results
70 | uses: actions/upload-artifact@v5
71 | with:
72 | name: pytest-results-${{ matrix.os }}-${{ matrix.python-version }}
73 | path: python_junit.xml
74 | if: ${{ always() }}
75 |
76 | - name: Publish Unit Test Results
77 | uses: EnricoMi/publish-unit-test-result-action@v2
78 | with:
79 | files: python_junit.xml
80 | if: ${{ matrix.os == 'ubuntu-latest' }}
81 |
82 | - name: Upload coverage
83 | uses: codecov/codecov-action@v5
84 |
85 | - name: Twine check
86 | run: |
87 | make dist
88 |
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | build: ## Build the repository
2 | python setup.py build
3 |
4 | develop: ## install to site-packages in editable mode
5 | python -m pip install --upgrade build pip setuptools twine wheel
6 | python -m pip install -e .[develop]
7 |
8 | tests: ## run the unit tests
9 | python -m pytest tributary --cov=tributary --junitxml=python_junit.xml --cov-report=xml --cov-branch
10 |
11 | testsci: ## run the unit tests with CI services enabled
12 | CI=true python -m pytest tributary --cov=tributary --junitxml=python_junit.xml --cov-report=xml --cov-branch
13 |
14 | dockerup:
15 | docker compose -f ci/docker-compose.yml up -d
16 |
17 | dockerdown:
18 | docker compose -f ci/docker-compose.yml down
19 |
20 | notebooks: ## test execute the notebooks
21 | ./scripts/test_notebooks.sh
22 |
23 | lint: ## run linter
24 | python -m flake8 tributary setup.py docs/conf.py
25 |
26 | fix: ## run black fix
27 | python -m black tributary/ setup.py docs/conf.py
28 |
29 | check: checks
30 | checks: ## run lint and other checks
31 | check-manifest
32 |
33 | clean: ## clean the repository
34 | find . -name "__pycache__" | xargs rm -rf
35 | find . -name "*.pyc" | xargs rm -rf
36 | find . -name ".ipynb_checkpoints" | xargs rm -rf
37 | rm -rf .coverage coverage *.xml build dist *.egg-info lib node_modules .pytest_cache *.egg-info .autoversion .mypy_cache
38 | rm -rf ./*.gv*
39 | make -C ./docs clean
40 |
41 | install: ## install to site-packages
42 | python -m pip install .
43 |
44 | docs: ## make documentation
45 | make -C ./docs html
46 | open ./docs/_build/html/index.html
47 |
48 | dist: ## create dists
49 | rm -rf dist build
50 | python setup.py sdist bdist_wheel
51 | python -m twine check dist/*
52 |
53 | publish: dist ## dist to pypi
54 | python -m twine upload dist/* --skip-existing
55 |
56 | # Thanks to Francoise at marmelab.com for this
57 | .DEFAULT_GOAL := help
58 | help:
59 | @grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
60 |
61 | print-%:
62 | @echo '$*=$($*)'
63 |
64 | .PHONY: testjs testpy tests test lintpy lintjs lint fixpy fixjs fix checks check build develop install labextension dist publishpy publishjs publish docs clean dockerup dockerdown
65 |
--------------------------------------------------------------------------------
/examples/streaming_dagred3.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary.streaming as ts\n",
10 | "import asyncio, random\n",
11 | "\n",
12 | "def func():\n",
13 | " return random.random()\n",
14 | "\n",
15 | "async def long():\n",
16 | " await asyncio.sleep(1)\n",
17 | " return 5\n",
18 | " \n",
19 | "rand = ts.Timer(func, interval=0, count=5)\n",
20 | "five = ts.Timer(long, interval=0, count=5)\n",
21 | "one = ts.Const(1)\n",
22 | "five2 = ts.Const(5)\n",
23 | "neg_rand = ts.Negate(rand)\n",
24 | "\n",
25 | "\n",
26 | "x1 = rand + five # 5 + rand\n",
27 | "x2 = x1 - five2 # rand\n",
28 | "x3 = x2 + neg_rand # 0\n",
29 | "res = x3 + one # 1"
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "execution_count": 2,
35 | "metadata": {},
36 | "outputs": [
37 | {
38 | "data": {
39 | "application/vnd.jupyter.widget-view+json": {
40 | "model_id": "99269c9b08cc453f9c6a73baf3438af8",
41 | "version_major": 2,
42 | "version_minor": 0
43 | },
44 | "text/plain": [
45 | "DagreD3Widget()"
46 | ]
47 | },
48 | "metadata": {},
49 | "output_type": "display_data"
50 | }
51 | ],
52 | "source": [
53 | "res.dagre()"
54 | ]
55 | },
56 | {
57 | "cell_type": "code",
58 | "execution_count": 3,
59 | "metadata": {},
60 | "outputs": [],
61 | "source": [
62 | "x = ts.run(res)"
63 | ]
64 | },
65 | {
66 | "cell_type": "code",
67 | "execution_count": null,
68 | "metadata": {},
69 | "outputs": [],
70 | "source": []
71 | }
72 | ],
73 | "metadata": {
74 | "kernelspec": {
75 | "display_name": "Python 3",
76 | "language": "python",
77 | "name": "python3"
78 | },
79 | "language_info": {
80 | "codemirror_mode": {
81 | "name": "ipython",
82 | "version": 3
83 | },
84 | "file_extension": ".py",
85 | "mimetype": "text/x-python",
86 | "name": "python",
87 | "nbconvert_exporter": "python",
88 | "pygments_lexer": "ipython3",
89 | "version": "3.7.5"
90 | }
91 | },
92 | "nbformat": 4,
93 | "nbformat_minor": 4
94 | }
95 |
--------------------------------------------------------------------------------
/tributary/tests/utils/test_extract_parameters.py:
--------------------------------------------------------------------------------
1 | import inspect
2 | from tributary.utils import extractParameters, Parameter
3 |
4 |
5 | def func(a, b=1, *c, **d):
6 | pass
7 |
8 |
9 | class Test:
10 | def meth(self, a, b=1, *c, **d):
11 | pass
12 |
13 |
14 | class TestLazyToStreaming:
15 | def test_function_parses(self):
16 | params = extractParameters(func)
17 | assert len(params) == 4
18 |
19 | def test_method_parses(self):
20 | t = Test()
21 | params = extractParameters(t.meth)
22 | assert len(params) == 4
23 |
24 | def test_function_all_are_parameters(self):
25 | params = extractParameters(func)
26 | assert isinstance(params[0], Parameter)
27 | assert isinstance(params[1], Parameter)
28 | assert isinstance(params[2], Parameter)
29 | assert isinstance(params[3], Parameter)
30 |
31 | def test_method_all_are_parameters(self):
32 | t = Test()
33 | params = extractParameters(t.meth)
34 | assert isinstance(params[0], Parameter)
35 | assert isinstance(params[1], Parameter)
36 | assert isinstance(params[2], Parameter)
37 | assert isinstance(params[3], Parameter)
38 |
39 | def test_function_names(self):
40 | params = extractParameters(func)
41 | assert params[0].name == "a"
42 | assert params[1].name == "b"
43 | assert params[2].name == "c"
44 | assert params[3].name == "d"
45 |
46 | def test_method_names(self):
47 | t = Test()
48 | params = extractParameters(t.meth)
49 | assert params[0].name == "a"
50 | assert params[1].name == "b"
51 | assert params[2].name == "c"
52 | assert params[3].name == "d"
53 |
54 | def test_function_defaults(self):
55 | params = extractParameters(func)
56 | assert params[0].default == inspect._empty
57 | assert params[1].default == 1
58 | assert params[2].default == ()
59 | assert params[3].default == {}
60 |
61 | def test_method_defaults(self):
62 | t = Test()
63 | params = extractParameters(t.meth)
64 | assert params[0].default == inspect._empty
65 | assert params[1].default == 1
66 | assert params[2].default == ()
67 | assert params[3].default == {}
68 |
--------------------------------------------------------------------------------
/examples/autodifferentiation.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "# Lazy\n",
10 | "import tributary.lazy as tl\n",
11 | "\n",
12 | "\n",
13 | "# f = 3x^2 + 2x\n",
14 | "# f' = 6x + 2\n",
15 | "x1 = tl.Node(value=(5,1), use_dual=True)\n",
16 | "x2 = tl.Node(value=(6,1), use_dual=True)\n",
17 | "x3 = tl.Node(value=(7,1), use_dual=True)\n",
18 | "\n",
19 | "out1 = tl.Add(tl.Mult(tl.Pow(x1, 2), (3,0)), tl.Mult(x1, (2,0)))\n",
20 | "out2 = tl.Add(tl.Mult(tl.Pow(x2, 2), (3,0)), tl.Mult(x2, (2,0)))\n",
21 | "out3 = tl.Add(tl.Mult(tl.Pow(x3, 2), (3,0)), tl.Mult(x3, (2,0)))\n",
22 | "\n",
23 | "assert [out1()[1], out2()[1], out3()[1]] == [6*x+2 for x in [5, 6, 7]]\n"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "execution_count": 2,
29 | "metadata": {},
30 | "outputs": [],
31 | "source": [
32 | "# Streaming\n",
33 | "import asyncio\n",
34 | "import tributary.streaming as ts\n",
35 | "import math\n",
36 | "\n",
37 | "# f = sin(x) + x^2\n",
38 | "# f' = cos(x) + 2x\n",
39 | "\n",
40 | "rng = range(-10,11)\n",
41 | "def func():\n",
42 | " for _ in rng:\n",
43 | " yield(_,1)\n",
44 | "\n",
45 | "x = ts.Timer(func, count=len(rng), use_dual=True)\n",
46 | "out = ts.Add(ts.Sin(x), ts.Pow(x,2))\n",
47 | "result = ts.run(out)\n",
48 | "\n",
49 | "while not result.done():\n",
50 | " await asyncio.sleep(1)\n",
51 | "\n",
52 | "assert [x[1] for x in result.result()] == [math.cos(_) + 2*_ for _ in rng]\n"
53 | ]
54 | }
55 | ],
56 | "metadata": {
57 | "kernelspec": {
58 | "display_name": "Python 3",
59 | "language": "python",
60 | "name": "python3"
61 | },
62 | "language_info": {
63 | "codemirror_mode": {
64 | "name": "ipython",
65 | "version": 3
66 | },
67 | "file_extension": ".py",
68 | "mimetype": "text/x-python",
69 | "name": "python",
70 | "nbconvert_exporter": "python",
71 | "pygments_lexer": "ipython3",
72 | "version": "3.7.7"
73 | },
74 | "widgets": {
75 | "application/vnd.jupyter.widget-state+json": {
76 | "state": {},
77 | "version_major": 2,
78 | "version_minor": 0
79 | }
80 | }
81 | },
82 | "nbformat": 4,
83 | "nbformat_minor": 4
84 | }
85 |
--------------------------------------------------------------------------------
/tributary/tests/utils/test_lazy_to_streaming.py:
--------------------------------------------------------------------------------
1 | import time
2 | import tributary.lazy as tl
3 | import tributary.streaming as ts
4 | import tributary as t
5 |
6 |
7 | class TestLazyToStreaming:
8 | def setup_method(self):
9 | time.sleep(0.5)
10 |
11 | def test_function(self):
12 | def func(*args):
13 | for _ in range(5):
14 | yield _
15 |
16 | lazy_node = tl.Node(value=func) + 5
17 | # 5 6 7 8 9
18 |
19 | streaming_node = ts.Print(ts.Func(func) - 5, "streaming:")
20 | # -5 -4 -3 -2 -1
21 |
22 | out = ts.Print(t.LazyToStreaming(lazy_node), "lazy:") + streaming_node
23 | x = ts.run(out)
24 |
25 | # 0 2 4 6 8
26 | print(x)
27 | assert x == [0, 2, 4, 6, 8]
28 |
29 | def test_function_order(self):
30 | def func(*args):
31 | for _ in range(5):
32 | yield _
33 |
34 | lazy_node = tl.Node(value=func) + 5
35 | # 5 6 7 8 9
36 |
37 | streaming_node = ts.Print(ts.Func(func) - 5, "streaming:")
38 | # -5 -4 -3 -2 -1
39 |
40 | out = streaming_node + ts.Print(lazy_node, "lazy:")
41 | x = ts.run(out)
42 |
43 | # 0 2 4 6 8
44 | print(x)
45 | assert x == [0, 2, 4, 6, 8]
46 |
47 | def test_value(self):
48 | def func(*args):
49 | for _ in range(5):
50 | lazy_node.setValue(_ + 1)
51 | yield _
52 |
53 | lazy_node = tl.Node(value=0)
54 | # 5 6 7 8 9
55 |
56 | streaming_node = ts.Print(ts.Func(func) - 5, "streaming:")
57 | # -5 -4 -3 -2 -1
58 |
59 | out = ts.Print(t.LazyToStreaming(lazy_node) + 5, "lazy:") + streaming_node
60 | x = ts.run(out)
61 | # 0 2 4 6 8
62 | print(x)
63 | assert x == [0, 2, 4, 6, 8]
64 |
65 | def test_value_order(self):
66 | lazy_node = tl.Node(value=0)
67 | # 5 6 7 8 9
68 |
69 | def func(lazy_node=lazy_node):
70 | for _ in range(5):
71 | yield _
72 | lazy_node.setValue(_ + 1)
73 |
74 | streaming_node = ts.Print(ts.Func(func) - 5, "streaming:")
75 | # -5 -4 -3 -2 -1
76 |
77 | out = streaming_node + ts.Print(lazy_node + 5, "lazy:")
78 | x = ts.run(out)
79 | # 0 2 4 6 8
80 | print(x)
81 | assert x == [0, 2, 4, 6, 8]
82 |
--------------------------------------------------------------------------------
/tributary/base.py:
--------------------------------------------------------------------------------
1 | class TributaryException(Exception):
2 | pass
3 |
4 |
5 | class BaseNodeValue:
6 | def all_bin_ops(self, other):
7 | return self
8 |
9 | def all_un_ops(self):
10 | return self
11 |
12 | __add__ = all_bin_ops
13 | __radd__ = all_bin_ops
14 | __sub__ = all_bin_ops
15 | __rsub__ = all_bin_ops
16 | __mul__ = all_bin_ops
17 | __rmul__ = all_bin_ops
18 | __div__ = all_bin_ops
19 | __rdiv__ = all_bin_ops
20 | __truediv__ = all_bin_ops
21 | __rtruediv__ = all_bin_ops
22 | __pow__ = all_bin_ops
23 | __rpow__ = all_bin_ops
24 | __mod__ = all_bin_ops
25 | __rmod__ = all_bin_ops
26 | __and__ = all_bin_ops
27 | __or__ = all_bin_ops
28 | __invert__ = all_un_ops
29 |
30 | def __bool__(self):
31 | return False
32 |
33 | def int(self):
34 | return 0
35 |
36 | def float(self):
37 | return 0
38 |
39 | __lt__ = all_bin_ops
40 | __le__ = all_bin_ops
41 | __gt__ = all_bin_ops
42 | __ge__ = all_bin_ops
43 |
44 | def __eq__(self, other):
45 | if isinstance(other, self.__class__):
46 | return True
47 | return False
48 |
49 | __ne__ = all_bin_ops
50 | __neg__ = all_un_ops
51 | __nonzero__ = all_un_ops
52 | __len__ = all_un_ops
53 |
54 | def __repr__(self) -> str:
55 | return self.__class__.__name__
56 |
57 |
58 | class StreamEnd(BaseNodeValue):
59 | """Indicates that a stream has nothing left in it"""
60 |
61 | instance = None
62 |
63 | def __new__(cls):
64 | if not StreamEnd.instance:
65 | StreamEnd.instance = super().__new__(cls)
66 | return StreamEnd.instance
67 | return StreamEnd.instance
68 |
69 |
70 | class StreamRepeat(BaseNodeValue):
71 | """Indicates that a stream has a gap, this object should be ignored
72 | and the previous action repeated"""
73 |
74 | instance = None
75 |
76 | def __new__(cls):
77 | if not StreamRepeat.instance:
78 | StreamRepeat.instance = super().__new__(cls)
79 | return StreamRepeat.instance
80 | return StreamRepeat.instance
81 |
82 |
83 | class StreamNone(BaseNodeValue):
84 | """indicates that a stream does not have a value"""
85 |
86 | instance = None
87 |
88 | def __new__(cls):
89 | if not StreamNone.instance:
90 | StreamNone.instance = super().__new__(cls)
91 | return StreamNone.instance
92 | return StreamNone.instance
93 |
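The sentinel classes are small enough to demonstrate directly; a short sketch of their singleton and absorbing behavior:

```python
from tributary.base import StreamEnd, StreamNone, StreamRepeat

# each sentinel is a process-wide singleton
assert StreamNone() is StreamNone()
assert StreamEnd() is StreamEnd()

# sentinels are falsy and absorb arithmetic, returning themselves
assert not StreamNone()
assert (StreamNone() + 5) is StreamNone()
assert (2 * StreamRepeat()) is StreamRepeat()
```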
--------------------------------------------------------------------------------
/examples/lazy_excel.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary.symbolic as ts\n",
10 | "import ipysheet as ips"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": 2,
16 | "metadata": {},
17 | "outputs": [
18 | {
19 | "data": {
20 | "application/vnd.jupyter.widget-view+json": {
21 | "model_id": "0f00afa46281485e85d35b919d1658bb",
22 | "version_major": 2,
23 | "version_minor": 0
24 | },
25 | "text/plain": [
26 | "MySheet(layout=Layout(height='auto', width='auto'))"
27 | ]
28 | },
29 | "metadata": {},
30 | "output_type": "display_data"
31 | }
32 | ],
33 | "source": [
34 | "class MySheet(ips.Sheet):\n",
35 | " def __init__(self):\n",
36 | " super(MySheet, self).__init__()\n",
37 | "\n",
38 | " \n",
39 | "ms = MySheet()\n",
40 | "ms"
41 | ]
42 | },
43 | {
44 | "cell_type": "code",
45 | "execution_count": 7,
46 | "metadata": {},
47 | "outputs": [
48 | {
49 | "data": {
50 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAAwAAAAVCAYAAAByrA+0AAAACXBIWXMAAA7EAAAOxAGVKw4bAAABDElEQVQ4EY2TAQ3CQAxFbyhYwMEkDHAADgAJSMALHnAAFsABSAAcjP+6azOWkKPJz66//1972y51XZeGSCk1ng/Xzk1ERlRVdVDSBiFz5oIKgwobsTPtdPKq1hfWudbTtFLUwtXbjp/U0Jg2G44i9mOh59SE49Dw8uKvpwwvoZ5ovpUWD6EUaHYceivY4QoONGsMC+FeEFNG02DgDT2FUqCpMUyFd0mtOmewDn9oTWIb0wEnY5XCRsfAbE1JrTodHhhuwlIoxVyCMyI+3P3XF3YejdDYXchJ68Xx04R5UzfEzzUWkyvi54zbJpL54ra5EY5a5LHoX20UBjwb2V2Aiw6W9LvFvZCQUb+6fgBkNCNk1aSrKgAAAABJRU5ErkJggg==\n",
51 | "text/latex": [
52 | "$\\displaystyle \\left( \\right)$"
53 | ],
54 | "text/plain": [
55 | "()"
56 | ]
57 | },
58 | "execution_count": 7,
59 | "metadata": {},
60 | "output_type": "execute_result"
61 | }
62 | ],
63 | "source": []
64 | },
65 | {
66 | "cell_type": "code",
67 | "execution_count": null,
68 | "metadata": {},
69 | "outputs": [],
70 | "source": []
71 | }
72 | ],
73 | "metadata": {
74 | "kernelspec": {
75 | "display_name": "Python 3",
76 | "language": "python",
77 | "name": "python3"
78 | },
79 | "language_info": {
80 | "codemirror_mode": {
81 | "name": "ipython",
82 | "version": 3
83 | },
84 | "file_extension": ".py",
85 | "mimetype": "text/x-python",
86 | "name": "python",
87 | "nbconvert_exporter": "python",
88 | "pygments_lexer": "ipython3",
89 | "version": "3.7.5"
90 | }
91 | },
92 | "nbformat": 4,
93 | "nbformat_minor": 4
94 | }
95 |
--------------------------------------------------------------------------------
/examples/lazy_dagred3.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary.lazy as tl\n",
10 | "from random import random\n",
11 | "\n",
12 | "a = tl.Node(value=5)\n",
13 | "b = a + 2\n",
14 | "c = b % 4"
15 | ]
16 | },
17 | {
18 | "cell_type": "code",
19 | "execution_count": 2,
20 | "metadata": {},
21 | "outputs": [
22 | {
23 | "data": {
24 | "application/vnd.jupyter.widget-view+json": {
25 | "model_id": "9904f0a3323a4964ab423bbd5696497d",
26 | "version_major": 2,
27 | "version_minor": 0
28 | },
29 | "text/plain": [
30 | "DagreD3Widget()"
31 | ]
32 | },
33 | "metadata": {},
34 | "output_type": "display_data"
35 | }
36 | ],
37 | "source": [
38 | "c.dagre()"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": 6,
44 | "metadata": {},
45 | "outputs": [
46 | {
47 | "data": {
48 | "text/plain": [
49 | "2"
50 | ]
51 | },
52 | "execution_count": 6,
53 | "metadata": {},
54 | "output_type": "execute_result"
55 | }
56 | ],
57 | "source": [
58 | "c()"
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "execution_count": 4,
64 | "metadata": {},
65 | "outputs": [],
66 | "source": [
67 | "a.setValue(4)"
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": 5,
73 | "metadata": {},
74 | "outputs": [
75 | {
76 | "data": {
77 | "text/plain": [
78 | "True"
79 | ]
80 | },
81 | "execution_count": 5,
82 | "metadata": {},
83 | "output_type": "execute_result"
84 | }
85 | ],
86 | "source": [
87 | "c.isDirty()"
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": null,
93 | "metadata": {},
94 | "outputs": [],
95 | "source": []
96 | }
97 | ],
98 | "metadata": {
99 | "kernelspec": {
100 | "display_name": "Python 3",
101 | "language": "python",
102 | "name": "python3"
103 | },
104 | "language_info": {
105 | "codemirror_mode": {
106 | "name": "ipython",
107 | "version": 3
108 | },
109 | "file_extension": ".py",
110 | "mimetype": "text/x-python",
111 | "name": "python",
112 | "nbconvert_exporter": "python",
113 | "pygments_lexer": "ipython3",
114 | "version": "3.7.5"
115 | }
116 | },
117 | "nbformat": 4,
118 | "nbformat_minor": 4
119 | }
120 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/calculations/test_finance_lazy.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | import superstore
3 | import tributary.lazy as tl
4 |
5 |
6 | class TestFinance:
7 | def test_rsi(self):
8 | df = pd.DataFrame(superstore.getTimeSeriesData(20))
9 | adjust = False
10 | period = 14
11 | delta = df["A"].diff().shift(-1)
12 | up, down = delta.copy(), delta.copy()
13 | up[up < 0] = 0
14 | down[down > 0] = 0
15 | _gain = up.ewm(alpha=1.0 / period, adjust=adjust).mean()
16 | _loss = down.abs().ewm(alpha=1.0 / period, adjust=adjust).mean()
17 | RS = _gain / _loss
18 | rsi = pd.Series(100 - (100 / (1 + RS)))
19 |
20 | val = [df["A"].iloc[0]]
21 | n = tl.Node(value=val)
22 | n_rsi = n.rsi(period=period)
23 |
24 | for i, x in enumerate(df["A"][1:]):
25 | val.append(x)
26 | n.setDirty(True)
27 | print("data\t", i, x, n_rsi(), rsi[i])
28 | assert abs(n_rsi() - rsi[i]) < 0.003
29 |
30 | n = tl.Node(value=val)
31 | n_rsi = n.rsi(period=period, basket=True)
32 | assert n_rsi().tolist() == rsi.tolist()
33 |
34 | def test_macd(self):
35 | df = pd.DataFrame(superstore.getTimeSeriesData(20))
36 |
37 | period_fast = 12
38 | period_slow = 26
39 | signal = 9
40 | adjust = False
41 |
42 | EMA_fast = pd.Series(
43 | df["A"].ewm(ignore_na=False, span=period_fast, adjust=adjust).mean(),
44 | name="EMA_fast",
45 | )
46 | EMA_slow = pd.Series(
47 | df["A"].ewm(ignore_na=False, span=period_slow, adjust=adjust).mean(),
48 | name="EMA_slow",
49 | )
50 | MACD = pd.Series(EMA_fast - EMA_slow, name="MACD")
51 | MACD_signal = pd.Series(
52 | MACD.ewm(ignore_na=False, span=signal, adjust=adjust).mean(), name="SIGNAL"
53 | )
54 |
55 | expected = pd.concat([MACD, MACD_signal], axis=1)
56 |
57 | val = []
58 | n = tl.Node(value=val)
59 | n_macd = n.macd(period_fast=period_fast, period_slow=period_slow, signal=signal)
60 |
61 | for i, x in enumerate(df["A"]):
62 | val.append(x)
63 | n_macd.setDirty(True)
64 | ret = n_macd()
65 | assert expected.values[i][0] - ret[0] < 0.001
66 | assert expected.values[i][1] - ret[1] < 0.001
67 |
68 | n = tl.Node(value=val)
69 | n_macd = n.macd(
70 | period_fast=period_fast, period_slow=period_slow, signal=signal, basket=True
71 | )
72 |
73 | assert n_macd().tolist() == expected.values.tolist()
74 |
--------------------------------------------------------------------------------
/tributary/lazy/calculations/finance.py:
--------------------------------------------------------------------------------
1 | import pandas as pd
2 | from ..node import Node
3 |
4 |
5 | def RSI(node, period=14, basket=False):
6 | """Relative Strength Index.
7 |
8 | Args:
9 | node (Node): input node.
10 | period (int): RSI period
11 | basket (bool): given a list as input, return a list as output (as opposed to the last value)
12 | """
13 |
14 | def _rsi(data, period=period, basket=basket):
15 | delta = pd.Series(data).diff().shift(-1)
16 | up, down = delta.copy(), delta.copy()
17 | up[up < 0] = 0
18 | down[down > 0] = 0
19 | _gain = up.ewm(alpha=1.0 / period, adjust=False).mean()
20 | _loss = down.abs().ewm(alpha=1.0 / period, adjust=False).mean()
21 | RS = _gain / _loss
22 | rsi = pd.Series(100 - (100 / (1 + RS)))
23 |
24 | if basket:
25 | return rsi
26 | return rsi.iloc[-1]
27 |
28 | # make new node
29 | ret = node._gennode("RSI[{}]".format(period), _rsi, [node])
30 | return ret
31 |
32 |
33 | def MACD(node, period_fast=12, period_slow=26, signal=9, basket=False):
34 | """Calculate Moving Average Convergence/Divergence
35 |
36 | Args:
37 | node (Node): input data
38 | period_fast (int): Fast moving average period
39 | period_slow (int): Slow moving average period
40 | signal (int): MACD moving average period
41 | basket (bool): given a list as input, return a list as output (as opposed to the last value)
42 | Returns:
43 | Node: node that emits tuple of (macd, macd_signal)
44 | """
45 |
46 | def _macd(
47 | data,
48 | period_fast=period_fast,
49 | period_slow=period_slow,
50 | signal=signal,
51 | basket=basket,
52 | ):
53 | EMA_fast = pd.Series(
54 | pd.Series(data).ewm(ignore_na=False, span=period_fast, adjust=False).mean(),
55 | name="EMA_fast",
56 | )
57 | EMA_slow = pd.Series(
58 | pd.Series(data).ewm(ignore_na=False, span=period_slow, adjust=False).mean(),
59 | name="EMA_slow",
60 | )
61 | MACD = pd.Series(EMA_fast - EMA_slow, name="MACD")
62 | MACD_signal = pd.Series(
63 | MACD.ewm(ignore_na=False, span=signal, adjust=False).mean(),
64 | name="SIGNAL",
65 | )
66 | macd = pd.concat([MACD, MACD_signal], axis=1)
67 |
68 | if basket:
69 | return macd.values
70 | return macd.iloc[-1]
71 |
72 | # make new node
73 | ret = node._gennode(
74 | "MACD[{},{},{}]".format(period_fast, period_slow, signal), _macd, [node]
75 | )
76 | return ret
77 |
78 |
79 | Node.rsi = RSI
80 | Node.macd = MACD
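# Usage sketch (mirrors tributary/tests/lazy/calculations/test_finance_lazy.py):
# wrap a growing price list in a lazy Node, attach an indicator node, and
# re-evaluate after marking the input dirty. The prices below are hypothetical.
#
#   prices = [100.0, 101.5, 99.8]
#   n = Node(value=prices)
#   n_rsi = n.rsi(period=14)    # latest RSI value; basket=True returns the full series
#   n_macd = n.macd()           # last (MACD, signal) row; basket=True returns all rows
#   prices.append(102.3)
#   n.setDirty(True)
#   n_rsi(), n_macd()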
81 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/calculations/test_rolling_streaming.py:
--------------------------------------------------------------------------------
1 | import tributary.streaming as ts
2 | import pandas as pd
3 |
4 |
5 | def func():
6 | yield 1
7 | yield 1
8 | yield 1
9 | yield 1
10 | yield 1
11 |
12 |
13 | def func2():
14 | yield 1
15 | yield 2
16 | yield 0
17 | yield 5
18 | yield 4
19 |
20 |
21 | def func3():
22 | yield 1
23 | yield 2
24 | yield 3
25 | yield 4
26 | yield 5
27 |
28 |
29 | def func4():
30 | for _ in range(10):
31 | yield _
32 |
33 |
34 | def func5():
35 | yield [1, 1, 2]
36 | yield [1, 2, 3]
37 | yield [3, 4, 5]
38 |
39 |
40 | class TestRolling:
41 | def test_count(self):
42 | assert ts.run(ts.RollingCount(ts.Func(func))) == [1, 2, 3, 4, 5]
43 |
44 | def test_sum(self):
45 | assert ts.run(ts.RollingSum(ts.Func(func))) == [1, 2, 3, 4, 5]
46 |
47 | def test_min(self):
48 | assert ts.run(ts.RollingMin(ts.Func(func2))) == [1, 1, 0, 0, 0]
49 |
50 | def test_max(self):
51 | assert ts.run(ts.RollingMax(ts.Func(func2))) == [1, 2, 2, 5, 5]
52 |
53 | def test_average(self):
54 | assert ts.run(ts.RollingAverage(ts.Func(func3))) == [1, 1.5, 2, 2.5, 3]
55 |
56 | def test_diff(self):
57 | ret = ts.run(ts.Diff(ts.Func(func2)))
58 |
59 | vals = [1, 2, 0, 5, 4]
60 | comp = [None] + [vals[i] - vals[i - 1] for i in range(1, 5)]
61 | assert ret[0] is None
62 | for i, x in enumerate(ret[1:]):
63 | assert (x - comp[i + 1]) < 0.001
64 |
65 | def test_sma(self):
66 | ret = ts.run(ts.SMA(ts.Func(func4)))
67 | comp = pd.Series([_ for _ in range(10)]).rolling(10, min_periods=1).mean()
68 | for i, x in enumerate(ret):
69 | assert (x - comp[i]) < 0.001
70 |
71 | def test_ema(self):
72 | ret = ts.run(ts.EMA(ts.Func(func4)))
73 | comp = pd.Series([_ for _ in range(10)]).ewm(span=10, adjust=False).mean()
74 | for i, x in enumerate(ret):
75 | assert (x - comp[i]) < 0.001
76 |
77 | def test_ema2(self):
78 | ret = ts.run(ts.EMA(ts.Func(func4), alpha=1 / 10, adjust=True))
79 | comp = pd.Series([_ for _ in range(10)]).ewm(alpha=1 / 10, adjust=True).mean()
80 | for i, x in enumerate(ret):
81 | assert (x - comp[i]) < 0.001
82 |
83 | def test_last(self):
84 | assert ts.run(ts.Last(ts.Func(func2))) == [1, 2, 0, 5, 4]
85 |
86 | def test_first(self):
87 | assert ts.run(ts.First(ts.Func(func2))) == [1, 1, 1, 1, 1]
88 |
89 | def test_last_iter(self):
90 | assert ts.run(ts.Last(ts.Func(func5))) == [2, 3, 5]
91 |
92 | def test_first_iter(self):
93 | assert ts.run(ts.First(ts.Func(func5))) == [1, 1, 1]
94 |
--------------------------------------------------------------------------------
/tributary/streaming/output/email.py:
--------------------------------------------------------------------------------
1 | import base64
2 |
3 | import emails
4 | from bs4 import BeautifulSoup
5 |
6 | from ..node import Node
7 | from .output import Func
8 |
9 |
10 | def make_email(html, from_, subject="", attachments=None):
11 | """Helper function to make emails
12 |
13 | html (str): content
14 | from_ (str): address to send the email from
15 | subject (str): subject of email
16 | attachments (list) : attachments to send
17 | """
18 | message = emails.html(charset="utf-8", subject=subject, html=html, mail_from=from_)
19 | soup = BeautifulSoup(html, "html.parser")
20 |
21 | # strip markdown links
22 | for item in soup.findAll("a", {"class": "anchor-link"}):
23 | item.decompose()
24 |
25 | # strip matplotlib base outs
26 | for item in soup.find_all(
27 | "div", class_="output_text output_subarea output_execute_result"
28 | ):
29 | for c in item.contents:
30 | if "<matplotlib" in str(c):
31 | item.decompose()
32 |
33 | # remove dataframe table borders
34 | for item in soup.findAll("table", {"border": 1}):
35 | item["border"] = 0
36 | item["cellspacing"] = 0
37 | item["cellpadding"] = 0
38 |
39 | # extract imgs for outlook
40 | imgs = soup.find_all("img")
41 | imgs_to_attach = {}
42 |
43 | # attach main part
44 | for i, img in enumerate(imgs):
45 | if not img.get("localdata"):
46 | continue
47 | imgs_to_attach[img.get("cell_id") + "_" + str(i) + ".png"] = base64.b64decode(
48 | img.get("localdata")
49 | )
50 | img["src"] = "cid:" + img.get("cell_id") + "_" + str(i) + ".png"
51 | # encoders.encode_base64(part)
52 | del img["localdata"]
53 |
54 | # assemble email soup
55 | soup = str(soup)
56 | message = emails.html(charset="utf-8", subject=subject, html=soup, mail_from=from_)
57 |
58 | for img, data in imgs_to_attach.items():
59 | message.attach(filename=img, content_disposition="inline", data=data)
60 |
61 | return message
62 |
63 |
64 | class Email(Func):
65 | """Send an email
66 |
67 | Args:
68 | node (Node): input stream
69 | to (str): email address/es to send to
70 |         smtp (dict): smtp settings for email account
71 | """
72 |
73 | def __init__(
74 | self,
75 | node,
76 | to,
77 | smtp,
78 | ):
79 | async def _send(
80 | message,
81 | to=to,
82 | smtp=smtp,
83 | ):
84 | r = message.send(to=to, smtp=smtp)
85 | return r, message
86 |
87 | super().__init__(func=_send, inputs=1)
88 | self._name = "Email"
89 | node >> self
90 |
91 |
92 | Node.email = Email
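# Usage sketch (illustrative): upstream nodes are expected to emit `emails.Message`
# objects, e.g. built with `make_email` above; `message_node` below is a placeholder
# for any such node. The smtp dict is handed straight to `message.send`, so it takes
# whatever settings the `emails` library accepts (host, port, ssl, user, password, ...).
#
#   Email(message_node, to="user@example.com",
#         smtp={"host": "smtp.example.com", "port": 465, "ssl": True})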
93 |
--------------------------------------------------------------------------------
/tributary/tests/lazy/test_graph_class_lazy.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as t
2 | import random
3 |
4 |
5 | class Func1(t.LazyGraph):
6 | def __init__(self, *args, **kwargs):
7 | super().__init__()
8 | self.x = self.node("x", readonly=False, value=1)
9 |
10 |
11 | class Func2(t.LazyGraph):
12 | def __init__(self, *args, **kwargs):
13 | self.y = self.node("y", readonly=False, value=2)
14 |
15 | # ensure no __nodes clobber
16 | self.test = self.node("test", readonly=False, value=2)
17 | self.x = self.node("x", readonly=False, value=2)
18 |
19 |
20 | class Func3(t.LazyGraph):
21 | @t.node()
22 | def func1(self):
23 | return self.random() # test self access
24 |
25 | def random(self):
26 | return random.random()
27 |
28 | @t.node()
29 | def func3(self, x=4):
30 | return 3 + x
31 |
32 |
33 | class Func4(t.LazyGraph):
34 | @t.node()
35 | def func1(self):
36 | return self.func2() + 1
37 |
38 | @t.node()
39 | def func2(self):
40 | return random.random()
41 |
42 |
43 | class Func5(t.LazyGraph):
44 | def __init__(self):
45 | self.x = self.node(name="x", value=None)
46 | super().__init__()
47 |
48 | @t.node()
49 | def z(self):
50 | return self.x | self.y()
51 |
52 | @t.node
53 | def y(self):
54 | return 10
55 |
56 | def reset(self):
57 | self.x = None
58 |
59 |
60 | class TestLazy:
61 | # FIXME
62 | def test_misc(self):
63 | f4 = Func4()
64 | z = f4.func1()
65 | assert isinstance(z, float) and z >= 1
66 | assert f4.func1.print()
67 | assert f4.func1.graph()
68 | assert f4.func1.graphviz()
69 |
70 | def test_lazy_default_func_arg(self):
71 | def func(val, prev_val=0):
72 | print("val:\t{}".format(val))
73 | print("prev_val:\t{}".format(prev_val))
74 | return val + prev_val
75 |
76 | n = t.Node(value=func)
77 | n.kwargs["val"].setValue(5)
78 |
79 | assert n() == 5
80 |
81 | n.set(prev_val=100)
82 |
83 | assert n() == 105
84 |
85 | def test_lazy_args_by_name_and_arg(self):
86 | # see the extended note in lazy.node about callable_args_mapping
87 | n = t.Node(name="Test", value=5)
88 | n2 = n + 1
89 |
90 | assert n2.kwargs["x"]._name_no_id == "Test"
91 |
92 | def test_or_dirtypropogation(self):
93 | f = Func5()
94 | assert f.z()() == 10
95 | assert f.x() is None
96 |
97 | f.x = 5
98 |
99 | assert f.x() == 5
100 | assert f.z()() == 5
101 |
102 | f.reset()
103 |
104 | assert f.x() is None
105 | assert f.z()() == 10
106 |
--------------------------------------------------------------------------------
/examples/pipeline_stream.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary as t\n",
10 | "from perspective import PerspectiveWidget"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": 2,
16 | "metadata": {},
17 | "outputs": [],
18 | "source": [
19 | "import random\n",
20 | "import time\n",
21 | "def func1(on_data):\n",
22 | " x = 0\n",
23 | " while x < 100:\n",
24 | " on_data({'a': random.random(), 'b': random.randint(0,1000), 'x':x})\n",
25 | " time.sleep(.1)\n",
26 | " x = x+1\n",
27 | " \n",
28 | "def func2(data, callback):\n",
29 | " callback([{'a':data['a'] *1000, 'b': data['b'], 'c': 'AAPL', 'x': data['x']}])\n",
30 | " "
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": 8,
36 | "metadata": {},
37 | "outputs": [
38 | {
39 | "data": {
40 | "application/vnd.jupyter.widget-view+json": {
41 | "model_id": "04ad31df5970405499ccac3d46eeaa45",
42 | "version_major": 2,
43 | "version_minor": 0
44 | },
45 | "text/plain": [
46 | "PerspectiveWidget(column_pivots=['c'], columns=['a', 'b'], plugin='y_line', row_pivots=['x'])"
47 | ]
48 | },
49 | "metadata": {},
50 | "output_type": "display_data"
51 | }
52 | ],
53 | "source": [
54 | "p = PerspectiveWidget({\"a\": float, \"b\": float, \"c\": str, \"x\": float}, plugin='y_line', columns=['a', 'b'], row_pivots=['x'], column_pivots=['c'])\n",
55 | "p"
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": 9,
61 | "metadata": {},
62 | "outputs": [],
63 | "source": [
64 | "t.pipeline([func1, func2], ['on_data', 'callback'], on_data=p.update)"
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": 6,
70 | "metadata": {},
71 | "outputs": [],
72 | "source": [
73 | "t.stop()"
74 | ]
75 | },
76 | {
77 | "cell_type": "code",
78 | "execution_count": null,
79 | "metadata": {},
80 | "outputs": [],
81 | "source": []
82 | }
83 | ],
84 | "metadata": {
85 | "kernelspec": {
86 | "display_name": "Python 3",
87 | "language": "python",
88 | "name": "python3"
89 | },
90 | "language_info": {
91 | "codemirror_mode": {
92 | "name": "ipython",
93 | "version": 3
94 | },
95 | "file_extension": ".py",
96 | "mimetype": "text/x-python",
97 | "name": "python",
98 | "nbconvert_exporter": "python",
99 | "pygments_lexer": "ipython3",
100 | "version": "3.7.5"
101 | }
102 | },
103 | "nbformat": 4,
104 | "nbformat_minor": 4
105 | }
106 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/test_streaming.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import time
3 | import tributary.streaming as ts
4 |
5 |
6 | class TestStreaming:
7 | def setup_method(self):
8 | time.sleep(0.5)
9 |
10 | def test_run_simple(self):
11 | t = ts.Const(value=1, count=1)
12 | assert ts.run(t) == [1]
13 |
14 | def test_run_func(self):
15 | def func():
16 | return 5
17 |
18 | t = ts.Func(func, count=1)
19 | assert ts.run(t) == [5]
20 |
21 | def test_run_stop(self):
22 | import time
23 | import tributary.streaming as ts
24 |
25 | async def func():
26 | while True:
27 | yield 1
28 | await asyncio.sleep(1)
29 |
30 | g = ts.run(ts.Print(ts.Func(func)), blocking=False)
31 |
32 | time.sleep(5)
33 | g.stop()
34 |
35 | def test_run_generator(self):
36 | def func():
37 | yield 1
38 | yield 2
39 |
40 | t = ts.Func(func)
41 | assert ts.run(t) == [1, 2]
42 |
43 | def test_run_async_func(self):
44 | async def func():
45 | await asyncio.sleep(0.1)
46 | return 5
47 |
48 | t = ts.Func(func, count=1)
49 | assert ts.run(t) == [5]
50 |
51 | def test_run_async_generator(self):
52 | async def func():
53 | yield 1
54 | yield 2
55 |
56 | t = ts.Func(func)
57 | assert ts.run(t) == [1, 2]
58 |
59 | def test_deep_bfs(self):
60 | a = ts.Const(1, count=1)
61 | b = ts.Random()
62 | c = ts.Curve([1, 2, 3])
63 |
64 | d = a + b
65 | e = a + c
66 | f = b + c
67 |
68 | g = ts.Print(d)
69 | h = ts.Print(e)
70 | i = ts.Print(f)
71 |
72 | def _ids(lst):
73 | return set([elem._id for elem in lst])
74 |
75 | def _ids_ids(lst_of_list):
76 | ret = []
77 | for lst in lst_of_list:
78 | ret.append(_ids(lst))
79 | return ret
80 |
81 | assert _ids(a._deep_bfs(tops_only=True)) == _ids([a, b, c])
82 | assert _ids(b._deep_bfs(tops_only=True)) == _ids([a, b, c])
83 | assert _ids(c._deep_bfs(tops_only=True)) == _ids([a, b, c])
84 | assert _ids(d._deep_bfs(tops_only=True)) == _ids([a, b, c])
85 | assert _ids(e._deep_bfs(tops_only=True)) == _ids([a, b, c])
86 | assert _ids(f._deep_bfs(tops_only=True)) == _ids([a, b, c])
87 | assert _ids(g._deep_bfs(tops_only=True)) == _ids([a, b, c])
88 | assert _ids(h._deep_bfs(tops_only=True)) == _ids([a, b, c])
89 | assert _ids(i._deep_bfs(tops_only=True)) == _ids([a, b, c])
90 |
91 | for x in (a, b, c, d, e, f, g, h, i):
92 | for y in (a, b, c, d, e, f, g, h, i):
93 | assert _ids_ids(x._deep_bfs()) == _ids_ids(y._deep_bfs())
94 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | import io
2 | import os.path
3 | from codecs import open
4 |
5 | from setuptools import find_packages, setup
6 |
7 | pjoin = os.path.join
8 | here = os.path.abspath(os.path.dirname(__file__))
9 | name = "tributary"
10 |
11 |
12 | def get_version(file, name="__version__"):
13 | path = os.path.realpath(file)
14 | version_ns = {}
15 | with io.open(path, encoding="utf8") as f:
16 | exec(f.read(), {}, version_ns)
17 | return version_ns[name]
18 |
19 |
20 | version = get_version(pjoin(here, name, "_version.py"))
21 |
22 | with open(pjoin(here, "README.md"), encoding="utf-8") as f:
23 | long_description = f.read().replace("\r\n", "\n")
24 |
25 |
26 | requires = [
27 | "aioconsole>=0.2.1",
28 | "aiofiles>=0.4.0",
29 | "aiohttp>=3.5.4",
30 | "aiohttp-sse>=2.0",
31 | "aiohttp-sse-client>=0.2.0",
32 | "aiokafka>=0.6.0",
33 | "aiostream>=0.3.1",
34 | "asyncpg>=0.20.1",
35 | "beautifulsoup4>=4.9.1",
36 | "boltons>=20.1.0",
37 | "emails>=0.5.15",
38 | "frozendict>=1.2",
39 | "future>=0.17.1",
40 | "gevent>=1.3.7",
41 | "graphviz>=0.10.1",
42 | "ipython>=7.0.1",
43 | "ipydagred3>=0.1.5",
44 | "numpy>=1.15.3",
45 | "pandas>=0.19.0",
46 | "scipy>1.2.0",
47 | "six>=1.11.0",
48 | "socketIO-client-nexus>=0.7.6",
49 | "sympy>=1.5.1",
50 | "temporal-cache>=0.0.6",
51 | "tornado>=5.1.1",
52 | "twilio>=6.50.1",
53 | ]
54 |
55 | requires_dev = [
56 | "black>=23",
57 | "check-manifest",
58 | "flake8>=3.7.8",
59 | "flake8-black>=0.2.1",
60 | "mock",
61 | "pybind11>=2.4.0",
62 | "pytest>=4.3.0",
63 | "pytest-cov>=2.6.1",
64 | "superstore",
65 | "Sphinx>=1.8.4",
66 | "sphinx-markdown-builder>=0.5.2",
67 | ] + requires
68 |
69 | setup(
70 | name=name,
71 | version=version,
72 | description="Streaming reactive and dataflow graphs in Python",
73 | long_description=long_description,
74 | long_description_content_type="text/markdown",
75 | url="https://github.com/1kbgz/{name}".format(name=name),
76 | author="Tim Paine",
77 | author_email="t.paine154@gmail.com",
78 | license="Apache 2.0",
79 | install_requires=requires,
80 | extras_require={
81 | "dev": requires_dev,
82 | "develop": requires_dev,
83 | "functional": ["confluent-kafka>=0.11.6", "websocket_client>=0.57.0"],
84 | },
85 | classifiers=[
86 | "Development Status :: 3 - Alpha",
87 | "Programming Language :: Python :: 3",
88 | "Programming Language :: Python :: 3.7",
89 | "Programming Language :: Python :: 3.8",
90 | "Programming Language :: Python :: 3.9",
91 | "Programming Language :: Python :: 3.10",
92 | ],
93 | keywords="streaming lazy graph dag dataflow reactive",
94 | packages=find_packages(exclude=["tests"]),
95 | package_data={},
96 | python_requires=">=3.7",
97 | include_package_data=True,
98 | zip_safe=False,
99 | entry_points={},
100 | )
101 |
--------------------------------------------------------------------------------
/docs/examples/streaming/output_3_0.svg:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/tributary/lazy/utils.py:
--------------------------------------------------------------------------------
1 | from temporalcache import interval, expire
2 | from .base import Node
3 |
4 |
5 | def Expire(
6 | node,
7 | second=None,
8 | minute=None,
9 | hour=None,
10 | day=None,
11 | day_of_week=None,
12 | week=None,
13 | month=None,
14 | maxsize=128,
15 | ):
16 | def _interval(data):
17 | return data
18 |
19 | # make new node
20 | ret = node._gennode(
21 | "Expire[{}-{}-{}-{}-{}-{}-{}-maxsize:{}]({})".format(
22 | second, minute, hour, day, day_of_week, week, month, maxsize, node._name
23 | ),
24 | _interval,
25 | [node],
26 | )
27 |
28 | # stash original recompute
29 | ret._orig_call = ret._call
30 |
31 | # make recompute run on expire
32 | ret._call = expire(
33 | second=second,
34 | minute=minute,
35 | hour=hour,
36 | day=day,
37 | day_of_week=day_of_week,
38 | week=week,
39 | month=month,
40 | maxsize=maxsize,
41 | )(ret._call)
42 | return ret
43 |
44 |
45 | def Interval(
46 | node, seconds=0, minutes=0, hours=0, days=0, weeks=0, months=0, years=0, maxsize=128
47 | ):
48 | def _interval(data):
49 | return data
50 |
51 | # make new node
52 | ret = node._gennode(
53 | "Interval[{}-{}-{}-{}-{}-{}-{}-maxsize:{}]({})".format(
54 | seconds, minutes, hours, days, weeks, months, years, maxsize, node._name
55 | ),
56 | _interval,
57 | [node],
58 | )
59 |
60 | # stash original recompute
61 | ret._orig_call = ret._call
62 |
63 | # make recompute run on interval
64 | ret._call = interval(
65 | seconds=seconds,
66 | minutes=minutes,
67 | hours=hours,
68 | days=days,
69 | weeks=weeks,
70 | months=months,
71 | years=years,
72 | maxsize=maxsize,
73 | )(ret._call)
74 | return ret
75 |
76 |
77 | def Window(node, size=-1, full_only=False):
78 | """Lazy wrapper to collect a window of values. If a node is executed 3 times,
79 | returning 1, 2, 3, then the window node will collect those values in a list.
80 |
81 | Arguments:
82 | node (node): input node
83 | size (int): size of windows to use
84 | full_only (bool): only return if list is full
85 | """
86 |
87 | def func(data, size=size, full_only=full_only):
88 | if size == 0:
89 | return data
90 |
91 | if ret._accum is None:
92 | ret._accum = []
93 |
94 | ret._accum.append(data)
95 |
96 | if size > 0:
97 | ret._accum = ret._accum[-size:]
98 |
99 | if full_only and len(ret._accum) == size:
100 | return ret._accum
101 | elif full_only:
102 | return None
103 | return ret._accum
104 |
105 | # make new node
106 | ret = node._gennode("Window[{}]".format(size if size > 0 else "∞"), func, [node])
107 | ret._accum = None
108 | return ret
109 |
110 |
111 | Node.expire = Expire
112 | Node.interval = Interval
113 | Node.window = Window
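# Usage sketch (illustrative): `n.window(size=3)` keeps the last three values the
# wrapped node produced, so successive re-evaluations yield [v1], [v1, v2],
# [v1, v2, v3], [v2, v3, v4], ...; with full_only=True it returns None until the
# window has filled.
#
#   n = Node(value=1)
#   w = n.window(size=3, full_only=True)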
114 |
--------------------------------------------------------------------------------
/.github/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
1 | # Contributor Covenant Code of Conduct
2 |
3 | ## Our Pledge
4 |
5 | In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.
6 |
7 | ## Our Standards
8 |
9 | Examples of behavior that contributes to creating a positive environment include:
10 |
11 | * Using welcoming and inclusive language
12 | * Being respectful of differing viewpoints and experiences
13 | * Gracefully accepting constructive criticism
14 | * Focusing on what is best for the community
15 | * Showing empathy towards other community members
16 |
17 | Examples of unacceptable behavior by participants include:
18 |
19 | * The use of sexualized language or imagery and unwelcome sexual attention or advances
20 | * Trolling, insulting/derogatory comments, and personal or political attacks
21 | * Public or private harassment
22 | * Publishing others' private information, such as a physical or electronic address, without explicit permission
23 | * Other conduct which could reasonably be considered inappropriate in a professional setting
24 |
25 | ## Our Responsibilities
26 |
27 | Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.
28 |
29 | Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
30 |
31 | ## Scope
32 |
33 | This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.
34 |
35 | ## Enforcement
36 |
37 | Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at timothy.k.paine@gmail.com. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.
38 |
39 | Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.
40 |
41 | ## Attribution
42 |
43 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]
44 |
45 | [homepage]: http://contributor-covenant.org
46 | [version]: http://contributor-covenant.org/version/1/4/
47 |
--------------------------------------------------------------------------------
/tributary/tests/streaming/input/test_input_streaming.py:
--------------------------------------------------------------------------------
1 | import time
2 | import tributary.streaming as ts
3 |
4 |
5 | class TestConst:
6 | def setup_method(self):
7 | time.sleep(0.5)
8 |
9 | def test_const_1(self):
10 | t = ts.Const(value=1, count=1)
11 | assert ts.run(t) == [1]
12 |
13 | def test_const_2(self):
14 | t = ts.Const(value=1, count=5)
15 | assert ts.run(t) == [1, 1, 1, 1, 1]
16 |
17 |
18 | class TestTimer:
19 | def setup_method(self):
20 | time.sleep(0.5)
21 |
22 | def test_timer(self):
23 | val = 0
24 |
25 | def func():
26 | nonlocal val
27 | val += 1
28 | return val
29 |
30 | t = ts.Timer(func, count=5)
31 | assert ts.run(t) == [1, 2, 3, 4, 5]
32 |
33 | t = ts.Timer(func, count=5)
34 |
35 | def test_timer_delay(self):
36 | val = 0
37 |
38 | def func():
39 | nonlocal val
40 | val += 1
41 | return val
42 |
43 | t = ts.Timer(func, count=5, interval=0.1)
44 | assert ts.run(t) == [1, 2, 3, 4, 5]
45 |
46 | t = ts.Timer(func, count=5)
47 |
48 | def test_timer_generator(self):
49 | def func():
50 | yield 1
51 | yield 2
52 | yield 3
53 | yield 4
54 | yield 5
55 |
56 | t = ts.Timer(func)
57 | assert ts.run(t) == [1]
58 |
59 | t = ts.Timer(func, count=3)
60 | assert ts.run(t) == [1, 2, 3]
61 |
62 | t = ts.Timer(func, count=5)
63 | assert ts.run(t) == [1, 2, 3, 4, 5]
64 |
65 | t = ts.Timer(func, count=6)
66 | assert ts.run(t) == [1, 2, 3, 4, 5]
67 |
68 | def test_timer_generator_delay(self):
69 | def func():
70 | yield 1
71 | yield 2
72 | yield 3
73 | yield 4
74 | yield 5
75 |
76 | t = ts.Timer(func, interval=0.1)
77 | assert ts.run(t) == [1]
78 |
79 | t = ts.Timer(func, count=3, interval=0.1)
80 | assert ts.run(t) == [1, 2, 3]
81 |
82 | t = ts.Timer(func, count=5, interval=0.1)
83 | assert ts.run(t) == [1, 2, 3, 4, 5]
84 |
85 | t = ts.Timer(func, count=6, interval=0.1)
86 | assert ts.run(t) == [1, 2, 3, 4, 5]
87 |
88 |
89 | class TestFunc:
90 | def setup_method(self):
91 | time.sleep(0.5)
92 |
93 | def test_timer(self):
94 | val = 0
95 |
96 | def func():
97 | nonlocal val
98 | val += 1
99 | return val
100 |
101 | t = ts.Timer(func, count=5)
102 | assert ts.run(t) == [1, 2, 3, 4, 5]
103 |
104 | t = ts.Timer(func, count=5)
105 |
106 | def test_timer_delay(self):
107 | val = 0
108 |
109 | def func():
110 | nonlocal val
111 | val += 1
112 | return val
113 |
114 | t = ts.Timer(func, count=5, interval=0.1)
115 | assert ts.run(t) == [1, 2, 3, 4, 5]
116 |
117 | t = ts.Timer(func, count=5)
118 |
119 | def test_func_generator(self):
120 | def func():
121 | yield 1
122 | yield 2
123 | yield 3
124 | yield 4
125 | yield 5
126 |
127 | t = ts.Func(func)
128 | assert ts.run(t) == [1, 2, 3, 4, 5]
129 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | env/
12 | build/
13 | develop-eggs/
14 | dist/
15 | downloads/
16 | eggs/
17 | .eggs/
18 | lib/
19 | lib64/
20 | parts/
21 | sdist/
22 | var/
23 | wheels/
24 | *.egg-info/
25 | .installed.cfg
26 | *.egg
27 |
28 | # PyInstaller
29 | # Usually these files are written by a python script from a template
30 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
31 | *.manifest
32 | *.spec
33 |
34 | # Installer logs
35 | pip-log.txt
36 | pip-delete-this-directory.txt
37 |
38 | # Unit test / coverage reports
39 | htmlcov/
40 | .tox/
41 | .coverage
42 | .coverage.*
43 | .cache
44 | nosetests.xml
45 | coverage.xml
46 | *.cover
47 | .hypothesis/
48 |
49 | # Translations
50 | *.mo
51 | *.pot
52 |
53 | # Django stuff:
54 | *.log
55 | local_settings.py
56 |
57 | # Flask stuff:
58 | instance/
59 | .webassets-cache
60 |
61 | # Scrapy stuff:
62 | .scrapy
63 |
64 | # Sphinx documentation
65 | docs/_build/
66 |
67 | # PyBuilder
68 | target/
69 |
70 | # Jupyter Notebook
71 | .ipynb_checkpoints
72 |
73 | # pyenv
74 | .python-version
75 |
76 | # celery beat schedule file
77 | celerybeat-schedule
78 |
79 | # SageMath parsed files
80 | *.sage.py
81 |
82 | # dotenv
83 | .env
84 |
85 | # virtualenv
86 | .venv
87 | venv/
88 | ENV/
89 |
90 | # Spyder project settings
91 | .spyderproject
92 | .spyproject
93 |
94 | # Rope project settings
95 | .ropeproject
96 |
97 | # mkdocs documentation
98 | /site
99 |
100 | # mypy
101 | .mypy_cache/
102 | .DS_Store
103 | # Logs
104 | logs
105 | *.log
106 | npm-debug.log*
107 | yarn-debug.log*
108 | yarn-error.log*
109 |
110 | # Runtime data
111 | pids
112 | *.pid
113 | *.seed
114 | *.pid.lock
115 |
116 | # Directory for instrumented libs generated by jscoverage/JSCover
117 | lib-cov
118 |
119 | # Coverage directory used by tools like istanbul
120 | coverage
121 |
122 | # nyc test coverage
123 | .nyc_output
124 |
125 | # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
126 | .grunt
127 |
128 | # Bower dependency directory (https://bower.io/)
129 | bower_components
130 |
131 | # node-waf configuration
132 | .lock-wscript
133 |
134 | # Compiled binary addons (http://nodejs.org/api/addons.html)
135 | build/Release
136 |
137 | # Dependency directories
138 | node_modules/
139 | jspm_packages/
140 |
141 | # Typescript v1 declaration files
142 | typings/
143 |
144 | # Optional npm cache directory
145 | .npm
146 |
147 | # Optional eslint cache
148 | .eslintcache
149 |
150 | # Optional REPL history
151 | .node_repl_history
152 |
153 | # Output of 'npm pack'
154 | *.tgz
155 |
156 | # Yarn Integrity file
157 | .yarn-integrity
158 |
159 | # dotenv environment variables file
160 | .env
161 |
162 | .DS_Store
163 | yarn.lock
164 | package-lock.json
165 | .autoversion
166 | .idea
167 | *.gv
168 | *.gv.png
169 |
170 | tmp.py
171 | tmp2.py
172 | docs/api/
173 | tributary/tests/streaming/output/test_file_data.json
174 | tributary/tests/streaming/output/test_file_data.csv
175 | examples/tributary
176 | examples/ipydagred3
177 | ipydagred3
178 | python_junit.xml
179 | tmps
180 | docs/index.md
181 |
182 | # codespaces
183 | lib64
184 | pyvenv.cfg
185 | pythonenv3.8/
186 | .vscode
187 | icon.ai
188 | test.png
189 |
--------------------------------------------------------------------------------
/docs/examples/lazy/output_8_0.svg:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/docs/examples/lazy/output_6_0.svg:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/tributary/lazy/graph.py:
--------------------------------------------------------------------------------
1 | from .node import Node
2 | from ..base import TributaryException
3 |
4 |
5 | class LazyGraph(object):
6 | """Wrapper class around a collection of lazy nodes."""
7 |
8 | def __init__(self, *args, **kwargs):
9 | # the last thing we do is go through all of our methods and ensure that all `_callable_args` in our methods are replaced with nodes
10 | for meth in dir(self):
11 | meth = getattr(self, meth)
12 |
13 | # bind self on nodes
14 | if isinstance(meth, Node):
15 | meth._self_reference = self
16 |
17 | # replace upstream dependencies with their actual node equivalents
18 | if hasattr(meth, "_nodes_to_bind"):
19 | nodes_to_bind = meth._nodes_to_bind
20 |
21 | for node in nodes_to_bind:
22 | if not hasattr(self, node):
23 | raise TributaryException(
24 | "Error binding dependency {} to node {} - make sure to super() after all nodes are defined".format(
25 | node, meth
26 | )
27 | )
28 |
29 | node_to_bind = getattr(self, node)
30 | if isinstance(node_to_bind, Node):
31 | meth << getattr(self, node)
32 |
33 | print(meth._name, meth._parameters, meth.upstream())
34 |
35 | def node(self, name, readonly=False, nullable=True, value=None): # noqa: F811
36 | """method to create a lazy node attached to a graph.
37 |
38 | Args:
39 | name (str): name to represent the node
40 | readonly (bool): whether the node should be settable
41 | nullable (bool): whether node can have value None
42 | value (any): initial value for node
43 | Returns:
44 | BaseNode: the newly constructed lazy node
45 | """
46 | if not hasattr(self, "_LazyGraph__nodes"):
47 | self.__nodes = {}
48 |
49 | if name not in self.__nodes:
50 | if not isinstance(value, Node):
51 | value = Node(
52 | name=name,
53 | value=value,
54 | )
55 | self.__nodes[name] = value
56 | setattr(self, name, self.__nodes[name])
57 | return self.__nodes[name]
58 |
59 | def __getattribute__(self, name):
60 | if name == "_LazyGraph__nodes" or name == "__nodes":
61 | return super(LazyGraph, self).__getattribute__(name)
62 | elif hasattr(self, "_LazyGraph__nodes") and name in super(
63 | LazyGraph, self
64 | ).__getattribute__("_LazyGraph__nodes"):
65 | return super(LazyGraph, self).__getattribute__("_LazyGraph__nodes")[name]
66 | else:
67 | return super(LazyGraph, self).__getattribute__(name)
68 |
69 | def __setattr__(self, name, value):
70 | if hasattr(self, "_LazyGraph__nodes") and name in super(
71 | LazyGraph, self
72 | ).__getattribute__("_LazyGraph__nodes"):
73 | node = super(LazyGraph, self).__getattribute__("_LazyGraph__nodes")[name]
74 | if isinstance(value, Node) and node == value:
75 | return
76 | elif isinstance(value, Node):
77 | raise TributaryException("Cannot set to node")
78 | else:
79 | node.setValue(value)
80 | else:
81 | super(LazyGraph, self).__setattr__(name, value)
82 |
--------------------------------------------------------------------------------
/docs/examples/autodiff/autodiff.md:
--------------------------------------------------------------------------------
1 | # Lazy Automatic Differentiation
2 |
3 | $f(x) = 3x^2 + 2x$
4 |
5 | $f'(x) = 6x + 2$
6 |
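The `(value, 1)` tuples below seed forward-mode dual numbers: each node carries a value together with a derivative component, and node arithmetic propagates both. As a worked check for $x = 5$ (with $\varepsilon^2 = 0$):

$f(5 + \varepsilon) = 3(5 + \varepsilon)^2 + 2(5 + \varepsilon) = 85 + 32\varepsilon$

so the value component is $f(5) = 85$ and the derivative component, read off as `out1()[1]` below, is $f'(5) = 6 \cdot 5 + 2 = 32$.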
7 |
8 | ```python
9 | import tributary.lazy as tl
10 | ```
11 |
12 |
13 | ```python
14 | # f = 3x^2 + 2x
15 | # f' = 6x + 2
16 | x1 = tl.Node(value=(5,1), use_dual=True)
17 | x2 = tl.Node(value=(6,1), use_dual=True)
18 | x3 = tl.Node(value=(7,1), use_dual=True)
19 |
20 | out1 = tl.Add(tl.Mult(tl.Pow(x1, 2), (3,0)), tl.Mult(x1, (2,0)))
21 | out2 = tl.Add(tl.Mult(tl.Pow(x2, 2), (3,0)), tl.Mult(x2, (2,0)))
22 | out3 = tl.Add(tl.Mult(tl.Pow(x3, 2), (3,0)), tl.Mult(x3, (2,0)))
23 | ```
24 |
25 |
26 | ```python
27 | out1.graphviz()
28 | ```
29 |
30 |
31 |
32 |
33 |
34 | 
35 |
36 |
37 |
38 |
39 |
40 | ```python
41 | out2.graphviz()
42 | ```
43 |
44 |
45 |
46 |
47 |
48 | 
49 |
50 |
51 |
52 |
53 |
54 | ```python
55 | out3.graphviz()
56 | ```
57 |
58 |
59 |
60 |
61 |
62 | 
63 |
64 |
65 |
66 |
67 |
68 | ```python
69 | out1()[1]
70 | ```
71 |
72 |
73 |
74 |
75 | 32
76 |
77 |
78 |
79 |
80 | ```python
81 | out2()[1]
82 | ```
83 |
84 |
85 |
86 |
87 | 38
88 |
89 |
90 |
91 |
92 | ```python
93 | out3()[1]
94 | ```
95 |
96 |
97 |
98 |
99 | 44
100 |
101 |
102 |
103 |
104 | ```python
105 | assert [out1()[1], out2()[1], out3()[1]] == [6*x+2 for x in [5, 6, 7]]
106 | ```
107 |
108 | # Streaming automatic differentiation
109 |
110 | $f(x) = \sin(x) + x^2$
111 | 
112 | $f'(x) = \cos(x) + 2x$
113 |
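Here each tick carries a dual $(x, 1)$, so the stream emits $(f(x), f'(x))$ pairs. As a spot check at $x = 0$: $f(0) = \sin(0) + 0^2 = 0$ and $f'(0) = \cos(0) + 2 \cdot 0 = 1$, which is the $(0.0,\ 1.0)$ row in the result below.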
114 |
115 | ```python
116 | import tributary.streaming as ts
117 | import asyncio
118 | import math
119 | ```
120 |
121 |
122 | ```python
123 | # f = sin(x) + x^2
124 | # f' = cos(x) + 2x
125 |
126 | rng = range(-10,11)
127 | def func():
128 | for _ in rng:
129 | yield(_,1)
130 |
131 | x = ts.Timer(func, count=len(rng), use_dual=True)
132 | out = ts.Add(ts.Sin(x), ts.Pow(x,2))
133 | ```
134 |
135 |
136 | ```python
137 | out.graphviz()
138 | ```
139 |
140 |
141 |
142 |
143 |
144 | 
145 |
146 |
147 |
148 |
149 |
150 | ```python
151 | result = ts.run(out)
152 | ```
153 |
154 |
155 | ```python
156 | while not result.done():
157 | await asyncio.sleep(1)
158 | ```
159 |
160 |
161 | ```python
162 | result.result()
163 | ```
164 |
165 |
166 |
167 |
168 | [(100.54402111088937, -20.839071529076453),
169 | (80.58788151475824, -18.911130261884676),
170 | (63.01064175337662, -16.145500033808613),
171 | (48.34301340128121, -13.246097745656696),
172 | (36.27941549819893, -11.039829713349635),
173 | (25.95892427466314, -9.716337814536773),
174 | (16.756802495307927, -8.653643620863612),
175 | (8.858879991940134, -6.989992496600445),
176 | (3.090702573174318, -4.416146836547142),
177 | (0.1585290151921035, -1.4596976941318602),
178 | (0.0, 1.0),
179 | (1.8414709848078965, 2.5403023058681398),
180 | (4.909297426825682, 3.5838531634528574),
181 | (9.141120008059866, 5.010007503399555),
182 | (15.243197504692072, 7.346356379136388),
183 | (24.04107572533686, 10.283662185463227),
184 | (35.72058450180107, 12.960170286650365),
185 | (49.65698659871879, 14.753902254343304),
186 | (64.98935824662338, 15.854499966191387),
187 | (81.41211848524176, 17.088869738115324),
188 | (99.45597888911063, 19.160928470923547)]
189 |
190 |
191 |
192 |
193 | ```python
194 | assert [x[1] for x in result.result()] == [math.cos(_) + 2*_ for _ in rng]
195 | ```
196 |
--------------------------------------------------------------------------------
/tributary/symbolic/__init__.py:
--------------------------------------------------------------------------------
1 | import tributary.lazy as tl
2 | import tributary.streaming as ts
3 | from tributary import TributaryException
4 |
5 | from sympy.utilities.lambdify import lambdify
6 | from sympy.parsing.sympy_parser import (
7 | parse_expr,
8 | standard_transformations as _st,
9 | implicit_multiplication_application as _ima,
10 | )
11 | from sympy import init_printing, dotprint, preorder_traversal
12 | from graphviz import Source
13 |
14 |
15 | init_printing(use_unicode=True)
16 |
17 |
18 | def parse_expression(expr):
19 | """Parse string as sympy expression
20 |
21 | Args:
22 | expr (string): string to convert to sympy expression
23 | """
24 | return parse_expr(expr, transformations=(_st + (_ima,)))
25 |
26 |
27 | def graphviz(expr):
28 | """Plot sympy expression tree using graphviz
29 |
30 | Args:
31 | expr (sympy expression)
32 | """
33 |
34 | return Source(dotprint(expr))
35 |
36 |
37 | def traversal(expr):
38 | """Traverse sympy expression tree
39 |
40 | Args:
41 | expr (sympy expression)
42 | """
43 |
44 | return list(preorder_traversal(expr))
45 |
46 |
47 | def symbols(expr):
48 | """Get symbols used in sympy expression
49 |
50 | Args:
51 | expr (sympy expression)
52 | """
53 | return expr.free_symbols
54 |
55 |
56 | def construct_lazy(expr, modules=None):
57 | """Construct Lazy tributary class from sympy expression
58 |
59 | Args:
60 | expr (sympy expression): A Sympy expression
61 | modules (list): a list of modules to use for sympy's lambdify function
62 | Returns:
63 | tributary.lazy.LazyGraph
64 | """
65 | syms = list(symbols(expr))
66 | names = [s.name for s in syms]
67 | modules = modules or ["scipy", "numpy"]
68 |
69 | class Lazy(tl.LazyGraph):
70 | def __init__(self, **kwargs):
71 | for n in names:
72 | setattr(self, n, self.node(name=n, value=kwargs.get(n, None)))
73 |
74 | self._nodes = [getattr(self, n) for n in names]
75 | self._function = lambdify(syms, expr, modules=modules)
76 | self._expr = expr
77 | super().__init__()
78 |
79 | @tl.node
80 | def evaluate(self):
81 | return self._function(*self._nodes)
82 |
83 | return Lazy
84 |
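# Usage sketch (mirrors tributary/tests/symbolic/test_symbolic.py): the returned
# class takes one keyword per free symbol and exposes an `evaluate` node. The
# expression and names below are hypothetical.
#
#   import sympy as sy
#   x, y = sy.symbols("x y")
#   Graph = construct_lazy(x ** 2 + y)
#   g = Graph(x=2, y=3)
#   g.evaluate()()   # evaluate with the current inputs
#   g.x = 5          # setting an input dirties `evaluate`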
85 |
86 | def construct_streaming(expr, modules=None):
87 | """Construct streaming tributary class from sympy expression
88 |
89 | Args:
90 | expr (sympy expression): A Sympy expression
91 | modules (list): a list of modules to use for sympy's lambdify function
92 | """
93 | syms = list(symbols(expr))
94 | names = [s.name for s in syms]
95 | modules = modules or ["scipy", "numpy"]
96 | # modules = modules or ["math", "mpmath", "sympy"]
97 |
98 | class Streaming(ts.StreamingGraph):
99 | def __init__(self, **kwargs):
100 | self._kwargs = {}
101 | for n in names:
102 | if n not in kwargs:
103 | raise TributaryException(
104 | "Must provide input source for: {}".format(n)
105 | )
106 | setattr(self, n, kwargs.get(n))
107 | self._kwargs[n] = kwargs.get(n)
108 |
109 | self._set_nodes = [getattr(self, n) for n in names]
110 | self._lambda = lambdify(syms, expr, modules=modules)(**self._kwargs)
111 | self._expr = expr
112 |
113 | super(Streaming, self).__init__(node=self._lambda.collect())
114 |
115 | return Streaming
116 |
--------------------------------------------------------------------------------
/docs/examples/autodiff/output_15_0.svg:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/docs/examples/streaming/output_13_0.svg:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/tributary/tests/symbolic/test_symbolic.py:
--------------------------------------------------------------------------------
1 | import time
2 | import tributary.symbolic as ts
3 | import tributary.streaming as tss
4 | import sympy as sy
5 | from sympy.stats import Normal as syNormal, cdf
6 |
7 | sy.init_printing()
8 |
9 |
10 | class TestConfig:
11 | def setup_method(self):
12 | time.sleep(0.5)
13 |
14 | def test_construct_lazy(self):
15 | # adapted from https://gist.github.com/raddy/bd0e977dc8437a4f8276
16 | spot, strike, vol, dte, rate, cp = sy.symbols("spot strike vol dte rate cp")
17 |
18 | T = dte / 260.0
19 | N = syNormal("N", 0.0, 1.0)
20 |
21 | d1 = (sy.ln(spot / strike) + (0.5 * vol**2) * T) / (vol * sy.sqrt(T))
22 | d2 = d1 - vol * sy.sqrt(T)
23 |
24 | TimeValueExpr = sy.exp(-rate * T) * (
25 | cp * spot * cdf(N)(cp * d1) - cp * strike * cdf(N)(cp * d2)
26 | )
27 |
28 | PriceClass = ts.construct_lazy(TimeValueExpr)
29 |
30 | price = PriceClass(
31 | spot=210.59, strike=205, vol=14.04, dte=4, rate=0.2175, cp=-1
32 | )
33 |
34 | x = price.evaluate()()
35 |
36 | assert price.evaluate()() == x
37 |
38 | price.strike = 210
39 |
40 | assert x != price.evaluate()()
41 |
42 | def test_others(self):
43 | # adapted from https://gist.github.com/raddy/bd0e977dc8437a4f8276
44 | spot, strike, vol, dte, rate, cp = sy.symbols("spot strike vol dte rate cp")
45 | T = dte / 260.0
46 | N = syNormal("N", 0.0, 1.0)
47 | d1 = (sy.ln(spot / strike) + (0.5 * vol**2) * T) / (vol * sy.sqrt(T))
48 | d2 = d1 - vol * sy.sqrt(T)
49 | TimeValueExpr = sy.exp(-rate * T) * (
50 | cp * spot * cdf(N)(cp * d1) - cp * strike * cdf(N)(cp * d2)
51 | )
52 | PriceClass = ts.construct_lazy(TimeValueExpr)
53 | price = PriceClass(
54 | spot=210.59, strike=205, vol=14.04, dte=4, rate=0.2175, cp=-1
55 | )
56 | price.evaluate()()
57 | ts.graphviz(TimeValueExpr)
58 | assert ts.traversal(TimeValueExpr)
59 | assert ts.symbols(TimeValueExpr)
60 |
61 | def test_parse(self):
62 | from sympy.parsing.sympy_parser import parse_expr
63 |
64 | assert parse_expr("x**2") == ts.parse_expression("x**2")
65 |
66 | def test_construct_streaming(self):
67 | # adapted from https://gist.github.com/raddy/bd0e977dc8437a4f8276
68 | # spot, strike, vol, days till expiry, interest rate, call or put (1,-1)
69 | spot, strike, vol, dte, rate, cp = sy.symbols("spot strike vol dte rate cp")
70 |
71 | T = dte / 260.0
72 | N = syNormal("N", 0.0, 1.0)
73 |
74 | d1 = (sy.ln(spot / strike) + (0.5 * vol**2) * T) / (vol * sy.sqrt(T))
75 | d2 = d1 - vol * sy.sqrt(T)
76 |
77 | TimeValueExpr = sy.exp(-rate * T) * (
78 | cp * spot * cdf(N)(cp * d1) - cp * strike * cdf(N)(cp * d2)
79 | )
80 |
81 | PriceClass = ts.construct_streaming(TimeValueExpr)
82 |
83 | def strikes():
84 | strike = 205
85 | while strike <= 220:
86 | yield strike
87 | strike += 2.5
88 |
89 | price = PriceClass(
90 | spot=tss.Const(210.59),
91 | # strike=tss.Print(tss.Const(205), text='strike'),
92 | strike=tss.Func(strikes, interval=1),
93 | vol=tss.Const(14.04),
94 | dte=tss.Const(4),
95 | rate=tss.Const(0.2175),
96 | cp=tss.Const(-1),
97 | )
98 |
99 | ret = tss.run(tss.Print(price._starting_node))
100 | time.sleep(2)
101 | print(ret)
102 | assert len(ret) == 7
103 |
--------------------------------------------------------------------------------
/tributary/streaming/serialize.py:
--------------------------------------------------------------------------------
1 | class NodeSerializeMixin(object):
2 | def save(self):
3 | """return a serializeable structure representing this node's state"""
4 | import dill
5 |
6 | ret = {}
7 | ret["id"] = self._id
8 | ret["graphvizshape"] = self._graphvizshape
9 | # self._dd3g = None # TODO
10 | ret["name"] = self._name_only # use name sans id
11 |
12 | ret["input"] = [dill.dumps(_) for _ in self._input]
13 | ret["active"] = [dill.dumps(_) for _ in self._active]
14 | ret["downstream"] = (
15 | []
16 | ) # TODO think about this more [_.save() for _ in self._downstream]
17 | ret["upstream"] = [_.save() for _ in self._upstream]
18 |
19 | ret["func"] = dill.dumps(self._func)
20 | ret["func_kwargs"] = dill.dumps(self._func_kwargs)
21 |
22 | ret["delay_interval"] = self._delay_interval
23 | ret["execution_max"] = self._execution_max
24 | ret["execution_count"] = self._execution_count
25 |
26 | ret["last"] = dill.dumps(self._last)
27 | ret["finished"] = self._finished
28 | ret["use_dual"] = self._use_dual
29 |
30 | ret["drop"] = self._drop
31 | ret["replace"] = self._replace
32 | ret["repeat"] = self._repeat
33 |
34 | ret["attrs"] = self._initial_attrs
35 | return ret
36 |
37 | @staticmethod
38 | def restore(ret, **extra_attrs):
39 | import dill
40 | from .node import Node
41 |
42 | # self._dd3g = None # TODO
43 |
44 | # constructor args
45 | func = dill.loads(ret["func"])
46 | func_kwargs = dill.loads(ret["func_kwargs"])
47 | name = ret["name"]
48 | inputs = len(ret["input"])
49 | drop = ret["drop"]
50 | replace = ret["replace"]
51 | repeat = ret["repeat"]
52 | graphvizshape = ret["graphvizshape"]
53 | delay_interval = ret["delay_interval"]
54 | execution_max = ret["execution_max"]
55 | use_dual = ret["use_dual"]
56 |
57 | # construct node
58 | n = Node(
59 | func=func,
60 | func_kwargs=func_kwargs,
61 | name=name,
62 | inputs=inputs,
63 | drop=drop,
64 | replace=replace,
65 | repeat=repeat,
66 | graphvizshape=graphvizshape,
67 | delay_interval=delay_interval,
68 | execution_max=execution_max,
69 | use_dual=use_dual,
70 | )
71 |
72 | # restore private attrs
73 | n._id = ret["id"]
74 | n._name = "{}#{}".format(name, n._id)
75 | n._input = [dill.loads(_) for _ in ret["input"]]
76 | n._active = [dill.loads(_) for _ in ret["active"]]
77 | # n._downstream = [] # TODO upstream don't get saved
78 | n._upstream = [Node.restore(_) for _ in ret["upstream"]]
79 |
80 | # restore node relationship
81 | for up_node in n._upstream:
82 | up_node >> n
83 |
84 | n._execution_count = ret["execution_count"]
85 | n._last = ret["last"]
86 | n._finished = ret["finished"]
87 | n._initial_attrs = ret["attrs"]
88 | for k, v in extra_attrs.items():
89 | setattr(n, k, v)
90 | return n
91 |
92 |
93 | if __name__ == "__main__":
94 | # Test script
95 | import asyncio
96 | import tributary.streaming as ts
97 | import time
98 |
99 | async def func():
100 | await asyncio.sleep(2)
101 | return 1
102 |
103 | o = ts.Func(func, count=3).print()
104 |
105 | g = ts.run(o, blocking=False)
106 | time.sleep(3)
107 | g.stop()
108 |
109 | ser = o.save()
110 |
111 | n = ts.Node.restore(ser)
112 |
113 | ts.run(n)
114 |
--------------------------------------------------------------------------------
/tributary/parser/__init__.py:
--------------------------------------------------------------------------------
1 | import ast
2 | import inspect
3 | import textwrap
4 |
5 |
6 | def parseASTForMethod(meth):
7 | source = inspect.getsource(meth)
8 | dedented_source = textwrap.dedent(source)
9 | mod = ast.parse(dedented_source)
10 | return mod.body[0], mod
11 |
12 |
13 | def pprintAst(astNode):
14 | print(ast.dump(astNode, indent=4))
15 |
16 |
17 | def pprintCode(astNode):
18 | print(ast.unparse(astNode))
19 |
20 |
21 | def isClassAttribute(node):
22 | # self.aNode()
23 | if (
24 | isinstance(node, ast.Call)
25 | and isinstance(node.func, ast.Attribute)
26 | and node.func.value.id == "self"
27 | ):
28 | return node.func.attr
29 | # self.aNode
30 | elif isinstance(node, ast.Attribute) and node.value.id == "self":
31 | return node.attr
32 |
33 |
34 | def getClassAttributesUsedInMethod(root):
35 | attribute_deps = set()
36 | for node in ast.walk(root):
37 | attr = isClassAttribute(node)
38 | if attr:
39 | attribute_deps.add(attr)
40 | return attribute_deps
41 |
42 |
43 | # append attribute args to method definition
44 | def addAttributeDepsToMethodSignature(root, attribute_deps):
45 | for attribute_dep in attribute_deps:
46 | append = True
47 |
48 | for meth_arg in root.args.args:
49 | if meth_arg.arg == attribute_dep:
50 | append = False
51 |
52 | if append:
53 | if root.args.args:
54 | # use first one
55 | lineno = root.args.args[0].lineno
56 | else:
57 | # use definition
58 | lineno = root.lineno
59 |
60 | # set lineno to as close as we can, but can't do col_offset really
61 | root.args.args.append(ast.arg(attribute_dep, lineno=lineno, col_offset=0))
62 |
63 |
64 | class Transformer(ast.NodeTransformer):
65 | def __init__(self, *args, **kwargs):
66 | super().__init__(*args, **kwargs)
67 | self._transformed_to_nodes = []
68 |
69 | def generic_visit(self, node):
70 | # Need to call super() in any case to visit child nodes of the current one.
71 | super().generic_visit(node)
72 |
73 | if isClassAttribute(node) and isinstance(node, ast.Call):
74 | # Call(
75 | # func=Attribute(
76 | # value=Name(id='self', ctx=Load()),
77 | # attr='func2',
78 | # ctx=Load()),
79 | # args=[],
80 | # keywords=[])
81 | #
82 | # then replace with argument
83 | #
84 | # Name(id='blerg', ctx=Load())
85 | return ast.Name(
86 | id=node.func.attr,
87 | ctx=ast.Load(),
88 | lineno=node.func.lineno,
89 | col_offset=node.func.col_offset,
90 | )
91 | elif isClassAttribute(node) and isinstance(node, ast.Attribute):
92 | # Attribute(
93 | # value=Name(id='self', ctx=Load()),
94 | # attr='func2',
95 | # ctx=Load())
96 | #
97 | # then replace with argument
98 | #
99 | # Name(id='blerg', ctx=Load())
100 | name = ast.Name(
101 | id=node.attr,
102 | ctx=ast.Load(),
103 | lineno=node.lineno,
104 | col_offset=node.col_offset,
105 | )
106 | self._transformed_to_nodes.append(name)
107 | return name
108 | elif isinstance(node, ast.Call) and node.func in self._transformed_to_nodes:
109 | # if we transformed the inner attribute but its still being called, e.g.
110 | # def func1(self, func2):
111 | # return func2() + 1
112 |
113 | # in this case, just promote the inner name to replace the call
114 | return node.func
115 |
116 | return node
117 |
--------------------------------------------------------------------------------
/tributary/streaming/graph.py:
--------------------------------------------------------------------------------
1 | import asyncio
2 | import sys
3 | from threading import Thread
4 |
5 | from ..base import StreamEnd, StreamNone, StreamRepeat, TributaryException # noqa: F401
6 |
7 |
8 | # nest_asyncio.apply()
9 |
10 |
11 | class StreamingGraph(object):
12 | """internal representation of the entire graph state"""
13 |
14 | def __init__(self, node):
15 | self._stop = False
16 | self._starting_node = node # noqa F405
17 |
18 | # coroutines to run on start and stop
19 | self._onstarts = []
20 | self._onstops = []
21 |
22 | # Collect graph
23 | self.getNodes()
24 |
25 | def getNodes(self):
26 | self._nodes = self._starting_node._deep_bfs()
27 |
28 | # Run through nodes and extract onstarts and onstops
29 | for ns in self._nodes:
30 | for n in ns:
31 | if n._onstarts:
32 | self._onstarts.extend(list(n._onstarts))
33 | if n._onstops:
34 | self._onstops.extend(list(n._onstops))
35 |
36 | # Check that all are async coroutines
37 | for call in self._onstarts + self._onstops:
38 | if not asyncio.iscoroutinefunction(call):
39 | raise TributaryException(
40 | "all onstarts and onstops must be async coroutines, got bad function: {}".format(
41 | call
42 | )
43 | )
44 |
45 | # return node levels
46 | return self._nodes
47 |
48 | def rebuild(self):
49 | # TODO
50 | return self._nodes
51 |
52 | def stop(self):
53 | self._stop = True
54 |
55 | async def _run(self):
56 | value, last, self._stop = None, None, False
57 |
58 | # run onstarts
59 | await asyncio.gather(*(asyncio.create_task(s()) for s in self._onstarts))
60 |
61 | while True:
62 | for level in self._nodes:
63 | if self._stop:
64 | break
65 |
66 | await asyncio.gather(*(asyncio.create_task(n()) for n in level))
67 |
68 | self.rebuild()
69 |
70 | if self._stop:
71 | break
72 |
73 | value, last = self._starting_node.value(), value
74 |
75 | if isinstance(value, StreamEnd):
76 | break
77 |
78 | # run `onstops`
79 | await asyncio.gather(*(asyncio.create_task(s()) for s in self._onstops))
80 |
81 | # return last val
82 | return last
83 |
84 | def run(self, blocking=True, newloop=False, start=True):
85 | if sys.platform == "win32":
86 | # Set to proactor event loop on window
87 | # (default in python 3.8+)
88 | loop = asyncio.ProactorEventLoop()
89 |
90 | else:
91 | if newloop:
92 | # create a new loop
93 | loop = asyncio.new_event_loop()
94 | else:
95 | # get the current loop
96 | loop = asyncio.get_event_loop()
97 |
98 | asyncio.set_event_loop(loop)
99 |
100 | task = loop.create_task(self._run())
101 |
102 | if blocking:
103 | # if loop is already running, make reentrant
104 | try:
105 | if loop.is_running():
106 |
107 | async def wait(task):
108 | return await task
109 |
110 | return asyncio.run_coroutine_threadsafe(wait(task), loop)
111 | # block until done
112 | return loop.run_until_complete(task)
113 | except KeyboardInterrupt:
114 | return
115 |
116 | if start:
117 | t = Thread(target=loop.run_until_complete, args=(task,))
118 | t.daemon = True
119 | t.start()
120 | return loop
121 |
122 | return loop, task
123 |
124 | def graph(self):
125 | return self._starting_node.graph()
126 |
127 | def graphviz(self):
128 | return self._starting_node.graphviz()
129 |
130 | def dagre(self):
131 | return self._starting_node.dagre()
132 |
--------------------------------------------------------------------------------
/tributary/tests/test_data/ohlc.csv:
--------------------------------------------------------------------------------
1 | ,open,high,low,close,volume
2 | 2015-01-01,100.0,110.29930270278898,91.73413290250907,101.69736937634046,7293
3 | 2015-01-02,103.05583881929202,118.9393974472462,87.43703059322297,118.20880076653276,5482
4 | 2015-01-03,116.42928855662532,120.60792871633784,86.07437225927684,89.40772782958769,9896
5 | 2015-01-04,91.43832691022322,119.73257082118027,91.43832691022322,108.383282754049,3858
6 | 2015-01-05,108.75140205204575,110.34267545748654,72.9315161762638,74.14874724969894,7834
7 | 2015-01-06,73.47603272991493,90.15709885262719,71.92228409684014,85.51368828776063,4309
8 | 2015-01-07,84.75208049891495,107.09568500910248,71.64487035301076,97.87617012517279,9522
9 | 2015-01-08,97.76916841721632,111.32163064057094,85.49671140950186,111.32163064057094,4433
10 | 2015-01-09,111.18525979885999,135.8376318726354,108.03708352161485,130.172922752803,5100
11 | 2015-01-10,130.9068358760611,144.29236641999486,123.41194350920175,128.28793550069938,3851
12 | 2015-01-11,127.91266126117857,153.06811260530822,124.4386050899744,150.2971247540012,6669
13 | 2015-01-12,148.58106834799128,156.41494091441953,124.81909400573508,133.95443035885523,1453
14 | 2015-01-13,134.97450505705342,153.09353713977853,117.671167683222,145.11743432194027,3934
15 | 2015-01-14,144.1014262136132,150.18221879530452,120.12903603589662,137.64697077968881,7351
16 | 2015-01-15,138.67716988937627,142.6264116891128,126.78232084171728,130.2253839956803,6819
17 | 2015-01-16,131.4112580806366,134.77361075040497,110.96488247262371,123.71264368849467,5665
18 | 2015-01-17,123.18250308957536,129.02413367346492,101.13365148112439,120.75896244502889,7499
19 | 2015-01-18,119.90286768565262,134.82258381429972,107.41699293965927,132.28486221104092,5096
20 | 2015-01-19,131.57390485002406,133.72295391776765,113.35660753920489,113.41177862220519,1187
21 | 2015-01-20,113.7431428480674,113.7431428480674,94.51858750951723,99.22236832930696,9317
22 | 2015-01-21,100.35283242667116,122.3679666157776,90.27172327008745,121.80004095938168,9149
23 | 2015-01-22,122.71362753005486,135.1924375385195,116.28915817325013,123.01232184381233,3247
24 | 2015-01-23,124.21328832303007,161.16179804229884,121.54343416701995,158.53540522073143,3459
25 | 2015-01-24,159.26908540402658,192.4359186036799,157.0212448783538,189.10432279660859,9919
26 | 2015-01-25,190.1189211196106,197.619199314573,180.39092841848762,186.4497893847748,2443
27 | 2015-01-26,184.89963988369445,203.47084757062254,182.22385990109115,193.43410744746475,7819
28 | 2015-01-27,193.97094279027124,210.78539097676855,186.19097570236957,210.64127040246146,5126
29 | 2015-01-28,210.26295748468885,220.72201033799936,203.43511385172417,207.6492703885567,1448
30 | 2015-01-29,206.1504468215134,233.36710735407522,206.1504468215134,231.1086742225846,6213
31 | 2015-01-30,232.57233952930162,234.6520438443496,194.65386403314056,195.76386500238758,5448
32 | 2015-01-31,195.40563642498552,204.7100325748486,179.5853536279364,189.38794005229968,3399
33 | 2015-02-01,189.15778995352548,197.0710943937962,170.38717952999997,184.5351937335841,7180
34 | 2015-02-02,185.00767569817862,188.10389965392793,174.8339909857694,181.68602083649856,4005
35 | 2015-02-03,180.48369179014546,195.77838794209413,176.05622488910507,193.70505189482463,9126
36 | 2015-02-04,191.9221735915871,207.3148344865269,188.27029899804043,202.81252503505488,8629
37 | 2015-02-05,202.3661097057831,202.66495154053686,182.14308649411993,200.5959058788303,1023
38 | 2015-02-06,201.6635923676699,216.42754432522244,198.68830449252832,214.423097464487,5056
39 | 2015-02-07,214.2674228976109,217.00499563409736,190.58935414264676,191.21289516713387,5420
40 | 2015-02-08,193.06713175368918,198.59549830732342,174.19466096740152,174.54319895211492,5716
41 | 2015-02-09,175.65206192476944,181.16585496476287,168.9856597862254,175.54458496047258,6734
42 | 2015-02-10,176.40461360570268,179.95136505651143,158.6254767021947,163.06894142476406,9675
43 | 2015-02-11,163.42279584709846,169.4980998181426,143.14804859749336,143.14804859749336,5618
44 | 2015-02-12,141.0131375108744,169.77946489925014,138.26602477156618,145.01951706223846,9339
45 | 2015-02-13,145.17515945104867,160.76757326608458,139.23524280974456,149.9769871089409,6458
46 | 2015-02-14,150.12468992876074,153.38732444512974,126.58857724912933,140.03224858662165,9612
47 | 2015-02-15,140.38223357745562,167.71044806699217,134.63778892326485,157.9131458264648,2961
48 | 2015-02-16,156.20601753125888,178.60429295824596,156.08168466667362,165.34393619214524,6856
49 | 2015-02-17,165.95021787008957,173.0718869018042,155.1360939555823,164.89240595032828,7545
50 | 2015-02-18,165.66300941689536,189.27288245420232,162.00361283602473,177.8294009010483,2425
--------------------------------------------------------------------------------
/tributary/functional/__init__.py:
--------------------------------------------------------------------------------
1 | from __future__ import print_function
2 | import os
3 |
4 | if os.name != "nt":
5 | from gevent import monkey
6 |
7 | _PATCHED = False
8 |
9 | if not _PATCHED:
10 | monkey.patch_all(thread=False, select=False)
11 | _PATCHED = True
12 |
13 | from functools import partial # noqa: E402
14 | from concurrent.futures.thread import _WorkItem, BrokenThreadPool # noqa: E402
15 | from concurrent.futures import ThreadPoolExecutor, _base # noqa: E402
16 | import concurrent.futures.thread as cft # noqa: E402
17 | from .input import * # noqa: F401, F403, E402
18 | from .utils import * # noqa: F401, F403, E402
19 |
20 |
21 | _EXECUTOR = ThreadPoolExecutor(max_workers=10)
22 |
23 |
24 | def submit(fn, *args, **kwargs):
25 | """Submit a function to be run on the executor (internal)
26 |
27 | Args:
28 | fn (callable): function to call
29 | args (tuple): args to pass to function
30 | kwargs (dict): kwargs to pass to function
31 | """
32 | if _EXECUTOR is None:
33 | raise RuntimeError("Already stopped!")
34 | self = _EXECUTOR
35 | with self._shutdown_lock:
36 | if hasattr(self, "_broken") and self._broken:
37 | raise BrokenThreadPool(self._broken)
38 |
39 | if hasattr(self, "_shutdown") and self._shutdown:
40 | raise RuntimeError("cannot schedule new futures after shutdown")
41 | if cft._shutdown:
42 | raise RuntimeError(
43 | "cannot schedule new futures after interpreter shutdown"
44 | )
45 |
46 | f = _base.Future()
47 | w = _WorkItem(f, fn, args, kwargs)
48 |
49 | self._work_queue.put(w)
50 | self._adjust_thread_count()
51 | return f
52 |
53 |
54 | def run_submit(fn, function_to_call, *args, **kwargs):
55 | try:
56 | f = submit(fn, *args, **kwargs)
57 | except RuntimeError:
58 | # means we've shutdown, stop
59 | return
60 |
61 | if function_to_call:
62 | f.add_done_callback(
63 | lambda fut: function_to_call(fut.result()) if fut.result() else None
64 | )
65 |
66 |
67 | def pipeline(
68 | funcs, func_callbacks, func_kwargs=None, on_data=print, on_data_kwargs=None
69 | ):
70 | """Pipeline a sequence of functions together via callbacks
71 |
72 | Args:
73 | funcs (list of callables): list of functions to pipeline
74 | func_callbacks (List[str]): list of strings indicating the callback names (kwargs of the funcs)
75 | func_kwargs (List[dict]): list of kwargs dicts, one per function
76 | on_data (callable): callable to call at the end of the pipeline
77 | on_data_kwargs (dict): kwargs to pass to the on_data function
78 | """
79 | global _EXECUTOR
80 | if _EXECUTOR is None:
81 | _EXECUTOR = ThreadPoolExecutor(max_workers=2)
82 |
83 | func_kwargs = func_kwargs or []
84 | on_data_kwargs = on_data_kwargs or {}
85 |
86 | # organize args for functional pipeline
87 | assembled = []
88 | for i, func in enumerate(funcs):
89 | cb = func_callbacks[i] if i < len(func_callbacks) else "on_data"
90 | kwargs = func_kwargs[i] if i < len(func_kwargs) else {}
91 | assembled.append((func, cb, kwargs))
92 |
93 | # assemble pipeline
94 | assembled.reverse()
95 | lambdas = [lambda d, f=on_data: run_submit(f, None, d, **on_data_kwargs)]
96 | for i, a in enumerate(assembled):
97 | func, cb, kwargs = a
98 | function_to_call = lambdas[i]
99 | kwargs[cb] = function_to_call
100 |
101 | if i != len(assembled) - 1:
102 | lambdas.append(
103 | lambda d, kw=kwargs, f=func, ftc=function_to_call: run_submit(f, ftc, d, **kw)
104 | )
105 | lambdas[-1].__name__ = func.__name__
106 | else:
107 | lambdas.append(
108 | lambda kw=kwargs, f=func, ftc=function_to_call: run_submit(f, ftc, **kw)
109 | )
110 | lambdas[-1].__name__ = func.__name__
111 |
112 | # start entrypoint
113 | lambdas[-1]()
114 |
115 |
116 | def stop():
117 | """Stop the executor for the pipeline runtime"""
118 | global _EXECUTOR
119 | _EXECUTOR.shutdown(False)
120 | _EXECUTOR._threads.clear()
121 | cft._threads_queues.clear()
122 | _EXECUTOR = None
123 |
124 |
125 | def wrap(function, *args, **kwargs):
126 | """Wrap a function in a partial"""
127 | func = partial(function, *args, **kwargs)
128 | func.__name__ = function.__name__
129 | return func
130 |
--------------------------------------------------------------------------------
/tributary/utils.py:
--------------------------------------------------------------------------------
1 | import functools
2 | import inspect
3 |
4 | import numpy as np
5 | import pandas as pd
6 |
7 | from collections import namedtuple
8 |
9 | from .base import StreamEnd
10 |
11 |
12 | def _either_type(f):
13 | """Utility decorator to allow for either no-arg decorator or arg decorator
14 |
15 | Args:
16 | f (callable): Callable to decorate
17 | """
18 |
19 | @functools.wraps(f)
20 | def new_dec(*args, **kwargs):
21 | if len(args) == 1 and len(kwargs) == 0 and callable(args[0]):
22 | # actual decorated function
23 | return f(args[0])
24 | else:
25 | # decorator arguments
26 | return lambda realf: f(realf, *args, **kwargs)
27 |
28 | return new_dec
29 |
30 |
31 | def LazyToStreaming(lazy_node):
32 | from .base import TributaryException
33 | from .lazy import LazyNode
34 | from .streaming import Func, StreamingNode
35 |
36 | if isinstance(lazy_node, StreamingNode):
37 | return lazy_node
38 | if not isinstance(lazy_node, LazyNode):
39 | raise TributaryException("Malformed input: {}".format(lazy_node))
40 |
41 | return Func(func=lambda node=lazy_node: node())
42 |
43 |
44 | def _compare(new_value, old_value):
45 | """return true if value is new, otherwise false"""
46 | if isinstance(new_value, (int, float)) and type(new_value) is type(old_value):
47 | # if numeric, compare within a threshold
48 | # TODO
49 | return abs(new_value - old_value) > 0.00001
50 |
51 | elif type(new_value) is not type(old_value):
52 | return True
53 |
54 | elif isinstance(new_value, (pd.DataFrame, pd.Series, np.ndarray)) or isinstance(
55 | old_value, (pd.DataFrame, pd.Series, np.ndarray)
56 | ):
57 | return (abs(new_value - old_value) > 0.00001).any()
58 |
59 | return new_value != old_value
60 |
61 |
62 | def _ismethod(callable):
63 | try:
64 | return callable and (
65 | inspect.ismethod(callable)
66 | or (
67 | inspect.getfullargspec(callable).args
68 | and inspect.getfullargspec(callable).args[0] == "self"
69 | )
70 | )
71 | except TypeError:
72 | return False
73 |
74 |
75 | def anext(obj):
76 | return obj.__anext__()
77 |
78 |
79 | def _gen_to_func(generator):
80 | try:
81 | return next(generator)
82 | except StopIteration:
83 | return StreamEnd()
84 |
85 |
86 | async def _agen_to_func(generator):
87 | try:
88 | return await anext(generator)
89 | except StopAsyncIteration:
90 | return StreamEnd()
91 |
92 |
93 | def _gen_node(n):
94 | from .streaming import Const, Func
95 | from .lazy import Node as LazyNode
96 | from .streaming import Node as StreamingNode
97 |
98 | if isinstance(n, StreamingNode):
99 | return n
100 | elif isinstance(n, LazyNode):
101 | return LazyToStreaming(n)
102 | elif callable(n):
103 | return Func(n, name="Callable")
104 | return Const(n)
105 |
106 |
107 | class Parameter(object):
108 | def __init__(self, name, position, default, kind):
109 | self.name = name
110 | self.position = position
111 | self.kind = kind
112 |
113 | if kind == inspect._ParameterKind.VAR_POSITIONAL:
114 | # default is empty tuple
115 | self.default = tuple()
116 |
117 | elif kind == inspect._ParameterKind.VAR_KEYWORD:
118 | # default is empty dict
119 | self.default = {}
120 | else:
121 | # default can be inspect._empty
122 | self.default = default
123 |
124 |
125 | def extractParameters(callable):
126 | """Given a function, extract the arguments and defaults
127 |
128 | Args:
129 | callable (callable): the callable to inspect
130 | """
131 |
132 | # TODO handle generators as lambda g=g: next(g)
133 | if inspect.isgeneratorfunction(callable):
134 | raise NotImplementedError()
135 |
136 | # wrap args and kwargs of function to node
137 | try:
138 | signature = inspect.signature(callable)
139 |
140 | except ValueError:
141 | # https://bugs.python.org/issue20189
142 | signature = namedtuple("Signature", ["parameters"])({})
143 |
144 | # extract all args. args/kwargs become tuple/dict input
145 | return [
146 | Parameter(p.name, i, p.default, p.kind)
147 | for i, p in enumerate(signature.parameters.values())
148 | ]
149 |
--------------------------------------------------------------------------------
/examples/lazy_blackscholes_autodiff.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Black-Scholes Autodifferentiation Example\n",
8 | "In this example we are calculating $\\theta$ values using the Black-Scholes model via a Lazy Stream."
9 | ]
10 | },
11 | {
12 | "cell_type": "code",
13 | "execution_count": 5,
14 | "metadata": {},
15 | "outputs": [],
16 | "source": [
17 | "# Imports & Constants\n",
18 | "import tributary.lazy as tl\n",
19 | "import math\n",
20 | "TRADING_DAYS = 252"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "## We can use the tributary $\\textit{Erf}$ operator to define a Standard Normal CDF\n",
28 | "The CDF is defined using $\\Phi(x, \\mu, \\sigma) = \\frac{1}{2}\\left(1 + Erf\\left(\\frac{x-\\mu}{\\sigma\\sqrt{2}}\\right)\\right)$ for $\\mu = 0, \\sigma = 1$"
29 | ]
30 | },
31 | {
32 | "cell_type": "code",
33 | "execution_count": 6,
34 | "metadata": {},
35 | "outputs": [],
36 | "source": [
37 | "def normal_cdf(x):\n",
38 | " return ((tl.Erf(x / tl.Node((math.sqrt(2),0), use_dual=True)) + tl.Node((1,0), use_dual=True))\n",
39 | " / tl.Node((2,0), use_dual=True))"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 | "## Now we can define our stream\n",
47 | "For this example we are going to use a Call Option.\n",
48 | "We define the model as follows:\n",
49 | "\n",
50 | "$C_p = S\\Phi(d_1) - Xe^{-rt}\\Phi(d_2)$\n",
51 | "\n",
52 | "$d_1 = \\frac{ln\\left(\\frac{S}{X}\\right) + \\left(r + \\frac{\\sigma^2}{2}\\right)t}{\\sigma\\sqrt{t}}$\n",
53 | "\n",
54 | "$d_2 = d_1 - \\sigma\\sqrt{t}$\n",
55 | "\n",
56 | "Where\n",
57 | "\n",
58 | "$C_p$ - Price of Call Option\n",
59 | "\n",
60 | "$S$ - Stock Price\n",
61 | "\n",
62 | "$X$ - Strike Price\n",
63 | "\n",
64 | "$r$ - Risk Free Interest Rate\n",
65 | "\n",
66 | "$\\sigma$ - Stock Price Volatility\n",
67 | "\n",
68 | "$t$ - Time to Maturity\n",
69 | "\n",
70 | "$\\Phi$ - Standard Normal CDF (defined above)"
71 | ]
72 | },
73 | {
74 | "cell_type": "markdown",
75 | "metadata": {},
76 | "source": [
77 | "### Lazy Graph"
78 | ]
79 | },
80 | {
81 | "cell_type": "code",
82 | "execution_count": 7,
83 | "metadata": {},
84 | "outputs": [],
85 | "source": [
86 | "strike_price = tl.Node(value=(203, 0), name='Strike Price', use_dual=True)\n",
87 | "stock_price = tl.Node(value=(210, 0), name='Stock Price', use_dual=True)\n",
88 | "r = tl.Node(value=(0.2175, 0), name='Risk Free Interest Rate', use_dual=True)\n",
89 | "time = tl.Div(\n",
90 | " tl.Node(value=(4, 1), name='Time to Maturity', use_dual=True), \n",
91 | " tl.Node(value=(TRADING_DAYS,0), use_dual=True)\n",
92 | " )\n",
93 | "vol = tl.Node(value=(14.04, 0), name='Stock Price Volatility', use_dual=True)\n",
94 | "\n",
95 | "d1 = ((tl.Log(stock_price / strike_price) + time * (r + (vol**2 / tl.Node((2,0), use_dual=True)))) \n",
96 | " / (vol * tl.Sqrt(time)))\n",
97 | " \n",
98 | "\n",
99 | "d2 = d1 - vol * tl.Sqrt(time)\n",
100 | "\n",
101 | "opt_price_lazy = stock_price * normal_cdf(d1) - strike_price * tl.Exp(tl.Negate(r * time)) * normal_cdf(d2)"
102 | ]
103 | },
104 | {
105 | "cell_type": "code",
106 | "execution_count": 8,
107 | "metadata": {},
108 | "outputs": [
109 | {
110 | "data": {
111 | "text/plain": [
112 | "(132.41454517196362, 12.327192918542838)"
113 | ]
114 | },
115 | "execution_count": 8,
116 | "metadata": {},
117 | "output_type": "execute_result"
118 | }
119 | ],
120 | "source": [
121 | "# Run it\n",
122 | "opt_price_lazy()"
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": null,
128 | "metadata": {},
129 | "outputs": [],
130 | "source": []
131 | }
132 | ],
133 | "metadata": {
134 | "kernelspec": {
135 | "display_name": "Python 3",
136 | "language": "python",
137 | "name": "python3"
138 | },
139 | "language_info": {
140 | "codemirror_mode": {
141 | "name": "ipython",
142 | "version": 3
143 | },
144 | "file_extension": ".py",
145 | "mimetype": "text/x-python",
146 | "name": "python",
147 | "nbconvert_exporter": "python",
148 | "pygments_lexer": "ipython3",
149 | "version": "3.7.5"
150 | }
151 | },
152 | "nbformat": 4,
153 | "nbformat_minor": 4
154 | }
155 |
--------------------------------------------------------------------------------
/examples/pipeline_ws_http_sio.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {},
7 | "outputs": [],
8 | "source": [
9 | "import tributary as t\n",
10 | "from perspective import PerspectiveWidget"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": 2,
16 | "metadata": {},
17 | "outputs": [],
18 | "source": [
19 | "import tributary.functional as tp"
20 | ]
21 | },
22 | {
23 | "cell_type": "code",
24 | "execution_count": 3,
25 | "metadata": {},
26 | "outputs": [],
27 | "source": [
28 | "func = tp.wrap(tp.ws, url='wss://ws.paine.nyc', wrap=True, json=True)"
29 | ]
30 | },
31 | {
32 | "cell_type": "code",
33 | "execution_count": 8,
34 | "metadata": {},
35 | "outputs": [
36 | {
37 | "data": {
38 | "application/vnd.jupyter.widget-view+json": {
39 | "model_id": "b02b3aa4cc254e45a3e3750f87445697",
40 | "version_major": 2,
41 | "version_minor": 0
42 | },
43 | "text/plain": [
44 | "PerspectiveWidget(columns=['A', 'B', 'C', 'D'], plugin='y_line')"
45 | ]
46 | },
47 | "metadata": {},
48 | "output_type": "display_data"
49 | }
50 | ],
51 | "source": [
52 | "p = PerspectiveWidget({\"A\": float, \"B\": float, \"C\": float, \"D\": float}, plugin='y_line')\n",
53 | "p"
54 | ]
55 | },
56 | {
57 | "cell_type": "code",
58 | "execution_count": 9,
59 | "metadata": {},
60 | "outputs": [],
61 | "source": [
62 | "t.pipeline([func], ['callback'], on_data=p.update)"
63 | ]
64 | },
65 | {
66 | "cell_type": "code",
67 | "execution_count": 10,
68 | "metadata": {},
69 | "outputs": [],
70 | "source": [
71 | "t.stop()"
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": 12,
77 | "metadata": {},
78 | "outputs": [
79 | {
80 | "data": {
81 | "application/vnd.jupyter.widget-view+json": {
82 | "model_id": "cfc1d8ec52c84799bd233cc20ca74e36",
83 | "version_major": 2,
84 | "version_minor": 0
85 | },
86 | "text/plain": [
87 | "PerspectiveWidget(columns=['longitude', 'latitude', 'availableBikes'], plugin='xy_scatter')"
88 | ]
89 | },
90 | "metadata": {},
91 | "output_type": "display_data"
92 | }
93 | ],
94 | "source": [
95 | "http = tp.wrap(tp.http, url='https://unpkg.com/@jpmorganchase/perspective-examples@0.1.18/build/citibike.json', json=True, field='stationBeanList')\n",
96 | "psp = PerspectiveWidget({\"id\":int,\n",
97 | " \"stationName\": str,\n",
98 | " \"availableDocks\": int,\n",
99 | " \"totalDocks\": int,\n",
100 | " \"latitude\": float,\n",
101 | " \"longitude\": float,\n",
102 | " \"statusValue\": str,\n",
103 | " \"statusKey\":int,\n",
104 | " \"availableBikes\":int,\n",
105 | " \"stAddress1\": str,\n",
106 | " \"stAddress2\": str,\n",
107 | " \"city\": str,\n",
108 | " \"postalCode\": str,\n",
109 | " \"location\": str,\n",
110 | " \"altitude\": str,\n",
111 | " \"testStation\": bool,\n",
112 | " \"lastCommunicationTime\": \"datetime\",\n",
113 | " \"landMark\": str}, plugin='xy_scatter', columns=['longitude', 'latitude', 'availableBikes'])\n",
114 | "psp"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": 13,
120 | "metadata": {},
121 | "outputs": [],
122 | "source": [
123 | "t.pipeline([http], ['callback'], on_data=psp.update)"
124 | ]
125 | },
126 | {
127 | "cell_type": "code",
128 | "execution_count": null,
129 | "metadata": {},
130 | "outputs": [],
131 | "source": []
132 | }
133 | ],
134 | "metadata": {
135 | "kernelspec": {
136 | "display_name": "Python 3",
137 | "language": "python",
138 | "name": "python3"
139 | },
140 | "language_info": {
141 | "codemirror_mode": {
142 | "name": "ipython",
143 | "version": 3
144 | },
145 | "file_extension": ".py",
146 | "mimetype": "text/x-python",
147 | "name": "python",
148 | "nbconvert_exporter": "python",
149 | "pygments_lexer": "ipython3",
150 | "version": "3.7.5"
151 | }
152 | },
153 | "nbformat": 4,
154 | "nbformat_minor": 4
155 | }
156 |
--------------------------------------------------------------------------------
/CATALOG.md:
--------------------------------------------------------------------------------
1 | # Sources and Sinks
2 | ## Sources
3 | - Python Function/Generator/Async Function/Async Generator
4 | - Curve - yield through an iterable
5 | - Const - yield a constant
6 | - Timer - yield on an interval
7 | - Random - generates a random dictionary of values
8 | - File - streams data from a file, optionally loading each line as a json
9 | - HTTP - polls a url with GET requests, streams data out
10 | - HTTPServer - runs an http server and streams data sent by clients
11 | - Websocket - streams data from a websocket
12 | - WebsocketServer - runs a websocket server and streams data sent by clients
13 | - SocketIO - streams data from a socketIO connection
14 | - SocketIOServer - runs a socketIO server and streams data sent by clients
15 | - SSE - streams data from an SSE connection
16 | - Kafka - streams data from kafka
17 | - Postgres - streams data from postgres
18 |
19 | ## Sinks
20 | - Func - data to a python function
21 | - File - data to a file
22 | - HTTP - POSTs data to a URL
23 | - HTTPServer - runs an http server and streams data to connections
24 | - Websocket - streams data to a websocket
25 | - WebsocketServer - runs a websocket server and streams data to connections
26 | - SocketIO - streams data to a socketIO connection
27 | - SocketIOServer - runs a socketio server and streams data to connections
28 | - SSE - runs an SSE server and streams data to connections
29 | - Kafka - streams data to kafka
30 | - Postgres - streams data to postgres
31 | - Email - streams data and sends it in emails
32 | - TextMessage - streams data and sends it via text message
33 |
34 | # Transforms
35 | ## Modulate
36 | - Delay - Streaming wrapper to delay a stream
37 | - Throttle - Streaming wrapper to tick at most once per interval
38 | - Debounce - Streaming wrapper to only tick on new values
39 | - Apply - Streaming wrapper to apply a function to an input stream
40 | - Window - Streaming wrapper to collect a window of values
41 | - Unroll - Streaming wrapper to unroll an iterable stream
42 | - UnrollDataFrame - Streaming wrapper to unroll a dataframe into a stream
43 | - Merge - Streaming wrapper to merge 2 inputs into a single output
44 | - ListMerge - Streaming wrapper to merge 2 input lists into a single output list
45 | - DictMerge - Streaming wrapper to merge 2 input dicts into a single output dict. Preference is given to the second input (e.g. if keys overlap)
46 | - Reduce - Streaming wrapper to merge any number of inputs
47 | - FixedMap - Map input stream to a fixed number of outputs
48 | - Subprocess - Open a subprocess and yield results as they come. Can also stream data to subprocess (either instantaneous or long-running subprocess)
49 |
50 |
51 | ## Calculations
52 | Note that `tributary` can also be configured to operate on **dual numbers** for things like lazy or streaming autodifferentiation.
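For example, a lazy dual-number graph can be sketched as follows, mirroring `examples/lazy_blackscholes_autodiff.ipynb` (which is where the `(value, derivative)` tuple layout and `use_dual=True` flag come from); the expected output assumes standard dual-number arithmetic:

```python
import tributary.lazy as tl

# (value, derivative) pairs; the second slot seeds the derivative
x = tl.Node(value=(3.0, 1.0), name="x", use_dual=True)  # differentiate with respect to x
c = tl.Node(value=(2.0, 0.0), use_dual=True)            # constant, zero derivative

y = x ** 2 + c  # builds the lazy expression y = x^2 + 2

# evaluating returns a (value, derivative) pair; with standard dual arithmetic
# this should be (11.0, 6.0), i.e. y(3) and dy/dx at x = 3
print(y())
```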
53 |
54 | ### Arithmetic Operators
55 | - Noop (unary) - Pass input to output
56 | - Negate (unary) - -1 * input
57 | - Invert (unary) - 1/input
58 | - Add (binary) - add 2 inputs
59 | - Sub (binary) - subtract second input from first
60 | - Mult (binary) - multiply 2 inputs
61 | - Div (binary) - divide first input by second
62 | - RDiv (binary) - divide second input by first
63 | - Mod (binary) - first input % second input
64 | - Pow (binary) - first input^second input
65 | - Sum (n-ary) - sum all inputs
66 | - Average (n-ary) - average of all inputs
67 | - Round (unary)
68 | - Floor (unary)
69 | - Ceil (unary)
70 |
71 | ### Boolean Operators
72 | - Not (unary) - `Not` input
73 | - And (binary) - `And` inputs
74 | - Or (binary) - `Or` inputs
75 |
76 | ### Comparators
77 | - Equal (binary) - inputs are equal
78 | - NotEqual (binary) - inputs are not equal
79 | - Less (binary) - first input is less than second input
80 | - LessOrEqual (binary) - first input is less than or equal to second input
81 | - Greater (binary) - first input is greater than second input
82 | - GreaterOrEqual (binary) - first input is greater than or equal to second input
83 |
84 | ### Math
85 | - Log (unary)
86 | - Sin (unary)
87 | - Cos (unary)
88 | - Tan (unary)
89 | - Arcsin (unary)
90 | - Arccos (unary)
91 | - Arctan (unary)
92 | - Sqrt (unary)
93 | - Abs (unary)
94 | - Exp (unary)
95 | - Erf (unary)
96 |
97 | ### Financial Calculations
98 | - RSI - Relative Strength Index
99 | - MACD - Moving Average Convergence Divergence
100 |
101 | ## Converters
102 | - Int (unary)
103 | - Float (unary)
104 | - Bool (unary)
105 | - Str (unary)
106 |
107 | ## Basket Functions
108 | - Len (unary)
109 | - Count (unary)
110 | - Min (unary)
111 | - Max (unary)
112 | - Sum (unary)
113 | - Average (unary)
114 |
115 | ## Rolling
116 | - RollingCount - Node to count inputs
117 | - RollingMin - Node to take rolling min of inputs
118 | - RollingMax - Node to take rolling max of inputs
119 | - RollingSum - Node to take rolling sum of inputs
120 | - RollingAverage - Node to take the running average
121 | - SMA - Node to take the simple moving average over a window
122 | - EMA - Node to take an exponential moving average over a window
123 |
124 | ## Node Type Converters
125 | - Lazy->Streaming - wrap a lazy node for use in a streaming graph (see the sketch below)
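A rough sketch of this conversion, based on the `LazyToStreaming` helper defined in `tributary/utils.py` (the import path and the plain lazy `Node` constructor arguments are assumptions here):

```python
import tributary.lazy as tl
from tributary.utils import LazyToStreaming

# a plain (non-dual) lazy node; the value/name kwargs are assumed to match the lazy Node API
lazy_price = tl.Node(value=100.0, name="Price")

# wrap the lazy node so it can be ticked inside a streaming graph;
# internally this just wraps the node in a streaming Func
streaming_price = LazyToStreaming(lazy_price)
```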
126 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # tributary
2 | Python Data Streams
3 |
4 | [](https://github.com/1kbgz/tributary/actions?query=workflow%3A%22Build+Status%22)
5 | [](https://codecov.io/gh/1kbgz/tributary)
6 | [](https://pypi.python.org/pypi/tributary)
7 | [](https://pypi.python.org/pypi/tributary)
8 | [](https://mybinder.org/v2/gh/1kbgz/tributary/main?urlpath=lab)
9 |
10 |
11 | Tributary is a library for constructing dataflow graphs in python. Unlike many other DAG libraries in python ([airflow](https://airflow.apache.org), [luigi](https://luigi.readthedocs.io/en/stable/), [prefect](https://docs.prefect.io), [dagster](https://docs.dagster.io), [dask](https://dask.org), [kedro](https://github.com/quantumblacklabs/kedro), etc), tributary is not designed with data/etl pipelines or scheduling in mind. Instead, tributary is more similar to libraries like [mdf](https://github.com/man-group/mdf), [loman](https://github.com/janushendersonassetallocation/loman), [pyungo](https://github.com/cedricleroy/pyungo), [streamz](https://streamz.readthedocs.io/en/latest/), or [pyfunctional](https://github.com/EntilZha/PyFunctional), in that it is designed to be used as the implementation for a data model. One such example is the [greeks](https://github.com/1kbgz/greeks) library, which leverages tributary to build data models for [options pricing](https://www.investopedia.com/articles/optioninvestor/07/options_beat_market.asp).
12 |
13 | 
14 |
15 |
16 | # Installation
17 | Install with pip:
18 |
19 | `pip install tributary`
20 |
21 | or with conda:
22 |
23 | `conda install -c conda-forge tributary`
24 |
25 | or from source:
26 |
27 | `python setup.py install`
28 |
29 | Note: If installing from source or with pip, you'll also need [Graphviz itself](https://www.graphviz.org/download/) if you want to visualize the graph using the `.graphviz()` method.
30 |
31 | # Stream Types
32 | Tributary offers several kinds of streams:
33 |
34 | ## Streaming
35 | These are synchronous, reactive data streams, built using asynchronous python generators. They are designed to mimic complex event processors in terms of event ordering.
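A minimal sketch; `Func` is used throughout the codebase, while `Print` and `run` are assumed to be exposed from `tributary.streaming` under these names:

```python
import tributary.streaming as ts

def source():
    # a plain generator can drive a streaming graph; each yield is one tick
    for i in range(3):
        yield i

# wrap the generator in a node, print each value as it flows through, and run the graph
# (Print and run are assumed names; adjust to your installed version if they differ)
out = ts.Print(ts.Func(source))
ts.run(out)
```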
36 |
37 | ## Functional
38 | These are functional streams, built by currying python functions (callbacks).
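For example, using `pipeline` from `tributary.functional` (the `source` and `double` functions below are hypothetical stand-ins; each stage receives the next stage as its callback kwarg):

```python
import tributary.functional as tp

def source(on_data):
    # push a few values into the pipeline via the supplied callback
    for i in range(3):
        on_data(i)

def double(data, on_data):
    on_data(data * 2)

# wire source -> double -> print; the second list names each function's callback kwarg
tp.pipeline([source, double], ["on_data", "on_data"], on_data=print)

# tp.stop() shuts down the underlying executor when you are finished
```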
39 |
40 | ## Lazy
41 | These are lazily-evaluated python streams, where outputs are propagated only as inputs change. They are implemented as directed acyclic graphs.
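A small sketch; operator overloading on `tl.Node` follows the usage in `examples/lazy_blackscholes_autodiff.ipynb`, and the plain (non-dual) constructor arguments are assumed:

```python
import tributary.lazy as tl

x = tl.Node(value=2, name="x")
y = tl.Node(value=10, name="y")

z = x * y + x  # builds a lazy expression graph; nothing is evaluated yet

print(z())  # evaluating the output node pulls through the graph: expected 22
# changing an input marks its dependents dirty, so only affected nodes recompute
```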
42 |
43 | # Examples
44 | - [Streaming](docs/examples/streaming/streaming.md): In this example, we construct a variety of forward-propagating reactive graphs.
45 | - [Lazy](docs/examples/lazy/lazy.md): In this example, we construct a variety of lazily-evaluated directed acyclic computation graphs.
46 | - [Automatic Differentiation](docs/examples/autodiff/autodiff.md): In this example, we use `tributary` to perform automatic differentiation on both lazy and streaming graphs.
47 |
48 | # Graph Visualization
49 | You can visualize the graph with Graphviz. All streaming and lazy nodes support a `graphviz` method.
50 |
51 | Streaming and lazy nodes also support [ipydagred3](https://github.com/timkpaine/ipydagred3) for live update monitoring.
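For example (the `graphviz()`/`dagre()` method names mirror those on `StreamingGraph` in `tributary/streaming/graph.py`; the same names are assumed here for lazy nodes):

```python
import tributary.lazy as tl

x = tl.Node(value=1, name="x")
out = x + tl.Node(value=2, name="y")

out.graphviz()  # static Graphviz rendering (requires the Graphviz binaries)
# out.dagre()   # live-updating ipydagred3 widget, if ipydagred3 is installed
```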
52 |
53 | ## Streaming
54 | 
55 |
56 | Here green indicates executing, yellow indicates stalled for backpressure, and red indicates that `StreamEnd` has been propagated (i.e. the stream has ended).
57 |
58 | ## Lazy
59 | 
60 |
61 | Here green indicates executing, and red indicates that the node is dirty. Note that determining whether a node is dirty is itself done lazily (we can check with `isDirty`, which will update the node's graph state).
62 |
63 | ## Catalog
64 | See the [CATALOG](CATALOG.md) for a full list of functions, transforms, sources, and sinks.
65 |
66 | ## Support / Contributors
67 | Thanks to the following organizations for providing code or financial support.
68 |
72 | Nemoulous
73 |
74 | ## License
75 | This software is licensed under the Apache 2.0 license. See the [LICENSE](LICENSE) file for details.
76 |
77 | ## Alternatives
78 | Here is an incomplete list of libraries which implement similar/overlapping functionality
79 |
80 | - [man-group/mdf](https://github.com/man-group/mdf)
81 | - [janushendersonassetallocation/loman](https://github.com/janushendersonassetallocation/loman)
82 | - [cedricleroy/pyungo](https://github.com/cedricleroy/pyungo)
83 | - [python-streamz/streamz](https://github.com/python-streamz/streamz)
84 | - [EntilZha/pyfunctional](https://github.com/EntilZha/PyFunctional)
85 | - [stitchfix/hamilton](https://github.com/stitchfix/hamilton)
86 |
--------------------------------------------------------------------------------