├── README.md └── papers ├── LinearAlgebra ├── LinearAlgebra.pdf ├── README.MD └── main.tex └── README.MD /README.md: -------------------------------------------------------------------------------- 1 | # ConsenSys Research 2 | 3 | [![Join the chat at https://gitter.im/ConsenSys/research](https://badges.gitter.im/ConsenSys/research.svg)](https://gitter.im/ConsenSys/research?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) 4 | 5 | 6 | 7 | Welcome to ConsenSys Research. This repository is intended as an open forum for cross-pollination and collaboration with researchers. 8 | -------------------------------------------------------------------------------- /papers/LinearAlgebra/LinearAlgebra.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ConsenSysMesh/research/e8f24c9e06e8e00b4637d7897a0021bd7f261e5b/papers/LinearAlgebra/LinearAlgebra.pdf -------------------------------------------------------------------------------- /papers/LinearAlgebra/README.MD: -------------------------------------------------------------------------------- 1 | # Linear Algebra LaTeX paper reference 2 | 3 | Use this LaTeX format as a model for future papers. 
-------------------------------------------------------------------------------- /papers/LinearAlgebra/main.tex: -------------------------------------------------------------------------------- 1 | \documentclass{article} 2 | \usepackage[utf8]{inputenc} 3 | \usepackage{amsmath,amsthm,amssymb,amsfonts} 4 | 6 | \usepackage{tikz,graphicx} % For including pictures 7 | \usepackage{hyperref} % For formatting (clickable) URLs 8 | \hypersetup{colorlinks=true, urlcolor=blue, urlbordercolor={1 0 0}} 10 | 12 | \usepackage{underscore} 13 | \usepackage{wrapfig} 14 | 15 | \usepackage{fancyhdr} 16 | 17 | \usetikzlibrary{arrows,patterns} 18 | 19 | \title{LinearAlgebra} 20 | 21 | \date{July 2017} 22 | 23 | \parindent=0pt 24 | \parskip=\bigskipamount 25 | 26 | \def\mat#1#2#3#4{ 27 | \left[\begin{matrix}#1&#2\\#3&#4\end{matrix}\right] 28 | } 29 | 30 | 31 | \def\Ker{\hbox{Ker\,}} 32 | \def\Rng{\hbox{Rng\,}} 33 | \def\es{\Lambda} 34 | \def\rev#1{\overleftarrow #1} 35 | 36 | \begin{document} 37 | \maketitle 38 | \section{Using this document} 39 | 40 | I would like this to become a tutorial on the linear algebra of binary trees and its applications. To make it interactive, you might find it useful to ask questions now and then. 41 | 42 | A particularly useful way to do this in \LaTeX\ is to use the ``marginpar'' command.\marginpar{Ask a question here.} Just insert a question in the margin at the appropriate place, and as I periodically review/update the document, I'll add more text to answer the question, or find an appropriate link where you can learn a little more. 43 | 44 | \section{Prerequisites} 45 | This document will help familiarize you with the linear algebra used in creating fractal binary trees. 
I want to focus on trees, so here are some things from linear algebra I'll assume you are already familiar with: 46 | \begin{enumerate} 47 | \item Representation of matrices as linear transformations -- such as rotations, shears, reflections, scaling axes, projections, etc. Here is a \href{http://www.vincematsko.com/Spring2017/MAT195/LinAlgGuide.pdf}{brief summary} I used for my digital art course. 48 | 49 | \item Singular and nonsingular matrices. 50 | 51 | \item Finding eigenvalues and eigenvectors of a matrix; algebraic and geometric multiplicities of eigenvalues; defective matrices; writing a matrix as $A=PDP^{-1},$ where $D$ is a diagonal matrix of eigenvalues, and $P$ is a matrix whose columns are the corresponding eigenvectors (sometimes called the spectral decomposition). 52 | 53 | \item Trace and determinant of a matrix. 54 | 55 | \item Range, kernel, and rank of a matrix. 56 | 57 | \item Basic matrix algebra. 58 | \end{enumerate} 59 | 60 | \section{Representation of matrices} 61 | 62 | There are many ways to classify $2\times2$ matrices, but in general, writing a matrix in the form $\mat abcd$ is not particularly helpful. As we will see in later examples, it is useful to classify by two criteria: whether the matrix is invertible or singular, and whether it is defective or nondefective. 63 | 64 | \begin{enumerate} 65 | \item[(IN):] Invertible and nondefective. 
In this case, the matrix is diagonalizable, and has the form 66 | \begin{equation}[{\bf v}_1:{\bf v}_2]\mat{\lambda_1}00{\lambda_2}[{\bf v}_1:{\bf v}_2]^{-1}, \quad\lambda_1\lambda_2\ne0,\label{matin} 67 | \end{equation} 68 | where the eigenvalue $\lambda_1$ has eigenvector ${\bf v}_1,$ and $\lambda_2$ has eigenvector ${\bf v}_2.$ The notation ``$[{\bf v}_1:{\bf v}_2]$'' means the matrix whose columns are the vectors ${\bf v}_1$ and ${\bf v}_2.$ Note that in this (and all subsequent cases), ${\bf v}_1$ and ${\bf v}_2$ must be linearly independent so that the matrix $[{\bf v}_1:{\bf v}_2]$ is invertible. 69 | 70 | Note that it is possible in this case that $\lambda_1=\lambda_2.$ When this happens, the matrix is just a multiple of the identity, and takes the simpler form 71 | $$\mat{\lambda_1}00{\lambda_1}=\lambda_1\mat1001.$$ 72 | 73 | It is also possible that $\lambda_1$ and $\lambda_2$ are complex. In this case, they are complex conjugates of each other, and the eigenvectors are also complex conjugates of each other. The simplest example of this case is the rotation matrix 74 | $$\mat{\cos\theta}{-\sin\theta}{\sin\theta}{\cos\theta},$$ which has eigenvalues $\cos\theta\pm i\sin\theta$ and corresponding eigenvectors $(\pm i,1).$ In general, if complex-conjugate eigenvalues $\lambda_1,\lambda_2$ and complex-conjugate eigenvectors ${\bf v}_1,{\bf v}_2$ are chosen, the matrix given by (\ref{matin}) will have real entries. 75 | 76 | \item[(ID):] Invertible and defective. In this case, the matrix has the form 77 | \begin{equation} 78 | [{\bf v}_1:{\bf v}_2]\left[\begin{matrix}\lambda&1\\0&\lambda\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1},\quad\lambda\ne0.\label{matid} 79 | \end{equation} 80 | This is an example of a shear. The unit square is transformed into a parallelogram whose base is the same as its height (both $\lambda$). Most matrices representing shears are not defective; this is a special case. To see why this case occurs, you can look up ``Jordan normal form.'' 
81 | 82 | \item[(SN):] Singular, nondefective, and nonzero. 83 | \begin{equation} 84 | [{\bf v}_1:{\bf v}_2]\left[\begin{matrix}\lambda&0\\0&0\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1},\quad\lambda\ne0. 85 | \end{equation} 86 | This is simply (\ref{matin}) with $\lambda_2=0.$ The reason this case is singled out is that it is often necessary to solve matrix equations of the form $$AB=\mat0000=[0],$$ where we use the notation $[0]$ for the matrix of all $0$'s for convenience. If one of $A$ or $B$ is invertible, then the other must be $[0],$ which usually gives a degenerate binary tree. So in solving matrix equations, it is usually necessary to consider the singular case separately. 87 | 88 | \item[(SD):] Singular, defective, and nonzero. In this case, the matrix has the form (\ref{matid}) with $\lambda=0:$ 89 | \begin{equation} 90 | [{\bf v}_1:{\bf v}_2]\left[\begin{matrix}0&1\\0&0\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1}.\label{matsd} 91 | \end{equation} 92 | This is a particularly interesting case. Such a matrix $M$ satisfies $M^2=[0]$ but $M\ne[0].$ These matrices may also be classified by the condition $\Ker M=\Rng M.$ They are significant because they ``prune'' trees in the sense that if such a transformation is applied twice in a row, the branch ends (since $M^2=[0]$). 93 | 94 | \item[(S0):] This is just the zero matrix, $[0].$ This case is singled out, since if one of the transformations is $[0],$ the tree is essentially a unary tree. 
95 | 96 | \end{enumerate} 97 | 98 | Summary of the different types of matrices: 99 | 100 | \begin{align} 102 | {\rm (IN):\quad}&[{\bf v}_1:{\bf v}_2]\left[\begin{matrix}\lambda_1&0\\0&\lambda_2\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1},\quad\lambda_1\lambda_2\ne0.\\ 103 | {\rm (ID):\quad}&[{\bf v}_1:{\bf v}_2]\left[\begin{matrix}\lambda&1\\0&\lambda\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1},\quad\lambda\ne0.\\ 104 | {\rm (SN):\quad}&[{\bf v}_1:{\bf v}_2]\left[\begin{matrix}\lambda&0\\0&0\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1},\quad\lambda\ne0.\\ 105 | {\rm (SD):\quad}&[{\bf v}_1:{\bf v}_2]\left[\begin{matrix}0&1\\0&0\end{matrix}\right][{\bf v}_1:{\bf v}_2]^{-1}.\label{SD}\\ 106 | {\rm (S0):\quad}& \left[\begin{matrix}0&0\\0&0\end{matrix}\right]. 107 | \end{align} 108 | 109 | 110 | 111 | \section{Definition of a binary tree} 112 | 113 | Here, we discuss various useful notations and definitions for describing binary trees. Without a rigorous definition, no real mathematical analysis is possible. 114 | 115 | Let $B_0$ and $B_1$ be the affine transformations which represent left and right branching; let {\bf t} be the trunk of the tree. In most discussions of binary trees, $B_0$ and $B_1$ are scaled rotations (which always commute, incidentally, making much of the linear algebra much easier), and the trunk is usually $(0,1).$ These restrictions are rather confining; much of the wide range of possible trees results from allowing the most general transformations. 116 | 117 | $B_0{\bf t}$ and $B_1{\bf t}$ are first-level branches, $B_1B_0{\bf t}$ is a second-level branch, etc. For example, the instructions $001$ determine the node $P$, as illustrated below: 118 | \begin{equation} 119 | (I+B_0+B_0B_0+B_1B_0B_0){\bf t}.\label{treedef} 120 | \end{equation} 121 | Care must be taken about the order of the transformations, since in general, they need not commute. 
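This order-dependence is easy to check numerically. Here is a minimal sketch using NumPy; the particular matrices $B_0$ (a scaled rotation) and $B_1$ (a scaled shear) are illustrative choices of mine, not the transformations used in the figure below.

```python
import numpy as np

# Illustrative branching maps (not the ones generating the figure):
theta = np.pi / 4
B0 = 0.7 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])  # left branch: scaled rotation
B1 = 0.6 * np.array([[1.0, 0.4],
                     [0.0, 1.0]])                       # right branch: scaled shear
t = np.array([0.0, 1.0])                                # trunk

# Branching maps need not commute, so the order of composition matters:
print(np.allclose(B0 @ B1, B1 @ B0))                    # False

# The node P reached by the instruction string 001:
P = (np.eye(2) + B0 + B0 @ B0 + B1 @ B0 @ B0) @ t
print(P)
```

Reversing any pair of factors in the product would move the node, which is why the order of transformations must be tracked carefully.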
122 | 123 | \begin{center} 124 | \begin{tikzpicture}[scale=1.8] 125 | \def\nl{(-0.606218,1.35)} 126 | \def\nr{(0.34641,1.2)} 127 | \def\nll{(-1.03057,1.105)} 128 | \def\nlr{(-0.606218,1.63)} 129 | \def\nrl{(0.34641,1.48)} 130 | \def\nrr{(0.484974,1.12)} 131 | \def\nlll{(-1.03057,0.762)} 132 | \def\nllr{(-1.20031,1.203)} 133 | \def\nlrl{(-0.775959,1.728)} 134 | \def\nlrr{(-0.509223,1.686)} 135 | \def\nrll{(0.176669,1.578)} 136 | \def\nrlr{(0.443405,1.536)} 137 | \def\nrrl{(0.581969,1.176)} 138 | \def\nrrr{(0.484974,1.056)} 139 | 140 | \tikzstyle tree=[line width=0.07cm, gray] 141 | \begin{scope}[xshift=3.75cm,yshift=0.35cm,scale=2.25] 142 | \draw[tree] (0,0) -- (0,1) -- \nl -- \nll; 143 | \draw[tree] \nlll -- \nll -- \nl -- \nlr -- \nlrr; 144 | \draw[tree] \nll -- \nllr; 145 | \draw[tree] \nlr -- \nlrl; 146 | \draw[tree] (0,0) -- (0,1) -- \nr -- \nrl; 147 | \draw[tree] \nrll -- \nrl -- \nr -- \nrr -- \nrrr; 148 | \draw[tree] \nrl -- \nrlr; 149 | \draw[tree] \nrr -- \nrrl; 150 | 151 | \foreach \x in {\nlll,\nllr,\nlrl,\nlrr,\nrll,\nrlr,\nrrl,\nrrr} \fill[black] \x circle (0.02); 152 | \LARGE 153 | \draw[>=triangle 45,line width=0.08cm,->,red] (0,0) -- node[right] {\bf t} (0,1); 154 | 155 | \draw[>=triangle 45,line width=0.08cm,->,blue] (0,1) -- \nl; 156 | 157 | \draw[>=triangle 45,line width=0.08cm,->,green!80!black] \nl -- \nll; 158 | 159 | \node[green!80!black] at (-0.7,1.05) {$B_0B_0\bf t$}; 160 | 161 | \node[blue] at (-0.2,1.25) {$B_0\bf t$}; 162 | 163 | \draw[>=triangle 45,line width=0.08cm,->,orange!90!black] \nll -- \nllr; 164 | 165 | \node[orange!90!black] at (-1.175,1.35) {$B_1B_0B_0\bf t$}; 166 | 167 | \node at (-1.25,1.1) {$P$}; 168 | \end{scope} 169 | \end{tikzpicture} 170 | \end{center} 171 | 172 | Now, we abstract a definition from this specific example. 173 | 174 | 175 | 176 | Define $\Sigma=\{0,1,2,...,m-1\};$ $m$ will represent the number of branchings at each node of a tree. 
In the above example, $m=2.$ 177 | 178 | Define $\Sigma^n$ to be the set of all strings of length $n;$ elements of $\Sigma^n$ will represent sequences of instructions for traversing a binary tree. We denote the empty string by $\es,$ so that $\Sigma^0=\{\es\}.$ For $s\in\Sigma^n,$ we use $|s|=n$ to denote the length of $s.$ We say that $\Sigma^\omega$ is the set of all infinite strings with letters from $\Sigma,$ and put \begin{equation} 179 | \Sigma^*=\Sigma^\omega\cup \bigcup_{n=0}^\infty \Sigma^n. 180 | \end{equation} 181 | Let $s_i$ denote the $i^{\rm th}$ letter in $s,$ and $s_{[i:j]}$ denote the substring of $s$ from the $i^{\rm th}$ to the $j^{\rm th}$ letter when $i\le j.$ When $i>j,$ we put $s_{[i:j]}=\es.$ Also, we write $\rev s$ for the string consisting of the letters of $s$ written in reverse order. This is fairly standard notation for strings used in many computer science applications. 182 | 183 | Now let $\cal B$ be a finite set of $m$ affine transformations, $\{B_0,\ldots,B_{m-1}\}.$ These transformations describe how to create the branches in an $m$-ary tree. When all the transformations are invertible, we define ${\cal B}^{-1}$ to be the set of inverse transformations, $\{B_0^{-1},\ldots,B_{m-1}^{-1}\}.$ 184 | 185 | Now we look at (\ref{treedef}). We first need to define the individual terms of the sum. To this end, we define, for a string $s$ of length $n,$ 186 | \begin{equation} 187 | \eta_{\cal B}(s)=\prod_{i=1}^n B_{s_i} =B_{s_n}\cdots B_{s_1}. 188 | \end{equation} 189 | 190 | Note that $\eta_{\cal B}(\es)=I,$ and that the order of transformations in the product is important as the affine transformations may not commute. As an example, $\eta_{\cal B}(001)=B_1B_0B_0.$ Important here is that $\eta_{\cal B}$ defines a {\sl transformation,} not a {\sl vector.} This is because different trunk vectors may actually produce different trees even though the same $B_0$ and $B_1$ are used. Again, we strive for the greatest generality here. 
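Computationally, $\eta_{\cal B}$ is a single left-to-right pass over the string in which each successive transformation multiplies on the left. A short sketch in NumPy (the function name and the sample matrices are my own; $B_0$ is chosen of type (SD) so that the pruning behavior $B_0^2=[0]$ is visible):

```python
import numpy as np

def eta(B, s):
    """eta_B(s): compose the branching maps named by the string s.
    Letters are read left to right, but each new map multiplies on
    the LEFT, so eta(B, "001") = B1 @ B0 @ B0; eta of the empty
    string is the identity."""
    M = np.eye(2)
    for letter in s:
        M = B[int(letter)] @ M
    return M

B = [np.array([[0.0, 1.0], [0.0, 0.0]]),   # B0: type (SD), so B0 @ B0 = 0
     np.array([[0.5, 0.0], [0.0, 0.5]])]   # B1: uniform scaling

# eta(B, "001") = B1 @ B0 @ B0, which vanishes since B0 is of type (SD):
print(np.allclose(eta(B, "001"), np.zeros((2, 2))))
```

Any branch whose instruction string contains two consecutive $0$'s is pruned with this choice of $B_0,$ exactly as described in case (SD) above.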
191 | 192 | To add all the branches, we define, for $s\in\Sigma^n,$ 193 | \begin{equation} 194 | \tau_{\cal B}(s)=\sum_{i=0}^{n} \eta_{\cal B}(s_{[1:i]}). 195 | \end{equation} 196 | For the above example, we have $s=001$ and 197 | \begin{align*} 198 | \tau_{\cal B}(001)&=\eta_{\cal B}(\Lambda)+\eta_{\cal B}(0)+\eta_{\cal B}(00)+\eta_{\cal B}(001)\\ 199 | &=I+B_0+B_0B_0+B_1B_0B_0. 200 | \end{align*} 201 | 202 | %For infinite strings, the analogous sum need not necessarily converge; when it does, we write 203 | %\begin{equation} 204 | %\tau_{\cal B}(s)=\sum_{i=0}^{\infty} \eta_{\cal B}(s_{[1:i]}). 205 | %\end{equation} 206 | %Note that this is a sum of transformations; we consider convergence with respect to the usual norm 207 | %\begin{equation} 208 | %\|M\|=\sup\{\|M{\bf v}\|\,|\,\|{\bf v}\|=1\}. 209 | %\end{equation} 210 | For $n\ge0,$ we define 211 | \begin{equation} 212 | {\cal C}^n_{\cal B}=\left\lbrace\tau_{\cal B}(s)\,|\,s\in\Sigma^n\right\rbrace. 213 | \end{equation} 214 | When a trunk (initial branch) {\bf t} of a tree is given, we call 215 | \begin{equation} 216 | {\cal C}_{\cal B}^n{\bf t} 217 | \end{equation} 218 | the {\bf canopy} of depth $n,$ and call elements of ${\cal C}_{\cal B}^n{\bf t}$ {\bf nodes} of depth $n.$ Defining a canopy in this way is important for us, since the number of nodes of depth $n$ often depends on the choice of trunk {\bf t.} 219 | 220 | Of current interest is the sequence 221 | \begin{equation} 222 | |{\cal C}_{\cal B}^1{\bf t}|, |{\cal C}_{\cal B}^2{\bf t}|, |{\cal C}_{\cal B}^3{\bf t}|, |{\cal C}_{\cal B}^4{\bf t}|, |{\cal C}_{\cal B}^5{\bf t}|,\ldots, 223 | \end{equation} 224 | which we call the sequence of {\bf node numbers.} In other words, how many new nodes are created each time we iterate the binary tree algorithm? What we're finding is that an amazingly diverse range of sequences is possible, and moreover, that {\sl proving} the validity of any particular sequence is usually quite involved. 
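The node numbers can be explored experimentally by enumerating $\Sigma^n$ and collecting the distinct nodes $\tau_{\cal B}(s)\,{\bf t}.$ A sketch in NumPy (the function names are mine, and the rounding used to merge coinciding nodes is a numerical convenience, not part of the definition):

```python
import itertools
import numpy as np

def node_numbers(B, t, depth):
    """Return [|C^1_B t|, ..., |C^depth_B t|]: the number of distinct
    nodes at each depth, collecting tau_B(s) t over all s in Sigma^n."""
    counts = []
    for n in range(1, depth + 1):
        nodes = set()
        for s in itertools.product(range(len(B)), repeat=n):
            M = np.eye(2)               # eta_B of the current prefix
            T = np.eye(2)               # running sum tau_B
            for letter in s:
                M = B[letter] @ M
                T = T + M
            nodes.add(tuple(np.round(T @ t, 9)))
        counts.append(len(nodes))
    return counts

def scaled_rotation(theta, r):
    c, s = np.cos(theta), np.sin(theta)
    return r * np.array([[c, -s], [s, c]])

# Example: two scaled rotations by +/- 60 degrees, trunk (0, 1).
B = [scaled_rotation(np.pi / 3, 0.6), scaled_rotation(-np.pi / 3, 0.6)]
print(node_numbers(B, np.array([0.0, 1.0]), 4))
```

Varying the angles and scale factors (or choosing singular or defective branching maps) produces strikingly different sequences, which is exactly the phenomenon described above.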
225 | 226 | 227 | 228 | \section{Definition of a Merkle tree} 229 | 230 | Here, we discuss various useful notations and definitions for describing Merkle trees. 231 | 232 | 
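Unlike the geometric trees above, a Merkle tree labels its nodes with hashes: each leaf holds the hash of a data block, and each internal node holds the hash of its children's concatenated labels, so the root commits to every leaf. The following is a minimal binary ($m=2$) sketch using Python's hashlib; the function names are mine, and production designs add details (domain separation, inclusion proofs) omitted here.

```python
import hashlib

def h(data: bytes) -> bytes:
    """SHA-256 as the node-labeling hash."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash of a binary Merkle tree over the given data blocks.
    An odd level duplicates its last node (one common convention)."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"block 0", b"block 1", b"block 2", b"block 3"])
print(root.hex())
```

Changing any single leaf changes the root, which is what makes the structure useful for verifying large data sets; the string notation $\Sigma^n$ introduced above applies equally well here, with $s\in\Sigma^n$ naming the path from the root to a leaf.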
283 | 284 | 285 | 325 | \end{document} 326 | -------------------------------------------------------------------------------- /papers/README.MD: -------------------------------------------------------------------------------- 1 | # Research Papers 2 | 3 | 4 | Some of our current research areas and work-in-progress papers. --------------------------------------------------------------------------------