├── .gitignore ├── src ├── figures │ ├── LERWw.png │ ├── RWandLERW.png │ ├── RW_200s7hdpi.png │ ├── results_comp.pdf │ ├── LERW_200s7hdpi.png │ ├── preturn_le_plot.png │ ├── LoopErasedRandomWalk.png │ ├── WalkReturnToStartTnProbability.png │ ├── LERW_cluster_sizes_200_histogram.png │ ├── RW1D_cluster_sizes_200_histogram.png │ ├── RW2D_cluster_sizes_200_histogram.png │ ├── WalkReturnToStartT2nProbability.png │ └── LERW_horizontal_displacement_distribution_histogram.png ├── srw_report.synctex.gz(busy) ├── The_Statistics_of_Random_Walks.synctex.gz ├── The_Statistics_of_Random_Walks_poster.png ├── pysc │ ├── srwleplot.py │ ├── lerw.py │ ├── srwgen.py │ └── sysdef.py ├── The_Statistics_of_Random_Walks.bbl ├── references.bib ├── The_Statistics_of_Random_Walks.toc └── The_Statistics_of_Random_Walks.tex ├── .archive ├── src │ ├── .gitignore │ ├── figures │ │ ├── RW_200s7hdpi.png │ │ ├── results_comp.pdf │ │ ├── LERW_200s7hdpi.png │ │ ├── preturn_le_plot.png │ │ ├── LoopErasedRandomWalk.png │ │ ├── WalkReturnToStartT2nProbability.png │ │ ├── WalkReturnToStartTnProbability.png │ │ ├── LERW_cluster_sizes_200_histogram.png │ │ ├── RW1D_cluster_sizes_200_histogram.png │ │ ├── RW2D_cluster_sizes_200_histogram.png │ │ └── LERW_horizontal_displacement_distribution_histogram.png │ ├── The_Statistics_of_Random_Walks.toc │ ├── references.bib │ └── The_Statistics_of_Random_Walks.tex └── The_Statistics_of_Random_Walks.pdf ├── The_Statistics_of_Random_Walks.pdf ├── The_Statistics_of_Random_Walks_poster.pdf └── README.md /.gitignore: -------------------------------------------------------------------------------- 1 | *.py 2 | *.aux 3 | *.bak 4 | *.log 5 | /**/*.log 6 | -------------------------------------------------------------------------------- /src/figures/LERWw.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/LERWw.png 
-------------------------------------------------------------------------------- /src/figures/RWandLERW.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/RWandLERW.png -------------------------------------------------------------------------------- /src/figures/RW_200s7hdpi.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/RW_200s7hdpi.png -------------------------------------------------------------------------------- /src/figures/results_comp.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/results_comp.pdf -------------------------------------------------------------------------------- /src/figures/LERW_200s7hdpi.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/LERW_200s7hdpi.png -------------------------------------------------------------------------------- /src/figures/preturn_le_plot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/preturn_le_plot.png -------------------------------------------------------------------------------- /src/srw_report.synctex.gz(busy): -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/srw_report.synctex.gz(busy) -------------------------------------------------------------------------------- /.archive/src/.gitignore: -------------------------------------------------------------------------------- 1 | *.aux 2 | *.bbl 3 | *.blg 4 | *.fdb_latexmk 5 | *.fls 6 | *.log 7 | *.synctex.gz 8 | *.synctex.gz(busy) 
-------------------------------------------------------------------------------- /The_Statistics_of_Random_Walks.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/The_Statistics_of_Random_Walks.pdf -------------------------------------------------------------------------------- /src/figures/LoopErasedRandomWalk.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/LoopErasedRandomWalk.png -------------------------------------------------------------------------------- /.archive/src/figures/RW_200s7hdpi.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/RW_200s7hdpi.png -------------------------------------------------------------------------------- /.archive/src/figures/results_comp.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/results_comp.pdf -------------------------------------------------------------------------------- /.archive/src/figures/LERW_200s7hdpi.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/LERW_200s7hdpi.png -------------------------------------------------------------------------------- /.archive/src/figures/preturn_le_plot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/preturn_le_plot.png -------------------------------------------------------------------------------- /The_Statistics_of_Random_Walks_poster.pdf: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/The_Statistics_of_Random_Walks_poster.pdf -------------------------------------------------------------------------------- /.archive/The_Statistics_of_Random_Walks.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/The_Statistics_of_Random_Walks.pdf -------------------------------------------------------------------------------- /.archive/src/figures/LoopErasedRandomWalk.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/LoopErasedRandomWalk.png -------------------------------------------------------------------------------- /src/The_Statistics_of_Random_Walks.synctex.gz: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/The_Statistics_of_Random_Walks.synctex.gz -------------------------------------------------------------------------------- /src/The_Statistics_of_Random_Walks_poster.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/The_Statistics_of_Random_Walks_poster.png -------------------------------------------------------------------------------- /src/figures/WalkReturnToStartTnProbability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/WalkReturnToStartTnProbability.png -------------------------------------------------------------------------------- /src/figures/LERW_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/LERW_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /src/figures/RW1D_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/RW1D_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /src/figures/RW2D_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/RW2D_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /src/figures/WalkReturnToStartT2nProbability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/WalkReturnToStartT2nProbability.png -------------------------------------------------------------------------------- /.archive/src/figures/WalkReturnToStartT2nProbability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/WalkReturnToStartT2nProbability.png -------------------------------------------------------------------------------- /.archive/src/figures/WalkReturnToStartTnProbability.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/WalkReturnToStartTnProbability.png -------------------------------------------------------------------------------- /.archive/src/figures/LERW_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/LERW_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /.archive/src/figures/RW1D_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/RW1D_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /.archive/src/figures/RW2D_cluster_sizes_200_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/RW2D_cluster_sizes_200_histogram.png -------------------------------------------------------------------------------- /src/figures/LERW_horizontal_displacement_distribution_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/src/figures/LERW_horizontal_displacement_distribution_histogram.png -------------------------------------------------------------------------------- /.archive/src/figures/LERW_horizontal_displacement_distribution_histogram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hr/RandomWalkStats/master/.archive/src/figures/LERW_horizontal_displacement_distribution_histogram.png -------------------------------------------------------------------------------- /src/pysc/srwleplot.py: -------------------------------------------------------------------------------- 1 | # Plot random walk 2 | dpi = 300 3 | fig, ax = plt.subplots() 4 | fig.set_size_inches(3, Circ * 3. 
/ Length) 5 | ax.set_xlim(0, Length - 1) 6 | ax.set_ylim(0, Circ - 1) 7 | ax.get_xaxis().set_visible(False) 8 | ax.get_yaxis().set_visible(False) 9 | plt.plot(*zip(*trajectory), linewidth=0.3) 10 | plt.savefig("plots/"+__file__[:-3]+".png", bbox_inches="tight", dpi=dpi) 11 | -------------------------------------------------------------------------------- /src/pysc/lerw.py: -------------------------------------------------------------------------------- 1 | # Random walk loop erasure 2 | x0, y0, pos = None, None, 0 3 | while pos < len(trajectory): 4 | x, y = trajectory[pos] 5 | if lattice[x][y] > 1 and (x0 is None):  # `is None` so the valid coordinate 0 is not treated as unset 6 | x0, y0 = x, y 7 | pos0 = pos 8 | elif (x == x0) and (y == y0) and (lattice[x][y] == 1): 9 | del trajectory[pos0:pos+1] # erase loop 10 | x0, y0 = None, None 11 | pos = pos0 12 | lattice[x][y] -= 1 13 | pos += 1 14 | -------------------------------------------------------------------------------- /src/pysc/srwgen.py: -------------------------------------------------------------------------------- 1 | # Random walk generation 2 | while True: 3 | s += 1 4 | if random.getrandbits(1): 5 | if random.getrandbits(1): 6 | x += 1 7 | else: 8 | x -= 1 9 | else: 10 | if random.getrandbits(1): 11 | y += 1 12 | else: 13 | y -= 1 14 | 15 | if x >= Length: 16 | break 17 | elif x < 0: 18 | x = 0 19 | if y >= Circ: 20 | y -= Circ 21 | elif y < 0: 22 | y += Circ 23 | lattice[x][y] += 1 24 | trajectory.append((x, y)) 25 | -------------------------------------------------------------------------------- /src/pysc/sysdef.py: -------------------------------------------------------------------------------- 1 | # System definition & initialization 2 | seed = 10 # random seed 3 | Length = 200 # length of the cylinder 4 | Circ = 200 # circumference of the cylinder 5 | x = 0 # x coordinate of starting location 6 | # y coordinate of starting location. Origin is at centre of square 7 | y = Circ // 2 # integer division so y is a valid lattice index 8 | s = 0 # Step number.
9 | trajectory = [] # List of the (x, y) coordinates of all points visited. 10 | # (Length x Circ) 2D array of zeros 11 | lattice = np.zeros((Length, Circ), dtype=int) 12 | random.seed(seed) 13 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # The Statistics of Random Walks research report 2 | ![The Statistics of Random Walks research poster](src/The_Statistics_of_Random_Walks_poster.png?raw=true "The Statistics of Random Walks research poster") 3 | > The Statistics of Random Walks research poster 4 | 5 | This is the concluding report for research on the statistics of random walks, for which the code is available at https://github.com/HR/RandomWalk, conducted by Habib Rehman (@HR) and Dr. Gunnar Pruessner (g.pruessner@imperial.ac.uk) of Imperial College London. 6 | 7 | This repo is also accessible via the shortened link http://git.io/RandomWalkStats 8 | -------------------------------------------------------------------------------- /.archive/src/The_Statistics_of_Random_Walks.toc: -------------------------------------------------------------------------------- 1 | \contentsline {section}{\numberline {1}Introduction}{2} 2 | \contentsline {subsection}{\numberline {1.1}Simple Random Walk on a $\mathbb {Z}$ lattice.}{2} 3 | \contentsline {subsection}{\numberline {1.2}Loop Erased Random Walks (LERW)}{3} 4 | \contentsline {paragraph}{Loop Erased Random Walk definition.}{3} 5 | \contentsline {section}{\numberline {2}Methodology}{4} 6 | \contentsline {subsection}{\numberline {2.1}Random Walk Implementation}{4} 7 | \contentsline {paragraph}{System definition.}{4} 8 | \contentsline {paragraph}{Random Walk generation.}{5} 9 | \contentsline {paragraph}{Random Walk Loop Erasure.}{5} 10 | \contentsline {subsection}{\numberline {2.2}Data Collection}{8} 11 | \contentsline {section}{\numberline {3}Results and Analysis}{9} 12 | \contentsline {paragraph}{Return distribution of 1D and 2D Random
Walks.}{9} 13 | \contentsline {paragraph}{Difference in arc length distribution of returns and loops erased.}{9} 14 | \contentsline {section}{\numberline {4}Conclusion}{12} 15 | \contentsline {section}{\numberline {5}Appendices}{12} 16 | \contentsline {subsection}{\numberline {5.1}System Definition code}{13} 17 | \contentsline {subsection}{\numberline {5.2}Random Walk generation code}{13} 18 | \contentsline {subsection}{\numberline {5.3}Loop Erased Random Walk code}{13} 19 | \contentsline {section}{\numberline {6}Bibliography}{14} 20 | \contentsline {section}{\numberline {7}Acknowledgements}{14} 21 | -------------------------------------------------------------------------------- /src/The_Statistics_of_Random_Walks.bbl: -------------------------------------------------------------------------------- 1 | \providecommand{\bysame}{\leavevmode\hbox to3em{\hrulefill}\thinspace} 2 | \providecommand{\MR}{\relax\ifhmode\unskip\space\fi MR } 3 | % \MRhref is called by the amsart/book/proc definition of \MR. 4 | \providecommand{\MRhref}[2]{% 5 | \href{http://www.ams.org/mathscinet-getitem?mr=#1}{#2} 6 | } 7 | \providecommand{\href}[2]{#2} 8 | \begin{thebibliography}{1} 9 | 10 | \bibitem{DPscopy2016} 11 | Docs.python.org, \emph{8.17. copy shallow and deep copy operations python 12 | 2.7.11 documentation}, 2016 13 | 14 | \bibitem{dollapplications} 15 | Aaron Doll, \emph{Applications of random walks in tor}. 16 | 17 | \bibitem{Durrett2010} 18 | Rick Durrett, \emph{{Probability: Theory and Examples Rick Durrett January 29, 19 | 2010}}, vol.~49, 2010. 20 | 21 | \bibitem{klafter2011} 22 | J.~Klafter and I.M. Sokolov, \emph{First steps in random walks: From tools to 23 | applications}, OUP Oxford, 2011. 24 | 25 | \bibitem{lawler2010} 26 | G.F. Lawler and V.~Limic, \emph{Random walk: A modern introduction}, Cambridge 27 | Studies in Advanced Mathematics, Cambridge University Press, 2010. 28 | 29 | \bibitem{Redner2002} 30 | Sidney Redner and J.~R. 
Dorfman, \emph{{A Guide to First-Passage Processes}}, 31 | vol.~70, 2002. 32 | 33 | \bibitem{schweinsberg2009loop} 34 | Jason Schweinsberg, \emph{The loop-erased random walk and the uniform spanning 35 | tree on the four-dimensional discrete torus}, Probability Theory and Related 36 | Fields \textbf{144} (2009), no.~3-4, 319--370. 37 | 38 | \end{thebibliography} 39 | -------------------------------------------------------------------------------- /.archive/src/references.bib: -------------------------------------------------------------------------------- 1 | @article{schweinsberg2009loop, 2 | title={The loop-erased random walk and the uniform spanning tree on the four-dimensional discrete torus}, 3 | author={Schweinsberg, Jason}, 4 | journal={Probability Theory and Related Fields}, 5 | volume={144}, 6 | number={3-4}, 7 | pages={319--370}, 8 | year={2009}, 9 | publisher={Springer} 10 | }, 11 | @book{lawler2010, 12 | title={Random Walk: A Modern Introduction}, 13 | author={Lawler, G.F. and Limic, V.}, 14 | isbn={9781139488761}, 15 | series={Cambridge Studies in Advanced Mathematics}, 16 | url={https://books.google.co.uk/books?id=UBQdwAZDeOEC}, 17 | year={2010}, 18 | publisher={Cambridge University Press} 19 | }, 20 | @book{klafter2011, 21 | title={First Steps in Random Walks: From Tools to Applications}, 22 | author={Klafter, J. and Sokolov, I.M.}, 23 | isbn={9780199234868}, 24 | lccn={2011023454}, 25 | year={2011}, 26 | publisher={OUP Oxford} 27 | }, 28 | @book{Durrett2010, 29 | author = {Durrett, Rick}, 30 | booktitle = {Biometrics}, 31 | doi = {10.2307/2532227}, 32 | isbn = {0534424414}, 33 | issn = {0006341X}, 34 | number = {3}, 35 | pages = {497}, 36 | pmid = {16156020}, 37 | title = {{Probability: Theory and Examples Rick Durrett January 29, 2010}}, 38 | url = {http://books.google.com/books?id=NPYYAQAAIAAJ\&pgis=1}, 39 | volume = {49}, 40 | year = {2010} 41 | }, 42 | @book{Redner2002, 43 | author = {Redner, Sidney and Dorfman, J. 
R.}, 44 | booktitle = {American Journal of Physics}, 45 | doi = {10.1119/1.1509421}, 46 | isbn = {0521652480}, 47 | issn = {00029505}, 48 | number = {11}, 49 | pages = {1166}, 50 | title = {{A Guide to First-Passage Processes}}, 51 | volume = {70}, 52 | year = {2002} 53 | } 54 | -------------------------------------------------------------------------------- /src/references.bib: -------------------------------------------------------------------------------- 1 | @article{schweinsberg2009loop, 2 | title={The loop-erased random walk and the uniform spanning tree on the four-dimensional discrete torus}, 3 | author={Schweinsberg, Jason}, 4 | journal={Probability Theory and Related Fields}, 5 | volume={144}, 6 | number={3-4}, 7 | pages={319--370}, 8 | year={2009}, 9 | publisher={Springer} 10 | }, 11 | @article{dollapplications, 12 | title={Applications of Random Walks in Tor}, 13 | author={Doll, Aaron} 14 | }, 15 | @book{lawler2010, 16 | title={Random Walk: A Modern Introduction}, 17 | author={Lawler, G.F. and Limic, V.}, 18 | isbn={9781139488761}, 19 | series={Cambridge Studies in Advanced Mathematics}, 20 | url={https://books.google.co.uk/books?id=UBQdwAZDeOEC}, 21 | year={2010}, 22 | publisher={Cambridge University Press} 23 | }, 24 | @book{klafter2011, 25 | title={First Steps in Random Walks: From Tools to Applications}, 26 | author={Klafter, J. and Sokolov, I.M.}, 27 | isbn={9780199234868}, 28 | lccn={2011023454}, 29 | year={2011}, 30 | publisher={OUP Oxford} 31 | }, 32 | @book{Durrett2010, 33 | author = {Durrett, Rick}, 34 | booktitle = {Biometrics}, 35 | doi = {10.2307/2532227}, 36 | isbn = {0534424414}, 37 | issn = {0006341X}, 38 | number = {3}, 39 | pages = {497}, 40 | pmid = {16156020}, 41 | title = {{Probability: Theory and Examples}}, 42 | url = {http://books.google.com/books?id=NPYYAQAAIAAJ}, 43 | volume = {49}, 44 | year = {2010} 45 | }, 46 | @book{Redner2002, 47 | author = {Redner, Sidney and Dorfman, J.
R.}, 48 | booktitle = {American Journal of Physics}, 49 | doi = {10.1119/1.1509421}, 50 | isbn = {0521652480}, 51 | issn = {00029505}, 52 | number = {11}, 53 | pages = {1166}, 54 | title = {{A Guide to First-Passage Processes}}, 55 | volume = {70}, 56 | year = {2002} 57 | }, 58 | @misc{DPscopy2016, 59 | author={Docs.python.org}, 60 | title={8.17 copy Shallow and deep copy operations Python 2.7.11 documentation}, 61 | url={https://docs.python.org/2/library/copy.html}, 62 | year={2016} 63 | } 64 | -------------------------------------------------------------------------------- /src/The_Statistics_of_Random_Walks.toc: -------------------------------------------------------------------------------- 1 | \contentsline {section}{\numberline {1}Exposition}{2} 2 | \contentsline {subsection}{\numberline {1.1}Aims and Objectives}{2} 3 | \contentsline {subsection}{\numberline {1.2}Context}{2} 4 | \contentsline {subsection}{\numberline {1.3}Selection of approach}{3} 5 | \contentsline {paragraph}{Approaches:}{3} 6 | \contentsline {paragraph}{Selection}{3} 7 | \contentsline {section}{\numberline {2}Abstract}{4} 8 | \contentsline {section}{\numberline {3}Introduction}{4} 9 | \contentsline {subsection}{\numberline {3.1}Simple Random Walk on a $\mathbb {Z}$ lattice.}{4} 10 | \contentsline {subsection}{\numberline {3.2}Loop Erased Random Walks (LERW)}{5} 11 | \contentsline {paragraph}{Loop Erased Random Walk definition.}{6} 12 | \contentsline {section}{\numberline {4}Methodology}{6} 13 | \contentsline {subsection}{\numberline {4.1}Random Walk Implementation}{6} 14 | \contentsline {paragraph}{System definition.}{7} 15 | \contentsline {paragraph}{Random Walk generation.}{7} 16 | \contentsline {paragraph}{Random Walk Loop Erasure.}{8} 17 | \contentsline {subsection}{\numberline {4.2}Collecting Data}{11} 18 | \contentsline {section}{\numberline {5}Results and Analysis}{12} 19 | \contentsline {paragraph}{Return distribution of 1D and 2D Random Walks.}{13} 20 | \contentsline 
{paragraph}{Difference in arc length distribution of returns and loops erased.}{13} 21 | \contentsline {section}{\numberline {6}Conclusion}{16} 22 | \contentsline {subsection}{\numberline {6.1}Findings}{16} 23 | \contentsline {subsection}{\numberline {6.2}Implications}{17} 24 | \contentsline {section}{\numberline {A}Appendices}{17} 25 | \contentsline {subsection}{\numberline {A.1}Imports}{17} 26 | \contentsline {subsection}{\numberline {A.2}System Definition code}{17} 27 | \contentsline {subsection}{\numberline {A.3}Random Walk generation code}{18} 28 | \contentsline {subsection}{\numberline {A.4}Loop Erased Random Walk code}{18} 29 | \contentsline {subsection}{\numberline {A.5}Random Walk plot code (using the matplotlib API)}{19} 30 | \contentsline {subsection}{\numberline {A.6}Loop Erased Random Walk plot code}{19} 31 | \contentsline {subsection}{\numberline {A.7}Histogram plot (and parse from raw2bin.c) code}{19} 32 | \contentsline {section}{\numberline {B}Bibliography}{20} 33 | \contentsline {section}{\numberline {C}Acknowledgements}{21} 34 | -------------------------------------------------------------------------------- /.archive/src/The_Statistics_of_Random_Walks.tex: -------------------------------------------------------------------------------- 1 | % !TEX TS-program = pdflatexmk 2 | \documentclass{article} 3 | 4 | \usepackage[utf8]{inputenc} 5 | \usepackage{amsmath,physics,mathtools,listings,graphicx,pdfpages,amsfonts,amssymb,amsthm,float} 6 | 7 | \DeclarePairedDelimiter{\opair}{\langle}{\rangle} 8 | \newcommand\given[1][]{\:#1\vert\:} 9 | \graphicspath{ {figures/} } 10 | \title{The Statistics of Random Walks} 11 | \author{Habib Rehman} 12 | \date{June/July August 2015} 13 | \begin{document} 14 | \maketitle 15 | 16 | 17 | 18 | \section*{Abstract} 19 | The purpose of this research has been to identify the precise correspondence between the return distribution of a one-dimensional random walk and a two-dimensional random walk. 
This research also considers the correspondence between the arc length distribution of a random walker returning and the arc length distribution of the loops erased. 20 | \tableofcontents 21 | \newpage 22 | 23 | \section{Introduction} 24 | A random walk is a stochastic process formed by the repeated summation of independent, identically distributed random variables, and can be regarded as a V-valued Markov chain on a graph \cite{schweinsberg2009loop}. Even though the term \emph{random walk} can be traced back to P\'olya (1921) (\emph{zufaellige Irrfahrt} in German), the concept is much older. This research considers random walks on an integer lattice $\mathbb{Z}^{d}$, as the discrete space, practically, allows more analysis (operations) to be carried out with feasible computational power. The research is also concerned with loop erased random walks in two dimensions (i.e. $d=2$), which is below the critical dimension for random walks, $d = 4$, at which the process converges to Brownian motion \cite{lawler2010} and scaling limits exist (i.e. the limit as the lattice spacing approaches zero); at and above the critical dimension the problem becomes trivial in the sense that self-intersection becomes highly improbable. The random walks studied have increment distributions with zero mean and finite variance. These properties are necessary for normal convergence of the increments to occur, and they also allow the main result to be expressed succinctly. Firstly, I will delineate the statistical behaviour of a simple random walk. Thereafter, I will provide the appropriate theorems and definitions for random walks and loop erased random walks.
25 | 26 | 27 | \subsection{Simple Random Walk on a $\mathbb{Z}$ lattice.} 28 | The simplest non-trivial case is one-dimensional: let $X_{1}, X_{2}, \ldots$ represent the outcomes of independent experiments. Suppose that these experiments are tosses of a fair coin, with a gain of $+1$ for each head and a loss of $-1$ for each tail. The outcome of a flip of the coin is equally likely to be heads or tails, so the walk is clearly unbiased, in that there is no preference for gain or loss. Then $S_{n}$ represents the cumulative gain or loss after the first $n$ plays. 29 | The sequence $(S_{n})_{n=0}^{\infty}$ is described as a simple random walk on the $\mathbb{Z}$ lattice (this gambling scenario is the setting of the classic \emph{gambler's ruin} problem). When the behaviour of such a simple process is analysed in detail, it becomes apparent that it is more intriguing than it first appears. What we can deduce is that: 30 | \begin{enumerate} 31 | \item The walk returns to its starting position (i.e. 0) at time $2n$ with probability 32 | \begin{center} 33 | \includegraphics[scale=0.8]{WalkReturnToStartT2nProbability} 34 | \end{center} 35 | By \emph{Stirling's approximation}, it is 36 | \begin{center} 37 | \includegraphics[scale=0.8]{WalkReturnToStartTnProbability} 38 | \cite{Durrett2010} 39 | \end{center} 40 | \item The walk eventually reaches each integer in the $\mathbb{Z}$ lattice with certainty and so traverses each infinitely often. Suppose $S_{0} := 0$ and $T := \inf\{n : S_{n} = +1\}$; then $P(T < \infty) = 1$ and $P(T = \infty) = 0$. 41 | \item The distribution of $T$ is 42 | \begin{center} 43 | $P(T = 2n - 1)\ =\ (-1)^{n-1}\binom{\frac{1}{2}}{n}$ 44 | \end{center} 45 | \end{enumerate} 46 | A Simple Random Walk generally refers to a random walk in one dimension. However, this research is concerned with random walks in 2 dimensions.
But interestingly, a two-dimensional random walk is essentially composed of two simple random walks, one along each axis, since the steps along each axis are selected independently. 47 | \newline 48 | 49 | 50 | \subsection{Loop Erased Random Walks (LERW)} 51 | A loop erased random walk on the integer lattice is a process obtained from a random walk by erasing its loops chronologically. In two dimensions, the process is expected to have a conformally invariant continuum limit, and nonrigorous conformal field theory gives an exact prediction of the exponent as a simple rational number. In three dimensions there is no reason to believe that the exponent takes a rational value. Once the loops are erased from the random walk, it is intimately related to the uniform spanning tree (UST), a spanning tree chosen uniformly at random, and so algorithms for generating LERWs are used in application to produce uniform spanning trees. 52 | 53 | \paragraph{Loop Erased Random Walk definition.} \label{alg:LE} If $\omega = [\omega_{0}, \ldots, \omega_{n}]$ is a path, then the \emph{(chronological) loop-erasure} function $LE(\omega)$ is defined as follows \cite{klafter2011}: \newline 54 | Let $\lambda_{0} = \max\{j\ \le \ n: \omega_{j}=\omega_{0}\}$. Set $\beta_{0}=\omega_{0}=\omega_{\lambda_{0}}$. 55 | \newline 56 | Suppose $\lambda_{i} < n$. Let $\lambda_{i+1} = \max\{j\ \le n: \omega_{j}=\omega_{\lambda_{i}+1}\}$. Set $\beta_{i+1}=\omega_{\lambda_{i+1}}=\omega_{\lambda_{i}+1}$. 57 | \newline 58 | If $i_{\omega} = \min \{i : \lambda_{i} = n\} = \min\{i : \beta_{i} = \omega_{n}\}$, then $LE\left(\omega\right) = \lbrack \beta_{0},\ldots,\beta_{i_{\omega}}\rbrack$. 59 | \newpage 60 | 61 | 62 | \section{Methodology} 63 | In this section, I will explain the theory behind the code that was written to produce the resultant data.
The code was mainly\footnote{The \emph{raw2binned.c} script in \texttt{C++} was used for the sake of efficiency; it was originally written by Francesc, later modified by Gunnar Pruessner, and then adapted by me for this project.} written in \texttt{Python}, as it enabled us to stay problem-oriented, allowed the LERW to be implemented rapidly, and readily provided much useful functionality through its extensive libraries. I used the \texttt{matplotlib} library to generate the required statistical graphs from the data (i.e. distribution histograms) and the \texttt{numpy} library to efficiently manipulate the data and conduct statistical analysis. 64 | 65 | 66 | \subsection{Random Walk Implementation} 67 | The random walk was modelled as traversing the two-dimensional surface of a cylinder (i.e. the system) with two open vertical boundaries and two periodic horizontal boundaries. The random walk is initiated at the starting (open) boundary and terminated at the ending boundary. Due to the curvilinear geometry of a cylinder, the system allows the random walk to take an unbounded number of steps in the periodic direction, provided that it has not yet reached the ending boundary. 68 | The system was implemented as a two-dimensional $\mathbb{Z}$ lattice of a given size (length by circumference) representing the two-dimensional surface of the cylinder; to implement the previously stated property of the original system, periodic boundaries were introduced. The function of the periodic boundaries was to ensure that, whenever the random walk stepped beyond a system boundary, it re-entered the system from the opposite boundary and so always took a valid step (i.e. a step within the system boundaries) (see Figure~\ref{fig:RW}). The following is a formal definition of the system.
69 | 70 | \paragraph{System definition.}\label{subsec:sysfdef} Given that $\vb{S}=\opair{i, j}$ is the system of size $(L, C)$, where \emph{i} and \emph{j} are the respective vertical and horizontal components of the system (refer to Appendix~\ref{sssec:sysdef} for the actual system definition) 71 | \begin{enumerate} 72 | \item The system generation can be expressed as 73 | \begin{center} 74 | $\vb{S} = \left\{\left. (\sum_{k=0}^L i_k, \sum_{k=0}^C j_k) \; \right\vert \; i_k, j_k \in\mathbb{Z} \right\}$ 75 | \end{center} 76 | \item The length and the circumference $(L, C)$ of the system are of equal size, which implies that the lattice was square (for this research) 77 | \item $\vb{S}_{i = 0}$ is the starting boundary and $\vb{S}_{i = L}$, where \emph{L} represents the length of the system, is the ending boundary. 78 | \item $(\vb{S}_{i = 0}, \vb{S}_{j = y})$ is the starting point, where \emph{y} is the first pseudorandom number in the sequence generated by the library function using a given seed. 79 | \item Periodic boundaries exist at $(\vb{S}_{j = 0}, \vb{S}_{j = C})$, where \emph{C} is the circumference of the system, to allow \emph{j} to be of an indefinite size 80 | \end{enumerate} 81 | 82 | \paragraph{Random Walk generation.} A Random Walk, $\omega_{n}$, was generated using the generation function 83 | \begin{center} 84 | $\omega_{n} = \sum_{k=0}^n X_k$ 85 | \end{center} 86 | where $\{X_k\}$ are independent and identically distributed random variables. 87 | Implementing the periodic boundaries as well resulted in the generation function 88 | \begin{center} 89 | $\omega_{n} = \left\{ \sum_{k=0}^n X_k \given[\Big] X_k \notin \{0, L, C\} \right\}$ 90 | \end{center} 91 | I translated this into \texttt{Python} code which is viewable in Appendix~\ref{sssec:srwgen}.
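As a minimal sketch of how such a walk on the cylinder can be generated (the function name \texttt{random\_walk\_on\_cylinder}, the uniform step distribution and the rejection rule at the starting boundary are my assumptions for illustration; the project's actual implementation is the one in Appendix~\ref{sssec:srwgen}):

```python
import random

def random_walk_on_cylinder(L, C, seed):
    """Sketch: a walk on the surface of a cylinder of length L and
    circumference C.  The horizontal coordinate j is periodic (mod C);
    the walk starts on the i = 0 boundary and stops on reaching i = L."""
    rng = random.Random(seed)
    i, j = 0, rng.randrange(C)           # start on the starting boundary
    trajectory = [(i, j)]
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while i < L:
        di, dj = rng.choice(steps)
        if i + di < 0:                   # reject steps leaving the start boundary
            continue
        i, j = i + di, (j + dj) % C      # periodic wrap of the horizontal axis
        trajectory.append((i, j))
    return trajectory
```

Because the walk is seeded, the same seed always reproduces the same trajectory, mirroring the reproducibility argument made in the selection of approach.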
In the code, the \emph{trajectory} list containing the running set of vertices, corresponding to the pseudorandom steps that the random walk had taken, was a \texttt{numpy} array because of the extensive functionality (i.e. functions/methods) that was required to perform the non-trivial manipulations quickly as well as to stay problem-oriented. For a plot of the random walk output of the generation code refer to the following figure (Figure~\ref{fig:RW}). 92 | \begin{figure} 93 | \begin{center} 94 | \includegraphics[scale=1.6]{RW_200s7hdpi} 95 | \caption{Random walk on a $\mathbb{Z}^{\emph{2}}$ lattice (system size: 200; seed: 7)} 96 | \label{fig:RW} 97 | \end{center} 98 | \end{figure} 99 | 100 | \paragraph{Random Walk Loop Erasure.} A loop is simply the result of the random walk intersecting itself. Considering that the random walk is traversing an integer lattice, for a loop to occur, a vertex on the lattice has to be traversed at least twice: at the start of the loop and at the end of the loop. So to erase the loops, an algorithm was devised which, for each duplicated vertex in a given list of vertices (i.e. the random walk trajectory), removed everything after the first occurrence of the vertex up to (and including) its last occurrence, iterating until all duplicates (i.e. loops) were removed. The $LE(\omega)$ algorithm defined in \S~\ref{alg:LE} is a condensed form of the algorithm that was used to generate the LERWs for the data. 101 | I translated this into \texttt{Python} code which is viewable in Appendix~\ref{sssec:lerw}. For a plot of the loop erased random walk output of the generation code refer to the following figure (Figure~\ref{fig:LERW}).
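As a standalone illustration of chronological loop erasure (the function name \texttt{loop\_erase} is hypothetical and this is not the code of Appendix~\ref{sssec:lerw}), one standard way is a single forward scan that cuts the path back whenever a vertex is revisited:

```python
def loop_erase(trajectory):
    """Sketch of chronological loop erasure: scan the path and, whenever
    a vertex already on the erased path is revisited, delete the
    intervening loop (everything after the first occurrence)."""
    erased = []
    index = {}                 # vertex -> position in `erased`
    for vertex in trajectory:
        if vertex in index:    # closing a loop: cut back to the first visit
            cut = index[vertex] + 1
            for v in erased[cut:]:
                del index[v]
            erased = erased[:cut]
        else:
            index[vertex] = len(erased)
            erased.append(vertex)
    return erased
```

The dictionary lookup makes each revisit check constant-time, so the scan is linear in the path length up to the cost of the deletions.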
102 | \begin{figure} 103 | \begin{center} 104 | \includegraphics[scale=1.6]{LERW_200s7hdpi} 105 | \caption{Loop erased random walk on a $\mathbb{Z}^{\emph{2}}$ lattice (system size: 200; seed: 7)} 106 | \label{fig:LERW} 107 | \end{center} 108 | \end{figure} 109 | \newpage 110 | 111 | \subsection{Data Collection} 112 | All data collected for analysis was primary data generated by self-written code and systematic randomness. Data collection was an automatic procedure carried out by the code that I had written when run. A significant amount of data had to be collected for different systems in order to draw a reliable and a more decisive conclusion. As discussed before, the main implication of the presence of periodic boundaries was that the size of the vertical component was indefinite and thus the size of the result was indefinite. Since the code could not be optimised for multithreading, the task for the execution of the code could not utilise more than one core of the multi-core computational infrastructure. Also considering that the computational infrastructure was not dedicated to the task (i.e. other OS-related tasks had a higher priority), it was apparent that the code had to run for a prolonged period of time in order to collect the necessary data. Data was collected for different systems, namely, one and two dimensional random walks and two dimensional loop erased random walks. 113 | \newline 114 | Fortunately, the computational resources available for this project were vast. With access to the High Performance Computing (HPC) cluster at the Huxley Building, Imperial College London, the source code for generating the systems of various sizes was run as a series of jobs over the period of a whole day (approximately 24 hours) to generate the data, with the interpreter writing the output to a series of UTF-8 encoded sequential .dat files. Some metadata (such as the system size) was prepended to the files to distinguish and identify them more easily.
115 | \newline 116 | Data was collected for the following systems: 117 | \begin{itemize} 118 | \item 2 dimensional Random Walks of system sizes 50, 100, 200 and 400 119 | \item 1 dimensional Random Walks of system sizes 50, 100, 200 and 400 120 | \item 2 dimensional Loop Erased Random Walks of system sizes 50, 100, 200 and 400 121 | \end{itemize} 122 | On the first run on the cluster, we received a runtime error due to the incorrect configuration of the cluster, but this was quickly fixed once the system was reconfigured. Once the files containing the data were generated, they were inspected manually to ensure that the generated data was valid (i.e. conformed to the expected data) and no logical errors were present in the code. Due to the time constraint of 15 hours for each job on the cluster, and considering that only one core of the cluster was being utilised, the code for generating the 2 dimensional Random Walks of system size 400 and the code for generating the 2 dimensional Loop Erased Random Walks of system sizes 200 and 400 were unable to execute completely. This incomplete execution resulted in the files not being generated (the files were not completely written and so any associated temporary data was discarded). 123 | Hence, \emph{in this report we will only analyse the data by a system\footnote{refer to \S~\ref{subsec:sysfdef} for definition} where} $|\vb{S}| \le 200$.\newline 124 | These outputs were then run through the \emph{raw2binned.c} script that grouped the raw data into bins in order to plot a double logarithmic histogram (refer to Results and Analysis to see the resulting plot).
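The \emph{raw2binned.c} source is not reproduced in this report, but the kind of exponential binning typically used before plotting on double logarithmic axes can be sketched as follows (a hedged illustration assuming positive observations; the function name \texttt{log\_bin} and the choice of geometric bin edges are my own, not necessarily those of \emph{raw2binned.c}):

```python
import math

def log_bin(values, base=2.0):
    """Sketch: group positive observations into exponentially growing
    bins [base**k, base**(k+1)) and return (lo, hi, density) triples
    suitable for plotting on double logarithmic axes."""
    counts = {}
    for v in values:
        k = int(math.log(v, base))       # index of the bin containing v
        counts[k] = counts.get(k, 0) + 1
    bins = []
    for k in sorted(counts):
        lo, hi = base ** k, base ** (k + 1)
        density = counts[k] / ((hi - lo) * len(values))  # normalised per unit
        bins.append((lo, hi, density))
    return bins
```

Dividing each count by the bin width is what keeps a power-law tail straight on a log-log plot even though the bins grow geometrically.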
125 | \newpage 126 | 127 | 128 | \section{Results and Analysis} 129 | The collected data was used to generate double logarithmic histogram plots (refer to Figures~\ref{fig:LERWhdish}--\ref{fig:RW2Dhdish}) in order to qualitatively compare the return time distributions for 2 dimensional discrete ($\mathbb{Z}^{\emph{2}}$) LERWs, 2 dimensional discrete ($\mathbb{Z}^{\emph{2}}$) Random Walks and 1 dimensional discrete ($\mathbb{Z}$) Random Walks. \newline 130 | 131 | My research found that the return time distribution for the $\mathbb{Z}^{\emph{2}}$ random walk and the return time distribution for the $\mathbb{Z}$ random walk were very similar. It also found that there is a difference between the distribution of the return times of a random walker (1D and 2D) and the distribution of the loops erased in a LERW. 132 | 133 | \paragraph{Return distribution of 1D and 2D Random Walks.} Comparing Figure~\ref{fig:RW1Dhdish} and Figure~\ref{fig:RW2Dhdish}, I found that the return time distributions for the $\mathbb{Z}^{\emph{2}}$ random walk and the $\mathbb{Z}$ random walk were very similar. The following is a possible explanation for this discovery that draws on established results in the field. For a random walk in one dimension starting at $n$, the probability that the random walk eventually returns to $n$ equals one \cite{Redner2002}, i.e. $\Pr(\exists\, t > 0 : S_{t} = n \given S_{0} = n) = 1$, $n \in \mathbb{Z}$. However, the time required to return to $n$, averaged over all possible vertices in the system, is infinite. For a random walk in two dimensions, the survival probability $S(t)$ ultimately decays to zero. This means that the random walk is recurrent: it is certain to eventually return to its starting point, and indeed to visit any vertex of an infinite lattice. This is the second characteristic deduced in \S~1.1. This is also because a random walk has no memory and so it transits to a new state (i.e.
resets its state) every time a specific lattice vertex is traversed. Hence, recurrence also implies that every site is visited infinitely often. 134 | The arcsine law, which is concerned with the statistics of returns to the origin, also applies here as our random walk is an unrestricted random walk. From the arcsine law, we can infer that the most probable outcome is that the walk remains almost entirely on the positive or on the negative axis. This is against the natural expectation of approximately one-half of the total time being spent on the positive axis and the remaining one-half on the negative axis for a random walk which starts at $x = 0$. Surprisingly, the natural expectation is the least probable outcome. 135 | \paragraph{Difference in arc length distribution of returns and loops erased.} As depicted in Figure~\ref{fig:AREr}, there is a correlation between the distribution of the loops erased in a random walk and the probability of return to (an arbitrary) vertex in the lattice for both systems of size 100 and 200. In fact, the combined distribution is positively skewed, which implies that as the length of the loops erased increases, the probability of returning to a vertex in the lattice decreases. The fact that the data is skewed indicates that there is a difference between the arc length distribution of a random walker returning and the arc length distribution of the loops erased.
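The arcsine-law claim above can be checked with a quick simulation (a sketch of my own, not part of the project code; the helper name \texttt{fraction\_positive} and the convention for counting a step as positive are assumptions): the fraction of time a walk spends on the positive side clusters near 0 and 1, and a roughly even split is the rarest outcome.

```python
import random

def fraction_positive(n_steps, rng):
    """Fraction of steps a simple 1D walk spends on the positive side
    (a step counts as positive if it starts or ends above zero)."""
    s, positive = 0, 0
    for _ in range(n_steps):
        prev = s
        s += rng.choice((-1, 1))
        if s > 0 or prev > 0:
            positive += 1
    return positive / n_steps

rng = random.Random(10)
fractions = [fraction_positive(100, rng) for _ in range(2000)]
# Arcsine law: one-sided walks are common, balanced walks are rare.
one_sided = sum(1 for f in fractions if f < 0.1 or f > 0.9)
balanced = sum(1 for f in fractions if 0.4 <= f <= 0.6)
```

Under the limiting arcsine density $1/(\pi\sqrt{x(1-x)})$ the extreme fractions carry far more mass than the balanced middle, so \texttt{one\_sided} comfortably exceeds \texttt{balanced} for these parameters.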
136 | \newpage 137 | 138 | \begin{figure}[H] 139 | \centering 140 | \includegraphics[scale=0.6]{preturn_le_plot} 141 | \caption{Probability of return and loops erased distributions for $\mathbb{Z}^{\emph{2}}$ LERWs (system sizes: 100, 200; seed: 10) and $\mathbb{Z}$ Random Walk (system size: 200; seed: 10)} 142 | \label{fig:AREr} 143 | \end{figure} 144 | \begin{figure}[H] 145 | \centering 146 | \includegraphics[scale=0.5]{LERW_cluster_sizes_200_histogram} 147 | \caption{Histogram for the distribution of the loops erased for Random Walks on the $\mathbb{Z}^{\emph{2}}$ lattice (system size: 200; seed: 10)} 148 | \label{fig:LERWhdish} 149 | \end{figure} 150 | \begin{figure}[H] 151 | \centering 152 | \includegraphics[scale=0.5]{RW1D_cluster_sizes_200_histogram} 153 | \caption{Histogram for the returns distribution of Random Walks on the $\mathbb{Z}$ lattice (system size: 200; seed: 10)} 154 | \label{fig:RW1Dhdish} 155 | \end{figure} 156 | \begin{figure}[H] 157 | \centering 158 | \includegraphics[scale=0.5]{RW2D_cluster_sizes_200_histogram} 159 | \caption{Histogram for the returns distribution of Random Walks on the $\mathbb{Z}^{\emph{2}}$ lattice (system size: 200; seed: 10)} 160 | \label{fig:RW2Dhdish} 161 | \end{figure} 162 | \newpage 163 | 164 | \section{Conclusion} 165 | The research has identified a correspondence between the return distribution of a one-dimensional random walk and that of a two-dimensional random walk. I can conclude that the similarity in the return distributions of the one-dimensional random walk and the two-dimensional random walk is due to the survival probability of the two-dimensional random walk eventually decaying to zero, while the return probability of the one-dimensional random walk is already one. We can also conclude that there is a difference between the distribution of the return times of a random walker and the distribution of the loops erased in a LERW.
\newline 166 | 167 | This research can be extended by using Self Avoiding Random Walks (SARW), which are random walks that do not self-intersect during their traversal and are in a different universality class from LERW. 168 | 169 | 170 | \section{Appendices} 171 | \emph{Please note that all the code (listed separately here as sections) is part of the same source code file unless stated otherwise. Hence, any variables/constants declared are accessible throughout the (5.x) code sections, subject to their scope.} 172 | \subsection{System Definition code} \label{sssec:sysdef} 173 | \lstinputlisting[language=Python]{pysc/sysdef.py} 174 | \subsection{Random Walk generation code} \label{sssec:srwgen} 175 | \lstinputlisting[language=Python]{pysc/srwgen.py} 176 | \subsection{Loop Erased Random Walk code} \label{sssec:lerw} 177 | \lstinputlisting[language=Python]{pysc/lerw.py} 178 | 179 | 180 | \section{Bibliography} 181 | \bibliographystyle{amsplain} 182 | \bibliography{references} 183 | 184 | 185 | \section{Acknowledgements} 186 | I would like to thank Professor Gunnar Pruessner of Imperial College London for supervising the research project and the Nuffield Foundation for funding this research. I would also like to thank Imperial College London for allowing us to use the High Performance Computing (HPC) mainframes at the Huxley Building in ICL.
187 | \end{document} -------------------------------------------------------------------------------- /src/The_Statistics_of_Random_Walks.tex: -------------------------------------------------------------------------------- 1 | \documentclass{article} 2 | 3 | \usepackage[utf8]{inputenc} 4 | \usepackage{amsmath,physics,mathtools,listings,graphicx,pdfpages,amsfonts,amssymb,amsthm,float} 5 | 6 | \DeclarePairedDelimiter{\opair}{\langle}{\rangle} 7 | \newcommand\given[1][]{\:#1\vert\:} 8 | \graphicspath{ {figures/} } 9 | \title{The Statistics of Random Walks} 10 | \author{Habib Rehman} 11 | \date{June/July 2015} 12 | \begin{document} 13 | \maketitle 14 | 15 | \tableofcontents 16 | 17 | \hphantom 18 | \newline 19 | \section{Exposition} 20 | This section acts as an exposition to the research project. 21 | 22 | \subsection{Aims and Objectives} 23 | The main aims of this research project were to: 24 | \begin{enumerate} 25 | \item{Identify the correspondence between the return distribution of a 1D random walk and a 2D random walk} 26 | \item{Consider the correspondence between the arc length distribution of a random walker returning and the arc length distribution of the loops erased} 27 | \end{enumerate} 28 | The objectives for both aims were the following: 29 | \begin{enumerate} 30 | \item{Explore the physical phenomena (i.e. a stochastic process) that implicate a random walk} 31 | \item{Devise a mathematical model that models the physical phenomena} 32 | \item{Precisely delineate the properties of the mathematical model and all aspects regarding it} 33 | \item{Translate the mathematical model into \texttt{Python} code to implement the model} 34 | \item{Collect data by running the code with varied configurations of the mathematical model} 35 | \item{Analyse the data and draw conclusions to obtain results} 36 | \end{enumerate} 37 | 38 | \subsection{Context} 39 | Random walks are applied in a range of fields such as physics, financial economics, population genetics and computer science.
Random walks are essential mathematical models used to model stochastic processes, which are ubiquitously present in many fields. Any natural process such as diffusion is a stochastic process as it occurs randomly and is thought to be non-deterministic (unpredictable). However, modelling the process as a random walk enables us to deduce the statistics of certain behaviours of the process and thus predict the outcome (to a certain extent) of an event involving the process. 40 | The ability to predict this for stochastic processes is of great importance as it provides actionable information that can be crucial. 41 | 42 | For instance, in finance, random walks are used to model stock prices by considering the change of the stock prices with respect to time, where the price changes are the independent random variables that form the random walk. 43 | 44 | In physics, random walks are essential in predicting how fast one gas will diffuse into another, how fast heat will spread in a solid, how big fluctuations in pressure will be in a small container, and many other statistical phenomena. They were used by Albert Einstein to find the size of atoms from the Brownian motion of particles. 45 | 46 | Additionally, in computer science, random walks are used to improve the scalability of networks such as Tor \cite{dollapplications} by allowing efficient allocation of resources, which reduces network latency and prevents network failure. This example has very real impacts on the users of the network as it dictates the time that they have to invest in order to receive a response to their request. 47 | 48 | \subsection{Selection of approach} 49 | The following are the proposed approaches initially considered, out of which the most effective approach was selected.
50 | 51 | \paragraph{Approaches:} 52 | \begin{enumerate} 53 | \item{Model the stock price change (with respect to time) data for a particular stock as a random walk, erase loops and then identify the correspondence between the return and arc length distributions. All implementation in \texttt{Python}} 54 | \item{Model the Tor network bandwidth usage (with respect to time) data as a random walk and then identify the correspondence between the return and arc length distributions. All implementation in \texttt{Python}} 55 | \item{Model pseudorandomly generated sequence(s) of independent variables as a random walk and then identify the correspondence between the return and arc length distributions. All implementation in \texttt{C++}} 56 | \item{Model pseudorandomly generated sequence(s) of independent variables as a random walk and then identify the correspondence between the return and arc length distributions. All implementation in \texttt{Python}} 57 | \end{enumerate} 58 | 59 | \paragraph{Selection} I selected approach 4. because it was the most practical, feasible and reproducible approach. While approach 4. doesn't use true random data like approaches 1. and 2., it does provide reproducibility, as the seed used to generate the pseudorandom sequence always yields the same sequence and thus the results can be verified by a third party. Data obtained from approach 1. could be subject to manipulation (stock manipulation) and thus yield unreliable results, and selecting a suitable stock would pose non-problem-oriented challenges. While approach 2. addresses the problems with approach 1.
by definitively yielding true random data\footnote{As the Tor network is fully decentralised and extensive enough in size that it is infeasible for any single entity to cause significant disturbances in the network}, it is impractical to gather the required amount of data in the time available for the research and it would also pose additional non-problem-oriented challenges. Finally, approach 4. was selected over approach 3. because of its use of the \texttt{Python} programming language instead of the \texttt{C++} programming language. This is because \texttt{Python}, being a high-level language, already provided much of the functionality that we needed (a more extensive standard library), whereas in \texttt{C++} we would have had to implement most of that functionality ourselves. So, \texttt{Python} enabled us to stay problem-oriented and accomplish our objectives faster. 60 | 61 | \section{Abstract} 62 | The purpose of this research has been to identify the precise correspondence between the return distribution of a one-dimensional random walk and a two-dimensional random walk. This research also considers the correspondence between the arc length distribution of a random walker returning and the arc length distribution of the loops erased. 63 | 64 | \section{Introduction} 65 | A random walk is a stochastic process formed by the repeated summation of independent, identically distributed random variables, and is regarded as a V-valued Markov chain on a graph \cite{schweinsberg2009loop}. This research considers random walks on an integer lattice $\mathbb{Z}^{\emph{d}}$ as the discrete space practically allows for more analysis (operations) to be carried out with feasible computational power. The research is also concerned with loop erased random walks in the second dimension (i.e.
$d=2$), as it is below the critical dimension for random walks, $d = 4$, at which the process converges to Brownian motion \cite{lawler2010} and scaling limits exist (i.e. the limit as the lattice spacing approaches zero). Above the critical dimension the process becomes trivial in the regard that self-intersection becomes highly improbable. The random walks studied correspond to increment distributions with the properties of having zero mean and finite variance. These properties are necessary for normal convergence in the increments of the distributions to occur. They also allow for the succinct expression of the main result. Firstly, I will delineate the statistical behaviour of a simple random walk in what follows. Thereafter, I will provide the appropriate theorems and definitions for random walks and loop erased random walks. 66 | 67 | \subsection{Simple Random Walk on a $\mathbb{Z}$ lattice.} 68 | The simplest non-trivial case is to let $X_{1}, X_{2}, \ldots, X_{n}$ represent the outcomes of independent experiments in one dimension. Suppose that these experiments are tosses of a fair coin and there is a gain of $+1$ for each head and a loss of $-1$ for each tail. The outcome of a flip of the coin is equally likely to be heads or tails, so the walk is clearly unbiased, in that there is no preference for gain or loss. Then $S_{n}$ represents the cumulative gain or loss on the first $n$ plays. 69 | Then the sequence $(S_{n})_{n=0}^{\infty}$ would be described as a simple random walk (in this scenario it is related to the \emph{gambler's ruin} estimate) on the $\mathbb{Z}$ lattice. When the behaviour of such a simple process is analysed in detail, it becomes apparent that it is more intriguing than it appears to be. What we can deduce is that: 70 | \begin{enumerate} 71 | \item The walk returns to its starting position (i.e.
0) at time $2n$ with probability 72 | \begin{center} 73 | \includegraphics[scale=0.8]{WalkReturnToStartT2nProbability} 74 | \end{center} 75 | By \emph{Stirling's approximation}, it is 76 | \begin{center} 77 | \includegraphics[scale=0.8]{WalkReturnToStartTnProbability} 78 | \cite{Durrett2010} 79 | \end{center} 80 | \item The walk eventually reaches each integer within the $\mathbb{Z}$ lattice with certainty and so visits it infinitely often. Suppose $S_{0} := 0$ and $T := \inf\{n : S_{n} = +1\}$; then $P(T < \infty) = 1$ and $P(T = \infty) = 0$ 81 | \item The distribution of \emph{T} is 82 | \begin{center} 83 | $P(T = 2n - 1)\ =\ (-1)^{n-1}\binom{\frac{1}{2}}{n}$ 84 | \end{center} 85 | \end{enumerate} 86 | A Simple Random Walk generally refers to a random walk in one dimension. However, this research is concerned with random walks in 2 dimensions. But interestingly, a 2 dimensional random walk is essentially composed of two Simple Random Walks, one along each axis, as the steps along each axis are selected independently. 87 | \newline 88 | 89 | 90 | \subsection{Loop Erased Random Walks (LERW)} 91 | A loop erased random walk on the integer lattice is a process obtained from a Random Walk by erasing the loops chronologically. In two dimensions, the process is expected to have a conformally invariant continuum limit, and nonrigorous conformal field theory gives an exact prediction of the exponent as a simple rational number. In three dimensions there is no reason to believe that the exponent takes on a rational value. Once the loops are erased from the random walk, the resulting path is distributed like a branch of a uniform spanning tree (UST), which is a spanning tree chosen uniformly at random, and so the algorithms for generating LERWs are used in application to produce uniform spanning trees.
92 | 93 | \paragraph{Loop Erased Random Walk definition.} \label{alg:LE} If $\omega = [\omega_{0}, \ldots, \omega_{n}]$ is a path, then the \emph{(chronological) loop-erasure} function $LE(\omega)$ is defined as follows\cite{klafter2011} \newline 94 | Let $\lambda_{0} = \max\{j\ \le \ n: \omega_{j}=\omega_{0}\}$. 95 | \newline 96 | Set $\beta_{0}=\omega_{0}=\omega_{\lambda_{0}}$. 97 | \newline 98 | Suppose $\lambda_{i} < n$. Let $\lambda_{i+1} = \max\{j\ \le n: \omega_{j}=\omega_{\lambda_{i}+1}\}$. 99 | \newline 100 | Set $\beta_{i+1}=\omega_{\lambda_{i+1}}=\omega_{\lambda_{i}+1}$. 101 | \newline 102 | If $i_{\omega} = \min \{i : \lambda_{i} = n\} = \min\{i : \beta_{i} = \omega_{n}\}$, 103 | 104 | then $LE\left(\omega\right) = \lbrack \beta_{0},\ldots,\beta_{i_{\omega}}\rbrack$ 105 | 106 | \section{Methodology} 107 | In this section, I will explain the theory behind the code that was written to produce the resultant data. The code was mainly\footnote{The \emph{raw2binned.c} script in \texttt{C++} was used for the sake of efficiency and was originally written by Francesc, later modified by Gunnar Pruessner and then adapted by me for this project} written in \texttt{Python} as it enabled us to stay problem-oriented, allowed us to implement the LERW rapidly and readily provided much useful functionality through its extensive libraries. We used the Python \texttt{matplotlib} library to generate the required statistical plots from the data as it integrated seamlessly with our workflow. We also used the \texttt{numpy} library for efficiently creating and manipulating the large matrices for the system instances. 108 | 109 | Code distribution was a major issue, as sharing code via email not only made it hard to keep track of but also introduced great inconsistencies and miscommunication. So, I proposed the usage of the version control system called \emph{git}.
The usage of this system enabled us to stay in sync with the latest code version of the project, resolve any conflicts in the code and update the code. So our workflow consisted simply of fetching the latest code and pushing the changes made\footnote{The changes made were committed and pushed to \emph{GitHub} (a centralised project hosting platform that uses the \emph{git} version control system) on which the project was hosted.}, each with a single command\footnote{In the shell/terminal/command-line}. 110 | 111 | Overall, these changes made the tasks easier to perform and made the workflow more efficient, saving us an enormous amount of time. 112 | 113 | 114 | \subsection{Random Walk Implementation} 115 | The Random Walk was theoretically modelled to traverse the 2 dimensional surface of a 3 dimensional cylinder (i.e. the system) with two open vertical boundaries and two periodic horizontal boundaries. At the open boundaries, the Random Walk is initiated from the starting boundary and ceases at the ending boundary. Due to the curvilinear geometry of a cylinder, the system had the property of allowing the random walk to traverse an infinite number of steps in any vertical direction given that the random walk did not reach the ending boundary. 116 | The system was implemented as a 2 dimensional $\mathbb{Z}$ lattice of a particular size (length by circumference) which represented the 2 dimensional surface of the cylinder and, to implement the previously stated property of the original system, periodic boundaries were introduced. The function of the periodic boundaries was to ensure that, whenever the random walk crossed a system boundary, it continued to traverse from the opposite boundary until it took a valid step (i.e. a step within the system boundaries) (see Figure~\ref{fig:RW}). The following is a formal definition of the system.
117 | 118 | \paragraph{System definition.}\label{subsec:sysfdef} Given that $\vb{S}=\opair{i, j}$ is the system of size $(L, C)$, where \emph{i} and \emph{j} are the respective vertical and horizontal components of the system (refer to Appendix~\ref{sssec:sysdef} for the actual system definition) 119 | \begin{enumerate} 120 | \item The system generation can be expressed as 121 | \begin{center} 122 | $\vb{S} = \left\{\left. (\sum_{k=0}^L i_k, \sum_{k=0}^C j_k) \; \right\vert \; i_k, j_k \in\mathbb{Z} \right\}$ 123 | \end{center} 124 | \item The length and the circumference $(L, C)$ of the system are of equal size, which implies that the lattice was square (for this research) 125 | \item $\vb{S}_{i = 0}$ is the starting boundary and $\vb{S}_{i = L}$, where \emph{L} represents the length of the system, is the ending boundary. 126 | \item $(\vb{S}_{i = 0}, \vb{S}_{j = y})$ is the starting point, where \emph{y} is the first pseudorandom number in the sequence generated by the library function using a given seed. 127 | \item Periodic boundaries exist at $(\vb{S}_{j = 0}, \vb{S}_{j = C})$, where \emph{C} is the circumference of the system, to allow \emph{j} to be of an indefinite size 128 | \end{enumerate} 129 | 130 | \paragraph{Random Walk generation.} A Random Walk, $\omega_{n}$, was generated using the generation function 131 | \begin{center} 132 | $\omega_{n} = \sum_{k=0}^n X_k$ 133 | \end{center} 134 | where $\{X_k\}$ are independent and identically distributed random variables. 135 | Implementing the periodic boundaries as well resulted in the generation function 136 | \begin{center} 137 | $\omega_{n} = \left\{ \sum_{k=0}^n X_k \given[\Big] X_k \notin \{0, L, C\} \right\}$ 138 | \end{center} 139 | I translated this into \texttt{Python} code which is viewable in Appendix~\ref{sssec:srwgen}.
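As a minimal sketch of this generation function (the name \texttt{generate\_positions} and the uniform step distribution are my assumptions; the project's actual code is in Appendix~\ref{sssec:srwgen}), the position after $n$ steps is the running sum of the i.i.d. steps $X_k$, with the horizontal component wrapped modulo the circumference by the periodic boundary:

```python
import random

def generate_positions(n_steps, C, seed=7):
    """Sketch of the generation function: accumulate i.i.d. steps X_k,
    applying the periodic boundary (mod C) to the horizontal component."""
    rng = random.Random(seed)
    i = j = 0
    positions = [(i, j)]
    for _ in range(n_steps):
        di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])  # X_k
        i += di
        j = (j + dj) % C        # periodic boundary on the circumference
        positions.append((i, j))
    return positions
```

Only the horizontal coordinate is wrapped here; the open vertical boundaries of the full system (stopping at $i = L$) are omitted for brevity.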
In the code, the \emph{trajectory} list\footnote{Collectively, the \emph{path} taken (by the random walk)} containing the running set of vertices, corresponding to the pseudorandom steps that the random walk had taken, was a \texttt{numpy} array because of the extensive functionality (i.e. functions/methods) that was required to perform the non-trivial manipulations quickly as well as to stay problem-oriented. 140 | 141 | When implementing the algorithm, I had to duplicate the trajectory list and perform some operations on it. However, the problem was that by default \texttt{Python} only performs a so-called \emph{shallow copy} of the list, which creates a new list that only references the data that the old list links to \cite{DPscopy2016}. Hence, when the new list is changed the old list is changed correspondingly, as both reference the same data. To resolve this issue, I made a \emph{deep copy} of the list which, instead of creating a duplicate list that just references the old data, actually duplicates the underlying objects (by recursively copying them) and creates a duplicate list that references the new copies. This makes any modification to the new list independent from the old list \cite{DPscopy2016}. 142 | 143 | While I originally planned to only use \texttt{matplotlib} for plotting the analytical graphs (i.e. histograms), I figured out a way to use the \texttt{matplotlib} API to plot the actual trajectory that a random walk makes. This offered us a qualitative insight into the structure and behaviour of a random walk within a particular system. Refer to the following figure (Figure~\ref{fig:RW}) to see a sample plot. Refer to Appendix~\ref{sssec:rwplot} for the plot code that uses the matplotlib API.
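The shallow versus deep copy behaviour described above can be demonstrated in a few lines (a standalone illustration using the standard \texttt{copy} module, not taken from the project code):

```python
import copy

# Shallow copy: the outer list is new, but the inner lists (the vertex
# records) are shared with the original.
trajectory = [[0, 0], [0, 1], [1, 1]]
shallow = copy.copy(trajectory)
shallow[0][0] = 99                 # mutates the shared inner list
assert trajectory[0][0] == 99      # the original changed too

# Deep copy: inner objects are duplicated recursively, so the copy is
# fully independent of the original.
trajectory = [[0, 0], [0, 1], [1, 1]]
deep = copy.deepcopy(trajectory)
deep[0][0] = 99
assert trajectory[0][0] == 0       # the original is untouched
```

Note that for flat lists of immutable tuples a shallow copy would have sufficed; the distinction matters because the trajectory's vertex records are themselves mutable.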
\begin{figure}[H]
\begin{center}
\includegraphics[scale=1.6]{RW_200s7hdpi}
\caption{Random walk on a $\mathbb{Z}^{2}$ lattice (system size: 200; seed: 7)}
\label{fig:RW}
\end{center}
\end{figure}

\paragraph{Random Walk Loop Erasure.} A loop is simply the result of the random walk intersecting itself. Since the random walk traverses an integer lattice, for a loop to occur a vertex on the lattice has to be traversed at least twice: at the start of the loop and at its end. To erase loops, an algorithm was devised which, given a list of vertices (i.e.\ the random walk trajectory), iteratively removed every duplicated vertex together with everything from its first occurrence onwards to (and including) its last occurrence, until all duplicates (i.e.\ loops) were removed. The $LE(\omega)$ algorithm defined in the Loop Erased Random Walk definition in \S~\ref{alg:LE} is a condensed form of the algorithm used to generate the LERWs behind the data. When implementing and testing the LERW, the qualitative insight gained from plotting the actual LERW trajectory using \texttt{matplotlib} was instrumental in evaluating the effectiveness of the code and identifying any logic errors\footnote{``A logic error (or logical error) is a mistake in a program's source code that results in incorrect or unexpected behaviour'' from \texttt{techterms.com/definition/logic\_error}}. While the \texttt{matplotlib} API sufficed for the random walk implementation, it did not achieve the expected results for the loop erased random walk because of its inability to handle the discontinuities that result from the erasure. This is evident in Figure~\ref{fig:LERWw}, from which we can qualitatively conclude that the output of the \texttt{matplotlib} API plot code (Appendix~\ref{sssec:rwplot}) is incorrect.
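One standard (chronological) way to implement the erasure described above can be sketched as follows; this is a simplified, hypothetical version of the appendix code, assuming the trajectory is a list of vertex tuples:

```python
def loop_erase(trajectory):
    """Erase loops chronologically: whenever a vertex reappears,
    drop everything from just after its first occurrence up to and
    including the repeat, then continue with the rest of the walk."""
    path, seen = [], {}                 # seen maps vertex -> index in path
    for v in trajectory:
        if v in seen:
            del path[seen[v] + 1:]      # a loop closed at v: cut it out
            seen = {u: k for k, u in enumerate(path)}
        else:
            seen[v] = len(path)
            path.append(v)
    return path
```

For example, the trajectory `[(0,0), (0,1), (1,1), (0,1), (0,2)]` contains one loop closed at `(0,1)`, and erasing it leaves `[(0,0), (0,1), (0,2)]`.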
So, I had to write an algorithm that accounted for this by plotting each continuous sub-sequence separately and then joining them. Figure~\ref{fig:LERW} is the output of the final loop erased random walk plot code (Appendix~\ref{sssec:lerwplot}).
Refer to Appendix~\ref{sssec:lerw} for the implementation of the LERW.

\begin{figure}[H]
\begin{center}
\includegraphics[scale=1.6]{LERWw}
\caption{Loop erased random walk plot using the \texttt{matplotlib} API (system size: 200; seed: 9)}
\label{fig:LERWw}
\end{center}
\end{figure}

\begin{figure}[H]
\begin{center}
\includegraphics[scale=1.6]{LERW_200s7hdpi}
\caption{Loop erased random walk on a $\mathbb{Z}^{2}$ lattice (system size: 200; seed: 7)}
\label{fig:LERW}
\end{center}
\end{figure}

\subsection{Collecting Data}
All data collected for analysis was primary data generated by self-written code and systematic randomness. Data collection was an automatic procedure carried out by the code when run. A significant amount of data had to be collected for different systems in order to draw a reliable and more decisive conclusion. As discussed before, the main implication of the presence of periodic boundaries was that the size of the vertical component was indefinite, and thus the size of the result was indefinite. Since the code could not be optimised for multithreading, its execution could not utilise more than one core of the multi-core computational infrastructure. Also, considering that the computational infrastructure was not dedicated to the task (i.e.\ other OS-related tasks had a higher priority), it was apparent that the code had to run for a prolonged period of time in order to collect the necessary data.
\newline
Fortunately, the computational resources at our disposal for this project were vast, in particular High Performance Computing (HPC). The source code was run on the HPC as a series of jobs over the period of a whole day ($\sim$24 hours). Bash code was written to instruct the HPC to output the data generated by the Python interpreter as a series of UTF-8 encoded sequential .dat files (in the format required for analysis). Some metadata (such as the system size) was prepended to the files to make them easier to distinguish and identify in the analysis stage.
\newline
Data was collected for the following systems:
\begin{itemize}
\item 2 dimensional Random Walks of system sizes 50, 100, 200 and 400
\item 1 dimensional Random Walks of system sizes 50, 100, 200 and 400
\item 2 dimensional Loop Erased Random Walks of system sizes 50, 100, 200 and 400
\end{itemize}
On the first run on the cluster, we received a runtime error due to the incorrect configuration of the cluster, but this was quickly fixed once the system was reconfigured. Once the files containing the data were generated, they were inspected manually to ensure that the generated data was valid (i.e.\ conformed to the expected data) and that no logic errors were present in the code. Due to the time constraint of 15 hours per job on the cluster, and considering that only one core of the cluster was being utilised, the code for generating the 2 dimensional Random Walks of system size 400 and the 2 dimensional Loop Erased Random Walks of system sizes 200 and 400 was unable to execute completely. This incomplete execution resulted in those files not being generated (the files were not completely written, so any associated temporary data was discarded).
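The metadata-prepended output format described above might, for example, look like this (the field names and layout here are hypothetical; the actual format is whatever the analysis stage expected):

```python
import os
import tempfile

def write_dat(path, system_size, seed, values):
    """Write one UTF-8 .dat file: a metadata header line, then one value per line."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(f"# system_size={system_size} seed={seed}\n")
        f.writelines(f"{v}\n" for v in values)

# demo: write a small file and read it back
path = os.path.join(tempfile.mkdtemp(), "RW2D_200.dat")
write_dat(path, 200, 10, [1, 4, 2])
lines = open(path, encoding="utf-8").read().splitlines()
```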
Due to the scarcity of time, I made the decision to work with the data already obtained for the analysis, instead of spending more time generating data and producing less analysis. We agreed that this was the most productive course of action. Hence, \emph{in this report we will only analyse the data of systems\footnote{refer to \S~\ref{subsec:sysfdef} for the definition} where} $|\vb{S}| \le 200$.\newline

\section{Results and Analysis}
After the output .dat files were received, data such as the return distributions needed to be plotted as histograms. The problem I encountered here was grouping the raw data into suitably sized bins in order to construct the histogram. Since the data was very large (up to 100\,MB), we considered using a script written in \textit{C}. However, since this script was in \textit{C} and I was working in \textit{Python}, there was no direct programmatic way of exchanging the data (variables in one language could not be used in the other). Instead of rewriting the script in \textit{Python} and compromising performance, I exchanged the data by modifying the script to output the data in a particular serialised format from \textit{C} and then parsing that output in \textit{Python} (refer to Appendix~\ref{sssec:histparse} for the code). Once parsed, the data was available in \textit{Python}'s memory space and the histogram could be generated. The initial histogram depicted a very high positive skew, as only a few significant values dominated, resulting in an ambiguous trend. Discussing this, I found that the great variation in the data was not represented well on linear scales, and so I was advised to use a double logarithmic scale.
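The idea of the double logarithmic scale can be illustrated on a small scale (this is a \texttt{numpy} sketch with synthetic data, not the project's \textit{C} binning script): heavy-tailed count data is grouped into logarithmically spaced bins before plotting.

```python
import numpy as np

def loglog_bins(data, nbins=20):
    """Group positive integer data into logarithmically spaced bins,
    ready for plotting on double-logarithmic axes."""
    data = np.asarray(data)
    edges = np.logspace(0, np.log10(data.max() + 1), nbins + 1)
    counts, edges = np.histogram(data, bins=edges)
    centres = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centres
    return centres, counts

# demo with synthetic heavy-tailed data standing in for return times
rng = np.random.default_rng(0)
centres, counts = loglog_bins(rng.zipf(2.0, 1000))
```

The `centres`/`counts` pairs are what would then be passed to \texttt{matplotlib}'s \texttt{loglog} to produce plots like Figures~\ref{fig:LERWhdish}--\ref{fig:RW2Dhdish}.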
The double logarithmic histograms (refer to Figures~\ref{fig:LERWhdish}--\ref{fig:RW2Dhdish}) were used to qualitatively compare and evaluate the return time distributions for 2 dimensional discrete ($\mathbb{Z}^{2}$) LERWs, 2 dimensional discrete ($\mathbb{Z}^{2}$) Random Walks and 1 dimensional discrete ($\mathbb{Z}$) Random Walks.\newline

My research found that the return time distribution for a $\mathbb{Z}^{2}$ random walk and that for a $\mathbb{Z}$ random walk were very similar. It also found that there is a difference between the distribution of the return times of a random walker (1D and 2D) and the distribution of the loops erased in an LERW.

\paragraph{Return distribution of 1D and 2D Random Walks.} Comparing Figure~\ref{fig:RW1Dhdish} and Figure~\ref{fig:RW2Dhdish}, I found that the return time distributions for the $\mathbb{Z}^{2}$ and $\mathbb{Z}$ random walks were very similar. For a random walk in one dimension starting at $n$, the probability that the walk eventually returns to $n$ equals one \cite{Redner2002}, i.e.\ $\Pr(S_{n}=n) = 1$, $n \in \mathbb{Z}$. However, the time required to return to $n$, averaged over all possible vertices in the system, is infinite. We conclude that this observation is due to the survival probability $S(t)$ ultimately decaying to zero for a random walk in two dimensions. This means that the random walk is recurrent: it is certain to eventually return to its starting point, and indeed to visit any vertex of an infinite lattice. This is the second deduced characteristic stated in \S1.1. This is also because a random walk has no memory (its random variables are independently distributed), so it transits to a new state (i.e.\ resets its state) every time a specific lattice vertex is traversed.
Hence, recurrence also implies that every site is visited infinitely often.

The arcsine law, which is concerned with the statistics of returns to the origin, also applies here, as our random walk is an unrestricted random walk. From the arcsine law, we can infer that the most probable outcome is that the walk remains entirely on the positive or entirely on the negative axis. This goes against the natural expectation that, for a random walk starting at $x = 0$, approximately one half of the total time is spent on the positive axis and the remaining half on the negative axis. Surprisingly, the natural expectation is the least probable outcome.\newline
\paragraph{Difference in arc length distribution of returns and loops erased.} As depicted in Figure~\ref{fig:AREr}, there is a correlation between the distribution of the loops erased in a random walk and the probability of return to an (arbitrary) vertex in the lattice, for both systems of size 100 and 200. In fact, the combined distribution is positively skewed, which implies that as the loops erased increase, the probability of returning to a vertex in the lattice decreases. The fact that the data is skewed indicates that there is a difference between the arc length distribution of a random walker's returns and the arc length distribution of the loops erased.
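The arcsine-law behaviour described above can be checked with a small Monte Carlo experiment (an illustrative check, not part of the project code): the fraction of time a simple 1D walk spends on the positive side piles up near 0 and 1 rather than near one half.

```python
import random

def positive_fraction(n_steps, rng):
    """Fraction of steps a simple 1D random walk spends at x > 0."""
    x, positive = 0, 0
    for _ in range(n_steps):
        x += rng.choice((-1, 1))
        if x > 0:
            positive += 1
    return positive / n_steps

rng = random.Random(7)
fracs = [positive_fraction(1000, rng) for _ in range(1000)]
extreme = sum(f < 0.1 or f > 0.9 for f in fracs) / len(fracs)
middle = sum(0.4 < f < 0.6 for f in fracs) / len(fracs)
# Under the arcsine law, fractions near 0 or 1 dominate fractions near 1/2,
# so `extreme` comes out well above `middle`.
```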
\begin{figure}[H]
\centering
\includegraphics[scale=0.6]{preturn_le_plot}
\caption{Probability of return and loops erased distributions for $\mathbb{Z}^{2}$ LERWs (system sizes: 100, 200; seed: 10) and a $\mathbb{Z}$ Random Walk (system size: 200; seed: 10)}
\label{fig:AREr}
\end{figure}
\begin{figure}[H]
\centering
\includegraphics[scale=0.5]{LERW_cluster_sizes_200_histogram}
\caption{Histogram of the distribution of the loops erased for Random Walks on a $\mathbb{Z}^{2}$ lattice (system size: 200; seed: 10)}
\label{fig:LERWhdish}
\end{figure}
\begin{figure}[H]
\centering
\includegraphics[scale=0.5]{RW1D_cluster_sizes_200_histogram}
\caption{Histogram of the returns distribution of Random Walks on a $\mathbb{Z}$ lattice (system size: 200; seed: 10)}
\label{fig:RW1Dhdish}
\end{figure}
\begin{figure}[h]
\centering
\includegraphics[scale=0.5]{RW2D_cluster_sizes_200_histogram}
\caption{Histogram of the returns distribution of Random Walks on a $\mathbb{Z}^{2}$ lattice (system size: 200; seed: 10)}
\label{fig:RW2Dhdish}
\end{figure}
\newpage

\section{Conclusion}
\subsection{Findings}
The research has shown that the return distributions of a one-dimensional random walk and a two-dimensional random walk are almost identical. It can be concluded that this similarity is due to the survival probability of the two-dimensional random walk eventually decaying to zero. Since the return probability of a one-dimensional random walk is one, this means that the return probability of a two-dimensional random walk is very close to one. We can also conclude that there is a difference between the distribution of the return times of a random walker and the distribution of the loops erased in an LERW.
To further develop this research, I would incorporate Self Avoiding Random Walks (SARW), which are random walks that do not self-intersect during their traversal and are in a different universality class from LERWs. I would also consider the relationships in the statistics of the enclosed area between bidirectional LERWs, the statistics of the intersections, and the variance of the number of intersections. A more abstract area that could be explored is the topography of LERWs, in particular the average eccentricity of the loop erased random walk trajectory.

\subsection{Implications}
I believe that the results I have obtained could have an impact on many stochastic processes and on the people whose careers are associated with them. The return probability of a two-dimensional and a one-dimensional random walk being almost identical implies that a stochastic process modelled using two essential (independent) random variables will inevitably return to its current position. What this means in the real world, for instance in the stock market, is that the price of a stock with respect to time (where the stock price and time are the two random variables) will eventually return to its current value (over an indefinite time period). In this application, the discovery may be beneficial to shareholders, as it allows them to anticipate a future market position and make actionable decisions with that information. The cumulative effect of this could dictate the future of the market both short-term and long-term. Similarly, when applied to a network, this discovery asserts that the current bandwidth usage in the network will eventually be reached again over an indefinite time period. This could prompt network administrators to anticipate peak bandwidth usage by deducing the return time to peak bandwidth usage, and to allocate the resources across the network more efficiently.
\appendix
\section{Appendices}
\emph{Please note that any variables/constants declared are accessible (globally) throughout the respective (8.x) code sections of their scope. Also note that this is not all the code produced for the research project, but only the parts that this report explores. Appendix~\ref{sssec:imports} declares the libraries used globally and the respective identifiers used to reference them in the subsequent code.}
\subsection{Imports} \label{sssec:imports}
\lstinputlisting[language=Python]{pysc/import.py}
\subsection{System Definition code} \label{sssec:sysdef}
\lstinputlisting[language=Python]{pysc/sysdef.py}
\subsection{Random Walk generation code} \label{sssec:srwgen}
\lstinputlisting[language=Python]{pysc/srwgen.py}
\subsection{Loop Erased Random Walk code} \label{sssec:lerw}
\lstinputlisting[language=Python]{pysc/lerw.py}
\subsection{Random Walk plot code (using the matplotlib API)} \label{sssec:rwplot}
\lstinputlisting[language=Python]{pysc/plot.py}
\subsection{Loop Erased Random Walk plot code} \label{sssec:lerwplot}
\lstinputlisting[language=Python]{pysc/lerwplot.py}
\subsection{Histogram plot (and parse from raw2bin.c) code} \label{sssec:histparse}
\lstinputlisting[language=Python]{pysc/histparse.py}
\hphantom
\newline
\section{Bibliography}
\bibliographystyle{amsplain}
\bibliography{references}
\hphantom
\newline
\section{Acknowledgements}
I would like to thank Professor Gunnar Pruessner of Imperial College London for supervising the research project, and the Nuffield Foundation for funding this research. I would also like to thank Imperial College London for allowing us to use the High Performance Computing (HPC) mainframes at the Huxley Building.
\hphantom
\newline
\textless\textgreater\ \emph{with} \textless3 \emph{by} Habib Rehman

\end{document}