├── conquer ├── __init__.py └── joint.py ├── papers ├── NcvxQR.pdf ├── LinearES.pdf ├── SmoothQR.pdf └── NcvxQRsupp.pdf ├── setup.py ├── README.md ├── LICENSE └── experiments ├── lowd.ipynb └── highd_inference.ipynb /conquer/__init__.py: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /papers/NcvxQR.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/WenxinZhou/conquer/HEAD/papers/NcvxQR.pdf -------------------------------------------------------------------------------- /papers/LinearES.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/WenxinZhou/conquer/HEAD/papers/LinearES.pdf -------------------------------------------------------------------------------- /papers/SmoothQR.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/WenxinZhou/conquer/HEAD/papers/SmoothQR.pdf -------------------------------------------------------------------------------- /papers/NcvxQRsupp.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/WenxinZhou/conquer/HEAD/papers/NcvxQRsupp.pdf -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | from setuptools import setup, find_packages 2 | 3 | with open("README.md", "r", encoding="utf-8") as fh: 4 | long_description = fh.read() 5 | 6 | install_requires = ['numpy', 'scipy', 'scikit-learn', 'qpsolvers', 'cvxopt', 'torch', 'matplotlib'] 7 | 8 | setup( 9 | name='conquer', 10 | version="2.0", 11 | packages=find_packages(), 12 | author="Wenxin Zhou", 13 | author_email="wenxinz@uic.edu", 14 | description="Convolution Smoothed Quantile and 
Expected Shortfall Regression", 15 | long_description=long_description, 16 | long_description_content_type="text/markdown", 17 | url="https://github.com/WenxinZhou/conquer", 18 | classifiers=[ 19 | "Programming Language :: Python :: 3", 20 | "License :: OSI Approved :: GNU General Public License v3 (GPLv3)", 21 | "Operating System :: OS Independent", 22 | ], 23 | package_data={ 24 | '': ['*.txt', '*.rst'], 25 | }, 26 | install_requires=install_requires, 27 | ) -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # conquer (Convolution Smoothed Quantile Regression) 2 | This package contains two modules: the `linear` module, which implements convolution smoothed quantile regression in both low and high dimensions, and the `joint` module, which fits joint quantile and expected shortfall regression models. For an R implementation, see the ``conquer`` package on [``CRAN``](https://cran.r-project.org/package=conquer) (also embedded in [``quantreg``](https://cran.r-project.org/package=quantreg) as an alternative to the `fn` and `pfn` methods). 3 | 4 | The `low_dim` class in the `linear` module applies a convolution smoothing approach to fit linear quantile regression models, known as *conquer*. It also constructs normal-based and (multiplier) bootstrap confidence intervals for all slope coefficients. The `high_dim` class fits sparse quantile regression models in high dimensions via *L1*-penalized and iteratively reweighted *L1*-penalized (IRW-*L1*) conquer methods.
The IRW method is inspired by the local linear approximation (LLA) algorithm proposed by [Zou & Li (2008)](https://doi.org/10.1214/009053607000000802) for folded concave penalized estimation, exemplified by the SCAD penalty ([Fan & Li, 2001](https://fan.princeton.edu/papers/01/penlike.pdf)) and the minimax concave penalty (MCP) ([Zhang, 2010](https://doi.org/10.1214/09-AOS729)). Computationally, each weighted *L1*-penalized conquer estimator is solved using the local adaptive majorize-minimization algorithm ([LAMM](https://doi.org/10.1214/17-AOS1568)). For comparison, the proximal ADMM algorithm ([pADMM](https://doi.org/10.1080/00401706.2017.1345703)) is also implemented. 5 | 6 | The `LR` class in the `joint` module fits joint linear quantile and expected shortfall (ES) regression models ([Dimitriadis & Bayer, 2019](https://doi.org/10.1214/19-EJS1560); [Patton, Ziegel & Chen, 2019](https://doi.org/10.1016/j.jeconom.2018.10.008)) using either FZ loss minimization ([Fissler & Ziegel, 2016](https://doi.org/10.1214/16-AOS1439)) or two-step procedures ([Barendse, 2020](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2937665); [Peng & Wang, 2023](https://onlinelibrary.wiley.com/doi/10.1002/sta4.619); [He, Tan & Zhou, 2023](https://doi.org/10.1093/jrsssb/qkad063)). For the second step of ES estimation, setting ``robust=True`` uses the Huber loss with an adaptively chosen robustification parameter to gain robustness against heavy-tailed errors; see [He, Tan & Zhou (2023)](https://doi.org/10.1093/jrsssb/qkad063) for more details. Moreover, a combination of the iteratively reweighted least squares (IRLS) algorithm and quadratic programming is used to compute non-crossing ES estimates, ensuring that the fitted ES does not exceed the fitted quantile at each observation.
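As a rough sketch of the idea behind the `robust` option (for illustration only, not the package's internal implementation), the Huber loss is quadratic for small residuals and linear beyond a robustification threshold, which caps the influence of heavy-tailed residuals:

```python
import numpy as np

def huber_loss(r, c=1.345):
    # Illustrative Huber loss with a fixed threshold `c` (a conventional
    # default chosen here for the sketch): 0.5*r^2 for |r| <= c,
    # and c*(|r| - 0.5*c) for |r| > c.
    a = np.abs(r)
    return np.where(a <= c, 0.5 * a ** 2, c * (a - 0.5 * c))
```

In the package itself, the robustification parameter is chosen adaptively from the data rather than fixed.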
7 | 8 | The `KRR` and `ANN` classes in the `joint` module implement two nonparametric methods for joint quantile and expected shortfall regression: kernel ridge regression ([Takeuchi et al., 2006](https://www.jmlr.org/papers/v7/takeuchi06a.html)) and neural network regression. For fitting nonparametric QR through the `qt()` method in both `KRR` and `ANN`, a `smooth` option is available; when set to `True`, the check loss is smoothed by convolution with a Gaussian kernel. For fitting nonparametric ES regression using nonparametrically generated surrogate response variables, the `es()` method provides two options: the *squared loss* (`robust=False`) and the *Huber loss* (`robust=True`). 9 | 10 | 11 | ## Dependencies 12 | 13 | ``` 14 | python >= 3.9, numpy, scipy, scikit-learn, cvxopt, qpsolvers, torch 15 | optional: matplotlib 16 | ``` 17 | 18 | 19 | ## Installation 20 | 21 | Download the folder ``conquer`` (containing `linear.py` and `joint.py`) into your working directory, or clone the repository and install: 22 | ``` 23 | git clone https://github.com/WenxinZhou/conquer.git 24 | python setup.py install 25 | ``` 26 | 27 | ## Examples 28 | 29 | ``` 30 | import numpy as np 31 | import numpy.random as rgt 32 | from scipy.stats import t 33 | from conquer.linear import low_dim, high_dim, pADMM 34 | ``` 35 | Generate data from a linear model with random covariates. The dimension of the feature/covariate space is `p`, and the sample size is `n`. The intercept is 4, and all `p` regression coefficients are set to 1 in magnitude. The errors are generated from the *t2*-distribution (*t*-distribution with 2 degrees of freedom), centered by subtracting the population *τ*-quantile of *t2*. 36 | 37 | When `p < n`, the `low_dim` class contains methods for fitting linear quantile regression models with uncertainty quantification. If the bandwidth `h` is unspecified, the default value *max\{0.01, \{τ(1-τ)\}^0.5 \{(p+log(n))/n\}^0.4\}* is used. The default kernel function is ``Laplacian``.
Other choices are ``Gaussian``, ``Logistic``, ``Uniform`` and ``Epanechnikov``. 38 | 39 | ``` 40 | n, p = 8000, 400 41 | itcp, beta = 4, np.ones(p) 42 | tau, t_df = 0.75, 2 43 | 44 | X = rgt.normal(0, 1.5, size=(n,p)) 45 | Y = itcp + X.dot(beta) + rgt.standard_t(t_df, n) - t.ppf(tau, t_df) 46 | 47 | qr = low_dim(X, Y, intercept=True) 48 | model = qr.fit(tau=tau) 49 | 50 | # model['beta'] : conquer estimate (intercept & slope coefficients). 51 | # model['res'] : n-vector of fitted residuals. 52 | # model['niter'] : number of iterations. 53 | # model['bw'] : bandwidth. 54 | ``` 55 | 56 | At each quantile level *τ*, the `norm_ci` and `mb_ci` methods provide four 100*(1-alpha)% confidence intervals (CIs) for regression coefficients: (i) normal-calibrated CI using the estimated asymptotic covariance matrix, (ii) percentile bootstrap CI, (iii) pivotal bootstrap CI, and (iv) normal-based CI using bootstrap variance estimates. For the multiplier/weighted bootstrap implementation in `mb_ci`, the default weight distribution is ``Exponential``. Other choices are ``Rademacher``, ``Multinomial`` (Efron's nonparametric bootstrap), ``Gaussian``, ``Uniform`` and ``Folded-normal``. The latter two require a variance adjustment; see Remark 4.7 in [Paper](https://doi.org/10.1016/j.jeconom.2021.07.010). 57 | 58 | ``` 59 | n, p = 500, 20 60 | itcp, beta = 4, np.ones(p) 61 | tau, t_df = 0.75, 2 62 | 63 | X = rgt.normal(0, 1.5, size=(n,p)) 64 | Y = itcp + X.dot(beta) + rgt.standard_t(t_df, n) - t.ppf(tau, t_df) 65 | 66 | qr = low_dim(X, Y, intercept=True) 67 | model1 = qr.norm_ci(tau=tau) 68 | model2 = qr.mb_ci(tau=tau) 69 | 70 | # model1['normal_ci'] : p+1 by 2 numpy array of normal CIs based on estimated asymptotic covariance matrix. 71 | # model2['percentile_ci'] : p+1 by 2 numpy array of bootstrap percentile CIs. 72 | # model2['pivotal_ci'] : p+1 by 2 numpy array of bootstrap pivotal CIs.
73 | # model2['normal_ci'] : p+1 by 2 numpy array of normal CIs based on bootstrap variance estimates. 74 | ``` 75 | 76 | The `high_dim` class contains methods that fit high-dimensional sparse quantile regression models through the LAMM algorithm. The default bandwidth value is *max\{0.05, \{τ(1-τ)\}^0.5 \{log(p)/n\}^0.25\}*. To choose the penalty level, the `self_tuning` method implements the simulation-based approach proposed by [Belloni & Chernozhukov (2011)](https://doi.org/10.1214/10-AOS827). 77 | The `l1` and `irw` methods compute *L1*- and IRW-*L1*-penalized conquer estimators, respectively. For the latter, the default concave penalty is `SCAD` with constant `a=3.7` ([Fan & Li, 2001](https://fan.princeton.edu/papers/01/penlike.pdf)). Given a sequence of penalty levels, the solution paths can be computed by `l1_path` and `irw_path`. 78 | 79 | ``` 80 | p, n = 1028, 256 81 | tau = 0.8 82 | itcp, beta = 4, np.zeros(p) 83 | beta[:15] = [1.8, 0, 1.6, 0, 1.4, 0, 1.2, 0, 1, 0, -1, 0, -1.2, 0, -1.6] 84 | 85 | X = rgt.normal(0, 1, size=(n,p)) 86 | Y = itcp + X@beta + rgt.standard_t(2,size=n) - t.ppf(tau,df=2) 87 | 88 | sqr = high_dim(X, Y, intercept=True) 89 | lambda_max = np.max(sqr.self_tuning(tau)) 90 | lambda_seq = np.linspace(0.25*lambda_max, lambda_max, num=20) 91 | 92 | ## l1-penalized conquer 93 | l1_model = sqr.l1(tau=tau, Lambda=0.75*lambda_max) 94 | 95 | ## iteratively reweighted l1-penalized conquer (default penalty is SCAD) 96 | irw_model = sqr.irw(tau=tau, Lambda=0.75*lambda_max) 97 | 98 | ## solution path of l1-penalized conquer 99 | l1_path = sqr.l1_path(tau=tau, lambda_seq=lambda_seq) 100 | 101 | ## solution path of irw-l1-penalized conquer 102 | irw_path = sqr.irw_path(tau=tau, lambda_seq=lambda_seq) 103 | 104 | ## model selection via bootstrap 105 | boot_model = sqr.boot_select(tau=tau, Lambda=0.75*lambda_max, weight="Multinomial") 106 | print('Selected model via bootstrap:', boot_model['majority_vote']) 107 | print('True model:', 
np.where(beta!=0)[0]) 108 | ``` 109 | 110 | The `pADMM` class offers functionality similar to `high_dim`. It employs the proximal ADMM algorithm to solve weighted *L1*-penalized quantile regression (without smoothing). 111 | ``` 112 | lambda_max = np.max(high_dim(X, Y, intercept=True).self_tuning(tau)) 113 | lambda_seq = np.linspace(0.25*lambda_max, lambda_max, num=20) 114 | admm = pADMM(X, Y, intercept=True) 115 | 116 | ## l1-penalized QR 117 | l1_admm = admm.l1(tau=tau, Lambda=0.5*lambda_max) 118 | 119 | ## iteratively reweighted l1-penalized QR (default penalty is SCAD) 120 | irw_admm = admm.irw(tau=tau, Lambda=0.75*lambda_max) 121 | 122 | ## solution path of l1-penalized QR 123 | l1_admm_path = admm.l1_path(tau=tau, lambda_seq=lambda_seq) 124 | 125 | ## solution path of irw-l1-penalized QR 126 | irw_admm_path = admm.irw_path(tau=tau, lambda_seq=lambda_seq) 127 | ``` 128 | 129 | The `LR` class in `conquer.joint` contains methods that fit joint (linear) quantile and expected shortfall models. The `joint_fit` method computes joint quantile and ES regression estimates based on FZ loss minimization ([Fissler & Ziegel, 2016](https://doi.org/10.1214/16-AOS1439)). The `twostep_fit` method implements two-step procedures to compute quantile and ES regression estimates, with the ES part depending on a user-specified `loss`. Options are ``L2``, ``TrunL2``, ``FZ`` and ``Huber``. The `nc_fit` method computes non-crossing counterparts of the ES estimates when `loss` = `L2` or `Huber`.
130 | 131 | ``` 132 | import numpy as np 133 | import pandas as pd 134 | import numpy.random as rgt 135 | from conquer.joint import LR 136 | 137 | p, n = 10, 5000 138 | tau = 0.1 139 | beta = np.ones(p) 140 | gamma = np.r_[0.5*np.ones(2), np.zeros(p-2)] 141 | 142 | X = rgt.uniform(0, 2, size=(n,p)) 143 | Y = 2 + X @ beta + (X @ gamma) * rgt.normal(0, 1, n) 144 | 145 | lm = LR(X, Y) 146 | ## two-step least squares 147 | m1 = lm.twostep_fit(tau=tau, loss='L2') 148 | 149 | ## two-step truncated least squares 150 | m2 = lm.twostep_fit(tau=tau, loss='TrunL2') 151 | 152 | ## two-step FZ loss minimization 153 | m3 = lm.twostep_fit(tau=tau, loss='FZ', G2_type=1) 154 | 155 | ## two-step adaptive Huber 156 | m4 = lm.twostep_fit(tau=tau, loss='Huber') 157 | 158 | ## non-crossing two-step least squares 159 | m5 = lm.nc_fit(tau=tau, loss='L2') 160 | 161 | ## non-crossing two-step adaHuber 162 | m6 = lm.nc_fit(tau=tau, loss='Huber') 163 | 164 | ## joint quantile-ES regression via FZ loss minimization (G1=0) 165 | m7 = lm.joint_fit(tau=tau, G1=False, G2_type=1, refit=False) 166 | 167 | ## joint quantile-ES regression via FZ loss minimization (G1(x)=x) 168 | m8 = lm.joint_fit(tau=tau, G1=True, G2_type=1, refit=False) 169 | 170 | out = pd.DataFrame(np.c_[(m1['coef_e'], m2['coef_e'], m3['coef_e'], m4['coef_e'], 171 | m5['coef_e'], m6['coef_e'], m7['coef_e'], m8['coef_e'])], 172 | columns=['L2', 'TLS', 'FZ', 'AH', 'NC-L2', 'NC-AH', 'Joint0', 'Joint1']) 173 | out 174 | ``` 175 | 176 | 177 | ## References 178 | 179 | Fernandes, M., Guerre, E. and Horta, E. (2021). Smoothing quantile regressions. *J. Bus. Econ. Statist.* **39**(1) 338–357. [Paper](https://doi.org/10.1080/07350015.2019.1660177) 180 | 181 | He, X., Tan, K. M. and Zhou, W.-X. (2023). Robust estimation and inference for expected shortfall regression with many regressors. *J. R. Stat. Soc. B.* **85**(4) 1223-1246. [Paper](https://doi.org/10.1093/jrsssb/qkad063) 182 | 183 | He, X., Pan, X., Tan, K. M. and Zhou, W.-X. (2023). 
Smoothed quantile regression with large-scale inference. *J. Econom.* **232**(2) 367-388. [Paper](https://doi.org/10.1016/j.jeconom.2021.07.010) 184 | 185 | Koenker, R. (2005). *Quantile Regression*. Cambridge University Press, Cambridge. [Book](https://www.cambridge.org/core/books/quantile-regression/C18AE7BCF3EC43C16937390D44A328B1) 186 | 187 | Pan, X., Sun, Q. and Zhou, W.-X. (2021). Iteratively reweighted *l1*-penalized robust regression. *Electron. J. Stat.* **15**(1) 3287-3348. [Paper](https://doi.org/10.1214/21-EJS1862) 188 | 189 | Tan, K. M., Wang, L. and Zhou, W.-X. (2022). High-dimensional quantile regression: convolution smoothing and concave regularization. *J. R. Stat. Soc. B.* **84**(1) 205-233. [Paper](https://rss.onlinelibrary.wiley.com/doi/10.1111/rssb.12485) 190 | 191 | ## License 192 | 193 | This package is released under the GPL-3.0 license. -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | GNU GENERAL PUBLIC LICENSE 2 | Version 3, 29 June 2007 3 | 4 | Copyright (C) 2007 Free Software Foundation, Inc. 5 | Everyone is permitted to copy and distribute verbatim copies 6 | of this license document, but changing it is not allowed. 7 | 8 | Preamble 9 | 10 | The GNU General Public License is a free, copyleft license for 11 | software and other kinds of works. 12 | 13 | The licenses for most software and other practical works are designed 14 | to take away your freedom to share and change the works. By contrast, 15 | the GNU General Public License is intended to guarantee your freedom to 16 | share and change all versions of a program--to make sure it remains free 17 | software for all its users. We, the Free Software Foundation, use the 18 | GNU General Public License for most of our software; it applies also to 19 | any other work released this way by its authors. You can apply it to 20 | your programs, too. 
21 | 22 | When we speak of free software, we are referring to freedom, not 23 | price. Our General Public Licenses are designed to make sure that you 24 | have the freedom to distribute copies of free software (and charge for 25 | them if you wish), that you receive source code or can get it if you 26 | want it, that you can change the software or use pieces of it in new 27 | free programs, and that you know you can do these things. 28 | 29 | To protect your rights, we need to prevent others from denying you 30 | these rights or asking you to surrender the rights. Therefore, you have 31 | certain responsibilities if you distribute copies of the software, or if 32 | you modify it: responsibilities to respect the freedom of others. 33 | 34 | For example, if you distribute copies of such a program, whether 35 | gratis or for a fee, you must pass on to the recipients the same 36 | freedoms that you received. You must make sure that they, too, receive 37 | or can get the source code. And you must show them these terms so they 38 | know their rights. 39 | 40 | Developers that use the GNU GPL protect your rights with two steps: 41 | (1) assert copyright on the software, and (2) offer you this License 42 | giving you legal permission to copy, distribute and/or modify it. 43 | 44 | For the developers' and authors' protection, the GPL clearly explains 45 | that there is no warranty for this free software. For both users' and 46 | authors' sake, the GPL requires that modified versions be marked as 47 | changed, so that their problems will not be attributed erroneously to 48 | authors of previous versions. 49 | 50 | Some devices are designed to deny users access to install or run 51 | modified versions of the software inside them, although the manufacturer 52 | can do so. This is fundamentally incompatible with the aim of 53 | protecting users' freedom to change the software. 
The systematic 54 | pattern of such abuse occurs in the area of products for individuals to 55 | use, which is precisely where it is most unacceptable. Therefore, we 56 | have designed this version of the GPL to prohibit the practice for those 57 | products. If such problems arise substantially in other domains, we 58 | stand ready to extend this provision to those domains in future versions 59 | of the GPL, as needed to protect the freedom of users. 60 | 61 | Finally, every program is threatened constantly by software patents. 62 | States should not allow patents to restrict development and use of 63 | software on general-purpose computers, but in those that do, we wish to 64 | avoid the special danger that patents applied to a free program could 65 | make it effectively proprietary. To prevent this, the GPL assures that 66 | patents cannot be used to render the program non-free. 67 | 68 | The precise terms and conditions for copying, distribution and 69 | modification follow. 70 | 71 | TERMS AND CONDITIONS 72 | 73 | 0. Definitions. 74 | 75 | "This License" refers to version 3 of the GNU General Public License. 76 | 77 | "Copyright" also means copyright-like laws that apply to other kinds of 78 | works, such as semiconductor masks. 79 | 80 | "The Program" refers to any copyrightable work licensed under this 81 | License. Each licensee is addressed as "you". "Licensees" and 82 | "recipients" may be individuals or organizations. 83 | 84 | To "modify" a work means to copy from or adapt all or part of the work 85 | in a fashion requiring copyright permission, other than the making of an 86 | exact copy. The resulting work is called a "modified version" of the 87 | earlier work or a work "based on" the earlier work. 88 | 89 | A "covered work" means either the unmodified Program or a work based 90 | on the Program. 
91 | 92 | To "propagate" a work means to do anything with it that, without 93 | permission, would make you directly or secondarily liable for 94 | infringement under applicable copyright law, except executing it on a 95 | computer or modifying a private copy. Propagation includes copying, 96 | distribution (with or without modification), making available to the 97 | public, and in some countries other activities as well. 98 | 99 | To "convey" a work means any kind of propagation that enables other 100 | parties to make or receive copies. Mere interaction with a user through 101 | a computer network, with no transfer of a copy, is not conveying. 102 | 103 | An interactive user interface displays "Appropriate Legal Notices" 104 | to the extent that it includes a convenient and prominently visible 105 | feature that (1) displays an appropriate copyright notice, and (2) 106 | tells the user that there is no warranty for the work (except to the 107 | extent that warranties are provided), that licensees may convey the 108 | work under this License, and how to view a copy of this License. If 109 | the interface presents a list of user commands or options, such as a 110 | menu, a prominent item in the list meets this criterion. 111 | 112 | 1. Source Code. 113 | 114 | The "source code" for a work means the preferred form of the work 115 | for making modifications to it. "Object code" means any non-source 116 | form of a work. 117 | 118 | A "Standard Interface" means an interface that either is an official 119 | standard defined by a recognized standards body, or, in the case of 120 | interfaces specified for a particular programming language, one that 121 | is widely used among developers working in that language. 
122 | 123 | The "System Libraries" of an executable work include anything, other 124 | than the work as a whole, that (a) is included in the normal form of 125 | packaging a Major Component, but which is not part of that Major 126 | Component, and (b) serves only to enable use of the work with that 127 | Major Component, or to implement a Standard Interface for which an 128 | implementation is available to the public in source code form. A 129 | "Major Component", in this context, means a major essential component 130 | (kernel, window system, and so on) of the specific operating system 131 | (if any) on which the executable work runs, or a compiler used to 132 | produce the work, or an object code interpreter used to run it. 133 | 134 | The "Corresponding Source" for a work in object code form means all 135 | the source code needed to generate, install, and (for an executable 136 | work) run the object code and to modify the work, including scripts to 137 | control those activities. However, it does not include the work's 138 | System Libraries, or general-purpose tools or generally available free 139 | programs which are used unmodified in performing those activities but 140 | which are not part of the work. For example, Corresponding Source 141 | includes interface definition files associated with source files for 142 | the work, and the source code for shared libraries and dynamically 143 | linked subprograms that the work is specifically designed to require, 144 | such as by intimate data communication or control flow between those 145 | subprograms and other parts of the work. 146 | 147 | The Corresponding Source need not include anything that users 148 | can regenerate automatically from other parts of the Corresponding 149 | Source. 150 | 151 | The Corresponding Source for a work in source code form is that 152 | same work. 153 | 154 | 2. Basic Permissions. 
155 | 156 | All rights granted under this License are granted for the term of 157 | copyright on the Program, and are irrevocable provided the stated 158 | conditions are met. This License explicitly affirms your unlimited 159 | permission to run the unmodified Program. The output from running a 160 | covered work is covered by this License only if the output, given its 161 | content, constitutes a covered work. This License acknowledges your 162 | rights of fair use or other equivalent, as provided by copyright law. 163 | 164 | You may make, run and propagate covered works that you do not 165 | convey, without conditions so long as your license otherwise remains 166 | in force. You may convey covered works to others for the sole purpose 167 | of having them make modifications exclusively for you, or provide you 168 | with facilities for running those works, provided that you comply with 169 | the terms of this License in conveying all material for which you do 170 | not control copyright. Those thus making or running the covered works 171 | for you must do so exclusively on your behalf, under your direction 172 | and control, on terms that prohibit them from making any copies of 173 | your copyrighted material outside their relationship with you. 174 | 175 | Conveying under any other circumstances is permitted solely under 176 | the conditions stated below. Sublicensing is not allowed; section 10 177 | makes it unnecessary. 178 | 179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 180 | 181 | No covered work shall be deemed part of an effective technological 182 | measure under any applicable law fulfilling obligations under article 183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or 184 | similar laws prohibiting or restricting circumvention of such 185 | measures. 
186 | 187 | When you convey a covered work, you waive any legal power to forbid 188 | circumvention of technological measures to the extent such circumvention 189 | is effected by exercising rights under this License with respect to 190 | the covered work, and you disclaim any intention to limit operation or 191 | modification of the work as a means of enforcing, against the work's 192 | users, your or third parties' legal rights to forbid circumvention of 193 | technological measures. 194 | 195 | 4. Conveying Verbatim Copies. 196 | 197 | You may convey verbatim copies of the Program's source code as you 198 | receive it, in any medium, provided that you conspicuously and 199 | appropriately publish on each copy an appropriate copyright notice; 200 | keep intact all notices stating that this License and any 201 | non-permissive terms added in accord with section 7 apply to the code; 202 | keep intact all notices of the absence of any warranty; and give all 203 | recipients a copy of this License along with the Program. 204 | 205 | You may charge any price or no price for each copy that you convey, 206 | and you may offer support or warranty protection for a fee. 207 | 208 | 5. Conveying Modified Source Versions. 209 | 210 | You may convey a work based on the Program, or the modifications to 211 | produce it from the Program, in the form of source code under the 212 | terms of section 4, provided that you also meet all of these conditions: 213 | 214 | a) The work must carry prominent notices stating that you modified 215 | it, and giving a relevant date. 216 | 217 | b) The work must carry prominent notices stating that it is 218 | released under this License and any conditions added under section 219 | 7. This requirement modifies the requirement in section 4 to 220 | "keep intact all notices". 221 | 222 | c) You must license the entire work, as a whole, under this 223 | License to anyone who comes into possession of a copy. 
This 224 | License will therefore apply, along with any applicable section 7 225 | additional terms, to the whole of the work, and all its parts, 226 | regardless of how they are packaged. This License gives no 227 | permission to license the work in any other way, but it does not 228 | invalidate such permission if you have separately received it. 229 | 230 | d) If the work has interactive user interfaces, each must display 231 | Appropriate Legal Notices; however, if the Program has interactive 232 | interfaces that do not display Appropriate Legal Notices, your 233 | work need not make them do so. 234 | 235 | A compilation of a covered work with other separate and independent 236 | works, which are not by their nature extensions of the covered work, 237 | and which are not combined with it such as to form a larger program, 238 | in or on a volume of a storage or distribution medium, is called an 239 | "aggregate" if the compilation and its resulting copyright are not 240 | used to limit the access or legal rights of the compilation's users 241 | beyond what the individual works permit. Inclusion of a covered work 242 | in an aggregate does not cause this License to apply to the other 243 | parts of the aggregate. 244 | 245 | 6. Conveying Non-Source Forms. 246 | 247 | You may convey a covered work in object code form under the terms 248 | of sections 4 and 5, provided that you also convey the 249 | machine-readable Corresponding Source under the terms of this License, 250 | in one of these ways: 251 | 252 | a) Convey the object code in, or embodied in, a physical product 253 | (including a physical distribution medium), accompanied by the 254 | Corresponding Source fixed on a durable physical medium 255 | customarily used for software interchange. 
256 | 257 | b) Convey the object code in, or embodied in, a physical product 258 | (including a physical distribution medium), accompanied by a 259 | written offer, valid for at least three years and valid for as 260 | long as you offer spare parts or customer support for that product 261 | model, to give anyone who possesses the object code either (1) a 262 | copy of the Corresponding Source for all the software in the 263 | product that is covered by this License, on a durable physical 264 | medium customarily used for software interchange, for a price no 265 | more than your reasonable cost of physically performing this 266 | conveying of source, or (2) access to copy the 267 | Corresponding Source from a network server at no charge. 268 | 269 | c) Convey individual copies of the object code with a copy of the 270 | written offer to provide the Corresponding Source. This 271 | alternative is allowed only occasionally and noncommercially, and 272 | only if you received the object code with such an offer, in accord 273 | with subsection 6b. 274 | 275 | d) Convey the object code by offering access from a designated 276 | place (gratis or for a charge), and offer equivalent access to the 277 | Corresponding Source in the same way through the same place at no 278 | further charge. You need not require recipients to copy the 279 | Corresponding Source along with the object code. If the place to 280 | copy the object code is a network server, the Corresponding Source 281 | may be on a different server (operated by you or a third party) 282 | that supports equivalent copying facilities, provided you maintain 283 | clear directions next to the object code saying where to find the 284 | Corresponding Source. Regardless of what server hosts the 285 | Corresponding Source, you remain obligated to ensure that it is 286 | available for as long as needed to satisfy these requirements. 
287 | 288 | e) Convey the object code using peer-to-peer transmission, provided 289 | you inform other peers where the object code and Corresponding 290 | Source of the work are being offered to the general public at no 291 | charge under subsection 6d. 292 | 293 | A separable portion of the object code, whose source code is excluded 294 | from the Corresponding Source as a System Library, need not be 295 | included in conveying the object code work. 296 | 297 | A "User Product" is either (1) a "consumer product", which means any 298 | tangible personal property which is normally used for personal, family, 299 | or household purposes, or (2) anything designed or sold for incorporation 300 | into a dwelling. In determining whether a product is a consumer product, 301 | doubtful cases shall be resolved in favor of coverage. For a particular 302 | product received by a particular user, "normally used" refers to a 303 | typical or common use of that class of product, regardless of the status 304 | of the particular user or of the way in which the particular user 305 | actually uses, or expects or is expected to use, the product. A product 306 | is a consumer product regardless of whether the product has substantial 307 | commercial, industrial or non-consumer uses, unless such uses represent 308 | the only significant mode of use of the product. 309 | 310 | "Installation Information" for a User Product means any methods, 311 | procedures, authorization keys, or other information required to install 312 | and execute modified versions of a covered work in that User Product from 313 | a modified version of its Corresponding Source. The information must 314 | suffice to ensure that the continued functioning of the modified object 315 | code is in no case prevented or interfered with solely because 316 | modification has been made. 
317 | 318 | If you convey an object code work under this section in, or with, or 319 | specifically for use in, a User Product, and the conveying occurs as 320 | part of a transaction in which the right of possession and use of the 321 | User Product is transferred to the recipient in perpetuity or for a 322 | fixed term (regardless of how the transaction is characterized), the 323 | Corresponding Source conveyed under this section must be accompanied 324 | by the Installation Information. But this requirement does not apply 325 | if neither you nor any third party retains the ability to install 326 | modified object code on the User Product (for example, the work has 327 | been installed in ROM). 328 | 329 | The requirement to provide Installation Information does not include a 330 | requirement to continue to provide support service, warranty, or updates 331 | for a work that has been modified or installed by the recipient, or for 332 | the User Product in which it has been modified or installed. Access to a 333 | network may be denied when the modification itself materially and 334 | adversely affects the operation of the network or violates the rules and 335 | protocols for communication across the network. 336 | 337 | Corresponding Source conveyed, and Installation Information provided, 338 | in accord with this section must be in a format that is publicly 339 | documented (and with an implementation available to the public in 340 | source code form), and must require no special password or key for 341 | unpacking, reading or copying. 342 | 343 | 7. Additional Terms. 344 | 345 | "Additional permissions" are terms that supplement the terms of this 346 | License by making exceptions from one or more of its conditions. 347 | Additional permissions that are applicable to the entire Program shall 348 | be treated as though they were included in this License, to the extent 349 | that they are valid under applicable law. 
If additional permissions 350 | apply only to part of the Program, that part may be used separately 351 | under those permissions, but the entire Program remains governed by 352 | this License without regard to the additional permissions. 353 | 354 | When you convey a copy of a covered work, you may at your option 355 | remove any additional permissions from that copy, or from any part of 356 | it. (Additional permissions may be written to require their own 357 | removal in certain cases when you modify the work.) You may place 358 | additional permissions on material, added by you to a covered work, 359 | for which you have or can give appropriate copyright permission. 360 | 361 | Notwithstanding any other provision of this License, for material you 362 | add to a covered work, you may (if authorized by the copyright holders of 363 | that material) supplement the terms of this License with terms: 364 | 365 | a) Disclaiming warranty or limiting liability differently from the 366 | terms of sections 15 and 16 of this License; or 367 | 368 | b) Requiring preservation of specified reasonable legal notices or 369 | author attributions in that material or in the Appropriate Legal 370 | Notices displayed by works containing it; or 371 | 372 | c) Prohibiting misrepresentation of the origin of that material, or 373 | requiring that modified versions of such material be marked in 374 | reasonable ways as different from the original version; or 375 | 376 | d) Limiting the use for publicity purposes of names of licensors or 377 | authors of the material; or 378 | 379 | e) Declining to grant rights under trademark law for use of some 380 | trade names, trademarks, or service marks; or 381 | 382 | f) Requiring indemnification of licensors and authors of that 383 | material by anyone who conveys the material (or modified versions of 384 | it) with contractual assumptions of liability to the recipient, for 385 | any liability that these contractual assumptions directly impose on 
386 | those licensors and authors. 387 | 388 | All other non-permissive additional terms are considered "further 389 | restrictions" within the meaning of section 10. If the Program as you 390 | received it, or any part of it, contains a notice stating that it is 391 | governed by this License along with a term that is a further 392 | restriction, you may remove that term. If a license document contains 393 | a further restriction but permits relicensing or conveying under this 394 | License, you may add to a covered work material governed by the terms 395 | of that license document, provided that the further restriction does 396 | not survive such relicensing or conveying. 397 | 398 | If you add terms to a covered work in accord with this section, you 399 | must place, in the relevant source files, a statement of the 400 | additional terms that apply to those files, or a notice indicating 401 | where to find the applicable terms. 402 | 403 | Additional terms, permissive or non-permissive, may be stated in the 404 | form of a separately written license, or stated as exceptions; 405 | the above requirements apply either way. 406 | 407 | 8. Termination. 408 | 409 | You may not propagate or modify a covered work except as expressly 410 | provided under this License. Any attempt otherwise to propagate or 411 | modify it is void, and will automatically terminate your rights under 412 | this License (including any patent licenses granted under the third 413 | paragraph of section 11). 414 | 415 | However, if you cease all violation of this License, then your 416 | license from a particular copyright holder is reinstated (a) 417 | provisionally, unless and until the copyright holder explicitly and 418 | finally terminates your license, and (b) permanently, if the copyright 419 | holder fails to notify you of the violation by some reasonable means 420 | prior to 60 days after the cessation. 
421 | 422 | Moreover, your license from a particular copyright holder is 423 | reinstated permanently if the copyright holder notifies you of the 424 | violation by some reasonable means, this is the first time you have 425 | received notice of violation of this License (for any work) from that 426 | copyright holder, and you cure the violation prior to 30 days after 427 | your receipt of the notice. 428 | 429 | Termination of your rights under this section does not terminate the 430 | licenses of parties who have received copies or rights from you under 431 | this License. If your rights have been terminated and not permanently 432 | reinstated, you do not qualify to receive new licenses for the same 433 | material under section 10. 434 | 435 | 9. Acceptance Not Required for Having Copies. 436 | 437 | You are not required to accept this License in order to receive or 438 | run a copy of the Program. Ancillary propagation of a covered work 439 | occurring solely as a consequence of using peer-to-peer transmission 440 | to receive a copy likewise does not require acceptance. However, 441 | nothing other than this License grants you permission to propagate or 442 | modify any covered work. These actions infringe copyright if you do 443 | not accept this License. Therefore, by modifying or propagating a 444 | covered work, you indicate your acceptance of this License to do so. 445 | 446 | 10. Automatic Licensing of Downstream Recipients. 447 | 448 | Each time you convey a covered work, the recipient automatically 449 | receives a license from the original licensors, to run, modify and 450 | propagate that work, subject to this License. You are not responsible 451 | for enforcing compliance by third parties with this License. 452 | 453 | An "entity transaction" is a transaction transferring control of an 454 | organization, or substantially all assets of one, or subdividing an 455 | organization, or merging organizations. 
If propagation of a covered 456 | work results from an entity transaction, each party to that 457 | transaction who receives a copy of the work also receives whatever 458 | licenses to the work the party's predecessor in interest had or could 459 | give under the previous paragraph, plus a right to possession of the 460 | Corresponding Source of the work from the predecessor in interest, if 461 | the predecessor has it or can get it with reasonable efforts. 462 | 463 | You may not impose any further restrictions on the exercise of the 464 | rights granted or affirmed under this License. For example, you may 465 | not impose a license fee, royalty, or other charge for exercise of 466 | rights granted under this License, and you may not initiate litigation 467 | (including a cross-claim or counterclaim in a lawsuit) alleging that 468 | any patent claim is infringed by making, using, selling, offering for 469 | sale, or importing the Program or any portion of it. 470 | 471 | 11. Patents. 472 | 473 | A "contributor" is a copyright holder who authorizes use under this 474 | License of the Program or a work on which the Program is based. The 475 | work thus licensed is called the contributor's "contributor version". 476 | 477 | A contributor's "essential patent claims" are all patent claims 478 | owned or controlled by the contributor, whether already acquired or 479 | hereafter acquired, that would be infringed by some manner, permitted 480 | by this License, of making, using, or selling its contributor version, 481 | but do not include claims that would be infringed only as a 482 | consequence of further modification of the contributor version. For 483 | purposes of this definition, "control" includes the right to grant 484 | patent sublicenses in a manner consistent with the requirements of 485 | this License. 
486 | 487 | Each contributor grants you a non-exclusive, worldwide, royalty-free 488 | patent license under the contributor's essential patent claims, to 489 | make, use, sell, offer for sale, import and otherwise run, modify and 490 | propagate the contents of its contributor version. 491 | 492 | In the following three paragraphs, a "patent license" is any express 493 | agreement or commitment, however denominated, not to enforce a patent 494 | (such as an express permission to practice a patent or covenant not to 495 | sue for patent infringement). To "grant" such a patent license to a 496 | party means to make such an agreement or commitment not to enforce a 497 | patent against the party. 498 | 499 | If you convey a covered work, knowingly relying on a patent license, 500 | and the Corresponding Source of the work is not available for anyone 501 | to copy, free of charge and under the terms of this License, through a 502 | publicly available network server or other readily accessible means, 503 | then you must either (1) cause the Corresponding Source to be so 504 | available, or (2) arrange to deprive yourself of the benefit of the 505 | patent license for this particular work, or (3) arrange, in a manner 506 | consistent with the requirements of this License, to extend the patent 507 | license to downstream recipients. "Knowingly relying" means you have 508 | actual knowledge that, but for the patent license, your conveying the 509 | covered work in a country, or your recipient's use of the covered work 510 | in a country, would infringe one or more identifiable patents in that 511 | country that you have reason to believe are valid. 
512 | 513 | If, pursuant to or in connection with a single transaction or 514 | arrangement, you convey, or propagate by procuring conveyance of, a 515 | covered work, and grant a patent license to some of the parties 516 | receiving the covered work authorizing them to use, propagate, modify 517 | or convey a specific copy of the covered work, then the patent license 518 | you grant is automatically extended to all recipients of the covered 519 | work and works based on it. 520 | 521 | A patent license is "discriminatory" if it does not include within 522 | the scope of its coverage, prohibits the exercise of, or is 523 | conditioned on the non-exercise of one or more of the rights that are 524 | specifically granted under this License. You may not convey a covered 525 | work if you are a party to an arrangement with a third party that is 526 | in the business of distributing software, under which you make payment 527 | to the third party based on the extent of your activity of conveying 528 | the work, and under which the third party grants, to any of the 529 | parties who would receive the covered work from you, a discriminatory 530 | patent license (a) in connection with copies of the covered work 531 | conveyed by you (or copies made from those copies), or (b) primarily 532 | for and in connection with specific products or compilations that 533 | contain the covered work, unless you entered into that arrangement, 534 | or that patent license was granted, prior to 28 March 2007. 535 | 536 | Nothing in this License shall be construed as excluding or limiting 537 | any implied license or other defenses to infringement that may 538 | otherwise be available to you under applicable patent law. 539 | 540 | 12. No Surrender of Others' Freedom. 541 | 542 | If conditions are imposed on you (whether by court order, agreement or 543 | otherwise) that contradict the conditions of this License, they do not 544 | excuse you from the conditions of this License. 
If you cannot convey a 545 | covered work so as to satisfy simultaneously your obligations under this 546 | License and any other pertinent obligations, then as a consequence you may 547 | not convey it at all. For example, if you agree to terms that obligate you 548 | to collect a royalty for further conveying from those to whom you convey 549 | the Program, the only way you could satisfy both those terms and this 550 | License would be to refrain entirely from conveying the Program. 551 | 552 | 13. Use with the GNU Affero General Public License. 553 | 554 | Notwithstanding any other provision of this License, you have 555 | permission to link or combine any covered work with a work licensed 556 | under version 3 of the GNU Affero General Public License into a single 557 | combined work, and to convey the resulting work. The terms of this 558 | License will continue to apply to the part which is the covered work, 559 | but the special requirements of the GNU Affero General Public License, 560 | section 13, concerning interaction through a network will apply to the 561 | combination as such. 562 | 563 | 14. Revised Versions of this License. 564 | 565 | The Free Software Foundation may publish revised and/or new versions of 566 | the GNU General Public License from time to time. Such new versions will 567 | be similar in spirit to the present version, but may differ in detail to 568 | address new problems or concerns. 569 | 570 | Each version is given a distinguishing version number. If the 571 | Program specifies that a certain numbered version of the GNU General 572 | Public License "or any later version" applies to it, you have the 573 | option of following the terms and conditions either of that numbered 574 | version or of any later version published by the Free Software 575 | Foundation. If the Program does not specify a version number of the 576 | GNU General Public License, you may choose any version ever published 577 | by the Free Software Foundation. 
578 | 579 | If the Program specifies that a proxy can decide which future 580 | versions of the GNU General Public License can be used, that proxy's 581 | public statement of acceptance of a version permanently authorizes you 582 | to choose that version for the Program. 583 | 584 | Later license versions may give you additional or different 585 | permissions. However, no additional obligations are imposed on any 586 | author or copyright holder as a result of your choosing to follow a 587 | later version. 588 | 589 | 15. Disclaimer of Warranty. 590 | 591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY 592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT 593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY 594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, 595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR 596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM 597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF 598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 599 | 600 | 16. Limitation of Liability. 601 | 602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING 603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS 604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY 605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE 606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF 607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD 608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), 609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF 610 | SUCH DAMAGES. 611 | 612 | 17. Interpretation of Sections 15 and 16. 
613 | 614 | If the disclaimer of warranty and limitation of liability provided 615 | above cannot be given local legal effect according to their terms, 616 | reviewing courts shall apply local law that most closely approximates 617 | an absolute waiver of all civil liability in connection with the 618 | Program, unless a warranty or assumption of liability accompanies a 619 | copy of the Program in return for a fee. 620 | 621 | END OF TERMS AND CONDITIONS 622 | 623 | How to Apply These Terms to Your New Programs 624 | 625 | If you develop a new program, and you want it to be of the greatest 626 | possible use to the public, the best way to achieve this is to make it 627 | free software which everyone can redistribute and change under these terms. 628 | 629 | To do so, attach the following notices to the program. It is safest 630 | to attach them to the start of each source file to most effectively 631 | state the exclusion of warranty; and each file should have at least 632 | the "copyright" line and a pointer to where the full notice is found. 633 | 634 | <one line to give the program's name and a brief idea of what it does.> 635 | Copyright (C) <year> <name of author> 636 | 637 | This program is free software: you can redistribute it and/or modify 638 | it under the terms of the GNU General Public License as published by 639 | the Free Software Foundation, either version 3 of the License, or 640 | (at your option) any later version. 641 | 642 | This program is distributed in the hope that it will be useful, 643 | but WITHOUT ANY WARRANTY; without even the implied warranty of 644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the 645 | GNU General Public License for more details. 646 | 647 | You should have received a copy of the GNU General Public License 648 | along with this program. If not, see <https://www.gnu.org/licenses/>. 649 | 650 | Also add information on how to contact you by electronic and paper mail. 
651 | 652 | If the program does terminal interaction, make it output a short 653 | notice like this when it starts in an interactive mode: 654 | 655 | <program> Copyright (C) <year> <name of author> 656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. 657 | This is free software, and you are welcome to redistribute it 658 | under certain conditions; type `show c' for details. 659 | 660 | The hypothetical commands `show w' and `show c' should show the appropriate 661 | parts of the General Public License. Of course, your program's commands 662 | might be different; for a GUI interface, you would use an "about box". 663 | 664 | You should also get your employer (if you work as a programmer) or school, 665 | if any, to sign a "copyright disclaimer" for the program, if necessary. 666 | For more information on this, and how to apply and follow the GNU GPL, see 667 | <https://www.gnu.org/licenses/>. 668 | 669 | The GNU General Public License does not permit incorporating your program 670 | into proprietary programs. If your program is a subroutine library, you 671 | may consider it more useful to permit linking proprietary applications with 672 | the library. If this is what you want to do, use the GNU Lesser General 673 | Public License instead of this License. But first, please read 674 | <https://www.gnu.org/philosophy/why-not-lgpl.html>. 
675 | -------------------------------------------------------------------------------- /experiments/lowd.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "id": "76700d75-37c0-4b7d-ae16-27832666cb8c", 7 | "metadata": {}, 8 | "outputs": [], 9 | "source": [ 10 | "import time\n", 11 | "import pandas as pd\n", 12 | "import numpy as np\n", 13 | "import numpy.random as rgt\n", 14 | "from scipy.stats import norm, t\n", 15 | "\n", 16 | "from conquer.linear import low_dim\n", 17 | "rgt.seed(42)\n", 18 | "\n", 19 | "# number of monte carlo simulations\n", 20 | "M = 500 " 21 | ] 22 | }, 23 | { 24 | "cell_type": "markdown", 25 | "id": "0f818101-5d16-47b0-b1cc-14ffdb083f40", 26 | "metadata": {}, 27 | "source": [ 28 | "# Homoscedastic model" 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": 2, 34 | "id": "b2b44c8d-ea7f-46c0-be04-9bbcaa2d5dea", 35 | "metadata": {}, 36 | "outputs": [ 37 | { 38 | "data": { 39 | "text/html": [ 40 | "
\n", 41 | "\n", 54 | "\n", 55 | " \n", 56 | " \n", 57 | " \n", 58 | " \n", 59 | " \n", 60 | " \n", 61 | " \n", 62 | " \n", 63 | " \n", 64 | " \n", 65 | " \n", 66 | " \n", 67 | " \n", 68 | " \n", 69 | " \n", 70 | " \n", 71 | " \n", 72 | " \n", 73 | " \n", 74 | " \n", 75 | "
MSE (itcp)std (itcp)MSE (coef)std (coef)Runtime
conquer0.0018580.0017460.0762070.0057590.199185
\n", 76 | "
" 77 | ], 78 | "text/plain": [ 79 | " MSE (itcp) std (itcp) MSE (coef) std (coef) Runtime\n", 80 | "conquer 0.001858 0.001746 0.076207 0.005759 0.199185" 81 | ] 82 | }, 83 | "execution_count": 2, 84 | "metadata": {}, 85 | "output_type": "execute_result" 86 | } 87 | ], 88 | "source": [ 89 | "n, p = 8000, 400\n", 90 | "itcp, beta = 4, 1*np.ones(p)*(2*rgt.binomial(1, 1/2, p) - 1)\n", 91 | "tau, t_df = 0.75, 2\n", 92 | "runtime = []\n", 93 | "itcp_se, coef_se = [], []\n", 94 | "for m in range(M):\n", 95 | " X = rgt.normal(0, 1.5, size=(n,p))\n", 96 | " Y = itcp + X @ beta + rgt.standard_t(t_df, n) - t.ppf(tau, t_df)\n", 97 | "\n", 98 | " tic = time.time()\n", 99 | " model = low_dim(X, Y).fit(tau=tau)\n", 100 | " runtime.append(time.time() - tic)\n", 101 | "\n", 102 | " itcp_se.append((model['beta'][0] - itcp)**2)\n", 103 | " coef_se.append(np.sum((model['beta'][1:] - beta)**2))\n", 104 | "\n", 105 | "out = pd.DataFrame({'MSE (itcp)': np.mean(itcp_se),\n", 106 | " 'std (itcp)': np.std(itcp_se),\n", 107 | " 'MSE (coef)': np.mean(coef_se),\n", 108 | " 'std (coef)': np.std(coef_se),\n", 109 | " 'Runtime': np.mean(runtime)}, index=['conquer'])\n", 110 | "out" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "id": "058e6d87-30bd-428a-b4fa-b93226584a72", 116 | "metadata": {}, 117 | "source": [ 118 | "### Construction of confidence intervals" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": 3, 124 | "id": "11f13ff1-f875-4c8a-b8c7-bc30faf6e851", 125 | "metadata": {}, 126 | "outputs": [ 127 | { 128 | "data": { 129 | "text/html": [ 130 | "
\n", 131 | "\n", 144 | "\n", 145 | " \n", 146 | " \n", 147 | " \n", 148 | " \n", 149 | " \n", 150 | " \n", 151 | " \n", 152 | " \n", 153 | " \n", 154 | " \n", 155 | " \n", 156 | " \n", 157 | " \n", 158 | " \n", 159 | " \n", 160 | " \n", 161 | " \n", 162 | " \n", 163 | " \n", 164 | " \n", 165 | " \n", 166 | " \n", 167 | " \n", 168 | " \n", 169 | " \n", 170 | " \n", 171 | " \n", 172 | " \n", 173 | " \n", 174 | " \n", 175 | " \n", 176 | " \n", 177 | " \n", 178 | " \n", 179 | " \n", 180 | " \n", 181 | " \n", 182 | " \n", 183 | " \n", 184 | " \n", 185 | " \n", 186 | " \n", 187 | " \n", 188 | " \n", 189 | " \n", 190 | " \n", 191 | " \n", 192 | " \n", 193 | " \n", 194 | " \n", 195 | " \n", 196 | " \n", 197 | " \n", 198 | " \n", 199 | " \n", 200 | " \n", 201 | " \n", 202 | " \n", 203 | " \n", 204 | " \n", 205 | " \n", 206 | " \n", 207 | " \n", 208 | " \n", 209 | " \n", 210 | " \n", 211 | " \n", 212 | " \n", 213 | " \n", 214 | " \n", 215 | " \n", 216 | " \n", 217 | " \n", 218 | " \n", 219 | " \n", 220 | " \n", 221 | " \n", 222 | " \n", 223 | " \n", 224 | " \n", 225 | " \n", 226 | " \n", 227 | " \n", 228 | " \n", 229 | " \n", 230 | " \n", 231 | " \n", 232 | " \n", 233 | " \n", 234 | " \n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \n", 240 | " \n", 241 | " \n", 242 | " \n", 243 | " \n", 244 | " \n", 245 | " \n", 246 | " \n", 247 | " \n", 248 | " \n", 249 | " \n", 250 | " \n", 251 | " \n", 252 | " \n", 253 | " \n", 254 | " \n", 255 | " \n", 256 | " \n", 257 | " \n", 258 | " \n", 259 | " \n", 260 | " \n", 261 | " \n", 262 | " \n", 263 | " \n", 264 | "
1234567891011121314151617181920
Normal0.9560.9720.9820.9700.9580.9680.9620.9640.9640.9740.9700.9800.9700.9600.9740.9540.9760.9360.9560.960
MB-Percentile0.9640.9840.9640.9500.9680.9640.9540.9600.9620.9680.9640.9780.9700.9500.9660.9580.9680.9360.9400.964
MB-Pivotal0.9140.9440.9360.9280.9260.9340.9460.9260.9360.9380.9420.9420.9360.9320.9480.9400.9380.9160.9200.926
MB-Normal0.9560.9740.9620.9620.9560.9600.9600.9640.9520.9680.9620.9820.9660.9500.9740.9520.9740.9340.9420.954
\n", 265 | "
" 266 | ], 267 | "text/plain": [ 268 | " 1 2 3 4 5 6 7 8 9 \\\n", 269 | "Normal 0.956 0.972 0.982 0.970 0.958 0.968 0.962 0.964 0.964 \n", 270 | "MB-Percentile 0.964 0.984 0.964 0.950 0.968 0.964 0.954 0.960 0.962 \n", 271 | "MB-Pivotal 0.914 0.944 0.936 0.928 0.926 0.934 0.946 0.926 0.936 \n", 272 | "MB-Normal 0.956 0.974 0.962 0.962 0.956 0.960 0.960 0.964 0.952 \n", 273 | "\n", 274 | " 10 11 12 13 14 15 16 17 18 \\\n", 275 | "Normal 0.974 0.970 0.980 0.970 0.960 0.974 0.954 0.976 0.936 \n", 276 | "MB-Percentile 0.968 0.964 0.978 0.970 0.950 0.966 0.958 0.968 0.936 \n", 277 | "MB-Pivotal 0.938 0.942 0.942 0.936 0.932 0.948 0.940 0.938 0.916 \n", 278 | "MB-Normal 0.968 0.962 0.982 0.966 0.950 0.974 0.952 0.974 0.934 \n", 279 | "\n", 280 | " 19 20 \n", 281 | "Normal 0.956 0.960 \n", 282 | "MB-Percentile 0.940 0.964 \n", 283 | "MB-Pivotal 0.920 0.926 \n", 284 | "MB-Normal 0.942 0.954 " 285 | ] 286 | }, 287 | "execution_count": 3, 288 | "metadata": {}, 289 | "output_type": "execute_result" 290 | } 291 | ], 292 | "source": [ 293 | "n, p = 500, 20\n", 294 | "mask = 2*rgt.binomial(1, 1/2, p) - 1\n", 295 | "itcp, beta = 4, 1*np.ones(p)*mask\n", 296 | "tau, t_df = 0.75, 2\n", 297 | "\n", 298 | "ci_cover = np.zeros([4, p])\n", 299 | "ci_width = np.empty([M, 4, p])\n", 300 | "for m in range(M):\n", 301 | " X = rgt.normal(0, 1.5, size=(n,p))\n", 302 | " Y = itcp + X@beta + rgt.standard_t(t_df, n) - t.ppf(tau, t_df)\n", 303 | "\n", 304 | " model = low_dim(X, Y) \n", 305 | " sol1 = model.norm_ci(tau)\n", 306 | " sol2 = model.mb_ci(tau)\n", 307 | " \n", 308 | " ci_cover[0,:] += (beta >= sol1['normal_ci'][1:,0])*(beta<= sol1['normal_ci'][1:,1])\n", 309 | " ci_cover[1,:] += (beta >= sol2['percentile_ci'][1:,0])*(beta<= sol2['percentile_ci'][1:,1])\n", 310 | " ci_cover[2,:] += (beta >= sol2['pivotal_ci'][1:,0])*(beta<= sol2['pivotal_ci'][1:,1])\n", 311 | " ci_cover[3,:] += (beta >= sol2['normal_ci'][1:,0])*(beta<= sol2['normal_ci'][1:,1])\n", 312 | " \n", 313 | " ci_width[m,0,:] = 
sol1['normal_ci'][1:,1] - sol1['normal_ci'][1:,0]\n", 314 | " ci_width[m,1,:] = sol2['percentile_ci'][1:,1] - sol2['percentile_ci'][1:,0]\n", 315 | " ci_width[m,2,:] = sol2['pivotal_ci'][1:,1] - sol2['pivotal_ci'][1:,0]\n", 316 | " ci_width[m,3,:] = sol2['normal_ci'][1:,1] - sol2['normal_ci'][1:,0]\n", 317 | "\n", 318 | "cover = pd.DataFrame(ci_cover/M, index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 319 | "cover.columns = pd.Index(np.linspace(1,p,p), dtype=int)\n", 320 | "cover" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": 4, 326 | "id": "6522f352-8e41-4523-b0e6-ef387427afdf", 327 | "metadata": {}, 328 | "outputs": [ 329 | { 330 | "data": { 331 | "text/html": [ 332 | "
\n", 333 | "\n", 346 | "\n", 347 | " \n", 348 | " \n", 349 | " \n", 350 | " \n", 351 | " \n", 352 | " \n", 353 | " \n", 354 | " \n", 355 | " \n", 356 | " \n", 357 | " \n", 358 | " \n", 359 | " \n", 360 | " \n", 361 | " \n", 362 | " \n", 363 | " \n", 364 | " \n", 365 | " \n", 366 | " \n", 367 | " \n", 368 | " \n", 369 | " \n", 370 | " \n", 371 | " \n", 372 | " \n", 373 | " \n", 374 | " \n", 375 | " \n", 376 | " \n", 377 | " \n", 378 | " \n", 379 | " \n", 380 | " \n", 381 | " \n", 382 | " \n", 383 | " \n", 384 | " \n", 385 | " \n", 386 | " \n", 387 | " \n", 388 | " \n", 389 | " \n", 390 | " \n", 391 | " \n", 392 | " \n", 393 | " \n", 394 | " \n", 395 | " \n", 396 | " \n", 397 | " \n", 398 | " \n", 399 | " \n", 400 | " \n", 401 | " \n", 402 | " \n", 403 | " \n", 404 | " \n", 405 | " \n", 406 | " \n", 407 | " \n", 408 | " \n", 409 | " \n", 410 | " \n", 411 | " \n", 412 | " \n", 413 | " \n", 414 | " \n", 415 | " \n", 416 | " \n", 417 | " \n", 418 | " \n", 419 | " \n", 420 | " \n", 421 | " \n", 422 | " \n", 423 | " \n", 424 | " \n", 425 | " \n", 426 | " \n", 427 | " \n", 428 | " \n", 429 | " \n", 430 | " \n", 431 | " \n", 432 | " \n", 433 | " \n", 434 | " \n", 435 | " \n", 436 | " \n", 437 | " \n", 438 | " \n", 439 | " \n", 440 | " \n", 441 | " \n", 442 | " \n", 443 | " \n", 444 | " \n", 445 | " \n", 446 | " \n", 447 | " \n", 448 | " \n", 449 | " \n", 450 | " \n", 451 | " \n", 452 | " \n", 453 | " \n", 454 | " \n", 455 | " \n", 456 | " \n", 457 | " \n", 458 | " \n", 459 | " \n", 460 | " \n", 461 | " \n", 462 | " \n", 463 | " \n", 464 | " \n", 465 | " \n", 466 | "
1234567891011121314151617181920
Normal0.2572320.2563640.2601000.2590400.2628100.2622880.2597650.2615420.2608840.2595480.2606700.2604750.2572230.2595150.2619790.2573570.2584100.2562170.2557830.255602
MB-Percentile0.2256820.2247540.2258010.2256590.2268530.2262550.2237250.2268130.2265760.2266120.2243290.2254850.2254950.2257870.2275600.2244380.2250080.2239860.2245940.225444
MB-Pivotal0.2256820.2247540.2258010.2256590.2268530.2262550.2237250.2268130.2265760.2266120.2243290.2254850.2254950.2257870.2275600.2244380.2250080.2239860.2245940.225444
MB-Normal0.2290290.2279630.2284810.2290650.2312200.2304840.2280930.2309150.2303120.2298350.2282920.2296150.2290600.2292120.2307330.2281680.2287010.2278660.2278550.229004
\n", 467 | "
" 468 | ], 469 | "text/plain": [ 470 | " 1 2 3 4 5 6 \\\n", 471 | "Normal 0.257232 0.256364 0.260100 0.259040 0.262810 0.262288 \n", 472 | "MB-Percentile 0.225682 0.224754 0.225801 0.225659 0.226853 0.226255 \n", 473 | "MB-Pivotal 0.225682 0.224754 0.225801 0.225659 0.226853 0.226255 \n", 474 | "MB-Normal 0.229029 0.227963 0.228481 0.229065 0.231220 0.230484 \n", 475 | "\n", 476 | " 7 8 9 10 11 12 \\\n", 477 | "Normal 0.259765 0.261542 0.260884 0.259548 0.260670 0.260475 \n", 478 | "MB-Percentile 0.223725 0.226813 0.226576 0.226612 0.224329 0.225485 \n", 479 | "MB-Pivotal 0.223725 0.226813 0.226576 0.226612 0.224329 0.225485 \n", 480 | "MB-Normal 0.228093 0.230915 0.230312 0.229835 0.228292 0.229615 \n", 481 | "\n", 482 | " 13 14 15 16 17 18 \\\n", 483 | "Normal 0.257223 0.259515 0.261979 0.257357 0.258410 0.256217 \n", 484 | "MB-Percentile 0.225495 0.225787 0.227560 0.224438 0.225008 0.223986 \n", 485 | "MB-Pivotal 0.225495 0.225787 0.227560 0.224438 0.225008 0.223986 \n", 486 | "MB-Normal 0.229060 0.229212 0.230733 0.228168 0.228701 0.227866 \n", 487 | "\n", 488 | " 19 20 \n", 489 | "Normal 0.255783 0.255602 \n", 490 | "MB-Percentile 0.224594 0.225444 \n", 491 | "MB-Pivotal 0.224594 0.225444 \n", 492 | "MB-Normal 0.227855 0.229004 " 493 | ] 494 | }, 495 | "execution_count": 4, 496 | "metadata": {}, 497 | "output_type": "execute_result" 498 | } 499 | ], 500 | "source": [ 501 | "width = pd.DataFrame(np.mean(ci_width, axis=0), index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 502 | "width.columns = cover.columns\n", 503 | "width" 504 | ] 505 | }, 506 | { 507 | "cell_type": "markdown", 508 | "id": "ae287811-bc68-468e-a288-9bd0cb1e5bd6", 509 | "metadata": {}, 510 | "source": [ 511 | "# Heteroscedastic model" 512 | ] 513 | }, 514 | { 515 | "cell_type": "markdown", 516 | "id": "fe29008e-abe2-402d-80ff-6ff9b97bb85b", 517 | "metadata": {}, 518 | "source": [ 519 | "Let $z=(z_1, \\ldots, z_p)^T \\sim N(0, \\Sigma)$ with $\\Sigma = 
(0.5^{|j-k|})_{1\\leq j, k \\leq p}$ and $z_0 \\sim {\\rm Unif}(0,2)$ be independent. Generate independent data vectors $\\{(y_i , x_i) \\}_{i=1}^n$ from the model \n", 520 | "$$\n", 521 | " y_i = \\varepsilon_i x_{i1} + x_{i2} + \\cdots + x_{ip} \\quad {\\rm with } \\ \\ x_i = (x_{i1}, \\ldots, x_{ip})^T \\sim (z_0, z_2, \\ldots, z_p)^T,\n", 522 | "$$\n", 523 | "where $\\varepsilon_i$'s are iid $N(0,1)$ variables that are independent of $x_i$'s.\n", 524 | "\n", 525 | "Consider two quantile levels: $\\tau=0.5$ and $\\tau=0.8$. Note that the effect of $x_{i1}$ is only present for $\\tau=0.8$." 526 | ] 527 | }, 528 | { 529 | "cell_type": "code", 530 | "execution_count": 5, 531 | "id": "75b611ee-7478-48a3-a0e8-a24cc2355dce", 532 | "metadata": {}, 533 | "outputs": [], 534 | "source": [ 535 | "def cov_generate(std, corr=0.5):\n", 536 | " p = len(std)\n", 537 | " R = np.zeros(shape=[p,p])\n", 538 | " for j in range(p-1):\n", 539 | " R[j, j+1:] = np.array(range(1, len(R[j,j+1:])+1))\n", 540 | " R += R.T\n", 541 | " return np.outer(std, std) * (corr*np.ones(shape=[p,p]))** R\n", 542 | "\n", 543 | "n, p = 2000, 10\n", 544 | "mu, Sig = np.zeros(p), cov_generate(np.ones(p), 0.5)\n", 545 | "beta = np.ones(p)\n", 546 | "beta[0] = 0" 547 | ] 548 | }, 549 | { 550 | "cell_type": "markdown", 551 | "id": "19e7e1aa-f715-4154-8a8f-326660907886", 552 | "metadata": {}, 553 | "source": [ 554 | "### Case 1: $\\tau=0.5$.\n", 555 | "The conditional median of $y_i$ given $x_i$ is $Q_{0.5}(y_i | x_i) = x_{i2} + \\cdots + x_{ip}$." 556 | ] 557 | }, 558 | { 559 | "cell_type": "code", 560 | "execution_count": 6, 561 | "id": "910136b7-ca4c-4520-a5fd-46637670f11d", 562 | "metadata": {}, 563 | "outputs": [ 564 | { 565 | "data": { 566 | "text/html": [ 567 | "
\n", 568 | "\n", 581 | "\n", 582 | " \n", 583 | " \n", 584 | " \n", 585 | " \n", 586 | " \n", 587 | " \n", 588 | " \n", 589 | " \n", 590 | " \n", 591 | " \n", 592 | " \n", 593 | " \n", 594 | " \n", 595 | " \n", 596 | " \n", 597 | " \n", 598 | " \n", 599 | " \n", 600 | " \n", 601 | " \n", 602 | " \n", 603 | " \n", 604 | " \n", 605 | " \n", 606 | " \n", 607 | " \n", 608 | " \n", 609 | " \n", 610 | " \n", 611 | " \n", 612 | " \n", 613 | " \n", 614 | " \n", 615 | " \n", 616 | " \n", 617 | " \n", 618 | " \n", 619 | " \n", 620 | " \n", 621 | " \n", 622 | " \n", 623 | " \n", 624 | " \n", 625 | " \n", 626 | " \n", 627 | " \n", 628 | " \n", 629 | " \n", 630 | " \n", 631 | " \n", 632 | " \n", 633 | " \n", 634 | " \n", 635 | " \n", 636 | " \n", 637 | " \n", 638 | " \n", 639 | " \n", 640 | " \n", 641 | " \n", 642 | " \n", 643 | " \n", 644 | " \n", 645 | " \n", 646 | " \n", 647 | " \n", 648 | " \n", 649 | " \n", 650 | " \n", 651 | "
12345678910
Normal0.9500.9600.9540.9460.9640.9500.9600.9460.9400.944
MB-Percentile0.9400.9440.9360.9440.9500.9280.9420.9420.9320.942
MB-Pivotal0.9300.9680.9600.9680.9720.9600.9680.9620.9560.956
MB-Normal0.9440.9680.9560.9640.9720.9540.9620.9660.9480.954
\n", 652 | "
" 653 | ], 654 | "text/plain": [ 655 | " 1 2 3 4 5 6 7 8 9 \\\n", 656 | "Normal 0.950 0.960 0.954 0.946 0.964 0.950 0.960 0.946 0.940 \n", 657 | "MB-Percentile 0.940 0.944 0.936 0.944 0.950 0.928 0.942 0.942 0.932 \n", 658 | "MB-Pivotal 0.930 0.968 0.960 0.968 0.972 0.960 0.968 0.962 0.956 \n", 659 | "MB-Normal 0.944 0.968 0.956 0.964 0.972 0.954 0.962 0.966 0.948 \n", 660 | "\n", 661 | " 10 \n", 662 | "Normal 0.944 \n", 663 | "MB-Percentile 0.942 \n", 664 | "MB-Pivotal 0.956 \n", 665 | "MB-Normal 0.954 " 666 | ] 667 | }, 668 | "execution_count": 6, 669 | "metadata": {}, 670 | "output_type": "execute_result" 671 | } 672 | ], 673 | "source": [ 674 | "tau = 0.5\n", 675 | "ci_cover = np.zeros([4, p])\n", 676 | "ci_width = np.empty([M, 4, p])\n", 677 | "for m in range(M):\n", 678 | " X = rgt.multivariate_normal(mean=mu, cov=Sig, size=n)\n", 679 | " X[:,0] = rgt.uniform(0, 2, size=n)\n", 680 | " Y = X@beta + X[:,0]*rgt.normal(0,1,size=n)\n", 681 | "\n", 682 | " model = low_dim(X, Y, intercept=False) \n", 683 | " sol1 = model.norm_ci(tau)\n", 684 | " sol2 = model.mb_ci(tau)\n", 685 | " \n", 686 | " ci_cover[0,:] += (beta >= sol1['normal_ci'][:,0])*(beta<= sol1['normal_ci'][:,1])\n", 687 | " ci_cover[1,:] += (beta >= sol2['percentile_ci'][:,0])*(beta<= sol2['percentile_ci'][:,1])\n", 688 | " ci_cover[2,:] += (beta >= sol2['pivotal_ci'][:,0])*(beta<= sol2['pivotal_ci'][:,1])\n", 689 | " ci_cover[3,:] += (beta >= sol2['normal_ci'][:,0])*(beta<= sol2['normal_ci'][:,1])\n", 690 | " \n", 691 | " ci_width[m,0,:] = sol1['normal_ci'][:,1] - sol1['normal_ci'][:,0]\n", 692 | " ci_width[m,1,:] = sol2['percentile_ci'][:,1] - sol2['percentile_ci'][:,0]\n", 693 | " ci_width[m,2,:] = sol2['pivotal_ci'][:,1] - sol2['pivotal_ci'][:,0]\n", 694 | " ci_width[m,3,:] = sol2['normal_ci'][:,1] - sol2['normal_ci'][:,0]\n", 695 | "\n", 696 | "cover = pd.DataFrame(ci_cover/M, index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 697 | "cover.columns = 
pd.Index(np.linspace(1,p,p), dtype=int)\n", 698 | "cover" 699 | ] 700 | }, 701 | { 702 | "cell_type": "code", 703 | "execution_count": 7, 704 | "id": "14804788", 705 | "metadata": {}, 706 | "outputs": [ 707 | { 708 | "data": { 709 | "text/html": [ 710 | "
\n", 711 | "\n", 724 | "\n", 725 | " \n", 726 | " \n", 727 | " \n", 728 | " \n", 729 | " \n", 730 | " \n", 731 | " \n", 732 | " \n", 733 | " \n", 734 | " \n", 735 | " \n", 736 | " \n", 737 | " \n", 738 | " \n", 739 | " \n", 740 | " \n", 741 | " \n", 742 | " \n", 743 | " \n", 744 | " \n", 745 | " \n", 746 | " \n", 747 | " \n", 748 | " \n", 749 | " \n", 750 | " \n", 751 | " \n", 752 | " \n", 753 | " \n", 754 | " \n", 755 | " \n", 756 | " \n", 757 | " \n", 758 | " \n", 759 | " \n", 760 | " \n", 761 | " \n", 762 | " \n", 763 | " \n", 764 | " \n", 765 | " \n", 766 | " \n", 767 | " \n", 768 | " \n", 769 | " \n", 770 | " \n", 771 | " \n", 772 | " \n", 773 | " \n", 774 | " \n", 775 | " \n", 776 | " \n", 777 | " \n", 778 | " \n", 779 | " \n", 780 | " \n", 781 | " \n", 782 | " \n", 783 | " \n", 784 | " \n", 785 | " \n", 786 | " \n", 787 | " \n", 788 | " \n", 789 | " \n", 790 | " \n", 791 | " \n", 792 | " \n", 793 | " \n", 794 | "
12345678910
Normal0.1246390.0627230.0700830.0705540.0705160.0705140.0702800.0704700.0701510.063217
MB-Percentile0.1202840.0647100.0723640.0728470.0728510.0723310.0725720.0726180.0719240.064974
MB-Pivotal0.1202840.0647100.0723640.0728470.0728510.0723310.0725720.0726180.0719240.064974
MB-Normal0.1230690.0658820.0737290.0741090.0741630.0735180.0738340.0738730.0733410.066110
\n", 795 | "
" 796 | ], 797 | "text/plain": [ 798 | " 1 2 3 4 5 6 \\\n", 799 | "Normal 0.124639 0.062723 0.070083 0.070554 0.070516 0.070514 \n", 800 | "MB-Percentile 0.120284 0.064710 0.072364 0.072847 0.072851 0.072331 \n", 801 | "MB-Pivotal 0.120284 0.064710 0.072364 0.072847 0.072851 0.072331 \n", 802 | "MB-Normal 0.123069 0.065882 0.073729 0.074109 0.074163 0.073518 \n", 803 | "\n", 804 | " 7 8 9 10 \n", 805 | "Normal 0.070280 0.070470 0.070151 0.063217 \n", 806 | "MB-Percentile 0.072572 0.072618 0.071924 0.064974 \n", 807 | "MB-Pivotal 0.072572 0.072618 0.071924 0.064974 \n", 808 | "MB-Normal 0.073834 0.073873 0.073341 0.066110 " 809 | ] 810 | }, 811 | "execution_count": 7, 812 | "metadata": {}, 813 | "output_type": "execute_result" 814 | } 815 | ], 816 | "source": [ 817 | "width = pd.DataFrame(np.mean(ci_width, axis=0), index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 818 | "width.columns = cover.columns\n", 819 | "width" 820 | ] 821 | }, 822 | { 823 | "cell_type": "markdown", 824 | "id": "45636462-f041-43e7-b0d7-636bdf4f0496", 825 | "metadata": {}, 826 | "source": [ 827 | "### Case 2: $\\tau=0.8$. \n", 828 | "In this case, the conditional $0.8$-quantile of $y_i$ given $x_i$ is $Q_{0.8}(y_i | x_i) = \\Phi^{-1}(0.8) x_{i1} + x_{i2} + \\cdots + x_{ip}$." 
829 | ] 830 | }, 831 | { 832 | "cell_type": "code", 833 | "execution_count": 8, 834 | "id": "627a6dd7-8e1c-414b-b7bf-00bb51b64e5c", 835 | "metadata": {}, 836 | "outputs": [], 837 | "source": [ 838 | "tau = 0.8\n", 839 | "true_beta = np.copy(beta)\n", 840 | "true_beta[0] = norm.ppf(tau)\n", 841 | "\n", 842 | "ci_cover = np.zeros([4, p])\n", 843 | "ci_width = np.empty([M, 4, p])\n", 844 | "for m in range(M):\n", 845 | " X = rgt.multivariate_normal(mean=mu, cov=Sig, size=n)\n", 846 | " X[:,0] = rgt.uniform(0, 2, size=n)\n", 847 | " Y = X@beta + X[:,0]*rgt.normal(0,1,size=n)\n", 848 | "\n", 849 | " model = low_dim(X, Y, intercept=False) \n", 850 | " sol1 = model.norm_ci(tau)\n", 851 | " sol2 = model.mb_ci(tau)\n", 852 | " \n", 853 | " ci_cover[0,:] += (true_beta>=sol1['normal_ci'][:,0])*(true_beta<= sol1['normal_ci'][:,1])\n", 854 | " ci_cover[1,:] += (true_beta>=sol2['percentile_ci'][:,0])*(true_beta<= sol2['percentile_ci'][:,1])\n", 855 | " ci_cover[2,:] += (true_beta>=sol2['pivotal_ci'][:,0])*(true_beta<= sol2['pivotal_ci'][:,1])\n", 856 | " ci_cover[3,:] += (true_beta>=sol2['normal_ci'][:,0])*(true_beta<= sol2['normal_ci'][:,1])\n", 857 | " \n", 858 | " ci_width[m,0,:] = sol1['normal_ci'][:,1] - sol1['normal_ci'][:,0]\n", 859 | " ci_width[m,1,:] = sol2['percentile_ci'][:,1] - sol2['percentile_ci'][:,0]\n", 860 | " ci_width[m,2,:] = sol2['pivotal_ci'][:,1] - sol2['pivotal_ci'][:,0]\n", 861 | " ci_width[m,3,:] = sol2['normal_ci'][:,1] - sol2['normal_ci'][:,0]\n", 862 | " \n", 863 | "cover = pd.DataFrame(ci_cover/M, index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 864 | "cover.columns = pd.Index(np.linspace(1,p,p), dtype=int)\n", 865 | "\n", 866 | "width = pd.DataFrame(np.mean(ci_width, axis=0), index=[\"Normal\", \"MB-Percentile\", \"MB-Pivotal\", \"MB-Normal\"])\n", 867 | "width.columns = cover.columns" 868 | ] 869 | }, 870 | { 871 | "cell_type": "code", 872 | "execution_count": 9, 873 | "id": "8252e32b-16ad-442b-9ed9-b6d37bb7382f", 874 | 
"metadata": {}, 875 | "outputs": [ 876 | { 877 | "data": { 878 | "text/html": [ 879 | "
\n", 880 | "\n", 893 | "\n", 894 | " \n", 895 | " \n", 896 | " \n", 897 | " \n", 898 | " \n", 899 | " \n", 900 | " \n", 901 | " \n", 902 | " \n", 903 | " \n", 904 | " \n", 905 | " \n", 906 | " \n", 907 | " \n", 908 | " \n", 909 | " \n", 910 | " \n", 911 | " \n", 912 | " \n", 913 | " \n", 914 | " \n", 915 | " \n", 916 | " \n", 917 | " \n", 918 | " \n", 919 | " \n", 920 | " \n", 921 | " \n", 922 | " \n", 923 | " \n", 924 | " \n", 925 | " \n", 926 | " \n", 927 | " \n", 928 | " \n", 929 | " \n", 930 | " \n", 931 | " \n", 932 | " \n", 933 | " \n", 934 | " \n", 935 | " \n", 936 | " \n", 937 | " \n", 938 | " \n", 939 | " \n", 940 | " \n", 941 | " \n", 942 | " \n", 943 | " \n", 944 | " \n", 945 | " \n", 946 | " \n", 947 | " \n", 948 | " \n", 949 | " \n", 950 | " \n", 951 | " \n", 952 | " \n", 953 | " \n", 954 | " \n", 955 | " \n", 956 | " \n", 957 | " \n", 958 | " \n", 959 | " \n", 960 | " \n", 961 | " \n", 962 | " \n", 963 | "
12345678910
Normal0.9460.9600.9500.9620.9660.9660.9620.9660.9680.960
MB-Percentile0.9480.9500.9440.9580.9560.9560.9560.9600.9660.954
MB-Pivotal0.9280.9700.9680.9640.9660.9740.9820.9740.9720.976
MB-Normal0.9500.9660.9680.9660.9700.9700.9760.9720.9740.974
\n", 964 | "
" 965 | ], 966 | "text/plain": [ 967 | " 1 2 3 4 5 6 7 8 9 \\\n", 968 | "Normal 0.946 0.960 0.950 0.962 0.966 0.966 0.962 0.966 0.968 \n", 969 | "MB-Percentile 0.948 0.950 0.944 0.958 0.956 0.956 0.956 0.960 0.966 \n", 970 | "MB-Pivotal 0.928 0.970 0.968 0.964 0.966 0.974 0.982 0.974 0.972 \n", 971 | "MB-Normal 0.950 0.966 0.968 0.966 0.970 0.970 0.976 0.972 0.974 \n", 972 | "\n", 973 | " 10 \n", 974 | "Normal 0.960 \n", 975 | "MB-Percentile 0.954 \n", 976 | "MB-Pivotal 0.976 \n", 977 | "MB-Normal 0.974 " 978 | ] 979 | }, 980 | "execution_count": 9, 981 | "metadata": {}, 982 | "output_type": "execute_result" 983 | } 984 | ], 985 | "source": [ 986 | "cover" 987 | ] 988 | }, 989 | { 990 | "cell_type": "code", 991 | "execution_count": 10, 992 | "id": "9e769691-b906-40c0-a801-08339eeb4d63", 993 | "metadata": {}, 994 | "outputs": [ 995 | { 996 | "data": { 997 | "text/html": [ 998 | "
\n", 999 | "\n", 1012 | "\n", 1013 | " \n", 1014 | " \n", 1015 | " \n", 1016 | " \n", 1017 | " \n", 1018 | " \n", 1019 | " \n", 1020 | " \n", 1021 | " \n", 1022 | " \n", 1023 | " \n", 1024 | " \n", 1025 | " \n", 1026 | " \n", 1027 | " \n", 1028 | " \n", 1029 | " \n", 1030 | " \n", 1031 | " \n", 1032 | " \n", 1033 | " \n", 1034 | " \n", 1035 | " \n", 1036 | " \n", 1037 | " \n", 1038 | " \n", 1039 | " \n", 1040 | " \n", 1041 | " \n", 1042 | " \n", 1043 | " \n", 1044 | " \n", 1045 | " \n", 1046 | " \n", 1047 | " \n", 1048 | " \n", 1049 | " \n", 1050 | " \n", 1051 | " \n", 1052 | " \n", 1053 | " \n", 1054 | " \n", 1055 | " \n", 1056 | " \n", 1057 | " \n", 1058 | " \n", 1059 | " \n", 1060 | " \n", 1061 | " \n", 1062 | " \n", 1063 | " \n", 1064 | " \n", 1065 | " \n", 1066 | " \n", 1067 | " \n", 1068 | " \n", 1069 | " \n", 1070 | " \n", 1071 | " \n", 1072 | " \n", 1073 | " \n", 1074 | " \n", 1075 | " \n", 1076 | " \n", 1077 | " \n", 1078 | " \n", 1079 | " \n", 1080 | " \n", 1081 | " \n", 1082 | "
12345678910
Normal0.1439160.0653540.0727710.0729590.0721720.0725530.0723840.0723570.0731900.065104
MB-Percentile0.1383970.0678520.0753060.0753430.0748020.0753330.0750510.0751920.0755210.067160
MB-Pivotal0.1383970.0678520.0753060.0753430.0748020.0753330.0750510.0751920.0755210.067160
MB-Normal0.1413450.0688880.0766650.0765690.0759470.0764160.0761440.0762330.0768920.068287
\n", 1083 | "
" 1084 | ], 1085 | "text/plain": [ 1086 | " 1 2 3 4 5 6 \\\n", 1087 | "Normal 0.143916 0.065354 0.072771 0.072959 0.072172 0.072553 \n", 1088 | "MB-Percentile 0.138397 0.067852 0.075306 0.075343 0.074802 0.075333 \n", 1089 | "MB-Pivotal 0.138397 0.067852 0.075306 0.075343 0.074802 0.075333 \n", 1090 | "MB-Normal 0.141345 0.068888 0.076665 0.076569 0.075947 0.076416 \n", 1091 | "\n", 1092 | " 7 8 9 10 \n", 1093 | "Normal 0.072384 0.072357 0.073190 0.065104 \n", 1094 | "MB-Percentile 0.075051 0.075192 0.075521 0.067160 \n", 1095 | "MB-Pivotal 0.075051 0.075192 0.075521 0.067160 \n", 1096 | "MB-Normal 0.076144 0.076233 0.076892 0.068287 " 1097 | ] 1098 | }, 1099 | "execution_count": 10, 1100 | "metadata": {}, 1101 | "output_type": "execute_result" 1102 | } 1103 | ], 1104 | "source": [ 1105 | "width" 1106 | ] 1107 | } 1108 | ], 1109 | "metadata": { 1110 | "kernelspec": { 1111 | "display_name": "Python 3 (ipykernel)", 1112 | "language": "python", 1113 | "name": "python3" 1114 | }, 1115 | "language_info": { 1116 | "codemirror_mode": { 1117 | "name": "ipython", 1118 | "version": 3 1119 | }, 1120 | "file_extension": ".py", 1121 | "mimetype": "text/x-python", 1122 | "name": "python", 1123 | "nbconvert_exporter": "python", 1124 | "pygments_lexer": "ipython3", 1125 | "version": "3.11.7" 1126 | } 1127 | }, 1128 | "nbformat": 4, 1129 | "nbformat_minor": 5 1130 | } 1131 | -------------------------------------------------------------------------------- /conquer/joint.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import numpy.random as rgt 3 | 4 | from scipy.stats import norm 5 | from scipy.optimize import minimize 6 | from scipy.sparse import csc_matrix 7 | 8 | from sklearn.metrics import pairwise_kernels as PK 9 | from sklearn.kernel_ridge import KernelRidge as KR 10 | 11 | from qpsolvers import Problem, solve_problem 12 | from cvxopt import solvers, matrix 13 | solvers.options['show_progress'] = False 14 | 15 | 
import torch 16 | import torch.nn as nn 17 | import torch.optim as optim 18 | import matplotlib.pyplot as plt 19 | 20 | from torch.utils.data import DataLoader, TensorDataset 21 | from torch.utils.data.dataset import random_split 22 | from torch.distributions.normal import Normal 23 | from enum import Enum 24 | 25 | from conquer.linear import low_dim 26 | 27 | 28 | 29 | ############################################################################### 30 | ######################### Linear QuantES Regression ########################### 31 | ############################################################################### 32 | class LR(low_dim): 33 | ''' 34 | Joint Linear Quantile and Expected Shortfall Regression 35 | ''' 36 | def _FZ_loss(self, x, tau, G1=False, G2_type=1): 37 | ''' 38 | Fissler and Ziegel's Joint Loss Function 39 | 40 | Args: 41 | G1 : logical flag for the specification function G1 in FZ's loss; 42 | G1(x)=0 if G1=False, and G1(x)=x if G1=True. 43 | G2_type : an integer (from 1 to 5) that indicates the type 44 | of the specification function G2 in FZ's loss. 45 | 46 | Returns: 47 | FZ loss function value. 
48 | ''' 49 | 50 | X = self.X 51 | if G2_type in {1, 2, 3}: 52 | Ymax = np.max(self.Y) 53 | Y = self.Y - Ymax 54 | else: 55 | Y = self.Y 56 | dim = X.shape[1] 57 | Yq = X @ x[:dim] 58 | Ye = X @ x[dim : 2*dim] 59 | f0, f1, _ = G2(G2_type) 60 | loss = f1(Ye) * (Ye - Yq - (Y - Yq) * (Y<= Yq) / tau) - f0(Ye) 61 | if G1: 62 | return np.mean((tau - (Y<=Yq)) * (Y-Yq) + loss) 63 | else: 64 | return np.mean(loss) 65 | 66 | 67 | def joint_fit(self, tau=0.5, G1=False, G2_type=1, 68 | standardize=True, refit=True, tol=None, 69 | options={'maxiter': None, 'maxfev': None, 'disp': False, 70 | 'return_all': False, 'initial_simplex': None, 71 | 'xatol': 0.0001, 'fatol': 0.0001, 'adaptive': False}): 72 | ''' 73 | Joint Quantile & Expected Shortfall Regression via FZ Loss Minimization 74 | 75 | Refs: 76 | Higher Order Elicitability and Osband's Principle 77 | by Tobias Fissler and Johanna F. Ziegel 78 | Ann. Statist. 44(4): 1680-1707, 2016 79 | 80 | A Joint Quantile and Expected Shortfall Regression Framework 81 | by Timo Dimitriadis and Sebastian Bayer 82 | Electron. J. Stat. 13(1): 1823-1871, 2019 83 | 84 | Args: 85 | tau : quantile level; default is 0.5. 86 | G1 : logical flag for the specification function G1 in FZ's loss; 87 | G1(x)=0 if G1=False, and G1(x)=x if G1=True. 88 | G2_type : an integer (from 1 to 5) that indicates the type of the specification function G2 in FZ's loss. 89 | standardize : logical flag for x variable standardization prior to fitting the quantile model; 90 | default is TRUE. 91 | refit : logical flag for refitting joint regression if the optimization is terminated early; 92 | default is TRUE. 93 | tol : tolerance for termination. 94 | options : a dictionary of solver options; 95 | see https://docs.scipy.org/doc/scipy/reference/optimize.minimize-neldermead.html 96 | 97 | Returns: 98 | 'coef_q' : quantile regression coefficient estimate. 99 | 'coef_e' : expected shortfall regression coefficient estimate. 100 | 'nit' : total number of iterations. 
101 | 'nfev' : total number of function evaluations. 102 | 'message' : a message that describes the cause of the termination. 103 | ''' 104 | 105 | dim = self.X.shape[1] 106 | Ymax = np.max(self.Y) 107 | ### warm start with QR + truncated least squares 108 | qrfit = low_dim(self.X[:, self.itcp:], self.Y, intercept=True)\ 109 | .fit(tau=tau, standardize=standardize) 110 | coef_q = qrfit['beta'] 111 | tail = self.Y <= self.X @ coef_q 112 | tail_X = self.X[tail,:] 113 | tail_Y = self.Y[tail] 114 | coef_e = np.linalg.solve(tail_X.T @ tail_X, tail_X.T @ tail_Y) 115 | if G2_type in {1, 2, 3}: 116 | coef_q[0] -= Ymax 117 | coef_e[0] -= Ymax 118 | x0 = np.r_[(coef_q, coef_e)] 119 | 120 | ### joint quantile and ES fit 121 | fun = lambda x : self._FZ_loss(x, tau, G1, G2_type) 122 | esfit = minimize(fun, x0, method='Nelder-Mead', 123 | tol=tol, options=options) 124 | nit, nfev = esfit['nit'], esfit['nfev'] 125 | 126 | ### refit if convergence criterion is not met 127 | while refit and not esfit['success']: 128 | esfit = minimize(fun, esfit['x'], method='Nelder-Mead', 129 | tol=tol, options=options) 130 | nit += esfit['nit'] 131 | nfev += esfit['nfev'] 132 | 133 | coef_q, coef_e = esfit['x'][:dim], esfit['x'][dim : 2*dim] 134 | if G2_type in {1, 2, 3}: 135 | coef_q[0] += Ymax 136 | coef_e[0] += Ymax 137 | 138 | return {'coef_q': coef_q, 'coef_e': coef_e, 139 | 'nit': nit, 'nfev': nfev, 140 | 'success': esfit['success'], 141 | 'message': esfit['message']} 142 | 143 | 144 | def twostep_fit(self, tau=0.5, h=None, kernel='Laplacian', 145 | loss='L2', robust=None, G2_type=1, 146 | standardize=True, tol=None, options=None, 147 | ci=False, level=0.95): 148 | ''' 149 | Two-Step Procedure for Joint QuantES Regression 150 | 151 | Refs: 152 | Higher Order Elicitability and Osband's Principle 153 | by Tobias Fissler and Johanna F. Ziegel 154 | Ann. Statist. 
44(4): 1680-1707, 2016 155 | 156 | Efficiently Weighted Estimation of Tail and Interquantile Expectations 157 | by Sander Barendse 158 | SSRN Preprint, 2020 159 | 160 | Robust Estimation and Inference 161 | for Expected Shortfall Regression with Many Regressors 162 | by Xuming He, Kean Ming Tan and Wen-Xin Zhou 163 | J. R. Stat. Soc. B. 85(4): 1223-1246, 2023 164 | 165 | Inference for Joint Quantile and Expected Shortfall Regression 166 | by Xiang Peng and Huixia Judy Wang 167 | Stat 12(1) e619, 2023 168 | 169 | Args: 170 | tau : quantile level; default is 0.5. 171 | h : bandwidth; the default value is computed by self.bandwidth(tau). 172 | kernel : a character string representing one of the built-in smoothing kernels; 173 | default is "Laplacian". 174 | loss : the loss function used in stage two. There are three options. 175 | 1. 'L2': squared/L2 loss; 176 | 2. 'Huber': Huber loss; 177 | 3. 'FZ': Fissler and Ziegel's joint loss. 178 | robust : robustification parameter in the Huber loss; 179 | if robust=None, it will be automatically determined in a data-driven way. 180 | G2_type : an integer (from 1 to 5) that indicates the type of the specification function G2 in FZ's loss. 181 | standardize : logical flag for x variable standardization prior to fitting the quantile model; 182 | default is TRUE. 183 | tol : tolerance for termination. 184 | options : a dictionary of solver options; 185 | see https://docs.scipy.org/doc/scipy/reference/optimize.minimize-bfgs.html. 186 | ci : logical flag for computing normal-based confidence intervals. 187 | level : confidence level between 0 and 1. 188 | 189 | Returns: 190 | 'coef_q' : quantile regression coefficient estimate. 191 | 'res_q' : a vector of fitted quantile regression residuals. 192 | 'coef_e' : expected shortfall regression coefficient estimate. 193 | 'robust' : robustification parameter in the Huber loss. 194 | 'ci' : coordinate-wise (100*level)% confidence intervals. 
195 | ''' 196 | 197 | if loss in {'L2', 'Huber', 'TrunL2', 'TrunHuber'}: 198 | qrfit = self.fit(tau=tau, h=h, kernel=kernel, 199 | standardize=standardize) 200 | nres_q = np.minimum(qrfit['res'], 0) 201 | 202 | if loss == 'L2': 203 | adj = np.linalg.solve(self.X.T@self.X, self.X.T@nres_q / tau) 204 | coef_e = qrfit['beta'] + adj 205 | robust = None 206 | elif loss == 'Huber': 207 | Z = nres_q + tau*(self.Y - qrfit['res']) 208 | X0 = self.X[:, self.itcp:] 209 | if robust == None: 210 | esfit = low_dim(tau*X0, Z, intercept=self.itcp).adaHuber() 211 | coef_e = esfit['beta'] 212 | robust = esfit['robust'] 213 | if self.itcp: coef_e[0] /= tau 214 | elif robust > 0: 215 | esfit = low_dim(tau*X0, Z, intercept=self.itcp)\ 216 | .als(robust=robust, standardize=standardize, scale=False) 217 | coef_e = esfit['beta'] 218 | robust = esfit['robust'] 219 | if self.itcp: coef_e[0] /= tau 220 | else: 221 | raise ValueError("robustification parameter must be positive") 222 | elif loss == 'FZ': 223 | if G2_type in np.arange(1,4): 224 | Ymax = np.max(self.Y) 225 | Y = self.Y - Ymax 226 | else: 227 | Y = self.Y 228 | qrfit = low_dim(self.X[:, self.itcp:], Y, intercept=True)\ 229 | .fit(tau=tau, h=h, kernel=kernel, standardize=standardize) 230 | adj = np.minimum(qrfit['res'], 0)/tau + Y - qrfit['res'] 231 | f0, f1, f2 = G2(G2_type) 232 | 233 | fun = lambda z : np.mean(f1(self.X @ z) * (self.X @ z - adj) \ 234 | - f0(self.X @ z)) 235 | grad = lambda z : self.X.T@(f2(self.X@z)*(self.X@z - adj))/self.n 236 | esfit = minimize(fun, qrfit['beta'], method='BFGS', 237 | jac=grad, tol=tol, options=options) 238 | coef_e = esfit['x'] 239 | robust = None 240 | if G2_type in np.arange(1,4): 241 | coef_e[0] += Ymax 242 | qrfit['beta'][0] += Ymax 243 | elif loss == 'TrunL2': 244 | tail = self.Y <= self.X @ qrfit['beta'] 245 | tail_X = self.X[tail,:] 246 | tail_Y = self.Y[tail] 247 | coef_e = np.linalg.solve(tail_X.T @ tail_X, 248 | tail_X.T @ tail_Y) 249 | robust = None 250 | elif loss == 'TrunHuber': 
251 | tail = self.Y <= self.X @ qrfit['beta'] 252 | esfit = LR(self.X[tail, self.itcp:], self.Y[tail],\ 253 | intercept=self.itcp).adaHuber() 254 | coef_e = esfit['beta'] 255 | robust = esfit['robust'] 256 | 257 | if loss in {'L2', 'Huber'} and ci: 258 | res_e = nres_q + tau * self.X@(qrfit['beta'] - coef_e) 259 | n, p = self.X[:,self.itcp:].shape 260 | X0 = np.c_[np.ones(n,), self.X[:,self.itcp:] - self.mX] 261 | if loss == 'L2': 262 | weight = res_e ** 2 263 | else: 264 | weight = np.minimum(res_e ** 2, robust ** 2) 265 | 266 | inv_sig = np.linalg.inv(X0.T @ X0 / n) 267 | acov = inv_sig @ ((X0.T * weight) @ X0 / n) @ inv_sig 268 | radius = norm.ppf(1/2 + level/2) * np.sqrt(np.diag(acov)/n) / tau 269 | ci = np.c_[coef_e - radius, coef_e + radius] 270 | 271 | return {'coef_q': qrfit['beta'], 'res_q': qrfit['res'], 272 | 'coef_e': coef_e, 273 | 'loss': loss, 'robust': robust, 274 | 'ci': ci, 'level': level} 275 | 276 | 277 | def boot_es(self, tau=0.5, h=None, kernel='Laplacian', 278 | loss='L2', robust=None, standardize=True, 279 | B=200, level=0.95): 280 | 281 | fit = self.twostep_fit(tau, h, kernel, loss, 282 | robust, standardize=standardize) 283 | boot_coef = np.zeros((self.X.shape[1], B)) 284 | for b in range(B): 285 | idx = rgt.choice(np.arange(self.n), size=self.n) 286 | boot = LR(self.X[idx,self.itcp:], self.Y[idx], 287 | intercept=self.itcp) 288 | if loss == 'L2': 289 | bfit = boot.twostep_fit(tau, h, kernel, loss='L2', 290 | standardize=standardize) 291 | else: 292 | bfit = boot.twostep_fit(tau, h, kernel, loss, 293 | robust=fit['robust'], 294 | standardize=standardize) 295 | boot_coef[:,b] = bfit['coef_e'] 296 | 297 | left = np.quantile(boot_coef, 1/2-level/2, axis=1) 298 | right = np.quantile(boot_coef, 1/2+level/2, axis=1) 299 | piv_ci = np.c_[2*fit['coef_e'] - right, 2*fit['coef_e'] - left] 300 | per_ci = np.c_[left, right] 301 | 302 | return {'coef_q': fit['coef_q'], 303 | 'coef_e': fit['coef_e'], 304 | 'boot_coef_e': boot_coef, 305 | 'loss': loss, 
'robust': fit['robust'], 306 | 'piv_ci': piv_ci, 'per_ci': per_ci, 'level': level} 307 | 308 | 309 | def nc_fit(self, tau=0.5, h=None, kernel='Laplacian', 310 | loss='L2', robust=None, standardize=True, 311 | ci=False, level=0.95): 312 | ''' 313 | Non-Crossing Joint Quantile & Expected Shortfall Regression 314 | 315 | Refs: 316 | Robust Estimation and Inference 317 | for Expected Shortfall Regression with Many Regressors 318 | by Xuming He, Kean Ming Tan and Wen-Xin Zhou 319 | J. R. Stat. Soc. B. 85(4): 1223-1246, 2023 320 | ''' 321 | 322 | qrfit = self.fit(tau=tau, h=h, kernel=kernel, standardize=standardize) 323 | nres_q = np.minimum(qrfit['res'], 0) 324 | fitted_q = self.Y - qrfit['res'] 325 | Z = nres_q/tau + fitted_q 326 | 327 | P = matrix(self.X.T @ self.X / self.n) 328 | q = matrix(-self.X.T @ Z / self.n) 329 | G = matrix(self.X) 330 | hh = matrix(fitted_q) 331 | l, c = 0, robust 332 | 333 | if loss == 'L2': 334 | esfit = solvers.qp(P, q, G, hh, 335 | initvals={'x': matrix(qrfit['beta'])}) 336 | coef_e = np.array(esfit['x']).reshape(self.X.shape[1],) 337 | else: 338 | rel = (self.X.shape[1] + np.log(self.n)) / self.n 339 | esfit = self.twostep_fit(tau, h, kernel, loss, 340 | robust, standardize=standardize) 341 | coef_e = esfit['coef_e'] 342 | res = np.abs(Z - self.X @ coef_e) 343 | c = robust 344 | 345 | if robust == None: 346 | c = find_root(lambda t : np.mean(np.minimum((res/t)**2, 1)) - rel, 347 | np.min(res)+self.params['tol'], np.sqrt(res @ res)) 348 | 349 | sol_diff = 1 350 | while l < self.params['max_iter'] \ 351 | and sol_diff > self.params['tol']: 352 | wt = np.where(res > c, res/c, 1) 353 | P = matrix( (self.X.T / wt ) @ self.X / self.n) 354 | q = matrix( -self.X.T @ (Z / wt) / self.n) 355 | esfit = solvers.qp(P, q, G, hh, initvals={'x': matrix(coef_e)}) 356 | tmp = np.array(esfit['x']).reshape(self.X.shape[1],) 357 | sol_diff = np.max(np.abs(tmp - coef_e)) 358 | res = np.abs(Z - self.X @ tmp) 359 | if robust == None: 360 | c = find_root(lambda t 
: np.mean(np.minimum((res/t)**2, 1)) - rel, 361 | np.min(res)+self.params['tol'], np.sqrt(res @ res)) 362 | coef_e = tmp 363 | l += 1 364 | c *= tau 365 | 366 | if ci: 367 | res_e = nres_q + tau * (fitted_q - self.X @ coef_e) 368 | X0 = np.c_[np.ones(self.n,), self.X[:,self.itcp:] - self.mX] 369 | if loss == 'L2': weight = res_e ** 2 370 | else: weight = np.minimum(res_e ** 2, c ** 2) 371 | 372 | inv_sig = np.linalg.inv(X0.T @ X0 / self.n) 373 | acov = inv_sig @ ((X0.T * weight) @ X0 / self.n) @ inv_sig 374 | radius = norm.ppf(1/2 + level/2) * np.sqrt(np.diag(acov)/self.n) / tau 375 | ci = np.c_[coef_e - radius, coef_e + radius] 376 | 377 | return {'coef_q': qrfit['beta'], 378 | 'res_q': qrfit['res'], 379 | 'coef_e': coef_e, 'nit': l, 380 | 'loss': loss, 'robust': c, 381 | 'ci': ci, 'level': level} 382 | 383 | 384 | 385 | ############################################################################### 386 | ########################## Kernel Ridge Regression ############################ 387 | ############################################################################### 388 | class KRR: 389 | ''' 390 | Kernel Ridge Regression 391 | 392 | Methods: 393 | __init__(): Initialize the KRR object 394 | qt(): Fit (smoothed) quantile kernel ridge regression 395 | es(): Fit (robust) expected shortfall kernel ridge regression 396 | qt_seq(): Fit a sequence of quantile kernel ridge regressions 397 | qt_predict(): Compute predicted quantile at test data 398 | es_predict(): Compute predicted expected shortfall at test data 399 | qt_loss(): Check or smoothed check loss 400 | qt_grad(): Compute the (sub)gradient of the (smoothed) check loss 401 | bw(): Compute the bandwidth (smoothing parameter) 402 | genK(): Generate the kernel matrix for test data 403 | 404 | Attributes: 405 | params (dict): a dictionary of kernel parameters; 406 | gamma (float), default is 1; 407 | coef0 (float), default is 1; 408 | degree (int), default is 3. 
409 | rbf : exp(-gamma*||x-y||_2^2) 410 | polynomial : (gamma*&lt;x,y&gt; + coef0)^degree 411 | laplacian : exp(-gamma*||x-y||_1) 412 | ''' 413 | params = {'gamma': 1, 'coef0': 1, 'degree': 3} 414 | 415 | 416 | def __init__(self, X, Y, normalization=None, 417 | kernel='rbf', kernel_params=dict(), 418 | smooth_method='convolution', 419 | min_bandwidth=1e-4, n_jobs=None): 420 | ''' 421 | Initialize the KRR object 422 | 423 | Args: 424 | X (ndarray): n by p matrix of covariates; 425 | n is the sample size, p is the number of covariates. 426 | Y (ndarray): response/target variable. 427 | normalization (str): method for normalizing covariates; 428 | should be one of [None, 'MinMax', 'Z-score']. 429 | kernel (str): type of kernel function; 430 | choose one from ['rbf', 'polynomial', 'laplacian']. 431 | kernel_params (dict): a dictionary of user-specified kernel parameters; 432 | default is in the class attribute. 433 | smooth_method (str): method for smoothing the check loss; 434 | choose one from ['convolution', 'moreau']. 435 | min_bandwidth (float): minimum value of the bandwidth; default is 1e-4. 436 | n_jobs (int): the number of parallel jobs to run; default is None. 
437 | 438 | Attributes: 439 | n (int) : number of observations 440 | Y (ndarray) : target variable 441 | nm (str) : method for normalizing covariates 442 | kernel (str) : type of kernel function 443 | params (dict) : a dictionary of kernel parameters 444 | X0 (ndarray) : normalized covariates 445 | xmin (ndarray) : minimum values of the original covariates 446 | xmax (ndarray) : maximum values of the original covariates 447 | xm (ndarray) : mean values of the original covariates 448 | xsd (ndarray) : standard deviations of the original covariates 449 | K (ndarray) : n by n kernel matrix 450 | ''' 451 | 452 | self.n = X.shape[0] 453 | self.Y = Y.reshape(self.n) 454 | self.nm = normalization 455 | self.kernel = kernel 456 | self.params.update(kernel_params) 457 | 458 | if normalization is None: 459 | self.X0 = X[:] 460 | elif normalization == 'MinMax': 461 | self.xmin = np.min(X, axis=0) 462 | self.xmax = np.max(X, axis=0) 463 | self.X0 = (X[:] - self.xmin)/(self.xmax - self.xmin) 464 | elif normalization == 'Z-score': 465 | self.xm, self.xsd = np.mean(X, axis=0), np.std(X, axis=0) 466 | self.X0 = (X[:] - self.xm)/self.xsd 467 | 468 | self.min_bandwidth = min_bandwidth 469 | self.n_jobs = n_jobs 470 | self.smooth_method = smooth_method 471 | 472 | # compute the kernel matrix 473 | self.K = PK(self.X0, metric=kernel, filter_params=True, 474 | n_jobs=self.n_jobs, **self.params) 475 | 476 | 477 | def genK(self, x): 478 | ''' Generate the kernel matrix for test data ''' 479 | if np.ndim(x) == 1: 480 | x = x.reshape(1, -1) 481 | if self.nm == 'MinMax': 482 | x = (x - self.xmin)/(self.xmax - self.xmin) 483 | elif self.nm == 'Z-score': 484 | x = (x - self.xm)/self.xsd 485 | 486 | # return an n * m matrix, m is test data size 487 | return PK(self.X0, x, metric=self.kernel, 488 | filter_params=True, n_jobs=self.n_jobs, 489 | **self.params) 490 | 491 | 492 | def qt_loss(self, x, h=0): 493 | ''' 494 | Check or smoothed check loss 495 | ''' 496 | tau = self.tau 497 | if h == 0: 
498 | out = np.where(x > 0, tau * x, (tau - 1) * x) 499 | elif self.smooth_method == 'convolution': 500 | out = (tau - norm.cdf(-x/h)) * x \ 501 | + (h/2) * np.sqrt(2/np.pi) * np.exp(-(x/h) ** 2 /2) 502 | elif self.smooth_method == 'moreau': 503 | out = np.where(x > tau*h, tau*x - tau**2 * h/2, 504 | np.where(x < (tau - 1)*h, 505 | (tau - 1)*x - (tau - 1)**2 * h/2, 506 | x**2/(2*h))) 507 | return np.sum(out) 508 | 509 | 510 | def qt_grad(self, x, h=0): 511 | ''' 512 | Gradient/subgradient of the (smoothed) check loss 513 | ''' 514 | if h == 0: 515 | return np.where(x >= 0, self.tau, self.tau - 1) 516 | elif self.smooth_method == 'convolution': 517 | return self.tau - norm.cdf(-x / h) 518 | elif self.smooth_method == 'moreau': 519 | return np.where(x > self.tau * h, self.tau, 520 | np.where(x < (self.tau - 1) * h, 521 | self.tau - 1, x/h)) 522 | 523 | 524 | def bw(self, exponent=1/3): 525 | ''' 526 | Compute the bandwidth (smoothing parameter) 527 | 528 | Args: 529 | exponent (float): the exponent in the formula; default is 1/3. 530 | ''' 531 | krr = KR(alpha = 1, kernel=self.kernel, 532 | gamma = self.params['gamma'], 533 | degree = self.params['degree'], 534 | coef0 = self.params['coef0']) 535 | krr.fit(self.X0, self.Y) 536 | krr_res = self.Y - krr.predict(self.X0) 537 | return max(self.min_bandwidth, 538 | np.std(krr_res)/self.n ** exponent) 539 | 540 | 541 | def qt(self, tau=0.5, alpha_q=1, 542 | init=None, intercept=True, 543 | smooth=False, h=0., method='L-BFGS-B', 544 | solver='clarabel', tol=1e-7, options=None): 545 | ''' 546 | Fit (smoothed) quantile kernel ridge regression 547 | 548 | Args: 549 | tau (float): quantile level between 0 and 1; default is 0.5. 550 | alpha_q (float): regularization parameter; default is 1. 551 | init (ndarray): initial values for optimization; default is None. 552 | intercept (bool): whether to include intercept term; 553 | default is True. 
554 | smooth (bool): a logical flag for using smoothed check loss; 555 | default is FALSE. 556 | h (float): bandwidth for smoothing; default is 0. 557 | method (str): type of solver if smoothing (h>0) is used; 558 | choose one from ['BFGS', 'L-BFGS-B']. 559 | solver (str): type of QP solver if check loss is used; 560 | default is 'clarabel'. 561 | tol (float): tolerance for termination; default is 1e-7. 562 | options (dict): a dictionary of solver options; default is None. 563 | 564 | Attributes: 565 | qt_sol (OptimizeResult): solution of the optimization problem 566 | qt_beta (ndarray): quantile KRR coefficients 567 | qt_fit (ndarray): fitted quantiles (in-sample) 568 | ''' 569 | self.alpha_q, self.tau, self.itcp_q = alpha_q, tau, intercept 570 | if smooth and h == 0: 571 | h = self.bw() 572 | n, self.h = self.n, h 573 | 574 | # compute smoothed quantile KRR estimator with bandwidth h 575 | if self.h > 0: 576 | if intercept: 577 | x0 = init if init is not None else np.zeros(n + 1) 578 | x0[0] = np.quantile(self.Y, tau) 579 | res = lambda x: self.Y - x[0] - self.K @ x[1:] 580 | func = lambda x: self.qt_loss(res(x),h) + \ 581 | (alpha_q/2) * np.dot(x[1:], self.K @ x[1:]) 582 | grad = lambda x: np.insert(-self.K @ self.qt_grad(res(x),h) 583 | + alpha_q*self.K @ x[1:], 584 | 0, np.sum(-self.qt_grad(res(x),h))) 585 | self.qt_sol = minimize(func, x0, method=method, 586 | jac=grad, tol=tol, options=options) 587 | self.qt_beta = self.qt_sol.x 588 | self.qt_fit = self.qt_beta[0] + self.K @ self.qt_beta[1:] 589 | else: 590 | x0 = init if init is not None else np.zeros(n) 591 | res = lambda x: self.Y - self.K @ x 592 | func = lambda x: self.qt_loss(res(x), h) \ 593 | + (alpha_q/2) * np.dot(x, self.K @ x) 594 | grad = lambda x: -self.K @ self.qt_grad(res(x), h) \ 595 | + alpha_q * self.K @ x 596 | self.qt_sol = minimize(func, x0=x0, method=method, 597 | jac=grad, tol=tol, options=options) 598 | self.qt_beta = self.qt_sol.x 599 | self.qt_fit = self.K @ self.qt_beta 600 | else: 
601 | # fit quantile KRR by solving a quadratic program 602 | C = 1 / alpha_q 603 | lb = C * (tau - 1) 604 | ub = C * tau 605 | prob = Problem(P=csc_matrix(self.K), q=-self.Y, G=None, h=None, 606 | A=csc_matrix(np.ones(n)), b=np.array([0.]), 607 | lb=lb * np.ones(n), ub=ub * np.ones(n)) 608 | self.qt_sol = solve_problem(prob, solver=solver) 609 | self.itcp_q = True 610 | self.qt_fit = self.K @ self.qt_sol.x 611 | b = np.quantile(self.Y - self.qt_fit, tau) 612 | self.qt_beta = np.insert(self.qt_sol.x, 0, b) 613 | self.qt_fit += b 614 | 615 | 616 | def qt_seq(self, tau=0.5, alphaseq=np.array([0.1]), 617 | intercept=True, smooth=False, h=0., 618 | method='L-BFGS-B', solver='clarabel', 619 | tol=1e-7, options=None): 620 | ''' 621 | Fit a sequence of (smoothed) quantile kernel ridge regressions 622 | ''' 623 | alphaseq = np.sort(alphaseq)[::-1] 624 | args = [intercept, smooth, h, method, solver, tol, options] 625 | 626 | x0 = None 627 | x, fit = [], [] 628 | for alpha_q in alphaseq: 629 | self.qt(tau, alpha_q, x0, *args) 630 | x.append(self.qt_beta) 631 | fit.append(self.qt_fit) 632 | x0 = self.qt_beta 633 | 634 | self.qt_beta = np.array(x).T 635 | self.qt_fit = np.array(fit).T 636 | self.alpha_q = alphaseq 637 | 638 | 639 | def qt_predict(self, x): 640 | ''' 641 | Compute predicted quantile at new input x 642 | 643 | Args: 644 | x (ndarray): new input. 645 | ''' 646 | return self.itcp_q*self.qt_beta[0] + \ 647 | self.qt_beta[self.itcp_q:] @ self.genK(x) 648 | 649 | 650 | def es(self, tau=0.5, alpha_q=1, alpha_e=1, 651 | init=None, intercept=True, 652 | qt_fit=None, smooth=False, h=0., 653 | method='L-BFGS-B', solver='clarabel', 654 | robust=False, c=None, 655 | qt_tol=1e-7, es_tol=1e-7, options=None): 656 | """ 657 | Fit (robust) expected shortfall kernel ridge regression 658 | 659 | Args: 660 | tau (float): quantile level between 0 and 1; default is 0.5. 661 | alpha_q, alpha_e (float): regularization parameters; default is 1.
662 | init (ndarray): initial values for optimization; default is None. 663 | intercept (bool): whether to include intercept term; 664 | default is True. 665 | qt_fit (ndarray): fitted quantiles from the first step; 666 | default is None. 667 | smooth (bool): a logical flag for using smoothed check loss; 668 | default is FALSE. 669 | h (float): bandwidth for smoothing; default is 0. 670 | method (str): type of solver if smoothing (h>0) is used; 671 | choose one from ['BFGS', 'L-BFGS-B']. 672 | solver (str): type of QP solver if check loss is used; 673 | default is 'clarabel'. 674 | robust (bool): whether to use the Huber loss in the second step; 675 | default is False. 676 | c (float): positive tuning parameter for the Huber estimator; 677 | default is None. 678 | qt_tol (float): tolerance for termination in qt-KRR; 679 | default is 1e-7. 680 | es_tol (float): tolerance for termination in es-KRR; 681 | default is 1e-7. 682 | options (dict): a dictionary of solver options; default is None. 
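Note: the second step of this method regresses a surrogate response Z = min(Y - qt_fit, 0)/tau + qt_fit on the covariates (constructed in the body below); when qt_fit equals the true conditional tau-quantile, E[Z | X] is exactly the tau-level expected shortfall, which is what makes the two-step scheme work. A self-contained numpy check of this identity on synthetic N(0, 1) data (illustrative only, not part of the class):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
tau = 0.1
y = rng.standard_normal(1_000_000)

# true tau-level quantile and expected shortfall of N(0, 1)
q = norm.ppf(tau)
es_true = -norm.pdf(q) / tau          # E[Y | Y <= q]

# surrogate response: its sample mean recovers the expected shortfall
z = np.minimum(y - q, 0) / tau + q
print(abs(z.mean() - es_true) < 0.01)  # True
```

With robust=True, the squared loss on Z is replaced by a Huber loss, which guards the second step against the heavy tails that Z inherits from Y.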
683 | 684 | Attributes: 685 | es_sol (OptimizeResult): solution of the optimization problem 686 | es_beta (ndarray): expected shortfall KRR coefficients 687 | es_fit (ndarray): fitted expected shortfalls (in-sample) 688 | """ 689 | if qt_fit is None: 690 | self.qt(tau, alpha_q, None, intercept, smooth, h, 691 | method, solver, qt_tol, options) 692 | qt_fit = self.qt_fit 693 | elif len(qt_fit) != self.n: 694 | raise ValueError("Length of qt_fit should be equal to \ 695 | the number of observations.") 696 | 697 | self.alpha_e, self.tau, self.itcp = alpha_e, tau, intercept 698 | n = self.n 699 | 700 | qt_nres = np.minimum(self.Y - qt_fit, 0) 701 | if robust == True and c is None: 702 | c = np.std(qt_nres) * (n/np.log(n))**(1/3) / tau 703 | self.c = c 704 | # surrogate response 705 | Z = qt_nres/tau + qt_fit 706 | if intercept: 707 | x0 = init if init is not None else np.zeros(n + 1) 708 | x0[0] = np.mean(Z) 709 | res = lambda x: Z - x[0] - self.K @ x[1:] 710 | func = lambda x: huber_loss(res(x), c) + \ 711 | (alpha_e/2) * np.dot(x[1:], self.K @ x[1:]) 712 | grad = lambda x: np.insert(-self.K @ huber_grad(res(x), c) 713 | + alpha_e * self.K @ x[1:], 714 | 0, -np.sum(huber_grad(res(x), c))) 715 | self.es_sol = minimize(func, x0, method=method, 716 | jac=grad, tol=es_tol, options=options) 717 | self.es_beta = self.es_sol.x 718 | self.es_fit = self.es_beta[0] + self.K @ self.es_beta[1:] 719 | else: 720 | x0 = init if init is not None else np.zeros(n) 721 | res = lambda x: Z - self.K @ x 722 | func = lambda x: huber_loss(res(x), c) \ 723 | + (alpha_e/2) * np.dot(x, self.K @ x) 724 | grad = lambda x: -self.K @ huber_grad(res(x), c) \ 725 | + alpha_e * self.K @ x 726 | self.es_sol = minimize(func, x0=x0, method=method, 727 | jac=grad, tol=es_tol, options=options) 728 | self.es_beta = self.es_sol.x 729 | self.es_fit = self.K @ self.es_beta 730 | self.es_residual = Z - self.es_fit 731 | self.es_model = None 732 | 733 | 734 | def lses(self, tau=0.5, 735 | alpha_q=1, alpha_e=1, 
736 | qt_fit=None, smooth=False, h=0., 737 | method='L-BFGS-B', solver='clarabel', 738 | qt_tol=1e-7, options=None): 739 | 740 | self.alpha_e, self.tau, self.itcp = alpha_e, tau, False 741 | if qt_fit is None: 742 | self.qt(tau, alpha_q, None, True, smooth, h, 743 | method, solver, qt_tol, options) 744 | qt_fit = self.qt_fit 745 | else: 746 | self.qt_fit = qt_fit 747 | 748 | Z = np.minimum(self.Y - self.qt_fit, 0)/tau + self.qt_fit 749 | if self.kernel == 'polynomial': 750 | self.params['gamma'] = None 751 | self.es_model = KR(alpha = alpha_e, kernel=self.kernel, 752 | gamma = self.params['gamma'], 753 | degree = self.params['degree'], 754 | coef0 = self.params['coef0']) 755 | self.es_model.fit(self.X0, Z) 756 | self.es_fit = self.es_model.predict(self.X0) 757 | self.es_residual = Z - self.es_fit 758 | self.es_beta = None 759 | 760 | 761 | def es_predict(self, x): 762 | ''' 763 | Compute predicted expected shortfall at new input x 764 | 765 | Args: 766 | x (ndarray): new input. 767 | ''' 768 | 769 | if self.es_beta is not None: 770 | return self.itcp*self.es_beta[0] \ 771 | + self.es_beta[self.itcp:] @ self.genK(x) 772 | elif self.es_model is not None: 773 | if self.nm == 'MinMax': 774 | x = (x - self.xmin)/(self.xmax - self.xmin) 775 | elif self.nm == 'Z-score': 776 | x = (x - self.xm)/self.xsd 777 | return self.es_model.predict(x) 778 | 779 | 780 | def bahadur(self, x): 781 | ''' 782 | Compute Bahadur representation of the expected shortfall estimator 783 | ''' 784 | if np.ndim(x) == 1: 785 | x = x.reshape(1, -1) 786 | if self.nm == 'MinMax': 787 | x = (x - self.xmin)/(self.xmax - self.xmin) 788 | elif self.nm == 'Z-score': 789 | x = (x - self.xm)/self.xsd 790 | 791 | A = self.K/self.n + self.alpha_e * np.eye(self.n) 792 | return np.linalg.solve(A, self.genK(x)) \ 793 | * self.es_residual.reshape(-1,1) 794 | 795 | 796 | def huber_loss(u, c=None): 797 | ''' Huber loss ''' 798 | if c is None: 799 | out = 0.5 * u ** 2 800 | else: 801 | out = np.where(abs(u)<=c, 
0.5*u**2, c*abs(u) - 0.5*c**2) 802 | return np.sum(out) 803 | 804 | 805 | def huber_grad(u, c=None): 806 | ''' Gradient of Huber loss ''' 807 | if c is None: 808 | return u 809 | else: 810 | return np.where(abs(u)<=c, u, c*np.sign(u)) 811 | 812 | 813 | 814 | ############################################################################### 815 | ######################### Neural Network Regression ########################### 816 | ############################################################################### 817 | class ANN: 818 | ''' 819 | Artificial Neural Network Regression 820 | ''' 821 | optimizers = ["sgd", "adam"] 822 | activations = ["sigmoid", "tanh", "relu", "leakyrelu"] 823 | params = {'batch_size' : 64, 'val_pct' : .25, 'step_size': 10, 824 | 'activation' : 'relu', 'depth': 4, 'width': 256, 825 | 'optimizer' : 'adam', 'lr': 1e-3, 'lr_decay': 1., 'n_epochs' : 200, 826 | 'dropout_rate': 0.1, 'Lambda': .0, 'weight_decay': .0, 827 | 'momentum': 0.9, 'nesterov': True} 828 | 829 | 830 | def __init__(self, X, Y, normalization=None): 831 | ''' 832 | Args: 833 | X (ndarray): n by p matrix of covariates; 834 | n is the sample size, p is the number of covariates. 835 | Y (ndarray): response/target variable. 836 | normalization (str): method for normalizing covariates; 837 | should be one of [None, 'MinMax', 'Z-score']. 838 | 839 | Attributes: 840 | Y (ndarray): response variable. 841 | X0 (ndarray): normalized covariates. 842 | n (int): sample size. 843 | nm (str): method for normalizing covariates
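For reference, the two normalization options mirror these plain-numpy column-wise transforms (synthetic data assumed; the constructor body below stores the training statistics so that predict() can reuse them on new inputs):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(5, 10, size=(100, 3))

# 'MinMax': rescale each column to [0, 1]
X_mm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# 'Z-score': center and scale each column to mean 0, sd 1
X_z = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_mm.min(), X_mm.max())  # 0.0 1.0
print(np.allclose(X_z.mean(axis=0), 0), np.allclose(X_z.std(axis=0), 1))
```

New inputs passed to predict() are transformed with the stored training statistics (xmin/xmax or xm/xsd), not recomputed from the test data.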
844 | ''' 845 | self.n = X.shape[0] 846 | self.Y = Y.reshape(self.n) 847 | self.nm = normalization 848 | 849 | if self.nm is None: 850 | self.X0 = X 851 | elif self.nm == 'MinMax': 852 | self.xmin = np.min(X, axis=0) 853 | self.xmax = np.max(X, axis=0) 854 | self.X0 = (X - self.xmin)/(self.xmax - self.xmin) 855 | elif self.nm == 'Z-score': 856 | self.xm, self.xsd = np.mean(X, axis=0), np.std(X, axis=0) 857 | self.X0 = (X - self.xm)/self.xsd 858 | 859 | 860 | def plot_losses(self): 861 | fig = plt.figure(figsize=(10, 4)) 862 | plt.style.use('fivethirtyeight') 863 | plt.plot(self.results['losses'], 864 | label='Training Loss', color='C0', linewidth=2) 865 | if self.params['val_pct'] > 0: 866 | plt.plot(self.results['val_losses'], 867 | label='Validation Loss', color='C1', linewidth=2) 868 | plt.yscale('log') 869 | plt.xlabel('Epochs') 870 | plt.ylabel('Loss') 871 | plt.legend() 872 | plt.tight_layout() 873 | return fig 874 | 875 | 876 | def qt(self, tau=0.5, smooth=False, h=0., options=dict(), 877 | plot=False, device='cpu', min_bandwidth=1e-4): 878 | ''' 879 | Fit (smoothed) quantile neural network regression 880 | 881 | Args: 882 | tau (float): quantile level between 0 and 1; default is 0.5. 883 | smooth (boolean): a logical flag for using smoothed check loss; default is FALSE. 884 | h (float): bandwidth for smoothing; default is 0. 885 | options (dictionary): a dictionary of neural network and optimization parameters. 886 | batch_size (int): the number of training examples used in one iteration; 887 | default is 64. 888 | val_pct (float): the proportion of the training data to use for validation; 889 | default is 0.25. 890 | step_size (int): the number of epochs of learning rate decay; default is 10. 891 | activation (string): activation function; default is the ReLU function. 892 | depth (int): the number of hidden layers; default is 4. 893 | width (int): the number of neurons for each layer; default is 256. 
894 | optimizer (string): the optimization algorithm; default is the Adam optimizer. 895 | lr (float): learning rate of SGD or Adam optimization; default is 1e-3. 896 | lr_decay (float): multiplicative factor by which the learning rate will be reduced; 897 | default is 1. 898 | n_epochs (int): the number of training epochs; default is 200. 899 | dropout_rate (float): proportion of the dropout; default is 0.1. 900 | Lambda (float): L_1-regularization parameter; default is 0. 901 | weight_decay (float): weight decay of L2 penalty; default is 0. 902 | momentum (float): momentum acceleration rate for the SGD algorithm; 903 | default is 0.9. 904 | nesterov (boolean): whether to use Nesterov momentum with SGD; 905 | default is TRUE. 906 | plot (boolean): whether to plot loss values at iterations. 907 | device (string): device to run the model on; default is 'cpu'. 908 | min_bandwidth (float): minimum value of the bandwidth; default is 1e-4. 909 | ''' 910 | self.params.update(options) 911 | self.device = device 912 | self.tau = tau 913 | if smooth and h == 0: 914 | h = max(min_bandwidth, 915 | (tau - tau**2)**0.5 / self.n ** (1/3)) 916 | self.h = h if smooth else 0.
# bandwidth for smoothing 917 | self.results = self.trainer(self.X0, self.Y, 918 | QuantLoss(tau, h), 919 | device, 920 | QuantLoss(tau, 0)) 921 | if plot: self.fig = self.plot_losses() 922 | self.model = self.results['model'] 923 | self.fit = self.results['fit'] 924 | 925 | 926 | def es(self, tau=0.5, robust=False, c=None, 927 | qt_fit=None, smooth=False, h=0., 928 | options=dict(), plot=False, device='cpu'): 929 | ''' 930 | Fit (robust) expected shortfall neural network regression 931 | ''' 932 | self.params.update(options) 933 | self.device = device 934 | self.tau = tau 935 | if qt_fit is None: 936 | self.qt(tau=tau, smooth=smooth, h=h, plot=False, device=device) 937 | qt_fit = self.fit 938 | elif len(qt_fit) != self.n: 939 | raise ValueError("Length of qt_fit should be equal to \ 940 | the number of observations.") 941 | qt_nres = np.minimum(self.Y - qt_fit, 0) 942 | if robust == True and c is None: 943 | c = np.std(qt_nres) * (self.n / np.log(self.n))**(1/3) / tau 944 | self.c = c 945 | Z = qt_nres / tau + qt_fit # surrogate response 946 | loss_fn = nn.MSELoss(reduction='mean') if not robust \ 947 | else nn.HuberLoss(reduction='mean', delta=c) 948 | self.results = self.trainer(self.X0, Z, loss_fn, device) 949 | if plot: self.fig = self.plot_losses() 950 | self.model = self.results['model'] 951 | self.fit = self.results['fit'] 952 | 953 | 954 | def mean(self, options=dict(), 955 | robust=False, c=None, s=1., 956 | plot=False, device='cpu'): 957 | ''' 958 | Fit least squares neural network regression 959 | or its robust version with Huber loss 960 | ''' 961 | self.params.update(options) 962 | self.device = device 963 | if robust == True and c is None: 964 | ls_res = self.Y - self.results['fit'] 965 | scale = s * np.std(ls_res) + (1 - s) * mad(ls_res) 966 | c = scale * (self.n / np.log(self.n))**(1/3) 967 | self.c = c 968 | loss_fn = nn.MSELoss(reduction='mean') if not robust \ 969 | else nn.HuberLoss(reduction='mean', delta=c) 970 | self.results = 
self.trainer(self.X0, self.Y, loss_fn, device) 971 | if plot: self.fig = self.plot_losses() 972 | self.model = self.results['model'] 973 | self.fit = self.results['fit'] 974 | 975 | 976 | def predict(self, X): 977 | ''' Compute predicted outcomes at new input X ''' 978 | if self.nm == 'MinMax': 979 | X = (X - self.xmin)/(self.xmax - self.xmin) 980 | elif self.nm == 'Z-score': 981 | X = (X - self.xm)/self.xsd 982 | Xnew = torch.as_tensor(X, dtype=torch.float).to(self.device) 983 | return self.model.predict(Xnew) 984 | 985 | 986 | def trainer(self, x, y, loss_fn, device='cpu', val_fn=None): 987 | ''' 988 | Train an MLP model with given loss function 989 | ''' 990 | input_dim = x.shape[1] 991 | x_tensor = torch.as_tensor(x).float() 992 | y_tensor = torch.as_tensor(y).float() 993 | dataset = TensorDataset(x_tensor, y_tensor) 994 | n_total = len(dataset) 995 | n_val = int(self.params['val_pct'] * n_total) 996 | train_data, val_data = random_split(dataset, [n_total - n_val, n_val]) 997 | 998 | train_loader = DataLoader(train_data, 999 | batch_size=self.params['batch_size'], 1000 | shuffle=True, 1001 | drop_last=True) 1002 | if self.params['val_pct'] > 0: 1003 | val_loader = DataLoader(val_data, 1004 | batch_size=self.params['batch_size'], 1005 | shuffle=False) 1006 | 1007 | # initialize the model 1008 | model = MLP(input_dim, options=self.params).to(device) 1009 | 1010 | # choose the optimizer 1011 | if self.params['optimizer'] == "sgd": 1012 | optimizer = optim.SGD(model.parameters(), 1013 | lr=self.params['lr'], 1014 | weight_decay=self.params['weight_decay'], 1015 | nesterov=self.params['nesterov'], 1016 | momentum=self.params['momentum']) 1017 | elif self.params['optimizer'] == "adam": 1018 | optimizer = optim.Adam(model.parameters(), 1019 | lr=self.params['lr'], 1020 | weight_decay=self.params['weight_decay']) 1021 | else: 1022 | raise Exception(self.params['optimizer'] 1023 | + "is currently not available") 1024 | 1025 | if val_fn is None: val_fn = loss_fn 1026 | 
train_step_fn = make_train_step_fn(model, loss_fn, optimizer) 1027 | val_step_fn = make_val_step_fn(model, val_fn) 1028 | scheduler = optim.lr_scheduler.StepLR(optimizer, 1029 | step_size=self.params['step_size'], 1030 | gamma=self.params['lr_decay']) 1031 | 1032 | losses, val_losses, best_val_loss = [], [], 1e10 1033 | for epoch in range(self.params['n_epochs']): 1034 | loss = mini_batch(device, train_loader, train_step_fn) 1035 | losses.append(loss) 1036 | 1037 | if self.params['val_pct'] > 0: 1038 | with torch.no_grad(): 1039 | val_loss = mini_batch(device, val_loader, val_step_fn) 1040 | val_losses.append(val_loss) 1041 | # save the best model 1042 | if val_losses[epoch] < best_val_loss: 1043 | best_val_loss = val_losses[epoch] 1044 | torch.save({'epoch': epoch+1, 1045 | 'model_state_dict': model.state_dict(), 1046 | 'best_val_loss': best_val_loss, 1047 | 'val_losses': val_losses, 1048 | 'losses': losses}, 'checkpoint.pth') 1049 | # learning rate decay 1050 | scheduler.step() 1051 | else: 1052 | torch.save({'epoch': epoch+1, 1053 | 'model_state_dict': model.state_dict(), 1054 | 'losses': losses}, 'checkpoint.pth') 1055 | 1056 | checkpoint = torch.load('checkpoint.pth') 1057 | final_model = MLP(input_dim, options=self.params).to(device) 1058 | final_model.load_state_dict(checkpoint['model_state_dict']) 1059 | 1060 | return {'model': final_model, 1061 | 'fit': final_model.predict(x_tensor.to(device)), 1062 | 'checkpoint': checkpoint, 1063 | 'losses': losses, 1064 | 'val_losses': val_losses, 1065 | 'total_epochs': epoch+1} 1066 | 1067 | 1068 | class Activation(Enum): 1069 | ''' Activation functions ''' 1070 | relu = nn.ReLU() 1071 | tanh = nn.Tanh() 1072 | sigmoid = nn.Sigmoid() 1073 | leakyrelu = nn.LeakyReLU() 1074 | 1075 | 1076 | class MLP(nn.Module): 1077 | ''' Generate a multi-layer perceptron ''' 1078 | def __init__(self, input_size, options): 1079 | super(MLP, self).__init__() 1080 | 1081 | activation = Activation[options.get('activation', 'relu')].value 
1082 | dropout = options.get('dropout_rate', 0) 1083 | layers = [input_size] + [options['width']] * options['depth'] 1084 | 1085 | nn_structure = [] 1086 | for i in range(len(layers) - 1): 1087 | nn_structure.extend([ 1088 | nn.Linear(layers[i], layers[i + 1]), 1089 | nn.Dropout(dropout) if dropout > 0 else nn.Identity(), 1090 | activation 1091 | ]) 1092 | nn_structure.append(nn.Linear(layers[-1], 1)) 1093 | 1094 | self.fc_in = nn.Sequential(*nn_structure) 1095 | 1096 | def forward(self, x): 1097 | return self.fc_in(x) 1098 | 1099 | def predict(self, X): 1100 | with torch.no_grad(): 1101 | self.eval() 1102 | yhat = self.forward(X)[:, 0] 1103 | return yhat.cpu().numpy() 1104 | 1105 | 1106 | ############################################################################### 1107 | ########################### Helper Functions ############################ 1108 | ############################################################################### 1109 | def mad(x): 1110 | ''' Median absolute deviation ''' 1111 | return 1.4826 * np.median(np.abs(x - np.median(x))) 1112 | 1113 | 1114 | def find_root(f, tmin, tmax, tol=1e-5): 1115 | while tmax - tmin > tol: 1116 | tau = (tmin + tmax) / 2 1117 | if f(tau) > 0: 1118 | tmin = tau 1119 | else: 1120 | tmax = tau 1121 | return tau 1122 | 1123 | 1124 | def G2(G2_type=1): 1125 | ''' 1126 | Specification Function G2 in Fissler and Ziegel's Joint Loss 1127 | ''' 1128 | if G2_type == 1: 1129 | f0 = lambda x : -np.sqrt(-x) 1130 | f1 = lambda x : 0.5 / np.sqrt(-x) 1131 | f2 = lambda x : 0.25 / np.sqrt((-x)**3) 1132 | elif G2_type == 2: 1133 | f0 = lambda x : -np.log(-x) 1134 | f1 = lambda x : -1 / x 1135 | f2 = lambda x : 1 / x ** 2 1136 | elif G2_type == 3: 1137 | f0 = lambda x : -1 / x 1138 | f1 = lambda x : 1 / x ** 2 1139 | f2 = lambda x : -2 / x ** 3 1140 | elif G2_type == 4: 1141 | f0 = lambda x : np.log( 1 + np.exp(x)) 1142 | f1 = lambda x : np.exp(x) / (1 + np.exp(x)) 1143 | f2 = lambda x : np.exp(x) / (1 + np.exp(x)) ** 2 1144 | elif 
G2_type == 5: 1145 | f0 = lambda x : np.exp(x) 1146 | f1 = lambda x : np.exp(x) 1147 | f2 = lambda x : np.exp(x) 1148 | else: 1149 | raise ValueError("G2_type must be an integer between 1 and 5") 1150 | return f0, f1, f2 1151 | 1152 | 1153 | def make_train_step_fn(model, loss_fn, optimizer): 1154 | ''' 1155 | Builds function that performs a step in the training loop 1156 | ''' 1157 | def perform_train_step_fn(x, y): 1158 | model.train() 1159 | yhat = model(x) 1160 | loss = loss_fn(yhat, y.view_as(yhat)) 1161 | loss.backward() 1162 | optimizer.step() 1163 | optimizer.zero_grad() 1164 | return loss.item() 1165 | return perform_train_step_fn 1166 | 1167 | 1168 | def make_val_step_fn(model, loss_fn): 1169 | def perform_val_step_fn(x, y): 1170 | model.eval() 1171 | yhat = model(x) 1172 | loss = loss_fn(yhat, y.view_as(yhat)) 1173 | return loss.item() 1174 | return perform_val_step_fn 1175 | 1176 | 1177 | def mini_batch(device, data_loader, step_fn): 1178 | mini_batch_losses = [] 1179 | for x_batch, y_batch in data_loader: 1180 | x_batch = x_batch.to(device) 1181 | y_batch = y_batch.to(device) 1182 | mini_batch_loss = step_fn(x_batch, y_batch) 1183 | mini_batch_losses.append(mini_batch_loss) 1184 | return np.mean(mini_batch_losses) 1185 | 1186 | 1187 | def QuantLoss(tau=.5, h=.0): 1188 | def loss(y_pred, y): 1189 | z = y - y_pred 1190 | if h == 0: 1191 | return torch.max((tau - 1) * z, tau * z).mean() 1192 | else: 1193 | tmp = .5 * h * torch.sqrt(2/torch.tensor(np.pi)) 1194 | return torch.add((tau - Normal(0, 1).cdf(-z/h)) * z, 1195 | tmp * torch.exp(-(z/h)**2/2)).mean() 1196 | return loss -------------------------------------------------------------------------------- /experiments/highd_inference.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "id": "3b6f2e40-8e31-4a48-9c39-7b98c4031cf1", 7 | "metadata": {}, 8 | "outputs": [ 9 | { 10 | "name": 
"stdout", 11 | "output_type": "stream", 12 | "text": [ 13 | "Number of CPUs: 28\n" 14 | ] 15 | } 16 | ], 17 | "source": [ 18 | "import numpy as np\n", 19 | "import pandas as pd\n", 20 | "import numpy.random as rgt\n", 21 | "import matplotlib.pyplot as plt\n", 22 | "plt.style.use('fivethirtyeight')\n", 23 | "import scipy.stats as spstat\n", 24 | "import time\n", 25 | "from conquer.linear import high_dim\n", 26 | "\n", 27 | "import multiprocessing\n", 28 | "print('Number of CPUs:', multiprocessing.cpu_count())\n", 29 | "rgt.seed(42)\n", 30 | "\n", 31 | "def cov_generate(std, corr=0.5):\n", 32 | " p = len(std)\n", 33 | " R = np.zeros(shape=[p,p])\n", 34 | " for j in range(p-1):\n", 35 | " R[j, j+1:] = np.array(range(1, len(R[j,j+1:])+1))\n", 36 | " R += R.T\n", 37 | " return np.outer(std, std)*(corr*np.ones(shape=[p,p]))**R" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "id": "d666a784-e1e4-4650-b29f-e44985b916f1", 43 | "metadata": {}, 44 | "source": [ 45 | "### Model selection and post-selection inference via bootstrap for high-dimensional quantile regression" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": 2, 51 | "id": "b094e889-222b-4eaf-91c1-9a8dc24b02d4", 52 | "metadata": {}, 53 | "outputs": [ 54 | { 55 | "name": "stdout", 56 | "output_type": "stream", 57 | "text": [ 58 | "true model: [ 1 3 6 9 12 15 18 21]\n" 59 | ] 60 | } 61 | ], 62 | "source": [ 63 | "n = 256\n", 64 | "p = 1028\n", 65 | "s = 8\n", 66 | "tau = 0.8\n", 67 | " \n", 68 | "Mu, Sig = np.zeros(p), cov_generate(rgt.uniform(1,2,size=p))\n", 69 | "beta = np.zeros(p)\n", 70 | "beta[:21] = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, -1, 0, 0, -1, 0, 0, -1]\n", 71 | "true_set = np.where(beta!=0)[0]\n", 72 | "print('true model:', true_set+1)\n", 73 | "\n", 74 | "def boot_sim(m, itcp=True, parallel=False, ncore=0):\n", 75 | " rgt.seed(m)\n", 76 | " X = rgt.multivariate_normal(mean=Mu, cov=Sig, size=n)\n", 77 | " Y = 4 + X@beta + rgt.standard_t(2,n) - spstat.t.ppf(tau, 
df=2)\n", 78 | " \n", 79 | " sqr = high_dim(X, Y, intercept=itcp)\n", 80 | " lambda_sim = 0.75*np.quantile(sqr.self_tuning(tau), 0.9)\n", 81 | "\n", 82 | " boot_model = sqr.boot_inference(tau=tau, Lambda=lambda_sim, weight=\"Multinomial\",\n", 83 | " parallel=parallel, ncore=ncore)\n", 84 | " \n", 85 | " per_ci = boot_model['percentile_ci']\n", 86 | " piv_ci = boot_model['pivotal_ci']\n", 87 | " norm_ci = boot_model['normal_ci']\n", 88 | " \n", 89 | " est_set = np.where(boot_model['boot_beta'][itcp:,0]!=0)[0]\n", 90 | " mb_set = boot_model['majority_vote']\n", 91 | " tp = len(np.intersect1d(true_set, est_set))\n", 92 | " fp = len(np.setdiff1d(est_set, true_set))\n", 93 | " mb_tp = len(np.intersect1d(true_set, mb_set))\n", 94 | " mb_fp = len(np.setdiff1d(mb_set, true_set))\n", 95 | " \n", 96 | " ci_cover = np.c_[(beta>=per_ci[1:,0])*(beta<=per_ci[1:,1]), \n", 97 | " (beta>=piv_ci[1:,0])*(beta<=piv_ci[1:,1]),\n", 98 | " (beta>=norm_ci[1:,0])*(beta<=norm_ci[1:,1])]\n", 99 | " ci_width = np.c_[per_ci[1:,1] - per_ci[1:,0],\n", 100 | " piv_ci[1:,1] - piv_ci[1:,0],\n", 101 | " norm_ci[1:,1] - norm_ci[1:,0]] \n", 102 | " return {'tp': tp, 'fp': fp, 'mb_tp': mb_tp, 'mb_fp': mb_fp, \n", 103 | " 'ci_cover': ci_cover, 'ci_width': ci_width}" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": 3, 109 | "id": "35ed41c7-48be-4dbf-a071-9b3d02cd39ac", 110 | "metadata": {}, 111 | "outputs": [ 112 | { 113 | "name": "stdout", 114 | "output_type": "stream", 115 | "text": [ 116 | "10/200 repetitions\n", 117 | "20/200 repetitions\n", 118 | "30/200 repetitions\n", 119 | "40/200 repetitions\n", 120 | "50/200 repetitions\n", 121 | "60/200 repetitions\n", 122 | "70/200 repetitions\n", 123 | "80/200 repetitions\n", 124 | "90/200 repetitions\n", 125 | "100/200 repetitions\n", 126 | "110/200 repetitions\n", 127 | "120/200 repetitions\n", 128 | "130/200 repetitions\n", 129 | "140/200 repetitions\n", 130 | "150/200 repetitions\n", 131 | "160/200 repetitions\n", 132 | 
"170/200 repetitions\n", 133 | "180/200 repetitions\n", 134 | "190/200 repetitions\n", 135 | "200/200 repetitions\n" 136 | ] 137 | } 138 | ], 139 | "source": [ 140 | "ci_cover, ci_width = np.zeros([p, 3]), np.zeros([p, 3])\n", 141 | "M = 200\n", 142 | "# true and false positives\n", 143 | "tp, fp = np.zeros(M), np.zeros(M)b\n", 144 | "mb_tp, mb_fp = np.zeros(M), np.zeros(M)\n", 145 | "\n", 146 | "runtime = []\n", 147 | "for m in range(M):\n", 148 | " tic = time.time() \n", 149 | " out = boot_sim(m, parallel=True, ncore=20)\n", 150 | " runtime.append(time.time()-tic)\n", 151 | " \n", 152 | " tp[m], fp[m], mb_tp[m], mb_fp[m] = out['tp'], out['fp'], out['mb_tp'], out['mb_fp']\n", 153 | " ci_cover += out['ci_cover']\n", 154 | " ci_width += out['ci_width']\n", 155 | " \n", 156 | " if (m+1)%10 == 0: print(f\"{m+1}/{M} repetitions\")" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 4, 162 | "id": "0bb4171c", 163 | "metadata": {}, 164 | "outputs": [ 165 | { 166 | "name": "stdout", 167 | "output_type": "stream", 168 | "text": [ 169 | " percentile pivotal normal percentile pivotal normal\n", 170 | "1 0.424142 0.424142 0.431566 0.970 0.960 0.980\n", 171 | "3 0.332654 0.332654 0.339613 0.935 0.875 0.920\n", 172 | "6 0.483735 0.483735 0.493433 0.990 0.960 0.985\n", 173 | "9 0.356037 0.356037 0.360269 0.965 0.925 0.955\n", 174 | "12 0.289431 0.289431 0.295586 0.975 0.925 0.970\n", 175 | "15 0.475759 0.475759 0.485569 0.985 0.915 0.960\n", 176 | "18 0.372422 0.372422 0.378631 0.990 0.930 0.980\n", 177 | "21 0.354238 0.354238 0.359127 0.955 0.915 0.960 \n", 178 | "true model: [ 1 3 6 9 12 15 18 21] \n", 179 | "average runtime: 8.917688760757446 \n", 180 | "true positive: 8.0 \n", 181 | "false positive: 1.435 \n", 182 | "VSC prob: 0.26 \n", 183 | "true pos after boot: 8.0 \n", 184 | "false pos after boot: 0.085 \n", 185 | "VSC prob after boot: 0.925\n" 186 | ] 187 | } 188 | ], 189 | "source": [ 190 | "cover = pd.DataFrame(ci_cover/M, 
columns=['percentile', 'pivotal', 'normal'])\n", 191 | "width = pd.DataFrame(ci_width/M, columns=['percentile', 'pivotal', 'normal'])\n", 192 | "\n", 193 | "boot_out = pd.concat([width.iloc[true_set,:], cover.iloc[true_set,:]], axis=1)\n", 194 | "boot_out.index = boot_out.index + 1\n", 195 | "print(boot_out,\n", 196 | " '\\ntrue model:', true_set+1,\n", 197 | " '\\naverage runtime:', np.mean(runtime),\n", 198 | " '\\ntrue positive:', np.mean(tp), \n", 199 | " '\\nfalse positive:', np.mean(fp),\n", 200 | " '\\nVSC prob:', np.mean((tp==8)*(fp==0)), \n", 201 | " '\\ntrue pos after boot:', np.mean(mb_tp),\n", 202 | " '\\nfalse pos after boot:', np.mean(mb_fp),\n", 203 | " '\\nVSC prob after boot:', np.mean((mb_tp==8)*(mb_fp==0)))\n", 204 | "# VSC: variable selection consistency" 205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "execution_count": 5, 210 | "id": "0c494ab0-3b0d-4770-ba4f-36a3d68dceef", 211 | "metadata": {}, 212 | "outputs": [ 213 | { 214 | "data": { 215 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAl4AAAHOCAYAAABJrwdYAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABJ60lEQVR4nO3de1xUdf7H8TeiyArIuAqEKYqom4mKZauhlppaial5W7OwVh9Wkoq6u5ps7eNnFoaVRnnJzFJRW81d0/CK613QVSuF9Rpo4oqYF0AxRIHfHzyYdeQiA8MZwNfz8fARzfmeM5/5nmF4z/d8zzkOaWlpeQIAAECFq2HvAgAAAO4XBC8AAACDELwAAAAMQvACAAAwCMELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAEL9jE8uXLZTKZZDKZ9PPPP9u7HFQjOTk5WrBggXr27CkfHx/Vq1dPJpNJQUFB9i6tVIKCgqpUvdWNrfr/559/Nn/GLV++3EbV4X5E8LoP7d692/wBMmPGjFKtM2PGDPM6u3fvruAKgf8ZNWqUpkyZooMHDyojI0N5edbf5ezOP5ql+TdmzJgKeCVVU0l95+npqd/97ncaMGCA5s6dq/T0dHuXC1R6BC9UetaGRFQf+/fv17fffitJ6tmzp9asWaO9e/cqNjZWc+fOtW9xUHZ2tlJTU7Vjxw799a9/VadOnfTvf//b3mVZhc8XGK2mvQtA9fDiiy/qxRdftHcZqGZ27NghSXJ0dNSiRYvk7u5e7m326dNHb731VoltTCZTuZ+nOrq7727evKmTJ0/qiy++0IEDB5SSkqKhQ4dq3759euCBB+xY6f+sX7/eJttp0qSJ0tLSbLIt3N8IXgAqrZSUFEmSp6enTUKXJLm7u+vhhx+2ybbuN0X1Xfv27TV48GCNGDFC69evV1pamubMmaN3333XTlUClRuHGgFUWjdv3pQk1azJd8TKzNHRUdOnTzf//9atW+1YDVC5EbxgE6U5q/Hw4cMaP368HnvsMT344IPy9PRUq1at1LVrV40bN05r1qwx/6GVpDZt2lgc8omIiCj1JOitW7fqj3/8o1q3bi0vLy81adJETz75pN577z1dvnz5nq/n9u3b+uyzz9SjRw81btxYPj4+6tatm+bOnavs7Ox7nuE0ZswYmUwmtWnTRpKUmpqqd955R48//rh8fHxkMpkUHR1tbp+WlqZly5bp1VdfVceOHfXggw/Kw8NDLVu21MCBA7V48WJlZ2cXW29R9axbt07PP/+8mjdvroYNG6pz585asGCBbt26ZV4vLy9P33zzjYKCgtS8eXN5e3ura9euWrRokXJzc+/ZT6Vx4MABjRkzRu3atZO3t7caN26sxx9/XGFhYUpOTi5ynYLX8vXXX0uSkpOTC+17I86ezc7O1saNG/WXv/xF3bt3V5MmTdSgQQP5+vrqqaee0owZM0r1fipJbm6uvv76aw0ZMkQPPfSQPDw81LhxYwUEBOjZZ5/Ve++9px9++KHEbfz444+aOHGiHnvsMTVq1Eje3t5q3769xo0bp/j4+HLVV1rNmjXTb3/7W0kqcr/eunVLixcvVv/+/dWiRQt5eHioefPm6tevn7788kuL92VRUlNTNX36dHXr1k0+Pj5q0KCBmjdvrk6dOik4OFhfffWVLl26VGi94s5qtPbzpbjf+Rs3bqhRo0YymUwKDg6+Zz+dPXvWfGbuX//61yLb5OTkaMWKFfrDH/6gVq1aydPTU02bNlXPnj310UcfKSMjo8TnSEpK0ptvvqnAwEA1atRIHh4e+t3vfqfAwECNHj1aX3/9ta5du3bPWlEx+BoJQ3z22WcKCwsr9Mc8JSVFKSkpio+PV1RUlP7973+rZcuWZX6emzdv6vXXX9eaNWsKPX748GEdPnxYn3/+uZYsWaJu3boVuY2MjAwNGjRIBw4csHj8xx9/1I8//qh//vOfmj17dqlrOnjwoIYNG1bkH4UCXbt
2LfKP1cWLF7Vt2zZt27ZNX375pb755ht5eXnd8zn/9Kc/adGiRRaP/ec//9GUKVO0Z88eLV68WLdv39arr76qtWvXWrSLj4/Xn/70Jx0+fFiffPJJKV9lYXl5eQoLC9P8+fMLLTt27JiOHTumL7/8UpGRkfrDH/5Q5uepSKGhoebwd6erV6/q0KFDOnTokBYuXKgVK1aoU6dOVm//+vXrGjZsmPbs2WPx+K1bt3Tt2jWdOXNGcXFx2r59e5GjSDk5OZo6daoWLlxY6GzP06dP6/Tp01q2bJmmTp2qyZMnW12ftWrVqmWu607nzp3T0KFDdfToUYvHL126pF27dmnXrl36/PPPtWrVKvn4+BTa7r59+/SHP/yh0FmTly5d0qVLl3T8+HF99913ysvL08iRI238qkpWp04d9enTR6tWrdKWLVuUlpZW4hzB1atXm/fVkCFDCi0/c+aMhg8fXqivsrOzdfDgQR08eND8nnvkkUcKrb927Vq9+uqrFl9ipfzgmpqaqqNHj+qbb76Rh4eHevbsWYZXjPIieKHCJSQkmEOXj4+PRo8erbZt26pevXq6ceOGEhMTtXfvXm3YsMFivTVr1ig7O1uBgYGS8i8rMGrUKIs2d3/AvfHGG+bQ9dBDD2ns2LFq3bq1MjIytH79en355ZdKT0/X0KFDFRMTo3bt2hWqd9SoUebQ9dhjj+n111+Xn5+fLl26pFWrVmnVqlWaNGlSqV57ZmamRowYoV9//VWTJk1S9+7d5erqqhMnTlj8gcnNzVWHDh309NNPq23btvL09DSPrK1atUpbt27VkSNHNHLkyHtOFv7qq6908OBB9e7dW8HBwWrcuLH++9//avbs2Tp48KC+++47LV++XAkJCVq7dq2GDBmiwYMHy8vLS0lJSXr//fd18uRJLV26VP369Svzh/P06dPNoevBBx/UhAkT9Mgjj+jmzZvatm2b5s6dq19//VWvv/66TCaTnn76afO6sbGxkqR3331XGzZskLe3t/7xj39YbL9hw4ZlqssaOTk5atq0qfr27atHH31UjRo1Us2aNXX27Fnt3LlTy5Yt05UrV/TSSy8pLi5OHh4eVm0/IiLCHLp69eqloUOHqnHjxvrNb36jy5cv6z//+Y9iYmKUlZVV5Prjx483j7506NBBI0aMUNOmTVW3bl0dP35cX3zxhQ4ePKjw8HDVq1dPo0ePLl+HlOCXX37RxYsXJcliYn1mZqb69++vxMRE8+t8+eWX1ahRI/33v//V0qVLtXnzZh0/flz9+vXT7t275ebmZl4/OztbI0eOVHp6ulxdXfXKK6/oySeflIeHh27fvq3k5GQdPHjQ6kn0Zfl8Kc4f/vAHrVq1Sjdv3tS6des0YsSIYtt+8803kqTf/e53CggIsFiWmpqqZ555RhcuXFCtWrX04osv6sknn5SPj49u3rypPXv2aP78+bpw4YIGDx6snTt3qnHjxub1L168qJCQEN28eVMNGjTQqFGj1LFjR9WvX183b97UmTNntH//fpudcICyIXjd5y5dulTom1Vx7cpq7dq1ys3NlYuLi2JiYgqN2HTq1Ekvvviibty4oRo1/nf0u3nz5hbtGjRoUOKk6JiYGK1evVqS1LFjR3377bf6zW9+Y17+5JNPqkePHho+fLiys7M1fvx47dy502Ib69evV0xMjCTp6aef1ooVK+To6Ghe3rNnT7Vp00Zvv/12qV77lStXVKdOHW3YsMHiQ7Z9+/YW7datWyc/P79C63fs2FFDhw7VsmXLNHbsWO3du1c7d+7Uk08+WexzHjx4UGPGjLE4PT4gIEDdu3dXx44dlZycrGnTpunKlSuaMWOGxeGUgIAAde7cWR06dNC1a9e0aNGiMgWvY8eO6eOPP5Yk+fn5acuWLapfv755eWBgoPr06aO+ffvqxo0bCg0N1eHDh1W7dm1JMu/nggn1NWvWtNmE+PT09BLf83Xq1FHTpk0lSVOnTlXTpk3l4OBg0aZ
9+/bq37+/Ro0apaefflqXLl3SggUL7nm25N0KviQ899xzioqKKrS8R48eGjdunK5cuVJoWUGAlvID3GuvvWaxPCAgQEOGDNFrr72m1atX65133tGQIUMq7IzNjz76yDyS07VrV/PjM2fONIeukJAQhYeHW9QYFBSkv/3tb/rkk0905swZvf/++3rvvffMbeLi4nT+/HlJ0sKFC/Xss89aPG+HDh30/PPP691337XqOmLWfr6UpFu3bvLy8lJqaqq++eabYoNXfHy8jh07JkkaOnRooeUTJkzQhQsX5O3trXXr1qlFixYWywMDAzV06FD17t1bFy9e1PTp0/X555+bl2/evFmZmZmS8j93W7dubbH+73//ew0dOlTvv/9+sWEeFY85Xve5RYsWKTAw8J7/7j5sZY2Cb8F+fn4lHiarU6eOnJ2dy/w8CxculCTVqFFD8+fPtwhdBZ555hkNHz5cUv6cs3379lks/+qrryRJTk5OioyMtAhdBcaOHVvkSFlxxo8fX+ib7d2KCl13eumll9S2bVtJspgbVpRGjRrpnXfeKfR4nTp19MILL0iSLl++rA4dOhQ5R87Ly8s8HyYuLq7E5yrOnXPEZs+ebRG6CjzyyCOaMGGCJOnChQuFDnlWlA0bNpT4Xn/jjTfMbX19fQuFrju1bt3aPK/n7hHb0khNTZUkde7cucR2BXOn7jRr1ixJUu/evQuFrgKOjo768MMPVbt2bV27ds3mfZydna34+HiNHTtWn332maT8kBwSEmJevmTJEkn5fTlt2rQit/O3v/3NHDKioqL066+/mpcVfH5IJfeTg4OD3S4D4ujoqIEDB0qS9u7dq//+979FtisY7XJwcCh0mPHYsWPauHGjpPwLVt8dugo0bdrUfNh4zZo1unHjhnlZQV+ZTKZCoetOtWrVshhVhLEIXqhwBYcdTpw4oUOHDlXIc9y+fdt8yKZLly5q1qxZsW1feeUV88/bt2+32MbevXsl5Y+OFXcdIgcHB6vmJFk7fykvL0+pqan66aefdPToUfM/b29vSfmHbkvSt29f81ybu/n7+5t/fv7554vdRkG7tLS0Ml27qKBfmzZtqieeeKLYdi+//HKhdSqztLQ0nT59WseOHTPvl4JRuePHj99zgvjdCt5jd/8BvZeUlBTzhPv+/fuX2NZkMqlVq1aSVO6Lm3799deFrlzftWtXLVu2TFL+F5a5c+ean++HH34wv39eeOGFYt+XNWvWNF8HMCMjQwcPHjQvu/P3sDLfqqfg9zw3N7fQYfG7H+/UqVOhuWwFh/9q1aqlPn36lPhcBYdHb926ZXHiRUFfpaWlcTixEuNQ431uypQpmjp16j3bzZgxQxEREWV6jsGDB2v27Nm6efOmnn76aT311FPq1auXOnXqpIcfftji8GJZnTlzxvyH67HHHiuxbbt27VSrVi3dunXL4pDT6dOnzd+07zWidfehwuK4urrK19e3VG03bdqkL7/8UnFxcSWecVTUYac73X0I5U53XgurtO2uX79u1UjCzZs3zYeW7rUvvLy85OPjo7Nnz5bqkLctvPDCC0VO+C/Of/7zH82dO1dbt261GH25W25urtLS0qya5zV8+HBFRERo//79ateunfr376+uXbuqU6dOJY4Of//99+af33jjDYtRupKUVH95eHp6qmfPnho3bpw5dEkyH1aT7v1e6NChg/nno0ePmg9XdurUSc2aNVNSUpKmTp2qVatWKSgoSIGBgXrkkUfKNUpuSwEBAfrd736nEydOaNWqVRo/frzF8j179phHwoo6zFgQoG7duiVPT89SP++d+7RPnz4ymUxKS0vTSy+9pM6dO+uZZ55RYGCg2rZty2VZKgn2AipcixYt9NVXX5nnqmzevFmbN2+WlP9tvFu3bnrppZfKdYbN1atXzT83aNCgxLa1atXSb3/7W6Wmplqsd+fIzr22ca/lBUpz0c+8vDyNGzfOPGpwL3cehilKUYdYC9wZckv
b7u4z1O7Fmn6U8sPX2bNnLfZFZbF06VJNmjRJt2/fLlX7e+2bu/3lL39RamqqlixZol9++UVffPGFvvjiC0n5vzd9+vTRqFGjCo2OlHXOpTWjakW5+8r1Tk5OMplMxe5na34v7wyad65Xq1Yt/f3vf9crr7yio0eP6ocffjCHlNq1a+v3v/+9hgwZomHDhsnJyalMr8tWhgwZonfffVcJCQk6duyYRQgtOMzo5OSkAQMGFFrXFvu0Xr16WrlypUaNGqVz585pz5495iMBLi4u6tKli4YNG6b+/fvb5AsvyobgBUMEBQXpiSee0Lfffqt//etfiouLU2pqqtLS0vTtt9/q22+/Ve/evbVkyZISA0FplDQnp8C9brRcmm2URmk+3KKiosyhq02bNhozZow6dOggb29v1alTxzzP7LXXXtPKlSvLdJNoe7HFvrCXkydPmkOXh4eHxo8fr65du6pJkyZydXU1HzaLiorSuHHjJFn/WmrWrKmPP/5Yb7zxhlavXq1du3bphx9+UFZWlk6dOqXIyEjNnz9fH3zwgcVh2TvD8Pz580s957BOnTpW1Xe38lz1/17vhZL6rmXLltq9e7diYmK0YcMGxcXF6dSpU7p586Z2796t3bt365NPPtGqVatKnGZQ0YYOHar33nvPfH28v/3tb5JkPttRyj+rs169eoXWLdinDz74oDmklcbdZ/d27NhRhw4d0vr167Vp0ybFxcUpOTlZmZmZ5i+9jz76qFauXFnqL5CwLYIXDOPm5qbg4GDzZOTExERt2rRJCxcu1JkzZ7RlyxZNnz7d4qyn0rrzg+yXX34pse2tW7fM36jvXO/Ow2n32kZ5zvK829KlSyXlX4Byy5YtxQbPqnKfOGv6UfrfoZKi/hjZ04oVK3T79m05Ojpq/fr1xV5fzhb7pUWLFpo6daqmTp2qmzdv6sCBA/r2228VFRWlmzdvauLEiWrfvr35BIs7T1bIy8urtLdAsub38s5DZkW9FxwdHfXMM8/omWeeMW9v+/bt+uqrrxQXF6effvpJf/zjHwudqWwkHx8fderUSXFxcfrmm2/09ttvy8HBQZs3bzafcVnUYUbpf/v00qVLat68eblG72rXrq2BAweaJ/yfO3dOW7du1RdffKGEhAQdOnRIEyZMKPUoO2yLsUbYjZ+fn9544w3t2LHDPKfh22+/LdO2mjZtav42f+fE3KIcOXLEPAn6zj9Yvr6+5vkihw8fLnEb97qSuDWOHz8uSXr22WeLDV15eXn3rKmyqF27tvkszXudTHHx4kWdPXtWkipdeCiYn+Tv71/iRX1t+V6Q8vuvS5cu+vDDDzV37lxJ+fPHCkZMJJkDmCT961//sunz29Kdh9ru9Xt553ulNO8FDw8PDR06VBs2bNBTTz0lKf/3NikpqYzV2kZBsEpOTjafFbxq1SpJUt26dS2uV3engrtcFIzi2VKjRo30yiuvaNu2bea+3bRpk9WHxmEbBC/YnclkMh8qKer2KwVhqKRb5tSsWVNdunSRlD+J9cyZM8W2LTi9XZK6d+9usY2Cs4V27typCxcuFLl+Xl6eVq5cWez2rVUwf6ik+Tfr168vtp7KqKBfk5KSzGeKFqVgtO/OdSqLgkM/Je2XCxcumC8BUBHuvF7bnb8bvr6+5j+g69ats3vYKE779u3NI6ArV64sdq7c7du3zWcs1q1b12Ki/b04ODhYnDlr7S2cSvP5Yo3nn3/ePFr1zTffKC0tzXxtwP79+xd7MkDfvn3NP3/88ccVcgjeyclJjz/+uKT8Pr/XrYdQMQheqHDfffddiYdjrl69qh9//FGS1KRJk0LLCybdnj59usTnKbgqd05Ojt54441Ct8yQpC1btpiH19u1a1foNi8Fl5rIzs5WaGhokRPL58yZY9PRp4I5KZs2bSpygvnp06f1l7/8xWbPZ4RRo0aZ57dNmjSpyP3/448/mm+99MADD9zzsghGK9gviYmJ2r9/f6HlN27c0Oj
Ro8s8anD16lVt2LChxD+w27ZtM/989+9GwbWcbt26pZdeekkpKSnFbicnJ0erVq0q9vpSFcXJyck8N+2nn36yuDDqnd577z2dPHlSkhQcHGwx8hsbG2s+S7Youbm55sOLDg4ORd5yqCSl/XwpLZPJpF69eknKH8FfvXq1+bOouMOMUv5Zkb1795Yk7d69W2+//XaJ742LFy9afHGR8u9RW9L7ICsryzwK5+bmVuT19VDxmOOFCvfZZ5/p1VdfVa9evfTEE0+oZcuWMplMysjIUEJCghYuXGie/3H3LTuk/MmiP//8szZu3KivvvpKHTt2NH9rdHNzM5++36tXLw0ePFirV6/W3r171b17d40dO1YPP/ywMjIytGHDBn3xxRfKzc2Vk5NTkfch7Nevn3r06KFt27Zp8+bNeuaZZxQSEiJfX19dvnxZK1eu1KpVq/Too4+aD42UdyL+Cy+8oLffflspKSnq1auXQkND9fDDDysrK0u7du3S/PnzlZ2drXbt2lWZw42tWrXShAkTNGvWLJ04cUJdu3bVhAkT1L59e4tbBt24cUMODg6KjIw0X7W+shg2bJg+//xz5ebmaujQoQoNDTW/93788UfNmzdPiYmJ6tSpU6EL8ZZGRkaGhg8frkaNGum5555Thw4d5OPjo1q1aumXX37Rv/71L/MFfd3c3Ar90R4wYIBeeeUVLV68WEePHlWnTp30yiuv6IknnpCHh4eysrJ09uxZ/fvf/9a6det04cIFxcbG6sEHH7RJ/5TW5MmTFR0drcTERM2ePVvHjh3TiBEj1LBhQ50/f15Lly7Vpk2bJOVPGXjzzTct1t+5c6c++OADderUSb1795a/v78aNGig7OxsnTlzRlFRUeZDc3379i3VvUzvVNrPF2sMHTpU69ev19WrVzV9+nRJ+ZPm73Wh3Dlz5qhHjx46d+6c5syZo127dmnEiBHy9/fXb37zG6Wnp+vYsWPauXOntm7dqocfftjiKvn/+Mc/9MILL5jv0vHwww+bb8126tQpLVq0yHzZlhEjRnB5CTuh12GIX3/9VevWrbOYp3K31157Ta+++mqhx8eOHau1a9eaJxnf6e5rMs2dO1c5OTlas2aNjh49ar6C9p3c3d21ZMmSYs8E+/LLLzVo0CAdOnRIBw4c0B//+EeL5W3bttVHH31kvsl2ea8j9Prrr2v79u3atm2bfvrpJ/MZcgV+85vf6LPPPtPmzZurTPCSpLfffls3btzQZ599puTkZP3pT38q1MbZ2VmRkZHFznuxp0ceeURTp07VjBkzlJ6eXuTdAMaOHatWrVqVKXgVOHfuXInXFTOZTFqyZEmR96acNWuWPDw8NHv2bKWnpysyMlKRkZFFbsfJycku17xycXHR2rVrzTfJ3rRpkzlo3emhhx7SqlWriryiem5urmJjY8338CxK586d9emnn1pdnzWfL6X1zDPPyN3dXenp6eZJ9YMHD77nWc6enp7asmWLRo4cqX379unIkSP685//XGz7ovrq1q1b2rp1a5E3VS8wYMAA8xmXMB7BCxVu8eLF2rFjh3bs2KH4+HhdvHhRly9fVq1atdSoUSN17NhRI0aMKPYCi23bttWWLVv0ySefaP/+/bp48WKx8zFq166tr776Si+++KKWLVumAwcO6JdffpGzs7OaNm2q3r17a8yYMSUOsZtMJm3atElffPGFVq5cqZ9++kkODg5q2rSpBg4cqDFjxpgPi0j5c1LKo1atWlq1apUWLVqkv//97zpx4oTy8vLk7e2tbt266fXXX1fLli3N1z6rKhwcHPT+++9r0KBBWrRokWJjY3Xx4kXVrFlTjRs3Vvfu3TVmzBiLm/xWNlOmTFH79u312Wef6fvvv9eNGzfk4eGhRx55RCNHjlT37t3LfDV1Hx8f7d69Wzt27NDu3bt15swZXbx4UdevX5ebm5tatmypnj17auTIkcW+X2vUqKG//vWvevHFF7V48WLt3LlTP//8szI
yMuTs7Cxvb2+1bt1a3bp103PPPWe3Q0uNGjUy31T822+/1X/+8x+lp6erbt26at26tfr3768RI0YUeWX70NBQdejQQTt27NCBAweUkpKiX375RXl5efLw8FBAQIAGDRqkAQMGlGn02ZrPl9KqXbu2BgwYYDGftKTDjHdq2LChNm3apM2bN+sf//iHDhw4oIsXL+rmzZuqW7eufH199eijj6p3796F5kW+//776tOnj3bu3Knvv/9eqamp+uWXX+To6KgHHnhAHTp00LBhw8wnI8A+HNLS0irnRXSASmzlypXm++N9//33dr12EACg6mByPVAGBfdcq1+/fqlvCQQAAMELuEtKSkqJlxBYunSptmzZIin/xri2uso9AKD6Y44XcJddu3YpLCxMAwcOVJcuXdSkSRPl5ubq9OnTWrNmjaKjoyXl33tu0qRJdq4WAFCVELyAIly+fFkLFy7UwoULi1zu5eXFvc4AAFZjcj1wlytXrmjt2rXaunWrTpw4oUuXLun69etyd3dXy5Yt9cwzz2jkyJFFnsoNAEBJCF4AAAAGYXI9AACAQQheAAAABrnvg1dWVpaSkpKUlZVl71KqBfrT9uhT26I/bYv+tD361LYqW3/e98FLknJycuxdQrVCf9oefWpb9Kdt0Z+2R5/aVmXqT4IXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBrA5ey5cvl8lkKvFfv379LNbJyMhQWFiY/P395enpKX9/f4WFhSkjI8NmLwQAAKCyq2ntCm3atNGUKVOKXLZu3TodO3ZMTz31lPmxzMxMBQUFKT4+Xt27d9fgwYOVkJCgefPmaffu3dq0aZNcXFzK/goAAACqCKuDV9u2bdW2bdtCj2dnZ2vhwoWqWbOmXnjhBfPjkZGRio+PV2hoqKZNm2Z+PDw8XDNnzlRkZKTCwsLKWD4AAEDVYbM5XtHR0bpy5YqefvppeXp6SpLy8vIUFRUlV1dXTZ482aL9pEmTZDKZtGzZMuXl5dmqDAAAgErLZsErKipKkjRixAjzY4mJiUpJSVHHjh0LHU50dnZWYGCgzp8/r6SkJFuVAQAAUGlZfaixKGfPntXOnTvVsGFD9ezZ0/x4YmKiJKlZs2ZFrufn52duV/BzcbKysmxRaiHZ2dkW/0X50J+2R5/aFv1pW/Sn7dGntlXR/ens7GxVe5sEr+XLlys3N1fDhw+Xo6Oj+fGCsxbd3d2LXM/Nzc2iXUnOnz+vnJwcG1RbtNTU1ArbtpEe21PH3iVIqiPpWrm2cKDLDduUUo1Ul/doZUF/2hb9aXv0qW1VRH86OjoWO7hUnHIHr9zcXC1fvlwODg566aWXyru5YjVs2LBCtpudna3U1FR5eXnJycmpQp7DWJftXYBNNG7c2N4lVBrV7z1qX/SnbdGftkef2lZl689yB6/t27fr3LlzevLJJ9W0aVOLZXXr1pUkpaenF7nutWvXLNqVxNqhPGs5OTlV+HOg9NgXhfEetS3607boT9ujT22rsvRnuSfXFzWpvkDBvK3iJs8XzAG71/wuAACA6qBcwevKlSvasGGD6tWrp759+xZa7ufnJ29vb+3fv1+ZmZkWy7KyshQbGytvb2+rj48CAABUReUKXn//+9+VnZ2toUOHqnbt2oWWOzg4KDg4WNevX9fMmTMtls2aNUtpaWkKDg6Wg4NDecoAAACoEso1x2vZsmWSij7MWCA0NFQbN25UZGSkjhw5ooCAACUkJCgmJkZt2rRRaGhoeUoAAACoMso84nXo0CEdPXpUjz76qFq3bl1sOxcXF0VHRyskJESnTp3SnDlzdOzYMYWEhCg6Opr7NAIAgPtGmUe8Hn30UaWlpZWqrbu7u8LDwxUeHl7WpwMAAKjybHbLIAAAAJSM4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGIT
gBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGCQMgev7777TgMGDJCvr68eeOABtW3bVqNGjdK5c+cs2mVkZCgsLEz+/v7y9PSUv7+/wsLClJGRUe7iAQAAqpKa1q6Ql5eniRMnavHixfL19dWgQYPk6uqqlJQU7d27V8nJyWrUqJEkKTMzU0FBQYqPj1f37t01ePBgJSQkaN68edq9e7c2bdokFxcXm78oAACAysjq4LVgwQItXrxYo0eP1vvvvy9HR0eL5bdv3zb/HBkZqfj4eIWGhmratGnmx8PDwzVz5kxFRkYqLCysHOUDAABUHVYdavz1118VERGhpk2basaMGYVClyTVrJmf5fLy8hQVFSVXV1dNnjzZos2kSZNkMpm0bNky5eXllaN8AACAqsOq4LV9+3ZdvXpVQUFBysnJ0bp16zR79mx9+eWXSkpKsmibmJiolJQUdezYsdDhRGdnZwUGBur8+fOF1gMAAKiurDrU+MMPP+SvVLOmunTpolOnTpmX1ahRQyEhIXr33Xcl5QcvSWrWrFmR2/Lz8zO3K/i5JFlZWdaUWmrZ2dkW/0XlUFH7uyriPWpb9Kdt0Z+2R5/aVkX3p7Ozs1XtrQpely5dkiTNmTNH7dq107Zt29SyZUsdOXJEEyZM0Jw5c+Tr66tRo0aZz1p0d3cvcltubm6SVOqzG8+fP6+cnBxryrVKampqhW3bWHXsXYBNJCcn27uESqf6vEcrB/rTtuhP26NPbasi+tPR0bHYAabiWBW8cnNzJUlOTk5avny5vL29JUmBgYFasmSJOnfurDlz5mjUqFFWFVEaDRs2tPk2pfwEnJqaKi8vLzk5OVXIcxjrsr0LsInGjRvbu4RKo/q9R+2L/rQt+tP26FPbqmz9aVXwqlu3riQpICDAHLoKtGrVSk2bNlVSUpLS0tLMbdPT04vc1rVr1yy2eS/WDuVZy8nJqcKfA6XHviiM96ht0Z+2RX/aHn1qW5WlP62aXN+iRQtJxR8+LHg8KyvLPG+ruMnzBXPASjO/CwAAoDqwasSra9eukqSTJ08WWnbr1i0lJSXJxcVFDRo0kJeXl7y9vbV//35lZmZanNmYlZWl2NhYeXt7W31sFAAAoKqyasTL19dXPXr0UFJSkpYuXWqxbPbs2UpPT1dQUJBq1qwpBwcHBQcH6/r165o5c6ZF21mzZiktLU3BwcFycHAo/6sAAACoAqy+cv1HH32k3r17a/z48Vq/fr1atGihI0eOaNeuXWrcuLGmT59ubhsaGqqNGzcqMjJSR44cUUBAgBISEhQTE6M2bdooNDTUpi8GAACgMrP6Jtm+vr7avn27hg8frh9//FELFixQUlKSRo8erW3btsnLy8vc1sXFRdHR0QoJCdGpU6c0Z84cHTt2TCEhIYqOjuY+jQAA4L5i9YiXJDVq1Ejz5s0rVVt3d3eFh4crPDy8LE8FAABQbVg94gUAAICyIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAapae8CKoPH9tSRdNneZQAAgGqOES8AAACDELwAAAAMQvACAAAwCMELAAD
AIGUKXm3atJHJZCry38SJEwu1z8jIUFhYmPz9/eXp6Sl/f3+FhYUpIyOj3C8AAACgqijzWY1169bVmDFjCj3evn17i//PzMxUUFCQ4uPj1b17dw0ePFgJCQmaN2+edu/erU2bNsnFxaWsZQAAAFQZZQ5e7u7umjp16j3bRUZGKj4+XqGhoZo2bZr58fDwcM2cOVORkZEKCwsraxkAAABVRoXO8crLy1NUVJRcXV01efJki2WTJk2SyWTSsmXLlJeXV5FlAAAAVAplHvHKzs7WihUrlJKSIpPJpN///vdq06aNRZvExESlpKToqaeeKnQ40dnZWYGBgdqwYYOSkpLk5+dX1lIAAACqhDIHr9TUVIWEhFg81rNnTy1YsED169eXlB+8JKlZs2ZFbqMgbCUmJt4zeGVlZZW11BJlZ2dXyHZRPhW1v6uigvco71XboD9ti/60PfrUtiq6P52dna1qX6bg9dJLL6lz585q1aqVnJycdOLECUVERCgmJkYvvPCCNm/eLAcHB/NZi+7u7kVux83NTZJKdXbj+fPnlZOTU5ZyS6FOBW0XZZWcnGzvEiqd1NRUe5dQrdCftkV/2h59alsV0Z+Ojo7FDi4Vp0zBa8qUKRb/36FDB61cuVJBQUGKi4vTli1b9PTTT5dl08Vq2LChTbdXID8BX6uQbaPsGjdubO8SKo3s7GylpqbKy8tLTk5O9i6nyqM/bYv+tD361LYqW3/a7CbZNWrU0PDhwxUXF6f9+/fr6aefVt26dSVJ6enpRa5z7Vp+4CloVxJrh/JQtbG/C3NycqJfbIj+tC360/boU9uqLP1p07MaC+Z23bhxQ9L/5nAlJSUV2b5gDhgT6wEAwP3ApsHr0KFDkiQfHx9J+YHK29tb+/fvV2ZmpkXbrKwsxcbGytvb2+rjowAAAFWR1cHr+PHjSktLK/R4XFyc5s6dq9q1a+u5556TJDk4OCg4OFjXr1/XzJkzLdrPmjVLaWlpCg4OloODQ9mqBwAAqEKsnuO1Zs0affLJJ3riiSfk4+Oj2rVr69ixY9q2bZtq1Kih2bNnW0yMDg0N1caNGxUZGakjR44oICBACQkJiomJUZs2bRQaGmrTFwQAAFBZWR28unbtqpMnT+rw4cOKjY1VVlaWPD09NXDgQIWEhOjRRx+1aO/i4qLo6GhFRERo3bp12rNnj7y8vBQSEqIpU6Zwn0YAAHDfsDp4denSRV26dLFqHXd3d4WHhys8PNzapwMAAKg2KvRejQAAAPgfghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBByh28IiMjZTKZZDKZdODAgSLbZGRkKCwsTP7+/vL09JS/v7/CwsKUkZFR3qcHAACoMsoVvE6cOKHw8HC5uLgU2yYzM1NBQUGaN2+eWrRooZCQED300EOaN2+egoKClJmZWZ4SAAAAqowyB6+cnByNGTNG/v7+CgoKKrZdZGSk4uPjFRoaqjVr1uj//u//tHr1ak2ePFnx8fGKjIwsawkAAABVSpmD18cff6yEhATNmTNHjo6ORbbJy8tTVFSUXF1dNXnyZItlkyZNkslk0rJly5SXl1fWMgAAAKqMMgWvo0ePKiIiQn/+85/VqlWrYtslJiYqJSVFHTt2LHQ40tnZWYGBgTp//rySkpLKUgYAAECVUtPaFW7fvq2QkBC1bNlSEydOLLFtYmKiJKlZs2ZFLvfz8zO3K/i5OFlZWdaWWir
Z2dkVsl2UT0Xt76qo4D3Ke9U26E/boj9tjz61rYruT2dnZ6vaWx28PvroIyUkJGjr1q2qVatWiW0Lzlp0d3cvcrmbm5tFu5KcP39eOTk5VlZbWnUqaLsoq+TkZHuXUOmkpqbau4Rqhf60LfrT9uhT26qI/nR0dCx2cKk4VgWv+Ph4ffjhhxo3bpwCAgKseqLyatiwYYVsNz8BX6uQbaPsGjdubO8SKo3s7GylpqbKy8tLTk5O9i6nyqM/bYv+tD361LYqW39aFbzGjBkjX19fvfnmm6VqX7duXUlSenp6kcuvXbtm0a4k1g7loWpjfxfm5OREv9gQ/Wlb9Kft0ae2VVn606rglZCQIEny8vIqcnmvXr0kScuWLVPfvn3N87aKmzxfMAfsXvO7AAAAqgOrgldwcHCRj8fGxioxMVHPPvusGjRoIB8fH0n5gcrb21v79+9XZmamxZmNWVlZio2Nlbe3t9XHRwEAAKoiq4LXp59+WuTjY8aMUWJioiZNmqTHHnvM/LiDg4OCg4M1c+ZMzZw5U9OmTTMvmzVrltLS0vTqq6/KwcGhjOUDAABUHVaf1Wit0NBQbdy4UZGRkTpy5IgCAgKUkJCgmJgYtWnTRqGhoRVdAgAAQKVQ7ptk34uLi4uio6MVEhKiU6dOac6cOTp27JhCQkIUHR1d4n0eAQAAqhObjHjNnz9f8+fPL3a5u7u7wsPDFR4ebounAwAAqJIqfMQLAAAA+QheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAap8AuoAvc701f/tcFW6ki6bIPtlF3aHx+06/MDQHXAiBcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGMSq4JWWlqbJkyerV69eatmypTw9PdWqVSs999xzWrt2rfLy8gqtk5GRobCwMPn7+8vT01P+/v4KCwtTRkaGzV4EAABAVWBV8Lpy5YqWL1+uOnXqKCgoSGPHjlXPnj11/Phxvfzyy5owYYJF+8zMTAUFBWnevHlq0aKFQkJC9NBDD2nevHkKCgpSZmamLV8LAABApVbTmsZNmjTRzz//rJo1LVe7du2aevXqpSVLluj1119Xq1atJEmRkZGKj49XaGiopk2bZm4fHh6umTNnKjIyUmFhYTZ4GQAAAJWfVSNejo6OhUKXJLm5ualHjx6SpKSkJElSXl6eoqKi5OrqqsmTJ1u0nzRpkkwmk5YtW1bk4UkAAIDqyKoRr+JkZWVp165dcnBw0EMPPSRJSkxMVEpKip566im5uLhYtHd2dlZgYKA2bNigpKQk+fn5leo5KkJ2dnaFbBflU1H7G2VXXfZJwe88v/u2QX/aHn1qWxXdn87Ozla1L1PwSktL0/z585Wbm6tLly4pJiZG586d05QpU8whKjExUZLUrFmzIrdxZ7vSBK/z588rJyenLOWWQp0K2i7KKjk52d4l2FD1eH9Vr30ipaam2ruEaoX+tD361LYqoj8dHR2LzTnFKVPwSk9PV0REhPn/a9WqpenTp2vs2LHmxwrOWnR3dy9yG25ubhbt7qVhw4ZlKfWe8hPwtQrZNsqucePG9i7Bhi7buwCbqC77JDs7W6mpqfLy8pKTk5O9y6ny6E/bo09tq7L1Z5mCV5MmTZSWlqacnBydO3dO//znPzV9+nTt379fixcvLnIeWHlZO5SHqo39XflUt33i5ORU7V6TPdGftkef2lZl6c9yXUDV0dFRTZo00cSJE/XWW28pOjpaS5YskSTVrVtXUv7oWFGuXbtm0Q4AAKC6s9m
V67t37y5J2rNnj6T/zeEqOMvxbgVzwEozvwsAAKA6sFnwunDhgiSZDzP6+fnJ29tb+/fvL3Sh1KysLMXGxsrb29vqSWkAAABVlVWTsY4cOaImTZoUmjB/9epVvfPOO5Kknj17SpIcHBwUHBysmTNnaubMmRYXUJ01a5bS0tL06quvysHBobyvAQCs8tieOqouJz2k/fFBe5cAwApWBa8VK1YoKipKXbp0kY+Pj+rUqaPk5GRt2bJF169fV79+/TRkyBBz+9DQUG3cuFGRkZE6cuSIAgIClJCQoJiYGLVp00ahoaE2f0EAAACVlVXBq3///srIyNDBgwcVFxenGzduqF69eurUqZOGDRumQYMGWYxgubi4KDo6WhEREVq3bp327NkjLy8vhYSEaMqUKYUurAoAAFCdWRW8Hn/8cT3++ONWPYG7u7vCw8MVHh5u1XoAAADVjc0m1wMAAKBkBC8AAACDELwAAAAMQvACAAAwCMELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMQvACAAAwCMELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMQvACAAAwCMELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDWBW8zp8/r3nz5un555+Xv7+/PDw81LJlSwUHB+vgwYNFrpORkaGwsDD5+/vL09NT/v7+CgsLU0ZGhk1eAAAAQFVR05rGn3/+uT7++GP5+vqqW7du8vDwUGJiotavX6/169dr0aJFev75583tMzMzFRQUpPj4eHXv3l2DBw9WQkKC5s2bp927d2vTpk1ycXGx+YsCAACojKwKXo888og2bNigwMBAi8djY2PVv39/TZo0SX369FHt2rUlSZGRkYqPj1doaKimTZtmbh8eHq6ZM2cqMjJSYWFhNngZAAAAlZ9Vhxr79etXKHRJUmBgoLp27aqrV6/q6NGjkqS8vDxFRUXJ1dVVkydPtmg/adIkmUwmLVu2THl5eeUoHwAAoOqwasSrJLVq1ZIkOTo6SpISExOVkpKip556qtDhRGdnZwUGBmrDhg1KSkqSn5/fPbeflZVlq1ItZGdnV8h2UT4Vtb9RdtVln1S333l775eC/qxu/WpP9KltVXR/Ojs7W9XeJsErOTlZO3bskJeXl1q3bi0pP3hJUrNmzYpcpyBsJSYmlip4nT9/Xjk5ObYotwh1Kmi7KKvk5GR7l2BD1eP9xT6pnCrLfklNTbV3CdUOfWpbFdGfjo6Oxeac4pQ7eN26dUuvvfaabt68qWnTpplHvArOWnR3dy9yPTc3N4t299KwYcPyllqk/AR8rUK2jbJr3LixvUuwocv2LsAmqss+qW6/8/beL9nZ2UpNTZWXl5ecnJzsWkt1QZ/aVmXrz3IFr9zcXL3xxhuKjY3Vyy+/rGHDhtmqrkKsHcpD1cb+rnzYJ5VTZdkvTk5OlaaW6oI+ta3K0p9lvoBqXl6exo8fr1WrVmno0KGaPXu2xfK6detKktLT04tc/9q1axbtAAAAqrsyBa/c3FyNHTtWy5Yt0+DBgzV//nzVqGG5qYJ5W0lJSUVuo2AOWGnmdwEAAFQHVgev3NxcjRs3TsuXL9fAgQO1YMEC87yuO/n5+cnb21v79+9XZmamxbKsrCzFxsbK29vb6klpAAAAVZVVwatgpGv58uUaMGCAPv/88yJDlyQ5ODgoODhY169f18yZMy2WzZo1S2lpaQoODpaDg0PZqwcAAKhCrJpcHxERoRUrVsjV1VXNmzfXBx98UKhNUFCQ2rZtK0kKDQ3Vxo0bFRkZqSNHjiggIEAJCQmKiYlRmzZtFBoaaptXAQAAUAVYFbzOnj0rSbp+/bo
+/PDDItv4+PiYg5eLi4uio6MVERGhdevWac+ePfLy8lJISIimTJnCfRoBAMB9xargNX/+fM2fP9+qJ3B3d1d4eLjCw8OtWg8AAKC6KfPlJAAAAGAdghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBCF4AAAAGIXgBAAAYhOAFAABgEIIXAACAQQheAAAABiF4AQAAGITgBQAAYBCCFwAAgEEIXgAAAAYheAEAABiE4AUAAGAQghcAAIBBrA5eK1eu1IQJE9StWzd5enrKZDJp+fLlxbbPyMhQWFiY/P395enpKX9/f4WFhSkjI6NchQMAAFQ1Na1d4d1331VycrLq168vLy8vJScnF9s2MzNTQUFBio+PV/fu3TV48GAlJCRo3rx52r17tzZt2iQXF5dyvQAAAICqwuoRr08//VRHjhxRYmKiRo4cWWLbyMhIxcfHKzQ0VGvWrNH//d//afXq1Zo8ebLi4+MVGRlZ5sIBAACqGquDV7du3eTj43PPdnl5eYqKipKrq6smT55ssWzSpEkymUxatmyZ8vLyrC0BAACgSrL6UGNpJSYmKiUlRU899VShw4nOzs4KDAzUhg0blJSUJD8/v3tuLysrq0LqzM7OrpDtonwqan+j7KrLPqluv/P23i8F/Vnd+tWe6FPbquj+dHZ2tqp9hQYvSWrWrFmRywvCVmJiYqmC1/nz55WTk2O7Ai3UqaDtoqxKmjtY9VSP9xf7pHKqLPslNTXV3iVUO/SpbVVEfzo6Ohabc4pTYcGr4KxFd3f3Ipe7ublZtLuXhg0b2qawu+Qn4GsVsm2UXePGje1dgg1dtncBNlFd9kl1+523937Jzs5WamqqvLy85OTkZNdaqgv61LYqW39WWPCyNWuH8lC1sb8rH/ZJ5VRZ9ouTk1OlqaW6oE9tq7L0Z4VdQLVu3bqSpPT09CKXX7t2zaIdAABAdVdhwatg3lZSUlKRywvmgJVmfhcAAEB1UKHBy9vbW/v371dmZqbFsqysLMXGxsrb29vqSWkAAABVVYUFLwcHBwUHB+v69euaOXOmxbJZs2YpLS1NwcHBcnBwqKgSAAAAKhWrJ9cvXbpUcXFxkqSjR49KkqKiorRnzx5JUlBQkPr27StJCg0N1caNGxUZGakjR44oICBACQkJiomJUZs2bRQaGmqr1wEAAFDpWR284uLi9PXXX1s8tm/fPu3bt0+S5OPjYw5eLi4uio6OVkREhNatW6c9e/bIy8tLISEhmjJlCvdpBAAA9xWrg9f8+fM1f/78Urd3d3dXeHi4wsPDrX0qAACAaqXC5ngBAADAEsELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCBWX04CAFB5mL76r71LkFRH0uVybSHtjw/aphSgkmPECwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMQvACAAAwCMELAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMQvACAAAwSE17FwAAACof01f/tXcJNnOgi70r+B9GvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMwlmNAADYkG3OBqwj6bINtoPKhhEvAAAAgxC8AAAADELwAgAAMAjBCwAAwCCGBa/vv/9eQ4YMUZMmTdSwYUP16NFD33zzjVFPDwAAYHeGnNW4e/duDRo0SE5OTho4cKDq1q2r7777TqNHj9bZs2f1pz/9yYgyAAAA7KrCg9ft27c1fvx4OTg4aP369WrXrp0kacq
UKerdu7dmzJihAQMGyM/Pr6JLKVb92hxxRcXh/VX5sE9QkXh/VT6Ojo72LsHMIS0tLa8in2Dbtm0aOHCgXnzxRc2dO9di2T//+U+NHDlSkyZN0t/+9reKLAMAAMDuKjyW79mzR5LUo0ePQssKHtu7d29FlwEAAGB3FR68EhMTJanIQ4kmk0n169c3twEAAKjOKjx4ZWRkSJLq1q1b5HI3NzdzGwAAgOqMGYAAAAAGqfDgVTDSVdyo1rVr14odDQMAAKhOKjx4FcztKmoeV1pami5fvmzXS0kAAAAYpcKDV+fOnSXlX1bibgWPFbQBAACozir8Ol63b99Whw4dlJKSopiYGLVt21ZS/iHG3r1769SpU9q3b5+aN29ekWUAAADYXYWPeNWsWVOffPKJcnNz1adPH4WGhuqtt95Sly5ddOzYMb355pt2CV3cO9J2Vq5cqQkTJqhbt27y9PSUyWTS8uXL7V1WlXX+/HnNmzdPzz//vPz9/eXh4aGWLVsqODhYBw8etHd5VU5aWpomT56sXr16qWXLlvL09FSrVq303HPPae3atcrLq9DvnveFyMhImUwmmUwmHThwwN7lVElt2rQx9+Hd/yZOnGjv8qqs7777TgMGDJCvr68eeOABtW3bVqNGjdK5c+fsVpMh92p84okntGnTJs2YMUNr1qzRrVu39NBDD+mvf/2rhg4dakQJFrh3pG29++67Sk5OVv369eXl5aXk5GR7l1Slff755/r444/l6+urbt26ycPDQ4mJiVq/fr3Wr1+vRYsW6fnnn7d3mVXGlStXtHz5cnXo0EFBQUGqV6+efvnlF23atEkvv/yyXn75ZUVGRtq7zCrrxIkTCg8Pl4uLizIzM+1dTpVWt25djRkzptDj7du3t0M1VVteXp4mTpyoxYsXy9fXV4MGDZKrq6tSUlK0d+9eJScnq1GjRnaprcIPNVY2t2/f1mOPPabz589ry5Yt5ntH3nnoc//+/Uz4t8KOHTvUrFkz+fj4aPbs2Zo2bZrmzp2rF1980d6lVUnr1q1TgwYNFBgYaPF4bGys+vfvL1dXVx0/fly1a9e2U4VVS05OjvLy8lSzpuX3zGvXrqlXr146fvy44uLi1KpVKztVWHXl5OSoV69ecnBwkJ+fn1atWqWYmBg99thj9i6tymnTpo0kKT4+3s6VVA+fffaZ3nzzTY0ePVrvv/9+oXs13r59u9BnglHuu+t47dq1S6dPn9bgwYPNoUvKv5DrX/7yF92+fZvDZFbq1q2bfHx87F1GtdGvX79CoUuSAgMD1bVrV129elVHjx61Q2VVk6OjY5EfsG5ububbliUlJRldVrXw8ccfKyEhQXPmzKlUNyHG/e3XX39VRESEmjZtqhkzZhT53rRX6JIMOtRYmXDvSFRltWrVkiT+yNlAVlaWdu3aJQcHBz300EP2LqfKOXr0qCIiIvTnP/+Z0UIbyc7O1ooVK5SSkiKTyaTf//735pEwlN727dt19epVDR8+XDk5OdqwYYMSExPl7u6ubt26qVmzZnat774LXtw7ElVVcnKyduzYIS8vL7Vu3dre5VQ5aWlpmj9/vnJzc3Xp0iXFxMTo3LlzmjJlClMLrHT79m2FhISoZcuWTPy2odTUVIWEhFg81rNnTy1YsED169e3U1VVzw8//CApf1SrS5cuOnXqlHlZjRo1FBISonfffdde5d1/was09448f/68kSUB93Tr1i299tprunnzpqZNm8aIVxmkp6crIiLC/P+1atXS9OnTNXbsWDtWVTV99NFHSkhI0NatW82jsCifl156SZ07d1arVq3k5OSkEydOKCIiQjExMXrhhRe0efNmOTg42LvMKuHSpUuSpDlz5qhdu3batm2bWrZsqSNHjmjChAmaM2eOfH19NWrUKLvUd9/N8QKqmtzcXL3xxhuKjY3Vyy+/rGHDhtm7pCqpSZMm5rtlHD58WGFhYZo+fbqCg4N1+/Zte5dXZcTHx+vDDz/UuHH
jFBAQYO9yqo0pU6aoS5cuql+/vtzc3NShQwetXLlSjz/+uP79739ry5Yt9i6xysjNzZUkOTk5afny5XrkkUfk6uqqwMBALVmyRDVq1NCcOXPsVt99F7y4dySqkry8PI0fP16rVq3S0KFDNXv2bHuXVOU5OjqqSZMmmjhxot566y1FR0dryZIl9i6ryhgzZox8fX315ptv2ruUaq9GjRoaPny4JGn//v12rqbqKPgbHhAQIG9vb4tlrVq1UtOmTXX69GmlpaXZobr7MHhx70hUFbm5uRo7dqyWLVumwYMHa/78+apR4777la1Q3bt3l/S/k25wbwkJCTp58qS8vLwsLvL59ddfS5J69eolk8mk6OhoO1daPRTM7bpx44adK6k6WrRoIUlyd3cvcnnB41lZWYbVdKf7bo5X586dNWvWLG3btk2DBg2yWMa9I1FZ5Obmaty4cVq+fLkGDhyoBQsWMK+rAly4cEGSfU8tr2qCg4OLfDw2NlaJiYl69tln1aBBAy4xYyOHDh2SJPrTCl27dpUknTx5stCyW7duKSkpSS4uLmrQoIHRpUm6D4PXk08+qaZNm2r16tV67bXXLO4d+cEHH6hmzZrmoV3AHgpGulasWKEBAwbo888/J3SVw5EjR9SkSZNC336vXr2qd955R1L+mWMonU8//bTIx8eMGaPExERNmjSJC6ha6fjx43rggQdkMpksHo+Li9PcuXNVu3ZtPffcc/Yprgry9fVVjx49tG3bNi1dulQjRowwL5s9e7bS09M1dOhQu33huu+CV8G9IwcNGqQ+ffpo0KBBcnNz03fffaeff/5Zb731FjfsttLSpUsVFxcnSeYLe0ZFRZkP3wQFBalv3752q6+qiYiI0IoVK+Tq6qrmzZvrgw8+KNQmKCjI/KUBJVuxYoWioqLUpUsX+fj4qE6dOkpOTtaWLVt0/fp19evXT0OGDLF3mbiPrVmzRp988omeeOIJ+fj4qHbt2jp27Ji2bdumGjVqaPbs2WrcuLG9y6xSPvroI/Xu3Vvjx4/X+vXr1aJFCx05ckS7du1S48aNNX36dLvVdt8FL6ny3TuyqouLizPP7yiwb98+7du3T1L+EDnBq/TOnj0rSbp+/bo+/PDDItv4+PgQvEqpf//+ysjI0MGDBxUXF6cbN26oXr166tSpk4YNG6ZBgwZxmj7sqmvXrjp58qQOHz6s2NhYZWVlydPTUwMHDlRISIgeffRRe5dY5fj6+mr79u0KDw/Xv/71L23btk1eXl4aPXq0Jk+eLA8PD7vVdt/dqxEAAMBeOEUKAADAIAQvAAAAgxC8AAAADELwAgAAMAjBCwAAwCAELwAAAIMQvAAAAAxC8AIAADAIwQsAAMAgBC8AAACDELwAAAAMQvACAAAwyP8DOhsSl9RYMsMAAAAASUVORK5CYII=", 216 | "text/plain": [ 217 | "
" 218 | ] 219 | }, 220 | "metadata": {}, 221 | "output_type": "display_data" 222 | } 223 | ], 224 | "source": [ 225 | "fig1 = plt.hist(fp)\n", 226 | "plt.title(r'Histogram of False Positives')\n", 227 | "plt.show()" 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "execution_count": 6, 233 | "id": "3b9f2d89-6f6c-448a-afce-85457670904b", 234 | "metadata": {}, 235 | "outputs": [ 236 | { 237 | "data": { 238 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAoUAAAHOCAYAAADuXnPmAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABTM0lEQVR4nO3dd3QU9f7/8VdICEhdvpAEAkQIyKWFjiChJHRNpBPpKvws1CCKCJarFxS5KoiKQRTpIk06UiKdJEhRimALLQqEGlpMQkh+f+Ts3iy7KZvsprDPxzmcQ3ba5z27M/vamfnMuMTFxaUKAAAATq1IfjcAAAAA+Y9QCAAAAEIhAAAACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAoAcgFC5ZskQGg0EGg0Fnz57N7+bgAXLv3j198cUX6tixo3x8fFSuXDkZDAYFBQXld9OyJSgoqFC190Fjr/V/9uxZ0z5uyZIldmqd8yrs2zXgSG55ubA9e/boySeflCRNmDBBEydOzHKaqVOnatq0aZKk9evXq02bNg5tI2A0bNgwrVmzJlfzOHv2rBo2bJjt8fv376+wsLBcLfNBkdm6c3d3V7ly5VSnTh116tRJgwYNUtmyZfO4hSiM7LFdF3QGgyHDYcWKFVO5cuVUr149BQUFacCAASpevHjeNQ4FWqE/UuhIxl/nU6dOze+mII/t37/f9MXRsWNHrV69Wvv27VNERIRmzZqVv42DkpKSFBsbq507d+r1119Xy5Yt9eOPP+Z3s2zC/iXvZWe7Hj58uAwGg/z8/PKxpY6TmJioixcv6ocfftC4cePUqlUrRUdH53ezsuTn5yeDwaDhw4c7bBmceczjI4WOMHDgQA0cODC/m4EHzM6dOyVJrq6umjt3rl2OQj3xxBN64403Mh0ns1/4zuz+dZeYmKjff/9dX331lQ4cOKALFy4oJCREUVFRqlixYj629H82btxol/k8/PDDiouLs8u8nJ0jtuuCrHHjxhY/Yv/55x/9/vvvWrBggaKionTq1Cn17dtXkZGRKlasWD61FAVFoQ+FgCNcuHBBkuTp6Wm3L46yZcuqbt26dpmXs7G27ho3bqw+ffpoyJAh2rhxo+Li4vTZZ59pypQp+dRKFHSO2K4LshIlSljd5zRt2lRPPfWUgoODFRERoVOnTmnDhg3q3bt3PrQSBQmnjwErEhMTJUlubvxuKshcXV01efJk09/h4eH52BoUdGzX/1OkSBGNGTPG9PehQ4fysTUoKAp9KMzONQBHjhzRmDFj1Lx5c1WuXFmenp6qU6eO2rRpo9GjR2v16tWmnYX0v2sXjKZNm2ZahvFfRtc1hIeH69lnn1W9evXk5eWlhx9+WO3atdO7776rq1evZllPcnKyZs+erfbt26tq1ary8fFRQECAZs2apaSkpCx7It5/PUxsbKz+85//6LHHHpOPj48MBoM2bNhgGj8uLk6LFy/W888/rxYtWqhy5cry8PBQrVq11KtXL82fP19JSUkZttdae9atW6eePXuqZs2a8vb2lr+/v7744gvdvXvXNF1
qaqpWrFihoKAg1axZU5UqVVKbNm00d+5cpaSkZLmesuPAgQMaPny4GjZsqEqVKqlq1ap67LHHNGnSJMXExFidxljL0qVLJUkxMTEW731eXGuSlJSk77//XuPHj1dgYKAefvhhVahQQdWrV1eHDh00derUbH2eMpOSkqKlS5eqb9++ql27tjw8PFS1alU1atRIjz/+uN5991399NNPmc7j559/1ksvvaTmzZurSpUqqlSpkho3bqzRo0fr2LFjuWpfdvn6+ur//u//JMnq+3r37l3Nnz9f3bt31yOPPCIPDw/VrFlT3bp109dff232ubQmNjZWkydPVkBAgHx8fFShQgXVrFlTLVu21ODBgzVv3jxduXLFYrqMeh/bun/JaJuPj49XlSpVZDAYNHjw4CzX07lz50w9bV9//XWr49y7d0/ffPONnnrqKdWpU0eenp6qVq2aOnbsqI8++kg3b97MdBmnTp3Sa6+9platWqlKlSry8PDQv/71L7Vq1UrPPfecli5dqlu3bmXZVmtys6/KznZt3HdmNk5Gl3fkZr3Zus+2p4cfftj0/8z285L022+/6eWXX1azZs1UuXJleXt7q2nTpnrppZd08uTJbC0vJ/MwbkfGbXvp0qUW78n925it+7Y9e/bIYDBo5MiRptcaNmxosZw9e/aYhheE79q1a9eqZ8+eeuSRR+Tl5aUmTZpo4sSJunTpUrbeD2se+J9Ls2fP1qRJkyyCxoULF3ThwgUdO3ZMixYt0o8//qhatWrleDmJiYl68cUXtXr1aovXjxw5oiNHjmjOnDlasGCBAgICrM7j5s2b6t27tw4cOGD2+s8//6yff/5Z3333nWbMmJHtNh08eFD9+vWz+oVl1KZNG6tfpJcuXdL27du1fft2ff3111qxYoW8vLyyXObLL7+suXPnmr32yy+/aMKECdq7d6/mz5+v5ORkPf/881q7dq3ZeMeOHdPLL7+sI0eO6JNPPslmlZZSU1M1adIkq714T548qZMnT+rrr7/WzJkz9dRTT+V4OY4UGhpq+nJK7/r16zp06JAOHTqkL7/8Ut98841atmxp8/xv376tfv36ae/evWav3717V7du3dKZM2cUGRmpHTt2WD36du/ePU2cOFFffvmlUlNTzYadPn1ap0+f1uLFizVx4kS9+uqrNrfPVkWLFjW1K72//vpLISEhOnHihNnrV65c0e7du7V7927NmTNHy5cvl4+Pj8V8o6Ki9NRTT+nGjRsW01+5ckW//vqr1q9fr9TUVA0dOtTOVWWuRIkSeuKJJ7R8+XJt3bpVcXFxmV6TunLlStN71bdvX4vhZ86c0YABAyzWVVJSkg4ePKiDBw+aPnNNmjSxmH7t2rV6/vnnzX5gS2lflrGxsTpx4oRWrFghDw8PdezY0eZ67b2vspfcrrf0srPPtqdz586Z/l+lSpUMx/v000/19ttvW2xf0dHRio6O1sKFC/XGG2/opZdecug8siO3+7acyI/v2jFjxmjhwoVmr506dUphYWH69ttvtXLlSjVt2tTmWh7oUHj8+HFTIPTx8dFzzz2nBg0aqFy5coqPj1d0dLT27dunTZs2mU23evVqJSUlqVWrVpLSbmEwbNgws3Hu3/mOHDnSFAhr166tUaNGqV69erp586Y2btyor7/+Wjdu3FBISIi2bdtm9VYbw4YNMwXC5s2b68UXX1SNGjV05coVLV++XMuXL9e4ceOyVfudO3c0ZMgQ/fPPPxo3bpwCAwNVqlQp/fbbb2ZffikpKWrWrJm6dOmiBg0ayNPT03REcvny5QoPD9fRo0c1dOjQLC+cnzdvng4ePKjOnTtr8ODBqlq1qv7++2/NmDFDBw8e1Pr167VkyRIdP35ca9euVd++fdWnTx95eXnp1KlTev/99/X7779r4cKF6tatW46+OCRp8uTJpkBYuXJljR07Vk2aNFFiYqK2b9+uWbNm6Z9//tGLL74og8GgLl26mKaNiIiQJE2ZMkW
bNm1SpUqVtGrVKrP5e3t756hdtrh3756qVaum4OBgNW3aVFWqVJGbm5vOnTunXbt2afHixbp27ZoGDRqkyMhIeXh42DT/adOmmXaanTp1UkhIiKpWraqHHnpIV69e1S+//KJt27YpISHB6vRjxowx/Vpt1qyZhgwZomrVqqlMmTL69ddf9dVXX+ngwYN67733VK5cOT333HO5WyGZuHz5sumXcfpOJnfu3FH37t1NPSs7deqkp59+WlWqVNHff/+thQsXasuWLfr111/VrVs37dmzR6VLlzZNn5SUpKFDh+rGjRsqVaqUnnnmGbVr104eHh5KTk5WTEyMDh48aHOHkpzsXzLy1FNPafny5UpMTNS6des0ZMiQDMddsWKFJOlf//qXGjVqZDYsNjZWXbt21cWLF1W0aFENHDhQ7dq1k4+PjxITE7V3716FhYXp4sWL6tOnj3bt2qWqVauapr906ZJGjBihxMREVahQQcOGDVOLFi1Uvnx5JSYm6syZM9q/f3+uOt/kZl+Vne26QoUKGj16dKbj3C+36y297O6z7SUlJUWfffaZJMnFxUVdu3a1Ot78+fP15ptvSkr7XIaGhsrf31+SFBkZqY8//ljXr1/XO++8o9KlS+v//b//Z9d5zJo1S/Hx8erdu7cuXLhgtcNeiRIlTP/Pyb6tSZMmioiI0KZNm0zXJH/33XcWndbSH1k1yo/v2rlz5+rw4cNq2LChRo4cqVq1aunatWtatWqVvvnmG12/fl29e/dWZGSkKlWqlOm87pdvofDKlSsWv6wyGi+n1q5dq5SUFJUsWVLbtm2zSN8tW7bUwIEDFR8fryJF/ncmvWbNmmbjVahQIdMOAtu2bdPKlSslSS1atNCaNWv00EMPmYa3a9dO7du314ABA5SUlKQxY8Zo165dZvPYuHGjtm3bJknq0qWLvvnmG7m6upqGd+zYUX5+fqYNKyvXrl1TiRIltGnTJrMvgMaNG5uNt27dOtWoUcNi+hYtWigkJESLFy/WqFGjtG/fPu3atUvt2rXLcJkHDx7U8OHDzW6x0ahRIwUGBqpFixaKiYnRO++8o2vXrmnq1Klmp8gaNWokf39/NWvWTLdu3dLcuXNzFApPnjypjz/+WJJUo0YNbd26VeXLlzcNb9WqlZ544gkFBwcrPj5eoaGhOnLkiKnXnfF9Nl6E7ubmZrfOITdu3Mj0M1+iRAlVq1ZNkjRx4kRVq1ZNLi4uZuM0btxY3bt317Bhw9SlSxdduXJFX3zxRZa9mu9n/AHz5JNPatGiRRbD27dvr9GjR+vatWsWw4zhXkrbAb/wwgtmwxs1aqS+ffvqhRde0MqVK/Wf//xHffv2dVjP6o8++sh0BCz9fUz/+9//mgLhiBEj9N5775m1MSgoSG+99ZY++eQTnTlzRu+//77effdd0ziRkZE6f/68JOnLL7/U448/brbcZs2aqWfPnpoyZYrFkcTM2Lp/yUxAQIC8vLwUGxurFStWZBgKjx07Zjo9FxISYjF87NixunjxoipVqqR169bpkUceMRveqlUrhYSEqHPnzrp06ZImT56sOXPmmIZv2bJFd+7ckZS2361Xr57Z9I8++qhCQkL0/vvvZ/hDIyu52Vdld7tO3wElO9t+btdbetndZ9siPj7eYp+TkJBg+vFtDMuhoaGqXbu21TYZLzWoUKGCtm7dKl9fX9PwRx99VN27dzfV9+abb6pbt27y9PS02zyM+0TjdaBZddjLyb6tZMmSqlu3rtkp5Ro1algNgdbWUV5/1x4+fFjt27fXsmXLTGdJjLU1b95cY8eOVVxcnN566y19+eWXWdaQXr5dUzh37ly1atUqy3/3n4q0hfHoQY0aNTI9HFuiRIlc3bzTuNKLFCmisLAws0Bo1LVrVw0YMEBS2jWOUVFRZsPnzZsnKe2mvDNnzjQLhEajRo2y6UbIY8aMsTgicD9rH9L0Bg0apAYNGkhSlte1VKlSRf/5z38sXi9RooT69+8
vSbp69aqaNWtm9ZpMLy8v07UhkZGRmS4rI+mvSZwxY4ZZIDRq0qSJxo4dK0m6ePGixWlsR9m0aVOmn/X017NUr17dIhCmV69ePdN1ZPcf6c6O2NhYSTL9Ws+I8Vq99KZPny5J6ty5s0UgNHJ1ddWHH36oYsWK6datW3Zfx0lJSTp27JhGjRql2bNnS0r70hgxYoRp+IIFCySlrct33nnH6nzeeust0xf5okWL9M8//5iGpb8uJ7P15OLi4rDAmxVXV1f16tVLkrRv3z79/fffVsczHiV0cXGxOHV88uRJff/995LSHhZwf7AxqlatmulSgNWrVys+Pt40zLiuDAaDRSBMr2jRomZHY21hz32VPdhjvd0vO/tsW/z0008W+5n27dvrxRdfVEREhJo1a6ZFixbp7bfftjr94sWLTWH/7bffNgtzRtWqVTPt9//55x+LU5r2mIctcrNvy6m8/q51d3fXZ599ZhYIjZ555hnTj+M1a9bo8uXLmc7rfoW+o0lmjId+f/vtN4f1rEpOTjYdqm7durXVD7zRM888Y/r/jh07zOaxb98+SWlHFTO6z5qLi4tN18DZer1camqqYmNj9eeff+rEiROmf8bDz8ePH890+uDgYKsfUkmqX7++6f89e/bMcB7G8eLi4nJ0bzbjeq1WrZratm2b4XhPP/20xTQFWVxcnE6fPq2TJ0+a3hfjEY1ff/01y84S9zN+xrL6krrfhQsXTL+mu3fvnum4BoNBderUkaRc31j6/ovLPT091aZNGy1evFhS2k5y1qxZpuX99NNPps9P//79M/xcurm5me5zevPmTR08eNA0LP12WJAfL2fczlNSUqye7kz/esuWLS1ORRpPVRUtWlRPPPFEpssynvK+e/eu2VEV47qKi4uz2/0ZM5PbfZU92GO93S+vr3E+fPiw5s2bl2EnD+O+sWTJkurTp0+G8+nVq5fKlCljNo0952GLnO7bciOvv2sDAwMzvYxp0KBBktI+b/dfW5mVfDt9nJPH3NmqT58+mjFjhhITE9WlSxd16NBBnTp1UsuWLVW3bl2zU8Y5debMGdMHr3nz5pmO27BhQxUtWlR37941O6R/+vRp0xGKrI4EZvdUQqlSpVS9evVsjbt582Z9/fXXioyMzLRnoLVTiendf1osvfT3BMvueLdv37bpCExiYqLpdGFW74WXl5d8fHx07ty5bF3GYA+2PsLul19+0axZsxQeHp5pb7KUlBTFxcXZdF3hgAEDNG3aNO3fv18NGzZU9+7d1aZNG7Vs2TLTo+qHDx82/X/kyJFmRzczk5vecJnx9PRUx44dNXr0aFMglGT2JZfVZ6FZs2am/584ccL0K7tly5by9fXVqVOnNHHiRC1fvlxBQUFq1aqVmjRpUmAeDdaoUSP961//0m+//ably5eb3WZEkvbu3Ws6gmjt1LExpNy9e9fstF9W0r+nTzzxhAwGg+Li4jRo0CD5+/ura9euatWqlRo0aGC3W8DYa19lD/ZYb+nZss/OLn9/f4uQnpycrNjYWO3evVtTp07V9u3b1aVLF61YsUItWrQwG9e4HdWvXz/Tz7u7u7saNGigvXv3WuxP7TEPW+R035ZT+fFdm1UHkvTDf/nll0wPxNzvge5o8sgjj2jevHmm6we2bNmiLVu2SEo7ihEQEKBBgwbluEODlNYb1KhChQqZjlu0aFH93//9n2JjY82mS39ELKt5ZDXcKDs3Zk1NTdXo0aNNR1uykv7UmjXWTpsbpQ/g2R3v/l5qWbFlPUppwfDcuXNm70VBsXDhQo0bN07JycnZGj+r9+Z+48ePV2xsrBYsWKDLly/rq6++0ldffSUpbbt54oknNGzYMIujSjm9xje3v9jvv7jc3d1dBoMhw/fZlu0y/RdF+umKFi2qb7/9Vs8884xOnDihn376yRQEihUrpkcffVR9+/ZVv3795O7unqO67KVv376aMmWKjh8/rpMnT5oFZOOpY3d3d/Xo0cNiWnu
8p+XKldOyZcs0bNgw/fXXX9q7d6/pCEXJkiXVunVr9evXT927d8/Rj3F776vswd7bQl7dTNvNzU2VK1dW//79FRgYqMcee0zXr1/X888/r0OHDpkFeOP2kN39qZR27XRqaqrp8hd7zMMWOd235VR+fNdmdQAg/XBbfyA90KFQSrvHUdu2bbVmzRr98MMPioyMVGxsrOLi4rRmzRqtWbNGnTt31oIFCzINK9mRnQ/w/bfvyMk8siM7O95FixaZPqR+fn4aPny4mjVrpkqVKqlEiRKm6xpfeOEFLVu2LMu2FyT2eC/yy++//24KhB4eHhozZozatGmjhx9+WKVKlTKdCl20aJFGjx4tyfZa3Nzc9PHHH2vkyJFauXKldu/erZ9++kkJCQn6448/NHPmTIWFhemDDz4wO9WePqiHhYVl+xrX9L0DcyI3T4PJ6rOQ2bqrVauW9uzZo23btmnTpk2KjIzUH3/8ocTERO3Zs0d79uzRJ598ouXLl2d66YijhYSE6N133zXd//Ott96SJFOvZCmtJ2a5cuUspjW+p5UrVzYFyOy4//RVixYtdOjQIW3cuFGbN29WZGSkYmJidOfOHdMP8qZNm2rZsmXZ/nFrVBD3VfZab0b2OHNlq4oVK+qpp57S7NmzdfbsWe3Zs0eBgYEW4+XVd5s93rec7ttyKj++a3OzT8vKAx8KJal06dIaPHiw6cL86Ohobd68WV9++aXOnDmjrVu3avLkyWa9E7Mr/U42qws67969a/rVlH669KdIs5qHPe9fZbyY19fXV1u3bs0wFBeW567ash6l/53GsfZFmZ+++eYbJScny9XVVRs3bszw/pn2eF8eeeQRTZw4URMnTlRiYqIOHDigNWvWaNGiRUpMTNRLL72kxo0bmy6ATt9xJzU1tcA+ts+W7TL96TxrnwVXV1d17drVdMuOy5cva8eOHZo3b54iIyP1559/6tlnn7W4o0Be8vHxUcuWLRUZGakVK1bozTfflIuLi7Zs2WLqGW3t1LH0v/f0ypUrqlmzZq6OehYrVky9evUydX7566+/FB4erq+++krHjx/XoUOHNHbs2GwfMTEqiPsqe663/JR+//LLL7+YhcJy5crp4sWLNu1Py5YtaxZa7DGPnLB13+ZI9v78ZnU5TvqcYGuHmge6o0lGatSooZEjR2rnzp2ma0HWrFmTo3lVq1bNdBQk/UXq1hw9etTUISD9l2n16tVN11ocOXIk03lk9YQJW/z666+SpMcffzzDD2lqamqWbSooihUrZurhlVXHokuXLplu3FrQgk36a3Ayu6G6PT8LUtr6a926tT788EPNmjVLUtr1isYjTZLMdqA//PCDXZdvT+lPn2a1Xab/rGTns+Dh4aGQkBBt2rRJHTp0kJS23Z46dSqHrbUPY+iLiYkx9d5fvny5JKlMmTJm9+NMz/hEBuPRT3uqUqWKnnnmGW3fvt20bjdv3mzz6d283FdlN4w4cr3lpfSXqNx/uYpxOzp+/LjFTcnTS0pK0tGjRyVZbkP2mIeUu7No2dm35XYZmbH35zer77f0137b+v3mlKHQyGAwmE5/WXtkmDGoZfboGTc3N7Vu3VpS2gXdZ86cyXBc4y0yJJn9GnNzczP1Ttu1a5cuXrxodfrU1FQtW7Ysw/nbyrgDyOx6r40bN2bYnoLIuF5PnTpl6tFtTfpbHlg7XZKfjKelMntfLl68aLodhiOkv0dW+m2jevXqpp3MunXr8j0IZaRx48amI8fLli3L8NrM5ORkU8/iMmXKmHU6yYqLi4tZD3dbHzuYnf2LLXr27Gk6WrVixQrFxcWZ7n3avXv3DC/yDw4ONv3/448/dsipV3d3dz322GOS0tZ5Vo/Lu19e7quy+77kxXrLC+kDROXKlc2GGfeNd+7c0XfffZfhPNasWWN6T+/fn9pjHpL9tpeM9m3pl2GP5aRn78/vjh07dOHChQyHG/dp6fNJdj3QoXD9+vWZHo69fv26fv75Z0n
W71RuvOj19OnTmS7H+LSGe/fuaeTIkVZ/DW3dutV0yqRhw4YWjyYz3q4mKSlJoaGhVjtZfPbZZ3Y9ame8Bmrz5s1WO1ucPn1a48ePt9vy8sKwYcNM13iMGzfO6vv/888/mx4XWLFixSxvrZLXjO9LdHS09u/fbzE8Pj5ezz33XI4vpr9+/bo2bdqU6ZfY9u3bTf+/f9sw3nPt7t27GjRoUKY7p3v37mn58uUZ3j/PUdzd3U3XC/35559mN6VO791339Xvv/8uSRo8eLDZr/iIiAhTb3ZrUlJSTKeMXVxcbL5wPbv7l+wyGAzq1KmTpLQv2JUrV5r2RRmdOpbSei937txZUtozYN98881MPxuXLl2yuI9ceHh4pp+DhIQE09HL0qVLW71/aGbycl9lfF8uX76caQ9Re6y3/Pbzzz+bbvbs7u5uEcYGDhyokiVLSkq7x6C1576fPXvW9FCFhx56yOIG6vaYh5S97SW3+7b0nc7stV1K9v/8Gh+CYe3H7sKFC037pe7du9vUM156wK8pnD17tp5//nl16tRJbdu2Va1atWQwGHTz5k0dP35cX375pek6h/sfMyWlXTh99uxZff/995o3b55atGhh+iVRunRpUw+fTp06qU+fPlq5cqX27dunwMBAjRo1SnXr1tXNmze1adMmffXVV0pJSZG7u7vV5/p269ZN7du31/bt27VlyxZ17dpVI0aMUPXq1XX16lUtW7ZMy5cvV9OmTU2HjnN7qLt///568803deHCBXXq1EmhoaGqW7euEhIStHv3boWFhSkpKUkNGzYsNKeQ69Spo7Fjx2r69On67bff1KZNG40dO1aNGzc2e8xdfHy8XFxcNHPmTNPTTAqKfv36ac6cOUpJSVFISIhCQ0NNn72ff/5Zn3/+uaKjo9WyZUuLm6Bnx82bNzVgwABVqVJFTz75pJo1ayYfHx8VLVpUly9f1g8//GC6mXrp0qUtAkWPHj30zDPPaP78+Tpx4oRatmypZ555Rm3btpWHh4cSEhJ07tw5/fjjj1q3bp0uXryoiIgIi6MQjvbqq69qw4YNio6O1owZM3Ty5EkNGTJE3t7eOn/+vBYuXKjNmzdLSrsM5LXXXjObfteuXfrggw/UsmVLde7cWfXr11eFChWUlJSkM2fOaNGiRabThsHBwTbf7iK7+xdbhISEaOPGjbp+/bomT54sKe3oT1Y38v3ss8/Uvn17/fXXX/rss8+0e/duDRkyRPXr19dDDz2kGzdu6OTJk9q1a5fCw8NVt25dsy/uVatWqX///qanN9WtW9f0ONE//vhDc+fONd1mZMiQITbfoiYv91XG27KkpKRo3Lhxev75581CbPoORbldb45m7YkmycnJunz5snbu3Km5c+eafjiEhoZadAAqX7683n33XY0dO1axsbEKDAzU2LFjTUd9o6KiNGPGDFMP18mTJ1uEEHvMQ0p7X/bs2aPDhw9rxowZ6tixoylsFi9eXN7e3rnetzVo0EDFixdXQkKC3n33XRUtWlRVq1Y1HWioVKlSjjqk2vvz26RJE23btk2dOnXSiBEj9Mgjj+j69ev67rvvTAefypYta9oH2OKBDoVSWtfudevWWVw7kN4LL7yg559/3uL1UaNGae3ataaLUtO7/55zs2bN0r1797R69WqdOHHC9GSF9MqWLasFCxZk2GPz66+/Vu/evXXo0CEdOHBAzz77rNnwBg0a6KOPPlJAQIAk5fo+aS+++KJ27Nih7du3688//zT1ZDV66KGHNHv2bG3ZsqXQhEJJevPNNxUfH6/Zs2crJiZGL7/8ssU4xYsX18yZMzO8zio/NWnSRBMnTtTUqVN148YNq0+JGTVqlOrUqZOjUGj0119/ZXrfRIPBoAULFljtLTl9+nR5eHhoxowZunHjhmbOnKmZM2danY+7u3u+3NOvZMmSWrt2rUJCQnTixAlt3rzZFALTq127tpYvX271SRspKSmKiIgwPQ7MGn9/f3366ac
2t8+W/Ut2de3aVWXLltWNGzdMHUz69OmTZQ9JT09Pbd26VUOHDlVUVJSOHj2qV155JcPxra2ru3fvKjw8XOHh4RlO16NHD1PPaFvk5b6qbdu2at68uQ4cOKAVK1ZY9CxOf/bBHuvNkYxPNMlMkSJFNGLECE2aNMnq8GeeeUY3b940PaLU2vvn6uqqN954w+pzj+01j6FDh2ru3LmmZySnf0rR/fdjzOm+rXTp0nrhhRc0c+ZMHTlyxOL+fuvXrzd7lGZ22fvzO2zYMNWtW1eLFy+2+lx5g8Gg5cuXZ3qD64w80KFw/vz52rlzp3bu3Kljx47p0qVLunr1qooWLaoqVaqoRYsWGjJkSIY3t23QoIG2bt2qTz75RPv379elS5cyvM6gWLFimjdvngYOHKjFixfrwIEDunz5sooXL65q1aqpc+fOGj58eKanTQwGgzZv3qyvvvpKy5Yt059//ikXFxdVq1ZNvXr10vDhw02nuiSZ7v6eU0WLFtXy5cs1d+5cffvtt/rtt9+UmpqqSpUqKSAgQC+++KJq1aplurdjYeHi4qL3339fvXv31ty5cxUREaFLly7Jzc1NVatWVWBgoIYPH57hg+kLggkTJqhx48aaPXu2Dh8+rPj4eHl4eKhJkyYaOnSoAgMDc/yUDR8fH+3Zs0c7d+7Unj17dObMGV26dEm3b99W6dKlVatWLXXs2FFDhw7N8PNapEgRvf766xo4cKDmz5+vXbt26ezZs7p586aKFy+uSpUqqV69egoICNCTTz5p8+lCe6lSpYp27dqlxYsXa82aNfrll19048YNlSlTRvXq1VP37t01ZMgQq088CQ0NVbNmzbRz504dOHBAFy5c0OXLl5WamioPDw81atRIvXv3Vo8ePXJ01N6W/Ut2FStWTD169DC7fjmzU8fpeXt7a/PmzdqyZYtWrVqlAwcO6NKlS0pMTFSZMmVUvXp1NW3aVJ07d7Y4zfj+++/riSee0K5du3T48GHFxsbq8uXLcnV1VcWKFdWsWTP169fP1DHHVnm5rypSpIi+++47zZw5U5s3b9aZM2d0586dDE9J5ma95YciRYqodOnSqlatmh577DENGjTI7IlT1owZM0ZdunTRF198od27d5ueCV6pUiW1bdtWzz//vFnnLkfMw9vbW9u3b9f06dO1d+9eXbhwweI52vbYt7399tuqUaOGli5dql9//VU3b960+Z6593PE5/ezzz5Thw4dNH/+fP3yyy+6deuWvL291aVLF7300ks5vlG3S1xcXOG8OtZJLVu2zPS82cOHD+frvdEAAIDjnT171nSWcdasWabHc9rbA93R5EFkfIZp+fLl7f5IJAAA4LwIhQXIhQsXMu2yvnDhQm3dulVS2gO4HXVPJQAA4Hwe6GsKC5vdu3dr0qRJ6tWrl1q3bq2HH35YKSkpOn36tFavXq0NGzZISnuG5Lhx4/K5tQAA4EFCKCxgrl69qi+//FJffvml1eFeXl45enYoAABAZgiFBUinTp00Y8YMhYeH67ffftOVK1d0+/ZtlS1bVrVq1VLXrl01dOjQPL+tAQAAePDR+xgAAAB0NAEAAAChEAAAACIU2iwhIUGnTp2yuJO6s3Dm+qmd2p2RM9dP7dTubAiFOZDbR94Uds5cP7U7J2euXXLu+qndOTlr7YRCAAAAEAoBAABAKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAASHLL7wYURs33lpB0Nb+bYRdxz1bO7yYAAIACgCOFAAAAIBQCAACAUAgAAAARCgEAACBCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAACBCIQAAACS52TrBsmXLFBkZqZ9//lknTpxQUlKSZs2apYEDB1qMazAYspzf8ePHVaVKFUnS2bNn1bBhwwzHnTt3rnr37m1
rkwEAAJAFm0PhlClTFBMTo/Lly8vLy0sxMTEZjjthwgSrr58+fVrLly/Xv/71L1MgTK9+/foKCgqyeL1OnTq2NhcAAADZYHMo/PTTT+Xr6ysfHx/NmDFD77zzTobjTpw40err48ePlyQNHjzY6nA/P78MpwUAAID92RwKAwICcrXAhIQErVixQu7u7urXr1+u5gUAAAD7sDkU5tb69esVFxen7t27q0KFClbHuXjxoubOnasbN26oYsWKateunSpXrpzHLQUAAHAeeR4KFy1aJEkaMmRIhuPs2LFDO3bsMP3t5uamF154QZMnT1aRItnrMJ2QkJC7hmYgKSnJIfPNL7auJ2P9D9p6yA5qp3Zn5Mz1Uzu1O0rx4sUdNu/cyNNQeObMGe3Zs0dVqlRRYGCgxfASJUpowoQJCg4OVrVq1ZSYmKgDBw7o7bff1qxZs+Tu7q5///vf2VrW+fPnde/ePXuXYGypg+ab9zLrKJSZ2NhYO7ek8KB25+TMtUvOXT+1OydH1e7q6ipfX1+HzDu38jQULl68WKmpqRo4cKDVI34eHh5mHUxKly6txx9/XE2aNNFjjz2mWbNmKTQ0NFu3uvH29rZn003Sfjnccsi880PVqlVtGj8pKUmxsbHy8vKSu7u7g1pVMFE7tTtb7ZJz10/t1O5stedZKExJSdHSpUtVpEgRDRo0yKZpvby81KlTJy1btkyHDx9W+/bts5ymoB6aLWhyup7c3d2ddh1TO7U7I2eun9qp3Vnk2RNNwsPD9ffffyswMNDmo1OSVL58eUlSfHy8vZsGAADg9PIsFGang0lmDh8+LEny8fGxW5sAAACQJk9C4ZUrV7R582aVL19ejz/+eIbjHTp0SHfv3rV4/bPPPlNUVJRq164tPz8/RzYVAADAKdl8TeHChQsVGRkpSTpx4oSktKOAe/fulSQFBQUpODjYbJqlS5fq7t276tevX6YXbb711lv6448/5O/vr8qVKyshIUE//vijjh49KoPBoNmzZ8vFxcXWJgMAACALNofCyMhILV261Oy1qKgoRUVFSUo7vXt/KFy8eLGkrE8dP/XUU1q3bp1+/PFHXb16VVJa79gXX3xRo0eP5gbWAAAADmJzKAwLC1NYWJhN0+zfvz9b4w0ZMiTH1xwCAAAg5/KsowkAAAAKLkIhAAAACIUAAAAgFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAADKQShctmyZxo4dq4CAAHl6espgMGjJkiVWx506daoMBoPVf15eXhkuY8WKFWrfvr28vb318MMPq2/fvvrpp59sbSoAAACyyc3WCaZMmaKYmBiVL19eXl5eiomJyXKa/v37y8fHx3zBbtYX/dFHH2ny5MmqUqWKnn32Wd25c0ffffedunTpolWrVqlNmza2NhkAAABZsDkUfvrpp/L19ZWPj49mzJihd955J8tpBgwYkK0wFx0dralTp6pmzZr64YcfVLZsWUnSCy+8oA4dOmjMmDE6cOBAhoESAAAAOWPz6eOAgACLo372smTJEiUnJ+vll182BUJJqlOnjvr166fTp09r9+7dDlk2AACAM8uTjiaRkZGaOXOmPv30U23ZskWJiYlWx9u7d68kqX379hbDjK/t27fPcQ0FAABwUnlyHva9994z+7tixYoKCwtTYGCg2evR0dEqVaqU1U4oNWrUMI2THQkJCTlsbeaSkpIcMt/8Yut6Mtb/oK2H7KB2andGzlw/tVO7oxQvXtxh884Nh4ZCPz8/hYWFyd/fX56enjp//rxWrVql6dOnq3///tq
2bZv8/PxM49+8eVMeHh5W51W6dGnTONlx/vx53bt3L/dFWFXCQfPNe9npKGRNbGysnVtSeFC7c3Lm2iXnrp/anZOjand1dZWvr69D5p1bDg2FwcHBZn/7+vpq/Pjx8vT0VGhoqD788EMtWLDAIcv29vZ2yHzTfjnccsi880PVqlVtGj8pKUmxsbHy8vKSu7u7g1pVMFE7tTtb7ZJz10/t1O5stedLN97+/fvr5Zdf1v79+81eL1OmTIZHAm/dumUaJzsK6qHZgian68nd3d1p1zG1U7szcub6qZ3anUW+PNHE3d1dpUqVUnx8vNnrNWrU0O3bt60esjVeS2i8thAAAAD2ky+hMDo6WnFxcRa3tvH395ckbd++3WIa42vGcQAAAGA/DguFt27d0vHjxy1ej4uL06hRoyRJffr0MRs2cOBAubm56aOPPtKNGzdMr588eVLffvutqlevrrZt2zqqyQAAAE7L5msKFy5cqMjISEnSiRMnJEmLFi0y3WMwKChIwcHBunbtmlq3bq3GjRurbt268vDw0Pnz5xUeHq5r164pMDBQI0aMMJt3zZo19dprr2nKlCny9/dX9+7dFR8fr1WrVunu3buaOXMmTzMBAABwAJsTVmRkpJYuXWr2WlRUlKKioiRJPj4+Cg4OVrly5fTcc8/pwIED2rx5s27cuKESJUqoXr16CgkJ0ZAhQ+Tq6mox/1deeUU+Pj4KCwvT119/raJFi+rRRx/VpEmT1KRJkxyWCQAAgMzYHArDwsIUFhaW5XhlypTRBx98kKNGhYSEKCQkJEfTAgAAwHb50tEEAAAABQuhEAAAAIRCAAAAEAoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAA5SAULlu2TGPHjlVAQIA8PT1lMBi0ZMkSi/Hu3r2rtWvXavjw4Xr00Ufl7e2tKlWqqEOHDvrqq6907949i2nOnj0rg8GQ4b9Vq1blrEoAAABkys3WCaZMmaKYmBiVL19eXl5eiomJsTre6dOn9fTTT6t06dJq06aNHn/8cd28eVObN2/WK6+8ovDwcC1dulQuLi4W09avX19BQUEWr9epU8fW5gIAACAbbA6Fn376qXx9feXj46MZM2bonXfesTpeqVKl9NFHH6l///4qUaKE6fUpU6YoODhYmzdv1tq1a9WjRw+Laf38/DRx4kRbmwYAAIAcsvn0cUBAgHx8fLIcz9vbW8OGDTMLhJJUsmRJjRw5UpK0b98+WxcPAAAAB7D5SKE9FC1aVJLk6upqdfjFixc1d+5c3bhxQxUrVlS7du1UuXLlvGwiAACAU8mXULh48WJJUvv27a0O37Fjh3bs2GH6283NTS+88IImT56sIkWyd3AzISEh9w21IikpySHzzS+2ridj/Q/aesgOaqd2Z+TM9VM7tTtK8eLFHTbv3MjzUDh//nxt27ZNbdu2VefOnc2GlShRQhMmTFBwcLCqVaumxMREHThwQG+//bZmzZold3d3/fvf/87Wcs6fP2+1h7N9lMh6lEIio45CWYmNjbVzSwoPandOzly75Nz1U7tzclTtrq6u8vX1dci8cytPQ+GWLVs0fvx4Va1aVXPmzLEY7uHhYdbBpHTp0nr88cfVpEkTPfbYY5o1a5ZCQ0NlMBiyXJa3t7c9m26S9svhlkPmnR+qVq1q0/hJSUmKjY2Vl5eX3N3dHdSqgonaqd3Zapecu35qp3Znqz3PQuEPP/ygIUOGyNPTU+vXr1fFihWzPa2Xl5c6deqkZcuW6fDhwxmedk6voB6aLWhyup7c3d2ddh1TO7U
7I2eun9qp3VnkyRNNwsPDNXDgQJUvX17r169XtWrVbJ5H+fLlJUnx8fF2bh0AAAAcHgqNgdBgMGj9+vU5Po9++PBhScrW7XAAAABgG4eGwvsDYY0aNTId/9ChQ7p7967F65999pmioqJUu3Zt+fn5Oaq5AAAATsvmawoXLlyoyMhISdKJEyckSYsWLdLevXslSUFBQQoODtbvv/+ugQMHKjExUa1bt9bKlSst5uXj46OBAwea/n7rrbf0xx9/yN/fX5UrV1ZCQoJ+/PFHHT16VAaDQbNnz7b6WDwAAADkjs2hMDIyUkuXLjV7LSoqSlFRUZLSgl5wcLBiY2OVmJgoSVq1apXVefn7+5uFwqeeekrr1q3Tjz/+qKtXr0pK6x374osvavTo0dzAGgAAwEFsDoVhYWEKCwvLcrw2bdooLi7OpnkPGTJEQ4YMsbVJAAAAyKU86X0MAACAgo1QCAAAAEIhAAAACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACAchAKly1bprFjxyogIECenp4yGAxasmRJhuPfvHlTkyZNUv369eXp6an69etr0qRJunnzZobTrFixQu3bt5e3t7cefvhh9e3bVz/99JOtTQUAAEA22RwKp0yZovnz5ysmJkZeXl6Zjnvnzh0FBQXp888/1yOPPKIRI0aodu3a+vzzzxUUFKQ7d+5YTPPRRx/pueee06VLl/Tss8+qZ8+e2r9/v7p06aI9e/bY2lwAAABkg82h8NNPP9XRo0cVHR2toUOHZjruzJkzdezYMYWGhmr16tV6++23tXLlSr366qs6duyYZs6caTZ+dHS0pk6dqpo1a2rfvn1699139fHHH2vLli1yc3PTmDFjlJycbGuTAQAAkAWbQ2FAQIB8fHyyHC81NVWLFi1SqVKl9Oqrr5oNGzdunAwGgxYvXqzU1FTT60uWLFFycrJefvlllS1b1vR6nTp11K9fP50+fVq7d++2tckAAADIgsM6mkRHR+vChQtq0aKFSpYsaTasePHiatWqlc6fP69Tp06ZXt+7d68kqX379hbzM762b98+RzUZAADAabk5asbR0dGSJF9fX6vDa9SoYRov/f9LlSpl9VrF9ONkR0JCgs1tzo6kpCSHzDe/2LqejPU/aOshO6id2p2RM9dP7dTuKMWLF3fYvHPDYaHQ2Ls4/Wng9EqXLm02nvH/Hh4e2R4/M+fPn9e9e/ey3V7blHDQfPNeTExMjqaLjY21c0sKD2p3Ts5cu+Tc9VO7c3JU7a6urhkeMMtvDguF+c3b29sh80375XDLIfPOD1WrVrVp/KSkJMXGxsrLy0vu7u4OalXBRO3U7my1S85dP7VTu7PV7rBQWKZMGUnSjRs3rA6/deuW2XjG/2d0JNDa+JkpqIdmC5qcrid3d3enXcfUTu3OyJnrp3ZqdxYO62hivAYwfUeS9IzXBhrHM/7/9u3bVg/ZWhsfAAAA9uHQUFipUiXt37/f4ibVCQkJioiIUKVKlczOq/v7+0uStm/fbjE/42vGcQAAAGA/DguFLi4uGjx4sG7fvq3//ve/ZsOmT5+uuLg4DR48WC4uLqbXBw4cKDc3N3300Udmp51Pnjypb7/9VtWrV1fbtm0d1WQAAACnZfM1hQsXLlRkZKQk6cSJE5KkRYsWme4xGBQUpODgYElSaGiovv/+e82cOVNHjx5Vo0aNdPz4cW3btk1+fn4KDQ01m3fNmjX12muvacqUKfL391f37t0VHx+vVatW6e7du5o5c6bc3B7YvjEAAAD5xuaEFRkZqaVLl5q
9FhUVpaioKEmSj4+PKRSWLFlSGzZs0LRp07Ru3Trt3btXXl5eGjFihCZMmGBxU2tJeuWVV+Tj46OwsDB9/fXXKlq0qB599FFNmjRJTZo0yUmNAAAAyILNoTAsLExhYWHZHr9s2bJ677339N5772V7mpCQEIWEhNjaNAAAAOSQw64pBAAAQOFBKAQAAAChEAAAAIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAKA9C4ZIlS2QwGDL9161bN9P4U6dOzXA8Ly8vRzcXAADAKbk5egF+fn6aMGGC1WHr1q3TyZMn1aFDB4th/fv3l4+Pj9lrbm4Oby4AAIBTcnjKatCggRo0aGDxelJSkr788ku5ubmpf//+FsMHDBigNm3aOLp5AAAAUD5eU7hhwwZdu3ZNXbp0kaenZ341AwAAAMqDI4UZWbRokSRpyJAhVodHRkbq8OHDKlKkiGrVqqWAgAAVK1Ys2/NPSEiwSzvvl5SU5JD55hdb15Ox/gdtPWQHtVO7M3Lm+qmd2h2lePHiDpt3brjExcWl5vVCz507p0aNGqlixYo6duyYXF1dTcOmTp2qadOmWUxTsWJFhYWFKTAwMFvLOHXqlO7du2e3NqfXfG8Jh8w3PxxoHZ/fTQAAwGm4urrK19c3v5thVb4cKVyyZIlSUlI0YMAAs0AopXVMCQsLk7+/vzw9PXX+/HmtWrVK06dPV//+/bVt2zb5+flluQxvb2+HtD3tl8Mth8w7P1StWtWm8ZOSkhQbGysvLy+5u7s7qFUFE7VTu7PVLjl3/dRO7c5We56HwpSUFC1ZskQuLi4aNGiQxfDg4GCzv319fTV+/Hh5enoqNDRUH374oRYsWJDlcgrqodmCJqfryd3d3WnXMbVTuzNy5vqpndqdRZ53NNmxY4f++usvtW3bVtWqVcv2dP3795ebm5v279/vuMYBAAA4qTwPhVl1MMmIu7u7SpUqpfh4roEDAACwtzwNhdeuXdOmTZtUrlw5i9PEWYmOjlZcXJzFDa0BAACQe3kaCr/99lslJSUpJCTE6u1lbt26pePHj1u8HhcXp1GjRkmS+vTp4/B2AgAAOJs87WiyePFiSRmfOr527Zpat26txo0bq27duvLw8ND58+cVHh6ua9euKTAwUCNGjMjLJgMAADiFPAuFhw4d0okTJ9S0aVPVq1fP6jjlypXTc889pwMHDmjz5s26ceOGSpQooXr16ikkJERDhgyxuIUNAAAAci/PQmHTpk0VFxeX6ThlypTRBx98kDcNAgAAgEm+PfsYAAAABQehEAAAAIRCAAAAEAoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAAEQoBAAAgQiEAAABEKAQAAIAIhQAAABChEAAAACIUAgAAQIRCAAAAiFAIAAAA5VEo9PPzk8FgsPrvpZdeshj/5s2bmjRpkurXry9PT0/Vr19fkyZN0s2bN/OiuQAAAE7HLa8WVKZMGQ0fPtzi9caNG5v9fefOHQUFBenYsWMKDAxUnz59dPz4cX3++efas2ePNm/erJIlS+ZVswEAAJxCnoXCsmXLauLEiVmON3PmTB07dkyhoaF65513TK+/9957+u9//6uZM2dq0qRJjmwqAACA0ylQ1xSmpqZq0aJFKlWqlF5
99VWzYePGjZPBYNDixYuVmpqaTy0EAAB4MOXZkcKkpCR98803unDhggwGgx599FH5+fmZjRMdHa0LFy6oQ4cOFqeIixcvrlatWmnTpk06deqUatSokenyEhIS7F6DlFbHg8TW9WSs/0FbD9lB7dTujJy5fmqndkcpXry4w+adG3kWCmNjYzVixAiz1zp27KgvvvhC5cuXl5QWCiXJ19fX6jyMQTA6OjrLUHj+/Hndu3cvt83OQAkHzTfvxcTE5Gi62NhYO7ek8KB25+TMtUvOXT+1OydH1e7q6pphzslveRIKBw0aJH9/f9WpU0fu7u767bffNG3aNG3btk39+/fXli1b5OLiYupdXLZsWavzKV26tCRlqxeyt7e3/QpIJ+2Xwy2HzDs/VK1a1abxk5KSFBsbKy8vL7m7uzuoVQUTtVO7s9UuOXf91E7tzlZ7noTCCRMmmP3drFkzLVu2TEFBQYqMjNTWrVvVpUsXuy6zoB6aLWhyup7c3d2ddh1TO7U7I2eun9qp3VnkW0eTIkWKaMCAAZKk/fv3S0q7bY0k3bhxw+o0t27dMhsPAAAA9pGvvY+N1xLGx8dL+t81g6dOnbI6vvGaw6yuJwQAAIBt8jUUHjp0SJLk4+MjKS3sVapUSfv379edO3fMxk1ISFBERIQqVapUYC/QBAAAKKwcHgp//fVXxcXFWbweGRmpWbNmqVixYnryySclSS4uLho8eLBu376t//73v2bjT58+XXFxcRo8eLBcXFwc3WwAAACn4vCOJqtXr9Ynn3yitm3bysfHR8WKFdPJkye1fft2FSlSRDNmzDDrARsaGqrvv/9eM2fO1NGjR9WoUSMdP35c27Ztk5+fn0JDQx3dZAAAAKfj8FDYpk0b/f777zpy5IgiIiKUkJAgT09P9erVSyNGjFDTpk3Nxi9ZsqQ2bNigadOmad26ddq7d6+8vLw0YsQITZgwgeceAwAAOIDDQ2Hr1q3VunVrm6YpW7as3nvvPb333nsOahUAAADSK1DPPgYAAED+IBQCAACAUAgAAABCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAACBCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAACBCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAACBCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAAKA8CIXnz5/X559/rp49e6p+/fry8PBQrVq1NHjwYB08eNBi/KlTp8pgMFj95+Xl5ejmAgAAOCU3Ry9gzpw5+vjjj1W9enUFBATIw8ND0dHR2rhxozZu3Ki5c+eqZ8+eFtP1799fPj4+5o11c3hzAQAAnJLDU1aTJk20adMmtWrVyuz1iIgIde/eXePGjdMTTzyhYsWKmQ0fMGCA2rRp4+jmAQAAQHlw+rhbt24WgVCSWrVqpTZt2uj69es6ceKEo5sBAACATOTr+diiRYtKklxdXS2GRUZG6vDhwypSpIhq1aqlgIAAi6OJmUlISLBbO9NLSkpyyHzzi63ryVj/g7YesoPaqd0ZOXP91E7tjlK8eHGHzTs3XOLi4lLzY8ExMTFq1qyZDAaDTpw4YQqGU6dO1bRp0yzGr1ixosLCwhQYGJit+Z86dUr37t2za5uNmu8t4ZD55ocDrePzuwkAADgNV1dX+fr65nczrMqXUHj37l11795dERERmj17tvr162catmHDBt26dUv+/v7y9PTU+fPntWrVKk2fPl2pqanatm2b/Pz8slyGI48U+qy65ZB554eL/cvbNH5SUpJiY2Pl5eUld3d3B7WqYKJ2ane22iXnrp/aqd1RtRfUI4V5fvo4JSVFI0eOVEREhJ5++mmzQChJwcHBZn/7+vpq/Pjx8vT0VGhoqD788EMtWLAgy+UU1BVe0OR0Pbm
7uzvtOqZ2andGzlw/tVO7s8jTm1enpqZqzJgxWr58uUJCQjRjxoxsT9u/f3+5ublp//79DmwhAACAc8qzUJiSkqJRo0Zp8eLF6tOnj8LCwlSkSPYX7+7urlKlSik+nmvgAAAA7C1PQmFKSopGjx6tJUuWqFevXvriiy+s9jjOTHR0tOLi4ixuaA0AAIDcc3goNB4hXLJkiXr06KE5c+ZkGAhv3bql48ePW7weFxenUaNGSZL69Onj0PYCAAA4I4d3NJk2bZq++eYblSpVSjVr1tQHH3xgMU5QUJAaNGiga9euqXXr1mrcuLHq1q0rDw8PnT9/XuHh4bp27ZoCAwM1YsQIRzcZAADA6Tg8FJ47d06SdPv2bX344YdWx/Hx8VGDBg1Urlw5Pffcczpw4IA2b96sGzduqESJEqpXr55CQkI0ZMgQm087AwAAIGsOD4VhYWEKCwvL1rhlypSxeiQRAAAAjpWnt6QBAABAwUQoBAAAAKEQAAAAhEIAAACIUAgAAAARCgEAACBCIQAAAEQoBAAAgAiFAAAAEKEQAAAAIhQCAABAhEIAAACIUAgAAAARCgEAACDJLb8bAAAFXfO9JSRdze9m2EXcs5XzuwkACiiOFAIAAIBQCAAAAEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACACnAoPHz4sPr27auHH35Y3t7eat++vVasWJHfzQIAAHggueV3A6zZs2ePevfuLXd3d/Xq1UtlypTR+vXr9dxzz+ncuXN6+eWX87uJAADgPoZ5f+d3E+ziQOv8bkH+KHChMDk5WWPGjJGLi4s2btyohg0bSpImTJigzp07a+rUqerRo4dq1KiRb20sX6zAHmDNE66urvndhHxD7c6Jbd5533tqt82Dsq046/vuEhcXl5rfjUhv+/bt6tWrlwYOHKhZs2aZDfvuu+80dOhQjRs3Tm+99VY+tRAAAODBU+Ai/d69eyVJ7du3txhmfG3fvn152iYAAIAHXYELhdHR0ZJk9fSwwWBQ+fLlTeMAAADAPgpcKLx586YkqUyZMlaHly5d2jQOAAAA7KPAhUIAAADkvQIXCo1HCDM6Gnjr1q0MjyICAAAgZwpcKDReS2jtusG4uDhdvXo1X29HAwAA8CAqcKHQ399fUtqtae5nfM04DgAAAOyjwN2nMDk5Wc2aNdOFCxe0bds2NWjQQFLaaePOnTvrjz/+UFRUlGrWrJnPLQUAAHhwFLgjhW5ubvrkk0+UkpKiJ554QqGhoXrjjTfUunVrnTx5Uq+99lqOA6E9nqeckpKiOXPmqFWrVqpYsaJq1KihZ555JtPb5BSE5zjntg2RkZF6/fXX1a5dO1WvXl1eXl5q3ry5/v3vfysuLs7qNH5+fjIYDFb/vfTSS3aqLGu5rX3Pnj0Z1mEwGHTgwAGHLNdectuOoKCgTOs3GAz69ttvzaYpCO/9smXLNHbsWAUEBMjT01MGg0FLliyxeT6FcZu3R+2FdZu3R+2FeZu3R/2FcZs/f/68Pv/8c/Xs2VP169eXh4eHatWqpcGDB+vgwYM2zaswbvP2UuAecydJbdu21ebNmzV16lStXr1ad+/eVe3atfX6668rJCQkR/O01/OUX3rpJS1YsEC1a9fW888/r0uXLmn16tXavn27tm7dqtq1aztkublhjzY8/fTTunr1qlq2bKl+/frJxcVFe/fu1cyZM7Vu3Tpt3bpVHh4eFtOVKVNGw4cPt3i9cePGdqktK/Zc//7+/mrd2vKBmN7e3g5dbm7Yox0DBgywWndycrKmT5+uIkWKqF27dhbD8/u9nzJlimJiYlS+fHl5eXkpJiYmR/MpjNu8PWovrNu8vd53qXBu8/aovzBu83PmzNHHH3+s6tWrKyAgQB4eHoqOjtbGjRu1ceN
GzZ07Vz179szWvArjNm8vBe70sSMkJyerefPmOn/+vLZu3Wp6nnL6U9L79+/PsgPL7t271a1bNz322GNas2aNihUrJknatWuXevTooccee0ybNm2y+3Jzw15t+Pjjj9WvXz9VrFjR9FpqaqpeeeUVzZ07V//v//0/ffjhh2bT+Pn5SZKOHTtm56qyx16179mzR08++aQmTJigiRMn5tlyc8vR7Vi7dq2efvppde3a1epRAyn/3ntJ2rlzp3x9feXj46MZM2bonXfe0axZszRw4MBsz6MwbvOSfWovjNu8ZJ/aC+s2L9mn/owU5G1+3bp1qlChglq1amX2ekREhLp3765SpUrp119/NW3DGSms27y9FLjTx46we/dunT59Wn369DG9YVLajbDHjx+v5OTkbB1eX7hwoSTpjTfeMPtgtWvXTh06dFBERIT+/PNPuy83N+zVhrFjx5p9OUiSi4uLxo8fL6lgPnowv9Z/QXjf86IdixYtkiQNHjw41211hICAAPn4+ORqHoVxm5fsU3th3OYl+9Ruq4LyvkuOrb8gb/PdunWzCISS1KpVK7Vp00bXr1/XiRMnspxPYd3m7aVAnj62N3s9T3nv3r0qWbKkWrZsaXU+4eHh2rdvn+max4LwHGdHt6Fo0aKSJFdXV6vDk5KS9M033+jChQsyGAx69NFHTb8oHc3etZ86dUqzZ8/WP//8o6pVqyowMFDly5d3+HJzypHt+Pvvv7V9+3Z5eXmpS5cuVsfJz/feXgrjNu9oBXmbt7fCts07UmHe5rP6zKbn7Nu8U4RCezxP+c6dO7p48aLq1q1r9YNl7f6KBeE5zo5uw+LFiyVZ3yAkKTY2ViNGjDB7rWPHjvriiy+s7lztyd61r1ixwuzC4YceekgTJ07UmDFjHLrcnHJkO5YsWaKUlBQNGDBAbm7WdyP5+d7bQ2Hd5h2tIG/z9lbYtnlHKqzbfExMjHbu3CkvLy/Vq1cv03HZ5p3k9LE9nqecnXmkH89ey80tR7bh6NGjmjZtmjw8PBQaGmoxfNCgQdqwYYOio6MVExOj8PBwderUSeHh4erfv79SUx17Oau9aq9QoYImT56sH3/8UefPn9fJkyc1Z84clStXTm+99ZbmzZvnkOXmlqPakZqaajodktFppPx+7+2hsG7zjlTQt3l7KazbvKMU1m3+7t27euGFF5SYmKh33nknyyOFbPNOcqQQ9nfmzBn169dP9+7d09y5c63+CpwwYYLZ382aNdOyZcsUFBSkyMhIbd26NcPTEAVJnTp1VKdOHdPfJUqUUEhIiOrXr6+AgABNnTpVTz/9tIoUcYrfWNq9e7fOnj0rf39/+fr6Wh3nQXnv8T9s82zzhWmbT0lJ0ciRIxUREaGnn35a/fr1y7NlF2ZO8Ym2x/OUszOP9OPZa7m55Yg2nDt3Tk8++aSuXLmiBQsWqG3bttmetkiRIhowYIAkaf/+/TYt11aOXv9169ZV06ZNdenSJZ06dSrPlptdjmqH8ULsIUOG2DRdXr739lBYt3lHKCzbvKMV9G3eUQrbNp+amqoxY8Zo+fLlCgkJ0YwZM7I1Hdu8k4RCezxPuWTJkqpYsaLOnj2re/fuWQy3dl1BQXiOs73bcPbsWQUHB+vixYuaN2+eunbtanObjEcY4uPjbZ7WFnmx/q3VUhDed0e1Iy4uThs2bFDZsmXVrVs3m9uUV++9PRTWbd7eCtM2nxcK8jbvCIVtm09JSdGoUaO0ePFi9enTR2FhYdk+oss27ySh0F7PU/b399edO3cUFRWVrfkUhOc427MNxi+HCxcu6Ouvv1ZQUFCO2nTo0CFJcvhtIxy9/pOTk3XkyBG5uLioatWqebbc7HJEO5YtW6bExESFhITooYcesrlNefXe20th3ObtqbBt845W0Ld5RyhM23xKSopGjx6tJUuWqFevXvriiy+y1eM4PWff5p0iFLZr107VqlXTypUrdfToUdPrt27d0gcffCA3NzfTIW5
Junr1qn7//XddvXrVbD5PP/20pLQ7xiclJZle37Vrl3744Qe1atXK7BF8ti7XEexVe/ovh7lz5+rJJ5/MdLm//vqr1UdhRUZGatasWSpWrFiW88gte9X+448/WlwknZycrDfffFMxMTHq0KGDypUrl+PlOoq96k/PeJ+yQYMGZThOQXjvbfUgbfO2epC2eVs9aNu8rR6kbd54hHDJkiXq0aOH5syZk2kgdOZtPjNO8UQTKe1C2d69e6tYsWLq3bu3SpcurfXr1+vs2bN644039Morr5jGnTp1qqZNm2b1bvZjxozRwoULVbt2bXXu3Nn0+JtixYpZffyNLcstyLX7+fkpJiZGzZs3z/BWFOnHnzp1qj755BO1bdtWPj4+KlasmE6ePKnt27erSJEimjFjhs3Xp+SEvWp3cXFRixYtVKlSJd24cUMRERH6448/VKVKFW3atMniV3BBeN/tVb/Rzz//rICAADVs2FC7du3KcJkF5b1fuHChIiMjJUknTpzQkSNH1LJlS1WvXl1S2vNdg4ODTW1+kLZ5e9ReWLd5e9VeWLd5e33upcK1zRtrKVWqlF588UWrgTAoKEgNGjQwG/9B2ebtxWl6H9vrecoff/yx6tWrp/nz5+uLL75QyZIl1bVrV7355ptmvx7svdzcsEcbjM/PPHDgQIYPg0+/YbVp00a///67jhw5ooiICCUkJMjT01O9evXSiBEj1LRp09wXlg32qH3YsGEKDw/X3r17dfXqVbm5ual69ep65ZVXNGrUKBkMBocs1x7s2Q7jEYOsdu4F5b2PjIzU0qVLzV6LiooynRby8fExfTlmpjBu8/aovbBu8/aovTBv8/b63EuFa5s/d+6cJOn27dsWj1808vHxMYXCzBTGbd5enOZIIQAAADLmFNcUAgAAIHOEQgAAABAKAQAAQCgEAACACIUAAAAQoRAAAAAiFAIAAECEQgAAAIhQCAAAABEKAQAAIEIhAAAARCgEAACApP8P5g5nzwYTh+gAAAAASUVORK5CYII=", 239 | "text/plain": [ 240 | "
" 241 | ] 242 | }, 243 | "metadata": {}, 244 | "output_type": "display_data" 245 | } 246 | ], 247 | "source": [ 248 | "fig2 = plt.hist(mb_fp)\n", 249 | "plt.title(r'Histogram of False Positives after Bootstrap')\n", 250 | "plt.show()" 251 | ] 252 | } 253 | ], 254 | "metadata": { 255 | "kernelspec": { 256 | "display_name": "Python 3", 257 | "language": "python", 258 | "name": "python3" 259 | }, 260 | "language_info": { 261 | "codemirror_mode": { 262 | "name": "ipython", 263 | "version": 3 264 | }, 265 | "file_extension": ".py", 266 | "mimetype": "text/x-python", 267 | "name": "python", 268 | "nbconvert_exporter": "python", 269 | "pygments_lexer": "ipython3", 270 | "version": "3.11.7" 271 | } 272 | }, 273 | "nbformat": 4, 274 | "nbformat_minor": 5 275 | } 276 | --------------------------------------------------------------------------------