├── .gitignore ├── LICENSE ├── README.md ├── README.rst ├── genetic_algorithm_Ackley.gif ├── genetic_algorithm_Rastrigin.gif ├── genetic_algorithm_Weierstrass.gif ├── genetic_algorithm_convergence.gif ├── genetic_algorithm_convergence_curve.gif ├── geneticalgorithm ├── __init__.py └── geneticalgorithm.py └── setup.py /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | 5 | # C extensions 6 | *.so 7 | 8 | # Distribution / packaging 9 | .Python 10 | env/ 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | *.egg-info/ 23 | .installed.cfg 24 | *.egg 25 | 26 | # PyInstaller 27 | # Usually these files are written by a python script from a template 28 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 29 | *.manifest 30 | *.spec 31 | 32 | # Installer logs 33 | pip-log.txt 34 | pip-delete-this-directory.txt 35 | 36 | # Unit test / coverage reports 37 | htmlcov/ 38 | .tox/ 39 | .coverage 40 | .coverage.* 41 | .cache 42 | nosetests.xml 43 | coverage.xml 44 | *,cover 45 | 46 | # Translations 47 | *.mo 48 | *.pot 49 | 50 | # Django stuff: 51 | *.log 52 | 53 | # Sphinx documentation 54 | docs/_build/ 55 | 56 | # PyBuilder 57 | target/ 58 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | ''' 2 | 3 | Copyright 2020 Ryan (Mohammad) Solgi 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy of 6 | this software and associated documentation files (the "Software"), to deal in 7 | the Software without restriction, including without limitation the rights to use, 8 | copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the 9 | Software, and to permit persons to whom the 
Software is furnished to do so, 10 | subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 18 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | 23 | ''' 24 | 25 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | # geneticalgorithm 3 | 4 | geneticalgorithm is a Python library distributed on [PyPI](https://pypi.org) for implementing standard and elitist 5 | [genetic algorithms](https://towardsdatascience.com/introduction-to-optimization-with-genetic-algorithm-2f5001d9964b) (GA). 6 | This package solves continuous, [combinatorial](https://en.wikipedia.org/wiki/Combinatorial_optimization) 7 | and mixed [optimization](https://en.wikipedia.org/wiki/Optimization_problem) problems 8 | with continuous, discrete, and mixed variables. 9 | It provides an easy implementation of the genetic algorithm (GA) in Python. 10 | 11 | ## Installation 12 | 13 | Use the package manager [pip](https://pip.pypa.io/en/stable/) to install geneticalgorithm: 14 | 15 | ```sh 16 | pip install geneticalgorithm 17 | ``` 18 | ## Version 1.0.2 updates 19 | @param convergence_curve - Whether to plot the convergence curve. Default is True. 20 | 21 | @param progress_bar - Whether to show the progress bar. Default is True.
22 | 23 | ## A simple example 24 | Assume we want to find a set of X=(x1,x2,x3) that minimizes the function f(X)=x1+x2+x3, where each variable can be any real number in \[0,10\]. 25 | This is a trivial problem and we already know that the answer is X=(0,0,0) where f(X)=0. 26 | We just use this simple example to see how to implement geneticalgorithm: 27 | 28 | 29 | First we import geneticalgorithm and [numpy](https://numpy.org). Next, we define the 30 | function f which we want to minimize and the boundaries of the decision variables; 31 | then geneticalgorithm is called to solve the defined optimization problem as follows: 32 | ```python 33 | import numpy as np 34 | from geneticalgorithm import geneticalgorithm as ga 35 | 36 | def f(X): 37 | return np.sum(X) 38 | 39 | 40 | varbound=np.array([[0,10]]*3) 41 | 42 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 43 | 44 | model.run() 45 | 46 | ``` 47 | Notice that we define the function f so that its output is the 48 | objective function we want to minimize, where the input is the set of X (decision variables). 49 | The boundaries for the variables must be defined as a numpy array, and for each 50 | variable we need a separate boundary. Here I have three variables and all of 51 | them have the same boundaries (for the case where the boundaries are different, see the example with mixed variables). 52 | 53 | 54 | 55 | geneticalgorithm has some arguments: 56 | Obviously the first argument is the function f we already defined (for more details about the argument and output see [Function](#1111-id)). 57 | Our problem has three variables so we set dimension equal to three. 58 | Variables are real (continuous) so we use the string 'real' to indicate the type of the 59 | variables (geneticalgorithm accepts other types including Boolean, integer and 60 | mixed; see the other examples). 61 | Finally, we input varbound, which includes the boundaries of the variables.
62 | Note that the length of variable_boundaries must be equal to dimension. 63 | 64 | If you run the code, you should see a progress bar that shows the progress of the 65 | genetic algorithm (GA) and then the solution, the objective function value and the convergence curve as follows: 66 | 67 | 68 | ![](https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_convergence.gif) 69 | 70 | We can also access the best answer of the defined optimization problem found by geneticalgorithm as a dictionary, as well as a report of the progress of the genetic algorithm. 71 | To do so we complete the code as follows: 72 | ```python 73 | convergence=model.report 74 | solution=model.output_dict 75 | ``` 76 | 77 | output_dict is a dictionary including the best set of variables found and the value of the given function associated with it ({'variable': , 'function': }). 78 | report is a list recording the convergence of the algorithm over iterations. 79 | 80 | ## The simple example with integer variables 81 | 82 | Consider the problem given in the simple example above. 83 | Now assume all variables are integers, so x1, x2, x3 can be any integers in \[0,10\]. 84 | In this case the code is as follows: 85 | 86 | ```python 87 | import numpy as np 88 | from geneticalgorithm import geneticalgorithm as ga 89 | 90 | def f(X): 91 | return np.sum(X) 92 | 93 | 94 | varbound=np.array([[0,10]]*3) 95 | 96 | model=ga(function=f,dimension=3,variable_type='int',variable_boundaries=varbound) 97 | 98 | model.run() 99 | 100 | 101 | ``` 102 | As seen, the only difference is that for variable_type we use the string 'int'. 103 | 104 | ## The simple example with Boolean variables 105 | 106 | Consider the problem given in the simple example above. 107 | Now assume all variables are Boolean instead of real or integer, so each variable can be either zero or one. Also, instead of three, let's have 30 variables.
108 | In this case the code is as follows: 109 | 110 | ```python 111 | import numpy as np 112 | from geneticalgorithm import geneticalgorithm as ga 113 | 114 | def f(X): 115 | return np.sum(X) 116 | 117 | 118 | model=ga(function=f,dimension=30,variable_type='bool') 119 | 120 | model.run() 121 | 122 | 123 | ``` 124 | 125 | Note that for variable_type we use the string 'bool' when all variables are Boolean. 126 | Note that when variable_type equals 'bool' there is no need for variable_boundaries to be defined. 127 | 128 | ## The simple example with mixed variables 129 | 130 | Consider the problem given in the simple example above where we want to minimize f(X)=x1+x2+x3. 131 | Now assume x1 is a real (continuous) variable in \[0.5,1.5\], x2 is an integer variable in \[1,100\], and x3 is a Boolean variable that can be either zero or one. 132 | We already know that the answer is X=(0.5,1,0) where f(X)=1.5. 133 | We implement geneticalgorithm as follows: 134 | 135 | ```python 136 | import numpy as np 137 | from geneticalgorithm import geneticalgorithm as ga 138 | 139 | def f(X): 140 | return np.sum(X) 141 | 142 | varbound=np.array([[0.5,1.5],[1,100],[0,1]]) 143 | vartype=np.array([['real'],['int'],['int']]) 144 | model=ga(function=f,dimension=3,variable_type_mixed=vartype,variable_boundaries=varbound) 145 | 146 | model.run() 147 | 148 | ``` 149 | 150 | Note that for mixed variables we need to define boundaries, and we also need to make a numpy array of variable types as above (vartype). Obviously the order of variables in both arrays must match. Also notice that in such a case, for Boolean variables we use the string 'int' and the boundary \[0,1\]. 151 | Notice that we use the argument variable_type_mixed to input a numpy array of variable types for functions with mixed variables. 152 | 153 | ## Maximization problems 154 | 155 | geneticalgorithm is designed to minimize the given function.
A simple trick to 155 | solve maximization problems is to multiply the objective function by -1; the maximum of the original function is then the negative of the minimized value. 156 | Consider the above simple example. Now let's find the maximum of f(X)=x1+x2+x3 where X is a set of real variables in \[0,10\]. 157 | We already know that the answer is X=(10,10,10) where f(X)=30. 158 | ```python 159 | import numpy as np 160 | from geneticalgorithm import geneticalgorithm as ga 161 | 162 | def f(X): 163 | return -np.sum(X) 164 | 165 | varbound=np.array([[0,10]]*3) 166 | 167 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 168 | 169 | model.run() 170 | 171 | ``` 172 | As seen above, np.sum(X) is multiplied by a negative sign. 173 | 174 | ## Optimization problems with constraints 175 | In all the above examples, the optimization problem was unconstrained. Now consider that we want to minimize f(X)=x1+x2+x3 where X is a set of real variables in \[0,10\], with an extra constraint that the sum of x1 and x2 must be greater than or equal to 2. The minimum of f(X) is 2. 176 | In such a case, a trick is to define a penalty function. Hence we use the code below: 177 | 178 | ```python 179 | import numpy as np 180 | from geneticalgorithm import geneticalgorithm as ga 181 | 182 | def f(X): 183 | pen=0 184 | if X[0]+X[1]<2: 185 | pen=500+1000*(2-X[0]-X[1]) 186 | return np.sum(X)+pen 187 | 188 | varbound=np.array([[0,10]]*3) 189 | 190 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 191 | 192 | model.run() 193 | 194 | ``` 195 | As seen above, we add a penalty to the objective function whenever the constraint is not met. 196 | 197 | Some hints about how to define a penalty function: 198 | 1- Usually you may use a constant greater than the maximum possible value of 199 | the objective function, if the maximum is known or can be estimated. Here the highest possible value of our function is 300 200 | (i.e. if all variables were 10, f(X)=300), so I chose a constant of 500. This way, if a trial solution is not in the feasible region, its penalized objective function (fitness) is worse than that of any feasible solution, even if its raw objective value is small. 202 | 2- Use a sufficiently large coefficient and multiply it by the amount of violation. 203 | This helps the algorithm learn how to approach the feasible domain. 204 | 3- How the penalty function is defined usually influences the convergence rate of 205 | an evolutionary algorithm. 206 | In my [book on metaheuristics and evolutionary algorithms](https://www.wiley.com/en-us/Meta+heuristic+and+Evolutionary+Algorithms+for+Engineering+Optimization-p-9781119386995) you can learn more about that. 207 | 4- Finally, after you have solved the problem, test the solution to see whether the constraints are met. If the solution does not meet the 208 | constraints, a bigger penalty is required. However, in problems where the 209 | optimum is exactly on the boundary of the feasible region (or very close to the constraints), which is common in some kinds of problems, a very strict and big penalty may prevent the genetic algorithm 210 | from approaching the optimal region. In such a case designing an appropriate penalty 211 | function might be more challenging. What we have to do is design 212 | a penalty function that lets the algorithm search the infeasible domain while finally 213 | converging to a feasible solution. Hence you may need more sophisticated penalty 214 | functions. But in most cases the above formulation works fairly well. 215 | 216 | ## Genetic algorithm's parameters 217 | 218 | Every evolutionary algorithm (metaheuristic) has some parameters to be adjusted. 219 | The [genetic algorithm](https://pathmind.com/wiki/evolutionary-genetic-algorithm) 220 | also has some parameters.
The parameters of geneticalgorithm are defined as a dictionary: 221 | 222 | ```python 223 | 224 | algorithm_param = {'max_num_iteration': None,\ 225 | 'population_size':100,\ 226 | 'mutation_probability':0.1,\ 227 | 'elit_ratio': 0.01,\ 228 | 'crossover_probability': 0.5,\ 229 | 'parents_portion': 0.3,\ 230 | 'crossover_type':'uniform',\ 231 | 'max_iteration_without_improv':None} 232 | 233 | ``` 234 | The above dictionary shows the default values, which are set already. 235 | One may simply copy this code, change the values, and use the modified dictionary as the argument of geneticalgorithm. 236 | Another way of accessing this dictionary is using the command below: 237 | 238 | ```python 239 | import numpy as np 240 | from geneticalgorithm import geneticalgorithm as ga 241 | 242 | def f(X): 243 | return np.sum(X) 244 | 245 | 246 | model=ga(function=f,dimension=3,variable_type='bool') 247 | 248 | print(model.param) 249 | 250 | ``` 251 | 252 | An example of setting a new set of parameters for the genetic algorithm and running geneticalgorithm for our first simple example again: 253 | 254 | ```python 255 | import numpy as np 256 | from geneticalgorithm import geneticalgorithm as ga 257 | 258 | def f(X): 259 | return np.sum(X) 260 | 261 | 262 | varbound=np.array([[0,10]]*3) 263 | 264 | algorithm_param = {'max_num_iteration': 3000,\ 265 | 'population_size':100,\ 266 | 'mutation_probability':0.1,\ 267 | 'elit_ratio': 0.01,\ 268 | 'crossover_probability': 0.5,\ 269 | 'parents_portion': 0.3,\ 270 | 'crossover_type':'uniform',\ 271 | 'max_iteration_without_improv':None} 272 | 273 | model=ga(function=f,\ 274 | dimension=3,\ 275 | variable_type='real',\ 276 | variable_boundaries=varbound,\ 277 | algorithm_parameters=algorithm_param) 278 | 279 | model.run() 280 | 281 | 282 | ``` 283 | 284 | Notice that max_num_iteration has been changed to 3000 (it was None before). 285 | In the above gif we saw that the algorithm ran for 1500 iterations.
286 | Since we did not define parameters, geneticalgorithm applied the default values. 287 | However, if you run this code, geneticalgorithm executes 3000 iterations this time. 288 | To change other parameters one may simply replace the values according to 289 | [Arguments](#1112-id). 290 | 291 | @ max_num_iteration: The termination criterion of geneticalgorithm. 292 | If this parameter's value is None, the algorithm sets the maximum number of iterations 293 | automatically as a function of the dimension, boundaries, and population size. The user may enter 294 | any number of iterations they want. It is highly recommended to set max_num_iteration explicitly rather than use None. 295 | 296 | @ population_size: determines the number of trial solutions in each iteration. 297 | The default value is 100. 298 | 299 | @ mutation_probability: determines the chance of each gene in each individual solution 300 | being replaced by a random value. The default is 0.1 (i.e. 10 percent). 301 | 302 | @ elit_ratio: determines the ratio of elites in the population. The default value is 303 | 0.01 (i.e. 1 percent). For example, when the population size is 100 and elit_ratio is 304 | 0.01, there is one elite in the population. If this parameter is set to zero, then 305 | geneticalgorithm implements a standard genetic algorithm instead of an elitist GA. 306 | 307 | @ crossover_probability: determines the chance of an existing solution passing its 308 | genome (aka characteristics) to new trial solutions (aka offspring); the default value is 0.5 (i.e. 50 percent). 309 | 310 | @ parents_portion: the portion of the population filled by the members of the previous generation (aka parents); 311 | default is 0.3 (i.e.
30 percent of the population) 312 | 313 | @ crossover_type: there are three options: 'one_point', 'two_point', and 'uniform' crossover functions; 314 | default is 'uniform' crossover 315 | 316 | @ max_iteration_without_improv: if the algorithm does not improve the objective function 317 | over the number of successive iterations determined by this parameter, then geneticalgorithm 318 | stops and reports the best solution found before max_num_iteration is met. The default value is None. 319 | 320 | ## Function 321 | The given function to be optimized must accept exactly one argument and return a scalar. 322 | The argument of the given function is a numpy array, which is entered by geneticalgorithm. 323 | If for any reason you do not want to work with numpy in your function, you may 324 | [turn the numpy array into a list](https://docs.scipy.org/doc/numpy-1.15.0/reference/generated/numpy.ndarray.tolist.html). 325 | 326 | ## Arguments 327 | 328 | @param function - the given objective function to be minimized 329 | NOTE: This implementation minimizes the given objective function. (For maximization, 330 | multiply the function by a negative sign; the negative of the output would then be 331 | the actual objective value.) 332 | 333 | @param dimension - the number of decision variables 334 | 335 | @param variable_type - 'bool' if all variables are Boolean; 'int' if all 336 | variables are integer; and 'real' if all variables are real-valued or continuous 337 | (for mixed types see @param variable_type_mixed). 338 | 339 | @param variable_boundaries - Default None; leave it None if 340 | variable_type is 'bool'; otherwise provide an array of pairs as 341 | boundaries for each variable; the length of the array must be equal to dimension. 342 | For example, np.array(\[\[0,100\],\[0,200\]\]) sets lower boundary 0 and upper boundary 100 343 | for the first variable, and lower boundary 0 and upper boundary 200 for the second variable, where dimension is 2.
344 | 345 | @param variable_type_mixed - Default None; leave it None if all variables have the same type; otherwise this can be used to specify the type of each variable separately. For example, if the first 346 | variable is integer but the second one is real, the input is: 347 | np.array(\[\['int'\],\['real'\]\]). NOTE: it does not accept 'bool'. If a variable's 348 | type is Boolean, use 'int' and provide the boundary \[0,1\] 349 | in variable_boundaries. Also, if variable_type_mixed is applied, 350 | variable_boundaries has to be defined. 351 | 352 | @param function_timeout - if the given function does not provide 353 | output before function_timeout (unit is seconds), the algorithm raises an error. 354 | For example, when there is an infinite loop in the given function. 355 | 356 | @param algorithm_parameters: 357 | @ max_num_iteration - stopping criterion of the genetic algorithm (GA) 358 | @ population_size 359 | @ mutation_probability 360 | @ elit_ratio 361 | @ crossover_probability 362 | @ parents_portion 363 | @ crossover_type - Default is 'uniform'; 'one_point' or 'two_point' 364 | are other options 365 | @ max_iteration_without_improv - maximum number of 366 | successive iterations without improvement. If None, it has no effect 367 | 368 | 369 | ## Methods and Outputs: 370 | 371 | methods: 372 | run(): implements the genetic algorithm (GA) 373 | 374 | param: a dictionary of parameters of the genetic algorithm (GA) 375 | 376 | output: 377 | 378 | output_dict: a dictionary including the best set of variables 379 | found and the value of the given function associated with it. 380 | {'variable': , 'function': } 381 | 382 | report: a record of the progress of the 383 | algorithm over iterations 384 | 385 | 386 | ## Function timeout 387 | 388 | geneticalgorithm is designed such that if the given function does not provide 389 | any output before timeout (the default value is 10 seconds), the algorithm 390 | terminates and raises the appropriate error.
In such a case, make sure the given function 391 | works correctly (i.e. there is no infinite loop in the given function). Also, if the given function takes more than 10 seconds to complete, 392 | make sure to increase function_timeout in the arguments. 393 | 394 | ## Standard GA vs. Elitist GA 395 | 396 | The convergence curve of an elitist genetic algorithm is always non-increasing. So, 397 | the best solution ever found is equal to the best solution of the last iteration. However, 398 | the convergence curve of a standard genetic algorithm is different. If elit_ratio is zero, 399 | geneticalgorithm implements a standard GA. The output of geneticalgorithm for a standard GA is the best 400 | solution ever found, not the solution of the last iteration. The difference 401 | between the convergence curves of standard GA and elitist GA is shown below: 402 | 403 | ![](https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_convergence_curve.gif) 404 | 405 | ## Hints on how to adjust genetic algorithm's parameters 406 | 407 | In general, the performance of a genetic algorithm, or any evolutionary algorithm, 408 | depends on its parameters. Parameter setting of an evolutionary algorithm is important. Usually these parameters are adjusted based on experience and by conducting a sensitivity analysis. 409 | It is impossible to provide a general guideline for parameter setting, but the suggestions provided below may help: 410 | 411 | Number of iterations: Select a sufficiently large max_num_iteration; otherwise the reported solution may not be satisfactory. On the other hand, 412 | selecting a very large number of iterations increases the run time significantly. So this is actually a compromise between 413 | the accuracy you want and the time and computational cost you spend. 414 | 415 | Population size: Given a constant number of function evaluations (max_num_iteration times population_size), I would 416 | select a smaller population size and more iterations.
However, a very small choice of 417 | population size is also detrimental. For most problems I would select a population size of 100, unless the dimension of the problem is so large that it needs a bigger population size. 418 | 419 | elit_ratio: Although having a few elites is usually a good idea and may increase the rate of 420 | convergence in some problems, having too many elites in the population may cause the algorithm to be easily trapped in a local optimum. I would usually select only one elite in most cases. Elitism is not always necessary, and in some problems it may even be detrimental. 421 | 422 | mutation_probability: This is a parameter you may need to adjust more than the other ones. Its appropriate value heavily depends on the problem. Sometimes we may select 423 | mutation_probability as small as 0.01 (i.e. 1 percent) and sometimes even as large as 0.5 (i.e. 50 percent) or even larger. In general, if the genetic algorithm is trapped 424 | in a local optimum, increasing the mutation probability may help. On the other hand, if the algorithm suffers from stagnation, reducing the mutation probability may be effective. However, this rule of thumb is not always true. 425 | 426 | parents_portion: If parents_portion is set to zero, the whole population is filled with newly generated solutions. 427 | On the other hand, setting this parameter to 1 (i.e. 100 percent) means no new solution 428 | is generated, and the algorithm would just repeat the previous values without any change, which is obviously neither meaningful nor effective. Anything between these two may work. The exact value depends on the problem. 429 | 430 | 431 | crossover_type: Depends on the problem. I would usually use uniform crossover, but testing the other ones on your problem is recommended. 432 | 433 | max_iteration_without_improv: This is a parameter that I recommend using cautiously. 434 | If this parameter is too small, the algorithm may stop while trapped in a local optimum.
435 | So make sure you select a sufficiently large criterion to provide enough time for the algorithm to progress and to avoid premature convergence. 436 | 437 | Finally, to make sure that the parameter setting is fine, we usually should run the 438 | algorithm several times, and if the convergence curves of all runs converge to the same objective function value, we may accept that solution as the optimum. The number of runs 439 | depends on the problem, but usually five or ten runs is common. Notice that in some problems 440 | several possible sets of variables produce the same objective function value. 441 | When we study the convergence of a genetic algorithm, we compare the objective function values, not the decision variables. 442 | 443 | ## Optimization test functions 444 | 445 | Implementation of geneticalgorithm for some benchmark problems: 446 | 447 | ## [Rastrigin](https://en.wikipedia.org/wiki/Rastrigin_function) 448 | 449 | ![](https://upload.wikimedia.org/wikipedia/commons/thumb/8/8b/Rastrigin_function.png/600px-Rastrigin_function.png) 450 | 451 | 452 | ```python 453 | 454 | import numpy as np 455 | import math 456 | from geneticalgorithm import geneticalgorithm as ga 457 | 458 | def f(X): 459 | 460 | dim=len(X) 461 | 462 | OF=0 463 | for i in range (0,dim): 464 | OF+=(X[i]**2)-10*math.cos(2*math.pi*X[i])+10 465 | 466 | return OF 467 | 468 | 469 | varbound=np.array([[-5.12,5.12]]*2) 470 | 471 | model=ga(function=f,dimension=2,variable_type='real',variable_boundaries=varbound) 472 | 473 | model.run() 474 | 475 | ``` 476 | 477 | ![](https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Rastrigin.gif) 478 | 479 | 480 | ## [Ackley](https://en.wikipedia.org/wiki/Ackley_function) 481 | ![](https://upload.wikimedia.org/wikipedia/commons/thumb/9/98/Ackley%27s_function.pdf/page1-600px-Ackley%27s_function.pdf.jpg) 482 | 483 | ```python 484 | 485 | import numpy as np 486 | import math 487 | from geneticalgorithm import geneticalgorithm as ga 488 | 489 | def
f(X): 490 | 491 | dim=len(X) 492 | 493 | t1=0 494 | t2=0 495 | for i in range (0,dim): 496 | t1+=X[i]**2 497 | t2+=math.cos(2*math.pi*X[i]) 498 | 499 | OF=20+math.e-20*math.exp(-0.2*math.sqrt(t1/dim))-math.exp(t2/dim) 500 | 501 | return OF 502 | 503 | varbound=np.array([[-32.768,32.768]]*2) 504 | 505 | model=ga(function=f,dimension=2,variable_type='real',variable_boundaries=varbound) 506 | 507 | model.run() 508 | 509 | ``` 510 | 511 | ![](https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Ackley.gif) 512 | 513 | ## [Weierstrass](http://infinity77.net/global_optimization/test_functions_nd_W.html) 514 | ![](http://infinity77.net/global_optimization/_images/Weierstrass.png) 515 | 516 | 517 | ```python 518 | 519 | import numpy as np 520 | import math 521 | from geneticalgorithm import geneticalgorithm as ga 522 | 523 | def f(X): 524 | 525 | dim=len(X) 526 | 527 | a=0.5 528 | b=3 529 | OF=0 530 | for i in range (0,dim): 531 | t1=0 532 | for k in range (0,21): 533 | t1+=(a**k)*math.cos((2*math.pi*(b**k))*(X[i]+0.5)) 534 | OF+=t1 535 | t2=0 536 | for k in range (0,21): 537 | t2+=(a**k)*math.cos(math.pi*(b**k)) 538 | OF-=dim*t2 539 | 540 | return OF 541 | 542 | 543 | varbound=np.array([[-0.5,0.5]]*2) 544 | 545 | algorithm_param = {'max_num_iteration': 1000,\ 546 | 'population_size':100,\ 547 | 'mutation_probability':0.1,\ 548 | 'elit_ratio': 0.01,\ 549 | 'crossover_probability': 0.5,\ 550 | 'parents_portion': 0.3,\ 551 | 'crossover_type':'uniform',\ 552 | 'max_iteration_without_improv':None} 553 | 554 | model=ga(function=f,dimension=2,\ 555 | variable_type='real',\ 556 | variable_boundaries=varbound,\ 557 | algorithm_parameters=algorithm_param) 558 | 559 | model.run() 560 | 561 | ``` 562 | ![](https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Weierstrass.gif) 563 | ## License 564 | 565 | Copyright 2020 Ryan (Mohammad) Solgi 566 | 567 | Permission is hereby granted, free of charge, to any person obtaining a copy of 568 | this software
and associated documentation files (the "Software"), to deal in 569 | the Software without restriction, including without limitation the rights to use, 570 | copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the 571 | Software, and to permit persons to whom the Software is furnished to do so, 572 | subject to the following conditions: 573 | 574 | The above copyright notice and this permission notice shall be included in all 575 | copies or substantial portions of the Software. 576 | 577 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 578 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 579 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 580 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 581 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 582 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 583 | SOFTWARE. 584 | 585 | -------------------------------------------------------------------------------- /README.rst: -------------------------------------------------------------------------------- 1 | geneticalgorithm 2 | ================ 3 | 4 | geneticalgorithm is a Python library distributed on 5 | `Pypi `__ for implementing standard and elitist 6 | `genetic-algorithm `__ 7 | (GA). This package solves continuous, 8 | `combinatorial `__ 9 | and mixed 10 | `optimization `__ 11 | problems with continuous, discrete, and mixed variables. It provides an 12 | easy implementation of genetic-algorithm (GA) in Python. 13 | 14 | Installation 15 | ------------ 16 | 17 | Use the package manager `pip `__ to 18 | install geneticalgorithm in Python. 19 | 20 | .. 
code:: python 21 | 22 | pip install geneticalgorithm 23 | 24 | A simple example 25 | ---------------- 26 | 27 | Assume we want to find a set of X=(x1,x2,x3) that minimizes function f(X)=x1+x2+x3 where X can be any real number in [0,10]. 28 | This is a trivial problem and we already know that the answer is X=(0,0,0) where f(X)=0. We just use this simple example to see how to implement geneticalgorithm: 29 | 30 | First we import geneticalgorithm and `numpy `__. 31 | Next, we define function f which we want to minimize and the boundaries 32 | of the decision variables; Then simply geneticalgorithm is called to 33 | solve the defined optimization problem as follows: 34 | 35 | .. code:: python 36 | 37 | import numpy as np 38 | from geneticalgorithm import geneticalgorithm as ga 39 | 40 | def f(X): 41 | return np.sum(X) 42 | 43 | 44 | varbound=np.array([[0,10]]*3) 45 | 46 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 47 | 48 | model.run() 49 | 50 | Notice that we define the function f so that its output is the objective 51 | function we want to minimize where the input is the set of X (decision 52 | variables). The boundaries for variables must be defined as a numpy 53 | array and for each variable we need a separate boundary. Here I have 54 | three variables and all of them have the same boundaries (For the case 55 | the boundaries are different see the example with mixed variables). 56 | 57 | | geneticalgorithm has some arguments: 58 | | Obviously the first argument is the function f we already defined (for 59 | more details about the argument and output see `Function <#1111-id>`__). 60 | | Our problem has three variables so we set dimension equal three. 61 | | Variables are real (continuous) so we use string 'real' to notify the 62 | type of variables (geneticalgorithm accepts other types including 63 | Boolean, Integers and Mixed; see other examples). 
64 | | Finally, we input varbound which includes the boundaries of the 65 | variables. Note that the length of variable\_boundaries must be equal to 66 | dimension. 67 | 68 | If you run the code, you should see a progress bar that shows the 69 | progress of the genetic algorithm (GA) and then the solution, objective 70 | function value and the convergence curve as follows: 71 | 72 | .. figure:: https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_convergence.gif 73 | :alt: 74 | 75 | We can also access the best answer of the defined optimization 76 | problem found by geneticalgorithm as a dictionary, as well as a report of the 77 | progress of the genetic algorithm. To do so we complete the code as 78 | follows: 79 | 80 | .. code:: python 81 | 82 | convergence=model.report 83 | solution=model.output_dict 84 | 85 | output\_dict is a dictionary including the best set of variables found 86 | and the value of the given function associated with it ({'variable': , 87 | 'function': }). report is a list including the convergence of the 88 | algorithm over iterations. 89 | 90 | The simple example with integer variables 91 | ----------------------------------------- 92 | 93 | Consider the problem given in the simple example above. Now assume 94 | all variables are integers. So x1, x2, x3 can be any integers in [0,10]. 95 | In this case the code is as follows: 96 | 97 | .. code:: python 98 | 99 | import numpy as np 100 | from geneticalgorithm import geneticalgorithm as ga 101 | 102 | def f(X): 103 | return np.sum(X) 104 | 105 | 106 | varbound=np.array([[0,10]]*3) 107 | 108 | model=ga(function=f,dimension=3,variable_type='int',variable_boundaries=varbound) 109 | 110 | model.run() 111 | 112 | As seen, the only difference is that for variable\_type we use 113 | string 'int'. 114 | 115 | The simple example with Boolean variables 116 | ----------------------------------------- 117 | 118 | Consider the problem given in the simple example above.
Now assume 119 | all variables are Boolean instead of real or integer. So X can be either 120 | zero or one. Also, instead of three, let's have 30 variables. In this case 121 | the code is as follows: 122 | 123 | .. code:: python 124 | 125 | import numpy as np 126 | from geneticalgorithm import geneticalgorithm as ga 127 | 128 | def f(X): 129 | return np.sum(X) 130 | 131 | 132 | model=ga(function=f,dimension=30,variable_type='bool') 133 | 134 | model.run() 135 | 136 | | Note that for variable\_type we use string 'bool' when all variables are 137 | Boolean. 138 | | Note that when variable\_type equals 'bool' there is no need for 139 | variable\_boundaries to be defined. 140 | 141 | The simple example with mixed variables 142 | --------------------------------------- 143 | 144 | Consider the problem given in the simple example above where we 145 | want to minimize f(X)=x1+x2+x3. Now assume x1 is a real (continuous) 146 | variable in [0.5,1.5], x2 is an integer variable in [1,100], and x3 is a 147 | Boolean variable that can be either zero or one. We already know that 148 | the answer is X=(0.5,1,0) where f(X)=1.5. We implement geneticalgorithm 149 | as follows: 150 | 151 | .. code:: python 152 | 153 | import numpy as np 154 | from geneticalgorithm import geneticalgorithm as ga 155 | 156 | def f(X): 157 | return np.sum(X) 158 | 159 | varbound=np.array([[0.5,1.5],[1,100],[0,1]]) 160 | vartype=np.array([['real'],['int'],['int']]) 161 | model=ga(function=f,dimension=3,variable_type_mixed=vartype,variable_boundaries=varbound) 162 | 163 | model.run() 164 | 165 | | Note that for mixed variables we need to define boundaries, and we also 166 | need to make a numpy array of variable types as above (vartype). 167 | Obviously the order of variables in both arrays must match. Also notice 168 | that in such a case for Boolean variables we use string 'int' and 169 | boundary [0,1].
170 | | Notice that we use argument variable\_type\_mixed to input a numpy 171 | array of variable types for functions with mixed variables. 172 | 173 | Maximization problems 174 | --------------------- 175 | 176 | geneticalgorithm is designed to minimize the given function. A simple 177 | trick to solve maximization problems is to multiply the objective 178 | function by a negative sign. Then the absolute value of the output is 179 | the maximum of the function. Consider the above simple example. Now let's 180 | find the maximum of f(X)=x1+x2+x3 where X is a set of real variables in 181 | [0,10]. We already know that the answer is X=(10,10,10) where f(X)=30. 182 | 183 | .. code:: python 184 | 185 | import numpy as np 186 | from geneticalgorithm import geneticalgorithm as ga 187 | 188 | def f(X): 189 | return -np.sum(X) 190 | 191 | varbound=np.array([[0,10]]*3) 192 | 193 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 194 | 195 | model.run() 196 | 197 | As seen above, np.sum(X) is multiplied by a negative sign. 198 | 199 | Optimization problems with constraints 200 | -------------------------------------- 201 | 202 | In all the above examples, the optimization problem was unconstrained. Now 203 | consider that we want to minimize f(X)=x1+x2+x3 where X is a set of real 204 | variables in [0,10]. Also we have an extra constraint so that the sum of x1 205 | and x2 is greater than or equal to 2. The minimum of f(X) is 2. In such a 206 | case, a trick is to define a penalty function. Hence we use the code 207 | below: 208 | 209 | ..
code:: python 210 | 211 | import numpy as np 212 | from geneticalgorithm import geneticalgorithm as ga 213 | 214 | def f(X): 215 | pen=0 216 | if X[0]+X[1]<2: 217 | pen=500+1000*(2-X[0]-X[1]) 218 | return np.sum(X)+pen 219 | 220 | varbound=np.array([[0,10]]*3) 221 | 222 | model=ga(function=f,dimension=3,variable_type='real',variable_boundaries=varbound) 223 | 224 | model.run() 225 | 226 | As seen above, we add a penalty to the objective function whenever the 227 | constraint is not met. 228 | 229 | | Some hints about how to define a penalty function: 230 | | 1- Usually you may use a constant greater than the maximum possible 231 | value of the objective function if the maximum is known or if we have a 232 | guess of that. Here the highest possible value of our function is 300 233 | (i.e. if all variables were 10, f(X)=300). So I chose a constant of 500. 234 | So, if a trial solution is not in the feasible region, even though its 235 | objective function may be small, the penalized objective function 236 | (fitness function) is worse than any feasible solution. 2- Use a 237 | coefficient big enough and multiply that by the amount of violation. 238 | This helps the algorithm learn how to approach the feasible domain. 3- How 239 | the penalty function is defined usually influences the convergence rate of an 240 | evolutionary algorithm. In my `book on metaheuristics and evolutionary 241 | algorithms `__ 242 | you can learn more about that. 4- Finally, after you have solved the problem, 243 | test the solution to see if the constraints are met. If the solution does not 244 | meet the constraints, it shows that a bigger penalty is required. However, 245 | in problems where the optimum is exactly on the boundary of the feasible 246 | region (or very close to the constraints), which is common in some kinds 247 | of problems, a very strict and big penalty may prevent the genetic 248 | algorithm from approaching the optimal region.
In such a case designing an 249 | appropriate penalty function might be more challenging. What we 250 | have to do is design a penalty function that lets the algorithm 251 | search the infeasible domain while finally converging to a feasible 252 | solution. Hence you may need more sophisticated penalty functions. But 253 | in most cases the above formulation works fairly well. 254 | 255 | Genetic algorithm's parameters 256 | ------------------------------ 257 | 258 | Every evolutionary algorithm (metaheuristic) has some parameters to be 259 | adjusted. `Genetic 260 | algorithm `__ 261 | also has some parameters. The parameters of geneticalgorithm are defined 262 | as a dictionary: 263 | 264 | .. code:: python 265 | 266 | 267 | algorithm_param = {'max_num_iteration': None,\ 268 | 'population_size':100,\ 269 | 'mutation_probability':0.1,\ 270 | 'elit_ratio': 0.01,\ 271 | 'crossover_probability': 0.5,\ 272 | 'parents_portion': 0.3,\ 273 | 'crossover_type':'uniform',\ 274 | 'max_iteration_without_improv':None} 275 | 276 | The above dictionary refers to the default values that have been set 277 | already. One may simply copy this code from here, change the values, 278 | and use the modified dictionary as the argument of geneticalgorithm. 279 | Another way of accessing this dictionary is using the command below: 280 | 281 | .. code:: python 282 | 283 | import numpy as np 284 | from geneticalgorithm import geneticalgorithm as ga 285 | 286 | def f(X): 287 | return np.sum(X) 288 | 289 | 290 | model=ga(function=f,dimension=3,variable_type='bool') 291 | 292 | print(model.param) 293 | 294 | An example of setting a new set of parameters for the genetic algorithm and 295 | running geneticalgorithm for our first simple example again: 296 | 297 | ..
code:: python 298 | 299 | import numpy as np 300 | from geneticalgorithm import geneticalgorithm as ga 301 | 302 | def f(X): 303 | return np.sum(X) 304 | 305 | 306 | varbound=np.array([[0,10]]*3) 307 | 308 | algorithm_param = {'max_num_iteration': 3000,\ 309 | 'population_size':100,\ 310 | 'mutation_probability':0.1,\ 311 | 'elit_ratio': 0.01,\ 312 | 'crossover_probability': 0.5,\ 313 | 'parents_portion': 0.3,\ 314 | 'crossover_type':'uniform',\ 315 | 'max_iteration_without_improv':None} 316 | 317 | model=ga(function=f,\ 318 | dimension=3,\ 319 | variable_type='real',\ 320 | variable_boundaries=varbound,\ 321 | algorithm_parameters=algorithm_param) 322 | 323 | model.run() 324 | 325 | | Notice that max\_num\_iteration has been changed to 3000 (it was 326 | previously None). In the above gif we saw that the algorithm ran for 1500 327 | iterations. Since we did not define parameters, geneticalgorithm applied 328 | the default values. However, if you run this code, geneticalgorithm 329 | executes 3000 iterations this time. 330 | | To change other parameters one may simply replace the values according 331 | to `Arguments <#1112-id>`__. 332 | 333 | @ max\_num\_iteration: The termination criterion of geneticalgorithm. If 334 | this parameter's value is None the algorithm sets the maximum number of 335 | iterations automatically as a function of the dimension, boundaries, and 336 | population size. The user may enter any number of iterations that they 337 | want. It is highly recommended that users determine 338 | max\_num\_iteration themselves rather than use None. 339 | 340 | @ population\_size: determines the number of trial solutions in each 341 | iteration. The default value is 100. 342 | 343 | @ mutation\_probability: determines the chance of each gene in each 344 | individual solution being replaced by a random value. The default is 0.1 345 | (i.e. 10 percent). 346 | 347 | @ elit\_ratio: determines the number of elites in the population.
The 348 | default value is 0.01 (i.e. 1 percent). For example, when the population size 349 | is 100 and elit\_ratio is 0.01 then there is one elite in the 350 | population. If this parameter is set to zero then geneticalgorithm 351 | implements a standard genetic algorithm instead of an elitist GA. 352 | 353 | @ crossover\_probability: determines the chance of an existing solution 354 | passing its genome (aka characteristics) to new trial solutions (aka 355 | offspring); the default value is 0.5 (i.e. 50 percent) 356 | 357 | @ parents\_portion: the portion of the population filled by the members of 358 | the previous generation (aka parents); default is 0.3 (i.e. 30 percent 359 | of the population) 360 | 361 | @ crossover\_type: there are three options: one\_point, 362 | two\_point, and uniform crossover functions; default is uniform 363 | crossover 364 | 365 | @ max\_iteration\_without\_improv: if the algorithm does not improve 366 | the objective function over the number of successive iterations 367 | determined by this parameter, then geneticalgorithm stops and reports the 368 | best solution found before max\_num\_iteration is met. The 369 | default value is None. 370 | 371 | Function 372 | -------- 373 | 374 | The given function to be optimized must only accept one argument and 375 | return a scalar. The argument of the given function is a numpy array 376 | which is entered by geneticalgorithm. If for any reason you do not want 377 | to work with numpy in your function you may `turn the numpy array into a 378 | list `__. 379 | 380 | Arguments 381 | --------- 382 | 383 | | @param function - the given objective function to be minimized 384 | | NOTE: This implementation minimizes the given objective function.
(For 385 | maximization, multiply the function by a negative sign: the absolute value of 386 | the output would be the actual objective function) 387 | 388 | @param dimension - the number of decision variables 389 | 390 | @param variable\_type - 'bool' if all variables are Boolean; 'int' if 391 | all variables are integer; and 'real' if all variables are real valued or 392 | continuous (for mixed types see @param variable\_type\_mixed). 393 | 394 | @param variable\_boundaries - Default None; leave it None if 395 | variable\_type is 'bool'; otherwise provide an array of tuples of length 396 | two as boundaries for each variable; the length of the array must be 397 | equal to dimension. For example, np.array([[0,100],[0,200]]) determines a lower 398 | boundary 0 and upper boundary 100 for the first variable, and a lower boundary 0 and upper boundary 200 for the 399 | second variable, where dimension is 2. 400 | 401 | @param variable\_type\_mixed - Default None; leave it None if all 402 | variables have the same type; otherwise this can be used to specify the 403 | type of each variable separately. For example, if the first variable is 404 | integer but the second one is real, the input is: 405 | np.array([['int'],['real']]). NOTE: it does not accept 'bool'. If a variable's 406 | type is Boolean, use 'int' and provide a boundary of [0,1] in 407 | variable\_boundaries. Also, if variable\_type\_mixed is applied, 408 | variable\_boundaries has to be defined. 409 | 410 | @param function\_timeout - if the given function does not provide output 411 | before function\_timeout (unit is seconds) the algorithm raises an error. 412 | For example, when there is an infinite loop in the given function.
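The argument descriptions above imply specific array shapes. Before handing the arrays to geneticalgorithm, a quick sanity check with plain numpy can catch shape mistakes early; this sketch reuses the mixed-variable example from earlier, and the checks themselves are illustrative, not part of the library:

```python
import numpy as np

dimension = 3
# one [lower, upper] pair per variable; length must equal dimension
varbound = np.array([[0.5, 1.5], [1, 100], [0, 1]])
# one type per variable; 'bool' is not accepted here -- use 'int' with boundary [0, 1]
vartype = np.array([['real'], ['int'], ['int']])

assert varbound.shape == (dimension, 2), "need a [lower, upper] pair per variable"
assert len(vartype) == dimension, "need one type per variable"
assert all(t in ('real', 'int') for t in vartype.flatten())
assert np.all(varbound[:, 0] <= varbound[:, 1]), "lower boundary must not exceed upper"
```

Checks like these fail fast with a readable message, instead of surfacing as an assertion deep inside the library.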
413 | 414 | | @param algorithm\_parameters: 415 | | @ max\_num\_iteration - stopping criterion of the genetic algorithm (GA) 416 | | @ population\_size 417 | | @ mutation\_probability 418 | | @ elit\_ratio 419 | | @ crossover\_probability 420 | | @ parents\_portion 421 | | @ crossover\_type - Default is 'uniform'; 'one\_point' or 'two\_point' 422 | are other options @ max\_iteration\_without\_improv - maximum number of 423 | successive iterations without improvement. If None it is ineffective 424 | 425 | Methods and Outputs: 426 | -------------------- 427 | 428 | | methods: 429 | | run(): implements the genetic algorithm (GA) 430 | 431 | param: a dictionary of parameters of the genetic algorithm (GA) 432 | 433 | output: 434 | 435 | output\_dict: is a dictionary including the best set of variables found 436 | and the value of the given function associated with it. {'variable': , 437 | 'function': } 438 | 439 | report: is a record of the progress of the algorithm over iterations 440 | 441 | Function timeout 442 | ---------------- 443 | 444 | geneticalgorithm is designed such that if the given function does not 445 | provide any output before timeout (the default value is 10 seconds), the 446 | algorithm would be terminated and raise the appropriate error. In such a 447 | case make sure the given function works correctly (i.e. there is no 448 | infinite loop in the given function). Also, if the given function takes 449 | more than 10 seconds to complete the work, make sure to increase 450 | function\_timeout in the arguments. 451 | 452 | Standard GA vs. Elitist GA 453 | -------------------------- 454 | 455 | The convergence curve of an elitist genetic algorithm is always 456 | non-increasing. So, the best ever found solution is equal to the best 457 | solution of the last iteration. However, the convergence curve of a 458 | standard genetic algorithm is different. If elit\_ratio is zero, 459 | geneticalgorithm implements a standard GA.
The output of 460 | geneticalgorithm for a standard GA is the best ever found solution, not the 461 | solution of the last iteration. The difference between the convergence 462 | curve of a standard GA and an elitist GA is shown below: 463 | 464 | .. figure:: https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_convergence_curve.gif 465 | :alt: 466 | 467 | Hints on how to adjust genetic algorithm's parameters 468 | ----------------------------------------------------- 469 | 470 | In general the performance of a genetic algorithm or any evolutionary 471 | algorithm depends on its parameters. Parameter setting of an 472 | evolutionary algorithm is important. Usually these parameters are 473 | adjusted based on experience and by conducting a sensitivity analysis. 474 | It is impossible to provide a general guideline for parameter setting but 475 | the suggestions provided below may help: 476 | 477 | Number of iterations: Select a sufficiently large max\_num\_iteration; 478 | otherwise the reported solution may not be satisfactory. On the other 479 | hand, selecting a very large number of iterations increases the run time 480 | significantly. So this is actually a compromise between the accuracy you 481 | want and the time and computational cost you spend. 482 | 483 | Population size: Given a constant number of function evaluations 484 | (max\_num\_iteration times population\_size) I would select a smaller 485 | population size and more iterations. However, a very small choice of 486 | population size is also deteriorative. For most problems I would select 487 | a population size of 100 unless the dimension of the problem is very 488 | large and needs a bigger population size. 489 | 490 | elit\_ratio: Although having a few elites is usually a good idea and may 491 | increase the rate of convergence in some problems, having too many 492 | elites in the population may cause the algorithm to easily become trapped in a 493 | local optimum.
I would usually select only one elite in most cases. 494 | Elitism is not always necessary and in some problems may even be 495 | deteriorative. 496 | 497 | mutation\_probability: This is a parameter you may need to adjust more 498 | than the other ones. Its appropriate value heavily depends on the 499 | problem. Sometimes we may select mutation\_probability as small as 0.01 500 | (i.e. 1 percent) and sometimes even as large as 0.5 (i.e. 50 percent) or 501 | even larger. In general, if the genetic algorithm is trapped in a local 502 | optimum, increasing the mutation probability may help. On the other hand, 503 | if the algorithm suffers from stagnation, reducing the mutation 504 | probability may be effective. However, this rule of thumb is not always 505 | true. 506 | 507 | parents\_portion: If parents\_portion is set to zero, it means that the whole 508 | population is filled with newly generated solutions. On the 509 | other hand, setting this parameter to 1 (i.e. 100 percent) means no 510 | new solution is generated and the algorithm would just repeat the 511 | previous values without any change, which is obviously neither 512 | meaningful nor effective. Anything between these two may work. The exact value depends 513 | on the problem. 514 | 515 | crossover\_type: Depends on the problem. I would usually use uniform 516 | crossover. But testing the other ones for your problem is recommended. 517 | 518 | max\_iteration\_without\_improv: This is a parameter that I recommend 519 | using cautiously. If this parameter is too small then the algorithm 520 | may stop while still trapped in a local optimum. So make sure you select a 521 | sufficiently large criterion to provide enough time for the algorithm to 522 | progress and to avoid premature convergence.
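One common way to check that a parameter setting is adequate is to repeat the optimization several times and compare the best objective values across runs. A hedged sketch of that check is below; the tolerance, the number of runs, and the helper names `converged` and `run_check` are illustrative choices, and `run_check` assumes the geneticalgorithm package is installed:

```python
import numpy as np

def f(X):
    # the simple objective from the first example in this README
    return np.sum(X)

def converged(best_values, tol=1e-3):
    """True when all runs agree on (nearly) the same best objective value."""
    return max(best_values) - min(best_values) <= tol

def run_check(n_runs=5):
    # requires the geneticalgorithm package; imported lazily so the
    # helpers above stay usable without it
    from geneticalgorithm import geneticalgorithm as ga
    varbound = np.array([[0, 10]] * 3)
    best = []
    for _ in range(n_runs):  # independent runs of the GA
        model = ga(function=f, dimension=3, variable_type='real',
                   variable_boundaries=varbound)
        model.run()
        best.append(model.output_dict['function'])
    return converged(best)
```

If `run_check()` returns True, the runs agreed on the optimum and the parameter setting is probably fine; if not, revisit the hints above.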
523 | 524 | Finally, to make sure that the parameter setting is fine, we usually 525 | should run the algorithm several times, and if the convergence curves of 526 | all runs converged to the same objective function value we may accept 527 | that solution as the optimum. The number of runs depends on the problem, but usually 528 | five or ten runs are common. Notice that in some problems several 529 | possible sets of variables produce the same objective function value. 530 | When we study the convergence of a genetic algorithm we compare the 531 | objective function values, not the decision variables. 532 | 533 | Optimization test functions 534 | --------------------------- 535 | 536 | Implementation of geneticalgorithm for some benchmark problems: 537 | 538 | `Rastrigin `__ 539 | ---------------------------------------------------------------- 540 | 541 | .. figure:: https://upload.wikimedia.org/wikipedia/commons/thumb/8/8b/Rastrigin_function.png/600px-Rastrigin_function.png 542 | :alt: 543 | 544 | .. code:: python 545 | 546 | 547 | import numpy as np 548 | import math 549 | from geneticalgorithm import geneticalgorithm as ga 550 | 551 | def f(X): 552 | 553 | dim=len(X) 554 | 555 | OF=0 556 | for i in range (0,dim): 557 | OF+=(X[i]**2)-10*math.cos(2*math.pi*X[i])+10 558 | 559 | return OF 560 | 561 | 562 | varbound=np.array([[-5.12,5.12]]*2) 563 | 564 | model=ga(function=f,dimension=2,variable_type='real',variable_boundaries=varbound) 565 | 566 | model.run() 567 | 568 | .. figure:: https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Rastrigin.gif 569 | :alt: 570 | 571 | `Ackley `__ 572 | ---------------------------------------------------------- 573 | 574 | .. figure:: https://upload.wikimedia.org/wikipedia/commons/thumb/9/98/Ackley%27s_function.pdf/page1-600px-Ackley%27s_function.pdf.jpg 575 | :alt: 576 | 577 | ..
code:: python 578 | 579 | 580 | import numpy as np 581 | import math 582 | from geneticalgorithm import geneticalgorithm as ga 583 | 584 | def f(X): 585 | 586 | dim=len(X) 587 | 588 | t1=0 589 | t2=0 590 | for i in range (0,dim): 591 | t1+=X[i]**2 592 | t2+=math.cos(2*math.pi*X[i]) 593 | 594 | OF=20+math.e-20*math.exp(-0.2*math.sqrt(t1/dim))-math.exp(t2/dim) 595 | 596 | return OF 597 | 598 | varbound=np.array([[-32.768,32.768]]*2) 599 | 600 | model=ga(function=f,dimension=2,variable_type='real',variable_boundaries=varbound) 601 | 602 | model.run() 603 | 604 | .. figure:: https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Ackley.gif 605 | :alt: 606 | 607 | `Weierstrass `__ 608 | ------------------------------------------------------------------------------------ 609 | 610 | .. figure:: http://infinity77.net/global_optimization/_images/Weierstrass.png 611 | :alt: 612 | 613 | .. code:: python 614 | 615 | 616 | import numpy as np 617 | import math 618 | from geneticalgorithm import geneticalgorithm as ga 619 | 620 | def f(X): 621 | 622 | dim=len(X) 623 | 624 | a=0.5 625 | b=3 626 | OF=0 627 | for i in range (0,dim): 628 | t1=0 629 | for k in range (0,21): 630 | t1+=(a**k)*math.cos((2*math.pi*(b**k))*(X[i]+0.5)) 631 | OF+=t1 632 | t2=0 633 | for k in range (0,21): 634 | t2+=(a**k)*math.cos(math.pi*(b**k)) 635 | OF-=t2 636 | 637 | return OF 638 | 639 | 640 | varbound=np.array([[-0.5,0.5]]*2) 641 | 642 | algorithm_param = {'max_num_iteration': 1000,\ 643 | 'population_size':100,\ 644 | 'mutation_probability':0.1,\ 645 | 'elit_ratio': 0.01,\ 646 | 'crossover_probability': 0.5,\ 647 | 'parents_portion': 0.3,\ 648 | 'crossover_type':'uniform',\ 649 | 'max_iteration_without_improv':None} 650 | 651 | model=ga(function=f,dimension=2,\ 652 | variable_type='real',\ 653 | variable_boundaries=varbound,\ 654 | algorithm_parameters=algorithm_param) 655 | 656 | model.run() 657 | 658 | ..
figure:: https://github.com/rmsolgi/geneticalgorithm/blob/master/genetic_algorithm_Weierstrass.gif 659 | :alt: 660 | 661 | License 662 | ---------------- 663 | 664 | Copyright 2020 Ryan (Mohammad) Solgi 665 | 666 | Permission is hereby granted, free of charge, to any person obtaining a 667 | copy of this software and associated documentation files (the 668 | "Software"), to deal in the Software without restriction, including 669 | without limitation the rights to use, copy, modify, merge, publish, 670 | distribute, sublicense, and/or sell copies of the Software, and to 671 | permit persons to whom the Software is furnished to do so, subject to 672 | the following conditions: 673 | 674 | The above copyright notice and this permission notice shall be included 675 | in all copies or substantial portions of the Software. 676 | 677 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 678 | OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF 679 | MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 680 | IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY 681 | CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, 682 | TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE 683 | SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
684 | 685 | -------------------------------------------------------------------------------- /genetic_algorithm_Ackley.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmsolgi/geneticalgorithm/e33726e0d85af9396ac8a5f67fb84c849239ed5b/genetic_algorithm_Ackley.gif -------------------------------------------------------------------------------- /genetic_algorithm_Rastrigin.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmsolgi/geneticalgorithm/e33726e0d85af9396ac8a5f67fb84c849239ed5b/genetic_algorithm_Rastrigin.gif -------------------------------------------------------------------------------- /genetic_algorithm_Weierstrass.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmsolgi/geneticalgorithm/e33726e0d85af9396ac8a5f67fb84c849239ed5b/genetic_algorithm_Weierstrass.gif -------------------------------------------------------------------------------- /genetic_algorithm_convergence.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmsolgi/geneticalgorithm/e33726e0d85af9396ac8a5f67fb84c849239ed5b/genetic_algorithm_convergence.gif -------------------------------------------------------------------------------- /genetic_algorithm_convergence_curve.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rmsolgi/geneticalgorithm/e33726e0d85af9396ac8a5f67fb84c849239ed5b/genetic_algorithm_convergence_curve.gif -------------------------------------------------------------------------------- /geneticalgorithm/__init__.py: -------------------------------------------------------------------------------- 1 | ''' 2 | 3 | Copyright 2020 Ryan (Mohammad) Solgi 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a 
copy of 6 | this software and associated documentation files (the "Software"), to deal in 7 | the Software without restriction, including without limitation the rights to use, 8 | copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the 9 | Software, and to permit persons to whom the Software is furnished to do so, 10 | subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 18 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | 23 | ''' 24 | 25 | from .geneticalgorithm import geneticalgorithm 26 | -------------------------------------------------------------------------------- /geneticalgorithm/geneticalgorithm.py: -------------------------------------------------------------------------------- 1 | ''' 2 | 3 | Copyright 2020 Ryan (Mohammad) Solgi 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy of 6 | this software and associated documentation files (the "Software"), to deal in 7 | the Software without restriction, including without limitation the rights to use, 8 | copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the 9 | Software, and to permit persons to whom the Software is furnished to do so, 10 | subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 
14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 18 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | 23 | ''' 24 | 25 | ############################################################################### 26 | ############################################################################### 27 | ############################################################################### 28 | 29 | import numpy as np 30 | import sys 31 | import time 32 | from func_timeout import func_timeout, FunctionTimedOut 33 | import matplotlib.pyplot as plt 34 | 35 | ############################################################################### 36 | ############################################################################### 37 | ############################################################################### 38 | 39 | class geneticalgorithm(): 40 | 41 | ''' Genetic Algorithm (Elitist version) for Python 42 | 43 | An implementation of elitist genetic algorithm for solving problems with 44 | continuous, integers, or mixed variables. 45 | 46 | 47 | 48 | Implementation and output: 49 | 50 | methods: 51 | run(): implements the genetic algorithm 52 | 53 | outputs: 54 | output_dict: a dictionary including the best set of variables 55 | found and the value of the given function associated to it. 
56 | {'variable': , 'function': } 57 | 58 | report: a list including the record of the progress of the 59 | algorithm over iterations 60 | 61 | ''' 62 | ############################################################# 63 | def __init__(self, function, dimension, variable_type='bool', \ 64 | variable_boundaries=None,\ 65 | variable_type_mixed=None, \ 66 | function_timeout=10,\ 67 | algorithm_parameters={'max_num_iteration': None,\ 68 | 'population_size':100,\ 69 | 'mutation_probability':0.1,\ 70 | 'elit_ratio': 0.01,\ 71 | 'crossover_probability': 0.5,\ 72 | 'parents_portion': 0.3,\ 73 | 'crossover_type':'uniform',\ 74 | 'max_iteration_without_improv':None},\ 75 | convergence_curve=True,\ 76 | progress_bar=True): 77 | 78 | 79 | ''' 80 | @param function - the given objective function to be minimized 81 | NOTE: This implementation minimizes the given objective function. 82 | (For maximization multiply function by a negative sign: the absolute 83 | value of the output would be the actual objective function) 84 | 85 | @param dimension - the number of decision variables 86 | 87 | @param variable_type - 'bool' if all variables are Boolean; 88 | 'int' if all variables are integer; and 'real' if all variables are 89 | real valued or continuous (for mixed types see @param variable_type_mixed) 90 | 91 | @param variable_boundaries - Default None; leave it 92 | None if variable_type is 'bool'; otherwise provide an array of tuples 93 | of length two as boundaries for each variable; 94 | the length of the array must be equal to dimension. For example, 95 | np.array([[0,100],[0,200]]) determines lower boundary 0 and upper boundary 100 for the first 96 | variable and lower boundary 0 and upper boundary 200 for the second variable, where dimension is 2. 97 | 98 | @param variable_type_mixed - Default None; leave it 99 | None if all variables have the same type; otherwise this can be used to 100 | specify the type of each variable separately.
For example, if the first
101 |         variable is an integer and the second one is real, the input is
102 |         np.array([['int'],['real']]). NOTE: it does not accept 'bool'. If a
103 |         variable's type is Boolean, use 'int' and provide the boundary [0,1]
104 |         in variable_boundaries. Also, if variable_type_mixed is applied,
105 |         variable_boundaries has to be defined.
106 | 
107 |         @param function_timeout - if the given function does not return an
108 |         output within function_timeout seconds, the algorithm raises an error.
109 |         For example, when there is an infinite loop in the given function.
110 | 
111 |         @param algorithm_parameters:
112 |             @ max_num_iteration - stopping criterion of the genetic algorithm (GA)
113 |             @ population_size
114 |             @ mutation_probability
115 |             @ elit_ratio
116 |             @ crossover_probability
117 |             @ parents_portion
118 |             @ crossover_type - Default is 'uniform'; 'one_point' or
119 |             'two_point' are other options
120 |             @ max_iteration_without_improv - maximum number of
121 |             successive iterations without improvement. If None it is ineffective
122 | 
123 |         @param convergence_curve - Plot the convergence curve or not.
124 |         Default is True.
125 |         @param progress_bar - Show progress bar or not. Default is True.
126 | 
127 |         for more details and examples of implementation please visit:
128 |             https://github.com/rmsolgi/geneticalgorithm
129 | 
130 |         '''
131 |         self.__name__=geneticalgorithm
132 |         #############################################################
133 |         # input function
134 |         assert (callable(function)),"function must be callable"
135 | 
136 |         self.f=function
137 |         #############################################################
138 |         #dimension
139 | 
140 |         self.dim=int(dimension)
141 | 
142 |         #############################################################
143 |         # input variable type
144 | 
145 |         assert(variable_type=='bool' or variable_type=='int' or\
146 |                variable_type=='real'), \
147 |                "\n variable_type must be 'bool', 'int', or 'real'"
148 |         #############################################################
149 |         # input variables' type (MIXED)
150 | 
151 |         if variable_type_mixed is None:
152 | 
153 |             if variable_type=='real':
154 |                 self.var_type=np.array([['real']]*self.dim)
155 |             else:
156 |                 self.var_type=np.array([['int']]*self.dim)
157 | 
158 | 
159 |         else:
160 |             assert (type(variable_type_mixed).__module__=='numpy'),\
161 |             "\n variable_type_mixed must be a numpy array"
162 |             assert (len(variable_type_mixed) == self.dim), \
163 |             "\n variable_type_mixed must have a length equal to dimension."
164 | 
165 |             for i in variable_type_mixed:
166 |                 assert (i=='real' or i=='int'),\
167 |                 "\n variable_type_mixed entries must be 'int' or 'real' "+\
168 |                 "ex:['int','real','real']"+\
169 |                 "\n for Boolean variables use 'int' and specify boundary as [0,1]"
170 | 
171 | 
172 |             self.var_type=variable_type_mixed
173 |         #############################################################
174 |         # input variables' boundaries
175 | 
176 | 
177 |         if variable_type!='bool' or type(variable_type_mixed).__module__=='numpy':
178 | 
179 |             assert (type(variable_boundaries).__module__=='numpy'),\
180 |             "\n variable_boundaries must be a numpy array"
181 | 
182 |             assert (len(variable_boundaries)==self.dim),\
183 |             "\n variable_boundaries must have a length equal to dimension"
184 | 
185 | 
186 |             for i in variable_boundaries:
187 |                 assert (len(i) == 2), \
188 |                 "\n boundary for each variable must be a tuple of length two."
189 |                 assert(i[0]<=i[1]),\
190 |                 "\n lower_boundaries must not be greater than upper_boundaries [lower,upper]"
191 |             self.var_bound=variable_boundaries
192 |         else:
193 |             self.var_bound=np.array([[0,1]]*self.dim)
194 | 
195 |         #############################################################
196 |         #Timeout
197 |         self.funtimeout=float(function_timeout)
198 |         #############################################################
199 |         #convergence_curve
200 |         if convergence_curve==True:
201 |             self.convergence_curve=True
202 |         else:
203 |             self.convergence_curve=False
204 |         #############################################################
205 |         #progress_bar
206 |         if progress_bar==True:
207 |             self.progress_bar=True
208 |         else:
209 |             self.progress_bar=False
210 |         #############################################################
211 |         #############################################################
212 |         # input algorithm's parameters
213 | 
214 |         self.param=algorithm_parameters
215 | 
216 |         self.pop_s=int(self.param['population_size'])
217 | 
218 |         assert (self.param['parents_portion']<=1\
219 |                 and self.param['parents_portion']>=0),\
220
|         "parents_portion must be in range [0,1]"
221 | 
222 |         self.par_s=int(self.param['parents_portion']*self.pop_s)
223 |         trl=self.pop_s-self.par_s
224 |         if trl % 2 != 0:
225 |             self.par_s+=1
226 | 
227 |         self.prob_mut=self.param['mutation_probability']
228 | 
229 |         assert (self.prob_mut<=1 and self.prob_mut>=0), \
230 |         "mutation_probability must be in range [0,1]"
231 | 
232 | 
233 |         self.prob_cross=self.param['crossover_probability']
234 |         assert (self.prob_cross<=1 and self.prob_cross>=0), \
235 |         "crossover_probability must be in range [0,1]"
236 | 
237 |         assert (self.param['elit_ratio']<=1 and self.param['elit_ratio']>=0),\
238 |         "elit_ratio must be in range [0,1]"
239 | 
240 |         trl=self.pop_s*self.param['elit_ratio']
241 |         if trl<1 and self.param['elit_ratio']>0:
242 |             self.num_elit=1
243 |         else:
244 |             self.num_elit=int(trl)
245 | 
246 |         assert(self.par_s>=self.num_elit), \
247 |         "\n number of parents must not be less than number of elites"
248 | 
249 |         if self.param['max_num_iteration'] is None:
250 |             self.iterate=0
251 |             for i in range(0,self.dim):
252 |                 if self.var_type[i]=='int':
253 |                     self.iterate+=(self.var_bound[i][1]-self.var_bound[i][0])*self.dim*(100/self.pop_s)
254 |                 else:
255 |                     self.iterate+=(self.var_bound[i][1]-self.var_bound[i][0])*50*(100/self.pop_s)
256 |             self.iterate=int(self.iterate)
257 |             if (self.iterate*self.pop_s)>10000000:
258 |                 self.iterate=int(10000000/self.pop_s)
259 |         else:
260 |             self.iterate=int(self.param['max_num_iteration'])
261 | 
262 |         self.c_type=self.param['crossover_type']
263 |         assert (self.c_type=='uniform' or self.c_type=='one_point' or\
264 |                 self.c_type=='two_point'),\
265 |         "\n crossover_type must be 'uniform', 'one_point', or 'two_point'"
266 | 
267 | 
268 |         self.stop_mniwi=False
269 |         if self.param['max_iteration_without_improv'] is None:
270 |             self.mniwi=self.iterate+1
271 |         else:
272 |             self.mniwi=int(self.param['max_iteration_without_improv'])
273 | 
274 | 
275 |         #############################################################
276 | 
    def run(self):
277 | 
278 | 
279 |         #############################################################
280 |         # Initial Population
281 | 
282 |         self.integers=np.where(self.var_type=='int')
283 |         self.reals=np.where(self.var_type=='real')
284 | 
285 | 
286 | 
287 |         pop=np.array([np.zeros(self.dim+1)]*self.pop_s)
288 |         solo=np.zeros(self.dim+1)
289 |         var=np.zeros(self.dim)
290 | 
291 |         for p in range(0,self.pop_s):
292 | 
293 |             for i in self.integers[0]:
294 |                 var[i]=np.random.randint(self.var_bound[i][0],\
295 |                                          self.var_bound[i][1]+1)
296 |                 solo[i]=var[i].copy()
297 |             for i in self.reals[0]:
298 |                 var[i]=self.var_bound[i][0]+np.random.random()*\
299 |                        (self.var_bound[i][1]-self.var_bound[i][0])
300 |                 solo[i]=var[i].copy()
301 | 
302 | 
303 |             obj=self.sim(var)
304 |             solo[self.dim]=obj
305 |             pop[p]=solo.copy()
306 | 
307 |         #############################################################
308 | 
309 |         #############################################################
310 |         # Report
311 |         self.report=[]
312 |         self.test_obj=obj
313 |         self.best_variable=var.copy()
314 |         self.best_function=obj
315 |         ##############################################################
316 | 
317 |         t=1
318 |         counter=0
319 |         while t<=self.iterate:
320 | 
321 |             if self.progress_bar==True:
322 |                 self.progress(t,self.iterate,status="GA is running...")
323 |             #############################################################
324 |             # Sort
325 |             pop = pop[pop[:,self.dim].argsort()]
326 | 
327 | 
328 | 
329 |             if pop[0,self.dim]<self.best_function:
330 |                 counter=0
331 |                 self.best_function=pop[0,self.dim].copy()
332 |                 self.best_variable=pop[0,: self.dim].copy()
333 |             else:
334 |                 counter+=1
411 |             t+=1
412 |             if counter > self.mniwi:
413 |                 pop = pop[pop[:,self.dim].argsort()]
414 |                 if pop[0,self.dim]>=self.best_function:
415 |                     t=self.iterate
416 |                     if self.progress_bar==True:
417 |                         self.progress(t,self.iterate,status="GA is running...")
418 |                     time.sleep(2)
419 |                     t+=1
420 |                     self.stop_mniwi=True
421 | 
422 |         #############################################################
423 |         # Sort
424 |         pop = pop[pop[:,self.dim].argsort()]
425 | 
426 |         if pop[0,self.dim]<self.best_function:
427 |             self.best_function=pop[0,self.dim].copy()
428 |             self.best_variable=pop[0,: self.dim].copy()
510 | ###############################################################################
511 |     def mutmidle(self, x, p1, p2):
512 |         for i in self.integers[0]:
513 |             ran=np.random.random()
514 |             if ran < self.prob_mut:
515 |                 if p1[i]<p2[i]:
516 |                     x[i]=np.random.randint(p1[i],p2[i])
517 |                 elif p1[i]>p2[i]:
518 |                     x[i]=np.random.randint(p2[i],p1[i])
519 |                 else:
520 |
                    x[i]=np.random.randint(self.var_bound[i][0],\
521 |                                            self.var_bound[i][1]+1)
522 | 
523 |         for i in self.reals[0]:
524 |             ran=np.random.random()
525 |             if ran < self.prob_mut:
526 |                 if p1[i]<p2[i]:
527 |                     x[i]=p1[i]+np.random.random()*(p2[i]-p1[i])
528 |                 elif p1[i]>p2[i]:
529 |                     x[i]=p2[i]+np.random.random()*(p1[i]-p2[i])
530 |                 else:
531 |                     x[i]=self.var_bound[i][0]+np.random.random()*\
532 |                          (self.var_bound[i][1]-self.var_bound[i][0])
533 |         return x
534 | ###############################################################################
535 |     def evaluate(self):
536 |         return self.f(self.temp)
537 | ###############################################################################
538 |     def sim(self,X):
539 |         self.temp=X.copy()
540 |         obj=None
541 |         try:
542 |             obj=func_timeout(self.funtimeout,self.evaluate)
543 |         except FunctionTimedOut:
544 |             print("the given function did not return within the timeout")
545 |         assert (obj is not None), "After "+str(self.funtimeout)+" seconds delay, "+\
546 |                "func_timeout: the given function did not provide any output"
547 |         return obj
548 | 
549 | ###############################################################################
550 |     def progress(self, count, total, status=''):
551 |         bar_len = 50
552 |         filled_len = int(round(bar_len * count / float(total)))
553 | 
554 |         percents = round(100.0 * count / float(total), 1)
555 |         bar = '|' * filled_len + '_' * (bar_len - filled_len)
556 | 
557 |         sys.stdout.write('\r%s %s%s %s' % (bar, percents, '%', status))
558 |         sys.stdout.flush()
559 | ###############################################################################
560 | ###############################################################################
561 | 
562 | 
563 | 
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | '''
2 | Copyright 2020 Ryan (Mohammad) Solgi
3 | 
4 | Permission is hereby granted, free of charge, to any person obtaining a copy of
5 | this software and associated documentation files (the "Software"), to deal in
6 | the Software without restriction, including without limitation the rights to use,
7 | copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the
8 | Software, and to permit persons to whom the Software is furnished to do so,
9 | subject to the following conditions:
10 | 
11 | The above copyright notice and this permission notice shall be included in all
12 | copies or substantial portions of the Software.
13 | 
14 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
15 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
16 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
17 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
18 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
19 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
20 | SOFTWARE.
21 | 
22 | '''
23 | 
24 | import setuptools
25 | 
26 | with open("README.md", "r") as fh:
27 |     long_description = fh.read()
28 | 
29 | setuptools.setup(
30 |     name="geneticalgorithm",
31 |     version="1.0.1",
32 |     author="Ryan (Mohammad) Solgi",
33 |     author_email="ryan.solgi@gmail.com",
34 |     maintainer='Ryan (Mohammad) Solgi',
35 |     description="An easy implementation of genetic-algorithm (GA) to solve continuous and combinatorial optimization problems with real, integer, and mixed variables in Python",
36 |     long_description=long_description,
37 |     long_description_content_type="text/markdown",
38 |     url="https://github.com/rmsolgi/geneticalgorithm",
39 |     keywords=['solve', 'optimization', 'problem', 'genetic', 'algorithm', 'GA', 'easy', 'fast', 'genetic-algorithm', 'combinatorial', 'mixed', 'evolutionary'],
40 |     packages=setuptools.find_packages(),
41 |     classifiers=[
42 |         "Programming Language :: Python :: 3",
43 |         "License :: OSI Approved :: MIT License",
44 |         "Operating System :: OS Independent",
45 |         "Topic :: Software Development :: Libraries :: Python Modules",
46 |     ],
47 |     install_requires=['func-timeout','numpy']
48 | 
49 | )
50 | 
--------------------------------------------------------------------------------
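Taken together, `geneticalgorithm.py` performs one elitist generation per iteration of `run()`: sort the population by objective value, carry the best individuals (elites) into the next generation unchanged, select parents, and fill the rest of the population with crossed-over and mutated children. That loop can be sketched in miniature with only the standard library. The sketch below is an illustration, not the package's API: `sketch_ga`, its parameter names, and the tournament selection (used here for brevity in place of the package's roulette-wheel selection) are this sketch's own assumptions.

```python
import random

def sketch_ga(f, bounds, pop_size=50, iterations=200,
              mutation_prob=0.1, elite_count=2, seed=0):
    """Minimal elitist GA sketch: minimizes f over real-valued box bounds."""
    rng = random.Random(seed)

    def rand_gene(lo, hi):
        # draw a value uniformly within one variable's boundaries
        return lo + rng.random() * (hi - lo)

    pop = [[rand_gene(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iterations):
        pop.sort(key=f)                   # best (lowest objective) first
        next_pop = pop[:elite_count]      # elitism: keep the best unchanged
        while len(next_pop) < pop_size:
            # tournament selection: best of 3 random individuals, twice
            p1 = min(rng.sample(pop, 3), key=f)
            p2 = min(rng.sample(pop, 3), key=f)
            # uniform crossover: each gene comes from either parent
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            # mutation: redraw each gene within its bounds with small probability
            child = [rand_gene(lo, hi) if rng.random() < mutation_prob else g
                     for g, (lo, hi) in zip(child, bounds)]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=f)

# usage: minimize the 2-D sphere function over [-5, 5] x [-5, 5]
best = sketch_ga(lambda x: sum(v * v for v in x), [(-5, 5)] * 2)
```

Because the elites survive every generation, the best objective value never worsens between iterations; that monotonicity is what distinguishes the elitist variant implemented by this package from a plain generational GA.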