├── .codecov.yml ├── .gitattributes ├── .github ├── CONTRIBUTING.md ├── PULL_REQUEST_TEMPLATE.md └── workflows │ ├── CI.yaml │ ├── build-sphinx.yml │ └── new_sphinx_documentation.yml ├── .gitignore ├── .lgtm.yml ├── CODE_OF_CONDUCT.md ├── LICENSE ├── MANIFEST.in ├── README.md ├── bin └── gsm ├── devtools ├── README.md ├── conda-envs │ └── test_env.yaml └── scripts │ └── create_conda_env.py ├── docs ├── Makefile ├── README.md ├── _static │ └── README.md ├── _templates │ └── README.md ├── api.rst ├── conf.py ├── examples.md ├── getting_started.rst ├── index.md ├── index.rst ├── make.bat └── requirements.yaml ├── examples ├── QChem │ ├── DE_GSM │ │ ├── commands.sh │ │ ├── qstart │ │ └── slurm.qsh │ └── SE_GSM │ │ ├── isomers.txt │ │ ├── qstart │ │ └── slurm.qsh ├── TeraChem │ ├── DE-GSM │ │ ├── commands.sh │ │ └── tc_options.txt │ └── SE-GSM │ │ ├── commands.sh │ │ ├── driving_coordinate.txt │ │ └── tc_options.txt ├── ase_api_de_gsm.py └── ase_api_example.py ├── pyGSM ├── __init__.py ├── _version.py ├── coordinate_systems │ ├── COPYRIGHT.txt │ ├── __init__.py │ ├── cartesian.py │ ├── delocalized_coordinates.py │ ├── internal_coordinates.py │ ├── primitive_internals.py │ ├── rotate.py │ ├── slots.py │ └── topology.py ├── data │ ├── README.md │ ├── dftb_in.hsd │ ├── diels_alder.xyz │ ├── dimer.prmtop │ ├── dimer_h2o.pdb │ ├── ethylene.xyz │ ├── ethylene_molpro.com │ ├── look_and_say.dat │ ├── solvated.pdb │ ├── solvated.prmtop │ └── solvated.rst7 ├── growing_string_methods │ ├── __init__.py │ ├── de_gsm.py │ ├── gsm.py │ ├── main_gsm.py │ ├── se_cross.py │ └── se_gsm.py ├── level_of_theories │ ├── __init__.py │ ├── ase.py │ ├── bagel.py │ ├── base_lot.py │ ├── dftb.py │ ├── file_options.py │ ├── molpro.py │ ├── nanoreactor_engine.py │ ├── openmm.py │ ├── orca.py │ ├── pdynamo.py │ ├── pytc.py │ ├── qchem.py │ ├── terachem.py │ ├── terachemcloud.py │ └── xtb_lot.py ├── molecule │ ├── __init__.py │ └── molecule.py ├── optimizers │ ├── __init__.py │ ├── _linesearch.py │ ├── base_optimizer.py │ ├── beales_cg.py │ ├── conjugate_gradient.py │ ├── eigenvector_follow.py │ └── lbfgs.py ├── potential_energy_surfaces │ ├── __init__.py │ ├── avg_pes.py │ ├── penalty_pes.py │ └── pes.py ├── py.typed ├── pyGSM.py ├── tests │ ├── __init__.py │ ├── test_basic_mecp.py │ └── test_pygsm.py └── utilities │ ├── __init__.py │ ├── block_matrix.py │ ├── block_tensor.py │ ├── cli_utils.py │ ├── elements.py │ ├── manage_xyz.py │ ├── math_utils.py │ ├── nifty.py │ ├── options.py │ ├── random_quotes.py │ └── units.py ├── pyproject.toml ├── readthedocs.yml ├── setup.cfg └── setup.py /.codecov.yml: -------------------------------------------------------------------------------- 1 | # Codecov configuration to make it a bit less noisy 2 | coverage: 3 | status: 4 | patch: false 5 | project: 6 | default: 7 | threshold: 50% 8 | comment: 9 | layout: "header" 10 | require_changes: false 11 | branches: null 12 | behavior: default 13 | flags: null 14 | paths: null -------------------------------------------------------------------------------- /.gitattributes: -------------------------------------------------------------------------------- 1 | pyGSM/_version.py export-subst 2 | -------------------------------------------------------------------------------- /.github/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # How to contribute 2 | 3 | We welcome contributions from external contributors, and this document 4 | describes how to merge code changes into the pyGSM repository.
5 | 6 | ## Getting Started 7 | 8 | * Make sure you have a [GitHub account](https://github.com/signup/free). 9 | * [Fork](https://help.github.com/articles/fork-a-repo/) this repository on GitHub. 10 | * On your local machine, 11 | [clone](https://help.github.com/articles/cloning-a-repository/) your fork of 12 | the repository. 13 | 14 | ## Making Changes 15 | 16 | * Add some really awesome code to your local fork. It's usually a [good 17 | idea](http://blog.jasonmeridth.com/posts/do-not-issue-pull-requests-from-your-master-branch/) 18 | to make changes on a 19 | [branch](https://help.github.com/articles/creating-and-deleting-branches-within-your-repository/) 20 | with the branch name relating to the feature you are going to add. 21 | * When you are ready for others to examine and comment on your new feature, 22 | navigate to your fork of pygsm on GitHub and open a [pull 23 | request](https://help.github.com/articles/using-pull-requests/) (PR). Note that 24 | after you launch a PR from one of your fork's branches, all 25 | subsequent commits to that branch will be added to the open pull request 26 | automatically. Each commit added to the PR will be validated for 27 | mergeability, compilation, and test suite compliance; the results of these tests 28 | will be visible on the PR page. 29 | * If you're providing a new feature, you must add test cases and documentation. 30 | * When the code is ready to go, make sure you run the test suite using pytest. 31 | * When you're ready to be considered for merging, check the "Ready to go" 32 | box on the PR page to let the pygsm devs know that the changes are complete. 33 | The code will not be merged until this box is checked, the continuous 34 | integration returns checkmarks, 35 | and multiple core developers give "Approved" reviews. 36 | 37 | # Additional Resources 38 | 39 | * [General GitHub documentation](https://help.github.com/) 40 | * [PR best practices](http://codeinthehole.com/writing/pull-requests-and-other-good-practices-for-teams-using-github/) 41 | * [A guide to contributing to software packages](http://www.contribution-guide.org) 42 | * [Thinkful PR example](http://www.thinkful.com/learn/github-pull-request-tutorial/#Time-to-Submit-Your-First-PR) 43 | -------------------------------------------------------------------------------- /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | ## Description 2 | Provide a brief description of the PR's purpose here. 3 | 4 | ## Todos 5 | Notable points that this PR has either accomplished or will accomplish. 6 | - [ ] TODO 1 7 | 8 | ## Questions 9 | - [ ] Question1 10 | 11 | ## Status 12 | - [ ] Ready to go -------------------------------------------------------------------------------- /.github/workflows/CI.yaml: -------------------------------------------------------------------------------- 1 | name: CI 2 | 3 | on: 4 | # The cookiecutter uses the "--initial-branch" flag when it runs git-init 5 | push: 6 | branches: 7 | - "master" 8 | pull_request: 9 | branches: 10 | - "master" 11 | schedule: 12 | # Weekly tests run on master by default: 13 | # Scheduled workflows run on the latest commit on the default or base branch.
14 | # (from https://help.github.com/en/actions/reference/events-that-trigger-workflows#scheduled-events-schedule) 15 | - cron: "0 0 * * 0" 16 | 17 | jobs: 18 | test: 19 | name: Test on ${{ matrix.os }}, Python ${{ matrix.python-version }} 20 | runs-on: ${{ matrix.os }} 21 | strategy: 22 | matrix: 23 | os: [ubuntu-latest] 24 | python-version: [3.8, 3.9, "3.10"] 25 | 26 | steps: 27 | - uses: actions/checkout@v4 28 | - name: Set up Python ${{ matrix.python-version }} 29 | uses: actions/setup-python@v5 30 | with: 31 | python-version: ${{ matrix.python-version }} 32 | - name: Display Python version 33 | run: python -c "import sys; print(sys.version)" 34 | - name: Install uv 35 | run: | 36 | curl -LsSf https://astral.sh/uv/install.sh | sh 37 | - name: Install dependencies 38 | run: | 39 | uv pip install --system pytest 40 | uv pip install --system -e . 41 | 42 | - name: Run tests 43 | # run in a login shell so the full environment is initialized 44 | shell: bash -l {0} 45 | run: | 46 | pytest -v --cov=pyGSM --cov-report=xml --color=yes pyGSM/tests/ 47 | 48 | - name: CodeCov 49 | uses: codecov/codecov-action@v4 50 | with: 51 | file: ./coverage.xml 52 | flags: unittests 53 | name: codecov-${{ matrix.os }}-py${{ matrix.python-version }} 54 | -------------------------------------------------------------------------------- /.github/workflows/build-sphinx.yml: -------------------------------------------------------------------------------- 1 | on: [push] 2 | 3 | jobs: 4 | build: 5 | name: Sphinx Pages 6 | runs-on: ubuntu-latest 7 | steps: 8 | - uses: seanzhengw/sphinx-pages@master 9 | id: sphinx-pages 10 | with: 11 | github_token: ${{ secrets.GITHUB_TOKEN }} 12 | create_readme: true -------------------------------------------------------------------------------- /.github/workflows/new_sphinx_documentation.yml: -------------------------------------------------------------------------------- 1 | name: Docs 2 | on: [push, pull_request, workflow_dispatch] 3 | permissions: 4 | contents: write 5 | jobs: 6 | docs: 7 | runs-on: ubuntu-latest 8 | steps: 9 | - uses: actions/checkout@v3 10 | - uses: actions/setup-python@v3 11 | - name: Install dependencies 12 | run: | 13 | pip install sphinx sphinx_rtd_theme 14 | - name: Sphinx build 15 | run: | 16 | sphinx-build docs _build 17 | - name: Deploy 18 | uses: peaceiris/actions-gh-pages@v3 19 | if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/master' }} 20 | with: 21 | publish_branch: gh-pages 22 | github_token: ${{ secrets.GITHUB_TOKEN }} 23 | publish_dir: _build/ 24 | force_orphan: true 25 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | env/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | 28 | # PyInstaller 29 | # Usually these files are written by a python script from a template 30 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
31 | *.manifest 32 | *.spec 33 | 34 | # Installer logs 35 | pip-log.txt 36 | pip-delete-this-directory.txt 37 | 38 | # Unit test / coverage reports 39 | htmlcov/ 40 | .tox/ 41 | .coverage 42 | .coverage.* 43 | .cache 44 | .pytest_cache 45 | nosetests.xml 46 | coverage.xml 47 | *.cover 48 | .hypothesis/ 49 | 50 | # Translations 51 | *.mo 52 | *.pot 53 | 54 | # Django stuff: 55 | *.log 56 | local_settings.py 57 | 58 | # Flask stuff: 59 | instance/ 60 | .webassets-cache 61 | 62 | # Scrapy stuff: 63 | .scrapy 64 | 65 | # Sphinx documentation 66 | docs/_build/ 67 | 68 | # PyBuilder 69 | target/ 70 | 71 | # Jupyter Notebook 72 | .ipynb_checkpoints 73 | 74 | # pyenv 75 | .python-version 76 | 77 | # celery beat schedule file 78 | celerybeat-schedule 79 | 80 | # SageMath parsed files 81 | *.sage.py 82 | 83 | # dotenv 84 | .env 85 | 86 | # virtualenv 87 | .venv 88 | venv/ 89 | ENV/ 90 | .venv/** 91 | 92 | # Spyder project settings 93 | .spyderproject 94 | .spyproject 95 | 96 | # Rope project settings 97 | .ropeproject 98 | 99 | # mkdocs documentation 100 | /site 101 | 102 | # mypy 103 | .mypy_cache/ 104 | 105 | # profraw files from LLVM? Unclear exactly what triggers this 106 | # There are reports this comes from LLVM profiling, but also Xcode 9. 107 | *profraw 108 | 109 | -------------------------------------------------------------------------------- /.lgtm.yml: -------------------------------------------------------------------------------- 1 | # Configure LGTM for this package 2 | 3 | extraction: 4 | python: # Configure Python 5 | python_setup: # Configure the setup 6 | version: 3 # Specify Version 3 7 | path_classifiers: 8 | library: 9 | - versioneer.py # Set Versioneer.py to an external "library" (3rd party code) 10 | - devtools/* 11 | generated: 12 | - pygsm/_version.py 13 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | In the interest of fostering an open and welcoming environment, we as 6 | contributors and maintainers pledge to making participation in our project and 7 | our community a harassment-free experience for everyone, regardless of age, 8 | body size, disability, ethnicity, gender identity and expression, level of 9 | experience, nationality, personal appearance, race, religion, or sexual 10 | identity and orientation. 
11 | 12 | ## Our Standards 13 | 14 | Examples of behavior that contributes to creating a positive environment include: 15 | 16 | * Using welcoming and inclusive language 17 | * Being respectful of differing viewpoints and experiences 18 | * Gracefully accepting constructive criticism 19 | * Focusing on what is best for the community 20 | * Showing empathy towards other community members 21 | 22 | Examples of unacceptable behavior by participants include: 23 | 24 | * The use of sexualized language or imagery and unwelcome sexual attention or advances 25 | * Trolling, insulting/derogatory comments, and personal or political attacks 26 | * Public or private harassment 27 | * Publishing others' private information, such as a physical or electronic address, without explicit permission 28 | * Other conduct which could reasonably be considered inappropriate in a professional setting 29 | 30 | ## Our Responsibilities 31 | 32 | Project maintainers are responsible for clarifying the standards of acceptable 33 | behavior and are expected to take appropriate and fair corrective action in 34 | response to any instances of unacceptable behavior. 35 | 36 | Project maintainers have the right and responsibility to remove, edit, or 37 | reject comments, commits, code, wiki edits, issues, and other contributions 38 | that are not aligned to this Code of Conduct, or to ban temporarily or 39 | permanently any contributor for other behaviors that they deem inappropriate, 40 | threatening, offensive, or harmful. 41 | 42 | Moreover, project maintainers will strive to offer feedback and advice to 43 | ensure quality and consistency of contributions to the code. Contributions 44 | from outside the group of project maintainers are strongly welcomed but the 45 | final decision as to whether commits are merged into the codebase rests with 46 | the team of project maintainers. 47 | 48 | ## Scope 49 | 50 | This Code of Conduct applies both within project spaces and in public spaces 51 | when an individual is representing the project or its community. Examples of 52 | representing a project or community include using an official project e-mail 53 | address, posting via an official social media account, or acting as an 54 | appointed representative at an online or offline event. Representation of a 55 | project may be further defined and clarified by project maintainers. 56 | 57 | ## Enforcement 58 | 59 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 60 | reported by contacting the project team at 'codyaldaz@gmail.com'. The project team will 61 | review and investigate all complaints, and will respond in a way that it deems 62 | appropriate to the circumstances. The project team is obligated to maintain 63 | confidentiality with regard to the reporter of an incident. Further details of 64 | specific enforcement policies may be posted separately. 65 | 66 | Project maintainers who do not follow or enforce the Code of Conduct in good 67 | faith may face temporary or permanent repercussions as determined by other 68 | members of the project's leadership. 
69 | 70 | ## Attribution 71 | 72 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], 73 | version 1.4, available at 74 | [http://contributor-covenant.org/version/1/4][version] 75 | 76 | [homepage]: http://contributor-covenant.org 77 | [version]: http://contributor-covenant.org/version/1/4/ 78 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | 2 | MIT License 3 | 4 | Copyright (c) 2022 Cody Aldaz 5 | 6 | Permission is hereby granted, free of charge, to any person obtaining a copy 7 | of this software and associated documentation files (the "Software"), to deal 8 | in the Software without restriction, including without limitation the rights 9 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 10 | copies of the Software, and to permit persons to whom the Software is 11 | furnished to do so, subject to the following conditions: 12 | 13 | The above copyright notice and this permission notice shall be included in all 14 | copies or substantial portions of the Software. 15 | 16 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 17 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 18 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 19 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 20 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 21 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 22 | SOFTWARE. 23 | -------------------------------------------------------------------------------- /MANIFEST.in: -------------------------------------------------------------------------------- 1 | include CODE_OF_CONDUCT.md 2 | 3 | global-exclude *.py[cod] __pycache__ *.so 4 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | pyGSM 2 | ============================== 3 | [//]: # (Badges) 4 | [![GitHub Actions Build Status](https://github.com/ZimmermanGroup/pyGSM/workflows/CI/badge.svg)](https://github.com/ZimmermanGroup/pyGSM/actions?query=workflow%3ACI) 5 | [![codecov](https://codecov.io/gh/ZimmermanGroup/pyGSM/branch/master/graph/badge.svg)](https://codecov.io/gh/ZimmermanGroup/pyGSM/branch/master) 6 | 7 | 8 | Reaction path and photochemistry tool 9 | 10 | 11 | ## Documentation 12 | See https://zimmermangroup.github.io/pyGSM/ 13 | 14 | 15 | ## Install instructions 16 | pyGSM can be installed by setting up a python environment (e.g. venv, conda), cloning the GitHub repository, and installing with pip. From the root directory in the git repository, this would look like: 17 | ``` 18 | pip install -e . 19 | ``` 20 | -------------------------------------------------------------------------------- /devtools/README.md: -------------------------------------------------------------------------------- 1 | # Development, testing, and deployment tools 2 | 3 | This directory contains a collection of tools for running Continuous Integration (CI) tests, 4 | conda installation, and other development tools not directly related to the coding process. 5 | 6 | 7 | ## Manifest 8 | 9 | ### Continuous Integration 10 | 11 | You should test your code, but do not feel compelled to use these specific programs. 
You also may not need Unix and 12 | Windows testing if you only plan to deploy on specific platforms. These are just to help you get started. 13 | 14 | ### Conda Environment: 15 | 16 | This directory contains the files to set up the Conda environment for testing purposes 17 | 18 | * `conda-envs`: directory containing the YAML file(s) which fully describe Conda Environments, their dependencies, and those dependency provenances 19 | * `test_env.yaml`: Simple test environment file with base dependencies. Channels are not specified here and therefore respect global Conda configuration 20 | 21 | ### Additional Scripts: 22 | 23 | This directory contains OS agnostic helper scripts which don't fall in any of the previous categories 24 | * `scripts` 25 | * `create_conda_env.py`: Helper program for spinning up new conda environments based on a starter file with Python Version and Env. Name command-line options 26 | 27 | 28 | ## How to contribute changes 29 | - Clone the repository if you have write access to the main repo, fork the repository if you are a collaborator. 30 | - Make a new branch with `git checkout -b {your branch name}` 31 | - Make changes and test your code 32 | - Ensure that the test environment dependencies (`conda-envs`) line up with the build and deploy dependencies (`conda-recipe/meta.yaml`) 33 | - Push the branch to the repo (either the main or your fork) with `git push -u origin {your branch name}` 34 | * Note that `origin` is the default name assigned to the remote, yours may be different 35 | - Make a PR on GitHub with your changes 36 | - We'll review the changes and get your code into the repo after lively discussion! 37 | 38 | 39 | ## Checklist for updates 40 | - [ ] Make sure there is an/are issue(s) opened for your specific update 41 | - [ ] Create the PR, referencing the issue 42 | - [ ] Debug the PR as needed until tests pass 43 | - [ ] Tag the final, debugged version 44 | * `git tag -a X.Y.Z [latest pushed commit] && git push --follow-tags` 45 | - [ ] Get the PR merged in 46 | 47 | ## Versioneer Auto-version 48 | [Versioneer](https://github.com/warner/python-versioneer) will automatically infer what version 49 | is installed by looking at the `git` tags and how many commits ahead this version is. The format follows 50 | [PEP 440](https://www.python.org/dev/peps/pep-0440/) and has the regular expression of: 51 | ```regexp 52 | \d+\.\d+\.\d+(\+\d+\.g[0-9a-f]+(\.dirty)?)? 53 | ``` 54 | If the version of this commit is the same as a `git` tag, the installed version is the same as the tag, 55 | e.g. `pyGSM-0.1.2`, otherwise it will be appended with `+X` where `X` is the number of commits 56 | ahead of the last tag, and then `.gYYYYYYY` where the `Y`'s are replaced with the `git` commit hash.
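For example, the `pyGSM/_version.py` shipped further down this repository currently reads:

```python
__version__ = "1.0.0+681.g3b920bb.dirty"
```

i.e. 681 commits past the `1.0.0` tag, at commit `3b920bb`, built from a working tree with uncommitted local changes (the `.dirty` suffix).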
57 | -------------------------------------------------------------------------------- /devtools/conda-envs/test_env.yaml: -------------------------------------------------------------------------------- 1 | name: test 2 | channels: 3 | 4 | - conda-forge 5 | 6 | - defaults 7 | dependencies: 8 | # Base depends 9 | - python 10 | - pip 11 | 12 | # Testing 13 | - pytest 14 | - pytest-cov 15 | - codecov 16 | 17 | # Pip-only installs 18 | #- pip: 19 | # - codecov 20 | 21 | -------------------------------------------------------------------------------- /devtools/scripts/create_conda_env.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import os 3 | import re 4 | import glob 5 | import shutil 6 | import subprocess as sp 7 | from tempfile import TemporaryDirectory 8 | from contextlib import contextmanager 9 | # YAML imports 10 | try: 11 | import yaml # PyYAML 12 | loader = yaml.safe_load 13 | except ImportError: 14 | try: 15 | import ruamel_yaml as yaml # Ruamel YAML 16 | except ImportError: 17 | try: 18 | # Load Ruamel YAML from the base conda environment 19 | from importlib import util as import_util 20 | CONDA_BIN = os.path.dirname(os.environ['CONDA_EXE']) 21 | ruamel_yaml_path = glob.glob(os.path.join(CONDA_BIN, '..', 22 | 'lib', 'python*.*', 'site-packages', 23 | 'ruamel_yaml', '__init__.py'))[0] 24 | # Based on importlib example, but only needs to load_module since it's the whole package, not just 25 | # a module 26 | spec = import_util.spec_from_file_location('ruamel_yaml', ruamel_yaml_path) 27 | yaml = spec.loader.load_module() 28 | except (KeyError, ImportError, IndexError): 29 | raise ImportError("No YAML parser could be found in this or the conda environment. " 30 | "Could not find PyYAML or Ruamel YAML in the current environment, " 31 | "AND could not find Ruamel YAML in the base conda environment through CONDA_EXE path. " 32 | "Environment not created!") 33 | loader = yaml.YAML(typ="safe").load # typ="safe" avoids odd typing on output 34 | 35 | 36 | @contextmanager 37 | def temp_cd(): 38 | """Temporary CD Helper""" 39 | cwd = os.getcwd() 40 | with TemporaryDirectory() as td: 41 | try: 42 | os.chdir(td) 43 | yield 44 | finally: 45 | os.chdir(cwd) 46 | 47 | 48 | # Args 49 | parser = argparse.ArgumentParser(description='Creates a conda environment from file for a given Python version.') 50 | parser.add_argument('-n', '--name', type=str, 51 | help='The name of the created Python environment') 52 | parser.add_argument('-p', '--python', type=str, 53 | help='The version of the created Python environment') 54 | parser.add_argument('conda_file', 55 | help='The file for the created Python environment') 56 | 57 | args = parser.parse_args() 58 | 59 | # Open the base file 60 | with open(args.conda_file, "r") as handle: 61 | yaml_script = loader(handle.read()) 62 | 63 | python_replacement_string = "python {}*".format(args.python) 64 | 65 | try: 66 | for dep_index, dep_value in enumerate(yaml_script['dependencies']): 67 | if re.match('python([ ><=*]+[0-9.*]*)?$', dep_value): # Match explicitly 'python' and its formats 68 | yaml_script['dependencies'].pop(dep_index) 69 | break # Making the assumption there is only one Python entry, also avoids need to enumerate in reverse 70 | except (KeyError, TypeError): 71 | # Case of no dependencies key, or dependencies: None 72 | yaml_script['dependencies'] = [] 73 | finally: 74 | # Ensure the python version is added in.
Even if the code does not need it, we assume the env does 75 | yaml_script['dependencies'].insert(0, python_replacement_string) 76 | 77 | # Figure out conda path 78 | if "CONDA_EXE" in os.environ: 79 | conda_path = os.environ["CONDA_EXE"] 80 | else: 81 | conda_path = shutil.which("conda") 82 | if conda_path is None: 83 | raise RuntimeError("Could not find a conda binary in CONDA_EXE variable or in executable search path") 84 | 85 | print("CONDA ENV NAME {}".format(args.name)) 86 | print("PYTHON VERSION {}".format(args.python)) 87 | print("CONDA FILE NAME {}".format(args.conda_file)) 88 | print("CONDA PATH {}".format(conda_path)) 89 | 90 | # Write to a temp directory which will always be cleaned up 91 | with temp_cd(): 92 | temp_file_name = "temp_script.yaml" 93 | with open(temp_file_name, 'w') as f: 94 | f.write(yaml.dump(yaml_script)) 95 | sp.call("{} env create -n {} -f {}".format(conda_path, args.name, temp_file_name), shell=True) 96 | -------------------------------------------------------------------------------- /docs/Makefile: -------------------------------------------------------------------------------- 1 | # Minimal makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line. 5 | SPHINXOPTS = 6 | SPHINXBUILD = sphinx-build 7 | SPHINXPROJ = pyGSM 8 | SOURCEDIR = . 9 | BUILDDIR = _build 10 | 11 | # Put it first so that "make" without argument is like "make help". 12 | help: 13 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) 14 | 15 | .PHONY: help Makefile 16 | 17 | # Catch-all target: route all unknown targets to Sphinx using the new 18 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). 19 | %: Makefile 20 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) -------------------------------------------------------------------------------- /docs/README.md: -------------------------------------------------------------------------------- 1 | # Compiling pyGSM's Documentation 2 | 3 | The docs for this project are built with [Sphinx](http://www.sphinx-doc.org/en/master/). 4 | To compile the docs, first ensure that Sphinx and the ReadTheDocs theme are installed. 5 | 6 | 7 | ```bash 8 | conda install sphinx sphinx_rtd_theme 9 | ``` 10 | 11 | 12 | Once installed, you can use the `Makefile` in this directory to compile static HTML pages by 13 | ```bash 14 | make html 15 | ``` 16 | 17 | The compiled docs will be in the `_build` directory and can be viewed by opening `index.html` (which may itself 18 | be inside a directory called `html/` depending on what version of Sphinx is installed). 19 | 20 | 21 | A configuration file for [Read The Docs](https://readthedocs.org/) (readthedocs.yml) is included in the top level of the repository. To use Read the Docs to host your documentation, go to https://readthedocs.org/ and connect this repository. You may need to change your default branch to `main` under Advanced Settings for the project. 22 | 23 | If you would like to use Read The Docs with `autodoc` (included automatically) and your package has dependencies, you will need to include those dependencies in your documentation yaml file (`docs/requirements.yaml`).
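A minimal sketch of such a file (assuming, purely for illustration, that the package needs `numpy` importable when `autodoc` runs; substitute your real runtime dependencies) could look like:

```yaml
name: docs
channels:
  - conda-forge
dependencies:
  - python
  - pip
  - sphinx
  - sphinx_rtd_theme
  - numpy  # placeholder for whatever your modules import at autodoc time
```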
24 | 25 | -------------------------------------------------------------------------------- /docs/_static/README.md: -------------------------------------------------------------------------------- 1 | # Static Doc Directory 2 | 3 | Add any paths that contain custom static files (such as style sheets) here, 4 | relative to the `conf.py` file's directory. 5 | They are copied after the builtin static files, 6 | so a file named "default.css" will overwrite the builtin "default.css". 7 | 8 | The path to this folder is set in the Sphinx `conf.py` file in the line: 9 | ```python 10 | html_static_path = ['_static'] 11 | ``` 12 | 13 | ## Examples of file to add to this directory 14 | * Custom Cascading Style Sheets 15 | * Custom JavaScript code 16 | * Static logo images 17 | -------------------------------------------------------------------------------- /docs/_templates/README.md: -------------------------------------------------------------------------------- 1 | # Templates Doc Directory 2 | 3 | Add any paths that contain templates here, relative to 4 | the `conf.py` file's directory. 5 | They are copied after the builtin template files, 6 | so a file named "page.html" will overwrite the builtin "page.html". 7 | 8 | The path to this folder is set in the Sphinx `conf.py` file in the line: 9 | ```python 10 | templates_path = ['_templates'] 11 | ``` 12 | 13 | ## Examples of file to add to this directory 14 | * HTML extensions of stock pages like `page.html` or `layout.html` 15 | -------------------------------------------------------------------------------- /docs/api.rst: -------------------------------------------------------------------------------- 1 | API Documentation 2 | ================= 3 | 4 | .. autosummary:: 5 | :toctree: autosummary 6 | 7 | pyGSM.molecule 8 | -------------------------------------------------------------------------------- /docs/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # Configuration file for the Sphinx documentation builder. 4 | # 5 | # This file does only contain a selection of the most common options. For a 6 | # full list see the documentation: 7 | # http://www.sphinx-doc.org/en/stable/config 8 | 9 | # -- Path setup -------------------------------------------------------------- 10 | 11 | # If extensions (or modules to document with autodoc) are in another directory, 12 | # add these directories to sys.path here. If the directory is relative to the 13 | # documentation root, use os.path.abspath to make it absolute, like shown here. 14 | 15 | # In case the project was not installed 16 | import os 17 | import sys 18 | sys.path.insert(0, os.path.abspath('..')) 19 | 20 | import pyGSM 21 | 22 | 23 | # -- Project information ----------------------------------------------------- 24 | 25 | project = 'pyGSM' 26 | copyright = ("2022, Cody Aldaz. Project structure based on the " 27 | "Computational Molecular Science Python Cookiecutter version 1.1") 28 | author = 'Cody Aldaz' 29 | 30 | # The short X.Y version 31 | version = '' 32 | # The full version, including alpha/beta/rc tags 33 | release = '' 34 | 35 | 36 | # -- General configuration --------------------------------------------------- 37 | 38 | # If your documentation needs a minimal Sphinx version, state it here. 39 | # 40 | # needs_sphinx = '1.0' 41 | 42 | # Add any Sphinx extension module names here, as strings. They can be 43 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom 44 | # ones.
45 | extensions = [ 46 | 'sphinx.ext.autosummary', 47 | 'sphinx.ext.autodoc', 48 | 'sphinx.ext.mathjax', 49 | 'sphinx.ext.viewcode', 50 | 'sphinx.ext.napoleon', 51 | 'sphinx.ext.intersphinx', 52 | 'sphinx.ext.extlinks', 53 | ] 54 | 55 | autosummary_generate = True 56 | napoleon_google_docstring = False 57 | napoleon_use_param = False 58 | napoleon_use_ivar = True 59 | 60 | # Add any paths that contain templates here, relative to this directory. 61 | templates_path = ['_templates'] 62 | 63 | # The suffix(es) of source filenames. 64 | # You can specify multiple suffix as a list of string: 65 | # 66 | # source_suffix = ['.rst', '.md'] 67 | source_suffix = '.rst' 68 | 69 | # The master toctree document. 70 | master_doc = 'index' 71 | 72 | # The language for content autogenerated by Sphinx. Refer to documentation 73 | # for a list of supported languages. 74 | # 75 | # This is also used if you do content translation via gettext catalogs. 76 | # Usually you set "language" from the command line for these cases. 77 | language = None 78 | 79 | # List of patterns, relative to source directory, that match files and 80 | # directories to ignore when looking for source files. 81 | # This pattern also affects html_static_path and html_extra_path . 82 | exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store'] 83 | 84 | # The name of the Pygments (syntax highlighting) style to use. 85 | pygments_style = 'default' 86 | 87 | 88 | # -- Options for HTML output ------------------------------------------------- 89 | 90 | # The theme to use for HTML and HTML Help pages. See the documentation for 91 | # a list of builtin themes. 92 | # 93 | html_theme = 'sphinx_rtd_theme' 94 | 95 | # Theme options are theme-specific and customize the look and feel of a theme 96 | # further. For a list of options available for each theme, see the 97 | # documentation. 98 | # 99 | # html_theme_options = {} 100 | 101 | # Add any paths that contain custom static files (such as style sheets) here, 102 | # relative to this directory. They are copied after the builtin static files, 103 | # so a file named "default.css" will overwrite the builtin "default.css". 104 | html_static_path = ['_static'] 105 | 106 | # Custom sidebar templates, must be a dictionary that maps document names 107 | # to template names. 108 | # 109 | # The default sidebars (for documents that don't match any pattern) are 110 | # defined by theme itself. Builtin themes are using these templates by 111 | # default: ``['localtoc.html', 'relations.html', 'sourcelink.html', 112 | # 'searchbox.html']``. 113 | # 114 | # html_sidebars = {} 115 | 116 | 117 | # -- Options for HTMLHelp output --------------------------------------------- 118 | 119 | # Output file base name for HTML help builder. 120 | htmlhelp_basename = 'pyGSMdoc' 121 | 122 | 123 | # -- Options for LaTeX output ------------------------------------------------ 124 | 125 | latex_elements = { 126 | # The paper size ('letterpaper' or 'a4paper'). 127 | # 128 | # 'papersize': 'letterpaper', 129 | 130 | # The font size ('10pt', '11pt' or '12pt'). 131 | # 132 | # 'pointsize': '10pt', 133 | 134 | # Additional stuff for the LaTeX preamble. 135 | # 136 | # 'preamble': '', 137 | 138 | # Latex figure (float) alignment 139 | # 140 | # 'figure_align': 'htbp', 141 | } 142 | 143 | # Grouping the document tree into LaTeX files. List of tuples 144 | # (source start file, target name, title, 145 | # author, documentclass [howto, manual, or own class]). 
146 | latex_documents = [ 147 | (master_doc, 'pyGSM.tex', 'pyGSM Documentation', 148 | 'pyGSM', 'manual'), 149 | ] 150 | 151 | 152 | # -- Options for manual page output ------------------------------------------ 153 | 154 | # One entry per manual page. List of tuples 155 | # (source start file, name, description, authors, manual section). 156 | man_pages = [ 157 | (master_doc, 'pyGSM', 'pyGSM Documentation', 158 | [author], 1) 159 | ] 160 | 161 | 162 | # -- Options for Texinfo output ---------------------------------------------- 163 | 164 | # Grouping the document tree into Texinfo files. List of tuples 165 | # (source start file, target name, title, author, 166 | # dir menu entry, description, category) 167 | texinfo_documents = [ 168 | (master_doc, 'pyGSM', 'pyGSM Documentation', 169 | author, 'pyGSM', 'Reaction path and photochemistry tool', 170 | 'Miscellaneous'), 171 | ] 172 | 173 | 174 | # -- Extension configuration ------------------------------------------------- 175 | -------------------------------------------------------------------------------- /docs/examples.md: -------------------------------------------------------------------------------- 1 | The following examples use `Q-Chem`; other packages require a differently formatted `lot_inp_file`. 2 | 3 | ## DE-GSM 4 | diels_alder.xyz contains both the reactant and the product (atoms must be in the same order) 5 | 6 | ```sh 7 | gsm -xyzfile diels_alder.xyz \ 8 | -mode DE_GSM \ 9 | -nnodes 11 \ 10 | -package QChem \ 11 | -lot_inp_file qstart \ 12 | -ID $SLURM_ARRAY_TASK_ID \ 13 | -coordinate_type DLC > log 2>&1 14 | ``` 15 | 16 | And `qstart` is an ordinary Q-Chem input with the charge and multiplicity omitted 17 | 18 | ``` 19 | $rem 20 | JOBTYPE FORCE 21 | EXCHANGE B3LYP 22 | SCF_ALGORITHM diis 23 | SCF_MAX_CYCLES 150 24 | SCF_CONVERGENCE 6 25 | basis 6-31G* 26 | WAVEFUNCTION_ANALYSIS FALSE 27 | SYM_IGNORE TRUE 28 | SYMMETRY FALSE 29 | XC_GRID 1 30 | $end 31 | ``` 32 | 33 | ## SE-GSM 34 | 35 | ```sh 36 | gsm -xyzfile diels_alder.xyz \ 37 | -isomers isomers.txt \ 38 | -mode SE_GSM \ 39 | -package QChem \ 40 | -lot_inp_file qstart \ 41 | -ID $SLURM_ARRAY_TASK_ID \ 42 | -coordinate_type DLC > log 2>&1 43 | ``` 44 | 45 | Where isomers.txt is 46 | 47 | ``` 48 | ADD 4 12 49 | ADD 1 11 50 | ``` 51 | 52 | 53 | ## API Example 54 | 55 | ```python 56 | from pygsm.level_of_theories.qchem import QChem 57 | from pygsm.potential_energy_surfaces import PES 58 | from pygsm.optimizers import * 59 | from pygsm.wrappers import Molecule 60 | from pygsm.utilities import * 61 | from pygsm.coordinate_systems import Topology,PrimitiveInternalCoordinates,DelocalizedInternalCoordinates 62 | import numpy as np 63 | 64 | 65 | def main(): 66 | geom = manage_xyz.read_xyz("geom.xyz") 67 | xyz = manage_xyz.xyz_to_np(geom) 68 | 69 | nifty.printcool(" Building the LOT") 70 | lot = QChem.from_options( 71 | lot_inp_file="qstart", 72 | states=[(1,0)], 73 | geom=geom, 74 | ) 75 | 76 | nifty.printcool(" Building the PES") 77 | pes = PES.from_options( 78 | lot=lot, 79 | ad_idx=0, 80 | multiplicity=1, 81 | ) 82 | 83 | nifty.printcool("Building the topology") 84 | atom_symbols = manage_xyz.get_atoms(geom) 85 | ELEMENT_TABLE = elements.ElementData() 86 | atoms = [ELEMENT_TABLE.from_symbol(atom) for atom in atom_symbols] 87 | top = Topology.build_topology( 88 | xyz, 89 | atoms, 90 | ) 91 | 92 | nifty.printcool("Building Primitive Internal Coordinates") 93 | p1 = PrimitiveInternalCoordinates.from_options( 94 | xyz=xyz, 95 | atoms=atoms, 96 | addtr=False, # Add TRIC 97 | topology=top, 98 | ) 99 | 100 |
nifty.printcool("Building Delocalized Internal Coordinates") 101 | coord_obj1 = DelocalizedInternalCoordinates.from_options( 102 | xyz=xyz, 103 | atoms=atoms, 104 | addtr = False, # Add TRIC 105 | ) 106 | 107 | nifty.printcool("Building Molecule") 108 | reactant = Molecule.from_options( 109 | geom=geom, 110 | PES=pes, 111 | coord_obj = coord_obj1, 112 | Form_Hessian=True, 113 | ) 114 | 115 | print(" Done creating molecule") 116 | optimizer = eigenvector_follow.from_options(Linesearch='backtrack',OPTTHRESH=0.0005,DMAX=0.5,abs_max_step=0.5,conv_Ediff=0.5) 117 | 118 | print("initial energy is {:5.4f}".format(reactant.energy)) 119 | geoms,energies = optimizer.optimize( 120 | molecule=reactant, 121 | refE=reactant.energy, 122 | opt_steps=500, 123 | verbose=True, 124 | ) 125 | 126 | print("Final energy is {:5.4f}".format(reactant.energy)) 127 | manage_xyz.write_xyz('minimized.xyz',geoms[-1],energies[-1],scale=1.) 128 | 129 | if __name__=='__main__': 130 | main() 131 | ``` 132 | 133 | -------------------------------------------------------------------------------- /docs/getting_started.rst: -------------------------------------------------------------------------------- 1 | Getting Started 2 | =============== 3 | 4 | This page details how to get started with pyGSM. 5 | 6 | .. code-block:: python 7 | 8 | import pyGSM 9 | 10 | 11 | You can use this ... -------------------------------------------------------------------------------- /docs/index.md: -------------------------------------------------------------------------------- 1 | # Welcome to the pyGSM documentation! 2 | 3 | ## Intro 4 | PyGSM is a reaction path and geometry optimization package. GSM was previously written in C++ but was difficult to modify and compile. PyGSM attempts to overcome these challenges to allow for easy prototyping (e.g. custom calculations), improved readability, and greater cross-platform support. 5 | 6 | There are a number of changes to this code from the previous code, most greatly is the use of the level of theory object, potential energy surface object, and the separation of the GSM subtypes (e.g. single-ended and double-ended). These changes allow for greater modularity but require understanding the building blocks. 7 | 8 | Also, please see complete examples in the examples folder of the repository. 9 | 10 | ## Current Capabilities 11 | * geometry optimization 12 | * minimum energy crossing point optimization 13 | * minimum energy conical intersection optimization 14 | * single-ended GSM 15 | * double-ended GSM 16 | * single-ended MECP/MECI optimization 17 | 18 | 19 | 20 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | .. pyGSM documentation master file, created by 2 | sphinx-quickstart on Thu Mar 15 13:55:56 2018. 3 | You can adapt this file completely to your liking, but it should at least 4 | contain the root `toctree` directive. 5 | 6 | Welcome to pyGSM's documentation! 7 | ========================================================= 8 | This is a package. 9 | 10 | .. 
toctree:: 11 | :maxdepth: 2 12 | :caption: Contents: 13 | 14 | getting_started 15 | api 16 | 17 | 18 | 19 | Indices and tables 20 | ================== 21 | 22 | * :ref:`genindex` 23 | * :ref:`modindex` 24 | * :ref:`search` 25 | -------------------------------------------------------------------------------- /docs/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | pushd %~dp0 4 | 5 | REM Command file for Sphinx documentation 6 | 7 | if "%SPHINXBUILD%" == "" ( 8 | set SPHINXBUILD=sphinx-build 9 | ) 10 | set SOURCEDIR=. 11 | set BUILDDIR=_build 12 | set SPHINXPROJ=pyGSM 13 | 14 | if "%1" == "" goto help 15 | 16 | %SPHINXBUILD% >NUL 2>NUL 17 | if errorlevel 9009 ( 18 | echo. 19 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx 20 | echo.installed, then set the SPHINXBUILD environment variable to point 21 | echo.to the full path of the 'sphinx-build' executable. Alternatively you 22 | echo.may add the Sphinx directory to PATH. 23 | echo. 24 | echo.If you don't have Sphinx installed, grab it from 25 | echo.http://sphinx-doc.org/ 26 | exit /b 1 27 | ) 28 | 29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 30 | goto end 31 | 32 | :help 33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% 34 | 35 | :end 36 | popd 37 | -------------------------------------------------------------------------------- /docs/requirements.yaml: -------------------------------------------------------------------------------- 1 | name: docs 2 | channels: 3 | dependencies: 4 | # Base depends 5 | - python 6 | - pip 7 | 8 | 9 | 10 | # Pip-only installs 11 | #- pip: 12 | 13 | -------------------------------------------------------------------------------- /examples/QChem/DE_GSM/commands.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | #This file was created on 06/06/2019 3 | 4 | module load pygsm 5 | 6 | gsm -xyzfile ../../../data/diels_alder.xyz -mode DE_GSM -package QChem -lot_inp_file qstart > log & 7 | 8 | -------------------------------------------------------------------------------- /examples/QChem/DE_GSM/qstart: -------------------------------------------------------------------------------- 1 | $rem 2 | JOBTYPE FORCE 3 | EXCHANGE B3LYP 4 | SCF_ALGORITHM diis 5 | SCF_MAX_CYCLES 150 6 | SCF_CONVERGENCE 6 7 | basis 6-31G* 8 | WAVEFUNCTION_ANALYSIS FALSE 9 | SYM_IGNORE TRUE 10 | SYMMETRY FALSE 11 | XC_GRID 1 12 | $end 13 | 14 | $molecule 15 | -------------------------------------------------------------------------------- /examples/QChem/DE_GSM/slurm.qsh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | #SBATCH -p zimintel --job-name=DE_GSM 3 | #SBATCH --array=1 4 | #SBATCH --output=std.output 5 | #SBATCH --error=std.error 6 | #SBATCH --nodes=1 7 | #SBATCH -c 12 8 | #SBATCH --time=48:00:00 9 | 10 | # load modules 11 | . /etc/profile.d/slurm.sh 12 | module load qchem 13 | module load pygsm 14 | 15 | # pygsm will automatically read the number of processors 16 | # use the -c option to specify threads. 
python will use 17 | # the threads to make calculations faster 18 | 19 | #run job 20 | gsm -coordinate_type DLC \ 21 | -xyzfile ../../../data/diels_alder.xyz \ 22 | -mode DE_GSM \ 23 | -package QChem \ 24 | -lot_inp_file qstart \ 25 | -ID $SLURM_ARRAY_TASK_ID > log 2>&1 26 | 27 | ID=`printf "%0*d\n" 3 $SLURM_ARRAY_TASK_ID` 28 | rm -rf $QCSCRATCH/string_$ID 29 | 30 | exit 31 | 32 | -------------------------------------------------------------------------------- /examples/QChem/SE_GSM/isomers.txt: -------------------------------------------------------------------------------- 1 | ADD 4 12 2 | ADD 1 11 3 | -------------------------------------------------------------------------------- /examples/QChem/SE_GSM/qstart: -------------------------------------------------------------------------------- 1 | $rem 2 | JOBTYPE FORCE 3 | EXCHANGE B3LYP 4 | SCF_ALGORITHM diis 5 | SCF_MAX_CYCLES 150 6 | SCF_CONVERGENCE 6 7 | basis 6-31G* 8 | WAVEFUNCTION_ANALYSIS FALSE 9 | SYM_IGNORE TRUE 10 | SYMMETRY FALSE 11 | XC_GRID 1 12 | $end 13 | 14 | $molecule 15 | -------------------------------------------------------------------------------- /examples/QChem/SE_GSM/slurm.qsh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | #SBATCH -p zimamd --job-name=SE_GSM 3 | #SBATCH --array=1 4 | #SBATCH --output=std.output 5 | #SBATCH --error=std.error 6 | #SBATCH --nodes=1 7 | #SBATCH -c 12 8 | #SBATCH --time=48:00:00 9 | 10 | # load modules 11 | . /etc/profile.d/slurm.sh 12 | module load qchem 13 | module load pygsm 14 | 15 | # pygsm will automatically read the number of processors 16 | # use the -c option to specify threads. python will use 17 | # the threads to make calculations faster 18 | 19 | #run job 20 | gsm -xyzfile ../../../data/diels_alder.xyz \ 21 | -isomers isomers.txt \ 22 | -mode SE_GSM \ 23 | -package QChem \ 24 | -lot_inp_file qstart \ 25 | -ID $SLURM_ARRAY_TASK_ID \ 26 | -coordinate_type DLC > log 2>&1 27 | 28 | ID=`printf "%0*d\n" 3 $SLURM_ARRAY_TASK_ID` 29 | rm -rf $QCSCRATCH/string_$ID 30 | 31 | exit 32 | 33 | -------------------------------------------------------------------------------- /examples/TeraChem/DE-GSM/commands.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | gsm -xyzfile ../../../data/diels_alder.xyz \ 4 | -mode DE_GSM \ 5 | -num_nodes 11 \ 6 | -package TeraChem \ 7 | -lot_inp_file tc_options.txt \ 8 | -interp_method Geodesic \ 9 | -coordinate_type DLC > log 2>&1 10 | 11 | -------------------------------------------------------------------------------- /examples/TeraChem/DE-GSM/tc_options.txt: -------------------------------------------------------------------------------- 1 | basis 6-31gs 2 | method B3LYP 3 | convthre 1e-7 4 | threall 1e-14 5 | scf diis+a 6 | maxit 200 7 | charge 0 8 | spinmult 1 9 | purify no 10 | precision mixed 11 | gpus 1 12 | 13 | -------------------------------------------------------------------------------- /examples/TeraChem/SE-GSM/commands.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | gsm -xyzfile ../../../data/diels_alder.xyz \ 4 | -mode SE_GSM \ 5 | -isomers driving_coordinate.txt \ 6 | -package TeraChem \ 7 | -lot_inp_file tc_options.txt \ 8 | -coordinate_type DLC > log 2>&1 9 | 10 | -------------------------------------------------------------------------------- /examples/TeraChem/SE-GSM/driving_coordinate.txt:
-------------------------------------------------------------------------------- 1 | ADD 11 1 2 | ADD 12 4 3 | -------------------------------------------------------------------------------- /examples/TeraChem/SE-GSM/tc_options.txt: -------------------------------------------------------------------------------- 1 | basis 6-31gs 2 | method B3LYP 3 | convthre 1e-7 4 | threall 1e-14 5 | scf diis+a 6 | maxit 200 7 | charge 0 8 | spinmult 1 9 | purify no 10 | precision mixed 11 | gpus 1 12 | 13 | -------------------------------------------------------------------------------- /examples/ase_api_de_gsm.py: -------------------------------------------------------------------------------- 1 | """ 2 | Wrapper to the code using ASE 3 | 4 | """ 5 | 6 | import os 7 | 8 | try: 9 | from ase import Atoms 10 | import ase.io 11 | except ModuleNotFoundError: 12 | pass 13 | 14 | from pyGSM.coordinate_systems.delocalized_coordinates import DelocalizedInternalCoordinates 15 | from pyGSM.coordinate_systems.primitive_internals import PrimitiveInternalCoordinates 16 | from pyGSM.coordinate_systems.topology import Topology 17 | from pyGSM.growing_string_methods import DE_GSM 18 | from pyGSM.level_of_theories.ase import ASELoT 19 | from pyGSM.optimizers.eigenvector_follow import eigenvector_follow 20 | from pyGSM.optimizers.lbfgs import lbfgs 21 | from pyGSM.potential_energy_surfaces import PES 22 | from pyGSM.utilities import nifty 23 | from pyGSM.utilities.elements import ElementData 24 | from pyGSM.wrappers.molecule import Molecule 25 | from .main import post_processing, cleanup_scratch 26 | 27 | 28 | def minimal_wrapper_de_gsm( 29 | atoms_reactant: Atoms, 30 | atoms_product: Atoms, 31 | calculator, 32 | fixed_reactant=False, 33 | fixed_product=False, 34 | ): 35 | # LOT 36 | # 'EST_Package' 37 | # 'nproc': args.nproc, 38 | # 'states': None, 39 | 40 | # PES 41 | # pes_type = "PES" 42 | # 'PES_type': args.pes_type, 43 | # 'adiabatic_index': args.adiabatic_index, 44 | # 'multiplicity': args.multiplicity, 45 | # 'FORCE_FILE': args.FORCE_FILE, 46 | # 'RESTRAINT_FILE': args.RESTRAINT_FILE, 47 | # 'FORCE': None, 48 | # 'RESTRAINTS': None, 49 | 50 | # optimizer 51 | optimizer_method = "eigenvector_follow" # OR "lbfgs" 52 | line_search = 'NoLineSearch' # OR: 'backtrack' 53 | only_climb = True 54 | # 'opt_print_level': args.opt_print_level, 55 | step_size_cap = 0.1 # DMAX in the other wrapper 56 | 57 | # molecule 58 | coordinate_type = "TRIC" 59 | # 'hybrid_coord_idx_file': args.hybrid_coord_idx_file, 60 | # 'frozen_coord_idx_file': args.frozen_coord_idx_file, 61 | # 'prim_idx_file': args.prim_idx_file, 62 | 63 | # GSM 64 | # gsm_type = "DE_GSM" # SE_GSM, SE_Cross 65 | num_nodes = 11 # 20 for SE-GSM 66 | # 'isomers_file': args.isomers, # driving coordinates, this is a file 67 | add_node_tol = 0.1 # convergence for adding new nodes 68 | conv_tol = 0.0005 # Convergence tolerance for optimizing nodes 69 | conv_Ediff = 100. # Energy difference convergence of optimization. 70 | # 'conv_dE': args.conv_dE, 71 | conv_gmax = 100. 
# Max grad rms threshold 72 | # 'BDIST_RATIO': args.BDIST_RATIO, 73 | # 'DQMAG_MAX': args.DQMAG_MAX, 74 | # 'growth_direction': args.growth_direction, 75 | ID = 0 76 | # 'gsm_print_level': args.gsm_print_level, 77 | max_gsm_iterations = 100 78 | max_opt_steps = 3 # 20 for SE-GSM 79 | # 'use_multiprocessing': args.use_multiprocessing, 80 | # 'sigma': args.sigma, 81 | 82 | nifty.printcool('Parsed GSM') 83 | 84 | # LOT 85 | lot = ASELoT.from_options(calculator, 86 | geom=[[x.symbol, *x.position] for x in atoms_reactant], 87 | ID=ID) 88 | 89 | # PES 90 | pes_obj = PES.from_options(lot=lot, ad_idx=0, multiplicity=1) 91 | 92 | # Build the topology 93 | nifty.printcool("Building the topologies") 94 | element_table = ElementData() 95 | elements = [element_table.from_symbol(sym) for sym in atoms_reactant.get_chemical_symbols()] 96 | 97 | topology_reactant = Topology.build_topology( 98 | xyz=atoms_reactant.get_positions(), 99 | atoms=elements 100 | ) 101 | topology_product = Topology.build_topology( 102 | xyz=atoms_product.get_positions(), 103 | atoms=elements 104 | ) 105 | 106 | # Union of bonds 107 | # debated if needed here or not 108 | for bond in topology_product.edges(): 109 | if bond in topology_reactant.edges() or (bond[1], bond[0]) in topology_reactant.edges(): 110 | continue 111 | print(" Adding bond {} to reactant topology".format(bond)) 112 | if bond[0] > bond[1]: 113 | topology_reactant.add_edge(bond[0], bond[1]) 114 | else: 115 | topology_reactant.add_edge(bond[1], bond[0]) 116 | 117 | # primitive internal coordinates 118 | nifty.printcool("Building Primitive Internal Coordinates") 119 | 120 | prim_reactant = PrimitiveInternalCoordinates.from_options( 121 | xyz=atoms_reactant.get_positions(), 122 | atoms=elements, 123 | topology=topology_reactant, 124 | connect=coordinate_type == "DLC", 125 | addtr=coordinate_type == "TRIC", 126 | addcart=coordinate_type == "HDLC", 127 | ) 128 | 129 | prim_product = PrimitiveInternalCoordinates.from_options( 130 | xyz=atoms_product.get_positions(), 131 | atoms=elements, 132 | topology=topology_product, 133 | connect=coordinate_type == "DLC", 134 | addtr=coordinate_type == "TRIC", 135 | addcart=coordinate_type == "HDLC", 136 | ) 137 | 138 | # add product coords to reactant coords 139 | prim_reactant.add_union_primitives(prim_product) 140 | 141 | # Delocalised internal coordinates 142 | nifty.printcool("Building Delocalized Internal Coordinates") 143 | deloc_coords_reactant = DelocalizedInternalCoordinates.from_options( 144 | xyz=atoms_reactant.get_positions(), 145 | atoms=elements, 146 | connect=coordinate_type == "DLC", 147 | addtr=coordinate_type == "TRIC", 148 | addcart=coordinate_type == "HDLC", 149 | primitives=prim_reactant 150 | ) 151 | 152 | # Molecules 153 | nifty.printcool("Building the reactant object with {}".format(coordinate_type)) 154 | from_hessian = optimizer_method == "eigenvector_follow" 155 | 156 | molecule_reactant = Molecule.from_options( 157 | geom=[[x.symbol, *x.position] for x in atoms_reactant], 158 | PES=pes_obj, 159 | coord_obj=deloc_coords_reactant, 160 | Form_Hessian=from_hessian 161 | ) 162 | 163 | molecule_product = Molecule.copy_from_options( 164 | molecule_reactant, 165 | xyz=atoms_product.get_positions(), 166 | new_node_id=num_nodes - 1, 167 | copy_wavefunction=False 168 | ) 169 | 170 | # optimizer 171 | nifty.printcool("Building the Optimizer object") 172 | opt_options = dict(print_level=1, 173 | Linesearch=line_search, 174 | update_hess_in_bg=not (only_climb or optimizer_method == "lbfgs"), 175 | 
conv_Ediff=conv_Ediff, 176 | conv_gmax=conv_gmax, 177 | DMAX=step_size_cap, 178 | opt_climb=only_climb) 179 | if optimizer_method == "eigenvector_follow": 180 | optimizer_object = eigenvector_follow.from_options(**opt_options) 181 | elif optimizer_method == "lbfgs": 182 | optimizer_object = lbfgs.from_options(**opt_options) 183 | else: 184 | raise NotImplementedError 185 | 186 | # GSM 187 | nifty.printcool("Building the GSM object") 188 | gsm = DE_GSM.from_options( 189 | reactant=molecule_reactant, 190 | product=molecule_product, 191 | nnodes=num_nodes, 192 | CONV_TOL=conv_tol, 193 | CONV_gmax=conv_gmax, 194 | CONV_Ediff=conv_Ediff, 195 | ADD_NODE_TOL=add_node_tol, 196 | growth_direction=0, # I am not sure how this works 197 | optimizer=optimizer_object, 198 | ID=ID, 199 | print_level=1, 200 | mp_cores=1, # parallelism not tested yet with the ASE calculators 201 | interp_method="DLC", 202 | ) 203 | 204 | # optimize reactant and product if needed 205 | if not fixed_reactant: 206 | nifty.printcool("REACTANT GEOMETRY NOT FIXED!!! OPTIMIZING") 207 | path = os.path.join(os.getcwd(), 'scratch', f"{ID:03}", "0") 208 | optimizer_object.optimize( 209 | molecule=molecule_reactant, 210 | refE=molecule_reactant.energy, 211 | opt_steps=100, 212 | path=path 213 | ) 214 | if not fixed_product: 215 | nifty.printcool("PRODUCT GEOMETRY NOT FIXED!!! OPTIMIZING") 216 | path = os.path.join(os.getcwd(), 'scratch', f"{ID:03}", str(num_nodes - 1)) 217 | optimizer_object.optimize( 218 | molecule=molecule_product, 219 | refE=molecule_product.energy, 220 | opt_steps=100, 221 | path=path 222 | ) 223 | 224 | # set 'rtype' following the main wrapper's convention 225 | if only_climb: 226 | rtype = 1 227 | # elif no_climb: 228 | # rtype = 0 229 | else: 230 | rtype = 2 231 | 232 | # do GSM 233 | nifty.printcool("Main GSM Calculation") 234 | gsm.go_gsm(max_gsm_iterations, max_opt_steps, rtype=rtype) 235 | 236 | # write the results into an extended xyz file 237 | string_ase, ts_ase = gsm_to_ase_atoms(gsm) 238 | ase.io.write(f"opt_converged_{gsm.ID:03d}_ase.xyz", string_ase) 239 | ase.io.write(f'TSnode_{gsm.ID}.xyz', ts_ase) 240 | 241 | # post processing taken from the main wrapper, plots as well 242 | post_processing(gsm, have_TS=True) 243 | 244 | # cleanup 245 | cleanup_scratch(gsm.ID) 246 | 247 | 248 | def gsm_to_ase_atoms(gsm: DE_GSM): 249 | # string 250 | frames = [] 251 | for energy, geom in zip(gsm.energies, gsm.geometries): 252 | at = Atoms(symbols=[x[0] for x in geom], positions=[x[1:4] for x in geom]) 253 | at.info["energy"] = energy 254 | frames.append(at) 255 | 256 | # TS 257 | ts_geom = gsm.nodes[gsm.TSnode].geometry 258 | ts_atoms = Atoms(symbols=[x[0] for x in ts_geom], positions=[x[1:4] for x in ts_geom]) 259 | 260 | return frames, ts_atoms 261 | -------------------------------------------------------------------------------- /examples/ase_api_example.py: -------------------------------------------------------------------------------- 1 | """ 2 | API example modified for an ASE calculator. 3 | 4 | The result is not chemically meaningful, apart from showing that the code runs, because a Morse potential is used.
5 | 
6 | """
7 | import ase.io
8 | import numpy as np
9 | from ase.calculators.morse import MorsePotential
10 | 
11 | from pyGSM.coordinate_systems import DelocalizedInternalCoordinates
12 | from pyGSM.level_of_theories.ase import ASELoT
13 | from pyGSM.optimizers import eigenvector_follow
14 | from pyGSM.potential_energy_surfaces import PES
15 | from pyGSM.utilities import elements, manage_xyz, nifty
16 | from pyGSM.molecule import Molecule
17 | 
18 | 
19 | def main(geom):
20 |     nifty.printcool(" Building the LOT")
21 |     lot = ASELoT.from_options(MorsePotential(), geom=geom)
22 | 
23 |     nifty.printcool(" Building the PES")
24 |     pes = PES.from_options(
25 |         lot=lot,
26 |         ad_idx=0,
27 |         multiplicity=1,
28 |     )
29 | 
30 |     nifty.printcool("Building the topology")
31 |     atom_symbols = manage_xyz.get_atoms(geom)
32 |     ELEMENT_TABLE = elements.ElementData()
33 |     atoms = [ELEMENT_TABLE.from_symbol(atom) for atom in atom_symbols]
34 |     # top = Topology.build_topology(
35 |     #     xyz,
36 |     #     atoms,
37 |     # )
38 | 
39 |     # nifty.printcool("Building Primitive Internal Coordinates")
40 |     # p1 = PrimitiveInternalCoordinates.from_options(
41 |     #     xyz=xyz,
42 |     #     atoms=atoms,
43 |     #     addtr=False,  # Add TRIC
44 |     #     topology=top,
45 |     # )
46 | 
47 |     nifty.printcool("Building Delocalized Internal Coordinates")
48 |     coord_obj1 = DelocalizedInternalCoordinates.from_options(
49 |         xyz=manage_xyz.xyz_to_np(geom),  # derive the coordinates from geom rather than relying on the module-level xyz
50 |         atoms=atoms,
51 |         addtr=False,  # Add TRIC
52 |     )
53 | 
54 |     nifty.printcool("Building Molecule")
55 |     reactant = Molecule.from_options(
56 |         geom=geom,
57 |         PES=pes,
58 |         coord_obj=coord_obj1,
59 |         Form_Hessian=True,
60 |     )
61 | 
62 |     nifty.printcool("Creating optimizer")
63 |     optimizer = eigenvector_follow.from_options(Linesearch='backtrack', OPTTHRESH=0.0005, DMAX=0.5, abs_max_step=0.5,
64 |                                                 conv_Ediff=0.5)
65 | 
66 |     nifty.printcool("Initial energy is {:5.4f} kcal/mol".format(reactant.energy))
67 |     geoms, energies = optimizer.optimize(
68 |         molecule=reactant,
69 |         refE=reactant.energy,
70 |         opt_steps=500,
71 |         verbose=True,
72 |     )
73 | 
74 |     nifty.printcool("Final energy is {:5.4f} kcal/mol".format(reactant.energy))
75 |     manage_xyz.write_xyz('minimized.xyz', geoms[-1], energies[-1], scale=1.)
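
# Illustrative helper, a sketch added here for clarity (it is not part of the
# pyGSM API): a tidier alternative to the list-of-lists "hack" used in the
# __main__ block below, assuming only the standard ase.Atoms interface.
def ase_atoms_to_geom(atoms):
    """Convert an ase.Atoms object into pyGSM's geom format: [[symbol, x, y, z], ...]."""
    return [[sym, float(x), float(y), float(z)]
            for sym, (x, y, z) in zip(atoms.get_chemical_symbols(), atoms.get_positions())]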
76 | 77 | 78 | if __name__ == '__main__': 79 | diels_adler = ase.io.read("diels_alder.xyz", ":") 80 | xyz = diels_adler[0].positions 81 | 82 | # this is a hack 83 | geom = np.column_stack([diels_adler[0].symbols, xyz]).tolist() 84 | for i in range(len(geom)): 85 | for j in [1, 2, 3]: 86 | geom[i][j] = float(geom[i][j]) 87 | # -------------------------- 88 | 89 | main(geom) 90 | -------------------------------------------------------------------------------- /pyGSM/__init__.py: -------------------------------------------------------------------------------- 1 | """A short description of the project (less than one line).""" 2 | 3 | # Add imports here 4 | from .pyGSM import * 5 | 6 | 7 | from ._version import __version__ 8 | -------------------------------------------------------------------------------- /pyGSM/_version.py: -------------------------------------------------------------------------------- 1 | __version__ = "1.0.0+681.g3b920bb.dirty" 2 | -------------------------------------------------------------------------------- /pyGSM/coordinate_systems/COPYRIGHT.txt: -------------------------------------------------------------------------------- 1 | ---------------------------------------------------------------------------- 2 | ************THIS FOLDER CONTAINS SOURCE CODE FROM GEOMETRIC***************** 3 | ---------------------------------------------------------------------------- 4 | 5 | Copyright 2016-2020 Regents of the University of California and the Authors 6 | 7 | Authors: Lee-Ping Wang, Chenchen Song 8 | 9 | Contributors: 10 | 11 | Redistribution and use in source and binary forms, with or without modification, 12 | are permitted provided that the following conditions are met: 13 | 14 | 1. Redistributions of source code must retain the above copyright notice, 15 | this list of conditions and the following disclaimer. 16 | 17 | 2. Redistributions in binary form must reproduce the above copyright notice, 18 | this list of conditions and the following disclaimer in the documentation 19 | and/or other materials provided with the distribution. 20 | 21 | 3. Neither the name of the copyright holder nor the names of its contributors 22 | may be used to endorse or promote products derived from this software 23 | without specific prior written permission. 24 | 25 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND 26 | ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 27 | WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 28 | IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, 29 | INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT 30 | NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR 31 | PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, 32 | WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) 33 | ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 34 | POSSIBILITY OF SUCH DAMAGE. 
35 | -------------------------------------------------------------------------------- /pyGSM/coordinate_systems/__init__.py: -------------------------------------------------------------------------------- 1 | from .internal_coordinates import InternalCoordinates 2 | from .delocalized_coordinates import DelocalizedInternalCoordinates 3 | from .primitive_internals import PrimitiveInternalCoordinates 4 | from .cartesian import CartesianCoordinates 5 | from .topology import Topology,MyG 6 | from .slots import Distance,Angle,Dihedral,OutOfPlane,TranslationX,TranslationY,TranslationZ,RotationA,RotationB,RotationC 7 | -------------------------------------------------------------------------------- /pyGSM/coordinate_systems/cartesian.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from .internal_coordinates import InternalCoordinates 3 | from .primitive_internals import PrimitiveInternalCoordinates 4 | from .slots import * 5 | 6 | class CartesianCoordinates(InternalCoordinates): 7 | """ 8 | Cartesian coordinate system, written as a kind of internal coordinate class. 9 | This one does not support constraints, because that requires adding some 10 | primitive internal coordinates. 11 | """ 12 | def __init__(self, options): 13 | super(CartesianCoordinates, self).__init__(options) 14 | self.Internals = [] 15 | self.cPrims = [] 16 | self.cVals = [] 17 | self.atoms = options['atoms'] 18 | self.natoms = len(self.atoms) 19 | top_settings={'make_primitives':False} 20 | self.Prims = PrimitiveInternalCoordinates(options.copy().set_values({'extra_kwargs':top_settings})) 21 | #self.Prims = PrimitiveInternalCoordinates(options.copy()) 22 | 23 | for i in range(self.natoms): 24 | self.Prims.add(CartesianX(i, w=1.0)) 25 | self.Prims.add(CartesianY(i, w=1.0)) 26 | self.Prims.add(CartesianZ(i, w=1.0)) 27 | #if 'constraints' in kwargs and kwargs['constraints'] is not None: 28 | # raise RuntimeError('Do not use constraints with Cartesian coordinates') 29 | 30 | self.Vecs = np.eye(3*self.natoms) 31 | 32 | def guess_hessian(self, xyz): 33 | return 0.5*np.eye(len(xyz.flatten())) 34 | 35 | def calcGrad(self,xyz,gradx): 36 | return gradx 37 | 38 | def newCartesian(self,xyz,dq,verbose=True): 39 | return xyz+np.reshape(dq,(-1,3)) 40 | 41 | def calculate(self,coords): 42 | return coords 43 | 44 | #class Cartesian: 45 | # def __init__(self,q,geom,pes): 46 | # self.q = q # the value in the current basis 47 | # self.xs=[] 48 | # self.g=[] 49 | # self.fx=[] 50 | # self.xnorm=[] 51 | # self.gnorm=[] 52 | # self.step=[] 53 | # self.geom=geom 54 | # self.coords=manage_xyz.xyz_to_np(self.geom) 55 | # self.geoms=[] 56 | # self.natoms=len(geom) 57 | # self.PES = pes 58 | # 59 | # def append_data(self,x,g,fx,xnorm,gnorm,step): 60 | # self.q = x 61 | # self.xs.append(x) 62 | # self.g.append(g) 63 | # self.coords = np.reshape(x,(self.natoms,3)) 64 | # self.geom = manage_xyz.np_to_xyz(self.geom,self.coords) 65 | # self.geoms.append(self.geom) 66 | # self.fx.append(fx) 67 | # self.xnorm.append(xnorm) 68 | # self.gnorm.append(gnorm) 69 | # self.step.append(step) 70 | # return 71 | # 72 | # 73 | # def proc_evaluate(self,x): 74 | # self.coords = np.reshape(x,(self.natoms,3)) 75 | # self.geom = manage_xyz.np_to_xyz(self.geom,self.coords) 76 | # fx =self.PES.get_energy(self.geom) 77 | # g = np.ndarray.flatten(self.PES.get_gradient(self.geom)*KCAL_MOL_PER_AU) 78 | # self.PES.lot.hasRanForCurrentCoords= False 79 | # return fx,g 80 | # 81 | #if __name__ =='__main__': 82 | # from qchem 
import * 83 | # import pybel as pb 84 | # basis="sto-3g" 85 | # nproc=4 86 | # 87 | # filepath="examples/tests/bent_benzene.xyz" 88 | # lot=QChem.from_options(states=[(1,0)],charge=0,basis=basis,functional='HF',nproc=nproc) 89 | # pes = PES.from_options(lot=lot,ad_idx=0,multiplicity=1) 90 | # geom=manage_xyz.read_xyz(filepath,scale=1) 91 | # 92 | # 93 | # if False: 94 | # # cartesian optimization 95 | # param = lbfgs_parameters(cart.proc_evaluate,cart,min_step=0.0005) 96 | # x=np.ndarray.flatten(cart.coords) 97 | # fx=cart.PES.get_energy(cart.geom) 98 | # lb = lbfgs(len(x), x, fx, param,opt_steps=5) 99 | # ret = lb.do_lbfgs(opt_steps=4) 100 | # print cart.fx 101 | # manage_xyz.write_xyzs('prc.xyz',cart.geoms,scale=1.0) 102 | # #ret = lb.do_codys_lbfgs(opt_steps=4) 103 | # #print cart.fx 104 | # #manage_xyz.write_xyzs('prc2.xyz',cart.geoms,scale=1.0) 105 | # else: 106 | # # cartesian optimization 107 | # 108 | # from conjugate_gradient import conjugate_gradient 109 | # from _linesearch import backtrack,parameters 110 | # # => Cartesian constructor <= # 111 | # coords = manage_xyz.xyz_to_np(geom) 112 | # q=coords.flatten() 113 | # print "initial q" 114 | # print q 115 | # cart = Cartesian(q,geom,pes) 116 | # 117 | # #param = parameters(min_step=0.00001) 118 | # param = parameters.from_options(opt_type='UNCONSTRAINED',OPTTHRESH=1e-10) 119 | # cg = conjugate_gradient() 120 | # 121 | # cg.optimize(cart,param,3) 122 | # 123 | # print cart.fx 124 | # manage_xyz.write_xyzs('prc.xyz',cart.geoms,scale=1.0) 125 | # 126 | -------------------------------------------------------------------------------- /pyGSM/data/README.md: -------------------------------------------------------------------------------- 1 | # Sample Package Data 2 | 3 | This directory contains sample additional data you may want to include with your package. 4 | This is a place where non-code related additional information (such as data files, molecular structures, etc.) can 5 | go that you want to ship alongside your code. 6 | 7 | Please note that it is not recommended to place large files in your git directory. If your project requires files larger 8 | than a few megabytes in size it is recommended to host these files elsewhere. This is especially true for binary files 9 | as the `git` structure is unable to correctly take updates to these files and will store a complete copy of every version 10 | in your `git` history which can quickly add up. As a note most `git` hosting services like GitHub have a 1 GB per repository 11 | cap. 12 | 13 | ## Including package data 14 | 15 | Modify your package's `pyproject.toml` file. 16 | Update the [tool.setuptools.package_data](https://setuptools.pypa.io/en/latest/userguide/datafiles.html#package-data) 17 | and point it at the correct files. 18 | Paths are relative to `package_dir`. 19 | 20 | Package data can be accessed at run time with `importlib.resources` or the `importlib_resources` back port. 21 | See https://setuptools.pypa.io/en/latest/userguide/datafiles.html#accessing-data-files-at-runtime 22 | for suggestions. 23 | 24 | If modules within your package will access internal data files using 25 | [the recommended approach](https://setuptools.pypa.io/en/latest/userguide/datafiles.html#accessing-data-files-at-runtime), 26 | you may need to include `importlib_resources` in your package dependencies. 27 | In `pyproject.toml`, include the following in your `[project]` table. 
28 | ``` 29 | dependencies = [ 30 | "importlib-resources;python_version<'3.10'", 31 | ] 32 | ``` 33 | 34 | ## Manifest 35 | 36 | * `look_and_say.dat`: first entries of the "Look and Say" integer series, sequence [A005150](https://oeis.org/A005150) 37 | -------------------------------------------------------------------------------- /pyGSM/data/dftb_in.hsd: -------------------------------------------------------------------------------- 1 | Geometry = GenFormat { 2 | <<<"tmp.gen" 3 | } 4 | 5 | Driver = {} 6 | 7 | Hamiltonian = DFTB { 8 | SCC = Yes 9 | SlaterKosterFiles = Type2FileNames { 10 | Prefix = "/global/software/DFTB+/1.3.1/mio-0-1/" 11 | Separator = "-" 12 | Suffix = ".skf" 13 | } 14 | 15 | MaxAngularMomentum { 16 | C = "p" 17 | H = "s" 18 | } 19 | Filling = Fermi { 20 | Temperature [Kelvin] = 0.0 21 | } 22 | 23 | Dispersion = LennardJones { 24 | Parameters = UFFParameters {} 25 | } 26 | } 27 | 28 | Options {} 29 | 30 | Analysis = { 31 | CalculateForces = Yes 32 | } 33 | 34 | ParserOptions { 35 | ParserVersion = 5 36 | } 37 | -------------------------------------------------------------------------------- /pyGSM/data/diels_alder.xyz: -------------------------------------------------------------------------------- 1 | 16 2 | 3 | C -1.06001665 -1.51714564 0.05288674 4 | C -1.82955412 -0.59408623 -0.53968755 5 | C -2.01260392 0.79370866 -0.08977969 6 | C -1.09740592 1.54095108 0.54110413 7 | H -2.40063347 -0.88235561 -1.42321617 8 | H -0.51365172 -1.30383154 0.96828855 9 | H -0.96688964 -2.52122005 -0.35117707 10 | H -1.32088987 2.55628492 0.85667305 11 | H -0.09454533 1.17390089 0.74481476 12 | H -2.98142355 1.23828062 -0.32070817 13 | C 3.01841440 -0.33274049 0.53420511 14 | C 2.48267950 0.16990394 -0.57660955 15 | H 3.89171154 0.11254122 1.00536203 16 | H 2.60849064 -1.21591676 1.01902222 17 | H 1.60806045 -0.27640639 -1.04373753 18 | H 2.89526366 1.05096138 -1.06301386 19 | 16 20 | 21 | C 0.01856467 -1.50510245 0.07175535 22 | C -1.25472580 -0.70291222 -0.04828280 23 | C -1.28551058 0.63324180 -0.08527776 24 | C -0.04811068 1.49696591 -0.04422009 25 | H -2.18794361 -1.26282927 -0.10437525 26 | H -0.12133841 -2.31635373 0.79987325 27 | H 0.22663560 -2.00643122 -0.88734886 28 | H -0.13761965 2.30980257 -0.77849415 29 | H 0.01956981 1.99500653 0.93644690 30 | H -2.24481452 1.14593304 -0.15154508 31 | C 1.21601433 -0.63168912 0.47947978 32 | C 1.22945185 0.68482088 -0.31033881 33 | H 1.14834814 -0.40532251 1.55280900 34 | H 2.15327210 -1.18259709 0.33198960 35 | H 1.29564501 0.45837391 -1.38389788 36 | H 2.11591575 1.28013898 -0.05856919 37 | -------------------------------------------------------------------------------- /pyGSM/data/dimer.prmtop: -------------------------------------------------------------------------------- 1 | %VERSION VERSION_STAMP = V0001.000 DATE = 04/23/19 08:41:34 2 | %FLAG TITLE 3 | %FORMAT(20a4) 4 | default_name 5 | %FLAG POINTERS 6 | %FORMAT(10I8) 7 | 6 2 4 0 2 0 0 0 0 0 8 | 8 2 0 0 0 1 1 0 2 0 9 | 0 0 0 0 0 0 0 1 3 0 10 | 0 11 | %FLAG ATOM_NAME 12 | %FORMAT(20a4) 13 | O H H1 O H H1 14 | %FLAG CHARGE 15 | %FORMAT(5E16.8) 16 | -1.43045055E+01 7.14314160E+00 7.14314160E+00 -1.43045055E+01 7.14314160E+00 17 | 7.14314160E+00 18 | %FLAG ATOMIC_NUMBER 19 | %FORMAT(10I8) 20 | 8 1 1 8 1 1 21 | %FLAG MASS 22 | %FORMAT(5E16.8) 23 | 1.60000000E+01 1.00800000E+00 1.00800000E+00 1.60000000E+01 1.00800000E+00 24 | 1.00800000E+00 25 | %FLAG ATOM_TYPE_INDEX 26 | %FORMAT(10I8) 27 | 1 2 2 1 2 2 28 | %FLAG NUMBER_EXCLUDED_ATOMS 29 | %FORMAT(10I8) 30 | 2 1 1 2 1 1 31 | %FLAG 
NONBONDED_PARM_INDEX 32 | %FORMAT(10I8) 33 | 1 2 2 3 34 | %FLAG RESIDUE_LABEL 35 | %FORMAT(20a4) 36 | HOH HOH 37 | %FLAG RESIDUE_POINTER 38 | %FORMAT(10I8) 39 | 1 4 40 | %FLAG BOND_FORCE_CONSTANT 41 | %FORMAT(5E16.8) 42 | 3.69600000E+02 43 | %FLAG BOND_EQUIL_VALUE 44 | %FORMAT(5E16.8) 45 | 9.74000000E-01 46 | %FLAG ANGLE_FORCE_CONSTANT 47 | %FORMAT(5E16.8) 48 | 4.19300000E+01 49 | %FLAG ANGLE_EQUIL_VALUE 50 | %FORMAT(5E16.8) 51 | 1.82910584E+00 52 | %FLAG DIHEDRAL_FORCE_CONSTANT 53 | %FORMAT(5E16.8) 54 | 55 | %FLAG DIHEDRAL_PERIODICITY 56 | %FORMAT(5E16.8) 57 | 58 | %FLAG DIHEDRAL_PHASE 59 | %FORMAT(5E16.8) 60 | 61 | %FLAG SCEE_SCALE_FACTOR 62 | %FORMAT(5E16.8) 63 | 64 | %FLAG SCNB_SCALE_FACTOR 65 | %FORMAT(5E16.8) 66 | 67 | %FLAG SOLTY 68 | %FORMAT(5E16.8) 69 | 0.00000000E+00 0.00000000E+00 70 | %FLAG LENNARD_JONES_ACOEF 71 | %FORMAT(5E16.8) 72 | 5.81803229E+05 0.00000000E+00 0.00000000E+00 73 | %FLAG LENNARD_JONES_BCOEF 74 | %FORMAT(5E16.8) 75 | 6.99746810E+02 0.00000000E+00 0.00000000E+00 76 | %FLAG BONDS_INC_HYDROGEN 77 | %FORMAT(10I8) 78 | 0 3 1 0 6 1 9 12 1 9 79 | 15 1 80 | %FLAG BONDS_WITHOUT_HYDROGEN 81 | %FORMAT(10I8) 82 | 83 | %FLAG ANGLES_INC_HYDROGEN 84 | %FORMAT(10I8) 85 | 3 0 6 1 12 9 15 1 86 | %FLAG ANGLES_WITHOUT_HYDROGEN 87 | %FORMAT(10I8) 88 | 89 | %FLAG DIHEDRALS_INC_HYDROGEN 90 | %FORMAT(10I8) 91 | 92 | %FLAG DIHEDRALS_WITHOUT_HYDROGEN 93 | %FORMAT(10I8) 94 | 95 | %FLAG EXCLUDED_ATOMS_LIST 96 | %FORMAT(10I8) 97 | 2 3 3 0 5 6 6 0 98 | %FLAG HBOND_ACOEF 99 | %FORMAT(5E16.8) 100 | 101 | %FLAG HBOND_BCOEF 102 | %FORMAT(5E16.8) 103 | 104 | %FLAG HBCUT 105 | %FORMAT(5E16.8) 106 | 107 | %FLAG AMBER_ATOM_TYPE 108 | %FORMAT(20a4) 109 | oh ho ho oh ho ho 110 | %FLAG TREE_CHAIN_CLASSIFICATION 111 | %FORMAT(20a4) 112 | BLA BLA BLA BLA BLA BLA 113 | %FLAG JOIN_ARRAY 114 | %FORMAT(10I8) 115 | 0 0 0 0 0 0 116 | %FLAG IROTAT 117 | %FORMAT(10I8) 118 | 0 0 0 0 0 0 119 | %FLAG SOLVENT_POINTERS 120 | %FORMAT(3I8) 121 | 2 2 3 122 | %FLAG ATOMS_PER_MOLECULE 123 | %FORMAT(10I8) 124 | 3 3 125 | %FLAG BOX_DIMENSIONS 126 | %FORMAT(5E16.8) 127 | 9.00000000E+01 1.13840000E+01 1.55400000E+00 1.25900000E+00 128 | %FLAG RADIUS_SET 129 | %FORMAT(1a80) 130 | modified Bondi radii (mbondi) 131 | %FLAG RADII 132 | %FORMAT(5E16.8) 133 | 1.50000000E+00 8.00000000E-01 8.00000000E-01 1.50000000E+00 8.00000000E-01 134 | 8.00000000E-01 135 | %FLAG SCREEN 136 | %FORMAT(5E16.8) 137 | 8.50000000E-01 8.50000000E-01 8.50000000E-01 8.50000000E-01 8.50000000E-01 138 | 8.50000000E-01 139 | %FLAG IPOL 140 | %FORMAT(1I8) 141 | 0 142 | -------------------------------------------------------------------------------- /pyGSM/data/dimer_h2o.pdb: -------------------------------------------------------------------------------- 1 | ATOM 1 O HOH 1 -4.273 0.542 0.000 1.00 0.00 O 2 | ATOM 2 H HOH 1 -3.303 0.542 0.000 1.00 0.00 H 3 | ATOM 3 H1 HOH 1 -4.596 -0.364 -0.127 1.00 0.00 H 4 | TER 5 | ATOM 4 O HOH 2 0.731 0.510 0.000 1.00 0.00 O 6 | ATOM 5 H HOH 2 1.701 0.510 0.000 1.00 0.00 H 7 | ATOM 6 H1 HOH 2 0.407 -0.354 -0.299 1.00 0.00 H 8 | TER 9 | -------------------------------------------------------------------------------- /pyGSM/data/ethylene.xyz: -------------------------------------------------------------------------------- 1 | 6 2 | atoms 3 | C 0.00000000 0.66545800 -0.00000100 4 | H -0.92356200 1.23964100 0.00000300 5 | H 0.92356200 1.23964100 0.00000300 6 | C 0.00000000 -0.66545800 -0.00000100 7 | H 0.92356200 -1.23964100 0.00000300 8 | H -0.92356200 -1.23964100 0.00000300 9 | 
-------------------------------------------------------------------------------- /pyGSM/data/ethylene_molpro.com: -------------------------------------------------------------------------------- 1 | basis 6-31G* 2 | closed 6 3 | occ 9 4 | -------------------------------------------------------------------------------- /pyGSM/data/look_and_say.dat: -------------------------------------------------------------------------------- 1 | 1 2 | 11 3 | 21 4 | 1211 5 | 111221 6 | 312211 7 | 13112221 8 | 1113213211 9 | 31131211131221 10 | 13211311123113112211 11 | 11131221133112132113212221 12 | 3113112221232112111312211312113211 13 | 1321132132111213122112311311222113111221131221 14 | 11131221131211131231121113112221121321132132211331222113112211 15 | 311311222113111231131112132112311321322112111312211312111322212311322113212221 -------------------------------------------------------------------------------- /pyGSM/growing_string_methods/__init__.py: -------------------------------------------------------------------------------- 1 | from .gsm import GSM 2 | from .main_gsm import MainGSM 3 | from .de_gsm import DE_GSM 4 | from .se_gsm import SE_GSM 5 | from .se_cross import SE_Cross 6 | -------------------------------------------------------------------------------- /pyGSM/growing_string_methods/se_cross.py: -------------------------------------------------------------------------------- 1 | from __future__ import print_function 2 | import sys 3 | import os 4 | from os import path 5 | # local application imports 6 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 7 | 8 | from potential_energy_surfaces import Avg_PES, PES 9 | from .se_gsm import SE_GSM 10 | from molecule import Molecule 11 | from utilities import nifty 12 | # standard library imports 13 | 14 | 15 | class SE_Cross(SE_GSM): 16 | 17 | def go_gsm(self, max_iters=50, opt_steps=3, rtype=0): 18 | """rtype=0 MECI search 19 | rtype=1 MESX search 20 | """ 21 | assert rtype in [0, 1], "rtype not defined" 22 | if rtype == 0: 23 | nifty.printcool("Doing SE-MECI search") 24 | else: 25 | nifty.printcool("Doing SE-MESX search") 26 | 27 | self.nodes[0].gradrms = 0. 
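        # MECI: minimum-energy conical intersection between states of the same spin;
        # MESX: minimum-energy crossing point between states of different spin multiplicity.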
28 |         self.nodes[0].V0 = self.nodes[0].energy
29 |         print(' Initial energy is {:1.4f}'.format(self.nodes[0].energy))
30 |         sys.stdout.flush()
31 | 
32 |         # stash bdist for node 0
33 |         _, self.nodes[0].bdist = self.get_tangent(self.nodes[0], None, driving_coords=self.driving_coords)
34 |         print(" Initial bdist is %1.3f" % self.nodes[0].bdist)
35 | 
36 |         # interpolate first node
37 |         self.add_GSM_nodeR()
38 | 
39 |         # grow string
40 |         self.grow_string(max_iters=max_iters, max_opt_steps=opt_steps)
41 |         print(' SE_Cross growth phase over')
42 |         print(' Warning: the last node is still not fully optimized')
43 | 
44 |         if True:
45 |             path = os.path.join(os.getcwd(), 'scratch/{:03d}/{}'.format(self.ID, self.nR-1))
46 |             # doing extra constrained penalty optimization for MECI
47 |             print(" extra constrained optimization for node nR-1 = %d" % (self.nR-1))
48 |             self.optimizer[self.nR-1].conv_grms = self.options['CONV_TOL']*5  # looser threshold for the penalty passes
49 |             ictan = self.get_tangent_xyz(self.nodes[self.nR-1].xyz, self.nodes[self.nR-2].xyz, self.newic.primitive_internal_coordinates)
50 |             self.nodes[self.nR-1].PES.sigma = 1.5  # the penalty strength sigma is ramped up over three passes
51 |             self.optimizer[self.nR-1].optimize(
52 |                 molecule=self.nodes[self.nR-1],
53 |                 refE=self.nodes[0].V0,
54 |                 opt_type='ICTAN',
55 |                 opt_steps=5,
56 |                 ictan=ictan,
57 |                 path=path,
58 |             )
59 |             ictan = self.get_tangent_xyz(self.nodes[self.nR-1].xyz, self.nodes[self.nR-2].xyz, self.newic.primitive_internal_coordinates)
60 |             self.nodes[self.nR-1].PES.sigma = 2.5
61 |             self.optimizer[self.nR-1].optimize(
62 |                 molecule=self.nodes[self.nR-1],
63 |                 refE=self.nodes[0].V0,
64 |                 opt_type='ICTAN',
65 |                 opt_steps=5,
66 |                 ictan=ictan,
67 |                 path=path,
68 |             )
69 |             ictan = self.get_tangent_xyz(self.nodes[self.nR-1].xyz, self.nodes[self.nR-2].xyz, self.newic.primitive_internal_coordinates)
70 |             self.nodes[self.nR-1].PES.sigma = 3.5
71 |             self.optimizer[self.nR-1].optimize(
72 |                 molecule=self.nodes[self.nR-1],
73 |                 refE=self.nodes[0].V0,
74 |                 opt_type='ICTAN',
75 |                 opt_steps=5,
76 |                 ictan=ictan,
77 |                 path=path,
78 |             )
79 | 
80 |             self.xyz_writer('after_penalty_{:03}.xyz'.format(self.ID), self.geometries, self.energies, self.gradrmss, self.dEs)
81 |         self.optimizer[self.nR].opt_cross = True
82 |         self.nodes[0].V0 = self.nodes[0].PES.PES2.energy
83 |         if rtype == 0:
84 |             # MECI optimization
85 |             self.nodes[self.nR] = Molecule.copy_from_options(self.nodes[self.nR-1], new_node_id=self.nR)
86 |             avg_pes = Avg_PES.create_pes_from(self.nodes[self.nR].PES)
87 |             self.nodes[self.nR].PES = avg_pes
88 |             path = os.path.join(os.getcwd(), 'scratch/{:03d}/{}'.format(self.ID, self.nR))
89 |             self.optimizer[self.nR].conv_grms = self.options['CONV_TOL']
90 |             self.optimizer[self.nR].conv_gmax = 0.1  # self.options['CONV_gmax']
91 |             self.optimizer[self.nR].conv_Ediff = self.options['CONV_Ediff']
92 |             self.optimizer[self.nR].conv_dE = self.options['CONV_dE']
93 |             self.optimizer[self.nR].optimize(
94 |                 molecule=self.nodes[self.nR],
95 |                 refE=self.nodes[0].V0,
96 |                 opt_type='MECI',
97 |                 opt_steps=100,
98 |                 verbose=True,
99 |                 path=path,
100 |             )
101 |             if not self.optimizer[self.nR].converged:
102 |                 print("doing extra optimization in hopes that the MECI will converge.")
103 |                 if self.nodes[self.nR].PES.PES2.energy - self.nodes[0].V0 < 20:
104 |                     self.optimizer[self.nR].optimize(
105 |                         molecule=self.nodes[self.nR],
106 |                         refE=self.nodes[0].V0,
107 |                         opt_type='MECI',
108 |                         opt_steps=100,
109 |                         verbose=True,
110 |                         path=path,
111 |                     )
112 |         else:
113 |             # unconstrained penalty optimization
114 |             # TODO make unconstrained "CROSSING" which checks for dE convergence
115 |             self.nodes[self.nR]
= Molecule.copy_from_options(self.nodes[self.nR-1], new_node_id=self.nR) 116 | self.nodes[self.nR].PES.sigma = 10.0 117 | print(" sigma for node %d is %.3f" % (self.nR, self.nodes[self.nR].PES.sigma)) 118 | path = os.path.join(os.getcwd(), 'scratch/{:03d}/{}'.format(self.ID, self.nR)) 119 | self.optimizer[self.nR].opt_cross = True 120 | self.optimizer[self.nR].conv_grms = self.options['CONV_TOL'] 121 | # self.optimizer[self.nR].conv_gmax = self.options['CONV_gmax'] 122 | self.optimizer[self.nR].conv_Ediff = self.options['CONV_Ediff'] 123 | self.optimizer[self.nR].conv_dE = self.options['CONV_dE'] 124 | self.optimizer[self.nR].optimize( 125 | molecule=self.nodes[self.nR], 126 | refE=self.nodes[0].V0, 127 | opt_type='UNCONSTRAINED', 128 | opt_steps=200, 129 | verbose=True, 130 | path=path, 131 | ) 132 | self.xyz_writer('grown_string_{:03}.xyz'.format(self.ID), self.geometries, self.energies, self.gradrmss, self.dEs) 133 | 134 | if self.optimizer[self.nR].converged: 135 | self.nnodes = self.nR+1 136 | self.nodes = self.nodes[:self.nnodes] 137 | print("Setting all interior nodes to active") 138 | for n in range(1, self.nnodes-1): 139 | self.active[n] = True 140 | self.active[self.nnodes-1] = False 141 | self.active[0] = False 142 | 143 | # Convert all the PES to excited-states 144 | for n in range(self.nnodes): 145 | self.nodes[n].PES = PES.create_pes_from(self.nodes[n].PES.PES2, 146 | options={'gradient_states': [(1, 1)]}) 147 | 148 | print(" initial ic_reparam") 149 | self.reparameterize(ic_reparam_steps=25) 150 | print(" V_profile (after reparam): ", end=' ') 151 | energies = self.energies 152 | for n in range(self.nnodes): 153 | print(" {:7.3f}".format(float(energies[n])), end=' ') 154 | print() 155 | self.xyz_writer('grown_string1_{:03}.xyz'.format(self.ID), self.geometries, self.energies, self.gradrmss, self.dEs) 156 | 157 | deltaE = energies[-1] - energies[0] 158 | if deltaE > 20: 159 | print(" MECI energy is too high %5.4f. 
Don't try to optimize pathway" % deltaE) 160 | print("Exiting early") 161 | self.end_early = True 162 | else: 163 | print(" deltaE s1-minimum and MECI %5.4f" % deltaE) 164 | try: 165 | self.optimize_string(max_iter=max_iters, opt_steps=3, rtype=1) 166 | except Exception as error: 167 | if str(error) == "Ran out of iterations": 168 | print(error) 169 | self.end_early = True 170 | else: 171 | print(error) 172 | self.end_early = True 173 | else: 174 | print("Exiting early") 175 | self.end_early = True 176 | 177 | def check_if_grown(self): 178 | isDone = False 179 | # epsilon = 1.5 180 | pes1dE = self.nodes[self.nR-1].PES.dE 181 | pes2dE = self.nodes[self.nR-2].PES.dE 182 | condition1 = (abs(self.nodes[self.nR-1].bdist) <= (1-self.BDIST_RATIO)*abs(self.nodes[0].bdist) and (abs(pes1dE) > abs(pes2dE))) 183 | # condition2 = ((self.nodes[self.nR-1].bdist+0.1 > self.nodes[self.nR-2].bdist) and (1-self.BDIST_RATIO)*abs(self.nodes[0].bdist)) 184 | if condition1: 185 | print(" Condition 1 satisfied") 186 | print(" bdist current %1.3f" % abs(self.nodes[self.nR-1].bdist)) 187 | print(" bdist target %1.3f" % (abs(self.nodes[0].bdist)*(1-self.BDIST_RATIO))) 188 | print(" Growth-phase over") 189 | isDone = True 190 | # elif condition2: 191 | # print(" Condition 2 satisfied") 192 | # print(" Growth-phase over") 193 | # isDone = True 194 | return isDone 195 | 196 | def restart_string(self, xyzfile='restart.xyz'): 197 | super(SE_Cross, self).restart_string(xyzfile) 198 | self.done_growing = False 199 | self.nnodes = 20 200 | self.nR -= 1 201 | # stash bdist for node 0 202 | _, self.nodes[0].bdist = self.get_tangent(self.nodes[0], None, driving_coords=self.driving_coords) 203 | 204 | def set_frontier_convergence(self, nR): 205 | self.optimizer[nR].conv_grms = self.options['ADD_NODE_TOL'] 206 | self.optimizer[nR].conv_gmax = 100. # self.options['ADD_NODE_TOL'] # could use some multiplier times CONV_GMAX... 207 | self.optimizer[nR].conv_Ediff = 1000. # 2.5 208 | print(" conv_tol of node %d is %.4f" % (nR, self.optimizer[nR].conv_grms)) 209 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/__init__.py: -------------------------------------------------------------------------------- 1 | #from pytc import PyTC 2 | #from qchem import QChem 3 | #from molpro import Molpro 4 | #from dftb import DFTB 5 | #from orca import Orca 6 | #from openmm import OpenMM 7 | #from .pdynamo import pDynamo 8 | from .nanoreactor_engine import nanoreactor_engine 9 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/ase.py: -------------------------------------------------------------------------------- 1 | """ 2 | Level Of Theory for ASE calculators 3 | https://gitlab.com/ase/ase 4 | 5 | Written by Tamas K. 
Stenczel in 2021 6 | """ 7 | import importlib 8 | 9 | try: 10 | from ase import Atoms 11 | from ase.calculators.calculator import Calculator 12 | from ase.data import atomic_numbers 13 | from ase import units 14 | except ModuleNotFoundError: 15 | Atoms = None 16 | Calculator = None 17 | atomic_numbers = None 18 | units = None 19 | print("ASE not installed, ASE-based calculators will not work") 20 | 21 | from .base_lot import Lot, LoTError 22 | 23 | 24 | class ASELoT(Lot): 25 | """ 26 | Warning: 27 | multiplicity is not implemented, the calculator ignores it 28 | """ 29 | 30 | def __init__(self, calculator: Calculator, options): 31 | super(ASELoT, self).__init__(options) 32 | 33 | self.ase_calculator = calculator 34 | 35 | @classmethod 36 | def from_options(cls, calculator: Calculator, **kwargs): 37 | """ Returns an instance of this class with default options updated from values in kwargs""" 38 | return cls(calculator, cls.default_options().set_values(kwargs)) 39 | 40 | @classmethod 41 | def copy(cls, lot, options, copy_wavefunction=True): 42 | assert isinstance(lot, ASELoT) 43 | return cls(lot.ase_calculator, lot.options.copy().set_values(options)) 44 | 45 | @classmethod 46 | def from_calculator_string(cls, calculator_import: str, calculator_kwargs: dict = dict(), **kwargs): 47 | # this imports the calculator 48 | module_name = ".".join(calculator_import.split(".")[:-1]) 49 | class_name = calculator_import.split(".")[-1] 50 | 51 | # import the module of the calculator 52 | try: 53 | module = importlib.import_module(module_name) 54 | except ModuleNotFoundError: 55 | raise LoTError( 56 | "ASE-calculator's module is not found: {}".format(class_name)) 57 | 58 | # class of the calculator 59 | if hasattr(module, class_name): 60 | calc_class = getattr(module, class_name) 61 | assert issubclass(calc_class, Calculator) 62 | else: 63 | raise LoTError( 64 | "ASE-calculator's class ({}) not found in module {}".format(class_name, module_name)) 65 | 66 | # make sure there is no calculator in the options 67 | _ = kwargs.pop("calculator", None) 68 | 69 | # construct from the constructor 70 | return cls.from_options(calc_class(**calculator_kwargs), **kwargs) 71 | 72 | def run(self, geom, mult, ad_idx, runtype='gradient'): 73 | # run ASE 74 | self.run_ase_atoms(xyz_to_ase(geom), mult, ad_idx, runtype) 75 | 76 | def run_ase_atoms(self, atoms: Atoms, mult, ad_idx, runtype='gradient'): 77 | # set the calculator 78 | atoms.set_calculator(self.ase_calculator) 79 | 80 | # perform gradient calculation if needed 81 | if runtype == "gradient": 82 | self._Gradients[(mult, ad_idx)] = self.Gradient(- atoms.get_forces() / units.Ha * units.Bohr, 83 | 'Hartree/Bohr') 84 | elif runtype == "energy": 85 | pass 86 | else: 87 | raise NotImplementedError( 88 | f"Run type {runtype} is not implemented in the ASE calculator interface") 89 | 90 | # energy is always calculated -> cached if force calculation was done 91 | self._Energies[(mult, ad_idx)] = self.Energy( 92 | atoms.get_potential_energy() / units.Ha, 'Hartree') 93 | 94 | # write E to scratch 95 | self.write_E_to_file() 96 | 97 | self.hasRanForCurrentCoords = True 98 | 99 | 100 | def xyz_to_ase(xyz): 101 | """ 102 | 103 | Parameters 104 | ---------- 105 | xyz : np.ndarray, shape=(N, 4) 106 | 107 | 108 | Returns 109 | ------- 110 | atoms : ase.Atoms 111 | ASE's Atoms object 112 | 113 | """ 114 | 115 | # compatible with list-of-list as well 116 | numbers = [atomic_numbers[x[0]] for x in xyz] 117 | pos = [x[1:4] for x in xyz] 118 | return geom_to_ase(numbers, pos) 119 | 120 | 
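# Usage sketch (commented out so nothing runs on import; MorsePotential is an
# arbitrary example here, any ase Calculator subclass works the same way):
#
#     from ase.calculators.morse import MorsePotential
#     geom = [["H", 0.0, 0.0, 0.0], ["H", 0.0, 0.0, 0.75]]
#     lot = ASELoT.from_options(MorsePotential(), geom=geom)
#     lot.run_ase_atoms(xyz_to_ase(geom), mult=1, ad_idx=0, runtype="gradient")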
121 | def geom_to_ase(numbers, positions, **kwargs): 122 | """Geometry to ASE atoms object 123 | 124 | Parameters 125 | ---------- 126 | numbers : array_like, shape=(N_atoms,) 127 | atomic numbers 128 | positions : array_like, shape=(N_atoms,3) 129 | positions of atoms in Angstroms 130 | kwargs 131 | 132 | Returns 133 | ------- 134 | atoms : ase.Atoms 135 | ASE's Atoms object 136 | """ 137 | 138 | return Atoms(numbers=numbers, positions=positions, **kwargs) 139 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/dftb.py: -------------------------------------------------------------------------------- 1 | # standard library imports 2 | # local application imports 3 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 4 | from utilities import manage_xyz 5 | import sys 6 | import os 7 | from os import path 8 | import subprocess 9 | import re 10 | 11 | # third party 12 | import numpy as np 13 | 14 | try: 15 | from .base_lot import Lot 16 | except: 17 | from base_lot import Lot 18 | 19 | 20 | class DFTB(Lot): 21 | 22 | def __init__(self, options): 23 | super(DFTB, self).__init__(options) 24 | os.system('rm -f dftb_jobs.txt') 25 | print(" making folder scratch/{}".format(self.node_id)) 26 | os.system('mkdir -p scratch/{}'.format(self.node_id)) 27 | os.system('cp {} scratch/{}'.format(self.lot_inp_file, self.node_id)) 28 | 29 | def run(self, geom): 30 | owd = os.getcwd() 31 | manage_xyz.write_xyz('scratch/{}/tmp.xyz'.format(self.node_id), geom, scale=1.0) 32 | os.system('./xyz2gen scratch/{}/tmp.xyz'.format(self.node_id)) 33 | os.chdir('scratch/{}'.format(self.node_id)) 34 | os.system('pwd') 35 | cmd = "dftb+" 36 | proc = subprocess.Popen(cmd, 37 | stdout=subprocess.PIPE, 38 | stderr=subprocess.PIPE, 39 | ) 40 | stdout, stderr = proc.communicate() 41 | # with open('dftb_jobs.txt','a') as out: 42 | # out.write(stdout) 43 | # out.write(stderr) 44 | 45 | ofilepath = "detailed.out" 46 | with open(ofilepath, 'r') as ofile: 47 | olines = ofile.readlines() 48 | 49 | self.E = [] 50 | temp = 0 51 | tmpgrada = [] 52 | tmpgrad = [] 53 | pattern = re.compile(r"Total energy: [-+]?[0-9]*\.?[0-9]+ H") 54 | for line in olines: 55 | for match in re.finditer(pattern, line): 56 | tmpline = line.split() 57 | self.E.append((1, 0, float(tmpline[2]))) 58 | if line == " Total Forces\n": 59 | temp += 1 60 | elif temp > 0: 61 | tmpline = line.split() 62 | tmpgrad.append([float(i) for i in tmpline]) 63 | temp += 1 64 | if temp > len(self.atoms): 65 | break 66 | tmpgrada.append(tmpgrad) 67 | 68 | self.grada = [] 69 | for count, i in enumerate(self.states): 70 | if i[0] == 1: 71 | self.grada.append((1, i[1], tmpgrada[count])) 72 | if i[0] == 3: 73 | self.grada.append((3, i[1], tmpgrada[count])) 74 | self.hasRanForCurrentCoords = True 75 | 76 | os.chdir(owd) 77 | return 78 | 79 | if __name__ == '__main__': 80 | filepath = "../../data/ethylene.xyz" 81 | dftb = DFTB.from_options(states=[(1, 0)], fnm=filepath, lot_inp_file='../../data/dftb_in.hsd') 82 | geom = manage_xyz.read_xyz(filepath) 83 | xyz = manage_xyz.xyz_to_np(geom) 84 | print(dftb.get_energy(xyz, 1, 0)) 85 | print(dftb.get_gradient(xyz, 1, 0)) 86 | 87 | xyz = xyz + np.random.rand(xyz.shape[0], xyz.shape[1])*0.1 88 | print(dftb.get_energy(xyz, 1, 0)) 89 | print(dftb.get_gradient(xyz, 1, 0)) 90 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/file_options.py: -------------------------------------------------------------------------------- 
1 | from collections import OrderedDict 2 | from re import sub 3 | from ast import literal_eval as leval 4 | from copy import deepcopy 5 | 6 | 7 | class File_Options(object): 8 | """ Class file_options allows parsing of an input file 9 | """ 10 | 11 | def __init__(self, input_file=None): 12 | self.Documentation = OrderedDict() 13 | self.UserOptions = OrderedDict() 14 | self.ActiveOptions = OrderedDict() 15 | self.ForcedOptions = OrderedDict() 16 | self.ForcedWarnings = OrderedDict() 17 | self.InactiveOptions = OrderedDict() 18 | self.InactiveWarnings = OrderedDict() 19 | 20 | # still need to read the file to build the dictionary 21 | if input_file is not None: 22 | for line in open(input_file).readlines(): 23 | line = sub('#.*$', '', line.strip()) 24 | s = line.split() 25 | if len(s) > 0: 26 | # Options are case insensitive 27 | key = s[0].lower() 28 | try: 29 | val = leval(line.replace(s[0], '', 1).strip()) 30 | except: 31 | val = str(line.replace(s[0], '', 1).strip()) 32 | self.UserOptions[key] = val 33 | 34 | @staticmethod 35 | def copy(file_options): 36 | new = File_Options() 37 | new.Documentation = deepcopy(file_options.Documentation) 38 | new.UserOptions = deepcopy(file_options.UserOptions) 39 | new.ActiveOptions = deepcopy(file_options.ActiveOptions) 40 | new.ForcedOptions = deepcopy(file_options.ForcedOptions) 41 | new.InactiveOptions = deepcopy(file_options.InactiveOptions) 42 | new.InactiveWarnings = deepcopy(file_options.InactiveWarnings) 43 | return new 44 | 45 | def set_active(self, key, default, typ, doc, allowed=None, depend=True, clash=False, msg=None): 46 | """ Set one option. The arguments are: 47 | key : The name of the option. 48 | default : The default value. 49 | typ : The type of the value. 50 | doc : The documentation string. 51 | allowed : An optional list of allowed values. 52 | depend : A condition that must be True for the option to be activated. 53 | clash : A condition that must be False for the option to be activated. 54 | msg : A warning that is printed out if the option is not activated. 55 | """ 56 | doc = sub("\.$", "", doc.strip())+"." 
57 | self.Documentation[key] = "%-8s " % ("(" + sub("'>", "", sub(" 0: 131 | if TopBar: 132 | out.append("#===========================================#") 133 | else: 134 | TopBar = True 135 | out.append("#| User-supplied options: |#") 136 | out.append("#===========================================#") 137 | out += UserSupplied 138 | Forced = [] 139 | for key in self.ActiveOptions: 140 | if key in self.ForcedOptions: 141 | Forced.append("%-22s %20s # %s" % (key, str(self.ActiveOptions[key]), self.Documentation[key])) 142 | Forced.append("%-22s %20s # Reason : %s" % ("", "", self.ForcedWarnings[key])) 143 | if len(Forced) > 0: 144 | if TopBar: 145 | out.append("#===========================================#") 146 | else: 147 | TopBar = True 148 | out.append("#| Options enforced by the script: |#") 149 | out.append("#===========================================#") 150 | out += Forced 151 | ActiveDefault = [] 152 | for key in self.ActiveOptions: 153 | if key not in self.UserOptions and key not in self.ForcedOptions: 154 | ActiveDefault.append("%-22s %20s # %s" % (key, str(self.ActiveOptions[key]), self.Documentation[key])) 155 | if len(ActiveDefault) > 0: 156 | if TopBar: 157 | out.append("#===========================================#") 158 | else: 159 | TopBar = True 160 | out.append("#| Active options at default values: |#") 161 | out.append("#===========================================#") 162 | out += ActiveDefault 163 | # out.append("") 164 | out.append("#===========================================#") 165 | out.append("#| End of Input File |#") 166 | out.append("#===========================================#") 167 | Deactivated = [] 168 | for key in self.InactiveOptions: 169 | Deactivated.append("%-22s %20s # %s" % (key, str(self.InactiveOptions[key]), self.Documentation[key])) 170 | Deactivated.append("%-22s %20s # Reason : %s" % ("", "", self.InactiveWarnings[key])) 171 | if len(Deactivated) > 0: 172 | out.append("") 173 | out.append("#===========================================#") 174 | out.append("#| Deactivated or conflicting options: |#") 175 | out.append("#===========================================#") 176 | out += Deactivated 177 | Unrecognized = [] 178 | for key in self.UserOptions: 179 | if key not in self.ActiveOptions and key not in self.InactiveOptions: 180 | Unrecognized.append("%-22s %20s" % (key, self.UserOptions[key])) 181 | if len(Unrecognized) > 0: 182 | # out.append("") 183 | out.append("#===========================================#") 184 | out.append("#| Unrecognized options: |#") 185 | out.append("#===========================================#") 186 | out += Unrecognized 187 | return out 188 | 189 | 190 | if __name__ == '__main__': 191 | 192 | fo = File_Options('tmp') 193 | fo.set_active('crystal', 'not-stupid', str, 'is crystal stupid') 194 | 195 | fo2 = File_Options.copy(fo) 196 | 197 | print(id(fo)) 198 | print(id(fo2)) 199 | 200 | print(fo) 201 | for line in fo.record(): 202 | print(line) 203 | 204 | class tmp2(object): 205 | def __init__(): 206 | return 207 | 208 | for key in fo.ActiveOptions: 209 | setattr(tmp2, key, fo.ActiveOptions[key]) 210 | print(tmp2.crystal) 211 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/molpro.py: -------------------------------------------------------------------------------- 1 | # standard library imports 2 | import sys 3 | import os 4 | from os import path 5 | import re 6 | import subprocess 7 | 8 | # third party 9 | import numpy as np 10 | 11 | # local application imports 12 
| sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 13 | 14 | from utilities import manage_xyz 15 | try: 16 | from .base_lot import Lot 17 | except: 18 | from base_lot import Lot 19 | 20 | 21 | class Molpro(Lot): 22 | 23 | def __init__(self, options): 24 | super(Molpro, self).__init__(options) 25 | 26 | self.file_options.set_active('basis', '6-31g', str, '') 27 | self.file_options.set_active('closed', None, int, '') 28 | self.file_options.set_active('occ', None, int, '') 29 | self.file_options.set_active('n_electrons', None, int, '') 30 | self.file_options.set_active('memory', 800, int, '') 31 | self.file_options.set_active('n_states', 2, int, '') 32 | 33 | # set all active values to self for easy access 34 | for key in self.file_options.ActiveOptions: 35 | setattr(self, key, self.file_options.ActiveOptions[key]) 36 | 37 | if self.has_nelectrons is False: 38 | for i in self.states: 39 | self.get_nelec(self.geom, i[0]) 40 | 41 | def write_input_file(self, tempfilename): 42 | # TODO gopro needs a number 43 | tempfile = open(tempfilename, 'w') 44 | tempfile.write(' file,2,mp_{:04d}_{:04d}\n'.format(self.ID, self.node_id)) 45 | tempfile.write(' memory,{},m\n'.format(self.memory)) 46 | tempfile.write(' symmetry,nosym\n') 47 | tempfile.write(' orient,noorient\n\n') 48 | tempfile.write(' geometry={\n') 49 | for coord in geom: 50 | for i in coord: 51 | tempfile.write(' '+str(i)) 52 | tempfile.write('\n') 53 | tempfile.write('}\n\n') 54 | 55 | # get states 56 | singlets = self.search_tuple(self.states, 1) 57 | len_singlets = len(singlets) 58 | # len_E_singlets = singlets[-1][1] + 1 # final adiabat +1 because 0 indexed 59 | triplets = self.search_tuple(self.states, 3) 60 | len_triplets = len(triplets) 61 | 62 | tempfile.write(' basis={}\n\n'.format(self.basis)) 63 | 64 | # do singlets 65 | if len_singlets != 0: 66 | tempfile.write(' {multi\n') 67 | tempfile.write(' direct\n') 68 | tempfile.write(' closed,{}\n'.format(self.closed)) 69 | tempfile.write(' occ,{}\n'.format(self.occ)) 70 | tempfile.write(' wf,{},1,0\n'.format(self.n_electrons)) 71 | # this can be made the len of singlets 72 | tempfile.write(' state,{}\n'.format(self.n_states)) 73 | 74 | for state in self.gradient_states: 75 | s = state[1] 76 | print("running grad state ", s) 77 | grad_name = "510"+str(s)+".1" 78 | tempfile.write(' CPMCSCF,GRAD,{}.1,record={}\n'.format(s+1, grad_name)) 79 | 80 | # TODO this can only do coupling if states is 2, want to generalize to 3 states 81 | if self.coupling_states: 82 | tempfile.write('CPMCSCF,NACM,{}.1,{}.1,record=5200.1\n'.format(self.coupling_states[0], self.coupling_states[1])) 83 | tempfile.write(' }\n') 84 | 85 | for state in self.gradient_states: 86 | s = state[1] 87 | grad_name = "510"+str(s)+".1" 88 | tempfile.write('Force;SAMC,{};varsav\n'.format(grad_name)) 89 | if self.coupling_states: 90 | tempfile.write('Force;SAMC,5200.1;varsav\n') 91 | 92 | if len_triplets != 0: 93 | tempfile.write(' {multi\n') 94 | nclosed = self.nocc 95 | nocc = nclosed+self.nactive 96 | tempfile.write(' closed,{}\n'.format(nclosed)) 97 | tempfile.write(' occ,{}\n'.format(nocc)) 98 | tempfile.write(' wf,{},1,2\n'.format(self.n_electrons)) 99 | # nstates = len(self.states) 100 | tempfile.write(' state,{}\n'.format(len_triplets)) 101 | 102 | for state in triplets: 103 | s = state[1] 104 | grad_name = "511"+str(s)+".1" 105 | tempfile.write(' CPMCSCF,GRAD,{}.1,record={}\n'.format(s+1, grad_name)) 106 | tempfile.write(' }\n') 107 | 108 | for s in triplets: 109 | s = state[1] 110 | grad_name = 
"511"+str(s)+".1" 111 | tempfile.write('Force;SAMC,{};varsav\n'.format(grad_name)) 112 | 113 | tempfile.close() 114 | 115 | # designed to do multiple multiplicities at once... maybe not such a good idea, that feature is currently broken 116 | # TODO 117 | def runall(self, geom, runtype=None): 118 | 119 | self.Gradients = {} 120 | self.Energies = {} 121 | self.Couplings = {} 122 | tempfilename = 'scratch/{}/gopro.com'.format(self.node_id) 123 | try: 124 | scratch = os.environ['SLURM_LOCAL_SCRATCH'] 125 | args = ['-W', 'scratch', '-n', str(self.nproc), tempfilename, '-d', scratch] 126 | except: 127 | args = [os.environ['MOLPRO_OPTIONS'], '-n', str(self.nproc), tempfilename] 128 | 129 | print(args) 130 | 131 | # WRite input file 132 | self.write_input_file(tempfilename) 133 | 134 | # Run Process 135 | command = ['molpro'] 136 | command.extend(args) 137 | output = subprocess.Popen(command, stdout=open('scratch/{}/gopro.out'.format(self.node_id), 'w'), stderr=subprocess.PIPE).communicate() 138 | print(output[0]) 139 | 140 | # Parse 141 | self.parse() 142 | self.write_E_to_file() 143 | 144 | self.hasRanForCurrentCoords = True 145 | return 146 | 147 | def parse(self): 148 | # Now read the output 149 | tempfileout = 'scratch/{}/gopro.out'.format(self.node_id) 150 | pattern = re.compile(r'MCSCF STATE \d.1 Energy \s* ([-+]?[0-9]*\.?[0-9]+)') 151 | tmp = [] 152 | for i, line in enumerate(open(tempfileout)): 153 | for match in re.finditer(pattern, line): 154 | tmp.append(float(match.group(1))) 155 | 156 | for state, E in zip(self.states, tmp): 157 | self._Energies[state] = self.Energy(E, 'Hartree') 158 | 159 | tmpgrada = [] 160 | tmpgrad = [] 161 | tmpcoup = [] 162 | with open(tempfileout, "r") as f: 163 | for line in f: 164 | if line.startswith("GRADIENT FOR STATE", 7): # will work for SA-MC and RSPT2 HF 165 | for i in range(3): 166 | next(f) 167 | for i in range(len(geom)): 168 | findline = next(f, '').strip() 169 | mobj = re.match(r'^\s*(\S+)\s+(\S+)\s+(\S+)\s+(\S+)\s*$', findline) 170 | tmpgrad.append([ 171 | float(mobj.group(2)), 172 | float(mobj.group(3)), 173 | float(mobj.group(4)), 174 | ]) 175 | tmpgrada.append(tmpgrad) 176 | tmpgrad = [] 177 | if line.startswith(" SA-MC NACME FOR STATES"): 178 | for i in range(3): 179 | next(f) 180 | for i in range(len(geom)): 181 | findline = next(f, '').strip() 182 | mobj = re.match(r'^\s*(\S+)\s+(\S+)\s+(\S+)\s+(\S+)\s*$', findline) 183 | tmpcoup.append([ 184 | float(mobj.group(2)), 185 | float(mobj.group(3)), 186 | float(mobj.group(4)), 187 | ]) 188 | self.Couplings[self.coupling_states] = self.Coupling(np.asarray(tmpcoup), "Hartree/Bohr") 189 | for state, grad in zip(self.states, tmpgrada): 190 | self._Gradients[state] = self.Gradient(np.asarray(grad), "Hartree/Bohr") 191 | 192 | @classmethod 193 | def copy(cls, lot, options, copy_wavefunction=True): 194 | """ create a copy of this lot object""" 195 | # print(" creating copy, new node id =",node_id) 196 | # print(" old node id = ",self.node_id) 197 | node_id = options.get('node_id', 1) 198 | if node_id != lot.node_id and copy_wavefunction: 199 | cmd = "cp scratch/mp_{:04d}_{:04d} scratch/mp_{:04d}_{:04d}".format(lot.ID, lot.node_id, lot.ID, node_id) 200 | print(cmd) 201 | os.system(cmd) 202 | return cls(lot.options.copy().set_values(options)) 203 | 204 | 205 | if __name__ == '__main__': 206 | filepath = "../../data/ethylene.xyz" 207 | molpro = Molpro.from_options(states=[(1, 0), (1, 1)], fnm=filepath, lot_inp_file='../../data/ethylene_molpro.com', coupling_states=(0, 1)) 208 | geom = 
manage_xyz.read_xyz(filepath) 209 | xyz = manage_xyz.xyz_to_np(geom) 210 | print(molpro.get_energy(xyz, 1, 0)) 211 | print(molpro.get_gradient(xyz, 1, 0)) 212 | print(molpro.get_coupling(xyz, 1, 0, 1)) 213 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/nanoreactor_engine.py: -------------------------------------------------------------------------------- 1 | 2 | # standard library imports 3 | import sys 4 | import os 5 | from os import path 6 | import re 7 | from collections import namedtuple 8 | import copy as cp 9 | # third party 10 | import numpy as np 11 | 12 | # local application imports 13 | sys.path.append(path.dirname( path.dirname( path.abspath(__file__)))) 14 | 15 | try: 16 | from .base_lot import Lot 17 | from .file_options import File_Options 18 | except: 19 | from base_lot import Lot 20 | from file_options import File_Options 21 | from utilities import * 22 | 23 | ''' 24 | ''' 25 | 26 | class nanoreactor_engine(Lot): 27 | 28 | def __init__(self,options): 29 | super(nanoreactor_engine,self).__init__(options) 30 | # can we do a check here? 31 | self.engine=options['job_data']['engine'] 32 | self.nscffail = 0 33 | self.save_orbital = True 34 | if type(self.options['job_data']['orbfile']) != dict: 35 | self.options['job_data']['orbfile'] = {} 36 | if 'all_geoms' not in self.options['job_data']: 37 | self.options['job_data']['all_geoms'] = {} 38 | @classmethod 39 | def rmsd(cls, geom1, geom2): 40 | total = 0 41 | 42 | flat_geom1 = np.array(geom1).flatten() 43 | flat_geom2 = np.array(geom2).flatten() 44 | for i in range(len(flat_geom1)): 45 | total += (flat_geom1[i] - flat_geom2[i]) ** 2 46 | 47 | return total 48 | 49 | @classmethod 50 | def copy(cls, lot, options={}, copy_wavefunction=True): 51 | options['job_data'] = lot.options['job_data'] 52 | if copy_wavefunction: 53 | options['job_data']['orbfile'].update({'copied_orb': lot.node_id}) 54 | return cls(lot.options.copy().set_values(options)) 55 | 56 | def run(self,geom,mult,ad_idx,runtype='gradient'): 57 | self.Gradients={} 58 | self.Energies = {} 59 | xyz = manage_xyz.xyz_to_np(geom)*units.ANGSTROM_TO_AU 60 | 61 | if self.engine.options['closed_shell']: 62 | fields = ('energy', 'gradient', 'orbfile') 63 | else: 64 | fields = ('energy', 'gradient', 'orbfile_a', 'orbfile_b') 65 | #print(self.options['job_data']['orbfile']['propagate']) 66 | if self.hasRanForCurrentCoords == False and self.node_id in self.options['job_data']['orbfile'].keys(): 67 | print("Recalculating energy") 68 | del self.options['job_data']['orbfile'][self.node_id] 69 | if self.node_id not in self.options['job_data']['orbfile'].keys(): 70 | if 'copied_orb' in self.options['job_data']['orbfile'].keys(): 71 | if self.options['job_data']['orbfile']['copied_orb'] in self.options['job_data']['orbfile'].keys(): 72 | orb_guess = self.options['job_data']['orbfile'][self.options['job_data']['orbfile']['copied_orb']] 73 | del self.options['job_data']['orbfile']['copied_orb'] 74 | else: 75 | orb_guess = None 76 | del self.options['job_data']['orbfile']['copied_orb'] 77 | else: 78 | orb_guess = None 79 | if (self.node_id - 1) in self.options['job_data']['orbfile'].keys() and orb_guess == None: 80 | orb_guess = self.options['job_data']['orbfile'][self.node_id - 1] 81 | elif (self.node_id + 1) in self.options['job_data']['orbfile'].keys() and orb_guess == None: 82 | orb_guess = self.options['job_data']['orbfile'][self.node_id + 1] 83 | else: 84 | orb_guess = self.options['job_data']['orbfile'][self.node_id] 85 | 
try: 86 | if orb_guess: 87 | results = self.engine.compute_blocking(xyz, fields, job_type = 'gradient', guess = orb_guess) #compute__(fields = "energy, gradient, orbfiles") 88 | #if we're not using a previous orbital as a guess, we want to ensure we find the correct SCF minimum if using guess = generate 89 | else: 90 | print("No orbital guess available") 91 | old_options = cp.deepcopy(self.engine.options) 92 | #does this work with non-unrestricted methods? #TODO 93 | if 'fon' in self.engine.options.keys(): 94 | if self.engine.options['fon'] == 'yes': 95 | self.engine.options['fon_coldstart'] = 'no' 96 | self.engine.options['fon_converger'] = 'no' 97 | self.engine.options['fon_tests'] = 2 98 | results = self.engine.compute_blocking(xyz, fields, job_type = 'gradient', guess = 'generate') #compute__(fields = "energy, gradient, orbfiles") 99 | self.engine.options = old_options 100 | except: 101 | # The calculation failed 102 | # set energy to a large number so the optimizer attempts to slow down 103 | print(" SCF FAILURE") 104 | self.nscffail+=1 105 | energy,gradient = 999, 0 106 | if self.nscffail>25: 107 | raise RuntimeError 108 | #unpacking results and updating orb dictionaries 109 | energy = results[0] 110 | gradient = results[1] 111 | self._Energies[(mult,ad_idx)] = self.Energy(energy,'Hartree') 112 | self._Gradients[(mult,ad_idx)] = self.Gradient(gradient,'Hartree/Bohr') 113 | if self.engine.options['closed_shell']: 114 | orb_path = results[2] 115 | self.options['job_data']['orbfile'].update({self.node_id: orb_path}) 116 | else: 117 | orb_a_path = results[2] 118 | orb_b_path = results[3] 119 | self.options['job_data']['orbfile'].update({self.node_id: orb_a_path + ' ' + orb_b_path}) 120 | # Store the values in memory 121 | 122 | 123 | if __name__=="__main__": 124 | from nanoreactor.engine import get_engine 125 | from nanoreactor.parsing import load_settings 126 | 127 | # read settings from name 128 | db, setting_name, settings = load_settings('refine.yaml', host='fire-05-30') 129 | 130 | # Create the nanoreactor TCPB engine 131 | engine_type=settings['engine'].pop('type') 132 | engine = get_engine(r.mol, engine_type=engine_type, **settings['engine']) 133 | 134 | 135 | # read in a geometry 136 | geom = manage_xyz.read_xyz('../../data/ethylene.xyz') 137 | xyz = manage_xyz.xyz_to_np(geom) 138 | 139 | # create the pyGSM level of theory object 140 | test_lot = nanoreactor_engine(geom,job_data = {'engine',test_engine}) 141 | 142 | # Test 143 | print("getting energy") 144 | print(test_lot.get_energy(xyz,1,0)) 145 | 146 | print("getting grad") 147 | print(test_lot.get_gradient(xyz,1,0)) 148 | 149 | 150 | 151 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/openmm.py: -------------------------------------------------------------------------------- 1 | # standard library imports 2 | from coordinate_systems import Dihedral 3 | from utilities import manage_xyz, nifty 4 | import sys 5 | from os import path 6 | 7 | # third party 8 | import numpy as np 9 | import simtk.unit as openmm_units 10 | import simtk.openmm.app as openmm_app 11 | import simtk.openmm as openmm 12 | from parmed import load_file 13 | 14 | # local application imports 15 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 16 | try: 17 | from .base_lot import Lot 18 | except: 19 | from base_lot import Lot 20 | 21 | 22 | class OpenMM(Lot): 23 | def __init__(self, options): 24 | 25 | super(OpenMM, self).__init__(options) 26 | 27 | # get simulation from options 
if it exists 28 | # self.options['job_data']['simulation'] = self.options['job_data'].get('simulation',None) 29 | self.simulation = self.options['job_data'].get('simulation', None) 30 | 31 | if self.lot_inp_file is not None and self.simulation is None: 32 | 33 | # Now go through the logic of determining which FILE options are activated. 34 | self.file_options.set_active('use_crystal', False, bool, "Use crystal unit parameters") 35 | self.file_options.set_active('use_pme', False, bool, '', "Use particle mesh ewald-- requires periodic boundary conditions") 36 | self.file_options.set_active('cutoff', 1.0, float, '', depend=(self.file_options.use_pme), msg="Requires PME") 37 | self.file_options.set_active('prmtopfile', None, str, "parameter file") 38 | self.file_options.set_active('inpcrdfile', None, str, "inpcrd file") 39 | self.file_options.set_active('restrain_bondfile', None, str, 'list of bonds to restrain') 40 | self.file_options.set_active('restrain_torfile', None, str, "list of torsions to restrain") 41 | self.file_options.set_active('restrain_tranfile', None, str, "list of translations to restrain") 42 | 43 | for line in self.file_options.record(): 44 | print(line) 45 | 46 | # set all active values to self for easy access 47 | for key in self.file_options.ActiveOptions: 48 | setattr(self, key, self.file_options.ActiveOptions[key]) 49 | 50 | nifty.printcool(" Options for OpenMM") 51 | for val in [self.prmtopfile, self.inpcrdfile]: 52 | assert val is not None, "Missing prmtop or inpcrdfile" 53 | 54 | # Integrator will never be used (Simulation requires one) 55 | integrator = openmm.VerletIntegrator(1.0) 56 | 57 | # create simulation object 58 | if self.use_crystal: 59 | crystal = load_file(self.prmtopfile, self.inpcrdfile) 60 | if self.use_pme: 61 | system = crystal.createSystem( 62 | nonbondedMethod=openmm_app.PME, 63 | nonbondedCutoff=self.cutoff*openmm_units.nanometer, 64 | ) 65 | else: 66 | system = crystal.createSystem( 67 | nonbondedMethod=openmm_app.NoCutoff, 68 | ) 69 | 70 | # Add restraints 71 | self.add_restraints(system) 72 | 73 | self.simulation = openmm_app.Simulation(crystal.topology, system, integrator) 74 | # set the box vectors 75 | inpcrd = openmm_app.AmberInpcrdFile(self.inpcrdfile) 76 | if inpcrd.boxVectors is not None: 77 | print(" setting box vectors") 78 | print(inpcrd.boxVectors) 79 | self.simulation.context.setPeriodicBoxVectors(*inpcrd.boxVectors) 80 | else: # Do not use crystal parameters 81 | prmtop = openmm_app.AmberPrmtopFile(self.prmtopfile) 82 | if self.use_pme: 83 | system = prmtop.createSystem( 84 | nonbondedMethod=openmm_app.PME, 85 | nonbondedCutoff=self.cutoff*openmm_units.nanometer, 86 | ) 87 | else: 88 | system = prmtop.createSystem( 89 | nonbondedMethod=openmm_app.NoCutoff, 90 | ) 91 | 92 | # add restraints 93 | self.add_restraints(system) 94 | 95 | self.simulation = openmm_app.Simulation( 96 | prmtop.topology, 97 | system, 98 | integrator, 99 | ) 100 | 101 | def add_restraints(self, system): 102 | # Bond Restraints 103 | if self.restrain_bondfile is not None: 104 | nifty.printcool(" Adding bonding restraints!") 105 | # Harmonic constraint 106 | 107 | flat_bottom_force = openmm.CustomBondForce( 108 | 'step(r-r0) * (k/2) * (r-r0)^2') 109 | flat_bottom_force.addPerBondParameter('r0') 110 | flat_bottom_force.addPerBondParameter('k') 111 | system.addForce(flat_bottom_force) 112 | 113 | with open(self.restrain_bondfile, 'r') as input_file: 114 | for line in input_file: 115 | print(line) 116 | columns = line.split() 117 | atom_index_i = int(columns[0]) 
118 |                     atom_index_j = int(columns[1])
119 |                     r0 = float(columns[2])
120 |                     k = float(columns[3])
121 |                     flat_bottom_force.addBond(
122 |                         atom_index_i, atom_index_j, [r0, k])
123 | 
124 |         # Torsion restraint
125 |         if self.restrain_torfile is not None:
126 |             nifty.printcool(" Adding torsional restraints!")
127 | 
128 |             # Harmonic restraint on the wrapped angle difference
129 |             tforce = openmm.CustomTorsionForce("0.5*k*min(dtheta, 2*pi-dtheta)^2; dtheta = abs(theta-theta0); pi = 3.1415926535")
130 |             tforce.addPerTorsionParameter("k")
131 |             tforce.addPerTorsionParameter("theta0")
132 |             system.addForce(tforce)
133 | 
134 |             xyz = manage_xyz.xyz_to_np(self.geom)
135 |             with open(self.restrain_torfile, 'r') as input_file:
136 |                 for line in input_file:
137 |                     columns = line.split()
138 |                     a = int(columns[0])
139 |                     b = int(columns[1])
140 |                     c = int(columns[2])
141 |                     d = int(columns[3])
142 |                     k = float(columns[4])
143 |                     dih = Dihedral(a, b, c, d)
144 |                     theta0 = dih.value(xyz)
145 |                     tforce.addTorsion(a, b, c, d, [k, theta0])
146 | 
147 |         # Translation restraint
148 |         if self.restrain_tranfile is not None:
149 |             nifty.printcool(" Adding translational restraints!")
150 |             trforce = openmm.CustomExternalForce("k*periodicdistance(x, y, z, x0, y0, z0)^2")
151 |             trforce.addPerParticleParameter("k")
152 |             trforce.addPerParticleParameter("x0")
153 |             trforce.addPerParticleParameter("y0")
154 |             trforce.addPerParticleParameter("z0")
155 |             system.addForce(trforce)
156 | 
157 |             xyz = manage_xyz.xyz_to_np(self.geom)
158 |             with open(self.restrain_tranfile, 'r') as input_file:
159 |                 for line in input_file:
160 |                     columns = line.split()
161 |                     a = int(columns[0])
162 |                     k = float(columns[1])
163 |                     x0 = xyz[a, 0]*0.1  # Units are in nm
164 |                     y0 = xyz[a, 1]*0.1  # Units are in nm
165 |                     z0 = xyz[a, 2]*0.1  # Units are in nm
166 |                     trforce.addParticle(a, [k, x0, y0, z0])
167 | 
168 |     @property
169 |     def simulation(self):
170 |         return self.options['job_data']['simulation']
171 | 
172 |     @simulation.setter
173 |     def simulation(self, value):
174 |         self.options['job_data']['simulation'] = value
175 | 
176 |     def run(self, geom, mult, ad_idx, runtype='gradient'):
177 | 
178 |         coords = manage_xyz.xyz_to_np(geom)
179 | 
180 |         # Update coordinates of simulation (shallow-copied object)
181 |         xyz_nm = 0.1 * coords  # coords are in angstrom
182 |         self.simulation.context.setPositions(xyz_nm)
183 | 
184 |         # actually compute (only applicable to the ground state, singlet multiplicity)
185 |         if mult != 1 or ad_idx > 0:
186 |             raise RuntimeError("MM can't do excited states")
187 | 
188 |         s = self.simulation.context.getState(
189 |             getEnergy=True,
190 |             getForces=True,
191 |         )
192 |         tmp = s.getPotentialEnergy()
193 |         E = tmp.value_in_unit(openmm_units.kilocalories / openmm_units.moles)
194 |         self._Energies[(mult, ad_idx)] = self.Energy(E, 'kcal/mol')
195 | 
196 |         F = s.getForces()
197 |         G = -1.0 * np.asarray(F.value_in_unit(openmm_units.kilocalories/openmm_units.moles / openmm_units.angstroms))  # gradient = -forces
198 | 
199 |         self._Gradients[(mult, ad_idx)] = self.Gradient(G, 'kcal/mol/Angstrom')
200 |         self.hasRanForCurrentCoords = True
201 | 
202 |         return
203 | 
204 | 
205 | if __name__ == "__main__":
206 |     from openbabel import pybel as pb
207 |     # Create and initialize System object from prmtop/inpcrd
208 |     prmtopfile = '../../data/solvated.prmtop'
209 |     inpcrdfile = '../../data/solvated.rst7'
210 |     prmtop = openmm_app.AmberPrmtopFile(prmtopfile)
211 |     inpcrd = openmm_app.AmberInpcrdFile(inpcrdfile)
212 |     system = prmtop.createSystem(
213 |         rigidWater=False,
214 |         removeCMMotion=False,
215 |         nonbondedMethod=openmm_app.PME,
216 | 
nonbondedCutoff=1*openmm_units.nanometer  # 10 ang
217 |     )
218 | 
219 |     # Integrator will never be used (Simulation requires one)
220 |     integrator = openmm.VerletIntegrator(1.0)
221 |     simulation = openmm_app.Simulation(
222 |         prmtop.topology,
223 |         system,
224 |         integrator,
225 |     )
226 |     mol = next(pb.readfile('pdb', '../../data/solvated.pdb'))
227 |     coords = nifty.getAllCoords(mol)
228 |     atoms = nifty.getAtomicSymbols(mol)
229 |     print(coords)
230 |     geom = manage_xyz.combine_atom_xyz(atoms, coords)
231 | 
232 |     lot = OpenMM.from_options(states=[(1, 0)], job_data={'simulation': simulation}, geom=geom)
233 | 
234 |     E = lot.get_energy(coords, 1, 0)
235 |     print(E)
236 | 
237 |     G = lot.get_gradient(coords, 1, 0)
238 |     nifty.pmat2d(G)
239 | 
--------------------------------------------------------------------------------
/pyGSM/level_of_theories/orca.py:
--------------------------------------------------------------------------------
1 | # standard library imports
2 | import sys
3 | import os
4 | from os import path
5 | 
6 | # third party
7 | import numpy as np
8 | 
9 | # local application imports
10 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
11 | from .base_lot import Lot
12 | 
13 | class Orca(Lot):
14 | 
15 |     def write_input_file(self, geom, multiplicity):
16 |         if not self.lot_inp_file:  # no input template given; build a default one
17 |             inpstring = '!'
18 |             inpstring += ' '+self.functional
19 |             inpstring += ' '+self.basis
20 |             inpstring += ' EnGrad\n\n'  # SOSCF SlowConv \n\n'
21 |             inpstring += '%scf\nMaxIter 300\nconvergence strong\n sthresh 1e-7\n'
22 |             inpstring += 'thresh 1e-11\n tcut 1e-13 \n directresetfreq 1 \n SOSCFStart 0.00033\nend\n'
23 |             # inpstring += '%scf\nMaxIter 300\nend\n'
24 |             inpstring += '\n%maxcore 1000\n\n'
25 |             inpstring += '%pal\nnproc {}\nend\n\n'.format(self.nproc)
26 |         else:
27 |             inpstring = ''
28 |             with open(self.lot_inp_file) as lot_inp:
29 |                 lot_inp_lines = lot_inp.readlines()
30 |             for line in lot_inp_lines:
31 |                 inpstring += line
32 | 
33 |         inpstring += '\n*xyz {} {}\n'.format(self.charge, multiplicity)
34 |         for coord in geom:
35 |             for i in coord:
36 |                 inpstring += str(i)+' '
37 |             inpstring += '\n'
38 |         inpstring += '*'
39 |         tempfilename = 'tempORCAinp_{}'.format(multiplicity)
40 |         tempfile = open(tempfilename, 'w')
41 |         tempfile.write(inpstring)
42 |         tempfile.close()
43 |         return tempfilename
44 | 
45 |     def run(self, geom, multiplicity, ad_idx, runtype='gradient'):
46 | 
47 |         assert ad_idx == 0, "pyGSM ORCA doesn't currently support ad_idx!=0"
48 | 
49 |         # Write input file
50 |         tempfilename = self.write_input_file(geom, multiplicity)
51 | 
52 |         path2orca = os.popen('which orca').read().rstrip()
53 |         user = os.environ['USER']
54 |         cwd = os.environ['PWD']
55 |         try:
56 |             slurmID = os.environ['SLURM_ARRAY_JOB_ID']
57 |             try:
58 |                 slurmTASK = os.environ['SLURM_ARRAY_TASK_ID']
59 |                 runscr = '/tmp/'+user+'/'+slurmID+'/'+slurmTASK
60 |             except KeyError:  # not a SLURM array task
61 |                 runscr = '/tmp/'+user+'/'+slurmID
62 |         except KeyError:  # not running under SLURM; assume PBS
63 |             pbsID = os.environ['PBS_JOBID']
64 |             orcascr = 'temporcarun'
65 |             # runscr = '/tmp/'+user+'/'+orcascr
66 |             runscr = '/tmp/'+pbsID+'/'+orcascr
67 | 
68 |         os.system('mkdir -p {}'.format(runscr))
69 |         os.system('mv {} {}/'.format(tempfilename, runscr))
70 |         cmd = 'cd {}; {} {} > {}/{}.log; cd {}'.format(runscr, path2orca, tempfilename, runscr, tempfilename, cwd)
71 |         os.system(cmd)
72 | 
73 |         # parse output
74 |         self.parse(multiplicity, runscr, tempfilename)
75 | 
76 |         return
77 | 
78 |     def parse(self, multiplicity, runscr, tempfilename):
79 |         engradpath = runscr+'/{}.engrad'.format(tempfilename)
80 |         with 
open(engradpath) as engradfile:
81 |             engradlines = engradfile.readlines()
82 | 
83 |         temp = 100000  # sentinel; set to the header-line index once found
84 |         for i, lines in enumerate(engradlines):
85 |             if '# The current total energy in Eh\n' in lines:
86 |                 temp = i
87 |             if i > temp+1:
88 |                 self._Energies[(multiplicity, 0)] = self.Energy(float(lines.split()[0]), 'Hartree')
89 |                 break
90 | 
91 |         temp = 100000
92 |         tmp = []
93 |         tmp2 = []
94 |         for i, lines in enumerate(engradlines):
95 |             if '# The current gradient in Eh/bohr\n' in lines:
96 |                 temp = i
97 |             if i > temp+1:
98 |                 if "#" in lines:
99 |                     break
100 |                 tmp2.append(float(lines.split()[0]))
101 |                 if len(tmp2) == 3:  # gather x, y, z components into one row per atom
102 |                     tmp.append(tmp2)
103 |                     tmp2 = []
104 |         self._Gradients[(multiplicity, 0)] = self.Gradient(np.asarray(tmp), 'Hartree/Bohr')
--------------------------------------------------------------------------------
/pyGSM/level_of_theories/pdynamo.py:
--------------------------------------------------------------------------------
1 | # standard library imports
2 | import sys
3 | import os
4 | from os import path
5 | import numpy as np
6 | 
7 | # third party
8 | import pMolecule as pM
9 | import pCore as pC
10 | from pScientific.Geometry3 import Coordinates3
11 | import pBabel as pB
12 | import glob
13 | #
14 | ## local application imports
15 | #from Definitions import *
16 | 
17 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
18 | from utilities import manage_xyz, nifty, units
19 | try:
20 |     from .base_lot import Lot
21 | except ImportError:
22 |     from base_lot import Lot
23 | 
24 | 
25 | class pDynamo(Lot):
26 |     """
27 |     Level-of-theory wrapper for QM/MM DFT calculations with pDynamo.
28 |     Requires a pDynamo system object; lot_inp_file must create a pDynamo
29 |     object called system.
30 |     """
31 | 
32 |     def __init__(self, options):
33 | 
34 |         super(pDynamo, self).__init__(options)
35 | 
36 |         # pDynamo
37 |         self.system = self.options['job_data'].get('system', None)
38 | 
39 |         print(" making folder scratch/{}".format(self.node_id))
40 |         os.system('mkdir -p scratch/{}'.format(self.node_id))
41 | 
42 |         # if the system doesn't exist, create it
43 |         if self.lot_inp_file is not None and self.system is None:
44 |             # Now go through the logic of determining which FILE options are activated.
45 |             # DO NOT DUPLICATE OPTIONS WHICH ARE ALREADY PART OF LOT OPTIONS (e.g. charge)
46 |             # ORCA options
47 |             self.file_options.set_active('use_orca', False, bool, "Use ORCA to evaluate energies and gradients")
48 |             self.file_options.set_active('orca_method', "B3LYP", str, "Method to use in ORCA e.g. 
HF, MP2, or density functional",
49 |                                          depend=(self.file_options.use_orca),
50 |                                          msg="Must use ORCA to specify the method")
51 |             self.file_options.set_active('basis', "6-31g", str, "Basis set for wavefunction or density functional",
52 |                                          depend=(self.file_options.use_orca),
53 |                                          msg="Must use ORCA to specify the basis set")
54 |             self.file_options.set_active('tole', 1e-4, float, "Energy tolerance for convergence")
55 |             self.file_options.set_active('maxiter', 100, int, "Number of SCF cycles")
56 |             self.file_options.set_active('slowconv', False, bool, "Convergence option for ORCA")
57 |             self.file_options.set_active('scfconvtol', 'NormalSCF', str, "Convergence option for ORCA", allowed=['NormalSCF', 'TightSCF', 'ExtremeSCF'])
58 |             self.file_options.set_active('d3', False, bool, "Use Grimme's D3 dispersion")
59 | 
60 |             # QM/MM CHARMM
61 |             self.file_options.set_active('qmatom_file', None, str, '')
62 |             self.file_options.set_active('use_charmm_qmmm', False, bool, 'Use CHARMM molecular mechanics parameters to perform QMMM',
63 |                                          depend=(self.file_options.qmatom_file is not None),
64 |                                          msg="Must define qm atoms")
65 |             self.file_options.set_active('path_to_prm', None, str, 'path to folder containing Charmm parameter files')
66 |             self.file_options.set_active('path_to_str', None, str, 'path to folder containing Charmm str files')
67 |             self.file_options.set_active('psf_file', None, str, 'Path to file containing CHARMM PSF')
68 |             self.file_options.set_active('crd_file', None, str, 'Path to file containing CHARMM CRD')
69 | 
70 |             # DFTB options
71 |             self.file_options.set_active('use_dftb', False, bool, "Use DFTB to evaluate energies and gradients",
72 |                                          clash=(self.file_options.use_orca),
73 |                                          msg="Cannot use DFTB+ together with ORCA")
74 |             self.file_options.set_active('path_to_skf', None, str, 'path to folder containing skf files')
75 |             self.file_options.set_active('use_scc', True, bool, "Use self-consistent charge")
76 | 
77 |             # General options
78 |             self.file_options.set_active('command', None, str, 'pDynamo requires a path to an executable like ORCA or DFTB+')
79 |             self.file_options.set_active('scratch', None, str, 'Folder to store temporary files')
80 | 
81 |             self.file_options.force_active('scratch', 'scratch/{}'.format(self.node_id), 'Setting scratch folder')
82 |             nifty.printcool(" Options for pdynamo")
83 | 
84 |             for line in self.file_options.record():
85 |                 print(line)
86 | 
87 |             # set all active values to self for easy access (build_system reads these attributes)
88 |             for key in self.file_options.ActiveOptions:
89 |                 setattr(self, key, self.file_options.ActiveOptions[key])
90 | 
91 |             # Build system
92 |             self.build_system()
93 | 
94 |     def build_system(self):
95 | 
96 |         # save xyz file
97 |         manage_xyz.write_xyz('scratch/{}/tmp.xyz'.format(self.node_id), self.geom)
98 | 
99 |         # ORCA
100 |         if self.use_orca:
101 |             parsed_keywords = []
102 |             # Use these keywords in ORCA
103 |             for key in [self.orca_method, self.basis, self.slowconv, self.scfconvtol, self.d3]:
104 |                 if key is not None and key is not False:
105 |                     parsed_keywords.append(key)
106 |             print(parsed_keywords)
107 | 
108 |             qcmodel = pM.QCModel.QCModelORCA.WithOptions(keywords=parsed_keywords,
109 |                                                          deleteJobFiles=False,
110 |                                                          command=self.command,
111 |                                                          scratch=self.scratch,
112 |                                                          )
113 | 
114 |             # assuming only one state for now
115 |             qcmodel.electronicState = pM.QCModel.ElectronicState.WithOptions(charge=self.charge, multiplicity=self.states[0][0])
116 |             nbModel = pM.NBModel.NBModelORCA.WithDefaults()
117 | 
118 |             if self.use_charmm_qmmm:
119 |                 # Get PRM
120 |                 prm_files = []
121 |                 for name in 
glob.glob(self.path_to_prm+'/*.prm'): 122 | prm_files.append(name) 123 | for name in glob.glob(self.path_to_str+'/*.str'): 124 | prm_files.append(name) 125 | print(prm_files) 126 | 127 | # Build parameters object 128 | parameters = pB.CHARMMParameterFiles_ToParameters( 129 | [x for x in prm_files]) 130 | system = pB.CHARMMPSFFile_ToSystem( 131 | self.psf_file, 132 | isXPLOR=True, 133 | log='scratch/{}/logfile'.format(self.node_id), 134 | parameters=parameters) 135 | 136 | # Get qm atoms 137 | with open(self.qmatom_file) as f: 138 | qmatom_indices = f.read().splitlines() 139 | qmatom_indices = [int(x) for x in qmatom_indices] 140 | 141 | system.DefineQCModel(qcmodel, qcSelection=pC.Selection(qmatom_indices)) 142 | system.DefineNBModel(nbModel) 143 | else: 144 | # Define System 145 | system = pB.XYZFile_ToSystem('scratch/{}/tmp.xyz'.format(self.node_id)) 146 | system.DefineQCModel(qcmodel) 147 | # system.Summary ( ) 148 | 149 | self.system = system 150 | 151 | elif self.use_dftb: 152 | electronicState = pM.QCModel.ElectronicState.WithOptions(charge=self.charge, multiplicity=self.states[0][0]) 153 | qcModel = pM.QCModel.QCModelDFTB.WithOptions(deleteJobFiles=False, 154 | electronicState=electronicState, 155 | randomScratch=True, 156 | scratch='scratch/{}'.format(self.node_id), 157 | skfPath=self.path_to_skf, 158 | command=self.command, 159 | useSCC=self.use_scc) 160 | system = pB.XYZFile_ToSystem('scratch/{}/tmp.xyz'.format(self.node_id)) 161 | system.DefineQCModel(qcModel) 162 | self.system = system 163 | 164 | def run(self, geom, multiplicity=1): 165 | self.E = [] 166 | self.grada = [] 167 | coordinates3 = Coordinates3.WithExtent(len(geom)) 168 | xyz = manage_xyz.xyz_to_np(geom) 169 | for (i, (x, y, z)) in enumerate(xyz): 170 | coordinates3[i, 0] = x 171 | coordinates3[i, 1] = y 172 | coordinates3[i, 2] = z 173 | self.system.coordinates3 = coordinates3 174 | energy = self.system.Energy(doGradients=True) # KJ 175 | energy *= units.KJ_MOL_TO_AU * units.KCAL_MOL_PER_AU # KCAL/MOL 176 | 177 | self.E.append((multiplicity, energy)) 178 | print(energy) 179 | 180 | gradient = [] 181 | for i in range(len(geom)): 182 | for j in range(3): 183 | gradient.append(self.system.scratch.gradients3[i, j] * units.KJ_MOL_TO_AU / units.ANGSTROM_TO_AU) # Ha/Bohr 184 | gradient = np.asarray(gradient) 185 | # print(gradient) 186 | self.grada.append((multiplicity, gradient)) 187 | # print(gradient) 188 | 189 | @property 190 | def system(self): 191 | return self.options['job_data']['system'] 192 | 193 | @system.setter 194 | def system(self, value): 195 | self.options['job_data']['system'] = value 196 | 197 | 198 | if __name__ == "__main__": 199 | 200 | # QMMM 201 | #filepath='/export/zimmerman/craldaz/kevin2/tropcwatersphere.xyz' 202 | #geom = manage_xyz.read_xyz(filepath) 203 | #filepath='tropcwatersphere.xyz' 204 | #lot = pDynamo.from_options(states=[(5,0)],charge=-1,nproc=16,fnm=filepath,lot_inp_file='pdynamo_options_qmmm.txt') 205 | #lot.run(geom) 206 | 207 | # DFTB 208 | filepath = '../../data/ethylene.xyz' 209 | geom = manage_xyz.read_xyz(filepath) 210 | lot = pDynamo.from_options(states=[(1, 0)], charge=0, nproc=16, fnm=filepath, lot_inp_file='pdynamo_options_dftb.txt') 211 | lot.run(geom) 212 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/qchem.py: -------------------------------------------------------------------------------- 1 | # standard library imports 2 | import subprocess 3 | import sys 4 | import os 5 | from os import path 6 | 7 | # third 
party 8 | import numpy as np 9 | 10 | # local application imports 11 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 12 | from .base_lot import Lot 13 | 14 | 15 | class QChem(Lot): 16 | def __init__(self, options): 17 | super(QChem, self).__init__(options) 18 | 19 | qcscratch = os.environ['QCSCRATCH'] 20 | for state in self.states: 21 | tempfolder = qcscratch + '/string_{:03d}/{}.{}/'.format(self.ID, self.node_id, state[0]) 22 | print(" making temp folder {}".format(tempfolder)) 23 | os.system('mkdir -p {}'.format(tempfolder)) 24 | 25 | copy_input_file = os.getcwd() + "/QChem_input.txt" 26 | print(copy_input_file) 27 | self.write_preamble(self.geom, self.states[0][0], copy_input_file) 28 | print(" making folder scratch/{}".format(self.node_id)) 29 | os.system('mkdir -p scratch/{}'.format(self.node_id)) 30 | 31 | def write_preamble(self, geom, multiplicity, tempfilename, jobtype='FORCE'): 32 | 33 | tempfile = open(tempfilename, 'w') 34 | if not self.lot_inp_file: 35 | tempfile.write(' $rem\n') 36 | tempfile.write(' JOBTYPE {}\n'.format(jobtype)) 37 | tempfile.write(' EXCHANGE {}\n'.format(self.functional)) 38 | tempfile.write(' SCF_ALGORITHM rca_diis\n') 39 | tempfile.write(' SCF_MAX_CYCLES 300\n') 40 | tempfile.write(' BASIS {}\n'.format(self.basis)) 41 | # tempfile.write(' ECP LANL2DZ \n') 42 | tempfile.write(' WAVEFUNCTION_ANALYSIS FALSE\n') 43 | tempfile.write(' GEOM_OPT_MAX_CYCLES 300\n') 44 | tempfile.write('scf_convergence 6\n') 45 | tempfile.write(' SYM_IGNORE TRUE\n') 46 | tempfile.write(' SYMMETRY FALSE\n') 47 | tempfile.write('molden_format true\n') 48 | tempfile.write(' $end\n') 49 | tempfile.write('\n') 50 | tempfile.write('$molecule\n') 51 | else: 52 | with open(self.lot_inp_file) as lot_inp: 53 | lot_inp_lines = lot_inp.readlines() 54 | for line in lot_inp_lines: 55 | tempfile.write(line) 56 | 57 | tempfile.write('{} {}\n'.format(self.charge, multiplicity)) 58 | if os.path.isfile("link.txt"): 59 | with open("link.txt") as link: 60 | link_lines = link.readlines() 61 | tmp_geom = [list(i) for i in geom] 62 | for i, coord in enumerate(tmp_geom): 63 | coord.append(link_lines[i].rstrip('\n')) 64 | for i in coord: 65 | tempfile.write(str(i)+' ') 66 | tempfile.write('\n') 67 | else: 68 | for coord in geom: 69 | for i in coord: 70 | tempfile.write(str(i)+' ') 71 | tempfile.write('\n') 72 | tempfile.write('$end') 73 | tempfile.close() 74 | 75 | def run(self, geom, multiplicity, ad_idx, runtype='gradient'): 76 | 77 | assert ad_idx == 0, "pyGSM Q-Chem doesn't currently support ad_idx!=0" 78 | 79 | qcscratch = os.environ['QCSCRATCH'] 80 | tempfilename = qcscratch + '/string_{:03d}/{}.{}/tempQCinp'.format(self.ID, self.node_id, multiplicity) 81 | 82 | if self.calc_grad and runtype != "energy": 83 | self.write_preamble(geom, multiplicity, tempfilename) 84 | else: 85 | self.write_preamble(geom, multiplicity, tempfilename, jobtype='SP') 86 | 87 | cmd = ['qchem'] 88 | args = ['-nt', str(self.nproc), 89 | '-save', 90 | tempfilename, 91 | '{}.qchem.out'.format(tempfilename), 92 | 'string_{:03d}/{}.{}'.format(self.ID, self.node_id, multiplicity) 93 | ] 94 | cmd.extend(args) 95 | 96 | # Run the process 97 | subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()[0] 98 | 99 | self.parse(qcscratch, multiplicity) 100 | 101 | return 102 | 103 | def parse(self, qcscratch, multiplicity): 104 | # PARSE OUTPUT # 105 | if self.calc_grad: 106 | efilepath = qcscratch + '/string_{:03d}/{}.{}/GRAD'.format(self.ID, self.node_id, multiplicity) 107 | with 
open(efilepath) as efile: 108 | elines = efile.readlines() 109 | 110 | temp = 0 111 | for lines in elines: 112 | if temp == 1: 113 | # defaulting to the ground-state 114 | self._Energies[(multiplicity, 0)] = self.Energy(float(lines.split()[0]), 'Hartree') 115 | break 116 | if "$" in lines: 117 | temp += 1 118 | 119 | with open(efilepath) as efile: 120 | gradlines = efile.readlines() 121 | temp = 0 122 | tmp = [] 123 | for lines in gradlines: 124 | if '$' in lines: 125 | temp += 1 126 | elif temp == 2: 127 | tmpline = lines.split() 128 | tmp.append([float(i) for i in tmpline]) 129 | elif temp == 3: 130 | break 131 | self._Gradients[(multiplicity, 0)] = self.Gradient(np.asarray(tmp), 'Hartree/Bohr') 132 | else: 133 | raise NotImplementedError 134 | self.write_E_to_file() 135 | 136 | @classmethod 137 | def copy(cls, lot, options, copy_wavefunction=True): 138 | base = os.environ['QCSCRATCH'] 139 | node_id = options.get('node_id', 1) 140 | 141 | if node_id != lot.node_id: # and copy_wavefunction: # other theories are more sensitive than qchem -- commenting out 142 | for state in lot.states: 143 | multiplicity = state[0] 144 | efilepath_old = base + '/string_{:03d}/{}.{}'.format(lot.ID, lot.node_id, multiplicity) 145 | efilepath_new = base + '/string_{:03d}/{}.{}'.format(lot.ID, node_id, multiplicity) 146 | cmd = 'cp -r ' + efilepath_old + ' ' + efilepath_new 147 | print(" copying QCSCRATCH files\n {}".format(cmd)) 148 | os.system(cmd) 149 | return cls(lot.options.copy().set_values(options)) 150 | -------------------------------------------------------------------------------- /pyGSM/level_of_theories/terachemcloud.py: -------------------------------------------------------------------------------- 1 | # standard library imports 2 | 3 | import sys 4 | from os import path 5 | import time 6 | 7 | # third party 8 | import numpy as np 9 | import tcc 10 | import json 11 | 12 | # local application imports 13 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__)))) 14 | from .base_lot import Lot 15 | from utilities import nifty 16 | 17 | 18 | class TeraChemCloud(Lot): 19 | 20 | @property 21 | def TC(self): 22 | return self.options['job_data']['TC'] 23 | 24 | @property 25 | def tcc_options(self): 26 | return self.options['job_data']['tcc_options'] 27 | 28 | @tcc_options.setter 29 | def tcc_options(self, d): 30 | self.options['job_data']['tcc_options'] = d 31 | return 32 | 33 | @property 34 | def orbfile(self): 35 | return self.options['job_data']['orbfile'] 36 | 37 | @orbfile.setter 38 | def orbfile(self, value): 39 | self.options['job_data']['orbfile'] = value 40 | 41 | def __init__(self, options): 42 | super(TeraChemCloud, self).__init__(options) 43 | self.options['job_data']['tcc_options'] = self.options['job_data'].get( 44 | 'tcc_options', {}) 45 | self.options['job_data']['TC'] = self.options['job_data'].get( 46 | 'TC', None) 47 | 48 | if self.lot_inp_file is not None: 49 | exec(open(self.lot_inp_file).read()) 50 | print(' done executing lot_inp_file') 51 | self.options['job_data']['TC'] = TC 52 | self.options['job_data']['tcc_options'] = tcc_options 53 | #self.options['job_data']['orbfile'] 54 | tcc_options_copy = self.tcc_options.copy() 55 | tcc_options_copy['atoms'] = self.atoms 56 | self.tcc_options = tcc_options_copy 57 | 58 | def run(self, coords): 59 | 60 | E = [] 61 | grada = [] 62 | for state in self.states: 63 | # print("on state %d" % state[1]) 64 | multiplicity = state[0] 65 | ad_idx = state[1] 66 | grad_options = self.tcc_options.copy() 67 | grad_options['runtype'] = 
'gradient'
68 |             grad_options['castargetmult'] = multiplicity
69 |             grad_options['castarget'] = ad_idx
70 |             if self.orbfile:
71 |                 grad_options['guess'] = self.orbfile
72 |                 print(" orbfile is %s" % self.orbfile)
73 |             else:
74 |                 print(" generating orbs from guess")
75 |             job_id = self.TC.submit(coords, grad_options)
76 |             results = self.TC.poll_for_results(job_id)
77 |             while results['message'] == "job not finished":
78 |                 results = self.TC.poll_for_results(job_id)
79 |                 print(results['message'])
80 |                 print("sleeping for 1")
81 |                 time.sleep(1)
82 |                 sys.stdout.flush()
83 | 
84 |             # print((json.dumps(results, indent=2, sort_keys=True)))
85 |             self.orbfile = results['orbfile']
86 |             try:
87 |                 E.append((multiplicity, ad_idx, results['energy'][ad_idx]))
88 |             except (TypeError, IndexError):  # single-state jobs return a scalar energy
89 |                 E.append((multiplicity, ad_idx, results['energy']))
90 | 
91 |             grada.append((multiplicity, ad_idx, results['gradient']))
92 |         if self.do_coupling:
93 |             # state1 = self.states[0][1]
94 |             # state2 = self.states[1][1]
95 |             nac_options = self.tcc_options.copy()
96 |             nac_options['runtype'] = 'coupling'
97 |             nac_options['nacstate1'] = 0
98 |             nac_options['nacstate2'] = 1
99 |             nac_options['guess'] = self.orbfile
100 | 
101 |             # nifty.printcool_dictionary(nac_options)
102 |             job_id = self.TC.submit(coords, nac_options)
103 |             results = self.TC.poll_for_results(job_id)
104 |             while results['message'] == "job not finished":
105 |                 results = self.TC.poll_for_results(job_id)
106 |                 print(results['message'])
107 |                 print("sleeping for 1")
108 |                 time.sleep(1)
109 |                 sys.stdout.flush()
110 |             # print((json.dumps(results, indent=2, sort_keys=True)))
111 |             coup = results['nacme']
112 |             self.Couplings[self.coupling_states] = self.Coupling(
113 |                 coup, 'Hartree/Bohr')
114 | 
115 |         for (mult, ad_idx, energy), state in zip(E, self.states):
116 |             self._Energies[state] = self.Energy(energy, 'Hartree')
117 |         for (mult, ad_idx, grad), state in zip(grada, self.gradient_states):
118 |             self._Gradients[state] = self.Gradient(grad, "Hartree/Bohr")
119 | 
120 |         self.hasRanForCurrentCoords = True
121 |         return
122 | 
--------------------------------------------------------------------------------
/pyGSM/level_of_theories/xtb_lot.py:
--------------------------------------------------------------------------------
1 | # standard library imports
2 | import sys
3 | from os import path
4 | 
5 | # third party
6 | import numpy as np
7 | 
8 | try:
9 |     from xtb.interface import Calculator
10 |     from xtb.utils import get_method, get_solvent
11 |     from xtb.interface import Environment
12 |     from xtb.libxtb import VERBOSITY_FULL
13 | except ImportError:
14 |     print('xtb is not imported')
15 | 
16 | # local application imports
17 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
18 | from utilities import manage_xyz, units, elements
19 | try:
20 |     from .base_lot import Lot
21 | except ImportError:
22 |     from base_lot import Lot
23 | 
24 | 
25 | class xTB_lot(Lot):
26 |     def __init__(self, options):
27 |         super(xTB_lot, self).__init__(options)
28 | 
29 |         numbers = []
30 |         E = elements.ElementData()
31 |         for a in manage_xyz.get_atoms(self.geom):
32 |             elem = E.from_symbol(a)
33 |             numbers.append(elem.atomic_num)
34 |         self.numbers = np.asarray(numbers)
35 | 
36 |     def run(self, geom, multiplicity, state, verbose=False):
37 | 
38 |         # print('running!')
39 |         # sys.stdout.flush()
40 |         coords = manage_xyz.xyz_to_np(geom)
41 | 
42 |         # convert to bohr
43 |         positions = coords * units.ANGSTROM_TO_AU
44 |         calc = Calculator(get_method(self.xTB_Hamiltonian), self.numbers, positions, charge=self.charge)
45 | 
46 |         calc.set_accuracy(self.xTB_accuracy)
47 | 
calc.set_electronic_temperature(self.xTB_electronic_temperature) 48 | 49 | if self.solvent is not None: 50 | calc.set_solvent(get_solvent(self.solvent)) 51 | 52 | calc.set_output('lot_jobs_{}.txt'.format(self.node_id)) 53 | res = calc.singlepoint() # energy printed is only the electronic part 54 | calc.release_output() 55 | 56 | # energy in hartree 57 | self._Energies[(multiplicity, state)] = self.Energy(res.get_energy(), 'Hartree') 58 | 59 | # grad in Hatree/Bohr 60 | self._Gradients[(multiplicity, state)] = self.Gradient(res.get_gradient(), 'Hartree/Bohr') 61 | 62 | # write E to scratch 63 | self.write_E_to_file() 64 | 65 | return res 66 | 67 | 68 | if __name__ == "__main__": 69 | 70 | geom = manage_xyz.read_xyz('../../data/ethylene.xyz') 71 | # geoms=manage_xyz.read_xyzs('../../data/diels_alder.xyz') 72 | # geom = geoms[0] 73 | # geom=manage_xyz.read_xyz('xtbopt.xyz') 74 | xyz = manage_xyz.xyz_to_np(geom) 75 | # xyz *= units.ANGSTROM_TO_AU 76 | 77 | lot = xTB_lot.from_options(states=[(1, 0)], gradient_states=[(1, 0)], geom=geom, node_id=0) 78 | 79 | E = lot.get_energy(xyz, 1, 0) 80 | print(E) 81 | 82 | g = lot.get_gradient(xyz, 1, 0) 83 | print(g) 84 | -------------------------------------------------------------------------------- /pyGSM/molecule/__init__.py: -------------------------------------------------------------------------------- 1 | from .molecule import Molecule 2 | -------------------------------------------------------------------------------- /pyGSM/optimizers/__init__.py: -------------------------------------------------------------------------------- 1 | from .conjugate_gradient import conjugate_gradient 2 | from .eigenvector_follow import eigenvector_follow 3 | from .lbfgs import lbfgs 4 | from ._linesearch import backtrack,NoLineSearch 5 | from .beales_cg import beales_cg 6 | -------------------------------------------------------------------------------- /pyGSM/optimizers/conjugate_gradient.py: -------------------------------------------------------------------------------- 1 | from __future__ import print_function 2 | 3 | # standard library imports 4 | try: 5 | from io import StringIO 6 | except: 7 | from StringIO import StringIO 8 | 9 | # third party 10 | import numpy as np 11 | 12 | # local application imports 13 | from .base_optimizer import base_optimizer 14 | from utilities import units 15 | 16 | 17 | class conjugate_gradient(base_optimizer): 18 | 19 | def optimize(self, molecule, refE=0., opt_type='UNCONSTRAINED', opt_steps=3, ictan=None): 20 | print(" initial E %5.4f" % (molecule.energy - refE)) 21 | if opt_type == 'TS': 22 | raise RuntimeError 23 | 24 | # stash/initialize some useful attributes 25 | geoms = [] 26 | energies = [] 27 | geoms.append(molecule.geometry) 28 | energies.append(molecule.energy-refE) 29 | self.initial_step = True 30 | self.disp = 1000. 31 | self.Ediff = 1000. 
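        # NOTE: the loop below implements nonlinear conjugate gradient with a
        # Fletcher-Reeves beta; the first iteration falls back to steepest
        # descent (see the d_prim update further down)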
32 |         self.check_inputs(molecule, opt_type, ictan)
33 |         nconstraints = self.get_nconstraints(opt_type)
34 |         self.buf = StringIO()
35 | 
36 |         # form initial coord basis
37 |         constraints = self.get_constraint_vectors(molecule, opt_type, ictan)
38 |         molecule.update_coordinate_basis(constraints=constraints)
39 | 
40 |         # for cartesian these are the same
41 |         x = np.copy(molecule.coordinates)
42 |         # xyz = np.copy(molecule.xyz)
43 | 
44 |         # number of coordinates
45 |         if molecule.coord_obj.__class__.__name__ == 'CartesianCoordinates':
46 |             n = molecule.num_coordinates
47 |         else:
48 |             n_actual = molecule.num_coordinates
49 |             n = n_actual - nconstraints
50 | 
51 |         # Evaluate the function value and its gradient.
52 |         fx = molecule.energy
53 |         g = molecule.gradient.copy()
54 |         g_prim = np.dot(molecule.coord_basis, g)
55 |         gp = g.copy()
56 |         gp_prim = np.dot(molecule.coord_basis, gp)
57 | 
58 |         molecule.gradrms = np.sqrt(np.dot(g[nconstraints:].T, g[nconstraints:])/n)
59 |         if molecule.gradrms < self.conv_grms:
60 |             print(" already at min")
61 |             return geoms, energies
62 | 
63 |         for ostep in range(opt_steps):
64 |             print(" On opt step {} ".format(ostep+1))
65 | 
66 |             if self.initial_step is True:
67 |                 # d: store the negative gradient of the objective function at point x.
68 |                 # compute the initial step
69 |                 d_prim = -g_prim
70 |                 # set initial step to false
71 |                 self.initial_step = False
72 |             else:
73 |                 # Fletcher-Reeves formula for Beta
74 |                 # http://en.wikipedia.org/wiki/Nonlinear_conjugate_gradient_method
75 |                 dnew = -g_prim
76 |                 deltanew = np.dot(dnew.T, dnew)
77 |                 deltaold = np.dot(-gp_prim.T, -gp_prim)
78 |                 beta = deltanew/deltaold
79 |                 print(" beta = %1.2f" % beta)
80 |                 d_prim = dnew + beta*d_prim
81 | 
82 |             # form in DLC basis (does nothing if cartesian)
83 |             d = np.dot(molecule.coord_basis.T, d_prim)
84 | 
85 |             # normalize the direction
86 |             actual_step = np.linalg.norm(d)
87 |             print(" actual_step= %1.2f" % actual_step)
88 |             d = d/actual_step  # normalize
89 |             if actual_step > self.options['DMAX']:
90 |                 step = self.options['DMAX']
91 |                 print(" reducing step, new step = %1.2f" % step)
92 |             else:
93 |                 step = actual_step
94 | 
95 |             # store
96 |             xp = x.copy()
97 |             gp = g.copy()
98 |             gp_prim = np.dot(molecule.coord_basis, gp)
99 |             fxp = fx
100 | 
101 |             # => calculate constraint step <= #
102 |             constraint_steps = self.get_constraint_steps(molecule, opt_type, g)
103 | 
104 |             # line search
105 |             print(" Linesearch")
106 |             ls = self.Linesearch(n, x, fx, g, d, step, xp, gp, constraint_steps, self.linesearch_parameters, molecule)
107 |             print(" Done linesearch")
108 | 
109 |             # revert to the previous point
110 |             if ls['status'] < 0:
111 |                 x = xp.copy()
112 |                 print('[ERROR] linesearch failed; reverting to the previous point')
113 |                 return ls['status']
114 | 
115 |             # get values from linesearch
116 |             p_step = step
117 |             step = ls['step']
118 |             x = ls['x']
119 |             fx = ls['fx']
120 |             g = ls['g']
121 |             g_prim = np.dot(molecule.coord_basis, g)
122 | 
123 |             # control step size
124 |             if step < p_step:
125 |                 self.options['DMAX'] /= 2.
126 |             elif step > p_step:
127 |                 self.options['DMAX'] *= 2.
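            # keep the adaptive max step size inside a fixed trust window so
            # repeated halving/doubling cannot stall or blow up the search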
128 |             if self.options['DMAX'] < 0.01:
129 |                 self.options['DMAX'] = 0.01
130 |             elif self.options['DMAX'] > 0.25:
131 |                 self.options['DMAX'] = 0.25
132 | 
133 |             # dE
134 |             dEstep = fx - fxp
135 |             print(" dEstep=%5.4f" % dEstep)
136 | 
137 |             # update molecule xyz
138 |             # xyz = molecule.update_xyz(x-xp)
139 |             geoms.append(molecule.geometry)
140 |             energies.append(molecule.energy-refE)
141 | 
142 |             if self.options['print_level'] > 0:
143 |                 print(" Opt step: %d E: %5.4f gradrms: %1.5f ss: %1.3f DMAX: %1.3f" % (ostep+1, fx-refE, molecule.gradrms, step, self.options['DMAX']))
144 |             self.buf.write(u' Opt step: %d E: %5.4f gradrms: %1.5f ss: %1.3f DMAX: %1.3f\n' % (ostep+1, fx-refE, molecule.gradrms, step, self.options['DMAX']))
145 | 
146 |             # gmax = np.max(g)/units.ANGSTROM_TO_AU/KCAL_MOL_PER_AU
147 |             # print "current gradrms= %r au" % gradrms
148 |             gmax = np.max(np.abs(g))/units.ANGSTROM_TO_AU
149 |             self.disp = np.max(np.abs(x - xp))/units.ANGSTROM_TO_AU
150 |             self.Ediff = (fx - fxp) / units.KCAL_MOL_PER_AU
151 |             print(" maximum displacement component %1.2f (au)" % self.disp)
152 |             print(" maximum gradient component %1.2f (au)" % gmax)
153 | 
154 |             # check for convergence TODO
155 |             molecule.gradrms = np.sqrt(np.dot(g[nconstraints:].T, g[nconstraints:])/n)
156 |             if molecule.gradrms < self.conv_grms:
157 |                 break
158 | 
159 |             # check if finished
160 |             # if gradrms <= params.conv_grms or \
161 |             #    (self.disp <= params.conv_disp and self.Ediff <= params.conv_Ediff) or \
162 |             #    (gmax <= params.conv_gmax and abs(self.Ediff) <= params.conv_Ediff):
163 |             #     print('[INFO] converged')
164 |             #     print(gradrms)
165 |             #     #print self.Ediff
166 |             #     #print self.disp
167 |             #     break
168 | 
169 |             # update DLC --> this changes q, g, Hint
170 |             if not molecule.coord_obj.__class__.__name__ == 'CartesianCoordinates':
171 |                 constraints = self.get_constraint_vectors(molecule, opt_type, ictan)
172 |                 molecule.update_coordinate_basis(constraints=constraints)
173 |                 x = np.copy(molecule.coordinates)
174 |                 fx = molecule.energy
175 |                 dE = molecule.difference_energy
176 |                 if dE != 1000.:
177 |                     print(" difference energy is %5.4f" % dE)
178 |                 g = molecule.gradient.copy()
179 |                 molecule.form_Hessian_in_basis()
180 |             print()
181 | 
182 |         print(" opt-summary")
183 |         print(self.buf.getvalue())
184 |         return geoms, energies
185 | 
--------------------------------------------------------------------------------
/pyGSM/potential_energy_surfaces/__init__.py:
--------------------------------------------------------------------------------
1 | #__all__=['PES','Avg_PES','Penalty_PES']
2 | from .pes import PES
3 | from .avg_pes import Avg_PES
4 | from .penalty_pes import Penalty_PES
5 | #from .md_penalty_pes import MD_Penalty_PES
6 | 
--------------------------------------------------------------------------------
/pyGSM/potential_energy_surfaces/penalty_pes.py:
--------------------------------------------------------------------------------
1 | # standard library imports
2 | import sys
3 | from os import path
4 | 
5 | # third party
6 | 
7 | # local application imports
8 | sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
9 | from utilities import manage_xyz, nifty, units
10 | from .pes import PES
11 | 
12 | class Penalty_PES(PES):
13 |     """ Penalty potential energy surface calculator """
14 | 
15 |     def __init__(self,
16 |                  PES1,
17 |                  PES2,
18 |                  lot,
19 |                  sigma=1.0,
20 |                  alpha=0.02*units.KCAL_MOL_PER_AU,
21 |                  ):
22 |         self.PES1 = PES(PES1.options.copy().set_values({
23 |             "lot": lot,
24 |         }))
25 |         self.PES2 = PES(PES2.options.copy().set_values({
26 |             "lot": lot,
27 |         }))
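        # The function minimized on this PES (see get_energy below) is
        #   F = 0.5*(E1 + E2) + sigma * dE^2 / (|dE| + alpha),   dE = E2 - E1,
        # whose penalty term vanishes as the gap closes, steering the
        # optimizer toward the E1/E2 crossing seam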
28 | self.lot = lot 29 | self.alpha = alpha 30 | self.dE = 1000. 31 | self.sigma = sigma 32 | print(' PES1 multiplicity: {} PES2 multiplicity: {} sigma: {}'.format(self.PES1.multiplicity, self.PES2.multiplicity, self.sigma)) 33 | 34 | @classmethod 35 | def create_pes_from(cls, PES, options={}, copy_wavefunction=True): 36 | lot = type(PES.lot).copy(PES.lot, options, copy_wavefunction) 37 | return cls(PES.PES1, PES.PES2, lot, PES.sigma, PES.alpha) 38 | 39 | def get_energy(self, geom): 40 | E1 = self.PES1.get_energy(geom) 41 | E2 = self.PES2.get_energy(geom) 42 | 43 | # avgE = 0.5*(self.PES1.get_energy(geom) + self.PES2.get_energy(geom)) 44 | avgE = 0.5*(E1+E2) 45 | # self.dE = self.PES2.get_energy(geom) - self.PES1.get_energy(geom) 46 | self.dE = E2-E1 47 | # print "E1: %1.4f E2: %1.4f"%(E1,E2), 48 | # print "delta E = %1.4f" %self.dE, 49 | # TODO what to do if PES2 is or goes lower than PES1? 50 | G = (self.dE*self.dE)/(abs(self.dE) + self.alpha) 51 | # if self.dE < 0: 52 | # G*=-1 53 | # print "G = %1.4f" % G 54 | # print "alpha: %1.4f sigma: %1.4f"%(self.alpha,self.sigma), 55 | # print "F: %1.4f"%(avgE+self.sigma*G) 56 | sys.stdout.flush() 57 | return avgE+self.sigma*G 58 | 59 | def get_gradient(self, geom, frozen_atoms=None): 60 | self.grad1 = self.PES1.get_gradient(geom, frozen_atoms) 61 | self.grad2 = self.PES2.get_gradient(geom, frozen_atoms) 62 | avg_grad = 0.5*(self.grad1 + self.grad2) 63 | dgrad = self.grad2 - self.grad1 64 | if self.dE < 0: 65 | dgrad *= -1 66 | factor = self.sigma*((self.dE*self.dE) + 2.*self.alpha*abs(self.dE))/((abs(self.dE) + self.alpha)**2) 67 | # print "factor is %1.4f" % factor 68 | return avg_grad + factor*dgrad 69 | 70 | def get_avg_gradient(self, xyz, frozen_atoms=None): 71 | return 0.5*(self.PES1.get_gradient(xyz, frozen_atoms) + self.PES2.get_gradient(xyz, frozen_atoms)) 72 | 73 | 74 | if __name__ == '__main__': 75 | 76 | from level_of_theories.pytc import PyTC 77 | import psiw 78 | import lightspeed as ls 79 | 80 | filepath = '../../data/ethylene.xyz' 81 | geom = manage_xyz.read_xyz(filepath, scale=1) 82 | # => Job Data <= ##### 83 | states = [(1, 0), (1, 1)] 84 | charge = 0 85 | nocc = 7 86 | nactive = 2 87 | basis = '6-31gs' 88 | 89 | #### => PSIW Obj <= ###### 90 | nifty.printcool("Build resources") 91 | resources = ls.ResourceList.build() 92 | nifty.printcool('{}'.format(resources)) 93 | 94 | molecule = ls.Molecule.from_xyz_file(filepath) 95 | geom = psiw.geometry.Geometry.build( 96 | resources=resources, 97 | molecule=molecule, 98 | basisname=basis, 99 | ) 100 | nifty.printcool('{}'.format(geom)) 101 | 102 | ref = psiw.RHF.from_options( 103 | geometry=geom, 104 | g_convergence=1.0E-6, 105 | fomo=True, 106 | fomo_method='gaussian', 107 | fomo_temp=0.3, 108 | fomo_nocc=nocc, 109 | fomo_nact=nactive, 110 | print_level=1, 111 | ) 112 | ref.compute_energy() 113 | casci = psiw.CASCI.from_options( 114 | reference=ref, 115 | nocc=nocc, 116 | nact=nactive, 117 | nalpha=nactive/2, 118 | nbeta=nactive/2, 119 | S_inds=[0], 120 | S_nstates=[2], 121 | print_level=1, 122 | ) 123 | casci.compute_energy() 124 | psiw = psiw.CASCI_LOT.from_options( 125 | casci=casci, 126 | rhf_guess=True, 127 | rhf_mom=True, 128 | orbital_coincidence='core', 129 | state_coincidence='full', 130 | ) 131 | 132 | nifty.printcool("Build the pyGSM Level of Theory object (LOT)") 133 | lot = PyTC.from_options(states=[(1, 0), (1, 1)], job_data={'psiw': psiw}, do_coupling=False, fnm=filepath) 134 | 135 | pes1 = PES.from_options(lot=lot, ad_idx=0, multiplicity=1) 136 | pes2 = 
PES.from_options(lot=lot, ad_idx=1, multiplicity=1) 137 | pes = Penalty_PES(PES1=pes1, PES2=pes2, lot=lot) 138 | geom = manage_xyz.read_xyz(filepath, scale=1) 139 | coords = manage_xyz.xyz_to_np(geom) 140 | print(pes.get_energy(coords)) 141 | print(pes.get_gradient(coords)) 142 | -------------------------------------------------------------------------------- /pyGSM/py.typed: -------------------------------------------------------------------------------- 1 | # PEP 561 marker file. See https://peps.python.org/pep-0561/ 2 | -------------------------------------------------------------------------------- /pyGSM/pyGSM.py: -------------------------------------------------------------------------------- 1 | """Provide the primary functions.""" 2 | 3 | 4 | def canvas(with_attribution=True): 5 | """ 6 | Placeholder function to show example docstring (NumPy format). 7 | 8 | Replace this function and doc string for your own project. 9 | 10 | Parameters 11 | ---------- 12 | with_attribution : bool, Optional, default: True 13 | Set whether or not to display who the quote is from. 14 | 15 | Returns 16 | ------- 17 | quote : str 18 | Compiled string including quote and optional attribution. 19 | """ 20 | 21 | quote = "The code is but a canvas to our imagination." 22 | if with_attribution: 23 | quote += "\n\t- Adapted from Henry David Thoreau" 24 | return quote 25 | 26 | 27 | if __name__ == "__main__": 28 | # Do something if this file is invoked on its own 29 | print(canvas()) 30 | -------------------------------------------------------------------------------- /pyGSM/tests/__init__.py: -------------------------------------------------------------------------------- 1 | """ 2 | Empty init file in case you choose a package besides PyTest such as Nose which may look for such a file. 
3 | """ 4 | -------------------------------------------------------------------------------- /pyGSM/tests/test_basic_mecp.py: -------------------------------------------------------------------------------- 1 | import pytest 2 | 3 | from pyGSM.coordinate_systems.delocalized_coordinates import ( 4 | DelocalizedInternalCoordinates, 5 | ) 6 | from pyGSM.coordinate_systems.primitive_internals import PrimitiveInternalCoordinates 7 | from pyGSM.coordinate_systems.topology import Topology 8 | from pyGSM.level_of_theories.xtb_lot import xTB_lot 9 | from pyGSM.molecule.molecule import Molecule 10 | from pyGSM.optimizers import eigenvector_follow 11 | from pyGSM.potential_energy_surfaces.penalty_pes import Penalty_PES 12 | from pyGSM.potential_energy_surfaces.pes import PES 13 | from pyGSM.utilities import elements, manage_xyz, nifty 14 | 15 | 16 | def test_basic_penalty_opt(): 17 | geom = manage_xyz.read_xyzs('pyGSM/data/diels_alder.xyz')[0] 18 | 19 | coupling_states = [] 20 | 21 | lot1 = xTB_lot.from_options( 22 | ID=0, 23 | states=[(1, 0)], 24 | gradient_states=[0], 25 | coupling_states=coupling_states, 26 | geom=geom, 27 | ) 28 | 29 | lot2 = xTB_lot.from_options( 30 | ID=1, 31 | states=[(1, 0)], 32 | gradient_states=[0], 33 | coupling_states=coupling_states, 34 | geom=geom, 35 | ) 36 | 37 | pes1 = PES.from_options( 38 | lot=lot1, 39 | multiplicity=1, 40 | ad_idx=0, 41 | ) 42 | 43 | pes2 = PES.from_options( 44 | lot=lot2, 45 | multiplicity=1, 46 | ad_idx=0, 47 | ) 48 | 49 | pes = Penalty_PES(PES1=pes1, PES2=pes2, lot=lot1) 50 | 51 | nifty.printcool('Building the topology') 52 | atom_symbols = manage_xyz.get_atoms(geom) 53 | ELEMENT_TABLE = elements.ElementData() 54 | atoms = [ELEMENT_TABLE.from_symbol(atom) for atom in atom_symbols] 55 | xyz1 = manage_xyz.xyz_to_np(geom) 56 | top1 = Topology.build_topology( 57 | xyz1, 58 | atoms, 59 | bondlistfile=None, 60 | ) 61 | 62 | nifty.printcool('Building Primitive Internal Coordinates') 63 | connect = False 64 | addtr = True 65 | addcart = False 66 | p1 = PrimitiveInternalCoordinates.from_options( 67 | xyz=xyz1, 68 | atoms=atoms, 69 | connect=connect, 70 | addtr=addtr, 71 | addcart=addcart, 72 | topology=top1, 73 | ) 74 | 75 | nifty.printcool('Building Delocalized Internal Coordinates') 76 | coord_obj1 = DelocalizedInternalCoordinates.from_options( 77 | xyz=xyz1, 78 | atoms=atoms, 79 | addtr=addtr, 80 | addcart=addcart, 81 | connect=connect, 82 | primitives=p1, 83 | ) 84 | 85 | nifty.printcool('Building the molecule') 86 | 87 | Form_Hessian = True 88 | initial = Molecule.from_options( 89 | geom=geom, 90 | PES=pes, 91 | coord_obj=coord_obj1, 92 | Form_Hessian=Form_Hessian, 93 | ) 94 | 95 | optimizer = eigenvector_follow.from_options( 96 | Linesearch='backtrack', # a step size algorithm 97 | OPTTHRESH=0.0005, # The gradrms threshold, this is generally easy to reach for large systems 98 | DMAX=0.01, # The initial max step size, will be adjusted if optimizer is doing well. 
Max is 0.5 99 | conv_Ediff=0.1, # convergence of difference energy 100 | conv_dE=0.1, # convergence of energy difference between optimization steps 101 | conv_gmax=0.005, # convergence of max gradient 102 | opt_cross=True, # use difference energy criteria to determine if you are at crossing 103 | ) 104 | 105 | print(' MECP optimization') 106 | geoms, energies = optimizer.optimize( 107 | molecule=initial, 108 | refE=initial.energy, 109 | opt_steps=150, # The max number of optimization steps, use a small number until you have your final sigma 110 | verbose=True, 111 | opt_type='UNCONSTRAINED', 112 | xyzframerate=1, 113 | ) 114 | 115 | print(f'{energies = }') 116 | assert energies[-1] == pytest.approx(-1.1495296961093118) 117 | print('Finished!') 118 | -------------------------------------------------------------------------------- /pyGSM/tests/test_pygsm.py: -------------------------------------------------------------------------------- 1 | """ 2 | Unit and regression test for the pyGSM package. 3 | """ 4 | 5 | # Import package, test suite, and other packages as needed 6 | import sys 7 | 8 | import pytest 9 | 10 | import pyGSM 11 | 12 | 13 | def test_pyGSM_imported(): 14 | """Sample test, will always pass so long as import statement worked.""" 15 | assert "pyGSM" in sys.modules 16 | -------------------------------------------------------------------------------- /pyGSM/utilities/__init__.py: -------------------------------------------------------------------------------- 1 | __all__ = ['block_matrix','block_tensor','elements','manage_xyz','math_utils','nifty','options','units'] 2 | 3 | from .block_matrix import block_matrix 4 | from .block_tensor import block_tensor 5 | -------------------------------------------------------------------------------- /pyGSM/utilities/block_tensor.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from scipy.linalg import block_diag 3 | from .nifty import printcool,pvec1d 4 | import sys 5 | from .math_utils import orthogonalize,conjugate_orthogonalize 6 | 7 | 8 | class block_tensor(object): 9 | 10 | def __init__(self,matlist,cnorms=None): 11 | self.matlist = matlist 12 | if cnorms is None: 13 | cnorms = np.zeros((self.shape[1],1)) 14 | self.cnorms=cnorms 15 | 16 | def __repr__(self): 17 | lines= [" block matrix: # blocks = {}".format(self.num_blocks)] 18 | count=0 19 | for m in self.matlist: 20 | lines.append(str(m)) 21 | count+=1 22 | if count>10: 23 | print('truncating printout') 24 | break 25 | return '\n'.join(lines) 26 | 27 | @staticmethod 28 | def full_matrix(A): 29 | return block_diag(*A.matlist) 30 | 31 | @property 32 | def num_blocks(self): 33 | return len(self.matlist) 34 | 35 | 36 | @staticmethod 37 | def zeros_like(BM): 38 | return block_tensor( [ np.zeros_like(A) for A in BM.matlist ] ) 39 | 40 | def __add__(self,rhs): 41 | print("adding") 42 | if isinstance(rhs, self.__class__): 43 | print("adding block matrices!") 44 | assert(self.shape == rhs.shape) 45 | return block_tensor( [A+B for A,B in zip(self.matlist,rhs.matlist) ] ) 46 | elif isinstance(rhs,float) or isinstance(rhs,int): 47 | return block_tensor( [A+rhs for A in self.matlist ]) 48 | else: 49 | raise NotImplementedError 50 | 51 | def __radd__(self,lhs): 52 | return self.__add__(lhs) 53 | 54 | def __mul__(self,rhs): 55 | if isinstance(rhs, self.__class__): 56 | assert(self.shape == rhs.shape) 57 | return block_tensor( [A*B for A,B in zip(self.matlist,rhs.matlist)] ) 58 | elif isinstance(rhs,float) or isinstance(rhs,int): 59 | 
return block_tensor( [A*rhs for A in self.matlist ]) 60 | else: 61 | raise NotImplementedError 62 | 63 | def __rmul__(self,lhs): 64 | return self.__mul__(lhs) 65 | 66 | def __len__(self): #size along first axis 67 | return np.sum([len(A) for A in self.matlist]) 68 | 69 | def __truediv__(self,rhs): 70 | if isinstance(rhs, self.__class__): 71 | assert(self.shape == rhs.shape) 72 | return block_tensor( [A/B for A,B in zip(self.matlist,rhs.matlist)] ) 73 | elif isinstance(rhs,float) or isinstance(rhs,int): 74 | return block_tensor( [A/rhs for A in self.matlist ]) 75 | elif isinstance(rhs,np.ndarray): 76 | answer = [] 77 | s=0 78 | for block in self.matlist: 79 | e=block.shape[1]+s 80 | answer.append(block/rhs[s:e]) 81 | s=e 82 | return block_tensor(answer) 83 | else: 84 | raise NotImplementedError 85 | 86 | 87 | @property 88 | def shape(self): 89 | tot = (0,0,0) 90 | for a in self.matlist: 91 | tot = tuple(map(sum,zip(a.shape,tot))) 92 | return tot 93 | 94 | @staticmethod 95 | def transpose(A): 96 | return block_tensor( [ A.T for A in A.matlist] ) 97 | 98 | @staticmethod 99 | def dot(left,right): 100 | def block_vec_dot(block,vec): 101 | if vec.ndim==2 and vec.shape[1]==1: 102 | vec = vec.flatten() 103 | #if block.cnorms is None: 104 | s=0 105 | result=[] 106 | for A in block.matlist: 107 | e = s + np.shape(A)[1] 108 | result.append(np.dot(A,vec[s:e])) 109 | s=e 110 | return np.reshape(np.concatenate(result),(-1,1)) 111 | def vec_block_dot(vec,block,**kwargs): 112 | if vec.ndim==2 and vec.shape[1]==1: 113 | vec = vec.flatten() 114 | #if block.cnorms is None: 115 | s=0 116 | result=[] 117 | for A in block.matlist: 118 | e = s + np.shape(A)[1] 119 | result.append(np.dot(vec[s:e],A)) 120 | s=e 121 | return np.reshape(np.concatenate(result),(-1,1)) 122 | 123 | # (1) both are block matrices 124 | if isinstance(left,block_tensor) and isinstance(right,block_tensor): 125 | return block_tensor([np.dot(A,B) for A,B in zip(left.matlist,right.matlist)]) 126 | # (2) left is np.ndarray with a vector shape 127 | elif isinstance(left,np.ndarray) and (left.ndim==1 or left.shape[1]==1) and isinstance(right,block_tensor): 128 | return vec_block_dot(left,right) 129 | # (3) right is np.ndarray with a vector shape 130 | elif isinstance(right,np.ndarray) and (right.ndim==1 or right.shape[1]==1) and isinstance(left,block_tensor): 131 | return block_vec_dot(left,right) 132 | # (4) l/r is a matrix 133 | elif isinstance(left,np.ndarray) and left.ndim==2: 134 | # 135 | # [ A | B ] [ C 0 ] = [ AC BD ] 136 | # [ 0 D ] 137 | sc=0 138 | tmp_ans=[] 139 | for A in right.matlist: 140 | ec = sc+A.shape[0] 141 | tmp_ans.append(np.dot(left[:,sc:ec],A)) 142 | sc=ec 143 | dot_product=np.hstack(tmp_ans) 144 | return dot_product 145 | 146 | elif isinstance(right,np.ndarray) and right.ndim==2: 147 | # 148 | # [ A | 0 ] [ C ] = [ AC ] 149 | # [ 0 | B ] [ D ] [ BD ] 150 | sc=0 151 | tmp_ans=[] 152 | for A in left.matlist: 153 | ec = sc+A.shape[1] 154 | tmp_ans.append(np.dot(A,right[sc:ec,:])) 155 | sc=ec 156 | dot_product=np.vstack(tmp_ans) 157 | return dot_product 158 | else: 159 | raise NotImplementedError 160 | 161 | 162 | #if __name__=="__main__": 163 | 164 | #A = [np.array([[1,2],[3,4]]), np.array([[5,6],[7,8]])] 165 | #B = [np.array([[1,2],[3,4]]), np.array([[5,6],[7,8]])] 166 | #Ab = bm(A) 167 | #Bb = bm(B) 168 | # 169 | #print("A") 170 | #print(Ab) 171 | # 172 | #print("B") 173 | #print(Bb) 174 | # 175 | ## test 1 176 | #print("test 1 adding block matrices") 177 | #Cb = Ab+Bb 178 | #print(Cb) 179 | # 180 | #print("test 2 adding 
block matrix and float") 181 | #Db = Ab+2 182 | #print(Db) 183 | # 184 | #print("test 3 reversing order of addition") 185 | #Eb = 2+Ab 186 | #print(Eb) 187 | # 188 | #print("test 4 block multiplication") 189 | #Fb = Ab*Bb 190 | #print(Fb) 191 | # 192 | #print("test 5 block multiplication by scalar") 193 | #Gb = Ab*2 194 | #print(Gb) 195 | # 196 | #print("test 6 reverse block mult by scalar") 197 | #Hb = 2*Ab 198 | #print(Hb) 199 | # 200 | #print("test 7 total len") 201 | #print(len(Hb)) 202 | # 203 | #print("test 8 shape") 204 | #print(Hb.shape) 205 | # 206 | #print("test dot product with block matrix") 207 | #Ib = bm.dot(Ab,Bb) 208 | #print(Ib) 209 | # 210 | #print("test dot product with np vector") 211 | #Jb = bm.dot(Ab,np.array([1,2,3,4])) 212 | #print(Jb) 213 | # 214 | #print("Test dot product with np 2d vector shape= (x,1)") 215 | #a = np.array([[1,2,3,4]]).T 216 | #Kb = bm.dot(Ab,a) 217 | #print(Kb) 218 | # 219 | #print("test dot product with non-block array") 220 | #fullmat = np.random.randint(5,size=(4,4)) 221 | #print(" full mat to mult") 222 | #print(fullmat) 223 | #A = [np.array([[1,2,3],[4,5,6]]), np.array([[7,8,9],[10,11,12]])] 224 | #Ab = bm(A) 225 | #print(" Ab") 226 | #print(bm.full_matrix(Ab)) 227 | #print('result') 228 | #Mb = np.dot(fullmat,bm.full_matrix(Ab)) 229 | #print(Mb) 230 | #Lb = bm.dot(fullmat,Ab) 231 | #print('result of dot product with full mat') 232 | #print(Lb) 233 | #print(Lb == Mb) 234 | # 235 | #print("test dot product with non-block array") 236 | #print(" full mat to mult") 237 | #print(fullmat) 238 | #print(" Ab") 239 | #print(bm.full_matrix(Ab)) 240 | #print('result') 241 | #A = [ np.array([[1,2],[3,4],[5,6]]),np.array([[7,8],[9,10],[11,12]])] 242 | #Ab = bm(A) 243 | #print(Ab.shape) 244 | #print(fullmat.shape) 245 | #Mb = np.dot(bm.full_matrix(Ab),fullmat) 246 | #print(Mb) 247 | #Lb = bm.dot(Ab,fullmat) 248 | #print('result of dot product with full mat') 249 | #print(Lb) 250 | #print(Lb == Mb) 251 | # 252 | -------------------------------------------------------------------------------- /pyGSM/utilities/cli_utils.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | from pyGSM.coordinate_systems import Angle, Dihedral, Distance, OutOfPlane 3 | 4 | def get_driving_coord_prim(dc): 5 | prim = None 6 | if "ADD" in dc or "BREAK" in dc: 7 | if dc[1] < dc[2]: 8 | prim = Distance(dc[1] - 1, dc[2] - 1) 9 | else: 10 | prim = Distance(dc[2] - 1, dc[1] - 1) 11 | elif "ANGLE" in dc: 12 | if dc[1] < dc[3]: 13 | prim = Angle(dc[1] - 1, dc[2] - 1, dc[3] - 1) 14 | else: 15 | prim = Angle(dc[3] - 1, dc[2] - 1, dc[1] - 1) 16 | elif "TORSION" in dc: 17 | if dc[1] < dc[4]: 18 | prim = Dihedral(dc[1] - 1, dc[2] - 1, dc[3] - 1, dc[4] - 1) 19 | else: 20 | prim = Dihedral(dc[4] - 1, dc[3] - 1, dc[2] - 1, dc[1] - 1) 21 | elif "OOP" in dc: 22 | # if dc[1] 1e-5 and (abs(w) > 1e-6).any(): 166 | try: 167 | basis[:, count] = w/wnorm 168 | count += 1 169 | except: 170 | print("this vector should be vanishing, exiting") 171 | print("norm=", wnorm) 172 | print(w) 173 | exit(1) 174 | dots = np.linalg.multi_dot([basis.T, G, basis]) 175 | if not (np.allclose(dots, np.eye(dots.shape[0], dtype=float))): 176 | print("np.dot(b.T,b)") 177 | print(dots) 178 | raise RuntimeError("error in orthonormality") 179 | return basis 180 | 181 | 182 | # TODO cVecs can be orthonormalized first to make it less confusing 183 | # since they are being added to basis before being technically orthonormal 184 | def orthogonalize(vecs, 
numCvecs=0): 185 | """ 186 | """ 187 | 188 | # print("in orthogonalize") 189 | rows = vecs.shape[0] 190 | cols = vecs.shape[1] 191 | basis = np.zeros((rows, cols-numCvecs)) 192 | 193 | # for i in range(numCvecs): # orthogonalize with respect to these 194 | # basis[:,i]= vecs[:,i] 195 | 196 | # count=numCvecs-1 197 | count = 0 198 | for v in vecs.T: 199 | w = v - np.sum(np.dot(v, b)*b for b in basis.T) 200 | wnorm = np.linalg.norm(w) 201 | # print("wnorm {} count {}".format(wnorm,count)) 202 | if wnorm > 1e-3 and (abs(w) > 1e-6).any(): 203 | try: 204 | basis[:, count] = w/wnorm 205 | count += 1 206 | except: 207 | print("this vector should be vanishing, exiting") 208 | print("norm=", wnorm) 209 | # print(w) 210 | exit(1) 211 | dots = np.matmul(basis.T, basis) 212 | if not (np.allclose(dots, np.eye(dots.shape[0], dtype=float), atol=1e-4)): 213 | print("np.dot(b.T,b)") 214 | # print(dots) 215 | print(dots - np.eye(dots.shape[0], dtype=float)) 216 | raise RuntimeError("error in orthonormality") 217 | return basis 218 | -------------------------------------------------------------------------------- /pyGSM/utilities/options.py: -------------------------------------------------------------------------------- 1 | import collections 2 | 3 | 4 | class Option(object): 5 | 6 | """ Class Option represents a key, value Option, with possible restrictions 7 | on type and value, and with documentation. 8 | """ 9 | 10 | def __init__( 11 | self, 12 | key=None, 13 | value=None, 14 | required=False, 15 | allowed_types=None, 16 | allowed_values=None, 17 | doc="", 18 | ): 19 | """ Option constructor: 20 | 21 | Params/Members: 22 | key - the string value of the key of the option. 23 | value - the value of the option. 24 | required - True if the user is required to specify the option, 25 | False if the default value is acceptable. If required is True, 26 | an error will be raised when get_value is called if value is 27 | None. 28 | allowed_types - array of allowed types of value. If this field is 29 | not None, a check will be performed on each call to set_value 30 | and get_value to ensure that value isinstance of at least one 31 | of the allowed_types. 32 | allowed_values - list of allowed values of value. If this field is 33 | not None, a check will be performed on each call to set_value 34 | and get_value to ensure the value is one of the allowed_values. 35 | doc - string message providing helpful documentation about the 36 | option. 37 | """ 38 | 39 | self.key = key 40 | self.value = value 41 | self.required = required 42 | self.allowed_types = allowed_types 43 | self.allowed_values = allowed_values 44 | self.doc = doc 45 | 46 | def get_value(self): 47 | """ Get value for this Option and check validity. 48 | 49 | Returns: 50 | value if the Option is in a valid state, else raises RuntimeError. 51 | """ 52 | 53 | if self.required and self.value is None: 54 | raise RuntimeError("Option %s is required" % self.key) 55 | 56 | return self.value 57 | 58 | def set_value(self, value): 59 | """ Set value for this Option and check validity. 60 | 61 | Result: 62 | value is updated if Option is valid, else raises RuntimeError. 
63 | """ 64 | 65 | # Short-circuit if the value is None and the option is not required 66 | if value is None and not self.required: 67 | self.value = value 68 | return 69 | 70 | if self.allowed_types and not any(isinstance(value, x) for x in self.allowed_types): 71 | raise RuntimeError("Option %s must be one of allowed types: %s" % ( 72 | self.key, self.allowed_types)) 73 | if self.allowed_values and value not in self.allowed_values: 74 | raise RuntimeError("Option %s must be one of allowed values: %r" % ( 75 | self.key, self.allowed_values)) 76 | 77 | self.value = value 78 | 79 | def __str__(self): 80 | """ Return a string containing the full contents and documentation of this Option. """ 81 | 82 | s = '' 83 | s += 'Option:\n' 84 | s += ' Key: %s\n' % self.key 85 | s += ' Value: %s\n' % self.value 86 | s += ' Required: %s\n' % self.required 87 | s += ' Allowed Types: %s\n' % self.allowed_types 88 | s += ' Allowed Values: %s\n' % self.allowed_values 89 | s += ' Doc: %s\n' % self.doc 90 | s += '\n' 91 | return s 92 | 93 | 94 | class Options(object): 95 | 96 | """ Class Options represents a dict of key, value Option objects, including 97 | restrictions on type, value, etc. Users should interact only with the 98 | Options class - the Option class is used as internal data storage. 99 | 100 | Generally, codes declaring and using Options objects should first 101 | define the valid options, rules, and defaults by using the "add_option" 102 | method. Then, when the user wishes to set or get the values of specific 103 | options, a copy of the Options object should be provided for the user 104 | by calling the "copy" method of Options. 105 | 106 | The underlying Option objects are stored in the options field, which is 107 | presently a collections.OrderedDict to remember the order of Option 108 | declaration. This incurs a 2x performance penalty vs. a standard dict 109 | object, so we may want to optimize the performance later. 110 | """ 111 | 112 | def __init__( 113 | self, 114 | options=None, 115 | ): 116 | """ Options constructor. 117 | 118 | Params/Members: 119 | - options - dict of key -> Option 120 | """ 121 | 122 | if options is None: 123 | self.options = collections.OrderedDict() 124 | else: 125 | self.options = options 126 | 127 | def keys(self): 128 | keys = [] 129 | for opt in self.options: 130 | keys.append(opt) 131 | return keys 132 | 133 | def add_option( 134 | self, 135 | **kwargs 136 | ): 137 | """ Declare a new Option with possible default value, type and value 138 | rules, and documentation. 139 | 140 | Params: See Option constructor for valid kwargs 141 | Result: Options updated with new Option corresponding to key 142 | """ 143 | 144 | self.options[kwargs['key']] = Option( 145 | **kwargs 146 | ) 147 | 148 | def get_option( 149 | self, 150 | key, 151 | ): 152 | """ Get the Option corresponding to key (useful for doc searching an debugging). 153 | 154 | Params: 155 | - key - string key of Option (raises RuntimeError if not in Options) 156 | Returns: 157 | - option - the explicit Option object (most users instead want the 158 | *value* of this object, which should be accessed through the 159 | __getitem__ method below). 160 | """ 161 | 162 | if key not in self.options: 163 | raise ValueError("Key %s is not in Options" % key) 164 | return self.options[key] 165 | 166 | def __getitem__( 167 | self, 168 | key, 169 | ): 170 | """ Get the current value of Option corresponding to key, performing validity checks. 
171 | 
172 |         Params:
173 |             - key - string key of Option (raises ValueError if not in Options)
174 |         Returns:
175 |             - value - value of Option (raises RuntimeError if type, value or other validity error).
176 |         """
177 | 
178 |         if key not in self.options:
179 |             raise ValueError("Key %s is not in Options" % key)
180 |         return self.options[key].get_value()
181 | 
182 |     def __setitem__(
183 |         self,
184 |         key,
185 |         value,
186 |     ):
187 |         """ Set the value of Option corresponding to key, performing validity checks.
188 | 
189 |         Params:
190 |             - key - string key of Option (raises ValueError if not in Options)
191 |             - value - value of Option (raises RuntimeError if type, value or other validity error).
192 |         Result:
193 |             - Option value is updated if valid.
194 |         """
195 | 
196 |         if key not in self.options:
197 |             raise ValueError("Key %s is not in Options" % key)
198 |         return self.options[key].set_value(value)
199 | 
200 |     def set_values(
201 |         self,
202 |         options,
203 |     ):
204 |         """ Set the values of multiple options.
205 | 
206 |         Params:
207 |             - options - dict of key, value pairs to set (calls __setitem__ once
208 |               per key, value pair).
209 |         Results:
210 |             - Option values are updated if valid.
211 |         """
212 | 
213 |         for k, v in options.items():
214 |             self[k] = v
215 |         return self
216 | 
217 |     def copy(self):
218 |         """ Return a 1-level shallow copy of this Options object. This makes
219 |         copies of all underlying Option objects so that changes to the new
220 |         Options object will not affect the original Options object.
221 |         """
222 | 
223 |         options2 = collections.OrderedDict()
224 |         for k, v in self.options.items():
225 |             options2[k] = Option(**v.__dict__)
226 |         return Options(options=options2)
227 | 
228 |     def __str__(self):
229 |         """ Return the string representations of all Option objects in this Options, in insertion order. """
230 |         s = ''.join(str(v) for v in list(self.options.values()))
231 |         return s
232 | 
233 | 
234 | if __name__ == '__main__':
235 | 
236 |     import time
237 | 
238 |     print(" this demonstrates options")
239 | 
240 |     start = time.time()
241 |     options1 = Options()
242 |     for k in range(500):
243 |         options1.add_option(
244 |             key='size%d' % k,
245 |             value=0,
246 |             allowed_types=[int],
247 |             allowed_values=[0, 1],
248 |         )
249 | 
250 |     start = time.time()
251 |     options2 = options1.copy()
252 |     print('copy time %11.3E' % (time.time() - start))
253 | 
254 |     start = time.time()
255 |     options3 = Options()
256 |     options3.add_option(
257 |         key='size',
258 |         value=0,
259 |         allowed_types=[int],
260 |         allowed_values=[0, 1],
261 |     )
262 |     options4 = options3.copy()
263 |     print('%11.3E' % (time.time() - start))
264 | 
265 |     start = time.time()
266 |     options3 = Options()
267 |     options3.add_option(
268 |         key='size',
269 |         value=0,
270 |         allowed_types=[int],
271 |         allowed_values=[0, 1],
272 |     )
273 |     options4 = options3.copy()
274 |     print('%11.3E' % (time.time() - start))
275 | 
276 |     start = time.time()
277 |     options3 = Options()
278 |     options3.add_option(
279 |         key='size',
280 |         value=0,
281 |         allowed_types=[int],
282 |         allowed_values=[0, 1],
283 |     )
284 |     options4 = options3.copy()
285 |     print('%11.3E' % (time.time() - start))
286 | 
287 |     options4.set_values({'size': 1})
288 |     print(options4)
289 | 
--------------------------------------------------------------------------------
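With the Option/Options classes above in scope, a short usage sketch (the key names here are hypothetical, chosen only for illustration):

opts = Options()
opts.add_option(key='nnodes', value=9, allowed_types=[int],
                doc='number of nodes on the string')

user_opts = opts.copy()           # copy so the declared defaults stay untouched
user_opts.set_values({'nnodes': 11})
print(user_opts['nnodes'])        # -> 11

try:
    user_opts['nnodes'] = 'many'  # wrong type
except RuntimeError as err:
    print(err)                    # Option nnodes must be one of allowed types: [<class 'int'>]

/pyGSM/utilities/random_quotes.py:
--------------------------------------------------------------------------------
1 | import random
2 | 
3 | path_quotes = [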
4 |     '"The path to success is to take massive, determined action." ~ Tony Robbins',
5 |     '"In essence, if we want to direct our lives, we must take control of our consistent actions. It\'s not what we do once in a while that shapes our lives, but what we do consistently." ~ Tony Robbins',
6 |     '"The vision must be followed by the venture. It is not enough to stare up the steps - we must step up the stairs." ~ Vance Havner',
7 |     '"Whatever you can do, or dream you can do, begin it. Boldness has genius, power and magic in it. Begin it now." ~ Goethe',
8 |     '"Traveling through hyperspace ain\'t like dusting crops, boy!" ~ Han Solo',
9 | ]
10 | 
11 | funny_quotes = [
12 |     '"It\'ll take a few moments to get the coordinates from the navicomputer." ~ Han Solo',
13 |     '"[points to an alarm on the control panel] What\'s that flashing?" ~ Han Solo',
14 |     '"How long before you make the jump to lightspeed?" ~ Ben Kenobi',
15 |     '"[frantic] Are you kidding? At the rate they\'re gaining—" ~ Luke Skywalker',
16 |     '"I don\'t like sand. It\'s all coarse, and rough, and irritating. And it gets everywhere." ~ Anakin Skywalker',
17 | ]
18 | 
19 | 
20 | fail_quotes = [
21 |     '"Kicking ass takes getting your ass kicked" ~ Jason Calacanis',
22 |     '"The perfect is the enemy of the good." ~ Voltaire',
23 |     '"In order to succeed, people need a sense of self-efficacy, to struggle together with resilience to meet the inevitable obstacles and inequities of life." ~ Albert Bandura',
24 |     '"Only to the extent that we expose ourselves over and over to annihilation can that which is indestructible in us be found." ~ Pema Chodron',
25 |     '"Most people have no idea of the giant capacity we can immediately command when we focus all of our resources on mastering a single area of our lives." ~ Tony Robbins',
26 |     '"Success comes from taking the initiative and following up... persisting... eloquently expressing the depth of your love. What simple action could you take today to produce a new momentum toward success in your life?" ~ Tony Robbins',
27 |     '"The less effort, the faster and more powerful you will be." ~ Bruce Lee',
28 |     '"Focus is a matter of deciding what things you are not going to do" ~ John Carmack',
29 |     '"If you spend too much time thinking about a thing, you\'ll never get it done." ~ Bruce Lee',
" ~ Bruce Lee', 30 | ] 31 | 32 | print(random.choice(fail_quotes)) 33 | -------------------------------------------------------------------------------- /pyGSM/utilities/units.py: -------------------------------------------------------------------------------- 1 | """ 2 | Units and constants pulled from NIST website 01/03/2017 (CODATA 2014) 3 | 4 | Usage goes as 5 | y(bohr radii) = x(Angstrom) * ANGSTROM_TO_AU 6 | x(Angstrom) = y(bohr radii) / ANGSTROM_TO_AU 7 | """ 8 | import re 9 | 10 | # Constants 11 | AVOGADROS_NUMBER = 6.022140857E+23 # (mol^{-1}) 12 | BOLTZMANN_CONSTANT_SI = 1.38064852E-23 # (J / K) 13 | HBAR_SI = 1.054571800E-34 # (J s) 14 | SPEED_OF_LIGHT_SI = 299792458.0 # (m / s) 15 | 16 | # Length 17 | M_PER_AU = 0.52917721067E-10 # (m / a_0) 18 | M_TO_AU = 1.0/M_PER_AU # (a_0 / m) 19 | ANGSTROM_TO_AU = 1.0E-10*M_TO_AU # (a_0 / A) 20 | 21 | # Dipole 22 | DEBYE_TO_AU = 0.393430307 23 | AU_TO_DEBYE = 1.0 / DEBYE_TO_AU 24 | 25 | # couloumb 26 | ELECTRON_TO_COULOMB = 6.24150907E18 27 | 28 | 29 | # Mass 30 | KG_PER_AU = 9.10938356E-31 # (kg / m_e) 31 | KG_TO_AU = 1.0/KG_PER_AU # (m_e / kg) 32 | AMU_TO_AU = 1.0E-3/AVOGADROS_NUMBER*KG_TO_AU # (m_e / amu) 33 | 34 | # Time 35 | S_PER_AU = 2.418884326509E-17 # (s / aut) 36 | S_TO_AU = 1.0/S_PER_AU # (aut / s) 37 | FS_TO_AU = S_TO_AU * 1.0E-15 # (aut / fs) 38 | PS_TO_AU = S_TO_AU * 1.0E-12 # (aut / ps) 39 | 40 | # custom 41 | AKMA_TO_PS = .04888821290839616117 # ( ps / AKMA) sqrt(1./(100*KJ_PER_KCAL)) 42 | PS_TO_AKMA = 1./AKMA_TO_PS 43 | 44 | # Energy/temperature 45 | J_PER_AU = 4.359744650E-18 # (J / E_h) 46 | J_TO_AU = 1.0/J_PER_AU # (E_h / J) 47 | 48 | K_TO_AU = BOLTZMANN_CONSTANT_SI*J_TO_AU # (E_h / K) 49 | K_PER_AU = 1.0/K_TO_AU # (K / E_h) 50 | 51 | EV_PER_AU = 27.21138602 # (eV / E_h) 52 | EV_TO_AU = 1.0/EV_PER_AU # (E_h / eV) 53 | 54 | KJ_MOL_PER_AU = J_PER_AU * 1.0E-3 * AVOGADROS_NUMBER # ((kJ/mol) / E_h) 55 | KJ_MOL_TO_AU = 1.0/KJ_MOL_PER_AU # (E_h / (kJ/mol)) 56 | 57 | KJ_PER_KCAL = 4.184 # (kJ / kcal) 58 | KCAL_MOL_PER_AU = KJ_MOL_PER_AU / KJ_PER_KCAL # ((kcal/mol) / E_h) 59 | KCAL_MOL_TO_AU = 1.0/KCAL_MOL_PER_AU # (E_h / (kcal/mol)) 60 | 61 | INV_CM_PER_AU = 219474.6313702 # (cm^{-1} / E_h) 62 | INV_CM_TO_AU = 1.0/INV_CM_PER_AU # (E_h / cm^{-1}) 63 | 64 | # Force 65 | NEWTON_PER_AU = 8.23872336E-8 # (N / auf) 66 | NEWTON_TO_AU = 1.0 / NEWTON_PER_AU # (auf / N) 67 | NANONEWTON_TO_AU = 1e-9 * NEWTON_TO_AU # (auf / nN) 68 | 69 | # AMBER Units (http://ambermd.org/Questions/units.html) 70 | AMBERLENGTH_TO_AU = ANGSTROM_TO_AU 71 | AMBERMASS_TO_AU = AMU_TO_AU 72 | AMBERENERGY_TO_AU = KCAL_MOL_TO_AU 73 | AMBERTIME_TO_AU = 1.0/20.455*PS_TO_AU 74 | AMBERVELOCITY_TO_AU = AMBERLENGTH_TO_AU/AMBERTIME_TO_AU 75 | AMBERCHARGE_TO_AU = 1.0/18.2223 76 | 77 | # Print all unit conversions 78 | if __name__ == '__main__': 79 | conversions = dict(locals()) 80 | for key, val in conversions.items(): 81 | if key[0] != '_': 82 | print((" Conversion: % 22s, Value: %11.11E" % (key, val))) 83 | 84 | 85 | units = { 86 | 'au_per_amu': 1.8228884855409500E+03, # mass 87 | 'au_per_cminv': 1.0 / 219474.6305, # ??? 
88 |     'au_per_ang': 1.0 / 0.5291772109217,  # length
89 |     'au_per_K': 1.0 / 3.1577464E5,  # temperature
90 |     'au_per_fs': 1.0 / 2.418884326505E-2,  # time
91 | }
92 | for k in list(units.keys()):
93 |     v = units[k]
94 |     mobj = re.match(r'(\S+)_per_(\S+)', k)
95 |     units['%s_per_%s' % (mobj.group(2), mobj.group(1))] = 1.0 / v
96 | 
--------------------------------------------------------------------------------
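A worked example of the conversion convention documented at the top of units.py (multiply to enter atomic units, divide to leave them), using the constants defined above:

bond_angstrom = 1.54                         # a C-C single bond
bond_bohr = bond_angstrom * ANGSTROM_TO_AU   # ~2.91 a_0
assert abs(bond_bohr / ANGSTROM_TO_AU - bond_angstrom) < 1e-12

barrier_hartree = 20.0 * KCAL_MOL_TO_AU      # 20 kcal/mol is ~0.0319 E_h

/pyproject.toml:
--------------------------------------------------------------------------------
1 | [build-system]
2 | requires = ["setuptools>=61.0", "versioningit~=2.0"]
3 | build-backend = "setuptools.build_meta"
4 | 
5 | # Self-descriptive entries which should always be present
6 | # https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
7 | [project]
8 | name = "pyGSM"
9 | description = "Reaction path and photochemistry tool"
10 | dynamic = ["version"]
11 | readme = "README.md"
12 | authors = [
13 |     { name = "Cody Aldaz", email = "codyaldaz@gmail.com" }
14 | ]
15 | license = { text = "MIT" }
16 | # See https://pypi.org/classifiers/
17 | classifiers = [
18 |     "License :: OSI Approved :: MIT License",
19 |     "Programming Language :: Python :: 3",
20 | ]
21 | requires-python = ">=3.8"
22 | # Declare any run-time dependencies that should be installed with the package.
23 | dependencies = [
24 |     "importlib-resources;python_version<'3.10'",
25 |     "numpy>1.11",
26 |     "six",
27 |     "scipy",
28 |     "matplotlib",
29 |     "networkx",
30 |     "pytest-cov",
31 |     "xtb",
32 |     "sphinx",
33 |     "sphinx_rtd_theme",
34 |     "typing-extensions",
35 | ]
36 | 
37 | # Update the urls once the hosting is set up.
38 | #[project.urls]
39 | #"Source" = "https://github.com//pyGSM/"
40 | #"Documentation" = "https://pyGSM.readthedocs.io/"
41 | 
42 | [project.optional-dependencies]
43 | test = [
44 |     "pytest>=6.1.2",
45 |     "pytest-runner"
46 | ]
47 | 
48 | [tool.setuptools]
49 | # This subkey is in beta-stage development and keys may change in the future; see https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html for more details
50 | #
51 | # As of version 0.971, mypy does not support type checking of installed zipped
52 | # packages (because it does not actually import the Python packages).
53 | # We declare the package not-zip-safe so that our type hints are also available
54 | # when checking client code that uses our (installed) package.
55 | # Ref:
56 | # https://mypy.readthedocs.io/en/stable/installed_packages.html?highlight=zip#using-installed-packages-with-mypy-pep-561
57 | zip-safe = false
58 | # Let setuptools discover the package in the current directory,
59 | # but be explicit about non-Python files.
60 | # See also:
61 | # https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html#setuptools-specific-configuration
62 | # Note that behavior is currently evolving with respect to how to interpret the
63 | # "data" and "tests" subdirectories. As of setuptools 63, both are automatically
64 | # included if namespaces is true (default), even if the package is named explicitly
65 | # (instead of using 'find'). With 'find', the 'tests' subpackage is discovered
66 | # recursively because of its __init__.py file, but the data subdirectory is excluded
67 | # with include-package-data = false and namespaces = false.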
68 | include-package-data = false
69 | [tool.setuptools.packages.find]
70 | namespaces = false
71 | where = ["."]
72 | 
73 | # Ref https://setuptools.pypa.io/en/latest/userguide/datafiles.html#package-data
74 | [tool.setuptools.package-data]
75 | pyGSM = [
76 |     "py.typed"
77 | ]
78 | 
79 | [tool.versioningit]
80 | default-version = "1+unknown"
81 | 
82 | [tool.versioningit.format]
83 | distance = "{base_version}+{distance}.{vcs}{rev}"
84 | dirty = "{base_version}+{distance}.{vcs}{rev}.dirty"
85 | distance-dirty = "{base_version}+{distance}.{vcs}{rev}.dirty"
86 | 
87 | [tool.versioningit.vcs]
88 | # The method key:
89 | method = "git"  # <- The method name
90 | # Parameters to pass to the method:
91 | match = ["*"]
92 | default-tag = "1.0.0"
93 | 
94 | [tool.versioningit.write]
95 | file = "pyGSM/_version.py"
96 | 
97 | [tool.ruff.format]
98 | quote-style = "single"
99 | 
--------------------------------------------------------------------------------
/readthedocs.yml:
--------------------------------------------------------------------------------
1 | # readthedocs.yml
2 | 
3 | version: 2
4 | 
5 | build:
6 |   image: latest
7 | 
8 | python:
9 |   version: 3.8
10 |   install:
11 |     - method: pip
12 |       path: .
13 | 
14 | conda:
15 |   environment: docs/requirements.yaml
--------------------------------------------------------------------------------
/setup.cfg:
--------------------------------------------------------------------------------
1 | # Helper file to handle all configs
2 | 
3 | # setuptools configuration.
4 | # see https://setuptools.pypa.io/en/latest/userguide/declarative_config.html
5 | [metadata]
6 | # Self-descriptive entries which should always be present
7 | name = pyGSM
8 | author = Cody Aldaz
9 | author_email = codyaldaz@gmail.com
10 | description = Reaction path and photochemistry tool
11 | long_description = file: README.md
12 | long_description_content_type = text/markdown
13 | version = attr: pyGSM.__version__
14 | license = MIT
15 | python_requires = >=3.8
16 | # See https://pypi.org/classifiers/
17 | classifiers =
18 |     License :: OSI Approved :: MIT License
19 |     Programming Language :: Python :: 3
20 | # Update the urls and uncomment, once the hosting is set up.
21 | #project_urls =
22 | #    Source = https://github.com//pyGSM/
23 | #    Documentation = https://pyGSM.readthedocs.io/
24 | # Other possible metadata.
25 | # Ref https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
26 | #keywords = one, two
27 | #platforms = ["Linux",
28 | #             "Mac OS-X",
29 | #             "Unix",
30 | #             "Windows"]
31 | 
32 | ## CRA Commented out
33 | # [options]
34 | # # As of version 0.971, mypy does not support type checking of installed zipped
35 | # # packages (because it does not actually import the Python packages).
36 | # # We declare the package not-zip-safe so that our type hints are also available
37 | # # when checking client code that uses our (installed) package.
38 | # # Ref:
39 | # # https://mypy.readthedocs.io/en/stable/installed_packages.html?highlight=zip#using-installed-packages-with-mypy-pep-561
40 | # zip_safe = False
41 | # install_requires =
42 | #     importlib-resources; python_version<"3.10"
43 | # tests_require =
44 | #     pytest>=6.1.2
45 | #     pytest-runner
46 | # # Which Python importable modules should be included when your package is installed
47 | # # Handled automatically by setuptools. Use 'exclude' to prevent some specific
48 | # # subpackage(s) from being added, if needed
49 | # packages = find:
50 | # # Alternatively, see: https://setuptools.pypa.io/en/latest/userguide/declarative_config.html#using-a-src-layout
51 | # #package_dir =
52 | # #    =src
53 | # [options.packages.find]
54 | # where = .
55 | # 
56 | # # Optionally, include package data to ship with your package
57 | # # Customize MANIFEST.in if the general case does not suit your needs
58 | # [options.package_data]
59 | # pyGSM = py.typed
60 | 
61 | [coverage:run]
62 | # .coveragerc to control coverage.py and pytest-cov
63 | omit =
64 |     # Omit the tests
65 |     */tests/*
66 |     # Omit the generated version file
67 |     pyGSM/_version.py
68 | 
69 | [yapf]
70 | # YAPF, in .style.yapf files this shows up as "[style]" header
71 | COLUMN_LIMIT = 119
72 | INDENT_WIDTH = 4
73 | USE_TABS = False
74 | 
75 | [flake8]
76 | # Flake8, PyFlakes, etc
77 | max-line-length = 119
78 | 
79 | [aliases]
80 | test = pytest
81 | 
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | from setuptools import setup
2 | if __name__ == '__main__':
3 |     setup(
4 |         scripts=['bin/gsm']
5 |     )
6 | 
--------------------------------------------------------------------------------
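Finally, a hedged sketch of how the pieces above fit together: assuming an editable install with the test extra (pip install -e '.[test]'), the suite under pyGSM/tests can be run programmatically with pytest, picking up the coverage settings from setup.cfg:

import pytest

# Equivalent to "pytest pyGSM/tests --cov=pyGSM" on the command line;
# pytest-cov is declared as a dependency in pyproject.toml.
exit_code = pytest.main(["pyGSM/tests", "--cov=pyGSM"])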