├── .gitignore ├── Dockerfile ├── README.md ├── misc └── convert_formulas.py ├── requirements-dev.txt ├── requirements.txt ├── scripts └── salt-shaker ├── setup.cfg ├── setup.py ├── shaker ├── __init__.py ├── command_line.py ├── libs │ ├── __init__.py │ ├── errors.py │ ├── github.py │ ├── logger.py │ ├── metadata.py │ └── pygit2_utils.py ├── salt_shaker.py ├── shaker_metadata.py └── shaker_remote.py └── tests ├── __init__.py ├── libs ├── __init__.py ├── test_github.py ├── test_metadata.py └── test_pygit2_utils.py ├── test_shaker_metadata.py └── test_shaker_remote.py /.gitignore: -------------------------------------------------------------------------------- 1 | *.py[cod] 2 | spec_test/spec/localhost/*.rb 3 | env.sh 4 | formula-requirements.txt 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Packages 10 | *.swp 11 | *.egg 12 | *.egg-info 13 | dist 14 | build 15 | eggs 16 | parts 17 | bin 18 | var 19 | sdist 20 | develop-eggs 21 | .installed.cfg 22 | lib 23 | lib64 24 | __pycache__ 25 | MANIFEST 26 | 27 | # Installer logs 28 | pip-log.txt 29 | 30 | # Unit test / coverage reports 31 | .coverage 32 | .tox 33 | nosetests.xml 34 | 35 | # Translations 36 | *.mo 37 | 38 | # Mr Developer 39 | .mr.developer.cfg 40 | .project 41 | .pydevproject 42 | *~ 43 | .idea 44 | *.iml 45 | salt/_hacks 46 | vendor/ 47 | .vagrant/ 48 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM gyre007/salt-shaker:onbuild 2 | 3 | MAINTAINER WebOps MoJ 4 | 5 | ENTRYPOINT [ "python", "./src/shaker/shaker.py" ] 6 | 7 | CMD ["--help"] 8 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # salt-shaker 2 | 3 | ## Installation 4 | 5 | Opinionated saltstack formula dependency resolver 6 | 7 | Note: Install libgit2 with libssh2 support before installing 
this package. 8 | 9 | $ brew install libgit2 --with-libssh2 10 | 11 | 12 | ## Quickstart 13 | 14 | Salt shaker requires an initial config file containing the metadata for the local formula, e.g., 15 | 16 | ``` 17 | formula: my_organisation/local-formula 18 | 19 | dependencies: 20 | - some_organisation/test1-formula 21 | - another_organisation/testa-formula>=v1.0.0 22 | - another_organisation/testb-formula<=v4.0.0 23 | - another_organisation/testc-formula==v2.0.0 24 | ``` 25 | 26 | To generate and download a list of formula requirements, simply run 27 | 28 | salt-shaker install 29 | 30 | This will also save a list of the requirements and their versions, by default in 'formula-requirements.txt'. 31 | 32 | If this file exists, you can run 33 | 34 | salt-shaker install-pinned-versions 35 | 36 | to install the requirements with their versions pinned to those in the formula requirements file. 37 | 38 | You can also run a check to see what changes would be made to the formula-requirements file if an 39 | install were run. 40 | 41 | salt-shaker check 42 | 43 | This is useful to see if the dependency resolution chain has changed since versions 44 | were pinned. 45 | 46 | ## Introduction 47 | 48 | Salt shaker requires an initial config file containing the metadata for the local formula, e.g., 49 | 50 | ``` 51 | formula: my_organisation/local-formula 52 | 53 | exports: 54 | - local 55 | 56 | dependencies: 57 | - some_organisation/test1-formula 58 | - another_organisation/testa-formula>=v1.0.0 59 | - another_organisation/testb-formula<=v4.0.0 60 | - another_organisation/testc-formula==v2.0.0 61 | ``` 62 | 63 | Here, the name of the formula is set to be 'local-formula', with an organisation name of 'my_organisation'. 64 | This formula will have dependencies on the described formulas, based on the format 65 | 66 | organisation/formula-name(constraint) 67 | 68 | 69 | ### exports 70 | By default, the formula `organisation/name-formula` will be exposed to salt minions as `name`. 
So you can later refer to it 71 | in your sls files using: 72 | ``` 73 | include: 74 | - name 75 | ``` 76 | 77 | In some cases you might want to override the default export name or even expose multiple exports. In such a case, add to 78 | metadata.yml: 79 | ``` 80 | exports: 81 | - name1 82 | - name2 83 | ``` 84 | 85 | Make sure you have both subdirectories available in the formula: 86 | ``` 87 | \ 88 | + name1/ 89 | | + init.sls 90 | | 91 | + name2/ 92 | | + init.sls 93 | | 94 | + metadata.yml 95 | ``` 96 | 97 | And from now on your formula will supply both exports and you can refer to them with: 98 | ``` 99 | include: 100 | - name1 101 | - name2 102 | ``` 103 | 104 | 105 | ### Constraint Resolution 106 | The constraint is optional and can take the form ==, >= or <= followed by a version tag. Salt shaker will use these constraints and the constraints 107 | of any sub-dependencies found recursively on these dependencies, handling conflicts to try and resolve them all to a logically satisfiable single 108 | constraint. 109 | 110 | * '==' Equality takes priority over all other constraints; current equalities override any new ones 111 | * '>=' The highest greater-than bound takes precedence over the lower 112 | * '<=' The least less-than bound takes precedence over the higher 113 | * '>=, <=' Opposing constraints will throw an exception, although these may be resolvable in practice 114 | 115 | 116 | Constraints specified in the metadata file are parsed first, then these are sequentially processed, with the full dependency tree 117 | for that entry being parsed before moving on to the next metadata dependency entry. 118 | 119 | ### Process 120 | Salt shaker consists of two main processes. 
Firstly, a metadata resolver that can parse config files and generate a set of formulas with resolved dependencies. 121 | 122 | This dependency list can then be parsed and resolved into actual tags and SHAs on GitHub, downloaded into a local directory, and the set of formula 123 | requirements and their versions stored in a local file. This local file can be used as the base of future updates, so that the remote formulas' 124 | versions are in effect 'pinned'. 125 | 126 | ### Misc Options 127 | There are a few flags that can be passed to alter salt-shaker's behaviour. 128 | 129 | --enable-remote-check: This will force salt-shaker to contact the remote repository when using pinned versions, updating any 130 | SHAs that tags resolve to, meaning that if a tag was moved then the change would be picked up. With the default behaviour, 131 | tags are assumed to be immutable. 132 | 133 | --simulate: No-operation mode, where the full command specified will be run but no alterations will be made to any config files. 
134 | 135 | --root_dir: Specify the root directory for salt-shaker to work in 136 | 137 | --verbose, --debug: Increase the level of logging output from salt-shaker 138 | 139 | # Running the tests 140 | 141 | It's as simple as running this command: 142 | 143 | ``` 144 | python setup.py nosetests 145 | ``` 146 | -------------------------------------------------------------------------------- /misc/convert_formulas.py: -------------------------------------------------------------------------------- 1 | import urlparse 2 | import requests 3 | import json 4 | import os 5 | import os.path 6 | import sys 7 | import yaml 8 | import pygit2 9 | 10 | 11 | 12 | GH_TOKEN = os.environ.get('GITHUB_TOKEN', None) 13 | ORG = 'ministryofjustice' 14 | PR_TITLE = 'AUTOGENERATED-add_metadata' 15 | url = 'https://api.github.com/orgs/{0}/repos'.format(ORG) 16 | req_url = 'https://raw.githubusercontent.com/{0}/{1}/{2}/{3}' 17 | pr_url = 'https://api.github.com/repos/{0}/{1}/pulls' 18 | wanted_tag = 'master' 19 | reqs_file = 'formula-requirements.txt' 20 | repos_dir = '/tmp/out123/repos' 21 | 22 | 23 | def http_creds_callback(*args, **kwargs): 24 | return pygit2.UserPass(GH_TOKEN, 'x-oauth-basic') 25 | 26 | 27 | if not os.path.exists(repos_dir): 28 | os.makedirs(repos_dir, 0755) 29 | 30 | if not os.path.exists('/tmp/lala.json'): 31 | repos = [] 32 | gh_response = requests.get(url, auth=(GH_TOKEN, 'x-oauth-basic')) 33 | if gh_response.status_code != 200: 34 | print 'Failed to retrieve repos from github' 35 | sys.exit(1) 36 | 37 | repos.extend(json.loads(gh_response.text)) 38 | # parse the Link header and extract the number of pages available. 
39 | max_pages = 0 40 | if 'link' in gh_response.headers: 41 | for link, rel in map(lambda x: x.split(';'), 42 | gh_response.headers['link'].split(',')): 43 | link = link.strip()[1:-1] 44 | if rel.split('=')[1] == '"last"': 45 | try: 46 | pg_num = urlparse.urlparse(link).query.split('=')[1] 47 | max_pages = int(pg_num) 48 | except ValueError: 49 | pass 50 | break 51 | 52 | for pg_num in range(2, max_pages+1): 53 | pg_url = '{0}?page={1}'.format(url, pg_num) 54 | gh_response = requests.get(pg_url, auth=(GH_TOKEN, 'x-oauth-basic')) 55 | if gh_response.status_code != 200: 56 | print 'Failed to retrieve page: {0}'.format(pg_num) 57 | continue 58 | repos.extend(json.loads(gh_response.text)) 59 | with open('/tmp/lala.json', 'w') as data: 60 | json.dump(repos, data) 61 | else: 62 | with open('/tmp/lala.json') as data: 63 | repos = json.load(data) 64 | 65 | for repo in repos: 66 | repo_name = repo['name'] 67 | if '-formula' != repo_name[-8:]: 68 | continue 69 | 70 | # Check if a PR already exists. If a PR exists then there is nothing 71 | # for us to do here. 72 | pr = pr_url.format(ORG, repo_name) 73 | pr_response = requests.get(pr, auth=(GH_TOKEN, 'x-oauth-basic')) 74 | if pr_response.status_code != 200: 75 | print '{0}: Cannot access pull requests.'.format(repo_name) 76 | continue 77 | pr_titles = [x['title'] for x in json.loads(pr_response.text)] 78 | if PR_TITLE in pr_titles: 79 | print '{0}: PR already exists'.format(repo_name) 80 | continue 81 | 82 | # Is the metadata file already in repo? 83 | metadata_url = req_url.format(ORG, repo_name, 'metadata', 'metadata.yml') 84 | metadata_response = requests.get(metadata_url, 85 | auth=(GH_TOKEN, 'x-oauth-basic')) 86 | 87 | # metadata file not found. 
we need to create and push it to the repo 88 | if metadata_response.status_code != 200: 89 | # Generate the requirements list 90 | reqs_url = req_url.format(ORG, repo_name, wanted_tag, reqs_file) 91 | reqs_response = requests.get(reqs_url, auth=(GH_TOKEN, 'x-oauth-basic')) 92 | if reqs_response.status_code == 404: 93 | print '{0}: No requirements file found.'.format(repo_name) 94 | continue 95 | elif reqs_response.status_code != 200: 96 | print 'Failed to retrieve reqs for repo: {0}.'.format(repo_name) 97 | continue 98 | dependencies = [] 99 | for line in reqs_response.text.split('\n'): 100 | if line: 101 | dependencies.append(str(line.split('==')[0])) 102 | 103 | # Clone the repository 104 | repo_dir = os.path.join(repos_dir, repo_name) 105 | formula_name = repo_name.rsplit('-', 1)[0] 106 | git_repo = None 107 | try: 108 | if os.path.exists(repo_dir): 109 | git_repo = pygit2.Repository(repo_dir) 110 | else: 111 | print 'Cloning %s:' % repo['git_url'], 112 | git_repo = pygit2.clone_repository(repo['git_url'], repo_dir) 113 | print 'Done!' 114 | except pygit2.GitError, e: 115 | if 'Repository not found' in e.message: 116 | print 'Repository not found or repository is private' 117 | continue 118 | 119 | # Create metadata branch unless it already exists. 
120 | if 'refs/remotes/origin/metadata' not in git_repo.listall_references(): 121 | try: 122 | git_repo.create_branch('metadata', git_repo.head.get_object()) 123 | except ValueError, e: 124 | pass 125 | git_repo.checkout('refs/heads/metadata') 126 | else: 127 | git_repo.checkout('refs/remotes/origin/metadata') 128 | 129 | with open('{0}/metadata.yml'.format(repo_dir), 'w') as formula_meta: 130 | out = {'dependencies': dependencies} 131 | yaml.dump(out, formula_meta, default_flow_style=False) 132 | 133 | git_repo.index.read() 134 | git_repo.index.add('metadata.yml') 135 | git_repo.index.write() 136 | tree = git_repo.index.write_tree() 137 | sig = pygit2.Signature('Kyriakos Oikonomakos', 138 | 'kyriakos.oikonomakos@digital.justice.gov.uk') 139 | parent = git_repo.lookup_reference('HEAD').resolve().get_object().oid 140 | commit_msg = 'adding metadata.yml' 141 | oid = git_repo.create_commit('refs/heads/metadata', sig, sig, 142 | commit_msg, tree, [parent]) 143 | 144 | remote = git_repo.remotes[0] 145 | remote.push_url = repo['html_url'] 146 | print remote, remote.push_url 147 | remote.credentials = http_creds_callback 148 | 149 | pr_data = {'title': 'AUTOGENERATED-add_metadata', 150 | 'head': 'metadata', 151 | 'base': 'master', 152 | 'body': 'Autogenerated PR to include metadata.yml in formula'} 153 | 154 | x = requests.post(pr, data=json.dumps(pr_data), 155 | auth=(GH_TOKEN, 'x-oauth-basic')) 156 | if x.status_code != 201: 157 | print '{0}: Failed to submit PR.'.format(repo_name) 158 | -------------------------------------------------------------------------------- /requirements-dev.txt: -------------------------------------------------------------------------------- 1 | -r requirements.txt 2 | mock==1.0.1 3 | nose==1.3.7 4 | testfixtures==4.1.2 5 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | pygit2 >= 0.21.4 2 | PyYAML 3 | 
requests[security] 4 | paramiko 5 | parse 6 | 7 | -------------------------------------------------------------------------------- /scripts/salt-shaker: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | import sys 4 | from shaker.command_line import ShakerCommandLine 5 | 6 | ShakerCommandLine().run(sys.argv[1:]) 7 | -------------------------------------------------------------------------------- /setup.cfg: -------------------------------------------------------------------------------- 1 | [nosetests] 2 | tests=tests/ 3 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | from setuptools import setup 2 | 3 | setup( 4 | name='salt-shaker', 5 | version='1.0.3', 6 | packages=['shaker', 7 | 'shaker.libs'], 8 | url='http://github.com/ministryofjustice/salt_shaker', 9 | license='', 10 | author='MoJ DS Infrastucture Team', 11 | author_email='webops@digital.justice.gov.uk', 12 | description='', 13 | install_requires=[ 14 | 'requests[security]', 15 | 'PyYAML', 16 | 'pygit2 >= 0.21.4', 17 | 'parse' 18 | ], 19 | tests_require=[ 20 | 'responses', 21 | 'testfixtures', 22 | 'mock', 23 | ], 24 | test_suite='nose.collector', 25 | setup_requires=['nose>=1.0'], 26 | scripts=['scripts/salt-shaker'], 27 | ) 28 | -------------------------------------------------------------------------------- /shaker/__init__.py: -------------------------------------------------------------------------------- 1 | __author__ = 'kyriakos' 2 | -------------------------------------------------------------------------------- /shaker/command_line.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import sys 3 | 4 | import salt_shaker 5 | 6 | 7 | class ShakerCommandLine(object): 8 | 9 | def run(self, cli_args): 10 | parser = argparse.ArgumentParser(add_help=True) 
11 | subparsers = parser.add_subparsers() 12 | 13 | parser.add_argument('--root_dir', 14 | default='.', 15 | help="Working path to operate under") 16 | parser.add_argument('--verbose', 17 | action='store_true', 18 | help="Enable verbose logging") 19 | parser.add_argument('--debug', 20 | action='store_true', 21 | help="Enable debug logging") 22 | parser.add_argument('--simulate', 23 | '-s', 24 | action='store_true', 25 | help="Only simulate the command, do not commit any changes") 26 | parser.add_argument('--enable-remote-check', 27 | action='store_true', 28 | help="Enable remote checks when installing pinned versions") 29 | 30 | parser_install = subparsers.add_parser('install', 31 | help=("Install formulas and requirements from metadata.yml, " 32 | "recursively resolving remote dependencies"), 33 | ) 34 | parser_install.set_defaults(pinned=False) 35 | parser_install.set_defaults(func=self.shake) 36 | 37 | parser_refresh = subparsers.add_parser('install-pinned-versions', 38 | help=("Install pinned versions of formulas " 39 | "using formula-requirements.txt")) 40 | parser_refresh.set_defaults(pinned=True) 41 | parser_refresh.set_defaults(func=self.shake) 42 | 43 | parser_check = subparsers.add_parser('check', 44 | help=("Check versions of formulas against an update")) 45 | parser_check.set_defaults(check_requirements=True) 46 | parser_check.set_defaults(func=self.shake) 47 | 48 | args_ns = parser.parse_args(args=self.back_compat_args_fix(cli_args)) 49 | # Convert the args as Namespace to dict a so we can pass it as kwargs to a function 50 | args = vars(args_ns) 51 | 52 | return args.pop('func')(**args) 53 | 54 | def back_compat_args_fix(self, cli_args): 55 | args = [] 56 | for arg in cli_args: 57 | if arg.startswith("root_") and "=" in arg: 58 | arg = "--" + arg 59 | args.append(arg) 60 | 61 | return args 62 | 63 | def shake(self, **kwargs): 64 | salt_shaker.shaker(**kwargs) 65 | 66 | 67 | if __name__ == '__main__': 68 | ShakerCommandLine().run(sys.argv) 69 | 
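The `run` method above wires each subcommand to a handler with argparse's `set_defaults(func=...)` idiom, then pops `func` out of the parsed namespace and calls it with the remaining options as keyword arguments. A minimal, self-contained sketch of that pattern (the `handle_shake` stub below is hypothetical; the real handler is `ShakerCommandLine.shake`):

```python
import argparse

def handle_shake(**kwargs):
    # Hypothetical stand-in for ShakerCommandLine.shake: just echo the kwargs.
    return kwargs

def build_parser():
    parser = argparse.ArgumentParser(add_help=True)
    parser.add_argument('--simulate', '-s', action='store_true',
                        help="Only simulate the command, do not commit any changes")
    subparsers = parser.add_subparsers()

    # Each subcommand stores its own defaults plus the handler to dispatch to.
    parser_install = subparsers.add_parser('install')
    parser_install.set_defaults(pinned=False, func=handle_shake)

    parser_pinned = subparsers.add_parser('install-pinned-versions')
    parser_pinned.set_defaults(pinned=True, func=handle_shake)
    return parser

# Dispatch: pop the handler out of the namespace, pass the rest as kwargs.
args = vars(build_parser().parse_args(['--simulate', 'install']))
handler = args.pop('func')
result = handler(**args)  # == {'simulate': True, 'pinned': False}
```

Note that, as in the real CLI, the global flags are declared on the top-level parser, so they must appear before the subcommand on the command line (`salt-shaker --simulate install`, not `salt-shaker install --simulate`).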
-------------------------------------------------------------------------------- /shaker/libs/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ministryofjustice/salt-shaker/55f93b2418f91012e030984a13419e7ea8719a26/shaker/libs/__init__.py -------------------------------------------------------------------------------- /shaker/libs/errors.py: -------------------------------------------------------------------------------- 1 | 2 | class ShakerConfigException(Exception): 3 | """ 4 | Exception interpreting salt-shakers config 5 | files 6 | """ 7 | pass 8 | 9 | 10 | class ShakerRequirementsUpdateException(Exception): 11 | """ 12 | Exception updating salt-shakers requirements 13 | """ 14 | pass 15 | 16 | 17 | class ShakerRequirementsParsingException(Exception): 18 | """ 19 | Exception parsing salt-shakers requirements 20 | """ 21 | pass 22 | 23 | 24 | class ConstraintFormatException(Exception): 25 | """ 26 | Exception in the format of a constraint 27 | """ 28 | pass 29 | 30 | 31 | class ConstraintResolutionException(Exception): 32 | """ 33 | Exception resolving a constraint 34 | """ 35 | pass 36 | 37 | 38 | class GithubRepositoryConnectionException(Exception): 39 | """ 40 | Exception caused by connection problems to github 41 | """ 42 | pass 43 | -------------------------------------------------------------------------------- /shaker/libs/logger.py: -------------------------------------------------------------------------------- 1 | import logging 2 | 3 | 4 | class Logger(object): 5 | 6 | """ 7 | A wrapped logger class that allows us to implement a 8 | singleton. 
9 | """ 10 | 11 | class __Logger: 12 | """ 13 | Class that enables singleton behaviour 14 | """ 15 | def __init__(self, logger_name): 16 | """ 17 | Initialise singleton 18 | 19 | Args: 20 | logger_name(string): The name of the logger 21 | """ 22 | self.logger_name = logger_name 23 | 24 | def __str__(self): 25 | """ 26 | Get a string representation of this instance 27 | 28 | Returns: 29 | (string): String representation of this instance 30 | """ 31 | return repr(self) + self.logger_name 32 | 33 | logger_name = '' 34 | 35 | def setLevel(self, level): 36 | """ 37 | Set logging to be at level 38 | 39 | Args: 40 | level(logging.LEVEL): The logging level to set 41 | """ 42 | logging.info("Logger::setLevel: Logging level '%s' enabled" 43 | % level) 44 | logging.getLogger(self.logger_name).setLevel(level) 45 | 46 | def info(self, msg): 47 | """ 48 | Log message at level info 49 | 50 | Args: 51 | msg(string): The message to log 52 | """ 53 | logging.getLogger(self.logger_name).info(msg) 54 | 55 | def warning(self, msg): 56 | """ 57 | Log message at level warning 58 | 59 | Args: 60 | msg(string): The message to log 61 | """ 62 | logging.getLogger(self.logger_name).warning(msg) 63 | 64 | def error(self, msg): 65 | """ 66 | Log message at level error 67 | 68 | Args: 69 | msg(string): The message to log 70 | """ 71 | logging.getLogger(self.logger_name).error(msg) 72 | 73 | def critical(self, msg): 74 | """ 75 | Log message at level critical 76 | 77 | Args: 78 | msg(string): The message to log 79 | """ 80 | logging.getLogger(self.logger_name).critical(msg) 81 | 82 | def debug(self, msg): 83 | """ 84 | Log message at level debug 85 | 86 | Args: 87 | msg(string): The message to log 88 | """ 89 | logging.getLogger(self.logger_name).debug(msg) 90 | 91 | # Wrapping singleton class begins 92 | instance = None 93 | 94 | def __new__(cls, logger_name="default"): 95 | if not Logger.instance: 96 | Logger.instance = Logger.__Logger(logger_name) 97 | return Logger.instance 98 | 99 | def 
__getattr__(self, name): 100 | return getattr(self.instance, name) 101 | 102 | def __setattr__(self, name, value): 103 | return setattr(self.instance, name, value) 104 | -------------------------------------------------------------------------------- /shaker/libs/metadata.py: -------------------------------------------------------------------------------- 1 | import shaker.libs.logger, shaker.libs.github 2 | import re 3 | from shaker.libs.errors import (ConstraintFormatException, 4 | ConstraintResolutionException, 5 | ShakerRequirementsParsingException) 6 | from parse import parse 7 | 8 | comparator_re = re.compile('([=><]+)\s*(.*)') 9 | tag_re = re.compile('v[0-9]+\.[0-9]+\.[0-9]+') 10 | 11 | 12 | def parse_metadata(metadata): 13 | """ 14 | Entry function to handle the metadata parsing workflow and return a metadata 15 | object which is cleaned up 16 | 17 | Args: 18 | metadata (dictionary): Keyed salt formula dependency information 19 | 20 | Returns: 21 | parsed_metadata (dictionary): The original metadata parsed and cleaned up 22 | """ 23 | # Remove duplicates 24 | parsed_metadata = resolve_metadata_duplicates(metadata) 25 | return parsed_metadata 26 | 27 | 28 | def resolve_metadata_duplicates(metadata): 29 | """ 30 | Strip duplicates out of a metadata file. If we have no additional criteria, 31 | simply take the first one. 
Or can resolve by latest version or preferred organisation 32 | if required 33 | 34 | Args: 35 | metadata (dictionary): Keyed salt formula dependency information 36 | 37 | Returns: 38 | resolved_dependencies (dictionary): The original metadata stripped of duplicates 39 | If the metadata could not be resolved then we return the original args version 40 | """ 41 | # Only start to make alterations if we have a valid metadata format 42 | # Otherwise throw an exception 43 | 44 | # If metadata is not a dictionary or does not contain 45 | # a dependencies field then throw an exception 46 | if not (isinstance(metadata, type({}))): 47 | raise TypeError("resolve_metadata_duplicates: Metadata is not a " 48 | "dictionary but type '%s'" % (type(metadata))) 49 | elif not ("dependencies" in metadata): 50 | raise IndexError("resolve_metadata_duplicates: Metadata has " 51 | "no key called 'dependencies'" 52 | ) 53 | # Count the duplicates we find 54 | count_duplicates = 0 55 | 56 | resolved_dependency_collection = {} 57 | for dependency in metadata["dependencies"]: 58 | # Filter out formula name 59 | _, formula = dependency.split(':')[1].split('.git')[0].split('/') 60 | 61 | # Simply take the first formula found, ignore subsequent 62 | # formulas with the same name even from different organisations 63 | # Just warn, not erroring out 64 | if formula not in resolved_dependency_collection: 65 | resolved_dependency_collection[formula] = dependency 66 | else: 67 | # Do some sort of tag resolution 68 | count_duplicates += 1 69 | shaker.libs.logger.Logger().warning("resolve_metadata_duplicates: " 70 | "Skipping duplicate dependency %s" 71 | % (formula)) 72 | 73 | # Only alter the metadata if we need to 74 | if count_duplicates > 0: 75 | resolved_dependencies = resolved_dependency_collection.values() 76 | metadata["dependencies"] = resolved_dependencies 77 | 78 | return metadata 79 | 80 | 81 | def parse_constraint(constraint): 82 | """ 83 | Parse a constraint of form 84 | into an info 
dictionary of form 85 | {'comparator': comparator, 'tag': tag, 'version': version, 'postfix': postfix} 86 | 87 | Args: 88 | constraint(string): The string representing the constratint 89 | 90 | Returns: 91 | dictionary: The information dictionary 92 | """ 93 | match = comparator_re.search(constraint) 94 | comparator = match.group(1) 95 | tag = match.group(2) 96 | 97 | version = None 98 | postfix = None 99 | parsed_results = parse('v{version}-{postfix}', tag) 100 | if parsed_results: 101 | version = parsed_results["version"] 102 | postfix = parsed_results["postfix"] 103 | else: 104 | parsed_results = parse('v{version}', tag) 105 | if parsed_results: 106 | version = parsed_results["version"] 107 | postfix = None 108 | 109 | return {'comparator': comparator, 110 | 'tag': tag, 111 | 'version': version, 112 | 'postfix': postfix} 113 | 114 | 115 | def resolve_constraints(new_constraint, 116 | current_constraint): 117 | """ 118 | Resolve the dependencies uniquely using the precedence ==, >=, <= 119 | i.e, 120 | * '==' Equality takes priority over all other constraints, current 121 | equalities override any new 122 | * '>=' The highest greater than bound takes precedence over the lower 123 | * '<=' least less-than bound takes precedence over the higher 124 | * '>=, <=' Opposite contraints will throw an exception, although these 125 | may be resolvable in practice 126 | 127 | Args: 128 | new_constraint(string): New comparator and version 129 | current_constraint(string): Current comparator and version 130 | 131 | Returns: 132 | string: The constraint that took precedence 133 | 134 | Raises: 135 | ConstraintFormatException 136 | ConstraintResolutionException 137 | """ 138 | shaker.libs.logger.Logger().debug("metadata.resolve_constraints(%s, %s)" 139 | % (new_constraint, 140 | current_constraint)) 141 | # Deal with simple cases first, if we have an empty 142 | # constraint and a non-empty one, use the non-empty 143 | # one, if both are empty then just no versioning 144 | # is 
required 145 | have_new_constraint = (new_constraint and (len(new_constraint) > 0)) 146 | have_current_constraint = (current_constraint and (len(current_constraint) > 0)) 147 | if not have_new_constraint and not have_current_constraint: 148 | return '' 149 | elif not have_new_constraint and have_current_constraint: 150 | return current_constraint 151 | elif have_new_constraint and not have_current_constraint: 152 | return new_constraint 153 | 154 | new_constraint_result = parse_constraint(new_constraint) 155 | current_constraint_result = parse_constraint(current_constraint) 156 | shaker.libs.logger.Logger().debug("metadata.resolve_constraints: %s\n%s\n" 157 | % (new_constraint_result, 158 | current_constraint_result)) 159 | if new_constraint_result and current_constraint_result: 160 | new_comparator = new_constraint_result["comparator"] 161 | current_comparator = current_constraint_result["comparator"] 162 | # Deal with equality case 163 | if current_comparator == '==': 164 | return current_constraint 165 | elif new_comparator == '==': 166 | return new_constraint 167 | elif new_comparator != current_comparator: 168 | raise ConstraintResolutionException 169 | elif new_comparator == '>=': 170 | # Get highest version 171 | version = max(new_constraint_result["tag"], 172 | current_constraint_result["tag"]) 173 | return '>=%s' % (version) 174 | elif new_comparator == '<=': 175 | # Get highest version 176 | version = min(new_constraint_result["tag"], 177 | current_constraint_result["tag"]) 178 | return '<=%s' % (version) 179 | else: 180 | msg = ("metadata.resolve_constraints: %s\n%s\n" 181 | % (new_constraint_result, 182 | current_constraint_result)) 183 | raise ConstraintFormatException(msg) 184 | else: 185 | msg = ("metadata.resolve_constraints: %s\n%s\n" 186 | % (new_constraint_result, 187 | current_constraint_result)) 188 | raise ConstraintFormatException(msg) 189 | 190 | return None 191 | 192 | 193 | def parse_metadata_requirements(metadata_dependencies): 194 | """ 
195 | Parse the supplied metadata requirements of the format, 196 | [ 197 | 'git@github.com:test_organisation/some-formula.git==v1.0', 198 | 'git@github.com:test_organisation/another-formula.git==v2.0' 199 | ] 200 | or 201 | [ 202 | 'test_organisation/some-formula==v1.0', 203 | 'test_organisation/another-formula==v2.0' 204 | ] 205 | and return them in the format, 206 | 207 | 'test_organisation/some-formula': 208 | { 209 | 'source': 'git@github.com:test_organisation/some-formula.git', 210 | 'constraint': '==1.0', 211 | 'sourced_constraints': ['==1.0'], 212 | 'organisation': 'test_organisation', 213 | 'name': 'some-formula' 214 | } 215 | 216 | Args: 217 | metadata_requirements(string): String of metadata requirements 218 | 219 | Return: 220 | dependencies(dictionary): A collection of details on the 221 | dependencies in the specified format 222 | 223 | """ 224 | dependencies = {} 225 | for metadata_dependency in metadata_dependencies: 226 | # If we have a github url, then parse it, otherwise generate one 227 | # From the simplified format. 
Pass this to the github url parser 228 | # to ensure we are generating the same structures for both cases 229 | metadata_info = {} 230 | if (".git" in metadata_dependency or "git@" in metadata_dependency): 231 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 232 | "Parsing '%s' as raw github format\n" 233 | % (metadata_dependency)) 234 | metadata_info = shaker.libs.github.parse_github_url(metadata_dependency) 235 | else: 236 | parsed_entry = re.search('(.*)([=><]{2})\s*(.*)', metadata_dependency) 237 | if parsed_entry and len(parsed_entry.groups()) >= 3: 238 | parsed_formula = parsed_entry.group(1).strip() 239 | parsed_comparator = parsed_entry.group(2).strip() 240 | parsed_version = parsed_entry.group(3).strip() 241 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 242 | "parsed values for formula >%s< comparator >%s< version >%s<" 243 | % (parsed_formula, parsed_comparator, parsed_version)) 244 | github_url = "git@github.com:{0}.git{1}{2}".format(parsed_formula, 245 | parsed_comparator, 246 | parsed_version) 247 | metadata_info = shaker.libs.github.parse_github_url(github_url) 248 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 249 | "Parsing '%s' as simple format with constraint" 250 | % (metadata_dependency)) 251 | 252 | else: 253 | github_url = "git@github.com:%s.git" % (metadata_dependency) 254 | metadata_info = shaker.libs.github.parse_github_url(github_url) 255 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 256 | "Parsing '%s' as simple format without constraint\n" 257 | % (metadata_dependency)) 258 | 259 | if metadata_info: 260 | dependency_entry = { 261 | 'source': metadata_info.get('source', None), 262 | 'constraint': metadata_info.get('constraint', None), 263 | 'sourced_constraints': [], 264 | 'organisation': metadata_info.get('organisation', None), 265 | 'name': metadata_info.get('name', None) 266 | } 267 | # Look for problems 268 | 
format_check = (dependency_entry['source'] and 269 | dependency_entry['organisation'] and 270 | dependency_entry['name'] 271 | ) 272 | if not format_check: 273 | msg = ("metadata::parse_metadata_requirements: " 274 | "Parsing '%s' as simple format without constraint\n" 275 | % (metadata_dependency)) 276 | raise ShakerRequirementsParsingException(msg) 277 | 278 | dependency_key = "%s/%s" % (dependency_entry.get('organisation'), 279 | dependency_entry.get('name')) 280 | dependencies[dependency_key] = dependency_entry 281 | 282 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 283 | "Parsed entry %s %s\n from metadata: %s" 284 | % (metadata_dependency, 285 | dependency_entry, 286 | metadata_info)) 287 | else: 288 | shaker.libs.logger.Logger().debug("metadata::parse_metadata_requirements: " 289 | "No data found for entry %s" 290 | % (metadata_info.get('source', None))) 291 | return dependencies 292 | 293 | 294 | def compare_requirements(previous_requirements, 295 | new_requirements): 296 | 297 | """ 298 | Compare this objects requirements to another set 299 | of requirements 300 | 301 | Args: 302 | other_requirements(list): List of requirements of form, 303 | [ 304 | some-organisation/some-formula==1.0.1, 305 | some-organisation/another-formula, 306 | ] 307 | 308 | Returns: 309 | list: List of differing formula requirements, in form for 310 | new, deprecated and unequal versions 311 | [ 312 | ['', some-organisation/another-formula] 313 | [some-organisation/some-formula==1.0.1, ''] 314 | [some-organisation/some-formula==1.0.1, some-organisation/another-formula] 315 | ] 316 | """ 317 | diff = [] 318 | parsed_first_requirements = shaker.libs.metadata.parse_metadata_requirements(new_requirements) 319 | parsed_other_requirements = shaker.libs.metadata.parse_metadata_requirements(previous_requirements) 320 | 321 | # Test for deprecated entries 322 | for other_requirement_name, other_requirement_info in parsed_other_requirements.items(): 323 | 
other_requirement_constraint = other_requirement_info.get("constraint", None) or '' 324 | other_requirement_line = ("%s%s" % (other_requirement_name, other_requirement_constraint)) 325 | if other_requirement_name not in parsed_first_requirements.keys(): 326 | entry = [other_requirement_line, ''] 327 | diff.append(entry) 328 | shaker.libs.logger.Logger().debug("compare_requirements: Found deprecated entry '%s'" 329 | % (entry)) 330 | else: 331 | first_requirement_info = parsed_first_requirements.get(other_requirement_name) 332 | first_requirement_constraint = first_requirement_info.get("constraint", None) or '' 333 | first_requirement_line = ("%s%s" % (other_requirement_name, first_requirement_constraint)) 334 | if first_requirement_constraint != other_requirement_constraint: 335 | entry = [other_requirement_line, first_requirement_line] 336 | diff.append(entry) 337 | shaker.libs.logger.Logger().debug("compare_requirements: Found version diff entry '%s'" 338 | % (entry)) 339 | # Test for new entries 340 | for first_requirement_name, first_requirement_info in parsed_first_requirements.items(): 341 | if first_requirement_name not in parsed_other_requirements.keys(): 342 | first_requirement_constraint = first_requirement_info.get("constraint", None) or '' 343 | first_requirement_line = ("%s%s" % (first_requirement_name, first_requirement_constraint)) 344 | entry = ['', first_requirement_line] 345 | diff.append(entry) 346 | shaker.libs.logger.Logger().debug("compare_requirements: Found new entry '%s'" 347 | % (first_requirement_info)) 348 | 349 | return diff 350 | -------------------------------------------------------------------------------- /shaker/libs/pygit2_utils.py: -------------------------------------------------------------------------------- 1 | import shaker.libs.logger 2 | import paramiko 3 | import pygit2 4 | 5 | 6 | class Pygit2SSHUnsupportedError(Exception): 7 | pass 8 | 9 | 10 | class Pygit2KepairFromAgentUnsupportedError(Exception): 11 | pass 12 | 13 | 14 | class 
Pygit2SSHAgentMissingKeysError(Exception): 15 | pass 16 | 17 | 18 | link_installation = "http://www.pygit2.org/install.html" 19 | error_message_ssh_support = ("shaker.libs.util:check_pygit2: No SSH support found in libgit2. " 20 | "Please install a version with ssh enabled (%s).\n" 21 | "Note, MacOS users using brew should check the output of 'brew info libgit2' " 22 | "for ssh support" % (link_installation)) 23 | 24 | error_message_credentials_support = ("shaker.libs.util:check_pygit2: Module 'KeypairFromAgent' " 25 | "not found in pygit2.credentials. " 26 | "Please check your pygit2 installation (%s)." 27 | % (link_installation)) 28 | 29 | error_message_ssh_missing_keys = ("shaker.libs.util:check_pygit2: The ssh agent doesn't appear to know " 30 | "your github key. " 31 | "Make sure you've added your key with 'ssh-add ~/.ssh/id_rsa' or similar. " 32 | "A list of the keys the agent knows about can be seen with 'ssh-add -L'.") 33 | 34 | 35 | def pygit2_parse_error(e): 36 | """ 37 | Parse a pygit2-specific error into a more understandable context. Will 38 | also run some checks to try and help with the problem. 
39 | 40 | Args: 41 | e(Exception): The exception that was raised 42 | """ 43 | # Common errors to look for are, 44 | # AttributeError: 'module' object has no attribute 'KeypairFromAgent' 45 | # _pygit2.GitError: Unsupported URL protocol 46 | if (isinstance(e, pygit2.GitError) and e.message == "Unsupported URL protocol"): 47 | raise Pygit2SSHUnsupportedError(error_message_ssh_support) 48 | elif (isinstance(e, AttributeError) and e.message == "'module' object has no attribute 'KeypairFromAgent'"): 49 | raise Pygit2KepairFromAgentUnsupportedError(error_message_credentials_support) 50 | else: 51 | raise Pygit2SSHAgentMissingKeysError(error_message_ssh_missing_keys) 52 | 53 | 54 | def pygit2_info(): 55 | """ 56 | Output key pygit2/libgit2 information 57 | """ 58 | link_versions = "http://www.pygit2.org/install.html#version-numbers" 59 | message_versions = ("shaker.libs.util:check_pygit2: pygit2 *requires* the correct " 60 | "version of libgit2, this version was built against libgit2 version '%s'. " 61 | "Please check the versions on your system if you experience " 62 | "problems. 
(For compatibility, please refer to %s)" 63 | % (pygit2.LIBGIT2_VERSION, link_versions)) 64 | shaker.libs.logger.Logger().warning(message_versions) 65 | 66 | 67 | def pygit2_check(): 68 | """ 69 | Run all checks for pygit2 sanity and raise exceptions if checks fail 70 | 71 | Raises: 72 | Pygit2SSHUnsupportedError: On ssh support check failed 73 | Pygit2KepairFromAgentUnsupportedError: On credential support check failed 74 | """ 75 | if not pygit2_check_ssh(): 76 | raise Pygit2SSHUnsupportedError(error_message_ssh_support) 77 | elif not pygit2_check_credentials(): 78 | raise Pygit2KepairFromAgentUnsupportedError(error_message_credentials_support) 79 | elif not pygit2_agent_has_keys(): 80 | raise Pygit2SSHAgentMissingKeysError(error_message_ssh_missing_keys) 81 | 82 | 83 | def pygit2_check_ssh(): 84 | """ 85 | Check for common pygit2 ssh problems 86 | 87 | Return: 88 | bool: True if no problems found, False otherwise 89 | """ 90 | # Check for ssh support in libgit2 91 | if not (pygit2.features & pygit2.GIT_FEATURE_SSH): 92 | shaker.libs.logger.Logger().critical(error_message_ssh_support) 93 | return False 94 | message_ok = ("shaker.libs.util:pygit2_check_ssh: No ssh problems found. ") 95 | shaker.libs.logger.Logger().debug(message_ok) 96 | return True 97 | 98 | 99 | def pygit2_check_credentials(): 100 | """ 101 | Check for common pygit2 credentials problems 102 | 103 | Return: 104 | bool: True if no problems found, False otherwise 105 | """ 106 | link_installation = "http://www.pygit2.org/install.html" 107 | # Check for KeypairFromAgent support in pygit2 108 | if "KeypairFromAgent" not in vars(pygit2.credentials): 109 | shaker.libs.logger.Logger().critical(error_message_credentials_support) 110 | return False 111 | 112 | message_ok = ("shaker.libs.util:pygit2_check_credentials: No credential problems found. 
") 113 | shaker.libs.logger.Logger().debug(message_ok) 114 | return True 115 | 116 | 117 | def pygit2_agent_has_keys(): 118 | """ 119 | Check for common pygit2 ssh agent problems 120 | 121 | Return: 122 | bool: True if no problems found, False otherwise 123 | """ 124 | agent = paramiko.Agent() 125 | keys = agent.get_keys() 126 | if len(keys) < 1: 127 | return False 128 | shaker.libs.logger.Logger().debug("shaker.libs.util:check_pygit2: " 129 | "Please check that the keys listed contain your github key...") 130 | for key in keys: 131 | shaker.libs.logger.Logger().debug("shaker.libs.util:check_pygit2: " 132 | "Found ssh agent key: %s" % key.get_base64()) 133 | return True 134 | -------------------------------------------------------------------------------- /shaker/salt_shaker.py: -------------------------------------------------------------------------------- 1 | import logging 2 | import os 3 | import sys 4 | import warnings 5 | 6 | from shaker.libs import logger 7 | from shaker.libs import metadata 8 | from shaker.libs import pygit2_utils 9 | from shaker_metadata import ShakerMetadata 10 | from shaker_remote import ShakerRemote 11 | from shaker.libs.errors import ShakerRequirementsUpdateException 12 | 13 | 14 | class Shaker(object): 15 | """ 16 | Shaker takes in a metadata yaml file and uses this to resolve a set 17 | of dependencies into a pinned and versioned set in a 18 | formula-requirements.txt file. This can then be used to synchronise 19 | a set of salt-formulas with remote versions pinned to the specified 20 | versions. 21 | 22 | Starting from a root formula and calculate all necessary dependencies, 23 | based on metadata stored in each formula. 24 | 25 | - 26 | 27 | Salt Shaker works by creating an extra file root that must be copied up to 28 | your salt server and added to the master config. 
29 | 30 | 31 | The formula-requirements.txt file 32 | --------------------------------- 33 | 34 | The format of the file is simply a list of git-cloneable urls with an 35 | optional revision specified on the end. At the moment the only form of 36 | version comparison accepted is `==`. The version can be a tag, a branch 37 | name or anything that ``git rev-parse`` understands (i.e. a plain sha or 38 | the output of ``git describe`` such as ``v0.2.2-1-g1b520c5``). 39 | 40 | Example:: 41 | 42 | git@github.com:ministryofjustice/ntp-formula.git==v1.2.3 43 | git@github.com:ministryofjustice/repos-formula.git==my_branch 44 | git@github.com:ministryofjustice/php-fpm-formula.git 45 | git@github.com:ministryofjustice/utils-formula.git 46 | git@github.com:ministryofjustice/java-formula.git 47 | git@github.com:ministryofjustice/redis-formula.git==v0.2.2-1-g1b520c5 48 | git@github.com:ministryofjustice/logstash-formula.git 49 | git@github.com:ministryofjustice/sensu-formula.git 50 | git@github.com:ministryofjustice/rabbitmq-formula.git 51 | git@github.com:saltstack-formulas/users-formula.git 52 | 53 | 54 | """ 55 | def __init__(self, root_dir, salt_root_path='vendor', 56 | clone_path='formula-repos', salt_root='_root'): 57 | """ 58 | Initialise application paths and collect together the 59 | metadata 60 | 61 | Args: 62 | root_dir(string): The root directory to use 63 | salt_root_path(string): The directory to use for the salt 64 | root 65 | clone_path(string): The directory to put formula into 66 | salt_root(string): The directory to link formula into 67 | """ 68 | # Run sanity checks on pygit2 69 | pygit2_utils.pygit2_check() 70 | 71 | self.roots_dir = os.path.join(root_dir, salt_root_path, salt_root) 72 | self.repos_dir = os.path.join(root_dir, salt_root_path, clone_path) 73 | 74 | self._root_dir = root_dir 75 | self._shaker_metadata = ShakerMetadata(root_dir) 76 | 77 | def install_requirements(self, 78 | simulate=False, 79 | enable_remote_check=False): 80 | """ 81 | 
Install the pinned requirements from the stored formula requirements. Args: simulate(bool): True to only simulate the run, 82 | false to carry it through for real 83 | enable_remote_check(bool): True to enable remote 84 | checks when installing pinned versions 85 | """ 86 | logger.Logger().info("Shaker::install_requirements: " 87 | "Installing pinned requirements..." 88 | "dependencies will be installed " 89 | "from the stored formula requirements") 90 | self._load_local_requirements() 91 | self._install_versioned_requirements(overwrite=False, 92 | simulate=simulate, 93 | enable_remote_check=enable_remote_check) 94 | 95 | def update_requirements(self, 96 | simulate=False): 97 | """ 98 | Update the formula-requirements from the metadata, 99 | then install them 100 | 101 | Args: 102 | simulate(bool): True to only simulate the run, 103 | false to carry it through for real 104 | """ 105 | logger.Logger().info("Shaker::update_requirements: " 106 | "Updating and installing requirements..." 107 | "all dependencies will be " 108 | "re-calculated from the metadata") 109 | self._update_local_requirements() 110 | self._install_versioned_requirements(overwrite=True, 111 | simulate=simulate, 112 | enable_remote_check=True) 113 | 114 | def check_requirements(self): 115 | """ 116 | Check the current formula-requirements against those that 117 | would be generated from the metadata 118 | """ 119 | logger.Logger().info("Shaker::check_requirements: " 120 | "Checking the current requirements " 121 | "against an update") 122 | self._load_local_requirements(enable_remote_check=True) 123 | current_requirements = self._shaker_remote.get_requirements() 124 | self._update_local_requirements() 125 | new_requirements = self._shaker_remote.get_requirements() 126 | 127 | requirements_diff = metadata.compare_requirements(current_requirements, 128 | new_requirements) 129 | 130 | if len(requirements_diff) == 0: 131 | logger.Logger().info("Shaker::check_requirements: " 132 | "No formula requirements changes found") 133 | else: 134 | for requirement_pair in 
requirements_diff: 135 | first_entry = requirement_pair[0] 136 | second_entry = requirement_pair[1] 137 | if len(first_entry) == 0: 138 | logger.Logger().info("Shaker::check_requirements: " 139 | "New entry %s" 140 | % (second_entry)) 141 | elif len(second_entry) == 0: 142 | logger.Logger().info("Shaker::check_requirements: " 143 | "Deprecated entry %s" 144 | % (first_entry)) 145 | else: 146 | logger.Logger().info("Shaker::check_requirements: " 147 | "Unequal entries %s != %s" 148 | % (first_entry, 149 | second_entry)) 150 | return requirements_diff 151 | 152 | def _load_local_requirements(self, 153 | enable_remote_check=False): 154 | """ 155 | Load the requirements file and update the remote dependencies 156 | 157 | Args: 158 | enable_remote_check(bool): False to use current formula without checking 159 | remotely for updates. True to use remote repository API to 160 | recalculate shas 161 | """ 162 | logger.Logger().info("Shaker: Loading the current formula requirements...") 163 | self._shaker_remote = ShakerRemote(self._shaker_metadata.local_requirements) 164 | if enable_remote_check: 165 | logger.Logger().info("Shaker: Updating the current formula requirements " 166 | "dependencies...") 167 | self._shaker_remote.update_dependencies() 168 | 169 | def _update_local_requirements(self): 170 | """ 171 | Update the requirements from metadata entries, overriding the 172 | current formula requirements 173 | """ 174 | logger.Logger().info("Shaker: Updating the formula requirements...") 175 | 176 | self._shaker_metadata.update_dependencies(ignore_local_requirements=True) 177 | self._shaker_remote = ShakerRemote(self._shaker_metadata.dependencies) 178 | self._shaker_remote.update_dependencies() 179 | 180 | def _install_versioned_requirements(self, 181 | overwrite=False, 182 | simulate=False, 183 | enable_remote_check=False 184 | ): 185 | """ 186 | Install all of the versioned requirements found 187 | 188 | Args: 189 | overwrite(bool): True to overwrite dependencies, 190 | 
false otherwise 191 | simulate(bool): True to only simulate the run, 192 | false to carry it through for real 193 | enable_remote_check(bool): False to use current formula without checking 194 | remotely for updates. True to use remote repository API to 195 | recalculate shas 196 | """ 197 | if not simulate: 198 | if enable_remote_check: 199 | logger.Logger().info("Shaker::install_requirements: Updating requirements tag target shas") 200 | self._shaker_remote.update_dependencies() 201 | else: 202 | logger.Logger().info("Shaker::install_requirements: No remote check, not updating tag target shas") 203 | logger.Logger().info("Shaker::install_requirements: Installing requirements...") 204 | successful, unsuccessful = self._shaker_remote.install_dependencies(overwrite=overwrite, 205 | enable_remote_check=enable_remote_check) 206 | 207 | # If we have unsuccessful updates, then we should fail before writing the requirements file 208 | if unsuccessful > 0: 209 | msg = ("Shaker::install_requirements: %s successful, %s failed" 210 | % (successful, unsuccessful)) 211 | raise ShakerRequirementsUpdateException(msg) 212 | 213 | if enable_remote_check: 214 | logger.Logger().info("Shaker: Writing requirements file...") 215 | self._shaker_remote.write_requirements(overwrite=True, backup=False) 216 | else: 217 | requirements = '\n'.join(self._shaker_remote.get_requirements()) 218 | logger.Logger().warning("Shaker: Simulation mode enabled, " 219 | "no changes will be made...\n%s\n\n" 220 | % (requirements)) 221 | 222 | 223 | def _setup_logging(level): 224 | """ 225 | Initialise the default application logging 226 | 227 | Args: 228 | level(logging.LEVEL): The level to set 229 | logging at 230 | """ 231 | logger.Logger('salt-shaker') 232 | logger.Logger().setLevel(level) 233 | 234 | 235 | def shaker(root_dir='.', 236 | debug=False, 237 | verbose=False, 238 | pinned=False, 239 | simulate=False, 240 | check_requirements=False, 241 | enable_remote_check=False): 242 | """ 243 | Utility 
task to initiate Shaker, setting up logging and 244 | running the necessary commands to install requirements 245 | 246 | Args: 247 | root_dir(string): The root directory to use 248 | debug(bool): Enable/disable debugging output 249 | verbose(bool): Enable/disable verbose output 250 | pinned(bool): True to use pinned requirements, 251 | False to use metadata to recalculate 252 | requirements 253 | simulate(bool): True to only simulate the run, 254 | false to carry it through for real 255 | check_requirements(bool): True to compare 256 | a remote dependency check with the current 257 | formula requirements 258 | enable_remote_check(bool): True to enable remote 259 | checks when installing pinned versions 260 | """ 261 | if (debug): 262 | _setup_logging(logging.DEBUG) 263 | elif (verbose): 264 | _setup_logging(logging.INFO) 265 | else: 266 | _setup_logging(logging.INFO) 267 | 268 | if not os.path.exists(root_dir): 269 | os.makedirs(root_dir, 0755) 270 | 271 | shaker_instance = Shaker(root_dir=root_dir) 272 | if check_requirements: 273 | shaker_instance.check_requirements() 274 | elif pinned: 275 | shaker_instance.install_requirements(simulate=simulate, 276 | enable_remote_check=enable_remote_check) 277 | else: 278 | shaker_instance.update_requirements(simulate=simulate) 279 | 280 | 281 | def get_deps(root_dir, root_formula=None, constraint=None, force=False): 282 | """ 283 | (DEPRECATED) Update the formula-requirements from the metadata.yaml, 284 | then install them 285 | 286 | Args: 287 | root_dir(string): The root directory to use 288 | force(bool): True to recalculate requirements from the metadata, False to install the pinned requirements (root_formula and constraint are unused) 289 | """ 290 | # This filterwarning makes sure we always see *this* log message 291 | warnings.filterwarnings("once", "shaker\.salt_shaker\.get_deps.*", DeprecationWarning) 292 | # Then issue a warning from the caller's perspective 293 | warnings.warn("shaker.salt_shaker.get_deps has been deprecated. 
Use `shaker.salt_shaker.shaker(root_dir=...)` instead", DeprecationWarning, stacklevel=2) 294 | 295 | return shaker(root_dir, pinned=not force, enable_remote_check=True) 296 | -------------------------------------------------------------------------------- /shaker/shaker_metadata.py: -------------------------------------------------------------------------------- 1 | import os 2 | import re 3 | import requests 4 | import yaml 5 | 6 | from shaker.libs.errors import ShakerConfigException 7 | from shaker.libs.errors import GithubRepositoryConnectionException 8 | import shaker.libs.github 9 | import shaker.libs.metadata 10 | import shaker.libs.logger 11 | 12 | 13 | class ShakerMetadata: 14 | """ 15 | Class to hold and resolve all the information about 16 | a formula using its metadata. It can parse a local 17 | metadata file, use this to generate a dependency list, 18 | and call remotely to parse sub-dependencies. 19 | 20 | Attributes: 21 | working_directory(string): The working directory 22 | to look for metadata files and to write output 23 | metadata_filename(string): The filename that stores 24 | our root metadata 25 | root_metadata(dictionary): Dictionary of information 26 | on the root formula, eg 27 | { 28 | 'formula': 'test_organisation/my-formula', 29 | 'organisation': 'test_organisation', 30 | 'name': 'my-formula', 31 | 'dependencies': { 32 | 'test_organisation/some-formula': 33 | { 34 | 'source': 'git@github.com:test_organisation/some-formula.git', 35 | 'constraint': '==1.0', 36 | 'organisation': 'test_organisation', 37 | 'name': 'some-formula' 38 | } 39 | } 40 | } 41 | dependencies(dictionary): Dictionary of organisation/name 42 | keys to a dictionary of properties for the dependency, 43 | 'source', the url source for the dependency 44 | 'versions', the versions of the dependency available 45 | 'constraint', the version constraint on the dependency 46 | added in the order found, ie root first. 
'organisation', the organisation name 48 | 'name', the formula name 49 | 50 | An example 51 | 52 | { 53 | 'test_organisation/some-formula': 54 | { 55 | 'source': 'git@github.com:test_organisation/some-formula.git', 56 | 'constraint': '==1.0', 57 | 'organisation': 'test_organisation', 58 | 'name': 'some-formula' 59 | } 60 | } 61 | """ 62 | working_directory = None 63 | metadata_filename = None 64 | root_metadata = {} 65 | local_requirements = {} 66 | dependencies = {} 67 | 68 | def __init__(self, 69 | working_directory='.', 70 | metadata_filename='metadata.yml', 71 | autoload=True): 72 | """ 73 | Initialise the instance from a metadata config file 74 | 75 | Args: 76 | working_directory(string): The directory of the metadata file 77 | metadata_filename(string): The filename of the metadata file 78 | autoload(bool): If True, then try to load local data, do nothing 79 | on False 80 | """ 81 | self.working_directory = working_directory 82 | self.metadata_filename = metadata_filename 83 | self.requirements_filename = metadata_filename 84 | if autoload: 85 | self.load_local_metadata() 86 | self.load_local_requirements() 87 | 88 | def load_local_metadata(self): 89 | """ 90 | Load in the metadata from a file into our data 91 | structures 92 | """ 93 | # Load in the raw metadata 94 | raw_data = self._fetch_local_metadata(self.working_directory, 95 | self.metadata_filename) 96 | # Process the raw data into our data structure 97 | if raw_data: 98 | root_name_key = raw_data.get('formula', None) 99 | if (root_name_key): 100 | self.root_metadata['formula'] = root_name_key 101 | self.root_metadata.update(self._parse_metadata_name(root_name_key)) 102 | else: 103 | shaker.libs.logger.Logger().debug('ShakerMetadata::update_metadata: ' 104 | 'No root key name found, ' 105 | 'assuming a deploy formula') 106 | 107 | root_dependencies = raw_data.get('dependencies', None) 108 | if (root_dependencies): 109 | # Root dependencies need to be differentiated so they can be used as the basis 
110 | # for a dependency refresh 111 | self.root_metadata['dependencies'] = shaker.libs.metadata.parse_metadata_requirements(root_dependencies) 112 | else: 113 | shaker.libs.logger.Logger().warning('ShakerMetadata::update_metadata: ' 114 | 'No root dependencies found') 115 | else: 116 | msg = 'ShakerMetadata::update_metadata: Error loading metadata.' 117 | raise ShakerConfigException(msg) 118 | 119 | def update_dependencies(self, 120 | ignore_local_requirements=False, 121 | ignore_dependency_requirements=False): 122 | """ 123 | Update the dependencies from the root formula down 124 | through the dependency chain 125 | 126 | Args: 127 | ignore_local_requirements(bool): True if we skip parsing the requirements file 128 | for the root and use metadata directly, false otherwise 129 | ignore_dependency_requirements(bool): True if we skip parsing the requirements file 130 | for the dependencies and use their metadata directly, false otherwise 131 | """ 132 | # Try to read root requirements, unless we're ignoring them 133 | # If we are, or fail to read, open up metadata 134 | have_local_requirements = len(self.local_requirements) > 0 135 | if not ignore_local_requirements and have_local_requirements: 136 | shaker.libs.logger.Logger().debug('ShakerMetadata::update_dependencies: ' 137 | 'Updating from requirements') 138 | self.dependencies = self.local_requirements 139 | self._fetch_dependencies(self.dependencies, 140 | ignore_dependency_requirements) 141 | else: 142 | shaker.libs.logger.Logger().debug('ShakerMetadata::update_dependencies: ' 143 | 'Updating from metadata') 144 | # Add in root dependencies, always overwrite 145 | root_dependencies = self.root_metadata.get('dependencies', {}) 146 | if len(root_dependencies) <= 0: 147 | shaker.libs.logger.Logger().debug("ShakerMetadata::update_dependencies: " 148 | "No dependencies found in metadata") 149 | else: 150 | self.dependencies = root_dependencies 151 | self._fetch_dependencies(self.dependencies, 152 | 
ignore_dependency_requirements) 153 | 154 | def load_local_requirements(self, 155 | input_directory='.', 156 | input_filename='formula-requirements.txt'): 157 | """ 158 | Load a dependency list from a file path, parsing it into our 159 | data structures. Expects a list inside 160 | the file of the form 161 | 162 | git@github.com:test_organisation/some-formula.git==v1.0 163 | git@github.com:test_organisation/another-formula.git==v2.0 164 | 165 | Args: 166 | input_directory(string): The directory of the input file 167 | input_filename(string): The filename of the input file 168 | """ 169 | path = "%s/%s" % (input_directory, 170 | input_filename) 171 | shaker.libs.logger.Logger().debug('ShakerMetadata::load_local_requirements: ' 172 | 'Loading %s...' 173 | % (path)) 174 | if not os.path.exists(path): 175 | shaker.libs.logger.Logger().debug('ShakerMetadata::load_local_requirements: ' 176 | 'File not found %s' 177 | % (path)) 178 | return False 179 | else: 180 | with open(path, 'r') as infile: 181 | loaded_dependencies = [] 182 | for line in infile: 183 | stripped_line = line.strip() 184 | if len(stripped_line) > 0 and stripped_line[0] != '#': 185 | loaded_dependencies.append(stripped_line) 186 | 187 | if len(loaded_dependencies) > 0: 188 | self.local_requirements = shaker.libs.metadata.parse_metadata_requirements(loaded_dependencies) 189 | return True 190 | else: 191 | shaker.libs.logger.Logger().warning("ShakerMetadata::load_local_requirements: " 192 | "File '%s' is empty" 193 | % (path)) 194 | 195 | return False 196 | 197 | 198 | 199 | def _fetch_local_metadata(self, 200 | directory, 201 | filename): 202 | """ 203 | Fetch data from a local file 204 | 205 | Args: 206 | directory(string): The directory of the file 207 | filename(string): The filename of the file 208 | 209 | Returns: 210 | dictionary: The data found, None if it could not be parsed 211 | """ 212 | md_file = os.path.join(directory, 213 | filename) 214 | if os.path.exists(md_file): 215 
| with open(md_file, 'r') as md_fd: 216 | try: 217 | data = yaml.load(md_fd) 218 | return data 219 | except yaml.YAMLError as e: 220 | msg = ('ShakerMetadata::_fetch_local_metadata: ' 221 | 'Error in yaml format for file ' 222 | '%s: %s' 223 | % (md_file, 224 | e.message)) 225 | raise yaml.YAMLError(msg) 226 | else: 227 | msg = ('ShakerMetadata::_fetch_local_metadata: ' 228 | 'Error loading file, ' 229 | 'file does not exist. ' 230 | '%s' 231 | % (md_file)) 232 | raise IOError(msg) 233 | 234 | return None 235 | 236 | def _parse_metadata_name(self, 237 | metadata_name): 238 | """ 239 | Parse the supplied metadata name and return the root name entry 240 | in format, 241 | { 242 | 'organisation': 'test_organisation', 243 | 'name': 'my-formula' 244 | } 245 | 246 | Args: 247 | metadata_name(string): String of metadata name 248 | 249 | Returns: 250 | root_name_entry(dictionary): The root name entry found 251 | in the specified format 252 | """ 253 | # Check that the metadata name string is in an expected 254 | # format 255 | if '/' not in metadata_name: 256 | raise ShakerConfigException("ShakerMetadata::_parse_metadata_name: " 257 | "No '/' separator found in string '%s'" 258 | % (metadata_name) 259 | ) 260 | else: 261 | metadata_info = metadata_name.split('/') 262 | if len(metadata_info) != 2: 263 | raise ShakerConfigException("ShakerMetadata::_parse_metadata_name: " 264 | "Bad name format found in string '%s', " 265 | "expected 'organisation/name'" 266 | % (metadata_name)) 267 | else: 268 | root_org = metadata_info[0] 269 | root_name = metadata_info[1] 270 | root_name_entry = { 271 | 'organisation': root_org, 272 | 'name': root_name 273 | } 274 | return root_name_entry 275 | 276 | return None 277 | 278 | def _fetch_dependencies(self, 279 | base_dependencies, 280 | ignore_dependency_requirements=False): 281 | """ 282 | Fetch all of the base formula's dependencies and sub-dependencies 283 | and process them into our data structures 284 | 285 | Args: 286 | base_dependencies(dictionary): 
287 | A metadata dictionary to use as the base of our 288 | dependency loading of form 289 | 'test_organisation/some-formula': 290 | { 291 | 'source': 'git@github.com:test_organisation/some-formula.git', 292 | 'constraint': '==1.0', 293 | 'sourced_constraints': ['==1.0'], 294 | 'organisation': 'test_organisation', 295 | 'name': 'some-formula' 296 | }, 297 | 'test_organisation/another-formula': 298 | { 299 | 'source': 'git@github.com:test_organisation/another-formula.git', 300 | 'constraint': '==1.0', 301 | 'sourced_constraints': ['==1.0'], 302 | 'organisation': 'test_organisation', 303 | 'name': 'another-formula' 304 | } 305 | ignore_dependency_requirements(bool): 306 | True if we want to skip parsing a remote requirements file 307 | and go straight to metadata, False otherwise 308 | """ 309 | shaker.libs.logger.Logger().debug('ShakerMetadata::fetch_dependencies: ' 310 | 'Fetching for base dependencies\n %s' 311 | % base_dependencies) 312 | root_metadata = self.root_metadata.get('formula', None) 313 | for dependency_key, dependency_info in base_dependencies.items(): 314 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 315 | "Processing '%s': " 316 | % (dependency_key)) 317 | constraint = dependency_info.get('constraint', '') 318 | 319 | if dependency_key in self.dependencies: 320 | # If we've already sourced this constrained version then we're done 321 | sourced_constraints = self.dependencies.get(dependency_key).get('sourced_constraints', []) 322 | if constraint in sourced_constraints: 323 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 324 | "Already have requirements constraint, '%s' in " 325 | "sourced constraints '%s'" 326 | % (constraint, 327 | sourced_constraints)) 328 | continue 329 | 330 | elif dependency_key == root_metadata: 331 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 332 | "Root key dependency found %s = %s, skipping" 333 | % (dependency_key, root_metadata)) 334 | 
continue 335 | 336 | # We've checked whether we have this dependency, and whether we 337 | # need to skip it. So now get it 338 | org_name = dependency_info.get('organisation', None) 339 | formula_name = dependency_info.get('name', None) 340 | 341 | shaker.libs.logger.Logger().debug('ShakerMetadata::fetch_dependencies: ' 342 | 'Processing %s' % dependency_key) 343 | 344 | # Try to fetch the formula requirements file, if it's not found, 345 | # fall back to fetching the metadata directly 346 | remote_metadata = None 347 | if not ignore_dependency_requirements: 348 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 349 | "Looking for requirements for %s:%s" 350 | % (dependency_key, constraint)) 351 | remote_requirements = self._fetch_remote_requirements(org_name, 352 | formula_name, 353 | constraint=constraint) 354 | 355 | if remote_requirements: 356 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 357 | "Found requirements %s" 358 | % (remote_requirements)) 359 | remote_metadata = {"dependencies": remote_requirements} 360 | 361 | if not remote_metadata: 362 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 363 | "Looking for metadata for %s" 364 | % (dependency_key)) 365 | remote_metadata = self._fetch_remote_metadata(org_name, 366 | formula_name, 367 | constraint=constraint) 368 | 369 | # Need to ensure we don't try to re-get this one 370 | constraint = dependency_info.get('constraint', '') 371 | # we've tried all our methods of sourcing this requirement, so update the 372 | # sourced requirements 373 | self._add_dependency_sourced(dependency_key, constraint) 374 | 375 | if remote_metadata: 376 | remote_dependencies = self._add_dependencies_from_metadata(remote_metadata) 377 | self._fetch_dependencies(remote_dependencies, 378 | ignore_dependency_requirements) 379 | else: 380 | shaker.libs.logger.Logger().debug("ShakerMetadata::fetch_dependencies: " 381 | "No requirements or metadata found for %s, skipping" 382 | % 
(dependency_key)) 383 | 384 | def _fetch_remote_metadata(self, 385 | org_name, 386 | formula_name, 387 | constraint=None): 388 | """ 389 | Use an organisation, formula name and optional 390 | constraint to fetch the metadata for a formula 391 | 392 | Args: 393 | org_name(string): The name of the organisation 394 | formula_name(string): The name of the formula 395 | constraint(string): (optional) Constraint of the 396 | formula. In '==v1.0.0' type format 397 | 398 | Returns: 399 | (dictionary): The loaded metadata of the required 400 | formula, None type if there was a problem 401 | """ 402 | github_token = shaker.libs.github.get_valid_github_token() 403 | if not github_token: 404 | msg = "ShakerMetadata::_fetch_remote_metadata: No valid github token" 405 | raise GithubRepositoryConnectionException(msg) 406 | 407 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_metadata: " 408 | "Fetching remote repository " 409 | "%s/%s:%s" 410 | % (org_name, 411 | formula_name, 412 | constraint)) 413 | # Check for successful access and any credential problems 414 | metadata = self._fetch_remote_file(org_name, 415 | formula_name, 416 | "metadata.yml", 417 | constraint) 418 | 419 | data = None 420 | if metadata: 421 | data = yaml.load(metadata) 422 | parsed_data = self._parse_metadata_name(data.get('formula', '')) 423 | parsed_data["dependencies"] = shaker.libs.metadata.parse_metadata_requirements(data.get('dependencies', [])) 424 | for dependency_info in parsed_data["dependencies"].values(): 425 | dependency_info['sourced_constraints'] = [dependency_info.get('constraint', '')] 426 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_metadata: " 427 | "Added sourced constraint '%s'" 428 | % (dependency_info)) 429 | return data 430 | else: 431 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_metadata: " 432 | "No metadata found for " 433 | "%s/%s:%s" 434 | % (org_name, 435 | formula_name, 436 | constraint)) 437 | 438 | def _fetch_remote_requirements(self, 439 | org_name, 440 | formula_name, 441 
| constraint=None): 442 | """ 443 | Use an organisation, formula name and optional 444 | constraint to fetch the requirements for a formula 445 | 446 | Args: 447 | org_name(string): The name of the organisation 448 | formula_name(string): The name of the formula 449 | constraint(string): (optional) Constraint of the 450 | formula. In '==v1.0.0' type format 451 | 452 | Returns: 453 | (dictionary): The loaded dependencies of the required 454 | formula, eg, 455 | 'test_organisation/some-formula': 456 | { 457 | 'source': 'git@github.com:test_organisation/some-formula.git', 458 | 'constraint': '==1.0', 459 | 'sourced_constraints': ['==1.0', '<=2.0.0'], 460 | 'organisation': 'test_organisation', 461 | 'name': 'some-formula' 462 | } 463 | None type if the repo has no requirements file or if there was a problem. 464 | """ 465 | # Check for successful access and any credential problems 466 | raw_requirements = self._fetch_remote_file(org_name, 467 | formula_name, 468 | "formula-requirements.txt", 469 | constraint) 470 | parsed_data = None 471 | if raw_requirements: 472 | data = raw_requirements.split() 473 | if data: 474 | parsed_data = shaker.libs.metadata.parse_metadata_requirements(data) 475 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_requirements: " 476 | "Found parsed_data %s" 477 | % (parsed_data)) 478 | for entry_info in parsed_data.values(): 479 | entry_info['sourced_constraints'] = [entry_info.get('constraint', '')] 480 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_requirements: " 481 | "Added sourced constraint '%s'" 482 | % (entry_info)) 483 | return parsed_data 484 | else: 485 | msg = ("ShakerMetadata::_fetch_remote_requirements: " 486 | "Could not parse requirements found for %s/%s\n%s\n\n" 487 | % (org_name, formula_name, raw_requirements)) 488 | raise ShakerConfigException(msg) 489 | else: 490 | msg = ("ShakerMetadata::_fetch_remote_requirements: " 491 | "No requirements found for %s/%s" 492 | % (org_name,
formula_name)) 493 | shaker.libs.logger.Logger().debug(msg) 494 | 495 | def _fetch_remote_file(self, 496 | org_name, 497 | formula_name, 498 | remote_file, 499 | constraint=None): 500 | """ 501 | Use an organisation, formula name and optional 502 | constraint to fetch a file from a formula repository 503 | 504 | Args: 505 | org_name(string): The name of the organisation 506 | formula_name(string): The name of the formula 507 | remote_file(string): The path of the file to fetch 508 | from the formula repository 509 | constraint(string): (optional) Constraint of the 510 | formula. In '==v1.0.0' type format 511 | 512 | Returns: 513 | (dictionary): The loaded contents of the remote 514 | file, None type if there was a problem 515 | """ 516 | github_token = shaker.libs.github.get_valid_github_token() 517 | if not github_token: 518 | msg = "ShakerMetadata::_fetch_remote_file: No valid github token" 519 | raise GithubRepositoryConnectionException(msg) 520 | 521 | target_obj = shaker.libs.github.resolve_constraint_to_object(org_name, formula_name, constraint) 522 | if not target_obj: 523 | msg = ("ShakerMetadata::_fetch_remote_file: " 524 | "%s/%s:%s: No target object found, check it exists " 525 | "and you have the environment variable GITHUB_TOKEN set " 526 | "for authenticated access to private repositories" 527 | % (org_name, formula_name, constraint)) 528 | raise GithubRepositoryConnectionException(msg) 529 | 530 | target_tag = target_obj.get("name", None) 531 | 532 | remote_file_url = ("https://raw.githubusercontent.com/%s/%s/%s/%s" 533 | % (org_name, 534 | formula_name, 535 | target_tag, 536 | remote_file)) 537 | 538 | # Check for successful access and any credential problems 539 | raw_data = requests.get(remote_file_url, 540 | auth=(github_token, 'x-oauth-basic') 541 | ) 542 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_file: " 543 | "Calling github.validate_github_access with raw_data: " + str(raw_data)) 544 | if
shaker.libs.github.validate_github_access(raw_data, remote_file_url): 545 | remote_dict = yaml.load(raw_data.content) 546 | return remote_dict 547 | else: 548 | shaker.libs.logger.Logger().debug("ShakerMetadata::_fetch_remote_file: " 549 | "Could not validate github access to '%s'" 550 | % (remote_file_url)) 551 | return None 552 | 553 | def _add_dependencies_from_metadata(self, metadata): 554 | """ 555 | Load in dependencies from a dictionary of form 556 | 'dependencies': { 557 | 'test_organisation/some-formula': 558 | { 559 | 'source': 'git@github.com:test_organisation/some-formula.git', 560 | 'constraint': '==1.0', 561 | 'sourced_constraints': [], 562 | 'organisation': 'test_organisation', 563 | 'name': 'some-formula' 564 | } 565 | } 566 | """ 567 | shaker.libs.logger.Logger().debug("ShakerMetadata::_add_dependencies_from_metadata: " 568 | "Adding metadata: %s" 569 | % (metadata)) 570 | parsed_metadata_dependencies = {} 571 | if metadata: 572 | metadata_dependencies = metadata.get('dependencies', 573 | None) 574 | if metadata_dependencies: 575 | parsed_metadata_dependencies = shaker.libs.metadata.parse_metadata_requirements(metadata_dependencies) 576 | for dep_key, dep_info in parsed_metadata_dependencies.items(): 577 | if dep_key != self.root_metadata.get('formula', None): 578 | if dep_key not in self.dependencies: 579 | shaker.libs.logger.Logger().debug("ShakerMetadata::_add_dependencies_from_metadata: " 580 | "New Metadata added '%s'" 581 | % (dep_key)) 582 | self.dependencies[dep_key] = dep_info 583 | else: 584 | # Resolve constraints 585 | current_constraint = self.dependencies[dep_key].get('constraint', {}) 586 | new_constraint = dep_info.get('constraint', None) 587 | self.dependencies[dep_key]['constraint'] = shaker.libs.metadata.resolve_constraints(new_constraint, 588 | current_constraint) 589 | shaker.libs.logger.Logger().debug("ShakerMetadata::_add_dependencies_from_metadata: " 590 | "Updating constraint for '%s'" 591 | % (dep_key)) 592 | # Merge source
constraints 593 | current_sourced_constraints = self.dependencies[dep_key].get('sourced_constraints', []) 594 | new_sourced_constraints = dep_info.get('sourced_constraints', []) 595 | self.dependencies[dep_key]['sourced_constraints'] = current_sourced_constraints + new_sourced_constraints 596 | shaker.libs.logger.Logger().debug("ShakerMetadata::_add_dependencies_from_metadata: " 597 | "Merged sourced constraints\n" 598 | "'%s' + '%s' = '%s'" 599 | % (current_sourced_constraints, 600 | new_sourced_constraints, 601 | self.dependencies[dep_key]['sourced_constraints'])) 602 | else: 603 | shaker.libs.logger.Logger().debug("ShakerMetadata::_add_dependencies_from_metadata: " 604 | "Root key found (%s==%s), ignoring" 605 | % (dep_key, 606 | self.root_metadata.get('formula', None))) 607 | 608 | else: 609 | shaker.libs.logger.Logger().warning("ShakerMetadata::_add_dependencies_from_metadata: " 610 | "Metadata contained no dependencies") 611 | else: 612 | shaker.libs.logger.Logger().error("ShakerMetadata::_add_dependencies_from_metadata: " 613 | "Metadata null") 614 | 615 | raise ShakerConfigException 616 | 617 | return parsed_metadata_dependencies 618 | 619 | def _add_dependency_sourced(self, 620 | dependency_key, 621 | constraint): 622 | """ 623 | Mark a dependency constraint version as being sourced 624 | 625 | Args: 626 | dependency_key(string): The key name of the dependency 627 | constraint(string): The constraint that has been sourced 628 | """ 629 | # Need to ensure we don't try to re-get this one 630 | # If we've already sourced this constrained version then we're done 631 | sourced_constraints = [constraint] 632 | if dependency_key in self.dependencies: 633 | current_sourced_constraints = self.dependencies.get(dependency_key).get('sourced_constraints', None) 634 | if current_sourced_constraints: 635 | sourced_constraints = sourced_constraints + current_sourced_constraints 636 | else: 637 | self.dependencies[dependency_key] = {} 638 | 639 | 
self.dependencies[dependency_key]["sourced_constraints"] = sourced_constraints 640 | -------------------------------------------------------------------------------- /shaker/shaker_remote.py: -------------------------------------------------------------------------------- 1 | import errno 2 | import os 3 | import shutil 4 | 5 | import shaker.libs.github 6 | import shaker.libs.logger 7 | import shaker.libs.metadata 8 | from shaker.libs.errors import ConstraintResolutionException 9 | import re 10 | import yaml 11 | 12 | class ShakerRemote: 13 | """ 14 | Class to handle communication with remote repositories, 15 | resolving dependencies and downloading them into 16 | local directories 17 | 18 | Attributes: 19 | _dependencies(dict): A dictionary of git repository targets 20 | """ 21 | _dependencies = {} 22 | _targets = [] 23 | _working_directory = '' 24 | _install_directory = '' 25 | _salt_root = '' 26 | _dynamic_modules_dirs = ['_modules', '_grains', '_renderers', 27 | '_returners', '_states'] 28 | 29 | def __init__(self, 30 | dependencies, 31 | working_directory='vendor', 32 | install_directory='formula-repos', 33 | salt_root='_root'): 34 | self._dependencies = dependencies 35 | self._working_directory = working_directory 36 | self._install_directory = install_directory 37 | self._salt_root = salt_root 38 | 39 | def update_dependencies(self): 40 | """ 41 | Update the list of targets with actual git sha 42 | targets from the dictionary of dependencies 43 | """ 44 | shaker.libs.logger.Logger().debug("ShakerRemote::update_dependencies: " 45 | "Updating the dependencies \n%s\n\n" 46 | % (self._dependencies.keys())) 47 | for dependency in self._dependencies.values(): 48 | target_sha = self._resolve_constraint_to_sha(dependency) 49 | shaker.libs.logger.Logger().debug("ShakerRemote::update_dependencies: " 50 | "Found sha '%s'" 51 | % (target_sha)) 52 | if target_sha: 53 | dependency["sha"] = target_sha 54 | 55 | def install_dependencies(self, 56 | overwrite=False, 57 | remove_directories=True, 58 |
enable_remote_check=False): 59 | """ 60 | Install each dependency specified by the formula dictionary, 61 | symlinking its directory into the roots_dir 62 | 63 | Args: 64 | overwrite(bool): True if we will delete and recreate existing 65 | directories, False to preserve them 66 | remove_directories(bool): True to delete unused directories, 67 | False to preserve them 68 | enable_remote_check(bool): False to update repositories directly, 69 | True to contact github to find shas 70 | Returns: 71 | tuple: Tuple of (successful, unsuccessful) repository updates 72 | """ 73 | self._create_directories(overwrite=overwrite) 74 | successful_updates = 0 75 | unsuccessful_updates = 0 76 | install_dir = os.path.join(self._working_directory, 77 | self._install_directory) 78 | for dependency in self._dependencies.values(): 79 | dependency_name = dependency.get("name", None) 80 | 81 | use_tag = False 82 | if not enable_remote_check: 83 | dependency_constraint = dependency.get("constraint", None) 84 | parsed_dependency_constraint = shaker.libs.metadata.parse_constraint(dependency_constraint) 85 | dependency_tag = parsed_dependency_constraint.get("tag", None) 86 | shaker.libs.logger.Logger().debug("ShakerRemote::install_dependencies: " 87 | "No remote checks, found tag '%s'" 88 | % (dependency_tag)) 89 | if dependency_tag is not None: 90 | dependency["tag"] = dependency_tag 91 | use_tag = True 92 | else: 93 | msg = ("ShakerRemote::install_dependencies: " 94 | "No tag found when remote checks disabled") 95 | raise ConstraintResolutionException(msg) 96 | else: 97 | shaker.libs.logger.Logger().debug("ShakerRemote::install_dependencies: " 98 | "Remote checks enabled on dependency %s" 99 | % (dependency)) 100 | 101 | success = shaker.libs.github.install_source(dependency, 102 | install_dir, 103 | use_tag) 104 | shaker.libs.logger.Logger().debug("ShakerRemote::install_dependencies: " 105 | "Installed '%s' to directory '%s': %s" 106 | % (dependency_name, 107 | install_dir, 108 |
success)) 109 | if success: 110 | successful_updates += 1 111 | else: 112 | unsuccessful_updates += 1 113 | 114 | success_message = "FAIL" 115 | if success: 116 | success_message = "OK" 117 | 118 | if use_tag: 119 | shaker.libs.logger.Logger().info("ShakerRemote::install_dependencies: " 120 | "Updating '%s' from tag '%s'...%s" 121 | % (dependency_name, 122 | dependency_tag, 123 | success_message)) 124 | else: 125 | shaker.libs.logger.Logger().info("ShakerRemote::install_dependencies: " 126 | "Updating '%s' from raw sha '%s'...%s" 127 | % (dependency_name, 128 | dependency.get("sha", None), 129 | success_message)) 130 | if remove_directories: 131 | for pathname in os.listdir(install_dir): 132 | found = False 133 | for value in self._dependencies.values(): 134 | name = value.get("name", None) 135 | if pathname == name: 136 | found = True 137 | break 138 | if not found: 139 | shaker.libs.logger.Logger().debug("ShakerRemote::install_dependencies: " 140 | "Deleting directory for non-existent " 141 | "dependency '%s'" 142 | % (pathname)) 143 | fullpath = os.path.join(self._working_directory, 144 | self._install_directory, 145 | pathname) 146 | shutil.rmtree(fullpath) 147 | 148 | # Do linking of modules 149 | self._update_root_links() 150 | return (successful_updates, unsuccessful_updates) 151 | 152 | def write_requirements(self, 153 | output_directory='.', 154 | output_filename='formula-requirements.txt', 155 | overwrite=False, 156 | backup=False): 157 | """ 158 | Write out the resolved dependency list, into the file 159 | in the working directory. Skip overwrite unless forced 160 | 161 | Args: 162 | output_filename(string): The filename of the output file 163 | overwrite(bool): True to overwrite a pre-existing 164 | file, False to leave it unchanged.
165 | 166 | Returns: 167 | bool: True if file written, False otherwise 168 | """ 169 | path = "%s/%s" % (output_directory, 170 | output_filename) 171 | 172 | if os.path.exists(path): 173 | if not overwrite: 174 | shaker.libs.logger.Logger().warning('ShakerRemote::write_requirements: ' 175 | ' File exists, not writing...') 176 | return False 177 | elif backup: 178 | # postfix = time.time() 179 | postfix = "last" 180 | newpath = "%s.%s" % (path, postfix) 181 | try: 182 | os.rename(path, newpath) 183 | shaker.libs.logger.Logger().info('ShakerRemote::write_requirements: ' 184 | ' File exists, renaming %s to %s.' 185 | % (path, 186 | newpath)) 187 | except OSError as e: 188 | shaker.libs.logger.Logger().error('ShakerRemote::write_requirements: ' 189 | ' Problem renaming file %s to %s: %s' 190 | % (path, 191 | newpath, 192 | e.message)) 193 | return False 194 | 195 | with open(path, 'w') as outfile: 196 | requirements = self.get_requirements() 197 | outfile.write('\n'.join(requirements)) 198 | outfile.write('\n') 199 | shaker.libs.logger.Logger().debug("ShakerRemote::write_requirements: " 200 | "Wrote file '%s'" 201 | % (path) 202 | ) 203 | return True 204 | 205 | return False 206 | 207 | def _get_formula_exports(self, dependency_info): 208 | """ 209 | Generate a list of exports based on the formula's metadata.yml. 210 | If the file is unreadable or no exports are supplied, default to `re.sub('-formula$', '', name)` 211 | 212 | example metadata.yml for formula foobar-formula 213 | ``` 214 | exports: 215 | - foo 216 | - bar 217 | ``` 218 | 219 | Returns: 220 | a list of directories from formula to link (exports supplied by formula) 221 | """ 222 | name = dependency_info.get('name', None) 223 | exports_default = [re.sub('-formula$', '', name)] 224 | metadata_path = os.path.join(self._working_directory, 225 | self._install_directory, 226 | name, 'metadata.yml') 227 | try: 228 | with open(metadata_path, 'r') as metadata_file: 229 | metadata = yaml.load(metadata_file) 230 |
shaker.libs.logger.Logger().debug("ShakerRemote::_get_formula_exports: metadata {}".format(metadata)) 231 | exports = metadata.get("exports", exports_default) 232 | except IOError: 233 | shaker.libs.logger.Logger().debug("ShakerRemote::_get_formula_exports: skipping unreadable {}".format( 234 | metadata_path 235 | )) 236 | exports = exports_default 237 | shaker.libs.logger.Logger().debug("ShakerRemote::_get_formula_exports: exports {}".format(exports)) 238 | return exports 239 | 240 | def _update_root_links(self): 241 | for dependency_info in self._dependencies.values(): 242 | shaker.libs.logger.Logger().debug("ShakerRemote::_update_root_links: " 243 | "Updating '%s'" 244 | % (dependency_info)) 245 | name = dependency_info.get('name', None) 246 | exports = self._get_formula_exports(dependency_info) 247 | # Let's link each export from this formula 248 | for export in exports: 249 | # Collect together a list of source directory paths to use for 250 | # our formula discovery and linking strategy 251 | subdir_candidates = [ 252 | { 253 | "source": os.path.join(self._working_directory, 254 | self._install_directory, 255 | name, 256 | export 257 | ), 258 | "target": os.path.join(self._working_directory, 259 | self._salt_root, 260 | export 261 | ) 262 | }, 263 | { 264 | "source": os.path.join(self._working_directory, 265 | self._install_directory, 266 | name), 267 | "target": os.path.join(self._working_directory, 268 | self._salt_root, 269 | name) 270 | }, 271 | ] 272 | subdir_found = False 273 | for subdir_candidate in subdir_candidates: 274 | source = subdir_candidate["source"] 275 | target = subdir_candidate["target"] 276 | if os.path.exists(source): 277 | if not os.path.exists(target): 278 | subdir_found = True 279 | relative_source = os.path.relpath(source, os.path.dirname(target)) 280 | os.symlink(relative_source, target) 281 | shaker.libs.logger.Logger().info("ShakerRemote::_update_root_links: " 282 | "Linking %s to %s" 283 | % (source, target)) 284 | else: 285 | msg =
("ShakerRemote::_update_root_links: " 286 | "Target '%s' conflicts with something else" 287 | % (target)) 288 | raise IOError(msg) 289 | 290 | break 291 | 292 | # If we haven't linked a root yet issue an exception 293 | if not subdir_found: 294 | msg = ("ShakerRemote::_update_root_links: " 295 | "Could not find target link for formula '%s'" 296 | % (name)) 297 | raise IOError(msg) 298 | else: 299 | self._link_dynamic_modules(name) 300 | 301 | def _link_dynamic_modules(self, dependency_name): 302 | shaker.libs.logger.Logger().debug("ShakerRemote::_link_dynamic_modules(%s) " 303 | % (dependency_name)) 304 | 305 | repo_dir = os.path.join(self._working_directory, self._install_directory, dependency_name) 306 | 307 | for libdir in self._dynamic_modules_dirs: 308 | targetdir = os.path.join(self._working_directory, 309 | self._salt_root, 310 | libdir) 311 | sourcedir = os.path.join(repo_dir, libdir) 312 | 313 | relative_source = os.path.relpath(sourcedir, targetdir) 314 | 315 | if os.path.isdir(sourcedir): 316 | for name in os.listdir(sourcedir): 317 | if not os.path.isdir(targetdir): 318 | os.mkdir(targetdir) 319 | sourcefile = os.path.join(relative_source, name) 320 | targetfile = os.path.join(targetdir, name) 321 | try: 322 | shaker.libs.logger.Logger().debug("ShakerRemote::_link_dynamic_modules: " 323 | "linking %s" 324 | % (sourcefile)) 325 | os.symlink(sourcefile, targetfile) 326 | except OSError as e: 327 | if e.errno == errno.EEXIST: # already exists 328 | shaker.libs.logger.Logger().warning("ShakerRemote::_link_dynamic_modules: " 329 | "Not linking %s as link already exists" 330 | % (sourcefile)) 331 | else: 332 | raise 333 | 334 | def _resolve_constraint_to_sha(self, 335 | dependency): 336 | """ 337 | Convert the dependency's version constraint into a downloadable 338 | sha target 339 | """ 340 | # Find our tags 341 | org = dependency.get('organisation', None) 342 | name = dependency.get('name', None) 343 | constraint = dependency.get('constraint', None) 344 | 345 | # Resolve
the constraint to an actual tag 346 | target_obj = shaker.libs.github.resolve_constraint_to_object(org, name, constraint) 347 | if target_obj: 348 | dependency["version"] = target_obj['name'] 349 | dependency["sha"] = target_obj["commit"]['sha'] 350 | shaker.libs.logger.Logger().debug("_resolve_constraint_to_sha(%s) Found version '%s' and sha '%s'" 351 | % (dependency.get('name', ''), 352 | dependency["version"], 353 | dependency["sha"])) 354 | return dependency["sha"] 355 | 356 | return None 357 | 358 | def _create_directories(self, overwrite=False): 359 | """ 360 | Make sure all our required directories are set up correctly 361 | """ 362 | if not os.path.exists(self._working_directory): 363 | os.makedirs(self._working_directory, 0755) 364 | 365 | # Delete the salt roots dir on each run. 366 | # This is because the files in roots_dir are just symlinks 367 | salt_root_path = os.path.join(self._working_directory, 368 | self._salt_root) 369 | if os.path.exists(salt_root_path): 370 | shutil.rmtree(salt_root_path) 371 | shaker.libs.logger.Logger().debug("_create_directories: Deleting salt root directory '%s'" 372 | % (salt_root_path)) 373 | os.makedirs(salt_root_path) 374 | 375 | # Ensure the repos_dir exists 376 | install_path = os.path.join(self._working_directory, 377 | self._install_directory) 378 | 379 | if not os.path.exists(install_path): 380 | try: 381 | shaker.libs.logger.Logger().debug("_create_directories: Creating repository directory '%s'" 382 | % (install_path)) 383 | os.makedirs(install_path) 384 | except OSError as e: 385 | raise IOError("There was a problem creating the directory '%s', '%s'" 386 | % (install_path, e)) 387 | elif overwrite and os.path.exists(install_path): 388 | shutil.rmtree(install_path) 389 | shaker.libs.logger.Logger().debug("_create_directories: Deleting repository directory '%s'" 390 | % (install_path)) 391 | os.makedirs(install_path) 392 | 393 | def get_requirements(self): 394 | """ 395 | Get a list of the requirements from the 
current 396 | dependency metadata 397 | 398 | Returns: 399 | (list): List of requirement strings. Format is 400 | '<organisation>/<name>==<version>' 401 | """ 402 | requirements = [] 403 | if self._dependencies: 404 | 405 | for key, info in self._dependencies.items(): 406 | entry = ("%s==%s" 407 | % (key, 408 | info.get("version", "") 409 | )) 410 | requirements.append(entry) 411 | 412 | return requirements 413 | -------------------------------------------------------------------------------- /tests/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ministryofjustice/salt-shaker/55f93b2418f91012e030984a13419e7ea8719a26/tests/__init__.py -------------------------------------------------------------------------------- /tests/libs/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ministryofjustice/salt-shaker/55f93b2418f91012e030984a13419e7ea8719a26/tests/libs/__init__.py -------------------------------------------------------------------------------- /tests/libs/test_github.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | import os 3 | import requests 4 | import responses 5 | import json 6 | import pygit2 7 | from nose.tools import raises 8 | 9 | import shaker.libs.github 10 | from shaker.libs.errors import ConstraintResolutionException 11 | 12 | 13 | class TestGithub(unittest.TestCase): 14 | 15 | _sample_response_tags = [ 16 | { 17 | "name": "v1.0.1", 18 | "zipball_url": "https://api.github.com/repos/ministryofjustice/test-formula/zipball/v1.0.1", 19 | "tarball_url": "https://api.github.com/repos/ministryofjustice/test-formula/tarball/v1.0.1", 20 | "commit": { 21 | "sha": "6826533980361f54b9de17d181830fa4ec94138c", 22 | "url": "https://api.github.com/repos/ministryofjustice/test-formula/commits/6826533980361f54b9de17d181830fa4ec94138c" 23
| } 24 | }, 25 | { 26 | "name": "v2.0.1", 27 | "zipball_url": "https://api.github.com/repos/ministryofjustice/test-formula/zipball/v2.0.1", 28 | "tarball_url": "https://api.github.com/repos/ministryofjustice/test-formula/tarball/v2.0.1", 29 | "commit": { 30 | "sha": "1d7d509b534b08b08b1f85253990b6c3f0dec007", 31 | "url": "https://api.github.com/repos/ministryofjustice/test-formula/commits/1d7d509b534b08b08b1f85253990b6c3f0dec007" 32 | } 33 | }, 34 | ] 35 | 36 | _sample_response_branches = { 37 | "name": "branch-01", 38 | "commit": { 39 | "sha": "1035f6628a5991bd8b5d7b35affaf5b22f738287", 40 | "commit": { 41 | "url": "https://api.github.com/repos/ministryofjustice/sensu-formula/git/commits/1035f6628a5991bd8b5d7b35affaf5b22f738287", 42 | "comment_count": 0 43 | }, 44 | "url": "https://api.github.com/repos/ministryofjustice/sensu-formula/commits/1035f6628a5991bd8b5d7b35affaf5b22f738287", 45 | "html_url": "https://github.com/ministryofjustice/sensu-formula/commit/1035f6628a5991bd8b5d7b35affaf5b22f738287", 46 | "comments_url": "https://api.github.com/repos/ministryofjustice/sensu-formula/commits/1035f6628a5991bd8b5d7b35affaf5b22f738287/comments", 47 | }, 48 | } 49 | 50 | def setUp(self): 51 | unittest.TestCase.setUp(self) 52 | os.environ['GITHUB_TOKEN'] = 'false' 53 | 54 | def tearDown(self): 55 | unittest.TestCase.tearDown(self) 56 | 57 | def test_parse_github_url(self): 58 | """ 59 | TestGithub: Test the components are pulled out of github urls 60 | """ 61 | url = "git@github.com:test-organisation/test1-formula.git==v1.0.1" 62 | parsed_info = shaker.libs.github.parse_github_url(url) 63 | self.assertEqual(parsed_info.get('source', ''), 64 | "git@github.com:test-organisation/test1-formula.git", 65 | "Source field not equal " 66 | "%s!=%s" 67 | % (parsed_info.get('source', ''), 68 | "git@github.com:test-organisation/test1-formula.git")) 69 | self.assertEqual(parsed_info.get('name', ''), 70 | "test1-formula", 71 | "Name field not equal " 72 | "%s!=%s" 73 | %
(parsed_info.get('name', ''), 74 | "test1-formula")) 75 | 76 | parsed_info = shaker.libs.github.parse_github_url(url) 77 | self.assertEqual(parsed_info.get('organisation', ''), 78 | "test-organisation", 79 | "Organisation field not equal " 80 | "%s!=%s" 81 | % (parsed_info.get('organisation', ''), 82 | "test-organisation")) 83 | 84 | parsed_info = shaker.libs.github.parse_github_url(url) 85 | self.assertEqual(parsed_info.get('constraint', ''), 86 | "==v1.0.1", 87 | "Constraint field not equal " 88 | "%s!=%s" 89 | % (parsed_info.get('constraint', ''), 90 | "==v1.0.1")) 91 | 92 | @responses.activate 93 | def test_get_valid_github_token(self): 94 | """ 95 | TestGithub: Test calling the get token function with a good token 96 | """ 97 | # Test validating a github token thats missing 98 | if 'GITHUB_TOKEN' in os.environ: 99 | del os.environ['GITHUB_TOKEN'] 100 | token = shaker.libs.github.get_valid_github_token() 101 | self.assertEqual(token, None, "Expected no github token to be found") 102 | 103 | # Test calling with a valid github token.
We should expect a 103 | # 200 status 104 | 105 | 106 | # Set up the successful login mock response 107 | mock_resp = [ 108 | { 109 | "message": "Successful login.", 110 | "mock": "True" 111 | } 112 | ] 113 | 114 | responses.add(responses.GET, 115 | "https://api.github.com", 116 | content_type="application/json", 117 | body=json.dumps(mock_resp), 118 | status=200 119 | ) 120 | 121 | # Set up a valid github token 122 | os.environ['GITHUB_TOKEN'] = "FAKE_VALID_TOKEN" 123 | expected_token = os.environ['GITHUB_TOKEN'] 124 | 125 | # Attempt validating the valid token 126 | github_token = shaker.libs.github.get_valid_github_token() 127 | url = 'https://api.github.com' 128 | actual_response = requests.get(url, 129 | auth=(github_token, 'x-oauth-basic')) 130 | 131 | # Check we got the right messages and statuses 132 | response_mock = json.loads(actual_response.text)[0]["mock"] 133 | self.assertTrue(response_mock, "Not working with the mocked response") 134 | self.assertEqual(actual_response.status_code, 135 | 200, 136 | "Expected 200 response, got '%s'" 137 | % actual_response.status_code) 138 | self.assertEqual(github_token, 139 | expected_token, 140 | "Expected the valid token, got '%s'" 141 | % github_token) 142 | 143 | @responses.activate 144 | def test_get_valid_github_token_online(self): 145 | """ 146 | TestGithub: Test calling the get token function with a bad token 147 | """ 148 | # Test calling with an invalid github token.
We should expect a "Bad Credentials" message 149 | # and a 401 status 150 | 151 | # Setup the incorrect credential mock responses 152 | mock_resp = [ 153 | { 154 | "documentation_url": "https://developer.github.com/v3", 155 | "message": "Bad credentials", 156 | "mock": "True" 157 | } 158 | ] 159 | 160 | responses.add(responses.GET, 161 | "https://api.github.com", 162 | content_type="application/json", 163 | body=json.dumps(mock_resp), 164 | status=401 165 | ) 166 | 167 | # Setup an invalid github token 168 | os.environ['GITHUB_TOKEN'] = "INVALID_TOKEN" 169 | 170 | # Attempt validating the invalid token 171 | github_token = shaker.libs.github.get_valid_github_token(online_validation_enabled=True) 172 | url = 'https://api.github.com' 173 | actual_response = requests.get(url, 174 | auth=(github_token, 'x-oauth-basic')) 175 | 176 | # Check we got the right messages and statuses 177 | response_mock = json.loads(actual_response.text)[0]["mock"] 178 | self.assertTrue(response_mock, "Not working with the mocked response") 179 | self.assertEqual(actual_response.status_code, 180 | 401, 181 | "Expected 401 response, got '%s'" 182 | % actual_response.status_code) 183 | self.assertEqual(github_token, 184 | None, 185 | "Expected None type token, got '%s'" 186 | % github_token) 187 | 188 | @responses.activate 189 | def test_validate_blocked_github_token(self): 190 | """ 191 | TestGithub: Test calling the get token function with a blocked token 192 | """ 193 | # Test calling with a blocked github token. We should expect a 194 | # "Maximum number of login attempts exceeded" message 195 | # and a 403 status 196 | 197 | # Setup the incorrect credential mock responses 198 | mock_resp = [ 199 | { 200 | "message": "Maximum number of login attempts exceeded.
Please try again later.", 201 | "documentation_url": "https://developer.github.com/v3", 202 | "mock": "True" 203 | } 204 | ] 205 | 206 | responses.add(responses.GET, 207 | "https://api.github.com", 208 | content_type="application/json", 209 | body=json.dumps(mock_resp), 210 | status=403 211 | ) 212 | 213 | # Setup an invalid github token 214 | os.environ['GITHUB_TOKEN'] = "INVALID_TOKEN" 215 | 216 | # Attempt validating the invalid token 217 | github_token = shaker.libs.github.get_valid_github_token(online_validation_enabled=True) 218 | url = 'https://api.github.com' 219 | actual_response = requests.get(url, 220 | auth=(github_token, 'x-oauth-basic')) 221 | 222 | # Check we got the right messages and statuses 223 | response_mock = json.loads(actual_response.text)[0]["mock"] 224 | self.assertTrue(response_mock, "Not working with the mocked response") 225 | self.assertEqual(actual_response.status_code, 226 | 403, 227 | "Expected 403 response, got '%s'" 228 | % actual_response.status_code) 229 | self.assertEqual(github_token, 230 | None, 231 | "Expected None type token, got '%s'" 232 | % github_token) 233 | 234 | @responses.activate 235 | def test_resolve_constraint_to_object_equality_resolvable(self): 236 | """ 237 | TestGithub: Test that we get the right tags for a resolvable constraint 238 | """ 239 | 240 | responses.add(responses.GET, 241 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 242 | content_type="application/json", 243 | body=json.dumps(self._sample_response_tags), 244 | status=200 245 | ) 246 | org = 'ministryofjustice' 247 | formula = 'test-formula' 248 | version = 'v1.0.1' 249 | constraint = '==%s' % version 250 | tag_data = shaker.libs.github.resolve_constraint_to_object(org, 251 | formula, 252 | constraint) 253 | wanted_tag = tag_data['name'] 254 | # Equality constraint is satisfiable 255 | self.assertEqual(wanted_tag, 256 | version, 257 | "Equality constraint should be satisfiable, " 258 | "actual:%s expected:%s" 259 | %
(wanted_tag, 260 | version)) 261 | 262 | @responses.activate 263 | @raises(ConstraintResolutionException) 264 | def test_resolve_constraint_to_object_equality_unresolvable(self): 265 | """ 266 | TestGithub: Test that we throw an unresolvable constraint error 267 | """ 268 | responses.add(responses.GET, 269 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 270 | content_type="application/json", 271 | body=json.dumps(self._sample_response_tags), 272 | status=200 273 | ) 274 | org = 'ministryofjustice' 275 | formula = 'test-formula' 276 | version = 'v666' 277 | constraint = '==%s' % version 278 | shaker.libs.github.resolve_constraint_to_object(org, 279 | formula, 280 | constraint) 281 | # We're testing for exceptions, No assertion needed 282 | 283 | @responses.activate 284 | def test_resolve_constraint_to_object_greater_than_resolvable(self): 285 | """ 286 | TestGithub: Test that we get the right tags for a resolvable constraint 287 | """ 288 | responses.add(responses.GET, 289 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 290 | content_type="application/json", 291 | body=json.dumps(self._sample_response_tags), 292 | status=200 293 | ) 294 | org = 'ministryofjustice' 295 | formula = 'test-formula' 296 | version = 'v1.1' 297 | expected_version = 'v2.0.1' 298 | constraint = '>=%s' % version 299 | tag_data = shaker.libs.github.resolve_constraint_to_object(org, 300 | formula, 301 | constraint) 302 | wanted_tag = tag_data.get('name', None) 303 | # Greater than constraint is satisfiable 304 | self.assertEqual(wanted_tag, 305 | expected_version, 306 | "Greater than constraint should be satisfiable, " 307 | "actual:%s expected:%s" 308 | % (wanted_tag, 309 | expected_version)) 310 | 311 | @responses.activate 312 | @raises(ConstraintResolutionException) 313 | def test_resolve_constraint_to_object_greater_than_unresolvable(self): 314 | """ 315 | TestGithub: Test that we throw an unresolvable constraint error 316 | """ 317 |
responses.add(responses.GET, 318 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 319 | content_type="application/json", 320 | body=json.dumps(self._sample_response_tags), 321 | status=200 322 | ) 323 | org = 'ministryofjustice' 324 | formula = 'test-formula' 325 | version = 'v2.1' 326 | constraint = '>=%s' % version 327 | shaker.libs.github.resolve_constraint_to_object(org, 328 | formula, 329 | constraint) 330 | # We're testing for exceptions, No assertion needed 331 | 332 | @responses.activate 333 | def test_resolve_constraint_to_object_less_than_resolvable(self): 334 | """ 335 | TestGithub: Test that we get the right tags for a resolvable constraint 336 | """ 337 | responses.add(responses.GET, 338 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 339 | content_type="application/json", 340 | body=json.dumps(self._sample_response_tags), 341 | status=200 342 | ) 343 | org = 'ministryofjustice' 344 | formula = 'test-formula' 345 | version = 'v1.1' 346 | expected_version = 'v1.0.1' 347 | constraint = '<=%s' % version 348 | tag_data = shaker.libs.github.resolve_constraint_to_object(org, 349 | formula, 350 | constraint) 351 | wanted_tag = tag_data['name'] 352 | # Less-than constraint is satisfiable 353 | self.assertEqual(wanted_tag, 354 | expected_version, 355 | "Less than constraint should be satisfiable, " 356 | "actual:%s expected:%s" 357 | % (wanted_tag, 358 | expected_version)) 359 | 360 | @responses.activate 361 | @raises(ConstraintResolutionException) 362 | def test_resolve_constraint_to_object_lesser_than_unresolvable(self): 363 | """ 364 | TestGithub: Test that we throw an unresolvable constraint error 365 | """ 366 | responses.add(responses.GET, 367 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 368 | content_type="application/json", 369 | body=json.dumps(self._sample_response_tags), 370 | status=200 371 | ) 372 | org = 'ministryofjustice' 373 | formula = 'test-formula' 374 | version = 'v1.0' 375 
| constraint = '<=%s' % version 376 | shaker.libs.github.resolve_constraint_to_object(org, 377 | formula, 378 | constraint) 379 | # We're testing for exceptions, No assertion needed 380 | 381 | @responses.activate 382 | def test_get_valid_tags(self): 383 | responses.add(responses.GET, 384 | 'https://api.github.com/repos/ministryofjustice/test-formula/tags', 385 | content_type="application/json", 386 | body=json.dumps(self._sample_response_tags), 387 | status=200 388 | ) 389 | org = 'ministryofjustice' 390 | formula = 'test-formula' 391 | wanted_tag, tag_versions, _ = shaker.libs.github.get_valid_tags(org, formula) 392 | expected_wanted_tag = "v2.0.1" 393 | expected_tag_versions = ["1.0.1", "2.0.1"] 394 | 395 | self.assertEqual(wanted_tag, expected_wanted_tag, "Actual wanted tag '%s', expected '%s'" 396 | % (wanted_tag, expected_wanted_tag)) 397 | self.assertEqual(tag_versions, expected_tag_versions, "Actual tag versions '%s', expected '%s'" 398 | % (tag_versions, expected_tag_versions)) 399 | 400 | def test_get_latest_tag_no_prereleases(self): 401 | """ 402 | Test latest tag with no prerelease 403 | """ 404 | include_prereleases = False 405 | tag_versions = [ 406 | "1.1.1", 407 | "2.2.2-prerelease", 408 | "notathing", 409 | "3.3.3stillnotathing" 410 | ] 411 | expected_latest_tag = "1.1.1" 412 | 413 | actual_latest_tag = shaker.libs.github.get_latest_tag(tag_versions, include_prereleases) 414 | self.assertEqual(actual_latest_tag, 415 | expected_latest_tag, 416 | "Actual tag %s != %s" 417 | % (actual_latest_tag, 418 | expected_latest_tag)) 419 | 420 | def test_get_latest_tag_prereleases(self): 421 | """ 422 | Test get latest tag with prereleases included 423 | """ 424 | include_prereleases = True 425 | tag_versions = [ 426 | "1.1.1", 427 | "2.2.2-prerelease", 428 | "notathing", 429 | "3.3.3prerelease2" 430 | ] 431 | expected_latest_tag = "3.3.3prerelease2" 432 | 433 | actual_latest_tag = shaker.libs.github.get_latest_tag(tag_versions, include_prereleases) 434 | 
self.assertEqual(actual_latest_tag, 435 | expected_latest_tag, 436 | "Actual tag %s != %s" 437 | % (actual_latest_tag, 438 | expected_latest_tag)) 439 | 440 | def test_parse_semver_tag_release(self): 441 | """ 442 | Parse a valid release tag 443 | """ 444 | tag = "v1.2.3" 445 | result = shaker.libs.github.parse_semver_tag(tag) 446 | expected_result = { 447 | "major": 1, 448 | "minor": 2, 449 | "patch": 3, 450 | "postfix": None 451 | } 452 | self.assertEqual(result, 453 | expected_result, 454 | "%s != %s" 455 | % (result, 456 | expected_result)) 457 | 458 | def test_parse_semver_tag_prerelease(self): 459 | """ 460 | Parse a valid prerelease tag 461 | """ 462 | tag = "v1.2.3-prerelease" 463 | result = shaker.libs.github.parse_semver_tag(tag) 464 | expected_result = { 465 | "major": 1, 466 | "minor": 2, 467 | "patch": 3, 468 | "postfix": "prerelease" 469 | } 470 | self.assertEqual(result, 471 | expected_result, 472 | "%s != %s" 473 | % (result, 474 | expected_result)) 475 | 476 | def test_parse_semver_tag_noncompliant_prerelease(self): 477 | """ 478 | Parse a prerelease with a non-compliant tag 479 | """ 480 | tag = "v1.2.3ijidsja" 481 | result = shaker.libs.github.parse_semver_tag(tag) 482 | expected_result = { 483 | "major": 1, 484 | "minor": 2, 485 | "patch": 3, 486 | "postfix": 'ijidsja' 487 | } 488 | self.assertEqual(result, 489 | expected_result, 490 | "%s != %s" 491 | % (result, 492 | expected_result)) 493 | 494 | def test_parse_semver_tag_noncompliant(self): 495 | """ 496 | Parse a non-compliant tag 497 | """ 498 | tag = "v1.2.3ijidsja" 499 | result = shaker.libs.github.parse_semver_tag(tag) 500 | expected_result = { 501 | "major": 1, 502 | "minor": 2, 503 | "patch": 3, 504 | "postfix": 'ijidsja' 505 | } 506 | self.assertEqual(result, 507 | expected_result, 508 | "%s != %s" 509 | % (result, 510 | expected_result)) 511 | 512 | def test_is_tag_release(self): 513 | """ 514 | Test tag is a valid release 515 | """ 516 | tag = "v1.2.3" 517 | actual_result = 
shaker.libs.github.is_tag_release(tag) 518 | self.assertTrue(actual_result, "%s should be a release" % tag) 519 | 520 | def test_is_tag_release_prerelease(self): 521 | """ 522 | Test is not a valid release, it's a prerelease 523 | """ 524 | tag = "v1.2.3-prereleases" 525 | actual_result = shaker.libs.github.is_tag_release(tag) 526 | self.assertFalse(actual_result, "%s should not be a release" % tag) 527 | 528 | def test_is_tag_release_noncompliant_prerelease(self): 529 | """ 530 | Test is not a valid release, it's a prerelease 531 | """ 532 | tag = "v1.2.3prereleases1" 533 | actual_result = shaker.libs.github.is_tag_release(tag) 534 | self.assertFalse(actual_result, "%s should not be a release" % tag) 535 | 536 | def test_is_tag_release_notcompliant(self): 537 | """ 538 | Test is not a valid release, it's not compliant 539 | """ 540 | tag = "v1.2.ijidsja" 541 | actual_result = shaker.libs.github.is_tag_release(tag) 542 | self.assertFalse(actual_result, "%s should not be a release" % tag) 543 | 544 | def test_is_tag_prerelease_release(self): 545 | """ 546 | Test tag is not a valid prerelease, it's a release 547 | """ 548 | tag = "v1.2.3" 549 | actual_result = shaker.libs.github.is_tag_prerelease(tag) 550 | self.assertFalse(actual_result, "%s should be a release, not a prerelease" % tag) 551 | 552 | def test_is_tag_prerelease_prerelease(self): 553 | """ 554 | Test is a valid prerelease 555 | """ 556 | tag = "v1.2.3-prereleases1" 557 | actual_result = shaker.libs.github.is_tag_prerelease(tag) 558 | self.assertTrue(actual_result, "%s should be a prerelease" % tag) 559 | 560 | def test_is_tag_prerelease_noncompliant(self): 561 | """ 562 | Test is a valid prerelease with noncompliant tags 563 | """ 564 | tag = "v1.2.3a" 565 | actual_result = shaker.libs.github.is_tag_prerelease(tag) 566 | self.assertTrue(actual_result, "%s should be an alpha prerelease" % tag) 567 | tag = "v1.2.3asdad" 568 | actual_result = shaker.libs.github.is_tag_prerelease(tag) 569 | 
self.assertTrue(actual_result, "%s should be an alpha prerelease" % tag) 570 | 571 | @responses.activate 572 | def test_resolve_constraint_to_object_branch_equality_resolvable(self): 573 | """ 574 | TestGithub: Test that we get the right branch for a resolvable constraint 575 | """ 576 | 577 | responses.add(responses.GET, 578 | 'https://api.github.com/repos/ministryofjustice/test-formula/branches/branch-01', 579 | content_type="application/json", 580 | body=json.dumps(self._sample_response_branches), 581 | status=200 582 | ) 583 | org = 'ministryofjustice' 584 | formula = 'test-formula' 585 | version = 'branch-01' 586 | constraint = '==%s' % version 587 | tag_data = shaker.libs.github.resolve_constraint_to_object(org, 588 | formula, 589 | constraint) 590 | wanted_tag = tag_data['name'] 591 | # Equality constraint is satisfiable 592 | self.assertEqual(wanted_tag, 593 | version, 594 | "Branch equality constraint should be satisfiable, " 595 | "actual:%s expected:%s" 596 | % (wanted_tag, 597 | version)) 598 | 599 | @responses.activate 600 | @raises(ConstraintResolutionException) 601 | def test_resolve_constraint_to_object_branch_equality_unresolvable(self): 602 | """ 603 | TestGithub: Test that we throw an unresolvable constraint error 604 | when branch doesn't exist 605 | """ 606 | 607 | # setup a mock response - branch not found 608 | mock_resp = [ 609 | { 610 | "message": "Branch not found", 611 | "documentation_url": "https://developer.github.com/v3/repos/#get-branch" 612 | } 613 | ] 614 | 615 | responses.add(responses.GET, 616 | 'https://api.github.com/repos/ministryofjustice/test-formula/branches/branch-01', 617 | content_type="application/json", 618 | body=json.dumps(mock_resp), 619 | status=403 620 | ) 621 | org = 'ministryofjustice' 622 | formula = 'test-formula' 623 | branch_name = 'branch-01' 624 | constraint = '==%s' % branch_name 625 | shaker.libs.github.resolve_constraint_to_object(org, 626 | formula, 627 | constraint) 628 | # We're testing for 
exceptions, No assertion needed 629 | -------------------------------------------------------------------------------- /tests/libs/test_metadata.py: -------------------------------------------------------------------------------- 1 | from unittest import TestCase 2 | from shaker.libs import metadata 3 | from nose.tools import raises 4 | from shaker.libs.errors import ConstraintResolutionException 5 | from mock import patch 6 | 7 | 8 | class TestMetadata(TestCase): 9 | 10 | # Sample metadata with duplicates 11 | _sample_metadata_duplicates = { 12 | "dependencies": [ 13 | "git@github.com:test_organisation/test1-formula.git==v1.0.1", 14 | "git@github.com:test_organisation/test1-formula.git==v1.0.2", 15 | "git@github.com:test_organisation/test2-formula.git==v2.0.1", 16 | "git@github.com:test_organisation/test3-formula.git==v3.0.1", 17 | "git@github.com:test_organisation/test3-formula.git==v3.0.2" 18 | ], 19 | "entry": ["dummy"] 20 | } 21 | 22 | _sample_metadata_no_duplicates = { 23 | "dependencies": [ 24 | "git@github.com:test_organisation/test1-formula.git==v1.0.1", 25 | "git@github.com:test_organisation/test2-formula.git==v2.0.1", 26 | "git@github.com:test_organisation/test3-formula.git==v3.0.1" 27 | ], 28 | "entry": ["dummy"] 29 | } 30 | 31 | def test_resolve_constraints_equality(self): 32 | """ 33 | TestMetadata: Test == constraints are resolved correctly 34 | """ 35 | 36 | # Under a simple resolve, the current constraint should win 37 | new_constraint = '==v0.1' 38 | current_constraint = '==v1.1' 39 | constraint = metadata.resolve_constraints(new_constraint, 40 | current_constraint) 41 | self.assertEqual(constraint, 42 | current_constraint, 43 | "Under a simple resolve, the current constraint should win. 
" 44 | "Actual '%s', Expected %s" 45 | % (constraint, 46 | current_constraint)) 47 | 48 | def test_resolve_constraints_greater_than(self): 49 | """ 50 | TestMetadata: Test >= constraints are resolved correctly 51 | """ 52 | # With >=, the largest constraint should win 53 | new_constraint = '>=v1.2' 54 | current_constraint = '>=v1.1' 55 | constraint = metadata.resolve_constraints(new_constraint, 56 | current_constraint) 57 | self.assertEqual(constraint, 58 | new_constraint, 59 | "With >=, the largest constraint should win " 60 | "Actual '%s', Expected %s" 61 | % (constraint, 62 | new_constraint)) 63 | 64 | def test_resolve_constraints_less_than(self): 65 | """ 66 | TestMetadata: Test >= constraints are resolved correctly 67 | """ 68 | # With >=, the largest constraint should win 69 | new_constraint = '<=v1.2' 70 | current_constraint = '<=v1.1' 71 | constraint = metadata.resolve_constraints(new_constraint, 72 | current_constraint) 73 | self.assertEqual(constraint, 74 | current_constraint, 75 | "With <=, the least constraint should win " 76 | "Actual '%s', Expected %s" 77 | % (constraint, 78 | current_constraint)) 79 | 80 | @raises(ConstraintResolutionException) 81 | def test_resolve_constraints_unequal(self): 82 | """ 83 | TestMetadata: Test unequal constraints are resolved correctly 84 | """ 85 | # Expect an exception on unequal constraints 86 | new_constraint = '<=v1.2' 87 | current_constraint = '>=v1.1' 88 | constraint = metadata.resolve_constraints(new_constraint, 89 | current_constraint) 90 | self.assertEqual(constraint, 91 | current_constraint, 92 | "Expect an exception on unequal constraints" 93 | % (constraint, 94 | current_constraint)) 95 | 96 | def test_resolve_metadata_duplicates(self): 97 | """ 98 | TestMetadata: Check if we successfully remove duplicates from a sample metadata 99 | """ 100 | original_metadata = self._sample_metadata_duplicates 101 | expected_metadata = self._sample_metadata_no_duplicates 102 | resolved_metadata = 
metadata.resolve_metadata_duplicates(original_metadata) 103 | 104 | expected_metadata_dependencies = expected_metadata["dependencies"] 105 | resolved_metadata_dependencies = resolved_metadata["dependencies"] 106 | expected_metadata_entries = expected_metadata["entry"] 107 | resolved_metadata_entries = resolved_metadata["entry"] 108 | 109 | # Test dependencies found 110 | for expected_metadata_dependency in expected_metadata_dependencies: 111 | self.assertTrue(expected_metadata_dependency in resolved_metadata_dependencies, 112 | "test_resolve_metadata_duplicates: dependency '%s' " 113 | "not found in de-duplicated metadata" 114 | % (expected_metadata_dependency)) 115 | 116 | # Test entry found 117 | for expected_metadata_entry in expected_metadata_entries: 118 | self.assertTrue(expected_metadata_entry in resolved_metadata_entries, 119 | "test_resolve_metadata_duplicates: Entry '%s' " 120 | "not found in de-duplicated metadata" 121 | % (expected_metadata_entry)) 122 | 123 | @raises(TypeError) 124 | def test_resolve_metadata_duplicates_bad_metadata_object(self): 125 | """ 126 | TestMetadata: Check if bad yaml metadata will throw up a TypeError. 
127 | """ 128 | # Callable with bad metadata 129 | metadata.resolve_metadata_duplicates("not-a-dictionary") 130 | 131 | @raises(IndexError) 132 | def test_resolve_metadata_duplicates_metadata_missing_index(self): 133 | """ 134 | TestMetadata: Check if metadata with a missing index will throw an error 135 | """ 136 | metadata.resolve_metadata_duplicates({}) 137 | 138 | @patch('shaker.libs.github.parse_github_url') 139 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_local_metadata') 140 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_metadata') 141 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_requirements') 142 | def test_parse_metadata_requirements_raw(self, 143 | mock_load_local_requirements, 144 | mock_load_local_metadata, 145 | mock_fetch_local_metadata, 146 | mock_parse_github_url): 147 | requirements = [ 148 | 'git@github.com:test_organisation/some-formula.git==v1.0', 149 | 'git@github.com:test_organisation/another-formula.git>=v2.0' 150 | ] 151 | 152 | expected_result = { 153 | 'test_organisation/some-formula': 154 | { 155 | 'source': 'git@github.com:test_organisation/some-formula.git', 156 | 'constraint': '==v1.0', 157 | 'sourced_constraints': [], 158 | 'organisation': 'test_organisation', 159 | 'name': 'some-formula' 160 | }, 161 | 'test_organisation/another-formula': 162 | { 163 | 'source': 'git@github.com:test_organisation/another-formula.git', 164 | 'constraint': '>=v2.0', 165 | 'sourced_constraints': [], 166 | 'organisation': 'test_organisation', 167 | 'name': 'another-formula' 168 | } 169 | } 170 | mock_parse_github_url.side_effect = [ 171 | { 172 | 'source': 'git@github.com:test_organisation/some-formula.git', 173 | 'constraint': '==v1.0', 174 | 'sourced_constraints': ['==v1.0'], 175 | 'organisation': 'test_organisation', 176 | 'name': 'some-formula' 177 | }, 178 | { 179 | 'source': 'git@github.com:test_organisation/another-formula.git', 180 | 'constraint': '>=v2.0', 181 | 'sourced_constraints': ['==v2.0'], 182 | 
'organisation': 'test_organisation', 183 | 'name': 'another-formula' 184 | }, 185 | None 186 | ] 187 | # PEP8 requires unused mock being used 188 | mock_load_local_requirements.return_value = None 189 | mock_load_local_metadata.return_value = None 190 | mock_fetch_local_metadata.return_value = None 191 | 192 | actual_result = metadata.parse_metadata_requirements(requirements) 193 | 194 | self.assertEqual(actual_result, 195 | expected_result, 196 | "TestMetadata::test_parse_metadata_requirements_raw: Mismatch\n" 197 | "Actual: %s\nExpected: %s\n\n" 198 | % (actual_result, 199 | expected_result)) 200 | 201 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_local_metadata') 202 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_metadata') 203 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_requirements') 204 | def test_parse_metadata_requirements_simple(self, 205 | mock_load_local_requirements, 206 | mock_load_local_metadata, 207 | mock_fetch_local_metadata): 208 | requirements = [ 209 | 'test_organisation/some-formula==v1.0', 210 | 'test_organisation/another-formula>=v2.0' 211 | ] 212 | 213 | expected_result = { 214 | 'test_organisation/some-formula': 215 | { 216 | 'source': 'git@github.com:test_organisation/some-formula.git', 217 | 'constraint': '==v1.0', 218 | 'sourced_constraints': [], 219 | 'organisation': 'test_organisation', 220 | 'name': 'some-formula' 221 | }, 222 | 'test_organisation/another-formula': 223 | { 224 | 'source': 'git@github.com:test_organisation/another-formula.git', 225 | 'constraint': '>=v2.0', 226 | 'sourced_constraints': [], 227 | 'organisation': 'test_organisation', 228 | 'name': 'another-formula' 229 | } 230 | } 231 | 232 | # PEP8 requires unused mock being used 233 | mock_load_local_requirements.return_value = None 234 | mock_load_local_metadata.return_value = None 235 | mock_fetch_local_metadata.return_value = None 236 | 237 | actual_result = metadata.parse_metadata_requirements(requirements) 238 | 239 | 
self.assertEqual(actual_result, 240 | expected_result, 241 | "TestMetadata::test_parse_metadata_requirements_simple: Mismatch\n" 242 | "Actual: %s\nExpected: %s\n\n" 243 | % (actual_result, expected_result)) 244 | 245 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_local_metadata') 246 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_metadata') 247 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_requirements') 248 | def test_parse_metadata_requirements_simple_with_blanks(self, 249 | mock_load_local_requirements, 250 | mock_load_local_metadata, 251 | mock_fetch_local_metadata): 252 | """ 253 | TestMetadata: Test that blanks are accepted in the formula constraints 254 | """ 255 | requirements = [ 256 | 'test_organisation/some-formula == v1.0', 257 | 'test_organisation/another-formula >= v2.0' 258 | ] 259 | 260 | expected_result = { 261 | 'test_organisation/some-formula': 262 | { 263 | 'source': 'git@github.com:test_organisation/some-formula.git', 264 | 'constraint': '==v1.0', 265 | 'sourced_constraints': [], 266 | 'organisation': 'test_organisation', 267 | 'name': 'some-formula' 268 | }, 269 | 'test_organisation/another-formula': 270 | { 271 | 'source': 'git@github.com:test_organisation/another-formula.git', 272 | 'constraint': '>=v2.0', 273 | 'sourced_constraints': [], 274 | 'organisation': 'test_organisation', 275 | 'name': 'another-formula' 276 | } 277 | } 278 | 279 | # PEP8 requires unused mock being used 280 | mock_load_local_requirements.return_value = None 281 | mock_load_local_metadata.return_value = None 282 | mock_fetch_local_metadata.return_value = None 283 | 284 | actual_result = metadata.parse_metadata_requirements(requirements) 285 | 286 | self.assertEqual(actual_result, 287 | expected_result, 288 | "TestMetadata::test_parse_metadata_requirements_simple_with_blanks: Mismatch\n" 289 | "Actual: %s\nExpected: %s\n\n" 290 | % (actual_result, expected_result)) 291 | 292 | def test_compare_requirements_equal(self): 293 
| """ 294 | TestShakerMetadata: Test comparing different requirements equal 295 | """ 296 | previous_requirements = [ 297 | "test_organisation/test1-formula==v1.0.1", 298 | "test_organisation/test2-formula==v2.0.1", 299 | "test_organisation/test3-formula==v3.0.1", 300 | ] 301 | new_requirements = [ 302 | "test_organisation/test1-formula==v1.0.1", 303 | "test_organisation/test2-formula==v2.0.1", 304 | "test_organisation/test3-formula==v3.0.1", 305 | ] 306 | actual_result = metadata.compare_requirements(previous_requirements, 307 | new_requirements) 308 | self.assertEqual(0, 309 | len(actual_result), 310 | "Comparison should have no difference") 311 | 312 | def test_compare_requirements_new_entry(self): 313 | """ 314 | TestShakerMetadata: Test comparing different requirements new entries 315 | """ 316 | previous_requirements = [ 317 | "test_organisation/test1-formula==v1.0.1", 318 | "test_organisation/test2-formula==v2.0.1", 319 | ] 320 | new_requirements = [ 321 | "test_organisation/test1-formula==v1.0.1", 322 | "test_organisation/test2-formula==v2.0.1", 323 | "test_organisation/test3-formula==v3.0.1", 324 | ] 325 | actual_result = metadata.compare_requirements(previous_requirements, 326 | new_requirements) 327 | expected_result = [ 328 | ['', "test_organisation/test3-formula==v3.0.1"] 329 | ] 330 | self.assertEqual(actual_result, 331 | expected_result, 332 | ("Comparison should have deprecated entry\n" 333 | "Actual: '%s'\n" 334 | "Expected: %s\n") 335 | % (actual_result, 336 | expected_result)) 337 | 338 | def test_compare_requirements_deprecated_entry(self): 339 | """ 340 | TestShakerMetadata: Test comparing different requirements deprecated entries 341 | """ 342 | previous_requirements = [ 343 | "test_organisation/test1-formula==v1.0.1", 344 | "test_organisation/test2-formula==v2.0.1", 345 | "test_organisation/test3-formula==v3.0.1", 346 | ] 347 | new_requirements = [ 348 | "test_organisation/test1-formula==v1.0.1", 349 | 
"test_organisation/test2-formula==v2.0.1", 350 | ] 351 | actual_result = metadata.compare_requirements(previous_requirements, 352 | new_requirements) 353 | expected_result = [ 354 | ["test_organisation/test3-formula==v3.0.1", ""] 355 | ] 356 | self.assertEqual(actual_result, 357 | expected_result, 358 | ("Comparison should have new entry\n" 359 | "Actual: '%s'\n" 360 | "Expected: %s\n") 361 | % (actual_result, 362 | expected_result)) 363 | 364 | def test_compare_requirements_new_versions(self): 365 | """ 366 | TestShakerMetadata: Test comparing different requirements versions 367 | """ 368 | previous_requirements = [ 369 | "test_organisation/test1-formula==v1.0.10", 370 | "test_organisation/test2-formula==v2.0.1", 371 | "test_organisation/test3-formula==v3.0.1", 372 | ] 373 | new_requirements = [ 374 | "test_organisation/test1-formula==v1.0.1", 375 | "test_organisation/test2-formula==v2.0.10", 376 | "test_organisation/test3-formula==v3.0.1", 377 | 378 | ] 379 | actual_result = metadata.compare_requirements(previous_requirements, 380 | new_requirements) 381 | expected_result = [ 382 | ["test_organisation/test1-formula==v1.0.10", 383 | "test_organisation/test1-formula==v1.0.1"], 384 | ["test_organisation/test2-formula==v2.0.1", 385 | "test_organisation/test2-formula==v2.0.10"] 386 | ] 387 | self.assertEqual(actual_result, 388 | expected_result, 389 | ("Comparison should have new version\n" 390 | "Actual: '%s'\n" 391 | "Expected: %s\n") 392 | % (actual_result, 393 | expected_result)) 394 | -------------------------------------------------------------------------------- /tests/libs/test_pygit2_utils.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import traceback 3 | 4 | from unittest import TestCase 5 | from shaker.libs import pygit2_utils 6 | from mock import patch 7 | from nose.tools import raises 8 | 9 | import pygit2 10 | 11 | 12 | class TestPygit2Utils(TestCase): 13 | 14 | 
@raises(pygit2_utils.Pygit2KepairFromAgentUnsupportedError) 15 | def test_pygit2_parse_error__attributeerror(self): 16 | """ 17 | Test pygit2_parse_error raises correct exception on attribute error 18 | """ 19 | e = AttributeError("'module' object has no attribute 'KeypairFromAgent'") 20 | pygit2_utils.pygit2_parse_error(e) 21 | 22 | @raises(pygit2_utils.Pygit2SSHUnsupportedError) 23 | def test_pygit2_parse_error__credentialserror(self): 24 | """ 25 | Test pygit2_parse_error raises correct exception on credentials error 26 | """ 27 | e = pygit2.GitError("Unsupported URL protocol") 28 | pygit2_utils.pygit2_parse_error(e) 29 | 30 | def test_pygit2_parse_error__preserves_backtrace(self): 31 | 32 | def sub_func(): 33 | x = {} 34 | x.i_should_throw() 35 | 36 | try: 37 | try: 38 | sub_func() 39 | except AttributeError as e: 40 | pygit2_utils.pygit2_parse_error(e) 41 | 42 | self.fail("Should have thrown an exception") 43 | except: 44 | tb = traceback.extract_tb(sys.exc_info()[2]) 45 | filename,lineno,method,code = tb[-1] 46 | self.assertEqual(method, "sub_func") 47 | self.assertEqual(code, "x.i_should_throw()") 48 | 49 | @patch("pygit2.credentials", spec={}) 50 | def test_pygit2_check_credentials__no_keychain(self, mock_pygit2_credentials): 51 | """ 52 | TestPygit2Utils:test_pygit2_check_credentials_no_keychain: Check for missing credentials support 53 | """ 54 | mock_pygit2_credentials.return_value = None 55 | result = pygit2_utils.pygit2_check_credentials() 56 | self.assertFalse(result) 57 | 58 | @raises(pygit2_utils.Pygit2SSHUnsupportedError) 59 | @patch("shaker.libs.pygit2_utils.pygit2_check_ssh") 60 | def test_pygit2_check__no_ssh(self, 61 | mock_check_ssh): 62 | """ 63 | Test pygit_check raises exception when ssh check fails 64 | """ 65 | mock_check_ssh.return_value = False 66 | pygit2_utils.pygit2_check() 67 | 68 | @raises(pygit2_utils.Pygit2KepairFromAgentUnsupportedError) 69 | @patch("shaker.libs.pygit2_utils.pygit2_check_credentials") 70 | 
@patch("shaker.libs.pygit2_utils.pygit2_check_ssh") 71 | def test_pygit2_check__no_credentials(self, 72 | mock_check_ssh, 73 | mock_check_credentials): 74 | """ 75 | Test pygit_check raises exception when credential check fails 76 | """ 77 | mock_check_ssh.return_value = True 78 | mock_check_credentials.return_value = False 79 | pygit2_utils.pygit2_check() 80 | 81 | @patch("shaker.libs.pygit2_utils.pygit2") 82 | def test_pygit2_check_ssh__have_ssh(self, 83 | mock_pygit2): 84 | """ 85 | TestPygit2Utils:test_pygit2_check_ssh: Check for successful ssh support 86 | """ 87 | # Mock needed pygit2 components 88 | mock_pygit2.features = pygit2.GIT_FEATURE_SSH 89 | mock_pygit2.GIT_FEATURE_SSH = pygit2.GIT_FEATURE_SSH 90 | mock_pygit2.GIT_FEATURE_HTTPS = pygit2.GIT_FEATURE_HTTPS 91 | 92 | result = pygit2_utils.pygit2_check_ssh() 93 | self.assertTrue(result) 94 | 95 | @patch("shaker.libs.pygit2_utils.pygit2") 96 | def test_pygit2_check_ssh__no_ssh(self, 97 | mock_pygit2): 98 | """ 99 | TestPygit2Utils:test_pygit2_check_no_ssh: Check for missing ssh support 100 | """ 101 | # Mock needed pygit2 components 102 | mock_pygit2.features = pygit2.GIT_FEATURE_HTTPS 103 | mock_pygit2.GIT_FEATURE_SSH = pygit2.GIT_FEATURE_SSH 104 | mock_pygit2.GIT_FEATURE_HTTPS = pygit2.GIT_FEATURE_HTTPS 105 | 106 | result = pygit2_utils.pygit2_check_ssh() 107 | self.assertFalse(result) 108 | 109 | @patch("pygit2.credentials", spec={}) 110 | def test_pygit2_check_credentials__have_keychain(self, mock_pygit2_credentials): 111 | """ 112 | TestPygit2Utils:test_pygit2_check_credentials_keychain: Check for successful credentials support 113 | """ 114 | mock_pygit2_credentials.KeypairFromAgent = "" 115 | result = pygit2_utils.pygit2_check_credentials() 116 | self.assertTrue(result) 117 | -------------------------------------------------------------------------------- /tests/test_shaker_metadata.py: -------------------------------------------------------------------------------- 1 | from unittest import 
TestCase 2 | from mock import patch 3 | from mock import mock_open 4 | from nose.tools import raises 5 | import testfixtures 6 | import responses 7 | import json 8 | 9 | import logging 10 | import shaker.libs.logger 11 | from shaker.shaker_metadata import ShakerMetadata 12 | from shaker.libs.errors import GithubRepositoryConnectionException 13 | 14 | 15 | class TestShakerMetadata(TestCase): 16 | 17 | _sample_metadata_root = { 18 | "formula": "test_organisation/root-formula", 19 | 'dependencies': 20 | { 21 | 'test_organisation/test1-formula': 22 | { 23 | 'source': 'git@github.com:test_organisation/test1-formula.git', 24 | 'constraint': '==v1.0.1', 25 | 'sourced_constraints': [], 26 | 'organisation': 'test_organisation', 27 | 'name': 'test1-formula' 28 | }, 29 | 'test_organisation/test2-formula': 30 | { 31 | 'source': 'git@github.com:test_organisation/test2-formula.git', 32 | 'constraint': '==v2.0.1', 33 | 'sourced_constraints': [], 34 | 'organisation': 'test_organisation', 35 | 'name': 'test2-formula' 36 | } 37 | } 38 | } 39 | 40 | _sample_metadata_test1 = { 41 | "name": "test_organisation/test1-formula", 42 | "dependencies": [ 43 | "git@github.com:test_organisation/test3-formula.git==v3.0.1", 44 | ] 45 | } 46 | 47 | _sample_metadata_test2 = { 48 | "name": "test_organisation/test2-formula", 49 | "dependencies": [ 50 | "git@github.com:test_organisation/test3-formula.git==v3.0.2", 51 | ] 52 | } 53 | _sample_metadata_test3 = { 54 | "name": "test_organisation/test3-formula", 55 | "dependencies": [ 56 | "git@github.com:test_organisation/root-formula.git==v1.0.2", 57 | ] 58 | } 59 | _sample_requirements_root = { 60 | "git@github.com:test_organisation/testa-formula.git==v1.0.1", 61 | "git@github.com:test_organisation/testb-formula.git==v2.0.1", 62 | } 63 | 64 | _sample_requirements_testa = { 65 | "git@github.com:test_organisation/testc-formula.git==v3.0.1", 66 | } 67 | 68 | _sample_requirements_testb = { 69 | "dependencies": [ 70 | 
"git@github.com:test_organisation/testc-formula.git==v3.0.2", 71 | ] 72 | } 73 | _sample_requirements_testc = {} 74 | 75 | _sample_root_formula = { 76 | 'formula': 'test_organisation/root-formula', 77 | 'organisation': 'test_organisation', 78 | 'name': 'root-formula', 79 | 'dependencies': { 80 | 'test_organisation/test1-formula': 81 | { 82 | 'source': 'git@github.com:test_organisation/test1-formula.git', 83 | 'constraint': '==v1.0.1', 84 | 'sourced_constraints': [], 85 | 'organisation': 'test_organisation', 86 | 'name': 'test1-formula' 87 | }, 88 | 'test_organisation/test2-formula': 89 | { 90 | 'source': 'git@github.com:test_organisation/test2-formula.git', 91 | 'constraint': '==v2.0.1', 92 | 'sourced_constraints': [], 93 | 'organisation': 'test_organisation', 94 | 'name': 'test2-formula' 95 | } 96 | } 97 | } 98 | 99 | _sample_dependencies = { 100 | 'test_organisation/test1-formula': { 101 | 'source': 'git@github.com:test_organisation/test1-formula.git', 102 | 'constraint': '==v1.0.1', 103 | 'sourced_constraints': [], 104 | 'organisation': 'test_organisation', 105 | 'name': 'test1-formula' 106 | }, 107 | 'test_organisation/test2-formula': { 108 | 'source': 'git@github.com:test_organisation/test2-formula.git', 109 | 'constraint': '==v2.0.1', 110 | 'sourced_constraints': [], 111 | 'organisation': 'test_organisation', 112 | 'name': 'test2-formula' 113 | }, 114 | 'test_organisation/test3-formula': { 115 | 'source': 'git@github.com:test_organisation/test3-formula.git', 116 | 'constraint': '==v3.0.1', 117 | 'sourced_constraints': [], 118 | 'organisation': 'test_organisation', 119 | 'name': 'test3-formula' 120 | } 121 | } 122 | _sample_dependencies_root_only = { 123 | 'test_organisation/test1-formula': { 124 | 'source': 'git@github.com:test_organisation/test1-formula.git', 125 | 'constraint': '==v1.0.1', 126 | 'sourced_constraints': [], 127 | 'organisation': 'test_organisation', 128 | 'name': 'test1-formula' 129 | }, 130 | 'test_organisation/test2-formula': { 131 | 
'source': 'git@github.com:test_organisation/test2-formula.git', 132 | 'constraint': '==v2.0.1', 133 | 'sourced_constraints': [], 134 | 'organisation': 'test_organisation', 135 | 'name': 'test2-formula' 136 | }, 137 | } 138 | _sample_sourced_dependencies_root_only = { 139 | 'test_organisation/test1-formula': { 140 | 'source': 'git@github.com:test_organisation/test1-formula.git', 141 | 'constraint': '==v1.0.1', 142 | 'sourced_constraints': ['==v1.0.1'], 143 | 'organisation': 'test_organisation', 144 | 'name': 'test1-formula' 145 | }, 146 | 'test_organisation/test2-formula': { 147 | 'source': 'git@github.com:test_organisation/test2-formula.git', 148 | 'constraint': '==v2.0.1', 149 | 'sourced_constraints': ['==v2.0.1'], 150 | 'organisation': 'test_organisation', 151 | 'name': 'test2-formula' 152 | }, 153 | } 154 | _sample_requirements_file = ("git@github.com:test_organisation/test1-formula.git==v1.0.1\n" 155 | "git@github.com:test_organisation/test2-formula.git==v2.0.1\n" 156 | "git@github.com:test_organisation/test3-formula.git==v3.0.1\n") 157 | 158 | _sample_tags_test1 = [ 159 | {"name": "v1.0.0"}, 160 | {"name": "v1.0.1"}, 161 | {"name": "v1.0.2"}, 162 | ] 163 | 164 | _sample_tags_test2 = [ 165 | {"name": "v2.0.0"}, 166 | {"name": "v2.0.1"}, 167 | {"name": "v2.0.2"}, 168 | ] 169 | 170 | def setUp(self): 171 | """ 172 | TestShakerMetadata: Pre-method setup of the test object 173 | """ 174 | logging.getLogger("salt-shaker-unittest").setLevel(logging.INFO) 175 | TestCase.setUp(self) 176 | 177 | def tearDown(self): 178 | """ 179 | TestShakerMetadata: Post-method teardown of the test object 180 | """ 181 | TestCase.tearDown(self) 182 | 183 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_requirements') 184 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_metadata') 185 | def test__init__(self, 186 | mock_load_local_metadata, 187 | mock_load_local_requirements, 188 | ): 189 | """ 190 | TestShakerMetadata: Test object initialises correctly 191 | """ 192 
| mock_load_local_metadata.return_value = None 193 | mock_load_local_requirements.return_value = None 194 | expected_working_directory = '.' 195 | expected_metadata_filename = 'metadata.yml' 196 | testobj_default = ShakerMetadata() 197 | self.assertEqual(testobj_default.working_directory, 198 | expected_working_directory, 199 | "Default initialised working directory mismatch" 200 | "'%s' != '%s'" 201 | % (testobj_default.working_directory, 202 | expected_working_directory) 203 | ) 204 | self.assertEqual(testobj_default.metadata_filename, 205 | expected_metadata_filename, 206 | "Default initialised metadata filename mismatch. " 207 | "'%s' != '%s'" 208 | % (testobj_default.metadata_filename, 209 | expected_metadata_filename) 210 | ) 211 | 212 | expected_working_directory = '/some/dir/' 213 | expected_metadata_filename = 'some_metadata.yml' 214 | testobj_custom = ShakerMetadata(expected_working_directory, 215 | expected_metadata_filename) 216 | self.assertEqual(testobj_custom.working_directory, 217 | expected_working_directory, 218 | "Custom initialised working directory mismatch" 219 | "'%s' != '%s'" 220 | % (testobj_custom.working_directory, 221 | expected_working_directory) 222 | ) 223 | 224 | self.assertEqual(testobj_custom.metadata_filename, 225 | expected_metadata_filename, 226 | "Custom initialised metadata filename mismatch" 227 | "'%s' != '%s'" 228 | % (testobj_custom.metadata_filename, 229 | expected_metadata_filename) 230 | ) 231 | 232 | @patch('yaml.load') 233 | @patch('__builtin__.open') 234 | @patch('os.path.exists') 235 | def test__fetch_local_metadata__fileexists(self, 236 | mock_path_exists, 237 | mock_open, 238 | mock_yaml_load): 239 | """ 240 | Test we get data when we load a metadata file 241 | """ 242 | mock_path_exists.return_value = True 243 | with patch('__builtin__.open', 244 | mock_open(read_data=()), 245 | create=True): 246 | mock_yaml_load.return_value = self._sample_metadata_root 247 | testobj = ShakerMetadata() 248 | actual_return_value 
= testobj._fetch_local_metadata('fakedir', 249 | 'fakefile') 250 | self.assertEqual(actual_return_value, 251 | self._sample_metadata_root, 252 | "Metadata equality mismatch: " 253 | "\nActual:%s\nExpected:%s\n\n" 254 | % (actual_return_value, 255 | self._sample_metadata_root)) 256 | 257 | @raises(IOError) 258 | @patch('os.path.exists') 259 | def test__fetch_local_metadata__filenotexist(self, 260 | mock_path_exists): 261 | """ 262 | Test we raise an error when we try to load a non-existent file 263 | """ 264 | mock_path_exists.return_value = False 265 | testobj = ShakerMetadata() 266 | testobj._fetch_local_metadata() 267 | # No assert needed, we're testing for an exception 268 | 269 | @patch('shaker.libs.metadata.parse_metadata_requirements') 270 | @patch('shaker.shaker_metadata.ShakerMetadata._parse_metadata_name') 271 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_local_metadata') 272 | def test_load_local_metadata(self, 273 | mock_fetch_local_metadata, 274 | mock_parse_metadata_name, 275 | mock_parse_metadata_requirements, 276 | ): 277 | """ 278 | TestShakerMetadata: Test the metadata loads correctly into the object 279 | """ 280 | testobj = ShakerMetadata() 281 | mock_fetch_local_metadata.return_value = self._sample_metadata_root 282 | mock_parse_metadata_name.return_value = { 283 | 'organisation': 'test_organisation', 284 | 'name': 'root-formula' 285 | } 286 | mock_parse_metadata_requirements.return_value = self._sample_root_formula["dependencies"] 287 | testobj.load_local_metadata() 288 | 289 | self.assertEqual(testobj.root_metadata, 290 | self._sample_root_formula, 291 | 'Metadata root formula mismatch\n\n' 292 | '%s\n\n' 293 | '%s\n' 294 | % (testobj.root_metadata, 295 | self._sample_root_formula 296 | ) 297 | ) 298 | 299 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_file') 300 | @patch('shaker.shaker_metadata.ShakerMetadata.load_local_metadata') 301 | def test__fetch_remote_requirements(self, 302 | mock_load_local_metadata, 303 |
mock_fetch_remote_file): 304 | sample_raw_requirements = ( 305 | "git@github.com:test_organisation/test1-formula.git==v1.0.1\n" 306 | "git@github.com:test_organisation/test2-formula.git==v2.0.1\n" 307 | ) 308 | 309 | # PEP8 requires unused mock being used 310 | mock_load_local_metadata.return_value = None 311 | 312 | expected_result = self._sample_sourced_dependencies_root_only.copy() 313 | 314 | mock_fetch_remote_file.return_value = sample_raw_requirements 315 | org_name = "test-organisation" 316 | formula_name = "test1-formula" 317 | constraint = None 318 | 319 | testobj = ShakerMetadata() 320 | data = testobj._fetch_remote_requirements(org_name, formula_name, constraint) 321 | self.assertEqual(data, 322 | expected_result, 323 | ("Dependency data mismatch:\nActual:%s\nExpected:%s\n\n" 324 | % (data, expected_result))) 325 | 326 | @patch("shaker.shaker_metadata.ShakerMetadata._fetch_dependencies") 327 | @patch("shaker.shaker_metadata.ShakerMetadata.load_local_metadata") 328 | def test_update_dependencies__use_requirements(self, 329 | mock_load_local_metadata, 330 | mock_fetch_dependencies): 331 | 332 | """ 333 | TestShakerMetadata: Test we update dependencies using root requirements path 334 | """ 335 | sample_root_metadata = { 336 | 'test_organisation/test1-formula': { 337 | 'source': 'git@github.com:test_organisation/test1-formula.git', 338 | 'constraint': '==v1.0.1', 339 | 'sourced_constraints': ['==v1.0.1'], 340 | 'organisation': 'test_organisation', 341 | 'name': 'test1-formula' 342 | } 343 | } 344 | sample_root_requirements = { 345 | 'test_organisation/testa-formula': { 346 | 'source': 'git@github.com:test_organisation/testa-formula.git', 347 | 'constraint': '==v1.0.1', 348 | 'sourced_constraints': ['==v1.0.1'], 349 | 'organisation': 'test_organisation', 350 | 'name': 'testa-formula' 351 | } 352 | } 353 | 354 | # PEP8 requires unused mock being used 355 | mock_load_local_metadata.return_value = None 356 | 357 | testobj = ShakerMetadata() 358 | 
testobj.local_requirements = sample_root_requirements 359 | testobj.root_metadata = sample_root_metadata 360 | 361 | ignore_local_requirements = False 362 | ignore_dependency_requirements = False 363 | testobj.update_dependencies(ignore_local_requirements, 364 | ignore_dependency_requirements) 365 | 366 | mock_fetch_dependencies.assert_called_once_with(sample_root_requirements, 367 | ignore_dependency_requirements) 368 | self.assertEqual(testobj.dependencies, 369 | sample_root_requirements, 370 | 'Dependencies mismatch\n' 371 | 'Actual:%s\n' 372 | 'Expected:%s\n\n' 373 | % (testobj.dependencies, 374 | sample_root_requirements 375 | ) 376 | ) 377 | 378 | @patch("shaker.shaker_metadata.ShakerMetadata._fetch_dependencies") 379 | @patch("shaker.shaker_metadata.ShakerMetadata.load_local_metadata") 380 | def test_update_dependencies__use_metadata(self, 381 | mock_load_local_metadata, 382 | mock_fetch_dependencies): 383 | 384 | """ 385 | TestShakerMetadata: Test we update dependencies using root metadata path 386 | """ 387 | # PEP8 requires unused mock being used 388 | mock_load_local_metadata.return_value = None 389 | 390 | testobj = ShakerMetadata() 391 | testobj.local_requirements = {} 392 | testobj.root_metadata = self._sample_metadata_root 393 | 394 | ignore_local_requirements = False 395 | ignore_dependency_requirements = False 396 | testobj.update_dependencies(ignore_local_requirements, 397 | ignore_dependency_requirements) 398 | 399 | mock_fetch_dependencies.assert_called_once_with(self._sample_dependencies_root_only, 400 | ignore_dependency_requirements) 401 | self.assertEqual(testobj.dependencies, 402 | self._sample_dependencies_root_only, 403 | 'Dependencies mismatch\n' 404 | 'Actual:%s\nExpected:%s\n\n' 405 | % (testobj.dependencies, 406 | self._sample_dependencies_root_only 407 | ) 408 | ) 409 | 410 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_metadata') 411 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_requirements') 412 | 
def test_fetch_dependencies__requirements_exist(self, 413 | mock_fetch_remote_requirements, 414 | mock_fetch_remote_metadata): 415 | """ 416 | TestShakerMetadata:test_fetch_dependencies__requirements_exist: Get dependencies when requirements file exists 417 | """ 418 | test_base_dependencies = { 419 | 'test_organisation/test1-formula': 420 | { 421 | 'source': 'git@github.com:test_organisation/test1-formula.git', 422 | 'constraint': '==v1.0.1', 423 | 'sourced_constraints': [], 424 | 'organisation': 'test_organisation', 425 | 'name': 'test1-formula' 426 | } 427 | } 428 | mock_fetch_remote_requirements.side_effect = [ 429 | ['test_organisation/test2-formula==v2.0.1'], 430 | None 431 | ] 432 | 433 | expected_dependencies = { 434 | 'test_organisation/test1-formula': 435 | { 436 | 'sourced_constraints': ['==v1.0.1'], 437 | }, 438 | 'test_organisation/test2-formula': 439 | { 440 | 'source': 'git@github.com:test_organisation/test2-formula.git', 441 | 'constraint': '==v2.0.1', 442 | 'sourced_constraints': ['==v2.0.1'], 443 | 'organisation': 'test_organisation', 444 | 'name': 'test2-formula' 445 | } 446 | } 447 | mock_fetch_remote_metadata.return_value = None 448 | tempobj = ShakerMetadata(autoload=False) 449 | tempobj.dependencies = {} 450 | tempobj._fetch_dependencies(test_base_dependencies, 451 | ignore_dependency_requirements=False) 452 | 453 | testfixtures.compare(tempobj.dependencies, expected_dependencies) 454 | 455 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_metadata') 456 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_requirements') 457 | def test_fetch_dependencies__only_metadata_exists(self, 458 | mock_fetch_remote_requirements, 459 | mock_fetch_remote_metadata): 460 | test_base_dependencies = { 461 | 'test_organisation/test1-formula': 462 | { 463 | 'source': 'git@github.com:test_organisation/test1-formula.git', 464 | 'constraint': '==v1.0.1', 465 | 'sourced_constraints': [], 466 | 'organisation': 'test_organisation', 467 | 'name': 
'test1-formula' 468 | } 469 | } 470 | mock_fetch_remote_metadata.side_effect = [ 471 | { 472 | "formula": "test_fetch_dependencies__only_metadata_exists", 473 | 'dependencies': 474 | [ 475 | "test_organisation/test2-formula==v2.0.1" 476 | ] 477 | }, 478 | None 479 | ] 480 | 481 | expected_dependencies = { 482 | 'test_organisation/test1-formula': 483 | { 484 | 'sourced_constraints': ['==v1.0.1'], 485 | }, 486 | 'test_organisation/test2-formula': 487 | { 488 | 'source': 'git@github.com:test_organisation/test2-formula.git', 489 | 'constraint': '==v2.0.1', 490 | 'sourced_constraints': ['==v2.0.1'], 491 | 'organisation': 'test_organisation', 492 | 'name': 'test2-formula' 493 | } 494 | } 495 | mock_fetch_remote_requirements.return_value = None 496 | tempobj = ShakerMetadata(autoload=False) 497 | tempobj.dependencies = {} 498 | tempobj._fetch_dependencies(test_base_dependencies, 499 | ignore_dependency_requirements=False) 500 | 501 | testfixtures.compare(tempobj.dependencies, expected_dependencies) 502 | 503 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_metadata') 504 | @patch('shaker.shaker_metadata.ShakerMetadata._fetch_remote_requirements') 505 | def test_fetch_dependencies__already_sourced(self, 506 | mock_fetch_remote_requirements, 507 | mock_fetch_remote_metadata): 508 | """ 509 | TestShakerMetadata::test_fetch_dependencies__already_sourced: Don't fetch dependencies if we've already sourced them 510 | """ 511 | test_base_dependencies = { 512 | 'test_organisation/test1-formula': 513 | { 514 | 'source': 'git@github.com:test_organisation/test1-formula.git', 515 | 'constraint': '==v1.0.1', 516 | 'sourced_constraints': [], 517 | 'organisation': 'test_organisation', 518 | 'name': 'test1-formula' 519 | } 520 | } 521 | mock_fetch_remote_metadata.side_effect = [ 522 | { 523 | 'formula': 'test_organisation/test2-formula', 524 | 'dependencies': 525 | [ 526 | "test_organisation/test2-formula==v2.0.1" 527 | ] 528 | }, 529 | None 530 | ] 531 | 532 | expected_dependencies =
{ 533 | 'test_organisation/test1-formula': 534 | { 535 | 'sourced_constraints': ['==v1.0.1'], 536 | }, 537 | } 538 | mock_fetch_remote_requirements.return_value = None 539 | tempobj = ShakerMetadata(autoload=False) 540 | tempobj.dependencies = {'test_organisation/test1-formula': {'sourced_constraints': ['==v1.0.1']}} 541 | tempobj._fetch_dependencies(test_base_dependencies, 542 | ignore_dependency_requirements=False) 543 | 544 | testfixtures.compare(tempobj.dependencies, expected_dependencies) 545 | 546 | @raises(GithubRepositoryConnectionException) 547 | @patch('shaker.libs.github.validate_github_access') 548 | @patch('shaker.libs.github.resolve_constraint_to_object') 549 | @patch('shaker.libs.github.get_valid_github_token') 550 | def test_fetch_remote_file__no_valid_object(self, 551 | mock_get_valid_github_token, 552 | mock_resolve_constraint_to_object, 553 | mock_validate_github_access): 554 | """ 555 | TestShakerMetadata::test_fetch_remote_file__no_valid_object: Check for exception when no valid object 556 | """ 557 | mock_get_valid_github_token.return_value = True 558 | mock_resolve_constraint_to_object.return_value = None 559 | mock_validate_github_access.return_value = True 560 | tempobj = ShakerMetadata(autoload=False) 561 | tempobj._fetch_remote_file("fake", "fake", "fake", "fake") 562 | # The call above should raise; fail explicitly if it did not 563 | self.assertTrue(False, "Expected GithubRepositoryConnectionException was not raised") 564 | 565 | @patch('shaker.libs.github.validate_github_access') 566 | @patch('shaker.libs.github.resolve_constraint_to_object') 567 | @patch('shaker.libs.github.get_valid_github_token') 568 | def test_fetch_remote_file__bad_access(self, 569 | mock_get_valid_github_token, 570 | mock_resolve_constraint_to_object, 571 | mock_validate_github_access): 572 | """ 573 | TestShakerMetadata::test_fetch_remote_file__bad_access: Check for None on problem accessing github 574 | """ 575 | mock_get_valid_github_token.return_value = True 576 | mock_resolve_constraint_to_object.return_value = { 577 | "name":
"v5.2.0", 578 | "commit": { 579 | "sha": "FAKE", 580 | "url": "https://api.github.com/repos/fake" 581 | } 582 | } 583 | 584 | mock_validate_github_access.return_value = False 585 | tempobj = ShakerMetadata(autoload=False) 586 | return_val = tempobj._fetch_remote_file("FAKE", "FAKE", "FAKE", "FAKE") 587 | self.assertEqual(return_val, None, "Should get None type on bad github access") 588 | 589 | @responses.activate 590 | @patch('shaker.libs.github.validate_github_access') 591 | @patch('shaker.libs.github.resolve_constraint_to_object') 592 | @patch('shaker.libs.github.get_valid_github_token') 593 | def test_fetch_remote_file__good_access(self, 594 | mock_get_valid_github_token, 595 | mock_resolve_constraint_to_object, 596 | mock_validate_github_access): 597 | """ 598 | TestShakerMetadata::test_fetch_remote_file__good_access: Check for good access 599 | """ 600 | mock_get_valid_github_token.return_value = True 601 | mock_resolve_constraint_to_object.return_value = { 602 | "name": "v1.0.0", 603 | "commit": { 604 | "sha": "FAKE", 605 | "url": "https://api.github.com/repos/fake" 606 | } 607 | } 608 | mock_response = { 609 | "name": "v1.0.0", 610 | "commit": { 611 | "sha": "fakesha", 612 | "url": "https://fakeurl" 613 | }, 614 | } 615 | responses.add( 616 | responses.GET, 617 | "https://raw.githubusercontent.com/FAKE/FAKE/v1.0.0/FAKE", 618 | content_type="application/json", 619 | body=json.dumps(mock_response) 620 | ) 621 | mock_validate_github_access.return_value = True 622 | tempobj = ShakerMetadata(autoload=False) 623 | expected_return = mock_response 624 | return_val = tempobj._fetch_remote_file("FAKE", "FAKE", "FAKE", "FAKE") 625 | self.assertEqual(return_val, 626 | expected_return, 627 | "Metadata mismatch\nActual:'%s'\nExpected:'%s'" 628 | % (return_val, expected_return)) 629 | 630 | @patch('os.path.exists') 631 | def test_load_local_requirements(self, 632 | mock_path_exists 633 | ): 634 | """ 635 | TestShakerMetadata::test_load_local_requirements: Test loading from
local dependency file 636 | """ 637 | # Setup 638 | mock_path_exists.return_value = True 639 | text_file_data = '\n'.join(["git@github.com:test_organisation/test1-formula.git==v1.0.1", 640 | "git@github.com:test_organisation/test2-formula.git==v2.0.1", 641 | "git@github.com:test_organisation/test3-formula.git==v3.0.1"]) 642 | with patch('__builtin__.open', 643 | mock_open(read_data=()), 644 | create=True) as mopen: 645 | mopen.return_value.__iter__.return_value = text_file_data.splitlines() 646 | 647 | shaker.libs.logger.Logger().setLevel(logging.DEBUG) 648 | tempobj = ShakerMetadata(autoload=False) 649 | input_directory = '.' 650 | input_filename = 'test' 651 | tempobj.load_local_requirements(input_directory, input_filename) 652 | mock_path_exists.assert_called_once_with('./test') 653 | mopen.assert_called_once_with('./test', 'r') 654 | testfixtures.compare(tempobj.local_requirements, self._sample_dependencies) 655 | 656 | @patch('os.path.exists') 657 | def test_load_local_requirements__with_blanks(self, 658 | mock_path_exists 659 | ): 660 | """ 661 | TestShakerMetadata::test_load_local_requirements: Test loading from local dependency file with blanks and comments 662 | """ 663 | # Setup 664 | mock_path_exists.return_value = True 665 | text_file_data = '\n'.join(["git@github.com:test_organisation/test1-formula.git==v1.0.1", 666 | "", 667 | "git@github.com:test_organisation/test2-formula.git==v2.0.1", 668 | " ", 669 | "#DONT_READ_ME", 670 | "git@github.com:test_organisation/test3-formula.git==v3.0.1"]) 671 | with patch('__builtin__.open', 672 | mock_open(read_data=()), 673 | create=True) as mopen: 674 | mopen.return_value.__iter__.return_value = text_file_data.splitlines() 675 | 676 | shaker.libs.logger.Logger().setLevel(logging.DEBUG) 677 | tempobj = ShakerMetadata(autoload=False) 678 | input_directory = '.' 
679 | input_filename = 'test' 680 | tempobj.load_local_requirements(input_directory, input_filename) 681 | mock_path_exists.assert_called_once_with('./test') 682 | mopen.assert_called_once_with('./test', 'r') 683 | testfixtures.compare(tempobj.local_requirements, self._sample_dependencies) 684 | -------------------------------------------------------------------------------- /tests/test_shaker_remote.py: -------------------------------------------------------------------------------- 1 | from unittest import TestCase 2 | from mock import patch 3 | from shaker.shaker_remote import ShakerRemote 4 | from nose.tools import raises 5 | 6 | 7 | class TestShakerRemote(TestCase): 8 | """ 9 | A class to test the ShakerRemote class 10 | """ 11 | 12 | _sample_dependencies = { 13 | 'test_organisation/test1-formula': { 14 | 'source': 'git@github.com:test_organisation/test1-formula.git', 15 | 'constraint': '==v1.0.1', 16 | 'organisation': 'test_organisation', 17 | 'name': 'test1-formula' 18 | }, 19 | 'test_organisation/test2-formula': { 20 | 'source': 'git@github.com:test_organisation/test2-formula.git', 21 | 'constraint': '==v2.0.1', 22 | 'organisation': 'test_organisation', 23 | 'name': 'test2-formula' 24 | }, 25 | 'test_organisation/test3-formula': { 26 | 'source': 'git@github.com:test_organisation/test3-formula.git', 27 | 'constraint': '==v3.0.1', 28 | 'organisation': 'test_organisation', 29 | 'name': 'test3-formula' 30 | } 31 | } 32 | 33 | _sample_requirements = ("git@github.com:test_organisation/test1-formula.git==v1.0.1\n" 34 | "git@github.com:test_organisation/test2-formula.git==v2.0.1\n" 35 | "git@github.com:test_organisation/test3-formula.git==v3.0.1\n") 36 | 37 | def setUp(self): 38 | TestCase.setUp(self) 39 | 40 | def tearDown(self): 41 | TestCase.tearDown(self) 42 | 43 | @patch('os.write') 44 | @patch('__builtin__.open') 45 | @patch('os.path.exists') 46 | def test_write_requirements__overwrite(self, 47 | mock_path_exists, 48 | mock_open, 49 | mock_write): 50 | """ 51 
| TestShakerRemote: Test resolved dependencies overwrite an existing file when forced 52 | """ 53 | 54 | # Setup 55 | testobj = ShakerRemote(self._sample_dependencies) 56 | output_directory = "tests/files" 57 | output_filename = "test_dependencies.txt" 58 | output_path = '%s/%s' % (output_directory, 59 | output_filename) 60 | 61 | # Overwrite an existing file 62 | mock_path_exists.return_value = True 63 | testobj.write_requirements(output_directory, 64 | output_filename, 65 | overwrite=True, 66 | backup=False) 67 | mock_open.assert_called_once_with(output_path, 'w') 68 | mock_write.assert_called_once_with(self._sample_requirements) 69 | 70 | @patch('os.write') 71 | @patch('__builtin__.open') 72 | @patch('os.path.exists') 73 | def test_write_requirements__simple(self, 74 | mock_path_exists, 75 | mock_open, 76 | mock_write): 77 | """ 78 | TestShakerRemote: Test resolved dependencies are correctly written out to file 79 | """ 80 | 81 | # Setup 82 | testobj = ShakerRemote(self._sample_dependencies) 83 | output_directory = "tests/files" 84 | output_filename = "test_dependencies.txt" 85 | output_path = '%s/%s' % (output_directory, 86 | output_filename) 87 | 88 | # Simple write 89 | mock_path_exists.return_value = False 90 | testobj.write_requirements(output_directory, 91 | output_filename, 92 | overwrite=False, 93 | backup=False) 94 | mock_open.assert_called_once_with(output_path, 'w') 95 | mock_write.assert_called_once_with(self._sample_requirements) 96 | 97 | @patch('os.write') 98 | @patch('__builtin__.open') 99 | @patch('os.path.exists') 100 | def test_write_requirements__no_overwrite(self, 101 | mock_path_exists, 102 | mock_open, 103 | mock_write): 104 | """ 105 | TestShakerRemote: Test resolved dependencies do not overwrite an existing file 106 | """ 107 | 108 | # Setup 109 | testobj = ShakerRemote(self._sample_dependencies) 110 | output_filename = "test_dependencies.txt" 111 | 112 | # Don't overwrite an existing file 113 | mock_path_exists.return_value = True 114 |
testobj.write_requirements("tests/files", output_filename, 115 | overwrite=False, 116 | backup=False) 117 | self.assertFalse(mock_open.called, ("With overwrite disabled, " 118 | "we shouldn't have called to open")) 119 | self.assertFalse(mock_write.called, ("With overwrite disabled, " 120 | "we shouldn't have called to write")) 121 | 122 | @patch('shaker.libs.github.install_source') 123 | @patch('shaker.libs.github.get_repository_sha') 124 | @patch('shaker.shaker_remote.ShakerRemote._create_directories') 125 | def test_install_dependencies_non_existing_sha(self, 126 | mock_create_directories, 127 | mock_get_repository_sha, 128 | mock_install_source): 129 | """ 130 | TestShakerRemote: Test installing dependencies with a non-existent sha 131 | """ 132 | mock_create_directories.return_value = None 133 | mock_get_repository_sha.side_effect = ["fake_sha", 134 | None] 135 | mock_install_source.return_value = True 136 | 137 | @patch('shaker.shaker_remote.ShakerRemote._link_dynamic_modules') 138 | @patch('os.symlink') 139 | @patch('os.path.exists') 140 | def test__update_root_links__formula_subdir_exists(self, 141 | mock_path_exists, 142 | mock_symlink, 143 | mock_link_dynamic_modules): 144 | """ 145 | Test root links are made when our formula has docker-formula/docker structure 146 | """ 147 | # Setup 148 | sample_dependencies = { 149 | 'test_organisation/test1-formula': { 150 | 'source': 'git@github.com:test_organisation/test1-formula.git', 151 | 'constraint': '==v1.0.1', 152 | 'organisation': 'test_organisation', 153 | 'name': 'test1-formula' 154 | }, 155 | } 156 | testobj = ShakerRemote(sample_dependencies) 157 | # Set path exists check values 158 | # True: formula-repos/docker-formula/docker exists 159 | # False: _root/docker exists 160 | mock_path_exists.side_effect = [ 161 | True, 162 | False 163 | ] 164 | testobj._update_root_links() 165 | relative_source = "../formula-repos/test1-formula/test1" 166 | target = "vendor/_root/test1" 167 |
mock_symlink.assert_called_once_with(relative_source, target) 168 | mock_link_dynamic_modules.assert_called_once_with("test1-formula") 169 | 170 | @patch('shaker.shaker_remote.ShakerRemote._link_dynamic_modules') 171 | @patch('os.symlink') 172 | @patch('os.path.exists') 173 | def test__update_root_links__formula_subdir_not_exist(self, 174 | mock_path_exists, 175 | mock_symlink, 176 | mock_link_dynamic_modules): 177 | """ 178 | Test root links are made when our formula has no subdir docker/ structure 179 | """ 180 | # Setup 181 | sample_dependencies = { 182 | 'test_organisation/test1': { 183 | 'source': 'git@github.com:test_organisation/test1.git', 184 | 'constraint': '==v1.0.1', 185 | 'organisation': 'test_organisation', 186 | 'name': 'test1' 187 | }, 188 | } 189 | testobj = ShakerRemote(sample_dependencies) 190 | # Set path exists check values 191 | # False: formula-repos/docker-formula/docker exists 192 | # True: formula-repos/docker/ exists 193 | # False: _root/docker exists 194 | mock_path_exists.side_effect = [ 195 | False, 196 | True, 197 | False 198 | ] 199 | testobj._update_root_links() 200 | relative_source = "../formula-repos/test1" 201 | target = "vendor/_root/test1" 202 | mock_symlink.assert_called_once_with(relative_source, target) 203 | mock_link_dynamic_modules.assert_called_once_with("test1") 204 | 205 | @raises(IOError) 206 | @patch('shaker.shaker_remote.ShakerRemote._link_dynamic_modules') 207 | @patch('os.symlink') 208 | @patch('os.path.exists') 209 | def test__update_root_links__formula_link_exists(self, 210 | mock_path_exists, 211 | mock_symlink, 212 | mock_link_dynamic_modules): 213 | """ 214 | Test an IOError is raised when the root link already exists 215 | """ 216 | # Setup 217 | sample_dependencies = { 218 | 'test_organisation/test1-formula': { 219 | 'source': 'git@github.com:test_organisation/test1-formula.git', 220 | 'constraint': '==v1.0.1', 221 | 'organisation': 'test_organisation', 222 | 'name': 'test1-formula'
223 | }, 224 | } 225 | testobj = ShakerRemote(sample_dependencies) 226 | # Set path exists check values 227 | # True: formula-repos/docker-formula/docker exists 228 | # True: _root/docker exists 229 | mock_path_exists.side_effect = [ 230 | True, 231 | True 232 | ] 233 | testobj._update_root_links() 234 | 235 | @raises(IOError) 236 | @patch('os.path.exists') 237 | def test__update_root_links__formula_dirs_not_exist(self, 238 | mock_path_exists): 239 | """ 240 | Test we have an exception when we can't find a any directory 241 | """ 242 | # Setup 243 | sample_dependencies = { 244 | 'test_organisation/test1-formula': { 245 | 'source': 'git@github.com:test_organisation/test1-formula.git', 246 | 'constraint': '==v1.0.1', 247 | 'organisation': 'test_organisation', 248 | 'name': 'test1-formula' 249 | }, 250 | } 251 | testobj = ShakerRemote(sample_dependencies) 252 | # Set path exists check values 253 | # True: formula-repos/docker-formula/docker exists 254 | # False: _root/docker exists 255 | mock_path_exists.side_effect = [ 256 | False, 257 | False 258 | ] 259 | testobj._update_root_links() 260 | source = "vendor/formula-repos/test1-formula/test1" 261 | target = "vendor/_root/test1" 262 | mock_symlink.assert_called_with(source, target) 263 | mock_link_dynamic_modules.assert_called_with() 264 | --------------------------------------------------------------------------------