├── .gitignore ├── CODE_OF_CONDUCT.md ├── CONTRIBUTING.md ├── Dockerfile ├── LICENSE ├── NOTICE ├── README.md ├── apichanges ├── __init__.py ├── cli.py ├── icons.py ├── model.py ├── publisher.py ├── record.py ├── repo.py └── sitebuild.py ├── assets ├── css │ ├── bulma.min.css │ ├── docutils.css │ ├── icons.css │ └── site.css └── js │ ├── search.js │ └── vue.js ├── justfile ├── requirements.txt ├── setup.py ├── templates ├── index.j2 ├── macros.j2 ├── search.j2 ├── service-commit.j2 ├── service-map.j2 ├── service.j2 └── template.j2 ├── tf ├── bucket.tf ├── bucket_policy.json ├── certificate.tf ├── cloudfront.tf ├── domain.tf ├── ecs_cluster.tf ├── ecs_iam.tf ├── ecs_registry.tf ├── ecs_schedule.tf ├── ecs_task.tf ├── ecs_vpc.tf ├── provider.tf ├── schema.json ├── task-definitions │ └── sitebuild.json └── vars.tf └── tools └── icon_build.py /.gitignore: -------------------------------------------------------------------------------- 1 | bin/ 2 | commands/ 3 | lib/ 4 | include/ 5 | share/ 6 | __pycache__ 7 | cache.json 8 | .python-version 9 | tf/.terraform 10 | tf/terraform.tfstate.backup 11 | .Python 12 | .DS_Store 13 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 5 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project. 
Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *master* branch. 27 | 2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 
38 | 39 | GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute to. As our projects, by default, use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public GitHub issue. 55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution. 60 | 61 | We may ask you to sign a [Contributor License Agreement (CLA)](http://en.wikipedia.org/wiki/Contributor_License_Agreement) for larger changes. 62 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3.8.1-buster 2 | 3 | LABEL name="apichanges" \ 4 | homepage="https://github.com/awslabs/aws-sdk-api-changes" \ 5 | maintainer="Kapil Thangavelu " 6 | 7 | RUN adduser --disabled-login apichanges 8 | COPY --chown=apichanges:apichanges . 
/home/apichanges 9 | RUN echo "deb http://deb.debian.org/debian buster-backports main" >> /etc/apt/sources.list 10 | 11 | RUN apt-get -q update \ 12 | && apt-get -q -y install \ 13 | libxml2-dev libxslt1-dev libcairo2-dev build-essential libffi-dev \ 14 | git curl unzip zstd \ 15 | && apt-get -y -t buster-backports install libgit2-dev \ 16 | && cd /home/apichanges \ 17 | && pip3 install -r requirements.txt \ 18 | && python3 setup.py develop \ 19 | && curl -LSfs https://japaric.github.io/trust/install.sh | \ 20 | sh -s -- --git casey/just --target x86_64-unknown-linux-musl --to /usr/local/bin \ 21 | && apt-get --yes remove build-essential \ 22 | && apt-get purge --yes --auto-remove -o APT::AutoRemove::RecommendsImportant=false \ 23 | && rm -Rf /var/cache/apt/ \ 24 | && rm -Rf /var/lib/apt/lists/* \ 25 | && rm -Rf /root/.cache/ 26 | 27 | USER apichanges 28 | WORKDIR /home/apichanges 29 | ENV LC_ALL="C.UTF-8" LANG="C.UTF-8" TZ=":/etc/localtime" 30 | ENTRYPOINT ["/usr/local/bin/just"] 31 | 32 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | 2 | Apache License 3 | Version 2.0, January 2004 4 | http://www.apache.org/licenses/ 5 | 6 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 7 | 8 | 1. Definitions. 9 | 10 | "License" shall mean the terms and conditions for use, reproduction, 11 | and distribution as defined by Sections 1 through 9 of this document. 12 | 13 | "Licensor" shall mean the copyright owner or entity authorized by 14 | the copyright owner that is granting the License. 15 | 16 | "Legal Entity" shall mean the union of the acting entity and all 17 | other entities that control, are controlled by, or are under common 18 | control with that entity. 
For the purposes of this definition, 19 | "control" means (i) the power, direct or indirect, to cause the 20 | direction or management of such entity, whether by contract or 21 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 22 | outstanding shares, or (iii) beneficial ownership of such entity. 23 | 24 | "You" (or "Your") shall mean an individual or Legal Entity 25 | exercising permissions granted by this License. 26 | 27 | "Source" form shall mean the preferred form for making modifications, 28 | including but not limited to software source code, documentation 29 | source, and configuration files. 30 | 31 | "Object" form shall mean any form resulting from mechanical 32 | transformation or translation of a Source form, including but 33 | not limited to compiled object code, generated documentation, 34 | and conversions to other media types. 35 | 36 | "Work" shall mean the work of authorship, whether in Source or 37 | Object form, made available under the License, as indicated by a 38 | copyright notice that is included in or attached to the work 39 | (an example is provided in the Appendix below). 40 | 41 | "Derivative Works" shall mean any work, whether in Source or Object 42 | form, that is based on (or derived from) the Work and for which the 43 | editorial revisions, annotations, elaborations, or other modifications 44 | represent, as a whole, an original work of authorship. For the purposes 45 | of this License, Derivative Works shall not include works that remain 46 | separable from, or merely link (or bind by name) to the interfaces of, 47 | the Work and Derivative Works thereof. 
48 | 49 | "Contribution" shall mean any work of authorship, including 50 | the original version of the Work and any modifications or additions 51 | to that Work or Derivative Works thereof, that is intentionally 52 | submitted to Licensor for inclusion in the Work by the copyright owner 53 | or by an individual or Legal Entity authorized to submit on behalf of 54 | the copyright owner. For the purposes of this definition, "submitted" 55 | means any form of electronic, verbal, or written communication sent 56 | to the Licensor or its representatives, including but not limited to 57 | communication on electronic mailing lists, source code control systems, 58 | and issue tracking systems that are managed by, or on behalf of, the 59 | Licensor for the purpose of discussing and improving the Work, but 60 | excluding communication that is conspicuously marked or otherwise 61 | designated in writing by the copyright owner as "Not a Contribution." 62 | 63 | "Contributor" shall mean Licensor and any individual or Legal Entity 64 | on behalf of whom a Contribution has been received by Licensor and 65 | subsequently incorporated within the Work. 66 | 67 | 2. Grant of Copyright License. Subject to the terms and conditions of 68 | this License, each Contributor hereby grants to You a perpetual, 69 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 70 | copyright license to reproduce, prepare Derivative Works of, 71 | publicly display, publicly perform, sublicense, and distribute the 72 | Work and such Derivative Works in Source or Object form. 73 | 74 | 3. Grant of Patent License. 
Subject to the terms and conditions of 75 | this License, each Contributor hereby grants to You a perpetual, 76 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 77 | (except as stated in this section) patent license to make, have made, 78 | use, offer to sell, sell, import, and otherwise transfer the Work, 79 | where such license applies only to those patent claims licensable 80 | by such Contributor that are necessarily infringed by their 81 | Contribution(s) alone or by combination of their Contribution(s) 82 | with the Work to which such Contribution(s) was submitted. If You 83 | institute patent litigation against any entity (including a 84 | cross-claim or counterclaim in a lawsuit) alleging that the Work 85 | or a Contribution incorporated within the Work constitutes direct 86 | or contributory patent infringement, then any patent licenses 87 | granted to You under this License for that Work shall terminate 88 | as of the date such litigation is filed. 89 | 90 | 4. Redistribution. 
You may reproduce and distribute copies of the 91 | Work or Derivative Works thereof in any medium, with or without 92 | modifications, and in Source or Object form, provided that You 93 | meet the following conditions: 94 | 95 | (a) You must give any other recipients of the Work or 96 | Derivative Works a copy of this License; and 97 | 98 | (b) You must cause any modified files to carry prominent notices 99 | stating that You changed the files; and 100 | 101 | (c) You must retain, in the Source form of any Derivative Works 102 | that You distribute, all copyright, patent, trademark, and 103 | attribution notices from the Source form of the Work, 104 | excluding those notices that do not pertain to any part of 105 | the Derivative Works; and 106 | 107 | (d) If the Work includes a "NOTICE" text file as part of its 108 | distribution, then any Derivative Works that You distribute must 109 | include a readable copy of the attribution notices contained 110 | within such NOTICE file, excluding those notices that do not 111 | pertain to any part of the Derivative Works, in at least one 112 | of the following places: within a NOTICE text file distributed 113 | as part of the Derivative Works; within the Source form or 114 | documentation, if provided along with the Derivative Works; or, 115 | within a display generated by the Derivative Works, if and 116 | wherever such third-party notices normally appear. The contents 117 | of the NOTICE file are for informational purposes only and 118 | do not modify the License. You may add Your own attribution 119 | notices within Derivative Works that You distribute, alongside 120 | or as an addendum to the NOTICE text from the Work, provided 121 | that such additional attribution notices cannot be construed 122 | as modifying the License. 
123 | 124 | You may add Your own copyright statement to Your modifications and 125 | may provide additional or different license terms and conditions 126 | for use, reproduction, or distribution of Your modifications, or 127 | for any such Derivative Works as a whole, provided Your use, 128 | reproduction, and distribution of the Work otherwise complies with 129 | the conditions stated in this License. 130 | 131 | 5. Submission of Contributions. Unless You explicitly state otherwise, 132 | any Contribution intentionally submitted for inclusion in the Work 133 | by You to the Licensor shall be under the terms and conditions of 134 | this License, without any additional terms or conditions. 135 | Notwithstanding the above, nothing herein shall supersede or modify 136 | the terms of any separate license agreement you may have executed 137 | with Licensor regarding such Contributions. 138 | 139 | 6. Trademarks. This License does not grant permission to use the trade 140 | names, trademarks, service marks, or product names of the Licensor, 141 | except as required for reasonable and customary use in describing the 142 | origin of the Work and reproducing the content of the NOTICE file. 143 | 144 | 7. Disclaimer of Warranty. Unless required by applicable law or 145 | agreed to in writing, Licensor provides the Work (and each 146 | Contributor provides its Contributions) on an "AS IS" BASIS, 147 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 148 | implied, including, without limitation, any warranties or conditions 149 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 150 | PARTICULAR PURPOSE. You are solely responsible for determining the 151 | appropriateness of using or redistributing the Work and assume any 152 | risks associated with Your exercise of permissions under this License. 153 | 154 | 8. Limitation of Liability. 
In no event and under no legal theory, 155 | whether in tort (including negligence), contract, or otherwise, 156 | unless required by applicable law (such as deliberate and grossly 157 | negligent acts) or agreed to in writing, shall any Contributor be 158 | liable to You for damages, including any direct, indirect, special, 159 | incidental, or consequential damages of any character arising as a 160 | result of this License or out of the use or inability to use the 161 | Work (including but not limited to damages for loss of goodwill, 162 | work stoppage, computer failure or malfunction, or any and all 163 | other commercial damages or losses), even if such Contributor 164 | has been advised of the possibility of such damages. 165 | 166 | 9. Accepting Warranty or Additional Liability. While redistributing 167 | the Work or Derivative Works thereof, You may choose to offer, 168 | and charge a fee for, acceptance of support, warranty, indemnity, 169 | or other liability obligations and/or rights consistent with this 170 | License. However, in accepting such obligations, You may act only 171 | on Your own behalf and on Your sole responsibility, not on behalf 172 | of any other Contributor, and only if You agree to indemnify, 173 | defend, and hold each Contributor harmless for any liability 174 | incurred by, or claims asserted against, such Contributor by reason 175 | of your accepting any such warranty or additional liability. 176 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 2 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # AWS SDK API ChangeLog 2 | 3 | Think of it as developer centric AWS what's new site, ie. 
just tell me 4 | what's changing in the APIs, with field-level changes highlighted on 5 | updated APIs. 6 | 7 | It's a static site generator that walks and diffs a git repo 8 | containing the service model files shipped in the AWS SDKs. 9 | 10 | It's deployed as a periodic spot Fargate task that pulls the public SDK 11 | repo, walks named tags, and performs diffs. It keeps a JSON statefile as 12 | a cache of those diffs, and then renders the site to an S3 bucket with 13 | CloudFront in front. Service icons are assembled into CSS sprites from 14 | the publicly available SVG icon sets. 15 | 16 | Types of changes detected: 17 | - new method 18 | - new/modified parameter on an existing method 19 | - new service 20 | 21 | Semantic changelogs: 22 | 23 | - we use the Node.js SDK repo, which produces nice, 24 | machine-readable diffs with the service team logs. 25 | 26 | This project uses https://github.com/casey/just as a replacement for make. 27 | 28 | 29 | ## License 30 | 31 | This project is licensed under the Apache-2.0 License. 
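The change types detected (new service, new method, new/modified parameter on an existing method) can be sketched with a plain dict diff. This is an illustrative sketch only, not the project's actual implementation, which diffs botocore shape graphs; the `diff_models` helper and the simplified `{service: {operation: {param: type}}}` model layout are hypothetical.

```python
# Hypothetical sketch of the three change types the site reports,
# over simplified service-model dicts: {service: {operation: {param: type}}}.

def diff_models(old, new):
    changes = []
    for svc, ops in new.items():
        if svc not in old:
            # entire service is absent from the previous release
            changes.append(("new-service", svc))
            continue
        for op, params in ops.items():
            if op not in old[svc]:
                changes.append(("new-method", f"{svc}.{op}"))
                continue
            for p, ptype in params.items():
                if p not in old[svc][op]:
                    changes.append(("new-parameter", f"{svc}.{op}.{p}"))
                elif old[svc][op][p] != ptype:
                    changes.append(("modified-parameter", f"{svc}.{op}.{p}"))
    return changes


old = {"s3": {"PutObject": {"Bucket": "string"}}}
new = {
    "s3": {
        "PutObject": {"Bucket": "string", "Tagging": "string"},
        "SelectObjectContent": {"Bucket": "string"},
    },
    "qldb": {"CreateLedger": {"Name": "string"}},
}
print(diff_models(old, new))
```

The real models are much deeper (nested shapes, enums, recursive references), which is why the project walks botocore shapes with visitors instead of comparing flat dicts.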
32 | -------------------------------------------------------------------------------- /apichanges/__init__.py: -------------------------------------------------------------------------------- 1 | # 2 | -------------------------------------------------------------------------------- /apichanges/cli.py: -------------------------------------------------------------------------------- 1 | import itertools 2 | import logging 3 | import os 4 | 5 | import click 6 | import jinja2 7 | import pygit2 8 | 9 | from .model import ReleaseDelta 10 | from .repo import CommitProcessor, TagWalker 11 | 12 | log = logging.getLogger("apichanges") 13 | 14 | 15 | @click.group() 16 | def cli(): 17 | """AWS API ChangeLog""" 18 | logging.basicConfig(level=logging.INFO) 19 | 20 | 21 | def _repo_stream_options(func): 22 | decorators = [ 23 | click.option("--path", required=True, help="Path to AWS SDK git clone"), 24 | click.option("--since", required=True, help="Start Date or Tag"), 25 | click.option("--until", help="End Date or Tag, default: last commit date"), 26 | click.option( 27 | "--service", multiple=True, help="Filter changes to only these services" 28 | ), 29 | click.option( 30 | "--changes-dir", 31 | default=".changes", 32 | help="sdk release changes json dir in repo", 33 | ), 34 | click.option("--model-path", help="model directory prefix", required=True), 35 | click.option("--model-suffix", help="suffix for model files", required=True), 36 | ] 37 | for d in decorators: 38 | func = d(func) 39 | return func 40 | 41 | 42 | @cli.command(name="build-page") 43 | @click.option("--debug", is_flag=True, default=False) 44 | @click.option( 45 | "--template", 46 | help="Path to Jinja2 Template", 47 | type=click.Path(exists=True, resolve_path=True), 48 | ) 49 | @click.option("--output", type=click.Path(resolve_path=True)) 50 | @_repo_stream_options 51 | def build_page( 52 | path, 53 | since, 54 | until, 55 | service, 56 | template, 57 | output, 58 | changes_dir, 59 | model_path, 60 | 
model_suffix, 61 | debug, 62 | ): 63 | """build a single page site""" 64 | repo = pygit2.Repository(path) 65 | releases = [] 66 | count = 0 67 | 68 | walker = TagWalker(repo) 69 | delta_processor = CommitProcessor( 70 | repo, 71 | model_prefix=model_path, 72 | model_suffix=model_suffix, 73 | change_dir=changes_dir, 74 | services=service, 75 | debug=debug, 76 | ) 77 | 78 | log.info("scanning for api changes since %s until %s", since, until or "latest") 79 | 80 | for prev, cur, commit_info, change_diff in walker.walk(since, until): 81 | count += 1 82 | service_changes = delta_processor.process(commit_info, change_diff) 83 | if service_changes: 84 | release = ReleaseDelta(commit_info, service_changes) 85 | log.info(release) 86 | releases.append(release) 87 | 88 | log.info( 89 | ("processed %d/%d releases, " "%d svc %d api updates across %d services"), 90 | len(releases), 91 | count, 92 | sum(map(len, releases)), 93 | sum(map(len, [svc.changes for svc in itertools.chain(*[r for r in releases])])), 94 | len({s.name for s in itertools.chain(*[r for r in releases])}), 95 | ) 96 | 97 | # sort changes regardless of walk direction by reverse date 98 | releases = sorted(releases, key=lambda c: c.commit["created_at"], reverse=True) 99 | 100 | if template and output: 101 | log.info("rendering template %s to %s", template, output) 102 | template_env = jinja2.Environment( 103 | loader=jinja2.FileSystemLoader(os.path.dirname(str(template))) 104 | ) 105 | template = template_env.get_template(os.path.basename(str(template))) 106 | with open(output, "w") as fh: 107 | fh.write(template.render(releases=releases)) 108 | 109 | 110 | if __name__ == "__main__": 111 | cli() 112 | -------------------------------------------------------------------------------- /apichanges/icons.py: -------------------------------------------------------------------------------- 1 | def get_icon(service_name): 2 | return "/icons/%s.png" % (ICON_SERVICE_MAP.get(service_name).lower()) 3 | 4 | 5 | def 
get_icon_style(service_name): 6 | img = ICON_SERVICE_MAP.get(service_name).lower() 7 | return "sprite-images-%s" % img 8 | 9 | 10 | ICON_SERVICE_MAP = { 11 | "aws": "AWS-Cloud-alt", 12 | "a2i-runtime.sagemaker": "Amazon-SageMaker", 13 | "a4b": "Alexa-For-Business", 14 | "access-analyzer": "AWS-Identity-and-Access-Management_IAM", 15 | "acm": "AWS-Certificate-Manager", 16 | "acm-pca": "AWS-Certificate-Authority", 17 | "amplify": "AWS-Amplify", 18 | "api.detective": "Security-Identity-and-Compliance", 19 | "api.ecr": "Amazon-EC2-Container-Registry", 20 | "api.elastic-inference": "Amazon-Elastic-Inference", 21 | "api.mediatailor": "AWS-Elemental-MediaTailor", 22 | "api.pricing": "AWS-Cost-Explorer", 23 | "api.sagemaker": "Amazon-SageMaker", 24 | "api.tunneling.iot": "IoT_Generic", 25 | "apigateway": "Amazon-API-Gateway", 26 | "appconfig": "AWS-Systems-Manager", 27 | "application-autoscaling": "Amazon-Application-Auto-Scaling", 28 | "applicationinsights": "Amazon-CloudWatch", 29 | "appmesh": "AWS-App-Mesh", 30 | "appstream2": "Amazon-Appstream-2_0", 31 | "appsync": "AWS-AppSync", 32 | "athena": "Amazon-Athena", 33 | "autoscaling": "Amazon-EC2_Auto-Scaling", 34 | "backup": "AWS-Backup", 35 | "batch": "AWS-Batch", 36 | "budgets": "AWS-Budgets", 37 | "catalog.marketplace": "AWS-Marketplace", 38 | "ce": "AWS-Cost-Explorer", 39 | "chime": "Amazon-Chime", 40 | "cloud9": "AWS-Cloud9", 41 | "clouddirectory": "Amazon-Cloud-Directory", 42 | "cloudformation": "AWS-CloudFormation", 43 | "cloudfront": "Amazon-CloudFront", 44 | "cloudhsm": "AWS-CloudHSM", 45 | "cloudhsmv2": "AWS-CloudHSM", 46 | "cloudsearch": "Amazon-CloudSearch", 47 | "cloudsearchdomain": "Amazon-CloudSearch", 48 | "cloudtrail": "AWS-CloudTrail", 49 | "codebuild": "AWS-CodeBuild", 50 | "codecommit": "AWS-CodeCommit", 51 | "codedeploy": "AWS-CodeDeploy", 52 | "codeguru-profiler": "Developer-Tools", 53 | "codeguru-reviewer": "Developer-Tools", 54 | "codepipeline": "AWS-CodePipeline", 55 | "codestar": "AWS-CodeStar", 
56 | "codestar-connections": "AWS-CodeStar", 57 | "codestar-notifications": "AWS-CodeStar", 58 | "cognito-identity": "Amazon-Cognito", 59 | "cognito-idp": "Amazon-Cognito", 60 | "cognito-sync": "Amazon-Cognito", 61 | "comprehend": "Amazon-Comprehend", 62 | "comprehendmedical": "Amazon-Comprehend", 63 | "compute-optimizer": "Reserved-Instance-Reporting", 64 | "config": "AWS-Config", 65 | "connect": "Amazon-Connect", 66 | "cur": "AWS-Cost-and-Usage-Report", 67 | "data.iot": "Internet-of-Things", 68 | "data.iotevents": "AWS-IoT-Events", 69 | "data.jobs.iot": "Internet-of-Things", 70 | "data.mediastore": "AWS-Elemental-MediaStore", 71 | "dataexchange": "Amazon-Simple-Storage-Service-S3_Bucket-with-Objects", 72 | "datapipeline": "AWS-Data-Pipeline", 73 | "datasync": "AWS-DataSync", 74 | "dax": "Amazon-DynamoDB_DAX", 75 | "devicefarm": "AWS-Device-Farm", 76 | "devices.iot1click": "AWS-IoT-1-Click", 77 | "directconnect": "AWS-Direct-Connect", 78 | "discovery": "AWS-Migration-Hub", 79 | "dlm": "AWS-Backup", 80 | "dms": "AWS-Database-Migration-Service", 81 | "ds": "AWS-Directory-Service", 82 | "dynamodb": "Amazon-DynamoDB", 83 | "ebs": "Amazon-Elastic-Block-Store-EBS", 84 | "ec2": "Amazon-EC2", 85 | "ec2-instance-connect": "Amazon-EC2", 86 | "ecr": "Amazon-EC2-Container-Registry", 87 | "ecs": "Amazon-Elastic-Container-Service", 88 | "eks": "Amazon-Elastic-Kubernetes-Service", 89 | "elasticache": "Amazon-ElastiCache", 90 | "elasticbeanstalk": "AWS-Elastic-Beanstalk", 91 | "elasticfilesystem": "Amazon-Elastic-File-System_EFS", 92 | "elasticloadbalancing": "Elastic-Load-Balancing", 93 | "elasticmapreduce": "Amazon-EMR", 94 | "elastictranscoder": "Amazon-Elastic-Transcoder", 95 | "email": "Amazon-Simple-Email-Service-SES", 96 | "entitlement.marketplace": "AWS-Marketplace", 97 | "es": "Amazon-Elasticsearch-Service", 98 | "events": "Amazon-EventBridge", 99 | "execute-api": "Amazon-API-Gateway", 100 | "firehose": "Amazon-Kinesis-Data-Firehose", 101 | "fms": "AWS-Firewall-Manager", 
102 | "forecast": "Amazon-Forecast", 103 | "forecastquery": "Amazon-Forecast", 104 | "frauddetector": "Machine-Learning", 105 | "fsx": "Amazon-FSx", 106 | "gamelift": "Amazon-GameLift", 107 | "glacier": "Amazon-S3-Glacier", 108 | "globalaccelerator": "AWS-Global-Accelerator", 109 | "glue": "AWS-Glue", 110 | "greengrass": "AWS-IoT-Greengrass", 111 | "groundstation": "AWS-Ground-Station", 112 | "guardduty": "Amazon-GuardDuty", 113 | "health": "AWS-Personal-Health-Dashboard", 114 | "iam": "AWS-Identity-and-Access-Management_IAM", 115 | "imagebuilder": "Amazon-EC2_AMI", 116 | "importexport": "AWS-Snow-Family_Snowball-Import-Export", 117 | "inspector": "Amazon-Inspector", 118 | "iot": "Internet-of-Things", 119 | "iotanalytics": "AWS-IoT-Analytics", 120 | "iotevents": "AWS-IoT-Events", 121 | "iotthingsgraph": "AWS-IoT-Things-Graph", 122 | "kafka": "Amazon-Managed-Streaming-for-Kafka", 123 | "kendra": "Machine-Learning", 124 | "kinesis": "Amazon-Kinesis", 125 | "kinesisanalytics": "Amazon-Kinesis-Data-Analytics", 126 | "kinesisvideo": "Amazon-Kinesis-Video-Streams", 127 | "kms": "AWS-Key-Management-Service", 128 | "lakeformation": "AWS-Lake-Formation", 129 | "lambda": "AWS-Lambda", 130 | "license-manager": "AWS-License-Manager", 131 | "lightsail": "Amazon-Lightsail", 132 | "logs": "Amazon-CloudWatch", 133 | "machinelearning": "Machine-Learning", 134 | "macie": "Amazon-Macie", 135 | "managedblockchain": "Amazon-Managed-Blockchain", 136 | "marketplacecommerceanalytics": "AWS-Marketplace", 137 | "mediaconnect": "AWS-Elemental-MediaConnect", 138 | "mediaconvert": "AWS-Elemental-MediaConvert", 139 | "medialive": "AWS-Elemental-MediaLive", 140 | "mediapackage": "AWS-Elemental-MediaPackage", 141 | "mediapackage-vod": "AWS-Elemental-MediaPackage", 142 | "mediastore": "AWS-Elemental-MediaStore", 143 | "metering.marketplace": "AWS-Marketplace", 144 | "mgh": "AWS-Migration-Hub", 145 | "migrationhub-config": "AWS-Migration-Hub", 146 | "mobile": "Mobile", 147 | "mobileanalytics": 
"Mobile", 148 | "models.lex": "Amazon-Lex", 149 | "monitoring": "Amazon-CloudWatch", 150 | "mq": "Amazon-MQ", 151 | "mturk-requester": "Users", 152 | "networkmanager": "Amazon-VPC_Router", 153 | "oidc": "AWS-Single-Sign-On", 154 | "opsworks": "AWS-OpsWorks", 155 | "opsworks-cm": "AWS-OpsWorks", 156 | "organizations": "AWS-Organizations", 157 | "outposts": "Corporate-data-center", 158 | "participant.connect": "Amazon-Connect", 159 | "personalize": "Amazon-Personalize", 160 | "personalize-events": "Amazon-Personalize", 161 | "personalize-runtime": "Amazon-Personalize", 162 | "pi": "Amazon-CloudWatch", 163 | "pinpoint": "Amazon-Pinpoint", 164 | "polly": "Amazon-Polly", 165 | "portal.sso": "AWS-Single-Sign-On", 166 | "projects.iot1click": "AWS-IoT-Analytics_Notebook", 167 | "qldb": "Amazon-Quantum-Ledger-Database_QLDB", 168 | "quicksight": "Amazon-Quicksight", 169 | "ram": "AWS-Resource-Access-Manager", 170 | "rds": "Amazon-RDS", 171 | "rds-data": "Amazon-Aurora", 172 | "redshift": "Amazon-Redshift", 173 | "rekognition": "Amazon-Rekognition", 174 | "resource-groups": "AWS-Resource-Access-Manager", 175 | "robomaker": "AWS-RoboMaker", 176 | "route53": "Amazon-Route-53", 177 | "route53domains": "Amazon-Route-53", 178 | "route53resolver": "Amazon-Route-53", 179 | "runtime.lex": "Amazon-Lex", 180 | "runtime.sagemaker": "Amazon-SageMaker", 181 | "s3": "Amazon-Simple-Storage-Service-S3", 182 | "s3-control": "Amazon-Simple-Storage-Service-S3", 183 | "sagemaker": "Amazon-SageMaker", 184 | "savingsplans": "AWS-Budgets", 185 | "schemas": "Amazon-EventBridge", 186 | "sdb": "Database", 187 | "secretsmanager": "AWS-Secrets-Manager", 188 | "securityhub": "AWS-Security-Hub", 189 | "serverlessrepo": "AWS-Serverless-Application-Repository", 190 | "servicecatalog": "AWS-Service-Catalog", 191 | "servicediscovery": "Amazon-Route-53", 192 | "servicequotas": "AWS-Trusted-Advisor", 193 | "session.qldb": "Amazon-Quantum-Ledger-Database_QLDB", 194 | "shield": "AWS-Shield_Shield-Advanced", 195 | 
"signer": "AWS-Identity-and-Access-Management-IAM_Data-Encryption-Key", 196 | "sms": "Amazon-Pinpoint", 197 | "sms-voice.pinpoint": "Amazon-Pinpoint", 198 | "snowball": "AWS-Snowball", 199 | "sns": "Amazon-Simple-Notification-Service-SNS", 200 | "sqs": "Amazon-Simple-Queue-Service-SQS", 201 | "ssm": "AWS-Systems-Manager", 202 | "states": "AWS-Step-Functions", 203 | "storagegateway": "AWS-Storage-Gateway", 204 | "streams.dynamodb": "Amazon-DynamoDB", 205 | "sts": "AWS-Identity-and-Access-Management-IAM_AWS-STS", 206 | "support": "AWS-Trusted-Advisor", 207 | "swf": "AWS-Step-Functions", 208 | # foobar 209 | "tagging": "Application-Integration_Event_Resource", 210 | "textract": "Amazon-Textract", 211 | "transcribe": "Amazon-Transcribe", 212 | "transfer": "AWS-Transfer-for-SFTP", 213 | "translate": "Amazon-Translate", 214 | "waf": "AWS-WAF", 215 | "waf-regional": "AWS-WAF", 216 | "wafv2": "AWS-WAF", 217 | "workdocs": "Amazon-WorkDocs", 218 | "worklink": "Amazon-WorkLink", 219 | "workmail": "Amazon-WorkMail", 220 | "workmailmessageflow": "Amazon-WorkMail", 221 | "workspaces": "Amazon-Workspaces", 222 | "xray": "AWS-X-Ray", 223 | } 224 | -------------------------------------------------------------------------------- /apichanges/model.py: -------------------------------------------------------------------------------- 1 | import logging 2 | 3 | from botocore import hooks, model, xform_name 4 | from botocore.docs.docstring import ClientMethodDocstring 5 | from docutils.core import publish_parts 6 | from docutils.writers.html5_polyglot import HTMLTranslator, Writer 7 | 8 | log = logging.getLogger("apichanges.model") 9 | 10 | 11 | class ServiceModel(model.ServiceModel): 12 | def __init__(self, service_description, service_name=None): 13 | super(ServiceModel, self).__init__(service_description, service_name) 14 | # Use our shape factory 15 | self._shape_resolver = ShapeResolver(service_description.get("shapes", {})) 16 | 17 | 18 | class ShapeVisitor(object): 19 | # we use 
visitors due to the presence of recursive 20 | # self/circular references in shapes, for which we 21 | # track seen/stack. 22 | 23 | # all of these visitors would benefit significantly 24 | # from a cache on shape. 25 | 26 | DEFAULT_VALUE = () 27 | 28 | def __init__(self): 29 | self.seen = set() 30 | 31 | def process(self, shape, *params): 32 | skind = type(shape) 33 | stype = repr(shape) 34 | if stype in self.seen: 35 | return self.DEFAULT_VALUE 36 | self.seen.add(stype) 37 | try: 38 | if skind is Shape: 39 | return self.visit_shape(shape, *params) 40 | elif skind is StructureShape: 41 | return self.visit_structure(shape, *params) 42 | elif skind is ListShape: 43 | return self.visit_list(shape, *params) 44 | elif skind is MapShape: 45 | return self.visit_map(shape, *params) 46 | elif skind is StringShape: 47 | return self.visit_string(shape, *params) 48 | finally: 49 | self.seen.remove(stype) 50 | 51 | 52 | class EqualityVisitor(ShapeVisitor): 53 | 54 | DEFAULT_VALUE = True 55 | 56 | def visit_structure(self, shape, other): 57 | # type change to struct 58 | if type(shape) != type(other): 59 | return True 60 | added = set(shape.members).difference(other.members) 61 | if added: 62 | return False 63 | for m in shape.members: 64 | if not self.process(shape.members[m], other.members[m]): 65 | return False 66 | return True 67 | 68 | def visit_list(self, shape, other): 69 | return self.process(shape.member, other.member) 70 | 71 | def visit_string(self, shape, other): 72 | return shape.enum == other.enum 73 | 74 | def visit_shape(self, shape, other): 75 | return repr(shape) == repr(other) 76 | 77 | def visit_map(self, shape, other): 78 | return self.process(shape.key, other.key) and self.process( 79 | shape.value, other.value 80 | ) 81 | 82 | 83 | class ReferenceVisitor(ShapeVisitor): 84 | def visit_structure(self, shape, name): 85 | for m in shape.members.values(): 86 | if self.process(m, name): 87 | return True 88 | return False 89 | 90 | def visit_list(self, shape, 
shape_name): 91 | return self.process(shape.member, shape_name) 92 | 93 | def visit_map(self, shape, shape_name): 94 | return self.process(shape.key, shape_name) or self.process( 95 | shape.value, shape_name 96 | ) 97 | 98 | def visit_string(self, shape, shape_name): 99 | return False 100 | 101 | def visit_shape(self, shape, shape_name): 102 | return False 103 | 104 | 105 | class TypeRepr(ShapeVisitor): 106 | def visit_structure(self, shape): 107 | d = {} 108 | for k, m in shape.members.items(): 109 | d[k] = self.process(m) 110 | return d 111 | 112 | def visit_list(self, shape): 113 | return [self.process(shape.member)] 114 | 115 | def visit_map(self, shape): 116 | return {self.process(shape.key): self.process(shape.value)} 117 | 118 | def visit_string(self, shape): 119 | if shape.enum: 120 | return " | ".join(shape.enum) 121 | return "string" 122 | 123 | def visit_shape(self, shape): 124 | return shape.type_name 125 | 126 | 127 | class DeltaVisitor(ShapeVisitor): 128 | def visit_structure(self, new, other): 129 | if type(new) != type(other): 130 | return TypeRepr().process(new) 131 | added = set(new.members).difference(other.members) 132 | modified = {a: TypeRepr().process(new.members[a]) for a in added} 133 | for m in new.members: 134 | if m in added: 135 | continue 136 | md = self.process(new.members[m], other.members[m]) 137 | if md: 138 | modified[m] = md 139 | return modified 140 | 141 | def visit_list(self, new, other): 142 | return self.process(new.member, other.member) 143 | 144 | def visit_map(self, new, other): 145 | return self.process(new.value, other.value) 146 | 147 | def visit_string(self, new, other): 148 | return set(new.enum).difference(other.enum) 149 | 150 | def visit_shape(self, new, other): 151 | return [] 152 | 153 | 154 | class ComparableShape(object): 155 | def __eq__(self, other): 156 | if not isinstance(other, self.__class__): 157 | return False 158 | return EqualityVisitor().process(self, other) 159 | 160 | def references(self, 
shape_name): 161 | return ReferenceVisitor().process(self, shape_name) 162 | 163 | def delta(self, other): 164 | return DeltaVisitor().process(self, other) 165 | 166 | 167 | class StructureShape(ComparableShape, model.StructureShape): 168 | pass 169 | 170 | 171 | class ListShape(ComparableShape, model.ListShape): 172 | pass 173 | 174 | 175 | class MapShape(ComparableShape, model.MapShape): 176 | pass 177 | 178 | 179 | class StringShape(ComparableShape, model.StringShape): 180 | pass 181 | 182 | 183 | class Shape(ComparableShape, model.Shape): 184 | pass 185 | 186 | 187 | class ShapeResolver(model.ShapeResolver): 188 | 189 | # Any type not in this mapping will default to the Shape class. 190 | SHAPE_CLASSES = { 191 | "structure": StructureShape, 192 | "list": ListShape, 193 | "map": MapShape, 194 | "string": StringShape, 195 | } 196 | 197 | DEFAULT_SHAPE = Shape 198 | 199 | # override method to insert default shape 200 | def get_shape_by_name(self, shape_name, member_traits=None): 201 | try: 202 | shape_model = self._shape_map[shape_name] 203 | except KeyError: 204 | raise model.NoShapeFoundError(shape_name) 205 | try: 206 | shape_cls = self.SHAPE_CLASSES.get(shape_model["type"], self.DEFAULT_SHAPE) 207 | except KeyError: 208 | raise model.InvalidShapeError( 209 | "Shape is missing required key 'type': %s" % shape_model 210 | ) 211 | if member_traits: 212 | shape_model = shape_model.copy() 213 | shape_model.update(member_traits) 214 | result = shape_cls(shape_name, shape_model, self) 215 | return result 216 | 217 | 218 | def diff_model(new, old=None): 219 | new = ServiceModel(new) 220 | log.debug("delta diffing service:%s", new.service_name) 221 | if old: 222 | old = ServiceModel(old) 223 | new_methods = set(new.operation_names).difference(old.operation_names) 224 | else: 225 | new_methods = set(new.operation_names) 226 | 227 | changes = [] 228 | for n in new_methods: 229 | changes.append(NewMethod(new, n)) 230 | 231 | if not old: 232 | return ServiceChange(new, 
changes, new=True) 233 | 234 | old_shapes = set(old.shape_names) 235 | modified_shapes = [] 236 | 237 | # TODO: with a shape cache for equality/delta we could avoid 238 | # extraneous compares on shape recursion. 239 | for s in new.shape_names: 240 | ns = new.shape_for(s) 241 | if s not in old_shapes: 242 | continue 243 | os = old.shape_for(s) 244 | if ns == os: # equality visitor 245 | continue 246 | delta = ns.delta(os) 247 | if delta: 248 | modified_shapes.append((s, delta)) 249 | 250 | mshape_map = dict(modified_shapes) 251 | for op in new.operation_names: 252 | op_delta = {} 253 | op_shape = new.operation_model(op) 254 | if op_shape.input_shape and op_shape.input_shape.name in mshape_map: 255 | op_delta["request"] = d_i = mshape_map[op_shape.input_shape.name] 256 | if op_shape.output_shape and op_shape.output_shape.name in mshape_map: 257 | op_delta["response"] = mshape_map[op_shape.output_shape.name] 258 | 259 | # sigh ec2 service specific hack 260 | if ( 261 | new.service_name == "ec2" 262 | and "request" in op_delta 263 | and "TagSpecifications" in d_i 264 | ): 265 | d_i.pop("TagSpecifications") 266 | if not d_i: 267 | op_delta.pop("request") 268 | if not op_delta: 269 | continue 270 | if len(op_delta) == 2 and op_delta["request"] == op_delta["response"]: 271 | op_delta = {"both": op_delta["request"]} 272 | changes.append(UpdatedMethod(new.service_name, op, op_delta)) 273 | 274 | if changes: 275 | return ServiceChange(new, changes) 276 | 277 | 278 | class ReleaseDelta(object): 279 | # Top level container for all the changes 280 | # within a given commit/release. 
281 | def __init__(self, info, service_changes): 282 | self.commit = info 283 | self.service_changes = service_changes 284 | 285 | def __iter__(self): 286 | return iter(self.service_changes) 287 | 288 | def __len__(self): 289 | return len(self.service_changes) 290 | 291 | def __repr__(self): 292 | return ( 293 | "<release-delta commit:{commit} " 294 | "services:{service_count} " 295 | "changes:{change_count}>" 296 | ).format( 297 | commit=self.commit, 298 | service_count=len({s.name for s in self}), 299 | change_count=sum([len(s) for s in self]), 300 | ) 301 | 302 | 303 | class ServiceChange(object): 304 | def __init__(self, service, changes, new=False): 305 | self.new = new 306 | self.service = service 307 | self.changes = changes 308 | self.count_new = self.count_updated = 0 309 | for c in self.changes: 310 | if c.type == "new": 311 | self.count_new += 1 312 | else: 313 | self.count_updated += 1 314 | self.commit = {} 315 | self.logs = () 316 | 317 | @property 318 | def name(self): 319 | return self.service.service_name.lower() 320 | 321 | @property 322 | def title(self): 323 | return self.service.metadata.get("serviceFullName", self.name) 324 | 325 | def __len__(self): 326 | return len(self.changes) 327 | 328 | def __iter__(self): 329 | return iter(self.changes) 330 | 331 | def __repr__(self): 332 | return ( 333 | "<service-change: {name} commit:{commit} " 334 | "updated:{updated} new:{new} logs:{logs}>" 335 | ).format( 336 | name=self.name, 337 | commit=self.commit, 338 | updated=self.count_updated, 339 | new=self.count_new, 340 | logs=self.logs and "yes" or "no", 341 | ) 342 | 343 | LOG_ID_MAP = { 344 | "Elastic Load Balancing v2": "elbv2", 345 | "Lex Runtime Service": "lexruntime", 346 | "SFN": "stepfunctions", 347 | "IoT 1Click Devices Service": "iot1click-devices", 348 | "SageMaker A2I Runtime": "augmentedairuntime", 349 | "Cognito Identity Provider": "cognitoidentityserviceprovider", 350 | # next two might be a specific hack around a particular release/feature 351 | "Cost Explorer": "savingsplans", 352 | "Budgets": "savingsplans", 353 | } 354 | 355 | def associate_logs(self, change_log): 356 | candidates = ( 357 |
self.LOG_ID_MAP.get(self.service.metadata.get("serviceId")), 358 | self.service.metadata.get("serviceId"), 359 | self.service.metadata.get("signingName"), 360 | self.service.metadata.get("endpointPrefix"), 361 | # sso oidc 362 | self.service.metadata.get("serviceAbbreviation", "").replace(" ", "-"), 363 | "-".join( 364 | [ 365 | c 366 | for c in self.service.metadata.get("uid", "").split("-") 367 | if not c.isdigit() 368 | ] 369 | ), 370 | self.service.metadata.get("endpointPrefix", "").replace("-", ""), 371 | self.service.metadata.get("serviceId", "").replace(" ", ""), 372 | self.service.metadata.get("serviceId", "") + "service", 373 | self.service.service_name.replace("-", "") + "service", 374 | self.service.metadata.get("serviceId", "").replace(" ", "-"), 375 | ) 376 | 377 | if not change_log: 378 | return 379 | logs = () 380 | for c in filter(None, candidates): 381 | logs = change_log.get(c.lower(), ()) 382 | if logs: 383 | break 384 | if not logs: 385 | log.warning( 386 | "%s no change log entry found: %s", self.name, list(change_log.keys()) 387 | ) 388 | self.logs = logs 389 | 390 | 391 | class Change(object): 392 | @property 393 | def service_name(self): 394 | return self.service.service_name 395 | 396 | def render_operation(self): 397 | # try and reuse botocore's sphinx doc infrastructure. 
398 | method_doc = ClientMethodDocstring( 399 | operation_model=self.op, 400 | method_name=self.op.name, 401 | event_emitter=hooks.HierarchicalEmitter(), 402 | method_description=self.op.documentation, 403 | example_prefix="client.%s" % xform_name(self.op.name), 404 | include_signature=False, 405 | ) 406 | return self._render_docutils(method_doc) 407 | 408 | def _render_docutils(self, method_doc): 409 | method_writer = Writer() 410 | method_writer.translator_class = HTMLTranslator 411 | parts = publish_parts( 412 | str(method_doc), 413 | settings_overrides={"report_level": 4}, 414 | writer=method_writer, 415 | ) 416 | return parts["fragment"] 417 | 418 | 419 | class NewMethod(Change): 420 | 421 | type = "new" 422 | 423 | def __init__(self, service, op): 424 | self.service = service 425 | self.op = op 426 | 427 | def __repr__(self): 428 | return "New Method: service:{} method:{}".format(self.service_name, self.op) 429 | 430 | 431 | class UpdatedMethod(Change): 432 | 433 | type = "updated" 434 | 435 | def __init__(self, service, op, delta): 436 | self.service = service 437 | self.op = op 438 | self.delta = delta 439 | 440 | def __repr__(self): 441 | return ("Updated Method: service:{} method:{} " "delta:{}").format( 442 | self.service, self.op, self.delta 443 | ) 444 | -------------------------------------------------------------------------------- /apichanges/publisher.py: -------------------------------------------------------------------------------- 1 | import contextlib 2 | import gzip 3 | import logging 4 | import mimetypes 5 | import os 6 | import shutil 7 | import tempfile 8 | from pathlib import Path 9 | 10 | import boto3 11 | 12 | log = logging.getLogger("apichanges.publish") 13 | 14 | 15 | @contextlib.contextmanager 16 | def temp_dir(): 17 | try: 18 | d = tempfile.mkdtemp() 19 | yield Path(d) 20 | finally: 21 | shutil.rmtree(d) 22 | 23 | 24 | class SitePublisher(object): 25 | 26 | compress_exts = set(("js", "css", "json", "html")) 27 | 28 | def 
__init__(self, site_dir: Path, s3_bucket, s3_prefix=""): 29 | self.site_dir = site_dir 30 | self.bucket = s3_bucket 31 | self.prefix = s3_prefix.rstrip("/") 32 | 33 | def publish(self): 34 | client = boto3.client("s3") 35 | with temp_dir() as staging: 36 | self.prepare_staging(staging) 37 | self.transfer_staging(client, staging) 38 | 39 | def transfer_staging(self, client, staging): 40 | for dirpath, dirnames, files in os.walk(staging): 41 | dirpath = Path(dirpath) 42 | for f in files: 43 | sf = dirpath / f 44 | tf = sf.relative_to(staging) 45 | ext = f.rsplit(".", 1)[-1] 46 | params = {"ACL": "public-read"} 47 | if ext in self.compress_exts: 48 | params["ContentEncoding"] = "gzip" 49 | if ext == "rss": 50 | params["ContentDisposition"] = "inline" 51 | params["ContentType"], _ = mimetypes.guess_type(f) 52 | key = str(self.prefix / tf).lstrip("/") 53 | log.info("upload %s", key) 54 | client.upload_file( 55 | str(sf), Bucket=self.bucket, Key=key, ExtraArgs=params 56 | ) 57 | 58 | def prepare_staging(self, staging): 59 | tf_count = 0 60 | tf_size = 0 61 | for dirpath, dirnames, files in os.walk(self.site_dir): 62 | dirpath = Path(dirpath) 63 | stage_dir = staging / dirpath.relative_to(self.site_dir) 64 | for f in files: 65 | ext = f.rsplit(".", 1)[-1] 66 | tf = stage_dir / f 67 | f = dirpath / f 68 | tf.parent.mkdir(parents=True, exist_ok=True) 69 | if ext in self.compress_exts: 70 | with gzip.open(tf, "w") as fh: 71 | fh.write(f.read_bytes()) 72 | osize = f.stat().st_size 73 | csize = tf.stat().st_size 74 | log.debug( 75 | "compressed %s -> %s -> %s (%0.0f%%)" 76 | % (dirpath / f, osize, csize, csize / float(osize) * 100) 77 | ) 78 | tf_size += csize 79 | else: 80 | shutil.copy2(str(f), str(tf)) 81 | tf_size += f.stat().st_size 82 | tf_count += 1 83 | 84 | log.info("prepared stage %d files %d size" % (tf_count, tf_size)) 85 | -------------------------------------------------------------------------------- /apichanges/record.py: 
-------------------------------------------------------------------------------- 1 | from datetime import datetime 2 | from dataclasses import dataclass 3 | from dataclasses_json import dataclass_json 4 | from typing import List, Dict, Any 5 | 6 | 7 | @dataclass_json 8 | @dataclass 9 | class ServiceChange: 10 | name: str 11 | title: str 12 | change_log: str 13 | new: bool 14 | ops_added: List[str] 15 | ops_updated: List[str] 16 | ops_changes: Dict[str, Any] 17 | model_file: str 18 | 19 | @classmethod 20 | def from_changes(cls, service_changes): 21 | for s in service_changes: 22 | yield cls( 23 | name=s.name, 24 | title=s.title, 25 | new=s.new, 26 | model_file=s.model_file, 27 | change_log="\n".join(s.logs), 28 | ops_added=[c.op for c in s if c.type == "new"], 29 | ops_updated=[c.op for c in s if c.type == "updated"], 30 | ops_changes={c.op: c.delta for c in s if c.type == "updated"}, 31 | ) 32 | 33 | @property 34 | def count_new(self): 35 | return len(self.ops_added) 36 | 37 | @property 38 | def count_updated(self): 39 | return len(self.ops_updated) 40 | 41 | @property 42 | def slug(self): 43 | t = "{c.name} - " 44 | if self.new: 45 | t += "new service - {c.count_new} methods " 46 | else: 47 | if self.count_new: 48 | t += "{c.count_new} new " 49 | if self.count_updated: 50 | t += "{c.count_updated} updated" 51 | return t.format(c=self) 52 | 53 | def __len__(self): 54 | return len(self.ops_added) + len(self.ops_updated) 55 | 56 | 57 | @dataclass_json 58 | @dataclass 59 | class Commit: 60 | id: str 61 | tag: str 62 | created: datetime 63 | service_changes: List[ServiceChange] 64 | 65 | def select(self, service_name: str) -> List[ServiceChange]: 66 | for s in self: 67 | if s.name == service_name: 68 | yield s 69 | 70 | def __iter__(self): 71 | return iter(self.service_changes) 72 | 73 | @property 74 | def size(self): 75 | return sum([len(s) for s in self.service_changes]) 76 | 77 | @classmethod 78 | def from_commits(cls, releases): 79 | for r in releases: 80 | yield 
cls( 81 | id=r.commit["commit_id"], 82 | tag=r.commit["tag"], 83 | created=r.commit["created_at"], 84 | service_changes=list(ServiceChange.from_changes(r)), 85 | ) 86 | -------------------------------------------------------------------------------- /apichanges/repo.py: -------------------------------------------------------------------------------- 1 | import bisect 2 | import json 3 | import logging 4 | import os 5 | import re 6 | from datetime import datetime, timedelta 7 | from distutils.version import LooseVersion 8 | from functools import lru_cache 9 | 10 | import pygit2 11 | from dateutil.parser import parse as parse_date 12 | from dateutil.tz import tzoffset, tzutc 13 | 14 | from .model import diff_model 15 | 16 | log = logging.getLogger("apichanges.repo") 17 | 18 | 19 | def commit_date(commit): 20 | tzinfo = tzoffset(None, timedelta(minutes=commit.author.offset)) 21 | return datetime.fromtimestamp(float(commit.author.time), tzinfo) 22 | 23 | 24 | def commit_dict(commit, committed_date=None): 25 | return dict( 26 | commit_id=str(commit.id), 27 | created_at=committed_date or commit_date(commit), 28 | author=commit.author.name, 29 | author_email=commit.author.email, 30 | committer=commit.committer.name, 31 | committer_email=commit.committer.email, 32 | message=commit.message, 33 | parent_count=len(commit.parents), 34 | ) 35 | 36 | 37 | class CommitProcessor(object): 38 | def __init__( 39 | self, 40 | repo, 41 | model_prefix, 42 | model_suffix, 43 | change_dir=None, 44 | services=(), 45 | debug=False, 46 | ): 47 | self.repo = repo 48 | self.model_prefix = model_prefix 49 | self.model_suffix = model_suffix 50 | self.change_dir = change_dir 51 | self.services = services 52 | self.debug = debug 53 | 54 | def load_change_log(self, fid): 55 | change_log = {} 56 | data = json.loads(self.repo[fid].read_raw().decode("utf8")) 57 | for n in data: 58 | change_log.setdefault(n["category"].strip("`").lower(), []).append( 59 | n["description"] 60 | ) 61 | return change_log
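For reference, the category grouping performed by `load_change_log` above can be sketched in isolation. The payload below is a hypothetical stand-in for botocore's `.changes/<tag>.json` format (a list of entries with `category` and `description` keys — in the real code the blob is read from the git object store via `self.repo[fid].read_raw()`):

```python
import json

# Hypothetical stand-in for a botocore .changes/<tag>.json blob.
raw = json.dumps([
    {"type": "api-change", "category": "`ec2`", "description": "Add tagging on create."},
    {"type": "api-change", "category": "`ec2`", "description": "New instance types."},
    {"type": "api-change", "category": "Lambda", "description": "Larger payload limits."},
])

# Same normalization as load_change_log: strip the back-quotes some
# category names are wrapped in, lower-case, and group descriptions
# per service key.
change_log = {}
for entry in json.loads(raw):
    change_log.setdefault(entry["category"].strip("`").lower(), []).append(
        entry["description"]
    )

print(change_log)
# {'ec2': ['Add tagging on create.', 'New instance types.'],
#  'lambda': ['Larger payload limits.']}
```

The lower-cased keys are what `associate_logs` later probes with its candidate service identifiers.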
62 | 63 | def process(self, commit, change_diff): 64 | if self.debug: 65 | log.debug( 66 | ( 67 | "commit:{commit_id:.8} tag:{tag} date:{created_at:%Y/%m/%d %H:%M}\n" 68 | " stats: {stats}" 69 | ).format( 70 | stats=change_diff.stats.format( 71 | pygit2.GIT_DIFF_STATS_SHORT, 80 72 | ).strip(), 73 | **commit 74 | ) 75 | ) 76 | service_changes = [] 77 | 78 | change_path = change_log = None 79 | if self.change_dir: 80 | change_path = os.path.join( 81 | self.change_dir, "%s.json" % commit["tag"].lstrip("v") 82 | ) 83 | 84 | # Get file map so we can ensure change log first. 85 | file_map = {d.new_file.path: d for d in change_diff.deltas} 86 | if change_path and change_path in file_map: 87 | change_log = self.load_change_log(file_map.get(change_path).new_file.id) 88 | 89 | for dpath, d in [ 90 | (f, d) 91 | for f, d in file_map.items() 92 | if f.startswith(self.model_prefix) and f.endswith(self.model_suffix) 93 | ]: 94 | if self.services: 95 | found = False 96 | for s in self.services: 97 | if s in dpath: 98 | found = True 99 | if not found: 100 | continue 101 | if self.debug: 102 | log.debug( 103 | "api model change {} change: {}".format(dpath, d.status_char()) 104 | ) 105 | if d.status_char() == "A": 106 | new = json.loads(self.repo[d.new_file.id].read_raw().decode("utf8")) 107 | old = None 108 | elif d.status_char() == "M": 109 | new = json.loads(self.repo[d.new_file.id].read_raw().decode("utf8")) 110 | old = json.loads(self.repo[d.old_file.id].read_raw().decode("utf8")) 111 | else: 112 | log.warning( 113 | "service file unknown change commit:%s file:%s change:%s", 114 | commit["commit_id"], 115 | dpath, 116 | d.status_char(), 117 | ) 118 | continue 119 | try: 120 | svc_change = diff_model(new, old) 121 | except Exception: 122 | log.error("commit:%s error processing %s", commit["commit_id"], dpath) 123 | raise 124 | continue 125 | 126 | if not svc_change: 127 | continue 128 | 129 | svc_change.model_file = str(d.new_file.id) 130 | svc_change.commit = commit 131 | 
svc_change.associate_logs(change_log) 132 | log.info(svc_change) 133 | service_changes.append(svc_change) 134 | return service_changes 135 | 136 | 137 | class TagWalker(object): 138 | """Iter commits and diffs on a git repo. 139 | """ 140 | 141 | # twin peaks styled, not texas ranger 142 | def __init__(self, repo): 143 | self.repo = repo 144 | 145 | def walk(self, since, until=None): 146 | """parametrized iterator. 147 | 148 | since|until: either a date string or a tag 149 | 150 | if given a date resolve to the nearest tag. until 151 | defaults to last tag. tags are sorted as version 152 | numbers. 153 | """ 154 | tags = self.get_tag_set() 155 | start = self.get_target_tag(tags, since) 156 | end = self.get_target_tag(tags, until, end=True) 157 | 158 | if start == end: 159 | log.debug("walker exit start == end") 160 | return 161 | 162 | for idx, t in enumerate( 163 | tags[tags.index(start) : tags.index(end) + 1], tags.index(start) 164 | ): 165 | previous = self.get_tag_commit(tags[idx - 1]) 166 | cur = self.get_tag_commit(tags[idx]) 167 | change_diff = self.repo.diff(previous, cur) 168 | info = commit_dict(cur) 169 | info["tag"] = str(t).rsplit("/", 1)[-1] 170 | log.debug("walking tag: %s date:%s" % (t, info["created_at"])) 171 | yield previous, cur, info, change_diff 172 | 173 | def resolve(self, target): 174 | if target: 175 | try: 176 | self.repo.lookup_reference("refs/tags/%s" % target) 177 | except (KeyError, pygit2.InvalidSpecError): 178 | target = parse_date(target).astimezone(tzutc()) 179 | return target 180 | 181 | @lru_cache(128) 182 | def get(self, tag): 183 | tags = self.get_tag_set() 184 | idx = bisect.bisect_left(tags, LooseVersion(tag)) 185 | prev = self.get_tag_commit(tags[idx - 1]) 186 | cur = self.get_tag_commit(tag) 187 | return (commit_dict(cur), self.repo.diff(prev, cur)) 188 | 189 | def get_tag_commit(self, tag): 190 | return self.repo.lookup_reference(str(tag)).peel() 191 | 192 | def get_target_tag(self, tags, target, end=False): 193 | target = 
self.resolve(target) 194 | if target is None and end: 195 | return tags[-1] 196 | elif target is None: 197 | return tags[0] 198 | # bisect on version 199 | if not isinstance(target, datetime): 200 | indexer = end and bisect.bisect_left or bisect.bisect_right 201 | idx = indexer(tags, LooseVersion("refs/tags/%s" % target)) 202 | if idx == len(tags): 203 | return tags[-1] 204 | return tags[idx] 205 | 206 | # date linear traversal from recent to older 207 | prev = None 208 | for t in reversed(tags): 209 | t_date = commit_date(self.get_tag_commit(t)) 210 | if not end: 211 | if t_date > target: 212 | prev = t 213 | continue 214 | if t_date < target: 215 | return prev or t 216 | if end: 217 | if t_date < target: 218 | prev = t 219 | if t_date > target: 220 | return prev or t 221 | 222 | @lru_cache(5) 223 | def get_tag_set(self): 224 | regex = re.compile("^refs/tags") 225 | tags = list( 226 | map( 227 | LooseVersion, 228 | filter(lambda r: regex.match(r), self.repo.listall_references()), 229 | ) 230 | ) 231 | tags.sort() 232 | return tags 233 | -------------------------------------------------------------------------------- /apichanges/sitebuild.py: -------------------------------------------------------------------------------- 1 | import itertools 2 | import json 3 | import logging 4 | import operator 5 | import os 6 | import shutil 7 | import time 8 | from collections import Counter 9 | from datetime import datetime, timedelta 10 | from pathlib import Path 11 | from typing import List, Optional 12 | 13 | import arrow 14 | import jinja2 15 | import pygit2 16 | from botocore import hooks, xform_name 17 | from botocore.docs.docstring import ClientMethodDocstring 18 | from dateutil.tz import tzutc 19 | from docutils.core import publish_parts 20 | from docutils.writers.html5_polyglot import HTMLTranslator, Writer 21 | from feedgen.feed import FeedGenerator 22 | 23 | from .icons import get_icon, get_icon_style 24 | from .model import ReleaseDelta, ServiceModel 25 | from 
.record import Commit, ServiceChange # noqa 26 | from .repo import CommitProcessor, TagWalker 27 | 28 | log = logging.getLogger("apichanges.site") 29 | 30 | GIT_EMPTY_FILE = "0000000000000000000000000000000000000000" 31 | 32 | 33 | class DateTimeEncoder(json.JSONEncoder): 34 | def default(self, obj): 35 | if isinstance(obj, datetime): 36 | return obj.isoformat() 37 | return json.JSONEncoder.default(self, obj) 38 | 39 | 40 | def bisect_create_age(commits: List[Commit], days: int) -> int: 41 | marker_date = datetime.now().astimezone(tzutc()).replace( 42 | hour=0, minute=0, second=0, microsecond=0 43 | ) - timedelta(days=days) 44 | 45 | for idx, c in enumerate(commits): 46 | if marker_date > c.created: 47 | break 48 | return idx 49 | 50 | 51 | def bisect_month(commits: List[Commit], month: datetime) -> int: 52 | for idx, c in enumerate(commits): 53 | if (c.created.year, c.created.month) == (month.year, month.month): 54 | continue 55 | return idx - 1 56 | 57 | 58 | def group_by_date( 59 | commits: List[Commit], year: bool = False, month: bool = False 60 | ) -> dict: 61 | 62 | if year: 63 | key_func = lambda c: c.created.year # noqa 64 | elif month: 65 | key_func = lambda c: (c.created.year, c.created.month) # noqa 66 | else: 67 | raise ValueError("one of month or year should be specified") 68 | 69 | # itertools group by seems flaky.. 70 | groups = {} 71 | for c in commits: 72 | groups.setdefault(key_func(c), []).append(c) 73 | return groups 74 | 75 | 76 | def group_by_service(commits: List[Commit]): 77 | groups = {} 78 | for c in commits: 79 | for s in c.service_changes: 80 | groups.setdefault(s.name, []).append(c) 81 | return groups 82 | 83 | 84 | def chunks(changes, size=20): 85 | # slightly specialized batching implementation. each change commit 86 | # can contain n service changes, which contains n operation 87 | # changes, any page rendered with a large number of operation 88 | # changes will be fairly heavy weight (mbs).
we attempt to size a 89 | # batch based on the number of service changes, but will treat the 90 | # change commit as an atomic unit. in practice this may produce a 91 | # batch with a single change set. 92 | batch = [] 93 | batch_size = 0 94 | for c in changes: 95 | batch_size += c.size 96 | batch.append(c) 97 | if batch_size > size: 98 | yield batch 99 | batch = [] 100 | batch_size = 0 101 | if batch: 102 | yield batch 103 | 104 | 105 | def sizeof_fmt(num, suffix="B"): 106 | for unit in ["", "Kb", "Mb", "Gb", "Tb"]: 107 | if abs(num) < 1024.0: 108 | return "%3.1f%s%s" % (num, unit, suffix) 109 | num /= 1024.0 110 | return "%.1f %s" % (num, suffix) 111 | 112 | 113 | class TemplateAPI: 114 | # flyweight used per template render 115 | def __init__(self, repo, build_time=None): 116 | self.repo = pygit2.Repository(repo) 117 | self.service_models = {} 118 | self.stats = Counter() 119 | self.build_time = build_time 120 | 121 | def get_service_title(self, service_name, commits): 122 | for c in commits: 123 | for s in c: 124 | if s.name == service_name: 125 | return s.title 126 | return service_name 127 | 128 | def get_service_doc(self, service_change): 129 | return self._get_service_model(service_change).documentation 130 | 131 | def get_human_age(self, rdate): 132 | return arrow.get(rdate).humanize() 133 | 134 | def render_operation(self, service_change, op_name): 135 | # try and reuse botocore's sphinx doc infrastructure.
136 | m = self._get_service_model(service_change) 137 | if m is None: 138 | log.error("couldnt find model %s", service_change) 139 | return "" 140 | opm = m.operation_model(op_name) 141 | method_doc = ClientMethodDocstring( 142 | operation_model=opm, 143 | method_name=opm.name, 144 | event_emitter=hooks.HierarchicalEmitter(), 145 | method_description=opm.documentation, 146 | example_prefix="client.%s" % xform_name(opm.name), 147 | include_signature=False, 148 | ) 149 | return self._render_docutils(method_doc) 150 | 151 | def _get_service_model(self, service_change): 152 | if service_change.model_file in self.service_models: 153 | return self.service_models[service_change.model_file] 154 | self.stats["model_load"] += 1 155 | t = time.time() 156 | if service_change.model_file == GIT_EMPTY_FILE: 157 | return 158 | data = json.loads( 159 | self.repo[service_change.model_file].read_raw().decode("utf8") 160 | ) 161 | m = ServiceModel(data) 162 | self.stats["model_load_time"] += time.time() - t 163 | self.service_models[service_change.model_file] = m 164 | return m 165 | 166 | def _render_docutils(self, method_doc): 167 | method_writer = Writer() 168 | method_writer.translator_class = HTMLTranslator 169 | self.stats["op_render"] += 1 170 | t = time.time() 171 | parts = publish_parts( 172 | str(method_doc), 173 | settings_overrides={"report_level": 4}, 174 | writer=method_writer, 175 | ) 176 | self.stats["op_render_time"] += time.time() - t 177 | return parts["fragment"] 178 | 179 | 180 | class Site: 181 | 182 | site_prefix = "" 183 | site_url = "" 184 | default_commit_days = 14 185 | 186 | def __init__(self, repo_path, cache_path, template_dir, assets_dir): 187 | self.repo_path = repo_path 188 | self.cache_path = Path(cache_path) 189 | self.template_dir = Path(template_dir).resolve() 190 | self.assets_dir = assets_dir 191 | 192 | self.commits = [] 193 | self.env = jinja2.Environment( 194 | lstrip_blocks=True, 195 | trim_blocks=True, 196 | 
loader=jinja2.FileSystemLoader(str(template_dir)), 197 | ) 198 | self.pages = [] 199 | self.output = None 200 | self.build_time = datetime.utcnow() 201 | 202 | def upload(self, output: Path, destination: str): 203 | pass 204 | 205 | def build(self, output: Path, destination: Optional[str] = None): 206 | log.info("build site") 207 | self.output = output 208 | new_commits = self.load(self.repo_path, self.cache_path) 209 | self.build_index_pages(self.commits[: bisect_create_age(self.commits, 60)]) 210 | self.build_feed(self.commits[: bisect_create_age(self.commits, days=60)]) 211 | if not new_commits: 212 | log.info("no changes") 213 | # self.build_service_pages(self.commits) 214 | # self.build_commit_pages(self.commits[ 215 | # :bisect_create_age(self.commits, self.default_commit_days)]) 216 | else: 217 | log.info("incremental build %d commits", len(new_commits)) 218 | self.build_commit_pages(new_commits) 219 | self.build_service_pages(self.commits, set(group_by_service(new_commits))) 220 | self.build_search_index( 221 | self.commits[: bisect_create_age(self.commits, days=365 + 60)] 222 | ) 223 | self.copy_assets(output) 224 | pages = list(self.pages) 225 | self.pages = [] 226 | return pages 227 | 228 | def copy_assets(self, output, incremental=True): 229 | if not self.assets_dir: 230 | return 231 | for atype in ("css", "js", "sprite", "icons"): 232 | if incremental and atype == "icons": 233 | continue 234 | shutil.copytree( 235 | self.assets_dir / atype, self.output / atype, dirs_exist_ok=True 236 | ) 237 | 238 | @classmethod 239 | def link(self, relative_path): 240 | link = self.site_url 241 | if self.site_prefix: 242 | link += "/%s" % self.site_prefix 243 | link += "/%s" % relative_path 244 | return link 245 | 246 | def render_page( 247 | self, path, template: Optional[str] = None, force: bool = False, **kw 248 | ): 249 | if template: 250 | tpl = self.env.get_template(template) 251 | p = self.output / path 252 | p.parent.mkdir(parents=True, exist_ok=True) 253 | if 
p.exists() and not force: 254 | return 255 | t = time.time() 256 | with p.open("w") as fh: 257 | tapi = TemplateAPI(str(self.repo_path), self.build_time) 258 | kw["icon_style"] = get_icon_style 259 | kw["icon"] = get_icon 260 | kw["api"] = tapi 261 | kw["build_time"] = self.build_time 262 | if template: 263 | t = time.time() 264 | fh.write(tpl.render(**kw)) 265 | else: 266 | fh.write(kw["content"]) 267 | self.pages.append(path) 268 | 269 | log.debug( 270 | "page:%s size:%s time:%0.2f mtime:%0.2f models:%d op-time:%0.2f", 271 | path, 272 | sizeof_fmt(p.stat().st_size), 273 | time.time() - t, 274 | tapi.stats["model_load_time"], 275 | tapi.stats["model_load"], 276 | tapi.stats["op_render_time"], 277 | ) 278 | 279 | def build_feed(self, commits: List[Commit]): 280 | log.info("build feed page %d" % len(commits)) 281 | feed = FeedGenerator() 282 | feed.id("") 283 | feed.title("AWS API Changes") 284 | feed.author( 285 | { 286 | "name": "AWSPIChanges", 287 | "email": "https://github.com/awslabs/aws-sdk-api-changes", 288 | } 289 | ) 290 | feed.link(href=self.site_url, rel="alternate") 291 | feed.link(href="%s/feed/" % self.site_url, rel="self") 292 | feed.description("AWS API ChangeLog") 293 | feed.language("en-US") 294 | feed.generator("artisan-sdk-gitops") 295 | feed.image( 296 | url="https://a0.awsstatic.com/main/images/logos/aws_logo_smile_179x109.png" 297 | ) # noqa 298 | for c in commits: 299 | for s in c.service_changes: 300 | fe = feed.add_entry(order="append") 301 | fe.title( 302 | "{} - {}{}methods".format( 303 | s.title, 304 | s.count_new and "%d new " % s.count_new or "", 305 | s.count_updated and "%d updated " % s.count_updated or "", 306 | ) 307 | ) 308 | fe.id("{}-{}".format(c.id, s.name)) 309 | fe.description(s.change_log) 310 | fe.link( 311 | { 312 | "href": self.link( 313 | "archive/changes/%s-%s.html" % (c.id[:6], s.name) 314 | ) 315 | } 316 | ) 317 | fe.published(c.created) 318 | self.render_page( 319 | "feed/feed.rss", 320 | force=True, 321 | 
content=feed.rss_str(pretty=True).decode("utf8"), 322 | ) 323 | 324 | def build_search_index(self, commits: List[Commit]): 325 | log.info("build search index %d" % len(commits)) 326 | sd = [] 327 | for c in commits: 328 | for svc in c: 329 | sd.append( 330 | { 331 | "id": c.id, 332 | "created": c.created, 333 | "svc": svc.name, 334 | "t": svc.title, 335 | "log": svc.change_log, 336 | "new": len(svc.ops_added), 337 | "up": len(svc.ops_updated), 338 | } 339 | ) 340 | self.render_page( 341 | "search_data.json", content=json.dumps(sd, cls=DateTimeEncoder) 342 | ) 343 | self.render_page("search/index.html", "search.j2") 344 | 345 | def build_commit_pages(self, commits: List[Commit]): 346 | for c in commits: 347 | for svc_change in c: 348 | self.render_page( 349 | "archive/changes/{}-{}.html".format(c.id[:6], svc_change.name), 350 | "service-commit.j2", 351 | service_change=svc_change, 352 | commit=c, 353 | force=True, 354 | ) 355 | 356 | def build_service_pages(self, commits: List[Commit], services=None): 357 | groups = group_by_service(commits) 358 | for svc_name in sorted(groups): 359 | if services and svc_name not in services: 360 | continue 361 | svc_title = list(groups[svc_name][0].select(svc_name))[0].title 362 | self.render_page( 363 | "archive/service/{}/index.html".format(svc_name), 364 | "service.j2", 365 | service=svc_name, 366 | service_title=svc_title, 367 | releases=groups[svc_name], 368 | force=True, 369 | ) 370 | self.render_page( 371 | "archive/service/index.html", 372 | "service-map.j2", 373 | services=sorted(groups, key=lambda s: groups[s][0].created, reverse=True), 374 | changes=groups, 375 | force=True, 376 | ) 377 | 378 | # def build_month_archive(self, commits): 379 | # for (year, month), mcommits in group_by_date( 380 | # commits, month=True).items(): 381 | # dt = datetime(year=year, month=month, day=1) 382 | # self.render_page( 383 | # 'archive/index/{}/{}/index.html'.format(year, month), 384 | # 'month.j2', 385 | # archive_date=dt, 386 | # 
releases=mcommits) 387 | 388 | def build_index_pages(self, commits: List[Commit]): 389 | pager = {} 390 | # pager = {'size': len(commits), 391 | # 'archive': self.link( 392 | # 'archive/{:%Y%/m}'.format(commits[-1].created)), 393 | # 'pages': [idx for idx, batch in enumerate(pages)]} 394 | # 395 | # log.info('build main pages %d' % len(pager['pages'])) 396 | for idx, batch in [(0, commits)]: 397 | p = "%s.html" % (idx and "archive/index/%d" % idx or "index") 398 | log.info( 399 | "main page: %s commits: %d changes: %d start: %s period: %s", 400 | str(p), 401 | len(batch), 402 | sum([len(s) for s in itertools.chain(*[c for c in batch])]), 403 | batch[-1].created.strftime("%Y-%m-%d"), 404 | (batch[0].created - batch[-1].created).days, 405 | ) 406 | self.render_page(p, "index.j2", releases=batch, pager=pager, force=True) 407 | 408 | def load(self, repo_path: str, cache_path: str, since: Optional[str] = None): 409 | log.info("git walking repository") 410 | commits = [] 411 | if os.path.exists(cache_path): 412 | with open(cache_path) as fh: 413 | commits = Commit.schema().loads(fh.read(), many=True) 414 | commits.sort(key=operator.attrgetter("created"), reverse=True) 415 | self.commits = commits 416 | if not commits: 417 | new_commits = self._load(repo_path, since=since) 418 | if since is None: # last commit is typically an import 419 | new_commits.pop(-1) 420 | else: 421 | new_commits = self._load(repo_path, since=commits[0].tag) 422 | self.commits.extend(new_commits) 423 | self.commits.sort(key=operator.attrgetter("created"), reverse=True) 424 | with open(cache_path, "w") as fh: 425 | fh.write(Commit.schema().dumps(self.commits, many=True)) 426 | return new_commits 427 | 428 | def _load( 429 | self, repo_path: str, since: Optional[str] = None, until: Optional[str] = None 430 | ) -> List[Commit]: 431 | repo = pygit2.Repository(str(Path(repo_path).expanduser().resolve())) 432 | walker = TagWalker(repo) 433 | delta = CommitProcessor( 434 | repo, 435 | 
change_dir=".changes", 436 | model_prefix="apis/", 437 | model_suffix="normal.json", 438 | ) 439 | releases = [] 440 | for _, _, commit_info, change_diff in walker.walk(since, until): 441 | svc_changes = delta.process(commit_info, change_diff) 442 | if svc_changes: 443 | releases.append(ReleaseDelta(commit_info, svc_changes)) 444 | return list(Commit.from_commits(releases)) 445 | -------------------------------------------------------------------------------- /assets/css/docutils.css: -------------------------------------------------------------------------------- 1 | /* Minimal style sheet for the HTML output of Docutils. */ 2 | /* */ 3 | /* :Author: Günter Milde, based on html4css1.css by David Goodger */ 4 | /* :Id: $Id: minimal.css 8216 2018-06-05 13:37:44Z milde $ */ 5 | /* :Copyright: © 2015 Günter Milde. */ 6 | /* :License: Released under the terms of the `2-Clause BSD license`_, */ 7 | /* in short: */ 8 | /* */ 9 | /* Copying and distribution of this file, with or without modification, */ 10 | /* are permitted in any medium without royalty provided the copyright */ 11 | /* notice and this notice are preserved. */ 12 | /* */ 13 | /* This file is offered as-is, without any warranty. */ 14 | /* */ 15 | /* .. _2-Clause BSD license: http://www.spdx.org/licenses/BSD-2-Clause */ 16 | 17 | /* This CSS2.1_ stylesheet defines rules for Docutils elements without */ 18 | /* HTML equivalent. It is required to make the document semantic visible. */ 19 | /* */ 20 | /* .. _CSS2.1: http://www.w3.org/TR/CSS2 */ 21 | /* .. 
_validates: http://jigsaw.w3.org/css-validator/validator$link */ 22 | 
23 | /* alignment of text and inline objects inside block objects*/
24 | .align-left { text-align: left; }
25 | .align-right { text-align: right; }
26 | .align-center { clear: both; text-align: center; }
27 | .align-top { vertical-align: top; }
28 | .align-middle { vertical-align: middle; }
29 | .align-bottom { vertical-align: bottom; }
30 | 
31 | /* titles */
32 | h1.title, p.subtitle {
33 | text-align: center;
34 | }
35 | p.admonition-title,
36 | p.topic-title,
37 | p.sidebar-title,
38 | p.rubric,
39 | p.system-message-title {
40 | font-weight: bold;
41 | }
42 | h1 + p.subtitle,
43 | h1 + p.section-subtitle {
44 | font-size: 1.6em;
45 | }
46 | h2 + p.section-subtitle { font-size: 1.28em; }
47 | p.subtitle,
48 | p.section-subtitle,
49 | p.sidebar-subtitle {
50 | font-weight: bold;
51 | margin-top: -0.5em;
52 | }
53 | p.sidebar-title,
54 | p.rubric {
55 | font-size: larger;
56 | }
57 | p.rubric { color: maroon; }
58 | a.toc-backref {
59 | color: black;
60 | text-decoration: none; }
61 | 
62 | /* Warnings, Errors */
63 | div.caution p.admonition-title,
64 | div.attention p.admonition-title,
65 | div.danger p.admonition-title,
66 | div.error p.admonition-title,
67 | div.warning p.admonition-title,
68 | div.system-messages h1,
69 | div.error,
70 | span.problematic,
71 | p.system-message-title {
72 | color: red;
73 | }
74 | 
75 | /* inline literals */
76 | span.docutils.literal {
77 | font-family: monospace;
78 | white-space: pre-wrap;
79 | }
80 | /* do not wrap at hyphens and similar: */
81 | .literal > span.pre { white-space: nowrap; }
82 | 
83 | /* Lists */
84 | 
85 | /* compact and simple lists: no margin between items */
86 | .simple li, .compact li,
87 | .simple ul, .compact ul,
88 | .simple ol, .compact ol,
89 | .simple > li p, .compact > li p,
90 | dl.simple > dd, dl.compact > dd {
91 | margin-top: 0;
92 | margin-bottom: 0;
93 | }
94 | 
95 | /* Table of Contents */
96 | div.topic.contents { 
margin: 0; }
97 | div.topic.contents ul {
98 | list-style-type: none;
99 | padding-left: 1.5em;
100 | }
101 | 
102 | /* Enumerated Lists */
103 | ol.arabic { list-style: decimal }
104 | ol.loweralpha { list-style: lower-alpha }
105 | ol.upperalpha { list-style: upper-alpha }
106 | ol.lowerroman { list-style: lower-roman }
107 | ol.upperroman { list-style: upper-roman }
108 | 
109 | dt span.classifier { font-style: italic }
110 | dt span.classifier:before {
111 | font-style: normal;
112 | margin: 0.5em;
113 | content: ":";
114 | }
115 | 
116 | /* Field Lists and derivatives */
117 | /* bold field name, content starts on the same line */
118 | dl.field-list > dt,
119 | dl.option-list > dt,
120 | dl.docinfo > dt,
121 | dl.footnote > dt,
122 | dl.citation > dt {
123 | font-weight: bold;
124 | clear: left;
125 | float: left;
126 | margin: 0;
127 | padding: 0;
128 | padding-right: 0.5em;
129 | }
130 | /* Offset for field content (corresponds to the --field-name-limit option) */
131 | dl.field-list > dd,
132 | dl.option-list > dd,
133 | dl.docinfo > dd {
134 | margin-left: 9em; /* ca. 
14 chars in the test examples */ 135 | } 136 | /* start field-body on a new line after long field names */ 137 | dl.field-list > dd > *:first-child, 138 | dl.option-list > dd > *:first-child 139 | { 140 | display: inline-block; 141 | width: 100%; 142 | margin: 0; 143 | } 144 | /* field names followed by a colon */ 145 | dl.field-list > dt:after, 146 | dl.docinfo > dt:after { 147 | content: ":"; 148 | } 149 | 150 | /* Bibliographic Fields (docinfo) */ 151 | pre.address { font: inherit; } 152 | dd.authors > p { margin: 0; } 153 | 154 | /* Option Lists */ 155 | dl.option-list { margin-left: 40px; } 156 | dl.option-list > dt { font-weight: normal; } 157 | span.option { white-space: nowrap; } 158 | 159 | /* Footnotes and Citations */ 160 | dl.footnote.superscript > dd {margin-left: 1em; } 161 | dl.footnote.brackets > dd {margin-left: 2em; } 162 | dl > dt.label { font-weight: normal; } 163 | a.footnote-reference.brackets:before, 164 | dt.label > span.brackets:before { content: "["; } 165 | a.footnote-reference.brackets:after, 166 | dt.label > span.brackets:after { content: "]"; } 167 | a.footnote-reference.superscript, 168 | dl.footnote.superscript > dt.label { 169 | vertical-align: super; 170 | font-size: smaller; 171 | } 172 | dt.label > span.fn-backref { margin-left: 0.2em; } 173 | dt.label > span.fn-backref > a { font-style: italic; } 174 | 175 | /* Line Blocks */ 176 | div.line-block { display: block; } 177 | div.line-block div.line-block { 178 | margin-top: 0; 179 | margin-bottom: 0; 180 | margin-left: 40px; 181 | } 182 | 183 | /* Figures, Images, and Tables */ 184 | .figure.align-left, 185 | img.align-left, 186 | object.align-left, 187 | table.align-left { 188 | margin-right: auto; 189 | } 190 | .figure.align-center, 191 | img.align-center, 192 | object.align-center { 193 | margin-left: auto; 194 | margin-right: auto; 195 | display: block; 196 | } 197 | table.align-center { 198 | margin-left: auto; 199 | margin-right: auto; 200 | } 201 | .figure.align-right, 202 | 
img.align-right, 203 | object.align-right, 204 | table.align-right { 205 | margin-left: auto; 206 | } 207 | /* reset inner alignment in figures and tables */ 208 | /* div.align-left, div.align-center, div.align-right, */ 209 | table.align-left, table.align-center, table.align-right 210 | { text-align: inherit } 211 | 212 | /* Admonitions and System Messages */ 213 | div.admonition, 214 | div.system-message, 215 | div.sidebar{ 216 | margin: 40px; 217 | border: medium outset; 218 | padding-right: 1em; 219 | padding-left: 1em; 220 | } 221 | 222 | /* Sidebar */ 223 | div.sidebar { 224 | width: 30%; 225 | max-width: 26em; 226 | float: right; 227 | clear: right; 228 | } 229 | 230 | /* Text Blocks */ 231 | blockquote, 232 | div.topic, 233 | pre.literal-block, 234 | pre.doctest-block, 235 | pre.math, 236 | pre.code { 237 | margin-left: 1.5em; 238 | margin-right: 1.5em 239 | } 240 | pre.code .ln { color: gray; } /* line numbers */ 241 | 242 | /* Tables */ 243 | table { border-collapse: collapse; } 244 | td, th { 245 | border-style: solid; 246 | border-color: silver; 247 | padding: 0 1ex; 248 | border-width: thin; 249 | } 250 | td > p:first-child, th > p:first-child { margin-top: 0; } 251 | td > p, th > p { margin-bottom: 0; } 252 | 253 | table > caption { 254 | text-align: left; 255 | margin-bottom: 0.25em 256 | } 257 | 258 | table.borderless td, table.borderless th { 259 | border: 0; 260 | padding: 0; 261 | padding-right: 0.5em /* separate table cells */ 262 | } 263 | 264 | 265 |