├── .flake8 ├── .gitattributes ├── .github └── workflows │ └── main.yml ├── .gitignore ├── .isort.cfg ├── .pre-commit-config.yaml ├── .pylintrc ├── IntuneUploader ├── IntuneAppCleaner.py ├── IntuneAppIconGetter.py ├── IntuneAppPromoter.py ├── IntuneAppUploader.py ├── IntuneScriptUploader.py ├── IntuneSlackNotifier.py ├── IntuneTeamsNotifier.py ├── IntuneUploader.recipe ├── IntuneUploaderLib │ └── IntuneUploaderBase.py ├── IntuneVTAppDeleter.py └── requirements.txt ├── LICENSE └── README.md /.flake8: -------------------------------------------------------------------------------- 1 | [flake8] 2 | select = B,C,E,F,P,W,B9 3 | max-line-length = 80 4 | ### DEFAULT IGNORES FOR 4-space INDENTED PROJECTS ### 5 | # E127, E128 are hard to silence in certain nested formatting situations. 6 | # E203 doesn't work for slicing 7 | # E265, E266 talk about comment formatting which is too opinionated. 8 | # E402 warns on imports coming after statements. There are important use cases 9 | # that require statements before imports. 10 | # E501 is not flexible enough, we're using B950 instead. 11 | # E722 is a duplicate of B001. 12 | # P207 is a duplicate of B003. 13 | # P208 is a duplicate of C403. 14 | # W503 talks about operator formatting which is too opinionated. 
15 | ignore = E127, E128, E203, E265, E266, E402, E501, E722, P207, P208, W503 16 | ### DEFAULT IGNORES FOR 2-space INDENTED PROJECTS (uncomment) ### 17 | # ignore = E111, E114, E121, E127, E128, E265, E266, E402, E501, P207, P208, W503 18 | exclude = 19 | .git, 20 | .hg, 21 | # This will be fine-tuned over time 22 | max-complexity = 65 23 | -------------------------------------------------------------------------------- /.gitattributes: -------------------------------------------------------------------------------- 1 | # Auto detect text files and perform LF normalization 2 | * text=auto 3 | -------------------------------------------------------------------------------- /.github/workflows/main.yml: -------------------------------------------------------------------------------- 1 | name: Fetch GitHub API Data 2 | 3 | on: 4 | schedule: 5 | - cron: "0 * * * *" 6 | workflow_dispatch: 7 | 8 | jobs: 9 | fetch: 10 | runs-on: ubuntu-latest 11 | steps: 12 | - name: Checkout gh-pages branch 13 | uses: actions/checkout@v3 14 | with: 15 | ref: gh-pages 16 | 17 | - name: Fetch GitHub Repository Data 18 | run: | 19 | curl -H "Authorization: token ${{ secrets.GH_API_TOKEN }}" \ 20 | -H "Accept: application/vnd.github.v3+json" \ 21 | https://api.github.com/repos/almenscorner/intune-uploader \ 22 | | jq 'del(.pushed_at, .size)' > data/github.json 23 | 24 | - name: Fetch GitHub Contributors 25 | run: | 26 | curl -H "Authorization: token ${{ secrets.GH_API_TOKEN }}" \ 27 | -H "Accept: application/vnd.github.v3+json" \ 28 | https://api.github.com/repos/almenscorner/intune-uploader/contributors \ 29 | > data/contributors.json 30 | 31 | - name: Commit & Push Changes to gh-pages 32 | run: | 33 | git config --global user.email "actions@github.com" 34 | git config --global user.name "GitHub Actions" 35 | git add data/github.json data/contributors.json 36 | git commit -m "Update GitHub API cache [skip ci]" || exit 0 37 | git push origin gh-pages 38 | 
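The two curl steps in the workflow above fetch JSON from the GitHub API and, for the repository object, pipe it through `jq 'del(.pushed_at, .size)'` so that fields which change on every push do not churn the cached file. A minimal Python sketch of that fetch-and-filter step; the helper names here are illustrative assumptions (the workflow itself only uses curl and jq):

```python
import json
import urllib.request

REPO_URL = "https://api.github.com/repos/almenscorner/intune-uploader"


def filter_repo_data(repo: dict, volatile=("pushed_at", "size")) -> dict:
    """Drop fields that change on every push, mirroring jq's del()."""
    return {k: v for k, v in repo.items() if k not in volatile}


def fetch_repo_data(token: str) -> dict:
    """Hypothetical helper: fetch and filter the repo object like the curl step."""
    req = urllib.request.Request(
        REPO_URL,
        headers={
            "Authorization": f"token {token}",
            "Accept": "application/vnd.github.v3+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return filter_repo_data(json.load(resp))
```

Because the volatile fields are stripped before the commit step, `git commit` exits non-zero on an unchanged cache and the `|| exit 0` guard ends the job cleanly.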
-------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | share/python-wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | MANIFEST 28 | 29 | # PyInstaller 30 | # Usually these files are written by a python script from a template 31 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 32 | *.manifest 33 | *.spec 34 | 35 | # Installer logs 36 | pip-log.txt 37 | pip-delete-this-directory.txt 38 | 39 | # Unit test / coverage reports 40 | htmlcov/ 41 | .tox/ 42 | .nox/ 43 | .coverage 44 | .coverage.* 45 | .cache 46 | nosetests.xml 47 | coverage.xml 48 | *.cover 49 | *.py,cover 50 | .hypothesis/ 51 | .pytest_cache/ 52 | cover/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | .pybuilder/ 76 | target/ 77 | 78 | # Jupyter Notebook 79 | .ipynb_checkpoints 80 | 81 | # IPython 82 | profile_default/ 83 | ipython_config.py 84 | 85 | # pyenv 86 | # For a library or package, you might want to ignore these files since the code is 87 | # intended to run in multiple environments; otherwise, check them in: 88 | # .python-version 89 | 90 | # pipenv 91 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 
92 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 93 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 94 | # install all needed dependencies. 95 | #Pipfile.lock 96 | 97 | # poetry 98 | # Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control. 99 | # This is especially recommended for binary packages to ensure reproducibility, and is more 100 | # commonly ignored for libraries. 101 | # https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control 102 | #poetry.lock 103 | 104 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow 105 | __pypackages__/ 106 | 107 | # Celery stuff 108 | celerybeat-schedule 109 | celerybeat.pid 110 | 111 | # SageMath parsed files 112 | *.sage.py 113 | 114 | # Environments 115 | .env 116 | .venv 117 | env/ 118 | venv/ 119 | ENV/ 120 | env.bak/ 121 | venv.bak/ 122 | 123 | # Spyder project settings 124 | .spyderproject 125 | .spyproject 126 | 127 | # Rope project settings 128 | .ropeproject 129 | 130 | # mypy 131 | .mypy_cache/ 132 | .dmypy.json 133 | dmypy.json 134 | 135 | # Pyre type checker 136 | .pyre/ 137 | 138 | # pytype static type analyzer 139 | .pytype/ 140 | 141 | # Cython debug symbols 142 | cython_debug/ 143 | 144 | # PyCharm 145 | # JetBrains specific template is maintained in a separate JetBrains.gitignore that can 146 | # be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore 147 | # and can be added to the global gitignore or merged into this file. For a more nuclear 148 | # option (not recommended) you can uncomment the following to ignore the entire idea folder.
149 | .idea/ 150 | 151 | .DS_Store 152 | 153 | .vscode/ 154 | 155 | site/ 156 | 157 | .jekyll-cache/ 158 | 159 | _site/ 160 | -------------------------------------------------------------------------------- /.isort.cfg: -------------------------------------------------------------------------------- 1 | [settings] 2 | multi_line_output=3 3 | include_trailing_comma=True 4 | force_grid_wrap=0 5 | use_parentheses=True 6 | line_length=88 7 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | repos: 2 | - repo: https://github.com/homebysix/pre-commit-macadmin 3 | rev: v1.12.4 4 | hooks: 5 | - id: check-autopkg-recipes 6 | args: ["--recipe-prefix=com.github.almenscorner.", "--strict", "--"] 7 | exclude: ^Scripts/Template\.intune\.recipe$ 8 | - id: forbid-autopkg-overrides 9 | - id: forbid-autopkg-trust-info 10 | - repo: https://github.com/pre-commit/pre-commit-hooks 11 | rev: v4.5.0 12 | hooks: 13 | - id: check-added-large-files 14 | args: ["--maxkb=200"] 15 | - id: check-ast 16 | - id: check-case-conflict 17 | - id: check-docstring-first 18 | - id: check-executables-have-shebangs 19 | - id: check-merge-conflict 20 | - id: check-yaml 21 | - id: detect-private-key 22 | - id: end-of-file-fixer 23 | - id: fix-byte-order-marker 24 | - id: fix-encoding-pragma 25 | - id: mixed-line-ending 26 | - id: no-commit-to-branch 27 | - id: trailing-whitespace 28 | args: ["--markdown-linebreak-ext=md"] 29 | - repo: https://github.com/ambv/black 30 | rev: 23.10.0 31 | hooks: 32 | - id: black 33 | - repo: https://github.com/pycqa/isort 34 | rev: 5.12.0 35 | hooks: 36 | - id: isort 37 | - repo: https://github.com/pycqa/flake8 38 | rev: 6.1.0 39 | hooks: 40 | - id: flake8 41 | - repo: https://github.com/PyCQA/pylint 42 | rev: v3.0.1 43 | hooks: 44 | - id: pylint 45 | -------------------------------------------------------------------------------- 
/.pylintrc: -------------------------------------------------------------------------------- 1 | [MAIN] 2 | 3 | disable= 4 | attribute-defined-outside-init, 5 | consider-using-f-string, 6 | consider-using-generator, 7 | consider-using-join, 8 | consider-using-with, 9 | duplicate-code, 10 | import-error, 11 | inconsistent-return-statements, 12 | invalid-name, 13 | line-too-long, 14 | missing-module-docstring, 15 | raise-missing-from, 16 | too-few-public-methods, 17 | too-many-arguments, 18 | too-many-branches, 19 | too-many-instance-attributes, 20 | too-many-locals, 21 | too-many-return-statements, 22 | too-many-statements, 23 | unnecessary-lambda-assignment, 24 | wrong-import-position, 25 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneAppCleaner.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor cleans up apps in Intune based on the input variables. 6 | It will keep the number of versions specified in the keep_version_count variable. It will delete the rest. 7 | 8 | Created by Tobias Almén 9 | """ 10 | 11 | import os 12 | import sys 13 | 14 | __all__ = ["IntuneAppCleaner"] 15 | 16 | sys.path.insert(0, os.path.dirname(__file__)) 17 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 18 | 19 | 20 | class IntuneAppCleaner(IntuneUploaderBase): 21 | """This processor cleans up apps in Intune based on the input variables.""" 22 | 23 | description = __doc__ 24 | input_variables = { 25 | "display_name": { 26 | "required": True, 27 | "description": "The name of the app to clean up.", 28 | }, 29 | "keep_version_count": { 30 | "required": False, 31 | "description": "The number of versions to keep. 
Defaults to 3.", 32 | "default": 3, 33 | }, 34 | "test_mode": { 35 | "required": False, 36 | "description": "If True, will only print what would have been done.", 37 | "default": False, 38 | }, 39 | } 40 | output_variables = { 41 | "intuneappcleaner_summary_result": { 42 | "description": "Description of interesting results." 43 | } 44 | } 45 | 46 | def main(self): 47 | """Main process.""" 48 | # Set variables 49 | self.BASE_ENDPOINT = ( 50 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileApps" 51 | ) 52 | self.CLIENT_ID = self.env.get("CLIENT_ID") 53 | self.CLIENT_SECRET = self.env.get("CLIENT_SECRET") 54 | self.TENANT_ID = self.env.get("TENANT_ID") 55 | app_name = self.env.get("display_name") 56 | keep_versions = self.env.get("keep_version_count") 57 | apps_to_delete = [] 58 | test_mode = self.env.get("test_mode") 59 | 60 | # Get access token 61 | self.token = self.obtain_accesstoken( 62 | self.CLIENT_ID, self.CLIENT_SECRET, self.TENANT_ID 63 | ) 64 | 65 | # When running from the command line, keep_versions is a string, convert to int 66 | if isinstance(keep_versions, str): 67 | keep_versions = int(keep_versions) 68 | 69 | # Get macthing apps 70 | apps = self.get_matching_apps(app_name) 71 | self.output(f"Found {str(len(apps))} apps matching {app_name}") 72 | 73 | if len(apps) == 0: 74 | return 75 | 76 | # Get primaryBundleVersion or buildNumber for each app and set it as a key in the app dict 77 | apps = list( 78 | map( 79 | lambda item: {**item, "primaryBundleVersion": item["buildNumber"]} 80 | if "primaryBundleVersion" not in item and "buildNumber" in item 81 | else item, 82 | apps, 83 | ) 84 | ) 85 | # Get a sorted list of apps by version 86 | apps = sorted(apps, key=lambda app: app["primaryBundleVersion"], reverse=True) 87 | 88 | # If the number of apps is greater than the keep version count, remove the apps that should be kept 89 | if len(apps) > keep_versions: 90 | self.output("App count is greater than keep version count, removing apps.") 91 | 
# Slice the sorted list: everything past the keep count gets deleted 92 | apps_to_delete = apps[keep_versions:] 93 | # Delete the apps that should not be kept 94 | for app in apps_to_delete: 95 | self.output( 96 | ("Would delete app: " if test_mode else "Deleting app: ") 97 | + app["displayName"] 98 | + " " 99 | + app["primaryBundleVersion"] 100 | ) 101 | # Only delete if not in test mode 102 | if not test_mode: 103 | self.makeapirequestDelete( 104 | self.BASE_ENDPOINT + "/" + app["id"], self.token 105 | ) 106 | 107 | self.env["intuneappcleaner_summary_result"] = { 108 | "summary_text": "Summary of IntuneAppCleaner results:", 109 | "report_fields": [ 110 | "searched name", 111 | "keep count", 112 | "match count", 113 | "removed count", 114 | "removed versions", 115 | ], 116 | "data": { 117 | "searched name": app_name, 118 | "keep count": str(keep_versions), 119 | "match count": str(len(apps)), 120 | "removed count": str(len(apps_to_delete)), 121 | "removed versions": ", ".join( 122 | [app["primaryBundleVersion"] for app in apps_to_delete] 123 | ) 124 | if len(apps_to_delete) > 0 125 | else "", 126 | }, 127 | } 128 | 129 | 130 | if __name__ == "__main__": 131 | PROCESSOR = IntuneAppCleaner() 132 | PROCESSOR.execute_shell() 133 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneAppIconGetter.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor extracts the app icon from a .app or .dmg file and saves it as a .png file. 6 | This is a very basic processor that uses the sips command to convert the icon to png format. 7 | For more advanced icon extraction, use the AppIconExtractor processor.
8 | 9 | Created by Tobias Almén 10 | """ 11 | 12 | import glob 13 | import os 14 | import plistlib 15 | import subprocess 16 | 17 | from autopkglib import DmgMounter 18 | 19 | 20 | class IntuneAppIconGetter(DmgMounter): 21 | """Extracts the app icon from a .app or .dmg file and saves it as a .png file.""" 22 | 23 | input_variables = { 24 | "app_file": { 25 | "required": True, 26 | "description": "Path to the .app or .dmg file to extract the icon from.", 27 | }, 28 | "name": { 29 | "required": True, 30 | "description": "Name of the app to use in the output file name.", 31 | }, 32 | } 33 | output_variables = { 34 | "icon_file_path": { 35 | "description": "Path to the extracted icon file, if successful.", 36 | } 37 | } 38 | 39 | def main(self): 40 | """Main process""" 41 | # Get input variables 42 | app_file = self.env.get("app_file") 43 | name = self.env.get("name") 44 | recipe_cache_dir = self.env.get("RECIPE_CACHE_DIR") 45 | self.env["icon_file_path"] = None 46 | mount_point = None 47 | 48 | # If app bundle not found, skip icon extraction 49 | if not os.path.exists(app_file): 50 | self.output(f"Could not find {app_file}, skipping icon extraction") 51 | return 52 | 53 | # If app bundle is a .dmg file, mount it and get path to .app file 54 | if os.path.splitext(app_file)[1] == ".dmg": 55 | mount_point = self.mount(app_file) 56 | app_path = glob.glob(os.path.join(mount_point, "*.app")) 57 | if not app_path: 58 | self.output("Could not find .app file, skipping icon extraction") 59 | return 60 | # It is assumed that we will get the first .app file in the mounted .dmg 61 | app_path = app_path[0] 62 | info_plist = os.path.join(app_path, "Contents", "Info.plist") 63 | elif os.path.splitext(app_file)[1] == ".app": 64 | app_path = app_file 65 | info_plist = os.path.join(app_path, "Contents", "Info.plist") 66 | else: 67 | self.output("File is not a .app or .dmg file, skipping icon extraction") 68 | return 69 | 70 | # Load Info.plist file and get icon file path 71 | try:
72 | with open(info_plist, "rb") as f: 73 | info_dict = plistlib.load(f) 74 | except plistlib.InvalidFileException: 75 | return 76 | 77 | icon_name = info_dict.get( 78 | "CFBundleIconFile", name 79 | ) # use name as default if CFBundleIconFile is missing 80 | icon_path = os.path.join(app_path, "Contents", "Resources", f"{icon_name}") 81 | icon_output_path = os.path.join(recipe_cache_dir, f"{name}.png") 82 | sips_path = "/usr/bin/sips" 83 | 84 | # If icon file path does not end with .icns, append .icns extension 85 | if not os.path.splitext(icon_path)[1] == ".icns": 86 | icon_path += ".icns" 87 | 88 | # If icon file not found, skip icon extraction 89 | if not os.path.exists(icon_path): 90 | self.output(f"Could not find icon for {name}, skipping icon extraction") 91 | return 92 | 93 | # If sips command not found, skip icon extraction 94 | if not os.path.exists(sips_path): 95 | self.output("Could not find sips, skipping icon extraction") 96 | return 97 | 98 | # Use sips command to convert icon to png format and save to output path 99 | try: 100 | cmd = [ 101 | sips_path, 102 | "-s", 103 | "format", 104 | "png", 105 | icon_path, 106 | "--out", 107 | icon_output_path, 108 | ] 109 | # check=True raises CalledProcessError on a non-zero exit, unlike Popen/wait 110 | subprocess.run(cmd, check=True, capture_output=True) 111 | 112 | # change icon size to 256x256 113 | cmd = [ 114 | sips_path, 115 | icon_output_path, 116 | "-z", 117 | "256", 118 | "256", 119 | "--out", 120 | icon_output_path, 121 | ] 122 | # again raise on failure so the except clause below actually fires 123 | subprocess.run(cmd, check=True, capture_output=True) 124 | except subprocess.CalledProcessError as err: 125 | self.output(f"Error converting icon: {err}") 126 | return 127 | 128 | # Set output variable to path of extracted icon file 129 | if os.path.exists(icon_output_path): 130 | self.env["icon_file_path"] = icon_output_path 131 | 132 | # If app bundle was a .dmg file, unmount it 133 | if mount_point: 134 | self.unmount(app_file) 135 | 136 | 137 | if __name__ ==
"__main__": 138 | PROCESSOR = IntuneAppIconGetter() 139 | PROCESSOR.execute_shell() 140 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneAppPromoter.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor promotes apps in Intune based on the input variables. It will promote the app to the first group in the promotion_info list. 6 | If the app has been promoted before, it will promote the app to the next group in the list, if the number of days since the last promotion is greater than or equal to the number of days in the previous ring. 7 | 8 | Created by Tobias Almén 9 | """ 10 | 11 | import json 12 | import os 13 | import sys 14 | from datetime import datetime 15 | 16 | __all__ = ["IntuneAppPromoter"] 17 | 18 | sys.path.insert(0, os.path.dirname(__file__)) 19 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 20 | 21 | 22 | class App: 23 | """App class""" 24 | 25 | def __init__(self, app_name, app_version): 26 | self.displayName = app_name 27 | self.primaryBundleVersion = app_version 28 | 29 | 30 | class IntuneAppPromoter(IntuneUploaderBase): 31 | """This processor promotes an app to groups in Intune based on the input variables.""" 32 | 33 | description = __doc__ 34 | input_variables = { 35 | "display_name": { 36 | "required": True, 37 | "description": "The name of the app to assign.", 38 | }, 39 | "blacklist_versions": { 40 | "required": False, 41 | "description": "If the app version is in this list, it will not be assigned. Can be a wildcard, for example, 5.*", 42 | }, 43 | "promotion_info": { 44 | "required": True, 45 | "description": "An array of dicts containing information about the assignments and schedule.", 46 | }, 47 | } 48 | output_variables = { 49 | "intuneapppromoter_summary_result": { 50 | "description": "Description of interesting results." 
51 | } 52 | } 53 | 54 | def main(self): 55 | """Main process""" 56 | # Set variables 57 | self.BASE_ENDPOINT = ( 58 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileApps" 59 | ) 60 | self.CLIENT_ID = self.env.get("CLIENT_ID") 61 | self.CLIENT_SECRET = self.env.get("CLIENT_SECRET") 62 | self.TENANT_ID = self.env.get("TENANT_ID") 63 | app_name = self.env.get("display_name") 64 | app_blacklist_versions = self.env.get("blacklist_versions") 65 | promotion_info = self.env.get("promotion_info") 66 | 67 | def promote_app(group): 68 | notes = { 69 | "promotion_date": date.strftime("%Y-%m-%d"), 70 | "ring": group.get("ring"), 71 | } 72 | self.assign_app(app, [group]) 73 | notes["ring"] = group.get("ring") 74 | data = json.dumps( 75 | { 76 | "notes": json.dumps(notes), 77 | "@odata.type": intune_app.get("@odata.type"), 78 | } 79 | ) 80 | self.makeapirequestPatch( 81 | f"{self.BASE_ENDPOINT}/{intune_app.get('id')}", 82 | self.token, 83 | None, 84 | data, 85 | 204, 86 | ) 87 | promotions.append({"version": app_version, "ring": group.get("ring")}) 88 | 89 | # Get access token 90 | self.token = self.obtain_accesstoken( 91 | self.CLIENT_ID, self.CLIENT_SECRET, self.TENANT_ID 92 | ) 93 | 94 | # Check if promotion info is set 95 | if not promotion_info: 96 | self.output("No promotion info found, exiting.") 97 | return None 98 | 99 | promotions = [] 100 | 101 | # Get matching apps 102 | intune_apps = self.get_matching_apps(app_name) 103 | # If no apps are found, exit 104 | if not intune_apps: 105 | self.output(f"No app found with name: {app_name}, exiting.") 106 | return None 107 | 108 | formatted_date_string = datetime.now().strftime("%Y-%m-%d") 109 | date = datetime.strptime(formatted_date_string, "%Y-%m-%d") 110 | 111 | def match_version(version, blacklist_versions): 112 | # Check if version is a wildcard 113 | for v in blacklist_versions: 114 | if v.endswith("*") and version.startswith(v[:-1]): 115 | return True 116 | # Check if version is in blacklist 117 | if 
version in blacklist_versions: 118 | return True 119 | 120 | # If no match, return False 121 | return False 122 | 123 | for intune_app in intune_apps: 124 | self.request = intune_app 125 | 126 | # Get assignments for app 127 | assignments = self.makeapirequest( 128 | f"{self.BASE_ENDPOINT}/{intune_app.get('id')}/assignments", 129 | self.token, 130 | None, 131 | ) 132 | current_group_ids = [ 133 | c["target"].get("groupId") 134 | for c in assignments["value"] 135 | if c["target"].get("groupId") 136 | ] 137 | 138 | def version(intune_app): 139 | return intune_app.get("primaryBundleVersion") or intune_app.get( 140 | "buildNumber" 141 | ) 142 | 143 | app_version = version(intune_app) 144 | app = App(app_name, app_version) 145 | 146 | if app_blacklist_versions is not None and ( 147 | # a wildcard or exact match means this version is blacklisted 148 | match_version(app_version, app_blacklist_versions) 149 | ): 150 | self.output( 151 | f"App version {app_version} is blacklisted, skipping version." 152 | ) 153 | continue 154 | # Get all promotion group ids 155 | promotion_ids = [group.get("group_id") for group in promotion_info] 156 | # If all promotion group ids are already assigned, there is nothing left to promote 157 | if all(gid in current_group_ids for gid in promotion_ids): 158 | self.output( 159 | f"{intune_app.get('displayName')} {app_version} is already assigned to all groups, skipping."
160 | ) 161 | return None 162 | 163 | # If app has not been promoted before, assign to first group in promotion_info 164 | if not intune_app.get("notes"): 165 | promote_app(promotion_info[0]) 166 | # If app has been promoted before, check if it is time to promote again 167 | else: 168 | # Get days since last promotion and previous ring 169 | notes_data = json.loads(intune_app.get("notes")) 170 | promote_date = notes_data.get("promotion_date") 171 | delta = (date - datetime.strptime(promote_date, "%Y-%m-%d")).days 172 | previous_ring = notes_data.get("ring") 173 | previous_ring_days = sum( 174 | [ 175 | group.get("days") 176 | for group in promotion_info 177 | if group.get("ring") == previous_ring 178 | ] 179 | ) 180 | 181 | for group in promotion_info: 182 | # If the delta is greater than or equal to the previous ring days, promote 183 | if ( 184 | group.get("group_id") not in current_group_ids 185 | and delta >= previous_ring_days 186 | ): 187 | promote_app(group) 188 | break 189 | 190 | self.env["intuneapppromoter_summary_result"] = { 191 | "summary_text": "Summary of IntuneAppPromoter results:", 192 | "report_fields": ["app name", "promotions", "blacklisted versions"], 193 | "data": { 194 | "app name": app_name, 195 | "promotions": ( 196 | ", ".join( 197 | [ 198 | f"{promotion.get('version')} ({promotion.get('ring')})" 199 | for promotion in promotions 200 | ] 201 | ) 202 | if len(promotions) > 0 203 | else "" 204 | ), 205 | "blacklisted versions": ( 206 | ", ".join(app_blacklist_versions) 207 | if app_blacklist_versions is not None 208 | else "" 209 | ), 210 | }, 211 | } 212 | 213 | 214 | if __name__ == "__main__": 215 | processor = IntuneAppPromoter() 216 | processor.execute_shell() 217 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneAppUploader.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | 
This processor uploads an app to Microsoft Intune using the Microsoft Graph API. It also assigns the app to group(s) if specified 6 | and adds the app to categories if specified. It also supports updating the app if it already exists in Intune. 7 | 8 | It is heavily inspired by the IntuneImporter processor by @SteveKueng. 9 | 10 | Created by Tobias Almén 11 | """ 12 | 13 | import json 14 | import os 15 | import sys 16 | import tempfile 17 | from dataclasses import dataclass, field 18 | 19 | from autopkglib import ProcessorError 20 | 21 | sys.path.insert(0, os.path.dirname(__file__)) 22 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 23 | 24 | __all__ = ["IntuneAppUploader"] 25 | 26 | 27 | class IntuneAppUploader(IntuneUploaderBase): 28 | """IntuneAppUploader processor""" 29 | 30 | description = __doc__ 31 | input_variables = { 32 | "CLIENT_ID": { 33 | "required": True, 34 | "description": "The client ID to use for authenticating the request.", 35 | }, 36 | "CLIENT_SECRET": { 37 | "required": True, 38 | "description": "The client secret to use for authenticating the request.", 39 | }, 40 | "TENANT_ID": { 41 | "required": True, 42 | "description": "The tenant ID to use for authenticating the request.", 43 | }, 44 | "app_file": { 45 | "required": True, 46 | "description": "The app file to upload to Intune.", 47 | }, 48 | "displayname": { 49 | "required": True, 50 | "description": "The display name of the app.", 51 | }, 52 | "description": { 53 | "required": True, 54 | "description": "The description of the app.", 55 | }, 56 | "publisher": { 57 | "required": True, 58 | "description": "The publisher of the app.", 59 | }, 60 | "owner": { 61 | "required": False, 62 | "description": "The owner of the app.", 63 | }, 64 | "developer": { 65 | "required": False, 66 | "description": "The developer of the app.", 67 | }, 68 | "categories": { 69 | "required": False, 70 | "description": "An array of categories to add to the app by name.
Must be created in Intune first", 71 | }, 72 | "information_url": { 73 | "required": False, 74 | "description": "The information URL of the app.", 75 | }, 76 | "privacy_information_url": { 77 | "required": False, 78 | "description": "The privacy information URL of the app.", 79 | }, 80 | "notes": { 81 | "required": False, 82 | "description": "The notes of the app.", 83 | }, 84 | "bundleId": { 85 | "required": True, 86 | "description": "The bundle ID of the app.", 87 | }, 88 | "bundleVersion": { 89 | "required": True, 90 | "description": "The bundle version of the app.", 91 | }, 92 | "minimumSupportedOperatingSystem": { 93 | "required": False, 94 | "description": "The minimum supported operating system of the app.", 95 | "default": "v11_0", 96 | }, 97 | "install_as_managed": { 98 | "required": False, 99 | "description": "Whether to install the app as managed or not.", 100 | "default": False, 101 | }, 102 | "ignore_version_detection": { 103 | "required": False, 104 | "description": "Whether Intune will ignore the version in the detection of the installed application.", 105 | "default": False, 106 | }, 107 | "icon": { 108 | "required": False, 109 | "description": "Path to the PNG icon of the app.", 110 | }, 111 | "preinstall_script": { 112 | "required": False, 113 | "description": "The base64 encoded preinstall script for the app. Only applicable to unmanaged PKG apps.", 114 | }, 115 | "postinstall_script": { 116 | "required": False, 117 | "description": "The base64 encoded postinstall script for the app. 
Only applicable to unmanaged PKG apps.", 118 | }, 119 | "ignore_current_app": { 120 | "required": False, 121 | "description": "Whether to ignore the current app in Intune and create either way.", 122 | "default": False, 123 | }, 124 | "ignore_current_version": { 125 | "required": False, 126 | "description": "Whether to ignore the current version in Intune and upload binary either way.", 127 | "default": False, 128 | }, 129 | "assignment_info": { 130 | "required": False, 131 | "description": "The assignment info of the app. Provided as an array of dicts containing keys 'group_id' and 'intent'. See https://github.com/almenscorner/intune-uploader/wiki/IntuneAppUploader#input-variables for more information.", 132 | }, 133 | "lob_app": { 134 | "required": False, 135 | "description": "Bool value whether the app is a line-of-business app or not.", 136 | "default": False, 137 | }, 138 | "scope_tags": { 139 | "required": False, 140 | "description": "The scope tags to assign to the app. Provide as a list of strings the ids of the scope tags.", 141 | }, 142 | } 143 | output_variables = { 144 | "name": {"description": "The name of the app that was uploaded."}, 145 | "version": { 146 | "description": "The version of the app that was uploaded or updated." 147 | }, 148 | "intune_app_id": { 149 | "description": "The ID of the app that was uploaded or updated." 150 | }, 151 | "content_version_id": { 152 | "description": "The content version ID of the app that was uploaded or updated." 153 | }, 154 | "intune_app_changed": { 155 | "description": "Returns True if the app was updated or created, False if not." 
156 | }, 157 | "intuneappuploader_summary_result": { 158 | "description": "Description of interesting results.", 159 | }, 160 | } 161 | 162 | def main(self): 163 | """Main process""" 164 | # Set up variables 165 | self.BASE_ENDPOINT = ( 166 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileApps" 167 | ) 168 | self.CLIENT_ID = self.env.get("CLIENT_ID") 169 | self.CLIENT_SECRET = self.env.get("CLIENT_SECRET") 170 | self.TENANT_ID = self.env.get("TENANT_ID") 171 | self.RECIPE_CACHE_DIR = self.env.get("RECIPE_CACHE_DIR") 172 | # Set the content_updated variable to false 173 | self.content_update = False 174 | # Set the intune_app_changed variable to false 175 | self.env["intune_app_changed"] = False 176 | # Get the app info from the environment variables 177 | self.app_file = self.env.get("app_file") 178 | app_displayname = self.env.get("displayname") 179 | app_description = self.env.get("description") 180 | app_publisher = self.env.get("publisher") 181 | app_owner = self.env.get("owner") 182 | app_developer = self.env.get("developer") 183 | app_categories = self.env.get("categories") 184 | app_information_url = self.env.get("information_url") 185 | app_privacy_information_url = self.env.get("privacy_information_url") 186 | app_notes = self.env.get("notes") 187 | app_bundleId = self.env.get("bundleId") 188 | app_bundleVersion = self.env.get("bundleVersion") 189 | app_type = os.path.splitext(self.app_file)[1][1:] 190 | app_assignment_info = self.env.get("assignment_info") 191 | app_install_as_managed = self.env.get("install_as_managed") 192 | app_ignore_version_detection = self.env.get("ignore_version_detection") 193 | app_minimum_os_version = self.env.get("minimumSupportedOperatingSystem") 194 | app_icon = self.env.get("icon") 195 | app_preinstall_script = self.env.get("preinstall_script") 196 | app_postinstall_script = self.env.get("postinstall_script") 197 | app_scope_tags = self.env.get("scope_tags") 198 | filename = os.path.basename(self.app_file) 199 | 
ignore_current_app = self.env.get("ignore_current_app") 200 | ignore_current_version = self.env.get("ignore_current_version") 201 | lob_app = self.env.get("lob_app") 202 | 203 | # Get the access token 204 | self.token = self.obtain_accesstoken( 205 | self.CLIENT_ID, self.CLIENT_SECRET, self.TENANT_ID 206 | ) 207 | 208 | @dataclass 209 | class App: 210 | """ 211 | A class to represent an app. 212 | """ 213 | 214 | displayName: str = app_displayname 215 | description: str = app_description 216 | publisher: str = app_publisher 217 | owner: str = app_owner 218 | developer: str = app_developer 219 | notes: str = app_notes 220 | fileName: str = filename 221 | privacyInformationUrl: str = app_privacy_information_url 222 | informationUrl: str = app_information_url 223 | primaryBundleId: str = app_bundleId 224 | primaryBundleVersion: str = app_bundleVersion 225 | ignoreVersionDetection: bool = app_ignore_version_detection 226 | installAsManaged: bool = app_install_as_managed 227 | minimumSupportedOperatingSystem: dict = field(default_factory=dict) 228 | largeIcon: dict = field(default_factory=dict, init=False) 229 | roleScopeTagIds: list = field(default_factory=list) 230 | 231 | def __post_init__(self): 232 | """ 233 | Creates app data based on the app type. 
234 | """ 235 | 236 | if not lob_app: 237 | self.includedApps = [ 238 | { 239 | "@odata.type": "#microsoft.graph.macOSIncludedApp", 240 | "bundleId": app_bundleId, 241 | "bundleVersion": app_bundleVersion, 242 | } 243 | ] 244 | 245 | if app_type == "dmg": 246 | self.__dict__["@odata.type"] = "#microsoft.graph.macOSDmgApp" 247 | 248 | elif app_type == "pkg": 249 | self.__dict__["@odata.type"] = "#microsoft.graph.macOSPkgApp" 250 | if app_preinstall_script: 251 | self.preInstallScript = { 252 | "scriptContent": app_preinstall_script 253 | } 254 | if app_postinstall_script: 255 | self.postInstallScript = { 256 | "scriptContent": app_postinstall_script 257 | } 258 | 259 | else: 260 | self.childApps = [ 261 | { 262 | "@odata.type": "#microsoft.graph.macOSLobChildApp", 263 | "bundleId": app_bundleId, 264 | "buildNumber": app_bundleVersion, 265 | "versionNumber": "0.0", 266 | } 267 | ] 268 | 269 | self.__dict__.pop("primaryBundleId") 270 | self.__dict__.pop("primaryBundleVersion") 271 | self.__dict__["bundleId"] = app_bundleId 272 | self.__dict__["buildNumber"] = app_bundleVersion 273 | self.__dict__["@odata.type"] = "#microsoft.graph.macOSLobApp" 274 | 275 | self.minimumSupportedOperatingSystem = { 276 | "@odata.type": "#microsoft.graph.macOSMinimumOperatingSystem", 277 | app_minimum_os_version: True, 278 | } 279 | 280 | # Create the app data 281 | app_data = App() 282 | if app_icon: 283 | app_data.largeIcon = { 284 | "type": "image/png", 285 | "value": self.encode_icon(app_icon), 286 | } 287 | 288 | if app_scope_tags: 289 | app_data.roleScopeTagIds = app_scope_tags 290 | 291 | # Get the app data as a dictionary 292 | app_data_dict = app_data.__dict__ 293 | # Convert the dictionary to JSON 294 | data = json.dumps(app_data_dict) 295 | # Check if app already exists 296 | current_app_result, current_app_data = self.get_current_app( 297 | app_displayname, app_bundleVersion, app_data_dict["@odata.type"] 298 | ) 299 | # Get app categories from Intune 300 | 
intune_app_categories = self.get_app_categories() 301 | 302 | # Check if the app categories exist in Intune 303 | if app_categories: 304 | categories_to_create = [] 305 | for category in app_categories: 306 | if category not in intune_app_categories: 307 | categories_to_create.append(category) 308 | if categories_to_create: 309 | self.output( 310 | f"Creating categories {', '.join(categories_to_create)} in Intune" 311 | ) 312 | self.create_app_categories(categories_to_create) 313 | 314 | # If the ignore_current_app variable is set to true, create the app regardless of whether it already exists 315 | if ignore_current_app and not current_app_data: 316 | raise ProcessorError( 317 | "App not found in Intune. Please set ignore_current_app to false." 318 | ) 319 | if ( 320 | ignore_current_app 321 | and app_bundleVersion != current_app_data["primaryBundleVersion"] 322 | ): 323 | self.output( 324 | f"Creating app {app_data.displayName} version {app_bundleVersion}" 325 | ) 326 | # Create the app 327 | self.request = self.makeapirequestPost( 328 | f"{self.BASE_ENDPOINT}", self.token, "", data, 201 329 | ) 330 | 331 | # If the ignore_current_app variable is not set to true, check if the app already exists and update it if necessary 332 | else: 333 | # If the app needs to be updated or the current version should be ignored 334 | if current_app_result == "update" or ignore_current_version is True: 335 | # If the app is not found, raise an error 336 | if not current_app_data: 337 | raise ProcessorError( 338 | "App not found in Intune. Please set ignore_current_version to false." 
339 | ) 340 | 341 | # If the app version is the same, update the file contents 342 | if current_app_data["primaryBundleVersion"] == app_bundleVersion: 343 | self.output( 344 | f'Updating File Contents for app {current_app_data["displayName"]} version {current_app_data["primaryBundleVersion"]}' 345 | ) 346 | # If the app version is different, update the app 347 | else: 348 | self.output( 349 | f'Updating app {current_app_data["displayName"]} from {current_app_data["primaryBundleVersion"]} to version {app_bundleVersion}' 350 | ) 351 | # Update the app 352 | self.content_update = True 353 | self.makeapirequestPatch( 354 | f'{self.BASE_ENDPOINT}/{current_app_data["id"]}', 355 | self.token, 356 | "", 357 | data, 358 | 204, 359 | ) 360 | self.request = current_app_data 361 | 362 | # If the app is up to date and the current version should not be ignored 363 | if current_app_result == "current" and ignore_current_version is False: 364 | self.output( 365 | f'App {current_app_data["displayName"]} version {current_app_data["primaryBundleVersion"]} is up to date' 366 | ) 367 | return 368 | 369 | # If the app does not exist 370 | if current_app_result is None: 371 | self.output( 372 | f"Creating app {app_data.displayName} version {app_data.primaryBundleVersion}" 373 | ) 374 | # Create the app 375 | self.request = self.makeapirequestPost( 376 | f"{self.BASE_ENDPOINT}", self.token, "", data, 201 377 | ) 378 | 379 | # Create the content version 380 | content_version_url = f'{self.BASE_ENDPOINT}/{self.request["id"]}/{str(app_data_dict["@odata.type"]).replace("#", "")}/contentVersions' 381 | self.content_version_request = self.makeapirequestPost( 382 | content_version_url, 383 | self.token, 384 | "", 385 | json.dumps({}), 386 | 201, 387 | ) 388 | 389 | if not self.content_version_request: 390 | self.output("Failed to create content version, trying again") 391 | self.content_version_request = self.makeapirequestPost( 392 | content_version_url, 393 | self.token, 394 | "", 395 |
json.dumps({}), 396 | 201, 397 | ) 398 | if not self.content_version_request: 399 | self.delete_app() 400 | raise ProcessorError("Failed to create content version") 401 | 402 | # Encrypt the app 403 | encryptionData, encryptionInfo = self.encrypt_app() 404 | 405 | # Write the encryption data to a file 406 | new_file, tempfilename = tempfile.mkstemp(dir=self.RECIPE_CACHE_DIR) 407 | with open(new_file, "wb") as f: 408 | f.write(encryptionData) 409 | 410 | # Get the app file info 411 | content_file = self.appFile(tempfilename) 412 | # Post the app file info 413 | data = json.dumps(content_file) 414 | self.content_file_request = self.makeapirequestPost( 415 | f'{self.BASE_ENDPOINT}/{self.request["id"]}/microsoft.graph.macOSLobApp/contentVersions/{self.content_version_request["id"]}/files', 416 | self.token, 417 | "", 418 | data, 419 | 201, 420 | ) 421 | 422 | # Get the content file upload URL 423 | file_content_request_url = f'{self.BASE_ENDPOINT}/{self.request["id"]}/microsoft.graph.macOSLobApp/contentVersions/{self.content_version_request["id"]}/files/{self.content_file_request["id"]}' 424 | file_content_request = self.makeapirequest( 425 | file_content_request_url, 426 | self.token, 427 | ) 428 | 429 | self.wait_for_azure_storage_uri() 430 | 431 | if not file_content_request["azureStorageUri"]: 432 | # try again 433 | file_content_request = self.makeapirequest( 434 | file_content_request_url, 435 | self.token, 436 | ) 437 | 438 | if not file_content_request["azureStorageUri"]: 439 | self.delete_app() 440 | raise ProcessorError("Failed to get the Azure Storage upload URL") 441 | 442 | # Create the block list 443 | self.create_blocklist(tempfilename, file_content_request["azureStorageUri"]) 444 | 445 | # Clean up temp file 446 | os.unlink(tempfilename) 447 | 448 | # Commit the file 449 | data = json.dumps({"fileEncryptionInfo": encryptionInfo}) 450 | self.makeapirequestPost( 451 | 
f'{self.BASE_ENDPOINT}/{self.request["id"]}/microsoft.graph.macOSLobApp/contentVersions/{self.content_version_request["id"]}/files/{self.content_file_request["id"]}/commit', 452 | self.token, 453 | "", 454 | data, 455 | 200, 456 | ) 457 | 458 | # Wait for the file to upload 459 | self.wait_for_file_upload() 460 | 461 | # Patch the app to use the new content version 462 | data = { 463 | "@odata.type": "#microsoft.graph.macOSLobApp", 464 | "committedContentVersion": self.content_version_request["id"], 465 | } 466 | 467 | self.makeapirequestPatch( 468 | f"{self.BASE_ENDPOINT}/{self.request['id']}", 469 | self.token, 470 | "", 471 | json.dumps(data), 472 | 204, 473 | ) 474 | 475 | if app_categories: 476 | self.update_categories(app_categories, self.request.get("categories")) 477 | 478 | if app_assignment_info: 479 | for assignment in app_assignment_info: 480 | if "exclude" not in assignment: 481 | assignment["exclude"] = False 482 | self.assign_app(app_data, app_assignment_info) 483 | 484 | self.env["intune_app_changed"] = True 485 | self.env["intuneappuploader_summary_result"] = { 486 | "summary_text": "The following new items were imported into Intune:", 487 | "report_fields": [ 488 | "name", 489 | "version", 490 | "intune_app_id", 491 | "content_version_id", 492 | ], 493 | "data": { 494 | "name": app_displayname, 495 | "version": app_bundleVersion, 496 | "intune_app_id": self.request["id"], 497 | "content_version_id": self.content_version_request["id"], 498 | }, 499 | } 500 | 501 | 502 | if __name__ == "__main__": 503 | PROCESSOR = IntuneAppUploader() 504 | PROCESSOR.execute_shell() 505 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneScriptUploader.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor uploads a script to Microsoft Intune using the Microsoft Graph API, it also assigns the 
script to group(s) if specified. 6 | It also supports updating the script if it already exists in Intune. 7 | 8 | Created by Tobias Almén 9 | """ 10 | 11 | import base64 12 | import json 13 | import os 14 | import sys 15 | from dataclasses import dataclass, field 16 | 17 | from autopkglib import ProcessorError 18 | 19 | __all__ = ["IntuneScriptUploader"] 20 | 21 | sys.path.insert(0, os.path.dirname(__file__)) 22 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 23 | 24 | 25 | class IntuneScriptUploader(IntuneUploaderBase): 26 | """Uploads a script to Microsoft Intune using the Microsoft Graph API.""" 27 | 28 | input_variables = { 29 | "script_path": { 30 | "required": True, 31 | "description": "Path to script to upload.", 32 | }, 33 | "description": { 34 | "required": False, 35 | "description": "Description of script.", 36 | }, 37 | "display_name": { 38 | "required": False, 39 | "description": "Display name of script. If not provided, IU-<script filename without extension> will be used.", 40 | }, 41 | "run_as_account": { 42 | "required": False, 43 | "description": "Run as account.", 44 | "default": "system", 45 | }, 46 | "retry_count": { 47 | "required": False, 48 | "description": "Number of times to retry script.", 49 | "default": 3, 50 | }, 51 | "block_execution_notifications": { 52 | "required": False, 53 | "description": "Block execution notifications.", 54 | "default": True, 55 | }, 56 | "assignment_info": { 57 | "required": False, 58 | "description": "The assignment info of the script. Provided as an array of dicts containing keys 'group_id' and 'intent'. See https://github.com/almenscorner/intune-uploader/wiki/IntuneScriptUploader#input-variables for more information.", 59 | }, 60 | } 61 | output_variables = { 62 | "intunescriptuploader_summary_result": { 63 | "description": "Description of interesting results."
64 | } 65 | } 66 | 67 | def main(self): 68 | """Main process""" 69 | 70 | def assign_script(self, script_id: str, assignment_info: dict) -> None: 71 | """Assigns a script to groups. 72 | 73 | Args: 74 | script_id (str): The id of the script. 75 | assignment_info (dict): The assignment information. 76 | """ 77 | assignment_endpoint = self.BASE_ENDPOINT.replace( 78 | "deviceShellScripts", "deviceManagementScripts" 79 | ) 80 | current_assignment = self.makeapirequest( 81 | f"{assignment_endpoint}/{script_id}/assignments", self.token 82 | ) 83 | # Get the current group ids 84 | current_group_ids = [ 85 | c["target"].get("groupId") 86 | for c in current_assignment["value"] 87 | if c["target"].get("groupId") 88 | ] 89 | # Check if the group id is not in the current assignments 90 | missing_assignment = [ 91 | a for a in assignment_info if a["group_id"] not in current_group_ids 92 | ] 93 | data = {"deviceManagementScriptAssignments": []} 94 | 95 | if missing_assignment: 96 | for assignment in missing_assignment: 97 | if assignment["intent"] == "exclude": 98 | atype = "#microsoft.graph.exclusionGroupAssignmentTarget" 99 | else: 100 | atype = "#microsoft.graph.groupAssignmentTarget" 101 | 102 | # Assign the script to the group 103 | data["deviceManagementScriptAssignments"].append( 104 | { 105 | "target": { 106 | "@odata.type": atype, 107 | "groupId": assignment["group_id"], 108 | }, 109 | } 110 | ) 111 | 112 | for assignment in current_assignment["value"]: 113 | data["deviceManagementScriptAssignments"].append( 114 | { 115 | "target": assignment["target"], 116 | } 117 | ) 118 | 119 | self.makeapirequestPost( 120 | f"{assignment_endpoint}/{script_id}/assign", 121 | self.token, 122 | "", 123 | json.dumps(data), 124 | 200, 125 | ) 126 | 127 | # Set variables 128 | self.BASE_ENDPOINT = ( 129 | "https://graph.microsoft.com/beta/deviceManagement/deviceShellScripts" 130 | ) 131 | self.CLIENT_ID = self.env.get("CLIENT_ID") 132 | self.CLIENT_SECRET = self.env.get("CLIENT_SECRET") 
133 | self.TENANT_ID = self.env.get("TENANT_ID") 134 | script_path = self.env.get("script_path") 135 | script_description = self.env.get("description") 136 | script_name = self.env.get("display_name") 137 | run_as_account = self.env.get("run_as_account") 138 | retry_count = self.env.get("retry_count") 139 | block_execution_notifications = self.env.get("block_execution_notifications") 140 | assignment_info = self.env.get("assignment_info") 141 | filename = os.path.basename(script_path) 142 | action = "" 143 | 144 | # Check if script name is provided 145 | if not script_name: 146 | # If not, use filename 147 | script_name = f"IU-{os.path.basename(script_path).removesuffix(os.path.splitext(script_path)[1])}" 148 | 149 | # Check if script path exists 150 | if os.path.exists(script_path): 151 | if os.path.isdir(script_path): 152 | raise ProcessorError(f"Path is a directory: {script_path}") 153 | else: 154 | raise ProcessorError(f"Path does not exist: {script_path}") 155 | 156 | # Get token 157 | self.token = self.obtain_accesstoken( 158 | self.CLIENT_ID, self.CLIENT_SECRET, self.TENANT_ID 159 | ) 160 | 161 | @dataclass 162 | class ShellScript: 163 | """ 164 | Class to represent a shell script. 
165 | """ 166 | 167 | displayName: str = script_name 168 | description: str = script_description 169 | scriptContent: str = field(default_factory=str) 170 | retryCount: int = retry_count 171 | runAsAccount: str = run_as_account 172 | blockExecutionNotifications: bool = block_execution_notifications 173 | fileName: str = filename 174 | 175 | # Create script object 176 | script = ShellScript() 177 | 178 | with open(script_path, "r", encoding="utf-8") as f: 179 | # Encode script content as base64 180 | script.scriptContent = base64.b64encode(f.read().encode("utf-8")).decode( 181 | "utf-8" 182 | ) 183 | 184 | # Convert script object to json 185 | script_data = json.dumps(script.__dict__) 186 | 187 | # Check if script exists in Intune 188 | params = {"$filter": f"displayName eq '{script_name}'"} 189 | current_script = self.makeapirequest(self.BASE_ENDPOINT, self.token, params) 190 | 191 | if current_script["value"]: 192 | # Get script content 193 | current_script_content = self.makeapirequest( 194 | f"{self.BASE_ENDPOINT}/{current_script['value'][0]['id']}", self.token 195 | ) 196 | # Check if script matches current script 197 | if current_script_content["scriptContent"] == script.scriptContent: 198 | self.output( 199 | f"Script '{script_name}' already exists and matches current script." 200 | ) 201 | action = "none" 202 | else: 203 | self.output( 204 | f"Script '{script_name}' already exists but does not match current script. Updating script." 205 | ) 206 | action = "update" 207 | self.makeapirequestPatch( 208 | f"{self.BASE_ENDPOINT}('{current_script_content['id']}')", 209 | self.token, 210 | "", 211 | script_data, 212 | ) 213 | 214 | else: 215 | self.output(f"Script '{script_name}' does not exist. 
Creating script.") 216 | action = "create" 217 | create_request = self.makeapirequestPost( 218 | self.BASE_ENDPOINT, self.token, None, script_data, 201 219 | ) 220 | 221 | if assignment_info and action != "none": 222 | # Assign script to groups 223 | if action == "create": 224 | script_id = create_request["id"] 225 | else: 226 | script_id = current_script["value"][0]["id"] 227 | 228 | assign_script(self, script_id, assignment_info) 229 | 230 | self.env["intunescriptuploader_summary_result"] = { 231 | "summary_text": "Summary of IntuneScriptUploader results:", 232 | "report_fields": [ 233 | "script name", 234 | "script path", 235 | "run as account", 236 | "retry count", 237 | "block execution notifications", 238 | "action", 239 | ], 240 | "data": { 241 | "script name": script_name, 242 | "script path": script_path, 243 | "run as account": run_as_account, 244 | "retry count": str(retry_count), 245 | "block execution notifications": str(block_execution_notifications), 246 | "action": action, 247 | }, 248 | } 249 | 250 | 251 | if __name__ == "__main__": 252 | PROCESSOR = IntuneScriptUploader() 253 | PROCESSOR.execute_shell() 254 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneSlackNotifier.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor posts a message to a Slack channel using a webhook URL. 6 | The message is formatted as an Adaptive Card and can include a link to the Intune app. 7 | The processor is intended to be used in conjunction with the IntuneAppUploader, IntuneAppPromoter, and IntuneAppCleaner processors. 
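For reference, the message body this processor builds is a standard Slack Block Kit payload: a header block plus one or more mrkdwn sections. A minimal self-contained sketch of that shape (the helper name and sample values below are illustrative, not part of this processor's API):

```python
import json


def build_slack_message(title: str, message: str) -> dict:
    # A header block plus a mrkdwn section, mirroring the payload
    # shape this processor posts to the Slack incoming-webhook URL.
    return {
        "blocks": [
            {
                "type": "header",
                "text": {"type": "plain_text", "text": title, "emoji": True},
            },
            {"type": "section", "text": {"type": "mrkdwn", "text": message}},
        ]
    }


payload = build_slack_message("Imported Firefox 126.0", "*Name:* Firefox")
# The JSON-serialized dict is what gets POSTed to the webhook URL.
body = json.dumps(payload)
```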
8 | 9 | Created by Tobias Almén 10 | """ 11 | 12 | import json 13 | import os 14 | import sys 15 | import time 16 | 17 | import requests 18 | from autopkglib import ProcessorError 19 | 20 | __all__ = ["IntuneSlackNotifier"] 21 | 22 | sys.path.insert(0, os.path.dirname(__file__)) 23 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 24 | 25 | 26 | class IntuneSlackNotifier(IntuneUploaderBase): 27 | """Sends a message to a Slack channel using a webhook URL.""" 28 | 29 | input_variables = { 30 | "webhook_url": { 31 | "required": True, 32 | "description": "Webhook URL for the Slack channel to post to.", 33 | }, 34 | "intuneappuploader_summary_result": { 35 | "required": False, 36 | "description": "Results from the IntuneAppUploader processor.", 37 | }, 38 | "intuneapppromoter_summary_result": { 39 | "required": False, 40 | "description": "Results from the IntuneAppPromoter processor.", 41 | }, 42 | "intuneappcleaner_summary_result": { 43 | "required": False, 44 | "description": "Results from the IntuneAppCleaner processor.", 45 | }, 46 | } 47 | output_variables = {} 48 | 49 | def main(self): 50 | """Main process""" 51 | 52 | slack_webhook = self.env.get("webhook_url") 53 | intuneappuploader_summary_results = self.env.get( 54 | "intuneappuploader_summary_result", {} 55 | ) 56 | intuneapppromoter_summary_result = self.env.get( 57 | "intuneapppromoter_summary_result", {} 58 | ) 59 | intuneappcleaner_summary_results = self.env.get( 60 | "intuneappcleaner_summary_result", {} 61 | ) 62 | 63 | def _slack_message(title, message, imported=False, app_id=None): 64 | payload = { 65 | "blocks": [ 66 | { 67 | "type": "header", 68 | "text": {"type": "plain_text", "text": title, "emoji": True}, 69 | }, 70 | {"type": "section", "text": {"type": "mrkdwn", "text": message}}, 71 | ] 72 | } 73 | 74 | if imported: 75 | payload["blocks"].append( 76 | { 77 | "type": "section", 78 | "text": { 79 | "type": "mrkdwn", 80 | "text": f"*<https://intune.microsoft.com/#view/Microsoft_Intune_Apps/SettingsMenu/~/0/appId/{app_id}|View App in Intune>*", 81 | }, 82 | } 83 | ) 84 | 85 | return
payload 86 | 87 | def _post_slack_message(data): 88 | data = json.dumps(data) 89 | response = requests.post(url=slack_webhook, data=data) 90 | 91 | retry_count = 0 92 | while response.status_code != 200 and retry_count < 3: 93 | time.sleep(2) 94 | response = requests.post(url=slack_webhook, data=data) 95 | retry_count += 1 96 | if response.status_code != 200: 97 | raise ProcessorError( 98 | f"Failed to post message to slack, status code: {response.status_code} - {response.text}" 99 | ) 100 | 101 | def _updated_alerts(summary): 102 | name = summary["data"]["name"] 103 | version = summary["data"]["version"] 104 | result_id = summary["data"]["intune_app_id"] 105 | content_version_id = summary["data"]["content_version_id"] 106 | task_title = f"✅ Imported {name} {version}" 107 | task_description = ( 108 | f"*Name:* {name}" 109 | + "\r" 110 | + f"*Intune App ID:* {result_id}" 111 | + "\r" 112 | + f"*Content Version ID:* {content_version_id}" 113 | ) 114 | 115 | self.output(f"Posting imported message to slack for {name} {version}") 116 | message = _slack_message( 117 | task_title, task_description, imported=True, app_id=result_id 118 | ) 119 | _post_slack_message(message) 120 | 121 | def _removed_alerts(summary): 122 | removed_count = summary["data"]["removed count"] 123 | if int(removed_count) == 0: 124 | return 125 | name = summary["data"]["searched name"] 126 | removed_versions = summary["data"]["removed versions"] 127 | keep_count = summary["data"]["keep count"] 128 | task_title = f"🗑 Removed old versions of {name}" 129 | task_description = "" 130 | task_description += ( 131 | f"*Remove Count:* {removed_count}" 132 | + "\r" 133 | + f"*Removed Versions:* {removed_versions}" 134 | + "\r" 135 | + f"*Keep Count:* {keep_count}" 136 | ) 137 | 138 | self.output(f"Posting removed message to slack for {name}") 139 | message = _slack_message(task_title, task_description) 140 | _post_slack_message(message) 141 | 142 | def _promoted_alerts(summary): 143 | name = 
summary["data"]["app name"] 144 | promotions = summary["data"]["promotions"] 145 | blacklisted_versions = summary["data"]["blacklisted versions"] 146 | task_title = "🚀 Promoted %s" % name 147 | task_description = "" 148 | task_description += ( 149 | f"*Promotions:* {promotions}" 150 | + "\r" 151 | + f"*Blacklisted Versions:* {blacklisted_versions}" 152 | ) 153 | 154 | self.output(f"Posting promoted message to slack for {name}") 155 | message = _slack_message(task_title, task_description) 156 | _post_slack_message(message) 157 | 158 | if intuneappuploader_summary_results: 159 | _updated_alerts(intuneappuploader_summary_results) 160 | if intuneappcleaner_summary_results: 161 | _removed_alerts(intuneappcleaner_summary_results) 162 | if intuneapppromoter_summary_result: 163 | _promoted_alerts(intuneapppromoter_summary_result) 164 | 165 | 166 | if __name__ == "__main__": 167 | PROCESSOR = IntuneSlackNotifier() 168 | PROCESSOR.execute_shell() 169 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneTeamsNotifier.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor posts a message to a Microsoft Teams channel using a webhook URL. 6 | The message is formatted as an Adaptive Card and can include a link to the Intune app. 7 | The processor is intended to be used in conjunction with the IntuneAppUploader, IntuneAppPromoter, and IntuneAppCleaner processors. 
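For reference, Teams incoming webhooks expect an Adaptive Card wrapped in a `"message"` envelope, which is the structure this processor builds. A minimal self-contained sketch of that envelope (the helper name and sample values below are illustrative, not part of this processor's API):

```python
import json


def build_teams_card(title: str, message: str) -> dict:
    # An Adaptive Card attachment inside the "message" envelope that
    # Teams incoming webhooks expect; a reduced version of the card
    # this processor posts.
    return {
        "type": "message",
        "attachments": [
            {
                "contentType": "application/vnd.microsoft.card.adaptive",
                "contentUrl": None,
                "content": {
                    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
                    "type": "AdaptiveCard",
                    "version": "1.2",
                    "body": [
                        {"type": "TextBlock", "text": title, "weight": "Bolder"},
                        {"type": "TextBlock", "text": message, "wrap": True},
                    ],
                },
            }
        ],
    }


card = build_teams_card("Imported Firefox 126.0", "**Name:** Firefox")
# The JSON-serialized dict is what gets POSTed to the webhook URL.
body = json.dumps(card)
```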
8 | 9 | Created by Tobias Almén 10 | """ 11 | 12 | import json 13 | import os 14 | import sys 15 | import time 16 | 17 | import requests 18 | from autopkglib import ProcessorError 19 | 20 | __all__ = ["IntuneTeamsNotifier"] 21 | 22 | sys.path.insert(0, os.path.dirname(__file__)) 23 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 24 | 25 | 26 | class IntuneTeamsNotifier(IntuneUploaderBase): 27 | """Sends a message to a Microsoft Teams channel using a webhook URL.""" 28 | 29 | input_variables = { 30 | "webhook_url": { 31 | "required": True, 32 | "description": "Webhook URL for the Teams channel to post to.", 33 | }, 34 | "intuneappuploader_summary_result": { 35 | "required": False, 36 | "description": "Results from the IntuneAppUploader processor.", 37 | }, 38 | "intuneapppromoter_summary_result": { 39 | "required": False, 40 | "description": "Results from the IntuneAppPromoter processor.", 41 | }, 42 | "intuneappcleaner_summary_result": { 43 | "required": False, 44 | "description": "Results from the IntuneAppCleaner processor.", 45 | }, 46 | "intunevtappdeleter_summary_result": { 47 | "required": False, 48 | "description": "Results from the IntuneVTAppDeleter processor.", 49 | }, 50 | } 51 | output_variables = {} 52 | 53 | def main(self): 54 | """Main process""" 55 | 56 | teams_webhook = self.env.get("webhook_url") 57 | intuneappuploader_summary_results = self.env.get( 58 | "intuneappuploader_summary_result", {} 59 | ) 60 | intuneapppromoter_summary_result = self.env.get( 61 | "intuneapppromoter_summary_result", {} 62 | ) 63 | intuneappcleaner_summary_results = self.env.get( 64 | "intuneappcleaner_summary_result", {} 65 | ) 66 | intunevtappdeleter_summary_results = self.env.get( 67 | "intunevtappdeleter_summary_result", {} 68 | ) 69 | 70 | def _teams_message(title, message, imported=False, app_id=None): 71 | data = { 72 | "type": "message", 73 | "attachments": [ 74 | { 75 | "contentType": "application/vnd.microsoft.card.adaptive", 76 | 
"contentUrl": None, 77 | "content": { 78 | "$schema": "http://adaptivecards.io/schemas/adaptive-card.json", 79 | "type": "AdaptiveCard", 80 | "version": "1.2", 81 | "msteams": {"width": "Full"}, 82 | "body": [ 83 | { 84 | "type": "Container", 85 | "bleed": True, 86 | "size": "stretch", 87 | "items": [{"type": "TextBlock", "text": title}], 88 | }, 89 | { 90 | "type": "ColumnSet", 91 | "columns": [ 92 | { 93 | "type": "Column", 94 | "items": [ 95 | { 96 | "type": "TextBlock", 97 | "text": message, 98 | "wrap": True, 99 | }, 100 | ], 101 | }, 102 | ], 103 | }, 104 | ], 105 | }, 106 | } 107 | ], 108 | } 109 | 110 | if imported: 111 | data["attachments"][0]["content"]["actions"] = [ 112 | { 113 | "type": "Action.OpenUrl", 114 | "title": "View App in Intune", 115 | "url": f"https://intune.microsoft.com/#view/Microsoft_Intune_Apps/SettingsMenu/~/0/appId/{app_id}", 116 | } 117 | ] 118 | 119 | return data 120 | 121 | def _post_teams_message(data): 122 | data = json.dumps(data) 123 | headers = { 124 | "Content-Type": "application/json", 125 | } 126 | response = requests.post(url=teams_webhook, data=data, headers=headers) 127 | 128 | retry_count = 0 129 | success_codes = [200, 201, 202, 204] 130 | while response.status_code not in success_codes and retry_count < 3: 131 | time.sleep(2) 132 | response = requests.post(url=teams_webhook, data=data, headers=headers) 133 | retry_count += 1 134 | if response.status_code not in success_codes: 135 | raise ProcessorError( 136 | f"Failed to post message to Teams, status code: {response.status_code} - {response.text}" 137 | ) 138 | 139 | def _updated_alerts(summary): 140 | name = summary["data"]["name"] 141 | version = summary["data"]["version"] 142 | result_id = summary["data"]["intune_app_id"] 143 | content_version_id = summary["data"]["content_version_id"] 144 | task_title = f"✅ Imported {name} {version}" 145 | task_description = ( 146 | f"**Name:** {name}" 147 | + "\r \r" 148 | + f"**Intune App ID:** {result_id}" 149 | + "\r \r" 150 
| + f"**Content Version ID:** {content_version_id}" 151 | ) 152 | 153 | self.output(f"Posting imported message to Teams for {name} {version}") 154 | message = _teams_message( 155 | task_title, task_description, imported=True, app_id=result_id 156 | ) 157 | _post_teams_message(message) 158 | 159 | 160 | 161 | 162 | def _removed_alerts(summary): 163 | removed_count = summary["data"]["removed count"] 164 | if int(removed_count) == 0: 165 | return 166 | name = summary["data"]["searched name"] 167 | removed_versions = summary["data"]["removed versions"] 168 | keep_count = summary["data"]["keep count"] 169 | task_title = f"🗑 Removed old versions of {name}" 170 | task_description = "" 171 | task_description += ( 172 | f"**Remove Count:** {removed_count}" 173 | + "\r \r" 174 | + f"**Removed Versions:** {removed_versions}" 175 | + "\r \r" 176 | + f"**Keep Count:** {keep_count}" 177 | ) 178 | 179 | self.output(f"Posting removed message to Teams for {name}") 180 | message = _teams_message(task_title, task_description) 181 | _post_teams_message(message) 182 | 183 | def _promoted_alerts(summary): 184 | name = summary["data"]["app name"] 185 | promotions = summary["data"]["promotions"] 186 | blacklisted_versions = summary["data"]["blacklisted versions"] 187 | task_title = "🚀 Promoted %s" % name 188 | task_description = "" 189 | task_description += ( 190 | f"**Promotions:** {promotions}" 191 | + "\r \r" 192 | + f"**Blacklisted Versions:** {blacklisted_versions}" 193 | ) 194 | 195 | self.output(f"Posting promoted message to Teams for {name}") 196 | message = _teams_message(task_title, task_description) 197 | _post_teams_message(message) 198 | 199 | def _vt_alerts(summary): 200 | name = summary["data"]["app_name"] 201 | version = summary["data"]["version"] 202 | positives = summary["data"]["configured_positives"] 203 | vt_positives = summary["data"]["virustotal_positives"] 204 | vt_ratio = summary["data"]["virustotal_ratio"] 205 | deleted = summary["data"]["deleted"] 206 | task_title = f"🦠 Deleted {name} {version}" 207 | task_description = "" 208 | task_description += ( 209 | f"**Configured Positives:** {positives}" 210 | + "\r \r" 211 | + f"**VirusTotal Positives:** {vt_positives}" 212 | + "\r \r" 213 | + f"**VirusTotal Ratio:** {vt_ratio}" 214 | ) 215 | 216 | if bool(deleted): 217 | self.output(f"Posting virustotal deleted message to Teams for {name}") 218 | message = _teams_message(task_title, task_description) 219 | _post_teams_message(message) 220 | 221 | if intuneappuploader_summary_results: 222 | _updated_alerts(intuneappuploader_summary_results) 223 | if intuneappcleaner_summary_results: 224 | _removed_alerts(intuneappcleaner_summary_results) 225 | if intuneapppromoter_summary_result: 226 | _promoted_alerts(intuneapppromoter_summary_result) 227 | if intunevtappdeleter_summary_results: 228 | _vt_alerts(intunevtappdeleter_summary_results) 229 | 230 | 231 | if __name__ == "__main__": 232 | PROCESSOR = IntuneTeamsNotifier() 233 | PROCESSOR.execute_shell() 234 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneUploader.recipe: -------------------------------------------------------------------------------- 1 | <?xml version="1.0" encoding="UTF-8"?> 2 | <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd"> 3 | <plist version="1.0"> 4 | <dict> 5 | <key>Description</key> 6 | <string>Recipe stub for any Shared Processors in this directory.</string> 7 | <key>Identifier</key> 8 | <string>com.github.almenscorner.intune-upload.processors</string> 9 | <key>Input</key> 10 | <dict/> 11 | <key>MinimumVersion</key> 12 | <string>2.3</string> 13 | <key>Process</key> 14 | <array/> 15 | </dict> 16 | </plist> 17 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneUploaderLib/IntuneUploaderBase.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | IntuneUploaderBase is a base class for processors that upload apps, among other things, to Microsoft Intune using the Microsoft Graph API.
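The authentication flow used by `obtain_accesstoken` below is the standard OAuth 2.0 client-credentials grant against the Microsoft identity platform v2.0 token endpoint. A sketch of how that request is assembled, without making the network call (the helper name and sample values are illustrative):

```python
def build_token_request(client_id: str, client_secret: str, tenant_id: str):
    # Mirrors the form-encoded body that obtain_accesstoken() POSTs
    # to the v2.0 token endpoint (client-credentials grant, .default scope).
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    }
    return url, data


url, data = build_token_request("app-id", "secret", "contoso.onmicrosoft.com")
```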
6 | 7 | Created by Tobias Almén 8 | """ 9 | 10 | import base64 11 | import hashlib 12 | import hmac 13 | import json 14 | import os 15 | import time 16 | 17 | import requests 18 | from autopkglib import Processor, ProcessorError 19 | from cryptography.hazmat.primitives import padding 20 | from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes 21 | 22 | 23 | class IntuneUploaderBase(Processor): 24 | """IntuneUploaderBase processor""" 25 | 26 | def obtain_accesstoken( 27 | self, client_id: str, client_secret: str, tenant_id: str 28 | ) -> dict: 29 | """This function obtains an access token from the Microsoft Graph API. 30 | 31 | Args: 32 | client_id (str): The client ID to use for authenticating the request. 33 | client_secret (str): The client secret to use for authenticating the request. 34 | tenant_id (str): The tenant ID to use for authenticating the request. 35 | 36 | Returns: 37 | dict: The response from the request as a dictionary. 38 | """ 39 | 40 | url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token" 41 | headers = {"Content-Type": "application/x-www-form-urlencoded"} 42 | data = { 43 | "grant_type": "client_credentials", 44 | "client_id": client_id, 45 | "client_secret": client_secret, 46 | "scope": "https://graph.microsoft.com/.default", 47 | } 48 | 49 | response = requests.post(url, headers=headers, data=data) 50 | 51 | if response.status_code != 200: 52 | raise ProcessorError( 53 | f"Failed to obtain access token. Status code: {response.status_code}" 54 | ) 55 | response = json.loads(response.text) 56 | return response 57 | 58 | def makeapirequest(self, endpoint: str, token: dict, q_param=None) -> dict: 59 | """This function makes a request to the Graph API and returns the response as a dictionary. 60 | 61 | Args: 62 | endpoint (str): The endpoint to make the request to. 63 | token (dict): The access token to use for authenticating the request. 
64 | q_param (dict, optional): The query parameters to use for the request. Defaults to None. 65 | 66 | Returns: 67 | dict: The response from the request as a dictionary. 68 | """ 69 | 70 | headers = { 71 | "Content-Type": "application/json", 72 | "Authorization": "Bearer {0}".format(token["access_token"]), 73 | } 74 | retry_response_codes = [502, 503, 504] 75 | if q_param is not None: 76 | response = requests.get(endpoint, headers=headers, params=q_param) 77 | if response.status_code in retry_response_codes: 78 | self.output( 79 | "Ran into issues with Graph request, waiting 10 seconds and trying again..." 80 | ) 81 | time.sleep(10) 82 | response = requests.get(endpoint, headers=headers, params=q_param)  # retry with the original query parameters 83 | elif response.status_code == 429: 84 | self.output( 85 | f"Hit Graph throttling, trying again after {response.headers['Retry-After']} seconds" 86 | ) 87 | while response.status_code == 429: 88 | time.sleep(int(response.headers["Retry-After"])) 89 | response = requests.get(endpoint, headers=headers, params=q_param) 90 | else: 91 | response = requests.get(endpoint, headers=headers) 92 | if response.status_code in retry_response_codes: 93 | self.output( 94 | "Ran into issues with Graph request, waiting 10 seconds and trying again..."
95 | ) 96 | time.sleep(10) 97 | response = requests.get(endpoint, headers=headers) 98 | elif response.status_code == 429: 99 | self.output( 100 | f"Hit Graph throttling, trying again after {response.headers['Retry-After']} seconds" 101 | ) 102 | while response.status_code == 429: 103 | time.sleep(int(response.headers["Retry-After"])) 104 | response = requests.get(endpoint, headers=headers) 105 | 106 | if response.status_code != 200: 107 | raise ProcessorError( 108 | f"Request failed with {response.status_code} - {response.text}" 109 | ) 110 | json_data = json.loads(response.text) 111 | 112 | if "@odata.nextLink" in json_data.keys(): 113 | record = self.makeapirequest(json_data["@odata.nextLink"], token) 114 | entries = len(record["value"]) 115 | count = 0 116 | while count < entries: 117 | json_data["value"].append(record["value"][count]) 118 | count += 1 119 | 120 | return json_data 121 | 122 | def makeapirequestPost( 123 | self, 124 | postEndpoint: str, 125 | token: dict, 126 | q_param=None, 127 | json_data=None, 128 | status_code=200, 129 | ) -> dict: 130 | """This function makes a POST request to the Graph API and returns the response as a dictionary. 131 | 132 | Args: 133 | postEndpoint (str): Endpoint to make the request to. 134 | token (dict): The access token to use for authenticating the request. 135 | q_param (dict, optional): The query parameters to use for the request. Defaults to None. 136 | json_data (dict, optional): The json data to use for the request. Defaults to None. 137 | status_code (int, optional): The status code to check for. Defaults to 200. 138 | 139 | Returns: 140 | dict: If there is a response, the response from the request as a dictionary.
141 | """ 142 | 143 | headers = { 144 | "Content-Type": "application/json", 145 | "Authorization": "Bearer {0}".format(token["access_token"]), 146 | } 147 | 148 | if q_param is not None: 149 | response = requests.post( 150 | postEndpoint, headers=headers, params=q_param, data=json_data 151 | ) 152 | else: 153 | response = requests.post(postEndpoint, headers=headers, data=json_data) 154 | if response.status_code == status_code: 155 | if response.text: 156 | json_data = json.loads(response.text) 157 | return json_data 158 | elif response.status_code == 429: 159 | self.output( 160 | f"Hit Graph throttling, trying again after {response.headers['Retry-After']} seconds" 161 | ) 162 | while response.status_code == 429: 163 | time.sleep(int(response.headers["Retry-After"])) 164 | response = requests.post(postEndpoint, headers=headers, data=json_data) 165 | elif response.status_code == 412: 166 | self.output("Precondition failed, trying again...") 167 | time.sleep(10) 168 | response = requests.post(postEndpoint, headers=headers, data=json_data) 169 | else: 170 | raise ProcessorError( 171 | f"Request failed with {response.status_code} - {response.text}" 172 | ) 173 | 174 | def makeapirequestPatch( 175 | self, 176 | patchEndpoint: str, 177 | token: dict, 178 | q_param=None, 179 | json_data=None, 180 | status_code=200, 181 | ) -> None: 182 | """This function makes a PATCH request to the Graph API. 183 | 184 | Args: 185 | patchEndpoint (str): Endpoint to make the request to. 186 | token (dict): The access token to use for authenticating the request. 187 | q_param (dict, optional): The query parameters to use for the request. Defaults to None. 188 | json_data (dict, optional): The json data to use for the request. Defaults to None. 189 | status_code (int, optional): The status code to check for. Defaults to 200.
190 | """ 191 | 192 | headers = { 193 | "Content-Type": "application/json", 194 | "Authorization": "Bearer {0}".format(token["access_token"]), 195 | } 196 | 197 | if q_param is not None: 198 | response = requests.patch( 199 | patchEndpoint, headers=headers, params=q_param, data=json_data 200 | ) 201 | else: 202 | response = requests.patch(patchEndpoint, headers=headers, data=json_data) 203 | if response.status_code == status_code: 204 | pass 205 | else: 206 | raise ProcessorError( 207 | f"Request failed with {response.status_code} - {response.text}" 208 | ) 209 | 210 | def makeapirequestDelete( 211 | self, 212 | deleteEndpoint: str, 213 | token: dict, 214 | q_param=None, 215 | jdata=None, 216 | status_code=200, 217 | ) -> None: 218 | """This function makes a DELETE request to the Graph API. 219 | 220 | Args: 221 | deleteEndpoint (str): Endpoint to make the request to. 222 | token (dict): The access token to use for authenticating the request. 223 | q_param (dict, optional): The query parameters to use for the request. Defaults to None. 224 | jdata (json, optional): The json data to use for the request. Defaults to None. 225 | status_code (int, optional): The status code to check for. Defaults to 200. 226 | 227 | Raises: 228 | ProcessorError: If the request fails. 229 | """ 230 | 231 | headers = { 232 | "Content-Type": "application/json", 233 | "Authorization": "Bearer {0}".format(token["access_token"]), 234 | } 235 | 236 | if q_param is not None: 237 | response = requests.delete( 238 | deleteEndpoint, headers=headers, params=q_param, data=jdata 239 | ) 240 | else: 241 | response = requests.delete(deleteEndpoint, headers=headers, data=jdata) 242 | if response.status_code == status_code: 243 | pass 244 | else: 245 | raise ProcessorError( 246 | f"Request failed with {response.status_code} - {response.text}" 247 | ) 248 | 249 | def encrypt_app(self) -> tuple: 250 | """Encrypts the app with AES-256 in CBC mode.
251 | 252 | Returns: 253 | tuple: Tuple containing: 254 | str: The encrypted app. 255 | dict: The encryption info. 256 | """ 257 | encryptionKey = os.urandom(32) 258 | hmacKey = os.urandom(32) 259 | initializationVector = os.urandom(16) 260 | profileIdentifier = "ProfileVersion1" 261 | fileDigestAlgorithm = "SHA256" 262 | 263 | with open(self.app_file, "rb") as f: 264 | plaintext = f.read() 265 | 266 | # Pad the plaintext to a multiple of 16 bytes 267 | padder = padding.PKCS7(128).padder() 268 | padded_plaintext = padder.update(plaintext) + padder.finalize() 269 | 270 | # Encrypt the padded plaintext using AES-256 in CBC mode 271 | cipher = Cipher(algorithms.AES(encryptionKey), modes.CBC(initializationVector)) 272 | encryptor = cipher.encryptor() 273 | encrypted_data = encryptor.update(padded_plaintext) + encryptor.finalize() 274 | 275 | # Combine the IV and encrypted data into a single byte string 276 | iv_data = initializationVector + encrypted_data 277 | 278 | # Generate a HMAC-SHA256 signature of the IV and encrypted data 279 | h = hmac.new(hmacKey, iv_data, hashlib.sha256) 280 | signature = h.digest() 281 | 282 | # Combine the signature and IV + encrypted data into a single byte string 283 | encrypted_pkg = signature + iv_data 284 | 285 | # Generate a base64-encoded string of the encrypted package (unused) 286 | # encoded_pkg = base64.b64encode(encrypted_pkg).decode() 287 | 288 | # Generate a base64-encoded string of the encryption key 289 | encoded_key = base64.b64encode(encryptionKey).decode() 290 | 291 | # Generate a base64-encoded string of the HMAC key 292 | encoded_hmac_key = base64.b64encode(hmacKey).decode() 293 | 294 | # Generate a base64-encoded SHA-256 digest of the unencrypted file, 295 | # reusing the plaintext read above instead of reopening the file 296 | filehash_sha256 = hashlib.sha256(plaintext) 297 | fileDigest = base64.b64encode(filehash_sha256.digest()).decode() 298 | 299 | # Generate the file encryption info dictionary 300 | fileEncryptionInfo = {} 301 | 
fileEncryptionInfo["@odata.type"] = "#microsoft.graph.fileEncryptionInfo" 302 | fileEncryptionInfo["encryptionKey"] = encoded_key 303 | fileEncryptionInfo["macKey"] = encoded_hmac_key 304 | fileEncryptionInfo["initializationVector"] = base64.b64encode( 305 | initializationVector 306 | ).decode() 307 | fileEncryptionInfo["profileIdentifier"] = profileIdentifier 308 | fileEncryptionInfo["fileDigestAlgorithm"] = fileDigestAlgorithm 309 | fileEncryptionInfo["fileDigest"] = fileDigest 310 | fileEncryptionInfo["mac"] = base64.b64encode(signature).decode() 311 | 312 | return (encrypted_pkg, fileEncryptionInfo) 313 | 314 | def appFile(self, encrypted_app_file: str) -> dict: 315 | """This function creates the appFile dictionary for the Microsoft Graph API. 316 | 317 | Args: 318 | encrypted_app_file (str): The path to the encrypted application file. 319 | 320 | Returns: 321 | dict: The appFile dictionary. 322 | """ 323 | appFile = {} 324 | appFile["@odata.type"] = "#microsoft.graph.mobileAppContentFile" 325 | appFile["name"] = os.path.basename(self.app_file) 326 | appFile["size"] = os.path.getsize(self.app_file) 327 | appFile["sizeEncrypted"] = os.path.getsize(encrypted_app_file) 328 | appFile["manifest"] = None 329 | appFile["isDependency"] = False 330 | return appFile 331 | 332 | def create_blocklist(self, file_path: str, azure_storage_uri: str) -> None: 333 | """Uploads a file to Azure Blob Storage using the block list upload mechanism. 334 | 335 | Args: 336 | file_path (str): The path to the file to upload. 337 | azure_storage_uri (str): The URI of the Azure Blob Storage container to upload the file to. 
338 | """ 339 | # Set the chunk size to 6 MB 340 | chunk_size = 6 * 1024 * 1024 341 | 342 | # Open the file in binary mode 343 | with open(file_path, "rb") as f: 344 | # Initialize the block IDs list and the block index 345 | block_ids = [] 346 | block_index = 0 347 | 348 | # Read the file in chunks and upload each chunk as a block 349 | while True: 350 | chunk = f.read(chunk_size) 351 | if not chunk: 352 | break 353 | 354 | # Generate a block ID for the current chunk 355 | block_id = base64.b64encode(f"block-{block_index:04}".encode()).decode() 356 | 357 | # Upload the chunk as a block 358 | uri = f"{azure_storage_uri}&comp=block&blockid={block_id}" 359 | headers = {"x-ms-blob-type": "BlockBlob"} 360 | try: 361 | r = requests.put(uri, headers=headers, data=chunk) 362 | except requests.exceptions.RequestException as err: 363 | raise ProcessorError(f"Failed to upload block: {err}") 364 | 365 | # Add the block ID to the list of block IDs 366 | block_ids.append(block_id) 367 | 368 | # Increment the block index 369 | block_index += 1 370 | 371 | # Generate the block list XML 372 | block_list_xml = "<?xml version='1.0' encoding='utf-8'?><BlockList>" 373 | for block_id in block_ids: 374 | block_list_xml += f"<Latest>{block_id}</Latest>" 375 | block_list_xml += "</BlockList>" 376 | 377 | # Upload the block list XML 378 | uri = f"{azure_storage_uri}&comp=blocklist" 379 | headers = {"Content-Type": "application/xml"} 380 | r = requests.put(uri, headers=headers, data=block_list_xml) 381 | 382 | if r.status_code != 201: 383 | raise ProcessorError("Failed to upload block list XML") 384 | 385 | def get_file_content_status(self) -> dict: 386 | """Returns the status of a file upload. 387 | 388 | Returns: 389 | dict: The file content status dictionary. 390 | """ 391 | url = f"{self.BASE_ENDPOINT}/{self.request['id']}/microsoft.graph.macOSLobApp/contentVersions/{self.content_version_request['id']}/files/{self.content_file_request['id']}" 392 | return self.makeapirequest(url, self.token) 393 | 394 | def delete_app(self) -> None: 395 | """ 396 | Deletes an app from Intune.
397 | """ 398 | if self.request.get("id") and self.content_update is False: 399 | self.makeapirequestDelete( 400 | f"{self.BASE_ENDPOINT}/{self.request['id']}", self.token 401 | ) 402 | 403 | def wait_for_file_upload(self) -> None: 404 | """Waits for a file to be uploaded. 405 | 406 | Raises: 407 | ProcessorError: If the file upload fails or times out. 408 | """ 409 | attempt = 1 410 | status = self.get_file_content_status() 411 | 412 | while status["uploadState"] != "commitFileSuccess": 413 | time.sleep(5) 414 | status = self.get_file_content_status() 415 | attempt += 1 416 | if status["uploadState"] == "commitFileFailed": 417 | self.delete_app() 418 | raise ProcessorError("Failed to commit file") 419 | if attempt > 20: 420 | self.delete_app() 421 | raise ProcessorError("Timed out waiting for file upload to complete") 422 | 423 | def wait_for_azure_storage_uri(self) -> None: 424 | """Waits for an Azure Storage upload URL to be generated. 425 | 426 | Raises: 427 | ProcessorError: If the Azure Storage upload URL request fails or times out. 428 | """ 429 | attempt = 1 430 | status = self.get_file_content_status() 431 | 432 | while status["uploadState"] != "azureStorageUriRequestSuccess": 433 | time.sleep(5) 434 | status = self.get_file_content_status() 435 | attempt += 1 436 | if status["uploadState"] == "azureStorageUriRequestFailed": 437 | self.delete_app() 438 | raise ProcessorError("Failed to get the Azure Storage upload URL") 439 | if attempt > 20: 440 | self.delete_app() 441 | raise ProcessorError( 442 | "Timed out waiting for the Azure Storage upload URL" 443 | ) 444 | 445 | def get_matching_apps(self, displayname: str) -> list: 446 | """Gets a list of apps from Intune that match the specified display name. 447 | 448 | Args: 449 | displayname (str): The display name of the app. 450 | 451 | Returns: 452 | list: A list of apps that match the specified display name.
453 | """ 454 | params = { 455 | "$filter": f"(isof('microsoft.graph.macOSDmgApp') or isof('microsoft.graph.macOSPkgApp') or isof('microsoft.graph.macOSLobApp')) and displayName eq '{displayname}'", 456 | "$expand": "categories", 457 | } 458 | request = self.makeapirequest( 459 | f"{self.BASE_ENDPOINT}", self.token, q_param=params 460 | ) 461 | 462 | return request["value"] 463 | 464 | def get_app_categories(self) -> list: 465 | """Gets a list of app categories from Intune. 466 | 467 | Returns: 468 | list: A list of app categories. 469 | """ 470 | categories = [] 471 | 472 | request = self.makeapirequest( 473 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileAppCategories", 474 | self.token, 475 | ) 476 | 477 | for category in request["value"]: 478 | categories.append(category["displayName"]) 479 | 480 | return categories 481 | 482 | def create_app_categories(self, categories: list) -> list: 483 | """Creates a list of app categories in Intune. 484 | 485 | Args: 486 | categories (list): The categories to create. 487 | 488 | Returns: 489 | list: A list of the created categories. 490 | """ 491 | created_categories = [] 492 | for category in categories: 493 | data = json.dumps({"displayName": category}) 494 | request = self.makeapirequestPost( 495 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileAppCategories", 496 | self.token, 497 | "", 498 | data, 499 | status_code=201, 500 | ) 501 | created_categories.append(request) 502 | 503 | return created_categories 504 | 505 | def get_current_app(self, displayname: str, version: int, odata_type: str) -> tuple: 506 | """Gets the current app from Intune. 507 | 508 | Args: 509 | displayname (str): The display name of the app. 510 | version (int): The version of the app. 511 | 512 | Returns: 513 | tuple: The result of the request and the data returned by the request. 
514 | """ 515 | 516 | matching_apps = self.get_matching_apps(displayname) 517 | request = [ 518 | app 519 | for app in matching_apps 520 | if app["displayName"] == displayname 521 | and (app.get("primaryBundleVersion") == version 522 | or app.get("buildNumber") == version)  # parentheses so the OData type check below always applies 523 | and app["@odata.type"] == odata_type 524 | ] 525 | result = None 526 | data = {} 527 | 528 | if request: 529 | for item in request: 530 | item_version = ( 531 | item.get("primaryBundleVersion") 532 | if item.get("primaryBundleVersion") 533 | else item.get("buildNumber") 534 | ) 535 | if item_version < version: 536 | result = "update" 537 | item["primaryBundleVersion"] = item_version 538 | data = item 539 | else: 540 | result = "current" 541 | item["primaryBundleVersion"] = item_version 542 | data = item 543 | 544 | return result, data 545 | 546 | def update_categories(self, category_names: list, current_categories: list) -> None: 547 | """Gets the category IDs for the specified category name(s). 548 | 549 | Args: 550 | category_names (list): The category name(s). 551 | current_categories (list): The current categories for the app.
552 | """ 553 | # Define the URL to retrieve the mobile app categories 554 | category_url = ( 555 | "https://graph.microsoft.com/v1.0/deviceAppManagement/mobileAppCategories" 556 | ) 557 | 558 | # Retrieve the list of mobile app categories 559 | categories = self.makeapirequest(category_url, self.token, "") 560 | 561 | # If there are current categories, get their display names 562 | if current_categories: 563 | current_categories = [c["displayName"] for c in current_categories] 564 | # Filter the category IDs to only include those with display names in the category_names list and not in the current_categories list 565 | category_ids = [ 566 | c 567 | for c in categories["value"] 568 | if c["displayName"] in category_names 569 | and c["displayName"] not in current_categories 570 | ] 571 | # If there are no current categories, filter the category IDs to only include those with display names in the category_names list 572 | else: 573 | category_ids = [ 574 | c for c in categories["value"] if c["displayName"] in category_names 575 | ] 576 | 577 | # If there are category IDs to add, add them to the app 578 | if category_ids: 579 | for category_id in category_ids: 580 | # Create the data payload to add the category to the app 581 | data = json.dumps({"@odata.id": f'{category_url}/{category_id["id"]}'}) 582 | # Make the API request to add the category to the app 583 | self.makeapirequestPost( 584 | f'{self.BASE_ENDPOINT}/{self.request["id"]}/categories/$ref', 585 | self.token, 586 | "", 587 | data, 588 | 204, 589 | ) 590 | 591 | def encode_icon(self, icon_path: str) -> str: 592 | """Encodes an icon file as a base64 string. 593 | 594 | Args: 595 | icon_path (str): The path to the icon file. 596 | 597 | Returns: 598 | str: The base64 encoded icon file. 599 | """ 600 | with open(icon_path, "rb") as f: 601 | return base64.b64encode(f.read()).decode() 602 | 603 | def assign_app(self, app, assignment_info: dict) -> None: 604 | """Assigns an app to groups. 
605 | 606 | Args: 607 | app (class): The app class. 608 | assignment_info (dict): The assignment information. 609 | """ 610 | current_assignment = self.makeapirequest( 611 | f"{self.BASE_ENDPOINT}/{self.request['id']}/assignments", self.token 612 | ) 613 | # Get the current group ids 614 | current_group_ids = [ 615 | c["target"].get("groupId") 616 | for c in current_assignment["value"] 617 | if c["target"].get("groupId") 618 | ] 619 | # Get the current all assignments 620 | current_all_assignment = [ 621 | c["target"].get("@odata.type") 622 | for c in current_assignment["value"] 623 | if c["target"]["@odata.type"] != "#microsoft.graph.groupAssignmentTarget" 624 | ] 625 | 626 | # Convert human readable All Users and All Devices to the odata type 627 | for assignment in assignment_info: 628 | if assignment.get("all_assignment") == "AllUsers": 629 | assignment[ 630 | "all_assignment" 631 | ] = "#microsoft.graph.allLicensedUsersAssignmentTarget" 632 | elif assignment.get("all_assignment") == "AllDevices": 633 | assignment[ 634 | "all_assignment" 635 | ] = "#microsoft.graph.allDevicesAssignmentTarget" 636 | 637 | # Check if the group id is not in the current assignments 638 | missing_assignment = [ 639 | a 640 | for a in assignment_info 641 | if "group_id" in a and a["group_id"] not in current_group_ids 642 | ] 643 | # Check if there are missing all assignments 644 | missing_all_assignment = [ 645 | a 646 | for a in assignment_info 647 | if "all_assignment" in a 648 | and a["all_assignment"] not in current_all_assignment 649 | ] 650 | data = {"mobileAppAssignments": []} 651 | 652 | if missing_assignment: 653 | for assignment in missing_assignment: 654 | # Assign the app to the group 655 | if assignment.get("exclude") is True: 656 | odata_type = "#microsoft.graph.exclusionGroupAssignmentTarget" 657 | else: 658 | odata_type = "#microsoft.graph.groupAssignmentTarget" 659 | data["mobileAppAssignments"].append( 660 | { 661 | "@odata.type": 
"#microsoft.graph.mobileAppAssignment", 662 | "target": { 663 | "@odata.type": odata_type, 664 | "groupId": assignment["group_id"], 665 | }, 666 | "intent": assignment["intent"], 667 | "settings": None, 668 | } 669 | ) 670 | 671 | if missing_all_assignment: 672 | for assignment in missing_all_assignment: 673 | data["mobileAppAssignments"].append( 674 | { 675 | "@odata.type": "#microsoft.graph.mobileAppAssignment", 676 | "target": { 677 | "@odata.type": assignment["all_assignment"], 678 | }, 679 | "intent": assignment["intent"], 680 | "settings": None, 681 | } 682 | ) 683 | 684 | for assignment in current_assignment["value"]: 685 | data["mobileAppAssignments"].append( 686 | { 687 | "@odata.type": "#microsoft.graph.mobileAppAssignment", 688 | "target": assignment["target"], 689 | "intent": assignment["intent"], 690 | "settings": None, 691 | } 692 | ) 693 | 694 | if missing_all_assignment or missing_assignment: 695 | self.output( 696 | f"Updating assignments for app {app.displayName} version {app.primaryBundleVersion}" 697 | ) 698 | self.makeapirequestPost( 699 | f"{self.BASE_ENDPOINT}/{self.request['id']}/assign", 700 | self.token, 701 | "", 702 | json.dumps(data), 703 | 200, 704 | ) 705 | 706 | 707 | if __name__ == "__main__": 708 | PROCESSOR = IntuneUploaderBase() 709 | PROCESSOR.execute_shell() 710 | -------------------------------------------------------------------------------- /IntuneUploader/IntuneVTAppDeleter.py: -------------------------------------------------------------------------------- 1 | #!/usr/local/autopkg/python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | This processor will delete the app if the VirusTotal positives is greater than the configured positives. 6 | 7 | This processor requires that io.github.hjuutilainen.VirusTotalAnalyzer/VirusTotalAnalyzer 8 | is run before this processor. 
9 | 10 | Created by Tobias Almén 11 | """ 12 | 13 | import os 14 | import sys 15 | import time 16 | 17 | __all__ = ["IntuneVTAppDeleter"] 18 | 19 | sys.path.insert(0, os.path.dirname(__file__)) 20 | from IntuneUploaderLib.IntuneUploaderBase import IntuneUploaderBase 21 | 22 | 23 | class IntuneVTAppDeleter(IntuneUploaderBase): 24 | """This processor deletes an app from Intune if its VirusTotal positives meet or exceed the configured threshold.""" 25 | 26 | description = __doc__ 27 | input_variables = { 28 | "display_name": { 29 | "required": False, 30 | "description": "The name of the app to check.", 31 | }, 32 | "version": { 33 | "required": False, 34 | "description": "The version of the app to remove.", 35 | }, 36 | "positives": { 37 | "required": False, 38 | "description": "The number of VirusTotal positives required to delete the app.", 39 | "default": 10, 40 | }, 41 | "test_mode": { 42 | "required": False, 43 | "description": "If True, will only print what would have been done.", 44 | "default": False, 45 | }, 46 | } 47 | output_variables = { 48 | "intunevtappdeleter_summary_result": { 49 | "description": "Description of interesting results." 50 | } 51 | } 52 | 53 | def main(self): 54 | """Main process.""" 55 | # Set variables 56 | self.BASE_ENDPOINT = ( 57 | "https://graph.microsoft.com/beta/deviceAppManagement/mobileApps" 58 | ) 59 | self.CLIENT_ID = self.env.get("CLIENT_ID") 60 | self.CLIENT_SECRET = self.env.get("CLIENT_SECRET") 61 | self.TENANT_ID = self.env.get("TENANT_ID") 62 | positives = self.env.get("positives") 63 | app_name = self.env.get("display_name") 64 | version = self.env.get("version") 65 | test_mode = self.env.get("test_mode") 66 | vt_results = self.env.get("virus_total_analyzer_summary_result") 67 | 68 | if not vt_results: 69 | self.output("No VirusTotal results found. 
Skipping app deletion.") 70 | return 71 | 72 | vt_filename = vt_results["data"]["name"] 73 | 74 | # When running from the command line, positives is a string, convert to int 75 | if isinstance(positives, str): 76 | positives = int(positives) 77 | 78 | # Get access token 79 | self.token = self.obtain_accesstoken( 80 | self.CLIENT_ID, self.CLIENT_SECRET, self.TENANT_ID 81 | ) 82 | 83 | def _get_app(): 84 | # Get matching apps 85 | app = self.get_matching_apps(app_name) 86 | app = list( 87 | map( 88 | lambda item: ( 89 | {**item, "primaryBundleVersion": item["buildNumber"]} 90 | if "primaryBundleVersion" not in item and "buildNumber" in item 91 | else item 92 | ), 93 | app, 94 | ) 95 | ) 96 | if len(app) > 1: 97 | app = list( 98 | filter( 99 | lambda item: item["primaryBundleVersion"] == version 100 | and item["fileName"] == vt_filename, 101 | app, 102 | ) 103 | ) 104 | 105 | return app[0] if len(app) > 0 else None 106 | 107 | app = _get_app() 108 | 109 | retry_count = 0 110 | while app is None and retry_count < 5: 111 | self.output("No matching app found. Retrying in 5 seconds...") 112 | time.sleep(5) 113 | retry_count += 1 114 | # Retry getting the app 115 | app = _get_app() 116 | 117 | if not app: 118 | self.output( 119 | f"No matching app found for {app_name} and version {version}. Skipping deletion." 120 | ) 121 | return 122 | 123 | self.output( 124 | f"Found matching app for {app_name}, version {version} and filename {vt_filename}" 125 | ) 126 | 127 | # Parse the number of positives from the VirusTotal ratio (e.g. "3/72") 128 | vt_positives = int(vt_results["data"]["ratio"].split("/")[0]) 129 | 130 | deleted = False 131 | # If the VirusTotal positives meet or exceed the configured threshold, delete the app 132 | if vt_positives >= positives: 133 | deleted = True 134 | self.output( 135 | f"VirusTotal positives ({vt_positives}) is greater than or equal to {positives}. Deleting app."
136 | ) 137 | if not test_mode: 138 | self.makeapirequestDelete( 139 | self.BASE_ENDPOINT + "/" + app["id"], self.token 140 | ) 141 | else: 142 | self.output( 143 | f"VirusTotal positives is less than {positives}. Not deleting app {app_name} {version}." 144 | ) 145 | 146 | self.env["intunevtappdeleter_summary_result"] = { 147 | "summary_text": "The following items were checked for removal from Intune based on VirusTotal positives:", 148 | "report_fields": [ 149 | "app_name", 150 | "version", 151 | "configured_positives", 152 | "virustotal_positives", 153 | "virustotal_ratio", 154 | "deleted", 155 | ], 156 | "data": { 157 | "app_name": app_name, 158 | "version": version, 159 | "configured_positives": str(positives), 160 | "virustotal_positives": str(vt_positives), 161 | "virustotal_ratio": str(vt_results["data"]["ratio"]), 162 | "deleted": str(deleted), 163 | }, 164 | } 165 | 166 | 167 | if __name__ == "__main__": 168 | PROCESSOR = IntuneVTAppDeleter() 169 | PROCESSOR.execute_shell() 170 | -------------------------------------------------------------------------------- /IntuneUploader/requirements.txt: -------------------------------------------------------------------------------- 1 | requests 2 | cryptography 3 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 
14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 
47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. 
Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # intune-uploader 2 | 3 | ## Description 4 | This project aims to simplify the process of creating and updating apps and other payloads in Intune by leveraging the power of [AutoPkg](https://github.com/autopkg/autopkg). With AutoPkg, you can automate the process of downloading, packaging, and uploading apps to Intune, saving you time and effort. 5 | 6 | Moving forward, additional processors that upload more payload types, such as shell scripts, may be added to this project to provide a complete automated deployment process. This is why the base class [IntuneUploaderBase](IntuneUploader/IntuneUploaderLib/IntuneUploaderBase.py) was created: to make it easier to build additional processors. Contributions are welcome!
7 | 8 | Ideas for future processors: 9 | - Ideas welcome 10 | 11 | For getting started help and documentation, please visit the wiki pages: 12 | - [Intune App Uploader](https://github.com/almenscorner/intune-uploader/wiki/IntuneAppUploader) 13 | - [Intune App Icon Getter](https://github.com/almenscorner/intune-uploader/wiki/IntuneAppIconGetter) 14 | - [Intune App Cleaner](https://github.com/almenscorner/intune-uploader/wiki/IntuneAppCleaner) 15 | - [Intune Script Uploader](https://github.com/almenscorner/intune-uploader/wiki/IntuneScriptUploader) 16 | - [Intune App Promoter](https://github.com/almenscorner/intune-uploader/wiki/IntuneAppPromoter) 17 | - [Intune Teams Notifier](https://github.com/almenscorner/intune-uploader/wiki/IntuneTeamsNotifier) 18 | - [Intune Slack Notifier](https://github.com/almenscorner/intune-uploader/wiki/IntuneSlackNotifier) 19 | - [Intune VT App Deleter](https://github.com/almenscorner/intune-uploader/wiki/IntuneVTAppDeleter) 20 | 21 | Join the discussions on Slack 22 | 23 | 24 | 25 | ### IntuneAppUploader - LOB apps (managed PKG) 26 | Support for LOB type apps has been added. You must provide a pkg file that is signed with a valid Apple Developer ID certificate and notarized. Apps of this type can be deployed in an "available" manner rather than "required", meaning the user can choose whether or not to install the app. This is useful for apps that are not required for users to do their job, but are nice to have. 27 | 28 | In the override file for a signed and notarized pkg, set the following key to upload as a LOB app: 29 | ```xml 30 | <key>lob_app</key> 31 | <true/> 32 | ``` 33 | 34 | ## Development 35 | Pull requests are welcome! 36 | 37 | Some ground rules before submitting a PR: 38 | * Install [pre-commit](https://pre-commit.com) 39 | * Once installed, run `pre-commit install` in the forked repo 40 | * Make sure all tests pass before submitting a PR 41 | --------------------------------------------------------------------------------