├── AADCCloudSync-Hardened.md ├── AADGCP.md ├── AADPasswordProtectionWithSentiel.md ├── AHRiskScoreMGand10klimits.md ├── AutoExcludeCAP.md ├── CA-SISM-Conditional Access Summaries, Insights, Security & Monitoring.md ├── CEF-AMASetup.md ├── CyberEstate-TVMData.md ├── DODCOAZT.md ├── DeviceTagging.md ├── EntraIDAssessment.md ├── GenAI Monitoring Made Easy with Microsoft Solutions.md ├── IdentityGRC.md ├── IdentityGRCWorkbooks.md ├── MDI-Hardened-Environments.md ├── MDI-Hardened.md ├── MISPTISetup.md ├── MaliciousActivityandSentinelP1.md ├── MaliciousActivityandSentinelP2.md ├── MaliciousActivityandSentinelP3.md ├── MaliciousActivityandSentinelP4.md ├── MaliciousActivityandSentinelP5.md ├── ODA.md ├── TVMIngestion.md ├── TenantCAPols.md ├── crosscloudsync.md └── tvm-adf.md /AADCCloudSync-Hardened.md: -------------------------------------------------------------------------------- 1 | ## Azure Active Directory Connect Cloud Sync - Hardened (STIGGED) Setup. ## 2 | 3 | A lot of the work I do consists of working in hardened security baselines. In short, that means STIGs are pushed via Group Policy to harden the systems. The new variant is called Azure Active Directory Connect Cloud Sync, which uses a provisioning agent instead of the traditional connect application over the Azure Fabric backbone. For an in-depth overview, please review [here](https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/what-is-cloud-sync?toc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Factive-directory%2Fcloud-sync%2Ftoc.json&bc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Fbread%2Ftoc.json). 4 | 5 | The environment(s) I have been working in are currently at the 2016+ OS level. I will focus on the 2019+ [DISA STIG](https://public.cyber.mil/stigs/gpo/), Microsoft Windows Server 2022 Domain Controllers and our new 2022 Member Server.
Disclaimer - The prerequisites are listed [here](https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/how-to-prerequisites?tabs=public-cloud) for AADC-CloudSync. 6 | 7 | Before I begin, please open another browser window with the prerequisites page side by side with this article. The setup below will work with FIPS enabled, in case you cannot get approval to turn off FIPS for the server (turning it off is recommended). 8 | 9 | CloudSync should be installed on a member server (which can be an Azure IaaS VM as well) running Windows 2019 or later. This member server should be a member of Tier0 (the Identity Tier); installing the agent on a Domain Controller is also supported. 10 | 11 | *Disclaimer:* Please continue to install on a Tier0 member server. 12 | 13 | If your domain is NOT already using gMSA (Group Managed Service Accounts), you need to create the Key Distribution Services (KDS) Root Key. More info [here](https://docs.microsoft.com/en-us/windows-server/security/group-managed-service-accounts/create-the-key-distribution-services-kds-root-key). This is a prerequisite for using gMSA. If you are already using gMSA, skip this step. 14 | 15 | ## Domain Group - Create. 16 | 17 | Create an Active Directory Security Group and make the cloud sync server(s) members. I created a group called 'cloudsyncretrievepwd'. Why do we do this? If you are planning on high availability, a group is better suited for ease of management when it comes to allowing the gMSA's 'PrincipalsAllowedToRetrieveManagedPassword' to cover all of the cloud sync servers. 18 | 19 | ## AADC Cloud Sync server setup. 20 | 21 | Follow my instructions along with the official docs posted [here](https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/how-to-prerequisites?tabs=public-cloud). 22 | 23 | #### Step1: Select "Local Server" within Server Manager and turn off IE Enhanced Security Configuration Mode for Administrators. 24 | #### Step2: Patch and make sure .NET Framework 4.7+ is installed and updated.
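To confirm the Step2 requirement, a quick check can be run in PowerShell. This is only a sketch: it reads the documented `Release` registry value, where 460798 or higher corresponds to .NET Framework 4.7+.

```
# Read the .NET Framework 4.x 'Release' value; 460798 or higher means 4.7 or later.
$release = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release).Release
if ($release -ge 460798) {
    Write-Host ".NET Framework 4.7 or later is installed (Release $release)."
} else {
    Write-Host "Please update the .NET Framework before continuing (Release $release)."
}
```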
25 | #### Step3: Enable TLS1.2 26 | ``` 27 | New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server' -Force | Out-Null 28 | New-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server' -name 'Enabled' -value '1' -PropertyType 'DWord' -Force | Out-Null 29 | New-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server' -name 'DisabledByDefault' -value 0 -PropertyType 'DWord' -Force | Out-Null 30 | New-Item 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client' -Force | Out-Null 31 | New-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client' -name 'Enabled' -value '1' -PropertyType 'DWord' -Force | Out-Null 32 | New-ItemProperty -path 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client' -name 'DisabledByDefault' -value 0 -PropertyType 'DWord' -Force | Out-Null 33 | New-ItemProperty -path 'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319' -name 'SchUseStrongCrypto' -value 1 -PropertyType 'DWord' -Force | Out-Null 34 | ``` 35 | #### Step4: FIPS bypass for the .NET Framework. If FIPS is disabled, please skip this next step. 36 | Open CMD and run as Administrator, then edit each of the following files. 37 | ``` 38 | %SYSTEMROOT%\Microsoft.NET\Framework\v4.0.30319\Config\machine.config 39 | %SYSTEMROOT%\Microsoft.NET\Framework64\v4.0.30319\Config\machine.config 40 | ***ADD the following to the <configuration> section of each machine.config file*** 41 | <runtime> 42 | <enforceFIPSPolicy enabled="false"/> 43 | </runtime> 44 | ``` 45 | #### Step5: Reboot. 46 | #### Step6: Open PowerShell (Run as Administrator) and run the following. 47 | ``` 48 | Install-WindowsFeature -Name RSAT-AD-PowerShell 49 | ``` 50 | #### Step7: Copy the contents of the script to the cloud sync server.
Disclaimer - you may or may not need to use the -KerberosEncryptionType flag, but if you are using the 2016 Domain Controller STIG you will have to on OS 2016-2022. 51 | ``` 52 | # Run this script on the cloud sync server. 53 | # 54 | # Filename: cloudsyncgmsa.ps1 55 | # Description: Creates and installs a custom gMSA account for use with Azure AD Connect cloud sync. 56 | # 57 | # DISCLAIMER: 58 | # Copyright (c) Microsoft Corporation. All rights reserved. This 59 | # script is made available to you without any express, implied or 60 | # statutory warranty, not even the implied warranty of 61 | # merchantability or fitness for a particular purpose, or the 62 | # warranty of title or non-infringement. The entire risk of the 63 | # use or the results from the use of this script remains with you. 64 | # 65 | # 66 | # 67 | # 68 | # Declare variables 69 | $Name = 'aadccsgmsa' #The name of the gMSA to be created 70 | $Description = "Azure AD Cloud Sync service account for cloud sync server" 71 | $Server = "aadccsgmsa.cyberlorians.net" #This is the cloud gmsa dns name 72 | $Principal = Get-ADGroup 'cloudsyncretrievepwd' #AD group created in the DC step 73 | 74 | # Create service account in Active Directory 75 | New-ADServiceAccount -Name $Name ` 76 | -Description $Description ` 77 | -DNSHostName $Server ` 78 | -ManagedPasswordIntervalInDays 30 ` 79 | -PrincipalsAllowedToRetrieveManagedPassword $Principal ` 80 | -Enabled $True ` 81 | -PassThru 82 | 83 | Set-ADServiceAccount -Identity $Name -KerberosEncryptionType AES128,AES256 # Required if using the 2016 STIG and above 84 | 85 | # Install the new service account on the Azure AD Cloud Sync server 86 | Install-ADServiceAccount -Identity $Name 87 | ``` 88 | 89 | #### Step8: Navigate to AAD>Azure AD Connect>Manage Azure AD cloud sync and "download agent".
90 | 91 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cloudsyncdownload1.png) 92 | 93 | #### Step9: Install "AADConnectProvisioningAgentSetup.exe", which you just downloaded. 94 | 95 | Connect to Azure AD with a Hybrid Identity Administrator role account. 96 | Customize the installation, choose "Use custom gMSA" and enter the gMSA created earlier. 97 | 98 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cloudsyncsetup1.png) 99 | 100 | Connect to Active Directory with an Enterprise Admin account>Confirm and hit next. 101 | 102 | The installation should have been successful. Again, if there were any issues they were likely caused by -KerberosEncryptionType, but you would have seen that fail during the PowerShell script installation. 103 | 104 | *Disclaimer* - The STIG GPOs do NOT set the 'Log on as a service' user rights permission. By default, the installation should add the gMSA account to this "User Rights Assignment". If you are setting this via GPO, please add the gMSA account. Another note - if you are using any other gMSA, let's say for Defender for Identity, you will also have to be sure all accounts are healthy, added and not overwriting one another. In short, either use GPO to set multiple accounts or do it manually. 105 | 106 | #### Step10: Password hash sync with FIPS enabled. If you have gone this far - you have FIPS enabled. 107 | 108 | Enable MD5 for password hash synchronization [here](https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/how-to-install). Now, reboot. 109 | 110 | #### Step11: Check AAD Agent health. 111 | 112 | Navigate to "AAD>Azure AD Connect>Manage Azure AD cloud sync>Review all agents" and confirm the agent is healthy. 113 | 114 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cloudsyncagenthealth.png) 115 | 116 | Once confirmed - continue to "Cloud sync configuration" [here](https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/how-to-configure). 117 | 118 | #### Step12: Adding another agent for High Availability.
119 | 120 | Follow Steps 1-4 and Step 9. Step 10, if applicable. 121 | 122 | Cheers! 123 | 124 | 125 | 126 | 127 | -------------------------------------------------------------------------------- /AADGCP.md: -------------------------------------------------------------------------------- 1 | ## Entra & Google Cloud Platform - Identity Governance. ## 2 | 3 | The biggest hurdle in most organizations comes down to managing the Identity Governance aspect. Over my career, I have worked for many outfits that maintained multiple identities when it was simply not needed. Fast forward to today, and I am seeing multiple traditional Active Directory infrastructures as well as multiple cloud platforms. I.e., CompanyA has traditional Active Directory Domain Services in hybrid with Azure Active Directory, but their AWS (Amazon Web Services) and GCP (Google Cloud Platform) cloud providers are separated and under an entirely different management realm, outside of the identity team's purview and organizational life cycle. The scenario I just described can be spun a few ways, but it all comes down to consolidating into one identity platform and creating the identity 'baseline' at the Azure AD control plane. 4 | 5 | OMB Memorandum [M-22-09](https://www.whitehouse.gov/wp-content/uploads/2022/01/M-22-09.pdf) has laid it out in a paragraph: "Using centrally managed systems to provide enterprise identity and access management services reduces the burden on agency staff to manage individual accounts and credentials. It also improves agencies’ knowledge of user activities, thereby enabling better detection of anomalous behavior, allowing agencies to more uniformly enforce security policies that limit access, as well as quickly detect and take action against anomalous behavior when needed." 10 | 11 | Let's Federate Google Cloud with Entra ID; see the Google [doc](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-active-directory).
I recommend reading the entire document a few times to get a full grasp of each scenario. A key note for the entire setup is that whatever custom domain name you end up using, which is tied to the identity suffix, is how your pass-through is going to work. I won't deep dive into it; just review the doc and follow the flow. 12 | 13 | *Disclaimer* - My example will not only use Entra ID as the sole identity provider but will also leverage 'cloud only accounts' on the Entra ID side. Why cloud only? In a lot of scenarios the other endpoint, GCP in this case, is for administrative purposes only. So, I am keeping separation of duties and least privilege in mind rather than syncing a tiered admin account from on prem and flowing it through. In short, better security. See the diagram below of the setup. 14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/overviewaadgcp.png) 16 | 17 | To accomplish this feat, we will be using SCIM (System for Cross-Domain Identity Management) and SSO (Single Sign-On). That means SCIM = user/group provisioning to set the Authorization/Authentication, and SSO with SAML = a seamless flow into GCP to do the administrative work. The detailed flow will look like the below. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/overviewgcpidp.png) 20 | 21 | *Disclaimer & Pre-reqs* - GCP does not allow .onmicrosoft.com accounts to provision to GCP. You MUST use a custom domain. It is recommended to use the SAME custom domain name that is being used in Entra ID, but this is not necessary, nor may it be possible depending on who owns the name. See [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-active-directory#usage_of_dns_domains_in_azure_ad) on GCP domain names. What does that mean? cyberlorians.com exists in both Entra ID and GCP. In fact, a customer is not able to set up a GCP environment without a custom domain name. Plan wisely. 22 | 23 | Let's dig in!
This setup assumes you have Entra ID and Google already set up. The first thing we need to do is create the user provisioning piece from Entra ID>GCP. The official document on those steps is [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on). 24 | 25 | ## Provisioning Setup 26 | 27 | #### Step1: Create a 'delegated admin' in GCP for Entra ID. 28 | This user will be the automated provisioning account that we set in the Entra ID application to SCIM the users/groups to GCP. Those steps are outlined [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#creating_a_cloud_identity_user_account_for_synchronization). Proceed to the domain name setup if you want any subdomains. If not, leave the primary as is. 29 | 30 | #### Step2: Configure Entra ID Provisioning. 31 | Steps are outlined [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#create_an_enterprise_application); however, name your Enterprise Application 'GoogleCloudProvisioning.onmicrosoft', or something to distinguish it by. The real reason I state this is that if you are using Entra ID B2B you will need another provisioning enterprise application with its own attribute mappings. In the image below, make sure you have 'NO' set on all 3 properties as stated in the doc. Note - you may keep the provisioning and SSO Enterprise Apps together, but for security best practice, leave them separated and locked down according to who can manage them. 32 | 33 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpprovissettings.png) 34 | 35 | *Disclaimer* - I am showcasing administrative work only, using cloud only .onmicrosoft.com accounts. I.e., cranesmeadows.com is a DNS suffix on prem in my ADDS environment using ADFS.
If I were to use the same suffix, I would allow the same on prem account to be used (no least privilege), going through ADFS as the token gets passed from the on prem ADDS servers first (this is messy, but 100% doable). So many scenarios can work here, so please understand them all from the documentation. I'm just focusing on admin accounts only, to break all lateral movement and phishing campaigns from email. I have a block rule in my Exchange admin center blocking any outside email to my primary (*.onmicrosoft.com) domain, to help lock this effort down. You may use a custom domain name if you would like to keep in line with the Enterprise Access Model - Management Plane. Just keep least privilege in mind. 36 | 37 | #### Step3: Configure User Provisioning. 38 | 39 | As stated, in this lab I am doing a UPN: domain substitute, but please choose accordingly [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#configure_user_provisioning). It is straightforward, but below are snippets for visualization (User & Group mappings). Remember, these users and groups will transform to GCP with the suffix of cranesmeadows.com (the GCP Domain Name). 40 | 41 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpattrimapping1.png) 42 | 43 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpattrimapping2.png) 44 | 45 | #### Step4: Enable Automatic Provisioning. 46 | 47 | Leave this step as default and all is well; the document is [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#enable_automatic_provisioning). 48 | 49 | #### Step5: Assign the users to the new provisioning enterprise app (seen below) to be provisioned. Confirm provisioning is working and check GCP to see the new users.
50 | 51 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpprovusergroup.png) 52 | 53 | ## SSO Setup 54 | 55 | #### Step1: Configure Entra ID for Single sign-on. 56 | 57 | Follow the instructions [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#configuring_azure_ad_for_single_sign-on) by creating a new Enterprise Application first, then proceeding with configuring user assignment (the same users/groups assigned to the provisioning app). 58 | 59 | #### Step2: Configure SAML Settings 60 | 61 | Instructions are [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#configure_saml_settings); however, I want to show the settings in the snippets below. 62 | 63 | The SAML settings specific to your custom domain name are below. 64 | 65 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpsaml.png) 66 | 67 | Under the 'UPN: domain substitute' steps, look at the snippet, because the instructions in the doc can be a bit janky. Under the join() for parameter 2, just type the custom domain name in the field and hit enter. It will look like the below. Do NOT put quotes, because the parameter will then have double quotes and NOT work. 68 | 69 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/tenantgcptransform.png) 70 | 71 | ## Configure Cloud Identity or Google Workspace for single sign-on 72 | 73 | These steps are laid out [here](https://cloud.google.com/architecture/identity/federating-gcp-with-azure-ad-configuring-provisioning-and-single-sign-on#configuring_cloud_identity_for_single_sign-on); remember, the certificate you are uploading is the enterprise app's 'SAML Signing Certificate' (Base64) you downloaded. 74 | 75 | Log into myapps.microsoft.com and you should be able to SSO into GCP.
76 | 77 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmyapps.png) 78 | 79 | SSO>GCP Console confirmation 80 | 81 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpssoconfirmed.png) 82 | 83 | 84 | ## Further Notes 85 | 86 | *Note 1* - I stated to use multiple Enterprise Apps for provisioning and SSO. At first this may sound goofy, but it is for least privilege of the application. One app is controlling provisioning into the other cloud provider; therefore, no ordinary user needs to view this application, and no one outside of the owner or the proper Entra ID privileged role should be able to view it or make configuration changes to the provisioning and sync attributes. 87 | 88 | *Note 2* - If wanting to incorporate B2B from Azure Commercial or Government, stand up yet another provisioning Enterprise Application for this need and follow these [steps](https://cloud.google.com/architecture/identity/azure-ad-b2b-user-provisioning-and-sso#configure_azure_ad_provisioning). The reason for this, at the time of writing, is simply that under the provisioning attribute mapping it is difficult to add multiple 'Replacements' for the Entra ID attribute mapped to the GCP 'primaryEmail' attribute. Once a way is found, this documentation will be updated. 89 | 90 | *Note 2.b* - Configuring the SSO for B2B and everything else can be a little confusing, as shown [here](https://cloud.google.com/architecture/identity/azure-ad-b2b-user-provisioning-and-sso#configuring_azure_ad_for_single_sign-on). In your journey, if you decide to go this route, please see the below snippets along with Google's documentation to see how the SSO claims can be combined into one. 91 | 92 | Enterprise App = GCP-SSO. Head to Single sign-on>2 (Attributes & Claims). 93 | 94 | Claim Name = UUID (Name ID) - Edit this field in the below images.
95 | 96 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim0.png) 97 | 98 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim1.png) 99 | 100 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim2.png) 101 | 102 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim3.png) 103 | 104 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim4.png) 105 | 106 | Any claim condition - Just enter as seen below. 107 | 108 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim5.png) 109 | 110 | Back on the Attributes & Claims page - Additional Claims - Enter as seen below. 111 | 112 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/gcpmanageclaim6.png) 113 | 114 | *Note 3* - Always think of least privilege during this type of setup, whether it be from on prem first, B2B or cloud. Don't mix the traditional tiers from on prem OR the control and data planes of the cloud, or worse, give a B2B user from another tenant full admin rights to your Google platform. When talking administrative functions and least privilege, also choose passwordless scenarios, MFA, phishing-resistant authentication, etc. A good example of where this situation may not require a cloud only admin account is perhaps a read only or billing account on the GCP side. In that case you may find it acceptable to use an on prem synced account to flow through the entire process. Albeit, you may inadvertently change the identity lifecycle and management piece of all this and introduce lax security. 115 | 116 | *Note 4* - Don't think using multiple apps for provisioning and SSO will be a nightmare. I recommend using Entra ID Dynamic groups to populate your groups. This way you just assign the groups permissions to the apps and call it a day, because you know the users with a certain attribute will be auto populated into these groups.
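To illustrate the Dynamic groups idea in Note 4 - and this is only a sketch, with the attribute and value below being hypothetical placeholders - a dynamic membership rule on the Entra ID group could look like:

```
(user.extensionAttribute10 -eq "GCP-Admin") and (user.accountEnabled -eq true)
```

Any enabled user stamped with that attribute value is then automatically placed into the group you assigned to both the provisioning and SSO apps.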
117 | 118 | ## Part2: The security layer to this - coming soon! ## 119 | 120 | 121 | 122 | -------------------------------------------------------------------------------- /AADPasswordProtectionWithSentiel.md: -------------------------------------------------------------------------------- 1 | ## Capturing AAD Password Protection Summaries and Monitoring with Sentinel ## 2 | 3 | I am going to keep this one short: if you find yourself configuring "Azure Active Directory Password Protection" and want to monitor the summary report, look no further. See the basic build doc [here](https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-password-ban-bad-on-premises-operations). 4 | 5 | After setup, you can monitor all other events with the AMA agent, seen [here](https://learn.microsoft.com/en-us/azure/active-directory/authentication/howto-password-ban-bad-on-premises-monitor). 6 | 7 | If you would like to capture the Password Protection Summaries, your first step would be to create a custom DCR rule to capture the XPath query. See below. 8 | 9 | Microsoft-AzureADPasswordProtection-DCAgent/Admin!*[System[(Level=1 or Level=2 or Level=3 or Level=4 or Level=0 or Level=5)]] 10 | 11 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/AADPassDCR.png) 12 | 13 | Once configured, head over to Sentinel and plug in the below query.
14 | 15 | 16 | ``` 17 | Event 18 | | where Source == "Microsoft-AzureADPasswordProtection-DCAgent" 19 | | summarize PasswordChangesValidated = countif(EventID == 10014), PasswordSetsValidated = countif(EventID == 10015), PasswordChangesRejected = countif(EventID == 10016), PasswordSetsRejected = countif(EventID == 10017), PasswordChangeAuditOnlyFailures = countif(EventID == 10024), PasswordSetAuditOnlyFailures = countif(EventID == 10025), PasswordChangeErrors = countif(EventID == 10012), PasswordSetErrors = countif(EventID == 10013) by bin(TimeGenerated, 7d), Computer 20 | ``` 21 | 22 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/AADPassDCR2.png) 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | -------------------------------------------------------------------------------- /AHRiskScoreMGand10klimits.md: -------------------------------------------------------------------------------- 1 | #### Advanced Hunting in MDE for Risk Score, Machine Groups AND beating the 10k limits - Huge thanks and collab effort to [Matthew Zorich](https://twitter.com/reprise_99?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor) and [David Lajoie](https://www.linkedin.com/in/dclajoie/).
2 | ## Hunting query for all risk scores and machine groups 3 | ``` 4 | // Devices 5 | let devicelist = 6 | DeviceInfo 7 | | summarize arg_max(Timestamp, *) by DeviceId 8 | | project DeviceName, DeviceId, MachineGroup; 9 | // Scoring for the CVEs 10 | let Critical = int(40); 11 | let High = int(10); 12 | let Medium = int(3); 13 | let Low = int(1); 14 | let Informational = int(0); 15 | // Get All the CVEs 16 | let AllCVE = (DeviceTvmSoftwareVulnerabilities 17 | | project DeviceId, DeviceName, VulnerabilitySeverityLevel, CveId, SoftwareVendor 18 | | extend RiskScore = case(VulnerabilitySeverityLevel == "Critical", Critical, 19 | VulnerabilitySeverityLevel == "High", High, 20 | VulnerabilitySeverityLevel == "Medium", Medium, 21 | VulnerabilitySeverityLevel == "Low", Low, 22 | Informational) 23 | ); 24 | // Get all CVE information 25 | let CVEScore = (DeviceTvmSoftwareVulnerabilitiesKB 26 | ); 27 | AllCVE | join kind=leftouter CVEScore on CveId 28 | // Create the column Criticality to count all critical and high CVEs with an available exploit 29 | | extend Criticality = case(IsExploitAvailable == "1" and VulnerabilitySeverityLevel == "Critical", "Critical" 30 | ,IsExploitAvailable == "1" and VulnerabilitySeverityLevel == "High", "High" 31 | ,"Lower") 32 | | summarize TotalRiskScore = sum(RiskScore), TotalCVE = count(CveId), AverageScore = avg(RiskScore), Vendors = makeset(SoftwareVendor), Exploitable = countif(IsExploitAvailable==1), CriticalCVE = countif(Criticality == "Critical" or Criticality == "High") ,CVSSNone = countif(isempty(CvssScore)), CVSSLow = countif(CvssScore between (0.1 .. 3.9)), CVSSMedium = countif(CvssScore between (4.0 .. 6.9)), CVSSHigh = countif(CvssScore between (7.0 .. 8.9)), CVSSCritical = countif(CvssScore between (9 .. 
10)) by DeviceName, DeviceId 33 | | join kind=inner (devicelist) on DeviceId 34 | | extend splitall=split(MachineGroup, DeviceName) 35 | | project-away DeviceId1, DeviceName1, splitall 36 | | sort by TotalRiskScore desc 37 | ``` 38 | ## If output is greater than 10k run the following in MDE API Explorer 39 | [MDE API Hunting - MSFT Docs](https://docs.microsoft.com/en-us/microsoft-365/security/defender-endpoint/run-advanced-query-api?view=o365-worldwide#request-example "MDE API Hunting") 40 | 41 | 42 | ![alt text](https://github.com/TheCyberlorians/uploadedimages/blob/main/AHMGAPI.png "MDE API Explorer") 43 | 44 | ## Enter the below query to the MDE API seen in the image 45 | ``` 46 | { 47 | "Query":"let devicelist = DeviceInfo | summarize arg_max(Timestamp, *) by DeviceId | project DeviceName, DeviceId, MachineGroup; let Critical = int(40); let High = int(10); let Medium = int(3); let Low = int(1); let Informational = int(0); let AllCVE = (DeviceTvmSoftwareVulnerabilities | project DeviceId, DeviceName, VulnerabilitySeverityLevel, CveId, SoftwareVendor | extend RiskScore = case(VulnerabilitySeverityLevel == \"Critical\", Critical, VulnerabilitySeverityLevel == \"High\", High, VulnerabilitySeverityLevel == \"Medium\", Medium, VulnerabilitySeverityLevel == \"Low\", Low, Informational)); let CVEScore = (DeviceTvmSoftwareVulnerabilitiesKB ); AllCVE | join kind=leftouter CVEScore on CveId | extend Criticality = case(IsExploitAvailable == \"1\" and VulnerabilitySeverityLevel == \"Critical\", \"Critical\" ,IsExploitAvailable == \"1\" and VulnerabilitySeverityLevel == \"High\", \"High\" ,\"Lower\") | summarize TotalRiskScore = sum(RiskScore), TotalCVE = count(CveId), AverageScore = avg(RiskScore), Vendors = makeset(SoftwareVendor), Exploitable = countif(IsExploitAvailable==1), CriticalCVE = countif(Criticality == \"Critical\" or Criticality == \"High\") ,CVSSNone = countif(isempty(CvssScore)), CVSSLow = countif(CvssScore between (0.1 .. 
3.9)), CVSSMedium = countif(CvssScore between (4.0 .. 6.9)), CVSSHigh = countif(CvssScore between (7.0 .. 8.9)), CVSSCritical = countif(CvssScore between (9 .. 10)) by DeviceName, DeviceId | join kind=inner (devicelist) on DeviceId | project-away DeviceId1, DeviceName1 | sort by TotalRiskScore desc;" 48 | } 49 | ``` 50 | 51 | ## Take the output and dump it into a txt file in "folder": 52 | ![alt text](https://github.com/TheCyberlorians/uploadedimages/blob/main/jsonoutputmdeapi.png) 53 | 54 | ## Take this script and put it into “folder”, call it “jsontocsv.ps1” 55 | ``` 56 | $PathToJson = Read-Host -Prompt "Enter JSON Txt File" 57 | (Get-Content -Path $PathToJson | ConvertFrom-Json).Results | Export-Csv -Path $PathToJson.Replace('.txt','.csv') -NoTypeInformation 58 | ``` 59 | ## Run the script and enter the file - the csv file will dump into "folder" 60 | ![alt text](https://github.com/TheCyberlorians/uploadedimages/blob/main/jsontocsv.png) 61 | 62 | ## Verify the output in the csv file 63 | ![alt text](https://github.com/TheCyberlorians/uploadedimages/blob/main/jsoncsvexcel.png) 64 | 65 | -------------------------------------------------------------------------------- /AutoExcludeCAP.md: -------------------------------------------------------------------------------- 1 | ## Automatically Exclude BreakGlass Group From Conditional Access ## 2 | 3 | Having your break glass accounts in an exclusion group which is EXCLUDED from conditional access policy is a pivotal piece of your Zero Trust identity plane, for two simple reasons: it allows the identity team to regain access to a tenant if someone makes a configuration mistake and breaks AuthZ/AuthN to the tenant, and it protects you if a threat actor takes over and removes the exclusions from the policies. You are at the mercy of the recurrence interval, so I would suggest running this every 1-5m in corporate orgs. 4 | 5 | 6 | ## Pre-Configuration 7 | 8 | 1. Create a security group in Entra ID and label it 'Exclude - BG'. 9 | 10 | 2.
Grab the ObjectID and save it for the Post-Configuration steps. 11 | 12 | ## Deploy the logic app 13 | 14 | [![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FCyberlorians%2FLogicApps%2Fmain%2FAutoCAPExclude.json) 15 | 16 | 1. On the Parameters tab of the logic app, enter the ObjectID of your Exclusion Group. 17 | 18 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autoexcludeautodeploy.png) 19 | 20 | ## Post-Configuration of the AutoCAPExclude Logic App 21 | 22 | 1. Open Azure PowerShell via the browser & paste the below code. 23 | 24 | ``` 25 | connect-azuread 26 | 27 | $miObjectID = $null 28 | Write-Host "Looking for Managed Identity with default prefix names of the Logic App..." 29 | $miObjectIDs = @() 30 | $miObjectIDs = (Get-AzureADServicePrincipal -SearchString "AutoCapExclude").ObjectId 31 | if ($miObjectIDs -eq $null) { 32 | $miObjectIDs = Read-Host -Prompt "Enter ObjectId of Managed Identity (from Logic App):" 33 | } 34 | 35 | # The app ID of the Microsoft Graph API where we want to assign the permissions 36 | $appId = "00000003-0000-0000-c000-000000000000" 37 | $permissionsToAdd = @("Policy.Read.All","Policy.ReadWrite.ConditionalAccess","Mail.Send") 38 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 39 | 40 | foreach ($miObjectID in $miObjectIDs) { 41 | foreach ($permission in $permissionsToAdd) { 42 | Write-Host $permission 43 | $role = $app.AppRoles | where Value -Like $permission | Select-Object -First 1 44 | New-AzureADServiceAppRoleAssignment -Id $role.Id -ObjectId $miObjectID -PrincipalId $miObjectID -ResourceId $app.ObjectId 45 | } 46 | } 47 | ``` 48 | 49 | 2. Set the recurrence of the logic app. Suggested 1-5m. 50 | 51 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocaprecur.png) 52 | 53 | 3. Configure your endpoint based on which Graph environment you are working with.
54 | 55 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapgetcond.png) 56 | 57 | *Graph endpoints for Step 3 are below* 58 | 59 | ``` 60 | Commercial URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 61 | Commercial Audience = https://graph.microsoft.com 62 | 63 | GCC URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 64 | GCC Audience = https://graph.microsoft.com 65 | 66 | GCCH URL = https://graph.microsoft.us/v1.0/identity/conditionalAccess/policies 67 | GCCH Audience = https://graph.microsoft.us 68 | ``` 69 | 70 | 4. Configure the SEND MAIL (POST) and what Graph endpoint you need to use. Graph Endpoint URLs are listed just below the image. 71 | 72 | The first arrow, in the URI line item, is a shared mailbox. The second arrow, within the body, is the "recipients". 73 | 74 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapemail.png) 75 | 76 | *Graph endpoints for Step 4 are below* 77 | 78 | ``` 79 | Commercial URL = https://graph.microsoft.com/v1.0/users/EMAILADDRESS/sendmail 80 | Commercial Audience = https://graph.microsoft.com 81 | 82 | GCC URL = https://graph.microsoft.com/v1.0/users/EMAILADDRESS/sendmail 83 | GCC Audience = https://graph.microsoft.com 84 | 85 | GCCH URL = https://graph.microsoft.us/v1.0/users/EMAILADDRESS/sendmail 86 | GCCH Audience = https://graph.microsoft.us 87 | ``` 88 | 89 | ## Run the logic app and test (make sure the exclusion group is not part of a Conditional Access policy before testing) 90 | 91 | *Excluded group now added to the CAP* 92 | 93 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapproof.png) 94 | 95 | *Email sent to DLs in the logic app* 96 | 97 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapsendemailproof.png) 98 | 99 | ## Monitoring & Alerting of the automation 100 | 101 | 1. On the logic app, click > Diagnostic Settings and send to the preferred Log Analytics Workspace. 102 | 103 | 2.
Create an Azure Monitor or Sentinel analytics rule based off the KQL query below. 104 | 105 | ``` 106 | AzureDiagnostics 107 | | where resource_workflowName_s == "AutoCAPExclude" 108 | | sort by TimeGenerated asc 109 | | where status_s == "Failed" 110 | | distinct startTime_t, resource_workflowName_s, status_s, resource_actionName_s 111 | ``` 112 | 113 | 114 | 115 | 116 | -------------------------------------------------------------------------------- /CA-SISM-Conditional Access Summaries, Insights, Security & Monitoring.md: -------------------------------------------------------------------------------- 1 | ## CA-SISM: Conditional Access Summaries, Insights, Security & Monitoring. ## 2 | 3 | Without permissions to the Identity plane, you cannot view Conditional Access policies, and obtaining the current policy set becomes unattainable. This approach relies on a logic app to invoke the Graph API for Conditional Access and feed the data into the Log Analytics workspace. The resulting table will be named 'TenantCAPols_CL'. Ingestion will occur once on both Monday and Friday of every week. 4 | 5 | ## Deploy the logic app 6 | 7 | [![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FCyberlorians%2FLogicApps%2Fmain%2FTenantCAPols-Ingest.json) 8 | 9 | 10 | ## Post-Configuration of the TenantCAPols-Ingest Logic App 11 | 12 | 13 | 1. **Open Azure PowerShell via the browser & paste the below code.** 14 | 15 | ``` 16 | connect-azuread 17 | 18 | $miObjectID = $null 19 | Write-Host "Looking for Managed Identity with default prefix names of the Logic App..."
20 | $miObjectIDs = @() 21 | $miObjectIDs = (Get-AzureADServicePrincipal -SearchString "TenantCAPols-Ingest").ObjectId 22 | if ($miObjectIDs -eq $null) { 23 | $miObjectIDs = Read-Host -Prompt "Enter ObjectId of Managed Identity (from Logic App):" 24 | } 25 | 26 | # The app ID of the Microsoft Graph API where we want to assign the permissions 27 | $appId = "00000003-0000-0000-c000-000000000000" 28 | $permissionsToAdd = @("Policy.Read.All") 29 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 30 | 31 | foreach ($miObjectID in $miObjectIDs) { 32 | foreach ($permission in $permissionsToAdd) { 33 | Write-Host $permission 34 | $role = $app.AppRoles | where Value -Like $permission | Select-Object -First 1 35 | New-AzureADServiceAppRoleAssignment -Id $role.Id -ObjectId $miObjectID -PrincipalId $miObjectID -ResourceId $app.ObjectId 36 | } 37 | } 38 | ``` 39 | 40 | 1(a). *If applicable, change the HTTP calls*. Configure your endpoint based off what Graph environment you are working with. Please adjust the logic app HTTP call per the tenant you are working in. Commercial & GCC use the same API call; Gov will need to be adjusted. 41 | 42 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapgetcond.png) 43 | 44 | *Graph endpoints for adjustment are below* 45 | 46 | ``` 47 | Commercial URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 48 | Commercial Audience = https://graph.microsoft.com 49 | 50 | GCC URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 51 | GCC Audience = https://graph.microsoft.com 52 | 53 | GCCH URL = https://graph.microsoft.us/v1.0/identity/conditionalAccess/policies 54 | GCCH Audience = https://graph.microsoft.us 55 | ``` 56 | 57 | 58 | 2. **Configuration for Log Analytics Workspace ingestion.** 59 | 60 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cacismlaw.png) 61 | 62 | 3.
**Confirm ingestion at your Log Analytics Workspace.** 63 | 64 | ``` 65 | TenantCAPols_CL 66 | | summarize arg_max(TimeGenerated, *) by id_g 67 | | extend DisplayName = tostring(displayName_s) 68 | | extend PolicyId = tostring(id_g) 69 | //| extend Conditions = Policies.conditions 70 | //| mv-expand Conditions 71 | | extend State = case( 72 | tostring(state_s) == "enabled", "On", 73 | tostring(state_s) == "disabled", "Off", 74 | tostring(state_s) == "enabledForReportingButNotEnforced", "Report-only", 75 | "Unknown" 76 | ) 77 | //| extend GrantControls = Policies.grantControls 78 | | extend CreatedTimeDate = createdDateTime_t 79 | | extend ModifiedTimeDate = modifiedDateTime_t 80 | | extend UsersInclude = conditions_users_includeUsers_s 81 | | extend UsersExclude = conditions_users_excludeUsers_s 82 | | extend GroupsInclude = conditions_users_includeGroups_s 83 | | extend GroupsExclude = conditions_users_excludeGroups_s 84 | | extend CloudAppsInclude = conditions_applications_includeApplications_s 85 | | extend CloudAppsExclude = conditions_applications_excludeApplications_s 86 | | extend ClientPlatformsIncludeTmp = column_ifexists("conditions_platforms_includePlatforms_s", "") 87 | | extend ClientPlatformsInclude = case( 88 | "ExplicitOnly" == "ExplicitOnly", ClientPlatformsIncludeTmp, 89 | case( 90 | isnotempty(ClientPlatformsIncludeTmp), ClientPlatformsIncludeTmp, 91 | todynamic("(all)") 92 | ) 93 | ) 94 | | extend ClientPlatformsIncludeTooltip = case( 95 | ClientPlatformsInclude == "(all)", "This is the implicit configuration. 
All platforms are included, because the setting has not been configured.", 96 | "" 97 | ) 98 | | extend ClientPlatformsExcludeTmp = column_ifexists("conditions_platforms_excludePlatforms_s", "[]") 99 | | extend ClientPlatformsExclude = case( 100 | "ExplicitOnly" == "ExplicitOnly", ClientPlatformsExcludeTmp, 101 | case( 102 | isnotempty(ClientPlatformsExcludeTmp), ClientPlatformsExcludeTmp, 103 | todynamic("[]") 104 | ) 105 | ) 106 | | extend ClientApps = conditions_clientAppTypes_s 107 | | extend LocationsInclude = case( 108 | "ExplicitOnly" == "ExplicitOnly", conditions_locations_includeLocations_s, 109 | case( 110 | isnotempty(conditions_locations_includeLocations_s), conditions_locations_includeLocations_s, 111 | todynamic("(all)") 112 | ) 113 | ) 114 | | extend LocationsIncludeTooltip = case( 115 | LocationsInclude == "(all)", "This is the implicit configuration. All locations are included, because the setting has not been configured.", 116 | "" 117 | ) 118 | | extend LocationsExclude = case( 119 | "ExplicitOnly" == "ExplicitOnly", conditions_locations_excludeLocations_s, 120 | case( 121 | isnotempty(conditions_locations_excludeLocations_s), conditions_locations_excludeLocations_s, 122 | todynamic("[]") 123 | ) 124 | ) 125 | | extend UserRiskLevels = case( 126 | "ExplicitOnly" == "ExplicitOnly", conditions_userRiskLevels_s, 127 | case( 128 | array_length(todynamic(conditions_userRiskLevels_s)) != 0, conditions_userRiskLevels_s, 129 | todynamic("(all)") 130 | ) 131 | ) 132 | | extend UserRiskLevelsTooltip = case( 133 | UserRiskLevels == "(all)", "This is the implicit configuration. 
All user risk levels are included, because the setting has not been configured.", 134 | "" 135 | ) 136 | | extend SigninRiskLevels = case( 137 | "ExplicitOnly" == "ExplicitOnly", conditions_signInRiskLevels_s, 138 | case( 139 | array_length(todynamic(conditions_signInRiskLevels_s)) != 0, conditions_signInRiskLevels_s, 140 | todynamic("(all)") 141 | ) 142 | ) 143 | | extend SigninRiskLevelsTooltip = case( 144 | SigninRiskLevels == "(all)", "This is the implicit configuration. All sign-in risk levels are included, because the setting has not been configured.", 145 | "" 146 | ) 147 | | extend GrantControls = grantControls_builtInControls_s 148 | | extend FullPolicyJson = pack_all() 149 | | sort by DisplayName asc 150 | | project ['Policy display name'] = DisplayName, State, ['Cloud apps included'] = CloudAppsInclude, ['Cloud apps excluded'] = CloudAppsExclude, ['Users included'] = UsersInclude, ['Users excluded'] = UsersExclude, ['Groups included'] = GroupsInclude, ['Groups excluded'] = GroupsExclude, ['Client platforms included'] = ClientPlatformsInclude, ['Client platforms excluded'] = ClientPlatformsExclude, ['Client apps'] = ClientApps, ['Locations included'] = LocationsInclude, ['Locations excluded'] = LocationsExclude, ['User risk levels'] = UserRiskLevels, ['Sign-in risk levels'] = SigninRiskLevels, ['Grant controls'] = GrantControls, CreatedTimeDate, ModifiedTimeDate, PolicyId, ['Full policy JSON'] = FullPolicyJson, ClientPlatformsIncludeTooltip, LocationsIncludeTooltip, UserRiskLevelsTooltip, SigninRiskLevelsTooltip 151 | ``` 152 | 153 | 154 | -------------------------------------------------------------------------------- /CEF-AMASetup.md: -------------------------------------------------------------------------------- 1 | ## Stream Common Event Format (CEF) using Azure Monitor Agent (AMA) (the helping hand guide). Authored by: Michael Crane and [Lorenzo Ireland](https://github.com/dcodev1702).
## 2 | 3 | Many folks using Sentinel have issues with clarity around the Common Event Format (CEF) via AMA, and rightfully so. This article aims to clear up any confusion in both Azure Commercial and US Government tenants. See CEF-AMA [here](https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-ama). 4 | 5 | *Disclaimer* - In Microsoft Sentinel, the CEF connector simply provides instructions to create a Data Collection Rule (DCR) and a Python script to enable rsyslog TCP and UDP on port 514. From there, CEF events are forwarded via the Azure Monitor Agent (AMA) over TCP/443 TLS 1.2 to its corresponding Log Analytics Workspace defined in the associated Data Collection Rule, and then to the CommonSecurityLog table within the defined Log Analytics Workspace. It is NOT a true connector and simply takes advantage of AMA, DCR, and a [Workspace transformation](https://learn.microsoft.com/en-us/azure/sentinel/data-transformation). You will have some manual work to do, and this solution does work in Microsoft Azure US Government. 6 | 7 | *Pre-req* - Create an Azure Monitor Agent supported Linux Virtual Machine (VM). This example uses Ubuntu 22.04. This has also been tested with Rocky 8 Linux. I am using Azure for this use case. The VM you select should be based upon the volume of CEF traffic you expect to receive. Since this is owned by the SecOps team, the VM lives within my SecOps subscription in Azure. 8 | 9 | ## CEF Setup on Ubuntu 22.04 10 | 11 | ``` 12 | # SSH to the newly created Ubuntu VM. The following commands can be copied and pasted into the ssh session. 13 | 14 | # Update/Upgrade System, if needed. 15 | sudo apt-get update -y && sudo apt-get upgrade -y 16 | 17 | # Reboot 18 | sudo reboot 19 | 20 | # Check if Python 2.7/3 is installed and syslog-ng or rsyslog (rsyslog by default) 21 | sudo apt install python3 22 | sudo apt install rsyslog 23 | ``` 24 | 25 | ## Creating the Data Collection Rule. 26 | 27 | The DCR rule has to be in place first.
Just create a simple syslog DCR and call it a day, as we will be reconfiguring it later. Once created, allow some time for the AMA extension to be added to the machine and the syslog data to ingest into Sentinel before going on to the next step. 28 | 29 | *Instructions* - [here](https://learn.microsoft.com/en-us/azure/sentinel/forward-syslog-monitor-agent) 30 | 31 | ## Run the following on your CEF machine, AFTER you have created the DCR rule. 32 | 33 | Azure Commercial or Azure Government. The installation script configures the rsyslog or syslog-ng daemon to use the required protocol and restarts the daemon. 34 | ``` 35 | sudo wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/Syslog/Forwarder_AMA_installer.py 36 | sudo python3 Forwarder_AMA_installer.py 37 | 38 | ``` 39 | ## Edit the rsyslog or syslog-ng conf file. 40 | On the Ubuntu server you will see it has been changed to CEF by uncommenting the modules and inputs. Confirm the changes with: 'cat /etc/rsyslog.conf' 41 | 42 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cefmagrsyslog.png) 43 | 44 | ## Setup the connector with the API - Reconfigure the DCR for CEF and NOT syslog. 45 | 46 | *PreReqs* - PowerShell, Az Module.
47 | 48 | GET Request URL and Header - **Azure Commercial** 49 | 50 | ``` 51 | Connect-AzAccount -Environment Azure -UseDeviceAuthentication 52 | $token = (Get-AzAccessToken -ResourceUrl 'https://management.azure.com').Token 53 | $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]" 54 | $headers.Add("Authorization","Bearer $token") 55 | $ct = 'application/json' 56 | $subscriptionId = 'SubscriptionIDofWhereTheDCRLives' 57 | $resourceGroupName = 'RGofWhereTheDCRLives' 58 | $dataCollectionRuleName = 'CEF-DCR-Name' 59 | $url = "https://management.azure.com/subscriptions/$($subscriptionId)/resourceGroups/$($resourceGroupName)/providers/Microsoft.Insights/dataCollectionRules/$($dataCollectionRuleName)?api-version=2019-11-01-preview" 60 | $DCRResponse = Invoke-RestMethod $url -Method 'Get' -Headers $headers 61 | $DCRResponse | ConvertTo-Json -Depth 20 | Out-File "$(pwd)\dcr.json" 62 | ``` 63 | 64 | GET Request URL and Header - **Azure Government** 65 | 66 | ``` 67 | Connect-AzAccount -Environment AzureUSGovernment -UseDeviceAuthentication 68 | $token = (Get-AzAccessToken -ResourceUrl 'https://management.usgovcloudapi.net').Token 69 | $headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]" 70 | $headers.Add("Authorization","Bearer $token") 71 | $ct = 'application/json' 72 | $subscriptionId = 'SubscriptionIDofWhereTheDCRLives' 73 | $resourceGroupName = 'RGofWhereTheDCRLives' 74 | $dataCollectionRuleName = 'CEF-DCR-Name' 75 | $url = "https://management.usgovcloudapi.net/subscriptions/$($subscriptionId)/resourceGroups/$($resourceGroupName)/providers/Microsoft.Insights/dataCollectionRules/$($dataCollectionRuleName)?api-version=2019-11-01-preview" 76 | $DCRResponse = Invoke-RestMethod $url -Method 'Get' -Headers $headers 77 | $DCRResponse | ConvertTo-Json -Depth 20 | Out-File "$(pwd)\dcr.json" 78 | ``` 79 | ## Reading the Request Body and making edits 80 | 81 | You can follow the directions
[here](https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-ama#request-body). 82 | 83 | Edit and Notes: Where you see a RED dot, take note of the MSFT article and make your changes accordingly. Below is an example. Make the changes and save the file. 84 | 85 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cefdcredit.png) 86 | 87 | ## PUT Request Body - **This is the same for any Azure Environment** 88 | 89 | ``` 90 | $json = Get-Content c:\tools\dcrcefapi.json -Raw 91 | $DCRPUT = Invoke-RestMethod -Method 'PUT' $url -Body $json -Headers $headers -ContentType $ct 92 | ``` 93 | 94 | ## Confirm changes have been made by reading the overview/JSON on your DCR rule in Azure Monitor. 95 | 96 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/CEFcompleteDCR.png) 97 | 98 | ## Test the connector [here](https://learn.microsoft.com/en-us/azure/sentinel/connect-cef-ama#test-the-connector) 99 | 100 | ## Confirm you are ingesting the CEF Logs into Sentinel. 101 | 102 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/SentinelCEFProof.png) 103 | 104 | ## Verify the connector is installed correctly by running the troubleshooting script w/ this command. 105 | 106 | Azure Commercial 107 | ``` 108 | sudo wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/CEF/cef_AMA_troubleshoot.py 109 | sudo python3 cef_AMA_troubleshoot.py 110 | ``` 111 | 112 | Azure Government 113 | ``` 114 | sudo wget https://raw.githubusercontent.com/Cyberlorians/Sentinel/main/Connectors/CEF/cef_AMA_GOV_troubleshoot.py 115 | sudo python3 cef_AMA_GOV_troubleshoot.py 116 | ``` 117 | -------------------------------------------------------------------------------- /CyberEstate-TVMData.md: -------------------------------------------------------------------------------- 1 | ## CyberEstate with Threat & Vulnerability Management to the DataLake 2 | 3 | This article is a follow-up to my [TVMIngestion](https://github.com/Cyberlorians/Articles/blob/main/TVMIngestion.md).
As a refresher, the background is as follows: 4 | 5 | There is no Sentinel connector option for Microsoft's XDR Threat & Vulnerability Management data to ingest into Sentinel. Since the release of my LogicApp, which works flawlessly for smaller orgs, there are API [limitations](https://learn.microsoft.com/en-us/legal/microsoft-365/api-terms). Working alongside two other colleagues (Seyed Amirhossein Nouraie & Mike Palitto), I was exposed to Data Factory. 6 | 7 | The why! When querying API data in XDR there are call limitations, and when using the LogicApp it is easy to hit those limitations. Data Factory allows pagination (continually looping) on the OData call within the API until all data is seen and ingested to X (your endpoint). This is a huge deal because all of our Federal Customers have a mandate to track their TVM data and send it to another agency. Regardless of whether you are a Federal CX, you will want this solution because: **A** - *no connector to Sentinel or Streaming API in XDR*, **B** - *only 30 Days of data reside in XDR*, **C** - *the need for long term storage of said data to X (another endpoint).* The great piece with this solution is that we are sending to a blob container and compressed! So the data will arrive on the storage account at half the size as a gz file type. We can then query from ADX to view the data. 8 | 9 | **XDR to ADLS/ADX use case**. You CAN send your XDR data via Streaming API to ADLS. Jeffrey Appel has saved me time to show you how to accomplish this, seen [here](https://jeffreyappel.nl/export-microsoft-defender-for-endpoint-security-events-with-the-streaming-api/). You CAN also export logs from Sentinel/LAW to ADLS and either choose external query or ingest to ADX using [Data Export](https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export?tabs=portal#limitations).
If you choose to do this as well, follow the steps to export logs to ADLS and then continue with the below steps to ingest continually; it follows the same flow. 10 | 11 | An overview of what a small solution for all this enterprise logging around the Cyber Estate can look like is directly below. 12 | 13 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/onewaytoslicethepie.png) 14 | 15 | The Data Lake was chosen specific to MSFT products. There are use cases when the need is to send out via the Event Hub for interagency collaboration around dashboarding. For those situations please refer to your architect and look at Event Hubs. This is but another working solution to use ADLS for the Life Cycle Management around block blobs (compressed) and ADX ingestion. It is important to note that externally querying the data from ADX to ADLS is possible but not in scope. 16 | 17 | Let's start by setting up the Data Lake. You are going to deploy a standard Azure DataLake Storage Gen2 and we will use blob containers. 18 | 19 |
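Before building anything, it helps to see the loop that Data Factory's pagination rule automates against the TVM API. A minimal sketch in Python follows; `fetch_page` is a hypothetical stand-in for the authenticated HTTP GET, not part of any real SDK:

```python
# Sketch of the OData pagination loop the Data Factory pipeline performs.
# fetch_page is a placeholder for an authenticated GET against the TVM API;
# each response carries a "value" array and, until the last page, an
# "@odata.nextLink" pointing at the next page.
def collect_all(first_url, fetch_page):
    records = []
    url = first_url
    while url:
        page = fetch_page(url)              # one API call, subject to rate limits
        records.extend(page.get("value", []))
        url = page.get("@odata.nextLink")   # absent on the final page -> loop ends
    return records
```

This is why the LogicApp hits call limits on large tenants while the pipeline does not: the loop simply keeps issuing follow-up calls until the service stops returning a next link.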
## Deploy Storage Account - follow the steps below.

21 | 22 | **1** - *In Azure, Create Storage Account.* 23 | 24 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/storage1.png) 25 | 26 | **2** - *Enable Hierarchical Namespace. This will flag Data Lake GenV2 to kick off.* 27 | 28 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/storage2.png) 29 | 30 | **3** - *Uncheck the recovery features. If you do not do this it will block the deployment* 31 | 32 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/storage3.png) 33 | 34 |

35 | 36 | 37 |
## Deploy Data Factory - follow the steps below.

39 | 40 | **1** - *In Azure, Create Data Factory.* 41 | 42 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf1.png) 43 | 44 | **2** - *After creation of the Data Factory, navigate to Managed Identities just under the Settings blade. Click "Azure Role Assignments".* 45 | 46 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf2.png) 47 | 48 | **3** - *As seen in the image, add "Storage Blob Data Contributor" for this managed identity on the Data Lake created earlier.* 49 | 50 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf3.png) 51 | 52 | **4** - *Open Azure PowerShell CLI and run the [script] you see below. Enter your Managed Identity of the ADF.* 53 | 54 | 55 | ``` 56 | connect-azuread 57 | 58 | $miObjectID = $null 59 | Write-Host "Looking for Managed Identity with default prefix names of the Data Factory..." 60 | $miObjectIDs = @() 61 | $miObjectIDs = (Get-AzureADServicePrincipal -SearchString "YourADFManagedIdentity").ObjectId 62 | if ($miObjectIDs -eq $null) { 63 | $miObjectIDs = Read-Host -Prompt "Enter ObjectId of Managed Identity (from the Data Factory):" 64 | } 65 | 66 | # The app ID of the WindowsDefenderATP API where we want to assign the permissions 67 | $appId = "fc780465-2017-40d4-a0c5-307022471b92" 68 | $permissionsToAdd = @("Vulnerability.Read.All","Software.Read.All") 69 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 70 | 71 | foreach ($miObjectID in $miObjectIDs) { 72 | foreach ($permission in $permissionsToAdd) { 73 | Write-Host $permission 74 | $role = $app.AppRoles | where Value -Like $permission | Select-Object -First 1 75 | New-AzureADServiceAppRoleAssignment -Id $role.Id -ObjectId $miObjectID -PrincipalId $miObjectID -ResourceId $app.ObjectId 76 | } 77 | } 78 | ``` 79 | 80 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfperms1.png) 81 | 82 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfperm2.png) 83 | 84 | ## Configuring Data Factory - follow steps below.
85 | 86 | *Disclaimer - it is important to note that for this demo I chose a commercial instance. You can change the endpoints to mimic other Azure Environments see right below.* 87 | 88 | ``` 89 | Commercial URL = https://api.securitycenter.microsoft.com/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 90 | Commercial Audience = https://api.securitycenter.microsoft.com 91 | 92 | GCC URL = https://api-gcc.securitycenter.microsoft.us/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 93 | GCC Audience = https://api-gcc.securitycenter.microsoft.us 94 | 95 | GCCH URI = https://api-gov.securitycenter.microsoft.us/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 96 | GCCH Audience = https://api-gov.securitycenter.microsoft.us 97 | ``` 98 |
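The cloud-specific URL/audience pairs above can be captured in a small lookup so a script or parameter file targets the right environment. A sketch (the strings are exactly the ones listed above; the function name is illustrative only):

```python
# Map each cloud environment to the TVM API base URL and token audience
# listed in the table above.
TVM_ENDPOINTS = {
    "commercial": ("https://api.securitycenter.microsoft.com",
                   "https://api.securitycenter.microsoft.com"),
    "gcc":        ("https://api-gcc.securitycenter.microsoft.us",
                   "https://api-gcc.securitycenter.microsoft.us"),
    "gcch":       ("https://api-gov.securitycenter.microsoft.us",
                   "https://api-gov.securitycenter.microsoft.us"),
}

def software_vuln_url(cloud):
    """Return the SoftwareVulnerabilitiesByMachine URL and audience for a cloud."""
    base, audience = TVM_ENDPOINTS[cloud]
    return f"{base}/api/machines/SoftwareVulnerabilitiesByMachine", audience
```

The key point is that the token audience must match the base host of the call; mixing a commercial audience with a Gov URL is a common cause of 401s.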

99 | 100 |
## Working w/ Data Factory - follow the steps below.

102 | 103 | **1** - *Click "Launch Studio".* 104 | 105 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf4.png) 106 | 107 | **2** - *Click Author > then the + sign > Pipeline > Import from pipeline template.* 108 | 109 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf5.png) 110 | 111 | **3** - *When prompted for a ZIP file, download and save the [TVM Data Factory Template](https://github.com/Cyberlorians/CyberEstate/blob/main/AHTVM.zip) file, then upload it as the template. Once uploaded, the default upload will look like the below image.* 112 | 113 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adf6restvuln.png) 114 | 115 | **4** - *You will need to create Linked Services for each to work. On the "TVM_Rest_Vuln_Connection (Rest dataset)", click the drop-down and click "New". Follow the below snippet, test the connection and create. NOTE - none of these connections will work unless you have set the permissions for ADF on the managed identity via the script.* 116 | 117 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfrestvulnconnection.png). 118 | 119 | **5** - *Navigate to the next Linked Service, "TVM_Out", click the drop-down and click "New". Follow the snippet, test the connection and create. NOTE - this connection will not work if you did not follow the step to give the Managed Identity the Storage Blob Data Contributor role.* 120 | 121 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfoutconnection.png) 122 | 123 | **6** - *You will need to create Linked Services for each to work. On the "TVM_Rest_Software_Connection (Rest dataset)", click the drop-down and click "New". Follow the below snippet, test the connection and create. NOTE - none of these connections will work unless you have set the permissions for ADF on the managed identity via the script.* 124 | 125 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfrestsoftwareconnection.png).
126 | 127 | **7** - *You will need to create Linked Services for each to work. On the "TVM_Rest_Firmware_Connection(Rest dataset)", click the drop-down and click "New". Follow the below snippet, test-connection and create. NOTE - none of these connections will work unless you have set the permissions for ADF on the managed identity via the script.* 128 | 129 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfrestfirmwareconnection.png). 130 | 131 | **8** - *Verify all connections are good and hit complete.* 132 | 133 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/connectionscompleted.png) 134 | 135 |

136 | 137 |
## Validate & Publish Data Factory Pipelines - follow the steps below.

139 | 140 | 141 | **1** - *After the last step of configuration, you'll be brought back to the pipeline menu. Click debug. Everything should check out perfectly if the steps were followed.* 142 | 143 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/pipelinevalidate.png). 144 | 145 | **2** - *Navigate over to your Data Lake and verify the folders have been uploaded. Once there, check that the files are gz and block blobs (they are).* 146 | 147 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adlsproof.png) 148 | 149 | **3** - *Once confirmed successful, click "Publish All".* 150 | 151 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfoutlooksuccesspublish.png) 152 | 153 | **4** - *Add a trigger to your pipeline - you choose the schedule.* 154 | 155 |
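If you want to spot-check one of the landed gz block blobs locally after downloading it, the files are just gzip-compressed JSON, which is also why they land at roughly half the raw size. A minimal sketch (the file path is whatever you downloaded):

```python
import gzip
import json

def read_gz_json(path):
    # gzip.open in text mode transparently inflates the compressed blob,
    # so the payload can be parsed like any other JSON document.
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)
```

If `read_gz_json` parses cleanly and the top-level object contains the expected records, the pipeline's sink and compression settings are working.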

156 | 157 |
## Azure Data Explorer - You will have to follow these instructions for each TVM table.

159 | 160 | *Disclaimer - To create an [ADX](https://learn.microsoft.com/en-us/azure/data-explorer/create-cluster-and-database?tabs=free) cluster.* 161 | 162 | **1** - *Right-click your DB and click "Get Data". Give the table a name accordingly and select the container you have saved your ADF data to.* 163 | 164 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adx01.png) 165 | 166 | **2** - *On the inspect-the-data tab, review the data (I removed two columns to only have value).* 167 | 168 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adx02.png) 169 | 170 | **3** - *You MAY be prompted to grant permissions to ADX to READ the blob data if you have not done so already. A snippet is below of what that would look like.* 171 | 172 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adfadx3.png) 173 | 174 | **4** - *Review the summary and hit close. You may now query the data in ADX.* 175 | 176 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/adx03.png) 177 | 178 | 179 | **5** - *Query your data.* 180 | 181 | ``` 182 | TVMDeviceVuln 183 | | project value 184 | | mv-expand value 185 | | evaluate bag_unpack(value) 186 | ``` 187 | 188 | 189 | **6** - *The preceding steps were only a one-time ingestion. In order to continually ingest from ADLS, you will need to create an [Event Grid](https://learn.microsoft.com/en-us/azure/data-explorer/create-event-grid-connection?tabs=portal-adx%2Cazure-blob-storage#create-an-event-grid-data-connection) data connection. Follow the instructions to continually ingest new data for the fully automated solution. When using Event Grid it automatically kicks off a new EH, but please understand the limitations around Standard Event Hubs, Namespaces, etc. I recommend going Premium Event Hub in any org.* 190 | 191 | 192 |
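Outside of ADX, the same `mv-expand` plus `bag_unpack` flattening of the `value` array in step 5's query can be sketched in ordinary code, which is useful for sanity-checking what the query should return:

```python
# Equivalent of the KQL above: mv-expand walks each row's 'value' array,
# and bag_unpack lifts each record's keys into top-level columns
# (represented here as plain dicts, one per output row).
def flatten_value(rows):
    flat = []
    for row in rows:
        for record in row.get("value", []):
            flat.append(dict(record))
    return flat
```

Each element of the returned list corresponds to one row of the flattened ADX result.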

193 | 194 | 195 | 196 | 197 | 198 | -------------------------------------------------------------------------------- /DODCOAZT.md: -------------------------------------------------------------------------------- 1 | 2 | ### 1.1.1 User Inventory 3 | 4 | Query Azure AD for all synchronized user objects using Microsoft Graph PowerShell: 5 | ``` 6 | Connect-MgGraph -Scopes user.read.all 7 | Select-MgProfile beta 8 | Get-MgUser -Filter "OnPremisesSyncEnabled eq true" -All 9 | ``` 10 | 11 | Verify no synchronized users are assigned Global Administrator role: 12 | ``` 13 | Connect-MgGraph -Scopes RoleManagement.Read.All 14 | $roleid = $(Get-MgRoleManagementDirectoryRoleDefinition ` |?{$_.DisplayName -eq "Global Administrator"}).id 15 | $members = Get-MgRoleManagementDirectoryRoleAssignment ` 16 | | ?{$_.RoleDefinitionId -eq $roleid} 17 | foreach ($member in $members) { 18 | $u = Get-MgUser -UserId $member.PrincipalId 19 | if ($u.OnPremisesSyncEnabled -eq $true) { 20 | "Synced user is member of Global Admins! 
$($u.UserPrincipalName)" | write-host -ForegroundColor Red 21 | $flag = 1}} 22 | if (!($flag)) {Write-Host "Pass" -ForegroundColor Green} 23 | ``` 24 | 25 | Find privileged users in Azure AD by checking assignments for Azure AD roles: 26 | ``` 27 | Connect-MgGraph -Scopes RoleManagement.Read.All 28 | $roles = @() # populate with the Azure AD role display names to check 29 | $privids = @() 30 | foreach ($role in $roles) { 31 | $roleid = $(Get-MgRoleManagementDirectoryRoleDefinition ` 32 | | ?{$_.DisplayName -eq $role}).id 33 | $privids += $(Get-MgRoleManagementDirectoryRoleAssignment ` 34 | | ?{$_.RoleDefinitionId -eq $roleid}).PrincipalId 35 | } 36 | ``` 37 | 38 | Investigate role assignments to highly privileged Azure RBAC roles referencing the list in this section: 39 | ``` 40 | Connect-AzAccount 41 | $roles = az role assignment list --subscription 'SubscriptionId' 42 | ``` 43 | ### 1.2.1 Implement App Based Permissions Per Enterprise 44 | 45 | View the on-premises synchronization status with Microsoft Graph PowerShell: 46 | ``` 47 | Connect-MgGraph 48 | $(Get-MgOrganization).AdditionalProperties.onPremisesSyncStatus.state 49 | ``` 50 | 51 | Use MS Graph PowerShell to view all dynamic security groups: 52 | ``` 53 | Connect-MgGraph -Scopes Group.Read.All 54 | Get-MgGroup | ?{$_.GroupTypes -eq "DynamicMembership"} 55 | ``` 56 | 57 | ### 1.2.2 Rule Based Dynamic Access Pt 1 58 | 59 |  (1) Sign in to the Entra Portal > Azure Active Directory > Applications > Enterprise Applications.
60 |  (2) Copy the Application ID for an enterprise application.
61 |  (3) Using this Application ID, find enabled Conditional Access Policies with MS Graph PowerShell: 62 | ``` 63 | Connect-MgGraph -Scopes Policy.Read.All 64 | Select-MgProfile beta 65 | $appid = "02750a1b-9140-48e0-abee-e6b708e4765d" 66 | Get-MgIdentityConditionalAccessPolicy | ?{($_.state -eq 'enabled') -and ($_.Conditions.Applications.IncludeApplications -contains $appid -or ($_.Conditions.Applications.IncludeApplications -eq 'All' -and ($_.Conditions.Applications.ExcludeApplications -notcontains $appid)))} 67 | ``` 68 | 69 | All possible applications use JIT/JEA permissions for admin users. Create Privileged Access Groups for any administrative App Roles using MS Graph PowerShell: 70 | ``` 71 | New-MgGroup -DisplayName "MyPrivAccessGroup" -MailNickname "MyPrivAccessGroup" -SecurityEnabled -MailEnabled:$false -IsAssignableToRole 72 | ``` 73 | 74 | ### 1.2.3 Rule Based Dynamic Access Pt 2 75 | 76 | i. Application access requires assignment by Azure AD Security Group, Dynamic Security Group, or Entitlements Management access package.
77 | ii. Apps that support Application Roles are configured to dynamically assign access based on user attributes or group membership.
78 | iii. Baseline Conditional Access Policy set includes the following rule types:
79 |  (1) All users, all applications, MFA (or Authentication Strength)
80 |  (2) All users, all applications, high sign-in or user risk, block
81 |  (3) All users, [select applications], require Hybrid Azure AD Join or Compliant device.
82 | iv. Custom Security Attributes (preview) allow defining attribute sets and assigning values to users for authorization within applications. These security attributes can also be assigned to groups and used within Conditional Access to scope a policy to an application. 83 | 84 | ### 1.2.4 Enterprise Gov’t Roles and Permissions Pt 1 85 | 86 | Review the available attributes for an Azure AD user: 87 | ``` 88 | Connect-MgGraph -Scopes User.Read.All 89 | $upn = "user@domain.com" 90 | Get-Mguser -UserId $upn | fl 91 | ``` 92 | 93 | Azure AD Connect Sync or Azure AD Connect Cloud Sync can sync users from on-premises Active Directory Domain Services. To view the synchronized attributes, run the following in MS Graph PowerShell: 94 | ``` 95 | Get-Mguser -UserId $upn | select OnPremises* 96 | ``` 97 | 98 | Use Microsoft Graph PowerShell to assign a user and role to an application based on custom security attributes: 99 | ``` 100 | # Assign the values to the variables 101 | $userId = "" 102 | $app_name = "" 103 | $app_role_name = "" 104 | $sp = Get-MgServicePrincipal -Filter "displayName eq '$app_name'" 105 | 106 | # Get the user to assign, and the service principal for the app to assign to 107 | $params = @{ 108 | "PrincipalId" =$userId 109 | "ResourceId" =$sp.Id 110 | "AppRoleId" =($sp.AppRoles | Where-Object { $_.DisplayName -eq $app_role_name }).Id 111 | } 112 | 113 | # Assign the user to the app role 114 | New-MgUserAppRoleAssignment -UserId $userId -BodyParameter $params | 115 | Format-List Id, AppRoleId, CreationTime, PrincipalDisplayName, 116 | PrincipalId, PrincipalType, ResourceDisplayName, ResourceId 117 | ``` 118 | 119 | ### 1.2.5 Enterprise Gov't Roles and Permissions Pt 2 120 | 121 | Use MS Graph PowerShell to check for federated domains: 122 | ``` 123 | $fdomains = @($(Get-MgDomain | ?{$_.AuthenticationType -eq "Federated"}))[0] 124 | if ($fdomains) { 125 | $sts = Get-MgDomainFederationConfiguration -DomainId $fdomains.id; 126 | Write-Host "Federated 
domain ($($fdomains.id)) is using STS $($sts.IssuerUri)." -ForegroundColor Red 127 | } else { 128 | Write-Host "Pass: No federated domains found." -ForegroundColor Green 129 | } 130 | ``` 131 | 132 | 133 | 134 | 135 | 136 | -------------------------------------------------------------------------------- /DeviceTagging.md: -------------------------------------------------------------------------------- 1 | ``` 2 | # Install necessary modules 3 | if (-not (Get-Module -ListAvailable -Name Microsoft.Graph.Authentication)) { 4 | Install-Module Microsoft.Graph.Authentication -Scope CurrentUser -Force 5 | } 6 | if (-not (Get-Module -ListAvailable -Name AzureAD)) { 7 | Install-Module AzureAD -Scope CurrentUser -Force 8 | } 9 | 10 | # Import modules 11 | Import-Module Microsoft.Graph.Authentication 12 | Import-Module AzureAD 13 | 14 | # Connect to Microsoft Graph using device authentication 15 | Connect-MgGraph -Scopes Application.Read.All, AppRoleAssignment.ReadWrite.All -UseDeviceAuthentication 16 | 17 | # Connect to Azure AD module (required for Get-AzureADServicePrincipal) 18 | Connect-AzureAD 19 | 20 | # Define the name of the Managed Identity 21 | $miName = "MDEDeviceTaggingtest" 22 | Write-Host "Searching for Managed Identity: $miName..." 23 | 24 | # Attempt to retrieve the Managed Identity (Service Principal) 25 | $managedIdentity = Get-AzureADServicePrincipal -Filter "displayName eq '$miName'" 26 | 27 | # Fallback: Ask for ObjectId if not found 28 | if ($null -eq $managedIdentity) { 29 | $miObjectId = Read-Host -Prompt "Managed Identity not found. 
Enter ObjectId manually:" 30 | } else { 31 | $miObjectId = $managedIdentity.ObjectId 32 | } 33 | 34 | # The app ID of the Microsoft Graph API where we want to assign the permissions 35 | $appId = "fc780465-2017-40d4-a0c5-307022471b92" 36 | $permissionsToAdd = @("Machine.ReadWrite.All") 37 | 38 | # Get the service principal for the target application 39 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 40 | 41 | # Assign each required permission to the Managed Identity 42 | foreach ($permission in $permissionsToAdd) { 43 | Write-Host "Assigning permission: $permission" 44 | 45 | # Find the matching AppRole in the service principal 46 | $role = $app.AppRoles | Where-Object { $_.Value -eq $permission -and $_.AllowedMemberTypes -contains "Application" } | Select-Object -First 1 47 | 48 | if ($null -eq $role) { 49 | Write-Warning "Permission $permission not found in the application's AppRoles." 50 | continue 51 | } 52 | 53 | # Assign the role to the managed identity 54 | New-AzureADServiceAppRoleAssignment -ObjectId $miObjectId ` 55 | -PrincipalId $miObjectId ` 56 | -ResourceId $app.ObjectId ` 57 | -Id $role.Id 58 | } 59 | 60 | # Disconnect sessions 61 | Disconnect-MgGraph 62 | Disconnect-AzureAD 63 | ``` 64 | -------------------------------------------------------------------------------- /EntraIDAssessment.md: -------------------------------------------------------------------------------- 1 | ## Microsoft Entra ID Assessment - Azure Monitor Agent 2 | 3 |
Prerequisites. 4 |

5 | 6 | ## Getting Started - Info Call Out! 7 | Work with your CSAM on the [Getting Started w/ On-Demand Assessments](https://learn.microsoft.com/en-us/services-hub/unified/health/getting-started-with-on-demand-assessments). A few call outs that are on that link that should be mentioned here. 8 | Use the following checklist to ensure all steps in this section are completed before moving onto the next section. 9 | 10 | ***Azure Subscription*** - *Assessment person needs OWNER on the subscription and an email associated with that user account.*

11 | ***Services Hub Registration*** - *From prior, CSAM needs to invite that same OWNER w/ an associated email.*

12 | ***Link Azure Subscription and Log Analytics Workspace to Services Hub*** - *You will see this under your account icon>Edit Log Analytics Workspace.*

13 | 14 | You will negate these next two call outs that are on the link above and proceed with the build out.

15 | - Add the assessment(s) in Services Hub 16 | - Provide access to Log Analytics workspace 17 | 18 | ## Begin here! 19 | 20 | 21 | 1. Create Resource Group: 'Assessment'. 22 | 2. Create Log Analytics Workspace in Assessment RG: 'Assessment-LAW'. 23 | 3. Create AzureVM (Server 22): 'Assessment'. 24 | 4. Turn on "Enable System Assigned Managed Identity", while building the VM, under the Management blade. Verify after deployment that it is enabled. 25 | 26 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mgmdidentity.png) 27 | 28 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mgmdidentity2.png) 29 | 30 | 31 |
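If you would rather script steps 1-4 than click through the portal, here is a minimal Az PowerShell sketch. The resource names come from this guide; the region, VM size, and image alias are assumptions - adjust them to your environment:

```powershell
# Sketch only - assumes the Az module is installed and you have run Connect-AzAccount.
New-AzResourceGroup -Name 'Assessment' -Location 'EastUS'
New-AzOperationalInsightsWorkspace -ResourceGroupName 'Assessment' -Name 'Assessment-LAW' -Location 'EastUS'
# -SystemAssignedIdentity enables the system-assigned managed identity at build time,
# matching the "Enable System Assigned Managed Identity" option on the Management blade.
New-AzVM -ResourceGroupName 'Assessment' -Name 'Assessment' -Location 'EastUS' `
    -Image 'Win2022Datacenter' -Size 'Standard_D2s_v3' `
    -Credential (Get-Credential) -SystemAssignedIdentity
```

Either way, confirm after deployment that the VM's Identity blade shows the system-assigned identity as 'On'.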

32 | 33 |
On-Demand Assessment Machine Configuration. 34 |

35 | 36 | ## Machine Configuration 37 | 38 | Log in as local administrator to the virtual machine. 39 | 40 | 1. Verify Endpoints. 41 | 42 | *Domain Environment - Required Azure Service Endpoints* 43 | 44 | | Endpoint | Description | 45 | | :--- | :----: | 46 | |management.azure.com | Azure Resource Manager| 47 | |login.windows.net | Azure Active Directory| 48 | |dc.services.visualstudio.com | Application Insights| 49 | |agentserviceapi.azure-automation.net | Guest Configuration| 50 | |*-agentservice-prod-1.azure-automation.net | Guest Configuration| 51 | |*.his.hybridcompute.azure-automation.net | Hybrid Identity Service| 52 | 53 | 2. Utilize Test-NetConnection. 54 | 55 | ``` 56 | tnc management.azure.com -Port 443; 57 | tnc login.windows.net -port 443; 58 | tnc dc.services.visualstudio.com -port 443; 59 | tnc agentserviceapi.azure-automation.net -port 443 60 | ``` 61 | 3. Patch the OS and reboot. *Disclaimer - .NET 4.8 is required. Server 2022 comes with this framework by default*. 62 | 63 | 4. Create folder directory: 'C:\Assessment\Entra'. 64 | 5. Turn off IE Enhanced Security Configuration. 65 | 6. Start -> Run -> gpedit.msc -> Computer Configuration -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment -> Log on as a batch job -> Add Administrators. 66 | 7. Start -> Run -> gpedit.msc -> Computer Configuration -> Administrative Templates -> System -> User Profiles -> Do not forcibly unload the users registry at user logoff -> Click Enable. 67 | 8. Run PowerShell as Administrator and install four modules on the Assessment Server - DO NOT MISS THIS STEP! 68 | ``` 69 | Install-Module Microsoft.Graph -Verbose -AllowClobber -Force 70 | Install-Module MSOnline -Verbose -AllowClobber -Force 71 | Install-Module AzureRM -Verbose -AllowClobber -Force 72 | Install-Module AzureADPreview -Verbose -AllowClobber -Force 73 | ``` 74 | 9. Reboot and proceed. 75 | 76 |
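The four `tnc` checks in step 2 can also be run as a loop that flags any blocked endpoint (a sketch; the endpoint list comes from the table above):

```powershell
$endpoints = 'management.azure.com','login.windows.net','dc.services.visualstudio.com','agentserviceapi.azure-automation.net'
foreach ($e in $endpoints) {
    # TcpTestSucceeded is $true only when the TCP handshake on port 443 completes
    $r = Test-NetConnection -ComputerName $e -Port 443 -WarningAction SilentlyContinue
    if ($r.TcpTestSucceeded) { Write-Host "OK: ${e}:443" } else { Write-Warning "Blocked: ${e}:443" }
}
```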

77 | 78 |
Services Hub Configuration. 79 |

80 | 81 | ## Services Hub Configuration 82 | 83 | 1. Log into Services Hub and add your log analytics workspace. 84 | 85 | 2. Add the Azure AD Assessment. 86 | 87 | 3. Add the VM and the assessment path you used from the previous step. Installation will begin. 88 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/entraassessment.png) 89 | 90 | 4. The installation creates a Data Collection Rule, named 'Azure DCR Rule'. 91 | 92 | 5. Verify you see AzureAssessment, AssessmentPlatform AND AzureMonitorWindowsAgent 93 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/assessmentextension.png) 94 | 95 | 6. Take note and if you see the extensions are out of date, STOP and update (select extensions what need updating and click update). Updates available will look like below, pay close attention to what version is available and use that number to replace the code below. 96 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/assessmentupdate2.png) 97 | 98 | EXAMPLE code is below, if you want/have to do manually. You must first uninstall the extension then install using Azure PowerShell CLI. 99 | ``` 100 | Set-AzVMExtension -ResourceGroupName "Assessment" ` 101 | -VMName "Assessment" ` 102 | -Name "AssessmentPlatform" ` 103 | -Publisher "Microsoft.ServicesHub" ` 104 | -ExtensionType "AssessmentPlatform" ` 105 | -TypeHandlerVersion "4.5" 106 | 107 | Set-AzVMExtension -ResourceGroupName "Assessment" ` 108 | -VMName "Assessment" ` 109 | -Name "AzureAssessment" ` 110 | -Publisher "Microsoft.ServicesHub" ` 111 | -ExtensionType "AzureAssessment" ` 112 | -TypeHandlerVersion "1.9" 113 | ``` 114 | 115 | 8. After DCR kick off from Step #3 a new folder will be created on C:\ called 'ODA'. Leave this folder alone as it is reserved for system. 116 | 117 |
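To see which assessment extensions and handler versions are currently installed (useful when deciding whether the manual update in step 6 is needed), a short Az PowerShell sketch follows - the resource names are the ones used in this guide:

```powershell
# Lists the ServicesHub and Azure Monitor extensions with their handler versions
Get-AzVMExtension -ResourceGroupName 'Assessment' -VMName 'Assessment' |
    Select-Object Name, Publisher, ExtensionType, TypeHandlerVersion, ProvisioningState
```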

118 | 119 | 120 |
Create Assessment Application. 121 |

122 | 123 | ## Create 'Microsoft Assessment' Application 124 | 125 | 1. Verify that you have the Azure subscription Owner role on the Azure subscription on the same email ID that you use to login into Services Hub. Review [Linking Permissions](https://learn.microsoft.com/en-us/services-hub/unified/health/assessments-troubleshooting-ama#linking-and-permissions). 126 | 127 | 2. Create Application, reviewed [here](https://learn.microsoft.com/en-us/services-hub/unified/health/getting-started-entraid#setup-the-microsoft-microsoft-entra-id-assessment-on-the-data-collection-machine). Authentication to Entra as Global Administrator*- you will be prompted for MFA and after setup, you must consent to the application permissions. See application permissions that will be delegated [here](https://learn.microsoft.com/en-us/services-hub/unified/health/getting-started-microsoftassessmentapplication/permission-requirements). 128 | 3. 129 | 4. When prompted for the Subscription boundary. Chose only the subscription where the assessment VM resides. This step sets READER permission for the application service principal on the subscription. 130 | 131 | ``` 132 | New-MicrosoftAssessmentsApplication -allowclobber -force 133 | ``` 134 | If there are URL restrictions in place in order to correctly setup the assessment application, you will need to ensure you whitelist the 135 | following URLs: 136 | 137 | | Endpoint | Port | 138 | | :--- | :----: | 139 | | aadcdn.msauth.net|443| 140 | az818661.vo.msecnd.net|443| 141 | c.urs.microsoft.com|443| 142 | go.microsoft.com|443| 143 | iecvlist.microsoft.com|443| 144 | ieonline.microsoft.com|443| 145 | login.microsoftonline.com|443| 146 | oneget.org|443| 147 | psg-prod-eastus.azureedge.net|443| 148 | www.powershellgallery.com|443| 149 | 150 |

151 | 152 |
Create Scheduled Task. 153 |

154 | 155 | 1. Create Scheduled Task - run this task as the local admin with computername\localadmin as shown below. 156 | ``` 157 | Add-AzureAssessmentTask -WorkingDirectory C:\Assessment\Entra -ScheduledTaskUsername Assessment\xadmin 158 | ``` 159 | 2. Verify the Scheduled Task 160 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/scheduledtask.png) 161 | 162 | 3. Right-Click the ST and click run. Adjust or remove schedule if needed. VM should be powered off between assessments. 163 | 164 | 4. After the ST has been kicked off. The C:\Assessment\Entra folder will being to populate with a numerical folder. 165 | 166 |

167 | 168 |
Verifying Data. 169 |

170 | 171 | 172 | ## Verifying Data to the Log Analytics Workspace ## 173 | 174 | ``` 175 | //Viewing Failed Recommendation Results 176 | AzureAssessmentRecommendation 177 | | where TimeGenerated > ago (30d) //set time 178 | | where RecommendationResult contains '' 179 | | summarize count() by RecommendationResult, ['Week Starting']=startofweek(TimeGenerated) 180 | | sort by ['Week Starting'] desc, RecommendationResult asc 181 | ``` 182 | 2. Once confirmed, you will see data trickle in over the next few hours populate in ServicesHub. 183 | 184 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/assessmentshcomplete.png) 185 | 186 |

187 | 188 | 189 |
Summary and Delivery. 190 |

191 | 192 | In ServicesHub, go to the Primary Navigation and select IT Health, then choose On-Demand Assessments. On the Assessments page, select the assessment and then click Download Executive Summary or Download All Recommendations to view the reports. Provide these to your CSA for the final review and delivery of the Assessment, along with any remediation tasks that will be assigned to you. 193 | 194 |

195 | 196 |
Troubleshooting. 197 |

198 | 199 | https://learn.microsoft.com/en-us/services-hub/unified/health/assessments-troubleshooting-ama#linking-and-permissions 200 | 201 | As of 11/7/2024, after upgrading the extensions to 4.5 and 1.9 there is a known issue of the AzureAssessment.execpkg being removed from the C:\ODA\Pakages folder. Before proceeding, please do the following. 202 | 203 | 1. Copy the AzureAssessment.execpkg file from "C:\Packages\Plugins\Microsoft.ServicesHub.AzureAssessment\1.9\bin" to "C:\ODA\Packages" 204 | 2. Proceed once confirmed you have copied this file. Again, COPY not CUT. 205 | 206 | 6. Install the Azure Monitor Agent Extension on the newly created VM (this can be seen from the Extensions blade on the VM). Run the below command from the Azure Portal PowerShell and verify. 207 | 208 | **!!DO NOT MISS THIS STEP!!** 209 | 210 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/amaassessment.png) 211 | ``` 212 | Connect-AzAccount -UseDeviceAuthentication 213 | Set-AzVMExtension -Name AzureMonitorWindowsAgent -ExtensionType AzureMonitorWindowsAgent -Publisher Microsoft.Azure.Monitor -ResourceGroupName Assessment -VMName Assessment -Location EastUS -TypeHandlerVersion 1.0 -EnableAutomaticUpgrade $true 214 | ``` 215 | 216 | ``` 217 | Clear-MicrosoftAssessmentsApplication -IncludeAADApplication $true 218 | ``` 219 |

220 | 221 | 222 | 223 | 224 | -------------------------------------------------------------------------------- /GenAI Monitoring Made Easy with Microsoft Solutions.md: -------------------------------------------------------------------------------- 1 | ## GenAI Monitoring Made Easy with Microsoft Solutions 2 | 3 | In this blog post, we'll explore how to use Kusto Query Language (KQL) and Azure Policy to block the creation of Generative AI (GenAI) resources in Azure. We'll also cover how to monitor creation events through KQL and block access to unauthorized GenAI resources. This guide will provide a detailed overview of the steps involved, ensuring your Azure environment remains secure and compliant. 4 | We'll reference specific parts of this [article](https://techcommunity.microsoft.com/blog/microsoftthreatprotectionblog/get-visibility-into-your-deepseek-use-with-defender-for-cloud-apps/4372520) to provide a comprehensive understanding of the process. 5 | -------------------------------------------------------------------------------- /IdentityGRC.md: -------------------------------------------------------------------------------- 1 | ## Identity - Governance, Risk and Compliance ## 2 | 3 | Have you ever wanted a way for your folks to query the Azure Active Directory log data without having a complimentary AAD Role? The following solution, you will create a custom RBAC role on the Resource Group where the Log Analytics Workspace resides. This introduces least privilege without the need for an AAD role and allows the users to query, using KQL and to create custom workbooks. The only caveat is that the workbooks for identity in the AAD and Conditional Access blades in the Entra portal are based off an AAD role. You are left with creating custom kql's, which is fine but NO workbooks for your GRC users to query using the built-in AAD workbooks because you do not have that role now. 
Let me start by showing you how to set this up and get your folks querying the data; in part 2, I will show the workbooks for identity. 4 | 5 | *Pre-requisites* - Create an AAD Security Group labeled 'Custom - AAD Logs Reader' and populate it with your Identity GRC users. 6 | *Pre-requisites* - Configure the Azure Active Directory Diagnostic setting and send logs to a Log Analytics Workspace. Recommended to send to your Sentinel LAW. 7 | 8 | ### Create the custom RBAC role on the Resource Group where the LAW resides (you cannot create a custom role at the LAW, yet). ### 9 | 10 | #### 1. At the ResourceGroup (IAM) where your Log Analytics Workspace resides, add a custom role. 11 | 12 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac1.png) 13 | 14 | #### 2. Name the Role: Custom - AAD Logs Reader. 15 | 16 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac2.png) 17 | 18 | #### 3. Under JSON, click 'edit' and paste the below code into the actions brackets and hit 'save'. 
19 | 20 | ``` 21 | 22 | "actions": [ 23 | "Microsoft.OperationalInsights/workspaces/read", 24 | "Microsoft.OperationalInsights/workspaces/query/read", 25 | "Microsoft.OperationalInsights/workspaces/query/SigninLogs/read", 26 | "Microsoft.OperationalInsights/workspaces/query/AuditLogs/read", 27 | "Microsoft.OperationalInsights/workspaces/query/AADManagedIdentitySignInLogs/read", 28 | "Microsoft.OperationalInsights/workspaces/query/AADNonInteractiveUserSignInLogs/read", 29 | "Microsoft.OperationalInsights/workspaces/query/AADProvisioningLogs/read", 30 | "Microsoft.OperationalInsights/workspaces/query/AADRiskyServicePrincipals/read", 31 | "Microsoft.OperationalInsights/workspaces/query/AADRiskyUsers/read", 32 | "Microsoft.OperationalInsights/workspaces/query/AADServicePrincipalRiskEvents/read", 33 | "Microsoft.OperationalInsights/workspaces/query/AADServicePrincipalSignInLogs/read", 34 | "Microsoft.OperationalInsights/workspaces/query/AADUserRiskEvents/read", 35 | "Microsoft.OperationalInsights/workspaces/query/ADFSSignInLogs/read", 36 | "Microsoft.OperationalInsights/workspaces/query/NetworkAccessTraffic/read" 37 | ], 38 | 39 | ``` 40 | 41 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac3.png) 42 | 43 | #### 4. Review and Create. 44 | 45 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac4.png) 46 | 47 | #### 5. At the ResourceGroup (IAM), add role assignment. 48 | 49 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac5.png) 50 | 51 | #### 6. Find the 'Custom - AAD Logs Reader' Role and hit Next. 52 | 53 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac6.png) 54 | 55 | #### 7. On the Members page, ADD the 'Custom - AAD Logs Reader' security group and click, 'review and assign'. 56 | 57 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac7.png) 58 | 59 | ### Verify user access with Identity GRC user. ### 60 | 61 | #### 1. 
Log into Azure>Monitor and then Logs. 62 | 63 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac8.png) 64 | 65 | #### 2. Select the scope of the Log Analytics Workspace which you have access too and click 'apply'. 66 | 67 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac9.png) 68 | 69 | #### 3. Verify you can now query the AAD tables as listed above. Example is below. 70 | 71 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customrbac10.png) 72 | 73 | Where does this leave you? Well, I would start to use Matt Zorichs queries [here](https://github.com/reprise99/Sentinel-Queries/tree/main/Azure%20Active%20Directory) and review his write [up](https://learnsentinel.blog/2022/06/21/kql-lessons-learnt-from-365daysofkql/). You can look through some of the examples here to help get you started. I would also deploy his [365daysofKQL](https://github.com/reprise99/Sentinel-Queries/tree/main/Query%20Pack). The Microsoft Sentinel community has a ton of precanned queries as well.. 74 | 75 | Next up, head on over to the identity [workbooks](https://github.com/Cyberlorians/Articles/blob/main/IdentityGRCWorkbooks.md) section as a caveat to this article. *This is currently under development*. 76 | 77 | 78 | 79 | 80 | -------------------------------------------------------------------------------- /IdentityGRCWorkbooks.md: -------------------------------------------------------------------------------- 1 | ## Identity - Governance, Risk and Compliance - Workbooks! ## 2 | 3 | This is the contiuation of the pre-cursor, [Identity - GRC](https://github.com/Cyberlorians/Articles/blob/main/IdentityGRC.md). These workbooks were created and tweaked from the Entra AAD and Conditional Access portals and from others around the community. A common thread you will see is I have added a LAW section to all workbooks as a drop-down selection. 
Adding this option makes it easier to deploy the workbook in a custom identity RBAC as we did in part 1, or to deploy it to your Sentinel or Azure Monitor Workbook collections. The point is, deploy the workbooks wherever you have proper permissions and you are all set. 4 | 5 | You will use the Azure Monitor>Workbooks section to import these or any workbooks. Remember, this is a continuation of the least privilege RBAC from part 1. 6 | 7 | *Disclaimer* - Give access to the same security group that was created in the first article. I.e., grant 'Workbook Contributor' to the 'Custom - AAD Logs Reader' security group on a Resource Group where the users will be saving these workbooks. I chose to create a Resource Group called Workbooks and set the permissions that way. 8 | 9 | ### Conditional Access Trends and Changes Workbook #### 10 | 11 | This is kind of a unique workbook. It was built on a few customer asks, Matt Zorich's [CAP Insights](https://learnsentinel.blog/2022/05/09/azure-ad-conditional-access-insights-auditing-with-microsoft-sentinel/) (please follow his document to see how you can leverage the workbook further), AND the insights in the Conditional Access Entra blade. So, a 3-in-1 whammy of a workbook. The other workbook included is partially from Daniel Chronlund [here](https://danielchronlund.com/category/conditional-access/), with a new caveat of adding a text box for you to type in your exclusion group name to view any changes that have taken place. The CAP Trends will allow you to see your entire environment and what's going on, especially from a failed state or new creation (threat hunting). These workbooks can be used for threat hunting anomalies too. 12 | 13 | #### 1 - Navigate to Azure Monitor>Workbooks, select New. 
14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customwbs1.png) 16 | 17 | #### 2 - Paste the JSON file [here](https://github.com/Cyberlorians/Workbooks/blob/main/ConditionalAccessTrends.json) into the workbook template and click 'apply'. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customwbs2.png) 20 | 21 | #### 3 - Set the workspace to your dedicated RBAC scoped permission and SAVE the workbook to the RG you have 'Workbook Contributor' permission on. 22 | 23 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customwbs3.png) 24 | 25 | 26 | ### Azure Active Directory Maintenance Workbook #### 27 | 28 | Maintaining a well-managed AzureAD tenant w/ KQL. A lot of this is based off Matt Zorich's AADSpringCleaning (sorry, Matt, I stole your data) and a few other tweaks from the field too. Deploy the workbook and follow along [here](https://learnsentinel.blog/2022/03/16/maintaining-a-well-managed-azure-ad-tenant-with-kql/). This workbook should be used monthly or quarterly, not just every spring. It will help keep your tenant identities in check. Another great caveat with this is the 'Conditional Access Trends Workbook'. This workbook will be continually updated by the Cyberlorians, as we are already working on revision 2. 29 | 30 | #### 1 - Navigate to Azure Monitor>Workbooks, select New. (Same steps as above). 31 | 32 | #### 2 - Paste the JSON file [here](https://github.com/Cyberlorians/Workbooks/blob/main/AzureADMaintenace.json) into the workbook template and click 'apply'. 33 | 34 | #### 3 - Set the workspace to your dedicated RBAC scoped permission and SAVE the workbook to the RG you have 'Workbook Contributor' permission on. 
35 | 36 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/maintenacewb.png) 37 | 38 | ### AzureAD Signins and Audits ### 39 | 40 | #### 1 - In progress 41 | -------------------------------------------------------------------------------- /MDI-Hardened-Environments.md: -------------------------------------------------------------------------------- 1 | ## Microsoft Defender for Identity Hardened Environments ## 2 | 3 | This comprehensive installation guide for Microsoft Defender for Identity is specifically designed for deployment in a hardened (STIG-compliant) environment. Based on my experience with deployments, the following steps are arranged strategically: ensure that all object monitoring, SACL (System Control Access List) and service accounts are configured before proceeding with the installation of the sensor. The goal of this document is strictly around Domain Controllers and sensor additions. ADFS servers should be deprecated and moved to Entra CBA. CA services can be followed in the add-on section. 4 | 5 | 6 |
Prerequisites. 7 |

8 | 9 | The [prerequisites](https://docs.microsoft.com/en-us/defender-for-identity/prerequisites) are straightforward and have been updated. Please read them thoroughly. For customers in US Government, here is your [doc](https://docs.microsoft.com/en-us/defender-for-identity/us-govt-gcc-high). *Disclaimer* - depending on your govt environment, you may have to allow *.atp.azure.us through your proxy instead of *.atp.azure.com, just be aware. 10 | 11 | Test your prerequisites [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/prerequisites#test-your-prerequisites). 12 | 13 | Plan for capacity [here](https://docs.microsoft.com/en-us/defender-for-identity/capacity-planning). 14 | 15 |
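The 'Test your prerequisites' link above points to Microsoft's readiness script; a hedged sketch of running it is below (the local download path is an assumption - follow the linked doc for the authoritative steps):

```powershell
# From an elevated PowerShell on a Tier0 asset, after downloading
# Test-MdiReadiness.ps1 from the prerequisites doc linked above
Set-Location 'C:\Temp'
.\Test-MdiReadiness.ps1
```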

16 | 17 |
Windows Event Collecting. 18 |

19 | 20 | Please review [Configure Windows Event Collection](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection). 21 | 22 | In January 2024, Microsoft introduced a streamlined method for deploying 'Audit Policies' for Microsoft Defender for Identity using the PowerShell module 'DefenderForIdentity'. An overview is posted [here](https://techcommunity.microsoft.com/t5/microsoft-defender-xdr-blog/introducing-the-new-powershell-module-for-microsoft-defender-for/ba-p/4028734). This module simplifies the 'Auditing' setup compared to manual configuration. 23 | 24 | For improved clarity, a detailed guide has been created by [MSFTAdvocate](https://www.msftadvocate.com/configure-audit-policies-for-microsoft-defender-for-identity/). Please review this resource before proceeding to the next steps to ensure a coherent understanding of the process. 25 | 26 | In order to proceed, please install the module (Install-Module DefenderForIdentity) OR manually download it from [PSGallery](https://www.powershellgallery.com/packages/DefenderForIdentity/1.0.0.0) on the Domain Controller OR on another Tier0 asset server. 27 | 28 | ***Note: The DefenderForIdentity module requires the ActiveDirectory and the GroupPolicy modules to be installed on the server. It is also advised against modifying default Group Policy Objects (GPOs), such as Default Domain Controllers or Default Domain GPOs. Instead, each operating system should adhere to its own hardened baseline, incorporating appropriate WMI filters. This also applies to Domain Controllers, which should use dedicated GPOs. As an administrator, it is crucial to ensure that the Group Policy Object precedence is correctly configured and functioning as intended.*** 29 | 30 | ***Note: When using the DefenderForIdentity module, it will create separate Group Policy Objects (GPOs). 
It is advisable to leave these policies unchanged, regardless of your existing baselines.*** 31 | 32 | 33 | **1** - *Set Domain Controller Advanced Audit Policy.* Review [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/configure-windows-event-collection#configure-auditing-for-domain-controllers). 34 | ``` 35 | Set-MDIConfiguration -Mode Domain -Configuration AdvancedAuditPolicyDCs 36 | ``` 37 | 38 | **2** - *Set Domain Controller NTLM Auditing.* Review [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/configure-windows-event-collection#configure-ntlm-auditing). 39 | ``` 40 | Set-MDIConfiguration -Mode Domain -Configuration NTLMAuditing 41 | ``` 42 | 43 | **3** - *Configure Domain Object Auditing.* Review [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/configure-windows-event-collection#configure-domain-object-auditing). 44 | ``` 45 | Set-MDIConfiguration -Mode Domain -Configuration DomainObjectAuditing 46 | ``` 47 |

48 | 49 |
Configure Auditing on Configuration Container. 50 |

51 | 52 | ***1*** - *Configure auditing on the configuration container.* Review [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/configure-windows-event-collection#configure-auditing-on-the-configuration-container). 53 | 54 | 55 | ``` 56 | Set-MDIConfiguration -Mode Domain -Configuration ConfigurationContainerAuditing 57 | ``` 58 | 59 |

60 | 61 |
Configure Directory Service Account. 62 |

63 | 64 | Review [Directory Service Account Recommendations](https://docs.microsoft.com/en-us/defender-for-identity/directory-service-accounts). 65 | 66 | ***Note: For optimal security, it is recommended to use a Group Managed Service Account (gMSA).*** 67 | 68 | 69 | **1** - *Create Sensor Group.* 70 | ``` 71 | $SensorGroup = 'MDISensors' 72 | $SensorGroupDesc = 'Members are allowed by the MDI gMSA attribute -PrincipalsAllowedToRetrieveManagedPassword.' 73 | New-ADGroup -Name $SensorGroup -Description $SensorGroupDesc ` 74 | -Path "OU=Groups,OU=Tier0,DC=gcccyberlorian,DC=net" ` 75 | -GroupScope 'Universal' ` 76 | -GroupCategory 'Security' 77 | ``` 78 | 79 | **2** - *Create Group Managed Service Account.* 80 | ``` 81 | $Identity = 'MDIgMSA' #The name of the gMSA to be created 82 | $Description = "MDI group managed service account" 83 | $DNS = 'MDIgMSA.gcccyberlorians.net' #The gMSA DNS hostname 84 | $Principal = Get-ADGroup $SensorGroup #Group set as -PrincipalsAllowedToRetrieveManagedPassword for the MDI sensors. 85 | $Kerb = 'AES128,AES256' #2016 and above OS STIG level - verify the encryption used in your env. 
86 | New-ADServiceAccount -Name $Identity ` 87 | -Description $Description ` 88 | -DNSHostName $DNS ` 89 | -ManagedPasswordIntervalInDays 30 ` 90 | -PrincipalsAllowedToRetrieveManagedPassword $Principal ` 91 | -Enabled $True ` 92 | -KerberosEncryptionType $Kerb ` 93 | -PassThru 94 | ``` 95 | **3** - *Set all Domain Controllers to be members of Sensor Group.* 96 | ``` 97 | 98 | $sourceGroupName = "Domain Controllers" 99 | $targetGroupName = $SensorGroup 100 | 101 | # Retrieve the distinguished name (DN) of the source and target groups 102 | $sourceGroup = Get-ADGroup -Identity $sourceGroupName 103 | $targetGroup = Get-ADGroup -Identity $targetGroupName 104 | 105 | if ($sourceGroup -and $targetGroup) { 106 | # Get all members of the source group 107 | $members = Get-ADGroupMember -Identity $sourceGroup.DistinguishedName 108 | 109 | # Add each member to the target group 110 | foreach ($member in $members) { 111 | Add-ADGroupMember -Identity $targetGroup.DistinguishedName -Members $member.SamAccountName 112 | } 113 | 114 | Write-Output "All members of '$sourceGroupName' have been added to '$targetGroupName'." 115 | } else { 116 | Write-Output "One or both of the specified groups could not be found." 117 | } 118 | ``` 119 | 120 | **4** - *Set gMSA $Identity with [permission](https://learn.microsoft.com/en-us/defender-for-identity/deploy/create-directory-service-account-gmsa#verify-that-the-gmsa-account-has-the-required-rights).* 121 | 122 | ***Note: Add this to the Domain Controller OS-Based STIG, and if using it in conjunction with ADFS/CA, also include it in the ADFS/CA OS-Based STIG. I cannot stress how crucial this step is. In the past, this step was omitted from current documentation, but I am pleased it has now been added. However, it remains an easy oversight. 
Without this in place, nothing will work.*** 123 | 124 | **5** - *Test gMSA 'LogOnAsAService' permission after policy set in Step 4.* 125 | ``` 126 | Get-ADServiceAccount -Identity $Identity -Properties * | select Prin* 127 | Test-ADServiceAccount -Identity $Identity 128 | ``` 129 | 130 |
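With the DefenderForIdentity module from earlier still loaded, you can also validate the account from MDI's perspective. A hedged sketch; `Test-MDIDSA` ships with that module and checks items such as delegation, group membership and password retrieval for a candidate DSA - confirm the cmdlet and its parameters in your module version before relying on it.

```
# Validate the gMSA as a candidate MDI Directory Service Account
Test-MDIDSA -Identity 'MDIgMSA' -Detailed
```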

131 | 132 |
Grant Directory Service Account permissions on all objects (READ). 133 |

134 | 135 | 136 | **1** - *Declare the identity that you want to grant read access to the Deleted Objects container.* 137 | ``` 138 | $Identity = 'MDIgMSA' 139 | ``` 140 | 141 | ***2*** - *Create a group, add the gMSA to it, and configure the permissions for the group.* 142 | ``` 143 | $groupName = 'MDIDeletedObjRead' 144 | $groupDescription = 'Members of this group are allowed to read the objects in the Deleted Objects container in AD' 145 | if(Get-ADServiceAccount -Identity $Identity -ErrorAction SilentlyContinue) { 146 | $groupParams = @{ 147 | Name = $groupName 148 | SamAccountName = $groupName 149 | DisplayName = $groupName 150 | GroupCategory = 'Security' 151 | GroupScope = 'Universal' 152 | Description = $groupDescription 153 | } 154 | $group = New-ADGroup @groupParams -PassThru 155 | Add-ADGroupMember -Identity $group -Members ('{0}$' -f $Identity) 156 | $Identity = $group.Name 157 | } 158 | 159 | # Get the deleted objects container's distinguished name: 160 | $distinguishedName = ([adsi]'').distinguishedName.Value 161 | $deletedObjectsDN = 'CN=Deleted Objects,{0}' -f $distinguishedName 162 | 163 | # Take ownership on the deleted objects container: 164 | $params = @("$deletedObjectsDN", '/takeOwnership') 165 | C:\Windows\System32\dsacls.exe $params 166 | 167 | # Grant the 'List Contents' and 'Read Property' permissions to the user or group: 168 | $params = @("$deletedObjectsDN", '/G', ('{0}\{1}:LCRP' -f ([adsi]'').name.Value, $Identity)) 169 | C:\Windows\System32\dsacls.exe $params 170 | 171 | # To remove the permissions, uncomment the next 2 lines and run them instead of the two prior ones: 172 | # $params = @("$deletedObjectsDN", '/R', ('{0}\{1}' -f ([adsi]'').name.Value, $Identity)) 173 | # C:\Windows\System32\dsacls.exe $params 174 | ``` 175 |
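To confirm the grant took effect, you can dump the container's ACL again. This is a read-only sanity check and re-derives the distinguished name the same way the script does:

```
# Re-read the ACL on the Deleted Objects container; look for the group
# with 'LIST CONTENTS' and 'READ PROPERTY' in the output
$deletedObjectsDN = 'CN=Deleted Objects,{0}' -f ([adsi]'').distinguishedName.Value
C:\Windows\System32\dsacls.exe $deletedObjectsDN
```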

176 | 177 |
Configure SAM-R for lateral movement. 178 |

179 | 180 | ***Note: This is a DENY Group Policy Object to Domain Controllers. On a STIG level, you could add these settings to each OS-based or standalone GPO from the top level down. Plan accordingly for this layout.*** 181 | 182 | ***1*** - *Configure SAM-R required permissions [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/remote-calls-sam#configure-sam-r-required-permissions).* 183 | 184 | ***2*** - *Configure DENY for Domain Controllers on the GPO.* 185 | 186 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/SAMR.png) 187 | 188 |

189 | 190 |
Sensor Installation. 191 |

192 | 193 | ***Note: Before installing the sensor, it is important to add the gMSA (DSA) to the 'Directory Service Accounts' blade in the XDR portal.*** 194 | 195 | ***1*** - *[Configure the gMSA in 365 Defender](https://docs.microsoft.com/en-us/defender-for-identity/directory-service-accounts#configure-directory-service-account-in-microsoft-365-defender).* 196 | 197 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mdigmsa.png) 198 | 199 | ***2*** - *Download the [sensor](https://docs.microsoft.com/en-us/defender-for-identity/download-sensor).* 200 | 201 | ***3*** - *Test connectivity to Defender for Identity (check again). If it fails, refer to [Test Connectivity](https://learn.microsoft.com/en-us/defender-for-identity/deploy/test-connectivity).* 202 | ``` 203 | Test-MDISensorApiConnection 204 | ``` 205 | 206 | ***4*** - *Install the sensor per the [setup](https://learn.microsoft.com/en-us/defender-for-identity/deploy/install-sensor) guidance.* 207 | 208 |
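For scripted deployments, the extracted sensor package also supports a quiet install. A hedged sketch; the access key placeholder below must be replaced with the workspace access key shown on the sensor page in the XDR portal, and the executable name should be verified against your downloaded package:

```
# Run from the folder containing the extracted sensor installer
& ".\Azure ATP Sensor Setup.exe" /quiet NetFrameworkCommandLineArguments="/q" AccessKey="<YourAccessKey>"
```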

209 | 210 |
Additional Scenarios. 211 |

212 | 213 | Continue to follow along with the additional scenarios listed [here](https://learn.microsoft.com/en-us/defender-for-identity/deploy/active-directory-federation-services). 214 | 215 |

216 | 217 | 218 | 219 | 220 | 221 | 222 | 223 | 224 | 225 | -------------------------------------------------------------------------------- /MDI-Hardened.md: -------------------------------------------------------------------------------- 1 | ## Microsoft Defender for Identity - Hardened (STIGGED) Setup. ## 2 | 3 | A lot of the work I do consists of working in hardened security baselines. In short, that means STIGs are pushed via Group Policy to harden the systems. 4 | 5 | ## Microsoft Defender for Identity ## 6 | 7 | The [prerequisites](https://docs.microsoft.com/en-us/defender-for-identity/prerequisites) are pretty straightforward and have been updated. Please read this thoroughly, and for my friends working in US Government, here is your [doc](https://docs.microsoft.com/en-us/defender-for-identity/us-govt-gcc-high). *Disclaimer* - depending on your govt environment, you may have to allow *.atp.azure.us through your proxy instead of *.atp.azure.com, just be aware. 8 | 9 | Plan for capacity [here](https://docs.microsoft.com/en-us/defender-for-identity/capacity-planning). 10 | 11 | 12 | ## [Configure Windows Event Collection](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection) ## 13 | 14 | *Disclaimer* - Huge kudos to Raymond Roethof for allowing me to drop his link for some tidbits as well. His document [here](https://thalpius.com/2022/07/30/microsoft-defender-for-identity-auditing/) outlines all the auditing steps, albeit my article revolves around hardened systems. One awesome tidbit from his is the 4th section, "Object Auditing". This will simplify those GUI steps for you all. Cheers and thanks Raymond. 15 | 16 | For the most part STIGs capture the audit settings, but MDI does call out a bit more, [here](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection). My advice is: do NOT edit default GPOs, whether that be Default Domain Controllers or Default Domain. 
For each OS flavor you should be following its own hardened baseline; the same holds true for a Domain Controller - use dedicated GPOs. The "Configure Windows Event Collection" site is a bit misleading, so I broke it down for you. When you go to edit, DO NOT forget to edit each for success and failures. 17 | 18 | Domain Controllers - Use STIG baseline and follow [doc](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#configure-audit-policies). 19 | 1. On Domain Controllers ONLY - Configure your hardened baseline GPO for EventID 8004, [here](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#event-id-8004). 20 | 2. Additional configuration for LDAP Search Event ID 1644, [here](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#event-id-8004). What is Event ID 1644? See [here](https://github.com/Cyberlorians/uploadedimages/blob/main/eventid1644.png). *Disclaimer* - I would add this registry setting to the OS-based hardened GPO so that it gets added to ALL Domain Controllers. See image below. 21 | 22 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/eventid1644.png) 23 | 24 | ADFS - Use STIG baseline, ADFS [auditing](https://docs.microsoft.com/en-us/windows-server/identity/ad-fs/troubleshooting/ad-fs-tshoot-logging) and follow [ADFS events](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#for-active-directory-federation-services-ad-fs-events). Lastly, [Enable auditing on an ADFS Object](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#enable-auditing-on-an-adfs-object). 25 | 26 | OS flavors and Tiered structures - Use STIG baseline and follow [other events](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#for-other-events). 
27 | 28 | Configure Object Auditing - this needs to be completed for [4662](https://docs.microsoft.com/en-us/defender-for-identity/configure-windows-event-collection#configure-object-auditing) events. *Disclaimer* - follow these steps closely. One tidbit on the first step: when it says 'Clear All', then add Full Control and so on, it will look like the 'Properties' list is empty. Click 'OK' and apply. Go back into the setting you just configured and it will be made clear to you that 'WRITE' permissions are now there. Just wanted to clear that confusion up. Repeat the SAME steps for all 3 audit entries. 29 | 30 | ## [Directory Service Account Recommendations](https://docs.microsoft.com/en-us/defender-for-identity/directory-service-accounts) ## 31 | 32 | The recommendation here is to use a gMSA account. Let's dig into creating that. 33 | 34 | If your domain is NOT using gMSA (Group Managed Service Accounts), you need to create the Key Distribution Services (KDS) Root Key. More info [here](https://docs.microsoft.com/en-us/windows-server/security/group-managed-service-accounts/create-the-key-distribution-services-kds-root-key). This is a prerequisite for using gMSA. If you are already using gMSA, skip this step. 35 | 36 | Domain Group - Create. 37 | Create an Active Directory Security Group and make the MDI member servers members of the group. I created a group called 'MDIGroup'. Why do we do this? If you are planning on protecting Domain Controllers and ADFS Servers with MDI, they need to be members of the same group to allow -PrincipalsAllowedToRetrieveManagedPassword. 38 | 39 | Copy the contents of the script to the first DC you are installing MDI on. Disclaimer - you may or may not need to use the -KerberosEncryptionType flag, but if you are using the 2016+ Domain Controller STIG you will have to on OS 2016-2022. 
40 | ``` 41 | Install-WindowsFeature -Name RSAT-AD-PowerShell #This is already installed if you are on a Domain Controller 42 | 43 | # Run this script 44 | # Filename: MDIgMSA.ps1 45 | # Description: Creates and installs a custom gMSA account for use with Microsoft Defender for Identity. 46 | # 47 | # Declare variables 48 | $Name = 'MDIgMSA' #The name of the gMSA to be created 49 | $Description = "MDI group managed service account" 50 | $DNS = "MDIgMSA.cyberlorians.net" #This is the gMSA DNS hostname 51 | $Principal = Get-ADGroup 'MDIGroup' #AD group created in the DC step, comment out if using 'Domain Controllers' only and uncomment next step. 52 | #$Principal = Get-ADGroup 'Domain Controllers' #Uncomment if just using Domain Controllers and comment out the previous step 53 | $Kerb = 'AES128,AES256' #If using the 2016 STIG and above you have to use AES 54 | 55 | # Create service account in Active Directory 56 | $NewGMSAServiceDetails = @{ 57 | Name = $Name 58 | Description = $Description 59 | DNSHostName = $DNS 60 | ManagedPasswordIntervalInDays = 30 61 | KerberosEncryptionType = $Kerb 62 | PrincipalsAllowedToRetrieveManagedPassword = $Principal 63 | Enabled = $True 64 | } 65 | New-ADServiceAccount @NewGMSAServiceDetails -PassThru 66 | 67 | #Install-ADServiceAccount -Identity 'MDIgMSA' #MSFT Docs call for this piece BUT you do not have to since you will be setting the new gMSA account in a GPO for proper permissions. 68 | Get-ADServiceAccount -Identity 'MDIgMSA' -Properties * | select Prin* 69 | Test-ADServiceAccount -Identity 'MDIgMSA' 70 | ``` 71 | !!Verify!! that the newly created gMSA account has the [log on as a service](https://docs.microsoft.com/en-us/defender-for-identity/directory-service-accounts#verify-that-the-gmsa-account-has-the-required-rights-if-needed) permission to all MDI machines. *Disclaimer* - add this to the Domain Controller OS-Based STIG, and if using in conjunction with ADFS, then add it to the ADFS OS-Based STIG as well. 
I cannot stress how important this step is. In the past, this step was missing from current docs and I am happy that it has been added, but it is still an easy oversight. If this is NOT in place, nothing will work! 72 | 73 | Your last step in the gMSA ladder is to [Configure the gMSA in 365 Defender](https://docs.microsoft.com/en-us/defender-for-identity/directory-service-accounts#configure-directory-service-account-in-microsoft-365-defender). When adding the gMSA account, suffix it with the $ so it matches the SAMAccountName attribute on-prem in AD. 74 | 75 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mdigmsa.png) 76 | 77 | ## [MDI Role Groups](https://docs.microsoft.com/en-us/defender-for-identity/role-groups) ## 78 | 79 | I am not going to cover this in detail, perhaps another article. However, keep the MDI groups protected, carefully. Use Conditional Access Policy to enforce access to the traditional ATP portal, as well as a Privileged Access Group to lock down the groups via nesting. These groups are NOT AAD Roles, so you cannot PIM them by default. Just adding food for thought, but that is not the intent of this article. More on that later. 80 | 81 | ## [Configure SAM-R](https://docs.microsoft.com/en-us/defender-for-identity/remote-calls-sam) ## 82 | 83 | This is another DENY to Domain Controllers. On a STIG level, you could add these [GPO](https://docs.microsoft.com/en-us/defender-for-identity/remote-calls-sam) settings to each OS-Based GPO STIG, or add a top-level GPO down from the root. Read through this page closely, as you are going to have to decide how to approach these rights assignments and how GPO precedence could affect them. I.e., if you are following a correct Tiered Model, putting these SAM-R settings at the root can work. See the DENY below - DENY Read and Apply Group Policy to Domain Controllers. 84 | 85 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/SAMR.png) 86 | 87 | You are on a roll now. 
Final steps are below! 88 | 89 | -Download the [sensor](https://docs.microsoft.com/en-us/defender-for-identity/download-sensor).
90 | -If you are going via a proxy, check the doc [here](https://docs.microsoft.com/en-us/defender-for-identity/configure-proxy).
91 | -After extracting the contents, install Npcap first - DO NOT MISS THIS STEP!
92 | -Install the [sensor](https://docs.microsoft.com/en-us/defender-for-identity/install-sensor), see the Prerequisites. As I stated above, install the Npcap driver first before the sensor install. Otherwise, WinPcap will be installed instead.
93 | 94 | Once installation has completed, check the MDI portal and see the health of your sensor. If you have followed each step, all should be well. 95 | 96 | Now go connect MDI/365 to Sentinel and be complete! 97 | 98 | 99 | 100 | 101 | 102 | 103 | -------------------------------------------------------------------------------- /MISPTISetup.md: -------------------------------------------------------------------------------- 1 | ## MISP Open Source Threat Intelligence Platform to Microsoft Sentinel. Authored by: Matt Larkin and Michael Crane. ## 2 | 3 | After the free LIMO Threat Intelligence feed was announced as End of Life, many of us have been after a replacement. This guide has an associated cost (check the cost workbook), plus a VM cost. However, it will give you an automated process for ingesting TI into your Sentinel instance. See MISP [here](https://www.misp-project.org/). 4 | 5 | 6 | *Pre-req* - Create an Ubuntu VM; I am using Azure for this use case. You don't need anything crazy - keep the cost low for this use case. Being this is owned by the SecOps team, the VM lives within my SecOps subscription in Azure. 7 | 8 | *Disclaimer* - You can use the default account that is created when you stand up the server initially, or create a user called MISP. Regardless, a user account named 'MISP' will be created during the install. I chose to use my default admin account and then change the password later for RDP access. 9 | 10 | ## This installs the MISP TI Platform 11 | 12 | ``` 13 | # Ssh to the new Ubuntu VM. The following commands can be copied and pasted into the ssh session. 14 | 15 | # Update/Upgrade System, if needed. 
16 | sudo apt-get update -y && sudo apt-get upgrade -y 17 | 18 | # Reboot 19 | sudo systemctl reboot 20 | 21 | # Please check the installer options first to make the best choice for your install 22 | wget -O /tmp/INSTALL.sh https://raw.githubusercontent.com/MISP/MISP/2.4/INSTALL/INSTALL.sh 23 | bash /tmp/INSTALL.sh 24 | 25 | # This will install MISP Core - the install will pause to create user MISP; choose yes to run as MISP. 26 | wget -O /tmp/INSTALL.sh https://raw.githubusercontent.com/MISP/MISP/2.4/INSTALL/INSTALL.sh 27 | bash /tmp/INSTALL.sh -c 28 | ``` 29 | 30 | ## Copy the content from the output in a notepad/shared file, temporarily. You need the AuthKey. 31 | 32 | This key can be pulled later by: cat /home/misp/MISP-authkey.txt 33 | 34 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPafterinstall1.png) 35 | 36 | 37 | ``` 38 | # Now, create a local password for the MISP user. I chose to wait until AFTER the script ends. 39 | sudo passwd misp 40 | 41 | # Install Xfce4 Desktop. 42 | sudo DEBIAN_FRONTEND=noninteractive apt-get -y install xfce4 43 | sudo apt install xfce4-session 44 | 45 | # Install XRDP. 46 | sudo apt-get -y install xrdp 47 | sudo systemctl enable xrdp 48 | sudo ufw allow 3389 49 | sudo service xrdp restart 50 | sudo adduser misp ssl-cert 51 | echo xfce4-session >~/.xsession 52 | sudo service xrdp restart 53 | 54 | # Install Firefox Browser - this needs to be installed to configure MISP to Sentinel feeds. 55 | sudo apt install firefox -y 56 | 57 | ``` 58 | 59 | # Configure MISP TI Platform 60 | 61 | 1. RDP to the Ubuntu server with the MISP user created during the MISP install. 62 | 63 | 2. Log into the server as seen below. The default username and password were provided earlier. 64 | 65 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPlogin.png) 66 | 67 | 3. Entering the default password will prompt you to change it. 68 | 69 | 4. After logging in, navigate to 'Sync Actions'>'List Feeds'. 
Select both default feeds, enable and cache them. Once done, click 'Fetch and store all feed data'. These two are present by default; you can add more from the official MISP feeds page [here](https://www.misp-project.org/feeds/). 70 | 71 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPsetup1.png). 72 | 73 | 5. Retrieve your key from earlier. If you forgot it - cat /home/misp/MISP-authkey.txt - hang on to it! 74 | 75 | ## Create AAD App Reg in Azure AD. 76 | 77 | 1. Open the Application Registration Portal and click New registration on the menu bar. 78 | 2. Enter a name, and choose Register; other options can be left with their defaults. 79 | 3. Note down the Application (client) ID and Directory (tenant) ID. You will need to enter these into the script's configuration file. 80 | 4. Under Certificates & secrets, click New client secret, enter a description and click Add. A new secret will be displayed. Copy this for later entry into the script. 81 | 5. Under API permissions, choose Add a permission > Microsoft Graph. 82 | 6. Under Application Permissions, add ThreatIndicators.ReadWrite.OwnedBy. 83 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPsetup2.png). 84 | 85 | ## Enable the Sentinel Connector 86 | Open your Azure Sentinel workspace, click 'Data connectors' and then look for the 'Threat Intelligence Platforms' connection. Open the connector and click Connect. 87 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPsetup3.png) 88 | 89 | ## Setup the script for MISP-Sentinel API calls. RDP as MISP first. 90 | 91 | 1. Run the below script. 92 | 93 | ``` 94 | sudo apt-get install python3-venv 95 | python3 -m venv mispToSentinel 96 | cd mispToSentinel 97 | source bin/activate 98 | git clone https://github.com/microsoftgraph/security-api-solutions 99 | cd security-api-solutions/Samples/MISP/ 100 | pip3 install requests requests-futures pymisp 101 | nano config.py 102 | ``` 103 | 2. 
After opening config.py, edit the file as seen below, replacing the values with your information. Remember, your Auth Key is at: cat /home/misp/MISP-authkey.txt 104 | 105 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mispconfig.png) 106 | 107 | 3. Use CTRL+O to write out and save, and CTRL+X to exit. 108 | 109 | 4. Run the following now to sync the feeds to Sentinel. 110 | ``` 111 | python3 script.py 112 | ``` 113 | Confirm ingestion by navigating to the TI workbook in Sentinel. 114 | 115 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MISPsetup7.png) 116 | 117 | ## Cron job setup. 118 | 119 | Below is a crontab entry example that runs the script every day at midnight. You can use the generator [here](https://crontab-generator.org/). 120 | ``` 121 | 0 0 * * * cd /home/misp/mispToSentinel/security-api-solutions/Samples/MISP && /home/misp/mispToSentinel/bin/python3 script.py >/dev/null 2>&1 122 | ``` 123 | 124 | -------------------------------------------------------------------------------- /MaliciousActivityandSentinelP1.md: -------------------------------------------------------------------------------- 1 | ## Part-1: Malicious traffic in Sentinel. ## 2 | 3 | *Disclaimer* - Per the Executive Order, M-21-31, [page 16](https://www.whitehouse.gov/wp-content/uploads/2021/08/M-21-31-Improving-the-Federal-Governments-Investigative-and-Remediation-Capabilities-Related-to-Cybersecurity-Incidents.pdf) states that NSG flow logs are to be captured and lists the retention settings. If you are using the workbook in Sentinel, look at the (EL0)>IDS/IPS section to view the call out. This entire series follows that mandate as well as others. Don't forget to navigate with the [Maturity Model for Event Log Management (M-21-31) Workbook for Microsoft Sentinel](https://techcommunity.microsoft.com/t5/public-sector-blog/microsoft-sentinel-maturity-model-for-event-log-management-m-21/ba-p/3074336). 4 | 5 | *Pre-req* - Network Watcher and Storage account. 
I personally would keep the Network Watcher and storage account used in this situation under the SOC/Infosec resource group. 6 | 7 | I hope I caught your attention with the title. Now that I have your attention, how do we capture the "bad" traffic hitting our environment? We do this by creating an [NSG](https://docs.microsoft.com/en-us/azure/network-watcher/nsg-flow-logs-policy-portal) (network security group) Flow Log to [Traffic Analytics](https://docs.microsoft.com/en-us/azure/network-watcher/traffic-analytics-policy-portal) and sending them off to Sentinel. This is an important note: you can accomplish NSG Flow Logs & Traffic Analytics via Azure Policy by using **"Configure network security groups to use specific workspace, storage account and flowlog retention policy for traffic analytics"**, so use the latter and call it a day. As pre-reqs, you will need an Azure storage account for the flow logs and to enable [NetworkWatcher](https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-create). 8 | 9 | *Disclaimer* - By default, "Retention (days)" on the Flow logs settings is set to 30 with this new policy. Think about how you want to set this; in this case, because I am going to Sentinel, I set it to 1 day as Sentinel/LAW is my retention. 10 | 11 | ### Configure network security groups to use specific workspace, storage account and flow log retention policy for traffic analytics - by Azure Policy ### 12 | Some quick notes on the image below. The parameters call for "ID" (you can find this ID by going to the resource, Overview tab, Resource JSON, and pulling the ResourceID string). Enter the parameters below and hit next to the Remediation tab. 13 | 1. Parameter 1 - DeployIfNotExists 14 | 2. Parameter 2 - Choose NSG Region. The NSG, Network Watcher and storage account need to be in the same region. 15 | 3. Parameter 3 - Storage ResourceID of the storage account. 16 | 4. Parameter 4 - Unchecking the "only show parameters" will allow you to pick 10 or (default 60). 17 | 5. 
Parameter 5 - Sentinel ResourceID (JSON view) - obtain the resource ID of the Sentinel Log Analytics Workspace. 18 | 6. Parameter 6 - Choose Workspace Region. 19 | 7. Parameter 7 - Sentinel WorkspaceID - this is Sentinel's LAW WorkspaceID, NOT the ResourceID. 20 | 8. Parameter 8 - Set the Network Watcher Resource Group where Network Watcher resides. 21 | 9. Parameter 9 - Set the NAME of the Network Watcher (within the same region in the Network Watcher RG). 22 | 10. Parameter 10 - Set the number of days to retain flow logs. 23 | 24 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/nsgflowlogsupdate.png) 25 | 26 | The next step will consist of creating a remediation task. Caveat, and please take note! In order for this to work, the managed identity has to have permissions at the highest tier of what you are setting and sending to the Sentinel LAW. I.e., do not set at one subscription and expect the policy to write to another subscription where the parameters are set. Once completed, hit review and save. 27 | 1. Create a remediation task 28 | 2. Create a system assigned managed identity 29 | 30 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticsremed.png) 31 | 32 | **Verifying Azure Traffic Analytics has made it to Sentinel.** 33 | 1. Navigate to the Sentinel LAW. 34 | 2. Type "AzureNetworkAnalytics_CL" as seen below. 35 | 3. Please read more detail on the [Traffic Analytics Schema](https://docs.microsoft.com/en-us/azure/network-watcher/traffic-analytics-schema). 36 | 37 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/Azurenetanalyitcsschematable.png) 38 | 39 | **Verifying Azure Traffic Analytics is capturing data and you are receiving proper traffic and visualization.** 40 | 1. For demo purposes, the malicious RED is botnet activity. 
41 | 42 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticsblade.png) 43 | 44 | 45 | 46 | ## [Continue to Part-2: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP2.md). ## -------------------------------------------------------------------------------- /MaliciousActivityandSentinelP2.md: -------------------------------------------------------------------------------- 1 | ## Part-2: Malicious traffic in Sentinel. ## 2 | 3 | In [Part-1](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP1.md) of this series I showed you how to connect NSG Flow Logs with Network Watcher and Traffic Analytics, sending it all to the final destination of Sentinel. OK, so where do we go from here? First, let's focus on what we did in [Part-1](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP1.md). The last picture I displayed was Traffic Analytics, so let us focus there for a minute. Head on over to the Network Watcher we set up, then Traffic Analytics. Disclaimer: I have malicious traffic ONLY because I have a honeypot with allowed traffic - do this at your own risk. However, and a big however, this very well could be you or your customers' environment. Remember in Part-1 when I stated I had a customer with a bad NSG? Well, this is how it went down in real time and I am just playing out the steps for you all, step by step. 4 | 5 | In the image below, from my environment, you see my honeypot allowing botnets into my environment (Malicious = RED) on the overview page. I recommend learning this blade as much as possible, as you can go pretty deep in the weeds. Click on "View Map" under "Deployed Azure Regions" and it will bring up the heat map. 6 | 7 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticsblade2.png) 8 | 9 | This image is the heat map of my malicious activity. 
I can enhance this as much as I would like by clicking on any of the endpoints, connecting the dots and clicking on any of the fields at the top menu. 10 | 11 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticsheatmap.png) 12 | 13 | Head back to the main Traffic Analytics blade now and click on "Malicious IPs", where the red arrow is pointing. 14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficpublicipinfokql.png) 16 | 17 | That last step I had you do has led you to the Log Analytics workspace of Sentinel. Are the pre-reqs starting to make sense? What you are seeing below is a KQL query from the flow logs of 'MaliciousFlow' with IP, PublicIp, Location, ThreatType and Description. Yup! Those are botnets, and NO, they should NOT be overlooked as "Oh, those are just botnets doing probing against our environment". You're going to see why. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/networkwatcherquery.png) 20 | 21 | I recommend, at your own risk, standing up a honeypot in your own environment. Rod Trent had posted this [honeypot](https://thoor.tech/blog/rdp-honeypot-ms-sentinel-workbook/?WT.mc_id=modinfra-00000-rotrent) article on behalf of Pierre Thoor. That has saved me from doing a full write-up, and I thank them. The article is wonderful and you should follow it. However, for my "Malicious traffic in Sentinel" series, just stand up a VM with an NSG allowing any/any in. Keep in mind, though, this series is not only to help you learn in your environment but to take these steps in each part of the series and use them on any production tenant. Flip a quarter and that is your chance of seeing malicious traffic in the production environments you are supporting. 22 | 23 | In conclusion of Part-2, I hope you are starting to get a visualization of where I am about to bring you. Hang on, because it's about to get nuts. 
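If you want to reproduce that screenshot without clicking through the portal, a starting query is below. This is a sketch; `FlowType_s == "MaliciousFlow"` is the documented flow classification, but project only the columns that exist in your Traffic Analytics schema version.

```
AzureNetworkAnalytics_CL
| where SubType_s == "FlowLog" and FlowType_s == "MaliciousFlow"
| project TimeGenerated, SrcIP_s, DestIP_s, DestPort_d, FlowDirection_s, L7Protocol_s
```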
24 | 25 | ## [Continue to Part-3: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP3.md) ## -------------------------------------------------------------------------------- /MaliciousActivityandSentinelP3.md: -------------------------------------------------------------------------------- 1 | ## Part-3: Malicious traffic in Sentinel. ## 2 | 3 | [Part-2](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP2.md) showed us how to bring together Traffic Analytics and malicious activity in Network Watcher, as well as how to send the logs to Sentinel's Log Analytics workspace. Let's start getting into the juicy data. 4 | 5 | What I want you to do next is open your Sentinel workspace, navigate to Workbooks, add "Azure Network Watcher" and save it. 6 | 7 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/azurenetworkwatcher.png) 8 | 9 | Navigate back to the workbook in Sentinel and open it up. I am only going to focus on the malicious IP activity, but please explore this workbook, as it also shows NICs, VMs, traffic flows, attached resources, NSGs being attacked, and more. Navigate to "Malicious Actors" in the workbook. Here we see malicious IPs that were allowed IN (please dig into this workbook and get a feel for it). 10 | 11 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/maliciousactors.png) 12 | 13 | Let's get into Sentinel already! We need a rule to trigger an alert, don't we? Neither this workbook nor Traffic Analytics does that by default. So, I have one ready to [go](https://github.com/Cyberlorians/Sentinel/blob/main/Analytic%20Rules/Custom%20-%20Malicious%20IP%20Allowed%20IN.json) that eliminates any FPs (false positives). How did I start that query rule? Navigate back to "Network Watcher > Traffic Analytics" and click the RED inbound.
14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticskql.png) 16 | 17 | Now we see the basics of [my analytic rule](https://github.com/Cyberlorians/Sentinel/blob/main/Analytic%20Rules/Custom%20-%20Malicious%20IP%20Allowed%20IN.json) and your upcoming analytic rule. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/trafficanalyticskql2.png) 20 | 21 | Let's head to Part-4. 22 | ## [Continue to Part-4: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP4.md) ## -------------------------------------------------------------------------------- /MaliciousActivityandSentinelP4.md: -------------------------------------------------------------------------------- 1 | ## Part-4: Malicious traffic in Sentinel. ## 2 | 3 | [Part-3](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP3.md) showed us the Network Watcher workbook, the basics of "Malicious Actors", and the AzureNetworkAnalytics_CL custom log showing inbound flows. Let's get right into the nitty gritty and import the playbook first, then the analytic rule. The playbook you will import is called "IP2GEOComments", which adds comments to the malicious traffic incident. The comments will contain the IP origin and IP information. 4 | 5 | In the Azure Portal, search for Custom Deployment and choose "Build your own template in the editor". 6 | 7 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customteplatelogicapp.png) 8 | 9 | Paste the content from [IP2GEOComments](https://github.com/Cyberlorians/Sentinel/blob/main/Playbooks/IP2GEOComments.json), hit apply and continue to create on the next step of the deployment.
10 | 11 | Disclaimer: You MAY (and arguably SHOULD) choose to use the [IP2GEOComments-Incident Playbook](https://github.com/Cyberlorians/Sentinel/blob/main/Playbooks/IP2GEOComments-Incident.json) 12 | 13 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customtemplatelogicapptemplate.png) 14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/customtemplatelogicappcreate.png) 16 | 17 | After the playbook has imported, we need to configure permissions for the logic app. Disclaimer: the logic app runs as a managed identity but will need Microsoft Sentinel Responder permissions. To begin, navigate to your new IP2GEOComments logic app and go to the "Identity" tab. You should see the status is "On". Now click on "Azure Role Assignments". 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/logicapppermissions.png) 20 | 21 | After clicking on the "Azure Role Assignments" tab, click "Add role assignment (Preview)". Choose your subscription, resource group and role as shown in the image. Disclaimer: for this setup, scoping the "Microsoft Sentinel Responder" permission to the same resource group is the least-privileged assignment. You can set the SAME permission at the Sentinel Log Analytics workspace for more granularity. After adding, save the permissions. 22 | 23 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/logicapppermissions2.png) 24 | 25 | Back on the left hand side of the logic app blade, head to "API Connections" and you will see the API connection that is associated with the logic app. 26 | 27 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/logicappapiverify.png) 28 | 29 | Click on the API connection and double check that it is in a ready state. 30 | 31 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/logicappverify2.png) 32 | 33 | Lastly, go back to the left hand side of the logic app and under "Development Tools", choose "Logic app designer".
Open each step and verify that the managed identity is connected. It should be, but if it is not, choose "Change connection" and add the managed identity that is associated with the logic app. 34 | 35 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/logicappdesignerverify.png) 36 | 37 | On to the next step, which is the analytic rule. Open your Sentinel workspace and navigate to Analytics. Click on import and import the [Custom - Malicious IP Allowed IN](https://github.com/Cyberlorians/Sentinel/blob/main/Analytic%20Rules/Custom%20-%20Malicious%20IP%20Allowed%20IN.json) rule. 38 | 39 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/allowmaliciousinrule.png) 40 | 41 | That was easy, right? Now that we have the rule imported, click "Edit" on your new rule. Disclaimer: you may edit this rule now or later based on your preferences. 42 | 43 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/maliciousINruleEDIT.png) 44 | 45 | In the Analytics rule wizard, navigate to the "Automated response" tab and, under "alert automation", add the new IP2GEOComments playbook. Click review and then save. 46 | 47 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/ip2geoautoresponse.png) 48 | 49 | If you have followed all the steps and you do have inbound malicious traffic, your rule will fire and the playbook will add comments as seen below. 50 | 51 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/ip2geotagsworking.png) 52 | 53 | Disclaimer: Again, you may choose to use the [IP2GEOComments-Incident Playbook](https://github.com/Cyberlorians/Sentinel/blob/main/Playbooks/IP2GEOComments-Incident.json) here as well, with the same type of setup. Just make sure it is connected with the managed identity and the API is ready to roll. 54 | 55 | Cool, huh! I think so too, but we could use a bit more detail on those IPs.
Following the SAME steps as above, deploy [Get-VirusTotalIPReport](https://github.com/Cyberlorians/Sentinel/blob/main/Playbooks/Get-VirusTotalIPReport.json). I adjusted this playbook from the Sentinel community to run as a managed identity AND to include "whois" data. Disclaimer: you CANNOT miss this step. Please sign up for [VirusTotal](https://www.virustotal.com/gui/home/upload), which offers a free API key; on the logic app designer connector you will have to enter your API key and name, then update. *Disclaimer* - the logic app needs managed identity permissions of "Microsoft Sentinel Responder". It is best to put the logic app within the same resource group as Sentinel/LAW. 56 | 57 | Once deployed, you will see three API connectors, listed below. 58 | 59 | 1 - "azureloganalyticsdatacollector-Get-VirusTotalIPReport" - Edit the API connection and enter the WorkspaceID and WorkspaceKey. Click SAVE. 60 | 2 - "virustotal-Get-VirusTotalIPReport" - Edit the API connection and enter your new "x-api_key". Click SAVE. 61 | 3 - "azuresentinel-Get-VirusTotalIPReport" - This connector is the managed identity piece. For this one, go back to the logic app > Identity blade and give it the "Microsoft Sentinel Responder" role on the Sentinel LAW. Once that is complete, just verify this API states it is "Ready". 62 | 63 | Head back to Sentinel and, on the incident, scroll over, click on the ellipsis and choose "Run Playbook (Preview)" 64 | 65 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/getvirustotalrunplaybook.png) 66 | 67 | Choose the new "Get-VirusTotalIPReport" and click "Run". 68 | 69 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/getvtrun.png) 70 | 71 | Go back into your new incident and under the comments section you will see any and all VirusTotal IP reports. 72 | 73 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/getvtcomments.png) 74 | 75 | To sum this all up.
You now have IP2GEOComments, which will auto-run when an incident triggers, AND you now have the ability to run Get-VirusTotalIPReport for FREE. Disclaimer: you can now run the VirusTotal playbook on any investigation of your choice. 76 | 77 | Let's head to Part-5. 78 | ## [Continue to Part-5: Malicious traffic in Sentinel - Summary Rules](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP5.md) ## 79 | 80 | ## [Back to Part-1: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP1.md) ## 81 | ## [Back to Part-2: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP2.md) ## 82 | ## [Back to Part-3: Malicious traffic in Sentinel](https://github.com/Cyberlorians/Articles/blob/main/MaliciousActivityandSentinelP3.md) ## 83 | -------------------------------------------------------------------------------- /MaliciousActivityandSentinelP5.md: -------------------------------------------------------------------------------- 1 | ## Part-5: Malicious traffic in Sentinel - Summary Rules. ## 2 | 3 | Flow logs can become quite detailed, particularly when an open port is allowing incoming malicious activity. To manage this, we can use Summary Rules to consolidate and focus on the Allowed Malicious IN activity. What exactly are Summary Rules? 4 | 5 | Per [Microsoft](https://learn.microsoft.com/en-us/azure/sentinel/summary-rules) on Summary Rules: 6 | 7 | Use summary rules in Microsoft Sentinel to aggregate large sets of data in the background for a smoother security operations experience across all log tiers. Summary data is precompiled in custom log tables and provide fast query performance, including queries run on data derived from low-cost log tiers.
Summary rules can help optimize your data for: 8 | 9 | - Analysis and reports, especially over large data sets and time ranges, as required for security and incident analysis, month-over-month or annual business reports, and so on. 10 | 11 | - Cost savings on verbose logs, which you can retain for as little or as long as you need in a less expensive log tier, and send as summarized data only to an Analytics table for analysis and reports. 12 | 13 | - Security and data privacy, by removing or obfuscating privacy details in summarized shareable data and limiting access to tables with raw data. 14 | 15 | Let's create a Summary Rule to focus only on 'Allowed-IN'. 16 | 17 | 1 - Enter the following below. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/sr1.png) 20 | 21 | 2 - Enter the KQL logic below into the query window. You can adjust the query scheduling to your liking or keep the default. 22 | 23 | ``` 24 | AzureNetworkAnalytics_CL 25 | | where SubType_s == 'FlowLog' and FlowType_s == 'MaliciousFlow' 26 | | where AllowedInFlows_d == 1 27 | | summarize make_set(SrcIP_s), make_set(FlowType_s) by AllowedInFlows_d, DestIP_s, NSGList_s, DestPort_d 28 | ``` 29 | 30 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/sr2.png) 31 | 32 | 3 - Review & Create. 33 | 34 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/sr3.png) 35 | 36 | 4 - Review the new Summary Rule data. 37 | 38 | ``` 39 | MaliciousIN_CL 40 | | where _RuleName == "MaliciousIN" 41 | | project-away _BilledSize, _BinSize, _RuleLastModifiedTime, _RuleName, _BinStartTime, TenantId, Type 42 | ``` 43 | 44 | 45 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/sr4.png) 46 | -------------------------------------------------------------------------------- /ODA.md: -------------------------------------------------------------------------------- 1 |
1 - Getting Started with On-Demand Assessment 2 |

3 | 4 | This will auto-expand but can be collapsed back. Verbiage from [here](https://learn.microsoft.com/en-us/services-hub/unified/health/getting-started-windows-client). Needs to be streamlined and cleaned up with snippets at each step. 5 |

6 |
7 | 8 |
9 | 2 - Configure Microsoft On-Demand Assessments MAIN TITLE

10 | All items from the current ODA need to live here (GPO, network, OS, etc.), consolidated as a point of reference for the "On-Demand Assessments". 11 |

12 |
    13 |
    2.1 - System and Network Requirements 14 |
      15 |
      Azure Public

16 | 17 | | *Azure Public Endpoint* | *Description* | 18 | | :--- | :----: | 19 | | management.azure.com | Azure Resource Manager| 20 | | login.windows.net | Azure Active Directory| 21 | | dc.services.visualstudio.com | Application Insights| 22 | | agentserviceapi.azure-automation.net | Guest Configuration| 23 | | *-agentservice-prod-1.azure-automation.net | Guest Configuration| 24 | | *.his.hybridcompute.azure-automation.net | Hybrid Identity Service| 25 |

      26 |
      27 |
    28 |
      29 |
      Azure Government

30 | 31 | | *Azure Government Endpoint* | *Description* | 32 | | :--- | :----: | 33 | | management.azure.com | Azure Resource Manager| 34 | | login.windows.net | Azure Active Directory| 35 | | dc.services.visualstudio.com | Application Insights| 36 | | agentserviceapi.azure-automation.net | Guest Configuration| 37 | | *-agentservice-prod-1.azure-automation.net | Guest Configuration| 38 | | *.his.hybridcompute.azure-automation.net | Hybrid Identity Service| 39 |

      40 |
    41 |
42 |
    43 |
    2.2 - Configure Security Policy Configurations

44 | 45 | - reword this and clean up [this page](https://learn.microsoft.com/en-us/services-hub/unified/health/config-oda#configuring-the-required-group-policy-objects) 46 |

    47 |
48 |
    49 |
    2.3 - On-Demand Assessments

50 | 51 | - [On-Demand Assessment - Entra](./EntraIDAssessment.md) 52 | - [On-Demand Assessment - SharePoint](MDI-Hardened.md) 53 |

    54 |
55 |
56 | 57 | 58 | 59 |
Working with Results 60 |

61 |

62 |
63 | 64 |
Troubleshooting 65 |

66 |

67 |
68 | 69 | 70 | 71 |
Configure Microsoft On-Demand Assessment Collector 72 | 73 | *** This section could be in the "Configuration" main section if all assessments are equal. ***

75 | 76 | 1. Create Resource Group: 'Assessment'. 77 | 2. Create Log Analytics Workspace in Assessment RG: 'Assessment'. 78 | 3. Create Azure Virtual Machine (Server 2022): 'Assessment'. 79 | 4. Turn on "Enable System Assigned Managed Identity" while building the virtual machine, under the Management blade. Verify after deployment that it is enabled. 80 | 81 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mgmdidentity.png) 82 | 83 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/mgmdidentity2.png) 84 | 85 | 5. Install the Azure Monitor Agent Extension on the newly created virtual machine (this can be seen from the Extensions blade on the VM). Run the below command from the Azure Portal PowerShell and verify. 86 | 87 | **!!DO NOT MISS THIS STEP!!** 88 | 89 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/amaassessment.png) 90 | ``` 91 | Connect-AzAccount -UseDeviceAuthentication 92 | Set-AzVMExtension -Name AzureMonitorWindowsAgent -ExtensionType AzureMonitorWindowsAgent -Publisher Microsoft.Azure.Monitor -ResourceGroupName Assessment -VMName Assessment -Location EastUS -TypeHandlerVersion 1.0 -EnableAutomaticUpgrade $true 93 | ``` 94 | 95 |
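Once the agent extension shows as installed, a quick way to confirm the VM is actually reporting to the workspace is a Heartbeat query in the 'Assessment' Log Analytics workspace. A sketch - it assumes the VM name 'Assessment' from the steps above, and that the Azure Monitor Agent populates the standard Heartbeat table:

```
Heartbeat
| where Computer startswith "Assessment"
| summarize ['Last Heartbeat'] = max(TimeGenerated) by Computer, Category
```

If no rows return after a few minutes, re-check the extension install and the managed identity step.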

96 | 97 | -------------------------------------------------------------------------------- /TVMIngestion.md: -------------------------------------------------------------------------------- 1 | ## Microsoft Defender for Endpoint - Threat & Vulnerability Mgmt - Sentinel Ingestion ## 2 | 3 | As of now, there is no Sentinel connector option to ingest 365Defender TVM data into Sentinel. This solution will use a logic app and an API call. The solution was built around an Azure GCC tenant, and I will note some easy adjustments per your tenant. The API we are calling is on the 365Defender side, specifically this one (api/machines/SoftwareVulnerabilitiesByMachine?deviceName). The article on the API call is [here](https://learn.microsoft.com/en-us/microsoft-365/security/defender-endpoint/get-assessment-software-vulnerabilities?view=o365-worldwide#12-permissions). Below is a snippet of what the API call looks like in 365D. 4 | 5 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/365DAPI.png) 6 | 7 | *Disclaimer* - You can use any API call from the 365Defender API side that is listed, but the JSON schema will have to be properly adjusted and the results may vary. This is done in the logic app (Parse JSON step) by uploading a sample. Tweak this until it is satisfactory. 8 | 9 | ## Deploy the logic app 10 | 11 | 1 - [MDETVM Logic App](https://raw.githubusercontent.com/Cyberlorians/Playbooks/main/MDETVM.json). Copy the contents of the logic app. 12 | 13 | 2 - In Azure, navigate to 'Deploy A Custom Template' and choose 'Build your own template in the editor' 14 | 15 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMcustomdeployment.png) 16 | 17 | 3 - On the screen, copy the contents from step #1 and PASTE into the template editor, replacing all data. 18 | 19 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/uploadtemplate.png) 20 | 21 | 4 - Hit Save and deploy.
22 | 23 | ## Configuring the MDETVM Logic App 24 | 25 | 1 - After deployment, open the new logic app. Within the logic app blade, on the left hand side, navigate to API Connections. 26 | 27 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMAPI.png) 28 | 29 | 2 - Click on 'Edit API Connection' 30 | 31 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMWorkspaceConfig.png) 32 | 33 | 3 - Enter the Log Analytics WorkspaceID and Key. You can find these under your current LAW>Agents blade. Grab that info, pop it into the corresponding fields and save. The API should say connected now. 34 | 35 | 4 - Change the HTTP step per your Azure environment. As stated, the below was built for a GCC environment, but you can adjust the HTTP GET URI field according to your environment. 36 | 37 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMHTTPGet.png) 38 | 39 | ``` 40 | Commercial URL = https://api.securitycenter.microsoft.com/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 41 | Commercial Audience = https://api.securitycenter.microsoft.com 42 | 43 | GCC URL = https://api-gcc.securitycenter.microsoft.us/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 44 | GCC Audience = https://api-gcc.securitycenter.microsoft.us 45 | 46 | GCCH URL = https://api-gov.securitycenter.microsoft.us/api/machines/SoftwareVulnerabilitiesByMachine?deviceName 47 | GCCH Audience = https://api-gov.securitycenter.microsoft.us 48 | ``` 49 | 50 | ## Setting Permissions On The Managed Identity 51 | 52 | 1 - As stated, when you deployed the MDETVM logic app, it deployed a managed identity with the SAME name. The next step grants the managed identity the app role 'Windows Defender ATP - Vulnerability.Read.All' for the API call. The WindowsDefenderAtp AppRoleID is ($appId = "fc780465-2017-40d4-a0c5-307022471b92"). 53 | 54 | 55 | Export the below code to a .ps1 file - PowerShell script.
The SearchString 'MDETVM' will be the managed identity created from the logic app deployment. IF you changed the name of the logic app during deployment, you will need to change the -SearchString flag within the script below to reflect your new name. 56 | 57 | ``` 58 | $miObjectID = $null 59 | Write-Host "Looking for Managed Identity with default prefix names of the Logic App..." 60 | $miObjectIDs = @() 61 | $miObjectIDs = (Get-AzureADServicePrincipal -SearchString "MDETVM").ObjectId 62 | if ($miObjectIDs -eq $null) { 63 | $miObjectIDs = Read-Host -Prompt "Enter ObjectId of Managed Identity (from Logic App):" 64 | } 65 | 66 | # The app ID of the WindowsDefenderAtp service principal where we want to assign the permissions 67 | $appId = "fc780465-2017-40d4-a0c5-307022471b92" 68 | $permissionsToAdd = @("Vulnerability.Read.All") 69 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 70 | 71 | foreach ($miObjectID in $miObjectIDs) { 72 | foreach ($permission in $permissionsToAdd) { 73 | Write-Host $permission 74 | $role = $app.AppRoles | where Value -Like $permission | Select-Object -First 1 75 | New-AzureADServiceAppRoleAssignment -Id $role.Id -ObjectId $miObjectID -PrincipalId $miObjectID -ResourceId $app.ObjectId 76 | } 77 | } 78 | ``` 79 | Successful TVM permissions will look like below. 80 | 81 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMperms.png) 82 | 83 | Run the logic app with the 'Run Trigger' option to test the app and ingestion into Sentinel. The initial ingest will take a few minutes as the table is created in the Log Analytics Workspace. 84 | 85 | *Disclaimer* - The logic app is set to run once per day. You can adjust the Recurrence step as you see fit. 86 | 87 | A successful run will look like the below snippet. 88 | 89 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/TVMVerify.png) 90 | 91 | Give it up to 10 minutes to ingest.
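Once rows land, a quick sanity query against the new table gives a per-device vulnerability count. A sketch - the deviceName_s, osPlatform_s and cveId_s columns come from the Parse JSON schema this logic app creates, so adjust if you customized it:

```
MDETVM_CL
| summarize ['CVE Count'] = dcount(cveId_s) by deviceName_s, osPlatform_s
| sort by ['CVE Count'] desc
```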
92 | 93 | Verify ingestion in Sentinel/LAW 94 | 95 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/MDETVMSentinel.png) 96 | 97 | ## Example KQL below, comparing against the KEV (Known Exploited Vulnerabilities) catalog that CISA has put out. 98 | 99 | ``` 100 | let KEV= 101 | externaldata(cveID: string, vendorProject: string, product: string, vulnerabilityName: string, dateAdded: datetime, shortDescription: string, requiredAction: string, dueDate: datetime) 102 | [ 103 | h@'https://www.cisa.gov/sites/default/files/csv/known_exploited_vulnerabilities.csv' 104 | ] 105 | with(format='csv',ignorefirstrecord=true); 106 | MDETVM_CL 107 | | project deviceName_s, osPlatform_s, cveID=cveId_s 108 | | join kind=inner KEV on cveID 109 | | summarize ['Vulnerabilities']=make_set(cveID) by deviceName_s 110 | | extend ['Count of Known Exploited Vulnerabilities'] = array_length(['Vulnerabilities']) 111 | | sort by ['Count of Known Exploited Vulnerabilities'] 112 | ``` 113 | 114 | 115 | -------------------------------------------------------------------------------- /TenantCAPols.md: -------------------------------------------------------------------------------- 1 | ## Tenant Conditional Access Policies - Graph Ingestion. ## 2 | 3 | Without permissions on the identity plane to view Conditional Access policies, obtaining the current policies is unattainable. This approach relies on a logic app to invoke the Graph API for Conditional Access and feed the data into the Log Analytics workspace. The resulting table will be named 'TenantCAPols_CL'. Ingestion will occur once on both Monday and Friday of every week. 4 | 5 | ## Deploy the logic app 6 | 7 | [![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FCyberlorians%2FLogicApps%2Fmain%2FTenantCAPols-Ingest.json) 8 | 9 | 10 | ## Post-Configuration of the TenantCAPols-Ingest Logic App 11 | 12 | 13 | 1.
**Open Azure PowerShell via the browser & Paste the below code.** 14 | 15 | ``` 16 | # Install Microsoft Graph module if not already available 17 | if (-not (Get-Module -ListAvailable -Name Microsoft.Graph.Authentication)) { 18 | Install-Module Microsoft.Graph.Authentication -Scope CurrentUser -Force 19 | } 20 | Import-Module Microsoft.Graph.Authentication 21 | 22 | # Connect to Microsoft Graph using device authentication - Commercial & GCC Environment 23 | Connect-MgGraph -Scopes Application.Read.All, AppRoleAssignment.ReadWrite.All -UseDeviceAuthentication 24 | 25 | # Connect to Microsoft Graph using device authentication - GCCH - Uncomment Next Line 26 | # Connect-MgGraph -Scopes Application.Read.All, AppRoleAssignment.ReadWrite.All -Environment USGov -UseDeviceAuthentication 27 | 28 | # Define the name of the Managed Identity 29 | $miName = "TenantCAPols-Ingest" 30 | Write-Host "Searching for Managed Identity: $miName..." 31 | 32 | # Attempt to retrieve the Managed Identity 33 | $managedIdentity = Get-MgServicePrincipal -Filter "displayName eq '$miName'" 34 | 35 | # Fallback: Ask for ObjectId if not found 36 | if ($null -eq $managedIdentity) { 37 | $miObjectId = Read-Host -Prompt "Managed Identity not found. 
Enter ObjectId manually:" 38 | } else { 39 | $miObjectId = $managedIdentity.Id 40 | } 41 | 42 | # Microsoft Graph application ID 43 | $appId = "00000003-0000-0000-c000-000000000000" 44 | 45 | # Define required permissions to assign 46 | $permissionsToAdd = @("Policy.Read.All") 47 | 48 | # Get the Microsoft Graph service principal 49 | $graphSp = Get-MgServicePrincipal -Filter "appId eq '$appId'" 50 | 51 | # Assign each required permission to the Managed Identity 52 | foreach ($permission in $permissionsToAdd) { 53 | Write-Host "Assigning: $permission" 54 | 55 | # Find the matching AppRole 56 | $role = $graphSp.AppRoles | Where-Object { $_.Value -eq $permission } | Select-Object -First 1 57 | 58 | # Build assignment parameters 59 | $params = @{ 60 | PrincipalId = $miObjectId 61 | ResourceId = $graphSp.Id 62 | AppRoleId = $role.Id 63 | } 64 | 65 | # Create the app role assignment 66 | New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $miObjectId -BodyParameter $params 67 | } 68 | 69 | # Disconnect from Microsoft Graph 70 | Disconnect-MgGraph 71 | 72 | ``` 73 | 74 | 1(a). *If applicable, change the HTTP calls*. Configure your endpoint based on which Graph environment you are working with. Please adjust the logic app HTTP call per the tenant you are working in. Commercial & GCC use the same API call; Gov will need to be adjusted. 75 | 76 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/autocapgetcond.png) 77 | 78 | *Graph endpoints for adjustment are below* 79 | 80 | ``` 81 | Commercial URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 82 | Commercial Audience = https://graph.microsoft.com 83 | 84 | GCC URL = https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies 85 | GCC Audience = https://graph.microsoft.com 86 | 87 | GCCH URL = https://graph.microsoft.us/v1.0/identity/conditionalAccess/policies 88 | GCCH Audience = https://graph.microsoft.us 89 | ``` 90 | 91 | 92 | 2.
**Configuration for Log Analytics Workspace Ingestion.** 93 | 94 | ![](https://github.com/Cyberlorians/uploadedimages/blob/main/cacismlaw.png) 95 | 96 | 3. **Confirm ingestion at your Log Analytics Workspace.** 97 | 98 | ``` 99 | TenantCAPols_CL 100 | | summarize arg_max(TimeGenerated, *) by id_g 101 | | extend DisplayName = tostring(displayName_s) 102 | | extend PolicyId = tostring(id_g) 103 | //| extend Conditions = Policies.conditions 104 | //| mv-expand Conditions 105 | | extend State = case( 106 | tostring(state_s) == "enabled", "On", 107 | tostring(state_s) == "disabled", "Off", 108 | tostring(state_s) == "enabledForReportingButNotEnforced", "Report-only", 109 | "Unknown" 110 | ) 111 | //| extend GrantControls = Policies.grantControls 112 | | extend CreatedTimeDate = createdDateTime_t 113 | | extend ModifiedTimeDate = modifiedDateTime_t 114 | | extend UsersInclude = conditions_users_includeUsers_s 115 | | extend UsersExclude = conditions_users_excludeUsers_s 116 | | extend GroupsInclude = conditions_users_includeGroups_s 117 | | extend GroupsExclude = conditions_users_excludeGroups_s 118 | | extend CloudAppsInclude = conditions_applications_includeApplications_s 119 | | extend CloudAppsExclude = conditions_applications_excludeApplications_s 120 | | extend ClientPlatformsIncludeTmp = column_ifexists("conditions_platforms_includePlatforms_s", "") 121 | | extend ClientPlatformsInclude = case( 122 | "ExplicitOnly" == "ExplicitOnly", ClientPlatformsIncludeTmp, 123 | case( 124 | isnotempty(ClientPlatformsIncludeTmp), ClientPlatformsIncludeTmp, 125 | todynamic("(all)") 126 | ) 127 | ) 128 | | extend ClientPlatformsIncludeTooltip = case( 129 | ClientPlatformsInclude == "(all)", "This is the implicit configuration.
All platforms are included, because the setting has not been configured.", 130 | "" 131 | ) 132 | | extend ClientPlatformsExcludeTmp = column_ifexists("conditions_platforms_excludePlatforms_s", "[]") 133 | | extend ClientPlatformsExclude = case( 134 | "ExplicitOnly" == "ExplicitOnly", ClientPlatformsExcludeTmp, 135 | case( 136 | isnotempty(ClientPlatformsExcludeTmp), ClientPlatformsExcludeTmp, 137 | todynamic("[]") 138 | ) 139 | ) 140 | | extend ClientApps = conditions_clientAppTypes_s 141 | | extend LocationsInclude = case( 142 | "ExplicitOnly" == "ExplicitOnly", conditions_locations_includeLocations_s, 143 | case( 144 | isnotempty(conditions_locations_includeLocations_s), conditions_locations_includeLocations_s, 145 | todynamic("(all)") 146 | ) 147 | ) 148 | | extend LocationsIncludeTooltip = case( 149 | LocationsInclude == "(all)", "This is the implicit configuration. All locations are included, because the setting has not been configured.", 150 | "" 151 | ) 152 | | extend LocationsExclude = case( 153 | "ExplicitOnly" == "ExplicitOnly", conditions_locations_excludeLocations_s, 154 | case( 155 | isnotempty(conditions_locations_excludeLocations_s), conditions_locations_excludeLocations_s, 156 | todynamic("[]") 157 | ) 158 | ) 159 | | extend UserRiskLevels = case( 160 | "ExplicitOnly" == "ExplicitOnly", conditions_userRiskLevels_s, 161 | case( 162 | array_length(todynamic(conditions_userRiskLevels_s)) != 0, conditions_userRiskLevels_s, 163 | todynamic("(all)") 164 | ) 165 | ) 166 | | extend UserRiskLevelsTooltip = case( 167 | UserRiskLevels == "(all)", "This is the implicit configuration. 
All user risk levels are included, because the setting has not been configured.", 168 | "" 169 | ) 170 | | extend SigninRiskLevels = case( 171 | "ExplicitOnly" == "ExplicitOnly", conditions_signInRiskLevels_s, 172 | case( 173 | array_length(todynamic(conditions_signInRiskLevels_s)) != 0, conditions_signInRiskLevels_s, 174 | todynamic("(all)") 175 | ) 176 | ) 177 | | extend SigninRiskLevelsTooltip = case( 178 | SigninRiskLevels == "(all)", "This is the implicit configuration. All sign-in risk levels are included, because the setting has not been configured.", 179 | "" 180 | ) 181 | | extend GrantControls = grantControls_builtInControls_s 182 | | extend FullPolicyJson = pack_all() 183 | | sort by DisplayName asc 184 | | project ['Policy display name'] = DisplayName, State, ['Cloud apps included'] = CloudAppsInclude, ['Cloud apps excluded'] = CloudAppsExclude, ['Users included'] = UsersInclude, ['Users excluded'] = UsersExclude, ['Groups included'] = GroupsInclude, ['Groups excluded'] = GroupsExclude, ['Client platforms included'] = ClientPlatformsInclude, ['Client platforms excluded'] = ClientPlatformsExclude, ['Client apps'] = ClientApps, ['Locations included'] = LocationsInclude, ['Locations excluded'] = LocationsExclude, ['User risk levels'] = UserRiskLevels, ['Sign-in risk levels'] = SigninRiskLevels, ['Grant controls'] = GrantControls, CreatedTimeDate, ModifiedTimeDate, PolicyId, ['Full policy JSON'] = FullPolicyJson, ClientPlatformsIncludeTooltip, LocationsIncludeTooltip, UserRiskLevelsTooltip, SigninRiskLevelsTooltip 185 | ``` 186 | 187 | 188 | -------------------------------------------------------------------------------- /crosscloudsync.md: -------------------------------------------------------------------------------- 1 | ## Cross-Cloud Sync Setup 2 | 3 | STEP #1 4 | 5 | ``` 6 | connect-graph -Environment USGov -Scopes "Application.ReadWrite.All", "Policy.ReadWrite.Authorization", "User.ReadWrite.All","Directory.ReadWrite.All",
"Policy.ReadWrite.CrossTenantAccess", "RoleManagement.ReadWrite.Directory" 7 | ``` 8 | 9 | STEP #2 10 | 11 | ## Create a new Service Principal 12 | 13 | ``` 14 | Import-Module Microsoft.Graph.Applications 15 | 16 | $params = @{ 17 | appId = "2313b47f-a76d-4513-be58-500e42ce8d11" 18 | } 19 | 20 | New-MgServicePrincipal -BodyParameter $params 21 | ``` 22 | 23 | STEP #3 24 | 25 | ## Enable the preview features that allow privileged user permissions to be added to a custom role 26 | 27 | ``` 28 | Import-Module Microsoft.Graph.Identity.SignIns 29 | 30 | $params = @{ 31 | enabledPreviewFeatures = @("ConditionForExternalObjectScope", "CustomRolesForUsersSet2") 32 | } 33 | 34 | Update-MgPolicyAuthorizationPolicy -BodyParameter $params 35 | ``` 36 | 37 | 38 | STEP #4 39 | 40 | ## CREATE WRITE DEFINITION ID (you will have to go into the UI of the custom role and add other permissions if you'd like to test anything further) 41 | 42 | ``` 43 | # Import the required modules 44 | Import-Module Microsoft.Graph.Identity.Governance 45 | $params = @{ 46 | description = "Cross-Cloud Sync Write Permissions for Private Preview" 47 | displayName = "Cross-Cloud Sync Write Permissions for Private Preview" 48 | rolePermissions = @( 49 | @{ 50 | allowedResourceActions = @( 51 | "microsoft.directory/users/basic/update", 52 | "microsoft.directory/users/userType/update" 53 | ) 54 | } 55 | ) 56 | assignableScopes = @( 57 | "/tenants/sourcetenant" 58 | ) 59 | isEnabled = $true 60 | } 61 | 62 | 63 | New-MgRoleManagementDirectoryRoleDefinition -BodyParameter $params 64 | ``` 65 | ## Add further perms to WRITE ROLE if need be.
Custom Roles support only a LIMITED set of permissions. 66 | 67 | ``` 68 | "microsoft.directory/users/basic/update", 69 | "microsoft.directory/users/contactInfo/update", 70 | "microsoft.directory/users/extensionProperties/update", 71 | "microsoft.directory/users/jobInfo/update", 72 | "microsoft.directory/users/parentalControls/update", 73 | "microsoft.directory/users/sponsors/update", 74 | "microsoft.directory/users/usageLocation/update", 75 | "microsoft.directory/users/preferredDataLocation/update", 76 | "microsoft.directory/users/userType/update", 77 | "microsoft.directory/deletedItems.users/restore", 78 | "microsoft.directory/users/disable", 79 | "microsoft.directory/users/enable", 80 | "microsoft.directory/users/userPrincipalName/update", 81 | "microsoft.directory/users/userType/update", 82 | "microsoft.directory/users/delete" 83 | ``` 84 | ## CREATE READ DEFINITION ID (STEP #5) 85 | 86 | ``` 87 | $params = @{ 88 | description = "Cross-Cloud Sync Read Permissions for Private Preview" 89 | displayName = "Cross-Cloud Sync Read Permissions for Private Preview" 90 | rolePermissions = @( 91 | @{ 92 | allowedResourceActions = @( 93 | "microsoft.directory/crossTenantAccessPolicy/default/standard/read", 94 | "microsoft.directory/crossTenantAccessPolicy/partners/identitySynchronization/standard/read", 95 | "microsoft.directory/crossTenantAccessPolicy/partners/standard/read", 96 | "microsoft.directory/crossTenantAccessPolicy/standard/read", 97 | "microsoft.directory/users/identities/read", 98 | "microsoft.directory/users/licenseDetails/read", 99 | "microsoft.directory/users/manager/read", 100 | "microsoft.directory/users/registeredDevices/read", 101 | "microsoft.directory/users/sponsors/read", 102 | "microsoft.directory/users/standard/read", 103 | "microsoft.directory/authorizationPolicy/standard/read" 104 | ) 105 | } 106 | ) 107 | isEnabled = $true 108 | } 109 | 110 | New-MgRoleManagementDirectoryRoleDefinition -BodyParameter $params 111 | ``` 112 | STEP #6 113 | 114 | ## Set Permissions for Service Principal
115 | 116 | ``` 117 | Connect-AzureAD 118 | 119 | $miObjectID = $null 120 | Write-Host "Looking for the Cross-Cloud Sync Fabric service principal..." 121 | $miObjectIDs = @() 122 | $miObjectIDs = (Get-AzureADServicePrincipal -SearchString "Microsoft.Azure.SyncFabric.CrossCloud.Commercial2Government").ObjectId 123 | if ($null -eq $miObjectIDs) { 124 | $miObjectIDs = Read-Host -Prompt "Enter the ObjectId of the Sync Fabric service principal:" 125 | } 126 | 127 | # The app ID of the Microsoft Graph API where we want to assign the permissions 128 | $appId = "00000003-0000-0000-c000-000000000000" 129 | $permissionsToAdd = @("User.Invite.All","Organization.Read.All") 130 | $app = Get-AzureADServicePrincipal -Filter "AppId eq '$appId'" 131 | 132 | foreach ($miObjectID in $miObjectIDs) { 133 | foreach ($permission in $permissionsToAdd) { 134 | Write-Host $permission 135 | $role = $app.AppRoles | where Value -Like $permission | Select-Object -First 1 136 | New-AzureADServiceAppRoleAssignment -Id $role.Id -ObjectId $miObjectID -PrincipalId $miObjectID -ResourceId $app.ObjectId 137 | } 138 | } 139 | ``` 140 | 141 | 142 | -------------------------------------------------------------------------------- /tvm-adf.md: -------------------------------------------------------------------------------- 1 | Set the system-assigned managed identity permissions from ADF > ADX 2 | 3 | https://techcommunity.microsoft.com/blog/azuredataexplorer/azure-data-factory-to-adx-free-cluster/3657756 4 | 5 | 01 - https://api-gcc.securitycenter.microsoft.us/api/machines/SoftwareInventoryByMachine 6 | ``` 7 | .create table DeviceTvmSoftwareInventory ( 8 | TimeGenerated: datetime, 9 | deviceId: string, 10 | rbacGroupId: int, 11 | rbacGroupName: string, 12 | deviceName: string, 13 | osPlatform: string, 14 | softwareVendor: string, 15 | softwareName: string, 16 | softwareVersion: string, 17 | numberOfWeaknesses: int, 18 | diskPaths: string, 19 | registryPaths: string, 20 | softwareFirstSeenTimestamp: datetime, 21 |
endOfSupportStatus: string, 22 | endOfSupportDate: datetime 23 | ) 24 | ``` 25 | 26 | 02 - https://api-gcc.securitycenter.microsoft.us/api/machines/SecureConfigurationsAssessmentByMachine 27 | ``` 28 | .create table DeviceTvmSecureConfigurationAssessment ( 29 | TimeGenerated: datetime, 30 | deviceId: string, 31 | rbacGroupId: int, 32 | rbacGroupName: string, 33 | deviceName: string, 34 | osPlatform: string, 35 | osVersion: string, 36 | timestamp: datetime, 37 | configurationId: string, 38 | configurationCategory: string, 39 | configurationSubcategory: string, 40 | configurationImpact: real, 41 | isCompliant: bool, 42 | isApplicable: bool, 43 | isExpectedUserImpact: bool, 44 | configurationName: string, 45 | recommendationReference: string 46 | ) 47 | ``` 48 | 49 | 03 - https://api-gov.securitycenter.microsoft.us/api/machines/HardwareFirmwareInventoryByMachine 50 | ``` 51 | .create table DeviceTvmHardwareFirmware ( 52 | TimeGenerated: datetime, 53 | deviceId: string, 54 | rbacGroupId: int, 55 | rbacGroupName: string, 56 | deviceName: string, 57 | componentType: string, 58 | manufacturer: string, 59 | componentName: string, 60 | componentVersion: string, 61 | additionalFields: string 62 | ) 63 | ``` 64 | 65 | 04 - https://api-gcc.securitycenter.microsoft.us/api/baselineProfiles 66 | ``` 67 | .create table DeviceBaselineComplianceProfiles ( 68 | TimeGenerated: datetime, 69 | id: string, 70 | name: string, 71 | description: string, 72 | benchmark: string, 73 | version: string, 74 | operatingSystem: string, 75 | operatingSystemVersion: string, 76 | status: bool, 77 | complianceLevel: string, 78 | settingsNumber: int, 79 | createdBy: string, 80 | lastUpdatedBy: string, 81 | createdOnTimeOffset: datetime, 82 | lastUpdateTimeOffset: datetime, 83 | passedDevices: int, 84 | totalDevices: int, 85 | rbacGroupIdsProfileScope: dynamic, 86 | rbacGroupNamesProfileScope: dynamic, 87 | deviceTagsProfileScope: dynamic 88 | ) 89 | ``` 90 | 05 - 
https://api-gcc.securitycenter.microsoft.us/api/baselineConfigurations 91 | ``` 92 | .create table DeviceBaselineComplianceAssessmentKB ( 93 | TimeGenerated: datetime, 94 | id: string, 95 | uniqueId: string, 96 | benchmarkName: string, 97 | benchmarkVersion: string, 98 | name: string, 99 | description: string, 100 | category: string, 101 | complianceLevels: dynamic, 102 | cce: string, 103 | rationale: string, 104 | remediation: string, 105 | recommendedValue: dynamic, 106 | source: dynamic, 107 | isCustom: bool 108 | ) 109 | ``` 110 | 06 - https://api-gcc.securitycenter.microsoft.us/api/baselineConfigurations 111 | ``` 112 | .create table DeviceBaselineComplianceAssessment ( 113 | TimeGenerated: datetime, 114 | id: string, 115 | configurationId: string, 116 | deviceId: string, 117 | deviceName: string, 118 | profileId: string, 119 | osPlatform: string, 120 | osVersion: string, 121 | rbacGroupId: int, 122 | rbacGroupName: string, 123 | isApplicable: bool, 124 | isCompliant: bool, 125 | dataCollectionTimeOffset: datetime, 126 | complianceCalculationTimeOffset: datetime, 127 | recommendedValue: dynamic, 128 | currentValue: dynamic, 129 | source: dynamic, 130 | isExempt: bool 131 | ) 132 | ``` 133 | 07- https://api-gcc.securitycenter.microsoft.us/api/Machines/BrowserExtensionsInventoryByMachine 134 | ``` 135 | .create table DeviceTvmBrowserExtensions ( 136 | TimeGenerated: datetime, 137 | deviceId: string, 138 | rbacGroupId: int, 139 | rbacGroupName: string, 140 | installationTime: datetime, 141 | browserName: string, 142 | extensionId: string, 143 | extensionName: string, 144 | extensionDescription: string, 145 | extensionVersion: string, 146 | extensionRisk: string, 147 | extensionVendor: string, 148 | isActivated: bool 149 | ) 150 | ``` 151 | 08 - https://api-gcc.securitycenter.microsoft.us/api/vulnerabilities/machinesVulnerabilities 152 | ``` 153 | .create table DeviceTvmSoftwareVulnerabilities ( 154 | TimeGenerated: datetime, 155 | id: string, 156 | cveId: string, 
157 | machineId: string, 158 | fixingKbId: string, 159 | productName: string, 160 | productVendor: string, 161 | productVersion: string, 162 | severity: string 163 | ) 164 | ``` 165 | 09 - https://api-gcc.securitycenter.microsoft.us/api/machines/CertificateAssessmentByMachine 166 | ``` 167 | .create table DeviceTvmCertificateInfo ( 168 | TimeGenerated: datetime, 169 | deviceId: string, 170 | deviceName: string, 171 | thumbprint: string, 172 | path: string, 173 | signatureAlgorithm: string, 174 | keySize: int, 175 | expirationDate: datetime, 176 | issueDate: datetime, 177 | subjectType: string, 178 | serialNumber: string, 179 | issuedTo: dynamic, 180 | issuedBy: dynamic, 181 | keyUsage: dynamic, 182 | extendedKeyUsage: dynamic, 183 | rbacGroupId: int, 184 | rbacGroupName: string 185 | ) 186 | ``` 187 | --------------------------------------------------------------------------------
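Once ADF has started landing data in the tables above, a quick KQL sanity check along these lines confirms rows are flowing. The table and column names are the ones created above; the 1-day window and the severity rollup are only illustrative:

```
// Confirm recent ingestion and get a rough per-product severity breakdown
DeviceTvmSoftwareVulnerabilities
| where TimeGenerated > ago(1d)
| summarize Vulnerabilities = count() by severity, productName
| top 20 by Vulnerabilities
```

If nothing comes back, run `.show ingestion failures` on the cluster to see whether the copy activities are mapping into the schemas correctly.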