One of the biggest advantages of the cloud in general, and Azure Sentinel in particular, is an API-first approach. SIEM products are integration savvy, whether with telemetry sources or with other management platforms. In the cloud, automating this integration is critical to tackling the ephemeral nature of resources. In this evolving blog post, we cover Azure Sentinel's integration and automation capabilities.
Azure Sentinel uses Azure Log Analytics for log management, so the Log Analytics APIs serve Azure Sentinel as well.
The Query API
Azure Sentinel enables easy and fast API access to the workspace, Azure Sentinel’s primary data store. This enables you to use Azure Sentinel as your data lake and build your own algorithms and applications over the data.
To do that, send your KQL queries using the Log Analytics query API. To learn more about how to use the query API, which is part of the Azure REST API, you might want to read getting started with Azure REST API, or read Rin Ure's great write-up on how to use the API. Some tools already use the API and can make life simpler:
- PowerShell script – now includes CSV export
- PowerShell cmdlet
- Azure CLI
- Logic Apps Azure Monitor logs connector
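If you prefer to call the query API directly rather than through the tools above, the call is a single authenticated POST. The sketch below only assembles the request; the workspace ID, bearer token, and the SecurityEvent query are placeholders for illustration, and you can send the result with any HTTP client.

```python
import json

# Endpoint shape documented for the Log Analytics query API.
QUERY_ENDPOINT = "https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"

def build_query_request(workspace_id: str, kql: str, token: str):
    """Assemble the URL, headers, and JSON body for a Log Analytics query call."""
    url = QUERY_ENDPOINT.format(workspace_id=workspace_id)
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"query": kql})
    return url, headers, body

# Example: count successful sign-in events per computer.
url, headers, body = build_query_request(
    "00000000-0000-0000-0000-000000000000",  # placeholder workspace ID
    "SecurityEvent | where EventID == 4624 | summarize count() by Computer",
    "<AAD-access-token>",                    # placeholder AAD bearer token
)
# POST `body` to `url` with `headers` using your preferred HTTP client.
```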
In addition to ingested event data, the Azure Sentinel workspace stores alerts in the SecurityAlert table and bookmarks in the HuntingBookmark table; both can be accessed using the query API. Incidents are not stored in the workspace but can be read using the management API discussed below.
The Data Collector API
You can ingest data to Azure Sentinel using the Log Analytics Data Collector API. You can directly use the API using your preferred programming language, but also use tools such as the Log Analytics agent, Logstash and Logic Apps without programming. The API and the different ways to use it are discussed in the custom connectors blog post.
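As a sketch of what tools such as Logstash and the Logic Apps connector do under the hood, the Data Collector API authenticates each POST with an HMAC-SHA256 signature over a canonical string, keyed with the workspace's base64-encoded shared key. All values below are placeholders:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str,
                    date_rfc1123: str, content_length: int) -> str:
    """Compute the SharedKey Authorization header for the Data Collector API."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(shared_key),          # the key is stored base64-encoded
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder workspace ID and key (the real key comes from the portal).
auth = build_signature(
    "00000000-0000-0000-0000-000000000000",
    base64.b64encode(b"not-a-real-key").decode(),
    "Mon, 04 Apr 2022 08:00:00 GMT",
    128,
)
# POST the JSON payload to
# https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
# with headers: Authorization, Log-Type (custom table name), x-ms-date, Content-Type.
```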
The Graph Security API
The Graph Security API offers a direct interface that may be easier to use for a couple of popular data-access use cases:
- Read Azure Sentinel’s alerts.
- Ingest threat intelligence (TI) to Azure Sentinel and use the built-in TI-based analytics without modification. Note that this cannot be achieved with the Data Collector API, which writes to custom tables rather than to the standard TI table, ThreatIntelligenceIndicator. See this blog post for an example.
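For illustration, a TI indicator sent through the Graph Security API is a JSON document POSTed to the tiIndicators endpoint (https://graph.microsoft.com/beta/security/tiIndicators); setting targetProduct to "Azure Sentinel" is what routes it to the workspace's ThreatIntelligenceIndicator table. The IP address and descriptive values below are placeholders:

```python
import json

# Minimal tiIndicator payload; field names follow the Graph Security beta schema.
indicator = {
    "action": "alert",
    "description": "Known C2 address (illustrative placeholder)",
    "expirationDateTime": "2030-01-01T00:00:00Z",
    "targetProduct": "Azure Sentinel",   # routes the indicator to Sentinel
    "threatType": "C2",
    "tlpLevel": "amber",
    "networkIPv4": "203.0.113.10",       # TEST-NET address, placeholder only
}
body = json.dumps(indicator)
# POST `body` to https://graph.microsoft.com/beta/security/tiIndicators
# with an AAD bearer token that has the ThreatIndicators.ReadWrite.OwnedBy permission.
```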
Management integration and automation
Using automation for deployment and management is always a cost saver. In the cloud, where resources are often ephemeral, automation is even more important, and the same applies to service providers, which need to onboard and offboard customers as efficiently as possible. Management APIs are also important for tying processes, and not just data, into other systems in the organization, such as a service provider's portal, a workflow system, or a ticketing system.
The Azure Sentinel management API documentation can be found here.
Swagger and example files can be found here.
Looking to include the API calls in an ARM template? The newly introduced scripting capability within ARM templates enables including any Sentinel API call in an ARM template. For more details, refer to “Extending Azure Resource Manager (ARM), Azure’s control plane” from Ignite 2019.
As mentioned before, the management API provides access to incident data, which is not available through the query API. The export-all-incidents script is a useful example of doing that.
Azure APIs use the same roles and permissions mechanism as the portal. More details on Azure Sentinel roles and permissions can be found here.
Using the API to retrieve and update incident information
While alert information is available through the query API, since alerts are stored in the SecurityAlert table in the workspace, incidents are not, and updating incidents always requires the management API. As a result, it is sometimes preferable to use the management API to retrieve alert and incident information as well.
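A minimal sketch of the incident endpoints under the management API follows; the path segments and api-version follow the management API documentation, while the subscription, resource group, and workspace names are placeholders. A GET on the collection URL lists incidents; a PUT to a single incident's URL with an updated properties body (for example, status or severity) modifies it.

```python
# URI scheme of the Azure Sentinel (SecurityInsights) incidents endpoint.
BASE = ("https://management.azure.com/subscriptions/{sub}/resourceGroups/{rg}"
        "/providers/Microsoft.OperationalInsights/workspaces/{ws}"
        "/providers/Microsoft.SecurityInsights/incidents")
API_VERSION = "2020-01-01"  # GA api-version at the time of writing

def incidents_url(sub: str, rg: str, ws: str, incident_id: str = None) -> str:
    """Build the list URL, or a single-incident URL when incident_id is given."""
    url = BASE.format(sub=sub, rg=rg, ws=ws)
    if incident_id:
        url += f"/{incident_id}"
    return f"{url}?api-version={API_VERSION}"

# GET this URL (with a bearer token) to enumerate incidents in the workspace.
url = incidents_url("sub-id", "my-rg", "my-workspace")
```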
Using the management API to automate content deployment
The most common use for the API is to automate the deployment and update of Analytics Alert Rules and hunting queries.
Two open-source implementations of the API you might find useful for this purpose are:
- Automating analytics and hunting rules deployment using AzSentinel created by Wortell. You can use these scripts to export and import all rules from a workspace. If using the API directly to deploy a rule retrieved from another workspace, make sure you update the following JSON fields:
- The id should be modified to match the target workspace URI
- The etag should be cleared
- The lastModifiedUtc should be removed
When using any one of the scripts presented in this section, this is already handled for you.
- Javier Soriano and Philippe Zenhaeusern have implemented a CI/CD flow using GitHub, Azure DevOps, and the Sentinel automation capabilities. It enables you to manage rules, queries, playbooks, workbooks, and more on GitHub and have them continuously deployed to your Sentinel workspace. You can even create a new workspace and connect it automatically.
- They also discuss how to extend your CI/CD framework across workspaces and tenants.
The management API is also the solution for backing up and restoring configuration. The automation PowerShell modules and scripts described above both read and write resources and can therefore be used for backup and restore. Using CI/CD ensures that the master copy of the configuration is external to begin with.
Automated deployment and configuration for other resources
Azure Sentinel uses other resources that are part of the Azure environment, each of which has its own deployment automation mechanism:
- Workbooks: use ARM. To ensure the workbook is listed in Sentinel:
- Set the sourceId to the workspace ID (should look similar to this /subscriptions/… /resourcegroups/…/providers/microsoft.operationalinsights/workspaces/…)
- Set the category to “sentinel”
- Logic App playbooks: use ARM. To ensure the playbook appears in Sentinel:
- It has to use the Sentinel trigger
- It has to be in the same subscription as the workspace
- Saved searches and functions:
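As an illustration of the workbook requirements above, here is a sketch of the relevant ARM resource expressed as a Python dict. The apiVersion and the serializedData placeholder are assumptions; check the microsoft.insights/workbooks ARM schema for the full shape.

```python
# Placeholder workspace resource ID in the shape the sourceId expects.
workspace_resource_id = (
    "/subscriptions/<sub>/resourcegroups/<rg>"
    "/providers/microsoft.operationalinsights/workspaces/<workspace>"
)

# Minimal workbook resource; category and sourceId make it appear in Sentinel.
workbook_resource = {
    "type": "microsoft.insights/workbooks",
    "apiVersion": "2018-06-17-preview",    # assumed; verify current schema
    "kind": "shared",
    "properties": {
        "displayName": "My Sentinel workbook",
        "category": "sentinel",            # required for the Sentinel listing
        "sourceId": workspace_resource_id, # ties the workbook to the workspace
        "serializedData": "{...}",         # the workbook's JSON definition
    },
}
```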
Using Azure Policy to collect from Azure services
The recommended way to configure Azure services to stream to Azure Sentinel is to use Azure Policy. This ensures that new services are automatically set to collect, without the user having to wait for them to be connected. Create-AzDiagPolicy (GitHub, PowerShell Gallery) allows you to create Azure Policies that enforce logging for Azure services.
To learn more about using Azure Policy to ensure that any new Azure resource sends telemetry, read Tao Yang's blog post, though note that the referenced policy templates are out of date and the script above should be used instead.
Lastly, for ASC continuous export, use the built-in policy ‘Deploy export to Log Analytics workspace for Azure Security Center alerts and recommendations’ (policy ID: ffb6f416-7bd2-4488-8828-56585fef2be9) or use this policy template. Use ASC continuous export to collect ASC recommendations, as well as an alternative to the ASC alerts connector, albeit with some limitations.
About API support and versioning
While cloud applications' user interfaces change on an ongoing basis, on the API side we commit to longer-term consistency. So, while the APIs do change regularly:
- A breaking change requires a new version of the API.
- Existing versions can be deprecated only after three years for GA APIs and 90 days for preview APIs, and they need to follow a deprecation process.