In this article, you will learn about the different methods for auditing in Azure SQL Database and Azure SQL Managed Instance. For a summary of what auditing is, please refer to the July SQL Server Geeks magazine.
Here’s a quick summary of the different ways you can audit based on the different cloud offerings. To learn more about extended events or SQL Server Audit, please refer to the July, August, and November issues of SQL Server Geeks magazine.
You have a few options for auditing with Azure SQL Database:
1. Azure Portal – The portal provides functionality similar to SQL Server Audit, but it doesn’t allow you to filter the auditing results. You can turn it on at the server level or the database level, but don’t turn it on at both levels; if you do, you will collect duplicate auditing records. The portal gives you three storage choices for your auditing data:
a. Storage – This is blob storage in Azure. It stores your audit files in .xel format in a folder structure you can loop through to query.
b. Log Analytics – This is a workspace in Azure you can use to store auditing data in a format that is queryable by the Kusto Query Language.
c. Event Hub – This stores audit logs in JSON and requires a stream to consume events and write them to a target.
2. Extended Events – Commonly referred to as xevents, this is a lightweight and flexible option for auditing. If you’ve used Profiler or Trace, this will feel familiar. You can use the GUI or scripts to create xevents and query their results. You will need to store the event files in Azure blob storage. It has many of the same events as traditional SQL Server xevents.
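As a quick way to see which events you have to work with, you can query the event metadata from your database. This is a minimal sketch that assumes the sys.dm_xe_objects view is exposed in your Azure SQL Database, as it is in SQL Server:

```sql
-- List the xevents available in this database.
-- Event names largely match what you would see in on-premises SQL Server.
SELECT o.name,
       o.description
FROM sys.dm_xe_objects AS o
WHERE o.object_type = 'event'
ORDER BY o.name;
```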
Enabling Azure SQL Database Auditing
Here’s a high-level explanation of how to set up Azure SQL Database auditing. Navigate to your Azure SQL Database server. I highly recommend enabling auditing at the server level so you will get all events relating to all of your databases in the audit data. The following screenshot shows you how to enable auditing at the server level:
It’s as easy as clicking a radio button and choosing your storage location. There is also the ability to audit Microsoft support operations events. I like to enable this as well because I want to see behind-the-scenes operations that might affect my databases. The following screenshot shows you how to enable the Microsoft support operations auditing:
You can use the same or a different audit log destination. I tend to put it all together with my other database auditing to make it easier to see all auditing data in the same place and at the same time.
If you have a use case to audit only one database or a subset of databases on a server, you wouldn’t enable it at the server level. Instead, navigate to the specific database to enable it, as in the following screenshot:
You can see in the previous screenshot that server-level auditing is enabled. You would want to disable the server auditing before you enable auditing in this one database. This prevents you from getting a lot of duplicate audit data.
Let’s take a look at each of the storage options you have in Azure SQL Database auditing.
Storage
This option lets you put the .xel auditing files in a blob storage account of your choosing. The files are put in a subfolder structure that can be hard to query. You will need to loop through them with something like PowerShell to make it easier to query. The following screenshot shows you what this storage option looks like:
You can query these .xel files like you would other xevent files. The November issue of the SQL Server Geeks magazine has more information on how to query .xel files. You will need to use XQuery to parse through the XML.
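For the portal’s audit files specifically, there is also a function that returns the audit columns already parsed, so you don’t need XQuery for these particular files. This is a sketch using sys.fn_get_audit_file; the storage account, server, and database names in the URL are placeholders you would replace with your own:

```sql
-- Read Azure SQL Database audit (.xel) files directly from blob storage.
-- The URL below is a placeholder; point it at your audit log container.
SELECT event_time,
       action_id,
       succeeded,
       server_principal_name,
       database_name,
       statement
FROM sys.fn_get_audit_file(
    'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/<servername>/<databasename>/',
    DEFAULT, DEFAULT)
ORDER BY event_time DESC;
```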
Event Hub
This option lets you store your audit data in an event hub. This stores data in JSON and requires you to set up a stream to read events and write them to a target. The event hub must be in the same region as your database. The following screenshot shows you how the event hub looks after you set it up:
Frankly, I haven’t used this option because it’s beyond the scope of activities I want to engage in. I prefer a storage account, or even better, log analytics.
Log Analytics
So far, my favorite way to store audit logs is with log analytics. I put multiple database servers and their corresponding databases into the same log analytics workspace for easy querying and reporting on the audit data. The following screenshot shows the log analytics audit data:
You can access this data either by navigating to your log analytics workspace or by opening Auditing under each of the audited databases and then selecting Log Analytics. The following screenshot shows the steps to access the auditing data from the database:
Even if you enable the auditing at the server level, you still have to access the audit data via the database. It’s easier to go into your log analytics workspace to see all the audit data in one place instead of clicking on different databases to see the data. That’s why I store all my database audit data in the same log analytics workspace. Then I can query all the database audit data at the same time.
To query the audit data in log analytics, you need to use Kusto Query Language (KQL). It’s easy to figure out, especially if you know SQL. Here’s a link to a SQL to Kusto cheat sheet: https://docs.microsoft.com/en-us/azure/data-explorer/kusto/query/sqlcheatsheet
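As a starting point, audit records written to Log Analytics typically land in the AzureDiagnostics table under the SQLSecurityAuditEvents category. This is a sketch; the projected column names follow the common suffix convention for that table but may differ in your workspace:

```kusto
// Pull the last 24 hours of SQL audit events across all databases
// sending data to this Log Analytics workspace.
AzureDiagnostics
| where Category == "SQLSecurityAuditEvents"
| where TimeGenerated > ago(24h)
| project TimeGenerated, database_name_s, server_principal_name_s,
          action_name_s, succeeded_s, statement_s
| order by TimeGenerated desc
```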
Extended events for Azure SQL Database
The way you set up xevents in Azure SQL Database is the same as in SQL Server. The main difference is that you need a storage account to store the event files. The following screenshot shows you how you can set up an xevent session with a query or in the GUI:
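Here is a sketch of what the script-based setup might look like. It assumes you have already created a database-scoped credential holding a SAS token for the target blob container; the session name, event choice, filter, and storage URL are all placeholders to adapt to your environment:

```sql
-- Sketch: a database-scoped xevent session that writes to Azure blob storage.
-- Prerequisite (not shown): a database-scoped credential whose name matches
-- the container URL and holds a SAS token with read/write/list rights.
CREATE EVENT SESSION [audit_rpc] ON DATABASE
ADD EVENT sqlserver.rpc_completed
    (ACTION (sqlserver.username, sqlserver.client_hostname)
     WHERE sqlserver.database_name = 'MyDatabase')
ADD TARGET package0.event_file
    (SET filename =
        'https://<storageaccount>.blob.core.windows.net/xevents/audit_rpc.xel');
GO

ALTER EVENT SESSION [audit_rpc] ON DATABASE STATE = START;
```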
Azure Managed Instance and Amazon Web Services Relational Database Service Auditing
These managed database options both have the same auditing options as a SQL Server running on a VM, except that they need to use cloud storage. The following screenshot shows you the setup for SQL Server Audit and Extended Events for Managed Instance and AWS RDS:
You may have to take security, networking, and other settings into consideration when getting your managed instance to communicate with blob storage and/or getting your RDS instance to communicate with S3.
Disclaimer on Auditing
Be very careful how and what you audit. You can overload or even freeze up a production server. It happened to me, and I didn’t even think it was possible to crash a production server on a VM with a SQL Server Audit. I assumed I could always stop the audit in between the things it was auditing. Sometimes it will be hard to stop your audit if it’s capturing too much, or there will be so much data to weed through that finding anything useful is like finding a needle in a haystack. I go with the less-is-more method of auditing, whether I’m using extended events or SQL Server Audit.
This is not as much of a concern with Azure SQL Database auditing in the portal. Even though Microsoft doesn’t allow you to filter your auditing data before it’s collected, it’s still a lightweight option for auditing. I would still use caution with auditing a very busy production database. That’s why it may be better to use xevents in that case, so you can audit only what you need.
When you are in charge of data, you must take that responsibility with the level of seriousness it truly requires. Guaranteeing specific levels of RTO and RPO is part of this responsibility, and you need to do everything in your power to make sure you can meet them. The cases and scenarios shared here can help you cover some gaps in your general strategy so that you are never caught off guard. And remember, even if you are using the managed Azure SQL Database cloud offering, you should still verify your configuration and test your procedures to make sure your RTO and RPO targets are being met. Thanks for reading!
Azure SQL Audit Overview
Azure SQL Extended Events
Azure SQL Managed Instance Auditing
AWS RDS Auditing