Automation Accounts are a powerful tool for operations, security, or any team that needs to manage Azure resources and maintain consistency in a Microsoft Azure environment. Using Automation Accounts, we can define and handle a number of processes with automation. The beauty of Azure Automation is the ability to connect with other Azure products such as Key Vault, Storage Accounts, and Azure Functions, to name a few. In this article, we're going to work through a simple scenario where we use an Automation Account to create a report of all virtual machines per resource group. The script will generate a JSON file and save it to a Storage Account.
The script being launched right here is to have details about the configuration of all VMs within the Microsoft Azure. It’s helpful in environments with a lot of modifications all through the lifecycle of the VMs.
The idea is to take the framework that we build in this article and adapt it to your environment, providing a more secure and modular setup for the operations/security/infrastructure teams that are responsible for managing Azure resources.
Here are some key points that we're going to use:
- We're going to take advantage of Automation Account variables to retrieve the Storage Account and Key Vault being used for this solution.
- We're going to create a one-hour SAS token to be able to save the files into the Storage Account.
- The goal is to avoid any password in the code.
Creating the environment and the Automation Account
The first step is to create a resource group to hold the resources that are going to be used by our operations/security team. For this article, we will be creating a resource group called AP-Operations in the Canada Central region.
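If you prefer the command line over the portal, the same resource group can be created with a single Azure PowerShell cmdlet (a minimal sketch; it assumes you are already signed in with Connect-AzAccount):

```powershell
# Create the resource group used throughout this article,
# in the Canada Central region
New-AzResourceGroup -Name "AP-Operations" -Location "canadacentral"
```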
The second step is to create an Azure Automation Account. Search for Automation Account. In the new blade, click on Add and fill out all the required information. Make sure to select Yes for Create Azure Run As Account.
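The Automation Account can also be created from PowerShell. A sketch is shown below; the account name AA-CanC-Operations matches the one used later in the Runbook script. Note that the Run As account is created by the portal wizard, not by this cmdlet:

```powershell
# Create the Automation Account in the AP-Operations resource group
# (the Run As account must still be configured separately)
New-AzAutomationAccount -Name "AA-CanC-Operations" `
    -ResourceGroupName "AP-Operations" `
    -Location "canadacentral"
```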
To save time, we're going to pre-populate some of the configuration that we need in Azure Automation, namely the variables. For now, let's create a single variable called StorageAccount. In the value, add TBD, and we will come back here to add the actual value later.
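The same variable can be created from PowerShell (a sketch under the names used in this article):

```powershell
# Create the StorageAccount variable with a TBD placeholder value.
# -Encrypted $false keeps the value readable in the portal;
# use $true for anything sensitive.
New-AzAutomationVariable -AutomationAccountName "AA-CanC-Operations" `
    -ResourceGroupName "AP-Operations" `
    -Name "StorageAccount" `
    -Value "TBD" `
    -Encrypted $false
```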
We're going to use the new Az modules in our new Runbooks. We have an article describing all the ins and outs of this process; you can find more information here. For now, we're going to import the Az.Accounts, Az.Resources, Az.Automation, and Az.Storage modules into our Automation Account.
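Importing the modules can also be scripted. The sketch below pulls each module from the PowerShell Gallery; one caveat is that Az.Accounts should finish importing before the others, since they depend on it:

```powershell
# Import the Az modules into the Automation Account from the
# PowerShell Gallery (import Az.Accounts first and wait for it
# to complete before the dependent modules)
$modules = "Az.Accounts", "Az.Resources", "Az.Automation", "Az.Storage"
foreach ($module in $modules) {
    New-AzAutomationModule -AutomationAccountName "AA-CanC-Operations" `
        -ResourceGroupName "AP-Operations" `
        -Name $module `
        -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/$module"
}
```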
Creating a Storage Account
A new Storage Account will be created to support the Automation Account. To create a new one, search for Storage Accounts in the Azure Portal, and click on Add.
The first step in the new Storage Account is to create a container for each Runbook. Since we're planning to create our first one, we will create a new container called vminventory.
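A quick PowerShell sketch of the same step (the Storage Account name is a placeholder for the one you created above):

```powershell
# Retrieve the Storage Account and create the vminventory container.
# -Permission Off keeps the container private (no anonymous access).
$stg = Get-AzStorageAccount -ResourceGroupName "AP-Operations" `
    -Name "<storage-account-name>"
New-AzStorageContainer -Name "vminventory" -Context $stg.Context -Permission Off
```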
This Storage Account is for our daily operations, and our goal is to keep the consumption low and retain the files in the Storage Account for 60 days. We could create a Runbook to validate the files and purge anything older than 60 days. However, the Storage Account has such a feature built in.
In the Lifecycle Management item, click on Add rule, assign a name, select delete blob, and define the number of days desired (in our article, it is going to be 60).
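The equivalent lifecycle rule can be defined from PowerShell with the management-policy cmdlets in Az.Storage (a sketch; the Storage Account name is a placeholder):

```powershell
# Build a lifecycle rule that deletes block blobs 60 days after
# their last modification, then apply it to the Storage Account
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction Delete `
    -DaysAfterModificationGreaterThan 60
$filter = New-AzStorageAccountManagementPolicyFilter -BlobType blockBlob
$rule = New-AzStorageAccountManagementPolicyRule -Name "purge-after-60-days" `
    -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName "AP-Operations" `
    -StorageAccountName "<storage-account-name>" -Rule $rule
```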
Running your Runbook and connecting all the dots
The complete code of the script is shared in my GitHub area. However, we're going to explore just the code that connects with the new features being introduced in this article, which are variables and Storage Accounts.
The first task is to retrieve the Storage Account name from the Automation Account variables. To retrieve any data, we must know the Automation Account name and the resource group. We're going to use variables in the script for these two pieces of information. On the third line, we retrieve the value of the StorageAccount variable.
$vResourceGroupname = "AP-Operations"
$vAutomationAccountName = "AA-CanC-Operations"
$vaStorageAccount = Get-AzAutomationVariable -AutomationAccountName $vAutomationAccountName -Name "StorageAccount" -ResourceGroupName $vResourceGroupname
Now that we have the Storage Account name, we're going to create a time window of one hour using $StartTime (the current time) and $EndTime (one hour from now). We're going to retrieve the Storage Account into the $stgAccount variable and create a SAS token using all the information gathered up to this point. Finally, we're going to use the variable $stgcontext to create the connection to the Storage Account using the SAS token.
$StartTime = Get-Date
$EndTime = $StartTime.AddHours(1.0)
$stgAccount = Get-AzStorageAccount -Name $vaStorageAccount.Value -ResourceGroupName $vResourceGroupname
$SASToken = New-AzStorageAccountSASToken -Service Blob -ResourceType Container,Object -Permission "racwdlup" -StartTime $StartTime -ExpiryTime $EndTime -Context $stgAccount.Context
$stgcontext = New-AzStorageContext -StorageAccountName $stgAccount.StorageAccountName -SasToken $SASToken
To save files into the Storage Account, we can run the following cmdlet, where $vFileName is the name of the file in the current file system, vminventory is the container name, and -Force overwrites any existing file with the same name.
$tmp = Set-AzStorageBlobContent -File $vFileName -Container vminventory -Context $stgcontext -Force
Validating the script in action
You can copy and paste the content of the script from the GitHub link above. The first run should be done using the test pane, and the results should be similar to the image below.
Since we're using the Storage Account as a repository for our scripts' output, we should check it to validate that we can see the files being saved there. We can see the content of the JSON file using the Edit tab, as depicted in the image below.
Featured image: Shutterstock