AVS SRM and Azure Functions Integration!
Ran into a fun little issue recently where we needed to run PowerShell scripts in the Azure VMware Solution (AVS) deployment of SRM. With how AVS is deployed, you have very limited permissions and little ability to run scripts, and zero administrative access to the SRM appliance.
The scenario: I want to run a PowerShell script to do a thing before storage sync occurs, and I want to do it from the AVS SRM appliance. CAN’T BE DONE?! I thought so too. With so little access to the SRM appliance in AVS, we have to trigger a script to run somewhere outside of it. The only other option was to rework a whole lot of other things, which really wasn’t viable. So, what to do?
Turns out it can be done! I used the following to make it work:
- Azure App Services
- Azure Functions
- cURL command!!
From the perspective of the SRM appliance, you have the ability to specify a manual command to run in a recovery step. But, again, since we can’t put scripts on the SRM appliance, we can’t run anything locally. Turns out, though, that the cURL command works here! Great!
Where to put the scripts then? Azure Functions!
I created an Azure Function backed by an App Service Plan running Windows and using PowerShell. With that setup I could use something similar to this to run one of those scripts from SRM:
/bin/curl -X POST "https://super-awesome-app.azurewebsites.net/api/<script_name>?code=<api code>"
This is super useful if you need to get stuff sorted out before storage sync happens in SRM. If you want to move around DNS records, send an email, do SQL log shipping… any number of things. As long as you can do it in Azure Functions, you can get it done. You could use Python, .NET, or whatever, too.
Passing Variables to Function App
Passing parameters can be done in two main ways. The first method uses query elements inline with the request.
/bin/curl -X POST "https://super-awesome-app.azurewebsites.net/api/<script name>?code=<api code>&Server=SQLServer1"
Or, if you have more complicated parameters to send, you can add a JSON body to the request. This portion is added as a separate element using the -d switch and specifying the content type:
/bin/curl -X POST "https://super-awesome-app.azurewebsites.net/api/<script name>?code=<api code>" -H "Content-Type: application/json" -d '{"UNCPath": "/path/name", "FileName": "file.txt"}'
In the above example we are calling the script with the API code and then specifying that we are also sending along some JSON in the body of the message. The JSON would look something like this:
{
"UNCPath": "/path/name",
"FileName": "file.txt"
}
In the PowerShell script, the above values are available to you via the $Request variable:
# This grabs an inline query variable
$VariableName = $Request.Query.QueryVariableName
# Use this to grab body elements, like the JSON we're sending above
$VariableName = $Request.Body.BodyVariableName
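If the same parameter might show up either way, you can check the query first and fall back to the body, the way the starter template does. A minimal sketch (VariableName is just a placeholder for whatever parameter you’re expecting):
# Prefer the inline query value, fall back to the JSON body
$VariableName = $Request.Query.VariableName
if (-not $VariableName) {
    $VariableName = $Request.Body.VariableName
}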
Building a Solution
So, to sum it up, here are the steps:
Step 1: Create an Azure Function App
A couple of things to note here:
- I’m using ‘Functions Premium’ because I want VNet integration and private endpoints. I don’t want to expose this over the public Internet, and I need to integrate with private DNS services. If you don’t need that, just go with the Consumption plan.
- The ‘Elastic Premium EP1’ plan is the lowest tier that has these options. It is really overkill for this application, but it was the only option that worked with the services I needed to integrate with.
- PowerShell isn’t a requirement at all… you can run anything you’d like. I was working with existing PowerShell, so there was no need to convert it to anything else.
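If you’d rather script the build-out than click through the portal, a rough Azure CLI sketch looks like this (resource names are placeholders, and it assumes an existing resource group and storage account):
# Create the Elastic Premium plan (EP1 is the smallest Premium SKU); Windows is the default OS
az functionapp plan create --resource-group my-rg --name my-premium-plan --location eastus --sku EP1
# Create the function app on that plan with the PowerShell runtime
az functionapp create --resource-group my-rg --name super-awesome-app --plan my-premium-plan --storage-account mystorageacct --runtime powershell --functions-version 4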
Step 2: DNS and Private Endpoint Integration (skip if you’re not interested in this part)
There’s a lot going on here, and there are different ways of going about it. Fortunately, you can do most of it from the same interface at one time, or you can connect all the dots later.
What you see here is inbound access being set up with Private DNS, so that when you use the FQDN of the app service you’ll get a private IP instead of the normal public endpoint.
*Note: this also requires that you have DNS infrastructure set up so your on-premises environments know how to resolve this information. Check out this post on integrating DNS into Azure.
The rest is just making sure there is an outbound source for the function to use. Keep in mind, the endpoints-outbound-subnet is a delegated subnet, which means it is dedicated to this function app… it’s locked down. You can’t add anything to it or change it in any meaningful way.
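For reference, the same wiring can be done from the CLI after the fact. A rough sketch, with placeholder names for the VNet and subnets (the private DNS zone for App Services is privatelink.azurewebsites.net):
# Inbound: a private endpoint for the function app
az network private-endpoint create --resource-group my-rg --name func-pe --vnet-name my-vnet --subnet endpoints-inbound-subnet --private-connection-resource-id $(az functionapp show -g my-rg -n super-awesome-app --query id -o tsv) --group-id sites --connection-name func-pe-conn
# Outbound: VNet integration into the delegated subnet
az functionapp vnet-integration add --resource-group my-rg --name super-awesome-app --vnet my-vnet --subnet endpoints-outbound-subnet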
Create a Function App
Creating a Function App is very straightforward. What you do with it can be very complex, but it gives you some good basic starter code to test with. First, create a function:
When you create the function, you’ll be prompted for a few things. In our case, we just want to use an HTTP trigger.
We will want to give this trigger a name and an authorization level. The authorization level lets you choose between requiring an API key or allowing anonymous access.
Important note: the function name is what you’re going to call in your curl command.
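For reference, the starter run.ps1 you get with a PowerShell HTTP trigger looks roughly like this (the paired function.json declares the trigger, and an authLevel of 'function' is what enforces the API key):
using namespace System.Net

# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)

Write-Host "PowerShell HTTP trigger function processed a request."

# Read a parameter from the query string, falling back to the body
$Name = $Request.Query.Name
if (-not $Name) {
    $Name = $Request.Body.Name
}

# Return an HTTP response to the caller
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Hello, $Name"
})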
A Few Other Notes
A few other things I’m doing:
- Using Azure Key Vault to store all credentials, and accessing them in the script as $env:VariableName. This is configured under “Configuration -> Application settings”.
- You can enter credentials there directly, or point them to values stored in Azure Key Vault. I recommend Key Vault (see the sketch after this list).
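As a sketch, a Key Vault-backed application setting looks something like this (the vault and secret names are placeholders, and the function app’s managed identity needs read access to the vault):
# Application setting (Configuration -> Application settings):
#   Name:  CREDENTIAL
#   Value: @Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/srm-credential/)

# In the function script, it's just an environment variable:
$SecuredPassword = ConvertTo-SecureString $Env:CREDENTIAL -AsPlainText -Force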
So, what does a script for this kind of thing look like? Here’s a very simple example that sends an email, passing in the name of the server:
using namespace System.Net

# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)

# Write to the Azure Functions log stream.
Write-Host "SendMail.ps1 trigger function processed a request."

# Interact with query parameters or the body of the request.
$ServerName = $Request.Query.Server
$Date = Get-Date

# Prep email
$From = "srm_email@contoso.com"
$To = "server_admin@contoso.com"
$SMTPServer = "SMTP.internal.contoso.com"
$Subject = "SRM Trigger Function Run"
$Body = "Things have happened on $ServerName as of $Date, and we think you'd like to know about it"

Send-MailMessage -To $To -Subject $Subject -From $From -Body $Body -SmtpServer $SMTPServer
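Kicking it off from the SRM recovery step then looks just like the earlier examples, passing the server name inline (assuming the function is named SendMail):
/bin/curl -X POST "https://super-awesome-app.azurewebsites.net/api/SendMail?code=<api code>&Server=SQLServer1"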
Now, yes… this does require that you have a server willing to accept SMTP on port 25 without authentication, but you get the general idea. Keep it internal, and whitelist the source IP range (the endpoints-outbound-subnet) for these functions (if you did the private DNS thing earlier).
This gets the job done, and you can see there is a lot of flexibility here. I’ve barely scratched the surface of what you can do with Functions, not to mention App Services in general.
Invoking Commands on Other Machines
Another thing that needed to happen was that some scripts required Invoke-Command to be run. This cmdlet runs a command on another Windows machine, using the session you are currently on as a proxy. The problem with this was permissions: Azure Functions isn’t an AD-joined entity, and a managed identity won’t sync back to on-premises AD (in this setup). So, enter Hybrid Connections.
Hybrid Connections allow you to run an agent on a remote VM (an on-premises VM in our case) and then run any kind of script on that VM as if you were there. So, what we needed here were nested Invoke-Command calls.
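One prerequisite worth calling out: the function connects to the hybrid connection host over WinRM HTTPS (port 5986), so that listener has to exist first. A rough sketch of setting it up on the host, assuming a self-signed certificate is acceptable (which is why the -SkipCACheck option shows up below):
# On the hybrid connection host: enable remoting and add an HTTPS listener
Enable-PSRemoting -Force
$Cert = New-SelfSignedCertificate -DnsName $env:COMPUTERNAME -CertStoreLocation Cert:\LocalMachine\My
New-Item -Path WSMan:\localhost\Listener -Transport HTTPS -Address * -CertificateThumbPrint $Cert.Thumbprint -Force

# Allow the inbound traffic
New-NetFirewallRule -DisplayName "WinRM HTTPS" -Direction Inbound -LocalPort 5986 -Protocol TCP -Action Allow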
First, in the PowerShell script in an Azure Function, we invoke a command pointing at the hybrid connection host. In that command, we reference a script we want the remote VM to run. In that script, we have ANOTHER Invoke-Command that points at the actual target server. Here’s an example:
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request happened."
# Interact with query parameters or the body of the request.
$Server = $Request.Query.Server
# Delegated Credentials for environment
$UserName = "contoso\admin"
$SecuredPassword = ConvertTo-SecureString $Env:CREDENTIAL -AsPlainText -Force
$Credential = [System.management.automation.pscredential]::new($UserName, $SecuredPassword)
# This is the name of the hybrid connection Endpoint.
$HybridEndpoint = "hybridendpoint01"
#################################################################################
# This is the remote script to run
$Script = {
    Param(
        [Parameter(Mandatory=$True)]
        $Credential,
        $Server
    )
    # This inner Invoke-Command runs on the hybrid connection host and targets the actual server
    Invoke-Command {Get-Service "VMTools" | Where-Object Status -eq 'Stopped' | Start-Service} -ComputerName $Server -Credential $Credential
}
#################################################################################

# Run the script block on the hybrid connection endpoint
Write-Output "Running command via Invoke-Command"
Invoke-Command -ComputerName $HybridEndpoint `
    -Credential $Credential `
    -Port 5986 `
    -UseSSL `
    -ScriptBlock $Script `
    -ArgumentList $Credential, $Server `
    -SessionOption (New-PSSessionOption -SkipCACheck)
What you see here is that we are using Invoke-Command and specifying a script to run ON the target ComputerName. This is all done from the perspective of the agent installed for the Hybrid Connection. In this case, $HybridEndpoint is going to be the hostname of the machine that the connection is installed on… so it is just referencing itself.
Remember, the inner Invoke-Command block runs AT the $HybridEndpoint, not in Azure Functions.
It’s a little complicated, but it allows us to use the permissions of the VM we’re connected to via the Hybrid Connection to achieve results elsewhere… like checking or changing the status of a service running on a VM, as in the example above.
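And the whole chain still kicks off from a single recovery-step command in SRM, just like before (the function name InvokeRemote here is a placeholder):
/bin/curl -X POST "https://super-awesome-app.azurewebsites.net/api/InvokeRemote?code=<api code>&Server=SQLServer1"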