As we all know from working with PowerShell, auditing is not a built-in feature and change tracking must be done manually. The purpose of this article is to walk through some scenarios where an appropriate level of auditing is achieved, along with several techniques to handle it. The solutions provided are not scenario-specific but can be customized to fit the need. Auditing is a broad term for what will be shown in this article; simply put, the script code below exports important tidbits of information to text files for later consumption.
When creating these text files, we can use the Out-File cmdlet, which writes information to files. We can build simple CSV files by providing a header line and then one comma-separated line per record. Below is a sample of such a CSV:
[Image: sample CSV with the data lines explained]
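For readers without the image, a minimal sketch of such a file (all names and addresses are hypothetical) could look like this:

DisplayName,Alias,PrimarySMTPAddress,UPN
Jane Doe,JDoe,Jane.Doe@contoso.com,Jane.Doe@contoso.com
John Smith,JSmith,John.Smith@contoso.com,JSmith@contoso.local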
Before we walk through either scenario, let's create basic log files for tracking and auditing purposes. First, we need to establish a base for these reporting files. For this, we will use a variable ($BasePath) to store the current path. A folder called 'Reports' will be used as the destination for any log files.
$BasePath = (Get-Item -Path ".\" -Verbose).FullName
$Path = $BasePath+"\"+"Reports"
Before storing any files in this folder, we will check for the existence of the folder. In case it does not exist, we create it:
if (Test-Path $Path -ErrorAction SilentlyContinue) {
    Write-Host "`nLogging directory exists - $($Path)`n"
} else {
    New-Item -ItemType Directory -Path $Path -Force
}
Additionally, we'll widen the screen buffer of the PowerShell session, so long lines are not wrapped or truncated when written to the log files.
$Host.UI.RawUI.BufferSize = New-Object Management.Automation.Host.Size (500, 9999)
To help keep track of when events occurred, we will use a variable to store a one-liner as a script block to invoke as needed:
$Date = { Get-Date -Format "MM.dd.yyyy-hh.mm" }
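Because $Date holds a script block rather than a fixed string, each call to $Date.Invoke() re-runs Get-Date, so every log entry carries the time of that moment. A quick sketch (the output shown is hypothetical):

$Date.Invoke()    # returns, for example: 09.30.2024-09.15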
Lastly, we'll establish our log file names:
$LogFileName = "ScriptExecutionLog.txt"
$LogDestination = $Path+"\"+$LogFileName
OK. On to the scenarios!
For our first scenario, we have a large company that is migrating from on-premises to Microsoft 365. The environment has a basic Active Directory configuration: a single forest with a single domain, with both forest and domain functional levels set to Windows Server 2012 R2. In terms of size, we are looking at over 40,000 users and thousands of groups.
Prior to migration, it was decided to match each user's primary SMTP address with their User Principal Name (UPN) to facilitate a good user experience. On review, we found that 14,000+ accounts did not match, and we would need to make changes. Through the grapevine, we had also heard that some apps might be restamping accounts with incorrect information (read: UPN changes). No apps were identified as relying on the UPN for login, but we wanted to keep track of all changes just in case a reversal would be needed.
First, we need to enter a timestamp for the start of the script (note that the date format shown is common in the US; international readers may need something different):
$Line = ' ' | Out-File $LogDestination -Append
$Line = "### START @ $($Date.Invoke()) ###" | Out-File $LogDestination -Append
$Line = ' ' | Out-File $LogDestination -Append
Out-File is the cmdlet that writes values to the log file we defined earlier.
For the first part of the log, we define our new log file and the header line of the file (think column headers in a CSV):
$BeforeFileName = "BeforeChanges.txt"
$BeforeDestination = $Path+"\"+$BeforeFileName
$FileHeader = 'DisplayName,Alias,PrimarySMTPAddress,UPN' | Out-File $BeforeDestination
Then, for each user, we take the values of the user's Display Name, Alias, Primary SMTP Address and User Principal Name (UPN):
$Line = "$DisplayName,$Alias,$PrimarySMTPAddress,$UPN" | Out-File $BeforeDestination -Append
Notice that the line is appended to the file since we are adding each line to the file and do not want to overwrite the file. We can also be more comprehensive and log each user processed:
$Output = "The mailbox $DisplayName was processed." | Out-File $LogDestination -Append
** Note that which variable is used ($Output or $Line) is inconsequential; we can choose any valid PowerShell variable name we wish.
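For context, here is a minimal sketch of the surrounding loop, assuming the mailboxes are gathered with Get-Mailbox (the $Users collection variable is an assumption, not from the original script):

$Users = Get-Mailbox -ResultSize Unlimited
ForEach ($User in $Users) {
    # Pull the property values used for the audit line
    $DisplayName = $User.DisplayName
    $Alias = $User.Alias
    $PrimarySMTPAddress = $User.PrimarySMTPAddress
    $UPN = $User.UserPrincipalName
    $Line = "$DisplayName,$Alias,$PrimarySMTPAddress,$UPN" | Out-File $BeforeDestination -Append
    $Output = "The mailbox $DisplayName was processed." | Out-File $LogDestination -Append
}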
Next, we have a section of code that deals with changing the user's UPN. The code is set up as a Try {} Catch {} block so that we can log successful changes (in the Try {} block) or failures (in the Catch {} block).
If ($UPN -ne $PrimarySMTP) {
    Try {
        Set-Mailbox $User -UserPrincipalName $NewUPN -ErrorAction STOP
        $Line = "$($Date.Invoke()) , Successfully changed the UPN to $PrimarySMTP to the correct matching value for $User." | Out-File $LogDestination -Append
    } Catch {
        $Line = "$($Date.Invoke()) , Failed to set the UPN to $PrimarySMTP to the correct matching value for $User." | Out-File $LogDestination -Append
        $Line = "$($Date.Invoke()) , Error message - $($_.Exception.Message)" | Out-File $LogDestination -Append
    }
} Else {
    $Line = "$($Date.Invoke()) , UPN for $User is correct, no change needed." | Out-File $LogDestination -Append
}
In the middle, notice the $($_.Exception.Message) expression. This captures any PowerShell error that normally would have been displayed on screen and copies it to the log file for future examination. The subexpression syntax $() is required so the full property path expands inside the double-quoted string; "$_.Exception.Message" would expand only $_ and append the rest as literal text.
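As a quick standalone illustration of that difference (the path is hypothetical and simply chosen to force an error):

Try {
    Get-Item "C:\DoesNotExist" -ErrorAction Stop
} Catch {
    # $() is required; "$_.Exception.Message" would expand only $_
    "Error message - $($_.Exception.Message)"
}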
This code will look exactly like the section we used to log initial settings, except our destination file will be different, since we need a before and after file:
$AfterFileName = "AfterChanges.txt"
$AfterDestination = $Path+"\"+$AfterFileName
$FileHeader = 'DisplayName,Alias,PrimarySMTPAddress,UPN' | Out-File $AfterDestination
$Line = "$DisplayName,$Alias,$PrimarySMTPAddress,$UPN" | Out-File $AfterDestination -Append
Lastly, we will again add a timestamp for the end of the script.
$Line = ' ' | Out-File $LogDestination -Append
$Line = "### END @ $($Date.Invoke()) ###" | Out-File $LogDestination -Append
$Line = ' ' | Out-File $LogDestination -Append
Below is a sample log file with a Start/End timestamp as well as a log of events in the middle:
[Image: sample log file showing START and END timestamps]
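Since the screenshot is not reproduced here, a hypothetical reconstruction of its contents might look like this (names and timestamps are invented):

### START @ 09.30.2024-09.15 ###

09.30.2024-09.15 , UPN for Jane Doe is correct, no change needed.
09.30.2024-09.16 , Successfully changed the UPN to John.Smith@contoso.com to the correct matching value for John Smith.

### END @ 09.30.2024-09.42 ###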
On to the next scenario!
For this scenario, we have an Exchange-to-Exchange Online migration moving over 10,000 mailboxes from multiple geographic regions to a single Exchange Online tenant. While moving these users, administrators are utilizing a script (like the ones previously written about here and here) for the migrations. To further enhance that script, we can use PowerShell to generate logs for its critical steps.
This is a summary of the logging we can perform:
Write to a log every action that occurs, as well as a start and stop date for script execution.
Log errors - capture the errors PowerShell would have displayed on screen.
At the start of the script, we write an appropriate header to the file (if it does not already exist), followed by a START timestamp:
$FileName = "MailboxMoveAdministrationLogging.txt"
$Destination = $Path+"\"+$FileName
$FileCheck = Test-Path $Destination
If (-not($FileCheck)) {
    $Line = "This file logs all changes made to move requests in this script" | Out-File $Destination
    $Line = "---------------------------------------------------------------" | Out-File $Destination -Append
    $Line = " " | Out-File $Destination -Append
}
$Line = ' ' | Out-File $Destination -Append
$Line = "### START @ $($Date.Invoke()) ###" | Out-File $Destination -Append
$Line = ' ' | Out-File $Destination -Append
For this function, the logging simply exports the entire Report value, which provides a detailed analysis of a mailbox move to the cloud and is used to determine why a move has failed or is paused. Below, we grab the mailbox's primary SMTP address, specify an output file (specific to the mailbox move we are querying), grab the mailbox's report, and then export that entire report to the file we specified in the second line:
$Mailbox = $MoveRequest.PrimarySMTPAddress
$OutputFile = "$Path"+"\"+"$Mailbox"+"-Report.txt"
$Report = (Get-MoveRequestStatistics -Identity $Mailbox -IncludeReport).Report
$Report.Entries | Format-Table | Out-File -FilePath $OutputFile
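To give this some context, here is a hedged sketch of running that export for every failed move request; the surrounding loop and the -MoveStatus scoping are assumptions, not part of the original script:

# Export a detailed move report for each failed move request
ForEach ($MoveRequest in (Get-MoveRequest -MoveStatus Failed)) {
    $Mailbox = $MoveRequest.PrimarySMTPAddress
    $OutputFile = "$Path"+"\"+"$Mailbox"+"-Report.txt"
    $Report = (Get-MoveRequestStatistics -Identity $Mailbox -IncludeReport).Report
    $Report.Entries | Format-Table | Out-File -FilePath $OutputFile
}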
For this part of the script, we create a new move request for a mailbox migration, again using a Try {} Catch {} block. If the creation succeeds, then this is logged and if there is a failure, the error message is logged as well:
$Mailbox = $User.PrimarySMTPAddress
If ($Null -eq (Get-MoveRequest $Mailbox -ErrorAction SilentlyContinue)) {
    Try {
        $Creation = New-MoveRequest -Identity $Mailbox -Remote -RemoteHostName $Endpoint -TargetDeliveryDomain $TargetDomain -RemoteCredential $OPCred -SuspendWhenReadyToComplete -ErrorAction STOP
        $Line = "The mailbox move for $Mailbox was successfully created." | Out-File $Destination -Append
    } Catch {
        $Line = "The mailbox move for $Mailbox was NOT successfully created." | Out-File $Destination -Append
        $Line = "$($Date.Invoke()) , Error message - $($_.Exception.Message)" | Out-File $Destination -Append
    }
}
In this next section, we have a code block that removes move requests, perhaps because a user's move needs to wait, or the user is leaving the company and is being removed from the current set of moves. The process is similar to creating a new move request, shown previously:
Write-host "Removing move request for " -ForegroundColor White -NoNewline
Write-host "$SMTPAddress....." -ForegroundColor Yellow
Try {
Remove-MoveRequest $SMTPAddress -Confirm:$False -ErrorAction STOP
$Line = "$($Date.Invoke()) , Remove move request for $SMTPAddress succeeded." | Out-File $Destination -Append
} Catch {
$Line = "$($Date.Invoke()) , Remove request for $SMTPAddress failed." | Out-File $Destination -Append
$Line = "$($Date.Invoke()) , Error message - $_.Exception.Message" | Out-File $Destination -Append
}
[Image: a log file with START, END and actions logged]
[Image: a log file with START, END and, in this case, errors logged]
What scenarios can you come up with? Would you create separate logs for good and bad results? Another possible use of Out-File is documentation - for example, daily mailbox statistics for all users in Exchange Online. In other words, there are many possibilities, so go forth and log, audit, and more with PowerShell.
The number 10,000 is important in both scenarios. Exchange cmdlets return only the first 1,000 results by default. Since both environments are large, using -ResultSize Unlimited with these cmdlets is the only way to get complete results. Another performance tip for large environments: if a cmdlet has a 'Filter' parameter, use it before resorting to 'Where-Object' to filter your results. Filtering with 'Where-Object' in a large environment can significantly increase a script's processing time.
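As a hedged illustration of the difference (DisplayName is one of the filterable properties; the specific pattern is hypothetical):

# Server-side filter: only matching mailboxes are returned (fast)
Get-Mailbox -ResultSize Unlimited -Filter "DisplayName -like 'A*'"

# Client-side filter: every mailbox is retrieved first, then filtered locally (slow at scale)
Get-Mailbox -ResultSize Unlimited | Where-Object { $_.DisplayName -like 'A*' }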
Governance and compliance readiness increasingly require complete traceability of all processes, and the use of PowerShell scripts is no exception. With an external database, you can prove which scripts were executed on which system with which parameters, what the results were, and so on, over the required periods of time - no matter whether you use one or more ScriptRunner hosts.
The ScriptRunner Report/Audit DB Connector connects the ScriptRunner host to a database on a Microsoft SQL Server. If you run multiple ScriptRunner hosts, they can all write to the same database. This means you have all reporting information across all systems in a single database.
If a report was generated by an action, it is first written to the circulation database on the ScriptRunner host. The connector also generates an XML file whose contents are automatically transferred to the database on the SQL server. A restart after errors ensures that no report is lost and that all data generated in the meantime is stored in the database.
Damian Scoles is a ten-time Microsoft MVP specializing in Exchange, Office 365 and PowerShell who has 25 years of IT industry experience. He is based in the Chicago area and started out managing Exchange 5.5 and Windows NT. Over the years he has worked with Office 365 since BPOS, and his experience has grown to include Azure AD, the Security and Compliance admin centers, and Exchange Online. His community outreach includes contributing to TechNet forums, creating PowerShell scripts that can be found on his blogs, writing in-depth PowerShell / Office 365 / Exchange blog articles, tweeting, and creating PowerShell videos on YouTube. He has written five PowerShell books and is also actively working on the book "Microsoft 365 Security for IT Pros".