Channel: SQL Server – SQL DBA with A Beard

PowerShell CMDLets added for SQL2016 Always Encrypted


The post on the SQLServer blog at TechNet by the SQL Server Tools Team today made me jump out of my seat.

The July update for SSMS includes the first substantial improvement in SQL PowerShell in many years. We owe a lot of thanks for this effort to the great collaboration with our community. We have several new CMDLETs to share with you

In one release there are twenty-five new CMDLets for the new sqlserver module.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

So SQLPS will still continue to work, but it will not be updated and will not contain the new CMDlets or any future ones.
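
A minimal sketch of the change at the top of an existing script (assuming the July SSMS update is installed):

# Swap the old module for the new one (a sketch; module names as above)
if (Get-Module -Name SQLPS) { Remove-Module SQLPS }
Import-Module SqlServer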

So what new things do we have?

This month we introduce CMDLETs for the following areas:

  • Always Encrypted
  • SQL Agent
  • SQL Error Logs

Chrissy LeMaire has written about the new SQL Agent cmdlets

Aaron Nelson has written about the new Get-SqlErrorLog cmdlet

Laerte Junior has written about Invoke-SQLCmd

All four of us will be presenting a webinar on the new CMDlets via the PowerShell Virtual Chapter on Wed, Jul 06 2016 12:00 Eastern Daylight Time. If you can't make it, a recording will be made available on YouTube on the VC Channel https://sqlps.io/video

Always Encrypted CMDlets

That leaves the Always Encrypted CMDLets and there are 17 of those!

  • Add-SqlColumnEncryptionKeyValue – Adds a new encrypted value for an existing column encryption key object in the database.
  • Complete-SqlColumnMasterKeyRotation – Completes the rotation of a column master key.
  • Get-SqlColumnEncryptionKey – Returns all column encryption key objects defined in the database, or returns one column encryption key object with the specified name.
  • Get-SqlColumnMasterKey – Returns the column master key objects defined in the database, or returns one column master key object with the specified name.
  • Invoke-SqlColumnMasterKeyRotation – Initiates the rotation of a column master key.
  • New-SqlAzureKeyVaultColumnMasterKeySettings – Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in Azure Key Vault.
  • New-SqlCngColumnMasterKeySettings – Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store supporting the Cryptography Next Generation (CNG) API.
  • New-SqlColumnEncryptionKey – Creates a new column encryption key object in the database.
  • New-SqlColumnEncryptionKeyEncryptedValue – Produces an encrypted value of a column encryption key.
  • New-SqlColumnEncryptionSettings – Creates a new SqlColumnEncryptionSettings object that encapsulates information about a single column's encryption, including CEK and encryption type.
  • New-SqlColumnMasterKey – Creates a new column master key object in the database.
  • New-SqlCspColumnMasterKeySettings – Creates a SqlColumnMasterKeySettings object describing an asymmetric key stored in a key store with a Cryptography Service Provider (CSP) supporting the Cryptography API (CAPI).
  • Remove-SqlColumnEncryptionKey – Removes the column encryption key object from the database.
  • Remove-SqlColumnEncryptionKeyValue – Removes an encrypted value from an existing column encryption key object in the database.
  • Remove-SqlColumnMasterKey – Removes the column master key object from the database.
  • Set-SqlColumnEncryption – Encrypts, decrypts or re-encrypts specified columns in the database.
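
As a rough sketch of how some of these might hang together (the parameter names and values here are my guesses from the cmdlet names and early documentation, so treat them as illustrative rather than definitive):

# A sketch only – server, database, provider and key names are illustrative
Import-Module SqlServer

# Get the database that will hold the key metadata
$db = Get-SqlDatabase -ServerInstance 'SQL2016N1' -Name 'AdventureWorks2016'

# Describe a column master key held in a CNG key store
$cmkSettings = New-SqlCngColumnMasterKeySettings -CngProviderName 'Microsoft Software Key Storage Provider' -KeyName 'AlwaysEncryptedCMK'

# Create the column master key object in the database
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $db -ColumnMasterKeySettings $cmkSettings

# Create a column encryption key protected by that master key
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $db -ColumnMasterKey 'CMK1'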
 

 

That seems to cover setting up Always Encrypted with PowerShell, removing it and getting information about it. When the new SSMS update is dropped you will be able to start using all of this new functionality.

Just remember Import-Module sqlserver

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets


 

We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items. With 150+ members in a few days, it already seems a good place for people to get help. You can get an invite here https://sqlps.io/slack

Come and join us



Using the new SQLServer PowerShell module to get SQL Agent Job Information


So with the July release of SSMS everything changed for using PowerShell with SQL. You can read the details here. As I mentioned in my previous post, the name of the module has changed to sqlserver.

This means that if you have a PowerShell script doing Import-Module SQLPS, it will need to be changed to be Import-Module SqlServer in order to take advantage of the new provider functionality and new CMDLETs. The new module will be installed to “%Program Files\WindowsPowerShell\Modules\SqlServer” and hence no update to $env:PSModulePath is required.

You can download the latest SSMS release here. Once you have installed it and rebooted, you can start to look at the new PowerShell CMDlets.

 Import-module sqlserver

Take a look at the cmdlets

 Get-command -module sqlserver

Today I want to look at agent jobs

 Get-command *sqlagent*


So I decided to see how to gather the information I collect for the DBADatabase, as described here.

This is the query I use to insert the data for the server level agent job information.

 $Query = @"
INSERT INTO [Info].[AgentJobServer]
 ([Date]
 ,[InstanceID]
 ,[NumberOfJobs]
 ,[SuccessfulJobs]
 ,[FailedJobs]
 ,[DisabledJobs]
 ,[UnknownJobs])
 VALUES
 (GetDate()
 ,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
 ,'$JobCount'
 ,'$successCount'
 ,'$failedCount'
 ,'$JobsDisabled'
 ,'$UnknownCount')
"@

So Get-SQLAgentJob looks like the one I need. Let's take a look at the help. This should be the starting point whenever you use a new cmdlet.

 Get-Help Get-SqlAgentJob -Full

Which states

Returns a SQL Agent Job object for each job that is present in the target instance of SQL Agent.

That sounds like it will meet my needs. Let's take a look.

Get-SqlAgentJob -ServerInstance $Connection|ft -AutoSize


I can get the information I require like this


$JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count

NOTE – That code is for PowerShell V4 and V5. If you are using an earlier version of PowerShell you would need to use

$JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$successCount = (Get-SqlAgentJob -ServerInstance $Connection|Where-Object {$_.LastRunOutcome -eq 'Succeeded'}).Count
$failedCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object {$_.LastRunOutcome -eq 'Failed'}).Count
$JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.IsEnabled -eq $false}).Count
$UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection |Where-Object{$_.LastRunOutcome -eq 'Unknown'}).Count

But to make the code more performant it is better to do this

 [pscustomobject]$Jobs= @{}
$Jobs.JobCount = (Get-SqlAgentJob -ServerInstance $Connection ).Count
$Jobs.successCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Succeeded'}.Count
$Jobs.failedCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Failed'}.Count
$Jobs.JobsDisabled = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.IsEnabled -eq $false}.Count
$Jobs.UnknownCount = (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.LastRunOutcome -eq 'Unknown'}.Count
$Jobs


Using Measure-Command showed that this completed in
TotalSeconds : 0.9889336
Rather than
TotalSeconds : 2.9045701
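
The gap would likely widen further if the jobs were fetched only once and the collection reused. A sketch of that variant, using the same $Connection as above:

# Fetch the jobs once, then build all five counts from the one collection
$AgentJobs = Get-SqlAgentJob -ServerInstance $Connection
$Jobs = [pscustomobject]@{
    JobCount     = $AgentJobs.Count
    SuccessCount = $AgentJobs.Where{$_.LastRunOutcome -eq 'Succeeded'}.Count
    FailedCount  = $AgentJobs.Where{$_.LastRunOutcome -eq 'Failed'}.Count
    JobsDisabled = $AgentJobs.Where{$_.IsEnabled -eq $false}.Count
    UnknownCount = $AgentJobs.Where{$_.LastRunOutcome -eq 'Unknown'}.Count
}
$Jobs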

Note that

 (Get-SqlAgentJob -ServerInstance $Connection ).where{$_.Enabled -eq $false}.Count

Does not work. I had to check the properties using

 Get-SqlAgentJob -ServerInstance $Connection |Get-Member -Type Properties

Which showed me

IsEnabled Property bool IsEnabled {get;set;}

So I tested this against the various SQL versions I had in my lab using this code

$Table = $null
$Table = New-Object System.Data.DataTable "Jobs"
$Col1 = New-Object System.Data.DataColumn ServerName,([string])
$Col2 = New-Object System.Data.DataColumn JobCount,([int])
$Col3 = New-Object System.Data.DataColumn SuccessCount,([int])
$Col4 = New-Object System.Data.DataColumn FailedCount,([int])
$Col5 = New-Object System.Data.DataColumn DisabledCount,([int])
$Col6 = New-Object System.Data.DataColumn UnknownCount,([int])

$Table.Columns.Add($Col1)
$Table.Columns.Add($Col2)
$Table.Columns.Add($Col3)
$Table.Columns.Add($Col4)
$Table.Columns.Add($Col5)
$Table.Columns.Add($Col6)
foreach ($ServerName in $DemoServers)
{
    ## $ServerName
    $InstanceName = $ServerName | Select-Object -ExpandProperty InstanceName
    $Port = $ServerName | Select-Object -ExpandProperty Port
    $ServerName = $ServerName | Select-Object -ExpandProperty ServerName
    $Connection = $ServerName + '\' + $InstanceName + ',' + $Port
    try
    {
        $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
    }
    catch
    {
        "Failed to connect to $Connection"
    }
    if (!($srv.version))
    {
        "Failed to Connect to $Connection"
        continue
    }
    # Fetch the jobs once per server and reuse the collection
    $JobHistory = Get-SqlAgentJob -ServerInstance $Connection
    $Row = $Table.NewRow()
    $Row.ServerName = $ServerName
    $Row.JobCount = $JobHistory.Count
    $Row.SuccessCount = $JobHistory.where{$_.LastRunOutcome -eq 'Succeeded'}.Count
    $Row.FailedCount = $JobHistory.where{$_.LastRunOutcome -eq 'Failed'}.Count
    $Row.DisabledCount = $JobHistory.where{$_.IsEnabled -eq $false}.Count
    $Row.UnknownCount = $JobHistory.where{$_.LastRunOutcome -eq 'Unknown'}.Count
    $Table.Rows.Add($row)
}
$Table|ft
Here are the results

I also had a look at Get-SQLAgentJobHistory. Let's take a look at the help.

Get-help get-SQLAgentJobHistory -showwindow

DESCRIPTION

Returns the JobHistory present in the target instance of SQL Agent.

This cmdlet supports the following modes of operation to return the JobHistory:

  1. By specifying the Path of the SQL Agent instance.
  2. By passing the instance of the SQL Agent in the input.
  3. By invoking the cmdlet in a valid context.

So I ran

Get-SqlAgentJobHistory -ServerInstance sql2014ser12r2 

And got back a whole load of information: every job history available on the server. Too much to look at immediately to work out what to do.

So I looked at just one job

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive'

And got back the last month's worth of history for that one job, as that is the schedule used to purge the job history for this server. So then I added -Since Yesterday to only get the last 24 hours of history.

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday


The Since Parameter is described as

-Since <SinceType>

A convenient abbreviation to avoid using the -StartRunDate parameter.
It can be specified with the -EndRunDate parameter.

Do not specify a -StartRunDate parameter, if you want to use it.

Accepted values are:
– Midnight (gets all the job history information generated after midnight)
– Yesterday (gets all the job history information generated in the last 24 hours)
– LastWeek (gets all the job history information generated in the last week)
– LastMonth (gets all the job history information generated in the last month)

When I run

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |Measure-Object

I get

Count : 3

And if I run

Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -JobName 'DatabaseBackup - SYSTEM_DATABASES - FULL - Local G Drive' -Since Yesterday |select RunDate,StepID,Server,JobName,StepName,Message|Out-GridView

I get


Which matches the view I see in SSMS Agent Job History


So Get-SqlAgentJobHistory will enable you to use PowerShell to gather information about the job history for each step of your Agent Jobs, including the message, which I can see being very useful.

Come and join us in the SQL Community Slack to discuss these CMDLets and all things SQL Community https://sqlps.io/slack

CALL TO ACTION

Microsoft are engaging with the community to improve the tools we all use in our day to day work. There are two Trello boards set up for YOU to use to contribute

https://sqlps.io/vote for the SQLPS/sqlserver PowerShell module

https://sqlps.io/ssms for SSMS

Go and join them and upvote YOUR preferred choice of the next lot of CMDlets


We have also set up a SQL Community Slack for anyone in the community to discuss all things related to SQL, including the Trello board items. With 150+ members in a few days, it already seems a good place for people to get help. You can get an invite here https://sqlps.io/slack

Come and join us


Converting SQL Agent Job Duration to TimeSpan using PowerShell


When you look in msdb for the SQL Agent Job duration you will find that it is an int.


This is also the same when you look at Get-SQLAgentJobHistory from the sqlserver module. (You can get this by downloading the latest SSMS release from here)


This means that when you look at the various durations of the Agent Jobs you get something like this


The first job took 15 hours 41 minutes 53 seconds, the second 1 minute 25 seconds, the third 21 seconds. This makes it quite tricky to calculate the duration in a suitable datatype. In T-SQL people use scripts like the following from MSSQLTips.com

((run_duration/10000*3600 + (run_duration/100)%100*60 + run_duration%100 + 31 ) / 60) as 'RunDurationMinutes'

I needed more information than the number of minutes so I have this which will convert the Run Duration to a timespan

$FormattedDuration = @{Name = 'FormattedDuration' ; Expression = {[timespan]$_.RunDuration.ToString().PadLeft(6,'0').Insert(4,':').Insert(2,':')}}


So how did I get to there?

First I tried to just convert it. In PowerShell you can define a datatype in square brackets and PowerShell will try to convert it


It did its best, but it converted it to ticks! So we need to convince PowerShell that this is a proper timespan. First we need to convert the run duration to a standard length. You can use the PadLeft method of a string to do this, which will ensure that a string has a given length, preceding the current string with a character you choose until it is that length.

Let's have a length of 6 and preceding zeros: PadLeft(6,'0')


But this works only if it is a string!! Remember red text is useful; it will often contain the information you need to resolve your error. Luckily there is a method to turn an int into a string. I am using the ForEach method to demonstrate, as in the sketch below.
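
Something like this sketch (the server name is from my lab; substitute your own):

# Every RunDuration becomes a six-character, zero-padded string
$history = Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2
$history.ForEach{$_.RunDuration.ToString().PadLeft(6,'0')}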


Now every string is 6 characters long starting with zeros. So all that is left is to format this with colons to separate the hours and minutes and the minutes and seconds. We can do this with the insert method. You can find out the methods using Get-Member or its alias gm


So the Insert method takes an int for the start index and a string value to insert


There we go; now we have some properly formatted timespans, however they are still strings. We can then convert them using [timespan]. Now we can format the results within the select by using an expression, as in the sketch below.
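
Pulling it all together, a sketch (the server name and the selected properties are illustrative):

$FormattedDuration = @{
    Name       = 'FormattedDuration'
    # HHMMSS int -> zero-padded string -> 'HH:MM:SS' -> [timespan]
    Expression = { [timespan]$_.RunDuration.ToString().PadLeft(6,'0').Insert(4,':').Insert(2,':') }
}
Get-SqlAgentJobHistory -ServerInstance SQL2014Ser12R2 -Since LastWeek |
    Select-Object JobName, RunDate, StepName, $FormattedDuration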


and as you can see it is a timespan now


On a slight side note. I needed the durations for Agent Jobs with a certain name within the last 6 days.


I did this by passing an array of servers (which I got from my dbareports database) to Get-SqlAgentJobHistory. I then used the Where method to filter for JobName and the job outcome step of the history, and compared the RunDate property to Get-Date (today) minus 6 days using the AddDays method 🙂 A sketch of that is below.
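
Assuming the job outcome row is the one with StepID 0, and with an illustrative job name:

# Sketch: $Servers from dbareports, $FormattedDuration as defined above
$History = Get-SqlAgentJobHistory -ServerInstance $Servers -JobName 'DatabaseBackup - USER_DATABASES - FULL'
$History.Where{$_.StepID -eq 0 -and $_.RunDate -gt (Get-Date).AddDays(-6)} |
    Select-Object Server, JobName, RunDate, $FormattedDuration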

Hopefully this will be of use to people, and I also have it recorded for the next time I need to do it 🙂

 

 


PowerShell, Pester and Ola Hallengren's Maintenance Solution

If you are a SQL DBA you will have heard of Ola Hallengren's Maintenance Solution. If you haven't, go and click the link and look at the easiest way to ensure that all of your essential database maintenance is performed. You can also watch a video from Ola at SQL Bits.
Recently I was thinking about how I could validate that this solution was installed in the way that I wanted it to be, so I turned to Pester. You can find a great how-to-get-started here, which will show you how to get Pester and how to get started with TDD.
This isn't TDD though; this is Environment Validation, and this is how I went about creating my test.
First I thought about what I would look for in SSMS when I had installed the maintenance solution and made a list of the things that I would check, which looked something like this. This would be the checklist you would create (or have already created) for yourself or a junior following this install. This is how easily you can turn that checklist into a Pester test, remove the human element and open your install to automated testing:
  • SQL Server Agent is running – Otherwise the jobs won’t run🙂
  • We should have 4 backup jobs with a name of
  • DatabaseBackup – SYSTEM_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – FULL
  • DatabaseBackup – USER_DATABASES – DIFF
  • DatabaseBackup – USER_DATABASES – LOG
  • We should have Integrity Check and Index Optimisation Jobs
  • We should have the clean up jobs
  • All jobs should be scheduled
  • All jobs should be enabled
  • The jobs should have succeeded

I can certainly say that I have run through that check in my head and also written it down in an installation guide in the past. If I was being more careful I would have checked if there were the correct folders in the folder I was backing up to.

Ola’s script uses a default naming convention so this makes it easy. There should be a SERVERNAME or SERVERNAME$INSTANCENAME folder or if there is an Availability Group a CLUSTERNAME$AGNAME and in each of those a FULL DIFF and LOG folder which I can add to my checklist

So now we have our checklist, we just need to turn it into a Pester Environmental Validation script.

It would be useful to be able to pass in a number of instances, so we will start with a foreach loop and then a Describe block, then split the server name and instance name, get the agent jobs and set the backup folder name:

$ServerName = $Server.Split('\')[0]
$InstanceName = $Server.Split('\')[1]
$ServerName = $ServerName.ToUpper()
Describe "Testing $Server Backup solution" {
    BeforeAll {
        $Jobs = Get-SqlAgentJob -ServerInstance $Server
        $srv = New-Object Microsoft.SQLServer.Management.SMO.Server $Server
        $dbs = $Srv.Databases.Where{$_.status -eq 'Normal'}.name
        if ($InstanceName)
        {
            $DisplayName = "SQL Server Agent ($InstanceName)"
            $Folder = $ServerName + '$' + $InstanceName
        }
        else
        {
            $DisplayName = 'SQL Server Agent (MSSQLSERVER)'
            $Folder = $ServerName
        }
    }
    if ($CheckForBackups -eq $true)
    {
        $CheckForDBFolders = $true
    }
    $Root = $Share + '\' + $Folder
I also set the Agent service display name so I can get its status. I split the jobs up using a Context block, one each for backups, database maintenance and solution clean up, but they all follow the same pattern. First, get the jobs:
$Jobs = $Jobs.Where{($_.Name -like 'DatabaseBackup - SYSTEM_DATABASES - FULL*' + $JobSuffix + '*') -or
                    ($_.Name -like 'DatabaseBackup - USER_DATABASES - FULL*' + $JobSuffix + '*') -or
                    ($_.Name -like 'DatabaseBackup - USER_DATABASES - DIFF*' + $JobSuffix + '*') -or
                    ($_.Name -like 'DatabaseBackup - USER_DATABASES - LOG*' + $JobSuffix + '*')}
Then we can iterate through them and check them, but first let's test the Agent service. You do this with an It block, and in it put a single test like this:
actual-value | Should Be expected-value
So to check the Agent service is running we can do this:
(Get-Service -ComputerName $ServerName -DisplayName $DisplayName).Status | Should Be 'Running'
To find out how to get the right values for any test, I check using Get-Member. To see what is available for a job, I gathered the Agent Jobs into a variable using the Get-SQLAgentJob command in the new sqlserver module (which you can get by installing the latest SSMS from here) and then explored their properties using Get-Member and the values using Select-Object:
$jobs = Get-SqlAgentJob -ServerInstance $server
($Jobs | Get-Member -MemberType Property).name
$Jobs[0] | Select-Object *
Then, using a foreach to loop through them, I can check that each job exists, is enabled, has a schedule and succeeded the last time it ran, like this:
$Jobs = $Jobs.Where{($_.Name -eq 'DatabaseIntegrityCheck - SYSTEM_DATABASES') -or ($_.Name -eq 'DatabaseIntegrityCheck - USER_DATABASES') -or ($_.Name -eq 'IndexOptimize - USER_DATABASES')}
foreach ($job in $Jobs)
{
    $JobName = $Job.Name
    It "$JobName Job Exists" {
        $Job | Should Not BeNullOrEmpty
    }
    It "$JobName Job is enabled" {
        $job.IsEnabled | Should Be 'True'
    }
    It "$JobName Job has schedule" {
        $Job.HasSchedule | Should Be 'True'
    }
    if ($DontCheckJobOutcome -eq $false)
    {
        It "$JobName Job succeeded" {
            $Job.LastRunOutCome | Should Be 'Succeeded'
        }
    }
}
So I have checked the agent and the jobs, and now I want to check the folders exist; first for the instance, using Test-Path (so the user running the PowerShell session must have privileges and access to list the files and folders):
Context "$Share Share For $Server" {
    It "Should have the root folder $Root" {
        Test-Path $Root | Should Be $true
    }
Then for every database we need to set some variables for the folder path. We don't back up tempdb so we ignore that, and then check if the server is SQL2012 or above and, if it is, whether the database is a member of an Availability Group, and set the folder name appropriately:
foreach ($db in $dbs.Where{$_ -ne 'tempdb'})
{
    if ($Srv.VersionMajor -ge 11)
    {
        if ($srv.Databases[$db].AvailabilityGroupName)
        {
            $AG = $srv.Databases[$db].AvailabilityGroupName
            $Cluster = $srv.ClusterName
            $OLAAg = $Cluster + '$' + $AG
            if ($Share.StartsWith('\\') -eq $False)
            {
                $UNC = $Share.Replace(':','$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $OLAAg
            }
            else
            {
                $Root = $Share + '\' + $OLAAg
            }
        }
        else
        {
            if ($Share.StartsWith('\\') -eq $False)
            {
                $UNC = $Share.Replace(':','$')
                $Root = '\\' + $ServerName + '\' + $UNC + '\' + $Folder
            }
            else
            {
                $Root = $Share + '\' + $Folder
            }
        }
    }
    $db = $db.Replace(' ','')
    $Dbfolder = $Root + "\$db"
    $Full = $Dbfolder + '\FULL'
    $Diff = $Dbfolder + '\DIFF'
    $Log  = $Dbfolder + '\LOG'
    If ($CheckForDBFolders -eq $True)
    {
        Context "Folder Check for $db on $Server on $Share" {
            It "Should have a folder for $db database" {
                Test-Path $Dbfolder | Should Be $true
            }
But we need some logic for checking for folders, because Ola is smart and checks for log shipping databases (so as not to break the LSN chain), system databases only have FULL folders, and simple recovery databases only have FULL and DIFF folders. I used the System.IO.Directory Exists method as I found it slightly quicker for UNC shares:
If ($CheckForDBFolders -eq $True)
{
    Context "Folder Check for $db on $Server on $Share" {
        It "Should have a folder for $db database" {
            Test-Path $Dbfolder | Should Be $true
        }
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
            It 'Has a Log Folder' {
                [System.IO.Directory]::Exists($Log) | Should Be $True
            }
        } #
        elseif (($Srv.Databases[$db].RecoveryModel -eq 'Simple') -and $Db -notin ('master','msdb','model') -or ($LSDatabases -contains $db))
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
            It 'Has a Diff Folder' {
                [System.IO.Directory]::Exists($Diff) | Should Be $True
            }
        } #
        else
        {
            It 'Has a Full Folder' {
                [System.IO.Directory]::Exists($Full) | Should Be $True
            }
        } #
    } # End Check for db folders
}
and a similar thing for the files in the folders, although this caused me some more issues with performance. I first used Get-ChildItem, but in folders where a log backup is running every 15 minutes it soon became very slow. So I then decided to compare the create time of the folder with the last write time, which was significantly quicker for directories with a number of files, but that fell down when there was a single file in the directory, so if the times match I revert back to Get-ChildItem.
If anyone has a better, more performant option I would be interested in knowing. I used Øyvind Kallstad's PowerShell Conference session Chasing the Seconds (Slides and Video) and tried the methods in there with Measure-Command, but this was the best I came up with.
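
For reference, the comparison is just each approach wrapped in Measure-Command, roughly like this (the path is illustrative):

Measure-Command { Get-ChildItem '\\Server1\Backups\DB1\LOG' }
Measure-Command { [System.IO.Directory]::GetLastWriteTime('\\Server1\Backups\DB1\LOG') }
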
If ($CheckForBackups -eq $true)
{
    Context "File Check For $db on $Server on $Share" {
        $Fullcreate = [System.IO.Directory]::GetCreationTime($Full)
        $FullWrite = [System.IO.Directory]::GetLastWriteTime($Full)
        if ($Fullcreate -eq $FullWrite)
        {
            It "Has Files in the FULL folder for $db" {
                Get-ChildItem $Full\*.bak | Should Not BeNullOrEmpty
            }
        }
        else
        {
            It "Has Files in the FULL folder for $db" {
                $FullCreate | Should BeLessThan $FullWrite
            }
        }
        It 'Full File Folder was written to within the last 7 days' {
            $Fullwrite | Should BeGreaterThan (Get-Date).AddDays(-7)
        }
        if ($Db -notin ('master','msdb','model'))
        {
            $Diffcreate = [System.IO.Directory]::GetCreationTime($Diff)
            $DiffWrite = [System.IO.Directory]::GetLastWriteTime($Diff)
            if ($Diffcreate -eq $DiffWrite)
            {
                It "Has Files in the DIFF folder for $db" {
                    Get-ChildItem $Diff\*.bak | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the DIFF folder for $db" {
                    $DiffCreate | Should BeLessThan $DiffWrite
                }
            }
            It 'Diff File Folder was written to within the last 24 Hours' {
                $Diffwrite | Should BeGreaterThan (Get-Date).AddHours(-24)
            }
        }
        if ($Db -notin ('master','msdb','model') -and ($Srv.Databases[$db].RecoveryModel -ne 'Simple') -and ($LSDatabases -notcontains $db))
        {
            $Logcreate = [System.IO.Directory]::GetCreationTime($Log)
            $LogWrite = [System.IO.Directory]::GetLastWriteTime($Log)
            if ($Logcreate -eq $LogWrite)
            {
                It "Has Files in the LOG folder for $db" {
                    Get-ChildItem $Log\*.trn | Should Not BeNullOrEmpty
                }
            }
            else
            {
                It "Has Files in the LOG folder for $db" {
                    $LogCreate | Should BeLessThan $LogWrite
                }
            }
            It 'Log File Folder was written to within the last 30 minutes' {
                $Logwrite | Should BeGreaterThan (Get-Date).AddMinutes(-30)
            }
        } # Simple Recovery
    }
} # Check for backups
You could just run the script you have created from your checklist; hopefully this blog post helps you see that you can do so.
But I like the message showing the number of tests, successes and failures at the bottom, and I want to use parameters in my script. I can do this like this:
[CmdletBinding()]
## Pester Test to check OLA
Param(
$Instance,
$CheckForBackups,
$CheckForDBFolders,
$JobSuffix ,
$Share ,
[switch]$NoDatabaseRestoreCheck,
[switch]$DontCheckJobOutcome
)
and then call it using Invoke-Pester with the parameters like this
$Script = @{
    Path = $Path
    Parameters = @{
        Instance = $Instance
        CheckForBackups = $true
        CheckForDBFolders = $true
        JobSuffix = 'BackupShare1'
        Share = '\\Server1\BackupShare1'
        NoDatabaseRestoreCheck = $true
        DontCheckJobOutcome = $true
    }
}
Invoke-Pester -Script $Script
but that's a bit messy, hard to remember and won't encourage people newer to PowerShell to use it, so I wrapped it in a function with some help and examples and put it in GitHub: Test-OlaInstance.ps1 and Test-Ola. There is one thing to remember: you will need to add the path to Test-Ola.ps1 on Line 90 of Test-OlaInstance so that the script can find it.
Once you have that you can call it for a single instance or a number of instances like so. Here I check for Folders and Backup files
$Servers =  'SQL2008Ser2008','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2'
Test-OLAInstance -Instance $Servers -Share 'H:\' -CheckForBackups
and get a nice result like this. In a little under 20 seconds I completed my checklist for 4 servers, including checking if the files and folders exist for 61 databases 🙂 (The three failures were my Integrity Check jobs holding some test corrupt databases.)
This gives me a nice and simple automated method of checking if Ola's maintenance script has been correctly installed. I can use this for one server or many by passing in an array of servers (although they must use the same folder for backing up, whether that is UNC or local). I can also add this to an automated build process to ensure that everything has been deployed correctly.
I hope you find it useful

Speaking? You? Go on. #tsql2sday #84


This is a blog post for this month’s T-SQL Tuesday post, hosted by Andy Yun (b|t). T-SQL Tuesday is a monthly blog event started by Adam Machanic (b|t). The T-SQL Tuesday topic this month was about advice for new speakers. Thanks Andy for hosting. I have created a channel in the SQL Server Community Slack for presenting which everyone can make use of to ask and to answer questions

I think you should share what you know with others.

You will be amazing.

I will give you some great advice I learnt from a fantastic person’s blog post

  1. Start speaking
  2. Keep going
  3. Listen to feedback
  4. That’s it.

Kendra has said it all, you don’t need to read any further😉

 

 

 

 

However..

Not all plain sailing

I love giving sessions but I never knew or thought that I would. My journey to speaking started at my SQL user group in Exeter, with two fabulous people, Jonathan and Annette Allen, who encouraged me to share some PowerShell with the group. I was terrified and didn't think I was worthy; my HDMI output wasn't strong enough to power the projector and I had to transfer my slides and demo to Jonathan's laptop. It was a fraught and frustrating experience.

My second presentation was done on Stuart Moore's MacBook Pro, using Office Online for the presentation and Azure for the demos. Again a change right at the last minute, and using a machine I didn't know (and a different keyboard set-up).

Stuff will go wrong. Murphy’s Law will always show his head somewhere and no matter how often you test and re-test your demos, sometimes an odd thing will make them stop working

There will be problems and issues, you can mitigate some of them by following the 6 P’s

Proper Preparation Prevents Pretty Poor Performance.

You can read some great blog posts in this T-SQL Tuesday series and also this one from Steve Jones, or any of these. But also accept that these things happen, and you must be prepared to shine on through the darkness if the power runs out, or use pen and paper, or even plastic cups like John Martin 🙂

You never know you might enjoy it

I found I enjoyed it and wanted to do more and since then I have presented sessions in a wide variety of places. It was very strange to have been sat watching and listening to all of these fantastic presenters thinking I could never do that and then find out that actually it is something that I enjoy doing and find fun. You can do that too.

Equally, it's ok to not enjoy it, to think it's not worth the stress and hassle, and to support the community in a different way. But at least give it a go.

You will be nervous


I shared a train across Germany with someone who had attended the PSMonday conference in Munich and they were astonished when I said that I get very nervous before speaking. It’s ok to be nervous, the trick is to make use of that nervous energy and turn it into something positive.

I get very nervous before presentations. My hands shake, I sweat, I either babble or lose my voice. I fret and fidget and check everything a thousandillion times. I find it is better for me if I am sat in the room during the previous presentation, as that generally helps me to feel more relaxed: I can listen to their talk and, out of respect for the presenter and the organisation, it forces me to sit quietly.

You will find your own way to deal with this, maybe listening to music on headphones or just sitting quietly somewhere. Don’t worry if it is not immediately obvious, try some different things, talk with others and believe me, it will be ok.

Don’t try to numb it with alcohol

Once I get up and its ‘my’ turn I take a few deep breaths and suddenly presenter turns on and I forget all about being nervous.


Something to talk about

I have nothing to talk about.

Or everyone else knows more than I do.

Or X Y and Z talk about this much better than I do.

I’m scared

Richard Munn and I gave an impromptu session at SQL Relay in Cardiff where we talked about speaking and hopefully encouraged people to start, and these statements came up.


Here's (a little of) what we said.

No-one knows everything. Many people know a hell of a lot but not everything. You know a lot more than you realise and you also know things that no-one else does.

If you are stuck for things to talk about think about the you of 6 months or a year ago and something that you have learnt in that time and write the session that you wish you could have seen then. There will be other people at a similar stage who will appreciate it.

Don’t be scared, they are only people.

Practice

My dog is the one person who has been present at my presentations the most. He has listened (sometimes intently) to me practicing.

You need to practice speaking out loud.

You need to understand the timings

You need to be comfortable with hearing yourself speaking out aloud

You need to practice speaking out loud

A double reminder because I think it is important. You should practice and practice and practice with an eye on your timings and if you have a good friend who is technical or a small group at work for a lunchtime maybe then ask them if they will listen and give feedback.

Wanna chat?

I am very passionate about community involvement and lucky enough to be involved in two fantastic communities – the SQL community and the PowerShell community and have made some great friends along the way. I was amazed and proud when very soon after my second presentation someone told me that I had inspired them to start to present.

Since then I have gone out of my way to encourage other people to speak and to blog and am really enjoying watching them blossom. If you want to have a chat via email or via slack about speaking or blogging or getting involved in the community please feel free to contact me and I promise you I will get back to you. Better still go to the SQL Community Slack and ask questions in #presentingorspeaking

Go find out more

We are good at sharing and learning technical content, but we can share and learn about so much more: about all aspects of our life. Go and read all of the other posts in this T-SQL Tuesday for starters 🙂 and develop.


Running SQL Queries with Visual Studio Code


Reading this blog post by Shawn Melton, Introduction of Visual Studio Code for DBAs, reminded me that whilst I use Visual Studio Code (which I shall refer to as Code from here on) for writing PowerShell and Markdown, and love how easily it interacts with GitHub, I hadn't tried T-SQL. If you are new to Code (or if you are not) go and read Shawn's blog post, but here are the steps I took to run T-SQL using Code.

To download Code go to this link https://code.visualstudio.com/download and choose your operating system. Code works on Windows, Linux and Mac


Once you have downloaded and installed it, hit CTRL+SHIFT+P, which will open up the command palette.


Once you start typing, the results will filter, so type ext and then select Extensions: Install Extension.


Which will open the Extensions tab (you could have achieved the same end result just by clicking the Extensions icon).


But then you would not have learned about the command palette 🙂

So, with the extensions tab open, search for mssql and then click install


Once it has installed the button will change to Reload so click it


And you will be prompted to Reload the window


Accept the prompt and then open a new file (CTRL N) and then change the language for the file.

You can do this by clicking CTRL K and then M (Not CTRL K CTRL M) or click the language button


And then choose SQL


This will start a download so make sure you are connected (and allowed to connect to the internet)


Once it has finished it will show this


And offer you the chance to read the release notes


Which you can get for any extension anytime by finding the extension in the extensions tab and clicking on it. This has links to tutorials as well as information about the release


The mssql extension enables IntelliSense for T-SQL when you open a .sql file or when you change the language to SQL, as shown above for a new file.


Write your T-SQL query and press CTRL+SHIFT+E, or right click and choose Execute Query. This will ask you to choose a connection profile (and display any existing profiles).


Choose Create Connection Profile and answer the prompts


The query will then run


You can then output the results to csv or json if you wish


You can find a video showing this whole process with some typos and an error here

Using SQL with VS Code


Using the PowerShell SQL Provider with SQL Authentication


Whilst having a conversation with Chrissy LeMaire last week about using the SQL Provider, she asked if it could use SQL Authentication. I had no idea but said I would find out. This is how and what I did.

I am going to demonstrate this in Visual Studio Code with the PowerShell extension installed, but I have also tested this in PowerShell version 5.1.14393.576 on Windows 10 and PowerShell version 5.0.10586.117 on Windows 8.1, and I believe it will work across other versions.

First we need to import the sqlserver module

Import-Module sqlserver

Then if we run Get-PSDrive we can see our SQL Server PS Drive


A quick dir and we can see what is in that drive


We can navigate this like it is a file system. For example, below I have changed to the 'SQL directory', run dir and found the machine name ROB-SURFACEBOOK, then changed to that 'directory' and run dir to find the instances, navigated to the default instance and run dir to see what is available.


This would enable me to get the database names, for example.
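
A sketch of that, using my machine name from above (yours will differ):

Import-Module sqlserver
Set-Location SQLSERVER:\SQL\ROB-SURFACEBOOK\DEFAULT\Databases
Get-ChildItem | Select-Object Name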


You can connect to remote SQL instances in the same manner by running

CD SQLSERVER:\SQL\SERVERNAME

However, I don't have permission using this account.


We need to create a new PS Drive, so first, as always, start with Get-Help.


So we can create a new PS Drive with a credential.

Parameters
-Credential Specifies a user account that has permission to perform this action. The default is the current user.

Does it accept SQL Server?

Notes
New-PSDrive is designed to work with the data exposed by any provider. To list the providers available in your session, use Get-PSProvider.


Yes it does 🙂

So we can add a new PSDrive using

New-PSDrive -Name SQL2016N1-SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1 `
    -Description 'This is the SQL2016N1 SQL PSDrive using the SQL Authenticated login DemoLogin' `
    -Credential (Get-Credential)


Which will prompt us for a credential. Once we enter the credential we will have our SQL Server drive for that instance


Except we won't!! This is because the account that VS Code (or PowerShell) is running as does not have permissions on the SQL Server. So I need to run VS Code (or PowerShell) as a user with those permissions, by right clicking on the icon whilst holding Shift down and clicking Run as different user. Obviously you won't need to do this if the currently logged on user has permission on the remote SQL Server.


and then enter the credentials of the user


Now that we are running as the correct user, we repeat the code from above and we will have our PS Drive.


Nope 🙂 This time we get the following error

New-PSDrive : SQL Server PowerShell provider error: The root path for a drive with credentials needs to include the instance name.

which makes sense if you think about it as the credentials will only be valid at instance level

So we will add the instance name to the command. As this is the default instance we add DEFAULT

New-PSDrive -Name SQL2016N1-SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1\DEFAULT `
    -Description 'This is the SQL2016N1 SQL PSDrive using the SQL Authenticated login DemoLogin' `
    -Credential (Get-Credential)


Success!! So we will be able to see it in our list of PS Drives using Get-PSDrive


and we can see further details like this


and we can navigate to it using cd NAMEOFDRIVE


and perform our tasks


If you take a look at the SQL Server you will see that all that is being run is T-SQL


Unfortunately you cannot use the persist parameter with a SQLSERVER provider to persist the mapping across sessions

This post has shown you how to set up SQL Server PS Drives using SQL Authentication. The important points are

  • Use Get-Help New-PSDrive -ShowWindow for all the help and examples
  • You must be running PowerShell or VS Code as a user with Windows Permissions on the SQL Server
  • You must specify the instance name
  • Use "New-PSDrive -Name PSDRIVENAME -PSProvider SqlServer -Root SQLSERVER:\SQL\SERVER\INSTANCE -Description 'DESCRIPTION' -Credential (Get-Credential)"
  • You can then navigate the remote SQL Server like a file directory from the command line using SQL Authentication
  • It’s just running T-SQL against the instance
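
Putting it all together as a sketch (the instance, drive name and description are illustrative):

Import-Module sqlserver
New-PSDrive -Name SQLAuth -PSProvider SqlServer `
    -Root SQLSERVER:\SQL\SQL2016N1\DEFAULT `
    -Description 'The SQL2016N1 instance via the SQL Authenticated login DemoLogin' `
    -Credential (Get-Credential)
Set-Location SQLAuth:
Get-ChildItem Databases | Select-Object Name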

Altering a Job Step on Hundreds of SQL Servers with PowerShell


I flew to Utrecht last week to present with Chrissy LeMaire and Sander Stad to the joint Dutch SQL and PowerShell User Groups. Whilst I was sat at the airport I got a phone call from my current client. "We need to change the backup path for all of the servers to a different share. How long will it take you?"

About 5 minutes (PowerShell is very powerful – be careful when following these examples 😉)

We will use the sqlserver module, so you will need to have installed the latest version of SSMS from https://sqlps.io/dl

This code was run using PowerShell version 5 and will not work on PowerShell version 3 or lower as it uses the Where method.

Let's grab all of the jobs on the estate. (You will need to fill the $Servers variable with the names of your instances, maybe from a database or CMS or a text file.)

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Once we have the jobs we need to iterate only through the ones we need to. This step could also have been done in the line above. Lets assume we are using the Ola Hallengren Solution to backup our estate

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})

Then because I have to target a specific job step I can iterate through those and filter in the same way

foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})

Now all I need to do is to replace C:\Backup with C:\MSSQL\Backup (in this example I am using my labs backup paths)

$Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")

And then call the Alter method

$Step.Alter()

And that is all there is to it. Here is the full script I used

$Jobs = Get-SQLAgentJob -ServerInstance $Servers

Foreach($job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})
{
foreach ($Step in $Job.jobsteps.Where{$_.Name -like '*DatabaseBackup*'})
{
$Step.Command = $Step.Command.Replace("Directory = N'C:\Backup'","Directory = N'C:\MSSQL\Backup'")
$Step.Alter()
}
}

In only a few minutes I had altered several hundred instances worth of Ola Hallengren Jobs 🙂

This is one of the many reasons I love PowerShell; it enables me to perform mass changes very quickly and easily. Of course, you need to make sure that you know that what you are changing is what you want to change. I have caused severe issues by altering the SQL alert frequency to 1 second instead of one hour on an estate!! Although the beauty of PowerShell meant that I was able to change it back very quickly once the problem was realised.
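
A cautious first step is to report what would change before calling Alter(). A sketch of that dry run, using the same filters as above:

$Jobs = Get-SQLAgentJob -ServerInstance $Servers
Foreach ($Job in $Jobs.Where{$_.Name -like '*DatabaseBackup*' -and $_.isenabled -eq $true})
{
    foreach ($Step in $Job.jobsteps.Where{$_.Command -like "*Directory = N'C:\Backup'*"})
    {
        # Report only - nothing is changed until Alter() is called
        [pscustomobject]@{
            Job  = $Job.Name
            Step = $Step.Name
        }
    }
}
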
You can change a lot of settings; take a look at what is available at the job step level by running Get-Member on a job step object.
Happy Automating



Adding a T-SQL Job Step to a SQL Agent Job with PowerShell


In my last post, I explained how to alter an existing job step across many servers. I also had cause to add a T-SQL Job step to a large number of jobs as well. This is how I did it.
As before I gathered the required jobs using the Get-SQLAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the Where method.
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database or CMS or a text file.)

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Then I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property


We need to set the name, the parent (The job), the database, the command, the subsystem, the on fail action, on success action and the id for the job step.
I set the command to a variable to make the code easier to read

$Command = "SELECT Name from sys.databases"

the rest of the properties I fill in inside the loop. To find out what the properties can hold, I look at MSDN for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step, starting at 1, so this example will add a new job step that will be the first to run.

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties all we need to do is create it and alter the job

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
$Command = "Select name from sys.databases"
foreach($Job in $Jobs)
{
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'A descriptive name for the job step'
$NewStep.Parent = $Job
$NewStep.DatabaseName = 'master'
$NewStep.Command = $Command
$NewStep.SubSystem = 'TransactSql'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1
$NewStep.Create()
$Job.Alter()
}

Hopefully this will help you if you need to add a T-SQL Job Step to a large number of servers
Happy Automating


Quickly Creating Test Users in SQL Server with PowerShell using the sqlserver module and dbatools


One of the most visited posts on my blog is nearly two and half years old now – Add User to SQL Server Database Role with PowerShell and Quickly Creating Test Users. I thought it was time to update it and use the latest sqlserver module and the dbatools module.

You can get the latest version of the sqlserver module by installing SSMS 2016. The PASS PowerShell Virtual Chapter have created a short link to make this easier for you to remember: https://sqlps.io/dl

Once you have downloaded and installed SSMS you can load the module.

Import-Module sqlserver

There is one situation where you will get an error loading the sqlserver module into PowerShell. If you have the SQLPS module already imported then you will get the following error:

Import-Module : The following error occurred while loading the extended type data file:


In that case you will need to remove the SQLPS module first.

Remove-Module sqlps
Import-Module sqlserver

The original post dealt with creating a number of test users for a database and assigning them to different roles quickly and easily.

First let’s quickly create a list of Admin users and a list of Service Users and save them in a text file.

$i = 0
while($I -lt 100)
{
"Beard_Service_User$i" | Out-File 'C:\temp\Users.txt' -Append
$i++
}

$i = 0
while($I -lt 10)
{
"Beard_Service_Admin_$i" | Out-File 'C:\temp\Admins.txt' -Append
$i++
}

Now that we have those users in files we can assign them to a variable by using Get-Content

$Admins = Get-Content 'C:\temp\Admins.txt'

Of course we can use any source for our users – a database, an excel file, Active Directory or even just type them in.

We can use the Add-SQLLogin command from the sqlserver module to add our users as SQL Logins, but at present we cannot add them as database users and assign them to a role.

If we want to add a Windows Group or a Windows User to our SQL Server we can do so using:

Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType WindowsUser -DefaultDatabase tempdb -Enable -GrantConnectSql 

Notice that we need to enable and grant connect SQL to the user.

If we want to add a SQL login the code is pretty much the same, but we either have to enter the password in an authentication box or pass in a PSCredential object holding the username and password. Keeping credentials secure in PowerShell scripts is outside the scope of this post, and the requirement is for non-live environments, so we will pass in the same password for all users as a string to the script. You may want, or be required, to achieve this in a different fashion.

 $Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential

We can ensure that we are not trying to add logins that already exist using

 if(!($srv.Logins.Contains($User)))
{

The $srv is a SQL Server Management Objects (SMO) server object, which you can create using a snippet. I blogged about snippets here and you can find my list of snippets on GitHub here. However, today I am going to use the dbatools module to create an SMO server object using the Connect-DbaSqlServer command and assign the server and the database to a variable:

 # Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

Once we have our Logins we need to create our database users:

 $usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

and add them to a database role.

#Add User to the Role
$db.roles[$role].AddMember($User)

I created a little function to call in the script and then simply loop through our users and admins and call the function.

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole  -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole  -LoginType SQLLogin
}

To check that they have been added correctly I simply use the Get-DbaRoleMember command from dbatools and output it to Out-GridView using the alias ogv as I am on the command line:

Get-DbaRoleMember -SqlInstance $server |ogv

which looks like this:


 

Once we need to clean up the logins and users, we can use the Get-SqlLogin and Remove-SqlLogin commands from the sqlserver module to remove the logins, and if we do that first we can then use the dbatools command Remove-SqlOrphanUser to remove the orphaned users 🙂 (I thought that was rather cunning!)

(Get-SqlLogin -ServerInstance $server).Where{$_.Name -like '*Beard_Service_*'}|Remove-SqlLogin

Remove-SQLOrphanUser -SqlServer $Server -databases $database

The Remove-SQLLogin will prompt for confirmation and the result of the Remove-SQLOrphanUser looks like this


When you are looking at doing this type of automation with PowerShell, you should remember always to make use of Get-Command, Get-Help and Get-Member. That will enable you to work out how to do an awful lot. I have a short video on youtube about this:

 

and when you get stuck come and ask in the SQL Server Slack at https://sqlps.io/slack. You will find a powershellhelp channel in there.

Here is the complete code:

#Requires -module sqlserver
#Requires -module dbatools

### Define some variables
$server = ''
$Password = "Password"
$Database = 'TheBeardsDatabase'
$Admins = Get-Content 'C:\temp\Admins.txt'
$Users = Get-Content 'C:\temp\Users.txt'
$LoginType = 'SQLLogin'
$userrole =  'Users'
$adminrole = 'Admin'

# Create a SQL Server SMO Object
$srv = Connect-DbaSqlServer -SqlServer $server
$db = $srv.Databases[$Database]

function Add-UserToRole
{
param
(
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Password,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$User,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Server,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateNotNullOrEmpty()]
[string]$Role,
[Parameter(Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true,
ValueFromRemainingArguments=$false)]
[ValidateSet("SQLLogin", "WindowsGroup", "WindowsUser")]
[string]$LoginType
)

if(!($srv.Logins.Contains($User)))
{
if($LoginType -eq 'SQLLogin')
{
$Pass = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $Pass
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql -LoginPSCredential $Credential
}
elseif($LoginType -eq 'WindowsGroup' -or $LoginType -eq 'WindowsUser')
{
Add-SqlLogin -ServerInstance $Server -LoginName $User -LoginType $LoginType -DefaultDatabase tempdb -Enable -GrantConnectSql
}
}
if (!($db.Users.Contains($User)))
{

# Add user to database

$usr = New-Object ('Microsoft.SqlServer.Management.Smo.User') ($db, $User)
$usr.Login = $User
$usr.Create()

}
#Add User to the Role
$db.roles[$role].AddMember($User)
}

foreach($User in $Users)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $Userrole -LoginType SQLLogin
}

foreach($User in $Admins)
{
Add-UserToRole -Password $Password -User $user -Server $server -Role $adminrole -LoginType SQLLogin
}

Get-DbaRoleMember -SqlInstance $server |ogv

Happy Automating!


Adding a PowerShell Job Step to an existing SQL Agent Job Step with PowerShell


In my last post I showed how to add a T-SQL Job step to an existing SQL Agent Job. The process is exactly the same for a PowerShell job step.

As before, I gathered the required jobs using the Get-SqlAgentJob command from the sqlserver module, which you can get by installing the latest SSMS from https://sqlps.io/dl

This code was run on PowerShell version 5 and will not run on PowerShell version 3 or earlier as it uses the Where method.
I put all of the jobs that I required on the estate into a variable called $Jobs. (You will need to fill the $Servers variable with the names of your instances, maybe from a database, a CMS or a text file, and of course you can add more logic to filter those servers as required.)

$Jobs = (Get-SQLAgentJob -ServerInstance $Servers).Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}

Of course, to add a PowerShell job step the target server needs to be SQL 2008 or higher. If you have an estate with older versions it is worth creating an SMO server object (you can use a snippet), checking the version and then getting the jobs like this

foreach($Server in $Servers)
{
 $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
 if($srv.VersionMajor -ge 10)
 {
    $Jobs = $srv.JobServer.Jobs

and you could choose to create a CmdExec job step for earlier versions in an else code block, as sketched below.
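
A minimal sketch of that else block, assuming you want the older instances to shell out to PowerShell.exe (the command line here is a hypothetical example):

 else
 {
    ## SQL 2005 and earlier cannot run PowerShell job steps, so create
    ## a CmdExec step that calls PowerShell.exe instead
    $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
    $NewStep.SubSystem = 'CmdExec'
    $NewStep.Command = 'powershell.exe -NoProfile -Command "Get-Process"'
 }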

Once I have the Jobs I can iterate through them with a foreach loop

foreach($Job in $Jobs)

Then we need to create a new job step which is done with the following code

$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep 

To find out what is available for this object you can run

$NewStep | Get-Member -MemberType Property

job-step-properties

We need to set the name, the parent (The job), the command, the subsystem, the on fail action, on success action and the id for the job step.
I set the command to a variable to make the code easier to read

$Command = "Get-Process"

the rest of the properties I fill in inside the loop. To find out what the properties can hold, I look at MSDN for Microsoft.SqlServer.Management.Smo.Agent.JobStep. The ID property is the number of the job step, starting at 1, so this example will add a new job step that will be the first to run

$Name = $Job.Name
$JobServer = $srv.JobServer
$Job = $JobServer.Jobs[$Name]
$NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
$NewStep.Name = 'a descriptive name for my PowerShell script'
$NewStep.Parent = $Job
$NewStep.Command = $Command
$NewStep.SubSystem = 'PowerShell'
$NewStep.OnFailAction = 'QuitWithFailure'
$NewStep.OnSuccessAction = 'GoToNextStep'
$NewStep.ID = 1

Once the object has all of the properties all we need to do is create it and alter the job

$NewStep.create()
$Job.Alter() 

and putting it all together it looks like this

foreach($Server in $Servers)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
    if($srv.VersionMajor -ge 10)
    {
       $Jobs = $srv.JobServer.Jobs.Where{$_.Name -like '*PartOfNameOfJob*' -and $_.IsEnabled -eq $true}
       foreach($Job in $Jobs)
       {
           $NewStep = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep
           $NewStep.Name = 'a descriptive name for my PowerShell script'
           $NewStep.Parent = $Job
           $NewStep.Command = $Command
           $NewStep.SubSystem = 'PowerShell'
           $NewStep.OnFailAction = 'QuitWithFailure'
           $NewStep.OnSuccessAction = 'GoToNextStep'
           $NewStep.ID = 1
           $NewStep.create()
           $Job.Alter()
       }
    }

}

Happy Automating


Testing Your SQL Server Backups the Easy Way with PowerShell & dbatools


In a previous post I wrote about how easy it was to restore a whole SQL Server's user databases from a directory using the dbatools module. It is a good idea to look at that for disaster recovery scenarios, but even PowerShell is going to be useless if your backups don't work

But setting up a solution to test your backups (technically, to test your restores) is difficult, isn't it?

Let's use the dbatools module and see how easy it is

The dbatools module has a command called Test-DbaLastBackup. If you look at the page, or at the help using

Get-Help Test-DbaLastBackup -ShowWindow

you will see that this command

Restores all or some of the latest backups and performs a consistency check

1. Gathers information about the last full backups
2. Restores the backups to the Destination with a new name. If no Destination is specified, the originating SqlServer will be used.
3. The database is restored as “dbatools-testrestore-$databaseName” by default, but you can change dbatools-testrestore to whatever you would like using -Prefix
4. The internal file names are also renamed to prevent conflicts with original database
5. A consistency check is then performed
6. And the test database is finally dropped

So, if you only have one SQL Server but want to ensure that you are testing your backup files, then as long as you have the disk space you can simply run

Test-DbaLastBackup -SqlServer sql2016n2

and the latest backups that have been taken will be restored using a different name with different filenames, checked for consistency and then dropped

01 - simple test backups.PNG

and as you can see an object is returned

SourceServer  : SQL2016N2
TestServer    : SQL2016N2
Database      : FadetoBlack
FileExists    : True
RestoreResult : Success
DbccResult    : Success
SizeMB        : 1243.26
BackupTaken   : 3/18/2017 12:36:07 PM
BackupFiles   : Z:\SQL2016N2\FadetoBlack\FULL_COPY_ONLY\SQL2016N2_FadetoBlack_FULL_COPY_ONLY_20170318_123607.bak

which shows the Server, the database, if the file exists, the restore result, the DBCC result, the size of the backup file, when it was taken and the path used

You don’t have to use the same server and in many shops you would not want to. You can specify a destination server and you can also pipe the results to Out-GridView to enable easy filtering.

Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 | OGV

Note that you need to be backing up to a shared location, i.e. a path that starts with \\. I have fudged this a little in the demo for the keen-eyed

03 - with OGV.PNG

Maybe you only want to test the backups for the important databases, or some backups are restored using other means and you don't need to test them this way. There is a Databases parameter, which lets you tab through the database names
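
For example (the database names are just illustrations; use your own):

Test-DbaLastBackup -SqlServer sql2016n2 -Databases FadetoBlack, RidetheLightning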

04 - choosing databases.gif

In the ISE you can see the drop down of database names

05 - in ISE.PNG

If you have limited space you might not want to test the largest databases, so you can use the MaxMB parameter to only restore databases under this size. In the example below, you can see that Fadetoblack was skipped, and the system databases were skipped as they are not backed up to a shared location
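
For example, to skip any database whose backup is bigger than 5MB:

Test-DbaLastBackup -SqlServer sql2016n2 -MaxMB 5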

06 - max mb.PNG

The databases are restored onto the server using a different name, and the files are also named differently, to avoid any conflicts. The default prefix is dbatools-testrestore- but you can change this using the Prefix switch if you wish
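
A quick sketch with a hypothetical prefix value:

Test-DbaLastBackup -SqlServer sql2016n2 -Prefix 'beard-test-'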

02 data files.PNG

You may not want to use your special, super-quick storage for performing your test restores. If you have separate data and log drives that you would like to use for the restores, you can specify those with the -DataDirectory and -LogDirectory parameters. If you do not use these switches then the command will use the default data and log locations.
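
For example (the paths are illustrative):

Test-DbaLastBackup -SqlServer sql2016n2 -DataDirectory 'D:\TestRestores\Data' -LogDirectory 'L:\TestRestores\Log'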

It's possible to reduce the number of checks that are done. If you only want to do a verify on the backup file then you can use the -VerifyOnly switch, you can skip the DBCC check by using the -NoCheck switch, and you can leave the test restore databases on the server using the -NoDrop switch
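
For example:

## Just verify the backup files, without restoring
Test-DbaLastBackup -SqlServer sql2016n2 -VerifyOnly
## Restore, but skip the DBCC check and keep the test databases for inspection
Test-DbaLastBackup -SqlServer sql2016n2 -NoCheck -NoDrop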

Happy Automating

 

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

 Install-Module dbatools

Then you can use

 Update-dbatools


Taking dbatools Test-DbaLastBackup a little further


In a previous post I showed how easy it is to test your backups using Test-DbaLastBackup

Today I thought I would take it a little further and show you how PowerShell can be used to transmit or store this information in the manner you require

Test-DbaLastBackup returns an object of information

SourceServer  : SQL2016N2
TestServer    : SQL2016N2
Database      : FadetoBlack
FileExists    : True
RestoreResult : Success
DbccResult    : Success
SizeMB        : 1243.26
BackupTaken   : 3/18/2017 12:36:07 PM
BackupFiles   : Z:\SQL2016N2\FadetoBlack\FULL_COPY_ONLY\SQL2016N2_FadetoBlack_FULL_COPY_ONLY_20170318_123607.bak

which shows the server, the database name, if the file exists, the restore result, the DBCC result, the size of the backup file, when the backup was taken and the path used

Text File

As it is an object we can make use of that in PowerShell. We can output the results to a file

Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -MaxMB 5 | Out-File C:\temp\Test-Restore.txt
notepad C:\temp\Test-Restore.txt

01 - out file.PNG

CSV

Or maybe you need a CSV

 Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1 -MaxMB 5 | Export-Csv  C:\temp\Test-Restore.csv -NoTypeInformation

02 - csv file.PNG

JSON

Maybe you want some JSON

 Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1| ConvertTo-Json | Out-file c:\temp\test-results.json

06 - json results.PNG

HTML

Or an HTML page


$Results = Test-DbaLastBackup -SqlServer sql2016n2 -Destination SQL2016N1
$Results | ConvertTo-Html | Out-File c:\temp\test-results.html

03 - html.PNG

Excel

or perhaps you want a nice colour coded Excel sheet to show your manager or the auditors

Import-Module dbatools

$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
## Run the test and save to a variable
$Results = Test-DbaLastBackup -SqlServer $server -Destination $TestServer
# Set the filename
$TestDate = Get-Date
$Date = Get-Date -Format ddMMyyy_HHmmss
$filename = 'C:\Temp\TestResults_' + $Date + '.xlsx'
# Create a .com object for Excel
$xl = new-object -comobject excel.application
$xl.Visible = $true # Set this to False when you run in production
$wb = $xl.Workbooks.Add() # Add a workbook
$ws = $wb.Worksheets.Item(1) # Add a worksheet
$cells=$ws.Cells
$col = 1
$row = 3
## Create a legend
$cells.item($row,$col)="Legend"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex = 34
$row ++
$cells.item($row,$col)="True or Success"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex = 10
$row ++
$cells.item($row,$col)="False or Failed"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 3
$row ++
$cells.item($row,$col)="Skipped"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 16
$row ++
$cells.item($row,$col)="Backup Under 7 days old"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 4
$row ++
$cells.item($row,$col)="Backup Over 7 days old"
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 3
## Create a header
$col ++
$row = 3
$cells.item($row,$col)="Source Server"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Test Server"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Database"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="File Exists"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Restore Result"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="DBCC Result"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Size Mb"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Backup Date"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col ++
$cells.item($row,$col)="Backup Files"
$cells.item($row,$col).font.size=16
$Cells.item($row,$col).Columnwidth = 10
$Cells.item($row,$col).Interior.ColorIndex= 34
$col = 2
$row = 4
foreach($result in $results)
{
$col = 2
$cells.item($row,$col)=$Result.SourceServer
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col ++
$cells.item($row,$col)=$Result.TestServer
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.Database
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.FileExists
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.FileExists -eq 'True')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.FileExists -eq 'False')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.RestoreResult
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.RestoreResult -eq 'Success')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.RestoreResult -eq 'Failed')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.DBCCResult
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.DBCCResult -eq 'Success')
{
    $Cells.item($row,$col).Interior.ColorIndex= 10
}
elseif($result.DBCCResult -eq 'Failed')
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 16
}
$col++
$cells.item($row,$col)=$Result.SizeMb
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$col++
$cells.item($row,$col)=$Result.BackupTaken
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
if($result.BackupTaken -gt (Get-Date).AddDays(-7))
{
    $Cells.item($row,$col).Interior.ColorIndex= 4
}
else
{
    $Cells.item($row,$col).Interior.ColorIndex= 3
}
$col++
$cells.item($row,$col)=$Result.BackupFiles
$cells.item($row,$col).font.size=12
$Cells.item($row,$col).Columnwidth = 10
$row++
}
[void]$ws.cells.entireColumn.Autofit()
## Add the title after the autofit
$col = 2
$row = 1
$cells.item($row,$col)="This report shows the results of the test backups performed on $TestServer for $Server on $TestDate"
$cells.item($row,$col).font.size=18
$Cells.item($row,$col).Columnwidth = 10
$wb.Saveas($filename)
$xl.quit()

It looks like this. Green is Good, Red is Bad, Grey is don’t care!

Email

You might need to email the results. Here I am using Gmail as an example; with two-factor authentication you need to use an app password in the credential

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
## Run the test and save to a variable
$Results = Test-DbaLastBackup -SqlServer $server -Destination $TestServer -MaxMB 5
$to = ''
$smtp = 'smtp.gmail.com'
$port = 587
$cred = Get-Credential
$from = 'Beard@TheBeard.Local'
$subject = 'The Beard Reports on Backup Testing'
$Body = $Results | Format-Table | Out-String
Send-MailMessage -To $to -From $from -Body $Body -Subject $subject -SmtpServer $smtp -Priority High -UseSsl -Port $port -Credential $cred
07 -email
You can of course attach any of the above files using the -Attachments parameter of Send-MailMessage
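
For example, to attach the Excel file created above (assuming $filename and the mail variables are still in scope from the previous snippets):

Send-MailMessage -To $to -From $from -Subject $subject -Body 'Results attached' -SmtpServer $smtp -UseSsl -Port $port -Credential $cred -Attachments $filename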

Database

Of course, as good data professionals we probably want to put the data into a database where we can ensure that it is kept safe and secure

dbatools has a couple of commands to help with that too. We can use Out-DbaDataTable to create a datatable object and Write-DbaDataTable to write it to a database

Create a table

USE [TestResults]
GO
CREATE TABLE [dbo].[backuptest](
[SourceServer] [nvarchar](250) NULL,
[TestServer] [nvarchar](250) NULL,
[Database] [nvarchar](250) NULL,
[FileExists] [nvarchar](10) NULL,
[RestoreResult] [nvarchar](200) NULL,
[DBCCResult] [nvarchar](200) NULL,
[SizeMB] [int] NULL,
[Backuptaken] [datetime] NULL,
[BackupFiles] [nvarchar](300) NULL
) ON [PRIMARY]
GO

Then add the data

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2005Ser2003','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2012Ser08AG3','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3'
## Run the test for each server and save to a variable (This uses PowerShell v4 or above code)
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}
## Convert to a datatable
$DataTable = Out-DbaDataTable -InputObject $Results
## Write to the database
Write-DbaDataTable -SqlServer $Server -Database TestResults -Schema dbo -Table backuptest -KeepNulls -InputObject $DataTable

and query it

08 - Database.PNG

Hopefully that has given you some ideas of how you can make use of this great command, and shown one of the benefits of PowerShell: the ability to use objects for different purposes

Happy Automating

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools


Backing up SQL Server on Linux using Ola Hallengrens Maintenance Solution


With the release of SQL Server vNext CTP 1.4, SQL Agent became available for use on Linux. To install it you need to upgrade your SQL Server to CTP 1.4. On Ubuntu you do this with

sudo apt-get update
sudo apt-get install mssql-server

Once you have CTP 1.4 you can install SQL Agent as follows

sudo apt-get update
sudo apt-get install mssql-server-agent
sudo systemctl restart mssql-server

for different flavours of Linux follow the steps here

Once you have done that you will see that the Agent is now available

01 - SSMS Agent Linux.PNG

So now I can schedule backups and maintenance for my Linux SQL databases using the agent. I immediately turned to Ola Hallengren's Maintenance Solution. I downloaded the SQL file and ran it against my Linux server, once I had changed the path for the backups to a directory I had created at /var/opt/mssql/backups. Notice that the path is specified using Windows notation, with C:\ at the root

SET @CreateJobs = 'Y'          -- Specify whether jobs should be created.
SET @BackupDirectory     = N'C:\var\opt\mssql\backups' -- Specify the backup root directory.
SET @CleanupTime         = 350         -- Time in hours, after which backup files are deleted. If no time is specified, then no backup files are deleted.
SET @OutputFileDirectory = NULL         -- Specify the output file directory. If no directory is specified, then the SQL Server error log directory is used.
SET @LogToTable          = 'Y'          -- Log commands to a table.

The stored procedures were created

03 - stored procedures

and the jobs were created

04 - jobs.PNG

Now the jobs are not going to run as they are, because they have CmdExec steps and these are not supported in SQL on Linux, so we have to make some changes to the steps. As I blogged previously, this is really easy using PowerShell

First we need to grab the jobs into a variable. We will use Get-SqlAgentJob from the sqlserver module, which you can get by installing SSMS 2016 or later from https://sqlps.io/dl. As we are targeting a Linux SQL Server we will use SQL authentication, which we will provide via Get-Credential, and then take a look at the jobs

Import-Module sqlserver
$cred = Get-Credential
$Jobs = Get-SqlAgentJob -ServerInstance LinuxvVNext -Credential $cred
$jobs |ft -auto

 

05 Powershell jobs.PNG

Once the jobs were in the variable I decided to first filter out only the jobs that call the stored procedures to perform the backups, the DBCC checks and the index optimisation, and loop through those. Backups are the most important, after all

## Find the jobs we want to change
foreach($Job in $jobs.Where{$_.Name -like '*DATABASES*'})

Then it is simply a case of replacing the sqlcmd text in the command to return it to T-SQL, adding the database name (I installed Ola's stored procedures into the master database) and changing the subsystem to use T-SQL instead of CmdExec

## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'master'
## Alter the jobstep
$job.jobsteps[0].Alter()

We can check that it has done this using PowerShell

$Jobs = Get-SqlAgentJob -ServerInstance LinuxvVNext -Credential $cred
foreach($Job in $jobs.Where{$_.Name -like '*DATABASES*'})
{
foreach($step in $Job.JobSteps)
{
$step | Select Parent, Name, Command,DatabaseName,Subsystem
}
}
06 - Jobs changed.PNG

or by looking in SSMS if you prefer

07 - jobs changed ssms.PNG

Now let's run the jobs and check the history using Get-SqlAgentJobHistory

Get-SqlAgentJobHistory -ServerInstance linuxvnextctp14 -Credential $cred | select RunDate,StepID,Server,JobName,StepName,Message|Out-GridView
08 - ogv for jobs.PNG

Which pretty much matches what you see in SSMS

09 - ssms jobs view.PNG

and if you look in the directory you see the files exactly as you would expect them to be

10 - Files in Linux

We still need to change the other jobs that Ola's script creates. If we look at the command steps

 

11 - job comands.PNG

We can see that the CommandLog Cleanup job can use the same PowerShell code as the backup jobs, while the sp_delete_backuphistory and sp_purge_jobhistory jobs need to refer to the msdb database instead of master. For the moment the Output File Cleanup job is the one that cannot be run on Linux. Hopefully soon we will be able to run PowerShell job steps, and that will be resolved as well

Here is the full snippet of code to change all of the jobs

$server = 'Linuxvnextctp14'
$cred = Get-Credential
$Jobs = Get-SqlAgentJob -ServerInstance $server -Credential $cred
## Find the jobs we want to change
foreach($Job in $jobs)
{
if($Job.Name -like '*DATABASES*' -or $Job.Name -like '*CommandLog*')
{
## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'master'
## Alter the jobstep
$job.jobsteps[0].Alter()
}
if($Job.Name -like '*history*')
{
## replace the text as required
$job.jobsteps[0].command = $job.jobsteps[0].command.Replace('sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d msdb -Q "' , '').Replace('" -b','')
## Change the subsystem
$job.jobsteps[0].subsystem = 'TransactSQL'
## Add the databasename
$job.jobsteps[0].DatabaseName = 'msdb'
## Alter the jobstep
$job.jobsteps[0].Alter()
}
}
Happy Automating

Using Pester with dbatools Test-DbaLastBackup


In previous posts I have shown how to use Test-DbaLastBackup from dbatools and how you can make use of the results. Today we will look at using Pester with the results

Pester provides a framework for running unit tests to execute and validate PowerShell commands from within PowerShell. Pester consists of a simple set of functions that expose a testing domain-specific language (DSL) for isolating, running, evaluating and reporting the results of PowerShell commands.

We shall use it to validate our results. First we need to gather our results as we have seen before. In this example I have set MaxMB to 5, so change that if you are playing along

Import-Module dbatools
$TestServer = 'SQL2016N1'
$Server = 'SQL2016N2'
$servers = 'SQL2016N1','SQL2016N2'
$Results = $servers.ForEach{Test-DbaLastBackup -SqlServer $_ -Destination $TestServer -MaxMB 5}

Then we need to write some Pester tests. I tried to use test cases, which are the correct method to iterate through collections as Mike Robbins shows here, but Pester does not accept the type of object that is returned from this command for that. It's ok though: because Pester is just PowerShell, we can use a foreach loop.

In this scenario we are testing for failures, rather than for when the backup test has been skipped because the file path is not a network share or the size is greater than our max size, so our checks use the Should Not assertion. I have also added a test for the time the backup was taken.

Describe "Last Backup Test results - NOTE THIS IGNORES Skipped restores,DBCC and BackupFiles" {
foreach($result in $results)
{
It "$($Result.Database) on $($Result.SourceServer) File Should Exist" {
$Result.FileExists| Should Not Be 'False'
}
It "$($Result.Database) on $($Result.SourceServer) Restore should be Success" {
$Result.RestoreResult| Should Not Be 'Failed'
}
It "$($Result.Database) on $($Result.SourceServer) DBCC should be Success" {
$Result.DBCCResult| Should Not Be 'Failed'
}
It "$($Result.Database) on $($Result.SourceServer) Backup Should be less than a week old" {
$Result.BackupTaken| Should BeGreaterThan (Get-Date).AddDays(-7)
}
}
}

If we run that we get an output like this. Green is Good, Red is Bad 🙂
01 - pester script.PNG
We can save the script to a file and use Invoke-Pester to call it like this.
Invoke-Pester C:\temp\BackupPester.ps1
(Some Restore Frames removed for brevity)
02 -invoke pester.gif
Invoke-Pester can output results to a file, so we can output to XML, which can be consumed by many things
$Date = Get-Date -Format ddMMyyyHHmmss
$tempFolder = 'c:\temp\BackupTests\'
Push-Location $tempFolder
$XML = $tempFolder + "BackupTestResults_$Date.xml"
$script = 'C:\temp\BackupPester.ps1'
Invoke-Pester -Script $Script -OutputFile $xml -OutputFormat NUnitXml
will provide an XML file like this
04 - XML output.PNG
We can also make use of reportunit.exe from http://relevantcodes.com/ to create pretty HTML files from the XML files we created
This piece of code will download and extract the file if it does not exist in the directory
#download and extract ReportUnit.exe
$url = 'http://relevantcodes.com/Tools/ReportUnit/reportunit-1.2.zip'
$fullPath = Join-Path $tempFolder $url.Split("/")[-1]
$reportunit = $tempFolder + '\reportunit.exe'
if((Test-Path $reportunit) -eq $false)
{
(New-Object Net.WebClient).DownloadFile($url,$fullPath)
Expand-Archive -Path $fullPath -DestinationPath $tempFolder
}
and this will run it against the XML and open the file
##run reportunit against report.xml and display result in browser
$HTML = $tempFolder  + 'index.html'
& .\reportunit.exe $tempFolder
Invoke-Item $HTML
which will look like this
03 - pretty html file.gif
Happy Automating

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools



Getting SQL Server File Sizes and Space Used with dbatools


I read a great blog post about answering the question "how big is the database?" using T-SQL on SQL Buffet, and wondered how much I could do with the dbatools module

The dbatools module (for those that don't know) is a PowerShell module written by amazing folks in the community, designed to make administering your SQL Server significantly easier using PowerShell. The instructions for installing it are available here. It comprises 182 separate commands at present (11 March 2017, version 0.8.938)

I know that there is a Get-DbaDatabaseFreespace command written by Mike Fal b | t using Glenn Berry's diagnostic queries

First thing as always is to look at the help

 Get-Help Get-DbaDatabaseFreespace -ShowWindow

which will show you the help for the command and some examples

Lets look at the details for a single instance

 Get-DbaDatabaseFreespace -sqlserver $server

This is what it looks like

02 - singel server.gif

and yes, it really is that fast; I have not sped this up. 232ms to get those details for an instance with 19 databases

03 - Measure Command.PNG

What information do you get? Let's look at the information for a single database; you get an object for each file

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_System
FileGroup            : PRIMARY
PhysicalName         : F:\DBA-Admin_System.MDF
FileType             : ROWS
UsedSpaceMB          : 3
FreeSpaceMB          : 253
FileSizeMB           : 256
PercentUsed          : 1
AutoGrowth           : 0
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 16777213
AutoGrowthPossibleMB : 0
UnusableSpaceMB      : 16777213

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_Log
FileGroup            :
PhysicalName         : G:\DBA-Admin_Log.LDF
FileType             : LOG
UsedSpaceMB          : 32
FreeSpaceMB          : 224
FileSizeMB           : 256
PercentUsed          : 12
AutoGrowth           : 256
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 2528
AutoGrowthPossibleMB : 2304
UnusableSpaceMB      : 0

Server               : SQL2014SER12R2
Database             : DBA-Admin
FileName             : DBA-Admin_User
FileGroup            : UserFG
PhysicalName         : F:\DBA-Admin_User.NDF
FileType             : ROWS
UsedSpaceMB          : 1
FreeSpaceMB          : 255
FileSizeMB           : 256
PercentUsed          : 0
AutoGrowth           : 256
AutoGrowType         : MB
SpaceUntilMaxSizeMB  : 5119
AutoGrowthPossibleMB : 4864
UnusableSpaceMB      : 0

There is a lot of useful information returned for each file. It's better if you use Out-GridView, as then you can order by columns and filter in the top bar.
04 - single server ogv.gif

 

As always, PowerShell uses the permissions of the account running the session to connect to the SQL Server, unless you provide a separate credential for SQL authentication. If you need to connect with a different Windows account you will need to hold Shift down, right-click on the PowerShell icon and click Run as a different user.

Let's get the information for a single database. The command has dynamic parameters which populate the database names, to save you time and keystrokes

05 dynamic parameters.gif

But you may want to gather information about more than one server. Let's take a list of servers and place them into a variable. You can fill this variable in a number of ways, maybe by querying your CMDB or using your registered servers or central management server

$SQLServers = 'SQL2005Ser2003','SQL2012Ser08AG3','SQL2012Ser08AG1','SQL2012Ser08AG2','SQL2014Ser12R2','SQL2016N1','SQL2016N2','SQL2016N3','SQLVnextN1','SQLvNextN2'

and then

Get-DbaDatabaseFreespace -SqlInstance $SQLServers | Out-GridView

 

06 - Many servers ogv.PNG

As you can see, you get a warning quite correctly, that the information for the asynchronous secondary node of the availability group databases is not available and I did not have all of my servers running so there are a couple of could not connect warnings as well. You can still filter very quickly. dbatools is tested from SQL2000 to SQL vNext as you can see below (although I don’t have a SQL2000 instance)

07 - filter ogv

 

Not only on Windows, this command will work against SQL running on Linux as well

08 - linux.PNG

So if we want to know the total size of the files on disk for the database, we need to look at the FileSizeMB property

$server = 'SQL2014Ser12R2'
$dbName = 'AdventureWorksDW2014'
Get-DbaDatabaseFreespace -SqlServer $server -database $dbName |
Select Database,FileName,FileSizeMB

Of course that’s an easy calculation here

08a - filesize.PNG

but if we have numerous files then it may be tougher. We can use the Measure-Object command to sum the properties. We need to do a bit of preparation here and set a couple of calculated properties to make it more readable

$server = 'SQL2014Ser12R2'
$dbName = 'AdventureWorksDW2014'
$database = @{Name = 'Database'; Expression = {$dbname}}
$FileSize = @{Name = 'FileSize'; Expression = {$_.Sum}}
Get-DbaDatabaseFreespace -SqlServer $server -database $dbName |
Select Database,FileSizeMB |
Measure-Object FileSizeMB -Sum |
Select $database ,Property, $filesize

09 - filessize

Maybe we want to look at all of the databases on an instance. Again, we have to do a little more work here

$server = 'SQL2014Ser12R2'
$srv = Connect-DbaSqlServer $server
$SizeonDisk = @()
$srv.Databases |ForEach-Object {
$dbName = $_.Name
$database = @{Name = 'Database'; Expression = {$dbname}}
$FileSize = @{Name = 'FileSize'; Expression = {$_.Sum}}
$SizeOnDisk += Get-DbaDatabaseFreespace -SqlServer $server -database $dbName | Select Database,FileSizeMB |  Measure-Object FileSizeMb -Sum | Select $database ,Property, $Filesize
}
$SizeOnDisk

10 - size on disk

If we wanted the databases ordered by the size of their files we could do this

$SizeOnDisk |Sort-Object Filesize -Descending

11 - size sorted.PNG

As it is PowerShell we have an object, and we can use it any way we like. Maybe we want that information in a text file, a CSV, an Excel file or an email; PowerShell can do that

 ## In a text file
$SizeonDisk | Out-file C:\temp\Sizeondisk.txt
Invoke-Item C:\temp\Sizeondisk.txt
## In a CSV
$SizeonDisk | Export-Csv C:\temp\Sizeondisk.csv -NoTypeInformation
notepad C:\temp\Sizeondisk.csv
## Email
Send-MailMessage -SmtpServer $smtp -From DBATeam@TheBeard.local -To JuniorDBA-Smurf@TheBeard.Local `
-Subject "Smurf this needs looking At" -Body $SizeonDisk
## Email as Attachment
Send-MailMessage -SmtpServer $smtp -From DBATeam@TheBeard.local -To JuniorDBA-Smurf@TheBeard.Local `
-Subject "Smurf this needs looking At" -Body "Smurf" -Attachments C:\temp\Sizeondisk.csv

I had a little play with Boe Prox's PoshCharts (you have to use the dev branch) to see if I could get some nice charts. Unfortunately the bar charts did not come out so well, but luckily the donut and pie charts did. (I'm a DBA, I love donuts!)

$SizeonDisk| Out-PieChart -XField Database -YField FileSize -Title "UsedSpaceMB per Database on $Server" -IncludeLegend -Enable3D
$SizeonDisk| Out-DoughnutChart -XField Database -YField FileSize -Title "UsedSpaceMB per Database on $Server" -IncludeLegend -Enable3D

12 - donuts.PNG

So the point is: whatever you or your process requires, you can pretty much bet that PowerShell can enable you to automate it.

You can make use of all of the properties exposed by the command. If you want to only see the files with less than 20% free space

 Get-DbaDatabaseFreespace -SqlServer $server | Where-Object {$_.PercentUsed -gt 80}

13 - percent used.PNG

you can also use the command to check for file growth settings as well

 Get-DbaDatabaseFreespace -SqlServer $server | Where-Object {$_.AutoGrowType  -ne 'Mb'}

14 - autogrowth.PNG

Or maybe you want to know the FileSize, Used and Free Space per database

 $server = 'SQL2014Ser12R2'
$srv = Connect-DbaSqlServer $server
$SizeonDisk = @()
$srv.Databases |ForEach-Object {
$dbName = $_.Name
$database = @{Name = 'Database'; Expression = {$dbname}}
$MB = @{Name = 'Mbs'; Expression = {$_.Sum}}
$SizeOnDisk += Get-DbaDatabaseFreespace -SqlServer $server -database $dbName | Select Database,FileSizeMB, UsedSpaceMB, FreeSpaceMb |  Measure-Object FileSizeMb , UsedSpaceMB, FreeSpaceMb -Sum  | Select $database ,Property, $Mb
}
$SizeOnDisk 

15 totals.PNG

Hopefully that has given you a quick insight into another one of the fabulous dbatools commands. Any questions, comment below or head over to the SQL Server Community Slack via https://sqlps.io/slack

Happy Automating

 

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

 Get-Module dbatools

and update it using an Administrator PowerShell session with

 Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

 Install-Module dbatools

Then you can use

 Update-dbatools

 


Max Length of a column in a DataTable in PowerShell


Whilst I was writing my Test-DbaLastBackup posts I ran into a common error that I get when importing datatables into a database

I was using this table

01 - table

and when I tried to add the results of the Test-DbaLastBackup I got this

02 -error.PNG

Exception calling “WriteToServer” with “1” argument(s): “The given value of type String from the data source cannot be converted to type nvarchar of the
specified target column.”
At line:356 char:4
+             $bulkCopy.WriteToServer($InputObject)
+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : InvalidOperationException

Hmm, it says that it can't convert a string to an nvarchar; that doesn't sound right. To find out what was happening I used a little bit of code that I use every single day

 $Error[0] | Fl -force

All errors from your current session are stored in the $error array, so [0] accesses the most recent one; fl is an alias for Format-List, and the Force switch expands the object. This is what I saw

03 expanded error.PNG

System.Management.Automation.MethodInvocationException: Exception calling “WriteToServer” with “1” argument(s): “The given value of
type String from the data source cannot be converted to type nvarchar of the specified target column.” —>
System.InvalidOperationException: The given value of type String from the data source cannot be converted to type nvarchar of the
specified target column. —> System.InvalidOperationException: String or binary data would be truncated.

String or binary data would be truncated. OK, that makes sense: one of my columns has larger data than the destination column, but which one? Let's take a look at some of the data

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : RidetheLightning
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:03 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\RidetheLightning\FULL\SQL2016N3_RidetheLightning_FULL_20170319_000003.bak

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : TheCallofKtulu
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:04 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\TheCallofKtulu\FULL\SQL2016N3_TheCallofKtulu_FULL_20170319_000004.bak

SourceServer  : SQL2016N3
TestServer    : SQL2016N1
Database      : TrappedUnderIce
FileExists    : Skipped
RestoreResult : Restore not located on shared location
DbccResult    : Skipped
SizeMB        : 4.08
BackupTaken   : 3/19/2017 12:00:04 AM
BackupFiles   : C:\MSSQL\Backup\SQL2016N3\TrappedUnderIce\FULL\SQL2016N3_TrappedUnderIce_FULL_20170319_000004.bak

Hmm, it's not going to be easy to work out which bit of data is too big here.

All I need to know is the maximum length of the columns in the datatable though, so I have a little snippet that will do that for me

$columns = ($datatable | Get-Member -MemberType Property).Name
foreach($column in $Columns) {
$max = 0
foreach ($a in $datatable){
       if($max -lt $a.$column.length){
        $max = $a.$column.length
       }
}
Write-Output "$column max length is $max"
}

and the output looks like this

04 - max length.PNG

So we can quickly see that the BackupFiles property is too big. Change the table accordingly, and there is no more error.
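
As a hedged sketch, assuming the table is the backuptest table from the earlier post, widening the offending column with Invoke-Sqlcmd could look like this (the new length of 500 is just an illustration):

## Widen the BackupFiles column to fit the longest value found
$query = "ALTER TABLE dbo.backuptest ALTER COLUMN BackupFiles nvarchar(500) NULL"
Invoke-Sqlcmd -ServerInstance $Server -Database TestResults -Query $query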

The snippet is pretty quick too, scanning 105 rows in 56 milliseconds in this example
05 - how long.PNG

I keep this little snippet in my snippets list for PowerShell ISE which you can find here

Here is the code to add this as a snippet to ISE
## A list of snippets
$snips = Get-IseSnippet
## Add a snippet
if(!$snips.Where{$_.Name -like 'Max Length of Datatable*'})
{
$snippet = @{
 Title = 'Max Length of Datatable'
 Description = 'Takes a datatable object and iterates through it to get the max length of the string columns - useful for data loads'
 Text = @"
`$columns = (`$datatable | Get-Member -MemberType Property).Name
foreach(`$column in `$Columns)
{
`$max = 0
foreach (`$a in `$datatable)
{
if(`$max -lt `$a.`$column.length)
{
`$max = `$a.`$column.length
}
}
Write-Output "`$column max length is `$max"
}

"@
}
New-IseSnippet @snippet
}
and if you want to add it to your VSCode snippets then you need to edit the powershell.json file, which is located in your user home AppData folder ('C:\Users\User\AppData\Roaming\Code\User\snippets\powershell.json') or reached by clicking File –> Preferences –> User Snippets and typing PowerShell
04 user snippets.gif

Then you can add this bit of json inside the curly braces

 "Max Length of Datatable": {
"prefix": "Max Length of Datatable",
"body": [
"$$columns = ($$datatable | Get-Member -MemberType Property).Name",
"foreach($$column in $$Columns) {",
"    $$max = 0",
"    foreach ($$a in $$datatable){",
"        if($$max -lt $$a.$$column.length){",
"            $$max = $$a.$$column.length",
"        }",
"    }",
"    Write-Output \"$$column max length is $$max\"",
"}"
],
"description": "Takes a datatable object and iterates through it to get the max length of the string columns - useful for data loads"
}, 

and you have your snippet ready for use

07 - snippet in vscode.gif

Happy Automating


Test the SQL Server database collation with PowerShell and dbatools


If your server collation is different to your database collation then you may find that you get an error similar to this

Cannot resolve the collation conflict between “SQL_Latin1_General_CP1_CI_AS” and “Latin1_General_CI_AS” in the equal to operation.

when your queries use the tempdb

It would be useful to be able to quickly test if that may be the case and with the dbatools module you can. There is a Test-DbaDatabaseCollation command which will do just that. This page will show you how to install dbatools if you have not already got it

As always you should start with Get-Help when looking at a PowerShell command

Get-Help Test-DbaDatabaseCollation -ShowWindow

 

01 - Get Help.PNG

There are only three parameters: SqlServer, Credential and Detailed

Let's start with SqlServer

Test-DbaDatabaseCollation -SqlServer SQLvNextN2

this gives a quick and simple output showing the server name, database name and an IsEqual property

02 - test server

So in this example we can see that the WideWorldImporters database does not have the same collation as the server. If we only wanted to see information about databases whose collation does not match the server's, then we could use

(Test-DbaDatabaseCollation -SqlServer SQLvNextN2).Where{$_.IsEqual -eq $false}

03 - equals false

That doesn’t give us any further information though. There is the detailed parameter as well. Lets see what that does

 Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed

04 - detailed.PNG

This time we get the server name, server collation, database name, database collation and the IsEqual property. This is a collection of objects, so we are not bound to just seeing them on the screen; we can use them as I blogged about here

For example

 ## Output to a file
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed |Out-File C:\Temp\CollationCheck.txt
## Output to CSV
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed |Export-Csv  C:\temp\CollationCheck.csv -NoTypeInformation
## Output to JSON
Test-DbaDatabaseCollation -SqlServer SQLvNextN2 -Detailed | ConvertTo-Json | Out-file c:\temp\CollationCheck.json
## Look at the files
notepad C:\temp\CollationCheck.json
notepad C:\temp\CollationCheck.csv
notepad C:\temp\CollationCheck.txt

Of course, you will probably want to test more than one server at a time. Let's pass an array of servers and see what happens

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
# Test Db collation
Test-DbaDatabaseCollation -SqlServer $SQLServers -Detailed 
05 - servers.PNG

In this example, I am querying my Hyper-V server for all running VMs with SQL in the name, except for my broken SQL2008 box. I love PowerShell's Out-GridView command for many reasons; the ability to sort by columns quickly and simply is one of them. Let's add that to the code and sort by the IsEqual column

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
# Test Db collation
Test-DbaDatabaseCollation -SqlServer $SQLServers -Detailed | Out-GridView
06 - servers ogv.PNG

Excellent, that works a treat. How about Linux? Does this work if SQL is running on Linux? We will have to use the Credential parameter, as we need SQL authentication. This time I have used the Format-Table command to format the output.

$cred = Get-Credential
Test-DbaDatabaseCollation -SqlServer LinuxvNextCTP14 -Credential $cred -Detailed | Format-Table -AutoSize
07 - Linux.PNG

Let's add some Pester tests. If we want to test a list of servers and see if any of their databases have an incorrect collation, we can simply test if the IsEqual flag contains a false.

We can do this using TestCases. Test cases allow Pester to loop through a collection of 'things'. The TestCases parameter takes an array of hashtables. This all sounds very complicated to those unclear about PowerShell, but here is some code to do it.

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
$testCases= @()
$SQLServers.ForEach{$testCases += @{Name = $_}}

The first line gathers the list of SQL Servers from the Hyper-V as before. You can get this from a text file, csv, Active Directory, CMS, registered servers list. The second line initiates the TestCases array and the third line iterates through the list of servers and adds a hashtable to the TestCases array

To make use of the test cases we have to use the -TestCases parameter in the It block of our Pester test and add a param() so that the test knows where to get the values from. To add the value from the test cases into the title of the test, we need to reference it inside <>

If you want to learn more about Pester. I highly recommend The Pester Book by Don Jones and Adam Bertram

Here is the code

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    $testCases= @()
    $SQLServers.ForEach{$testCases += @{Name = $_}}
    It "<Name> databases have the right collation" -TestCases $testCases {
        Param($Name)
        $Collation = Test-DbaDatabaseCollation -SqlServer $Name
        $Collation.IsEqual -contains $false | Should Be $false
    }
}

If we save that as a PowerShell file, we can call it with Invoke-Pester

08 - Servers Pester.PNG

which shows which servers do not have databases with the correct collation. This may be all that is required, but what if you want to check each database on each server with Pester?

I could not see a way to do this with TestCases so I reverted to PowerShell. Pester is just PowerShell code after all.

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    foreach($Server in $SQLServers)
    {
        $CollationTests = Test-DbaDatabaseCollation -SqlServer $Server
        foreach($CollationTest in $CollationTests)
        {
            It "$($Collationtest.Server) database $($CollationTest.Database) should have the correct collation" {
                $CollationTest.IsEqual | Should Be $true
            }
        }
    }
}

In this example, we again gather the names of our SQL servers and then iterate through them. We set the results of Test-DbaDatabaseCollation to a variable, iterate through each of the results and test the IsEqual property. We can save that as a file and call it with Invoke-Pester, and this time it looks like this

09 - Individual databases.PNG

Excellent, we can quickly and easily see which database on which server does not have a matching collation. We can't see in the results of the Pester test what collation it should be, though. Let's do that as well.

This time we need to use the Detailed parameter and test that the ServerCollation matches the DatabaseCollation. This will enable Pester to display that information to us. Here is the code

Describe "Testing Database Collation" {
    $SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
    foreach($Server in $SQLServers)
    {
        $CollationTests = Test-DbaDatabaseCollation -SqlServer $Server -Detailed
        foreach($CollationTest in $CollationTests)
        {
            It "$($Collationtest.Server) database $($CollationTest.Database) should have the correct collation of $($CollationTest.ServerCollation)" {
                $CollationTest.DatabaseCollation | Should Be $CollationTest.ServerCollation
            }
        }
    }
}
and if we save that as a file and call it with Invoke-Pester (you can just run the code using PowerShell as well) it looks like this

10 - full test.PNG

Now Pester shows us what collation it is expecting and what the collation of the database is as well when it fails the test. (I love the little arrow showing where the difference is!)

Hopefully this post has shown you how you can use Test-DbaDatabaseCollation from the dbatools module to test your servers, and how to combine that with Pester. If you have any questions about the dbatools module, go and ask in the dbatools channel in the SQL Community Slack


Getting SQLServers Last Known Good DBCC Checkdb with PowerShell and dbatools


As good SQL Server DBAs we want to ensure that our databases are regularly checked for consistency by running DBCC CheckDB. This will frequently be scheduled using an Agent job or by using Ola Hallengren's Maintenance Solution

We can check for the last known good DBCC check using the undocumented DBCC DBINFO(DBNAME) WITH TABLERESULTS. You can see the last known good DBCC check at around row 50

00 - Using TSQL.PNG

This makes parsing the information a bit more tricky, and whilst you could use sp_MSForEachDB to iterate through the databases, that doesn't always work as you expect, as Aaron Bertrand explains
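
If you did want to pull that one value out yourself for a single database, a minimal sketch with Invoke-Sqlcmd might look like this. Note that the dbi_dbccLastKnownGood field name comes from the undocumented output, so treat it as an assumption that may change between versions:

## Grab the last known good checkdb value for one database
## (dbi_dbccLastKnownGood is part of undocumented DBCC output and may change)
$server = 'SQLvNextN2'
$query = "DBCC DBINFO ('master') WITH TABLERESULTS"
Invoke-Sqlcmd -ServerInstance $server -Query $query |
    Where-Object {$_.Field -eq 'dbi_dbccLastKnownGood'} |
    Select-Object Field, Value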

Of course, I am going to use PowerShell and the dbatools module. This module is a community-based project written by excellent, brilliant people in their own time and available to you for free. To find out more, and how to use and install it, visit https://dbatools.io

In the module there is a command called Get-DbaLastGoodCheckDb. This command was created by Jakob Bindslet. You can find Jakob on his blog and on LinkedIn.

As always, you start with any PowerShell command by using Get-Help

Get-Help Get-DbaLastGoodCheckDb -ShowWindow

00a - get help.PNG

This command has three parameters: SqlServer, Credential and Detailed. Let's see what it looks like

Get-DbaLastGoodCheckDb -SqlServer SQLvNextN2

01 - One server

It returns an object with the server name, database name and the time and date of the last known good checkdb for every database on the server. What happens if we use the detailed parameter?

Get-DbaLastGoodCheckDb -SqlServer SQLvNextN2 -Detailed

 

02 - one server detailed.PNG

This time we get more information: the server name, database name, when the database was created, the last good DBCC CheckDb, how long since the database was created, how long since the last known good DBCC CheckDb, a status and a Data Purity Enabled flag. If you look at the image above it shows that the DBA_Admin database has a status of "New database, not checked yet" even though it has a date for the last known good DBCC CheckDb. This is because it was restored after this server was upgraded from CTP 1.3 to CTP 1.4, and a DBCC CheckDb has not been run since. The system databases have a status of "CheckDb should be performed" because the last known good DBCC CheckDb is more than 7 days ago. Let's run a DBCC CheckDb and check again

02a - one server.PNG

This time the status has changed to OK for all of the databases 🙂

We can pass an array of SQL servers to this command as well, and check multiple servers at the same time. In this example, I am querying my Hyper-V server for all running VMs with SQL in the name, except for my broken SQL2008 box. I love PowerShell's Out-GridView command for many reasons, so let's use that; you can filter quickly and easily in the top bar.

$SQLServers = (Get-VM -ComputerName beardnuc | Where-Object {$_.Name -like '*SQL*' -and $_.Name -ne 'SQL2008Ser2008' -and $_.State -eq 'Running'}).Name
Get-DbaLastGoodCheckDb -SqlServer $SQLServers -Detailed | Out-GridView

03 - many servers ogv.PNG

As you can see, you get a warning for secondary availability group databases. It's quick too: in my lab of 10 servers and 125 databases, ranging from SQL2005 to SQL vNext, it runs in a little under 5 seconds. This command is not compatible with SQL2000 servers.

04 - measure comand.PNG

It is important to remember that as this script uses the DBCC DBINFO() WITH TABLERESULTS, there are several known weak points, including:

– DBCC DBINFO is an undocumented feature/command.
– The LastKnownGood timestamp is updated when a DBCC CHECKFILEGROUP is performed.
– The LastKnownGood timestamp is updated when a DBCC CHECKDB WITH PHYSICAL_ONLY is performed.
– The LastKnownGood timestamp does not get updated when a database is in READ_ONLY.

Databases created prior to SQL2005 and then upgraded to SQL2005 or above need to have DBCC CheckDb run once with the DATA_PURITY option. This ensures that the data purity check, which looks for column values where the value is outside the valid range for the column's data type, is run by default whenever DBCC CheckDB runs. This is explained more fully by Paul Randal here and Ken Simmons here. The Data Purity Enabled flag from the command will show false if the data purity check is not being performed, which should be resolved by running DBCC CheckDB with the DATA_PURITY option as explained here
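
As a hedged example, a one-off data purity check against a single upgraded database could be run like this (the server and database names are illustrative):

## Run DBCC CheckDB once with DATA_PURITY so future checks include it
Invoke-Sqlcmd -ServerInstance 'SQLvNextN2' -Query "DBCC CHECKDB ('AdventureWorks2014') WITH DATA_PURITY"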

Now, with one line of PowerShell code, you can check the last time a DBCC CheckDb was run for each database on one or more instances. The beauty of PowerShell is that an object is returned, which you can use in any number of ways, as shown in a previous post

Happy Automating

NOTE – The major 1.0 release of dbatools, due in the summer of 2017, may have breaking changes which will stop the above code from working. There are also new commands coming which may replace this command. This blog post was written using dbatools version 0.8.942. You can check your version using

Get-Module dbatools

and update it using an Administrator PowerShell session with

Update-Module dbatools

You may find that you get no output from Update-Module as you have the latest version. If you have not installed the module from the PowerShell Gallery using

Install-Module dbatools

Then you can use

Update-dbatools

 

