
Creating SQL Server Database with PowerShell


This morning I have been setting up my Azure Servers in preparation for my presentation to the Cardiff SQL User Group this month.

I used my scripts from My Post on Spinning Up Azure SQL Boxes to create two servers, and then I wanted to create some databases.

I decided it was time to write a Create-Database function using a number of scripts that I have used to create individual databases.

Errors

Whilst finalising the function I didn't always get it quite right, and was faced with an error.


It was not the most useful of errors to troubleshoot; the issue could have been anywhere in the script.

You can view the errors PowerShell has recorded using the automatic $Error variable, which holds the most recent errors (up to $MaximumErrorCount of them, 256 by default). You can see the last error by using $Error[0], and if you pipe it to Format-List you get a more detailed error message. So I added a try/catch to the function, which gave me an error message I could resolve.
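Viewing the detailed error and wrapping the work in try/catch looks something like this (the $db.Create() call is just a stand-in for whatever the function is doing at that point):

$Error[0] | Format-List -Force

try
{
    # the SMO work the function performs, for example creating the database
    $db.Create()
}
catch
{
    # surface the underlying exception so the real cause is visible
    Write-Error $_.Exception.GetBaseException().Message
}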


Much better. The problem was

Cannot create file 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\.LDF' because it already exists.

Mistyping a variable caused this: it created an empty file name variable, which then threw the error the second (and third, fourth, fifth) times I ran the script, but this error pointed me to it.

Creating Database

There are a vast number of properties you can set when creating a database. I decided to set file sizes, file growth sizes, max file sizes and the recovery model. I only set Server and Database Name as mandatory parameters and gave the other parameters default values.

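A sketch of the parameter block, using the parameter names from the example call at the end of this post (the default values here are illustrative assumptions):

param
(
[Parameter(Mandatory=$true)]
[string]$Server,
[Parameter(Mandatory=$true)]
[string]$DBName,
[int]$SysFileSize = 5,
[int]$UserFileSize = 10,
[int]$LogFileSize = 10,
[int]$UserFileGrowth = 5,
[int]$UserFileMaxSize = 100,
[int]$LogFileGrowth = 5,
[int]$LogFileMaxSize = 100,
[string]$DBRecModel = 'FULL'
)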

We take the parameters for file sizes in MB and convert them to KB, the unit SMO uses for file sizes.

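A sketch of the conversion:

$SysFileSizeKB = $SysFileSize * 1024
$UserFileSizeKB = $UserFileSize * 1024
$LogFileSizeKB = $LogFileSize * 1024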

Then set the default file locations, create a database object and a primary filegroup object, and add the filegroup object to the database object.

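A sketch of those steps (note that DefaultFile and DefaultLog can be empty if no default location has ever been set on the instance, in which case MasterDBPath is a common fallback):

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$DefaultFileLocation = $srv.Settings.DefaultFile
$DefaultLogLocation = $srv.Settings.DefaultLog
$db = New-Object Microsoft.SqlServer.Management.Smo.Database($srv, $DBName)
$PrimaryFG = New-Object Microsoft.SqlServer.Management.Smo.FileGroup($db, 'PRIMARY')
$db.FileGroups.Add($PrimaryFG)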

Add a User File Group for User objects

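Again as a sketch (the filegroup name is an assumption):

$UserFG = New-Object Microsoft.SqlServer.Management.Smo.FileGroup($db, 'UserFG')
$db.FileGroups.Add($UserFG)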

Create a database file on the primary file group using the variables set earlier

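Something along these lines; DataFile, Size, GrowthType and Growth are the real SMO members, while the naming convention and growth values are assumptions:

$SysFileName = $DBName + '_System'
$SysFile = New-Object Microsoft.SqlServer.Management.Smo.DataFile($PrimaryFG, $SysFileName)
$SysFile.FileName = $DefaultFileLocation + '\' + $SysFileName + '.mdf'
$SysFile.Size = $SysFileSizeKB
$SysFile.GrowthType = 'KB'
$SysFile.Growth = 1024
$PrimaryFG.Files.Add($SysFile)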

Do the same for the user file and then create a Log File

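The user file follows the same pattern against $UserFG with the user file parameters; the log file uses the LogFile class instead of DataFile (again a sketch):

$LogFileName = $DBName + '_Log'
$LogFile = New-Object Microsoft.SqlServer.Management.Smo.LogFile($db, $LogFileName)
$LogFile.FileName = $DefaultLogLocation + '\' + $LogFileName + '.ldf'
$LogFile.Size = $LogFileSizeKB
$LogFile.GrowthType = 'KB'
$LogFile.Growth = $LogFileGrowth * 1024
$LogFile.MaxSize = $LogFileMaxSize * 1024
$db.LogFiles.Add($LogFile)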

Set the Recovery Model and create the database and then set the user file group as the default

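Sketched as:

$db.RecoveryModel = $DBRecModel
$db.Create()
$db.FileGroups['UserFG'].IsDefault = $true
$db.FileGroups['UserFG'].Alter()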

Finally, catch any errors, using the try/catch pattern shown earlier.


It can then be called as follows: Create-Database SERVERNAME DATABASENAME


or by setting all the parameters:

Create-Database -Server Fade2black -DBName DatabaseTest -SysFileSize 10 -UserFileSize 15 -LogFileSize 20 -UserFileGrowth 7 -UserFileMaxSize 150 -LogFileGrowth 8 -LogFileMaxSize 250 -DBRecModel FULL


This means that I can easily and quickly set up several databases of different types and sizes

The script can be found here Create-Database



List Databases (and Properties) on SQL Server with PowerShell


Another post in the PowerShell Box of Tricks series. Here is another script which I use to save me time and effort during my daily workload enabling me to spend more time on more important (to me) things!

Today's question, one I often get asked, is "What databases are on that server?"

This is often a follow-up to a question that requires the Find-Database script. It is often asked by support teams investigating issues. It can also be asked by developers checking the impact of other services on their DEV/UAT environments, by change managers investigating the impact of changes, by service managers investigating the impact of downtime, when capacity planning for a new service, and in numerous other situations.

A simple, quick and easy question, made simpler with this function, which can also be called when creating documentation.

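A minimal sketch of the function, based on the description (the original may do more):

Function Show-DatabasesOnServer
{
param ([string]$Server)
[system.reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
$srv.Databases | Select-Object Name
}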

Simply call it with Show-DatabasesOnServer SERVERNAME and use the results as you need


This only shows you the name, but if you need more information about your databases then have a look at the available properties and see what you require.

Use Get-Member to see what is there. I ran the following code to count the number of Properties available for Databases (Using PowerShell V3 on SQL Server 2012 SP1 11.0.3350.0 )

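Probably something along these lines:

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'SERVERNAME'
($srv.Databases['master'] | Get-Member -MemberType Property).Count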

154 properties that you can examine, and that is just for databases :-)

Picking out a few properties you could do something like this

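For example (any of the available properties will do):

$srv.databases | Select-Object Name, Owner, Collation, CompatibilityLevel, RecoveryModel, Size | Format-Table -Wrap -AutoSize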

If you want aliases for your column headings you will need to add a bit of code to the select.

For example, maybe you want Database Name as a heading and the Size in GB (it's in MB in the example above). You would need to create a hash table with a Label element and an Expression element. The Label is the column heading and the Expression can be the data itself or a calculation on the data.

So select Name becomes

select @{label="Database Name";Expression={$_.Name}}

The Column Heading is Database Name and the data is the Name property

and select Size becomes

Select @{label="Size GB";Expression={"{0:N3}" -f ($_.Size/1024)}}

The Column Heading is Size GB and the data is the Size property divided by 1024 to 3 decimal places

then your code would look like this

$srv.databases | select @{label="Server";Expression={$_.Parent.name}}, `
@{label="Database Name";Expression={$_.Name}}, Owner, Collation, CompatibilityLevel, `
RecoveryModel, @{label="Size GB";Expression={"{0:N3}" -f ($_.Size/1024)}} | `
Format-Table -Wrap -AutoSize

and the results are displayed in a neatly formatted table.

You can get the code for the function here Show-DatabasesOnServer


Checking For A Database Backup with PowerShell


Today's PowerShell Box of Tricks post is about backups.

I highly recommend you go and read (and bookmark, then come straight back) Stuart Moore's series on Backups with PowerShell. The link takes you to the category showing all of the posts in the series.

Whilst I have many other methods to inform me if backups fail, sometimes someone walks up to the desk, ignores the headphones in my ears (they are bright yellow FFS), eyes fixed on the screen, face deep in concentration, and asks me a question. It was for times like these that the functions from this series were written.

“When was this database last backed up?”

Auditors, managers, bosses, developers: all can come and ask this question. I just turn to my PowerShell prompt and type

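Based on how the other functions in this series are invoked, the call presumably looks like:

Show-LastDatabaseBackup SERVERNAME DATABASENAME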

and show them the screen. Job done (sometimes).

As I mentioned previously there are 154 properties on the database object. Three of them are

LastBackupDate
LastDifferentialBackupDate
LastLogBackupDate

We simply call these and, if the date reported is '01 January 0001 00:00:00', print NEVER.
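A sketch of that check for the full backup date ('01 January 0001 00:00:00' is [datetime]::MinValue, the value SMO reports when no backup exists):

$db = $srv.Databases[$Database]
if ($db.LastBackupDate -eq [datetime]::MinValue)
{
    'Full backup: NEVER'
}
else
{
    'Full backup: ' + $db.LastBackupDate
}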

The Code is here Show-LastDatabaseBackup


Showing and Killing SQL Server Processes with PowerShell


Another post in the PowerShell Box of Tricks series. There are much better ways of doing this, I admit, but as you can do it with PowerShell I created a function to do it.

Create a Server object and, by piping it to Get-Member, notice that there is a method named EnumProcesses; then look at the properties and methods of what EnumProcesses returns.

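A sketch of that exploration:

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'SERVERNAME'
$srv | Get-Member -Name EnumProcesses
$srv.EnumProcesses() | Get-Member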

Once I had done that it was easy to create a function to display what is going on. It's quick and easy. Not as good as sp_WhoIsActive, but it displays about the same information as sp_who or sp_who2.


You can also find a method called KillProcess on the Server object, so I ask a Yes/No question using Windows Forms before killing anything. You can find much more detail on that here.

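A sketch of the confirmation and kill; KillProcess is the real SMO method, while the MessageBox wording is an assumption:

Add-Type -AssemblyName System.Windows.Forms
$answer = [System.Windows.Forms.MessageBox]::Show("Are you sure you want to kill SPID $spid ?", 'Kill Process', 'YesNo')
if ($answer -eq 'Yes')
{
    $srv.KillProcess($spid)
}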

All you need to supply is the spid

The code is here Show-SQLProcesses


#TSQL2sDay Why My Head is Always in The Cloud


Today's post is my first for the TSQL2sDay series. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. If you want to catch up on all the fun to date, check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Azure Balloon - Credit http://owenrichardson.com/

This one is hosted by Jorge Segarra (@SQLChicken), who said: "This month's topic is all about the cloud. What's your take on it? Have you used it? If so, let's hear your experiences. Haven't used it? Let's hear why or why not? Do you like/dislike recent changes made to cloud services? It's clear skies for writing! So let's hear it folks, where do you stand with the cloud?"

My wife would tell you that my head is always in the cloud, and she's right (she usually is), just not like that picture! I would love to float gracefully above the land and gaze upon the view, but it's the landing that bothers me and will always stop me from trying it.

Credit http://owenrichardson.com/

She's right, pedantically and literally too, because this year I have spent a lot of time with my head and my fingers and my thinking in virtual machines using Windows Azure. That is where I have learnt a lot of my SQL and PowerShell this year. After SQL Saturday Exeter and SQL Bits in Nottingham this year I have needed a place to practice and learn, an environment to try things, break things, mend them again, and experiment.

I learn just as well by doing things as I do by reading about them. Stuart Moore @napalmgram has a great post called Learning to Play with SQL Server, and whilst I haven't been as rough with my Azure SQL instances as he suggests, I have been able to practice at will without worry and, thanks to my MSDN subscription, without cost. I have taken examples from blog posts and demos from user group sessions and run them on my Windows Azure VMs.

Every single blog post I have written this year that has examples was written in Azure, with screenshots from Azure. Whilst some of my PowerShell scripts in the PowerShell Box of Tricks series had already been written to solve one particular problem or another at MyWork, every single one was refined and demoed in Azure, all the screenshots were taken there, and several were developed there too.

My first ever session, to the SQL South West user group, was about Spinning up and Shutting Down VMs in Azure, and was an interesting experience in Murphy's Law which meant I ended up having to deliver it on Azure.

The second time I spoke was about the PowerShell Box of Tricks series, to the Cardiff User Group. Having learnt my lesson from the first time, I had bought a mini HDMI to VGA converter and tested it using a couple of monitors at home, and it worked wonderfully. However, when I got to Cardiff my little Asus convertible didn't provide enough grunt to power the funky presentation screen. Luckily, Stuart Moore @napalmgram, who was also there doing his excellent PowerShell Back Up and Restore session, let me use his Mac, and I was able to deliver the session using Office Web Apps to run the PowerPoint from my SkyDrive whilst all the demos were on... yup, you guessed it, Windows Azure!!!

So I feel qualified to answer Jorge’s questions and take part in T-SQL Tuesday this time round.

I like Azure. I like the ease with which I can spin up and down machines or any PaaS services at will. I love that I can do it with PowerShell, because I really enjoy using PowerShell in my day to day work and at home too. Living as I do in a beautifully convenient bungalow in the country, I still enjoy the frustration of watching that spinning ring as my videos buffer on our 1.8Mbps (at best) internet connection. Whilst that does have an impact on using Azure, it is a damn sight better than waiting many days trying to download one single file, something like an ISO file for the latest SQL Server CTP for example.

There is no way I would have got a look at SQL Server 2014 if it wasn’t for Azure. I was able to spin up a SQL Server 2014 machine in only a few minutes and log in and have a play and then delete it. I have done the same with Server 2012 and 2012 R2. It has enabled me to try setting up Availability Groups and other technologies not yet implemented at MyWork

I wouldn't have been able to do any of that on my machines at home, as I don't have anything capable of running Hyper-V whilst this 8-year-old desktop still keeps hanging on despite the odd noises. (Negotiations are currently in place to replace it with something shiny and new. Just need that lottery win now!!)

I have also transferred my Cricket Averages database to WASD and am talking with a friend of mine about developing an app that will use the mobile service as well.

The rate of change is much quicker in the cloud; things change, and change quickly. As quickly as I had written my post about Spinning up and Shutting Down VMS in Azure, Microsoft changed the rules and stopped charging for machines that were turned off. New services appear all the time. New services move quickly through from preview to release and, as Grant Fritchey noticed this week, new views have been added to Windows Azure SQL Database under the covers. I think this is something we are just going to have to live with. The scale of the cloud means it is much easier to test improvements at large scale, and that means they can be released quicker. It makes it more challenging to keep up, I admit, but it's a constant drip of new things rather than a big bang all at once.

Azure has brought me to where I am today and I think it will continue to be part of my future. If I remember to submit my PowerShell session for SQL Saturday Exeter (Submit yours here) and it gets chosen then you will be able to see me there (if you register here) using Azure to give back to the SQL Community


What Runs on the SQL Server when you run a PowerShell script?–Question from #SQLRelay


Last week I ran a PowerShell lab at SQL Relay in Cardiff. There are still a few places available for SQL Relay week 2. Take a look here for more details and follow the twitter hashtag #SQLRelay for up to date information

The link for my slides and demos from the second part are here https://t.co/Fik2odyUMA

Whilst we were discussing Show-LastDatabaseBackup, Kev Chant (@KevChant) asked where it was getting the information from. I answered that PowerShell runs SQL commands under the hood against the server, and that if you ran Profiler that is what you would see. We didn't have time to do that in Cardiff, so I thought I would do it today to show what happens.

A reminder of what the Show-LastDatabaseBackup function does: it displays the LastBackupDate, LastDifferentialBackupDate and LastLogBackupDate properties of a database, as described in the earlier post in this series.

If we start a trace with Profiler and run this function, we get the backup dates back in PowerShell.

In Profiler we see that it runs the following T-SQL to find the database:

exec sp_executesql N'
SELECT dtb.name AS [Name]
 FROM master.sys.databases AS dtb 
WHERE (dtb.name=@_msparam_0)',N'@_msparam_0 nvarchar(4000)',@_msparam_0=N'RageAgainstTheMachine' 

and then the following for LastBackupDate:

exec sp_executesql N'
 create table #tempbackup (database_name nvarchar(128), [type] char(1), backup_finish_date datetime)
 insert into #tempbackup select database_name, [type], max(backup_finish_date) 
from msdb..backupset where [type] = ''D'' or [type] = ''L'' or [type]=''I'' 
group by database_name, [type] 
SELECT (select backup_finish_date 
from #tempbackup 
where type = @_msparam_0 and db_id(database_name) = dtb.database_id) AS [LastBackupDate] 
FROM master.sys.databases AS dtb
 WHERE (dtb.name=@_msparam_1) drop table #tempbackup ',N'@_msparam_0 nvarchar(4000),@_msparam_1 nvarchar(4000)',@_msparam_0=N'D',@_msparam_1=N'RageAgainstTheMachine'

For LastDifferentialBackupDate:

exec sp_executesql N'
 create table #tempbackup (database_name nvarchar(128), [type] char(1), backup_finish_date datetime)
 insert into #tempbackup select database_name, [type], max(backup_finish_date)
 from msdb..backupset 
where [type] = ''D'' or [type] = ''L'' or [type]=''I'' 
group by database_name, [type] 
SELECT (select backup_finish_date
 from #tempbackup 
where type = @_msparam_0 and db_id(database_name) = dtb.database_id) AS [LastDifferentialBackupDate] 
FROM master.sys.databases AS dtb 
WHERE (dtb.name=@_msparam_1)
drop table #tempbackup ',N'@_msparam_0 nvarchar(4000),@_msparam_1 nvarchar(4000)',@_msparam_0=N'I',@_msparam_1=N'RageAgainstTheMachine' 

And for LastLogBackupDate:

exec sp_executesql N' 
create table #tempbackup (database_name nvarchar(128), [type] char(1), backup_finish_date datetime) 
insert into #tempbackup select database_name, [type], max(backup_finish_date)
 from msdb..backupset 
where [type] = ''D'' or [type] = ''L'' or [type]=''I'' 
group by database_name, [type] 
SELECT (select backup_finish_date 
from #tempbackup 
where type = @_msparam_0 and db_id(database_name) = dtb.database_id) AS [LastLogBackupDate] 
FROM master.sys.databases AS dtb 
WHERE (dtb.name=@_msparam_1)
drop table #tempbackup ',N'@_msparam_0 nvarchar(4000),@_msparam_1 nvarchar(4000)',@_msparam_0=N'L',@_msparam_1=N'RageAgainstTheMachine' 

So the answer to your question, Kev, is: yes, it does get the information from the msdb database.


Starting My Azure SQL Server VMs with PowerShell


The last post about Launching Azure VMs with PowerShell made someone ask me to explain how I start my Azure VMs normally so here goes.

When I decide to write a blog post or develop and test a script or run through demos from a presentation or blog post I fire up my Azure Virtual machines with PowerShell. This is how I do it

Open PowerShell and check that I am connected to my default subscription by running Get-AzureSubscription

Note – You must have installed Windows Azure PowerShell and installed the PublishSettingsFile or used Add-AzureAccount for your subscription following the steps here

http://www.windowsazure.com/en-us/manage/install-and-configure-windows-powershell/

Then I run the following three Cmdlets

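Presumably something like:

Get-AzureVM
Get-AzureVM | Start-AzureVM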

Get-AzureVM shows me the VMs associated with that subscription.

I then pipe to Start-AzureVM as I want to start both machines. If I only wanted one I would check that

Get-AzureVM -name Fade2Black -ServiceName TheBestBeard

returned the correct machine and then pipe that to Start-AzureVM

Once the VMs have started I use Get-AzureRemoteDesktopFile, giving a local path for the RDP file and specifying -Launch to run the RDP session.

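For example (using the service and VM names from above; the local path is an assumption):

Get-AzureRemoteDesktopFile -ServiceName TheBestBeard -Name Fade2Black -LocalPath C:\temp\Fade2Black.rdp -Launch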

and away we go :-)

Once I have finished simply run

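Presumably something like:

Get-AzureVM | Stop-AzureVM -Force

(-Force suppresses the prompt Stop-AzureVM gives before deallocating the last VM in a cloud service.)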

and my machines are stopped and no longer running my credit down.


Dropping All Tables From A SQL Database with PowerShell


This post could also have been titled "Confusion with foreach or ForEach-Object".

The scenario: having created a blank database, a number of users, and permissions for an external consultant to create a test database for an application, I got a phone call.

“Please can you drop all the tables from the database as we need to re-run the installer with some different parameters”

Sure, I thought. No problem. I will use PowerShell. A simple script is all I need

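A sketch of that first attempt (server and database names are placeholders):

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'SERVERNAME'
$db = $srv.Databases['TestDB']
foreach ($table in $db.Tables)
{
    $table.Drop()
}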

That ought to do it: loop through the tables and drop each one. But when I ran it I got the error "Collection was modified; enumeration operation may not execute".

What I did (which I should have done first up, but time pressures hadn't allowed) was drop the database and write a script to recreate it and all the users and permissions required, using my Create Windows User Function and Add User to Database Role Function, but it got me thinking.

So I went home, fired up my Azure VMs, had a play and found two ways of resolving it. But first let's understand what is happening here. I read this post, which explains it quite well for his script:

We are going through a list collection and deleting any instance of our event receiver, in the “Foreach loop”. But once we delete an item we are modifying the current list collection. The “Foreach” loop looks to see what the current value is, before it moves on to the next item. But since we deleted the current item, we get the “Collection was modified; enumeration operation may not execute” error.

Now that we understand what is going on, we can look at a solution to correct the error.

The simplest way to avoid modifying the collection would be with a “For Loop”.  With a “For Loop”, no modifications are made that will interrupt the looping process.

So when PowerShell has dropped the table it returns to the tables collection to find the current table before moving on to the next one, but as we have deleted the table it falls over.

So let's fix it.

First lets create a test database with PowerShell. A piece of code that is useful to keep for scenarios like this. If you are creating a database for something other than a quick demo or a test then go and explore the other properties of the database object that you will surely want to configure. But for this demo the following is fine, it will use default options. The same applies for the tables script below.

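A sketch, with default options (the names are placeholders):

$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'SERVERNAME'
$db = New-Object Microsoft.SqlServer.Management.Smo.Database($srv, 'TestDB')
$db.Create()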

Now let's create some tables.

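Again a sketch; five single-column tables are enough to demonstrate the problem:

1..5 | ForEach-Object {
    $table = New-Object Microsoft.SqlServer.Management.Smo.Table($db, "Table$_")
    $col = New-Object Microsoft.SqlServer.Management.Smo.Column($table, 'ID', [Microsoft.SqlServer.Management.Smo.DataType]::Int)
    $table.Columns.Add($col)
    $table.Create()
}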

And check they have been created

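For example:

$db.Tables | Select-Object Name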

Now following the advice from above we can do the following

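A sketch of the for loop approach:

$tables = $db.Tables.Count
for ($i = 0; $i -lt $tables; $i++)
{
    $table = $db.Tables[0]
    $table.Drop()
}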

First we count the number of tables and set it to a variable, and then create a for loop. Note that if you put $i -le $tables.Count in the loop condition the script will only delete 4 tables, because the count is re-evaluated as the collection shrinks! In the script block we set the $table variable to the first table in the collection and then drop it. List the table names again to check, or count the tables, and you will see that they have all been deleted.

This was the other solution I found. It makes use of the Scripter object to script the drop commands for the tables, add them to a query string, and pass that to Invoke-SQLCmd to run it.

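A sketch of that approach (Scripter and its ScriptDrops option are real SMO members):

$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $srv
$scripter.Options.ScriptDrops = $true
$query = ''
foreach ($table in $db.Tables)
{
    $query += ($scripter.Script($table) -join "`r`n") + "`r`n"
}
Invoke-Sqlcmd -ServerInstance 'SERVERNAME' -Database 'TestDB' -Query $query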



How I Check Hundreds of SQL Agent Jobs in 60 Seconds with Powershell


Checking that your Agent Jobs have completed successfully is a vital part of any DBA's responsibility. It is essential to ensure that all of the hard work you have put into setting up the jobs can be quickly and easily checked. In a large estate this can be very time consuming and, if done manually, prone to human error. I have repeatedly mentioned John Sansom's blog post entitled "The Best DBAs Automate Everything" and I follow that advice. Today I will share with you one of the first scripts that I wrote.

When I started as a DBA I was told that my first job every morning was to check the Agent Jobs and resolve any errors. This is still something I do first before anything else. (Except coffee; experience has taught me that you get your coffee before you log into your computer, otherwise on the bad days you can miss out on coffee for many an hour.) I have two scripts to do this. The first sends me an email if the number of failed jobs on a server is greater than zero. This helps me to quickly and simply identify where to start in the case of multiple failures and is also a backup to the second script.

The second script runs on a different server and creates an excel worksheet and colour codes it. This makes it very simple to quickly scroll through the sheet and spot any red cells which designate failed jobs and also provides a nice easy to understand method to show management that on that specific day everything went well (or badly)

As with any PowerShell script which manipulates Office applications, you first need to create an Excel COM object and add the workbook and worksheet to it. I also set a filename date variable and a date variable for the sheet.

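A sketch of that setup (the date formats are assumptions):

$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Add()
$worksheet = $workbook.Worksheets.Item(1)
$cells = $worksheet.Cells
$filenameDate = Get-Date -Format ddMMyyyy
$date = (Get-Date).ToLongDateString()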
When you use PowerShell to manipulate Excel you can access individual cells by identifying them by row and column. I use this to create a description for the workbook, as follows.
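A sketch (the exact heading text and formatting are assumptions):

$cells.Item(1,1) = "SQL Agent Job Results - $date"
$cells.Item(1,1).Font.Size = 14
$cells.Item(1,1).Font.Bold = $true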

There are lots of properties that you can play with within Excel. As with anything in PowerShell, the best way to find what you need is to use the Get-Member cmdlet. If you run

($cells.item(1,3)|Get-Member).Count

You will see that there are 185 Methods and Properties available to you (in Office 2013 on Windows 8.1)

The description snippet above creates the heading rows of the worksheet.

We are going to colour code the Job Status according to the three available results: Successful, Failed and Unknown. We are also going to colour code the date column to show when the job was last run; this will enable you to easily identify the case where the last run succeeded but last night the job didn't kick off for some reason.

The next step is a fairly standard loop through the available servers, picking them from a SQLServers text file: a list of the server names (ServerName\Instance if required) that you wish to check. You could also just create an array of server names, or pick them from a table with Invoke-SQLCmd, but whichever way you do it you need to be able to iterate through the array and then through the .Jobs collection of the JobServer object, as sketched below.

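A sketch of the loop (the text file path is an assumption):

foreach ($ServerName in Get-Content 'C:\Scripts\SQLServers.txt')
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $ServerName
    foreach ($Job in $srv.JobServer.Jobs)
    {
        # read the job properties listed below and write them to the worksheet,
        # colouring cells according to LastRunOutcome and LastRunDate
    }
}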
The script then uses the following properties of the $Job object and writes the Excel file according to the logic in the description:

$Job.Name
$Job.IsEnabled
$Job.LastRunOutcome
$Job.LastRunDate

To finish up, save the workbook to a share available to all of the DBA team and quit Excel. Notice that I use a double whammy to make sure Excel is really gone: first I quit the COM object and then I stop the process. I do this because I found that on my server quitting the COM object left the Excel process running, and I ended up with dozens and dozens of them. If you have Excel open before you run this script, either comment out the last line or save your work (you should save your work anyway, regularly!).
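A sketch of that clean-up (the share path is an assumption):

$workbook.SaveAs("\\Share\DBA\AgentJobResults$filenameDate.xlsx")
$excel.Quit()
# quitting the COM object can leave EXCEL.EXE running, so stop the process as well
Stop-Process -Name EXCEL -Force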

As always I take no responsibility for your environment, that’s your Job! Don’t run this on Production unless you know what it is doing and are happy that you have first tested it somewhere safely away from any important systems. Make sure that you understand the correct time to run this job and have qualified the impact on the box it is running on.

In the finished Excel sheet any failures are easy to spot.

In my example the Data Transfer job needed investigation! The reason I add two yellow rows above and below each server's list of jobs is to help me identify any server that is not responding, as that is easily recognised as two lots of yellow with nothing between them.

I have considered improving this script by inputting the data into a database and running a report from that database but have not had the need to do so yet.

Enjoy this script; you can find it here.

If you have any questions please use the comment box below.


Listing the SQL Server SysAdmins With PowerShell


A very short blog today just to pass on this little script.

I was required to list all of the SysAdmins across a large estate. Obviously I turned to PowerShell :-)

I iterated through my server list collection and then created a server SMO object and used the EnumServerRoleMembers method to display all of the sysadmin members

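A sketch of the loop; EnumServerRoleMembers is the SMO method, and the server list source is an assumption:

foreach ($ServerName in Get-Content 'C:\Scripts\SQLServers.txt')
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $ServerName
    $ServerName
    $srv.Roles['sysadmin'].EnumServerRoleMembers()
}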

This will work on SQL2000 – SQL2012. You can see how you can easily change the rolename in the script to enumerate other server roles.

Another way you could do it is to use the query

SELECT c.name AS Sysadmin_Server_Role_Members
FROM sys.server_principals a
INNER JOIN sys.server_role_members b
ON a.principal_id = b.role_principal_id AND a.type = 'R' AND a.name = 'sysadmin'
INNER JOIN sys.server_principals c
ON b.member_principal_id = c.principal_id

and pass that with Invoke-SQLCMD through to every server (if you had to use PowerShell :-) ). That query won't work with SQL 2000, though.


A look at the SQL Assessment Intelligence Pack in Operational Insights


Operational Insights is a service that has been added in preview to Azure. It enables you to collect, combine, correlate and visualize all your machine data in one place. It can collect data from all of your machines either via SCOM or by using an agent. Once the data is collected Operational Insights has a number of Intelligence Packs which have pre-configured rules and algorithms to provide analysis in various areas including for SQL Server

http://azure.microsoft.com/en-gb/services/operational-insights/

I thought I would take a look. I have an installation of SCOM in my lab on my laptop, and I read the instructions to see how to connect it to Operational Insights. (You don't have to have a SCOM installation to use Operational Insights; you can make use of an agent as well, just follow the steps from the page below.)

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/

It really is very simple

If you have an Azure subscription already you can sign into the portal and join the preview program by clicking

New –> App Services –> Operational Insights

and create a new Operational Insights Workspace.

Once you have done that, if you have an installation of SCOM 2012 you need to be running Service Pack 1 and download and install the System Center Operational Insights Connector for Operations Manager and import the MPB files into SCOM.

If you have SCOM 2012 R2 the connector is already installed, and connecting your SCOM to Operational Insights is very, very easy, as you can see at

http://azure.microsoft.com/en-us/trial/operational-insights-get-started/?step2=withaccount&step3=SCOMcustomer

  1. In the Operations Manager Console, click Administration.
  2. Under Administration, select System Center Advisor, and then click Advisor Connection.
  3. Click Register to Advisor Service.
  4. Sign in with your Microsoft or Organizational account.
  5. Choose an existing Operational Insights workspace from the drop down menu
  6. Confirm your changes.
  7. In the System Center Advisor Overview page, Under Actions, click Add a Computer/Group.
  8. Under Options, select Windows Server or All Instance Groups, and then search for and add the servers that you want to collect data from.

That is it. No really, that is it. I was amazed how quickly I was able to get this done in my lab and it would not take very long in a large implementation of SCOM either as you will have your groups of computers defined which will make it easy to decide which groups to use. You could use a separate workspace for each type of server or split up the information per service. It really is very customisable.

Once you have done that, go and add some of the Intelligence Packs. Each intelligence pack will change the amount and type of data that is collected. As of November 23rd there are:

Alert Management – for your SCOM Alerts

Change Tracking – Tracking Configuration Changes

Log Management – for event log collection and interrogation

System Update Assessment – Missing Security Updates

Malware Assessment – Status of Anti-Malware and Anti-Virus scans

Capacity Planning – Identify Capacity and Utilisation bottlenecks

SQL Assessment – The risk and health of SQL Server Environment

There are also two ‘coming soon’ Intelligence packs

AD Assessment – Risk and health of Active Directory

Security – Explore security related data and help identify security breaches

You then (if you are like me) have a period of frustration whilst you wait for all of the data to be uploaded and aggregated but once it is you sign into the Operational Insights Portal

https://preview.opinsights.azure.com

There is a lot of information there. As it is on my laptop and the lab is not running all of the time and is not connected to the internet most of the time I am not surprised that there are some red parts to my assessment!!

Obviously I was interested in the SQL Assessment and I explored it a bit further

Clicking on the SQL Assessment tile takes you to a screen which shows the SQL Assessment broken down into six Focus Areas:

Security and Compliance, Availability and Business Continuity, Performance and Scalability, Upgrade, Migration and Deployment, Operations and Monitoring, and Change and Configuration Management. MSDN (http://msdn.microsoft.com/en-us/library/azure/dn873967.aspx) gives some more information about each one:

Security and Compliance – Safeguard the reputation of your organization by defending yourself from security threats and breaches, enforcing corporate policies, and meeting technical, legal and regulatory compliance requirements.

Availability and Business Continuity – Keep your services available and your business profitable by ensuring the resiliency of your infrastructure and by having the right level of business protection in the event of a disaster.

Performance and Scalability – Help your organization to grow and innovate by ensuring that your IT environment can meet current performance requirements and can respond quickly to changing business needs.

Upgrade, Migration and Deployment – Position your IT department to be the key driver of change and innovation, by taking full advantage of new enabling technologies to unlock more business value for organizational units, workforce and customers.

Operations and Monitoring – Lower your IT maintenance budget by streamlining your IT operations and implementing a comprehensive preventative maintenance program to maximize business performance.

Change and Configuration Management – Protect the day-to-day operations of your organization and ensure that changes won’t negatively affect the business by establishing change control procedures and by tracking and auditing system configurations.

You will be able to see some dials showing you how well you are doing in each area for the servers whose data has been collected.


Each area will have the High Priority Recommendations shown below the dial and you can click on them to see more information about those recommendations


You can also click the dial or the 'see all' link to enter the search area, where you can customise how you wish to see the data that has been collected. This looks a bit confusing at first.

The top bar contains the search, the timescale, and some buttons to save the search, view the saved searches and view the search history, all of which will be shown in the right-hand column below.

The left column contains a bar graph for the search and all of the filters. The middle column contains the results of the search and can be viewed in list or tabular format and exported to CSV using the button below. A little bit of experimentation will give you a better understanding of how the filtering works and how you can make use of that for your environment

By looking at the search for the Operations and Monitoring Focus Area shown above

Type:SQLAssessmentRecommendation IsRollup=true RecommendationPeriod=2014-11 FocusArea="Operations and Monitoring" RecommendationResult=Failed | sort RecommendationWeight desc

I saw that RecommendationResult=Failed and changed it to RecommendationResult=Passed. This enabled me to see all of the Recommendations that had been passed in the Focus Area and clicking the export button downloaded a csv file. I deleted RecommendationResult=Passed from the search and that gave me all of the recommendations that made up that Focus Area

Operations and Monitoring Focus Area

Recommendation
Enable Remote Desktop on servers.
Enable Remote Desktop on virtual machines.
Ensure computers are able to download updates.
Configure event logs to overwrite or archive old events automatically.
Review event log configuration to ensure event data is retained automatically. This relates to System Logs
Review event log configuration to ensure event data is retained automatically. This relates to Application Logs

I decided then to do the same for each of the Focus Areas for the SQL Assessment Intelligence Pack

Security and Compliance Focus Area

Recommendation
Change passwords that are the same as the login name.
Remove logins with blank passwords.
LAN Manager Hash for Passwords Stored
Investigate why unsigned kernel modules were loaded.
Apply security best practices to contained databases.
Enable User Account control on all computers.
Consider disabling the xp_cmdshell extended stored procedure.
Implement Windows authentication on Microsoft Azure-hosted SQL Server deployments.
Avoid using the Local System account to run the SQL Server service.
Avoid adding users to the db_owner database role.
Ensure only essential users are added to the SQL Server sysadmin server role.
Disable SQL Server guest user in all user databases.
Avoid running SQL Server Agent jobs using highly-privileged accounts.
Configure the SQL Server Agent service to use a recommended account.
Apply Windows password policies to SQL Server logins.
Investigate failures to validate the integrity of protected files.
Investigate failures to validate kernel modules.

Availability and Business Continuity Focus Area

Recommendation
Schedule full database backups at least weekly.
Optimize your backup strategy with Microsoft Azure Blob Storage.
Avoid using the Simple database recovery model.
Ensure all installations of Windows are activated.
Investigate logical disk errors.
Reduce the maximum Kerberos access token size.
Investigate connection failures due to SSPI context errors.
Set the PAGE_VERIFY database option to CHECKSUM.
Increase free space on system drives.
Investigate a write error on a disk.
Check the network access to Active Directory domain controllers.
Review DNS configuration on non-DNS servers.
Increase free space on system drives.
Investigate memory dumps.
Increase free space on system drives.
Investigate why the computer shut down unexpectedly.
Enable dynamic DNS registration for domain-joined servers.

Performance and Scalability Focus Area

Recommendation
Increase the number of tempdb database files.
Configure the tempdb database to reduce page allocation contention.
Ensure all tempdb database files have identical initial sizes and growth increments.
Set autogrowth increments for database files and log files to fixed values rather than percentage values.
Set autogrowth increments for transaction log files to less than 1GB.
Modify auto-grow settings to use a fixed size growth increment of less than 1GB and consider enabling Instant File Initialization.
Change your Affinity Mask and Affinity I/O MASK settings to prevent conflicts.
Resolve issues caused by excessive virtual log files.
Modify the database file layout for databases larger than 1TB.
Set the AUTO_CLOSE option to OFF for frequently accessed databases.
Review memory requirements on servers with less than 4GB of physical memory installed.
Configure system SiteName properties to be dynamic.
Align the Max Degree of Parallelism option to the number of logical processors.
Align the Max Degree of Parallelism option to the number of logical processors.
Consider disabling the AUTO_SHRINK database option.
Review memory requirements on computers with high paging file use.
Ensure SQL Server does not consume memory required by other applications and system components.
Consider changing your power saving settings to optimize performance.
Increase the initial size of the tempdb database.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review your paging file settings.
Review and optimize memory cache configuration.
Review the configuration of Maximum Transfer Unit (MTU) size.
Review the system processor scheduling mode.
Review network provider ordering settings.
Remove invalid entries from the PATH environment variable.
Remove network entries from the PATH environment variable.
Investigate processes that use a large number of threads.
Avoid hosting user database files on the same disk volume as tempdb database files.
Review processes with large working set sizes.
Reduce the length of the PATH environment variable.
Reduce the number of entries in the PATH environment variable.
Ensure SQL Server does not consume memory required by other applications and system components.
Enable the backup compression default configuration option.
Ensure the DNS Client service is running and is set to start automatically.
Consider compressing database tables and indexes.

Upgrade, Migration and Deployment Focus Area

Recommendation
Ensure all devices run supported operating system versions.
Ensure that the guest user is enabled in the msdb database.
Avoid using the Affinity64 Mask configuration setting in new development work.
Avoid using the Affinity Mask configuration setting in new development work.
Avoid using the Affinity I/O Mask configuration setting in new development work.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Allow Updates configuration option in SQL Server.
Avoid using the Affinity64 I/O Mask configuration setting in new development work.
Configure SQL Server to accept incoming connections.
Configure SQL Server instances and firewalls to allow communication over TCP/IP.

As I have no data for Change and Configuration Management I was not able to see the recommendations in my Operation Insights Workspace.

Edit: Daniele Muscetta has said in the comments that this is a bug which is being tracked

As you can see from the type and description of the recommendations above these are all areas that a DBA will be concerned about and the benefit of having all of this information gathered, pre-sorted, prioritised and presented to you in this manner will enable you to work towards a better SQL environment and track your progress. You can read more about the SQL Assessment Intelligence Pack here

http://msdn.microsoft.com/en-us/library/azure/dn873958.aspx

As well as the pre-determined queries that are built into the Intelligence pack you can search your data in any way that you require enabling you to present information about the health and risk of your SQL Environment to your team or your management with ease. The “with ease” bit is dependent on you understanding the language and structure of the search queries.

You will need to put this page into your bookmarks

http://msdn.microsoft.com/library/azure/dn873997.aspx

as it contains the syntax and definitions you need to search your data.

A very useful page for a starter like me is

http://blogs.msdn.com/b/dmuscett/archive/2014/10/19/advisor-searches-collection.aspx

by Daniele Muscetta which has a list of useful Operational Insights search queries such as

SQL Recommendation by Computer

Type=SQLAssessmentRecommendation IsRollup=false RecommendationResult=Failed | measure count() by Computer

If you click the star to the right of the search box you will find the saved searches. For the SQL Assessment Intelligence Pack there are

Did the agent pass the prerequisite check (if not, SQL Assessment data won’t be complete)
Focus Areas
How many SQL Recommendation are affecting a Computer a SQL Instance or a Database?
How many times did each unique SQL Recommendation trigger?
SQL Assesments passed by Server
SQL Recommendation by Computer
SQL Recommendation by Database
SQL Recommendation by Instance

You can use these and you can save your own searches which show the data in a way that is valuable to you.

Overall I am impressed with this tool and can see how it can be beneficial for a DBA as well as for System Administrators. I was amazed how easy it was to set up and how quickly I was able to start manipulating the data once it had been uploaded.


Show AutoGrowth Events with Powershell to CSV


This week I was reading Pinal Dave's post about Autogrowth Events

http://blog.sqlauthority.com/2015/02/03/sql-server-script-whenwho-did-auto-grow-for-the-database/

As it happened, I had a requirement to make use of the script only a few days later. I was asked to provide the information in a CSV so that the person who required the information could manipulate it in Excel.

I am a great believer in automation: if you are going to do something more than once, then automate it. So I wrote two functions, added them to TFS, and now they will be available to all of my team members next time they load PowerShell.

Why two functions? Well, Pinal Dave's script gets the information from the default trace for a single database, but there may be times when you need to know the autogrowth events that happened on a server with multiple databases.

I use a very simple method for doing this, as I have not found the correct way to parse the default trace with PowerShell. The functions rely on Invoke-SQLCMD2, which I also have in my functions folder, and pass the query from Pinal Dave's blog post as a here-string:

$Results = Invoke-Sqlcmd2 -ServerInstance $Server -Database master -Query $Query

To output to CSV I use the Export-CSV cmdlet

if ($CSV)
{
    $Results | Export-Csv -Path $CSV
}

And to open the CSV I add a [switch] parameter. You can find out more about parameters here or by

Get-Help about_Functions_Advanced_Parameters

so the parameter block of my function looks like

param
(
[Parameter(Mandatory=$true)]
[string]$Server,
[Parameter(Mandatory=$true)]
[string]$Database,
[Parameter(Mandatory=$false)]
[string]$CSV,
[Parameter(Mandatory=$false)]
[switch]$ShowCSV
)
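Inside the function body the switch is then used to open the file, presumably along these lines:

if ($ShowCSV)
{
    Invoke-Item -Path $CSV
}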

Now when I am asked again to provide this information it is as easy as typing

Show-AutogrowthServer -Server SQL2014Ser12R2 

or

Show-AutogrowthDatabase -Server SQL2014Ser12R2 -Database Autogrowth

and the results will be displayed.

Just a side note: Pinal Dave's script uses @@SERVERNAME in the WHERE clause, and if you have renamed your host the results will be blank. The resolution is to run the following T-SQL:

sp_dropserver 'OLDSERVERNAME';
GO
sp_addserver 'NEWSERVERNAME', 'local';
GO

You can find the scripts here

Show-AutoGrowthServer

Show-AutoGrowthDatabase

and all of my Script Center Submissions are here

As always: the internet lies, fibs and deceives, and everything you read, including this post, should be taken with a pinch of salt and examined carefully. All code should be understood and tested prior to running in a live environment.


Instances and Ports with PowerShell


Just a quick post, and a day late for #SQLNewBlogger. There are some excellent posts on that hashtag and I recommend that you read them.

When you know a server name but not the names of the instances or the ports that they are using, this function will be of use.


<#
.SYNOPSIS
Shows the Instances and the Port Numbers on a SQL Server

.DESCRIPTION
This function will show the Instances and the Port Numbers on a SQL Server using WMI

.PARAMETER Server
The Server Name

.EXAMPLE
Get-SQLInstancesPort Fade2Black

This will display the instances and the port numbers on the server Fade2Black
.NOTES
AUTHOR: Rob Sewell sqldbawithabeard.com
DATE: 22/04/2015
#>

function Get-SQLInstancesPort
{
    param ([string]$Server)

    [system.reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
    [system.reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | Out-Null
    $mc = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer $Server
    $Instances = $mc.ServerInstances

    foreach ($Instance in $Instances)
    {
        $Port = @{Name = "Port"; Expression = {$_.ServerProtocols['Tcp'].IPAddresses['IPAll'].IPAddressProperties['TcpPort'].Value}}
        $Parent = @{Name = "Parent"; Expression = {$_.Parent.Name}}
        $Instance | Select-Object $Parent, Name, $Port
    }
}

Installing and upgrading default scripts automation – part one – Introduction


First I must say thank you to all of the wonderful people who have put time and effort into providing free tools and scripts to enable not only myself but all SQL DBAs to ease their work. For this series I especially thank

  • Brent Ozar – w|t
  • Ola Hallengren – w
  • Adam Machanic – b|t
  • Jared Zagelbaum – b|t

The aim of this series is to share the methodology and the scripts that I have used to resolve this issue.

How can I automate the deployment and update of backup, integrity, index maintenance and troubleshooting scripts, as well as other default required scripts, to all of the instances under my control, and easily target any instances by OS version, SQL version, environment, system or any other configuration of my choosing?

This is Part 1 – Introduction. I will link to the further posts here as I write them.

So the scenario that led to this series is a varied estate of SQL servers and instances where I wanted an automated method of deploying the scripts and being able to target different servers. It needed to be easy to maintain, easy to use and easy to alter. I wanted to be able to update all of the scripts easily when required. I also wanted to automate the install of new instances and ensure that all required scripts were installed and documented.

The method of doing this that I chose is just that: the way that I chose. Whether it will work for you and your estate I don't know, but I hope you will find it of benefit. Of course you must test it first. Ensure that you understand what is happening, what it is doing, and that that is what you want it to do. If you implement this methodology of installing scripts you will easily be able to start by targeting your development server and then gradually roll it out to any other environments, whilst always making sure that you test, monitor and validate prior to moving to the next.

I decided that I needed to have a DBA Database to start with. The role of the DBA Database is to be the single source of truth for the instances that are under my control, a source for the location of the scripts that I need to deploy and a place to hold the information that I gather from the servers. It is from this database that I will be able to target the instances as required and set the flags to update the scripts as and when I need to


On that instance I also chose to put the SQL Agent Job that will deploy all of the scripts. This is an important point. The account that you use to run that job whether it is the Agent Service Account or a proxy account will need to have privileges on every instance that you target. It will need to be able to run every script that you wish to target your servers. The privileges it requires are defined by the scripts that you want to run. How that is set up is unique to your environment and your system. I will only say that all errors are logged to the log files and will enable you to resolve these issues. You should always use the principle of least privilege required to get the job done. Domain and Sys Admin are not really the best answer here :-)

I also created 2 further Agent Jobs to gather Windows and SQL Information from the servers. These jobs target all the instances and servers in the DBA Database and gather information centrally about Windows and SQL configurations making it easy to provide that information to any other teams that require it. I am always looking for methods to reduce the workload on DBAs and enabling people (using the correct permissions) to gather the information that they require by self-service is very beneficial

Documentation and logging for the scripts are provided by the log files, stored as text files for troubleshooting, and also by the change log table in a DBA database on each instance, which I blogged about previously here.

The last thing was the script itself, which needed to be modular and easy to add to and amend.

Throughout this series of blog posts I will share and expand on the methods I used to do this. If you have any questions at any point please feel free to ask either by commenting on the post or by contacting me using the methods on my About Me page


Setting Up and Using Azure VM SQL Automated Backup (and Restore)


This weekend I was creating some Azure VMs to test, and was required to use the GUI for some screenshots. I have always used my PowerShell scripts described here to create my test systems and, with a new job taking up a lot of my time, had missed the announcement about Azure SQL Automated Backup and Azure SQL Automated Patching, so was surprised to see these new options in the portal.

I read the announcement and also the details on MSDN https://msdn.microsoft.com/en-us/library/azure/dn906091.aspx which show that this requires the SQL Server IaaS Agent. This is a default option on new virtual machines.

There are some other considerations too. It is only supported for SQL Server 2014 on Windows Server 2012 and 2012 R2 at present, and you can set a retention period up to a maximum of 30 days, but it is automated: you do not have to decide upon the backup strategy. Azure decides the frequency and type of backups dependent upon the workload of the database and some other factors, such as these:

A full backup is taken:
– when an instance is added to use Managed Backup
– when transaction log growth is 1GB or more
– at least once a week
– if the log chain is broken
– when a database is created

A transaction log backup is taken:
– if no log backup is found
– when transaction log space used is 5MB or larger
– at least once every two hours
– any time the transaction log backup is lagging behind a full database backup (the goal is to keep the log chain ahead of the full backup)

From <https://msdn.microsoft.com/en-gb/library/dn449496(v=sql.120).aspx>

There are some restrictions:
– Only database backups are supported
– System databases are not supported, so you need to back those up yourself
– You can only back up to Azure storage
– Maximum backup size is 1TB, as this is the maximum size for a blob in Azure storage
– Simple recovery is not supported
– Maximum retention is 30 days (if you are required to keep your backups for longer than 30 days for regulatory or other reasons you could use Azure Automation to copy the files to another storage account in Azure)

How to set it up.

If you are using the GUI then you will find SQL Automated Backup in the optional config blade of the setup, and you can follow the steps here to set it up. If (like me) you want to use PowerShell, then use the following code after you have created your virtual machine:

$storageaccount = "<storageaccountname>"
$storageaccountkey = (Get-AzureStorageKey -StorageAccountName $storageaccount).Primary
$storagecontext = New-AzureStorageContext -StorageAccountName $storageaccount -StorageAccountKey $storageaccountkey

$encryptionpassword = (Get-Credential -message 'Backup Encryption Password' -User 'IGNOREUSER').password
$autobackupconfig = New-AzureVMSqlServerAutoBackupConfig -StorageContext $storagecontext -Enable -RetentionPeriod 10 -EnableEncryption -CertificatePassword $encryptionpassword
Get-AzureVM -ServiceName <vmservicename> -Name <vmname> | Set-AzureVMSqlServerExtension -AutoBackupSettings $autobackupconfig | Update-AzureVM

Once you have run the code, Azure will take care of the rest. Add a couple of databases to your instance, look in the storage account, and you will see the backup files start to appear.

And in the automaticbackup container you will find the certificate and master key backups.

It will also create a credential on the instance (AutoBackup_Credential).

You can use the same credential to back up your system databases. If, like me, you use Ola Hallengren's excellent Maintenance Solution, then simply change your system databases backup job as follows:


USE [msdb]
GO
EXEC msdb.dbo.sp_update_jobstep @job_name = 'DatabaseBackup - SYSTEM_DATABASES - FULL', @step_id=1 ,
		@command=N'sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = ''SYSTEM_DATABASES'', @URL = ''https://myaccount.blob.core.windows.net/mycontainer''
		,  @Credential = ''AutoBackup_Credential'', @BackupType = ''FULL'', @Verify = ''Y'', @CleanupTime = NULL, @CheckSum = ''Y'', @LogToTable = ''Y''" -b'
GO

If you need to restore your database then you can use the GUI, and when you choose restore you will see a screen asking for your storage details.

Enter your storage account and the key, which you can get from the Azure portal. You will notice that the credential has already been selected; click connect and

there are all of your backups, ready to restore to any point in time that you choose. By clicking Script, the T-SQL is generated, which looks like this:


USE [master]
BACKUP LOG [Test] TO  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_LogBackup_2015-07-16_06-21-26.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,
NOFORMAT, NOINIT,  NAME = N'Test_LogBackup_2015-07-16_06-21-26',
NOSKIP, NOREWIND, NOUNLOAD,  NORECOVERY ,  STATS = 5
RESTORE DATABASE [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714201240+00.bak'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714202740+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150714224241+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715005741+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715031242+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715052742+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715074243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715095743+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150715121243+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NORECOVERY,  NOUNLOAD,  STATS = 5
RESTORE LOG [Test] FROM  URL = N'https://sqlbackupstoragebeard.blob.core.windows.net/asqlvm9-mssqlserver/Test_b8bb98d7a235487d9789b3ee8759cf3e_20150716060004+00.log'
WITH  CREDENTIAL = N'AutoBackup_Credential' ,  FILE = 1,  NOUNLOAD,  STATS = 5
GO

There is an important note. Remember this when you have just set it up so that you don’t think that you have done it wrong (which is what I did!)

When you enable Automated Patching for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the portal will not show that Automated Patching is configured. Wait several minutes for the agent to be installed and configured. After that the portal will reflect the new settings.

From <https://msdn.microsoft.com/en-us/library/azure/dn961166.aspx>

And also look out for this

8

The password I had chosen was not complex enough, but the PowerShell script had succeeded and not given me a warning.

To set up SQL Automated Patching you follow similar steps. The setting is again on the OS Config blade; click Enable and then you can choose the frequency and duration of the patching.

It is important to remember to choose your maintenance window correctly. If you have set up your SQL VMs correctly you will have them in an availability set and be using either mirroring or Availability Groups, which ensures availability during the underlying host patching. However, I had it confirmed by Principal Software Engineering Manager Sethu Srinivasan t via Microsoft PFE Arvind Shyamsundar b | t that SQL Automated Patching is not HA aware, so you will need to set the maintenance windows on each VM so that they do not overlap, as sketched below.
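
If you script the patching configuration rather than use the portal, you can stagger the windows yourself. Here is a minimal sketch using the same classic Azure cmdlets as the backup example above; the VM names are placeholders and you should check the parameters against your version of the module:

$patchconfig1 = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek Sunday -MaintenanceWindowStartingHour 2 -MaintenanceWindowDuration 120 -PatchCategory 'Important'
Get-AzureVM -ServiceName <vmservicename> -Name <vmname1> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $patchconfig1 | Update-AzureVM

# Give the second replica a later window so the two never overlap
$patchconfig2 = New-AzureVMSqlServerAutoPatchingConfig -Enable -DayOfWeek Sunday -MaintenanceWindowStartingHour 6 -MaintenanceWindowDuration 120 -PatchCategory 'Important'
Get-AzureVM -ServiceName <vmservicename> -Name <vmname2> | Set-AzureVMSqlServerExtension -AutoPatchingSettings $patchconfig2 | Update-AzureVM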



Populating My DBA Database for Power Bi with PowerShell – Server Info


Following my last post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database. They are the secondary part to my DBADatabase, which I also use to automate the installation and upgrade of all of my DBA scripts, as I started to blog about in this post Installing and upgrading default scripts automation – part one – Introduction, a series I will continue later.

In this post I will show how to create the following report

1

You will find the latest version of my DBADatabase creation scripts here.

I create the following tables

dbo.ClientDatabaseLookup
dbo.Clients
dbo.InstanceList
dbo.InstanceScriptLookup
dbo.ScriptList
Info.AgentJobDetail
Info.AgentJobServer
Info.Databases
Info.Scriptinstall
Info.ServerOSInfo
Info.SQLInfo

By adding Server Name, Instance Name, Port, Environment, NotContactable, and Location into the InstanceList table I can gather all of the information that I need and also easily add more information to other tables as I need to.

The NotContactable column is so that I am able to add instances that I cannot contact due to permission or environment issues. I can still gather information about them manually and add it to the table. I use the same script and change it to generate the SQL query rather than run it, save the query, and then run the query manually to insert the data, as sketched below. This is why I have the DateAdded and DateChecked columns, so that I know how recent the data is. I don't go as far as recording the change history, however, as that will be added to a DBA-Admin database on every instance which stores every change to the instance.
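
A minimal sketch of that generate-rather-than-run approach, assuming a hypothetical $NotContactable flag and an illustrative output path ($Query is the insert statement built exactly as shown later in this post):

if($NotContactable)
{
# Save the INSERT statement to be reviewed and run manually against the DBADatabase
$Query | Out-File -FilePath "C:\Temp\NotContactableInserts_$Date.sql" -Append
}
else
{
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
}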

The ServerOSInfo table is created like so


/****** Object: Table [Info].[ServerOSInfo]    Script Date: 26/08/2015 19:50:38 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [Info].[ServerOSInfo](
[ServerOSInfoID] [int] IDENTITY(1,1) NOT NULL,
[DateAdded] [datetime] NULL,
[DateChecked] [datetime] NULL,
[ServerName] [nvarchar](50) NULL,
[DNSHostName] [nvarchar](50) NULL,
[Domain] [nvarchar](30) NULL,
[OperatingSystem] [nvarchar](100) NULL,
[NoProcessors] [tinyint] NULL,
[IPAddress] [nvarchar](15) NULL,
[RAM] [int] NULL,
CONSTRAINT [PK__ServerOS__50A5926BC7005F29] PRIMARY KEY CLUSTERED
(
[ServerOSInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName= 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here or you can read on to see how it works and how to create the report

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible.

function Catch-Block{
param ([string]$Additional)
$ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
$Message = " This message came from the Automated Powershell script updating the
DBA Database with Server Information"
$Msg = $Additional + $ErrorMessage + " " + $Message
Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an additional parameter ($Additional) which holds each custom error message. I write this to both the event log and the text log file to enable easy troubleshooting, and I include the message from the error by accessing it with $_. I won't include the try catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $Servers

$AlltheServers = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT DISTINCT [ServerName] FROM [DBADatabase].[dbo].[InstanceList] WHERE Inactive = 0 OR NotContactable = 1"
$Servers = $AlltheServers| Select ServerName -ExpandProperty ServerName

I then loop through the array and gather the information with three WMI queries.

foreach($Server in $Servers)
{
Write-Log -Path $LogFile -Message "Gathering Info for $Server"
$DNSHostName = 'NOT GATHERED'
$Domain = 'NOT GATHERED'
$OperatingSystem = 'NOT GATHERED'
$IP = 'NOT GATHERED'
try{
$Info = get-wmiobject win32_computersystem -ComputerName $Server -ErrorAction Stop|select DNSHostName,Domain,
@{Name="RAM";Expression={"{0:n0}" -f($_.TotalPhysicalMemory/1gb)}},NumberOfLogicalProcessors

I give the variables some default values in case they are not picked up, and set the error action for the command to Stop so that a failure exits the try block. The first query gathers the DNSHostName, Domain Name, the amount of RAM in GB and the number of logical processors and the second gathers the Operating System version, but the third was the most interesting to do. There are many methods of gathering the IP Address using PowerShell and I tried a few of them before finding one that would work with all of the server versions that I had in my estate. This is a good point to say that this works in my lab and in my shop but may not necessarily work in yours, so understand, check and test this and any other script that you find on the internet before you let it anywhere near your production environment.

Unfortunately the one that worked everywhere remotely errored with the local server so I added a check to see if the server name in the variable matches the global environment variable of Computer Name

$OS = gwmi Win32_OperatingSystem -ComputerName $Server | select @{name='Name';Expression={($_.caption)}}
if($Server -eq $env:COMPUTERNAME)
{$IP = (Get-WmiObject -ComputerName $Server -class win32_NetworkAdapterConfiguration -Filter 'ipenabled = "true"' -ErrorAction Stop).ipaddress[0] }
else {$IP = [System.Net.Dns]::GetHostAddresses($Server).IPAddressToString }
Write-Log -Path $LogFile -Message "WMI Info gathered for $Server "

Once I have all of the information I check if the server already exists in the ServerOSInfo table and choose to either insert or update.

	$Exists = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query "SELECT [ServerName] FROM [DBADatabase].[Info].[ServerOSInfo] WHERE ServerName = '$Server'"
	
	if ($Exists)
	{
	$Query = @"
	UPDATE [Info].[ServerOSInfo]
	   SET [DateChecked] = GetDate()
	      ,[ServerName] = '$Server'
	      ,[DNSHostName] = '$DNSHostName'
	      ,[Domain] = '$Domain'
	      ,[OperatingSystem] = '$OperatingSystem'
	      ,[NoProcessors] = '$NOProcessors'
	      ,[IPAddress] = '$IP'
	      ,[RAM] = '$RAM'
	WHERE ServerName = '$Server'
	"@
	}
	else
	{
	$Query = @"
	INSERT INTO [Info].[ServerOSInfo]
	           ([DateChecked]
	           ,[DateAdded]
	           ,[ServerName]
	           ,[DNSHostName]
	           ,[Domain]
	           ,[OperatingSystem]
	           ,[NoProcessors]
	           ,[IPAddress]
	           ,[RAM])
	     VALUES
	   ( GetDate()
	      ,GetDate()
	      ,'$Server'
	      ,'$DNSHostName'
	      ,'$Domain'
	      ,'$OperatingSystem'
	      ,'$NoProcessors'
	      ,'$IP'
	      ,'$RAM')
	"@
	}
	Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
	

And that’s it. Now if you wish to gather different data about your servers then you can examine the data available to you by

get-wmiobject Win32_OperatingSystem -ComputerName $Server | Get-Member
get-wmiobject win32_computersystem -ComputerName $Server | Get-Member

If you find something that you want to gather you can then add the property to the script and gather that information as well. Make sure that you add the column to the table and to both the insert and update statements in the PowerShell script.
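
For example, if you wanted to record the last boot time as well, a hedged sketch (the [LastBootTime] column name is one I have made up for illustration) would look like this:

# Gather the last boot time from the same WMI class used for the OS version
$OSInfo = Get-WmiObject Win32_OperatingSystem -ComputerName $Server
$LastBoot = $OSInfo.ConvertToDateTime($OSInfo.LastBootUpTime)
# Then add a [LastBootTime] datetime column to [Info].[ServerOSInfo]
# and add ,[LastBootTime] and ,'$LastBoot' to both the UPDATE and the INSERT statements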

Creating the report in Power Bi

All data shown in the examples below has been generated from real-life data but all identifiable data has been altered or removed. I was born in Bolton and SQL SouthWest is based in Exeter :-)

Open Power Bi Desktop and click get data. Add the connection details for your DBA Database server and database and add the query

	SELECT SOI.[ServerOSInfoID]
	      ,SOI.[DateChecked]
	      ,SOI.[ServerName]
	      ,SOI.[DNSHostName]
	      ,SOI.[Domain]
	      ,SOI.[OperatingSystem]
	      ,SOI.[NoProcessors]
	      ,SOI.[IPAddress]
	      ,SOI.[RAM]
	,IL.ServerName
	,IL.InstanceName
		  ,IL.Location
		  ,IL.Environment
		  ,IL.Inactive
		  ,IL.NotContactable
	        FROM [DBADatabase].[Info].[ServerOSInfo] as SOI
	  JOIN [dbo].[InstanceList] as IL
	  ON IL.ServerName =  SOI.[ServerName]
	

2

Create a new column for the Operating System Edition by clicking Data on the left and using this code as described in my previous post

Operating System Edition = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "DataCenter",
"Microsoft Windows Server 2012 Standard","Standard",
"Microsoft Windows Server 2012 R2 Datacenter", "DataCenter",
"Microsoft Windows Server 2008 R2 Standard", "Standard",
"Microsoft Windows Server 2008 R2 Enterprise", "Enterprise",
"Microsoft® Windows Server® 2008 Standard", "Standard",
"Microsoft® Windows Server® 2008 Enterprise","Enterprise",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Standard",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Enterprise",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

And one for OS Version using this code

OS Version = SWITCH([OperatingSystem], "Microsoft Windows Server 2012 Datacenter", "Server 2012",
"Microsoft Windows Server 2012 Standard","Server 2012",
"Microsoft Windows Server 2012 R2 Datacenter", "Server 2012 R2",
"Microsoft Windows Server 2008 R2 Standard", "Server 2008 R2",
"Microsoft Windows Server 2008 R2", "Server 2008 R2",
"Microsoft Windows Server 2008 R2 Enterprise", "Server 2008 R2",
"Microsoft® Windows Server® 2008 Standard", "Server 2008",
"Microsoft® Windows Server® 2008 Enterprise","Server 2008",
"Microsoft(R) Windows(R) Server 2003, Standard Edition", "Server 2003",
"Microsoft(R) Windows(R) Server 2003, Enterprise Edition", "Server 2003",
"Microsoft Windows 2000 Server", "Server 2000",
"Unknown")

I also created a new measure to count the distinct number of servers and instances as follows

Servers = DISTINCTCOUNT(Query1[ServerName])
Instances = COUNT(Query1[InstanceName])

Then in the report area I start by creating a new text box and adding a title to the report, and I set the page level filter to Inactive is false so that all decommissioned servers are not included

3

I then create a donut chart for the number of servers by Operating System by clicking the donut chart in the visualisations and then dragging the OS version to the Details and the Servers Name to the Values

4

I then click the format button and add a proper title and the background colour

5

Then I create the server numbers by location in the same way, by clicking donut chart, adding location and the count of server names, and adding the formatting in the same way as the previous donut

6

I created a number of cards to hold single values for Domain, Instance, Server, RAM, Processors and the number of Not Contactable servers to provide a quick, easy view of those figures, especially when you filter the report by clicking on a value within a donut chart. I find that managers really like this feature. They are all created in the same way, by clicking the card in the visualisations and choosing the value

7

I also add a table for the number of servers by operating system and the number of servers by location by dragging those values to a table visualisation. I find that slicers are a very useful way of enabling information to be displayed as required; use the live visualisation to do this. I add the environment column as a slicer so that I can easily see values for the live environment or the development environment

I create a separate page in the report to display all of the server data as this can be useful for other teams such as the systems (server admin) team. I give them a lot of different slicers: Domain, Location, Environment, OS Version, Edition and NotContactable, with a table holding all of the relevant values to enable them to quickly see details

8

You can get all of the scripts here


In the next post I will show how to gather and report on SQL Server information


Populating My DBA Database for Power Bi with PowerShell – SQL Info


Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

1

2

There are so many items of data gathered that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The SQLInfo table is created using this code

CREATE TABLE [Info].[SQLInfo](
	[SQLInfoID] [int] IDENTITY(1,1) NOT NULL,
	[DateChecked] [datetime] NULL,
	[DateAdded] [datetime] NULL,
	[ServerName] [nvarchar](50) NULL,
	[InstanceName] [nvarchar](50) NULL,
	[SQLVersionString] [nvarchar](100) NULL,
	[SQLVersion] [nvarchar](100) NULL,
	[ServicePack] [nvarchar](3) NULL,
	[Edition] [nvarchar](50) NULL,
	[ServerType] [nvarchar](30) NULL,
	[Collation] [nvarchar](30) NULL,
	[IsHADREnabled] [bit] NULL,
	[SQLServiceAccount] [nvarchar](35) NULL,
	[SQLService] [nvarchar](30) NULL,
	[SQLServiceStartMode] [nvarchar](30) NULL,
	[BAckupDirectory] [nvarchar](256) NULL,
	[BrowserAccount] [nvarchar](50) NULL,
	[BrowserStartMode] [nvarchar](25) NULL,
	[IsSQLClustered] [bit] NULL,
	[ClusterName] [nvarchar](25) NULL,
	[ClusterQuorumstate] [nvarchar](20) NULL,
	[ClusterQuorumType] [nvarchar](30) NULL,
	[C2AuditMode] [nvarchar](30) NULL,
	[CostThresholdForParallelism] [tinyint] NULL,
	[MaxDegreeOfParallelism] [tinyint] NULL,
	[DBMailEnabled] [bit] NULL,
	[DefaultBackupCComp] [bit] NULL,
	[FillFactor] [tinyint] NULL,
	[MaxMem] [int] NULL,
	[MinMem] [int] NULL,
	[RemoteDacEnabled] [bit] NULL,
	[XPCmdShellEnabled] [bit] NULL,
	[CommonCriteriaComplianceEnabled] [bit] NULL,
	[DefaultFile] [nvarchar](100) NULL,
	[DefaultLog] [nvarchar](100) NULL,
	[HADREndpointPort] [int] NULL,
	[ErrorLogPath] [nvarchar](100) NULL,
	[InstallDataDirectory] [nvarchar](100) NULL,
	[InstallSharedDirectory] [nvarchar](100) NULL,
	[IsCaseSensitive] [bit] NULL,
	[IsFullTextInstalled] [bit] NULL,
	[LinkedServer] [nvarchar](max) NULL,
	[LoginMode] [nvarchar](20) NULL,
	[MasterDBLogPath] [nvarchar](100) NULL,
	[MasterDBPath] [nvarchar](100) NULL,
	[NamedPipesEnabled] [bit] NULL,
	[OptimizeAdhocWorkloads] [bit] NULL,
	[InstanceID] [int] NULL,
	[AGListener] [nvarchar](150) NULL,
	[AGs] [nvarchar](150) NULL,
 CONSTRAINT [PK__SQL__50A5926BC7005F29] PRIMARY KEY CLUSTERED 
(
	[SQLInfoID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

ALTER TABLE [Info].[SQLInfo]  WITH CHECK ADD  CONSTRAINT [FK_SQLInfo_InstanceList] FOREIGN KEY([InstanceID])
REFERENCES [dbo].[InstanceList] ([InstanceID])
GO

ALTER TABLE [Info].[SQLInfo] CHECK CONSTRAINT [FK_SQLInfo_InstanceList]
GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here or you can read on to see how it works and how to create the report and publish it to powerbi.com

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible.

function Catch-Block
{
param ([string]$Additional)
$ErrorMessage = " On $Connection " + $Additional + $_.Exception.Message + $_.Exception.InnerException.InnerException.message
$Message = " This message came from the Automated Powershell script updating the DBA Database with Server Information"
$Msg = $Additional + $ErrorMessage + " " + $Message
Write-Log -Path $LogFile -Message $ErrorMessage -Level Error
Write-EventLog -LogName SQLAutoScript -Source "SQLAUTOSCRIPT" -EventId 1 -EntryType Error -Message $Msg
}

I give the function an additional parameter ($Additional) which holds each custom error message. I write this to both the event log and the text log file to enable easy troubleshooting, and I include the message from the error by accessing it with $_. I won't include the try catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, as the code above creates an empty SMO Server object with the Name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property; you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one. In this case we use a continue to carry on iterating the loop

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

If you wish to view all of the different properties that you can gather information on in this way you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts)

$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv | Get-Member

As you can see from the screenshot below, on my SQL2014 server there are 184 properties. I haven't chosen to gather all of them, only the ones that are of interest to me, our team or others who request information from our team, such as auditors and project managers.

3

You can choose to use any or all of these properties as long as you ensure you have the columns in your table with the correct data type and that you have the correct knowledge and logic to stop the script from erroring if/when the property is not available. Here is an example

if ($srv.IsHadrEnabled -eq $True)
 {$IsHADREnabled = $True
 $AGs = $srv.AvailabilityGroups|Select Name -ExpandProperty Name|Out-String
 $Expression = @{Name = 'ListenerPort' ; Expression = {$_.Name + ',' + $_.PortNumber }}
 $AGListener =  $srv.AvailabilityGroups.AvailabilityGroupListeners|select $Expression|select ListenerPort -ExpandProperty ListenerPort
 }
 else
 {
 $IsHADREnabled = $false
 $AGs = 'None'
 $AGListener = 'None'
 }
 $BackupDirectory = $srv.BackupDirectory

I check if the property IsHadrEnabled is true, and if it is I gather the information about the Availability Group names and the listener port; if it is not I set the values to None.

You will find that not all of the properties that you want are at the root of the Server SMO object. If you want your max and min memory values and you want to know if remote admin connections or xp_cmdshell are enabled, you will need to look at the $srv.Configuration object

 $MaxMem = $srv.Configuration.MaxServerMemory.ConfigValue
 $MinMem = $srv.Configuration.MinServerMemory.ConfigValue
 $RemoteDacEnabled = $srv.Configuration.RemoteDacConnectionsEnabled.ConfigValue
 $XPCmdShellEnabled = $srv.Configuration.XPCmdShellEnabled.ConfigValue

You can look for the property that you want by using the Get-Member cmdlet as shown above, or use MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post by checking the SQL Info table for an entry for that instance and updating the table if it exists and inserting if it does not.

There are other uses for gathering this information than just reporting on it. You can target different versions of SQL for different scripts. You can identify values that are outside what is expected and change them. If xp_cmdshell should not be enabled, write the T-SQL to gather the connection strings of all of the servers from the DBADatabase where the SQLInfo table has XPCmdShellEnabled = 1, loop through them exactly as above, change the value of $srv.Configuration.XPCmdShellEnabled.ConfigValue to 0 and then call $srv.Alter(), as sketched below
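
A minimal sketch of that idea, reusing the connection pattern from earlier in this post (treat it as an illustration and test it first; see the warning below):

$Query = @"
SELECT IL.[ServerName], IL.[InstanceName], IL.[Port]
FROM [dbo].[InstanceList] IL
JOIN [Info].[SQLInfo] SI ON SI.InstanceID = IL.InstanceID
WHERE SI.XPCmdShellEnabled = 1
"@
$ToFix = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query
foreach($Instance in $ToFix)
{
$Connection = $Instance.ServerName + '\' + $Instance.InstanceName + ',' + $Instance.Port
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
if(!($srv.Version)){continue} # the same connection check as above
$srv.Configuration.XPCmdShellEnabled.ConfigValue = 0
$srv.Alter() # write the change back to the instance
}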

It is a very powerful way of dynamically targeting your estate if you are looking after many instances, and with great power comes great responsibility.

ALWAYS TEST THESE AND ANY SCRIPTS YOU FIND OR SCRIPTS YOU WRITE BEFORE YOU RUN THEM IN YOUR PRODUCTION ENVIRONMENT

Yeah, I shouted, and some people thought it was rude. But it's important; it needs to be repeated and drilled in so that it becomes habitual. You can do great damage to your estate with only a few lines of PowerShell and a DBA Database, so please be very careful and ensure that you have a suitable test subset of servers that you can use for testing

The other thing we can do is report on the data, and with Power Bi we can create self service reports and dashboards and also make use of the natural language query at powerbi.com. So when your systems team ask "What are all the servers in X data center?" you can enable them to answer it themselves, or when the compliance officer asks how many SQL 2005 instances we have and which clients they serve, you can give them a dashboard they can query themselves.

This is how I create the two reports you see at the top. I start by connecting to the data source, my DBA Database

4

And I use this query

SELECT 
	IL.ServerName
	,IL.InstanceName
	  ,IL.Location
	  ,IL.Environment
	  ,IL.Inactive
	  ,IL.NotContactable
	  ,SI.[SQLInfoID]
      ,SI.[DateChecked]
      ,SI.[DateAdded]
      ,SI.[ServerName]
      ,SI.[InstanceName]
      ,SI.[SQLVersionString]
      ,SI.[SQLVersion]
      ,SI.[ServicePack]
      ,SI.[Edition]
      ,SI.[ServerType]
      ,SI.[Collation]
      ,SI.[IsHADREnabled]
      ,SI.[SQLServiceAccount]
      ,SI.[SQLService]
      ,SI.[SQLServiceStartMode]
      ,SI.[BAckupDirectory]
      ,SI.[BrowserAccount]
      ,SI.[BrowserStartMode]
      ,SI.[IsSQLClustered]
      ,SI.[ClusterName]
      ,SI.[ClusterQuorumstate]
      ,SI.[ClusterQuorumType]
      ,SI.[C2AuditMode]
      ,SI.[CostThresholdForParallelism]
      ,SI.[MaxDegreeOfParallelism]
      ,SI.[DBMailEnabled]
      ,SI.[DefaultBackupCComp]
      ,SI.[FillFactor]
      ,SI.[MaxMem]
      ,SI.[MinMem]
      ,SI.[RemoteDacEnabled]
      ,SI.[XPCmdShellEnabled]
      ,SI.[CommonCriteriaComplianceEnabled]
      ,SI.[DefaultFile]
      ,SI.[DefaultLog]
      ,SI.[HADREndpointPort]
      ,SI.[ErrorLogPath]
      ,SI.[InstallDataDirectory]
      ,SI.[InstallSharedDirectory]
      ,SI.[IsCaseSensitive]
      ,SI.[IsFullTextInstalled]
      ,SI.[LinkedServer]
      ,SI.[LoginMode]
      ,SI.[MasterDBLogPath]
      ,SI.[MasterDBPath]
      ,SI.[NamedPipesEnabled]
      ,SI.[OptimizeAdhocWorkloads]
      ,SI.[InstanceID]
      ,SI.[AGListener]
      ,SI.[AGs]
        FROM [DBADatabase].[Info].[SQLInfo] as SI
  JOIN [DBADatabase].[dbo].[InstanceList] as IL
  ON IL.InstanceID =  SI.InstanceID

So that I can easily add any and all the data to the reports if I choose or query using them in powerbi.com

First I created 3 measures.

AG = DISTINCTCOUNT(Query1[AGs])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I click on map

5

And drag the location column to location and the Instances measure to both the Values and Color Saturation

6

I then click on edit and format the title and change the colours for the data

7

Next I created a heat map for Instances by Edition. The picture shows the details

8

And a column chart for Instances by Version

9

I also add a table showing the number of instances in each location and a slicer for environment.

Even though you have added one slicer, you are able to slice the data by clicking on the charts. If I click on Developer Edition I can quickly see which versions and locations they are in

10

This works for the map and the column chart as well. This has all been created using live data as a base with all identifying information altered; Bolton is where I was born, the other locations are chosen at random, and all other figures and rollups have also been altered.

11

To create the other report I create two donut charts, for Instances by version and by location, using steps similar to my previous post, and then add some tables for location, edition and xp_cmdshell enabled, as well as some cards showing total numbers of Servers, Instances and Availability Groups and a slicer for environment, to create a report like this. You can use the donut charts to slice the data as well

12

But there are so many different points of information gathered by this script that you get extra value using the natural language query on powerbi.com.

Click Publish and enter your powerbi.com credentials and then log into powerbi.com in a browser and you will see your report and your dataset. (Note, you can easily filter to find your dashboards, reports and data sets)

13

Click the plus sign to create a new dashboard and click the pin on any of the objects in your report to pin them to the dashboard

14

Then you can view (and share) your dashboard

15

Once you have done this you can query your data using natural language. It will cope with spelling mistakes and expects the column names so you may want to think about renaming them in your report by right clicking on them after you get your data.

You can ask it questions, build up information on the fly and alter it as you need it. As a DBA, doing this and imagining enabling others to ask these questions whenever they want from a browser, as many times as they like, was very cool!

16

17

18

19

20

Pretty cool, I think you and any of your ‘requestors’ would agree

You can get all of the scripts here


Populating My DBA Database for Power Bi with PowerShell – Databases


Following my post about using Power Bi with my DBA Database I have been asked if I would share the PowerShell scripts which I use to populate my database.

In this post I will show how to create the following report

db1

db2

There are so many items of data gathered that I expect you will want to create different reports for your own requirements. You will also want to put the report onto PowerBi.com and explore the natural language querying as I show at the end of this post

You will find the latest version of my DBADatabase creation scripts and PowerShell scripts here.

The Databases table is created using this code

CREATE TABLE [Info].[Databases](
	[DatabaseID] [int] IDENTITY(1,1) NOT NULL,
	[InstanceID] [int] NOT NULL,
	[Name] [nvarchar](256) NULL,
	[DateAdded] [datetime2](7) NULL,
	[DateChecked] [datetime2](7) NULL,
	[AutoClose] [bit] NULL,
	[AutoCreateStatisticsEnabled] [bit] NULL,
	[AutoShrink] [bit] NULL,
	[AutoUpdateStatisticsEnabled] [bit] NULL,
	[AvailabilityDatabaseSynchronizationState] [nvarchar](16) NULL,
	[AvailabilityGroupName] [nvarchar](128) NULL,
	[CaseSensitive] [bit] NULL,
	[Collation] [nvarchar](30) NULL,
	[CompatibilityLevel] [nvarchar](15) NULL,
	[CreateDate] [datetime2](7) NULL,
	[DataSpaceUsageKB] [float] NULL,
	[EncryptionEnabled] [bit] NULL,
	[IndexSpaceUsageKB] [float] NULL,
	[IsAccessible] [bit] NULL,
	[IsFullTextEnabled] [bit] NULL,
	[IsMirroringEnabled] [bit] NULL,
	[IsParameterizationForced] [bit] NULL,
	[IsReadCommittedSnapshotOn] [bit] NULL,
	[IsSystemObject] [bit] NULL,
	[IsUpdateable] [bit] NULL,
	[LastBackupDate] [datetime2](7) NULL,
	[LastDifferentialBackupDate] [datetime2](7) NULL,
	[LastLogBackupDate] [datetime2](7) NULL,
	[Owner] [nvarchar](30) NULL,
	[PageVerify] [nvarchar](17) NULL,
	[ReadOnly] [bit] NULL,
	[RecoveryModel] [nvarchar](10) NULL,
	[ReplicationOptions] [nvarchar](40) NULL,
	[SizeMB] [float] NULL,
	[SnapshotIsolationState] [nvarchar](10) NULL,
	[SpaceAvailableKB] [float] NULL,
	[Status] [nvarchar](35) NULL,
	[TargetRecoveryTime] [int] NULL,
 CONSTRAINT [PK_Databases] PRIMARY KEY CLUSTERED 
(
	[DatabaseID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

The PowerShell script uses Jason Wasser's (@wasserja) Write-Log function to write to a text file, but I also enable some logging into a new event log by following the steps here http://blogs.technet.com/b/heyscriptingguy/archive/2013/02/01/use-powershell-to-create-and-to-use-a-new-event-log.aspx to create a log named SQLAutoScript with a source SQLAUTOSCRIPT

To run the script I simply need to add the values for

$CentralDBAServer = '' ## Add the address of the instance that holds the DBADatabase
$CentralDatabaseName = 'DBADatabase' 
$LogFile = "\DBADatabaseServerUpdate_" + $Date + ".log" ## Set Path to Log File

And the script will do the rest. Call the script from a PowerShell Job Step and schedule it to run at the frequency you wish; I gather the information every week. You can get the script from here or you can read on to see how it works and how to create the report, publish it to powerbi.com and query it with natural language

I create a function called Catch-Block to save keystrokes and put my commands inside a try catch to make the scripts as robust as possible. I won’t include the try catch in the examples below. I gather all of the server names from the InstanceList table and set the results to an array variable called $ServerNames holding the server name, instance name and port

 $Query = @"
 SELECT [ServerName]
      ,[InstanceName]
      ,[Port]
  FROM [DBADatabase].[dbo].[InstanceList]
  Where Inactive = 0 
    AND NotContactable = 0
"@
try{
$AlltheServers= Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $query
$ServerNames = $AlltheServers| Select ServerName,InstanceName,Port
}

I then loop through the array and create a $Connection variable for my SMO connection string and connect to the server

foreach ($ServerName in $ServerNames)
{
## $ServerName
 $InstanceName =  $ServerName|Select InstanceName -ExpandProperty InstanceName
 $Port = $ServerName| Select Port -ExpandProperty Port
$ServerName = $ServerName|Select ServerName -ExpandProperty ServerName 
 $Connection = $ServerName + '\' + $InstanceName + ',' + $Port

 try
 {
 $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection

Even though I place the creation of the SMO server object in a try block, you still need an additional check to ensure that you can connect and populate the object, as the code above creates an empty SMO Server object with the Name property set to the $Connection variable if you can't connect to that server, and doesn't error as you may expect.
The way I have always validated an SMO Server object is to check the Version property. There is no justifiable reason for choosing that property; you could choose any one, but that's the one I have always used. I use an if statement to do this (this post about Snippets will show you the best way to learn PowerShell code). The reference I use for exiting a loop in the way that you want is this one. In this case we use a continue to carry on iterating the loop

 if (!( $srv.version)){
 Catch-Block " Failed to Connect to $Connection"
 continue
 }

I then loop through the user databases

foreach($db in $srv.databases|Where-Object {$_.IsSystemObject -eq $false })
{
$Name = $db.Name
$Parent = $db.Parent.Name

To gather information on all databases just remove everything after the pipe symbol. If you wish to exclude certain databases from the collection, maybe the database you keep your Change log table and DBA Team info in, you can do that here as well

foreach($db in $srv.databases|Where-Object {$_.Name -ne 'EXCLUDENAME' })
{
$Name = $db.Name
$Parent = $db.Parent.Name

If you wish to view all of the different properties that you can gather information on in this way you can use this code to take a look. (This is something you should get used to doing when writing new PowerShell scripts)

$Connection = 'SERVERNAMEHERE'
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $Connection
 $srv.databases | Get-Member

An alternative method of doing this is to set a variable to a $db and then select all of the properties so that you can see the values and identify the ones you want. Again, this is a good thing to do when exploring new objects

$db = $srv.databases['DBNAMEHERE'] 
$db| Select *

You can see from the screenshot below that there are 170 properties available to you on a SQL2014 instance. You can gather any or all of that information as long as you ensure that you have the columns with the correct data types in your table and that your script has the logic to deal with properties that do not exist, although I have had less issue with this for the database object than the server object

db3
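
As an example of that defensive logic, here is a short hedged sketch following the same pattern as the Availability Group check in the previous post (the default value is my choice for illustration):

# AvailabilityGroupName only exists on SQL Server 2012 and above, so default it first
$AvailabilityGroupName = 'None'
if($db.AvailabilityGroupName)
{
$AvailabilityGroupName = $db.AvailabilityGroupName
}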

You can look for the property that you want by using the Get-Member cmdlet as shown above, or use MSDN to find it starting from here, or by GoogleBingDuckDuckGo-ing "PowerShell SMO" and the property you wish to find.

The rest of the script follows exactly the same pattern as the previous post by checking the SQL Info table for an entry for that instance and updating the table if it exists and inserting if it does not.

This is how I created the reports shown above.

Connect to the DBA Database and run these queries to gather the data for the report.

SELECT 
IL.ServerName
,IL.InstanceName
,IL.Location
,IL.Environment
,IL.Inactive
,IL.NotContactable
,D.[DatabaseID]
,D.[InstanceID]
,D.[Name]
,D.[DateAdded]
,D.[DateChecked]
,D.[AutoClose]
,D.[AutoCreateStatisticsEnabled]
,D.[AutoShrink]
,D.[AutoUpdateStatisticsEnabled]
,D.[AvailabilityDatabaseSynchronizationState]
,D.[AvailabilityGroupName]
,D.[CaseSensitive]
,D.[Collation]
,D.[CompatibilityLevel]
,D.[CreateDate]
,D.[DataSpaceUsageKB]
,D.[EncryptionEnabled]
,D.[IndexSpaceUsageKB]
,D.[IsAccessible]
,D.[IsFullTextEnabled]
,D.[IsMirroringEnabled]
,D.[IsParameterizationForced]
,D.[IsReadCommittedSnapshotOn]
,D.[IsUpdateable]
,D.[LastBackupDate]
,D.[LastDifferentialBackupDate]
,D.[LastLogBackupDate]
,D.[Owner]
,D.[PageVerify]
,D.[ReadOnly]
,D.[RecoveryModel]
,D.[ReplicationOptions]
,D.[SizeMB]
,D.[SnapshotIsolationState]
,D.[SpaceAvailableKB]
,D.[Status]
,D.[TargetRecoveryTime]
FROM [DBADatabase].[Info].[Databases] as D
JOIN [DBADatabase].[dbo].[InstanceList] as IL
ON IL.InstanceID =D.InstanceID

To get all the database and instance information and

SELECT C.ClientName
 ,[DatabaseID]
 ,[InstanceID]
 ,[Notes]
  FROM [DBADatabase].[dbo].[ClientDatabaseLookup] as CDL
  JOIN [DBADatabase].[dbo].[Clients] as C
  ON CDL.clientid = c.clientid

To get the client information. The client information needs to be manually added to the table as this (in general) needs a human bean to understand. When the script runs every night it will pick up new databases and I add a default value of “Not Entered” to the table which makes it easier to identify the databases that need this additional work. (This also means that as a Team Leader I can monitor that my team are doing this) It can also be added to any scripts which create new databases for deployment.
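
A hedged sketch of how those default rows could be added each night; the column names are taken from the queries above, but check them against your own schema:

$Query = @"
INSERT INTO [dbo].[ClientDatabaseLookup] (ClientID, DatabaseID, InstanceID, Notes)
SELECT (SELECT ClientID FROM [dbo].[Clients] WHERE ClientName = 'Not Entered')
      ,D.DatabaseID, D.InstanceID, 'Not Entered'
FROM [Info].[Databases] D
WHERE NOT EXISTS (SELECT 1 FROM [dbo].[ClientDatabaseLookup] CDL
                  WHERE CDL.DatabaseID = D.DatabaseID)
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query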

Then we need to create some measures and calculated columns for our report. I did this as I realised that I needed it when making the report rather than all up front.

I created two calculated columns for the size of the databases, one for Gb and one for Tb, by clicking on the data icon on the left and then New Column

SizeGb = Query1[SizeMB]/1024
SizeTb = Query1[SizeGb]/1024

Some measures for count of Databases, Instances and Servers

Databases = COUNT(Query1[DatabaseID])
Instances = DISTINCTCOUNT(Query1[InstanceID])
Servers = DISTINCTCOUNT(Query1[ServerName])

I also wanted to be able to differentiate between ‘External’ and ‘Internal’ customers. So I created a calculated column for this value using a switch statement.

External = SWITCH(Clients[ClientName],"Not Entered", 0 , "Dev Team",0,"Mi Team",0,"DBA Team",0,"Finance Department",0,"HR",0,"Operations",0,"Payroll",0,"Test Team",0,"Systems Team",0,"Unknown",0,1)

I create a donut chart to show the size of the database in Gb by client (and no, my real clients are not rock bands :-) ) as shown below. I formatted the title, legend and background by clicking on the paintbrush in the visualisation pane. I would encourage you to investigate the options here.

db4
The other donut chart is the number of clients per location (and those are SQL User Group locations in the UK and my hometown Bolton)

db5

The rest of the visualisations on that report are cards and tables which I am sure that you can work out.

I created a map to show the location of the databases

db6

And after reading this post http://sqldusty.com/2015/08/03/power-bi-tip-use-the-treemap-chart-as-a-colorful-slicer/ by Dustin Ryan I created a colourful slicer for environment and the client and then added some other information. The important thing here is to pick the information that the person looking at the report needs to see. So if it is recovery model, compatibility level, collation, page verify setting, mirroring, replication, size and number of databases then this report is correct but I doubt that’s what you want :-)

You can slice this report by location, client or environment. For example, I can easily see which clients have data in Exeter and the size and number of databases

db7

Or if Metallica ring me up I can quickly see that they have 4 databases, just under 69Gb of data in Exeter and it isn’t mirrored. You will notice that it is not easy to see the recovery model or the compatibility level. If you hover over the results you get a highlight figure which shows the data is filtered but it is not shown visually very well as there are over a thousand databases using full recovery model.

db8

If we are asked about the Integration environment we can see that it is hosted in Bolton, Manchester, Southampton and Exeter and comprises 394 databases and 739 Gb of data. It is also easier to see the compatibility level and recovery model as the ratios are larger

db9

Once we have created the report in the way that we want, we can then publish it to powerbi.com and share it with others if we wish. Publishing is as easy as pressing the publish button and entering your powerbi credentials, but if you want your data to automatically refresh (and this is the point of the exercise, to remove manual work) then you will need to install and configure the PowerBi gateway and schedule a refresh. I will post about this later.

Once the report is published you can access it in the browser and create a dashboard by clicking the pin in the top right of a visualisation and a pop up will ask you which dashboard you wish to pin it to (Another recent update to Power Bi)

db10

Once you have a dashboard you can then perform some natural language question and answer on it. This can be quite interesting and not always quite what you (or your report readers) might expect but it is getting better all the time

db11

You have to remember to use the names of the columns correctly

db12

But once you have the query correct you can alter it by adding “as a VISUALISATION” and choose the visualisation

db13

db14

And once you have the visualisation you can pin it to the dashboard

I think you can see how useful it can be

db15

This doesn’t work quite as you expect

db16

But this does

db17

How about this (and yes it felt wrong to type!)

db18

And the auditors would love to be able to do this. (This is an old copy of the database in case The Eagles people are reading this – your database is backed up every 15 minutes)

db19

Or this for a DBA ( Yes, my obfuscation script database naming convention is a bit bland)

db20

Or the DBA team manager might choose this one

db21

The advantage that I cannot show via static pictures is that the data, visualisation and the suggestions alter in real time as you type

I hope that you have found this useful and that you can see the benefits and advantages of using a DBA Database and empowering people to use self-service to answer their own questions leaving the DBA time to do more important things like drinking coffee :-)

As always if you have any questions or comments please feel free to post them on the blog.


Power Bi, PowerShell and SQL Agent Jobs


Continuing my series on using Power Bi with my DBA Database I am going to show in this post how I create the most useful daily report for DBAs – The SQL Agent Job report. You can get the scripts and reports here

AG1

This gives a quick overview of the status of the Agent Jobs across the estate and also quickly identifies recent failed jobs enabling the DBA to understand their focus and prioritise their morning efforts.

I gather the information into 2 tables AgentJobDetail

CREATE TABLE [Info].[AgentJobDetail](
[AgentJobDetailID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[Category] [nvarchar](50) NOT NULL,
[JobName] [nvarchar](250) NOT NULL,
[Description] [nvarchar](750) NOT NULL,
[IsEnabled] [bit] NOT NULL,
[Status] [nvarchar](50) NOT NULL,
[LastRunTime] [datetime] NOT NULL,
[Outcome] [nvarchar](50) NOT NULL,
CONSTRAINT [PK_info.AgentJobDetail] PRIMARY KEY CLUSTERED
(
[AgentJobDetailID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

and AgentJobServer

CREATE TABLE [Info].[AgentJobServer](
[AgentJobServerID] [int] IDENTITY(1,1) NOT NULL,
[Date] [datetime] NOT NULL,
[InstanceID] [int] NOT NULL,
[NumberOfJobs] [int] NOT NULL,
[SuccessfulJobs] [int] NOT NULL,
[FailedJobs] [int] NOT NULL,
[DisabledJobs] [int] NOT NULL,
[UnknownJobs] [int] NOT NULL,
CONSTRAINT [PK_Info.AgentJobServer] PRIMARY KEY CLUSTERED
(
[AgentJobServerID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The Detail table holds the results of every Agent Job and the Server table holds a roll up for each server. The script to gather this information is based on the script I used to put the information into an Excel Sheet as described in my post How I Check Hundreds of Agent Jobs in 60 Seconds with PowerShell which I also altered to send an HTML email to the DBA team each morning. This however is a much better solution and allows for better monitoring and trending.

As I have explained in my previous posts I use an Instance List table to hold the information about each instance in the estate and a series of PowerShell scripts which run via Agent Jobs to gather the information into various tables. Those posts describe the use of the Write-Log function and the methodology of gathering the required information and looping through each instance, so I won't repeat that here. There is an extra check I do, however, for Express Edition, as this does not include the Agent service

$edition = $srv.Edition
 # Edition returns a string like 'Express Edition (64-bit)' so match the prefix
 if($Edition -like 'Express*')
 {
 Write-Log -Path $LogFile -Message "No Information gathered as this Connection $Connection is Express"
 continue
 }

The Agent Job information can be found in SMO by exploring the $srv.JobServer.Jobs object and I gather the information by iterating through each job and setting the values we require to variables

try{
$JobCount = $srv.JobServer.Jobs.Count
$successCount = 0
$failedCount = 0
$UnknownCount = 0
$JobsDisabled = 0
#For each job on the server
foreach($job in $srv.JobServer.Jobs)
{
$jobName = $job.Name;
$jobEnabled = $job.IsEnabled;
$jobLastRunOutcome = $job.LastRunOutcome;
$Category = $Job.Category;
$RunStatus = $Job.CurrentRunStatus;
$Time = $job.LastRunDate;
if($Time -eq '01/01/0001 00:00:00')
{$Time = ''}
$Description = $Job.Description;
#Counts for jobs Outcome
if($jobEnabled -eq $False)
{$JobsDisabled += 1}
elseif($jobLastRunOutcome -eq "Failed")
{$failedCount += 1;}
elseif($jobLastRunOutcome -eq "Succeeded")
{$successCount += 1;}
elseif($jobLastRunOutcome -eq "Unknown")
{$UnknownCount += 1;}

I found that some jobs had names and descriptions with ' in them, which would cause the SQL update or insert statement to fail, so I use the replace method to replace the ' with ''

if($Description -eq $null){$Description = ' '}
# Double up any single quotes so they do not break the T-SQL string literals
$Description = $Description.replace('''','''''')
if($jobName -eq $Null){$jobName = 'None'}
$JobName = $JobName.replace('''','''''')

I then insert the data per job after checking that it does not already exist, which allows me to re-run the job, should a number of servers be uncontactable at the time the job runs, without any additional work

IF NOT EXISTS (
SELECT  [AgentJobDetailID]
FROM [DBADatabase].[Info].[AgentJobDetail]
where jobname = '$jobName'
and InstanceID = (SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
and lastruntime = '$Time'
)
INSERT INTO [Info].[AgentJobDetail]
([Date]
,[InstanceID]
,[Category]
,[JobName]
,[Description]
,[IsEnabled]
,[Status]
,[LastRunTime]
,[Outcome])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$Category'
,'$jobName'
,'$Description'
,'$jobEnabled'
,'$RunStatus'
,'$Time'
,'$jobLastRunOutcome')

I put this in a here-string variable and pass it to Invoke-Sqlcmd. I do the same with the roll up, using this query

INSERT INTO [Info].[AgentJobServer]
([Date]
,[InstanceID]
,[NumberOfJobs]
,[SuccessfulJobs]
,[FailedJobs]
,[DisabledJobs]
,[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID]
FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName'
AND [InstanceName] = '$InstanceName'
AND [Port] = '$Port')
,'$JobCount'
,'$successCount'
,'$failedCount'
,'$JobsDisabled'
,'$UnknownCount')
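
Both statements follow the same pattern: the T-SQL goes inside a here-string so that the PowerShell variables expand, and the result is passed to Invoke-Sqlcmd. Roughly like this for the roll up (a condensed sketch of the query above):

$Query = @"
INSERT INTO [Info].[AgentJobServer]
([Date],[InstanceID],[NumberOfJobs],[SuccessfulJobs],[FailedJobs],[DisabledJobs],[UnknownJobs])
VALUES
(GetDate()
,(SELECT [InstanceID] FROM [DBADatabase].[dbo].[InstanceList]
WHERE [ServerName] = '$ServerName' AND [InstanceName] = '$InstanceName' AND [Port] = '$Port')
,'$JobCount','$successCount','$failedCount','$JobsDisabled','$UnknownCount')
"@
Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query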

This job runs as a SQL Agent Job every morning, half an hour or so before the DBA arrives for the morning shift, vastly improving the ability of the DBA to prioritise their morning routine.

To create the report open Power Bi Desktop and click Get Data

ag2

Then choose SQL Server and click connect

ag3

Enter the connection string, the database and the query to gather the data

ag5

The query is

Select IL.InstanceID,
IL.ServerName,
IL.InstanceName,
IL.Environment,
IL.Location,
AJD.Category,
AJD.Date,
AJD.Description,
AJD.IsEnabled,
AJD.JobName,
AJD.LastRunTime,
AJD.Outcome,
AJD.Status
FROM [dbo].[InstanceList] IL
JOIN [Info].[AgentJobDetail] AJD
ON IL.InstanceID = AJD.InstanceID
WHERE LastRunTime > DATEADD(Day,-31,GETDATE())

Once we have gathered the data we then create some extra columns and measures for the reports. First I create a date column from the datetime Date Column

DayDate = DATE(YEAR('Agent Job Detail'[Date]),MONTH('Agent Job Detail'[Date]),DAY('Agent Job Detail'[Date]))

I also do the same for the LastRunTime. I create a day of the week column so that I can report on job outcomes by day

DayOfWeek = CONCATENATE(WEEKDAY('Agent Job Detail'[Date],2),FORMAT('Agent Job Detail'[Date]," -dddd"))

My friend Terry McCann b | t helped me create a column that returns true if the last run time is within 24 hours of the current time, to help identify the recent jobs that have failed. NOTE: On a Monday morning you will need to change this if you do not check your jobs on the weekend.

Last Run Relative Hour = ((1.0*(NOW()-'Agent Job Detail'[LastRunTime]))*24)<24

I create a measure each for Succeeded, Failed and Unknown; the Succeeded measure is shown below and the other two follow the same pattern

Succeeded = IF('Agent Job Detail'[Outcome] = "Succeeded"
, 1
, 0)

Next we have to create some measures for the sum of failed jobs and the averages. This is the code for the 7 day sum

Failed7Days = CALCULATE(SUM('Agent Job Detail'[Failed]),FILTER (
ALL ( 'Agent Job Detail'[Last Run Date] ),
'Agent Job Detail'[Last Run Date] > ( MAX ( 'Agent Job Detail'[Last Run Date]  ) - 7 )
&& 'Agent Job Detail'[Last Run Date]  <= MAX ( 'Agent Job Detail'[Last Run Date]  )     ) )

and for the 7 Day average

Failed7DayAverage = DIVIDE([Failed7Days],7)

I did the same for 30 days. I used the TechNet reference for DAX expressions and got ideas from Chris Webb's blog

ag6
First I created the 30 day historical trend chart using a Line and Clustered Column chart, with the last run date as the axis, the Succeeded measure as the column, and the Failed, Failed 7 Day Average and Failed 30 Day Average measures as the lines

I then formatted the lines, the title and the column

ag7

To create the gauge which shows how well we have done today I created a measure to quickly identify today's jobs


LastRun Relative Date Offset = INT('Agent Job Detail'[LastRunTime] - TODAY())

which I use as a filter for the gauge, as shown below. I also create two measures, zero and twenty, for the minimum and maximum of the gauge

ag8

The rest of the report is measures for the 7 day average and 30 day average, a slicer for environment and two tables: one to show the historical job counts and one to show the jobs that have failed in the last 24 hours, using the Last Run Relative Hour measure from above

ag9

There are many other reports that you can or may want to create maybe by day of the week or by category depending on your needs. Once you have the data gathered you are free to play with the data as you see fit. Please add any further examples of reports you can run or would like to run in the comments below.

Once you have your report written you can publish it to PowerBi.com and create a dashboard and query it with natural language. I have explained the process in previous posts

For example – How many Jobs failed today

ag110

Which server had the most failed jobs

ag11

or, using the category field, which database maintenance jobs failed today

ag13

I hope these posts have given you ideas about how you can use PowerShell, a DBA Database and Power Bi to help you to manage and report on your environment.

You can get the scripts and reports here

I have written further posts about this

Using Power Bi with my DBA Database

Populating My DBA Database for Power Bi with PowerShell – Server Info

Populating My DBA Database for Power Bi with PowerShell – SQL Info

Populating My DBA Database for Power Bi with PowerShell – Databases

Power Bi, PowerShell and SQL Agent Jobs


DBA Database scripts are on Github


It started with a tweet from Dusty

Tweets

The second session I presented at the fantastic PowerShell Conference Europe was about using the DBA Database to automatically install DBA scripts like sp_Blitz, sp_AskBrent and sp_BlitzIndex from Brent Ozar, Ola Hallengren's Maintenance Solution, Adam Machanic's sp_whoisactive, this fantastic script for logging the results from sp_whoisactive to a table, Extended Events sessions and other goodies for the sanity of the DBA.

By making use of the dbo.InstanceList in my DBA database I am able to target instances by SQL Version, OS Version, Environment, Data Centre, System, Client or any other variable I choose. An agent job that runs every night will automatically pick up the instances and the scripts that are marked as needing installing. This is great when people release updates to the above scripts, allowing you to target the development environment and test before they get put onto live.
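
A hedged sketch of what that nightly targeting query might look like; the InstanceScriptLookup and ScriptList column names here are illustrative, so check them against the project itself:

$Query = @"
SELECT IL.ServerName, IL.InstanceName, IL.Port, SL.ScriptName
FROM [dbo].[InstanceList] IL
JOIN [dbo].[InstanceScriptLookup] ISL ON ISL.InstanceID = IL.InstanceID
JOIN [dbo].[ScriptList] SL ON SL.ScriptID = ISL.ScriptID
WHERE IL.Environment = 'Development'
AND ISL.NeedsInstalling = 1
"@
$ToInstall = Invoke-Sqlcmd -ServerInstance $CentralDBAServer -Database $CentralDatabaseName -Query $Query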

I talked to a lot of people in Hannover and they all suggested that I placed the scripts onto GitHub and after some how-to instructions from a few people (Thank you Luke) I spent the weekend updating and cleaning up the code and you can now find it on GitHub here

github

I have added the DBA Database project, the PowerShell scripts and the Agent Job creation scripts to call those scripts, and everything else I use: some of the DBA scripts (and links to those you need to go and get yourself for licensing reasons) and the Power Bi files as well. I will be adding some more jobs that I use to gather other information soon.

Please go and have a look and see if it is of use to you. It is massively customisable and I have spoken to various people who have extended it in interesting ways so I look forward to hearing about what you do with it.

As always, questions and comments welcome


