Category: Administration

Today I would like to write about a problem that exists in a lot of environments: the file sizes and file growth of databases are left at the defaults of 1MB growth for the data file and 10 percent growth for the log file. I see a lot of environments, and most of them have not adjusted these settings. The default settings for database files are:

  • Data: initial size 2MB; Autogrowth: by 1MB, unrestricted growth
  • Log: initial size 1MB; Autogrowth: by 10 percent, unrestricted growth

A good read about the default values is the article [Choosing Default Sizes for Your Data and Log Files] by Paul S. Randal. After reading that excellent blog article you should be convinced that the default settings are not the best for your environment. But what should they be?

Of course the well known answer is: “It depends…”

In an ideal situation you know what the expected data growth for your database is and you pre-size your database files accordingly. However, I rarely see an environment where the DBA and/or the developers have any clue about the data growth for the next few months, years or an even longer time frame.

After changing file sizes by hand for too long, I started thinking about a script to automate this, based on the file sizes the databases have at the moment you run the script.

I started by defining four possible growth increments for the database files:

  • Databases that grow per 100MB
  • Databases that grow per 250MB
  • Databases that grow per 1,000MB
  • Databases that grow per 2,500MB

The next thing is to define which file size ranges map to each of these growth settings:

  • Databases from 1MB to 249MB
  • Databases from 250MB to 2,499MB
  • Databases from 2,500MB to 4,999MB
  • Databases from 5,000MB to ∞

Next I started writing a script to programmatically create the T-SQL statement to change the settings for the databases.

First I check what the result will be for each database with an overview for every new file size and file growth setting:

SELECT [SMF].[database_id] AS [Database ID]
, [SMF].[name] AS [File name]
, [SMF].[size] * 8 / 1024 AS [Current size]
, ROUND((([SMF].[size]) * 8 / 1024 / 100), 2, 0) * 100 + 100 AS [New size in growth per 100]
, ROUND((([SMF].[size]) * 8 / 1024 / 250), 2, 0) * 250 + 250 AS [New size in growth per 250]
, ROUND((([SMF].[size]) * 8 / 1024 / 1000), 2, 0) * 1000 + 1000 AS [New size in growth per 1000]
, ROUND((([SMF].[size]) * 8 / 1024 / 2500), 2, 0) * 2500 + 2500 AS [New size in growth per 2500]
FROM [master].[sys].[master_files] [SMF]
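To see what the rounding does, here is a small, purely hypothetical example for a file that is currently 1,300MB (the 1300 is a made-up literal, not taken from the script): all divisions are integer divisions, so 1300 / 250 evaluates to 5 and the proposed new size becomes 5 * 250 + 250 = 1,500MB.

SELECT ROUND((1300 / 250), 2, 0) * 250 + 250 AS [New size in growth per 250] -- returns 1500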

The result from the script above will be something like this (results will vary per environment):

[Screenshot: query results showing the current size and the four candidate new sizes per database file]

Next I created the script that creates a result based on the current file sizes:

SELECT [SMF].[database_id] AS [Database ID]
, [SMF].[file_id] AS [File ID]
, [SMF].[name] AS [File name]
, [SMF].[size] * 8 / 1024 AS [Current size]
, [SMF].[growth] * 8 / 1024 AS [Current growth]
, [SMF].[is_percent_growth] AS [Is current percentage growth]
, CASE
WHEN [SMF].[size] * 8 / 1024 < 100 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 100), 2, 0) * 100 + 100)
WHEN [SMF].[size] * 8 / 1024 < 2500 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 250), 2, 0) * 250 + 250)
WHEN [SMF].[size] * 8 / 1024 < 5000 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 1000), 2, 0) * 1000 + 1000)
WHEN [SMF].[size] * 8 / 1024 >= 5000 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 2500), 2, 0) * 2500 + 2500)
END AS [New file size]
, CASE
WHEN [SMF].[size] * 8 / 1024 < 100 THEN '100'
WHEN [SMF].[size] * 8 / 1024 < 2500 THEN '250'
WHEN [SMF].[size] * 8 / 1024 < 5000 THEN '1000'
WHEN [SMF].[size] * 8 / 1024 >= 5000 THEN '2500'
ELSE '1024'
END AS [New file growth]
FROM [master].[sys].[master_files] [SMF]
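Note that sys.master_files also returns the files of the system databases (master, tempdb, model and msdb). If you prefer to leave those alone, a filter like the one below (my own addition, not part of the original scripts) can be appended to each of the queries:

FROM [master].[sys].[master_files] [SMF]
WHERE [SMF].[database_id] > 4 -- skip master, tempdb, model and msdb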

In the results from that script you will see what the current sizes and growth settings are, and the proposed new sizes and growth settings (results will vary per environment):

[Screenshot: query results with the current and proposed file size and file growth settings per database file]

So with this logic in place I could create the final script, which generates the T-SQL code for changing the database file sizes for each database in the SQL Server instance, based on the current file sizes and the definitions made earlier. In scripts like these I never let the T-SQL code run automatically. The script generates T-SQL code that you need to copy and paste into a new query window, so that you can check whether it accomplishes what you want. Before running the script below, make sure your results are displayed as text (CTRL + T).

In this script I use the SQLCMD mode option :CONNECT <server name> to be sure the changes will be made on the correct SQL Server instance.

SELECT ':CONNECT ' + @@SERVERNAME + '
ALTER DATABASE [' + CONVERT(nvarchar(1000), DB_NAME(SMF.database_id)) + '] MODIFY FILE ( NAME = ''' + RTRIM(SMF.name) + ''', SIZE = ' +
CASE
WHEN [SMF].[size] * 8 / 1024 < 100 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 100), 2, 0) * 100 + 100)
WHEN [SMF].[size] * 8 / 1024 < 2500 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 250), 2, 0) * 250 + 250)
WHEN [SMF].[size] * 8 / 1024 < 5000 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 1000), 2, 0) * 1000 + 1000)
WHEN [SMF].[size] * 8 / 1024 >= 5000 THEN CONVERT(nvarchar(100), ROUND((([SMF].[size]) * 8 / 1024 / 2500), 2, 0) * 2500 + 2500)
END + ', FILEGROWTH = ' + (
CASE
WHEN [SMF].[size] * 8 / 1024 < 100 THEN '100'
WHEN [SMF].[size] * 8 / 1024 < 2500 THEN '250'
WHEN [SMF].[size] * 8 / 1024 < 5000 THEN '1000'
WHEN [SMF].[size] * 8 / 1024 >= 5000 THEN '2500'
ELSE '1024'
END + 'MB )' )
FROM [master].[sys].[master_files] [SMF]
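For illustration, the statement generated for a hypothetical 1,300MB data file called MyDatabase_Data in a database called MyDatabase (made-up names, your output will differ) would look something like this:

:CONNECT MYSERVER
ALTER DATABASE [MyDatabase] MODIFY FILE ( NAME = 'MyDatabase_Data', SIZE = 1500, FILEGROWTH = 250MB )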

The results from this script are the T-SQL commands for changing the file sizes and file growth settings (results will vary per environment):

[Screenshot: the generated :CONNECT and ALTER DATABASE statements in the text results pane]

When you use CTRL + T to get the results as text, the output will look like the above. Select only the commands and copy them into a new query window:

[Screenshot: the generated commands copied into a new query window]

By default the :CONNECT <server> lines will not be shown with a grey background. To get the script to run correctly you first need to enable SQLCMD mode via the menu option [Query] and [SQLCMD Mode].

Now you can run the query created by the script to change your database file sizes and file growth settings.

Remember that every time you run the generated query, the file sizes will grow again. This script is mainly intended to be run once, after installing a new server with databases, or when you need to change the default settings for the first time.

Of course you could run this script every now and then. My advice in that case is to change only the databases that need to change and leave the already adjusted databases as they are.
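One way to spot the files that still need attention could be a small check like the one below (my own suggestion, not part of the original scripts), which lists the files that still use percentage growth or a fixed growth smaller than 100MB:

SELECT DB_NAME([SMF].[database_id]) AS [Database name]
, [SMF].[name] AS [File name]
, [SMF].[growth] * 8 / 1024 AS [Current growth] -- in MB for fixed growth; holds a percentage when is_percent_growth = 1
, [SMF].[is_percent_growth] AS [Is current percentage growth]
FROM [master].[sys].[master_files] [SMF]
WHERE [SMF].[is_percent_growth] = 1 -- still on percentage growth
OR ([SMF].[is_percent_growth] = 0 AND [SMF].[growth] * 8 / 1024 < 100) -- or a fixed growth smaller than 100MB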

Running the generated statements has an impact, because the database files are grown to their new sizes and that takes disk I/O. So in heavily used environments I advise running the generated statements during a period of lower usage, or better, in a maintenance window.

Today, the first of January 2016, my new business officially starts. For a long time I had been thinking of starting my own business, and in the last three months of 2015 I decided to take the big step. Those last three months were busy ones: still working my day job while preparing SQLBrander.com B.V. so I could start on this day. Some work is already planned through people I knew, and I will also be doing work for Conclusion FIT, my former employer. Thanks to all of them for helping me start my business!

Of course I will keep blogging and hopefully we meet in person someday. My blog articles will always be in English.

A very good, healthy and successful year to all of you!

The website and logo were designed by Paulien Pannekoek, the website was built by Webproof, and Annemiek van Bentem helped me out with the texts on the site. Thank you all 🙂

Today I got the question whether it is possible to create a logon trigger that only fires for SQL Server logins (except sa, of course) that are used from SQL Server Management Studio to connect to a SQL Server instance.

So the first thing you do in cases like this is to check whether someone else has already solved this for you. I found the following post, which had only part of the solution.

So I changed the code so that it only blocks login attempts from SQL Server Management Studio with a SQL Server account. Domain accounts and domain groups keep working as requested, and sa keeps working as well.

/****** Object:  DdlTrigger [TR_Block_Logon_Via_Management_Studio_SQL_Login]    Script Date: 08/11/2015 16:47:21 ******/
IF  EXISTS (SELECT * FROM master.sys.server_triggers WHERE parent_class_desc = 'SERVER' AND name = N'TR_Block_Logon_Via_Management_Studio_SQL_Login')
DROP TRIGGER [TR_Block_Logon_Via_Management_Studio_SQL_Login] ON ALL SERVER
GO

/****** Object:  DdlTrigger [TR_Block_Logon_Via_Management_Studio_SQL_Login]    Script Date: 08/11/2015 16:47:21 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TRIGGER [TR_Block_Logon_Via_Management_Studio_SQL_Login]
ON ALL SERVER
FOR LOGON
AS

BEGIN

DECLARE @program_name nvarchar(128)
DECLARE @host_name nvarchar(128)

SELECT @program_name = program_name,
@host_name = host_name
FROM sys.dm_exec_sessions AS c
WHERE c.session_id = @@SPID -- only inspect the session that is logging on
AND login_name <> 'sa' -- never block sa
AND nt_user_name = '' -- SQL Server authentication only

IF @program_name LIKE '%Management%Studio%'
BEGIN
RAISERROR('This login is for application use only.',16,1)
ROLLBACK;
END
END;
GO

SET ANSI_NULLS OFF
GO

SET QUOTED_IDENTIFIER OFF
GO

ENABLE TRIGGER [TR_Block_Logon_Via_Management_Studio_SQL_Login] ON ALL SERVER
GO
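If the trigger ever blocks more than intended, you can always disable it again, for example from a Dedicated Administrator Connection or a sqlcmd session (sqlcmd reports a different program name and is therefore not blocked), with a statement like:

DISABLE TRIGGER [TR_Block_Logon_Via_Management_Studio_SQL_Login] ON ALL SERVER
GO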

There is, however, a way to bypass this solution (the program name a client reports can be changed, for example through the connection string), so it is not completely foolproof. It is meant as a first line of defense.

Last week I got a new login name and password for an environment based on Windows Server 2012 R2. The mail also contained the request to change my password at first login. Good practice, so no problem with that!

To get to the environment I first needed to connect via MSTSC to a server based on Windows Server 2008 R2, and from there connect via MSTSC to the environment based on Windows Server 2012 R2.

The normal way to change your password is to press CTRL + ALT + DEL and choose Change A Password.

[Screenshot: ChangePasswordServer2012R2.PNG]

The problem I was facing was that CTRL + ALT + DEL does not work in an MSTSC session. There is a solution for that: you can use CTRL + ALT + END to send a CTRL + ALT + DEL to the server you are connected to. When I did exactly that, I was a bit surprised by the view I got:

[Screenshot: ChangePasswordServer2008R2.png]

This is the result of CTRL + ALT + DEL on a computer running Windows Server 2008 R2, not the result you would expect when connected to a Windows Server 2012 R2 machine. It seems that CTRL + ALT + END is only handled by the first MSTSC session. So what now? Next I tried to change the password via my user account tile on the start menu.

[Screenshot: ChangeAccountPicture.PNG]

When you choose Change Account Picture and then Sign-in Options you get to the new-style Windows settings screen.

[Screenshot: Accounts.png]

To my surprise the Change option for the password was greyed out…

After some searching on the Internet my colleague Willem found a wonderful workaround (or possible security bug?) to change the password.

  1. Go to the start menu
  2. Type OSK
  3. Start the On Screen Keyboard
  4. Press CTRL + ALT on your physical keyboard
  5. Press DEL on the On Screen Keyboard
  6. Close the On Screen Keyboard
  7. Click on Change A Password
  8. Change your password

[Screenshot: ChangeAPasswordServer2012R2.PNG]

Why the option is greyed out by default is a bit strange. In Windows Server 2012 R2 it is mandatory to change your password every 42 days, and the days when servers were managed from the physical console are long gone by my standards… And the fact that, even though it is not permitted by default, it can still be done this way makes it even stranger.

Of course there is always the option to change the password via the command prompt… IF you have domain Admin rights…

So if anyone has a better option (Powershell?) please share.

Update 2014-10-19: I got a link via Twitter from SystematicADM on how to change your password via PowerShell, and also a link on how to change other AD users’ passwords via PowerShell. Thanks to SystematicADM for the response!

Azure Backup is a service in the Azure environment that can be used to back up on-premises machines, on-premises virtual machines and cloud virtual machines. My first test was whether an IaaS VM from Azure could be backed up by Azure Backup. The Azure Backup service is not advertised for that last option, but a customer is planning to move part of their Hyper-V environment to Azure, and a backup also needed to be in place for that environment. Of course you can use Data Protection Manager or other systems, but the drawback is that you need an extra VM for DPM, and it is not really in the spirit of the cloud when there is already a service that can back up your environment. So I started to test Azure Backup with my home/test environment. There are no extra charges when you have a Visual Studio Ultimate subscription with MSDN; it includes a 115 euro free-of-charge limit for testing purposes.

Below is the complete process of configuring the Azure Backup solution. For cost reasons I have used a self-signed certificate instead of a certificate that has to be bought. So if you would like to try this out yourself, the only thing you need is an Azure account with some spending room. There are trial Azure offers that limit what you can use; if you would like more spending room you can always get out your credit card and test a bit further… It is all up to you.

In short the following steps are taken:

  1. Create a Self-Signed Certificate
  2. Create the Backup Vault and upload the certificate to Azure
  3. Export the certificate from MMC
  4. Import the certificate on the VM
  5. Run WBInstaller.exe on the Virtual Machine
  6. Register Server
  7. Schedule backup

 

The following steps need to be done once:

 

Create a Self-Signed Certificate

Start Visual Studio Command Prompt

%comspec% /k ""C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\vcvarsall.bat"" x86_amd64

Use the following statement to create a Self-Signed Certificate

makecert.exe -r -pe -n CN=AzureBackup -ss my -sr localmachine -eku 1.3.6.1.5.5.7.3.2  -e 12/12/2040  -len 2048 C:\Temp\AzureBackup.cer

 

Create the Backup Vault and upload the certificate to Azure

  1. Create new Backup Vault in Azure by clicking + and choose Data Services, Recovery Services, Backup Vault and Quick Create
  2. Enter a name for the Backup Vault and choose a region near you (Western Europe in my case) and click on Create Vault
  3. After the message is displayed that the Vault is created select the Vault
  4. Click on the Manage Certificate icon in the screen, browse to the CER file you created and click the check mark (V)
  5. After this has completed successfully, click on Download Agent (WBInstaller.exe)

 

Export the certificate from MMC

Start MMC and add the Certificates snap-in on the machine where the certificate was created

  1. Right click on the AzureBackup certificate in Certificates\Personal\Certificates and choose All Tasks and Export
  2. Click Next in the Welcome to the Certificate Export Wizard screen
  3. Select Yes, export the private key in the Export Private Key screen (if this screen does not show, delete the AzureBackup certificate and start all over) and click Next
  4. Select Personal Information Exchange – PKCS #12 (.PFX) and Include all certificates in the certification path if possible in the Export File Format screen and click Next
  5. Select Password, enter a password twice and click Next
  6. Browse to the file location where you would like to save the exported file and name the file in the File to Export screen and click Next
  7. Check the choices in the Completing the Certificate Export Wizard screen and click Finish if all is ok
  8. On the Certificate Export Wizard message “The export was successful” click OK

 

The following steps need to be done on every Virtual Machine that needs to be backed up by Azure Backup

 

Import the certificate on the VM

  1. Copy the PFX certificate file and WBInstaller.exe to the virtual machine, or create a network share accessible from the virtual machine and place the files there.
  2. Import the certificate (AzureBackupExport.PFX) in the Certificates MMC console on the virtual machine that needs to be backed up
  3. Start MMC and add the Certificates snap-in on the virtual machine
  4. Right click on Certificates (Local Computer)\Personal and choose All Tasks and Import
  5. Select Local Machine in the Welcome to the Certificate Import Wizard screen and click Next
  6. Browse to the file location where you saved the exported file and select the file type Personal Information Exchange (PFX) in the File to Import screen and click Next
  7. Enter the password for the PFX file, select Include all extended properties in the Private key protection screen and click Next
  8. Select Place all certificates in the following store [Personal] in the Certificate Store screen and click Next
  9. Check the settings in the Completing the Certificate Import Wizard screen and click Finish
  10. On the Certificate Import Wizard message: “The Import was successful” click OK

 

Run WBInstaller.exe on the Virtual Machine

  1. Click I accept the terms of the Supplemental Notice in the Supplemental Notice screen and click OK
  2. In the screen Prerequisites Check click on Next
  3. Change the Installation folder and the Cache Location at will or leave it default and click on Next
  4. Select Use Microsoft Update when I check for updates (recommended) or I do not want to use Microsoft Update and click Install
  5. Wait until the WBInstaller completes, leave Check for newer updates selected in the Installation screen and click Finish
  6. In the Before You Begin screen click Next
  7. In the screen Upgrade Process click Finish
  8. Check for Updates and install these

 

Register Server

  1. When finished installing updates start Windows Azure Backup (shortcut on the Desktop)
  2. Click on Register Server in the top right of the application
  3. Click on Next in the Proxy Configuration screen
  4. Click on Browse and select the correct certificate in the Vault Identification screen and click Next
  5. Select the Backup Vault in the Vault Identification screen and click Next
  6. Click on Generate Passphrase or create your own, enter a location to save the passphrase in the Encryption Setting screen and click Register
  7. Click Close in the Server Registration screen

 

Schedule backup

  1. Click on Schedule Backup in the top right of the application
  2. Click Next in the Getting Started screen
  3. Click on Add Items and select what you would like to back up in the Select Items to Backup screen
  4. Click on Exclusion Settings and select which files you do not want to back up
  5. Click Next in the Select Items to Backup Screen
  6. Select the days and times the backup needs to run in the Specify Backup Time screen and click on Next
  7. Select the number of days retention time in the Specify Retention Setting screen and click Next
  8. On the Confirmation screen click Finish
  9. Click Close in the Modify Backup Progress screen

 

Backup Now

  1. If needed click on Backup Now in the top right corner of the application
  2. In the Confirmation screen click Back Up
  3. You may close the Backup progress screen if needed; the backup will continue

 

Good luck! If you have any questions please let me know! The next blog post will be on restoring your data.