Finding Failed Jobs

How would you go about finding jobs that have failed, and why would you want to look for them? One reason is to verify that you are being notified appropriately of job failures. For instance, after reviewing failed jobs, you may realize there are failures that no one knew about, perhaps because the jobs are not set to send email. If that’s the case, see my post here so you can fix that.

But how do you find these job failures in the first place?

1. Manually look at the job activity monitor for the SQL instance.

The job activity monitor is found under the SQL Server Agent node in the Object Explorer of SSMS.

 

Just double click it to open and you will see something like this. In this case I set up a job with an incomplete SELECT statement so that it would create a failure.


2. Query the msdb database.

USE msdb;
GO
SELECT
J.name AS JobName
, J.description AS JobDescription
, H.step_id
, H.step_name
, msdb.dbo.agent_datetime(H.run_date, H.run_time) AS RunDateTime
, H.sql_severity
, H.message
FROM dbo.sysjobs AS J
INNER JOIN dbo.sysjobhistory AS H ON J.job_id = H.job_id
INNER JOIN dbo.sysjobsteps AS JS ON J.job_id = JS.job_id AND H.step_id = JS.step_id
WHERE H.run_status = 0; --Failed job steps

 

Fixing Jobs With No Email Notification

In the last post we found jobs that had no email notification. Once you find them, how do you fix this problem?

I’m going to assume that database mail is already enabled and configured on the SQL Server instance.
1. Create an operator, assuming one does not exist.

This can be done either through T-SQL or through the GUI. In the GUI, right-click the Operators folder under SQL Server Agent and choose “New Operator.” Then fill in the operator “Name” and “E-mail name” fields. See below for an example.

 

Then you click OK in the bottom right of the GUI to finish the process.

 

Alternatively you can use T-SQL to create the Operator:

USE [msdb]
GO
EXEC msdb.dbo.sp_add_operator @name=N'DBA Team', 
    @enabled=1, 
    @pager_days=0, 
    @email_address=N'MyEmail@MyDomain.com'
GO
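To confirm the operator was created with the right email address, you can check msdb directly. A quick sketch, using the operator name from above:

USE [msdb]
GO
-- Verify the new operator and its email address
SELECT name, enabled, email_address
FROM dbo.sysoperators
WHERE name = N'DBA Team';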

2. Assign that operator to multiple jobs via T-SQL.

Once there is an Operator to assign to jobs with no email notifications, the next step is to actually add the new Operator to the applicable SQL Agent jobs. The script below generates the T-SQL for this, which can be copied and run in a new SSMS window, allowing you to review the statements before running them.

Keep in mind that in this example I am only updating jobs that don’t already have an operator assigned to them, which makes the WHERE clause very important. If you leave it off and then copy and run the generated statements, every job will be updated to whatever was specified in the variable. That may not be what you want for your environment, so be sure to keep the WHERE clause and the notify_level_email = 0 predicate if you only want to update jobs that have no operator assigned.

 

USE msdb;
GO
DECLARE @operator varchar(50);
SET @operator = 'DBA Team';

SELECT 'EXEC msdb.dbo.sp_update_job @job_ID = ''' + CONVERT(varchar(50), J.job_id) + ''' ,@notify_level_eventlog = 2, @notify_level_email = 2, @notify_email_operator_name = ''' + @operator + ''''
FROM dbo.sysjobs AS J
 INNER JOIN dbo.sysjobschedules AS JS ON J.job_id = JS.job_id
 INNER JOIN dbo.sysschedules AS SS ON SS.schedule_id = JS.schedule_id
WHERE J.enabled = 1 --Job is enabled
 AND J.notify_level_email = 0 --never notify
 AND SS.enabled = 1; --it has a schedule that is enabled
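Each row returned by the SELECT above is a complete statement, ready to paste into a new query window. A single generated statement would look something like the following; the job_id GUID here is made up for illustration:

EXEC msdb.dbo.sp_update_job @job_ID = '00000000-0000-0000-0000-000000000000' -- hypothetical job_id
    ,@notify_level_eventlog = 2              -- write to the Windows event log on failure
    ,@notify_level_email = 2                 -- send email on failure
    ,@notify_email_operator_name = 'DBA Team';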

Notice that I am using sp_update_job not only to add the email operator, but also to set the job to log to the Windows event log in case of a failure.  Why?  Well, what happens if database mail is not configured properly?  Maybe it worked when the job was created, or maybe it worked the last time you touched database mail, but what if someone else made changes to it?  In that case, you might not get the expected email when the job fails.

If you have someone, like a junior DBA, who looks through the Windows event logs for issues, they might find it. Or what if you have one of those slick third-party monitoring tools that scans for errors and notifies you when it finds one? Then you have another way to be notified that the job failed.

And here is your BOL link to all the goodies involved with sp_update_job.

Finding Jobs With No Email Notifications

When you take over a new SQL Server instance, one of the things you might consider doing early on is finding SQL Agent jobs that have no email notification set up. These are jobs that could fail without ever alerting anyone. I am of the opinion that if a job is worth creating, then it’s worth telling someone when it doesn’t do what it is supposed to do.

SQL Agent Job Email Operators

A quick, easy way to determine jobs that have no email operator is to connect to your SQL Server instance and run the code below.

 

USE msdb;
GO

SELECT J.name,
 J.notify_level_email,
 J.notify_email_operator_id

FROM sysjobs AS J
 INNER JOIN sysjobschedules AS JS ON J.job_id = JS.job_id
 INNER JOIN sysschedules AS SS ON SS.schedule_id = JS.schedule_id
 
WHERE J.enabled = 1
 AND J.notify_level_email = 0 --never notify
 AND SS.enabled = 1; --it has a schedule that is enabled

For jobs that are not going to email anyone if they fail, the last two columns will return zero. These are the jobs you will want to fix so that when they do fail, someone is notified and can address the problem.

Here is a link to all the goodies available to you in dbo.sysjobs:

https://msdn.microsoft.com/en-us/library/ms189817.aspx

Next Steps to Take

  1. Run the above code against your SQL Servers.  If you have a Central Management Server, consider running the code there so you can see all the results across your entire environment at once.
  2. Evaluate the list of results returned and set up email operators as needed so you can be alerted if the job fails.
  3. If you have any questions, or need help with this, please contact me here.

Three Reasons to Use a Custom Database Role

Ever had a situation where a small in-house application written for a small group of employees needed some permissions in the database? How do you handle that? I want to walk through the scenario and provide three reasons that using a Database Role might be the answer you are looking for.

Let’s first cover, at a high level, the different levels of permissions in SQL Server and the associated commands. In SQL Server there are Server level permissions for the SQL instance and there are database level permissions. Within the database level there are also specific permissions for objects like tables, views and stored procedures. There are also database permissions that can be grouped together in a database role. In fact, there are some built-in database roles for SQL Server as shown below. If you aren’t familiar with these, you can find out about them here.

Database Roles

 

As an example, suppose there is an application being written for a small department in your company. The application needs access to your company database, but not access to everything in the database. Perhaps the app only touches a few tables. Someone could grant the Server level permission of SysAdmin to a Windows account or Windows group in Active Directory that contains the necessary people. That would be a “sledgehammer” approach to providing permissions and would break the principle of least privilege. In this scenario, a database role could be used to accomplish the task at hand.

On your domain you or the System Administrator can create a group called MyDomain\DepartmentName. With the pseudo-code below as a model, you can create the Login and User.

USE master;
GO
CREATE LOGIN [MyDomain\DepartmentName] FROM WINDOWS;
GO
USE MyDatabase;
GO
CREATE USER [MyDomain\DepartmentName] FOR LOGIN [MyDomain\DepartmentName];

In this new AD group, you place the members of the department. This allows the department members to use their already familiar Windows passwords, which keeps things easier for them. Using Windows groups also makes it easier for you; if you aren’t convinced of that, read this post.

At this point some people might still take the approach of granting big permissions by mapping the new database user to the db_owner role and calling the permissions work done. However, do you really want a group of people in one department to have complete control over the entire database? Somehow I doubt it, and I doubt your auditors want to see that either.

Permissions could be granted to the user account itself inside SQL Server, but what if other people who are not yet in the appropriate AD group want access? Now you either have to add them to that AD group, which might not make sense from an organizational perspective, or you have to start this process all over again with a new login and user. This is where Database Roles come in. Below is example code showing how you would do that, based on the process we already have in place.

USE MyDatabase;
GO
CREATE ROLE MyNewDBRole;
GO
ALTER ROLE MyNewDBRole ADD MEMBER [MyDomain\DepartmentName];
GO
GRANT EXECUTE ON [dbo].[GetEmployees] TO MyNewDBRole;
GO
GRANT EXECUTE ON [dbo].[InsertEmployee] TO MyNewDBRole;
GO
GRANT EXECUTE ON [dbo].[DeleteEmployee] TO MyNewDBRole;
GO
GRANT EXECUTE ON [dbo].[UpdateEmployee] TO MyNewDBRole;
GO
GRANT EXECUTE ON [dbo].[GetSales] TO MyNewDBRole;

 

So what are some reasons to use a process like this? First, the application will follow the principle of least privilege. This means that the application and its users will only have the permissions required for the functionality of the application. This keeps your SQL Server more secure, which makes auditors happy.

Second, this type of process eases administration. When the application is expanded and more people begin to use it for different purposes, just create another group like the first one demonstrated and add that group to the role. Grant any additional permissions needed to the role. There is no adding of individual logins to the SQL Server and no applying permissions to many individual logins, which would be a pain. Now you have a single point of permissions administration within SQL Server – the Database Role. Additionally, if there are new employees in the department who need access to the application, simply add those people to the AD group and you’re done!
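As a sketch of that scenario, suppose a second department later needs the same application. The group name below is hypothetical; the role and database come from the earlier example:

USE master;
GO
CREATE LOGIN [MyDomain\SecondDepartment] FROM WINDOWS;
GO
USE MyDatabase;
GO
CREATE USER [MyDomain\SecondDepartment] FOR LOGIN [MyDomain\SecondDepartment];
GO
-- The role already carries the permissions, so membership is all that is needed
ALTER ROLE MyNewDBRole ADD MEMBER [MyDomain\SecondDepartment];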

Third, you and the System Administrator get some face time!! Who doesn’t like a little collaboration?

Three Places to Get Information On New Updates for SQL Server

There are many consistently hot topics around SQL Server. One such topic is, “How does a DBA obtain the latest information on updates?” In this post, I will discuss three places that I frequent for information on SQL Server updates.

The first link I want to discuss is http://sqlserverbuilds.blogspot.com/. This site gives you a very comprehensive page of data. At the top there is a summary grid that lists SQL Server versions down the left side, with release names like RTM, SP1 and SP2 across the top. Inside the grid is the build number for each release, and each build number links to the corresponding download.

Below that summary grid are similar grids for individual versions, where the reader will find links to hotfixes, cumulative updates and service packs. There is a grid for each version going back to SQL Server 7.0, and there are even links to 6.5 and 6.0 builds! This site does not have an RSS subscription option.

The next link is http://sqlserverupdates.com/, maintained by the people of BrentOzar.com. If you are not familiar with them, they are a consulting company started by Brent Ozar, Kendra Little and Jeremiah Peschka. Kendra and Jeremiah recently departed the company to pursue other interests, turning the reins over solely to Brent Ozar.

This site goes back to SQL Server 2005. The matrix provided shows the reader the SQL Server version, then the latest updates and when support ends for that version. There is a separate link for updates other than the latest one for each version. Clicking the link for Other SQL 2014 Updates, for example, opens a separate page that shows much the same information as the version grids on http://sqlserverbuilds.blogspot.com/. The site also has a FAQ that addresses things like the difference between a hotfix, a cumulative update and a service pack, and it offers an option to subscribe to notifications about new updates for SQL Server.

The last link is http://blogs.msdn.com/b/sqlreleaseservices/. It is written by the SQL Server Engineering Team. Rather than being organized in a grid style, this site lists information chronologically as posts are written and published. As a result, there is a mixture of posts about the various edition releases all on the same page as separate blog posts. Readers can subscribe with Live Bookmarks.

If you think you don’t need to concern yourself with keeping up to date on SQL Server patches, then please take a look at this post.

I hope this has been helpful and I wish you happy patching!

 

Moving Database Files to Another Drive

You may occasionally have the need to relocate database files. What are some reasons that may prompt the need for this type of change? First, someone might have discovered that the database files are on the same drive as the operating system. For a discussion of why you don’t want to do this see this post. Second, your database might simply be outgrowing its current drive. Third, perhaps someone installed the transaction log file on the same drive as the mdf file.

So, after the need to do this has arisen, how is the task actually done? The first step, which may have already been done as part of the process that discovered the issue in the first place, is to confirm the current location of your database files. There are at least two ways to approach this. First, if you want to look at all the files on the SQL Server instance, use the query below, which utilizes sys.master_files.

SELECT
MF.name AS LogicalFileName
, MF.physical_name AS FilePath
FROM sys.master_files AS MF;

A second way to do this is to review the information for a single database. For that, the query below can be used. Note I am looking for AdventureWorks2008R2 in my particular query.

SELECT
D.name AS DBName
, MF.name AS LogicalFileName
, MF.physical_name AS FilePath

FROM sys.master_files AS MF
INNER JOIN sys.databases AS D ON MF.database_id = D.database_id
WHERE D.name = 'AdventureWorks2008R2'

 


File Names and Paths

Once this is complete, the next step is to verify that the target drive is large enough to hold the file that you plan on moving. You really don’t want to get in the middle of this change only to discover that the target drive does not have enough room. In this case I will be demonstrating how to move a transaction log file, so I will use a DBCC command to return space information.

DBCC SQLPERF(LOGSPACE)

 

DBCC SQLPERF(LOGSPACE) results

 

The DBCC command gives you the current size of the log in megabytes, as well as other useful information like the percentage of space used in that log file. In this case my transaction log file is very small at 3.36 MB for the AdventureWorks2008R2 database.

Another way to see this information for a specific database is to right-click the database name in SQL Server Management Studio and select Tasks > Shrink > Files. Switch the display to the Log file type and the dialog will show the currently allocated space for the transaction log file, as well as the free space in that file at the moment.
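If you would rather stay in T-SQL for that check, and you are on SQL Server 2012 or later, the sys.dm_db_log_space_usage DMV reports similar information for the current database. This is a small sketch I am adding here rather than something from the original steps:

USE AdventureWorks2008R2;
GO
-- Total log size and current usage for the database you plan to move
SELECT total_log_size_in_bytes / 1048576.0 AS TotalLogSizeMB
     , used_log_space_in_bytes / 1048576.0 AS UsedLogSpaceMB
     , used_log_space_in_percent AS UsedLogSpacePercent
FROM sys.dm_db_log_space_usage;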

Once you have confirmed that the target drive will hold the file you want to move, the next step is to take the database offline so that the file can be copied to the target drive. Use the script below to take your database offline, replacing AdventureWorks2008R2 with the name of your database.

ALTER DATABASE AdventureWorks2008R2 SET OFFLINE WITH ROLLBACK IMMEDIATE;

 

Now, using Windows Explorer, copy the file from its current location to the new location and run the query below. For this demonstration I simply made a new folder called Logs in the default path, like this: ‘C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Logs’

ALTER DATABASE AdventureWorks2008R2
MODIFY FILE (NAME = AdventureWorks2008R2_log, FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Logs\AdventureWorks2008R2_log.ldf');

Then bring the database online.

ALTER DATABASE AdventureWorks2008R2 SET ONLINE;
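To confirm that SQL Server is now pointing at the new location, you can re-run the earlier query against sys.master_files:

SELECT D.name AS DBName
, MF.name AS LogicalFileName
, MF.physical_name AS FilePath
FROM sys.master_files AS MF
INNER JOIN sys.databases AS D ON MF.database_id = D.database_id
WHERE D.name = 'AdventureWorks2008R2';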

That’s it. The file has been successfully moved. Congratulations! Performance has been improved, or a disaster with a full drive has been averted.

Setting Min and Max Server Memory Setting

There are a number of things to consider before changing the min and max server memory settings for SQL Server. Before we jump into those considerations, let’s look at what the min and max server memory options actually do.

What does the min server memory option do?

The min server memory option specifies the minimum amount of memory utilized by the SQL Server buffer pool. It may seem that this should mean that when SQL Server starts it will automatically grab the memory equal to the min server memory value. However, this is not what happens. Instead, SQL Server acquires system memory as needed to process a workload. Consequently, if the SQL Server workload remains low, it is possible that the min server memory value is never reached. However, once it is reached SQL Server will not release an amount of memory that would take it below that minimum setting.

For example, let’s suppose your server has 128 GB of RAM and the min server memory option is set to 32 GB for SQL Server. SQL Server does not immediately allocate 32 GB of RAM upon server startup. It acquires memory as needed for the workload, which means that if there is little activity in the database it may take some time before SQL Server allocates 32 GB of RAM. Once this happens, SQL Server will not release memory that would reduce the allocation below the min server memory setting.

What does the max server memory option do?

To a certain point the max server memory setting does do what it sounds like. Keep in mind this setting is applicable only to the buffer pool. SQL Server can allocate some memory beyond this max number as it consumes memory for things outside of the buffer pool. Also, realize that this setting does not account for memory allocated to other components of SQL Server that may be installed and running, such as Reporting Services or Analysis Services.

 

The default value for Max Server Memory is 2,147,483,647 MB. That number is rather large, 2,047.99 TB to be exact. So what happens if the machine SQL Server is installed on does not have that much memory and the default was not changed? As previously mentioned, SQL Server will continue to acquire additional memory based on workload. As a result, SQL Server could make requests of Windows that put the box under memory pressure, essentially choking the OS by leaving it less and less memory. There are mechanisms in the operating system that allow it to page some of the SQL Server buffer pool out to disk so Windows can recover that memory for itself. This hurts SQL Server performance because disk IO is far slower than keeping operations in memory. Among other places, details on how this process works can be found here, in a blog post discussing the Lock Pages In Memory feature. I will save that setting for discussion in a later post.
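You can check whether the defaults are still in place on an instance with a quick query against sys.configurations. A minimal sketch; value is what is configured and value_in_use is what is currently active:

USE master;
GO
-- Current configured and in-use values for the memory settings
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN (N'min server memory (MB)', N'max server memory (MB)');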

 

You can calculate how much memory the Windows OS and other applications may need and then subtract that from the physical memory available. There are recommendations on the internet for such things. Whatever number you arrive at would then be set as the Max Server Memory setting in SQL Server. The Available MBytes perfmon counter should then be judiciously monitored to ensure that there is adequate free memory left on the box, at least 1-2 GB. That will allow plenty of memory for that person who just insists on using an RDP connection to do his or her work, or for a new process to fire off, without bringing the server to its knees.

 

For example, let’s say that your server has 128 GB of RAM. You may decide to reserve 12 GB for the OS. Perhaps you have a vendor app that needs 4 GB. You take 128 - (12 + 4) and that leaves 112 GB of RAM. Also keep the edition limits in mind, which apply per SQL instance: SQL Server 2008 R2 Standard and SQL Server 2012 Standard each support a maximum of 64 GB, while SQL Server 2014 Standard supports 128 GB. If SQL Server 2014 Standard were installed in this scenario, up to 112 GB could be allocated to the SQL Server buffer pool on the machine described above. Setting the limit at 112 GB, followed by the aforementioned monitoring of the Available MBytes performance counter, would help ensure that SQL Server does not hog memory from the Windows operating system.

 

Now that there is a foundational understanding of what the Min Server Memory and Max Server memory settings are and how they work, let’s look at how to set these options in SQL Server.

How to Set Min and Max Server Memory

First, there is the GUI. Open SQL Server Management Studio and connect to a SQL Server instance. Right-click the SQL Server instance name and choose Properties, then select the Memory page in the left pane. Manually change the values in the Minimum and Maximum server memory boxes, remembering that these values are in MB, and click OK in the lower right of the dialog box. Also, remember that when you make changes in SSMS you can click the Script button at the top of the dialog and the T-SQL for those changes will be generated for you.

 

Min and Max Server Settings

 

Second, these settings can be changed using T-SQL.  Here is an example script that sets the minimum to 12 GB and the Max to 32 GB.

 

EXEC sys.sp_configure N'show advanced options', N'1' RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'min server memory (MB)', N'12288'
GO
EXEC sys.sp_configure N'max server memory (MB)', N'32768'
GO
RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'show advanced options', N'0' RECONFIGURE WITH OVERRIDE
GO

Setting Max Server Memory With PowerShell

As a third option, you can use PowerShell to set the max server memory. I have a blog post here that is the beginning of a series on using PowerShell to manage SQL Server.

We’re going to look at Set-DbaMaxMemory from the DBATools module. Here is what the DBATools website has to say about this cmdlet in its documentation.

“Inspired by Jonathan Kehayias’ post about SQL Server Max memory (http://bit.ly/sqlmemcalc), this uses a formula to determine the default optimum RAM to use, then sets the SQL max value to that number.”

Here are some examples from the documentation from DBATools and from the Help commands available in PowerShell.

<# If you have a Central Management Server for your SQL environment, consider using this command to loop through all the SQL Servers and set the Max Server Memory where it is set to something larger than the total amount of RAM assigned to the server. #> 

Get-DbaRegServer -SqlInstance sqlserver | Test-DbaMaxMemory | Where-Object { $_.MaxValue -gt $_.Total } | Set-DbaMaxMemory 

<# If you have a Central Management Server for your SQL environment, consider using this command to loop through all the SQL Servers and set the Max Server Memory to this accepted formula created by a SQL Server expert. #> 

Get-DbaRegServer -SqlInstance sqlserver | Test-DbaMaxMemory | Set-DbaMaxMemory 

<# If you don't have a registered server then just use the below #> 

Test-DbaMaxMemory -SqlInstance SQLServerInstanceNameHere | Set-DbaMaxMemory

 

What to do next

  • Review these settings on your SQL Server instance and consider whether they need to be adjusted.
  • Monitor the Available MBytes perfmon counter to ensure your servers have plenty of free memory.
  • Explore using PowerShell commands for setting this server memory option.
  • Contact me here or reach out to me on Twitter if you have questions about this post or other questions about SQL Server.

Use Windows Groups to Manage SQL Server

Ever come across a SQL Server that had 5, 10, or more individual logins for users in it? I have a few times, and it made me wonder whether whoever had done such a thing realized how much harder they were making the administration of the server. I suppose some folks start out with one or two people that they give special permissions or direct access to the SQL instance and underlying databases. Their logic probably goes something like, “It’s just Tommy and Suzie. Those are the only people who will need access for a long time.”

Initially I have to agree with this thought. It does seem simpler when you only have a couple of people to simply create a login and corresponding user for those people. However, SQL Server access and permissions rarely stay that simple. Hopefully you are in a growing company. A growing company will eventually employ additional people who either truly need or believe that they need access to the database. If you are already in the habit of granting direct access using individual logins and database users, it will be tempting to continue this pattern. Before you know it, one day you connect to SQL Server and there is an ever increasing list of individual logins that suddenly catches your attention, and most likely there are a variety of permissions applied to those logins and users. This becomes a management nightmare.

In steps Windows groups. In a typical environment there will be Developers, one or more DBAs, and perhaps folks who write SSRS reports or do ad hoc reporting. These groups most likely all need different permission sets. Create a Windows group in AD, or have your System Admin do it. (You are on good terms with her, right?) Ask the SysAdmin to add the developers to a MyDomain\Dev group. Ask her for MyDomain\SQLAdmins and perhaps MyDomain\Reporting as well.

Now, in SQL Server, add these Logins and map them to the appropriate database Users. For the Dev group, maybe you give them only public permissions at the SQL instance level in production and db_datareader in various databases. Then do the same for MyDomain\Reporting. Why have two groups with the same permissions, you ask? Well, it’s a management thing. If you throw everyone into the same group, then when people on the Dev team need slightly different permissions for something, that permission change would be applied to your reporting people too, and you may not want that.

Back in SQL Server 2008 the BUILTIN\Administrators group was removed as a group that is granted automatic access to the SQL Server instance. This was done to enhance security. Given this change, you might want to create a group specifically for your DBAs and grant it the sysadmin server role. This is where MyDomain\SQLAdmins comes in. You can grant the DBAs the access they need to manage the SQL instance without making those folks admins on the machine, and the Windows SysAdmins retain control of the machine without automatically being granted control in the SQL instance to do things that the DBAs would frown upon. Everyone is happy, right?
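Here is roughly what that looks like in T-SQL, as a sketch. The group names come from the examples above; the database name MyDatabase is a placeholder, and ALTER SERVER ROLE requires SQL Server 2012 or later (use sp_addsrvrolemember on older versions):

USE master;
GO
CREATE LOGIN [MyDomain\Dev] FROM WINDOWS;
CREATE LOGIN [MyDomain\Reporting] FROM WINDOWS;
CREATE LOGIN [MyDomain\SQLAdmins] FROM WINDOWS;
GO
-- The DBA group manages the instance
ALTER SERVER ROLE sysadmin ADD MEMBER [MyDomain\SQLAdmins];
GO
USE MyDatabase;
GO
CREATE USER [MyDomain\Dev] FOR LOGIN [MyDomain\Dev];
CREATE USER [MyDomain\Reporting] FOR LOGIN [MyDomain\Reporting];
GO
-- Read-only access in production for developers and report writers
ALTER ROLE db_datareader ADD MEMBER [MyDomain\Dev];
ALTER ROLE db_datareader ADD MEMBER [MyDomain\Reporting];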

Now let’s suppose that the company has a new project and three Developers are being hired. In the past the DBA might make three separate logins and map those logins to database users. Now that MyDomain\Dev exists, just ask your System Administrator to add the new developers to that group while she is in Active Directory creating their Windows accounts, and presto, permissions are set before the new employees show up on their first day. Production permissions are set, and of course you have already added the MyDomain\Dev group to the sysadmin server role on the development SQL instance, so permissions are set there as well. All of that without ever touching SQL Server to grant individual permissions to people. You’ve just saved yourself some time so you can go do something else.

I hope this has been insightful and encourages you to start thinking ahead now, even while your SQL Server environment is still relatively small. I think starting with Windows groups from the beginning will save you time and headache later.

3 Additional Tips for Using SQL Server Management Studio

In an earlier post I listed three tips for using the SQL Server Management Studio user interface. I want to offer three additional tips for SSMS.  I think you will find that these ideas will bring additional efficiency and functionality to your use of the interface.

The first tip is related to getting more viewing space after running a query. Query results obviously take up the bottom portion of the SSMS editor window. I have often found myself clicking in the query editor window and scrolling to view various parts of my query while the results pane continues to use a good portion of the screen. Some time back an associate told me how to easily address that annoying issue: simply press Ctrl+R to collapse the query results so that the full window is again used for viewing the query text.

The second tip is related to finding objects in Object Explorer. Ever found yourself trying to locate a particular stored procedure that has a name that is a mile long and your brain only remembers part of the procedure name? What about trying to find all the tables related to Orders or Shipping? As it turns out, there is a nice way to search for such things.

Drill down into the object level you are searching through, such as Databases > AdventureWorks2008R2 > Tables. Right click the Tables folder and select Filter. Then select Filter Settings. You are then presented with the dialog below. Enter the name description in the Value column and click OK. This filtering also works for the Views, Stored Procedures and Functions folders.

Filter Search on a Table

 

The third tip relates to using color to visually identify which SQL Server instance the query window is connected to. I once saw a Developer who had five SQL Server Management Studio windows open at once. When another Developer asked him why, he answered that it was so he could maintain separate connections to different SQL instances and not get confused, as he might if all the connections were in one running instance of SSMS. It was, I suppose, an admirable attempt to ensure he was running queries against the right SQL instance, but it reminded me that color can be used to identify the SQL instance instead.

Here is how it works. When SSMS is launched the user is presented with a connection dialog.

SSMS Connect to Server Dialog2

 

Next select Options and the below appears.

 

SSMS Connection Properties Dialog

Click on the Select button, choose a color and then click OK and then Connect in order to connect to the SQL instance. Now open a new query window. The color you just selected will appear along the bottom of the SSMS editor window. To do this after you have already opened SSMS, click Connect > Database Engine > Options from the Object Explorer pane. This type of color customization can also be done with registered servers. For more information on this feature click this link: https://msdn.microsoft.com/en-us/library/hh213617(v=sql.120).aspx

 

I hope these additional tips on SSMS prove useful. Feel free to leave me a comment or another customization that you like to use.

You Really Should Stay Current on Cumulative Updates

 

I have encountered a Log Shipping error that I want to detail as an encouragement for staying current on SQL Server cumulative updates. First, a little background. The environment is SQL Server 2014 Standard edition with a physical machine as the primary and two secondaries, both virtual machines backed by a flash storage system. SQL Server is on Service Pack 1, Cumulative Update 1. Log shipping had been configured in this environment for many months; SQL Server 2008 R2 had been upgraded to SQL Server 2014 some months prior. The Log Shipping restore job leaves the database in Standby mode so it can be queried for reporting purposes.

 

One evening I received the following error via database mail: “SQL Server Alert System: ’Severity 021’ occurred on \\MySecondary. An error occurred during recovery, preventing the database ‘MyDatabase’ from restarting. Diagnose the recovery errors and fix them or restore from a known good backup.” This was immediately followed by another email advising that the Log Shipping restore job had failed. Yet a third email was sent for error 9004, severity 16, explaining that an error occurred processing the log for “MyDatabase.” The message informed me that I could restore from backup or rebuild the log. I connected to the SQL instance and saw that the database was stuck in the restoring state.

 

A quick internet search turned up the following two links.

https://support.microsoft.com/en-us/kb/2987585

https://support.microsoft.com/en-us/kb/2015753

 

The symptoms section of the first link seemed to fit my situation fairly well. I was unfamiliar with the fsutil command line utility. When I executed it against my environment the results indicated that the Bytes per Sector value was 512 and Bytes per Physical Sector returned “Not Supported.” The support article gave me two workarounds for addressing the issue.

  • Move the transaction log file at the destination to a drive that has “Bytes per Physical Sector” set as 512 bytes. Note: the Standby file can still be located on a drive that has “Bytes per Physical Sector” set as 4096 bytes.
  • Restore the log backups without using the standby option. Instead of the STANDBY option, use the WITH NORECOVERY option during the restore operation.

 

Since my database was still in the restoring state and the second URL indicated I had a damaged transaction log, I could not take the database offline and relocate the file per the first recommendation. I also could not confirm how the bytes per sector value on the drive was configured anyway. The first article indicates that the issue was fixed in SQL Server 2014 SP1 CU2 and SQL Server 2012 SP2 CU7. Remember, I mentioned that the environment was on CU1 of SQL Server 2014 SP1. Unfortunately, that put me one update behind with respect to being protected from this situation.
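(If you are ever unsure which build and servicing level an instance is running, SERVERPROPERTY makes it a quick check.)

-- Version, servicing level and edition of the current instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion -- build number, e.g. 12.0.xxxx for SQL Server 2014
     , SERVERPROPERTY('ProductLevel') AS ProductLevel     -- RTM, SP1, SP2...
     , SERVERPROPERTY('Edition') AS Edition;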

 

It looked like I had one good course of action – restore from backup. I disabled the restore job, then restored from a recent full backup and the most recent differential. I left the database in the restoring state so I could re-enable the Log Shipping restore job, which would get the secondary back to Standby mode.
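The restore itself followed the standard pattern. Here is a sketch with made-up backup file paths; the key detail is WITH NORECOVERY, which leaves the database restoring so the Log Shipping restore job can take over again:

-- Restore the most recent full backup, leaving the database in the restoring state
RESTORE DATABASE MyDatabase
FROM DISK = N'\\BackupShare\MyDatabase_Full.bak' -- hypothetical path
WITH NORECOVERY, REPLACE;
GO
-- Apply the most recent differential, still leaving the database restoring
RESTORE DATABASE MyDatabase
FROM DISK = N'\\BackupShare\MyDatabase_Diff.bak' -- hypothetical path
WITH NORECOVERY;
GO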

 

If you want to know when the latest updates for SQL Server are available, I recommend saving the links below in your favorite web browser. As this post demonstrates, be sure to read what the latest updates address as soon as possible to see if they apply to your situation.

 

http://sqlserverbuilds.blogspot.com/

http://sqlserverupdates.com/

http://sqlblog.com/blogs/aaron_bertrand/default.aspx