5 Types of SQL Server Backups

One of the foundational responsibilities of a DBA is making sure that backups are available. SQL Server offers several types of backups. Before I dive into those, though, I want to ask a question that seems to have a fairly obvious answer.

Why take backups?

  1. You want to be able to recover from data corruption or from mistakes made when people change data directly – like when they write and run a script in SSMS, outside of an application that uses a set of tested stored procedures.
  2. You want to be able to refresh your non-prod environments so developers have data that looks like prod to develop against.
  3. You want to have backups so you can test whether you can restore and how long a restore will take.
  4. You want backups because they are often an essential part of migrating from one SQL Server to another – usually a newer SQL Server, or Azure.

SQL Server Full Backups

What does a full backup accomplish? A full backup creates a backup of the entire database. It is all-inclusive, containing the entire database structure and all the data. When a full backup starts, the Log Sequence Number is recorded, and it is recorded again when the full backup completes. The Log Sequence Number is the mechanism SQL Server uses to know what order INSERTs, UPDATEs, and DELETEs occurred in. Having the beginning and ending Log Sequence Numbers recorded as part of the full backup allows for a transactionally consistent backup, because the full backup is aware of the changes that took place while the backup was being made. This allows those transactions to be recovered during the restore process. The full backup serves as the base backup, or starting point, for all differential and log backups after it. Each time a new full backup is taken, a new base backup point is created.
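In T-SQL, a basic full backup looks something like this (a minimal sketch; the database name and path are examples):

BACKUP DATABASE MyDB
TO DISK = 'D:\SQLBackups\MyDB_Full.bak'
WITH CHECKSUM, COMPRESSION, STATS = 10;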

SQL Server Differential Backups

What does a differential backup do for you? A differential backup works off the concept of a differential bitmap. SQL Server databases are composed of eight-kilobyte pages, and these pages are collected in groups of eight called "extents." A bitmap marks any extents that have experienced data changes since the last full backup. When a differential backup is executed, this bitmap determines what data is backed up.

Depending on the size of a database and its rate of change, differential backups will often complete faster than a full backup. They will also be smaller, since only changed extents are backed up rather than the entire database.
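The T-SQL is nearly identical to a full backup. The WITH DIFFERENTIAL option is what tells SQL Server to back up only the changed extents (again, a sketch with example names):

BACKUP DATABASE MyDB
TO DISK = 'D:\SQLBackups\MyDB_Diff.bak'
WITH DIFFERENTIAL, CHECKSUM, COMPRESSION, STATS = 10;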

When the full recovery model is used, the power of the differential can be very important. The differential backup reduces the number of log backups that have to be restored. For example, if there is a full backup taken on Sunday night, and transaction log backups every 15 minutes after that, a DBA who needs to restore the database to Thursday at 5:53 AM has to construct a restore process based on the full backup and every transaction log backup since Sunday evening! That's a lot of transaction logs to include. With a daily differential occurring at 5 AM, the DBA now only needs to restore the full backup from Sunday evening, the one differential created at 5 AM on Thursday, and the transaction log backups that occurred after 5 AM Thursday. Being able to skip over all those log backups is powerful. A sketch of that restore sequence is below.
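Here is a sketch of that Thursday restore, with example file names and an example date. The full and differential backups are restored WITH NORECOVERY so more backups can be applied, and the final log restore uses STOPAT to land at 5:53 AM:

-- Sunday's full backup
RESTORE DATABASE MyDB FROM DISK = 'D:\SQLBackups\MyDB_Full_Sun.bak' WITH NORECOVERY, REPLACE;
-- Thursday's 5 AM differential
RESTORE DATABASE MyDB FROM DISK = 'D:\SQLBackups\MyDB_Diff_Thu_0500.bak' WITH NORECOVERY;
-- Only the log backups taken after 5 AM Thursday, in order
RESTORE LOG MyDB FROM DISK = 'D:\SQLBackups\MyDB_Log_Thu_0515.trn' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = 'D:\SQLBackups\MyDB_Log_Thu_0530.trn' WITH NORECOVERY;
RESTORE LOG MyDB FROM DISK = 'D:\SQLBackups\MyDB_Log_Thu_0545.trn' WITH NORECOVERY;
-- The 6 AM log backup contains 5:53 AM; stop there and bring the database online
RESTORE LOG MyDB FROM DISK = 'D:\SQLBackups\MyDB_Log_Thu_0600.trn'
WITH STOPAT = '2020-01-09 05:53:00', RECOVERY;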

This makes for much simpler script creation and readability, and it can save a lot of time if the DBA is trying to write the script manually. You should have an automated process to create the backup script, but some DBAs don't have that set up yet. We'll talk about automating backup script creation later in the series.

SQL Server File and FileGroup Backups

What are file and filegroup backups? Well, to start with, this is a more advanced topic than the other backup types, so it is less likely that you will need this type of backup. However, knowing what your options are is always a good idea.

File and FileGroup backups allow a more granular backup, and by extension, a more granular restore process. It does what it sounds like.  A file or filegroup backup backs up an individual file or a collection of files contained in a filegroup. This allows for restoring a small part of a database to fix a problem rather than having to restore the entire database. This option, however, does introduce complexity and, in my opinion, is really only useful for large databases, particularly in the range of 500 GB or more.

These file and filegroup backup types can be used when a larger database has been divided into multiple files. For example, a large database containing sales information might be divided into various sales files, perhaps by year, or even by month if there is a high volume of sales recorded in the database. These files are then aggregated into filegroups. Below is an example of creating files and filegroups in a database, taken from the MS docs page here.

USE master;
GO
-- Create the database with the default data
-- filegroup and a log file. Specify the
-- growth increment and the max size for the
-- primary data file.
CREATE DATABASE MyDB
ON PRIMARY
  ( NAME='MyDB_Primary',
    FILENAME=
       'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\data\MyDB_Prm.mdf',
    SIZE=4MB,
    MAXSIZE=10MB,
    FILEGROWTH=1MB),

--Here is where a separate file is created and assigned to a FileGroup, which is little more than a container.
FILEGROUP MyDB_FG1
  ( NAME = 'MyDB_FG1_Dat1',
    FILENAME =
       'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\data\MyDB_FG1_1.ndf',
    SIZE = 1MB,
    MAXSIZE=10MB,
    FILEGROWTH=1MB),

--Here is where another separate file is created and assigned to a FileGroup.
  ( NAME = 'MyDB_FG1_Dat2',
    FILENAME =
       'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\data\MyDB_FG1_2.ndf',
    SIZE = 1MB,
    MAXSIZE=10MB,
    FILEGROWTH=1MB)
LOG ON
  ( NAME='MyDB_log',
    FILENAME =
       'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\data\MyDB.ldf',
    SIZE=1MB,
    MAXSIZE=10MB,
    FILEGROWTH=1MB);
GO
ALTER DATABASE MyDB 
  MODIFY FILEGROUP MyDB_FG1 DEFAULT;
GO

-- Create a table in the user-defined filegroup.
USE MyDB;
CREATE TABLE MyTable
  ( cola int PRIMARY KEY,
    colb char(8) )
ON MyDB_FG1;
GO
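With files and filegroups like these in place, a backup can target a single filegroup instead of the whole database. A sketch, using the filegroup created above (the path is an example):

BACKUP DATABASE MyDB
FILEGROUP = 'MyDB_FG1'
TO DISK = 'D:\SQLBackups\MyDB_FG1.bak'
WITH CHECKSUM, STATS = 10;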

SQL Server Transaction Log Backups

What is a transaction log backup? A transaction log backup takes a backup of SQL Server's transaction log so that data changes stored there can be recovered during the restore process. Again, the Log Sequence Number concept mentioned earlier is key. Because transactions occur in a certain order, the log backups contain that order of transactions, and transaction log backups must be restored in that order. If you try to restore a sequence of log backups in the wrong order, SQL Server will generate an error.
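A basic log backup looks like this (a sketch; the database must be in the full or bulk-logged recovery model, and a full backup must already exist to start the chain):

BACKUP LOG MyDB
TO DISK = 'D:\SQLBackups\MyDB_Log.trn'
WITH CHECKSUM, COMPRESSION, STATS = 10;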

Having these log backups allows you to do what is called a point-in-time restore, and having that option can be critical for a business. Let's suppose someone is making changes directly to one or more tables. If a mistake is made and an unexpected data change occurs, a point-in-time restore allows you as the DBA to recover the data up to the moment just before the erroneous change was made.

SQL Server Copy Only Backups

What are copy-only backups? Copy-only backups are backups that do not affect the Log Sequence Number information used by the backup chain. Why would you use this? Well, let's suppose that you are already taking full, differential, and log backups. You are asked to provide a backup file of a database to another department, or perhaps to a single developer who is working on a project. That developer, hopefully, doesn't have access to the backup directory for your backup processes. Rather than copying a set of backups to a share or other location that the developer does have access to, you can simply create a copy-only full backup that is written to a shared location. The developer can then use that backup to restore locally to her machine, and you receive the benefit of not changing the base backup for your backup chain.
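The only syntax difference is the COPY_ONLY option. Writing that backup to a shared location might look like this (a sketch with an example UNC path):

BACKUP DATABASE MyDB
TO DISK = '\\SomeFileShare\SQLBackups\MyDB_CopyOnly.bak'
WITH COPY_ONLY, CHECKSUM, COMPRESSION, STATS = 10;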

What to Do Next

  1. Examine your environment to see if you are leveraging the “Power of the Diff” by taking differential backups to help simplify your recovery process.
  2. Consider the size of your databases and determine if you have a situation that might benefit from having a database with multiple files and filegroups.

How Do You Find Good SQL Server Training Resources?

What process can you use to find SQL Server training that will be valuable? After all, not everything on the internet can or should be trusted.  If you want training on something in this constantly changing world of technology, how do you locate that training?  How do you find information that is worth spending your time on?  How much of it costs something?  How much of it is free and is the free stuff any good?

I live in the Microsoft world in general and specifically in the SQL Server universe so that is where I am going to focus.  But, to be clear, this process, and the methods I am going to talk about, will be relevant even if you are a C# Developer or some other kind of IT knowledge worker that has little or nothing to do with databases. I would be surprised if Linux and other open source software didn’t have some equivalencies to what I am going to discuss later on.

Let's suppose you've been sitting next to the SQL Server for a while and you've fixed a few things from time to time. Congratulations! You are now what is called an "Accidental DBA." You've fallen, quite by accident, into managing SQL Server in some capacity. You're sensing, though, that there's a lot more that needs to happen, and you have no idea where to start to find information. After all, when you Google "SQL Server" you get back about a million hits, and you don't know who is a reliable source of information or what you should focus on first.

  • Microsoft MVPs

Have you heard of Microsoft's Most Valuable Professional program? If not, this is a good entry point for finding quality content. A Microsoft MVP is a person who has invested significant time into making the Microsoft community a better place. They do this through three main avenues. First, MVPs may make significant contributions to coding projects. Second, they spend time helping others, usually by answering questions at places like MSDN, StackExchange.com, or SQLServerCentral.com. Third, they create content by blogging, writing articles, and publishing written materials.

So, what does this have to do with training? I'm glad you asked. Remember how you want to be able to sift through all the hits that come back when you type something like "SQL Server training" into a search engine? You want to find the good stuff, right? Well, since you want to learn about something in the Microsoft world, why not find out who the MVPs are and learn from them? Many of them have their own websites where they create content. Some of them will have special training that they have created on a variety of subjects. The blog content is going to be free; usually you'll need to fork over some money for the special training they have made.

Here is how to find Microsoft MVPs.  Go here and click the link at the top of the page for “Find an MVP.” For the SQL Server category, select Data Platform from the “Award Category” drop down. If you want to see MVPs in your area, hit the “Region” drop down and select your region to filter the results. From there, click on the name of an MVP and you will be directed to a profile page that will provide a summary of that person.

The summary page for an MVP will show where they have presented and what the presentation title was.  The profile summary will often have links to the person’s blogging locations as well, their Twitter account, and even books they have written. Want to see if someone you follow is an MVP?  Type the person’s name or part of their name into the Keyword box and click “Search”. Beyond the information on the MVP site for a particular person, sharpen your internet search skills and type in the person’s name.  See what you can find!

  • Paid Video Training

  • Pluralsight – In the interest of fairness in terms of the comparison to CBTNuggets, you should know that I've had a Pluralsight subscription of my own for years. I have only learned about CBTNuggets in the last year or so.

On Pluralsight you will see courses from SQL Server people like Paul Randal, Kimberly Tripp, Erin Stellato, and others from SQLSkills.com. People like Gail Shaw, Pinal Dave, and Mike McKeown also have SQL Server courses on Pluralsight. The platform has reasonably priced monthly and yearly plans; personal plans are $29 a month. You can download exercise files that allow you to work through the content as you watch the presenter, or later on your own. The platform also has skills tests for a variety of subjects.

A couple of my favorite features are called Paths and Channels. Paths are pre-built collections of content that Pluralsight provides for learning a topic. You can subscribe to Paths, and you can combine a pre-built Path with other content you're interested in by creating a Channel. Let's say you're interested in a group of courses that aren't part of a Path. What do you do? Don't fret – that's what Channels are for. You can build your own custom path by adding courses to a Channel that you name, and you can also add a pre-built Path from Pluralsight to your custom Channel.

Pluralsight also tracks weekly minutes spent watching videos so you can see how much time you're spending on training, and there is a history page showing what courses you've watched and how much of each course has been viewed.

  • CBTNuggets – At $59 per month, this platform is more expensive than Pluralsight. However, CBTNuggets has a few features that Pluralsight doesn't.

The CBTNuggets platform offers virtual labs that can be spun up from directly inside a course. Additionally, there are IT Certification practice exams from Kaplan as part of the monthly subscription fee.

They also have accountability coaches.  These coaches help subscribers stay on track with their learning goals and they provide additional resources for learning the material as well as learning strategies.

The SQL Server training on this platform is geared toward the Microsoft certifications and is taught by people I am less familiar with. Most of these certifications retire on Jan 31, 2021. That doesn't mean there is no value in taking these courses after that date, as I'm sure you would learn a lot from them, assuming CBTNuggets keeps them up after the certification expiration. I certainly think these courses could be helpful for someone trying to earn a Microsoft certification before Jan 31, 2021.


  • Free Video Learning

There is a growing body of great, free training on the internet. When looking for this type of learning, search for people who are Microsoft MVPs in the area you want to train in.

  • Search YouTube for MVPs or well-known people who have advanced knowledge of a topic you're interested in, and subscribe to their channels. YouTube also has playlists of pre-aggregated content on a particular topic. You can search for playlists by topic, like "SQL Server DBA Playlist," and find many playlists to save and watch later. Below is a playlist for Wise Owl Training on SQL Server Reporting Services. Notice the icon in the lower left corner? The three dashes and the + sign allow you to save the playlist to your YouTube page. The playlist will then appear on the left of the main YouTube home screen.

Here is a sample from my own YouTube playlists. I am not currently watching all of these, but they are playlists I definitely intend to watch.  When I need training on a topic, these playlists provide a handy way for me to find something quickly.

  • Find companies that offer free video training on topics of interest. For example, PragmaticWorks has free videos on a wide range of SQL Server topics. RedGate has a lot of free material on SQL Server and DevOps for the database.
  • Find websites of individuals who blog about topics of interest. Besides the blog, some of those people are likely to make videos that are free to watch. This is true of Brent Ozar and Kendra Little, for example.


  • Free Online Conferences

Free online conferences are popping up like crazy now that in-person events are being shelved because of Covid.  More and more vendors are assembling virtual conferences, many of them free.

  • EightKB – This is a brand new SQL Server internals conference. So far they have hosted one virtual event, and I attended. The content was amazing.
  • GroupBy.org – This was initially started by Brent Ozar but is now headed up by Blythe Morrow. The event is held quarterly and has excellent speakers from around the world.
  • "New Stars of Data" – This is a brand new platform launched in 2020 that provides a place for up-and-coming data experts to speak.


  • Paid Conferences

  • DevIntersection – This is a yearly conference hosted in Las Vegas. It is a broad conference, covering a wide variety of technology. There is a SQL Server specific track called SQLIntersection. Early bird registration for this event runs anywhere from $1,700 to just over $3,000, depending on how many workshops you attend.
  • PASS Summit – The virtual event costs around $1,000 this year. They are hoping to be back to in-person events in 2021.
  • SQLBits – This is a data platform conference focused on the SQL Server platform that is hosted in the UK. Current prices and currency exchange rates place the cost of a 2-day conference and 3 workshops at just over $1,000 US dollars. It is a virtual conference this year.
  • DataGrillen – This in-person conference was cancelled for 2020. It is hosted in Germany.


  • Blogs/Websites/People

  • Brent Ozar: Microsoft Certified Master for SQL Server who focuses on making SQL Server faster and more reliable (i.e. he does performance tuning)
  • Kendra Little – Microsoft Certified Master for SQL Server and now also blogs about DevOps for the Database for RedGate Software.
  • Paul Randal and friends – Paul Randal and Kimberly Tripp worked at Microsoft and are now married.  The site SQLSkills.com was founded by Kimberly and she and Paul run that site and company.
  • Itzik Ben-Gan: Itzik is a co-founder of SolidQ, a company that provides training on leveraging the T-SQL language to solve query problems.  If you want to learn T-SQL inside and out, look up this company and find resources from Itzik.
  • DBATools – Chrissy LeMaire and others have built numerous PowerShell commands into a powerful module that can be used to manage most aspects of SQL Server.
  • SQLServerCentral – I was first introduced to this site because of the forums where a lot of well-known SQL Server experts hang out and answer questions.  But, they also have a lot of training series articles on a wide variety of topics that they call their “Stairway Series”.
  • Microsoft Learn – This is Microsoft's newest iteration of its free training platform. It provides a lot of information on the newest Microsoft certifications, as well as free training paths to help users earn a certification. There are a number of data-related paths, primarily connected to learning Azure.

Reader, we haven’t even talked about books, classes that might be held at a training center in your local area, or SQLSaturdays.  There is a wide range of means for you to obtain training and help on the SQL Server platform.

  • What to do next

  1. Find at least one sub-point under each of the above main bullet points and investigate that item thoroughly.  Look up MVPs and start following one of their blogs. Investigate YouTube and save some playlists. Check out Microsoft Learn. Pick one thing from each main category above.
  2. Use those selected items to build a training plan for yourself.
  3. Put that training plan somewhere where you will see it every day. Mine is on the whiteboard behind me in my home office.


Azure SQL Boot Camp Day 3 Takeaways

Hello reader! Here we are with the much anticipated post for day 3 of the recent Azure SQL Boot Camp. Here are the previous posts for days one and two.

If you want to get started with Azure SQL, the boot camp is a great way to do it. You'll get Azure experts like Anna Hoffman and Bob Ward, who is also a long-time SQL Server expert, teaching you how to leverage the cloud for SQL Server. On day three, Anna does a couple of demos on the subject of HA that demonstrate the capabilities of the Azure platform. Other days have demos of other concepts as well.

Not only will you learn specifically about SQL Server in Azure, but this series is going to teach you other foundational concepts about Azure that are applicable no matter what you’re trying to do in Microsoft’s cloud. Things like cloud networking and the security aspect that needs to go into that networking, as well as how to give people access to resources in a secure way, are relevant for every part of Azure. With that, let’s dig into what day 3 had to offer.

Day 3 of the Azure SQL Boot Camp was all about availability.  When done right, availability solutions keep your environment up and running when things don’t go right. Here are the key takeaways for Day 3 regarding high availability:

  • Azure has you covered in terms of backup/restore.
  1. A full backup is taken weekly.
  2. A differential backup is taken every 12 hours.
  3. Log Backups are taken every 5 to 10 minutes.
  4. Backups are stored on Read Access Geo-Redundant Storage, which means there are multiple copies of backups in differing locations.  This protects you from losing the backups.
  5. Point in time restore is available.
  6. Backup integrity checks are performed, and there is even automated, random testing of those backups via automatically restoring some of them from time to time.
  7. Backup retention defaults to 7 days. You can go up to 35 days, or even use the Long Term Retention service and keep backups for as long as 10 years.
  8. With Managed Instance a copy only backup option is available.
  9. On premises backups can be used to restore to a Managed Instance as part of a migration strategy.

Let's be honest: how many of you have a backup strategy that is this solid? Some of you have databases in the full recovery model with no log backups happening. Yikes! Don't believe me? Check your backup history information in msdb with a little T-SQL, like the query below, and you can verify for yourself whether your environment is up to snuff in this regard.
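Here is one way to run that check (a sketch): it lists databases in the full recovery model whose most recent log backup is missing or more than a day old.

SELECT d.name,
       d.recovery_model_desc,
       MAX(b.backup_finish_date) AS last_log_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
    ON b.database_name = d.name
    AND b.type = 'L'  -- L = transaction log backup
WHERE d.recovery_model_desc = 'FULL'
GROUP BY d.name, d.recovery_model_desc
HAVING MAX(b.backup_finish_date) IS NULL
    OR MAX(b.backup_finish_date) < DATEADD(DAY, -1, GETDATE());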

How many of you are writing backups to the same location as the data? And, the big question is this: How many of you are testing whether you can use your backups to perform a restore? Remember, just because you have backups doesn’t mean you can restore.

If you don’t have these sorts of things in place, you might be doing a lot of mopping up after a disastrous situation. “Clean up on aisle 4!”

  • Azure has you covered when it comes to High Availability
  1. In the General Purpose tier, Azure customers get a service that behaves like a SQL Server Failover Cluster Instance.
  2. Auto failover is possible and is handled by SQL Server and the Azure Service Fabric.
  3. Tempdb lives on local SSD, while the data files are in premium storage.
  4. Backup files are geo-redundant.
  5. In the Business Critical tier, Azure customers get Availability Groups behind the scenes – set up and configured for you automatically. With this service you get three secondaries, only one of which has to harden transactions in order for the primary to commit, and one replica for read-only routing.
  6. In the Hyperscale tier, you get snapshot backups taken by Azure, up to four secondaries that can all be readable, redundant data and log files through the log service, and long-term storage on Azure Standard Storage.
  7. Services like Zone Redundant Configuration, or Availability Zones, provide additional HA services for Azure SQL DB, but not in all regions.  This method of HA is currently not available for Azure Managed Instance.
  8. Geo-replication and auto-failover groups are a central piece of HA in Azure. Geo-replication has multiple replicas and supports read scale-out. With this service offering, you do have to update connection strings after a manual failover is initiated. It is asynchronous, but you can use a system stored procedure to force synchronous commit prior to a failover. Geo-replication does not allow failing over multiple databases together and is available only for Azure SQL DB. Auto-failover groups can fail over multiple databases at once with no changes to the application connection string, support read scale-out, and are available for both Azure SQL DB and Managed Instance.

Again, let's do a gut check. How many of you are running important or business-critical SQL Servers on stand-alone servers, where your only real option for recovery if the machine has an issue is to create a new SQL Server and restore all your backups? In this scenario, that important or business-critical SQL Server doesn't have a second copy via log shipping or some sort of replication, mirroring, or an Availability Group.

Go on… think about it. List your important and business-critical servers in a Word doc. Now examine the list and ask yourself what would happen if any of those servers died and couldn't be recovered. Would some part of your business stop while you tried to build a new SQL Server and restore? If so, I think you just found an urgent action item to put at the top of your to-do list. As you determine your solution, think about whether some form of SQL Server in Azure can give you what you need.

Accelerated Database Recovery also came up, an important feature that is on by default in Azure. It is available in SQL Server 2019 on-premises, but it is not on by default there. It uses a persisted version store to house old data rows. Those old data rows, and the most recent checkpoint, are used for a faster rollback. This addresses the problem of a long rollback from a long-running transaction that was killed.

Well, that is the summary for Day 3 of the Azure SQL Boot Camp. There is more that was discussed, but remember, I am only trying to give you a summary of key takeaways. Enjoy as you dig into SQL Server in the Azure cloud!

Azure SQL Boot Camp Day 2 Key Takeaways

Hello dear reader. This is the second of a four-part series on a recent online event hosted by Microsoft on their Microsoft Developer page over on YouTube. From August 17th to August 20th, Anna Hoffman and Bob Ward hosted a live boot camp for Azure SQL. This post provides you with key takeaways from day 2 of that event. The topic for Day 2 was Azure security and performance. Below are my takeaways. If you're looking for Day 1 takeaways, look here.

I. Four methods were discussed for handling network security for Azure SQL Database.

    • The setting “Allow Access to Azure Services.” This setting allows any Azure resource to communicate with other Azure resources. So, this allows wide open communication between all your Azure resources. It is not recommended to leave this enabled.
    • Firewall Rules: This is what it sounds like – a firewall service hosted by Azure that allows you to build rules about access within, to, and from Azure services, including traffic related to on-premises resources.
    • Virtual Network Rules: These rules are a feature of the server firewall. They are applied at the server level, not the database level.
    • Private Link: This is the creation of a private endpoint for network communication. It is the most secure method.

II. Understanding RBAC is important.

RBAC stands for Role Based Access Control and is a means to provide granular controls in your Azure environment.  This feature isn’t just about SQL.  RBAC is applicable to your total Azure environment. This method of security involves assigning a role, which is just a named collection of permissions, to a security principal and making that assignment at a certain scope.

Security Principals

A security principal can be a user, group, service principal, or managed identity. A user is an individual account in Azure Active Directory. A group is the same concept as a Windows AD group: a collection of users created in Azure Active Directory. A service principal can be thought of as a user account for an application. A managed identity is an identity managed by Azure that provides access to Azure resources – it might be used, for example, to access Azure Key Vault to retrieve other secrets that are needed. Managed identities are a feature of Azure Active Directory and are included in subscriptions to Azure AD.

Remember, I said that RBAC is about assigning collections of permissions, called roles, to security principals and making that assignment at a certain scope. We've talked about security principals, so let's discuss roles.

RBAC Roles

There are four categories of roles that are applied to various resource types in Azure.  Those roles are:

  1. Owner – Has full access to all resources within a subscription. Security principals in this role can also delegate access to others.
  2. Contributor – This role allows a security principal to create and manage resources, but not to grant access to others.
  3. Reader – This role only allows view access.
  4. User Access Administrator – Manages user access to resources.

These categories of roles appear in various forms and with various names within categories of resources like Compute, Networking, Storage, Integration, Security, etc.

Scopes Within Azure

A scope is a set of resources that access applies to. Scope has a hierarchy with parent-child inheritance: grant access at the very top, and you've granted that access to everything below it. The hierarchy is Management Group > Subscription > Resource Group > Resource.

What are some examples of things RBAC can be used for?

  • Allow a person to manage virtual networks but not VMs.
  • Segment access for Development DBAs just to non-prod databases. They can be in a group with the Contributor role so they can create and manage databases within a subscription that hosts only non-prod databases.
  • Allow a person to view endpoints, but not change them in any way.

III. Authentication to Azure SQL

  • Mixed mode authentication to SQL is forced.
  • SQL auth login is required during deployment and is called “server admin.” This account is a server level principal and is effectively in the “sysadmin” role for the server and databases.
  • If you need Windows authentication then you need to use Azure Active Directory.
  • There are some syntax differences with the CREATE LOGIN scripting for managed instance and Azure SQL Database. This is most notable when Azure AD authentication is involved.
  • There are a couple of less-privileged database roles that can be used to limit the need for the server admin SQL auth login. The loginmanager role is a database-level role in the master database of the logical server; people in this role can create logins for the database server. The dbmanager role is also located in master on the logical server; this role can create and delete databases on the database server. An example of using both roles is below.
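Here is a sketch of what using those roles looks like, run in the master database of the logical server as the server admin (the login name and password are examples):

CREATE LOGIN DevTeamLogin WITH PASSWORD = 'S0meStr0ngP@55w0rdHere!';
CREATE USER DevTeamUser FROM LOGIN DevTeamLogin;
-- dbmanager members can create and delete databases on this server
ALTER ROLE dbmanager ADD MEMBER DevTeamUser;
-- loginmanager members can create logins on this server
ALTER ROLE loginmanager ADD MEMBER DevTeamUser;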

IV. Azure SQL Performance Capabilities

There is a lot that could be said about performance for Azure SQL. I am only giving you some key takeaways.

  1. Large-scale capacity in Azure is absolutely available. You get up to 128 vCores, 4 TB of memory, and 4 TB of space for a database in Azure SQL. In Hyperscale, the max database size is 100 TB.
  2. TempDB is on local SSD and the MIXED_PAGE_ALLOCATION setting is off.
  3. Only full recovery is supported so minimal logging in bulk operations is not available.
  4. MAXDOP is controllable at the database level (see the example just after this list).
  5. Optimize for ad hoc workloads, a great feature for managing plan cache bloat, is supported for SQL Managed Instance but not for SQL Database.
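For example, setting MAXDOP for a single database is a one-line change (a sketch; the value 4 is just an example):

-- Run in the target database
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;
-- Verify the setting
SELECT name, value
FROM sys.database_scoped_configurations
WHERE name = 'MAXDOP';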

V. Azure SQL Monitoring

There are a lot of options for monitoring in Azure, and the portal provides access to many of them. You get things like Azure Monitor metrics, alerts, and logs that can be reviewed. The Performance Overview and Query Performance Insight pages in the portal are provided via Query Store, which is on by default.

Extended Events are available in Azure Managed Instance and Azure SQL DB. The one thing to be aware of is that you don't have access to the underlying file system in Azure SQL DB, so the only way to have a file target for an Extended Events session is to use Azure Blob Storage.

Dynamic Management Views are available in Azure Managed Instance and SQL DB. For Managed Instance, all SQL Server DMVs are available. For Azure SQL DB, a common subset of DMVs is on hand.

Remember that for Managed Instance there is no access to the underlying operating system, so tools like perfmon are not available. Despite that, you do have access to the perfmon counters via sys.dm_os_performance_counters.
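For example, a quick check of page life expectancy through that DMV looks like this (a sketch):

SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy';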

There are some Azure-specific DMVs. sys.server_resource_stats in Azure Managed Instance returns aggregated data concerning CPU usage, IO, and storage. It can also return things like the resource SKU, meaning whether you are using the General Purpose or Business Critical tier. The counterpart to this DMV for Azure SQL Database is called sys.dm_db_resource_stats.
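A quick look at recent resource consumption for an Azure SQL Database might look like this (a sketch; the DMV returns roughly one row per 15 seconds of recent history):

SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;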

This is already a long post and there is a lot I didn’t share or summarize.  I encourage you to research performance features like Intelligent Query Processing, Automatic Plan Correction and Automatic Tuning for Azure SQL Database.


I hope these takeaways have been informative and will inspire you to watch the videos and do some of your own research.

Azure SQL Boot Camp Day 1 Key Takeaways

Recently Microsoft hosted an Azure SQL Boot Camp over on the Microsoft Developer page on YouTube.  The boot camp consisted of four days of two hour sessions describing the different features of Azure SQL.  The hosts were Anna Hoffman and Bob Ward.

This will be the first of a four part series offering you key takeaways from each of the sessions.  These takeaway posts by their nature can’t possibly include everything from each session so keep that in mind as you read.

Why Would You Use Azure SQL VMs?

Azure VMs are Infrastructure as a Service. If your on-prem SQL Servers are working just fine, why would you switch to Azure SQL VM?

  1. You’re committed to migrating at least some part of your servers to the cloud and you want an easy entry point that feels most like your on-prem SQL Servers.  Azure Virtual Machines are just what they sound like. You get a VM in Azure’s cloud. Once you install SQL Server on the VM, it looks and feels like the SQL Server of old, because it is. You still have to do some Azure networking and configure access to it. This gets you a gentle introduction to Azure without changing your whole world all at once.
  2. You are committed to migrating at least some part of your servers to the cloud and you need on-prem, native SQL Server features such as SSAS, SSRS, or SSIS. Possibly you need things like the Distributed Transaction Coordinator, Linked Servers, or FileStream. If you are using these features on-prem, and you don't want to change anything about how your environment works, then you need Azure VMs.
  3. You are committed to migrating at least some part of your servers to the cloud and you need access to the operating system. If your application using your SQL Server needs to do something at the Windows file or other OS level then the Azure VMs offering is your only option.
  4. Your databases are in the Simple recovery model and you don't want to change that. The other flavors of Azure SQL, namely Managed Instance and SQL DB, require the databases to be in the Full recovery model. Keep in mind, though, that the Simple recovery model inhibits your ability to recover from a data disaster. So, think long and hard about whether you really want to keep things on that recovery model.
  5. You have SQL Server 2008 or 2008 R2 in your environment. These versions of SQL Server are out of support on-prem. This means no more security updates of any kind, and no more non-security updates either. If you break it, you pay for it – meaning, if you call Microsoft for help, they won't help you for free. They may just tell you that you need to upgrade. But maybe you have legacy apps that are tied to one of these versions of SQL Server and you still want to migrate them to the cloud. In that case, you have to use Azure SQL VMs. Good news: in Azure you can still get free updates for SQL Server 2008 and 2008 R2.
  6. You can have automated backups and security updates handled by Azure.  With Azure handling this, your team has more time for other things.
  7. Accelerated storage performance with Azure BLOB Caching.
  8. Virtual networks to integrate with on-prem resources
  9. Azure VM built-in HA and Azure Storage built-in DR
  10. Failover Cluster Instance with Azure Premium file share.

How do you migrate to this? One way is to use Azure Site Recovery.

There is more to say about Azure VMs, but remember, I promised you only key takeaways.


What Is Azure SQL Managed Instance All About?

Azure SQL Managed Instance is the next progression toward Platform as a Service. As you progress toward Platform as a Service, more and more features are managed by Azure, leaving you with more time for other things.

Here are some takeaways.

  1. You want to use this if your company wants to move to the cloud while removing more management and administrative overhead, but you need instance scoped features like SQL Server Agent, Database Mail, CLR, Linked Servers, Service broker, etc.
  2. Management overhead related to business continuity, high availability and DR are handled by Azure for you.  The nature of these features in Azure Managed Instance are determined by things like your service tier (General Purpose or Business Critical).
  3. The General Purpose tier offers remote storage and costs the least, but has only a single instance.
  4. The Business Critical tier offers lower-latency storage and a readable secondary, but at a higher cost.

How do you migrate to this? One way is a near-zero-downtime migration using log shipping.

What is Azure SQL Database All About?

Azure SQL Database is at the far end of the continuum from IaaS to PaaS. As a PaaS offering, basically everything but the database is stripped away. There is no access to file-level structures or the OS, and both server- and instance-level configuration options are removed. So what are some characteristics of this offering?

  1. Fully managed service, meaning that the Azure service handles all SQL upgrades, patching, backups, high availability and monitoring. Thus, almost all administrative functions are handled by Azure. This provides you with the latest stable version of the SQL Server database engine.
  2. There is no ability to make configuration changes that require a server restart because Azure SQL Database does not provide access to the underlying server.
  3. Offers access to a single database with its own guaranteed amount of compute, memory, and storage resources.
  4. Elastic Pools in Azure SQL Database allow multiple single databases to be managed together, providing predictable behavior for multi-tenant scenarios.
  5. SQL Database offers General Purpose, Business Critical, and Hyperscale service tiers. Hyperscale provides storage for large databases (100 TB+).
  6. You can mix single database deployments with an elastic pool.

How do you migrate to Azure SQL Database? One way to accomplish a migration is discussed here.  You can also use the Azure Database Migration Service.

For any of these offerings, I didn't cover things like the monitoring available in the Azure portal, deployment options, or connectivity architecture.

Azure has a lot to offer.  There are caveats and usage scenarios for each type of platform within Azure.  Please be sure to check out what is and isn’t available from a SQL Server features and management perspective within each offering.

Stay tuned for part 2 of Key Takeaways from the Azure SQL Boot Camp.

What Is the Value of Mentoring?

There aren't many formal mentoring relationships these days. It seems rare for two people to enter into an arrangement where one person agrees to give of their time, energy, and resources to answer questions and assist another person. Oh sure, you do it casually when you give an assignment to someone and say, "Let me know if you need any help."

However, that’s different from a recent offer made to people in the SQL Server community. Paul Randal is once again offering his time to multiple people, inviting others to engage in discussion about non-technical subjects. To be sure, it is a unique and wonderful opportunity.

Why Look for a Mentor Now?

Blogging

I have had some career goals over the years. For a very long time I wanted to start a blog. Almost five years ago I finally started this blog. At first, I was posting fairly regularly. Yet, the posts began dropping off and are almost nonexistent. I would like to have a mentor so I could learn a process that works for me, enabling me to post at least once a week.

Presenting

Another goal I would like to reach is to start presenting. For a very long time I believed the lie that I didn’t know enough to be able to give back. Imposter syndrome was a big scary monster holding me back. I entered into the IT world back in 2008 knowing nothing about that world. I got into it because I happened to be a person who understood an important line of business application at my company. People noticed that, and I eventually began managing that software. I had no IT or Computer Science training. I still don’t have a Computer Science degree.

Recently I began to have access to some really smart people and found that I often handled SQL Server situations in ways that were similar or identical to theirs. By no means am I on their level, but I found that my methodologies, ways of thinking, and troubleshooting approaches were often the same. When I saw that, I began to really have confidence that I have something to offer other people. I have been given so much by the SQL Server community. I want to give back.

Managing a Team/ Launching a Department

My current employer has been around since the '50s. They never had a database administrator on the payroll until they hired me in April 2018. In January 2020 I hired another DBA to work with me. I've managed people before, but not in technical environments and never another DBA. I am essentially launching a department from scratch. How do I do that? I'm figuring it out, but having someone outside the situation to bounce ideas off of could be incredibly beneficial.

Consulting

For about 5 or 6 years now I have given a lot of thought to opening my own business, starting a consulting company. I have recently watched a 2 hour presentation from Brent Ozar about how he did it and how he recommends other people attempt it. But a two hour presentation at PASS doesn’t cover everything. How do you pick tools to get the thing off the ground? If you’re going to do this with friends, how does that work, or should you just go it alone? When do you incorporate your work into a legal business? How do you turn a side hustle into a real business while avoiding legal entanglements with your employer over “work product?” These are all legitimate questions that I’m still trying to figure out.

Summary

So, there you have it. I want a mentor to help me with building a high quality blogging site. I want help with presentation skills, how to launch a department and to help me explore whether I should, or how I could, launch a consulting company. The value of mentoring for me is that I can receive some help so I can get that last nudge I need to start giving back and doing it in a meaningful, consistent way.

Lessons Learned Updating to SQL Server 2017 AlwaysOn Availability Groups

In late 2019, a project that had been in progress for well over a year finally came to a conclusion. I had collaborated with a number of people to migrate a stand-alone SQL Server 2008 instance and two stand-alone SQL Server 2008 R2 instances. Each stand-alone SQL Server was migrated to a three-node Availability Group. Here are a few things I learned along the way.

Automatic Seeding

First, consider using automatic seeding if you are migrating to AlwaysOn Availability Groups. I hadn't previously worked with anything newer than SQL Server 2014, so I was unaware of this feature, which is new as of SQL Server 2016. In this case, the old SQL Servers were migrating up to SQL Server 2017, so this option was available to us. For a couple of months leading up to the migration event, there had been various tests involving the setup of the AGs. One of those tests involved restoring a full backup followed by one or more log backups on the secondary nodes. As the migration date approached, we started experimenting with automatic seeding.

The auto-seeding feature streams the database from the primary to one or more secondaries behind the scenes.  This is accomplished using the endpoints created during the Availability Group creation process. This feature can eliminate the need to create scripts to handle the restores of databases over to secondary nodes in the AG.  

There are four ways to monitor the progress of streaming the databases to the secondary nodes. First, the SQL Server Error Log records entries about the process. Second, various DMVs contain information about the status and progress of automatic seeding. Third, the backup history tables in msdb can be queried. Fourth, Extended Events can be used to monitor the progress. We chose to query DMVs during the migration.
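The DMV query we used looked roughly like this (a sketch, run on the primary):

SELECT ag.name AS availability_group,
       adc.database_name,
       seeding.current_state,
       seeding.failure_state_desc,
       seeding.start_time,
       seeding.completion_time
FROM sys.dm_hadr_automatic_seeding AS seeding
JOIN sys.availability_databases_cluster AS adc
    ON seeding.ag_db_id = adc.group_database_id
JOIN sys.availability_groups AS ag
    ON seeding.ag_id = ag.group_id
ORDER BY seeding.start_time DESC;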

What’s In a Name?

Second, ponder how to handle the scenario where connection names for applications change. As it turns out, a lot of meaning and importance is attached to names. The computer and SQL Server instance names we were dealing with were deeply ingrained in the business. The names had been around for years and were used everywhere in connection strings. If we migrated to new SQL Server instances, how would we handle that?

One option is to create an alias in DNS that points to another location. Several months before the actual migration, we changed the computer names for each of the stand alone SQL Servers. The Infrastructure team used the previous computer names as aliases that then pointed to the newly named computers. This does require a change inside SQL Server, by the way.  SQL Server is aware of the computer it is installed on, so if you do this, know that you have to make a change in SQL Server as well. 

Let me explain the concept by way of example. Let’s suppose you have a computer called MyServerABC and it has a default instance of SQL Server on it.  This means that database connections also use the name MyServerABC to talk to the databases.  That connection name might be used in a lot of applications.  So that tens, or maybe even hundreds, of connection strings don’t need to change when the SQL Server is migrated, here is what you can do. During a maintenance window, change the name of the computer and make a change to SQL Server so it knows the new computer name it is on. Here is a link to describe how to do that in SQL Server and it has a discussion of other considerations when doing this sort of thing.

Let's suppose the new computer name is MyServerXYZ. After the previous actions are taken, a new record called "MyServerABC" is made in DNS. That alias is then "pointed," or directed, to the computer name MyServerXYZ. Now, applications that use MyServerABC as a connection name can continue using it. When an application makes a call to MyServerABC, that call is simply re-routed to MyServerXYZ, and the application is none the wiser.
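The SQL Server side of the rename mentioned above is handled with sp_dropserver and sp_addserver. A sketch using the example names (the SQL Server service has to be restarted before @@SERVERNAME reflects the change):

EXEC sp_dropserver 'MyServerABC';
GO
EXEC sp_addserver 'MyServerXYZ', 'local';
GO
-- After restarting the SQL Server service, verify the new name
SELECT @@SERVERNAME;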

This exact change process worked for us early on.  Later, once we migrated the first stand alone SQL Server to an Availability Group, a person on the Infrastructure team changed the alias so that it pointed to the AG Listener instead of the old server we just migrated from.  Connections were then instantly routed to the new SQL Server 2017 AG. 

Database Migration Assistant

Third, the Database Migration Assistant is software from Microsoft that evaluates a SQL Server instance and the databases on it for readiness to migrate to a new platform. The tool can help migrate databases to Azure, but we were executing an on-prem to on-prem migration. It provides an assessment of scenarios in use in your environment that might block a successful migration, as well as information on new features to leverage on the new SQL Server.

Using this software, we were able to find several things in our environment that had to be changed prior to the migration. Without the assessment, we would have migrated only to experience failures of various kinds in the new environment.

Planning Software

Fourth, use planning software of some kind so the migration steps are known and progress can be tracked.  One of the things I chose to do was plan out each step and create an electronic project board for all the things that needed to happen the day of the migration.  I had been using Office 365’s Planner at that time, so I created my steps there.  One of the Developers converted that to Trello because that is what the Dev teams were using at the time, and the Devs were at the office with us to help with testing applications post-migration.  The Trello board was made visible on a couple of large, wall mounted TV’s for everyone to see.  This allowed everyone to know what had been done, what was being worked on and what was next.

Communication

Fifth, think carefully about how the migration team will communicate during the migration. (All in the same physical room? Company conference bridge while working remotely? Conference call with a support vendor? Remote work platform like Teams or Slack? Some combination of these things?)

If you have a big room that will hold everyone, that might be a great option. Remember, this happened back in early December 2019, pre-covid. So, having a dozen or so of your favorite co-workers all in a room made sense.

Meeting together allows for easy, fast communication by simply turning to a person in the room to discuss something or ask a question. Collaboration is smooth and convenient in that scenario. As I mentioned earlier, we leveraged a Trello board displayed on a couple of large TVs so everyone could see what had been worked on, what was currently being worked on, and what was next.

Be aware and mindful of the human tendency to talk when idle, though. If you get a dozen people in a room and only a few of them are busy at the moment, there can be a significant amount of chatter for the people who are busy to block out.

These days, now that the U.S. is experiencing an explosion of work from home due to the pandemic, you would probably set up something like Teams or Slack and use that as your collaboration platform so people can collaborate from home on migration day.  Platforms that allow for phone calls, screen sharing and quick collaboration among large groups of people would be essential in a large scale, high impact IT project. In those scenarios, the people collaborating still need to think carefully about communication so that people aren’t talking over each other while addressing the migration work. 

Well, that’s it for now.  I hope these points help you consider some things that maybe need to be included or discussed as part of your next IT project.  

Querying An AS400 Using SQL Server 2019 Data Virtualization

For those who have taken an interest in the latest SQL Server release, SQL Server 2019, you may have noticed a new feature called Data Virtualization. Perhaps you’ve seen a presentation about it, or maybe you’ve read about it in Bob Ward’s book entitled “SQL Server 2019 Revealed.”

I myself was curious about this feature and so I set out to see if I could use it to connect to an AS400 iSeries machine. Tom Willwerth, my friend at StraightPathSQL, and I worked on this process together. This post is primarily about walking you through the steps we recently took to get this configured.

Before I dive into that, why was I interested in this feature? What did I hope to gain? Well, first of all, there was definitely the motivation of wondering, "Can we get this to work?" Secondly, and more practically, the promise of SQL Server data virtualization is to make other data sources available without using a Linked Server and without the time it takes to develop an ETL process to move the data. On a related note, you can cut out the time it takes for an ETL job to actually move the data somewhere, like a data warehouse or flattened tables for reporting. Third, the PolyBase feature has a built-in engine that can provide query performance not available via a Linked Server. Fourth, I wanted to provide a way for developers to query data in the AS400 without having to learn the different syntax required by the AS400 iSeries. Fifth, query writers can also join the external table to local SQL Server data.

Now let's move forward with the how of this process. I started by installing SQL Server 2019, and in that process there are some steps to select for installing the PolyBase feature. If you're not familiar with it, PolyBase came out in SQL Server 2016 and is the foundation of data virtualization. During the install, SQL Server suggested a port range that it would use as part of configuring PolyBase. I wasn't sure whether SQL Server had verified that the suggested port range was open, so I stopped to download the SysInternals tool TCPView and ran it on the server to confirm that it was.

After the SQL Server install completed, I ran the code below in SSMS to enable the PolyBase feature.

-- Enable the PolyBase feature
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'polybase enabled', 1;
RECONFIGURE;
-- Verify the new setting
EXEC sp_configure 'polybase enabled';

The next thing I did was create a database master key. This has to exist in order to encrypt the credential that will be stored in SQL Server later on in the process.

CREATE MASTER KEY WITH ENCRYPTION BY PASSWORD = 'S0meStr0ngP@55w0rd1234xYz!'

I then set my test database to compatibility level 150 for SQL Server 2019 and made my database scoped credential, like so.

ALTER DATABASE MyTestSQLDatabase SET COMPATIBILITY_LEVEL = 150

CREATE DATABASE SCOPED CREDENTIAL MySQLCred WITH IDENTITY = 'MYAS400CRED', SECRET = 'S0m3Str0ngP@55w0rd';

The value for IDENTITY corresponds to a user in the source database, in this case, the AS400. SECRET, of course, is the password for the user name stored in IDENTITY.

I believe Tom joined the process at this point, and together we began by installing version 1.8 of the 64-bit iSeries Client Access driver for Windows onto the SQL Server instance I was testing from. The details of that are a bit beyond the scope of this post, so I won't describe the process here.

Creating an external data source was the next part of the journey. Syntax for that is here.

After a lot of wrangling, searching, and looking here for an AS400 connection string example, Tom and I arrived at this code sample that we ran successfully. Obviously I’ve changed some details to anonymize this, but you get the idea.

CREATE EXTERNAL DATA SOURCE MYAS400DEV WITH (LOCATION = 'odbc://MYAS400DEV.LeeMarkum.com', CONNECTION_OPTIONS = 'Driver={iSeries Access ODBC Driver}; System=MYAS400DEV; ServerNode=MYAS400DEV.LeeMarkum.com', PUSHDOWN = ON, CREDENTIAL = MySQLCred);

Next up we took the optional step of creating a schema called EXT for holding our external tables. Afterward, we moved on to the business of actually making our first external table. This was very much a multi-step, trial and error process.

Here is the process we eventually found that worked:

  • Get the SQL definition from the AS400 via the iSeries Navigator, minus trigger definitions, and paste it into SSMS
  • Remove any index or key definitions. Without this change SQL Server throws an error
  • Remove any defined defaults. Without this change SQL Server throws an error
  • Change the CHAR data types to NCHAR. Without this change SQL Server throws an error
  • Replace the DB2 schema name with EXT, our previously created SQL Server schema
  • Change CREATE TABLE to CREATE EXTERNAL TABLE
  • Add this at the bottom: WITH (LOCATION='MYAS400DEV.SomeSchemaName.DB2TABLENAMEHERE', DATA_SOURCE = MYAS400DEV)

This left us with code like the below:

CREATE EXTERNAL TABLE EXT.Customer 
(
    CUST#      NUMERIC(7,0) NOT NULL,
    CUSTNAME   NCHAR(30) NOT NULL,
    CUSTADDRS  NCHAR(50) NOT NULL,
    CUSTCITY   NCHAR(30) NOT NULL,
    CUSTSTATE  NCHAR(2) NOT NULL,
    CUSTZIP    NCHAR(2) NOT NULL
)
WITH (
    LOCATION = 'MYAS400DEV.SomeSchemaName.DB2TABLENAMEHERE',
    DATA_SOURCE = MYAS400DEV
);

As part of this process, we had to keep the same column names as what existed in the AS400. Otherwise, an error was thrown, and this makes sense given that we’re looking at the AS400 table in the WITH (LOCATION = '...', DATA_SOURCE = ...) part of the syntax.

After this step there is the optional step of creating statistics on columns of the virtual table. You might want to do this, for instance, on columns involved in joins, as a performance enhancement. For the purposes of this work, I skipped this step for the time being because Tom and I just wanted to see some query results from all this work! We could return later to this item to try to tweak performance.
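
For reference, a sketch of that step is below, assuming CUSTNAME is a column that would show up in joins; the statistics name and column choice are illustrative only.

-- Statistics on an external table column expected in joins or filters.
-- The column and statistics name here are placeholders.
CREATE STATISTICS stat_Customer_CUSTNAME
ON EXT.Customer (CUSTNAME) WITH FULLSCAN;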

The moment of truth had finally arrived. We loaded up a query against our EXT.Customer table in SQL Server 2019 and, to our delight, we received the expected result set back using standard T-SQL.
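
The query itself was ordinary T-SQL along these lines. The join below uses a hypothetical local table, dbo.Orders, purely to illustrate reason five from the start of this post.

-- Plain T-SQL against the external table.
SELECT TOP (10) CUST#, CUSTNAME, CUSTCITY
FROM EXT.Customer;

-- Joining external data to a hypothetical local table, dbo.Orders.
SELECT c.CUST#, c.CUSTNAME, o.OrderDate, o.OrderTotal
FROM EXT.Customer AS c
INNER JOIN dbo.Orders AS o
    ON o.CustomerNumber = c.CUST#;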

I’m not gonna lie, this was a fair amount of work and took some time to set up. However, I’m hoping this blog post, and others like it, will save you some pain and time.

SQL Server Install Guide Part 3

I’m in part 3 of a series on installing SQL Server.  In the previous post I began discussing steps to take after the install is complete.  Because there are quite a number of things to consider post-install, this part 3 will continue discussing the post-install configuration items.

Configure SQL Server Alerts

One of the things I always do after installing SQL Server is set up SQL Server Alerts.  This is a free, easy way to find out about problems on the SQL instance, like read retry errors or other potentially nasty issues.  There are a number of places to find scripts useful for this task.  The one I recommend is from Glenn Berry at SQLSkills.com.  You can find his script at this link.  One of my favorite things about Glenn’s script is that it uses well-defined alert names that let the DBA easily know which server is involved when an alert is sent.
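
To give a flavor of what such a script sets up, here is a minimal sketch of one alert with an email notification. The alert and operator names are placeholders; Glenn's script covers the full range of severities and bakes the server name into each alert name.

-- Alert on severity 19 (fatal resource errors), including the event text in the email.
EXEC msdb.dbo.sp_add_alert
    @name = N'Severity 19 Error',
    @message_id = 0,
    @severity = 19,
    @include_event_description_in = 1;

-- 'DBA Team' is a placeholder for an operator you have already created.
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Severity 19 Error',
    @operator_name = N'DBA Team',
    @notification_method = 1;  -- 1 = email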

How Should MAXDOP Be Configured?

Next up is setting MAXDOP.  The foundational advice from Microsoft concerning Max Degree of Parallelism is in KB2806535, found here.  The summary of this article is that if you have a single NUMA node and less than 8 logical processors, then keep MAXDOP at or below the number of logical processors.    If you have multiple NUMA nodes and less than 8 logical processors per NUMA node then keep MAXDOP at or below the number of logical processors per NUMA node.   If there are more than 8 logical processors per NUMA node, keep MAXDOP at 8.
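
For instance, on a server with a single NUMA node and more than 8 logical processors, capping MAXDOP at 8 looks like this:

-- MAXDOP is an advanced option, so advanced options must be visible first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'max degree of parallelism', 8;
RECONFIGURE;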

So, how do you tell how many NUMA nodes there are on the machine?  I’m glad you asked!  There are several ways to determine this information.  I will refer the reader to a post by Denny Cherry on the various methods.
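
If you just want a quick look from T-SQL, one approach (a sketch, not the only method) is to query sys.dm_os_nodes; each ONLINE row generally corresponds to a NUMA node, and the filter excludes the node reserved for the dedicated admin connection:

-- Each ONLINE row here generally represents one NUMA node.
SELECT node_id, node_state_desc, online_scheduler_count
FROM sys.dm_os_nodes
WHERE node_state_desc = N'ONLINE';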

Setting Min/Max Server Memory

MIN/MAX server memory should be reviewed as part of your post-install configuration items.  The MIN Server Memory setting is the floor for the amount of memory SQL Server will hold on to.  The trick to understanding this is that SQL Server does not automatically grab that minimum amount of memory upon start up.  The memory used by SQL Server increases gradually after start up, assuming a steady workload on the instance.  Once usage crosses the MIN Server Memory value, SQL Server does not give that memory back to the OS.

Setting the MAX Server Memory setting will prevent SQL Server from taking so much memory that it starves the OS.  I typically leave 4-6 GB of RAM free to the OS and have not had problems with memory pressure using that guideline.

I will say though, that the machines I work with are dedicated to SQL Server, and almost always have just the database engine installed on them.  If you have other SQL Server components on this same box, or application components from one or more applications then 4-6 GB left to non database engine components probably is not enough.  Definitely monitor the Available Megabytes Perfmon counter over time so you can see how much free memory the machine has and adjust the Max Server Memory setting for SQL Server as needed.
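
As an illustration only: on a 64 GB server dedicated to the database engine, leaving 6 GB to the OS would cap SQL Server at 58 GB. The MIN value below is likewise just an example; size both for your own hardware and adjust based on monitoring.

-- 58 GB = 59392 MB left to SQL Server; roughly 6 GB remains for the OS.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'min server memory (MB)', 8192;
EXEC sp_configure 'max server memory (MB)', 59392;
RECONFIGURE;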

To keep this series to a reasonable length, I’m simply going to list the next few items I have on my own personal Install Guide that I use.  The series wasn’t meant to be an exhaustive explanation of everything one would put on an install guide anyway.

The guide I use is in Excel format and I go through each item on it as I work through the process.  I mark off what I’ve done and then I save the guide.  This helps me to ensure that I’ve covered all the steps and it gives me and other people a record of what I did during the process.  Some of these steps can be entire blog posts on their own, or even an entire series written on that one step, so summarizing them here and expanding on them in later posts will be my approach.  Ok, so here’s the remaining list.

  1. Ensure Windows power plan is on high performance
  2. Configure tempdb appropriately. (Google this and do research.  Here are a couple of starting points.  See this and this.)
  3. Set up a DBA management database. (This is a database on each instance that has key procedures and objects, like sp_WhoIsActive, sp_Blitz, and any other custom scripts you like to use for troubleshooting or diagnostics)
  4. Enable Query Store (SQL Server 2016 and higher; see the example after this list)
  5. Update compatibility level of databases to newest level. (This assumes that the new cardinality estimator was tested in a Dev and test environment prior to this migration or new install)
  6. Make sure database backups are occurring as expected.
  7. Consider enabling automatic plan correction (SQL Server 2017 and higher; also shown in the example after this list)
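
For items 4 and 7, the basic syntax is sketched below against a hypothetical database named MyDatabase; Query Store in particular has several options worth researching beyond the defaults.

-- Item 4: enable Query Store (SQL Server 2016 and higher).
ALTER DATABASE MyDatabase SET QUERY_STORE = ON;

-- Item 7: enable automatic plan correction (SQL Server 2017 and higher).
-- This forces the last known good plan when a plan regression is detected.
ALTER DATABASE MyDatabase SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);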

So, what are some of your steps for installing and configuring SQL Server?

SQL Server Install Guide Part 2

In part 1 of this series, I discussed pre-installation steps.  There are many guides available online for the actual installation process so I’m not going to give a step by step installation guide here.  Rather, I’m going to focus on what to think about after the install is complete.

Recent versions of SQL Server will go out and download additional content for you before the install actually happens, so you’re not left on the base RTM release of the product.  However, one thing to consider post-install is still: are you on the latest patch?  How do you know?  Where can you go to compare the version number of the SQL Server you installed to what is currently available?  Well, you can look back at this post to answer that question.
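
For a quick check of where you landed, the instance will tell you its own build and patch level, which you can then compare against the current build lists:

-- Build number, patch level (RTM/SPn), and edition of the running instance.
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;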

Once you’ve determined whether or not additional patching is needed and completed that, then what?  Well, there is a lot of post install configuration still to do.  For example, do you need to migrate Linked Servers, SQL Agent jobs, SQL Operators?  What about setting up database mail or moving Logins?  Each one of these can take up a lot of time.  What if there was a quick way to handle those things?  Enter DBATools.

DBATools is a fantastic collection of hundreds of PowerShell commands that have been developed and tested by users worldwide.  With a simple command, linked servers can be migrated with their password information intact. The same goes for SQL Server logins and their passwords and permissions.  Database mail? No problem!  In fact, if you call Start-DBAMigration and pass a source and target SQL instance, you can migrate the entire instance in a single command.  DBATools is fantastic for ongoing maintenance tasks and a bunch of other things too, so be sure to check out all that it can do for you as a DBA.

The next item on my install guide is trace flags.  Trace flags control the behavior of SQL Server in a variety of ways.  As one example, trace flag 3226 controls whether successful backups are logged in the SQL Server error log.  Setting this flag prevents those “backup successful” messages from cluttering up your error log.

The important thing to remember if you’re doing a migration is that you most likely want the new SQL Server to behave in the same way as the old one.  To find out what trace flags are globally enabled on a SQL Server, run the following code.

DBCC TRACESTATUS(-1)
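
If the old server had a flag such as 3226 enabled globally and the new one should match, you can turn it on for the running instance as shown below; to persist it across service restarts, add a -T3226 startup parameter instead.

-- Enable trace flag 3226 globally (-1) for the running instance.
DBCC TRACEON (3226, -1);

-- Confirm it now appears in the global trace flag list.
DBCC TRACESTATUS(-1);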

If you want a list of trace flags and what they do, those lists do exist but they are sometimes hard to find.  Here is one such list that tells you whether you should be concerned about a particular trace flag being enabled or not.

If you’re starting from scratch with the database and application you’re installing, be sure to find out if the vendor has any recommendations and why they recommend those particular trace flags.  Also, do your own research about trace flag functionality and see why you personally may or may not want to use a particular flag.