What Are SQL Server Logins and Users

You’re a data professional learning about managing SQL Server and you’ve been asked to grant permissions for SQL Server to an individual or a group of individuals. What do you need to understand in order to accomplish this? I’ll be your guide to getting started with handling access to SQL Server.

Let’s start with an analogy.

SQL Server Logins and users

Keys and fobs are commonplace these days for vehicles. Some fobs tie into the ignition and allow push-button start, so no key is needed, but that's not always the case. Many people still have a key and fob combination on their key ring, and for our purposes we'll assume that's what you have.

Your fob lets you access your vehicle's doors. The fob gives you what I'll call an initial, top layer of access to your vehicle. With it you can unlock the doors and get inside. You can sit in the seats, open the glove compartment, and use other surface-level features of the car, but you can't do much else.

Your key lets you start the car, giving you access to the engine, radio, power seats, and other features of the vehicle. With the key in the ignition, you can take your car for a spin.

What is a SQL Server login?

A SQL Server login is much like the fob for your car. By itself, with no other permissions applied, a login gives its holder access to the initial, surface layer of SQL Server: it allows a connection to an installation of SQL Server on a computer. A login can be either a Windows-authenticated account or a SQL login. The former relies on the Windows operating system to authenticate, while the latter uses only information stored within SQL Server itself to verify the ability to connect to the instance.
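As a quick illustration, here is a minimal sketch of creating each kind of login in T-SQL. The domain, group name, login name, and password are placeholders for illustration only, not recommendations.

-- Windows-authenticated login for a hypothetical domain group
CREATE LOGIN [CONTOSO\AccountingTeam] FROM WINDOWS;

-- SQL Server-authenticated login; SQL Server stores and checks the password itself
CREATE LOGIN AccountingApp WITH PASSWORD = 'Us3AStr0ngPasswordHere!';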

SQL Server Logins

SQL Server Fixed Roles

Roles are sets of permissions that can be assigned to a security principal, such as a SQL Server login. These permission sets grant certain abilities to the logins that have been assigned those roles. The fixed server roles are listed below, and a short example of assigning one follows the list.

  • bulkadmin: Allows for running the BULK INSERT statement
  • dbcreator: Allows for creating, altering, dropping or restoring a database. A database created by a login in this role is owned by that login, which effectively gives the login db_owner rights in that database.
  • diskadmin: Allows for the management of backup devices
  • processadmin: Allows for use of the KILL command to end session connections
  • public: Grants access to certain metadata system catalog views and is applied to all logins by default. As a best practice, never change the permissions of the public server role, because every connecting login would then receive those permissions.
  • securityadmin: Allows for managing access to the SQL Server instance. Members manage logins and their associated permissions, such as GRANT, DENY and REVOKE.
  • serveradmin: Allows for changing server-wide options, shutting down the server, clearing the SQL Server procedure cache, and a few other tasks.
  • setupadmin: Allows for adding and removing Linked Servers using T-SQL
  • sysadmin: Allows for full control of the SQL Server. Logins in this role can do anything.
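Assigning a login to one of these fixed server roles is a single statement. Here is a minimal sketch using the hypothetical login from earlier; sp_helpsrvrolemember lets you review the membership afterward.

-- Add a login to the dbcreator fixed server role
ALTER SERVER ROLE dbcreator ADD MEMBER [CONTOSO\AccountingTeam];

-- Review the current members of that role
EXEC sp_helpsrvrolemember 'dbcreator';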

What is a database user?

A database user object is much like the key to your car. The database user (the key) is a database-level authentication object that is typically mapped to a server-level login (the fob). The user object is what provides access to a database; a single login can be mapped to users in one or more databases.
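Here is a minimal sketch of creating a user mapped to an existing login; the database and login names are hypothetical.

USE AccountingDB; -- hypothetical database
GO
-- Create a database user mapped to the server-level login
CREATE USER AccountingTeam FOR LOGIN [CONTOSO\AccountingTeam];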

SQL Server Database User

SQL Server Database Fixed Roles

  • db_accessadmin: Allows for adding or removing access to a database
  • db_backupoperator: Allows for backing up the database
  • db_datareader: Allows for reading the data in a database
  • db_datawriter: Allows for using insert, update or delete in a database
  • db_ddladmin: Allows for creating, altering or dropping (deleting) database objects
  • db_denydatareader: Denies the ability to read (SELECT) data.
  • db_denydatawriter: Denies the ability to write (add, update or delete) data.
  • db_owner: Allows for full control of all database objects and data.
  • db_securityadmin: Allows for managing access to a database. Has control over database users and their permissions.
  • public: A role granted to all users in a database by default. It is best practice not to alter the permissions of this role.
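Adding a user to a fixed database role follows the same pattern as the server roles. A minimal sketch, again with hypothetical names:

USE AccountingDB; -- hypothetical database
GO
-- Grant read-only access by adding the user to db_datareader
ALTER ROLE db_datareader ADD MEMBER AccountingTeam;

-- Review the current members of that database role
EXEC sp_helprolemember 'db_datareader';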

 

The login (fob) allows for connecting to SQL Server and in turn is almost always associated with a database user (the key). That user allows connection to one or more databases. The server roles grant permissions for group members to do things with the SQL Server installation. The database roles grant permissions to members to do things with one or more databases.

Using Windows Active Directory to Manage SQL Server Access

To make managing SQL Server easier over the long run, use Windows Active Directory groups together with the appropriate server and database fixed roles whenever possible. For example, if you need to grant read-only access to the accounting database for the accounting department, or even just a couple of people in Accounting, either ask the SysAdmins which Windows groups exist for the Accounting department and who is in them, or look that information up yourself in Active Directory Users and Computers. Then add those Windows groups to SQL Server as logins and assign permissions. For more on that, see this post about the topic.

 

Next Steps to Take

  1. See this for MS Docs on how to create a login.
  2. See this for MS Docs on how to create a database user.
  3. If you have questions or comments, leave me a comment on this post, or message me on Twitter

Building a SQL environment inventory with the MAP Toolkit

 

The MAP Toolkit, also called the Microsoft Assessment and Planning Toolkit, is a standard way to obtain information about your Microsoft environment, including your SQL Servers. The tool is so common that a link to it is included in the SQL Server installation media. It will scan your network and provide a fairly comprehensive report on the SQL Servers it finds. The direct link to the MAP Toolkit download is located here. Be sure to download all the files from that page.

MAP Toolkit link in the SQL Server installer

As a DBA new on the job, the MAP Toolkit can be a lifesaver. You have no idea how many SQL Servers there are, what versions and editions exist, or where they are installed. Sure, you asked these questions during the interview, but unless you were talking to a DBA team that has been there a while, you likely didn't get correct answers. You see, most places don't really know how many SQL Servers they have. I'll be your guide to getting that first SQL Server environment inventory up and running so you know what you've inherited.

Why Use the MAP Toolkit?

First, using the MAP Toolkit automates the process of gathering important SQL Server environment information. Even if you only have a few SQL Servers, the tool will save you time versus manually connecting to each SQL instance to collect the same data.

Second, there are almost always more SQL Servers in an environment than people realize and this application will locate them for you.

Third, the MAP Toolkit will help you find computers outside of production where Standard or Enterprise Edition is installed. Standard and Enterprise editions must be licensed even when they are installed on employee laptops or PCs, and I've found plenty of employee computers running Enterprise edition SQL Server! Non-production servers should not have Standard or Enterprise edition installed for the same reason. In non-production, use Developer edition, as it's free.

Getting Started with the MAP Toolkit

A full tutorial on using the MAP Toolkit is available here on Microsoft Learn. Installation is fairly straightforward and is also covered in the tutorial from Microsoft that I linked to above. You can install it on your own desktop and then scan Active Directory for SQL Servers. Please be sure to notify your System Administrators BEFORE you run the scan, as it can set off intrusion detection alarms.

Here is a screen capture of what some of the inventory methods are.

MAP Toolkit – Scan Options

 

Generating Excel Reports from the MAP Toolkit

Once the SQL Server scan is complete, you will see a screen like this one.

MAP Toolkit – Post Collection Screen

Click anywhere in the box under “SQL Server Discovery” and you will see a screen like the one below. Click each Excel icon in the Options section to generate an Excel file for the data that was captured.

MAP Toolkit – Generate Excel Output

 

The SqlServerAssessment file will contain the list of SQL Server instances the tool found, the computers those instances are on, the product version name, the Service Pack number and the edition. The data will also contain the OS version of the computer and some general hardware information. Importing at least these columns into a SQL Server database using the Import/Export Wizard is your first step toward understanding what your SQL Server environment looks like.

If you want to see what the assessment files might look like before you run your first scan, be sure to extract the file “MAPSample_Documents.zip” and locate the SQL Server samples. These will give you an idea of the kinds of information you can expect from the tool.

Why Import This Data to a SQL Server?

Importing this data into SQL Server gives you at least four advantages over keeping it only in an Excel file.

  1. The data recovery abilities of a SQL Server database are superior to trying to recover an Excel file that has been deleted, which is not always possible.
  2. The data is accessible to other data professionals who have proper permissions on the SQL Server. This makes it easy for other people on the data team to review, analyze and maintain the data.
  3. Having the data in a database also allows you to query it for a variety of things, such as how many instances of each product name (SQL Server 2005, 2008…2017, 2019) you have; see the sample query after this list.
  4. As you maintain this data over time, you can see how the environment changes as you upgrade older SQL Server products to newer ones, for example. This can be great information to have when it comes to updating your resume or when approaching review time.
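As an example of point 3, once the assessment data has been imported, a simple aggregate shows how many instances of each product you have. This is a minimal sketch that assumes a table named dbo.SqlInventory with a ProductName column; adjust the names to match whatever the Import/Export Wizard created for you.

-- Count discovered instances per product name in a hypothetical inventory table
SELECT ProductName,
    COUNT(*) AS InstanceCount
FROM dbo.SqlInventory
GROUP BY ProductName
ORDER BY InstanceCount DESC;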

There is certainly other useful information in the various tabs of both spreadsheets created by the reports. I strongly encourage you to explore that data to learn what you can about your SQL Server environment.

 

Next Steps To Take

  1. Now that you've seen what the MAP Toolkit can do for you, download it and run your first scan. Then import the results into SQL Server. You'll likely be surprised at what it turns up.
  2. Analyze the scan results to see how many SQL Servers you need to upgrade to get onto a supported version of SQL Server. Upgrading to a modern version of SQL Server can be a key goal for you as a DBA.
  3. If you have questions or comments, leave me a comment on this post, or message me on Twitter

6 Ways to Advance Your IT Career

The “Great Resignation” is upon us, or so the internet tells us. People want better working conditions, better wages, just… better. Large swaths of people are trying to advance their careers. Here are 6 ways to advance your IT career.

  1. Accept that long term loyalty to a small to mid-sized company is likely keeping your salary low and limiting opportunities.
  2. Use the wizards at salary.com to plug in the job titles you want and discover what you should be making.
  3. Focus on building technical skills, especially early in your career
  4. Work on your interviewing skills
  5. Start blogging
  6. Start presenting

How to Advance Your IT Career: Here is How I did It

  1. I realized that long term loyalty to my small to mid-sized company was keeping my salary low.

Career growth opportunities were limited. The company I was in when I launched my IT career probably had 200-300 people in their corporate office and there were 8-12 local offices that had maybe 5-10 people working out of each of them. So, less than 500 employees. We weren’t big.

There were some opportunities for growing your career there, but those opportunities to be promoted to a new or different role came around once every 3-5 years, and sometimes less frequently than that. There was maybe one more title change in sight so upward mobility was now limited.
I had been in relatively the same role for 4 years. Annual increases were  my primary means  of increasing my salary.

Also, in my scenario, it didn’t seem like we were exactly swimming in money. These two things combined to keep my career opportunities limited and my wages below average.

“I needed to change jobs if I was going to advance.”

 

I had been at this company about 9 years already before I started looking for other work. Loyalty wasn’t going to help my career. I needed to change jobs if I was going to advance.

  2. I used the wizards at salary.com to plug in the job titles I wanted and discover what I should be making.

 

Salary.com – What Am I Worth?

You can use the site to compare different job titles to one another in terms of the experience generally required, job descriptions, and other details. Below is some data for a Database Administrator in Austin, TX.

Sample DBA Salary Information - Austin TX

Title   | Avg Years Experience | Low End Salary (Yearly) | High End Salary (Yearly) | Median Salary (Yearly)
DBA I   | 0-2                  | $52,000                 | $87,000                  | $67,000
DBA II  | 2-4                  | $71,000                 | $118,000                 | $94,000
DBA III | 4-6                  | $95,000                 | $140,000                 | $118,000
DBA IV  | 7+                   | $109,000                | $152,000                 | $131,000

Once I had revelation #1 above and saw how much I was below the average salary, I was fairly motivated to make an employment change. That first job change resulted in a 10% increase in my salary!

  3. I spent the better part of 2 years building technical skills.

Classes at a university training center

There were two parts to this approach. First, I took IT classes at a local education and training center. To be clear, these were NOT boot camps. In my case I was focused on database technology, but there were, and still are, certificates for programming and web development. These were generally 1-2 day classes offered through my state university, and they cost anywhere from $300-$600. My employer reimbursed the cost of the classes.

They involved lectures on IT topics paired with in-class labs where we would “do the stuff” that was just discussed. By doing this, I took enough courses to earn a Database Technology certificate. I could be trained and gain experience on database technology rapidly in a single 1 or 2 day course.

Regular self study

Second, I would typically spend 1.5-2 hours on self-study every night after work, Monday through Thursday. On Saturday and Sunday I would usually study 3-4 hours each day.

I did both of these approaches for the better part of two years. This all culminated in my first Microsoft Certification in May 2013. To be clear, the goal in all of this was to learn new skills. Later, I focused on the certification as a means of learning specific new skills and knowledge that would help me earn the certification. My thought was that, alongside my resume, the certification would help demonstrate knowledge and possibly help open a few doors early on in my career.

Because of the classes and the extra work after hours, I was able to handle more and more complicated scenarios at my job. That was, after all, the real goal.

  4. I worked on my interviewing skills.
Confident Interviewing
Photo by Sora Shimazaki from Pexels

Reading blog posts and watching videos about interviewing became a regular habit. I created a Word doc of questions I wanted to ask potential employers during an interview. The interview was as much for me as it was for the employer. Interviewing the potential employer to find out what their IT environment is like, what the IT team is like, and what the company mindset is about certain things is a critical part of assessing the employer.

Now that I’ve been in IT officially since 2010 and officially a DBA since 2014,  here are the next steps I’m taking to advance my career right now.

5. I started blogging.

Once you have even a year or two of experience, blogging can be a great way to grow your career. Providing good technical information on how to solve a problem not only creates a reference you can come back to later, it also forces you to learn more about the topic so that you can explain it to others. Blogging is therefore a great learning tool for you as well as an opportunity to get your name out into your technical community.

As you consistently post material, you can become known as the “go to” person for technology “x”. Then, when people want to know about technology “x” they find and read your blog posts and refer other people to your blog. This sort of activity can open doors for you professionally.

For example, Anthony Nocentino is well known for talking about Kubernetes. Brent Ozar and Erik Darling are synonymous with performance tuning. Hugo Kornelis is known for his instruction on SQL Server execution plans. Erin Stellato is known for many things, but most recently she has focused on the use of Query Store in SQL Server. SQL Server statistics are strongly associated with Kimberly Tripp.

You can get started blogging for free using WordPress or Weebly. There are probably other free options as well.

6. I started presenting.

public speaking advances your career
Photo by mentatdgt from Pexels

Back in July of 2020 I gave my first technical presentation outside the walls of my company. I am a co-organizer for a SQL Server Meetup. A speaker for one of our upcoming sessions had a conflict and had to cancel. There was about a week to find a speaker, so I decided that speaker would be me.

I put together a presentation called “Modern SQL Server Features That Make Life Better.” I presented it to my own user group and one other user group. Then I was accepted as a speaker to New Stars of Data. That event has opened up two more opportunities for me and I’m pursuing other opportunities through the features at Sessionize. They have a feature called “Discover Events” that allows you to see upcoming speaking opportunities.

Much like blogging, presenting requires you to learn something fairly well so that you can confidently present the information. As a result, you build your reputation as an authority on your topic. Presenting also gets your face and name in front of people in a way that even blogging doesn't. When presenting, people can ask you questions in real time about your topic and get to know you a bit as they listen to you or watch you present. They get a real-life sense of who you are and a feel for your level of expertise, and you can build a good reputation for yourself, all while helping other people learn.

Next Steps To Take

  1. Figure out which of the 6 steps you need to take to move your career forward and do it.
  2. Leave me a comment on this post to share which step you’re going to take or message me on Twitter with your plan. Sharing your next step with others reinforces your commitment.

Error Messages 8114 or 22122 When Performing Change Tracking Cleanup

 

You’re a data professional and you’re trying to keep up with patching a wide range of SQL Server versions and editions. How do you know what’s in the CU and whether you should apply it or not? My favorite way to read up on CUs is to go to SQLServerUpdates and click around there.  It will take you to the latest CU pages as well as give you a way to see previous CUs that are available.

While doing this recently, I discovered this for CU 26 on SQL Server 2017.

Known Issues SQL Server 2017 CU 26

How to Check If Change Tracking Is Enabled

SQL Server 2017 CU26 has a known issue with Change Tracking. Before you apply that CU, be sure to check that you're not using Change Tracking. How would you do that if you have a larger environment and don't immediately know whether the feature is in use anywhere?

You could connect to your Central Management Server in SQL Server Management Studio and then run the below to return the SQL Server instance and database where the Change Tracking Feature is enabled. In this case, we don’t care what tables are involved in the Change Tracking, so there is no need to look at sys.change_tracking_tables. We only need to know that Change Tracking is enabled. This is enough to know that we shouldn’t update to SQL Server 2017 CU26 for that SQL instance.

SELECT db_name(database_id) AS DBname
FROM sys.change_tracking_databases;

You might see this:

Msg 8114, Level 16, State 1, Procedure sp_add_ct_history, Line LineNumber

Error converting data type numeric to int.

Or this error:

DateTime spidNum     Error: 22122, Severity: 16, State: 1.

DateTime spidNum     Change Tracking autocleanup failed on side table of “table_name”. If the failure persists, use sp_flush_CT_internal_table_on_demand to clean up expired records from its side table.

The solution is to either stay on CU25 or lower, or go up to CU27 for SQL Server 2017. Microsoft specifically says on this page that CU27 has a fix. You could also uninstall CU26 for SQL Server 2017; you might need to do this if you're already on that version and can't update to CU27 for some reason. To do this, go to Control Panel > Programs and Features > View installed updates.
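If you're not sure which update level an instance is on, SERVERPROPERTY can tell you. This is a minimal sketch; ProductUpdateLevel is only populated on newer builds and may return NULL on older ones.

-- Check the current version, service pack level, and CU level of the instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,          -- e.g. 14.0.xxxx.x
    SERVERPROPERTY('ProductLevel') AS ProductLevel,                 -- e.g. RTM, SP3
    SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel;     -- e.g. CU26 (NULL on older builds)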

If you’re on SQL Server 2016 SP3 and experiencing this problem, there is a hotfix here listed as KB5006943.

 

 

Thoughts On Pass Data Community Summit 2021

 

Pass Data Community Summit was November 10-12 and was hosted and sponsored by RedGate along with other supporting sponsors like Microsoft and smaller companies like MinionWare. Having been to last year’s virtual Summit, I think this one was done much better.

Pre-conference Sessions

There was a big collection of pre-conference sessions. I attended “AWS for SQL Server Professionals” because my current employer is entirely in AWS. Erin Stellato had an all-day session on Query Store that I attended as well. For a shorter, one-hour version of that presentation, click here to go to a YouTube presentation by Erin at the Saint Charles SQL Server Meetup. I learned a lot in both all-day sessions and I plan on attending pre-conference sessions next time as well.

Thoughts on Pass Data Community Summit

Things I liked about Pass Data Community Summit:

  • The main three day Summit was free.
  • Pre-conference sessions on Monday and Tuesday were accessible due to the low price for the sessions. Sessions were $200.00. I like that the speakers were getting paid for their sessions. This might always be the case. I don’t know because this is the first time I had paid for and scheduled any pre-conference sessions. Some speakers donated the money to a charitable organization.
  • Spatial chat app: This app was intriguing because it would drop you into a virtual room that had the appearance of an actual room with tables, chairs, a water cooler, etc. For people reading this who have played the Sims, the virtual lobby was very Sim-like. The application displayed a circular icon with the attendee's face in it, and video/audio could be turned on and off. Much like in a real building, the further your circle was from other people, the lower the volume. If you couldn't hear someone well, or just saw someone across the room that you wanted to chat with, you could move across the room by dragging your circle over next to them. There were a number of rooms available in addition to the lobby.
  • The session catalog: The catalog had a number of ways to find content. You could filter by day, day/time, type of session and even search for content by a speaker’s name.
  • Brent Ozar’s keynote on Friday morning. He gave a great presentation on 5 ways that the cloud impacts your career as a data professional.

Things that could have been better about Pass Data Community Summit:

  • There was widespread confusion about the 30-minute time slots on the schedule Wednesday through Friday and the need to watch pre-recorded content. As a solution, perhaps when someone is building their schedule, a pop-up on the session catalog screen could clearly tell attendees that they need to watch the pre-recorded video before the Q&A session they are scheduling.
  • Pre-recorded content that had to be watched ahead of time meant I had to spend extra time watching presentations prior to Q&A sessions. My schedule didn’t allow for a lot of that and so I opted for mostly live sessions. This also necessitated a lot of schedule changes in the My Schedule page.
  • In the Q&A sessions I did attend, which I believe I made it to 3 or 4, the attendees didn’t come with questions in hand so getting a conversation going was often slow to achieve. Maybe next year there could be a mechanism provided that would allow questions to be submitted ahead of time. These questions would go to presenters prior to their Q&A sessions.

 

Next Steps to Take

  1. Watch the RedGate website for additional announcements about the next Summit. They already announced that next year’s Summit will be a hybrid event.
  2. Start presenting. You don’t have to be an expert to explain how you solved a recent problem. You just describe your problem, solutions you tried and what ended up working. Someone has either had that problem and couldn’t solve it, or they will like your solution better, or someone may offer an enhancement to your solution to make it even better. Who knows, maybe we’ll see you presenting at a Pass Data Community Summit in the future!

 

 

6 Steps to Build a Training Plan

 

As an athlete, there is a certain amount of preparation that goes into being selected for a team, competing for a starting role, and then being competitive with the opponent. For tennis players to be competitive and reach the finals at Wimbledon, it takes hours and hours of training, practice, and the discipline to eat well before ever stepping on the court in a tournament. Making a basketball team takes hours of practice by yourself on a court and hours competing with others, including your teammates. Making the starting lineup takes hard work outside of game day. A certain hunger and desire to grow and be more than you currently are is important as an athlete.


Photo by Markus Spiske from Pexels

In his retirement speech in 1993, Michael Jordan had this to say, “When I lose the sense of motivation and the sense to prove something as a basketball player, it’s time for me to move away from the game of basketball…. I went through all the stages of getting myself prepared for the next year, and the desire wasn’t there.”

Tom Brady is arguably the best quarterback of all time. He has accomplished more than basically every other quarterback, but he’s still hungry for more accomplishments and he still wants to do the necessary work that makes it possible to compete at his best.

If a successful athlete stops training outside of game day, their game eventually begins to slip. You can’t perform at a high level on the tennis court, basketball court, football field, ice rink, etc. , if you aren’t putting in the preparation time outside of game day.

So, what does this have to do with you as an IT worker? You can no more perform well on “game day” at your job without adequate and continuous preparation than great athletes can play well if they stop doing the hard work of game prep.

Take a Personal Assessment

Let me ask you some  questions.

  1. Are you still hungry to learn technology?
  2. Are you putting in the work necessary in order to be the best technologist you can be?
  3. When was the last time you spent time during the week to learn something new?
  4. Are you regularly talking about how you’re not good at certain things, but then also not putting in effort to improve?
  5. Do tasks regularly take you extra time at work compared to others in your same role?

If your answers to the above questions indicate that you might not be putting in the work necessary to stay current in your area of technology, then maybe you need to ask yourself why. Maybe, you’re coasting and you have been for awhile. Why? Maybe you’ve had a very rough time since the beginning of 2020 because of the pandemic. Ok. I completely get it. What are some things getting in the way of building a training plan that you can execute?

Burn Out will stop “Game Day Prep”

Burnout Prevents You From Building a Training Plan
Burnout – pexels-pixabay-42230

Maybe you’re burned out and need to recharge. That is very common. The last 18 months or so have been especially hard on most people. I usually see at least 5-7 tweets a week about this topic. I have a post about this as related to myself. Burnout is real and it’s hard to go through and hard to recover from. It takes time to deal with burnout and may likely involve finding others who will support you. Also, realize that no one should expect you to be 100% on your game every day, all the time. That’s just not reality.

Athletes have a lot of people around them to help make them successful. They have coaches, trainers, medical personnel, executives, family members and teammates to help keep them going. There is no reason to think that you don't need a support network too. Possibly find yourself a therapist to talk to, and find people to encourage you and help you. Try switching up your routine so you can take care of yourself better physically and mentally. Right the ship, so to speak. Then, get back to prepping for “game day.”

 

Long Term Coasting Will Stop “Game Day Prep”

Perhaps you've recently been in a period of extra after-hours learning because an important project required new skills. Maybe you were gearing up to pass a certification. Good for you. After that you took a break. Ok. That's reasonable. Be cautious, though, about your “little break” from game day prep turning into long-term coasting where you don't put in additional effort to learn and grow. If 3, 6, 9 months or more goes by and you're still coasting, ask yourself why, and what you might want to do about that.

 

Other Priorities Will Stop “Game Day Prep”

Everyone has priorities, whether we are aware of them or not. We all make choices and those choices prioritize our time. If you’re spending endless hours in front of the TV, the XBox or the PS3, then you’re prioritizing that activity over others. Perhaps you like to host social events with friends and family over good barbecue and a game of Frisbee in the back yard. Possibly you’re taking care of a sick family member. Certainly no one will fault you for a little down time via entertaining yourself with games or a movie. I certainly hope you are making time for friends and honoring your commitment to your family. I would just encourage you to take stock of whether other, unfruitful priorities are keeping you from being the technologist you could be.

 

6 Steps to Build a Training Plan and Execute On It

  1. Open up your favorite text editor and make a list of the top 3 things you want or need to learn in the next 6 months. You likely know what they are.
  2. Do an internet search for links that explain those 3 topics and add those links to their corresponding topics in the text editor. Be sure to have at least 3 links per topic you want to learn.
  3. Save the file to your desktop so you see it every day as a reminder to work on your skills.
  4. If you have a whiteboard, write your plan on the whiteboard so you see it every day.
  5. Work your personal training plan by reading one article a day from your training plan.
  6. Now, rinse and repeat. Start the process over with those same three topics.

As a bonus, hopefully at least some of those articles have demos. Do the demos yourself on your own local install of SQL Server. Don’t just read the demos. After you’ve done these things, you will have read and worked through the content of 6 articles for each of your three topics that you wanted to learn. That’s a great start!

Next Steps to Take

  1. If you want to chat about any of this, hit me up on Twitter or post a comment here. I’m happy to discuss.
  2. Build a training plan using the steps above.
  3. If you’re struggling to find good resources, check out my post about how to find reliable resources for learning.
  4. Leave me a comment on this post to share your training plan, or message me on Twitter with your plan. Sharing your plan with others reinforces your commitment to the plan.

6 things to consider when presenting

 

Things to consider when presenting

I’ve presented a few times recently and discovered I like doing it. Presenting is a great way to give back to the I.T. community – in this case the SQL Server community. I’ve previously blogged about things I learned from giving my first presentation, and what it was like presenting at New Stars of Data. Here are six additional thoughts on presenting.

 

  • Be sure you show up early for your presentation

The day of New Stars of Data, I was up 90 minutes earlier than usual to use the link to the track I was in to join a panel discussion in progress. I just wanted to be sure that the link was going to work and catch a bit of that presentation. I also wanted to be sure I was wide awake when it was time for me to  present. Slumber does not mix well with presenting!

If you’re not presenting early in the morning like I was, be sure to leave yourself plenty of time to get to your computer to have it ready for presenting. You don’t want to turn it on 5 minutes before you’re supposed to present only to find that it won’t turn on for some reason or that you can no longer find your presentation materials.

  • Use stories whenever possible

In one case I not only used words to explain why data professionals should use a particular SQL Server feature, I also told a relevant story from my work experience. People connect to stories, and they are likely to remember the story and the feature you discussed long after the technical details. Telling the story also helped keep me engaged and excited about what I was saying. In turn, that helps the audience remember what was discussed.

 

  • Avoid making a bunch of changes to the slides or demos late in your preparation

Doing this may make you less comfortable when you get to the parts of the slide deck that you’ve recently modified. This is especially true if you’re a new speaker or if the slide(s) have undergone more than just cosmetic changes.

Also, once you’ve tested your demos and you know they are working, I would encourage you not to adjust them if your presentation date and time is coming up soon.

 

  • Practice frequently and record yourself giving the presentation

You need to practice so that you're comfortable explaining the content of your presentation. The transitions between slides, to demos and back, also need to feel comfortable and as smooth as possible. Adequate practice will help you feel at ease with the mouse clicks or key presses necessary to make those transitions.

You also want to listen to your tone of voice to ensure you seem engaged. This is a tough one because we all have different communication styles. Some people are naturally animated and excited when talking. Some people use their hands a lot. Some are more demure. You know yourself, so do some reflection about how you sound and make any adjustments that are needed and that you can comfortably make while still honoring your own personality.

Additionally, pay attention to your pace when speaking. You don’t want to sound rushed. Try to strike a balance with the speed of your speaking. Check out a post from Catherine Wilhelmsen on a tool she used recently that helps with presentation skills.

 

  • Be sure that your information is accurate

Please, please, please do your research as much as possible to ensure that what you’re presenting to people is the correct information. Everyone is learning as we go. Just make a commitment to yourself and to your audience that you will exercise due diligence to research your topic and present accurate information.

 

  • If possible, brush your teeth before presenting

Now, this seems a bit out of place, but I learned it recently from a practice session. I was practicing for New Stars of Data and had eaten food 1-2 hours beforehand. Early in the practice recording I realized that a piece of food had dislodged. Luckily this happened during a practice session and not while I was presenting live to people!

 

Next Steps to Take

1. Check out the speaker improvement library here.

2. If you would like help with anything in this post, or with something else related to SQL Server, reach out to me on Twitter, and I’ll be glad to offer assistance.

 

What is it like to be a presenter at the New Stars of Data virtual conference

 

Back in early July 2021, I presented at the SQL Server Meetup I co-host with Anthony Fortner. We had a speaker no show a couple of months prior and then for the July meeting a speaker had to cancel about a week before the meeting. I couldn’t let another month go by with no Meetup. So, I created a presentation from nothing, practiced it and gave it live for the first time in about a 7 day time span. It took about 15 hours of work and was my first SQL Server presentation to a public group. I learned a number of things from that experience. Then I gave a revamped version of the presentation to New Stars of Data in late October.

In late September this year I was on LinkedIn and engaged in a conversation about presenting on SQL Server. New Stars of Data had been advertised recently and, if I remember correctly, that came up in the discussion. I mentioned I might apply to be a speaker. Andy Yun chimed in and encouraged me to apply. I thought, “Why not! I'll apply and see what happens.”

I applied and was accepted. Ben Weissman sent me an acceptance email with some introductory information about the conference and introduced me to Leslie Andrews, who would be my mentor.

I’ll admit that I had only heard the name. That’s it. I didn’t know anything else about Leslie, but I assumed that if she’s a mentor for New Stars of Data, she must be at least well-known. If you look at her Twitter profile and who is following her, etc. you can see that she is indeed well-known. I was excited and interested to work with her, and nervous. I mean, lots of people know her so she’s “SQL famous.” What will she think of me? Will I embarrass myself talking to her because I won’t know what to say? I was anxious for no reason. Leslie is very personable and easy to talk to.

3 Questions I asked my mentor about speaking at New Stars of Data

We set up a video meeting, and I prepped for it with three questions:

  1. How many people are generally in the presentation sessions?
  2. Is there anything I should consider that’s different because of the number of people in the session?
  3. What advice do you have for me?

That first question was to set my expectations. I didn’t want to be overwhelmed by 50, 100, 200 people being in my session. That would have been fantastic, of course! However, I wanted to know what to expect because that would help me manage my emotions the day of the presentation. My theory was that the more people I was speaking to, the more nervous I might be, but if I at least knew that ahead of time, then I wouldn’t be as nervous.  Leslie told me that there might be as many as 50 people in the session.

My second question was “Is there anything I should consider that’s different because of the number of people in the session?” This was just a curiosity question since I had never given a presentation to more than just a handful of people at a SQL Server Meetup. Leslie told me that there was nothing really different to prepare for whether there were 5, 50, or 200 people. That was actually a bit of a relief.

Advice from Leslie Andrews: Round 1

My third question was, “What advice do you have for me?” Leslie gave me 3 main ideas to think about.

  • Write down what you want to say and memorize it

I didn't end up doing this. The point of this advice was to get me comfortable with the presentation. I did practice the full presentation at least three times in the week or so before New Stars of Data. Additionally, I went over a few spots in the presentation where I was having difficulty with the slides and demos. By the time I presented at New Stars of Data, I had also presented the topic two previous times. I recorded each complete practice run so I could listen to it afterward. I did the recordings because I wanted to hear how I sounded. Was I too monotone? Did I sound engaged? Were there a lot of filler words? I asked myself those questions as I watched the recordings.

  • Showing yourself on camera is not required

While not required, I did turn my camera on for the presentation. I had built this presentation about 6 weeks or so prior and had given it twice before to SQL Server Meetups. The second time I presented, I ended up with my camera off. I found that to be unnerving; it made me paranoid that I wasn't actually talking to anyone. For that reason, I turned on my camera when I presented for New Stars of Data.

I was already familiar with this presentation as I had attended it at the last PASS Summit. I learned a lot about font sizes, colors, and other visibility issues related to presentations.

Advice from Leslie Andrews: Round 2

I provided Leslie with my existing slides and asked her for feedback. I also provided her with a link to the first time I gave the presentation. Here’s what she said.

  • Be sure to update font sizes in SSMS for results and execution plans.
  • Leslie felt that the slides were fairly wordy and commented that slides with a lot of words tempt people to read your slides rather than listen to what you're saying.
  • She suggested that I add one or more slides to explain the concepts of High Availability and Disaster Recovery. She said I shouldn’t assume that everyone in the session will know what those ideas are about.
  • I needed to add at least three slides: one prompting for questions at the end, one thanking New Stars of Data, and a duplicate of my contact info slide to show when I was done.

I wasn’t surprised by bullet point #2. Wordiness in my writing has always been something that people point out. I’ve worked on it, but it’s a work in progress. Also, I want my slides to be a resource of sorts after the presentation. Nonetheless, I did tone down the amount of words on my slides.

I hadn’t realized I was making an assumption that people would know what HA and DR were about. Therefore, it was nice to have someone on the outside point that out so I could explain what I meant.

A Mistake I Made with the Slides

I made a bunch of changes to the slides based on the above. Then I had to practice for that second time I was giving this presentation to a live group prior to presenting at New Stars of Data. I made a significant mistake here. I didn’t have enough time to make a bunch of slide changes and practice enough with those changes before the second time I presented to a live group.

As a result, I felt that the second presentation wasn’t nearly as good as the first time I gave it. I just wasn’t comfortable yet with all the changes I had made. I made a few more tweaks to the slides before New Stars of Data, but by about two weeks out, I committed to stop messing with the slides so I could get comfortable with the presentation format.

 

What Happened the Day of New Stars of Data?

The day of New Stars of Data, I was up 90 minutes earlier than usual to use the link to the track I was in to join a panel discussion in progress. My presentation was at 8:45 AM Central time in the U.S. I just wanted to be sure that the link was going to work and catch a bit of that presentation. I also wanted to be sure I was wide awake when it was time for me to  present.

The speaker who was right before me had cancelled so there was a gap in the track. I was concerned that I might lose most of my audience, but I couldn’t control that. I connected early to my session track and Deborah Melkin was there to greet me. Andy Yun was also there. These people are “SQL Famous” , but they set me at ease.  I decided to take questions at the end because I didn’t want to interrupt my flow in the presentation once I had settled in. Deborah said that made complete sense and was happy to handle questions at the end.

The presentation went well overall. I had a few bobbles with my words and one time where my finger hit the scroll button on my mouse, causing my slides to scroll forward several slides. I felt good about the presentation though.

A few days later, Ben sent me  a pdf of my feedback. Everyone had good things to say, and that was encouraging. One person shared the observation that they felt my vocal tone could have been more varied so that I sounded more engaged and excited about my topic. I wasn’t shocked by that. I’m not a particularly animated person when I’m talking. I’m also a new speaker and I’m sure there were some nerves there. However, I also know from listening to my practice sessions that I had improved that part of the presentation already and it would get better as I settled in as a speaker.

 

Next Steps to Take

  1. Consider sharing what you know. If you’ve solved a problem in the I.T. realm, then you have something to share. You don’t have to be an “expert” to help others by sharing what you’ve learned.
  2. If you would like help with anything in this post, or with something else related to SQL Server, reach out to me on Twitter, and I’ll be glad to offer assistance.

 

Four Areas to Focus on When You Start A New DBA Role

 

You’ve just been hired into a DBA role at a new company, or you’ve been given the DBA keys at your current company. Maybe you’re a SysAdmin and your boss has informed you that you are now supposed to manage the SQL Servers as well as everything else on your plate. In any of these situations, you may have some confidence in your skills, but especially in the case of being a new hire, you have absolutely no true idea of what you’re walking into.

In these scenarios, where do you start? Start with these four areas.

  1. Backups
  2. SQL Agent Job Notifications
  3. Building an environment inventory
  4. Security

 

1. SQL Server backups: You have to be able to recover the data.

My experience has been that many companies aren’t doing an adequate job of managing the backup routines for their databases. I’ve been in companies where two or even three products were running backups on the same SQL Servers. I’ve been in situations where important databases weren’t backed up frequently enough and with the right set of backup types to recover appropriately, which would result in losing more data than the management was comfortable with. I’ve also seen scenarios where there were no backups at all.

So, what do you do to tackle the issue of SQL Server backups in your environment? First, connect to your SQL Servers and run the query below. It will give you the database name and the date of the last full backup if that backup was over 7 days ago. If the database has never been backed up, you'll get that information as well. Any results from this query should be investigated immediately. This query was adapted from sp_Blitz, with some column names changed to make it clearer what it does.

SELECT  d.[name] AS DatabaseName ,
COALESCE(CAST(MAX(b.backup_finish_date) AS VARCHAR(25)),'No Backup Ever Made') AS LastFullBackup
FROM master.sys.databases AS D
LEFT OUTER JOIN msdb.dbo.backupset AS B ON D.name COLLATE SQL_Latin1_General_CP1_CI_AS = B.database_name COLLATE SQL_Latin1_General_CP1_CI_AS

AND B.type = 'D' /*Full backup*/
AND B.server_name = SERVERPROPERTY('ServerName') /*Backupset ran on server you're currently connected to. */

WHERE D.[name]<> 'tempdb'  /* Eliminate TempDB. No need to back that up */
AND D.state_desc NOT IN ('RESTORING', 'OFFLINE', 'OFFLINE_SECONDARY') /* Exclude databases that are offline or involved in log shipping, for example */
AND D.is_in_standby = 0 /* Exclude databases in stand-by state as part of log shipping*/
AND D.source_database_id IS NULL /* Excludes database snapshots */

GROUP BY d.name
HAVING MAX(B.backup_finish_date) <= GETDATE()-7 /*Full backup older than 7 days ago.*/
OR MAX(B.backup_finish_date) IS NULL;										    

 

This one will show you any databases in the Full recovery model that haven’t had a transaction log backup in over 24 hours. This query was adapted from sp_Blitz with some column names changed to make it more clear what it does.

SELECT  d.[name] AS DatabaseName ,
COALESCE(CAST(MAX(b.backup_finish_date) AS VARCHAR(25)),'No Backup Ever Made') AS LastLogBackup
FROM master.sys.databases AS D
LEFT OUTER JOIN msdb.dbo.backupset AS B ON D.name COLLATE SQL_Latin1_General_CP1_CI_AS = B.database_name COLLATE SQL_Latin1_General_CP1_CI_AS

AND B.type = 'L' /*Log backup*/
AND B.server_name = SERVERPROPERTY('ServerName') /*Backupset ran on server you're currently connected to. */

WHERE D.[name]<> 'tempdb'  /* Eliminate TempDB. No need to back that up */
AND D.state_desc NOT IN ('RESTORING', 'OFFLINE', 'OFFLINE_SECONDARY') /* Exclude databases that are offline or involved in log shipping, for example */
AND D.is_in_standby = 0 /* Database is read-only to restore a log backup as in Log Shipping. */
AND D.source_database_id IS NULL /* Excludes database snapshots */
AND D.recovery_model_desc = 'Full'
GROUP BY d.name
HAVING MAX(B.backup_finish_date) <= GETDATE()-1 /*Log backup older than 1 day ago.*/
OR MAX(B.backup_finish_date) IS NULL;										    

 

Review the SQL Server to see if there are any native backup methods in place, such as a Maintenance Plan or Ola Hallengren scripts.  Each of these backup methods will produce one or more SQL Server Agent jobs. Check to see if the jobs associated with those options are enabled.
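One quick way to spot those jobs is to search the SQL Agent job list for backup-related names. This is a minimal sketch; job names vary by environment, so adjust the filter to match yours.

-- Look for SQL Agent jobs whose names suggest they perform backups
SELECT name, enabled
FROM msdb.dbo.sysjobs
WHERE name LIKE '%backup%'
ORDER BY name;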

If you don't see either of those options set up, your company may be using third-party backup software. These tools are typically, but not always, deployed by System Administrators. Check with them to see what third-party software is in place for backups and let them know your findings for the server in question. This could be a situation where the SQL Server was simply never enrolled in the third-party software and so is not being backed up.

If none of these things turn up a backup solution, then get one in place ASAP. See this post for options on how to create backups.

2. SQL Agent Job Notifications: If a job is worth creating, it’s worth knowing if it failed.

This is in position #2 because it's been an easy win in every job I've taken. In every new role I've had, I have discovered many SQL Agent jobs with no notifications, or with notifications set up to go to the wrong people. If a job is worth creating, it's worth knowing when it fails.

As you work through finding failed jobs that never notified anyone, you may find that you're fixing important Agent jobs that should have been working. This process fixes business and data-related processes and also helps you find stakeholders for data processes. When jobs fail and you investigate, you often need to know who is impacted by the failure, which further leads to identifying who should be notified. This process, then, actually connects you to key players in the organization very early. Those people on the business side almost always appreciate your efforts to make sure that data processes that affect them are working properly. This gives you allies as you move forward in your role.

So how do you proceed if you find jobs that have no notification set up if they should fail?

  • First, locate jobs that are failing. I have a post that will show you two ways to do that. You can either look at the SQL Agent Job Activity Monitor on a SQL instance or query the msdb database.
  • Second, locate enabled jobs that have no email operators assigned. These are jobs that no one will know are failing. This post will tell you how to find these jobs, and a sample query also follows this list.
  • Third, work through a process to make sure jobs with no notification set up will have an email operator assigned to them. This will ensure the right “someone” will know if the job fails.
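As a starting point for that second step, the sketch below lists enabled jobs that have no failure email notification configured. It only checks email notifications, so extend it if you also use pager or Net Send operators.

-- Enabled SQL Agent jobs with no failure email notification configured
SELECT j.name AS JobName
FROM msdb.dbo.sysjobs AS j
WHERE j.enabled = 1
    AND (j.notify_level_email = 0 OR j.notify_email_operator_id = 0);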

3. Build an Environment Inventory: You can’t manage what you don’t know about.

The process of gathering information about your environment tells you about the makeup of versions and editions you have. This, in turn, leads to the answer to the question, “Which SQL Servers are out of support and likely need to be upgraded?”

“How many SQL Servers are there at company X?”

Sure, the hiring manager told you the company only has a dozen SQL Servers, but I've found that hiring managers often don't know the real answer to the question, “So, how many SQL Servers do you have at company X?” That's a question you should always ask when interviewing, by the way. When you hear the answer, I'd encourage you to add at least 50% to it. At one job, I was told there were about a dozen SQL Servers. Less than 6 months in, I had located about 50 SQL Servers, and that was before I ran the MAP Toolkit.

If you are not familiar with the MAP Toolkit, when you run the utility it does a search of your network or Active Directory to find SQL Servers. You will want to ensure that your security team and/or System Administrators know you are going to run this utility. This is because it will likely set off security software or even be blocked, in some cases.

The tool produces an Excel format list of plenty of information for you to know and this is the simplest way to start creating a SQL Server inventory. This scan is likely to find a lot of SQL Servers that no one seems to remember existing. Managing all the SQL Servers within your purview is important. You can’t do that if you don’t know about them.

You will also probably find individual employees who have installed SQL Server Standard or even Enterprise Edition on their local PCs. That’s a licensing no-no. This situation gives you the chance to help your company avoid licensing entanglements with Microsoft by addressing these scenarios.

If you find that you aren’t allowed to use the MAP Toolkit, as you hear about other SQL Servers that exist in your environment, or as you discover them for yourself as you look through Active Directory Users and Computers, make a Central Management Server and add SQL Servers to it. Ensure you have access to those SQL Servers and then query them to find out all you can about them. Review the information available to you from SERVERPROPERTY and build your own queries to discover information about the company SQL Servers. Here is an example query from the MS Docs link above. As you discover information about the company SQL Servers, create your own spreadsheet of information, or create a database and tables to hold the information you find.

SELECT  
  SERVERPROPERTY('MachineName') AS ComputerName,
  SERVERPROPERTY('ServerName') AS InstanceName,  
  SERVERPROPERTY('Edition') AS Edition,
  SERVERPROPERTY('ProductVersion') AS ProductVersion,  
  SERVERPROPERTY('ProductLevel') AS ProductLevel;  
GO

 

4. Security: Who has access to SQL Server and what type of access do they have?

I generally focus on this one after the others because security is often a political subject within an organization. You don’t want to become embroiled in power struggles right away by trying to take away someone’s access, because you will lose. Additionally, if you’ve done the previous work mentioned, you have likely built some good business relationships and a few allies to help you have the discussion about SQL Server access and security.

As a DBA, you and the business need to know who has SysAdmin level access. The access that they possess allows their credentials to make any change they want to. In truth, it’s often not so much about trusting the person who has the access as it is realizing that every person who has SA permission is a person whom a hacker could exploit to gain access to the company’s data. You will want to make the business aware of how many credentials have this level of permission and whose credentials they are.

How do you find who has SysAdmin level access on your SQL Server? Here is a query for that from MS Docs. I have added a WHERE clause to exclude the commonly found SQL Server accounts that start with NT SERVICE.

 

SELECT roles.principal_id AS RolePrincipalID,
    roles.name AS RolePrincipalName,
    server_role_members.member_principal_id AS MemberPrincipalID,
    members.name AS MemberPrincipalName

FROM sys.server_role_members AS server_role_members
INNER JOIN sys.server_principals AS roles
    ON server_role_members.role_principal_id = roles.principal_id
INNER JOIN sys.server_principals AS members 
    ON server_role_members.member_principal_id = members.principal_id
WHERE members.name NOT LIKE 'NT SERVICE%';

This query shows you which logins belong to which server-level roles; the rows where RolePrincipalName is sysadmin are the credentials with sysadmin access. You also will want to know which accounts have the db_owner role at the database level.
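
A similar query at the database level surfaces db_owner members. Here is a minimal sketch using sys.database_role_members and sys.database_principals; run it in each database you care about, and treat it as a starting point rather than a full permissions audit.

-- List members of the db_owner role in the current database
SELECT roles.name AS DatabaseRoleName,
       members.name AS DatabaseUserName
FROM sys.database_role_members AS database_role_members
INNER JOIN sys.database_principals AS roles
    ON database_role_members.role_principal_id = roles.principal_id
INNER JOIN sys.database_principals AS members
    ON database_role_members.member_principal_id = members.principal_id
WHERE roles.name = 'db_owner';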

I would strongly encourage you to review the permissions scripts written and maintained by Kenneth Fisher. These will provide a wealth of information about your server level logins and your database level users.

 

Next Steps To Take

  1. Check out the First Responder Kit on Github.
  2. If you have a comment or question about this post specifically, leave a comment and I’ll get back to you.
  3. If you would like help with something else related to SQL Server, reach out to me on Twitter, and I’ll be glad to offer assistance.

 

Three Keys to SQL Server Performance

 

Everyone wants good performance from their database systems. Some even expect and need a high-performing Ferrari all the time. How is this achieved, though? What do you need to understand about SQL Server specifically in order to make your company’s applications hum along like a well-tuned car? We will look at three keys to SQL Server performance.

 

SQL Server Speed
SQL Server Performance

Here’s the short list of performance keys.

1. SQL Server must be able to cache sufficient data in memory

2. SQL Server must be able to retrieve data from disk efficiently

3. You must write “good” T-SQL

 

SQL Server Memory

 

1. SQL Server must be able to cache sufficient data in memory

SQL Server does not use memory like most applications. People unfamiliar with SQL Server’s use of memory are surprised to see it consuming 80% or more of a server’s RAM. “I have 128 GB of RAM on this machine, why is SQL Server set to consume 115 GB of that?!” Then they decide to change Max Server Memory down to 64 GB or less. Suddenly, disk IO may shoot up. You will also likely see execution plans rapidly aging out of the plan cache, which burns CPU on compiling new query plans and can lead to parameter sniffing issues. When all of this comes into play, people start complaining about applications not working as well. They start saying the magical phrase, “It’s slow.”

When a query is issued and the data requested is not in memory, SQL Server must use CPU and disk IO to get the data into memory first. While it does this, a PAGEIOLATCH_* wait is registered because the query is waiting on the data to be retrieved, and the query is effectively put on hold. Reading data off the disk is always, always going to be slower than working with data that’s already in memory.

So, how do you determine a sufficient amount of RAM for your SQL Server? The short answer is that I would encourage you to look at my post here and use the DBATools cmdlet Set-DbaMaxMemory. The post will help you understand more about what the Max Server Memory setting does, and the PowerShell cmdlet will recommend a good starting place for setting Max Server Memory. Here are some examples from the DBATools documentation and from the Help commands available in PowerShell.

<# If you have a Central Management Server for your SQL environment, consider using this command to loop through all the SQL Servers and set the Max Server Memory where it is set to something larger than the total amount of RAM assigned to the server. #> 

Get-DbaRegServer -SqlInstance sqlserver | Test-DbaMaxMemory | Where-Object { $_.MaxValue -gt $_.Total } | Set-DbaMaxMemory 

<# If you have a Central Management Server for your SQL environment, consider using this command to loop through all the SQL Servers and set the Max Server Memory to this accepted formula created by a SQL Server expert. #> 

Get-DbaRegServer -SqlInstance sqlserver | Test-DbaMaxMemory | Set-DbaMaxMemory 

<# If you don't have a registered server then just use the below #> 

Test-DbaMaxMemory -SqlInstance SQLServerInstanceNameHere | Set-DbaMaxMemory
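
If you would rather stay in T-SQL, Max Server Memory can also be changed with sp_configure. This is only a sketch; the 115000 MB value assumes the 128 GB server from the example above and is not a recommendation for your environment.

-- Hypothetical example: cap Max Server Memory at roughly 115 GB on a 128 GB server
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max server memory (MB)', 115000;
RECONFIGURE;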

The longer answer is this: take a look at three counters in the SQLServer:Buffer Manager object in Performance Monitor. First, if you observe Page Life Expectancy over time and it has long stretches where it plummets and stays low without recovering, then you have some heavy-hitting queries running that could likely benefit from more available RAM. This will likely be accompanied by wait stats showing a high average wait for PAGEIOLATCH_* type waits, which is SQL Server reading pages from disk because the data pages it needed were not in memory.

Second, review the Perfmon counter Lazy Writes/Sec. If SQL Server is having to free up memory in between checkpoints, this value will be above zero. If it is regularly above zero, then SQL Server is regularly under memory pressure and is having to write data pages in memory back to the disk so that it can load different data pages to satisfy queries.

Third, look at Free List Stalls/sec. This counter indicates the number of requests per second that had to wait for a free data page in memory. If Page Life Expectancy is often low for long stretches and both Lazy Writes/sec and Free List Stalls/sec are regularly greater than zero, then you need to either adjust Max Server Memory up (as long as you don’t go too high based on the information above), add memory, or take a hard look at the indexes and queries involved when these PerfMon metrics are out of balance.
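
You don’t have to open Performance Monitor to see these counters; they are also exposed through the sys.dm_os_performance_counters DMV. A minimal sketch, assuming a default instance (the object_name prefix is different on a named instance):

-- Buffer Manager counters related to memory pressure
-- Page life expectancy is a point-in-time value; the /sec counters are cumulative,
-- so sample twice and take the difference to get a true per-second rate.
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Buffer Manager%'
  AND counter_name IN ('Page life expectancy', 'Lazy writes/sec', 'Free list stalls/sec');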

SQL Server Indexing and Disk Performance

 

2. SQL Server Must Be Able to Retrieve Data From Disk Efficiently

Look at your SQL Server wait stats, and if PAGEIOLATCH_* waits are very prominent, there is a good chance that the indexes in your databases need attention, or those long IO waits could point to a problem with the disk subsystem.
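
As a quick check, you can pull those waits from sys.dm_os_wait_stats. A bare-bones sketch; the numbers are cumulative since the last restart (or since the wait stats were cleared), so watch the trend rather than a single snapshot.

-- Cumulative PAGEIOLATCH waits since the wait stats were last cleared
SELECT wait_type,
       waiting_tasks_count,
       wait_time_ms,
       wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE 'PAGEIOLATCH%'
ORDER BY wait_time_ms DESC;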

PAGEIOLATCH_* waits are all wait types related to reading 8KB data pages from disk into memory. Inefficient indexes can make these reads take longer because SQL Server can’t quickly get the data it needs from the existing indexes. This can happen over time as query patterns and data patterns change. For example, your company might introduce a new category of products. Suddenly people are querying that category and its associated products far more, and older, mainstay products less. The changed distribution of that data may affect the execution plans generated.

The company may also have refactored an existing application into new tables and missed indexing the Category column. Now, when people search for things like “Bike Accessories”, there is no supporting index, and the result is scans of millions of rows.
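
The fix in that hypothetical case is a supporting nonclustered index. The dbo.Product table and its columns below are made up for illustration; in a real system you would confirm the shape of the queries before adding anything.

-- Hypothetical: support searches on Category without scanning the whole table
CREATE NONCLUSTERED INDEX IX_Product_Category
    ON dbo.Product (Category)
    INCLUDE (ProductName, ListPrice);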

As a start to determining whether there is inefficient indexing, run and save the output of sp_BlitzIndex to examine your tables and indexes. Review its recommendations and make adjustments, then re-measure index usage with sp_BlitzIndex. Some time after a restart of SQL Server, re-run sp_BlitzIndex and compare the output to the previously saved run to see whether SQL Server is using the adjusted indexes.

To review whether you have a disk IO subsystem issue, look at the DMV sys.dm_io_virtual_file_stats. You can also review the SQL Server Error Log for messages indicating IO that took longer than 15 seconds. These can be an indication of an IO subsystem issue. Review this article and this article on these topics.

Both of these articles provide information on understanding the DMV and the error message. There is also information about Performance Monitor counters to use to measure potential problems.
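
A common way to read sys.dm_io_virtual_file_stats is to join it to sys.master_files and compute average latency per file. A minimal sketch; remember these numbers are cumulative since the instance started.

-- Average read/write latency per database file since instance startup
SELECT DB_NAME(vfs.database_id) AS DatabaseName,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.io_stall_read_ms / NULLIF(vfs.num_of_reads, 0) AS avg_read_latency_ms,
       vfs.num_of_writes,
       vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
INNER JOIN sys.master_files AS mf
    ON vfs.database_id = mf.database_id
   AND vfs.file_id = mf.file_id
ORDER BY avg_read_latency_ms DESC;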

If you are in the cloud, such as on AWS, be sure to review the settings for EBS and FSx storage to ensure that IOPS and throughput are configured appropriately. Also take into consideration how the EC2 instance type might be throttling your IO and throughput: just because your storage is provisioned with a certain IOPS and throughput doesn’t mean the EC2 instance can support those settings.

T-SQL Anti-Patterns to Avoid

 

3. You Must Write “Good” T-SQL

Poorly written T-SQL can cause poor application and SQL Server performance. Here is a brief list of things to avoid.

A. Writing non-SARGable T-SQL. This forces SQL Server to scan entire tables instead of using an existing index (see the sketch after this list). For details and examples see this, this and this; it has been written about frequently, so there is no need for me to explain it in depth here.

B. Overuse of cursors. SQL is a set-based language, so making it do operations row by row is far less efficient than using set-based logic. Take a look at this post and this post.

C. Overuse of SELECT *. Some tables are very wide and have millions or even billions of rows. If you don’t truly need every column, then do SQL Server and your application a favor and only return the columns that are actually needed!

D. Be careful with scalar user-defined functions. This goes back to the idea of SQL being a set-based language. A scalar UDF returns a single value for each value passed to it, so when you pass it a large number of rows, each row is processed one at a time inside the function to return a value. Also, SQL Server doesn’t do a good job of “seeing inside” scalar UDFs and showing you that one is present in a query plan, so you might not notice this sort of thing when looking at a plan. Additionally, scalar UDFs kill SQL Server’s ability to go parallel. For more, take a look at this, this, and this.
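
To illustrate point A, here is a sketch of a non-SARGable predicate and its SARGable rewrite. The dbo.Orders table and OrderDate column are hypothetical; the point is that wrapping the column in a function hides it from an index on OrderDate, while the range version can seek on that index.

-- Non-SARGable: the function on the column forces a scan even if OrderDate is indexed
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE YEAR(OrderDate) = 2023;

-- SARGable rewrite: an index on OrderDate can be used for a seek
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '20230101'
  AND OrderDate <  '20240101';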

This list could be much, much longer. The point here is that how you write your T-SQL often has a direct impact on performance. If you write T-SQL for applications, I strongly encourage you to look at the blogs of folks like Erik Darling and Kendra Little, as well as sites that have a large number of entries on T-SQL like this one.

Next Steps To Take

If you would like help with anything in this post, or with something else related to SQL Server, reach out to me here, on Twitter, or by email at Leem@leemarkum.com and I’ll be glad to offer assistance.