Community Resources

After helping to plan several SQL Saturday events, Pittsburgh SQL Server User Group meetings, and a Wanna Be A DBA track, and chatting with others in the community, I’ve come to realize that people often ask me how and where they can learn more about SQL Server. I start rattling off a list of places they can go, and they start scribbling like mad to keep up with all the resources I mention.
This page documents good resources for beginners and seasoned pros alike. It is by no means a comprehensive list; I will continue to add resources as I find other places of value.  Updates can be found at http://nelsonsweb.net/resources.  Send me a message if there’s anything else you think is worth adding to this list; I welcome your feedback!

Events

SQL Saturday (www.sqlsaturday.com)
There’s nothing better than a full day of free training!  There is typically a small charge for lunch, but all of the training is free thanks to the many sponsors, and the speakers donate their time as well.  Visit sqlsaturday.com frequently to see upcoming events near you.  If you attend a SQL Saturday, make sure you visit the sponsors and thank them; none of these events would be possible without them.
Pittsburgh’s SQL Saturday is usually held in October.

Local User group meetings (www.sqlpass.org   http://pittsburgh.sqlpass.org)
There are 280+ local SQL user groups across the world.  Check out sqlpass.org to find a group near you.  The Pittsburgh, PA group typically meets on the last Tuesday of each month.

PASS Virtual Chapters (http://sqlpass.org/PASSChapters/VirtualChapters.aspx)
PASS has a collection of “Virtual Chapters”.  These groups cover a variety of subject areas and are produced in many languages as well!  Go to the site and join the groups that interest you; they will send you a web link whenever an upcoming meeting is scheduled.

24 hours of PASS ( www.24hoursofpass.com/)
PASS puts on this event a few times a year: a series of online webinars that run back to back over a 24-hour period.  Register online, and they will send you a meeting link so you can watch the presentations you are interested in.  The sessions are also typically recorded and made available online later.

Training

SQLServerCentral Stairway Series (http://www.sqlservercentral.com/stairway/)
The Stairway series are a great place to get an overview of a specific topic.  There are series on everything from database design to business intelligence to PowerShell, and each series is written by a leading industry expert in that subject.

Pragmatic Works Training on the ‘T’s (http://pragmaticworks.com/Training/FreeTraining)
Pragmatic Works offers free training every Tuesday and Thursday at 11 AM Eastern!  Most of the sessions are BI related, but you will sometimes see general administration sessions as well.  Previous sessions are recorded and available to watch later on the same site.  Pragmatic Works also runs paid in-person training in various cities, and they have some cool paid tools for SSIS that you may want to check out.

Brent Ozar (http://www.brentozar.com/first-aid/events/)
Brent Ozar and his team also offer free training events weekly.

Free tools

Notepad++ (notepad-plus-plus.org/)
This is my preferred text editor.  I wanted to mention it in particular because it has a free plugin called Poor Man’s T-SQL Formatter.  You can copy a query into Notepad++ and then run the plugin to reformat the SQL code into a more standardized layout for easier reading.

sp_Blitz (http://www.brentozar.com/blitz/)
Brent Ozar has a great set of scripts to give you a health check overview of your servers.  After running the scripts, you will see lots of potential red flags, with links for more information about the problems and what you can do to fix them.

sp_whoisactive (Download)
Adam Machanic built a great stored procedure that is sp_who2 on steroids!  Running the basic stored procedure gives you tons of information about the SPIDs currently hitting your instance, including CPU, tempdb, and I/O usage, how long sessions have been executing, blocking sessions, and lots more.  You can also add optional parameters to get even more information, including query plans.  And one of my favorite options lets you save the output into a database table for further review and analysis.
Adam also wrote a 30-day blog post series that goes into detail on the additional options you can use.  http://sqlblog.com/blogs/adam_machanic/archive/tags/sp_5F00_whoisactive/default.aspx
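To give a feel for it, here is a minimal sketch of calling it with one of those optional parameters.  The EXEC statement is what you would normally run straight from SSMS; the Invoke-Sqlcmd wrapper, server name, and database here are only placeholders, and this assumes the SQL Server PowerShell module is loaded and sp_WhoIsActive is installed in master.

# Illustrative only: include query plans in the output via the @get_plans parameter.
# "localhost" and master are placeholders; adjust for your environment.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "master" -Query "EXEC sp_WhoIsActive @get_plans = 1;"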

Ola Hallengren Backup and Smart Reindexing scripts (https://ola.hallengren.com/)
No list is complete without mentioning Ola’s alternative to the native maintenance plans.  These stored procedures address several common shortcomings of the built-in maintenance plans and make database backups and index maintenance smarter.

SQL Server Builds (http://sqlserverbuilds.blogspot.com/)
This is a great resource for checking the latest service pack for any version of SQL Server.  It lists every patch and service pack, with direct links to download each one from Microsoft.
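Of course, to compare against the list you first need to know what build you are currently running.  Here is one quick way to check; this sketch assumes Invoke-Sqlcmd from the SQL Server PowerShell module is available, and the same SELECT works just as well in SSMS.

# Return the current build number, service pack level, and edition of an instance.
# "localhost" is a placeholder instance name.
Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT SERVERPROPERTY('ProductVersion') AS Build, SERVERPROPERTY('ProductLevel') AS ServicePack, SERVERPROPERTY('Edition') AS Edition;"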

Blogs

I don’t know that I can recommend any other blogs to you except for this one.  🙂
In all seriousness, there are lots of blogs that I do follow.  Most are written by people I know or people I respect in the community.  At some point I may get around to listing some or all of them here.  In the meantime, a good place to get started is SQLServerCentral, which aggregates lots of other blogs together in one place for you here: http://www.sqlservercentral.com/blogs/


Pittsburgh SQL Saturday 2013

Preparations are well under way for this year’s Pittsburgh SQL Saturday. If you’ve never heard of or been to a SQL Saturday, it is well worth the time: an all-day, FREE training/technical conference devoted to Microsoft SQL Server. (There is a $10 charge to help offset the cost of lunch.)
This year’s Pittsburgh event is going to be held at ITT Tech in Robinson Township on September 14. You can see the details of the event here: http://sqlsaturday.com/250/eventhome.aspx

If you’re interested in attending, sign up now as seating is limited.
If you’re interested in speaking, please fill out the speaker registration form now. The call for speakers deadline is June 22.

I am helping to plan the event again this year, and coordinating the speaker schedule for the day. If you have any questions or concerns, feel free to get in touch. And I hope to see you in September!

Powershell: delete files older than X days

On a recent project, I needed to delete archive folders that were older than a specified number of days. The thing that made this a little more challenging is that there were Daily, Weekly, and Monthly folders, each of which had a different retention period.

I found several scripts to delete folders and files older than a specified number of days, but those scripts would delete all of the contents of the specified folder. I needed to be able to filter the Daily, Weekly, and Monthly folders separately so that each one’s retention period could be handled on its own.

This script can be customized: change the “-filter” value to match the folder names that you want to delete, and change the number of days in the AddDays() call (or months in the AddMonths() call).
Another really handy option is -whatif at the end of the command. It prints in the PowerShell window what would be deleted, but does not actually delete anything, so you can test the delete without touching any folders or files. The first delete example below includes the -whatif option so that you can see where it goes.

$thedirectory = "C:\test\ImportFolder\Archive"
# use "-whatif" to show what will be deleted without actually deleting anything
cd $thedirectory
get-childitem $thedirectory -filter "*daily*" |? {$_.psiscontainer  -and $_.lastwritetime -le (get-date).adddays(-35)} |% {remove-item $_ -force -recurse -whatif}
get-childitem $thedirectory -filter "*weekly*" |? {$_.psiscontainer  -and $_.lastwritetime -le (get-date).adddays(-15)} |% {remove-item $_ -force -recurse}
get-childitem $thedirectory -filter "*monthly*" |? {$_.psiscontainer  -and $_.lastwritetime -le (get-date).addmonths(-25)} |% {remove-item $_ -force -recurse }

This post is part of the blogging phenomenon known as TSQL Tuesday. This month’s blog party is hosted by Wayne Sheffield, who is writing a series of blog posts in the month of February all about PowerShell.  I couldn’t pick just one script to share today, so here is my second post on the topic for the day.

Wait for file before processing

As part of a recent project, I needed to check whether a file existed before starting an SSIS package to import that file.  The catch was that I did not know what time the file was going to be placed on the file system.  If the SSIS package runs and the file does not exist yet, the package will fail.

You can create a Script Task within SSIS to check for a file and then sleep; however, several sources said that this could spike the processor, so I decided to go a different route.  To solve the problem, I wrote a quick little PowerShell script that checks whether the file exists and waits in a loop before the SSIS package starts.

I created a SQL Agent job with two steps.  The job runs daily at 1:00 AM, and it stays in an Executing status on Step 1 until the file exists.

Step 1: PowerShell script to check for the file (see below)

Step 2: Execute SSIS package task.

The file that I am looking for in this example is: C:\test\ImportFolder\fileToCheck_20130212.txt
You will notice that today’s date is also appended to the end of the file name; the script builds the file name dynamically using the current date stamp.

$dt = Get-Date -format yyyyMMdd
$path = 'C:\test\ImportFolder\'
$theFile = $path + 'fileToCheck_' + $dt +'.txt'

#  $theFile variable will contain:  C:\test\ImportFolder\fileToCheck_20130212.txt
While (1 -eq 1) {
	IF (Test-Path $theFile) {
		#file exists. break loop
		break
	}
	#sleep for 60 seconds, then check again
	Start-Sleep -s 60
}

This post is part of the blogging phenomenon known as TSQL Tuesday. This month’s blog party is hosted by Wayne Sheffield, who is writing a series of blog posts in the month of February all about PowerShell.

SSIS execute task only on Tuesday

I’ve started working with SSIS a lot more lately.  I am going to attempt to document here some of the quirks that took me a little while to figure out along the way.

In this tidbit, I have a multi-step SSIS package that needs to run on a daily basis.  However, one step of the process should only run on a specific day of the week (let’s say it should only run on Tuesday).

These steps were done in Visual Studio 2010.  The process is the same for 2008 R2.

The process:

  1. Create the necessary steps.  For this simple example, I created 3 Execute SQL tasks.  They each run the query: “SELECT 1”.
  2. Drag the green arrows to link each of the tasks in order.  By default, the links will all be success constraints.
  3. Double click on the line between Task 1 and Task 2.  This will open the Precedence Constraint Editor.
    1. Change Evaluation operation to: “Expression and Constraint”
    2. In the expression box, type: DATEPART("dw", GETDATE()) == 3
      • Using the DATEPART("dw", ...) function, Sunday=1 and Saturday=7.  Since we only want Tuesday, we use == 3.
    3. Press OK
    4. You will notice that the line between Task 1 and Task 2 now has an “fx” symbol on top of it.
  4. At this point, the package will run and Task 2 will only run on the specified day.  However, there is an issue with the current setup: Task 3 will begin executing as soon as Task 1 completes, without waiting for Task 2 to complete on Tuesdays.  We need to modify that link as well to create a fork in the path.
  5. Double click on the link between Task 1 and Task 3.
    1. Change Evaluation operation to: “Expression and Constraint”
    2. In the expression box, type: DATEPART("dw", GETDATE()) != 3
    3. Under multiple constraints, change to the option: “Logical OR. One constraint must evaluate to True”
      • Changing to Logical OR is required.  Since we created the fork at Task 1, only one of the two constraints going into Task 3 will evaluate to true on any given day.
    4. Press OK
    5. You will notice that the line between Task 1 and Task 3 turns into a dashed line and it also has an “fx” symbol on top of it.
  6. You can now run your package to test that the steps work properly.  To test the design, I changed the == 3 and != 3 in my expressions to a different day of the week to make sure that Task 2 got bypassed correctly on its off day (see the quick day-number check after this list).
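If you want to sanity-check how today maps to the Sunday=1 through Saturday=7 numbering before relying on the expression, a quick PowerShell one-liner will tell you.  This is just a convenience check on the numbering, not part of the package:

# .NET's DayOfWeek runs Sunday=0 through Saturday=6, so add 1 to line up with
# the DATEPART("dw", GETDATE()) numbering used in the SSIS expression.  3 = Tuesday.
([int](Get-Date).DayOfWeek) + 1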

SQL Saturday Pittsburgh

Wow, somehow I managed to not post anything here for six months!  I started a new job in the meantime, which has kept me really busy, and I did not make the time to post anything on my blog.


The big news is that Pittsburgh’s SQL Saturday is this Saturday, October 6, at La Roche in the North Hills of Pittsburgh.  I have been organizing the schedule for the event, which is now posted on the SQL Saturday site. Check it out, and make sure you get registered to attend.  There’s a great mix of local and out-of-town speakers, and some great topics will be presented.

Let me know if you have any questions, and hope to see you on Saturday!

Import Active Directory users into SQL Server

I needed to import a list of all Active Directory user accounts into a table in SQL Server for a recent project. The project also gave me a perfect opportunity to learn a little bit of PowerShell. Below is a walk-through of the script that I built. I’m going to skip over a lot of the PowerShell basics, as that information is available from other sources. For this project, I needed to populate a table with these fields from Active Directory: Display Name, NT username, email address, and office phone.
I used the PowerShell Get-ADUser cmdlet to get the information out of Active Directory.
Before doing anything else, you need to open a PowerShell command window (Start–>Run–>powershell.exe) and import the PowerShell ActiveDirectory module:

PS C:\> Import-Module activedirectory

After importing the module, you can learn more about the Get-ADUser cmdlet by using some of these commands:

Get-Help Get-ADUser
Get-Help Get-ADUser -examples
Get-Help Get-ADUser -detailed

Examples are great, but I learn better by seeing real results, so let’s run a quick query to see what information we get.

Get-ADUser -filter * -ResultSetSize 1
#Note: I included "-ResultSetSize 1" so that I was not overwhelming the domain controllers while testing.

Awesome, I can now see user accounts from Active Directory! The output that I got showed me some of the information that I needed, but I am still missing some pieces (primarily email address and phone number). The “-Properties” option will let you pick additional fields to include in the output. I got a little stuck here briefly, because the Get-ADUser cmdlet names for the properties do not all match the Active Directory field names. To figure out what the appropriate field names were, I ran this:

Get-ADUser -filter * -ResultSetSize 1 -Properties *

Cool, now I can put the fields together to get a shortened list of only what I am looking for:

Get-ADUser -filter * -ResultSetSize 1 -Properties EmailAddress,OfficePhone
# Note this will return additional fields (DistinguishedName,Enabled,ObjectClass, SID,…)

I got a little bit stuck here too, because I was getting too much information. When I got to the point of exporting this data to a CSV file and importing it into SQL Server (coming later), I got hung up because some of the fields did not always have information for my organization. The solution was to use a pipe (the | character) and the Select-Object cmdlet, which let me select only the specific columns that I wanted out of Active Directory.

Get-ADUser -filter * -ResultSetSize 1 -Properties EmailAddress,OfficePhone | Select-Object EmailAddress,OfficePhone,DisplayName,SamAccountName

I now see only the four columns that I care about. On a larger scale test, I realized that I was returning accounts that I did not want to see (like disabled accounts, administrative accounts, etc.), so I used the -Filter option to add some search criteria.
Filtering in PowerShell is a little different from what I am used to. For example, "=" is "-eq", and "not equal to" ("<>") is "-ne"; wildcard comparisons use "-like" and "-notlike". You can combine multiple conditions by wrapping the whole set in curly brackets { }, putting each condition in parentheses ( ), and joining them with the "-and" operator. The asterisk (*) is the wildcard character.
For example:

-Filter {(Name -notlike "*(Administrator)") -and (Name -notlike "Matt*") -and (Enabled -eq "True") }
# I also threw in there where Name is not like Matt*

Now that I have only the fields I want and have filtered out the users I don’t want to see, I can start working on importing the data into SQL Server. I could have used PowerShell to insert the records directly into SQL, but I was concerned about latency and about effectively turning the export into a denial-of-service attack on the domain controllers. I was working with more than 50,000 Active Directory accounts, and I definitely did not want to hold up the domain controllers if there was an issue with the SQL Server during the process. Because of this, I decided to export the data to a CSV file and then use SSIS to import the data.
Exporting the data to a CSV file uses another pipe and the Export-Csv cmdlet. Make sure to put in the appropriate file path to export to:

#Make sure you put your file path between the < >
 | export-csv -path \\<server>\<share>\ADUsersExported.csv -NoTypeInformation -Encoding "UTF8"

Putting everything together.

Make sure to put in your appropriate file path to export to.
I also took out the “-ResultSetSize” option so that all records were returned.

#Make sure you put your file path between the < >
Get-ADUser -Filter {(Name -notlike "*(Administrator)")  -and (Enabled -eq "True") }  -Properties SamAccountName,DisplayName,EmailAddress,OfficePhone | Select-Object EmailAddress,OfficePhone,DisplayName,SamAccountName | export-csv -path \\<server>\<share>\ADUsersExported.csv -NoTypeInformation -Encoding "UTF8"

Once the data is exported to the CSV file, SSIS imports it into SQL Server. The PowerShell script and the SSIS package are both scheduled to run daily overnight, when things should be slower on the servers.
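The post does not show the scheduling itself; a SQL Agent job with a PowerShell step (like the one in the wait-for-file post above) works fine.  Purely as an illustration, the export script could also be scheduled with Windows Task Scheduler; the task name, time, and script path below are hypothetical.

# Hypothetical example: run the AD export script nightly at 2:00 AM via Task Scheduler.
schtasks /Create /TN "Export AD Users" /SC DAILY /ST 02:00 /TR "powershell.exe -File C:\Scripts\Export-ADUsers.ps1"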

TSQL Tuesday #028 – Jack of All Trades, Master of None

This month’s TSQL Tuesday is hosted by Argenis Fernandez.  This month’s topic:  “blog about your experience. Tell us why you specialized, or why you’d like to specialize. If you don’t think that specialization is a good thing, tell us why. Discuss. Argue your point(s).”

=================================================

My first job out of college was as a Network Administrator for a small non-profit organization.  As the only IT guy in the organization, I was responsible for a lot: managing servers, planning upgrades, scheduling downtime, email setup/support, network switches, cabling, wireless network access/security, managing and backing up data, help desk support, desktop PC support, web site design, managing the phone system, building overhead paging, the video surveillance system, report design and generation, unjamming printers and copiers, evaluating new applications, and developing custom applications.  The organization had classrooms spread out across the entire county, and each classroom had various technologies, from a PC to phones, answering machines, and speaker systems.

I definitely could not specialize in any particular area in this position.  On any given day I might need to travel 60 miles round trip to fix a problem in one of the outlying classrooms and then return to my office to prepare data for a federally mandated report.  When people asked me what I did, I would tell them my job was to fix anything that plugged into a wall.  With a non-existent budget, I had to get creative to make things work in the organization.

Through a series of life choices, I moved on to a new company and a new position that focused more on systems and database administration.  This position is definitely a lot more specialized than where I started.  There are now other support groups that I can refer people to for issues with PCs, Exchange, networking, phones, etc.

I still think that it is important to keep up on a lot of the basics, especially with SQL Server.  People like to blame the database as the problem when oftentimes the solution is not nearly that simple.  Many problems that I encounter on a day-to-day basis are rooted in specialties not directly related to SQL Server.  I may not be a specialist in networking, PC repair, or Active Directory administration, but having a good working knowledge of these areas has been very beneficial.  There are other people responsible for fixing these things at my current company, and with my generalist background I can usually figure out what the problem is and direct it to the correct support group fairly quickly.

Running SSMS template explorer from network drive

This is a quick one today as a follow-up to my previous post on using the SSMS Template Explorer.  While I do like the convenience of using the Template Explorer to store frequently used scripts, one of my biggest complaints was that all the scripts are buried several directories deep under the C:\Users directory (Windows 7).

Over on SQL Server Central, Carl Demelo shared an awesome way to move your template directory to a different location.  The solution only works on Windows 7, using the mklink shell command.

I tried it out, moving my template directory to a network drive that gets backed up on a regular basis.  It worked perfectly. Thanks, Carl!
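For reference, here is a rough sketch of what the change looks like from an elevated PowerShell prompt.  The template path shown is only an example (the folder varies by SSMS version), and the network share is a placeholder, so verify both before running anything.

# Example paths only -- check the template folder for your SSMS version and use your own share.
$templates   = "$env:APPDATA\Microsoft\Microsoft SQL Server\100\Tools\Shell\Templates\Sql"
$networkCopy = "\\fileserver\share\SSMSTemplates\Sql"

# Copy the existing templates to the share (the parent share must already exist),
# remove the local folder, then create a directory symbolic link in its place.
Copy-Item $templates $networkCopy -Recurse
Remove-Item $templates -Recurse
cmd /c mklink /D "$templates" "$networkCopy"    # mklink is a cmd built-in, so call it through cmd /c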