
PowerShell: delete files older than X days

On a recent project, I needed to delete archive folders that were older than a specified number of days. What made this a little more challenging is that there were Daily, Weekly, and Monthly folders (similar to the layout sketched below), each of which had a different retention period.
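The folder names here are hypothetical, just to illustrate the kind of layout involved:

C:\test\ImportFolder\Archive\Daily_20130201
C:\test\ImportFolder\Archive\Daily_20130208
C:\test\ImportFolder\Archive\Weekly_20130203
C:\test\ImportFolder\Archive\Monthly_20130101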

I found several scripts to delete folders and files older than a specified number of days, but those scripts would delete all of the contents of the specified folder. I needed to be able to filter for the Daily, Weekly, or Monthly folders separately so that each could have its own retention period.

This script can be customized. Change the -Filter value to match the folder names that you want to delete, and change the number of days in the AddDays() call.
Another really handy option is -WhatIf at the end of a command. This prints to the PowerShell window what would be deleted, but it does not delete the files, so you can test the delete without actually deleting any folders or files. The first delete example below includes the -WhatIf option so that you can see where it goes.

$thedirectory = "C:\test\ImportFolder\Archive"
# use -WhatIf to show what would be deleted without actually deleting anything
Get-ChildItem $thedirectory -Filter "*daily*" | Where-Object {$_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(-35)} | ForEach-Object {Remove-Item $_.FullName -Force -Recurse -WhatIf}
Get-ChildItem $thedirectory -Filter "*weekly*" | Where-Object {$_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(-15)} | ForEach-Object {Remove-Item $_.FullName -Force -Recurse}
Get-ChildItem $thedirectory -Filter "*monthly*" | Where-Object {$_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddMonths(-25)} | ForEach-Object {Remove-Item $_.FullName -Force -Recurse}
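If more folder types with their own retention periods come along, the same pattern could be wrapped in a small reusable function. This is just a minimal sketch; the function name and parameters are my own, not part of the original script:

function Remove-OldArchiveFolders {
    param (
        [string]$Path,      # root archive directory to search
        [string]$Filter,    # folder name pattern, e.g. "*daily*"
        [datetime]$Cutoff   # delete folders last written on or before this date
    )
    Get-ChildItem $Path -Filter $Filter |
        Where-Object {$_.PSIsContainer -and $_.LastWriteTime -le $Cutoff} |
        ForEach-Object {Remove-Item $_.FullName -Force -Recurse}
}

# the same three deletes as above, expressed with the helper
Remove-OldArchiveFolders -Path $thedirectory -Filter "*daily*" -Cutoff (Get-Date).AddDays(-35)
Remove-OldArchiveFolders -Path $thedirectory -Filter "*weekly*" -Cutoff (Get-Date).AddDays(-15)
Remove-OldArchiveFolders -Path $thedirectory -Filter "*monthly*" -Cutoff (Get-Date).AddMonths(-25)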

This post is part of the blogging phenomenon known as TSQL Tuesday. This month’s blog party is hosted by Wayne Sheffield, who is writing a series of blog posts in the month of February all about PowerShell. I couldn’t pick just one script to share today, so here is my second post on the topic for the day.

Wait for file before processing

As part of a recent project, I needed to check if a file existed before starting an SSIS package to import that file.  The catch was that I did not know what time the file was going to be placed on the file system.  If the SSIS package runs and the file does not exist yet, the package will fail.

You can create a Script Task component within SSIS to check for a file and then sleep, but several sources said that this could spike the processor, so I decided to go a different route.  To solve the problem, I wrote a quick little PowerShell script to check if the file exists and then wait in a loop before starting the SSIS package.

I created a SQL Agent job with two steps.  The job runs daily at 1:00 AM, and it keeps a status of Executing on Step 1 until the file exists.

Step 1: PowerShell script to check for the file (see below)

Step 2: Execute SSIS package task.

The file that I am looking for in this example is: C:\test\ImportFolder\fileToCheck_20130212.txt
You will notice that today’s date is appended to the end of the file name.  I built the file name dynamically, adding the current date stamp.

$dt = Get-Date -Format yyyyMMdd
$path = 'C:\test\ImportFolder\'
$theFile = $path + 'fileToCheck_' + $dt + '.txt'

#  $theFile variable will contain:  C:\test\ImportFolder\fileToCheck_20130212.txt
while ($true) {
	if (Test-Path $theFile) {
		# file exists; break out of the loop
		break
	}
	# sleep for 60 seconds, then check again
	Start-Sleep -Seconds 60
}
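One caveat: as written, the loop waits forever if the file never arrives, which matches the job behavior described above. If you would rather have the job step fail after a maximum wait, you could add a deadline along these lines (the 6-hour limit is an arbitrary example, not part of the original job):

$deadline = (Get-Date).AddHours(6)
while (-not (Test-Path $theFile)) {
	if ((Get-Date) -gt $deadline) {
		# throwing an error fails the job step instead of waiting forever
		throw "File $theFile did not arrive by $deadline"
	}
	Start-Sleep -Seconds 60
}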

This post is part of the blogging phenomenon known as TSQL Tuesday. This month’s blog party is hosted by Wayne Sheffield, who is writing a series of blog posts in the month of February all about PowerShell.

Import Active Directory users into SQL Server

I needed to import a list of all Active Directory user accounts into a table in SQL Server for a recent project. This project also gave me a perfect opportunity to learn a little bit of PowerShell. Below chronicles the script that I built. I’m going to skip over a lot of the PowerShell basics, as that information is available from other sources. For this project, I needed to populate a table with these fields from Active Directory: display name, NT username, email address, and office phone.
I used the PowerShell Get-ADUser cmdlet to get the information out of Active Directory.
Before doing anything else, you need to open a PowerShell command window (Start -> Run -> powershell.exe) and import the ActiveDirectory module:

PS C:\> Import-Module activedirectory
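If you are not sure whether the module is installed on your machine (it ships with the Remote Server Administration Tools), you can check for it first:

PS C:\> Get-Module -ListAvailable ActiveDirectory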

After importing the module, you can learn more about the Get-ADUser cmdlet by using some of these commands:

Get-Help Get-ADUser
Get-Help Get-ADUser -examples
Get-Help Get-ADUser -detailed

Examples are great, but I learn better by seeing real results, so let’s run a quick query to see what information we get.

Get-ADUser -filter * -ResultSetSize 1
#Note, I included “-ResultSetSize 1” so that I was not overwhelming the domain controllers while testing.

Awesome, I can now see user accounts from Active Directory! The output showed me some of the information that I needed, but I was still missing some pieces (primarily email address and phone number). The “-Properties” option will let you pick additional fields to include in the output. I got a little stuck here briefly, because the Get-ADUser property names do not all match the Active Directory attribute names. To figure out what the appropriate names were, I ran this:

Get-ADUser -filter * -ResultSetSize 1 -Properties *
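A related trick, if you only want the list of property names rather than a full record dump, is to pipe the result to Get-Member:

Get-ADUser -filter * -ResultSetSize 1 -Properties * | Get-Member -MemberType Properties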

Cool, now I can put the fields together to get a shortened list of only what I am looking for:

Get-ADUser -filter * -ResultSetSize 1 -Properties EmailAddress,OfficePhone
# Note this will return additional fields (DistinguishedName,Enabled,ObjectClass, SID,…)

I got a little bit stuck here too, because I was getting too much information. When I got to the point of exporting this data to a CSV file and importing it into SQL Server (more on that later), I got hung up because some of the fields did not always have information for my organization. The solution came by using a pipe (SHIFT + \ key) and the Select-Object cmdlet. This let me select only the specific columns that I wanted out of Active Directory.

Get-ADUser -filter * -ResultSetSize 1 -Properties EmailAddress,OfficePhone | Select-Object EmailAddress,OfficePhone,DisplayName,SamAccountName

I now see only the 4 columns that I care about. On a larger-scale test, I realized that I was returning accounts that I did not want to see (like disabled accounts, administrative accounts, etc.), so I used the -Filter option to add some search criteria.
Filtering in PowerShell is a little different from what I am used to. For example, “=” is “-eq” in PowerShell, and for “not equal to” (“<>”) you would use “-ne”, or “-notlike” when wildcards are involved. You can also combine multiple conditions by wrapping the entire set in curly braces { }, putting individual comparisons in parentheses ( ), and joining them with the “-and” operator. The asterisk is the wildcard character.
For example:

-Filter {(Name -notlike "*(Administrator)") -and (Name -notlike "Matt*") -and (Enabled -eq "True") }
# I also threw in there where Name is not like Matt*

Now that I have only the fields that I want, and I have filtered out the users that I don’t want to see, I can start working on importing the data into SQL Server. I could have used PowerShell to insert the records directly into SQL Server, but I was concerned about latency issues and spamming the domain controllers into a denial-of-service attack. I was working with more than 50,000 Active Directory accounts, and I definitely did not want to hold up the domain controllers if there was an issue with the SQL Server during the process. Because of this, I decided to export the data as a comma-delimited CSV file and then use SSIS to import it.
Exporting the data to a CSV file uses another pipe (SHIFT + \ key) and the Export-Csv cmdlet. Make sure to put in the appropriate file path to export to:

#Make sure you put your file path between the < >
 | export-csv -path \\<server>\<share>\ADUsersExported.csv -NoTypeInformation -Encoding "UTF8"

Putting everything together.

Make sure to put in your appropriate file path to export to.
I also took out the “-ResultSetSize” option so that all records were returned.

#Make sure you put your file path between the < >
Get-ADUser -Filter {(Name -notlike "*(Administrator)")  -and (Enabled -eq "True") }  -Properties SamAccountName,DisplayName,EmailAddress,OfficePhone | Select-Object EmailAddress,OfficePhone,DisplayName,SamAccountName | export-csv -path \\<server>\<share>\ADUsersExported.csv -NoTypeInformation -Encoding "UTF8"

Once the data is exported to the CSV file, SSIS imports it into SQL Server. The PowerShell script and SSIS package are both scheduled to run nightly, when things should be slower on the servers.
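Before wiring up the SSIS import, a quick way to sanity-check the exported file is to read a few rows back in PowerShell (using the same placeholder path as above):

Import-Csv -Path \\<server>\<share>\ADUsersExported.csv | Select-Object -First 5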