
Script for detaching databases

February 10, 2010 17:01 by Vic

I seem to be detaching databases a lot recently, so I decided that a script to do this would help.

First, I kill any user processes accessing the database, then I detach the database.

USE master
GO

DECLARE @Kill VARCHAR(255)

-- Build a KILL statement for every connection to the database (excluding this session and sa)
DECLARE KillCur CURSOR FOR
  SELECT 'KILL ' + CONVERT(VARCHAR(5), P.spid) + ' -- ' + P.loginame
  FROM master..sysprocesses AS P
    INNER JOIN master..sysdatabases AS D ON (P.dbid = D.dbid)
  WHERE P.spid != @@SPID
    AND P.loginame != 'sa'
    AND D.name IN ('PPCRM')

OPEN KillCur
FETCH NEXT FROM KillCur INTO @Kill

WHILE @@FETCH_STATUS = 0
BEGIN
  PRINT @Kill
  EXEC (@Kill)
  FETCH NEXT FROM KillCur INTO @Kill
END

CLOSE KillCur
DEALLOCATE KillCur
GO

-- With the connections gone, detach the database
IF EXISTS (SELECT * FROM sys.databases WHERE name = 'PPCRM')
BEGIN
  EXEC master.dbo.sp_detach_db @dbname = N'PPCRM', @keepfulltextindexfile = N'true'
END
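
If you would rather not kill each session individually, an alternative (just a sketch of the same idea, using the same PPCRM database) is to put the database into single-user mode first, which rolls back and drops the other connections in one go:

USE master
GO

ALTER DATABASE PPCRM SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO

EXEC master.dbo.sp_detach_db @dbname = N'PPCRM', @keepfulltextindexfile = N'true'
GO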

 



Using sp_msforeachtable to list the space used for every table

December 21, 2009 15:47 by Vic

One useful bit of code is for when you want to identify which tables in a particular database are using most of the space in the data file. This script uses the undocumented sp_msforeachtable procedure to do it:

 

USE PowerProgrammingAudit
GO

EXEC sp_msforeachtable 'EXEC sp_spaceused [?]'
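
Since each call just returns its own result set, it can also be handy to capture the output into a temporary table and sort it so the biggest tables come out on top. Here's a rough sketch of that idea (the columns mirror what sp_spaceused returns; treat it as a starting point rather than a polished script):

USE PowerProgrammingAudit
GO

-- Collect the sp_spaceused output for every table, then order by reserved space
CREATE TABLE #SpaceUsed
(
    TableName SYSNAME,
    TableRows VARCHAR(20),
    Reserved  VARCHAR(20),
    DataSize  VARCHAR(20),
    IndexSize VARCHAR(20),
    Unused    VARCHAR(20)
)

EXEC sp_msforeachtable 'INSERT INTO #SpaceUsed EXEC sp_spaceused [?]'

SELECT *
FROM #SpaceUsed
ORDER BY CONVERT(BIGINT, REPLACE(Reserved, ' KB', '')) DESC

DROP TABLE #SpaceUsed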



Making a permission allocation script for a SQL Agent job re-runnable

December 12, 2009 08:40 by Administrator

I needed to make some permission scripts for a SQL Server Agent job re-runnable; here's how I did it:

 

 

IF EXISTS (SELECT dbid FROM master..sysdatabases WHERE name = 'LiveServerTrigger')
BEGIN
    -- Grant the job's Windows group access to the database if it doesn't already have it
    IF NOT EXISTS (SELECT 'X' FROM sys.sysusers WHERE name = N'PP\SQLServerJobs')
    BEGIN
        EXEC sp_grantdbaccess 'PP\SQLServerJobs', 'PP\SQLServerJobs'
    END

    -- Add the user to the SQLServerJob role if it isn't already a member
    IF NOT EXISTS (SELECT S.sid
                   FROM sys.sysmembers AS M
                     INNER JOIN sys.sysusers AS S ON (M.memberuid = S.uid)
                     INNER JOIN sys.sysusers AS R ON (M.groupuid = R.uid)
                   WHERE R.name = 'SQLServerJob'
                     AND S.name = 'PP\SQLServerJobs')
    BEGIN
        EXEC sp_addrolemember 'SQLServerJob', 'PP\SQLServerJobs'
    END

    -- Make sure the job itself is owned by the right login
    EXEC msdb.dbo.sp_update_job @job_name = N'PowerProgrammingAuditingTrigger', @owner_login_name = N'PP\SQLServerJobs'
END
GO
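
As an aside, sp_grantdbaccess is deprecated from SQL Server 2005 onwards. If you wanted the same re-runnable pattern against the newer catalog views, it would look something like this sketch (same user and role names as above):

IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'PP\SQLServerJobs')
BEGIN
    CREATE USER [PP\SQLServerJobs] FOR LOGIN [PP\SQLServerJobs]
END

IF NOT EXISTS (SELECT *
               FROM sys.database_role_members AS RM
                 INNER JOIN sys.database_principals AS R ON (RM.role_principal_id = R.principal_id)
                 INNER JOIN sys.database_principals AS U ON (RM.member_principal_id = U.principal_id)
               WHERE R.name = 'SQLServerJob'
                 AND U.name = N'PP\SQLServerJobs')
BEGIN
    EXEC sp_addrolemember 'SQLServerJob', 'PP\SQLServerJobs'
END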

 



Bulk delete of DTS Packages

November 13, 2009 17:28 by Vic

I needed to clear out all the DTS packages on one of my servers prior to re-installing them, and I came across this very useful script on another blog:

http://neilblackburn.blogspot.com/2009/08/delete-all-dts-packages-on-sql-server.html

 

Here is a version of the script which allows you to be more selective about which DTS packages you delete:

 

DECLARE @PackagesToRemove TABLE (Package VARCHAR(100))

-- Place insert statements here for each package name you wish to remove
INSERT INTO @PackagesToRemove (Package) VALUES ('Import Single FX File')
-- end of insert statements

-- Uncomment this if you wish to remove ALL DTS packages
/*
INSERT INTO @PackagesToRemove (Package)
SELECT SP.Name
FROM msdb.dbo.sysdtspackages AS SP
*/

-- Open a cursor containing all DTS package IDs
DECLARE @packageUID UNIQUEIDENTIFIER
DECLARE packages CURSOR FOR

SELECT DISTINCT id 
FROM msdb.dbo.sysdtspackages AS SP
	INNER JOIN @PackagesToRemove AS PTR ON (SP.Name = PTR.Package)

OPEN packages
FETCH NEXT FROM packages INTO @packageUID

WHILE @@FETCH_STATUS = 0
BEGIN

	EXEC msdb.dbo.sp_drop_dtspackage NULL, @packageUID, null
	FETCH NEXT FROM packages INTO @packageUID

END

CLOSE packages
DEALLOCATE packages


ASP.NET stopped working after installing "Microsoft .NET Framework 2.0 Service Pack 1 - Update 'KB953300' "

October 21, 2009 16:16 by Vic

We had a problem with one of our live servers: one of our system admins installed the "Microsoft .NET Framework 2.0 Service Pack 1 - Update 'KB953300'" patch. Since he installed this patch during working hours, he deferred the reboot until the evening.

 

Unfortunately, when this patch was installed it recycled the worker processes (hence chucking everyone off the systems and clearing their state). Then, on one of our boxes, all our ASP.NET applications (.NET 2 through 3.5) stopped responding. All we would get was a 404 response; we checked the IIS log files and could see that our requests were hitting IIS, but the ASP.NET worker process was not picking up the requests.

 

In IIS we checked the configuration to make sure the right ISAPI extension was selected to process the requests, then re-started IIS, but neither of these fixed the issue.

 

We then re-registered ASP.NET with IIS using the Microsoft tools found in the .NET folder (C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727). First we ran

 

aspnet_regiis -ir

 

This is supposed to register the .NET DLLs back with IIS but not touch the script maps. Unfortunately, this did not fix it.

 

Next we did a slight variation, using

 

aspnet_regiis -i

 

This is supposed to register the .NET DLLs back with IIS and will also refresh the script maps. Unfortunately, this did not fix it.

 

When we sent a request, I could see that the aspnet_wp.exe worker process was not starting, so I was positive something was wrong with the IIS registration, and I looked into the other parameters available for aspnet_regiis. I noticed the -enable flag. I would have thought ASP.NET would be enabled by default, but what the heck, let's put that on as a parameter and see what happens:

 

aspnet_regiis -i -enable

 

Guess what! It fixed the issue. So my question to all of you is: why did I have to specify -enable? Why did this patch disable ASP.NET? Hopefully our system admins won't install these patches without testing them on another box, and hopefully they will roll patches out out of hours in future.

 

 

 



"Safely Remove Hardware" icon is missing

October 18, 2009 16:29 by Vic

For some reason the Safely Remove Hardware icon is missing from the notification area; I've tried numerous fixes for this, but none seems to be permanent.

What I do now when I need to remove hardware is go to Start -> Run and use the following command line to load up the dialog:

 

RunDll32.exe shell32.dll,Control_RunDLL hotplug.dll

 



Restoring Databases and Transaction Logs

October 9, 2009 11:30 by Vic

Here's a script that I've written to help me restore a database and the relevant transaction logs. I am sure I could make it a lot more elegant, but it works fine for what I need.

SET NOCOUNT ON

DECLARE @DataFilePath VARCHAR(100)
DECLARE @BackupFilePath VARCHAR(100)

DECLARE @MainBackUpFile VARCHAR(100)
DECLARE @LogicalNameData VARCHAR(100)
DECLARE @LogicalNameLog VARCHAR(100)
DECLARE @FileNameData VARCHAR(100)
DECLARE @FileNameLog VARCHAR(100)
DECLARE @Database VARCHAR(100)

DECLARE @TLID INT
DECLARE @TransactionLogFileName VARCHAR(100)
DECLARE @isFinalTranLog BIT

DECLARE @SQL NVARCHAR(MAX)

DECLARE @TransactionLogs TABLE (TLID INT IDENTITY(1,1) PRIMARY KEY, TransactionLogFileName VARCHAR(100), isFinalTranLog BIT DEFAULT 0, Processed BIT DEFAULT 0)

-- START OF REQUIRED CHANGES
SET @DataFilePath = 'D:\Data\'
SET @BackupFilePath = 'E:\backups\'

-- CHANGE THESE PARAMETERS
SET @Database = 'PowerProgramming'

SET @MainBackUpFile = 'PowerProgramming_backup_200910082200.bak'
SET @LogicalNameData = 'PowerProgramming_Data'
SET @LogicalNameLog = 'PowerProgramming_Log'
SET @FileNameData = 'PowerProgramming.MDF'
SET @FileNameLog = 'PowerProgramming.LDF'

-- INSERT EACH TRANSACTION LOG INTO THIS TABLE IN THE CORRECT ORDER
INSERT INTO @TransactionLogs (TransactionLogFileName, isFinalTranLog)
 VALUES ('PowerProgramming_backup_200910090800.trn', 0)

INSERT INTO @TransactionLogs (TransactionLogFileName, isFinalTranLog)
 VALUES ('PowerProgramming_backup_200910091000.trn', 1)
-- END OF REQUIRED CHANGES

SET @SQL = 'RESTORE DATABASE [' + @Database + '] FROM  DISK = N''' + @BackupFilePath + @MainBackUpFile + ''' WITH  FILE = 1,  MOVE N''' + @LogicalNameData + ''' TO N''' + @DataFilePath + @FileNameData + ''',  MOVE N'''+ @LogicalNameLog + ''' TO N''' + @DataFilePath + @FileNameLog + ''',  NORECOVERY,  NOUNLOAD,  STATS = 10'

PRINT @SQL
EXEC sp_executesql @SQL

SELECT TOP 1 @TLID = TLID, @TransactionLogFileName = TransactionLogFileName, @isFinalTranLog = isFinalTranLog
 FROM @TransactionLogs
 WHERE Processed = 0
 ORDER BY TLID ASC

WHILE EXISTS (SELECT TLID FROM @TransactionLogs WHERE Processed = 0)
BEGIN
 SET @SQL = 'RESTORE LOG [' + @Database + '] FROM  DISK = N''' + @BackupFilePath + @TransactionLogFileName + ''' WITH  FILE = 1, '

 IF @isFinalTranLog != 1
  BEGIN
   SET @SQL = @SQL + 'NORECOVERY,'
  END

 SET @SQL = @SQL + 'NOUNLOAD,  STATS = 10'

 PRINT @SQL
 EXEC sp_executesql @SQL

 UPDATE @TransactionLogs
  SET Processed = 1
 WHERE TLID = @TLID

 SELECT TOP 1 @TLID = TLID, @TransactionLogFileName = TransactionLogFileName, @isFinalTranLog = isFinalTranLog
  FROM @TransactionLogs
  WHERE Processed = 0
  ORDER BY TLID ASC
END

SET NOCOUNT OFF
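
If you're not sure what to put in the logical file name parameters, RESTORE FILELISTONLY will read them straight out of the backup file; for example:

-- Lists the logical and physical file names contained in the backup
RESTORE FILELISTONLY
FROM DISK = N'E:\backups\PowerProgramming_backup_200910082200.bak'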

 



Truncating large transaction logs

September 30, 2009 12:12 by Vic

We have several UAT and pre-production boxes; these are regularly refreshed from a copy of our live database and then anonymised. As these databases do not have maintenance plans running against them, the log files often get huge. We don't really want to back them up or set up maintenance plans for them, as this would be a waste of resources.

 

 

Instead we do the following:

 

USE PowerProgrammingCRM
GO

BACKUP LOG PowerProgrammingCRM WITH TRUNCATE_ONLY
DBCC SHRINKFILE(PowerProgrammingCRM_log)
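
One thing to be aware of: BACKUP LOG ... WITH TRUNCATE_ONLY was removed in SQL Server 2008, so on 2008 and later the rough equivalent is to switch the database to the SIMPLE recovery model, shrink the log file, and (only if you actually take log backups on that box) switch back to FULL:

ALTER DATABASE PowerProgrammingCRM SET RECOVERY SIMPLE
GO

USE PowerProgrammingCRM
GO

DBCC SHRINKFILE(PowerProgrammingCRM_log)
GO

ALTER DATABASE PowerProgrammingCRM SET RECOVERY FULL
GO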


Using ildasm and ilasm to modify a .NET assembly

September 23, 2009 13:56 by Vic

We had a bit of an issue with some of our legacy applications this morning: one of our infrastructure engineers had "accidentally" changed the password on a domain account, which caused certain aspects of our applications to fail. WHY?

 

Because someone had hard-coded the user name and password in the source code (not me!).

 

We couldn't change the password back to the original as our domain policy would not permit it, and we could not change the domain policy to get around the issue. This left us with only three options:

1.  Go back to the branched code and change the passwords in the code, re-compile the whole application, re-run all the tests and put the new code through our deployment process.

2.  Disassemble the problem assemblies, change the hard-coded passwords, then recompile the code and replace the DLLs on the live systems.

3.  Fix the code so these credentials are stored in a more suitable place (i.e. not in the code!)

 

Well, we're doing 1 and 2 for now. We have already done 3 in a to-be-released branch, but we can't easily retrofit those changes into a previous release branch.

 

Anyway, here's how I used ildasm and ilasm to achieve this.

 

First, we need to disassemble our .NET assembly into IL code. You can do this using the ildasm tool; from the command line you would run the following:

 

ildasm /out=e:\app\PowerProgramming.Domain.il e:\app\original_dlls\PowerProgramming.Domain.dll

 

Alternatively, if you just type ildasm at the command line you will get a user interface that does the same thing. Simply open up the assembly and then "Dump" the IL code.

 

Second, you update the IL code with any modifications you need to make (in our case, the hard-coded credentials).

 

Finally, you use ilasm to recompile the IL code into an assembly.

ilasm e:\app\PowerProgramming.Domain.il  /resource=e:\app\PowerProgramming.Domain.res /dll /output=e:\app\PowerProgramming.Domain.dll /include=e:\app\original_dlls

Simple as that... The only question to answer is where do you find these utilities?

 

ildasm = C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin

ilasm = C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727



Zipping and copying SQL Server backups automatically

August 23, 2009 15:55 by Vic

We have several networks here, in several locations, with differing types of lines connecting them. Some of the connections are fast, others are slow. From time to time we want to restore a set of databases in another environment for testing or for diagnosing an issue.

To simplify this, I created a PowerShell script that monitors the server (hourly) for any databases that have been backed up, zips each backup file up locally, and copies it over to the other network. This means that if/when I need a backup, it should already be on the local network and I don't have to wait for the file to copy over the slow lines!

Here's the script...

 

# Configuration
$global:SQLServerToMonitor = "ppsql1";                              # server to monitor
$global:UNCSourceLocationForFiles = "\\ppsql1\e$\";                 # UNC path to the backup files on the monitored server
$global:MachineLocalTargetLocationForFiles = "E:\";                 # equivalent local path the server writes its backups to
$global:TargetLocationForFiles = "\\pptestsql1\LatestLiveBackups\"; # network path the zipped backups are copied to
$global:CreateZIPFileIn = "E:\Temp\";                               # local staging folder the zip is built in before being moved (not declared in the original script - adjust to suit)
$global:Zip7Path = "C:\Program Files\7-Zip\";                       # path to 7-Zip
$global:DatabaseList = "'PPMain','PPAssets','PPMonitor'";           # databases to pick up backups for
$global:HoursToScan = 1;                                            # how far back to look for new backups

function monitorServer()
{
    $msg = Get-Date -Format g;
    AppendToLog "Scan initiated on $msg" "White";

    $sevenZipCmd = Join-Path $Zip7Path "7z.exe";
    $expectedLength = [int]($global:MachineLocalTargetLocationForFiles.Length);

    # Full backups taken in the last $HoursToScan hours for the databases we care about,
    # translating the server's local backup path into its UNC equivalent
    $sql = "SELECT physical_device_name, REPLACE(physical_device_name, '$MachineLocalTargetLocationForFiles', '$UNCSourceLocationForFiles') AS UNCLocation " +
           "FROM msdb.dbo.backupmediafamily " +
           "JOIN msdb.dbo.backupset ON (msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id) " +
           "WHERE backup_start_date > DATEADD(hour, -$HoursToScan, GETDATE()) " +
           "AND type = 'D' " +
           "AND database_name IN ($DatabaseList) " +
           "AND LEFT(physical_device_name, $expectedLength) = '$MachineLocalTargetLocationForFiles';";

    $cn = New-Object System.Data.SqlClient.SqlConnection("Server=$SQLServerToMonitor;Database=msdb;Trusted_Connection=True;");
    $cn.Open();

    $cm = New-Object System.Data.SqlClient.SqlCommand;
    $cm.Connection = $cn;
    $cm.CommandText = $sql;

    $da = New-Object System.Data.SqlClient.SqlDataAdapter ($cm);
    $ds = New-Object System.Data.DataSet "dsBackups";
    $da.Fill($ds, "dsBackups") | Out-Null;

    $dtChild = $ds.Tables[0];
    $dtChild | ForEach-Object {
        $pDB = $_.physical_device_name;
        $dbUNC = $_.UNCLocation;
        $rawFile = [System.IO.Path]::GetFileNameWithoutExtension($pDB);

        # Backups are named <database>_backup_<timestamp>, so strip the timestamp to build a
        # wildcard that matches older archives of the same database
        $searchFileName = "";
        $lastUnderscore = $rawFile.LastIndexOf("_");
        if ($lastUnderscore -gt 0)
        {
            $searchFileName = $rawFile.Substring(0, $lastUnderscore) + "*.zip";
        }

        $archivename = Join-Path $CreateZIPFileIn "$rawFile.zip";
        $SearchForOldArchives = Join-Path $TargetLocationForFiles "$searchFileName";
        $archiveDestination = Join-Path $TargetLocationForFiles "$rawFile.zip";

        if (-not (Test-Path $archiveDestination))
        {
            # Clear out older archives of the same database from the destination
            if ((Test-Path $SearchForOldArchives) -and ([string]::IsNullOrEmpty($searchFileName) -eq $false))
            {
                AppendToLog "Deleting old archive $SearchForOldArchives" "White";
                $msg = Remove-Item $SearchForOldArchives -Exclude $archivename -Force -Recurse;
                AppendToLog " -- $msg" "Blue";
            }

            if (Test-Path $dbUNC)
            {
                if (-not (Test-Path $archiveDestination))
                {
                    # Zip the backup locally, then move the archive to the destination share
                    $params = 'a "' + $archivename + '" "' + $dbUNC + '"';
                    $cmd = "'$sevenZipCmd' $params";

                    AppendToLog "Archiving $rawFile" "White";
                    $msg = Invoke-Expression "& $cmd";
                    AppendToLog " -- $msg" "Blue";

                    $searchDestination = Join-Path $TargetLocationForFiles "$searchFileName";
                    if ($searchDestination -ne "" -and (Test-Path $searchDestination))
                    {
                        AppendToLog "Deleting old backups on destination" "White";
                        $msg = Remove-Item $searchDestination;
                        AppendToLog " -- $msg" "Blue";
                    }

                    AppendToLog " - Moving from $archivename to $archiveDestination" "Blue";
                    $msg = Move-Item -LiteralPath $archivename -Destination $archiveDestination -Force;
                    AppendToLog " -- $msg" "Blue";
                }
                else
                {
                    AppendToLog "$archivename already on destination, not re-doing work" "Red";
                }
            }
            else
            {
                AppendToLog "$dbUNC not found, skipping archive" "Red";
            }
        }
        else
        {
            AppendToLog "$rawFile already exists, skipping archive" "Green";
        }
    }

    $cn.Close();
}

function AppendToLog([string]$msg, $foreColor)
{
    # Echo to the console (defaulting to white) and append to a log file next to the script
    if ($foreColor -eq $null)
    {
        Write-Host -Object $msg -ForegroundColor "White";
    }
    else
    {
        Write-Host -Object $msg -ForegroundColor $foreColor;
    }

    Out-File -Append -FilePath "BackupMove.log" -InputObject $msg -Encoding Ascii;
}

monitorServer;