
Monday, March 7, 2011

T-SQL Tuesday 16: Aggregation




So it’s that time again, T-SQL Tuesday. This month the event is hosted by Jes Borland (Blog|Twitter). So Jes, what’s the subject?

I want to hear how you solved business problems with aggregate functions. I want to see your cool T-SQL tricks. How are aggregates used in SSRS, SSAS, or SSIS? What have you learned about aggregate functions?

So Dear Reader, Aggregation it is!

SOME THINGS JUST GO TOGETHER




We’ve all probably done our fair share of fancy aggregates. Most of the aggregate code I’ve written has been for business functionality, sitting with clients I’ve worked for in the past. I think we all have a nightmare report, script, or task that we accomplished using aggregates, and while those things make for interesting stories, they aren’t that useful here. I’d like to give you something you can actually use (if you don’t already have a script that does what I’ve done here).

Sometimes the most useful things are simple, like PB & J (simple and delicious!). Here are a couple of queries I use that rely on the SUM aggregate function. Simple but effective.

The first is a quick script that extracts the information from DBCC SHOWFILESTATS, using the "undocumented" sp_msforeachdb, to gather the total space used by all data files on the instance, the total space reserved for all data files on the instance, and then a breakdown by data file per database. Lastly, it takes the output of DBCC SQLPERF(LOGSPACE) and sorts it by log file size.

The second script follows the same concept. You can use the built-in "Disk Usage by Top Tables" report to get the same info, but sometimes it’s just nice to build a script to do the work for you. This script uses sp_spaceused and the "undocumented" sp_msforeachtable to build a list of tables ordered by row count and by the space reserved for data. (I use this second script in my compression demos to help find an appropriately sized table while examining compression.) Then I apply a SUM to each of the fields to show an aggregate view of your database by user table.

Script 1 – Get Database Size Stats

/*============================================================
--@SQLBalls: SQLBalls@gmail.com
--http://www.SQLBalls.com
--T-SQL TUESDAY 16 Aggregation
--
--This Sample Code is provided for the purpose of illustration only
--and is not intended to be used in a production environment.
--THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS"
--WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING
--BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR
--FITNESS FOR A PARTICULAR PURPOSE.
==============================================================*/

IF EXISTS(SELECT * FROM tempdb.sys.tables WHERE name LIKE '#myTable%' AND name NOT LIKE '#myTable2%')
BEGIN
    DROP TABLE #myTable;
END
CREATE TABLE #myTable (
                      [myID] [int] IDENTITY(1,1) NOT NULL,
                      fileid varchar(4),
                      fileGroup varchar(4),
                      TotalExtents int,
                      usedExtents int,
                      DBName sysname,
                      filename1 varchar(8000));
GO
                     
IF exists(SELECT * FROM tempdb.sys.tables WHERE name like '#myTable2%')
BEGIN
    DROP TABLE #myTable2;
END
CREATE TABLE #myTable2(
                      dbName sysname,
                      logSize float,
                      spaceUsed float,
                      dbStatus int
                      );                   
GO

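--Run DBCC SHOWFILESTATS in every database via the undocumented sp_msforeachdb,
--capturing one row per data file (extents used and reserved) into #myTable.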
INSERT INTO #myTable
EXEC ('sp_msforeachdb @command1="use [?]; dbcc showfilestats"');
GO
                     
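--DBCC SQLPERF(LOGSPACE) returns one row per database: log size (MB),
--percentage of log space used, and a status flag.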
INSERT INTO #mytable2
EXEC( 'dbcc sqlperf(logspace)');
GO

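--Instance-wide totals: an extent is 8 pages x 8 KB = 64 KB,
--so extents * 64 / 1024 converts the counts to MB.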
SELECT
((SUM(usedExtents)*64)/1024) AS DatabaseUsedSpaceinMB
,((SUM(totalExtents)*64)/1024) AS DatabaseSizeinMB
FROM #myTable;
GO

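--Breakdown by data file, largest files first, including the free space remaining inside each file.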
SELECT
    DBName
    ,usedExtents
    ,TotalExtents
    ,((usedExtents*64)/1024) AS DatabaseUsedSpaceinMB
    ,((totalExtents*64)/1024) AS DatabaseSizeinMB
    ,(((totalExtents*64)/1024)-((usedExtents*64)/1024)) as DataFileFreeSpace
 FROM
    #mytable
 ORDER BY
    ((totalExtents*64)/1024) DESC;
 GO


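--Log usage captured from DBCC SQLPERF(LOGSPACE), sorted by log size descending.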
SELECT
    *
FROM
    #myTable2
ORDER BY
    logsize DESC;
GO

DROP TABLE #myTable;

DROP TABLE #myTable2;
GO
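If you would rather skip the undocumented procedure, here is a minimal sketch of an alternative (assuming SQL Server 2005 or later, since it relies on the sys.master_files catalog view). It uses SUM to aggregate the allocated size of every data and log file per database; note that it reports reserved size only, not space used, so it complements rather than replaces the script above.

--sys.master_files stores size in 8 KB pages, so size / 128 = MB.
SELECT
    DB_NAME(database_id) AS DBName
    ,SUM(CASE WHEN type_desc = 'ROWS' THEN size ELSE 0 END)/128 AS DataFileSizeInMB
    ,SUM(CASE WHEN type_desc = 'LOG'  THEN size ELSE 0 END)/128 AS LogFileSizeInMB
FROM sys.master_files
GROUP BY DB_NAME(database_id)
ORDER BY SUM(size)/128 DESC;
GO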




Script 2 – Get Table Size Stats

/*============================================================
--@SQLBalls: SQLBalls@gmail.com
--http://www.SQLBalls.com
--T-SQL TUESDAY 16 Aggregation
--
--This Sample Code is provided for the purpose of illustration only
--and is not intended to be used in a production environment.
--THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS"
--WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING
--BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR
--FITNESS FOR A PARTICULAR PURPOSE.
==============================================================*/

IF EXISTS(SELECT * FROM tempdb.sys.tables WHERE name LIKE '#myTable%')
BEGIN
    DROP TABLE #myTable;
END

CREATE TABLE #myTable(
                               [Name] sysname,
                               [Rows] int,
                               [Reserved] varchar(500),
                               [Data] varchar(500),
                               [Index_Size] varchar(500),
                               [Unused] varchar(500)
                               );

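--Run sp_spaceused against every user table via the undocumented sp_msforeachtable;
--the size columns come back as strings with a ' KB' suffix (for example '1234 KB').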
INSERT INTO #myTable
EXEC ('sp_msforeachtable @command1="sp_spaceused [?]"');
GO

SELECT * FROM #myTable
ORDER BY
    [rows] DESC,
    CAST(LEFT([reserved],(LEN([reserved])-3)) AS int) DESC;

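--Strip the trailing ' KB' with LEFT/LEN so each value can be CAST to int
--and SUMmed into a database-wide total per column.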
SELECT
    SUM(CAST(LEFT([reserved],(LEN([reserved])-3)) as int)) as [Total_Reserved(KB)]
    ,SUM(CAST(LEFT([data],(LEN([data])-3)) as int)) as [Total_Data(KB)]
    ,SUM(CAST(LEFT([Index_Size],(LEN([Index_Size])-3)) as int)) as [Total_IndexData(KB)]
    ,SUM(CAST(LEFT([Unused],(LEN([Unused])-3)) as int)) as [Total_UnusedSpace(KB)]
FROM  #myTable

GO

DROP TABLE #myTable;

  
Happy T-SQL Tuesday, and I hope you enjoy the scripts!

Thanks,
Brad

Monday, January 10, 2011

A House and a Dog! T-SQL Tuesday 14: Techie New Year’s Resolutions

So T-SQL Tuesday is back again already, the first of 2011, and the theme is techie resolutions. This topic is courtesy of the Jen half of the MidnightDBAs (twitter)(blog).

I really like the idea for this topic, because we all have a lot of personal goals when it comes to a new year, but this is all about techie resolutions and what you want to achieve as it relates to technology. 



You might be wondering how a House and a Dog make the list of techie goodness. Houses are great and they can be tech savvy, but a dog….. really, a dog…..!?

Look at those eyes, they can be techie if they want to.  So here we go! 

1. MCITP DBA SQL 2008
I’ve got my MCTS for SQL 2008, so one test to go on this one; this will be the first resolution I wrap up this year.
2. MCITP Database Developer SQL 2008
I will start this track as soon as I finish the MCITP for the SQL 2008 DBA. My birthday is in late January, and I’ll be getting the 70-433 book and a Prometric 2-Pack to help with this.
3. 13 T-SQL Tuesdays
This will be a year-long project. I did my very first T-SQL Tuesday last month, and a full year will put me at 13, so the goal is 13 T-SQL Tuesdays by the end of the year :)!

4. 40 Blog Posts
I’m new to blogging, and it is not as easy as it looks when you are casually visiting a site. Each post requires quite a bit of work. I have 4 kids, I’m married, and I work a full-time job. Needless to say, the one thing I’m not lacking is things to do. The thing I like most about blogging is that it forces me to keep tight priorities, and my goal is at least 40 blog posts for the year. That is less than 4 a month. I’d like to do more, but I’m of the under-promise, over-deliver line of thinking.

5. 6 Presentations to the SQL Community
I don’t know how, but I missed most of the SQL Community, except for SQLServerCentral.com, while I was breaking into the field and becoming a DBA. It wasn’t until 2009, while I was working in D.C., that our Vice President of Microsoft Technologies came to me and said, “You’re really interested in SQL Server, right?” “Yes” was my simple response. I had already gotten my MCITP SQL DBA for 2005 and was working on my MCTS for 2008, so I thought "like" was an understatement. You like a sandwich, you like a joke, you like a movie, but I have a deep-rooted desire to know SQL like I know comic books.

He said he was going to forward me a link he had gotten for something called 24 Hours of PASS. Needless to say, I was hooked. Not only did I love presentations from folks like Brent Ozar (twitter) (blog) and Louis Davidson (twitter) (blog), but I thought about what I do every day and felt I had things I could share as well.
Working for the Office of the President, I was a little leery of putting myself out there, so when I left I wanted to find a company that would encourage me to be active in the SQL Community. Heck, I didn’t need encouragement, I just needed the thumbs up! And yes, I got the big thumbs up.
My first presentation will be at SQL Saturday 62 in Tampa this Saturday, January 15th, where I’ll be presenting on Page and Row Compression: How, When, and Why. I’ve presented internally for my company and for previous audiences, but this will be my first venture into the SQL Community and I’m very excited. I’m also going to give a 30-minute presentation at OPASS on March 8th, Transparent Data Encryption: the Lightning Round. So that is 2 down, 4 to go.
6. MCM Prep
This is a yearlong goal for me. I’m already watching all the MCM videos, taking notes, and buying books. Come 2012 I want to tackle this head on, but it is an expensive proposition, especially when you have four kids (six of us including Mrs. Balls and myself). Mrs. Balls is super awesome, very supportive, quite beautiful, and she said I could go for it as long as I get her a House and a Dog. So believe it or not…
7. A House and a Dog are techie goals!
My kids will be so happy to hear this! The kids have wanted a Dog for quite some time, and this move was all about putting down roots long term. To be quite honest, if I only achieve one thing on this list, this is by far the most important.

So there it is, not quite a top 10, but 7 will do for me this year. So Dear Reader, thanks for stopping by and Happy New Year!

Thanks,

Brad

Wednesday, December 8, 2010

T-SQL Tuesday 13: Who Gave YOU the Authority to Make Changes to MY Database?

"Who Gave YOU the Authority to Make Changes to MY Database?"
I was very new to a DBA position, and this was the question posed to me by a very, very, very angry user. “I’m duf SeQL DBA,” I not so eloquently replied, swallowing what seemed like a very dry bite of the sandwich I had been enjoying before this exchange began.
Let me step back a couple of weeks. I had just started a new position as the Sr. SQL DBA, and the job was not shaping up like I thought it would. A disaster that occurred before I landed on the ground had wiped out all of the documentation. I mean, there was zero documentation: there was no list of the servers to be managed, in many cases there were no environments other than production, there was minimal change management tracking, and the where-to-start list kept shifting. Did I mention that on my first day my boss had taken me to lunch to warn me that the last 2 people to hold my position had been unceremoniously fired within 6 months?

Initially I had been told I would only manage 5 servers. That seemed like a cakewalk, and to be honest I was a little disappointed. The number 5 was given because there was no list of servers, and these were the 5 most important servers anyone could think of. Soon that list grew to over 100 servers, all to myself.
I had been told of the utter importance of one particular system, so I figured that would be the best place to start. There were no regular backups of all databases, just some databases. Some that had an acceptable level of data loss of no more than 5 minutes were in the Simple recovery model and had not been backed up… ever.
So I went through the system following change management, documenting everything, setting up regular maintenance, getting everything in line, examining existing jobs, and gathering baselines.  So when the user in question showed up, I had reviewed the jobs, looked at what they actually did, and changed the name of one job to better reflect what it did.
Unbeknownst to me, because it was not documented, the job I renamed was part of a business process another user followed and kicked off manually. Because I changed the name (about 1 hour before this user would kick off the job), there was no job that matched the name exactly as it had been before, and in this particular Division that started a storm that headed straight to my desk.
Several issues were brought to light by this incident: first, the lack of documentation would only lead to more problems; second, the process I was following for change management needed to be better defined; and third, I needed to know more about my databases.
The incident as it played out involved the intervention of management, explanations on my part, and assurances that I had done everything correctly and thoroughly (which is probably why I continued to have a job). Did I mention that on my first day my boss had revealed to me that the last 2 people to hold my position had been unceremoniously fired, and this was before all of the documentation had been lost?

FIX IT! FIX IT! FIX IT!
An experience like this makes you check Monster, but it also leaves you with two alternatives: you can have a poor attitude toward the customer in question, or you can acknowledge that things need to be fixed and do your best.
Or as Oscar Rogers would say
Step 1: Fix
Step 2: IT
Step 3: Repeat Steps 1 & 2 TILL IT’S FIXED!
One of the things I did was re-define the Change Management process for databases. Meeting with the users and documenting every last detail became crucial, as did developing baseline standards and making sure they got implemented across the board, but most important was engaging the users before and after the changes were made.
My main issue with interacting with the business was that the Database Administrator position had been such a source of instability that the business was not used to having a DBA. In order to do my job I needed to re-introduce the profession and assure the business that I wasn’t “messing” with their data; it was my job to protect it, help them, and explain the how’s and why’s where possible. There was a lot of doubt, and my job was to build trust.
It wasn’t an easy task, but every time I was able to provide documentation, server lists, or any standard procedures, I was able to chip away at that doubt. Every time a disaster occurred and we were able to recover, it chipped away a little more.
Trust Comes Full Circle

A day eventually came when this user had an issue with their database. It was a case of logical corruption: the process they ran via human interaction required some parameters, and the parameters that had been entered were incorrect. That caused the data produced by the batch process to be incorrect and inconsistent. To compound the problem, after the data was mis-entered the user had waited to the point that rolling back the transactions, simply finding the most recent data entered, or recovering from backups was no longer possible.

A couple of things had changed since our last interaction: I had documentation on the process, I had created development and staging environments, and I knew how guarded this user would be with their data, so I wasn’t caught off guard.

The first thing I did was reassure them that their data’s integrity was my top priority. I laid out the plan to correct the data (which was easy to find thanks to the documentation), and I walked them through the change process and how we would validate and test in each environment. Now, these are not big revolutionary concepts; they are pretty simple and standard. But when you go from not having structure to having a more standard process, the difference is night and day.

By the time we had finished, the user who had once surprised me while I was eating lunch surprised me again by emailing my boss, his boss, and their boss to tell them what a great job I had done for them.

It was a very hard won victory, to get this user not only to trust me, but to trust that as a DBA I cared just as much about their data as they did.  You can have issues with the business where they are slow to adapt technology, slow to move in a certain direction, or have misconceptions about technology as a whole.  But if the business doesn't trust you to do what is best for them, then all the knowledge in the world won't help you sell your point of view.