Tuesday, December 21, 2010

SQL Saturday 62: Row Compression

I'm really excited to have been selected as a speaker for SQL Saturday 62 in Tampa on January 15th, 2011. Register today at http://www.sqlsaturday.com/62/register.aspx !

Just a quick blog this evening; the whole family, myself included, has been under the weather.  I wanted to get something out on this, and I'll put more out in the coming days.

The topic I will be presenting on is Compression.  My session is titled Page and Row Compression: How, When, and Why, and I'm very excited to be presenting on this topic.  There are also a lot of really great sessions by other speakers; to check out the full lineup, click here.

I'm going to break the subject up and talk about it in parts, starting with a very brief introduction to the concepts behind Row Compression.


Row Compression

Row Compression was introduced to us in SQL 2008.  Its predecessor, vardecimal compression, was introduced in SQL 2005 SP2 and subsequently deprecated in SQL 2008.  It is still available in 2008, and it does almost the same thing as Row Compression, but only on decimal and numeric fields.
The premise is that you take the fixed-length columns in a record and store them as if they were variable length, so an int holding the value 1 takes 1 byte instead of 4; the per-column metadata overhead is trimmed down to 4 bits per column; and there is special handling for NULL and zero values, which take up no space at all.
In SQL 2008 R2, Row Compression was extended even further to include Unicode compression.
Okay, it sounds smart but what does it mean? 

Think Vacuum Sealed.

And Tasty, Tasty meat.
But mainly Vacuum Sealed.




We are squeezing every bit of extra space out to trim the size of the data stored in a record on a page.  Row compression should be synonymous with saving space!
If you move from a bigger house to a smaller house, you toss some things out, re-organize, possibly consolidate a little.  This is what is going on.  Now keep in mind that not all data types are covered, and this applies only to in-row data; LOB and row-overflow data will not be compressed.
We'll save more for later, but needless to say we next need to talk about Page Compression: what makes it up, and how to tell if your tables are candidates for compression based on your record types.  Your table's activity level and type can also help you estimate what kind of overhead to expect from compression.
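If you want to kick the tires before the session, here is a minimal sketch; dbo.MyBigTable is just a hypothetical stand-in for one of your own tables.  SQL 2008 ships with sp_estimate_data_compression_savings, which samples the table into tempdb to estimate what row compression would save you, and actually enabling it is a single ALTER TABLE.

--======================================================
--Estimate the savings first, then rebuild with row
--compression if the numbers look good
--======================================================
EXEC sp_estimate_data_compression_savings
      @schema_name = 'dbo',
      @object_name = 'MyBigTable',    --hypothetical table name
      @index_id = NULL,               --NULL = all indexes
      @partition_number = NULL,       --NULL = all partitions
      @data_compression = 'ROW'

ALTER TABLE [dbo].[MyBigTable] REBUILD WITH (DATA_COMPRESSION = ROW)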

Wednesday, December 8, 2010

T-SQL Tuesday 13: Who Gave YOU the Authority to Make Changes to MY Database?

"Who Gave YOU the Authority to Make Changes to MY Database?"
I was very new to a DBA position, and this was the question posed to me by a very, very, very angry user.  "I'm duf SeQL DBA," I not so eloquently replied, swallowing what seemed like a very dry bite of the sandwich I had been enjoying before this exchange began.
Let me step back a couple weeks.  I had just started a new position as the Sr. SQL DBA, and the job was not shaping up like I thought.  A disaster that occurred before I landed on the ground had wiped out all of the documentation.  I mean, there was zero documentation: there was no list of the servers to be managed, in many cases there were no environments other than production, there was minimal change management tracking, and the where-to-start list kept shifting.  Did I mention that on my first day my boss had taken me to lunch to warn me that the last 2 people to hold my position had been unceremoniously fired within 6 months?

Initially I had been told I would only manage 5 servers.  That seemed like a cakewalk, and to be honest I was a little disappointed.  The number 5 was given because there was no list of servers, and these were the 5 most important servers that anyone could think of.  Soon that list grew to over 100 servers, all to myself.
I had been told of the utter importance of one particular system, so I figured that would be the best place to start.  There were no regular backups of all databases, just some databases.  Some databases that had an acceptable level of data loss of no more than 5 minutes were in the Simple recovery model and had not been backed up….ever.
So I went through the system following change management, documenting everything, setting up regular maintenance, getting everything in line, examining existing jobs, and gathering baselines.  By the time the user in question showed up, I had reviewed the jobs, looked at what they actually did, and changed the name of one job to better reflect what it did.
Unbeknownst to me, because it was not documented, the job I renamed was part of a business process another user would follow and kick off manually.  Because I changed the name (about 1 hour before this user would kick off the job), there was no job matching the exact name they had used before, and in this particular division that started a storm that headed straight to my desk.
There were several issues brought to light by this incident: first, the lack of documentation would only lead to more issues; second, the process I was following for change management needed to be better defined; and third, I needed to know more about my databases.
The incident as it occurred involved the intervention of management, explanations on my part, and assurances that I had done everything right and thoroughly (which is probably why I continued to have a job).  Did I mention that on my first day my boss had revealed to me that the last 2 people to hold my position had been unceremoniously fired, and this was before all of the documentation had been lost?

FIX IT! FIX IT! FIX IT!
An experience like this, the kind that makes you check Monster, will leave you with two alternatives: you can have a poor attitude toward the customer in question, or you can acknowledge that things need to be fixed and do your best.
Or as Oscar Rogers would say
Step 1: Fix
Step 2: IT
Step 3: Repeat Steps 1 & 2 TILL IT’S FIXED!
One of the things I did was re-define the change management process for databases.  Meeting with the users and documenting every last detail became crucial, as did developing baseline standards and making sure that they got implemented across the board; but most important was engaging the users before and after the changes were made.
My main issue with interacting with the business was that the Database Administrator position had been such a source of instability that the business was not used to having a DBA.  In order for me to do my job, I needed to re-introduce the profession and assure the business that I wasn't "messing" with their data; it was my job to protect it, to help them, and to explain the how's and why's where possible.  There was a lot of doubt, and my job was to build trust.
It wasn't an easy task, but every time I was able to provide documentation, server lists, or any standard procedures, I was able to chip away at that doubt.  Every time a disaster occurred and we were able to recover, it chipped away a little more.
Trust Comes Full Circle

A day eventually came when this user had an issue with their database.  It was a case of logical corruption: the process they ran requires some human-entered parameters, and the parameters that had been entered were incorrect, which caused the data produced by the batch process to be incorrect and inconsistent.  To compound the problem, by the time the mis-entry was discovered the user had waited to the point that rolling back the transactions, simply finding the most recent data entered, or recovering from backups was no longer possible.

A couple of things had changed since our last interaction: I had documentation on the process, I had created development and staging environments, and I also knew how guarded this user would be with their data, so I wasn't caught off guard.

The first thing I did was to reassure them that their data's integrity was my top priority.  I laid out the plan to correct the data (which was easy to find thanks to the documentation), and I walked them through the change process and how we would validate and test in each environment.  Now these are not big revolutionary concepts; they are pretty simple and standard.  But when you go from not having structure to having a more standard process, the difference is night and day.

By the time we had finished, the user who had once surprised me while I was eating lunch surprised me again by emailing my boss, his boss, and their boss to inform them what a great job I had done for them.

It was a very hard-won victory to get this user not only to trust me, but to trust that as a DBA I cared just as much about their data as they did.  You can have issues with the business where they are slow to adopt technology, slow to move in a certain direction, or have misconceptions about technology as a whole.  But if the business doesn't trust you to do what is best for them, then all the knowledge in the world won't help you sell your point of view.
 

Monday, November 29, 2010

Working On Presentations

Happy Thanksgiving!!!!   Just wanted to take a second to write a quick update: I'm working on some presentations for an upcoming SQL Saturday in Tampa.  I've submitted a couple of sessions and we'll see if any get picked up.  Regardless, I'm looking forward to attending the event and catching up with some friends and fellow DBAs.

Monday, November 15, 2010

Master Data Services, Step 1 What is it?

So I'm taking a look at Master Data Services, which is a new addition to SQL 2008 R2.  SQL 2008 R2 added a lot of new features to SQL Server, and a lot of them will push DBAs outside of the normal box we sit in.  That box has become a lot smaller over the last couple of years as certifications have pushed us to be Database Administrators, Database Developers, or Business Intelligence Developers.  But all three professions have areas that overlap one another, and more often than not the way a workplace defines the title depends on what they want you to do.
Strictly speaking, and going by the areas of study covered by the certifications, Master Data Management would seem to be a concept covered by Data Architects and maybe Database Developers.  However, Master Data Services ships with SQL 2008 R2, and if your company bought it you may be asked at some point to install it, or even be asked what it does.  In that case, here is a bit of an overview.
First, if you are using SQL 2008 R2 or looking at it, I would encourage you to go to http://blogs.msdn.com/b/microsoft_press/archive/2010/04/14/free-ebook-introducing-microsoft-sql-server-2008-r2.aspx and download the free e-book Introducing Microsoft SQL Server 2008 R2 by Ross Mistry and Stacia Misner.  Ross is an expert with Microsoft who currently implements SQL Server solutions out of the Microsoft Technology Center in Silicon Valley, and Stacia is a BI expert who currently works for the amazing folks at SQLSkills.com.  The e-book is free, which is always a great bargain, and is chock-full of tasty knowledge nuggets……. mmmmmm nuggets.
So what is Master Data Services?  The 50,000-foot view: it is a location where you can import multiple sources of data, organize the data, and present a master view of the data that can be exported to source databases, without being, in and of itself, the gold master database.  Master Data Services uses a web front end and is not exposed via SSMS (SQL Server Management Studio).  It can integrate with SharePoint and have workflows and business rules written against it.  It exposes an API and can have custom C# components coded for it.
You can set it up as a multi-server deployment, or on one server.  However, it will require x64 hardware and a web server.  For an overview of how to set up and install MDS, take a look at the Master Data Services team blog, http://sqlblog.com/blogs/mds_team/archive/2009/12/10/installing-and-configuring-master-data-services-2008-r2-november-ctp.aspx .
So reading over that description you may say, "this is a DBA/database technology?!?"  Microsoft has decided that it will bundle its data-related components with SQL Server.  This is good and bad, in that normally you would do a point-and-click install for SQL Server; there are much more complex installs, but normally you have a database server, a set of drives, and your application lives on this server (be it virtual or physical).
You have applications that use your databases and data cubes, and you may have dedicated SSRS and SSIS servers.  But at the end of the day we are a portion of an application, not typically the whole of an enterprise-level application.  Master Data Services is, IMHO, an enterprise-level application.  It is a much deeper dive than just point, click, optimize, and maintain.
I'll be taking a deeper look as I go, but I just wanted to offer up front that this will not be a normal DBA road trip.
Thanks,

Brad

Denali

Denali CTP1 is available for download, http://www.microsoft.com/sqlserver/en/us/product-info/future-editions.aspx .  This will be the first major version release since SQL 2008, and it will build upon the minor version release that was SQL 2008 R2.  There look to be some really interesting features that will need to be fleshed out more.  Brent Ozar has blogged about Hadron here, http://www.brentozar.com/archive/2010/11/sql-server-denali-database-mirroring-rocks/ .

Hadron is where mirroring is going for SQL 2011.  Some of the new features: it is built on top of Windows Clustering, you can have up to 4 mirrors (only 1 in CTP1), and you can execute read-only queries against the mirrors, with no need for snapshots.  For more info see here, http://msdn.microsoft.com/en-us/library/ff877884(v=SQL.110).aspx .
Columnstore indexes were teased during the keynote on PASS opening night.  These would be big for data analytics on SQL Server, allowing SQL to scream and compete with Oracle, DB2, Teradata, and Informix.  The TPC benchmarks on this will be unlike anything SQL has seen before.
This will be a really interesting ride.  More to come.
Thanks,

Brad

Wednesday, November 10, 2010

Powershell 2.0 Required for UCP on Windows XP

So I’m trying to install SQL Server 2008 R2 Utility Control Point, so I can take a look at all that the product has to offer and see how we can utilize it at my current place of business.

For the sake of full disclosure, I'm doing this on a laptop running Windows XP SP3, x86, with dual-core processors and about 3 GB of memory.

I installed it about a week ago, set up another named instance, and had a SQL Agent job running stored procedures from AdventureWorks every minute to help simulate a workload on one of the databases.

I've read how it can take up to 45 minutes for data to get posted to the dashboard, so I set up the jobs and got busy doing other things.   One week later I take a look at my Utility Explorer, expecting to see all the beautiful-looking screens.


Needless to say this isn’t what I expected to see.  It almost looks as if there is no data.  So I check my enrolled instances to see what status they are running under.





Now I'm really bothered; apparently something has gone wrong.  So I take a look at my SQL Agent jobs to see if the jobs were created and whether the history shows them running properly.


I see that my sysutility_mi_collect_and_upload job has been failing.  So I isolate just that job, seeing as how collecting and uploading the data is directly related to seeing nice dashboards (or so I would guess).


And YOWZA! This has not been running at all.  A quick glance back over the history shows this job has never run correctly.  Let's take a closer look at why it is failing.


Step 2 of the job is failing, and the error is big and ugly.  The important part is the following:

  (Get-Wmiobject  <<<< Win32_MountPoint) |   InvocationName   : Get-Wmiobject  PipelineLength   : 1  PipelinePosition : 1          WARNING : 11/8/2010 12:25:10 PM : 00000000000000000000000000000000000000000000000000000000000000000000000000000000WARNING : 11/8/2010 12:25:10 PM :     ErrorRecord    : Command execution stopped because the shell variable "ErrorActi                   onPreference" is set to Stop: Invalid class   StackTrace

When I set up the UCP I remember seeing one yellow warning/yield sign.  Honestly, I breezed right by it, because if it's not going to work you get a big red error sign, right?  Right?  Wrong.

The warning box was pointing to WMI not being set up correctly.  There are several WMI classes that are required by UCP.  So the question is: what are they, and how do you check that they are there?

They are:
Win32_MountPoint
Win32_Volume
Win32_LogicalDisk
Win32_PerfRawData_PerfProc_Process
Win32_PerfRawData_PerfOS_Processor
Win32_Processor

So how do you check for them?  Open up PowerShell and type get-wmiobject [objectname]



As you can see from above this is the error you will get if you are missing one of the classes.

A little more research shows me that PowerShell 1.0 and Windows XP do not contain the Win32_MountPoint and Win32_Volume classes.  This was reported on Microsoft Connect here: https://connect.microsoft.com/SQLServer/feedback/details/540412/sysutility-mi-collect-and-upload-job-failure .

Microsoft employee Jennifer Beckmann reported that the user was able to get everything working after installing PowerShell 2.0.

The link above also lists a "work around" by a user named Dryknot.  No offense to Dryknot, who is clearly a great person for taking the time to post something to help others, but I would not recommend this solution, as it involves opening up the SQL Agent job and editing the PowerShell code.  There are 2 reasons I would not make this change.

  1. You are changing the requirements so the information is gathered from a different class: Win32_MountPoint is swapped for Win32_MappedLogicalDisk.  It's like saying I want to gather information on an orange and buying an apple.  One thing is not like the other, even though they are both drive classes.
  2. I'm missing 2 classes.  Even if I were to fix the Win32_MountPoint issue, I would still have a problem with Win32_Volume.  And if you use mount points, you know how critical it is to gather space usage on them.  DON'T CHEAP OUT HERE.  Do not accept substitutes; go for the original.

So if you are planning on implementing a UCP on a Windows XP machine, make sure that you have PowerShell 2.0 running, or you could end up with the same results I did.

Thanks,

Brad

******DISCLAIMER********
All advice is AS IS.  There are no warranties or services provided as a result of anything you see here.

Tuesday, November 9, 2010

Dynamic Permissions


I needed to script out a database role and all of the permissions that the role has for a server migration; we are moving the databases from one server to a new location.  You could try the Generate Scripts tool, but while that gave me the script to re-create the role, it did not give me all of the rights the role currently has.

I opened up the advanced options and still no luck.

So I did a quick Google search on how to script out roles and found an old post by SQL MVP Russell Fields where he was answering a question on the SQL Security Forum, http://social.msdn.microsoft.com/Forums/en-US/sqlsecurity/threads.

Russell's response, http://social.msdn.microsoft.com/Forums/en-US/sqlsecurity/thread/26ffcac4-8d4c-4107-b355-80e63344fb8e , showed how to use the catalog views sys.database_permissions and sys.database_principals to query a database for the roles defined in it, as well as the permissions each of those roles granted.

So my first step is most of the way done.  I now know how to get the info; now I want to dynamically generate a script that will re-create this role for me and allow me to use it on multiple databases.

One thing I will need to do is add the schema to the returned results, as we will want to make sure our object mappings are correct.  So I'll add sysobjects and the schema_name() function to this query to pull that back.  My first results look like this.

select
      u.name,
      p.permission_name,
      p.class_desc,
      schema_name(o.uid),
      object_name(p.major_id) ObjectName,
      state_desc
from
      sys.database_permissions  p
      join sys.database_principals u
      on p.grantee_principal_id = u.principal_id
      join sysobjects o
      on p.major_id=o.id
where
      u.name='DatabaseRoleName'
order by
      ObjectName, u.name, p.permission_name

You will want to fill in the database role you are looking for in the WHERE clause, or remove the WHERE to get a full listing of all the permissions for all of the roles in the database you are querying.

So now we've got everything we want; next I need to string it all together into the syntax I would like.


select
      u.name,
      p.permission_name,
      p.class_desc,
      schema_name(o.uid),
      object_name(p.major_id) ObjectName,
      state_desc,
      (state_desc + ' ' + p.permission_name + ' ON ' + '['+schema_name(o.uid)+'].['+object_name(p.major_id)+'] TO ' + u.name) As PermissionText
from
      sys.database_permissions  p
      join sys.database_principals u
      on p.grantee_principal_id = u.principal_id
      join sysobjects o
      on p.major_id=o.id
where
      u.name='DatabaseRoleName'
order by
      ObjectName, u.name, p.permission_name

This gives me exactly what I want syntax-wise; each row comes out looking like, for example, GRANT SELECT ON [dbo].[MyTable] TO DatabaseRoleName.  But now I need to insert "GO" in between each row, and I would like to make sure to specify the database that I'm working in, so I have a full script generator.

So now I'll flesh out the script a little bit.  I need to do some row-by-row alterations, and I'm not looking at a large set of data, so I'll use a table variable to hold all the syntax that I'm going to generate.


--======================================================
--Declare our variables
--======================================================
declare @dbname  varchar(500), @i int, @x int, @sqlcmd varchar(max), @RoleName nvarchar(500)


--======================================================
--Get the Current DB Name
--======================================================
set @dbname = (select DB_NAME())
set @Rolename = N'DatabaseRoleName'

--======================================================
--Declare the 2 Table Variables I'll be using
--======================================================
declare @myTable as Table(
                  [myID] [int] IDENTITY(1,1) NOT NULL,
                  mySQLText varchar(max)
                  )
declare @myTable2 as Table(
                  [myID] [int] IDENTITY(1,1) NOT NULL,
                  mySQLText varchar(max)
                  )                


--======================================================
--Populate @myTable
--======================================================
insert into @myTable
select
      (state_desc + ' ' + p.permission_name + ' ON ' + '['+schema_name(o.uid)+'].['+object_name(p.major_id)+'] TO ' + u.name) As PermissionText
from
      sys.database_permissions  p
      join sys.database_principals u
      on p.grantee_principal_id = u.principal_id
      join sysobjects o
      on p.major_id=o.id
where
      u.name=@Rolename


--======================================================
--Set the @i count = to the count from @mytable
--Set @x = 0
--======================================================
set @i = (select count(*) from @myTable)
set @x=0


--======================================================
--Begin the loop to create our dynamic statements
--you can either use table 2 or you could print the values
--to have a text you could just copy and paste
--this will work either way
--======================================================
while @x < @i
      Begin
            set @x= @x +1
            if (@x=1)
                  Begin
                        set @sqlcmd= 'Use [' + @dbname + ']'
                       
                        insert into @myTable2(mySQLText)
                        values(@sqlcmd)
                       
                        set @sqlcmd='GO'
                       
                        insert into @myTable2(mySQLText)
                        values(@sqlcmd)
                       
                        set @sqlcmd ='IF  not EXISTS (SELECT * FROM sys.database_principals WHERE name = N'+ ''''+ @Rolename+ ''''+ ' AND type = '+ ''''+ 'R' + ''''+ ')'
                       
                        insert into @myTable2(mySQLText)
                        values(@sqlcmd)
                       
                        set @sqlcmd='CREATE ROLE [' + @Rolename + '] AUTHORIZATION [dbo]'
                       
                        insert into @myTable2(mySQLText)
                        values(@sqlcmd)
                       
                        set @sqlcmd='GO'
                       
                        insert into @myTable2(mySQLText)
                        values(@sqlcmd)
                       
                  End
           
            set @sqlcmd = (select mySQLText from @myTable where myID=@x)
           
            insert into @myTable2(mySQLText)
            values(@sqlcmd)
           
            set @sqlcmd='GO'
           
            insert into @myTable2(mySQLText)
            values(@sqlcmd)
           
      End

--======================================================
--Query @myTable2 and get the fruits of your labor
--======================================================   
select mySQLText from @myTable2    

And here you go!

As I said in the comments above, instead of using a second table variable you can comment it out, and where the script inserts values into @myTable2 you would just use a PRINT command to get the script as something you can copy and paste, as sketched below.
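Here is a minimal sketch of that PRINT variant, assuming the same variable declarations and the same @myTable population as the script above; the generated script shows up in the Messages tab instead of a result set.

--======================================================
--PRINT variant of the loop above: same logic, but the
--output goes to the Messages tab for copy and paste
--(assumes @dbname, @Rolename, @i, @x, @sqlcmd, and
--@myTable are declared and populated as shown above)
--======================================================
set @x=0
while @x < @i
      Begin
            set @x = @x + 1
            if (@x=1)
                  Begin
                        print 'Use [' + @dbname + ']'
                        print 'GO'
                        print 'IF not EXISTS (SELECT * FROM sys.database_principals WHERE name = N''' + @Rolename + ''' AND type = ''R'')'
                        print 'CREATE ROLE [' + @Rolename + '] AUTHORIZATION [dbo]'
                        print 'GO'
                  End
            set @sqlcmd = (select mySQLText from @myTable where myID = @x)
            print @sqlcmd
            print 'GO'
      End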

Thanks,

Brad

Sunday, November 7, 2010

1st Blog, 1st Post - Why the name SQL Balls

First I'd like to say hi, and if you are reading this, YAY!!! (mini dance in celebration of your visit).   This is my first go at a blog.  I've been a SQL Database Administrator for a while now, and I've read a lot of blogs and thought now was the time to jump in and join the fray.

My name is Bradley Ball, and wherever I went people always tried to give me nicknames, but the only one that ever stuck was Balls: in high school, in college, when I worked with the guys in the Army, at the Office of the President... wait, they just called me Brad there... and in my fantasy football league.  So Balls is the only name that has ever stuck, and being a SQL Server DBA I figured SQLBalls it would be.

So a little about my background: I've got my MCITP SQL 2005 DBA and MCTS SQL 2008 DBA certifications.  I got my 2005 certs while working as a contractor to the U.S. Army and my 2008 cert while working as a contractor for the Executive Office of the President.  During that time I don't think I would have felt comfortable having a blog, but this year I left the world of government contracting to work in the private sector for Publix Supermarkets.

I've worked in a lot of different environments, and I'm continually amazed by how much there is to learn about SQL Server and how it can be implemented, maintained, and troubleshot.  If you're a SQL professional, I hope that I can post something I've found along the way that helps you, as I've found so many great professionals out there blogging and sharing their knowledge as well.

So thanks for visiting!  I'm going to follow up with a couple of posts on dynamic scripts that I've been working on, as well as some work I've been doing on Transparent Data Encryption.

Thanks for Visiting!

Brad