Thursday, June 26, 2014

It's not Business, It's Personal

Hello Dear Reader.  I find myself at this late hour unable to sleep.  Yesterday the slate of speakers for the PASS Summit was announced.  What should have been a happy moment was quickly darkened by the words of people that I know well within the SQL Server Community.

I would ask the MVPs and others in the SQL Server Community: did you plan on intimidating new speakers yesterday?

Because you did.  I have a few first-time speakers that I've been working with. Not first-time PASS speakers, first time period.  I've been encouraging and mentoring them to get involved in SSUGs and SQL Saturdays.  At the beginning of the year I told one in particular that we should work on a plan so she would have the experience to submit to the PASS Summit this year.

Her first words to me when we spoke yesterday?  "Thank God I didn't submit, because the MVP's would be talking smack about me right now!"

Wonderful work growing the next generation of SQL Server Speakers.  Is this what community has become?

It seems every year with the speaker selection process the people I would normally count on as pillars in our community take the opportunity to bash the process.

If the process is broken so be it.  We should discuss that.  WE SHOULD NOT LEVEL PERSONAL ATTACKS.

That ill befits the responsibility that we as speakers have in the community.

I remember what it was like to be a simple DBA who looked at speakers at conferences with awe and wonder.  Instead of being a community where we encourage new speakers, what... do we only encourage new speakers as long as they all come from different companies?

By attacking Pragmatic Works and suggesting that the speakers did anything less than earn their spots, you demean the volunteers, my co-workers, anyone who works for my company, and you demean me.

There were a couple of issues yesterday that compounded one another.  The first was the presentation during the 24 Hours of PASS that I moderated; see Brent's blog.

Then there was Kendal, a former board member who had knowledge of the process.  He praised the volunteers and the way the process works; the absentee presenters who didn't receive sessions this year showed that a speaker's name did not guarantee a spot.  Yet he then implied that something improper had happened.  Here's his blog.  Until he accused me of having no integrity and of not deserving my sessions, it was a pretty interesting read; click here.

Here's the part to pay attention to: 
  • "3 Preconference sessions by Pragmatic Works employees are on the list, including one delivered by PASS Executive Vice President, Finance & Governance Adam Jorgensen who is also President and Managing Partner of Pragmatic Works. I know a lot of folks that work at Pragmatic and they're good at what they do, but having 3 precon sessions (where presenters usually make good money from the sales) selected for the same company as one of PASS's execs...smells. I'd like to give PASS the benefit of doubt on this one, but I'll it's very hard to ignore, even if Adam wasn't one of the presenters."

HOW DO YOU HANDLE IT?


First I reacted in his comment section.  I was mad, and I called what he wrote Bullshit.  I stand by that.


I've reached out to Kendal.  I hope to talk to him soon.  This shouldn't be a conversation on Twitter or over the blog-o-sphere.  I know him, I consider him a friend, and this accusation is beneath him and regardless of the intention it is deeply personal to me.

I reached out to Brent.  Brent and I DM'ed very very ridiculously late at night.  Brent I can't thank you enough for taking the time to reply.  I hope to talk to you soon!

I completely understand Brent with the 24 HOP.  The reason I reached out to him was because of his comment on Kendal's blog.

In the comments Brent had this reply:



We discussed two different issues over DM.  One was the transparency of the process; the other was the selection itself, whether vendors were given preferential treatment, and that this wasn't the case here.  Brent didn't have an issue with the Pragmatic Works folks having sessions and understood the level of community involvement that we have.

His issue was transparency.  I was really glad we could discuss this, in-digital-person.  Concerns like that should be communicated amongst friends so false insults do not fly.  I consider Brent a friend, it meant a lot that he made himself so readily available to chat.  It is what I would hope for in a friend.

This is how we should handle these things.  If you have a concern with something I'm doing, reach out to me.  

I remember well what it was like to be a simple DBA who looked at speakers at conferences with awe and wonder.  Seeing people like Brian Kelly and Andy Warren, both of whom I know, comment on this blog and not try to rein in the personal attacks is disheartening.  Andy's were not inflammatory, but they also did nothing to suggest I or my co-workers were above board.

I understand I haven't been at this as long as you guys.  I'm not an MVP.  I've only been speaking for the last couple of years.

As a somewhat new member to all of this, I would ask the people who are supposed to be respected senior members of the community to conduct themselves with a little more integrity.

If you know me, yet you would say these types of things about me, how does that make new people feel looking at our community from the outside?  Do you believe it makes them want to volunteer and participate in it?


INTEGRITY

My father taught me as a child you only have your integrity once and you should not waste it.  This means something to me.  When I invest in something, I invest wholeheartedly.  I cannot love with half my heart.  I cannot commit to something while sitting on the fence. If I did not earn something then I do not want it.

The greatest things that we get in life are the things we struggle to achieve.  It is only through the labor of the struggle that the fruits of success are realized.

This year I have presented 26 times: New Hampshire, Boston, Puerto Rico, Orange County CA, Denver, Phoenix, Atlanta, Portland, Tampa, Orlando, and more.  I have done deep dives, pre-cons, 1-day sessions, 2-day sessions, 5-day sessions, and this doesn't even include customer presentations.  This is all community.

I have evangelized to user groups and individuals about how they should get involved, present, participate.  I discuss with them how it will help them and help their career.

I would once again point to my co-worker who has not yet delivered her first SQL Community presentation, who said to me, "And you wonder why new people feel intimidated.  I would hate it if they were talking about me."


IT’S NOT BUSINESS IT’S PERSONAL

We've all heard the phrase before: "it's not personal, it's business."  It is typically used as the justification for doing some pretty crappy stuff.

There are some people out there that believe participating in the SQL Community is all about marketing.  That it's business.  Being out there and participating gets them business.  If it is business to them, fine.  It's not to me.  To me the SQL Community is personal.

Right now I am away from home.  I'm away from my kids.  I presented at a user group in AZ last night.  I didn't get paid for it, I didn't get "new" business leads.  As a matter of fact I spent 15 minutes of my 1 hour presentation encouraging people to volunteer.

Why?  Because I love this community.  I have received a lot in my life from the SQL Community.  I have a job I love, I've made new friends, and I’ve traveled to new places, volunteered in ways I never imagined possible.

I truly believe that within every person there is a story waiting to be told that we all want to hear.  It could be brought to life during a presentation on Professional Development, a passionate Deep Dive, or a harrowing tale of lessons learned in the trenches.  When I present I tell people there is a story in each of them that I would love to hear.  They just need to have the faith in themselves to present and the possibilities of what they can do from there are endless.

This isn't business to me.  I would never invest this much time into something I didn't love.  It's personal.

Suggesting that I submitted to the same process as anyone else and received preferential treatment isn't business.  It's personal.  And it's wrong.

I hope from here we can clear the air.  If anyone would like to talk to me about this I’m happy to.  From here on out though please separate criticism of the processes from those that are here for all the right reasons.

As always Dear Reader, Thanks for stopping by.

Thanks,

Brad


Thursday, May 22, 2014

Introducing What_To_Compress V2

Hello Dear Reader!  I'm about 32,000 feet in the air leaving St. Louis after a great Performance Tuning Workshop for Pragmatic Works.  While there Jason Strate (@StrateSQL | Blog) and I taught a two-day class, and I was able to meet some great people and speak with the St. Louis SSUG about the new features in SQL Server 2014.  It was a good trip, but I'm happy to be on the way home.

"So Balls", you say, "The blog title is Introducing What_To_Compress, what is that?"

Great question Dear Reader!  For quite a few years I've been presenting, blogging, and writing about Compression.  On the Resource Page I have a list of those presentations as well as the scripts I use.  I'd been thinking about putting together a script to give compression recommendations since I first did my deep dive at the PASS Summit on the subject back in 2011.

About a year ago I did just that.  I've tossed this script around to some co-workers, friends, SQL People, and MVPs and asked for feedback.  I'm finally at the point where I'm ready to release this version.  So without further ado, here it is.

WHAT_TO_COMPRESS V2

First off, this is not a stored procedure like Jason's sp_IndexAnalysis script.  I'll probably make it dynamic in a future release.  If you want to jump straight to the download, click here to get What_To_CompressV2.sql.


This takes the best practices that I've been preaching about and applies them to telling you what to compress.  It looks at every index and every table, by partition, and gathers the IN_ROW_DATA, ROW_OVERFLOW_DATA, and LOB_DATA page counts.  It tells you the percentage of COMPRESSIBLE and UNCOMPRESSIBLE data per table, what the scan and update patterns are for your tables and indexes, and makes a recommendation on the level of compression you should use.
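For readers curious where those page counts come from, here is a minimal sketch of the kind of system-view query the script builds on.  This is not the actual What_To_Compress code, and the column aliases are my own; it simply shows how in-row pages (compressible) can be separated from LOB and row-overflow pages (not compressible) per partition:

```sql
-- Sketch only: total in-row vs. LOB/overflow pages per table, index, partition.
-- Joining container_id to partition_id covers the common case; strictly,
-- row-data allocation units map through hobt_id, which usually matches.
SELECT  OBJECT_NAME(p.object_id)                 AS TableName,
        p.index_id,
        p.partition_number,
        SUM(CASE WHEN au.type_desc = 'IN_ROW_DATA'
                 THEN au.used_pages ELSE 0 END)  AS InRowPages,
        SUM(CASE WHEN au.type_desc <> 'IN_ROW_DATA'
                 THEN au.used_pages ELSE 0 END)  AS UncompressiblePages
FROM    sys.partitions p
JOIN    sys.allocation_units au
        ON au.container_id = p.partition_id
GROUP BY p.object_id, p.index_id, p.partition_number;
```

Dividing InRowPages by the total gives you the compressible percentage the recommendations below refer to.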

It also gives you my detailed reasoning behind the recommendation that I've given you.  For example:

"The Percentage of Scan and Seek operations is 0.00% and the average amount of Update operations is 0.00%.  Data that can be compressed makes up 100.00% of this table.  There is no workload for the current table.  Please wait for the usage statistics to become representative of a typical work load.  If this is a typical work load, this is an excellent candidate for Page Compression.  Test with sp_estimate_data_compression_savings.  Remember that it takes 5% of the tables size and moves it to tempDB.  Validate that you have enough room on your server to perform this operation before attempting."

"The Percentage of Scan and Seek operations is 0.60% and the average amount of Update operations is 99.00%.  Data that can be compressed makes up 100.00% of this table.  However Based on the workload of this server this table should not be compressed.  If you apply Row or Page Compression it will have a higher CPU cost because of the low Seek and Scan Ratio.  Test with sp_estimate_data_compression_savings.  Remember that it takes 5% of the tables size and moves it to tempDB.  Validate that you have enough room on your server to perform this operation before attempting."

"The amount of Uncompressible data in this table does not make it a match for compression.  Data that can be compressed makes up 17.12% of this table.  While data that cannot be compressed makes up 82.88% of this table."

There is one parameter within the script that sets the minimum number of pages a table must have to be considered for compression.  By default it is set at 8 pages, but you can increase that if you would like.
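As a rough illustration of how that threshold works (the variable name here is my own guess, not necessarily what the script uses):

```sql
-- Hypothetical sketch of the page-count threshold.  8 pages = 64 KB, one
-- extent; tables smaller than this rarely repay the CPU cost of compression.
DECLARE @MinPageCount INT = 8;

SELECT   p.object_id, SUM(au.used_pages) AS UsedPages
FROM     sys.partitions p
JOIN     sys.allocation_units au ON au.container_id = p.partition_id
GROUP BY p.object_id
HAVING   SUM(au.used_pages) >= @MinPageCount;  -- only big-enough tables
```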

"So Balls", you say, "this sounds great, but isn't there a built-in stored procedure that can estimate compression savings already in SQL Server?"

Yes, there is, Dear Reader.  The built-in stored procedure has a few things that we should discuss.

SP_ESTIMATE_DATA_COMPRESSION_SAVINGS


The first thing you should know before you use a tool is how it works, and what it does.  You wouldn't normally use a nail gun to open a beer.  You could, but it's not the right tool for the job.

The way sp_estimate_data_compression_savings works is that it takes the table that you specify, moves 5% of it into tempdb, applies the compression you specify, and then extrapolates that estimate out over the size of your entire table.  It does a nice job of taking fragmentation into account so it doesn't give you inaccurate information.  The key phrase that defines my root concern is, *it takes 5% of your table and moves it into tempdb*.  For small tables this probably isn't an issue.  For VLDBs that have very large tables, this is a pretty big deal.

There are some well-meaning community scripts available on blogs and CodePlex that take sp_estimate_data_compression_savings and wrap it in a cursor to estimate the space savings for each table.  They do this estimation for Row and Page compression, for every table in your database.

That tells us the space savings, but there are other settings we should take into account before we begin estimating compression savings across the board.  What should we look at first?


  1. Our allocation units.  Only IN_ROW_DATA compresses.  Tables with a lot of LOB data types may not see any advantage in compression.  Even if they compress slightly, the overhead on those tables can make queries less efficient.
  2. Do we read from our tables?  If we do a lot of scans, seeks, and lookups from our tables, this can indicate whether Page or Row compression would give us the best performance.
  3. Do we update our tables often?  Notice I said update.  Not delete, not insert, update.  When we apply compression we remove all the extra white space from fixed-length data, making all data types that can use compression, in effect, variable-length fields.  This can lead to increased Page Splits, specifically mid-Page Splits, aka LOP_DELETE_SPLITs.  For more on Page Splits and mid-Page Splits see my blog, How to Find Bad Page Splits.
  4. SP_ESTIMATE_DATA_COMPRESSION_SAVINGS doesn't look at any of these.  Why estimate all the different compression types without first identifying the tables that are proper candidates and looking at the overall size of those tables?
  5. You have to have Enterprise Edition to run sp_estimate_data_compression_savings.  You can run What_To_Compress on Standard Edition.
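The read-versus-update question in points 2 and 3 can be checked straight from the DMVs.  Here is a minimal sketch of my own (not part of the script) of the scan-versus-update ratio per index; a high scan percentage with few updates leans toward Page compression:

```sql
-- Read vs. update activity per index since the last service restart.
-- High ScanPct = mostly reads (good Page candidate); high update counts
-- raise the CPU cost and mid-page-split risk of compression.
SELECT  OBJECT_NAME(ios.object_id) AS TableName,
        ios.index_id,
        ios.range_scan_count,
        ios.leaf_update_count,
        100.0 * ios.range_scan_count
          / NULLIF(ios.range_scan_count + ios.leaf_update_count, 0) AS ScanPct
FROM    sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL) ios
ORDER BY TableName, ios.index_id;
```

Keep in mind these counters reset on restart, which is why the script's recommendations warn you to wait until the statistics represent a typical workload.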


I wouldn't avoid using sp_estimate_data_compression_savings.  However, it wouldn't be the first thing that I run when looking at what to compress.

FEEDBACK

Okay Dear Reader, I need your help.  Run this.  Let me know what you think, what it's missing, anything you can think of.  I'll try to get a v3 out in the next couple months based on feedback.  Most of all if it works, drop me a line!  I love hearing success stories on compression.  Send your emails to: bball@pragmaticworks.com.  And as always Dear Reader, Thank you for stopping by.


Thanks,

Brad



Wednesday, April 30, 2014

Decks & Demos Live

Hello Dear Reader!  Just a quick update.  I've been traveling a lot and I'm behind in posting some Decks & Demos.  Here are my decks from the Inside the Query Optimizer talk that I did recently at the Denver SSUG and also at SQL Saturday 283 Puerto Rico.  Click here for the Slide Deck and Click Here for the Demo Scripts!  They are also posted on the Resource Page.

Thank you so much to those who attended and the Organizers for having me!!

Thanks Again,

Brad