Friday, September 12, 2014

Professional Microsoft SQL Server 2014 Administration

Hello Dear Reader!  Last September I was approached with a tremendous opportunity to become the Managing Author for the Professional Microsoft SQL Server 2014 book by Wrox.  We worked throughout the CTP phase and solidified the book after RTM.  By the end of July all the pages were in, all the chapters proofed, and we all had a collective sigh of relief.

I was joined in this book by Steven Wort, Ross LoForte, Chad Churchwell (@chadchurchwell | blog), and Jorge Segarra (@SQLChicken | blog) from Microsoft and Adam Jorgensen (@ajbigdata | blog), Brian Knight (@BrianKnight), Kim Hathaway (@sqlkimh), Roger Wolter (@rwolter50 | blog), Dan Clark, and Kathy Vick (@MSKathyV | blog) from Pragmatic Works.

Our technical reviewers were Kathi Kellenberger (@auntKathi), Jason Strate (@StrateSQL | blog), and my buddy Daniel Taylor (@DBABulldog | blog).

The initial goal was to just update the book.  It quickly became a project to gut and replace old ideas with newer material, a vision that continued to grow and should make the next release of the Pro Admin series drastically different even from this book.

I couldn't be prouder of this crew.  Roger Wolter is a former PM from Microsoft who helped write Service Broker, and he has done some of the largest and most interesting implementations of it in the world.  You will see his handiwork in Chapter 6 on Service Broker and Chapter 8 on Securing the Database Instance.

I worked with Jorge very closely before he joined Microsoft.  He is an amazing guy, with a thirst for new knowledge.  Jorge took Chapter 1 on the SQL Server Architecture and Chapter 24 on SQL Server Azure Administration and Configuration.

My friend Kim Hathaway and I teamed up on Chapter 2, Installation Best Practices, and Chapter 3, Upgrading SQL Server 2014 Best Practices.  Dan Clark, .NET coding wiz and all-around BI knowledge base, lent his talents to Chapter 7, SQL Server CLR Integration.

Kathy Vick, a former Microsoftie with two tours of duty who has been working with SQL Server since it was still Sybase code (prior to 4.2), wrote Chapter 13 on Performance Tuning T-SQL and Chapter 14 on Indexing Your Database.

Bradley Schacht did more than can be mentioned for the BI side of the house in this book.  He wrote Chapter 23 on SQL Server and SharePoint Integration.  Chad is a smart and amazing PFE for Microsoft.  He joined us at the last moment to take over Chapter 16 on Clustering in SQL Server 2014 and provided a quick and solid contribution to help us over the finish line.

Steven Wort, Ross LoForte, Brian, and Adam all produced the work that is consistent with what we have expected over the years.  Superb.

Then there's this guy.  Mr. Balls.  I was honored to be asked to work with this gifted crew.  I wrote Chapter 4 on Managing and Troubleshooting the Database Engine, Chapter 9 on In-Memory OLTP (Hekaton), Chapter 10 on Configuring the Server for Optimal Performance, and Chapter 11 on Optimizing SQL Server 2014.  Hmmm....I sense a theme.

The link to the book on Amazon is here.  Just wanted to say Thanks again to the team that put this together!

Look Mom & Dad, I'm on a Book!!!

As always Dear Reader Thanks for stopping by.

Thanks,

Brad


Tuesday, September 9, 2014

24 Hours of PASS Preview: Zero to Hero (I'm the Zero)


Hello Dear Reader!  We are already underway with the 24 Hours of PASS Summit Preview for 2014!  I don't know if you heard, but I have a pre-con at the PASS Summit!  More precisely, SQL MVP Robert Cain (@arcanecode | Blog) had a great idea for a pre-con and invited SQL MVP/MCM Jason Strate (@stratesql | Blog) and me to join him.

We decided early on that we had a great opportunity to showcase how we use PowerShell to complete tasks on Business Intelligence, DBA, and Cloud engagements, and to pass on real-world skills.  We also want to do it in a way that is immediately useful.  Our goal is to give you material that you can take out of the pre-con and use right away.

We also realized that with a bunch of smart guys (and me) presenting, we had the opportunity to use humor and a bit of stage acting.  I'll be playing the role of the Zero in our pre-con.

"So Balls", you say, "What's a Zero, and how do you play one? (and why are you explaining this)?"


Great questions Dear Reader!  First let's talk about: what's a Zero?  I will be pretending that I do not know how to use PowerShell.  That I don't understand the verbiage, variables, functions, modular code design, how to import modules, or a lot of other stuff you need to know.  I will need to learn from the ground up, as if I'm a beginner in the class.
Hopefully me at the Summit (without the awkward flying)

I will ask questions, get explanations, and help bring the audience along.  As the day progresses I'll become a hero using concepts and technology to deliver some end to end solutions.  I'll even take over the Azure PowerShell portion at the very end of the day.

Why am I explaining this?  PASS has an international audience and I'm not a professional actor.  I'll do my best, but some may miss the humor in what we are presenting.  Robert, Jason, and I spoke last night, and we didn't want anyone to think that I didn't actually know PowerShell or give reason to doubt why I'm participating in the pre-con.

So sit back today, enjoy our session.  I hope you enjoy me being the Zero, and come to the Summit to find out how to be a Hero with me.

But wait, there's more!!  Today during our session Robert and I will do most of the talking.  Jason will be live answering your PowerShell questions using the #pass24HOP hash tag on Twitter, and answering the questions from the room chat as well.  This promises to be a fun session, hope to see you there!

Here is the link to our pre-con.   Here is a link to the 24 Hours of PASS website.  Good luck, happy learning, and as always Thanks for stopping by.

Thanks,

Brad

Thursday, September 4, 2014

Why You Should Go to SQL Saturday

SQL Sat Puerto Rico
Hello Dear Reader!  Soon SQL Saturday #318 in Orlando, FL will be here.  The SQL Community does a lot of work around SQL Saturdays: presenting at them and helping put them on.  When talking about them, one of the most frequent questions I get asked is: “Why should I go to a SQL Saturday?”


Almost 4 years ago I attended my first SQL Saturday; attending was a last-minute decision and one that has changed my life.  I have a real passion for SQL Saturdays, and while results may vary, my simplest answer is “they can be life changing”.  Here’s how I got there.



Summit 2013 - Denny's awesome Party
The only SQL event I had ever attended was the first 24 Hours of PASS.  I loved it.  I watched with eager anticipation; this was the first SQL training I’d ever been to.  Every company I’d worked for thus far had balked at sending me to training.


I desperately wanted training.  When I discovered the 24 Hours of PASS I became a fan, FREE SQL Training on the internet!!! What a concept!  Of course it was to plug the PASS Summit, and if training was a no go you can guess what my chances of ever going to the Summit in 2009 were. ZERO.  This was as close as I could get, but closer than I’d ever been before.


So as the PASS Summit 2010 was gearing up there was another 24 Hours of PASS.  I reserved conference rooms at my company, registered for the events, had a router for network connections set up, and pumped up the “free” training to the other DBAs.  I worked from there for two days as the sessions were streamed.
Jorge at SQL Sat Jacksonville


While talking with the other DBAs is when the magic moment happened.  My friend Greg and my buddy Dan Taylor (@DBABulldog | Blog) said, “If you like the 24 Hours of PASS you’ll love SQL Saturday”.  What’s a SQL Saturday, I asked?

A free event where Consultants, MVP’s, and SQL Community members set up tracks and have free presentations all day long.  I was stunned.  It was like I was a child hearing about “FREE CANDY” given out at Halloween for the first time.  Where was this? When was this? This weekend!  In Orlando!  I can do that!  I had to pay $5 for my lunch, but other than that no cost.  I almost felt like I was getting away with something.  As if someone would stop me at the gate and say, “Sorry Sir, you get to sit in the lobby only paying attendees get to see the sessions.”  It didn’t happen.  I got in just fine.


Tom Larock kicking off SQL Sat OC
It was everything I’d wanted.  Sessions on Wait Stats, PBM, CMS, and Indexing, plus two deep dives, one on partitioning and another on CPU!  I met DBAs who understood my pains: issues with hardware stressed beyond capacity, aging relics with critical LOB apps that we couldn’t get new hardware for, 3rd-party vendors with bad indexes, bad code, and little support.  People trying to find a way to survive, with new insights and experiences shared openly and free.  People who understood my issues without having to pretend that they did.


I met Tom Larock (@SQLRockstar | Blog), Argenis Fernandez (@DBArgenis | Blog), Jorge Segarra (@SQLChicken | Blog), Patrick LeBlanc (@PatrickDBA | Blog), and one half of my future law firm of Biguns and Balls, Jack Corbett (@Unclebiguns | Blog).  There were more.  Lots more.  It could take me pages more.  The point is I made it, and it was like coming home.


Jason and Steve at SQL Live 360 

That day started it off.  Without Kendal Van Dyke (@SQLDBA | Blog), Andy Warren (@SQLAndy | Blog), Karla Landrum (@KarlaKay22 | Blog), and Jack putting on SQL Saturday #49, I wouldn’t be here today.

I submitted to be a speaker at the next event I could, started a blog (you may be familiar with this one), got on LinkedIn, and even got a Twitter account.  That event, that one SQL Saturday, led me to presenting at 7 more the next year.


It led to getting a spot in the 2nd-chance track at SQL Rally, getting voted in by the community to present at the PASS Summit 2011, and being invited onto the planning committee for SQL Saturday Orlando #85 the year after I’d first attended.

Summit 2013 with the guys

At the end of SQL Saturday Orlando every year we stand at the top of a staircase, throw out t-shirts, and give away raffle items.  In 2012 Andy Warren looked at me while we were tossing out t-shirts and asked, “How’s the view from up here?”  I grinned, imagining about 50 different replies, but in the end it was a simple “amazing” that left my mouth.


My second job after college took me to Virginia.  A friend had recommended me for the position.  He met me at the airport as I flew in for my interview, so I would see a friendly face.  I thanked him.  He told me, “I showed you the door, you had to walk through it”.  He was right.  I did.  That job taught me a lot and led me to new places.


SQL Saturday was the same way.  It showed me the door.  Walking through it brought me new acquaintances, some new friends, new ideas, to SSUG’s, the PASS Summit, Dev Connection in Las Vegas, SQL Live 360 in Orlando, two books, and a pretty awesome job at Pragmatic Works.
Summit 2013 - Karaoke at the Pragmatic Works Party
This is just the journey so far.  Funny how close yet far away 2009 feels.  There is always the question, Dear Reader, of where tomorrow will take you.  We all start somewhere.  Everyone has to have a first time.  That brings us back to the question.


Why should you go to SQL Saturday?  Because they can be life changing.  Hope to see you at one soon, click here to register for Orlando.

As always, Thanks for stopping by.

Thanks,


Brad

Thursday, June 26, 2014

Deck & Demo's Live & Thank You AZSSUG & OPASS!

Hello Dear Reader!  Just a quick post to say Thank You to the Arizona SQL Server User Group and to my home town north Orlando user group OPASS!

This week I was very lucky to present Inside the Query Optimizer to the AZ SSUG and Performance Tuning, NOW! to OPASS.  I had promised to get my decks and demos live, and I wanted to do that.

Click Here for the Deck for Inside the Query Optimizer, and here for the Demos.

Click Here for the Deck for Performance Tuning, NOW!, and here for the Demos.

AZ to FL and back again.  I believe next week I'll just rest :).

Seriously Thank you to the wonderful SSUG leaders, Matt & Amy in AZ, and Shawn, Karla, and Rodney in my home town.

Without you this isn't possible!  And Dear Attendees, Thank You.  If you have any questions please feel free to shoot me an email.

As always Thanks for stopping by.

Thanks,

Brad

It's not Business, It's Personal

Hello Dear Reader.  I find myself at this late hour unable to sleep.  Yesterday the slate of speakers for the PASS Summit was announced.  What should have been a happy moment was quickly darkened by the words of people that I know well within the SQL Server Community.

I would ask the MVP's and others in the SQL Server Community; Did you plan on intimidating new speakers yesterday?  

Because you did.  I have a few first-time speakers that I've been working with.  Not first-time PASS speakers, first time period.  I've been encouraging and mentoring them to get involved in SSUGs and SQL Saturdays.  At the beginning of the year I told one in particular that we should work on a plan so she would have the experience to submit to the PASS Summit this year.

Her first words to me when we spoke yesterday?  "Thank God I didn't submit, because the MVP's would be talking smack about me right now!"

Wonderful work growing the next generation of SQL Server Speakers.  Is this what community has become?

It seems every year with the speaker selection process the people I would normally count on as pillars in our community take the opportunity to bash the process.

If the process is broken so be it.  We should discuss that.  WE SHOULD NOT LEVEL PERSONAL ATTACKS.

That is ill befitting of the responsibility that we as speakers have in the community.

I remember what it was like to be a simple DBA that looked at speakers at conferences with awe and wonder.  Instead of being a community where we encourage new speakers, what.... we encourage new speakers as long as they all are from different companies?

By attacking Pragmatic Works and suggesting that the speakers did anything less than earn their spots, you demean the volunteers, my co-workers, anyone who works for my company, and you demean me.

There were a couple of issues yesterday that compounded one another.  The first was the presentation during the 24 Hours of PASS that I moderated; see Brent's blog.

Then there was Kendal, a former board member with knowledge of the process.  He praised the volunteers and the way the process works, noting how well-known presenters who didn't receive sessions this year showed that a speaker's name did not guarantee a spot.  But he then implied that something improper had happened.  Here's his blog.  Until he accused me of having no integrity and not deserving my sessions, it was a pretty interesting read; click here.

Here's the part to pay attention to: 
  • "3 Preconference sessions by Pragmatic Works employees are on the list, including one delivered by PASS Executive Vice President, Finance & Governance Adam Jorgensen who is also President and Managing Partner of Pragmatic Works. I know a lot of folks that work at Pragmatic and they're good at what they do, but having 3 precon sessions (where presenters usually make good money from the sales) selected for the same company as one of PASS's execs...smells. I'd like to give PASS the benefit of doubt on this one, but I'll it's very hard to ignore, even if Adam wasn't one of the presenters."

HOW DO YOU HANDLE IT?


First I reacted in his comment sections.  I was mad and I called what he wrote Bullshit.  I stand by that.


I've reached out to Kendal.  I hope to talk to him soon.  This shouldn't be a conversation on Twitter or over the blog-o-sphere.  I know him, I consider him a friend, and this accusation is beneath him and regardless of the intention it is deeply personal to me.

I reached out to Brent.  Brent and I DM'ed very very ridiculously late at night.  Brent I can't thank you enough for taking the time to reply.  I hope to talk to you soon!

I completely understand Brent with the 24 HOP.  The reason I reached out to him was because of his comment on Kendal's blog.

In the comments Brent had this reply:



We discussed two different issues over DM.  One was the transparency of the process; the other was the selection itself, and whether vendors were given preferential treatment, which wasn't the case here.  Brent didn't have an issue with the Pragmatic Works folks having sessions and understood the level of community involvement that we have.

His issue was transparency.  I was really glad we could discuss this, in-digital-person.  Concerns like that should be communicated amongst friends so false insults do not fly.  I consider Brent a friend, it meant a lot that he made himself so readily available to chat.  It is what I would hope for in a friend.

This is how we should handle these things.  If you have a concern with something I'm doing, reach out to me.  

I remember well what it was like to be a simple DBA who looked at speakers at conferences with awe and wonder.  Seeing people like Brian Kelly and Andy Warren, both of whom I know, comment on this blog and not try to rein in the personal attacks is disheartening.  Andy's comments were not inflammatory, but they also did nothing to suggest I or my co-workers were above board.

I understand I haven't been at this as long as you guys.  I'm not an MVP.  I've only been speaking the last couple years.  

As a somewhat new member to all of this, I would ask the people that are supposed to be respected Sr. members of the community to conduct themselves with a little more Integrity.

If you know me, yet you would say these types of things about me, how does that make new people feel looking at our community from the outside?  Do you believe it makes them want to volunteer and participate in it?


INTEGRITY

My father taught me as a child you only have your integrity once and you should not waste it.  This means something to me.  When I invest in something, I invest wholeheartedly.  I cannot love with half my heart.  I cannot commit to something while sitting on the fence. If I did not earn something then I do not want it.

The greatest things that we get in life are the things we struggle to achieve.  It is only through the labor of the struggle that the fruits of success are realized.

This year I have presented 26 times.  From New Hampshire, to Boston, to Puerto Rico, to Orange County CA, to Denver, to Phoenix, to Atlanta, to Portland, Tampa, Orlando, and more.  I have done deep dives, pre-con's, 1 day sessions, 2 day sessions, 5 day sessions, and this doesn't even include customer presentations.  This is all community.

I have evangelized to user groups and individuals about how they should get involved, present, participate.  I discuss with them how it will help them and help their career.

I would once again point to my co-worker, who has not yet delivered their first SQL Community presentation, who said to me, "And you wonder why new people feel intimidated.  I would hate it if they were talking about me".


IT’S NOT BUSINESS IT’S PERSONAL

We've all heard the phrase before: "it's not personal, it's business".  It is typically used as the justification for doing some pretty crappy stuff.

There are some people out there that believe participating in the SQL Community is all about marketing.  That it's business.  Being out there and participating gets them business.  If it is business to them, fine.  It's not to me.  To me the SQL Community is personal.

Right now I am away from home.  I'm away from my kids.  I presented at a user group in AZ last night.  I didn't get paid for it, I didn't get "new" business leads.  As a matter of fact I spent 15 minutes of my 1 hour presentation encouraging people to volunteer.

Why?  Because I love this community.  I have received a lot in my life from the SQL Community.  I have a job I love, I've made new friends, I've traveled to new places, and I've volunteered in ways I never imagined possible.

I truly believe that within every person there is a story waiting to be told that we all want to hear.  It could be brought to life during a presentation on Professional Development, a passionate Deep Dive, or a harrowing tale of lessons learned in the trenches.  When I present I tell people there is a story in each of them that I would love to hear.  They just need to have the faith in themselves to present and the possibilities of what they can do from there are endless.

This isn't business to me.  I would never invest this much time into something I didn't love.  It's personal.

Suggesting that I submitted to the same process as anyone else and received preferential treatment isn't business.  It's personal.  And it's wrong.

I hope from here we can clear the air.  If anyone would like to talk to me about this I’m happy to.  From here on out though please separate criticism of the processes from those that are here for all the right reasons.

As always Dear Reader, Thanks for stopping by.

Thanks,

Brad


Thursday, May 22, 2014

Introducing What_To_Compress V2

Hello Dear Reader!  I'm about 32,000 feet in the air leaving St. Louis after a great Performance Tuning Workshop for Pragmatic Works.  While there Jason Strate (@StrateSQL | Blog) and I had a two day class, and I was able to meet some great people and speak with the St. Louis SSUG about the new features in SQL Server 2014.  It was a good trip, but I'm happy to be on the way home.

"So Balls", you say, "The blog title is Introducing What_To_Compress, what is that?"

Great question Dear Reader!  For quite a few years I've been presenting, blogging, and writing about Compression.  On the Resource Page I have a list of those presentations as well as the scripts I use.  I'd been thinking about putting together a script to give compression recommendations since I first did my deep dive at the PASS Summit on the subject back in 2011.

About a year ago I did just that.  I've tossed this script around to some co-workers, friends, SQL People, and MVP's and asked for feedback.  I'm finally at the point that I'm ready to release the first version.  So without further ado, here it is.

WHAT_TO_COMPRESS V2

First off, this is not a stored procedure like Jason's sp_IndexAnalysis script.  I'll probably make it dynamic in the future, but that is for a future release.  If you want to jump straight to the download, click here to get What_To_CompressV2.sql.


This takes the best practices that I've been preaching about and applies them to telling you what to compress.  It looks at every Index and every Table, by partition, and gathers the In_Row_Data, Row_OverFlow_Data, and Lob_Data counts.  It tells you the percentage of COMPRESSIBLE and UNCOMPRESSIBLE data per table, what the Scan and Update patterns are for your tables & indexes, and makes a recommendation on the level of compression you should use.
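To give a sense of the moving parts, here is a minimal sketch of the kind of metadata query the script is built on.  The column math and the 8-page filter below are illustrative only, not the actual What_To_Compress code:

```sql
-- Illustrative sketch (not the actual What_To_Compress code):
-- per-partition page counts by allocation type, and the percentage
-- of pages that compression can actually touch (IN_ROW_DATA).
SELECT  OBJECT_SCHEMA_NAME(ps.[object_id])      AS SchemaName,
        OBJECT_NAME(ps.[object_id])             AS TableName,
        i.[name]                                AS IndexName,
        ps.partition_number,
        ps.in_row_data_page_count,
        ps.row_overflow_used_page_count,
        ps.lob_used_page_count,
        CAST(100.0 * ps.in_row_data_page_count /
             NULLIF(ps.in_row_data_page_count
                  + ps.row_overflow_used_page_count
                  + ps.lob_used_page_count, 0) AS DECIMAL(5,2)) AS CompressiblePct
FROM    sys.dm_db_partition_stats AS ps
JOIN    sys.indexes AS i
        ON  i.[object_id] = ps.[object_id]
        AND i.index_id    = ps.index_id
WHERE   ps.in_row_data_page_count
      + ps.row_overflow_used_page_count
      + ps.lob_used_page_count >= 8;  -- the default 8-page threshold
```

Only the IN_ROW_DATA pages are candidates for Row or Page compression, which is why that percentage matters so much.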

It also gives you my detailed reasoning behind the recommendation that I've given you.  For example:

"The Percentage of Scan and Seek operations is 0.00% and the average amount of Update operations is 0.00%.  Data that can be compressed makes up 100.00% of this table.  There is no workload for the current table.  Please wait for the usage statistics to become representative of a typical work load.  If this is a typical work load, this is an excellent candidate for Page Compression.  Test with sp_estimate_data_compression_savings.  Remember that it takes 5% of the tables size and moves it to tempDB.  Validate that you have enough room on your server to perform this operation before attempting."

"The Percentage of Scan and Seek operations is 0.60% and the average amount of Update operations is 99.00%.  Data that can be compressed makes up 100.00% of this table.  However Based on the workload of this server this table should not be compressed.  If you apply Row or Page Compression it will have a higher CPU cost because of the low Seek and Scan Ratio.  Test with sp_estimate_data_compression_savings.  Remember that it takes 5% of the tables size and moves it to tempDB.  Validate that you have enough room on your server to perform this operation before attempting."

"The amount of Uncompressible data in this table does not make it a match for compression.  Data that can be compressed makes up 17.12% of this table.  While data that cannot be compressed makes up 82.88% of this table."

There is one parameter within the script that lets you set the number of pages a table must have to be considered for compression.  By default the number is set at 8 pages, but you can increase that if you would like.

"So Balls", you say, "This sounds great, but isn't there a built-in stored procedure that can estimate compression savings already in SQL Server?"

Yes, there is Dear Reader.  The built in stored procedure has a few things that we should discuss.

SP_ESTIMATE_DATA_COMPRESSION_SAVINGS


The first thing you should know before you use a tool is how it works, and what it does.  You wouldn't normally use a nail gun to open a beer.  You could, but it's not the right tool for the job.

The way sp_estimate_data_compression_savings works is that it takes the table you specify, moves 5% of it into tempdb, applies the compression you specify, and then extrapolates that estimate out over the size of your entire table.  It does a nice job of taking fragmentation into account in order not to give you inaccurate information.  The key phrase that defines my root concern is, *it takes 5% of your table and moves it into tempdb*.  For small tables this probably isn't an issue.  For VLDBs that have very large tables, this is a pretty big deal.
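For reference, the documented call shape looks like this; the table name below is just a stand-in:

```sql
-- Estimate PAGE compression savings for one table
-- (dbo.SalesOrderDetail is a placeholder; substitute your own table).
EXEC sys.sp_estimate_data_compression_savings
     @schema_name      = N'dbo',
     @object_name      = N'SalesOrderDetail',
     @index_id         = NULL,      -- NULL = all indexes on the table
     @partition_number = NULL,      -- NULL = all partitions
     @data_compression = N'PAGE';   -- NONE | ROW | PAGE
```

It returns the current and estimated sizes in KB per index and partition, and it is that 5% sample copied into tempdb that produces those numbers.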

There are some well-meaning community scripts available on blogs and CodePlex that take sp_estimate_data_compression_savings and wrap it in a cursor to estimate the space savings for each table.  They do this estimation for Row and Page compression, for every table in your database.

This step tells us the space savings, but there are other settings we should take into account.  We should look at those before we begin estimating compression savings across the board.  What should we look at first?


  1. Our Allocation Units.  Only IN_ROW_DATA compresses.  Tables with a lot of LOB data types may not see any advantage in compression.  Even if they compress slightly, the overhead on those tables can make queries less efficient.
  2. Do we read from our tables?  If we do a lot of scans, seeks, and lookups from our tables, this can indicate whether Page or Row compression would give us the best performance.
  3. Do we update our tables often?  Notice I said update.  Not delete, not insert; update.  When we apply compression we remove all the extra white space from fixed-length data, making all data types that can use compression, in effect, variable-length fields.  This can lead to increased Page Splits, specifically mid-Page Splits, aka LOP_DELETE_SPLITs.  For more on Page Splits and mid-Page Splits see my blog, How to Find Bad Page Splits.
  4. sp_estimate_data_compression_savings doesn't look at any of these.  Why estimate all the different compression types without first identifying which of your tables are proper candidates and looking at what the overall size of those tables is?
  5. You have to have Enterprise Edition to run sp_estimate_data_compression_savings.  You can run What_To_Compress on Standard Edition.
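The read and update patterns above can be checked from the index usage DMV before you ever sample anything into tempdb.  A query along these lines shows the read-versus-update mix; the exact ratios you act on are a judgment call:

```sql
-- Read vs. update mix per index since the last instance restart.
-- A high read percentage favors Page compression; a heavy update
-- mix argues for Row compression or none at all.
SELECT  OBJECT_NAME(us.[object_id])                     AS TableName,
        i.[name]                                        AS IndexName,
        us.user_seeks + us.user_scans + us.user_lookups AS Reads,
        us.user_updates                                 AS Writes,
        CAST(100.0 * (us.user_seeks + us.user_scans + us.user_lookups) /
             NULLIF(us.user_seeks + us.user_scans
                  + us.user_lookups + us.user_updates, 0)
             AS DECIMAL(5,2))                           AS ReadPct
FROM    sys.dm_db_index_usage_stats AS us
JOIN    sys.indexes AS i
        ON  i.[object_id] = us.[object_id]
        AND i.index_id    = us.index_id
WHERE   us.database_id = DB_ID();
```

Remember that these counters reset when the instance restarts, so make sure they reflect a representative workload before acting on them.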


I wouldn't avoid using sp_estimate_data_compression_savings.  However, it wouldn't be the first thing that I run when looking at what to compress.

FEEDBACK

Okay Dear Reader, I need your help.  Run this.  Let me know what you think, what it's missing, anything you can think of.  I'll try to get a v3 out in the next couple months based on feedback.  Most of all if it works, drop me a line!  I love hearing success stories on compression.  Send your emails to: bball@pragmaticworks.com.  And as always Dear Reader, Thank you for stopping by.


Thanks,

Brad