Tuesday, September 24, 2013

Can You Compress a Temp Table?

Hello Dear Reader! We are finishing up the final day of the Performance Tuning Workshop here in Atlanta, and I got an interesting question on compression from Tim Radney (@tradney | Blog).

The question: can you compress a temp table? Just a quick blog to get the answer out there while Gareth Swanepoel (@GarethSwan | Blog) teaches the class about Extended Events.

My guess was yes. Temp tables can have statistics and clustered and nonclustered indexes, and although they only exist in the session in which they are created, I would bet they could be compressed. Whether you would actually *want* to compress them is a different discussion, but let's prove this out.

DEMO

Here’s a quick demo to show you can do this. First up we will create our temp table, specifying DATA_COMPRESSION = ROW.

This will create our temp table #myTable1, and we will then insert 5,000 rows of default values.

-- Drop the temp table if it already exists
if exists(select name from tempdb.sys.tables where name like '#myTable1%')
begin
     drop table #myTable1
end
go
-- Create the temp table with row compression enabled
create table #myTable1(
              myid int identity(1,1) primary key clustered
              ,mychar1 char(500) default 'a'
              ,mychar2 char(3000) default 'b'
              ) with (data_compression=row)
go
-- Insert 5,000 rows of default values, one at a time
declare @i int
set @i=0
while(@i<5000)
begin
     set @i=@i+1
     insert into #myTable1
     default values
end

Now let’s use DBCC IND to view the pages associated with our table, and DBCC PAGE to validate that our data is compressed.

dbcc ind(tempdb, '#myTable1', 1)
go

-- Trace flag 3604 sends DBCC PAGE output to the client
dbcc traceon(3604)
go
-- Page 376 came from the DBCC IND output above; your page ID will differ
dbcc page('tempdb', 1, 376, 3)
go



Looking at the output of DBCC PAGE, I can see that the CD array for my compressed data is present near the header. Row compression is indeed on.
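If you'd rather not dig through DBCC PAGE output, a quicker sanity check (my addition, not from the original demo) is to query tempdb's metadata directly, since sys.partitions reports the compression setting per partition:

```sql
-- Check the compression setting from tempdb's catalog views.
-- Run this from the same session that created #myTable1.
select o.name, p.index_id, p.data_compression_desc
from tempdb.sys.partitions p
join tempdb.sys.objects o on o.object_id = p.object_id
where o.name like '#myTable1%';
-- data_compression_desc should report ROW at this point
```

This requires no trace flags and works the same way for permanent tables in a user database.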

Now let’s rebuild the table with page compression, using sp_spaceused before and after the rebuild to measure the size of the table.
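The rebuild appeared as a screenshot in the original post; a sketch of the equivalent T-SQL would look something like this:

```sql
-- Measure the table before the rebuild (tempdb context needed for a temp table)
exec tempdb..sp_spaceused '#myTable1'
go
-- Rebuild the table, switching from row to page compression
alter table #myTable1 rebuild with (data_compression = page)
go
-- Measure again to see the space savings from page compression
exec tempdb..sp_spaceused '#myTable1'
go
```

Comparing the two sp_spaceused results shows how much space page compression reclaims over row compression for this table.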


And it is now page compressed. Thanks for the question, Tim! And as always, Dear Reader, thank you for stopping by.

Thanks,

Brad




6 comments:

  1. btw, instead of that loop try:
    insert into #myTable1 default values;
    go 5000;

    ReplyDelete
    Replies
    1. That still has the problem of being RBAR because it's still going to do 5,000 individual inserts. It doesn't matter so much with a tiny test table like this but it will matter when you're doing repeated tests where you need to rebuild the test table over and over with a million or two rows.
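For repeated large-scale tests like the commenter describes, a set-based alternative (my sketch, not from the original comments) avoids the row-by-row inserts entirely:

```sql
-- Insert 5,000 rows in a single set-based statement.
-- master.sys.all_columns is used purely as a convenient row source;
-- the cross join guarantees enough rows for much larger tests too.
insert into #myTable1 (mychar1, mychar2)
select top (5000) 'a', 'b'
from master.sys.all_columns a
cross join master.sys.all_columns b;
```

Bump the TOP value to a million or two and this scales far better than the WHILE loop or GO-repeated batches.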

      Delete
  2. Compression of a Temp Table seems like it could be useful. Have you done any testing to see how much of a difference it would make performance wise?

    ReplyDelete
    Replies
    1. Hey Jeff! I have not done a lot of testing on this, but I will.

      Delete
I guess the same won't apply to a table variable. I'll have to find some time to try it out.

    ReplyDelete
I have a question about using an already compressed table in a user DB. If I use temp tables, does the compression from the already compressed source table carry over to the temp table I create, or does it not carry over, meaning I would have to create the temp table as compressed myself to take advantage of compression?

    ReplyDelete