Monday, August 20, 2012

Please VOTE for Me: PASS Summit 2012 Lightning Talks
Hello Dear Reader, the PASS Summit is approaching, and with that the program committee sent out a request for Lightning Talk submissions.  Last year I was able to give a lightning talk, 24: A 5 Minute Horror Story, about one of my worst on-call shifts ever, one that unfolded over 24 straight hours.  It involved the wrong RAID drive being pulled, having to rebuild transaction logs, Master going nuclear and having to rebuild the system tables and restore from backup, and finishing off with a little DBA prayer called “Please God let DBCC CHECKDB run clean so I can get to sleep….”  It was fun, and I tried to do it with as much humor as I could given the situation.

This year the Lightning Talks have been extended to 10 minutes, and I’ve submitted another that is now up for community vote. 

“So Balls,” you say, “What are you presenting on?”

A very important topic, Dear Reader, and it is all about how to be a better DBA.  We will all go to the Summit and spend hundreds, and in most cases thousands, of dollars to attend.  What about the time in between?  

Once a year we have the largest get-together of SQL Server professionals in the world.  After you go home, how do you keep up with it?  Knowing where to look is the first step.  There are many, many organizations that work tirelessly to keep the spirit of the Summit alive until we meet again, and you can tap into them without breaking the budget.  My topic is Get Top Notch Training for Free or Next to Nothing.


Top Notch, You've said it all.

  The greatest thing about Microsoft SQL Server is the SQL Server Community.  I would use it as a major selling point if I were a Microsoft Rep.  

Save thousands of dollars?  Yep.  Get features included with Enterprise Edition that cost hundreds of thousands of dollars with other vendors?  Check.  Have a community of millions of users who bust their butts regularly to give free training, documentation, and assistance, and who put on over 100 free training clinics in 2012 alone?  Check.   

I admit I am biased here.  Most technologies have a gathering and professionals that go out of their way to help others.  You would be hard pressed to find one as grand in scale and scope as the SQL Server Community.  Without further ado here’s my Abstract:

The greatest thing about SQL Server is its Community. This is always spotlighted at the Summit, but throughout the year there is Free Training offered by Top SQL Minds, MVP's, and MCM's alike. Learn about Webinars, User Group Meetings, and SQL Saturdays and how to keep your SQL Learning going all year long.

There’s a lot to be gained by going to the big conferences, but if you’re in a shop where the budget isn’t there, you don’t have to miss out.  My company Pragmatic Works has free training on the T’s (Tuesdays and Thursdays), SQLskills has their Insider videos, the Brent Ozar PLF has weekly webinars, Idera has the ACE program, and you name it (and sorry to anyone I left out), we've got it!

Not to mention the PASS Virtual Chapters: DBA, DBA Fundamentals, Performance, PowerShell, Big Data, Business Intelligence, and more!  Want a preview of the great content you will get at the PASS Summit 2012?  Look no further than the 24 Hours of PASS, once again completely free.

Want to be able to reach people in person and network?  Maybe you should attend a SQL Saturday.  Check out SQL Saturday 151, coming up in Orlando on Saturday, September 29th, where the same people and many of the same presentations given around the globe are brought to the local community.  

Want community more than once or twice a year?  Check out your local SQL Server User Group, where you meet the DBAs that make up your local community, once again absolutely free.

Many conferences will give you a chance to get training that may not make it out to any of these channels (the Microsoft PSS team, the CAT team, and other Microsoft gurus), and I would argue that those are still very important and valuable reasons to attend.

However, knowing where to look when those conferences are gone, just a memory and notes on a page, is priceless.  So this will be my presentation.  It will be chock-full of links to resources, how to find information, and what sites to go through (I still haven't mentioned forums!).  Better yet, this being a lightning talk, we'll have some people in the room that may be able to contribute more as well!



Thursday, August 16, 2012

SQL Saturday 151 BI Pre-con: Stacia Misner

Hello Dear Reader, SQL Saturday 151 Orlando is picking up steam.  The Pre-Cons have been named and they are fantastic.  SQL Saturday Orlando is always a big event, the schedule has been posted, and the planning is well underway.  First stop: the BI Pre-Con featuring Stacia Misner (@StaciaMisner | Blog), taking place at the beautiful Lake Mary Hyatt Place hotel.

“So Balls,” you say, “Who is Stacia and why should I attend?”

Great question, Dear Reader.  I work with a lot of really great people in the BI world, even though that area is not my forte, and everyone agrees Stacia is one of the TOP BI experts in the world.  She is one of the instructors for the Microsoft SSAS Maestro Program.  Stacia has authored over 12 different books on the subject of SQL Server.  

Her most recent is Introducing SQL Server 2012, available as a free PDF download; click here to get it!  I first met Stacia at SQL Saturday #62 back in Tampa in 2011.  She had also written the book Introducing SQL Server 2008 R2, and I had some questions about Master Data Services.  I was trying to figure out if Master Data Services in the 2008 R2 release was right for a project I was working on.  I asked her if I could get her advice, and she was polite, candid, and very helpful.  Now it is your chance to get to meet Stacia.

A 360-Degree View of SQL Server 2012 Business Intelligence

One of the greatest things about SQL Saturdays is that the top SQL speakers and consultants will offer their training services for an unbelievable deal.  Stacia has taught BI Immersion Courses, Pre-Cons, and spoken at seminars that cost thousands of dollars to attend.  You can attend her SQL Saturday session, and get one-on-one time with this expert, for just $99.  So now let’s look at the plan for the day.

In this session, we’ll take a holistic look at the BI features in the latest version of SQL Server by reviewing the architecture requirements, exploring the implications for existing BI applications, and introducing new capabilities that support the transformation of data into business insight. We'll start with data integration and management by reviewing the overhaul that Integration Services received in this release, how to formalize the data cleansing process by using the new Data Quality Services, and how master data management is improved with the updates to Master Data Services. Then we'll discuss the improvements to analytical capabilities by exploring updates to Analysis Services, including the new Tabular Model, and enhancements available in PowerPivot. Last, we'll cover the new presentation layer options available in Reporting Services and the new release of Power View. Of course, you’ll see demonstrations of the new features, but the primary purpose of this session is to give you a chance to ask lots of questions and to get a look “under the hood” to better understand what you’ll need to do to get these BI features up and running properly. You’ll also learn how to prepare your data environment to leverage these features and how best to manage the user experience.

If you are in a BI shop, or are looking to expand your career and get more in-depth in the BI field this is a great training opportunity.  Click here to sign up for the pre-con.  Click here to register for SQL Saturday 151 if you haven’t already.  I hope to see you there!  As always Thanks for stopping by!



Tuesday, August 14, 2012

Database Certificates and the X.509 Standard

Hello Dear Reader, I came across an interesting discovery about a year ago and realized I’d never written about it.  I’ve done a lot of work with encryption, mainly Transparent Data Encryption.  I’ve got a presentation on the subject that I’ve given at SQL Saturdays, Dev Connection, and SQL Rally.  I take a database, back it up, and drop the unencrypted backup in a hex editor.  This allows me to show what the contents look like before and after encryption.

I encrypt the database and take a backup and I put that in the hex editor as well.  One day in front of OPASS, the North Orlando SQL Server User Group, I dragged the certificate and private key backups in the hex editor as well and I noticed something disturbing.  Part of the encrypted backup of the certificate was in plain text!

“So Balls”, you say, “What does the certificate have to do with the X 5 O….whatever.”

Well put Dear Reader, and the short answer is again everything.


The X.509 security standards are the International Telecommunication Union encryption guidelines for Public Key Infrastructure and Privilege Management Infrastructure.  In short, these are the smart guys that make up the encryption standards we use in just about everything.   It just so happens that they have some pull over SQL Server Database Certificates as well.

So I was in front of OPASS giving a presentation on Transparent Data Encryption when I made an interesting discovery.  I made all my demos and passwords easy so I wouldn’t have to worry which was which; the password was ‘Brad1’.  Imagine my surprise when I pulled in the encrypted backup of the Database Certificate, protected with a private key and password (also ‘Brad1’), and found lying there in plain text: ‘Brad1’.

It was my own fault for making a demo that used the same value over and over.  I didn’t know which password had leaked.  I went home, entered a different value for each place I had ‘Brad1’, backed up the certificate, and pulled it into a hex editor.  It was the Subject of the Certificate.


Why would the subject be in plain text?  Good question Dear Reader.  I hopped over to MSDN to look at the documentation on database certificates, click here to view.  I found this information:

                SUBJECT ='certificate_subject_name'
The term subject refers to a field in the metadata of the certificate as defined in the X.509 standard. The subject can be up to 128 characters long. Subjects that exceed 128 characters will be truncated when they are stored in the catalog, but the binary large object (BLOB) that contains the certificate will retain the full subject name.

Nothing about why it was in plain text, but it pointed to the X.509 Security Standards.  Click here to read the X.509 Security Standards if you have trouble sleeping at night. 

The Subject is mentioned quite a bit.  The way it works out is that the subject is used as part of a trust anchor.  Think of each certificate like a fingerprint.  Each is supposed to be encrypted and different.  Occasionally you have twins, and the certificates are so similar that you need a way to tell them apart.  In that situation the Subject is used to differentiate them.

So while you would think the subject is… well… the subject of what you will use the certificate for, it is not.  Since it is stored in plain text, I would put a random, throwaway string in the subject anytime I use a database certificate.  But let’s do a quick demo to show.


First we will create a Master Key and a Database Certificate.

Create Master Key Encryption By Password='MasterKeyPass1'
Create Certificate DatabaseCertificate With Subject='Dont Put Anything Important in the subject'

Now let’s back them up to disk.  We’ll encrypt the certificate using a private key, and a strong password to encrypt the private key as well.

BACKUP CERTIFICATE DatabaseCertificate TO FILE ='C:\Encrypt\DatabaseCertificate.cer'
WITH PRIVATE KEY ( FILE ='C:\Encrypt\bradprivkey.key', ENCRYPTION BY PASSWORD ='$uper$ecretP@ssword')

You should have 2 files from the backup the Database Certificate and the Private Key.  
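You don't even need a hex editor to see where the subject lands; a minimal sketch against the sys.certificates catalog view (run in the database where the certificate was created) shows it sitting in clear text in the metadata:

```
-- The subject is plainly visible in the catalog metadata
SELECT name, subject, start_date, expiry_date
FROM sys.certificates
WHERE name = 'DatabaseCertificate'
```

The backup file simply carries that same metadata field along, unencrypted.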

Now let’s open up the Certificate in our handy hex editor.

And there it is!  Our subject, sitting in plain text.  Not a huge security leak, but as a best practice I like to tell people not to put anything important in the Subject.

Thanks for stopping by Dear Reader!



Monday, August 13, 2012

Database Corruption, Transparent Data Encryption, and Trace Flag 5004

This one comes straight from the email bag.  A friend recently had a problem: they were placing TDE on a database, and the encryption scan had stopped at state 2, percent_complete 0.  I'm bouncing around the Charlotte, NC airport facing some plane delays, and I thought, what better time than to play around with a little database corruption. 

“So Balls”, you say, “What does TDE stuck in an encryption scan have to do with corruption?”

Great question, Dear Reader!  The default Page_Verify setting in SQL Server is Checksum.  This means when a page is read into memory and written back to disk, a checksum is calculated based off the page's contents.  When it is written back to disk, that checksum is stored with the page.  When the page is read again, the checksum is used as validation.  If the checksum fails, an error is raised reporting the page as a Suspect Page.

Think of this like going through the TSA Checkpoint, you’ve got your ticket and your identification.  If your ticket says ‘Serenity’, but your ID says ‘Zachary’ you will probably get flagged by the system as Suspect.  In both cases that’s where the probing begins. 
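You can confirm what your own databases are using; a minimal sketch against the standard catalog view:

```
-- CHECKSUM is the default PAGE_VERIFY option on new databases,
-- but databases upgraded from older versions may still use TORN_PAGE_DETECTION or NONE
SELECT name, page_verify_option_desc
FROM sys.databases

-- Set it explicitly if a database isn't using CHECKSUM
ALTER DATABASE CorruptAdventure SET PAGE_VERIFY CHECKSUM
```

Note that changing the setting only affects pages as they are written; existing pages don't get a checksum until the next time they are modified.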


For this example I’m going to use a database that I’ve corrupted called CorruptAdventure, taken from a corrupted version of AdventureWorksDW2008R2.  Horrible name for a database; it was just asking for corruption.  We’ll start out assuming everything is fine.  The powers that be want TDE, Transparent Data Encryption, enabled on the database, and we will do that.  First we’ll create our Master Key and a Database Certificate to use in the encryption.

-- Create Master Key and Certificate
USE master
Create Master Key Encryption By Password='MasterKeyPass1'
Create Certificate DatabaseCertificate With Subject='Dont Put Anything Important in the subject'

Now we’ll point to CorruptAdventure, create a Database Encryption Key, and set encryption to on.  Transparent Data Encryption will read each page into memory.  If a page doesn’t have a checksum, one will get written.  Our page has a checksum, but its contents have been corrupted.  When SQL calculates a checksum to validate the current one, the page will get logged to the MSDB.dbo.Suspect_Pages table.

use CorruptAdventure
create database encryption key
with algorithm = aes_256
encryption by server certificate DatabaseCertificate
Alter Database CorruptAdventure
Set Encryption on

It looks like it is encrypting!

Whoa! We hit our error. 

The encryption scan just stalled out.  Why would this happen?  A page checksum occurs on every page the TDE scan reads, and our corrupt page failed its validation.  Let’s query our Suspect_Pages table.  Just like I thought, we’ve got our database ID and our page ID.  The event_type column is equal to 2; this means our page was flagged suspect during a checksum operation. 

select * from msdb.dbo.suspect_pages
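If you don't have the event_type codes memorized, a quick decode saves a trip to the documentation.  This is a sketch using the values as I understand them from Books Online:

```
-- Decode the suspect_pages event_type values
SELECT database_id, file_id, page_id, error_count, last_update_date,
       CASE event_type
            WHEN 1 THEN '823 or 824 error'
            WHEN 2 THEN 'Bad checksum'
            WHEN 3 THEN 'Torn page'
            WHEN 4 THEN 'Restored'
            WHEN 5 THEN 'Repaired'
            WHEN 7 THEN 'Deallocated by DBCC'
       END AS event_description
FROM msdb.dbo.suspect_pages
```

Event types 4, 5, and 7 mark pages that were fixed; 1 through 3 are the ones that should worry you.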

Now let’s run DBCC CheckDB and verify if we really have something wrong with our database.


Msg 8928, Level 16, State 1, Line 1
Object ID 325576198, index ID 5, partition ID 72057594043498496, alloc unit ID 72057594044940288 (type In-row data): Page (1:3874) could not be processed.  See other errors for details.
Msg 8939, Level 16, State 98, Line 1
Table error: Object ID 325576198, index ID 5, partition ID 72057594043498496, alloc unit ID 72057594044940288 (type In-row data), page (1:3874). Test (IS_OFF (BUF_IOERR, pBUF->bstat)) failed. Values are 12716041 and -4.
Msg 8976, Level 16, State 1, Line 1
Table error: Object ID 325576198, index ID 5, partition ID 72057594043498496, alloc unit ID 72057594044940288 (type In-row data). Page (1:3874) was not seen in the scan although its parent (1:3888) and previous (1:3873) refer to it. Check any previous errors.
Msg 8978, Level 16, State 1, Line 1
Table error: Object ID 325576198, index ID 5, partition ID 72057594043498496, alloc unit ID 72057594044940288 (type In-row data). Page (1:3875) is missing a reference from previous page (1:3874). Possible chain linkage problem.
CHECKDB found 0 allocation errors and 4 consistency errors in table 'FactInternetSales' (object ID 325576198).
CHECKDB found 0 allocation errors and 4 consistency errors in database 'CorruptAdventure'.
repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB (CorruptAdventure).

Just as I suspected: corruption.   We got the database ID and page number from Suspect_Pages, and DBCC CHECKDB just verified that the page is indeed corrupt.  Now we can find exactly what type of data is corrupted, which will determine our strategy for handling it.  We have the Object ID and Index ID from the DBCC CHECKDB scan.

We can do a query against sys.indexes joined to sys.objects using the IndexID, 5, and ObjectID, 325576198, provided.  We will get the table name, index name, and index type.

select o.name as TableName
     ,i.name as IndexName
     ,i.type_desc as IndexType
from sys.indexes i
     left join sys.objects o
     on i.object_id=o.object_id
where i.object_id=325576198
     and i.index_id=5

Our corruption is on a non-clustered index.  If you ever get corruption this is one of the easiest types to fix.  We drop our non-clustered index and re-create it, and it should fix everything.

USE CorruptAdventure
DROP INDEX IX_FactInternetSales_OrderDateKey ON dbo.FactInternetSales
CREATE NONCLUSTERED INDEX IX_FactInternetSales_OrderDateKey ON dbo.FactInternetSales
(OrderDateKey ASC)

Now let’s Run DBCC CHECKDB to get a clean bill of health.


Excellent.  Looking at our TDE status, though, it still hasn’t moved. 

The TDE encryption scan should have paused when the Checksum error occurred.  In case it didn’t you can manually pause the encryption scan and reset it with Trace Flag 5004.  Turning Trace Flag 5004 on will stop the encryption scan right where it is.  You then need to turn Trace Flag 5004 off so you can re-issue the encryption command and watch it commence.  You might not need to use Trace Flag 5004, but I like to play this one on the safe side.

-- Pause the encryption scan
DBCC TRACEON (5004)
-- Turn the trace flag back off so the scan can resume
DBCC TRACEOFF (5004)
-- Re-issue the encryption command
ALTER DATABASE CorruptAdventure
SET ENCRYPTION ON

Let’s check our encryption status.
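A sketch of the status check, using the standard DMV (encryption_state 2 means the encryption scan is in progress, 3 means the database is encrypted):

```
-- Watch the TDE scan progress
SELECT DB_NAME(database_id) AS DatabaseName
     ,encryption_state
     ,percent_complete
     ,key_algorithm
     ,key_length
FROM sys.dm_database_encryption_keys
```

You can re-run this as the scan works through the database and watch percent_complete climb.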

We are progressing again, and it looks like we’ve now completed!  Excellent: not only did we get our database encrypted, but we were able to fix corruption that we were not previously aware of.  One last peek at our TDE scan and we see it is complete; our encryption_state is now 3, no longer stuck at 2.

Well my plane just arrived, so that’s all for now Dear Reader, as always Thanks for stopping by.



Wednesday, August 8, 2012

How to Data Compress Varchar(MAX)

I talk a lot about compression.  I’ve blogged a pretty decent amount on it as well.  One of the things that often confuses people is what can and cannot be compressed.  There is a list of data types that can be Row Compressed, and that list is different for each SQL version.  Page compression, on the other hand, works at the binary level; it is data-type agnostic.

The big determining factor is what type of Allocation Unit your data is stored on.

“Balls,” you say “What’s an Allocation Unit?”

An Allocation Unit is the structure behind the structure.  Think of real estate for a second.  Buildings and property are zoned in a city or a town.  One section is for businesses, another is zoned for residential, one may be zoned for the government.  In SQL Server we have 3 different zones: IN_ROW_DATA, ROW_OVERFLOW_DATA, and LOB_DATA. 

Instead of being zoned just by type, your size matters just as much.  If you are a regular, everyday integer or character field, you live in IN_ROW_DATA.  You are LOB_DATA if you are a VARBINARY(MAX) that contains a 500 MB picture file.  ROW_OVERFLOW_DATA holds variable-length fields that start off on IN_ROW_DATA pages; if that data grows so large that it cannot fit on an 8 KB IN_ROW_DATA page, it gets popped off the IN_ROW_DATA page and lands on the ROW_OVERFLOW_DATA page.

The data types in SQL that have a (MAX) designation, XML, or certain CLR types start off on IN_ROW_DATA pages.  They get moved off if the size grows.
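If you'd rather have (MAX) data pushed to LOB pages from the start instead of waiting for it to grow, there is a table option for that; a minimal sketch (the table name here is just an example):

```
-- 'large value types out of row' forces (MAX) and XML data
-- off the IN_ROW_DATA pages immediately
EXEC sp_tableoption 'dbo.vmaxTest', 'large value types out of row', 1
```

Keep in mind existing rows aren't moved until the values are next updated.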


So how in the wide, wide world of sports does this apply to Data Compression?  If your data is on an IN_ROW_DATA page, it can be compressed.  Row compression still only applies to the data types that are listed per version; see row compression here at MSDN.

Page compression only requires matching binary patterns; as long as we are on IN_ROW_DATA pages, we are good to go.  You can run this script against your database to get the Allocation Unit makeup of your tables and indexes.

SELECT OBJECT_NAME(sp.object_id) AS [ObjectName]
     ,si.name AS IndexName
     ,sps.in_row_data_page_count AS In_Row
     ,sps.row_overflow_used_page_count AS Row_Over_Flow
     ,sps.lob_reserved_page_count AS LOB_Data
FROM sys.dm_db_partition_stats sps
     JOIN sys.partitions sp
           ON sps.partition_id=sp.partition_id
     JOIN sys.indexes si
           ON sp.index_id=si.index_id AND sp.object_id = si.object_id
WHERE OBJECTPROPERTY(sp.object_id,'IsUserTable') =1
ORDER BY sps.in_row_data_page_count DESC

The higher the IN_ROW_DATA page count the more likely you have a candidate for compression. 
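Before rebuilding anything, you can also ask SQL Server what it expects to save with the built-in estimator; this sketch targets a hypothetical dbo.vmaxTest, so substitute your own schema and table name:

```
-- Estimate page compression savings for a candidate table
EXEC sp_estimate_data_compression_savings
     @schema_name = 'dbo'
    ,@object_name = 'vmaxTest'
    ,@index_id = NULL
    ,@partition_number = NULL
    ,@data_compression = 'PAGE'
```

The output compares current size with estimated compressed size per index and partition, which makes it easy to spot the tables worth rebuilding.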


We’ve laid the groundwork; now on to the main event.  First we’ll create our database and our table and insert some data.  I’ve got two Varchar(Max) fields, and we’ll put 2012 characters in each. 
-- Select our demo database to use
use master
if exists(select name from sys.databases where name='demoInternals')
begin
     alter database demoInternals set single_user with rollback immediate
     drop database demoInternals
end
Create Database demoInternals
GO
USE demoInternals
GO
-- Create our table
IF EXISTS(SELECT name FROM sys.tables WHERE name='vmaxTest')
     DROP TABLE dbo.vmaxTest
CREATE TABLE vmaxTest(myid int identity(1,1)
     ,mydata varchar(max) default 'a'
     ,mydata2 varchar(max) default 'b'
     ,CONSTRAINT pk_vmaxtest1 PRIMARY KEY CLUSTERED (myid))
-- Insert 5000 rows
DECLARE @i INT
SET @i=0
WHILE (@i<5000)
begin
     INSERT INTO vmaxTest(mydata, mydata2)
     VALUES(replicate('a',2012)+cast(@i AS VARCHAR(5)), replicate('b', 2012)+cast(@i AS VARCHAR(5)))
     SET @i=@i+1
end

If you use our script from earlier then you can see we have 4950 IN_ROW_DATA Pages.

Now let’s update one of our Varchar(max) fields to 8000 characters so that we push it off of IN_ROW_DATA and over to LOB_DATA Pages.   Run our script again to get our counts.
-- Now we'll update just the b values
-- to force them off the IN_ROW_DATA pages
UPDATE dbo.vmaxTest
set mydata2=replicate('b',8000)

We certainly have some fragmentation, but we’ve added 5009 LOB_DATA pages to the mix.  Now let’s apply Page Compression and use our script again to see the results.
-- Rebuild our table with Page Compression
ALTER TABLE dbo.vmaxTest
REBUILD WITH (DATA_COMPRESSION=PAGE)

As you can see, the IN_ROW_DATA compressed; the LOB_DATA didn’t.  Another way that knowing thy data can help you understand what you can and should compress.



Tuesday, August 7, 2012

Book Review: On Writing A Memoir of the Craft by Stephen King

I just finished this book.  Literally: sitting on an airplane en route from Orlando to Charlotte, NC, I got up out of my seat and fetched my computer just so I could start writing.  It took me two weeks to finish.   I was busy, but I enjoyed every moment I could sneak reading On Writing into the day.

Are you a writer?  Technical, fiction, non-fiction, blogger, columnist, or novelist?  If you write read it. 

 Not a Stephen King fan? Read it anyway.  He is a once in a lifetime author.  He has been very successful in his line of work.  He knows something about the craft that you may not.  He knew plenty that I did not. 

“So Balls”, you say, “What’s so great about this book?”

In a word Dear Reader? Everything.


I’m not a fan of horror movies.  I’m a bit of a weenie in that regard.  Ask my wife; she loves them.  I’ll watch them, I cringe, I jump, and sometimes I’ll make a sound. 

I don’t like the gore of the movies, but give me a book that has the same elements and I’ll lap it up.  My tastes tend to shift towards the Supernatural, Sci-Fi, Horror, Mysteries, and tales of Knights and times such as that.  I’ve occasionally read biographies, but other than Technical Manuals, most of what I read is Fiction.

I’m not sure what the first Stephen King book I read was.  The first time I remember reading something of his was a collection of short stories called Different Seasons.  It had stories that ran the gamut: Rita Hayworth and the Shawshank Redemption (read it long before it was a movie), Apt Pupil (ditto), and The Body (aka the movie Stand by Me, ditto again); later came The Langoliers (TV movie, but ditto times four).  I always heard of Stephen King the “horror” writer; he’s been writing since before I was born and has been famous for just about as long. 

I also found something interesting; my favorite part of the book was the Introduction.  I liked reading the thoughts of the man himself.  He seemed funny, smart, the kind of guy you would want to hang out with.  He took me on a trip and described things in such a way that I understood them.  I liked hanging out with Stephen King.

Fast forward some years, and I picked up the novelized version of the screenplay for Storm of the Century.  Once again I got to read the comments, the thoughts that made up the man, and learn a little more about his process.  I liked the TV version; admittedly I didn’t watch it until after I’d read the screenplay, but the dialogue was better in my head.  The special effects budget had no limit.  What stuck with me most was King’s description of how he had envisioned the character of Andre Linoge. 

Steve (hope you don’t mind that I call him Steve; he’s told me so many stories over the years that calling him “Stephen” feels too formal) had a dream about a man sitting on the bunk of a bed in a cell block.  You could draw parallels to The Green Mile, but Andre was different from John Coffey.  He was smaller and looked quite a bit different, for starters.  Instead of being gentle he was a menacing force; everyone was in greater danger when he was close by, even if he was in a prison cell.   The cage held a hungry tiger, not a passive giant.  The dream scared the bejesus out of him; he woke up and had to write.  Had to write about the character before it left his mind.

Think of the vivid dreams that you get.  You wake up and have to tell someone.  Good, bad, scary, crazy, a dream that leaves a mark.  How cool would it be to make a story out of the dream? 

We recently took the kids to the library to get our first round of library cards.  While we were there I was looking for a book.  I looked and eventually found Just After Sunset: Stories.  In the introduction Steve mentioned On Writing and my interest was piqued.  The next trip to the library I picked it up, I’m glad that I did.

We start out with Stevie King growing up as a kid.  He goes out of his way to show us that he wasn’t born into writing.  It was a skill he developed.  It takes practice; you have to work out the muscles that you use writing.  You will fail.  Failure is part of trying.  Don’t let that discourage you from trying; Stevie King had a stack of rejection letters that he kept above his desk.  Success was not overnight or easy.  We travel through his life in great detail, to see success and failure. 

His early life is humbling.  He wrote his first novels in a trailer on a small desk by the washing machine.  There is no shortcut to being a successful writer, but we see how the man was crafted.  We see his views on literature.   We get his reflections on his life.

You need to have a toolbox.  In it you need to place the tools that you will use.  As a doctor you may wield a scalpel, in IT you use a computer, and as a writer you need your tools as well.  Not just a pen and paper, or a keyboard.  Tools of vocabulary, grammar, a greater understanding of nouns, verbs (passive vs. active), lessons on adverbs and pronouns, elements of style, naturally evolving stories vs. outlined stories, and instructions and examples of how to use them.  Stephen King makes it interesting, engaging, and, gasp, educational.

Steve explains his thought process on connecting with the reader.  The bond between the writer and the reader is so great and so close, that it is psychic on the level of telepathy.  Don’t believe me?  I’ll give you a quick example. 

                The old man walked through the rain.  He pulled his jacket around him close.  He had made this walk many times, up the road from the store back to his apartment.  The weather raged against him.   One look at his hands and you could see this was far from his first storm.  Life had not been easy; he never asked it to be.  He dipped his head and passed an inside-out umbrella that had lost its way, discarded and forgotten.  Not unlike the old man himself.

The umbrella: did you see it?  Was it maroon, light yellow, blue, maybe green?  You know what color you saw; I didn’t need to tell you.  How about the old man, what color was his hair?  I never told you, and yet you saw it.  Or was he bald?  Somehow you just knew.   You, Dear Reader, have a great imagination.  You can paint your own canvas, and I should let you.  That makes this story not mine, but our story.

You knew, Dear Reader, magically, as if by telepathy, exactly what I was thinking.  Straight-down rain, sideways rain.  You knew, and you got it just right.


Reading this book will make you look at the way that you write and examine what you are doing.  I could do a chapter-by-chapter review, but it wouldn’t do it justice.  It is the instruction of a teacher, and a damn fine read.  I borrowed this book from the library.  I will buy a copy.  Word count 1429, final -10% = 1286 (you’ll get it when you read it).