Hi All,
I am testing out page compression on a database, but the problem I am having at the moment is that once the database is compressed, performance is 15% slower. The testing strategy in use at the moment is to make a batch of 50,000 calls to the database. The stored procedure selects from some tables, loads the data into a temp table, performs a couple of joins, and finally runs a SELECT to return the rows of data. I understand that reading from and writing to a temp table would cause data to be compressed and decompressed, which in turn can degrade performance.
It is a bit difficult to isolate a single call to test the procedure with; however, across the whole batch the calls equate to a 15% degradation in performance. I would like to know if there are any testing strategies I can use to work out where I am losing time as a result of the page compression. I look forward to hearing from you.
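For reference, this is the kind of per-procedure measurement I was thinking of, assuming SQL Server (the procedure name dbo.usp_LoadData is just a placeholder for the actual procedure):

```sql
-- Measure CPU time, elapsed time, and logical reads for a single call.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

EXEC dbo.usp_LoadData;  -- placeholder for the real procedure

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;

-- Aggregate averages across the whole 50,000-call batch from the
-- procedure-stats DMV (times are reported in microseconds).
SELECT OBJECT_NAME(object_id)                 AS proc_name,
       execution_count,
       total_worker_time  / execution_count   AS avg_cpu_us,
       total_elapsed_time / execution_count   AS avg_elapsed_us,
       total_logical_reads / execution_count  AS avg_logical_reads
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID();
```

The idea would be to capture these numbers once against the uncompressed database and once against the compressed copy: if avg_cpu_us rises while avg_logical_reads falls, that would point at decompression CPU overhead rather than I/O as the source of the 15%. Would something along these lines be a sensible approach, or is there a better way?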
Thanks.