Whenever we talk about performance tuning in a SQL Server environment, the cause is not always bad indexes, stale statistics, or poor query structure; in some cases a query performs badly because of an incorrect data type. In this article, Satnam Singh, a Senior SQL Server DBA from Larsen and Toubro Infotech, Mumbai, India, discusses one such situation he faced a couple of days back, where a T-SQL query was performing very badly because a column involved in the query used the NTEXT data type.


One of my developers told me that the query behind an SSRS report was performing so poorly that it took a very long time to generate its output. Initially I thought there might be a communication problem, since the database server resided in Australia and the application could have been slow to reach it, but later on I found that wasn't the case. After a detailed look at the query, I noticed that a few columns referenced in the T-SQL used the NTEXT data type, and that was causing the damage. NTEXT leads to a heavy Clustered Index Scan along with a large number of LOB logical reads. Let us understand this via an example.

Let us create a dummy table named comment using the T-SQL below:
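The original script was not preserved with the article; a minimal sketch of such a table, assuming a single NTEXT column named comment_text (both column names are assumptions), could look like this:

```sql
-- Hypothetical reconstruction of the dummy table with an NTEXT column.
CREATE TABLE dbo.comment
(
    comment_id   INT IDENTITY(1, 1) NOT NULL,
    comment_text NTEXT NULL,
    CONSTRAINT PK_comment PRIMARY KEY CLUSTERED (comment_id)
);
```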

Once the table is created, insert a few dummy records into it using the T-SQL below:
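Again the original statements are not shown; a sketch, assuming the comment_text column from above and arbitrary sample values:

```sql
-- Insert a few dummy rows (the values are illustrative assumptions).
INSERT INTO dbo.comment (comment_text) VALUES (N'This is the first dummy comment.');
INSERT INTO dbo.comment (comment_text) VALUES (N'This is the second dummy comment.');
INSERT INTO dbo.comment (comment_text) VALUES (N'This is the third dummy comment.');
```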

Now let us check the STATISTICS IO counts for the comment table using the T-SQL below.
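The query itself is missing from the article; a sketch of the usual pattern, assuming the table defined above:

```sql
-- Enable per-statement I/O reporting, run the query, then disable it.
SET STATISTICS IO ON;

SELECT comment_id, comment_text
FROM dbo.comment;

SET STATISTICS IO OFF;
```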

The Statistics IO count is as follows:
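The original screenshot is not reproduced here. Based on the figures discussed in the next paragraph (1 logical read, 6 LOB logical reads), the Messages tab would show output along these lines; the remaining counters are illustrative:

```
Table 'comment'. Scan count 1, logical reads 1, physical reads 0,
read-ahead reads 0, lob logical reads 6, lob physical reads 0,
lob read-ahead reads 0.
```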

As you can see from the output above, although the number of logical reads is a mere 1, the number of LOB logical reads is 6, which is high enough to make the query perform poorly. This is expected behaviour: NTEXT never stores its data in-row with the rest of the table; instead it stores it in a separate Large Object (LOB) structure.

Now, in order to get rid of this situation, I decided to convert the NTEXT column to the NVARCHAR(MAX) data type, as shown below:
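The original screen capture is not available; the conversion can be sketched as follows (the column name comment_text is an assumption):

```sql
-- Change the column's data type from NTEXT to NVARCHAR(MAX).
ALTER TABLE dbo.comment
ALTER COLUMN comment_text NVARCHAR(MAX) NULL;
```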

Once the data type of the column was changed to NVARCHAR(MAX), I checked the STATISTICS IO counts again using the T-SQL below:
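Presumably the same check as before, sketched here under the same assumed names:

```sql
-- Re-run the same query with I/O statistics enabled.
SET STATISTICS IO ON;

SELECT comment_id, comment_text
FROM dbo.comment;

SET STATISTICS IO OFF;
```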

The Statistics IO is as shown below:
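The screenshot is again missing; as the next paragraph explains, the LOB logical read count is unchanged, so the output would look much the same:

```
Table 'comment'. Scan count 1, logical reads 1, physical reads 0,
read-ahead reads 0, lob logical reads 6, lob physical reads 0,
lob read-ahead reads 0.
```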

Oops, that is still very bad: in spite of changing the data type of the column, the LOB logical read count is still the same, i.e. 6. This is because SQL Server has kept the existing data in the Large Object structure and still uses a pointer as a reference to extract the text from this column. In order to fix this problem, I executed the T-SQL below.
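The exact statement is not preserved. A common way to force the converted values back in-row is a self-assigning UPDATE, which rewrites each value under the column's new NVARCHAR(MAX) storage rules; a sketch under the assumed column name:

```sql
-- Rewriting each value makes SQL Server store it in-row,
-- now that the column is NVARCHAR(MAX) and the values fit on the data page.
UPDATE dbo.comment
SET comment_text = comment_text;
```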

Once that was executed, I viewed the STATISTICS IO counts again using the T-SQL below.
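The same check one more time, sketched under the assumed names; this time the Messages tab should report lob logical reads 0:

```sql
-- Final check: the LOB reads should now be gone.
SET STATISTICS IO ON;

SELECT comment_id, comment_text
FROM dbo.comment;

SET STATISTICS IO OFF;
```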

Now you can see that the LOB logical read count has dropped to 0, which is excellent. This is because the data is now read directly from the table's in-row pages instead of the Large Objects structure.

This was my recommendation on how to improve performance when a table has columns with the NTEXT data type. A detailed POC needs to be done before the above strategy is implemented. Normally, one should avoid using NTEXT in future developments, because Microsoft has confirmed that it is a deprecated feature that will be removed in a future version of the product. Also, I would not recommend using NVARCHAR(MAX) or VARCHAR(MAX) as a column's data type, because it would not allow you to perform an ONLINE rebuild of the table's indexes.

Many thanks to all the readers for taking the time to read this article. This article is dedicated to my colleagues Arti Samkaria, Yadu Nair and the members of the Support team at the Offshore Development Centre, with whom I have worked on plenty of SQL Server performance tuning cases. Please do let us know if you have any suggestions for improving the article.



Satnam Singh
