SQL VARCHAR vs NVARCHAR in CAST performance

I have a query that compares data in two tables:

SELECT DISTINCT MT.Column1, MT.Column2, MT.Column5, MT.Column7, MT.Column9
FROM tblMyTable MT
WHERE EntryDate >= @StartDate
  AND EntryDate <= @EndDate
  AND NOT EXISTS (
      SELECT ID
      FROM tblOtherTable
      WHERE SomeString LIKE 'X' + CAST(MT.Column1 AS VARCHAR(16))
                          + 'Y' + CAST(MT.Column3 AS VARCHAR(16))
                          + 'Z' + CAST(MT.Column4 AS VARCHAR(16)) + '%'
  )

It works fine. But when I change the casts to CAST(var AS NVARCHAR), the query runs for more than 10 minutes with no sign of finishing, while with CAST(var AS VARCHAR) as above it completes in 2-3 seconds.
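For comparison, this is the slow variant of the subquery; only the target type of the casts changes, the rest of the query stays the same:

-- Slow variant: the only difference is casting to NVARCHAR instead of VARCHAR
AND NOT EXISTS (
    SELECT ID
    FROM tblOtherTable
    WHERE SomeString LIKE 'X' + CAST(MT.Column1 AS NVARCHAR(16))
                        + 'Y' + CAST(MT.Column3 AS NVARCHAR(16))
                        + 'Z' + CAST(MT.Column4 AS NVARCHAR(16)) + '%'
)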

CASTed columns are defined as:

  • Column1: int, not null
  • Column3: varchar(50), not null
  • Column4: varchar(9), not null

but in fact all contain ONLY digits, 9-15 digits in length

What could be causing this performance loss?

UPDATE:

The execution plan shows the following: [execution plan screenshot]

1 answer

The nvarchar data type has higher data type precedence. So with a CAST to nvarchar, the whole LIKE pattern becomes nvarchar and the indexed varchar column (SomeString) must be implicitly converted to nvarchar before the comparison; as a result the index can no longer be used for an efficient seek.

With a CAST to varchar, the pattern has the same type as the indexed column, so no conversion of the column is required and the index can be used for an efficient seek, which is what the execution plan shows.
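A minimal way to see the effect in isolation (the literal pattern below is made up, and the exact plan shape depends on collation and SQL Server version):

-- Sargable: the pattern is varchar, same type as the indexed SomeString column,
-- so an index seek on SomeString is possible
SELECT ID FROM tblOtherTable WHERE SomeString LIKE 'X123Y456Z789%'

-- Not sargable: the N'...' pattern is nvarchar, which has higher precedence,
-- so SomeString gets a CONVERT_IMPLICIT to nvarchar and the engine often
-- falls back to scanning the index instead of seeking
SELECT ID FROM tblOtherTable WHERE SomeString LIKE N'X123Y456Z789%'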

This property is known as sargability. See http://en.wikipedia.org/wiki/Sargable .
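If the compared values ever genuinely need to be Unicode, a hedged alternative (not part of the original answer; the index name and column length here are assumptions) is to make the indexed column nvarchar as well, so the pattern no longer forces a conversion of the column:

-- Hypothetical schema change, only worth it if Unicode is really required;
-- any existing index on SomeString has to be dropped and recreated around it
DROP INDEX IX_SomeString ON tblOtherTable
ALTER TABLE tblOtherTable ALTER COLUMN SomeString NVARCHAR(200) NOT NULL  -- length/nullability assumed
CREATE INDEX IX_SomeString ON tblOtherTable (SomeString)

Otherwise, simply keeping the casts as VARCHAR, as in the question, is the right fix.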
