Friday, August 02, 2019

SQL Server Query: Fast with Literal but Slow with Variable

When there is a drastic difference from run to run of a query, I find the cause is usually one of five issues.
  1. STATISTICS - Statistics are out of date. The database stores statistics on the range and distribution of the values in the columns of tables and indexes. The query engine uses them to develop a "plan" of attack for the query, for example, which method it will use to match keys between tables, such as a hash join or a scan through the entire set. You can call UPDATE STATISTICS on the entire database or on specific tables or indexes. Out-of-date statistics slow a query from one run to the next because the cached plan is likely no longer optimal for the newly inserted or changed data (explained more below). It may not be proper to run UPDATE STATISTICS immediately on a production database, as there will be some overhead, slowdown and lag depending on the amount of data to sample; you can choose to update statistics with a full scan or with sampling. If you look at the query plan, you can also view the statistics on the indexes in use with the command DBCC SHOW_STATISTICS (tablename, indexname). This shows you the distribution and ranges of the keys that the query plan is basing its approach on.
  2. PARAMETER SNIFFING - The cached query plan is not optimal for the particular parameters you are passing in, even though the query itself has not changed. For example, if you pass in a parameter that retrieves only 10 of 1,000,000 rows, the plan created may use a hash join; if the parameter you pass in matches 750,000 of the 1,000,000 rows, the better plan may be an index scan or table scan. In such a situation you can add OPTION (RECOMPILE) to the SQL statement, or WITH RECOMPILE to a stored procedure, to tell the engine this is a "single-use plan" and not to reuse a cached plan that likely does not apply. There is no rule for making this decision; it depends on knowing how users will run the query.
  3. INDEXES - It's possible that the query hasn't changed, but a change elsewhere such as the removal of a very useful index has slowed down the query.
  4. ROWS CHANGED - The rows you are querying change drastically from call to call. Usually statistics are updated automatically in these cases. However, if you are building dynamic SQL or calling SQL within a tight loop, you may be using an outdated query plan based on a drastically different row count or stale statistics. Again, OPTION (RECOMPILE) is useful in this case.
  5. THE LOGIC - It's the logic: your query is no longer efficient. It was fine for a small number of rows but no longer scales. This usually requires a more in-depth analysis of the query plan. For example: you can no longer do things in one bulk operation and have to chunk the work into smaller commits; a cross product that was fine on a small set now eats CPU and memory as it scales larger (the same can be true of DISTINCT); you are calling a function for every row; or your key matches cannot use an index because of CAST type conversions, NULLs, or functions. There are too many possibilities to list here.
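The remedies above can be sketched in T-SQL. This is only a sketch; the table, index, and parameter names (dbo.Orders, IX_Orders_CustomerId, @CustomerId) are hypothetical:

```sql
-- Refresh statistics on one table. FULLSCAN reads every row;
-- WITH SAMPLE n PERCENT is cheaper on large production tables.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Inspect the histogram (distribution and key ranges) that an
-- index's statistics are based on.
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_CustomerId');

-- Ask for a single-use plan when parameter sniffing has cached
-- a plan that does not fit this parameter value.
DECLARE @CustomerId INT = 42;
SELECT OrderId, Total
FROM dbo.Orders
WHERE CustomerId = @CustomerId
OPTION (RECOMPILE);
```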

Wednesday, June 07, 2017

Security Framework

I've created a new repository on GitHub.
The project is a security framework for MVC projects.
You can find it at the following address on GitHub:

I'll update this post periodically to make it a useful guide for using the framework in your projects.

Have fun. 

Monday, October 10, 2016

Consuming a WCF with error: "The caller was not authenticated by the service"

Consuming a WCF service hosted on a workgroup server may raise errors such as: "The caller was not authenticated by the service".
I tackled it by explicitly setting Windows credentials on the client before calling the service:
// In a workgroup there is no domain, so the target machine name is used in its place.
// "WorkstationName", "blabla" and "blabla123" are placeholders for a valid Windows
// account on the server hosting the service.
client.ClientCredentials.Windows.ClientCredential.Domain = "WorkstationName";
client.ClientCredentials.Windows.ClientCredential.UserName = "blabla";
client.ClientCredentials.Windows.ClientCredential.Password = "blabla123";

Wednesday, October 05, 2016

Row level security for a table in SQL Server

This post is a question I asked on SO a while ago, but I did not get any answer, so because of a deletion warning I have posted it here.
I've implemented a claims-based application with MVC, C#, EF6 and SQL Server 2014. Now there is a new requirement to add row-level security to it.
The scenario is:
  1. We have a table (Product) with nearly 10,000 rows.
  2. The User table has an n..n relation with Product.
  3. All actions have security attributes that control user access according to claims.
  4. When a user wants to fetch data from the db, I want to filter the result based on that user's access; every user must see only the related data.
There is a solution for this matter in this answer, but my problem is the row count of the Product table. If I create a new table (Product_ACL) to persist each user's permission to individual Product records, managing the permissions becomes very hard and full of potential mistakes for the end user.

The normal way to solve my issue is:
    create table Product(
        Id INT IDENTITY PRIMARY KEY,
        Name nvarchar(100)
    );

    create table Product_ACL (
        ProductId INT REFERENCES Product(Id),
        username varchar(100),
        permission varchar(20)
    );
but it is painful for users who want to manage the permissions, so I'm seeking a better design.

In brief, I'd like to add row-level security to the application like this tutorial, but the difference between that sample and my application is in user access to rows: in my application a record with a security tag can be accessed by more than one user. So I want to change the design of my tables to achieve RLS (I would have to upgrade to SQL Server 2016).

To summarize my thoughts, I think I can achieve what I want by making some changes in my entities. I have a User table to store user information, and a Product table that stores product information along with the product's region, province and city (geography information). Then an extra table stores user permissions. For example: if user_1 has access to province_1, then user_1 has access to all products in province_1; if I set the city field as well, user_1 can access only the products in both province_1 and city_1.
create table [User](
    Id INT IDENTITY PRIMARY KEY,
    Name nvarchar(100)
);

create table Product(
    Id INT IDENTITY PRIMARY KEY,
    Name nvarchar(100),
    RegionId INT REFERENCES Region(Id) NULL,
    ProvinceId INT REFERENCES Province(Id) NULL,
    CityId INT REFERENCES City(Id) NULL
);

create table User_ACL (
    UserId INT REFERENCES [User](Id),
    RegionId INT REFERENCES Region(Id) NULL,
    ProvinceId INT REFERENCES Province(Id) NULL,
    CityId INT REFERENCES City(Id) NULL
);

I think with this model if I add a new table that needs security, I can cover its requirement.
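On SQL Server 2016, that model could be wired up to built-in row-level security along these lines. This is only a sketch: the predicate function, the geography columns on Product (including CityId), and the assumption that User.Name matches the database user name are mine, not settled design:

```sql
-- Inline table-valued function used as the RLS filter predicate.
-- A row passes when the current user has a matching permission row
-- in User_ACL; a NULL in a permission column means "any".
CREATE FUNCTION dbo.fn_ProductAccessPredicate
    (@RegionId INT, @ProvinceId INT, @CityId INT)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS AccessGranted
    FROM dbo.User_ACL a
    INNER JOIN dbo.[User] u ON u.Id = a.UserId
    WHERE u.Name = USER_NAME()
      AND (a.RegionId   IS NULL OR a.RegionId   = @RegionId)
      AND (a.ProvinceId IS NULL OR a.ProvinceId = @ProvinceId)
      AND (a.CityId     IS NULL OR a.CityId     = @CityId);
GO

-- Attach the predicate to Product so every query is filtered automatically.
CREATE SECURITY POLICY dbo.ProductSecurityPolicy
ADD FILTER PREDICATE dbo.fn_ProductAccessPredicate(RegionId, ProvinceId, CityId)
ON dbo.Product
WITH (STATE = ON);
```

Note that USER_NAME() only works if each application user maps to a database user; with a shared application connection, the usual approach is to stash the user id in SESSION_CONTEXT and read it inside the predicate instead.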

Final Update:
I finally designed a security model for my application.
Is there any suggestion or improvement for my idea?
I deleted Product and the other entities from my model.
[ERD diagram]
This post helped me to find a better solution for RLS, finally.

Saturday, August 04, 2012

Adding a Control into a String Dynamically

Suppose you want to add a string to a page, and the string has to contain a hyperlink to a particular page for some reason.
You can do this job in the following way:

private string buildMessage(string message, int reportId) {
    StringBuilder sb = new StringBuilder();
    HyperLink hp = new HyperLink();
    hp.NavigateUrl = "~/index.aspx?reportId=" + reportId;
    hp.ID = "hpl";
    hp.Text = "Link Report";
    hp.Style.Add(HtmlTextWriterStyle.FontWeight, "Bold");
    sb.Append(message);
    using (StringWriter sw = new StringWriter(sb)) {
        using (HtmlTextWriter tw = new HtmlTextWriter(sw)) {
            hp.RenderControl(tw); // writes the hyperlink's HTML into sb
        }
    }
    sb.Append("Some other text");
    return sb.ToString();
}

Now you can use this function in different ways. For example, you can add a Panel control to your page and then call this function to set the panel's InnerHtml property.

Good Luck.

Wednesday, August 01, 2012

Distinct in a List of Objects

If you have a list of objects that contains duplicate elements and you want to remove the duplicates, you might reach for LINQ's Distinct() method. But the result may not be what you want, because without a custom equality comparer, Distinct() compares reference-type objects by reference rather than by their property values.
To get the result you want, group the list by the property you are comparing on and take the first element of each group:

var listOfObject = items
    .GroupBy(l => l.PropertyToCompare) // name of your property
    .Select(l => l.First());

Thursday, March 03, 2011

ASP.NET Ajax 3.5 error Sys.WebForms.PageRequestManagerParserErrorException

A few days ago I wanted to add an Export to Excel button to a page in my web application that uses Ajax, but unfortunately it did not work properly and generated an error (the one in this post's title). After searching the net I found out I had to add a line to the page. That was it; it worked excellently :). If you encounter this error, you must add:


and also:

            <asp:PostBackTrigger ControlID="ControlName" />

Good luck :)
