Saturday, August 21, 2010

Brief about SSP (MOSS - Shared Service Provider)

What is an SSP: A Shared Services Provider acts as more than an intermediary between web applications and sensitive services. By creating a Shared Services Provider (SSP) you create an interface, or mediator, that retrieves the information you need from the sensitive service and places it in a database that the appropriately authorised web applications can query.

The best example to illustrate the use of an SSP is the list of users/profiles. An SSP can import all the users and their information from Active Directory (AD) once and make them available to other web applications. The web applications can then consume that data from the SSP instead of each maintaining an identical copy locally.

Some key aspects of SSP that I learned are mentioned below.

  • SSP offers five services:
  1. Business Data Catalog (BDC)
  2. Office SharePoint Server Search
  3. Excel Services
  4. User Profile Application
  5. Session State
  • When you configure an SSP, it creates two databases:
    • SSP Service Database (SharedServices_DB) → for SSP configuration data
    • SSP Search Database (SharedServices_Search_DB) → for the SSP's search-related data

Recommended Guidelines for SSP

  1. It is recommended to create a separate application pool for the SSP web application.
  2. It is not recommended to use the same web application for hosting both SharePoint sites and SSP sites (this complicates your backup strategy).
  3. It is good practice to create a separate web application for hosting My Sites. Using the same web application for both My Sites and the SSP will cause problems during backup.
  4. The SSP creates two databases; it is good practice to host these databases on a different SQL Server instance rather than on the same SQL Server that hosts your SharePoint site databases.

You can also refer to a very good article/post on SSPs by a SharePoint MVP:
http://msmvps.com/blogs/shane/archive/2007/06/29/what-is-a-shared-service-provider.aspx

Hope you find it useful, cheers.

Saturday, August 14, 2010

How to find Tables with NO INDEXES...??

Often, when your application is not performing as well as it used to, you start blaming your SQL developer for writing bad queries. But that may not always be the case. For your application to be fast, the underlying database also needs proper indexes.

So when your application is not working as fast as it used to, one of the first checks you might want to do is whether you have proper indexes in place. A very obvious check is to identify tables which do not have any indexes at all. Now how do you do that?

Open SSMS and check all the tables one by one inside your database..!! Pretty time consuming and very manual, right?

I have a very handy SQL script which will do the job for you very quickly. Execute the script below on your database and you will get a list of all the culprit tables.

-- List user tables that have no indexes at all.
SELECT name
FROM sys.tables
WHERE OBJECTPROPERTY(OBJECT_ID, 'TableHasIndex') = 0
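
A related check you might also find handy: the same OBJECTPROPERTY approach can flag heaps, i.e. tables that have no clustered index. This is just a sketch along the same lines as the query above.

-- Sketch: list user tables with no clustered index (heaps).
SELECT name
FROM sys.tables
WHERE OBJECTPROPERTY(OBJECT_ID, 'TableHasClustIndex') = 0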

Hope that you will find this post useful.

Cheers,

Saturday, August 7, 2010

Best Practices for SQL Queries

I was looking into best practices for writing SQL queries and came across a very interesting and useful online resource/post. It focuses on how to write understandable, maintainable and readable queries, and it is a good reference for SQL code reviews. Below is the link.


Enjoy coding..!!!

Fetching Contents from EXCEL File..!!

Often, in .NET applications, we want to fetch data from Excel files. Reading contents from an Excel file is quite a simple task.

There are two ways you can read contents from an Excel file.

  1. Using Microsoft Office Component
  2. Using Microsoft Jet Engine

As per Microsoft's recommendation, it is not advisable to use the Office components on a server. This means that if you are writing a server application, it is not a good idea to use the Office component; it is better to make the connection using the Jet engine.

So in this post I will be discussing the second, recommended option: using the Microsoft Jet engine.

Connecting To EXCEL

To make a connection to Excel you can use the OLE DB objects, which treat the Excel file as a database; the required contents can then be fetched using SQL queries. Following is the connection string that you can use.

The connection string should be set on the OleDbConnection object. Getting it right is critical, as the Jet engine might not give a proper error message if the appropriate details are not supplied.

Syntax for XLS files:
Provider=Microsoft.Jet.OLEDB.4.0; Data Source=<Full Path of Excel File>; Extended Properties="Excel 8.0; HDR=No; IMEX=1".

Data Source: the full path of the Excel file.

Extended Properties: includes

  1. HDR = [Yes/No]

This property specifies whether the first row contains column headers. If the value is "Yes", the first row is treated as the heading row. Otherwise, the headings are generated automatically as F1, F2 and so on.

  2. IMEX = [0/1/2]

This property indicates the import/export mode and controls how the data types of the contents are treated.

  • IMEX=0 and IMEX=2: the first 8 rows are sampled and the data type for each column is decided from them.
  • IMEX=1: everything is treated as text.

For more info regarding Extended Properties, click here.

Finally, your connection to the Excel file will look something like this.

OleDbConnection cn = new OleDbConnection();
cn.ConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0; Data Source=C:\MyExcel.xls; Extended Properties=""Excel 8.0; HDR=No; IMEX=1""";

Fetch EXCEL Sheets from EXCEL File

As we have seen, the Excel file behaves like a database, and the sheets in the file are equivalent to tables inside that database. So all we need to know is the sheet name, and we can then write a query to fetch data from that sheet. Our next step, therefore, is to find the sheets contained in the Excel file. The following code fetches the names of all the sheets in the file.

System.Data.DataTable dt = null;

cn.Open();
dt = cn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);

if (dt == null)
{
    return null;
}

String[] excelSheets = new String[dt.Rows.Count];
int i = 0;

// Add the sheet names to the string array.
foreach (DataRow row in dt.Rows)
{
    string strSheetTableName = row["TABLE_NAME"].ToString();

    // Sheet names come back with a trailing '$' (e.g. "Sheet1$"); strip it.
    excelSheets[i] = strSheetTableName.Substring(0, strSheetTableName.Length - 1);
    i++;
}

return excelSheets;

If you already know which sheets the Excel file contains, then you can hardcode the sheet names directly. The code above is only needed when you want to get the list of all sheets in the file.

Reading Contents from EXCEL Sheet

Now we know the database (the Excel file connection) and the table (sheet name) from which we have to read contents. We only have to write one query to fetch the data from the sheet. Following is the code for this.

string strCommand;
string strSheetName;

strSheetName = "sheet1";    // I know the sheet name, so I specify it here directly.
strCommand = "select * from [" + strSheetName + "$]";    // You can specify individual column names here instead of *.

OleDbDataAdapter daAdapter = new OleDbDataAdapter(strCommand, cn);
DataTable dt = new DataTable(strSheetName);

daAdapter.FillSchema(dt, SchemaType.Source);
daAdapter.Fill(dt);
cn.Close();

return dt;

The code above gives you a DataTable containing all the contents of the specified sheet. You can now load this data into a DataSet as follows.

DataSet ds = new DataSet();
ds.Tables.Add(dt);

I hope that you will find this post useful. Feel free to leave your comments and feedback.

Cheers,

Mirza Ateeq


 

Friday, July 9, 2010

Writing Custom functions in SSRS

We all know that SSRS comes with a very limited set of built-in functions, and most of the time we end up in situations where the available functions do not meet our requirements.

Many times we come across situations where we want to modify or transform the actual data before rendering it in the report. This transformation can be as simple as changing date formats, and sometimes it can be complex, such as encoding/decoding the stored data.
Though we could do these transformations in the SQL queries, it is not advisable to do so: business logic should not be mixed into the data layer. So what should we do when SSRS does not provide a built-in function that fulfils our needs?

Fortunately, SSRS allows you to create your own custom functions as per your need. You can write custom functions to implement any complex string manipulation or logic; code embedded directly in a report is written in VB.NET.

Recently I needed to decode HTML content that is stored in the database and show it in a report. As there is no predefined function available to decode encoded HTML, I decided to write my own custom function. Adding a custom function is simple:
Go to Report Properties under the Report menu and click the Code option in the dialog. It gives you a placeholder for your custom code, as shown in the image below.

[Screenshot: the Report Properties dialog with the Code section open, where the custom code is pasted.]

That's it, you are done. Using these custom functions in the report is also easy. All you need to do, in the case mentioned above, is set the expression to =Code.HTMLDecode(Fields!Description.Value).
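
For reference, here is a minimal sketch of what such an HTMLDecode function could look like (SSRS embedded report code is written in VB; the entity replacements here are illustrative, not exhaustive):

' Minimal sketch of a custom HTMLDecode function for the report Code block.
' Only a handful of common entities are handled here; extend as needed.
Public Function HTMLDecode(ByVal value As String) As String
    If value Is Nothing Then
        Return Nothing
    End If
    value = value.Replace("&lt;", "<")
    value = value.Replace("&gt;", ">")
    value = value.Replace("&quot;", """")
    value = value.Replace("&#39;", "'")
    value = value.Replace("&amp;", "&")
    Return value
End Function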


cheers,

Monday, June 14, 2010

How to Identify Open SQL Connections

Sometimes you run into a situation where SQL Server does not allow you to create a new connection because the limit on the number of open connections has been reached. This is an indication that your application is leaking connections: somewhere a developer forgot to close a connection to the database after use.

To debug this problem you need to know how many open connections your application has left behind in the connection pools.
There is a very simple and handy query which can help you out in such situations.

-- List all open connections, excluding processes not associated with any database (DBID = 0).
SELECT SPID,
       STATUS,
       PROGRAM_NAME,
       LOGINAME = RTRIM(LOGINAME),
       HOSTNAME,
       CMD,
       DBID
FROM MASTER.DBO.SYSPROCESSES
WHERE DBID != 0

If you know the exact database name, you can add a database filter to the WHERE clause to show only the connections made to that specific database. See below.
AND DB_NAME(DBID) = 'EnterDBName'
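
If you want a quick summary rather than the raw list, the same view can also be aggregated. This is just a sketch along the same lines as the query above.

-- Sketch: count open connections per database and program.
SELECT DB_NAME(DBID)  AS DatabaseName,
       PROGRAM_NAME   AS ProgramName,
       COUNT(*)       AS OpenConnections
FROM MASTER.DBO.SYSPROCESSES
WHERE DBID != 0
GROUP BY DB_NAME(DBID), PROGRAM_NAME
ORDER BY OpenConnections DESC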

Hope you find this query useful, cheers.

Sunday, May 23, 2010

Implementing Distinct UNION Component in SSIS 2008

SSIS 2008 comes with a data flow component called UNION ALL, which combines the result sets from two different input sources into one output.
For example, suppose you want to fetch a list of books from two different Flat File sources and insert them into one table [bookLis].
To implement this you can add two Flat File sources to the Data Flow task, combine them using the UNION ALL component, and finally pass the result from the UNION ALL component on to an OLE DB Destination, as indicated in the image below.

[Screenshot: Data Flow with two Flat File Sources feeding a Union All component, whose output goes to an OLE DB Destination.]

This works fine as long as you don't have duplicates in the input flat files. Now take the scenario where the two flat files contain some duplicate records and you want to insert only the UNIQUE records into the table. Unfortunately the UNION ALL component does not provide a way to output only distinct records. So how can we implement this business requirement in SSIS?

There is a simple solution to the problem: we can use the SORT component to remove duplicate records from the output produced by UNION ALL. Add a SORT component to the Data Flow task, taking its input from the UNION ALL component. Edit the SORT component to specify the sort order based on the column which uniquely identifies a book record (say BookID). This sorts the result set generated by UNION ALL. The SORT component also provides an option to remove rows with duplicate sort values, so if you check this option the result generated by the SORT component will not contain any duplicate records.
You can then pass the output from the SORT component to your OLE DB Destination for the final save, as indicated in the images below.

[Screenshots: the Sort Transformation Editor with the duplicate-removal option checked, and the Data Flow with the Sort component placed between Union All and the OLE DB Destination.]

I hope that you find this post useful. Feel free to provide your comments.
Thanks for reading, Cheers.



Monday, April 26, 2010

What is "nvarchar" ?

Most of us are familiar with the SQL data type nvarchar.
Yes, nvarchar comes with Unicode support, and thus we use it when we want support for different languages.

But have you ever wondered why it is called nvarchar? If yes, then here is the answer.
As per the ANSI standard, nvarchar stands for "national character varying". So while declaring a column or variable of type nvarchar in SQL you can also specify it as follows.

create table tEmployeeMaster(
EmployeeName national character varying(64)
)


One more important thing to note about this data type: whenever you insert a string value into an nvarchar column or variable, SQL Server first has to convert it from varchar to nvarchar and then store it.

Insert Into tEmployeeMaster values('Mirza Ateeq Baig')

So here there is some conversion overhead for SQL Server when casting the string to nvarchar.
There is a way to avoid this overhead: the insert statement above can also be written as follows.

Insert Into tEmployeeMaster values(N'Mirza Ateeq Baig')

The prefix N before the string indicates that the literal should be treated as nvarchar. SQL Server now knows it is already nvarchar, so no conversion is required in this case.
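
The N prefix also matters for correctness, not just for performance: without it, the literal is varchar first, so characters outside the database's code page may be lost. A quick illustration (the sample value is just for demonstration):

-- Without N the literal is varchar, so non-Latin characters may be stored as '?'.
Insert Into tEmployeeMaster values('日本語')

-- With N the literal is nvarchar, so the characters are stored correctly.
Insert Into tEmployeeMaster values(N'日本語')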

Hope that this is useful for some of you. cheers,

Friday, March 12, 2010

How to make Read-only VIEWS in SQL-Server

We all know that views in SQL Server are updatable. Unlike Oracle, there is no direct option available to make a view read-only (as far as I know; correct me if I am wrong). So I was looking into some options to make views read-only, so that users are not able to update data through them.

One solution is to grant only SELECT permission on the view to the specified users or roles. With that, the user or role will only be able to SELECT from the view.
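
A minimal sketch of this permissions approach (the role and view names here are just examples):

-- Example only: let a reporting role read the view but not modify data through it.
CREATE ROLE ReportingRole

GRANT SELECT ON dbo.MyView TO ReportingRole
DENY INSERT, UPDATE, DELETE ON dbo.MyView TO ReportingRole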

But if you don't want to mess around with security and don't want anyone to be able to update the view at all, then there is a "cheat" solution: while creating the view, use derived fields even where you could select the fields from the table directly.

Following is a view I created with two derived columns and one column mapped directly from the table.

CREATE VIEW [dbo].[TestReadOnlyView]
AS
SELECT CASE WHEN PriorityTypeID IS NULL THEN NULL
            ELSE PriorityTypeID END AS PriorityTypeID,
       CASE WHEN PriorityTypeDesc IS NULL THEN NULL
            ELSE PriorityTypeDesc END AS PriorityTypeDesc,
       SortOrder
FROM dbo.PriorityTypes


With this view you can update the SortOrder column, but you won't be able to update the other two columns.
Try it out yourself.

I am still looking into this, so if anyone has any other solution or opinion, please do share it with me.

Thursday, January 28, 2010

User-Defined Function to Trim Zeros from a Decimal Value

Most of the time, for reporting purposes, we don't want to show the extra trailing zeros that come along with decimal values. To my surprise SQL Server doesn't provide any function to trim these useless zeros from a decimal. After searching a lot I found that you have to write your own function for this purpose.
I found one scalar function which you can call in your procedures to trim the extra zeros from a decimal value. Pass in the decimal value, variable or column you want to remove the extra zeros from, and it returns a trimmed string:


CREATE FUNCTION [dbo].[fnTrimZeros]
(
    @decValue decimal(7,5)
)
RETURNS varchar(9)
AS
BEGIN
    DECLARE @txtTrimmed varchar(9), @chrTest varchar(1)

    SELECT @txtTrimmed = Convert(varchar(8), @decValue)
    SELECT @chrTest = Substring(@txtTrimmed, Len(@txtTrimmed), 1)

    -- Strip trailing zeros one character at a time.
    WHILE @chrTest = '0'
    BEGIN
        SELECT @txtTrimmed = Substring(@txtTrimmed, 0, Len(@txtTrimmed))
        SELECT @chrTest = Substring(@txtTrimmed, Len(@txtTrimmed), 1)
    END

    IF @chrTest = '.' -- remove unnecessary decimal point
        SELECT @txtTrimmed = Substring(@txtTrimmed, 0, Len(@txtTrimmed))

    RETURN @txtTrimmed + '%' -- optional % sign formatting

END



Call the function by:

SELECT dbo.fnTrimZeros(@DecimalVariable) AS TrimmedDecimal

-or-

SELECT dbo.fnTrimZeros(DecimalColumn) AS TrimmedDecimal FROM dbo.DecimalTable
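
For example, with an illustrative value:

-- 12.30400 is converted to '12.30400', the trailing zeros are stripped,
-- and the optional '%' is appended, so this returns '12.304%'.
SELECT dbo.fnTrimZeros(12.30400) AS TrimmedDecimal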

You can adjust the precision of the decimal, or the length of the varchars, to suit your needs. Also, if you convert the trimmed string back to a decimal, you will probably just add the extra zeros back, since you can't predict the precision of the decimal on the fly...
Ideally this kind of string manipulation should be done in the presentation layer (when possible, because sometimes it's not!) instead of in the back end.

Wednesday, January 20, 2010

Rarely Used but Useful SQL Function "NULLIF"

We are all very well aware of and familiar with the function ISNULL, which is used to check whether a value is NULL and to provide a replacement value for the NULL.

Recently I came across one more useful SQL function, NULLIF. This works exactly opposite to ISNULL: here you look for a specific value and replace it with NULL.
Now you may wonder why we would ever need to replace a useful value with NULL. Well, here is one situation. Suppose you are doing some mathematical operations and you have to account for division by zero. Instead of writing multiple checks and IF clauses, the NULLIF function comes in handy.

One of the simplest solutions is to do something like this:

SELECT @v1 / NULLIF( @v2, 0)

This will return NULL if the division is invalid (division by zero); otherwise it will give the proper result.

If you prefer the result of an invalid division to be zero, you can add a COALESCE:

SELECT COALESCE( @v1 / NULLIF(@v2,0), 0)
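
A quick self-contained illustration (the variable names and values here are just for demonstration):

DECLARE @v1 decimal(10,2)
DECLARE @v2 decimal(10,2)
SET @v1 = 10
SET @v2 = 0

SELECT @v1 / NULLIF(@v2, 0) AS NullOnDivideByZero                 -- returns NULL instead of raising an error
SELECT COALESCE(@v1 / NULLIF(@v2, 0), 0) AS ZeroOnDivideByZero    -- returns 0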


Hope that you will also find this function useful. Feel free to put your comments.

Saturday, January 2, 2010

Potential of 'SixthSense' Technology by Pranav Mistry

Guys, I was amazed to see an Indian come up with such an important technology, one that could actually change how we interact with the physical world. The person I am talking about is Pranav Mistry, an Indian researcher at MIT, and I must say he has an amazing vision.
This video is a must watch for every techie who wants to make a difference.

At TEDIndia, Pranav Mistry demos several tools that help the physical world interact with the world of data -- including a deep look at his SixthSense device and a new, paradigm-shifting paper "laptop".
In an onstage Q&A, Mistry says he'll open-source the software behind SixthSense, to open its possibilities to all.

http://economictimes.indiatimes.com/tv/TED-India-Pranav-Mistry/videoshow_ted/5231080.cms


Hope you like it..!!!