Channel: Microsoft SQL Server Support Blog

Dealing with very large SQL Compact database files


When working with very large SQL Server Compact database files, you may run into two issues:

 

1.     Manipulating a large SQL Compact database in VS 2008 IDE requires a hotfix

 

http://support.microsoft.com/kb/968436

 

Error message when you use the Visual Studio 2008 IDE to manipulate a SQL Server Compact 3.5 database file that is larger than 128 MB: "The database file is larger than the configured maximum database size"

 

2.     Upgrading a very large database from SQL Compact 3.1 to 3.5 requires another hotfix

http://support.microsoft.com/kb/971027

FIX: Error message when you upgrade a very large database to SQL Server Compact 3.5: "The database file is larger than the configured maximum database size. This setting takes effect on the first concurrent database connection only"

Hotfix 3.5.5692.12 fixes a problem where upgrading a large SQL Server Compact database fails with the error:

The database file is larger than the configured maximum database size. This setting takes effect on the first concurrent database connection only. [ Required Max Database Size (in MB; 0 if unknown) = <size>]

Depending on the file size and the available resources on the machine, the upgrade process for a very large database may consume significant memory. This is expected behavior. You may want to stop other applications to give the upgrade process room to complete.
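A related note, under an assumption not stated in the post: SQL Server Compact also enforces a maximum database size at connect time, which applications can raise with the documented Max Database Size connection-string property. A minimal sketch of building such a string (the helper name and file path are hypothetical; 4091 MB is the SQL Server Compact 3.5 upper limit):

```python
def sqlce_connection_string(db_path, max_size_mb):
    """Build a SQL Server Compact connection string that raises the
    default maximum database size (illustrative helper, not an API)."""
    return f"Data Source={db_path};Max Database Size={max_size_mb};"

# Allow a database of up to 4091 MB, the SQL CE 3.5 upper limit.
conn_str = sqlce_connection_string(r"C:\data\big.sdf", 4091)
print(conn_str)
```

An application would pass such a string to its SqlCeConnection (or equivalent) so the opened file is not rejected for exceeding the default cap.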


How to configure SQL Server to listen on different ports on different IP addresses?


The following post describes how to configure your SQL Server to listen on different ports on the different IP addresses available on your system. This procedure applies to both SQL Server 2005 and SQL Server 2008.

Case 1: SQL Server is installed in an environment where the IP addresses have not changed since it was originally installed.

1) Open SQL Server configuration manager.

2) Ensure that the TCP/IP protocol is enabled.


By default, all the IP addresses listen on the same port or ports configured in SQL Server Configuration Manager. The SQL Server error log will have an entry like the following:

2009-07-15 17:40:06.39 Server      Server is listening on [ 'any' <ipv4> 2675].

3) In the TCP/IP properties, set ‘Listen All’ to ‘No’.


4) Go to the IP Addresses tab for the instance, set Enabled to Yes, and set the TCP Port field for the specific IP address to the desired port.

5) Restart SQL Server. You should now see entries similar to the following in the SQL Server error log:

2009-07-15 18:03:10.58 Server      Server is listening on [ x.y.z.w <ipv4> 2000].
2009-07-15 18:03:10.59 Server      Server is listening on [ x.y.z.v <ipv4> 2001].

As you can see from the above, each of the IP addresses is listening on a different port.
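The error-log entries above can also be checked programmatically. A minimal sketch (a purely illustrative helper, parsing the "Server is listening on" format shown above):

```python
import re

def listening_endpoints(errorlog_lines):
    """Extract (ip, port) pairs from 'Server is listening on' entries
    in the SQL Server error log (sketch; matches the format above)."""
    pattern = re.compile(r"Server is listening on \[ (\S+) <ipv4> (\d+)\]")
    return [(m.group(1), int(m.group(2)))
            for line in errorlog_lines
            for m in [pattern.search(line)] if m]

log = [
    "2009-07-15 18:03:10.58 Server      Server is listening on [ x.y.z.w <ipv4> 2000].",
    "2009-07-15 18:03:10.59 Server      Server is listening on [ x.y.z.v <ipv4> 2001].",
]
print(listening_endpoints(log))  # each IP paired with its own port
```

Feeding it the two log lines above yields one (address, port) tuple per listener, confirming each IP is bound to a different port.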

Case 2: SQL Server is installed in an environment where the IP addresses change dynamically, but the number of active IPs on the system stays the same (for example, two IPs are active on the system, but because of a lease expiry or a move to a new subnet, one or both of them changed). In this case, get the output of ipconfig /all on the system, and edit the affected IP addresses with the new values that are active on the system, using the procedure discussed in Case 1.

Case 3: You add an additional IP address on the system:

In that scenario, you will not be able to use the procedure documented in Case 1 or Case 2 above, because Configuration Manager’s IP address list only has as many entries as the number of IPs SQL Server found when it was installed.

In this scenario, you can take the following steps to update the registry values SQL Server reads in order to listen on different ports on different IP addresses.

Warning: Serious problems might occur if you modify the registry incorrectly by using Registry Editor or by using another method. These problems might require that you reinstall your operating system. Microsoft cannot guarantee that these problems can be solved. Modify the registry at your own risk. For more information about how to back up, restore, and modify the registry, click the following article number to view the article in the Microsoft Knowledge Base:

322756 How to back up and restore the registry in Windows

1) Navigate to the following registry key on the SQL server machine:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\!INSTANCEID!\MSSQLServer\SuperSocketNetLib\Tcp\

Note: !INSTANCEID! is a placeholder for your SQL Server instance ID.

2) Right-click IP1 and export the registry key as SQLIP template.reg

3) Using Notepad, edit the key name and the IpAddress value in the .reg file that you exported in step 2, substituting the new IP address. (You can get the list of IP addresses on the system by running ipconfig /all > ipconfig.txt from a command prompt.)

The contents would look as follows:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.SQL10\MSSQLServer\SuperSocketNetLib\Tcp\IP3] --> Change this key name to a new value, for example IP4
"Enabled"=dword:00000000
"Active"=dword:00000001
"TcpPort"="2001"
"TcpDynamicPorts"=""
"DisplayName"="Specific IP Address"
"IpAddress"="a.b.c.d" --> Update this with the new IP address value

4) After editing the file, save it with a different name, for example IP4.reg

5) Double-click the .reg file from step 4 to import the key as a new entry under the [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.SQL10\MSSQLServer\SuperSocketNetLib\Tcp] registry key.

6) Repeat steps 3, 4, and 5 for any other new IP addresses that you want to configure on the system.

Note: After adding the above registry keys, the new IP addresses should now show up in SQL server configuration manager.

7) Optional: Clean up any IPs that are no longer active by deleting the associated <IP_n> registry keys.

8) In SQL Server Configuration Manager, on the IP Addresses tab, ensure that only the addresses listed in the ipconfig output on the system have the Enabled property set to Yes, and that the others are set to No.
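Steps 2 through 5 above amount to templating the exported key with a new name and IP address. A minimal Python sketch of that templating (the helper name and the IP4/a.b.c.d/2001 values are illustrative, not part of the original procedure; Enabled is set to 1 here so the new address is active per step 8):

```python
def make_reg_entry(instance_id, ip_name, ip_address, tcp_port):
    """Render a .reg entry for a new SQL Server IP listener, mirroring
    the exported IP3 template above (illustrative helper only)."""
    key = (r"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server"
           rf"\{instance_id}\MSSQLServer\SuperSocketNetLib\Tcp\{ip_name}")
    return "\n".join([
        "Windows Registry Editor Version 5.00",
        "",
        f"[{key}]",
        '"Enabled"=dword:00000001',   # active listener (see step 8)
        '"Active"=dword:00000001',
        f'"TcpPort"="{tcp_port}"',
        '"TcpDynamicPorts"=""',
        '"DisplayName"="Specific IP Address"',
        f'"IpAddress"="{ip_address}"',
    ])

print(make_reg_entry("MSSQL10.SQL10", "IP4", "a.b.c.d", 2001))
```

Saving the printed text as IP4.reg and double-clicking it would import the new entry, just as in step 5.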


Note: If ‘Listen All’ is set to No, and the IP Addresses tab has an IP address with ‘Enabled’ set to ‘Yes’ but the IP is no longer active on the system, the SQL Server service fails to start, with error messages like the following logged to the SQL Server error log:

2009-07-16 15:43:07.87 Server      Server is listening on [ 127.0.0.1 <ipv4> 2753].
2009-07-16 15:43:07.89 Server      Error: 26024, Severity: 16, State: 1.
2009-07-16 15:43:07.89 Server      Server failed to listen on x.y.z.w <ipv4> 2000. Error: 0x2741. To proceed, notify your system administrator.
2009-07-16 15:43:07.95 Server      Error: 17182, Severity: 16, State: 1.
2009-07-16 15:43:07.95 Server      TDSSNIClient initialization failed with error 0x2741, status code 0xa. Reason: Unable to initialize the TCP/IP listener. The requested address is not valid in its context.

2009-07-16 15:43:07.95 Server      Error: 17182, Severity: 16, State: 1.
2009-07-16 15:43:07.95 Server      TDSSNIClient initialization failed with error 0x2741, status code 0x1. Reason: Initialization failed with an infrastructure error. Check for previous errors. The requested address is not valid in its context.

2009-07-16 15:43:07.95 Server      Error: 17826, Severity: 18, State: 3.
2009-07-16 15:43:07.95 Server      Could not start the network library because of an internal error in the network library. To determine the cause, review the errors immediately preceding this one in the error log.
2009-07-16 15:43:07.95 Server      Error: 17120, Severity: 16, State: 1.
2009-07-16 15:43:07.95 Server      SQL Server could not spawn FRunCM thread. Check the SQL Server error log and the Windows event logs for information about possible related problems.

Case 4: SQL Server is installed in a clustered environment.

On a cluster, you cannot configure SQL Server to listen on specific IP addresses; you must choose IPALL. The IP addresses on which the clustered instance listens are determined by the cluster resources (configurable through Cluster Administrator, by adding IP Address resources under the SQL Network Name resource).


Ramu Konidena

Microsoft SQL Server Support Technical Lead

SQL Server 2005 setup fails when MSXML Core Services 6.0 Service Pack 2 has already been installed


 

There is a known issue with SQL Server setup when the MSXML6 component update has been installed on the system. The problem described in KB 968749 (http://support.microsoft.com/kb/968749) has raised a lot of concerns among customers, because the documented solution is manual and not usable in large enterprise environments. To automate the MSXML6 component uninstall, we have created an automated solution for this issue; however, the solution needs to be implemented on a case-by-case basis.

 

If you are experiencing the issue described in KB 968749 and need an automated solution for this, please contact SQL CSS.

Please see details on how to open the incident here http://support.microsoft.com/

The incident for this specific issue is going to be free of charge.

 

We apologize for any inconvenience.

 

SQL CSS.

 

Follow us on twitter (http://twitter.com/MicrosoftSQLCSS)

How to connect to file-based data sources (Microsoft Access, Microsoft Excel, and text files) from a 64-bit application

 

The Issue:

A 64-bit process can load only 64-bit components in its process boundary; the same is true for a 32-bit process. So, if your application is 64-bit, you need a 64-bit provider or driver to connect to Microsoft Access (mdb, accdb), Microsoft Excel 2010 (xls, xlsx, and xlsb), or text files. The bad news is that there is no 64-bit provider or driver available yet to connect to these file-based data sources. The good news is that a 64-bit provider, currently in beta, is heading your way.

 

The Kludge:

The common workaround is to connect through a 32-bit SQL Server instance that has a linked server to the Access/Excel/text file. This is a hack: it can be difficult to set up, it can have stability and performance issues, and realistically, we at Microsoft would rather not support this setup or issues arising from it.

 

The Good news:

 A 64-bit driver is headed your way. This is great news for users in a 64-bit world. Soon you'll be able to connect to these file-based data sources from your 64-bit application, rather than wrestle with obscure settings to force them to connect via a Linked Server.

 

The next version of Microsoft Office, Office 2010, will be available in a 64-bit version. This version will include a 64-bit version of "2010 Office System Driver Connectivity Components" which will include all the needed 64-bit ODBC driver and OLEDB providers to connect to these file-based data sources.

 

You will not have to buy or install Office 2010 to obtain and use the new 64-bit components. Like the current version of the provider, it will be available as a free download.

 

You can download the beta version from here:

http://www.microsoft.com/downloads/details.aspx?familyid=C06B8369-60DD-4B64-A44B-84B371EDE16D&displaylang=en

 

Connection string for 64-bit OLEDB Provider:

  • For Microsoft Office Access: set the Provider string to “Microsoft.ACE.OLEDB.12.0”.
  • For Microsoft Office Excel: add “Excel 12.0” to the Extended Properties of the OLEDB connection string.

 

Connection string for 64-bit ODBC Driver:

  • For Microsoft Office Access: Set the Connection String to “Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=path to mdb/accdb file”
  • For Microsoft Office Excel: Set the Connection String to “Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};DBQ=path to xls/xlsx/xlsm/xlsb file”
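The driver strings above lend themselves to a small helper that picks the right one from the file extension. A sketch (the helper name and file paths are illustrative; actually opening a connection would additionally require the 64-bit driver to be installed, e.g. via an ODBC library):

```python
def ace_odbc_connection_string(file_path):
    """Build an ACE ODBC connection string for an Access or Excel file
    based on its extension (sketch of the strings listed above)."""
    ext = file_path.lower().rsplit(".", 1)[-1]
    if ext in ("mdb", "accdb"):
        driver = "Microsoft Access Driver (*.mdb, *.accdb)"
    elif ext in ("xls", "xlsx", "xlsm", "xlsb"):
        driver = "Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)"
    else:
        raise ValueError("not an Access/Excel file: " + file_path)
    return f"Driver={{{driver}}};DBQ={file_path}"

print(ace_odbc_connection_string(r"C:\data\sales.xlsx"))
```

The resulting string is exactly the Driver=...;DBQ=... form shown in the bullets above, ready to hand to whatever ODBC API your 64-bit application uses.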

 

The gotchas:

  • You cannot install the 32-bit version and the 64-bit version of the "2010 Office System Driver Connectivity Components" on the same computer.
  • You cannot install the 64-bit version of the "2010 Office System Driver Connectivity Components" on a computer that already has the 32-bit Office 2007 ACE Provider. However, the 32-bit Office 2007 provider can coexist side-by-side with the 32-bit version of the "2010 Office System Driver Connectivity Components".

 

Authors:  Enamul Khaleque & Srini Gajjela [DSD-SQLDeveloper group at Microsoft]

 

Tools of the Trade: Part IV - Developing WinDbg Extension DLLs

$
0
0

A WinDbg extension DLL is a set of exported callback functions that implement user-defined commands to extract specific, customized information from memory dumps. Extension DLLs are loaded by the debugger engine and can provide extra automation functionality while performing user-mode or kernel-mode debugging. An extension DLL may export any number of functions that are used to execute extension commands. Each function is explicitly declared as an export in the DLL's module-definition (.def) file, and function names must be in lowercase letters.

A WinDbg (DbgEng) extension DLL must export DebugExtensionInitialize. This will be called when the DLL is loaded, to initialize the DLL. It may be used by the DLL to initialize global variables.

An extension DLL may export an optional function DebugExtensionUninitialize. If this is exported, it will be called just before the extension DLL is unloaded.

An extension DLL may export a DebugExtensionNotify. If this is exported, it will be called when a session begins or ends, and when a target starts or stops executing. These notifications are also provided to IDebugEventCallbacks objects registered with a client.

An extension DLL may export KnownStructOutput. If this is exported, it will be called when the DLL is loaded. This function returns a list of structures that the DLL knows how to print on a single line. It may be called later to format instances of these structures for printing.

So, how do you develop your own WinDbg extension DLL? Let's follow these steps:

1. Download and install Debugging Tools for Windows from http://www.microsoft.com/whdc/devtools/debugging/installx86.Mspx

2. Create a "Win32 Console Application" project using VS 2008.

3. Select the application type "DLL" and click "Finish".


4. Add a "Module-Definition File (.def)" called "wdbrowser" to the project. One way to export your extension function is by specifying the function names in the EXPORTS section of the .def file. You may also use other ways of exporting functions, such as __declspec(dllexport).


5. Configure the project's "Additional Include Directories" to point to the header files that come with WinDbg. The default folder for x86 is "C:\Program Files\Debugging Tools for Windows (x86)\sdk\inc".

6. Configure the project's "Additional Library Directories" to point to the library files that come with WinDbg. The default folder for x86 libraries is "C:\Program Files\Debugging Tools for Windows (x86)\sdk\lib\i386".

7. The debug engine's exported functions are implemented in "dbgeng.dll" and exposed through "dbgeng.lib", so add "dbgeng.lib" to "Additional Dependencies".

8. Add the name of the module-definition file created in step 4 above.


9. Now include the following required header files in "stdafx.h":

#include <windows.h>

#include <imagehlp.h>

#include <wdbgexts.h>

#include <dbgeng.h>

#include <extsfns.h>

10. Declare the following two global variables in your extension project's main implementation file.

// Version.

EXT_API_VERSION g_ExtApiVersion = {1, 1, EXT_API_VERSION_NUMBER, 0};

WINDBG_EXTENSION_APIS ExtensionApis = {0};

11. Declare the following debug engine COM interface pointers.

IDebugAdvanced2* gAdvancedDebug2 = NULL;

IDebugControl4* gDebugControl4 = NULL;

IDebugControl* gExecuteCmd = NULL;

IDebugClient* gDebugClient = NULL;

12. The next step is to declare and implement the WinDbgExtensionDllInit function in your DLL's main implementation source file, in this example "wdbrowser.cpp". WinDbgExtensionDllInit is the first function that WinDbg calls, so it is the ideal place for any extension-specific initialization or related functionality. Please refer to http://msdn.microsoft.com/en-us/library/cc267872.aspx for more details about this function.

VOID WDBGAPI WinDbgExtensionDllInit (PWINDBG_EXTENSION_APIS lpExtensionApis, USHORT usMajorVersion, USHORT usMinorVersion)

{

    ExtensionApis = *lpExtensionApis;

    HRESULT hResult = S_FALSE;

    // Note the extra parentheses: assign first, then compare against S_OK.
    if ((hResult = DebugCreate(__uuidof(IDebugClient), (void**) &gDebugClient)) != S_OK)

    {

        dprintf("Acquiring IDebugClient* failed\n\n");

        return;

    }

    if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugControl), (void**) &gExecuteCmd)) != S_OK)

    {

        dprintf("Acquiring IDebugControl* failed\n\n");

        return;

    }

    if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugAdvanced2), (void**) &gAdvancedDebug2)) != S_OK)

    {

        dprintf("Acquiring IDebugAdvanced2* failed\n\n");

        return;

    }

    if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugControl4), (void**) &gDebugControl4)) != S_OK)

    {

        dprintf("Acquiring IDebugControl4* failed\n\n");

        return;

    }

}

13. Declare another exported function, ExtensionApiVersion, to report the version of your extension to WinDbg. Please refer to http://msdn.microsoft.com/en-us/library/cc267873.aspx for detailed information about this function.

LPEXT_API_VERSION WDBGAPI ExtensionApiVersion (void)

{

    return &g_ExtApiVersion;

}

14. Define the debug engine's interface pointers so that your extension module can interact with the debug engine. For more information, please refer to:

http://msdn.microsoft.com/en-us/library/cc265976.aspx  - IDebugClient, http://msdn.microsoft.com/en-us/library/cc266102.aspx - IDebugControl

http://msdn.microsoft.com/en-us/library/cc265957.aspx - IDebugAdvanced

IDebugAdvanced2* gAdvancedDebug2=NULL;

IDebugControl4* gDebugControl4=NULL;

IDebugControl* gExecuteCmd=NULL;

IDebugClient*               gDebugClient=NULL;

15. The next step is to implement the debug engine's callback interface IDebugOutputCallbacks. The debug engine calls your implementation of IDebugOutputCallbacks::Output() with the output resulting from the commands executed by your extension function.

Refer to http://msdn.microsoft.com/en-us/library/cc265716.aspx   for detailed information about IDebugOutputCallbacks::Output()

16. Add the following new class, which inherits the IDebugOutputCallbacks interface, in a header file.

#ifndef __OUT_HPP__

#define __OUT_HPP__

#include <string>

#include <sstream>

class StdioOutputCallbacks : public IDebugOutputCallbacks

{

private:

                        std::string m_OutputBuffer;

                        //

                        //This buffer holds the output from the command execution.

                        //

                        CHAR m_OutPutBuffer[4096];

public:

                        void InitOutPutBuffer();

                        std::string GetOutputBuffer()

                        {

                                                return m_OutputBuffer;

                        };

                        void ClearOutPutBuffer()              

                        {

                                                m_OutputBuffer = "";

                        };

    STDMETHOD(QueryInterface)(

        THIS_

        IN REFIID InterfaceId,

        OUT PVOID* Interface

        );

    STDMETHOD_(ULONG, AddRef)(

        THIS

        );

    STDMETHOD_(ULONG, Release)(

        THIS

        );

    // IDebugOutputCallbacks.

    STDMETHOD(Output)(

        THIS_

        IN ULONG Mask,

        IN PCSTR Text

        );

};

extern StdioOutputCallbacks g_OutputCb;

#endif // #ifndef __OUT_HPP__

17. Add the following code that implements the IDebugOutputCallbacks interface methods, especially Output()

#include "stdafx.h"

#include <stdio.h>

#include <windows.h>

#include <dbgeng.h>

#include "OutputCallBack.h"

StdioOutputCallbacks g_OutputCb;

STDMETHODIMP

StdioOutputCallbacks::QueryInterface(

    THIS_

    IN REFIID InterfaceId,

    OUT PVOID* Interface

    )

{

    *Interface = NULL;

    if (IsEqualIID(InterfaceId, __uuidof(IUnknown)) ||

        IsEqualIID(InterfaceId, __uuidof(IDebugOutputCallbacks)))

    {

        *Interface = (IDebugOutputCallbacks *)this;

        AddRef();

        return S_OK;

    }

    else

    {

        return E_NOINTERFACE;

    }

}

STDMETHODIMP_(ULONG)

StdioOutputCallbacks::AddRef(

    THIS

    )

{

    // This class is designed to be static so

    // there's no true refcount.

    return 1;

}

STDMETHODIMP_(ULONG)

StdioOutputCallbacks::Release(

    THIS

    )

{

    // This class is designed to be static so

    // there's no true refcount.

    return 0;

}

STDMETHODIMP

StdioOutputCallbacks::Output(

    THIS_

    IN ULONG Mask,

    IN PCSTR Text

    )

{

    UNREFERENCED_PARAMETER(Mask);

                        m_OutputBuffer += Text;

    return S_OK;

}

void StdioOutputCallbacks::InitOutPutBuffer()

{

                        m_OutputBuffer.erase();

}

18. Add the implementation of your extension function. In this example, we choose to implement an extension that displays the variable names and types in frame 2 of the current thread. The implementation is:

DECLARE_API (dvf3)

{

    //
    // Install output callbacks.
    //
    if ((gDebugClient->SetOutputCallbacks((PDEBUG_OUTPUT_CALLBACKS) &g_OutputCb)) != S_OK)

    {

        dprintf("*****Error while installing output callbacks.*****\n\n");

        return;

    }

    //
    // Since the frame count starts from index 0, we pass 2 as the parameter
    // to the .frame command to select frame #2.
    //
    // Execute the command to select the 2nd frame.
    if (gExecuteCmd->Execute(DEBUG_OUTCTL_THIS_CLIENT |   // Send output only to output callbacks
                             DEBUG_OUTCTL_OVERRIDE_MASK |
                             DEBUG_OUTCTL_NOT_LOGGED,
                             ".frame 2",
                             DEBUG_EXECUTE_DEFAULT) != S_OK)

    {

        dprintf("Executing .frame 2 failed\n");

        return;

    }

    // Execute the command to dump the local variables and formal parameters.
    if (gExecuteCmd->Execute(DEBUG_OUTCTL_THIS_CLIENT |   // Send output only to output callbacks
                             DEBUG_OUTCTL_OVERRIDE_MASK |
                             DEBUG_OUTCTL_NOT_LOGGED,
                             "dv /i /t /v",
                             DEBUG_EXECUTE_DEFAULT) != S_OK)

    {

        dprintf("Executing dv /i /t /v failed\n");

        return;

    }

    dprintf("***** Extracting locals & formal params from frame 2 *****");

    dprintf("\n%s\n", g_OutputCb.GetOutputBuffer().c_str());

}

19. Re-build the project and copy the DLL from the release folder to a folder where WinDbg looks for extension DLLs. On an x86 machine the default location is "<drive letter>\Program Files\Debugging Tools for Windows (x86)\winext".

20. The extension is now ready to use and test.

21. Start WinDbg and open a full user-mode dump. Type .load myextension and hit Enter to load the extension DLL into the WinDbg process space.


22. Run the .chain command to verify that WinDbg loaded your extension; if it did, your DLL will appear in the list of extension DLLs.

23. Type !dvf3 to run the extension function, which extracts and displays the variable names and types from frame 2.


Additional references:

http://msdn.microsoft.com/en-us/library/cc265826.aspx - describes how to interact with the debug engine: I/O operations, memory access, and using symbols and source files.

http://www.codeplex.com/ODbgExt  - Microsoft Open Debugger Extension for Windbg

Happy developing debug engine extensions!

Posted By: Srini Gajjela & Enamul Khaleque (DSD-SQLDeveloper group @ Microsoft)

Creating an HTTP endpoint fails with error 7850


Creating an HTTP endpoint in SQL Server 2005 or SQL Server 2008 may fail with the following error messages:

Msg 7850, Level 16, State 1, Line 1

The user 'domain\myuser' does not have permission to register endpoint 'training_sql_endpoint' on the specified URL.  Please ensure the URL refers to a namespace that is reserved for listening by SQL.

Msg 7807, Level 16, State 1, Line 1

An error ('0x80070005') occurred while attempting to register the endpoint 'training_sql_endpoint'.

 

The error message is actually incorrect in this context. It should instead refer to the SQL Server service account, and direct you to reserve the namespace explicitly, which is what SQL Server 2008 now reports:

The SQL Server Service account does not have permission to register the supplied URL on the endpoint '%.*ls'.  Use sp_reserve_http_namespace to explicitly reserve the URL namespace before you try to register the URL again.

When you run CREATE ENDPOINT to create an HTTP endpoint, this is done under the context of the SQL Server service account. If the namespace reservation does not already exist, SQL Server will implicitly create the reservation. However, this requires that the SQL Server service account have local administrator privileges on the computer. If it does not, on SQL Server 2005 the statement fails with the message noted earlier.

 To resolve this you have two options:

1. Add the SQL Server service account to the local Administrators group, restart, and then run the CREATE ENDPOINT again.

2. Or, explicitly reserve the namespace, while logged on as a Windows Authentication user that has local administrator on the computer and sysadmin on SQL Server, *before* you run CREATE ENDPOINT. For example:

sp_reserve_http_namespace N'http://*:2050/sql/myfolder'

Then, when you run the CREATE ENDPOINT, the SQL Server service account will not have to reserve the namespace because it already exists, and it will proceed with creating the endpoint. Note that when you reserve a namespace explicitly, you need to be sure that the namespace string you reserve matches the parameters in the CREATE ENDPOINT statement. So for the namespace above, the CREATE ENDPOINT would need to look like the following for SQL Server to match it up correctly:

 CREATE ENDPOINT [myendpoint]

          STATE=STARTED

AS HTTP (PATH=N'/sql/myfolder', PORTS = (CLEAR), AUTHENTICATION = (NTLM, KERBEROS, INTEGRATED), SITE=N'*', CLEAR_PORT = 2050, COMPRESSION=DISABLED) 

              

The following link has more on this in “Identifying the Namespace for an Endpoint” http://msdn.microsoft.com/en-us/library/ms190614.aspx
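The matching rule can be made concrete: the reserved string is simply http:// followed by the SITE, the CLEAR_PORT, and the PATH from the endpoint definition. A small sketch of that composition (an illustrative helper, not a SQL Server API):

```python
def endpoint_namespace(site, clear_port, path):
    """Compose the URL namespace that sp_reserve_http_namespace must
    reserve for a matching CREATE ENDPOINT (illustrative check)."""
    return f"http://{site}:{clear_port}{path}"

# The reservation from the example must match the endpoint's parameters:
reserved = "http://*:2050/sql/myfolder"
print(endpoint_namespace("*", 2050, "/sql/myfolder") == reserved)
```

Running such a check before executing CREATE ENDPOINT is an easy way to catch a mismatched SITE, port, or PATH.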

 

How to install/enable .Net 3.5 SP1 on Windows Server 2008 R2 for SQL Server 2008 and SQL Server 2008 R2

The .NET Framework 3.5 SP1 (also referred to as .NET Framework 3.5.1) is a prerequisite for SQL Server 2008. SQL Server 2008 Setup for standalone instance will install the .NET Framework 3.5 SP1 if it is not already installed. In Windows Server 2008 R2, the .NET Framework is a feature and installing it is different when compared to older versions of Windows Operating System. In previous versions, one could either download the .NET Framework 3.5.1 install binaries from Microsoft download site or use the install binaries from the redist folder of the SQL Server 2008 installation media. Starting with Windows Server 2008 R2, the method of installing .NET Framework 3.5.1 has changed. This document explains how to verify that .NET Framework 3.5.1 is installed and if it is not installed how you can add it.
  
How to verify if .NET Framework 3.5 SP1 is installed:

Here are the steps to verify that .NET Framework 3.5.1 is installed on Windows Server 2008 R2.

  1. Click the Start button in the lower left hand corner of the display.
  2. Highlight Administrative Tools and select Server Manager.
  3. In the Server Manager interface, click Features to display all the installed Features in the right hand pane. Verify that .NET Framework 3.5.1 is listed.

If .NET Framework 3.5.1 feature is not listed, you can use either of the following methods to install it:

Method 1: Using Server Manager Interface

  1. In the Server Manager interface, select Add Features to display a list of possible features.
  2. In the Select Features interface, expand .NET Framework 3.5.1 Features.
  3. Once you expand .NET Framework 3.5.1 Features, you will see two check boxes: one for .NET Framework 3.5.1 and the other for WCF Activation. Check the box next to .NET Framework 3.5.1 and click Next.
    Note: If you check .NET Framework 3.5.1 Features without expanding it, you will get a pop-up titled Add Features Wizard with the message: "You cannot install .NET Framework 3.5.1 Features unless the required role services and features are also installed."
    Click Cancel, expand .NET Framework 3.5.1 Features, and then check the .NET Framework 3.5.1 check box below it.
  4. In the Confirm Installation Selections interface, review the selections and then click Install.
  5. Allow the installation process to complete and then click Close.

Method 2: Using PowerShell

  1. Click the Start button in the lower left hand corner of the display.
  2. Highlight All Programs and select Accessories.
  3. Expand Windows PowerShell and right click Windows PowerShell and select Run as administrator. Click Yes on the User Account Control box.
  4. At the PowerShell command prompt, type the following commands, and then press ENTER after each command:
  • Import-Module ServerManager
  • Add-WindowsFeature as-net-framework


 

Step by step N-tier configuration of Sync services for ADO.NET 2.0


Recently I have worked on a couple of cases where customers were trying to use an N-tier configuration of Sync Services for ADO.NET on IIS. In this blog we will use IIS to set up an N-tier configuration of Sync Services for ADO.NET version 2.0, which comes as part of Microsoft Sync Framework 1.0.

Preparing Environment:

We need a development machine that we will use to develop the application, a middle-tier server where IIS is installed and configured for WCF services, and a database server. If you like, you can use the same development box as the IIS and database server too.

a) Client tier

We will develop the application (client and WCF service) and run the client part of the application on this machine. We will use Visual Studio 2008 SP1, so it should be installed on this machine.

Install Sync Services on this client machine. Sync Services for ADO.NET version 2.0 comes with Microsoft Sync Framework 1.0, which gets installed when you install SQL Server 2008 or Visual Studio 2008 SP1. You can also download it from http://www.microsoft.com/downloads/details.aspx?FamilyId=C88BA2D1-CEF3-4149-B301-9B056E7FB1E6&displaylang=en

Install SQL Server Compact 3.5 Service Pack 1 on this client machine if it is not already there. SQL Server Compact is available in three ways: integrated with SQL Server 2008 or later versions, integrated with Microsoft Visual Studio 2008 or later versions, and as a download from the Web site at: http://www.microsoft.com/sqlserver/2005/en/us/compact-downloads.aspx

b) Middle tier

If you want to use a separate middle-tier server for IIS to run the WCF service, make sure you have IIS installed and configured on this box to run WCF. Install Sync Services on this machine too. I have used the same development machine for the middle tier, so I did not have to install it again.

c) Database Server

Install a version of SQL Server other than SQL Server Compact to act as the server database. If you like, you can use SQL Server Express, which installs with Visual Studio. I have used Developer Edition of SQL Server 2008 SP1.

We are going to use a sample database that should be prepared by following the article “Setup Scripts for Sync Services How-to Topics” at: http://msdn.microsoft.com/en-us/library/bb726041.aspx. Copy the T-SQL statements of “Custom Change Tracking for Offline Scenarios”, which create a custom change tracking infrastructure. Once you run this script successfully from a query window in your SQL Server, it will create the new database by the name SyncSamplesDb. I have the database created as shown below:

[Screenshot: SyncSamplesDb database created in SQL Server Management Studio]
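As a quick sanity check that the setup script created the sample objects (table, tracking columns, and tombstone table names per the script), you can run something like:

```sql
-- Sanity check after running the "Custom Change Tracking for Offline Scenarios" script:
USE SyncSamplesDb;
SELECT CustomerName, SalesPerson, CustomerType, InsertTimestamp, UpdateTimestamp
FROM Sales.Customer;
SELECT COUNT(*) AS TombstoneRows FROM Sales.Customer_Tombstone;
```

If both queries run without errors, the change-tracking infrastructure is in place.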

Developing and deploying WCF service:

Create a WCF Service Application in Visual Studio 2008: (You may need to run Visual Studio as Administrator to create virtual directories in IIS)

[Screenshot: New Project dialog with WCF Service Application selected]

Type “WcfForSync” as the name of the project and click OK. Visual Studio will create the WCF project with IService1.cs, Service1.svc, Service1.svc.cs and Web.config, along with other files and folders. I will keep these default files for simplicity.

Open the IService1.cs file by double-clicking it and replace the code with the code below. (Note that the code samples in the following sections have been taken from the MSDN articles mentioned in the reference section at the bottom, with slight modification.)

using System.Collections.ObjectModel;

using System.ServiceModel;

using System.Data;

using Microsoft.Synchronization;

using Microsoft.Synchronization.Data;

using Microsoft.Synchronization.Data.Server;

namespace WcfForSync

{

    [ServiceContract]

    public interface IService1

    {

        [OperationContract()]

        SyncContext ApplyChanges(SyncGroupMetadata groupMetadata, DataSet dataSet, SyncSession syncSession);

        [OperationContract()]

        SyncContext GetChanges(SyncGroupMetadata groupMetadata, SyncSession syncSession);

        [OperationContract()]

        SyncSchema GetSchema(Collection<string> tableNames, SyncSession syncSession);

        [OperationContract()]

        SyncServerInfo GetServerInfo(SyncSession syncSession);

    }

}

Next add a class file by the name SampleServerSyncProvider.cs as below:

[Screenshot: Add New Item dialog adding the SampleServerSyncProvider.cs class file]

Replace the code in this file with the code below:

using System.Data;

using System.Data.SqlClient;

using Microsoft.Synchronization;

using Microsoft.Synchronization.Data;

using Microsoft.Synchronization.Data.Server;

namespace WcfForSync

{

    //Create a class that is derived from

    //Microsoft.Synchronization.Server.DbServerSyncProvider.

    public class SampleServerSyncProvider : DbServerSyncProvider

    {

        public SampleServerSyncProvider()

        {

            //Create a connection to the sample server database.

            Utility util = new Utility();

            SqlConnection serverConn = new SqlConnection(util.ServerConnString);

            this.Connection = serverConn;

            //Create a command to retrieve a new anchor value from

            //the server. In this case, we use a timestamp value

            //that is retrieved and stored in the client database.

            //During each synchronization, the new anchor value and

            //the last anchor value from the previous synchronization

            //are used: the set of changes between these upper and

            //lower bounds is synchronized.

            //

            //SyncSession.SyncNewReceivedAnchor is a string constant;

            //you could also use @sync_new_received_anchor directly in

            //your queries.

            SqlCommand selectNewAnchorCommand = new SqlCommand();

            string newAnchorVariable = "@" + SyncSession.SyncNewReceivedAnchor;

            selectNewAnchorCommand.CommandText = "SELECT " + newAnchorVariable + " = min_active_rowversion() - 1";

            selectNewAnchorCommand.Parameters.Add(newAnchorVariable, SqlDbType.Timestamp);

            selectNewAnchorCommand.Parameters[newAnchorVariable].Direction = ParameterDirection.Output;

            selectNewAnchorCommand.Connection = serverConn;

            this.SelectNewAnchorCommand = selectNewAnchorCommand;

            //Create a SyncAdapter for the Customer table by using

            //the SqlSyncAdapterBuilder:

            //  * Specify the base table and tombstone table names.

            //  * Specify the columns that are used to track when

            //    changes are made.

            //  * Specify download-only synchronization.

            //  * Call ToSyncAdapter to create the SyncAdapter.

            //  * Specify a name for the SyncAdapter that matches the

            //    the name specified for the corresponding SyncTable.

            //    Do not include the schema names (Sales in this case).

            SqlSyncAdapterBuilder customerBuilder = new SqlSyncAdapterBuilder(serverConn);

            customerBuilder.TableName = "Sales.Customer";

            customerBuilder.TombstoneTableName = customerBuilder.TableName + "_Tombstone";

            customerBuilder.SyncDirection = SyncDirection.DownloadOnly;

            customerBuilder.CreationTrackingColumn = "InsertTimestamp";

            customerBuilder.UpdateTrackingColumn = "UpdateTimestamp";

            customerBuilder.DeletionTrackingColumn = "DeleteTimestamp";

            SyncAdapter customerSyncAdapter = customerBuilder.ToSyncAdapter();

            customerSyncAdapter.TableName = "Customer";

            this.SyncAdapters.Add(customerSyncAdapter);

        }

    }

    public class Utility

    {

        //Return the server connection string.

        public string ServerConnString

        {

           get { return @"Data Source= SQLServer\instance; Initial Catalog=SyncSamplesDb; User Id=xxxxxx; Password=xxxxxx"; }

        }

    }

}

Note: You need to update the connection string in the above Utility class to connect to your SQL Server.

Open Service1.svc.cs file in the project by double clicking on it:

[Screenshot: Service1.svc.cs opened from Solution Explorer]

and replace the existing code with the code below:

using System.Collections.ObjectModel;

using System.ServiceModel;

using System.Data;

using Microsoft.Synchronization;

using Microsoft.Synchronization.Data;

using Microsoft.Synchronization.Data.Server;

namespace WcfForSync

{

    // NOTE: If you change the class name "Service1" here, you must also update the reference to "Service1" in App.config.

    public class Service1 : IService1

    {

         private SampleServerSyncProvider _serverSyncProvider;

         public Service1()

        {

            this._serverSyncProvider = new SampleServerSyncProvider();

        }

        [System.Diagnostics.DebuggerNonUserCodeAttribute()]

        public virtual SyncContext ApplyChanges(SyncGroupMetadata groupMetadata, DataSet dataSet, SyncSession syncSession)

        {

            return this._serverSyncProvider.ApplyChanges(groupMetadata, dataSet, syncSession);

        }

        [System.Diagnostics.DebuggerNonUserCodeAttribute()]

        public virtual SyncContext GetChanges(SyncGroupMetadata groupMetadata, SyncSession syncSession)

        {

            return this._serverSyncProvider.GetChanges(groupMetadata, syncSession);

        }

        [System.Diagnostics.DebuggerNonUserCodeAttribute()]

        public virtual SyncSchema GetSchema(Collection<string> tableNames, SyncSession syncSession)

        {

            return this._serverSyncProvider.GetSchema(tableNames, syncSession);

        }

        [System.Diagnostics.DebuggerNonUserCodeAttribute()]

        public virtual SyncServerInfo GetServerInfo(SyncSession syncSession)

        {

            return this._serverSyncProvider.GetServerInfo(syncSession);

        }

    }

}

The application requires references to Microsoft.Synchronization.dll, Microsoft.Synchronization.Data.dll and Microsoft.Synchronization.Data.Server.dll. Right-click “References” in the project and click Add Reference…

Select Microsoft.Synchronization.dll (Location on my machine: C:\Program Files (x86)\Microsoft Sync Framework\v1.0\Runtime\x86)

[Screenshot: Add Reference dialog — Microsoft.Synchronization.dll]

Next, add references to the other two DLLs (location on my machine: C:\Program Files (x86)\Microsoft Sync Framework\v1.0\Runtime\ADO.NET\V2.0\x86)

[Screenshot: Add Reference dialog — Microsoft.Synchronization.Data.dll and Microsoft.Synchronization.Data.Server.dll]

Now you should be able to build the project successfully. Once the build succeeds, publish the WCF service to IIS. Go to the properties of the project and, on the Web tab, type the IIS server information (middletierserver), then click “Create Virtual Directory” as below:

[Screenshot: project properties, Web tab, with the Create Virtual Directory button]

From IIS you can see a Virtual directory has been created:

[Screenshot: WcfForSync virtual directory in IIS Manager]

If you browse to the WCF service (http://middletierserver/WcfForSync/Service1.svc) you should get the following page. I have used the same development machine for IIS, so it shows “localhost” in the URL.

[Screenshot: Service1 service page rendered in the browser]

Note:

When trying to browse the WCF service, I have noticed the following error on some IIS machines:

HTTP Error 404.3 – Not Found

The page you are requesting cannot be served because of the extension configuration. If the page is a script, add a handler. If the file should be downloaded, add a MIME map. (Detailed Error Information — Module: StaticFileModule.)

If you encounter this error, please take the necessary action as per the article at: http://blogs.msdn.com/rjohri/archive/2009/06/29/the-page-you-are-requesting-cannot-be-served-because-of-the-extension-configuration.aspx
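In my experience this 404.3 error usually means the WCF script maps are not registered with IIS. The linked article walks through the fix; the one-line registration I have used (path may differ depending on your .NET installation — treat this as an assumption, not the article's exact wording) is:

```
"%windir%\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\ServiceModelReg.exe" -i
```

Run it from an elevated command prompt, then browse the .svc file again.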

Developing client application and consuming WCF service:

Create a Console application in Visual Studio:

[Screenshot: New Project dialog with Console Application selected]

Once the project is created, reference the DLLs Microsoft.Synchronization.dll, Microsoft.Synchronization.Data.dll, and Microsoft.Synchronization.Data.SqlServerCe.dll as before. You also need to reference System.Data.SqlServerCe.dll (you should have this DLL on your machine once you have installed SQL Server Compact 3.5 Service Pack 1; location on my machine: C:\Program Files (x86)\Microsoft SQL Server Compact Edition\v3.5\Desktop, as shown below)

[Screenshot: Add Reference dialog — System.Data.SqlServerCe.dll]

Now we need to add a service reference to the WCF service that we developed and deployed before. Right-click “References” in this client project and select Add Service Reference…

[Screenshot: Add Service Reference context menu]

In the Add Service Reference screen, type the URL of the WCF service and click the Go button as shown below:

[Screenshot: Add Service Reference dialog with the service URL entered]

Click OK; it will create the service reference as below:

[Screenshot: ServiceReference1 added to the project in Solution Explorer]

Replace the code in the file Program.cs with the following code:

using System;

using System.IO;

using System.Data;

using System.Data.SqlClient;

using System.Data.SqlServerCe;

using Microsoft.Synchronization;

using Microsoft.Synchronization.Data;

using Microsoft.Synchronization.Data.SqlServerCe;

namespace ClientForSync

{

    class Program

    {

        static void Main(string[] args)

        {

            //The Utility class handles all functionality that is not

            //directly related to synchronization, such as holding connection

            //string information and making changes to the server database.

            Utility util = new Utility();

            //The SampleStats class handles information from the SyncStatistics

            //object that the Synchronize method returns.

            SampleStats sampleStats = new SampleStats();

            //Delete and re-create the database. The client synchronization

            //provider also enables you to create the client database

            //if it does not exist.

            ////util.SetClientPassword();

            util.RecreateClientDatabase();

            //Initial synchronization. Instantiate the SyncAgent

            //and call Synchronize.

            SampleSyncAgent sampleSyncAgent = new SampleSyncAgent();

            SyncStatistics syncStatistics = sampleSyncAgent.Synchronize();

            sampleStats.DisplayStats(syncStatistics, "initial");

            //Make changes on the server.

            util.MakeDataChangesOnServer();

            //Subsequent synchronization.

            syncStatistics = sampleSyncAgent.Synchronize();

            sampleStats.DisplayStats(syncStatistics, "subsequent");

            //Return server data back to its original state.

            util.CleanUpServer();

            //Exit.

            Console.Write("\nPress Enter to close the window.");

            Console.ReadLine();

        }

    }

    //Create a class that is derived from

    //Microsoft.Synchronization.SyncAgent.

    public class SampleSyncAgent : SyncAgent

    {

        public SampleSyncAgent()

        {

            //Instantiate a client synchronization provider and specify it

            //as the local provider for this synchronization agent.

            this.LocalProvider = new SampleClientSyncProvider();

            //The remote provider now references a proxy instead of directly referencing the server provider. The proxy is created by passing a reference to a WCF service

            ServiceReference1.Service1Client serviceProxy = new ServiceReference1.Service1Client();

            this.RemoteProvider = new ServerSyncProviderProxy(serviceProxy);

            //Add the Customer table: specify a synchronization direction of

            //DownloadOnly.

            SyncTable customerSyncTable = new SyncTable("Customer");

            customerSyncTable.CreationOption = TableCreationOption.DropExistingOrCreateNewTable;

            customerSyncTable.SyncDirection = SyncDirection.DownloadOnly;

            this.Configuration.SyncTables.Add(customerSyncTable);

        }

    }

    //Create a class that is derived from

    //Microsoft.Synchronization.Data.SqlServerCe.SqlCeClientSyncProvider.

    //You can just instantiate the provider directly and associate it

    //with the SyncAgent, but you could use this class to handle client

    //provider events and other client-side processing.

    public class SampleClientSyncProvider : SqlCeClientSyncProvider

    {

        public SampleClientSyncProvider()

        {

            //Specify a connection string for the sample client database.

            Utility util = new Utility();

            this.ConnectionString = util.ClientConnString;

        }

    }

    //Handle the statistics that are returned by the SyncAgent.

    public class SampleStats

    {

        public void DisplayStats(SyncStatistics syncStatistics, string syncType)

        {

            Console.WriteLine(String.Empty);

            if (syncType == "initial")

            {

                Console.WriteLine("****** Initial Synchronization ******");

            }

            else if (syncType == "subsequent")

            {

                Console.WriteLine("***** Subsequent Synchronization ****");

            }

            Console.WriteLine("Start Time: " + syncStatistics.SyncStartTime);

            Console.WriteLine("Total Changes Downloaded: " + syncStatistics.TotalChangesDownloaded);

            Console.WriteLine("Complete Time: " + syncStatistics.SyncCompleteTime);

            Console.WriteLine(String.Empty);

        }

    }

    public class Utility

    {

//Return the client connection string with the password. Don’t forget to create the folder first.

        public string ClientConnString

        {

get { return @"Data Source='D:\SyncServices\SyncSampleClient.sdf'; Password=xxxxxxx"; }

        }

        //Return the server connection string.

        public string ServerConnString

        {

            get { return @"Data Source= SQLServer\instance; Initial Catalog=SyncSamplesDb; User Id=xxxxxx; Password=xxxxxx"; }

        }

        //Make server changes that are synchronized on the second

        //synchronization.

        public void MakeDataChangesOnServer()

        {

            int rowCount = 0;

            using (SqlConnection serverConn = new SqlConnection(this.ServerConnString))

            {

                SqlCommand sqlCommand = serverConn.CreateCommand();

                sqlCommand.CommandText =

                    "INSERT INTO Sales.Customer (CustomerName, SalesPerson, CustomerType) " +

                    "VALUES ('Cycle Mart', 'James Bailey', 'Retail') " +

                    "UPDATE Sales.Customer " +

                    "SET  SalesPerson = 'James Bailey' " +

                    "WHERE CustomerName = 'Tandem Bicycle Store' " +

                    "DELETE FROM Sales.Customer WHERE CustomerName = 'Sharp Bikes'";

                serverConn.Open();

                rowCount = sqlCommand.ExecuteNonQuery();

                serverConn.Close();

            }

            Console.WriteLine("Rows inserted, updated, or deleted at the server: " + rowCount);

        }

        //Revert changes that were made during synchronization.

        public void CleanUpServer()

        {

            using (SqlConnection serverConn = new SqlConnection(this.ServerConnString))

            {

                SqlCommand sqlCommand = serverConn.CreateCommand();

                sqlCommand.CommandType = CommandType.StoredProcedure;

                sqlCommand.CommandText = "usp_InsertSampleData";

                serverConn.Open();

                sqlCommand.ExecuteNonQuery();

                serverConn.Close();

            }

        }

        //Delete the client database.

        public void RecreateClientDatabase()

        {

            using (SqlCeConnection clientConn = new SqlCeConnection(this.ClientConnString))

            {

                if (File.Exists(clientConn.Database))

                {

                    File.Delete(clientConn.Database);

                }

            }

            SqlCeEngine sqlCeEngine = new SqlCeEngine(this.ClientConnString);

            sqlCeEngine.CreateDatabase();

        }

    }

}

Now we have to update the connection string information in the code. In the Utility class, update the connection strings for the SQL Server and the client database (the SQL Server Compact database) in the methods below:

//Return the client connection string with the password. Don’t forget to create the folder first.

        public string ClientConnString

        {

get { return @"Data Source='D:\SyncServices\SyncSampleClient.sdf'; Password=xxxxxxx"; }

        }

        //Return the server connection string.

        public string ServerConnString

        {

            get { return @"Data Source= SQLServer\instance; Initial Catalog=SyncSamplesDb; User Id=xxxxxxx; Password=xxxxxxx"; }

        }

Build this client project; at this point the project should compile successfully.

Running/testing synchronization:

Run the Console application and you should see the result of the synchronization of this sample as below:

[Screenshot: console output of the initial and subsequent synchronizations]

You can verify the changes by setting a breakpoint in the client app and running a query on the Customer table in SyncSamplesDb; you can also open a SQL Profiler trace to see the activity in SQL Server.
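For example, with a breakpoint set right after MakeDataChangesOnServer(), a query like this on the server shows the rows the sample inserts, updates, and deletes:

```sql
-- Run against the server while the client is paused at a breakpoint:
SELECT CustomerName, SalesPerson, CustomerType
FROM SyncSamplesDb.Sales.Customer
WHERE CustomerName IN ('Cycle Mart', 'Tandem Bicycle Store', 'Sharp Bikes');
```

After the subsequent synchronization, the same rows should be reflected in the client's SyncSampleClient.sdf database.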

Happy Synchronization!

References:

Overview (Synchronization Services) : http://msdn.microsoft.com/en-us/library/bb726031(SQL.90).aspx

Architecture and Classes (Synchronization Services) : http://msdn.microsoft.com/en-us/library/bb726025(SQL.90).aspx

Getting Started: A Synchronization Services Application: http://msdn.microsoft.com/en-us/library/bb726015.aspx

How to: Configure N-Tier Synchronization at: http://msdn.microsoft.com/en-us/library/bb902831.aspx

A large scale implementation of Sync Service example at: http://blogs.msdn.com/sync/archive/2009/10/19/sharepoint-2010-now-integrates-microsoft-sync-framework.aspx

Microsoft Sync Framework 2.0 redistributable Package - http://www.microsoft.com/downloads/details.aspx?FamilyId=109DB36E-CDD0-4514-9FB5-B77D9CEA37F6&displaylang=en

 

Author : Faiz(MSFT), SQL Developer Engineer

Reviewed by : Enamul(MSFT), SQL Developer Technical Lead; Azim(MSFT), SQL Developer Technical Lead; Srini(MSFT), SQL Developer Engineer


Commonly used 32-bit CPU registers and their purpose


While debugging a dump, we commonly see various CPU registers. Each register has a different purpose. I am trying to put them together in one place for easy reference.

 

You can list all the registers in windbg with the “r” command:

[Screenshot: windbg "r" command output listing the registers]

 

Please note that any of these registers may be used as a general-purpose register at any time.

 

EAX: Arithmetic operations, I/O port access and interrupt calls. By convention, function return values are placed in EAX.

EDX: Arithmetic operations, I/O port access and interrupt calls. If a multiplication produces a result larger than a single register can hold, the most significant 32 bits are stored in EDX and the least significant 32 bits in EAX.

EBX: General-purpose register, traditionally used as a base pointer for memory access.

ECX: Used for loop counters. Also used for "this" pointer for a class

EIP: Instruction pointer. Points to the next instruction to execute.

ESP: Stack pointer. This points to the top of the stack.

EBP: Base/Frame pointer.
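To see how EBP, ESP and EIP interact, here is the standard 32-bit function prologue/epilogue (an assembly fragment; the local-variable size is illustrative):

```asm
push ebp          ; save the caller's frame pointer
mov  ebp, esp     ; EBP now anchors this frame (arguments above, locals below)
sub  esp, 10h     ; reserve 16 bytes for locals; ESP tracks the stack top
; ... function body ...
mov  esp, ebp     ; discard the locals
pop  ebp          ; restore the caller's frame pointer
ret               ; pop the return address off the stack into EIP
```

This is why, in a stack walk, EBP values chain from one frame to the next.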

 

Author : Enamul(MSFT), SQL Developer Technical Lead

How to enable the TDS parser to display TDS frames when SQL Server is listening on a port other than the default 1433


If you open a netmon trace in Netmon 3.4, you will see the TDS traffic nicely parsed for you. You will see a display similar to this:

[Screenshot: Netmon 3.4 trace with TDS frames parsed]

The parsing works nicely because SQL Server is listening on the default TCP port 1433. But if your SQL Server is listening on a different port (other than 1433), this TDS parsing won't work by default. You will see a display similar to this (no frames resolving to TDS):

[Screenshot: Netmon trace with no frames resolving to TDS]

To enable the parser to parse TDS traffic on a port other than 1433 (in this case, 63959), we need to take the following steps:

1. Click the "Parsers" tab in the Netmon UI. This displays the list of parsers installed. You will see tcp.npl and tds.npl along with several other parsers.

[Screenshot: Parsers tab in the Netmon UI]

2. Double-click tcp.npl and search for "1433"; you will land in a switch/case code block containing "case 1433". We need to include our port 63959 here. Just add a case statement above “case 1433” without any parsing code:

case 63959:

case 1433:

//TDS-parsing code goes here

 

This is what it looks like after the change:

 

[Screenshot: tcp.npl after adding the "case 63959:" statement]

3. Save your changes

4. Reload your netmon trace; now it should look like the following. With the change, the TDS parser resolves TDS traffic on this particular non-default port (63959).

[Screenshot: Netmon trace with TDS frames parsed on port 63959]

Author : Enamul(MSFT), SQL Developer Technical Lead

Configuration changes needed for running SSIS 2008/2008 R2 classes in .NET 4.0


Recently we came across a case where a customer was developing a .NET managed application using the SQL Server Integration Services (SSIS) object model. The managed application was behaving differently in different versions of Visual Studio (2008 vs. 2010). We finally figured out why. Rather than let the good research go to waste, we're posting the results here.

Quick background

The managed assemblies that are commonly used when programming Integration Services using the .NET Framework are:

Microsoft.SqlServer.ManagedDTS.dll
Microsoft.SqlServer.RuntimeWrapper.dll
Microsoft.SqlServer.PipelineHost.dll
Microsoft.SqlServer.PipelineWrapper.dll

These assemblies contain various namespaces, such as at http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.dts.runtime.aspx

Code Segment

[Screenshot: the C# code segment of the sample application]

Note: This example requires a reference to the Microsoft.SqlServer.ManagedDTS.dll and Microsoft.SqlServer.SQLTask.dll assemblies.

For reference: http://msdn.microsoft.com/en-us/library/ms345167.aspx
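The code segment above was published as a screenshot. As a stand-in, here is a minimal sketch in the spirit of the MSDN sample; the package path and the position of the Execute SQL Task within the package are hypothetical, not from the original post:

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Tasks.ExecuteSQLTask;

class Program
{
    static void Main()
    {
        Application app = new Application();
        // Hypothetical package containing an Execute SQL Task as its first executable.
        Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);
        TaskHost taskHost = (TaskHost)pkg.Executables[0];
        // Under .NET 4.0 without the <startup> configuration shown below,
        // this cast yields null instead of the task object.
        ExecuteSQLTask myExecuteSQLTask = taskHost.InnerObject as ExecuteSQLTask;
        Console.WriteLine(myExecuteSQLTask == null ? "null" : myExecuteSQLTask.SqlStatementSource);
    }
}
```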

Behavior Differences in VS2008 vs VS2010

When you run the above code in Visual Studio 2008 and in Visual Studio 2010, you will get different results (the value will be null for the object myExecuteSQLTask in VS 2010), as shown below:

VS 2008:

[Screenshot: VS 2008 debugger — myExecuteSQLTask holds the task object]

VS 2010:

[Screenshot: VS 2010 debugger — myExecuteSQLTask is null]

The Solution:

The reason for this behavior is a mismatch of the .NET Framework version between Visual Studio 2010 and the managed assemblies (Microsoft.SqlServer.ManagedDTS.dll, Microsoft.SqlServer.RuntimeWrapper.dll, Microsoft.SqlServer.PipelineHost.dll, Microsoft.SqlServer.PipelineWrapper.dll) that come with SQL Server 2008/2008 R2. Visual Studio 2010 uses .NET Framework 4.0, whereas those assemblies are compiled against .NET Framework 3.5.

The .NET application configuration file can be used to handle this. If your application is built with the .NET Framework 4 but has a dependency on a mixed-mode assembly built with an earlier version of the .NET Framework, use the <supportedRuntime> element in the configuration file. This element specifies which versions of the common language runtime the application supports. In addition, in the <startup> element of the configuration file, you must set the useLegacyV2RuntimeActivationPolicy attribute to true. Note, however, that setting this attribute to true means that all components built with earlier versions of the .NET Framework run under the .NET Framework 4 instead of the runtimes they were built with.

Reference: http://msdn.microsoft.com/en-us/library/bbx34a2h.aspx

Placing the following section in the configuration file of the application should take care of this issue:

<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0"/>
    <supportedRuntime version="v2.0.50727"/>
  </startup>
</configuration>

 

Author : Faiz(MSFT), SQL Developer Engineer; Enamul(MSFT), SQL Developer Technical Lead

Have you checked the MaximumErrorCount in your SSIS package?


Recently we came across a case where a customer was running a Data Flow Task that was executing to its completion without transferring any data. We finally figured out why. Rather than let the good research go to waste, we're posting the results here.

Source: SQLServer OLEDB Provider

Destination: SQLServer OLEDB Provider

Transform: Data Flow Task (DFT)

 

First we thought the source table was empty. But after querying the table, we found it had 300+ rows. After enabling SSIS logging and redeploying the package, we found that the package was encountering a datatype mismatch but, weirdly enough, was reporting successful completion. Then we found that the MaximumErrorCount for the package was set to 9999. This explains why the package was silently swallowing the error and reporting successful completion.

[Screenshot: package properties showing MaximumErrorCount set to 9999]

We changed that to 1 and the package stopped at the error as expected.
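If you would rather audit packages programmatically than through the designer, here is a hedged sketch using the SSIS object model (the package path is hypothetical; requires a reference to Microsoft.SqlServer.ManagedDTS.dll):

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;

class CheckMaxErrorCount
{
    static void Main()
    {
        Application app = new Application();
        // Hypothetical package path.
        Package pkg = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);
        Console.WriteLine("MaximumErrorCount = " + pkg.MaximumErrorCount);
        pkg.MaximumErrorCount = 1;   // fail fast instead of swallowing errors
        app.SaveToXml(@"C:\Packages\MyPackage.dtsx", pkg, null);
    }
}
```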

 

So, if your SSIS package shows any weird behavior, don't forget to check the MaximumErrorCount property!

 

Author : Enamul(MSFT), SQL Developer Technical Lead

Why an SSIS package runs a little slower on the first run


When a package is being loaded, the runtime enumerates all of the connection managers, tasks, and data flow components on the system. It does this by looking in certain directories (under <sql>\100\dts), and also by looking in the COM registry for classes that implement certain interfaces. Because this enumeration can take a while, we cache this information with the SSIS service.

This might explain the performance difference you’re seeing on the first run of the package.

 

Author : Enamul(MSFT), SQL Developer Technical Lead

SharePoint 2007 using ADFS v1 authentication integrated with Reporting Services 2008 : "Object Moved" error


If SharePoint 2007 is configured to use Web SSO authentication by using ADFS v1, the following error occurs when you click a Reporting Services 2008 report rendered in SharePoint integrated mode:

An unexpected error occured while connecting to the report server. Verify that the report server is available and configured for SharePoint integrated mode. --> The request failed with the error message:
--
<html><head><title>Object moved</title></head><body>
<h2>Object moved to <a href="http://<_ADFS_signin_link_>">here</a>.</h2>
</body></html>
--

 

Cause: Please note that this behavior is by design. The Reporting Services web part calls the “SharePoint proxy endpoints” (see http://msdn.microsoft.com/en-us/library/bb326209(SQL.90).aspx).
Requests from viewer and management pages are sent through these proxy endpoints. Neither code path forwards cookies. As a consequence, ADFS considers the call unauthenticated and redirects to the ADFS login page.

Resolution: To solve this issue, use SharePoint 2010 instead of SharePoint 2007. In SharePoint 2010, the RS webpart viewer and management pages send requests directly to the report server.
As Claims Based Authentication is not supported for SharePoint integration between SharePoint 2010 and Reporting Services 2008, you will also need to use Reporting Services 2008 R2.

 

Cédric Naudy

SQL Server Support.

Output parameter streaming feature in SQL Server Native Client 11


Recently we had a case where a customer wanted to stream an output parameter using SQL Server Native Client 11. Rather than let the good research go to waste, we're posting the steps here.

It is not optimal, and sometimes not even possible, to define a large buffer when the parameter being fetched from SQL Server is very large, because ODBC Driver Manager versions earlier than 3.8 do not support retrieving a large output parameter in small chunks over multiple calls.

Beginning with ODBC Driver Manager 3.8, SQL Server Native Client 11.0 supports a new feature called output parameter streaming. An application's memory footprint can be reduced by invoking SQLGetData with a small buffer multiple times to retrieve a large output parameter value. This feature is supported only by ODBC Driver Manager version 3.8 and SQL Server Native Client version 11.0 or higher.

Let's follow these steps to implement this feature in a sample application:

1. Download and install the latest version of the Windows platform SDK: http://msdn.microsoft.com/en-us/windows/bb980924

2. Install SQL Native Client 11.0 on the application machine or workstation - http://www.microsoft.com/downloads/en/details.aspx?FamilyID=6a04f16f-f6be-4f92-9c92-f7e5677d91f9

3. Create a sample table and import some binary data into SQL Server (for example, SQL Server 2008 or SQL Server 2008 R2):

CREATE TABLE TableImage(id int IDENTITY(1,1) PRIMARY KEY, Document varbinary(max))
GO
INSERT INTO TableImage(Document) SELECT * FROM OPENROWSET(BULK N'SomeImage.bmp', SINGLE_BLOB) AS I

4. Create a stored procedure that returns the image with a given ID:

CREATE PROCEDURE [dbo].[SP_TestOutputPara] @Param1 integer,

@Param2 VARBINARY(max) OUTPUT

AS

BEGIN

-- SET NOCOUNT ON added to prevent extra result sets from

-- interfering with SELECT statements.

SET NOCOUNT ON;

SELECT @Param2 = [Document] FROM [pubs].[dbo].[TableImage] WHERE [TableImage].[id] = @Param1

END

GO
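Before wiring up the ODBC code, you can test the procedure from a query window (this assumes the table from step 3 contains a row with id = 1):

```sql
-- Quick test of the stored procedure from T-SQL:
DECLARE @img varbinary(max);
EXEC dbo.SP_TestOutputPara @Param1 = 1, @Param2 = @img OUTPUT;
SELECT DATALENGTH(@img) AS ImageSizeInBytes;
```

The returned size should match the size of SomeImage.bmp.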

5. Define a DSN using SQL Server Native Client version 11.0 that points to the database hosting the table and stored procedure created in steps 3 and 4 above.

6. The sample code below shows how to bind the streamed output parameter and retrieve the large output parameter in multiple chunks using SQLGetData():

BOOL GetBinaryDataInChunks(SQLUINTEGER idOfPicture, SQLHSTMT hstmt)

{

SQLINTEGER lengthOfPicture=SQL_DATA_AT_EXEC;

SQLPOINTER ParamValuePtr = (SQLPOINTER) 2;

BYTE        smallBuffer[2048];   // A very small buffer.

SQLRETURN   retcode, retcode2;

// Bind the first parameter (input parameter)

retcode = SQLBindParameter(hstmt,

1,                                             // The first parameter.

SQL_PARAM_INPUT,         // Input parameter: The ID_of_picture.

SQL_C_ULONG,                 // The C Data Type of the ID.

SQL_INTEGER,                   // The Param-Type of the ID.

0,                                         // ColumnSize is ignored for integer.

0,                                         // DecimalDigits is ignored for integer.

&idOfPicture,                   // The Address of the buffer for the input parameter.

0,                                        // Buffer length is ignored for integer.

NULL);                                // This is ignored for integer.

if (retcode != SQL_SUCCESS)

return FALSE;

            // Bind the streamed output parameter.

            retcode = SQLBindParameter(hstmt,

  2,                                                                                                // The second parameter.

SQL_PARAM_OUTPUT_STREAM,                                        // A streamed output parameter.

  SQL_C_BINARY,                                                                       // The C Data Type of the picture. 

  SQL_VARBINARY,                                                                    // The Param-Type of the picture.

  0,                                                                                               // ColumnSize: The maximum size of varbinary(max).

  0,                                                                                               // DecimalDigits is ignored for binary type.

  (SQLPOINTER)2,                                                                     // ParameterValuePtr: An application-defined. token (this will be returned from SQLParamData).

                                                                   // In this example, we used the ordinal  of the parameter.

0,                                                                                               // This is ignored for streamed output parameters.

&lengthOfPicture);                                                                 // StrLen_or_IndPtr: The status variable returned.

if (retcode != SQL_SUCCESS)

Return FALSE;

   retcode = SQLPrepare(hstmt,(SQLCHAR*) "{call SP_TestOututPara(?, ?)}", SQL_NTS);

   if ( retcode == SQL_ERROR )

       return FALSE;

//Execute the stored procedure.

   retcode = SQLExecute(hstmt);

   if ( retcode == SQL_ERROR )

       return FALSE;

   // Assume that the retrieved picture exists. Use SQLBindCol or SQLGetData to retrieve the result-set.

   // Process the result set and move to the streamed output parameters.

         retcode = SQLMoreResults( hstmt );

   // SQLGetData retrieves and displays the picture in parts. The streamed output parameter is available.

   while (retcode == SQL_PARAM_DATA_AVAILABLE) 

   {

       SQLPOINTER token;                                                            // Output by SQLParamData.

       SQLINTEGER cbLeft;                                        // #bytes remained

       retcode = SQLParamData(hstmt, &token);

       if ( retcode == SQL_PARAM_DATA_AVAILABLE )

       {

           do

           {

               retcode2 = SQLGetData(hstmt,

    (UWORD) token,                            // the value of the token is the ordinal.

     SQL_C_BINARY,                            // The C-type of the picture.

    smallBuffer,                                   // A small buffer.

    sizeof(smallBuffer),                     // The size of the buffer.

    &cbLeft);                                       // How much data we can get.

  //Print the buffer.

cout << smallBuffer << "\n";

if (retcode2 == SQL_ERROR)

Return FALSE;

           }

           while ( retcode2 == SQL_SUCCESS_WITH_INFO );

       }

   }

   return TRUE;

}

Further references:

What are the new features in SQL Native Client Version 11.0 http://msdn.microsoft.com/en-us/library/cc280510(SQL.110).aspx

Retrieving Output parameters - http://msdn.microsoft.com/en-us/library/ms712625(VS.85).aspx

The ODBC 3.8 features supported in code name "Denali" or SNAC 11 - http://blogs.msdn.com/b/sqlnativeclient/archive/2010/11/16/sql-server-code-named-quot-denali-quot-native-client-supporting-odbc-3-8.aspx

 

Author : Srini(MSFT), SQL Developer Engineer

Reviewed by : Enamul(MSFT), SQL Developer Technical Lead


No Books Online Content “in the box” for the next version of SQL Server– Denali


 

The Denali discs will not contain any of the Books Online content; setup only installs the help viewer and a handful of utility .chm files. All of the BOL documentation will by default come from online (MSDN). With the Microsoft strategy moving to the cloud, we want to use the internet as the centre of gravity for product documentation. This encourages customers to download the latest version when they install, instead of installing stale content from disk that was locked down several months before general availability. We are one of the last few big Microsoft products to shift to the online model for content delivery.

Here’s some more information:

Q: Does this mean no more local BOL?

A: You can access BOL locally. On first use of Help, the user is presented with a dialog asking them to choose either online or offline mode as the default setting for Help Viewer. By selecting “Yes,” the viewer will be configured to run in Online mode against the copy of BOL in the MSDN Library. If you select “No,” local mode is made the default setting and you will then need to download the Help packages containing Books Online by selecting “Install content from online” in Help Library Manager.

Q: How do I get BOL locally if I don’t have internet access (such as a data centre)?

A: A CAB of the BOL content is available on the Microsoft Download Centre; it can be downloaded from a location that has internet access and then burnt to disc, copied to a portable drive, or placed on a network share.

Q: I have my Books Online on a network share, how do others access it now?

A: Users simply need to go to Start >> All Programs >> Microsoft SQL Server “Denali” >> Help and Community and click the “Manage Help Settings” shortcut. This launches Help Library Manager; from there, users click the “Install content from disc” link, browse to the network share with the BOL CABs, select the HelpContentSetup.msha file, select Books Online in the list of content available for install, then click Update. This installs the BOL content locally on the user’s machine.

Q: Why are you doing this again?

A:  Switching to an online solution provides more flexibility in providing customers with the most up-to-date content as we are no longer tied to the SQL Server release schedule.

  • New customers typically first go through the BOL on the disc after installing SQL Server, so they keep referring to increasingly outdated content. We have to freeze Books Online 2-3 months before the dev team freezes the code in order to complete the final translation work, test all the localized Books Online versions, and have them ready to ship at RTM. We still get feedback from the on-disc copy of the SQL Server 2005 BOL about issues we fixed 5 years ago in the first update. With the new viewer and its ease in delivering updates, we feel that many customers will be using more current information than if we kept shipping the content on the disc.
  • While we immediately start working on the first BOL update and publish it at the time customers can order the product, the copy of BOL on the product discs is already somewhat obsolete at RTM.
  • The number of customers who prefer online help keeps growing; those customers can skip installing the Books Online content.

Executing an INSERT statement on a View in linked server


Consider this scenario where you have a linked server from one SQL server to another SQL Server. Both the SQL Servers are SQL Server 2008 SP2 on Windows 2008.

Say the two SQL Servers are Server A and Server B.

Linked server from Server A to Server B is set up using SQL Native Client 10.0 provider.

On Server B, you have a VIEW that combines two tables, T1 and T2, and an INSTEAD OF trigger that performs an INSERT into T2 when you INSERT into the view.

SERVER B

---------------------

Create two tables, one view, and one INSTEAD OF trigger:

CREATE TABLE T1 (c1 INT)

GO

CREATE TABLE T2 (c2 INT)

GO

CREATE VIEW vt (cv) AS SELECT c1 FROM t1 UNION ALL SELECT c2 FROM t2

GO

The view conforms to the rules of an updateable view and a partitioned view.

CREATE VIEW

http://msdn.microsoft.com/en-us/library/ms187956.aspx

Create an INSTEAD OF TRIGGER to INSERT into physical table t2 when insert is fired against the VIEW.

CREATE TRIGGER vt_trig ON vt INSTEAD OF INSERT AS

BEGIN

INSERT INTO t2 (c2) SELECT i.cv FROM INSERTED i

END

SERVER A

---------------------

Create a linked server on Server A to Server B with the default SNAC provider and call it SNACLinked. Also create another linked server with MSDASQL and an ODBC DSN (set up the ODBC DSN using the SQLODBCsrv driver) and call it MSDASQL_SQL2008.

Set up linked server with MSDASQL and SQLODBC driver instead of SQLNCLI provider:

EXEC sp_addlinkedserver
@server = N'MSDASQL_SQL2008',
@srvproduct = N'',
@provider = N'MSDASQL',
@datasrc = N'sql2008'   -- name of the ODBC system DSN

In this linked server, this will work:
INSERT INTO MSDASQL_SQL2008.INST4.dbo.vt (cv) VALUES (111)

When you INSERT into the view from Server A using the linked server created with SNAC, you get the error below:

INSERT INTO SNACLinked.Test.dbo.vt(cv) values ('16')

ERROR:

OLE DB provider "SQLNCLI10" for linked server "XXXX" returned message "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done."

Msg 16955, Level 16, State 2, Line 1

Could not create an acceptable cursor.

Locally executing the INSERT on VIEW on Server A works fine.

INSERT into Test.dbo.vt(cv) values (9) – this executes fine.

But when using ODBC DSN for linked server, the INSERT on VIEW works fine from Server A also:

INSERT INTO MSDASQL_SQL2008.Test.dbo.vt(cv) values ('15') - works fine.

The behavior is the same when you execute INSERT statement with OPENQUERY. It fails with the SNAC linked server but works fine with ODBC DSN linked server.

INSERT OPENQUERY (SNACLinked, 'SELECT CV FROM TEST.dbo.vt')

VALUES ('16'); -- fails

INSERT OPENQUERY (MSDASQL_SQL2008, 'SELECT CV FROM TEST.dbo.vt')

VALUES ('16'); -- works fine

The INSERT would still fail with the same error when an indexed view with its own INSTEAD OF trigger is used in place of the plain view:

CREATE VIEW vt_indexed (cvindexed) with SchemaBinding AS

SELECT c1 FROM dbo.t1 UNION ALL SELECT c2 FROM dbo.t2

GO

CREATE TRIGGER vt_indexed_trig ON vt_indexed INSTEAD OF INSERT AS

BEGIN

INSERT INTO t2 (c2) SELECT i.cvindexed FROM INSERTED i

END

The XACT_ABORT SET option is set to ON for INSERT.

Why INSERT on VIEW with SQLNCLI fails

-------------------------------------------------------

SQL Server wants to do a rowset-based INSERT (cursor based INSERT) operation through the OLE DB API IRowsetChange interface.

SQL Server requests an Updateable rowset for the SELECT statement.

In the SQLNCLI case we are going directly to the OLE DB provider (SQLNCLI or SQLNCLI10).

In the SQLNCLI / OLE DB case we get a READ-ONLY cursor: the provider just returns an error about the cursor and does not proceed further. The SQL engine cannot handle this scenario and throws an error.

In the ODBC case, we are really going through two layers, MSDASQL and the ODBC driver, that interact with the SQL engine. Here the cursor gets downgraded to a read-only cursor (you can see the message "The cursor was not declared" in Profiler). However, MSDASQL has additional logic to simulate an updateable cursor: it indicates to the SQL engine that an updateable rowset is returned, so the SQL engine can continue with its logic.

Here is the Error message in Profiler:

Exception Error: 16955, Severity: 16, State: 2 Microsoft SQL Server test 2396 55 2011-02-15 07:35:59.663

User Error Message Could not create an acceptable cursor. Microsoft SQL Server test 2396 55 2011-02-15 07:35:59.663

Exception Error: 16945, Severity: 16, State: 2 Microsoft SQL Server test 2396 55 2011-02-15 07:35:59.663

User Error Message The cursor was not declared. Microsoft SQL Server test 2396 55 2011-02-15 07:35:59.663

Continuing with the MSDASQL case only: the SQL engine does a positioned update through IRowsetChange::InsertRow. MSDASQL responds by parsing the base table/view name and generating an INSERT statement, e.g. INSERT INTO Test.dbo.vt(c1) VALUES(?). Since there is a trigger on this read-only VIEW, the SQL engine handles the statement by executing the trigger, and everything works.

When opening an updateable rowset over a simple VIEW with a UNION, RowsetViewer tries two things:

declare @p1 int

set @p1=0

declare @p3 int

set @p3=98305

declare @p4 int

set @p4=311300

declare @p5 int

set @p5=0

exec sp_cursoropen @p1 output,N'select * from vt',@p3 output,@p4 output,@p5 output

select @p1, @p3, @p4, @p5

go

declare @p1 int

set @p1=180150009

declare @p3 int

set @p3=8

declare @p4 int

set @p4=1

declare @p5 int

set @p5=1

exec sp_cursoropen @p1 output,N'select * from vt',@p3 output,@p4 output,@p5 output

select @p1, @p3, @p4, @p5

go

Case 1, which fails:

scrollopt: 0x18001 = KEYSET_ACCEPTABLE, CHECK_ACCEPTED_TYPES, KEYSET

ccopt: 0x4C004 = OPTIMISTIC_ACCEPTABLE, UPDT_IN_PLACE, CHECK_ACCEPTED_OPTS, OPTIMISTIC

Case 2, which succeeds:

scrollopt: 8 = FORWARD_ONLY

ccopt: 1 = READ_ONLY

So in short:

- SQL Server requests an updateable cursor

- For most views, that fails

- MSDASQL is simulating an updateable cursor with insert statements.

So the workaround for SQLNCLI would be to use a stored procedure executed remotely:

CREATE PROCEDURE [dbo].[UpdateViewStoredProc] @Param1 int

AS

BEGIN

SET NOCOUNT ON;

INSERT into Test.dbo.vt(cv) values (@Param1)

END

Execute this stored procedure from remote linked server:

exec SNACLinked.Test.dbo.UpdateViewStoredProc 22

References

----------------

CREATE VIEW

http://msdn.microsoft.com/en-us/library/ms187956.aspx

sp_cursoropen (Transact-SQL)

http://msdn.microsoft.com/en-us/library/ff848737.aspx

 

Author: Aruna Koppanur (MSFT), SQL Developer Engineer.

ADO.NET application connecting to a mirrored SQL Server Database may timeout long before the actual connection timeout elapses, sometimes within milliseconds


Recently we had a few cases where .NET applications connecting to a mirrored SQL Server 2005 or SQL Server 2008 database through the ADO.NET SqlClient provider intermittently failed with the following error message:

System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

at System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject)

at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)

at System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject)

at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)

at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)

at System.Data.SqlClient.SqlConnection.Open()

A more detailed call stack, captured as we reproduced the issue, looks like the following:

Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.

   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)

   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning()

   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)

   at System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj)

   at System.Data.SqlClient.TdsParserStateObject.ReadNetworkPacket()

   at System.Data.SqlClient.TdsParser.ConsumePreLoginHandshake(Boolean encrypt, Boolean trustServerCert, Boolean& marsCapable)

   at System.Data.SqlClient.TdsParser.Connect(ServerInfo serverInfo, SqlInternalConnectionTds connHandler, Boolean ignoreSniOpenTimeout, Int64 timerExpire, Boolean encrypt, Boolean trustServerCert, Bo

olean integratedSecurity)

   at System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, TimeoutTimer timeout, SqlConnection owningObject)

   at System.Data.SqlClient.SqlInternalConnectionTds.LoginWithFailover(Boolean useFailoverHost, ServerInfo primaryServerInfo, String failoverHost, String newPassword, Boolean redirectedUserInstance, S

qlConnection owningObject, SqlConnectionString connectionOptions, TimeoutTimer timeout)

What makes this timeout interesting is that applications experience it long before the actual default connection timeout (15 seconds) elapses, sometimes within milliseconds.

The issue happens because SqlClient uses the connection retry algorithm (for TCP/IP connections) and gives the first connection attempt a time budget of 15 * 0.08 ≈ 1200 ms. It first tries to connect to the principal SQL Server and, if that fails, tries the mirror (the fact that the database is mirrored may come from the connection string, or SqlClient may collect this information the first time it connects to the database). If this second attempt also fails due to timeout (because of a slow response from SQL Server or network delays), SqlClient incorrectly sets the connection to a doomed state and raises the connection timeout exception without making any further connection attempt against the principal.
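The effect of the 0.08 first-attempt factor can be sketched numerically. This is an illustrative Python calculation only, based on the retry behavior described above; the function name is my own, not an API:

```python
def first_attempt_budget(login_timeout_s: float) -> float:
    """Time budget (in seconds) SqlClient gives the first connection
    attempt against the principal when a failover partner is involved."""
    return login_timeout_s * 0.08

# Default 15-second login timeout: only ~1.2 s for the first attempt.
print(round(first_attempt_budget(15), 2))

# Workaround value of 150 seconds: 12 s for the first attempt.
print(round(first_attempt_budget(150), 2))
```

This makes it clear why a slow network can burn through the default budget in a single attempt, and why raising the connection timeout to 150 seconds (as in workaround 1 below) gives the first attempt a comfortable 12 seconds.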

Microsoft has confirmed that this is a problem in the current release of ADO.NET. The issue will be fixed in the ADO.NET version that ships with Visual Studio 2011.

In the meantime, we suggest the following workarounds:

1. Increase the connection string timeout to 150 seconds. This gives the first attempt enough time to connect (150 * 0.08 = 12 sec).

2. Add Min Pool Size=20 to the connection string. This always maintains a minimum of 20 connections in the pool, so new connections are created less often, reducing the chance of this error.

3. Improve network performance. Update your NIC drivers and firmware to the latest versions. We have seen network latency when the NIC is not compatible with certain Scalable Networking Pack settings. If you are on Windows Vista SP1 or later, you may also consider disabling Receive Window Auto-Tuning. If you have NIC teaming enabled, disabling it would be a good option.

 

Author : Meer(MSFT), SQL Developer Escalation Services , Microsoft

Posted by : Enamul(MSFT), SQL Developer Technical Lead , Microsoft

App using SQLClient(.NET 4.0) will automatically construct an SPN when using shared memory with windows authentication


Consider the following scenario:

1) You have a .NET 4.0 application connecting to SQL Server 2005 or SQL Server 2008 (including SQL Server Express Edition) with ADO.NET using the SqlClient provider.

2) Only Shared Memory is enabled for protocols for SQL Server Instance

3) All client protocols are enabled for SQL Native Client and SQL Native Client 10.0.

4) The application is running on Windows 7 machine.

In this scenario, the application may fail to connect to SQL Server, with the following error message:

Login failed. The login is from an untrusted domain and cannot be used with Windows authentication. [CLIENT: <local machine>]

Additionally, you may also see the following error message in the application event log:

SSPI handshake failed with error code 0x8009030c, state 14 while establishing a connection with integrated security; the connection has been closed. Reason: AcceptSecurityContext failed. The Windows error code indicates the cause of failure. [CLIENT: <local machine>].

At first glance, it may seem that the issue is related to Kerberos, because most SSPI handshake error messages are due to Kerberos failures, which are most often related to a non-existent or bad SPN for SQL Server. And we would expect local connections to use NTLM, for which no SPN is required.

A reference discussion on the error “SSPI handshake failed with error code 0x8009030c” can be found in this forum post: http://social.msdn.microsoft.com/Forums/en-US/sqlsecurity/thread/c46b0257-7304-47dc-a8b5-090001ff70a5

While I will discuss how to resolve the “Login failed” issue, the main goal of this blog is to point out a breaking change in the ADO.NET SqlClient provider in .NET 4.0: SqlClient in a .NET 4.0 application will automatically construct an SPN for SQL Server connections over shared memory when Windows authentication is used. In the prior version, .NET 3.5 SP1, this was not the case; for such connections we would use NTLM directly. To comply with NTLM reflection protection and the Extended Protection improvements in Windows security, we now use the Negotiate SSP (http://msdn.microsoft.com/en-us/library/aa378748(VS.85).aspx) rather than NTLM directly, in addition to constructing an SPN for the SQL Server we connect to locally from ADO.NET applications. Authentication defaults are as detailed in the MSDN article http://msdn.microsoft.com/en-us/library/ms191153.aspx under the "Authentication Defaults" section.

In the above scenario, the “Login failed” error occurred because, when attempting a local connection over shared memory, SqlClient constructed an SPN for NTLM authentication, for the reasons explained above. Due to DNS resolution, the constructed SPN ended up different from the SPN registered in the domain controller (it got the wrong DNS suffix).
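The kind of mismatch described above can be illustrated with a small Python sketch. The host names here are hypothetical and the helper function is my own; the MSSQLSvc/&lt;FQDN&gt;:&lt;port&gt; form is the documented SPN format for SQL Server TCP connections:

```python
def build_spn(fqdn: str, port: int = 1433) -> str:
    """Construct a SQL Server SPN of the documented form MSSQLSvc/<FQDN>:<port>."""
    return f"MSSQLSvc/{fqdn}:{port}"

# SPN registered in Active Directory for the service account (hypothetical name):
registered = build_spn("sqlhost.corp.contoso.com")

# SPN the client builds after a DNS lookup that returns the wrong suffix:
constructed = build_spn("sqlhost.emea.contoso.com")

# The two do not match, so authentication against the registered SPN fails.
print(registered != constructed)
```

The fix, accordingly, is either to correct DNS resolution so the constructed and registered SPNs agree, or to apply one of the workarounds below.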

While correct DNS resolution is ultimately required to fix the issue (so that the correct SPN is constructed), you can use one of the following methods to work around it:

1) Create a named pipe alias for the SQL Server you are connecting to using SQL Native client 10.

2) Change the registry as per the KB article to include "BackConnectionHostNames" (Method 1 in the KB): http://support.microsoft.com/kb/926642

Method 1 (recommended): Create the Local Security Authority host names that can be referenced in an NTLM authentication request

To do this, follow these steps for all the nodes on the client computer:

1. Click Start, click Run, type regedit, and then click OK.

2. Locate and then click the following registry subkey: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0

3. Right-click MSV1_0, point to New, and then click Multi-String Value.

4. In the Name column, type BackConnectionHostNames, and then press ENTER.

5. Right-click BackConnectionHostNames, and then click Modify.

6. In the Value data box, type the CNAME or the DNS alias, that is used for the local shares on the computer, and then click OK.
Note Type each host name on a separate line.
Note If the BackConnectionHostNames registry entry exists as a REG_DWORD type, you have to delete the BackConnectionHostNames registry entry.

7. Exit Registry Editor, and then restart the computer.

MORE INFORMATION

Registering a Service Principal Name
http://msdn.microsoft.com/en-us/library/ms191153.aspx
Service Principal Name (SPN) Support in Client Connections
http://msdn.microsoft.com/en-us/library/cc280459(v=SQL.105).aspx
What SPN do I use and how does it get there?
http://blogs.msdn.com/b/psssql/archive/2010/03/09/what-spn-do-i-use-and-how-does-it-get-there.aspx

Error message when you try to access a server locally by using its FQDN or its CNAME alias after you install Windows Server 2003 Service Pack 1: "Access denied" or "No network provider accepted the given network path"

http://support.microsoft.com/kb/926642

Understanding Kerberos and NTLM authentication in SQL Server Connections

http://blogs.msdn.com/b/sql_protocols/archive/2006/12/02/understanding-kerberos-and-ntlm-authentication-in-sql-server-connections.aspx

Understanding the error message: “Login failed for user ''. The user is not associated with a trusted SQL Server connection.”

http://blogs.msdn.com/b/sql_protocols/archive/2008/05/03/understanding-the-error-message-login-failed-for-user-the-user-is-not-associated-with-a-trusted-sql-server-connection.aspx

Author : Meer(MSFT), SQL Developer Escalation Services , Microsoft

Posted by : Enamul(MSFT), SQL Developer Technical Lead , Microsoft

.NET application doesn’t connect to the SQL server after a Database mirroring failover


In this blog I will discuss a connection timeout issue that one of my customers was having after failing over a mirrored database. For a better understanding of the issue, let me cover some key terms and concepts of database mirroring before getting into the issue itself. Database mirroring was first introduced in SQL Server 2005 to increase database availability. It is a software solution and does not require any special hardware, unlike clustering. Database mirroring involves redoing every insert, update, and delete operation that occurs on the principal database onto the mirror database as quickly as possible. Redoing is accomplished by sending a stream of active transaction log records to the mirror server, which applies the log records to the mirror database, in sequence, as quickly as possible.

If a SQL Server database is configured with a mirror database and the principal database becomes unavailable for some reason, we can fail over to the mirror database either manually or automatically.

To support automatic failover, a database mirroring session must be configured in high-safety mode and also possess a third server instance, known as the witness. To know more about how SQL server mirroring works please read the following MS links.

Database Mirroring in SQL Server 2005

http://technet.microsoft.com/en-us/library/cc917680.aspx

How to: Configure a Database Mirroring Session (SQL Server Management Studio)

http://msdn.microsoft.com/en-us/library/ms188712.aspx

Now let’s get into the detail of the issue I want to discuss today. The customer was observing a timeout error from the application after shutting down the principal server. They had set up database mirroring in high-safety mode with automatic failover using a witness, and were specifying the Failover Partner keyword and a connection timeout of 10 seconds in the connection string. During testing, when they stopped the SQL Server service on the principal server, automatic failover worked like a charm; but when they took the principal server offline (by shutting down the server or disabling the network card), failover happened as expected (i.e. the mirror became the new principal), yet the application trying to connect to the mirrored database got the timeout error below.

Server Error in '/' Application.

A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - The wait operation timed out.)

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - The wait operation timed out.)

So, once again, the problem is this: if they stopped only the SQL Server service, mirroring worked as expected, but when they shut down the principal server itself, connections timed out and the application could not connect to the mirrored database as expected.

Without digging into this further, we asked the customer to increase the connection timeout to 60 seconds and test again. After the change, even after shutting down the server, the application was able to connect to the mirrored database successfully.

So we found a workaround for the issue pretty quickly, but we still needed to explain what was causing it and why increasing the connection timeout fixed it. To find the reason behind these different behaviors, we captured network traces in all three scenarios.

Scenario 1: We did a failover from the principal to the mirror by stopping only the SQL Server service. In this scenario the application was able to connect to the mirror database successfully.

1. To initiate a connection, the client sends a SYN to the principal, and the principal server sends an ACK, RST right away.

[Network trace screenshot]

2. As the client gets a response from the principal almost immediately, it tries to connect to the mirror and the connection succeeds.

[Network trace screenshot]

Scenario 2: We did a failover by shutting down the principal server and the connection timeout was set to 10 sec. In this scenario the application was failing to connect to the SQL server and displaying a timeout error.

1. To initiate a connection, the client sends a SYN to the principal and gets nothing back, because the principal is down.

2. So the client retransmits the SYN to the principal based on the value of the TcpMaxConnectRetransmissions registry key. TcpMaxConnectRetransmissions specifies how many times TCP retransmits an unanswered request for a new connection. TCP retransmits new connection requests until they are answered or until this value expires. The default value of TcpMaxConnectRetransmissions is 2.

TCP/IP adjusts the frequency of retransmissions over time. The delay between the original transmission and the first retransmission for each interface is determined by the value of the TcpInitialRTT entry. By default, it is three seconds. This delay doubles after each attempt. After the final attempt, TCP/IP waits for an interval equal to double the last delay, and then it abandons the connection request.

Please keep in mind that TcpMaxConnectRetransmissions is different from TcpMaxDataRetransmissions which specifies how many times TCP retransmits an unacknowledged data segment on an existing connection.

In the network trace we see the application server retransmitted the SYN after 3 seconds. TcpMaxConnectRetransmissions was set to two, so the next retransmit should come after 6 more seconds (3 + 6 = 9 seconds from the beginning of the connection), and after that TCP should wait another 6 * 2 = 12 seconds before the client tries to connect to the mirror database. The customer had set the connection timeout to 10 seconds. For some reason the second retransmit packet did not appear in the trace, but regardless, the connection timed out, and we saw no attempt in the trace to connect to the mirror database.

[Network trace screenshot]

Scenario 3: We did a failover by shutting down the principal database but the connection timeout in the connection string was increased from 10 seconds to 60 seconds. In this scenario the application was able to connect to the mirror database successfully.

1. To initiate a connection, the client sends a SYN to the principal and gets nothing back, because the principal is down.

2. So the client retransmits the SYN to the principal after 3 seconds. The default value for TcpMaxConnectRetransmissions is 2, so it should retransmit the SYN after 6 more seconds, then wait another 6 * 2 = 12 seconds, and only then try to connect to the mirror database.

3. For some reason we did not see the second SYN retransmit from the app server to the principal in the trace. However, as expected, exactly 6 + 12 = 18 seconds later the app server sent a SYN to the mirror server to initiate the connection. It got a response right away and the connection succeeded. From the beginning, it took a total of 3 + 6 + 12 = 21 seconds before the attempt to connect to the mirror database, which is less than the new connection timeout value of 60 seconds, and that is why we did not see a connection timeout this time.

[Network trace screenshot]

To recap: in the first scenario we did not see a timeout because, even though the SQL Server service was stopped, the server itself was up and sent a RESET to the client right away, so the application server was able to establish a connection with the mirror database within the 10-second connection timeout. In the second scenario we got the timeout because the principal server was shut down: the application server got no response at all from SQL Server and had to perform two retransmits (per the default TcpMaxConnectRetransmissions value) before it could try the mirror database. The 10-second connection timeout was too small; it expired during those retransmits to the principal, so the client never got a chance to try the mirror database.

In the 3rd scenario, because we increased the connection timeout, the application server was able to try to connect to the mirror database after it finished retransmitting the SYN packets to the principal server, and the connection was successful.

References:

How to: Configure a Database Mirroring Session (SQL Server Management Studio)

http://msdn.microsoft.com/en-us/library/ms188712.aspx

Making the Initial Connection to a Database Mirroring Session

http://msdn.microsoft.com/en-us/library/ms366348.aspx

Connection Retry Algorithm (for TCP/IP Connections)

http://msdn.microsoft.com/en-us/library/ms365783.aspx

TcpMaxConnectRetransmissions

http://technet.microsoft.com/en-us/library/cc758896(WS.10).aspx

TcpMaxDataRetransmissions

http://technet.microsoft.com/en-us/library/cc780586(WS.10).aspx

 

Author: MFarooq [MSFT]

Sending several files to several individuals using SSIS Send Email Task


Recently I got a case where the customer was trying to create an SSIS package to e-mail several files to several individuals, and he wanted to know if there is any built-in task in SSIS that can do this out of the box. He was saving a few files in a folder and wanted to send the files from that folder that start with a certain series of characters (wildcards) as attachments to the emails. He was using SQL Server and SSIS 2008 R2.

After some research I was certain that there is no built-in task in SSIS that can achieve this directly, so I started to work on alternate solutions using the existing tasks that we have in SSIS. We came up with a solution, and I thought there are other users out there who may need to do something similar from an SSIS package. So in this blog I will list the detailed steps (with a lot of screenshots) of the solution that we provided to that customer. In summary, we suggested the following as an alternate way to achieve the customer's goal.

I) Using a Foreach Loop container, loop over all the files in a folder, saving each file name into a variable "MyFile".

II) Inside the loop, use a Script Task to append the current file name to a global string variable "MyFiles", concatenated with the | character, building a delimited list of individual files.

III) Then, outside the loop (after it), use a Send Mail Task with an expression set on the FileAttachments property to point to @[User::MyFiles].

Later I made a sample SSIS package implementing the above steps and it worked like a charm. In the package I used a Foreach Loop Container and a Script Task to select multiple attachments from a folder using the wildcard, and then used a Send Email Task to send the email to multiple recipients with the attachments. Below I list the steps with screenshots.

1. Open a new Integration Services Project under Business Intelligence Projects in Visual Studio 2008 and name it SendEmailTest (or any name that you want).

2. Drag and drop a Foreach Loop Container.

3. Double click Foreach Loop Container task, go to the Collection Tab and do the following

a. Select “Foreach File Enumerator” in the Enumerator field

b. Click the browse button and navigate to the folder location where you have the files that you want to attach

c. Type the wild card string value as per your requirement

Example: Att*.txt → this will loop through all the files that start with "Att" and have the extension ".txt".

After doing the above steps the Foreach Loop Editor should look as below.

 clip_image005
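The Foreach File Enumerator's wildcard follows the usual file-glob rules; here is a quick illustration of the pattern semantics (Python's fnmatch is used purely for demonstration, with hypothetical file names; it is not anything SSIS itself runs):

```python
from fnmatch import fnmatch

# "Att*.txt" matches any file name that starts with "Att" and ends in ".txt"
pattern = "Att*.txt"
candidates = ["Att1.txt", "Attachment_Jan.txt", "Report.txt", "Att2.csv"]
matches = [name for name in candidates if fnmatch(name, pattern)]
print(matches)  # ['Att1.txt', 'Attachment_Jan.txt']
```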

4. In the Foreach Loop Editor window, select the Variable Mappings tab, then click the dropdown under the Variable column and select <New Variable>. This will give you the "Add Variable" screen.

 clip_image008

5. In the “Add Variable” screen name the variable as “MyFile” and also select the other options as below.

 clip_image011

6. Now click OK and you should see the variable “MyFile” added in the Foreach Loop Container as below.

 clip_image014

7. Click OK in the Foreach Loop Editor window, then drag and drop a Script Task and place it inside the Foreach Loop Container.

clip_image016

8. From the top menu bar select SSIS → Variables.

 clip_image019

9. In the Variables pane you should already see the variable we added earlier, "MyFile". Type the name of another new variable, "MyFiles", and select the other options as below.

 
clip_image022

10. Double click the Script Task to get the Script Task Editor window and click the browse button next to “ReadOnlyVariables”.

 clip_image025

11. Once you click the browse button in the Script Task Editor you will get the following Select Variables window.

 clip_image028

12. Check the variable "MyFile" and then click OK.

13. Repeat steps 10 and 11 and select the variable “MyFiles” for the field “ReadWriteVariables”. After the selections the Script Task Editor should look as below.

 clip_image032

14. Now, from the Script Task Editor window, click the "Edit Script" button, add the following code in the Main() function, and then click OK to complete editing the Script Task.

Note: The commented lines are for testing or debugging. I left them in, in case you start to see issues and need to debug.

public void Main()
{
    // TODO: Add your code here
    //Dts.Log("entering SCRIPT TASK.. ", 999, null);

    Dts.Variables["User::MyFiles"].Value = Dts.Variables["User::MyFiles"].Value.ToString() + "|" + Dts.Variables["User::MyFile"].Value.ToString();

    //MessageBox.Show(Dts.Variables["User::MyFile"].Value.ToString());
    //MessageBox.Show(Dts.Variables["User::MyFiles"].Value.ToString());

    Dts.TaskResult = (int)ScriptResults.Success;
}

So the Foreach loop will loop through the files in the specified folder, and the Script Task will concatenate the file names in the "MyFiles" variable. Now we need to add a Send Email Task and use the MyFiles variable to attach all the files.
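The loop-plus-concatenation the package performs can be illustrated outside SSIS (a Python sketch of the same idea; the package itself uses the C# Script Task shown above, and the file names here are hypothetical):

```python
# Simulate the Foreach Loop + Script Task: build a "|"-delimited list of
# attachment paths, just as the MyFiles variable accumulates them.
file_names = ["Att1.txt", "Att2.txt", "Att3.txt"]  # hypothetical loop output

my_files = ""                      # the MyFiles variable starts empty
for my_file in file_names:         # each pass of the Foreach Loop
    my_files = my_files + "|" + my_file

# "|" is the delimiter the FileAttachments property expects between
# attachment paths; note the leading "|" the concatenation produces.
print(my_files)  # |Att1.txt|Att2.txt|Att3.txt
```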

15. Before we can add a Send Email Task, we need to add an SMTP connection manager. Right-click in the Connection Managers pane at the bottom and select "New Connection".

 
clip_image034

16. From the Add SSIS Connection Manager window, select SMTP and click the "Add" button. You should get the following SMTP Connection Manager Editor window.

 clip_image039

17. Specify your SMTP server and make sure it is a valid SMTP server; you need the SMTP service enabled on your machine. Click OK to complete the SMTP Connection Manager configuration.

18. Drag and drop a Send Email Task outside the Foreach Loop Container.

clip_image041

19. Double-click the Send Email Task to get the Send Email Task Editor. Select the newly created SMTP Connection Manager from the dropdown. Type the From email address, and then type the email addresses of the recipients, separated by semicolons, in the To field.

 clip_image044

20. Click the Expressions tab on the left and set the value of the FileAttachments field under Expression to @[User::MyFiles]. This will attach to the email all the files that we looped through in the Foreach loop.

 clip_image047

21. Click OK to complete the Send Email Task settings, and run the package.

Reference:

Configure SMTP E-mail (IIS 7)

http://technet.microsoft.com/en-us/library/cc772058(WS.10).aspx

Note:

The SMTP server is not installed by default. SMTP can be added through the Features Summary area of the Server Manager tool in Windows Server® 2008.

Send Mail Task

http://msdn.microsoft.com/en-us/library/ms142165.aspx

SMTP Connection Manager

http://msdn.microsoft.com/en-us/library/ms137684.aspx

 

Author: MFarooq [MSFT]

Setting up database mirroring with certificates


We use certificates when setting up database mirroring between two partners that are in a workgroup or in non-trusted domains. The idea is to create a certificate on each partner, export it to the other, and then set up a login to use that certificate. As explained in BOL here, this is called setting up inbound and outbound connections.

Here is a simplified representation of how it needs to be setup

clip_image001[8]
 


   

If either of these is not set up correctly, you can get a variety of error messages like these:

Msg 1431, Level 16, State 4, Line 1

Neither the partner nor the witness server instance for database "TEST2" is available. Reissue the command when at least one of the instances becomes available.

 

Error: 1438, Severity: 16, State: 1.

The server instance Partner rejected configure request; read its error log file for more information. The reason 1405, and state 2, can be of use for diagnostics by Microsoft. This is a transient error hence retrying the request is likely to succeed. Correct the cause if any and retry.

 

Error: 1405, Severity: 16, State: 2

 

Apart from the blog post above, you can refer to Bemis 2189705 for a step-by-step approach to setting up database mirroring with certificates. The steps consist of the following:

1. Set up outbound connections: create the certificate, create the endpoint (with the certificate in the AUTHENTICATION clause), and then back up the certificate.

2. Set up inbound connections: restore the certificate from the partner, associate it with a login, and grant that login CONNECT on the endpoint.

3. Run the ALTER DATABASE statements, starting with the mirror server first and then the principal.

 

In order to avoid confusion, be sure to use two separate local accounts on each partner and name them with the prefix of the machine name for the other partner.

Also, you may see the error messages described at http://connect.microsoft.com/SQLServer/feedback/details/343027/database-mirroring-gui-does-not-work-and-throws-fqdn-error while trying to use the GUI to set up database mirroring on SQL Server 2005. In other words, setting up mirroring using certificates through SSMS is not possible in Yukon (SQL Server 2005) SSMS, but is possible in Katmai (SQL Server 2008) and above.

Example Scripts that you can use in your environment

--1.Setup Outbound connections:- Consists of creating the certificate, the endpoint ( with the certificate in the AUTHENTICATION clause)

--and then backing up the certificate

USE master

CREATE CERTIFICATE HOST_PRINCIPAL_cert

WITH SUBJECT = 'HOST_PRINCIPAL certificate',

START_DATE = '08/19/2011'

GO

 

CREATE ENDPOINT Endpoint_Mirroring

STATE = STARTED

AS TCP ( LISTENER_PORT=5022, LISTENER_IP = ALL)

FOR DATABASE_MIRRORING (

AUTHENTICATION = CERTIFICATE HOST_PRINCIPAL_cert

, ROLE = ALL);

GO

 

BACKUP CERTIFICATE HOST_PRINCIPAL_cert TO FILE = 'C:\temp\HOST_PRINCIPAL_cert.cer';

GO

--2. Copy this certificate to the Mirror machine

--3.Setup Outbound connections:- Consists of creating the certificate, the endpoint ( with the certificate in the AUTHENTICATION clause)

--and then backing up the certificate

CREATE CERTIFICATE HOST_MIRROR_cert

WITH SUBJECT = 'HOST_MIRROR_certificate',

START_DATE ='08/19/2011'

GO

 

CREATE ENDPOINT Endpoint_Mirroring

STATE = STARTED

AS TCP (

LISTENER_PORT=5022

, LISTENER_IP = ALL

)

FOR DATABASE_MIRRORING (

AUTHENTICATION = CERTIFICATE HOST_MIRROR_cert

, ROLE = ALL

);

GO

BACKUP CERTIFICATE HOST_MIRROR_cert TO FILE = 'C:\temp\HOST_MIRROR_cert.cer';

GO

--4. Copy this certificate to the Principal machine

--5.Setup Inbound connections:-  Consists of restoring the certificate from the partner, associating it with a login and granting that login connect

--on the endpoint

--Create the login for the Mirror machine and associate the mirror cert with the login

USE master;

CREATE LOGIN HOST_MIRROR_login WITH PASSWORD = '1Sample_Strong_Password!@#';

GO

CREATE USER HOST_MIRROR_user FOR LOGIN HOST_MIRROR_login;

GO

--Associate the certificate with the user.

CREATE CERTIFICATE HOST_MIRROR_cert

AUTHORIZATION HOST_MIRROR_user

FROM FILE = 'C:\temp\HOST_MIRROR_cert.cer'

GO

--Grant connect on the endpoint to the login

GRANT CONNECT ON ENDPOINT::Endpoint_Mirroring TO [HOST_MIRROR_login];

GO

--6.Setup Inbound connections:-  Consists of restoring the certificate from the partner, associating it with a login and granting that login connect

--on the endpoint

--Create the login for the Principal machine and associate the principal cert with the login

USE master;

CREATE LOGIN HOST_PRINCIPAL_login WITH PASSWORD = '=Sample#2_Strong_Password2';

GO

 

CREATE USER HOST_PRINCIPAL_user FOR LOGIN HOST_PRINCIPAL_login;

GO

--Associate the certificate with the user.

CREATE CERTIFICATE HOST_PRINCIPAL_cert

AUTHORIZATION HOST_PRINCIPAL_user

FROM FILE = 'C:\temp\HOST_PRINCIPAL_cert.cer'

GO

--Grant connect on the endpoint to the login

GRANT CONNECT ON ENDPOINT::Endpoint_Mirroring TO [HOST_PRINCIPAL_login];

GO

 

Here is a list of common troubleshooting steps to try when you are stuck with these error messages:

1. Verify that telnet to the port works correctly from each machine to the other.

2. Verify that the output of netstat -abn shows the ports are open on both sides as expected:

Principal

TCP    0.0.0.0:5022           0.0.0.0:0              LISTENING       5068

  [sqlservr.exe]

Mirror

  TCP    0.0.0.0:5023           0.0.0.0:0              LISTENING       5052

  [sqlservr.exe]

 

4. Check that the output of sys.database_mirroring_endpoints is identical on both sides:

End_Mirroring     65545 273   2     TCP   4     DATABASE_MIRRORING      0     STARTED      0     3     ALL   1     4     CERTIFICATE 258   1     RC4

End_Mirroring     65545 273   2     TCP   4     DATABASE_MIRRORING      0     STARTED      0     3     ALL   1     4     CERTIFICATE 258   1     RC4

5. Run the metadata check query from BOL and verify that the respective logins have the correct CONNECT permissions on each partner:

SELECT 'Metadata Check';

SELECT EP.name, SP.STATE,

   CONVERT(nvarchar(38), suser_name(SP.grantor_principal_id))

      AS GRANTOR,

   SP.TYPE AS PERMISSION,

   CONVERT(nvarchar(46),suser_name(SP.grantee_principal_id))

      AS GRANTEE

   FROM sys.server_permissions SP , sys.endpoints EP

   WHERE SP.major_id = EP.endpoint_id

   ORDER BY Permission,grantor, grantee;

GO

 

6. Try removing the ENCRYPTION clause from the endpoint definition and recreating it. This is relevant when SQL Server is hosted on a virtual machine in a VMware environment.

7. Try dropping and recreating the local account on each side that is being used for database mirroring. This is only required if you are using local Windows accounts and have moved the database from one server to another with the same local account existing on both machines, or have moved to another domain where the same domain user exists as in the previous domain. This is similar to the orphaned-logins issue with SQL authentication for database mirroring.

 

Rohit Nayak

Sr. Support Engineer – Sql Server CTS


How to get "Microsoft.sqlserver.msxml6_interop.dll" without buying SQL Server 2008


The other day I was working with a customer who was in the process of developing an SSIS "Control Flow" custom component. He was having an issue compiling/building his code in Visual Studio: the build process warned about an indirect dependency on a .NET Framework assembly due to SSIS references.

Error:

The primary reference "Microsoft.SQLServer.ManagedDTS, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91, processorArchitecture=MSIL" could not be resolved because it has an indirect dependency on the .NET Framework assembly "mscorlib, Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" which has a higher version "2.0.3600.0" than the version "2.0.0.0" in the current target framework.

The cause and resolution of this error are explained by our escalation engineer Jason at: http://blogs.msdn.com/b/jason_howell/archive/2010/08/18/visual-studio-2010-solution-build-process-give-a-warning-about-indirect-dependency-on-the-net-framework-assembly-due-to-ssis-references.aspx

As per the article, this problem affects machines where SQL Server 2005 was once installed. Even if SQL Server 2008 or SQL Server 2008 R2 is currently installed, if at some point SQL Server 2005 was installed in the past, the problem assembly may exist in the Global Assembly Cache (GAC). The problem is related to the way assemblies reference the SQL Server 2005 version of Microsoft.SQLServer.msxml6_interop.dll. That interop assembly in turn references mscorlib.dll version 2.0.3600, which is the prerelease Beta 2 of .NET 2.0.
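The conflict comes down to an ordinary component-by-component version comparison (an illustrative Python check; the build system does the equivalent when resolving references):

```python
# mscorlib version referenced by the SQL Server 2005 interop assembly (Beta 2)
prerelease = (2, 0, 3600, 0)
# mscorlib version in the .NET 2.0 RTM target framework
rtm = (2, 0, 0, 0)

# Compared component by component, 2.0.3600.0 is "higher" than 2.0.0.0,
# which is exactly what the build error complains about.
print(prerelease > rtm)  # True
```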

The resolution of this issue is to update the version of Microsoft.SQLServer.msxml6_interop.dll in the GAC with a copy of the file from a SQL Server 2008 installation that has the correct references to .NET 2.0 RTM (i.e., without the beta reference). My customer did not have SQL Server 2008 and was not ready to buy SQL Server 2008 only to get this DLL.

I am going to talk about how you can get Microsoft.SQLServer.msxml6_interop.dll without buying SQL Server 2008 or 2008 R2.

SQL Server 2008 Management Studio Express, which is a free download, has this DLL. You can get it by following the steps below:

a) Install SQL Server 2008 Management Studio Express on a machine where you do not have SQL Server 2005 installed. Management Studio Express can be found at: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=7593.

b) This will install Microsoft.SqlServer.Msxml6_interop.dll into GAC.

c) If you want a copy of this DLL, you can copy it out of the GAC using the command:

xcopy c:\windows\assembly\*Microsoft.SQLServer.msxml6_interop.dll* C:\GACCopy\ /s /r

d) The dll will be in the folder at: C:\GACCopy\GAC_MSIL\Microsoft.SqlServer.Msxml6_interop\6.0.0.0__89845dcd8080cc91

Install this copy of the DLL into the GAC on the machine where Visual Studio is used to build the solutions. If you have any issues installing the DLL into the GAC, Jason's article has some tips.

 

Author : Faiz (MSFT), SQL Developer Engineer

Use Existing MSDN C++ ODBC Samples for Microsoft Linux ODBC Driver


By Gregory Suarez | Sr. Escalation Engineer | SQL Server

 

The CTP release of the Microsoft SQL Server ODBC Driver for Linux (http://www.microsoft.com/download/en/details.aspx?id=28160) opens many opportunities for Red Hat Enterprise Linux (RHEL 5.x) customers who want to access the power of Microsoft SQL Server.

Well-known utilities such as BCP (bulk copy) and SQLCMD are provided with the driver and their use is fairly straightforward; however, some customers have mentioned the absence of an SDK and C/C++ samples. These features are slated to be available shortly after the product RTMs at the end of March 2012, but in the meantime you can use the existing MSDN ODBC C/C++ samples to get started.

The Microsoft ODBC Linux driver shares a common code base with its Windows ODBC counterpart and has been regression tested using many of the same test suites. With that said, many of the existing MSDN ODBC C/C++ console samples should compile on Linux with only minor modifications.

To get you started, I'll use the SQLBindCol sample found here: http://msdn.microsoft.com/en-us/library/windows/desktop/ms711010%28v=vs.85%29.aspx

The vast majority of Linux users will retrieve the data as SQLCHAR and will use printf. This is different from typical Windows usage, which is why I decided to use single-byte character data below. Note that the driver will return the SQLCHAR data as UTF-8.
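Because the driver returns SQLCHAR data as UTF-8, byte counts and character counts can differ for non-ASCII text; a small illustration (shown in Python with a hypothetical name, since the C sample below just prints the bytes as-is):

```python
# Bytes as they would arrive in a SQLCHAR buffer from the driver
raw = b"Jos\xc3\xa9"            # the name "Jose" with an accented e, in UTF-8

text = raw.decode("utf-8")       # decode only if you need code points
print(text)
print(len(raw), len(text))       # 5 bytes on the wire, 4 characters decoded
```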

Here is a list of the changes made:

1. Removed unneeded header files, in particular the <windows.h> header.
2. Replaced SQLWCHAR with SQLCHAR.
3. Replaced wprintf with printf.

 


Here is the final source ready to be compiled on RHEL:

 

#include <stdio.h>
#include <stdlib.h>
#include <sqlext.h>
#include <sql.h>

#define NAME_LEN 50
#define PHONE_LEN 20

void show_error() {
   printf("error\n");
}

int main() {
   SQLHENV henv;
   SQLHDBC hdbc;
   SQLHSTMT hstmt = 0;
   SQLRETURN retcode;
   SQLCHAR szName[NAME_LEN], szPhone[PHONE_LEN], sCustID[NAME_LEN];
   SQLLEN cbName = 0, cbCustID = 0, cbPhone = 0;

   // Allocate environment handle
   retcode = SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &henv);

   // Set the ODBC version environment attribute
   if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {
      retcode = SQLSetEnvAttr(henv, SQL_ATTR_ODBC_VERSION, (SQLPOINTER*)SQL_OV_ODBC3, 0);

      // Allocate connection handle
      if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {
         retcode = SQLAllocHandle(SQL_HANDLE_DBC, henv, &hdbc);

         // Set login timeout to 5 seconds
         if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {
            SQLSetConnectAttr(hdbc, SQL_LOGIN_TIMEOUT, (SQLPOINTER)5, 0);

            // Connect to data source
            retcode = SQLConnect(hdbc, (SQLCHAR*) "SQLCMD", SQL_NTS, (SQLCHAR*) "Test1", 5, (SQLCHAR*) "Password1", 9);

            // Allocate statement handle
            if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {
               retcode = SQLAllocHandle(SQL_HANDLE_STMT, hdbc, &hstmt);

               retcode = SQLExecDirect (hstmt, (SQLCHAR *) "SELECT CustomerID, ContactName, Phone FROM CUSTOMERS ORDER BY 2, 1, 3", SQL_NTS);
               if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {

                  // Bind columns 1, 2, and 3
                  retcode = SQLBindCol(hstmt, 1, SQL_C_CHAR, sCustID, 100, &cbCustID);
                  retcode = SQLBindCol(hstmt, 2, SQL_C_CHAR, szName, NAME_LEN, &cbName);
                  retcode = SQLBindCol(hstmt, 3, SQL_C_CHAR, szPhone, PHONE_LEN, &cbPhone);

                  // Fetch and print each row of data. On an error, display a message and exit.
                  for (int i=0 ; ; i++) {
                     retcode = SQLFetch(hstmt);
                     if (retcode == SQL_ERROR || retcode == SQL_SUCCESS_WITH_INFO)
                        show_error();
                     if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO)
                        printf( "%d: %s %s %s\n", i + 1, sCustID, szName, szPhone);
                     else
                        break;
                  }
               }

               // Process data
               if (retcode == SQL_SUCCESS || retcode == SQL_SUCCESS_WITH_INFO) {
                  SQLCancel(hstmt);
                  SQLFreeHandle(SQL_HANDLE_STMT, hstmt);
               }

               SQLDisconnect(hdbc);
            }

            SQLFreeHandle(SQL_HANDLE_DBC, hdbc);
         }
      }
      SQLFreeHandle(SQL_HANDLE_ENV, henv);
   }
}

 

Public ODBC headers are installed with unixODBC 2.3.0 and are typically included in the /usr/include/odbc directory. 

Be sure your IDE is configured correctly to find the headers, and also be sure to link to the ODBC library using the -lodbc switch.

I was able to compile the program above using the following command line options:

cc -m64 -g -I/usr/include -L/usr/lib -lodbc -o SQLBindColtest SQLBindColtest.c

Good luck, and I hope this helps you get started writing C/C++ applications that use the Microsoft Linux ODBC driver.

Use Microsoft Linux ODBC Driver and Linked Server to access OLEDB Data sources on Remote Systems


By Gregory Suarez | Sr. Escalation Engineer | SQL Server

I was recently working with one of our customers when he indicated it would be great if the Microsoft Linux ODBC driver could be used to access his other database systems, in addition to Microsoft SQL Server. Apparently he liked the driver so much that he saw a single database client stack as a way to simplify administration, reduce the memory footprint of his client application, and further increase performance and throughput.

At the time I didn't think much of this, but shortly after his comment I realized this is something that's easily accomplished. After all, the driver is similar to what we currently have running on the Windows platform. As long as an appropriate OLE DB provider is configured as a linked server within SQL Server, everything should be good to go.

A linked server configuration enables SQL Server to execute commands against OLE DB data sources on remote servers. Linked servers offer the following advantages:

·         Remote server access.

·         The ability to issue distributed queries, updates, commands, and transactions on heterogeneous data sources across the enterprise.

·         The ability to address diverse data sources similarly.

* Note: the Microsoft Linux ODBC driver does not support distributed transactions.

See the following article for more details concerning linked servers: http://msdn.microsoft.com/en-us/library/ms188279.aspx

As a test, I decided to use the following components:

1.       Redhat Enterprise Linux 5.x client configured with Microsoft Linux ODBC Driver.

2.       SQL Server 2008 R2 configured with a linked server to IBM’s DB2 (using the Microsoft OLEDB Provider for DB2 )

3.       Sun’s Solaris 11 OS configured with IBM DB2 Version 9.7 Server (x64)


The goal is simply to return results from an IBM DB2 9.7 system running on a Solaris 11 Unix system to a Redhat Linux workstation configured with the Microsoft Linux ODBC driver.  Of course, Microsoft SQL Server is sitting in the middle.

From the Redhat Linux workstation, I created and executed the following script to create the linked server:

 createLinkedServer.sql

IF  EXISTS (SELECT srv.name FROM sys.servers srv WHERE srv.server_id != 0 AND srv.name = N'SOLARIS2')EXEC master.dbo.sp_dropserver @server=N'SOLARIS2', @droplogins='droplogins'

GO
EXEC master.dbo.sp_addlinkedserver @server = N'SOLARIS2', @srvproduct=N'DB2OLEDB', @provider=N'DB2OLEDB', @datasrc=N'Solaris2', @provstr=N'Provider=DB2OLEDB;User ID=gregorys;Password=password1;Initial Catalog=TEST;Network Transport Library=TCPIP;Host CCSID=1252;PC Code Page=1252;Network Address=65.53.9.96;Network Port=50000;Package Collection=TEST;Default Schema=DB2INST1;Process Binary as Character=False;Units of Work=RUW;DBMS Platform=DB2/6000;Defer Prepare=False;DateTime As Char=False;Rowset Cache Size=0;DateTime As Date=False;Auth Encrypt=False;AutoCommit=True;Authentication=Server;Decimal As Numeric=False;FastLoad Optimize=False;Derive Parameters=True;Persist Security Info=True;Data Source=TEST;Connection Pooling=False;'

/* For security reasons the linked server remote logins password is changed with ######## */
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'SOLARIS2',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'collation compatible', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'data access', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'dist', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'pub', @optvalue=N'false'
GO 
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'rpc', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'rpc out', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'sub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'connect timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'collation name', @optvalue=null
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'lazy schema validation', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'query timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'use remote collation', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'SOLARIS2', @optname=N'remote proc transaction promotion', @optvalue=N'true'
GO

sqlcmd -i ./createLinkedServer.sql -U xxx -P xxxx -S 65.53.9.94

Next, I created a stored procedure to execute the linked query:

createStoreProcedure.sql

USE [Northwind]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[Solaris2]
AS
BEGIN
SET NOCOUNT ON;
SELECT [EMPNO],[FIRSTNME],[MIDINIT] ,[LASTNAME],[WORKDEPT],[PHONENO] ,[HIREDATE],[JOB] ,[EDLEVEL] ,[SEX],[BIRTHDATE],[SALARY],[BONUS],[COMM] FROM [SOLARIS2].[TEST].[DB2INST1].[EMPLOYEE];

END
GO

sqlcmd -i ./createStoreProcedure.sql -U xxx -P xxxx -S 65.53.9.94

And finally, I executed the stored procedure.  

Below, we have Redhat Linux pulling data from Solaris/DB2 - compliments of the Microsoft Linux ODBC driver and SQL Server’s linked server functionality.

1> solaris2
2> go
EMPNO  FIRSTNME     MIDINIT LASTNAME        WORKDEPT PHONENO HIREDATE         JOB      EDLEVEL SEX BIRTHDATE        SALARY      BONUS       COMM      
------ ------------ ------- --------------- -------- ------- ---------------- -------- ------- --- ---------------- ----------- ----------- -----------
000010 CHRISTINE    I       HAAS            A00      3978          1965-01-01 PRES          18 F         1933-08-24    52750.00     1000.00     4220.00
000020 MICHAEL      L       THOMPSON        B01      3476          1973-10-10 MANAGER       18 M         1948-02-02    41250.00      800.00     3300.00
000030 SALLY        A       KWAN            C01      4738          1975-04-05 MANAGER       20 F         1941-05-11    38250.00      800.00     3060.00
000050 JOHN         B       GEYER           E01      6789          1949-08-17 MANAGER       16 M         1925-09-15    40175.00      800.00     3214.00
000060 IRVING       F       STERN           D11      6423          1973-09-14 MANAGER       16 M         1945-07-07    32250.00      500.00     2580.00
000070 EVA          D       PULASKI         D21      7831          1980-09-30 MANAGER       16 F         1953-05-26    36170.00      700.00     2893.00
000090 EILEEN       W       HENDERSON       E11      5498          1970-08-15 MANAGER       16 F         1941-05-15    29750.00      600.00     2380.00
000100 THEODORE     Q       SPENSER         E21      0972          1980-06-19 MANAGER       14 M         1956-12-18    26150.00      500.00     2092.00
000110 VINCENZO     G       LUCCHESSI       A00      3490          1958-05-16 SALESREP      19 M         1929-11-05    46500.00      900.00     3720.00
000120 SEAN                 O'CONNELL       A00      2167          1963-12-05 CLERK         14 M         1942-10-18    29250.00      600.00     2340.00
000130 DOLORES      M       QUINTANA        C01      4578          1971-07-28 ANALYST       16 F         1925-09-15    23800.00      500.00     1904.00
000140 HEATHER      A       NICHOLLS        C01      1793          1976-12-15 ANALYST       18 F         1946-01-19    28420.00      600.00     2274.00
000150 BRUCE                ADAMSON         D11      4510          1972-02-12 DESIGNER      16 M         1947-05-17    25280.00      500.00     2022.00
000160 ELIZABETH    R       PIANKA          D11      3782          1977-10-11 DESIGNER      17 F         1955-04-12    22250.00      400.00     1780.00
000170 MASATOSHI    J       YOSHIMURA       D11      2890          1978-09-15 DESIGNER      16 M         1951-01-05    24680.00      500.00     1974.00
000180 MARILYN      S       SCOUTTEN        D11      1682          1973-07-07 DESIGNER      17 F         1949-02-21    21340.00      500.00     1707.00
000190 JAMES        H       WALKER          D11      2986          1974-07-26 DESIGNER      16 M         1952-06-25    20450.00      400.00     1636.00
000200 DAVID                BROWN           D11      4501          1966-03-03 DESIGNER      16 M         1941-05-29    27740.00      600.00     2217.00
000210 WILLIAM      T       JONES           D11      0942          1979-04-11 DESIGNER      17 M         1953-02-23    18270.00      400.00     1462.00
000220 JENNIFER     K       LUTZ            D11      0672          1968-08-29 DESIGNER      18 F         1948-03-19    29840.00      600.00     2387.00
000230 JAMES        J       JEFFERSON       D21      2094          1966-11-21 CLERK         14 M         1935-05-30    22180.00      400.00     1774.00
000240 SALVATORE    M       MARINO          D21      3780          1979-12-05 CLERK         17 M         1954-03-31    28760.00      600.00     2301.00
000250 DANIEL       S       SMITH           D21      0961          1969-10-30 CLERK         15 M         1939-11-12    19180.00      400.00     1534.00
000260 SYBIL        P       JOHNSON         D21      8953          1975-09-11 CLERK         16 F         1936-10-05    17250.00      300.00     1380.00
000270 MARIA        L       PEREZ           D21      9001          1980-09-30 CLERK         15 F         1953-05-26    27380.00      500.00     2190.00
000280 ETHEL        R       SCHNEIDER       E11      8997          1967-03-24 OPERATOR      17 F         1936-03-28    26250.00      500.00     2100.00
000290 JOHN         R       PARKER          E11      4502          1980-05-30 OPERATOR      12 M         1946-07-09    15340.00      300.00     1227.00
000300 PHILIP       X       SMITH           E11      2095          1972-06-19 OPERATOR      14 M         1936-10-27    17750.00      400.00     1420.00
000310 MAUDE        F       SETRIGHT        E11      3332          1964-09-12 OPERATOR      12 F         1931-04-21    15900.00      300.00     1272.00
000320 RAMLAL       V       MEHTA           E21      9990          1965-07-07 FIELDREP      16 M         1932-08-11    19950.00      400.00     1596.00
000330 WING                 LEE             E21      2103          1976-02-23 FIELDREP      14 M         1941-07-18    25370.00      500.00     2030.00
000340 JASON        R       GOUNOT          E21      5698          1947-05-05 FIELDREP      16 M         1926-05-17    23840.00      500.00     1907.00


Sometimes it is easy to forget that our implementation of the Linux-based ODBC driver retains the same functionality found in our existing Windows-based ODBC driver.

I hope this topic helps you understand some of the capabilities of the Microsoft Linux ODBC driver.




