SQL Server


Since Team Foundation Server (TFS) uses SQL Server Analysis Services (SSAS) as its repository for all kinds of information, people are interested in exposing the data through Excel or SQL Server Reporting Services (SSRS). SSAS doesn't expose itself through the HTTP protocol by default, but this can be configured by following this paper. I didn't create the complete document myself but used this paper from Edward Melomed as a starting point. The paper on Microsoft TechNet uses SSAS 2005 and Windows Server 2003. Both products are still widely used, but 2008 is the magic word nowadays. By the way, I can't wait till his new book Microsoft SQL Server 2008 Analysis Services Unleashed is released.

Summary: This paper is intended for anyone who is interested in providing access to SQL Server 2008 Analysis Services through the HTTP protocol. The paper explains the steps required to set up HTTP access and discusses different performance and security settings. All the instructions in this paper are specific to the Microsoft Windows® Server™ 2008 operating system.

Overview

Microsoft® SQL Server™ 2008 Analysis Services uses the same architecture for providing HTTP access as did SQL Server 2005 Analysis Services.

The pump component is loaded into IIS (Internet Information Services) and serves as an ISAPI extension, pumping data from the client to an Analysis Services server and back.

This white paper walks you through the process of setting up HTTP access to Analysis Services when using Microsoft Windows Server™ 2008.


Figure 1 provides a high-level overview of the component architecture.

Getting binaries

To get binaries:

Copy the contents of the %Installation folder%\OLAP\bin\isapi directory into the folder you would like to become the base for the virtual directory in IIS.

In this example, we are going to copy all the files from the C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\bin\isapi folder into the C:\inetpub\wwwroot\olap directory.

Note: To take advantage of the full set of security settings, make sure that the folder that will become the base for the virtual directory is located on a drive formatted with the NTFS file system.

Due to IIS limitations, the path to your directory should not contain spaces.

If you are planning to run the HTTP pump on a different server than the Analysis Services server, make sure that you also install the OLE DB provider for Analysis Services redistributable package.

Creating an application pool

To create an application pool:

1. To open the Computer Management console, open Control Panel, then Administrative Tools, then Computer Management. (Or, right-click the My Computer icon and select Manage on the shortcut menu.)

2. In the Computer Management console, expand the Roles node, then the Web Server (IIS) node, and then the Internet Information Services node.

If you can't find the Internet Information Services node, IIS is probably not installed on your machine. To install it, add the Web Server (IIS) role through Server Manager.

3. Right-click Application Pools to open the shortcut menu and select Add Application Pool...

4. Name the application pool. In this example, we call it OLAP. (See Figure 2.)

Figure 2

Setting up an application

To use the application pool:

1. To open the Computer Management console, open Control Panel, then Administrative Tools, then Computer Management. (Or, right-click the My Computer icon and select Manage on the shortcut menu.)

2. In the Computer Management console, expand the Roles node, then the Web Server (IIS) node, and then the Internet Information Services node.

If you can't find the Internet Information Services node, IIS is probably not installed on your machine. To install it, add the Web Server (IIS) role through Server Manager.

3. Right-click the OLAP directory in your Web site to open the shortcut menu and select Convert to Application. Choose the application pool you've just created. (See Figure 3.)

Figure 3

Setting up directory properties

To set properties for the directory:

1. Click your virtual directory node and select Handler Mappings from the menu. (See Figure 4.)

Figure 4

2. Make sure the directory has Read and Script permissions. You can check this by clicking Edit Feature Permissions... in the top right corner of the Handler Mappings screen, as shown in Figure 5.

Figure 5

3. In the top right corner of the Handler Mappings screen, click Add Script Map...

4. Enter *.dll in the Request path text box. In the Executable box, enter the full path to msmdpump.dll. In this example, that would be C:\inetpub\wwwroot\olap\msmdpump.dll.

5. Name the script map. In this example, we call it OLAP. Your screen should look something like the dialog box shown in Figure 6.

Figure 6

6. Click the OK button to accept the settings.

7. When you click OK to add the script mapping, you will be asked whether you want to enable the ISAPI extension. (See Figure 7.)

Figure 7

8. Click the Yes button to accept the settings.

9. The configuration behind this decision can be found under ISAPI and CGI Restrictions at the server level. (See Figure 8.)

Figure 8

10. Search for your extension and name it as shown in Figure 9.

Figure 9

11. Click the OK button to accept the settings.

Selecting security settings

To select security settings:

1. Click your virtual directory node and select Authentication from the menu. (See Figure 10.)

Figure 10

2. You are presented with three options. Choose one of them. (See Figure 11.)

Figure 11

This section briefly describes these options, citing the advantages, disadvantages, and security concerns of each one. The other options are not discussed in this paper; see the IIS Help to read more about them.

Anonymous access

When this mode is selected, the pump (msmdpump.dll) runs with a fixed set of credentials; in our case, these are the credentials of the IUSR_MACHINENAME user. Therefore, every connection to Analysis Services is opened as the IUSR user. In this mode there is no distinction between users: whoever connects to IIS reaches Analysis Services as the same account.

Use this mode when the security infrastructure does not take advantage of the security functionality of Analysis Services. This is most likely an extremely controlled environment, where users are simply given or denied access to the virtual directory.

Windows authentication

This is the most secure and the recommended mode.

It requires that IIS Server be able to access user domain credentials. This could be done using Microsoft Active Directory® or another mechanism. It is beyond the scope of this paper to discuss all the possible configurations.

Basic authentication

This mode requires that the user enter a user name and password. The user name and password are transmitted over the HTTP connection to IIS. IIS will try to impersonate the user using the provided credentials.

Please note that it is absolutely imperative for anyone building a system where the password is transmitted to have ways of securing the communication channel. IIS provides a great set of tools for setting up and requiring that all communications be encrypted using HTTPS protocol.

3. To configure anonymous authentication, right-click Anonymous Authentication and then click Edit. You will see a dialog box similar to the one in Figure 12. Here you can configure a specific account that will be used to log on to SSAS.

Figure 12

Selecting the target Analysis Services server

As you can see in the architectural diagram in Figure 1, every pump component uses its own configuration file.

Open the msmdpump.ini file located in your folder and take a look at the contents of this file. It should look like the following:

    <ConfigurationSettings>
        <ServerName>localhost</ServerName>
        <SessionTimeout>3600</SessionTimeout>
        <ConnectionPoolSize>100</ConnectionPoolSize>
        <MinThreadPoolSize>0</MinThreadPoolSize>
        <MaxThreadPoolSize>0</MaxThreadPoolSize>
        <MaxThreadsPerClient>4</MaxThreadsPerClient>
    </ConfigurationSettings>

The only setting you are interested in at this point is <ServerName>.

If the Analysis Services instance that you need to provide access to is located on the local machine and installed as the default instance, there is no reason to change this setting. Otherwise, you need to specify the machine name and instance name (mymachine\inst1).

It is also possible to specify a pointer to the virtual directory on another IIS server that is set up for HTTP access to Analysis Services.

For example, you could have <ServerName>http://secondmachine/olap/msmdpump.dll</ServerName>

Getting it all together

At this point you should have configured your HTTP pump and should be ready to connect from your application.

If your application provides you with a way to specify the server name, all you need to do is to substitute your server name with the path to your virtual directory concatenated with “msmdpump.dll”.

As in SQL Server 2005 Analysis Services, the MSOLAP OLE DB provider will understand that the server name includes a URL path and will automatically start using the HTTP protocol.

For example, to connect to “MyMachine” from the MDX sample application, you should be able to connect to the Analysis Services server using “http://MyMachine/olap/msmdpump.dll” as a server name.
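Spelled out as a connection string, such a connection could look like the following sketch (the server URL and catalog name are illustrative, not taken from the paper):

```
Provider=MSOLAP;Data Source=http://MyMachine/olap/msmdpump.dll;Initial Catalog=Adventure Works DW;
```

The same URL can be pasted into the server name box of SQL Server Management Studio or of Excel's Analysis Services data connection wizard.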

 


A couple of weeks ago I posted about creating a couple of data generators for DataDude. In my first data generator I used aggregation extensibility. Microsoft divides the forms of extensibility into separate categories; since I create one or more instances of the standard data generator classes and use those to do the work, it's called aggregation extensibility. Another thing I use is multiple generators in one class. The base class implementation constructs outputs that are based on public properties that are marked with the OutputAttribute. The inputs are set by using the InputAttribute. Using properties that are marked with attributes provides a simple mechanism for declaring input and output values that are strongly typed. The code below uses multiple outputs, and a numeric and a regular expression data generator to do the work.

using Microsoft.VisualStudio.TeamSystem.Data.DataGenerator;
using Microsoft.VisualStudio.TeamSystem.Data.Generators;
using System;
using System.Globalization;

namespace DutchGenerator
{
    public class DutchDataDetails : Generator
    {
        #region Fields
        private string _prefix;
        private string _seperator;
        private CasingType _casing;

        private RegexString _regex;
        private Int _numeric;
        private Random _rand = new Random();

        private string genPostCode;
        private string genProvince;
        private string genProvinceAbbrev;
        private string genPhoneNumber;
        private string genMobilePhoneNumber;
        #endregion

        #region Enumerators
        public enum CasingType
        {
            NormalCase,
            LowerCase,
            UpperCase,
            TitleCase
        }

        private string[] Provinces = new string[] {
              "Drenthe"
            , "Flevoland"
            , "Friesland"
            , "Gelderland"
            , "Groningen"
            , "Limburg"
            , "Noord-Brabant"
            , "Noord-Holland"
            , "Overijssel"
            , "Utrecht"
            , "Zeeland"
            , "Zuid-Holland"};

        private string[] ProvincesAbbrev = new string[]
        {
              "DR"
            , "FL"
            , "FR"
            , "GL"
            , "GR"
            , "LI"
            , "NB"
            , "NH"
            , "OV"
            , "UT"
            , "ZH"
            , "ZL"
        };
        #endregion

        #region Inputs
        [Input(Name = "Prefix", Description = "Prefix for (mobile) phone numbers (e.g. +31).", DefaultValue = "")]
        public string Prefix
        {
            get
            {
                return _prefix;
            }
            set
            {
                _prefix = value;
            }
        }

        [Input(Name = "Separator", Description = "Separator for postcodes (e.g. a space) and (mobile) phone numbers (e.g. parentheses and minus characters).", DefaultValue = " ")]
        public string Seperator
        {
            get
            {
                return _seperator;
            }
            set
            {
                _seperator = value;
            }
        }

        [Input(Name = "Character casing", Description = "Character casing of the generated string.", DefaultValue = CasingType.NormalCase)]
        public CasingType CharacterCasing
        {
            get
            {
                return _casing;
            }
            set
            {
                _casing = value;
            }
        }
        #endregion

        #region Outputs
        [Output(Name = "Province name", Description = "The long form of the province name.")]
        public string Province
        {
            get
            {
                return genProvince;
            }
        }

        [Output(Name = "Province abbreviation", Description = "The 2-letter form of the province name.")]
        public string ProvinceAbbrev
        {
            get
            {
                return genProvinceAbbrev;
            }
        }

        [Output(Name = "Postcode", Description = "A valid Dutch postcode.")]
        public string PostCode
        {
            get
            {
                return genPostCode;
            }
        }

        [Output(Name = "Phonenumber", Description = "A valid phone number, including area code.")]
        public string PhoneNumber
        {
            get
            {
                return genPhoneNumber;
            }
        }

        [Output(Name = "Mobile Phonenumber", Description = "A valid mobile phone number.")]
        public string MobilePhoneNumber
        {
            get
            {
                return genMobilePhoneNumber;
            }
        }
        #endregion

        protected string FormatOutputValue(string originalOutput)
        {
            string formattedValue = String.Empty;

            if (originalOutput != null)
            {
                // Format character casing
                switch (_casing)
                {
                    case CasingType.NormalCase:
                        formattedValue = originalOutput;
                        break;

                    case CasingType.LowerCase:
                        formattedValue = CultureInfo.CurrentCulture.TextInfo.ToLower(originalOutput);
                        break;

                    case CasingType.UpperCase:
                        formattedValue = CultureInfo.CurrentCulture.TextInfo.ToUpper(originalOutput);
                        break;

                    case CasingType.TitleCase:
                        formattedValue = CultureInfo.CurrentCulture.TextInfo.ToTitleCase(originalOutput);
                        break;

                    default:
                        break;
                }
            }

            return formattedValue;
        }

        protected override void OnInitialize(GeneratorInit initInfo)
        {
            base.OnInitialize(initInfo);

            // "##" in the expressions below is a placeholder that is replaced
            // by the configured separator when the values are generated
            switch (this.OutputKey)
            {
                case "Province":

                case "ProvinceAbbrev":
                    _numeric = new Int();
                    _numeric.Distribution = new Normal();
                    _numeric.Min = 0;
                    _numeric.Max = 11;
                    _numeric.Initialize(initInfo);
                    break;

                case "PostCode":
                    _regex = new RegexString();
                    _regex.Expression = "[1-9]{1}[0-9]{3}##([a-z]|[A-Z]){2}";
                    _regex.MaximumLength = 8;
                    _regex.Initialize(initInfo);
                    break;

                case "PhoneNumber":
                    _regex = new RegexString();
                    _regex.Expression = "0[123457]{1}[0-9]{1}##[0-9]{7}";
                    _regex.MaximumLength = 12;
                    _regex.Initialize(initInfo);
                    break;

                case "MobilePhoneNumber":
                    _regex = new RegexString();
                    _regex.Expression = "06##[0-9]{8}";
                    _regex.MaximumLength = 12;
                    _regex.Initialize(initInfo);
                    break;

                default:
                    break;
            }
        }

        protected override void OnGenerateNextValues()
        {
            string result;
            base.OnGenerateNextValues();

            switch (this.OutputKey)
            {
                case "Province":
                    _numeric.GenerateNextValues();
                    genProvince = FormatOutputValue(Provinces[_numeric.Result]);
                    break;

                case "ProvinceAbbrev":
                    _numeric.GenerateNextValues();
                    genProvinceAbbrev = FormatOutputValue(ProvincesAbbrev[_numeric.Result]);
                    break;

                case "PostCode":
                    _regex.GenerateNextValues();
                    genPostCode = FormatOutputValue(_regex.Result.Replace("##", _seperator));
                    break;

                case "PhoneNumber":
                    _regex.GenerateNextValues();
                    result = _regex.Result.Replace("##", _seperator);

                    // Randomly shift the separator one position to the right,
                    // so both 3-digit and 4-digit area codes are generated
                    if ((_seperator.Length > 0) && (_rand.Next(0, 2) == 0))
                    {
                        result = result.Replace(_seperator, result.Substring(3 + _seperator.Length, 1));
                        result = result.Remove(4, 1);
                        result = result.Insert(4, _seperator);
                    }
                    genPhoneNumber = FormatOutputValue(_prefix.Length > 0 ? _prefix + result.Substring(1) : result);
                    break;

                case "MobilePhoneNumber":
                    _regex.GenerateNextValues();
                    result = _regex.Result.Replace("##", _seperator);

                    genMobilePhoneNumber = FormatOutputValue(_prefix.Length > 0 ? _prefix + result.Substring(1) : result);
                    break;

                default:
                    break;
            }
        }
    }
}

Dutch data generator (Provinces, Postcodes, Phone - and Mobile phone numbers)

To register the class in the Visual Studio environment, an XML file has to be created in which all objects are registered:

  • XML file naming convention: <AssemblyFilename>.Extensions.xml
  • The file extension .Extensions.xml is mandatory
  • The Extensions.xml file has to be placed in the %ProgramFiles%\Microsoft Visual Studio 8\DBPro directory
  • It must implement the schema defined in %ProgramFiles%\Microsoft Visual Studio 8\DBPro\Microsoft.VisualStudio.TeamSystem.Data.Extensions.xsd
<?xml version="1.0" encoding="us-ascii"?>
<extensions assembly="DutchGenerator, Version=1.0.0.0, Culture=neutral, PublicKeyToken=d6ddc2e862439c84" version="1" xmlns="urn:Microsoft.VisualStudio.TeamSystem.Data.Extensions" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="urn:Microsoft.VisualStudio.TeamSystem.Data.Extensions
  Microsoft.VisualStudio.TeamSystem.Data.Extensions.xsd">
    <extension type="DutchGenerator.DutchDataDetails" enabled="true" />
</extensions>

DutchGenerator.Extensions.xml

There are two options for where the extension assemblies can be placed:

  • The %ProgramFiles%\Microsoft Visual Studio 8\DBPro\Extensions directory
  • The GAC, which requires the assembly to be strong-name signed

To use the extension assembly, the following steps have to be done:

  • Copy the .Extensions.xml file to the DBPro directory
  • Copy the assembly to the DBPro\Extensions directory
  • GAC the extension assembly

Since this isn't handy while you're developing your extension, you could create a generic post-build event:

copy "$(ProjectDir)$(TargetName).Extensions.xml" "$(ProgramFiles)\Microsoft Visual Studio 8\DBPro\$(TargetName).Extensions.xml" /y

copy "$(TargetDir)$(TargetFileName)" "$(ProgramFiles)\Microsoft Visual Studio 8\DBPro\Extensions\$(TargetFileName)" /y

"$(DevEnvDir)..\..\SDK\v2.0\bin\gacutil.exe" /if "$(ProgramFiles)\Microsoft Visual Studio 8\DBPro\Extensions\$(TargetFileName)"
 

In the following screenshots, the Dutch Data Generator is used to select and modify one of the data generator outputs and to create the generated output.

Generator properties

Generated output

Summary
I love the data generators option. More and more often, companies prohibit you from using production data in a test environment, probably because of SOX (Sarbanes-Oxley). So you have to create your own test data with the same number of records. Before DataDude this was hard to do, but with DataDude you can do it easily and even create your own generators. I hope lots of people will create useful generators and maybe we can collect them on CodePlex.

 

On June 11 and 12, I attended a two-day Visual Studio Team Edition for Database Professionals workshop at Class-A. This event was hosted by Gert Drapers (also aka DataDude); see his blog post announcing the event. Gert discussed all the important features of the tool, and some of my fellow attendees complained that the workshop was too short for such an enormous set of features and exercises. I have to agree a bit, because we not only discussed all the features of DBPro, we also covered the features of SR1 and the not-yet-released power tools. The Power Tools are listed on the Future Release page of VSTS. So there was hardly time for any of the exercises, but I don't give a ****, because I'm more interested in hearing and seeing what DBPro can do. After this workshop and the DevDays, I started doing my own exercises. One of the things to which I will give my full attention is the ability to extend DBPro. I will focus on all extensions, but on data generation in particular.

The power of VSTE DBPro data generation lies in the fact that you can generate data that closely matches the domain values allowed in your environment. The problem is that knowledge of the allowed and disallowed domain values cannot always be derived from the schema, so you have to go into the designer and configure the generators for your columns. By using the built-in data generators, you can generate random data, generate data from existing data sources, and control many aspects of data generation. If the functionality of the built-in generators is insufficient, you can create custom data generators. To do so, you use the classes in the Microsoft.VisualStudio.TeamSystem.Data.DataGenerator namespace.

Let me start by motivating the data generators that I would like to build. I often find databases that contain Dutch-related information, such as postal codes, phone numbers, provinces, etc. Furthermore, a lot of columns contain data which could be generated with a DataBound generator, but some of this data could also be retrieved from a number of web services (cities, countries, currencies, ISO codes, etc.). Last but not least, I have a credit card background; not that I misused other people's cards, but I built a card number controller to prevent this.

The extensibility API provides classes from which developers can inherit. In addition to classes, the API includes attributes that you can apply to your derived classes. By applying these attributes, you reduce the amount of code that is required in custom generators for common cases.

You can use the extensibility API in the following four ways to create custom data generators:

Declarative Extensibility (Easy)

  • Inherit from the Generator class.
  • Add custom input and output properties.

Normal Extensibility (Medium; this method is recommended in most cases)

  • Inherit from the Generator class.
  • Add custom input and output properties.
  • Override one or more Generator methods (for example, OnGenerateNextValues and OnInitialize).

Aggregation Extensibility (Medium)

  • Inherit from the Generator class.
  • Add custom input and output properties.
  • Override one or more Generator methods.
  • In your class, create one or more instances of the standard data generator classes and use those to do the work.

Base Extensibility (Difficult)

  • Create a class that implements the IGenerator interface.
  • Implement all methods that are required by your generator.
  • Create a custom designer for the generator that implements the IDesigner interface.
  • Implement all methods that are required by your designer.

Summary
Since I hope to be better than the average, I will use Aggregation Extensibility for my Dutch data generator, in which I will probably use the regular expression generator to build my data details. For one of the other two data generators I will be using Base Extensibility and give it its own designer. So keep in touch and follow my posts.

 

SQL Server code name "Katmai"


Microsoft is planning to release the next version of its SQL Server database, which is code-named "Katmai," in 2008. Everybody is waiting anxiously to see what this version will bring us. Like everybody, I have a personal wish list for SSRS, and it seems MS is working hard to fulfill this list.



  • Rich Text support
  • RTF/DOC/PPT rendering
  • Shared Styles (Styles sheets)
  • Templates
  • Font embedding in PDF
  • Many chart enhancements (Dundas)
  • Edit in Preview

At the Microsoft Business Intelligence (BI) Conference, Jeff Raikes announced that Microsoft is going to acquire OfficeWriter from SoftArtisans. OfficeWriter is the only product available today that works with Reporting Services to preserve 100% of the features in your sophisticated spreadsheets and documents. Reporting Services currently does not support MS Word and has limited support for MS Excel. This is a huge win for folks who want to render reports to Word.


Another thing which is quite remarkable is that the Report Designer will get a new look and feel. This could mean that MS is integrating the original Report Designer, which is part of BIDS, and Report Builder into one product, or that the Report Designer is leaving the Visual Studio environment. Here is a screen dump of the new Report Designer.



New Report Designer


Furthermore, Microsoft added a new Katmai informational site, Your Data, Any Place, Any Time. This site will inform you about the latest developments of "Katmai".


Summary
Unfortunately, not all sessions from the Microsoft Business Intelligence (BI) Conference are or will be available on the net, so it's hard to understand where MS is heading. Maybe the organizers could learn something from the MIX organizers, because all those sessions were available on the net within 24 hours after each live presentation. Great job guys! Besides this, I hope you're as enthusiastic as I am about "Katmai", and I can hardly wait till the first CTP of "Katmai" comes out in public.


 




How often do you bump into problems with users and security when you move a SQL Server database to another server? The most annoying problem is when you try to create a user under Security and the user already exists within the moved database. When you try to apply user mapping and permissions, you get an error like this:

Error 15023: User, group or role '%s' already exists in the current database.


Normally you don't have to reapply the user permissions, because SQL Server associates the user accounts in a database with the logins on the server. Fixing the broken association by hand could take some time, but luckily there's a quick way to do this:


use <database>
go

sp_change_users_login 'auto_fix', 'UserName'
go


This will output the following result.


The row for user ' ' will be fixed by updating its login link to a login already in existence.
The number of orphaned users fixed by updating users was 1.
The number of orphaned users fixed by adding new logins and then updating users was 0.


Microsoft has a Knowledge Base article, Q240872, that explains the problem and the solution.
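If you first want to see which users are orphaned before fixing anything, sp_change_users_login also supports a 'Report' action. A small sketch (the database name is a placeholder):

```sql
-- Run inside the moved database: lists database users whose SID
-- no longer matches any login on this server
use <database>
go

sp_change_users_login 'Report'
go
```

This returns one row per orphaned user, so you know exactly which accounts need the 'auto_fix' treatment.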


 




On a recent assignment I was asked to implement two Integration Services instances on the same machine. Their server park isn't that big, but they want to simulate separated test and development environments. With DTS this wasn't a real problem, but how do you configure SSIS to use multiple instances, since SSIS can only be installed once on a single machine and works totally differently than DTS? Before I explain how you can configure SSIS, you first need to know the differences between DTS and SSIS.

Traditional (DTS) warehouse loading

  • Integration process simply conforms data and loads the database server
  • The database performs aggregations, sorting and other operations
  • Database competes for resources from user queries
  • This solution does not scale very well

 

Warehouse loading with SSIS

  • SQL Server Integration Services conforms the data
  • But also aggregates and sorts, and loads the database
  • This frees up the database server for user queries

SSIS includes a configuration file for configuring the Integration Services service. By default, the file is located in the folder Program Files\Microsoft SQL Server\90\DTS\Binn, and the file name is MsDtsSrvr.ini.xml.

The default configuration file contains the following settings:

  • The root folders to display for Integration Services in Object Explorer of SQL Server Management Studio are the MSDB and File System folders.
  • The packages in the file system that the Integration Services service manages are located in %Program Files%\Microsoft SQL Server\90\DTS\Packages.

You can modify the configuration file to display additional root folders in Object Explorer, or to specify a different folder or additional folders in the file system to be managed by the Integration Services service. The example below shows how I configured the configuration file to use more than one MSDB database, stored in separate database instances.
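As a sketch, a modified MsDtsSrvr.ini.xml along those lines could look like the following (the folder names, instance names, and paths are illustrative, not taken from the original post):

```xml
<?xml version="1.0" encoding="utf-8"?>
<DtsServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
  <TopLevelFolders>
    <!-- Packages stored in the msdb database of the development instance -->
    <Folder xsi:type="SqlServerFolder">
      <Name>MSDB Development</Name>
      <ServerName>.\DEV</ServerName>
    </Folder>
    <!-- Packages stored in the msdb database of the test instance -->
    <Folder xsi:type="SqlServerFolder">
      <Name>MSDB Test</Name>
      <ServerName>.\TEST</ServerName>
    </Folder>
    <!-- Packages stored on disk -->
    <Folder xsi:type="FileSystemFolder">
      <Name>File System</Name>
      <StorePath>..\Packages</StorePath>
    </Folder>
  </TopLevelFolders>
</DtsServiceConfiguration>
```

Each SqlServerFolder shows up as a separate root folder under Stored Packages in Management Studio.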

As you can see in Management Studio, I can now store my packages in different databases as well as in multiple file system folders.

Note
The registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTS\ServiceConfigFile specifies the location and name of the configuration file that the Integration Services service uses. The default value of the registry key is C:\Program Files\Microsoft SQL Server\90\DTS\Binn\MsDtsSrvr.ini.xml. You can update the value of the registry key to use a different name and location for the configuration file.

Summary
The SQL team did a great job letting you choose how many instances you want to use on one machine. They're fully configurable and administrable, and you're able to create a real test and development environment on a single machine. Great job, guys.

 


If you want to shrink the size of the transaction log, which you think has become much larger than you would like, there are two options.

Detach and Attach
First I will explain a trick to quickly remove the log file by detaching the database and letting SQL Server automatically create a new log file when you attach it again. Remember that this will not work if the database has more than one log file. When you detach the database using sp_detach_db or the UI in SQL Server Management Studio, SQL Server knows the database was cleanly shut down. This ensures the log file does not need to be available to attach the database again, so you can delete it. When you attach the database again, SQL Server creates a new log file for you, which will be of the minimum size.

The following are the steps to do:

  1. Detach the database using SP_DETACH_DB procedure (Ensure no processes are using the database files.) 
  2. Delete the log file
  3. Attach the database again using SP_ATTACH_DB procedure

or

  1. Open SQL Server Management Studio
  2. Open the context menu of the database
  3. Choose Tasks > Detach
  4. Delete the log file
  5. Open the context menu of the server
  6. Choose Attach
  7. Select the database .mdf file


    Note: Make sure you remove the log file from the database details, as you can see in the screenshot above.

Since you are detaching and attaching the database within the same server you will not have the problem of broken logins.
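For reference, the T-SQL route above can be sketched like this (the database name and file path are placeholders, so adjust them for your own server):

```sql
-- Step 1: detach the database; make sure no processes are using its files
EXEC sp_detach_db @dbname = N'MyDatabase';

-- Step 2: delete MyDatabase_log.ldf from disk

-- Step 3: attach using only the .mdf; because the database was cleanly
-- shut down, SQL Server creates a fresh, minimum-size log file
EXEC sp_attach_db
    @dbname    = N'MyDatabase',
    @filename1 = N'C:\Data\MyDatabase.mdf';
```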

Backup and Shrink file
This option is not a trick, but the official way to shrink the transaction log file. Before you use it, I would like to mention that this option does not always produce the result you want. I haven't found out why, but sometimes the log file does not end up at the size you requested.

The following are the steps to do:

  1. Back up the transaction log using BACKUP LOG database WITH TRUNCATE_ONLY, where database is the name of the database.
  2. Shrink the file using DBCC SHRINKFILE(database_log, 50), where 50 is the preferred size in MB of the log file.

or

  1. Open SQL Server Management Studio
  2. Open the context menu of the database
  3. Choose Tasks > Back Up
  4. Choose Back up type - Transaction Log
  5. Open the context menu of the database
  6. Choose Tasks > Shrink > Files
  7. Choose File type - Log
  8. Choose Shrink Action (it's good to leave some empty space, e.g. 50 MB)
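The T-SQL route can be put together in a single script; a sketch for a database named MyDatabase (the database name, logical log file name, and target size are placeholders):

```sql
-- Truncate the inactive part of the log
-- (TRUNCATE_ONLY applies to SQL Server 2000/2005; it was removed in 2008)
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY;

-- Shrink the log file to roughly 50 MB
-- (the logical file name is usually <dbname>_log; check sys.database_files)
USE MyDatabase;
DBCC SHRINKFILE (MyDatabase_log, 50);
```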

Summary
For shrinking a log consisting of one or more log files, you can find very good documentation by searching for article KB256650 (for SQL Server 7.0) and article KB272318 (for SQL Server 2000) in the Microsoft Knowledge Base. Another interesting article is How to stop the transaction log of a SQL Server database from growing unexpectedly, which applies to SQL 2000 and 2005.

 



Since the release of Reporting Services 2000 back in January of 2004, I've built lots of apps using this technology. On my newest assignment I was asked to implement two Reporting Services instances on the same machine. Their server park isn't that big, but they want to simulate separate test and development environments. I immediately replied that this wouldn't be a problem, and since SQL Server 2005 ships with the Reporting Services Configuration tool, it isn't that hard anymore. To begin this giant adventure, you install two instances of SQL Server 2005 Reporting Services (e.g. Test and Development) and choose the option to not configure Reporting Services. Having finished this, you now have to create one or more additional Web sites, so the separate Reporting Services instances can be used independently.

To simulate two web servers on one machine, you need to install an extra IP address or use the same IP address in combination with a host header value. This time I chose the extra IP address, because this satisfied the Helpdesk administrator the most ;-). To add an extra IP address, open your network card properties, then open the properties of Internet Protocol (TCP/IP). Click the Advanced button and add as many IP addresses as you want separate instances of Reporting Services.

 
IP Address configuring

When you've added the extra IP address, you can create an additional Web Site to your Web Server. Using the Web Site Creation Wizard, you can select one of the other IP addresses.


Web Site Creation Wizard

The next step is to configure your Reporting Services instances using the Reporting Services Configuration tool. For extra info about configuring, check MSDN.


Internet Information Services (IIS) Manager

When everything is configured using this tool, it still doesn't work the way you'd like. The second instance still connects to your default instance, because the default behavior of a Reporting Services instance is to redirect its requests to localhost. To change this behavior you have to change some settings in the Reporting Services config files (RSReportServer.config, RSWebApplication.config).

For example, in C:\Program Files\Microsoft SQL Server\MSSQL.#\Reporting Services\ReportServer open

RSReportServer.config

old value:
<UrlRoot>http://<servername>/reportserver</UrlRoot>

new value:
<UrlRoot>http://10.10.10.11/reportserver</UrlRoot>

For example, in C:\Program Files\Microsoft SQL Server\MSSQL.#\Reporting Services\ReportManager open

RSWebApplication.config

old value:
<ReportServerUrl></ReportServerUrl>
<ReportServerVirtualDirectory>ReportServer</ReportServerVirtualDirectory>

new value:
<ReportServerUrl>http://10.10.10.11/reportserver</ReportServerUrl>
<ReportServerVirtualDirectory></ReportServerVirtualDirectory>

Note: It's very important that only one of the two values is filled in; otherwise an error will occur.

Run iisreset /noforce and you're now running multiple instances of Reporting Services on the same machine.

Check the following URLs:

Summary
The SQL team did a great job letting you choose how many instances you want to use on one machine. They're fully configurable and administrable, and you're able to create a real test and development environment on a single machine. Great job, guys.

 



This is the first post in a series discussing the improvements that come with installing SQL Server SP2.

If you've applied Service Pack 2 CTP2 to your SQL Server 2005 Reporting Services installation, you might have noticed that "Select All" has reappeared in the drop-down list for multi-valued parameters. If you would like to get an official patch before SP2 is released, you can contact Support Services and ask for Knowledge Base article #919478.

     
[Screenshots: the parameter drop-down in SQL Server 2005, SQL Server 2005 SP1, and SQL Server 2005 SP2]

Via Brian Welcker

Here's the story behind this disappearance.

With SQL Server 2005, we introduced multi-valued parameters, which allow you to specify a set of values for a single report parameter. If you think about how multi-valued parameters work against a SQL source, you build a SELECT statement with an IN clause, and the values get substituted before the statement is sent to the server, e.g.

SELECT name FROM customer WHERE country IN (@country)

becomes:

SELECT name FROM customer WHERE customer.country IN ('United States', 'Canada', 'Mexico')

The problem comes from the addition of "Select All" to the drop-down list. In summary, the functionality of the "Select All" checkbox is simply: "check all of the values in the list unless they are already all checked; in that case, uncheck them all." While this works fine with a small set of values, having several hundred values in an IN clause is not a very efficient SQL statement.

The performance implications for Analysis Services are even more significant: you can select all of the individual members of a dimension as well as the "All" member of the dimension. If your parameter contains the entire member hierarchy (not just a single level), selecting all of the items in the list can cause major performance issues.

A more efficient "Select All" function would actually modify the statement to exclude the clause entirely. Selecting "all countries" in the above statement would result in:

SELECT name FROM customer

This is quite a bit trickier than the simple parameter substitution logic, as you need to maintain the integrity of the statement (what if there is more than one search condition?). Also, we would need to write query-parsing routines for each language we support (T-SQL, MDX, PL/SQL, etc.).

We thought a long time about how to address the performance issues of this feature in SP1. Ideally, you would be able to add a flag to the report definition that would say whether or not to add the "Select All" checkbox. However, as a policy, we don't change the report definition in service packs due to the upgrade and compatibility issues. We also thought about auto-detecting whether a report is against Analysis Services but couldn't really do this without changing the execution SOAP API. We thought about making it a flag in the configuration file but this would force people to decide one way or another for all reports. Finally, we thought about putting "Select All" at the end of the list instead of the beginning.

Summary
It's good to know why the MS team removed "Select All" in the first place. It had nothing to do with a bug; it was all about performance. Since we are, after all, responsible for our own queries, it's nice to have it back again. I know several people built all kinds of workarounds to fulfill the same need, but nothing came really close to the original UI and behavior.

 

Update: I updated this article because MS is now shipping the December 2006 preview.

Microsoft released the SQL Server 2005 Service Pack 2 Community Technology Preview December 2006. CTP December 2006 will be a public release, and the goal of this outreach is to raise awareness of the CTP and solicit community feedback during the aggressive CTP cycle, while demonstrating the deep integration between SQL Server 2005 and the 2007 Office system.

They also created the SQL Server Community Technology Preview Program. Leading up to the release of SQL Server 2005, Microsoft made Community Technology Previews (CTPs) available as public downloads. In addition, they want to highlight the increasing transparency in Microsoft’s development cycle to provide greater predictability to its customers.

Driving Innovation:

Heterogeneous Environments (Interoperability):

  • Oracle Support in Report Builder. Users will now be able to use Report Builder on top of Oracle data sources.
  • Hyperion support with SSRS. Organizations will now be able to use SSRS to build reports on top of Hyperion Essbase cubes.

Performance/Enterprise:

  • Data compression (vardecimal), which is important for data warehouse scenarios and specifically for SAP BW scenarios. It requires less disk storage for decimal data, which increases overall performance.
  • Manageability enhancements. Based on customer feedback, SQL Server provides enhanced management capabilities for DBAs such as improvements in database maintenance plans, enhanced management reports and a new copy database wizard.
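As a sketch of the vardecimal feature: it is enabled per database and then per table (the database and table names below are placeholders):

```sql
-- Allow vardecimal storage in the database (available from SP2 onward)
EXEC sp_db_vardecimal_storage_format 'MyWarehouse', 'ON';

-- Turn it on for a specific table with many decimal columns
USE MyWarehouse;
EXEC sp_tableoption 'dbo.FactSales', 'vardecimal storage format', 1;
```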

SQL Server Express Edition

  • Management reports have been added to SQL Server Express Edition enabling customers to get insights into the performance of their Express Edition and SQL Server Compact Edition databases.
  • SQL Server Management Studio Express Edition now enables management of SQL Server Compact Edition databases.

SQL Server 2005 SP2 will ship shortly after the launch of Windows Vista and Office. You can find more details about the features in SP2 at http://go.microsoft.com/fwlink/?LinkId=71711.

 
