First posted by David Jennaway
Additional information by this author
A frequent request we come across is from companies that want to know which users are using CRM, and when. The CRM platform provides the facility to gather detailed usage information by writing plug-ins, but a simpler and more general mechanism is to use the Internet Information Services (IIS) logging mechanism.
This article will explain how to derive useful usage metrics from the information provided by IIS logging. The main steps are:
1. Configure IIS logging to allow easy querying of the log data
2. Filter the raw log data to help identify usage patterns
3. Specify time periods to help categorize when people access CRM
4. Present the usage information in a graphical format through reports
Configure IIS logging
IIS logging allows the capture of information about every web request submitted to the server, and is configured at the web site level. By default the log information is written to text files, but IIS can also write the information directly to a relational database, which allows easier analysis. For the purposes of this post, the simplest setup is to configure IIS 6.0 to write the log data to a SQL Server ODBC data source:
1. Create a SQL database. In this database, create a SQL table to hold the log data. IIS provides a SQL script that will create the table with the appropriate columns; this script is at %windir%\system32\inetsrv\logtemp.sql. Open the script in SQL Server Management Studio, switch to the new database and execute the script
2. Create an ODBC data source on the IIS server with the connection details for the database created in step 1. The data source must be a system DSN, and I'd recommend using Integrated authentication to SQL Server
3. In IIS Manager, open the Properties of the Microsoft Dynamics CRM web site. On the Web Site tab, select ODBC Logging in the Active log format dropdown, and set the associated properties to reflect the SQL table and ODBC DSN created in steps 1 and 2. If you specified Integrated authentication in the DSN, the User name and Password entered here are ignored, though oddly you cannot leave the password field blank.
Filtering the log data
The IIS log will have a record for every request submitted to the web site. This includes requests for images, stylesheets and other supporting files, and can result in more data than you need. For instance, opening a CRM record can result in 30 records in the log table. There are also entries for CRM processes (such as the CRM Asynchronous Service) accessing the CRM platform.
This extraneous data can be filtered out with a SQL view. We use the following view definition:
CREATE view [dbo].[IISLogFiltered]
as
select ClientHost, username, LogTime, processingtime, bytesrecvd, bytessent, servicestatus, operation, target, parameters
from inetlog
where username not in ('-', 'CRMTEST\CRM4DEV$')
and parameters <> '-' and parameters not like 'lcid=%'
-- Replace CRMTEST\CRM4DEV$ with the account used by the CRM services
This assumes the SQL table has the default name ('inetlog'). The view filters out anonymous requests and those from the CRM service account, as well as requests with no query string parameters (or where the only parameter is 'lcid'), which excludes the requests for the supporting files. You can explore the log data in more detail and adjust the filters as appropriate.
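To judge whether further filtering is worthwhile, an exploratory query such as the following (an illustrative sketch, not part of the solution itself) shows which targets still dominate the filtered log:

```sql
-- Illustrative: the 20 most frequent targets remaining after filtering,
-- to help decide whether any additional filter conditions are needed
select top 20 target, count(*) as Requests
from IISLogFiltered
group by target
order by count(*) desc
```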
Specifying time periods to categorize access
Even with the SQL view described above, you are still a step away from getting useful information about when users access CRM. The raw data stores the date and time of each request in a single field, but it helps to process this into time and date periods. The approach we use is to build a SQL table that contains definable time periods, and a SQL function to match the log time to a time period. The SQL table can be created with the following script:
CREATE TABLE [dbo].[TimePeriod](
[PeriodText] [nvarchar](20) NULL,
[HourStart] [int] NULL,
[MinuteStart] [int] NULL,
[TotalMinuteStart] AS ((60)*[HourStart]+[MinuteStart]),
[TotalMinuteEnd] [int] NULL)
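As an illustration, the table could be populated along these lines (the period names and boundaries here are examples only; choose your own):

```sql
-- Illustrative sample data: one catch-all period before 08:00,
-- hour-long periods through the working day, and one after 18:00.
-- TotalMinuteStart is computed, so only the other columns are supplied.
insert into TimePeriod (PeriodText, HourStart, MinuteStart, TotalMinuteEnd) values ('Before 08:00', 0, 0, 480)
insert into TimePeriod (PeriodText, HourStart, MinuteStart, TotalMinuteEnd) values ('08:00 - 09:00', 8, 0, 540)
insert into TimePeriod (PeriodText, HourStart, MinuteStart, TotalMinuteEnd) values ('09:00 - 10:00', 9, 0, 600)
-- ...and so on for each hour of the working day...
insert into TimePeriod (PeriodText, HourStart, MinuteStart, TotalMinuteEnd) values ('17:00 - 18:00', 17, 0, 1080)
insert into TimePeriod (PeriodText, HourStart, MinuteStart, TotalMinuteEnd) values ('After 18:00', 18, 0, 1440)
```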
You can then populate the table with data for the time periods you want. For example, you could split the day into hour-long periods between 8:00 and 18:00, with one catch-all period before 8:00 and another after 18:00. The table can be used to find the PeriodText from a datetime field using a SQL function:
CREATE function GetTimePeriodText (@DateTime datetime) returns nvarchar(20)
as
begin
declare @tm int, @ret nvarchar(20)
set @tm = 60 * datepart(hh, @DateTime) + datepart(mi, @DateTime)
select @ret = max(PeriodText) from TimePeriod where @tm >= TotalMinuteStart and @tm < TotalMinuteEnd
return @ret
end
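A quick way to check the function once the TimePeriod table is populated (the datetime value here is arbitrary):

```sql
-- Returns whichever PeriodText covers 09:30 in your TimePeriod table
select dbo.GetTimePeriodText('2009-03-02T09:30:00') as Period
```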
It also helps to extract the date component; this can be done using the SQL Convert function. The following SQL view applies these functions to the filtering view defined above. It also extracts the CRM organization name from the target (URL), and removes the domain part of the username.
Create view IISLogFilteredWithPeriods
as
select ClientHost, username, LogTime, processingtime, bytesrecvd, bytessent, servicestatus, operation, target, parameters
, dbo.GetTimePeriodText(LogTime) as TimePeriod
, convert(nchar(10), LogTime, 103) as DateText -- The last parameter defines the format
, convert(nchar(8), LogTime, 112) as DateYYYYMMDD -- Useful for sorting dates
, case when charindex('/', target, 2) > 2 then substring(target, 2, charindex('/', target, 2) - 2) else '' end as Organization -- Get organization name from target
, rtrim(substring(username, charindex('\', username) + 1, len(username))) as usernameonly -- Remove the domain part of the user name
from IISLogFiltered
Note that the IIS log will store data in Universal Time (UTC). You can modify either the data in the TimePeriod table, or the logic in the GetTimePeriodText function, to apply timezone information. Another, though officially unsupported, approach would be to use the fn_UTCToLocalTime SQL function in the MSCRM database.
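As a minimal sketch of the timezone adjustment, a fixed offset can be applied with dateadd before matching the period. The offset here is an assumption (UTC+1), and this approach ignores daylight saving changes:

```sql
-- Shift the UTC log time by a fixed local offset before matching the period
select dbo.GetTimePeriodText(dateadd(hour, 1, LogTime)) as LocalTimePeriod, *
from IISLogFiltered
```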
The whole structure now comprises the inetlog table populated by IIS, the TimePeriod table and GetTimePeriodText function, and the IISLogFiltered and IISLogFilteredWithPeriods views that bring them together.
Creating reports on the log data
Now we've got the structure to process the log data, we can present it to users via a SQL Server Reporting Services report. The layout is up to you, but we find an effective way to present the information is in a matrix, with the time periods along the columns and conditional formatting to highlight periods of light or heavy use. In our example report, each cell shows the count of requests, with the background colour indicating how heavy the use is (as it's a test system, I'm considering 5 or more requests per hour to be heavy usage).
The SQL statement that produced this report is:
select usernameonly as username
, TimePeriod, DateText, DateYYYYMMDD
, sum(processingtime) as ProcessingTime, count(*) as Requests
from IISLogFilteredWithPeriods
where logtime > dateadd(d, -7, getdate())
group by usernameonly, TimePeriod, DateText, DateYYYYMMDD
Further thoughts – more analysis
The report provided in this post is limited to displaying the number of requests per user. Further ideas for analysis are:
- Using the processingtime data to get an idea of the processing load on the IIS server. This is a measure of the elapsed time from IIS starting to process the request to submitting the end of the response, and will be affected by the load on the IIS server. It can be useful for identifying when, and if, requests take a long time to complete
- Identifying the types of entities accessed. The target can help identify which entity has been requested. This takes a bit of parsing, but the topic ‘URL Addressable Forms and Views’ (http://msdn.microsoft.com/en-gb/library/cc150850.aspx) in the CRM SDK will help
- IIS logs are not useful for identifying the type of data operation (e.g. create, update, delete, assign) that has been performed. If you need this information, you would be advised to use a plugin/callout mechanism for the auditing – see http://blogs.msdn.com/crm/archive/2006/12/05/creating-solutions-a-custom-crm-usage-log.aspx for an example of this
- The techniques described in this post were developed for CRM 4, but also work on CRM 3. The only difference is that there is no need to extract the Organization name from the target on CRM 3
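As a starting point for the first of these ideas, a query along the following lines (illustrative only, using the views defined earlier in this post) highlights the users and periods with slow requests:

```sql
-- Illustrative: request counts plus average and maximum processing time
-- per user and time period, slowest periods first
select usernameonly, DateText, TimePeriod
, count(*) as Requests
, avg(processingtime) as AvgProcessingTime
, max(processingtime) as MaxProcessingTime
from IISLogFilteredWithPeriods
group by usernameonly, DateText, TimePeriod
order by MaxProcessingTime desc
```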