Being named an MVP this year is very exciting news for me. As I was trying to come up with the right words for a post, Stacey Toevs, who maintains our corporate Intranet site created this feature article for a series and internal Hitachi Consulting initiative called “Be Known”. I thought I’d share it with you here…
Microsoft’s Newest Most Valuable Professional
A rare honor was given to Paul Turley October 1: he was named a Microsoft MVP in the SQL Server domain. The Microsoft MVP Award is given annually to recognize exceptional technology community leaders worldwide who actively share their high-quality, real-world expertise with users and Microsoft. With fewer than 5,000 awardees worldwide, Microsoft MVPs represent a highly select group of experts and the diversity of today’s technical communities. “I believe there are only a handful of MVPs worldwide on the Microsoft SSRS reporting platform,” says Hilary Feier, VP on the high tech products team. “This is a tremendous accomplishment, and having an MVP on our team demonstrates to Microsoft our commitment and expertise on their platform.” Paul has a very active blog that illustrates his willingness to share his expertise and help others – a key MVP requirement. “MVPs share a passion for technology, a willingness to help others, and a commitment to community,” says Microsoft’s head of customer and partner advocacy, Richard Kaplan. “These are the qualities that make MVPs exceptional community leaders. By sharing their knowledge and experiences and providing objective feedback, they help people solve problems and discover new capabilities every day. MVPs are technology’s best and brightest, and we are honored to welcome Paul as one of them.”
It hardly makes sense to attempt to improve the thoughts of an all-star blogger, so we reproduce here some of Paul’s thoughts on blogging. For starters, Paul sees blogging as a natural progression in his career: “For me blogging is the next stage of technical community involvement. In the early 1990s, I was an independent consultant working at HP. I found myself doing a lot of Access and VB projects, where I got to know Access really well. To find answers, I would look on the public support forums and newsgroups, and while I was looking for answers to my questions, I read questions from others and realized that I had answers for them. Before long, I was spending more time answering questions and helping others purely in the spirit of service. After a couple of years, I guess I became somewhat of an authority on Access and was contacted through a forum to write a couple of chapters in a Wrox Press book. Nine years later, I’ve contributed to ten books. I started speaking at the SQL Server PASS global summit because I could attend the conference for free. Now I just enjoy being involved.
“Blogging is great fun, and it’s good for the core muscles – no, wait, that’s Pilates… I get confused. I’ve started and stopped blogging in the past but I didn’t really have a forum or a good reason to keep going. Hilary Feier challenged me to blog regularly and this gave me the traction to reach critical mass. Since March, I’ve had 3,330 visitors from 92 countries and I get regular email, which motivates me to keep at it.
“To come up with a solution for a technical problem, sometimes I just need to let it bake for a while, and later some creative solution will just hit me. When someone asks me a question at a client site, in a training class, or at a conference, I often tuck these things away in my mind. I love to help people, but I can’t be responsible for fixing everyone’s problems. Instead of promising to get back to an individual and then forgetting, when I get a chance I’ll post a blog with the solution. I get a sense of closure because I was able to resolve some nagging problem, but I didn’t have to meet a deadline or remember who to get back to.
“I get a lot of questions from people, mostly from technical people in the trenches – and a lot of them are from our clients and prospects. I’m not the only one at Hitachi that writes books or speaks at conferences. When I go to a client site, I see our books all over. When I go to a sales call with one of our DBDs, I often hear people say ‘I read your blog’ or ‘I read an article from one of your associates.’ They’ve already heard about us from multiple sources, and I think this gives us credibility over another consulting company they’re talking to.
“Like my Access forum work in the ’90s, I’ve spent a lot of free time more recently posting answers in the Microsoft Developer Network (MSDN) support forums. Before long, they made me a forum moderator, and then I received the MVP designation. In the past five years, I’ve been focusing my energy on SQL Server Reporting Services and BI. It’s important to build relationships. When we were working on the first Reporting Services book in 2003, I took the Microsoft product team leads out to lunch. Now I can pick up the phone or get email replies directly from the developers. I do a lot of this off of company time, but when I write, present at a user group, or speak at a conference, I always make a point to let people know who I work for. I’m there because of the smart people I surround myself with. You can’t volunteer because it will get you noticed or promoted. You have to do the right thing for the right reason. But, in making the effort, good things happen.”
Our SQL Server Reporting Services Recipe book was released in April. This is a cookbook style book for experienced SSRS designers with 63 individual report recipes, each addressing a specific business requirement or scenario. My co-author, Robert Bruckner, is a technical lead on the SSRS product team and a prolific blogger on the topic. The book is about 630 pages long and took about a year to write. The book takes a unique approach by providing recipes and step-by-step instructions to solve specific business problems.
I want to return the field value for a specific row to a textbox below a table in my report. To pull this off, I have to be a little creative, as there isn’t a simple way to return a value from a specific row in a dataset outside of a data region other than using the FIRST() or LAST() functions. One technique is to write a custom function and then pass values on each row of a table through it to capture the row ID and value that you want to return to a textbox outside the table. Here’s an example:
The report contains a parameter named RowNumber used to specify the row whose value I want to return.
In the report properties Code window, I’ve entered the following VB.NET code:
Private dRowValue As Decimal

' Pass each row's value through; capture it when the row matches the requested row ID.
Function SetRowValue(CurrRowID As Integer, ReturnRowID As Integer, Value As Decimal) As Decimal
    If CurrRowID = ReturnRowID Then dRowValue = Value
    Return Value
End Function

Function GetRowValue() As Decimal
    Return dRowValue
End Function
In the textbox in the detail row of my table that shows my field value, I use the following expression:
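As a sketch (the dataset and field names here, DataSet1 and SalesAmount, are assumed for illustration), the detail-row expression passes each row through the function:

```vb
=Code.SetRowValue(RowNumber("DataSet1"), Parameters!RowNumber.Value, Fields!SalesAmount.Value)
```

The textbox below the table then simply retrieves the captured value:

```vb
=Code.GetRowValue()
```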
There are certain keywords in a T-SQL query that cannot be parameterized, including the TOP clause. Consider the following attempt to parameterize the number of rows returned from the query results:
Select Top @TopCustomerCount
    F.CustomerKey, F.OrderDateKey,
    Case When DC.MiddleName Is Null Then FirstName + ' ' + LastName
         Else FirstName + ' ' + MiddleName + '.' + ' ' + LastName End CustomerName,
    Convert(Char(11), DT.FullDateAlternateKey, 106) OrderDate,
    Sum(F.SalesAmount) SalesAmount
From FactInternetSales F, DimTime DT, DimCustomer DC
Where F.OrderDateKey = DT.TimeKey
    And F.CustomerKey = DC.CustomerKey
    And Convert(Int, DT.CalendarYear) = @Year
Group By F.CustomerKey, F.OrderDateKey, DC.MiddleName, DC.FirstName, DC.LastName, DT.FullDateAlternateKey
Order By F.OrderDateKey Desc, CustomerName
This query will not work because the query designer can’t resolve the @TopCustomerCount parameter. To solve this problem, build an expression to concatenate the entire query, with the parameter values embedded as literals, like the following example:
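A sketch of such an expression (shortened here to a few columns for readability; the full query above would be concatenated the same way):

```vb
="Select Top " & Parameters!TopCustomerCount.Value _
& " F.CustomerKey, Sum(F.SalesAmount) SalesAmount " _
& "From FactInternetSales F, DimTime DT " _
& "Where F.OrderDateKey = DT.TimeKey " _
& "And Convert(Int, DT.CalendarYear) = " & Parameters!Year.Value & " " _
& "Group By F.CustomerKey " _
& "Order By SalesAmount Desc"
```

The expression is evaluated before the query is sent to the server, so the query designer never has to resolve the TOP value as a query parameter.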
This same technique can be used to resolve complex decision structures in code before building the SQL Statement string. For even more flexibility, create a custom function in the report properties Code window and use Visual Basic.NET code to build and return the entire query string.
A user has been added to an SSRS role for the report folder or report but is still prompted to log in. This can happen if Internet Explorer isn’t configured to pass the user’s Windows login to Intranet sites.
To rectify this, in Internet Explorer open Tools > Internet Options > Security tab
Change the settings for the Local Intranet Zone by clicking Custom Level
Near the bottom of the list of settings under User Authentication, select the radio button for the option titled: Automatic logon with current user name and password
Click OK to close and save the settings and then restart IE.
Creating one report to meet multiple business requirements can save a lot of work down the road. A common scenario is for different users to ask for similar reporting metrics that may just be variations of existing reports. Using SQL Server Reporting Services, you can achieve this goal with a little planning and creative report design.
The following technique in Reporting Services for SQL Server 2008 uses a parameterized expression to modify a dataset so that it returns a different set of columns. Conditional logic in the report is used to include only columns in a table which correspond to the columns returned by the query. The same technique will work in earlier versions of Reporting Services but the designer user interface will be a little different.
This example uses a parameter to change the stored procedure that will be executed to return the report data. Conditional query logic could also be applied to generate a dynamic SQL or MDX statement.
To begin, use SQL Server Management Studio to connect to the AdventureWorksDW2008 sample database and then execute the following script to create two new stored procedures:
create proc spResellerSales1 @DateFrom int, @DateTo int
as
select d.CalendarYear, d.CalendarQuarter, d.EnglishMonthName as Month
    , sum(fs.OrderQuantity) as OrderQuantity
    , sum(fs.SalesAmount) as SalesAmount
from FactResellerSales fs
inner join DimDate d on fs.OrderDateKey = d.DateKey
inner join DimProduct p on fs.ProductKey = p.ProductKey
inner join DimProductSubcategory ps on p.ProductSubcategoryKey = ps.ProductSubcategoryKey
inner join DimProductCategory pc on ps.ProductCategoryKey = pc.ProductCategoryKey
where d.DateKey between @DateFrom and @DateTo
group by d.CalendarYear, d.CalendarQuarter, d.EnglishMonthName, d.MonthNumberOfYear
order by d.CalendarYear, d.CalendarQuarter, d.MonthNumberOfYear
;
go

create proc spResellerSales2 @DateFrom int, @DateTo int
as
select d.CalendarYear, d.EnglishMonthName as Month
    , sum(fs.OrderQuantity) as OrderQuantity
    , sum(fs.SalesAmount) as SalesAmount
from FactResellerSales fs
inner join DimDate d on fs.OrderDateKey = d.DateKey
inner join DimProduct p on fs.ProductKey = p.ProductKey
inner join DimProductSubcategory ps on p.ProductSubcategoryKey = ps.ProductSubcategoryKey
inner join DimProductCategory pc on ps.ProductCategoryKey = pc.ProductCategoryKey
where d.DateKey between @DateFrom and @DateTo
group by d.CalendarYear, d.EnglishMonthName, d.MonthNumberOfYear
order by d.CalendarYear, d.MonthNumberOfYear
;
go
Note that the only significant difference between the spResellerSales1 and spResellerSales2 stored procedures is that the first one returns the CalendarQuarter column and the second one does not. Remember that this is a simple example for the sake of demonstration. You could have as many different procedures or query variations as you like as long as one of them returns all possible columns and the others return a subset of columns in the same order.
The report contains three parameters: two that are automatically generated from the @DateFrom and @DateTo parameters in the stored procedure, and a third, named DataSource, that I added myself. Note that I’ve provided default values for the DateFrom and DateTo parameters since they’re not really part of the demonstration scenario.
To populate the dataset Fields collection, configure the dataset to use a stored procedure and select the first procedure (the one that returns all of the columns) as you normally would, without using an expression. The DataSource parameter (not to be confused with a report data source) is used to pass and swap the stored procedure names in the report’s main dataset. Manually add this parameter to the report:
The available values for this parameter include the two stored procedure names:
This parameter is referenced in the report’s main dataset using the expression: =Parameters!DataSource.Value
A table is added to the report and each dataset field is added as a column to the table:
Since the CalendarQuarter column may not be available when the spResellerSales2 stored procedure is used in the parameterized dataset, this column must be hidden when the column is not returned in the result set. This is accomplished by changing the Hidden property of the table column. Right-click the column header for the Calendar Quarter column and choose Column Visibility… from the menu. Choose Show or hide based on an expression and then click the expression button to use the Expression Builder to create the following expression for this property: =Fields!CalendarQuarter.IsMissing.
Remember that there is a difference between a report dataset and the query that it references. When a query is written, (or in this case, when a stored procedure is selected) field definitions are added to the dataset object and will be there whether the query returns a column or not. If that query doesn’t return a corresponding column for the field, the field’s IsMissing property returns True.
You’re all done!
Preview the report and choose the first stored procedure in the parameter list. The report returns columns for all fields, including the Calendar Quarter:
Change the Data Source parameter to use the second stored procedure and click the View report button on the toolbar. Now the report returns columns for all fields except the Calendar Quarter:
A copy of this sample SSRS 2008 report is included for download.
Report designers often resort to overly complex report designs and use subreports unnecessarily. Building a report on a single dataset query that returns all related rows gives you greater flexibility to group and filter the data.
The following sample report contains a single dataset using the following query:
select 1 as StoreID, 'Freds Store' as StoreName
    , convert(date, '2009-01-15') as TransDate, 'Chicken' as Product, 14.99 as Price
union
select 1 as StoreID, 'Freds Store' as StoreName
    , convert(date, '2009-01-25') as TransDate, 'Beef' as Product, 19.99 as Price
union
select 1 as StoreID, 'Freds Store' as StoreName
    , convert(date, '2009-02-15') as TransDate, 'Chicken' as Product, 15.99 as Price
union
select 2 as StoreID, 'Bobs Store' as StoreName
    , convert(date, '2009-01-15') as TransDate, 'Tuna' as Product, 17.99 as Price
union
select 2 as StoreID, 'Bobs Store' as StoreName
    , convert(date, '2009-02-15') as TransDate, 'Chicken' as Product, 15.99 as Price
A list data region is grouped on StoreID
Table1 (in the list) has columns bound to TransDate, Product, and Price. The filter for this table is: TransDate Between ‘2009-01-01’ And ‘2009-01-31’.
Table2 (in the list) has columns bound to TransDate, Product, and Price. The filter for this table is: TransDate Between ‘2009-02-01’ And ‘2009-02-28’.
Select a table and choose Filters in the Properties Window to open the Tablix Properties dialog (in SSRS 2008.) This shows the filter for the first table:
*In this simple example, the date values are hard-coded, but they could be parameterized and/or based on expressions, such as using the DateAdd() function to compare the current month’s sales to the prior month’s.
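For example, a sketch of the first table’s filter set to the current calendar month (the functions shown are standard Visual Basic date functions available in SSRS expressions):

```vb
Expression: =Fields!TransDate.Value
Operator:   Between
Value 1:    =DateSerial(Year(Today()), Month(Today()), 1)
Value 2:    =DateAdd(DateInterval.Day, -1, DateAdd(DateInterval.Month, 1, DateSerial(Year(Today()), Month(Today()), 1)))
```

Value 1 evaluates to the first day of the current month and Value 2 to the last day, so the inclusive Between comparison covers the whole month.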
To create a “greenbar” report in a table with alternating row colors, you can simply use the RowNumber function to determine whether a row is odd or even. This also works with the newer Tablix-style matrix in SSRS 2008. However, the matrix data region in SSRS 2000 and 2005 doesn’t support the RowNumber function, so it requires a different technique. This can be accomplished using a simple embedded custom code function in the report.
Open the Report Properties dialog and create a Visual Basic.NET function in the Code page by entering the following code:
Private bOddRow As Boolean
Function AlternateColor(ByVal OddColor As String, ByVal EvenColor As String, _
        ByVal Toggle As Boolean) As String
    If Toggle Then bOddRow = Not bOddRow
    If bOddRow Then
        Return OddColor
    Else
        Return EvenColor
    End If
End Function
This function takes three arguments: the color name or numeric color string for odd rows, the color for even rows, and a Boolean (True or False) flag used to toggle between odd and even row modes. This function is called in an expression for the BackgroundColor property of every detail cell in the matrix and at least one row header cell.
For the group value that will be used to alternate row colors, edit that row group cell textbox properties and use the following expression for the BackgroundColor property.
=Code.AlternateColor("AliceBlue", "White", True)
Of course, you can use whatever colors you want for the odd and even background colors. These can be either standard web color names, like those you see in the SSRS color property list, or a six-character hexadecimal color value preceded by a pound sign.
For each cell textbox in the data region of the Matrix, enter the same expression for the BackgroundColor property, only change the third argument to False:
=Code.AlternateColor("AliceBlue", "White", False)
That’s it. When you run the report, the background color will change every time the group value changes.
How does it work? The matrix is rendered like a typewriter (for those of you born after the dawn of the Information Age, that’s a device we used to use to fill out forms and send text messages). The cells are rendered from top to bottom and from left to right, returning to the left-most cell on the next row after it’s done with the row above. When the renderer encounters a new group value, it creates a new instance of the row group cell, which toggles the private bOddRow Boolean variable. On the first row, it gets flipped from False to True (we’re on row #1, which is an odd number). Each cell on that row gets set with the odd row color. This continues until there’s a new group value, which flips bOddRow to False (row #2 gets the even color), and so on.
Detailed step-by-step instructions for this technique may be found in the Wrox Press book: Professional SQL Server 2005 Reporting Services, Chapter 7, pages 272-277.
As we have endeavored to solve various business problems, we’ve learned to do some interesting things with Reporting Services. On consulting engagements, I often find myself in front of a client who is asking questions like “Can you do this or that?” Almost inevitably, the answer is “yes,” but the question becomes what the best method would be to meet the requirement. With a little outside-the-box thinking, a lot of interesting things are possible. This may involve some custom programming, embedding report items, or using custom application components in concert with Reporting Services.
In the following article, I’ve compiled a description of reporting challenges and solutions we’ve encountered when developing reports for our clients. For each “solution recipe,” I provide a brief list of skills, techniques, and resources needed to apply the report feature. This should give you a good idea about how prepared you may be to use the techniques based on your skill set and the level of complexity. Some of these are easy to duplicate and others require more advanced skills, which may include Transact-SQL and Visual Basic programming. These are not intended to be exercises or step-by-step instructions. I have made a point to provide enough information to demonstrate the concepts and techniques.
Creating a Business Scorecard
Microsoft Office Business Scorecard Manager is a Back Office product that integrates with SharePoint Portal Server and may be used to bridge the features of several Microsoft products, such as SQL Server Analysis Services, Excel, Visio, and Office Web Components. BSM is primarily designed for the nonprogrammer, power-user and requires little technical expertise to create dashboard-style reports. Using Reporting Services and a little programming savvy, it’s possible to duplicate some of the BSM features without the product.
This type of reporting scenario has quickly become a mainstay in enterprise business applications. Also known as executive dashboards, business scorecards provide summary level progress and success status information for business leaders.
You’ll need the following:
A query expression with data-based or calculated target, budget, variance and actual values
A multigroup table with drill-down features
Small images for use as progress indicators
An expression used to translate KPI and target values to indicator images
To understand and appreciate the value of this type of reporting interface, you need to walk in the shoes of corporate business leaders. A typical corporate officer deals with a lot of people and a variety of information in a day, and often needs to make immediate decisions based on this information. Moving from meeting to meeting, transaction-level details are too granular for most decisions. Business leaders need to know how the business is performing overall and whether there are areas of concern or notable success. I’ve sat in a lot of meetings with a General Manager or Director sitting on one side of the table and subject experts on another. The officer begins by saying, “So, how are we doing?” The subject expert gives a lengthy presentation, stepping through PowerPoint slides, charts, graphs, and diagrams that depict trends and variances based on mountains of data. After the presentation, the officer concludes with the question: “So, how are we doing?” Scorecards and dashboards answer this all important question using succinct summary values and simple graphical, symbolic progress indicators.
Although simplification is a key concept, scorecards go beyond just keeping reports simple. Trends and success indicators should be clear and easy to understand but should provide an avenue to discover more detail and to view related trends and summaries. These objectives are easily achieved using drill-down and drill-through report features.
Targets and KPIs
These are the fundamental concepts behind business scorecards. For any given measurement, a target is simply an objective value. Targets are often data-driven values like a Budget, Quota, Baseline, or Goal. A KPI, or Key Performance Indicator, is a set of thresholds used to compare actual values with the target. KPIs may define banding indicators that signify a range of variances, like poor, acceptable, and exceptional performance. KPI thresholds may be a single point corresponding to the target, or percentage or fixed-increment offsets with any number of indicator bands.
When considering longer-term trends, you may want to recognize the difference between a positive trend using a KPI and whether or not a value represents a successful outcome, as a KSI (Key Success Indicator.) For example, sales for a particular product may have been unprofitable since it went on the market. If sales are rising, a KPI would show positive sales growth while a KSI would indicate that the company is still in the red. We might simply define two targets, one to measure short-term progress and the other to measure overall profitability.
Indicators are graphical icons, representing the state of an actual value with respect to a KPI band. On the scorecard, corresponding indicator icons might be red, yellow, and green symbols. Indicators are typically common symbolic metaphors like traffic lights, colored shapes, progress bars, gauges, and directional arrows. Figure 1 shows some common indicator graphics embedded in a sample report.
Variance is the difference between an actual and target value. If time variances will be used extensively, the queries used to make these calculations can be very intensive. Aggregating and calculating sales totals, for example, for a given month over last month, quarter, or year can require some heavy-duty query processing (even with a modest number of detail rows.) Ideally this type of data should be stored in a data mart or data warehouse with precalculated variance values stored in the database. The AdventureWorksDW database contains some preaggregated summary values, but as you can see, even for this simple report with only year-over-year variances, the query is fairly complex.
SELECT ThisYearSales.SalesTerritoryRegion, ThisYearSales.CalendarYear
    , ThisYearSales.FirstName + ' ' + ThisYearSales.LastName AS EmployeeName
    , SUM(ThisYearSales.ExtendedAmount) AS ExtendedAmountSum
    , SUM(ThisYearSales.SalesAmountQuota) AS SalesAmountQuotaSum
    , SUM(LastYearSales.ExtendedAmountSum) AS ExtendedAmountSumLastYear
FROM (SELECT DimSalesTerritory.SalesTerritoryRegion, DimSalesTerritory.SalesTerritoryKey
        , DimEmployee.EmployeeKey, DimEmployee.FirstName, DimEmployee.LastName
        , DimTime.CalendarYear
        , FactResellerSales.ExtendedAmount, FactSalesQuota.SalesAmountQuota
    FROM DimEmployee INNER JOIN FactSalesQuota
        ON DimEmployee.EmployeeKey = FactSalesQuota.EmployeeKey
    INNER JOIN DimTime ON FactSalesQuota.TimeKey = DimTime.TimeKey
    INNER JOIN FactResellerSales
        ON DimEmployee.EmployeeKey = FactResellerSales.EmployeeKey
        AND DimTime.TimeKey = FactResellerSales.OrderDateKey
    INNER JOIN DimSalesTerritory
        ON DimSalesTerritory.SalesTerritoryKey =
            FactResellerSales.SalesTerritoryKey) AS ThisYearSales
LEFT OUTER JOIN (SELECT DimTime.CalendarYear, FactResellerSales.EmployeeKey
        , DimSalesTerritory.SalesTerritoryKey
        , FactResellerSales.ExtendedAmount AS ExtendedAmountSum
    FROM FactResellerSales
    INNER JOIN DimTime
        ON FactResellerSales.OrderDateKey = DimTime.TimeKey
    INNER JOIN DimSalesTerritory
        ON DimSalesTerritory.SalesTerritoryKey =
            FactResellerSales.SalesTerritoryKey
    ) AS LastYearSales
    ON LastYearSales.CalendarYear = ThisYearSales.CalendarYear - 1
    AND ThisYearSales.EmployeeKey = LastYearSales.EmployeeKey
    AND ThisYearSales.SalesTerritoryKey = LastYearSales.SalesTerritoryKey
GROUP BY ThisYearSales.SalesTerritoryRegion, ThisYearSales.SalesTerritoryKey
    , ThisYearSales.CalendarYear, ThisYearSales.LastName, ThisYearSales.FirstName
ORDER BY ThisYearSales.SalesTerritoryRegion, ThisYearSales.CalendarYear
    , ThisYearSales.LastName, ThisYearSales.FirstName
When running complex queries like this one, you may need to increase the default connection timeout setting on the data source. The default setting is 15 seconds, which may not be sufficient for this query on all hardware. In a production application with data volumes greater than the sample database, I would recommend testing query performance and possibly using an Analysis Services database with cubes and precalculated aggregates. To populate the data warehouse, you will use queries similar to this one and store the results for later retrieval.
Figure 2 shows a simple table with two groups: on the SalesTerritory and CalendarYear fields. This table is much like several previous examples. The detail row is hidden by default allowing for drill-down using the SalesTerritoryRegion textbox. Two more images will serve as indicators. These are based on expressions used to change the indicator image.
You will notice that the images have a white background even though I’ve used background colors to separate the rows. I’ve only done this to simplify this example. I have simply added the images to the cells in the table header. If you want to use transparent images over a colored or shaded background, you will need to add rectangles to the header cells and then place images in the rectangles. This way, you can set the BackgroundColor property for each rectangle and take advantage of the image transparency. The final example, shown in Figure 6 (on the final page of the article), uses this technique to fill the background color behind the scorecard indicator images.
Looking at the columns with text headers, the first column contains the SalesTerritoryRegion field in the first group header and the CalendarYear field in the detail row.
The second column contains the EmployeeName in the detail row.
The third text column is for the SalesAmountQuota field. The header uses the SUM() function to aggregate the details for the sales territory.
The fourth text column contains total sales values, using the ExtendedAmount field.
The last column of textboxes, labeled Yr. Variance, calculates the total sales amount annual variance. In the header row, the expression uses the SUM() function. In the detail row, the SUM() function is omitted.
For variety, I’ve resolved the second indicator column images using only an in-line expression, rather than using a custom function. This is the expression for the header row. The detail row expression is the same but excluding the SUM() function. As a rule, after I’ve decided to use custom code, I’ll typically continue to use custom functions for all but the simplest expressions so I can keep business logic in one place.
The following is intended as all one line but will not display that way with this page width online.
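A sketch of that expression, using the field names from the scorecard query (the image name “exclamation” is an assumption for illustration; “nothing” is the blank placeholder image described next):

```vb
=IIF(SUM(Fields!ExtendedAmountSum.Value) < SUM(Fields!ExtendedAmountSumLastYear.Value) * 0.8, "exclamation", "nothing")
```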
This expression returns the name of an exclamation mark icon image when this year’s sales amount is less than 80 percent of last year’s. I created an image file called “nothing,” which is a blank icon with a white background. Using this image effectively displays nothing in the image cell.
Synchronizing Charts and Related Report Items
One of the great advantages to the scorecard approach is that all the information is presented in a concise page. In order to make the best use of screen space, I can use a separate report item to show content related to the item selected in the scorecard.
Figure 4 shows the table and chart items. When I select a sales territory, by clicking on the small pie chart icon in the first column, I want to see sales trend information in a column chart. I’ve placed a chart to the right of the scorecard and have configured it as a column chart. I’ve also simplified the chart by removing the legend.
The chart content is synchronized using a report parameter. The SalesTerritoryKey parameter is used to filter the dataset providing data to the chart. The SalesTerritoryRegion parameter is used to provide a title value for the textbox above the chart. Figure 5 shows the Navigation properties for the pie icon used to synchronize the chart. Note that the Jump to report property is set to navigate back to this report, re-rendering the same report with the new parameter values.
Figure 6 shows the rendered report with some region sections drilled open. I’ve clicked the pie icon next to the Southwest region to synchronize the chart and view sales trend details for the southwest region. Again, note the background color fill behind the scorecard indicator images using the technique I mentioned earlier.
One of the first things you’ll probably notice is that the quota and total sales values are not very close at all. These values have actually changed in different editions of the AdventureWorksDW sample database as it progressed through beta testing cycles. For this report example to be more realistic, you can manually change some of the SalesAmountQuota values in the FactSalesQuota table. The completed sample code on the book’s support site, at P2P.WROX.COM, contains a script to update all of the quota values.
This article is the second in a series of three articles from Paul Turley excerpted from the 90-page chapter, “Report Solution Patterns and Recipes” from the book Professional SQL Server 2005 Reporting Services, written by Paul Turley, Dave DuVarney, James Counihan, and Todd Bryant (Wrox, 2006, ISBN: 0-7645-8497-9). The full chapter contains a deep analysis of successful report projects, including project profiles, success factors, user discussions, and scope management. Templates are provided to assist requirement gathering and management. The Report Recipes section of this chapter includes 19 specific examples of advanced report designs that implement custom coding and other techniques taught in previous chapters. This series of articles demonstrates three of these examples. The first article was Greenbar Reports and the final installment will be Creating Sparklines. Hitachi Consulting has business and IT consulting offices throughout the world. Paul works for Hitachi’s Pacific Northwest Business Intelligence practice in Seattle and may be contacted at email@example.com.
Chart reports are an effective way to visualize data, showing trends and comparisons in a way that helps users get real meaning from numbers. However, chart data can be meaningless without context. This example will show you how to display a column chart with a goal line so users can easily understand whether charted values are above or below an acceptable threshold. I provide three versions of this report: a simple example using a SQL query to demonstrate the essentials, one with an MDX query, and a copy of the latter with a parameterized threshold and dynamically colored columns to show data points above or below the goal.
This is a report design that you can almost do with standard report design features, but it's that last little step, that last critical piece, that takes a little creativity to solve. When I was first challenged to design this report for a consulting client, I knew that it could be done but I just wasn't sure how. I grew up in Washington State in the 1970s with the legend of Bigfoot. For a while, there were Bigfoot sightings every week and everyone knew someone who claimed to have seen Bigfoot (or had seen someone who knew someone who had claimed to have seen Bigfoot). We knew he was out there in the remote wilderness but we weren't sure where. This was a little like that: the solution was out there and I just needed to find it.
This technique falls into a category that I need to give credit for before I mention it. Chris Hays, who is responsible for the Report Definition Language specification on the SSRS product team, coined the phrase "sleazy hack" in the title of his blog site a few years ago. With credit to Chris for the term, the report design technique showcased in this recipe is definitely in the sleazy hack category.
Reporting Services 2008
What You’ll Need
Column, area or line chart
Experience with VB expressions
Designing the Report
This solution is more about technique than complexity. The chart you see in Figure 1 has some simple design features you'd expect to be fairly easy to define, but this is a little challenging to do right out of the box. In SSRS 2008, creating multi-series reports like this one, containing columns and a line, is standard fare. However, the natural behavior of all charts is that each column or line segment is plotted from the center of the data point along the axis. This means that since the line and columns in this chart have the same range of category (X-axis) values, the line would begin in the center of the left-most column and end in the center of the right-most column instead of running from edge to edge of the chart container. In fact, no matter what the minimum and maximum values are, the chart normally provides a margin of space on each side of the axis. You'll see this and the resolution as we work through the report design.
Let's get started. The first bit of "hack" is in the query. In order for the line to extend beyond the range of column center points, it must begin and end with lower and higher scale values than the respective column points. This means that we need to work some magic in the query and add two extra values to the scale.
1. Create a new report
2. Add a data source for the AdventureWorksDW2008 database.
3. Create a dataset and enter the following SQL script:
select
    Category as CategoryGroup
  , Qty
  , Target
  , RANK() over (order by OrderNum, Category) as OrderValue
from (
    select Null as Category, Null as Qty, 12000 as Target, 0 as OrderNum
    union all
    select
        pc.EnglishProductCategoryName as Category
      , Sum(fis.OrderQuantity) as Qty
      , 12000 as Target, 1 as OrderNum
    from
        FactInternetSales fis inner join DimProduct p
            on fis.ProductKey = p.ProductKey
        inner join DimProductSubcategory ps
            on p.ProductSubcategoryKey = ps.ProductSubcategoryKey
        inner join DimProductCategory pc
            on ps.ProductCategoryKey = pc.ProductCategoryKey
    group by pc.EnglishProductCategoryName
    union all
    select Null as Category, Null as Qty, 12000 as Target, 2 as OrderNum
) as s;
Execute the query and view the results:
You should notice that we have added two fictitious rows to the results on the first and last records, based on the OrderValue sort order. Notice also the lack of values for the CategoryGroup and Qty columns, but the Target column contains values in these rows. This provides the basis for the goal line to extend beyond the range of points plotted by the column chart.
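With the standard AdventureWorksDW2008 data (only Accessories, Bikes, and Clothing have Internet sales), the result set has this shape; the Qty figures are placeholders, not actual values:

OrderValue   CategoryGroup   Qty     Target
----------   -------------   -----   ------
1            NULL            NULL    12000
2            Accessories     (qty)   12000
3            Bikes           (qty)   12000
4            Clothing        (qty)   12000
5            NULL            NULL    12000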
Build the chart and define the groups and series data fields.
1. Add a new chart to the report body.
2. Choose a standard column chart type.
3. Drag and drop or select the OrderValue field in the Category axis.
4. Drag and drop or select the CategoryGroup field as a second field in the category axis.
5. Click the chart in the designer to show the data field and axis field drop zones.
6. Add the Qty and Target fields to the data fields drop zone to define the data series.
7. Right-click the Target field and change the chart type for this series to a Line chart.
8. Remove the axis labels.
Cosmetic changes can be applied as appropriate. These may include the line chart weight and color. The report in the designer should look similar to Figure 3.
Preview the report and compare the result to Figure 4. You'll notice the empty columns for the OrderValue field values 1 and 5. Note how the line chart naturally extends from the center of the first column position to the center of the last. Again, the requirement is for this line to extend from margin to margin.
Don't worry about the axis labels for both of these groups. When we change the axis value range, the labels for the inner group will disappear. Before we do this, we need to show the CategoryGroup field values on the outer group instead of the numbers:
1. In the Category axis fields, edit the OrderValue group properties.
2. In the Category Group Properties dialog, set the Label property to use the CategoryGroup field.
Now we'll work our magic on the line chart margins.
To eliminate the extra space around the chart, we'll set the left and right axis margins to be one-half of a scale value from the first and last column positions.
1. Right-click on the Category axis labels at the bottom of the chart and open the Axis Properties dialog.
2. Edit the axis scale Minimum property and set it to the following expression:
=MIN(Fields!OrderValue.Value) + 0.5
3. Edit the axis scale Maximum property and set it to the following expression:
=MAX(Fields!OrderValue.Value) - 0.5
4. Accept these changes and preview the report, comparing your results with Figure 6.
At this point, you have a working chart report with the essential features. The goal line, which is actually a line chart, extends from margin to margin. Even though we added extra values to the beginning and end of the axis range to make this work, now the columns fill the entire chart area.
Adding Dynamic Color
We can use a dynamic fill color to bring attention to series values that are above or below the goal line. Columns under the goal will be red and columns at or above the goal will be lime green. To enable this feature, follow these steps:
1. Click once on the chart to show the field drop zones.
2. Right-click the Qty field series and open the Series Properties dialog.
3. On the Fill page, set the Fill style property to Solid.
4. Next to the Color drop down, click the Expression button (fx) to open the Expression dialog.
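5. Enter a conditional color expression. A sketch along these lines compares each quantity to the goal; the color names "Red" and "Lime" are my choices to match the description above, and any valid color values will work:

=IIF(Fields!Qty.Value < Fields!Target.Value, "Red", "Lime")

Preview the report; columns that fall short of the goal line should now render in red and the rest in lime green.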
Charts in Reporting Services offer a lot of design flexibility and opportunities to customize the behavior and visual presentation of data. However, with these capabilities come a myriad of properties that can be a bit daunting to navigate and find in the design interface. This report design may not be quite as elusive as Bigfoot but it does solve a specific problem and provide real business value.
When exploring new features and design patterns, I find it useful to save versions of my chart reports in different stages of development. This way I can revert to a working state if I get myself into trouble. With some practice and perhaps a few late nights of experimentation, you will develop patterns that work best for your users and the data they need to visualize.
What lessons have we learned about managing report projects and what are the best practices?
Since each report is stored in a separate XML-based RDL file, Reporting Services projects play well with version control. Visual Studio Team System and Visual SourceSafe integrate with the Visual Studio/BIDS shell, and third-party source control solutions like SVN may be managed from the file system. Reports that have subreports, drill-through reports, or other dependencies should be managed in a BIDS project. Report Builder 2.0 or 3.0 may also be used, but designers should manage their reports on the server rather than in the file system. It's best to use one tool or the other for any given report.
The deployment approach need be no different from your application development build process, but it shouldn't be overcomplicated. Overall, experience has proven that report projects are best kept simple because you typically don't have the same level of component interdependencies that you would in development projects.
In my experience, there are two different approaches that may work depending on your team structure and methodology. You can manage the reports through all deployment phases from Visual Studio. This puts the developer/report designer in the position to manage reports coming out of testing and into production, which may not always be ideal in a formal IT environment. The other approach is to have the developer/report designer deploy reports to a dev server and then for a deployment manager to capture and manage the RDL files in file system based version control, using scripts to deploy to QA, UAT & Production.
Keeping report definitions in projects and the server(s) synchronized can be tricky but it’s manageable as long as you set ground rules for the team, use version control and conduct regular status checks. RDL files don’t have build numbers so it’s important to keep one official copy of the most recent RDL in a designated project or folder. The RDL Scripter, mentioned in another blog post, can be a useful tool for cloning deployed reports to the file system.
How should date type parameters be prompted? There are two common approaches depending on how you want the parameter to appear. Keep in mind that in a cube, a date attribute is typically not stored or presented as a date type. So you can either present the parameter as a hierarchy or as a date. In the prior example, if you were to parameterize the query as:
[Measures].[Internet Sales Amount] on columns
, PERIODSTODATE([Date].[Calendar].[Calendar Year], @ToDate) on rows
…the parameter must return the string value in this format:
[Date].[Calendar].[Date].[February 5, 2004]
Unfortunately the MDX query designer is not very good at letting you customize a query, so you have to work with and around its tendency to take over the process. It often will not allow you to simply reference a parameter using syntax that it didn't build for you. I recommend using the graphical builder to create a simple query that will return all the members in the order you will want to see them in your final, customized query. If you want the date parameter to appear as a hierarchy, use the query builder to add a filter using a user hierarchy and check the Parameter checkbox in the filter pane. This will create a separate MDX dataset to drive the parameter list.
Next, either switch the query editor to text edit mode or replace the query using an expression. I find it more flexible to build string expressions to work around the MDX query builder’s quirkiness. The expression version of the query would be:
The lines are concatenated for readability; make sure each line is separated with a space or carriage return. This dataset is good to go because the parameter was already built by the graphical designer.
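As a sketch of that string expression, assuming the cube is named [Adventure Works] (the standard sample cube name; substitute your own):

="SELECT [Measures].[Internet Sales Amount] on columns, "
& "PERIODSTODATE([Date].[Calendar].[Calendar Year], "
& Parameters!ToDate.Value
& ") on rows "
& "FROM [Adventure Works]"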
If you want to present the user with a real date type parameter, which will utilize the date picker control, you will need to pass the resulting parameter value into a function or expression to build the date member string value. To do this in an embedded function, open the Report Properties dialog and type this code into the Code page:
Function DateToMember(TheDate As Date) As String
    Dim sDateString As String = Format(TheDate, "MMMM d, yyyy")
    Return "[Date].[Calendar].[Date].[" & sDateString & "]"
End Function
…and then use the function in the dataset expression:
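As a sketch, assuming a date-type report parameter named DateParam (the parameter name is hypothetical), the function call looks like this wherever the member string is needed in the query expression:

=Code.DateToMember(Parameters!DateParam.Value)

Concatenate this into the dataset's query string expression in place of the hard-coded date member shown earlier.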
Subscriptions in Reporting Services are designed to run on a schedule and there isn't a built-in feature to just run a subscription manually.
This is actually quite easy to do with a little bit of TSQL or .NET programming code. You can either use the SSRS web service API via code or command line script, or run the SQL Agent job using the sp_start_job system stored procedure. In either case, you will need the SubscriptionID GUID assigned to the SQL Server Agent job that was generated when the subscription was created. This is stored in the ReportServer Subscriptions table. Each SubscriptionID has a corresponding SQL Server Agent job named using the same GUID.
To find the subscription Agent jobs, open SQL Server Management Studio and connect to the relational instance hosting the report server content databases. Expand the SQL Server Agent jobs. Each job will have one step and will be scheduled to run at the time specified in Report Manager. Subscriptions have a TimedSubscription event type and reference the report's ItemID key in the ReportServer database Catalog table. You can often find the right job just by matching the scheduled execution time. Copy the job name GUID to the clipboard, but do not modify the job or its step.
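If matching by schedule is awkward, a lookup query along these lines against the ReportServer database lists subscriptions alongside their report paths (table and column names follow the SSRS 2008 schema; verify them against your version before relying on this):

USE ReportServer;
SELECT s.SubscriptionID, c.[Path] AS ReportPath, s.Description, s.LastRunTime
FROM dbo.Subscriptions s
    INNER JOIN dbo.[Catalog] c ON s.Report_OID = c.ItemID;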
After you have the SubscriptionID/job name, running the subscription is the easy part. In a T-SQL script, just execute this command, substituting your GUID:

USE msdb;
EXEC sp_start_job @job_name = 'D4EF7E0D-7FFB-47A5-A87A-D63F2D242A06';
You can use an ADO.NET SqlCommand object to execute this code from a web form or application.
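A minimal VB.NET sketch of that call follows; the connection string is a placeholder and the GUID is the subscription's job name from above:

Imports System.Data
Imports System.Data.SqlClient

Using cn As New SqlConnection("Data Source=MYSERVER;Initial Catalog=msdb;Integrated Security=SSPI")
    Using cmd As New SqlCommand("sp_start_job", cn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.AddWithValue("@job_name", "D4EF7E0D-7FFB-47A5-A87A-D63F2D242A06")
        cn.Open()
        cmd.ExecuteNonQuery()
    End Using
End Using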
The programmatic technique uses the SSRS web service API or the rs command line utility (which is actually a console wrapper around the web service interface). Using the command line option, the FireEvent method with the TimedSubscription argument will start the subscription using the Agent job name GUID:
Dim rs As New ReportingService()
rs.Credentials = System.Net.CredentialCache.DefaultCredentials
rs.FireEvent("TimedSubscription", "D4EF7E0D-7FFB-47A5-A87A-D63F2D242A06")