Power Platform World Tour Vancouver Presentations

These are copies of my presentation decks from the sessions today at the Power Platform World Tour event in Vancouver, BC. I appreciate all who attended.

The Charticulator session was originally presented in July at the Power BI Symposium in Portland and here is the recording of the session:

Please feel free to contact me with questions or feedback using comments in this post.

Power BI Project Good and Best Practices

Paul Turley
Power & Data Platform MVP, Principal Consultant, Pragmatic Works

Preface: This post is a set of guidelines that address many aspects of new projects. Although I feel strongly that many of my recommendations are correct and applicable in most use cases, other experienced practitioners have valuable ideas and I welcome their input. Please post your comments, links to other resources and ideas that may be different from mine. This will be a living document with ongoing revisions, with the goal of providing a comprehensive best practices guide as the platform continues to mature and as experts continue to use it.

Revision: 8/25/2019 – cleaned-up & added checklist
Revision: 9/7/2019 – added Security section & corrected Last Refresh Date/Time

I find there are so many things to remember when starting a project that a checklist is handy. I’ve been collecting the following as notes for some time. Bear with me as I work on consolidating this article into a concise checklist.

The purpose of this article is to outline a set of guidelines and recommended practices for managing Microsoft Power BI projects. This guide is primarily focused on the work performed by the IT-managed BI Solution Developers for formally-managed BI projects.

Why “Good” and “Best”?

Power BI is a tool and platform that serves two masters: IT developers and business data users. As such, some practices are absolute best practices that should be followed with rare exception. Other recommendations are conditional guidelines, with possible exceptions. The flexibility of Power BI makes it possible for industry practitioners to develop their own solutions to apply design patterns in the absence of a specific product feature (like source control, for example).

This guide is largely based on my experience and on discussions with other members of the MVP community, clients and fellow consultants. Most of the information in this guide is based on the consensus of the parties with the greatest experience with the Power BI platform. These parties include:

  • Members of the Microsoft MVP Program (Data platform and Business Applications focus) who participate in daily interactions with the Power BI product development and leadership teams at Microsoft.
  • Microsoft certified solution partners who offer consulting, training and value-added services to Microsoft product customers.

Power BI has come a long way in a short time since the Power BI service and Power BI Desktop were introduced in 2015. At that time, the emphasis was to enable self-service business users to analyze and visualize data in small-scale, ad hoc reports. As Power BI grew into the enterprise space, it has been adopted by IT-managed data governance and solution development teams. Power BI is displacing – and being used in concert with – conventional IT-centric Business Intelligence technologies that have been evolving for decades (such as SQL Server Analysis Services, Reporting Services and Integration Services). With this history and the rapid adoption of the platform, standard design practices will continue to evolve.

Two discrete audiences and roles should be considered for Power BI:

  • Business Data Analyst
  • IT-managed BI Solution Developers

Consider these roles, with respect to both the solution developer(s) and users, as you begin to plan a new project.

Business Data Analyst

Members of this role are part of a rapidly growing segment of the industry. They understand how to use software tools and understand data manipulation and reporting within the confines of their area of business focus. Traditionally, data analysts were data consumers who used financial spreadsheets; now they use advanced data preparation tools to perform reporting and deeper data analytics. Business Data Analysts generally fall into two subcategories:

  • Advanced Data Consumers
    These users explore data using spreadsheets, pivot tables and chart-style reporting tools that are served by existing multidimensional cubes and data models. They do not write code; they work with IT/BI Developers, Data Modelers or Advanced Data Analysts to collect and model the data they use to create their reports.
  • Data Modelers, Advanced Data Analysts and Data Scientists
    These advanced users perform their own data mashups, produce data models and write limited code to create solutions for their department or business area (using Power Query, DAX, R, Python, etc.). They do not architect and develop governed and certified datasets for enterprise-level business use.

IT-managed BI Solution Developers

Enterprise-scale BI solutions are typically created by a team of specialists who may include:

  • Business Systems Analysts
  • BI Solution Architects
  • Project Managers
  • Data Transformation (ETL) Developers
  • Enterprise Data Modelers
  • Measure code experts (using DAX)
  • Report Developers
  • Operations & Deployment Engineers

Power BI may be used alone or with other products to create BI data transformation queries, data models, reports and dashboards. IT-managed BI scenarios include:

  • Complete BI solution design, development and deployment, consumed by end users.
  • Data models created and deployed to support user ad hoc report design.
  • Support Advanced Data Analyst data transformation, data modeling & report design
  • Migrate advanced user created data models to governed/certified datasets, transferring ownership to IT & operations

Project Types – simplified

For simplicity, this guide refers to Power BI projects as either “Formal” or “Informal” using the following definitions:

  • Formal projects are scoped, funded, staffed and executed with the collaboration of a business champion and stakeholders; and IT Business Intelligence developers and data managers. These projects promote business and IT-governed datasets and certified reports.
  • Informal projects are executed by business users and are considered ad hoc in nature. Datasets are generally not IT governed, and reports are typically not certified.
  • Hybrid projects can be anything in between. They might be a user-authored report that uses a published, certified dataset for self-service reporting. Informal, self-service datasets can be migrated to governed datasets in collaborative IT/business projects.

Again, this guide is primarily focused on the work performed by the IT-managed BI Solution Developers for formally-managed BI projects. Topics included in this guide:

  • Solution Architecture
  • Managing Power BI Desktop Files
  • Datasets and Reports
  • Version Control & Lifecycle Management
  • Workspace and App Management
  • Data Model and Power Query Design Guidelines
  • Dimensional Design
  • Query Optimization
  • Managing Dataset Size with Parameters
  • Implicit and Explicit Measures
  • If Users Need Excel, Give them Excel
  • Certified & Shared Datasets
  • Enterprise Scale Options

Solution Architecture

All Business Intelligence projects involve the same essential components including:

  • Source Queries
  • Data transformation steps
  • Semantic data model
  • Calculations (typically measures)
  • Data visualizations that reside on:
    • reports
    • report pages
    • dashboards

These components can be built using different products and development tools. Lengthy discussions can be had debating the nuances and merits of the different options, but a short summary of recommendations follows. To summarize these options in simple form, consider the following:

Single PBIX File

Build all queries, transformations, the data model, measures, reports and pages with Power BI Desktop in a single PBIX file and deploy it to the Power BI cloud service to be shared with users.

Recommendation/Cost/Benefit:
This option is most suitable for Business Data Analysts, erring on the side of simple design. It is typically optimal for small, informal self-service projects where a desktop business user needs to perform their own data mashups and quick, ad hoc analysis. Risks include unknown data integrity and the use of uncertified dataset results. Informal projects can effectively serve as prototypes for formally developed and managed dataset development. Single-file projects can be migrated to two-file dataset/report projects with some rework required for the reports, depending on report complexity.

Separate Dataset & Report PBIX

In formal projects, use Power BI Desktop to author a separate “dataset” PBIX file containing the queries, transformations, data model and measures, which is deployed to the Power BI cloud service. Develop one or more separate “report” PBIX files that connect to the published dataset in the service. All report pages and visuals reside in the “report” file. Some additional measures can be created in the report file.

Recommendation/Cost/Benefit:
This pattern is recommended for most projects where data governance is important and where new report features are likely to be requested in the future. By separating the dataset from report development, the data model design can be locked down and managed by a data steward. This approach gives report designers freedom to add visuals and make report enhancements without risking the stability and integrity of the data model, which may be serving data to other reports.
Development and management of a two-file project is a little more cumbersome because the dataset must be redeployed before changes are available in reports.
Self-service report users who need to explore and visualize data can use a governed and certified data model. Changes and enhancements should be coordinated between the report designer and the data steward or IT resource managing the data model.

SSAS with Power BI

Taking the “traditional route” in formal projects, building the data model in SSAS (hosted either on-premises or in Azure Analysis Services) and then creating reports and visuals with Power BI, was a necessity until enterprise-level features were added to the Power BI platform in 2017-2018. Although there are still some good reasons to use SSAS rather than Power BI to build the data model (called a “dataset” in Power BI), Power BI can be the more cost-effective choice today. Power BI and SSAS each offer unique capabilities, so neither option is clearly better in every case.

You can develop the source queries, transformations, data model and measures in an Analysis Services (SSAS) project using Visual Studio (also called SQL Server Data Tools or SSDT) and in turn deploy the SSAS database/semantic model to an on-premises SSAS instance or Azure Analysis Services (AAS) cloud service. Connect new Power BI reports to the deployed SSAS model using a direct connection. SSAS/AAS can handle data models with more than the 1 GB, 3 GB & 12 GB compressed data size limits of Power BI. The SSAS development environment can be more difficult to navigate and the model designer can be unstable. When the VS/SSDT designer crashes, developers can lose work even if they are routinely saving their work.

Recommendation/Cost/Benefit:
Favor using Power BI. It typically takes considerably longer to develop data models in Visual Studio/SSDT than in Power BI Desktop, and this project architecture is no longer the primary recommendation for BI solution development. Microsoft is directing most of its development and support resources to Power BI rather than SSAS. Although SSAS still has some advanced capabilities not yet present in Power BI, Power BI can handle larger data volumes in certain cases. Microsoft and Microsoft partners recommend Power BI as the tool for developing most new BI solutions.

DirectQuery vs Import/In-memory

Power BI Desktop can use DirectQuery mode, rather than import mode, to reference tables in an underlying data mart or data warehouse relational database. No data is stored in the Power BI or SSAS data model; live SQL queries are generated as users navigate report visuals.

Recommendation/Cost/Benefit:
Avoid this option unless it is needed to satisfy unique requirements. Power BI and SSAS are both optimized to use import-mode, in-memory tabular data models for all but rare and exceptional scenarios. DirectQuery significantly limits the DAX calculations supported in the data model, and query speed is typically many times slower than import, in-memory mode. The advantage of DirectQuery mode is that queries are executed directly against source tables in real time. Consider building composite models with both import and DirectQuery tables to get good performance plus drill-down to live transactional details. Composite models are typically complex and more time-consuming to develop, test and deploy.

Managing Power BI Desktop Files

Power BI Desktop files (PBIX and PBIT) should be stored in a centrally managed, network-accessible folder where they can be accessed by all members of a BI report development team. The storage folder should support automatic backup and recovery in the case of storage loss. Options depend on currently supported services and organization preferences within the following guidelines:

File Size and Folder Synchronization

  • Report and dataset developers must open files from the Windows file system and not from a web browser interface. Files must either reside in or be synchronized with the Windows file system.
  • Files containing imported data typically range in size from 100 to 600 MB. Any shared folder synchronization or disaster recovery system should be designed to effectively handle multiple files of this size.

Options:

  • OneDrive For Business (shared by team, with folder synchronization).
  • SharePoint or SharePoint Online (with folder synchronization).
  • GitHub and/or VSTS with local repository & folder synchronization. If used, Git must be configured for large file storage (LFS) if PBIX files are to be stored in the repository.

File Naming

Give files intuitive and friendly names that will not be confused with those of similar report projects. In the default view, only the first 20 or so characters are visible for each report, so use self-evident names that identify datasets and reports within the beginning of a long name. Workspace and report names have limited space in the Power BI portal navigation bar. Use short, concise names when possible, but try to differentiate like-named items in the first 20 characters. The full name is visible in the hover-over tooltip or by resizing the panel.


Datasets and Reports

Formal report projects with certified datasets should be developed with reports and datasets divided into separate files. By default, all Power BI objects, including reports and datasets, are stored in a single file. When the PBIX file is published from Power BI Desktop to a workspace, it is separated into distinct dataset and report objects. This default behavior is convenient for informal, small-scale, self-service analysis but not optimal for managing formal solutions with governed datasets, managed by teams of developers or report authors.

Separating the dataset and report into different PBIX files allows them to be managed and versioned separately. This approach also promotes a division of labor and ownership, particularly for certified datasets that must undergo rigorous validation.

Much like the way SSAS projects are used to develop multidimensional cubes and tabular models, deploying a separate Power BI dataset provides an uncomplicated, black-box source for ad hoc report design and data exploration. Power BI can now be used to develop large data models containing tens of gigabytes of compressed data, often in far less development time and possibly without additional licensing fees.

Version Control & Lifecycle Management

At the present time, Power BI has no integrated version control mechanism. In addition, Power BI is incompatible with most version control systems that perform branching and differencing operations on code and script files that are common in application development projects. Power BI (PBIX and PBIT) report and dataset files are stored as binary archive files with internal checksum metadata. Change and differencing operations performed by most code and version management systems will corrupt these files and render them inoperable.

Reality: integrating changes made to multiple PBIX files is difficult to manage and technically challenging to perform. Even with differencing tools and effective version control, avoid multi-developer work on Power BI dataset files.

The Power BI ALM Toolkit (based on the older BISM Normalizer tool for SSAS Tabular) will make it easier to compare differences. Until (or unless) specialized tooling becomes available, avoid relying on this approach to manage project work.

Power BI Desktop PBIX files that contain imported data can be large; typically, in the range of 100-600 MB. However, they can be checked into a version control repository for the purpose of backing up and restoring files.

PBIX files can be reduced to small files by saving them as Power BI Template (PBIT) files. This removes all data and may be an effective way to make occasional backups for version archive purposes and disaster recovery. The template can then be used to generate a new PBIX file, much like using a template to generate a new Word, Excel or PowerPoint document.
Saving a PBIX to a template for backup purposes takes minimal effort, but rehydrating a PBIX from a template is more labor-intensive and time-consuming because a new file must be created and the queries must be reprocessed to populate tables with new data.

Recommendations:

File versions

Store the dataset and report PBIX files separately, using a version number postfix for the file name in the format Major.Minor.Revision. For example:

Manufacturing Cost Analysis Dataset v1.5.3.PBIX

Keep only the previous file versions that are useful for historical reference and recovery. Report files are small and don’t take up much storage; but, dataset files can be large. If you need to archive older dataset files, resave them as Power BI Template (PBIT) files to reduce the file size. However, a developer must continue to work with the current PBIX file. Recovering a dataset from a template is time-consuming so developers cannot actively develop using only template files.

Remove the version number from file names prior to production deployment.

Dataset Version History

By keeping version history information in an internally generated table in the data model, durable revision history notes and version information are self-contained and not dependent on external data sources.

Add a Version History table to the data model by using the “Enter Data” feature in Power Query to create a table containing these columns:

  • Version Number
  • Revision Date/Time
  • Developer Name
  • Revision Notes

Add a custom column named Refresh Date/Time that captures the date and time each time the queries and model are refreshed. This custom column could be added to any table, but it is best to use a table with few rows because the value will be the same in every row. The column uses the following M expression:

DateTime.LocalNow()
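A minimal Power Query (M) sketch of this pattern is shown below. The row values and developer name are illustrative only; the Enter Data feature generates an equivalent (compressed) table step for you, and the refresh column is added with Table.AddColumn.

// Hypothetical "Version History" query; rows are maintained by the developer via Enter Data
let
    Source = #table(
        type table [#"Version Number" = text, #"Revision Date/Time" = datetime, #"Developer Name" = text, #"Revision Notes" = text],
        {
            {"1.0.0", #datetime(2019, 8, 25, 0, 0, 0), "Paul", "Initial release"},
            {"1.5.3", #datetime(2019, 9, 7, 0, 0, 0), "Paul", "Added Security section"}
        }
    ),
    // Same value on every row; captured each time the dataset is refreshed
    AddRefreshDateTime = Table.AddColumn(Source, "Refresh Date/Time", each DateTime.LocalNow(), type datetime)
in
    AddRefreshDateTime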

Add the following measures to the model:

  • Last Refresh Date/Time = MAX ( 'Version History'[Refresh Date/Time] )
  • Current Version = MAX ( 'Version History'[Version Number] )

Create a Dataset Information page in the dataset PBIX file and in any report that references the dataset.

Every time a revision is made to the data model, edit the Enter Data
step and add a row to the table with a new version number. Use the following guidelines to increment components of the version number:

  • Major Release Number – Increment only when a newly scoped solution is released to PROD. Major releases have a specific set of planned features; a new major release is created only for a new set of planned features or significant enhancements as part of a new scope of work.
  • Minor Version Number – As a team, decide when changes are significant enough to warrant an incremental minor version release; typically when changes are made to QA or PROD releases within a major release. Generally used for minor feature enhancements. For a dataset, minor version changes may break or affect a report’s design; for a report, minor version changes add or change functionality aside from fixes.
  • Fix or Revision Number – Increment for every bug fix or design change that will be deployed to the DEV environment or visible to other developers. Generally used for bug fixes but not for adding features. May be incremented by the developer. Reset to zero with a new minor version number.

For example, the following version number represents Major Release 1, Minor Version 5 and Fix/Revision Number 3:

v1.5.3

Datasets and reports should be versioned separately. The major version number for reports and datasets may be synchronized for major workspace app releases.

Be mindful that the purpose of version numbering is to provide a simple standard for keeping track of changes. The process can easily be over-thought and over-engineered. Keep it simple, discuss changes as a team and make process adjustments to focus on priority tasks.

Release Management, DevOps and Automation

(topic draft – expect updates – please comment)

This is an area where the Power BI platform currently lacks a formal process or mechanism. We know that release management tools are being developed, but they are in the early stages. We may or may not see a release management tool from Microsoft that blends with other tools you might have in use, such as VSTS and Azure DevOps. In my experience, automated Power BI release management isn’t really being practiced in the industry yet. That said, there are techniques available. PowerShell scripts can be used to publish datasets and reports, and to change dataset bindings. It is possible to either publish to a production workspace or to effectively move assets from one workspace to another. This approach is discussed briefly in the Power BI Enterprise Deployment Guide. Other approaches are discussed here:

Power BI release management

PLEASE ADD REFERENCES

https://www.msbiblog.com/2018/09/12/power-bi-release-management/
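As a rough sketch of what scripted deployment can look like today, the following PowerShell example uses the MicrosoftPowerBIMgmt cmdlets to publish a PBIX file to a named workspace. The workspace and file names are hypothetical, and the module must be installed first (Install-Module MicrosoftPowerBIMgmt).

# Sign in to the Power BI service
Connect-PowerBIServiceAccount

# Resolve the target workspace by name (hypothetical name)
$workspace = Get-PowerBIWorkspace -Name "Manufacturing Cost Analysis QA"

# Publish (or overwrite) the dataset/report PBIX into that workspace
New-PowerBIReport -Path ".\Manufacturing Cost Analysis Dataset.pbix" `
    -Name "Manufacturing Cost Analysis Dataset" `
    -WorkspaceId $workspace.Id `
    -ConflictAction CreateOrOverwrite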

Workspace and App Management

To share reports with users who do not have a Power BI Pro license (each license has an additional cost), the reports must be in a Premium capacity workspace and the workspace must be published as an app.

Create a separate workspace to share with Free licensed users

Create a separate workspace for formal projects that have a report, collection of reports and/or dashboard that should be distributed to multiple view-only users. A workspace must be published as an app to be shared with users who do not have Power BI Pro licenses assigned to them. Once the workspace is published as an app, it can be shared with any group of “free” users or all users in the organization. The visible text for workspace names is limited, so name workspaces so they can be uniquely identified in about 20 characters or less.

For a formal project, create the following workspaces:

DEV Workspace

Only development team members need Contributor access to this workspace. This workspace does not need to have Premium capacity unless developers need to unit test incremental refresh or other Premium features.

QA Workspace

All testers must have View access for testing and Contributor access for report authoring. The workspace should be in Premium capacity to test incremental refresh.

PROD Workspace

Omit the “PROD” designation in the name. This workspace name will become the name of the published app that users will see in their Apps, Home and Favorites pages, so use a name that is simple and sensible. The workspace must have Premium capacity to share the app with non-Pro licensed users.


Use a common workspace for informal reports not shared with Free licensed users

Multiple self-service and informal report files can share a common workspace without converting the workspace to an app. All users accessing any of those reports must have a Power BI Pro license assigned to them. Informal or self-service Power BI projects typically don’t require separate Development or QA workspaces.

Data Model and Power Query Design Guidelines

The topics in this section are not an exhaustive guide to Power BI query and data model design. Following are common issues and recommendations often encountered in projects.

Dimensional Design

Always design data models for optimal performance, even when it “doesn’t matter” in simple projects. As data volumes grow and as data models are enhanced to address new requirements, increased complexity leads to performance challenges. There are many books on this subject dealing with the intricacies of best practice data model design. In brief:

  • Build star schemas – wherever possible, reshape data into fact and dimension tables with single-key, one-to-many relationships from dimension to fact tables.
  • Enforce dimension key uniqueness – Just because a key value “should” be unique, there is no guarantee that it will be unless it is enforced at the data source. Perform grouping and duplicate reduction in the data source views or Power Query queries to guarantee uniqueness (see the Power Query sketch following this list). Duplicate record count checks and other mechanisms can be applied to audit source data for integrity, but do not allow the data model to violate these rules.
  • Avoid bi-directional filters and unnecessary bridging tables – These data modelling patterns adversely affect performance.
  • Calculated columns – Create them in Power Query rather than as DAX calculated columns wherever possible, to maintain a consistent design pattern for maintainability.
  • Annotate code – Use in-line comments and annotations in all code including SQL, M and DAX; to explain calculation logic and provide author and revision information.
  • Hide all fields not used directly by users. These include: primary and foreign key columns, numeric columns used to create measures, and columns used to specify the sort order of other fields.
  • Set to Do Not Summarize – Any non-hidden numeric columns that are not intended to roll-up or summarize values should be set to “Do Not Summarize” in the Modeling ribbon in Power BI Desktop. Columns set to summarize are indicated with a Sigma icon.
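To illustrate the key-uniqueness point above, here is a minimal Power Query (M) sketch that references a dimension view and removes duplicate keys before the table is loaded into the model. The server, database, view and column names are hypothetical.

let
    Source = Sql.Database("SQLSERVER01", "EDW"),
    // Reference a view rather than a SQL statement so the query can fold
    DimCustomer = Source{[Schema = "dbo", Item = "vDimCustomer"]}[Data],
    // Guarantee dimension key uniqueness before loading the table
    RemoveDuplicateKeys = Table.Distinct(DimCustomer, {"CustomerKey"})
in
    RemoveDuplicateKeys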

Support Query Folding with Views

Data source queries in Power BI are authored using the Power Query editor, which generates queries in the Microsoft Data Mashup “M” query language. A critical optimization technique employed by Power Query is called “query folding”, where the “M” query definition is translated into a native query that can be executed by the source database server. If query folding works, SQL Server executes the queries and returns results to the Power BI service through the on-premises data gateway. If query folding can’t be used, all raw data is streamed through the gateway, held in memory in the service and then processed inefficiently, row by row.

Query folding works reliably when queries authored in Power Query reference relational database tables or views. Avoid using SQL statements in Power Query and avoid using stored procedures as query data source objects. This can be a difficult transition for database professionals who are accustomed to writing SQL queries but Power Query can optimize its queries only when referencing tables or views.
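The difference can be illustrated with a short M sketch (the server, database and view names are hypothetical). The first form folds to a native query; the commented-out form embeds a SQL statement and prevents reliable folding.

let
    // Folds: reference a table or view and let Power Query generate the native SQL
    Source = Sql.Database("SQLSERVER01", "EDW"),
    Sales = Source{[Schema = "dbo", Item = "vFactSales"]}[Data]
    // Avoid: Sql.Database("SQLSERVER01", "EDW", [Query = "SELECT * FROM dbo.FactSales"])
in
    Sales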

In rare cases to avoid extremely long-running queries that might time-out during data refresh, it may be necessary to materialize source table results in a table stored in the EDW or data mart database. Using parameters with an incremental refresh policy can also reduce or eliminate query time-out issues.

Managing Dataset Size with Parameters

Use Power Query parameters to filter large tables (typically only fact tables). The incremental refresh feature of Power BI Premium capacity will automatically generate partitions and only process new or partial data to avoid reloading large tables.

Even if you don’t intend to implement incremental refresh policies, data refresh parameters allow the working dataset size to be reduced in the development environment and a smaller dataset file to be deployed, which can then be refreshed with a full load of data in the service.

Recommendation:
Design large tables with a parameterized range filter that conforms to the requirements supporting incremental refresh policies.

When a dataset PBIX file approaches 400 MB in size, parameters should be used to filter large tables and reduce the working set of records for development.

Datasets approaching 1 GB in size should have fact tables configured with incremental refresh policies and must be in a Premium capacity workspace.

First, create two Date/Time type parameters named RangeStart and RangeEnd. Next, create a date range filter in the early steps of each fact table query on a Date/Time type column in the table. Incremental refresh policy requirements are detailed in this article: https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh
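A minimal sketch of the filter step, assuming a fact table with an OrderDateTime column (the server, view and column names are hypothetical); RangeStart and RangeEnd are the Date/Time parameters described above:

let
    Source = Sql.Database("SQLSERVER01", "EDW"),
    FactSales = Source{[Schema = "dbo", Item = "vFactSales"]}[Data],
    // Use >= and < so rows on a partition boundary are not loaded twice
    FilteredRows = Table.SelectRows(FactSales, each [OrderDateTime] >= RangeStart and [OrderDateTime] < RangeEnd)
in
    FilteredRows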

Measures & Numeric Columns

Format all DAX code for easy readability. Not only does it look better, it will help avoid mistakes and support debugging and maintenance. As a guide, use DAXFormatter.com to “beautify” measures and follow the same code formatting pattern going forward.

Numeric Measures

All measures should return the correct numeric type values and every measure should be correctly formatted. This seemingly trivial subject is important and an issue in most informal and ill-managed projects.  Requirements should be defined for data types, precision and formatting.  Common items include:

  • Measures are designed to summarize and total over any filter, slicer or grouping condition. Apply the appropriate roll-up or conditional logic if a simple aggregation is not appropriate for every grouping.
  • Percent and ratio measures should be fractional decimals. For example, “45 percent” is represented as a decimal value .45. When formatted, it appears as 45%, or 45.00% with two decimals for better precision.
  • All whole or large number measures should include a thousand separator.
  • Use the DAX DIVIDE function instead of the forward slash operator to handle blank values and division-by-zero errors, as shown in the example following this list.
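For example, a ratio measure written with DIVIDE (the measure names are illustrative):

Gross Margin % =
DIVIDE ( [Gross Margin], [Sales Amount] )
-- Returns BLANK() instead of an error when [Sales Amount] is zero or blank.
-- Format the measure as a percentage (e.g. 0.00%) so the fractional value displays as 45.00%.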

Text Measures

  • Do not conditionally return text values to handle exceptions (like “not applicable”) in a measure that should return a numeric value. 
    Create a separate text-type measure to support titles & labels, if needed.

Implicit and Explicit Measures

For simple and informal projects, Power BI supports implicit measures. Numeric columns in each table are automatically set to summarize data for ad hoc reporting. This behavior is both convenient and confusing for ad hoc report users. In formal projects, avoid using implicit measures for the following reasons. Instead, create a measure for any numeric column values that should be aggregated for reporting, and then hide all redundant numeric columns.

  • Implicit measures don’t work in Excel pivot tables and cube expressions
    As when working with cubes and connected SSAS models, only explicit measures are available in Excel reports.
  • Two choices confuse users
    Since all data models will include explicit measures written using DAX, it is confusing for users to understand when to use measures and when to use numeric columns. In a conventional Excel pivot table (which connects to Power BI as if it were a multidimensional cube), measures and calculations are contained in measure groups and all columns (whether numeric or any other data type) are dimension attributes. The introduction of Power Pivot, SSAS Tabular and Power BI confuses most users with these inconsistencies.
  • Measures are Flexible
    Inevitably, some implicit measures will eventually be replaced with explicit measures when they are required to support conditional logic or alternate aggregation behavior. Starting with explicit measures avoids retraining users and introducing inconsistent behavior.

Promote Self-service Reporting

The true value of the Power BI platform is realized when business users can use the interactive reporting features of Power BI to get answers to business questions. This can be as simple as starting with an existing report and changing a slicer or filter, or adding a visual to a report page to see a selected set of data. Novice report authors/users can easily be taught to drag and drop fields into a new report page to group and slice data from an IT-certified dataset. Starting with IT/business-certified datasets published to the Power BI service, users with a range of interests and skills can modify or author reports using their web browser without altering governed data or protected reports.

For user training and support purposes, consider educating users with skills in the following roles:

Novice Report Users

  • Navigate to an existing published app or report
  • Understand how to use dashboard tiles, report pages and visuals
  • Use filters and slicers
  • Use visual interactions to select and slice data
  • Use report visual drill-through navigation
  • Use Q&A to ask and answer data questions

Casual Data Analysts

  • Edit a report
  • Change a report visual to a different type of visual
  • Add a report page
  • Add visuals and fields to a report page
  • Add and use slicers to filter and select data
  • Use Analyze in Excel to create a connected pivot table
  • Use Excel to add fields, groups, totals, filters & slicers to a pivot table
  • Set alerts & create a report subscription
  • Use bookmarks to create custom filter sets and navigation

Intermediate Report Authors/Analysts

  • Using a published certified dataset, create a new report
  • Add pages, visuals, slicers and filters
  • Add custom measures to a report using DAX
  • Create drill-through actions & navigation
  • Create a custom dashboard by pinning tiles

Use SSRS/Paginated Reports for Operational Reporting

Power BI is not a replacement for paginated, operational reporting. For static, multi-page, printable reports, use SQL Server Reporting Services (SSRS) instead of Power BI. SSRS (now called “Paginated Reports”) is integrated into the Power BI service with Premium capacity licensing, and can be integrated with interactive Power BI reports and Power BI datasets.

To a limited degree, some operational reports can be reproduced using Power BI reports, and SSRS can be used, to a limited degree, to create interactive reports.

Recommendation:
Use the right tool for the job and consider the strengths and limitations of Power BI, SSRS/Paginated Reports and Excel to author and deliver reports for different audiences and use cases. 

If Users Need Excel, Give them Excel

Financial users are often told that they must adapt to visual dashboard reporting when what they really need are spreadsheet reports with reliable, current data. Power BI was born from Excel and from SQL Server Analysis Services (traditionally “cubes”), so it can meet the needs of both self-service users and IT-managed enterprise BI solutions. In many ways, Excel and Power BI are a perfect pair. If users need spreadsheets, don’t give them reports that look like spreadsheets. Give them connected spreadsheets; then train them to use Excel from a Power BI report and certified dataset.

Analyze in Excel is a gem that is often not promoted by IT-based report authors, typically because report requirements are usually specified before users learn about this capability. It really is a best-of-both-worlds solution because it keeps governed data in a published dataset, which users can access via a secure live connection from Excel.

When selecting Analyze in Excel from a published report or dataset in the service, Excel opens with a new connection (using an ODC file) directed at the published dataset. The Analyze in Excel ribbon (installed to Excel when downloaded) can also be used to create a new connection from Excel.
By default, a pivot table is generated in a new worksheet; but advanced Excel users can use cube functions to place live Power BI dataset data anywhere in a workbook.

https://sqlserverbi.blog/2016/05/10/power-bi-and-excel-the-dynamic-duo/

Recommendation:
Rather than exporting data from Power BI (and creating off-line copies of data), promote the use of live, connected reporting with Excel. 

Enable Analyze in Excel in selected reports and datasets; then train a limited audience of advanced users to promote its use. Establish best practices for each business use case. General information about this feature is available here: https://docs.microsoft.com/en-us/power-bi/service-analyze-in-excel

Implementing Security

There are many good references and tutorials about data and report security; I won’t cover the details here, but I will summarize the recommended approaches. Data-level security and report/content-level security are separate concepts that should be coordinated.

Report/Content-Level Security

There are multiple options for sharing report content with users.  Users cannot view a dashboard or report unless they have been granted access to a workspace or have been invited to share a report.

If you have a Premium capacity Power BI tenant, you can package the content of a workspace [dashboard(s) and report(s)] as an app, which can be shared with any users in your organization; those users do not need individual Power BI Pro licenses to read and view content. These users will not be able to edit or author additional reports.

Users with Pro licenses assigned through the Office 365 Portal can have a workspace or report shared with them, which they can (unless otherwise restricted) edit, save, copy or share with others.

Data-Level Security

Once a user can access a report, they will see the results of all data in the underlying dataset for that report. Users with report-level access can read all dataset data unless role-based and/or row-level security has been implemented. Role/RLS security works essentially the same way in SSAS and in data models developed with Power BI Desktop.

Role-based Security

Role-based security is active when at least one role is defined for the dataset, filter expressions are added to the role and then user or group members are added to the role. Conditional filtering is performed for each table in the data model using a DAX filter expression.

Row-level Security

RLS is an extension of role-based security. In addition to creating one or more roles with filter expressions and adding role members, filtering may be conditional based on the authenticated user.  The typical approach is to populate and maintain a mapping table of users and key values for each table in the data model that should have restricted access. All related data will be filtered. For example, if a user can only see data for a particular sales territory, they should only see sales totals for the territory.

The user/mapping tables must be maintained either manually or through automation if the user/object permission information is available in an existing data source. Don’t over-engineer this. The user/mapping information can be managed in a spreadsheet and securely stored in SharePoint or OneDrive, or driven by a PowerApp or custom form. *This requirement is often missed in large projects.
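As a minimal sketch of this mapping-table pattern (the table and column names are hypothetical), the DAX filter expression applied to a Sales Territory dimension for a role might look like this:

-- 'User Territory' is the user/key mapping table described above.
-- USERPRINCIPALNAME() returns the signed-in user, so each user sees only the territories mapped to them.
'Sales Territory'[Territory Key]
    IN CALCULATETABLE (
        VALUES ( 'User Territory'[Territory Key] ),
        'User Territory'[User Email] = USERPRINCIPALNAME ()
    )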

Recommendation:
Add users to Office or Azure Active Directory groups and then use groups to share report content. If you are using role-based or row-level security, use groups to manage role membership.

If using Premium, use workspace apps to share and distribute reports with  read-only users and reduce Pro license requirements.

Create a strategy for managing RLS user/entity mapping tables. Either manage manually or automate if the data is available.

Certified & Shared Datasets

To promote reuse and data governance, datasets can be used by reports residing in different workspaces (including a personal workspace). Datasets can be endorsed to designate those that are certified and promoted by data owners and the organization. The Power BI service includes a platform and workflow to manage promoted and certified datasets. When the feature is enabled, a dataset author can mark a dataset for promotion and certification, and designated users can test and optionally certify specific datasets that are known to be reliable and conform to organizational governance standards.

Dataset promotion and certification markings require the coordinated effort of the solution developer, Power BI tenant admin, data steward or business stakeholders. The actual process is simple, but the certification process should be defined and managed by solution business owners and service administrators.


Information about this capability: https://docs.microsoft.com/en-us/power-bi/service-datasets-certify

Enterprise Scale Options

In many ways, Power BI has now surpassed the capabilities of SQL Server Analysis Services. Microsoft is investing in the enterprise capabilities of the Power BI platform by enhancing Power BI Premium capacity and adding Paginated Reports and features to support massive scale and specialized use cases. Consider the present and planned capabilities of the Power BI platform before choosing another data modeling tool such as SSAS.

Resources:

https://sqlserverbi.blog/2018/07/27/power-bi-for-grownups/ 

https://sqlserverbi.blog/2018/12/13/data-model-options-for-power-bi-solutions/

Project Preparation Checklist

[_] Categorize the solution by identifying the author & user roles related to the project:

[_] Author role: Business Data Analyst

[_] Author role: Skilled Data Modeler, Analyst, Data Scientist

[_] Author role: IT BI Developer

[_] Users’ role: Report/Dashboard Consumer

[_] Users’ role: Self-service Report Author

[_] Users’ role: Advanced Data Analyst

[_] Develop & Document Support & training plan for users

[_] Identify the Solution Type for the project. This will guide other project management designs:

[_] Design single PBIX file for small group, departmental project authored by one developer for a limited group of users

[_] Design & deploy a separate dataset PBIX file – from report file(s) – when the dataset should be branded as a Certified dataset

[_] Design separate dataset and report PBIX files for formal projects with more than one dataset & report developer, to coordinate work

[_] Use SSAS/AAS as a data modeling option when those databases exist or where IT operations insist on managing development and maintenance through integrated source control (e.g. Visual Studio Team Services & Azure DevOps)

[_] Identify the Project Type & related Solution Architecture:

[_] Project type: Formal project

[_] Project type: Informal project

[_] Project type: Hybrid project

[_] Architectural approach: Single PBIX

[_] Architectural approach: Separate dataset and report PBIX

[_] Architectural approach: Report PBIX connected to SSAS or AAS

[_] Understand DirectQuery model trade-offs and special use cases. Avoid if possible.

[_] Create storage locations and folder structure for Development file management:

[_] Development file storage

[_] Team member collaboration environment & processes

[_] Folder synchronization

[_] Define File naming standards

[_] Decide on dataset and report names

[_] Define the Version Control & Lifecycle Management

[_] Postfix files with 3-part version number

[_] Remove version number from published files in QA and PROD

[_] Create Version History table in Power Query

[_] Increment version numbers in data model

[_] Backup PBIT files for archive

[_] Create measure: Last Refresh Date/Time

[_] Create measure: Current Version

[_] Add data model info page to report

[_] Define your Release Management, DevOps & Automation strategy (if any – Might be OK to deploy files manually) (to automate or not to automate)

[_] Decide on Workspace and App Management, workspace & app name, etc.

[_] Create PROD workspace (omit PRD from name), assign dedicated capacity

[_] Create QA workspace (post-fix name with QA), assign dedicated capacity

[_] Create DEV workspace (post-fix name with DEV), dedicated capacity not required

[_] Assign Pro licenses to all developers, admins and report author users (QA?)

[_] Assign membership and access to workspaces

[_] Create fact date range filter parameters: RangeStart & RangeEnd

[_] Filter large fact tables with range filters, consider incremental refresh policies if slow and/or over 800 MB compressed.

[_] Design source queries (T-SQL?) to reshape source data into conformed dimension & fact tables

[_] Create views in database for each dimension and fact

[_] Enforce key uniqueness to remove all duplicate keys from all dimension tables

[_] Query Date dim/lookup table at source if it exists

[_] If not available, generate Date dim/lookup table in Power Query

[_] Avoid bi-directional relationships wherever possible

[_] Include code annotations & comment blocks in: views, M queries, measures

[_] Hide all key columns and other columns not used directly by users

[_] Use parameters to filter and reduce dataset PBIX size to ~400 MB or less

[_] Create incremental refresh policies for dataset larger than ~500 MB

[_] Use Premium capacity workspaces for datasets approaching 900 MB

[_] Use Premium capacity workspace when you need to share reports with non-Pro licensed user

[_] For non-Pro licensed user, convert QA (and then PRD) workspace to app and distribute the app to users via AD group membership

[_] Create explicit measures for all aggregate-able column values

[_] Hide all measure base numeric columns

[_] Set all non-aggregate-able numeric column default summarization to “Do Not Summarize”

[_] Format all whole numbers with thousand separators

[_] Format all currency & decimal measures to defined standard (perhaps 2 decimal, thousand separator)

[_] Format all ratios and percentage with appropriate decimal positions

[_] Use the DIVIDE function for all division operations to avoid div by zero

[_] Name all measures and user-visible fields with mixed-case, short, friendly, sentence-like terms

[_] Assign measures to fact table where they logically belong – or to an empty measure group table.

[_] Measures that don’t have an obvious home (fact) table should be assigned a measure group table.

[_] Precede measure group table names with “_” so they sort to the top of the list. (Display folders are not ready for use in Power BI Desktop.)

As a team, decide:

[_] When to increment the revision/fix number

[_] When to increment the minor version number

[_] Who owns the migration to QA process? How is it implemented?

[_] Who owns migration from QA to PROD? How is it implemented?

[_] How is user/stakeholder sign-off obtained?

[_] How are data source requirements defined, presented & documented?

[_] Who owns & maintains the requirements log?

[_] How are measure & KPI calculation requirements documented & maintained?

[_] How are the report design, layout & visualization requirements documented & maintained?

[_] What is the organization & project styling & branding standard?

[_] Does the organization have a standard brand image? Where is it documented? Who owns that standard & can answer questions?

[_] Should report be designed for mobile consumption (with mobile layouts)?

Power BI Product Suggestion: Show Measure Dependencies

I was just working on a client project where we have several complex measures that reference other measures. Sometimes it can be hard to keep all these dependencies in mind. As a feature, it would be great for Power BI Desktop to show these dependencies and navigate between them. After a quick search in the Power BI Community Ideas site, I posted a suggestion. Afterward, I found an older suggestion made by Avi Singh here: https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/16783000-measure-dependency-view-similar-to-the-query-depe

Here’s my version in the mockup below. I think there’s a lot of potential to pile-on and add to this idea but I think that having a simple tree view of dependent measures would be a great start. If you agree, please vote this idea up on the Power BI community forum.

Show measure dependencies in the field list

Measures that reference and depend on other measures can be difficult to trace and debug.  To make this easier, provide the option in the field list to show the measures that are referenced by other measures. This should be optional and performed individually for each measure to avoid unnecessary performance overhead.  Here’s a conceptual mockup:

[Conceptual mockup image: measure dependency view in the field list]

Power BI Licensing

Power BI licensing options are a mystery to many. It’s not that it is complicated; in fact, on the contrary, Power BI pricing is actually quite simple once you understand the basics. It is just that there are so many options and different features and capabilities. This short video blog post walks you through the costs and licensing options for Microsoft Power BI and the Power BI platform including free, Pro and Premium capacity. I also talk about alternative licensing options for accessing Premium capacity features with the ability to scale cost and capacity.

This post is part of the Azure Everyday series from Pragmatic Works. See the entire series of video blog posts at: https://blog.pragmaticworks.com/topic/azure-every-day

Power Platform Virtual Conference

I am happy to join several world-class speakers on July 23 & 24 for the Power Platform Virtual Conference. I will be speaking about how to create “no-code” Power BI custom visuals along with Mike Carlo from PowerBI.tips.

Join me on Tuesday, July 23 at 5 p.m. EDT for my session on Creating Power BI Custom Visuals with No Code at the Power Platform Virtual Conference. This 2-day online conference dedicated to the #PowerPlatform is only $49. Register today! #PowerPlatformCon #PowerBI #PowerApps #MicrosoftFlow http://powerplatformconference.com/


Looking to enhance your Power BI, PowerApps and Flow skills but can’t attend an in-person conference? Check out this virtual (on-line, live) conference featuring several headlining, international conference presenters. Join Pragmatic Works for the only 100% virtual conference dedicated to the #PowerPlatform on July 23rd and 24th! http://powerplatformconference.com/ #powerplatformcon #PowerBI #PowerApps #MicrosoftFlow


Links to Pragmatic Works’ Social Accounts

Twitter: https://twitter.com/PragmaticWorks

LinkedIn: https://www.linkedin.com/company/PragmaticWorks

Facebook: https://facebook.com/PragmaticWorks

Drillthrough Navigation Between Power BI and SSRS Paginated Reports

Report navigation is, in my opinion, the essence of true data exploration and discovery. It lets business users see and understand important information in different forms; in summary and in detail, within context.  SSRS and Power BI are truly a dynamic duo for reporting and data discovery. I wanted to post a follow up to these two articles because they are apparently very popular topics. Posted two days apart on my blog, they received 2,744 and 3,356 views. Woo!


Nine years ago we released the Reporting Services Recipes book from Wrox Press, which was a very popular collection of design patterns and techniques, frequently referenced by Microsoft leaders at industry conferences. In the book, we showcased different ways to move between SSRS reports by passing parameters and dynamic URLs to create interactive reporting solutions. I am working with my former co-author and a team of talented BI and report design experts on the second edition of the Recipe book, which will include not only updated examples of SSRS patterns for on-prem report solutions but also new patterns for Power BI Report Server and Paginated Reports in the Power BI service.

Today we have Power BI in addition to SSRS (now called “Paginated Reports”). These tools are different and often serve distinctly different purposes for creating reporting solutions. But, using Power BI and SSRS/Paginated Reports together opens doors to create incredibly rich integrated report and data discovery solutions.

I recently published two blog posts to share some of my work-in-progress for the recipe book: Drillthrough from Power BI to an SSRS Paginated Report, and Drillthrough from Paginated Report to Power BI. Both of these posts demonstrate navigation from one of the report tools to the other, essentially allowing users to work with these two capable tools in a full circle. As the newer and more modern data analysis tool, Power BI fills an important gap, but it is not a replacement for the multi-page, grouped and repeated style of reporting that SSRS is optimized for. Likewise, Power BI excels as an interactive tool for data discovery and self-service analysis. SSRS and Power BI were born from the same platform and have common architectures but also many differences. Used together, the realm of possibilities is expansive.

It is important to understand that the techniques used to navigate and pass filter context between these report tools are limited to the capabilities of web query string parameterization. In my experience, this is rarely a show-stopper, but I do occasionally encounter folks pushing the limits, both practically and technically.

Both SSRS and Power BI have their own drillthrough capabilities using internal mechanisms that do not rely on the web browser and report server to handle web URLs. However, moving between different report types relies on web URLs. Although powerful and flexible, there are some limitations. Each web browser has its own maximum URL length limit, ranging from roughly 2048 to 2083 characters, and the server-hosted web services also have their own limits. For simplicity, call the limit about 2000 characters, which includes the base domain and folder path URL, plus the query string directives, parameters and values. Keep all of this within the ~2000 character limit and you’re good. Also be mindful that all of this is visible in the browser’s address bar. Requests and responses sent to the Power BI service are encrypted, so there is little risk of exposure outside the session. Your on-prem report server connections can also be protected using certificate-based encryption. Just be mindful about what you pass in sight of the user and web browser after the stream is decrypted on the client.

The report navigation recipes posted here on my blog are early drafts. I welcome your feedback, questions and ideas for more report recipes. Sample files and other downloadable content referenced in the recipes will be available when the book is published – and no, that date has not been set, but I will make an announcement when we know. Thank you in advance for your questions and comments.

Drillthrough from Paginated Report to Power BI

— report recipe
also see: Drillthrough from Power BI to an SSRS Paginated Report

In the recipe titled “Drillthrough from Power BI to Paginated Report” I demonstrate how to use report parameters and a dynamic URL address to navigate between reports and pass parameter values in the URL from a Power BI report to detailed data in an SSRS paginated report. Using a similar technique, we can navigate from a paginated report to a Power BI report.

Power BI is very flexible and does not require you to define parameters like you would in a paginated report. You can literally filter a report using any field in the dataset.

Product Versions

SSRS/Paginated Reports (one of the following):

· SQL Server Reporting Services 2016 or newer

· Power BI Report Server

· Paginated Reports in the Power BI service

Power BI Desktop:

· Power BI Desktop May 2019 or newer

What You’ll Need

· A Power BI report connected to a data source using import, DirectQuery or SSAS direct connect mode.

· Paginated Report server: SQL Server Reporting Services (Standard or Enterprise Edition), Power BI Report Server or a Power BI subscription with Premium capacity

· Moderate level Visual Basic.NET coding skills

Difficulty level: 300

The sample database and files will be available in the forthcoming book: SQL Server Reporting Services Paginated Report Recipes, 2nd Edition (working title).
These instructions are provided as an example but refer to files that will be available when the book is published. Please contact me in the comments with questions and feedback.

Preparing the target Power BI Report

You don’t have to do anything special in the Power BI report design to enable parameterized filtering; you simply need to know the table and field names you will use for filtering. Most of the effort is building the dynamic URL in the paginated report.

The first order of business is to make note of each field or measure to use for filtering. I suggest jotting them down in a text editor using the notation: TableName/FieldName.

Note: Filtering is applied at the report level, so navigating to a report using URL filters will affect all the pages in that report.

In the following examples, I will demonstrate the steps with a report deployed to my Power BI tenant. In the exercises a little later, you will deploy a sample report and work through these steps in your own Power BI environment.

Addressing Power BI Objects

There are several addressable objects in the Power BI service to which you can navigate using a URL. Each of these objects is assigned a globally unique identifier that can be referenced in the address. Reports can be filtered and dashboards cannot, but other objects related to reports can be filtered and will respond to a filter directive in the URL. For each object type listed in the following table, navigate to the object in your web browser and copy the URL from the address bar.

| Object Type | Accepts Filters | Description |
| --- | --- | --- |
| Dashboard | No | Dashboards don't accept filters |
| Report | Yes | Shows the default report page with report-level filters |
| Report page | Yes | Applies report-level filters to the page |
| Bookmark | Yes | Bookmark and filter directives can be combined in the URL query string using "&" (rather than "?") to append directives |
| Workspace app | Depends | The default page in the app navigation must be a report rather than a dashboard |
| Workspace app report page | Yes | Same as a report; use the path to the app or to a report within the app |
| Workspace app bookmark | Yes | Same as report bookmarks; the filter directive may be added to the query string using "&" rather than "?" |

Filter Directives & Syntax

The URL query string filtering syntax for Power BI is borrowed from the OData standard, which offers a great deal of flexibility. For example, depending on data types, you can specify that a field is equal to, not equal to, less than or greater than a value, within a range, or included in a list of values. The filter directive is simply tacked onto the end of the URL that specifies the object path, in the form:

?filter=Table/Field eq value

Text and date type values are passed in single quotes:

?filter=Table/Field eq 'value'

Multiple filters may be appended together using the “and” operator. This example uses real table and field names:

?filter=Product/Brand eq 'Contoso' and Sales/Quantity lt 5

This will filter all pages on the report to include only records where the Brand field in the Product table is “Contoso” and the Quantity field in the Sales table is less than 5.

All of the available filter operators are listed in the following table:

| Operator | Description |
| --- | --- |
| eq | equal to |
| ne | not equal to |
| ge | greater than or equal to |
| gt | greater than |
| le | less than or equal to |
| lt | less than |
| in | in list (item1, item2, item3) |
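For example, the in operator takes a parenthesized, comma-separated list rather than a single value. Using the same Product table from the earlier example (the brand values here are illustrative):

?filter=Product/Brand in ('Contoso', 'Fabrikam')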

Before I can add the filter directive, I need to get the report URL. If I go to my Power BI service, navigate to a workspace and then click to open a report, I can capture the URL from the address bar. As an example, here's the address I captured for a report published in my Power BI tenant:

https://app.powerbi.com/groups/cdae7f3c-b818-493a-af56-55308d41798f/reports/0dcf5a4d-1bf2-438f-b13b-95eb9adadd91/ReportSection

You can see that each object type ("groups", "reports", etc.) is followed by a globally unique identifier, which is a long string of characters generated by the service. These identifiers don't change after the object is deployed or created in the service.

Notice that the ReportSection element at the end of the address has no value, which means that I have not specified a report page. If I were to use this address to navigate to the report, Power BI would show me the default page, which is the page that was visible when the report was last saved. If I navigate to another page and then capture the URL, the object ID of that page is appended to the "ReportSection" text at the end of the address:

https://app.powerbi.com/groups/cdae7f3c-b818-493a-af56-55308d41798f/reports/0dcf5a4d-1bf2-438f-b13b-95eb9adadd91/ReportSection3b93b5700b15a516ed98

Note: The syntax for the page or ReportSection is a little different than other objects and does not use a forward slash before the object ID.

The common terms we use for some of these objects are a little different than the attribute names used in the URL query string, as noted here. I will show you some additional object names after this first example.

| Object Type | Query String Attribute | Object ID |
| --- | --- | --- |
| Workspace | groups | cdae7f3c-b818-493a-af56-55308d41798f |
| Report | reports | 0dcf5a4d-1bf2-438f-b13b-95eb9adadd91 |
| Page | ReportSection | 3b93b5700b15a516ed98 |
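If you would rather assemble the address in code than paste a captured URL, the following is a minimal Visual Basic.NET sketch based on the URL pattern shown above; the function and parameter names are mine, not part of any Power BI API.

' Sketch: builds a Power BI service report URL from the workspace (groups),
' report and optional page (ReportSection) identifiers captured in the browser.
Function BuildReportUrl(ByVal WorkspaceId As String, ByVal ReportId As String, ByVal PageId As String) As String
    Return "https://app.powerbi.com/groups/" & WorkspaceId _
        & "/reports/" & ReportId _
        & "/ReportSection" & PageId
End Function

Passing an empty string for PageId navigates to the report's default page, as described above.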

Now, to add the filter directive to the report URL. When I append the filter from the earlier example like this…

https://app.powerbi.com/groups/cdae7f3c-b818-493a-af56-55308d41798f/reports/0dcf5a4d-1bf2-438f-b13b-95eb9adadd91/ReportSection?filter=Product/Brand eq 'Contoso' and Sales/Quantity lt 5

…the report opens with only data where the Brand is Contoso and Sales records have a quantity less than 5. But my browser seems to have mangled the URL, making it difficult to decipher:

https://app.powerbi.com/groups/cdae7f3c-b818-493a-af56-55308d41798f/reports/0dcf5a4d-1bf2-438f-b13b-95eb9adadd91/ReportSection?filter=Product~2FBrand%20eq%20%27Contoso%27%20and%20Sales~2FQuantity%20lt%205

This is a necessary practice called "escaping" that applies when passing certain information in a URL. Certain characters either have special meaning to a web server or simply can't be passed in a web URL without special encoding. Spaces and forward slashes are the most common examples; they must be converted to special character strings so that the web server (in this case the Power BI service) understands what the web client (in this case, my web browser) is asking for in the address and query string directives. Technicalities aside, we must substitute these characters with special "escape character" strings. Cryptic as it might seem, this is simply a requirement.
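If you ever need to apply standard HTTP escaping yourself, rather than leaving it to the browser, the .NET Framework exposes it directly. This is only a sketch for completeness; in this recipe the browser handles HTTP escaping for you, and note that the Power BI service displays escaped forward slashes as "~2F" where standard encoding uses "%2F". The wrapper function name is mine; System.Uri.EscapeDataString is the framework method doing the work.

' Sketch: standard HTTP (percent) escaping of a filter expression.
Function EscapeFilter(ByVal RawFilter As String) As String
    Return System.Uri.EscapeDataString(RawFilter)
End Function

' Example: EscapeFilter("Product/Brand eq 'Contoso'") returns
' "Product%2FBrand%20eq%20%27Contoso%27".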

Report URL on Power BI Report Server

If you are using Power BI Report Server rather than the Power BI service, you can get the report URL in the same way. Simply navigate to the report and capture the URL in the address bar. You will not see object IDs like those generated by the Power BI service, just the named folder and report. For example, here is the path to a report named Sales Analysis in a folder named Report Recipes on my local report server:

http://localhost/reportspbrs/powerbi/Report%20Recipes/Sales%20Analysis

The pattern for adding filters to a report published to an on-premises report server is the same as for the Power BI service. Append the filter directive query string to the URL using the same syntax (beginning with "?filter=").
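For example, appending the earlier filter to that on-premises report URL would look something like this (the filter value is illustrative):

http://localhost/reportspbrs/powerbi/Report%20Recipes/Sales%20Analysis?filter=Product/Brand eq 'Contoso'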

Handling Special Characters

If any object names contain literal spaces or other special-purpose characters (like % or @, for example), you must explicitly replace them with properly escaped notation. In the previous example, you saw that the web browser replaces forward slashes with "~2F" and spaces with "%20" within the URL.

This is where things might get a little confusing, but it's really not complicated. The HTTP escape encoding that your web browser applies automatically uses a different syntax than the encoding the Power BI service or Power BI Report Server expects for object names. For table and field names in the filter directive, Power BI uses Unicode URL escape codes: a seven-character string beginning with "_x", ending with "_", with a four-character hexadecimal number in the middle. To represent a space, the code is "_x0020_". Use the following table for reference:

| Character | HTTP Escape Code | Unicode URL Escape Code |
| --- | --- | --- |
| Space | %20 | _x0020_ |
| $ | %24 | _x0024_ |
| & | %26 | _x0026_ |
| ` | %60 | _x0060_ |
| : | %3A | _x003A_ |
| < | %3C | _x003C_ |
| > | %3E | _x003E_ |
| [ | %5B | _x005B_ |
| ] | %5D | _x005D_ |
| { | %7B | _x007B_ |
| } | %7D | _x007D_ |
| " | %22 | _x0022_ |
| + | %2B | _x002B_ |
| # | %23 | _x0023_ |
| % | %25 | _x0025_ |
| @ | %40 | _x0040_ |
| / | %2F | _x002F_ |
| ; | %3B | _x003B_ |
| = | %3D | _x003D_ |
| ? | %3F | _x003F_ |
| \ | %5C | _x005C_ |
| ^ | %5E | _x005E_ |
| \| | %7C | _x007C_ |
| ~ | %7E | _x007E_ |
| ' | %27 | _x0027_ |
| , | %2C | _x002C_ |

Let’s say that I need to specify a filter for a field named Product Category in the Product table, and I only want to see data for two specific categories, Accessories and Bikes. Before adding escape codes, the filter string would look like this:

?filter=Product/Product Category in ('Accessories','Bikes')

If I replace the necessary characters (within object names) with Unicode URL escape codes, the filter string becomes:

?filter=Product/Product_x0020_Category in ('Accessories','Bikes')

Additionally, my web browser automatically adds HTTP escape codes for the forward slash, and for the spaces and quotation characters outside of object names, which produces this string:

?filter=Product~2FProduct_x0020_Category%20in%20(%27Accessories%27,%27Bikes%27)

In summary, you do not need to apply the HTTP escape codes because your browser will do that for you; if you capture the URL with escaped characters already in it, that's fine. However, you must replace spaces and special characters within the filter directive object names with the appropriate Unicode escape codes.

Now that you understand the syntax of the filter directive and the need for escape codes to replace certain characters and spaces, you are equipped to build a fully-qualified URL for the report and folder.

Build the Report Navigation Solution

In this solution, you will make enhancements to a fairly standard SSRS paginated report. You will use a custom Visual Basic function and expression to encode a dynamic report URL with parameters to filter a Power BI report.

Design the Source Paginated Report

A starting report is provided for you. This is a simple SSRS paginated report connected to the SSRS2019Recipes sample database.

1. Make a copy of the Sales Summary-Starter.rdl file and rename it Sales Summary.rdl.

2. Open Sales Summary.rdl in Report Builder (shown in Figure 1).

clip_image002

Figure 1

3. Open the data source and make sure you can connect to the SSRS2019Recipes database on your SQL Server. Change the server name in the connection if necessary and test the connection.

4. Leave Report Builder open to resume report design work after the following steps.

Open and Publish the Power BI Target Report

1. Open Power BI Desktop and then open the Sales Analysis (Drillthrough Target).pbix report file.

2. Publish the Sales Analysis (Drillthrough Target) report to a workspace in your Power BI tenant or to a folder on your Power BI Report Server.

Note: If you are using Power BI Report Server rather than the Power BI Service, you must use the version of Power BI Desktop that installs with Power BI Report Server. To publish a report in Power BI Desktop for report server, choose Save As… from the File menu. To publish a report to the Power BI service, choose Publish from the File menu.

3. Navigate to the published Sales Analysis (Drillthrough Target) report in your web browser, shown in Figure 2.

4. Select the entire URL in the address bar and copy it to the Windows clipboard.

clip_image004

Figure 2

Continue Designing the Source Paginated Report

Now that you have the complete URL for the published Sales Analysis Power BI report in the clipboard, you can return to Report Builder where you were previously editing the Sales Summary SSRS paginated report.

1. In Report Builder, select the report parameter named PBIReportURL.

2. Right-click the selected parameter and choose Edit Parameter from the menu.

1. Select the Default Values page.

2. Click to select the Specify values radio button.

3. Click the Add button.

4. Paste the URL for the Power BI report into the Value box as shown in Figure 3.

5. Click OK to accept the parameter changes.

clip_image006

Figure 3

Create a Custom Function to Encode Object Name Text

1. In Report Builder, right-click the report designer background outside of the report body (dark gray area) and select Report Properties from the menu.

2. In the Report Properties dialog, select the Code page.

3. In the Code box, enter the following Visual Basic.NET code.

Note: The custom code editor has no syntax validation or debugging features, so it can be challenging to debug and correct errors in your code. I recommend that you challenge yourself by hand-entering this code but if you have any trouble, a copy of this code is provided in the Code.txt file for your convenience.

Function EncodeObjectNames(ObjectNames As String) As String
    ' Replaces characters in table and field names with the Unicode URL
    ' escape codes that Power BI expects in the filter directive.
    Dim sText As String = ObjectNames
    '-- Replace spaces --
    sText = Replace(sText, " ", "_x0020_")
    '-- Replace $ --
    sText = Replace(sText, "$", "_x0024_")
    '-- Replace & --
    sText = Replace(sText, "&", "_x0026_")
    '-- Replace % --
    sText = Replace(sText, "%", "_x0025_")
    '-- Replace ' --
    sText = Replace(sText, "'", "_x0027_")
    Return sText
End Function

The completed function is shown in Figure 4.

4. Click OK to accept the code changes and close the Report Properties dialog.

clip_image008

Figure 4
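Before moving on, it can be helpful to confirm what the function returns. For example, a throwaway textbox expression such as the following (the table and field names match those used later in this recipe):

=Code.EncodeObjectNames("Product/Product Subcategory")

should display Product/Product_x0020_Subcategory: the space is replaced with its Unicode escape code, while the forward slash is left for the browser to encode.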

Add Link Icon

1. Select the Insert tab in the main Report Builder application.

2. From the Insert ribbon, click the Image button.

The mouse pointer changes to an Image icon.

3. In the table, click the right-most cell in the Detail column to drop the new image.

The Image Properties dialog opens.

4. On the Image Properties dialog, click the Import button.

5. Drop down the file extension filter list which is currently set to JPEG files (*.jpg).

6. Select the option to show all file types: All files (*.*).

7. Browse to the correct folder and select the Link Icon.png file.

clip_image010

Figure 5

Add Navigation Action

1. Select the Action page in the Image Properties dialog.

2. Under Enable as an action:, click to select the radio button labelled Go to URL.

3. Click the “fx” button to the right of the Select URL: drop-down list.

clip_image012

Figure 6

4. In the Expression dialog, enter the following code.

Note: Be particularly mindful of the literal single quotes surrounding the ProductSubcategory field reference at the end of the expression.

=Parameters!PBIReportURL.Value
& "?filter=" & Code.EncodeObjectNames("Product/Product Subcategory")
& " eq '" & Fields!ProductSubcategory.Value & "' and Date/Year eq " & Fields!Year.Value
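With illustrative row values (a ProductSubcategory of Mountain Bikes and a Year of 2019), the expression appends a query string like this to the report URL, and the browser then applies its own HTTP escaping as described earlier:

?filter=Product/Product_x0020_Subcategory eq 'Mountain Bikes' and Date/Year eq 2019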

5. Click OK to accept the new image settings.

6. Save your changes to the Sales Summary report.

7. Switch to Preview and test the drillthrough navigation link by clicking the link icon on any row.

clip_image014

Figure 7

The target report should show only data for the selected Subcategory and Year.

8. Deploy the Sales Summary paginated report to your report server or Premium Power BI tenant.

9. Test the deployed Sales Summary report to verify that it works as it did in the designer.