It has been a few weeks since I last posted, and a while since I started the promised blog series “Doing Power BI the Right Way”. I’ve received a lot of positive comments on the posts in the series thus far, and people wondering why it’s taking so long. Well, I am busy working on a book about Paginated Reports and SQL Server Reporting Services. This will be the second edition of a book I co-authored ten years ago, and this edition is also a collaborative effort with multiple contributors. We have quite a lot of excellent material that is just starting to go through the process of editorial and technical review. Writing a technical book always takes more time and effort than imagined, and working on this one in my “spare time” is a laughable notion, but this is going to be a great book.
Writing this blog series while juggling other commitments has brought me to two realizations: 1. Posting articles in the series will be sporadic, and 2. The topics will be in fairly random order based on the projects I’m working on and the topics that are most top-of-mind at the time.
This brings me to the subject of this post: Paginated and Analytic reports.
Before we had Power BI, we had Reporting Services. When the focus of my career and consulting practice became Business Intelligence, starting about fifteen years ago, most of the “BI reports” I created were in SQL Server Reporting Services (SSRS). We could create a variety of charts and reports that allowed users to drill down from summary to details. We could create reports that allowed a user to drill through, from one record or report item to another report that presented more details. With a little creativity, we could create reports that allowed users to click items that would filter other items on the same report. It took some work and time to create these interactive “drill-through-to-self” reports. Today, after transforming data into the right format for a tabular data model, this type of interactive functionality just magically happens in Power BI with very little report design effort. But Power BI is primarily a tool for analyst users to create their own reports and to get answers to business questions, rather than for IT developers to create reports that print or export perfectly to a PDF or Excel. Although it is possible to create fairly advanced paginated reports, most bread-and-butter operational reports are really just what I call “lists of stuff”… the results of SQL queries presented as a table or matrix, perhaps with a few groupings and subtotals. Simple business processes and tasks often require simple reports to help manage them.
Paginated reports, by definition, are reports that allow a lot of data to be presented on multiple pages. Paginated reports can be viewed online, printed or exported to a variety of file formats. Because there is so much overlap in the capabilities of SSRS/Paginated Reports and Power BI, the choice between these two report tools is not always clear. I can cite many cases when using Power BI Desktop is clearly the right choice to create an analytic report, and cases where it would make more sense to create an operational report using Paginated Report Builder. I think these are the easier decision points, but I would like to explore those cases where we could argue either way. What are the pros and cons of using Power BI to create traditional reports? When would it make more sense to replace a static, paginated report with an interactive report created with Power BI?
The integration of Paginated Reports into the Power BI cloud service continues to march forward at a rapid pace. Just yesterday, Chris Finlan teased a blog post announcing new Power Automate integrations that will allow reports to be distributed and exported using flows. Listen for some big announcements at Microsoft Ignite about Paginated Reports in the Power BI service. Whether you host operational reports on your own report server or in the cloud, Paginated Reports are an important part of most business reporting portfolios.
When do you choose to use Paginated Reports or SSRS rather than Power BI? Do you design reports differently than you would have 5 years ago? Do your users prefer interactive, self-service reports now or do they still want made-to-order static reports?
What are the deciding factors between creating a paginated report or an interactive report?
Do you create paginated reports with a Power BI dataset or SSAS model as the data source or do you only use SQL queries for those reports?
I’m interested in your ideas and questions. Please post them in the comments below this post and then I will extend this post with your input and my thoughts.
Navigating from a Power BI report to an SSRS Paginated report with filters and parameters.
Product Versions

SSRS/Paginated Reports (one of the following):
– SQL Server Reporting Services 2016 or newer
– Power BI Report Server
– Paginated Reports in the Power BI service

Power BI Desktop:
– Power BI Desktop May 2019 or newer

All versions of Reporting Services accept parameters in a similar fashion, but subtle changes were made to URL parameter formats in SSRS 2016. Some adjustments may be required if you want to apply these techniques to SSRS versions prior to 2016.
What You’ll Need

– A Power BI report connected to a data source using import, DirectQuery or SSAS direct connect mode
– Report server: Power BI Report Server or a Power BI subscription with Premium capacity (minimum: P1 or A4)
– Moderate level DAX coding skills
Difficulty level: 200
Designing the Power BI Report
This recipe primarily involves Power BI report design techniques. I’m not going to get into the details of Power BI report design but will cover the basics with a partially-completed report to get you started. If you are less-experienced with Power BI you can use this as an example for future report projects.
The sample database and files will be available in the forthcoming book: SQL Server Reporting Services Paginated Report Recipes, 2nd Edition (working title).
These instructions are provided as an example but refer to files that will be available when the book is published. Please contact me in the comments with questions and feedback.
Figure 1 shows the Sales Summary Power BI report, which is deployed to my local Power BI Report Server along with the Sales Detail, a paginated (SSRS) report also deployed to my local Power BI Report Server.
We will start by looking at the target report. This is just a standard SSRS paginated report. A completed copy of this is provided for you to configure and deploy to your report server or Premium Power BI service tenant. The sample report is very simple and could easily be enhanced to include more information in greater detail.
1. Open the Sales Detail.rdl report in Report Builder
There are four report parameters in this report, two of which are used to connect to the SQL Server database. The other two parameters are used to pass filter values to the underlying query.
2. Edit the ServerName parameter and set the Default property to the name of your server or instance
The default value for this parameter is LocalHost, which connects to my local default instance of SQL Server. You can leave this setting if your SQL Server instance is also a local default instance.
3. Close and save any changes to the ServerName parameter
4. Preview the report and make sure it connects and displays data from the sample database
5. Select different values for the Year and Country parameters using the parameter dropdown lists and click the View Report button to see values in the report change, and to confirm that the parameters are working correctly.
6. Use the Save As… option on the File menu to save the report to a folder on your report server or a workspace in your Premium Power BI tenant
Note: Depending on the Report Builder menu option and deployment target, the terms “save” and “publish” are used interchangeably, meaning that the report has been deployed to a server or service where it can be run and viewed by users.
Get the address for the published paginated report
1. Open Notepad or your favorite text editor
2. In a web browser, enter the address for your report portal or Power BI portal where you published the Sales Detail paginated report
Tip: If you don’t know the address for your web portal or report server, you can use Report Server Configuration Manager to look up the Web Service URL and Web Portal URL. By default, the web portal address is http://SERVERNAME/Reports and the report server address is http://SERVERNAME/ReportServer (where SERVERNAME is the name of the server where Reporting Services is installed), but this is configurable and might be different in your environment.
3. Navigate to the folder or workspace and run the newly published report, verify that the report runs and returns data
4. Use Ctrl-C to copy the full address for the report from the browser address bar to the Windows clipboard
5. Paste the address into the text editor
The next step can seem a little tricky but it’s not really complicated. You need to get the path to the report on the report server rather than the report portal.
If you published the paginated report to your on-premises report server, do the following:
1. In your text editor, copy and paste the report address to a new line
2. In the new address line, locate the text “Reports” and change it to “ReportServer”
3. Copy the modified address line to the address bar in your browser and press Enter
The browser should navigate to a text menu page similar to Figure 2.
The address for the report is actually not yet in the correct format but it was sufficient to navigate to the Home folder on the report server. The following steps are used to obtain the correctly-formatted report address.
1. Use the links displayed on the report server menu page to navigate to the folder to which you published the Sales Detail report and click the report name link to run the report
2. When the report is displayed, copy the complete address from the address bar and paste it to a new line in your text editor
This is the correct path for this report on the report server. You can now add parameter values to the end of the address.
3. Append parameters and values to the address in the address bar like this. Each parameter is preceded with an ampersand character:
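For example, assuming the report was published to a folder named Samples (the folder name and parameter values here are hypothetical; your path will differ), the address with the Year and Country parameters appended would look something like this:

```text
http://localhost/ReportServer?/Samples/Sales Detail&Year=2019&Country=United States
```

The report path follows the question mark, and each parameter name/value pair is appended with an ampersand.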
4. Press Enter and verify that the parameter is changed in the report
Open the starting report in Power BI Desktop and add drillthrough navigation for the paginated report
1. Make a copy of the Sales Summary-Starter.pbix file and rename it Sales Summary.pbix
2. Open Sales Summary.pbix in Power BI Desktop
If you will be using Power BI Report Server to deploy reports on premises, make sure that you use the version of Power BI Desktop designed for your server version. If in doubt, you can download and install Desktop from the download menu in your report web portal.
3. On the Home ribbon, click the Edit Queries button to open the Power Query editor
4. You will see three query parameters. Two of these parameters are used to manage database connection information, in a similar manner to the Sales Detail paginated report.
Tip: Using parameters in the Power Query editor can help manage variable information like connection information, file paths and addresses.
In the starter report file, there are three text type query parameters. Modify these parameters using the following instructions:

– ServerName: the SQL Server address or server name. You can use “LocalHost” if it is a local instance.
– ReportURL: the path to the deployed SSRS/paginated report
5. If necessary, change the ServerName parameter current value to connect to your SQL Server
6. Copy the report server address for the deployed Sales Detail report from your text editor and paste it into the Current Value of the ReportURL parameter.
Build the dynamic target report address link as a measure in Power BI
The ReportURL Parameter value is exposed as a column in a table named Constants. You will use this to build a dynamic link in the report.
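One simple way to surface a query parameter as a table column is a blank query that wraps the parameter value in a one-row table. The following M is a sketch of that pattern, assuming a query parameter named ReportURL already exists:

```m
// Constants query: exposes the ReportURL parameter as a one-row, one-column table
let
    Source = #table( type table [ ReportURL = text ], { { ReportURL } } )
in
    Source
```

Naming the query Constants causes it to load into the model as a table named Constants with a ReportURL column.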
1. On the Home ribbon in the Power Query editor, click Close & Apply to process queries and return to the report designer.
2. Figure 3 shows the report designer in Power BI Desktop. The objective is to add links to each item displayed in the table visual, allowing a user to see a paginated report with details related to a selected year and country.
3. Expand the Sales Summary Subcategory Country table in the field list on the right
4. Click the ellipsis (…) to the right of table name to see the context menu and select New Measure
5. The formula bar is displayed above the report canvas with placeholder text: Measure =
Note: You can use the down arrow on the right side of the formula bar to expand the formula editor window, and the up arrow in the same location to collapse the formula editor
6. Replace the placeholder text with the following new measure:
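The complete measure is provided in the sample files; the following is a minimal sketch of the pattern, assuming Year and Country columns in the Sales Summary Subcategory Country table (those column names are assumptions) and the ReportURL column from the Constants table:

```dax
Detail Report Link =
VAR SelectedYear = SELECTEDVALUE ( 'Sales Summary Subcategory Country'[Year] )
VAR SelectedCountry = SELECTEDVALUE ( 'Sales Summary Subcategory Country'[Country] )
VAR RptURL = SELECTEDVALUE ( Constants[ReportURL] )
RETURN
    RptURL & "&Year=" & SelectedYear & "&Country=" & SelectedCountry
```

The & operator implicitly converts the year value to text when assembling the address string.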
The purpose of this measure is to build the address text used for our link to the detail paginated report. The SelectedYear and SelectedCountry variables get those respective values. The RptURL variable gets the report path from the query parameter. This DAX script assembles the URL in the appropriate format.
As you enter the text, use the Intellisense and autocompletion features offered by the editor to build the DAX code. These features can help with proper code syntax and save you from typing errors.
7. Press Enter to validate the code and save the measure
8. With the measure selected in the field list on the right, switch to the Modeling ribbon
9. Use the Data Category drop-down to select Web URL (see Figure 4)
Add report links to the report
With the Detail Report Link measure added to the data model and categorized as a web URL, you can add a link to the table that will dynamically generate a context-specific link for report navigation.
1. Before you go any further, click the floppy disk icon on the toolbar to save your work
2. Select the table on the right side of the report canvas that shows sales by product category grouped by years
3. Drag the new Detail Report Link measure from the field list to the Values field well below the SalesAmount field. This adds a new column to the table visual showing a long URL in blue “link” text (see Figure 5)
Tip: It is easy to deselect a visual, so before you begin changing properties, make sure that the table visual is still selected on the report canvas
4. In the left-most blade in the panel to the right of the report canvas, click the paint roller icon to see the properties for the table visual, shown in Figure 6
5. Expand the Values group
6. Locate the URL icon property and switch the state of the property to On
7. Verify that the web addresses displayed in the last column of the table are replaced with a small blue link icon, which you can see in Figure 7
Test and publish the completed report solution
At this point, navigation should work from the report in Power BI Desktop, just as it will after the report is deployed.
1. Hover over the link icon on a few different rows to see the parameters at the end of the address in the tooltip
2. Test the report navigation by clicking a link after noting the corresponding year and country.
3. Publish or save the paginated report to your report server or Premium Power BI tenant
4. Test drillthrough navigation in the deployed reports
5. Navigate to the deployed copy of the Power BI report
6. Locate a row with values for a specific year and country and click the corresponding link
The paginated report opens in the browser
7. Verify that the paginated report parameters reflect the selected year and country
Report navigation is the essence of data exploration and gives users the ability to understand their data in proper context. Dashboards and summary reports help business users see the big picture, but having access to details lets them understand the numbers behind it. Power BI is an ideal tool for summary-level, interactive reporting, and paginated reports are great for consuming detail-level data. Using this technique gives you both of these capabilities. Power BI reports can be published to your on-premises Power BI Report Server or to the cloud Power BI service. Paginated reports do not have to be in the same location and can be deployed on-prem to a standard SSRS report server or Power BI Report Server, or to your Premium capacity Power BI service tenant.
A couple of things to keep in mind when using this technique: there are string size limits when passing parameters in a web URL, and you should avoid passing sensitive information in clear-text parameters. Also, when navigating between reports located on different servers or services, users may be prompted for their credentials to log in.
This is a question that comes up all the time. Power BI licensing is not complicated but a common challenge is that the person who sets up a new Power BI subscription and tenant within an organization is often not the same person who manages Office 365 or Azure service licensing for the organization. I’ve consulted on projects for several organizations where folks just didn’t know who to talk to or how to proceed after testing the water with Power BI. After setting up a new subscription, IT professionals and business data analysts often don’t know how to license Power BI for company use and share reports and datasets with others.
This post will show you how licenses are assigned to users and, more importantly, what to request from your support desk or administrators who may be unfamiliar with Power BI and Office 365 user licensing. Keep reading for background information about why this is important and necessary.
You can use Power BI in one of three modes:
1) If you’re a one-person organization or don’t need to securely share content online with anyone, you can just use Power BI for free. Yep, free. No feature limits.
2) If your organization has a few hundred users or less, you will need to buy a Power BI Pro license for every user to author or view reports. Unless you publish report content publicly to the web, every user must have a Pro license – period.
3) If you have several hundred users or you have so much data that you need to manage it with dedicated capacity, it may be cost-effective to purchase a Premium capacity tenant for a few thousand dollars a month. Pro licenses are still required to publish reports but anyone in the organization can view published reports online or an on-premises Power BI Report Server.
Power BI Subscription Basics
Let’s say that I work for XYZ company and my work address is Paul@xyz.com. Assuming that a Power BI subscription doesn’t yet exist, if I go to PowerBI.com and set up an account using my email address, I have created a Power BI subscription for my company that is a tenant within the Power BI service. I could be the janitor for a multinational corporation, but I am now the administrator of the tenant.
By way of definitions; the Power BI Service is the entire Power BI offering within the Microsoft Azure cloud. At any time, it could encompass hundreds of virtual machines geolocated in data centers throughout the world. When you subscribe to the service, you are – in effect – renting some space within the service. The “space” that you rent and manage for your subscription is called a tenant. It’s sort of like renting some office space or an apartment in a large building. You don’t really own it but you are paying for the right to live there. You can read about these and other administrative concepts here.
After setting up a new Power BI subscription, you really have one of two options:
1) If you have the authority to purchase licensing and manage services on behalf of your organization, proceed to purchase and assign licenses for report developers and users.
2) Make a service request or contact the appropriate administrator within your organization to add and assign licenses. This might be your help desk, systems or operations admin or Office/cloud services administrator.
The Admin Take-Over
After a “less privileged” user sets up the first Power BI subscription for the organization, no one else can do the same. This can be a little confusing if some person in Finance sets up a trial account and then a BI developer tries to do the same thing. If the organization plans to move forward with a governed Power BI tenant, they can perform an “Admin Take-Over”. Contrary to what the name suggests, this is not the same as a government coup with troopers storming the building in Kevlar vests and automatic weapons. It simply means that an administrator assumes control of the new tenant and transfers admin rights from the person who established the subscription to the appropriate person(s). Adam Saxton describes how this works in this Guy-In-A-Cube installment.
Using a Web API is a convenient way to expose and consume data over an Internet connection. If you need to consume Web API data with Power Query and schedule data refresh with the Power BI service, it is essential to exercise some key design patterns and to understand and work with the Power Query Formula Firewall.
Having recently worked through numerous issues with API data feeds and deployed report configurations, I’ve learned a few important best practices and caveats – at least for some common use cases. In one example, we have a client who exposes their software-as-a-service (SaaS) customer data through several web API endpoints. Each SaaS customer has a unique security key which they can use with Power BI, Power Query or Excel and other tools to create reporting solutions. If we need a list of available products, it is a simple matter to create a long URL string consisting of the web address for the endpoint, security key and other parameters, and then just pass this to Power Query as a web data source. However, it’s not quite that easy for non-trivial reporting scenarios.
Thanks to Jamie Mikami from CSG Pro for helping me with the Azure function code for demonstrating this with demo data. Thanks also to Chris Webb who has meticulously covered several facets of API data sources in great detail on his blog, making this process much easier.
Some web APIs have a database query or other logic hard-wired to each individual endpoint. The endpoint I am demonstrating allows a stored procedure name and filter to be passed as parameters, which allows one endpoint to run any query in the database that is allowed by the developer or admin. The following information was setup for this demo:
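Using the endpoint and stored procedure name from this demo, a single concatenated address string would look something like the following (the function key placeholder and the filter value are illustrative; the key appears in the M code later in this post):

```text
https://intellibizwebapi.azurewebsites.net/api/ExecuteSQL?code=<function key>&name=uspOnlineSalesByYear&filter=2019
```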
…and if we paste the same URL into the address dialog for a new web data source in Power Query, a complete table is returned.
Good so far, right? But here’s the problem. If we use a single API call in this manner, the Power BI service may, under certain conditions, allow the data source to be refreshed. However, you cannot concatenate the query string parameters into the address this way and still have the service trust the web API connection so that it can be refreshed. Now for a more sophisticated and more realistic example.
To minimize the data volume per call and load data incrementally, web API data is often paged or filtered using a date range or category of some kind. In this example, one call returns a list of years for which there are orders. With that, orders can be loaded for each year. An actual production scenario may be more sophisticated but this demonstrates the design pattern.
The first query – the outer query – returns one row per year. Then we create another query that executes a stored procedure requiring a Year parameter, that returns all the Order records for that year.
That query, shown here, is converted into a function by adding the Year parameter.
In the typical design pattern, a custom column is added which invokes the custom function, passing the YEAR column value.
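Sketched in M, that step looks something like this (the function name fnGetOrdersByYear and the step names are placeholders for illustration):

```m
// Invoke the custom function for each year row, returning a nested table of orders
AddedOrders = Table.AddColumn( Years, "Orders", each fnGetOrdersByYear( [Year] ) )
```

Expanding the Orders column then produces a single table of order rows across all years.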
I deploy a copy of this report file to my Power BI tenant and then try to schedule data refresh. Here’s the result:
The error reads: “You can’t schedule refresh for this dataset because the following data sources currently don’t support refresh…”
The Power Query Formula Firewall prevents the queries from running because they don’t meet requirements to be trusted according to the rules of the formula firewall and the “fast combine” feature. Each query must be granted permission to run. The default permission set for web sources is “Anonymous”, which simply means that no credentials are needed. The developer simply needs to agree to let the query run.
Each query being combined must share a common address or connection string, with a compatible set of privacy level settings.
The formula firewall has a problem with us concatenating the endpoint, code and parameters into one humungous web address string. The reasons and details are less important but separating the address and other elements and letting Power Query manage them as separate objects can resolve this issue.
Here’s the refactored M code for the outer query/function. This is the pattern I’m using for all web API queries. Note that “BaseUri” is a scalar text type variable, and the other web query parameters are elements of a record assigned to the Query field of a variable named “QueryRecord”. These two variables are passed as arguments to the Web.Contents method:
```m
(Year as number) =>
let
    BaseUri = "https://intellibizwebapi.azurewebsites.net/api/ExecuteSQL",
    QueryRecord =
        [
            Query =
                [
                    code = "abFbalkeuCiozdne7PeMG0bZWAZGj65uJ3zLsYoB8zLfisrJo6gv2/Fvw==",
                    name = "uspOnlineSalesByYear",
                    filter = Number.ToText( Year )
                ]
        ],
    Source = Json.Document( Web.Contents( BaseUri, QueryRecord ) )
in
    Source
```
By letting the Web.Contents method work its magic and by conforming to the other requirements I mentioned, the Power BI service formula firewall will trust the source and allow this report to be scheduled for data refresh.
As I mentioned earlier, Chris Webb has covered a number of nuances related to this method on his blog. To understand it deeply, I suggest reading about them here.
Following the Power BI World Tour, Seattle event on Oct 30, please join me for a full day of deep learning. That’s right… it’s on Oct 31st, so put on your Wonder Woman or Captain America costume and get ready to exercise your super powers with Power Query and Power BI! You will learn to master Power Query, from beginner to advanced. The other session taught at the same time by Brian Grant is “Power BI: Enhance Your Data Model with DAX” but ya gotta pick one. You can learn more about the Power BI World Tour and the Academy by following these events on Twitter and LinkedIn using the links at the bottom of this post, or search these hashtags:
#PowerBIUG | #PowerBI | @pbiusergroup | #PowerBIUGAcademy | #PBIWorldTour
The foundations of a Business Intelligence solution are data transformations, data wrangling, data cleansing and ETL. A well-crafted Power BI project rests on Power Query and the queries that define the data model, calculations and report visuals. This full-day session will teach you how to lay the foundation for a Power BI solution with simple and advanced Power Query techniques.
Learn from Paul Turley, ten-year Microsoft Data Platform MVP and veteran BI Solution Architect. You will learn best practice design patterns, tricks, shortcuts and proven techniques to improve your skills and add immediate value to your projects. Power Query is everywhere – and growing.
The skills and techniques taught in this workshop apply to Power BI Desktop, the “Get Data” feature in Excel 2016+, SQL Server Analysis Services 2017+ (SSAS), Azure Analysis Services (AAS) and Data Flows in the Power BI Common Data Service (CDS). You will learn through exercises and instructor-led hands-on demos. Bring your laptop with the latest version of Power BI Desktop installed. The rest will be provided. We will cover material from basics through advanced. Each exercise is separate so you can absorb only what you need to learn, based on your prior experience, needs and skill level.
Power Query Basics
Quick tour of the Power Query interface & essentials
Creating and managing queries
Adding and editing steps
Recovery and project management
Essential best practices
Managing data sources
Working with folder paths, web URIs & database connections
Referencing & Duplicating queries
Consolidating queries, building base queries & dependency chains
Loading queries into data model tables
Basic error handling & debugging
Data Sources & Structures
Flat CSV files
Irregular text files (headings & totals)
JSON (complex, with nested & ragged hierarchies)
Excel (single sheet/table, multiple sheets/tables)
Folders & file collections
Web pages & page tables
Web APIs & web service endpoints
Essential Query Techniques
Managing data types
Applying correct naming conventions
Working with Date & Time values
Splitting & formatting columns
De-duplicating & grouping
Pivot, Unpivot & Transpose
Custom columns & expression basics
Extracting tables from data sources to support essential modeling for Power BI report design:
Slicer & calculation-driver tables
Advanced Power Query Techniques
Working with M: The Data Mashup language
M function essentials
Prioritized learning (what’s most important)
Using & managing parameters
Using the #shared object for internal documentation, examples & code syntax
Understanding M objects (values, tables, lists & records)
Number, Date, Time & Text manipulation M functions
Create a Date lookup/dimension table using M & Power Query
Create a Time series lookup/dimension table using M & Power Query
Why do I need a Date dimension in Power BI?
Standard date parts & hierarchies
Columns to support time-intelligence calculations
Working with fiscal & special-purpose calendars (e.g. 4-4-5, ISO)
Working with query functions
Parameterized queries, API endpoints & user-defined functions
Putting it Together
Queries to support data model construction
Queries used to support report visuals
Deploy a report, configure the on-premises gateway
Use query parameters to schedule refresh in a deployed Power BI solution
A service provider or vendor might want to publish multiple copies of a report that should connect to different database servers or databases. In a true multitenant service solution, we would have a single database with row-level user mapping tables that filter data by the logged-in user. True multitenant solutions require quite a lot of planning and development work to implement. In smaller-scale or interim solutions, copies of a report can be deployed to different workspaces, and then queries can be parameterized to use different database connections.
In this post, I’ll demonstrate deploying and configuring such a solution where the server name and database name have been parameterized and setup to use the on-premises gateway to connect and refresh data. I’ll also setup scheduled refresh. The full video walk-through is below but I’ll do a quick review to set the stage.
This is the Power Query Editor in Power BI Desktop. I have two parameters that are used to specify the ServerName and DatabaseName for each SQL Server query:
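In M, the parameters simply replace the literal server and database names in each query’s Source step. The following is a sketch; the schema and table names here are hypothetical:

```m
let
    // ServerName and DatabaseName are query parameters defined in the Power Query editor
    Source = Sql.Database( ServerName, DatabaseName ),
    // Example table reference; substitute your own schema and table
    Sales = Source{[ Schema = "dbo", Item = "FactSales" ]}[Data]
in
    Sales
```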
Once deployed to a workspace in the service, the gateway must be configured with a data source for every possible server name and database combination. In my example, I can connect to my local server using the NetBIOS name, IP address or LocalHost. These are all acceptable methods, but a data source must be added to the gateway configuration for each, so that the connection strings match exactly. Remember that the connection is from the on-prem gateway to the database server, so names like LocalHost or an internal IP address will work just fine. In my example, I’m using the IP address of the loopback adapter on the local machine to connect to a local instance of SQL Server over a TCP connection.
In the workspace, the dataset is bound to the gateway. Click the ellipsis and choose Settings.
To bind the gateway to the dataset, click to select the radio button next to the gateway. This flips the switch titled “Use a data gateway”. Apply the setting and then you can refresh the data or schedule refresh.
Finally, the parameters can be updated right here in the dataset settings.
The message from the Microsoft Business Applications Summit this week was very clear: Power BI is growing up.
We have known for a while that Power BI is a great front-end tool for enterprise-scale tabular and multidimensional models. Power BI Premium will now allow almost limitless data scale using extremely large datasets by leveraging composite models that employ both DirectQuery and VertiPaq cached mode tables. This feature shatters the previous barriers where data models could only support either DirectQuery or cached mode (imported data). Additionally, Microsoft publicly unveiled plans to add IT developer centric features to Power BI Desktop for managing large models, version control and application development lifecycle.
These announcements leave many open questions about how Microsoft will continue to support self-service and enterprise customers with one tool, and about the long-term future of SSAS and Visual Studio-based SSDT. At this point, none of these tools are on the chopping block, but it is clear that James Phillips and his team have significant momentum and are not slowing down to wait for other products to catch up.
In this interview with Christian Wade, he describes the benefits of this new capability. Christian is Group Program Manager on the Power BI and Analysis Services product team with focus on data model development and scalability features.
Composite Models & Aggregations
In the opening keynote and again in his sessions, Christian demonstrated Power BI reports on the taxi driver activity database with over a trillion rows of raw data. The larger dataset was in a Spark cluster, accessed using DirectQuery. Aggregated tables were stored in the in-memory model using the new composite model feature. As the data was explored in report visuals, the Power BI engine would seamlessly switch from tabular in-memory aggregate tables to the DirectQuery source data in order to return low-level details. Composite models will allow mashing up imported database and file-based data with DirectQuery sources.
There are limits and complexities with these new features. You cannot mash up imported tables in a Power BI model based on a live connection to SSAS, but the enterprise-scale features in Power BI arguably may not steer a solution architect to select SSAS over Power BI for serious data modeling. With incremental data refresh, large model support, row-level security and many other “big kid” features, Power BI might be the preferable choice. I’m not ready to rule out Analysis Services as the better option for most enterprise solutions, at least not in the near future, but Power BI is definitely heading in that direction.
Power BI Desktop Features for Developers
News Flash: Power BI Desktop will be the primary tool for serious, enterprise data model development. A bunch of features are in the works and will arrive soon for preview. The obvious question is what will happen to SSDT for Visual Studio and the answer is that it’s not going away but the product team is putting their energy into enhancing Power BI Desktop for now. It will likely have some kind of “Beginner/Advanced” switch to simplify things for the novice user and isolate advanced or complicated, developer-centric features.
Here are a few screen captures of the forthcoming features:
Schema Compare & Model Integration
Like the BISM Normalizer tool created by Christian Wade, there will be a fully-supported compare and merge tool baked into Desktop. The early build is branded with components from MAQ Software.
Relationship Diagram Designer, Properties Pane and Application Lifecycle Management Features
A new-and-improved diagram designer is imminent. It will allow large models to be more effectively managed and displayed. You will have the ability to save individual views of the diagram with selected, subject-area tables.
Like in Visual Studio, properties can be set and managed in a Properties sheet window. Support for standard SSAS features like display folders is coming and can be managed here.
Business Applications Release Notes
With the velocity of features and updates, a comprehensive set of release notes is available here. This document will be updated frequently with the revised roadmap, feature history and announcements.
APIs and PowerShell Cmdlets for Power BI Administration
At the MVP Summit in March, Kay Unkroth promised API endpoints and PowerShell cmdlets for managing all aspects of Power BI. He even did a dance and sang a song to make the announcement memorable. This week, the product team delivered what they promised, with the official announcement here.
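As a quick sketch of what administration scripting looks like with the released cmdlets (this assumes the MicrosoftPowerBIMgmt module from the PowerShell Gallery; check the official announcement for the current cmdlet names and parameters):

```powershell
# Install the Power BI management module from the PowerShell Gallery
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser

# Sign in to the Power BI service
Connect-PowerBIServiceAccount

# List workspaces visible to the signed-in user
Get-PowerBIWorkspace

# Tenant admins can enumerate every workspace in the organization
Get-PowerBIWorkspace -Scope Organization
```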
The following features and capabilities are on the delivery short list coming in the next 2-3 months:
Power BI Service
The new “modern workspace” will eventually replace the current workspaces, which are integrated with Office 365 Groups. The new app workspace will no longer be bound to an Office 365 Group. The feature will be optional at first but will replace current workspaces soon.
Now, users can be: Member, Admin, Contributor
Can add groups to workspace
Premium capacity tenants can have nodes in different regions
Admins will have visibility and access to all objects and resources
Microsoft Business Applications Summit
Keynotes and Featured Sessions from the Microsoft Business Applications Summit are here.
This is a post about a post about a post. Thanks to those of you who are entering comments in the original May 12 post titled SQL, M or DAX? This is a popular topic. And thanks to Adam Saxton for mentioning this post in his Guy in A Cube Weekly Roundup.
This is a HUUUUGE topic and I can tell that I’ve struck a chord with many BI practitioners by bringing it up. Please post your comments and share your ideas. I’m particularly interested in hearing your challenging questions and your thoughts about the pros-and-cons of some less-obvious choices about whether to implement transformations & calculations in SQL, M or DAX.
This week, I have had engaging conversations on this topic while working on a Power BI consulting project for a large municipal court system. As a consultant, I’ve had three weeks of experience with their data and business environment. The internal staff have spent decades negotiating the intricacies and layers upon layers of business process, so of course I want to learn from their experience, but I also want to cautiously pursue opportunities to think outside the box. That’s why they hired me.
Tell me if this situation resonates with you… Working with a SQL Server database developer who is really good with T-SQL but fairly new to Power BI & tabular modeling, we’re building a data model and reports sourced from a line-of-business application’s SQL Server database. They’ve been writing reports using some pretty complicated SQL queries embedded in SSRS paginated reports. Every time a user wants a new report, a request is sent to the IT group. A developer picks up the request, writes some gnarly T-SQL query with pre-calculated columns and business rules. Complex reports might take days or weeks of development time. I needed to update a dimension table in the data model and needed a calculated column to differentiate case types. Turns out that it wasn’t a simple addition and his response was “I’ll just send you the SQL for that…you can just paste it”. The dilemma here is that all the complicated business rules had already been resolved using layers of T-SQL common table expressions (CTEs), nested subqueries and CASE statements. It was very well-written SQL and it would take considerable effort to re-engineer the logic into a dimensional tabular model to support general-use reporting. After beginning to nod-off while reading through the layers of SQL script, my initial reaction was to just paste the code and be done with it. After all, someone had already solved this problem, right?
The trade-off of using the existing T-SQL code is that the calculations and business rules are applied at a fixed level of granularity and within a certain business context. The query would need to be rewritten to answer different business questions. If we take the “black box” approach and paste the working, tested SQL script into the Power Query table definition, chances are that we won’t be able to explain the query logic in a few months, after we’ve moved on and forgotten this business problem. If you are trying to create a general-purpose data model to answer yet-to-be-defined questions, then you need to use design patterns that allow developers and users to navigate the model at different levels of grain, across different dimension tables and in different filtering contexts. This isn’t always the right answer, but in this case I am recommending that we do as little data merging, joining and manipulation as possible in the underlying source queries. The table mappings between the source and the data model are not one-to-one, however. In some cases, two or three source tables are combined using SQL joins into a flattened and simplified lookup table – containing only the necessary, friendly-named columns and keys, and no unnecessary clutter like CreatedDateTime, ModifiedDateTime and CreatedByUser columns. Use custom columns in M/Power Query to produce the row-level calculated values, and DAX measures to perform calculations in aggregate and within filter/slicing/grouping context.
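To make that division of labor concrete, here is a minimal sketch using hypothetical table and column names (Source, Cases, CaseType, Status): the row-level business rule is applied once as a Power Query custom column at refresh time, while the aggregation lives in a DAX measure that responds to whatever filter, slicing and grouping context the report applies.

```m
// Power Query: row-level business rule computed once, at refresh time
// (Source and CaseType are hypothetical names for this illustration)
Table.AddColumn(Source, "Case Category",
    each if [CaseType] = "Infraction" then "Minor" else "Major")
```

```dax
-- DAX: aggregate calculation, evaluated within the report's filter context
Open Case Count =
CALCULATE ( COUNTROWS ( Cases ), Cases[Status] = "Open" )
```

The same measure then answers many yet-to-be-defined questions – by month, by case category, by court – without rewriting any query logic.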
I’d love to hear your thoughts and ideas on this topic.
Yesterday a friend asked for a little help getting started with Power BI. He’s a DBA and system administrator and wanted to cut his teeth on Power BI with a really simple dashboard-style scorecard report. Using a list of database servers with license expiration dates, he thought it would be a simple matter to calculate and show the expiration status for each server using a simple traffic light indicator. The envisioned server list might look something like this:
Makes perfect sense, right? This is a basic use case and a good application for simple KPIs; with the one minor caveat that POWER BI DOESN’T SUPPORT THIS!
This topic has become a bit of a soapbox topic for me because it’s a capability that, in my opinion, is a very obvious gap in the Power BI feature set. After unleashing my rant, I’ll demonstrate a solution in this post.
The most interesting thing about this missing feature is that it has existed for many years in the products that evolved into the current Power BI product. Key Performance Indicators (KPIs) are defined as scriptable objects in SQL Server Analysis Services (SSAS) with tremendous flexibility. KPIs are simple… the STATE element of a KPI (often considered “Bad”, “OK” or “Good” status) is translated into a visual indicator, usually an icon (commonly “Red”, “Yellow” or “Green”, respectively). There are variations on this theme, but it’s a very simple concept and a good solution has existed for many years. In SSAS Tabular, the State logic was dumbed-down to a slider control that eliminated some of the flexibility we have in the earlier multidimensional project designer, but it still works. The slider UX expects that the state applies when a value is equal to or greater than the threshold for yellow and green, and less than the threshold value for red. Queries returned from SSAS include metadata that tells Excel, Power BI visuals or a variety of other client tools: “The KPI state is 1 (meaning ‘good’), so display a square green icon for this item”. If you have the luxury of building your data model in Analysis Services using the SQL Server Data Tools (SSDT) designer for tabular models – or in Power Pivot for Excel – you would define a KPI using this dialog:
The actual return value for a KPI designed this way is really just -1, 0 or 1, which typically represent the “Bad”, “OK” and “Good” states, respectively. As I said, you have other options, like switching the red/green positions or using 5 states rather than 3. The multidimensional KPI designer gives you even more flexibility by allowing you to write a formula to return the VALUE, STATE and TREND element values for a KPI separately. It would be wonderful to have the same capability in Power BI. It would be marvelous if we could use the slider UI like this and then have an Advanced button to override the default logic and define more complex rules in DAX! The SSAS architecture already supports this capability, so it just needs to be added to the UI.
If you design your data model using SSAS multidimensional or tabular, or using Power Pivot for Excel (which was the first iteration of Power BI) KPIs are just magically rendered in native Power BI visuals like a Table or Matrix. But alas, Power BI Desktop does not have this well-established feature that could easily be ported from Power Pivot or the SSAS Tabular model designer.
</ END RANT>
…back to my friend’s simple scorecard report.
Using out-of the box features, the best we could do was this…
Create a calculated column in the table that returns -1 when the expiration date has passed, 0 if it is today and 1 if the expiration date is in the future. Here’s the DAX script for the column definition:
Expiration Status Val =
IF ( [EndofLifeDate] < TODAY (), -1,
    IF ( [EndofLifeDate] > TODAY (), 1, 0 )
)
Next, add some fields and the new column to a table visual and use the Conditional Formatting setting in the table properties to set rules for the Back Color property of the calculated column, like this:
Here’s the table with the conditionally-formatted column:
Why Not Use the KPI Visuals?
The standard KPI visual in Power BI is designed to visualize only one value rather than one for each row in a table. Like an Excel Pivot Table, a Power BI table will simply visualize KPIs if they were defined in a Power Pivot or SSAS cube or model, but the Power BI model designer doesn’t yet offer the ability to create KPI objects.
Several community developers have tried to fill the feature gap with custom visuals but every one of them seems to address different and specific use cases, such as time-series trending or comparing multiple measure values. I have yet to use one of the available KPI visuals that just simply allows you to visualize the KPI status for each row in a table, without having to customize or shape the data in unique and complicated ways.
How to Design Status KPIs With Indicators
Here’s the fun part: Using the Expiration Status column values (-1, 0 or 1), we can dynamically switch-out the image information in another calculated column. Power BI has no provision for embedding images into a report in a way that they can be used dynamically. You can add an image, like a logo, to a report page and you can reference image files using a URL but you cannot embed them into a table or use conditional expressions.
Using this trick, you can conditionally associate images with each row of a table. This is a technique I learned from Jason Thomas, whose blog link is below. Using a Base64 encoder, I encoded three state KPI indicator images as text which I then copied and pasted into the following calculated column formula DAX script:
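The pattern is a calculated column that switches on the status value and returns a data-URI string. The sketch below shows the shape of that DAX script; the Base64 strings shown are truncated placeholders (the real encoded strings run to thousands of characters), and the column names follow the example in this post:

```dax
KPI Status Image =
VAR Prefix = "data:image/png;base64,"
RETURN
    SWITCH (
        [Expiration Status Val],
        -1, Prefix & "iVBORw0KGgo...",  -- red indicator (truncated placeholder)
        0,  Prefix & "iVBORw0KGgo...",  -- yellow indicator (truncated placeholder)
        1,  Prefix & "iVBORw0KGgo...",  -- green indicator (truncated placeholder)
        BLANK ()
    )
```

Once the column’s Data Category is set to Image URL, a table visual decodes and renders the embedded image on each row.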
The encoded binary strings correspond to these three images, in this order:
To reuse this, you should be able to simply copy and paste this code from here into a new calculated column. You no longer need the image files because that binary content is now stored in the table column. It really doesn’t matter what labels you use for the status key values as long as they correspond to the keys used in the preceding code. I’m using the conventional -1, 0 and 1 because that’s the way SSAS KPIs work.
On the Modeling ribbon, set the Data Category for the new column to “Image URL”:
That’s it! Just add any of these columns to a Table visual and WHAM, KPI indicators!
*Incidentally, since adopting Jason’s technique, Gerhard Brueckl came up with a method that utilizes Power Query to manage and import image files, which I will use in the future. Prior to that, I used the encoding site Jason recommended in his post. My thought is that if a separate table stored only three rows (one for each KPI status), the status key value could be used to relate the tables. It would be interesting to see whether using a related table reduces the PBIX file size, or whether VertiPaq can effectively compress the repeating values of the image column. That may be a good topic for a later post.
The results of last month’s Power BI Global Hackathon are in! The Hackathon was facilitated by our PUG here in Portland, with the goal of welcoming global contenders in subsequent contests. Five teams entered the contest, using publicly-available data to visualize and tell data stories using our favorite data analysis platform. Congratulations to Xinyu Zheng and Ron Barrett for winning the challenge with their entry analyzing Yelp restaurant star ratings. These were all great entries, and you can view the contest results in the Power BI report below.
Here are the published projects that were entered in the Hackathon:
Xinyu and Ron analyzed ratings from nearly 1,200 restaurants in Pittsburgh, Phoenix and Las Vegas. Results compare ratings and reviews by restaurant and food categories, sentiment and key phrases in the review comments.
I loved the creativity of this solution from Jeremy Black and Kirill Perian who analyzed alcohol sales statistics using infographics and bookmarks to switch out visuals on the same page. The presentation concludes on the last page of the report with an auto-advancing variation of “100 Bottles of Beer on The Wall”. Nice touch.
I’m admittedly a bit biased because this was my design, with a lot of help from Brian, Ron and Greg. We used a series of tables to prompt a user for Power BI solution business requirements and recommend fitting solution architectures and components. We pushed some practical and technical limits in our project and I’ll write a separate post about it.
This entry from Ron Ellis Gaut is a nice, clean orchestration of county health data, measuring health and comparing personal well-being and program efficacy.
The entry from Daniel Claborne emphasizes machine learning predictions performed with R Script, commonly used in data science. He actually includes the annotated code and explains the technique and the approach using training and prediction data sets.
The Portland Power BI User Group was one of the first and continues to be one of the most active in the international community. We meet on the 4th Wednesday evening every month in Beaverton, Oregon. Today there are many active PUGs all over the world.
Question: How many different versions of Power BI Desktop might you have installed at one time?
Answer: Three (or more)
What happens when you have different versions installed, and how can you make sure that you use the right version for a given Power BI report file?
An issue came up this week when I tried to open a Power BI Desktop file (.PBIX) from File Explorer and Power BI Desktop told me I was headed down a dark and difficult path. Well, not exactly, but it displayed the following message:
Unable to open document
The queries were authored with a newer version of Power BI Desktop and might not work with your version.
Please install the latest version to avoid errors when refreshing.
When I clicked the Close button, rather than leaving me to contend with what seemed to be a complicated and potentially damaging situation, Power BI Desktop started up and continued to tell me about the perils that lie ahead, in this message:
Report layout differences might exist
This Power BI report file may have some features that aren’t available in Power BI Desktop until the next release.
If you need to see the latest version you worked with on the web (app.powerbi.com), please view the report there. We’re sorry for any inconvenience.
As an unsuspecting user, I might be confused but at least I can rest assured that the application developers at Microsoft who write these warning messages are thoughtful and apologetic.
What’s going on?
In addition to the reports I author and deploy to the Power BI cloud service, I also create reports for my on-premises Power BI Report Server. Report Server requires an older version of Power BI Desktop, which can be installed from the menu on the report server. This older version of Desktop (October 2017 in my case) is sandboxed by Windows so it doesn’t get upgraded by the latest Power BI Desktop installer when I update it from PowerBI.com. In Control Panel, you can see both installations:
The problem I experienced was a result of installing the older desktop version for PBRS after the newest version. The file extensions (PBIX and PBIT) are already associated with whatever version of desktop is installed and registered with Windows. The remedy is quite simple… just reinstall the latest version of Power BI Desktop and perform a Repair if you already have that version installed.
This next part is more informational than problematic, but it is actually possible to have additional “versions” or packages of Power BI Desktop installed. If you install Power BI Desktop from the Windows 10 Microsoft Store, you get a sandboxed installation that runs in a restricted “safe” security context. This is a good option for users in a restricted corporate network environment who don’t have local admin access to their computer. In most cases, they can install the application this way. As you can see, I actually have three separate Power BI Desktop installations.
These are all 64-bit builds of the desktop application, so I could even install 32-bit builds of Power BI Desktop as well. I would only do that for compatibility with an old 32-bit database driver, or if I were running on an old 32-bit Windows machine, which is not an ideal scenario. Keep in mind that 32-bit applications can only use a limited amount of RAM (about 3.7 GB, minus some system overhead).
This first week of the new year has been a lot of housecleaning for me (literally and figuratively… my office desk and cabinet will be clean by the end of the day!). Three years into teaching classes and workshops on being productive with Power BI, it continues to be a product requiring a lot of work to stay current. New features are introduced in every Power BI Desktop monthly update; not to mention updates to the cloud service and on-prem server. You would have to be a child of the 80s to get the Never Ending Story reference. Otherwise, it’s just a really bad flying dog – and pop song, which are both hard to explain, so we won’t. The point is that it’s an ongoing effort to keep skills, courseware and presentation material up-to-date.
If you’re like me, sometimes all these updates can be a bit of a distraction (we’re talking about Power BI again – not the dog, movie or song… case in point). I’m excited by the continual innovations and improvements to Power BI. However, the foundational rules of good design don’t really change that much. Effective data curation, correct modeling and good core visualization design are as critical as ever. The trick is to know which new features may be used to improve foundational design and which ones you can leave as icing on the cake for minor enhancements. Updating courseware and workshop labs seems to be a never ending task and I’m hard at work revising older material and adding new content to prepare for new events this year. An important topic I will continue to revisit this year is how Power BI is used along with other Microsoft tools to create different types of solutions. I’m working on a new presentation to describe all of the incarnations of Power BI, used to deliver everything from self-service desktop reports – all the way up to enterprise-scale solutions for corporate users with governed and secured data – and most everything in-between.
The first workshop of the new year will be a one day preconference before the Redmond SQL Saturday, on Microsoft campus Friday, February 9. You can sign-up for this event here. I’m working on a few others in the following months and will update this and other blog posts when they are scheduled. I hope to see you there. You are welcome to send me questions or suggestions about specific topics of focus. Just add a comment to this post, or reach me through Twitter or LinkedIn.
Since starting the Guy In A Cube series over three years ago, Adam Saxton has become the front man for Microsoft Business Intelligence feature announcements and Power BI how-to tutorials. Joined by Patrick Leblanc last year, the Guy In a Cube series features over 200 short video tutorials with at least two new clips added every week. The Guy In A Cube YouTube channel currently has over 13,000 followers.
I always look forward to spending time with both of these guys at community events. I first met Adam several years ago when he was a Senior Support Escalation Engineer for Microsoft who helped with a number of tough BI server setups. Patrick did senior level consulting work for customers in the field before joining Microsoft as Data Platform Solutions Architect. Adam and Patrick are natural born entertainers. With their deep knowledge and expertise with database technology, security and the entire BI, analytics and reporting stack; they offer one of the top online resources for learning and keeping up with the rapidly-expanding Microsoft BI platform.
At conferences and events, I rarely see Adam without a camera in his hand, so it was a real treat to get them in front of my camera for this interview. We chatted about how they got started, how they continue to use their series to highlight new and important features, and how they provide guidance to support the Microsoft BI community.
In this interview during the 2017 PASS Summit, we continue to explore the topic of this series of blog posts on Microsoft BI solutions for the enterprise. Patrick and Adam share their guidance about managing large-scale solutions, version control and multi-developer projects using Power BI, Analysis Services and SQL Server.
Between sessions at the PASS Summit, I had the privilege of interviewing Alberto Ferrari and Marco Russo; SSAS Maestros and BI industry thought leaders. Alberto and Marco are revered as expert trainers and practitioners of SSAS and Power BI solutions. They are also well known for evangelizing best practices using the DAX calculation and query language. We spoke about some of the challenges and learnings they have gained from years of large-scale project implementations.
Both SSAS Tabular and Power BI Desktop share the same characteristic in that the semantic data model definition is described in a single file. Compared with older multidimensional model projects, the single file approach simplifies much of the development effort but it can be challenging to coordinate projects with multiple developers. They shared tool recommendations to assist with project integration, version management and development. Marco and Alberto recommend a number of community supported add-ins and other freely available software to enhance Visual Studio and Power BI Desktop.
I sincerely appreciate these two gentlemen taking the time to share their insights in this short interview, and their many contributions over the years that continue to improve Microsoft’s industry-leading Business Intelligence platform and the community that supports it. A comprehensive library of resources, training, tools and articles are available at: SQLBI.com.
During the interview we talked about several different tools that they adamantly endorse for any SSAS developer to have at their disposal. From my own experience, in rough order of priority, I recommend:
BIDS Helper / BI Developer Extensions – This Visual Studio add-in is an important tool for anyone developing Business Intelligence solutions in any version of Visual Studio or SQL Server. It includes several essential features and capabilities for SSIS, SSRS and SSAS projects. It has been a community-developed and community-supported tool for many years and is considered by most experienced BI developers to be a core extension of the Visual Studio SSDT (formerly called “BIDS”) add-in.
Community-developed tools like this help to spur future enhancements to the officially-supported capabilities in later Microsoft development tooling. I suggest that you carefully read the documentation for the SSAS Tabular features in this tool, because some of the options only apply to certain versions of SSAS. Some of the BIDS Helper features for SSAS 2012 and 2014 have since been added to the officially-supported designer for SSAS 2016 and 2017. BIDS Helper was initially maintained on the Microsoft CodePlex site (here: http://bidshelper.codeplex.com/) and has since been moved to GitHub here: https://bideveloperextensions.github.io/. It can now be installed from the Visual Studio Tools > Extensions and Updates menu by searching for “BI Developer Extensions” in the Online section.
DAX Studio – This is a stand-alone application and the most comprehensive DAX, SSAS and Power BI calculation script and query editing and performance tool available. It is an open source project managed and maintained by a collaboration of several community MVPs and industry leaders (coordinators are Darren Gosbell & Marco Russo). The latest version and codebase have recently been moved from CodePlex to http://daxstudio.org.
Tabular Editor – An open source, stand-alone SSAS Tabular model design tool that runs outside of Visual Studio. It is distributed through a GitHub repo at: https://tabulareditor.github.io. The source code is maintained at: https://github.com/otykier/TabularEditor. It is described as a lightweight editor for SSAS Tabular Models built in .NET/WinForms and utilizes the Analysis Management Objects (AMO) library.
DAX Editor – An extension to SSDT that makes it easier to edit DAX measures in a Tabular project. It includes DAX syntax highlighting, DAX formatting, a textual representation of measures defined in a Tabular project, and extracting/importing measures from/to a Tabular model file (.BIM). This add-in is developed and maintained by SQLBI. It can be found in the Visual Studio Tools > Extensions and Updates menu or downloaded from SQLBI at: https://www.sqlbi.com/tools/dax-editor.
BISM Normalizer – An add-in for Visual Studio/SSDT that can be installed from the Visual Studio Tools > Extensions and Updates menu by searching for “BISM Normalizer”. This is an essential tool if you need to compare multiple SSAS Tabular projects and databases (including workspace databases) and then manage the steps to selectively merge additions and differences between objects. I have found this useful for integrating parallel development work on the same SSAS Tabular project, but I will warn you that if it is used routinely to merge multiple changes into a single project, it can be a complex undertaking and not something I recommend on a daily basis.
BISM Normalizer is developed and maintained by Christian Wade, formerly an independent consultant and now a Senior Program Manager on the SSAS product team. Although Christian works for Microsoft, this is still considered a community-supported, third-party tool not officially supported by Microsoft. It is, however, the only tool designed specifically to perform schema comparisons and object merging in SSAS.
In our discussion, we also refer to these tools and resources:
VertiPaq Analyzer – An older tool (actually an Excel Power Pivot workbook) created by Kasper de Jonge, Program Manager from the Power BI/SSAS product team, that imports SSAS DMV query results and reports object size, object data compression rates and optimization statistics. It has since been updated and enhanced. This is a convenient way to get information quickly about your SSAS or Power BI model. The same information can be obtained, with a little more effort, by running DMV queries in DAX Studio or SSMS. Links to the original and updated versions are available on the SQL BI site at: https://www.sqlbi.com/articles/data-model-size-with-vertipaq-analyzer/.
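For reference, the kind of DMV query involved looks like the sketch below, run from DAX Studio or SSMS against a Tabular instance or Power BI Desktop’s local model (this DMV reports per-segment storage statistics):

```sql
-- One row per column segment; aggregate USED_SIZE by TABLE_ID and
-- COLUMN_ID to see which columns consume the most memory
SELECT DIMENSION_NAME, TABLE_ID, COLUMN_ID, RECORDS_COUNT, USED_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
```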
Tabular Modeling in Microsoft SQL Server Analysis Services – is a Microsoft Press book written by Alberto and Marco that applies mainly to SSAS Tabular 2016 and 2017 projects. This is an absolutely invaluable reference and tutorial on SSAS design best practices and design techniques. Previous books from our Italian friends include DAX Patterns and The Definitive Guide to DAX; all excellent learning resources that I have in my book library.
Shortly after this conversation, Marco and Alberto delivered two sessions at the PASS Summit conference about DAX best practices and performance optimization. A little later, Marco sat down with Guy In A Cube hosts Adam Saxton and Patrick Leblanc to chat about these sessions. You can find that interview here. I also had a chance to catch up with Adam and Patrick in an earlier interview during the conference, which I will share in a later blog post.
Conversations with Julie Koesmarno, Olivier Matrat, Aaron Nelson, Seth Bauer and Robert Bruckner captured in video interviews below…
Continuing my video blog series of interviews from PASS Summit, I had the opportunity to catch up with several Microsoft BI and Data Platform industry leaders amid the crowds and between sessions. Stay tuned to this station for many more interviews and insider information about the Microsoft Business Intelligence and Data Platform.
I caught up with Julie Koesmarno in the Community Zone, a few days after travelling together on the SQL Train (aka “Oregon SQL Party Train to Seattle”) from Oregon SQL Saturday the weekend before Summit. She has been a non-stop community advocate for several years and continues to speak at events all over. Julie was a Business Intelligence consultant and user group leader in Australia and Southern California before joining Microsoft as a technical evangelist. You might recognize her from the executive demonstration during the opening keynote at PASS Summit last year.
Olivier Matrat, Principal Program Manager (that’s Microsoft job title code for “in charge of a lot of important stuff”), talks about how they are hard at work integrating several products in the “Power *” suite. This is the first time I’d heard of all the “Power…” prefixed product names unofficially referred to as “Power Star”, but it makes perfect sense. Olivier said that we can expect to see tighter integration between tools like Power BI, Flow and Power Apps with more embeddable features for developers and solution integrators.
Aaron Nelson, Data Platform MVP and hard-core PowerShell enthusiast, spoke about some new capabilities he presented in his session on PowerShell for Business Intelligence. The new REST API will let Power BI and report server admins orchestrate server migrations and task automation with PowerShell cmdlets. He seized the opportunity to promote the PASS PowerShell virtual group he helps manage, at SQLPS.IO. I’ve promised Aaron a follow-up post demonstrating how the REST API works with PowerShell and the new MSBuild integration, so please watch my blog for that in the next few days.
I chatted with Seth Bauer, BI consultant and Data Platform MVP, on the escalator in the Washington State Convention Center between sessions. Seth has been on the front lines of the Power BI advisors community since the product launched. He cites Q&A Natural Language and Explain the Difference as examples of the most compelling features. He participates in PASS Summit for professional networking and to stay current with BI technologies.
Robert Bruckner is a Senior Architect on the Power BI team and long-time developer lead for Reporting Services. He told me that there are many exciting capabilities on the horizon for Power BI and other integrated reporting technologies that are still under NDA. He mentioned a recent announcement that the On-premises gateway will soon support single sign-on, delegation, load balancing and high-availability. It is truly exciting to see such emphasis on enterprise-scale capabilities for these tools.