How to Configure the Power BI Gateway to Use Dataset Connection Parameters

A service provider or vendor might want to publish multiple copies of a report that connect to different database servers or databases.  In a true multitenant service solution, we would have a single database with row-level user mapping tables that filter data by the logged-in user.  True multitenant solutions require quite a lot of planning and development work to implement.  In smaller-scale or interim solutions, copies of a report can be deployed to different workspaces, and the queries can be parameterized to use different database connections.

In this post, I’ll demonstrate deploying and configuring such a solution, where the server name and database name have been parameterized and set up to use the on-premises gateway to connect and refresh data.  I’ll also set up scheduled refresh.  The full video walk-through is below, but I’ll do a quick review to set the stage.

This is the Power Query Editor in Power BI Desktop.  I have two parameters that are used to specify the ServerName and DatabaseName for each SQL Server query:

[Screenshot from video: Power BI Gateway with Parameters, 0:00:57]
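In M, the source step of each query references these parameters rather than hard-coded values.  Here’s a minimal sketch of what that looks like; the dbo.Sales table is a hypothetical placeholder, so substitute your own schema and table:

    let
        // ServerName and DatabaseName are the two Power Query parameters
        Source = Sql.Database(ServerName, DatabaseName),
        // navigate to a source table; "dbo" and "Sales" are hypothetical placeholders
        Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data]
    in
        Sales

Because the parameters are resolved when the query runs, changing their values in the service changes the connection without editing the report itself.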

Once the report is deployed to a workspace in the service, the gateway must be configured with a data source for every possible server name and database combination.  In my example, I can connect to my local server using the NetBIOS name, the IP address or LocalHost.  All of these are acceptable methods, but a data source must be added to the gateway configuration for each one so that the connection strings match exactly.  Remember that the connection is made from the on-prem gateway to the database server, so names like LocalHost or an internal IP address will work just fine.  In my example, I’m using the IP address of the loopback adaptor on the local machine to connect to a local instance of SQL Server over TCP.

[Screenshot from video: Power BI Gateway with Parameters, 0:04:31]

Next, bind the dataset to the gateway in the workspace.  Click the ellipsis next to the dataset and choose Settings.

[Screenshot from video: Power BI Gateway with Parameters, 0:02:44]

To bind the gateway to the dataset, select the radio button next to the gateway; this flips the switch titled “Use a data gateway” to On.  Apply the setting, and then you can refresh the data or schedule a refresh.

[Screenshot from video: Power BI Gateway with Parameters, 0:05:47]

[Screenshot from video: Power BI Gateway with Parameters, 0:06:09]

Finally, the parameters can be updated right here in the dataset settings.

[Screenshot from video: Power BI Gateway with Parameters, 0:06:46]


Video Demonstration

[Embedded video: Power BI Gateway with Parameters walkthrough]

SQL, M or DAX? – Part 2

Related posts in this series:
SQL, M or DAX: When Does it Matter?
SQL, M or DAX? – Part 2
SQL or M? – SSAS Partitions Using SQL Server Table-Valued Functions (UDFs)
SQL or M? – SSAS Partitions in Power Query/M

This is a post about a post about a post.  Thanks to those of you who are entering comments in the original May 12 post titled SQL, M or DAX?  This is a popular topic.  And thanks to Adam Saxton for mentioning this post in his Guy in a Cube Weekly Roundup.

This is a HUUUUGE topic, and I can tell that I’ve struck a chord with many BI practitioners by bringing it up.  Please post your comments and share your ideas.  I’m particularly interested in hearing your challenging questions and your thoughts about the pros and cons of some less-obvious choices about whether to implement transformations and calculations in SQL, M or DAX.

This week, I have had engaging conversations on this topic while working on a Power BI consulting project for a large municipal court system.  As a consultant, I’ve had three weeks of experience with their data and business environment.  The internal staff have spent decades negotiating the intricacies of layers upon layers of business process, so of course I want to learn from their experience, but I also want to cautiously pursue opportunities to think outside the box.  That’s why they hired me.

Tell me if this situation resonates with you…  Working with a SQL Server database developer who is really good with T-SQL but fairly new to Power BI and tabular modeling, we’re building a data model and reports sourced from a line-of-business application’s SQL Server database.  They’ve been writing reports using some pretty complicated SQL queries embedded in SSRS paginated reports.  Every time a user wants a new report, a request is sent to the IT group.  A developer picks up the request and writes a gnarly T-SQL query with pre-calculated columns and business rules.  Complex reports might take days or weeks of development time.

I needed to update a dimension table in the data model and add a calculated column to differentiate case types.  It turns out that it wasn’t a simple addition, and his response was “I’ll just send you the SQL for that…you can just paste it”.  The dilemma here is that all the complicated business rules had already been resolved using layers of T-SQL common table expressions (CTEs), nested subqueries and CASE statements.  It was very well-written SQL, and it would take considerable effort to re-engineer the logic into a dimensional tabular model to support general-use reporting.  After beginning to nod off while reading through the layers of SQL script, my initial reaction was to just paste the code and be done with it.  After all, someone had already solved this problem, right?

The trade-off of using the existing T-SQL code is that the calculations and business rules are applied at a fixed level of granularity and within a certain business context.  The query would need to be rewritten to answer different business questions.  If we take the “black box” approach and paste the working, tested SQL script into the Power Query table definition, chances are that we won’t be able to explain the query logic in a few months, after we’ve moved on and forgotten this business problem.  If you are trying to create a general-purpose data model to answer yet-to-be-defined questions, then you need design patterns that allow developers and users to navigate the model at different levels of grain, across different dimension tables, and in different filtering contexts.  This isn’t always the right answer, but in this case I am recommending that we do as little data merging, joining and manipulation as possible in the underlying source queries.  That said, the mappings between source tables and data model tables are not one-to-one.  In some cases, two or three source tables are combined using SQL joins into a flattened and simplified lookup table, containing only the necessary, friendly-named columns and keys, and none of the unnecessary clutter like CreatedDateTime, ModifiedDateTime and CreatedByUser columns.  Use custom columns in Power Query/M for row-level calculated values, and use DAX measures to perform calculations in aggregate and within filter/slicing/grouping context, as in the sketch below.
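To make that division of labor concrete, here’s a minimal Power Query/M sketch of a row-level custom column.  The server and database names, the Case table, the CaseType and CaseSubType columns, and the categorization rule are all hypothetical stand-ins for the real business logic:

    let
        // "MyServer" and "CourtDB" are hypothetical connection values
        Source = Sql.Database("MyServer", "CourtDB"),
        // hypothetical flattened lookup table exposing only the needed columns
        Cases = Source{[Schema = "dbo", Item = "Case"]}[Data],
        // row-level business rule implemented as a custom column in M
        AddedCategory = Table.AddColumn(
            Cases,
            "Case Category",
            each if [CaseType] = "Criminal"
                then "Criminal - " & [CaseSubType]
                else [CaseType],
            type text
        )
    in
        AddedCategory

Aggregations over that column, such as counts of cases by category, would then be written as DAX measures, where they can respond to whatever filter, slicing and grouping context each report applies.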

I’d love to hear your thoughts and ideas on this topic.


Interview with Pinal Dave at PASS Summit 2017

While attending PASS Summit 2017 in Seattle, I had a chance to catch up with several friends and industry experts, and I will be sharing these interviews in a series of posts.  On the 1st of November, Pinal Dave completed his 4,200th post on SqlAuthority.com – a landmark reached by blogging once per day for eleven years.  Pinal shares this success story and humbly discusses the secrets of becoming the world-renowned authority that we all know, along with some advice about tuning SQL Server.

Thank you, Pinal, for your time and all you do for the PASS community.


Can’t We Just Get Along? Making SSRS, Power BI and Excel Play Well Together

Please join me and other 2017 PASS Summit speakers for 24 Hours of PASS: Summit Preview on July 19th and 20th.  24HOP is a series of free 60-minute online sessions presented back-to-back for 24 hours, delivered by the same professionals who will present preconference and main conference sessions at the Summit during the first week of November this year.  These webinars are normally attended by thousands of people and provide a sneak peek at some of the best practices, expert tips and demos you’ll find at this year’s PASS Summit in Seattle.  Topics covered in this edition include Performance Tuning, SQL Server 2017, Linux, DevOps, Azure, PowerShell, SSRS, Power BI and much more.  Browse all sessions.

Modern Reporting with SQL Server 2016 and 2017 Reporting Services: Full-Day Pre-Conference Session

I am thrilled to be presenting a full-day preconference session before the PASS Summit, on October 31st.  Please join me for a deep tour of the new capabilities and BI platform integrations in SQL Server Reporting Services 2016 and 2017.  The session will also cover the essential skills and tasks needed to set up and configure the report server and web portal, along with report design and modern report solution planning.