Tour of the Power BI Solution Advisor

As a follow-up to my earlier post titled “Nine Realms of Power BI and the Power BI Solution Advisor”, I’ve recorded this 7-minute tour of the Solution Advisor:

I’ll also follow up here with another tour that steps through the “making of” the tool and takes a peek inside the design.


Using Power Query “M” To Encode Text As Numbers

I worked through a brain-teaser on a consulting project today that I thought I’d share in case it was useful for someone else in the community.  We needed to convert application user names into an encoded format that would preserve case-sensitive comparison.  Here’s the story… A client of mine is using Power BI Desktop to munge data from several different source systems to create analytic reports.

Two-Phase BI  Projects

I’m going to step out of the frame just a moment to make a soapbox speech:  I’m a believer in two-phase Business Intelligence project design.  What that means in a few words is that we rapidly work through a quick design, building a functional pilot or proof-of-concept to produce some reports that demonstrate the capability of the solution.  This gets stakeholders and folks funding the project on-board so we can get the support necessary to schedule and budget the more formal, production-scale long-term business solution.  Part of the negotiation is that we might use self-service BI tools to bend or even break the rules of proper design the first time through.  We agree to learn what we can from this experience, salvage what we can from the first phase project and then we adhere to proper design rules, using what we learned to build a production-ready solution in Phase Two.

Our project is in Phase One and we’re cutting corners all over the place to get reports done and ready to show our stakeholders.  Today I learned that one of the source systems, whose user login names we will use to uniquely identify system users, allows different users to be set up using the same combination of letters as long as the upper and lower case don’t match.  I had to ask the business user to repeat that, and yes, I had heard it right the first time.  If there were two users named “Bob Smith” who were set up with login user names of “BOBSMITH” and “BobSmith”, that was perfectly acceptable per the rules enforced in the application.  No right-minded application developer on this planet or any other should have let that happen, but since their dink-wad software produces this data, we have to use it as it is.  In the Phase Two (production-ready) solution we will generate surrogate keys to define uniqueness, but in this version, created with Power BI Desktop, I have to figure out how to make the same user name strings, with different upper and lower-case combinations, participate in relationships and serve as table key identifiers.


Wouldn’t it be nice if I could convert each UserName string to a numeric representation of each character (which would be different for each upper or lower-case letter)?  I knew that to convert each character one-at-a-time, I would need to bust off each string into a list of characters.  Let’s see…  that’s probably done with a List object, but what method, and where do I find the answer?

It’s Off To The Web, Batman!

Yes, I Googled it (I actually used Bing) and found several good resources, though most of the official docs online weren’t very helpful.  I have a paper copy of Ken Puls’ book where he mentions List.Splitter, which seemed promising.  I have an e-copy of Chris Webb’s book – somewhere – and I know he eats and breathes this kinda stuff.  Running low on options, I came across Reza Rad’s December 2017 blog post and found Mecca.  Reza has an extensive post about parsing and manipulating lists; pulling them apart and putting them back together.  He helped me understand the mechanics of the List.Accumulate function, which is really powerful.  The post is here.  It didn’t entirely address my scenario, but it was educational, sent me in the right direction and got me thinking about the problem a certain way, and I figured out the rest on my own.  HOT DANG!

So Here’s The Deal

The first step was to tear each string down into a List object.  At that point, you have a collection of characters to have your way with.

I added a custom column in the query and entered something like this:

=Text.ToList( [UserName] )
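To illustrate with one of the hypothetical login names from earlier (not the client’s real data), the expression simply breaks a text value into a list of single-character values:

Text.ToList( "BobSmith" )   // returns { "B", "o", "b", "S", "m", "i", "t", "h" }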


If you were to add this column in the query design and then scroll on over to the new column, you’d see that it shows up as a List object placeholder, just all waiting for you to click the magic link that navigates to the list of all the characters in the column string.


We don’t want to do this.

Beep Beep Beep…. Backing up The Bus

After removing the navigation step and looking at the column of List object placeholders, I want to modify the M code for this step to do the following:

  1. Parse the items in the list (each character in the UserName field)
  2. For each character, convert it to a number
  3. Iterate through the list and concatenate the numbers into a new string of numerals

To enumerate over the elements of a list and put the list members back into some kind of presentable package (like a single string or a number), we can use the Accumulate method.

The Accumulator is a little machine with a crank handle on the side.  Every turn of the handle spits out one of the element values, using the current variable.  You can do whatever you want with the object in the current variable, but if you want to put it back into the machine for the next turn, you should combine it with the state variable, which represents the previous value (from when the handle was cranked the last time).

Here’s my final desired result:


In a nutshell, List.Accumulate contains two internal variables that can be used to iterate over the elements of a list (sort of like an array) and assemble a new value.

The state variable holds the temporary value that you can build on in each iteration, and the current variable represents the value of the current element.  With an example, this will be clear.
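Here’s a minimal, made-up example (these values aren’t from the project): starting with a seed value, the accumulator function combines the state with each element in turn:

List.Accumulate( {1, 2, 3, 4}, 0, (state, current) => state + current )       // returns 10
List.Accumulate( {"P", "B", "I"}, "", (state, current) => state & current )   // returns "PBI"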

The final code takes the output from “Text.ToList” and builds a List object from the characters in the UserName field on that row.

Next, List.Accumulate iterates over each character, and my code uses “Character.ToNumber” on the current character to convert it to numeric form.
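Character.ToNumber returns the Unicode code point of a single character, so upper and lower-case letters produce different numbers, which is exactly what preserves the case-sensitive distinction:

Character.ToNumber( "B" )   // returns 66
Character.ToNumber( "b" )   // returns 98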

Adding this custom column…


…generates this M code in the query:

= Table.AddColumn(#"Reordered Columns", "Encoded UserName 1", each
    List.Accumulate(
        Text.ToList( [UserName] )   // the list of characters for this row
        , ""                        // seed value: an empty string
        , (state, current) => state & Number.ToText( Character.ToNumber(current), "000" )
    )
)
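To see the effect, and assuming the "000" format string above (three digits per character), the two hypothetical logins from earlier encode to different values even though they use the same letters:

BobSmith  ->  "066111098083109105116104"
BOBSMITH  ->  "066079066083077073084072"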

Just like magic, I now have a numeric column that uniquely represents the distinct upper and lower-case characters in these strings and can reliably be used as a key and join column.

Bad Data Happens

As I said earlier, in a solution where we can manage the data governance rules, perhaps we could prevent these mixed-case user names from being created.  However, in this project, they did and we needed to use them.

Nine Realms of Power BI and the Power BI Solution Advisor

The use cases for Power BI, along with its many companion technologies, are numerous.  Many organizations are exploring the use of Power BI in enterprise-scale solutions and struggling with the myriad of options and choices.  I’ve grouped these options into nine categories that I call the “Nine Realms of Power BI”.  Along with my friends at CSG Pro – Brian, Greg & Ron, we have created a Power BI-based tool that you can use as a sort-of survey to assess your business and technical requirements and then recommend a reference solution architecture in one of these categories.  The options, components and reference architectures, capabilities, limits and cost guidelines are detailed later in this presentation.  I’ll also take you on a tour of the solution advisor tool, which I have published for public Internet users.

This is a presentation I prepared for the Redmond SQL Saturday that I will also use for some future presentations.


Let’s start by grouping requirements and solution criteria into eight categories.  In the solution advisor, you’ll choose one option from each of these. We’ll explore these categories in detail a bit later.


Why Nine Realms?  I actually came up with nine solution architectures before the “Nine Realms” theme came to me, but I found it fitting that these concepts seem to align with the Norse mythology depicted in the Thor movies from Marvel Comics.  After doing a little reading, I found that these stories have been around for centuries and are rooted in Viking folklore that has some real substance behind it.

In short, according to tradition, the nine realms or worlds are branches of the cosmological tree, Yggdrasil.  The realms include familiar worlds depicted in the stories we know, like Asgard – the home of the gods – and Midgard – the home of the humans, which is earth.

Not all the worlds in the Yggdrasil tree are necessarily “better” or “worse” than, or above or below, others but they are all different, with attributes better suited for their inhabitants.  I find this to be a relevant analogy.

Stay with me here and I’ll show you how this all relates to the various incarnations of Power BI solutions.


Asgard is the home of the gods and is a place resembling Utopia, or a perfect world where everything is meticulously architected and all questions have answers.

Likewise, in a perfect BI solution, every base is covered and the solution achieves something approaching perfection.  Delivering such a thing is a goal of many BI solutions, but achieving perfection is costly and often extends the technical scope and delivery timeline of a solution.  The stress of chasing the utopian dream of a perfect BI solution can test the practical limits of not only time and money but also patience and sanity: stakeholder commitment, interpersonal relationships among staff and leaders, work-life balance and the overall health of the team’s business culture.


Which of the worlds is right for you and your audience?  Which one of the worlds should you try to achieve?

I promise to get serious here soon, but please indulge me with the “Thor” theme for just a moment…

Start by understanding your capabilities and stay focused on your objectives.  Keep your enemies close… in other words, understand the forces working against your success and strategically plan to overcome them.

Every distraction that deviates from your planned solution – every new feature, every one-off promise to a stakeholder, every exception to the constrained list of in-scope deliverables – becomes your enemy.  Each of these metaphoric “friends” seems welcoming and well-intentioned until the schedule slips and the list of deliverables and challenges becomes insurmountable within your deadlines and technical capabilities.


This slide is key.  Power BI has a rich heritage of technologies that go back many years and are deeply ingrained into the desktop application and cloud service – but some of these technologies also exist as more capable services outside of the desktop product.  For example, Power BI Desktop actually uses a scaled-down instance of SQL Server Analysis Services, which implements the Vertipaq tabular in-memory analytics engine.  If you need more horsepower than the Power BI Desktop modeling component provides, you can graduate to a full-blown SSAS instance and continue to work with a very similar, but more robust, data modeling tool that will scale on-prem or to the cloud to accommodate significantly more data and richer admin controls.  Be mindful, though, that making the leap from Power BI Desktop to enterprise SQL Server tools can be a big undertaking.


How about your audience?  Who and where are they?  How do you need to secure your solution, reports and data?


Where will you host your reports and how will users access them?  …in the cloud using the Power BI service – or on-premises using Power BI Report Server?


The Nine Realms of Power BI

As promised, here are the Nine Realms of Power BI.  They are roughly categorized into three or four different groups.

The top row contains solution options that utilize the Azure cloud-based Power BI service, either with the cached data model and reports deployed to the cloud service, or with reports in the cloud and data remaining on-prem.

The second row of options are exclusively on-premises with no reliance on cloud services or cloud storage.

The seventh item, “Azure SSAS – Deployed to Service”, is entirely cloud-based and requires no on-prem infrastructure at all.

The remaining two items are special use cases where reports and dashboards are embedded into and managed by a custom application; or data is fed in real time to live visuals.


Solutions are cloud/on-prem hybrid, entirely on-prem, entirely cloud-based or specialized solutions such as embedded or live-streaming.


Now back to the solution requirement categories.  Here they are in detail.  Consider this like a survey.  The solution advisor asks the questions on the right for each of the categories:


Power BI Solution Advisor

You can access the Power BI Solution Advisor by clicking the slide image.

With a little help from my friends, we have built this tool – using Power BI of course – to assess the solution requirement criteria and recommend relevant solution architectures.

Let’s take a quick look at the tool and then we will explore it in detail a little later.  The recommended architectures are detailed in the slides that follow.  (3-18 update: Video Tour of the Power BI Solution Advisor)


1. Cached Data Model, Deployed to Service

For secure report sharing, Power BI Pro licenses are required for all users without Premium capacity licensing.

Premium capacity licensing covers unlimited read-only users. Pro licenses are required for publishing & sharing.


2. SSAS Direct Connect, Deployed to Service

In many respects, this is the most versatile mode for using the Power BI platform with high volume data managed on premises. The latest version of Power BI Desktop may be used with new and preview features. With reports published to the service, key features like dashboards, natural language Q&A, mobile access, alerts and subscriptions are supported. Connecting to SSAS through the gateway enables you to manage full-scale semantic models in tabular and multidimensional, using partitions for incremental data refresh. Compared to DirectQuery, this option has better performance and unlimited DAX calculation features.

In simple terms, data is read from the on-prem data model in real time as users interact with reports; but the service is even smarter than that.  To optimize performance and reduce unnecessary network traffic, query results get cached and reused for short periods.

Caching policy:


3. DirectQuery, Deployed to Service

The goal of DirectQuery is to enable as much capability as possible without caching data in a persistent data model. Rather than performing calculations on in-memory tables in a Vertipaq model, report interactions are translated into native queries for the data source to process and return aggregated results. To that end, report query performance will lag and complex calculations are limited. DAX functions that consume high data volume are impacted the most (e.g. SUMMARIZE, CALCULATETABLE, YTD, PARALLELPERIOD, RANKX, etc.)

There will always be performance and functionality limits with this feature but it will likely continue to see investments to improve performance as much as feasibly possible.

DirectQuery is typically chosen when: 1) a Microsoft customer has not fully embraced cached model or SSAS modeling concepts, or 2) when a relational data warehouse/mart is performance-tuned to address specific query & report scenarios within acceptable limits.


4. Cached Data Model, Deployed On-Premises

Reports are deployed to an on-premises instance of SQL Server Reporting Services called “Power BI Report Server”.

SSRS catalog database requires SQL Server 2008+

Power BI Report Server licensing requirements: SQL Server Enterprise edition with Software Assurance, or Power BI Premium capacity.
Due to slower product release cycles, PBIRS features & capabilities lag behind Power BI Desktop/service by 1-4 months (PBIRS updates are about every quarter.)

Users could have two versions of Power BI Desktop installed (an older version for PBIRS and the latest version). Be cautious with version control.


5. SSAS Direct Connect, Deployed On-Premises

This option provides for a fully-scaled out enterprise solution with no dependencies on cloud services.

No model data size limit.

Role-based, row-level security (RLS) is supported in SSAS.

Enterprise scaled architecture (PBIRS & SSAS on separate machines) will require constrained delegation/Kerberos configuration unless static credentials are stored.

Scale-out architecture is supported on each tier by load-balancing multiple SSAS machines and/or load-balancing multiple PBIRS machines.

PBIRS doesn’t support Power BI service features like dashboards, natural language Q&A, alerts, mobile app access & R visuals.


6. DirectQuery, Deployed On-Premises

This option also provides for a fully-scaled out enterprise solution with no dependencies on cloud services.

No data source size limit.

Performance degradation and DAX calculation limits apply (same as DirectQuery in the service).

Scale-out architecture is supported by load-balancing multiple PBIRS machines.

PBIRS doesn’t support Power BI service features like dashboards, natural language Q&A, alerts, mobile app access & R visuals.


7. Azure SSAS Direct Connect, Deployed to Service

In most respects, this option is identical to using SSAS on-premises except no gateway is required to connect to Azure SSAS.

No on-premises hardware investment is required for this option since everything is hosted in the Azure cloud.

No SSAS product licensing costs. ASSAS costs are billed for hourly usage depending on capacity & service tier (developer: $0.13 per hour; production: $0.43 to $20.76 per hour).

Requires Azure Active Directory which can be federated to on-premises domain.

ASSAS is tabular only, the same or a slightly newer build than the latest boxed product (2017/1400), and supports older compatibility modes.

Capabilities & features are the same as using SSAS on-prem.


8. Embedded Service & Embedded Solutions

Power BI Embedded now supports all features of a solution deployed to the Power BI service.

Managed through Azure services in the Azure portal.

Capacity & usage-based costs range from $1 to $32 per hour.

Service may be paused & managed through the API.


This diagram depicts the components and interactions of an embedded solution.

Detailed information:

Power BI .NET SDK (server-side code):

Power BI JavaScript SDK (client-side code):



9. Live Streaming Solutions

Streaming is a capability for developing custom solutions on top of the Power BI service.

The feature set is light and simple.

No separate licensing is required.

Streaming types & capabilities:

•Pushed dataset: Supports standard report visuals if “Historic data analysis” is switched on; caches data in a dynamically-created Azure SQL database.

•Streaming dataset: Does not store data… only dashboard tiles are supported. Push from REST API or as endpoint from streaming service, like Azure Stream Analytics.

•PubNub: Streaming dataset tailored to consume standard PubNub channels.


Now for a deeper-dive look at the Power BI Solution Advisor…

This project is a work-in-progress that can be used to provide direction and to explore solution options.

It is not perfect or comprehensive but can help recommend solution architectures based on chosen requirements and solution criteria.

The second page uses bookmarks to navigate through the requirement category slicers and display candidate solution architectures.

Right-click a solution architecture “tile” to drill-through to components and help links.

On the final page:

The relative complexity of the chosen solution is estimated, based on selected components.

Select any combination of components to see related help topics and links to articles & resources.


Again, I need to credit my friends at CSG Pro in the Portland area for teaming up to build this tool.  It was an entry in a recent Power BI Hackathon.  CSG Pro hosts our Power BI User Group meetings on the 4th Wednesday evening of each month in Beaverton, OR.

You can learn more about their consulting and development services at

If you would like to download a copy of the presentation slide deck, it’s here:  Feel free to use it as long as you keep all content intact including my contact information and copyright info.  As always, your comments and questions are welcome.

Power BI Global Hackathon Contest Results

The results of last month’s Power BI Global Hackathon are in! The Hackathon was facilitated by our PUG here in Portland with the goal of welcoming global contenders in subsequent contests. Five teams entered the contest, using publicly available data to visualize and tell data stories using our favorite data analysis platform.  Congratulations to Xinyu Zheng and Ron Barrett for winning the challenge with their entry analyzing Yelp restaurant star ratings.  These were all great entries and you can view the contest results in the Power BI report below.


Here are the published projects that were entered in the Hackathon:

Xinyu and Ron analyzed ratings from nearly 1,200 restaurants in Pittsburgh, Phoenix and Las Vegas.  Results compare ratings and reviews by restaurant and food categories, sentiment and key phrases in the review comments.


I loved the creativity of this solution from Jeremy Black and Kirill Perian who analyzed alcohol sales statistics using infographics and bookmarks to switch out visuals on the same page.  The presentation concludes on the last page of the report with an auto-advancing variation of “100 Bottles of Beer on The Wall”.  Nice touch.


I’m admittedly a bit biased because this was my design, with a lot of help from Brian, Ron and Greg.  We used a series of tables to prompt a user for Power BI solution business requirements and recommend fitting solution architectures and components.  We pushed some practical and technical limits in our project and I’ll write a separate post about it.


This entry from Ron Ellis Gaut is a nice, clean orchestration of county health data, measuring health and comparing personal well-being and program efficacy.


The entry from Daniel Claborne emphasizes machine learning predictions performed with R Script, commonly used in data science.  He actually includes the annotated code and explains the technique and the approach using training and prediction data sets.


The Portland Power BI User Group was one of the first and continues to be one of the most active in the international community.  We meet on the 4th Wednesday evening every month in Beaverton, Oregon. Today there are many active PUGs all over the world.


Managing Multiple Power BI Desktop Application Versions

Question:  How many different versions of Power BI Desktop might you have installed at one time?

Answer: Three (or more)

What happens when you have different versions installed, and how can you make sure that you use the right version for a given Power BI report file?

An issue came up this week when I tried to open a Power BI Desktop file (.PBIX) from File Explorer and Power BI Desktop told me I was headed down a dark and difficult path. Well, not exactly, but it displayed the following message:

Unable to open document

The queries were authored with a newer version of Power BI Desktop and might not work with your version.

Please install the latest version to avoid errors when refreshing.


When I clicked the Close button, rather than leaving me to deal with what seemed to be a complicated and potentially damaging situation, Power BI Desktop started up and continued to tell me about the perils that lay ahead, in this message:

Report layout differences might exist

This Power BI report file may have some features that aren’t available in Power BI Desktop until the next release.

If you need to see the latest version you worked with on the web, please view the report there. We’re sorry for any inconvenience.


As an unsuspecting user, I might be confused but at least I can rest assured that the application developers at Microsoft who write these warning messages are thoughtful and apologetic.

What’s going on?

In addition to the reports I author and deploy to the Power BI cloud service, I also create reports for my on-premises Power BI Report Server.  Report Server requires an older version of Power BI Desktop, which can be installed from the menu on the report server.  This older version of Desktop (October 2017 in my case) is sandboxed by Windows so it doesn’t get upgraded by the latest Power BI Desktop installer when I update it.  In Control Panel, you can see both installations:


The problem I experienced was a result of installing the older desktop version for PBIRS after the newest version.  The file extensions (PBIX and PBIT) are already associated with whatever version of Desktop is installed and registered with Windows.  The remedy is quite simple… just reinstall the latest version of Power BI Desktop (perform a Repair if you already have that version installed).

This next part is more informational than problematic, but it actually is possible to have additional “versions” or packages of Power BI Desktop installed.  If you install Power BI Desktop from the Windows 10 Microsoft Store, you get a sandboxed installation that runs in a restricted “safe” security context.  This is a good option for users in a restricted corporate network environment who don’t have local admin access to their computer.  In most cases, they can install the application this way.  As you can see, I actually have three separate Power BI Desktop installations.


These are all 64-bit builds of the desktop application, so I could even install 32-bit builds of Power BI Desktop as well.  I would only do that for compatibility with an old 32-bit database driver or if I were running on an old 32-bit Windows machine, which is not an ideal scenario.  Keep in mind that 32-bit applications can only use a limited amount of RAM (about 3.7 GB minus some system overhead).

Redmond, Washington SQL Saturday & Precons: Feb 9th & 10th on Microsoft Campus

SQL Saturday events occur in cities all over.  These events give technology professionals and students the opportunity to learn about database technologies, business intelligence, and new and emerging data trends to improve skills and master data.  I have been privileged to attend and speak at several SQL Saturdays around the world, but the SQL Saturday in Redmond, Washington is special because it is close to home for Microsoft and people in the greater Seattle area.  SQL Saturday is always a free, sponsor-supported event with 60 to 90 minute, conference-length sessions presented by several noted industry professionals, authors and trainers.  Many of these sessions are selections of the same great learning content you would get from the same presenters at a large industry conference which might cost thousands of dollars to attend.  One of the great perks of being in Microsoft’s backyard is that several sessions are delivered by Microsoft product team leaders, with insider tips and timely information available from the people who develop SQL Server, Azure Services, Power BI and the rest of these great products.

In addition to the shorter sessions on Saturday, all-day preconference sessions on Friday give attendees the option for deeper, focused learning for a small fee to cover travel, facility and material costs.  This year, on Friday, Feb 9th, four preconference sessions are offered by traveling presenters.  Join Arnie Rowland, Ben Miller, Vern Rabe or myself for a full-day deep-dive into one of these compelling topics.  The following is from a recent announcement to the Pacific Northwest SQL User Group Members:

SQL Saturday Redmond – Feb 10, 2018

Just a reminder that SQLSaturday#696 will take place on Saturday, February 10, 2018 at Building 92, 15010 NE 35th St, Redmond, Washington, United States, 98052. SQLSaturday#696 is a free one day training event for SQL Server professionals and those interested in SQL Server. Please register for SQL Saturday Redmond 2018 at Registration.

Four All-day Preconferences – Friday, Feb 9, 2018

Also this year, we are offering Pre-Cons on Friday February 9, 2018, the day before SQLSaturday, in the same building (92): 8:00 AM to 4:30 PM. Each PreCon is well worth the modest fee. Register for a SQL Saturday Redmond PreCon by accessing its Eventbrite link below:

Building a Business Intelligence Solution with Power BI – Paul Turley

T-SQL for Performance and Accuracy – Vern Rabe

Quelling Your Queasies: Mastering Technical Presentations – Arnie Rowland

PowerShell Modules for the DBA – Ben Miller

See you there!

Please plan to join us in Redmond, on the Microsoft Campus – for the preconference on Friday, February 9th – and for SQL Saturday on February 10th, 2018.