Facebook Live Pop-up Session Recording

A big THANK YOU to everyone who attended the Facebook Live Pop-up session today. This was a fun event and I enjoyed taking and answering your questions. A recording of the live session is available right here:

We’re not quite sure why the video jumped around a bit but it didn’t seem to be too much of a distraction. We tested everything and had no issues until the event (of course!). I recently upgraded my older LifeCam 1080p camera to the LifeCam Studio HD camera – so maybe blasting more bits through the service caused some unrest. That exception aside, I’d love to have your feedback about the format and the whole live Q&A concept.

Another concept I’m kicking around is to provide a forum for you and others to request guided training content based on your questions.  It would be sort of a Q&A forum that would drive the way we build online training lessons.  What do you think?
Please post your comments below.

PASS Facebook Live Pop-up Expert Series

There are some great learning opportunities available from PASS and I am excited to participate in two online events this month!

Please join me on April 24 for a live chat about all things BI, reporting and data analytics.  Ask me anything you want about these or related topics and I’ll answer your questions, talk about my experience or find out what the community has to say.  The session is on Tuesday, April 24th at 6PM UTC (that’s 11 AM here in Pacific Time).  Follow the image link to put it on your calendar.  You can use the comments on the Facebook post or send an email if you’d like to queue up your questions ahead of time.

Here are some topics to get you started:

  • Is self-service reporting and data modeling really sustainable?
  • New features are released monthly.  How do we keep IT and users up to speed?
  • Where can we find best practice guidance for our solutions?
  • What’s the best tool to use for a certain style of reporting solution?
  • Differences between Power BI in the service and on-premises
  • What is the future for SSRS and Power BI Report Server?
  • How do I license Power BI, Report Server and my users?
  • Can we expose reports externally?
  • What is the migration path from Power BI tabular data models to on-premises and Azure AS models?
  • What’s up with mobile reporting?
  • How do I get started with Power Query & M?
  • What’s the best way to learn and get support with DAX and calculations?
  • How do Excel, SSRS, Power BI and SSAS work together (or do they?)
  • What’s unique about your scenario and business rules?  How do we best proceed and meet those requirements?
  • What’s up with reports in SharePoint, external-facing application, embedding reports and self-service reporting?

There have already been some great sessions from Kendra Little and Bob Ward – which I have thoroughly enjoyed watching.  I’ve always loved Kendra’s presentation style and positive energy when she speaks.  Bob is a tried-and-true SQL Server expert with many years of experience on the SQL Server product engineering team.

Join me live, learn some good stuff and we’ll have some fun!

24 Hours of PASS

Every year, community speakers present the 24 Hours of PASS (24HOP), which this year will be on April 25th.

24HOP Call for Speakers: Cross-Platform SQL Server Management

Every hour, a different presenter will deliver a 60-minute session on a specialized topic, from midnight to midnight UTC.  My talk, the Nine Realms of Power BI, will cover the many different ways Power BI may be used along with other technologies to deliver Business Intelligence, reporting and analytic solutions.

My session is at 4PM Pacific Time on Wednesday, April 25th.  That’s 11PM UTC for you night owls in western Europe.  The rest of you can do the TZ math for your time zone.

How to add KPI indicators to a Table in Power BI

Yesterday a friend asked for a little help getting started with Power BI.  He’s a DBA and system administrator and wanted to cut his teeth on Power BI with a really simple dashboard-style scorecard report.  Using a list of database servers with license expiration dates, he thought it would be a simple matter to calculate and show the expiration status for each server using a simple traffic light indicator.  The envisioned server list might look something like this:


Makes perfect sense, right?  This is a basic use case and a good application for simple KPIs; with the one minor caveat that POWER BI DOESN’T SUPPORT THIS!

This topic has become a bit of a soapbox topic for me because it’s a capability that, in my opinion, is a very obvious gap in the Power BI feature set.  After unleashing my rant, I’ll demonstrate a solution in this post.


The most interesting thing about this missing feature is that for many years it has existed in the products that evolved into the current Power BI product.  Key Performance Indicators (KPIs) are defined as scriptable objects in SQL Server Analysis Services (SSAS) with tremendous flexibility.  KPIs are simple…  the STATE element of a KPI (often considered “Bad”, “OK”, or “Good” status) is translated into a visual indicator, usually an icon (commonly “Red”, “Yellow” or “Green”, respectively).  There are variations on this theme but it’s a very simple concept and a good solution has existed for many years.

In SSAS Tabular, the State logic was dumbed down to a slider control that eliminated some of the flexibility we had in the earlier multidimensional project designer, but it still works.  The slider UX expects that the state applies when a value is equal to or greater than the threshold for yellow and green, and less than the threshold value for red.  Queries returned from SSAS include metadata that tells Excel, Power BI visuals or a variety of other client tools: “The KPI state is 1 (meaning ‘good’), so display a square green icon for this item”.  If you have the luxury of building your data model in Analysis Services using the SQL Server Data Tools (SSDT) designer for tabular models – or in Power Pivot for Excel – you would define a KPI using this dialog:


The actual return value for a KPI designed this way is really just “-1”, “0” or “1”, which typically represent “Bad”, “OK” and “Good” states, respectively.  As I said, you have other options like switching the red/green position or using 5 states rather than 3.  The multidimensional KPI designer even gives you more flexibility by allowing you to write a formula to return the VALUE, STATE and TREND element values for a KPI separately.  It would be wonderful to have the same capability in Power BI.  It would be marvelous if we could use the slider UI like this, with an Advanced button to override the default logic and define more complex rules in DAX!  The SSAS architecture already supports this capability, so it just needs to be added to the UI.
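To make the slider’s threshold logic concrete, it can be approximated today with a plain DAX measure.  This is just a sketch; the base measure and threshold measure names are hypothetical placeholders:

```
KPI Status =
VAR BaseValue = [Sales Amount]    -- hypothetical base measure
RETURN
    SWITCH (
        TRUE (),
        BaseValue >= [Green Threshold], 1,     -- at or above green: "Good"
        BaseValue >= [Yellow Threshold], 0,    -- at or above yellow: "OK"
        -1                                     -- below yellow: "Bad"
    )
```

The -1/0/1 result mirrors the state values an SSAS KPI would return, but without the designer support, the metadata, or the automatic icon rendering.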

If you design your data model using SSAS multidimensional or tabular, or using Power Pivot for Excel (which was the first iteration of Power BI) KPIs are just magically rendered in native Power BI visuals like a Table or Matrix.  But alas, Power BI Desktop does not have this well-established feature that could easily be ported from Power Pivot or the SSAS Tabular model designer.


…back to my friend’s simple scorecard report.

Using out-of-the-box features, the best we could do was this…
Create a calculated column in the table that returns -1 when the expiration date has passed, 0 if it is today and 1 if the expiration date is in the future.  Here’s the DAX script for the column definition:

Expiration Status Val =
IF ( [EndofLifeDate] < TODAY (), -1,
    IF ( [EndofLifeDate] > TODAY (), 1, 0 )
)

Next, add some fields and the new column to a table visual and use the Conditional Formatting setting in the table properties to set rules for the Back Color property of the calculated column, like this:


Here’s the table with the conditionally-formatted column:


Why Not Use the KPI Visuals?

The standard KPI visual in Power BI is designed to visualize only one value rather than one for each row in a table.  If KPIs are defined in a Power Pivot or SSAS cube or model, a Power BI Table will simply visualize them, just as an Excel Pivot Table does; but the Power BI model designer doesn’t yet offer the ability to create KPI objects.

Several community developers have tried to fill the feature gap with custom visuals but every one of them seems to address different and specific use cases, such as time-series trending or comparing multiple measure values.  I have yet to use one of the available KPI visuals that just simply allows you to visualize the KPI status for each row in a table, without having to customize or shape the data in unique and complicated ways.

How to Design Status KPIs With Indicators

Here’s the fun part:  Using the Expiration Status column values (-1, 0 or 1), we can dynamically switch out the image information in another calculated column.  Power BI has no provision for embedding images into a report in a way that they can be used dynamically.  You can add an image, like a logo, to a report page and you can reference image files using a URL, but you cannot embed them into a table or use conditional expressions.

Using this trick, you can conditionally associate images with each row of a table.  This is a technique I learned from Jason Thomas, whose blog link is below.  Using a Base64 encoder, I encoded three state KPI indicator images as text which I then copied and pasted into the following calculated column formula DAX script:

Expired = SWITCH( [Expiration Status],
    -1, "data:image/png;base64,<Base64 string for the red indicator image>",
     0, "data:image/png;base64,<Base64 string for the yellow indicator image>",
     1, "data:image/png;base64,<Base64 string for the green indicator image>"
)

The encoded binary strings correspond to these three images, in this order:


To reuse this, you should be able to simply copy and paste this code from here into a new calculated column.  You no longer need the image files because that binary content is now stored in the table column.  It really doesn’t matter what labels you use for the status key values as long as they correspond to the keys used in the preceding code.  I’m using the conventional -1, 0 and 1 because that’s the way SSAS KPIs work.

On the Modeling ribbon, set the Data Category for the new column to “Image URL”:


That’s it!  Just add any of these columns to a Table visual and WHAM, KPI indicators!


*Incidentally, since adopting Jason’s technique, Gerhard Brueckl came up with a method utilizing Power Query to manage and import image files that I will use in the future.  Prior to that, I used this site Jason recommended in his post.  My thought is that if a separate table stored only three rows (one for each KPI status), the status key value could be used to relate the tables.  It would be interesting to see whether using a related table reduces the PBIX file size, or if VertiPaq can effectively compress the repeating values of the image column.  May be a good topic for a later post.





Please vote up this feature request so we can get the Power BI product team to add it back to the product:

Tour of the Power BI Solution Advisor

As a follow-up to my earlier post titled “Nine Realms of Power BI and the Power BI Solution Advisor“,  I’ve recorded this 7-minute tour of the solution advisor:

At last count, the tool has been accessed about 650 times.  Thanks for visiting!

I’ll also follow-up here with another tour to step-through the “making of” the tool and a peek inside the design.

Using Power Query “M” To Encode Text As Numbers

I worked through a brain-teaser on a consulting project today that I thought I’d share in case it was useful for someone else in the community.  We needed to convert application user names into an encoded format that would preserve case sensitive comparison.  Here’s the story… A client of mine is using Power BI Desktop to munge data from several different source systems to create analytic reports.

Two-Phase BI  Projects

I’m going to step out of the frame just a moment to make a soapbox speech:  I’m a believer in two-phase Business Intelligence project design.  What that means in a few words is that we rapidly work through a quick design, building a functional pilot or proof-of-concept to produce some reports that demonstrate the capability of the solution.  This gets stakeholders and folks funding the project on-board so we can get the support necessary to schedule and budget the more formal, production-scale long-term business solution.  Part of the negotiation is that we might use self-service BI tools to bend or even break the rules of proper design the first time through.  We agree to learn what we can from this experience, salvage what we can from the first phase project and then we adhere to proper design rules, using what we learned to build a production-ready solution in Phase Two.

Our project is in Phase One and we’re cutting corners all over the place to get reports done and ready to show our stakeholders.  Today I learned that the user login names stored in one of the source systems, which we will use to uniquely identify system users, allow different users to be set up using the same combinations of letters as long as the upper and lower case don’t match.  I had to ask the business user to repeat that, and yes, I had heard it right the first time.  If there were two users named “Bob Smith” who were set up with login user names of “BOBSMITH” and “BobSmith”, that was perfectly acceptable per the rules enforced in the application.  No right-minded application developer on this planet or any other should have let that happen, but since their dink-wad software produces this data, we have to use it as it is.  In the Phase Two (production-ready) solution we will generate surrogate keys to define uniqueness, but in this version, created with Power BI Desktop, I have to figure out how to make the same user name strings, with different upper and lower-case combinations, participate in relationships and serve as table key identifiers.


Wouldn’t it be nice if I could convert each UserName string to a numeric representation of each character (which would be different for each upper or lower-case letter)?  I knew that to convert each character one at a time, I would need to bust each string apart into a list of characters.  Let’s see…  that’s probably done with a List object, but what method and where do I find the answer?

It’s Off To The Web, Batman!

Yes, I Googled it (I actually used Bing) and found several good resources, though most of the official docs online weren’t very helpful.  I have a paper copy of Ken Puls’s book where he mentions List.Splitter, which seemed promising.  I have an e-copy of Chris Webb’s book – somewhere – and I know he eats and breathes this kinda stuff.  Running low on options, I came across Reza Rad’s December 2017 blog post and found Mecca.  Reza has an extensive post about parsing and manipulating lists: pulling lists apart and putting them back together.  He helped me understand the mechanics of the List.Accumulate function, which is really powerful.  The post is here.  It didn’t entirely address my scenario, but it was educational, sent me in the right direction, and gave me a foundation to figure the rest out on my own.  HOT DANG!

So Here’s The Deal

The first step was to tear each string down into a List object.  At that point, you have a collection of characters to have your way with.

I created a calculated column and entered something like this:

=Text.ToList( [UserName] )


If you were to add this column in the query design and then scroll over to the new column, you’d see that it shows up as a List object placeholder, just waiting for you to click the magic link that navigates to the list of all the characters in the column string.


We don’t want to do this.

Beep Beep Beep…. Backing up The Bus

Removing the navigation step and looking at the column of List object placeholders…  I want to modify the M code for this step to do the following:

  1. Parse the items in the list (each character in the UserName field)
  2. For each character, convert it to a number
  3. Iterate through the list and concatenate the numbers into a new string of numerals

To enumerate over the elements of a list and put the list members back into some kind of presentable package (like a single string or a number), we can use the Accumulate method.

The Accumulator is a little machine with a crank handle on the side.  Every turn of the handle spits out one of the element values, using the current variable.  You can do whatever you want with the object in the current variable, but if you want to put it back into the machine for the next turn, you should combine it with the state variable, which represents the previous value (from when the handle was cranked the last time).

Here’s my final desired result:


In a nutshell, List.Accumulate contains two internal variables that can be used to iterate over the elements of a list (sort of like an array) and assemble a new value.

The state variable holds the temporary value that you can build on in each iteration, and the current variable represents the value of the current element.  With an example, this will be clear.
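A tiny standalone snippet (mine, not from the project) shows the two variables at work, rebuilding a string from a list of characters:

```
List.Accumulate( {"a", "b", "c"}, "", (state, current) => state & current )
// returns "abc"
```

On the first crank of the handle, state is the seed value "" and current is "a"; on the next, state is "a" and current is "b"; and so on until the list is exhausted.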

The final code takes the output from “Text.ToList” and builds a List object from the characters in the UserName field on that row.

Next, List.Accumulate iterates over each character, where my code uses Character.ToNumber on the current character to convert it to numeric form.

Adding this custom column…


…generates this M code in the query:

= Table.AddColumn(#"Reordered Columns", "Encoded UserName 1", each
    List.Accumulate(
        Text.ToList([UserName])
        , ""
        , (state, current) =>
            state & Number.ToText(Character.ToNumber(current), "000")
    )
)

Just like magic, now I have a unique numeric column, representing the distinct upper and lower-case characters in these strings, that can reliably be used as a key and join operator.
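To see why this works for case-sensitive keys, consider the “BobSmith” example from earlier.  Evaluating a zero-padded variant of the accumulate expression over the literal strings (the padding format is my assumption; the character code points are standard Unicode):

```
List.Accumulate( Text.ToList("BobSmith"), "",
    (state, current) => state & Number.ToText(Character.ToNumber(current), "000") )
// returns "066111098083109105116104"
// "BOBSMITH" returns "066079066083077073084072" -- a different key
```

Because “o” is code point 111 and “O” is 79, the two login names encode to different strings, which is exactly what we need for relationships and table keys.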

Bad Data Happens

As I said earlier, in a solution where we can manage the data governance rules, perhaps we could prevent these mixed-case user names from being created.  However, in this project, they did and we needed to use them.

Nine Realms of Power BI and the Power BI Solution Advisor

The use cases for Power BI, along with its many companion technologies, are numerous.  Many organizations are exploring the use of Power BI in enterprise-scale solutions and struggling with the myriad of options and choices.  I’ve grouped these options into nine categories that I call the “Nine Realms of Power BI”.  Along with my friends at CSG Pro – Brian, Greg & Ron, we have created a Power BI-based tool that you can use as a sort-of survey to assess your business and technical requirements and then recommend a reference solution architecture in one of these categories.  The options, components and reference architectures, capabilities, limits and cost guidelines are detailed later in this presentation.  I’ll also take you on a tour of the solution advisor tool, which I have published for public Internet users.

This is a presentation I prepared for the Redmond SQL Saturday that I will also use for some future presentations.


Let’s start by grouping requirements and solution criteria into eight categories.  In the solution advisor, you’ll choose one option from each of these. We’ll explore these categories in detail a bit later.


Why Nine Realms?  I actually came up with nine solution architectures before the “Nine Realms” theme came to me, but I found it fitting that these concepts seem to align with the Norse mythology depicted in the Thor movies from Marvel.  After doing a little reading, I found that these stories have been around for centuries and are rooted in real Viking folklore with some real substance behind them.

In short, according to tradition, the nine realms or worlds are branches of the cosmological tree, Yggdrasil.  The realms include familiar worlds depicted in the stories we know, like Asgard – the home of the gods – and Midgard – home of the humans, which is earth.

Not all the worlds in the Yggdrasil tree are necessarily “better” or “worse” than, or above or below, others but they are all different, with attributes better suited for their inhabitants.  I find this to be a relevant analogy.

Stay with me here and I’ll show you how this all relates to the various incarnations of Power BI solutions.


Asgard is the home of the gods and is a place resembling Utopia, or a perfect world where everything is meticulously architected and all questions have answers.

Likewise, in a perfect BI solution, every base is covered and the solution achieves something approaching perfection.  Delivering such a thing is a goal of many BI solutions but achieving perfection is costly and often extends the technical scope and delivery timeline of a solution.  The stresses to achieve the utopian dream of a perfect BI solution can tread practical limits of not only time and money but also of patience and sanity; stakeholder commitment, interpersonal relationships among staff and leaders, work-life balance and the overall health of team business culture.


Which of the worlds is right for you and your audience?  Which one of the worlds should you try to achieve?

I promise to get serious here soon, but please indulge me with the “Thor” theme for just a moment…

Start by understanding your capabilities and stay focused on your objectives.  Keep your enemies close… in other words, understand the forces working against your success and strategically plan to overcome them.

Every distraction that deviates from your planned solution – every new feature, every one-off promise to a stakeholder, every exception to the constrained list of in-scope deliverables – becomes your enemy.  Each of these metaphoric “friends” seems welcoming and well-intentioned until the schedule slips and the list of deliverables and challenges becomes insurmountable within your deadlines and technical capabilities.


This slide is key.  Power BI has a rich heritage of technologies that go back many years and are deeply ingrained into the desktop application and cloud service – but some of these technologies also have more capable counterparts outside the desktop product.  For example, Power BI Desktop actually uses a scaled-down instance of SQL Server Analysis Services, which implements the VertiPaq tabular in-memory analytics engine.  If you need more horsepower than the Power BI Desktop modeling component provides, you can graduate to a full-blown SSAS instance and continue to work with a very similar, but more robust, data modeling tool that will scale on-prem or to the cloud to accommodate significantly more data and richer admin controls.  Be mindful, though, that making the leap from Power BI Desktop to enterprise SQL Server tools can be a big undertaking.


How about your audience?  Who and where are they?  How do you need to secure your solution, reports and data?


Where will you host your reports and how will users access them?  …in the cloud using the Power BI service – or on-premises using Power BI Report Server?


The Nine Realms of Power BI

As promised, here are the Nine Realms of Power BI.  They are roughly categorized into three or four different groups.

The top row are all solution options that utilize the Azure cloud-based Power BI service (PowerBI.com), with the cached data model and reports deployed to the cloud service, or with reports in the cloud and data remaining on-prem.

The second row of options are exclusively on-premises with no reliance on cloud services or cloud storage.

The seventh item, “Azure SSAS – Deployed to Service”, is entirely cloud-based and requires no on-prem infrastructure at all.

The remaining two items are special use cases where reports and dashboards are embedded into and managed by a custom application; or data is fed in real time to live visuals.


Solutions are cloud/on-prem hybrid, entirely on-prem, entirely cloud-based or specialized solutions such as embedded or live-streaming.


Now back to the solution requirement categories.  Here they are in detail.  Consider this like a survey.  The solution advisor asks the questions on the right for each of the categories:


Power BI Solution Advisor

You can access the Power BI Solution Advisor by clicking the slide image.

With a little help from my friends, we have built this tool – using Power BI of course – to assess the solution requirement criteria and recommend relevant solution architectures.

Let’s take a quick look at the tool and then we will explore it in detail a little later.  The recommended architectures are detailed in the slides that follow.  (3-18 update: Video Tour of the Power BI Solution Advisor)


1. Cached Data Model, Deployed to Service

For secure report sharing, Power BI Pro licenses are required for all users without Premium capacity licensing.

Premium capacity licensing covers unlimited read-only users. Pro licenses are required for publishing & sharing.


2. SSAS Direct Connect, Deployed to Service

In many respects, this is the most versatile mode for using the Power BI platform with high volume data managed on premises. The latest version of Power BI Desktop may be used with new and preview features. With reports published to the service, key features like dashboards, natural language Q&A, mobile access, alerts and subscriptions are supported. Connecting to SSAS through the gateway enables you to manage full-scale semantic models in tabular and multidimensional, using partitions for incremental data refresh. Compared to DirectQuery, this option has better performance and unlimited DAX calculation features.

In simple terms, data is read from the on-prem data model in real time as users interact with reports; but the service is even smarter than that.  To optimize performance and reduce unnecessary network traffic, query results get cached and reused for short periods.

Caching policy: https://docs.microsoft.com/en-us/power-bi/service-q-and-a-direct-query#what-data-is-cached-and-how-is-privacy-protected


3. DirectQuery, Deployed to Service

The goal of DirectQuery is to enable as much capability as possible without caching data in a persistent data model. Rather than performing calculations on in-memory tables in a VertiPaq model, report interactions are translated into native queries for the data source to process and return aggregated results. As a result, report query performance will lag and complex calculations are limited. DAX functions that consume high data volumes are impacted the most (e.g. SUMMARIZE, CALCULATETABLE, YTD, PARALLELPERIOD, RANKX, etc.)

There will always be performance and functionality limits with this feature but it will likely continue to see investments to improve performance as much as feasibly possible.

DirectQuery is typically chosen when: 1) a Microsoft customer has not fully embraced cached-model or SSAS modeling concepts, or 2) a relational data warehouse/mart is performance-tuned to address specific query & report scenarios within acceptable limits.


4. Cached Data Model, Deployed On-Premises

Reports are deployed to an on-premises instance of SQL Server Reporting Services called “Power BI Report Server”.

SSRS catalog database requires SQL Server 2008+

Power BI Report Server licensing requirements: SQL Server Enterprise edition with Software Assurance, or Power BI Premium capacity.
Due to slower product release cycles, PBIRS features & capabilities lag behind Power BI Desktop/service by 1-4 months (PBIRS updates are about every quarter.)

Users may need two versions of Power BI Desktop installed (an older version for PBIRS and the latest version). Be cautious with version control.


5. SSAS Direct Connect, Deployed On-Premises

This option provides for a fully-scaled out enterprise solution with no dependencies on cloud services.

No model data size limit.

Role-based, row-level security (RLS) is supported in SSAS.

Enterprise scaled architecture (PBIRS & SSAS on separate machines) will require constrained delegation/Kerberos configuration unless static credentials are stored.

Scale-out architecture is supported on each tier by load-balancing multiple SSAS machines and/or load-balancing multiple PBIRS machines.

PBIRS doesn’t support Power BI service features like dashboards, natural language Q&A, alerts, mobile app access & R visuals.


6. DirectQuery, Deployed On-Premises

This option also provides for a fully-scaled out enterprise solution with no dependencies on cloud services.

No data source size limit.

Performance degradation and DAX calculation limits apply (same as DirectQuery in the service).

Scale-out architecture is supported by load-balancing multiple PBIRS machines.

PBIRS doesn’t support Power BI service features like dashboards, natural language Q&A, alerts, mobile app access & R visuals.


7. Azure SSAS Direct Connect, Deployed to Service

In most respects, this option is identical to using SSAS on-premises except no gateway is required to connect to Azure SSAS.

No on-premises hardware investment is required for this option since everything is hosted in the Azure cloud.

No SSAS product licensing costs. ASSAS costs are billed for hourly usage depending on capacity & service tier (developer: $0.13, production: $0.43 to $20.76 per hour).

Requires Azure Active Directory which can be federated to on-premises domain.

ASSAS is tabular only, the same or a slightly newer build than the latest boxed product (2017/1400), & supports older compatibility modes.

Capabilities & features are the same as using SSAS on-prem.


8. Embedded Service & Embedded Solutions

Power BI Embedded now supports all features of a solution deployed to the Power BI service.

Managed through Azure services in the Azure portal.

Capacity & usage-based costs range from $1 to $32 per hour.

Service may be paused & managed through the API.


This diagram depicts the components and interactions of an embedded solution.

Detailed information:

Power BI .NET SDK (server-side code): https://github.com/Microsoft/PowerBI-CSharp

Power BI JavaScript SDK (client-side code): https://github.com/Microsoft/PowerBI-JavaScript

Power BI REST API: https://msdn.microsoft.com/library/dn877544.aspx





9. Live Streaming Solutions

Streaming is a capability for developing custom solutions on top of the Power BI service.

The feature set is light and simple.

No separate licensing is required.

Streaming types & capabilities:

•Pushed dataset: Supports standard report visuals if “Historic data analysis” is switched on; caches data in a dynamically-created Azure SQL database.

•Streaming dataset: Does not store data… only dashboard tiles are supported. Push from REST API or as endpoint from streaming service, like Azure Stream Analytics.

•PubNub: Streaming dataset tailored to consume standard PubNub channels.



Now for a deeper-dive look at the Power BI Solution Advisor…

This project is a work-in-progress that can be used to provide direction and to explore solution options.

It is not perfect or comprehensive but can help recommend solution architectures based on chosen requirements and solution criteria.

The second page uses bookmarks to navigate through the requirement category slicers and display candidate solution architectures.

Right-click a solution architecture “tile” to drill-through to components and help links.

On the final page:

The relative complexity of the chosen solution is estimated, based on selected components.

Select any combination of components to see related help topics and links to articles & resources.


Again, I need to credit my friends at CSG Pro in the Portland area, for teaming up to build this tool.  It was an entry in a recent Power BI Hackathon.  CSG Pro hosts our monthly Power BI User Group meetings on the 4th Wednesday evening of the month in Beaverton, OR.

You can learn more about their consulting and development services at CSGPro.com

If you would like to download a copy of the presentation slide deck, it’s here: https://sqlserverbiblog.files.wordpress.com/2018/02/nine-realms-of-power-bi.pdf.  Feel free to use it as long as you keep all content intact, including my contact information and copyright info.  As always, your comments and questions are welcome.

Power BI Global Hackathon Contest Results

The results of last month’s Power BI Global Hackathon are in! The Hackathon was facilitated by our PUG here in Portland with the goal of welcoming global contenders in subsequent contests. Five teams entered the contest using publicly-available data to visualize and tell data stories using our favorite data analysis platform.  Congratulations to Xinyu Zheng and Ron Barrett for winning the challenge with their entry analyzing Yelp restaurant star ratings.  These were all great entries and you can view the contest results in the Power BI report below.


Here are the published projects that were entered in the Hackathon:

Xinyu and Ron analyzed ratings from nearly 1,200 restaurants in Pittsburgh, Phoenix and Las Vegas.  Results compare ratings and reviews by restaurant and food categories, sentiment and key phrases in the review comments.


I loved the creativity of this solution from Jeremy Black and Kirill Perian who analyzed alcohol sales statistics using infographics and bookmarks to switch out visuals on the same page.  The presentation concludes on the last page of the report with an auto-advancing variation of “100 Bottles of Beer on The Wall”.  Nice touch.


I’m admittedly a bit biased because this was my design, with a lot of help from Brian, Ron and Greg.  We used a series of tables to prompt a user for Power BI solution business requirements and recommend fitting solution architectures and components.  We pushed some practical and technical limits in our project and I’ll write a separate post about it.


This entry from Ron Ellis Gaut is a nice, clean orchestration of county health data, measuring health and comparing personal well-being and program efficacy.


The entry from Daniel Claborne emphasizes machine learning predictions performed with R Script, commonly used in data science.  He actually includes the annotated code and explains the technique and the approach using training and prediction data sets.


The Portland Power BI User Group was one of the first and continues to be one of the most active in the international community.  We meet on the 4th Wednesday evening every month in Beaverton, Oregon. Today there are many active PUGs all over the world.