Since the Reporting Services product team announced last year that it was their intention to provide an on-premises deployment option for Power BI reports, there has been tremendous interest. Despite their determination to provide clarity around their efforts, there has still been some speculation about how and when we’ll see this capability. I don’t want to jump on the bandwagon and add to the rumors or debate the details. The fact is that product leadership has been more than forthright about their progress and must work through the usual rigor of packaging and supporting a major feature addition when it is ready for release.
At the PASS Summit in Seattle a few weeks ago, Riccardo Muti and Chris Finlan announced that the team planned to release a new preview of the enhancements to SQL Server 2016 Reporting Services that would support Power BI reports hosted within the SSRS web portal. Today, they are delivering the goods with an Azure virtual machine that we can use to preview these features. Riccardo was very clear that the VM would be a very early preview and that the actual released product wouldn’t be announced until it was ready. The question on the minds of several community leaders is whether these features will be available in some kind of update for SQL Server 2016 or whether they will arrive in the next version of SQL Server. I checked with Riccardo Muti, Group Program Manager for SSRS, and he says “We’re releasing Power BI reports in ‘SSRS vNext’ as part of the SQL Server vNext wave, but don’t assume ‘vNext’ must mean a couple of years.“ …and, “it’ll be here in 2017“. Now we have a definitive answer.
Dan English posted about this today in an article titled On-Premise Power BI–A Microsoft Story. I encourage you to read his entire post and understand his thoughts on the matter. At the PASS Summit session, I had the impression that there would be an interim update for SQL Server 2016 but, after reading a lengthy email thread today, that doesn’t seem to be the case. My conclusion is that plans can change and there has not been an official announcement on the release plan.
The official announcement was posted today on the Reporting Services product team blog by Chris Finlan, titled Technical Preview of Power BI reports in SQL Server Reporting Services now available. Chris told me about this plan a few weeks ahead of the announcement and said that they weighed their options carefully before they decided to go down this route. The trade-off was that this decision would likely raise unanswered questions and speculation out in the community. In the end, they decided it was best to err on the side of being open and transparent with their customers.
From Chris’ post on the team blog:
As we brainstormed creative ways to let people try this functionality as early as possible, we had three very specific goals we wanted to achieve:
Provide access to the new functionality publicly as early as possible while ensuring the end-user experience was something you’d find valuable
Create a self-contained experience and environment that allowed users of any skill level an easy way to get started
In no way disrupt or delay the initial preview of a downloadable and installable version
By using the Azure Marketplace to distribute this early technical preview, we feel we have not only met those goals, but also established a repeatable way to distribute content in the future. For users who would prefer to run this technical preview on an on-premises server, you are welcome to provision a virtual machine and then download the image as a .vhd file and use Hyper-V functionality to do so.
In the Business Intelligence and reporting space, SQL Server 2016 was a big step forward. Aside from significant relational engine improvements, Analysis Services Tabular matured considerably. Components of Reporting Services were completely re-architected and several new features were added, which include modern HTML5 rendering for all browsers, the new web portal, mobile reports, KPIs, Power BI integration, native browser printing, parameter layout control and PowerPoint export. With new features, there are always loose ends to tie up – and that’s what service packs are for. I am aware of a few enhancements in SP1 to the Reporting Services web portal, and general improvements to the SSRS platform.
As of today, SQL Server 2016 Service Pack 1 is generally available for all product editions. The service pack can be downloaded from here. For the official announcement, read this post on the SQL Server Blog from Microsoft. From the announcement…
The capabilities in SQL Server 2016 SP1 which are now being made available to Standard edition and Express edition for the first time include:
Faster transaction performance from In-memory OLTP, faster query performance from In-memory ColumnStore, and the ability to combine the two for real-time Hybrid Transactional and Analytical Processing, also known as Operational Analytics;
Data warehousing or data mart performance features such as partitioning, compression, change data capture, database snapshot, and the ability to query across structured and unstructured data with a single node of PolyBase; and
The innovative security feature Always Encrypted for encryption at rest and in motion, as well as fine-grained auditing which captures more detailed audit information for your compliance reporting needs. In addition, row-level security and dynamic data masking are being made available to Express edition for the first time.
I hope you can join me at the Chicago Microsoft Technology Center for the PASS Business Analytics Day, on Wednesday, January 11, 2017.
The focus of this hands-on workshop is building business solutions with Power BI Desktop and the Power BI service. We will cover both self-service and enterprise scenarios, with data from files and on-premises data sources; focusing on core skills using Power Query “M”, modeling techniques, DAX measures and visualization design. At the end, you will have a complete solution built from real business data, shaped, cleansed and modeled; with a dashboard and interactive report visuals ready for analysis.
You will also learn about different options to deploy and integrate Power BI into your local reporting servers and secure web application through embedding and web publishing. You will see how Power BI works with R to add forecasting, predictions and advanced statistical analytics. See how custom visuals are used to extend visual analysis and how Power BI may be used for live streaming, DirectQuery to on-prem database systems and direct connect with Analysis Services to architect large-scale business data solutions.
The Chicago Microsoft Technology Center is located downtown at 200 East Randolph Street, Suite 200, Chicago, IL 60601. Arrive early. The workshop begins at 8:30 AM and runs until 4:30 PM. You can only attend one session during the Business Analytics Day event.
I’m pleased to announce that I will be presenting two more Dashboard-In-A-Day training sessions in October. These free, full day training sessions are co-sponsored by Microsoft and SolidQ. I’ve had a blast teaching Power BI in lots of other locations. What’s not to like? It’s Power BI and it’s free. That’s a pretty good combination.
If you’re in the area, please plan to take a day and get some great training. Here are the dates and registration links:
DIAD is designed to accelerate your Power BI experience with a comprehensive training program in a single day. A trained Power BI specialist partner will host you for the entire day – all you have to do is bring your Windows-based notebook and we will supply the rest – even lunch!* After registering, you will receive an email confirmation that includes a link to download the training materials you’ll need for the day. With DIAD you get practical hands-on experience with SolidQ, who specializes in the art and science of bringing data to life, in a full day of instructor-led report building. You will learn how to connect to, import and transform data from a variety of sources, build real data models, and author and publish Business Intelligence reports. You will also learn how to customize and share your creations for secure collaboration with other groups inside your organization, including mobile device sharing.
Today marks the first anniversary of Power BI Desktop and the Power BI Service. The effort to achieve this milestone and the on-going commitment of the teams responsible for this product is nothing short of phenomenal. Earlier this week, I reached out to a few friends to join me in sharing this Happy Birthday wish and thanks to the people behind this amazing, and ever-improving product. Adam Saxton assembled this video montage and is credited for the impressive production work.
The product released last year on this date was a major overhaul of the earlier tools, with some features based on mature technologies that have been improving for several years. Since then, the pace of development, new capabilities and feature enhancements has been mind-blowing. The Power BI service is enhanced weekly and the desktop tool is updated every month, while the platform remains stable and reliable.
To the Microsoft Power BI product team:
From those of us in the community – partners, trainers, consultants, MVP Program members, user group leaders, developers, integrators and service providers – we thank and congratulate you.
Here are some outtakes and clips I received after Adam finished the production:
“Professional SQL Server 2016 Reporting Services and Mobile Reports” from Wrox Press is content complete! The book is well into the editing stage – still another grueling phase of the publishing process, but there is light at the end of the proverbial tunnel. What is it like to write an 800+ page technical book? I’ve been trying to come up with the best metaphor to describe a year of persistent weekend-and-evening content development – building labs, demonstrations and exercises; performing research; and hounding my friends on the product team. A big thanks to Chris Finlan, Riccardo Muti and Robert Bruckner on the SSRS product team for their continued help and support.
I’ve admittedly been a bit quiet on the community front while I’ve been heads-down on this book project, but I hope to start redirecting more energy to community content.
Intense book writing can be a life-sucker:
This is Paul’s brain…
…this is Paul’s brain after writing a technical book:
Don’t get me wrong – I enjoy book writing. I’ve spent a lot of my career writing technical books and courseware because I enjoy the challenge. Writing helps me see things from different perspectives and at different levels of depth and difficulty for different audiences.
Give my brain a couple of weeks to heal and I’ll channel some of that energy into blogging and producing more community-facing training content.
So, what’s in the book? This is the Fifth Edition of the Wrox Professional SQL Server Reporting Services series, which I’ve been leading and co-authoring since 2003. Amazon still has a working title listed but this is normal. As a new edition, some content from the 2012 edition evolved forward, but most of the book is all-new material. Four chapters are dedicated to the new Mobile Report features introduced in SSRS 2016. Before someone asks… when will the book be out? I don’t know exactly; there’s still work to be done. Soon. How’s that?
Over the next few weeks, I’ll post excerpts from the book to give you a taste of what’s coming and provide some valuable information about SSRS 2016.
Here’s a short excerpt from Chapter 2: What’s New in SQL Server 2016 Reporting Services?
Reporting Services was released as an add-in tool for SQL Server 2000 back in early 2004. At the time, the feature set was light by comparison to the product today, but the foundational architecture hasn’t changed significantly. Essential features included basic charts, subreports, data regions and single-value textboxes.
The second release in 2005 added a self-service report authoring tool called Report Builder (later named Report Builder 1.0) paired with a semantic modelling tool in the designer. The original modelling and ad hoc report tool has since been deprecated but it inspired more capable replacement technologies like Power Pivot and the later generations of Report Builder. Not to be confused with the original Report Builder tool, Report Builder 2.0 and 3.0 produce report definition files compatible with the report project tools that are integrated with Visual Studio; originally called Business Intelligence Development Studio (or “BIDS”).
Report Builder and Designer Enhancements
The report design experience for standard “paginated” reports hasn’t changed much over the past few product versions, but there have been incremental improvements. Report Builder has been restyled to conform to Office 2016 standards, and its installation process changes to an “Evergreen” app. This means that Microsoft maintains updates for frequent download, rather than the old “ClickOnce” installation from your on-prem server. As in prior versions, users can elect to install Report Builder from the Report Portal menu. Report Builder is updated with a modern, simple and sleek look-and-feel; the changes are mostly cosmetic while the fundamental features remain the same.
The Visual Studio integrated Report Designer is now part of SQL Server Data Tools (SSDT), a downloadable add-in for Visual Studio. Although the tool set hasn’t changed significantly, there are some subtle changes to the way SSDT is installed and the way updates are delivered. First of all, confusion about the name “SSDT” is dispelled because the former “SSDT” (the previous version add-in for database projects) and “SSDT for BI” (the previous version add-in for SSIS, SSAS and SSRS projects) are now a combined package, simply called SSDT. Secondly, you can simply download and install a version of SSDT that will work with the current version of Visual Studio or a few versions back. The SSDT add-in will be updated frequently and you will have the option to install updates on-demand from within Visual Studio.
Modern Browser Rendering
One of the most significant product improvements in 2016 may be one of the least apparent under casual observation. The entire HTML rendering engine has been completely overhauled across the platform. The report portal, used to navigate and manage report content, and the actual report content are rendered to modern HTML 5 standards, which are supported by all modern web browsers. The shift to modern HTML output means that web content produced by Reporting Services is consistently consumable on any device, regardless of the operating system or web browser, so long as it supports modern standards. The benefits are readily apparent when reports simply work on smartphones of any type, on tablets and laptop and desktop machines; regardless of the brand or operating system.
Parameter Layout Control
You will have improved control over parameter formatting and placement. Since the inception of Reporting Services about twelve years ago, parameters have always been arbitrarily arranged in a narrow bar at the top of the browser window, from left-to-right, and then top-down. The report designer has a grid to manage the placement of parameters, in the parameter bar, in any configuration, within definable rows and columns.
The addition of mobile dashboards to the SSRS platform is based on the Datazen product acquisition from ComponentArt in 2015. Mobile reports are primarily designed to enable data interactivity in dashboard-style reports created by a mobile report developer. Managing this expectation is important because this tool is significantly different than conventional Reporting Services.
Mobile reports can be viewed in the browser but are optimized for phone and tablet devices through native, installed applications running on all the major mobile OS platforms. They are not a replacement for high-fidelity paginated reports created with Reporting Services, or self-service analytics in Power BI; they serve an entirely different purpose.
At first, the mobile dashboard experience may seem to be a simple drop-in of the Datazen product; but, it is apparent that some integration with the SSRS architecture has already taken place, and more adaptations are likely on the horizon. The first notable difference is the Datazen server is entirely replaced by the SQL Server report server, and queries are now managed as SSRS shared datasets.
New Key Performance Indicators (KPIs) integrated with the new Report Portal are also based on the Datazen product acquisition. These KPI visuals are created and managed entirely within the portal. In addition to the standard traffic-light style comparison of actual vs. target values, KPIs can include a trend line or segment chart. The KPI is driven by data from one or more shared datasets that were created in the SSDT report designer. For ease and simplicity, any value of the KPI can be entered manually through the design page.
Although they are visualized in the Report Portal, KPIs are delivered to mobile devices through the Power BI mobile applications on every popular device platform.
Native Printing Control
The previous printing capability in SSRS relied on an ActiveX control, which was only supported on Windows desktops and in certain web browsers. Even in tightly controlled Windows server environments, system administrators would rescind ActiveX support and disallow report printing from the server. The modern printing solution uses the PDF renderer to produce printable output and then the Adobe document viewer to perform the actual printing.
Users have had the option to export and render report content to Excel for several SSRS versions. Output to Word was added in SQL Server 2008 and then both of these rendering options were improved and updated in the 2008 R2 version. Now a third Office application format will be supported with the introduction of PowerPoint document rendering.
Most report items and data regions are converted to individual image objects in the resulting PowerPoint slides, with one slide generated per report page. Additional slides are created based on the report content size and layout. Textboxes are created for titles and report text, which support some report actions and textbox properties.
Integrated and Improved Report Portal
A new Report Portal web interface is introduced to replace the Report Manager. Like Report Manager, the portal is an ASP.NET web application used to access, run and manage reports in the web browser. The new portal has a look-and-feel we are accustomed to seeing in other modern apps from Microsoft these days, with responsive design for consistency on different device form factors. Report Portal will be the home for mobile reports, KPIs and paginated reports – the new name for RDL reports authored with Reporting Services. In the future, we may see support for additional content types.
Report Portal is supported in all modern web browsers, emitting responsive HTML5 with adaptations for mobile devices and screen orientations.
New Charts and Visual Enhancements
With the addition of two new chart types, visualization capabilities inch forward in Reporting Services. The new Sunburst and Treemap charts visualize multi-level field groups using both color and visual boundaries.
New and updated report visuals are likely to be an area of focus for future Reporting Services enhancements, given the success of self-service BI tools like Power BI. The design interface is identical to existing chart types and the only real difference is that groups of rows are visualized in these unique formats. The Sunburst chart is also capable of consuming unbalanced hierarchies with slices generated for different levels only where data points exist.
Standardized, Modern Browser Rendering
At first, you may not notice significant changes from previous versions, but the HTML renderer has been completely overhauled and updated. Reports are now rendered to HTML 5 standards, which means they should consistently maintain the same appearance and behavior in all modern browsers that support the HTML 5 standard, such as Microsoft Edge, IE 11, and newer versions of Google Chrome, Safari and Firefox. This change is a welcome improvement which should clear up many problems with inconsistent and quirky report layouts across different web browsers and devices. By the same token, the change means there is no specific backward compatibility for outdated browsers; consequently, reports that may have worked (or partially worked) in an old version of Internet Explorer may no longer work until the user upgrades.
Power BI Dashboard Pinning
For organizations who have invested in the Power BI cloud service, the Power BI integration feature allows users to pin graphical SSRS report visuals to their online dashboards. In order to use this feature, an administrator must register the report server with an existing Power BI subscription, and a report user must have access to the Power BI subscription.
Reporting Services Configuration Manager includes a new page to manage Power BI Integration. This is where you register the report server instance with the Power BI subscription. When a report with “pinnable” items (such as images, charts and gauges) is viewed in Report Portal, the Power BI icon is displayed on the toolbar.
“Pinnable” items are highlighted in Report Portal. When a visual is selected, you are prompted to select a Power BI dashboard and the refresh frequency. This schedules an Agent job on the report server to push updated visuals to the dashboard at the selected frequency.
The Select frequency of updates option utilizes the SSRS subscription architecture by scheduling a SQL Server Agent job on the database server with the report server catalog. The Agent job re-queries the report data and then a report server component refreshes the Power BI dashboard tile with an updated report visual.
Pinned report visuals appear on the dashboard alongside Power BI report and Excel visuals. Clicking one of these visuals drills through to the report back on the on-premises report server. This gives users a seamless navigation experience between cloud-hosted Power BI content and selected report visual elements on your own report server.
The integrated Power BI experience is a big step forward in providing a completely integrated IT-hosted and self-service reporting, BI and analytics solution.
I just finished posting a 4-part series demonstrating how to use Power BI with on-premises data. The running time for all four parts is about 44 minutes. I cover using the Power BI Enterprise Gateway to connect published Power BI reports to on-premises data sources:
SQL Server database connected with DirectQuery
Large SQL Server database with clustered column store index connected with DirectQuery
SSAS multidimensional/cube on-prem with direct connect
SSAS Tabular with large data volume & multiple partitions on-prem with direct connect
The four-part series is on my YouTube channel using the following links:
Adam Saxton also gives a nice walkthrough of using SQL Server DirectQuery with Power BI that may provide an additional perspective and information about securing and sharing the solution.
I’d also like to give a plug for a terrific Power BI architecture diagram poster created by Dustin Ryan. I love the simplicity of this comprehensive graphic. You can download the PDF file from his blog.
I’m jumping on a plane bound for San Jose, California today headed to the PASS Business Analytics Conference (BAC). If you plan to attend my hands-on workshop, Tuesday at 2:00, the following information will help you prepare:
Bring your laptop running Windows 7, 8 or 10. Power BI will work, at minimum, on Windows 7 32 bit with 4 GB of RAM. A 64-bit OS with more RAM is better but not required. When you install Power BI Desktop on Windows 7, you may be instructed to install the .NET Framework 4.5. Please do this ahead of time.
Install Power BI Desktop. Just go to PowerBi.com and install it from the Products menu. You don’t need an account to install Power BI Desktop.
Charge your laptop battery. We will be meeting in a standard session room in the afternoon without additional power. Plan to spend the two hours running on battery power (and don’t use up your battery charge earlier in the day). I don’t have control over the schedule or meeting room facilities so please plan ahead. We also won’t have control of the wireless Internet connectivity, which I hope to use to provide files and exercise instructions. Just in case of connection issues, I will provide files on USB drives.
If you do not yet have a subscription setup at PowerBI.com, that’s OK. In this shorter version of the workshop, we will spend most of the time working offline. If you do have a subscription, you can publish and work with the online features but this is not a requirement.
This is a reminder to complete the annual Business Intelligence product survey conducted by BARC Research. I’ve been participating in the survey for several years (as far as I know, as long as it has been running). As a participant, I receive a detailed copy of the results, which are informative and insightful.
The survey takes 10-15 minutes and you can get started here.
Last year the BI Survey evaluated 35 products, with 3,267 survey responders. BARC conducts a credible survey, not a spam trolling exercise. To see the infographic with a summary of last year’s results, click the image below:
History clearly confirms that good things come in threes. As far as I know this is the first time that only the three most widely recognized Business Intelligence platforms have been placed in the top-right leaders/visionaries space of the Magic Quadrant for Business Intelligence and Analytics Platforms, as of February 2016.
Microsoft is rated by Gartner as having the most complete vision, while Tableau and Qlik rate slightly higher in their ability to execute. This makes perfect sense, as “Microsoft BI” is not one product or tool. It has many components and offers many choices, but is certainly comprehensive and scalable.
Power BI is delivering “the last mile” of data visualization beautifully on top of the robust foundation of SQL Server, Analysis Services, Integration Services, HDInsight, Stream Analytics, machine learning and all the other Azure services. It’s also a great single-product solution for simple projects with room to grow as needed.
Oh, here are the results…
The quadrant chart also shows that there is significant opportunity for all of these vendors to improve their execution and implementation strategy. BI is not easy to do. With some tools, it may be easier to get started or to build something simple to demonstrate functionality, but data is inherently complex and every business must approach analytics a little differently. There is a lot of wisdom in this simple portrayal. (I just got the link to the entire Gartner report and added it below.) A notable observation is that there are no “challengers,” which emphasizes a significant margin between the leaders and the rest of the field.
Power BI reports can now be embedded into a public-facing website! Today, Faisal Mohamood, Group Product Manager on the Power BI team, announced that Power BI reports can now be published to the web for all the world to see – woo hoo! I’ve worked with the feature and it’s quite easy to implement. Details are here. To exhibit this new capability, Microsoft is hosting a Best Report competition, which I entered with the SolidQ Airline Performance Tracker.
Please join me on February 3rd at 10:00 Pacific Time for this presentation, brought to you by Microsoft and SolidQ.
If you’re still on SQL Server 2005, there are some important things to be aware of. First and foremost is that you are missing out on innumerable capabilities in more recent versions of the SQL Server platform, in the relational engine and the BI stack. Secondly, Microsoft will no longer support SQL Server 2005 after mid-April of this year. Why is this any more significant than previous versions? There has never been such a large body of customers on an outdated database platform nearing the end of its useful life. This is not a marketing ploy to upsell customers but simply the normal retirement cycle of a ten-year-old, outdated product. Yet, for some organizations managing critical systems, it’s a disaster waiting to happen.
Considering performance and security alone, the differences between SQL Server 2005 and recent versions are dramatic. We frequently see cases where performance on a typical workload improves by multiples of 2, 10 and more, making the ROI math pretty simple. Improvements to the core BI feature set (Reporting Services, Analysis Services and Integration Services) are also vast, with improved reliability, ease of design and usability. These are some of the topics I’ll discuss in this free presentation.
An upgrade of your SQL Server Business Intelligence solution brings with it the challenge of understanding how to apply new features to your BI workload, but it also opens up tremendous opportunity for business impact. In this session we’ll talk about business benefits to upgrading your BI solution, costs to upgrade now vs. the costs of waiting, and current trends in corporate upgrade strategies. For those impacted by the end of extended support for SQL Server 2005 on 04/12/2016, we will discuss enhancements to the SQL Server platform and examples of how to take advantage of the latest features. We’ll touch on workload-specific upgrade benefits to modernizing your SSAS, SSRS, and SSIS environments, then wrap up with how upgrade can improve users’ reporting experience using the latest in mobile BI.
Self-service data modeling tools like Power BI Desktop, Power Pivot and SSAS Tabular promote the advantage of using data from multiple data sources. I’m including SSAS Tabular on the “self-service” list because it is derived from Power Pivot and often used for “quicker” IT-managed BI projects. The alleged advantage of sourcing data directly from within the modeling tool might seem to suggest that a formal ETL process and dimensional modeling exercise is unnecessary. The more experience I gain with these newer tools in business settings has me ever thinking about using best practices and the right approach for scaling BI solutions of all sizes.
The bottom line, before getting into the weeds on this topic, is that there are cases where I think it makes perfect sense to skip the usual steps of staging, transforming and modeling data into a relational data mart. However, by taking this approach, you are also removing the safety net and limiting options to scale and expand the solution. I love tools like Power Query and Power BI Desktop for developing simple BI solutions quickly. That may be a related topic for another time, but always consider options to incorporate enterprise scale with modern analytics. Solutions always grow.
The Self-Service Modeling Paradigm
More and more of the projects I inherit from consulting customers and previous developers resemble the following diagram:
In less-formal solutions, the “ODS” layer in this diagram may be little more than a small database on an existing server with lookup tables populated by a weekly scheduled SSIS package or stored procedure. A lot of these projects are either departmental or the result of a self-service project handed-off to the IT organization to be “right-sized” and transitioned from the desktop to a server-hosted solution.
Power Pivot, Power BI Desktop or SSAS Tabular are often used to mash up data from multiple data sources. Some of the data may be staged and transformed to simplify a complicated query or to work around security restrictions in the original data source. The data feeding multiple tables in the model arrives in separate streams, where the keys then need to be manipulated in order to match up fact table keys to identifying keys in the lookup/dimension tables.
What’s the Problem With This Approach?
In some solutions with a manageable scale and a reasonable tolerance for a certain amount of data loss and inconsistency, this approach may be just fine. There are very good reasons for inconsistencies between sets of data that come from different sources. When 20 million fact rows are obtained from an online ordering system and 0.1% don’t have matching customer records in the system used to manage the master customer list, it may simply be a question of timing. Or maybe the definition of “customer” is slightly different in the two systems. Chances are that there are also legitimate data quality issues affecting a small percentage of these records.
Whatever the case may be, a data conformity or potential data quality issue in this design scenario falls on the model designer to either fix or ignore. Again, this may or may not be a concern, but it is a decision point and might raise the question of ownership when questions about data quality arise.
Before I continue, let’s revisit how this common issue is handled in more conventional Business Intelligence solutions…
Managing Data Conformity in Traditional BI
Here’s a quick overview of the conventional Business Intelligence approach. For the past twenty years or so, we’ve promoted strict dimensional modeling principles that can be summarized in the following diagram. This diagram is my interpretation of The Kimball Method, applied to my experience and the tools available from Microsoft. I learned to orchestrate BI solutions using this approach, however tedious, time-consuming and expensive it may be. This is my frame of reference for any BI project. Many variables can influence adaptations from this model – such as product choices, big data sources, real-time analysis, data volume, budget, or the stakeholder’s realm of influence or ownership – but this is my starting point for a BI project done “right”.
If we take this general approach and scale it down to a departmental or business-unit level, where no enterprise data warehouse exists to satisfy analytic reporting requirements, we commonly end up with a solution that looks more like the following diagram. Data is still procured from multiple sources through extract and transform processes, loaded into a staging database and/or directly into a dimensional data mart. Even when an Analysis Services Tabular project is used to build the semantic model, the model has only one data source, with all the fact and dimension tables populated from the relational data mart. The dimensionalized data is then surfaced through a semantic model and presented through ad-hoc analysis, analytic and operational reports, and executive dashboards.
Different Solution Architecture – Same Problem
The most significant difference between the previous two diagrams and the one I started with is that the solution developer has control over the data before it enters the semantic model. Data conformity and quality can be managed in different places which include:
At the source
In the ETL process after source data is staged, and before it enters the data warehouse/data mart
Within the semantic model
In theory, fixing “bad” or non-conforming data at the source seems to be ideal for everyone. In a utopian setting, the business can incorporate a master data management solution to let data stewards correct erroneous or unmatched records and then feed them back into source systems. Operational reports produced by business applications should match analytical reports and dashboards from the data warehouse/data marts and the models they feed. Please don’t misunderstand – many businesses incorporate MDM solutions and they work well. The reality in most scenarios is that this is a difficult undertaking. It requires business process changes and an enormous level of cooperation and coordination across business units, leadership and interdependent teams.
Weeding out bad records and making corrections as data flows downstream to the data warehouse is often a practical trade-off with many advantages. Reports produced from the data warehouse or data marts should match up with analytic reports and dashboards fed by semantic models that use the same congruent data.
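The downstream mapping step can be sketched in Power Query (“M”). This is a minimal illustration of the Unknown-member pattern, assuming existing FactSales and DimCustomer queries and a reserved surrogate key of -1 for the Unknown member; none of these names come from the post:

```
// Sketch: remap fact keys that have no matching dimension row to the
// reserved Unknown key (-1). FactSales, DimCustomer and CustomerKey are
// assumed, illustrative names.
let
    // Left-outer join keeps every fact row; unmatched rows get an empty nested table.
    Joined = Table.NestedJoin(FactSales, {"CustomerKey"}, DimCustomer, {"CustomerKey"}, "Match", JoinKind.LeftOuter),
    // Replace the key with -1 wherever the join found no dimension row.
    Remapped = Table.ReplaceValue(Joined,
        each [CustomerKey],
        each if Table.IsEmpty([Match]) then -1 else [CustomerKey],
        Replacer.ReplaceValue, {"CustomerKey"}),
    // Drop the helper join column before loading the table into the model.
    Cleaned = Table.RemoveColumns(Remapped, {"Match"})
in
    Cleaned
```

In a full ETL process the same mapping is typically done in SSIS or a stored procedure, but the logic is the same: no fact row is dropped, and every unmatched key lands on a deliberate, labeled member.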
Any provisions to compensate for data anomalies within the data model will produce some degree of variation between reports produced upstream and analytic reports produced from the semantic model. This may be acceptable as long as users understand and expect certain discrepancies. Reconciling financial results and auditing to a narrow margin can be a difficult and time-consuming exercise. Often, user confidence can be maintained when the analytic model produces the same bottom-line figure and discrepancies appear only in different grouping buckets or categories – like a “catch-all” or “Unknown” member for records that don’t match.
How Analysis Services Multidimensional and Tabular Deal With Data Non-Conformity
SQL Server Analysis Services (which we now call “Multidimensional SSAS”) has handled mismatched keys the same way for several years. When fact rows are processed with key values that don’t match a related dimension, errors are reported and the model processing fails. The developer must make a deliberate exception for these records to match a specially-generated “Unknown” member of the dimension. Under the default behavior, a solution developer following The Kimball Method would have defined Unknown member records in every dimension table and then mapped the unmatched keys in the ETL process.
For the purposes of this discussion, SSAS Tabular, Power Pivot and Power BI Desktop are the same. These tools, which are derived from different versions of the same code base, all deal with mismatched keys the same way.
SSAS Tabular is significantly different from Multidimensional in that unmatched fact rows simply flow through model processing, and a blank member is generated in each affected dimension table. Visually, this shows up as an aggregated value under an unlabeled heading. Casual users will miss this and won’t notice that a certain number of records have fallen through the cracks. There are currently no properties or settings to change this behavior. In fact, a Connect item filed back in February 2014 was dismissed as a low-priority suggestion. I’m not debating the issue, just informing you that this is how it works. I’d personally like at least an option to label the blank dimension record as “Unknown”. I’ve found some creative ways to work around this limitation. The best-documented approach is the one that Javier Guillen posted in 2012.
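One way to approximate that work-around during data preparation is to add a labeled “Unknown” row to the dimension query in Power Query, so that any fact keys remapped to the reserved key land on a named member rather than a blank one. DimCustomer, its columns, and the -1 key are assumed names for illustration, not details from Javier’s post:

```
// Sketch: append an explicit "Unknown" member to the dimension query so the
// blank heading becomes a labeled bucket. Names are illustrative.
let
    // A one-row table with the reserved Unknown key and a visible label.
    UnknownRow = #table(type table [CustomerKey = Int64.Type, CustomerName = Text.Type],
                        {{-1, "Unknown"}}),
    // Append it to the existing dimension query.
    WithUnknown = Table.Combine({DimCustomer, UnknownRow})
in
    WithUnknown
```

Paired with a step that maps unmatched fact keys to -1, this makes the orphaned records show up under “Unknown” in report visuals instead of under an empty heading.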
One of the reasons I like SSAS Tabular is that it’s simple. But “simple” means that it has less “stuff”, less “clutter” and fewer features than its more complicated ancestor. I suppose that if every SSAS practitioner were able to get the product developers to add their favorite feature, we might end up right where we started – and that’s a dilemma. This is an over-simplification, of course, as there is always room for product improvement.
This is a topic worthy of deeper discussion so it’s on the list of items for me to explore in follow-up posts. Please let me know what you think, post your questions and provide some feedback.
The proliferation of Power BI across the Microsoft ecosystem is astounding, and Power BI keeps popping up in the most interesting places – like live streaming dashboards in Cortana Analytics and, just recently, on the Windows 10 desktop. Power BI is so much more than a simple data visualization tool for desktop data and social media streams. With the Windows 10 Cortana integration, you simply ask Cortana a question and the answer is visualized in the search results window above the Start button. Any published datasets you choose are added to the list of sources Cortana checks when you ask her questions. A link below the visual takes you to your Power BI dashboard in the web browser to interact, add fields and filters, change the visual, and pin the results to the dashboard if you like.
I’ve made a quick video to demonstrate how it works. By the way, you can just type your questions instead of speaking them to Cortana. Questions are sent to the Power BI Q&A feature for the datasets you chose to integrate from your subscription.
The Windows 10 “Threshold 2” update a couple of weeks ago introduced Power BI integration with Cortana on the desktop. I checked for non-critical updates on my desktop machine, as I often do each Sunday, and Windows Update told me that the update was ready to download and install:
This is a substantial update that took a while – and a few system reboots – to install. Once the system is updated, you need to make sure the user account you use for your Power BI subscription is integrated with Windows 10 using the Settings > Accounts page in Windows…
After that, Cortana integration should show up in your Power BI tenant online. In the settings page, check the “Enable Cortana…” box for each dataset you want to integrate.
Interacting with Power BI is now a simple matter of asking Cortana questions that line up with objects in the published model, just as you would using the Q&A feature from a dashboard in the web browser. In most cases, you’ll want to navigate from the Windows search results window to the full-page view by using the “Show more details in Power BI” link.
Chicago – full-day: March 4, 2016
San Jose, CA – 2 hour session at the PASS BAC, May 3-4, 2016
Will you be in the Chicago area on March 4th? So will I, and you can learn to use Power BI to design real business solutions at my full-day Power BI Hands-On Workshop, one day before Chicago SQL Saturday #484. I’ve delivered this workshop at other preconference events with sell-out attendance – thank you to all who have attended in other locations. An abbreviated version was delivered at the PASS Global Summit this past November, and I’ll be doing a quick two-hour abbreviated session at the PASS Business Analytics Conference this coming spring in San Jose, CA.
This engaging workshop will teach attendees how to use Power BI Desktop to create a complete solution delivering dashboards, self-service and mobile BI. Attendees will learn to analyze information using impactful and interactive visualizations. We will create a solution using a variety of data sources that can be used on the desktop by Data Analysts or for group collaboration in the Power BI secure cloud service.
Attendees should have a basic knowledge of database concepts and should know how to use Excel 2013 or newer.
Bring your laptop running Windows 7 or newer (4 GB of RAM is required; more is better; a 64-bit OS is recommended). Before the session, attendees should download and install Power BI Desktop and create a free account at PowerBI.com (90% of the exercises can be completed in Power BI Desktop without a PowerBI.com subscription). Note that to create a subscription you must use a work email address rather than an address from a free mail provider such as Hotmail, Gmail or Yahoo!. If you have any difficulty getting signed up, follow the directions at this post.
At the conclusion, attendees will know how to use Power Query to combine and transform data from different sources, and Power Pivot to model data for reporting and discovery and to write calculations, KPIs and complex metrics. They will learn to use Power BI report visuals to create interactive visualizations and dashboards, and Excel PivotTables, PivotCharts, slicers, conditional formatting and advanced features to create collaborative dashboards and reports. Finally, attendees will learn to deliver a collaborative BI solution in Power BI online, enabling natural language queries with Power BI Q&A, scheduled data refresh, and queries using live, on-premises data. We will also introduce enterprise capabilities, which include connectivity to live, on-premises data sources through DirectQuery and Analysis Services.
BI Solution Components
Begin with a brief and high-level introduction to the components of traditional BI solutions to help attendees appreciate the usual scale and cost. Discuss the options to scale these items for small and large projects. Discuss the trade-offs for each and the value of centralized solutions, data quality and master data management. Discuss the challenges of designing, building and maintaining large-scale BI solutions.
Introducing the Power BI Dashboard designer
Universal principles of data behavior
Universal data problems & visualization principles
Project types & scale
Self-service BI & enterprise scale projects
Power BI project stages
Managing data transformations with Power Query
Introduction to advanced “M” script
Data modeling with Power Pivot
Introducing DAX Calculations
Visualizing & exploring with Power BI report visuals
Building live, interactive dashboards
Publishing and collaborating with PowerBI.com
Integrating Excel pivot tables & reports
Introducing management groups & content packs
On-premises and large-scale solution options
Data Management Gateway & scheduled refresh
Registration includes lunch and beverages
When: Friday, March 4, 2016 from 8:30 AM to 4:30 PM (CST) – Add to Calendar