I’ve created a series of five tutorial videos for a set of workshops I’ll be doing at several events this year. The workshop is offered in formats ranging from an hour-long quick demo to a full-day, hands-on session with a hosted lab environment in the cloud. The series teaches how to create a complete business intelligence solution using the components of Power BI in Excel 2013. These video tutorials, although complete, are a subset of the entire workshop. Each tutorial is about 10 minutes long.
Some exciting announcements were made today about the new Power BI cloud-based business analytics service. Today Power BI is an add-on service for Office 365 that requires an enterprise-level license – about $50 per user per month. Data transformation queries, data models and dashboard visualizations are all created using add-in tools for Excel 2013, and all of the content is managed in SharePoint Online. The new Power BI designer and hosted dashboard elements won’t require any Excel add-ins; they use a streamlined cloud service with a range of mobile options. The best part is that the price also went way down – about $10 per month for designers, and it’s free for anyone who just runs reports and dashboards. Yea… free!
James Phillips, Microsoft General Manager of Data Experiences, posted: “Today is an incredibly exciting day as we unveil the new Power BI – a service we believe will fundamentally transform the ‘business of business intelligence.’ Power BI can help every company adopt a data culture with easy-to-use, accessible tools and innovative technology that lowers the barrier to entry, for all.” Read his entire post here.
The new Power BI Preview is available to try for free to anyone with a US email address. It’s actually been available to limited audiences for a couple of months, but this is the first official announcement. Just go to the Power BI site and get started using the preview.
This new direction raises many questions about the investments many have already made in Microsoft BI tools, and about the options we’ll have to continue using those tools or to migrate to the new platform. We are seeing a lot of open dialog from the Microsoft product teams and leadership to address these questions in advisory councils and forums like the MVP Program.
I’ve been playing with the preview and it’s quite impressive for being so new. Several features aren’t quite as capable as the full Excel-based suite in Office 365 today, but the tiled dashboards and new visualizations are very nice. The designer is lean and simple. It contains a light version of Power Query, an ultra-light modeling tool in lieu of Power Pivot, and a very robust visualization experience that in many ways is ahead of Power View. All in all, it’s a nice tool that is being updated rapidly. I’ve provided quite a bit of feedback using the “smile” and “frown” buttons in the designer, and the product teams have responded very quickly with bug fixes and answers. They’re obviously committed to getting this right. The preview is probably not something you’ll want to use for real business BI projects right now, but it is certainly a tool to take very seriously in the months ahead.
This post actually applies to both multidimensional (cube) projects and Tabular projects which manage data source connectivity in a similar fashion. The multidimensional design experience is different but the mechanics of the SSAS engine are essentially the same.
When loading data from a SQL Server data source into a Tabular model project, chances are very good that you’ll see an error on the first attempt. I’ve been creating SSAS projects for over 15 years and, even though this is a relatively simple process, I still usually work through the initial configuration in trial-and-error mode. So, let’s call this a normal part of the process.
A Word of Advice
In Tabular model design, a common practice is to select all the tables and columns, rename objects and apply filters before finalizing the import step. If something goes wrong and you can’t connect to the data source, you’ll lose all this work and will need to start over. Test connectivity with one table before going through this aggravating cycle.
Here’s an example of an “ouch” moment after taking a half hour to make all the table/column selections. Note the lack of a “Back” button!
When connecting to a SQL Server data source in the SSAS project designer, there are two different identities used at different times. When you initially create and test the connection in the designer, you authenticate as yourself – the user logged into the local computer. However, when you actually load data into the table or process the model, even in the designer, Analysis Services authenticates using the identity of the SSAS service. If the SSAS instance is installed on your local development machine and you just used the default settings during setup, this is an account generated by SQL Server setup. If you’re using a production-like SSAS server, your system administrator may have configured the service to run under a specifically-defined domain principal. If the latter is the case, you’ll need to get your DBA or sys admin involved. If you have an isolated development environment, with everything installed on one computer, the following instructions will take care of it.
I’m running all my SQL Server services on my local development machine and created a Tabular project that will use the AdventureWorksDW2012 sample database. The SSAS instance I plan to use for my workspace database is the local instance. First, let’s get the service account for the SSAS instance. The easiest way to view local services is to right-click “My Computer” or “This PC” in Windows Explorer and select “Manage”. Navigate to the Services node like so:
I have two SSAS instances installed and the default instance is called MSSQLSERVER. Double-click to view the Properties dialog and select the Log On page:
As you can see, the default service account is “NT Service\MSSQLServerOLAPService”. I’ve selected this text and copied it to the clipboard. I’m done with the Services and Properties windows.
Now, I open SSMS and view the Logins for the SQL Server instance my SSAS project uses for the data source. After connecting to the relational instance, expand the object explorer, then Security and Logins. Two logins are needed: one for myself and one for the SSAS service account. Since this is my development machine and I installed SQL Server myself, I already have a login for my Windows user account, and I’m a member of the sysadmin server role, which grants me access to every database on the server. If you’re not a privileged user on this SQL Server instance, you will need a login created for your Windows credential, and you will need to be in at least the db_datareader role for the target database (example below).
To create the service account login, right-click Logins and choose New Login… Paste the service account name here and move to the User Mapping page. Select the target database and then check the box next to the db_datareader database role.
Click OK to save changes and you should be good to go.
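If you prefer to script these steps instead of using the dialogs, here is a T-SQL sketch of the same work. It assumes the default local service account shown above and the AdventureWorksDW2012 sample database; substitute your own service account and database names as needed:

```sql
-- Create a Windows login for the SSAS service account.
-- NT Service\MSSQLServerOLAPService is the default for a default instance;
-- use your domain service account instead if one is configured.
CREATE LOGIN [NT Service\MSSQLServerOLAPService] FROM WINDOWS;
GO

USE AdventureWorksDW2012;
GO

-- Map the login to a database user and grant read access
CREATE USER [NT Service\MSSQLServerOLAPService]
    FOR LOGIN [NT Service\MSSQLServerOLAPService];
ALTER ROLE db_datareader
    ADD MEMBER [NT Service\MSSQLServerOLAPService];
GO
```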
Back in the Tabular model designer, I’m going to import only one small table to check connectivity. Once verified, I’ll click the Existing Connections toolbar icon, select all my tables, select and de-select columns, apply filters and then load the data with successful results. As soon as I see record counts, I know I’m in the clear…
Just One More Thing
On a different but related topic, since you’re getting your development environment ready: if you’re not a local administrator on the SSAS machine you’re using for the Tabular workspace database (typically a development machine), you need to make sure your Windows account is in the Server Administrators role for the SSAS instance. If this isn’t the case, you’ll get an error when you process the model. To do this, connect to the SSAS instance in SSMS, right-click the server and open Properties. On the Security page, use the Add button to add your Windows account.
Is it possible to duplicate the same many-to-many relationship behavior in VertiPaq that we have in SSAS multidimensional?
Since Tabular model projects were introduced in SQL Server 2012, one of the major blocking points for adoption has been the lack of out-of-the-box support for many-to-many relationships. I’ve been using, demonstrating and showcasing SSAS Tabular and Power Pivot for at least three years, and in that time it became apparent that, even though this might only be a perceived limitation in many cases, for quite a few enterprise IT shops many-to-many support has become a litmus test for whether this technology is ready to be taken seriously. Honestly, most business data models don’t need many-to-many relationships to be useful, but the requirement is still a reality. This week Chris Webb posted about bidirectional relationship support in the new Power BI designer and demonstrated an example using sales of fruit that can exist in different categories.
The Power BI Designer is the first publicly-available product from Microsoft that supports M2M relationships in a VertiPaq model, and I wanted to conduct a test to see if I could duplicate the same M2M functionality we have in a familiar multidimensional model, using the Adventure Works sample data. So, here goes…
Many-to-Many Example in SSAS Multidimensional
In the Enterprise examples for the Adventure Works Cycles sample data, the Adventure Works cube contains two many-to-many dimensional relationships. Here is one of those relationships; in the Dimension Usage grid of the cube editor, you can see that the relationship between the Internet Sales measure group and the Sales Reason dimension is defined as many-to-many:
The M2M relationship involves three tables in the data source view with the Internet Sales Reason Facts table bridging any number of Internet Sales Facts records to any number of Sales Reasons records. Not in view is the DimCustomer table which is related to Internet Sales Facts.
A measure group named Internet Customers contains a distinct count measure based on the CustomerKey column in the Fact Internet Sales table. This measure will exhibit behavior unique to M2M aggregation as you’ll see in these queries:
In the first query, the Customer Count measure is grouped by Sales Reason on rows. If we were to add the result for each group, the total would be 24,135. This is incorrect because it’s a non-additive measure – the same customer is counted for placing sales orders for different reasons so some customers are counted multiple times:
This becomes evident when replacing the Sales Reason attribute with the All member on rows, showing that there were actually only 18,484 distinct customers placing orders:
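For reference, the two queries follow this general MDX pattern (a sketch using the Adventure Works cube’s object names; run each query separately, and note that the measure and All member names may differ slightly in your sample build):

```mdx
-- Customer Count grouped by Sales Reason: summing the rows gives 24,135,
-- which overstates the true number of distinct customers
SELECT
    [Measures].[Customer Count] ON COLUMNS,
    [Sales Reason].[Sales Reason].[Sales Reason].Members ON ROWS
FROM [Adventure Works];

-- The same measure at the All level: only 18,484 distinct customers
SELECT
    [Measures].[Customer Count] ON COLUMNS,
    [Sales Reason].[Sales Reason].[All] ON ROWS
FROM [Adventure Works];
```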
Narrowing the results to only one customer helps to explain. This is the key value for customer “Aaron Zhang”. You can see that Aaron placed two orders with a different reason for each:
Again, replacing the Sales Reason attribute with the All member shows that there was really only one customer:
Many-to-Many Example in a VertiPaq Model
To get started, I opened the Power BI Designer and imported the same tables from AdventureWorksDW2012. I’m using the Power BI Designer Preview as of early January, 2015. Because we don’t have a Power Pivot interface in the designer (yet?), we’re essentially using Power Query to populate the VertiPaq semantic model.
The FactInternetSales and FactInternetSalesReason tables are related using two columns: SalesOrderNumber and SalesOrderLineNumber. These columns needed to be converted to strings and then concatenated into a single column to facilitate the relationship. This is a common practice in Power Pivot/Tabular.
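This conversion and concatenation can be sketched in Power Query’s “M” language as a step added to the FactInternetSales query (the server name, query layout and the “SalesOrderKey” column name here are my own illustrations, not taken from the designer):

```m
// Build a single relationship key by converting both columns to text
// and concatenating them
let
    Source = Sql.Database("localhost", "AdventureWorksDW2012"),
    FactInternetSales = Source{[Schema = "dbo", Item = "FactInternetSales"]}[Data],
    AddedKey = Table.AddColumn(
        FactInternetSales,
        "SalesOrderKey",
        each Text.From([SalesOrderNumber]) & Text.From([SalesOrderLineNumber]),
        type text
    )
in
    AddedKey
```

The same step would be applied to FactInternetSalesReason so both tables expose the combined key column for the relationship.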
Using the Manage Relationships dialog, the tables are related with these three relationships:
I switched to the Report page, which finalized the data load and populated the model. When I started, I was concerned about not being able to explicitly define DAX measures, but it was a simple matter to change the field aggregation for the CustomerKey to “Count (Distinct)”.
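In a Tabular project or Power Pivot, where explicit measures are available, the equivalent calculation would be a simple DAX distinct count (the measure name here is my own):

```dax
Customer Count := DISTINCTCOUNT ( FactInternetSales[CustomerKey] )
```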
Using the Power View report tool, it was also a simple matter to verify the results and duplicate all of the query results as before. Here, you can see that Aaron Zhang placed two different orders for different reasons and the total reflects the non-additive behavior of this many-to-many relationship.
In conclusion, many-to-many relationships are now supported in this implementation of VertiPaq, and they behave the same as in multidimensional cubes! This is very good news and may give us a good idea of what to expect in the near future.