I’m impressed with the current generation of semantic modeling technology from Microsoft. I’ve had the opportunity to use both PowerPivot and SSAS Tabular models on a handful of client projects and have had very good results so far. The xVelocity in-memory aggregation and data compression engine seems to be everything that was promised. PowerPivot doesn’t allow us to do everything we can do with Analysis Services cubes, but for simple aggregate values and calculations performed over moderately large volumes of data, it performs quickly and is easy to implement. In some ways, we can do more with Tabular models than with multidimensional models, and with less effort. I’m eager to continue testing the limits of this technology and to find its practical boundaries and capabilities.
This past week, Microsoft released Office 2013 with PowerPivot and the Power View visualization tool built right into Excel. Power View makes it really easy to analyze PowerPivot and Tabular models and to present results in interesting and impactful ways. Every time I start working with Power View, I can’t put it down because I keep finding more creative ways to present the information.
I’ve given three presentations so far on using PowerPivot, Power View and Tabular models at SQL Saturday events in Dallas and Salt Lake City, and I have four more events* on the horizon. One of the reasons I enjoy giving these presentations is that it gives me a good opportunity to find out what people are thinking about when they consider using different database tools. Being in front of an audience challenges me and makes me think hard about the right and wrong ways to use software. Some really good questions came out of these sessions that I’ll share with you. Thank you to those who have attended; I sincerely appreciate your questions and challenges.
At the Dallas SQL Saturday BI Edition, I was delivering a session titled PowerPivot & Power View – High-octane Self-service Reporting and demonstrating how to mash up different sets of data. I used a bridge table sourced from a text file to resolve a data quality issue: two product numbers that represented the same product, explaining that this is a common practice in an ad-hoc solution. A hand went up and a gentleman asked, “Isn’t that something that should be resolved in your ETL process and stored in a central data warehouse?” Absolutely, yes! Ideally, data quality and master data issues should be managed centrally and according to a business-wide standard.
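The bridge-table pattern from that demo can be sketched outside of PowerPivot as well. Here is a minimal illustration in Python with pandas, using hypothetical product numbers and amounts (the session’s bridge table came from a text file; the column names here are assumptions for the sketch):

```python
import pandas as pd

# Sales data in which the same product appears under two different numbers
# (hypothetical values standing in for the demo's source data).
sales = pd.DataFrame({
    "ProductNumber": ["BK-1001", "BK-1001X", "HL-2002"],
    "Amount": [150.0, 200.0, 75.0],
})

# Bridge table mapping every known alias to a single canonical product number,
# playing the role of the text-file bridge table in the session.
bridge = pd.DataFrame({
    "ProductNumber": ["BK-1001", "BK-1001X", "HL-2002"],
    "CanonicalNumber": ["BK-1001", "BK-1001", "HL-2002"],
})

# Join the sales rows through the bridge, then aggregate on the canonical key
# so the duplicate product numbers roll up to one product.
cleaned = (
    sales.merge(bridge, on="ProductNumber")
         .groupby("CanonicalNumber", as_index=False)["Amount"].sum()
)
print(cleaned)
```

The same idea applies in a PowerPivot model: relate the fact table to the bridge, then relate the bridge to the product table, so measures aggregate on the canonical product rather than the raw duplicated numbers.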
The reality is often that when a user has a set of data containing some inaccuracies, they can’t always wait for the perfect solution, and we simply do what must be done at the time to resolve issues in the short term. Exceptions to these rules of “proper” design may be evidence of a business requirement that should be carefully reviewed so we can “do it right” in the future. In another session, I was explaining that one important purpose of PowerPivot is to put powerful yet simple data modeling tools in the hands of business users so they can do the work themselves, rather than waiting for IT to build solutions for them. This brings a very interesting question to bear. Isn’t this what we did with Microsoft Access? Are we setting the stage for a new generation of poorly-designed, user-created applications that promote desktop data sprawl and a general lack of standardization?
Access was and still is a great tool for organizing and presenting information. Since it is installed on so many desktops with the rest of the Office applications, it’s available to many undisciplined and untrained users to create poorly-implemented solutions. Isn’t the same thing true for PowerPivot and Power View? Now that they are both on the desktop and accessible to users, won’t they also create incorrectly designed applications and spread outdated and inaccurate data around the organization? Sure they will. The difference is that these users have a simple path to move from the desktop to the workgroup and the enterprise by storing these models in SharePoint.
To keep things under control in the long term, IT must be a part of this process. Solutions should be sandboxed so that business-user-created data models and reports are stored in isolated document libraries designated for prototypes and proofs of concept. After formal review and redesign, if necessary, models and reports are moved to designated document libraries with the expectation that they may be used with the stamp of approval of the business and IT. These reports use data governed by IT and vetted by the business, having met certain standards of accuracy and reliability. User-generated data models can also be migrated to use enterprise data marts and IT-governed data models, rather than the outdated, offline data models created by users.
Users will always create their own solutions using whatever tools are at their disposal. Some will be wrong and some will be very valuable to the business. The key to achieving the right balance is proactive user education; the appropriate level of security and control; the separation of “experimental” models and reports from those guaranteed to be accurate and reliable; and a regular cadence of IT and business review of all assets to migrate, correct or redesign these data models and reports, moving them from one category to the other.
*I’ll be speaking on these topics at SQL Saturday #172 in Portland next week, at the PASS Global Summit the following week, then at SQL Saturday #166 in Olympia, Washington, and in December at the DevTeach conference in Montreal.
Paul,
You included the following comment, which I believe is a “the glass is half-empty” view:
“This brings a very interesting question to bear. Isn’t this what we did with Microsoft Access? Are we setting the stage for a new generation of poorly-designed, user-created applications that promote desktop data sprawl and a general lack of standardization?”
Have a look at a paper written by Luke Chung, President of FMS, Inc., a very well respected company that creates great tools for users of Access, SQL Server, and .NET software. In this paper, Luke includes the following quote:
Successful databases evolve over time. A good IT strategy embraces, not fights, this natural trend. Anticipating the transition is part of a successful database strategy. That means preparing for times when applications need to migrate to new platforms or be completely re-written.
When these occur, one should not blame the existing platform, but rather celebrate the success of the organization and the system that took it to the next level. The existing system should be considered a great prototype for the next system since the business needs are well defined and users accept it. This significantly reduces the risk of the new system in a world where expensive systems are never delivered or built or fulfill a fraction of their original intent.
Source: Database Evolution: Microsoft Access within an Organization’s Database Strategy
http://www.fmsinc.com/MicrosoftAccess/Strategy/index.asp#hateaccess
So, how about celebrating these successes, instead of having a mindset that IT must somehow be involved in everything? As a 27-year employee of a Fortune 50 company, I can tell you, it “ain’t gonna happen”, no matter how much you wish otherwise. The reason is simple: Most IT departments are simply too busy, do not truly understand customer requirements, and oftentimes deliver unsatisfactory applications at huge costs to the departments. Yes, I speak from first-hand knowledge here.
Tom Wickerath
Microsoft Access MVP Alumni
April 2006 to March 2012
Tom,
Thank you for your thoughtful response to this post, and I apologize for taking so long to reply. I agree wholeheartedly with your assessment of business-user-created solutions. I chose to use Access as an example because I knew that it would evoke some emotion, but beyond just stirring the pot to get people riled up, I wanted to make the point that powerful tools in the right hands can be used to do amazing things. Fifteen years ago, I was that Access jockey, working in the trenches near the production lines, trying to solve business problems that a large corporate IT group was not solving because they had other things to do – and we were very successful.

Yes, Access is a tool that lets people who are close to the business create solutions that can move the business forward in positive ways. If people in the organization, in various roles (analysts, management, IT, executives, etc.), are working together toward common goals, these efforts can result in appropriately-managed and supported systems that solve problems and provide real value. The glass is half-full… sometimes. Like Access, PowerPivot and other next-generation analysis tools will be used for good and evil, and the partnership between data-skilled business users and IT professionals, who are charged with the responsibility to protect and govern corporate data, will become more critical than ever.