PASS 2010 Summit Announcements and New SQL Server Developments

To say that this was a very full conference would be an understatement.  The SQL Server platform has grown to be nearly incomprehensible; it's nearly impossible for any mortal to have a complete understanding of the features and capabilities across every technology in the component stack.  Not many years ago there were jack-of-all-trades SQL Server professionals who wore most of the hats and did a little reporting and data analysis on top of their data administration tasks.  Very few people are now wearing all the hats and getting anything done.  There do seem to be some DBA/developers in the smaller shops, but if a DBA is truly doing their job, they're not doing any sort of deep BI work at the same time.

With a significant feature refresh this past year and a brand new product version in the works for next year, there are many new features in the currently-available product and many, many more coming.  A lot of the information related to SQL Server “Denali” has been revealed under NDA in insider forums and presentations from various Microsoft product teams over the past few weeks, and is being made public this week.

I’ve been getting heavy doses of Denali lately.  About a month ago, I was invited to attend the week-long SQL Server “Denali” Software Design Review at Microsoft.  This is where MVPs and selected Microsoft partners saw every new or candidate feature demonstrated by the product teams.  They solicited feedback and told us about some longer-term goals.  All of this was under strict non-disclosure agreement, but some of the information was released to the public at the PASS Summit.  Before and after PASS, I attended the SQL Server Technology Advisory Summit – an extension of the SDR where we went deeper into specific features and development planned for the Denali RTM.  For me, this was a great opportunity to sit down with the Microsoft BI leadership and several of the SSRS & SSAS product team members to discuss the goals and prioritization of new features.  They have many ideas and many features in the works but can only deploy a limited set of capabilities in the first release.  These were very insightful meetings – and of course, all but the publicly-released information is covered by NDA.  So, I’ll tell you what I know and what I think when I’m permitted to do so.

The BI Semantic Model and Future of Analysis Services

The BI Semantic Model, or BISM, is the most significant BI development in the SQL Server space and is a huge bet for Microsoft.  They’ve been behind their competition in offering a truly capable semantic data modeling layer to compete with the likes of the Business Objects Universe or the Cognos Framework.  Analysis Services is an awesome solution for staged, read-only data analysis but hasn’t worked well for real-time scenarios.  SSAS is a very mature OLAP technology and the mainstay for corporate IT shops and consultants specializing in BI.  Needless to say, this will shake things up in the industry.  I have had mixed feelings about this direction but I can finally say that I get it.  Do I like it or completely agree with it?  I’m still trying to form an opinion.  If they can truly execute on their plans to make this an enterprise-class tool, then I’ll be excited to see that happen.  My concern is that we now have a technology that appears to be two different things: a user-class add-in for Excel to let non-savvy users play with data, and a hugely capable, enterprise-class, do-it-all analytic engine.  I don’t know if serious business IT folks can accept a technology that claims to do it all.

Since last year, PowerPivot has enabled business users to serve their own data without limits.  It provides a highly-interactive experience with very fast performance, and a PowerPivot workbook can be imported into a BISM model.  According to Amir Netz in the keynote demonstration, importing, querying and working with the model data is not just “wicked fast” – it’s now “engine of the devil” fast!  With BISM, Vertipaq now exists in the relational engine and operates on a column store in SQL Server.  The Analysis Services product team promises to eventually add all of the capabilities found in the UDM-based Analysis Services to the BISM platform.
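To make the column-store point a little more concrete: the relational columnstore index announced for Denali stores index data column-by-column rather than row-by-row, which is what gives Vertipaq-style scans their speed over wide fact tables.  The sketch below is only illustrative – the table and column names are hypothetical, and the final Denali syntax may well change before RTM:

```sql
-- Hypothetical fact table; the names here are my own, not from any demo.
-- A nonclustered columnstore index covers the columns an analytic query
-- would scan and aggregate, stored as compressed column segments.
CREATE NONCLUSTERED COLUMNSTORE INDEX ix_FactSales_cs
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, OrderQuantity);
```

The idea is that an aggregate query touching only SalesAmount and OrderDateKey reads just those column segments instead of every row, which is where the dramatic scan speedups come from.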

I had the opportunity to participate in the SQL Server Technology Advisory Summit on Monday and Friday of this past week.  When I specifically asked about the long-term future of Analysis Services, the SSAS product team leadership promised that the current OLAP engine will continue to be developed and supported for years to come.  My crystal ball is cloudy, but I think this means that the Vertipaq/PowerPivot-based solution will continue to receive most of the product team’s attention and that some time in the next several years, if it proves to be a completely superior offering, SSAS will become a candidate for eventual deprecation.  The bigger question is how SSAS will be perceived in the industry during this transition.  If everyone drove Ferraris, would elite car enthusiasts still want to drive them?

As more business users try to create their own BI solutions with PowerPivot and consultants build enterprise solutions with these new tools, our best practices and design patterns will get some realignment.  I think many of us who are deeply entrenched in Microsoft BI will need to do some repositioning as we do our best to juxtapose the new and existing tools for our customers in a climate that is already a little confusing.  No doubt PowerPivot is a powerful technology that can be used to solve a lot of interesting problems, but “with great power comes great responsibility,” so it may also be a loaded gun for a lot of users.  They will likely get themselves into trouble and will need help to dig out.  Remember what we did with Microsoft Access 10-15 years ago?  There’s a whole freight train full of that on its way again.  IT and business leaders must step in with support and data governance rules to manage the way data is used and the decisions made from these models.

CTP1 is publicly available this week.  Honestly, there’s not much to see.  None of the new, cool stuff is really finished, but it should be in CTP2, which will only be available to members of the TAP program – most likely sometime in the late winter or spring.

Paul Turley

Microsoft Data Platform MVP and Principal Consultant for 3Cloud Solutions, specializing in Business Intelligence, SQL Server solutions, Power BI, Analysis Services & Reporting Services.
