I’m interested in hearing your response to this question. Microsoft is investing aggressively in delivering Business Intelligence as a holistic platform for reporting, collaborating & sharing information. The next generation of Microsoft self-service reporting tools will require SharePoint 2010 Enterprise Edition. Some of the other enterprise BI vendors require heavier investments to use their products.
What are your thoughts about this? Is it good news that Microsoft is offering a cohesive platform and technologies that work well together? Is this too much to expect from mid-market businesses? Is Microsoft stepping up to compete with the BI big guys?
To say that this was a very full conference would be an understatement. The SQL Server platform has grown to be nearly incomprehensible. It’s nearly impossible for any mortal to have a complete understanding of the features and capabilities across every technology in the component stack. Not many years ago there were jack-of-all-trades SQL Server professionals who wore most of the hats and did a little reporting and data analysis on top of their data administration tasks. Very few people now wear all the hats and get anything done. Perhaps in the smaller shops, but if a DBA is truly doing their job, they’re not doing any sort of deep BI at the same time. There do seem to be some DBA/developers in the small shops but they’re not deeply entrenched in BI projects.
With a significant feature refresh this past year and a brand new product version in the works for the following year, there are many new features in the currently-available product and many, many more coming. A lot of the information related to SQL Server “Denali” has been revealed under NDA in insider forums and presentations from various Microsoft product teams over the past few weeks, but is being made public this week.
I’ve been getting heavy doses of Denali lately. About a month ago, I was invited to attend the week-long SQL Server “Denali” Software Design Review at Microsoft. This is where MVPs and selected Microsoft partners saw every new or candidate feature demonstrated by the product teams. They solicited feedback and told us about some longer-term goals. All of this was under strict non-disclosure agreement but some of the information was released to the public at the PASS Summit. Before and after PASS, I attended the SQL Server Technology Advisory Summit – an extension of the SDR where we went deeper into specific features and development planned for release in the Denali RTM. For me, this was a great opportunity to sit down with the Microsoft BI leadership and several of the SSRS & SSAS product team members to discuss the goals and prioritization of new features. They have many ideas and many features in the works but can only deploy a limited set of capabilities in the first release. These were very insightful meetings – and of course, all but the publicly-released information is covered by NDA. So, I’ll tell you what I know and what I think when I’m permitted to do so.
The BI Semantic Model and Future of Analysis Services
The BI Semantic Model, or BISM, is the most significant BI development in the SQL Server space and is a huge bet for Microsoft. They’ve been behind their competition in offering a truly capable semantic data modeling layer to compete with the likes of the Business Objects Universe or the Cognos Framework. Analysis Services is an awesome solution for staged, read-only data analysis but hasn’t worked well for real-time scenarios. SSAS is a very mature OLAP technology and the mainstay for corporate IT shops and consultants specializing in BI. Needless to say, this will shake things up in the industry. I have had mixed feelings about this direction but I can finally say that I get it. Do I like it or completely agree with it? I’m still trying to form an opinion. If they can truly execute on their plans to make this an enterprise-class tool, then I’ll be excited to see that happen. My concern is that we now have a technology that appears to be two different things: a user-class add-in for Excel to let non-savvy users play with data and a hugely capable, enterprise-class, do-it-all analytic engine. I don’t know if serious business IT folks can accept a technology that does it all.
Since last year, PowerPivot has enabled business users to serve their own data without limits. It provides a highly-interactive experience with very fast performance. A PowerPivot workbook can be imported into a BISM model. According to Amir Netz in the keynote demonstration, importing, querying and working with the model data is not just “wicked fast.” It’s now “engine of the devil” fast! With BISM, Vertipaq now exists in the relational engine and operates on a column store in SQL Server. The Analysis Services product team promises to eventually add all of the capabilities found in UDM-based Analysis Services to the BISM platform.
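To see why a column store makes scans so fast, here is a toy sketch in Python – emphatically not Vertipaq itself, just the storage idea it exploits: when values for each field sit together, an aggregate over one field only has to touch that field’s values instead of every whole row.

```python
# Toy illustration of row storage vs. column storage (not VertiPaq).
rows = [
    {"region": "West", "product": "A", "amount": 120.0},
    {"region": "East", "product": "B", "amount": 75.5},
    {"region": "West", "product": "B", "amount": 42.25},
]

# Column store: one contiguous list per field.
columns = {
    "region": [r["region"] for r in rows],
    "product": [r["product"] for r in rows],
    "amount": [r["amount"] for r in rows],
}

# Row-wise total: every row, with all of its fields, is visited.
total_row_wise = sum(r["amount"] for r in rows)

# Column-wise total: only the "amount" column is scanned.
total_col_wise = sum(columns["amount"])

assert total_row_wise == total_col_wise == 237.75
```

The payoff in a real engine comes from scanning (and compressing) just the columns a query needs, which is the behavior behind PowerPivot’s in-memory speed.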
I had the opportunity to participate in the SQL Server Technology Advisory Summit on Monday and Friday of this past week. When I specifically asked about the long-term future of Analysis Services, the SSAS product team leadership promised that the current OLAP engine will continue to be developed and supported for years to come. My crystal ball is cloudy but I think this means that the Vertipaq/PowerPivot-based solution will continue to receive most of the product team’s attention and that some time in the next several years, if it proves to be a completely superior offering, SSAS will become a candidate for eventual deprecation. The bigger question is how SSAS will be perceived in the industry during this transition. If everyone drove Ferraris, would elite car enthusiasts still want to drive them?
As more business users try to create their own BI solutions with PowerPivot and consultants build enterprise solutions with these new tools, our best practices and design patterns will get some realignment. I think many of us who are deeply entrenched in Microsoft BI will need to do some repositioning as we do our best to juxtapose the new and existing tools for our customers in a climate that is already a little confusing & unclear. No doubt that PowerPivot is a powerful technology that can be used to solve a lot of interesting problems, but “with great power comes great responsibility,” so it may also be a loaded gun for a lot of users. They will likely get themselves into trouble and will need help to dig out. Remember what we did with Microsoft Access 10-15 years ago? There’s a whole freight train full of that on its way again. IT and business leaders must step in with support and data governance rules to manage the way data is used and the decisions made from these models.
CTP1 is publicly available this week. Honestly, there’s not much to see. None of the new, cool stuff is really finished, but it should be in CTP2, which will only be available to members of the TAP program, most likely sometime in late winter or spring.
Microsoft is investing heavily in cloud computing with several online services available under the Azure Platform.
The SQL Azure cloud database offering has been available for about a year now. I personally haven’t seen wide adoption for SQL Azure but there’s a lot of interest. Most of our clients either don’t think they need to push their data outside their walls or don’t have the kind of scalability needs that Azure addresses. There are some very compelling reasons to use SQL Azure. Security is one of the big questions that comes up in client discussions. The fact is that hosting business data in Azure is probably more secure than most on-premise solutions. Short of some Hollywood action thriller-style terrorist team coming in to take over one of the Microsoft data centers and holding the staff at gun point while stealing all the hard drives, data theft isn’t very easy and therefore, not very likely. For small scale solutions, SQL Azure is cheap and easy. For larger-scale needs, Azure offers the unique ability to scale on demand, making it an ideal way to service intermittent, extremely high-demand scenarios without building and paying for a clustered data center. Since you only pay for what you use and when you use it, licensing and infrastructure ROI is factored out of the equation until you have a lot of activity.
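Connecting to SQL Azure looks much like connecting to any SQL Server over ODBC, which is part of why adoption is low-friction for existing applications. Here is a small sketch that builds a SQL Azure-style connection string – the server name and credentials are placeholders, and the driver name is an assumption that depends on what is installed on the client.

```python
# Sketch: assembling a SQL Azure connection string. All names and
# credentials below are hypothetical placeholders.
def azure_conn_str(server, database, user, password):
    return (
        "Driver={SQL Server Native Client 10.0};"   # assumed client driver
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={user}@{server};Pwd={password};"
        "Encrypt=yes;"                              # Azure requires encryption
    )

cs = azure_conn_str("myserver", "SalesDb", "appuser", "s3cret")
# A client such as pyodbc could then open it: pyodbc.connect(cs)
# (requires the ODBC driver and an Azure firewall rule for your IP).
```

Note the `user@server` login form and the encrypted connection, both characteristic of SQL Azure.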
Imagine, for example, an on-line retail market site that routinely gets only moderate use. The owner doesn’t need to invest in any on-premise server hardware or software licensing if they use SQL Azure for data services. Because the transaction database doesn’t typically use a lot of storage, the usual cost is low – let’s say it’s only $100 per month during slow times. Now the owner promotes a sale on Black Friday, at the beginning of the Christmas holiday shopping season, and the number of transactions soars by a factor of thousands. A single database server or even a small cluster may not be able to handle this traffic volume but SQL Azure scales on-demand to handle the load. Multiple servers are automatically requisitioned during the shopping day and millions of transactions are performed. Of course the cost for increased usage goes up but then drops the next day when traffic settles back down. The sale is a success and the owner has made no hardware or licensing investment to handle the spike in sales. The same scenario applies to web applications hosted using the Windows Azure service. Whether on-premise or in the cloud, most any application can use SQL Azure for storage and data services with a reliable Internet connection.
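The economics of the scenario above are easy to sketch with back-of-the-envelope numbers. Every figure below is hypothetical – the point is only the shape of the comparison: a pay-per-use spike versus provisioning for the peak all year.

```python
# Hypothetical figures for the Black Friday scenario; only the
# comparison shape matters, not the exact dollar amounts.
baseline_monthly = 100.0    # assumed slow-month SQL Azure bill
spike_day_cost = 2000.0     # assumed cost of one huge shopping day
months = 12

# Pay-per-use: twelve quiet months plus one expensive day.
pay_per_use = baseline_monthly * months + spike_day_cost

# On-premise alternative sized for the peak, paid all year (assumed).
peak_cluster_monthly = 3000.0
on_premise = peak_cluster_monthly * months

assert pay_per_use < on_premise   # the spike is far cheaper than year-round peak capacity
```

Under these assumptions the elastic bill comes to $3,200 for the year against $36,000 for permanently provisioned peak capacity, which is the ROI argument in the paragraph above.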
SQL Azure Reporting
With reporting and BI coming to the cloud, this will become an increasingly attractive offering. Azure Reporting – essentially, SQL Server 2008 R2 Reporting Services hosted through the Windows Azure services – should be available as soon as December 2010. Subscriptions and more advanced management capabilities won’t be part of this offering and you can’t deploy to Azure from Report Builder, but all of the core reporting features will be available, with data sources limited to relational data stored in SQL Azure databases. How does it work? With a Windows Azure account, simply deploy a report from BIDS to a Windows Azure URL and …whamo bamo, it’s there for anyone who has been granted access to view it from anywhere in the galaxy.
Azure Data Market
Previously called Project Dallas, the Azure Data Market enables people to share data streams and other data commodities for free or for purchase. This should be a convenient way to obtain common or specialized sets of data for integration into applications – especially data sets that are common to a lot of applications, like geographic information, consumer surveys, real estate and population growth. https://datamarket.azure.com/
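DataMarket feeds are exposed as OData, which at this point means Atom XML with entity properties in Microsoft’s data-services namespaces. Here is a minimal sketch of pulling property values out of a feed entry – the feed snippet, entity name, and property names are made up for illustration, but the namespaces are the standard OData ones.

```python
# Sketch: extracting records from an OData (Atom) feed entry.
# The feed content below is a fabricated example, not a real dataset.
import xml.etree.ElementTree as ET

M = "{http://schemas.microsoft.com/ado/2007/08/dataservices/metadata}"
D = "{http://schemas.microsoft.com/ado/2007/08/dataservices}"

feed_xml = """<feed xmlns="http://www.w3.org/2005/Atom"
  xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
  xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <entry>
    <content type="application/xml">
      <m:properties>
        <d:City>Portland</d:City>
        <d:Population>583776</d:Population>
      </m:properties>
    </content>
  </entry>
</feed>"""

root = ET.fromstring(feed_xml)
records = []
for props in root.iter(M + "properties"):
    # Strip the d: namespace to get plain property names.
    records.append({el.tag.replace(D, ""): el.text for el in props})

assert records == [{"City": "Portland", "Population": "583776"}]
```

In a real application the XML would come from an authenticated HTTP GET against a DataMarket service URL rather than an inline string.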
Project Crescent is a fully-interactive, browser-based data visualization experience surfaced through interactive, animated Silverlight controls. It’s Reporting Services, only sexier. A Crescent report is designed directly from SharePoint in the web browser. Users connect to data through a BISM model and just select the tables & fields they want to see in different types of report elements. It supports tables, grids, panels and a variety of charts. Data is filtered and sliced by simply clicking on regions and data points. Like PerformancePoint dashboards, report content changes in-place and data regions are synchronized when any one is used to filter the data.
Crescent is designed for the user, not for the IT professional, and doesn’t support expressions or parameters. It’s super easy to use and has a lot of business value right out of the box, but it doesn’t afford the same flexibility and power as professional Reporting Services. Crescent reports are stored in RDLX format, which is an extension of the RDL XML schema. There probably won’t be a migration path from Crescent reports to RDL reports right away but this will likely come in the near future. We’re also likely to see the Silverlight controls extended to RDL reports for no other reason than report designers will demand the same cool and dynamic behaviors.
Another part of the self-service reporting experience in Denali is a new feature, somewhat similar to report subscriptions, called Report Alerts. Like Crescent, this is only available from a SharePoint 2010 Enterprise environment and uses the SharePoint event model rather than the SQL Agent. The concept is that users are notified when the data feeding a report changes and meets some specific criteria. This may be useful when a metric falls below a threshold or target. Plans call for alerts to support both RDL and Crescent style reports.
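To make the concept concrete, here is a small sketch of the alerting idea – this is not the actual Denali/SharePoint API (which is under NDA anyway), just an illustration of a rule that fires when refreshed report data crosses a threshold. All names and figures are invented.

```python
# Sketch of the Report Alerts concept: re-evaluate a rule whenever the
# data behind a report refreshes. Names and thresholds are illustrative.
def evaluate_alert(metric_name, value, threshold, direction="below"):
    """Return an alert message when the rule is met, otherwise None."""
    triggered = value < threshold if direction == "below" else value > threshold
    if triggered:
        return f"ALERT: {metric_name} is {value}, {direction} target {threshold}"
    return None

# A data refresh arrives; the engine checks each user's subscription.
assert evaluate_alert("Monthly Sales", 82_000, 100_000) is not None
assert evaluate_alert("Monthly Sales", 120_000, 100_000) is None
```

The real feature would hang this evaluation off SharePoint’s event model and deliver the notification to the subscribed user, per the description above.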