Sharing my experiences with the Microsoft data platform since 2009: SQL Server BI, Data Modeling, SSAS Design, Power Pivot, Power BI, SSRS Advanced Design, Dashboards & Visualization
Please join my associates and me for an all-day SQL Server Upgrade workshop on November 3rd.
If you are planning an upgrade to a newer version of SQL Server, you won’t want to miss this all-day, preconference workshop.
John Martin, Sr. Technology Evangelist from Microsoft, will spend the morning discussing migration planning from SQL Server 2005.
In the afternoon, SolidQ mentors Ron Talmage, Richard Waymire, Jim Miller and Paul Turley will talk about and demonstrate upgrading specific workloads (projects and features) from older product versions to newer product versions. We will introduce the comprehensive SQL Server 2014 Upgrade Whitepaper that we recently wrote for publication by Microsoft.
In addition to upgrading specifically from SQL Server 2005 to SQL Server 2014, we will also talk about upgrading from SQL Server 2008 and 2008 R2 to SQL Server 2012 and 2014.
An upgrade or migration is far more than just moving a database or installing a new version of SQL Server. So much has changed since SQL Server 2005 arrived that a successful upgrade project requires many tasks beyond the installation itself.
This session will guide you through the process, looking not only at the technology but also at the methodology, processes and tools at our disposal, so that when we move from SQL Server 2005 to 2012/2014, or to the cloud with SQL Server in an Azure VM or Azure SQL Database, we can be confident of success. We will take a special look at workload-specific upgrade needs for OLTP, HA, SSAS/SSRS, and SSIS environments.
In my afternoon section, I will demonstrate the capabilities of SSAS Tabular models and discuss upgrading and migrating Reporting Services from 2005, 2008 and 2008R2 to 2012 and 2014.
I started shopping for a wearable fitness band several months ago. I looked at several and was most impressed with the BASIS Band concept. It was different because it combined a device with advanced tracking capabilities and a cloud-based service that could be expanded and improved with only server-side upgrades. What a great business model, I thought. Reading the forums, I could see that the BASIS customer service folks were very responsive to feedback and feature requests. A lot of people were also very excited to see this service expand and were eagerly asking for more features and capabilities that could be added to the back-end service or software. BASIS didn't yet have integration with popular fitness apps, but that was coming. You couldn't yet download your data to analyze it yourself but, given their centralized approach to synchronizing the device and managing all this data, it would only be a matter of time before they added this capability. They offered an iPhone and Android app but nothing for Windows or Windows Phone; again, though, a simple API would enable the developer community to do some great things. Since the BASIS collected more sensor data than most other devices, I was excited at the prospect of being able to analyze my own fitness data with my own tools. This just seemed like a slam dunk with so much opportunity. I could just imagine being able to chart and trend my heart rate, perspiration and calories during and after workouts, over weeks and months of progress.
Back in June when I started shopping, I could go to the Nike Store and look at their bands. The JawBone and FitBit bands were on display everywhere, and there were many others that I could touch and look at, but the BASIS Band wasn't on display anywhere; it was only available online. Given the underdog brand contender's advertising claims, I reasoned that the device itself must be just as good as the competitors', so I took the plunge and ordered a BASIS Carbon Steel directly from the company for $199. Shortly after my purchase, the company announced that they had been purchased by Intel. Wow! How cool was it that this small start-up was now part of one of the most innovative technology companies on the planet? Great things must be on the way! I felt a little slighted when they lowered the retail price of the band I had just purchased to $149, but with all the upgrades coming, I could live with that.
After nearly four months as a BASIS customer, this has been one of the most disappointing investments I have ever made. They were so close to getting it right by listening and responding to their customers, but they just didn't do it. Their support forum is full of questions asking why this company has ignored such passionate customers. I've bought a lot of junk in my life, but the BASIS is something that COULD HAVE BEEN GREAT but isn't. If you're considering the BASIS, take some time to read through their Community Forums and the comments from customers. I've posted a few comments and suggestions over the past few months, like this one. I've also asked the customer service people, who always respond very quickly with answers like "Thank you so much. We take your feedback seriously but can't respond with any specific information…", to have company leaders or developers who can actually answer these questions respond directly.
I had so much hope for BASIS to be cool and useful for me and many others, and it could be, in my opinion, if the company would simply listen to their customers and respond to them. They would earn the respect of many technologists if they would release an API so the software developer community could build apps and integrations, along with the ability to download, export or feed a user's personal fitness data to an app or tool of their choosing. Until they do that, BASIS has very limited use for me.
PEAK Next Generation Band
Will the next generation PEAK be better than the B1 or Carbon Steel? Of course it will, but when you buy a BASIS, you are investing in a complete service, and the whole thing has to work together for you to get any value from it. Here's a good article comparing the new Peak with previous models. I expect the Peak band to address some of the shortcomings of the device, but the biggest shortfall of Basis is their limited service and lack of integration. If they can fix that, and repair the damage already done to their reputation by waiting so long, they could have a good product.
What Exactly Is BASIS?
Basis really consists of three different components: the band itself, the synchronization software and the web-based service. All three must be used or the band is useless. In my opinion there should be four components, including an API or web service to allow developers to integrate Basis with their own applications. This is so obvious, in my opinion, that I have included it as one of the grading criteria in my score below.
The Carbon Steel band is bulky but bulky men’s watches are fashionable so that didn’t bother me. The LCD display is dim and very difficult to read. It feels like an LCD watch from the 1980s. The backlight stays on for a short time and it can’t be read during an outdoor run or bike ride.
The Basis Sync application that runs on the desktop looks like a Windows application written ten years ago in Visual Basic 6 in about two hours. It's light on features, and synchronization fails about a third of the time. Unplug the device and plug it back in, close the app and re-open it, or wait and try again later, and it will eventually synchronize with the Basis cloud service.
The online service and web site are quite functional. The site is attractive, and the analytic graphics and charting are actually quite impressive. I featured both the daily fitness statistics time line and sleep analysis in this blog post: https://sqlserverbi.blog/2014/07/15/nascar-driver-kurt-buschs-basis-bio-stats/. Basis also sends alerts and reminders. They have a goal-tracking and point system to keep you motivated. All of that is well-designed and works well if you want to use your device in this way.
Where the service breaks down is if you want to analyze more than a day's worth of stats. You just can't. Period. Since you can't download or integrate any of "your" data, you can't use it outside of the few features offered by Basis. In short, it's not really "your data" to analyze.
The accuracy of the device is constantly under scrutiny. Basis has a lot of monitoring capabilities and tracks heart rate, perspiration, skin temperature and movement. With this, it analyzes different activities, sleep quality and calories. I have not measured its accuracy against other devices, so I don't have a strong opinion. I can say that it has worked well for me and, from my reading, Basis seems to work as well as or better than most competitive devices. I would rate the functionality of the device quite high, but the display is very poor in my opinion.
Setting the time and time zone? You can't. There are no settings at all on the device. It sets the time when it synchronizes, and you cannot change the time, date or time zone without changing your online profile. This makes the band useless as a watch when you travel.
Alarms, calendar, reminders – None. This $200 “sophisticated” device lacks all of these common features of a $20 Casio watch.
Grading Basis
Device – B
Synchronization software – D
Web site, on-line analysis and daily trending – B
Integration & API support – F
Overall grade: D+
Potential: Could have been an A. Now a C if they get their act together.
Requirement: Allow users to enter comments after they review results from a Power Pivot model and then show the comment text in the Pivot Table report results. Here's a quick example of the final result. My example uses Flintstones characters in place of employee or customer names and a simple measure value. After reviewing the results, the user moves to another worksheet and enters a comment for the date and person with an expiration date for the comment. After refreshing the report, each cell with a comment is indicated with a little "dog ear" and the comment is displayed by clicking the cell.
In practice, the data model could be in a server-hosted SSAS Tabular model, an Office 365/Power BI or SharePoint-hosted Power Pivot workbook, or a desktop Excel file. In one real-world application, if entries don't meet company standards, the person reviewing the data entry can enter comments about late entries and other policy violations. A comment has an expiration date, and in the production solution, the expiration also depends on a calculated "severity level" which can be baked into the expiration logic.
My first thought when considering the need to incorporate text strings into a semantic model was that it might not be a good idea to store large text strings in a Power Pivot or Tabular model. After all, the tabular model technology is really optimized to handle numeric and redundant data. After carefully talking through the requirements, we're pretty comfortable that the number of columns will be fairly low and the length of the comment strings should be conservative. The VertiPaq/VertiScan engine is actually pretty forgiving about having "stuff" added to a model that it will ignore except when it's being used in a calculation or query. So, that's good: my comment rows shouldn't get in the way of core features like measure aggregation and dimension navigation. I'm really curious to find out how viable this is in a larger-scale application, so if you end up doing this with a lot of data or have the opportunity to test the limits of this technique, please share your results.
With that in mind, I took the plunge and built a simple proof-of-concept solution, which is what I'm demonstrating here. If the storage and query engine are OK coexisting with these new column fields, the big question that remains is how the query engine will react when we start to send it some unconventional long string results. Will these long values just get crunched through the query engine expressions along with the usual measures without making a ruckus?
Now remember that this is an über-simplified model and that the real one contains about 30 tables, but the major components are the same. The main fact table (tbl_fact) contains keys related to some outside tables and a measure called "Sum of Some Value" which aggregates the SomeValue column for every Person in a pivot table report. So, here's the solution:
The Comments table (tblComments, which I'll call "Comments") is essentially a fact table related to Dates and People:
The source for the Comments table is a worksheet/table. Here are some of the comments:
In the production solution, we want the comment to show up until the expiration date. Rather than waiting for the system date to change, I’ve added a table of EffectiveDate values to use as a slicer for prototyping. The DAX measure expressions are added to the calculation area in the Comment table.
I'll start with a simple version of the CurrentComment measure before adding the expiration logic:
Current Comment:=IF(HASONEVALUE(tbl_Comments[Comment]), VALUES(tbl_Comments[Comment]))
It's important to test the comment column for only one value using the HASONEVALUE function to avoid an error. The VALUES function just returns the column value as a string. Now, I'll add the expiration logic, which uses the first measure:
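The expiration measure appeared as a screen capture in the original post, so what follows is only a rough sketch of what that logic might look like. I'm assuming the Comments table has an expiration date column, which I'm calling tbl_Comments[ExpirationDate] here (that name is my guess, not from the post), and the prototype compares it to the selected EffectiveDate slicer value:

Comment Until Expiration:=IF( HASONEVALUE( tbl_Comments[Comment] ) && FIRSTDATE( tblEffectivedate[EffectiveDate] ) <= FIRSTDATE( tbl_Comments[ExpirationDate] ), [Current Comment] )

When the test fails, the IF returns BLANK, so expired comments simply disappear from the hidden column that feeds the Excel comments.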
Once this is all tested and working, we can just substitute "TODAY()" in place of "FIRSTDATE( tblEffectivedate[EffectiveDate] )" to use the system date.
Now, to add the Excel comments. This is the fun part (as if it hasn't been exciting enough thus far!).
Add a pivot table to a new sheet named "Report". Add Dates and People on rows and the "Sum of Some Value" measure on columns (which actually adds it to VALUES). Add the "Comment Until Expiration" measure to columns as well. Select the last column added to the pivot table and hide it. In my example, this is column D.
You’ll need to have macros enabled and trust VBA in Options > Macro Settings.
Use Alt+F11 to open the Visual Basic for Applications editor and enter the following code into a code module (usually Modules > Module1):
'***********************************************************
' Paul Turley, 9-22-14
' Dynamically add comments to Pivot Table value cells based
' on an adjacent hidden column.
'***********************************************************

Dim ws As Worksheet
Dim pt As PivotTable

Const iPTRowOffSet As Integer = 3         ' Pivot Table starts on this row
Const iPTHeaderRows As Integer = 2        ' Number of Pivot Table header rows
Const iPTClearSafetyRows As Integer = 100 ' Number of rows beyond the last PT row that get cleared (in case the filtered table shrinks by a large number of rows)

Dim iRows As Integer
Dim iRow As Integer
Dim CommentRng As Range
Dim ValueRange As Range

Sub SetObjects()
    Set ws = Worksheets("Report")
    Set pt = ws.PivotTables(1)
End Sub

Sub btnCreateComments()
    SetObjects
    ClearAllComments
    CreateComments
End Sub

Sub CreateComments()
    For iRow = (iPTRowOffSet + iPTHeaderRows) To pt.RowRange.Rows.Count + iPTRowOffSet
        Set CommentRng = pt.ColumnRange(iRow, 2)
        If CommentRng.Cells(0).Value <> "" And iRow >= iPTRowOffSet Then
            Set ValueRange = Worksheets("Report").Cells(iRow + 1, 3)
            If ValueRange.Comment Is Nothing Then
                ValueRange.AddComment (CommentRng.Cells(0).Value)
            End If
            ValueRange.Comment.Visible = False
        End If
    Next
End Sub

Sub ClearAllComments()
    SetObjects
    For iRow = (iPTRowOffSet + iPTHeaderRows) To (pt.RowRange.Rows.Count + iPTRowOffSet + iPTClearSafetyRows)
        Set ValueRange = Worksheets("Report").Cells(iRow + 1, 3)
        If Not ValueRange.Comment Is Nothing Then ValueRange.Comment.Delete
    Next
End Sub
Note that there are some assumptions made in this code. I don’t think the pivot table needs to start at cell B3 but that’s where mine is. Note the constants at the top that are used to skip the header cells. These values will need to be adjusted if you make changes.
Next, open the code module for the workbook (named ThisWorkbook) and add the following code for the SheetCalculate event (three lines added between the generated Private Sub & End Sub lines):
Private Sub Workbook_SheetCalculate(ByVal Sh As Object)
    SetObjects
    ClearAllComments
    CreateComments
End Sub
Now, let the magic happen. Every time the sheet is calculated (this event fires every time the workbook is connected, filtered, sliced or the pivot table is changed), the code iterates through the measure column cells to remove all the Excel comments, and then iterates through again to add new Excel comments using the values in the adjacent hidden column. The hidden column contains the calculated "Comment Until Expiration" measure text values.
You can add or remove comment text in the Comments sheet, move to the report sheet and use Data > Refresh to update the model. Use slicers to change the EffectiveDate and filters to add or remove row values. With each action, you should see comments come and go. To view a comment, click the cell and hover the mouse pointer over the dog-ear in the upper-right corner of the cell. There are more comment options on the Review ribbon. Make sure you save this as a macro-enabled workbook (.xlsm) file.
As always, I’m interested in feedback, thoughts and suggestions.
Thank you to the Microsoft MVP Program for awarding me MVP for six years running! I love this program and look forward to another year of great community events and opportunities to advise and mentor.
This year I had the privilege to speak at numerous SQL Saturdays, conferences and user groups, help develop a certification program for the University of Washington, contribute to books and papers, moderate forums, test new products and advise Microsoft product teams, direct the Oregon SQL PASS Chapter, and work with consulting clients to develop database and BI solutions using cutting-edge technologies. I’m planning the Oregon SQL Saturday this year and will be co-presenting a pre-conference at the PASS Global Summit.
I’m always looking for new opportunities to support the SQL Server and Microsoft Business Intelligence community. If you’re looking for a speaker, writer or need some help, please contact me.