Please help me develop some useful information for analysis about the Microsoft BI & reporting products you are using. There will be more polls in the future, and I'll share the results with everyone. Questions, suggestions, or feedback? Leave a comment and I'll reply.
It was a proud moment for Fernando Guerrero, the CEO of SolidQ, to chat with the King of Spain and the Spanish Minister of Science and Innovation at a recent conference in Madrid. Mr. Guerrero spoke with His Majesty The King of Spain, Juan Carlos I, about the steady growth of the company and, in particular, SolidQ's investment in the Spanish economy and education. His Majesty was complimentary about the growth and success of the company.
Fernando also spoke with Ms. Cristina Garmendia, the Spanish Minister of Science and Innovation, about recent activities in the SolidQ Spanish division that have caused them to double their headcount in two years at a time when Spain is facing high unemployment.
The company operates a number of programs, including SolidQ University, with faculty members at universities throughout the world. SolidQ Research works with the European Union, the Spanish government, and leading universities on research projects, and assists graduate students in earning their PhD degrees.
Needless to say, everyone at SolidQ shares in the pride of being recognized by such honored national leaders.
The Analysis Services team has been working overtime lately to provide added community support for the traditional UDM-based OLAP engine (not that any of our friends in Building 35 have ever actually worked as little as a 40-hour week during their tenure at Microsoft!). In light of the team's focus on project Gemini (now known as PowerPivot and the forthcoming BI Semantic Model in SQL Server Denali) over the past couple of years, it's very encouraging to see their continued commitment to this staple analytic data platform. Last year they announced the Maestros enterprise-scale training and certification program, with the second round of sessions scheduled this month in Redmond and Madrid.
The latest deliverable is the SQL Server 2008 R2 Analysis Services Operations Guide, a very comprehensive 108-page document that focuses on operational support for Analysis Services database servers and solution environments. The guide was written by Thomas Kejser, John Sirmon, and Denny Lee from the SQL Server Customer Advisory Team (SQLCAT), with contributions from 26 product team members and leading independent industry experts, including Alejandro Leguizamo and Alberto Ferrari from SolidQ. I attended an excellent session at TechEd 2011 in Atlanta last month by Adam Jorgensen on large-scale SSAS operational guidance, where he made reference to the forthcoming guide.
Fresh E-book for Free Download

If you’re looking for a resource on the Data Quality and Master Data Management features in SQL Server 2008 R2, a new book is available. My colleagues Dejan Sarka and Davide Mauri at SolidQ have completed their book on DQ & MDM in SQL Server 2008 R2, and it’s available as a free download. Yep, that’s right, no strings attached.

“Data is the key asset of any company. However, not all data is equally important. In an enterprise, we can always find the key data, such as customer data. This key data is the most important asset of a company. We call this kind of data master data.

If everyone always inserted correct data into a system, there would be no need for proactive constraints or for reactive data cleansing. We could store our data in text files, and maybe the only application we would need would be Notepad. Unfortunately, in real life, things go wrong. A good and suitable data model, like the Relational Model, enforces data integrity through the schema and through constraints. Unfortunately, many developers still do not understand the importance of a good data model. Nevertheless, even with an ideal model, we cannot enforce data quality. Data integrity means that the data is in accordance with our business rules; it does not mean that our data is correct.
This book deals with master data. It explains how we can recognize our master data. It stresses the importance of a good data model for data integrity. It shows how we can find areas of bad or suspicious data. It shows how we can proactively enforce better data quality and make an authoritative master data source through a specialized Master Data Management application. It also shows how we can tackle the problems with duplicate master data and the problems with identity mapping from different databases in order to create a unique representation of the master data.
For all the tasks mentioned in this book, we use the tools that are available in the Microsoft SQL Server 2008 R2 suite. In order to achieve our goal—good quality of our data—nearly every part of the suite turns out to be useful. This is not a beginner’s book. We, the authors, assume that you, the readers, have quite good knowledge of the SQL Server Database Engine, .NET, and other tools from the SQL Server suite.
Achieving good quality of your master data is not an easy task. We hope this book will help you with this task and serve you as a guide for practical work and as a reference manual whenever you have problems with master data.”
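The authors' distinction between data integrity and data quality is worth making concrete. Here is a minimal sketch (using SQLite from Python purely for a self-contained illustration; the same idea applies to CHECK constraints in SQL Server, and the `Customer` table and its values are hypothetical): a schema constraint rejects an impossible birth year, but it happily accepts a plausible-yet-wrong one, which is exactly the gap that data quality and cleansing work must fill.

```python
import sqlite3

# Hypothetical customer table: the CHECK constraint enforces *integrity*
# (birth years must fall in a plausible range), but it cannot verify that
# a stored value is actually *correct* for that customer.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Customer (
        CustomerID INTEGER PRIMARY KEY,
        Name       TEXT NOT NULL,
        BirthYear  INTEGER CHECK (BirthYear BETWEEN 1900 AND 2011)
    )
""")

# Integrity violation: the schema rejects this row outright.
try:
    conn.execute("INSERT INTO Customer VALUES (1, 'Ann Smith', 1850)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)

# Data quality problem: 1975 satisfies the constraint, so the row is
# accepted even if this customer was really born in 1957. No schema
# constraint can catch this; only cleansing against an authoritative
# master data source can.
conn.execute("INSERT INTO Customer VALUES (1, 'Ann Smith', 1975)")
print(conn.execute("SELECT * FROM Customer").fetchall())
```

In other words, the constraint defines the boundary of what the business rules allow; everything inside that boundary still has to be validated against the real world, which is the job of the MDM processes the book describes.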