It was nice to start the day drinking Starbucks in Seattle. It has that feeling of authenticity about it, like drinking Boddingtons in Manchester or getting head-butted in Glasgow. Except, of course, that Boddingtons is no longer brewed in Manchester since the Belgian owners moved production from Strangeways, where it had been for 200 years, to South Wales. That’s why Boddies isn’t advertised as the Cream of Manchester anymore, merely The Cream. Not that I’m bitter (boom boom).
It was equally exciting to walk into the main hall for the first keynote this morning to be hit by an ageing guitar band belting out cover versions. The PA was so loud that the distortion made hearing the between-songs banter difficult, but I think they were called The Deadly Hanjdob Quartet. They finished with a rousing rendition of Rick Astley’s “Never Gonna Give You Up”, which was bad when it was released in 1987, hasn’t aged well and suffers further when butchered by an old bloke sweating in a suit. All this at 0830. Slightly better than an 0530 semi-naked, rain-filled taxi encounter, but not much. Mornings just aren’t for me.
So onto the keynote, and the first hour was a little disappointing. Stephen Elop, president of Microsoft’s Business Division, spoke about Microsoft’s vision for BI.
This started by stating that 10 years ago, BI was a relatively immature discipline.
Eh? Microsoft’s offering (SQL Server 6.5) may have been immature, but the rest of the world was moving on apace. There were multiple ROLAP technologies out there, and several established OLAP vendors such as Essbase, Oracle Express, Holos, Cognos and BusinessObjects – though at the time BO had this crazy idea that pulling data back to the desktop, building a micro-cube on the fly and using that to render reports was the way forward. It was called Desktop OLAP (DOLAP).
I’d only just stopped coughing when he went on to say that Microsoft had transformed the market (fair enough) and with Analysis Services 2008 they had a product which in scalability terms was “virtually unlimited”. Hello? I’m not even going to rise to that one.
So, having let myself get wound up by the marketing pitch, I was knocked sideways by the announcements from Ted Kummert, VP of the Data and Storage Platform Division. First off we had a demo of how the DATAllegro purchase might work with SQL Server. A 24-server shared-nothing SQL Server cluster, with DATAllegro orchestrating things, was demoed against a partitioned 150TB, one-trillion-row fact table database, using some simple Reporting Services reports to return data in seconds. OK, so it was a sanitised demo and the queries were selected to run quickly, but 150TB in SQL Server? Better news for us is that Microsoft is working to select standard hardware for a reference architecture, with the storage aspect covered by EMC. It brings a warm glow to the Galloping Data Architect's cockles – if only I knew the company song.
It was suggested that this would be licensed as a separate SQL Server SKU, and if you’re looking for more information, it’s called Project Madison. Of course, a cynic would ask what the point of this expensive acquisition was if they already have “virtually unlimited” scalability in SSAS 2008 – 150TB cube, anyone?
Last but not least was Microsoft’s new vision with Project Gemini, the “unique vision” that would address the needs of the 80% of users who currently wanted, but didn’t have, access to BI at their desktops.
Unique vision… err, OK. Maybe I misheard Michael Saylor, CEO of MicroStrategy, announce a vision to put “a crystal ball on every desktop” back in 1996. That would be two years before Microsoft released OLAP Server with SQL Server 7 into that “immature” BI market.
We got a demo of Project Gemini from Donald Farmer. This is the Microsoft vision of “self-service BI”, where users load data from varying sources into Gemini using an Excel add-in and then create pivot tables based on the data sourced by Gemini. Results, reports and applications can be published back to SharePoint for use by other users. In his demo, Donald pulled a 20-million-row fact table into Gemini (which sorted and filtered the data almost instantaneously), combined it with an “external” data source, showed the integrated data model created dynamically in the background, created a pivot table report and simple drill-down dimension structures in Excel, and finally formatted and published the results as a reasonably tidy-looking dashboard application back to SharePoint – all in a matter of minutes.
The UI that is created can include dimensions as slicers for basic report filtering, and any selection within a slicer propagates to all pivots and charts on the sheet. Simple drill down is also supported (within the context of your Gemini data set).
Apparently SSAS is the engine doing the work behind the scenes, so there must be some sort of dynamic cube created which attaches itself to the workbook to serve up the data so quickly. It was a blinding demo, typical of Donald, who is clearly excited about the capabilities and power it can bring to the desktop – the 80% of users that Stephen Elop targeted. Microsoft is putting its subprime mortgage on Excel.
The Microsoft view is that it is better for users to go out and get their own data than for the overworked IT department to source it for them and add it to the data warehouse, which would take forever to prioritise and implement.
After a few hours reflecting on this I began to have a few doubts.
The Microsoft BI vision is one of guided analytics and lightweight dashboards, with Excel as the tool of choice for “power users” and Reporting Services becoming more of a user-oriented tool for self-service reporting. I get the impression that Microsoft is targeting Business Objects here, and form is definitely taking precedence over function. ProClarity is sadly dead, and train-of-thought or high-end analytics doesn’t appear to get a look-in.
I’m not sure I want all my users going out and dragging multi-million-row result sets into Excel to work on locally. If I have 500 users in my company, how many copies of the same result set will end up stored in separate Excel reports before things become unmanageable? Repeat that for the whole set of available data. And all of this posted back to SharePoint?
Where is the single version of the truth in this architecture? I’ve just spent 4 years of my life trying to convince users to stop using Excel as a data store and here are Microsoft positively encouraging it. Hell will freeze over before this capability is used responsibly in most organisations.
Is it really too much to ask for external data that is useful to the business to be included in the data warehouse? Surely we have gone past the point where we spend 3 years building monolithic databases that don’t have the flexibility to incorporate new requirements as they are discovered?
I also have severe doubts about having users integrate data at the UI. Again, how many users will have to pull the same external data source into their spreadsheets before that costs more than having it sourced once centrally? Where is the ability to share or reuse that integration? Where do the metadata mappings take place? Because in the real world external reference data isn’t going to exist in the same context as the stuff inside the organisation – someone will have to provide the mapping. I have this horrible vision of Stratature being served up in Excel for users to do this on the desktop. Personal MDM.
The other flaw in the argument is that the IT department is supposed to monitor what the users are up to and identify reports that are being shared by multiple users, have become business critical, or are becoming too large, and bring them back into the IT department. This is obviously a different IT department from the one that was too busy to bring that external data source into the corporate data warehouse in the first place, but which has time to watch chaos unfold on SharePoint.
All the BI-focussed updates will be packaged into a SQL Server update code-named Kilimanjaro. This sits outside the 24–36 month cycle of major SQL Server releases, and the release date given is the first half of 2010.
In other news, there will be an October feature pack featuring Attunity connectors for Oracle and Teradata, plus the release of Report Builder 2.0, which makes several significant enhancements to the usability of the product. These include a rather nifty-looking shared component library that allows users to select components developed by others; any changes made are also highlighted back to the original author, giving them the option of pulling the revised version into their report. All the handling of data sources and fields is done automatically in the background. This functionality comes from Microsoft’s acquisition of 90 Degree Software.
Microsoft have also bought Zoomix, which will appear in the next major release as SQL Server Data Quality Services, part of the SSIS stream. This sounded interesting and fills an obvious gap in the portfolio, but no further details were given today.
So, as I strolled back from the Microsoft BI partner event last night – where it’s worth saying I was in a minority of one with most of these concerns – I reflected on what seems to be the revival of DOLAP as the future of Microsoft BI (I’m really hoping to be proved wrong on this one), the dumbing down of analytical capabilities in the front end, and a world where 150TB data warehouses are fronted with Reporting Services. Time to dig out my old MicroStrategy 4 certification, perhaps?
The really scary thing, though – other than the demographic that walks Seattle’s streets of an evening, which seems to be 30% business people, 30% geeks and 30% tramps and hobos – is that most of this stuff is at least two years away. By then we will be 13 years into Microsoft’s strategy for BI, the competition will have had plenty of time to react to these latest announcements anyway, and there will be a 60% chance that I have been bored or bludgeoned to death on my way back to the hotel.