One of the highlights of Microsoft’s Business Intelligence 2008 Conference in October was the announcement of “Project Gemini”. As Forrester reported, “With its just-announced Project Gemini, Microsoft aims to bring an Excel-based user analytics mashup tool into the core of Microsoft’s BI and data warehousing product portfolio. What is now only in the hands of OLAP data modelers and other highly trained staff will — as Community Technology Previews roll out to public beta testers late next year — become available to any company employee as an in-memory, drag-and-drop, pivot-table-enabled dashboard.”
According to Forrester, Microsoft’s ubiquitous spreadsheet, Excel, is already the most popular front-end program used by business analysts and others who want to analyze and display the results of their BI queries. This announcement of Project Gemini shows that Microsoft wants to accelerate the use of Excel as the ubiquitous front end for business intelligence dashboarding.
So what is Project Gemini? It’s an Excel add-in planned to ship with Kilimanjaro (a business-intelligence-focused release of SQL Server) that incorporates column-based in-memory business intelligence.
If you’d like to hear how Microsoft explained it, let’s go back in time to the Microsoft Business Intelligence 2008 Conference keynote speeches.
To view Donald Farmer introducing Project Gemini, fast forward to about 1 hour and 26 minutes. Also worth watching is the “BI Fairy Tale” at 1 hour and 17 minutes.
But, what is the real impact of Project Gemini for us Dashboard Spies? Well, thanks to the good folks at SiSense, makers of a business intelligence tool called Prism, we have this contribution of an article that makes sense of Project Gemini.
Thanks goes out to Roni Floman and Eldad Farkash for this article:
What does Microsoft’s project Gemini announcement mean for the business intelligence world?
By Eldad Farkash
Microsoft announced Project Gemini on October 6th, 2008. In essence, Project Gemini adds to Excel the ability to perform column-based in-memory business intelligence without much of the terminology today’s BI consultants need to master.
Why does in-memory matter when it applies to business intelligence?
Traditional business intelligence solutions are OLAP-centric. OLAP was developed to rapidly answer multidimensional analytical queries (paraphrasing Nigel Pendse). Since disks aren’t that quick, OLAP pre-computes and aggregates the data.
OLAP’s pre-calculation and pre-aggregation of business intelligence metrics enabled the early success of business intelligence and powered its growth. It is what made business intelligence so important for the large corporations that could afford to maintain it. It also points at the weak spot: OLAP creates a lot of new data, requires a data warehouse and involves long projects. That is another way of saying a high total cost of ownership.
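To see why pre-aggregation "creates a lot of new data", here is a minimal sketch of the cube idea (a toy illustration, not any vendor's actual engine): every combination of dimension values, including the "ALL" roll-ups, is computed ahead of time so that a query becomes a lookup.

```python
from itertools import product

# Toy fact data: (region, product, month, sales amount)
facts = [
    ("EU", "widget", "Jan", 100),
    ("EU", "gadget", "Jan", 50),
    ("US", "widget", "Feb", 75),
]

# Pre-aggregate every combination of dimension values ("ALL" = rolled up),
# trading storage and load time for query speed, as an OLAP cube does.
cube = {}
for region, prod, month, amount in facts:
    for r, p, m in product((region, "ALL"), (prod, "ALL"), (month, "ALL")):
        cube[(r, p, m)] = cube.get((r, p, m), 0) + amount

# A query is now a single dictionary lookup instead of a table scan.
print(cube[("EU", "ALL", "Jan")])  # 150
print(len(cube))                   # 18 aggregated cells from only 3 input rows
```

Even this three-row example balloons into 18 pre-computed cells; with real dimension counts the explosion is what drives the data warehouse, the long projects and the cost.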
This is where in-memory business intelligence comes in. It takes advantage of column-based data structures (as opposed to row-based tables or pre-aggregated cubes), and uses already available, super-fast RAM to aggregate and calculate millions of cells on a regular desktop (or on any cheap data server, for that matter), without the long-running table scans and indexing techniques required by traditional OLAP and OLTP systems. Coupled with modern, intuitive user interfaces, it lets users slice, dice and filter data in a way that is easy to learn.
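The columnar alternative can be sketched just as briefly (again a toy illustration, not Gemini's actual engine): each attribute lives in its own in-memory array, and an aggregation scans only the columns it needs, computed on the fly rather than pre-built.

```python
from array import array

# Columnar layout: each attribute is its own contiguous in-memory array.
region_col = ["EU", "EU", "US", "EU", "US"]
product_col = ["widget", "gadget", "widget", "widget", "gadget"]
sales_col = array("d", [100.0, 50.0, 75.0, 25.0, 60.0])

# "Slice and dice" on demand: filter on two columns, sum a third.
# No pre-built cube, no index; just a scan of in-memory columns.
eu_widget_sales = sum(
    s for r, p, s in zip(region_col, product_col, sales_col)
    if r == "EU" and p == "widget"
)
print(eu_widget_sales)  # 125.0
```

Nothing is materialized ahead of time, so there is no data explosion and no warehouse-loading step; RAM speed is what makes the on-the-fly scan acceptable.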
And now what?
Even with in-memory business intelligence, product pricing and project implementation still resemble the “old” days of OLAP. Why is that? Because business intelligence still behaves like a company-specific application that needs to be developed, customized and then implemented in a lengthy process. Data warehousing and data formatting still happen, and take weeks. Structures are still rigidly defined.
This is exactly where the business intelligence industry needs to go today. The business intelligence application will simply connect to the data source, whether cloud-based or internal, pull the data and begin working. Interfaces will let business users use what they already know: the intuitive formula representation of Excel with the strengths of business intelligence. In the future, in-memory can certainly stand up to the next challenge: mash-ups of structured and unstructured data. And pricing and deployment will look much like the pricing and deployment of the productivity software installed on most desktops. In the long run, Microsoft’s entry may mean just that.
What does Microsoft’s entry mean?
Some laud Microsoft for the Gemini approach (don’t hold your breath; it still isn’t due for some time). Some say they had no choice. Some dislike it, reasoning “garbage in, garbage out”: if a company hasn’t invested in OLAP and cleansed and ordered its data, then no effort to analyze it, even with the latest and greatest technology, will produce reliable results.
In many ways, this debate resembles the debate on how Excel is used internally as a de facto business intelligence replacement. It is a worthwhile debate, but moot, given how people already use Excel for dashboards, analysis and charting. That’s why leveraging Excel’s ubiquity to enter and dominate self-service BI is always grounds for heated debate.
All the open questions will be answered in two years, in the next Office release. Only then will companies begin the work to enable the Project Gemini vision.
What Project Gemini really means is that Microsoft is enabling a market, very much like what it did for the OLAP market when it began playing there. Once Microsoft entered, OLAP grew from a small, specialized industry to the top-selling enterprise segment. None of the large business intelligence players would be where they are without Microsoft elevating the industry. The move will be a huge push towards pervasive BI. It also means that the legitimate questions raised about versions of the truth, metadata and data cleansing will need to be answered. This is how technology evolves: in fits and starts.
The Gemini announcement will rejuvenate business intelligence, mainly for small and mid-sized businesses. More users will specifically ask for in-memory, column-based technologies as they realize the potential savings. The business intelligence industry will become stronger, not weaker.
More importantly, this may return the desktop to the center stage.
This will also shift the focus and value of the IT budget earmarked for business intelligence. In a nutshell, cheap PCs will start to calculate what today’s blade servers can’t. This may bring on the 64-bit revolution: companies will install 64-bit Windows on PCs by default just to support this Office feature (others have also hypothesized that Gemini is meant to guarantee Office upgrades). The price is the same; the upgrade in productivity, priceless.
By Eldad Farkash, Founder and CEO, SiSense (http://www.sisense.com)
SiSense is a provider of in-memory desktop business intelligence solutions. Priced at tens of dollars per user per month (and with a free version), SiSense Prism promises fast and easy dashboards and analytics, connecting to SQL Server, MySQL, Excel, MS Access, Analysis Services and other data sources, without IT resources, data warehousing, programming or scripting.
Tags: SiSense Dashboards, Project Gemini Excel, Microsoft dashboards, Excel dashboard
For more on Project Gemini, please see this great post titled “Project Gemini – building models and analyzing data from Excel”. It’s from the MS SQL Server Development Customer Advisory Team and features the wonderful phrase WYSIWA (What-You-See-Is-What-You-Analyze).