Wednesday, January 27, 2010

Business Dashboard Design and Presentation with Microsoft Business Intelligence and Silverlight

Most BI folks invest all their energy, focus, and time in topics like architecture, performance, and design, and by no means would I negate that; it is vital for any project. Still, one thing I personally feel is lacking in the overall process is an emphasis on presentation.

It has been my experience that even if the performance of a data warehouse is a bit on the lower end (say, 7% to 10% below expectations), a nice dashboard presentation can take away much of the pain users would otherwise feel from such issues. I am not suggesting this as an excuse for poor performance; I am emphasizing that presentation is a very vital part of the overall project delivery, which applies equally to a data warehousing project where a dashboard is part of the deliverable.

To the best of my knowledge, with the kind of charts, bars, and graphs now available in PerformancePoint Services, it has evolved considerably from the earlier version. Personally, though, it still does not fully satisfy me; Silverlight-based dashboards, on the other hand, have started to win me over. After Dundas Dashboards, which I described in a post a few weeks back, here is another example of a Silverlight-based dashboard, and this one too looks spectacular.

The example I am going to describe uses something called "RadControls for Silverlight". These controls use Silverlight, and the visualizations are quite impressive. They come from a company called Telerik Corporation, which appears to be a components and controls provider for .NET-based applications. The example in question is called the Sales Dashboard Demo, and a whitepaper is also available for it.

Though these controls are not designed to work directly with a data warehouse or with technologies like cubes and KPIs, it should definitely be possible to use them where a dashboard is created using SSRS. In my opinion, in the next release or service pack, Microsoft should come up with some Silverlight-based user-interface controls that can be used on the dashboarding side of Business Intelligence. Whether that happens or not, Silverlight is already growing popular as an interactive and intelligent user-interface rendering mechanism.

Monday, January 25, 2010

Performance of Kimball Method Slowly Changing Dimension Component v1.5

I just read a post by Todd McDermid on the Kimball Method Slowly Changing Dimension component v1.5 release. Todd gives a very nice explanation of the internals of how this component works. The performance is claimed to be 100 times faster than the built-in SCD transform, and the reason, I feel, is a particular design choice made for this component. I will not summarize the features of this new version, as I cannot do a better job than Todd has done in his post, but my interest is in that design choice.

After reading the post, I realized that this component uses a cached-lookup style of implementation internally, which makes it perform like a bullet but also makes it a heavy memory consumer. So, theoretically, if it is executed on a data warehouse or ETL server where other ETL operations are running in parallel, one can expect a sudden dip in available memory when this component starts processing a huge SCD. I by no means intend to suggest that one should not use this component; the point I am making is that it should be used with a little caution. It is like boarding a supersonic carrier: reaching space is guaranteed to be faster than in any regular aircraft, but be ready to burn a huge fuel tank of thrust for the minutes it takes to get there.
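Conceptually, a cached lookup trades memory for speed: the whole dimension is pulled into memory once, and each incoming row is matched against that in-memory index instead of issuing one database round-trip per row. A minimal Python sketch of the idea (the function and column names here are my own illustration, not the component's actual internals):

```python
# Illustrative sketch of a full-cache lookup for SCD change detection.
def build_cache(dimension_rows):
    """Index current dimension rows by business key -- this is the part
    that eats RAM in proportion to the dimension's size."""
    return {row["bk"]: row for row in dimension_rows}

def detect_changes(incoming_rows, cache, tracked_attrs):
    """Classify each incoming row as a new member or a changed member."""
    new, changed = [], []
    for row in incoming_rows:
        current = cache.get(row["bk"])       # one dict lookup per row
        if current is None:
            new.append(row)
        elif any(row[a] != current[a] for a in tracked_attrs):
            changed.append(row)              # would trigger a Type 2 expiry + insert
    return new, changed

dimension = [{"bk": 1, "city": "Pune"}, {"bk": 2, "city": "Mumbai"}]
cache = build_cache(dimension)
new, changed = detect_changes(
    [{"bk": 2, "city": "Delhi"}, {"bk": 3, "city": "Chennai"}],
    cache, ["city"])
```

The speed comes from never touching the database during the scan; the caution in the paragraph above is exactly that `cache` must hold the entire dimension for the duration of the load.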

Considering all the advantages it brings to the table compared to the regular SSIS SCD component, and considering the standard RAM size I have mostly seen on ETL servers (normally 8 GB), this component is a must-go selection. Read Todd's entire post to understand the various features of this component, along with a video tutorial on the same.

Friday, January 22, 2010

Excel in the browser (Excel Web App), PowerPivot and SharePoint

I was reading a post on Andrew Fryer's blog about PowerPivot management, and its first sentence said: "The most important thing about PowerPivot is the ability to share users' analysis into SharePoint so that other users can slice and dice data from within a browser."

For those who are not aware, Microsoft Office 2010 has come out with a neat new feature that can be thought of as a nano version of Microsoft SharePoint, but it brings a very important piece of functionality to the table. The application is called Excel Web App: a lightweight Excel client that lets users collaborate on Excel workbooks from a browser, more or less like SharePoint. Keep in mind that while multiple users can work on a workbook together, one should not assume it has all the features available for worksheets hosted on SharePoint; obviously it will not, as those are SharePoint features, not Excel features.

The point I am trying to make is that if PowerPivot could be used in conjunction with Excel Web App, the need for SharePoint would be reduced, helping small organizations still get the minimal required features: simultaneous collaboration plus the analysis capability of PowerPivot. I do not think PowerPivot is supported in this application as of this draft; I read in the comments on the Excel Web App overview post that "Excel Web App cannot run add-ins built for the Excel client app". Through the course of evolution, or through some workaround, it could effectively replace SharePoint in organizations where collaboration on Excel worksheets is the most-used SharePoint feature, which would cut huge costs.

I am no expert at Excel Web App, but having worked with Excel Services on SharePoint Server, and having read the Excel Web App overview post (do not miss the comments on it) and the Excel Team's post on collaborative editing using Excel Web App, I am quite sure the point I am making is possible; a practitioner or the Excel team can confirm it. I am happy to learn from anyone out there with a different view. I have posted a question about this to the Excel team; check the Excel Web App overview post for their answer.

Sunday, January 17, 2010

SQL Server 2008 R2 Editions

SQL Server 2008 R2 comes with some unique editions that have not been seen historically. The major categories are the Datacenter edition, Parallel Data Warehouse, the Enterprise edition, and the Standard and lower editions.

Datacenter: the name itself implies the probable use of this edition. Parallel Data Warehouse is an interesting edition, called an appliance-based solution; in simple terms, you need to buy hardware along with it. I discussed this concept in my previous post. Pricing of the two editions is the same: in fact, it is $57,498 per processor for both, as can be read on this page.

Another interesting difference between the Enterprise and Standard editions is that Master Data Services and PowerPivot for SharePoint are available only in the Enterprise edition. On top of that, Master Data Services works only on 64-bit editions, and SharePoint Server 2010 likewise runs only on 64-bit hardware and architecture. In summary, if you are planning to buy the Enterprise edition of SQL Server 2008 R2, it is better to use 64-bit hardware and OS architecture, as most of the features it offers are available only on 64-bit and not x86.

For a higher-level view of the differences, read this datasheet, which gives a clear picture of them.

Sunday, January 10, 2010

Data Analysis using Data Mining and Microsoft Silverlight

My first post of this year was on the technologies that would be the focus of this year and would affect Microsoft Business Intelligence. One of those was Silverlight applications that integrate with MS BI deliverables, mostly with the reporting and/or dashboarding modules.

While browsing the web, I stumbled upon a very interesting web-based service that provides data mining analysis capabilities, and the visualizations are too good to believe. The interesting point is that these visualizations are built with Microsoft Silverlight. As I stated in earlier posts, we have not yet seen bright out-of-the-box Silverlight-based tools packaged with the Microsoft Business Intelligence stack, but independent vendors have already found excellent uses for these technologies.

The only downside of this service is that it provides analysis only over the web, in the form of a web service, and one would not want to send an entire data warehouse over the web or analyze data of that volume through a web service. Check out this fantastic product, called Data Applied; below is one of its fantastic visualizations.

Friday, January 08, 2010

SQL Server 2008 R2 Parallel Data Warehouse (formerly known as Project Madison)


SQL Server 2008 R2 will be released in a flavor never heard of before: SQL Server 2008 R2 Parallel Data Warehouse, earlier known as Project Madison. This flavor of SQL Server 2008 is unique and the first of its kind. Some of the points that make it unique are as follows:

1) This version of SQL Server cannot be bought as an independent piece of software; it has to be bought along with the hardware.

2) The hardware generally consists of one control node plus a set of compute nodes (three at minimum). The control node manages requests and routes them to the compute nodes. Licensing for the SQL Server installation on each node is per CPU, which means a minimum of four licenses would have to be procured.

3) To the best of my knowledge, I read somewhere that the SQL Server 2008 R2 Parallel Data Warehouse edition would cost upwards of $57,000 per processor, so consider that price across at least the three compute-node CPUs. The rest of the software, integration, and control would cost extra, as would the installation and consulting fees needed to maintain the setup.
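As a back-of-the-envelope illustration of the points above (my own arithmetic, assuming the $57,498 per-processor figure from the pricing page and one CPU per node, which real appliances may well exceed):

```python
# Rough license cost for a minimal appliance:
# 1 control node + 3 compute nodes, licensed per CPU.
price_per_cpu = 57_498          # USD, per-processor list price (assumed)
control_nodes, compute_nodes = 1, 3
cpus_per_node = 1               # assumption; actual nodes may have more

licenses = (control_nodes + compute_nodes) * cpus_per_node
license_cost = licenses * price_per_cpu
print(licenses, license_cost)   # 4 licenses, $229,992 before hardware,
                                # integration, and consulting fees
```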

Clustering brings concurrency to a system and spreads load, but it cannot reduce the time a single query takes in the absence of resource latency. To break that barrier, parallelism is required, so that bits of the same request execute simultaneously, and that is exactly what this setup brings to the table. SQL Server can already run queries in parallel on a single server, but it will be interesting to see how parallelism is brought to an entire data warehouse; the claim is that queries which take hours come down to as little as minutes. Massively parallel processing (MPP) is claimed to be achieved through this implementation, and the architecture is hub-and-spoke.
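The scatter-gather idea behind MPP can be sketched in a few lines of Python (a toy illustration of the principle, not how PDW is actually implemented): the control node fans one request out across compute nodes, each node scans only its own partition of the data, and the partial results are merged at the hub.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy MPP scatter-gather: each "compute node" holds one partition
# of the fact rows as (region, sales) pairs.
partitions = [
    [("east", 100), ("west", 50)],   # node 1's partition
    [("east", 25), ("north", 75)],   # node 2's partition
    [("west", 10), ("north", 5)],    # node 3's partition
]

def node_aggregate(rows):
    """Runs on one compute node: aggregate only the local partition."""
    totals = {}
    for region, sales in rows:
        totals[region] = totals.get(region, 0) + sales
    return totals

def control_node(query_partitions):
    """Scatter the work to all nodes, then gather and merge partials."""
    merged = {}
    with ThreadPoolExecutor() as pool:
        for partial in pool.map(node_aggregate, query_partitions):
            for region, subtotal in partial.items():
                merged[region] = merged.get(region, 0) + subtotal
    return merged

result = control_node(partitions)
```

Because each node scans a third of the data, the scan phase finishes in roughly a third of the single-server time, which is the source of the hours-to-minutes claim.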

By partnering with vendors like HP and IBM, some of the leading providers of hardware for different kinds of data warehouse setups, Microsoft has created a unique sales package that generates business for itself and its partners. Solution-provider organizations can use the same idea by partnering with local or international hardware vendors and putting together a sales package that bundles end-to-end data warehouse development with a specialized hardware setup. Fast Track Data Warehouse is a nice starter architecture from which solution providers can begin building a similar offering.

More information on this setup can be read in this datasheet.

Monday, January 04, 2010

What is Microsoft Codename Dallas and how is it associated with Microsoft Business Intelligence ?

I would term Dallas a new potential in the making for content-as-a-service on the cloud computing platform. Here is how to try out a hands-on experience with Dallas. Before planning or expecting anything from this platform, keep in mind that it is still in the alpha phase, not even in beta, so expect to get a feel of what it is going to offer rather than a full-fledged working product.

Firstly, to get access to Dallas, one needs access to the Dallas Developer Portal. It is not yet in an open-for-all phase where anyone can sign up; as of now, access to Dallas is by invitation. Go to the portal and request your invitation code (it is just a matter of sending an email requesting one); generally you should receive the code the very next day.

Log on to the portal, and you will be provided with your account key. Save this key securely somewhere, though it remains available on your account home page on the portal. This key is important, as it is used as a kind of GUID for billing and reporting. The question now is: billing what, and reporting what? That is the next part of Dallas.

As I understand and anticipate it, Dallas will offer content in the form of structured datasets from many content providers, delivered as a service. These content providers are businesses that supply their respective content, which can be images, text, or any blob-like data, to their subscribers. Using Dallas, one can subscribe to the service of one's choice.

Presently the portal provides a "Catalog" tab showing a list of service providers to which we can subscribe; subscription is free for the trial period. After you have subscribed, another tab called "Subscriptions" lists your subscriptions. Under each one you will find a link that says "Click here to explore the dataset". Clicking it takes you to the Service Explorer page, which provides a set of parameters for specifying your criteria and a few action buttons to view the data, much like accessing or testing an invoked web service. You can preview the data as well as download it to analyze with PowerPivot (provided it is installed on your machine). The dataset the provider returns is limited to 100 records, the limit set for the trial. One nice thing is that the portal offers a ready-to-download service class for each service, so developers can just add a reference to it in their applications, and the business can start accessing the data in its own hosted environment.
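Under the covers, the Service Explorer is essentially composing a parameterized HTTP request against the dataset and sending the account key along with it so usage can be billed and reported. A rough Python sketch of that pattern (the endpoint URL, parameter names, and header name here are hypothetical illustrations, not Dallas's actual API):

```python
from urllib.parse import urlencode

# Hypothetical endpoint, for illustration only.
BASE_URL = "https://api.example.com/v1/dataset"

def build_request(account_key, **criteria):
    """Compose the query URL and headers the way a Service Explorer
    style client would: filter criteria go in the query string, and
    the account key travels in a header (hypothetical name) so each
    call can be tied back to billing and the access reports."""
    url = BASE_URL + "?" + urlencode(sorted(criteria.items()))
    headers = {"X-Account-Key": account_key}
    return url, headers

url, headers = build_request("my-guid-key", country="US", limit=100)
```

The generated service class mentioned above would wrap exactly this kind of plumbing, so application code only deals with typed results.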

The last tab on the portal is the "Access Report" tab, where you can view reports of how many times you accessed each subscribed service. To the best of my thinking, you would be charged according to the usage of each subscription. I have not learned much about the pricing policy, so I cannot really comment on it.

Finally, how can this be used for Business Intelligence? Business analysts keep accessing and aggregating data from different sources, and most premium content providers charge a subscription fee. One of the best examples is the equity research business, where data from many sources is studied for each equity. The same concept can be used here, but on a pay-per-use basis, as I understand it. Using the cloud platform, applications and data storage can all be hosted on the cloud instead of investing IT resources in them, and using tools like PowerPivot, data from various business sources can be aggregated and analyzed without major IT investment.

Still, this does not eliminate the need for IT technology workers (because I am one of them), as they would be the ones facilitating this setup for the business in the IT environment, reducing manual effort and programming the regular analysis routines!

Sunday, January 03, 2010

Microsoft Codename Dallas and PowerPivot

When PowerPivot was introduced, the power of analytics it brought to Excel and SharePoint-hosted content was spectacular. The point that did not feel in place to me was: what would PowerPivot analyze? When analytics is generally served to MIS systems right out of the data warehouse and cubes, would PowerPivot really be that useful to business analysts, keeping the flag of self-service BI waving?

One interesting use of PowerPivot came to my attention when I saw it used with Microsoft Codename Dallas. Dallas is still in alpha and seems an interesting concept; it can be thought of as a potential new entrant in the marketplace of the service-subscription business.

Any business providing its content in the form of a service can partner with Microsoft and take advantage of Microsoft's sales channel. Microsoft would provide these services via Dallas, and users could subscribe by paying for the subscriptions they need. Content from these services would be made available to subscribers in a structured format (such as a dataset or an Excel workbook).

This content can then be readily analyzed using PowerPivot. This is a very interesting part of cloud computing, and in my view it is PowerPivot's most appropriate use for business intelligence. PowerPivot is, of course, a very powerful mechanism for analyzing huge volumes of data in a way that was not possible with Excel alone, but that always felt more like an add-on to me than real business intelligence. Now that data from Dallas can be joined with other data sources and analyzed using PowerPivot, this, I feel, is real business intelligence.

Saturday, January 02, 2010

Emerging technologies that would affect Microsoft Business Intelligence in 2010


It is the new year, and I thought today I would publish my first post of the year. I have been working with the SQL Server 2008 R2 November CTP these days, and a lot of new technologies are evolving that will affect MS BI. From a professional point of view, too, they are worth watching, as good know-how of these technologies can definitely add weight to your CV.

In my view, below is a list of technologies worth watching in 2010:

1) Master Data Services

2) PerformancePoint Services for SharePoint

3) SQL Azure and cloud computing

4) Excel Services & REST API in SharePoint Server 2010

5) PowerPivot for Excel and SharePoint

6) Silverlight applications that can be integrated with MS BI deliverables
