NW Business Intelligence

thoughts on technology, B.I., and more…


Testing Reports, How Well Should We Do It?

Posted by Brad Greene on October 14, 2012

I think everyone would agree that one of the long-held goals in Business Intelligence, delivering work of known (and hopefully high) quality, is a critical success factor. If your BI team does not deliver this, your customers (end users) quickly become disillusioned and suspicious of your work. Even small mistakes, made once too often, can do long-term damage, and you will spend months, maybe years, proving your work can be trusted. As the importance of the data warehouse has risen, so has the need for rigorous testing of deliverables. However, my observation is that in many cases the resources applied to testing have not kept pace, especially when compared to other areas of software development.

This posting is specifically about testing reports. This is an area where I think we can do much better. As developers we understand that every link in the BI delivery chain is critical to delivering quality. However, time and again, I see a dramatic drop in the level of sophistication applied to testing reports compared with the other steps in BI delivery. Development teams work hard to make sure their data is loaded on time, cleansed, transformed into business value, and made available to end users, only to have it presented in reports that may contain defects. Those defects slip in because testing reports is not easy. It is deceptively complex.

I’m most familiar with the Cognos product suite, but I believe many of the challenges of testing reports apply to other BI suites as well. Here are some I see:

Reports are typically not written in a procedural language (report specifications are commonly XML, CSS, HTML, etc.)
Reports are database dependent, so output is effectively infinitely variable
Reports often contain graphical output like charts and images
Reports can be output in more than one format (PDF, CSV, Excel, XML, etc.)
Reports support variables or parameters, sometimes in large numbers
Reports may require support for more than one language
Reports may depend on logic in metadata repositories that can be branched independently of the reports
Reports often involve user security, which can be complex
Reports can draw on multiple data sources and types
Reports are often supported in multiple environments (browser, browser version, OS, bitness)
Reports can have strict formatting specifications (pixel-perfect reports for printing)

Before I was doing dedicated BI work, I was at a firm that used Mercury as its testing tool. We developed a set of reports as part of a SaaS product suite. The testing tool used basic “screen scraping” methods to compare report outputs between versions to find changes. We were able to set up automated testing of the reports to accommodate our six-week release cycle. This was in the early days of very large databases, and ours was constantly undergoing changes to tables and data. Sound familiar? We didn’t think of ourselves as data warehouse developers, but that is what we did, and fortunately we were able to invest in quality testing tools that included features for automating our report testing. Tools like Mercury were relatively expensive even then, but the cost of delivering a bad report was part of the justification.
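The core of that “screen scraping” comparison is simple to sketch today with nothing more than a diff. Here is a minimal, hypothetical example (no Mercury or vendor tool involved; the report contents are invented) that compares two text exports of the same report between releases and surfaces only the changed lines for a human to review:

```python
import difflib

def diff_report_versions(baseline: str, candidate: str) -> list[str]:
    """Compare two text exports of the same report and return the changed lines.

    This mimics the screen-scraping comparison: any line that differs between
    the baseline (last known-good release) and the candidate (new release)
    is flagged for review.
    """
    diff = difflib.unified_diff(
        baseline.splitlines(), candidate.splitlines(),
        fromfile="baseline", tofile="candidate", lineterm="",
    )
    # Keep only the actual +/- change lines, dropping the diff headers.
    return [line for line in diff
            if line[:1] in ("+", "-")
            and not line.startswith(("+++", "---"))]

baseline = "Region,Sales\nEast,100\nWest,200\n"
candidate = "Region,Sales\nEast,100\nWest,210\n"
for change in diff_report_versions(baseline, candidate):
    print(change)
```

A real harness would loop this over every report in a regression suite and treat any non-empty result as a test failure to triage, which is essentially what the commercial tools automate.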

There are now tools available to do this job that are far more integrated and easier to use. I’m aware of one solid player offering real testing features to the Cognos market; there may be other companies, but I’ve not seen them. The one I have actually seen work is Motio. It doesn’t solve all the issues I listed above, but nothing does yet. Testing reports takes a lot of process and effort to get right. Is it worth the investment? That’s the question. A colleague of mine sent me an interesting article that might help find that answer. Here Douglas Hubbard discusses how we might better measure the right things when doing a cost-benefit analysis: The IT Measurement Inversion

Making things even more difficult is Agile. With Agile development methods making their way into most companies, it is interesting to see how BI teams adapt to make the most of the methodology. You can find lots of others writing about their experiences with this. As the pressure mounts to quicken the pace of BI release cycles, the ability to hold the line on quality becomes even more important. The challenges for the QA team (often the very same BI developers) vary widely depending on the size of the organization. However, there are a few common themes:

Staffing testing roles to meet demand
Underestimating the difficulties of testing BI reports and related analytic output
Getting overwhelmed by the challenges of managing multiple releases (branching is starting to become more common in BI)
Lack of tools to automate report testing (as discussed above)

So what to do?

Here is what I see most organizations do. Bluntly put: close their eyes and hope, relying on unit testing by developers and spot checks by a few part-time QA staff.

OK, that may be slightly dramatized, but it is essentially what ends up happening. The costs of BI are typically under constant scrutiny (still). No one ever seems to budget $200K (just an example) or more for testing reports when they start a BI project, and no one understands why you should spend that kind of money on testing. It’s certainly not trivial. I’m not saying that’s what these tools cost; enterprise software pricing is complicated, and you have people costs as well. But that’s not the point. The software does something complex and saves companies a lot of money when it is used appropriately. BI teams have to start asking themselves about the cost of outcomes like these:

Delivering reports with inaccurate values in them
Delivering reports that run slowly or unpredictably
Delivering reports to the wrong people
Delivering reports that allow access to sensitive data (regulatory compliance)
Delivering reports that are badly formatted (missing logos, misaligned columns, etc.)
Delivering reports with misspelled words
Delivering reports that fail under some conditions (in one environment but not another)

Whether big or small, BI teams have to take testing of their BI reporting deliverables seriously. Not doing so is a sure sign of a lack of experience, and it is only a matter of time before that approach leads to problems. Unless the project or company is quite small, it can and should afford to invest in a dedicated tool; the payoff is huge in the long run. No team can afford to deliver bad reports, period.

Smaller companies and projects may not be able to afford a dedicated tool. That doesn’t mean you can’t adopt good testing practices; it just means they will be more work and take more of people’s time. Taking the time to set up dedicated testing environments with known data sets, SQL scripts to validate report output, screen shots from prior reports to compare results against, and so on, is how you mimic an automated tool. You can’t do all of that by clicking buttons, but the concepts are the same. You can learn more by watching vendor demos to see what features they offer and then building your own manual processes. Then hopefully, some day, your project will be worthy of the investment in automated report testing tools.
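To make the “SQL scripts to validate report output” idea concrete, here is a rough sketch of such a check. Everything in it is invented for illustration (the table, the query, and the report export; an in-memory SQLite database stands in for the test environment’s known data set): it recomputes the report’s totals directly from the database and compares them to the figures in the report’s CSV export.

```python
import csv
import io
import sqlite3

# Stand-in for a dedicated test environment with a known data set.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("West", 200.0), ("West", 10.0)])

# Stand-in for the CSV export of the report under test.
report_csv = "Region,Total Sales\nEast,100.0\nWest,210.0\n"

def validate_report(conn, report_csv: str) -> list[str]:
    """Return a list of discrepancies between the report and the database."""
    expected = dict(conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"))
    problems = []
    for row in csv.DictReader(io.StringIO(report_csv)):
        region, total = row["Region"], float(row["Total Sales"])
        # Allow a small tolerance for rounding in the rendered report.
        if abs(expected.get(region, 0.0) - total) > 0.005:
            problems.append(
                f"{region}: report says {total}, database says {expected.get(region)}")
    return problems

print(validate_report(conn, report_csv) or "report totals match the database")
```

The point is not this particular script but the discipline: a known data set, an independent query, and a mechanical comparison, run for every release.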


SSAS vs Transformer

Posted by Brad Greene on February 16, 2012

I’ve been all over the place the past couple of months. One interesting project I started during this time is helping prove the viability of using Microsoft’s SSAS cube technology to replace Cognos Transformer. The application is retail, and there is quite a lot of data in some of the fact tables (one is nearing a billion rows). Some of the dimensions are quite large, though fortunately not in the millions. The challenge for this customer has been that Transformer, being what it is, has started to become overwhelmed at these volumes of data. Build times are getting ridiculously long; the failure rate, while not high, is enough to be troubling; and recovery from any failure is painful given the long build times. I know there are ways to work around some of the limitations, and people have been very resourceful. Go there if you have to, I guess. But there are options.

Seeing this, I felt compelled to recommend SSAS as the next step. We’re starting to see more Cognos clients in this situation make this decision; the results are just too compelling to ignore. SSAS has become robust, feature rich, and very scalable. Our initial proof of concept confirmed everything we expected: we were able to design and build cubes at the lowest levels of detail, providing a more seamless user experience, with build times far, far shorter than those of any comparable Transformer cube.

The combination of SSAS to design and build big, detailed cubes with the BI management and presentation capabilities of Cognos is awesome. If you simply use it to deliver cubes to Excel users, you are missing the point. Transformer is just no longer able to handle the increasing volumes of data some companies are collecting. So fix that by plugging in SSAS, but don’t abandon all the other great things Cognos has to offer. It’s a great marriage.

There is a BUT here. There is no question that SSAS requires more technical skills than Transformer. Knowledge of the MDX language is mandatory, and technical staff will have a non-trivial learning curve to climb, but the alternative is not pretty either. In an environment where SQL Server Enterprise is already present, it may well be an easy decision to make the move to SSAS. Get some help, get some training, and do it. This stuff works.


Cognos Component List

Posted by Brad Greene on September 23, 2011

Want the list of installed components in a Cognos installation? Substitute your server name into the URL below and paste it into your browser. This assumes you are using CGI, of course; change it to cognosisapi.dll if you are using the IIS DLL. You should get a long list that starts out something like the text below.


; Licensed Materials - Property of IBM
; BI and PM: is
; (C) Copyright IBM Corp. 2004, 2010
; US Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp
[Product Information]
LICENSE_BI_SERVER_name=IBM Cognos License
C8BISRVR_name=IBM Cognos Business Intelligence Server
LICENSE_BI_SAMPLES_name=IBM Cognos License
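Since the response is plain INI-style text, it is easy to post-process once you have saved it from the browser. A small sketch (using only the fragment shown above as sample input; fetching from your gateway is left out) that parses it into a dictionary of component names:

```python
import configparser

# The first few lines of a typical response, as shown above.
sample = """\
; Licensed Materials - Property of IBM
; (C) Copyright IBM Corp. 2004, 2010
[Product Information]
LICENSE_BI_SERVER_name=IBM Cognos License
C8BISRVR_name=IBM Cognos Business Intelligence Server
LICENSE_BI_SAMPLES_name=IBM Cognos License
"""

parser = configparser.ConfigParser()
parser.optionxform = str  # keep the component keys' original case
parser.read_string(sample)  # ';' lines are treated as comments by default

components = dict(parser["Product Information"])
for key, name in components.items():
    print(f"{key} = {name}")
```

Handy if you want to diff the installed components between two environments rather than eyeball a long page.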



Cognos Reports and Microsoft Office

Posted by Brad Greene on August 20, 2011

For several releases, IBM Cognos has provided an ancillary piece of software that dynamically connects Cognos reports to Microsoft Office tools like Word, PowerPoint, and Excel. It was called Go Office in the 8.x series but has been renamed Cognos for Microsoft Office. I had not used this tool before Cognos 10, but a new client was looking for a way to produce reports in PowerPoint without needing to manually keep the data in them up to date. New to report development, they were expecting to seamlessly connect some Cognos Report Studio reports to Microsoft Office. Fine idea in theory. It looked great in the demo, I’m sure.

As with all things in the world of software, the devil is in the details. This tool works well for certain things. Simple things. If you are producing reports with small lists, crosstabs of a half dozen cells, or charts, it’s fine. The integration of Cognos into Office is fairly seamless. Once you configure the Options with your Cognos gateway server’s URI, you are presented with a tree prompt widget that lets you navigate your Cognos Connection folders. Find the report you want in the tree, select Import, set a few options in the dialog boxes you’re presented with, and the report is imported. Pretty simple really.

The surprise comes when you don’t realize that the reports you import are converted to Office objects native to the tool you are using. So if you are working in PowerPoint and import a list report, the result will be a PowerPoint table. The data in the table will be “live”; it is refreshed with the click of a button. That’s the “beauty” of the integration. The unfortunate side is that you just lost all your formatting. If you spent any time formatting your report in Report Studio, all that work is tossed away as the report is converted to a PowerPoint table. I suppose this is not unreasonable given the vast differences between the capabilities of the Cognos studio products and PowerPoint.

The same is true for Word and Excel, of course. If you have complex reports that are large and heavily formatted, you are going to have to carefully plan how you use Cognos for Microsoft Office. PowerPoint tables are limited to 25 rows, for example, and header, footer, label, and other text seems to be left behind during import. It works, but don’t expect your nicely formatted lists and crosstabs to pop up in Word just as they did in Cognos Connection. There is some work left to do on formatting, but your data will be live.

