Anyone using DQM?

Started by Lynn, 21 Jan 2013 11:25:42 AM


Lynn

I am curious to know if anyone is running DQM. If so,

What version of Cognos are you on?
What database(s) are your sources on?
Is it implemented for a new project from scratch or was a migration from CQM to DQM performed?
Any other thoughts or experiences to share?

Thanks!

CognosPaul

I've had one client using DQM. The last time I was there they were still on 10.1.1, but upgrading to 10.2 was scheduled for the near future.

Amongst the various databases they used, the biggest improvement they saw was on Essbase. After a few aborted attempts at migrating reports, they rebuilt one from scratch, and the runtime was even faster than the hand-written MDX they had been using as a baseline.

DQM (at least on OLAP) makes certain assumptions about the way reports are built, and will fail if you don't build your report the way it expects. That means many existing reports will not run as expected without at least some tweaking.

Lynn

Thanks Paul for the feedback! There are some consultants at my client doing a new project (a Teradata database is the source), and they are claiming DQM is too new and has bugs. They say the best performance comes from tuning Teradata properly, and they do not plan to pursue DQM.

I am aware of the impact on report migration if we switch to it later. My concern is that they aren't following best practice (at least per the DQM Cookbook as I read it), and that switching to DQM later will carry an added cost - re-testing at the very minimum. The CQM query engine is an old architecture that IBM is no longer investing in, so DQM is the only place where future benefits are going to materialize. I felt there ought to be some compelling reasons not to use DQM.

This project is a DMR model intended to replace some cubes. I guess we'll see how it goes. Thanks again!

MFGF

I would concur that going with DQM now, before any reports are built, seems to be an opportunity to avoid migration in future. If you find it is as buggy as your Teradata guys claim, it is very easy to switch back to CQM before you get too far. I would definitely advise you give it a try, though - you ought to get much better performance than using the Teradata ODBC connection.

Just my 2p :)

MF.
Meep!

Lynn

Thanks very much muppet. I will try to do some testing on a package of each. This will probably fall on me, since I'm sure they won't agree to test both without some sort of change-of-scope fuss being kicked up.

I'll try to remember to post back with how things go. Would love to hear from others with DQM experience out there also!

Thanks again!

CognosPaul

If you're still in the early stages of the project, you may want to invest some extra effort to ensure the database meets the needs of the new Dynamic Cubes system. I haven't had much chance to play with it, but what I have seen is impressive.

Plus if you're working with Teradata, you're probably not working with small sets of data; any way of pre-aggregating data would be helpful.

MFGF

On its own, DQM wouldn't utilise database aggregates - only if you were using Dynamic Cubes, and you would need to be on 10.2 to be able to do that. I think the biggest gains in performance will result from improved query planning and from the caching done by DQM.

Good luck!! We'd be very interested to hear how things go!

Meep.
Meep!

cognostechie

I just did - version 10.1, a new model containing a DMR. I didn't see any visible gain in performance when I used DQM.

cognostechie

DQM seems to have a bunch of bugs! When I make a simple report by putting something in the crosstab columns, it gives an error saying 'A server error was encountered'. This has happened many times, and it happens randomly without any reason or logic. When I publish a package using CQM, I don't get that error.

Lynn

Quote from: cognostechie on 24 Jan 2013 07:12:24 PM
DQM seems to have a bunch of bugs! When I make a simple report by putting something in the crosstab columns, it gives an error saying 'A server error was encountered'. This has happened many times, and it happens randomly without any reason or logic. When I publish a package using CQM, I don't get that error.

What database are you using? I'm also curious to know if your package is relational or DMR.

Still waiting on the consultants at my current client to deliver their solution before diving into CQM/DQM, so my own experience is limited at the moment.

cognostechie

The DB is Netezza, and the package is a DMR, but I published a relational package too and it gives the same error. I figured out what is happening. When things are simple, like straight query subjects, it works. When I put in functions like _add_days or something else, it works sometimes but fails at other times. It's random!
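For context, the expressions involved are trivial - here is a sketch of the kind of _add_days usage that was failing intermittently (the query item names are hypothetical, not from the actual model):

```
// returns the order date shifted forward by 30 days
_add_days ([Sales (query)].[Orders].[Order date], 30)

// shift a prompted date by a number of days from a query item
_add_days (?pStartDate?, [Sales (query)].[Orders].[Grace days])
```

Nothing in an expression like this should be non-deterministic, which is what makes intermittent failures under DQM so hard to pin down.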

MFGF

What build of Cognos 10 are you using? Any fix packs applied?

MF.
Meep!

MMcBride

We have tried DQM on DB2 9.7 - 10.1 and SQL Server 2008 SP2
Our primary goal was to create DMR models to give the clients multidimensional functionality on a relational source.
We have Cognos 10.1.1 on AIX 6

We stopped.

Issues we have seen:
1 - Creating a DQM FM model (when you create a new model you can select DQM at the beginning, vs. the normal method of selecting DQM at publish time) - this was a horrible failure, both on DB2 and SQL Server. The main issue was that we started our model before development on the mart was complete. When a column was added to a table, nothing we did - not even deleting and re-importing the tables - would show the new column. Not only was it not refreshing the tables; if you tried to ignore the new column and just keep using the model, you would receive a "model is out of date" type error. We reached out to IBM and were pointed to an open APAR. This was a problem in multiple versions of Cognos, and the resolution was "upgrade to 10.2" - but they wouldn't guarantee that would fix the issue.
2 - Server-side cache issues. Because DQM models are cached differently than CQM models on the server, we have had a lot of problems where reports either did not run at all or, when they did run, showed old results. Clearing the cache and using the command-line cache flush did nothing - the ONLY thing that works is to stop and restart WebSphere and Cognos together.
3 - Java memory leak. We are forced to run our WebSphere JVM at 10 GB minimum and 20 GB maximum, and the same for the query service in Cognos. If I reduce this at all, we end up with dozens of 8 GB Java heap dumps within WebSphere - but only when using DQM. When I switched these models to ODBC in our development environment, the heap dumps stopped instantly. We were clearing 175+ GB of heap dumps nightly from WebSphere.
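For reference, heap limits like the ones described are ordinary JVM arguments - something like the following (values taken from the figures above; exactly where you set them depends on your WebSphere profile, and the Cognos query service exposes its own initial/maximum heap size settings in Cognos Administration):

```
-Xms10240m
-Xmx20480m
```

-Xms sets the initial (minimum) heap and -Xmx the maximum; dialing these down is what triggered the flood of heap dumps under DQM.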

As a result of the problems above, we are removing DQM from our environment completely. I have 4 SQL Server connections left to convert to ODBC. We will look at DQM again when we migrate to 10.2 at some point this year, but presently it is not worth the trouble in our environment.

And before you ask: yes, we have reached out to IBM and had them in-house 4 times in the past 8 months to help us resolve these issues. The final result was that they recommended we switch to CQM using ODBC drivers.

I have heard a lot of good things about DQM, and I am still excited to see how it will change our Cognos environment in the long run, but whether the issue is due to our server configuration or some weird set of circumstances, we are pulling it for now.

One last note - one of the functions that broke for us using DQM was _add_days as well. This was on a relational model, not a DMR, but yes, we saw the same thing with date logic under DQM: inconsistent results. I hate telling the users "look, if it errors out, just hit refresh and keep trying until you don't get an error" - that is unacceptable...

cognostechie

Quote from: MFGF on 28 Jan 2013 03:20:08 AM
What build of Cognos 10 are you using? Any fix packs applied?

MF.

10.1.1, so the patch is applied.

ykud

Hm, I'm just starting a new project with 10.1.1 fp1 and TM1 and SQL Server 2008 R2.

Spent 2 days struggling with DQM, and will probably switch to CQM, due to:
1) I see something like https://www-304.ibm.com/support/entdocview.wss?uid=swg1PM70160
every now and then, and I have triple-checked the dimensions for "special" symbols.
2) The "XQE-PLN-0215 The report has levels from the same hierarchy on multiple edges." limitation. It really makes dimensional reporting a pain - you can't have months as rows and a Current/Prev member as a column.
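To make that limitation concrete, here is a sketch of the crosstab layout that trips XQE-PLN-0215 (the [Time] hierarchy and member names are hypothetical):

```
// Rows: the months of a year
children ([Sales].[Time].[Time].[2012])

// Columns: current period vs. prior period - members from the SAME
// [Time] hierarchy, so DQM rejects the report with XQE-PLN-0215
[Sales].[Time].[Time].[2012].[Dec]
prevMember ([Sales].[Time].[Time].[2012].[Dec])
```

CQM tolerates this layout; under DQM you have to restructure the report so that only one edge carries members from the hierarchy - for instance, expressing the prior-period figure as a query calculation alongside the measures.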

Since my data volumes are low and it'll be an Active Report by the end of the day, I'll stick with CQM for now.

CAPP

Hi Lynn,
I have a customer on IBM Cognos 10.2.1, and I'm experimenting with DQM published packages on a single-dispatcher sandbox, getting a mixture of unanticipated results.

I switched on more verbose logging in the XQE log and found that when I run a report (say, a simple crosstab), the log shows SQL queries being sent to the database (SQL Server 2008 R2) on the first run. If I then open Query Studio and, using the same package, draw in the same dimension levels and measure I used in the report, the log shows SQL queries being sent to the database again. Writing out the DQM cache shows 2 sets of requests against the same package. I did not expect the DQM cache to be application-wide; I expected it to be package-wide, able to serve all the studio applications in Cognos Connection against the same package. I even tried Workspace Advanced, and that had the same effect.

So, in conclusion, the DQM cache is scoped firstly by package, then by dispatcher with its Query Service, and also by application - which implies there can be many duplications of the same dimension hierarchies for the same package.

Has anybody had this experience, or got a more informed knowledge/view around this?
Many thanks
Chris

kiran.timsina

I faced two issues related to DQM. It was some months ago, so I don't remember exactly what they were, but as far as I remember:
1. The model couldn't identify columns with a NUMERIC(x,y) datatype. My tables were on Netezza. I got this error when I enabled the DQM option right at the beginning, while creating a new project.
2. Ad-hoc reports in Business Insight Advanced ran forever when I enabled the DQM option while publishing the package, but the same reports in Report Studio were fine. My package was a DMR package.