Most helpful functions of Data Module?

Started by ztaylor, 19 Feb 2020 01:50:54 PM

ztaylor

Hi, I'm creating some videos for new users to get up to speed quickly using data modules.

Any suggestions on what the most helpful things are to highlight for people new to Cognos Analytics 11? These would be for people who have not used Framework Manager before (or Cognos at all, for that matter).

Thanks!


MFGF

Quote from: ztaylor on 19 Feb 2020 01:50:54 PM
Hi, I'm creating some videos for new users to get up to speed quickly using data modules.

Any suggestions on what the most helpful things are to highlight for people new to Cognos Analytics 11? These would be for people who have not used Framework Manager before (or Cognos at all, for that matter).

Thanks!

Hi,

I would highlight the intent-driven modeling capabilities which help users to find relevant data. I'd also point out the automatic relationship determination between objects, which helps users to link their data successfully.

Depending on the version you are using, there are some really useful topics such as 'Split' which you could cover, plus things like data groups and navigation paths.

Good luck!

MF.
Meep!

Francis aka khayman

What version should I use that won't be disappointing? I'm currently using v11.0.13.

ztaylor

Thanks MF!

@Francis, I'm going to do 11.1.4 or 11.1.5 - there are major changes between 11.1.4/5 and 11.0.13, so you definitely want to be on the later versions.

bus_pass_man

A data module, like an FM model, is first and foremost a modelling application. It exists to allow a modeller to take metadata from possibly many data sources and create a query plan which enables business users to make queries without knowing SQL, and to interact with their data from their own perspective.

To be successful you need to be a modeller. Whether that notion is obsolete, and the design of a query plan can magically be done algorithmically without any human analysis, remains to be seen. There still needs to be someone who understands relational database concepts such as keys. That person also needs to be able to understand business requirements and translate them into a functional application which allows business users to achieve their business query objectives. A short-form term for this person is a modeller.

It is an irreducible minimum. The basic Kimball-ian tasks of declaring the facts, their grains, and the dimensions remain.

I think intent-based modelling is an attempt to make a start along the path to automating the task of analyzing database information and prising out how things are related to other things, but it still requires that you review the results and verify mundane things such as that the relationships are correct, use the correct keys, and are at the correct grains.
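As a concrete example of that sort of verification (the table and column names here are made up for illustration), a quick duplicate-key query against the dimension table will tell you whether a key really is unique at the grain an auto-detected relationship assumes:

```
-- Hypothetical table/column names: a grain check to run before trusting
-- a 1..n relationship joined on EMPLOYEE_KEY.
SELECT   EMPLOYEE_KEY, COUNT(*) AS row_count
FROM     DIM_EMPLOYEE
GROUP BY EMPLOYEE_KEY
HAVING   COUNT(*) > 1;
-- Any rows returned mean the key is not unique at the assumed grain,
-- so the "1" side of the detected relationship is wrong.
```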

It is a pity to see people trying to write SQL in reports; they don't grok that a good model could liberate business users to explore their data and seek out insight without the intermediation of a report writer and without dependence on static reports.

There is a gap between the capabilities of FM and modules.   

One of the biggest gaps is provision for prompting.

The gap between the two applications is narrowing, as you can now, with 11.1.x, do stuff like model for multi-fact grains with column dependency. 

Column dependency is more flexible than determinants in FM.  In FM, you can only have one hierarchy of determinants in a query subject but, in a data module, you can have multiple column dependency paths in a query subject, which can represent different aspects of the data.  For example, an employee query subject can now have your employee-by-geography and employee-by-manager views in the same object.  In FM you need to model these sorts of things as separate query subjects.

Data security in modules isn't as flexible and adjustable as in FM. In FM, it is very much tied in with parameter maps and macros. You could define a parameter map containing some part (or all) of a filter expression, which would be substituted at run time based on a session parameter value (such as the identity of the user). The parameter map would be referenced in a macro in the query subject's security filter. You could also base parameter maps on tables, so administering and updating the security would sit outside the model's packages: just a matter of updating the security table, which could be done by anyone with rights to update it, not just the modeller, and without republishing the package (and synchronizing the reports to the latest version of the package).
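As a rough sketch of that pattern (the parameter map and column names here are made up; the #...# delimiters, the sq() macro function, the $map{key} lookup, and the $account.personalInfo.userName session parameter are standard FM macro syntax), a security filter on a query subject might look like:

```
-- Hypothetical security filter on an FM query subject.
-- 'RegionSecurity' is an assumed parameter map keyed by user ID; if it is
-- based on a lookup table, entitlements can be changed by updating the
-- table, with no republish of the package.
[Sales].[Branch].[RegionCode] =
    #sq($RegionSecurity{$account.personalInfo.userName})#
```

At run time the macro resolves the current user's ID, looks up the matching value in the parameter map, and substitutes it (single-quoted by sq()) into the filter before the SQL is generated.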

A couple of months ago there was a guy using macros in the data source connections to switch to a different source for each user. This is very much not the mainstream data security pattern for FM, and it must be an administrative nightmare, but it shows FM's flexibility.

Navigation paths allow you to define a lightweight collection of query items which can be swapped by a report or dashboard user, so they don't need edit rights to explore the data. If you have a report or dashboard where a query item from a navigation path is used (you need to use the instance of the query item which appears in the navigation path's branch), you can select that query item and right-click to view a menu which allows you to drill up or down the navigation path. In some ways it can serve the purpose of dimensional modelling without the overhead of DMR and without the need to deal with MUNs. I think it is better, as it isn't restricted to a hierarchy but can be seen as a bucket of attributes.

It would be helpful to have some analogue for measures, so you could keep the attributes in a report but swap in different measures without needing a reporting licence.

Data groups and clean allow you to refine the data to make it more useful to users, without them needing to know the functions that do it. (Try clicking the view or edit button in the properties to see what has been done to modify the query item's expression.)

In general, the UI could make it clearer how to do the basic Kimball-ian tasks. Cube Designer was starting to get this notion, but it never caught on and its UI was half-complete.

Many things are obviously half-complete.  You can create views but you can't edit them.   

The expression editor is far, far better than its FM cousin, or anything like it which I've seen Cognos do. The help text remains until you move the cursor outside the function keyword. This little thing is immeasurably useful: I can build up a function without needing to memorize the parameters and their syntax before the help text disappears, unlike in FM, where I need to keep clicking the function in the function tree to see the help text again. Often I will copy the help text and paste it into the FM expression just so it stays around while I work.

You can test sections of an expression so you can troubleshoot it.   I can't praise that highly enough.  If you've ever worked with errors in an expression which was more than trivially small and tried to figure out where the bloody problem is, you will appreciate it too.

Comments allow you to make clearer what you were intending to do, and the formatting of the expression is saved.

You can collapse parts of the expression so it is more easily readable.

I forgot to mention the relative time measures functionality. It is weird, but once you get used to it you will find it helpful; there are so many reporting requirements for relative time comparison. It would be helpful to be able to extend and augment the out-of-the-box relative time definitions, with things like not just the prior year but the year before the prior year (and similar x-period sets), without having to figure out what the bloody hell you need to do.

Andrei I

Quote
A data module, like a FM model, is first and foremost a modelling application.
A very nice write up!
I still prefer a managed-metadata approach, but self-service reporting does require some way of adding personal data. It looks like IBM is suggesting replacing FM modelling with data modules.
So much for a premise of the Single Source of Truth :-)

Francis aka khayman

Just curious: can you use MDX/dimensional functions in data modules?

srmoure

Quote from: Andrei I on 20 Feb 2020 07:47:29 AM
It looks like IBM is suggesting replacing FM modelling with data modules.
So much for a premise of the Single Source of Truth :-)

It's like they are shooting themselves in the foot. They have no real competition for FM modelling, yet they are easily surpassed by the competition in simplified modelling. They should be pushing both to be a real winner. I think data modules are a great concept, but still very limited in some filtering and modelling aspects, even though they offer a lot of flexibility.
Anyway, the FM-replacement strategy will end with more people using other solutions, because governance is IBM's key competitive advantage.