
Recent posts

#1
Watsonx Orchestrate / Can Watsonx Orchestrate and Co...
Last post by DaBaker - Today at 08:32:11 AM
Orchestrate can call APIs, run tasks, trigger external systems, and handle conditional logic.
Cognos exposes REST and SDK endpoints for:
- Refreshing packages
- Running reports/jobs
- Checking job status
- Downloading outputs

Has anyone used Orchestrate to connect to or pair with Cognos Analytics?
Any tips or tricks?
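For context, here is a minimal sketch in Python of the kind of call an Orchestrate skill could make against the Cognos Analytics REST API. The gateway URL, namespace, and credentials are placeholders; the session login follows the documented PUT /api/v1/session pattern, and the content listing at the end is purely illustrative -- verify every endpoint against the API reference for your Cognos version.

import requests

BASE = "https://cognos.example.com/api/v1"  # placeholder gateway URL

session = requests.Session()

# Log in: PUT /api/v1/session with CAM credentials (documented pattern,
# but confirm the parameter names for your version)
resp = session.put(f"{BASE}/session", json={
    "parameters": [
        {"name": "CAMNamespace", "value": "LDAP"},        # placeholder
        {"name": "CAMUsername", "value": "svc_account"},  # placeholder
        {"name": "CAMPassword", "value": "secret"},       # placeholder
    ],
})
resp.raise_for_status()

# Many deployments require echoing the XSRF token back on later calls
session.headers["X-XSRF-TOKEN"] = session.cookies.get("XSRF-TOKEN", "")

# Illustrative follow-up call: list content visible to the service account
print(session.get(f"{BASE}/content").json())

From the Orchestrate side, calls like these can typically be wrapped as skills via an OpenAPI description of the API.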
#2
Here are ten tasks BI teams can automate right now to save time and reduce errors.

- Identifying duplicate reports
- Detecting unused content
- Testing report execution after upgrades
- Tracking security changes
- Validating packages and lineage
- Monitoring dispatcher performance
- Flagging reports that fail or exceed thresholds
- Cleaning orphaned objects
- Logging version changes
- Running scheduled health checks

These tasks consume countless hours across BI teams. AI agents and automation tools help eliminate that burden.
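As one hedged illustration of the "detecting unused content" item, a script can query the Cognos audit database for reports with no recent executions. This sketch assumes the standard audit schema's COGIPF_RUNREPORT table, a SQL Server audit database (for DATEADD/GETDATE), and a placeholder ODBC DSN; adjust all three to your environment.

import pyodbc  # assumes an ODBC driver and DSN for the audit database

STALE_SQL = """
SELECT COGIPF_REPORTPATH, MAX(COGIPF_LOCALTIMESTAMP) AS last_run
FROM COGIPF_RUNREPORT
GROUP BY COGIPF_REPORTPATH
HAVING MAX(COGIPF_LOCALTIMESTAMP) < DATEADD(day, -180, GETDATE())
"""

conn = pyodbc.connect("DSN=cognos_audit")  # placeholder DSN
for path, last_run in conn.execute(STALE_SQL):
    print(f"Not run in 180+ days: {path} (last run {last_run})")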
MetaManager and BSP Software tools can help with all of this... Get Your Environment Automated and Under Control
#3
For many BI teams, AI adoption feels overwhelming. A simple modernization roadmap helps break it down. First, stabilize and optimize your existing Cognos environment. Second, upgrade to Cognos 12 to gain the latest security, performance, and cloud-aligned capabilities.

Once the foundation is solid, begin layering in AI enhancements such as predictive forecasting, natural language querying, and automated insights from WxBI. The journey from BI to AI is not a leap. It is a series of clear, manageable steps that any Cognos team can take.
#4
Watsonx BI represents the next evolution of analytics for organizations that have relied on Cognos for years. Instead of replacing dashboards or reports, it enhances them by bringing predictive intelligence and natural language capabilities directly into the BI workflow.

For long-time Cognos teams, this means analytics can shift from "What happened?" to "What will happen?" without rebuilding your reporting environment. Watsonx BI sits alongside Cognos and unlocks insights that used to require a data science team. Predictive forecasting, anomaly detection, and automated narrative generation allow BI teams to deliver more value in less time.
#5
Report Studio / search case works, simple case...
Last post by hespora - Today at 04:41:22 AM
Cognos 10.2.2 on Oracle...

In a query, I'm inserting a simple mapping:
case
  when [dimension] = 'value' then 'blah'
  else 'blub'
end

That one works fine. Replacing the syntax with a simple case, however:
case [dimension]
  when 'value' then 'blah'
  else 'blub'
end

will fail to run and yield an ORA-12704 character set mismatch. My [dimension] on the database is of type NVARCHAR; the only thing I can think of here is that searched case and simple case use different data types for interpreting the literals - is that what is happening?
#6
Dashboards / Re: Dashboard performs less on...
Last post by dougp - Yesterday at 09:54:38 AM
Quote: The fact table in question is a table with about 130,000 records, ... When using this dimension, the number of records increases to 170 million.

After rereading this, I think this is the symptom to focus on.

By any chance, does your date dimension have about 1,300 rows (170,000,000 ÷ 130,000 ≈ 1,308)?  About 3.6 years?  Of course, working with non-rounded numbers to begin with would help.

I think what you mean by N:N is not even that the date table and the fact table have a many-to-many relationship.  I think you're saying they have no relationship -- a cross join, or cartesian join.  So the result is a dataset whose row count is the number of rows in the fact table times the number of rows in the date dimension.

Creating a proper relationship between the tables will help.  The result should be the number of rows on the fact.  Once you have that, you can start trying to use the dataset to answer questions.
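To make the row-count arithmetic concrete, here is a toy pandas sketch of the same symptom, scaled down by a factor of 1,000 (130 fact rows instead of 130,000); the table contents are made up, only the row counts matter.

import pandas as pd

fact = pd.DataFrame({"amount": range(130)})    # stand-in fact table
dates = pd.DataFrame({"date": range(1308)})    # stand-in date dimension

# No relationship defined: the merge is a cartesian join
cartesian = fact.merge(dates, how="cross")
print(len(cartesian))  # 130 * 1308 = 170,040 -- i.e., ~170 million at full scale

# With a proper key, the result stays at the fact table's row count
fact["date"] = 0  # give every fact row a key that exists once in dates
joined = fact.merge(dates, on="date", how="left")
print(len(joined))  # 130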
#7
Dashboards / Re: Dashboard performs less on...
Last post by bus_pass_man - Yesterday at 09:17:11 AM


I don't know what you mean by 'trajectories' and I think I don't need to know, although knowing would be nice.

You want the count of trajectories where start date >= {some date} and end date <= {some other date}.  Is that a correct understanding?  That is easily done, without mucking about with many-to-many relationships.  Why didn't you try that?

#8
Dashboards / Re: Dashboard performs less on...
Last post by moos_93 - Yesterday at 02:13:46 AM
Quote from: bus_pass_man on 05 Dec 2025 06:08:47 PM
Yes, I think you're doing something wrong.  You don't mention a bridge table for your N:N relationship, which you would probably want for any legitimate bridge-table scenario, but I really don't think this is one of them.

Can you clarify what you mean by this?  Is this a duration?  What are you trying to do here?

Also, is the N:N in the time dimension, or in the relationship between it and the fact table?  You are unclear.


I cannot comment on other possible modelling problems, because I have not seen your model and all the information I have is what you have chosen to reveal.

One very big problem with data sets is that you can't define data security on them, unlike tables in a loaded schema.

Thanks for your response. The START_DATE and END_DATE indeed show the duration of a trajectory. Both columns are used for different use-cases. In this case I want to report on the number of active trajectories at a given time, using an N:N connection between the two dates and a year-date calendar, such that:

FACT1.START_DATE >= CALENDAR.DATE
FACT1.EIND_DATE <= CALENDAR.DATE


A trajectory typically spans several months. I have considered using a year-month calendar instead, but with the dataset solution the response time was very acceptable even when using a year-date calendar. Furthermore, a year-date calendar preserves more of the original data, since other use-cases require reporting at the date level. Adding several calendars to describe the same timespan would make working with the data too complex for end users.
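For what it's worth, here is a small pandas sketch of this "active trajectories per day" pattern, with made-up column values. Note that the usual "active on a date" test is START_DATE <= date <= END_DATE, so it is worth double-checking the direction of the comparison operators in the model.

import pandas as pd

fact = pd.DataFrame({
    "trajectory_id": [1, 2],
    "START_DATE": pd.to_datetime(["2025-01-01", "2025-01-10"]),
    "END_DATE":   pd.to_datetime(["2025-01-15", "2025-02-01"]),
})
calendar = pd.DataFrame({"DATE": pd.date_range("2025-01-01", "2025-01-31")})

# The N:N range join: pair every trajectory with every calendar date,
# then keep only the dates each trajectory was active on
pairs = fact.merge(calendar, how="cross")
active = pairs[(pairs["START_DATE"] <= pairs["DATE"]) &
               (pairs["DATE"] <= pairs["END_DATE"])]
print(active.groupby("DATE")["trajectory_id"].count())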

#9
Dashboards / Re: Dashboard performs less on...
Last post by moos_93 - Yesterday at 02:04:38 AM
Quote from: dougp on 05 Dec 2025 12:22:43 PM
That is expected.  The dashboard that gets data from the database must perform these tasks for every visualization every time the user touches something:
- write SQL
- connect to the database server
- run a query that involves joins between multiple tables
- get a response back
- update the viz

Put another way, when you are connecting to the database:
- Cognos must write a more complex query.
- Cognos spends time communicating with the db server.
- The database server must perform lookups and filters across joins.
- Cognos spends more time communicating with the db server.
- Cognos waits for the data to be downloaded across the network.

In contrast, using a Cognos dataset means:
- write SQL
- get data from a single table
- update the viz

Choosing to use a Cognos dataset involves considering the tradeoffs between size and speed.  For most cases, a dataset will perform much faster than a direct database connection.


This is the same as using Power BI and comparing Import vs. DirectQuery.

Thanks for your reply. So this logic explains why the dataset solution shows a visualisation after 10 seconds, but the Oracle solution times out after a few minutes?
#10
Reporting / Re: Scheduling a report to run...
Last post by dougp - 08 Dec 2025 09:44:59 AM
I have a date table that includes holidays.  So Event Studio would work.  It doesn't seem like it would be much effort for someone to create a package for Event Studio to use.

Alternatively, you could use an external event (https://www.ibm.com/docs/en/cognos-analytics/12.0.x?topic=scheduling-set-up-trigger-occurrence-server) based on a script that computes the 2nd business day.  On Windows I might do this by calling a PowerShell script from a Windows Scheduled Task.  But you can also call the script from an ETL process, so that it runs not merely on a schedule but only after your data mart load completes successfully (instead of after a failure, or while the data is still being loaded).
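For illustration, here is the business-day logic in Python (dougp suggests PowerShell, but the approach is the same in either language). The holiday set is a placeholder standing in for the date table mentioned above, and the trigger call at the end is left as a comment since its exact form depends on your setup (see the linked IBM doc).

from datetime import date, timedelta

HOLIDAYS = {date(2025, 12, 25), date(2026, 1, 1)}  # placeholder holiday table

def is_business_day(d: date) -> bool:
    # A weekday that is not a holiday
    return d.weekday() < 5 and d not in HOLIDAYS

def nth_business_day(year: int, month: int, n: int) -> date:
    # Walk forward from the 1st until n business days have been seen
    d = date(year, month, 1)
    seen = 0
    while True:
        if is_business_day(d):
            seen += 1
            if seen == n:
                return d
        d += timedelta(days=1)

today = date.today()
if today == nth_business_day(today.year, today.month, 2):
    # Fire the Cognos trigger here (e.g., the trigger occurrence
    # mechanism described in the IBM doc linked above)
    print("2nd business day: trigger the schedule")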