Greetings!
We have a History table built for point-in-time reporting. It is partitioned by an as-of date, is updated daily, and contains 35 days of data. In addition, we have a separate table that contains just the current day's data for the same data as the History table; in fact, it is used to update the History table. To improve response time on the History point-in-time data, I was thinking of building out a time-based partitioned cube. I have read the Cognos documentation on this but still have lots of questions.
Questions:
1) Can a Cognos script be developed to automate the updating/removing of entries in the *.vcd file?
2) With a regular incremental update, you must do a complete refresh fairly frequently. If I am maintaining my *.vcd file, does that still apply? I do not think so, but I would still like to understand how many child cubes can be supported while still getting good response time.
3) Would there be an issue doing the first run of the cube update against the History table as described above, then processing just the daily file each day? That way the first run would give me 35 days' worth of point-in-time data, and I would only have to run a single day's worth of data after that.
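For question #1, here is a minimal sketch of what such automation might look like. It assumes the *.vcd file is a plain-text list with one child-cube entry per line, each embedding a YYYYMMDD date; the real file format may differ, so treat the parsing here as a placeholder to adapt:

```python
import re
from datetime import datetime, timedelta
from pathlib import Path

RETENTION_DAYS = 35  # match the History table's 35-day window
DATE_RE = re.compile(r"(\d{8})")  # assumes entries embed a YYYYMMDD date

def prune_vcd(vcd_path: str, today: datetime) -> list[str]:
    """Keep only child-cube entries within the retention window.

    Lines with no recognizable date (e.g. header lines) are left alone.
    """
    cutoff = today - timedelta(days=RETENTION_DAYS)
    kept = []
    for line in Path(vcd_path).read_text().splitlines():
        m = DATE_RE.search(line)
        if m:
            entry_date = datetime.strptime(m.group(1), "%Y%m%d")
            if entry_date < cutoff:
                continue  # drop expired child-cube entry
        kept.append(line)
    Path(vcd_path).write_text("\n".join(kept) + "\n")
    return kept
```

A scheduler could run this right before the nightly build so the *.vcd file always mirrors the 35-day window of the History table.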
Thanks
Charlie
Here is what I have discovered so far:
Related to item #2: I have over 42 partitioned cubes and performance is still very good! Related to item #3: I updated my Transformer model after running my first build (based on 30 days of history) by changing my query (a Report Studio report), then ran another build on my desktop with one day's worth of data. It updated my .vcd file and added my partition cube, and I could view them all via the master cube with no problem. We are now trying to see if we can achieve this with two separate models, but I need more information on automating the cube build for this type of solution (time-based partitioning).
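On the automation side, here is a rough sketch of how a scheduled job might drive the daily single-day build. The executable name, switches, and date parameter shown are assumptions from what a batch Transformer invocation typically looks like, not verified flags; check them against your Cognos version's command-line reference:

```python
import subprocess
from datetime import date

# Hypothetical executable name and switches -- verify against your
# Cognos Transformer command-line reference before using.
TRANSFORMER_EXE = "cogtr"
MODEL_FILE = "DailyModel.mdl"  # placeholder model name

def build_command(as_of: date) -> list[str]:
    """Assemble the command line for a single-day incremental build."""
    return [
        TRANSFORMER_EXE,
        "-n",                             # assumed "run build" switch
        f"-m{MODEL_FILE}",                # assumed model-file switch
        f"-dAS_OF_DATE={as_of:%Y%m%d}",   # hypothetical date parameter
    ]

def run_daily_build(as_of: date) -> int:
    """Invoke Transformer and return its exit code for the scheduler."""
    result = subprocess.run(build_command(as_of),
                            capture_output=True, text=True)
    if result.returncode != 0:
        print(result.stderr)  # surface Transformer's errors for alerting
    return result.returncode
```

Wrapping the build this way lets the scheduler alert on a non-zero exit code instead of silently missing a day's partition.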
Thanks
Charlie