I've noticed recently that the CMLOG2 (ETL request message log) is getting a bit large and I thought that I should drop a few records from this log.
Referring to my trusty ETL Manager User's Guide (we're using 5.1.2, but the guide is for 5.1.0), I found that Appendix A gives instructions on how to delete rows from CMLOG2 (up to a certain date) using CMDELLOG, and how to drop and recreate CMLOG2 using CMDRPLOG. Both should be executable from "Run Stored Procedure".
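For reference, what I'm submitting amounts to the following calls (shown here in EX form; the date format is a guess on my part, so don't take this as the exact syntax from Appendix A):

  -* Delete CMLOG2 rows logged up to a cutoff date (parameter format is a guess).
  EX CMDELLOG 20030630
  -* Or drop and recreate CMLOG2 entirely (I don't think this one takes parameters).
  EX CMDRPLOG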
However, neither of these options seems to work (both come back saying "0 records returned", and the size of CMLOG2 is unchanged).
Any ideas would be appreciated.
Cheers
Andrew W
Posts: 27 | Location: Sydney, Australia | Registered: May 27, 2003
[qb]However, neither of these options seems to work (both come back saying "0 records returned", and the size of CMLOG2 is unchanged).[/qb]
Since this appears to be a product problem as opposed to a tip or technique, it should be solved within the hottrack system.
I don't have personal experience with this problem, and the people I would normally ask about this aren't around at the moment. If you haven't opened a hottrack call, you should do so now. I could call in, but (a) then there'd be no record of it happening, and (b) customers normally get higher priority than company personnel for this sort of thing.
Regards, Jake
Posts: 20 | Location: NY, NY | Registered: June 19, 2003
Hi, I have just registered and noticed your posting. We currently use 523 (previously 435) and have experienced similar concerns regarding the log files, both in 435 and 523.
What we have found is that the bigger the log files get, the slower ETL Manager completes its jobs. Consequently, what we have done is: 1. create a 'new' (empty, 0-record) log file and save it away in a 'safe' folder; 2. set up a scheduled ('casted') MRE job that backs up the large ETL log to a backup folder, after which the 'new' log file overwrites the large existing one.
We have the MRE job scheduled on a daily basis, running before the ETL routines start in the early hours, and we have found this keeps the ETL running times down. Equally, if we want to look back at a log, the backups are in the backup folder for reference. Additionally, we decided to use MRE for this because we didn't want to create any contention within ETL Manager by having it try to re-create the log files itself (I'm not sure whether ETL Manager would write an entry to the log while executing this?). Hope this is of help, Tony
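In outline, and assuming the log is the default FOCUS data source on a Windows server (the paths below are only placeholders), the copy step amounts to something like this:

  -* Rotate the ETL log: keep a dated backup, then restore the saved empty copy.
  -* Paths are placeholders -- substitute your own application and backup folders.
  -DOS COPY /Y D:\ibi\apps\etl\cmlog2.foc D:\ibi\backup\cmlog2_&YYMD..foc
  -DOS COPY /Y D:\ibi\safe\cmlog2_empty.foc D:\ibi\apps\etl\cmlog2.foc

If your log is held in a relational database instead, the same rotation would be done with the DBMS rather than with file copies.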