I developed a data flow with a source, a target, and the corresponding transformations. My source table contains around 1 million records. The flow is valid, and when I ran the process flow it started fetching records and loading them into the target. After more than 4 hours it still had not completed, so I cancelled the process flow.
Later, I noticed that the table had been loaded with some of the records, even though I had set the commit size to a value greater than the source record count.
My question is: will the process flow keep running in the background? If so, where do I go to stop it?
A couple of things:
1) To cancel, go to the console, then Workspace, then Data Services; right-click and select Agents. You should see your request running. Right-click on it and select "Kill this agent".
2) I suspect the reason your job is taking so long is that you are using "keyed update". If possible, avoid that and use "insert from memory".
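To see why the load strategy matters so much, here is a rough, generic sketch, not DataMigrator syntax, using Python with SQLite as a stand-in database and hypothetical table and column names. A keyed update issues one statement per source row, probing the target's key each time, while an insert-style load batches the rows and commits them in large chunks, which is typically far faster for an initial load of a million-row table.

```python
import sqlite3  # stand-in database; your target would be whatever DBMS the flow loads

# Hypothetical target table and columns, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

# Pretend source data (the real source in the question is ~1 million rows).
rows = [(i, f"value {i}") for i in range(100_000)]

# "Keyed update" style: one statement per source row, each one probing the key.
# Shown commented out; this is the pattern that tends to crawl on large loads.
# for rid, val in rows:
#     conn.execute("UPDATE target SET val = ? WHERE id = ?", (val, rid))
#     conn.execute("INSERT OR IGNORE INTO target (id, val) VALUES (?, ?)", (rid, val))

# "Insert" style: rows are sent in a batch and committed once,
# which is usually dramatically faster when the target starts empty.
conn.executemany("INSERT INTO target (id, val) VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 100000
```

The same idea is why the cancelled run still left rows behind: everything committed before the cancel stays in the target, regardless of how the rest of the load would have proceeded.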