We have a report that takes absolutely forever to run; commonly it will time out. When I query the same information in Oracle, though, the data comes back very fast. I can only assume something in WebFOCUS is hindering its performance.
Any ideas on where I should even start looking? I've already checked the forum and several pieces of documentation in the tech doc library, and I'm running out of options.
Thanks for taking a look at this. Appreciated, Shawna
One that I wrote myself, which pulls all rows quickly, as opposed to the report, which should be pulling back 370K rows (far fewer) for the users. Thank you.
Perhaps you should try enabling tracing in WebFOCUS, capturing the generated SQL statement, and running that in your SQL tool instead. Then take a look at its execution plan (compared to the plan for the query you wrote yourself) and see where the differences are.
Many times the slowdown you notice lies in fetching the whole set of rows. If you ask me, 370,000+ rows is a significant chunk of data to read over from Oracle into WebFOCUS and then style. It may not even be the reporting engine but your browser choking when it tries to display an HTML table with that many rows.
There are many things to consider, but again, I would start by evaluating the SQL created by the iWay adapter and take it from there. You'll want to see that all filters are being passed to the database to handle, as opposed to WebFOCUS reading everything and filtering afterwards on the reporting server side, which I've seen happen very frequently in poorly written WebFOCUS code.
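As a minimal sketch (SALES_VIEW, SALE_AMOUNT, REGION and SALE_DATE are invented names, not from this thread, and the date literal assumes a YYMD field format): a screening test written directly against real columns is the kind of condition the adapter can normally translate into the SQL WHERE clause, so Oracle filters the rows before anything is shipped back to WebFOCUS.
-* Tests on real columns can usually be passed through to Oracle
TABLE FILE SALES_VIEW
SUM SALE_AMOUNT
BY REGION
WHERE SALE_DATE GE '20200101'
WHERE REGION EQ 'EAST'
END
If the trace shows those tests inside the generated SELECT's WHERE clause, the database is doing the filtering; if they are missing from it, the reporting server is reading everything and filtering afterwards.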
When I query the information in oracle though the data comes back very fast
Was this very fast retrieval for all the records?
I have been told by some people that the query runs quicker in Oracle (via Toad in my case), but the time frame they see is from the start of the execution to the first row returned.
Are you able to post your original SQL and the report?
In the report we are pulling all the data through and then applying many WHERE clauses; it's my understanding this could be impacting performance greatly.
I'll try to figure out iWay enough to check the adapter SQL today.
We also use Toad, and choosing almost the same filter criteria as the problematic report brings back all rows quicker than the report takes to run.
We also made sure all the JOINS are indexed properly. Hmmmm...
Thanks for the assistance, you both are bringing in some really good suggestions.
Are you coding your request in Developer Studio? If so, under the "View" drop-down menu there is a "Show generated SQL" option. Comparing its output with the SQL you wrote yourself could give you a hint.
Daniel In Focus since 1982 wf 8.202M/Win10/IIS/SSA - WrapApp Front End for WF
Thanks for all the assistance; unfortunately I was not very successful.
Daniel - I am in Dev Studio, but under 'View' I do not see that option.
Alex - I'll double-check on the DEFINEs; I don't think we are using any, though.
Dave - That sounds like great advice, but I still can't see where or how WebFOCUS writes its query. I was not as successful getting around in iWay last week as I had hoped.
Strange that you do not see it... You are running against an SQL database, right? Well, if it is not there, the next thing you can do is enter TRACE commands in your FEX. Then, with "View Source", you can see the SQL that WF generates:
-* Clear any traces that are already active
SET TRACEOFF = ALL
-* Trace whether aggregation is passed to the DBMS
SET TRACEON = SQLAGGR//CLIENT
-* Trace the SQL statements that WebFOCUS generates
SET TRACEON = STMTRACE//CLIENT
SET TRACEON = STMTRACE/2/CLIENT
-* Activate the traces for this session
SET TRACEUSER = ON
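As a minimal usage sketch (SALES_VIEW, SALE_AMOUNT and REGION are invented names), the SET commands go at the top of the fex, before the request you want to diagnose; run it and use "View Source" on the output to read the SQL that was sent to Oracle:
SET TRACEOFF = ALL
SET TRACEON = STMTRACE//CLIENT
SET TRACEUSER = ON
-* The request being diagnosed
TABLE FILE SALES_VIEW
PRINT SALE_AMOUNT
WHERE REGION EQ 'EAST'
END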
Daniel In Focus since 1982 wf 8.202M/Win10/IIS/SSA - WrapApp Front End for WF
The other thing you might want to do is have a look at your agents console and see where the work is being done; it is generally at http://servername:8121/bcstart.html. See how many rows it's pulling from Oracle. If it's pulling more than you expect, then it's almost assuredly applying some of your WHERE clauses after the SQL call on the EDASERVER, and that will be slower for a number of reasons.
If you want to get really tricky, get the agent number that the request is running under (from the agent console I mentioned above). Log on to your server as a user and go to that agent's temporary directory, generally found under \ibi\srv77\wfs\edatemp\tsxxxxxx, where the Xs are the agent's number. Since this thing takes all day to run, you have time to go to that directory and refresh it repeatedly to see whether temporary files are being created. If Oracle is doing all the work, you should see nothing but stderr.log and stdout.log in that folder. If EDA is bringing material back in pieces and reassembling it itself, you'll see other files appear (and disappear) as the process proceeds. That's an indication that something you've coded has pushed the Go-Slow button.
I had an iWay process with a DEFINE field on one of its sources. I didn't even use the DEFINE field -- it was just along for the ride. For some reason it switched off the pass-through option, and I had 11 temporary FOCUS files being created for a painfully simple denormalization routine. Big ones, too -- 40GB worth, and the process took an hour to run. I moved that DEFINE to the target table and it ran in 60 seconds with no temporary files.
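As a rough sketch of that pattern (SALES_VIEW, SALE_AMOUNT and DISCOUNT are invented names): a WHERE test against a DEFINE field often cannot be translated into SQL, so the reporting server pulls every row back from Oracle and filters locally, which is the kind of behavior that produces those temporary files.
DEFINE FILE SALES_VIEW
-* NET_AMOUNT exists only on the reporting server, not in Oracle
NET_AMOUNT/D12.2 = SALE_AMOUNT - DISCOUNT;
END
TABLE FILE SALES_VIEW
PRINT SALE_AMOUNT
-* A test on a DEFINE is typically applied after the rows come back from Oracle
WHERE NET_AMOUNT GT 1000
END
-* Moving the calculation into the Oracle view (or the synonym) and filtering on
-* that real column usually lets the adapter hand the test to the database instead.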
J.