It should take the same amount of time - provided your WebFOCUS TABLE FILE request is properly written.
WebFOCUS translates a TABLE FILE into a SQL SELECT statement. While developing a report, turn SQL traces on to view the generated SQL and any warnings - this helps you write cleaner WebFOCUS code.
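As a minimal sketch, tracing can be enabled at the top of a focexec like this (exact trace component names can vary slightly by release; the CAR sample file is used here for illustration):

```
-* Turn on SQL tracing to see the SQL generated for the TABLE FILE below
SET TRACEOFF  = ALL
SET TRACEON   = STMTRACE//CLIENT
SET TRACEUSER = ON
-RUN
TABLE FILE CAR
SUM SALES
BY COUNTRY
END
```

The generated SELECT (or any messages explaining why the request could not be passed to the database) then appears in the client trace output.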
Here's a good FocalPoint thread: Run Faster. There's lots more on this subject.
Francis
Give me code, or give me retirement. In FOCUS since 1991
Production: WF 7.7.05M, Dev Studio, BID, MRE, WebSphere, DB2 / Test: WF 8.1.05M, App Studio, BI Portal, Report Caster, jQuery, HighCharts, Apache Tomcat, MS SQL Server
We found that SQL performs better than TABLE FILE against DB2, but we also had a sub-hub environment set up, and we found that TABLE FILE translated into less efficient SQL there. My understanding from IBI is that in a sub-hub configuration the TABLE FILE is translated several times. I'm not sure how your environment is set up. To each their own, of course - only testing will tell you.
Also, keep in mind that a simple TABLE would probably perform as well as the SQL code, but as soon as you include any DEFINEs etc., there's a risk of returning a huge answer set, because the database can't evaluate them. Our approach is to issue a PREPARE to create the extract and push as many conversions etc. as possible down to the database. Another advantage is that you can issue DEFINE and TABLE directly against the PREPAREd extract, instead of the TABLE FILE SQLOUT stuff.
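As a sketch of the PREPARE approach, assuming a DB2 adapter connection named SQLDB2 and a hypothetical ORDERS table (both names are illustrative, not from the original post):

```
-* Prepare a database-side extract; the aggregation runs in DB2
SQL SQLDB2 PREPARE MYEXT FOR
SELECT CUSTID, SUM(AMOUNT) AS TOTAL
FROM ORDERS
GROUP BY CUSTID
;
END
-RUN
-* Report directly against the prepared extract,
-* including DEFINEs, without TABLE FILE SQLOUT
TABLE FILE MYEXT
PRINT TOTAL
BY CUSTID
END
```

This keeps the heavy lifting in the database while letting you use normal TABLE FILE syntax, DEFINEs included, against the result.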