I have the choice on a project to either use a direct SQL script (1) or to build the group of reports on the Master Files (2) of that database.
My generic question is: what is the advantage of one choice over the other?
- Which is quicker or slower in performance?
- Which is quicker to build?
- What are the possible traps?
- What are the dos and don'ts?
When the SQL scripts that retrieve the data are already available, should I create a Master File on such a script?
Just some thoughts on a stormy Sunday.
Frank
prod: WF 7.6.10 platform Windows, databases: msSQL2000, msSQL2005, RMS, Oracle, Sybase,IE7 test: WF 7.6.10 on the same platform and databases,IE7
Posts: 2387 | Location: Amsterdam, the Netherlands | Registered: December 03, 2006
My main concern would be the development of the SQL. Since you did not specify the RDBMS you are dealing with, my overall comment would be that SQL is easier to develop for some RDBMSes than others. So you need that skill set, or just good old SQL skills. Whether or not your SQL is optimized is questionable as well.
- Which is quicker or slower in performance? If you do SQL Passthru, this bypasses the parsing of the TABLE request, so in theory it would be faster. But please note that thanks to WebFOCUS's native adapters, the generated SQL should be pretty well optimized; there is no guarantee that your SQL would be. Truth be told, if you have really good SQL, it bypasses a step, so it should be faster (see the sketch after this list).
- Which is quicker in building? If you are always developing SQL to use for SQL Passthru, that takes longer than just using the GUI.
- What are the possible traps? A big trap would be migration. What if the customer decides SQL Server isn't scaling well and moves to Oracle? The SQL you wrote might be nice MS SQL, but maybe not such great PL/SQL. Maybe it won't even work! That means all the SQL needs to be reviewed and potentially modified. You might say, "Hey, my SQL is generic, so it should work." You're right, it should work, but will it be optimized? Most optimizations are specific to the native SQL dialect, which WebFOCUS generates for you.
- What are the dos and don'ts? I don't know. Consider the above.
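To make the trade-off concrete, here is a minimal sketch of the two approaches, assuming the MS SQL adapter (SQLMSS) and invented table and column names; substitute your own adapter, connection, and synonyms.

-* Option 2: report off the synonym and let the adapter generate the SQL
TABLE FILE ORDERS
SUM AMOUNT
BY CUSTOMER_ID
WHERE ORDER_DATE GE '20080101'
END

-* Option 1: SQL Passthru -- send your own SELECT, then report from the SQLOUT answer set
SQL SQLMSS
SELECT CUSTOMER_ID, SUM(AMOUNT) AS TOTAL_AMOUNT
FROM ORDERS
WHERE ORDER_DATE >= '2008-01-01'
GROUP BY CUSTOMER_ID
;
TABLE FILE SQLOUT
PRINT *
END

Both should return the same answer set here; the difference is who writes and optimizes the SELECT.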
It's also nice to create a synonym against a stored procedure. If there is a stored procedure out there to take advantage of, then great. I don't think it makes sense to create stored procedures just for WebFOCUS. I say take advantage of what's there, but don't reinvent the wheel when you can use the GUI to do joins, Business Views, reports, etc.
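As a rough illustration only (the procedure name, parameters, and engine are invented), an existing stored procedure can also be executed through Passthru and reported on like any other answer set, rather than rewriting its logic:

SQL SQLMSS EX dbo.GET_OPEN_ORDERS '2008', 'EAST';
TABLE FILE SQLOUT
PRINT *
END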
As Chris says, in a sense it depends. I use very little SQL, as we have DB2, and if your selection is on real fields, especially key fields, response in WebFOCUS is very good. If you want to 'roll up' data prior to reporting and you are comfortable with SQL, then it's faster development on your part.
Long-term maintenance should be your biggest concern, in my humble opinion.
Leah
Posts: 1317 | Location: Council Bluffs, IA | Registered: May 24, 2004
In this special case the supplier of the application (Microsoft CRM) provided some sample reports with SQL scripts in an Excel template. The result is what you can expect from Excel... We now want to move this to WebFOCUS, and passthrough SQL is the quickest way to do that. It just made me wonder whether it is the best way in all cases.
Frank
prod: WF 7.6.10 platform Windows, databases: msSQL2000, msSQL2005, RMS, Oracle, Sybase,IE7 test: WF 7.6.10 on the same platform and databases,IE7
Posts: 2387 | Location: Amsterdam, the Netherlands | Registered: December 03, 2006
I have spent a lot of time at a client site doing report development using SQL stored procedures and views, and I will be glad to give you my opinion: I hate them, and SQL Pass-thru is essentially the same as a stored procedure. First, I had to wait on the IT staff to make changes to the SPs and views all the time. Now that they have many reports in production where users select single values from a list for structured ad hoc reports, users can only select one item at a time; there is no multi-select, which the users are now asking for. So all the SPs have to be replaced with WebFOCUS code that generates dynamic SQL. A lot of wasted time. I would recommend WebFOCUS, and letting it generate the dynamic SQL for you. You can turn on SQL tracing and fine-tune the WebFOCUS code to generate the best SQL. While SQL Pass-thru does have its place, for general reporting I would recommend WebFOCUS code. Also, if you ever move to another RDBMS, you will have to change your SQL Pass-thru. I guess if you are a consultant and need to stay employed (billable), then SQL Pass-thru may be what you want, but that's another topic.
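For what it's worth, a small hedged sketch of the multi-select point (field and value names are invented): when the screening lives in the WebFOCUS request, the adapter can fold the OR'd values into an IN list in the generated SQL, which is awkward to retrofit into a fixed stored procedure.

TABLE FILE ORDERS
PRINT ORDER_ID AMOUNT
WHERE REGION EQ 'EAST' OR 'WEST' OR 'NORTH'
END

A multi-select parameter from a launch page typically resolves to exactly that kind of OR'd value list, and the SQL trace should show something like WHERE REGION IN ('EAST','WEST','NORTH').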
I'm an employee of the company where I build the reports, and I support your opinion, but in this particular case the SQL script helps me evaluate the structure of the CRM database. I get the first release up and running that way, and then I have time to rebuild the process in real WebFOCUS.
Frank
prod: WF 7.6.10 platform Windows, databases: msSQL2000, msSQL2005, RMS, Oracle, Sybase,IE7 test: WF 7.6.10 on the same platform and databases,IE7
Posts: 2387 | Location: Amsterdam, the Netherlands | Registered: December 03, 2006
Frank, just a short question. We are also in the process of implementing MS-CRM. The package has, I was told, reporting features that use MS's own reporting. At what point would it become more efficient to use WebFOCUS instead?
Daniel In Focus since 1982 wf 8.202M/Win10/IIS/SSA - WrapApp Front End for WF
Posts: 1980 | Location: Tel Aviv, Israel | Registered: March 23, 2006
We are, so far, not impressed by the reports that come with the CRM system, and as you will know, after implementation the real reports will have to be built around the specific use of the system. Besides that, we will combine CRM information with real business information: for example, the client addresses and other CRM info with the client's transactions, which come from the back-office (legacy) system.
The simple report built on the SQL script works fine, but now I need to find out how to combine that with the legacy system, and that might be better done with real FOCUS language and Master Files.
Frank
prod: WF 7.6.10 platform Windows, databases: msSQL2000, msSQL2005, RMS, Oracle, Sybase,IE7 test: WF 7.6.10 on the same platform and databases,IE7
Posts: 2387 | Location: Amsterdam, the Netherlands | Registered: December 03, 2006
The one time I have used SQL passthrough is when I needed a subselect in the SQL. I know I could use a hold file, but the volume of data returned is very large, and it is therefore slow due to the significant amount of I/O. When I say very large, I mean rows numbering in the hundreds of thousands. Using SQL passthrough eliminated the I/O from the database to the WF server, speeding up the process in general.
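A hedged sketch of that pattern (the DB2 adapter name SQLDB2 and all table and column names are assumptions): the subselect keeps the filtering inside the database, so only the qualifying rows ever cross from the DBMS to the WF server.

SQL SQLDB2
SELECT C.CUSTOMER_ID, C.CUSTOMER_NAME
FROM CUSTOMERS C
WHERE C.CUSTOMER_ID IN
  ( SELECT O.CUSTOMER_ID
    FROM ORDERS O
    WHERE O.ORDER_AMOUNT > 10000 )
;
TABLE FILE SQLOUT
PRINT CUSTOMER_ID CUSTOMER_NAME
END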
Diptesh WF 7.1.7 - AIX, MVS
Posts: 79 | Location: Warren, NJ, USA | Registered: October 25, 2006
I have mixed opinions on this. We've actually found development to be a bit faster if we use SQL passthru (either calling an SP directly, or writing the query and putting that SQL in). That's definitely not true in all cases, but in all cases I would say that SQL passthru is AT LEAST as fast as letting WF handle it. I can't think of any cases where WF is faster. In fact, I've been frustrated at times with how the WF parser handles some things. Something that seemingly should be relatively simple (in my mind), WF parses out into 3 separate calls to the database. One of those calls ends up being really slow because it brings back the entire table, so it makes the whole report slow. Writing the same query in SQL and sticking it in yielded 100x the performance.
Frankly, I don't want to spend my time playing around with the joins and such, trying to tweak them to pass the most efficient SQL. In this case, let the DBMS do its job and optimize it.
Bottom line, each specific DBMS is going to be as good as, and in most cases much better than, the generic WF parser. That's not to say the WF parser is bad; it's just more "generic".
For what it's worth, quite a few of our reports are complex enough that they require multiple steps to get to the final result -- something that would be entirely too time-consuming to perform as consecutive WF steps (see the sketch below)...
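For reference, the consecutive-step pattern being contrasted here looks roughly like this in WF (file and field names invented): each intermediate result is written to a HOLD file and re-read by the next request.

TABLE FILE ORDERS
SUM AMOUNT
BY CUSTOMER_ID
ON TABLE HOLD AS CUSTTOT
END
TABLE FILE CUSTTOT
PRINT CUSTOMER_ID AMOUNT
WHERE AMOUNT GT 10000
END

Each step lands data on the WF server, which is where the extra time goes compared with doing the whole thing in one SQL statement.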
Don't get me wrong, I'm not knocking the WF experience -- most of our early reports were built there, and frankly, if it's a simple enough report that it comes from one table or even a few tables joined together, go ahead and use WF. You'll have the metadata there, and that's a reuse benefit you may not get with SQL passthru. I've just found that most of the reports we've built as of late have been easier to code in SQL and ultimately perform better too...
To address TexasStingray's comments on multi-select -- for what it's worth, we've got several complex reports with multi-select and they're stored procedures as well. Granted, you have to think about that when you're writing the stored procedure, but it's entirely possible that the IT resources didn't understand how to do it in the stored procedure...
I think the short answer is just like everyone else has said -- it depends. It depends on your situation, "time-to-market", etc.
Production: 7.6.6 WF Server  <=>  7.6.6 WF Client  <=>  7.6.6 Dev Studio Testing: <none> Using MRE & BID.  Connected to MS SQL Server 2005 Output Types: HTML, Excel, PDF
I was at a contract where an existing process used to establish the number of rows in an MVS-based DB2 table was taking a very long time (hours, not minutes). The code was basically -
TABLE FILE db2file SUM CNT.* END
Looking at the SQL produced, it could be seen that it was performing a tablespace scan, passing everything back to FOCUS (pre-WebFOCUS), where the count was performed. The total was in the region of 100 million rows.
Changing the code to -
TABLE FILE db2file WRITE CNT.* END
produced the more efficient SQL, and the process ran in under 3 seconds elapsed and produced the same result.
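The difference shows up in the SQL trace: the slow version fetched every row back and counted on the FOCUS side, while the efficient version pushed the aggregation into DB2, producing SQL along the lines of (the schema and table names here are just placeholders):

SELECT COUNT(*) FROM SCHEMA1.DB2FILE;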
The moral is that, yes, FOCUS and WF can be made efficient, and you will have to play with your code a bit (and use Francis's oft-quoted SQL tracing) to ensure that the SQL produced uses indexes where possible and returns only the minimum recordset required.
Alternatively, if you are proficient at writing efficient SQL yourself, you can greatly improve your development turnaround by using SQL passthru, as Trav has mentioned.
For my money, I use SQL when against an MS SQL or DB2 data source, as I am comfortable with the SQL syntax and efficiencies. I am now facing Oracle, and the syntax is new to me, so WF it is (it also happens to be my client's policy).
T
In FOCUS since 1986
WebFOCUS Server 8.2.01M, thru 8.2.07 on Windows Svr 2008 R2
WebFOCUS App Studio 8.2.06 standalone on Windows 10
Posts: 5694 | Location: United Kingdom | Registered: April 08, 2004
Frank, as you can see there are a lot of opinions out there. But that adds to the power of WebFOCUS: you can have your cake and eat it the way you want. The best advice you could receive is to use what works best; there is a time and a place for everything. As a developer, use the tool at hand. I would suggest coding what you can in WebFOCUS and turning on SQL tracing. Here are the commands I use for SQL tracing with WebFOCUS 7.1.x and 7.6.x:
SET TRACEOFF=ALL
SET TRACEON=STMTRACE//CLIENT
SET TRACEON=SQLAGGR//CLIENT
SET TRACEUSER=ON
SET XRETRIEVAL=OFF
If performance becomes an issue, be sure to use tracing, and make sure the fields used in your WHERE statements are indexed. The key to the SQL trace is the SQLAGGR output: make sure the aggregation is DONE, or at least not applicable to the verb used (meaning you used the PRINT verb and not the SUM verb).
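As a usage sketch (the file and field names are invented), the SET commands go at the top of the focexec, ahead of the request being tuned; with XRETRIEVAL=OFF the request is translated and traced but no data is actually retrieved.

SET TRACEOFF=ALL
SET TRACEON=STMTRACE//CLIENT
SET TRACEON=SQLAGGR//CLIENT
SET TRACEUSER=ON
SET XRETRIEVAL=OFF
TABLE FILE ORDERS
SUM AMOUNT
BY CUSTOMER_ID
WHERE ORDER_DATE GE '20080101'
END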
PS: for everyone, IBI offers a very good relational efficiencies class that I would recommend.
I think the FOCUS/WebFOCUS and SQL/DB2 interface is very powerful. Once you know what's passed and what's not passed, and learn what you can and cannot code, you can do things efficiently in WF. At one client site, we used WF against DB2 files where we were joining 90 million rows to 60 million rows to 30 million rows, and we were able to make it efficient. That says it all to me!
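For anyone newer to this, the kind of request being described is plain WebFOCUS joins over the DB2 synonyms, which the adapter should translate into a single SQL join so DB2 does the heavy lifting; a minimal sketch with invented names:

JOIN CUSTOMER_ID IN ORDERS TO CUSTOMER_ID IN CUSTOMERS AS J1
END
TABLE FILE ORDERS
SUM ORDER_AMOUNT
BY CUSTOMER_NAME
WHERE ORDER_DATE GE '20080101'
END

With the traces above turned on, you would want to see one SELECT with the join and the aggregation handled by DB2.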
WebFOCUS 7.6.6/TomCat/Win2k3
Posts: 428 | Location: Springfield, MA | Registered: May 07, 2003