I have a report that reads from a huge database (it takes a minute or more to generate the report) and has 8 different calculated columns.
The user wants to be able to click on any column title and have the report sorted by that column.
Of course I could put drill-downs on each column title and call the same report each time with a different sort parameter, but... as you see, it takes minutes to get the data each time...
My question: is there any way to keep the hold file, or some other way to keep my data somewhere, so that on each click I just need to sort it differently? If so, please someone HELP!
We were looking into something like this as well... but since each call is completely independent of the others, it could be an issue. One possible solution is to make an index table with the information the report would contain, run it once to get the data, and have the users just make calls to that table.
Another way... but this is more theory than anything; I'm not sure it would work completely.
If you have WebFOCUS 5.2.1 or later... right before you hold the table:
APP HOLD FOLDER
-* FOLDER = a folder under ibi\apps\ if you didn't change the default location
TABLE FILE TABLENAME
PRINT ...
ON TABLE HOLD AS 'TEMPNAME'
END
Then display the table as you would for this request.
That would actually save the hold file in that folder. You would then have to check to see if the file is there, FILEDEF the filename, and make the call to that table. One thing I don't know is how or when to delete the temp table... or else every time someone ran the request they would get that table until someone deleted it from that folder, with possible inconsistency from the bigger database. Something to look into if you want.
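For what it's worth, a minimal, untested sketch of that sequence (the sample CAR file stands in for the real table, and FOLDER stands for an application directory under ibi\apps that already exists):

-* First request: save the expensive result in a permanent app folder
APP HOLD FOLDER
TABLE FILE CAR
SUM SALES DEALER_COST
BY COUNTRY BY CAR
ON TABLE HOLD AS TEMPNAME
END
-RUN
-* Later requests: put that folder on the search path and read the
-* hold file instead of the big database
APP PREPENDPATH FOLDER
TABLE FILE TEMPNAME
SUM SALES DEALER_COST
BY CAR
END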
I see your point about "APP HOLD FOLDER" (even if I have never used that logic yet)... still, if I hold a temp file, I wouldn't know when to delete it, and it might create data inconsistency...
You mention checking to see if the file is there... this has nothing to do with the topic, but I would really like to know how to do that, meaning how to check whether any file or folder exists...
How often is the bigger database updated? If it's daily or something, maybe have the server clean that folder (if that's even possible); then every day the first request would take a minute and the rest of the requests would use that first table.
If it's changed often... you were saying the table would need to be cleaned up on exit of that screen... you could then, on exit of that screen (assuming it's a browser), override the exit to run a little program that deletes it.
Most of this is theory... I haven't applied all of it in a working situation.
As for checking to see if a file exists... I remember doing it, but that was a few years back... I'd have to get back to you on that one; I don't remember offhand.
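One way that turns up later in this thread (in the "mini-datamarts" post) is to TABLE the file with a RECORDLIMIT of 1 and test &RECORDS afterwards. A minimal, untested sketch, assuming the file's synonym can be found on the application path (if it cannot, the request raises a FOCUS error rather than simply returning zero records):

APP PREPENDPATH FOLDER
TABLE FILE TEMPNAME
PRINT *
WHERE RECORDLIMIT EQ 1
ON TABLE HOLD AS CHKTEMP
END
-RUN
-IF &RECORDS GT 0 GOTO HAVEFILE;
-TYPE TEMPNAME is empty - rebuild it here
-GOTO DONECHK
-HAVEFILE
-TYPE TEMPNAME has data - reuse it
-DONECHK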
Here's an example using the MS Tabular Data Control. Put something like this script in an html file:

var tdcobj = document.all.studentgrades
var sortnamestring = "-name"
var sortgradestring = "-grade"

function sortbyname() {
  sortnamestring = (sortnamestring != "+name") ? "+name" : "-name"
  tdcobj.Sort = sortnamestring
  tdcobj.reset()
}

function sortbygrade() {
  sortgradestring = (sortgradestring != "+grade") ? "+grade" : "-grade"
  tdcobj.Sort = sortgradestring
  tdcobj.reset()
}
Put the following in a text file called studentgrades.txt in the same directory as the html file:
name|grade
~George Chiang~|~83%~
~Bill Larson~|~69%~
~Jimmy Lin~|~94%~
~Mary Miller~|~59%~
~Jane Wood~|~89%~
~Terry Gray~|~72%~
~Andrew Dart~|~82%~

Open the html file in IE 4 or later. You can click on the column titles and it will resort.
I haven't applied this to WebFOCUS yet, but it shouldn't be too hard.
Another solution would be to hold the report as an Excel spreadsheet so the user can sort the columns.
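For example, a minimal sketch of that option using the sample CAR file (PCHOLD FORMAT EXL2K assumes a release where the Excel 2000 output format is available):

TABLE FILE CAR
SUM SALES DEALER_COST
BY COUNTRY
BY CAR
ON TABLE PCHOLD FORMAT EXL2K
END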
Cheers,
Pietro.
I think in this situation you would still need the actual file somewhere... because a hold file created in a request is deleted immediately after the request finishes. Perhaps hold it in format ALPHA to a folder with the APP HOLD feature and then use the technique provided by Pietro? Would that not solve the situation? Every time someone new ran the request it would overwrite the file... but then, would that cause issues when multiple users are accessing the file?
Not sure anymore... the file would almost have to be stored client side to ensure every user has their own copy of the file. Leo
This seems to be a perfect example of a MAINTAIN application, where MAINTAIN maintains the stack and lets you resort. I don't have MAINTAIN, sadly, but I have seen it demo'd at NY FUSE, in an app written for Hofstra University, and it performed exactly the sorts you want. I perform my column resorts with drill-downs on the column titles, the icky way. Pietro's way is so so so cool.

You could FILEDEF your temporary data file right in the agent, and then the server-side include #studentgrades could reference it. You might want to use:

-SET &MYPATH = TEMPPATH(40,'A40');
-SET &MYFD = &MYPATH || &MYOUTFILE ;
-* the || doesn't seem to nuke the trailing blanks here, so you have to
-SET &MYFD1 = SQUEEZ(80,&MYFD,'A80');
FILEDEF &MYOUTFILE DISK &MYFD1

and you should have your data file grabbable. (Those || are pipes, i.e. concatenation characters; double concats are supposed to nuke trailing blanks, but we find that in this case they don't, hence the SQUEEZ.)

But... I can't see how the agent won't just drop you as soon as you launch the asp page, so couldn't you just create your extract file in a batch job, and let all the users hit the extract instead of the source files, shortening the reporting time?
Here's a FEX that uses the MS Tabular Data Control concept. Give it a try.
The hold file has to be created in a directory that the webserver can address, which means the data server and web server must be on the same machine (I think).
Cheers,
Pietro.
-*-- Tabular Data Control example --------------------------
-SET &ECHO=ALL;
-*-- Set the directory of the hold file to a directory
-*   that can be addressed by the webserver
-SET &TEMPDIR = 'E:\ibi\apps\gemini\';
-*-- Allocate the hold file to be used by TDC --------------
I'm just curious about having multiple users. If you need it stored somewhere, then you're saving it on the server just like the APP HOLD technique. The issue being: say someone runs the report, and during the minute of processing, someone else executes the same request. As the file is being stored, the new one would overwrite it? Then what happens to the first guy, who thinks the file is complete? I guess if you just APP HOLD the last table, it would be done quickly, and odds are it won't be an issue. Just something to keep in mind in case things go wrong from time to time.
I get around this by using the random number generator function and using the number as part of the name for the hold file. This will keep the files from colliding when you have multiple users.
-SET &RAND=ABS(RDUNIF('D9.7'))*1000000;
Just make sure you retain the random name of the hold file from request to request.
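For what it's worth, a minimal, untested sketch of how the random name might hang together (CAR stands in for the real database, and WFTEMP for a permanent application folder like the one created with APP CREATE further down the thread):

-* Build a hold file name from the random value above.
-* NOTE: in real code you would trim &RAND to a whole number first so the
-* resulting name is filename-safe; that detail is omitted here.
-SET &HOLDNAME = 'HOLD' || &RAND;
APP HOLD WFTEMP
TABLE FILE CAR
SUM SALES DEALER_COST
BY COUNTRY BY CAR
ON TABLE HOLD AS &HOLDNAME
END
-RUN
-* &HOLDNAME is then passed to each re-sort request (for example as a
-* drill-down parameter) so the re-sorts read this file instead of the
-* source database.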
Unless I am misinterpreting the code from Pietro, he is bringing in the time of day as part of the file name which would give the file a unique name solving the problem as well. (True?)
Yes, I'm putting the date and time of day in the file name. Remember, this is only a proof of concept; I wasn't taking into account that a pile of hold files would be created from running this over and over. I wanted to use the WebFOCUS TEMPDIR directory but couldn't address it as a URL. Of course, one could add an alias for the TEMPDIR directory to the webserver.
I was hoping to do the following:
-*-- Keep all temporary files created ---
SET TEMPERASE = OFF
-RUN
-*-- Determine the path of the temporary directory (EDATEMP) ---
-SET &TEMPDIR = TEMPPATH(60,'A60');
-*-- Allocate the hold file to be used by TDC ---
-SET &HOLDCAR = &TEMPDIR || 'HOLDCAR.TXT';
One of the TEMPDIRs on our server is D:\ibi\srv52\wfs\edatemp\ts000036\
There's one directory per agent, so there's never a problem with multiple users running a request at exactly the same time.
-*-- Keep all temporary files created ---
SET TEMPERASE = OFF
This seems to work contrary to what my 5.2 documentation says. The documentation says OFF to delete, ON to keep. Actually, it's OFF to keep, ON to delete.
I am on the West Coast, in Los Angeles, so by the time I get to work and log on to the Forum, you guys have already had a great session of thoughts and suggestions :-) Thank you all...
It makes me feel so bad to see how little of WebFOCUS I know :-(
Just to let you know, my report is financial/statistical and contains critical data. The users are people who are very knowledgeable about every penny displayed on this report... they choose the date range and other criteria, then click Run, which generates the report...
My first concern: there are at least 100 users accessing this report at any time, and generating these temp files, even with a time stamp, wouldn't make each file unique, as there may be at least 2 users clicking the Run button at the same time with different date ranges and criteria... so if the temp file is being stored on the server, how do we make sure the correct user views the appropriate report????
Second concern: how do we clean up after them, with all these thousands of reports daily????
What about holding the report as an Excel spreadsheet? Then you wouldn't have to worry about temporary files, and your users could use Excel to sort their columns.
I do offer 3 output formats: HTML, Excel and PDF... Unfortunately, the HTML version is the one the users like the most, even though, as you know, it's the exact same report, just with a different output format...
For the first concern, you create the name of the file in the call... therefore, when you do the sort by clicking a column title, you can send along the filename to be used, as sketched below. If it's blank, then you do the full one-minute call (which is also what happens when you run the report for the first time).
I know this for WebFOCUS 5.2.1, and I would assume later releases... if you do the APP HOLD directly before the one TABLE request you need, that hold file would be the only one stored. Not sure how TEMPERASE works... haven't really used it, but I would assume it works the same way.
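Pulling those pieces together, a minimal, untested sketch of the first-run-versus-resort logic (CAR stands in for the big database; &HOLDNAME and &SORTFLD are illustrative parameters that the column-title drill-downs would pass back; WFTEMP is assumed to be an existing application folder like the one created with APP CREATE below):

-DEFAULT &HOLDNAME = ' ';
-DEFAULT &SORTFLD = 'COUNTRY';
-IF &HOLDNAME NE ' ' GOTO RESORT;
-* First run: the expensive query, held where later requests can find it
APP HOLD WFTEMP
TABLE FILE CAR
SUM SALES DEALER_COST
BY COUNTRY BY CAR
ON TABLE HOLD AS BIGHOLD
END
-RUN
-SET &HOLDNAME = 'BIGHOLD';
-RESORT
-* Re-sorts read only the hold file, so they come back in seconds
APP PREPENDPATH WFTEMP
TABLE FILE &HOLDNAME
SUM SALES DEALER_COST
BY &SORTFLD
END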
How about something like the code below? This will create a temporary directory specific to each user that can persist between WebFOCUS requests. It does require that you can uniquely identify a user in every WebFOCUS request. If you can't, you may be able to find some other value to use instead of a user ID.
-* This is in the initial request that creates
-* the primary HOLD file.
-SET &USERID = GETUSER('A8');
APP MAP WFTEMP C:\tempfiles\&USERID
APP DELETE WFTEMP
APP CREATE WFTEMP
APP HOLD WFTEMP

-* This appears in subsequent requests
-SET &USERID = GETUSER('A8');
APP MAP WFTEMP C:\tempfiles\&USERID
APP PREPENDPATH WFTEMP
Yet another idea - something like "mini-datamarts on demand". The "skeleton" of the idea looks as follows:
-* Report parameters (entered via the launch form):
-* 1. "Datamart" parameters:
-*    ...
-*    &FILTER1 - for filtering the source database
-*    &AGGR1   - field for aggregating data
-*    ...
-* 2. "Other" parameters:
-*    ...
-*    &FILTER2 - filter for obtaining the final result
-*    &OUTFMT  - output format
-*    ...
-*------------------------------------------------
-* Set the datamart name, based on parameter values:
-SET &DATAMART = &FILTER1 || &AGGR1;
-*------------------------------------------------
-* Check if the datamart exists:
TABLE FILE &DATAMART
PRINT *
WHERE RECORDLIMIT EQ 1
END
-RUN
-* If it exists, go to the final report.
-IF &RECORDS GT 0 THEN GOTO MEXIST;
-*------------------------------------------------
-* Here is the very sophisticated query (it could be very-fine-tuned SQL PASSTHROUGH):
TABLE FILE VERY_LARGE_DATABASE
SUM ...
BY &AGGR1 BY ...
WHERE &FILTER1 AND ...
ON TABLE HOLD AS TMP
END
-RUN
-* Data transformations and calculations:
DEFINE FILE TMP
...
END
-* The datamart will be held permanently:
APP HOLD DATAMARTS
TABLE FILE TMP
...
ON TABLE HOLD AS &DATAMART
END
-RUN
-* NOTE: The &DATAMART is not a large table;
-* it still contains more data than we need for this report,
-* but it can be used for other reports.
-*------------------------------------------------
-* Generating the final report:
-MEXIST
TABLE FILE &DATAMART
SUM ...
BY ...
WHERE &FILTER2 ...
ON TABLE SET ONLINE-FMT &OUTFMT
END
If the datamart for the particular parameter combination already exists, the report is produced very quickly (no query to the source database, no data transformation, no calculated fields). The query is performed only once, when that combination of parameters is entered for the very first time.
The datamart is available to all (authorized) users, and may be useful for several types of reports.
For the "datamart parameters" we should chose those which are good aggregators and filters in the source database.
There is a limitation for the FOCUS FILE NAME length (about 64 in 5.2).
The technique could result in a very large number of files, thus decreasing manageability.
Yes, it demands some disk space.