I am trying to burst a report. I want to hold the output of my main fex under a dynamic name, and then use that dynamically named HOLD file in the Dynamic List file for report bursting. My main fex:
-SET &ECHO = ALL;
APP HOLD IBISAMP
-SET &HOLD = &BURST_VAL || 'REPORT';
TABLE FILE CAR
PRINT COUNTRY CAR SALES
COMPUTE DEST/A20 = COUNTRY | '.xls';
ON TABLE HOLD AS &HOLD FORMAT FOCUS
END
TABLE FILE &HOLD
PRINT COUNTRY CAR SALES DEST
BY COUNTRY NOPRINT
END
-GOTO ENDCOM
-ENDCOM
My Dynamic List file:
APP HOLD IBISAMP
-SET &HOLD = &BURST_VAL || 'REPORT';
TABLE FILE &HOLD
ON TABLE SET ASNAMES ON
PRINT COUNTRY AS 'VALUE' DEST
ON TABLE PCHOLD
END
But I am getting this error:
Error retrieving records for distribution list: webfocus_test/dyn_file
A VALUE IS MISSING FOR: &BURST_VAL
The value for &BURST_VAL is stored in the BOTPARMS table for this schedule. &BURST_VAL is recognized in my main fex, but it is not recognized in the dynamic file. Is there anything I am missing here?
WF8103 -UNIX,HTML,EXCEL,PDF.
July 23, 2009, 06:45 AM
GamP
How is the dynamic file called? Is it a separate schedule?
GamP
- Using AS 8.2.01 on Windows 10 - IE11.
in Focus since 1988
July 23, 2009, 07:54 AM
Poongs
No, it is in the same schedule. I enabled the Burst option in ReportCaster and specified my dynamic file as the Dynamic List under Distribution Information.
WF8103 -UNIX,HTML,EXCEL,PDF.
July 24, 2009, 10:17 AM
Gerry
The values stored in BOTPARMS for parameters are only available to the scheduled procedure. They are not available to a procedure creating a Dynamic Distribution List.
You could submit a New Feature Request for this enhancement.
In the meantime, you could have your scheduled procedure write the value of the parameter to a location where it is available to the procedure creating the Dynamic List.
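A minimal sketch of that workaround, assuming the server has a writable path such as /tmp/burstval.ftm (the FILEDEF name and path are placeholders; adjust them for your environment). The scheduled fex writes the value with -WRITE:

```focus
-* In the scheduled (main) fex: save the burst value to a server file
-* so that other procedures can read it later.
FILEDEF BURSTVAL DISK /tmp/burstval.ftm
-RUN
-WRITE BURSTVAL &BURST_VAL
```

and the Dynamic List fex reads it back with -READ instead of expecting &BURST_VAL from BOTPARMS:

```focus
-* In the Dynamic List fex: read the value back from the file.
FILEDEF BURSTVAL DISK /tmp/burstval.ftm
-RUN
-READ BURSTVAL &BURST_VAL.A12.
-SET &HOLD = &BURST_VAL || 'REPORT';
```

Note that -READ with a fixed A12 format may pick up trailing blanks; the strong concatenation operator || strips them, so the resulting &HOLD name stays clean.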
WebFOCUS All Releases
July 27, 2009, 01:38 AM
Poongs
Thanks Gerry,
My problem here is that before my first schedule completes its bursting, my second schedule overwrites the existing HOLD file. Imagine my first schedule is handling a huge amount of data and the second one much less.
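One hedged way to avoid that collision is to make each schedule hold under a name that no other schedule can produce. This sketch assumes each schedule can pass its own distinguishing parameter, here a hypothetical &SCHED_ID set differently per schedule:

```focus
-* &SCHED_ID is a hypothetical per-schedule parameter; each schedule
-* would supply a different value so concurrent runs never share a file.
-DEFAULT &SCHED_ID = 'SCH1';
APP HOLD IBISAMP
-SET &HOLD = &BURST_VAL || &SCHED_ID || 'REPORT';
TABLE FILE CAR
PRINT COUNTRY CAR SALES
COMPUTE DEST/A20 = COUNTRY | '.xls';
ON TABLE HOLD AS &HOLD FORMAT FOCUS
END
```

The Dynamic List fex would build the same &HOLD name from the same parameters, so each schedule reads only its own HOLD file.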