My query concerns the maximum amount of data a mainframe FOCUS dataset can hold. This FOCUS dataset is NOT defined with multiple volumes.
The problem is that once a certain number of segments have been inserted into the dataset, FOCUS gets a fatal error while writing to the file: (FOC198) FATAL ERROR IN DATABASE I/O. FOCUS TERMINATING ERROR WRITING BCCBSPL2, PAGE 188971, CODE 0xaa000003
We then run the REBUILD process to split the data from the abended dataset into a new FOCUS dataset on the key field, and put the new dataset into use.
The CYLS allocation for these datasets is as follows:

  Record format . . . : F
  Record length . . . : 16384
  Block size  . . . . : 16384
  1st extent cylinders: 628
  Secondary cylinders : 350

  Current Allocation
  Allocated cylinders : 4,200
  Allocated extents . : 5

  Current Utilization
  Used cylinders  . . : 4,200
  Used extents  . . . : 5
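For what it's worth, the failing page number lines up almost exactly with the end of that allocation. As a rough check, assuming standard 3390 DASD geometry (15 tracks per cylinder, and 3 blocks of 16,384 bytes per track):

  4,200 cyl x 15 trk/cyl x 3 pages/trk = 189,000 pages

The abend reports an error writing page 188,971, within a few dozen pages of that ceiling, which suggests the file simply filled every cylinder it could get on the volume rather than hitting an internal FOCUS limit.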
I have found that the multi-volume concept can reduce the occurrence of this space abend.
Is there any method that can remove this space ABEND permanently?
Sorry to say, but you did not get a clear picture of my problem, so I am describing it again in more detail.
Here is what actually happens: we have a mainframe batch job (runs daily) that takes its input from a sequential file holding 10 billion records. This FOCUS batch job processes all the records and loads them into a FOCUS database dataset. After every month or two it fails with 'FATAL ERROR IN DATABASE I/O. FOCUS TERMINATING ERROR WRITING'. When this error occurs, we create a new FOCUS database dataset with the allocation mentioned in my first post and process the input data against the new dataset. But the error occurs again after a month or two, and we repeat the step of creating a new dataset.
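Conceptually the daily job looks something like the sketch below (the dataset names, ddnames, and transaction fields here are invented for illustration, not our actual job):

  //DAILYLD  EXEC PGM=FOCUS
  //* FOCUS database being loaded (hypothetical DSN)
  //BCCBSPL2 DD DSN=PROD.FOCUS.BCCBSPL2,DISP=SHR
  //* daily sequential transaction input (hypothetical DSN)
  //TXNIN    DD DSN=PROD.DAILY.TXNFILE,DISP=SHR
  //SYSIN    DD *
   MODIFY FILE BCCBSPL2
   FIXFORM ACCT_ID/A10 TXN_AMT/P11.2
   MATCH ACCT_ID
    ON MATCH UPDATE TXN_AMT
    ON NOMATCH INCLUDE
   DATA ON TXNIN
   END
   FIN
  /*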
Previously we were not using the multi-volume concept with our FOCUS datasets. I have now implemented multi-volume allocation for all new FOCUS datasets, which should certainly help reduce the 'ERROR WRITING' problem.
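For reference, the multi-volume allocation is along these lines (a sketch only; the DSN, unit name, and volume count are site-specific assumptions):

  //ALLOC    EXEC PGM=IEFBR14
  //* VOL=(,,,3) requests a volume count of 3, so secondary
  //* extents can spill onto additional volumes when the
  //* current volume runs out of space
  //NEWFOC   DD DSN=PROD.FOCUS.BCCBSPL3,
  //            DISP=(NEW,CATLG,DELETE),
  //            UNIT=SYSDA,VOL=(,,,3),
  //            SPACE=(CYL,(628,350)),
  //            DCB=(RECFM=F,LRECL=16384,BLKSIZE=16384)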
So my queries are:
What is the optimal primary and secondary cylinder allocation for a FOCUS dataset so that we can perform the REBUILD process without any problem?
Or is there any method that can be used to remove this problem permanently?
It may depend on what "processing" you are doing. If the MODIFY (or MAINTAIN) program deletes segment instances ("records") and adds others, the space occupied by the deleted instances generally will not be reclaimed; it becomes dead space, and the file expands in size even when the net number of segments remains relatively steady. If that is the case, a periodic REBUILD might solve the problem.
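In batch, that periodic REBUILD can be run with the REORG option, which dumps the surviving segment instances to a flat file and then reloads them into a freshly initialized file, squeezing out the dead space. A rough sketch (the DSNs and the dump ddname are assumptions, and the exact sequence of answers FOCUS expects after REBUILD varies by release, so verify it against your site's REBUILD documentation):

  //REORGDMP EXEC PGM=FOCUS
  //* the abended FOCUS database (hypothetical DSN)
  //BCCBSPL2 DD DSN=PROD.FOCUS.BCCBSPL2,DISP=SHR
  //* flat dump file to receive the live data (hypothetical DSN)
  //REBUILD  DD DSN=PROD.FOCUS.REORG.DUMP,
  //            DISP=(NEW,CATLG,DELETE),
  //            UNIT=SYSDA,SPACE=(CYL,(4200,350),RLSE)
  //SYSIN    DD *
   REBUILD
   REORG
   DUMP
   BCCBSPL2
   FIN
  /*

A second step then allocates a fresh (ideally multi-volume) FOCUS file and runs REBUILD/REORG again with the LOAD answer against the dump file.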
A lot also depends on your FOCUS file design. You may be able to achieve a significant improvement in storage efficiency by redesigning the segment structure.
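As a purely hypothetical illustration (none of these field or segment names are from your file): moving repeating detail out of a wide root segment into a child segment keeps each root instance small, so far more of them fit on every 16K page:

  FILENAME=BCCBSPL2, SUFFIX=FOC,$
  SEGNAME=ACCOUNT, SEGTYPE=S1,$
   FIELDNAME=ACCT_ID,  ALIAS=AID, FORMAT=A10,   FIELDTYPE=I,$
   FIELDNAME=BALANCE,  ALIAS=BAL, FORMAT=P17.2,$
  SEGNAME=HISTORY, SEGTYPE=S1, PARENT=ACCOUNT,$
   FIELDNAME=TXN_DATE, ALIAS=TDT, FORMAT=I6YMD,$
   FIELDNAME=TXN_AMT,  ALIAS=TAM, FORMAT=P11.2,$

With the daily transactions landing in the HISTORY child rather than being repeated across root instances, the same data can occupy noticeably fewer pages.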