Focal Point    Focal Point Forums  Hop To Forum Categories  WebFOCUS/FOCUS Forum on Focal Point     [CLOSED] Space Abend on Mainframe Focus Datasets
[CLOSED] Space Abend on Mainframe Focus Datasets
Member
posted
Hello Guys,

My query concerns the maximum amount of data a mainframe FOCUS dataset can hold. This FOCUS dataset is NOT defined across multiple volumes.

The problem is that once a certain number of segments have been inserted into this dataset, we get a fatal error while writing to the file:
(FOC198) FATAL ERROR IN DATABASE I/O. FOCUS TERMINATING ERROR WRITING
BCCBSPL2, PAGE 188971, CODE 0xaa000003


We then perform a REBUILD to split the data from the abended dataset into a new FOCUS dataset on the key field, and put the new dataset into use.

The cylinder allocation for this dataset is as follows:
Record format . . . : F
Record length . . . : 16384
Block size . . . . : 16384
1st extent cylinders: 628
Secondary cylinders: 350

Current Allocation
Allocated cylinders: 4,200
Allocated extents. : 5

Current Utilization
Used cylinders . . : 4,200
Used extents . . . : 5
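For scale, the allocation above can be turned into an approximate byte count. A minimal sketch, assuming IBM 3390 geometry (15 tracks per cylinder) and half-track blocking (3 blocks of 16,384 bytes per 56,664-byte track); neither figure is stated in the post:

```python
# Rough size estimate for the allocation shown above.
# Assumptions (not stated in the post): 3390 DASD geometry,
# with 3 blocks of 16,384 bytes fitting on each track.
TRACKS_PER_CYL = 15
BLOCKS_PER_TRACK = 3
BLKSIZE = 16_384

def cyls_to_bytes(cyls: int) -> int:
    """Approximate usable bytes held by `cyls` cylinders."""
    return cyls * TRACKS_PER_CYL * BLOCKS_PER_TRACK * BLKSIZE

used = cyls_to_bytes(4_200)
print(f"{used:,} bytes (~{used / 2**30:.2f} GiB)")  # ~2.88 GiB
```

At 4,200 used cylinders that works out to roughly 3.1 billion bytes, already beyond the classic 2 GB single-file ceiling raised later in the thread.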

I found that the multi-volume concept can reduce the occurrence of this space abend.

Is there any method that can eliminate this space abend permanently?



WebFOCUS 7.6
Windows, All Outputs

 
Posts: 5 | Registered: March 13, 2012
Expert
posted
I haven't been on the mainframe in quite a while, but I'd guess you have hit the size limit of the FOCUS database.

How big is the FOCUS file in bytes? Is it about 2 GB?


Waz...

Prod: WebFOCUS 7.6.10/8.1.04 | Upgrade: WebFOCUS 8.2.06 | OS: Linux | Outputs: HTML, PDF, Excel, PPT
In Focus since 1984
Know The Code

 
Posts: 6118 | Location: 33.8688° S, 151.2093° E | Registered: October 31, 2006
Virtuoso
posted
Can you post the master? There may be ways to re-segment the file to use storage more efficiently.

Using XFOCUS rather than FOCUS might also help.
 
Posts: 1925 | Location: NYC | In FOCUS since 1983 | Registered: January 11, 2005
Expert
posted
I'm getting confused in my old age. XFOCUS is a licensed product, isn't it?


Waz...

Prod: WebFOCUS 7.6.10/8.1.04 | Upgrade: WebFOCUS 8.2.06 | OS: Linux | Outputs: HTML, PDF, Excel, PPT
In Focus since 1984
Know The Code

 
Posts: 6118 | Location: 33.8688° S, 151.2093° E | Registered: October 31, 2006
Member
posted
Thanks, all, for your replies.

Sorry to say, but you did not get a clear picture of my problem, so I am describing it again in more detail.

We have a mainframe batch job (runs daily) that takes input from a sequential file containing 10 billion records. This FOCUS batch job processes all the records and loads them into a FOCUS database dataset. Every month or two it fails with 'FATAL ERROR IN DATABASE I/O. FOCUS TERMINATING ERROR WRITING'. When this error occurs, we create a new FOCUS database dataset with the allocation mentioned in my first post and process the input data against the new dataset. But the error occurs again after a month or two, and we repeat the step of creating a new dataset.

Previously we were not using the multi-volume concept with our FOCUS datasets. Now I have implemented multi-volume allocation for all new FOCUS datasets. Surely it will help to reduce the 'ERROR WRITING' problem.

So my queries are:
What is the optimal primary and secondary cylinder allocation for a FOCUS dataset, so that we can perform the REBUILD process without any problem?

Or is there any method that can be used to remove this problem permanently?
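As a back-of-envelope answer to the allocation question, one can solve for the largest cylinder count that stays under a 2 GB single-dataset ceiling. A sketch under assumptions only (3390 geometry, 3 x 16,384-byte blocks per track, and the classic 2 GB FOCUS file limit suggested earlier in the thread); none of these figures come from the post itself:

```python
# Largest allocation that stays under a 2 GiB ceiling, assuming
# 3390 geometry (15 tracks/cyl) and 3 x 16,384-byte blocks per track.
# The 2 GiB figure is the classic single-file limit suggested earlier
# in the thread; treat every number here as an assumption.
LIMIT = 2**31                       # 2 GiB
BYTES_PER_CYL = 15 * 3 * 16_384     # 737,280 usable bytes per cylinder

max_cyls = LIMIT // BYTES_PER_CYL
print(max_cyls)  # 2912
```

If such a ceiling applies, it suggests why simply enlarging the allocation cannot remove the abend permanently: past the file's internal limit, no amount of extra DASD helps, and growth has to be capped instead.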

Thanks
Amit Panwar


WebFOCUS 7.6
Windows, All Outputs

 
Posts: 5 | Registered: March 13, 2012
Virtuoso
posted
It may depend on what "processing" you are doing. If the MODIFY (or MAINTAIN) program deletes segment instances ("records") and adds others, the space occupied by the deleted instances generally will not be reclaimed, so it becomes dead space and the file expands in size, even when the net number of segments remains relatively steady. If that is the case, a periodic REBUILD might solve the problem.
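The growth mechanism described above can be illustrated with a toy model (not FOCUS internals): if deleted instances leave dead space and every insert extends the file, disk consumption tracks total inserts, not the live record count.

```python
RECORDS_PER_PAGE = 100  # arbitrary illustrative density, not a FOCUS figure

def pages_after(days: int, churn_per_day: int) -> int:
    """Pages consumed when inserts always extend the file and deletes
    never reclaim space: growth follows total inserts, not net size."""
    total_inserted = days * churn_per_day
    return -(-total_inserted // RECORDS_PER_PAGE)  # ceiling division

# A year of adding and deleting 1,000 records per day: the net record
# count stays flat, yet the file grows by all 365,000 inserts.
print(pages_after(365, 1_000))  # 3650 pages, mostly dead space
```

A periodic REBUILD compacts the live instances back into contiguous pages and resets this growth, which is why it is suggested above as a fix.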

A lot also depends on your Focus file design. You may be able to achieve significant improvement in storage efficiency by redesigning the segment structure.
 
Posts: 1925 | Location: NYC | In FOCUS since 1983 | Registered: January 11, 2005
Master
posted
quote:
Or is there any method that can be used to remove this problem permanently?

I've seen the situation you find yourself in before.

It sounds like you need to build a purge routine that runs periodically, replacing what you are now doing manually.

Build/discover a business rule that goes something like: Delete all database records older than one year.

Once you do that, you can then build your FOCUS purge routine.

Pseudocode:
TABLE FILE name
PRINT ..
IF DATEOFDATA LT date
ON TABLE HOLD..
END

MODIFY FILE name
FIXFORM..
MATCH..
ON MATCH DELETE
..
DATA ON HOLD..
END


You would add/build database backups, rebuilds, and audit trails, as needed.




Test: WebFOCUS 8.1.05M | Prod: WebFOCUS 8.1.05M | Server: Windows Server 2012/Tomcat Standalone | Workstation: Windows 7/IE 11 | Database: Oracle 12c/Netezza | Output: AHTML/XLSX/HTML/PDF/JSCHART | Tools: WFDS, Repository Content, BI Portal Designer, & ReportCaster

 
Posts: 765 | Registered: April 23, 2003

Copyright © 1996-2018 Information Builders, leaders in enterprise business intelligence.