Focal Point
[CLOSED] foctemp (.ftm) file size

This topic can be found at:
https://forums.informationbuilders.com/eve/forums/a/tpc/f/7971057331/m/8441060722

May 08, 2007, 02:10 PM
jbmuir
[CLOSED] foctemp (.ftm) file size
Hello All,

I'm wondering what the maximum size of a foctemp (.ftm) file is when using WebFOCUS 7.1? There was a 2 gigabyte limit when we used FOCUS for Unix.

Thanks for your help.
-James



WF 7.1.6 moving to WF 7.7, Solaris 10, HTML,PDF,XL
May 08, 2007, 02:14 PM
jbmuir
Oh yeah, forgot to mention that we are running WF on Solaris 10.


WF 7.1.6 moving to WF 7.7, Solaris 10, HTML,PDF,XL
May 09, 2007, 08:36 AM
hammo1j
The limit on a .FTM file is the operating system limit, since it is a simple stream file.

The 2 gig limit used to refer to FOCSORT, which is a file in FOCUS format.
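For example, in a request along these lines (using the sample CAR file purely for illustration), the HOLD output written to disk is a plain sequential .ftm with no FOCUS-imposed ceiling, while the BY sort runs through the FOCSORT work file, which is the piece that lives in FOCUS format:

-* Illustration only, using the sample CAR file.
-* The .ftm written by ON TABLE HOLD is a simple stream file;
-* the sort work area FOCSORT used by the BY phrases is in FOCUS format.
TABLE FILE CAR
PRINT DEALER_COST RETAIL_COST
BY COUNTRY BY CAR BY MODEL
ON TABLE HOLD AS CARHOLD FORMAT ALPHA
END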



Server: WF 7.6.2 ( BID/Rcaster) Platform: W2003Server/IIS6/Tomcat/SQL Server repository Adapters: SQL Server 2000/Oracle 9.2
Desktop: Dev Studio 765/XP/Office 2003 Applications: IFS/Jobscope/Maximo
May 09, 2007, 09:36 PM
FortuneCookie
I stumbled across this Tech Doc the other day while searching for something else.

http://techsupport.informationbuilders.com/tech/cof/cof_database_limits.html

I think it'll cover any questions you have regardless of release and product type.


Prod: WebFOCUS 7.1.6, Windows 2003

Dev: WebFOCUS 7.6.2, Windows 2003
May 14, 2007, 03:24 PM
jbmuir
thanks hammo1j and FortuneCookie for the definitive answer.
-James


WF 7.1.6 moving to WF 7.7, Solaris 10, HTML,PDF,XL
February 05, 2009, 01:01 PM
susannah
jb
did you ever discover the max size of a .ftm file?
Mine have begun to exceed 2 gig and i'm getting a read error now... when trying to read the whole file.
i looked at the doc that Francesco found... and it listed only FOC or xfoc files...
i'm on a solaris/unix/oracle thing as well...
and having agita




In Focus since 1979///7706m/5 ;wintel 2008/64;OAM security; Oracle db, ///MRE/BID
February 05, 2009, 02:39 PM
Dave Ayers
Susannah,

quote:
i'm... and having agita


I'm so sorry, hope you get well soon. :)


Regards,
Dave

http://www.daveayers.com

WebFocus/Maintain 7.6.4-8
on Win2000 and 2003 Server
February 05, 2009, 03:51 PM
j.gross
quote:
Originally posted by susannah:
...Mine have begun to exceed 2gig and i'm getting a read error now...when trying to read the whole file.


If you mean read via TABLE: I believe Focus still has a 2 gig limit on the internal matrix -- a limitation related to the size limit for simple Focus files. So PRINT * on a flat file > 2 gig will lead to an internal matrix of similar size.

But you should still be able to do some reporting w/o bumping against the 2 gig ceiling -- summarization (SUM) and selection (IF/WHERE) can reduce the 'height' of the answer set, and eliminating columns from the request reduces the 'width'. And of course, if you can make do with TABLEF, there's no limit (the internal matrix just holds a single row).
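For instance, a request shaped something like this (file and field names are made up) keeps the answer set small, or sidesteps the matrix entirely with TABLEF:

-* Hypothetical names throughout.
-* Option 1: shrink the internal matrix with SUM, WHERE, and fewer columns.
TABLE FILE BIGEXTR
SUM AMOUNT
BY ACCOUNT
WHERE REGION EQ 'EAST'
ON TABLE HOLD AS SMALLANS
END
-* Option 2: TABLEF streams the records through a one-row internal matrix
-* (no sorting, so no BY unless the data is already in that order).
TABLEF FILE BIGEXTR
SUM AMOUNT CNT.AMOUNT
ON TABLE HOLD AS TOTALS
END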

Or did you mean something else?
February 05, 2009, 04:39 PM
susannah
hmmm...
thanks for offering to help...
i'm extracting a 2.5 gig flat file .ftm
when i TABLE it to build my XFOCUS files, the bottom bunch of records don't get read... i get an ERROR READING FILE filename
but the fex runs, and the XFOCUS files get created... just missing data.
so today, i extracted to a 2.5 gig ON TABLE HOLD FORMAT XFOCUS...
and the sucker worked...
when i TABLE it to build my XFOCUS files, all the records get read.
... so for the moment, my agita is gone, thanks Dave... but soon to come back since i don't know exactly what's going on... and the files are getting bigger, and will be 5 gig by year end.




In Focus since 1979///7706m/5 ;wintel 2008/64;OAM security; Oracle db, ///MRE/BID
February 05, 2009, 08:19 PM
Dave Ayers
Susannah,


Sounds like partition time for you...


Regards,
Dave

http://www.daveayers.com

WebFocus/Maintain 7.6.4-8
on Win2000 and 2003 Server
February 05, 2009, 09:20 PM
Waz
I've, from memory, had much larger than 2 gig[gle]s, in fact the program once chewed up 100 gig[gle]s.

I suspect that there is no limit other than, as John says, the operating system limit.

The fex should just keep reading lines from the file; it all depends on blowing a file pointer, I guess.


Waz...

Prod:WebFOCUS 7.6.10/8.1.04Upgrade:WebFOCUS 8.2.07OS:LinuxOutputs:HTML, PDF, Excel, PPT
In Focus since 1984
Pity the lost knowledge of an old programmer!

February 06, 2009, 06:37 AM
j.gross
1. You refer to these as .ftm files. Do they originate with a WF server?

2. I suppose there could be some integrity issue at the file-system level. To test that hypothesis beforehand, insert a quick scan to verify WF can read all the lines:

TABLEF FILE xxx (note TABLEF!)
WRITE CNT.FLD1 MAX.FLD1 MAX.FLD2 etc.
ON TABLE HOLD
END
(and check the line count against your source)

3. Divide and conquer: Partition the incoming files to reasonable size (at the source, or as a prelude step) -- something like the sketch below.
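A rough sketch of the source-side split (the synonym, fields, and key range are all made up; pick a split that fits your data):

-* Hypothetical Oracle synonym and fields.
-* Each pass pulls a slice small enough to TABLE comfortably later on.
TABLE FILE ORASYN
PRINT ACCOUNT AMOUNT TRANDATE
WHERE ACCOUNT LE '4999999'
ON TABLE HOLD AS PART1 FORMAT ALPHA
END
TABLE FILE ORASYN
PRINT ACCOUNT AMOUNT TRANDATE
WHERE ACCOUNT GT '4999999'
ON TABLE HOLD AS PART2 FORMAT ALPHA
END
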
February 06, 2009, 11:13 AM
susannah
yes, they originate with a WFRS...
it's a big extract from Oracle.
the extract is 2.1 gig
TABLE FILE reads that extract and creates XFOCUS databases.
The problem is, it's too big to read.
so now, i have to create my extract, not as ON TABLE HOLD {format alpha/binary} but rather as
ON TABLE HOLD FORMAT XFOCUS
it works...
i can read and process that file.
but i don't like it.
On a windows server, no issues.
On a unix server, issues. We now hypothesize that it's a unix opsys setting that limits file sizes... but you know what happens when you present that idea to anyone in a red hat...
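roughly, the difference between the two extract steps (names made up):

-* What breaks past ~2 gig on our unix box: a flat .ftm hold that we then TABLE.
TABLE FILE ORASYN
PRINT ACCOUNT AMOUNT TRANDATE
ON TABLE HOLD AS BIGFLAT FORMAT ALPHA
END
-* What works: hold the extract straight into XFOCUS format instead.
TABLE FILE ORASYN
PRINT ACCOUNT AMOUNT TRANDATE
ON TABLE HOLD AS BIGX FORMAT XFOCUS
END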




In Focus since 1979///7706m/5 ;wintel 2008/64;OAM security; Oracle db, ///MRE/BID
February 13, 2009, 11:03 AM
j.gross
Suggestion:

Chances are your resultant XFOCUS file is composed of innately hierarchical data. Design the master accordingly as a multiple-segment structure, and load it up one segment at a time, using a series of HOLD files pulled from the 2.1GB monster flat extract. That tends to cut down on the width (fields) and/or depth (rows) of each HOLD, so you avoid flirting with the 2GB internal matrix limitation of TABLE.

If you want to avoid cataloging the master, you can create it on the fly --- do a HOLD FORMAT XFOCUS with multiple verbs to create the master (with IFs to ensure just a few records), then use CREATE to wash away that data --- and proceed as above.
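A bare-bones sketch of that on-the-fly variant (every name and field here is made up; shape the verb sets around your real hierarchy):

-* The multi-verb HOLD in XFOCUS format generates a two-segment master;
-* IF RECORDLIMIT keeps the throwaway sample data tiny.
TABLE FILE BIGEXTR
SUM FST.CUSTNAME
BY ACCOUNT
SUM AMOUNT
BY ACCOUNT BY TRANDATE
IF RECORDLIMIT EQ 10
ON TABLE HOLD AS MYXFOC FORMAT XFOCUS
END
-* CREATE washes away the sample data, leaving an empty structure
-* ready to be loaded one segment at a time from the smaller HOLD files.
CREATE FILE MYXFOC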

I'll be happy to discuss particulars offline.

-Jack
February 13, 2009, 11:54 AM
GamP
What I remember of my unix days - long ago and far away... :
Limiting file sizes on unix is done with the 'ulimit -f' setting.
To see what the current value is, use 'ulimit -f' for the user setting and 'ulimit -fH' for the OS-enforced setting.


GamP

- Using AS 8.2.01 on Windows 10 - IE11.
in Focus since 1988
February 13, 2009, 02:20 PM
susannah
i asked Ginny and she says, similar to what you said, GamP, that there is an
"Enhanced JFS", where JFS is Journal File System.

i just tried what you suggested, GamP, and got
'unlimited' as the result...
i tried ulimit -fH and got 'invalid number', so the H is probably different now.
but the ulimit -f gave a very interesting answer...
so now i'm thinking it's not unix...




In Focus since 1979///7706m/5 ;wintel 2008/64;OAM security; Oracle db, ///MRE/BID