We have a very large dataset, almost 7-8 million records. As part of my requirement, in the worst case I need to load all of the records into a single Excel or PDF file. Is that possible without the output breaking or stopping?
My 2 cents: You might want to ask what the end user actually wants to do with so much data. Post-processing is probably the answer. The question is whether you can do that for them and reduce the output volume.
...
If you save the output in a flat file (.txt or .csv) it will be one long, unbroken file.
But it's not possible to open it in Excel anyway (Excel doesn't support that many rows).
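The flat-file route works precisely because you can stream rows out one at a time instead of building the whole report in memory. A minimal sketch of that idea in Python, assuming a hypothetical `fetch_rows()` generator standing in for whatever cursor or extract your reporting server provides:

```python
import csv

def fetch_rows():
    # Placeholder generator standing in for a real database cursor;
    # in practice this would stream rows from the reporting server.
    for i in range(5):
        yield (i, f"record-{i}")

def export_csv(path, rows):
    """Write rows to a .csv flat file one at a time (constant memory)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "value"])  # header row
        for row in rows:                  # never holds all rows at once
            writer.writerow(row)

export_csv("extract.csv", fetch_rows())
```

Because nothing is buffered, the same loop handles 5 rows or 8 million; the limiting factors become disk space and transfer time, not memory.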
Good luck, Dave
_____________________ WF: 8.0.0.9 > going 8.2.0.5
Posts: 668 | Location: Veghel, The Netherlands | Registered: February 16, 2010
First, it is not advisable. Second, in Excel it is not possible. Third, even if PDF allows it, think about the size of the file and consider whether the end user's system has the resources to handle such a huge file. You may also need to consider network bandwidth limits and browser timeout issues.
First off, I agree with everyone else. You don't want 7 to 8 million record outputs. In my experience, these users just want to skip your BI system and work with the data in their own Excel environments. This causes problems for many reasons which I won't get into here.
As for "This isn't possible in Excel": that's not entirely true. You would need to create a row count for each row, hold your output, then write a loop to compound each group of a million records onto its own sheet. We used to do this all the time before Excel could handle a million records per sheet.
You still have to consider the user's computer and its capabilities. An 8-million-row Excel file will take a huge hit on that person's resources. They could open it in Access, but again, there would be other issues associated with that.
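The row-count-and-loop approach above boils down to partitioning the numbered rows into fixed-size groups, one per sheet. A hedged sketch of just that partitioning step in Python (not WebFOCUS code; the 1,000,000 rows-per-sheet figure follows the post, though modern .xlsx actually allows 1,048,576 rows per sheet):

```python
ROWS_PER_SHEET = 1_000_000  # per the post; modern Excel's real limit is 1,048,576

def sheet_groups(total_rows, rows_per_sheet=ROWS_PER_SHEET):
    """Return (start, end) row ranges, 1-indexed inclusive, one per sheet."""
    groups = []
    start = 1
    while start <= total_rows:
        end = min(start + rows_per_sheet - 1, total_rows)
        groups.append((start, end))
        start = end + 1
    return groups

# 7.5 million rows -> 8 sheets, the last one partially filled
print(sheet_groups(7_500_000)[:2])   # [(1, 1000000), (1000001, 2000000)]
print(len(sheet_groups(7_500_000)))  # 8
```

In the WebFOCUS version, each (start, end) range would become a WHERE filter on the computed row counter, with each pass held and compounded onto its own sheet.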
Eric Woerle 8.1.05M Gen 913- Reporting Server Unix 8.1.05 Client Unix Oracle 11.2.0.2
Posts: 750 | Location: Warrenville, IL | Registered: January 08, 2013
I agree with the above. However, perhaps you could use the Migration / Export feature for that file. It does sound, as Eric says, that "these users just want to skip your BI system and work with the data in their own Excel environments. This causes problems for many reasons."
Posts: 3132 | Location: Tennessee, Nashville area | Registered: February 23, 2005
I strongly believe there is always scope for a filter. If the users are adamant, you can try the BURST option and FTP the files to a shared server.
WF 8007 Solaris
Posts: 64 | Location: North Carolina | Registered: December 04, 2007