As of December 1, 2020, Focal Point is retired and repurposed as a reference repository. We value the wealth of knowledge that's been shared here over the years. You'll continue to have access to this treasure trove of knowledge, for search purposes only.
I have an ETL procedure which needs to copy about 8,000,000 records from a SAP table into a FOCUS table with Data Migrator. The whole process has now been running for two hours and it looks like it has hung.
When I divide the whole job into batches of at most 2,500,000 records each, every batch takes 3 minutes, which makes a total of about 3 * 3 = 9 minutes.
How can it be that the job won't finish as a whole, but only takes 3 minutes per set when I split it into batches of 2,500,000 records max?
Well, the sort order doesn't seem to be the problem, nor has it reached its limit. Using XFOCUS isn't the solution either.
So what could it possibly be? The funny part is that when I cap it at 2 million records it takes 2 minutes, and 3 million takes 3 minutes. But with more than 3 million, it hits the fan.
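The batching workaround described above can be sketched as a generic loop. This is only an illustration of the chunking arithmetic, not actual Data Migrator code: `read_chunk` and `write_chunk` are hypothetical stand-ins for the real extract and load calls.

```python
def copy_in_chunks(read_chunk, write_chunk, total_rows, chunk_size=2_500_000):
    """Copy rows in fixed-size batches instead of one monolithic pass.

    read_chunk(offset, limit) and write_chunk(rows) are hypothetical
    placeholders for the real extract/load steps; only the batching
    logic is shown here.
    """
    copied = 0
    while copied < total_rows:
        # Never ask for more rows than remain.
        limit = min(chunk_size, total_rows - copied)
        rows = read_chunk(copied, limit)
        write_chunk(rows)
        copied += len(rows)
    return copied
```

With 8,000,000 rows and a 2,500,000-row chunk size this runs four batches (2.5M + 2.5M + 2.5M + 0.5M), which matches the "3 * 3 ≈ 9 minutes" estimate above for the three full batches plus a small remainder.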
It could be that the optimizer of the RDBMS you are running SAP on chooses a different execution plan at the threshold you point out. (In my experience the Oracle optimizer can be very poor.)
This assumes that the extract involves a number of joins and selections and is not just a straight table copy.
Server: WF 7.6.2 ( BID/Rcaster) Platform: W2003Server/IIS6/Tomcat/SQL Server repository Adapters: SQL Server 2000/Oracle 9.2 Desktop: Dev Studio 765/XP/Office 2003 Applications: IFS/Jobscope/Maximo
Posts: 888 | Location: Airstrip One | Registered: October 06, 2006
Have you been able to find out whether the problem lies with retrieving the data or with loading it into the FOCUS db? In other words, split the job into two parts: the first part retrieves the data and stores it in a flat file, the second loads that file into your FOCUS db. That way you can find out what the response times are for each part of the process, and also what file sizes you're dealing with.
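The split-and-measure idea above amounts to timing each phase separately. A minimal sketch, assuming hypothetical `extract_to_flat_file` and `load_focus_db` functions standing in for the two Data Migrator steps:

```python
import time

def time_phase(label, fn, *args):
    """Run one phase of the job and report its wall-clock time,
    so extract and load can be measured independently."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")
    return result

# Hypothetical usage -- the two function names are placeholders:
# rows = time_phase("extract", extract_to_flat_file, "SAPTABLE")
# time_phase("load", load_focus_db, rows)
```

Whichever phase dominates the wall-clock time tells you where to focus the tuning effort.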
GamP
- Using AS 8.2.01 on Windows 10 - IE11.
in Focus since 1988
Posts: 1961 | Location: Netherlands | Registered: September 25, 2007
Do you have indexes in your FOCUS Master File? If so, they can considerably slow down the load. There are techniques to mitigate this, one being: do your CREATE FILE with the Master that has the indexes, do the load with a Master without indexes and an S0 segtype, then do a REBUILD INDEX.
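The load-first, index-last pattern described above is general, not FOCUS-specific. The FOCUS steps (S0 segtype Master, REBUILD INDEX) have no direct equivalent in the sketch below; sqlite3 is used purely as an analogue to show why deferring the index build helps:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# Bulk load with no index in place -- each insert avoids incremental
# index maintenance, which is the cost the answer above is describing.
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 ((i, i * 1.5) for i in range(100_000)))

# One index build at the end replaces 100,000 per-row index updates,
# analogous to running REBUILD INDEX after the load.
conn.execute("CREATE INDEX ix_orders_id ON orders (id)")
conn.commit()
```

The same principle applies to the FOCUS load: per-record index maintenance during an 8,000,000-row load is far more expensive than one rebuild at the end.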
But first you need to answer for yourself the question GamP is asking, i.e. where the failure occurs.