Focal Point


As of December 1, 2020, Focal Point is retired and repurposed as a reference repository. We value the wealth of knowledge that's been shared here over the years. You'll continue to have access to this treasure trove of knowledge, for search purposes only.



Focal Point » Focal Point Forums » iWay Software Product Forum on Focal Point » [CASE-OPENED] CLOB data length limitation in DataMigrator

 
Member
posted
Hi,

I am trying to extract/load CLOB data from one Oracle database to another through DataMigrator (7.7.03, Unicode). The ETL flow throws an exception once the CLOB data length reaches 32 KB, so I am unable to extract CLOB data larger than 32 KB. I am not applying any transformations to the column.

Can someone help me with this?

Thanks,
Srini

 
Posts: 13 | Registered: February 03, 2011
Virtuoso
posted
The CLOB size should go up to 2 GB. There are some settings in edaserve.cfg that need to be present to support large CLOB fields; ora_oci = y is an important one, I believe.

Also check the ACTUAL and USAGE formats in the MFD; they should be TX and TX50.
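For illustration, a minimal sketch of what the synonym (MFD) entry for a CLOB column might look like with those formats. The FILENAME, segment, and first field are hypothetical; REPROCESS_REQUEST_MESSAGE is the column name that appears in the trace below:

FILENAME=REPROCESS, SUFFIX=SQLORA, $
  SEGMENT=REPROCESS, SEGTYPE=S0, $
    FIELDNAME=REQUEST_ID, ALIAS=REQUEST_ID, USAGE=A30, ACTUAL=A30, $
    FIELDNAME=REPROCESS_REQUEST_MESSAGE, ALIAS=REPROCESS_REQUEST_MESSAGE, USAGE=TX50, ACTUAL=TX, $

If the ACTUAL is something other than TX, the adapter may fall back to a bounded character type on the load side.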


Alan.
WF 7.705/8.007
 
Posts: 1451 | Location: Portugal | Registered: February 07, 2007
Member
posted
Hi,

Thanks for your reply. The setting ora_oci = y is indeed in edaserve.cfg. I've looked at the Oracle adapter administration doc and couldn't find any other setting that needs to be made besides this one.

What I've observed from the traces is that while reading, the column is correctly interpreted as CLOB, whereas while loading, it is treated as Long Varchar (32 KB) even though it is a CLOB field. What could be the reason?

While reading from source:

05.51.32 BY <<< OR8FOC SQLDA: Name= E27, Addr= 20e91cc8
05.51.32 BY <<< OR8FOC SQLDA: Type= CLOB[0199), Length= 0000
05.51.32 BY <<< OR8FOC SQLDA: Indicator= Addr: 20e931f3: 0000

While loading into target:

05.51.32 BY >>> OR8FOC SQLDA: Name= REPROCESS_REQUEST_MESSAGE, Addr= 20ed6678
05.51.32 BY >>> OR8FOC SQLDA: Type= Long Varchar(01C9), Length= 7D00
05.51.32 BY >>> OR8FOC SQLDA: Indicator= Addr: 20c03178: 0000
05.51.32 BY >>> OR8FOC SQLDA: Data= Addr: 20c0317a: Len: 11687:
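The 32 KB ceiling is visible directly in the target-side trace line: the SQLDA reports Type= Long Varchar(01C9), Length= 7D00, and 0x7D00 is exactly 32000 bytes, while the source side is typed CLOB with no such cap. A quick check of the hex values (the interpretation of 01C9/0199 as type codes is my reading of the trace, not documented behavior):

```python
# Decode the hex fields from the OR8FOC SQLDA trace lines.
target_len = int("7D00", 16)   # Length reported for the Long Varchar target
source_type = int("0199", 16)  # type code shown for the CLOB source column
target_type = int("01C9", 16)  # type code shown for the Long Varchar target

print(target_len)   # 32000 -> the ~32 KB bound where the flow fails
print(source_type)  # 409
print(target_type)  # 457
```

So the load side is buffering the column as a 32000-byte character field, which matches the failure threshold reported above.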
 
Posts: 13 | Registered: February 03, 2011



Copyright © 1996-2020 Information Builders