I am trying to extract and load CLOB data from one Oracle database to another through DataMigrator (7.7.03, Unicode). The ETL flow throws an exception once the CLOB data length reaches a limit of 32 KB, so I am not able to extract CLOB data greater than 32 KB. I am not applying any transformations to the column.
Can someone help me with this?
Srini
CLOB size should go up to 2 GB. There are some settings in edaserve.cfg that need to be there to support large CLOB fields; ora_oci = y is one important one, I believe.
Also check the ACTUAL and USAGE formats in the MFD; they should be TX and TX50.
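To illustrate the two settings above, here is a minimal sketch. The edaserve.cfg keyword is the one named in this reply; the Master File fragment uses the column name that appears later in the traces, and the exact segment layout of your MFD will differ:

```
; edaserve.cfg -- enable the Oracle OCI interface (per the reply above)
ora_oci = y
```

```
$ Master File fragment (sketch) -- CLOB column described with TX formats
FIELDNAME=REPROCESS_REQUEST_MESSAGE, ALIAS=REPROCESS_REQUEST_MESSAGE,
  USAGE=TX50, ACTUAL=TX, MISSING=ON ,$
```

If the target synonym was generated before ora_oci = y was in effect, regenerating it may be needed for the TX formats to appear.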
Thanks for your reply. The setting ora_oci = y is indeed in edaserve.cfg. I've looked at the Oracle adapter administration doc and couldn't find any other setting that needs to be made besides this one.
What I've observed from the traces is that while reading, the column is interpreted as CLOB, whereas while loading it is treated as a Long Varchar (32 KB) even though it is a CLOB field. What could be the reason?
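One quick sanity check, before digging into the adapter: confirm that the target column really is defined as CLOB in the Oracle data dictionary. This is a generic query against the standard ALL_TAB_COLUMNS view; the column name is taken from the trace below, and you would narrow it with your own owner/table names:

```sql
-- Verify the target column's declared type in Oracle.
-- Filter further by OWNER and TABLE_NAME for your schema.
SELECT owner, table_name, column_name, data_type
FROM   all_tab_columns
WHERE  column_name = 'REPROCESS_REQUEST_MESSAGE';
```

If DATA_TYPE comes back as CLOB here, the truncation to 32 KB is happening on the adapter side (the Long Varchar binding in the trace) rather than in the table definition.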
While reading from source:
05.51.32 BY <<< OR8FOC SQLDA: Name= E27, Addr= 20e91cc8
05.51.32 BY <<< OR8FOC SQLDA: Type= CLOB[0199), Length= 0000
05.51.32 BY <<< OR8FOC SQLDA: Indicator= Addr: 20e931f3: 0000
While loading into target:
05.51.32 BY >>> OR8FOC SQLDA: Name= REPROCESS_REQUEST_MESSAGE, Addr= 20ed6678
05.51.32 BY >>> OR8FOC SQLDA: Type= Long Varchar(01C9), Length= 7D00
05.51.32 BY >>> OR8FOC SQLDA: Indicator= Addr: 20c03178: 0000
05.51.32 BY >>> OR8FOC SQLDA: Data= Addr: 20c0317a: Len: 11687: