This may be a simple question; pardon me, I am an iWay person but have only recently started using DataMigrator (DM).
My requirement is to copy data from SQL Server to Oracle 10g. I have successfully created the transformations, the data flow, and the process flow, and even tested them just fine.
Then came a situation where a few of the Oracle columns were changed to TIMESTAMP, so my inserts, which had been done as varchar, had to change. Shouldn't be a problem, right?
Did the following:
(1) SQL Server source transformations: all of these datetime-value columns were defined as HYYMDS.
(2) Oracle target transformations: all of these datetime-value columns were defined as HYYMDS.
(3) Checked the MAS files to be sure of these definitions (a rough sketch of what I mean is below).
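For anyone following along, here is roughly the kind of Master File (.mas) entry I am describing for one of these datetime columns. The table and segment names are made up for illustration; IDOC_INS_DATE is the actual column, and the synonym generator may emit slightly different ACTUAL formats:

  FILENAME=IDOC_TABLE, SUFFIX=SQLMSS, $
    SEGMENT=IDOC_TABLE, SEGTYPE=S0, $
      FIELDNAME=IDOC_INS_DATE, ALIAS=IDOC_INS_DATE, USAGE=HYYMDS, ACTUAL=HYYMDS, $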
When I do a test transformation in the target I get the correct values. I was so happy.
Now for the last part: running the data flow and checking the results.
Sob, sob, sob. I get: (FOC1383) UNSUPPORTED DATETIME FORMAT FOR FIELD : IDOC_INS_DATE
What am I doing wrong? Please help! Thanks much in advance.
Were the master files for both tables created using the synonym generator? Are you actually performing source and/or target transformations on the date data or are you simply extracting from the SQLServer table and loading into the Oracle table? If you are performing transformations on the dates, what kinds of changes are you making?
You might open the flow and then switch to Text View to see if anything in the code stands out as a potential problem. You should look closely at any formats in the code that are being used for processing the date columns. (For example, with the Text View open do a search on 'IDOC_INS_DATE'.)
WebFOCUS 7.7.05
Posts: 1213 | Location: Seattle, Washington - USA | Registered: October 22, 2007
While date columns are pretty standard, date and time columns are one of those areas where every database is different, in this case with the precision of the time portion.
The MS SQL Server DATETIME column should be described as HYYMDs. Note the small "s" which indicates a value to the millisecond like '2010/08/13 00:00:00.000'.
The newer ORACLE TIMESTAMP column should be described as HYYMDm. The small "m" indicates a value to the microsecond, like '2010/08/13 00:00:00.000000'.
Once the columns are described accurately your data flow will work fine.
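To make that concrete, here is a sketch of how the same field might be described on each side. The ACTUAL formats shown are illustrative; check the synonyms the generator actually produced:

  Source Master (MS SQL Server DATETIME, millisecond precision):
    FIELDNAME=IDOC_INS_DATE, ALIAS=IDOC_INS_DATE, USAGE=HYYMDs, ACTUAL=HYYMDs, $

  Target Master (Oracle TIMESTAMP, microsecond precision):
    FIELDNAME=IDOC_INS_DATE, ALIAS=IDOC_INS_DATE, USAGE=HYYMDm, ACTUAL=HYYMDm, $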
N/A
Posts: 397 | Location: New York City | Registered: May 03, 2007
Yes, I created them using the synonym generator. I am not doing any transformations on these date columns; they should be direct mappings. I will look into the formats as you mentioned. Thank you.
Clif,
Thank You. I will try with these formats and get back
Oops, when I choose HYYMD (with milliseconds), that's what gets written. I think I need to go and manually change them. I will make the changes now and try it. Thanks!
In my humble opinion you shouldn't have to modify a Master generated by the synonym generator. I would leave the Masters untouched.
Since the FOC1383 error is on a fieldname that is in both the source and target tables, do you know which table the error occurs on? It's most likely the target table, and most likely because of the generic JDBC adapter. Have you tried using the adapter for Oracle?
I did a quick search at Tech Support and here's one result:
Symptom: (FOC1383) UNSUPPORTED DATETIME FORMAT FOR FIELD
Problem: Using the JDBC adapter configured to go against an Informix database, an error message is received when trying to query a master file that has:
I restarted DM with HYYMDS formats for both source and target, and it works perfectly fine.
Yaay !!! Thanks everyone...
Are there any special settings to handle mapping NULLs to a timestamp? If there is no source value for these columns, I get a garbage value (7391-01-14 06:30:16) in the target instead of NULL.
If you haven't already done so, specify your source date with MISSING=ON. If that doesn't seem to do the trick, then try adding SET NULL=ON to the top of your program.
EDIT: Your target date should be specified with MISSING=ON, too.
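As a rough illustration only (the ACTUAL format here is a guess; use whatever the synonym generator created), the date field in both the source and target Masters would carry MISSING=ON like this:

  FIELDNAME=IDOC_INS_DATE, ALIAS=IDOC_INS_DATE, USAGE=HYYMDS, ACTUAL=HYYMDS, MISSING=ON, $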
WebFOCUS 7.7.05
Posts: 1213 | Location: Seattle, Washington - USA | Registered: October 22, 2007
Thanks, Dan. I do have MISSING in the source. Can you please explain what you mean by changing the program? Do you mean the data flow content? Please let me know. Thanks much.
Make sure MISSING=ON is also specified for the target date and test your flow before doing the following.
If you look at the process flow in the Process Flow tab, are there any procedures being executed prior to the Data Flow? If so, then simply add SET NULL=ON to the procedure. If not, then create a new procedure (not a new flow) and place the command SET NULL=ON in it. Drag the procedure into your Process Flow and use the right mouse button to connect it between the Start button and the Data Flow.
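For example, the new procedure could contain nothing more than that one command (the procedure name is up to you; SETNULL is just a placeholder):

  -* SETNULL focexec: executed ahead of the Data Flow in the Process Flow
  SET NULL=ON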
WebFOCUS 7.7.05
Posts: 1213 | Location: Seattle, Washington - USA | Registered: October 22, 2007
I read that your target database was ORACLE but missed that you were using the generic JDBC adapter. As with Informix, when there is a database-specific adapter available we recommend that you use it, to provide better coverage of that database's features.
N/A
Posts: 397 | Location: New York City | Registered: May 03, 2007