Focal Point Banner


As of December 1, 2020, Focal Point is retired and repurposed as a reference repository. We value the wealth of knowledge that's been shared here over the years. You'll continue to have access to this treasure trove of knowledge, for search purposes only.

Join the TIBCO Community
TIBCO Community is a collaborative space for users to share knowledge and support one another in making the best use of TIBCO products and services. There are several TIBCO WebFOCUS resources in the community.

  • From the Home page, select Predict: WebFOCUS to view articles, questions, and trending articles.
  • Select Products from the top navigation bar, scroll, and then select the TIBCO WebFOCUS product page to view product overview, articles, and discussions.
  • Request access to the private WebFOCUS User Group (login required) to network with fellow members.

Former myibi community members should have received an email on 8/3/22 to activate their user accounts to join the community. Check your Spam folder for the email. Please get in touch with us at community@tibco.com for further assistance. Reference the community FAQ to learn more about the community.



[CLOSED] SQL Pass-Thru Issue
 
Platinum Member
posted
I am trying to test SQL pass-thru with a custom Hive adapter, and I am getting an error on the connection name when I issue the pass-thru request. I am using a Hive database with the custom adapter named CLOUD_DATA_WAREHOUSE. The error is below, and the code follows the error message.

0 NUMBER OF RECORDS IN TABLE= 1 LINES= 1
(FOC1113) INVALID CONTROL COMMAND
(FOC32498) FOCEXEC PARSER ERROR. INVALID SYMBOL: CLOUD_DATA_WAREHOUSE
BYPASSING TO END OF COMMAND
(FOC1517) UNRECOGNIZED COMMAND SELECT AMI_METER_ID AS ASSET_ID, ASSET_REALM, ASSET_TYPE, ACCESS_AND_REASON.REASON AS REASON, ACCESS_AND_REASON.ACCESS_LEVEL AS ACCESS, CONCAT(USER_NAME, '@', USER_REALM) AS USER
(FOC1517) UNRECOGNIZED COMMAND FROM (
(FOC1517) UNRECOGNIZED COMMAND SELECT USER_NAME, USER_REALM, AMI_METER_ID, ASSET_REALM, ASSET_TYPE, USER_ACCESS_AND_REASON(USER_UUID, CLOUD_ID, USER_REALM, ASSET_REALM, OWNED_BY, ROLE_ACCESS_LEVELS, APPLY_DEFAULT_SECURITY) AS ACCESS_AND_REASON
(FOC1517) UNRECOGNIZED COMMAND FROM (
(FOC1517) UNRECOGNIZED COMMAND SELECT DISTINCT USER_NAME, USER_UUID, CLOUD_ID, USER_REALM, LOCATIONS.UUID AS LOCATION_UUID, REALM AS ASSET_REALM, ROLE_ACCESS_LEVELS, APPLY_DEFAULT_SECURITY, OWNED_BY, AMI_METER_ID, ASSET_TYPE
(FOC1517) UNRECOGNIZED COMMAND FROM MDM_METER_LOCATION_ALL AS LOCATIONS
0 ERROR AT OR NEAR LINE 10 IN PROCEDURE ADHOCRQ FOCEXEC *
(FOC376) SYNTAX ERROR OR MISSING ELEMENT IN JOIN/COMBINE COMMAND:
(FOC1517) UNRECOGNIZED COMMAND FROM USERS
(FOC1517) UNRECOGNIZED COMMAND ) AS USERS JOIN (
(FOC1517) UNRECOGNIZED COMMAND SELECT REALM AS ASSET_REALM, METER_LOCATION_UUID, OWNED_BY, AMI_METER_ID, ASSET_TYPE
(FOC1517) UNRECOGNIZED COMMAND FROM MDM_ELECTRIC_METER_ALL
(FOC1517) UNRECOGNIZED COMMAND ) AS METERS ON (LOCATIONS.UUID = METERS.METER_LOCATION_UUID AND LOCATIONS.REALM = METERS.ASSET_REALM)
(FOC1517) UNRECOGNIZED COMMAND ) AS USER_AFTER_UDF
(FOC1517) UNRECOGNIZED COMMAND ) AS USER_WITH_ACCESS
(FOC1517) UNRECOGNIZED COMMAND WHERE ACCESS_AND_REASON.ACCESS_LEVEL <> 'NO_ACCESS'
(FOC1517) UNRECOGNIZED COMMAND ORDER BY ASSET_ID, ASSET_REALM, ASSET_TYPE, REASON, ACCESS, USER
0 ERROR AT OR NEAR LINE 23 IN PROCEDURE ADHOCRQ FOCEXEC *
(FOC205) THE DESCRIPTION CANNOT BE FOUND FOR FILE NAMED: SQLOUT
BYPASSING TO END OF COMMAND


Here is the code:

-SET &CONNECTION_NAME = 'CLOUD_DATA_WAREHOUSE';

ENGINE CLOUD_DATA_WAREHOUSE SET DEFAULT_CONNECTION CLOUD_DATA_WAREHOUSE
SQL CLOUD_DATA_WAREHOUSE
SELECT ami_meter_id AS asset_id, asset_realm, asset_type, access_and_reason.reason AS reason, access_and_reason.access_level AS access, concat(user_name, '@', user_realm) AS user
FROM (
SELECT user_name, user_realm, ami_meter_id, asset_realm, asset_type, user_access_and_reason(user_uuid, cloud_id, user_realm, asset_realm, owned_by, role_access_levels, apply_default_security) AS access_and_reason
FROM (
SELECT DISTINCT user_name, user_uuid, cloud_id, user_realm, locations.uuid as location_uuid, realm AS asset_realm, role_access_levels, apply_default_security, owned_by, ami_meter_id, asset_type
FROM mdm_meter_location_all AS locations
JOIN (
SELECT name AS user_name, id AS user_uuid, clouduserid AS cloud_id, realm AS user_realm
FROM users
) AS users JOIN (
SELECT realm AS asset_realm, meter_location_uuid, owned_by, ami_meter_id, asset_type
FROM mdm_electric_meter_all
) AS meters ON (locations.uuid = meters.meter_location_uuid AND locations.realm = meters.asset_realm)
) AS user_after_udf
) AS user_with_access
WHERE access_and_reason.access_level <> 'NO_ACCESS'
ORDER BY asset_id, asset_realm, asset_type, reason, access, user
END
-RUN

TABLE FILE SQLOUT
PRINT *
END
-RUN



WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Virtuoso
posted
Hi AMC

I think the issue is the syntax on this line.

ENGINE CLOUD_DATA_WAREHOUSE SET DEFAULT_CONNECTION CLOUD_DATA_WAREHOUSE

It should be

ENGINE SQLHIV SET DEFAULT_CONNECTION CLOUD_DATA_WAREHOUSE
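
In general, the interface name (here SQLHIV), not the connection name, is what follows both ENGINE and SQL; the connection name only appears as the value of DEFAULT_CONNECTION. A minimal sketch of the whole request, with a placeholder SELECT, would be something like:

ENGINE SQLHIV SET DEFAULT_CONNECTION CLOUD_DATA_WAREHOUSE
SQL SQLHIV PREPARE SQLOUT FOR
SELECT some_column FROM some_table
END
-RUN
TABLE FILE SQLOUT
PRINT *
END
-RUN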

Hope that helps,


Thank you for using Focal Point!

Chuck Wolff - Focal Point Moderator
WebFOCUS 7x and 8x, Windows, Linux All output Formats
 
Posts: 2127 | Location: Customer Support | Registered: April 12, 2005
Platinum Member
posted
Thanks Chuck for the response. When I make that change, though, I get the error below:

0 NUMBER OF RECORDS IN TABLE= 1 LINES= 1
(FOC1671) CONNECTION CLOUD_DATA_WAREHOUSE IS NOT DECLARED FOR INTERFACE SQLHIV
(FOC1671) CONNECTION CLOUD_DATA_WAREHOUSE IS NOT DECLARED FOR INTERFACE SQLHIV
(FOC32498) FOCEXEC PARSER ERROR. INVALID SYMBOL: CLOUD_DATA_WAREHOUSE


WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Platinum Member
posted
OK, so next step: what are the adapter settings?
And what are the environment settings for your DB?


Cheers,
H.

WebFOCUS 8.1.05M
Oracle 11g - DB2
RedHat
 
Posts: 115 | Location: Brighton UK | Registered: February 19, 2005
Platinum Member
posted
The adapter settings are below. The database is a Hive instance running over Hadoop. If more specific DB environment settings would help, I can try to get them.

Connect parameters

Connection Name: cloud_data_warehouse

URL: jdbc:nisc-hive:http://mdmrpt-spark-master-1.cloud.dev:8200/mdm-spark-extract/services;realm;&REALM;user;trustUser1;password;password1;actingasuserid;&UUID
Sample: jdbc:JdbcProgress:T:edarisc3:4108:qaiway

Security: Password Passthru

Environment:
IBI_CLASSPATH /share/drivers/postgresql-9.2-1003.jdbc3.jar:/share/drivers/postgresql-9.2-1003.jdbc4.jar:/share/drivers/hadoop-common-2.6.0-cdh5.7.0.jar:/share/drivers/hive-jdbc-standalone.jar:/usr/java/latest:/share/drivers/cdw-jdbc-client.jar

CLASSPATH /share/drivers

PATH /usr/local/webfocus/ibi/srv82/home/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/webfocus/.local/bin --- NOT FOUND:/usr/local/webfocus/bin --- NOT FOUND

LD_LIBRARY_PATH /usr/java/latest/lib/amd64/server:/usr/local/webfocus/ibi/srv82/home/bin/lib:/usr/lib


WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Platinum Member
posted
quote:
ENGINE SQLHIV SET DEFAULT_CONNECTION CLOUD_DATA_WAREHOUSE


It might be as silly as:

ENGINE SQLHIV SET DEFAULT_CONNECTION cloud_data_warehouse


Cheers,
H.

WebFOCUS 8.1.05M
Oracle 11g - DB2
RedHat
 
Posts: 115 | Location: Brighton UK | Registered: February 19, 2005
Virtuoso
posted
AMC

I made a bad assumption and thought you were using the native Hive adapter. Double-check the entry in the reporting server profile (edasprof.prf), but I think you should change SQLHIV to SQLJDBC.
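
The entry you are looking for in edasprof.prf should look something like this (the URL below is just a placeholder; your actual driver URL will differ):

ENGINE SQLJDBC SET CONNECTION_ATTRIBUTES cloud_data_warehouse 'jdbc:<your-driver-url>'

The name after CONNECTION_ATTRIBUTES is what the DEFAULT_CONNECTION value has to match.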


Thank you for using Focal Point!

Chuck Wolff - Focal Point Moderator
WebFOCUS 7x and 8x, Windows, Linux All output Formats
 
Posts: 2127 | Location: Customer Support | Registered: April 12, 2005
Platinum Member
posted
I made both changes that Hank W and Chuck Wolff suggested: changing 'CLOUD_DATA_WAREHOUSE' to lowercase and changing SQLHIV to SQLJDBC, since the entry in the reporting server profile uses "cloud_data_warehouse".

This is the error I get now below. It seems like it got further in the processing. Any more ideas?

0 NUMBER OF RECORDS IN TABLE= 1 LINES= 1
(FOC32498) FOCEXEC PARSER ERROR. INVALID SYMBOL: cloud_data_warehouse
BYPASSING TO END OF COMMAND
(FOC1517) UNRECOGNIZED COMMAND SELECT AMI_METER_ID AS ASSET_ID, ASSET_REALM, ASSET_TYPE, ACCESS_AND_REASON.REASON AS REASON, ACCESS_AND_REASON.ACCESS_LEVEL AS ACCESS, CONCAT(USER_NAME, '@', USER_REALM) AS USER
(FOC1517) UNRECOGNIZED COMMAND FROM (
(FOC1517) UNRECOGNIZED COMMAND SELECT USER_NAME, USER_REALM, AMI_METER_ID, ASSET_REALM, ASSET_TYPE, USER_ACCESS_AND_REASON(USER_UUID, CLOUD_ID, USER_REALM, ASSET_REALM, OWNED_BY, ROLE_ACCESS_LEVELS, APPLY_DEFAULT_SECURITY) AS ACCESS_AND_REASON
(FOC1517) UNRECOGNIZED COMMAND FROM (
(FOC1517) UNRECOGNIZED COMMAND SELECT DISTINCT USER_NAME, USER_UUID, CLOUD_ID, USER_REALM, LOCATIONS.UUID AS LOCATION_UUID, REALM AS ASSET_REALM, ROLE_ACCESS_LEVELS, APPLY_DEFAULT_SECURITY, OWNED_BY, AMI_METER_ID, ASSET_TYPE
(FOC1517) UNRECOGNIZED COMMAND FROM MDM_METER_LOCATION_ALL AS LOCATIONS


WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Platinum Member
posted
It's not recognizing the JDBC connection.
What is the connection row in edasprof.prf?
It's under /ibi/srv81/wfs/etc/.


Cheers,
H.

WebFOCUS 8.1.05M
Oracle 11g - DB2
RedHat
 
Posts: 115 | Location: Brighton UK | Registered: February 19, 2005
Virtuoso
posted
AMC

I looked at the code you posted in the beginning, and I think I see the issue: the line SQL CLOUD_DATA_WAREHOUSE should be SQL SQLJDBC. If this is still an issue after that change, please repost your code and the exact connection line from the server profile.

Thanks


Thank you for using Focal Point!

Chuck Wolff - Focal Point Moderator
WebFOCUS 7x and 8x, Windows, Linux All output Formats
 
Posts: 2127 | Location: Customer Support | Registered: April 12, 2005
Platinum Member
posted
Hi,

Does your SQL statement work if you run it directly in Hive? It looks like you are connecting through Spark; did you try through ZooKeeper? Just wondering whether the problem is the connection from WebFOCUS or whether it could be on the Hadoop side.


Webfocus 8.0.7 on Windows
 
Posts: 176 | Location: Ohio | Registered: October 26, 2006
Platinum Member
posted
Sorry for the late reply here.

Chuck Wolff - I had already caught and replaced SQL CLOUD_DATA_WAREHOUSE with SQL SQLJDBC. I had to fix a couple of minor things, but the SQL does run in a Hive instance. Here is the code again, along with the exact connection line from the edasprof server profile and the error message I get now.

ENGINE SQLJDBC SET DEFAULT_CONNECTION cloud_data_warehouse
SQL SQLJDBC PREPARE SQLOUT FOR
SELECT ami_meter_id, asset_realm, asset_type, access_and_reason.reason AS reason, access_and_reason.access_level AS access, concat(user_name, '@', user_realm) AS user
FROM (
        SELECT user_name, user_realm, ami_meter_id, asset_realm, asset_type, user_access_and_reason(user_uuid, cloud_id, user_realm, asset_realm, owned_by, role_access_levels, apply_default_security) AS access_and_reason
        FROM (
                SELECT DISTINCT user_name, user_uuid, cloud_id, user_realm, locations.uuid as location_uuid, realm AS asset_realm, role_access_levels, apply_default_security, owned_by, ami_meter_id, asset_type
                FROM mdm_meter_location_all AS locations
                JOIN (
                        SELECT name AS user_name, id AS user_uuid, clouduserid AS cloud_id, realm AS user_realm
                        FROM users
                ) AS users JOIN (
                        SELECT realm AS asset_realm, meter_location_uuid, owned_by, ami_meter_id, asset_type
                        FROM mdm_electric_meter_all
                ) AS meters ON (locations.uuid = meters.meter_location_uuid AND locations.realm = meters.asset_realm)
        ) AS user_after_udf
) AS user_with_access
WHERE access_and_reason.access_level <> 'NO_ACCESS'
ORDER BY ami_meter_id, asset_realm, asset_type, reason, access, user
END
-RUN

TABLE FILE SQLOUT
PRINT *
END
-RUN
-EXIT
  


Line from the edasprof server profile:

ENGINE SQLJDBC SET CONNECTION_ATTRIBUTES cloud_data_warehouse 'jdbc:nisc-hive:http://mdmrpt-spark-master-1.cloud.dev:8200/mdm-spark-extract/services;realm;&REALM;user;trustUser1;password;password1;actingasuserid;&UUID'

Error:
0 NUMBER OF RECORDS IN TABLE= 1 LINES= 1
(FOC1400) SQLCODE IS -1 (HEX: FFFFFFFF) XOPEN: 00000
(FOC1500) : (-1) [00000] JDBFOC>> preparesql(): java.lang.IllegalStateException:
(FOC1500) : Error while compiling statement: FAILED: SemanticException [Error
(FOC1500) : 10011]: Line 1:243 Invalid function 'user_access_and_reason' (job id
(FOC1500) : 'momdm1-134d2d6b-cbb1-42e2-afbc-83b46af6a013')
coop.nisc.mdm.extract.cli
(FOC1500) : ent.SqlResultFuture.getUninterruptibly(SqlResultFuture.java:80)
coop.nis
(FOC1500) : c.mdm.extract.jdbc.StatementImpl.executeQuery(StatementImpl.java:44)
ibi
(FOC1500) : .jdbfoc.jjdbfoc.preparesql(jjdbfoc.java:4669)
ibi.ntj2c.ntj2c.processPre
(FOC1500) : psql(ntj2c.java:865)
ibi.ntj2c.ntj2c.access$1700(ntj2c.java:402)
ibi.ntj
(FOC1500) : 2c.ntj2c$18.process(ntj2c.java:619)
ibi.ntj2c.ntj2c.readAndExecuteComman
(FOC1500) : d(ntj2c.java:1555)
ibi.ntj2c.ntj2c.execute(ntj2c.java:1427)
ibi.c.execut
(FOC1500) : e(ntj2c.j
L (FOC1405) SQL PREPARE ERROR.
0 ERROR AT OR NEAR LINE 25 IN PROCEDURE ADHOCRQ FOCEXEC *
(FOC205) THE DESCRIPTION CANNOT BE FOUND FOR FILE NAMED: SQLOUT
BYPASSING TO END OF COMMAND


WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Virtuoso
posted
AMC

It is complaining about an invalid function, user_access_and_reason, in the SQL statement. Since the SQL runs directly in Hive, I suspect that because you are using a JDBC connection, this function is not recognized. I would recommend you open a case. You may also want to ask your account exec for permission to use the Hive adapter instead of JDBC and see if that resolves it as well.

Thanks
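
If user_access_and_reason is a custom Hive UDF, one thing you might try (I have not verified this works through the generic JDBC driver) is registering the function in the pass-thru session before running the query, since session-scoped UDFs are not always visible over JDBC. The class name below is just a placeholder for wherever your UDF actually lives:

SQL SQLJDBC
CREATE TEMPORARY FUNCTION user_access_and_reason AS 'com.example.UserAccessAndReason'
END
-RUN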


Thank you for using Focal Point!

Chuck Wolff - Focal Point Moderator
WebFOCUS 7x and 8x, Windows, Linux All output Formats
 
Posts: 2127 | Location: Customer Support | Registered: April 12, 2005
Platinum Member
posted
Thanks Chuck W. Yes, it may be because of the JDBC adapter connection; I may open a case for it. Regarding the Hive adapter, I was informed that our organization can't use it because it does not work with how our security is set up, with different users at different sites needing access to reporting. I was told the generic JDBC adapter does work with that security setup.


WF 8.1.05
Windows
SQL Server 2014
HTML, PDF, EXCEL
 
Posts: 175 | Registered: March 24, 2017
Virtuoso
posted
Hi AMC

Then you should open a case.

Thanks,


Thank you for using Focal Point!

Chuck Wolff - Focal Point Moderator
WebFOCUS 7x and 8x, Windows, Linux All output Formats
 
Posts: 2127 | Location: Customer Support | Registered: April 12, 2005

Copyright © 1996-2020 Information Builders