Problem Description:

A pipeline build job on an Infoworks Azure setup fails with the below error:

main]:[06:33:19,966] [DEBUG] [AwbUtil] ( - stacktrace org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Failed to read external resource hdfs:///user/infoworks-user/temp//udfs//opt/infoworks/df/udfs/df-shared.jar

    at org.apache.hive.jdbc.Utils.verifySuccess(

    at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(

    at org.apache.hive.jdbc.HiveStatement.execute(

    at org.apache.hive.jdbc.HivePreparedStatement.execute(

    at io.infoworks.awb.utils.shared.SharedConnection.prepareConnection(

    at io.infoworks.awb.utils.shared.SharedConnection.getConnection(

    at io.infoworks.awb.utils.shared.SharedConnection.executeSharedConnection(

    at io.infoworks.awb.utils.shared.SharedConnection.execute(

    at io.infoworks.awb.utils.shared.SharedConnection.execute(

Root cause: 

This issue occurs because df-shared.jar is stored in Azure Blob Storage, but the Infoworks application tries to read it from the HDFS file system.


Resolution:

Perform the below steps to add the wasb location to the file, then trigger the pipeline build job again.

a) Log in to the Infoworks edge node.

b) Go to the /opt/infoworks/conf directory.

c) Open the file and add the below config at the end of the file.
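The steps above can be sketched as a shell session. This article does not name the config file, so the sketch appends to a temporary stand-in file and uses a placeholder wasb URI; replace both with the actual file under /opt/infoworks/conf and the URI taken from your pipeline build log.

```shell
# Sketch of steps (b)-(c). CONF is a temp stand-in because the article does
# not name the config file -- use the real file under /opt/infoworks/conf.
WASB_URI='wasb://<container>@<account>.blob.core.windows.net'  # placeholder; take from the build log
CONF="$(mktemp)"                      # stand-in for the real config file
echo "${WASB_URI}" >> "${CONF}"       # step (c): append the wasb location at the end
tail -n 1 "${CONF}"                   # confirm the line was added
```

Appending (rather than editing existing lines) keeps the change easy to spot and to roll back if the build still fails.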


Look for the below message in the pipeline build log to get the wasb URI.

-----Message from pipeline build log-----

[main]:[06:33:19,812] [TRACE] [IWFileSystem] ( - FileSystem created with uri wasb://
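To pull the URI out of a large log file programmatically, a grep/sed one-liner along these lines can help. The log file name `pipeline_build.log` and the sample URI written into it are placeholders, not actual values from this article; the extraction keys off the "FileSystem created with uri" text shown in the log excerpt above.

```shell
# Write a sample log line (placeholder URI), then extract the wasb URI from it.
printf '%s\n' '[main]:[06:33:19,812] [TRACE] [IWFileSystem] ( - FileSystem created with uri wasb://container@account.blob.core.windows.net/clusters/iw' > pipeline_build.log
grep -o 'uri wasb://[^ ]*' pipeline_build.log | sed 's/^uri //'
```

In practice, point the grep at your actual pipeline build log instead of the sample file.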


Applicable Versions

Infoworks Azure 2.7.1