Description: During Teradata TPT ingestion, Infoworks DataFoundry uses the Teradata Parallel Transporter (TPT) utility installed on the master node of the persistent cluster to extract data from the Teradata database into a CSV file, and then ingests the CSV data into the target storage system.


Set the below Advanced Configuration at the Source level before running the TPT ingestion job so that the generated TPT .out log file is copied to the DBFS location.


key: copy_tpt_log_to_target
value: true




If the ingestion job fails, download the Databricks job log from the Infoworks (IWX) UI.



Open the downloaded/extracted job log folder; you will see a message similar to the one below in the log4j-active.log file.


21/09/15 09:26:41 INFO TPTScriptGenerator: Copying tpt log file from /root/temp///tpt/logs/6942b1cd3589eeec67575061_EXPORT14325-1.out to dbfs:////iw/sources/TPT_source/b71082900ad36350d56ea1e2/tptlogs/6942b1cd3589eeec67575061/
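
The log4j-active.log file can be lengthy; a quick way to locate this line (a sketch, assuming a Linux/Unix shell on the machine where the log was extracted) is:

grep "Copying tpt log file" log4j-active.log

The destination path printed in this line is the DBFS location used in the next step.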

Get the .out log file from the above DBFS location and run the below command to get more details on the failure.

tlogview -l 6942b1cd3589eeec67575061_EXPORT14325-1.out
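
If you need to pull the .out file down to a local machine first, one option is the Databricks CLI (a sketch, assuming the CLI is already configured against the workspace and reusing the example paths from the log line above; your source, table, and job identifiers will differ):

databricks fs cp dbfs:/iw/sources/TPT_source/b71082900ad36350d56ea1e2/tptlogs/6942b1cd3589eeec67575061/6942b1cd3589eeec67575061_EXPORT14325-1.out .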


The Teradata TPT client must be installed on the machine from which you run the tlogview command in order to analyze the generated .out file.
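
To confirm the utility is available on that machine (assuming a Linux/Unix shell with the Teradata client tools installed), you can check:

which tlogview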


Applicable IWX versions:

v5.x