Teradata TPT full-load jobs fail with the following exception, raised on the Azure side, when Infoworks runs on the Azure platform:
[INFO] 2019-08-15 02:57:02,111 [pool-4-thread-1] infoworks.discovery.utils.TPTScriptGenerator:486 :: Error Output:
java.io.IOException: The block list may not contain more than 50,000 blocks. Please see the cause for further information.
This exception originates from Azure, not from Infoworks. During TPT ingestion, the Teradata Parallel Transporter utility extracts data from Teradata and writes it to a CSV file. When TPT uploads a CSV file larger than roughly 200 GB to Azure Blob storage, the upload fails with the exception above. This is an Azure-side limitation: a block blob may contain at most 50,000 blocks, so at the uploader's default block size a single file cannot exceed roughly 200 GB.
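The ~200 GB ceiling follows directly from the block count limit: the maximum blob size is 50,000 times the block size the upload client stages. A quick sketch of the arithmetic, assuming a 4 MiB block size (a common client default, e.g. in the azure-storage-blob Python SDK; the value actually used by the TPT upload path is not confirmed by the log above):

```python
# Maximum committed blocks per block blob (Azure service limit).
MAX_BLOCKS = 50_000

# Assumed uploader block size: 4 MiB. A common client default,
# not a value confirmed by the error message itself.
BLOCK_SIZE = 4 * 1024 * 1024

max_blob_bytes = MAX_BLOCKS * BLOCK_SIZE
print(max_blob_bytes)                     # 209715200000 bytes
print(round(max_blob_bytes / 10**9, 1))   # ~209.7 decimal GB
print(round(max_blob_bytes / 2**30, 1))   # ~195.3 GiB
```

At 4 MiB blocks the cap works out to about 195 GiB, which matches the "around 200 GB" failure point seen in practice.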
To overcome this, split the file into smaller chunks, or increase the number of writers so that no single file exceeds 200 GB.
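One way to split an existing extract before upload is the standard `split` utility. The commands below use a small synthetic file so they run anywhere; the file names are placeholders, not paths produced by Infoworks:

```shell
# Demonstrate splitting an extract into fixed-size pieces with `split`.
# For a real 200+ GB TPT extract you would use something like:
#   split -b 100G -d extract.csv extract_part_
# Here a 1000-byte stand-in file is split into 300-byte pieces.
tmpdir=$(mktemp -d)
head -c 1000 /dev/zero > "$tmpdir/extract.csv"      # stand-in for the TPT CSV
split -b 300 -d "$tmpdir/extract.csv" "$tmpdir/extract_part_"
ls "$tmpdir"/extract_part_*                         # 4 pieces: 300+300+300+100 bytes
```

Note that byte-based splitting (`-b`) can cut a CSV row in half at a chunk boundary; GNU `split -C 100G` (split on line boundaries up to a size limit) or line-count splitting (`-l`) keeps rows intact if the downstream loader needs well-formed CSV files.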
A block blob can include a maximum of 50,000 committed blocks; uncommitted blocks must be committed before they become part of the blob's content. A blob can have a maximum of 100,000 uncommitted blocks at any given time. If this maximum count is exceeded, the service returns status code 409 (RequestEntityTooLargeBlockCountExceedsLimit).
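Conversely, if the upload client allows the block size to be configured, you can compute the smallest block size that keeps a given file within the 50,000-block limit. A sketch, using a hypothetical 300 GB extract as the example:

```python
import math

# Maximum committed blocks per block blob (Azure service limit).
MAX_BLOCKS = 50_000

def min_block_size(file_bytes: int) -> int:
    """Smallest block size (bytes) that fits the file in <= 50,000 blocks."""
    return math.ceil(file_bytes / MAX_BLOCKS)

# Hypothetical 300 GB extract: needs blocks of at least 6 MB each.
print(min_block_size(300 * 10**9))  # 6000000
```

In the azure-storage-blob Python SDK, for example, blob clients accept a `max_block_size` keyword that controls the staged block size; check the equivalent setting for whichever client your upload path uses.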
Perform the following steps to resolve this issue.