Problem Description

Whenever Ingestion, Pipeline, or Export jobs are submitted from Infoworks, Infoworks runs them on Databricks job clusters. Databricks enforces a hard limit of 1000 jobs per workspace, and once the number of jobs created from Infoworks exceeds this threshold, new jobs start failing with quota limit exceptions.

To overcome this issue, run the attached script to clear the completed or failed jobs from the Databricks jobs list.

How to run the script?

To clean up only Infoworks jobs: python <script_name> -s <host> -t <token>

To clean up all Databricks jobs: python <script_name> -s <host> -t <token> -j all

Example: source /opt/infoworks/bin/; python <script_name> -s <host> -t abcdefg-token
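For reference, a minimal sketch of what such a cleanup does, using the Databricks REST Jobs API 2.0 (`jobs/list` and `jobs/delete`). This is not the attached Infoworks script: the `iw_` job-name prefix used to recognize Infoworks-created jobs is a hypothetical naming convention, and `host`/`token` correspond to the `-s`/`-t` arguments above.

```python
import json
import urllib.request

IW_PREFIX = "iw_"  # hypothetical prefix identifying Infoworks-created jobs


def should_delete(job, scope="infoworks"):
    """Return True if this job entry falls under the requested cleanup scope."""
    if scope == "all":
        return True
    name = job.get("settings", {}).get("name", "")
    return name.startswith(IW_PREFIX)


def api_call(host, token, endpoint, payload=None):
    """Minimal authenticated call to the Databricks REST API 2.0."""
    url = f"https://{host}/api/2.0/{endpoint}"
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        url,
        data=data,  # None => GET, body => POST
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


def cleanup(host, token, scope="infoworks"):
    """Delete job definitions in scope; returns the number removed."""
    jobs = api_call(host, token, "jobs/list").get("jobs", [])
    removed = 0
    for job in jobs:
        if should_delete(job, scope):
            api_call(host, token, "jobs/delete", {"job_id": job["job_id"]})
            removed += 1
    return removed
```

Passing `scope="all"` mirrors the `-j all` flag: every job definition is deleted regardless of name, which frees quota but also removes jobs created outside Infoworks.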

Affects version

Infoworks Datafoundry 3.x onwards