Executing an import job of a JSON object fails when using the Tez execution framework. The same job, however, runs successfully when forced to run in the MapReduce (Hadoop) execution framework.
This is a configuration issue for the particular Tez job.
To resolve this issue, increase the available resources for the following parameters:
As a starting point, you may want to double the initial values. The initial values can be found in the job-conf-cluster.xml file in the Job Trace logs.
To work around this issue, run the job using the MapReduce execution framework by adding the following parameter to the job:
das.execution-framework=Hadoop (for Datameer <5.3) or
das.execution-framework=MapReduce (since Datameer 5.3).
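
For example, on a Datameer 5.3 or later installation, the custom property would look like this (a sketch; on versions before 5.3 the value is Hadoop instead):

```
# Force the job to use the MapReduce execution framework instead of Tez
# On Datameer versions before 5.3, use: das.execution-framework=Hadoop
das.execution-framework=MapReduce
```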