Goal
Learn how to prioritize jobs submitted by Datameer to a Hadoop cluster.
Learn
Depending on whether you are using impersonation, you can employ one of the following mechanisms.
Without Impersonation
If you are not using impersonation, you can direct jobs to specific cluster queues at either a global or a per-job level.
To do this on a global level:
- Open the Administration tab in Datameer.
- Select Hadoop Cluster from the side menu.
- To send all jobs of all framework types to the same queue, add the following property in the Custom Properties field:
das.job.queue=<cluster queue name>
- To specify a job queue for a preferred Execution Framework, add one of the following properties in the Custom Properties field:
tez.queue.name=<cluster queue name>
mapreduce.job.queuename=<cluster queue name> (MapReduce is a deprecated framework)
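For example, assuming a hypothetical YARN queue named prod_analytics, sending every Datameer job to that queue would look like this in the Custom Properties field:
das.job.queue=prod_analytics
Routing only Tez jobs to that queue would instead use:
tez.queue.name=prod_analytics
The queue name here is purely illustrative; replace it with a queue that actually exists on your cluster.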
Note: To do this on an individual job level, add one of these properties to the Custom Properties field within the specific artifact's Advanced Configuration.
With Impersonation
Datameer users running impersonation don't need to set any scheduling properties in Datameer. Jobs coming from Datameer are already labeled with the submitting (impersonated) user, and all queue configuration is done on the Hadoop cluster itself.
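For reference, on clusters that use the YARN Capacity Scheduler, this user-to-queue routing is often configured through queue mappings in capacity-scheduler.xml. The user, group, and queue names below are hypothetical:
yarn.scheduler.capacity.queue-mappings=u:etl_user:etl,g:analysts:adhoc
This example maps jobs submitted by the impersonated user etl_user to the etl queue and jobs from members of the analysts group to the adhoc queue. Consult your Hadoop distribution's scheduler documentation for the exact mechanism available on your cluster.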