Goal
If you encounter execution errors or long run times when importing from Hive, you might need to enable additional logging. This logging helps you debug the job and find out what is preventing the import from completing successfully.
Learn
First, configure the artifact in question for a specific execution framework, then enable the enhanced logging.
For MapReduce v2, set the related properties within the import-specific custom properties:
das.debug.tasks.logs.collect.force=true
das.execution-framework=MapReduce
For Tez, set the related properties within the import-specific custom properties:
das.debug.tasks.logs.collect.force=true
das.execution-framework=Tez
tez.task.log.level=DEBUG
tez.am.log.level=DEBUG
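With das.debug.tasks.logs.collect.force=true set, the task logs are collected even for successful runs. Assuming a YARN-managed cluster, the aggregated task logs can also be pulled for offline inspection with the standard YARN CLI; the application ID below is a placeholder you would take from the job details or the ResourceManager UI:

```
# Retrieve the aggregated task logs for the slow or failing import job.
# <application_id> is a placeholder - look it up in the job details
# or the YARN ResourceManager UI.
yarn logs -applicationId <application_id> > import-task-logs.txt

# Scan the collected logs for obvious failures first.
grep -i -e "error" -e "exception" import-task-logs.txt
```

This requires YARN log aggregation to be enabled on the cluster; without it, the logs remain on the individual NodeManager hosts.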
Set the Default Log Severity to TRACE.
Add the following to Logging Customization:
### For Datameer service
log4j.category.datameer=TRACE
log4j.category.datameer.dap.conductor.webapp.controller.data=DEBUG
log4j.category.datameer.dap.common.service=DEBUG
### For Apache Hadoop Framework
log4j.category.org.apache.hadoop=DEBUG
log4j.category.org.apache.thrift=DEBUG
### For Hive plugin
log4j.category.datameer.das.plugin.hive=DEBUG
### For LDAP authentication
log4j.category.datameer.dap.conductor.authentication=DEBUG
log4j.category.datameer.dap.plugin.ldap.authentication=DEBUG
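Once these categories are active, the additional output is written to the Datameer server log. As a sketch of how to isolate the Hive plugin messages while the import runs (the log path is installation-specific; conductor.log under the Datameer logs directory is an assumption, so adjust the path for your environment):

```
# Follow the server log and filter for Hive plugin DEBUG messages.
# <datameer-home> is a placeholder for your installation directory.
tail -f <datameer-home>/logs/conductor.log | grep "das.plugin.hive"
```

Remember to revert the log levels after debugging, as TRACE and DEBUG output grows the logs quickly and can slow down job execution.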