When triggering a job to run in a secured Hadoop environment, the following error message is logged in the conductor logs:
    [datameer] ERROR [2015-01-01 12:00:00.000] [2804321940-1239] (JobExecutionTraceRetrievalService.java:89) - Unable to read job execution trace log
    java.io.IOException: Failed on local exception: java.io.IOException: Couldn't setup connection for datameer@REALM to datameerapp1.datameer.com/10.1.1.201:8020; Host Details : local host is: "hadoopapp1/10.1.1.101"; destination host is: "hadoopapp2.datameer.com":8020;
    ...
    Caused by: java.io.IOException: Couldn't setup connection for datameer@REALM to hadoopapp1.datameer.com/10.1.1.101:8020
    ...
    Caused by: javax.security.sasl.SaslException: No common protection layer between client and server
In this environment, the hadoop.rpc.protection value configured on the Datameer side does not match the value configured on the Hadoop cluster. Because the two sides cannot agree on a SASL quality of protection (QOP), the connection fails with "No common protection layer between client and server".
This is a known issue with Apache Hadoop: HDFS-5688.
To work around this issue, Apache indicates that hadoop.rpc.protection should be set to the same single value on both the client and the cluster.
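As a sketch, the property would be aligned in core-site.xml on both sides. The value "privacy" below is illustrative only; substitute whichever single value your cluster actually uses (valid values are authentication, integrity, and privacy):

```xml
<!-- core-site.xml: must carry the same single value on the client
     (Datameer) and on the Hadoop cluster. "privacy" here is an
     illustrative example, not the value from this environment. -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>
```

After changing the value, restart the affected services. The effective cluster-side value can be checked with: hdfs getconf -confKey hadoop.rpc.protection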
For a permanent resolution, check for updates to HDFS-5688 in your Hadoop distribution.