Which services should be installed with Ambari?
Posted by a forum user, 2022-08-15 17:25
Answered by a forum user, 2023-09-17 17:26:
I ran into a similar problem and haven't figured it out either. Did you ever solve yours? My error is below; please let me know if you have seen it before. Thanks.
stderr: /var/lib/ambari-agent/data/errors-413.txt
2015-03-11 09:34:49,348 - Error while executing command 'any':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 31, in hook
setup_hadoop_env()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 113, in setup_hadoop_env
content=InlineTemplate(params.hadoop_env_sh_template)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 93, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
Fail: Applying File['/etc/hadoop/conf/hadoop-env.sh'] failed, parent directory /etc/hadoop/conf doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python2.6', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-413.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-413.json', 'INFO', '/var/lib/ambari-agent/data/tmp']
stdout: /var/lib/ambari-agent/data/output-413.txt
2015-03-11 09:34:48,511 - Group['hadoop'] {'ignore_failures': False}
2015-03-11 09:34:48,512 - Adding group Group['hadoop']
2015-03-11 09:34:48,586 - Group['nobody'] {'ignore_failures': False}
2015-03-11 09:34:48,587 - Modifying group nobody
2015-03-11 09:34:48,663 - Group['users'] {'ignore_failures': False}
2015-03-11 09:34:48,663 - Modifying group users
2015-03-11 09:34:48,737 - Group['nagios'] {'ignore_failures': False}
2015-03-11 09:34:48,738 - Adding group Group['nagios']
2015-03-11 09:34:48,810 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2015-03-11 09:34:48,811 - Modifying user nobody
2015-03-11 09:34:48,885 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-11 09:34:48,886 - Adding user User['nagios']
2015-03-11 09:34:48,958 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-03-11 09:34:48,958 - Adding user User['ambari-qa']
2015-03-11 09:34:49,027 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-11 09:34:49,028 - Adding user User['zookeeper']
2015-03-11 09:34:49,103 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-11 09:34:49,104 - Adding user User['hdfs']
2015-03-11 09:34:49,177 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-03-11 09:34:49,180 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-03-11 09:34:49,253 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2015-03-11 09:34:49,254 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-03-11 09:34:49,255 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-03-11 09:34:49,328 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-03-11 09:34:49,348 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2015-03-11 09:34:49,348 - Error while executing command 'any':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 31, in hook
setup_hadoop_env()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 113, in setup_hadoop_env
content=InlineTemplate(params.hadoop_env_sh_template)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 93, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
Fail: Applying File['/etc/hadoop/conf/hadoop-env.sh'] failed, parent directory /etc/hadoop/conf doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python2.6', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-413.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-413.json', 'INFO', '/var/lib/ambari-agent/data/tmp']
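The "Skipping Link['/etc/hadoop/conf'] due to not_if" line just before the failure is the clue: the not_if check is `ls /etc/hadoop/conf`, and `ls` exits 0 even when the path is a dangling symlink (e.g. a leftover link to a removed conf directory). Ambari therefore skips recreating the link, and the later File['/etc/hadoop/conf/hadoop-env.sh'] fails because the link's target directory does not exist. This dangling-symlink diagnosis is an assumption inferred from the log, not confirmed by the poster; a minimal sketch of the mechanism, using a scratch directory instead of the real /etc/hadoop paths:

```python
# Demonstrates why Ambari's not_if check passed while the parent
# directory was "missing" (assumes the real cause was a dangling
# /etc/hadoop/conf symlink; paths here are scratch-dir stand-ins).
import os
import subprocess
import tempfile

tmp = tempfile.mkdtemp()
link = os.path.join(tmp, "conf")            # stands in for /etc/hadoop/conf
target = os.path.join(tmp, "conf.empty")    # stands in for /etc/hadoop/conf.empty

os.symlink(target, link)                    # target does not exist yet: dangling link

# `ls` lists the dangling link itself and exits 0 (GNU coreutils),
# so a not_if of `ls /etc/hadoop/conf` would skip the Link resource.
rc = subprocess.call(["ls", link],
                     stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
print("ls exit code:", rc)                  # 0

# But the link does not resolve to a directory, so writing
# conf/hadoop-env.sh would fail with "parent directory doesn't exist".
print("resolves to a dir:", os.path.isdir(link))   # False

# Creating the target (recreating conf.empty) makes the link usable again.
os.makedirs(target)
print("resolves to a dir:", os.path.isdir(link))   # True
```

If this matches your node, removing the stale /etc/hadoop/conf link (so Ambari recreates it) or recreating /etc/hadoop/conf.empty and retrying the install is a plausible fix.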