hadoop - Monitoring status of Oozie jobs submitted through Python subprocess
I'm triggering an Oozie workflow with a shell command executed via Python's subprocess module:

    popen("oozie job -config workflow.properties -run".split())

The execution of the remainder of the program depends on the success or failure of various actions in the workflow. When triggering an Oozie job interactively, I can use the Oozie job ID to monitor the success/failure status of the job with:

    oozie job -info <jobid>

I want to access the status of the various action nodes programmatically, through Python. Is there a way to accomplish this?
What I've tried so far: the Oozie MapReduce action places a _SUCCESS file in the output folder upon successful job completion, so I polled for the presence of that file every 2 seconds. However, I'm wondering if there is a more direct API or shell call for the status of the action nodes.
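For illustration, the CLI route from the question can be scripted end to end: capture the job ID that "oozie job ... -run" prints (a line of the form "job: <jobid>"), then poll "oozie job -info <jobid>" until the workflow leaves the RUNNING state, instead of watching for _SUCCESS files. This is a minimal sketch; the Oozie server URL and the output parsing are assumptions about a typical setup, not part of the question.

```python
import re
import subprocess
import time

OOZIE_URL = "http://oozie-host:11000/oozie"  # placeholder Oozie server URL


def parse_job_id(run_output):
    """Extract the job ID from `oozie job -run` output ('job: <jobid>')."""
    match = re.search(r"job:\s*(\S+)", run_output)
    return match.group(1) if match else None


def parse_status(info_output):
    """Extract the workflow status from `oozie job -info` output ('Status : RUNNING')."""
    match = re.search(r"Status\s*:\s*(\w+)", info_output)
    return match.group(1) if match else None


def wait_for_completion(job_id, poll_seconds=2):
    """Poll `oozie job -info` until the job leaves the PREP/RUNNING states."""
    while True:
        out = subprocess.check_output(
            ["oozie", "job", "-oozie", OOZIE_URL, "-info", job_id],
            universal_newlines=True,
        )
        status = parse_status(out)
        if status not in ("PREP", "RUNNING"):
            return status  # e.g. SUCCEEDED, KILLED, FAILED, SUSPENDED
        time.sleep(poll_seconds)


if __name__ == "__main__":
    # Submit the workflow exactly as in the question, but capture the output.
    out = subprocess.check_output(
        ["oozie", "job", "-oozie", OOZIE_URL,
         "-config", "workflow.properties", "-run"],
        universal_newlines=True,
    )
    job_id = parse_job_id(out)
    print(job_id, wait_for_completion(job_id))
```

This still only gives the overall workflow status; per-action status is easier to get from the REST API described in the answer below.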
Related(ish) questions: this post discusses the reverse problem (triggering Python jobs through Oozie), so it doesn't answer my question.
You have REST API access to the logs and status: you can submit an Oozie job and monitor it. You can call the Oozie server's REST APIs periodically to pull the status out. By the way, this is applicable to any job that can be submitted to Oozie.
For reference: https://oozie.apache.org/docs/4.1.0/WebServicesAPI.html
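As a sketch of that REST route (the server host and job ID below are placeholders): a GET on /oozie/v1/job/<jobid>?show=info returns JSON whose "status" field holds the overall workflow state and whose "actions" array holds the per-action-node statuses, which is exactly the per-node information the question asks for.

```python
import json
import urllib.request

OOZIE_URL = "http://oozie-host:11000/oozie"  # placeholder Oozie server URL


def fetch_job_info(job_id):
    """GET the job-info JSON from the Oozie v1 web services API."""
    url = "%s/v1/job/%s?show=info" % (OOZIE_URL, job_id)
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))


def action_statuses(info):
    """Map each workflow action node's name to its status."""
    return {a["name"]: a["status"] for a in info.get("actions", [])}


if __name__ == "__main__":
    info = fetch_job_info("0000001-170101000000000-oozie-W")  # placeholder job ID
    print(info["status"], action_statuses(info))
```

Polling this endpoint every few seconds replaces the _SUCCESS-file check and also reveals which individual action node failed.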