MapReduce Application Master REST API's

Overview

The MapReduce Application Master REST API's allow the user to get status on the running MapReduce application master. Currently this is the equivalent to a running MapReduce job. The information includes the jobs the app master is running and all the job particulars like tasks, counters, configuration, attempts, etc. The application master should be accessed via the proxy. This proxy is configurable to run either on the resource manager or on a separate host. The proxy URL usually looks like: http://proxy-http-address:port/proxy/appid
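
Every resource described below hangs off the ws/v1/mapreduce root under that proxy URL. As a minimal, illustrative sketch (assuming the Python requests library and placeholder values for the proxy address and application id), a client can build the base URL once and pick JSON or XML through the Accept header:

import requests

# Placeholder values for illustration only; substitute your proxy address and application id.
PROXY = "http://proxy-http-address:port"
APP_ID = "application_1326232085508_0003"
BASE = f"{PROXY}/proxy/{APP_ID}/ws/v1/mapreduce"

# Ask for JSON; pass {"Accept": "application/xml"} instead to get the XML form.
resp = requests.get(f"{BASE}/info", headers={"Accept": "application/json"})
resp.raise_for_status()
print(resp.json()["info"])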

Mapreduce Application Master Info API

The MapReduce application master information resource provides overall information about that mapreduce application master. This includes application id, time it was started, user, name, etc.

URI

Both of the following URI's give you the mapreduce application master information, from an application id identified by the appid value.

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce
  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/info

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the info object

When you make a request for the mapreduce application master information, the information will be returned as an info object.

Item | Data Type | Description
appId | string | The application id
startedOn | long | The time the application started (in ms since epoch)
name | string | The name of the application
user | string | The user name of the user who started the application
elapsedTime | long | The time since the application was started (in ms)

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0003/ws/v1/mapreduce/info

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
  "info" : {
      "appId" : "application_1326232085508_0003",
      "startedOn" : 1326238244047,
      "user" : "user1",
      "name" : "Sleep job",
      "elapsedTime" : 32374
   }
}

XML response

HTTP Request:

  Accept: application/xml
  GET http://proxy-http-address:port/proxy/application_1326232085508_0003/ws/v1/mapreduce/info

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 223
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<info>
  <appId>application_1326232085508_0003</appId>
  <name>Sleep job</name>
  <user>user1</user>
  <startedOn>1326238244047</startedOn>
  <elapsedTime>32407</elapsedTime>
</info>

Jobs API

The jobs resource provides a list of the jobs running on this application master. See also Job API for syntax of the job object.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the jobs object

When you make a request for a list of jobs, the information will be returned as a collection of job objects. See also Job API for syntax of the job object.

Item | Data Type | Description
job | array of job objects(JSON)/zero or more job objects(XML) | The collection of job objects
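
As an illustrative sketch (the placeholder URL and the Python requests library are assumptions, not part of the API), a client could list every job with its state and progress like this:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

jobs = requests.get(f"{BASE}/jobs", headers={"Accept": "application/json"}).json()

# In the JSON form "job" is an array of job objects.
for job in jobs["jobs"]["job"]:
    print(job["id"], job["state"],
          f"maps {job['mapProgress']}%, reduces {job['reduceProgress']}%")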

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
  "jobs" : {
      "job" : [
         {
            "runningReduceAttempts" : 1,
            "reduceProgress" : 100,
            "failedReduceAttempts" : 0,
            "newMapAttempts" : 0,
            "mapsRunning" : 0,
            "state" : "RUNNING",
            "successfulReduceAttempts" : 0,
            "reducesRunning" : 1,
            "acls" : [
               {
                  "value" : " ",
                  "name" : "mapreduce.job.acl-modify-job"
               },
               {
                  "value" : " ",
                  "name" : "mapreduce.job.acl-view-job"
               }
            ],
            "reducesPending" : 0,
            "user" : "user1",
            "reducesTotal" : 1,
            "mapsCompleted" : 1,
            "startTime" : 1326238769379,
            "id" : "job_1326232085508_4_4",
            "successfulMapAttempts" : 1,
            "runningMapAttempts" : 0,
            "newReduceAttempts" : 0,
            "name" : "Sleep job",
            "mapsPending" : 0,
            "elapsedTime" : 59377,
            "reducesCompleted" : 0,
            "mapProgress" : 100,
            "diagnostics" : "",
            "failedMapAttempts" : 0,
            "killedReduceAttempts" : 0,
            "mapsTotal" : 1,
            "uberized" : false,
            "killedMapAttempts" : 0,
            "finishTime" : 0
         }
     ]
   }
 }

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 1214
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobs>
  <job>
    <startTime>1326238769379</startTime>
    <finishTime>0</finishTime>
    <elapsedTime>59416</elapsedTime>
    <id>job_1326232085508_4_4</id>
    <name>Sleep job</name>
    <user>user1</user>
    <state>RUNNING</state>
    <mapsTotal>1</mapsTotal>
    <mapsCompleted>1</mapsCompleted>
    <reducesTotal>1</reducesTotal>
    <reducesCompleted>0</reducesCompleted>
    <mapProgress>100.0</mapProgress>
    <reduceProgress>100.0</reduceProgress>
    <mapsPending>0</mapsPending>
    <mapsRunning>0</mapsRunning>
    <reducesPending>0</reducesPending>
    <reducesRunning>1</reducesRunning>
    <uberized>false</uberized>
    <diagnostics/>
    <newReduceAttempts>0</newReduceAttempts>
    <runningReduceAttempts>1</runningReduceAttempts>
    <failedReduceAttempts>0</failedReduceAttempts>
    <killedReduceAttempts>0</killedReduceAttempts>
    <successfulReduceAttempts>0</successfulReduceAttempts>
    <newMapAttempts>0</newMapAttempts>
    <runningMapAttempts>0</runningMapAttempts>
    <failedMapAttempts>0</failedMapAttempts>
    <killedMapAttempts>0</killedMapAttempts>
    <successfulMapAttempts>1</successfulMapAttempts>
    <acls>
      <name>mapreduce.job.acl-modify-job</name>
      <value> </value>
    </acls>
    <acls>
      <name>mapreduce.job.acl-view-job</name>
      <value> </value>
    </acls>
  </job>
</jobs>

Job API

A job resource contains information about a particular job that was started by this application master. Certain fields are only accessible if the user has permissions - depends on acl settings.

URI

Use the following URI to obtain a job object, from a job identified by the jobid value.

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the job object

Item | Data Type | Description
id | string | The job id
name | string | The job name
user | string | The user name
state | string | The job state - valid values are: NEW, INITED, RUNNING, SUCCEEDED, FAILED, KILL_WAIT, KILLED, ERROR
startTime | long | The time the job started (in ms since epoch)
finishTime | long | The time the job finished (in ms since epoch)
elapsedTime | long | The elapsed time since the job started (in ms)
mapsTotal | int | The total number of maps
mapsCompleted | int | The number of completed maps
reducesTotal | int | The total number of reduces
reducesCompleted | int | The number of completed reduces
diagnostics | string | A diagnostic message
uberized | boolean | Indicates if the job was an uber job - ran completely in the application master
mapsPending | int | The number of maps still to be run
mapsRunning | int | The number of running maps
reducesPending | int | The number of reduces still to be run
reducesRunning | int | The number of running reduces
newReduceAttempts | int | The number of new reduce attempts
runningReduceAttempts | int | The number of running reduce attempts
failedReduceAttempts | int | The number of failed reduce attempts
killedReduceAttempts | int | The number of killed reduce attempts
successfulReduceAttempts | int | The number of successful reduce attempts
newMapAttempts | int | The number of new map attempts
runningMapAttempts | int | The number of running map attempts
failedMapAttempts | int | The number of failed map attempts
killedMapAttempts | int | The number of killed map attempts
successfulMapAttempts | int | The number of successful map attempts
acls | array of acls(json)/zero or more acls objects(xml) | A collection of acls objects

Elements of the acls object

Item | Data Type | Description
value | string | The acl value
name | string | The acl name
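
As an illustrative sketch (the placeholder URL and the requests library are assumptions), fetching one job and summarizing its progress and ACLs might look like:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

job = requests.get(f"{BASE}/jobs/job_1326232085508_4_4",
                   headers={"Accept": "application/json"}).json()["job"]

completed = job["mapsCompleted"] + job["reducesCompleted"]
total = job["mapsTotal"] + job["reducesTotal"]
print(f"{job['name']} ({job['id']}): {job['state']}, {completed}/{total} tasks completed")

# The view/modify ACLs come back as a list of {name, value} objects.
for acl in job["acls"]:
    print(acl["name"], "=", repr(acl["value"]))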

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Server: Jetty(6.1.26)
  Content-Length: 720

Response Body:

{
   "job" : {
      "runningReduceAttempts" : 1,
      "reduceProgress" : 100,
      "failedReduceAttempts" : 0,
      "newMapAttempts" : 0,
      "mapsRunning" : 0,
      "state" : "RUNNING",
      "successfulReduceAttempts" : 0,
      "reducesRunning" : 1,
      "acls" : [
         {
            "value" : " ",
            "name" : "mapreduce.job.acl-modify-job"
         },
         {
            "value" : " ",
            "name" : "mapreduce.job.acl-view-job"
         }
      ],
      "reducesPending" : 0,
      "user" : "user1",
      "reducesTotal" : 1,
      "mapsCompleted" : 1,
      "startTime" : 1326238769379,
      "id" : "job_1326232085508_4_4",
      "successfulMapAttempts" : 1,
      "runningMapAttempts" : 0,
      "newReduceAttempts" : 0,
      "name" : "Sleep job",
      "mapsPending" : 0,
      "elapsedTime" : 59437,
      "reducesCompleted" : 0,
      "mapProgress" : 100,
      "diagnostics" : "",
      "failedMapAttempts" : 0,
      "killedReduceAttempts" : 0,
      "mapsTotal" : 1,
      "uberized" : false,
      "killedMapAttempts" : 0,
      "finishTime" : 0
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 1201
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<job>
  <startTime>1326238769379</startTime>
  <finishTime>0</finishTime>
  <elapsedTime>59474</elapsedTime>
  <id>job_1326232085508_4_4</id>
  <name>Sleep job</name>
  <user>user1</user>
  <state>RUNNING</state>
  <mapsTotal>1</mapsTotal>
  <mapsCompleted>1</mapsCompleted>
  <reducesTotal>1</reducesTotal>
  <reducesCompleted>0</reducesCompleted>
  <mapProgress>100.0</mapProgress>
  <reduceProgress>100.0</reduceProgress>
  <mapsPending>0</mapsPending>
  <mapsRunning>0</mapsRunning>
  <reducesPending>0</reducesPending>
  <reducesRunning>1</reducesRunning>
  <uberized>false</uberized>
  <diagnostics/>
  <newReduceAttempts>0</newReduceAttempts>
  <runningReduceAttempts>1</runningReduceAttempts>
  <failedReduceAttempts>0</failedReduceAttempts>
  <killedReduceAttempts>0</killedReduceAttempts>
  <successfulReduceAttempts>0</successfulReduceAttempts>
  <newMapAttempts>0</newMapAttempts>
  <runningMapAttempts>0</runningMapAttempts>
  <failedMapAttempts>0</failedMapAttempts>
  <killedMapAttempts>0</killedMapAttempts>
  <successfulMapAttempts>1</successfulMapAttempts>
  <acls>
    <name>mapreduce.job.acl-modify-job</name>
    <value> </value>
  </acls>
  <acls>
    <name>mapreduce.job.acl-view-job</name>
    <value> </value>
  </acls>
</job>

Job Attempts API

With the job attempts API, you can obtain a collection of resources that represent a job attempt. When you run a GET operation on this resource, you obtain a collection of Job Attempt Objects.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/jobattempts

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the jobAttempts object

When you make a request for the list of job attempts, the information will be returned as an array of job attempt objects.

Item | Data Type | Description
jobAttempt | array of job attempt objects(JSON)/zero or more job attempt objects(XML) | The collection of job attempt objects

Elements of the jobAttempt object

Item | Data Type | Description
id | string | The job attempt id
nodeId | string | The node id of the node the attempt ran on
nodeHttpAddress | string | The node http address of the node the attempt ran on
logsLink | string | The http link to the job attempt logs
containerId | string | The id of the container for the job attempt
startTime | long | The start time of the attempt (in ms since epoch)
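
As an illustrative sketch (the placeholder URL and the requests library are assumptions), printing the container and log link for each application attempt of a job could look like:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

resp = requests.get(f"{BASE}/jobs/job_1326232085508_4_4/jobattempts",
                    headers={"Accept": "application/json"})
for attempt in resp.json()["jobAttempts"]["jobAttempt"]:
    print(f"attempt {attempt['id']} ran in {attempt['containerId']} on {attempt['nodeId']}")
    print("  logs:", attempt["logsLink"])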

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/jobattempts

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "jobAttempts" : {
      "jobAttempt" : [
         {
            "nodeId" : "host.domain.com:8041",
            "nodeHttpAddress" : "host.domain.com:8042",
            "startTime" : 1326238773493,
            "id" : 1,
            "logsLink" : "http://host.domain.com:8042/node/containerlogs/container_1326232085508_0004_01_000001",
            "containerId" : "container_1326232085508_0004_01_000001"
         }
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/jobattempts
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 498
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobAttempts>
  <jobAttempt>
    <nodeHttpAddress>host.domain.com:8042</nodeHttpAddress>
    <nodeId>host.domain.com:8041</nodeId>
    <id>1</id>
    <startTime>1326238773493</startTime>
    <containerId>container_1326232085508_0004_01_000001</containerId>
    <logsLink>http://host.domain.com:8042/node/containerlogs/container_1326232085508_0004_01_000001</logsLink>
  </jobAttempt>
</jobAttempts>

Job Counters API

With the job counters API, you can obtain a collection of resources that represent all the counters for that job.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/counters

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the jobCounters object

Item | Data Type | Description
id | string | The job id
counterGroup | array of counterGroup objects(JSON)/zero or more counterGroup objects(XML) | A collection of counter group objects

Elements of the counterGroup object

Item | Data Type | Description
counterGroupName | string | The name of the counter group
counter | array of counter objects(JSON)/zero or more counter objects(XML) | A collection of counter objects

Elements of the counter object

Item | Data Type | Description
name | string | The name of the counter
reduceCounterValue | long | The counter value of reduce tasks
mapCounterValue | long | The counter value of map tasks
totalCounterValue | long | The counter value of all tasks
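
As an illustrative sketch (the placeholder URL and the requests library are assumptions), the nested counter groups can be flattened into a simple lookup keyed by group and counter name:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

data = requests.get(f"{BASE}/jobs/job_1326232085508_4_4/counters",
                    headers={"Accept": "application/json"}).json()["jobCounters"]

# Flatten the counter groups into a (group name, counter name) -> total value lookup.
totals = {
    (group["counterGroupName"], counter["name"]): counter["totalCounterValue"]
    for group in data["counterGroup"]
    for counter in group["counter"]
}
print(totals[("org.apache.hadoop.mapreduce.FileSystemCounter", "FILE_BYTES_READ")])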

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/counters

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "jobCounters" : {
      "id" : "job_1326232085508_4_4",
      "counterGroup" : [
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "CONNECTION"
               },
              {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2483,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 108763,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 48,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "MAP_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1200,
                  "name" : "MAP_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 4800,
                  "name" : "MAP_OUTPUT_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2235,
                  "name" : "MAP_OUTPUT_MATERIALIZED_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 48,
                  "name" : "SPLIT_RAW_BYTES"
               },
              {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 460,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 460,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1200,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
                },
                {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 58,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1580,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 462643200,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2149728256,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
              {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 357957632,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BYTES_READ"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/counters
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 7027
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobCounters>
  <id>job_1326232085508_4_4</id>
  <counterGroup>
    <counterGroupName>Shuffle Errors</counterGroupName>
    <counter>
      <name>BAD_ID</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>CONNECTION</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>IO_ERROR</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>WRONG_LENGTH</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>WRONG_MAP</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>WRONG_REDUCE</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
  </counterGroup>
  <counterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.FileSystemCounter</counterGroupName>
    <counter>
      <name>FILE_BYTES_READ</name>
      <totalCounterValue>2483</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>FILE_BYTES_WRITTEN</name>
      <totalCounterValue>108763</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>FILE_READ_OPS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>FILE_LARGE_READ_OPS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>FILE_WRITE_OPS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>HDFS_BYTES_READ</name>
      <totalCounterValue>48</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>HDFS_BYTES_WRITTEN</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>HDFS_READ_OPS</name>
      <totalCounterValue>1</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>HDFS_LARGE_READ_OPS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>HDFS_WRITE_OPS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
  </counterGroup>
  <counterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.TaskCounter</counterGroupName>
    <counter>
      <name>MAP_INPUT_RECORDS</name>
      <totalCounterValue>1</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>MAP_OUTPUT_RECORDS</name>
      <totalCounterValue>1200</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>MAP_OUTPUT_BYTES</name>
      <totalCounterValue>4800</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>MAP_OUTPUT_MATERIALIZED_BYTES</name>
      <totalCounterValue>2235</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>SPLIT_RAW_BYTES</name>
      <totalCounterValue>48</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>COMBINE_INPUT_RECORDS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>COMBINE_OUTPUT_RECORDS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>REDUCE_INPUT_GROUPS</name>
      <totalCounterValue>460</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>REDUCE_SHUFFLE_BYTES</name>
      <totalCounterValue>2235</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>REDUCE_INPUT_RECORDS</name>
      <totalCounterValue>460</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>REDUCE_OUTPUT_RECORDS</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>SPILLED_RECORDS</name>
      <totalCounterValue>1200</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>SHUFFLED_MAPS</name>
      <totalCounterValue>1</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>FAILED_SHUFFLE</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>MERGED_MAP_OUTPUTS</name>
      <totalCounterValue>1</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>GC_TIME_MILLIS</name>
      <totalCounterValue>58</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>CPU_MILLISECONDS</name>
      <totalCounterValue>1580</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>PHYSICAL_MEMORY_BYTES</name>
      <totalCounterValue>462643200</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>VIRTUAL_MEMORY_BYTES</name>
      <totalCounterValue>2149728256</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
    <counter>
      <name>COMMITTED_HEAP_BYTES</name>
      <totalCounterValue>357957632</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
  </counterGroup>
  <counterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter</counterGroupName>
    <counter>
      <name>BYTES_READ</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
  </counterGroup>
  <counterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter</counterGroupName>
    <counter>
      <name>BYTES_WRITTEN</name>
      <totalCounterValue>0</totalCounterValue>
      <mapCounterValue>0</mapCounterValue>
      <reduceCounterValue>0</reduceCounterValue>
    </counter>
  </counterGroup>
</jobCounters>

Job Conf API

A job configuration resource contains information about the job configuration for this job.

URI

Use the following URI to obtain the job configuration information, from a job identified by the jobid value.

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/conf

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the conf object

Item | Data Type | Description
path | string | The path to the job configuration file
property | array of the configuration properties(JSON)/zero or more property objects(XML) | Collection of property objects

Elements of the property object

Item | Data Type | Description
name | string | The name of the configuration property
value | string | The value of the configuration property
source | string | The location this configuration object came from. If there is more than one of these it shows the history with the latest source at the end of the list.
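
As an illustrative sketch (the placeholder URL, the requests library, and the mapreduce.job.reduces property name are assumptions used only for the example), looking up one property and where it was set might look like:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

conf = requests.get(f"{BASE}/jobs/job_1326232085508_4_4/conf",
                    headers={"Accept": "application/json"}).json()["conf"]
print("configuration file:", conf["path"])

wanted = "mapreduce.job.reduces"  # hypothetical property name, for illustration only
for prop in conf["property"]:
    if prop["name"] == wanted:
        print(wanted, "=", prop["value"], "sources:", prop.get("source"))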

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/conf

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

This is just a small snippet of the output, since the full output is very large. The real output contains every property in your job's configuration file.

{
   "conf" : {
      "path" : "hdfs://host.domain.com:9000/user/user1/.staging/job_1326232085508_0004/job.xml",
      "property" : [
         {
            "value" : "/home/hadoop/hdfs/data",
            "name" : "dfs.datanode.data.dir",
            "source" : ["hdfs-site.xml", "job.xml"]
         },
         {
            "value" : "org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer",
            "name" : "hadoop.http.filter.initializers"
            "source" : ["programmatically", "job.xml"]
         },
         {
            "value" : "/home/hadoop/tmp",
            "name" : "mapreduce.cluster.temp.dir"
            "source" : ["mapred-site.xml"]
         },
         ...
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/conf
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 552
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<conf>
  <path>hdfs://host.domain.com:9000/user/user1/.staging/job_1326232085508_0004/job.xml</path>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/hadoop/hdfs/data</value>
    <source>hdfs-site.xml</source>
    <source>job.xml</source>
  </property>
  <property>
    <name>hadoop.http.filter.initializers</name>
    <value>org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer</value>
    <source>programmatically</source>
    <source>job.xml</source>
  </property>
  <property>
    <name>mapreduce.cluster.temp.dir</name>
    <value>/home/hadoop/tmp</value>
    <source>mapred-site.xml</source>
  </property>
  ...
</conf>

Tasks API

With the tasks API, you can obtain a collection of resources that represent all the tasks for a job. When you run a GET operation on this resource, you obtain a collection of Task Objects.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks

HTTP Operations Supported

  • GET

Query Parameters Supported

  • type - type of task, valid values are m or r. m for map task or r for reduce task (see the sketch below).
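
As an illustrative sketch (the placeholder URL and the requests library are assumptions), the type parameter lets a client ask only for reduce tasks:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

# ?type=r restricts the listing to reduce tasks; ?type=m would return only map tasks.
resp = requests.get(f"{BASE}/jobs/job_1326232085508_4_4/tasks",
                    params={"type": "r"}, headers={"Accept": "application/json"})
for task in resp.json()["tasks"]["task"]:
    print(task["id"], task["type"], task["state"], f"{task['progress']}%")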

Elements of the tasks object

When you make a request for the list of tasks, the information will be returned as an array of task objects. See also Task API for syntax of the task object.

Item | Data Type | Description
task | array of task objects(JSON)/zero or more task objects(XML) | The collection of task objects

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "tasks" : {
      "task" : [
         {
            "progress" : 100,
            "elapsedTime" : 2768,
            "state" : "SUCCEEDED",
            "startTime" : 1326238773493,
            "id" : "task_1326232085508_4_4_m_0",
            "type" : "MAP",
            "successfulAttempt" : "attempt_1326232085508_4_4_m_0_0",
            "finishTime" : 1326238776261
         },
         {
            "progress" : 100,
            "elapsedTime" : 0,
            "state" : "RUNNING",
            "startTime" : 1326238777460,
            "id" : "task_1326232085508_4_4_r_0",
            "type" : "REDUCE",
            "successfulAttempt" : "",
            "finishTime" : 0
         }
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 603
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<tasks>
  <task>
    <startTime>1326238773493</startTime>
    <finishTime>1326238776261</finishTime>
    <elapsedTime>2768</elapsedTime>
    <progress>100.0</progress>
    <id>task_1326232085508_4_4_m_0</id>
    <state>SUCCEEDED</state>
    <type>MAP</type>
    <successfulAttempt>attempt_1326232085508_4_4_m_0_0</successfulAttempt>
  </task>
  <task>
    <startTime>1326238777460</startTime>
    <finishTime>0</finishTime>
    <elapsedTime>0</elapsedTime>
    <progress>100.0</progress>
    <id>task_1326232085508_4_4_r_0</id>
    <state>RUNNING</state>
    <type>REDUCE</type>
    <successfulAttempt/>
  </task>
</tasks>

Task API

A Task resource contains information about a particular task within a job.

URI

Use the following URI to obtain a Task Object, from a task identified by the taskid value.

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the task object

Item | Data Type | Description
id | string | The task id
state | string | The state of the task - valid values are: NEW, SCHEDULED, RUNNING, SUCCEEDED, FAILED, KILL_WAIT, KILLED
type | string | The task type - MAP or REDUCE
successfulAttempt | string | The id of the last successful attempt
progress | float | The progress of the task as a percent
startTime | long | The time in which the task started (in ms since epoch)
finishTime | long | The time in which the task finished (in ms since epoch)
elapsedTime | long | The elapsed time since the task started (in ms)
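
As an illustrative sketch (the placeholder URL, the requests library, and the polling interval are assumptions), a client can poll a single task until it reaches a terminal state:

import time
import requests

BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder
TASK_URL = f"{BASE}/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0"

# Poll the task until it leaves the running states; the 5 second interval is arbitrary.
while True:
    task = requests.get(TASK_URL, headers={"Accept": "application/json"}).json()["task"]
    print(task["state"], f"{task['progress']}%")
    if task["state"] in ("SUCCEEDED", "FAILED", "KILLED"):
        break
    time.sleep(5)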

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "task" : {
      "progress" : 100,
      "elapsedTime" : 0,
      "state" : "RUNNING",
      "startTime" : 1326238777460,
      "id" : "task_1326232085508_4_4_r_0",
      "type" : "REDUCE",
      "successfulAttempt" : "",
      "finishTime" : 0
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 299
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<task>
  <startTime>1326238777460</startTime>
  <finishTime>0</finishTime>
  <elapsedTime>0</elapsedTime>
  <progress>100.0</progress>
  <id>task_1326232085508_4_4_r_0</id>
  <state>RUNNING</state>
  <type>REDUCE</type>
  <successfulAttempt/>
</task>

Task Counters API

With the task counters API, you can obtain a collection of resources that represent all the counters for that task.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid/counters

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the jobTaskCounters object

Item | Data Type | Description
id | string | The task id
taskCounterGroup | array of counterGroup objects(JSON)/zero or more counterGroup objects(XML) | A collection of counter group objects

Elements of the counterGroup object

Item | Data Type | Description
counterGroupName | string | The name of the counter group
counter | array of counter objects(JSON)/zero or more counter objects(XML) | A collection of counter objects

Elements of the counter object

Item | Data Type | Description
name | string | The name of the counter
value | long | The value of the counter

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/counters

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "jobTaskCounters" : {
      "id" : "task_1326232085508_4_4_r_0",
      "taskCounterGroup" : [
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "value" : 2363,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "value" : 54372,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "value" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "value" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "value" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "value" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
               },
               {
                  "value" : 26,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "value" : 860,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "value" : 107839488,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "value" : 1123147776,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
               {
                  "value" : 57475072,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "value" : 0,
                  "name" : "CONNECTION"
               },
               {
                  "value" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/counters
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 2660
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskCounters>
  <id>task_1326232085508_4_4_r_0</id>
  <taskCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.FileSystemCounter</counterGroupName>
    <counter>
      <name>FILE_BYTES_READ</name>
      <value>2363</value>
    </counter>
    <counter>
      <name>FILE_BYTES_WRITTEN</name>
      <value>54372</value>
    </counter>
    <counter>
      <name>FILE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>FILE_LARGE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>FILE_WRITE_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_BYTES_READ</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_BYTES_WRITTEN</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_LARGE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_WRITE_OPS</name>
      <value>0</value>
    </counter>
  </taskCounterGroup>
  <taskCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.TaskCounter</counterGroupName>
    <counter>
      <name>COMBINE_INPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>COMBINE_OUTPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>REDUCE_INPUT_GROUPS</name>
      <value>460</value>
    </counter>
    <counter>
      <name>REDUCE_SHUFFLE_BYTES</name>
      <value>2235</value>
    </counter>
    <counter>
      <name>REDUCE_INPUT_RECORDS</name>
      <value>460</value>
    </counter>
    <counter>
      <name>REDUCE_OUTPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>SPILLED_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>SHUFFLED_MAPS</name>
      <value>1</value>
    </counter>
    <counter>
      <name>FAILED_SHUFFLE</name>
      <value>0</value>
    </counter>
    <counter>
      <name>MERGED_MAP_OUTPUTS</name>
      <value>1</value>
    </counter>
    <counter>
      <name>GC_TIME_MILLIS</name>
      <value>26</value>
    </counter>
    <counter>
      <name>CPU_MILLISECONDS</name>
      <value>860</value>
    </counter>
    <counter>
      <name>PHYSICAL_MEMORY_BYTES</name>
      <value>107839488</value>
    </counter>
    <counter>
      <name>VIRTUAL_MEMORY_BYTES</name>
      <value>1123147776</value>
    </counter>
    <counter>
      <name>COMMITTED_HEAP_BYTES</name>
      <value>57475072</value>
    </counter>
  </taskCounterGroup>
  <taskCounterGroup>
    <counterGroupName>Shuffle Errors</counterGroupName>
    <counter>
      <name>BAD_ID</name>
      <value>0</value>
    </counter>
    <counter>
      <name>CONNECTION</name>
      <value>0</value>
    </counter>
    <counter>
      <name>IO_ERROR</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_LENGTH</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_MAP</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_REDUCE</name>
      <value>0</value>
    </counter>
  </taskCounterGroup>
  <taskCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter</counterGroupName>
    <counter>
      <name>BYTES_WRITTEN</name>
      <value>0</value>
    </counter>
  </taskCounterGroup>
</jobTaskCounters>

Task Attempts API

With the task attempts API, you can obtain a collection of resources that represent a task attempt within a job. When you run a GET operation on this resource, you obtain a collection of Task Attempt Objects.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid/attempts

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the taskAttempts object

When you make a request for the list of task attempts, the information will be returned as an array of task attempt objects. See also Task Attempt API for syntax of the task attempt object.

Item | Data Type | Description
taskAttempt | array of task attempt objects(JSON)/zero or more task attempt objects(XML) | The collection of task attempt objects

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "taskAttempts" : {
      "taskAttempt" : [
         {
            "elapsedMergeTime" : 47,
            "shuffleFinishTime" : 1326238780052,
            "assignedContainerId" : "container_1326232085508_0004_01_000003",
            "progress" : 100,
            "elapsedTime" : 0,
            "state" : "RUNNING",
            "elapsedShuffleTime" : 2592,
            "mergeFinishTime" : 1326238780099,
            "rack" : "/98.139.92.0",
            "elapsedReduceTime" : 0,
            "nodeHttpAddress" : "host.domain.com:8042",
            "type" : "REDUCE",
            "startTime" : 1326238777460,
            "id" : "attempt_1326232085508_4_4_r_0_0",
            "finishTime" : 0
         }
      ]
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 807
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<taskAttempts>
  <taskAttempt>
    <startTime>1326238777460</startTime>
    <finishTime>0</finishTime>
    <elapsedTime>0</elapsedTime>
    <progress>100.0</progress>
    <id>attempt_1326232085508_4_4_r_0_0</id>
    <rack>/98.139.92.0</rack>
    <state>RUNNING</state>
    <nodeHttpAddress>host.domain.com:8042</nodeHttpAddress>
    <type>REDUCE</type>
    <assignedContainerId>container_1326232085508_0004_01_000003</assignedContainerId>
    <shuffleFinishTime>1326238780052</shuffleFinishTime>
    <mergeFinishTime>1326238780099</mergeFinishTime>
    <elapsedShuffleTime>2592</elapsedShuffleTime>
    <elapsedMergeTime>47</elapsedMergeTime>
    <elapsedReduceTime>0</elapsedReduceTime>
  </taskAttempt>
</taskAttempts>

Task Attempt API

A Task Attempt resource contains information about a particular task attempt within a job.

URI

Use the following URI to obtain a Task Attempt Object, from a task attempt identified by the attemptid value.

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid/attempts/attemptid

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the taskAttempt object

Item | Data Type | Description
id | string | The task attempt id
rack | string | The rack
state | string | The state of the task attempt - valid values are: NEW, UNASSIGNED, ASSIGNED, RUNNING, COMMIT_PENDING, SUCCESS_CONTAINER_CLEANUP, SUCCEEDED, FAIL_CONTAINER_CLEANUP, FAIL_TASK_CLEANUP, FAILED, KILL_CONTAINER_CLEANUP, KILL_TASK_CLEANUP, KILLED
type | string | The type of task
assignedContainerId | string | The container id this attempt is assigned to
nodeHttpAddress | string | The http address of the node this task attempt ran on
diagnostics | string | A diagnostics message
progress | float | The progress of the task attempt as a percent
startTime | long | The time in which the task attempt started (in ms since epoch)
finishTime | long | The time in which the task attempt finished (in ms since epoch)
elapsedTime | long | The elapsed time since the task attempt started (in ms)

For reduce task attempts you also have the following fields:

Item | Data Type | Description
shuffleFinishTime | long | The time at which shuffle finished (in ms since epoch)
mergeFinishTime | long | The time at which merge finished (in ms since epoch)
elapsedShuffleTime | long | The time it took for the shuffle phase to complete (time in ms between the reduce task start and shuffle finish)
elapsedMergeTime | long | The time it took for the merge phase to complete (time in ms between the shuffle finish and merge finish)
elapsedReduceTime | long | The time it took for the reduce phase to complete (time in ms between merge finish and the end of the reduce task)
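
These three elapsed fields partition the attempt's lifetime into shuffle, merge, and reduce phases. As an illustrative sketch (the placeholder URL and the requests library are assumptions), a client can recompute them from the raw timestamps of a reduce attempt and compare against the reported values:

import requests
BASE = "http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce"  # placeholder

url = (f"{BASE}/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0"
       "/attempts/attempt_1326232085508_4_4_r_0_0")
a = requests.get(url, headers={"Accept": "application/json"}).json()["taskAttempt"]

shuffle = a["shuffleFinishTime"] - a["startTime"]       # should match elapsedShuffleTime
merge = a["mergeFinishTime"] - a["shuffleFinishTime"]   # should match elapsedMergeTime
reduce_phase = a["finishTime"] - a["mergeFinishTime"]   # only meaningful once finishTime is set
print(shuffle, a["elapsedShuffleTime"], merge, a["elapsedMergeTime"], reduce_phase)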

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts/attempt_1326232085508_4_4_r_0_0

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "taskAttempt" : {
      "elapsedMergeTime" : 47,
      "shuffleFinishTime" : 1326238780052,
      "assignedContainerId" : "container_1326232085508_0004_01_000003",
      "progress" : 100,
      "elapsedTime" : 0,
      "state" : "RUNNING",
      "elapsedShuffleTime" : 2592,
      "mergeFinishTime" : 1326238780099,
      "rack" : "/98.139.92.0",
      "elapsedReduceTime" : 0,
      "nodeHttpAddress" : "host.domain.com:8042",
      "startTime" : 1326238777460,
      "id" : "attempt_1326232085508_4_4_r_0_0",
      "type" : "REDUCE",
      "finishTime" : 0
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts/attempt_1326232085508_4_4_r_0_0
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 691
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<taskAttempt>
  <startTime>1326238777460</startTime>
  <finishTime>0</finishTime>
  <elapsedTime>0</elapsedTime>
  <progress>100.0</progress>
  <id>attempt_1326232085508_4_4_r_0_0</id>
  <rack>/98.139.92.0</rack>
  <state>RUNNING</state>
  <nodeHttpAddress>host.domain.com:8042</nodeHttpAddress>
  <type>REDUCE</type>
  <assignedContainerId>container_1326232085508_0004_01_000003</assignedContainerId>
  <shuffleFinishTime>1326238780052</shuffleFinishTime>
  <mergeFinishTime>1326238780099</mergeFinishTime>
  <elapsedShuffleTime>2592</elapsedShuffleTime>
  <elapsedMergeTime>47</elapsedMergeTime>
  <elapsedReduceTime>0</elapsedReduceTime>
</taskAttempt>

Task Attempt State API

With the task attempt state API, you can query the state of a submitted task attempt as well as kill a running task attempt by modifying its state using a PUT request with the state set to "KILLED". To perform the PUT operation, authentication has to be set up for the AM web services. In addition, you must be authorized to kill the task attempt. Currently you can only change the state to "KILLED"; an attempt to change the state to any other value results in a 400 error response. Examples of the unauthorized and bad request errors are below. When you carry out a successful PUT, the initial response may be a 202. You can confirm that the attempt is killed by repeating the PUT request until you get a 200 response, by querying the state with the GET method, or by querying the task attempt info and checking the state. In the examples below, we repeat the PUT request and get a 200 response.

Please note that in order to kill a task attempt, you must have an authentication filter set up for the HTTP interface. The functionality requires that a username is set in the HttpServletRequest. If no filter is set up, the response will be an "UNAUTHORIZED" response.

This feature is currently in the alpha stage and may change in the future.
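
As an illustrative sketch of the kill procedure described above (the placeholder URL, the requests library, and the retry interval are assumptions; a real cluster also needs whatever authentication filter is configured), the PUT can be repeated until a 200 response confirms the attempt is killed:

import time
import requests

BASE = "http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce"  # placeholder
STATE_URL = (BASE + "/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000"
                    "/attempts/attempt_1429692837321_0001_m_000000_0/state")

# The AM web services must have an authentication filter configured and the caller must be
# authorized to kill the attempt; otherwise the PUT is rejected.
while True:
    resp = requests.put(STATE_URL, json={"state": "KILLED"})
    resp.raise_for_status()            # 4xx/5xx (unauthorized, bad request) raise here
    if resp.status_code == 200:        # 200 confirms the attempt is killed
        print(resp.json())             # {"state": "KILLED"}
        break
    time.sleep(1)                      # a 202 means accepted; retry until we see a 200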

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid/attempts/attemptid/state

HTTP Operations Supported

  • GET
  • PUT

Query Parameters Supported

  None

Elements of the jobTaskAttemptState object

When you make a request for the state of a task attempt, the information will be returned with the following fields.

Item | Data Type | Description
state | string | The task attempt state - can be one of the following: NEW, STARTING, RUNNING, COMMIT_PENDING, SUCCEEDED, FAILED, KILLED

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Response Header:

HTTP/1.1 200 OK
Content-Type: application/json
Server: Jetty(6.1.26)
Content-Length: 20

Response Body:

{
  "state":"STARTING"
}

HTTP Request:

  PUT http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Request Body:

{
  "state":"KILLED"
}

Response Header:

HTTP/1.1 200 OK
Content-Type: application/json
Server: Jetty(6.1.26)
Content-Length: 18

Response Body:

{
  "state":"KILLED"
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Response Header:

HTTP/1.1 200 OK
Content-Type: application/xml
Server: Jetty(6.1.26)
Content-Length: 121

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptState>
  <state>STARTING</state>
</jobTaskAttemptState>

HTTP Request:

  PUT http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Request Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptState>
  <state>KILLED</state>
</jobTaskAttemptState>

Response Header:

HTTP/1.1 200 OK
Content-Type: application/xml
Server: Jetty(6.1.26)
Content-Length: 121

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptState>
  <state>KILLED</state>
</jobTaskAttemptState>

Unauthorized Error Response

HTTP Request:

  PUT http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Request Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptState>
  <state>KILLED</state>
</jobTaskAttemptState>

Response Header:

HTTP/1.1 403 Unauthorized
Content-Type: application/json
Server: Jetty(6.1.26)

Bad Request Error Response

HTTP Request:

  PUT http://proxy-http-address:port/proxy/application_1429692837321_0001/ws/v1/mapreduce/jobs/job_1429692837321_0001/tasks/task_1429692837321_0001_m_000000/attempts/attempt_1429692837321_0001_m_000000_0/state

Request Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptState>
  <state>RUNNING</state>
</jobTaskAttemptState>

Response Header:

HTTP/1.1 400
Content-Length: 295
Content-Type: application/xml
Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<RemoteException>
  <exception>BadRequestException</exception>
  <message>java.lang.Exception: Only 'KILLED' is allowed as a target state.</message>
  <javaClassName>org.apache.hadoop.yarn.webapp.BadRequestException</javaClassName>
</RemoteException>

Task Attempt Counters API

With the task attempt counters API, you can obtain a collection of resources that represent all the counters for that task attempt.

URI

  • http://proxy-http-address:port/proxy/appid/ws/v1/mapreduce/jobs/jobid/tasks/taskid/attempts/attemptid/counters

HTTP Operations Supported

  • GET

Query Parameters Supported

  None

Elements of the jobTaskAttemptCounters object

Item | Data Type | Description
id | string | The task attempt id
taskAttemptCounterGroup | array of task attempt counterGroup objects(JSON)/zero or more task attempt counterGroup objects(XML) | A collection of task attempt counter group objects

Elements of the taskAttemptCounterGroup object

Item | Data Type | Description
counterGroupName | string | The name of the counter group
counter | array of counter objects(JSON)/zero or more counter objects(XML) | A collection of counter objects

Elements of the counter object

Item | Data Type | Description
name | string | The name of the counter
value | long | The value of the counter

Response Examples

JSON response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts/attempt_1326232085508_4_4_r_0_0/counters

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)

Response Body:

{
   "jobTaskAttemptCounters" : {
      "taskAttemptCounterGroup" : [
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "value" : 2363,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "value" : 54372,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
              {
                  "value" : 0,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "value" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "value" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "value" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "value" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
               },
               {
                  "value" : 26,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "value" : 860,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "value" : 107839488,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "value" : 1123147776,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
               {
                  "value" : 57475072,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "value" : 0,
                  "name" : "CONNECTION"
               },
               {
                  "value" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ],
      "id" : "attempt_1326232085508_4_4_r_0_0"
   }
}

XML response

HTTP Request:

  GET http://proxy-http-address:port/proxy/application_1326232085508_0004/ws/v1/mapreduce/jobs/job_1326232085508_4_4/tasks/task_1326232085508_4_4_r_0/attempts/attempt_1326232085508_4_4_r_0_0/counters
  Accept: application/xml

Response Header:

  HTTP/1.1 200 OK
  Content-Type: application/xml
  Content-Length: 2735
  Server: Jetty(6.1.26)

Response Body:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<jobTaskAttemptCounters>
  <id>attempt_1326232085508_4_4_r_0_0</id>
  <taskAttemptCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.FileSystemCounter</counterGroupName>
    <counter>
      <name>FILE_BYTES_READ</name>
      <value>2363</value>
    </counter>
    <counter>
      <name>FILE_BYTES_WRITTEN</name>
      <value>54372</value>
    </counter>
    <counter>
      <name>FILE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>FILE_LARGE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>FILE_WRITE_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_BYTES_READ</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_BYTES_WRITTEN</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_LARGE_READ_OPS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>HDFS_WRITE_OPS</name>
      <value>0</value>
    </counter>
  </taskAttemptCounterGroup>
  <taskAttemptCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.TaskCounter</counterGroupName>
    <counter>
      <name>COMBINE_INPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>COMBINE_OUTPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>REDUCE_INPUT_GROUPS</name>
      <value>460</value>
    </counter>
    <counter>
      <name>REDUCE_SHUFFLE_BYTES</name>
      <value>2235</value>
    </counter>
    <counter>
      <name>REDUCE_INPUT_RECORDS</name>
      <value>460</value>
    </counter>
    <counter>
      <name>REDUCE_OUTPUT_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>SPILLED_RECORDS</name>
      <value>0</value>
    </counter>
    <counter>
      <name>SHUFFLED_MAPS</name>
      <value>1</value>
    </counter>
    <counter>
      <name>FAILED_SHUFFLE</name>
      <value>0</value>
    </counter>
    <counter>
      <name>MERGED_MAP_OUTPUTS</name>
      <value>1</value>
    </counter>
    <counter>
      <name>GC_TIME_MILLIS</name>
      <value>26</value>
    </counter>
    <counter>
      <name>CPU_MILLISECONDS</name>
      <value>860</value>
    </counter>
    <counter>
      <name>PHYSICAL_MEMORY_BYTES</name>
      <value>107839488</value>
    </counter>
    <counter>
      <name>VIRTUAL_MEMORY_BYTES</name>
      <value>1123147776</value>
    </counter>
    <counter>
      <name>COMMITTED_HEAP_BYTES</name>
      <value>57475072</value>
    </counter>
  </taskAttemptCounterGroup>
  <taskAttemptCounterGroup>
    <counterGroupName>Shuffle Errors</counterGroupName>
    <counter>
      <name>BAD_ID</name>
      <value>0</value>
    </counter>
    <counter>
      <name>CONNECTION</name>
      <value>0</value>
    </counter>
    <counter>
      <name>IO_ERROR</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_LENGTH</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_MAP</name>
      <value>0</value>
    </counter>
    <counter>
      <name>WRONG_REDUCE</name>
      <value>0</value>
    </counter>
  </taskAttemptCounterGroup>
  <taskAttemptCounterGroup>
    <counterGroupName>org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter</counterGroupName>
    <counter>
      <name>BYTES_WRITTEN</name>
      <value>0</value>
    </counter>
  </taskAttemptCounterGroup>
</jobTaskAttemptCounters>