hadoop - Apache Pig - ERROR 6007: Unable to check name


I'm trying to run the bare-bones basic script from the Pig tutorial (http://pig.apache.org/docs/r0.11.1/start.html#pig-scripts), which looks like this:

/* myscript.pig -- this script is simple. It includes 3 Pig Latin statements. */

A = LOAD 'student' USING PigStorage() AS (name:chararray, age:int, gpa:float); -- loading data
B = FOREACH A GENERATE name;  -- transforming data
DUMP B;  -- retrieving results
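For reference, PigStorage() with no arguments expects tab-separated fields, so a tiny tab-delimited file is enough to try the script locally. The file contents below are made-up sample rows for illustration, not the tutorial's dataset:

# create a small tab-delimited 'student' file (fields: name, age, gpa); rows are hypothetical
printf 'john\t21\t3.5\nmary\t19\t3.8\nbill\t22\t2.9\n' > student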

The output:

2013-05-13 15:26:55,864 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
Details at logfile: /volumes/nimue/environment/pig-0.11.1/pig_1368473213767.log

macbook-3:pig myusername$ cat /volumes/nimue/environment/pig-0.11.1/pig_1368473213767.log

That logfile contains:

Pig Stack Trace
---------------
ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
    at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1607)
    at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1546)
    at org.apache.pig.PigServer.registerQuery(PigServer.java:516)
    at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:991)
    at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:412)
    at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
    at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
    at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
    at org.apache.pig.Main.run(Main.java:604)
    at org.apache.pig.Main.main(Main.java:157)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: Failed to parse: Pig script failed to parse:
<file test/myscript.pig, line 6, column 4> pig script failed to validate: org.apache.pig.backend.datastorage.DataStorageException: ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
    at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
    at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1599)
    ... 14 more
Caused by: <file test/myscript.pig, line 6, column 4> pig script failed to validate: org.apache.pig.backend.datastorage.DataStorageException: ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
    at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:835)
    at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3236)
    at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1315)
    at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:799)
    at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:517)
    at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:392)
    at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:184)
    ... 15 more
Caused by: org.apache.pig.backend.datastorage.DataStorageException: ERROR 6007: Unable to check name hdfs://stage-hadoop101.cluster:8020/user/myusername
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:207)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:128)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:138)
    at org.apache.pig.parser.QueryParserUtils.getCurrentDir(QueryParserUtils.java:91)
    at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:827)
    ... 21 more
Caused by: java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "macbook-3.local/192.168.2.2"; destination host is: "stage-hadoop101.cluster":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:761)
    at org.apache.hadoop.ipc.Client.call(Client.java:1239)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:630)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1559)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:811)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1345)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:200)
    ... 25 more
Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
    at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
    at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
    at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
    at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:946)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:844)

Searching on "unable to check name" and "InvalidProtocolBufferException: Message missing required fields" turned up nothing.

If you are just starting off with Pig, I suggest you run the script in local mode first:

 pig -x local myscript.pig 

If you want to run it in mapreduce mode, make sure you've followed the mapreduce mode setup instructions first.
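As a rough sketch of that setup (the paths below are assumptions for a typical install, not your cluster's actual layout), mapreduce mode needs Pig to see the cluster's Hadoop configuration before you launch the script:

# point Pig at the cluster's Hadoop install and configuration (example paths only)
export HADOOP_HOME=/usr/local/hadoop
export PIG_CLASSPATH=$HADOOP_HOME/conf
# mapreduce is the default exectype, so this now runs against the cluster
pig myscript.pig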

