public class PigSplit
extends org.apache.hadoop.mapreduce.InputSplit
implements org.apache.hadoop.io.Writable, org.apache.hadoop.conf.Configurable
The class implements Configurable so that Hadoop will call Configurable.setConf(Configuration) on the backend; we can then use the Configuration to create the SerializationFactory needed to deserialize the wrapped InputSplit.
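A minimal usage sketch (not from the Pig sources): it constructs a PigSplit around two hypothetical FileSplits and queries the wrapper. The paths, sizes, host names, and the OperatorKey scope/id are invented for illustration.

```java
import java.util.Arrays;
import java.util.List;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigSplit;
import org.apache.pig.impl.plan.OperatorKey;

public class PigSplitUsageSketch {
    public static void main(String[] args) throws Exception {
        // Two hypothetical splits of one input, as an InputFormat might produce them.
        InputSplit[] wrapped = new InputSplit[] {
            new FileSplit(new Path("/data/part-00000"), 0L, 128L << 20, new String[] { "host1" }),
            new FileSplit(new Path("/data/part-00001"), 0L, 64L << 20, new String[] { "host2" })
        };
        // Operators in the plan that consume this input; the scope/id values are arbitrary here.
        List<OperatorKey> targetOps = Arrays.asList(new OperatorKey("scope", 1L));

        PigSplit split = new PigSplit(wrapped, /* inputIndex */ 0, targetOps, /* splitIndex */ 0);

        System.out.println(split.getNumPaths());                   // number of wrapped splits
        System.out.println(split.getWrappedSplit());                // the wrapped InputSplit
        System.out.println(split.getLength());                      // length reported by the wrapper
        System.out.println(Arrays.toString(split.getLocations()));  // preferred host locations
    }
}
```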
Constructor and Description |
---|
PigSplit() |
PigSplit(org.apache.hadoop.mapreduce.InputSplit[] wrappedSplits, int inputIndex, List<OperatorKey> targetOps, int splitIndex) |
Modifier and Type | Method and Description |
---|---|
boolean | disableCounter() |
org.apache.hadoop.conf.Configuration | getConf() |
long | getLength() |
long | getLength(int idx) - Return the length of a wrapped split. |
String[] | getLocations() |
int | getNumPaths() |
int | getSplitIndex() |
List<OperatorKey> | getTargetOps() |
org.apache.hadoop.mapreduce.InputSplit | getWrappedSplit() - This method returns the actual InputSplit (as returned by the InputFormat) which this class is wrapping. |
org.apache.hadoop.mapreduce.InputSplit | getWrappedSplit(int idx) |
boolean | isMultiInputs() - Returns true if the map has multiple inputs, else false. |
void | readFields(DataInput is) |
void | setConf(org.apache.hadoop.conf.Configuration conf) |
void | setCurrentIdx(int idx) |
void | setDisableCounter(boolean disableCounter) |
void | setMultiInputs(boolean b) - Indicates this map has multiple inputs (such as the result of a join operation). |
String | toString() |
void | write(DataOutput os) |
public PigSplit()
public PigSplit(org.apache.hadoop.mapreduce.InputSplit[] wrappedSplits, int inputIndex, List<OperatorKey> targetOps, int splitIndex)
public List<OperatorKey> getTargetOps()
public org.apache.hadoop.mapreduce.InputSplit getWrappedSplit()
This method returns the actual InputSplit (as returned by the InputFormat) which this class is wrapping.
public org.apache.hadoop.mapreduce.InputSplit getWrappedSplit(int idx)
Parameters:
idx - the index into the wrapped splits
public String[] getLocations() throws IOException, InterruptedException
Specified by:
getLocations in class org.apache.hadoop.mapreduce.InputSplit
Throws:
IOException
InterruptedException
public long getLength() throws IOException, InterruptedException
Specified by:
getLength in class org.apache.hadoop.mapreduce.InputSplit
Throws:
IOException
InterruptedException
public long getLength(int idx) throws IOException, InterruptedException
Parameters:
idx - the index into the wrapped splits
Throws:
IOException
InterruptedException
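A PigSplit can wrap several underlying splits; the sketch below (not from the Pig sources) walks them by index, assuming getNumPaths() reports how many splits are wrapped.

```java
import java.io.IOException;

import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigSplit;

class WrappedSplitWalker {
    /** Sums the lengths of all InputSplits wrapped by the given PigSplit. */
    static long totalWrappedLength(PigSplit pigSplit) throws IOException, InterruptedException {
        long total = 0;
        for (int i = 0; i < pigSplit.getNumPaths(); i++) {
            InputSplit inner = pigSplit.getWrappedSplit(i);  // the i-th wrapped split
            total += pigSplit.getLength(i);                  // its length in bytes
        }
        return total;
    }
}
```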
public void readFields(DataInput is) throws IOException
Specified by:
readFields in interface org.apache.hadoop.io.Writable
Throws:
IOException
public void write(DataOutput os) throws IOException
Specified by:
write in interface org.apache.hadoop.io.Writable
Throws:
IOException
public int getSplitIndex()
public void setMultiInputs(boolean b)
Parameters:
b - true if the map has multiple inputs
public boolean isMultiInputs()
public org.apache.hadoop.conf.Configuration getConf()
Specified by:
getConf in interface org.apache.hadoop.conf.Configurable
public void setConf(org.apache.hadoop.conf.Configuration conf)
Specified by:
setConf in interface org.apache.hadoop.conf.Configurable
This will be called by PigInputFormat.getSplits(org.apache.hadoop.mapreduce.JobContext), to be used in write(DataOutput) for serializing the wrappedSplit.
This will also be called by Hadoop in the backend to set the right Job Configuration (Hadoop invokes this method because PigSplit implements Configurable); we need this Configuration in readFields() to deserialize the wrappedSplit.
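A hedged sketch (not from the Pig sources) of the round trip described above: the frontend sets the Configuration before write(DataOutput) serializes the wrapped split, and the backend sets it again, as Hadoop does through Configurable, before readFields(DataInput) deserializes it. It assumes a default Configuration is sufficient for the wrapped split's serialization.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigSplit;

class PigSplitRoundTripSketch {
    static PigSplit roundTrip(PigSplit original, Configuration conf) throws Exception {
        // Frontend: the conf lets write() obtain a SerializationFactory for the wrapped split.
        original.setConf(conf);
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        original.write(new DataOutputStream(bytes));

        // Backend: Hadoop calls setConf() (PigSplit is Configurable) before readFields(),
        // which needs the conf to deserialize the wrapped InputSplit.
        PigSplit copy = new PigSplit();
        copy.setConf(conf);
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));
        return copy;
    }
}
```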
public int getNumPaths()
public void setDisableCounter(boolean disableCounter)
public boolean disableCounter()
public void setCurrentIdx(int idx)