org.apache.pig.piggybank.storage
Class CSVExcelStorage

java.lang.Object
  extended by org.apache.pig.LoadFunc
      extended by org.apache.pig.FileInputLoadFunc
          extended by org.apache.pig.builtin.PigStorage
              extended by org.apache.pig.piggybank.storage.CSVExcelStorage
All Implemented Interfaces:
LoadPushDown, OrderedLoadFunc, StoreFuncInterface

public class CSVExcelStorage
extends PigStorage
implements StoreFuncInterface, LoadPushDown

CSV loading and storing with support for multi-line fields, and escaping of delimiters and double quotes within fields; uses the CSV conventions of Excel 2007. Arguments allow control over the field delimiter, the treatment of multi-line fields, and the conversion of line breaks.

Usage:
STORE x INTO '<destFileName>'
USING CSVExcelStorage(['<delimiter>' [,{'YES_MULTILINE' | 'NO_MULTILINE'} [,{'UNIX' | 'WINDOWS' | 'UNCHANGED'}]]]);

Defaults are comma, 'NO_MULTILINE', 'UNCHANGED'. The linebreak parameter is only used during store; during load no conversion is performed.

Example:
STORE res INTO '/tmp/result.csv'
USING CSVExcelStorage(',', 'NO_MULTILINE', 'WINDOWS');

would expect comma-separated files on load, would use comma as the field separator during store, would treat every newline as a record terminator, and would use CRLF as the line-break characters (0x0d 0x0a: \r\n).
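The record-terminator behavior can be sketched with Python's csv module, which follows the same Excel conventions (a Python illustration only; CSVExcelStorage itself runs inside Pig):

```python
import csv
import io

# Mirror the 'WINDOWS' eol treatment: on store, every record
# is terminated with CRLF (\r\n) rather than a bare LF.
buf = io.StringIO(newline="")
writer = csv.writer(buf, lineterminator="\r\n")
writer.writerows([["res1", "1"], ["res2", "2"]])
assert buf.getvalue() == "res1,1\r\nres2,2\r\n"
```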

Example:
STORE res INTO '/tmp/result.csv'
USING CSVExcelStorage(',', 'YES_MULTILINE');

would allow newlines inside of fields. During load such fields are expected to conform to the Excel requirement that the field is enclosed in double quotes. On store, the chararray containing the field will accordingly be enclosed in double quotes.
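The round trip described above can be reproduced with Python's csv module, which implements the same Excel quoting convention (an illustrative sketch, not Pig code):

```python
import csv
import io

# A record whose first field spans two lines, as YES_MULTILINE permits.
row = ["Conrad\nEmil", "Dinger", "40"]

buf = io.StringIO(newline="")
csv.writer(buf).writerow(row)
# On store, the multi-line field is enclosed in double quotes.
assert buf.getvalue().startswith('"Conrad\nEmil"')

# On load, a quote-aware reader reassembles it into a single field.
parsed = next(csv.reader(io.StringIO(buf.getvalue(), newline="")))
assert parsed == row
```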

Note:
A danger of enabling multiline fields during load is that an unbalanced double quote causes the loader to slurp up input until a balancing double quote is found, or until something breaks. If you are not expecting newlines within fields, it is therefore more robust to use NO_MULTILINE, which is the default for that reason.

Excel expects double quotes within fields to be escaped with a second double quote. When such an embedding of double quotes is used, Excel additionally expects the entire field to be surrounded by double quotes. This package follows that escape mechanism, rather than the use of backslash.
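Python's csv module uses the same Excel escape mechanism, so the doubling of embedded quotes can be checked outside Pig (illustration only):

```python
import csv
import io

# A field containing embedded double quotes.
row = ['Mac "the knife"', "Cohen", "30"]

buf = io.StringIO(newline="")
csv.writer(buf).writerow(row)
line = buf.getvalue()
# Embedded quotes are doubled and the whole field is quoted:
# "Mac ""the knife""",Cohen,30
assert line.startswith('"Mac ""the knife"""')

# Reading it back restores the original field.
parsed = next(csv.reader(io.StringIO(line, newline="")))
assert parsed == row
```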

Tested with: Pig 0.8.0, Windows Vista, Excel 2007 SP2 MSO(12.0.6545.5004).

Note:
When a file with newlines embedded in a field is loaded into Excel, the application does not automatically enlarge the affected rows vertically. It is therefore easy to miss fields that consist of multiple lines.

Examples:
With multiline turned on:
"Conrad
Emil",Dinger,40
is read as (Conrad\nEmil,Dinger,40)

With multiline turned off:
"Conrad
Emil",Dinger,40
is read as
(Conrad)
(Emil,Dinger,40)
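The contrast between the two settings can be sketched with Python's csv module against a naive line split (illustrative only; Pig's own parsing additionally strips the orphaned quotes in the NO_MULTILINE case):

```python
import csv
import io

data = '"Conrad\nEmil",Dinger,40\r\n'

# YES_MULTILINE: a quote-aware parser joins the quoted field back together.
multiline = list(csv.reader(io.StringIO(data, newline="")))
assert multiline == [["Conrad\nEmil", "Dinger", "40"]]

# NO_MULTILINE: every newline terminates a record, quotes notwithstanding.
# (The naive split below leaves the now-unbalanced quotes in place.)
naive = [line.split(",") for line in data.strip().splitlines()]
assert naive == [['"Conrad'], ['Emil"', 'Dinger', '40']]
```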

Always:
A doubled double quote within a quoted field is read as a single double quote. That is, the escape character is the double quote, not backslash.

Known Issues:

Author:
Andreas Paepcke

Nested Class Summary
static class CSVExcelStorage.Linebreaks
           
static class CSVExcelStorage.Multiline
           
 
Nested classes/interfaces inherited from interface org.apache.pig.LoadPushDown
LoadPushDown.OperatorSet, LoadPushDown.RequiredField, LoadPushDown.RequiredFieldList, LoadPushDown.RequiredFieldResponse
 
Field Summary
protected static byte DOUBLE_QUOTE
           
protected  org.apache.hadoop.mapreduce.RecordReader in
           
protected static byte LINEFEED
           
protected static byte NEWLINE
           
protected static byte RECORD_DEL
           
 
Fields inherited from class org.apache.pig.builtin.PigStorage
mLog, writer
 
Constructor Summary
CSVExcelStorage()
          Constructs a CSVExcel load/store that uses comma as the field delimiter, terminates records on reading a newline within a field (even if the field is enclosed in double quotes), and uses LF as line terminator.
CSVExcelStorage(String delimiter)
          Constructs a CSVExcel load/store that uses specified string as a field delimiter.
CSVExcelStorage(String delimiter, String multilineTreatment)
          Constructs a CSVExcel load/store that uses specified string as a field delimiter, and allows specification whether to handle line breaks within fields.
CSVExcelStorage(String delimiter, String multilineTreatment, String eolTreatment)
          Constructs a CSVExcel load/store that uses specified string as a field delimiter, provides choice whether to manage multiline fields, and specifies chars used for end of line.
 
Method Summary
 List<LoadPushDown.OperatorSet> getFeatures()
          Determine the operators that can be pushed to the loader.
 org.apache.hadoop.mapreduce.InputFormat getInputFormat()
          This will be called during planning on the front end.
 Tuple getNext()
          Retrieves the next tuple to be processed.
 void prepareToRead(org.apache.hadoop.mapreduce.RecordReader reader, PigSplit split)
          Initializes LoadFunc for reading data.
 LoadPushDown.RequiredFieldResponse pushProjection(LoadPushDown.RequiredFieldList requiredFieldList)
          Indicate to the loader fields that will be needed.
 void putNext(Tuple tupleToWrite)
          Write a tuple to the data store.
 void setLocation(String location, org.apache.hadoop.mapreduce.Job job)
          Communicate to the loader the location of the object(s) being loaded.
 void setUDFContextSignature(String signature)
          This method will be called by Pig both in the front end and back end to pass a unique signature to the LoadFunc.
 
Methods inherited from class org.apache.pig.builtin.PigStorage
checkSchema, cleanupOnFailure, equals, equals, getOutputFormat, hashCode, prepareToWrite, relToAbsPathForStoreLocation, setStoreFuncUDFContextSignature, setStoreLocation
 
Methods inherited from class org.apache.pig.FileInputLoadFunc
getSplitComparable
 
Methods inherited from class org.apache.pig.LoadFunc
getAbsolutePath, getLoadCaster, getPathStrings, join, relativeToAbsolutePath
 
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, toString, wait, wait, wait
 
Methods inherited from interface org.apache.pig.StoreFuncInterface
checkSchema, cleanupOnFailure, getOutputFormat, prepareToWrite, relToAbsPathForStoreLocation, setStoreFuncUDFContextSignature, setStoreLocation
 

Field Detail

LINEFEED

protected static final byte LINEFEED
See Also:
Constant Field Values

NEWLINE

protected static final byte NEWLINE
See Also:
Constant Field Values

DOUBLE_QUOTE

protected static final byte DOUBLE_QUOTE
See Also:
Constant Field Values

RECORD_DEL

protected static final byte RECORD_DEL
See Also:
Constant Field Values

in

protected org.apache.hadoop.mapreduce.RecordReader in
Constructor Detail

CSVExcelStorage

public CSVExcelStorage()
Constructs a CSVExcel load/store that uses comma as the field delimiter, terminates records on reading a newline within a field (even if the field is enclosed in double quotes), and uses LF as line terminator.


CSVExcelStorage

public CSVExcelStorage(String delimiter)
Constructs a CSVExcel load/store that uses specified string as a field delimiter.

Parameters:
delimiter - the single byte character that is used to separate fields. ("," is the default.)

CSVExcelStorage

public CSVExcelStorage(String delimiter,
                       String multilineTreatment)
Constructs a CSVExcel load/store that uses specified string as a field delimiter, and allows specification whether to handle line breaks within fields. Pig example:
STORE a INTO '/tmp/foo.csv'
USING org.apache.pig.piggybank.storage.CSVExcelStorage(",", "YES_MULTILINE");

Parameters:
delimiter - the single byte character that is used to separate fields. ("," is the default.)
multilineTreatment - "YES_MULTILINE" or "NO_MULTILINE" ("NO_MULTILINE" is the default.)

CSVExcelStorage

public CSVExcelStorage(String delimiter,
                       String multilineTreatment,
                       String eolTreatment)
Constructs a CSVExcel load/store that uses specified string as a field delimiter, provides choice whether to manage multiline fields, and specifies chars used for end of line.

The eolTreatment parameter is only relevant for STORE; during LOAD no line-break conversion is performed.

Pig example:
STORE a INTO '/tmp/foo.csv'
USING org.apache.pig.piggybank.storage.CSVExcelStorage(",", "NO_MULTILINE", "WINDOWS");

Parameters:
delimiter - the single byte character that is used to separate fields. ("," is the default.)
multilineTreatment - "YES_MULTILINE" or "NO_MULTILINE" ("NO_MULTILINE" is the default.)
eolTreatment - "UNIX", "WINDOWS", or "NOCHANGE" ("NOCHANGE" is the default.)
Method Detail

putNext

public void putNext(Tuple tupleToWrite)
             throws IOException
Description copied from interface: StoreFuncInterface
Write a tuple to the data store.

Specified by:
putNext in interface StoreFuncInterface
Overrides:
putNext in class PigStorage
Parameters:
tupleToWrite - the tuple to store.
Throws:
IOException - if an exception occurs during the write

getNext

public Tuple getNext()
              throws IOException
Description copied from class: LoadFunc
Retrieves the next tuple to be processed. Implementations should NOT reuse tuple objects (or inner member objects) they return across calls and should return a different tuple object in each call.

Overrides:
getNext in class PigStorage
Returns:
the next tuple to be processed or null if there are no more tuples to be processed.
Throws:
IOException - if there is an exception while retrieving the next tuple

setLocation

public void setLocation(String location,
                        org.apache.hadoop.mapreduce.Job job)
                 throws IOException
Description copied from class: LoadFunc
Communicate to the loader the location of the object(s) being loaded. The location string passed to the LoadFunc here is the return value of LoadFunc.relativeToAbsolutePath(String, Path). Implementations should use this method to communicate the location (and any other information) to its underlying InputFormat through the Job object. This method will be called in the backend multiple times. Implementations should bear in mind that this method is called multiple times and should ensure there are no inconsistent side effects due to the multiple calls.

Overrides:
setLocation in class PigStorage
Parameters:
location - Location as returned by LoadFunc.relativeToAbsolutePath(String, Path)
job - the Job object; used to store, or retrieve earlier stored, information from the UDFContext
Throws:
IOException - if the location is not valid.

getInputFormat

public org.apache.hadoop.mapreduce.InputFormat getInputFormat()
Description copied from class: LoadFunc
This will be called during planning on the front end. This is the instance of InputFormat (rather than the class name) because the load function may need to instantiate the InputFormat in order to control how it is constructed.

Overrides:
getInputFormat in class PigStorage
Returns:
the InputFormat associated with this loader.

prepareToRead

public void prepareToRead(org.apache.hadoop.mapreduce.RecordReader reader,
                          PigSplit split)
Description copied from class: LoadFunc
Initializes LoadFunc for reading data. This will be called during execution before any calls to getNext. The RecordReader needs to be passed here because it has been instantiated for a particular InputSplit.

Overrides:
prepareToRead in class PigStorage
Parameters:
reader - RecordReader to be used by this instance of the LoadFunc
split - The input PigSplit to process

pushProjection

public LoadPushDown.RequiredFieldResponse pushProjection(LoadPushDown.RequiredFieldList requiredFieldList)
                                                  throws FrontendException
Description copied from interface: LoadPushDown
Indicate to the loader fields that will be needed. This can be useful for loaders that access data stored in a columnar format, where indicating the columns to be accessed ahead of time will save scans. This method will not be invoked by the Pig runtime if all fields are required, so implementations should assume that if this method is not invoked, all fields from the input are required. If the loader function cannot make use of this information, it is free to ignore it by returning an appropriate response.

Specified by:
pushProjection in interface LoadPushDown
Overrides:
pushProjection in class PigStorage
Parameters:
requiredFieldList - RequiredFieldList indicating which columns will be needed. This structure is read only. User cannot make change to it inside pushProjection.
Returns:
Indicates which fields will be returned
Throws:
FrontendException

setUDFContextSignature

public void setUDFContextSignature(String signature)
Description copied from class: LoadFunc
This method will be called by Pig both in the front end and back end to pass a unique signature to the LoadFunc. The signature can be used to store into the UDFContext any information which the LoadFunc needs to store between various method invocations in the front end and back end. A use case is to store the LoadPushDown.RequiredFieldList passed to it in LoadPushDown.pushProjection(RequiredFieldList) for use in the back end before returning tuples in LoadFunc.getNext(). This method will be called before other methods in LoadFunc.

Overrides:
setUDFContextSignature in class PigStorage
Parameters:
signature - a unique signature to identify this LoadFunc

getFeatures

public List<LoadPushDown.OperatorSet> getFeatures()
Description copied from interface: LoadPushDown
Determine the operators that can be pushed to the loader. Note that by indicating a loader can accept a certain operator (such as selection) the loader is not promising that it can handle all selections. When it is passed the actual operators to push down it will still have a chance to reject them.

Specified by:
getFeatures in interface LoadPushDown
Overrides:
getFeatures in class PigStorage
Returns:
list of all features that the loader can support


Copyright © The Apache Software Foundation