Package org.apache.iceberg.parquet
Class Parquet.ReadBuilder
java.lang.Object
org.apache.iceberg.parquet.Parquet.ReadBuilder
- All Implemented Interfaces:
org.apache.iceberg.InternalData.ReadBuilder
- Enclosing class:
- Parquet
public static class Parquet.ReadBuilder
extends Object
implements org.apache.iceberg.InternalData.ReadBuilder
-
Method Summary
- <D> org.apache.iceberg.io.CloseableIterable<D> build()
- callInit() Deprecated.
- caseSensitive(boolean newCaseSensitive)
- createBatchedReaderFunc(Function<org.apache.parquet.schema.MessageType, VectorizedReader<?>> func)
- createReaderFunc(BiFunction<org.apache.iceberg.Schema, org.apache.parquet.schema.MessageType, ParquetValueReader<?>> newReaderFunction)
- createReaderFunc(Function<org.apache.parquet.schema.MessageType, ParquetValueReader<?>> newReaderFunction)
- filter(org.apache.iceberg.expressions.Expression newFilter)
- filterRecords(boolean newFilterRecords)
- project(org.apache.iceberg.Schema newSchema)
- readSupport(org.apache.parquet.hadoop.api.ReadSupport<?> newFilterSupport) Deprecated. Will be removed in 2.0.0; use createReaderFunc(Function) instead.
- recordsPerBatch(int numRowsPerBatch)
- setCustomType(int fieldId, Class<? extends org.apache.iceberg.StructLike> structClass)
- setRootType(Class<? extends org.apache.iceberg.StructLike> rootClass)
- split(long newStart, long newLength) Restricts the read to the given range: [start, start + length).
- withAADPrefix(ByteBuffer aadPrefix)
- withFileEncryptionKey(ByteBuffer encryptionKey)
- withNameMapping(org.apache.iceberg.mapping.NameMapping newNameMapping)
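The builder methods above are typically chained into a single read pipeline. A minimal sketch, assuming a ReadBuilder obtained from the Parquet.read(InputFile) factory (not shown on this page) and an expected Iceberg Schema already in scope; GenericParquetReaders.buildReader is the generic reader factory from the iceberg-data module:

```java
// Hypothetical usage sketch: "inputFile" and "schema" are assumed to exist.
org.apache.iceberg.io.CloseableIterable<org.apache.iceberg.data.Record> records =
    Parquet.read(inputFile)                 // returns a Parquet.ReadBuilder
        .project(schema)                    // prune columns to the expected schema
        .caseSensitive(false)               // match column names case-insensitively
        .filterRecords(true)                // also filter individual records, not just row groups
        .filter(org.apache.iceberg.expressions.Expressions.greaterThan("id", 100))
        .createReaderFunc(fileSchema ->
            org.apache.iceberg.data.parquet.GenericParquetReaders.buildReader(schema, fileSchema))
        .build();
```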
-
Method Details
-
split
Restricts the read to the given range: [start, start + length).
- Specified by:
split in interface org.apache.iceberg.InternalData.ReadBuilder
- Parameters:
newStart - the start position for this read
newLength - the length of the range this read should scan
- Returns:
this builder for method chaining
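For example, to restrict the read to the byte range covered by one file split (the task variable here is illustrative, standing in for something like an Iceberg FileScanTask):

```java
// Scan only row groups whose starting positions fall in [start, start + length).
Parquet.ReadBuilder builder = Parquet.read(inputFile)
    .project(schema)
    .split(task.start(), task.length());  // assumed start/length from a scan task
```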
-
project
- Specified by:
project in interface org.apache.iceberg.InternalData.ReadBuilder
-
caseInsensitive
-
caseSensitive
-
filterRecords
-
filter
-
readSupport
@Deprecated
public Parquet.ReadBuilder readSupport(org.apache.parquet.hadoop.api.ReadSupport<?> newFilterSupport)
Deprecated. Will be removed in 2.0.0; use createReaderFunc(Function) instead.
-
createReaderFunc
public Parquet.ReadBuilder createReaderFunc(Function<org.apache.parquet.schema.MessageType, ParquetValueReader<?>> newReaderFunction)
-
createReaderFunc
public Parquet.ReadBuilder createReaderFunc(BiFunction<org.apache.iceberg.Schema, org.apache.parquet.schema.MessageType, ParquetValueReader<?>> newReaderFunction)
-
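The two createReaderFunc overloads differ only in whether the builder also passes the expected Iceberg schema to the reader function. A sketch, assuming GenericParquetReaders.buildReader from the iceberg-data module and an expectedSchema variable in scope:

```java
// Function form: only the Parquet file schema is supplied, so the expected
// Iceberg schema must be captured from the enclosing scope.
builder.createReaderFunc(fileSchema ->
    org.apache.iceberg.data.parquet.GenericParquetReaders.buildReader(expectedSchema, fileSchema));

// BiFunction form: the builder supplies both the expected Iceberg schema
// and the Parquet file schema to the function.
builder.createReaderFunc((icebergSchema, fileSchema) ->
    org.apache.iceberg.data.parquet.GenericParquetReaders.buildReader(icebergSchema, fileSchema));
```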
createBatchedReaderFunc
public Parquet.ReadBuilder createBatchedReaderFunc(Function<org.apache.parquet.schema.MessageType, VectorizedReader<?>> func)
-
set
-
callInit
Deprecated. Will be removed in 2.0.0; use createReaderFunc(Function) instead.
-
reuseContainers
- Specified by:
reuseContainers in interface org.apache.iceberg.InternalData.ReadBuilder
-
recordsPerBatch
-
withNameMapping
-
setRootType
- Specified by:
setRootType in interface org.apache.iceberg.InternalData.ReadBuilder
-
setCustomType
public Parquet.ReadBuilder setCustomType(int fieldId, Class<? extends org.apache.iceberg.StructLike> structClass)
- Specified by:
setCustomType in interface org.apache.iceberg.InternalData.ReadBuilder
-
withFileEncryptionKey
-
withAADPrefix
-
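For encrypted files, the decryption key, and the AAD prefix when one was used at write time, are supplied as ByteBuffers before building the reader. A sketch with illustrative variable names (keyBytes and aadPrefixBytes are assumed key material, not part of this API):

```java
// Assumed key material obtained elsewhere, e.g. from a key management service.
java.nio.ByteBuffer key = java.nio.ByteBuffer.wrap(keyBytes);
java.nio.ByteBuffer aad = java.nio.ByteBuffer.wrap(aadPrefixBytes);

Parquet.ReadBuilder builder = Parquet.read(inputFile)
    .project(schema)
    .withFileEncryptionKey(key)
    .withAADPrefix(aad);
```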
build
public <D> org.apache.iceberg.io.CloseableIterable<D> build()
- Specified by:
build in interface org.apache.iceberg.InternalData.ReadBuilder
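Because build() returns a CloseableIterable, the result should be closed after iteration to release the underlying input stream, typically with try-with-resources:

```java
// "builder" is an already-configured Parquet.ReadBuilder.
try (org.apache.iceberg.io.CloseableIterable<org.apache.iceberg.data.Record> records =
        builder.build()) {
  for (org.apache.iceberg.data.Record record : records) {
    // process each record
  }
}  // closing releases the underlying Parquet input stream
```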
-