@Experimental(value=SCHEMAS)
public abstract static class JdbcIO.ReadRows
extends org.apache.beam.sdk.transforms.PTransform<org.apache.beam.sdk.values.PBegin,org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.Row>>
Instances of this transform are created via `JdbcIO.readRows()`.

| Constructor and Description |
|---|
| `ReadRows()` |
| Modifier and Type | Method and Description |
|---|---|
| `org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.Row>` | `expand(org.apache.beam.sdk.values.PBegin input)` |
| `void` | `populateDisplayData(org.apache.beam.sdk.transforms.display.DisplayData.Builder builder)` |
| `JdbcIO.ReadRows` | `withDataSourceConfiguration(JdbcIO.DataSourceConfiguration config)` |
| `JdbcIO.ReadRows` | `withDataSourceProviderFn(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Void,javax.sql.DataSource> dataSourceProviderFn)` |
| `JdbcIO.ReadRows` | `withFetchSize(int fetchSize)` Sets the number of rows fetched and loaded into memory per database call. |
| `JdbcIO.ReadRows` | `withOutputParallelization(boolean outputParallelization)` Whether to reshuffle the resulting PCollection so results are distributed to all workers. |
| `JdbcIO.ReadRows` | `withQuery(java.lang.String query)` |
| `JdbcIO.ReadRows` | `withQuery(org.apache.beam.sdk.options.ValueProvider<java.lang.String> query)` |
| `JdbcIO.ReadRows` | `withStatementPreparator(JdbcIO.StatementPreparator statementPreparator)` |
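Taken together, a typical pipeline chains these builder methods as sketched below. This example is not part of the Javadoc itself: the driver class, JDBC URL, credentials, table, and query are placeholder assumptions.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class ReadRowsExample {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create();

    // Read a schema-aware PCollection<Row> directly from a JDBC query.
    // Driver, URL, credentials, and SQL below are placeholder values.
    PCollection<Row> rows =
        pipeline.apply(
            JdbcIO.readRows()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                            "org.postgresql.Driver",
                            "jdbc:postgresql://localhost:5432/mydb")
                        .withUsername("user")
                        .withPassword("secret"))
                .withQuery("SELECT id, name FROM users")
                // Load 1000 rows into memory per database call; only
                // override the default if it causes memory errors.
                .withFetchSize(1000));

    pipeline.run().waitUntilFinish();
  }
}
```

Because the transform returns `Row` elements with an inferred Beam schema, the resulting collection can feed schema-based transforms without a user-supplied coder.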
public JdbcIO.ReadRows withDataSourceConfiguration(JdbcIO.DataSourceConfiguration config)
public JdbcIO.ReadRows withDataSourceProviderFn(org.apache.beam.sdk.transforms.SerializableFunction<java.lang.Void,javax.sql.DataSource> dataSourceProviderFn)
public JdbcIO.ReadRows withQuery(java.lang.String query)
public JdbcIO.ReadRows withQuery(org.apache.beam.sdk.options.ValueProvider<java.lang.String> query)
public JdbcIO.ReadRows withStatementPreparator(JdbcIO.StatementPreparator statementPreparator)
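When the query contains JDBC `?` placeholders, a `StatementPreparator` binds their values before the statement executes. A hedged sketch follows; `pipeline` and `dataSourceConfiguration` are assumed to be defined elsewhere, and the query text and bound value are illustrative.

```java
// Bind a value to the "?" placeholder before the query runs.
// StatementPreparator is a functional interface, so a lambda suffices.
PCollection<Row> rows =
    pipeline.apply(
        JdbcIO.readRows()
            .withDataSourceConfiguration(dataSourceConfiguration)
            .withQuery("SELECT id, name FROM users WHERE country = ?")
            .withStatementPreparator(
                preparedStatement -> preparedStatement.setString(1, "FR")));
```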
public JdbcIO.ReadRows withFetchSize(int fetchSize)
Sets the number of rows fetched and loaded into memory per database call. See `Statement.setFetchSize(int)`. It should ONLY be used if the default value causes memory errors.

public JdbcIO.ReadRows withOutputParallelization(boolean outputParallelization)
Whether to reshuffle the resulting PCollection so results are distributed to all workers.
public org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.Row> expand(org.apache.beam.sdk.values.PBegin input)
Overrides: `expand` in class org.apache.beam.sdk.transforms.PTransform<org.apache.beam.sdk.values.PBegin,org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.Row>>

public void populateDisplayData(org.apache.beam.sdk.transforms.display.DisplayData.Builder builder)
Specified by: `populateDisplayData` in interface org.apache.beam.sdk.transforms.display.HasDisplayData
Overrides: `populateDisplayData` in class org.apache.beam.sdk.transforms.PTransform<org.apache.beam.sdk.values.PBegin,org.apache.beam.sdk.values.PCollection<org.apache.beam.sdk.values.Row>>