Interface SparkType
-
- All Superinterfaces:
java.util.Comparator<java.lang.Object>
- All Known Implementing Classes:
Empty, SparkAscii, SparkBigInt, SparkBlob, SparkBoolean, SparkCounter, SparkDate, SparkDecimal, SparkDouble, SparkDuration, SparkFloat, SparkFrozen, SparkInet, SparkInt, SparkList, SparkMap, SparkSet, SparkSmallInt, SparkText, SparkTime, SparkTimestamp, SparkTimeUUID, SparkTinyInt, SparkTuple, SparkUdt, SparkUUID, SparkVarChar, SparkVarInt
public interface SparkType extends java.util.Comparator<java.lang.Object>

This interface and its implementing classes define the Spark equivalents of the Cassandra CQL data types. Each CQL type should have a one-to-one mapping to an equivalent Spark type defined in the `org.apache.cassandra.spark.data.converter.SparkSqlTypeConverter` implementation.
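The one-to-one mapping pattern can be sketched with a trimmed-down analogue of the interface. The names `TypeMappingSketch`, `MiniSparkType`, and the `INT` instance below are hypothetical stand-ins, not part of the real API; actual implementations return `org.apache.spark.sql.types.DataType` instances rather than strings.

```java
import java.util.Comparator;

// Illustrative sketch only: a simplified analogue of SparkType showing how
// each CQL type pairs a Spark SQL data type with a comparator over the
// Spark-side representation. All names here are hypothetical.
public class TypeMappingSketch {

    // Analogue of SparkType: extends Comparator<Object>, exposes the mapped
    // Spark SQL type and a CQL-to-Spark value conversion.
    public interface MiniSparkType extends Comparator<Object> {
        String dataTypeName();               // stands in for org.apache.spark.sql.types.DataType
        Object toSparkSqlType(Object value); // Cassandra value -> Spark-side value
    }

    // Analogue of SparkInt: CQL 'int' maps one-to-one to Spark's IntegerType.
    public static final MiniSparkType INT = new MiniSparkType() {
        public String dataTypeName() { return "IntegerType"; }
        public Object toSparkSqlType(Object value) { return (Integer) value; }
        public int compare(Object a, Object b) { return Integer.compare((Integer) a, (Integer) b); }
    };

    public static void main(String[] args) {
        System.out.println(INT.dataTypeName());    // IntegerType
        System.out.println(INT.compare(1, 2) < 0); // true
    }
}
```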
-
-
Method Summary
- `default int compare(java.lang.Object first, java.lang.Object second)`
- `static int compareArrays(java.lang.Object[] first, java.lang.Object[] second, java.util.function.Function<java.lang.Integer,SparkType> types)`
- `default int compareTo(java.lang.Object first, java.lang.Object second)`
- `default org.apache.spark.sql.types.DataType dataType()`
- `org.apache.spark.sql.types.DataType dataType(org.apache.cassandra.bridge.BigNumberConfig bigNumberConfig)`
- `default boolean equals(java.lang.Object first, java.lang.Object second)`
- `static boolean equalsArrays(java.lang.Object[] first, java.lang.Object[] second, java.util.function.Function<java.lang.Integer,SparkType> types)`
- `default boolean equalsTo(java.lang.Object first, java.lang.Object second)`
- `default java.lang.Object nativeSparkSqlRowValue(org.apache.spark.sql.catalyst.expressions.GenericInternalRow row, int position)`
- `default java.lang.Object nativeSparkSqlRowValue(org.apache.spark.sql.Row row, int position)`
- `default java.lang.Object sparkSqlRowValue(org.apache.spark.sql.catalyst.expressions.GenericInternalRow row, int position)`
- `default java.lang.Object sparkSqlRowValue(org.apache.spark.sql.Row row, int position)`
- `default java.lang.Object toSparkSqlType(java.lang.Object value, boolean isFrozen)`
- `default java.lang.Object toTestRowType(java.lang.Object value)`
-
-
-
Method Detail
-
dataType
default org.apache.spark.sql.types.DataType dataType()
- Returns:
- the SparkSQL `org.apache.spark.sql.types.DataType` for this SparkType.
-
dataType
org.apache.spark.sql.types.DataType dataType(org.apache.cassandra.bridge.BigNumberConfig bigNumberConfig)
- Parameters:
bigNumberConfig - specifies the scale and precision to be used for VarInt and Decimal types.
- Returns:
- the SparkSQL `org.apache.spark.sql.types.DataType` for this SparkType.
-
toSparkSqlType
default java.lang.Object toSparkSqlType(@NotNull java.lang.Object value, boolean isFrozen)
- Parameters:
value - the Cassandra value.
isFrozen - true if the type is frozen.
- Returns:
- the value mapped to the Spark equivalent data type.
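As a hedged illustration of what a `toSparkSqlType`-style conversion might look like, the sketch below converts a Cassandra timestamp (`java.util.Date`) into Spark SQL's internal microseconds-since-epoch representation. The class and method here are hypothetical stand-ins, not the real `SparkTimestamp` implementation.

```java
import java.util.Date;

// Illustrative sketch only: a plausible timestamp conversion in the style of
// toSparkSqlType. Spark SQL represents timestamps internally as microseconds
// since the epoch; this class is a hypothetical stand-in.
public class TimestampConversionSketch {

    // Convert a Cassandra-side java.util.Date into a microseconds-since-epoch
    // long; the isFrozen flag does not apply to scalar types and is ignored.
    public static Object toSparkSqlType(Object value, boolean isFrozen) {
        return ((Date) value).getTime() * 1000L; // milliseconds -> microseconds
    }

    public static void main(String[] args) {
        Date epochPlusOneSecond = new Date(1000L);
        System.out.println(toSparkSqlType(epochPlusOneSecond, false)); // 1000000
    }
}
```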
-
sparkSqlRowValue
default java.lang.Object sparkSqlRowValue(org.apache.spark.sql.catalyst.expressions.GenericInternalRow row, int position)
- Parameters:
row - a SparkSQL `org.apache.spark.sql.catalyst.expressions.GenericInternalRow`.
position - the position in the row.
- Returns:
- the SparkSQL value at `position` in the `row`, converted back into the test type; used only in the test system.
-
nativeSparkSqlRowValue
default java.lang.Object nativeSparkSqlRowValue(org.apache.spark.sql.catalyst.expressions.GenericInternalRow row, int position)
- Parameters:
row - a SparkSQL `org.apache.spark.sql.catalyst.expressions.GenericInternalRow`.
position - the position in the row.
- Returns:
- the SparkSQL value at `position` in the `row`, converted back into the test type; used only in the test system.
-
sparkSqlRowValue
default java.lang.Object sparkSqlRowValue(org.apache.spark.sql.Row row, int position)
- Parameters:
row - a SparkSQL `org.apache.spark.sql.Row`.
position - the position in the row.
- Returns:
- the SparkSQL value at `position` in the `row`, converted back into the test type; used only in the test system.
-
nativeSparkSqlRowValue
default java.lang.Object nativeSparkSqlRowValue(org.apache.spark.sql.Row row, int position)
- Parameters:
row - a SparkSQL `org.apache.spark.sql.Row`.
position - the position in the row.
- Returns:
- the SparkSQL value at `position` in the `row`, converted back into the test type; used only in the test system.
-
toTestRowType
default java.lang.Object toTestRowType(java.lang.Object value)
- Parameters:
value - the SparkSQL value.
- Returns:
- the SparkSQL value converted back into the test type; used only in the test system.
-
equals
default boolean equals(java.lang.Object first, java.lang.Object second)
-
equalsTo
default boolean equalsTo(java.lang.Object first, java.lang.Object second)
-
compare
default int compare(java.lang.Object first, java.lang.Object second)
- Specified by:
compare in interface java.util.Comparator<java.lang.Object>
-
compareArrays
static int compareArrays(java.lang.Object[] first, java.lang.Object[] second, java.util.function.Function<java.lang.Integer,SparkType> types)
-
equalsArrays
static boolean equalsArrays(java.lang.Object[] first, java.lang.Object[] second, java.util.function.Function<java.lang.Integer,SparkType> types)
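A plausible reading of the `compareArrays`/`equalsArrays` signatures is a lexicographic, per-position comparison where the comparator for position `i` is supplied by the `types` function. The sketch below implements that reading with plain `Comparator<Object>` values standing in for `SparkType`; it mirrors the declared signatures but is not the actual library code.

```java
import java.util.Comparator;
import java.util.function.Function;

// Illustrative sketch only: lexicographic array comparison with a
// per-position comparator lookup, assumed to match the intent of
// SparkType.compareArrays and SparkType.equalsArrays.
public class ArrayCompareSketch {

    // Compare element by element; the first non-equal position decides.
    // If one array is a prefix of the other, the shorter array sorts first.
    public static int compareArrays(Object[] first, Object[] second,
                                    Function<Integer, Comparator<Object>> types) {
        for (int i = 0; i < first.length && i < second.length; i++) {
            int cmp = types.apply(i).compare(first[i], second[i]);
            if (cmp != 0) {
                return cmp;
            }
        }
        return Integer.compare(first.length, second.length);
    }

    public static boolean equalsArrays(Object[] first, Object[] second,
                                       Function<Integer, Comparator<Object>> types) {
        return compareArrays(first, second, types) == 0;
    }

    public static void main(String[] args) {
        // Hypothetical per-position comparators: position 0 compares ints,
        // every later position compares strings.
        Comparator<Object> intCmp = Comparator.comparingInt(o -> (Integer) o);
        Comparator<Object> strCmp = Comparator.comparing(o -> (String) o);
        Function<Integer, Comparator<Object>> types = i -> i == 0 ? intCmp : strCmp;

        System.out.println(compareArrays(new Object[]{1, "a"}, new Object[]{1, "b"}, types) < 0); // true
        System.out.println(equalsArrays(new Object[]{1, "a"}, new Object[]{1, "a"}, types));      // true
    }
}
```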
-
compareTo
default int compareTo(java.lang.Object first, java.lang.Object second)
-
-