class SparkConf extends ReadOnlySparkConf with Cloneable with Logging with Serializable
Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.
Most of the time, you would create a SparkConf object with new SparkConf(), which will load
values from any spark.* Java system properties set in your application as well. In this case,
parameters you set directly on the SparkConf object take priority over system properties.
For unit tests, you can also call new SparkConf(false) to skip loading external settings and
get the same configuration no matter what the system properties are.
All setter methods in this class support chaining. For example, you can write
new SparkConf().setMaster("local").setAppName("My app").
- Source
- SparkConf.scala
- Note
- Once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime. 
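As a quick illustration of the constructor and chained-setter usage described above, a minimal sketch (the master URL, app name, and config key are placeholder values):

```scala
import org.apache.spark.SparkConf

// Loads any spark.* Java system properties, then overrides them with explicit setters.
val conf = new SparkConf()
  .setMaster("local[2]")              // placeholder master URL
  .setAppName("My app")
  .set("spark.ui.enabled", "false")   // arbitrary key-value pair for illustration

// For unit tests: skip loading external settings entirely.
val testConf = new SparkConf(false).setAppName("test")
```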
Instance Constructors
- new SparkConf()
Create a SparkConf that loads defaults from system properties and the classpath.
- new SparkConf(loadDefaults: Boolean)
Create a SparkConf. If loadDefaults is true, also load values from spark.* Java system properties.
Type Members
- implicit class LogStringContext extends AnyRef
- Definition Classes
- Logging
 
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any

- final def ##: Int
- Definition Classes
- AnyRef → Any

- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any

- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
 
- def catchIllegalValue[T](key: String)(getValue: => T): T
Wrapper method for get() methods which require some specific value format. This catches any NumberFormatException or IllegalArgumentException and re-raises it with the incorrectly configured key in the exception message.
- Attributes
- protected
- Definition Classes
- ReadOnlySparkConf
 
- def clone(): SparkConf
Copy this object.
- Definition Classes
- SparkConf → AnyRef

- def contains(key: String): Boolean
Does the configuration contain a given parameter?
- Definition Classes
- SparkConf → ReadOnlySparkConf

- def contains(entry: ConfigEntry[_]): Boolean
Does the configuration have the typed config entry?
- Definition Classes
- ReadOnlySparkConf

- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef

- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any

- def get(key: String, defaultValue: String): String
Get a parameter, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf

- def get(key: String): String
Get a parameter; throws a NoSuchElementException if it's not set.
- Definition Classes
- ReadOnlySparkConf

- def getAll: Array[(String, String)]
Get all parameters as a list of pairs.
- Definition Classes
- SparkConf → ReadOnlySparkConf

- def getAllWithPrefix(prefix: String): Array[(String, String)]
Get all parameters that start with prefix.

- def getAppId: String
Returns the Spark application id, valid in the Driver after TaskScheduler registration and from the start in the Executor.

- def getAvroSchema: Map[Long, String]
Gets all the avro schemas in the configuration used in the generic Avro record serializer.
- def getBoolean(key: String, defaultValue: Boolean): Boolean
Get a parameter as a boolean, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- IllegalArgumentException if the value cannot be interpreted as a boolean

- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()

- def getDouble(key: String, defaultValue: Double): Double
Get a parameter as a double, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as a double

- def getExecutorEnv: Seq[(String, String)]
Get all executor environment variables set on this SparkConf.

- def getInt(key: String, defaultValue: Int): Int
Get a parameter as an integer, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as an integer

- def getLong(key: String, defaultValue: Long): Long
Get a parameter as a long, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as a long

- def getOption(key: String): Option[String]
Get a parameter as an Option.
- Definition Classes
- SparkConf → ReadOnlySparkConf
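A small sketch of the lookup variants above; the keys and default values are arbitrary examples:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(false).set("spark.executor.instances", "4")

conf.get("spark.executor.instances")                        // "4"
conf.get("spark.app.name", "default-app")                   // not set, so "default-app"
conf.getOption("spark.app.name")                            // None
conf.getInt("spark.executor.instances", 2)                  // 4
conf.getBoolean("spark.dynamicAllocation.enabled", false)   // false (default)
conf.contains("spark.executor.instances")                   // true
conf.getAllWithPrefix("spark.executor.")                    // parameters under "spark.executor."
```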
 
- def getSizeAsBytes(key: String, defaultValue: Long): Long
Get a size parameter as bytes, falling back to a default if not set.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as bytes

- def getSizeAsBytes(key: String, defaultValue: String): Long
Get a size parameter as bytes, falling back to a default if not set. If no suffix is provided then bytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as bytes

- def getSizeAsBytes(key: String): Long
Get a size parameter as bytes; throws a NoSuchElementException if it's not set. If no suffix is provided then bytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as bytes
- java.util.NoSuchElementException if the size parameter is not set

- def getSizeAsGb(key: String, defaultValue: String): Long
Get a size parameter as Gibibytes, falling back to a default if not set. If no suffix is provided then Gibibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Gibibytes

- def getSizeAsGb(key: String): Long
Get a size parameter as Gibibytes; throws a NoSuchElementException if it's not set. If no suffix is provided then Gibibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Gibibytes
- java.util.NoSuchElementException if the size parameter is not set

- def getSizeAsKb(key: String, defaultValue: String): Long
Get a size parameter as Kibibytes, falling back to a default if not set. If no suffix is provided then Kibibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Kibibytes

- def getSizeAsKb(key: String): Long
Get a size parameter as Kibibytes; throws a NoSuchElementException if it's not set. If no suffix is provided then Kibibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Kibibytes
- java.util.NoSuchElementException if the size parameter is not set

- def getSizeAsMb(key: String, defaultValue: String): Long
Get a size parameter as Mebibytes, falling back to a default if not set. If no suffix is provided then Mebibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Mebibytes

- def getSizeAsMb(key: String): Long
Get a size parameter as Mebibytes; throws a NoSuchElementException if it's not set. If no suffix is provided then Mebibytes are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as Mebibytes
- java.util.NoSuchElementException if the size parameter is not set
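For illustration, how the size getters interpret suffixed values; spark.example.bufferSize and spark.missing.key are made-up keys used only for this sketch:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(false)
  .set("spark.broadcast.blockSize", "4m")   // value with a size suffix
  .set("spark.example.bufferSize", "2048")  // no suffix

conf.getSizeAsKb("spark.broadcast.blockSize")     // 4096
conf.getSizeAsMb("spark.broadcast.blockSize")     // 4
conf.getSizeAsBytes("spark.example.bufferSize")   // 2048 (no suffix => bytes)
conf.getSizeAsGb("spark.missing.key", "1g")       // key not set, default "1g" => 1
```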
 
- def getTimeAsMs(key: String, defaultValue: String): Long
Get a time parameter as milliseconds, falling back to a default if not set. If no suffix is provided then milliseconds are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as milliseconds

- def getTimeAsMs(key: String): Long
Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. If no suffix is provided then milliseconds are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as milliseconds
- java.util.NoSuchElementException if the time parameter is not set

- def getTimeAsSeconds(key: String, defaultValue: String): Long
Get a time parameter as seconds, falling back to a default if not set. If no suffix is provided then seconds are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as seconds

- def getTimeAsSeconds(key: String): Long
Get a time parameter as seconds; throws a NoSuchElementException if it's not set. If no suffix is provided then seconds are assumed.
- Definition Classes
- ReadOnlySparkConf
- Exceptions thrown
- NumberFormatException if the value cannot be interpreted as seconds
- java.util.NoSuchElementException if the time parameter is not set
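Similarly for the time getters, a sketch with made-up keys (spark.example.interval and spark.missing are placeholders):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(false)
  .set("spark.network.timeout", "120s")
  .set("spark.example.interval", "500")   // no suffix

conf.getTimeAsMs("spark.network.timeout")       // 120000
conf.getTimeAsSeconds("spark.network.timeout")  // 120
conf.getTimeAsMs("spark.example.interval")      // 500 (no suffix => milliseconds)
conf.getTimeAsSeconds("spark.missing", "30s")   // key not set, default "30s" => 30
```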
 
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
 
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging

- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging

- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any

- def isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging

- def log: Logger
- Attributes
- protected
- Definition Classes
- Logging

- def logBasedOnLevel(level: Level)(f: => MessageWithContext): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logDebug(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logDebug(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logDebug(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logDebug(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logError(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logError(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logError(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logError(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logInfo(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logInfo(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logInfo(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logInfo(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logName: String
- Attributes
- protected
- Definition Classes
- Logging

- def logTrace(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logTrace(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logTrace(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logTrace(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logWarning(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logWarning(entry: LogEntry, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logWarning(entry: LogEntry): Unit
- Attributes
- protected
- Definition Classes
- Logging

- def logWarning(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging

- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef

- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()

- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
 
- def registerAvroSchemas(schemas: Schema*): SparkConf
Use Kryo serialization and register the given set of Avro schemas so that the generic record serializer can decrease network IO.

- def registerKryoClasses(classes: Array[Class[_]]): SparkConf
Use Kryo serialization and register the given set of classes with Kryo. If called multiple times, this will append the classes from all calls together.
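For example, Kryo registration might look like the following; Point and Segment are hypothetical application classes used only for this sketch:

```scala
import org.apache.spark.SparkConf

case class Point(x: Double, y: Double)
case class Segment(a: Point, b: Point)

// Per the description above, registering classes also switches the conf to Kryo serialization.
val conf = new SparkConf()
  .setAppName("kryo-example")
  .registerKryoClasses(Array(classOf[Point], classOf[Segment]))
```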
- def remove(key: String): SparkConf
Remove a parameter from the configuration.

- def set(key: String, value: String): SparkConf
Set a configuration variable.

- def setAll(settings: Iterable[(String, String)]): SparkConf
Set multiple parameters together.

- def setAppName(name: String): SparkConf
Set a name for your application. Shown in the Spark web UI.

- def setExecutorEnv(variables: Array[(String, String)]): SparkConf
Set multiple environment variables to be used when launching executors. (Java-friendly version.)

- def setExecutorEnv(variables: Seq[(String, String)]): SparkConf
Set multiple environment variables to be used when launching executors. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.

- def setExecutorEnv(variable: String, value: String): SparkConf
Set an environment variable to be used when launching executors for this application. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.

- def setIfMissing(key: String, value: String): SparkConf
Set a parameter if it isn't already configured.

- def setJars(jars: Array[String]): SparkConf
Set JAR files to distribute to the cluster. (Java-friendly version.)

- def setJars(jars: Seq[String]): SparkConf
Set JAR files to distribute to the cluster.

- def setMaster(master: String): SparkConf
The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.

- def setSparkHome(home: String): SparkConf
Set the location where Spark is installed on worker nodes.
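A sketch combining the setters above; the master URL, jar path, and environment values are placeholders:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setMaster("spark://master:7077")              // placeholder standalone master URL
  .setAppName("My app")
  .setJars(Seq("/path/to/app.jar"))              // placeholder jar path
  .setExecutorEnv("JAVA_OPTS", "-Xmx2g")         // stored as spark.executorEnv.JAVA_OPTS
  .setIfMissing("spark.executor.memory", "2g")   // only applied if not already set
  .setAll(Seq("spark.eventLog.enabled" -> "false"))
```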
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef

- def toDebugString: String
Return a string listing all keys and values, one per line. This is useful to print the configuration out for debugging.
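For instance, a quick way to inspect what ended up in a configuration:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf(false).setAppName("debug-example")
println(conf.toDebugString)   // each configured key and value on its own line
```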
- def toString(): String
- Definition Classes
- AnyRef → Any

- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])

- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()

- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])

- def withLogContext(context: Map[String, String])(body: => Unit): Unit
- Attributes
- protected
- Definition Classes
- Logging
 
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
- (Since version 9)