Package org.apache.spark.util
Class AccumulatorContext

Object
  org.apache.spark.util.AccumulatorContext

An internal class used by Spark itself to track accumulators.
Constructor Summary
- AccumulatorContext()

Method Summary
- static void clear()
  Clears all registered AccumulatorV2s.
- static scala.Option<AccumulatorV2<?,?>> get(long id)
  Returns the AccumulatorV2 registered with the given ID, if any.
- static scala.Option<Object> internOption(scala.Option<Object> value)
  Naive way to reduce the duplicate Some objects for the values 0 and -1. TODO: eventually, if this spreads to more values, Guava's weak interner would be a better solution.
- static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
- static long newId()
  Returns a globally unique ID for a new AccumulatorV2.
- static int numAccums()
  Returns the number of accumulators registered.
- static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
- static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
- static void register(AccumulatorV2<?,?> a)
  Registers an AccumulatorV2 created on the driver so that it can be used on the executors.
- static void remove(long id)
  Unregisters the AccumulatorV2 with the given ID, if any.
Constructor Details

AccumulatorContext
public AccumulatorContext()
 
Method Details

newId
public static long newId()
Returns a globally unique ID for a new AccumulatorV2. Note: once you copy the AccumulatorV2, the ID is no longer unique.
Returns:
- (undocumented)
 
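The uniqueness guarantee above is typically provided by an atomic counter. A minimal sketch in plain Java, assuming nothing beyond the standard library (the class name `NewIdSketch` is hypothetical, not Spark's internal implementation):

```java
import java.util.concurrent.atomic.AtomicLong;

public class NewIdSketch {
    // A monotonically increasing counter: each call hands out a fresh ID,
    // safely even when called from multiple threads.
    private static final AtomicLong nextId = new AtomicLong(0L);

    static long newId() {
        return nextId.getAndIncrement();
    }
}
```

Note that, as the Javadoc warns, uniqueness applies only to the IDs handed out here; copying an AccumulatorV2 copies its ID, so the copy is indistinguishable from the original by ID alone.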
numAccums
public static int numAccums()
Returns the number of accumulators registered. Used in testing.
register
public static void register(AccumulatorV2<?,?> a)
Registers an AccumulatorV2 created on the driver so that it can be used on the executors.
All accumulators registered here can later be used as a container for accumulating partial values across multiple tasks. This is what org.apache.spark.scheduler.DAGScheduler does. Note: if an accumulator is registered here, it should also be registered with the active context cleaner for cleanup, so as to avoid memory leaks.
If an AccumulatorV2 with the same ID was already registered, this does nothing instead of overwriting it. We will never register the same accumulator twice; this is just a sanity check.
Parameters:
- a - (undocumented)
 
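The register/get/remove life cycle described in these method details can be sketched with a plain ConcurrentHashMap standing in for Spark's internal registry. This is a hypothetical analogue, not Spark's code: a String payload replaces AccumulatorV2, and putIfAbsent mirrors the documented "does nothing instead of overwriting it" behavior:

```java
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

public class RegistrySketch {
    // ID -> accumulator payload; ConcurrentHashMap because registration
    // and lookup can happen from different threads.
    private static final ConcurrentHashMap<Long, String> originals =
        new ConcurrentHashMap<>();

    static void register(long id, String acc) {
        // If an entry with this ID already exists, keep it: registering
        // the same ID twice is a no-op, never an overwrite.
        originals.putIfAbsent(id, acc);
    }

    static Optional<String> get(long id) {
        return Optional.ofNullable(originals.get(id));
    }

    static void remove(long id) {
        originals.remove(id);
    }
}
```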
remove
public static void remove(long id)
Unregisters the AccumulatorV2 with the given ID, if any.
Parameters:
- id - (undocumented)
 
get
public static scala.Option<AccumulatorV2<?,?>> get(long id)
Returns the AccumulatorV2 registered with the given ID, if any.
Parameters:
- id - (undocumented)
Returns:
- (undocumented)
 
clear
public static void clear()
Clears all registered AccumulatorV2s. For testing only.
internOption
public static scala.Option<Object> internOption(scala.Option<Object> value)
Naive way to reduce the duplicate Some objects for the values 0 and -1. TODO: eventually, if this spreads to more values, using Guava's weak interner would be a better solution.
Parameters:
- value - (undocumented)
Returns:
- (undocumented)
 
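The interning trick above can be sketched in plain Java, with java.util.Optional standing in for scala.Option: shared wrapper instances are reused for the common values 0 and -1, so repeated calls do not allocate a fresh wrapper each time. The class name is hypothetical and this is only an analogue of the approach, not Spark's implementation:

```java
import java.util.Optional;

public class InternOptionSketch {
    // Pre-built shared wrappers for the two hot values.
    private static final Optional<Object> SOME_ZERO = Optional.of((Object) 0L);
    private static final Optional<Object> SOME_MINUS_ONE = Optional.of((Object) -1L);

    static Optional<Object> internOption(Optional<Object> value) {
        if (value.isPresent()) {
            Object v = value.get();
            if (v.equals(0L)) return SOME_ZERO;       // shared instance for 0
            if (v.equals(-1L)) return SOME_MINUS_ONE; // shared instance for -1
        }
        return value; // all other values pass through unchanged
    }
}
```

Because the two hot values map to fixed singletons, callers receive the exact same object every time; this is the "naive" alternative to a general weak interner.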
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
 