Class and method summaries for the flow graph and Dag management components, one per line:

- FlowSpec for an added, updated, or modified flow config.
- FlowSpec for a deleted or renamed flow config.
- FlowEdgeContexts into a Dag.
- SpecCompiler to become healthy.
- FlowEdgeFactory for creating BaseFlowEdge.
- PathFinder that assumes an unweighted FlowGraph and computes the shortest path using a variant of the BFS path-finding algorithm.
- PathFinder related configuration keys.
- Spec and compile corresponding materialized Specs and the mapping to SpecExecutor that they can be run on.
- Spec, i.e. flow, and compile corresponding materialized job Spec and its mapping to SpecExecutor.
- FlowSpec containing the source and destination DataNodes, as well as the source and target DatasetDescriptors, and returns a sequence of fully resolved JobSpecs that will move the source dataset from the source datanode, perform any necessary transformations, and land the dataset at the destination node in the format described by the target DatasetDescriptor.
- FlowConfig should check if current node is active (master).
- FlowConfigsResourceHandler which considers if the current node is Active or Standby.
- FlowEdge from the edge properties.
- FlowEdge related configuration keys.
- FlowEdge during path computation while the edge is explored for its eligibility.
- FlowEdges are the same if they have the same endpoints and both refer to the same i.e.
- DagManagerThread performs 2 actions when scheduled: dequeues any newly submitted Dags from the Dag queue.
- Dag, in case of a job failure.
- Dag from the backing store, typically upon completion of execution.
- Dag to the backing store.
- Dag given a file name.
- Dags from the underlying store.
- DagNodes which are the dependency nodes for concatenating this Dag with any other Dag.
- DagManager becomes active, it loads the serialized representations of the currently running Dags from the checkpoint directory, deserializes the Dags, and adds them to a queue to be consumed by the DagManager.DagManagerThreads.
- DagStateStore using MySQL as a backup, leveraging MysqlStateStore.
- SpecCompiler as active.
- SpecCompiler active/inactive.
- SpecStore so only they would be loaded during DR handling.
- DataNode related configuration keys.
- HttpDataNode related configuration keys.
- DataNode by its identifier.
- DataNode from the node identifier.
- FileSystemDataNode implementation.
- HiveDataNode implementation.
- DatasetDescriptor object.
- DatasetDescriptor with FS-based storage.
- FlowGraph from a git repository.
- FlowTemplate with given URI.
- FlowTemplate that loads a HOCON file as a StaticFlowTemplate.
- FlowTemplate using a static Config as the raw configuration for the template.
- FlowTemplate using the provided Config object.
- FSFlowTemplateCatalog that keeps a cache of flow and job templates.
- JobScheduler that is also a SpecCatalogListener.
- GobblinTrackingEvents reporting statuses of running jobs.
- KafkaJobStatusMonitor instance.
- DataMovementAuthorizer that always returns true.
- URI for cancellation requests to the DagManager.
- TopologySpec Factory that creates or generates the TopologySpec to be used.
- Other referenced types without descriptions: DataNode, FlowEdge, BaseFlowEdge, FlowGraph, Dag, Dags, DagManager, DagManager.DagManagerThread, DatasetDescriptor, SqlDatasetDescriptor, HdfsDataNode, LocalFSDataNode, FlowTemplate, FlowTemplates, JobTemplates, JobStatusRetriever, JobStatus, TopologySpecs, SpecCatalogListener.
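The unweighted BFS shortest-path computation mentioned for the PathFinder can be sketched as follows. This is a minimal, self-contained illustration, not the real implementation: nodes are modeled as String ids instead of the actual FlowGraph/DataNode types, and the class name `BfsPathFinder` is hypothetical.

```java
import java.util.*;

// Minimal sketch of a BFS path finder over an unweighted graph.
// The adjacency map stands in for the real FlowGraph.
public class BfsPathFinder {
    // Returns the shortest path from src to dst as a list of node ids,
    // or an empty list if dst is unreachable.
    public static List<String> shortestPath(Map<String, List<String>> adj,
                                            String src, String dst) {
        Map<String, String> parent = new HashMap<>();
        Deque<String> queue = new ArrayDeque<>();
        Set<String> visited = new HashSet<>();
        queue.add(src);
        visited.add(src);
        while (!queue.isEmpty()) {
            String node = queue.poll();
            if (node.equals(dst)) {
                // Reconstruct the path by walking parent pointers back to src.
                LinkedList<String> path = new LinkedList<>();
                for (String n = dst; n != null; n = parent.get(n)) path.addFirst(n);
                return path;
            }
            for (String next : adj.getOrDefault(node, List.of())) {
                if (visited.add(next)) {
                    parent.put(next, node);
                    queue.add(next);
                }
            }
        }
        return List.of();
    }
}
```

Because BFS explores nodes in order of hop count, the first time the destination is dequeued the reconstructed path is guaranteed to be a shortest one in an unweighted graph.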
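The compilation contract described above (a FlowSpec with source and destination DataNodes resolving to a sequence of JobSpecs, one per hop) can be illustrated with a toy sketch. Everything here is an assumption for illustration: `HopCompiler` is a hypothetical name, specs are plain strings, and real compilation would also resolve flow templates and dataset descriptors.

```java
import java.util.*;

// Toy sketch of multi-hop compilation: given a resolved path of DataNode
// ids from source to destination, emit one "job spec" per hop that moves
// the dataset across that edge.
public class HopCompiler {
    public static List<String> compile(List<String> path, String dataset) {
        List<String> jobSpecs = new ArrayList<>();
        for (int i = 0; i + 1 < path.size(); i++) {
            // One job per edge of the path, in execution order.
            jobSpecs.add("copy " + dataset + ": " + path.get(i) + " -> " + path.get(i + 1));
        }
        return jobSpecs;
    }
}
```

A two-hop path thus yields two ordered jobs, which matches the idea of moving the dataset off the source node, transforming it if needed, and landing it at the destination in the target format.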
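Two behaviors described above, the DagManagerThread dequeuing newly submitted Dags when scheduled, and the DagManager reloading checkpointed Dags into a queue when it becomes active, can be sketched together. `MiniDagManager` is a hypothetical, in-memory stand-in: Dags are reduced to String ids and the checkpoint directory is modeled as a map.

```java
import java.util.*;
import java.util.concurrent.*;

// In-memory sketch of DagManager-style queueing and activation recovery.
public class MiniDagManager {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    private final List<String> running = new ArrayList<>();

    // On becoming active: reload previously checkpointed Dags into the
    // queue so worker threads pick them up again.
    public void setActive(Map<String, String> checkpointedDags) {
        queue.addAll(checkpointedDags.keySet());
    }

    // Normal submission path for a new Dag.
    public void submit(String dagId) {
        queue.add(dagId);
    }

    // One scheduled pass of a worker thread: drain newly submitted Dags
    // from the queue and mark them as running. Returns the drain count.
    public int runOnce() {
        List<String> drained = new ArrayList<>();
        queue.drainTo(drained);
        running.addAll(drained);
        return drained.size();
    }

    public List<String> running() {
        return running;
    }
}
```

Using a `BlockingQueue` with `drainTo` mirrors the producer/consumer split: submissions and recovery both feed the queue, while the scheduled worker consumes it in batches.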
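The DagStateStore responsibilities listed above (persist a Dag to the backing store, load Dags from the underlying store, and delete a Dag upon completion of execution) can be sketched with a hypothetical file-based store. `FileDagStateStore` and its serialized-string format are illustrative assumptions, not the real store.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.*;
import java.util.*;

// Hypothetical file-based state store: one file per Dag under a
// checkpoint directory, keyed by dag id.
public class FileDagStateStore {
    private final Path dir;

    public FileDagStateStore(Path dir) {
        try { this.dir = Files.createDirectories(dir); }
        catch (IOException e) { throw new UncheckedIOException(e); }
    }

    // Convenience factory backed by a fresh temp directory.
    public static FileDagStateStore inTempDir() {
        try { return new FileDagStateStore(Files.createTempDirectory("dags")); }
        catch (IOException e) { throw new UncheckedIOException(e); }
    }

    // Persist a Dag (serialized as a string here) to the backing store.
    public void writeCheckpoint(String dagId, String serializedDag) {
        try { Files.writeString(dir.resolve(dagId + ".dag"), serializedDag); }
        catch (IOException e) { throw new UncheckedIOException(e); }
    }

    // Delete a Dag from the backing store, typically upon completion.
    public void cleanUp(String dagId) {
        try { Files.deleteIfExists(dir.resolve(dagId + ".dag")); }
        catch (IOException e) { throw new UncheckedIOException(e); }
    }

    // Load all checkpointed Dags from the underlying store.
    public Map<String, String> getDags() {
        Map<String, String> dags = new HashMap<>();
        try (DirectoryStream<Path> files = Files.newDirectoryStream(dir, "*.dag")) {
            for (Path p : files) {
                String name = p.getFileName().toString();
                dags.put(name.substring(0, name.length() - 4), Files.readString(p));
            }
        } catch (IOException e) { throw new UncheckedIOException(e); }
        return dags;
    }
}
```

A MySQL-backed variant like the one mentioned above would keep the same write/load/delete contract while swapping the directory for a table.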
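The edge-equality rule stated above (two FlowEdges are the same if they have the same endpoints and refer to the same underlying edge) can be expressed as a standard equals/hashCode pair. `SimpleFlowEdge` and its fields are illustrative assumptions; the real FlowEdge carries richer node and template references.

```java
import java.util.Objects;

// Sketch of value equality for a flow edge: same endpoints and the same
// edge identity imply equal edges.
public class SimpleFlowEdge {
    final String src;
    final String dst;
    final String id;

    public SimpleFlowEdge(String src, String dst, String id) {
        this.src = src;
        this.dst = dst;
        this.id = id;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof SimpleFlowEdge)) return false;
        SimpleFlowEdge e = (SimpleFlowEdge) o;
        return src.equals(e.src) && dst.equals(e.dst) && id.equals(e.id);
    }

    @Override
    public int hashCode() {
        // Must be consistent with equals so edges behave in hash-based sets.
        return Objects.hash(src, dst, id);
    }
}
```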