Standard cmsRun Process building interface used for data processing
for a particular data scenario.

A scenario is a macro-data-taking setting such as cosmic running,
beam halo running, or particular validation tests.

This class defines the interfaces used by the Tier 0 and Tier 1
processing to wrap calls to ConfigBuilder in order to retrieve all the
configurations for the various types of job.
from Configuration.DataProcessing.Merge import mergeProcess
from Configuration.DataProcessing.Repack import repackProcess
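# Illustrative usage sketch, not part of the original module: how Tier 0/Tier 1
# code is expected to obtain a concrete Scenario and ask it for a cms.Process.
# It assumes the companion GetScenario helper; the "cosmics" scenario name, the
# global tag string and the writeTiers value are placeholders for illustration.
def _exampleUsage():
    from Configuration.DataProcessing.GetScenario import getScenario
    scenario = getScenario("cosmics")          # look up a concrete subclass
    process = scenario.promptReco("GLOBALTAG::ALL", writeTiers=["RECO"])
    return process                             # dumped to a config file and run with cmsRun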
Given a skeleton process object and references
to the output modules for the products it produces,
install the standard reco sequences and event content for this
scenario.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for promptReco"
raise NotImplementedError(msg)
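# A minimal sketch of a hypothetical subclass, showing what a promptReco
# override could return: a skeleton cms.Process carrying one PoolOutputModule
# per requested data tier. A real scenario would also load the GlobalTag
# ESSource and the full reconstruction sequences; the signature below is an
# assumption, not the documented interface.
import FWCore.ParameterSet.Config as cms

class ExamplePromptRecoScenario(Scenario):
    def promptReco(self, globalTag, writeTiers=None, **options):
        writeTiers = writeTiers or ["RECO"]
        process = cms.Process("RECO")
        for tier in writeTiers:
            setattr(process, "write%s" % tier,
                    cms.OutputModule("PoolOutputModule",
                                     fileName=cms.untracked.string("%s.root" % tier)))
        return process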
Build an express processing configuration for this scenario.

Express processing runs conversion, reco and alca reco on each
streamer file in the express stream and writes out RAW, RECO and
a combined ALCA file that gets mergepacked in a later step.

writeTiers is the list of tiers to write out, not including ALCA.

datasets is the list of datasets to split into for each tier
written out. Should always be one dataset.

alcaDataset - if set, the combined ALCA file is written out with no
dataset splitting; it gets assigned straight to that dataset.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for expressProcessing"
raise NotImplementedError(msg)
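# Illustrative call, assuming an implementing scenario; the option names echo
# the docstring above and are not a documented signature, and all values are
# placeholders:
def _exampleExpressCall(scenario):
    return scenario.expressProcessing(
        "GLOBALTAG::ALL",                         # global tag
        writeTiers=["RAW", "RECO"],               # tiers to write, ALCA excluded
        datasets=["StreamExpress"],               # single dataset per tier
        alcaDataset="StreamExpressALCACombined")  # combined ALCA output dataset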
Given a skeleton process, install the skim splitting for the given skims.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaSkim"
raise NotImplementedError(msg)
Given a skeleton process, install the skim production for the given skims.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaReco"
raise NotImplementedError(msg)
Build a DQM Harvesting configuration.

Arguments:

datasetName - aka workflow name for DQMServer, this is the name of the
dataset containing the harvested run
runNumber - The run being harvested
globalTag - The global tag being used
inputFiles - The list of LFNs being harvested
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for dqmHarvesting"
raise NotImplementedError(msg)
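# Illustrative call; the argument list simply echoes the docstring above and is
# not a documented signature, and all values are placeholders:
def _exampleDQMHarvestingCall(scenario):
    return scenario.dqmHarvesting(
        "/MinimumBias/Run2010A-v1/RECO",        # datasetName (workflow name)
        123456,                                 # runNumber
        "GLOBALTAG::ALL",                       # globalTag
        inputFiles=["/store/data/file1.root"])  # LFNs being harvested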
Build an AlCa Harvesting configuration.

Arguments:

globalTag - The global tag being used
inputFiles - The list of LFNs being harvested
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaHarvesting"
raise NotImplementedError(msg)
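# Illustrative call, assuming an implementing scenario; the argument list
# follows the docstring above and all values are placeholders:
def _exampleAlcaHarvestingCall(scenario):
    return scenario.alcaHarvesting(
        "GLOBALTAG::ALL",                            # global tag
        inputFiles=["/store/data/alca_file1.root"])  # LFNs being harvested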
Given a process, install the sequences for Tier 1 skimming
and the appropriate output modules.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for skimming"
raise NotImplementedError(msg)
def merge(self, *inputFiles, **options):

Builds a merge configuration.

msg = "Scenario Implementation %s\n" % self.__class__.__name__
# delegate to the shared merge configuration builder imported above
return mergeProcess(*inputFiles, **options)
Builds a repack configuration.

msg = "Scenario Implementation %s\n" % self.__class__.__name__
# delegate to the shared repack configuration builder imported above
# (assumes the method forwards its keyword options)
return repackProcess(**options)
Util to prune an unwanted output module.

# remove the named module from the Process's private output-module registry
del process._Process__outputmodules[moduleName]