Standard cmsRun Process building interface used for data processing
for a particular data scenario.
A scenario is a macro-data-taking setting such as cosmic running,
beam halo running, or particular validation tests.

This class defines the interfaces used by the Tier 0 and Tier 1
processing to wrap calls to ConfigBuilder in order to retrieve all the
configurations for the various types of job.
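# A minimal usage sketch, assuming the scenario lookup helper
# Configuration.DataProcessing.GetScenario.getScenario and the keyword
# arguments shown here; the scenario name and global tag are placeholders:
#
#   from Configuration.DataProcessing.GetScenario import getScenario
#
#   scenario = getScenario("cosmics")
#   process = scenario.promptReco(globalTag="GLOBALTAG::ALL")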
from Configuration.DataProcessing.Merge import mergeProcess
from Configuration.DataProcessing.Repack import repackProcess
Given a skeleton process object and references
to the output modules for the products it produces,
install the standard reco sequences and event content for this scenario.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for promptReco"
raise NotImplementedError(msg)
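# A minimal sketch of how a concrete scenario could override this interface;
# the subclass name, the exact signature, and the helper used below are
# assumptions:
#
#   class Cosmics(Scenario):
#       def promptReco(self, globalTag, **options):
#           # build the skeleton process (e.g. via ConfigBuilder), install the
#           # standard reco sequences and event content, then return it
#           process = self.buildProcess(globalTag, **options)  # hypothetical helper
#           return process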
Build an express processing configuration for this scenario.

Express processing runs conversion, reco and alca reco on each
streamer file in the express stream and writes out RAW, RECO and
a combined ALCA file that gets mergepacked in a later step.

writeTiers is the list of tiers to write out, not including ALCA.

datasets is the list of datasets to split into for each tier
written out. Should always be one dataset.

alcaDataset - if set, the combined ALCA file is written
out with no dataset splitting; it gets assigned straight to this dataset.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for expressProcessing"
raise NotImplementedError(msg)
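# A hedged call sketch based on the arguments documented above; passing them
# as keyword arguments, and the concrete values, are assumptions:
#
#   process = scenario.expressProcessing(
#       globalTag="GLOBALTAG::ALL",
#       writeTiers=["RAW", "RECO"],       # ALCA is implied, not listed here
#       datasets=["StreamExpress"],       # one dataset per tier written out
#       alcaDataset="StreamExpressALCA",  # optional: no splitting of the ALCA file
#   )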
Given a skeleton process, install the skim splitting for the given skims.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaSkim"
raise NotImplementedError(msg)
Given a skeleton process, install the skim production for the given skims.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaReco"
raise NotImplementedError(msg)
Build a DQM Harvesting configuration.

datasetName - also known as the workflow name for DQMServer; this is the
name of the dataset containing the harvested run
runNumber - The run being harvested
globalTag - The global tag being used
inputFiles - The list of LFNs being harvested
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for dqmHarvesting"
raise NotImplementedError(msg)
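# A hedged call sketch using the arguments documented above; keyword-style
# invocation and all concrete values are placeholders:
#
#   process = scenario.dqmHarvesting(
#       datasetName="/MinimumBias/Run2010A-v1/RECO",
#       runNumber=123456,
#       globalTag="GLOBALTAG::ALL",
#       inputFiles=["/store/data/.../harvesting_input.root"],
#   )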
Build an AlCa Harvesting configuration.

globalTag - The global tag being used
inputFiles - The list of LFNs being harvested
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for alcaHarvesting"
raise NotImplementedError(msg)
Given a process, install the sequences for Tier 1 skimming
and the appropriate output modules.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
msg += "Does not contain an implementation for skimming"
raise NotImplementedError(msg)
def merge(self, *inputFiles, **options):
Builds a merge configuration.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
Builds a repack configuration.
msg = "Scenario Implementation %s\n" % self.__class__.__name__
Util to prune an unwanted output module.

del process._Process__outputmodules[moduleName]
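# Example: pruning an unwanted output module from a skeleton process; the
# module name and the positional calling convention are assumptions:
#
#   scenario.dropOutputModule(process, "RECOoutput")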