Scenario.Scenario Class Reference
Inheritance diagram for Scenario.Scenario (not shown).

Public Member Functions

def __init__
 
def alcaHarvesting
 
def alcaReco
 
def alcaSkim
 
def dqmHarvesting
 
def dropOutputModule
 
def expressProcessing
 
def merge
 
def promptReco
 
def repack
 
def skimming
 
def visualizationProcessing
 

Public Attributes

 eras
 

Detailed Description

_Scenario_

Definition at line 24 of file Scenario.py.
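Scenario is the abstract interface used by the CMS data-processing tools: concrete
scenarios (such as the Reco and AlCa implementations referenced below) subclass it
and override the methods they support, while most base implementations simply raise
NotImplementedError. A minimal sketch of that pattern follows; the subclass name,
the global tag string and the reco content are illustrative assumptions, and the
import path assumes the usual CMSSW layout (Configuration/DataProcessing).

    import FWCore.ParameterSet.Config as cms
    from Configuration.DataProcessing.Scenario import Scenario

    class MyScenario(Scenario):
        """Hypothetical scenario that only supports prompt reconstruction."""

        def promptReco(self, globalTag, **options):
            # self.eras is the cms.Modifier set up in Scenario.__init__
            process = cms.Process("RECO", self.eras)
            process.source = cms.Source("PoolSource",
                                        fileNames=cms.untracked.vstring())
            # ... install reco sequences, event content and the global tag here ...
            return process

    scenario = MyScenario()
    process = scenario.promptReco("GR_P_V56")   # supported by this subclass
    # scenario.expressProcessing("GR_P_V56")    # would raise NotImplementedError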

Constructor & Destructor Documentation

def Scenario.Scenario.__init__(self)

Definition at line 29 of file Scenario.py.

29 
30  def __init__(self):
31  self.eras=cms.Modifier()
32 
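In this snippet cms is assumed to be FWCore.ParameterSet.Config, imported at the top
of Scenario.py. eras starts out as an empty cms.Modifier(); a concrete scenario
typically replaces it with a real era so that cms.Process can be constructed with the
right modifiers. A hedged sketch (the era module shown is an assumption about the
CMSSW release in use):

    from Configuration.DataProcessing.Scenario import Scenario

    class Run2Scenario(Scenario):
        def __init__(self):
            Scenario.__init__(self)
            # Replace the default empty Modifier with a concrete era
            # (Era_Run2_2016_cff is an assumption about the release).
            from Configuration.Eras.Era_Run2_2016_cff import Run2_2016
            self.eras = Run2_2016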

Member Function Documentation

def Scenario.Scenario.alcaHarvesting(self, globalTag, datasetName, **options)

_alcaHarvesting_

build an AlCa Harvesting configuration

Arguments:

globalTag - The global tag being used
inputFiles - The list of LFNs being harvested

Definition at line 139 of file Scenario.py.

140  def alcaHarvesting(self, globalTag, datasetName, **options):
141  """
142  _alcaHarvesting_
143 
144  build an AlCa Harvesting configuration
145 
146  Arguments:
147 
148  globalTag - The global tag being used
149  inputFiles - The list of LFNs being harvested
150 
151  """
152  msg = "Scenario Implementation %s\n" % self.__class__.__name__
153  msg += "Does not contain an implementation for alcaHarvesting"
154  raise NotImplementedError(msg)
155 
def Scenario.Scenario.alcaReco(self, *skims, **options)

_alcaReco_

Given a skeleton process, install the skim production for the given skims

Definition at line 107 of file Scenario.py.

108  def alcaReco(self, *skims, **options):
109  """
110  _alcaReco_
111 
112  Given a skeleton process install the skim production for given skims
113 
114  """
115  msg = "Scenario Implementation %s\n" % self.__class__.__name__
116  msg += "Does not contain an implementation for alcaReco"
117  raise NotImplementedError(msg)
118 
def Scenario.Scenario.alcaSkim(self, skims, **options)

_alcaSkim_

Given a skeleton process, install the skim splitting for the given skims

Definition at line 95 of file Scenario.py.

95 
96  def alcaSkim(self, skims, **options):
97  """
98  _alcaSkim_
99 
100  Given a skeleton process install the skim splitting for given skims
101 
102  """
103  msg = "Scenario Implementation %s\n" % self.__class__.__name__
104  msg += "Does not contain an implementation for alcaSkim"
105  raise NotImplementedError(msg)
106 
def Scenario.Scenario.dqmHarvesting(self, datasetName, runNumber, globalTag, **options)

_dqmHarvesting_

build a DQM Harvesting configuration

Arguments:

datasetName - aka the workflow name for DQMServer; this is the name of the
dataset containing the harvested run
runNumber - The run being harvested
globalTag - The global tag being used
inputFiles - The list of LFNs being harvested

Definition at line 119 of file Scenario.py.

120  def dqmHarvesting(self, datasetName, runNumber, globalTag, **options):
121  """
122  _dqmHarvesting_
123 
124  build a DQM Harvesting configuration
125 
126  Arguments:
127 
128  datasetName - aka workflow name for DQMServer, this is the name of the
129  dataset containing the harvested run
130  runNumber - The run being harvested
131  globalTag - The global tag being used
132  inputFiles - The list of LFNs being harvested
133 
134  """
135  msg = "Scenario Implementation %s\n" % self.__class__.__name__
136  msg += "Does not contain an implementation for dqmHarvesting"
137  raise NotImplementedError(msg)
138 
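The base implementation is a deliberate stub: calling it directly raises
NotImplementedError naming the scenario class, which is how unsupported workflows are
reported. A small sketch of that behaviour (the dataset, run number and global tag
strings are illustrative):

    from Configuration.DataProcessing.Scenario import Scenario

    try:
        Scenario().dqmHarvesting("/MinimumBias/Run2017A-PromptReco/RECO",
                                 123456, "PROMPT_GT_V1")
    except NotImplementedError as err:
        print(err)   # "Scenario Implementation Scenario ... for dqmHarvesting"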
def Scenario.Scenario.dropOutputModule(self, processRef, moduleName)

_dropOutputModule_

Util to prune an unwanted output module

Definition at line 195 of file Scenario.py.

196  def dropOutputModule(self, processRef, moduleName):
197  """
198  _dropOutputModule_
199 
200  Util to prune an unwanted output module
201 
202  """
203  del processRef._Process__outputmodules[moduleName]
204  return
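dropOutputModule works by deleting the entry from the process's private, name-mangled
output-module registry (_Process__outputmodules), so the module no longer appears when
the configuration is dumped. A minimal sketch with an illustrative output module:

    import FWCore.ParameterSet.Config as cms
    from Configuration.DataProcessing.Scenario import Scenario

    process = cms.Process("RECO")
    process.RECOoutput = cms.OutputModule("PoolOutputModule",
                                          fileName=cms.untracked.string("reco.root"))

    Scenario().dropOutputModule(process, "RECOoutput")
    assert "RECOoutput" not in process.outputModules_()   # pruned from the registry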
def Scenario.Scenario.expressProcessing(self, globalTag, **options)

_expressProcessing_

Build an express processing configuration for this scenario.

Express processing runs conversion, reco and alca reco on each
streamer file in the express stream and writes out RAW, RECO and
a combined ALCA file that gets mergepacked in a later step.

writeTiers is a list of tiers to write out, not including ALCA.

datasets is the list of datasets to split into for each tier
written out. This should always be a single dataset.

alcaDataset - if set, the combined ALCA file is written out with
no dataset splitting; it gets assigned straight to the dataset
provided.

Definition at line 48 of file Scenario.py.

48 
49  def expressProcessing(self, globalTag, **options):
50  """
51  _expressProcessing_
52 
53  Build an express processing configuration for this scenario.
54 
55  Express processing runs conversion, reco and alca reco on each
56  streamer file in the express stream and writes out RAW, RECO and
57  a combined ALCA file that gets mergepacked in a later step
58 
59  writeTiers is list of tiers to write out, not including ALCA
60 
61  datasets is the list of datasets to split into for each tier
62  written out. Should always be one dataset
63 
64  alcaDataset - if set, this means the combined Alca file is written
65  out with no dataset splitting, it gets assigned straight to the dataset
66  provided
67 
68  """
69  msg = "Scenario Implementation %s\n" % self.__class__.__name__
70  msg += "Does not contain an implementation for expressProcessing"
71  raise NotImplementedError(msg)
72 
73 
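A hypothetical call site for a scenario that supports express processing. The option
names writeTiers, datasets and alcaDataset come from the docstring above; whether and
how a concrete scenario honours them is up to that implementation, and the subclass
here only illustrates how they arrive through **options.

    import FWCore.ParameterSet.Config as cms
    from Configuration.DataProcessing.Scenario import Scenario

    class ExpressLikeScenario(Scenario):
        """Hypothetical subclass showing how the documented options arrive."""
        def expressProcessing(self, globalTag, **options):
            writeTiers  = options.get("writeTiers", [])     # tiers to write, ALCA excluded
            datasets    = options.get("datasets", [])       # one dataset per written tier
            alcaDataset = options.get("alcaDataset", None)  # combined ALCA target, if set
            process = cms.Process("EXPRESS", self.eras)
            # ... build conversion, reco and alca reco for the express stream here ...
            return process

    process = ExpressLikeScenario().expressProcessing(
        "EXPRESS_GT_V1",
        writeTiers=["RAW", "RECO"],
        datasets=["/StreamExpress/Run2017A-Express/RECO"],
        alcaDataset="/StreamExpress/Run2017A-PromptCalib/ALCAPROMPT",
    )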
def Scenario.Scenario.merge(self, *inputFiles, **options)

_merge_

builds a merge configuration

Definition at line 169 of file Scenario.py.

References Merge.mergeProcess().

170  def merge(self, *inputFiles, **options):
171  """
172  _merge_
173 
174  builds a merge configuration
175 
176  """
177  msg = "Scenario Implementation %s\n" % self.__class__.__name__
178  return mergeProcess(*inputFiles, **options)
179 
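merge is one of the two methods with a working default: it simply forwards its
arguments to Merge.mergeProcess() (repack does the same with Repack.repackProcess()).
A hedged usage sketch with illustrative LFNs, assuming mergeProcess's defaults are
acceptable for the job at hand:

    from Configuration.DataProcessing.Scenario import Scenario

    # Works on the base class itself, since merge() has a default implementation.
    process = Scenario().merge(
        "/store/data/Run2017A/MinimumBias/RECO/v1/000/123/456/file1.root",
        "/store/data/Run2017A/MinimumBias/RECO/v1/000/123/456/file2.root",
    )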
def Scenario.Scenario.promptReco(self, globalTag, **options)

_installPromptReco_

Given a skeleton process object and references
to the output modules for the products it produces,
install the standard reco sequences and event content for this
scenario.

Definition at line 33 of file Scenario.py.

33 
34  def promptReco(self, globalTag, **options):
35  """
36  _installPromptReco_
37 
38  given a skeleton process object and references
39  to the output modules for the products it produces,
40  install the standard reco sequences and event content for this
41  scenario
42 
43  """
44  msg = "Scenario Implementation %s\n" % self.__class__.__name__
45  msg += "Does not contain an implementation for promptReco"
46  raise NotImplementedError(msg)
47 
def Scenario.Scenario.repack(self, **options)

_repack_

builds a repack configuration

Definition at line 180 of file Scenario.py.

References Repack.repackProcess().

181  def repack(self, **options):
182  """
183  _repack_
184 
185  builds a repack configuration
186 
187  """
188  msg = "Scenario Implementation %s\n" % self.__class__.__name__
189  return repackProcess(**options)
190 
def Scenario.Scenario.skimming(self, skims, globalTag, **options)

_skimming_

Given a process, install the sequences for Tier 1 skimming
and the appropriate output modules.

Definition at line 156 of file Scenario.py.

157  def skimming(self, skims, globalTag, **options):
158  """
159  _skimming_
160 
161  Given a process install the sequences for Tier 1 skimming
162  and the appropriate output modules
163 
164  """
165  msg = "Scenario Implementation %s\n" % self.__class__.__name__
166  msg += "Does not contain an implementation for skimming"
167  raise NotImplementedError(msg)
168 
def Scenario.Scenario.visualizationProcessing(self, globalTag, **options)

_visualizationProcessing_

Build a configuration for the visualization processing for this scenario.

Visualization processing runs unpacking and reco on streamer files.
It is equipped to run on the online cluster and writes RECO or FEVT
files.

writeTiers is a list of tiers to write out.

Definition at line 74 of file Scenario.py.

74 
75  def visualizationProcessing(self, globalTag, **options):
76  """
77  _visualizationProcessing_
78 
79  Build a configuration for the visualization processing for this scenario.
80 
81  Visualization processing runs unpacking, and reco on
82  streamer files and it is equipped to run on the online cluster
83  and writes RECO or FEVT files,
84 
85  writeTiers is list of tiers to write out.
86 
87 
88  """
89  msg = "Scenario Implementation %s\n" % self.__class__.__name__
90  msg += "Does not contain an implementation for visualizationProcessing"
91  raise NotImplementedError(msg)
92 
93 
94 

Member Data Documentation

Scenario.Scenario.eras

Definition at line 30 of file Scenario.py.

Referenced by Impl.AlCa.AlCa.alcaHarvesting(), Reco.Reco.alcaHarvesting(), Impl.Test.Test.alcaSkim(), Impl.AlCa.AlCa.alcaSkim(), Reco.Reco.alcaSkim(), Impl.prodmc.prodmc.dqmHarvesting(), Impl.preprodmc.preprodmc.dqmHarvesting(), Impl.relvalgen.relvalgen.dqmHarvesting(), Impl.relvalmc.relvalmc.dqmHarvesting(), Impl.relvalmcfs.relvalmcfs.dqmHarvesting(), Impl.Test.Test.dqmHarvesting(), Impl.DataScouting.DataScouting.dqmHarvesting(), Impl.AlCa.AlCa.dqmHarvesting(), Reco.Reco.dqmHarvesting(), Impl.Test.Test.expressProcessing(), Reco.Reco.expressProcessing(), Impl.AlCa.AlCa.expressProcessing(), Impl.Test.Test.promptReco(), Impl.DataScouting.DataScouting.promptReco(), Impl.AlCa.AlCa.promptReco(), Reco.Reco.promptReco(), Impl.Test.Test.skimming(), Reco.Reco.skimming(), and Reco.Reco.visualizationProcessing().