
MatrixInjector.MatrixInjector Class Reference
Inheritance diagram for MatrixInjector.MatrixInjector:

Public Member Functions

def __init__ (self, opt, mode='init', options='')
 
def prepare (self, mReader, directories, mode='init')
 
def submit (self)
 
def upload (self)
 
def uploadConf (self, filePath, label, where)
 The info is not in the task-specific dict but in the general dict: t_input.update(copy.deepcopy(self.defaultHarvest)); t_input['DQMConfigCacheID']=t_second['ConfigCacheID']. More...
 

Public Attributes

 batchName
 
 batchTime
 
 chainDicts
 
 couch
 
 couchCache
 
 count
 
 DbsUrl
 
 defaultChain
 
 defaultHarvest
 
 defaultInput
 
 defaultScratch
 
 defaultTask
 
 dqmgui
 
 group
 
 keep
 
 label
 
 memoryOffset
 
 memPerCore
 
 speciallabel
 
 testMode
 
 user
 
 version
 
 wmagent
 

Detailed Description

Definition at line 38 of file MatrixInjector.py.
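
MatrixInjector turns the relval workflows assembled by the runTheMatrix machinery (a MatrixReader plus the per-workflow directories holding the generated cfg and .io files) into WMAgent TaskChain request dictionaries, uploads the step configurations to the ConfigCache CouchDB, and injects the requests into ReqMgr. The snippet below is a minimal usage sketch, not the canonical runTheMatrix driver: the opt namespace only carries the attributes read by the constructor and the default task dicts (keep, memoryOffset, memPerCore, batchName, nThreads, label, testbed), and mReader / directories are placeholders assumed to come from a prepared MatrixReader.

    from types import SimpleNamespace
    from MatrixInjector import MatrixInjector   # import path abbreviated to the file name used in this page

    # Stand-in for the runTheMatrix options object; values are illustrative defaults.
    opt = SimpleNamespace(keep=False, memoryOffset=3000, memPerCore=1500,
                          batchName='', nThreads=1, label='', testbed=False)

    injector = MatrixInjector(opt, mode='init', options='')   # mode 'init' => testMode, nothing is injected
    # mReader and directories are placeholders here; prepare() returns a negative code on error.
    if injector.prepare(mReader, directories, mode='init') not in (-12, -15):
        injector.upload()    # push the step configs to the ConfigCache CouchDB (faked in test mode)
        injector.submit()    # pretty-print the requests; only 'submit'/'force' mode talks to ReqMgr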

Constructor & Destructor Documentation

def MatrixInjector.MatrixInjector.__init__ (   self,
  opt,
  mode = 'init',
  options = '' 
)

Definition at line 40 of file MatrixInjector.py.

40  def __init__(self,opt,mode='init',options=''):
41  self.count=1040
42 
43  self.dqmgui=None
44  self.wmagent=None
45  for k in options.split(','):
46  if k.startswith('dqm:'):
47  self.dqmgui=k.split(':',1)[-1]
48  elif k.startswith('wma:'):
49  self.wmagent=k.split(':',1)[-1]
50 
51  self.testMode=((mode!='submit') and (mode!='force'))
52  self.version =1
53  self.keep = opt.keep
54  self.memoryOffset = opt.memoryOffset
55  self.memPerCore = opt.memPerCore
56  self.batchName = ''
57  self.batchTime = str(int(time.time()))
58  if(opt.batchName):
59  self.batchName = '__'+opt.batchName+'-'+self.batchTime
60 
 61  #wmagent stuff
62  if not self.wmagent:
63  self.wmagent=os.getenv('WMAGENT_REQMGR')
64  if not self.wmagent:
65  if not opt.testbed :
66  self.wmagent = 'cmsweb.cern.ch'
67  self.DbsUrl = "https://"+self.wmagent+"/dbs/prod/global/DBSReader"
68  else :
69  self.wmagent = 'cmsweb-testbed.cern.ch'
70  self.DbsUrl = "https://"+self.wmagent+"/dbs/int/global/DBSReader"
71 
72  if not self.dqmgui:
73  self.dqmgui="https://cmsweb.cern.ch/dqm/relval"
74  #couch stuff
75  self.couch = 'https://'+self.wmagent+'/couchdb'
76 # self.couchDB = 'reqmgr_config_cache'
 77  self.couchCache={} # so that we do not upload like crazy, and recycle cfgs
78  self.user = os.getenv('USER')
79  self.group = 'ppd'
80  self.label = 'RelValSet_'+os.getenv('CMSSW_VERSION').replace('-','')+'_v'+str(self.version)
81  self.speciallabel=''
82  if opt.label:
83  self.speciallabel= '_'+opt.label
84 
85 
86  if not os.getenv('WMCORE_ROOT'):
87  print('\n\twmclient is not setup properly. Will not be able to upload or submit requests.\n')
88  if not self.testMode:
89  print('\n\t QUIT\n')
90  sys.exit(-18)
91  else:
92  print('\n\tFound wmclient\n')
93 
94  self.defaultChain={
95  "RequestType" : "TaskChain", #this is how we handle relvals
96  "SubRequestType" : "RelVal", #this is how we handle relvals, now that TaskChain is also used for central MC production
97  "RequestPriority": 500000,
98  "Requestor": self.user, #Person responsible
99  "Group": self.group, #group for the request
100  "CMSSWVersion": os.getenv('CMSSW_VERSION'), #CMSSW Version (used for all tasks in chain)
101  "Campaign": os.getenv('CMSSW_VERSION'), # = AcquisitionEra, will be reset later to the one of first task, will both be the CMSSW_VERSION
102  "ScramArch": os.getenv('SCRAM_ARCH'), #Scram Arch (used for all tasks in chain)
103  "ProcessingVersion": self.version, #Processing Version (used for all tasks in chain)
104  "GlobalTag": None, #Global Tag (overridden per task)
105  "ConfigCacheUrl": self.couch, #URL of CouchDB containing Config Cache
106  "DbsUrl": self.DbsUrl,
107  #- Will contain all configs for all Tasks
108  #"SiteWhitelist" : ["T2_CH_CERN", "T1_US_FNAL"], #Site whitelist
109  "TaskChain" : None, #Define number of tasks in chain.
110  "nowmTasklist" : [], #a list of tasks as we put them in
111  "Multicore" : 1, # do not set multicore for the whole chain
112  "Memory" : 3000,
113  "SizePerEvent" : 1234,
114  "TimePerEvent" : 10,
115  "PrepID": os.getenv('CMSSW_VERSION')
116  }
117 
 118  self.defaultHarvest={
 119  "EnableHarvesting" : "True",
120  "DQMUploadUrl" : self.dqmgui,
121  "DQMConfigCacheID" : None,
122  "Multicore" : 1 # hardcode Multicore to be 1 for Harvest
123  }
124 
 125  self.defaultScratch={
 126  "TaskName" : None, #Task Name
127  "ConfigCacheID" : None, #Generator Config id
128  "GlobalTag": None,
129  "SplittingAlgo" : "EventBased", #Splitting Algorithm
130  "EventsPerJob" : None, #Size of jobs in terms of splitting algorithm
131  "RequestNumEvents" : None, #Total number of events to generate
132  "Seeding" : "AutomaticSeeding", #Random seeding method
133  "PrimaryDataset" : None, #Primary Dataset to be created
134  "nowmIO": {},
135  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
136  "KeepOutput" : False
137  }
 138  self.defaultInput={
 139  "TaskName" : "DigiHLT", #Task Name
140  "ConfigCacheID" : None, #Processing Config id
141  "GlobalTag": None,
142  "InputDataset" : None, #Input Dataset to be processed
143  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
144  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
145  "nowmIO": {},
146  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
147  "KeepOutput" : False
148  }
149  self.defaultTask={
150  "TaskName" : None, #Task Name
151  "InputTask" : None, #Input Task Name (Task Name field of a previous Task entry)
152  "InputFromOutputModule" : None, #OutputModule name in the input task that will provide files to process
153  "ConfigCacheID" : None, #Processing Config id
154  "GlobalTag": None,
155  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
156  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
157  "nowmIO": {},
158  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
159  "KeepOutput" : False
160  }
161 
162  self.chainDicts={}
163 
164 
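
The options argument is a comma-separated string whose dqm: and wma: tokens override the DQM GUI and WMAgent endpoints; when neither a wma: token nor the WMAGENT_REQMGR variable is set, the production or testbed cmsweb host and the matching DBS reader URL are chosen from opt.testbed. The following standalone sketch mirrors that selection logic from lines 43-73 above (host names and the environment variable are taken verbatim from the constructor; as in the original, the DBS URL stays unset when the host is supplied explicitly):

    import os

    def resolve_endpoints(options='', testbed=False):
        """Sketch of the endpoint resolution done in __init__ (lines 43-73)."""
        dqmgui, wmagent, dbs_url = None, None, None
        for k in options.split(','):
            if k.startswith('dqm:'):
                dqmgui = k.split(':', 1)[-1]
            elif k.startswith('wma:'):
                wmagent = k.split(':', 1)[-1]
        if not wmagent:
            wmagent = os.getenv('WMAGENT_REQMGR')
        if not wmagent:
            if not testbed:
                wmagent = 'cmsweb.cern.ch'
                dbs_url = 'https://' + wmagent + '/dbs/prod/global/DBSReader'
            else:
                wmagent = 'cmsweb-testbed.cern.ch'
                dbs_url = 'https://' + wmagent + '/dbs/int/global/DBSReader'
        if not dqmgui:
            dqmgui = 'https://cmsweb.cern.ch/dqm/relval'
        return dqmgui, wmagent, dbs_url

    # e.g. resolve_endpoints('wma:cmsweb-testbed.cern.ch', testbed=True)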

Member Function Documentation

def MatrixInjector.MatrixInjector.prepare (   self,
  mReader,
  directories,
  mode = 'init' 
)

Definition at line 165 of file MatrixInjector.py.

References mps_setup.append, MatrixInjector.MatrixInjector.batchName, MatrixInjector.MatrixInjector.batchTime, MatrixInjector.MatrixInjector.chainDicts, MatrixInjector.MatrixInjector.defaultChain, MatrixInjector.MatrixInjector.defaultHarvest, MatrixInjector.MatrixInjector.defaultInput, MatrixInjector.MatrixInjector.defaultScratch, MatrixInjector.MatrixInjector.defaultTask, spr.find(), createfilelist.int, mps_monitormerge.items, MatrixInjector.MatrixInjector.keep, genParticles_cff.map, MatrixInjector.MatrixInjector.memoryOffset, MatrixInjector.MatrixInjector.memPerCore, edm.print(), python.rootplot.root2matplotlib.replace(), MatrixInjector.MatrixInjector.speciallabel, split, and MuonErrorMatrixValues_cff.values.

165  def prepare(self,mReader, directories, mode='init'):
166  try:
167  #from Configuration.PyReleaseValidation.relval_steps import wmsplit
168  wmsplit = {}
169  wmsplit['DIGIHI']=5
170  wmsplit['RECOHI']=5
171  wmsplit['HLTD']=5
172  wmsplit['RECODreHLT']=2
173  wmsplit['DIGIPU']=4
174  wmsplit['DIGIPU1']=4
175  wmsplit['RECOPU1']=1
176  wmsplit['DIGIUP15_PU50']=1
177  wmsplit['RECOUP15_PU50']=1
178  wmsplit['DIGIUP15_PU25']=1
179  wmsplit['RECOUP15_PU25']=1
180  wmsplit['DIGIUP15_PU25HS']=1
181  wmsplit['RECOUP15_PU25HS']=1
182  wmsplit['DIGIHIMIX']=5
183  wmsplit['RECOHIMIX']=5
184  wmsplit['RECODSplit']=1
185  wmsplit['SingleMuPt10_UP15_ID']=1
186  wmsplit['DIGIUP15_ID']=1
187  wmsplit['RECOUP15_ID']=1
188  wmsplit['TTbar_13_ID']=1
189  wmsplit['SingleMuPt10FS_ID']=1
190  wmsplit['TTbarFS_ID']=1
191  wmsplit['RECODR2_50nsreHLT']=5
192  wmsplit['RECODR2_25nsreHLT']=5
193  wmsplit['RECODR2_2016reHLT']=5
194  wmsplit['RECODR2_50nsreHLT_HIPM']=5
195  wmsplit['RECODR2_25nsreHLT_HIPM']=5
196  wmsplit['RECODR2_2016reHLT_HIPM']=1
197  wmsplit['RECODR2_2016reHLT_skimSingleMu']=1
198  wmsplit['RECODR2_2016reHLT_skimDoubleEG']=1
199  wmsplit['RECODR2_2016reHLT_skimMuonEG']=1
200  wmsplit['RECODR2_2016reHLT_skimJetHT']=1
201  wmsplit['RECODR2_2016reHLT_skimMET']=1
202  wmsplit['RECODR2_2016reHLT_skimSinglePh']=1
203  wmsplit['RECODR2_2016reHLT_skimMuOnia']=1
204  wmsplit['RECODR2_2016reHLT_skimSingleMu_HIPM']=1
205  wmsplit['RECODR2_2016reHLT_skimDoubleEG_HIPM']=1
206  wmsplit['RECODR2_2016reHLT_skimMuonEG_HIPM']=1
207  wmsplit['RECODR2_2016reHLT_skimJetHT_HIPM']=1
208  wmsplit['RECODR2_2016reHLT_skimMET_HIPM']=1
209  wmsplit['RECODR2_2016reHLT_skimSinglePh_HIPM']=1
210  wmsplit['RECODR2_2016reHLT_skimMuOnia_HIPM']=1
211  wmsplit['RECODR2_2017reHLT_Prompt']=1
212  wmsplit['RECODR2_2017reHLT_skimSingleMu_Prompt_Lumi']=1
213  wmsplit['RECODR2_2017reHLT_skimDoubleEG_Prompt']=1
214  wmsplit['RECODR2_2017reHLT_skimMET_Prompt']=1
215  wmsplit['RECODR2_2017reHLT_skimMuOnia_Prompt']=1
216  wmsplit['RECODR2_2017reHLT_Prompt_L1TEgDQM']=1
217  wmsplit['RECODR2_2018reHLT_Prompt']=1
218  wmsplit['RECODR2_2018reHLT_skimSingleMu_Prompt_Lumi']=1
219  wmsplit['RECODR2_2018reHLT_skimDoubleEG_Prompt']=1
220  wmsplit['RECODR2_2018reHLT_skimJetHT_Prompt']=1
221  wmsplit['RECODR2_2018reHLT_skimMET_Prompt']=1
222  wmsplit['RECODR2_2018reHLT_skimMuOnia_Prompt']=1
223  wmsplit['RECODR2_2018reHLT_skimEGamma_Prompt_L1TEgDQM']=1
224  wmsplit['RECODR2_2018reHLT_skimMuonEG_Prompt']=1
225  wmsplit['RECODR2_2018reHLT_skimCharmonium_Prompt']=1
226  wmsplit['RECODR2_2018reHLT_skimJetHT_Prompt_HEfail']=1
227  wmsplit['RECODR2_2018reHLT_skimJetHT_Prompt_BadHcalMitig']=1
228  wmsplit['RECODR2_2018reHLTAlCaTkCosmics_Prompt']=1
229  wmsplit['RECODR2_2018reHLT_skimDisplacedJet_Prompt']=1
230  wmsplit['RECODR2_2018reHLT_ZBPrompt']=1
231  wmsplit['RECODR2_2018reHLT_Offline']=1
232  wmsplit['RECODR2_2018reHLT_skimSingleMu_Offline_Lumi']=1
233  wmsplit['RECODR2_2018reHLT_skimDoubleEG_Offline']=1
234  wmsplit['RECODR2_2018reHLT_skimJetHT_Offline']=1
235  wmsplit['RECODR2_2018reHLT_skimMET_Offline']=1
236  wmsplit['RECODR2_2018reHLT_skimMuOnia_Offline']=1
237  wmsplit['RECODR2_2018reHLT_skimEGamma_Offline_L1TEgDQM']=1
238  wmsplit['RECODR2_2018reHLT_skimMuonEG_Offline']=1
239  wmsplit['RECODR2_2018reHLT_skimCharmonium_Offline']=1
240  wmsplit['RECODR2_2018reHLT_skimJetHT_Offline_HEfail']=1
241  wmsplit['RECODR2_2018reHLT_skimJetHT_Offline_BadHcalMitig']=1
242  wmsplit['RECODR2_2018reHLTAlCaTkCosmics_Offline']=1
243  wmsplit['RECODR2_2018reHLT_skimDisplacedJet_Offline']=1
244  wmsplit['RECODR2_2018reHLT_ZBOffline']=1
245  wmsplit['HLTDR2_50ns']=1
246  wmsplit['HLTDR2_25ns']=1
247  wmsplit['HLTDR2_2016']=1
248  wmsplit['HLTDR2_2017']=1
249  wmsplit['HLTDR2_2018']=1
250  wmsplit['HLTDR2_2018_BadHcalMitig']=1
251  wmsplit['Hadronizer']=1
252  wmsplit['DIGIUP15']=1
253  wmsplit['RECOUP15']=1
254  wmsplit['RECOAODUP15']=5
255  wmsplit['DBLMINIAODMCUP15NODQM']=5
256  wmsplit['DigiFull']=5
257  wmsplit['RecoFull']=5
258  wmsplit['DigiFullPU']=1
259  wmsplit['RecoFullPU']=1
260  wmsplit['RECOHID11']=1
261  wmsplit['DigiFullTriggerPU_2023D17PU'] = 1
262  wmsplit['RecoFullGlobalPU_2023D17PU']=1
263  wmsplit['DIGIUP17']=1
264  wmsplit['RECOUP17']=1
265  wmsplit['DIGIUP17_PU25']=1
266  wmsplit['RECOUP17_PU25']=1
267  wmsplit['DIGICOS_UP16']=1
268  wmsplit['RECOCOS_UP16']=1
269  wmsplit['DIGICOS_UP17']=1
270  wmsplit['RECOCOS_UP17']=1
271  wmsplit['DIGICOS_UP18']=1
272  wmsplit['RECOCOS_UP18']=1
273  wmsplit['HYBRIDRepackHI2015VR']=1
274  wmsplit['HYBRIDZSHI2015']=1
275  wmsplit['RECOHID15']=1
276  wmsplit['RECOHID18']=1
277 
278  #import pprint
279  #pprint.pprint(wmsplit)
280  except:
281  print("Not set up for step splitting")
282  wmsplit={}
283 
284  acqEra=False
285  for (n,dir) in directories.items():
286  chainDict=copy.deepcopy(self.defaultChain)
287  print("inspecting",dir)
288  nextHasDSInput=None
289  for (x,s) in mReader.workFlowSteps.items():
290  #x has the format (num, prefix)
291  #s has the format (num, name, commands, stepList)
292  if x[0]==n:
293  #print "found",n,s[3]
294  #chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
295  index=0
296  splitForThisWf=None
297  thisLabel=self.speciallabel
298  #if 'HARVESTGEN' in s[3]:
299  if len( [step for step in s[3] if "HARVESTGEN" in step] )>0:
300  chainDict['TimePerEvent']=0.01
301  thisLabel=thisLabel+"_gen"
302  # for double miniAOD test
303  if len( [step for step in s[3] if "DBLMINIAODMCUP15NODQM" in step] )>0:
304  thisLabel=thisLabel+"_dblMiniAOD"
305  processStrPrefix=''
306  setPrimaryDs=None
307  nanoedmGT=''
308  for step in s[3]:
309 
310  if 'INPUT' in step or (not isinstance(s[2][index],str)):
311  nextHasDSInput=s[2][index]
312 
313  else:
314 
315  if (index==0):
316  #first step and not input -> gen part
317  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultScratch))
318  try:
319  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
320  except:
 321  print("Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created")
322  return -15
323 
324  chainDict['nowmTasklist'][-1]['PrimaryDataset']='RelVal'+s[1].split('+')[0]
325  if not '--relval' in s[2][index]:
326  print('Impossible to create task from scratch without splitting information with --relval')
327  return -12
328  else:
329  arg=s[2][index].split()
 330  ns=list(map(int,arg[arg.index('--relval')+1].split(',')))  # list() so the result can be indexed under Python 3
331  chainDict['nowmTasklist'][-1]['RequestNumEvents'] = ns[0]
332  chainDict['nowmTasklist'][-1]['EventsPerJob'] = ns[1]
333  if 'FASTSIM' in s[2][index] or '--fast' in s[2][index]:
334  thisLabel+='_FastSim'
 335  if 'lhe' in s[2][index]:
336  chainDict['nowmTasklist'][-1]['LheInputFiles'] =True
337 
338  elif nextHasDSInput:
339  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultInput))
340  try:
341  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
342  except:
 343  print("Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created")
344  return -15
345  chainDict['nowmTasklist'][-1]['InputDataset']=nextHasDSInput.dataSet
346  if ('DQMHLTonRAWAOD' in step) :
347  chainDict['nowmTasklist'][-1]['IncludeParents']=True
348  splitForThisWf=nextHasDSInput.split
349  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
350  if step in wmsplit:
351  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
352  # get the run numbers or #events
353  if len(nextHasDSInput.run):
354  chainDict['nowmTasklist'][-1]['RunWhitelist']=nextHasDSInput.run
355  if len(nextHasDSInput.ls):
356  chainDict['nowmTasklist'][-1]['LumiList']=nextHasDSInput.ls
357  #print "what is s",s[2][index]
358  if '--data' in s[2][index] and nextHasDSInput.label:
359  thisLabel+='_RelVal_%s'%nextHasDSInput.label
360  if 'filter' in chainDict['nowmTasklist'][-1]['nowmIO']:
361  print("This has an input DS and a filter sequence: very likely to be the PyQuen sample")
362  processStrPrefix='PU_'
363  setPrimaryDs = 'RelVal'+s[1].split('+')[0]
364  if setPrimaryDs:
365  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
366  nextHasDSInput=None
367  else:
368  #not first step and no inputDS
369  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultTask))
370  try:
371  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
372  except:
 373  print("Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created")
374  return -15
375  if splitForThisWf:
376  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
377  if step in wmsplit:
378  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
379 
380  # change LumisPerJob for Hadronizer steps.
381  if 'Hadronizer' in step:
382  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit['Hadronizer']
383 
384  #print step
385  chainDict['nowmTasklist'][-1]['TaskName']=step
386  if setPrimaryDs:
387  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
388  chainDict['nowmTasklist'][-1]['ConfigCacheID']='%s/%s.py'%(dir,step)
389  chainDict['nowmTasklist'][-1]['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] # copy to the proper parameter name
390  chainDict['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] #set in general to the last one of the chain
391  if 'NANOEDM' in step :
392  nanoedmGT = chainDict['nowmTasklist'][-1]['nowmIO']['GT']
393  if 'NANOMERGE' in step :
394  chainDict['GlobalTag'] = nanoedmGT
395  if 'pileup' in chainDict['nowmTasklist'][-1]['nowmIO']:
396  chainDict['nowmTasklist'][-1]['MCPileup']=chainDict['nowmTasklist'][-1]['nowmIO']['pileup']
 397  if '--pileup ' in s[2][index]: # catch --pileup (scenario) and not --pileup_ (dataset to be mixed) => works also making PRE-MIXed dataset
398  processStrPrefix='PU_' # take care of pu overlay done with GEN-SIM mixing
399  if ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('25ns') > 0 :
400  processStrPrefix='PU25ns_'
401  elif ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('50ns') > 0 :
402  processStrPrefix='PU50ns_'
403  if 'premix_stage2' in s[2][index] and '--pileup_input' in s[2][index]: # take care of pu overlay done with DIGI mixing of premixed events
404  if s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('25ns') > 0 :
405  processStrPrefix='PUpmx25ns_'
406  elif s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('50ns') > 0 :
407  processStrPrefix='PUpmx50ns_'
408 
409  if acqEra:
410  #chainDict['AcquisitionEra'][step]=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
411  chainDict['AcquisitionEra'][step]=chainDict['CMSSWVersion']
412  chainDict['ProcessingString'][step]=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','').replace('-','_')+thisLabel
413  if 'NANOMERGE' in step :
414  chainDict['ProcessingString'][step]=processStrPrefix+nanoedmGT.replace('::All','').replace('-','_')+thisLabel
415  else:
416  #chainDict['nowmTasklist'][-1]['AcquisitionEra']=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
417  chainDict['nowmTasklist'][-1]['AcquisitionEra']=chainDict['CMSSWVersion']
418  chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','').replace('-','_')+thisLabel
419  if 'NANOMERGE' in step :
420  chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+nanoedmGT.replace('::All','').replace('-','_')+thisLabel
421 
422  if (self.batchName):
423  chainDict['nowmTasklist'][-1]['Campaign'] = chainDict['nowmTasklist'][-1]['AcquisitionEra']+self.batchName
424 
425  # specify different ProcessingString for double miniAOD dataset
426  if ('DBLMINIAODMCUP15NODQM' in step):
427  chainDict['nowmTasklist'][-1]['ProcessingString']=chainDict['nowmTasklist'][-1]['ProcessingString']+'_miniAOD'
428 
429  if( chainDict['nowmTasklist'][-1]['Multicore'] ):
430  # the scaling factor of 1.2GB / thread is empirical and measured on a SECOND round of tests with PU samples
431  # the number of threads is NO LONGER assumed to be the same for all tasks
432  # https://hypernews.cern.ch/HyperNews/CMS/get/edmFramework/3509/1/1/1.html
433  # now change to 1.5GB / additional thread according to discussion:
434  # https://hypernews.cern.ch/HyperNews/CMS/get/relval/4817/1/1.html
435 # chainDict['nowmTasklist'][-1]['Memory'] = 3000 + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 )*1500
436  chainDict['nowmTasklist'][-1]['Memory'] = self.memoryOffset + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 ) * self.memPerCore
437 
438  index+=1
439  #end of loop through steps
440  chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
441  if processStrPrefix or thisLabel:
442  chainDict['RequestString']+='_'+processStrPrefix+thisLabel
443 
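
Two pieces of the per-task bookkeeping above are easy to check in isolation: the memory requirement grows linearly with the extra threads (line 436), and the ProcessingString combines the pileup prefix, the GlobalTag with '::All' stripped and dashes turned into underscores, and the special label (line 418). A small sketch with illustrative numbers; the 3000 MB offset and 1500 MB per extra core are the defaults quoted in the comments above, and the GlobalTag name is made up:

    def task_memory(multicore, memory_offset=3000, mem_per_core=1500):
        # line 436: base offset plus memPerCore for every thread beyond the first
        return memory_offset + (multicore - 1) * mem_per_core

    def processing_string(prefix, global_tag, special_label):
        # line 418: strip '::All', turn dashes into underscores, append the label
        return prefix + global_tag.replace('::All', '').replace('-', '_') + special_label

    print(task_memory(8))                                      # 13500 (MB) with the default offsets
    print(processing_string('PU25ns_', '106X_mcRun2_v3::All', '_gen'))  # PU25ns_106X_mcRun2_v3_gen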
def MatrixInjector.MatrixInjector.submit (   self)

Definition at line 581 of file MatrixInjector.py.

References edm.print(), MatrixInjector.MatrixInjector.testMode, and MatrixInjector.MatrixInjector.wmagent.

581  def submit(self):
582  try:
583  from modules.wma import makeRequest,approveRequest
584  from wmcontrol import random_sleep
585  print('\n\tFound wmcontrol\n')
586  except:
587  print('\n\tUnable to find wmcontrol modules. Please include it in your python path\n')
588  if not self.testMode:
589  print('\n\t QUIT\n')
590  sys.exit(-17)
591 
592  import pprint
593  for (n,d) in self.chainDicts.items():
594  if self.testMode:
595  print("Only viewing request",n)
 596  pprint.pprint(d)
597  else:
598  #submit to wmagent each dict
599  print("For eyes before submitting",n)
 600  pprint.pprint(d)
601  print("Submitting",n,"...........")
602  workFlow=makeRequest(self.wmagent,d,encodeDict=True)
603  print("...........",n,"submitted")
604  random_sleep()
605 
606 
607 
608 
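
Whether submit() actually injects anything is decided at construction time by the mode argument (line 51 of __init__): anything other than 'submit' or 'force' leaves testMode set, and the loop above only pretty-prints each request dictionary instead of calling wmcontrol's makeRequest. A one-liner makes the mapping explicit ('show' is just an illustrative extra value):

    for mode in ('init', 'submit', 'force', 'show'):
        test_mode = (mode != 'submit') and (mode != 'force')
        print(mode, '->', 'dry run (print only)' if test_mode else 'real injection')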
def MatrixInjector.MatrixInjector.upload (   self)

Definition at line 561 of file MatrixInjector.py.

References edm.print(), str, and MatrixInjector.MatrixInjector.uploadConf().

561  def upload(self):
562  for (n,d) in self.chainDicts.items():
563  for it in d:
564  if it.startswith("Task") and it!='TaskChain':
565  #upload
566  couchID=self.uploadConf(d[it]['ConfigCacheID'],
567  str(n)+d[it]['TaskName'],
568  d['ConfigCacheUrl']
569  )
570  print(d[it]['ConfigCacheID']," uploaded to couchDB for",str(n),"with ID",couchID)
571  d[it]['ConfigCacheID']=couchID
572  if it =='DQMConfigCacheID':
573  couchID=self.uploadConf(d['DQMConfigCacheID'],
574  str(n)+'harvesting',
575  d['ConfigCacheUrl']
576  )
577  print(d['DQMConfigCacheID'],"uploaded to couchDB for",str(n),"with ID",couchID)
578  d['DQMConfigCacheID']=couchID
579 
580 
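
upload() walks every 'TaskN' entry of each chain dictionary, uploads the local step configuration whose path prepare() stored in ConfigCacheID, and replaces that path with the CouchDB document ID returned by uploadConf(); the harvesting configuration referenced by DQMConfigCacheID gets the same treatment. A sketch of that in-place rewrite on a toy dictionary (field names are taken from the code above; the paths, the workflow key '10042.0' and fake_upload are made up for illustration):

    def fake_upload(file_path, label, where):
        # stand-in for uploadConf(): returns a made-up CouchDB document id
        return 'couchid_%s' % label

    d = {'Task1': {'TaskName': 'DigiFull', 'ConfigCacheID': 'wf_dir/step2.py'},
         'TaskChain': 2,
         'DQMConfigCacheID': 'wf_dir/step4_HARVESTING.py',
         'ConfigCacheUrl': 'https://cmsweb.cern.ch/couchdb'}

    for it in d:
        if it.startswith('Task') and it != 'TaskChain':
            d[it]['ConfigCacheID'] = fake_upload(d[it]['ConfigCacheID'],
                                                 '10042.0' + d[it]['TaskName'],
                                                 d['ConfigCacheUrl'])
        if it == 'DQMConfigCacheID':
            d['DQMConfigCacheID'] = fake_upload(d['DQMConfigCacheID'],
                                                '10042.0harvesting',
                                                d['ConfigCacheUrl'])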
def MatrixInjector.MatrixInjector.uploadConf (   self,
  filePath,
  label,
  where 
)

The info is not in the task-specific dict but in the general dict: t_input.update(copy.deepcopy(self.defaultHarvest)); t_input['DQMConfigCacheID']=t_second['ConfigCacheID'].

Additional notes: the batch name is appended to the Campaign name (chainDict['Campaign'] = chainDict['AcquisitionEra'] + batch name); things are then cleaned up and the number of tasks is provided.

Definition at line 534 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.couchCache, TmCcu.count, TmApvPair.count, TmModule.count, TmPsu.count, MatrixInjector.MatrixInjector.count, ValidationMisalignedTracker.count, SiStripDetSummary::Values.count, MatrixInjector.MatrixInjector.group, ElectronLikelihoodCategoryData.label, entry< T >.label, classes.PlotData.label, SiPixelFedFillerWordEventNumber.label, TtEvent::HypoClassKeyStringToEnum.label, HcalLutSet.label, L1GtBoardTypeStringToEnum.label, SiPixelQualityESProducer.label, L1GtPsbQuadStringToEnum.label, MatrixInjector.MatrixInjector.label, ValidationMisalignedTracker.label, L1GtConditionTypeStringToEnum.label, cond::payloadInspector::ModuleVersion.label, L1GtConditionCategoryStringToEnum.label, edm.print(), MatrixInjector.MatrixInjector.testMode, EcalTPGParamReaderFromDB.user, popcon::RpcObGasData.user, popcon::RPCObPVSSmapData.user, popcon::RpcDataT.user, popcon::RpcDataV.user, popcon::RpcDataGasMix.user, popcon::RpcDataUXC.user, popcon::RpcDataS.user, popcon::RpcDataFebmap.user, popcon::RpcDataI.user, and MatrixInjector.MatrixInjector.user.

Referenced by MatrixInjector.MatrixInjector.upload().

534  def uploadConf(self,filePath,label,where):
535  labelInCouch=self.label+'_'+label
536  cacheName=filePath.split('/')[-1]
537  if self.testMode:
538  self.count+=1
539  print('\tFake upload of',filePath,'to couch with label',labelInCouch)
540  return self.count
541  else:
542  try:
543  from modules.wma import upload_to_couch,DATABASE_NAME
544  except:
545  print('\n\tUnable to find wmcontrol modules. Please include it in your python path\n')
546  print('\n\t QUIT\n')
547  sys.exit(-16)
548 
549  if cacheName in self.couchCache:
550  print("Not re-uploading",filePath,"to",where,"for",label)
551  cacheId=self.couchCache[cacheName]
552  else:
553  print("Loading",filePath,"to",where,"for",label)
554  ## totally fork the upload to couch to prevent cross loading of process configurations
555  pool = multiprocessing.Pool(1)
556  cacheIds = pool.map( upload_to_couch_oneArg, [(filePath,labelInCouch,self.user,self.group,where)] )
557  cacheId = cacheIds[0]
558  self.couchCache[cacheName]=cacheId
559  return cacheId
560 
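
uploadConf() keys its cache on the file's basename, so a configuration that appears in several workflows is uploaded only once and later calls return the cached CouchDB ID (in test mode it just hands back an incrementing counter). A compressed sketch of that memoization, with upload_once standing in for the forked multiprocessing.Pool(1).map(upload_to_couch_oneArg, ...) call in the real code:

    couch_cache = {}

    def cached_upload(file_path, upload_once):
        cache_name = file_path.split('/')[-1]       # key on the basename only, as in line 536
        if cache_name in couch_cache:
            return couch_cache[cache_name]           # re-use the existing document id
        cache_id = upload_once(file_path)
        couch_cache[cache_name] = cache_id
        return cache_id

    ids = [cached_upload(p, lambda f: 'id_' + f.split('/')[-1])
           for p in ('wfA/step3.py', 'wfB/step3.py')]  # second call is served from the cache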

Member Data Documentation

MatrixInjector.MatrixInjector.batchName

Definition at line 56 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.batchTime

Definition at line 57 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.chainDicts

Definition at line 162 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.couch

Definition at line 75 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.couchCache

Definition at line 77 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.count

Definition at line 41 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.DbsUrl

Definition at line 67 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.defaultChain

Definition at line 94 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultHarvest

Definition at line 118 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultInput

Definition at line 138 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultScratch

Definition at line 125 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultTask

Definition at line 149 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.dqmgui

Definition at line 43 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.group

Definition at line 79 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.keep

Definition at line 53 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.memoryOffset

Definition at line 54 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.memPerCore

Definition at line 55 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.speciallabel

Definition at line 81 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.testMode

Definition at line 51 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.version

Definition at line 52 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.wmagent

Definition at line 44 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit().