MatrixInjector.MatrixInjector Class Reference
Inheritance diagram for MatrixInjector.MatrixInjector:

Public Member Functions

def __init__ (self, opt, mode='init', options='')
 
def prepare (self, mReader, directories, mode='init')
 
def submit (self)
 
def upload (self)
 
def uploadConf (self, filePath, label, where)
 The info is not in the task-specific dict but in the general dict: t_input.update(copy.deepcopy(self.defaultHarvest)); t_input['DQMConfigCacheID']=t_second['ConfigCacheID']. More...
 

Public Attributes

 batchName
 
 batchTime
 
 chainDicts
 
 couch
 
 couchCache
 
 count
 
 DbsUrl
 
 defaultChain
 
 defaultHarvest
 
 defaultInput
 
 defaultScratch
 
 defaultTask
 
 dqmgui
 
 group
 
 keep
 
 label
 
 memoryOffset
 
 memPerCore
 
 speciallabel
 
 testMode
 
 user
 
 version
 
 wmagent
 

Detailed Description

Definition at line 37 of file MatrixInjector.py.
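
The typical driver sequence, as suggested by the member functions documented below, is to construct the injector, prepare the task-chain dictionaries from a MatrixReader, upload the configurations, and submit. A minimal sketch, assuming the standard Configuration.PyReleaseValidation package path and an opt object of the kind built by runTheMatrix.py (the function name and flow here are illustrative, not taken from this file):

from Configuration.PyReleaseValidation.MatrixInjector import MatrixInjector

def inject_workflows(opt, mReader, directories):
    # mode='submit' turns off testMode; any other mode only prints the requests
    injector = MatrixInjector(opt, mode='submit', options='')
    injector.prepare(mReader, directories)   # build one TaskChain dict per workflow
    injector.upload()                        # push the configs to the CouchDB config cache
    injector.submit()                        # inject the requests into the request manager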

Constructor & Destructor Documentation

def MatrixInjector.MatrixInjector.__init__ (   self,
  opt,
  mode = 'init',
  options = '' 
)

Definition at line 39 of file MatrixInjector.py.

39  def __init__(self,opt,mode='init',options=''):
40  self.count=1040
41 
42  self.dqmgui=None
43  self.wmagent=None
44  for k in options.split(','):
45  if k.startswith('dqm:'):
46  self.dqmgui=k.split(':',1)[-1]
47  elif k.startswith('wma:'):
48  self.wmagent=k.split(':',1)[-1]
49 
50  self.testMode=((mode!='submit') and (mode!='force'))
51  self.version =1
52  self.keep = opt.keep
53  self.memoryOffset = opt.memoryOffset
54  self.memPerCore = opt.memPerCore
55  self.batchName = ''
56  self.batchTime = str(int(time.time()))
57  if(opt.batchName):
58  self.batchName = '__'+opt.batchName+'-'+self.batchTime
59 
60  #wmagent stuff
61  if not self.wmagent:
62  self.wmagent=os.getenv('WMAGENT_REQMGR')
63  if not self.wmagent:
64  if not opt.testbed :
65  self.wmagent = 'cmsweb.cern.ch'
66  self.DbsUrl = "https://"+self.wmagent+"/dbs/prod/global/DBSReader"
67  else :
68  self.wmagent = 'cmsweb-testbed.cern.ch'
69  self.DbsUrl = "https://"+self.wmagent+"/dbs/int/global/DBSReader"
70 
71  if not self.dqmgui:
72  self.dqmgui="https://cmsweb.cern.ch/dqm/relval"
73  #couch stuff
74  self.couch = 'https://'+self.wmagent+'/couchdb'
75 # self.couchDB = 'reqmgr_config_cache'
76  self.couchCache={} # so that we do not upload like crazy, and recycle cfgs
77  self.user = os.getenv('USER')
78  self.group = 'ppd'
79  self.label = 'RelValSet_'+os.getenv('CMSSW_VERSION').replace('-','')+'_v'+str(self.version)
80  self.speciallabel=''
81  if opt.label:
82  self.speciallabel= '_'+opt.label
83 
84 
85  if not os.getenv('WMCORE_ROOT'):
86  print '\n\twmclient is not setup properly. Will not be able to upload or submit requests.\n'
87  if not self.testMode:
88  print '\n\t QUIT\n'
89  sys.exit(-18)
90  else:
91  print '\n\tFound wmclient\n'
92 
93  self.defaultChain={
94  "RequestType" : "TaskChain", #this is how we handle relvals
95  "SubRequestType" : "RelVal", #this is how we handle relvals, now that TaskChain is also used for central MC production
96  "RequestPriority": 500000,
97  "Requestor": self.user, #Person responsible
98  "Group": self.group, #group for the request
99  "CMSSWVersion": os.getenv('CMSSW_VERSION'), #CMSSW Version (used for all tasks in chain)
100  "Campaign": os.getenv('CMSSW_VERSION'), # = AcquisitionEra, will be reset later to the one of first task, will both be the CMSSW_VERSION
101  "ScramArch": os.getenv('SCRAM_ARCH'), #Scram Arch (used for all tasks in chain)
102  "ProcessingVersion": self.version, #Processing Version (used for all tasks in chain)
103  "GlobalTag": None, #Global Tag (overridden per task)
104  "ConfigCacheUrl": self.couch, #URL of CouchDB containing Config Cache
105  "DbsUrl": self.DbsUrl,
106  #- Will contain all configs for all Tasks
107  #"SiteWhitelist" : ["T2_CH_CERN", "T1_US_FNAL"], #Site whitelist
108  "TaskChain" : None, #Define number of tasks in chain.
109  "nowmTasklist" : [], #a list of tasks as we put them in
110  "Multicore" : 1, # do not set multicore for the whole chain
111  "Memory" : 3000,
112  "SizePerEvent" : 1234,
113  "TimePerEvent" : 0.1,
114  "PrepID": os.getenv('CMSSW_VERSION')
115  }
116 
118  "EnableHarvesting" : "True",
119  "DQMUploadUrl" : self.dqmgui,
120  "DQMConfigCacheID" : None,
121  "Multicore" : 1 # hardcode Multicore to be 1 for Harvest
122  }
123 
125  "TaskName" : None, #Task Name
126  "ConfigCacheID" : None, #Generator Config id
127  "GlobalTag": None,
128  "SplittingAlgo" : "EventBased", #Splitting Algorithm
129  "EventsPerJob" : None, #Size of jobs in terms of splitting algorithm
130  "RequestNumEvents" : None, #Total number of events to generate
131  "Seeding" : "AutomaticSeeding", #Random seeding method
132  "PrimaryDataset" : None, #Primary Dataset to be created
133  "nowmIO": {},
134  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
135  "KeepOutput" : False
136  }
138  "TaskName" : "DigiHLT", #Task Name
139  "ConfigCacheID" : None, #Processing Config id
140  "GlobalTag": None,
141  "InputDataset" : None, #Input Dataset to be processed
142  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
143  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
144  "nowmIO": {},
145  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
146  "KeepOutput" : False
147  }
148  self.defaultTask={
149  "TaskName" : None, #Task Name
150  "InputTask" : None, #Input Task Name (Task Name field of a previous Task entry)
151  "InputFromOutputModule" : None, #OutputModule name in the input task that will provide files to process
152  "ConfigCacheID" : None, #Processing Config id
153  "GlobalTag": None,
154  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
155  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
156  "nowmIO": {},
157  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
158  "KeepOutput" : False
159  }
160 
161  self.chainDicts={}
162 
163 
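
The options argument is a comma-separated list of prefixed overrides; only dqm: and wma: are recognised, setting the DQM GUI upload URL and the request-manager host respectively. A small illustration of the convention parsed in the loop above (the URL and host values are made up):

options = 'dqm:https://cmsweb.cern.ch/dqm/dev,wma:cmsweb-testbed.cern.ch'

dqmgui, wmagent = None, None
for k in options.split(','):
    if k.startswith('dqm:'):
        dqmgui = k.split(':', 1)[-1]    # -> 'https://cmsweb.cern.ch/dqm/dev'
    elif k.startswith('wma:'):
        wmagent = k.split(':', 1)[-1]   # -> 'cmsweb-testbed.cern.ch'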

Member Function Documentation

def MatrixInjector.MatrixInjector.prepare (   self,
  mReader,
  directories,
  mode = 'init' 
)

Definition at line 164 of file MatrixInjector.py.

References mps_setup.append, MatrixInjector.MatrixInjector.batchName, MatrixInjector.MatrixInjector.batchTime, MatrixInjector.MatrixInjector.chainDicts, MatrixInjector.MatrixInjector.defaultChain, MatrixInjector.MatrixInjector.defaultHarvest, MatrixInjector.MatrixInjector.defaultInput, MatrixInjector.MatrixInjector.defaultScratch, MatrixInjector.MatrixInjector.defaultTask, spr.find(), reco.if(), createfilelist.int, mps_monitormerge.items, MatrixInjector.MatrixInjector.keep, genParticles_cff.map, MatrixInjector.MatrixInjector.memoryOffset, MatrixInjector.MatrixInjector.memPerCore, python.rootplot.root2matplotlib.replace(), MatrixInjector.MatrixInjector.speciallabel, split, and MuonErrorMatrixValues_cff.values.

164  def prepare(self,mReader, directories, mode='init'):
165  try:
166  #from Configuration.PyReleaseValidation.relval_steps import wmsplit
167  wmsplit = {}
168  wmsplit['DIGIHI']=5
169  wmsplit['RECOHI']=5
170  wmsplit['HLTD']=5
171  wmsplit['RECODreHLT']=2
172  wmsplit['DIGIPU']=4
173  wmsplit['DIGIPU1']=4
174  wmsplit['RECOPU1']=1
175  wmsplit['DIGIUP15_PU50']=1
176  wmsplit['RECOUP15_PU50']=1
177  wmsplit['DIGIUP15_PU25']=1
178  wmsplit['RECOUP15_PU25']=1
179  wmsplit['DIGIUP15_PU25HS']=1
180  wmsplit['RECOUP15_PU25HS']=1
181  wmsplit['DIGIHIMIX']=5
182  wmsplit['RECOHIMIX']=5
183  wmsplit['RECODSplit']=1
184  wmsplit['SingleMuPt10_UP15_ID']=1
185  wmsplit['DIGIUP15_ID']=1
186  wmsplit['RECOUP15_ID']=1
187  wmsplit['TTbar_13_ID']=1
188  wmsplit['SingleMuPt10FS_ID']=1
189  wmsplit['TTbarFS_ID']=1
190  wmsplit['RECODR2_50nsreHLT']=5
191  wmsplit['RECODR2_25nsreHLT']=5
192  wmsplit['RECODR2_2016reHLT']=5
193  wmsplit['RECODR2_50nsreHLT_HIPM']=5
194  wmsplit['RECODR2_25nsreHLT_HIPM']=5
195  wmsplit['RECODR2_2016reHLT_HIPM']=1
196  wmsplit['RECODR2_2016reHLT_skimSingleMu']=1
197  wmsplit['RECODR2_2016reHLT_skimDoubleEG']=1
198  wmsplit['RECODR2_2016reHLT_skimMuonEG']=1
199  wmsplit['RECODR2_2016reHLT_skimJetHT']=1
200  wmsplit['RECODR2_2016reHLT_skimMET']=1
201  wmsplit['RECODR2_2016reHLT_skimSinglePh']=1
202  wmsplit['RECODR2_2016reHLT_skimMuOnia']=1
203  wmsplit['RECODR2_2016reHLT_skimSingleMu_HIPM']=1
204  wmsplit['RECODR2_2016reHLT_skimDoubleEG_HIPM']=1
205  wmsplit['RECODR2_2016reHLT_skimMuonEG_HIPM']=1
206  wmsplit['RECODR2_2016reHLT_skimJetHT_HIPM']=1
207  wmsplit['RECODR2_2016reHLT_skimMET_HIPM']=1
208  wmsplit['RECODR2_2016reHLT_skimSinglePh_HIPM']=1
209  wmsplit['RECODR2_2016reHLT_skimMuOnia_HIPM']=1
210  wmsplit['RECODR2_2017reHLT_Prompt']=1
211  wmsplit['RECODR2_2017reHLT_skimSingleMu_Prompt_Lumi']=1
212  wmsplit['RECODR2_2017reHLT_skimDoubleEG_Prompt']=1
213  wmsplit['RECODR2_2017reHLT_skimMET_Prompt']=1
214  wmsplit['RECODR2_2017reHLT_skimMuOnia_Prompt']=1
215  wmsplit['HLTDR2_50ns']=1
216  wmsplit['HLTDR2_25ns']=1
217  wmsplit['HLTDR2_2016']=1
218  wmsplit['HLTDR2_2017']=1
219  wmsplit['Hadronizer']=1
220  wmsplit['DIGIUP15']=1
221  wmsplit['RECOUP15']=1
222  wmsplit['RECOAODUP15']=5
223  wmsplit['DBLMINIAODMCUP15NODQM']=5
224  wmsplit['DigiFull']=5
225  wmsplit['RecoFull']=5
226  wmsplit['DigiFullPU']=1
227  wmsplit['RecoFullPU']=1
228  wmsplit['RECOHID11']=1
229  wmsplit['DigiFullTriggerPU_2023D17PU'] = 1
230  wmsplit['RecoFullGlobalPU_2023D17PU']=1
231  wmsplit['DIGIUP17']=1
232  wmsplit['RECOUP17']=1
233  wmsplit['DIGIUP17_PU25']=1
234  wmsplit['RECOUP17_PU25']=1
235  wmsplit['DIGICOS_UP17']=1
236  wmsplit['RECOCOS_UP17']=1
237 
238 
239  #import pprint
240  #pprint.pprint(wmsplit)
241  except:
242  print "Not set up for step splitting"
243  wmsplit={}
244 
245  acqEra=False
246  for (n,dir) in directories.items():
247  chainDict=copy.deepcopy(self.defaultChain)
248  print "inspecting",dir
249  nextHasDSInput=None
250  for (x,s) in mReader.workFlowSteps.items():
251  #x has the format (num, prefix)
252  #s has the format (num, name, commands, stepList)
253  if x[0]==n:
254  #print "found",n,s[3]
255  #chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
256  index=0
257  splitForThisWf=None
258  thisLabel=self.speciallabel
259  #if 'HARVESTGEN' in s[3]:
260  if len( [step for step in s[3] if "HARVESTGEN" in step] )>0:
261  chainDict['TimePerEvent']=0.01
262  thisLabel=thisLabel+"_gen"
263  # for double miniAOD test
264  if len( [step for step in s[3] if "DBLMINIAODMCUP15NODQM" in step] )>0:
265  thisLabel=thisLabel+"_dblMiniAOD"
266  processStrPrefix=''
267  setPrimaryDs=None
268  nanoedmGT=''
269  for step in s[3]:
270 
271  if 'INPUT' in step or (not isinstance(s[2][index],str)):
272  nextHasDSInput=s[2][index]
273 
274  else:
275 
276  if (index==0):
277  #first step and not input -> gen part
278  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultScratch))
279  try:
280  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
281  except:
282  print "Failed to find",'%s/%s.io'%(dir,step),".The workflows were probably not run on cfg not created"
283  return -15
284 
285  chainDict['nowmTasklist'][-1]['PrimaryDataset']='RelVal'+s[1].split('+')[0]
286  if not '--relval' in s[2][index]:
287  print 'Impossible to create task from scratch without splitting information with --relval'
288  return -12
289  else:
290  arg=s[2][index].split()
291  ns=map(int,arg[arg.index('--relval')+1].split(','))
292  chainDict['nowmTasklist'][-1]['RequestNumEvents'] = ns[0]
293  chainDict['nowmTasklist'][-1]['EventsPerJob'] = ns[1]
294  if 'FASTSIM' in s[2][index] or '--fast' in s[2][index]:
295  thisLabel+='_FastSim'
296  if 'lhe' in s[2][index]:
297  chainDict['nowmTasklist'][-1]['LheInputFiles'] =True
298 
299  elif nextHasDSInput:
300  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultInput))
301  try:
302  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
303  except:
304  print "Failed to find",'%s/%s.io'%(dir,step),".The workflows were probably not run on cfg not created"
305  return -15
306  chainDict['nowmTasklist'][-1]['InputDataset']=nextHasDSInput.dataSet
307  if ('DQMHLTonRAWAOD' in step) :
308  chainDict['nowmTasklist'][-1]['IncludeParents']=True
309  splitForThisWf=nextHasDSInput.split
310  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
311  if step in wmsplit:
312  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
313  # get the run numbers or #events
314  if len(nextHasDSInput.run):
315  chainDict['nowmTasklist'][-1]['RunWhitelist']=nextHasDSInput.run
316  if len(nextHasDSInput.ls):
317  chainDict['nowmTasklist'][-1]['LumiList']=nextHasDSInput.ls
318  #print "what is s",s[2][index]
319  if '--data' in s[2][index] and nextHasDSInput.label:
320  thisLabel+='_RelVal_%s'%nextHasDSInput.label
321  if 'filter' in chainDict['nowmTasklist'][-1]['nowmIO']:
322  print "This has an input DS and a filter sequence: very likely to be the PyQuen sample"
323  processStrPrefix='PU_'
324  setPrimaryDs = 'RelVal'+s[1].split('+')[0]
325  if setPrimaryDs:
326  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
327  nextHasDSInput=None
328  else:
329  #not first step and no inputDS
330  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultTask))
331  try:
332  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
333  except:
334  print "Failed to find",'%s/%s.io'%(dir,step),".The workflows were probably not run on cfg not created"
335  return -15
336  if splitForThisWf:
337  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
338  if step in wmsplit:
339  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
340 
341  # change LumisPerJob for Hadronizer steps.
342  if 'Hadronizer' in step:
343  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit['Hadronizer']
344 
345  #print step
346  chainDict['nowmTasklist'][-1]['TaskName']=step
347  if setPrimaryDs:
348  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
349  chainDict['nowmTasklist'][-1]['ConfigCacheID']='%s/%s.py'%(dir,step)
350  chainDict['nowmTasklist'][-1]['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] # copy to the proper parameter name
351  chainDict['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] #set in general to the last one of the chain
352  if 'NANOEDM' in step :
353  nanoedmGT = chainDict['nowmTasklist'][-1]['nowmIO']['GT']
354  if 'NANOMERGE' in step :
355  chainDict['GlobalTag'] = nanoedmGT
356  if 'pileup' in chainDict['nowmTasklist'][-1]['nowmIO']:
357  chainDict['nowmTasklist'][-1]['MCPileup']=chainDict['nowmTasklist'][-1]['nowmIO']['pileup']
358  if '--pileup ' in s[2][index]: # catch --pileup (scenario) and not --pileup_ (dataset to be mixed) => also works when making PRE-MIXed datasets
359  processStrPrefix='PU_' # take care of pu overlay done with GEN-SIM mixing
360  if ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('25ns') > 0 :
361  processStrPrefix='PU25ns_'
362  elif ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('50ns') > 0 :
363  processStrPrefix='PU50ns_'
364  if 'premix_stage2' in s[2][index] : # take care of pu overlay done with DIGI mixing of premixed events
365  if s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('25ns') > 0 :
366  processStrPrefix='PUpmx25ns_'
367  elif s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('50ns') > 0 :
368  processStrPrefix='PUpmx50ns_'
369 
370  if acqEra:
371  #chainDict['AcquisitionEra'][step]=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
372  chainDict['AcquisitionEra'][step]=chainDict['CMSSWVersion']
373  chainDict['ProcessingString'][step]=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','').replace('-','_')+thisLabel
374  if 'NANOMERGE' in step :
375  chainDict['ProcessingString'][step]=processStrPrefix+nanoedmGT.replace('::All','').replace('-','_')+thisLabel
376  else:
377  #chainDict['nowmTasklist'][-1]['AcquisitionEra']=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
378  chainDict['nowmTasklist'][-1]['AcquisitionEra']=chainDict['CMSSWVersion']
379  chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','').replace('-','_')+thisLabel
380  if 'NANOMERGE' in step :
381  chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+nanoedmGT.replace('::All','').replace('-','_')+thisLabel
382 
383  if (self.batchName):
384  chainDict['nowmTasklist'][-1]['Campaign'] = chainDict['nowmTasklist'][-1]['AcquisitionEra']+self.batchName
385 
386  # specify different ProcessingString for double miniAOD dataset
387  if ('DBLMINIAODMCUP15NODQM' in step):
388  chainDict['nowmTasklist'][-1]['ProcessingString']=chainDict['nowmTasklist'][-1]['ProcessingString']+'_miniAOD'
389 
390  if( chainDict['nowmTasklist'][-1]['Multicore'] ):
391  # the scaling factor of 1.2GB / thread is empirical and measured on a SECOND round of tests with PU samples
392  # the number of threads is NO LONGER assumed to be the same for all tasks
393  # https://hypernews.cern.ch/HyperNews/CMS/get/edmFramework/3509/1/1/1.html
394  # now change to 1.5GB / additional thread according to discussion:
395  # https://hypernews.cern.ch/HyperNews/CMS/get/relval/4817/1/1.html
396 # chainDict['nowmTasklist'][-1]['Memory'] = 3000 + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 )*1500
397  chainDict['nowmTasklist'][-1]['Memory'] = self.memoryOffset + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 ) * self.memPerCore
398 
399  index+=1
400  #end of loop through steps
401  chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
402  if processStrPrefix or thisLabel:
403  chainDict['RequestString']+='_'+processStrPrefix+thisLabel
404 
def MatrixInjector.MatrixInjector.submit (   self)

Definition at line 540 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.testMode, and MatrixInjector.MatrixInjector.wmagent.

540  def submit(self):
541  try:
542  from modules.wma import makeRequest,approveRequest
543  from wmcontrol import random_sleep
544  print '\n\tFound wmcontrol\n'
545  except:
546  print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
547  if not self.testMode:
548  print '\n\t QUIT\n'
549  sys.exit(-17)
550 
551  import pprint
552  for (n,d) in self.chainDicts.items():
553  if self.testMode:
554  print "Only viewing request",n
555  pprint.pprint(d)
556  else:
557  #submit to wmagent each dict
558  print "For eyes before submitting",n
559  pprint.pprint(d)
560  print "Submitting",n,"..........."
561  workFlow=makeRequest(self.wmagent,d,encodeDict=True)
562  print "...........",n,"submitted"
563  random_sleep()
564 
565 
566 
def MatrixInjector.MatrixInjector.upload (   self)

Definition at line 520 of file MatrixInjector.py.

References harvestTrackValidationPlots.str, and MatrixInjector.MatrixInjector.uploadConf().

520  def upload(self):
521  for (n,d) in self.chainDicts.items():
522  for it in d:
523  if it.startswith("Task") and it!='TaskChain':
524  #upload
525  couchID=self.uploadConf(d[it]['ConfigCacheID'],
526  str(n)+d[it]['TaskName'],
527  d['ConfigCacheUrl']
528  )
529  print d[it]['ConfigCacheID']," uploaded to couchDB for",str(n),"with ID",couchID
530  d[it]['ConfigCacheID']=couchID
531  if it =='DQMConfigCacheID':
532  couchID=self.uploadConf(d['DQMConfigCacheID'],
533  str(n)+'harvesting',
534  d['ConfigCacheUrl']
535  )
536  print d['DQMConfigCacheID'],"uploaded to couchDB for",str(n),"with ID",couchID
537  d['DQMConfigCacheID']=couchID
538 
539 
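
upload() expects each entry of self.chainDicts to carry the configuration files as local paths, and replaces them with the CouchDB document ids returned by uploadConf(). A hedged sketch of the dictionary shape it iterates over, assuming prepare() has flattened the task list into Task1..TaskN keys (all values below are illustrative):

d = {
    'ConfigCacheUrl': 'https://cmsweb.cern.ch/couchdb',
    'TaskChain': 2,
    'Task1': {'TaskName': 'DigiFull_2017', 'ConfigCacheID': 'wf_dir/step2.py'},
    'Task2': {'TaskName': 'RecoFull_2017', 'ConfigCacheID': 'wf_dir/step3.py'},
    'DQMConfigCacheID': 'wf_dir/step4.py',   # harvesting config, uploaded separately
}
# After upload(), every ConfigCacheID above holds a CouchDB id instead of a file path.
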
def MatrixInjector.MatrixInjector.uploadConf (   self,
  filePath,
  label,
  where 
)

The info is not in the task-specific dict but in the general dict: t_input.update(copy.deepcopy(self.defaultHarvest)); t_input['DQMConfigCacheID']=t_second['ConfigCacheID']

Batch name appended to the Campaign name (chainDict['Campaign'] = chainDict['AcquisitionEra']); clean things up; now provide the number of tasks.

Definition at line 493 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.couchCache, TmCcu.count, TmModule.count, TmApvPair.count, TmPsu.count, MatrixInjector.MatrixInjector.count, ValidationMisalignedTracker.count, SiStripDetSummary::Values.count, MatrixInjector.MatrixInjector.group, ElectronLikelihoodCategoryData.label, classes.PlotData.label, entry< T >.label, SiPixelFedFillerWordEventNumber.label, HcalLutSet.label, TtEvent::HypoClassKeyStringToEnum.label, L1GtBoardTypeStringToEnum.label, L1GtPsbQuadStringToEnum.label, MatrixInjector.MatrixInjector.label, ValidationMisalignedTracker.label, L1GtConditionTypeStringToEnum.label, cond::payloadInspector::ModuleVersion.label, L1GtConditionCategoryStringToEnum.label, PhysicsTools::Calibration::Comparator.label, MatrixInjector.MatrixInjector.testMode, EcalTPGParamReaderFromDB.user, popcon::RpcDataV.user, popcon::RPCObPVSSmapData.user, popcon::RpcDataT.user, popcon::RpcObGasData.user, popcon::RpcDataFebmap.user, popcon::RpcDataI.user, popcon::RpcDataS.user, popcon::RpcDataGasMix.user, popcon::RpcDataUXC.user, and MatrixInjector.MatrixInjector.user.

Referenced by MatrixInjector.MatrixInjector.upload().

493  def uploadConf(self,filePath,label,where):
494  labelInCouch=self.label+'_'+label
495  cacheName=filePath.split('/')[-1]
496  if self.testMode:
497  self.count+=1
498  print '\tFake upload of',filePath,'to couch with label',labelInCouch
499  return self.count
500  else:
501  try:
502  from modules.wma import upload_to_couch,DATABASE_NAME
503  except:
504  print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
505  print '\n\t QUIT\n'
506  sys.exit(-16)
507 
508  if cacheName in self.couchCache:
509  print "Not re-uploading",filePath,"to",where,"for",label
510  cacheId=self.couchCache[cacheName]
511  else:
512  print "Loading",filePath,"to",where,"for",label
513  ## totally fork the upload to couch to prevent cross loading of process configurations
514  pool = multiprocessing.Pool(1)
515  cacheIds = pool.map( upload_to_couch_oneArg, [(filePath,labelInCouch,self.user,self.group,where)] )
516  cacheId = cacheIds[0]
517  self.couchCache[cacheName]=cacheId
518  return cacheId
519 
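
In test mode uploadConf() only pretends to upload and returns an incrementing counter; otherwise it caches the returned CouchDB id under the configuration file's basename, so identical cfg names are uploaded only once. A small illustration of how the cache key is derived (the path is hypothetical):

filePath = 'relval_dir/step2_DIGI.py'
cacheName = filePath.split('/')[-1]      # -> 'step2_DIGI.py', used as the couchCache key
# A later call whose path also ends in 'step2_DIGI.py' reuses the cached CouchDB id
# instead of triggering another upload.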

Member Data Documentation

MatrixInjector.MatrixInjector.batchName

Definition at line 55 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.batchTime

Definition at line 56 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.chainDicts

Definition at line 161 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.couch

Definition at line 74 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.couchCache

Definition at line 76 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.count

Definition at line 40 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.DbsUrl

Definition at line 66 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.defaultChain

Definition at line 93 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultHarvest

Definition at line 117 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultInput

Definition at line 137 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultScratch

Definition at line 124 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultTask

Definition at line 148 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.dqmgui

Definition at line 42 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.group

Definition at line 78 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.keep

Definition at line 52 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.label

Definition at line 79 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.memoryOffset

Definition at line 53 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.memPerCore

Definition at line 54 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.speciallabel

Definition at line 80 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.testMode

Definition at line 50 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.user

Definition at line 77 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.version

Definition at line 51 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.wmagent

Definition at line 43 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit().