MatrixInjector.MatrixInjector Class Reference
Inheritance diagram for MatrixInjector.MatrixInjector:

Public Member Functions

def __init__
 
def prepare
 
def submit
 
def upload
 
def uploadConf
 

Public Attributes

 chainDicts
 
 couch
 
 couchCache
 
 count
 
 DbsUrl
 
 defaultChain
 
 defaultHarvest
 
 defaultInput
 
 defaultScratch
 
 defaultTask
 
 dqmgui
 
 group
 
 keep
 
 label
 
 memoryOffset
 
 memPerCore
 
 speciallabel
 
 testMode
 
 user
 
 version
 
 wmagent
 

Detailed Description

Definition at line 36 of file MatrixInjector.py.
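
No description is attached to the class itself, but from the members documented below it builds one TaskChain request dictionary per RelVal workflow, uploads the corresponding configurations to CouchDB, and injects the requests into the WMAgent/ReqMgr through wmcontrol. A minimal usage sketch follows; the option object is a hypothetical stand-in carrying only the attributes that __init__ and prepare() actually read, the import path is assumed, and mReader/directories would come from the MatrixReader machinery, which is not shown:

from Configuration.PyReleaseValidation.MatrixInjector import MatrixInjector  # assumed import path

class FakeOpt(object):        # hypothetical stand-in for the runTheMatrix option object
    keep = None               # tasks whose output to keep (None -> keep all non-HARVEST tasks)
    memoryOffset = 3000       # MB for the first core
    memPerCore = 1500         # MB per additional core
    testbed = False           # True would target cmsweb-testbed.cern.ch
    label = ''                # optional extra label appended to the request string
    nThreads = 1              # per-task Multicore default

# Requires a CMSSW environment (CMSSW_VERSION, SCRAM_ARCH, USER) to be set.
injector = MatrixInjector(FakeOpt(), mode='init',
                          options='dqm:https://cmsweb.cern.ch/dqm/relval')
# injector.prepare(mReader, directories)  # build one chainDict per workflow
# injector.upload()                       # push the cfgs to CouchDB
# injector.submit()                       # with mode='init' this only prints the requests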

Constructor & Destructor Documentation

def MatrixInjector.MatrixInjector.__init__ (   self,
  opt,
  mode = 'init',
  options = '' 
)

Definition at line 38 of file MatrixInjector.py.

38 
39  def __init__(self,opt,mode='init',options=''):
40  self.count=1040
41 
42  self.dqmgui=None
43  self.wmagent=None
44  for k in options.split(','):
45  if k.startswith('dqm:'):
46  self.dqmgui=k.split(':',1)[-1]
47  elif k.startswith('wma:'):
48  self.wmagent=k.split(':',1)[-1]
49 
50  self.testMode=((mode!='submit') and (mode!='force'))
51  self.version =1
52  self.keep = opt.keep
53  self.memoryOffset = opt.memoryOffset
54  self.memPerCore = opt.memPerCore
55 
56  #wmagent stuff
57  if not self.wmagent:
58  self.wmagent=os.getenv('WMAGENT_REQMGR')
59  if not self.wmagent:
60  if not opt.testbed :
61  self.wmagent = 'cmsweb.cern.ch'
62  self.DbsUrl = "https://"+self.wmagent+"/dbs/prod/global/DBSReader"
63  else :
64  self.wmagent = 'cmsweb-testbed.cern.ch'
65  self.DbsUrl = "https://"+self.wmagent+"/dbs/int/global/DBSReader"
66 
67  if not self.dqmgui:
68  self.dqmgui="https://cmsweb.cern.ch/dqm/relval"
69  #couch stuff
70  self.couch = 'https://'+self.wmagent+'/couchdb'
71 # self.couchDB = 'reqmgr_config_cache'
72  self.couchCache={} # so that we do not upload like crazy, and recycle cfgs
73  self.user = os.getenv('USER')
74  self.group = 'ppd'
75  self.label = 'RelValSet_'+os.getenv('CMSSW_VERSION').replace('-','')+'_v'+str(self.version)
76  self.speciallabel=''
77  if opt.label:
78  self.speciallabel= '_'+opt.label
79 
80 
81  if not os.getenv('WMCORE_ROOT'):
82  print '\n\twmclient is not setup properly. Will not be able to upload or submit requests.\n'
83  if not self.testMode:
84  print '\n\t QUIT\n'
85  sys.exit(-18)
86  else:
87  print '\n\tFound wmclient\n'
88 
89  self.defaultChain={
90  "RequestType" : "TaskChain", #this is how we handle relvals
91  "SubRequestType" : "RelVal", #this is how we handle relvals, now that TaskChain is also used for central MC production
92  "RequestPriority": 500000,
93  "Requestor": self.user, #Person responsible
94  "Group": self.group, #group for the request
95  "CMSSWVersion": os.getenv('CMSSW_VERSION'), #CMSSW Version (used for all tasks in chain)
96  "Campaign": os.getenv('CMSSW_VERSION'), # only for wmstat purpose
97  "ScramArch": os.getenv('SCRAM_ARCH'), #Scram Arch (used for all tasks in chain)
98  "ProcessingVersion": self.version, #Processing Version (used for all tasks in chain)
99  "GlobalTag": None, #Global Tag (overridden per task)
100  "CouchURL": self.couch, #URL of CouchDB containing Config Cache
101  "ConfigCacheURL": self.couch, #URL of CouchDB containing Config Cache
102  "DbsUrl": self.DbsUrl,
103  #- Will contain all configs for all Tasks
104  #"SiteWhitelist" : ["T2_CH_CERN", "T1_US_FNAL"], #Site whitelist
105  "TaskChain" : None, #Define number of tasks in chain.
106  "nowmTasklist" : [], #a list of tasks as we put them in
107  "unmergedLFNBase" : "/store/unmerged",
108  "mergedLFNBase" : "/store/relval",
109  "dashboardActivity" : "relval",
110  "Multicore" : 1, # do not set multicore for the whole chain
111  "Memory" : 3000,
112  "SizePerEvent" : 1234,
113  "TimePerEvent" : 0.1
114  }
116  self.defaultHarvest={
117  "EnableHarvesting" : "True",
118  "DQMUploadUrl" : self.dqmgui,
119  "DQMConfigCacheID" : None,
120  "Multicore" : 1 # hardcode Multicore to be 1 for Harvest
121  }
122 
123  self.defaultScratch={
124  "TaskName" : None, #Task Name
125  "ConfigCacheID" : None, #Generator Config id
126  "GlobalTag": None,
127  "SplittingAlgo" : "EventBased", #Splitting Algorithm
128  "EventsPerJob" : None, #Size of jobs in terms of splitting algorithm
129  "RequestNumEvents" : None, #Total number of events to generate
130  "Seeding" : "AutomaticSeeding", #Random seeding method
131  "PrimaryDataset" : None, #Primary Dataset to be created
132  "nowmIO": {},
133  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
134  "KeepOutput" : False
135  }
136  self.defaultInput={
137  "TaskName" : "DigiHLT", #Task Name
138  "ConfigCacheID" : None, #Processing Config id
139  "GlobalTag": None,
140  "InputDataset" : None, #Input Dataset to be processed
141  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
142  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
143  "nowmIO": {},
144  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
145  "KeepOutput" : False
146  }
147  self.defaultTask={
148  "TaskName" : None, #Task Name
149  "InputTask" : None, #Input Task Name (Task Name field of a previous Task entry)
150  "InputFromOutputModule" : None, #OutputModule name in the input task that will provide files to process
151  "ConfigCacheID" : None, #Processing Config id
152  "GlobalTag": None,
153  "SplittingAlgo" : "LumiBased", #Splitting Algorithm
154  "LumisPerJob" : 10, #Size of jobs in terms of splitting algorithm
155  "nowmIO": {},
156  "Multicore" : opt.nThreads, # this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
157  "KeepOutput" : False
158  }
160  self.chainDicts={}
161 
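
The options string given to the constructor is a comma-separated list of prefixed tokens; only the dqm: and wma: prefixes are recognised, and split(':', 1) keeps any later colons (for example inside a URL) intact. A stand-alone sketch of that parsing with made-up values:

options = 'dqm:https://cmsweb.cern.ch/dqm/dev,wma:cmsweb-testbed.cern.ch'  # example values
dqmgui = None
wmagent = None
for k in options.split(','):
    if k.startswith('dqm:'):
        dqmgui = k.split(':', 1)[-1]   # -> 'https://cmsweb.cern.ch/dqm/dev'
    elif k.startswith('wma:'):
        wmagent = k.split(':', 1)[-1]  # -> 'cmsweb-testbed.cern.ch'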

Member Function Documentation

def MatrixInjector.MatrixInjector.prepare (   self,
  mReader,
  directories,
  mode = 'init' 
)

Definition at line 162 of file MatrixInjector.py.

References bitset_utilities.append(), MatrixInjector.MatrixInjector.chainDicts, MatrixInjector.MatrixInjector.defaultChain, MatrixInjector.MatrixInjector.defaultHarvest, MatrixInjector.MatrixInjector.defaultInput, MatrixInjector.MatrixInjector.defaultScratch, MatrixInjector.MatrixInjector.defaultTask, spr.find(), reco.if(), cmsHarvester.index, mps_monitormerge.items, MatrixInjector.MatrixInjector.keep, MatrixInjector.MatrixInjector.memoryOffset, MatrixInjector.MatrixInjector.memPerCore, SiPixelLorentzAngle_cfi.read, python.rootplot.root2matplotlib.replace(), MatrixInjector.MatrixInjector.speciallabel, split, and makeHLTPrescaleTable.values.

163  def prepare(self,mReader, directories, mode='init'):
164  try:
165  #from Configuration.PyReleaseValidation.relval_steps import wmsplit
166  wmsplit = {}
167  wmsplit['DIGIHI']=5
168  wmsplit['RECOHI']=5
169  wmsplit['HLTD']=5
170  wmsplit['RECODreHLT']=2
171  wmsplit['DIGIPU']=4
172  wmsplit['DIGIPU1']=4
173  wmsplit['RECOPU1']=1
174  wmsplit['DIGIUP15_PU50']=1
175  wmsplit['RECOUP15_PU50']=1
176  wmsplit['DIGIUP15_PU25']=1
177  wmsplit['RECOUP15_PU25']=1
178  wmsplit['DIGIUP15_PU25HS']=1
179  wmsplit['RECOUP15_PU25HS']=1
180  wmsplit['DIGIHIMIX']=5
181  wmsplit['RECOHIMIX']=5
182  wmsplit['RECODSplit']=1
183  wmsplit['SingleMuPt10_UP15_ID']=1
184  wmsplit['DIGIUP15_ID']=1
185  wmsplit['RECOUP15_ID']=1
186  wmsplit['TTbar_13_ID']=1
187  wmsplit['SingleMuPt10FS_ID']=1
188  wmsplit['TTbarFS_ID']=1
189  wmsplit['RECODR2_50nsreHLT']=1
190  wmsplit['RECODR2_25nsreHLT']=1
191  wmsplit['RECODR2_2016reHLT']=5
192  wmsplit['RECODR2_2016reHLT_skimSingleMu']=5
193  wmsplit['RECODR2_2016reHLT_skimDoubleEG']=5
194  wmsplit['RECODR2_2016reHLT_skimMuonEG']=5
195  wmsplit['RECODR2_2016reHLT_skimJetHT']=5
196  wmsplit['RECODR2_2016reHLT_skimSinglePh']=5
197  wmsplit['RECODR2_2016reHLT_skimMuOnia']=5
198  wmsplit['HLTDR2_50ns']=1
199  wmsplit['HLTDR2_25ns']=1
200  wmsplit['HLTDR2_2016']=1
201  wmsplit['Hadronizer']=1
202  wmsplit['DIGIUP15']=5
203  wmsplit['RECOUP15']=5
204  wmsplit['RECOAODUP15']=5
205  wmsplit['DBLMINIAODMCUP15NODQM']=5
206 
207 
208  #import pprint
209  #pprint.pprint(wmsplit)
210  except:
211  print "Not set up for step splitting"
212  wmsplit={}
213 
214  acqEra=False
215  for (n,dir) in directories.items():
216  chainDict=copy.deepcopy(self.defaultChain)
217  print "inspecting",dir
218  nextHasDSInput=None
219  for (x,s) in mReader.workFlowSteps.items():
220  #x has the format (num, prefix)
221  #s has the format (num, name, commands, stepList)
222  if x[0]==n:
223  #print "found",n,s[3]
224  #chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
225  index=0
226  splitForThisWf=None
227  thisLabel=self.speciallabel
228  #if 'HARVESTGEN' in s[3]:
229  if len( [step for step in s[3] if "HARVESTGEN" in step] )>0:
230  chainDict['TimePerEvent']=0.01
231  thisLabel=thisLabel+"_gen"
232  # for double miniAOD test
233  if len( [step for step in s[3] if "DBLMINIAODMCUP15NODQM" in step] )>0:
234  thisLabel=thisLabel+"_dblMiniAOD"
235  processStrPrefix=''
236  setPrimaryDs=None
237  for step in s[3]:
238 
239  if 'INPUT' in step or (not isinstance(s[2][index],str)):
240  nextHasDSInput=s[2][index]
241 
242  else:
243 
244  if (index==0):
245  #first step and not input -> gen part
246  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultScratch))
247  try:
248  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
249  except:
250  print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created"
251  return -15
252 
253  chainDict['nowmTasklist'][-1]['PrimaryDataset']='RelVal'+s[1].split('+')[0]
254  if not '--relval' in s[2][index]:
255  print 'Impossible to create task from scratch without splitting information with --relval'
256  return -12
257  else:
258  arg=s[2][index].split()
259  ns=map(int,arg[arg.index('--relval')+1].split(','))
260  chainDict['nowmTasklist'][-1]['RequestNumEvents'] = ns[0]
261  chainDict['nowmTasklist'][-1]['EventsPerJob'] = ns[1]
262  if 'FASTSIM' in s[2][index] or '--fast' in s[2][index]:
263  thisLabel+='_FastSim'
264  if 'lhe' in s[2][index]:
265  chainDict['nowmTasklist'][-1]['LheInputFiles'] =True
266 
267  elif nextHasDSInput:
268  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultInput))
269  try:
270  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
271  except:
272  print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created"
273  return -15
274  chainDict['nowmTasklist'][-1]['InputDataset']=nextHasDSInput.dataSet
275  splitForThisWf=nextHasDSInput.split
276  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
277  if step in wmsplit:
278  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
279  # get the run numbers or #events
280  if len(nextHasDSInput.run):
281  chainDict['nowmTasklist'][-1]['RunWhitelist']=nextHasDSInput.run
282  if len(nextHasDSInput.ls):
283  chainDict['nowmTasklist'][-1]['LumiList']=nextHasDSInput.ls
284  #print "what is s",s[2][index]
285  if '--data' in s[2][index] and nextHasDSInput.label:
286  thisLabel+='_RelVal_%s'%nextHasDSInput.label
287  if 'filter' in chainDict['nowmTasklist'][-1]['nowmIO']:
288  print "This has an input DS and a filter sequence: very likely to be the PyQuen sample"
289  processStrPrefix='PU_'
290  setPrimaryDs = 'RelVal'+s[1].split('+')[0]
291  if setPrimaryDs:
292  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
293  nextHasDSInput=None
294  else:
295  #not first step and no inputDS
296  chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultTask))
297  try:
298  chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
299  except:
300  print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run or the cfg was not created"
301  return -15
302  if splitForThisWf:
303  chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
304  if step in wmsplit:
305  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
306 
307  # change LumisPerJob for Hadronizer steps.
308  if 'Hadronizer' in step:
309  chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit['Hadronizer']
310 
311  #print step
312  chainDict['nowmTasklist'][-1]['TaskName']=step
313  if setPrimaryDs:
314  chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
315  chainDict['nowmTasklist'][-1]['ConfigCacheID']='%s/%s.py'%(dir,step)
316  chainDict['nowmTasklist'][-1]['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] # copy to the proper parameter name
317  chainDict['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] #set in general to the last one of the chain
318  if 'pileup' in chainDict['nowmTasklist'][-1]['nowmIO']:
319  chainDict['nowmTasklist'][-1]['MCPileup']=chainDict['nowmTasklist'][-1]['nowmIO']['pileup']
320  if '--pileup ' in s[2][index]: # catch --pileup (scenario) and not --pileup_ (dataset to be mixed) => works also making PRE-MIXed dataset
321  processStrPrefix='PU_' # take care of pu overlay done with GEN-SIM mixing
322  if ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('25ns') > 0 :
323  processStrPrefix='PU25ns_'
324  elif ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('50ns') > 0 :
325  processStrPrefix='PU50ns_'
326  if 'DIGIPREMIX_S2' in s[2][index] : # take care of pu overlay done with DIGI mixing of premixed events
327  if s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('25ns') > 0 :
328  processStrPrefix='PUpmx25ns_'
329  elif s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('50ns') > 0 :
330  processStrPrefix='PUpmx50ns_'
331 
332  if acqEra:
333  #chainDict['AcquisitionEra'][step]=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
334  chainDict['AcquisitionEra'][step]=chainDict['CMSSWVersion']
335  chainDict['ProcessingString'][step]=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','')+thisLabel
336  else:
337  #chainDict['nowmTasklist'][-1]['AcquisitionEra']=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
338  chainDict['nowmTasklist'][-1]['AcquisitionEra']=chainDict['CMSSWVersion']
339  chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','')+thisLabel
340 
341  # specify different ProcessingString for double miniAOD dataset
342  if ('DBLMINIAODMCUP15NODQM' in step):
343  chainDict['nowmTasklist'][-1]['ProcessingString']=chainDict['nowmTasklist'][-1]['ProcessingString']+'_miniAOD'
344 
345  if( chainDict['nowmTasklist'][-1]['Multicore'] ):
346  # the scaling factor of 1.2GB / thread is empirical and measured on a SECOND round of tests with PU samples
347  # the number of threads is NO LONGER assumed to be the same for all tasks
348  # https://hypernews.cern.ch/HyperNews/CMS/get/edmFramework/3509/1/1/1.html
349  # now change to 1.5GB / additional thread according to discussion:
350  # https://hypernews.cern.ch/HyperNews/CMS/get/relval/4817/1/1.html
351 # chainDict['nowmTasklist'][-1]['Memory'] = 3000 + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 )*1500
352  chainDict['nowmTasklist'][-1]['Memory'] = self.memoryOffset + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 ) * self.memPerCore
353 
354  index+=1
355  #end of loop through steps
356  chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
357  if processStrPrefix or thisLabel:
358  chainDict['RequestString']+='_'+processStrPrefix+thisLabel
359 
360 
361 
362  #wrap up for this one
363  import pprint
364  #print 'wrapping up'
365  #pprint.pprint(chainDict)
366  #loop on the task list
367  for i_second in reversed(range(len(chainDict['nowmTasklist']))):
368  t_second=chainDict['nowmTasklist'][i_second]
369  #print "t_second taskname", t_second['TaskName']
370  if 'primary' in t_second['nowmIO']:
371  #print t_second['nowmIO']['primary']
372  primary=t_second['nowmIO']['primary'][0].replace('file:','')
373  for i_input in reversed(range(0,i_second)):
374  t_input=chainDict['nowmTasklist'][i_input]
375  for (om,o) in t_input['nowmIO'].items():
376  if primary in o:
377  #print "found",primary,"produced by",om,"of",t_input['TaskName']
378  t_second['InputTask'] = t_input['TaskName']
379  t_second['InputFromOutputModule'] = om
380  #print 't_second',pprint.pformat(t_second)
381  if t_second['TaskName'].startswith('HARVEST'):
382  chainDict.update(copy.deepcopy(self.defaultHarvest))
383  chainDict['DQMConfigCacheID']=t_second['ConfigCacheID']
384  ## the info are not in the task specific dict but in the general dict
385  #t_input.update(copy.deepcopy(self.defaultHarvest))
386  #t_input['DQMConfigCacheID']=t_second['ConfigCacheID']
387  break
388 
389  ## there is in fact only one acquisition era
390  #if len(set(chainDict['AcquisitionEra'].values()))==1:
391  # print "setting only one acq"
392  if acqEra:
393  chainDict['AcquisitionEra'] = chainDict['AcquisitionEra'].values()[0]
394 
395  ## clean things up now
396  itask=0
397  if self.keep:
398  for i in self.keep:
399  if type(i)==int and i < len(chainDict['nowmTasklist']):
400  chainDict['nowmTasklist'][i]['KeepOutput']=True
401  for (i,t) in enumerate(chainDict['nowmTasklist']):
402  if t['TaskName'].startswith('HARVEST'):
403  continue
404  if not self.keep:
405  t['KeepOutput']=True
406  elif t['TaskName'] in self.keep:
407  t['KeepOutput']=True
408  t.pop('nowmIO')
409  itask+=1
410  chainDict['Task%d'%(itask)]=t
411 
412 
413  ##
414 
415 
416  ## provide the number of tasks
417  chainDict['TaskChain']=itask#len(chainDict['nowmTasklist'])
418 
419  chainDict.pop('nowmTasklist')
420  self.chainDicts[n]=chainDict
421 
422 
423  return 0
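
After prepare() returns 0, self.chainDicts holds one request dictionary per workflow number: the entries of nowmTasklist have been renumbered into Task1..TaskN keys, TaskChain carries the number of non-HARVEST tasks, the harvesting step (if any) is folded into the top-level DQMConfigCacheID, and each multicore task's Memory is memoryOffset + (Multicore - 1) * memPerCore. An illustrative sketch of that layout; the key names follow the code above, the values are made up:

example_chainDict = {                                    # illustrative values only
    "RequestType": "TaskChain",
    "RequestString": "RVCMSSW_X_Y_ZTTbar_13",            # 'RV' + CMSSWVersion + workflow prefix (+ labels)
    "GlobalTag": "auto:run2_mc",                         # copied from the last task's nowmIO['GT']
    "TaskChain": 2,                                      # number of non-HARVEST tasks kept
    "Task1": {"TaskName": "GenSimFull", "ConfigCacheID": "wf_dir/GenSimFull.py",
              "Memory": 3000, "KeepOutput": False},      # Memory = memoryOffset + (cores-1)*memPerCore
    "Task2": {"TaskName": "DigiFull", "InputTask": "GenSimFull",
              "InputFromOutputModule": "RAWSIMoutput", "KeepOutput": True},
    "EnableHarvesting": "True",                          # merged in from defaultHarvest
    "DQMConfigCacheID": "wf_dir/HARVESTFull.py",         # later replaced by a couch id in upload()
}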
def MatrixInjector.MatrixInjector.submit (   self)

Definition at line 471 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.testMode, and MatrixInjector.MatrixInjector.wmagent.

472  def submit(self):
473  try:
474  from modules.wma import makeRequest,approveRequest
475  from wmcontrol import random_sleep
476  print '\n\tFound wmcontrol\n'
477  except:
478  print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
479  if not self.testMode:
480  print '\n\t QUIT\n'
481  sys.exit(-17)
482 
483  import pprint
484  for (n,d) in self.chainDicts.items():
485  if self.testMode:
486  print "Only viewing request",n
487  print pprint.pprint(d)
488  else:
489  #submit to wmagent each dict
490  print "For eyes before submitting",n
491  print pprint.pprint(d)
492  print "Submitting",n,"..........."
493  workFlow=makeRequest(self.wmagent,d,encodeDict=True)
494  approveRequest(self.wmagent,workFlow)
495  print "...........",n,"submitted"
496  random_sleep()
497 
498 
499 
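
submit() only contacts the agent when testMode is False, which happens exactly when the injector was built with mode='submit' or mode='force'; for any other mode it just pretty-prints the requests. A small sketch of that mapping, reproducing the expression from the constructor ('other' is a placeholder mode):

for mode in ('init', 'submit', 'force', 'other'):
    testMode = ((mode != 'submit') and (mode != 'force'))  # same expression as in __init__
    print('%-6s -> testMode = %s' % (mode, testMode))
# init   -> testMode = True   (requests are only pretty-printed)
# submit -> testMode = False  (makeRequest and approveRequest are called)
# force  -> testMode = False
# other  -> testMode = True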
def MatrixInjector.MatrixInjector.upload (   self)

Definition at line 451 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.uploadConf().

452  def upload(self):
453  for (n,d) in self.chainDicts.items():
454  for it in d:
455  if it.startswith("Task") and it!='TaskChain':
456  #upload
457  couchID=self.uploadConf(d[it]['ConfigCacheID'],
458  str(n)+d[it]['TaskName'],
459  d['CouchURL']
460  )
461  print d[it]['ConfigCacheID']," uploaded to couchDB for",str(n),"with ID",couchID
462  d[it]['ConfigCacheID']=couchID
463  if it =='DQMConfigCacheID':
464  couchID=self.uploadConf(d['DQMConfigCacheID'],
465  str(n)+'harvesting',
466  d['CouchURL']
467  )
468  print d['DQMConfigCacheID'],"uploaded to couchDB for",str(n),"with ID",couchID
469  d['DQMConfigCacheID']=couchID
470 
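
upload() walks every TaskN entry of each chain dictionary (and the top-level DQMConfigCacheID when a harvesting step is present) and overwrites the local cfg path stored in ConfigCacheID with the CouchDB document id returned by uploadConf(). A sketch of that in-place rewrite with a made-up id:

d = {'TaskChain': 1,
     'Task1': {'TaskName': 'GenSimFull', 'ConfigCacheID': 'wf_dir/GenSimFull.py'},
     'CouchURL': 'https://cmsweb.cern.ch/couchdb'}
for it in d:
    if it.startswith('Task') and it != 'TaskChain':
        d[it]['ConfigCacheID'] = 'a1b2c3d4e5'   # would come from uploadConf()
print(d['Task1']['ConfigCacheID'])              # 'a1b2c3d4e5' instead of the local path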
def MatrixInjector.MatrixInjector.uploadConf (   self,
  filePath,
  label,
  where 
)

Definition at line 424 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.couchCache, TmCcu.count, TmModule.count, TmApvPair.count, TmPsu.count, MatrixInjector.MatrixInjector.count, ValidationMisalignedTracker.count, SiStripDetSummary::Values.count, MD5.count, MatrixInjector.MatrixInjector.group, ElectronLikelihoodCategoryData.label, entry< T >.label, SiPixelFedFillerWordEventNumber.label, classes.PlotData.label, TtEvent::HypoClassKeyStringToEnum.label, HcalLutSet.label, DTDQMHarvesting.DTDQMHarvesting.label, DTVDriftMeanTimerCalibration.DTVDriftMeanTimerCalibration.label, DTVDriftSegmentCalibration.DTVDriftSegmentCalibration.label, DTAnalysisResiduals.DTAnalysisResiduals.label, DTDQMValidation.DTDQMValidation.label, L1GtBoardTypeStringToEnum.label, DTResidualCalibration.DTResidualCalibration.label, DTTTrigValid.DTTTrigValid.label, DTTTrigResidualCorr.DTTTrigResidualCorr.label, MatrixInjector.MatrixInjector.label, L1GtPsbQuadStringToEnum.label, ValidationMisalignedTracker.label, L1GtConditionTypeStringToEnum.label, L1GtConditionCategoryStringToEnum.label, PhysicsTools::Calibration::Comparator.label, MatrixInjector.MatrixInjector.testMode, EcalTPGParamReaderFromDB.user, popcon::RpcDataV.user, popcon::RpcObGasData.user, popcon::RPCObPVSSmapData.user, popcon::RpcDataT.user, popcon::RpcDataFebmap.user, popcon::RpcDataS.user, popcon::RpcDataUXC.user, popcon::RpcDataI.user, popcon::RpcDataGasMix.user, MatrixInjector.MatrixInjector.user, and conddblib.TimeType.user.

Referenced by MatrixInjector.MatrixInjector.upload().

425  def uploadConf(self,filePath,label,where):
426  labelInCouch=self.label+'_'+label
427  cacheName=filePath.split('/')[-1]
428  if self.testMode:
429  self.count+=1
430  print '\tFake upload of',filePath,'to couch with label',labelInCouch
431  return self.count
432  else:
433  try:
434  from modules.wma import upload_to_couch,DATABASE_NAME
435  except:
436  print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
437  print '\n\t QUIT\n'
438  sys.exit(-16)
439 
440  if cacheName in self.couchCache:
441  print "Not re-uploading",filePath,"to",where,"for",label
442  cacheId=self.couchCache[cacheName]
443  else:
444  print "Loading",filePath,"to",where,"for",label
445  ## totally fork the upload to couch to prevent cross loading of process configurations
446  pool = multiprocessing.Pool(1)
447  cacheIds = pool.map( upload_to_couch_oneArg, [(filePath,labelInCouch,self.user,self.group,where)] )
448  cacheId = cacheIds[0]
449  self.couchCache[cacheName]=cacheId
450  return cacheId
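
The actual upload is delegated to a one-worker multiprocessing.Pool so that each configuration is loaded in a forked child process and cannot leak state into the parent or into the next cfg; results are cached in couchCache keyed on the cfg file name. A minimal sketch of that fork-per-upload pattern, with a dummy worker standing in for upload_to_couch_oneArg:

import multiprocessing

def fake_upload_oneArg(args):                    # hypothetical stand-in for upload_to_couch_oneArg
    filePath, labelInCouch, user, group, where = args
    return 'couchid-for-' + filePath             # the real worker returns the couch document id

if __name__ == '__main__':
    pool = multiprocessing.Pool(1)               # a single forked child isolates cfg loading
    cacheIds = pool.map(fake_upload_oneArg,
                        [('GenSimFull.py', 'RelValSet_v1_GenSimFull', 'someuser', 'ppd',
                          'https://cmsweb.cern.ch/couchdb')])
    print(cacheIds[0])                           # couchid-for-GenSimFull.py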

Member Data Documentation

MatrixInjector.MatrixInjector.chainDicts

Definition at line 159 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.couch

Definition at line 69 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.couchCache

Definition at line 71 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.count

Definition at line 39 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.DbsUrl

Definition at line 61 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.defaultChain

Definition at line 88 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultHarvest

Definition at line 115 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultInput

Definition at line 135 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultScratch

Definition at line 122 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultTask

Definition at line 146 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.dqmgui

Definition at line 41 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.group

Definition at line 73 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.keep

Definition at line 51 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.label

Definition at line 74 of file MatrixInjector.py.

Referenced by Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor._sort_list(), python.rootplot.root2matplotlib.Hist.bar(), python.rootplot.root2matplotlib.Hist.barh(), python.rootplot.root2matplotlib.Hist.errorbar(), python.rootplot.root2matplotlib.Hist.errorbarh(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.foundIn(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.fullFilename(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.inputEventContent(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.outputEventContent(), core.TriggerMatchAnalyzer.TriggerMatchAnalyzer.process(), Vispa.Plugins.ConfigEditor.ToolDataAccessor.ToolDataAccessor.properties(), Vispa.Plugins.EdmBrowser.EdmDataAccessor.EdmDataAccessor.properties(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.properties(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.readConnections(), core.AutoHandle.AutoHandle.ReallyLoad(), Vispa.Plugins.ConfigEditor.ToolDataAccessor.ToolDataAccessor.updateProcess(), MatrixInjector.MatrixInjector.uploadConf(), and Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.usedBy().

MatrixInjector.MatrixInjector.memoryOffset

Definition at line 52 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.memPerCore

Definition at line 53 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.speciallabel

Definition at line 75 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.testMode

Definition at line 49 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.user

Definition at line 72 of file MatrixInjector.py.

Referenced by cmsPerfSuite.PerfSuite.optionParse(), dataset.BaseDataset.printInfo(), production_tasks.CheckDatasetExists.run(), production_tasks.GenerateMask.run(), production_tasks.SourceCFG.run(), production_tasks.FullCFG.run(), production_tasks.MonitorJobs.run(), production_tasks.CleanJobFiles.run(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.version

Definition at line 50 of file MatrixInjector.py.

Referenced by python.rootplot.argparse._VersionAction.__call__(), validation.Sample.datasetpattern(), validation.Sample.filename(), argparse.ArgumentParser.format_version(), and python.rootplot.argparse.ArgumentParser.format_version().

MatrixInjector.MatrixInjector.wmagent

Definition at line 42 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit().