
MatrixInjector.MatrixInjector Class Reference
Inheritance diagram for MatrixInjector.MatrixInjector:

Public Member Functions

def __init__
def prepare
def submit
def upload
def uploadConf

Public Attributes

 chainDicts
 couch
 couchCache
 count
 DbsUrl
 defaultChain
 defaultHarvest
 defaultInput
 defaultScratch
 defaultTask
 dqmgui
 group
 keep
 label
 memoryOffset
 memPerCore
 speciallabel
 testMode
 user
 version
 wmagent

Detailed Description

Definition at line 36 of file MatrixInjector.py.

Constructor & Destructor Documentation

def MatrixInjector.MatrixInjector.__init__ (   self, opt, mode = 'init', options = '' )

Definition at line 38 of file MatrixInjector.py.

38 
39     def __init__(self,opt,mode='init',options=''):
40         self.count=1040
41 
42         self.dqmgui=None
43         self.wmagent=None
44         for k in options.split(','):
45             if k.startswith('dqm:'):
46                 self.dqmgui=k.split(':',1)[-1]
47             elif k.startswith('wma:'):
48                 self.wmagent=k.split(':',1)[-1]
49 
50         self.testMode=((mode!='submit') and (mode!='force'))
51         self.version =1
52         self.keep = opt.keep
53         self.memoryOffset = opt.memoryOffset
54         self.memPerCore = opt.memPerCore
55 
56         #wmagent stuff
57         if not self.wmagent:
58             self.wmagent=os.getenv('WMAGENT_REQMGR')
59         if not self.wmagent:
60             if not opt.testbed:
61                 self.wmagent = 'cmsweb.cern.ch'
62                 self.DbsUrl = "https://"+self.wmagent+"/dbs/prod/global/DBSReader"
63             else:
64                 self.wmagent = 'cmsweb-testbed.cern.ch'
65                 self.DbsUrl = "https://"+self.wmagent+"/dbs/int/global/DBSReader"
66 
67         if not self.dqmgui:
68             self.dqmgui="https://cmsweb.cern.ch/dqm/relval"
69         #couch stuff
70         self.couch = 'https://'+self.wmagent+'/couchdb'
71 #        self.couchDB = 'reqmgr_config_cache'
72         self.couchCache={} # so that we do not upload like crazy, and recycle cfgs
73         self.user = os.getenv('USER')
74         self.group = 'ppd'
75         self.label = 'RelValSet_'+os.getenv('CMSSW_VERSION').replace('-','')+'_v'+str(self.version)
76         self.speciallabel=''
77         if opt.label:
78             self.speciallabel= '_'+opt.label
79 
80 
81         if not os.getenv('WMCORE_ROOT'):
82             print '\n\twmclient is not set up properly. Will not be able to upload or submit requests.\n'
83             if not self.testMode:
84                 print '\n\t QUIT\n'
85                 sys.exit(-18)
86         else:
87             print '\n\tFound wmclient\n'
88 
89         self.defaultChain={
90             "RequestType" : "TaskChain",                        #this is how we handle relvals
91             "SubRequestType" : "RelVal",                        #this is how we handle relvals, now that TaskChain is also used for central MC production
92             "RequestPriority": 500000,
93             "Requestor": self.user,                             #Person responsible
94             "Group": self.group,                                #group for the request
95             "CMSSWVersion": os.getenv('CMSSW_VERSION'),         #CMSSW Version (used for all tasks in chain)
96             "Campaign": os.getenv('CMSSW_VERSION'),             #only for wmstat purpose
97             "ScramArch": os.getenv('SCRAM_ARCH'),               #Scram Arch (used for all tasks in chain)
98             "ProcessingVersion": self.version,                  #Processing Version (used for all tasks in chain)
99             "GlobalTag": None,                                  #Global Tag (overridden per task)
100             "CouchURL": self.couch,                             #URL of CouchDB containing Config Cache
101             "ConfigCacheURL": self.couch,                       #URL of CouchDB containing Config Cache
102             "DbsUrl": self.DbsUrl,
103             #- Will contain all configs for all Tasks
104             #"SiteWhitelist" : ["T2_CH_CERN", "T1_US_FNAL"],    #Site whitelist
105             "TaskChain" : None,                                 #Define number of tasks in chain.
106             "nowmTasklist" : [],                                #a list of tasks as we put them in
107             "unmergedLFNBase" : "/store/unmerged",
108             "mergedLFNBase" : "/store/relval",
109             "dashboardActivity" : "relval",
110             "Multicore" : 1,                                    #do not set multicore for the whole chain
111             "Memory" : 3000,
112             "SizePerEvent" : 1234,
113             "TimePerEvent" : 0.1
114             }
116         self.defaultHarvest={
117             "EnableHarvesting" : "True",
118             "DQMUploadUrl" : self.dqmgui,
119             "DQMConfigCacheID" : None,
120             "Multicore" : 1                                     #hardcode Multicore to be 1 for Harvest
121             }
122 
123         self.defaultScratch={
124             "TaskName" : None,                                  #Task Name
125             "ConfigCacheID" : None,                             #Generator Config id
126             "GlobalTag": None,
127             "SplittingAlgo" : "EventBased",                     #Splitting Algorithm
128             "EventsPerJob" : None,                              #Size of jobs in terms of splitting algorithm
129             "RequestNumEvents" : None,                          #Total number of events to generate
130             "Seeding" : "AutomaticSeeding",                     #Random seeding method
131             "PrimaryDataset" : None,                            #Primary Dataset to be created
132             "nowmIO": {},
133             "Multicore" : opt.nThreads,                         #this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
134             "KeepOutput" : False
135             }
136         self.defaultInput={
137             "TaskName" : "DigiHLT",                             #Task Name
138             "ConfigCacheID" : None,                             #Processing Config id
139             "GlobalTag": None,
140             "InputDataset" : None,                              #Input Dataset to be processed
141             "SplittingAlgo" : "LumiBased",                      #Splitting Algorithm
142             "LumisPerJob" : 10,                                 #Size of jobs in terms of splitting algorithm
143             "nowmIO": {},
144             "Multicore" : opt.nThreads,                         #this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
145             "KeepOutput" : False
146             }
147         self.defaultTask={
148             "TaskName" : None,                                  #Task Name
149             "InputTask" : None,                                 #Input Task Name (Task Name field of a previous Task entry)
150             "InputFromOutputModule" : None,                     #OutputModule name in the input task that will provide files to process
151             "ConfigCacheID" : None,                             #Processing Config id
152             "GlobalTag": None,
153             "SplittingAlgo" : "LumiBased",                      #Splitting Algorithm
154             "LumisPerJob" : 10,                                 #Size of jobs in terms of splitting algorithm
155             "nowmIO": {},
156             "Multicore" : opt.nThreads,                         #this is the per-taskchain Multicore; it's the default assigned to a task if it has no value specified
157             "KeepOutput" : False
158             }
160         self.chainDicts={}
161 
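
For illustration, here is a minimal sketch of how this constructor could be driven by hand inside a CMSSW environment. The _Opt holder below is hypothetical; it only carries the attributes the constructor actually reads (keep, memoryOffset, memPerCore, nThreads, testbed, label), and the options string exercises the dqm:/wma: overrides parsed above.

# Hypothetical options holder; in practice runTheMatrix.py builds this object from its command line.
from Configuration.PyReleaseValidation.MatrixInjector import MatrixInjector

class _Opt(object):
    keep = None           # task names/indices whose output should be kept (see prepare())
    memoryOffset = 3000   # base Memory (MB) assigned to a task
    memPerCore = 1500     # extra MB per additional thread
    nThreads = 1          # default per-task Multicore
    testbed = False       # True -> cmsweb-testbed.cern.ch and the DBS 'int' instance
    label = ''            # optional extra label folded into speciallabel

injector = MatrixInjector(_Opt(), mode='init',
                          options='dqm:https://cmsweb.cern.ch/dqm/relval,wma:cmsweb.cern.ch')

With mode='init' the injector stays in test mode, so nothing is sent to ReqMgr or CouchDB.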

Member Function Documentation

def MatrixInjector.MatrixInjector.prepare (   self, mReader, directories, mode = 'init' )

Definition at line 162 of file MatrixInjector.py.

References bitset_utilities.append(), MatrixInjector.MatrixInjector.chainDicts, MatrixInjector.MatrixInjector.defaultChain, MatrixInjector.MatrixInjector.defaultHarvest, MatrixInjector.MatrixInjector.defaultInput, MatrixInjector.MatrixInjector.defaultScratch, MatrixInjector.MatrixInjector.defaultTask, spr.find(), reco.if(), cmsHarvester.index, mps_monitormerge.items, MatrixInjector.MatrixInjector.keep, MatrixInjector.MatrixInjector.memoryOffset, MatrixInjector.MatrixInjector.memPerCore, SiPixelLorentzAngle_cfi.read, python.rootplot.root2matplotlib.replace(), MatrixInjector.MatrixInjector.speciallabel, split, and makeHLTPrescaleTable.values.

163     def prepare(self,mReader, directories, mode='init'):
164         try:
165             #from Configuration.PyReleaseValidation.relval_steps import wmsplit
166             wmsplit = {}
167             wmsplit['DIGIHI']=5
168             wmsplit['RECOHI']=5
169             wmsplit['HLTD']=5
170             wmsplit['RECODreHLT']=2
171             wmsplit['DIGIPU']=4
172             wmsplit['DIGIPU1']=4
173             wmsplit['RECOPU1']=1
174             wmsplit['DIGIUP15_PU50']=1
175             wmsplit['RECOUP15_PU50']=1
176             wmsplit['DIGIUP15_PU25']=1
177             wmsplit['RECOUP15_PU25']=1
178             wmsplit['DIGIUP15_PU25HS']=1
179             wmsplit['RECOUP15_PU25HS']=1
180             wmsplit['DIGIHIMIX']=5
181             wmsplit['RECOHIMIX']=5
182             wmsplit['RECODSplit']=1
183             wmsplit['SingleMuPt10_UP15_ID']=1
184             wmsplit['DIGIUP15_ID']=1
185             wmsplit['RECOUP15_ID']=1
186             wmsplit['TTbar_13_ID']=1
187             wmsplit['SingleMuPt10FS_ID']=1
188             wmsplit['TTbarFS_ID']=1
189             wmsplit['RECODR2_50nsreHLT']=1
190             wmsplit['RECODR2_25nsreHLT']=1
191             wmsplit['RECODR2_2016reHLT']=5
192             wmsplit['RECODR2_2016reHLT_skimSingleMu']=5
193             wmsplit['RECODR2_2016reHLT_skimDoubleEG']=5
194             wmsplit['RECODR2_2016reHLT_skimMuonEG']=5
195             wmsplit['RECODR2_2016reHLT_skimJetHT']=5
196             wmsplit['RECODR2_2016reHLT_skimMuOnia']=5
197             wmsplit['HLTDR2_50ns']=1
198             wmsplit['HLTDR2_25ns']=1
199             wmsplit['HLTDR2_2016']=1
200             wmsplit['Hadronizer']=1
201             wmsplit['DIGIUP15']=5
202             wmsplit['RECOUP15']=5
203             wmsplit['RECOAODUP15']=5
204             wmsplit['DBLMINIAODMCUP15NODQM']=5
205 
206 
207             #import pprint
208             #pprint.pprint(wmsplit)
209         except:
210             print "Not set up for step splitting"
211             wmsplit={}
212 
213         acqEra=False
214         for (n,dir) in directories.items():
215             chainDict=copy.deepcopy(self.defaultChain)
216             print "inspecting",dir
217             nextHasDSInput=None
218             for (x,s) in mReader.workFlowSteps.items():
219                 #x has the format (num, prefix)
220                 #s has the format (num, name, commands, stepList)
221                 if x[0]==n:
222                     #print "found",n,s[3]
223                     #chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
224                     index=0
225                     splitForThisWf=None
226                     thisLabel=self.speciallabel
227                     #if 'HARVESTGEN' in s[3]:
228                     if len( [step for step in s[3] if "HARVESTGEN" in step] )>0:
229                         chainDict['TimePerEvent']=0.01
230                         thisLabel=thisLabel+"_gen"
231                     # for double miniAOD test
232                     if len( [step for step in s[3] if "DBLMINIAODMCUP15NODQM" in step] )>0:
233                         thisLabel=thisLabel+"_dblMiniAOD"
234                     processStrPrefix=''
235                     setPrimaryDs=None
236                     for step in s[3]:
237 
238                         if 'INPUT' in step or (not isinstance(s[2][index],str)):
239                             nextHasDSInput=s[2][index]
240 
241                         else:
242 
243                             if (index==0):
244                                 #first step and not input -> gen part
245                                 chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultScratch))
246                                 try:
247                                     chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
248                                 except:
249                                     print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run, or the cfg was not created"
250                                     return -15
251 
252                                 chainDict['nowmTasklist'][-1]['PrimaryDataset']='RelVal'+s[1].split('+')[0]
253                                 if not '--relval' in s[2][index]:
254                                     print 'Impossible to create task from scratch without splitting information with --relval'
255                                     return -12
256                                 else:
257                                     arg=s[2][index].split()
258                                     ns=map(int,arg[arg.index('--relval')+1].split(','))
259                                     chainDict['nowmTasklist'][-1]['RequestNumEvents'] = ns[0]
260                                     chainDict['nowmTasklist'][-1]['EventsPerJob'] = ns[1]
261                                 if 'FASTSIM' in s[2][index] or '--fast' in s[2][index]:
262                                     thisLabel+='_FastSim'
263                                 if 'lhe' in s[2][index]:
264                                     chainDict['nowmTasklist'][-1]['LheInputFiles'] =True
265 
266                             elif nextHasDSInput:
267                                 chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultInput))
268                                 try:
269                                     chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
270                                 except:
271                                     print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run, or the cfg was not created"
272                                     return -15
273                                 chainDict['nowmTasklist'][-1]['InputDataset']=nextHasDSInput.dataSet
274                                 splitForThisWf=nextHasDSInput.split
275                                 chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
276                                 if step in wmsplit:
277                                     chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
278                                 # get the run numbers or #events
279                                 if len(nextHasDSInput.run):
280                                     chainDict['nowmTasklist'][-1]['RunWhitelist']=nextHasDSInput.run
281                                 if len(nextHasDSInput.ls):
282                                     chainDict['nowmTasklist'][-1]['LumiList']=nextHasDSInput.ls
283                                 #print "what is s",s[2][index]
284                                 if '--data' in s[2][index] and nextHasDSInput.label:
285                                     thisLabel+='_RelVal_%s'%nextHasDSInput.label
286                                 if 'filter' in chainDict['nowmTasklist'][-1]['nowmIO']:
287                                     print "This has an input DS and a filter sequence: very likely to be the PyQuen sample"
288                                     processStrPrefix='PU_'
289                                     setPrimaryDs = 'RelVal'+s[1].split('+')[0]
290                                     if setPrimaryDs:
291                                         chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
292                                 nextHasDSInput=None
293                             else:
294                                 #not first step and no inputDS
295                                 chainDict['nowmTasklist'].append(copy.deepcopy(self.defaultTask))
296                                 try:
297                                     chainDict['nowmTasklist'][-1]['nowmIO']=json.loads(open('%s/%s.io'%(dir,step)).read())
298                                 except:
299                                     print "Failed to find",'%s/%s.io'%(dir,step),". The workflows were probably not run, or the cfg was not created"
300                                     return -15
301                                 if splitForThisWf:
302                                     chainDict['nowmTasklist'][-1]['LumisPerJob']=splitForThisWf
303                                 if step in wmsplit:
304                                     chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit[step]
305 
306                             # change LumisPerJob for Hadronizer steps.
307                             if 'Hadronizer' in step:
308                                 chainDict['nowmTasklist'][-1]['LumisPerJob']=wmsplit['Hadronizer']
309 
310                             #print step
311                             chainDict['nowmTasklist'][-1]['TaskName']=step
312                             if setPrimaryDs:
313                                 chainDict['nowmTasklist'][-1]['PrimaryDataset']=setPrimaryDs
314                             chainDict['nowmTasklist'][-1]['ConfigCacheID']='%s/%s.py'%(dir,step)
315                             chainDict['nowmTasklist'][-1]['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] # copy to the proper parameter name
316                             chainDict['GlobalTag']=chainDict['nowmTasklist'][-1]['nowmIO']['GT'] #set in general to the last one of the chain
317                             if 'pileup' in chainDict['nowmTasklist'][-1]['nowmIO']:
318                                 chainDict['nowmTasklist'][-1]['MCPileup']=chainDict['nowmTasklist'][-1]['nowmIO']['pileup']
319                             if '--pileup ' in s[2][index]: # catch --pileup (scenario) and not --pileup_ (dataset to be mixed) => works also making PRE-MIXed dataset
320                                 processStrPrefix='PU_' # take care of pu overlay done with GEN-SIM mixing
321                                 if ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('25ns') > 0:
322                                     processStrPrefix='PU25ns_'
323                                 elif ( s[2][index].split()[ s[2][index].split().index('--pileup')+1 ] ).find('50ns') > 0:
324                                     processStrPrefix='PU50ns_'
325                             if 'DIGIPREMIX_S2' in s[2][index]: # take care of pu overlay done with DIGI mixing of premixed events
326                                 if s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('25ns') > 0:
327                                     processStrPrefix='PUpmx25ns_'
328                                 elif s[2][index].split()[ s[2][index].split().index('--pileup_input')+1 ].find('50ns') > 0:
329                                     processStrPrefix='PUpmx50ns_'
330 
331                             if acqEra:
332                                 #chainDict['AcquisitionEra'][step]=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
333                                 chainDict['AcquisitionEra'][step]=chainDict['CMSSWVersion']
334                                 chainDict['ProcessingString'][step]=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','')+thisLabel
335                             else:
336                                 #chainDict['nowmTasklist'][-1]['AcquisitionEra']=(chainDict['CMSSWVersion']+'-PU_'+chainDict['nowmTasklist'][-1]['GlobalTag']).replace('::All','')+thisLabel
337                                 chainDict['nowmTasklist'][-1]['AcquisitionEra']=chainDict['CMSSWVersion']
338                                 chainDict['nowmTasklist'][-1]['ProcessingString']=processStrPrefix+chainDict['nowmTasklist'][-1]['GlobalTag'].replace('::All','')+thisLabel
339 
340                             # specify different ProcessingString for double miniAOD dataset
341                             if ('DBLMINIAODMCUP15NODQM' in step):
342                                 chainDict['nowmTasklist'][-1]['ProcessingString']=chainDict['nowmTasklist'][-1]['ProcessingString']+'_miniAOD'
343 
344                             if( chainDict['nowmTasklist'][-1]['Multicore'] ):
345                                 # the scaling factor of 1.2GB / thread is empirical and measured on a SECOND round of tests with PU samples
346                                 # the number of threads is NO LONGER assumed to be the same for all tasks
347                                 # https://hypernews.cern.ch/HyperNews/CMS/get/edmFramework/3509/1/1/1.html
348                                 # now change to 1.5GB / additional thread according to discussion:
349                                 # https://hypernews.cern.ch/HyperNews/CMS/get/relval/4817/1/1.html
350 #                               chainDict['nowmTasklist'][-1]['Memory'] = 3000 + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 )*1500
351                                 chainDict['nowmTasklist'][-1]['Memory'] = self.memoryOffset + int( chainDict['nowmTasklist'][-1]['Multicore'] -1 ) * self.memPerCore
352 
353                         index+=1
354                     #end of loop through steps
355                     chainDict['RequestString']='RV'+chainDict['CMSSWVersion']+s[1].split('+')[0]
356                     if processStrPrefix or thisLabel:
357                         chainDict['RequestString']+='_'+processStrPrefix+thisLabel
358 
359 
360 
361             #wrap up for this one
362             import pprint
363             #print 'wrapping up'
364             #pprint.pprint(chainDict)
365             #loop on the task list
366             for i_second in reversed(range(len(chainDict['nowmTasklist']))):
367                 t_second=chainDict['nowmTasklist'][i_second]
368                 #print "t_second taskname", t_second['TaskName']
369                 if 'primary' in t_second['nowmIO']:
370                     #print t_second['nowmIO']['primary']
371                     primary=t_second['nowmIO']['primary'][0].replace('file:','')
372                     for i_input in reversed(range(0,i_second)):
373                         t_input=chainDict['nowmTasklist'][i_input]
374                         for (om,o) in t_input['nowmIO'].items():
375                             if primary in o:
376                                 #print "found",primary,"produced by",om,"of",t_input['TaskName']
377                                 t_second['InputTask'] = t_input['TaskName']
378                                 t_second['InputFromOutputModule'] = om
379                                 #print 't_second',pprint.pformat(t_second)
380                                 if t_second['TaskName'].startswith('HARVEST'):
381                                     chainDict.update(copy.deepcopy(self.defaultHarvest))
382                                     chainDict['DQMConfigCacheID']=t_second['ConfigCacheID']
383                                     ## the info is not in the task-specific dict but in the general dict
384                                     #t_input.update(copy.deepcopy(self.defaultHarvest))
385                                     #t_input['DQMConfigCacheID']=t_second['ConfigCacheID']
386                                 break
387 
388             ## there is in fact only one acquisition era
389             #if len(set(chainDict['AcquisitionEra'].values()))==1:
390             #    print "setting only one acq"
391             if acqEra:
392                 chainDict['AcquisitionEra'] = chainDict['AcquisitionEra'].values()[0]
393 
394             ## clean things up now
395             itask=0
396             if self.keep:
397                 for i in self.keep:
398                     if type(i)==int and i < len(chainDict['nowmTasklist']):
399                         chainDict['nowmTasklist'][i]['KeepOutput']=True
400             for (i,t) in enumerate(chainDict['nowmTasklist']):
401                 if t['TaskName'].startswith('HARVEST'):
402                     continue
403                 if not self.keep:
404                     t['KeepOutput']=True
405                 elif t['TaskName'] in self.keep:
406                     t['KeepOutput']=True
407                 t.pop('nowmIO')
408                 itask+=1
409                 chainDict['Task%d'%(itask)]=t
410 
411 
412             ##
413 
414 
415             ## provide the number of tasks
416             chainDict['TaskChain']=itask #len(chainDict['nowmTasklist'])
417 
418             chainDict.pop('nowmTasklist')
419             self.chainDicts[n]=chainDict
420 
421 
422         return 0
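
For orientation, the dictionary that prepare() stores in self.chainDicts for one workflow ends up with roughly the shape sketched below. All names and values are invented for illustration; real entries carry the full set of defaultChain/defaultTask keys, and each ConfigCacheID still holds a cfg file path until upload() replaces it with a CouchDB document id.

# Illustrative shape only (hypothetical workflow, task names, tag and paths).
example_chainDict = {
    "RequestType": "TaskChain",
    "RequestString": "RVCMSSW_9_0_0TTbar_13",      # 'RV' + CMSSWVersion + first part of the workflow name (+ prefixes/labels)
    "GlobalTag": "90X_example_GT_v1",              # copied from the last task's nowmIO 'GT'
    "TaskChain": 3,                                # number of non-HARVEST tasks
    "Task1": {"TaskName": "TTbar_13",
              "ConfigCacheID": "1234_TTbar_13/TTbar_13.py"},
    "Task2": {"TaskName": "DIGIUP15",
              "InputTask": "TTbar_13",
              "InputFromOutputModule": "RAWSIMoutput",
              "ConfigCacheID": "1234_TTbar_13/DIGIUP15.py"},
    "Task3": {"TaskName": "RECOUP15",
              "InputTask": "DIGIUP15",
              "InputFromOutputModule": "FEVTDEBUGHLToutput",
              "ConfigCacheID": "1234_TTbar_13/RECOUP15.py"},
    # when the workflow ends in a HARVEST step, defaultHarvest is merged into the top level:
    "EnableHarvesting": "True",
    "DQMConfigCacheID": "1234_TTbar_13/HARVESTUP15.py",
}
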
def MatrixInjector.MatrixInjector.submit (   self)

Definition at line 470 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.testMode, and MatrixInjector.MatrixInjector.wmagent.

471     def submit(self):
472         try:
473             from modules.wma import makeRequest,approveRequest
474             from wmcontrol import random_sleep
475             print '\n\tFound wmcontrol\n'
476         except:
477             print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
478             if not self.testMode:
479                 print '\n\t QUIT\n'
480                 sys.exit(-17)
481 
482         import pprint
483         for (n,d) in self.chainDicts.items():
484             if self.testMode:
485                 print "Only viewing request",n
486                 print pprint.pprint(d)
487             else:
488                 #submit to wmagent each dict
489                 print "For eyes before submitting",n
490                 print pprint.pprint(d)
491                 print "Submitting",n,"..........."
492                 workFlow=makeRequest(self.wmagent,d,encodeDict=True)
493                 approveRequest(self.wmagent,workFlow)
494                 print "...........",n,"submitted"
495                 random_sleep()
496 
497 
498 
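Taken together, the public methods are meant to be called in sequence. Below is a sketch of that flow in test mode, assuming injector was built as in the constructor example above and that mReader and directories come from the usual MatrixReader machinery in runTheMatrix.py (both are placeholders here):

# prepare() returns 0 on success and a negative code if a step's .io file is missing.
if injector.prepare(mReader, directories) == 0:
    injector.upload()   # in test mode uploadConf() only fakes the CouchDB upload
    injector.submit()   # in test mode each request is pretty-printed, nothing is injected
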
def MatrixInjector.MatrixInjector.upload (   self)

Definition at line 450 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.uploadConf().

451     def upload(self):
452         for (n,d) in self.chainDicts.items():
453             for it in d:
454                 if it.startswith("Task") and it!='TaskChain':
455                     #upload
456                     couchID=self.uploadConf(d[it]['ConfigCacheID'],
457                                             str(n)+d[it]['TaskName'],
458                                             d['CouchURL']
459                                             )
460                     print d[it]['ConfigCacheID']," uploaded to couchDB for",str(n),"with ID",couchID
461                     d[it]['ConfigCacheID']=couchID
462                 if it =='DQMConfigCacheID':
463                     couchID=self.uploadConf(d['DQMConfigCacheID'],
464                                             str(n)+'harvesting',
465                                             d['CouchURL']
466                                             )
467                     print d['DQMConfigCacheID'],"uploaded to couchDB for",str(n),"with ID",couchID
468                     d['DQMConfigCacheID']=couchID
469 
def MatrixInjector.MatrixInjector.uploadConf (   self, filePath, label, where )

Definition at line 423 of file MatrixInjector.py.

References MatrixInjector.MatrixInjector.couchCache, TmCcu.count, TmModule.count, TmApvPair.count, TmPsu.count, MatrixInjector.MatrixInjector.count, ValidationMisalignedTracker.count, SiStripDetSummary::Values.count, MD5.count, MatrixInjector.MatrixInjector.group, ElectronLikelihoodCategoryData.label, entry< T >.label, SiPixelFedFillerWordEventNumber.label, TtEvent::HypoClassKeyStringToEnum.label, HcalLutSet.label, DTDQMHarvesting.DTDQMHarvesting.label, DTVDriftMeanTimerCalibration.DTVDriftMeanTimerCalibration.label, DTVDriftSegmentCalibration.DTVDriftSegmentCalibration.label, DTDQMValidation.DTDQMValidation.label, DTAnalysisResiduals.DTAnalysisResiduals.label, L1GtBoardTypeStringToEnum.label, DTResidualCalibration.DTResidualCalibration.label, DTTTrigValid.DTTTrigValid.label, DTTTrigResidualCorr.DTTTrigResidualCorr.label, MatrixInjector.MatrixInjector.label, L1GtPsbQuadStringToEnum.label, ValidationMisalignedTracker.label, L1GtConditionTypeStringToEnum.label, L1GtConditionCategoryStringToEnum.label, PhysicsTools::Calibration::Comparator.label, MatrixInjector.MatrixInjector.testMode, EcalTPGParamReaderFromDB.user, popcon::RpcDataV.user, popcon::RpcDataT.user, popcon::RPCObPVSSmapData.user, popcon::RpcObGasData.user, popcon::RpcDataGasMix.user, popcon::RpcDataS.user, popcon::RpcDataFebmap.user, popcon::RpcDataUXC.user, popcon::RpcDataI.user, MatrixInjector.MatrixInjector.user, and conddblib.TimeType.user.

Referenced by MatrixInjector.MatrixInjector.upload().

424     def uploadConf(self,filePath,label,where):
425         labelInCouch=self.label+'_'+label
426         cacheName=filePath.split('/')[-1]
427         if self.testMode:
428             self.count+=1
429             print '\tFake upload of',filePath,'to couch with label',labelInCouch
430             return self.count
431         else:
432             try:
433                 from modules.wma import upload_to_couch,DATABASE_NAME
434             except:
435                 print '\n\tUnable to find wmcontrol modules. Please include it in your python path\n'
436                 print '\n\t QUIT\n'
437                 sys.exit(-16)
438 
439             if cacheName in self.couchCache:
440                 print "Not re-uploading",filePath,"to",where,"for",label
441                 cacheId=self.couchCache[cacheName]
442             else:
443                 print "Loading",filePath,"to",where,"for",label
444                 ## totally fork the upload to couch to prevent cross loading of process configurations
445                 pool = multiprocessing.Pool(1)
446                 cacheIds = pool.map( upload_to_couch_oneArg, [(filePath,labelInCouch,self.user,self.group,where)] )
447                 cacheId = cacheIds[0]
448                 self.couchCache[cacheName]=cacheId
449             return cacheId
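
The pool.map() call above hands its worker a single argument, so the five upload parameters are packed into one tuple and the upload runs in a forked process to keep one request's configuration from leaking into the next. A sketch of what the module-level upload_to_couch_oneArg wrapper is expected to do, assuming modules.wma.upload_to_couch takes the same five arguments in this order (not the verbatim helper):

def upload_to_couch_oneArg(arguments):
    """Unpack (filePath, labelInCouch, user, group, where) and upload in the forked worker."""
    from modules.wma import upload_to_couch  # wmcontrol must be on the PYTHONPATH
    filePath, labelInCouch, user, group, where = arguments
    return upload_to_couch(filePath, labelInCouch, user, group, where)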

Member Data Documentation

MatrixInjector.MatrixInjector.chainDicts

Definition at line 159 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.couch

Definition at line 69 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.couchCache

Definition at line 71 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.count

Definition at line 39 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.DbsUrl

Definition at line 61 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.defaultChain

Definition at line 88 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultHarvest

Definition at line 115 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultInput

Definition at line 135 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultScratch

Definition at line 122 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.defaultTask

Definition at line 146 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.dqmgui

Definition at line 41 of file MatrixInjector.py.

MatrixInjector.MatrixInjector.group

Definition at line 73 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.keep

Definition at line 51 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.label

Definition at line 74 of file MatrixInjector.py.

Referenced by Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor._sort_list(), python.rootplot.root2matplotlib.Hist.bar(), python.rootplot.root2matplotlib.Hist.barh(), python.rootplot.root2matplotlib.Hist.errorbar(), python.rootplot.root2matplotlib.Hist.errorbarh(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.foundIn(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.fullFilename(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.inputEventContent(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.outputEventContent(), core.TriggerMatchAnalyzer.TriggerMatchAnalyzer.process(), Vispa.Plugins.ConfigEditor.ToolDataAccessor.ToolDataAccessor.properties(), Vispa.Plugins.EdmBrowser.EdmDataAccessor.EdmDataAccessor.properties(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.properties(), Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.readConnections(), core.AutoHandle.AutoHandle.ReallyLoad(), Vispa.Plugins.ConfigEditor.ToolDataAccessor.ToolDataAccessor.updateProcess(), MatrixInjector.MatrixInjector.uploadConf(), and Vispa.Plugins.ConfigEditor.ConfigDataAccessor.ConfigDataAccessor.usedBy().

MatrixInjector.MatrixInjector.memoryOffset

Definition at line 52 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.memPerCore

Definition at line 53 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.speciallabel

Definition at line 75 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.prepare().

MatrixInjector.MatrixInjector.testMode

Definition at line 49 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.user

Definition at line 72 of file MatrixInjector.py.

Referenced by cmsPerfSuite.PerfSuite.optionParse(), dataset.BaseDataset.printInfo(), production_tasks.CheckDatasetExists.run(), production_tasks.GenerateMask.run(), production_tasks.SourceCFG.run(), production_tasks.FullCFG.run(), production_tasks.MonitorJobs.run(), production_tasks.CleanJobFiles.run(), and MatrixInjector.MatrixInjector.uploadConf().

MatrixInjector.MatrixInjector.version

Definition at line 50 of file MatrixInjector.py.

Referenced by python.rootplot.argparse._VersionAction.__call__(), validation.Sample.datasetpattern(), validation.Sample.filename(), argparse.ArgumentParser.format_version(), and python.rootplot.argparse.ArgumentParser.format_version().

MatrixInjector.MatrixInjector.wmagent

Definition at line 42 of file MatrixInjector.py.

Referenced by MatrixInjector.MatrixInjector.submit().