
uploadConditions Namespace Reference

Classes

class  ConditionsUploader
 
class  HTTP
 
class  HTTPError
 

Functions

def addToTarFile (tarFile, fileobj, arcname)
 
def get_directory_to_pull_to (default_directory, commit_hash)
 
def get_local_commit_hash ()
 
def get_version_info (url)
 
def getCredentials (options)
 
def getInput (default, prompt='')
 
def getInputChoose (optionsList, default, prompt='')
 
def getInputRepeat (prompt='')
 
def getInputWorkflow (prompt='')
 
def main ()
 
def parse_arguments ()
 
def pull_code_from_git (target_directory, repository_url, hash)
 
def re_upload (options)
 
def run_in_shell (*popenargs, **kwargs)
 
def run_upload (**parameters)
 
def runWizard (basename, dataFilename, metadataFilename)
 
def testTier0Upload ()
 
def upload (options, arguments)
 
def uploadAllFiles (options, arguments)
 
def uploadTier0Files (filenames, username, password, cookieFileName=None)
 

Detailed Description

Joshua Dawes - CERN, CMS - The University of Manchester

Upload script wrapper - controls the automatic update system.

Note: the name of this file follows a different convention from the others because it must match the name of the current upload script.

Takes user arguments and passes them to the main upload module CondDBFW.uploads, once the correct version exists.

1. Ask the server corresponding to the database we're uploading to which version of CondDBFW it has (query the /conddbfw_version/ URL).
2. Decide which directory we can write to: either the current local directory or /tmp/random_string/.
3. Pull the commit returned by the server into the directory from step 2.
4. Invoke the CondDBFW.uploads module with the arguments given to this script.
Script that uploads to the new CMS conditions uploader.
Adapted to the new infrastructure from v6 of the upload.py script for the DropBox, by Miguel Ojeda.

Function Documentation

◆ addToTarFile()

def uploadConditions.addToTarFile (   tarFile,
  fileobj,
  arcname 
)

Definition at line 439 of file uploadConditions.py.

439 def addToTarFile(tarFile, fileobj, arcname):
440  tarInfo = tarFile.gettarinfo(fileobj = fileobj, arcname = arcname)
441  tarInfo.mode = 0o400
442  tarInfo.uid = tarInfo.gid = tarInfo.mtime = 0
443  tarInfo.uname = tarInfo.gname = 'root'
444  tarFile.addfile(tarInfo, fileobj)
445 

Referenced by uploadConditions.ConditionsUploader.uploadFile().
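The point of this normalization is reproducibility: zeroing the owner, mode, and mtime makes the tar archive byte-for-byte identical across machines, which matters because the upload later renames the file after its hash. A minimal stand-alone mirror of the function (file names here are illustrative):

```python
import os
import tarfile
import tempfile

def add_to_tar_file(tar_file, fileobj, arcname):
    # Mirror of uploadConditions.addToTarFile: normalize mode, owner and
    # mtime so the resulting archive (and hence its hash) is reproducible.
    tar_info = tar_file.gettarinfo(fileobj=fileobj, arcname=arcname)
    tar_info.mode = 0o400
    tar_info.uid = tar_info.gid = tar_info.mtime = 0
    tar_info.uname = tar_info.gname = 'root'
    tar_file.addfile(tar_info, fileobj)

with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "data.db")
    with open(src, "wb") as f:
        f.write(b"example payload")
    archive = os.path.join(tmp, "upload.tar")
    with tarfile.open(archive, "w") as tar, open(src, "rb") as fileobj:
        add_to_tar_file(tar, fileobj, "data.db")
    with tarfile.open(archive) as tar:
        member = tar.getmember("data.db")

print(member.mode, member.uid, member.uname)
```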

◆ get_directory_to_pull_to()

def uploadConditions.get_directory_to_pull_to (   default_directory,
  commit_hash 
)
Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.

Definition at line 73 of file uploadConditions.py.

73 def get_directory_to_pull_to(default_directory, commit_hash):
74  """
75  Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.
76  """
77  # try to write a file (and then delete it)
78  try:
79  handle = open(os.path.join(default_directory, "test_file"), "w")
80  handle.write("test")
81  handle.close()
82  os.remove(os.path.join(default_directory, "test_file"))
83  sys.path.insert(0, default_directory)
84  return default_directory
85  except IOError as io:
86  # cannot write to the default directory, so set up a directory in /tmp/
87  new_path = os.path.join("/tmp", commit_hash[0:10])
88  if not(os.path.exists(new_path)):
89  os.mkdir(new_path)
90  sys.path.insert(0, new_path)
91  return new_path
92  else:
93  # for now, fail
94  exit("Can't find anywhere to pull the new code base to.")
95 

References beamvalidation.exit().

Referenced by parse_arguments().
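The write probe at the heart of this function can be isolated as below; this sketch skips the sys.path manipulation and just answers the question "can we pull code here?".

```python
import os
import tempfile

def writable(directory):
    # Same probe as get_directory_to_pull_to: try to create and delete a
    # test file; on failure the caller falls back to a /tmp/ directory.
    probe = os.path.join(directory, "test_file")
    try:
        with open(probe, "w") as handle:
            handle.write("test")
        os.remove(probe)
        return True
    except IOError:
        return False

tmp = tempfile.mkdtemp()
print(writable(tmp))
```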

◆ get_local_commit_hash()

def uploadConditions.get_local_commit_hash ( )
Gets the commit hash used by the local repository CondDBFW/.git/.

Definition at line 50 of file uploadConditions.py.

50 def get_local_commit_hash():
51  """
52  Gets the commit hash used by the local repository CondDBFW/.git/.
53  """
54  directory = os.path.abspath("CondDBFW")
55 
56  # get the commit hash of the code in `directory`
57  # by reading the .commit_hash file
58  try:
59  commit_hash_file_handle = open(os.path.join(directory, ".commit_hash"), "r")
60  commit_hash = commit_hash_file_handle.read().strip()
61 
62  # validate length of the commit hash
63  if len(commit_hash) != 40:
64  print("Commit hash found is not valid. Must be 40 characters long.")
65  exit()
66 
67  #commit_hash = run_in_shell("git --git-dir=%s rev-parse HEAD" % (os.path.join(directory, ".git")), shell=True).strip()
68 
69  return commit_hash
70  except Exception:
71  return None
72 

References beamvalidation.exit(), print(), and digitizers_cfi.strip.

Referenced by parse_arguments().
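The validity check on the .commit_hash file only tests the length; a slightly stricter stand-alone sketch also requires hex digits, which every full git SHA-1 object name (40 characters) satisfies:

```python
import re

def looks_like_commit_hash(text):
    # get_local_commit_hash() only checks len(...) == 40; this sketch also
    # insists on hexadecimal digits.
    return re.fullmatch(r"[0-9a-f]{40}", text.strip()) is not None

print(looks_like_commit_hash("2d1b8c0a" * 5))
```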

◆ get_version_info()

def uploadConditions.get_version_info (   url)
Queries the server-side for the commit hash it is currently using.
Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.

Definition at line 32 of file uploadConditions.py.

32 def get_version_info(url):
33  """
34  Queries the server-side for the commit hash it is currently using.
35  Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.
36  """
37  request = pycurl.Curl()
38  request.setopt(request.CONNECTTIMEOUT, 60)
39  user_agent = "User-Agent: ConditionWebServices/1.0 python/%d.%d.%d PycURL/%s" % (sys.version_info[ :3 ] + (pycurl.version_info()[1],))
40  request.setopt(request.USERAGENT, user_agent)
41  # we don't need to verify who signed the certificate or who the host is
42  request.setopt(request.SSL_VERIFYPEER, 0)
43  request.setopt(request.SSL_VERIFYHOST, 0)
44  response_buffer = StringIO()
45  request.setopt(request.WRITEFUNCTION, response_buffer.write)
46  request.setopt(request.URL, url + "conddbfw_version/")
47  request.perform()
48  return json.loads(response_buffer.getvalue())
49 

Referenced by parse_arguments().
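The same request can be expressed with the standard library, which makes the two moving parts visible: the /conddbfw_version/ endpoint appended to the server URL, and the ConditionWebServices User-Agent convention. This sketch only builds the request; it omits the PycURL version component and makes no network call:

```python
import sys
from urllib.request import Request

def build_version_request(base_url):
    # Mirrors what get_version_info() configures through pycurl; the real
    # script also disables SSL peer/host verification, omitted here.
    url = base_url.rstrip("/") + "/conddbfw_version/"
    user_agent = "ConditionWebServices/1.0 python/%d.%d.%d" % sys.version_info[:3]
    return Request(url, headers={"User-Agent": user_agent})

req = build_version_request("https://cms-conddb.cern.ch/cmsDbCondUpload/")
print(req.full_url)
```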

◆ getCredentials()

def uploadConditions.getCredentials (   options)

Definition at line 627 of file uploadConditions.py.

627 def getCredentials( options ):
628 
629  username = None
630  password = None
631  netrcPath = None
632  if authPathEnvVar in os.environ:
633  authPath = os.environ[authPathEnvVar]
634  netrcPath = os.path.join(authPath,'.netrc')
635  if options.authPath is not None:
636  netrcPath = os.path.join( options.authPath,'.netrc' )
637  try:
638  # Try to find the netrc entry
639  (username, account, password) = netrc.netrc( netrcPath ).authenticators(options.netrcHost)
640  except Exception:
641  # netrc entry not found, ask for the username and password
642  logging.info(
643  'netrc entry "%s" not found: if you wish not to have to retype your password, you can add an entry in your .netrc file. However, beware of the risks of having your password stored as plaintext.',
644  options.netrcHost)
645 
646  # Try to get a default username
647  defaultUsername = getpass.getuser()
648  if defaultUsername is None:
649  defaultUsername = '(not found)'
650 
651  username = getInput(defaultUsername, '\nUsername [%s]: ' % defaultUsername)
652  password = getpass.getpass('Password: ')
653 
654  return username, password
655 
656 

References getInput().

Referenced by uploadAllFiles().
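The netrc lookup can be exercised in isolation; the machine name below matches the key the script uses, while the username and password are hypothetical stand-ins for a real entry:

```python
import netrc
import os
import tempfile

# Build a throwaway .netrc with a hypothetical entry; in real use the file
# lives under $HOME (or options.authPath) and the machine name comes from
# options.netrcHost.
tmp = tempfile.mkdtemp()
netrc_path = os.path.join(tmp, ".netrc")
with open(netrc_path, "w") as f:
    f.write("machine ConditionUploader login jdoe password s3cret\n")
os.chmod(netrc_path, 0o600)  # netrc files must not be world-readable

username, account, password = netrc.netrc(netrc_path).authenticators("ConditionUploader")
print(username)
```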

◆ getInput()

def uploadConditions.getInput (   default,
  prompt = '' 
)
Like raw_input() but with a default and automatic strip().

Definition at line 52 of file uploadConditions.py.

52 def getInput(default, prompt = ''):
53  '''Like raw_input() but with a default and automatic strip().
54  '''
55 
56  answer = raw_input(prompt)
57  if answer:
58  return answer.strip()
59 
60  return default.strip()
61 
62 

Referenced by getCredentials(), getInputChoose(), getInputWorkflow(), runWizard(), and uploadAllFiles().

◆ getInputChoose()

def uploadConditions.getInputChoose (   optionsList,
  default,
  prompt = '' 
)
Makes the user choose from a list of options.

Definition at line 76 of file uploadConditions.py.

76 def getInputChoose(optionsList, default, prompt = ''):
77  '''Makes the user choose from a list of options.
78  '''
79 
80  while True:
81  index = getInput(default, prompt)
82 
83  try:
84  return optionsList[int(index)]
85  except ValueError:
86  logging.error('Please specify an index of the list (i.e. integer).')
87  except IndexError:
88  logging.error('The index you provided is not in the given list.')
89 
90 

References getInput(), and createfilelist.int.

Referenced by runWizard().
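With the input function made injectable, the retry loop is easy to demonstrate: a non-integer answer and an out-of-range index are both rejected before a valid index is accepted. This is a stand-alone sketch, not the script's own code:

```python
import logging

def get_input(default, prompt='', reader=input):
    # getInput() with the reader injectable so the loop can be driven
    # without a terminal.
    answer = reader(prompt)
    return answer.strip() if answer else default.strip()

def get_input_choose(options_list, default, prompt='', reader=input):
    # getInputChoose(): keep asking until the answer is a usable index.
    while True:
        index = get_input(default, prompt, reader)
        try:
            return options_list[int(index)]
        except ValueError:
            logging.error('Please specify an index of the list (i.e. integer).')
        except IndexError:
            logging.error('The index you provided is not in the given list.')

answers = iter(['not-a-number', '99', '1'])
chosen = get_input_choose(['offline', 'hlt', 'express'], '0',
                          reader=lambda prompt: next(answers))
print(chosen)
```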

◆ getInputRepeat()

def uploadConditions.getInputRepeat (   prompt = '')
Like raw_input() but repeats if nothing is provided and automatic strip().

Definition at line 91 of file uploadConditions.py.

91 def getInputRepeat(prompt = ''):
92  '''Like raw_input() but repeats if nothing is provided and automatic strip().
93  '''
94 
95  while True:
96  answer = raw_input(prompt)
97  if answer:
98  return answer.strip()
99 
100  logging.error('You need to provide a value.')
101 
102 

Referenced by runWizard().

◆ getInputWorkflow()

def uploadConditions.getInputWorkflow (   prompt = '')
Like getInput() but tailored to get target workflows (synchronization options).

Definition at line 63 of file uploadConditions.py.

63 def getInputWorkflow(prompt = ''):
64  '''Like getInput() but tailored to get target workflows (synchronization options).
65  '''
66 
67  while True:
68  workflow = getInput(defaultWorkflow, prompt)
69 
70  if workflow in frozenset(['offline', 'hlt', 'express', 'prompt', 'pcl']):
71  return workflow
72 
73  logging.error('Please specify one of the allowed workflows. See above for the explanation on each of them.')
74 
75 

References getInput().

◆ main()

def uploadConditions.main ( )
Entry point.

Definition at line 897 of file uploadConditions.py.

897 def main():
898  '''Entry point.
899  '''
900 
901  parser = optparse.OptionParser(usage =
902  'Usage: %prog [options] <file> [<file> ...]\n'
903  )
904 
905  parser.add_option('-d', '--debug',
906  dest = 'debug',
907  action="store_true",
908  default = False,
909  help = 'Switch on printing debug information. Default: %default',
910  )
911 
912  parser.add_option('-b', '--backend',
913  dest = 'backend',
914  default = defaultBackend,
915  help = 'dropBox\'s backend to upload to. Default: %default',
916  )
917 
918  parser.add_option('-H', '--hostname',
919  dest = 'hostname',
920  default = defaultHostname,
921  help = 'dropBox\'s hostname. Default: %default',
922  )
923 
924  parser.add_option('-u', '--urlTemplate',
925  dest = 'urlTemplate',
926  default = defaultUrlTemplate,
927  help = 'dropBox\'s URL template. Default: %default',
928  )
929 
930  parser.add_option('-f', '--temporaryFile',
931  dest = 'temporaryFile',
932  default = defaultTemporaryFile,
933  help = 'Temporary file that will be used to store the first tar file. Note that it will then be moved to a file named after its hash, so in fact two temporary files will be created. Default: %default',
934  )
935 
936  parser.add_option('-n', '--netrcHost',
937  dest = 'netrcHost',
938  default = defaultNetrcHost,
939  help = 'The netrc host (machine) from where the username and password will be read. Default: %default',
940  )
941 
942  parser.add_option('-a', '--authPath',
943  dest = 'authPath',
944  default = None,
945  help = 'The path of the .netrc file for the authentication. Default: $HOME',
946  )
947 
948  parser.add_option('-r', '--reUpload',
949  dest = 'reUpload',
950  default = None,
951  help = 'The hash of the file to upload again.',
952  )
953 
954  (options, arguments) = parser.parse_args()
955 
956  logLevel = logging.INFO
957  if options.debug:
958  logLevel = logging.DEBUG
959  logging.basicConfig(
960  format = '[%(asctime)s] %(levelname)s: %(message)s',
961  level = logLevel,
962  )
963 
964  if len(arguments) < 1:
965  if options.reUpload is None:
966  parser.print_help()
967  return -2
968  else:
969  return re_upload(options)
970  if options.reUpload is not None:
971  print("ERROR: option -r can't be specified on a new file upload.")
972  return -2
973 
974  return upload(options, arguments)
975 

References print(), re_upload(), and upload().

◆ parse_arguments()

def uploadConditions.parse_arguments ( )

Definition at line 179 of file uploadConditions.py.

179 def parse_arguments():
180  # read in command line arguments, and build metadata dictionary from them
181  parser = argparse.ArgumentParser(prog="cmsDbUpload client", description="CMS Conditions Upload Script in CondDBFW.")
182 
183  parser.add_argument("--sourceDB", type=str, help="DB to find Tags, IOVs + Payloads in.", required=False)
184 
185  # metadata arguments
186  parser.add_argument("--inputTag", type=str,\
187  help="Tag to take IOVs + Payloads from in --sourceDB.", required=False)
188  parser.add_argument("--destinationTag", type=str,\
189  help="Tag to copy IOVs + Payloads to in --destDB.", required=False)
190  parser.add_argument("--destinationDatabase", type=str,\
191  help="Database to copy IOVs + Payloads to.", required=False)
192  parser.add_argument("--since", type=int,\
193  help="Since to take IOVs from.", required=False)
194  parser.add_argument("--userText", type=str,\
195  help="Description of --destTag (can be empty).")
196 
197  # non-metadata arguments
198  parser.add_argument("--metadataFile", "-m", type=str, help="Metadata file to take metadata from.", required=False)
199 
200  parser.add_argument("--debug", required=False, action="store_true")
201  parser.add_argument("--verbose", required=False, action="store_true")
202  parser.add_argument("--testing", required=False, action="store_true")
203  parser.add_argument("--fcsr-filter", type=str, help="Synchronization to take FCSR from for local filtering of IOVs.", required=False)
204 
205  parser.add_argument("--netrc", required=False)
206 
207  parser.add_argument("--hashToUse", required=False)
208 
209  parser.add_argument("--server", required=False)
210 
211  parser.add_argument("--review-options", required=False, action="store_true")
212 
213  command_line_data = parser.parse_args()
214 
215  # default is the production server, which can point to either database anyway
216  server_alias_to_url = {
217  "prep" : "https://cms-conddb-dev.cern.ch/cmsDbCondUpload/",
218  "prod" : "https://cms-conddb.cern.ch/cmsDbCondUpload/",
219  None : "https://cms-conddb.cern.ch/cmsDbCondUpload/"
220  }
221 
222  # if prep, prod or None were given, convert to URLs in dictionary server_alias_to_url
223  # if not, assume a URL has been given and use this instead
224  if command_line_data.server in server_alias_to_url.keys():
225  command_line_data.server = server_alias_to_url[command_line_data.server]
226 
227  # use netrc to get username and password
228  try:
229  netrc_file = command_line_data.netrc
230  netrc_authenticators = netrc.netrc(netrc_file).authenticators("ConditionUploader")
231  if netrc_authenticators == None:
232  print("Your netrc file must contain the key 'ConditionUploader'.")
233  manual_input = raw_input("Do you want to try to type your credentials? ")
234  if manual_input == "y":
235  # ask for username and password
236  username = raw_input("Username: ")
237  password = getpass.getpass("Password: ")
238  else:
239  exit()
240  else:
241  print("Read your credentials from ~/.netrc. If you want to use a different file, supply its name with the --netrc argument.")
242  username = netrc_authenticators[0]
243  password = netrc_authenticators[2]
244  except:
245  print("Couldn't obtain your credentials (either from netrc or manual input).")
246  exit()
247 
248  command_line_data.username = username
249  command_line_data.password = password
250  # this will be used as the final destinationTags value by all input methods
251  # apart from the metadata file
252  command_line_data.destinationTags = {command_line_data.destinationTag:{}}
253 
254  """
255  Construct metadata_dictionary:
256  Currently, there are 3 cases:
257 
258  1) An IOV is being appended to an existing Tag with an existing Payload.
259  In this case, we just take all data from the command line.
260 
261  2) No metadata file is given, so we assume that ALL upload metadata is coming from the command line.
262 
263  3) A metadata file is given, hence we parse the file, and then iterate through command line arguments
264  since these override the options set in the metadata file.
265 
266  """
267  if command_line_data.hashToUse != None:
268  command_line_data.userText = ""
269  metadata_dictionary = command_line_data.__dict__
270  elif command_line_data.metadataFile == None:
271  command_line_data.userText = command_line_data.userText\
272  if command_line_data.userText != None\
273  else str(raw_input("Tag's description [can be empty]:"))
274  metadata_dictionary = command_line_data.__dict__
275  else:
276  metadata_dictionary = json.loads("".join(open(os.path.abspath(command_line_data.metadataFile), "r").readlines()))
277  metadata_dictionary["username"] = username
278  metadata_dictionary["password"] = password
279  metadata_dictionary["userText"] = metadata_dictionary.get("userText")\
280  if metadata_dictionary.get("userText") != None\
281  else str(raw_input("Tag's description [can be empty]:"))
282  # set the server to use to be the default one
283  metadata_dictionary["server"] = server_alias_to_url[None]
284 
285  # go through command line options and, if they are set, overwrite entries
286  for (option_name, option_value) in command_line_data.__dict__.items():
287  # if the metadata_dictionary sets this, overwrite it
288  if option_name != "destinationTags":
289  if option_value != None or (option_value == None and not(option_name in metadata_dictionary.keys())):
290  # if option_value has a value, override the metadata file entry
291  # or if option_value is None but the metadata file doesn't give a value,
292  # set the entry to None as well
293  metadata_dictionary[option_name] = option_value
294  else:
295  if option_value != {None:{}}:
296  metadata_dictionary["destinationTags"] = {option_value:{}}
297  elif option_value == {None:{}} and not("destinationTags" in metadata_dictionary.keys()):
298  metadata_dictionary["destinationTags"] = {None:{}}
299 
300  if command_line_data.review_options:
301  defaults = {
302  "since" : "Since of first IOV",
303  "userText" : "Populated by upload process",
304  "netrc" : "None given",
305  "fcsr_filter" : "Don't apply",
306  "hashToUse" : "Using local SQLite file instead"
307  }
308  print("Configuration to use for the upload:")
309  for key in metadata_dictionary:
310  if not(key) in ["username", "password", "destinationTag"]:
311  value_to_print = metadata_dictionary[key] if metadata_dictionary[key] != None else defaults[key]
312  print("\t%s : %s" % (key, value_to_print))
313 
314  if raw_input("\nDo you want to continue? [y/n] ") != "y":
315  exit()
316 
317  return metadata_dictionary
318 

References beamvalidation.exit(), get_directory_to_pull_to(), get_local_commit_hash(), get_version_info(), join(), data_sources.json_data_node.make(), print(), pull_code_from_git(), run_in_shell(), run_upload(), and str.

Referenced by uploads.uploader.send_metadata().
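One small, self-contained piece of this function is the server-alias resolution: "prep", "prod", and an unset value map to known URLs, and anything else is assumed to already be a URL. A sketch:

```python
def resolve_server(server):
    # Alias resolution from parse_arguments(): known aliases map to URLs,
    # None falls back to the production server, anything else passes through.
    server_alias_to_url = {
        "prep": "https://cms-conddb-dev.cern.ch/cmsDbCondUpload/",
        "prod": "https://cms-conddb.cern.ch/cmsDbCondUpload/",
        None:   "https://cms-conddb.cern.ch/cmsDbCondUpload/",
    }
    return server_alias_to_url.get(server, server)

print(resolve_server(None))
print(resolve_server("https://example.cern.ch/upload/"))
```

Note that the example.cern.ch URL is purely illustrative.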

◆ pull_code_from_git()

def uploadConditions.pull_code_from_git (   target_directory,
  repository_url,
  hash 
)
Pulls CondDBFW from the git repository specified by the upload server.

Definition at line 98 of file uploadConditions.py.

98 def pull_code_from_git(target_directory, repository_url, hash):
99  """
100  Pulls CondDBFW from the git repository specified by the upload server.
101  """
102  # make directory
103  target = os.path.abspath(target_directory)
104  sys.path.append(target)
105  conddbfw_directory = os.path.join(target, "CondDBFW")
106  git_directory = os.path.join(conddbfw_directory, ".git")
107  if not(os.path.exists(conddbfw_directory)):
108  os.mkdir(conddbfw_directory)
109  else:
110  # if the directory exists, it may contain things - prompt the user
111  force_pull = str(raw_input("CondDBFW directory isn't empty - empty it, and update to new version? [y/n] "))
112  if force_pull == "y":
113  # empty directory and delete it
114  run_in_shell("rm -rf CondDBFW", shell=True)
115  # remake the directory - it will be empty
116  os.mkdir(conddbfw_directory)
117 
118  print("Pulling code back from repository...")
119  print(horizontal_rule)
120 
121  run_in_shell("git --git-dir=%s clone %s CondDBFW" % (git_directory, repository_url), shell=True)
122  # --force makes sure we ignore any conflicts that
123  # could occur and overwrite everything in the checkout
124  run_in_shell("cd %s && git checkout --force -b version_used %s" % (conddbfw_directory, hash), shell=True)
125 
126  # write the hash to a file in the CondDBFW directory so we can delete the git repository
127  hash_file_handle = open(os.path.join(conddbfw_directory, ".commit_hash"), "w")
128  hash_file_handle.write(hash)
129  hash_file_handle.close()
130 
131  # can now delete .git directory
132  shutil.rmtree(git_directory)
133 
134  print(horizontal_rule)
135  print("Creating local log directories (if required)...")
136  if not(os.path.exists(os.path.join(target, "upload_logs"))):
137  os.mkdir(os.path.join(target, "upload_logs"))
138  if not(os.path.exists(os.path.join(target, "server_side_logs"))):
139  os.mkdir(os.path.join(target, "server_side_logs"))
140  print("Finished with log directories.")
141  print("Update of CondDBFW complete.")
142 
143  print(horizontal_rule)
144 
145  return True
146 

References print(), run_in_shell(), and str.

Referenced by parse_arguments().
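The log-directory step at the end can be written more compactly with os.makedirs(exist_ok=True), which makes it safely re-runnable; a stand-alone sketch:

```python
import os
import tempfile

def ensure_log_directories(target):
    # The final step of pull_code_from_git(), rewritten so that repeated
    # runs are harmless no-ops.
    for name in ("upload_logs", "server_side_logs"):
        os.makedirs(os.path.join(target, name), exist_ok=True)

tmp = tempfile.mkdtemp()
ensure_log_directories(tmp)
ensure_log_directories(tmp)  # second call changes nothing
print(sorted(os.listdir(tmp)))
```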

◆ re_upload()

def uploadConditions.re_upload (   options)

Definition at line 812 of file uploadConditions.py.

812 def re_upload( options ):
813  netrcPath = None
814  logDbSrv = prodLogDbSrv
815  if options.hostname == defaultDevHostname:
816  logDbSrv = devLogDbSrv
817  if options.authPath is not None:
818  netrcPath = os.path.join( options.authPath,'.netrc' )
819  try:
820  netrcKey = '%s/%s' %(logDbSrv,logDbSchema)
821  print('#netrc key=%s' %netrcKey)
822  # Try to find the netrc entry
823  (username, account, password) = netrc.netrc( netrcPath ).authenticators( netrcKey )
824  except IOError as e:
825  logging.error('Cannot access netrc file.')
826  return 1
827  except Exception as e:
828  logging.error('Netrc file is invalid: %s' %str(e))
829  return 1
830  conStr = '%s/%s@%s' %(username,password,logDbSrv)
831  con = cx_Oracle.connect( conStr )
832  cur = con.cursor()
833  fh = options.reUpload
834  cur.execute('SELECT FILECONTENT, STATE FROM FILES WHERE FILEHASH = :HASH',{'HASH':fh})
835  res = cur.fetchall()
836  found = False
837  fdata = None
838  for r in res:
839  found = True
840  logging.info("Found file %s in state '%s'" %(fh,r[1]))
841  fdata = r[0].read().decode('bz2')
842  con.close()
843  if not found:
844  logging.error("No uploaded file found with hash %s" %fh)
845  return 1
846  # write it as a tar file and open it (is there a way to open it in memory?)
847  fname = '%s.tar' %fh
848  with open(fname, "wb" ) as f:
849  f.write(fdata)
850  rname = 'reupload_%s' %fh
851  with tarfile.open(fname) as tar:
852  tar.extractall()
853  os.remove(fname)
854  dfile = 'data.db'
855  mdfile = 'metadata.txt'
856  if os.path.exists(dfile):
857  os.utime(dfile,None)
858  os.chmod(dfile,0o755)
859  os.rename(dfile,'%s.db' %rname)
860  else:
861  logging.error('Tar file does not contain the data file')
862  return 1
863  if os.path.exists(mdfile):
864  os.utime(mdfile,None)
865  os.chmod(mdfile,0o755)
866  mdata = None
867  with open(mdfile) as md:
868  mdata = json.load(md)
869  datelabel = datetime.now().strftime("%y-%m-%d %H:%M:%S")
870  if mdata is None:
871  logging.error('Metadata file is empty.')
872  return 1
873  logging.debug('Preparing new metadata file...')
874  mdata['userText'] = 'reupload %s : %s' %(datelabel,mdata['userText'])
875  with open( '%s.txt' %rname, 'wb') as jf:
876  jf.write( json.dumps( mdata, sort_keys=True, indent = 2 ) )
877  jf.write('\n')
878  os.remove(mdfile)
879  else:
880  logging.error('Tar file does not contain the metadata file')
881  return 1
882  logging.info('Files %s prepared for the upload.' %rname)
883  arguments = [rname]
884  return upload(options, arguments)
885 

References edm.decode(), print(), readEcalDQMStatus.read, str, and upload().

Referenced by main().
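The metadata rewrite in the middle of this function (prefixing userText with a timestamped "reupload" marker before writing the new .txt file) can be isolated as below; the sample metadata is hypothetical:

```python
import json
from datetime import datetime

def prepare_reupload_metadata(mdata):
    # Same transformation re_upload() applies: tag the description with a
    # timestamp so the re-upload is traceable in the logs.
    datelabel = datetime.now().strftime("%y-%m-%d %H:%M:%S")
    updated = dict(mdata)
    updated['userText'] = 'reupload %s : %s' % (datelabel, mdata['userText'])
    return json.dumps(updated, sort_keys=True, indent=2) + '\n'

text = prepare_reupload_metadata({'userText': 'original comment', 'since': 1234})
print(text)
```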

◆ run_in_shell()

def uploadConditions.run_in_shell ( *  popenargs,
**  kwargs 
)
Runs string-based commands in the shell and returns the result.

Definition at line 147 of file uploadConditions.py.

147 def run_in_shell(*popenargs, **kwargs):
148  """
149  Runs string-based commands in the shell and returns the result.
150  """
151  out = subprocess.PIPE if kwargs.get("stdout") == None else kwargs.get("stdout")
152  new_kwargs = kwargs
153  if new_kwargs.get("stdout"):
154  del new_kwargs["stdout"]
155  process = subprocess.Popen(*popenargs, stdout=out, **new_kwargs)
156  stdout = process.communicate()[0]
157  returnCode = process.returncode
158  cmd = kwargs.get('args')
159  if cmd is None:
160  cmd = popenargs[0]
161  if returnCode:
162  raise subprocess.CalledProcessError(returnCode, cmd)
163  return stdout
164 

Referenced by parse_arguments(), and pull_code_from_git().
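A simplified stand-alone version shows the contract: stdout is captured through a pipe unless the caller supplies one, and a non-zero exit status raises CalledProcessError (on modern Python, subprocess.check_output offers essentially the same behaviour):

```python
import subprocess

def run_in_shell(*popenargs, **kwargs):
    # Simplified mirror of uploadConditions.run_in_shell: capture stdout
    # unless the caller provided one, and fail loudly on a non-zero exit.
    out = kwargs.pop("stdout", subprocess.PIPE)
    process = subprocess.Popen(*popenargs, stdout=out, **kwargs)
    stdout = process.communicate()[0]
    if process.returncode:
        raise subprocess.CalledProcessError(process.returncode, popenargs[0])
    return stdout

output = run_in_shell("echo hello", shell=True)
print(output)
```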

◆ run_upload()

def uploadConditions.run_upload ( **  parameters)
Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.

Definition at line 165 of file uploadConditions.py.

165 def run_upload(**parameters):
166  """
167  Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.
168  """
169  try:
170  import CondDBFW.uploads as uploads
171  except Exception as e:
172  traceback.print_exc()
173  exit("CondDBFW or one of its dependencies could not be imported.\n"\
174  + "If the CondDBFW directory exists, you are likely not in a CMSSW environment.")
175  # we have CondDBFW, so just call the module with the parameters given in the command line
176  uploader = uploads.uploader(**parameters)
177  result = uploader.upload()
178 

References beamvalidation.exit().

Referenced by parse_arguments().

◆ runWizard()

def uploadConditions.runWizard (   basename,
  dataFilename,
  metadataFilename 
)

Definition at line 103 of file uploadConditions.py.

103 def runWizard(basename, dataFilename, metadataFilename):
104  while True:
105  print('''\nWizard for metadata for %s
106 
107 I will ask you some questions to fill the metadata file. For some of the questions there are defaults between square brackets (i.e. []), leave empty (i.e. hit Enter) to use them.''' % basename)
108 
109  # Try to get the available inputTags
110  try:
111  dataConnection = sqlite3.connect(dataFilename)
112  dataCursor = dataConnection.cursor()
113  dataCursor.execute('select name from sqlite_master where type == "table"')
114  tables = set(zip(*dataCursor.fetchall())[0])
115 
116  # only conddb V2 supported...
117  if 'TAG' in tables:
118  dataCursor.execute('select NAME from TAG')
119  # In any other case, do not try to get the inputTags
120  else:
121  raise Exception()
122 
123  inputTags = dataCursor.fetchall()
124  if len(inputTags) == 0:
125  raise Exception()
126  inputTags = zip(*inputTags)[0]
127 
128  except Exception:
129  inputTags = []
130 
131  if len(inputTags) == 0:
132  print('\nI could not find any input tag in your data file, but you can still specify one manually.')
133 
134  inputTag = getInputRepeat(
135  '\nWhich is the input tag (i.e. the tag to be read from the SQLite data file)?\ne.g. BeamSpotObject_ByRun\ninputTag: ')
136 
137  else:
138  print('\nI found the following input tags in your SQLite data file:')
139  for (index, inputTag) in enumerate(inputTags):
140  print(' %s) %s' % (index, inputTag))
141 
142  inputTag = getInputChoose(inputTags, '0',
143  '\nWhich is the input tag (i.e. the tag to be read from the SQLite data file)?\ne.g. 0 (you select the first in the list)\ninputTag [0]: ')
144 
145  destinationDatabase = ''
146  ntry = 0
147  while ( destinationDatabase != 'oracle://cms_orcon_prod/CMS_CONDITIONS' and destinationDatabase != 'oracle://cms_orcoff_prep/CMS_CONDITIONS' ):
148  if ntry==0:
149  inputMessage = \
150  '\nWhich is the destination database where the tags should be exported? \nPossible choices: oracle://cms_orcon_prod/CMS_CONDITIONS (for prod) or oracle://cms_orcoff_prep/CMS_CONDITIONS (for prep) \ndestinationDatabase: '
151  elif ntry==1:
152  inputMessage = \
153  '\nPlease choose one of the two valid destinations: \noracle://cms_orcon_prod/CMS_CONDITIONS (for prod) or oracle://cms_orcoff_prep/CMS_CONDITIONS (for prep) \
154 \ndestinationDatabase: '
155  else:
156  raise Exception('No valid destination chosen. Bailing out...')
157  destinationDatabase = getInputRepeat(inputMessage)
158  ntry += 1
159 
160  while True:
161  since = getInput('',
162  '\nWhich is the given since? (if not specified, the one from the SQLite data file will be taken -- note that even if specified, still this may not be the final since, depending on the synchronization options you select later: if the synchronization target is not offline, and the since you give is smaller than the next possible one (i.e. you give a run number earlier than the one which will be started/processed next in prompt/hlt/express), the DropBox will move the since ahead to go to the first safe run instead of the value you gave)\ne.g. 1234\nsince []: ')
163  if not since:
164  since = None
165  break
166  else:
167  try:
168  since = int(since)
169  break
170  except ValueError:
171  logging.error('The since value has to be an integer or empty (null).')
172 
173  userText = getInput('',
174  '\nWrite any comments/text you may want to describe your request\ne.g. Muon alignment scenario for...\nuserText []: ')
175 
176  destinationTags = {}
177  while True:
178  destinationTag = getInput('',
179  '\nWhich is the next destination tag to be added (leave empty to stop)?\ne.g. BeamSpotObjects_PCL_byRun_v0_offline\ndestinationTag []: ')
180  if not destinationTag:
181  if len(destinationTags) == 0:
182  logging.error('There must be at least one destination tag.')
183  continue
184  break
185 
186  if destinationTag in destinationTags:
187  logging.warning(
188  'You already added this destination tag. Overwriting the previous one with this new one.')
189 
190  destinationTags[destinationTag] = {
191  }
192 
193  metadata = {
194  'destinationDatabase': destinationDatabase,
195  'destinationTags': destinationTags,
196  'inputTag': inputTag,
197  'since': since,
198  'userText': userText,
199  }
200 
201  metadata = json.dumps(metadata, sort_keys=True, indent=4)
202  print('\nThis is the generated metadata:\n%s' % metadata)
203 
204  if getInput('n',
205  '\nIs it fine (i.e. save in %s and *upload* the conditions if this is the latest file)?\nAnswer [n]: ' % metadataFilename).lower() == 'y':
206  break
207  logging.info('Saving generated metadata in %s...', metadataFilename)
208  with open(metadataFilename, 'wb') as metadataFile:
209  metadataFile.write(metadata)
210 

References getInput(), getInputChoose(), getInputRepeat(), createfilelist.int, print(), and ComparisonHelper.zip().

Referenced by uploadAllFiles().
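The tag-discovery step (the first try block) can be rewritten for Python 3, where zip() returns an iterator and cannot be indexed the way the original does; a stand-alone sketch against a toy conddb-V2-style file:

```python
import os
import sqlite3
import tempfile

def read_input_tags(data_filename):
    # The wizard's tag discovery: only conddb V2 schemas (with a TAG table)
    # are supported; on any other layout the caller asks for manual input.
    connection = sqlite3.connect(data_filename)
    cursor = connection.cursor()
    cursor.execute("select name from sqlite_master where type = 'table'")
    tables = {row[0] for row in cursor.fetchall()}
    if 'TAG' not in tables:
        return []
    cursor.execute('select NAME from TAG')
    return [row[0] for row in cursor.fetchall()]

# build a minimal data file with one tag
tmp = tempfile.mkdtemp()
db = os.path.join(tmp, "localSqlite.db")
con = sqlite3.connect(db)
con.execute('CREATE TABLE TAG (NAME text)')
con.execute('INSERT INTO TAG VALUES (?)', ("BeamSpotObject_ByRun",))
con.commit()
con.close()
print(read_input_tags(db))
```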

◆ testTier0Upload()

def uploadConditions.testTier0Upload ( )

Definition at line 976 of file uploadConditions.py.

976 def testTier0Upload():
977 
978  global defaultNetrcHost
979 
980  (username, account, password) = netrc.netrc().authenticators(defaultNetrcHost)
981 
982  filenames = ['testFiles/localSqlite-top2']
983 
984  uploadTier0Files(filenames, username, password, cookieFileName = None)
985 
986 

References uploadTier0Files().

◆ upload()

def uploadConditions.upload (   options,
  arguments 
)

Definition at line 886 of file uploadConditions.py.

886 def upload(options, arguments):
887  results = uploadAllFiles(options, arguments)
888 
889  if 'status' not in results:
890  print('Unexpected error.')
891  return -1
892  ret = results['status']
893  print(results)
894  print("upload ended with code: %s" %ret)
895  return ret
896 

References print(), and uploadAllFiles().

Referenced by main(), and re_upload().

◆ uploadAllFiles()

def uploadConditions.uploadAllFiles (   options,
  arguments 
)

Definition at line 657 of file uploadConditions.py.

657 def uploadAllFiles(options, arguments):
658 
659  ret = {}
660  ret['status'] = 0
661 
662  # Check that we can read the data and metadata files
663  # If the metadata file does not exist, start the wizard
664  for filename in arguments:
665  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
666  basename = os.path.basename(basepath)
667  dataFilename = '%s.db' % basepath
668  metadataFilename = '%s.txt' % basepath
669 
670  logging.info('Checking %s...', basename)
671 
672  # Data file
673  try:
674  with open(dataFilename, 'rb') as dataFile:
675  pass
676  except IOError as e:
677  errMsg = 'Impossible to open SQLite data file %s' %dataFilename
678  logging.error( errMsg )
679  ret['status'] = -3
680  ret['error'] = errMsg
681  return ret
682 
683  # Check the data file
684  empty = True
685  try:
686  dbcon = sqlite3.connect( dataFilename )
687  dbcur = dbcon.cursor()
688  dbcur.execute('SELECT * FROM IOV')
689  rows = dbcur.fetchall()
690  for r in rows:
691  empty = False
692  dbcon.close()
693  if empty:
694  errMsg = 'The input SQLite data file %s contains no data.' %dataFilename
695  logging.error( errMsg )
696  ret['status'] = -4
697  ret['error'] = errMsg
698  return ret
699  except Exception as e:
700  errMsg = 'Check on input SQLite data file %s failed: %s' %(dataFilename,str(e))
701  logging.error( errMsg )
702  ret['status'] = -5
703  ret['error'] = errMsg
704  return ret
705 
706  # Metadata file
707  try:
708  with open(metadataFilename, 'rb') as metadataFile:
709  pass
710  except IOError as e:
711  if e.errno != errno.ENOENT:
712  errMsg = 'Impossible to open file %s (for other reason than not existing)' %metadataFilename
713  logging.error( errMsg )
714  ret['status'] = -4
715  ret['error'] = errMsg
716  return ret
717 
718  if getInput('y', '\nIt looks like the metadata file %s does not exist. Do you want me to create it and help you fill it?\nAnswer [y]: ' % metadataFilename).lower() != 'y':
719  errMsg = 'Metadata file %s does not exist' %metadataFilename
720  logging.error( errMsg )
721  ret['status'] = -5
722  ret['error'] = errMsg
723  return ret
724  # Wizard
725  runWizard(basename, dataFilename, metadataFilename)
726 
727  # Upload files
728  try:
729  dropBox = ConditionsUploader(options.hostname, options.urlTemplate)
730 
731  # Authentication
732  username, password = getCredentials(options)
733 
734  results = {}
735  for filename in arguments:
736  backend = options.backend
737  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
738  metadataFilename = '%s.txt' % basepath
739  with open(metadataFilename, 'rb') as metadataFile:
740  metadata = json.load( metadataFile )
741  # When dest db = prep the hostname has to be set to dev.
742  forceHost = False
743  destDb = metadata['destinationDatabase']
744  if destDb.startswith('oracle://cms_orcon_prod') or destDb.startswith('oracle://cms_orcoff_prep'):
745  hostName = defaultHostname
746  if destDb.startswith('oracle://cms_orcoff_prep'):
747  hostName = defaultDevHostname
748  dropBox.setHost( hostName )
749  authRet = dropBox.signIn( username, password )
750  if authRet != 0:
751  msg = "Error trying to connect to the server. Aborting."
752  if authRet == -2:
753  msg = "Error while signing in. Aborting."
754  logging.error(msg)
755  return { 'status' : authRet, 'error' : msg }
756  results[filename] = dropBox.uploadFile(filename, options.backend, options.temporaryFile)
757  else:
758  results[filename] = False
759  logging.error("DestinationDatabase %s is not valid. Skipping the upload." %destDb)
760  if not results[filename]:
761  if ret['status']<0:
762  ret['status'] = 0
763  ret['status'] += 1
764  ret['files'] = results
765  logging.debug("all files processed, logging out now.")
766 
767  dropBox.signOut()
768 
769  except HTTPError as e:
770  logging.error('got HTTP error: %s', str(e))
771  return { 'status' : -1, 'error' : str(e) }
772 
773  return ret
774 

References getCredentials(), getInput(), runWizard(), and str.

Referenced by upload().
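
Before uploading, uploadAllFiles() opens each SQLite data file and rejects it if its IOV table holds no rows. An equivalent check can be sketched as below (using COUNT(*) instead of fetching all rows; the single-column IOV table here is a stand-in for the fuller CondDB schema):

```python
import os
import sqlite3
import tempfile

def iov_is_empty(dataFilename):
    """Return True if the IOV table of the given SQLite file has no rows,
    mirroring the emptiness check uploadAllFiles() performs."""
    dbcon = sqlite3.connect(dataFilename)
    try:
        dbcur = dbcon.cursor()
        dbcur.execute('SELECT COUNT(*) FROM IOV')
        (count,) = dbcur.fetchone()
        return count == 0
    finally:
        dbcon.close()

# Build a throwaway SQLite file with a minimal, hypothetical IOV table.
fd, path = tempfile.mkstemp(suffix='.db')
os.close(fd)
con = sqlite3.connect(path)
con.execute('CREATE TABLE IOV (since INTEGER)')
con.commit()
con.close()
print(iov_is_empty(path))  # → True (no rows yet: upload would be refused)

con = sqlite3.connect(path)
con.execute('INSERT INTO IOV (since) VALUES (1)')
con.commit()
con.close()
print(iov_is_empty(path))  # → False
os.remove(path)
```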

◆ uploadTier0Files()

def uploadConditions.uploadTier0Files (   filenames,
  username,
  password,
  cookieFileName = None 
)
Uploads a bunch of files coming from Tier0.
This has the following requirements:
    * Username/Password based authentication.
    * Uses the online backend.
    * Ignores errors related to the upload/content (e.g. duplicated file).

Definition at line 775 of file uploadConditions.py.

775 def uploadTier0Files(filenames, username, password, cookieFileName = None):
776  '''Uploads a bunch of files coming from Tier0.
777  This has the following requirements:
778  * Username/Password based authentication.
779  * Uses the online backend.
780  * Ignores errors related to the upload/content (e.g. duplicated file).
781  '''
782 
783  dropBox = ConditionsUploader()
784 
785  dropBox.signIn(username, password)
786 
787  for filename in filenames:
788  try:
789  result = dropBox.uploadFile(filename, backend = 'test')
790  except HTTPError as e:
791  if e.code == 400:
792  # 400 Bad Request: This is an exception related to the upload
793  # being wrong for some reason (e.g. duplicated file).
794  # Since for Tier0 this is not an issue, continue
795  logging.error('HTTP Exception 400 Bad Request: Upload-related, skipping. Message: %s', e)
796  continue
797 
798  # In any other case, re-raise.
799  raise
800 
801  #-toDo: add a flag to say if we should retry or not. So far, all retries are done server-side (Tier-0),
802  # if we flag as failed any retry would not help and would result in the same error (e.g.
803  # when a file with an identical hash is uploaded again)
804  #-review(2015-09-25): get feedback from tests at Tier-0 (action: AP)
805 
806  if not result: # dropbox reported an error when uploading, do not retry.
807  logging.error('Error from dropbox, upload-related, skipping.')
808  continue
809 
810  dropBox.signOut()
811 

Referenced by testTier0Upload().
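
The error policy in uploadTier0Files() is: an HTTP 400 Bad Request (e.g. a duplicated file) is logged and skipped, while any other HTTPError propagates. That policy can be isolated as a small sketch; FakeHTTPError and the uploader callback are stand-ins for the module's HTTPError and dropBox.uploadFile():

```python
import logging

class FakeHTTPError(Exception):
    """Stand-in for the module's HTTPError; carries an HTTP status code."""
    def __init__(self, code):
        super().__init__('HTTP %d' % code)
        self.code = code

def upload_ignoring_duplicates(filenames, upload_one):
    """Upload each file; log and skip on 400 Bad Request, re-raise otherwise."""
    uploaded = []
    for filename in filenames:
        try:
            upload_one(filename)
        except FakeHTTPError as e:
            if e.code == 400:
                # Upload-related problem (e.g. duplicated file): not an
                # issue for Tier0, so continue with the next file.
                logging.error('HTTP 400 Bad Request, skipping %s', filename)
                continue
            raise
        uploaded.append(filename)
    return uploaded

# Hypothetical uploader: pretend 'dup.db' is a duplicated file.
def upload_one(name):
    if name == 'dup.db':
        raise FakeHTTPError(400)

print(upload_ignoring_duplicates(['a.db', 'dup.db', 'b.db'], upload_one))
# → ['a.db', 'b.db']
```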
