
uploadConditions Namespace Reference

Classes

class  ConditionsUploader
 
class  HTTP
 
class  HTTPError
 

Functions

def addToTarFile (tarFile, fileobj, arcname)
 
def get_directory_to_pull_to (default_directory, commit_hash)
 
def get_local_commit_hash ()
 
def get_version_info (url)
 
def getCredentials (options)
 
def getInput (default, prompt='')
 
def getInputChoose (optionsList, default, prompt='')
 
def getInputRepeat (prompt='')
 
def getInputWorkflow (prompt='')
 
def main ()
 
def parse_arguments ()
 
def pull_code_from_git (target_directory, repository_url, hash)
 
def re_upload (options)
 
def run_in_shell (*popenargs, **kwargs)
 
def run_upload (**parameters)
 
def runWizard (basename, dataFilename, metadataFilename)
 
def testTier0Upload ()
 
def upload (options, arguments)
 
def uploadAllFiles (options, arguments)
 
def uploadTier0Files (filenames, username, password, cookieFileName=None)
 

Detailed Description

Joshua Dawes - CERN, CMS - The University of Manchester

Upload script wrapper - controls the automatic update system.

Note: the name of the file follows a different convention from the others because it should be the same as the current upload script name.

Takes user arguments and passes them to the main upload module CondDBFW.uploads, once the correct version exists.

1. Ask the server corresponding to the database we're uploading to which version of CondDBFW it has (query the /conddbfw_version/ url).
2. Decide which directory that we can write to - either the current local directory, or /tmp/random_string/.
3. Pull the commit returned from the server into the directory from step 2.
4. Invoke the CondDBFW.uploads module with the arguments given to this script.

Script that uploads to the new CMS conditions uploader.
Adapted to the new infrastructure from v6 of the upload.py script for the DropBox, by Miguel Ojeda.
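
The four steps above can be sketched as glue code. This is a sketch only: the helper functions (documented below) are injected as parameters so the flow can be exercised with fakes, and the `hash`/`repo` keys of the version response are assumptions, not the server's actual payload.

```python
import os

def bootstrap_and_upload(options, *, get_version_info, get_local_commit_hash,
                         get_directory_to_pull_to, pull_code_from_git,
                         run_upload):
    # Step 1: ask the server which CondDBFW commit it is running.
    version = get_version_info(options["server"])
    server_hash = version["hash"]          # assumed key name
    # Step 2: find a directory we are allowed to write to.
    target = get_directory_to_pull_to(os.getcwd(), server_hash)
    # Step 3: pull that commit if the local copy is missing or stale.
    if get_local_commit_hash() != server_hash:
        pull_code_from_git(target, version["repo"], server_hash)  # assumed key
    # Step 4: hand the command-line metadata to CondDBFW.uploads.
    return run_upload(**options)
```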

Function Documentation

◆ addToTarFile()

def uploadConditions.addToTarFile (   tarFile,
  fileobj,
  arcname 
)

Definition at line 428 of file uploadConditions.py.

428 def addToTarFile(tarFile, fileobj, arcname):
429  tarInfo = tarFile.gettarinfo(fileobj = fileobj, arcname = arcname)
430  tarInfo.mode = 0o400
431  tarInfo.uid = tarInfo.gid = tarInfo.mtime = 0
432  tarInfo.uname = tarInfo.gname = 'root'
433  tarFile.addfile(tarInfo, fileobj)
434 

Referenced by uploadConditions.ConditionsUploader.uploadFile().
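
The zeroed metadata above is what makes the upload tarball reproducible: the same payload always produces byte-identical archive output, so the file hash is stable. A minimal standalone sketch of the same normalization (`build_tar_bytes` is a hypothetical helper, not part of the script):

```python
import io
import tarfile

def add_to_tar(tar, fileobj, arcname):
    # Same normalization as addToTarFile above: fixed mode, zeroed
    # uid/gid/mtime and root ownership make the archive reproducible.
    info = tar.gettarinfo(fileobj=fileobj, arcname=arcname)
    info.mode = 0o400
    info.uid = info.gid = info.mtime = 0
    info.uname = info.gname = 'root'
    tar.addfile(info, fileobj)

def build_tar_bytes(path, arcname):
    # Hypothetical helper: archive one file and return the raw tar bytes.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode='w') as tar:
        with open(path, 'rb') as fileobj:
            add_to_tar(tar, fileobj, arcname)
    return buf.getvalue()
```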

◆ get_directory_to_pull_to()

def uploadConditions.get_directory_to_pull_to (   default_directory,
  commit_hash 
)
Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.

Definition at line 73 of file uploadConditions.py.

73 def get_directory_to_pull_to(default_directory, commit_hash):
74  """
75  Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.
76  """
77  # try to write a file (and then delete it)
78  try:
79  handle = open(os.path.join(default_directory, "test_file"), "w")
80  handle.write("test")
81  handle.close()
82  os.remove(os.path.join(default_directory, "test_file"))
83  sys.path.insert(0, default_directory)
84  return default_directory
85  except IOError as io:
86  # cannot write to default directory, so set up a directory in /tmp/
87  new_path = os.path.join("/tmp", commit_hash[0:10])
88  if not(os.path.exists(new_path)):
89  os.mkdir(new_path)
90  sys.path.insert(0, new_path)
91  return new_path
92  else:
93  # for now, fail
94  exit("Can't find anywhere to pull the new code base to.")
95 

References beamvalidation.exit().

Referenced by parse_arguments().
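
The same probe-by-writing pattern, as a standalone sketch. It differs from the original in two labeled assumptions: it uses `tempfile.gettempdir()` instead of a hard-coded path, and it does not touch `sys.path`.

```python
import os
import tempfile

def writable_directory(default_directory, commit_hash):
    # Probe by writing (and deleting) a file, as get_directory_to_pull_to
    # does; fall back to a per-commit directory under the system tmp dir.
    probe = os.path.join(default_directory, ".write_probe")
    try:
        with open(probe, "w") as handle:
            handle.write("test")
        os.remove(probe)
        return default_directory
    except OSError:
        fallback = os.path.join(tempfile.gettempdir(), commit_hash[:10])
        os.makedirs(fallback, exist_ok=True)
        return fallback
```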

◆ get_local_commit_hash()

def uploadConditions.get_local_commit_hash ( )
Gets the commit hash used by the local repository CondDBFW/.git/.

Definition at line 50 of file uploadConditions.py.

50 def get_local_commit_hash():
51  """
52  Gets the commit hash used by the local repository CondDBFW/.git/.
53  """
54  directory = os.path.abspath("CondDBFW")
55 
56  # get the commit hash of the code in `directory`
57  # by reading the .commit_hash file
58  try:
59  commit_hash_file_handle = open(os.path.join(directory, ".commit_hash"), "r")
60  commit_hash = commit_hash_file_handle.read().strip()
61 
62  # validate length of the commit hash
63  if len(commit_hash) != 40:
64  print("Commit hash found is not valid. Must be 40 characters long.")
65  exit()
66 
67  #commit_hash = run_in_shell("git --git-dir=%s rev-parse HEAD" % (os.path.join(directory, ".git")), shell=True).strip()
68 
69  return commit_hash
70  except Exception:
71  return None
72 

References beamvalidation.exit(), print(), and digitizers_cfi.strip.

Referenced by parse_arguments().
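
The length check above accepts any 40-character string. A stricter, hypothetical validator would also require hexadecimal digits, since a full SHA-1 commit hash is 40 hex characters:

```python
import string

def is_valid_commit_hash(value):
    # A full SHA-1 commit hash is exactly 40 hexadecimal characters;
    # the original only checks the length.
    return len(value) == 40 and all(c in string.hexdigits for c in value)
```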

◆ get_version_info()

def uploadConditions.get_version_info (   url)
Queries the server-side for the commit hash it is currently using.
Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.

Definition at line 32 of file uploadConditions.py.

32 def get_version_info(url):
33  """
34  Queries the server-side for the commit hash it is currently using.
35  Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.
36  """
37  request = pycurl.Curl()
38  request.setopt(request.CONNECTTIMEOUT, 60)
39  user_agent = "User-Agent: ConditionWebServices/1.0 python/%d.%d.%d PycURL/%s" % (sys.version_info[ :3 ] + (pycurl.version_info()[1],))
40  request.setopt(request.USERAGENT, user_agent)
41  # we don't need to verify who signed the certificate or who the host is
42  request.setopt(request.SSL_VERIFYPEER, 0)
43  request.setopt(request.SSL_VERIFYHOST, 0)
44  response_buffer = StringIO()
45  request.setopt(request.WRITEFUNCTION, response_buffer.write)
46  request.setopt(request.URL, url + "conddbfw_version/")
47  request.perform()
48  return json.loads(response_buffer.getvalue())
49 

Referenced by parse_arguments().
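
For reference, the same query can be made with the standard library alone. This is a sketch, not the script's actual code path: like the pycurl version it disables certificate and hostname verification, and it assumes the base URL ends with a slash.

```python
import json
import ssl
import urllib.request

VERSION_PATH = "conddbfw_version/"  # endpoint queried by the script above

def version_url(base_url):
    # Mirror the script's concatenation: base_url is expected to end in '/'.
    return base_url + VERSION_PATH

def get_version_info(base_url, timeout=60):
    # urllib-based equivalent of the pycurl query; like the original, it
    # skips certificate and hostname verification.
    context = ssl.create_default_context()
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    with urllib.request.urlopen(version_url(base_url), timeout=timeout,
                                context=context) as response:
        return json.loads(response.read().decode("utf-8"))
```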

◆ getCredentials()

def uploadConditions.getCredentials (   options)

Definition at line 617 of file uploadConditions.py.

617 def getCredentials( options ):
618 
619  username = None
620  password = None
621  netrcPath = None
622  if authPathEnvVar in os.environ:
623  authPath = os.environ[authPathEnvVar]
624  netrcPath = os.path.join(authPath,'.netrc')
625  if options.authPath is not None:
626  netrcPath = os.path.join( options.authPath,'.netrc' )
627  try:
628  # Try to find the netrc entry
629  (username, account, password) = netrc.netrc( netrcPath ).authenticators(options.netrcHost)
630  except Exception:
631  # netrc entry not found, ask for the username and password
632  logging.info(
633  'netrc entry "%s" not found: if you wish not to have to retype your password, you can add an entry in your .netrc file. However, beware of the risks of having your password stored as plaintext.',
634  options.netrcHost)
635 
636  # Try to get a default username
637  defaultUsername = getpass.getuser()
638  if defaultUsername is None:
639  defaultUsername = '(not found)'
640 
641  username = getInput(defaultUsername, '\nUsername [%s]: ' % defaultUsername)
642  password = getpass.getpass('Password: ')
643 
644  return username, password
645 
646 

References getInput().

Referenced by uploadAllFiles().
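
The netrc lookup can be isolated into a small testable helper. A sketch under the same conventions (`credentials_from_netrc` is hypothetical; unlike the original it returns `None` rather than prompting when the entry is missing):

```python
import netrc
import os

def credentials_from_netrc(netrc_host, auth_path=None):
    # Look up the (login, account, password) triple for netrc_host and
    # return (username, password), or None instead of falling back to a
    # prompt as getCredentials does.
    path = os.path.join(auth_path, '.netrc') if auth_path else None
    try:
        entry = netrc.netrc(path).authenticators(netrc_host)
    except (OSError, netrc.NetrcParseError):
        return None
    if entry is None:
        return None
    username, _account, password = entry
    return username, password
```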

◆ getInput()

def uploadConditions.getInput (   default,
  prompt = '' 
)
Like input() but with a default and automatic strip().

Definition at line 52 of file uploadConditions.py.

52 def getInput(default, prompt = ''):
53  '''Like input() but with a default and automatic strip().
54  '''
55 
56  answer = input(prompt)
57  if answer:
58  return answer.strip()
59 
60  return default.strip()
61 
62 

References input.

Referenced by getCredentials(), getInputChoose(), getInputWorkflow(), runWizard(), and uploadAllFiles().
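
The same default-plus-strip behaviour, with the input function injected so it can be unit-tested without a terminal (a sketch; the `reader` parameter is an assumption, not part of the original):

```python
def get_input(default, prompt='', reader=input):
    # Like input() but with a default and automatic strip(); `reader`
    # stands in for input() so the behaviour is testable.
    answer = reader(prompt)
    if answer:
        return answer.strip()
    return default.strip()
```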

◆ getInputChoose()

def uploadConditions.getInputChoose (   optionsList,
  default,
  prompt = '' 
)
Makes the user choose from a list of options.

Definition at line 76 of file uploadConditions.py.

76 def getInputChoose(optionsList, default, prompt = ''):
77  '''Makes the user choose from a list of options.
78  '''
79 
80  while True:
81  index = getInput(default, prompt)
82 
83  try:
84  return optionsList[int(index)]
85  except ValueError:
86  logging.error('Please specify an index of the list (i.e. integer).')
87  except IndexError:
88  logging.error('The index you provided is not in the given list.')
89 
90 

References getInput(), and createfilelist.int.

Referenced by runWizard().
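
A testable sketch of the choose-by-index loop. Note one deliberate difference, flagged in the comment: negative indices are rejected, whereas plain Python list indexing in the original would silently accept, say, `-1` as the last element.

```python
import logging

def get_input_choose(options_list, default, reader):
    # Testable sketch of getInputChoose; `reader` replaces interactive
    # input. Negative indices are rejected, which plain list indexing
    # in the original would silently accept.
    while True:
        raw = reader('') or default
        try:
            index = int(raw)
            if index < 0:
                raise IndexError
            return options_list[index]
        except ValueError:
            logging.error('Please specify an index of the list (i.e. integer).')
        except IndexError:
            logging.error('The index you provided is not in the given list.')
```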

◆ getInputRepeat()

def uploadConditions.getInputRepeat (   prompt = '')
Like input() but repeats if nothing is provided and automatic strip().

Definition at line 91 of file uploadConditions.py.

91 def getInputRepeat(prompt = ''):
92  '''Like input() but repeats if nothing is provided and automatic strip().
93  '''
94 
95  while True:
96  answer = input(prompt)
97  if answer:
98  return answer.strip()
99 
100  logging.error('You need to provide a value.')
101 
102 

References input.

Referenced by runWizard().

◆ getInputWorkflow()

def uploadConditions.getInputWorkflow (   prompt = '')
Like getInput() but tailored to get target workflows (synchronization options).

Definition at line 63 of file uploadConditions.py.

63 def getInputWorkflow(prompt = ''):
64  '''Like getInput() but tailored to get target workflows (synchronization options).
65  '''
66 
67  while True:
68  workflow = getInput(defaultWorkflow, prompt)
69 
70  if workflow in frozenset(['offline', 'hlt', 'express', 'prompt', 'pcl']):
71  return workflow
72 
73  logging.error('Please specify one of the allowed workflows. See above for the explanation on each of them.')
74 
75 

References getInput().

◆ main()

def uploadConditions.main ( )
Entry point.

Definition at line 886 of file uploadConditions.py.

886 def main():
887  '''Entry point.
888  '''
889 
890  parser = optparse.OptionParser(usage =
891  'Usage: %prog [options] <file> [<file> ...]\n'
892  )
893 
894  parser.add_option('-d', '--debug',
895  dest = 'debug',
896  action="store_true",
897  default = False,
898  help = 'Switch on printing debug information. Default: %default',
899  )
900 
901  parser.add_option('-b', '--backend',
902  dest = 'backend',
903  default = defaultBackend,
904  help = 'dropBox\'s backend to upload to. Default: %default',
905  )
906 
907  parser.add_option('-H', '--hostname',
908  dest = 'hostname',
909  default = defaultHostname,
910  help = 'dropBox\'s hostname. Default: %default',
911  )
912 
913  parser.add_option('-u', '--urlTemplate',
914  dest = 'urlTemplate',
915  default = defaultUrlTemplate,
916  help = 'dropBox\'s URL template. Default: %default',
917  )
918 
919  parser.add_option('-f', '--temporaryFile',
920  dest = 'temporaryFile',
921  default = defaultTemporaryFile,
922  help = 'Temporary file that will be used to store the first tar file. Note that it then will be moved to a file with the hash of the file as its name, so there will be two temporary files created in fact. Default: %default',
923  )
924 
925  parser.add_option('-n', '--netrcHost',
926  dest = 'netrcHost',
927  default = defaultNetrcHost,
928  help = 'The netrc host (machine) from where the username and password will be read. Default: %default',
929  )
930 
931  parser.add_option('-a', '--authPath',
932  dest = 'authPath',
933  default = None,
934  help = 'The path of the .netrc file for the authentication. Default: $HOME',
935  )
936 
937  parser.add_option('-r', '--reUpload',
938  dest = 'reUpload',
939  default = None,
940  help = 'The hash of the file to upload again.',
941  )
942 
943  (options, arguments) = parser.parse_args()
944 
945  logLevel = logging.INFO
946  if options.debug:
947  logLevel = logging.DEBUG
948  logging.basicConfig(
949  format = '[%(asctime)s] %(levelname)s: %(message)s',
950  level = logLevel,
951  )
952 
953  if len(arguments) < 1:
954  if options.reUpload is None:
955  parser.print_help()
956  return -2
957  else:
958  return re_upload(options)
959  if options.reUpload is not None:
960  print("ERROR: options -r can't be specified on a new file upload.")
961  return -2
962 
963  return upload(options, arguments)
964 

References print(), re_upload(), and upload().

◆ parse_arguments()

def uploadConditions.parse_arguments ( )

Definition at line 179 of file uploadConditions.py.

179 def parse_arguments():
180  # read in command line arguments, and build metadata dictionary from them
181  parser = argparse.ArgumentParser(prog="cmsDbUpload client", description="CMS Conditions Upload Script in CondDBFW.")
182 
183  parser.add_argument("--sourceDB", type=str, help="DB to find Tags, IOVs + Payloads in.", required=False)
184 
185  # metadata arguments
186  parser.add_argument("--inputTag", type=str,\
187  help="Tag to take IOVs + Payloads from in --sourceDB.", required=False)
188  parser.add_argument("--destinationTag", type=str,\
189  help="Tag to copy IOVs + Payloads to in --destDB.", required=False)
190  parser.add_argument("--destinationDatabase", type=str,\
191  help="Database to copy IOVs + Payloads to.", required=False)
192  parser.add_argument("--since", type=int,\
193  help="Since to take IOVs from.", required=False)
194  parser.add_argument("--userText", type=str,\
195  help="Description of --destTag (can be empty).")
196 
197  # non-metadata arguments
198  parser.add_argument("--metadataFile", "-m", type=str, help="Metadata file to take metadata from.", required=False)
199 
200  parser.add_argument("--debug", required=False, action="store_true")
201  parser.add_argument("--verbose", required=False, action="store_true")
202  parser.add_argument("--testing", required=False, action="store_true")
203  parser.add_argument("--fcsr-filter", type=str, help="Synchronization to take FCSR from for local filtering of IOVs.", required=False)
204 
205  parser.add_argument("--netrc", required=False)
206 
207  parser.add_argument("--hashToUse", required=False)
208 
209  parser.add_argument("--server", required=False)
210 
211  parser.add_argument("--review-options", required=False, action="store_true")
212 
213  command_line_data = parser.parse_args()
214 
215  # default is the production server, which can point to either database anyway
216  server_alias_to_url = {
217  "prep" : "https://cms-conddb-dev.cern.ch/cmsDbCondUpload/",
218  "prod" : "https://cms-conddb.cern.ch/cmsDbCondUpload/",
219  None : "https://cms-conddb.cern.ch/cmsDbCondUpload/"
220  }
221 
222  # if prep, prod or None were given, convert to URLs in dictionary server_alias_to_url
223  # if not, assume a URL has been given and use this instead
224  if command_line_data.server in server_alias_to_url.keys():
225  command_line_data.server = server_alias_to_url[command_line_data.server]
226 
227  # use netrc to get username and password
228  try:
229  netrc_file = command_line_data.netrc
230  netrc_authenticators = netrc.netrc(netrc_file).authenticators("ConditionUploader")
231  if netrc_authenticators == None:
232  print("Your netrc file must contain the key 'ConditionUploader'.")
233  manual_input = raw_input("Do you want to try to type your credentials? ")
234  if manual_input == "y":
235  # ask for username and password
236  username = raw_input("Username: ")
237  password = getpass.getpass("Password: ")
238  else:
239  exit()
240  else:
241  print("Read your credentials from ~/.netrc. If you want to use a different file, supply its name with the --netrc argument.")
242  username = netrc_authenticators[0]
243  password = netrc_authenticators[2]
244  except:
245  print("Couldn't obtain your credentials (either from netrc or manual input).")
246  exit()
247 
248  command_line_data.username = username
249  command_line_data.password = password
250  # this will be used as the final destinationTags value by all input methods
251  # apart from the metadata file
252  command_line_data.destinationTags = {command_line_data.destinationTag:{}}
253 
254  """
255  Construct metadata_dictionary:
256  Currently, there are 3 cases:
257 
258  1) An IOV is being appended to an existing Tag with an existing Payload.
259  In this case, we just take all data from the command line.
260 
261  2) No metadata file is given, so we assume that ALL upload metadata is coming from the command line.
262 
263  3) A metadata file is given, hence we parse the file, and then iterate through command line arguments
264  since these override the options set in the metadata file.
265 
266  """
267  if command_line_data.hashToUse != None:
268  command_line_data.userText = ""
269  metadata_dictionary = command_line_data.__dict__
270  elif command_line_data.metadataFile == None:
271  command_line_data.userText = command_line_data.userText\
272  if command_line_data.userText != None\
273  else str(raw_input("Tag's description [can be empty]:"))
274  metadata_dictionary = command_line_data.__dict__
275  else:
276  metadata_dictionary = json.loads("".join(open(os.path.abspath(command_line_data.metadataFile), "r").readlines()))
277  metadata_dictionary["username"] = username
278  metadata_dictionary["password"] = password
279  metadata_dictionary["userText"] = metadata_dictionary.get("userText")\
280  if metadata_dictionary.get("userText") != None\
281  else str(raw_input("Tag's description [can be empty]:"))
282  # set the server to use to be the default one
283  metadata_dictionary["server"] = server_alias_to_url[None]
284 
285  # go through command line options and, if they are set, overwrite entries
286  for (option_name, option_value) in command_line_data.__dict__.items():
287  # if the metadata_dictionary sets this, overwrite it
288  if option_name != "destinationTags":
289  if option_value != None or (option_value == None and not(option_name in metadata_dictionary.keys())):
290  # if option_value has a value, override the metadata file entry
291  # or if option_value is None but the metadata file doesn't give a value,
292  # set the entry to None as well
293  metadata_dictionary[option_name] = option_value
294  else:
295  if option_value != {None:{}}:
296  metadata_dictionary["destinationTags"] = {option_value:{}}
297  elif option_value == {None:{}} and not("destinationTags" in metadata_dictionary.keys()):
298  metadata_dictionary["destinationTags"] = {None:{}}
299 
300  if command_line_data.review_options:
301  defaults = {
302  "since" : "Since of first IOV",
303  "userText" : "Populated by upload process",
304  "netrc" : "None given",
305  "fcsr_filter" : "Don't apply",
306  "hashToUse" : "Using local SQLite file instead"
307  }
308  print("Configuration to use for the upload:")
309  for key in metadata_dictionary:
310  if key not in ["username", "password", "destinationTag"]:
311  value_to_print = metadata_dictionary[key] if metadata_dictionary[key] != None else defaults[key]
312  print("\t%s : %s" % (key, value_to_print))
313 
314  if raw_input("\nDo you want to continue? [y/n] ") != "y":
315  exit()
316 
317  return metadata_dictionary
318 

References beamvalidation.exit(), get_directory_to_pull_to(), get_local_commit_hash(), get_version_info(), join(), data_sources.json_data_node.make(), print(), pull_code_from_git(), run_in_shell(), run_upload(), and str.

Referenced by uploads.uploader.send_metadata().
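
The override pass at the heart of case 3 (command-line values beat metadata-file values, with `destinationTags` wrapped specially) can be sketched as a pure function; `merge_metadata` is a hypothetical name for illustration:

```python
def merge_metadata(file_metadata, cli_options):
    # Command-line values override the metadata file; a None command-line
    # value only lands in the result when the file gave no value either.
    merged = dict(file_metadata)
    for name, value in cli_options.items():
        if name == "destinationTags":
            continue
        if value is not None or name not in merged:
            merged[name] = value
    # destinationTag is wrapped into the {tag: {}} shape the server expects.
    tag = cli_options.get("destinationTag")
    if tag is not None:
        merged["destinationTags"] = {tag: {}}
    elif "destinationTags" not in merged:
        merged["destinationTags"] = {None: {}}
    return merged
```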

◆ pull_code_from_git()

def uploadConditions.pull_code_from_git (   target_directory,
  repository_url,
  hash 
)
Pulls CondDBFW from the git repository specified by the upload server.

Definition at line 98 of file uploadConditions.py.

98 def pull_code_from_git(target_directory, repository_url, hash):
99  """
100  Pulls CondDBFW from the git repository specified by the upload server.
101  """
102  # make directory
103  target = os.path.abspath(target_directory)
104  sys.path.append(target)
105  conddbfw_directory = os.path.join(target, "CondDBFW")
106  git_directory = os.path.join(conddbfw_directory, ".git")
107  if not(os.path.exists(conddbfw_directory)):
108  os.mkdir(conddbfw_directory)
109  else:
110  # if the directory exists, it may contain things - prompt the user
111  force_pull = str(raw_input("CondDBFW directory isn't empty - empty it, and update to new version? [y/n] "))
112  if force_pull == "y":
113  # empty directory and delete it
114  run_in_shell("rm -rf CondDBFW", shell=True)
115  # remake the directory - it will be empty
116  os.mkdir(conddbfw_directory)
117 
118  print("Pulling code back from repository...")
119  print(horizontal_rule)
120 
121  run_in_shell("git --git-dir=%s clone %s CondDBFW" % (git_directory, repository_url), shell=True)
122  # --force makes sure we ignore any conflicts that
123  # could occur and overwrite everything in the checkout
124  run_in_shell("cd %s && git checkout --force -b version_used %s" % (conddbfw_directory, hash), shell=True)
125 
126  # write the hash to a file in the CondDBFW directory so we can delete the git repository
127  hash_file_handle = open(os.path.join(conddbfw_directory, ".commit_hash"), "w")
128  hash_file_handle.write(hash)
129  hash_file_handle.close()
130 
131  # can now delete .git directory
132  shutil.rmtree(git_directory)
133 
134  print(horizontal_rule)
135  print("Creating local log directories (if required)...")
136  if not(os.path.exists(os.path.join(target, "upload_logs"))):
137  os.mkdir(os.path.join(target, "upload_logs"))
138  if not(os.path.exists(os.path.join(target, "server_side_logs"))):
139  os.mkdir(os.path.join(target, "server_side_logs"))
140  print("Finished with log directories.")
141  print("Update of CondDBFW complete.")
142 
143  print(horizontal_rule)
144 
145  return True
146 

References print(), run_in_shell(), and str.

Referenced by parse_arguments().
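
The clone-pin-strip sequence can be sketched with `subprocess.check_call`, which raises on failure instead of relying on `run_in_shell`. A sketch under the assumption that a `git` binary is available; the interactive prompt and log-directory creation are omitted:

```python
import os
import shutil
import subprocess

def pull_code_from_git(target_directory, repository_url, commit_hash):
    # Clone into target/CondDBFW, pin the requested commit on a throwaway
    # branch, record the hash, then drop the .git directory so only the
    # checked-out code remains.
    target = os.path.abspath(target_directory)
    conddbfw = os.path.join(target, "CondDBFW")
    subprocess.check_call(["git", "clone", "-q", repository_url, conddbfw])
    subprocess.check_call(["git", "checkout", "--force", "-b", "version_used",
                           commit_hash], cwd=conddbfw)
    with open(os.path.join(conddbfw, ".commit_hash"), "w") as handle:
        handle.write(commit_hash)
    shutil.rmtree(os.path.join(conddbfw, ".git"))
    return conddbfw
```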

◆ re_upload()

def uploadConditions.re_upload (   options)

Definition at line 802 of file uploadConditions.py.

802 def re_upload( options ):
803  netrcPath = None
804  logDbSrv = prodLogDbSrv
805  if options.hostname == defaultDevHostname:
806  logDbSrv = devLogDbSrv
807  if options.authPath is not None:
808  netrcPath = os.path.join( options.authPath,'.netrc' )
809  try:
810  netrcKey = '%s/%s' %(logDbSrv,logDbSchema)
811  # Try to find the netrc entry
812  (username, account, password) = netrc.netrc( netrcPath ).authenticators( netrcKey )
813  except IOError as e:
814  logging.error('Cannot access netrc file.')
815  return 1
816  except Exception as e:
817  logging.error('Netrc file is invalid: %s' %str(e))
818  return 1
819  conStr = '%s/%s@%s' %(username,password,logDbSrv)
820  con = cx_Oracle.connect( conStr )
821  cur = con.cursor()
822  fh = options.reUpload
823  cur.execute('SELECT FILECONTENT, STATE FROM FILES WHERE FILEHASH = :HASH',{'HASH':fh})
824  res = cur.fetchall()
825  found = False
826  fdata = None
827  for r in res:
828  found = True
829  logging.info("Found file %s in state '%s'" %(fh,r[1]))
830  fdata = r[0].read().decode('bz2')
831  con.close()
832  if not found:
833  logging.error("No file uploaded found with hash %s" %fh)
834  return 1
835  # writing as a tar file and opening it (is there a way to open it in memory?)
836  fname = '%s.tar' %fh
837  with open(fname, "wb" ) as f:
838  f.write(fdata)
839  rname = 'reupload_%s' %fh
840  with tarfile.open(fname) as tar:
841  tar.extractall()
842  os.remove(fname)
843  dfile = 'data.db'
844  mdfile = 'metadata.txt'
845  if os.path.exists(dfile):
846  os.utime(dfile,None)
847  os.chmod(dfile,0o755)
848  os.rename(dfile,'%s.db' %rname)
849  else:
850  logging.error('Tar file does not contain the data file')
851  return 1
852  if os.path.exists(mdfile):
853  os.utime(mdfile,None)
854  os.chmod(mdfile,0o755)
855  mdata = None
856  with open(mdfile) as md:
857  mdata = json.load(md)
858  datelabel = datetime.now().strftime("%y-%m-%d %H:%M:%S")
859  if mdata is None:
860  logging.error('Metadata file is empty.')
861  return 1
862  logging.debug('Preparing new metadata file...')
863  mdata['userText'] = 'reupload %s : %s' %(datelabel,mdata['userText'])
864  with open( '%s.txt' %rname, 'wb') as jf:
865  jf.write( json.dumps( mdata, sort_keys=True, indent = 2 ) )
866  jf.write('\n')
867  os.remove(mdfile)
868  else:
869  logging.error('Tar file does not contain the metadata file')
870  return 1
871  logging.info('Files %s prepared for the upload.' %rname)
872  arguments = [rname]
873  return upload(options, arguments)
874 

References edm.decode(), readEcalDQMStatus.read, str, and upload().

Referenced by main().
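
One portability note: `r[0].read().decode('bz2')` relies on the Python 2 `bz2` codec, which no longer exists on `bytes` in Python 3. A Python 3 sketch of the unpack step, done in memory so no intermediate `<hash>.tar` file is needed (`unpack_uploaded_blob` is a hypothetical helper):

```python
import bz2
import io
import tarfile

def unpack_uploaded_blob(blob, dest_dir):
    # Decompress the bz2 blob stored in the FILES table and extract the
    # tar members, all in memory (Python 3 replacement for .decode('bz2')).
    data = bz2.decompress(blob)
    with tarfile.open(fileobj=io.BytesIO(data)) as tar:
        tar.extractall(dest_dir)
```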

◆ run_in_shell()

def uploadConditions.run_in_shell ( *  popenargs,
**  kwargs 
)
Runs string-based commands in the shell and returns the result.

Definition at line 147 of file uploadConditions.py.

147 def run_in_shell(*popenargs, **kwargs):
148  """
149  Runs string-based commands in the shell and returns the result.
150  """
151  out = subprocess.PIPE if kwargs.get("stdout") == None else kwargs.get("stdout")
152  new_kwargs = kwargs
153  if new_kwargs.get("stdout"):
154  del new_kwargs["stdout"]
155  process = subprocess.Popen(*popenargs, stdout=out, **new_kwargs)
156  stdout = process.communicate()[0]
157  returnCode = process.returncode
158  cmd = kwargs.get('args')
159  if cmd is None:
160  cmd = popenargs[0]
161  if returnCode:
162  raise subprocess.CalledProcessError(returnCode, cmd)
163  return stdout
164 

Referenced by parse_arguments(), and pull_code_from_git().
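
A condensed sketch of the same helper: capture stdout unless the caller supplies one, and raise `CalledProcessError` on a non-zero exit code, mirroring what `subprocess.check_output` does internally:

```python
import subprocess

def run_in_shell(*popenargs, **kwargs):
    # Run a command, capture stdout by default, and raise on failure.
    out = kwargs.pop("stdout", None) or subprocess.PIPE
    process = subprocess.Popen(*popenargs, stdout=out, **kwargs)
    stdout, _ = process.communicate()
    if process.returncode:
        cmd = kwargs.get("args") or popenargs[0]
        raise subprocess.CalledProcessError(process.returncode, cmd)
    return stdout
```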

◆ run_upload()

def uploadConditions.run_upload ( **  parameters)
Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.

Definition at line 165 of file uploadConditions.py.

165 def run_upload(**parameters):
166  """
167  Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.
168  """
169  try:
170  import CondDBFW.uploads as uploads
171  except Exception as e:
172  traceback.print_exc()
173  exit("CondDBFW or one of its dependencies could not be imported.\n"\
174  + "If the CondDBFW directory exists, you are likely not in a CMSSW environment.")
175  # we have CondDBFW, so just call the module with the parameters given in the command line
176  uploader = uploads.uploader(**parameters)
177  result = uploader.upload()
178 

References beamvalidation.exit().

Referenced by parse_arguments().

◆ runWizard()

def uploadConditions.runWizard (   basename,
  dataFilename,
  metadataFilename 
)

Definition at line 103 of file uploadConditions.py.

103 def runWizard(basename, dataFilename, metadataFilename):
104  while True:
105  print('''\nWizard for metadata for %s
106 
107 I will ask you some questions to fill the metadata file. For some of the questions there are defaults between square brackets (i.e. []), leave empty (i.e. hit Enter) to use them.''' % basename)
108 
109  # Try to get the available inputTags
110  dataConnection = sqlite3.connect(dataFilename)
111  dataCursor = dataConnection.cursor()
112 
113  dataCursor.execute('select NAME from TAG')
114  records = dataCursor.fetchall()
115  inputTags = []
116  for rec in records:
117  inputTags.append(rec[0])
118 
119  if len(inputTags) == 0:
120  raise Exception("Could not find any input tag in the data file.")
121 
122  else:
123  print('\nI found the following input tags in your SQLite data file:')
124  for (index, inputTag) in enumerate(inputTags):
125  print(' %s) %s' % (index, inputTag))
126 
127  inputTag = getInputChoose(inputTags, '0',
128  '\nWhich is the input tag (i.e. the tag to be read from the SQLite data file)?\ne.g. 0 (you select the first in the list)\ninputTag [0]: ')
129 
130  destinationDatabase = ''
131  ntry = 0
132  while ( destinationDatabase != 'oracle://cms_orcon_prod/CMS_CONDITIONS' and destinationDatabase != 'oracle://cms_orcoff_prep/CMS_CONDITIONS' ):
133  if ntry==0:
134  inputMessage = \
135  '\nWhich is the destination database where the tags should be exported? \nPossible choices: oracle://cms_orcon_prod/CMS_CONDITIONS (or prod); oracle://cms_orcoff_prep/CMS_CONDITIONS (or prep) \ndestinationDatabase: '
136  elif ntry==1:
137  inputMessage = \
138  '\nPlease choose one of the two valid destinations: \noracle://cms_orcon_prod/CMS_CONDITIONS (for prod) or oracle://cms_orcoff_prep/CMS_CONDITIONS (for prep) \
139 \ndestinationDatabase: '
140  else:
141  raise Exception('No valid destination chosen. Bailing out...')
142  destinationDatabase = getInputRepeat(inputMessage)
143  if destinationDatabase == 'prod':
144  destinationDatabase = 'oracle://cms_orcon_prod/CMS_CONDITIONS'
145  if destinationDatabase == 'prep':
146  destinationDatabase = 'oracle://cms_orcoff_prep/CMS_CONDITIONS'
147  ntry += 1
148 
149  while True:
150  since = getInput('',
151  '\nWhich is the given since? (if not specified, the one from the SQLite data file will be taken -- note that even if specified, still this may not be the final since, depending on the synchronization options you select later: if the synchronization target is not offline, and the since you give is smaller than the next possible one (i.e. you give a run number earlier than the one which will be started/processed next in prompt/hlt/express), the DropBox will move the since ahead to go to the first safe run instead of the value you gave)\ne.g. 1234\nsince []: ')
152  if not since:
153  since = None
154  break
155  else:
156  try:
157  since = int(since)
158  break
159  except ValueError:
160  logging.error('The since value has to be an integer or empty (null).')
161 
162  userText = getInput('',
163  '\nWrite any comments/text you may want to describe your request\ne.g. Muon alignment scenario for...\nuserText []: ')
164 
165  destinationTags = {}
166  while True:
167  destinationTag = getInput('',
168  '\nWhich is the next destination tag to be added (leave empty to stop)?\ne.g. BeamSpotObjects_PCL_byRun_v0_offline\ndestinationTag []: ')
169  if not destinationTag:
170  if len(destinationTags) == 0:
171  logging.error('There must be at least one destination tag.')
172  continue
173  break
174 
175  if destinationTag in destinationTags:
176  logging.warning(
177  'You already added this destination tag. Overwriting the previous one with this new one.')
178 
179  destinationTags[destinationTag] = {
180  }
181 
182  metadata = {
183  'destinationDatabase': destinationDatabase,
184  'destinationTags': destinationTags,
185  'inputTag': inputTag,
186  'since': since,
187  'userText': userText,
188  }
189 
190  metadata = json.dumps(metadata, sort_keys=True, indent=4)
191  print('\nThis is the generated metadata:\n%s' % metadata)
192 
193  if getInput('n',
194  '\nIs it fine (i.e. save in %s and *upload* the conditions if this is the latest file)?\nAnswer [n]: ' % metadataFilename).lower() == 'y':
195  break
196  logging.info('Saving generated metadata in %s...', metadataFilename)
197  with open(metadataFilename, 'w') as metadataFile:
198  metadataFile.write(metadata)
199 

References getInput(), getInputChoose(), getInputRepeat(), createfilelist.int, and print().

Referenced by uploadAllFiles().
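
The document the wizard ultimately writes has a fixed shape. A sketch that builds it non-interactively (`build_metadata` is a hypothetical helper using the same keys as the wizard):

```python
import json

def build_metadata(input_tag, destination_database, destination_tags,
                   since=None, user_text=''):
    # Assemble the same JSON document runWizard writes to the metadata file.
    metadata = {
        'destinationDatabase': destination_database,
        'destinationTags': {tag: {} for tag in destination_tags},
        'inputTag': input_tag,
        'since': since,
        'userText': user_text,
    }
    return json.dumps(metadata, sort_keys=True, indent=4)
```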

◆ testTier0Upload()

def uploadConditions.testTier0Upload ( )

Definition at line 965 of file uploadConditions.py.

965 def testTier0Upload():
966 
967  global defaultNetrcHost
968 
969  (username, account, password) = netrc.netrc().authenticators(defaultNetrcHost)
970 
971  filenames = ['testFiles/localSqlite-top2']
972 
973  uploadTier0Files(filenames, username, password, cookieFileName = None)
974 
975 

References uploadTier0Files().

◆ upload()

def uploadConditions.upload (   options,
  arguments 
)

Definition at line 875 of file uploadConditions.py.

875 def upload(options, arguments):
876  results = uploadAllFiles(options, arguments)
877 
878  if 'status' not in results:
879  print('Unexpected error.')
880  return -1
881  ret = results['status']
882  print(results)
883  print("upload ended with code: %s" %ret)
884  return ret
885 

References print(), and uploadAllFiles().

Referenced by main(), and re_upload().

◆ uploadAllFiles()

def uploadConditions.uploadAllFiles (   options,
  arguments 
)

Definition at line 647 of file uploadConditions.py.

647 def uploadAllFiles(options, arguments):
648 
649  ret = {}
650  ret['status'] = 0
651 
652  # Check that we can read the data and metadata files
653  # If the metadata file does not exist, start the wizard
654  for filename in arguments:
655  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
656  basename = os.path.basename(basepath)
657  dataFilename = '%s.db' % basepath
658  metadataFilename = '%s.txt' % basepath
659 
660  logging.info('Checking %s...', basename)
661 
662  # Data file
663  try:
664  with open(dataFilename, 'rb') as dataFile:
665  pass
666  except IOError as e:
667  errMsg = 'Impossible to open SQLite data file %s' %dataFilename
668  logging.error( errMsg )
669  ret['status'] = -3
670  ret['error'] = errMsg
671  return ret
672 
673  # Check the data file
674  empty = True
675  try:
676  dbcon = sqlite3.connect( dataFilename )
677  dbcur = dbcon.cursor()
678  dbcur.execute('SELECT * FROM IOV')
679  rows = dbcur.fetchall()
680  for r in rows:
681  empty = False
682  dbcon.close()
683  if empty:
684  errMsg = 'The input SQLite data file %s contains no data.' %dataFilename
685  logging.error( errMsg )
686  ret['status'] = -4
687  ret['error'] = errMsg
688  return ret
689  except Exception as e:
690  errMsg = 'Check on input SQLite data file %s failed: %s' %(dataFilename,str(e))
691  logging.error( errMsg )
692  ret['status'] = -5
693  ret['error'] = errMsg
694  return ret
695 
696  # Metadata file
697  try:
698  with open(metadataFilename, 'rb') as metadataFile:
699  pass
700  except IOError as e:
701  if e.errno != errno.ENOENT:
702  errMsg = 'Impossible to open file %s (for other reason than not existing)' %metadataFilename
703  logging.error( errMsg )
704  ret['status'] = -4
705  ret['error'] = errMsg
706  return ret
707 
708  if getInput('y', '\nIt looks like the metadata file %s does not exist. Do you want me to create it and help you fill it?\nAnswer [y]: ' % metadataFilename).lower() != 'y':
709  errMsg = 'Metadata file %s does not exist' %metadataFilename
710  logging.error( errMsg )
711  ret['status'] = -5
712  ret['error'] = errMsg
713  return ret
714  # Wizard
715  runWizard(basename, dataFilename, metadataFilename)
716 
717  # Upload files
718  try:
719  dropBox = ConditionsUploader(options.hostname, options.urlTemplate)
720 
721  # Authentication
722  username, password = getCredentials(options)
723 
724  results = {}
725  for filename in arguments:
726  backend = options.backend
727  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
728  metadataFilename = '%s.txt' % basepath
729  with open(metadataFilename, 'rb') as metadataFile:
730  metadata = json.load( metadataFile )
731  # When dest db = prep the hostname has to be set to dev.
732  forceHost = False
733  destDb = metadata['destinationDatabase']
734  if destDb.startswith('oracle://cms_orcon_prod') or destDb.startswith('oracle://cms_orcoff_prep'):
735  hostName = defaultHostname
736  if destDb.startswith('oracle://cms_orcoff_prep'):
737  hostName = defaultDevHostname
738  dropBox.setHost( hostName )
739  authRet = dropBox.signIn( username, password )
740  if not authRet==0:
741  msg = "Error trying to connect to the server. Aborting."
742  if authRet==-2:
743  msg = "Error while signing in. Aborting."
744  logging.error(msg)
745  return { 'status' : authRet, 'error' : msg }
746  results[filename] = dropBox.uploadFile(filename, options.backend, options.temporaryFile)
747  else:
748  results[filename] = False
749  logging.error("DestinationDatabase %s is not valid. Skipping the upload." %destDb)
750  if not results[filename]:
751  if ret['status']<0:
752  ret['status'] = 0
753  ret['status'] += 1
754  ret['files'] = results
755  logging.debug("all files processed, logging out now.")
756 
757  dropBox.signOut()
758 
759  except HTTPError as e:
760  logging.error('got HTTP error: %s', str(e))
761  return { 'status' : -1, 'error' : str(e) }
762 
763  return ret
764 

References getCredentials(), getInput(), runWizard(), and str.

Referenced by upload().
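The emptiness check that `uploadAllFiles()` runs on each input SQLite file (status -4 if the `IOV` table has no rows) is self-contained enough to sketch on its own. The schema below is a minimal stand-in, not the real conditions schema:

```python
import os
import sqlite3
import tempfile

# Standalone sketch of the check in uploadAllFiles(): refuse the upload
# if the IOV table of the input SQLite file contains no rows.
def iov_table_is_empty(dataFilename):
    dbcon = sqlite3.connect(dataFilename)
    try:
        dbcur = dbcon.cursor()
        dbcur.execute('SELECT * FROM IOV')
        return dbcur.fetchone() is None  # no need to fetch every row
    finally:
        dbcon.close()

# Demo against a throwaway file with a minimal stand-in schema.
path = os.path.join(tempfile.mkdtemp(), 'demo.db')
con = sqlite3.connect(path)
con.execute('CREATE TABLE IOV (since INTEGER)')
con.commit()
assert iov_table_is_empty(path)

con.execute('INSERT INTO IOV VALUES (1)')
con.commit()
con.close()
assert not iov_table_is_empty(path)
```

Unlike the original, which calls `fetchall()` and loops, `fetchone()` answers the "is it empty?" question after reading at most one row, which matters for large payload files.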

◆ uploadTier0Files()

def uploadConditions.uploadTier0Files (   filenames,
  username,
  password,
  cookieFileName = None 
)
Uploads a bunch of files coming from Tier0.
This has the following requirements:
    * Username/Password based authentication.
    * Uses the online backend.
    * Ignores errors related to the upload/content (e.g. duplicated file).

Definition at line 765 of file uploadConditions.py.

765 def uploadTier0Files(filenames, username, password, cookieFileName = None):
766  '''Uploads a bunch of files coming from Tier0.
767  This has the following requirements:
768  * Username/Password based authentication.
769  * Uses the online backend.
770  * Ignores errors related to the upload/content (e.g. duplicated file).
771  '''
772 
773  dropBox = ConditionsUploader()
774 
775  dropBox.signIn(username, password)
776 
777  for filename in filenames:
778  try:
779  result = dropBox.uploadFile(filename, backend = 'test')
780  except HTTPError as e:
781  if e.code == 400:
782  # 400 Bad Request: This is an exception related to the upload
783  # being wrong for some reason (e.g. duplicated file).
784  # Since for Tier0 this is not an issue, continue
785  logging.error('HTTP Exception 400 Bad Request: Upload-related, skipping. Message: %s', e)
786  continue
787 
788  # In any other case, re-raise.
789  raise
790 
791  #-toDo: add a flag to say if we should retry or not. So far, all retries are done server-side (Tier-0),
792  # if we flag as failed any retry would not help and would result in the same error (e.g.
793  # when a file with an identical hash is uploaded again)
794  #-review(2015-09-25): get feedback from tests at Tier-0 (action: AP)
795 
796  if not result: # dropbox reported an error when uploading, do not retry.
797  logging.error('Error from dropbox, upload-related, skipping.')
798  continue
799 
800  dropBox.signOut()
801 

Referenced by testTier0Upload().
