
uploadConditions Namespace Reference

Classes

class  ConditionsUploader
 
class  HTTP
 
class  HTTPError
 

Functions

def addToTarFile (tarFile, fileobj, arcname)
 
def get_directory_to_pull_to (default_directory, commit_hash)
 
def get_local_commit_hash ()
 
def get_version_info (url)
 
def getCredentials (options)
 
def getInput (default, prompt='')
 
def getInputChoose (optionsList, default, prompt='')
 
def getInputRepeat (prompt='')
 
def getInputWorkflow (prompt='')
 
def main ()
 
def parse_arguments ()
 
def pull_code_from_git (target_directory, repository_url, hash)
 
def re_upload (options)
 
def run_in_shell (*popenargs, **kwargs)
 
def run_upload (**parameters)
 
def runWizard (basename, dataFilename, metadataFilename)
 
def testTier0Upload ()
 
def upload (options, arguments)
 
def uploadAllFiles (options, arguments)
 
def uploadTier0Files (filenames, username, password, cookieFileName=None)
 

Detailed Description

Joshua Dawes - CERN, CMS - The University of Manchester

Upload script wrapper - controls the automatic update system.

Note: the name of this file follows a different convention from the others because it has to be the same as the current upload script's name.

Takes user arguments and passes them to the main upload module CondDBFW.uploads, once the correct version exists.

1. Ask the server corresponding to the database we're uploading to which version of CondDBFW it has (query the /conddbfw_version/ url).
2. Decide which directory we can write to - either the current local directory, or /tmp/random_string/.
3. Pull the commit returned from the server into the directory from step 2.
4. Invoke the CondDBFW.uploads module with the arguments given to this script (a sketch of this flow is given at the end of this description).
Script that uploads to the new CMS conditions uploader.
Adapted to the new infrastructure from v6 of Miguel Ojeda's upload.py script for the DropBox.
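
For orientation, the sketch below strings the four numbered steps above together using the functions documented on this page. It is an illustrative outline only, not the literal body of the script; in particular, the keys read from the version dictionary ("hash", "repo") are assumptions made for the example.

import os

def bootstrap_and_upload(server_url):
    # 1. ask the server which CondDBFW commit it serves (queries <server_url>/conddbfw_version/)
    version_info = get_version_info(server_url)
    server_hash = version_info["hash"]          # key name assumed for illustration

    # 2. find a directory we are allowed to write to (local directory or a /tmp/ fallback)
    directory = get_directory_to_pull_to(os.getcwd(), server_hash)

    # 3. pull that commit if the local CondDBFW/ checkout is missing or out of date
    if get_local_commit_hash() != server_hash:
        pull_code_from_git(directory, version_info["repo"], server_hash)   # "repo" key assumed

    # 4. hand the command-line metadata over to CondDBFW.uploads
    upload_metadata = parse_arguments()
    run_upload(**upload_metadata)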

Function Documentation

def uploadConditions.addToTarFile(tarFile, fileobj, arcname)

Definition at line 438 of file uploadConditions.py.

Referenced by uploadConditions.ConditionsUploader.uploadFile().

438 def addToTarFile(tarFile, fileobj, arcname):
439  tarInfo = tarFile.gettarinfo(fileobj = fileobj, arcname = arcname)
440  tarInfo.mode = 0o400
441  tarInfo.uid = tarInfo.gid = tarInfo.mtime = 0
442  tarInfo.uname = tarInfo.gname = 'root'
443  tarFile.addfile(tarInfo, fileobj)
444 
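As a usage illustration, the sketch below packs a SQLite payload and its metadata into a tar archive in the way uploadFile() is expected to use this helper; the file names are placeholders.

import tarfile

# "data.db" and "metadata.txt" are placeholder names for the payload and its metadata
with tarfile.open("upload.tar", "w") as tar_file:
    with open("data.db", "rb") as data_file, open("metadata.txt", "rb") as metadata_file:
        addToTarFile(tar_file, data_file, "data.db")
        addToTarFile(tar_file, metadata_file, "metadata.txt")

Because the helper normalises the mode, owner and mtime of every member, two archives built from identical content are byte-identical, which presumably keeps the hash computed for the upload stable.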
def uploadConditions.get_directory_to_pull_to(default_directory, commit_hash)
Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.

Definition at line 72 of file uploadConditions.py.

References cmsRelvalreport.exit.

Referenced by parse_arguments().

72 def get_directory_to_pull_to(default_directory, commit_hash):
73  """
74  Finds out which directory we can safely use - either CondDBFW/ or a temporary directory.
75  """
76  # try to write a file (and then delete it)
77  try:
78  handle = open(os.path.join(default_directory, "test_file"), "w")
79  handle.write("test")
80  handle.close()
81  os.remove(os.path.join(default_directory, "test_file"))
82  sys.path.insert(0, default_directory)
83  return default_directory
84  except IOError as io:
85  # cannot write to default directory, so set up a directory in /tmp/
86  new_path = os.path.join("tmp", commit_hash[0:10])
87  if not(os.path.exists(new_path)):
88  os.mkdir(new_path)
89  sys.path.insert(0, new_path)
90  return new_path
91  else:
92  # for now, fail
93  exit("Can't find anywhere to pull the new code base to.")
94 
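A minimal usage sketch, assuming the commit hash has already been obtained from the server (the hash value is a placeholder):

import os

server_hash = "0123456789abcdef0123456789abcdef01234567"   # placeholder commit hash
target = get_directory_to_pull_to(os.getcwd(), server_hash)
print("Pulling CondDBFW into %s" % target)

Note that the chosen directory is also prepended to sys.path, so the freshly pulled CondDBFW package can be imported afterwards.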
def uploadConditions.get_local_commit_hash()
Gets the commit hash used by the local repository CondDBFW/.git/.

Definition at line 49 of file uploadConditions.py.

References cmsRelvalreport.exit, edm.print(), and digitizers_cfi.strip.

Referenced by parse_arguments().

49 def get_local_commit_hash():
50  """
51  Gets the commit hash used by the local repository CondDBFW/.git/.
52  """
53  directory = os.path.abspath("CondDBFW")
54 
55  # get the commit hash of the code in `directory`
56  # by reading the .commit_hash file
57  try:
58  commit_hash_file_handle = open(os.path.join(directory, ".commit_hash"), "r")
59  commit_hash = commit_hash_file_handle.read().strip()
60 
61  # validate length of the commit hash
62  if len(commit_hash) != 40:
63  print("Commit hash found is not valid. Must be 40 characters long.")
64  exit()
65 
66  #commit_hash = run_in_shell("git --git-dir=%s rev-parse HEAD" % (os.path.join(directory, ".git")), shell=True).strip()
67 
68  return commit_hash
69  except Exception:
70  return None
71 
def uploadConditions.get_version_info(url)
Queries the server-side for the commit hash it is currently using.
Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.

Definition at line 31 of file uploadConditions.py.

Referenced by parse_arguments().

31 def get_version_info(url):
32  """
33  Queries the server-side for the commit hash it is currently using.
34  Note: this is the commit hash used by /data/services/common/CondDBFW on the server-side.
35  """
36  request = pycurl.Curl()
37  request.setopt(request.CONNECTTIMEOUT, 60)
38  user_agent = "User-Agent: ConditionWebServices/1.0 python/%d.%d.%d PycURL/%s" % (sys.version_info[ :3 ] + (pycurl.version_info()[1],))
39  request.setopt(request.USERAGENT, user_agent)
40  # we don't need to verify who signed the certificate or who the host is
41  request.setopt(request.SSL_VERIFYPEER, 0)
42  request.setopt(request.SSL_VERIFYHOST, 0)
43  response_buffer = StringIO()
44  request.setopt(request.WRITEFUNCTION, response_buffer.write)
45  request.setopt(request.URL, url + "conddbfw_version/")
46  request.perform()
47  return json.loads(response_buffer.getvalue())
48 
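A usage sketch; the production URL below is the one listed in server_alias_to_url in parse_arguments(), and the layout of the returned dictionary is whatever the /conddbfw_version/ endpoint serves (not documented on this page). Note that the URL must end with a slash, since "conddbfw_version/" is simply appended to it.

version_info = get_version_info("https://cms-conddb.cern.ch/cmsDbCondUpload/")
print(version_info)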
def uploadConditions.getCredentials(options)

Definition at line 626 of file uploadConditions.py.

References getInput().

Referenced by uploadAllFiles().

626 def getCredentials( options ):
627 
628  username = None
629  password = None
630  netrcPath = None
631  if authPathEnvVar in os.environ:
632  authPath = os.environ[authPathEnvVar]
633  netrcPath = os.path.join(authPath,'.netrc')
634  if options.authPath is not None:
635  netrcPath = os.path.join( options.authPath,'.netrc' )
636  try:
637  # Try to find the netrc entry
638  (username, account, password) = netrc.netrc( netrcPath ).authenticators(options.netrcHost)
639  except Exception:
640  # netrc entry not found, ask for the username and password
641  logging.info(
642  'netrc entry "%s" not found: if you wish not to have to retype your password, you can add an entry in your .netrc file. However, beware of the risks of having your password stored as plaintext. Instead.',
643  options.netrcHost)
644 
645  # Try to get a default username
646  defaultUsername = getpass.getuser()
647  if defaultUsername is None:
648  defaultUsername = '(not found)'
649 
650  username = getInput(defaultUsername, '\nUsername [%s]: ' % defaultUsername)
651  password = getpass.getpass('Password: ')
652 
653  return username, password
654 
655 
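For reference, a sketch of the .netrc entry this function looks for, and of the equivalent lookup done by hand; the machine name must be the value of options.netrcHost (defaultNetrcHost by default) and the credentials are placeholders.

# ~/.netrc (or the .netrc under options.authPath / the auth-path environment variable):
#
#   machine <netrcHost>
#   login my_username
#   password my_password

import netrc, os
entry = netrc.netrc(os.path.expanduser("~/.netrc")).authenticators("<netrcHost>")
if entry is not None:
    username, account, password = entry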
def uploadConditions.getInput(default, prompt='')
Like raw_input() but with a default and automatic strip().

Definition at line 51 of file uploadConditions.py.

Referenced by getCredentials(), getInputChoose(), getInputWorkflow(), runWizard(), and uploadAllFiles().

51 def getInput(default, prompt = ''):
52  '''Like raw_input() but with a default and automatic strip().
53  '''
54 
55  answer = raw_input(prompt)
56  if answer:
57  return answer.strip()
58 
59  return default.strip()
60 
61 
def uploadConditions.getInputChoose(optionsList, default, prompt='')
Makes the user choose from a list of options.

Definition at line 75 of file uploadConditions.py.

References getInput(), and createfilelist.int.

Referenced by runWizard().

75 def getInputChoose(optionsList, default, prompt = ''):
76  '''Makes the user choose from a list of options.
77  '''
78 
79  while True:
80  index = getInput(default, prompt)
81 
82  try:
83  return optionsList[int(index)]
84  except ValueError:
85  logging.error('Please specify an index of the list (i.e. integer).')
86  except IndexError:
87  logging.error('The index you provided is not in the given list.')
88 
89 
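A usage sketch modelled on how runWizard() below offers the input tags found in the SQLite file (the tag names are placeholders):

inputTags = ["BeamSpotObject_ByRun", "BeamSpotObjects_ByLumi"]   # placeholder tag names
for (index, tag) in enumerate(inputTags):
    print(" %s) %s" % (index, tag))
inputTag = getInputChoose(inputTags, "0", "inputTag [0]: ")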
def uploadConditions.getInputRepeat(prompt='')
Like raw_input() but repeats if nothing is provided and automatic strip().

Definition at line 90 of file uploadConditions.py.

Referenced by runWizard().

90 def getInputRepeat(prompt = ''):
91  '''Like raw_input() but repeats if nothing is provided and automatic strip().
92  '''
93 
94  while True:
95  answer = raw_input(prompt)
96  if answer:
97  return answer.strip()
98 
99  logging.error('You need to provide a value.')
100 
101 
def uploadConditions.getInputWorkflow(prompt='')
Like getInput() but tailored to get target workflows (synchronization options).

Definition at line 62 of file uploadConditions.py.

References getInput().

62 def getInputWorkflow(prompt = ''):
63  '''Like getInput() but tailored to get target workflows (synchronization options).
64  '''
65 
66  while True:
67  workflow = getInput(defaultWorkflow, prompt)
68 
69  if workflow in frozenset(['offline', 'hlt', 'express', 'prompt', 'pcl']):
70  return workflow
71 
72  logging.error('Please specify one of the allowed workflows. See above for the explanation on each of them.')
73 
74 
def uploadConditions.main()
Entry point.

Definition at line 896 of file uploadConditions.py.

References re_upload(), and upload().

896 def main():
897  '''Entry point.
898  '''
899 
900  parser = optparse.OptionParser(usage =
901  'Usage: %prog [options] <file> [<file> ...]\n'
902  )
903 
904  parser.add_option('-d', '--debug',
905  dest = 'debug',
906  action="store_true",
907  default = False,
908  help = 'Switch on printing debug information. Default: %default',
909  )
910 
911  parser.add_option('-b', '--backend',
912  dest = 'backend',
913  default = defaultBackend,
914  help = 'dropBox\'s backend to upload to. Default: %default',
915  )
916 
917  parser.add_option('-H', '--hostname',
918  dest = 'hostname',
919  default = defaultHostname,
920  help = 'dropBox\'s hostname. Default: %default',
921  )
922 
923  parser.add_option('-u', '--urlTemplate',
924  dest = 'urlTemplate',
925  default = defaultUrlTemplate,
926  help = 'dropBox\'s URL template. Default: %default',
927  )
928 
929  parser.add_option('-f', '--temporaryFile',
930  dest = 'temporaryFile',
931  default = defaultTemporaryFile,
932  help = 'Temporary file that will be used to store the first tar file. Note that it then will be moved to a file with the hash of the file as its name, so there will be two temporary files created in fact. Default: %default',
933  )
934 
935  parser.add_option('-n', '--netrcHost',
936  dest = 'netrcHost',
937  default = defaultNetrcHost,
938  help = 'The netrc host (machine) from where the username and password will be read. Default: %default',
939  )
940 
941  parser.add_option('-a', '--authPath',
942  dest = 'authPath',
943  default = None,
944  help = 'The path of the .netrc file for the authentication. Default: $HOME',
945  )
946 
947  parser.add_option('-r', '--reUpload',
948  dest = 'reUpload',
949  default = None,
950  help = 'The hash of the file to upload again.',
951  )
952 
953  (options, arguments) = parser.parse_args()
954 
955  logLevel = logging.INFO
956  if options.debug:
957  logLevel = logging.DEBUG
958  logging.basicConfig(
959  format = '[%(asctime)s] %(levelname)s: %(message)s',
960  level = logLevel,
961  )
962 
963  if len(arguments) < 1:
964  if options.reUpload is None:
965  parser.print_help()
966  return -2
967  else:
968  return re_upload(options)
969  if options.reUpload is not None:
970  print "ERROR: options -r can't be specified on a new file upload."
971  return -2
972 
973  return upload(options, arguments)
974 
def uploadConditions.parse_arguments()

Definition at line 178 of file uploadConditions.py.

References cmsRelvalreport.exit, get_directory_to_pull_to(), get_local_commit_hash(), get_version_info(), join(), data_sources.json_data_node.make(), edm.print(), pull_code_from_git(), run_in_shell(), run_upload(), and str.

Referenced by uploads.uploader.send_metadata().

178 def parse_arguments():
179  # read in command line arguments, and build metadata dictionary from them
180  parser = argparse.ArgumentParser(prog="cmsDbUpload client", description="CMS Conditions Upload Script in CondDBFW.")
181 
182  parser.add_argument("--sourceDB", type=str, help="DB to find Tags, IOVs + Payloads in.", required=False)
183 
184  # metadata arguments
185  parser.add_argument("--inputTag", type=str,\
186  help="Tag to take IOVs + Payloads from in --sourceDB.", required=False)
187  parser.add_argument("--destinationTag", type=str,\
188  help="Tag to copy IOVs + Payloads to in --destDB.", required=False)
189  parser.add_argument("--destinationDatabase", type=str,\
190  help="Database to copy IOVs + Payloads to.", required=False)
191  parser.add_argument("--since", type=int,\
192  help="Since to take IOVs from.", required=False)
193  parser.add_argument("--userText", type=str,\
194  help="Description of --destTag (can be empty).")
195 
196  # non-metadata arguments
197  parser.add_argument("--metadataFile", "-m", type=str, help="Metadata file to take metadata from.", required=False)
198 
199  parser.add_argument("--debug", required=False, action="store_true")
200  parser.add_argument("--verbose", required=False, action="store_true")
201  parser.add_argument("--testing", required=False, action="store_true")
202  parser.add_argument("--fcsr-filter", type=str, help="Synchronization to take FCSR from for local filtering of IOVs.", required=False)
203 
204  parser.add_argument("--netrc", required=False)
205 
206  parser.add_argument("--hashToUse", required=False)
207 
208  parser.add_argument("--server", required=False)
209 
210  parser.add_argument("--review-options", required=False, action="store_true")
211 
212  command_line_data = parser.parse_args()
213 
214  # default is the production server, which can point to either database anyway
215  server_alias_to_url = {
216  "prep" : "https://cms-conddb-dev.cern.ch/cmsDbCondUpload/",
217  "prod" : "https://cms-conddb.cern.ch/cmsDbCondUpload/",
218  None : "https://cms-conddb.cern.ch/cmsDbCondUpload/"
219  }
220 
221  # if prep, prod or None were given, convert to URLs in dictionary server_alias_to_url
222  # if not, assume a URL has been given and use this instead
223  if command_line_data.server in server_alias_to_url.keys():
224  command_line_data.server = server_alias_to_url[command_line_data.server]
225 
226  # use netrc to get username and password
227  try:
228  netrc_file = command_line_data.netrc
229  netrc_authenticators = netrc.netrc(netrc_file).authenticators("ConditionUploader")
230  if netrc_authenticators == None:
231  print("Your netrc file must contain the key 'ConditionUploader'.")
232  manual_input = raw_input("Do you want to try to type your credentials? ")
233  if manual_input == "y":
234  # ask for username and password
235  username = raw_input("Username: ")
236  password = getpass.getpass("Password: ")
237  else:
238  exit()
239  else:
240  print("Read your credentials from ~/.netrc. If you want to use a different file, supply its name with the --netrc argument.")
241  username = netrc_authenticators[0]
242  password = netrc_authenticators[2]
243  except:
244  print("Couldn't obtain your credentials (either from netrc or manual input).")
245  exit()
246 
247  command_line_data.username = username
248  command_line_data.password = password
249  # this will be used as the final destinationTags value by all input methods
250  # apart from the metadata file
251  command_line_data.destinationTags = {command_line_data.destinationTag:{}}
252 
253  """
254  Construct metadata_dictionary:
255  Currently, this is 3 cases:
256 
257  1) An IOV is being appended to an existing Tag with an existing Payload.
258  In this case, we just take all data from the command line.
259 
260  2) No metadata file is given, so we assume that ALL upload metadata is coming from the command line.
261 
262  3) A metadata file is given, hence we parse the file, and then iterate through command line arguments
263  since these override the options set in the metadata file.
264 
265  """
266  if command_line_data.hashToUse != None:
267  command_line_data.userText = ""
268  metadata_dictionary = command_line_data.__dict__
269  elif command_line_data.metadataFile == None:
270  command_line_data.userText = command_line_data.userText\
271  if command_line_data.userText != None\
272  else str(raw_input("Tag's description [can be empty]:"))
273  metadata_dictionary = command_line_data.__dict__
274  else:
275  metadata_dictionary = json.loads("".join(open(os.path.abspath(command_line_data.metadataFile), "r").readlines()))
276  metadata_dictionary["username"] = username
277  metadata_dictionary["password"] = password
278  metadata_dictionary["userText"] = metadata_dictionary.get("userText")\
279  if metadata_dictionary.get("userText") != None\
280  else str(raw_input("Tag's description [can be empty]:"))
281  # set the server to use to be the default one
282  metadata_dictionary["server"] = server_alias_to_url[None]
283 
284  # go through command line options and, if they are set, overwrite entries
285  for (option_name, option_value) in command_line_data.__dict__.items():
286  # if the metadata_dictionary sets this, overwrite it
287  if option_name != "destinationTags":
288  if option_value != None or (option_value == None and not(option_name in metadata_dictionary.keys())):
289  # if option_value has a value, override the metadata file entry
290  # or if option_value is None but the metadata file doesn't give a value,
291  # set the entry to None as well
292  metadata_dictionary[option_name] = option_value
293  else:
294  if option_value != {None:{}}:
295  metadata_dictionary["destinationTags"] = {option_value:{}}
296  elif option_value == {None:{}} and not("destinationTags" in metadata_dictionary.keys()):
297  metadata_dictionary["destinationTags"] = {None:{}}
298 
299  if command_line_data.review_options:
300  defaults = {
301  "since" : "Since of first IOV",
302  "userText" : "Populated by upload process",
303  "netrc" : "None given",
304  "fcsr_filter" : "Don't apply",
305  "hashToUse" : "Using local SQLite file instead"
306  }
307  print("Configuration to use for the upload:")
308  for key in metadata_dictionary:
309  if not(key) in ["username", "password", "destinationTag"]:
310  value_to_print = metadata_dictionary[key] if metadata_dictionary[key] != None else defaults[key]
311  print("\t%s : %s" % (key, value_to_print))
312 
313  if raw_input("\nDo you want to continue? [y/n] ") != "y":
314  exit()
315 
316  return metadata_dictionary
317 
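For reference, a sketch of a metadata file that --metadataFile could point to. The keys mirror the ones handled above and written by runWizard() below; every value is a placeholder.

import json

example_metadata = json.loads("""
{
    "destinationDatabase": "oracle://cms_orcoff_prep/CMS_CONDITIONS",
    "destinationTags": { "BeamSpotObjects_PCL_byRun_v0_offline": {} },
    "inputTag": "BeamSpotObject_ByRun",
    "since": 1234,
    "userText": "example upload"
}
""")

Command-line options that are explicitly set still override the corresponding entries read from such a file.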
def uploadConditions.pull_code_from_git(target_directory, repository_url, hash)
Pulls CondDBFW from the git repository specified by the upload server.

Definition at line 97 of file uploadConditions.py.

References edm.print(), run_in_shell(), and str.

Referenced by parse_arguments().

97 def pull_code_from_git(target_directory, repository_url, hash):
98  """
99  Pulls CondDBFW from the git repository specified by the upload server.
100  """
101  # make directory
102  target = os.path.abspath(target_directory)
103  sys.path.append(target)
104  conddbfw_directory = os.path.join(target, "CondDBFW")
105  git_directory = os.path.join(conddbfw_directory, ".git")
106  if not(os.path.exists(conddbfw_directory)):
107  os.mkdir(conddbfw_directory)
108  else:
109  # if the directory exists, it may contain things - prompt the user
110  force_pull = str(raw_input("CondDBFW directory isn't empty - empty it, and update to new version? [y/n] "))
111  if force_pull == "y":
112  # empty directory and delete it
113  run_in_shell("rm -rf CondDBFW", shell=True)
114  # remake the directory - it will be empty
115  os.mkdir(conddbfw_directory)
116 
117  print("Pulling code back from repository...")
118  print(horizontal_rule)
119 
120  run_in_shell("git --git-dir=%s clone %s CondDBFW" % (git_directory, repository_url), shell=True)
121  # --force makes sure we ignore any conflicts that
122  # could occur and overwrite everything in the checkout
123  run_in_shell("cd %s && git checkout --force -b version_used %s" % (conddbfw_directory, hash), shell=True)
124 
125  # write the hash to a file in the CondDBFW directory so we can delete the git repository
126  hash_file_handle = open(os.path.join(conddbfw_directory, ".commit_hash"), "w")
127  hash_file_handle.write(hash)
128  hash_file_handle.close()
129 
130  # can now delete .git directory
131  shutil.rmtree(git_directory)
132 
133  print(horizontal_rule)
134  print("Creating local log directories (if required)...")
135  if not(os.path.exists(os.path.join(target, "upload_logs"))):
136  os.mkdir(os.path.join(target, "upload_logs"))
137  if not(os.path.exists(os.path.join(target, "server_side_logs"))):
138  os.mkdir(os.path.join(target, "server_side_logs"))
139  print("Finished with log directories.")
140  print("Update of CondDBFW complete.")
141 
142  print(horizontal_rule)
143 
144  return True
145 
def uploadConditions.re_upload(options)

Definition at line 811 of file uploadConditions.py.

References edm.decode(), str, and upload().

Referenced by main().

811 def re_upload( options ):
812  netrcPath = None
813  logDbSrv = prodLogDbSrv
814  if options.hostname == defaultDevHostname:
815  logDbSrv = devLogDbSrv
816  if options.authPath is not None:
817  netrcPath = os.path.join( options.authPath,'.netrc' )
818  try:
819  netrcKey = '%s/%s' %(logDbSrv,logDbSchema)
820  print '#netrc key=%s' %netrcKey
821  # Try to find the netrc entry
822  (username, account, password) = netrc.netrc( netrcPath ).authenticators( netrcKey )
823  except IOError as e:
824  logging.error('Cannot access netrc file.')
825  return 1
826  except Exception as e:
827  logging.error('Netrc file is invalid: %s' %str(e))
828  return 1
829  conStr = '%s/%s@%s' %(username,password,logDbSrv)
830  con = cx_Oracle.connect( conStr )
831  cur = con.cursor()
832  fh = options.reUpload
833  cur.execute('SELECT FILECONTENT, STATE FROM FILES WHERE FILEHASH = :HASH',{'HASH':fh})
834  res = cur.fetchall()
835  found = False
836  fdata = None
837  for r in res:
838  found = True
839  logging.info("Found file %s in state '%s;" %(fh,r[1]))
840  fdata = r[0].read().decode('bz2')
841  con.close()
842  if not found:
843  logging.error("No file uploaded found with hash %s" %fh)
844  return 1
845  # writing as a tar file and open it ( is there a why to open it in memory?)
846  fname = '%s.tar' %fh
847  with open(fname, "wb" ) as f:
848  f.write(fdata)
849  rname = 'reupload_%s' %fh
850  with tarfile.open(fname) as tar:
851  tar.extractall()
852  os.remove(fname)
853  dfile = 'data.db'
854  mdfile = 'metadata.txt'
855  if os.path.exists(dfile):
856  os.utime(dfile,None)
857  os.chmod(dfile,0o755)
858  os.rename(dfile,'%s.db' %rname)
859  else:
860  logging.error('Tar file does not contain the data file')
861  return 1
862  if os.path.exists(mdfile):
863  os.utime(mdfile,None)
864  os.chmod(mdfile,0o755)
865  mdata = None
866  with open(mdfile) as md:
867  mdata = json.load(md)
868  datelabel = datetime.now().strftime("%y-%m-%d %H:%M:%S")
869  if mdata is None:
870  logging.error('Metadata file is empty.')
871  return 1
872  logging.debug('Preparing new metadata file...')
873  mdata['userText'] = 'reupload %s : %s' %(datelabel,mdata['userText'])
874  with open( '%s.txt' %rname, 'wb') as jf:
875  jf.write( json.dumps( mdata, sort_keys=True, indent = 2 ) )
876  jf.write('\n')
877  os.remove(mdfile)
878  else:
879  logging.error('Tar file does not contain the metadata file')
880  return 1
881  logging.info('Files %s prepared for the upload.' %rname)
882  arguments = [rname]
883  return upload(options, arguments)
884 
def uploadConditions.run_in_shell(*popenargs, **kwargs)
Runs string-based commands in the shell and returns the result.

Definition at line 146 of file uploadConditions.py.

Referenced by parse_arguments(), and pull_code_from_git().

146 def run_in_shell(*popenargs, **kwargs):
147  """
148  Runs string-based commands in the shell and returns the result.
149  """
150  out = subprocess.PIPE if kwargs.get("stdout") == None else kwargs.get("stdout")
151  new_kwargs = kwargs
152  if new_kwargs.get("stdout"):
153  del new_kwargs["stdout"]
154  process = subprocess.Popen(*popenargs, stdout=out, **new_kwargs)
155  stdout = process.communicate()[0]
156  returnCode = process.returncode
157  cmd = kwargs.get('args')
158  if cmd is None:
159  cmd = popenargs[0]
160  if returnCode:
161  raise subprocess.CalledProcessError(returnCode, cmd)
162  return stdout
163 
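A usage sketch, mirroring the kind of git invocation made in pull_code_from_git() and get_local_commit_hash(); the call raises subprocess.CalledProcessError if the command exits with a non-zero return code.

# returns the captured stdout of the command
head_commit = run_in_shell("git rev-parse HEAD", shell=True).strip()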
def uploadConditions.run_upload(**parameters)
Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.

Definition at line 164 of file uploadConditions.py.

References cmsRelvalreport.exit.

Referenced by parse_arguments().

164 def run_upload(**parameters):
165  """
166  Imports CondDBFW.uploads and runs the upload with the upload metadata obtained.
167  """
168  try:
169  import CondDBFW.uploads as uploads
170  except Exception as e:
171  traceback.print_exc()
172  exit("CondDBFW or one of its dependencies could not be imported.\n"\
173  + "If the CondDBFW directory exists, you are likely not in a CMSSW environment.")
174  # we have CondDBFW, so just call the module with the parameters given in the command line
175  uploader = uploads.uploader(**parameters)
176  result = uploader.upload()
177 
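In the wrapper this is expected to be fed with the dictionary returned by parse_arguments(); a minimal sketch (the __main__ guard is illustrative, the script's actual entry code is not shown on this page):

if __name__ == "__main__":
    upload_metadata = parse_arguments()
    run_upload(**upload_metadata)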
def uploadConditions.runWizard(basename, dataFilename, metadataFilename)

Definition at line 102 of file uploadConditions.py.

References getInput(), getInputChoose(), getInputRepeat(), createfilelist.int, and ComparisonHelper.zip().

Referenced by uploadAllFiles().

102 def runWizard(basename, dataFilename, metadataFilename):
103  while True:
104  print '''\nWizard for metadata for %s
105 
106 I will ask you some questions to fill the metadata file. For some of the questions there are defaults between square brackets (i.e. []), leave empty (i.e. hit Enter) to use them.''' % basename
107 
108  # Try to get the available inputTags
109  try:
110  dataConnection = sqlite3.connect(dataFilename)
111  dataCursor = dataConnection.cursor()
112  dataCursor.execute('select name from sqlite_master where type == "table"')
113  tables = set(zip(*dataCursor.fetchall())[0])
114 
115  # only conddb V2 supported...
116  if 'TAG' in tables:
117  dataCursor.execute('select NAME from TAG')
118  # In any other case, do not try to get the inputTags
119  else:
120  raise Exception()
121 
122  inputTags = dataCursor.fetchall()
123  if len(inputTags) == 0:
124  raise Exception()
125  inputTags = zip(*inputTags)[0]
126 
127  except Exception:
128  inputTags = []
129 
130  if len(inputTags) == 0:
131  print '\nI could not find any input tag in your data file, but you can still specify one manually.'
132 
133  inputTag = getInputRepeat(
134  '\nWhich is the input tag (i.e. the tag to be read from the SQLite data file)?\ne.g. BeamSpotObject_ByRun\ninputTag: ')
135 
136  else:
137  print '\nI found the following input tags in your SQLite data file:'
138  for (index, inputTag) in enumerate(inputTags):
139  print ' %s) %s' % (index, inputTag)
140 
141  inputTag = getInputChoose(inputTags, '0',
142  '\nWhich is the input tag (i.e. the tag to be read from the SQLite data file)?\ne.g. 0 (you select the first in the list)\ninputTag [0]: ')
143 
144  destinationDatabase = ''
145  ntry = 0
146  while ( destinationDatabase != 'oracle://cms_orcon_prod/CMS_CONDITIONS' and destinationDatabase != 'oracle://cms_orcoff_prep/CMS_CONDITIONS' ):
147  if ntry==0:
148  inputMessage = \
149  '\nWhich is the destination database where the tags should be exported? \nPossible choices: oracle://cms_orcon_prod/CMS_CONDITIONS (for prod) or oracle://cms_orcoff_prep/CMS_CONDITIONS (for prep) \ndestinationDatabase: '
150  elif ntry==1:
151  inputMessage = \
152  '\nPlease choose one of the two valid destinations: \noracle://cms_orcon_prod/CMS_CONDITIONS (for prod) or oracle://cms_orcoff_prep/CMS_CONDITIONS (for prep) \
153 \ndestinationDatabase: '
154  else:
155  raise Exception('No valid destination chosen. Bailing out...')
156  destinationDatabase = getInputRepeat(inputMessage)
157  ntry += 1
158 
159  while True:
160  since = getInput('',
161  '\nWhich is the given since? (if not specified, the one from the SQLite data file will be taken -- note that even if specified, still this may not be the final since, depending on the synchronization options you select later: if the synchronization target is not offline, and the since you give is smaller than the next possible one (i.e. you give a run number earlier than the one which will be started/processed next in prompt/hlt/express), the DropBox will move the since ahead to go to the first safe run instead of the value you gave)\ne.g. 1234\nsince []: ')
162  if not since:
163  since = None
164  break
165  else:
166  try:
167  since = int(since)
168  break
169  except ValueError:
170  logging.error('The since value has to be an integer or empty (null).')
171 
172  userText = getInput('',
173  '\nWrite any comments/text you may want to describe your request\ne.g. Muon alignment scenario for...\nuserText []: ')
174 
175  destinationTags = {}
176  while True:
177  destinationTag = getInput('',
178  '\nWhich is the next destination tag to be added (leave empty to stop)?\ne.g. BeamSpotObjects_PCL_byRun_v0_offline\ndestinationTag []: ')
179  if not destinationTag:
180  if len(destinationTags) == 0:
181  logging.error('There must be at least one destination tag.')
182  continue
183  break
184 
185  if destinationTag in destinationTags:
186  logging.warning(
187  'You already added this destination tag. Overwriting the previous one with this new one.')
188 
189  destinationTags[destinationTag] = {
190  }
191 
192  metadata = {
193  'destinationDatabase': destinationDatabase,
194  'destinationTags': destinationTags,
195  'inputTag': inputTag,
196  'since': since,
197  'userText': userText,
198  }
199 
200  metadata = json.dumps(metadata, sort_keys=True, indent=4)
201  print '\nThis is the generated metadata:\n%s' % metadata
202 
203  if getInput('n',
204  '\nIs it fine (i.e. save in %s and *upload* the conditions if this is the latest file)?\nAnswer [n]: ' % metadataFilename).lower() == 'y':
205  break
206  logging.info('Saving generated metadata in %s...', metadataFilename)
207  with open(metadataFilename, 'wb') as metadataFile:
208  metadataFile.write(metadata)
209 
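runWizard() is normally started from uploadAllFiles() when the metadata file next to the SQLite file is missing; a standalone call would look like the sketch below, with a placeholder base path.

import os

basepath = "myConditions"      # placeholder: produces myConditions.db / myConditions.txt
runWizard(os.path.basename(basepath), basepath + ".db", basepath + ".txt")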
def uploadConditions.testTier0Upload()

Definition at line 975 of file uploadConditions.py.

References uploadTier0Files().

975 def testTier0Upload():
976 
977  global defaultNetrcHost
978 
979  (username, account, password) = netrc.netrc().authenticators(defaultNetrcHost)
980 
981  filenames = ['testFiles/localSqlite-top2']
982 
983  uploadTier0Files(filenames, username, password, cookieFileName = None)
984 
985 
def uploadConditions.upload(options, arguments)

Definition at line 885 of file uploadConditions.py.

References uploadAllFiles().

Referenced by main(), and re_upload().

885 def upload(options, arguments):
886  results = uploadAllFiles(options, arguments)
887 
888  if 'status' not in results:
889  print 'Unexpected error.'
890  return -1
891  ret = results['status']
892  print results
893  print "upload ended with code: %s" %ret
894  return ret
895 
def uploadConditions.uploadAllFiles(options, arguments)

Definition at line 656 of file uploadConditions.py.

References getCredentials(), getInput(), runWizard(), and str.

Referenced by upload().

656 def uploadAllFiles(options, arguments):
657 
658  ret = {}
659  ret['status'] = 0
660 
661  # Check that we can read the data and metadata files
662  # If the metadata file does not exist, start the wizard
663  for filename in arguments:
664  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
665  basename = os.path.basename(basepath)
666  dataFilename = '%s.db' % basepath
667  metadataFilename = '%s.txt' % basepath
668 
669  logging.info('Checking %s...', basename)
670 
671  # Data file
672  try:
673  with open(dataFilename, 'rb') as dataFile:
674  pass
675  except IOError as e:
676  errMsg = 'Impossible to open SQLite data file %s' %dataFilename
677  logging.error( errMsg )
678  ret['status'] = -3
679  ret['error'] = errMsg
680  return ret
681 
682  # Check the data file
683  empty = True
684  try:
685  dbcon = sqlite3.connect( dataFilename )
686  dbcur = dbcon.cursor()
687  dbcur.execute('SELECT * FROM IOV')
688  rows = dbcur.fetchall()
689  for r in rows:
690  empty = False
691  dbcon.close()
692  if empty:
693  errMsg = 'The input SQLite data file %s contains no data.' %dataFilename
694  logging.error( errMsg )
695  ret['status'] = -4
696  ret['error'] = errMsg
697  return ret
698  except Exception as e:
699  errMsg = 'Check on input SQLite data file %s failed: %s' %(dataFilename,str(e))
700  logging.error( errMsg )
701  ret['status'] = -5
702  ret['error'] = errMsg
703  return ret
704 
705  # Metadata file
706  try:
707  with open(metadataFilename, 'rb') as metadataFile:
708  pass
709  except IOError as e:
710  if e.errno != errno.ENOENT:
711  errMsg = 'Impossible to open file %s (for other reason than not existing)' %metadataFilename
712  logging.error( errMsg )
713  ret['status'] = -4
714  ret['error'] = errMsg
715  return ret
716 
717  if getInput('y', '\nIt looks like the metadata file %s does not exist. Do you want me to create it and help you fill it?\nAnswer [y]: ' % metadataFilename).lower() != 'y':
718  errMsg = 'Metadata file %s does not exist' %metadataFilename
719  logging.error( errMsg )
720  ret['status'] = -5
721  ret['error'] = errMsg
722  return ret
723  # Wizard
724  runWizard(basename, dataFilename, metadataFilename)
725 
726  # Upload files
727  try:
728  dropBox = ConditionsUploader(options.hostname, options.urlTemplate)
729 
730  # Authentication
731  username, password = getCredentials(options)
732 
733  results = {}
734  for filename in arguments:
735  backend = options.backend
736  basepath = filename.rsplit('.db', 1)[0].rsplit('.txt', 1)[0]
737  metadataFilename = '%s.txt' % basepath
738  with open(metadataFilename, 'rb') as metadataFile:
739  metadata = json.load( metadataFile )
740  # When dest db = prep the hostname has to be set to dev.
741  forceHost = False
742  destDb = metadata['destinationDatabase']
743  if destDb.startswith('oracle://cms_orcon_prod') or destDb.startswith('oracle://cms_orcoff_prep'):
744  hostName = defaultHostname
745  if destDb.startswith('oracle://cms_orcoff_prep'):
746  hostName = defaultDevHostname
747  dropBox.setHost( hostName )
748  authRet = dropBox.signIn( username, password )
749  if not authRet==0:
750  msg = "Error trying to connect to the server. Aborting."
751  if authRet==-2:
752  msg = "Error while signin in. Aborting."
753  logging.error(msg)
754  return { 'status' : authRet, 'error' : msg }
755  results[filename] = dropBox.uploadFile(filename, options.backend, options.temporaryFile)
756  else:
757  results[filename] = False
758  logging.error("DestinationDatabase %s is not valid. Skipping the upload." %destDb)
759  if not results[filename]:
760  if ret['status']<0:
761  ret['status'] = 0
762  ret['status'] += 1
763  ret['files'] = results
764  logging.debug("all files processed, logging out now.")
765 
766  dropBox.signOut()
767 
768  except HTTPError as e:
769  logging.error('got HTTP error: %s', str(e))
770  return { 'status' : -1, 'error' : str(e) }
771 
772  return ret
773 
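The returned dictionary always contains a 'status' entry (0 when every file was uploaded, a positive count of failed files otherwise, or a negative code together with an 'error' message when a check fails); per-file results end up under 'files'. A sketch of how a caller such as upload() can inspect it:

results = uploadAllFiles(options, arguments)
if results.get("status") == 0:
    for filename, outcome in results.get("files", {}).items():
        print("%s -> %s" % (filename, "uploaded" if outcome else "failed"))
else:
    print("upload failed: %s" % results.get("error", results["status"]))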
def uploadConditions.uploadTier0Files(filenames, username, password, cookieFileName=None)
Uploads a bunch of files coming from Tier0.
This has the following requirements:
    * Username/Password based authentication.
    * Uses the online backend.
    * Ignores errors related to the upload/content (e.g. duplicated file).

Definition at line 774 of file uploadConditions.py.

Referenced by testTier0Upload().

774 def uploadTier0Files(filenames, username, password, cookieFileName = None):
775  '''Uploads a bunch of files coming from Tier0.
776  This has the following requirements:
777  * Username/Password based authentication.
778  * Uses the online backend.
779  * Ignores errors related to the upload/content (e.g. duplicated file).
780  '''
781 
782  dropBox = ConditionsUploader()
783 
784  dropBox.signIn(username, password)
785 
786  for filename in filenames:
787  try:
788  result = dropBox.uploadFile(filename, backend = 'test')
789  except HTTPError as e:
790  if e.code == 400:
791  # 400 Bad Request: This is an exception related to the upload
792  # being wrong for some reason (e.g. duplicated file).
793  # Since for Tier0 this is not an issue, continue
794  logging.error('HTTP Exception 400 Bad Request: Upload-related, skipping. Message: %s', e)
795  continue
796 
797  # In any other case, re-raise.
798  raise
799 
800  #-toDo: add a flag to say if we should retry or not. So far, all retries are done server-side (Tier-0),
801  # if we flag as failed any retry would not help and would result in the same error (e.g.
802  # when a file with an identical hash is uploaded again)
803  #-review(2015-09-25): get feedback from tests at Tier-0 (action: AP)
804 
805  if not result: # dropbox reported an error when uploading, do not retry.
806  logging.error('Error from dropbox, upload-related, skipping.')
807  continue
808 
809  dropBox.signOut()
810 