

Python FileSpec.checksum Method Code Examples

This article collects typical usage examples of the Python method taskbuffer.FileSpec.FileSpec.checksum. If you are wondering what FileSpec.checksum does, how to use it, or what it looks like in practice, the curated examples below should help. You can also explore further usage examples of the containing class, taskbuffer.FileSpec.FileSpec.


The sections below present 2 code examples of the FileSpec.checksum method, sorted by popularity by default.
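
Before the examples, here is a minimal sketch (the LFN, size, and checksum values below are made up for illustration) showing that FileSpec.checksum is an ordinary attribute holding the file's checksum string, written in the "ad:&lt;adler32&gt;" form also used in Example 2:

# Minimal sketch (hypothetical values): FileSpec.checksum simply stores the
# file's checksum string, here using the "ad:<adler32>" convention.
from taskbuffer.FileSpec import FileSpec

fileSpec = FileSpec()
fileSpec.lfn      = 'EVNT.01234567._000001.pool.root.1'   # hypothetical logical file name
fileSpec.fsize    = 123456789                             # hypothetical size in bytes
fileSpec.checksum = 'ad:0da45c02'                         # hypothetical adler32 checksum
print(fileSpec.checksum)                                  # -> ad:0da45c02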

Example 1: convertToJobFileSpec

# Required import: from taskbuffer.FileSpec import FileSpec [used under an alias]
# Member in question: taskbuffer.FileSpec.FileSpec.checksum
# (in the method below, FileSpec is referenced under the alias JobFileSpec)
def convertToJobFileSpec(self, datasetSpec, setType=None, useEventService=False):
    jobFileSpec = JobFileSpec()
    jobFileSpec.fileID     = self.fileID
    jobFileSpec.datasetID  = datasetSpec.datasetID
    jobFileSpec.jediTaskID = datasetSpec.jediTaskID
    jobFileSpec.lfn        = self.lfn
    jobFileSpec.GUID       = self.GUID
    if setType is None:
        jobFileSpec.type   = self.type
    else:
        jobFileSpec.type   = setType
    jobFileSpec.scope      = self.scope
    jobFileSpec.fsize      = self.fsize
    jobFileSpec.checksum   = self.checksum
    jobFileSpec.attemptNr  = self.attemptNr
    # dataset attributes
    if datasetSpec is not None:
        # dataset name: prefer the container name when it is set
        if datasetSpec.containerName not in [None, '']:
            jobFileSpec.dataset = datasetSpec.containerName
        else:
            jobFileSpec.dataset = datasetSpec.datasetName
        if self.type in datasetSpec.getInputTypes() or setType in datasetSpec.getInputTypes():
            # prodDBlock
            jobFileSpec.prodDBlock = datasetSpec.datasetName
            # storage token
            if datasetSpec.storageToken not in ['', None]:
                jobFileSpec.dispatchDBlockToken = datasetSpec.storageToken
        else:
            # destinationDBlock
            jobFileSpec.destinationDBlock = datasetSpec.datasetName
            # storage token
            if datasetSpec.storageToken not in ['', None]:
                jobFileSpec.destinationDBlockToken = datasetSpec.storageToken.split('/')[0]
            # destination
            if datasetSpec.destination not in ['', None]:
                jobFileSpec.destinationSE = datasetSpec.destination
            # set prodDBlockToken for Event Service
            if useEventService and datasetSpec.getObjectStore() is not None:
                jobFileSpec.prodDBlockToken = 'objectstore^{0}'.format(datasetSpec.getObjectStore())
            # allow no output
            if datasetSpec.isAllowedNoOutput():
                jobFileSpec.allowNoOutput()
    # return
    return jobFileSpec
Developer: ruslan33 | Project: panda-jedi | Lines: 47 | Source: JediFileSpec.py
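
A rough usage sketch for Example 1 (the import paths, the JediDatasetSpec class name, and all attribute values below are assumptions, not taken from this page): when a JEDI-level file spec is converted, its checksum is copied onto the resulting job-level FileSpec.

# Hypothetical usage sketch; module paths and values are assumptions.
from pandajedi.jedicore.JediFileSpec import JediFileSpec        # assumed import path
from pandajedi.jedicore.JediDatasetSpec import JediDatasetSpec  # assumed import path

datasetSpec = JediDatasetSpec()
datasetSpec.jediTaskID  = 123
datasetSpec.datasetID   = 1
datasetSpec.datasetName = 'mc15_13TeV.hypothetical.dataset'     # hypothetical name
datasetSpec.type        = 'input'

fileSpec = JediFileSpec()
fileSpec.lfn      = 'EVNT.01234567._000001.pool.root.1'
fileSpec.checksum = 'ad:0da45c02'

jobFile = fileSpec.convertToJobFileSpec(datasetSpec, setType='input')
print(jobFile.checksum)   # same checksum carried over from the JEDI file spec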

Example 2: FileSpec

# Required import: from taskbuffer.FileSpec import FileSpec [used under an alias]
# Member in question: taskbuffer.FileSpec.FileSpec.checksum
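# Context note (added; not part of the original snippet): this excerpt comes from
# a PanDA job-submission test script. `job` is a JobSpec created earlier in that
# script, and `site` / `cloud` are set before this point. `commands` is the
# Python 2 standard-library module; on Python 3 use subprocess instead, e.g.
# subprocess.getoutput('uuidgen').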
job.destinationDBlock = 'panda.destDB.%s' % commands.getoutput('uuidgen')
job.destinationSE     = 'AGLT2_TEST'
job.prodDBlock        = 'user.mlassnig:user.mlassnig.pilot.test.single.hits'
job.currentPriority   = 1000
#job.prodSourceLabel   = 'ptest'
job.prodSourceLabel   = 'user'
job.computingSite     = site
job.cloud             = cloud
job.cmtConfig         = 'x86_64-slc6-gcc48-opt'
job.specialHandling   = 'ddm:rucio'
#job.transferType      = 'direct'

ifile = 'HITS.06828093._000096.pool.root.1'
fileI = FileSpec()
fileI.GUID = 'AC5B3759-B606-BA42-8681-4BD86455AE02'
fileI.checksum = 'ad:5d000974'
fileI.dataset = 'user.mlassnig:user.mlassnig.pilot.test.single.hits'
fileI.fsize = 94834717
fileI.lfn = ifile
fileI.prodDBlock = job.prodDBlock
fileI.scope = 'mc15_13TeV'
fileI.type = 'input'
job.addFile(fileI)

ofile = 'RDO_%s.root' % commands.getoutput('uuidgen')
fileO = FileSpec()
fileO.dataset = job.destinationDBlock
fileO.destinationDBlock = job.destinationDBlock
fileO.destinationSE = job.destinationSE
fileO.lfn = ofile
fileO.type = 'output'
Developer: PanDAWMS | Project: panda-server | Lines: 33 | Source: testReco-noDA.py
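
A hedged continuation of Example 2 (not part of the original snippet): the output FileSpec is attached to the job and the job is submitted. For output files the checksum is normally left unset at submission time and filled in later from the produced file; the Client module and submitJobs call below follow the usual panda-server test-script pattern but are assumptions here.

# Hypothetical continuation; the submission call is an assumption based on the
# common panda-server test scripts, not shown on this page.
job.addFile(fileO)                        # attach the output FileSpec (checksum left unset here)

import userinterface.Client as Client     # assumed PanDA client module
status, output = Client.submitJobs([job])
print(status, output)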


Note: the taskbuffer.FileSpec.FileSpec.checksum examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors, and any redistribution or use should follow the corresponding project's license. Do not reproduce without permission.