

Python lexicon.build_lexicon Method Code Examples

This article collects typical usage examples of the Python method dragnn.python.lexicon.build_lexicon. If you are wondering how lexicon.build_lexicon works, how to call it, or want to see it used in practice, the curated code examples below should help. You can also explore further usage of the dragnn.python.lexicon module from which the method comes.


The sections below present 4 code examples of lexicon.build_lexicon, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Python code examples.
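Before looking at the examples, here is a minimal sketch of the basic call pattern (the paths are hypothetical, not taken from any of the projects below): lexicon.build_lexicon reads a training corpus and writes the resulting lexical resources into the given output directory, matching the positional arguments used throughout the examples.

# A minimal sketch, assuming hypothetical paths.
import os
from dragnn.python import lexicon

output_dir = '/tmp/lexicon-output'    # hypothetical output directory
training_corpus = '/tmp/train.conll'  # hypothetical training corpus

if not os.path.exists(output_dir):
  os.mkdir(output_dir)

# Writes the lexical resources derived from the corpus into output_dir.
lexicon.build_lexicon(output_dir, training_corpus)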

Example 1: testBuildLexicon

# Required import: from dragnn.python import lexicon [as alias]
# Or: from dragnn.python.lexicon import build_lexicon [as alias]
def testBuildLexicon(self):
    empty_input_path = os.path.join(FLAGS.test_tmpdir, 'empty-input')
    lexicon_output_path = os.path.join(FLAGS.test_tmpdir, 'lexicon-output')

    with open(empty_input_path, 'w'):
      pass

    # The directory may already exist when running locally multiple times.
    if not os.path.exists(lexicon_output_path):
      os.mkdir(lexicon_output_path)

    # Just make sure this doesn't crash; the lexicon builder op is already
    # exercised in its own unit test.
    lexicon.build_lexicon(lexicon_output_path, empty_input_path) 
Developer: ringringyi, Project: DOTA_models, Lines: 16, Source file: lexicon_test.py

Example 2: complete_master_spec

# Required import: from dragnn.python import lexicon [as alias]
# Or: from dragnn.python.lexicon import build_lexicon [as alias]
def complete_master_spec(master_spec, lexicon_corpus, output_path,
                         tf_master=''):
  """Finishes a MasterSpec that defines the network config.

  Given a MasterSpec that defines the DRAGNN architecture, completes the spec so
  that it can be used to build a DRAGNN graph and run training/inference.

  Args:
    master_spec: MasterSpec.
    lexicon_corpus: the corpus to be used with the LexiconBuilder.
    output_path: directory to save resources to.
    tf_master: TensorFlow master executor (string, defaults to '' to use the
      local instance).

  Returns:
    None, since the spec is changed in-place.
  """
  if lexicon_corpus:
    lexicon.build_lexicon(output_path, lexicon_corpus)

  # Use Syntaxnet builder to fill out specs.
  for i, spec in enumerate(master_spec.component):
    builder = ComponentSpecBuilder(spec.name)
    builder.spec = spec
    builder.fill_from_resources(output_path, tf_master=tf_master)
    master_spec.component[i].CopyFrom(builder.spec) 
Developer: ringringyi, Project: DOTA_models, Lines: 28, Source file: spec_builder.py
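As a rough illustration (the paths and the single-component setup here are assumptions, not taken from the project), complete_master_spec could be called on a MasterSpec assembled from ComponentSpecBuilder instances; it then builds the lexicon from the corpus and fills in the resource-dependent fields of each component in place.

# A sketch only; the paths and the one-component spec are assumptions.
from dragnn.protos import spec_pb2

master_spec = spec_pb2.MasterSpec()

# Assemble a component spec (see ComponentSpecBuilder in the same module).
tagger = ComponentSpecBuilder('tagger')
tagger.set_network_unit(name='FeedForwardNetwork', hidden_layer_sizes='256')
tagger.set_transition_system(name='tagger')
tagger.add_fixed_feature(name='words', fml='input.word', embedding_dim=64)
master_spec.component.extend([tagger.spec])

# Builds the lexicon from the corpus and completes the spec in-place.
complete_master_spec(master_spec,
                     lexicon_corpus='/tmp/train.conll',  # hypothetical corpus
                     output_path='/tmp/resources')       # hypothetical directory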

Example 3: testBuildLexicon

# Required import: from dragnn.python import lexicon [as alias]
# Or: from dragnn.python.lexicon import build_lexicon [as alias]
def testBuildLexicon(self):
    empty_input_path = os.path.join(test_flags.temp_dir(), 'empty-input')
    lexicon_output_path = os.path.join(test_flags.temp_dir(), 'lexicon-output')

    with open(empty_input_path, 'w'):
      pass

    # The directory may already exist when running locally multiple times.
    if not os.path.exists(lexicon_output_path):
      os.mkdir(lexicon_output_path)

    # Just make sure this doesn't crash; the lexicon builder op is already
    # exercised in its own unit test.
    lexicon.build_lexicon(lexicon_output_path, empty_input_path) 
Developer: generalized-iou, Project: g-tensorflow-models, Lines: 16, Source file: lexicon_test.py

Example 4: main

# Required import: from dragnn.python import lexicon [as alias]
# Or: from dragnn.python.lexicon import build_lexicon [as alias]
def main(argv):
  del argv  # unused
  # Constructs lexical resources for SyntaxNet in the given resource path, from
  # the training data.
  lexicon.build_lexicon(
      lexicon_dir,
      training_sentence,
      training_corpus_format='sentence-prototext')

  # Construct the ComponentSpec for tagging. This is a simple left-to-right RNN
  # sequence tagger.
  tagger = spec_builder.ComponentSpecBuilder('tagger')
  tagger.set_network_unit(name='FeedForwardNetwork', hidden_layer_sizes='256')
  tagger.set_transition_system(name='tagger')
  tagger.add_fixed_feature(name='words', fml='input.word', embedding_dim=64)
  tagger.add_rnn_link(embedding_dim=-1)
  tagger.fill_from_resources(lexicon_dir)

  master_spec = spec_pb2.MasterSpec()
  master_spec.component.extend([tagger.spec])

  hyperparam_config = spec_pb2.GridPoint()

  # Build the TensorFlow graph.
  graph = tf.Graph()
  with graph.as_default():
    builder = graph_builder.MasterBuilder(master_spec, hyperparam_config)

    target = spec_pb2.TrainTarget()
    target.name = 'all'
    target.unroll_using_oracle.extend([True])
    dry_run = builder.add_training_from_config(target, trace_only=True)

  # Read in serialized protos from training data.
  sentence = sentence_pb2.Sentence()
  text_format.Merge(open(training_sentence).read(), sentence)
  training_set = [sentence.SerializeToString()]

  with tf.Session(graph=graph) as sess:
    # Make sure to re-initialize all underlying state.
    sess.run(tf.initialize_all_variables())
    traces = sess.run(
        dry_run['traces'], feed_dict={dry_run['input_batch']: training_set})

  with open('dragnn_tutorial_1.html', 'w') as f:
    f.write(visualization.trace_html(traces[0], height='300px').encode('utf-8')) 
Developer: ringringyi, Project: DOTA_models, Lines: 48, Source file: tutorial_1.py
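Note that main() above refers to lexicon_dir and training_sentence, which are defined outside this snippet in tutorial_1.py. A minimal sketch of how those names might be supplied (the values are placeholders, not the tutorial's actual paths):

# Hypothetical module-level definitions for the names used in main() above.
lexicon_dir = '/tmp/tutorial/lexicon'                   # directory for lexical resources
training_sentence = '/tmp/tutorial/sentence.prototext'  # a Sentence proto in text format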


Note: The dragnn.python.lexicon.build_lexicon examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub/MSDocs. The code snippets were selected from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors, and distribution or use should follow each project's license. Do not reproduce without permission.