This article collects typical usage examples of Python's nltk.tree.Tree.gradW_d method. If you are wondering what Tree.gradW_d does, how to call it, or where to find examples of it, the curated code sample below may help. (Strictly speaking, gradW_d is an attribute that the example code attaches to Tree instances, not part of the nltk API itself.) You can also read more about the containing class, nltk.tree.Tree.
The following shows the 1 code example found for Tree.gradW_d; examples are ordered by popularity by default.
Example 1: backprop_d
# Required import: from nltk.tree import Tree [as alias]
# Or: from nltk.tree.Tree import gradW_d [as alias]
# Also needed by this example: import numpy as np
def backprop_d(W_d, a_d_tree, a_e_tree):
    if len(a_d_tree) > 0:
        # recurse into children, pairing each decoder subtree with its encoder subtree
        tree_out = Tree(a_d_tree.node,
                        [backprop_d(W_d, child, a_e_tree[i])
                         for i, child in enumerate(a_d_tree)])
        delta_p = np.concatenate([child.delta for child in tree_out])
        # add contribution to gradient using current node's activation a_d
        # and the concatenated children deltas
        tree_out.gradW_d = delta_p.reshape((len(delta_p), 1)) * a_d_tree.node
        tree_out.gradb_d = delta_p
        # calculate this node's delta (tanh derivative), return annotated tree
        tree_out.delta = np.dot(W_d.transpose(), delta_p) * (1 - tree_out.node ** 2)
        return tree_out
    else:
        # leaf: delta from reconstruction error against the encoder terminals
        # (get_concat_terminals is a helper from the same example, not shown here)
        tree_out = Tree(a_d_tree.node, [])
        tree_out.delta = (a_d_tree.node - get_concat_terminals(a_e_tree)) * (1 - a_d_tree.node ** 2)
        tree_out.gradW_d = np.zeros(W_d.shape)
        tree_out.gradb_d = np.zeros(W_d.shape[0])
        return tree_out
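The example above relies on tree structures and helpers from its surrounding repository, so it is not runnable on its own. The two numeric rules it applies, however, can be sketched in isolation: the leaf rule multiplies the reconstruction error by the tanh derivative (1 - a**2), and the internal-node rule forms an outer-product weight gradient and back-propagates the delta through W_d. A minimal, self-contained sketch with made-up values (all arrays below are illustrative assumptions, not data from the example):

```python
import numpy as np

# Leaf rule: delta = (output - target) * (1 - output**2), the tanh derivative.
a = np.array([0.5, -0.2])        # hypothetical decoder activation at a leaf
target = np.array([0.4, 0.1])    # hypothetical concatenated encoder terminals
delta_leaf = (a - target) * (1 - a ** 2)   # ≈ [0.075, -0.288]

# Internal-node rule: outer-product gradient, then delta through W_d.
W_d = np.array([[0.1, 0.2],
                [0.3, 0.4]])             # hypothetical decoder weights
delta_p = delta_leaf                     # children's deltas, concatenated
a_node = np.array([0.3, 0.7])            # hypothetical activation at this node
# reshape-and-broadcast is equivalent to np.outer(delta_p, a_node)
gradW_d = delta_p.reshape((len(delta_p), 1)) * a_node
delta_node = W_d.T.dot(delta_p) * (1 - a_node ** 2)
```

The reshape-and-multiply trick used for gradW_d is why the original code can write the gradient in one line: broadcasting a column vector against a row of activations yields the full outer-product matrix.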