This article collects typical usage examples of the Python method nupic.encoders.random_distributed_scalar.RandomDistributedScalarEncoder.mapBucketIndexToNonZeroBits. If you are looking for how to use RandomDistributedScalarEncoder.mapBucketIndexToNonZeroBits in practice, the curated example below may help. You can also explore the containing class, nupic.encoders.random_distributed_scalar.RandomDistributedScalarEncoder, for further details.
One code example of RandomDistributedScalarEncoder.mapBucketIndexToNonZeroBits is shown below.
Example 1: testMapBucketIndexToNonZeroBits
# Required import: from nupic.encoders.random_distributed_scalar import RandomDistributedScalarEncoder
# Method under test: RandomDistributedScalarEncoder.mapBucketIndexToNonZeroBits
def testMapBucketIndexToNonZeroBits(self):
  """
  Test that mapBucketIndexToNonZeroBits works and that max buckets and
  clipping are handled properly.
  """
  enc = RandomDistributedScalarEncoder(resolution=1.0, w=11, n=150)
  # Set a low number of max buckets
  enc._initializeBucketMap(10, None)
  enc.encode(0.0)
  enc.encode(-7.0)
  enc.encode(7.0)

  self.assertEqual(len(enc.bucketMap), enc._maxBuckets,
                   "_maxBuckets exceeded")
  self.assertTrue(
      (enc.mapBucketIndexToNonZeroBits(-1) == enc.bucketMap[0]).all(),
      "mapBucketIndexToNonZeroBits did not handle negative index")
  self.assertTrue(
      (enc.mapBucketIndexToNonZeroBits(1000) == enc.bucketMap[9]).all(),
      "mapBucketIndexToNonZeroBits did not handle index larger than "
      "_maxBuckets")

  # Values outside the bucket range must be clipped to the edge buckets,
  # so 23.0 and 6 (and likewise -8 and -7) should encode identically.
  e23 = enc.encode(23.0)
  e6 = enc.encode(6)
  self.assertEqual((e23 == e6).sum(), enc.getWidth(),
                   "Values not clipped correctly during encoding")

  e_8 = enc.encode(-8)
  e_7 = enc.encode(-7)
  self.assertEqual((e_8 == e_7).sum(), enc.getWidth(),
                   "Values not clipped correctly during encoding")

  self.assertEqual(enc.getBucketIndices(-8)[0], 0,
                   "getBucketIndices returned negative bucket index")
  self.assertEqual(enc.getBucketIndices(23)[0], enc._maxBuckets - 1,
                   "getBucketIndices returned bucket index that is too large")