

Python TokenStreamRewriter.insertAfter Method Code Examples

This article collects typical usage examples of the Python method antlr4.TokenStreamRewriter.TokenStreamRewriter.insertAfter. If you are wondering what TokenStreamRewriter.insertAfter does, how to call it, or what real uses of it look like, the selected examples below should help. You can also explore further usage examples of the enclosing class, antlr4.TokenStreamRewriter.TokenStreamRewriter.


Six code examples of the TokenStreamRewriter.insertAfter method are shown below, sorted by popularity by default.
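
Before looking at the individual test methods, the sketch below shows the typical call pattern in one place: tokenize an input, wrap the token stream in a TokenStreamRewriter, queue an insertion with insertAfter, and read the rewritten text back with getDefaultText(). Note that MyLexer is a hypothetical placeholder for a lexer generated by ANTLR (e.g. via antlr4 -Dlanguage=Python3 MyGrammar.g4); it is not part of the examples below.

# Minimal sketch of the insertAfter workflow. "MyLexer" is a hypothetical
# ANTLR-generated lexer used only for illustration; substitute your own.
from antlr4 import InputStream, CommonTokenStream
from antlr4.TokenStreamRewriter import TokenStreamRewriter
from MyLexer import MyLexer  # hypothetical generated lexer module

input_stream = InputStream('abc')
lexer = MyLexer(input_stream)
stream = CommonTokenStream(lexer=lexer)
stream.fill()  # pull all tokens so their indexes are available
rewriter = TokenStreamRewriter(tokens=stream)
rewriter.insertAfter(0, '!')      # queue text after the token at index 0
print(rewriter.getDefaultText())  # e.g. 'a!bc' if 'a', 'b', 'c' are separate tokens

The rewriter never mutates the underlying token stream; insertAfter, insertBeforeIndex, replaceIndex and replaceRange only record rewrite instructions, which are applied when getDefaultText() or getText() renders the result. This lazy design is what makes combinations such as "replace a range, then insert after its right edge" (Example 3) and contiguous inserts at the same position (Example 5) behave predictably.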

Example 1: testInsertAfterLastIndex

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def testInsertAfterLastIndex(self):
        input = InputStream('abc')
        lexer = TestLexer(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)
        rewriter.insertAfter(10, 'x')

        self.assertEqual(rewriter.getDefaultText(), 'abcx')
Developer ID: antlr, Project: antlr4, Lines of code: 11, Source: TestTokenStreamRewriter.py

Example 2: test2InsertBeforeAfterMiddleIndex

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def test2InsertBeforeAfterMiddleIndex(self):
        input = InputStream('abc')
        lexer = TestLexer(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)

        rewriter.insertBeforeIndex(1, 'x')
        rewriter.insertAfter(1, 'x')

        self.assertEqual(rewriter.getDefaultText(), 'axbxc')
Developer ID: antlr, Project: antlr4, Lines of code: 13, Source: TestTokenStreamRewriter.py

Example 3: testReplaceRangeThenInsertAfterRightEdge

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def testReplaceRangeThenInsertAfterRightEdge(self):
        input = InputStream('abcccba')
        lexer = TestLexer(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)

        rewriter.replaceRange(2, 4, 'x')
        rewriter.insertAfter(4, 'y')

        self.assertEqual('abxyba', rewriter.getDefaultText())
Developer ID: antlr, Project: antlr4, Lines of code: 13, Source: TestTokenStreamRewriter.py

Example 4: testReplaceThenInsertAfterLastIndex

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def testReplaceThenInsertAfterLastIndex(self):
        input = InputStream('abc')
        lexer = TestLexer(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)

        rewriter.replaceIndex(2, 'x')
        rewriter.insertAfter(2, 'y')

        self.assertEqual('abxy', rewriter.getDefaultText())
Developer ID: antlr, Project: antlr4, Lines of code: 13, Source: TestTokenStreamRewriter.py

Example 5: testPreservesOrderOfContiguousInserts

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def testPreservesOrderOfContiguousInserts(self):
        """
        Test for fix for: https://github.com/antlr/antlr4/issues/550
        """
        input = InputStream('aa')
        lexer = TestLexer(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)

        rewriter.insertBeforeIndex(0, '<b>')
        rewriter.insertAfter(0, '</b>')
        rewriter.insertBeforeIndex(1, '<b>')
        rewriter.insertAfter(1, '</b>')

        self.assertEqual('<b>a</b><b>a</b>', rewriter.getDefaultText())
Developer ID: antlr, Project: antlr4, Lines of code: 18, Source: TestTokenStreamRewriter.py

Example 6: testToStringStartStop2

# Required import: from antlr4.TokenStreamRewriter import TokenStreamRewriter [as alias]
# Or: from antlr4.TokenStreamRewriter.TokenStreamRewriter import insertAfter [as alias]
    def testToStringStartStop2(self):
        input = InputStream('x = 3 * 0 + 2 * 0;')
        lexer = TestLexer2(input)
        stream = CommonTokenStream(lexer=lexer)
        stream.fill()
        rewriter = TokenStreamRewriter(tokens=stream)

        self.assertEqual('x = 3 * 0 + 2 * 0;', rewriter.getDefaultText())

        # replace 3 * 0 with 0
        rewriter.replaceRange(4, 8, '0')
        self.assertEqual('x = 0 + 2 * 0;', rewriter.getDefaultText())
        self.assertEqual('x = 0 + 2 * 0;', rewriter.getText('default', 0, 17))
        self.assertEqual('0', rewriter.getText('default', 4, 8))
        self.assertEqual('x = 0', rewriter.getText('default', 0, 8))
        self.assertEqual('2 * 0', rewriter.getText('default', 12, 16))

        rewriter.insertAfter(17, "// comment")
        self.assertEqual('2 * 0;// comment', rewriter.getText('default', 12, 18))

        self.assertEqual('x = 0', rewriter.getText('default', 0, 8))
Developer ID: antlr, Project: antlr4, Lines of code: 23, Source: TestTokenStreamRewriter.py


Note: The antlr4.TokenStreamRewriter.TokenStreamRewriter.insertAfter examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. For distribution and use, please refer to the license of the corresponding project. Do not reproduce without permission.