

Python lexer.Lexer Class Code Examples

This article collects typical usage examples of the Lexer class from Python's xonsh.lexer module. If you are unsure what the Lexer class does, or how and where to use it, the curated examples below should help.


The following presents 8 code examples of the Lexer class, sorted by popularity by default.
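The examples below all follow the same pattern: construct a lexer, feed it input, and collect the resulting token stream as a list. Since xonsh may not be installed, the sketch below uses Python's standard tokenize module as a stand-in for xonsh.lexer.Lexer to illustrate that pattern; collect_tokens is a hypothetical helper name, not part of either API.

```python
# A minimal sketch of the token-collection pattern used throughout the
# examples below. Python's standard tokenize module stands in here for
# xonsh.lexer.Lexer: feed in source text, then iterate to collect the
# token stream as a list.
import io
import tokenize

def collect_tokens(source):
    """Lex `source` and return its tokens as a list (hypothetical helper)."""
    readline = io.StringIO(source).readline
    return list(tokenize.generate_tokens(readline))

tokens = collect_tokens("ls = 42\n")
# Each token carries a type, its text, and start/end positions.
for tok in tokens:
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

The tests in the examples then assert on the length or contents of that list, e.g. that a single-token input produced exactly one token.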

Example 1: check_token

def check_token(inp, exp):
    l = Lexer()
    l.input(inp)
    obs = list(l)
    if len(obs) != 1:
        msg = 'The observed sequence does not have length-1: {0!r} != 1\n'
        msg += '# obs\n{1}'
        raise AssertionError(msg.format(len(obs), pformat(obs)))
    return assert_token_equal(exp, obs[0])
Developer ID: BlaXpirit, Project: xonsh, Lines: 9, Source: test_lexer.py

Example 2: check_token

def check_token(inp, exp):
    l = Lexer()
    l.input(inp)
    obs = list(l)
    if len(obs) != 1:
        msg = "The observed sequence does not have length-1: {0!r} != 1\n"
        msg += "# obs\n{1}"
        pytest.fail(msg.format(len(obs), pformat(obs)))
    return assert_token_equal(exp, obs[0])
Developer ID: mitnk, Project: xonsh, Lines: 9, Source: test_lexer.py

Example 3: Lexer

# -*- coding: utf-8 -*-
"""Tests the xonsh lexer."""
from __future__ import unicode_literals, print_function
import os

import nose
from nose.tools import assert_equal, assert_true, assert_false

from xonsh.lexer import Lexer
from xonsh.tools import subproc_toks, subexpr_from_unbalanced, is_int, \
    always_true, always_false, ensure_string, is_env_path, str_to_env_path, \
    env_path_to_str, escape_windows_title_string, is_bool, to_bool, bool_to_str, \
    ensure_int_or_slice, is_float, is_string, check_for_partial_string

LEXER = Lexer()
LEXER.build()

INDENT = '    '

def test_subproc_toks_x():
    exp = '$[x]'
    obs = subproc_toks('x', lexer=LEXER, returnline=True)
    assert_equal(exp, obs)

def test_subproc_toks_ls_l():
    exp = '$[ls -l]'
    obs = subproc_toks('ls -l', lexer=LEXER, returnline=True)
    assert_equal(exp, obs)

def test_subproc_toks_git():
    s = 'git commit -am "hello doc"'
Developer ID: gforsyth, Project: xonsh, Lines: 31, Source: test_tools.py
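Example 3's tests expect subproc_toks (with returnline=True) to wrap a command line in xonsh's subprocess syntax, turning 'x' into '$[x]' and 'ls -l' into '$[ls -l]'. The toy sketch below reproduces only that observable behavior; it is not xonsh's actual implementation, which walks the lexer's token stream to locate just the subprocess span within a larger line. wrap_subproc is a hypothetical name.

```python
def wrap_subproc(line):
    # Toy stand-in for xonsh.tools.subproc_toks(line, returnline=True):
    # wraps the whole line in subprocess mode. The real function uses
    # the lexer's tokens to wrap only the subprocess portion of a line.
    return '$[{}]'.format(line)

print(wrap_subproc('x'))      # $[x]
print(wrap_subproc('ls -l'))  # $[ls -l]
```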

Example 4: check_tokens_subproc

def check_tokens_subproc(inp, exp):
    l = Lexer()
    l.input('$[{}]'.format(inp))
    obs = list(l)[1:-1]
    return assert_tokens_equal(exp, obs)
Developer ID: BlaXpirit, Project: xonsh, Lines: 5, Source: test_lexer.py

Example 5: check_tokens

def check_tokens(inp, exp):
    l = Lexer()
    l.input(inp)
    obs = list(l)
    return assert_tokens_equal(exp, obs)
Developer ID: BlaXpirit, Project: xonsh, Lines: 5, Source: test_lexer.py

Example 6: test_lexer_split

def test_lexer_split(s, exp):
    lexer = Lexer()
    obs = lexer.split(s)
    assert exp == obs
Developer ID: tinloaf, Project: xonsh, Lines: 4, Source: test_lexer.py
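Example 6 tests Lexer.split, which breaks a command line into its arguments. Python's standard shlex.split performs a comparable POSIX-style split and can serve as a stand-in when xonsh is unavailable; note this is only an analogy, as xonsh's splitter differs from shlex in details such as handling of xonsh-specific syntax.

```python
# POSIX-style command-line splitting with the standard library:
# quoted substrings stay together as a single argument.
import shlex

parts = shlex.split('git commit -am "hello doc"')
print(parts)  # ['git', 'commit', '-am', 'hello doc']
```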

Example 7: test_redir_whitespace

def test_redir_whitespace(case):
    inp = '![{}/path/to/file]'.format(case)
    l = Lexer()
    l.input(inp)
    obs = list(l)
    assert obs[2].type == 'WS'
Developer ID: tinloaf, Project: xonsh, Lines: 6, Source: test_lexer.py

Example 8: check_tokens_subproc

def check_tokens_subproc(inp, exp, stop=-1):
    l = Lexer()
    l.input("$[{}]".format(inp))
    obs = list(l)[1:stop]
    return assert_tokens_equal(exp, obs)
Developer ID: mitnk, Project: xonsh, Lines: 5, Source: test_lexer.py


Note: The xonsh.lexer.Lexer class examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from open-source projects contributed by their respective authors; copyright remains with the original authors, and use or distribution should follow each project's License. Please do not reproduce without permission.