

Python lexer.Lexer class code examples

This article collects typical usage examples of the xonsh.lexer.Lexer class in Python. If you are wondering how the Lexer class is used in practice, or what working examples of it look like, the curated examples below may help.


Eight code examples of the Lexer class are shown below, sorted by popularity by default.
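
All of the examples share the same basic workflow: construct a Lexer, feed it source text with input(), and iterate over the lexer to obtain tokens. The sketch below illustrates that pattern; it is a minimal example written for this article (not taken from any of the projects), and the exact token type names printed will depend on the xonsh version installed.

from xonsh.lexer import Lexer

lexer = Lexer()
lexer.input('ls -l\n')            # feed a line of source text
for tok in lexer:                 # iterating the lexer yields the token stream
    print(tok.type, tok.value)    # each token carries .type and .value (PLY-style)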

Example 1: check_token

from pprint import pformat

from xonsh.lexer import Lexer

# assert_token_equal is a helper defined in the same test module (test_lexer.py).
def check_token(inp, exp):
    # Lex the input and require that it produces exactly one token.
    l = Lexer()
    l.input(inp)
    obs = list(l)
    if len(obs) != 1:
        msg = 'The observed sequence does not have length-1: {0!r} != 1\n'
        msg += '# obs\n{1}'
        raise AssertionError(msg.format(len(obs), pformat(obs)))
    return assert_token_equal(exp, obs[0])
Developer: BlaXpirit, Project: xonsh, Lines: 9, Source: test_lexer.py

Example 2: check_token

# pytest variant of Example 1: the same single-token check, but failures are
# reported via pytest.fail (requires "import pytest" plus the imports/helpers above).
def check_token(inp, exp):
    l = Lexer()
    l.input(inp)
    obs = list(l)
    if len(obs) != 1:
        msg = "The observed sequence does not have length-1: {0!r} != 1\n"
        msg += "# obs\n{1}"
        pytest.fail(msg.format(len(obs), pformat(obs)))
    return assert_token_equal(exp, obs[0])
Developer: mitnk, Project: xonsh, Lines: 9, Source: test_lexer.py
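
For context on how check_token is invoked (the call below is illustrative and not part of the listings above): assert_token_equal compares a produced token against an expected spec, which in the xonsh test suite is a list of the form [type, value, lineno, lexpos], so a typical call looks roughly like this.

check_token('42', ['NUMBER', '42', 1, 0])   # assumed expected-token format: [type, value, lineno, lexpos]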

Example 3: Lexer

# -*- coding: utf-8 -*-
"""Tests the xonsh lexer."""
from __future__ import unicode_literals, print_function
import os

import nose
from nose.tools import assert_equal, assert_true, assert_false

from xonsh.lexer import Lexer
from xonsh.tools import subproc_toks, subexpr_from_unbalanced, is_int, \
    always_true, always_false, ensure_string, is_env_path, str_to_env_path, \
    env_path_to_str, escape_windows_title_string, is_bool, to_bool, bool_to_str, \
    ensure_int_or_slice, is_float, is_string, check_for_partial_string

# Module-level lexer shared by the test functions below.
LEXER = Lexer()
LEXER.build()

INDENT = '    '

def test_subproc_toks_x():
    exp = '$[x]'
    obs = subproc_toks('x', lexer=LEXER, returnline=True)
    assert_equal(exp, obs)

def test_subproc_toks_ls_l():
    exp = '$[ls -l]'
    obs = subproc_toks('ls -l', lexer=LEXER, returnline=True)
    assert_equal(exp, obs)

def test_subproc_toks_git():
    s = 'git commit -am "hello doc"'
Developer: gforsyth, Project: xonsh, Lines: 31, Source: test_tools.py
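
As the two complete tests above suggest, subproc_toks takes a plain command line and returns it wrapped in xonsh's subprocess syntax $[...]. A small illustrative call (not taken from the test file) following the same pattern:

obs = subproc_toks('echo hi', lexer=LEXER, returnline=True)
assert obs == '$[echo hi]'   # mirrors the 'x' -> '$[x]' and 'ls -l' -> '$[ls -l]' cases above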

Example 4: check_tokens_subproc

def check_tokens_subproc(inp, exp):
    # Wrap the input in subprocess mode ($[...]) and lex it, then drop the
    # wrapper tokens at each end of the stream before comparing.
    l = Lexer()
    l.input('$[{}]'.format(inp))
    obs = list(l)[1:-1]
    return assert_tokens_equal(exp, obs)
Developer: BlaXpirit, Project: xonsh, Lines: 5, Source: test_lexer.py

Example 5: check_tokens

def check_tokens(inp, exp):
    # Lex the input and compare the full token stream against the expectation.
    l = Lexer()
    l.input(inp)
    obs = list(l)
    return assert_tokens_equal(exp, obs)
Developer: BlaXpirit, Project: xonsh, Lines: 5, Source: test_lexer.py

Example 6: test_lexer_split

def test_lexer_split(s, exp):
    lexer = Lexer()
    obs = lexer.split(s)
    assert exp == obs
Developer: tinloaf, Project: xonsh, Lines: 4, Source: test_lexer.py
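
Lexer.split, exercised by the parametrized test above, breaks a command line into its component pieces. A rough illustration of a call is shown below; the exact splitting rules (for example, how quoted segments are preserved) are an assumption and may vary between xonsh versions.

lexer = Lexer()
pieces = lexer.split('ls -l "my dir"')   # illustrative; e.g. ['ls', '-l', '"my dir"']
print(pieces)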

Example 7: test_redir_whitespace

def test_redir_whitespace(case):
    # `case` is supplied by pytest parametrization; the test checks that a
    # whitespace (WS) token is produced at the expected position in the redirect.
    inp = '![{}/path/to/file]'.format(case)
    l = Lexer()
    l.input(inp)
    obs = list(l)
    assert obs[2].type == 'WS'
Developer: tinloaf, Project: xonsh, Lines: 6, Source: test_lexer.py

Example 8: check_tokens_subproc

def check_tokens_subproc(inp, exp, stop=-1):
    # Like Example 4, but the end of the compared slice is configurable via `stop`
    # (the default of -1 drops the final wrapper token).
    l = Lexer()
    l.input("$[{}]".format(inp))
    obs = list(l)[1:stop]
    return assert_tokens_equal(exp, obs)
Developer: mitnk, Project: xonsh, Lines: 5, Source: test_lexer.py


Note: The xonsh.lexer.Lexer class examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from open-source projects contributed by their respective authors, and copyright in the source code remains with the original authors. Please follow the corresponding project's license when distributing or using the code; do not reproduce without permission.