More specific docs for contexts and tokenizer.

tags/v0.1
Ben Kurtovic, 11 years ago
Parent commit: b2b49ebd80
2 changed files with 48 additions and 2 deletions
  1. mwparserfromhell/parser/contexts.py  (+22, -2)
  2. mwparserfromhell/parser/tokens.py  (+26, -0)

mwparserfromhell/parser/contexts.py  (+22, -2)

@@ -29,10 +29,30 @@ heading of level two. This is used to determine what tokens are valid at the
 current point and also if the current parsing route is invalid.
 
 The tokenizer stores context as an integer, with these definitions bitwise OR'd
-to add them, AND'd to check if they're set, and XOR'd to remove them.
+to add them, AND'd to check if they're set, and XOR'd to remove them. The
+advantage of this is that contexts can have sub-contexts (as FOO == 0b11 will
+cover BAR == 0b10 and BAZ == 0b01).
+
+Local (stack-specific) contexts:
+
+* TEMPLATE
+** TEMPLATE_NAME
+** TEMPLATE_PARAM_KEY
+** TEMPLATE_PARAM_VALUE
+* HEADING
+** HEADING_LEVEL_1
+** HEADING_LEVEL_2
+** HEADING_LEVEL_3
+** HEADING_LEVEL_4
+** HEADING_LEVEL_5
+** HEADING_LEVEL_6
+
+Global contexts:
+
+* GL_HEADING
 """
 
-# Local (stack-specific) contexts:
+# Local contexts:
 
 TEMPLATE = 0b000000111
 TEMPLATE_NAME = 0b000000001
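
The bitwise scheme the new docstring describes is easy to demonstrate. The sketch below reuses the two constants visible in the hunk (TEMPLATE and TEMPLATE_NAME) and adds a hypothetical TEMPLATE_PARAM_KEY bit following the same pattern; it only illustrates the OR/AND/XOR operations, not the tokenizer's actual code.

    # Context flags as plain integers. TEMPLATE and TEMPLATE_NAME match the
    # values shown in the hunk; TEMPLATE_PARAM_KEY is an assumed next bit.
    TEMPLATE = 0b000000111           # parent context covering its sub-contexts
    TEMPLATE_NAME = 0b000000001
    TEMPLATE_PARAM_KEY = 0b000000010

    context = 0
    context |= TEMPLATE_NAME         # OR adds a context
    assert context & TEMPLATE_NAME   # AND checks a specific flag
    assert context & TEMPLATE        # ...and the parent TEMPLATE covers it

    context ^= TEMPLATE_NAME         # XOR removes a context known to be set
    context |= TEMPLATE_PARAM_KEY
    assert not context & TEMPLATE_NAME
    assert context & TEMPLATE        # still inside some template sub-context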


mwparserfromhell/parser/tokens.py  (+26, -0)

@@ -28,6 +28,32 @@ a syntactically valid form by the
 :py:class:`~mwparserfromhell.parser.tokenizer.Tokenizer`, and then converted
 into the :py:class`~mwparserfromhell.wikicode.Wikicode` tree by the
 :py:class:`~mwparserfromhell.parser.builder.Builder`.
+
+Tokens:
+
+* Text
+* *Templates*
+** TemplateOpen
+** TemplateParamSeparator
+** TemplateParamEquals
+** TemplateClose
+** HTMLEntityStart
+** HTMLEntityNumeric
+** HTMLEntityHex
+** HTMLEntityEnd
+* *Headings*
+** HeadingStart
+** HeadingEnd
+* *Tags*
+** TagOpenOpen
+** TagAttrStart
+** TagAttrEquals
+** TagAttrQuote
+** TagCloseOpen
+** TagCloseSelfclose
+** TagOpenClose
+** TagCloseClose
+
 """
 
 from __future__ import unicode_literals


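For context, the token classes listed in this hunk make up the flat stream that the Tokenizer hands to the Builder. A rough way to see such a stream is sketched below; the Tokenizer is an internal class rather than a supported entry point, and the exact attributes shown in the comment (such as text=) are assumptions for illustration, not taken from the diff.

    # Sketch: inspect the flat token stream for a simple template. Assumes
    # mwparserfromhell is installed; Tokenizer is internal, so this is for
    # illustration only.
    from mwparserfromhell.parser.tokenizer import Tokenizer

    tokens = Tokenizer().tokenize("{{foo|bar=baz}}")
    print(tokens)
    # Roughly: [TemplateOpen(), Text(text="foo"), TemplateParamSeparator(),
    #           Text(text="bar"), TemplateParamEquals(), Text(text="baz"),
    #           TemplateClose()]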