
Really minor documentation fixes.

commit 51df09ccf0 (tags/v0.4)
Ben Kurtovic, 10 years ago
2 changed files with 8 additions and 4 deletions
  1. README.rst (+4, -2)
  2. docs/integration.rst (+4, -2)

README.rst (+4, -2)

@@ -123,19 +123,21 @@ If you're using Pywikipedia_, your code might look like this::

    import mwparserfromhell
    import wikipedia as pywikibot

    def parse(title):
        site = pywikibot.getSite()
        page = pywikibot.Page(site, title)
        text = page.get()
        return mwparserfromhell.parse(text)

-If you're not using a library, you can parse templates in any page using the
-following code (via the API_)::
+If you're not using a library, you can parse any page using the following code
+(via the API_)::

    import json
    import urllib
    import mwparserfromhell
    API_URL = "http://en.wikipedia.org/w/api.php"

    def parse(title):
        data = {"action": "query", "prop": "revisions", "rvlimit": 1,
                "rvprop": "content", "format": "json", "titles": title}

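The hunk above is cut off before the end of the API example. For reference, here is a minimal sketch of what the complete function can look like, assuming Python 2 and the standard MediaWiki ``action=query`` API; the ``urllib.urlencode()`` call and the JSON lookups past the truncation point are assumptions, not lines taken from this diff::

    import json
    import urllib

    import mwparserfromhell

    API_URL = "http://en.wikipedia.org/w/api.php"

    def parse(title):
        # Ask the API for the wikitext of the page's latest revision.
        data = {"action": "query", "prop": "revisions", "rvlimit": 1,
                "rvprop": "content", "format": "json", "titles": title}
        # Assumption: the request body is URL-encoded before posting; the
        # diff is truncated before this point, so this is a sketch rather
        # than the repository's exact text.
        raw = urllib.urlopen(API_URL, urllib.urlencode(data)).read()
        res = json.loads(raw)
        # Assumption: dig the wikitext out of the JSON response (Python 2,
        # where dict.values() returns an indexable list).
        text = res["query"]["pages"].values()[0]["revisions"][0]["*"]
        return mwparserfromhell.parse(text)

Under Python 3 the request would go through ``urllib.request`` and the response bytes would need decoding before ``json.loads()``; the examples in both files target Python 2.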

docs/integration.rst (+4, -2)

@@ -11,19 +11,21 @@ If you're using Pywikipedia_, your code might look like this::

    import mwparserfromhell
    import wikipedia as pywikibot

    def parse(title):
        site = pywikibot.getSite()
        page = pywikibot.Page(site, title)
        text = page.get()
        return mwparserfromhell.parse(text)

-If you're not using a library, you can parse templates in any page using the
-following code (via the API_)::
+If you're not using a library, you can parse any page using the following code
+(via the API_)::

    import json
    import urllib
    import mwparserfromhell
    API_URL = "http://en.wikipedia.org/w/api.php"

    def parse(title):
        raw = urllib.urlopen(API_URL, data).read()
        res = json.loads(raw)

