2011 Design and Implementation of the Sweble Wikitext Parser


Subject Headings: Wikitext Markup Parser; Sweble Wikitext Parser.

Notes

Cited By

Quotes

Author Keywords

Copyright Information

Abstract

The heart of each wiki, including Wikipedia, is its content. Most machine processing starts and ends with this content. At present, such processing is limited, because most wiki engines today cannot provide a complete and precise representation of the wiki's content. They can only generate HTML. The main reason is the lack of well-defined parsers that can handle the complexity of modern wiki markup. This applies to MediaWiki, the software running Wikipedia, and most other wiki engines.

This paper shows why it has been so difficult to develop comprehensive parsers for wiki markup. It presents the design and implementation of a parser for Wikitext, the wiki markup language of MediaWiki. Where most prior parsers used no grammar at all, or grammars poorly suited to the task, we use parsing expression grammars. Using this parser, it is possible to directly and precisely query the structured data within wikis, including Wikipedia.

The parser is available as open source from http://sweble.org.
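The abstract's central technical choice is to specify Wikitext with a parsing expression grammar (PEG). As a minimal illustrative sketch (not the Sweble grammar or its API; the class, rule names, and node labels below are hypothetical), the following Java recognizer for a tiny Wikitext fragment shows two PEG traits relevant to this approach: ordered choice (try markup rules first, then fall back to plain text) and the absence of a failure case (unrecognized markup degrades to ordinary text rather than raising a parse error).

// Illustrative sketch only; not the Sweble grammar or API.
import java.util.ArrayList;
import java.util.List;

public class MiniWikitextPeg {
    private final String input;
    private int pos = 0;

    MiniWikitextPeg(String input) { this.input = input; }

    // inline <- (bold / link / text)*
    List<String> parseInline() {
        List<String> nodes = new ArrayList<>();
        while (pos < input.length()) {
            int start = pos;
            String node = bold();                              // ordered choice: bold first...
            if (node == null) { pos = start; node = link(); }  // ...then internal link...
            if (node == null) { pos = start; node = text(); }  // ...else fall back to plain text
            nodes.add(node);
        }
        return nodes;
    }

    // bold <- "'''" (!"'''" .)+ "'''"
    private String bold() {
        if (!input.startsWith("'''", pos)) return null;
        int end = input.indexOf("'''", pos + 3);
        if (end < 0) return null;                              // no closing quotes: backtrack to text
        String content = input.substring(pos + 3, end);
        pos = end + 3;
        return "Bold(" + content + ")";
    }

    // link <- "[[" (!"]]" .)+ "]]"
    private String link() {
        if (!input.startsWith("[[", pos)) return null;
        int end = input.indexOf("]]", pos + 2);
        if (end < 0) return null;
        String target = input.substring(pos + 2, end);
        pos = end + 2;
        return "Link(" + target + ")";
    }

    // text <- .  (consumes one character; never fails, so parsing as a whole cannot fail)
    private String text() {
        return "Text(" + input.charAt(pos++) + ")";
    }

    public static void main(String[] args) {
        System.out.println(new MiniWikitextPeg("See [[Wikipedia]] for '''details'''.").parseInline());
    }
}

Running the sketch prints a flat list of Link, Bold, and Text nodes for the sample string; the actual Sweble parser instead produces a full abstract syntax tree of the page, which is what makes the structured data of a wiki directly queryable.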

1. Introduction

2. Prior And Related Work

2.1 Related And Prior Work

2.2 Prior Parser Attempts

3. Wikitext And MediaWiki

3.1 How The MediaWiki Parser Works

3.2 Challenges To Parsing Wikitext

4. The Sweble Wikitext Parser

4.1 Requirements For The Parser

4.2 Parser Design

4.3 AST Design

4.4 Parser Implementation

5. Limitations

6. Conclusions

7. Acknowledgements

We would like to thank Carsten Kolassa, Michel Salim and Ronald Veldema for their help and support.

References

Hannes Dohrn, Dirk Riehle. (2011). "Design and Implementation of the Sweble Wikitext Parser: Unlocking the Structured Data of Wikipedia." doi:10.1145/2038558.2038571