# lex

Use the liblex library to tokenize text.

## Installation

Add this to your application's `shard.yml`:
```yaml
dependencies:
  lex:
    git: https://dev.danilafe.com/Chip-8-Wizardry/lex.git
```

## Usage
```crystal
require "lex"

# Create a lexer
lexer = Lex::Lexer.new

# Add token patterns, each with a regular expression and a value.
# Patterns with a larger value take higher priority.
lexer.add_pattern(".", 0)      # Matches any one character.
lexer.add_pattern("ab+", 1)    # Matches ab, abb, abbb...
lexer.add_pattern("(ab)+", 2)  # Matches ab, abab, ababab...
lexer.add_pattern("ab*", 3)    # Matches a, ab, abb...
lexer.add_pattern("[a-d]", 4)  # Matches a, b, c, d
lexer.add_pattern("[^a-d]", 5) # Matches all chars but a, b, c, d

# Lex some text
tokens = lexer.lex "ab abb"
```
`lex` returns an array of `Tuple(String, Int32)`, each pairing a matched string with the value of the pattern that matched it.
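The returned tuples can be consumed with Crystal's block auto-splatting, which unpacks each tuple into its string and value parts. A minimal sketch, continuing the usage example above:

```crystal
# `tokens` is an Array(Tuple(String, Int32)) from lexer.lex.
# Each tuple is auto-splatted into the matched string and its pattern value.
tokens.each do |string, value|
  puts "#{string} => #{value}"
end
```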
## Contributing

1. Fork it (<https://dev.danilafe.com/Chip-8-Wizardry/lex>)
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
## Contributors

- [DanilaFe](https://dev.danilafe.com/DanilaFe) Danila Fedorin - creator, maintainer