A library for converting input text into tokens defined by regular expressions.


liblex is part of an attempt to write a compiler entirely from scratch. This component of the compiler converts input text into tokens to be evaluated by the parser.


First, an evaluation configuration has to be created. This configuration stores the various regular expressions to be used during lexing. The code below is an example of initializing and populating an evaluation configuration.

/* Declares the configuration */
eval_config config;
/* Initializes the configuration for use */
/* Registers the regular expressions to be used. The IDs, given as the
   third parameter, also determine priority: the higher the ID, the
   higher the priority. */
eval_config_add(&config, "[ \n]+", 0);
eval_config_add(&config, "[a-zA-Z_][a-zA-Z_0-9]*", 1);
eval_config_add(&config, "if", 2);
eval_config_add(&config, "else", 3);
eval_config_add(&config, "[0-9]+", 4);
eval_config_add(&config, "{|}", 5);
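The priority rule matters when two patterns match the same text: the input "if" matches both the identifier pattern (registered with ID 1) and the keyword pattern (registered with ID 2), and the higher ID wins the tie. The sketch below illustrates that resolution with hypothetical stand-alone matchers; these helpers are not part of liblex's API.

```c
#include <ctype.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical matchers, not liblex's API: each returns the length of the
   longest match of its pattern at the start of s, or 0 on no match. */

/* [a-zA-Z_][a-zA-Z_0-9]* -- registered above with ID 1 */
size_t match_ident(const char *s)
{
    size_t n = 0;
    if (!isalpha((unsigned char)s[0]) && s[0] != '_')
        return 0;
    while (isalnum((unsigned char)s[n]) || s[n] == '_')
        n++;
    return n;
}

/* "if" -- registered above with ID 2 */
size_t match_if(const char *s)
{
    return strncmp(s, "if", 2) == 0 ? 2 : 0;
}

/* Pick the winning pattern ID: longest match first,
   higher ID breaks ties. Returns -1 if nothing matches. */
int winning_id(const char *s)
{
    size_t len1 = match_ident(s); /* ID 1 */
    size_t len2 = match_if(s);    /* ID 2 */
    if (len2 > 0 && len2 >= len1)
        return 2; /* equal length: higher ID wins */
    return len1 > 0 ? 1 : -1;
}
```

Under this rule, "if" tokenizes as the keyword (ID 2), while "ifx" is a longer identifier match and tokenizes with ID 1.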

It should be noted that this example is incomplete: eval_config_add returns a liblex_result, which represents the result of the operation. LIBLEX_SUCCESS means that no errors occurred; LIBLEX_MALLOC means that the function failed to allocate the necessary memory; and LIBLEX_INVALID means that the regular expression provided was not correctly formatted.
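A sketch of turning those result codes into messages is below. The enum definition is a local stand-in so the snippet is self-contained (the real definition comes from liblex's headers, and its exact values are not shown here), and liblex_strerror is a hypothetical helper, not part of liblex.

```c
/* Stand-in for liblex's result enum, declared locally so this sketch
   compiles on its own; the real definition lives in liblex's headers. */
typedef enum {
    LIBLEX_SUCCESS,
    LIBLEX_MALLOC,
    LIBLEX_INVALID
} liblex_result;

/* Hypothetical helper: map a result code to a human-readable message. */
const char *liblex_strerror(liblex_result r)
{
    switch (r) {
    case LIBLEX_SUCCESS: return "no error";
    case LIBLEX_MALLOC:  return "memory allocation failed";
    case LIBLEX_INVALID: return "malformed regular expression";
    }
    return "unknown result";
}
```

In real code, each eval_config_add call's return value would be checked against LIBLEX_SUCCESS before continuing, rather than being ignored as in the example above.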

After the eval configuration has been configured, tokenizing a string is done by creating a linked list and populating it with the resulting tokens (called matches).

/* Declares the linked list. */
ll match_ll;
/* Initializes the linked list. */

/* The first parameter is the input string, the second is the index at which
   to begin lexing, the third is the configuration, and the fourth receives
   the resulting matches. */
eval_all(string, 0, &config, &match_ll);

Once lexing is done, the matches need to be cleaned up. The eval_foreach_match_free function can be passed to ll_foreach over the list of matches to release them:

ll_foreach(&match_ll, NULL, compare_always, eval_foreach_match_free);

And the configuration can be freed using: