Building Programming Language Interpreters
When creating a lexer, the first step is to identify what kinds of tokens your language has and what data needs to be stored for each of them. In this exercise, I decided to model the tokens as a variant, with a distinct type for each kind of token.
Next, you need to define a regular expression describing how each token is matched; together, these rules determine what input your language accepts. I went ahead and noted the tokens needed for the sample code I had laid out before.
Implementing a lexer entirely by hand rarely pays off, so it is worth evaluating the existing library options. I decided to use the lexertl library because its plain C++ API makes it easier to teach.
Finally, I implemented the lexer using the lexertl API, translating the rules I had identified before into its rule syntax. I included a test to validate that I get the correct sequence of tokens when processing a specific input...