I understand the tokenizing part. I have the lexer reading the file, splitting it into tokens, labeling them accordingly, and displaying them in the console. What is the process after this?
1) At this point, do I pull the defines from the preprocessor and lex again? Would those go in the lex.h file, or in a separate def file like Dr. Dobb's uses?
2) Do I build the symbol table at this point?
3) How do I pass the lexed input to the parser, get the tree nodes, and start matching? Or is that the linking part of this?
There has to be some kind soul who will help me connect the dots. Lol.
What I have tried:
I have tried a few compiler books, but the way they are structured they leave out the linking part, as do many of the sites online.