Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/xie392/lexer
A C-language parser, implementing a lexical analyzer.
- Host: GitHub
- URL: https://github.com/xie392/lexer
- Owner: xie392
- License: MIT
- Created: 2023-09-18T17:51:14.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2023-09-26T11:09:25.000Z (about 1 year ago)
- Last Synced: 2024-11-01T11:35:23.571Z (about 2 months ago)
- Language: JavaScript
- Homepage: https://lexer-ten.vercel.app/
- Size: 301 KB
- Stars: 9
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
README
# Lexical Analyzer
A lexical analyzer (Lexer, also known as a scanner) is a part of a compiler or interpreter that breaks down the input source code string into small pieces called tokens. Each token typically represents a basic syntactical unit in the source code, such as keywords, identifiers, operators, constants, etc. Lexical analysis is the first phase of the compilation process, and its primary task is to transform complex source code strings into an easily manageable stream of tokens.
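As a minimal illustration of this idea (a sketch in TypeScript, the project's implementation language; the token shape and function names below are assumptions for demonstration, not this library's API), a lexer can split `int a = 1;` into keyword, identifier, operator, and constant tokens:

```typescript
// Minimal tokenizer sketch: classify lexemes of a C-like snippet.
type Token = { type: string; value: string }

const KEYWORDS = new Set(['int', 'char', 'float', 'return', 'if', 'else', 'while'])

function tokenize(source: string): Token[] {
  const tokens: Token[] = []
  // Match identifiers/keywords, integer constants, or single operator/punctuation characters.
  const pattern = /[A-Za-z_]\w*|\d+|[=+\-*/;(){}]/g
  for (const lexeme of source.match(pattern) ?? []) {
    if (KEYWORDS.has(lexeme)) tokens.push({ type: 'keyword', value: lexeme })
    else if (/^\d+$/.test(lexeme)) tokens.push({ type: 'constant', value: lexeme })
    else if (/^[A-Za-z_]/.test(lexeme)) tokens.push({ type: 'identifier', value: lexeme })
    else tokens.push({ type: 'operator', value: lexeme })
  }
  return tokens
}

console.log(tokenize('int a = 1;'))
// → [ { type: 'keyword', value: 'int' }, { type: 'identifier', value: 'a' },
//     { type: 'operator', value: '=' }, { type: 'constant', value: '1' },
//     { type: 'operator', value: ';' } ]
```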
# Principles of Lexical Analyzer
The principles of a lexical analyzer are as follows:
- The lexical analyzer reads a lexical unit from the source code string.
- The lexical analyzer converts the lexical units into a stream of tokens.

# Implementation of Lexical Analyzer
A Position class represents the position of a lexical unit, including its line and column numbers, and is used to advance through the source. The next() method is called to get the next lexical unit. A deterministic finite automaton (DFA) distinguishes the different kinds of lexical units, and finally the lexical units are converted into a stream of tokens.
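The two pieces described above can be sketched as follows (a hedged illustration; the class and method names are assumptions, not this project's exact API): a Position that tracks line and column while advancing one character at a time, and a tiny DFA that distinguishes identifiers from integer constants.

```typescript
// Position: where the lexer currently is in the source — an absolute
// index plus human-readable line/column, advanced one character at a time.
class Position {
  index = 0
  line = 1
  column = 1

  // Consume one character, updating the line/column counters.
  next(ch: string): void {
    this.index += 1
    if (ch === '\n') {
      this.line += 1
      this.column = 1
    } else {
      this.column += 1
    }
  }
}

// A minimal DFA: states and a transition function that consumes one
// character at a time, classifying a lexeme as identifier or number.
type State = 'start' | 'ident' | 'number' | 'reject'

function step(state: State, ch: string): State {
  const isLetter = /[A-Za-z_]/.test(ch)
  const isDigit = /[0-9]/.test(ch)
  switch (state) {
    case 'start':
      return isLetter ? 'ident' : isDigit ? 'number' : 'reject'
    case 'ident':
      return isLetter || isDigit ? 'ident' : 'reject'
    case 'number':
      return isDigit ? 'number' : 'reject'
    default:
      return 'reject'
  }
}

function classify(lexeme: string): State {
  let state: State = 'start'
  for (const ch of lexeme) state = step(state, ch)
  return state
}

console.log(classify('a1')) // → 'ident'
console.log(classify('42')) // → 'number'
```

A real lexer would carry many more DFA states (operators, string literals, comments), but the shape is the same: feed characters through the transition function until no further transition is possible, then emit a token for the accepted lexeme.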
# Usage of Lexical Analyzer
```shell
npm install lexers
```

```typescript
import { createTokenizer } from 'lexers'

const code = `int a = 1;`
const tokenizer = createTokenizer(code)
const tokens = tokenizer.lexer()
```

# [Playground](https://lexer-ten.vercel.app/)
# Contributions
### (1) Project Statistics
### Q&A
If you have any problems or questions, please [submit an issue](https://github.com/xie392/lexer/issues/new).