Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/bartavelle/fastparser
A fast bytestring parser
- Host: GitHub
- URL: https://github.com/bartavelle/fastparser
- Owner: bartavelle
- License: bsd-3-clause
- Created: 2016-10-11T09:20:23.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2023-06-08T09:47:21.000Z (over 1 year ago)
- Last Synced: 2024-03-15T11:50:00.740Z (8 months ago)
- Topics: bytestring, fast, haskell, parsing
- Language: Haskell
- Size: 35.2 KB
- Stars: 6
- Watchers: 4
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - Changelog: CHANGELOG.md
  - License: LICENSE
Awesome Lists containing this project
README
[![Build Status](https://travis-ci.org/bartavelle/fastparser.svg?branch=master)](https://travis-ci.org/bartavelle/fastparser)
A very simple, backtracking, fast parser combinator library.
It is measurably faster than [attoparsec](https://hackage.haskell.org/package/attoparsec) (36% in [this use case](https://hbtvl.banquise.net/posts/2015-12-14-fastParsing03.html)), but only works on strict `ByteString`, lacks many helper functions, and is not resumable.
It also should consume a tiny bit less memory for equivalent operations.

# When NOT to use fastparser
* When performance is not the **most** pressing concern.
* When you need to parse anything other than strict `ByteString`.
* When you need to use a battle-tested library. While very simple, and in constant use by me, this package is still quite experimental.
* When you need to parse large inputs that are not easily cut into many smaller pieces that can be parsed independently.

# How to use fastparser
`fastparser` works well with small pieces, such as individual log file lines. It is recommended to use it with a coroutine library (such as [conduit](http://hackage.haskell.org/package/conduit) or [pipes](http://hackage.haskell.org/package/pipes)), so that the input can be consumed incrementally and cut into individual records, each of which is then parsed independently.
One such setup, with the `conduit` ecosystem, would look like:
```haskell
sourceFile "/tmp/foo" .| Data.Conduit.Binary.lines .| CL.map (parseOnly parser) .| ...
```
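Spelled out with imports, such a pipeline might look like the sketch below. This is an illustration under assumptions: the module name `ByteString.Parser.Fast` and the `Parser` type are guesses about fastparser's exports (only `parseOnly` appears in the snippet above), and the streaming pieces come from the `conduit` and `conduit-extra` packages.

```haskell
-- Sketch only: ByteString.Parser.Fast and the Parser type are assumed
-- names for fastparser's API; parseOnly is taken from the README snippet.
-- Streaming pieces come from the conduit and conduit-extra packages.
import Conduit (runConduitRes, sourceFile, (.|))
import Control.Monad.IO.Class (liftIO)
import qualified Data.Conduit.Binary as CB   -- conduit-extra
import qualified Data.Conduit.List as CL
import ByteString.Parser.Fast (Parser, parseOnly)

-- Run a per-record parser over every line of a file and print the results.
parseFileLines :: Show a => Parser a -> FilePath -> IO ()
parseFileLines p path =
  runConduitRes $
       sourceFile path                 -- stream the file in chunks
    .| CB.lines                        -- cut the stream into individual lines
    .| CL.map (parseOnly p)            -- parse each line independently
    .| CL.mapM_ (liftIO . either (const (putStrLn "parse error")) print)
```

Keeping the per-record parser separate from the streaming plumbing is what lets each line be parsed on its own, which is the use case described above.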
Other than that, `fastparser` is fairly close to any other parser combinators library.
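For a concrete feel of that, a small record parser might look like the following sketch. The combinator names (`takeWhile1`, `char`, `decimal`) and the module name are assumptions modelled on attoparsec-style APIs rather than taken from fastparser's documentation; only `parseOnly` is used above.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Sketch only: the module name and every combinator except parseOnly
-- (takeWhile1, char, decimal) are assumed, attoparsec-style names and
-- may differ from fastparser's actual exports.
import qualified Data.ByteString.Char8 as B8
import ByteString.Parser.Fast (Parser, parseOnly, takeWhile1, char, decimal)

-- A "key=value" record, e.g. "retries=3".
data Field = Field B8.ByteString Int deriving Show

field :: Parser Field
field = do
  key <- takeWhile1 (/= '=')   -- everything up to the separator
  _   <- char '='
  n   <- decimal               -- an unsigned integer
  pure (Field key n)

main :: IO ()
main = print (parseOnly field "retries=3")
```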