Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/causal-agent/scraper
HTML parsing and querying with CSS selectors
- Host: GitHub
- URL: https://github.com/causal-agent/scraper
- Owner: rust-scraper
- License: ISC
- Created: 2016-01-01T21:45:09.000Z (almost 9 years ago)
- Default Branch: master
- Last Pushed: 2024-10-31T07:32:17.000Z (about 2 months ago)
- Last Synced: 2024-11-19T05:06:55.429Z (about 1 month ago)
- Topics: hacktoberfest, rust
- Language: Rust
- Homepage: https://docs.rs/scraper
- Size: 328 KB
- Stars: 1,939
- Watchers: 20
- Forks: 109
- Open Issues: 6
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# scraper
[![crates.io](https://img.shields.io/crates/v/scraper?color=dark-green)][crate]
[![downloads](https://img.shields.io/crates/d/scraper)][crate]
[![test](https://github.com/causal-agent/scraper/actions/workflows/test.yml/badge.svg)][tests]

HTML parsing and querying with CSS selectors.
`scraper` is on [Crates.io][crate] and [GitHub][github].
[crate]: https://crates.io/crates/scraper
[github]: https://github.com/causal-agent/scraper
[tests]: https://github.com/causal-agent/scraper/actions/workflows/test.yml

Scraper provides an interface to Servo's `html5ever` and `selectors` crates, for browser-grade parsing and querying.
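Since the crate is published on Crates.io, a minimal manifest entry is all the setup the examples below need. A sketch, with the version number as a placeholder rather than a pinned recommendation:

```toml
[dependencies]
# Placeholder version: check https://crates.io/crates/scraper for the
# current release before pinning.
scraper = "0.20"
```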
## Examples
### Parsing a document
```rust
use scraper::Html;

let html = r#"
    <title>Hello, world!</title>
    <h1 class="foo">Hello, <i>world!</i></h1>
"#;

let document = Html::parse_document(html);
```
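A detail the example glosses over: `Html::parse_document` does not return a `Result`. A sketch of how parse diagnostics can still be inspected, assuming the public `errors` field that recent versions of `Html` expose:

```rust
use scraper::Html;

// Malformed input still yields a usable document; html5ever recovers
// the way a browser would. Diagnostics land in `Html::errors` (assumed
// public here, as in recent crate versions) instead of a `Result`.
let document = Html::parse_document("<title>unclosed");
for error in &document.errors {
    println!("parse error: {error}");
}
```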
### Parsing a fragment

```rust
use scraper::Html;
let fragment = Html::parse_fragment("Hello, world!
");
```

### Parsing a selector
```rust
use scraper::Selector;
let selector = Selector::parse("h1.foo").unwrap();
```
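The `.unwrap()` is safe for a selector literal like this, but `Selector::parse` does return a `Result`, which matters when selectors come from user input. A minimal sketch:

```rust
use scraper::Selector;

// The empty class name after the double dot makes this selector invalid.
match Selector::parse("h1..foo") {
    Ok(_) => println!("valid selector"),
    Err(e) => println!("invalid selector: {e:?}"),
}
```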
### Selecting elements

```rust
use scraper::{Html, Selector};

let html = r#"
    <ul>
        <li>Foo</li>
        <li>Bar</li>
        <li>Baz</li>
    </ul>
"#;
let fragment = Html::parse_fragment(html);
let selector = Selector::parse("li").unwrap();
for element in fragment.select(&selector) {
assert_eq!("li", element.value().name());
}
```
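Since `select` returns an ordinary iterator of `ElementRef`s, the standard iterator adapters compose with it. A sketch that collects each item's text (using `text()`, covered in a later section):

```rust
use scraper::{Html, Selector};

let fragment = Html::parse_fragment("<ul><li>Foo</li><li>Bar</li><li>Baz</li></ul>");
let selector = Selector::parse("li").unwrap();

// `select` yields elements in document order; `map` and `collect`
// work as on any iterator.
let items: Vec<String> = fragment
    .select(&selector)
    .map(|li| li.text().collect::<String>())
    .collect();

assert_eq!(items, vec!["Foo", "Bar", "Baz"]);
```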
### Selecting descendant elements
```rust
use scraper::{Html, Selector};
let html = r#"
- Foo
- Bar
- Baz
"#;
let fragment = Html::parse_fragment(html);
let ul_selector = Selector::parse("ul").unwrap();
let li_selector = Selector::parse("li").unwrap();
let ul = fragment.select(&ul_selector).next().unwrap();
for element in ul.select(&li_selector) {
assert_eq!("li", element.value().name());
}
```
### Accessing element attributes
```rust
use scraper::{Html, Selector};
let fragment = Html::parse_fragment(r#"<input name="foo" value="bar">"#);
let selector = Selector::parse(r#"input[name="foo"]"#).unwrap();
let input = fragment.select(&selector).next().unwrap();
assert_eq!(Some("bar"), input.value().attr("value"));
```
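To enumerate every attribute rather than look one up, the underlying `Element` also offers an `attrs()` iterator of `(name, value)` pairs; a sketch assuming that accessor:

```rust
use scraper::{Html, Selector};

let fragment = Html::parse_fragment(r#"<input name="foo" value="bar">"#);
let selector = Selector::parse("input").unwrap();
let input = fragment.select(&selector).next().unwrap();

// Iterate every attribute on the element as (name, value) pairs.
for (name, value) in input.value().attrs() {
    println!("{name}={value}");
}
```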
### Serializing HTML and inner HTML
```rust
use scraper::{Html, Selector};
let fragment = Html::parse_fragment("
Hello, world!
");
let selector = Selector::parse("h1").unwrap();
let h1 = fragment.select(&selector).next().unwrap();
assert_eq!("
Hello, world!
", h1.html());
assert_eq!("Hello, world!", h1.inner_html());
```
### Accessing descendant text
```rust
use scraper::{Html, Selector};
let fragment = Html::parse_fragment("
Hello, world!
");
let selector = Selector::parse("h1").unwrap();
let h1 = fragment.select(&selector).next().unwrap();
let text = h1.text().collect::<Vec<_>>();
assert_eq!(vec!["Hello, ", "world!"], text);
```
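Because `text()` yields plain string slices, the same iterator can be collected straight into one `String` when the node boundaries don't matter:

```rust
use scraper::{Html, Selector};

let fragment = Html::parse_fragment("<h1>Hello, <i>world!</i></h1>");
let selector = Selector::parse("h1").unwrap();
let h1 = fragment.select(&selector).next().unwrap();

// Concatenate all descendant text nodes into one owned string.
let text: String = h1.text().collect();
assert_eq!("Hello, world!", text);
```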
### Manipulating the DOM
```rust
use html5ever::tree_builder::TreeSink;
use scraper::{Html, Selector};
let html = "hello
REMOVE ME
";let selector = Selector::parse(".hello").unwrap();
let mut document = Html::parse_document(html);
let node_ids: Vec<_> = document.select(&selector).map(|x| x.id()).collect();
for id in node_ids {
document.remove_from_parent(&id);
}
assert_eq!(document.html(), "<html><head></head><body>hello</body></html>");
```
## Contributing
Please feel free to open pull requests. If you're planning to implement
something big (i.e. anything beyond a typo fix, a small bug fix, or a minor
refactor), please open an issue first.