{"id":31033295,"url":"https://github.com/scrapegraphai/scrapegraphai-ruby","last_synced_at":"2025-09-14T01:43:23.610Z","repository":{"id":310599868,"uuid":"1036618116","full_name":"ScrapeGraphAI/scrapegraphai-ruby","owner":"ScrapeGraphAI","description":null,"archived":false,"fork":false,"pushed_at":"2025-08-25T02:14:00.000Z","size":171,"stargazers_count":0,"open_issues_count":1,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2025-09-06T15:01:46.204Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Ruby","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ScrapeGraphAI.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-08-12T10:40:20.000Z","updated_at":"2025-08-12T12:25:48.000Z","dependencies_parsed_at":"2025-08-19T05:56:34.899Z","dependency_job_id":"32f40cf0-bb92-491b-8e5b-bc71f681afe1","html_url":"https://github.com/ScrapeGraphAI/scrapegraphai-ruby","commit_stats":null,"previous_names":["scrapegraphai/scrapegraphai-ruby"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/ScrapeGraphAI/scrapegraphai-ruby","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScrapeGraphAI%2Fscrapegraphai-ruby","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScrapeGraphAI%2Fscrapegraphai-ruby/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScrapeGraphAI%2Fscrapegraphai-ruby/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub
/repositories/ScrapeGraphAI%2Fscrapegraphai-ruby/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ScrapeGraphAI","download_url":"https://codeload.github.com/ScrapeGraphAI/scrapegraphai-ruby/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ScrapeGraphAI%2Fscrapegraphai-ruby/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":275051516,"owners_count":25396977,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-09-13T02:00:10.085Z","response_time":70,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-09-14T01:43:20.336Z","updated_at":"2025-09-14T01:43:23.598Z","avatar_url":"https://github.com/ScrapeGraphAI.png","language":"Ruby","readme":"# Scrapegraphai Ruby API library\n\nThe Scrapegraphai Ruby library provides convenient access to the Scrapegraphai REST API from any Ruby 3.2.0+ application. It ships with comprehensive types \u0026 docstrings in Yard, RBS, and RBI – [see below](https://github.com/stainless-sdks/scrapegraphai-ruby#Sorbet) for usage with Sorbet. 
The standard library's `net/http` is used as the HTTP transport, with connection pooling via the `connection_pool` gem.\n\nIt is generated with [Stainless](https://www.stainless.com/).\n\n## Documentation\n\nDocumentation for releases of this gem can be found [on RubyDoc](https://gemdocs.org/gems/scrapegraphai).\n\nThe REST API documentation can be found on [scrapegraphai.com](https://scrapegraphai.com).\n\n## Installation\n\nTo use this gem, install via Bundler by adding the following to your application's `Gemfile`:\n\n```ruby\ngem \"scrapegraphai\", \"~\u003e 0.0.1\"\n```\n\n## Usage\n\n```ruby\nrequire \"bundler/setup\"\nrequire \"scrapegraphai\"\n\nscrapegraphai = Scrapegraphai::Client.new(\n  api_key: ENV[\"SCRAPEGRAPHAI_API_KEY\"], # This is the default and can be omitted\n  environment: \"environment_1\" # defaults to \"production\"\n)\n\ncompleted_smartscraper = scrapegraphai.smartscraper.create(user_prompt: \"Extract the product name, price, and description\")\n\nputs(completed_smartscraper.request_id)\n```\n\n### Handling errors\n\nWhen the library is unable to connect to the API, or if the API returns a non-success status code (i.e., a 4xx or 5xx response), a subclass of `Scrapegraphai::Errors::APIError` will be raised:\n\n```ruby\nbegin\n  smartscraper = scrapegraphai.smartscraper.create(user_prompt: \"Extract the product name, price, and description\")\nrescue Scrapegraphai::Errors::APIConnectionError =\u003e e\n  puts(\"The server could not be reached\")\n  puts(e.cause)  # an underlying Exception, likely raised within `net/http`\nrescue Scrapegraphai::Errors::RateLimitError =\u003e e\n  puts(\"A 429 status code was received; we should back off a bit.\")\nrescue Scrapegraphai::Errors::APIStatusError =\u003e e\n  puts(\"Another non-200-range status code was received\")\n  puts(e.status)\nend\n```\n\nError codes are as follows:\n\n| Cause            | Error Type                 |\n| ---------------- | -------------------------- |\n| HTTP 400         | 
`BadRequestError`          |\n| HTTP 401         | `AuthenticationError`      |\n| HTTP 403         | `PermissionDeniedError`    |\n| HTTP 404         | `NotFoundError`            |\n| HTTP 409         | `ConflictError`            |\n| HTTP 422         | `UnprocessableEntityError` |\n| HTTP 429         | `RateLimitError`           |\n| HTTP \u003e= 500      | `InternalServerError`      |\n| Other HTTP error | `APIStatusError`           |\n| Timeout          | `APITimeoutError`          |\n| Network error    | `APIConnectionError`       |\n\n### Retries\n\nCertain errors will be automatically retried 2 times by default, with a short exponential backoff.\n\nConnection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict, 429 Rate Limit, \u003e=500 Internal errors, and timeouts will all be retried by default.\n\nYou can use the `max_retries` option to configure or disable this:\n\n```ruby\n# Configure the default for all requests:\nscrapegraphai = Scrapegraphai::Client.new(\n  max_retries: 0 # default is 2\n)\n\n# Or, configure per-request:\nscrapegraphai.smartscraper.create(\n  user_prompt: \"Extract the product name, price, and description\",\n  request_options: {max_retries: 5}\n)\n```\n\n### Timeouts\n\nBy default, requests will time out after 60 seconds. 
You can use the timeout option to configure or disable this:\n\n```ruby\n# Configure the default for all requests:\nscrapegraphai = Scrapegraphai::Client.new(\n  timeout: nil # default is 60\n)\n\n# Or, configure per-request:\nscrapegraphai.smartscraper.create(\n  user_prompt: \"Extract the product name, price, and description\",\n  request_options: {timeout: 5}\n)\n```\n\nOn timeout, `Scrapegraphai::Errors::APITimeoutError` is raised.\n\nNote that requests that time out are retried by default.\n\n## Advanced concepts\n\n### BaseModel\n\nAll parameter and response objects inherit from `Scrapegraphai::Internal::Type::BaseModel`, which provides several conveniences, including:\n\n1. All fields, including unknown ones, are accessible with `obj[:prop]` syntax, and can be destructured with `obj =\u003e {prop: prop}` or pattern-matching syntax.\n\n2. Structural equivalence for equality; if two API calls return the same values, comparing the responses with == will return true.\n\n3. Both instances and the classes themselves can be pretty-printed.\n\n4. 
Helpers such as `#to_h`, `#deep_to_h`, `#to_json`, and `#to_yaml`.\n\n### Making custom or undocumented requests\n\n#### Undocumented properties\n\nYou can send undocumented parameters to any endpoint, and read undocumented response properties, like so:\n\nNote: an `extra_` parameter with the same name as a documented parameter overrides the documented one.\n\n```ruby\ncompleted_smartscraper =\n  scrapegraphai.smartscraper.create(\n    user_prompt: \"Extract the product name, price, and description\",\n    request_options: {\n      extra_query: {my_query_parameter: value},\n      extra_body: {my_body_parameter: value},\n      extra_headers: {\"my-header\": value}\n    }\n  )\n\nputs(completed_smartscraper[:my_undocumented_property])\n```\n\n#### Undocumented request params\n\nIf you want to explicitly send an extra param, you can do so with the `extra_query`, `extra_body`, and `extra_headers` options under the `request_options:` parameter when making a request, as seen in the examples above.\n\n#### Undocumented endpoints\n\nTo make requests to undocumented endpoints while retaining the benefit of auth, retries, and so on, you can make requests using `client.request`, like so:\n\n```ruby\nresponse = client.request(\n  method: :post,\n  path: '/undocumented/endpoint',\n  query: {\"dog\": \"woof\"},\n  headers: {\"useful-header\": \"interesting-value\"},\n  body: {\"hello\": \"world\"}\n)\n```\n\n### Concurrency \u0026 connection pooling\n\n`Scrapegraphai::Client` instances are threadsafe, but are only fork-safe when there are no in-flight HTTP requests.\n\nEach instance of `Scrapegraphai::Client` has its own HTTP connection pool with a default size of 99. 
As such, we recommend instantiating the client once per application in most settings.\n\nWhen all available connections from the pool are checked out, requests wait for a new connection to become available, with queue time counting towards the request timeout.\n\nUnless otherwise specified, other classes in the SDK do not have locks protecting their underlying data structures.\n\n## Sorbet\n\nThis library provides comprehensive [RBI](https://sorbet.org/docs/rbi) definitions, and has no dependency on sorbet-runtime.\n\nYou can provide typesafe request parameters like so:\n\n```ruby\nscrapegraphai.smartscraper.create(user_prompt: \"Extract the product name, price, and description\")\n```\n\nOr, equivalently:\n\n```ruby\n# Hashes work, but are not typesafe:\nscrapegraphai.smartscraper.create(user_prompt: \"Extract the product name, price, and description\")\n\n# You can also splat a full Params class:\nparams = Scrapegraphai::SmartscraperCreateParams.new(\n  user_prompt: \"Extract the product name, price, and description\"\n)\nscrapegraphai.smartscraper.create(**params)\n```\n\n### Enums\n\nSince this library does not depend on `sorbet-runtime`, it cannot provide [`T::Enum`](https://sorbet.org/docs/tenum) instances. Instead, we provide \"tagged symbols\", which are always primitives at runtime:\n\n```ruby\n# :queued\nputs(Scrapegraphai::CompletedSmartscraper::Status::QUEUED)\n\n# Revealed type: `T.all(Scrapegraphai::CompletedSmartscraper::Status, Symbol)`\nT.reveal_type(Scrapegraphai::CompletedSmartscraper::Status::QUEUED)\n```\n\nEnum parameters have a \"relaxed\" type, so you can either pass in enum constants or their literal value:\n\n```ruby\nScrapegraphai::CompletedSmartscraper.new(\n  status: Scrapegraphai::CompletedSmartscraper::Status::QUEUED,\n  # …\n)\n\nScrapegraphai::CompletedSmartscraper.new(\n  status: :queued,\n  # …\n)\n```\n\n## Versioning\n\nThis package follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions. 
As the library is in initial development and has a major version of `0`, APIs may change at any time.\n\nThis package considers improvements to the (non-runtime) `*.rbi` and `*.rbs` type definitions to be non-breaking changes.\n\n## Requirements\n\nRuby 3.2.0 or higher.\n\n## Contributing\n\nSee [the contributing documentation](https://github.com/stainless-sdks/scrapegraphai-ruby/tree/main/CONTRIBUTING.md).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fscrapegraphai%2Fscrapegraphai-ruby","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fscrapegraphai%2Fscrapegraphai-ruby","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fscrapegraphai%2Fscrapegraphai-ruby/lists"}