{"id":13576666,"url":"https://github.com/gosom/google-maps-scraper","last_synced_at":"2025-12-28T13:43:01.027Z","repository":{"id":154011463,"uuid":"631285104","full_name":"gosom/google-maps-scraper","owner":"gosom","description":"scrape data  data from Google Maps. Extracts data such as the name, address, phone number, website URL, rating,  reviews number, latitude and longitude, reviews,email and more for each place","archived":false,"fork":false,"pushed_at":"2025-03-02T22:15:58.000Z","size":21233,"stargazers_count":1253,"open_issues_count":21,"forks_count":177,"subscribers_count":11,"default_branch":"main","last_synced_at":"2025-03-28T19:58:28.517Z","etag":null,"topics":["distributed-scraper","distributed-scraping","golang","google-maps","google-maps-scraping","web-scraper","web-scraping"],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/gosom.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":".github/FUNDING.yaml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"github":"gosom"}},"created_at":"2023-04-22T14:35:13.000Z","updated_at":"2025-03-28T12:28:32.000Z","dependencies_parsed_at":"2024-01-16T20:28:29.575Z","dependency_job_id":"a5aa18cc-051e-4653-bc0f-a99f0b5f7cba","html_url":"https://github.com/gosom/google-maps-scraper","commit_stats":null,"previous_names":[],"tags_count":48,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gosom%2Fgoogle-maps-scraper","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gosom%2Fgoogle-maps-scraper/tags","rel
eases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gosom%2Fgoogle-maps-scraper/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gosom%2Fgoogle-maps-scraper/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/gosom","download_url":"https://codeload.github.com/gosom/google-maps-scraper/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247311905,"owners_count":20918340,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["distributed-scraper","distributed-scraping","golang","google-maps","google-maps-scraping","web-scraper","web-scraping"],"created_at":"2024-08-01T15:01:12.654Z","updated_at":"2025-12-28T13:43:01.019Z","avatar_url":"https://github.com/gosom.png","language":"Go","readme":"# Google maps scraper\n![build](https://github.com/gosom/google-maps-scraper/actions/workflows/build.yml/badge.svg)\n[![Go Report Card](https://goreportcard.com/badge/github.com/gosom/google-maps-scraper)](https://goreportcard.com/report/github.com/gosom/google-maps-scraper)\n[![Discord](https://img.shields.io/badge/Discord-Join%20Chat-7289DA?logo=discord\u0026logoColor=white)](https://discord.gg/fpaAVhNCCu)\n\n\u003e A free and open-source Google Maps scraper with both command line and web UI options. 
This tool is easy to use and allows you to extract data from Google Maps efficiently.\n\n## Join Our Community\n\n[![Discord](https://img.shields.io/badge/Discord-Join%20Chat-7289DA?logo=discord\u0026logoColor=white)](https://discord.gg/fpaAVhNCCu)\n\nJoin our Discord server to get help, share ideas, and connect with other users of the Google Maps Scraper!\n\n## 🎯 Need a Central Database for Your Leads?\n\nScraped data is just the beginning. **[LeadsDB](https://getleadsdb.com/)** is your central database for business leads:\n\n- **AI Agent Integration** - Connect any MCP-compatible AI to manage leads with natural language\n- **Automatic Deduplication** - Duplicates are detected and merged automatically\n- **Advanced Filters** - Combine multiple filters with AND/OR logic on any field\n- **Flexible Export** - Export filtered results to CSV or JSON anytime\n- **REST API** - Full CRUD API to use LeadsDB as a backend for your apps\n\n**Start free with 500 leads** 👉 [Join the Waitlist](https://getleadsdb.com/)\n\n## Sponsors\n\n\n### Supported by the Community\n\nThis project relies on the support of its users and sponsors to stay alive and improve. If you find it useful, here’s how you can help:\n\n- ⭐ **Star the repository** to show your support and help others discover it.\n- ❤️ **Sponsor the project** to contribute directly to its development. [Become a sponsor →](https://github.com/sponsors/gosom)\n- 🤝 **Use the services of our sponsors** to support the project while benefiting from their offerings.\n\nYour support ensures the project remains maintained and continues to grow. Thank you!\n\n### Premium Sponsors\n\n**No time for code? 
Extract ALL Google Maps listings at country-scale in 2 clicks, without keywords or limits** 👉 [Try it now for free](https://scrap.io?utm_medium=ads\u0026utm_source=github_gosom_gmap_scraper)\n\n[![Extract ALL Google Maps Listings](./img/premium_scrap_io.png)](https://scrap.io?utm_medium=ads\u0026utm_source=github_gosom_gmap_scraper)\n\n\u003chr\u003e\n\n\u003ctable\u003e\n\u003ctr\u003e\n\u003ctd\u003e\u003cimg src=\"./img/SerpApi-logo-w.png\" alt=\"SerpApi Logo\" width=\"100\"\u003e\u003c/td\u003e\n\u003ctd\u003e\n\u003cb\u003eAt SerpApi, we scrape public data from Google Maps and other top search engines.\u003c/b\u003e\n\nYou can find the full list of our APIs here: [https://serpapi.com/search-api](https://serpapi.com/search-api)\n\u003c/td\u003e\n\u003c/tr\u003e\n\u003c/table\u003e\n\n[![SerpApi Banner](./img/SerpApi-banner.png)](https://serpapi.com/?utm_source=google-maps-scraper)\n\n\u003chr\u003e\n\n**G Maps Extractor**  \nA no-code Google Maps scraper that pulls business leads from Google Maps in one click.\n\n- 📇 **Includes** emails, social profiles, phone numbers, addresses, reviews, images and more.\n- 📥 **Export** to CSV · Excel · JSON  \n- 🔌 **API** Support: Extract data via [API](https://gmapsextractor.com/google-maps-api?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom)\n- 🎁 **Free**: Get your first **1,000 leads** today  \n[Get Started for Free](https://gmapsextractor.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom)\n\n[![Gmaps Extractor](./img/gmaps-extractor-banner.png)](https://gmapsextractor.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom)\n\n\u003c/hr\u003e\n\n### Special Thanks to:\n\n[![Google Maps API for easy SERP scraping](https://www.searchapi.io/press/v1/svg/searchapi_logo_black_h.svg)](https://www.searchapi.io/google-maps?via=gosom)\n**Google Maps API for easy SERP 
scraping**\n\n\u003chr\u003e\n\n[Evomi](https://evomi.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom-maps) is your Swiss Quality Proxy Provider, starting at **$0.49/GB**\n\n[![Evomi Banner](https://my.evomi.com/images/brand/cta.png)](https://evomi.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom-maps)\n\n\u003chr\u003e\n\n[Decodo's proxies](https://visit.decodo.com/APVbbx) with #1 response time in the market\n\nCollect data without facing CAPTCHAs, IP bans, or geo-restrictions\n- ● 125M+ IP pool\n- ● 195+ locations worldwide  \n- ● 24/7 tech support\n- ● Extensive documentation\n\n**[Start your 3-day free trial with 100MB →](https://visit.decodo.com/APVbbx)**\n\n![Decodo](./img/decodo.png)\n\n\u003chr\u003e\n\n\n\n## What Google maps scraper does\n\nA command line and web based Google Maps scraper built using the\n[scrapemate](https://github.com/gosom/scrapemate) web crawling framework.\n\nYou can use this repository either as is, or you can use its code as a base and\ncustomize it to your needs.\n\n![Example GIF](img/example.gif)\n\n### Web UI:\n\n```\nmkdir -p gmapsdata \u0026\u0026 docker run -v $PWD/gmapsdata:/gmapsdata -p 8080:8080 gosom/google-maps-scraper -data-folder /gmapsdata\n```\n\nOr download the [binary](https://github.com/gosom/google-maps-scraper/releases) for your platform and run it.\n\nNote: The results will take at least 3 minutes to appear, even if you add only one keyword. This is the minimum configured runtime.\n\nNote: on macOS the docker command may not work. 
**HELP REQUIRED**\n\n\n### Command line:\n\n```\ntouch results.csv \u0026\u0026 docker run -v $PWD/example-queries.txt:/example-queries -v $PWD/results.csv:/results.csv gosom/google-maps-scraper -depth 1 -input /example-queries -results /results.csv -exit-on-inactivity 3m\n```\n\nThe file `results.csv` will contain the parsed results.\n\n**If you want emails, additionally use the `-email` parameter.**\n\n### REST API\nThe Google Maps Scraper provides a RESTful API for programmatic management of scraping tasks.\n\n### Key Endpoints\n\n- POST /api/v1/jobs: Create a new scraping job\n- GET /api/v1/jobs: List all jobs\n- GET /api/v1/jobs/{id}: Get details of a specific job\n- DELETE /api/v1/jobs/{id}: Delete a job\n- GET /api/v1/jobs/{id}/download: Download job results as CSV\n\nFor detailed API documentation, refer to the OpenAPI 3.0.3 specification, available through Swagger UI or Redoc at http://localhost:8080/api/docs while the app is running.\n\n\n## 🌟 Support the Project!\n\nIf you find this tool useful, consider giving it a **star** on GitHub. \nFeel free to check out the **Sponsor** button on this repository to see how you can further support the development of this project. \nYour support helps ensure continued improvement and maintenance.\n\n\n## Features\n\n- Extracts many data points from Google Maps\n- Exports the data to CSV, JSON or PostgreSQL \n- Performance: about 120 URLs per minute (-depth 1 -c 8)\n- Extendable to write your own exporter\n- Dockerized for easy use on multiple platforms\n- Scalable across multiple machines\n- Optionally extracts emails from the website of the business\n- SOCKS5/HTTP/HTTPS proxy support\n- Serverless execution via AWS Lambda functions (experimental \u0026 no documentation yet)\n- Fast Mode (BETA)\n\n## Notes on email extraction\n\nBy default email extraction is disabled. 
\n\nIf you enable email extraction (see quickstart) then the scraper will visit the\nwebsite of the business (if one exists) and try to extract emails from the\npage.\n\nFor the moment it checks only one page of the website (the one registered in Google Maps). Support for extracting from other pages (about, contact, impressum, etc.) may be added later. \n\n\nKeep in mind that enabling email extraction results in longer processing time, since more\npages are scraped. \n\n## Fast Mode\n\nFast mode returns at most 21 search results per query, ordered by distance from the **latitude** and **longitude** provided.\nAll the results are within the specified **radius**.\n\nIt returns only the basic data points rather than the full set. \nHowever, it extracts data much faster. \n\nWhen you use fast mode, ensure that you have provided:\n- zoom\n- radius (in meters)\n- latitude\n- longitude\n\n\n**Fast mode is Beta, you may experience blocking**\n\n## Extracted Data Points\n\n#### 1. `input_id`\n- Internal identifier for the input query.\n\n#### 2. `link`\n- Direct URL to the business listing on Google Maps.\n\n#### 3. `title`\n- Name of the business.\n\n#### 4. `category`\n- Business type or category (e.g., Restaurant, Hotel).\n\n#### 5. `address`\n- Street address of the business.\n\n#### 6. `open_hours`\n- Business operating hours.\n\n#### 7. `popular_times`\n- Estimated visitor traffic at different times of the day.\n\n#### 8. `website`\n- Official business website.\n\n#### 9. `phone`\n- Business contact phone number.\n\n#### 10. `plus_code`\n- Shortcode representing the precise location of the business.\n\n#### 11. `review_count`\n- Total number of customer reviews.\n\n#### 12. `review_rating`\n- Average star rating based on reviews.\n\n#### 13. `reviews_per_rating`\n- Breakdown of reviews by each star rating (e.g., number of 5-star, 4-star reviews).\n\n#### 14. 
`latitude`\n- Latitude coordinate of the business location.\n\n#### 15. `longitude`\n- Longitude coordinate of the business location.\n\n#### 16. `cid`\n- **Customer ID** (CID) used by Google Maps to uniquely identify a business listing. This ID remains stable across updates and can be used in URLs.\n- **Example:** `3D3174616216150310598`\n\n#### 17. `status`\n- Business status (e.g., open, closed, temporarily closed).\n\n#### 18. `descriptions`\n- Brief description of the business.\n\n#### 19. `reviews_link`\n- Direct link to the reviews section of the business listing.\n\n#### 20. `thumbnail`\n- URL to a thumbnail image of the business.\n\n#### 21. `timezone`\n- Time zone of the business location.\n\n#### 22. `price_range`\n- Price range of the business (`$`, `$$`, `$$$`).\n\n#### 23. `data_id`\n- An internal Google Maps identifier composed of two hexadecimal values separated by a colon.\n- **Structure:** `\u003cspatial_hex\u003e:\u003clisting_hex\u003e`\n- **Example:** `0x3eb33fecd7dfa167:0x2c0e80a0f5d57ec6`\n- **Note:** This value may change if the listing is updated and should not be used for permanent identification.\n\n#### 24. `images`\n- Links to images associated with the business.\n\n#### 25. `reservations`\n- Link to book reservations (if available).\n\n#### 26. `order_online`\n- Link to place online orders.\n\n#### 27. `menu`\n- Link to the menu (for applicable businesses).\n\n#### 28. `owner`\n- Indicates whether the business listing is claimed by the owner.\n\n#### 29. `complete_address`\n- Fully formatted address of the business.\n\n#### 30. `about`\n- Additional information about the business.\n\n#### 31. `user_reviews`\n- Collection of customer reviews, including text, rating, and timestamp.\n\n#### 32. `emails`\n- Email addresses associated with the business, if available.\n\n#### 33. `user_reviews_extended`\n- Collection of customer reviews, including text, rating, and timestamp. 
This includes all the\n  reviews that can be extracted (up to around 300)\n\n**Note**: email is empty by default (see Usage)\n\n**Note**: user_reviews_extended is empty by default. You need to start the program with the\n`-extra-reviews` command line flag to enable this (see Usage)\n\n**Note**: input_id is an ID that you can define per query. By default it's a UUID.\nIn order to define it, you can have an input file like:\n\n```\nMatsuhisa Athens #!#MyIDentifier\n```\n\n## Quickstart\n\n### Using docker:\n\n```\ntouch results.csv \u0026\u0026 docker run -v $PWD/example-queries.txt:/example-queries -v $PWD/results.csv:/results.csv gosom/google-maps-scraper -depth 1 -input /example-queries -results /results.csv -exit-on-inactivity 3m\n```\n\nThe file `results.csv` will contain the parsed results.\n\n**If you want emails, additionally use the `-email` parameter.**\n\n**All Reviews**\nYou can fetch up to around 300 reviews instead of the first 8 by using the \ncommand line parameter `--extra-reviews`. If you do that, I recommend you use JSON\noutput instead of CSV.\n\n\n### On your host\n\n(tested only on Ubuntu 22.04)\n\n**make sure you use go version 1.25.5**\n\n\n```\ngit clone https://github.com/gosom/google-maps-scraper.git\ncd google-maps-scraper\ngo mod download\ngo build\n./google-maps-scraper -input example-queries.txt -results restaurants-in-cyprus.csv -exit-on-inactivity 3m\n```\n\nBe a little bit patient. 
On the first run it downloads the required libraries.\n\nResults are written to the specified `results` file as they arrive.\n\n**If you want emails, additionally use the `-email` parameter.**\n\n### Using a Proxy\n\n#### UI\nFrom the UI, set the URL, username, and password.\n\n#### Command line\n\nUse the `-proxies` option like:\n\n```\n./google-maps-scraper -input example-queries.txt -results random.txt -proxies '\u003cproxy1\u003e,\u003cproxy2\u003e' -depth 1 -c 2\n```\n\nwhere each `\u003cproxyN\u003e` is a valid proxy URL like:\n\n```\nscheme://username:password@host:port\n```\n\nif your proxy does not require authentication:\n\n```\nscheme://host:port\n```\n\nSupported schemes:\n\n- socks5\n- socks5h\n- http\n- https\n\nI encourage you to buy a proxy service from one of our sponsors.\nThey are reliable and help me maintain the project.\n\n#### Example with Decodo Proxies\n\n[Decodo](https://visit.decodo.com/APVbbx) offers high-performance proxies with the #1 response time in the market:\n\n```bash\n./google-maps-scraper -input example-queries.txt -results restaurants.csv -proxies 'http://username:password@proxy.decodo.com:8080' -depth 1 -c 2\n```\n\n**[Get your Decodo proxy credentials →](https://visit.decodo.com/APVbbx)** | **[View detailed Decodo integration guide →](decodo.md)**\n\n\n### Command line options\n\nTry `./google-maps-scraper -h` to see the command line options available:\n```\n  -addr string\n        address to listen on for web server (default \":8080\")\n  -aws-access-key string\n        AWS access key\n  -aws-lambda\n        run as AWS Lambda function\n  -aws-lambda-chunk-size int\n        AWS Lambda chunk size (default 100)\n  -aws-lambda-invoker\n        run as AWS Lambda invoker\n  -aws-region string\n        AWS region\n  -aws-secret-key string\n        AWS secret key\n  -c int\n        sets the concurrency [default: half of CPU cores] (default 1)\n  -cache string\n        sets the cache directory [no effect at the 
moment] (default \"cache\")\n  -data-folder string\n        data folder for web runner (default \"webdata\")\n  -debug\n        enable headful crawl (opens browser window) [default: false]\n  -depth int\n        maximum scroll depth in search results [default: 10] (default 10)\n  -disable-page-reuse\n        disable page reuse in playwright\n  -dsn string\n        database connection string [only valid with database provider]\n  -email\n        extract emails from websites\n  -exit-on-inactivity duration\n        exit after inactivity duration (e.g., '5m')\n  -extra-reviews\n        enable extra reviews collection\n  -fast-mode\n        fast mode (reduced data collection)\n  -function-name string\n        AWS Lambda function name\n  -geo string\n        set geo coordinates for search (e.g., '37.7749,-122.4194')\n  -input string\n        path to the input file with queries (one per line) [default: empty]\n  -json\n        produce JSON output instead of CSV\n  -lang string\n        language code for Google (e.g., 'de' for German) [default: en] (default \"en\")\n  -leadsdb-api-key string\n        LeadsDB API key for exporting results to LeadsDB\n  -produce\n        produce seed jobs only (requires dsn)\n  -proxies string\n        comma separated list of proxies to use in the format protocol://user:pass@host:port example: socks5://localhost:9050 or http://user:pass@localhost:9050\n  -radius float\n        search radius in meters. 
Default is 10000 meters (default 10000)\n  -results string\n        path to the results file [default: stdout] (default \"stdout\")\n  -s3-bucket string\n        S3 bucket name\n  -web\n        run web server instead of crawling\n  -writer string\n        use custom writer plugin (format: 'dir:pluginName')\n  -zoom int\n        set zoom level (0-21) for search (default 15)\n```\n\n## Using a custom writer\n\nWhen the results need to be written in a custom format or to another system (a database, a message queue, or basically anything), the Go plugin system can be utilized.\n\nWrite a Go plugin (see an example in examples/plugins/example_writer.go).\n\nCompile it using (for Linux):\n\n```\ngo build -buildmode=plugin -tags=plugin -o ~/mytest/plugins/example_writer.so examples/plugins/example_writer.go\n```\n\nand then run the program using the `-writer` argument.\n\nSee an example:\n\n1. Write your plugin (use examples/plugins/example_writer.go as a reference)\n2. Build your plugin: `go build -buildmode=plugin -tags=plugin -o ~/myplugins/example_writer.so examples/plugins/example_writer.go`\n3. Download the latest [release](https://github.com/gosom/google-maps-scraper/releases/) or build the program\n4. 
Run the program like `./google-maps-scraper -writer ~/myplugins:DummyPrinter -input example-queries.txt`\n\n\n### Plugins and Docker\n\nIt is possible to use the plugins with the docker image.\nIn that case, make sure that the shared library is built against a GLIBC version compatible with the docker image,\notherwise you will encounter an error like:\n\n```\n/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by /plugins/example_writer.so)\n```\n\n## Exporting to LeadsDB\n\nYou can export your scraped results directly to [LeadsDB](https://getleadsdb.com/), a central database for managing business leads.\n\nUsing LeadsDB allows you to:\n- **Filter leads** with advanced AND/OR logic on any field\n- **Export custom fields** - select exactly which fields you need in CSV or JSON\n- **Access via API** - integrate with your own apps or AI agents\n\n### Usage\n\n```bash\n./google-maps-scraper -input example-queries.txt -leadsdb-api-key \"your-api-key\" -exit-on-inactivity 3m\n```\n\nOr using an environment variable:\n\n```bash\nexport LEADSDB_API_KEY=\"your-api-key\"\n./google-maps-scraper -input example-queries.txt -exit-on-inactivity 3m\n```\n\n### What gets exported\n\nThe scraper maps Google Maps data to LeadsDB leads:\n\n| Google Maps Field | LeadsDB Field |\n|-------------------|---------------|\n| Title | Name |\n| Category | Category |\n| Categories | Tags |\n| Phone | Phone |\n| Website | Website |\n| Address | Address, City, State, Country, PostalCode |\n| Latitude/Longitude | Coordinates |\n| Review Rating | Rating |\n| Review Count | ReviewCount |\n| Emails | Email |\n| Thumbnail | LogoURL |\n| CID | SourceID |\n\nAdditional fields like Google Maps link, plus code, price range, owner info, and more are stored as custom attributes.\n\n### Getting an API Key\n\nSign up at [LeadsDB](https://getleadsdb.com/) and get your API key from the [settings page](https://getleadsdb.com/settings).\n\n## Using Database Provider (PostgreSQL)\n\nFor running 
on your local machine:\n\n```\ndocker-compose -f docker-compose.dev.yaml up -d\n```\n\nThe above starts a PostgreSQL container and creates the required tables.\n\nTo access the db:\n\n```\npsql -h localhost -U postgres -d postgres\n```\n\nPassword is `postgres`.\n\nThen from your host run:\n\n```\ngo run main.go -dsn \"postgres://postgres:postgres@localhost:5432/postgres\" -produce -input example-queries.txt --lang el\n```\n\n(configure your queries and the desired language)\n\nThis will populate the table `gmaps_jobs`.\n\nYou may run the scraper using:\n\n```\ngo run main.go -c 2 -depth 1 -dsn \"postgres://postgres:postgres@localhost:5432/postgres\"\n```\n\nIf you have a database server and several machines you can start multiple instances of the scraper as above.\n\n### Kubernetes\n\nYou may run the scraper in a kubernetes cluster. This makes it easier to scale.\n\nAssuming you have a kubernetes cluster and a database that is accessible from the cluster:\n\n1. First populate the database as shown above\n2. Create a deployment file `scraper.deployment`\n\n```\napiVersion: apps/v1\nkind: Deployment\nmetadata:\n  name: google-maps-scraper\nspec:\n  selector:\n    matchLabels:\n      app: google-maps-scraper\n  replicas: {NUM_OF_REPLICAS}\n  template:\n    metadata:\n      labels:\n        app: google-maps-scraper\n    spec:\n      containers:\n      - name: google-maps-scraper\n        image: gosom/google-maps-scraper:v0.9.3\n        imagePullPolicy: IfNotPresent\n        args: [\"-c\", \"1\", \"-depth\", \"10\", \"-dsn\", \"postgres://{DBUSER}:{DBPASSWD}@{DBHOST}:{DBPORT}/{DBNAME}\", \"-lang\", \"{LANGUAGE_CODE}\"]\n```\n\nPlease replace the values or the command args accordingly.\n\nNote: Keep in mind that because the application starts a headless browser it requires CPU and memory. \nUse an appropriately sized kubernetes cluster.\n\n## Telemetry\n\nAnonymous usage statistics are collected for debug and improvement reasons. 
\nYou can opt out by setting the env variable `DISABLE_TELEMETRY=1`\n\n## Performance\n\nExpected speed with concurrency of 8 and depth 1 is 120 jobs per minute.\nEach search is 1 job + the number of results it contains.\n\nBased on the above: \nif we have 1000 keywords to search and each contains 16 results, that's 1000 * 16 = 16000 jobs.\n\nWe expect this to take about 16000/120 ~ 133 minutes, i.e., a bit over 2 hours.\n\nIf you want to scrape many keywords then it's better to use the Database Provider in\ncombination with Kubernetes for convenience and start multiple scrapers on more than one machine.\n\n## References\n\nFor more instructions you may also read the following links:\n\n- https://blog.gkomninos.com/how-to-extract-data-from-google-maps-using-golang\n- https://blog.gkomninos.com/distributed-google-maps-scraping\n- https://github.com/omkarcloud/google-maps-scraper/tree/master (also a nice project) [many thanks for the idea to extract the data by utilizing the JS objects]\n\n\n## Licence\n\nThis code is licensed under the MIT License.\n\n\n## Contributing\n\nPlease open an issue or make a pull request.\n\n\nThank you for considering support for the project. 
Every bit of assistance helps maintain momentum and enhances the scraper’s capabilities!\n\n\n\n\n## Sponsors\n\n### Special Thanks to:\n\n\n[Decodo's proxies](https://visit.decodo.com/APVbbx) with #1 response time in the market\n\nCollect data without facing CAPTCHAs, IP bans, or geo-restrictions\n- ● 125M+ IP pool\n- ● 195+ locations worldwide  \n- ● 24/7 tech support\n- ● Extensive documentation\n\n**[Start your 3-day free trial with 100MB →](https://visit.decodo.com/APVbbx)**\n\n![Decodo](./img/decodo.png)\n\n\u003cbr\u003e\n\n[Evomi](https://evomi.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom-maps) is your Swiss Quality Proxy Provider, starting at **$0.49/GB**\n\n- 👩‍💻 **$0.49 per GB Residential Proxies**: Our price is unbeatable\n- 👩‍💻 **24/7 Expert Support**: We will join your Slack Channel\n- 🌍 **Global Presence**: Available in 150+ Countries\n- ⚡ **Low Latency**\n- 🔒 **Swiss Quality and Privacy**\n- 🎁 **Free Trial**\n- 🛡️ **99.9% Uptime**\n- 🤝 **Special IP Pool selection**: Optimize for fast, quality or quantity of ips\n- 🔧 **Easy Integration**: Compatible with most software and programming languages\n\n[![Evomi Banner](https://my.evomi.com/images/brand/cta.png)](https://evomi.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom-maps)\n\n\u003cbr\u003e\n\n[![Google Maps API for easy SERP scraping](https://www.searchapi.io/press/v1/svg/searchapi_logo_black_h.svg)](https://www.searchapi.io/google-maps?via=gosom)\n**Google Maps API for easy SERP scraping**\n\n\n\n### Premium Sponsors\n\n\u003ctable\u003e\n\u003ctr\u003e\n\u003ctd\u003e\n\u003ca href=\"https://gmapsextractor.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom\"\u003e\n\u003cimg src=\"img/gmaps-extractor-logo.png\" alt=\"G Maps Extractor Logo\" width=\"100\"\u003e\n\u003c/a\u003e\n\u003c/td\u003e\n\u003ctd\u003e\n\u003cb\u003eG Maps Extractor\u003c/b\u003e  \nA no-code Google Maps scraper that pulls business leads from Google Maps in one 
click.\n\n- 📇 **Includes** emails, social profiles, phone numbers, addresses, reviews, images and more.\n- 📥 **Export** to CSV · Excel · JSON\n- 🔌 **API** Support: Extract data via [API](https://gmapsextractor.com/google-maps-api?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom)\n- 🎁 **Free**: Get your first **1,000 leads** today  \n\u003ca href=\"https://gmapsextractor.com?utm_source=github\u0026utm_medium=banner\u0026utm_campaign=gosom\"\u003eGet Started for Free\u003c/a\u003e\n\u003c/td\u003e\n\u003c/tr\u003e\n\u003c/table\u003e\n\u003chr\u003e\n\n\u003ctable\u003e\n\u003ctr\u003e\n\u003ctd\u003e\u003cimg src=\"./img/SerpApi-logo-w.png\" alt=\"SerpApi Logo\" width=\"100\"\u003e\u003c/td\u003e\n\u003ctd\u003e\n\u003cb\u003eAt SerpApi, we scrape public data from Google Maps and other top search engines.\u003c/b\u003e\n\nYou can find the full list of our APIs here: [https://serpapi.com/search-api](https://serpapi.com/search-api)\n\u003c/td\u003e\n\u003c/tr\u003e\n\u003c/table\u003e\n\nFor more information, see [document](serpapi.md).\n\n\n\u003chr\u003e\n\n**No time for code? Extract ALL Google Maps listings at country-scale in 2 clicks, without keywords or limits** 👉 [Try it now for free](https://scrap.io?utm_medium=ads\u0026utm_source=github_gosom_gmap_scraper)\n\n[![Extract ALL Google Maps Listings](./img/premium_scrap_io.png)](https://scrap.io?utm_medium=ads\u0026utm_source=github_gosom_gmap_scraper)\n\nFor more information, see [scrap.io demo](scrap_io.md).\n\n\n### Supported by the Community\n\n[Supported by the community](https://github.com/sponsors/gosom)\n\nIf you're planning to use DigitalOcean, signing up through this link helps support the project. 
You get $200 in credit over 60 days, and I receive $25 once you've spent $25:\n\n\u003ca href=\"https://www.digitalocean.com/?refcode=c11136c4693c\u0026utm_campaign=Referral_Invite\u0026utm_medium=Referral_Program\u0026utm_source=badge\"\u003e\u003cimg src=\"https://web-platforms.sfo2.cdn.digitaloceanspaces.com/WWW/Badge%201.svg\" alt=\"DigitalOcean Referral Badge\" /\u003e\u003c/a\u003e\n\n\n## Notes\n\nPlease use this scraper responsibly and in accordance with all applicable laws and regulations. Unauthorized scraping of data may violate the terms of service of the website being scraped.\n\nThe banner is generated using OpenAI's DALL-E.\n\u003e **Note:** If you register via the links on my page, I may get a commission. This is another way to support my work.\n\n","funding_links":["https://github.com/sponsors/gosom"],"categories":["Go"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgosom%2Fgoogle-maps-scraper","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgosom%2Fgoogle-maps-scraper","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgosom%2Fgoogle-maps-scraper/lists"}