{"id":21034710,"url":"https://github.com/reecejohnson/web-crawler","last_synced_at":"2025-10-25T11:46:41.652Z","repository":{"id":216820982,"uuid":"365135966","full_name":"reecejohnson/web-crawler","owner":"reecejohnson","description":"A command-line application to crawl all internal links for a specified URL. Built with: Java, Spring, JUnit, Mockito 🌍🕸.","archived":false,"fork":false,"pushed_at":"2021-06-23T09:52:31.000Z","size":522,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-20T15:57:04.898Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/reecejohnson.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2021-05-07T06:28:07.000Z","updated_at":"2021-06-23T16:16:51.000Z","dependencies_parsed_at":"2024-01-13T03:27:47.005Z","dependency_job_id":"f8035baa-52b3-4bb7-b5de-d56f403cec33","html_url":"https://github.com/reecejohnson/web-crawler","commit_stats":null,"previous_names":["reecejohnson/web-crawler"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/reecejohnson%2Fweb-crawler","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/reecejohnson%2Fweb-crawler/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/reecejohnson%2Fweb-crawler/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/reecejohnson%2Fweb-crawler/manifests","owner_url":"https://repos.ecosyste.ms
/api/v1/hosts/GitHub/owners/reecejohnson","download_url":"https://codeload.github.com/reecejohnson/web-crawler/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243475371,"owners_count":20296714,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-19T13:08:04.551Z","updated_at":"2025-10-25T11:46:36.619Z","avatar_url":"https://github.com/reecejohnson.png","language":"Java","readme":"# Web Crawler 🌎🕸\nA command-line application to crawl all internal links for a specified URL and print each URL visited, with a list of links found on that page, to the console.\n\n![Example output file](src/main/resources/example.png)\n\n### Rules\n- The crawler will follow only internal links, never external ones\n- No pre-built web-scraping frameworks are to be used\n- Smaller libraries are permitted (e.g. 
HTML parsing)\n\n## Run Tests\n`./gradlew test`\n\n## Run Application\nRun the application by providing two arguments:\n- The base URL to crawl\n- The number of threads to run concurrently\n\n`./gradlew run --args='https://www.url-to-crawl.com 4'`\n\n## Output\nRunning the application will produce an HTML file of the crawl results at `/output/results.html`\n\n## Questions \u0026 Queries\n📩 [reece@reecejohnson.co.uk](mailto:reece@reecejohnson.co.uk?subject=Web%20Crawler)\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Freecejohnson%2Fweb-crawler","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Freecejohnson%2Fweb-crawler","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Freecejohnson%2Fweb-crawler/lists"}