https://github.com/karthikmprakash/github_repos_scraper
A tool to extract names of github repos of any user
automation bs4 data github python repositories requests webscraping
- Host: GitHub
- URL: https://github.com/karthikmprakash/github_repos_scraper
- Owner: karthikmprakash
- Created: 2021-07-09T06:30:45.000Z (about 4 years ago)
- Default Branch: main
- Last Pushed: 2021-07-10T11:59:14.000Z (about 4 years ago)
- Last Synced: 2025-02-26T21:46:36.023Z (7 months ago)
- Topics: automation, bs4, data, github, python, repositories, requests, webscraping
- Language: Python
- Homepage:
- Size: 5.86 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# GitHub repo names of any user
A tool to extract the names of any user's GitHub repos
## Introduction
Web scraping (also called web harvesting or web data extraction) is the practice of extracting data from websites by parsing their HTML structure. In this post, I will explain the basic fundamentals of web scraping with Python and demonstrate it with two Python libraries, requests and Beautiful Soup.
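To make the workflow concrete, here is a minimal sketch of the requests + Beautiful Soup pattern described above; the URL and the tag being extracted are placeholders for illustration, not part of this repository's code.

```python
# Minimal sketch of the requests + Beautiful Soup workflow.
# The URL and tag names are placeholders, not from this repo.
import requests
from bs4 import BeautifulSoup

url = "https://example.com"           # hypothetical page to scrape
response = requests.get(url, timeout=10)
response.raise_for_status()           # fail early on HTTP errors

soup = BeautifulSoup(response.text, "html.parser")

# Extract data by HTML tag (and optionally class), as discussed above.
for heading in soup.find_all("h1"):
    print(heading.get_text(strip=True))
```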
## What you will learn from this post:
* a basic understanding of web scraping
* how to extract data from a website using HTML tags and classes
* how to use the requests module to fetch a page
* how to use Beautiful Soup to parse the HTML

## Requirements:
* python3
* requests
* bs4

## Installation:
* `sudo apt-get install python3-pip`
* `pip3 install requests`
* `pip3 install bs4`

## How to run this code
* Run the file main.py with `python3 main.py`
* Enter the GitHub username when prompted
* You'll then be asked whether you want to store the repo names in a file (a sketch of what such a script might look like follows below)
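For reference, here is a hedged sketch of how such a script could work. It is not the repository's actual main.py: the `?tab=repositories` URL and the `itemprop="name codeRepository"` attribute are assumptions about GitHub's current HTML (which may change), and pagination beyond the first page is not handled.

```python
# Sketch of a GitHub repo-name scraper using requests + bs4.
# Assumes GitHub lists a user's repos at /<user>?tab=repositories and marks
# each repo link with itemprop="name codeRepository"; only page 1 is read.
import requests
from bs4 import BeautifulSoup

username = input("Enter the GitHub username: ").strip()
url = f"https://github.com/{username}?tab=repositories"

response = requests.get(url, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
repo_links = soup.find_all("a", attrs={"itemprop": "name codeRepository"})
repo_names = [link.get_text(strip=True) for link in repo_links]

for name in repo_names:
    print(name)

# Optionally save the names to a file, mirroring the prompt described above.
if input("Save to file? (y/n): ").lower().startswith("y"):
    with open(f"{username}_repos.txt", "w") as f:
        f.write("\n".join(repo_names))
```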