# arcalive-crawler-python

### A Python program that crawls posts from arca.live.

It collects posts, page by page, from a given arca.live channel.
The collected data is saved as a JSON file with the following structure:

```py
{
    "channel": str,         # channel name
    "category": str,        # category name
    "best_only": bool,      # whether only "best" (featured) posts were collected
    "start_page": int,      # first page crawled
    "end_page": int,        # last page crawled
    "scrapped_time": str,   # crawl start time, "%Y-%m-%d %H:%M:%S"
    "total_data": int,      # number of posts collected
    "total_comments": int,  # number of comments collected
    "data": [               # List[dict]: the collected posts
        {
            "id": int,             # post id
            "title": str,          # post title
            "category": str,       # category the post belongs to, empty if none
            "time": str,           # post creation time, "%Y-%m-%d %H:%M:%S"
            "contents": str,       # post body
            "comments": List[str]  # post comments
        },
    ]
}
```

## Usage

Example: `python main.py genshin 1 2 -b`
-> crawls pages 1 through 2 of the "best" posts in the genshin channel.

`python main.py "channel name (in English)" "start page" "end page"`  
`-b, --best` (optional) crawl from the "best" posts page.  
`-c, --category "category name"` (optional) name of the channel category to crawl.  
`-n, --file_name "file name"` (optional) file name for the crawled data.  
`-s, --save_path "path"` (optional) directory to save the file in; by default a `data` folder is created in the current directory.  
`-p, --parser "parser name"` (optional) name of the parser beautifulsoup4 should use, default="html.parser"  
`-w, --workers number` (optional) max_workers to use for multithreading  

## Requirements
`requests`  
`beautifulsoup4`  
`tqdm`
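A crawl result in the format documented above can be loaded and sanity-checked with plain `json`. This is an illustrative sketch, not part of the project: the `sample` record and the `summarize` helper are made up here, and a real record would instead come from `json.load()` on the file the crawler wrote.

```python
import json

# A minimal record shaped like the crawler's documented output format.
# The values are invented for illustration only.
sample = {
    "channel": "genshin",
    "category": "",
    "best_only": True,
    "start_page": 1,
    "end_page": 2,
    "scrapped_time": "2021-08-25 05:06:22",
    "total_data": 1,
    "total_comments": 2,
    "data": [
        {
            "id": 12345,
            "title": "example post",
            "category": "",
            "time": "2021-08-25 05:00:00",
            "contents": "post body",
            "comments": ["first comment", "second comment"],
        }
    ],
}

def summarize(crawl: dict) -> str:
    """Return a one-line summary, cross-checking the stored totals
    against the actual payload in "data"."""
    assert crawl["total_data"] == len(crawl["data"])
    assert crawl["total_comments"] == sum(len(p["comments"]) for p in crawl["data"])
    return (f"{crawl['channel']}: pages {crawl['start_page']}-{crawl['end_page']}, "
            f"{crawl['total_data']} posts, {crawl['total_comments']} comments")

print(summarize(sample))
```

The same helper works on a saved file via `summarize(json.load(open(path, encoding="utf-8")))`.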