{"id":26230105,"url":"https://github.com/frocode/news-data-pipeline_airflow_aws","last_synced_at":"2026-04-17T03:04:53.844Z","repository":{"id":251641272,"uuid":"836470068","full_name":"FroCode/news-Data-Pipeline_Airflow_AWS","owner":"FroCode","description":null,"archived":false,"fork":false,"pushed_at":"2024-08-08T17:25:00.000Z","size":51346,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2026-01-02T12:37:32.072Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/FroCode.png","metadata":{"files":{"readme":"README.md","changelog":"news.csv","contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-07-31T23:29:59.000Z","updated_at":"2024-10-02T22:07:49.000Z","dependencies_parsed_at":"2024-08-08T20:07:52.371Z","dependency_job_id":null,"html_url":"https://github.com/FroCode/news-Data-Pipeline_Airflow_AWS","commit_stats":null,"previous_names":["frocode/twitter-data-pipeline_airflow_aws","frocode/news-data-pipeline_airflow_aws"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/FroCode/news-Data-Pipeline_Airflow_AWS","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FroCode%2Fnews-Data-Pipeline_Airflow_AWS","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FroCode%2Fnews-Data-Pipeline_Airflow_AWS/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FroCode%2Fnews-Data-Pipeline_Airflow_AWS/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FroCode%2Fnews-Data-Pipeline_Airflow_AWS/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/FroCode","download_url":"https://codeload.github.com/FroCode/news-Data-Pipeline_Airflow_AWS/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/FroCode%2Fnews-Data-Pipeline_Airflow_AWS/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31913078,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-16T18:22:33.417Z","status":"online","status_checked_at":"2026-04-17T02:00:06.879Z","response_time":62,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-03-12T22:19:48.950Z","updated_at":"2026-04-17T03:04:53.823Z","avatar_url":"https://github.com/FroCode.png","language":"Python","readme":"# News API ETL Pipeline\n\n## Overview\n\nThis repository contains a simple ETL (Extract, Load) pipeline for fetching top headlines from the News API and uploading the 
## Prerequisites

- Python 3.7 or higher
- Apache Airflow
- `requests` library
- `pandas` library
- `boto3` library (for S3 operations)
- AWS credentials configured for `boto3`

## Setup

1. **Clone the Repository:**

    ```bash
    git clone https://github.com/yourusername/news_etl.git
    cd news_etl
    ```

2. **Install Dependencies:**

    It is recommended to use a virtual environment. Install the required Python libraries with:

    ```bash
    pip install requests pandas boto3 apache-airflow
    ```

3. **Configure Airflow:**

    - Set up Apache Airflow by following the [Airflow documentation](https://airflow.apache.org/docs/apache-airflow/stable/start.html).
    - Place the `news_etl_dag.py` file in your Airflow DAGs folder (usually located at `~/airflow/dags`).

4. **Configure AWS Credentials:**

    Ensure that your AWS credentials are configured. You can set them up using the AWS CLI:

    ```bash
    aws configure
    ```

    Alternatively, set environment variables:

    ```bash
    export AWS_ACCESS_KEY_ID=your_access_key_id
    export AWS_SECRET_ACCESS_KEY=your_secret_access_key
    ```

5. **Modify API Key:**

    Replace the placeholder API key in `news_etl.py` with your actual News API key:

    ```python
    api_key = 'your_news_api_key'
    ```

## Usage

1. **Running the ETL Process Manually:**

    You can run the ETL process directly with Python:

    ```bash
    python news_etl.py
    ```

2. **Scheduling with Airflow:**

    - Start the Airflow web server and scheduler:

      ```bash
      airflow webserver --port 8080
      airflow scheduler
      ```

    - Access the Airflow web interface at `http://localhost:8080`.
    - Trigger the `news_etl_dag` manually or wait for it to run on its daily schedule (a sketch of the DAG appears at the end of this README).

## S3 Bucket Configuration

- Ensure you have an S3 bucket named `reddits-data` (or adjust the bucket name in `news_etl.py` accordingly).
- The pipeline saves the news data as `news.csv` in the S3 bucket.

## Error Handling

- **API Errors:** Ensure your News API key is valid and not expired. Check the HTTP response codes in the logs for detailed error messages.
- **S3 Upload Errors:** Verify that your AWS credentials have the necessary permissions to upload files to the S3 bucket.

![Screenshot 2024-08-08 192422](https://github.com/user-attachments/assets/bf49797e-16ef-47f3-8f4c-57ec36e54a45)
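## Example DAG Sketch

The DAG below is a minimal sketch of what `news_etl_dag.py` might look like, assuming Airflow 2.x and that `run_news_etl` is importable from `news_etl.py`; the `default_args`, start date, and task id are illustrative assumptions, not the repository's actual code.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

from news_etl import run_news_etl  # hypothetical import; adjust to the actual module

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

# Daily schedule, matching the cadence described in the Usage section.
with DAG(
    dag_id="news_etl_dag",
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_etl = PythonOperator(
        task_id="run_news_etl",
        python_callable=run_news_etl,
    )
```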