{"id":28061257,"url":"https://github.com/mmerlyn/sensor-stream","last_synced_at":"2026-04-12T17:05:00.303Z","repository":{"id":291496911,"uuid":"977712362","full_name":"mmerlyn/sensor-stream","owner":"mmerlyn","description":"End-to-end real-time sensor data pipeline using Python, Spark, PostgreSQL, Docker, and React for live visualization.","archived":false,"fork":false,"pushed_at":"2025-05-07T09:13:57.000Z","size":1059,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-05-12T10:07:15.120Z","etag":null,"topics":["apache-kafka","apache-spark","fast-api","postgresql-database","python","react","spark-streaming","streaming"],"latest_commit_sha":null,"homepage":"","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/mmerlyn.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-05-04T20:06:19.000Z","updated_at":"2025-05-07T09:14:00.000Z","dependencies_parsed_at":"2025-05-06T21:23:45.882Z","dependency_job_id":null,"html_url":"https://github.com/mmerlyn/sensor-stream","commit_stats":null,"previous_names":["mmerlyn/sensor-streaming","mmerlyn/sensor-stream"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/mmerlyn/sensor-stream","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mmerlyn%2Fsensor-stream","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mmerlyn%2Fsensor-stream/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mmerl
yn%2Fsensor-stream/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mmerlyn%2Fsensor-stream/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/mmerlyn","download_url":"https://codeload.github.com/mmerlyn/sensor-stream/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mmerlyn%2Fsensor-stream/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265942582,"owners_count":23853294,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["apache-kafka","apache-spark","fast-api","postgresql-database","python","react","spark-streaming","streaming"],"created_at":"2025-05-12T09:55:05.749Z","updated_at":"2026-04-12T17:05:00.291Z","avatar_url":"https://github.com/mmerlyn.png","language":"TypeScript","readme":"# SensorStream\n\nA real-time data pipeline that streams sensor readings through Kafka and Spark to a live React dashboard. I built this to gain hands-on experience with **distributed streaming systems**, **big data processing**, and **end-to-end data engineering**.\n\n[Demo](https://drive.google.com/file/d/10sotn4D0T8xfHV6UxCW88erpHJdvhuak/view?usp=sharing) | [Blog](https://medium.com/@merlynmercylona/building-a-live-sensor-monitoring-system-with-kafka-spark-postgresql-fastapi-react-e66a2aa10550)\n\n## Why I Built This\n\nI wanted to understand how real-time data systems work in production environments - the kind used for IoT telemetry, financial market data, and application monitoring. 
Rather than just reading about Kafka and Spark, I built a complete pipeline that ingests, processes, stores, and visualizes streaming data in real time. This forced me to solve real integration challenges: connecting distributed services, handling data serialization, managing stream processing checkpoints, and building responsive UIs that update continuously.\n\n## Skills Demonstrated\n\n| Area                     | Technologies \u0026 Concepts                                              |\n| ------------------------ | -------------------------------------------------------------------- |\n| **Stream Processing**    | Apache Spark Structured Streaming, PySpark, micro-batch processing   |\n| **Message Queues**       | Apache Kafka, ZooKeeper, producer/consumer patterns, topic design    |\n| **Backend Development**  | Python, FastAPI, REST API design, CORS configuration                 |\n| **Database Design**      | PostgreSQL, JDBC integration, indexing strategies, schema design     |\n| **Frontend Development** | React 19, TypeScript, Recharts, Tailwind CSS, polling-based updates  |\n| **Data Serialization**   | JSON schema validation, Spark StructType definitions                 |\n| **Containerization**     | Docker, Docker Compose, multi-service orchestration, networking      |\n| **System Integration**   | Service dependencies, environment configuration, cross-service comms |\n\n## Architecture\n\n```\n┌─────────────────────────────────────────────────────────────────────┐\n│                     React Dashboard (:5173)                         │\n│  • Live temperature/humidity charts    • 5-second polling           │\n│  • Rolling data window                 • Recharts visualization     │\n└─────────────────────────────┬───────────────────────────────────────┘\n                              │ HTTP GET /latest, /history\n                              ↓\n┌─────────────────────────────────────────────────────────────────────┐\n│                      FastAPI 
Backend (:8000)                        │\n│  • REST endpoints          • PostgreSQL queries                     │\n│  • CORS middleware         • JSON response formatting               │\n└─────────────────────────────┬───────────────────────────────────────┘\n                              │ SQL queries\n                              ↓\n┌─────────────────────────────────────────────────────────────────────┐\n│                      PostgreSQL (:5432)                             │\n│  • sensor_data table       • Timestamp DESC index                   │\n│  • Persistent storage      • Optimized for latest-first queries     │\n└─────────────────────────────┬───────────────────────────────────────┘\n                              ↑ JDBC batch writes\n                              │\n┌─────────────────────────────────────────────────────────────────────┐\n│                   Spark Structured Streaming                        │\n│  • Kafka consumer          • JSON parsing with schema validation    │\n│  • Micro-batch processing  • foreachBatch sink to PostgreSQL        │\n│  • Null filtering          • Error handling per batch               │\n└─────────────────────────────┬───────────────────────────────────────┘\n                              ↑ Consumes from topic\n                              │\n┌─────────────────────────────────────────────────────────────────────┐\n│                    Apache Kafka (:9092)                             │\n│  • Topic: sensor-data      • Distributed message queue              │\n│  • ZooKeeper coordination  • Decouples producer from consumer       │\n└─────────────────────────────┬───────────────────────────────────────┘\n                              ↑ Publishes sensor readings\n                              │\n┌─────────────────────────────────────────────────────────────────────┐\n│                      Python Producer                                │\n│  • Simulates sensor-001    • 1-second intervals                     │\n│  • JSON 
serialization      • Auto-retry on connection failure       │\n└─────────────────────────────────────────────────────────────────────┘\n```\n\n## Components Implemented\n\n**Data Producer**\n\n- Simulates environmental sensor generating temperature (20-30°C) and humidity (40-60%)\n- Publishes JSON messages to Kafka every second\n- Auto-retry logic with exponential backoff for broker connectivity\n- Clean shutdown handling\n\n**Stream Processor (Spark)**\n\n- Reads from Kafka with \"earliest\" offset for full data capture\n- Strongly-typed schema validation using Spark StructType\n- Filters malformed records before database insertion\n- Batch-writes to PostgreSQL via JDBC with error isolation\n\n**REST API (FastAPI)**\n\n- `GET /latest` - Returns most recent sensor reading\n- `GET /history?limit=N` - Returns N most recent readings in chronological order\n- CORS-enabled for cross-origin frontend requests\n- Connection pooling with proper error handling\n\n**Dashboard (React)**\n\n- Live-updating display with current temperature and humidity\n- Three interactive Recharts visualizations:\n  - Temperature trend line chart\n  - Humidity trend line chart\n  - Combined dual-axis chart for correlation analysis\n- 10-point rolling window to show recent trends\n- Loading states and error handling for API failures\n\n## What I Learned\n\n- **Distributed systems design**: Understanding how loosely-coupled services communicate through message queues and why decoupling producers from consumers improves system resilience\n- **Data pipeline architecture**: Designing end-to-end data flow from ingestion to storage to presentation, with clear boundaries between processing stages\n- **Schema enforcement**: Implementing strongly-typed data contracts at system boundaries to catch malformed data early and prevent downstream failures\n- **Database optimization**: Choosing appropriate indexing strategies based on query patterns rather than generic best practices\n- **Error handling in 
distributed systems**: Isolating failures to individual batches so one bad record doesn't halt the entire pipeline\n- **API design**: Building RESTful endpoints with proper CORS configuration, connection management, and meaningful error responses\n- **Frontend state management**: Separating concerns between data fetching, caching, and UI rendering in reactive applications\n- **Container orchestration**: Managing multi-service dependencies, networking, and environment configuration with Docker Compose\n- **Debugging across service boundaries**: Tracing data flow through multiple systems to identify where issues originate\n\n## License\n\nMIT\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmmerlyn%2Fsensor-stream","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmmerlyn%2Fsensor-stream","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmmerlyn%2Fsensor-stream/lists"}