Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

https://github.com/margaretkhendre/home-sales-vs-big-data

In this repository, Google Colab is paired with SparkSQL to determine key metrics about home sales data. Spark is also used to create temporary views, partition data, and cache/uncache a temporary table in the process.

big-data googlecollab ipynb-jupyter-notebook pyspark spark sparksql sql

README

# Big Data Challenge

## Background
In this challenge, you'll use your knowledge of SparkSQL to determine key metrics about home sales data. Then you'll use Spark to create temporary views, partition the data, cache and uncache a temporary table, and verify that the table has been uncached.

## Instructions
1. Import the necessary PySpark SQL functions for this assignment.

2. Read the home_sales_revised.csv data in the starter code into a Spark DataFrame.

3. Create a temporary table called home_sales.
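A minimal sketch of steps 1 through 3, assuming the starter code fetches the CSV from a hosted URL (the bucket address below is a placeholder, not the real one):

```python
from pyspark.sql import SparkSession
from pyspark import SparkFiles

# Start a Spark session for the challenge.
spark = SparkSession.builder.appName("HomeSales").getOrCreate()

# Placeholder URL; substitute the address from the starter code.
url = "https://example-bucket.s3.amazonaws.com/home_sales_revised.csv"
spark.sparkContext.addFile(url)

# Read the CSV with a header row and let Spark infer the column types.
df = spark.read.csv(SparkFiles.get("home_sales_revised.csv"),
                    header=True, inferSchema=True)

# Register the DataFrame as a temporary view for SparkSQL queries.
df.createOrReplaceTempView("home_sales")
```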

4. Answer the following questions using SparkSQL (a sketch of one such query follows this list):

- What is the average price of a four-bedroom house sold in each year? Round your answer to two decimal places.
- What is the average price of a home with three bedrooms and three bathrooms, for each year it was built? Round your answer to two decimal places.
- What is the average price of a home with three bedrooms, three bathrooms, two floors, and at least 2,000 square feet, for each year? Round your answer to two decimal places.
- What is the average price of a home per "view" rating, keeping only ratings whose average home price is greater than or equal to $350,000? Round your answer to two decimal places, and determine the run time for this query.
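As an example, the first question might look like the sketch below; the column names (`date`, `price`, `bedrooms`) are assumptions about the revised CSV's schema:

```python
# Average price of four-bedroom homes, grouped by the year they sold.
spark.sql("""
    SELECT YEAR(date) AS year_sold,
           ROUND(AVG(price), 2) AS avg_price
    FROM home_sales
    WHERE bedrooms = 4
    GROUP BY YEAR(date)
    ORDER BY year_sold
""").show()
```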

5. Cache your temporary table home_sales.

6. Check if your temporary table is cached.
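Steps 5 and 6 can each be a one-liner; the SQL `CACHE TABLE` form and `spark.catalog.isCached` are two of several equivalent routes:

```python
# Cache the temporary table, then confirm the cache took effect.
spark.sql("CACHE TABLE home_sales")
print(spark.catalog.isCached("home_sales"))  # expect: True
```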

7. Using the cached data, run the query that filters the view ratings to those with an average price greater than or equal to $350,000. Determine the runtime and compare it to the uncached runtime.
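A sketch of the timing comparison using Python's `time` module; run the same block before and after caching to compare wall-clock times (the `view` and `price` columns are assumed):

```python
import time

start_time = time.time()

# Average price per "view" rating, keeping ratings whose average
# price is at least $350,000.
spark.sql("""
    SELECT view, ROUND(AVG(price), 2) AS avg_price
    FROM home_sales
    GROUP BY view
    HAVING AVG(price) >= 350000
""").show()

print(f"--- {time.time() - start_time} seconds ---")
```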

8. Partition the formatted home sales data by the "date_built" field and write it out in Parquet format.
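The partitioned write is a single chained call; the output path below is a placeholder:

```python
# Write the data as Parquet, one folder per date_built value.
df.write.partitionBy("date_built").mode("overwrite").parquet("home_sales_partitioned")
```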

9. Create a temporary table for the parquet data.
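A sketch of step 9, reading the placeholder path back and registering it under a hypothetical view name:

```python
# Read the partitioned Parquet data and expose it to SparkSQL.
parquet_df = spark.read.parquet("home_sales_partitioned")
parquet_df.createOrReplaceTempView("home_sales_parquet")
```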

10. Run the query that filters the view ratings to those with an average price greater than or equal to $350,000 against the parquet data. Determine the runtime and compare it to the uncached runtime.

11. Uncache the home_sales temporary table.

12. Verify that the home_sales temporary table is uncached using PySpark.
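Finally, steps 11 and 12 mirror the caching pair from earlier:

```python
# Drop home_sales from the cache, then verify it is no longer cached.
spark.sql("UNCACHE TABLE home_sales")
print(spark.catalog.isCached("home_sales"))  # expect: False
```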