Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/gabfr/wunderfleet-task
Simple payment processing formulary challenge, using an external API and 3-stepped form
- Host: GitHub
- URL: https://github.com/gabfr/wunderfleet-task
- Owner: gabfr
- Created: 2019-08-14T16:08:05.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2023-01-04T07:14:53.000Z (about 2 years ago)
- Last Synced: 2023-03-27T11:00:55.351Z (almost 2 years ago)
- Topics: laravel, tests, vue
- Language: PHP
- Homepage:
- Size: 3.49 MB
- Stars: 0
- Watchers: 1
- Forks: 1
- Open Issues: 24
Metadata Files:
- Readme: README.md
README
# Wunderfleet Code Challenge
This is a simple application with a 3-step form to collect the payment data of future Wunderfleet customers!
To tackle the proposed problem I chose the Laravel PHP framework together with the Vue.js JavaScript framework.
With the Laravel Framework I developed two simple API endpoints:
- `POST /api/customers` - Create a customer with the data filled in on the form;
- `GET /api/customers/{customer_uid}` - Retrieve the updated customer data.
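For illustration, here is a minimal sketch of how a client could call these two endpoints. The payload field names and the injectable `fetchFn` parameter are assumptions for the example, not the repo's actual schema:

```javascript
// Minimal client sketch for the two endpoints described above.
// Field names in the payload are illustrative assumptions.
const API_BASE = '/api';

function customerUrl(uid) {
  return `${API_BASE}/customers/${encodeURIComponent(uid)}`;
}

// Create a customer from the form data (POST /api/customers).
async function createCustomer(data, fetchFn = fetch) {
  const res = await fetchFn(`${API_BASE}/customers`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

// Retrieve the updated customer data (GET /api/customers/{customer_uid}).
async function getCustomer(uid, fetchFn = fetch) {
  const res = await fetchFn(customerUrl(uid));
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```

Passing `fetchFn` explicitly just makes the sketch easy to test; in the browser the global `fetch` default applies.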
Laravel has a built-in queue processing system that allows us to run heavy and/or I/O-blocking routines in the
background on the server, so the client request does not have to wait while we call the Wunderfleet fake payment API.

Then, I used Vue.js as a Single Page Application that calls these endpoints.
This way we can decouple the interface from the backend whenever we need to, and this architecture scales well.

Last but not least, to satisfy the challenge constraint that the user can come back later into
the form and resume from where they stopped, I used the browser's `localStorage` to persist the state
of the application. Using local storage has the advantage of delegating this responsibility to the client (browser)
and we do not need to store temporary session data in our infrastructure.

## Getting started
To run this project you have to:
- `composer install` - Install all package dependencies using composer
- `cp .env.example .env` - Copy the `.env.example` to `.env` and configure the `DB_*` related variables.
I left the `PAYMENT_DATA_URL` already filled in the .env.example to ease the configuration.
- Laravel supports both MySQL and SQLite; I suggest running with MySQL.
- `php artisan migrate` - After configuring the database connection, run the migrations to create all tables. **Don't forget to create the database on the DBMS first (or the file, if using SQLite).**
- `php artisan serve` - Finally, run this command to start a local development server;
- Just access `http://127.0.0.1:8000` and you should see the first step of the registration form.

## Database schema
To keep it simple, we just have 3 tables for this system to run:
![Tables Diagram](https://gabrielf.com/img/wunder_tables.png)
Note that we have the `customers` table, which is responsible for persisting the customer data, and two other
auxiliary tables:
- `migrations` - serves as a checkpoint of database changes. Every time we need to change something in the database, we just create a migration and run `php artisan migrate`; it applies the proper database modifications and records in this table which migration ran last
- `failed_jobs` - serves as a repository for failed jobs. For example, if the payment API goes through a downtime,
all the jobs will start to fail with a timeout exception and be logged into this `failed_jobs` table. Then, when the API is back up, we can just run `php artisan queue:retry all` to reprocess all the payment data that failed during the downtime window

## Q&A
- What more can you optimize for performance on your project?
- I already prepared the architecture to use processing queues to handle the API calls;
for that we only need to change the `QUEUE_CONNECTION` environment variable in the `.env` file to something
like Beanstalkd, RabbitMQ, Redis, or even Amazon SQS (all of them have drivers available).
Then, with the queue system of your choice, the only thing we need
to do is bring up a worker by running `php artisan queue:work --queue=default`
([more on Laravel Queues here](https://laravel.com/docs/5.8/queues));
- Another thing that can improve performance is to avoid
[polling every 5 seconds until the Payment API call finishes in the background](https://github.com/gabfr/wunderfleet-task/blob/master/resources/js/store/actions.js#L12)
and to start working with web sockets and broadcasting; this way we have a connection point to push the customer
update straight to the client's browser.
- What can be better in your project?
- I chose to develop this solution with simplicity in mind. Using only one table to store the data of various entities
is not the optimal solution in terms of data modeling, but I opted for it to keep this project simple to evaluate and run.
In the future, normalizing the customer's address and bank information into separate tables
could ease maintenance of this database.
- Why did you use a string primary key in your customer table, instead of the default auto-increment?
- As we are working with an exposed API endpoint, using `auto_increment` would make it easy to guess ids and fetch other users' data. So, to make that harder, we use highly random string keys ([`uniqid`](http://php.net/uniqid) - and `md5` the generated id).