Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/zeit/micro

Asynchronous HTTP microservices

Topics: async, await, micro, microservice, vercel

Last synced: 6 days ago
- Host: GitHub
- URL: https://github.com/zeit/micro
- Owner: vercel
- License: MIT
- Created: 2016-01-23T05:17:00.000Z (almost 9 years ago)
- Default Branch: main
- Last Pushed: 2024-06-19T16:57:01.000Z (6 months ago)
- Last Synced: 2024-11-21T09:20:32.322Z (about 1 month ago)
- Topics: async, await, micro, microservice, vercel
- Language: TypeScript
- Homepage:
- Size: 1020 KB
- Stars: 10,586
- Watchers: 177
- Forks: 459
- Open Issues: 10
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-list-microservice - micro
- awesome-nodejs - micro - Asynchronous HTTP microservices - ★ 7611 (Web frameworks)
- awesome - micro - Asynchronous HTTP microservices (JavaScript)
- awesome-npm - micro - Microservice framework featuring asynchronous programming (1. Backend development / 1.2 Frameworks)
README
# Micro — Asynchronous HTTP microservices
## Features
- **Easy**: Designed for usage with `async` and `await`
- **Fast**: Ultra-high performance (even JSON parsing is opt-in)
- **Micro**: The whole project is ~260 lines of code
- **Agile**: Super easy deployment and containerization
- **Simple**: Oriented for single purpose modules (function)
- **Standard**: Just HTTP!
- **Explicit**: No middleware - modules declare all [dependencies](https://github.com/amio/awesome-micro)
- **Lightweight**: With all dependencies, the package weighs less than a megabyte

**Disclaimer:** Micro was created for use within containers and is not intended for use in serverless environments. For those using Vercel, this means that there is no requirement to use Micro in your projects, as the benefits it provides are not applicable to the platform. Utility features provided by Micro, such as `json`, are readily available in the form of [Serverless Function helpers](https://vercel.com/docs/runtimes#official-runtimes/node-js/node-js-request-and-response-objects).
## Installation
**Important:** Micro is only meant to be used in production. In development, you should use [micro-dev](https://github.com/vercel/micro-dev), which provides you with a tool belt specifically tailored for developing microservices.
To prepare your microservice for running in the production environment, first install `micro`:
```bash
npm install --save micro
```

## Usage
Create an `index.js` file and export a function that accepts the standard [http.IncomingMessage](https://nodejs.org/api/http.html#http_class_http_incomingmessage) and [http.ServerResponse](https://nodejs.org/api/http.html#http_class_http_serverresponse) objects:
```js
module.exports = (req, res) => {
  res.end('Welcome to Micro');
};
```

Micro provides [useful helpers](https://github.com/vercel/micro#body-parsing) but also handles return values, so you can write it even shorter!
```js
module.exports = () => 'Welcome to Micro';
```

Next, ensure that the `main` property inside `package.json` points to your microservice (which is inside `index.js` in this example case) and add a `start` script:
```json
{
  "main": "index.js",
  "scripts": {
    "start": "micro"
  }
}
```

Once all of that is done, the server can be started like this:
```bash
npm start
```

Then open `http://localhost:3000` in your browser - 🎉
### Command line
```
micro - Asynchronous HTTP microservices

USAGE

    $ micro --help
    $ micro --version
    $ micro [-l listen_uri [-l ...]] [entry_point.js]

    By default micro will listen on 0.0.0.0:3000 and will look first
    for the "main" property in package.json and subsequently for index.js
    as the default entry_point.

    Specifying a single --listen argument will overwrite the default, not supplement it.

OPTIONS

    --help              shows this help message
    -v, --version       displays the current version of micro
    -l, --listen listen_uri
                        specify a URI endpoint on which to listen (see below) -
                        more than one may be specified to listen in multiple places

ENDPOINTS

    Listen endpoints (specified by the --listen or -l options above) instruct micro
    to listen on one or more interfaces/ports, UNIX domain sockets, or Windows named pipes.

    For TCP (traditional host/port) endpoints:

        $ micro -l tcp://hostname:1234

    For UNIX domain socket endpoints:

        $ micro -l unix:/path/to/socket.sock

    For Windows named pipe endpoints:

        $ micro -l pipe:\\.\pipe\PipeName
```

### `async` & `await`
Examples
Micro is built for usage with async/await.
```js
const sleep = require('then-sleep');
module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
};
```
### Port Based on Environment Variable
When you want to set the port using an environment variable you can use:
```bash
micro -l tcp://0.0.0.0:$PORT
```
Optionally you can add a default if it suits your use case:
```bash
micro -l tcp://0.0.0.0:${PORT-3000}
```
`${PORT-3000}` falls back to port `3000` when `$PORT` is not defined. Note that this parameter expansion is performed by the shell (Bash and other POSIX-compatible shells), not by Micro itself.
### Body parsing
Examples
For parsing the incoming request body, Micro includes the async functions `buffer`, `text` and `json`:
```js
const { buffer, text, json } = require('micro');
module.exports = async (req, res) => {
  const buf = await buffer(req);
  console.log(buf);
  // <Buffer ...>
  const txt = await text(req);
  console.log(txt);
  // '{"price": 9.99}'
  const js = await json(req);
  console.log(js.price);
  // 9.99
  return '';
};
```
### API
##### `buffer(req, { limit = '1mb', encoding = 'utf8' })`
##### `text(req, { limit = '1mb', encoding = 'utf8' })`
##### `json(req, { limit = '1mb', encoding = 'utf8' })`
- Buffers and parses the incoming body and returns it.
- Exposes an `async` function that can be run with `await`.
- Can be called multiple times, as it caches the raw request body the first time.
- `limit` is the maximum amount of data that will be aggregated before parsing. If it is exceeded, an `Error` is thrown with `statusCode` set to `413` (see [Error Handling](#error-handling)). It can be a `Number` of bytes or [a string](https://www.npmjs.com/package/bytes) like `'1mb'`.
- If JSON parsing fails, an `Error` is thrown with `statusCode` set to `400` (see [Error Handling](#error-handling))
For other types of data, check the [examples](#body-parsing-examples).
### Sending a different status code
So far we have used `return` to send data to the client. `return 'Hello World'` is the equivalent of `send(res, 200, 'Hello World')`.
```js
const { send } = require('micro');
module.exports = async (req, res) => {
  const statusCode = 400;
  const data = { error: 'Custom error message' };
  send(res, statusCode, data);
};
```
##### `send(res, statusCode, data = null)`
- Use `require('micro').send`.
- `statusCode` is a `Number` with the HTTP status code, and must always be supplied.
- If `data` is supplied it is sent in the response. Different input types are processed appropriately, and `Content-Type` and `Content-Length` are automatically set.
- `Stream`: `data` is piped as an `octet-stream`. Note: it is _your_ responsibility to handle the `error` event in this case (usually, simply logging the error and aborting the response is enough).
- `Buffer`: `data` is written as an `octet-stream`.
- `object`: `data` is serialized as JSON.
- `string`: `data` is written as-is.
- If JSON serialization fails (for example, if a cyclical reference is found), a `400` error is thrown. See [Error Handling](#error-handling).
### Programmatic use
You can use Micro programmatically by requiring Micro directly:
```js
const http = require('http');
const sleep = require('then-sleep');
const { serve } = require('micro');
const server = new http.Server(
  serve(async (req, res) => {
    await sleep(500);
    return 'Hello world';
  }),
);
server.listen(3000);
```
##### `serve(fn)`
- Use `require('micro').serve`.
- Returns a function with the `(req, res) => void` signature that uses the provided function as the request handler.
- The supplied function is run with `await`, so it can be `async`.
##### `sendError(req, res, error)`
- Use `require('micro').sendError`.
- Used as the default handler for errors thrown.
- Automatically sets the status code of the response based on `error.statusCode`.
- Sends the `error.message` as the body.
- Stacks are printed out with `console.error` and during development (when `NODE_ENV` is set to `'development'`) also sent in responses.
- Usually, you don't need to invoke this method yourself, as you can use the [built-in error handling](#error-handling) flow with `throw`.
##### `createError(code, msg, orig)`
- Use `require('micro').createError`.
- Creates an error object with a `statusCode`.
- Useful for easily throwing errors with HTTP status codes, which are interpreted by the [built-in error handling](#error-handling).
- `orig` sets `error.originalError` which identifies the original error (if any).
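The shape of the error object `createError` is described to produce can be sketched as a plain `Error` with extra properties (illustrative only; micro's actual implementation may differ):

```javascript
// Illustrative sketch: a plain Error decorated with statusCode and
// originalError, matching the shape described above.
function createError(code, message, original) {
  const err = new Error(message);
  err.statusCode = code;
  err.originalError = original;
  return err;
}

const err = createError(429, 'Rate limit exceeded');
console.log(err.statusCode); // 429
```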
## Error Handling
Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.
If an error is thrown and not caught by you, the response will automatically be `500`. **Important:** Error stacks will be printed with `console.error` and, in development mode (when the env variable `NODE_ENV` is `'development'`), they will also be included in the responses.
If the `Error` object that's thrown contains a `statusCode` property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:
```js
const rateLimit = require('my-rate-limit');
module.exports = async (req, res) => {
  await rateLimit(req);
  // ... your code
};
```
If the API endpoint is abused, it can throw an error with `createError` like so:
```js
if (tooMany) {
  throw createError(429, 'Rate limit exceeded');
}
```
Alternatively, you can create the `Error` object yourself:
```js
if (tooMany) {
  const err = new Error('Rate limit exceeded');
  err.statusCode = 429;
  throw err;
}
```
The nice thing about this model is that the `statusCode` is merely a suggestion. The user can override it:
```js
try {
  await rateLimit(req);
} catch (err) {
  if (err.statusCode === 429) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}
```
If the error is based on another error that **Micro** caught, like a `JSON.parse` exception, then `originalError` will point to it. If a generic error is caught, the status will be set to `500`.
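The same wrapping pattern can be used in your own code to keep the underlying exception reachable via `originalError`. This is an illustrative sketch; `parseJsonBody` is a hypothetical helper, not part of micro's API:

```javascript
// Hypothetical helper: wrap a JSON.parse failure in a 400 error
// while keeping the original exception on originalError.
function parseJsonBody(raw) {
  try {
    return JSON.parse(raw);
  } catch (parseErr) {
    const err = new Error('Invalid JSON');
    err.statusCode = 400;
    err.originalError = parseErr;
    throw err;
  }
}
```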
In order to set up your own error handling mechanism, you can use composition in your handler:
```js
const { send } = require('micro');
const handleErrors = (fn) => async (req, res) => {
  try {
    return await fn(req, res);
  } catch (err) {
    console.log(err.stack);
    send(res, 500, 'My custom error!');
  }
};
module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?');
});
```
## Testing
Micro makes tests compact and a pleasure to read and write.
We recommend [Node TAP](https://node-tap.org/) or [AVA](https://github.com/avajs/ava), a highly parallel test framework with built-in support for async tests:
```js
const http = require('http');
const { send, serve } = require('micro');
const test = require('ava');
const listen = require('test-listen');
const fetch = require('node-fetch');
test('my endpoint', async (t) => {
  const service = new http.Server(
    serve(async (req, res) => {
      send(res, 200, {
        test: 'woot',
      });
    }),
  );
  const url = await listen(service);
  const response = await fetch(url);
  const body = await response.json();
  t.deepEqual(body.test, 'woot');
  service.close();
});
```
Look at [test-listen](https://github.com/vercel/test-listen) for a
function that returns a URL with an ephemeral port every time it's called.
## Contributing
1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device
2. Link the package to the global module directory: `npm link`
3. Within the module in which you want to test your local development instance of Micro, link it to the dependencies: `npm link micro`. Node will now use your clone of Micro instead of the default one from npm!
You can run the tests using: `npm test`.
## Credits
Thanks to Tom Yandell and Richard Hodgson for donating the name "micro" on [npm](https://www.npmjs.com)!
## Authors
- Guillermo Rauch ([@rauchg](https://x.com/rauchg)) - [Vercel](https://vercel.com)
- Leo Lamprecht ([@leo](https://x.com/leo)) - [Vercel](https://vercel.com)
- Tim Neutkens ([@timneutkens](https://x.com/timneutkens)) - [Vercel](https://vercel.com)