{"id":21531919,"url":"https://github.com/matrixai/js-rpc","last_synced_at":"2025-07-22T10:37:18.582Z","repository":{"id":178750550,"uuid":"662083772","full_name":"MatrixAI/js-rpc","owner":"MatrixAI","description":"Stream-based JSON RPC for JavaScript/TypeScript Applications","archived":false,"fork":false,"pushed_at":"2025-05-29T05:38:10.000Z","size":1544,"stargazers_count":4,"open_issues_count":6,"forks_count":0,"subscribers_count":5,"default_branch":"staging","last_synced_at":"2025-07-13T21:45:32.780Z","etag":null,"topics":["json-rpc","json-streaming","rpc"],"latest_commit_sha":null,"homepage":"https://polykey.com","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/MatrixAI.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2023-07-04T10:14:48.000Z","updated_at":"2025-05-29T05:32:28.000Z","dependencies_parsed_at":null,"dependency_job_id":"5a72d886-53d3-4685-82af-0f48ced3dda4","html_url":"https://github.com/MatrixAI/js-rpc","commit_stats":null,"previous_names":["matrixai/js-rpc"],"tags_count":39,"template":false,"template_full_name":null,"purl":"pkg:github/MatrixAI/js-rpc","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MatrixAI%2Fjs-rpc","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MatrixAI%2Fjs-rpc/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MatrixAI%2Fjs-rpc/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MatrixAI%2Fjs-rpc/manifests","owner_url":"https://repos.ec
osyste.ms/api/v1/hosts/GitHub/owners/MatrixAI","download_url":"https://codeload.github.com/MatrixAI/js-rpc/tar.gz/refs/heads/staging","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MatrixAI%2Fjs-rpc/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":266477152,"owners_count":23935390,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-22T02:00:09.085Z","response_time":66,"last_error":null,"robots_txt_status":null,"robots_txt_updated_at":null,"robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["json-rpc","json-streaming","rpc"],"created_at":"2024-11-24T02:18:11.673Z","updated_at":"2025-07-22T10:37:18.569Z","avatar_url":"https://github.com/MatrixAI.png","language":"TypeScript","readme":"# js-rpc\n\n## Installation\n\n```sh\nnpm install --save @matrixai/rpc\n```\n\n## Usage\n\n### Basic Usage\n\nBecause decorators are experimental, you must enable\n`\"experimentalDecorators\": true` in your `tsconfig.json` to use this library.\n\nFirst, set up an `RPCStream` to use:\n\n```ts\nconst rpcStream = {\n  readable: new ReadableStream(),\n  writable: new WritableStream(),\n  cancel: () =\u003e {},\n};\n```\n\n#### Server\n\n```ts\nimport type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';\nimport { RPCServer, UnaryHandler } from '@matrixai/rpc';\n\n// Create a Handler\nclass SquaredNumberUnary extends UnaryHandler\u003c\n  ContainerType,\n  JSONRPCParams\u003c{ value: number }\u003e,\n  JSONRPCResult\u003c{ value: number 
}\u003e\n\u003e {\n  public handle = async (\n    input: JSONRPCParams\u003c{ value: number }\u003e,\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): Promise\u003cJSONRPCResult\u003c{ value: number }\u003e\u003e =\u003e {\n    return { value: input.value ** 2 };\n  };\n}\n\nconst rpcServer = new RPCServer();\n\nawait rpcServer.start({\n  manifest: {\n    squaredNumber: new SquaredNumberUnary(),\n    // ... add more handlers here\n  },\n});\n\nrpcServer.handleStream(rpcStream);\n```\n\n#### Client\n\n```ts\nimport type { HandlerTypes } from '@matrixai/rpc';\nimport { RPCClient, UnaryCaller } from '@matrixai/rpc';\n\n// Get the CallerTypes of the handler\ntype CallerTypes = HandlerTypes\u003cSquaredNumberUnary\u003e;\nconst squaredNumber = new UnaryCaller\u003c\n  CallerTypes['input'],\n  CallerTypes['output']\n\u003e();\n\nconst rpcClient = new RPCClient({\n  manifest: {\n    squaredNumber,\n    // ... add more here\n  },\n  streamFactory: async () =\u003e rpcStream,\n});\n\nawait rpcClient.methods.squaredNumber({ value: 2 });\n// returns { value: 4 }\n```\n\nAny of the callers or handlers from the `Call Types` section below can be\nadded here.\n\n### Call Types\n\n#### Unary\n\nIn Unary calls, the client sends a single request to the server and receives a\nsingle response back, much like a regular async function call.\n\n##### Handler\n\n```ts\nimport type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';\nimport { UnaryHandler } from '@matrixai/rpc';\nclass SquaredNumberUnary extends UnaryHandler\u003c\n  ContainerType,\n  JSONRPCParams\u003c{ value: number }\u003e,\n  JSONRPCResult\u003c{ value: number 
}\u003e\u003e =\u003e {\n    return { value: input.value ** 2 };\n  };\n}\n```\n\n##### Caller\n\n```ts\nimport type { HandlerTypes } from '@matrixai/rpc';\nimport { UnaryCaller } from '@matrixai/rpc';\ntype CallerTypes = HandlerTypes\u003cSquaredNumberUnary\u003e;\nconst squaredNumber = new UnaryCaller\u003c\n  CallerTypes['input'],\n  CallerTypes['output']\n\u003e();\n```\n\n##### Call-Site\n\nThe client initiates a unary RPC call by invoking a method that returns a\npromise. It passes the required input parameters as arguments to the method. The\nclient then waits for the promise to resolve, receiving the output.\n\n```ts\nawait rpcClient.methods.squaredNumber({ value: 3 });\n// returns { value: 9 }\n```\n\n#### Client Streaming\n\nIn Client Streaming calls, the client can write multiple messages to a single\nstream, while the server reads from that stream and then returns a single\nresponse. This pattern is useful when the client needs to send a sequence of\ndata to the server, after which the server processes the data and replies with a\nsingle result, making it a good fit for scenarios like file uploads.\n\n##### Handler\n\nOn the server side, the handle function is an asynchronous function that takes\nan AsyncIterableIterator as input, representing the stream of incoming messages\nfrom the client. 
It returns a promise that resolves to the output that will be\nsent back to the client.\n\n```ts\nimport type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';\nimport { ClientHandler } from '@matrixai/rpc';\nclass AccumulateClient extends ClientHandler\u003c\n  ContainerType,\n  JSONRPCParams\u003c{ value: number }\u003e,\n  JSONRPCResult\u003c{ value: number }\u003e\n\u003e {\n  public handle = async (\n    input: AsyncIterableIterator\u003cJSONRPCParams\u003c{ value: number }\u003e\u003e,\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): Promise\u003cJSONRPCResult\u003c{ value: number }\u003e\u003e =\u003e {\n    let acc = 0;\n    for await (const number of input) {\n      acc += number.value;\n    }\n    return { value: acc };\n  };\n}\n```\n\n##### Caller\n\n```ts\nimport type { HandlerTypes } from '@matrixai/rpc';\nimport { ClientCaller } from '@matrixai/rpc';\ntype CallerTypes = HandlerTypes\u003cAccumulateClient\u003e;\nconst accumulate = new ClientCaller\u003c\n  CallerTypes['input'],\n  CallerTypes['output']\n\u003e();\n```\n\n##### Call-Site\n\nThe client initiates a client streaming RPC call using a method that returns a\nwritable stream and a promise. The client writes to the writable stream and\nawaits the output promise to get the response.\n\n```ts\nconst { output, writable } = await rpcClient.methods.accumulate();\nconst writer = writable.getWriter();\nawait writer.write({ value: 1 });\nawait writer.write({ value: 2 });\nawait writer.write({ value: 3 });\nawait writer.write({ value: 4 });\nawait writer.close();\nawait output;\n// output resolves to { value: 10 }\n```\n\n#### Server Streaming\n\nIn Server Streaming calls, the client sends a single request and receives\nmultiple responses in a read-only stream from the server. The server can keep\npushing messages as long as it needs, allowing real-time updates from the server\nto the client. 
This is useful for things like monitoring, where the server needs\nto update the client in real-time based on events or data changes. In this\nexample, the client sends a number and the server responds with a stream of\nthe next five numbers counting up from it.\n\n##### Handler\n\nOn the server side, the handle function is an asynchronous generator function\nthat takes a single input parameter from the client. It yields multiple messages\nthat will be sent back to the client through the readable stream.\n\n```ts\nimport type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';\nimport { ServerHandler } from '@matrixai/rpc';\nclass CountServer extends ServerHandler\u003c\n  ContainerType,\n  JSONRPCParams\u003c{ value: number }\u003e,\n  JSONRPCResult\u003c{ value: number }\u003e\n\u003e {\n  public handle = async function* (\n    input: JSONRPCParams\u003c{ value: number }\u003e,\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): AsyncIterableIterator\u003cJSONRPCResult\u003c{ value: number }\u003e\u003e {\n    for (let i = input.value; i \u003c input.value + 5; i++) {\n      yield { value: i };\n    }\n  };\n}\n```\n\n##### Caller\n\n```ts\nimport type { HandlerTypes } from '@matrixai/rpc';\nimport { ServerCaller } from '@matrixai/rpc';\ntype CallerTypes = HandlerTypes\u003cCountServer\u003e;\nconst count = new ServerCaller\u003cCallerTypes['input'], CallerTypes['output']\u003e();\n```\n\n##### Call-Site\n\nThe client initiates a server streaming RPC call using a method that takes input\nparameters and returns a readable stream. 
The client writes a single message and\nthen reads multiple messages from the readable stream.\n\n```ts\nconst callerInterface = await rpcClient.methods.count({ value: 5 });\nconst reader = callerInterface.getReader();\nconst numbers = [];\nwhile (true) {\n  const { value, done } = await reader.read();\n  if (done) break;\n  numbers.push(value.value);\n}\n// numbers is [5, 6, 7, 8, 9]\n```\n\n#### Duplex Stream\n\nA Duplex Stream enables both the client and the server to read and write\nmessages in their respective streams independently of each other. Both parties\ncan read and write multiple messages in any order. It's useful in scenarios that\nrequire ongoing communication in both directions, like chat applications.\n\n##### Handler\n\n```ts\nimport type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';\nimport { DuplexHandler } from '@matrixai/rpc';\nclass EchoDuplex extends DuplexHandler\u003c\n  ContainerType,\n  JSONRPCParams,\n  JSONRPCResult\n\u003e {\n  public handle = async function* (\n    input: AsyncIterableIterator\u003cJSONRPCParams\u003e, // Async iterable of incoming messages.\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): AsyncIterableIterator\u003cJSONRPCResult\u003e {\n    for await (const incomingData of input) {\n      yield incomingData;\n    }\n  };\n}\n```\n\n##### Caller\n\n```ts\nimport type { HandlerTypes } from '@matrixai/rpc';\nimport { DuplexCaller } from '@matrixai/rpc';\ntype CallerTypes = HandlerTypes\u003cEchoDuplex\u003e;\nconst echo = new DuplexCaller\u003cCallerTypes['input'], CallerTypes['output']\u003e();\n```\n\n##### Call-Site\n\nThe client initiates a duplex streaming RPC call using a method that returns\nboth a readable and a writable stream. 
The client can read from the readable\nstream and write to the writable stream.\n\n```ts\n// Initialize the duplex call\nconst { readable, writable } = await rpcClient.methods.echo();\n\n// Get the reader and writer from the streams\nconst reader = readable.getReader();\nconst writer = writable.getWriter();\n\n// Write data to the server\nconst inputData = { someKey: 'someValue' };\nawait writer.write(inputData);\n\nconst readResult = await reader.read();\n\n// readResult.value is { someKey: 'someValue' }\n```\n\n#### Raw Streams\n\nRaw Streams are designed for low-level handling of RPC calls, enabling granular\ncontrol over data streaming. Unlike other patterns, Raw Streams allow both the\nserver and client to work directly with raw data, providing a more flexible yet\ncomplex way to handle communications. This is especially useful when the RPC\nprotocol itself needs customization or when handling different types of data\nstreams within the same connection.\n\n##### Handler\n\n```ts\nimport type { JSONRPCRequest, JSONValue } from '@matrixai/rpc';\nimport { RawHandler } from '@matrixai/rpc';\nclass FactorialRaw extends RawHandler\u003cContainerType\u003e {\n  public handle = async (\n    [request, inputStream]: [JSONRPCRequest, ReadableStream\u003cUint8Array\u003e],\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): Promise\u003c[JSONValue, ReadableStream\u003cUint8Array\u003e]\u003e =\u003e {\n    const { readable, writable } = new TransformStream\u003c\n      Uint8Array,\n      Uint8Array\n    \u003e();\n    (async () =\u003e {\n      function factorialOf(n: number): number {\n        return n === 0 ? 
1 : n * factorialOf(n - 1);\n      }\n\n      const reader = inputStream.getReader();\n      const writer = writable.getWriter();\n      while (true) {\n        const { done, value } = await reader.read();\n        if (done) {\n          break;\n        }\n\n        const num = parseInt(new TextDecoder().decode(value), 10);\n        const factorial = factorialOf(num).toString();\n        const outputBuffer = new TextEncoder().encode(factorial);\n\n        await writer.write(outputBuffer);\n      }\n      await writer.close();\n    })();\n\n    return [\n      'Starting factorial computation',\n      readable as ReadableStream\u003cUint8Array\u003e,\n    ];\n  };\n}\n```\n\n##### Caller\n\n```ts\nimport { RawCaller } from '@matrixai/rpc';\nconst factorial = new RawCaller();\n```\n\n##### Call-Site\n\n```ts\nconst { readable, writable, meta } = await rpcClient.methods.factorial();\n\nconsole.log('Meta:', meta); // Meta information, here 'Starting factorial computation'\n\n// Create a writer for the writable stream\nconst writer = writable.getWriter();\n\n// Send numbers 4, 5, 6, 8 to the server for factorial computation\nfor (const num of [4, 5, 6, 8]) {\n  const buffer = new TextEncoder().encode(num.toString());\n  await writer.write(buffer);\n}\nawait writer.close();\n\n// Create a reader for the readable stream\nconst reader = readable.getReader();\n\n// Read the computed factorials from the server\nwhile (true) {\n  const { done, value } = await reader.read();\n  if (done) {\n    console.log('Done reading from stream.');\n    break;\n  }\n  const factorialResult = new TextDecoder().decode(value).trim(); // Trim any extra whitespace\n  console.log(`The factorial is: ${factorialResult}`);\n}\n```\n\n## Specifications\n\n### Timeouts\n\nWhenever the time between the initial message and the following subsequent\nmessage of an RPC call exceeds a defined timeout, the RPC call is considered\nto have timed out.\n\nFor Unary calls, this is 
similar to the timeout of a response after sending a\nrequest.\n\nIf the client were to time out, the stream is forcibly closed and\n`ErrorRPCTimedOut` is thrown from the call.\n\nIf the server were to time out, it is advisory, meaning that the server may\nchoose to eagerly throw `ErrorRPCTimedOut`, or continue processing as\nnormal.\n\nAfter the client receives the subsequent message from the server, the timeout\ntimer is cancelled.\n\nLikewise on the server, the timeout timer is cancelled after the first message\nis sent to the client.\n\nThis means that the timeout for Streaming calls acts as a Proof of Life, and\nafter it is established, the timeout no longer applies. This allows for\nlong-running Streaming calls.\n\nNote that when supplying a `Timer` instance to the call-site in `RPCClient`, the\ntimeout timer will not be cancelled. This is because the library is expected not\nto mutate the passed-in `Timer`, and the user can expect that receiving a\nmessage means the timer no longer matters.\n\n#### Throwing Timeouts Server-Side\n\nBy default, a timeout will not cause an RPC call to automatically throw; this\nmust be done manually by the handler when it receives the abort signal from\n`ctx.signal`. An example of this is like so:\n\n```ts\nclass TestMethod extends UnaryHandler {\n  public handle = async (\n    input: JSONValue,\n    cancel: (reason?: any) =\u003e void,\n    meta: Record\u003cstring, JSONValue\u003e | undefined,\n    ctx: ContextTimed,\n  ): Promise\u003cJSONValue\u003e =\u003e {\n    const abortProm = utils.promise\u003cnever\u003e();\n    ctx.signal.addEventListener('abort', () =\u003e {\n      abortProm.resolveP(ctx.signal.reason);\n    });\n    throw await abortProm.p;\n  };\n}\n```\n\n#### Priority of Timeout Options\n\nA `timeoutTime` can be passed both to the constructors of `RPCServer` and\n`RPCClient`. 
This is the default `timeoutTime` for all callers/handlers.\n\nIn the case of `RPCServer`, a `timeout` can be specified when extending any\n`Handler` class. This will override the default `timeoutTime` set on `RPCServer`\nfor that handler only.\n\n```ts\nclass TestMethodArbitraryTimeout extends UnaryHandler {\n  public timeout = 100;\n  public handle = async (\n    input: JSONValue,\n    _cancel,\n    _meta,\n    _ctx,\n  ): Promise\u003cJSONValue\u003e =\u003e {\n    return input;\n  };\n}\n```\n\nIn the case of `RPCClient`, a `ctx` with the property `timer` can be supplied\nwith a `Timer` instance or `number` when making an RPC call. This will\noverride the default `timeoutTime` set on `RPCClient` for that call only.\n\n```ts\nawait rpcClient.methods.testMethod({}, { timer: 100 });\nawait rpcClient.methods.testMethod({}, { timer: new Timer(undefined, 100) });\n```\n\nHowever, it's important to note that any of these timeouts may ultimately be\noverridden by the shortest timeout of the server and client combined using the\ntimeout middleware below.\n\n#### Timeout Middleware\n\nThe `timeoutMiddleware` sets an RPCServer's timeout based on the lowest timeout\nbetween the Client and the Server. This is so that handlers can eagerly time out\nand stop processing as soon as it is known that the client has timed out.\n\nThis case can be seen in the first diagram, where the server is able to stop the\nprocessing of the handler, and close the associated stream of the RPC call based\non the shorter timeout sent by the client:\n\n![RPCServer sets timeout based on RPCClient](images/timeoutMiddlewareClientTimeout.svg)\n\nWhere the `RPCClient` sends a timeout that is longer than that set on the\n`RPCServer`, it will be rejected. 
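\n\nTaken together, the precedence rules above can be sketched as a pure function.\nThis is a hypothetical illustration only; `effectiveTimeout` is not part of\n`@matrixai/rpc`:\n\n```ts\n// Hypothetical sketch of the timeout precedence described in this section.\nfunction effectiveTimeout(\n  serverDefault: number, // `timeoutTime` passed to the RPCServer constructor\n  handlerTimeout?: number, // optional `timeout` set on a Handler subclass\n  clientTimeout?: number, // `timer` from the call's `ctx`, or the RPCClient default\n): number {\n  // A handler-level timeout overrides the server default.\n  const serverSide = handlerTimeout ?? serverDefault;\n  // timeoutMiddleware: the lower of the client's and server's timeouts wins.\n  return clientTimeout === undefined\n    ? serverSide\n    : Math.min(serverSide, clientTimeout);\n}\n```\n\n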
This is because the timeout of the client\nshould never exceed that of the server, so that the server's timeout acts as\nan absolute limit.\n\n![RPCServer rejects longer timeout sent by RPCClient](images/timeoutMiddlewareServerTimeout.svg)\n\nThe `timeoutMiddleware` is enabled by default, and uses the `.metadata.timeout`\nproperty on a JSON-RPC request object for the client to send its timeout.\n\n## Development\n\nRun `nix-shell`, and once you're inside, you can use:\n\n```sh\n# install (or reinstall packages from package.json)\nnpm install\n# build the dist\nnpm run build\n# run the repl (this allows you to import from ./src)\nnpm run tsx\n# run the tests\nnpm run test\n# lint the source code\nnpm run lint\n# automatically fix the source\nnpm run lintfix\n```\n\n### Docs Generation\n\n```sh\nnpm run docs\n```\n\nSee the docs at: https://matrixai.github.io/js-rpc/\n\n### Publishing\n\nPublishing is handled automatically by the staging pipeline.\n\nPrerelease:\n\n```sh\n# npm login\nnpm version prepatch --preid alpha # premajor/preminor/prepatch\ngit push --follow-tags\n```\n\nRelease:\n\n```sh\n# npm login\nnpm version patch # major/minor/patch\ngit push --follow-tags\n```\n\nManually:\n\n```sh\n# npm login\nnpm version patch # major/minor/patch\nnpm run build\nnpm publish --access public\ngit push\ngit push --tags\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmatrixai%2Fjs-rpc","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmatrixai%2Fjs-rpc","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmatrixai%2Fjs-rpc/lists"}