{"id":15639989,"url":"https://github.com/azat-co/node-advanced","last_synced_at":"2025-04-30T07:25:34.844Z","repository":{"id":138309997,"uuid":"97672941","full_name":"azat-co/node-advanced","owner":"azat-co","description":"Node Advanced Courseware","archived":false,"fork":false,"pushed_at":"2019-03-17T18:49:00.000Z","size":9218,"stargazers_count":82,"open_issues_count":0,"forks_count":39,"subscribers_count":5,"default_branch":"master","last_synced_at":"2025-03-30T13:51:16.750Z","etag":null,"topics":["cluster","cpp","javascript","module","node","node-js","node-module","nodejs","npm","require","stream"],"latest_commit_sha":null,"homepage":"https://node.university/p/node-advanced","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/azat-co.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-07-19T04:30:50.000Z","updated_at":"2025-03-16T12:21:29.000Z","dependencies_parsed_at":null,"dependency_job_id":"9c1bd982-bf60-4068-9ed0-0e0ca902eac7","html_url":"https://github.com/azat-co/node-advanced","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/azat-co%2Fnode-advanced","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/azat-co%2Fnode-advanced/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/azat-co%2Fnode-advanced/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/azat-co%2Fnode-advanced/manifests","owner_url":"http
s://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/azat-co","download_url":"https://codeload.github.com/azat-co/node-advanced/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":251659600,"owners_count":21623087,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cluster","cpp","javascript","module","node","node-js","node-module","nodejs","npm","require","stream"],"created_at":"2024-10-03T11:29:37.503Z","updated_at":"2025-04-30T07:25:34.822Z","avatar_url":"https://github.com/azat-co.png","language":"JavaScript","readme":"footer: © NodeProgram.com, Node.University and Azat Mardan 2018\nslidenumbers: true\ntheme: Simple, 1\nbuild-lists: true\nautoscale:true\n\n[.slidenumbers: false] \n[.hide-footer]\n\n![](images/Node-Advanced-x2-1.png)\n\n---\n\n# Node Advanced\n## Overview\n### Azat Mardan @azat_co\n\n\n![left](images/azat node interacitev no pipe.jpeg)\n\n![inline right](images/nu.png)\n\n---\n\n# Node Advanced\n\n\n* Videos: \u003chttp://node.university/p/node-advanced\u003e\n* Slides: in `*.md` in \u003chttps://github.com/azat-co/node-advanced\u003e\n* Code: in `code` in \u003chttps://github.com/azat-co/node-advanced\u003e\n\n---\n\n# Course Overview\n\n---\n\n## Course Overview\n\n* Table of Contents\n* What to expect\n* What you need\n\n---\n\n## Curriculum\n\n---\n\n## Curriculum\n\n1. Node Modules\n1. Node Event Loop and Async Programming\n1. Streaming\n1. Networking\n1. Debugging\n1. 
Scaling\n\n\n---\n\n## What to Expect\n\nFocus on:\n\n* Pure Node\n* Core Node modules\n* ES6-8\n\n---\n\n## What not to Expect\n\nDo not expect:\n\n* Much coverage of JavaScript fundamentals or old ES5\n* Much Linux, Unix, Windows or general computer fundamentals\n* Many fancy npm modules or frameworks \n\n---\n\n## Prerequisites\n\n* Node Foundation: \u003chttps://node.university/p/node-npm-and-mongodb-foundation\u003e\n* You Don't Know Node: \u003chttps://node.university/p/you-dont-know-node\u003e\n* Node Patterns: \u003chttps://node.university/p/node-patterns\u003e\n\n---\n\n\n## What You Need\n\n* Node version 8+: `node -v`\n* npm version 5+: `npm -v`\n* Google Chrome\n* Slides \u0026 code: \u003chttps://github.com/azat-co/node-advanced\u003e\n\n---\n\n## Mindset\n\n* Embrace errors\n* Increase curiosity\n* Experiment by iteration\n* Get comfortable reading the source code of Node.js, npm, and npm modules\n* Enjoy the process\n\n---\n\n## Reading Source Code\n\nYou learn how to use a module *and* how to be a better developer\n\n* \u003chttps://github.com/nodejs/node\u003e\n* \u003chttps://github.com/npm/npm\u003e\n* \u003chttps://github.com/expressjs/express\u003e\n\n---\n\n## Tips for Deeper (Advanced) Understanding\n\n* Learn to think like V8 (the JavaScript engine that powers Node): When in doubt, use `console.log` or the debugger to walk through execution \n* Read call stack error messages carefully. Learn and know common errors (address in use, cannot find module, undefined, etc.)\n* Upgrade your tools (no Notepad++, seriously)\n\n---\n\n## Tips for Deeper (Advanced) Understanding (Cont)\n\n* Memorize all the array, string and Node core methods - saves tons of time and keeps focus (can work offline too)\n* Read good books, take in-person classes from good instructors and watch good video courses\n* Build side-projects\n* Subscribe to Node Weekly to stay up-to-date\n* Teach\n\n---\n\n# Module 1: Modules\n\n---\n\n## Importing Modules with `require()`\n\n1. Resolving\n1. Loading\n1. 
Wrapping\n1. Evaluating\n1. Caching\n\n---\n\n## Modules Can Have Code\n\n`code/modules/module-1.js`:\n\n```js\nconsole.log(module) // console.log(global.module)\n```\n\n---\n\n```js\nModule {\n  id: '.',\n  exports: {},\n  parent: null,\n  filename: '/Users/azat/Documents/Code/node-advanced/code/module-1.js',\n  loaded: false,\n  children: [],\n  paths:\n   [ '/Users/azat/Documents/Code/node-advanced/code/node_modules',\n     '/Users/azat/Documents/Code/node-advanced/node_modules',\n     '/Users/azat/Documents/Code/node_modules',\n     '/Users/azat/Documents/node_modules',\n     '/Users/azat/node_modules',\n     '/Users/node_modules',\n     '/node_modules' ] }\n```\n\n---\n\n## `require()`\n\n* local paths take precedence (0 to N)\n* module can be a file or a folder with `index.js` (or any file specified in package.json main in that nested folder)\n* `loaded` is true when this file is imported/required by another\n* `id` is the path when this file is required by another\n* `parent` and `children` will be populated accordingly\n\n---\n\n## `require.resolve()` \n\nChecks whether the package exists/is installed, but does not execute it\n\n---\n\n## How `require()` Checks Files\n\n1. Try `name.js`\n1. Try `name.json`\n1. Try `name.node` (compiled addon example)\n1. 
Try `name` folder, i.e., `name/index.js`\n\n---\n\n## `require.extensions`\n\n```js\n{ '.js': [Function], '.json': [Function], '.node': [Function] }\n```\n\n```js\nfunction (module, filename) { // require.extensions['.js'].toString()\n  var content = fs.readFileSync(filename, 'utf8');\n  module._compile(internalModule.stripBOM(content), filename);\n  }\n\nfunction (module, filename) { // require.extensions['.json'].toString()\n    var content = fs.readFileSync(filename, 'utf8');\n    try {\n          module.exports = JSON.parse(internalModule.stripBOM(content));\n      } catch (err) {\n          err.message = filename + ': ' + err.message;\n        throw err;\n      }\n    }\n\nfunction (module, filename) { // \u003e require.extensions['.node'].toString()\n    return process.dlopen(module, path._makeLong(filename));\n}\n```\n\n---\n\n## Caching\n\nRunning require() twice will not print twice but just once:\n\n```\ncd code/modules \u0026\u0026 node\n\u003e require('./module-1.js')\n...\n\u003e require('./module-1.js')\n{}\n```\n\n(Or run `modules/main.js`)\n\n---\n\n\u003e A better way to execute code multiple times is to export it and then invoke\n\n---\n\n\n## Exporting Module\n\n\n---\n\n## Exporting Code\n\n```\nmodule.exports = () =\u003e {\n\n}\n```\n\n---\n\n## CSV to Node Object Converter Module\n\n```\ncode/modules/module-2.js\n```\n\n```js\nmodule.exports.parse = (csvString = '') =\u003e {\n  const lines = csvString.split('\\n')\n  let result = []\n  ...\n  return result\n}\n```\n\n---\n\n## CSV to Node Object Converter Main Program\n\n```\ncode/modules/main-2.js\n```\n\n```js\nconst csvConverter = require('./module-2.js').parse\n\nconst csvString = `id,first_name,last_name,email,gender,ip_address\n...\n10,Allin,Bernadot,abernadot9@latimes.com,Male,15.162.216.199`\n\nconsole.log(csvConverter(csvString))\n```\n\n---\n\n## Module Patterns\n\n* Export Function\n* Export Class\n* Export Function Factory \n* Export Object\n* Export Object with Methods\n\nMore 
on these patterns at [Node Patterns](https://node.university/p/node-patterns)\n\n---\n\n## Exporting Tricks and Gotchas\n\n```js\nmodule.exports.parse = () =\u003e {} // ok\nexports.parse = () =\u003e {} // ok\nglobal.module.exports.parse = () =\u003e {}  // not ok, use local module\n```\n\n---\n\n## Exporting Tricks and Gotchas (Cont)\n\n```js\nexports.parse = ()=\u003e{} // ok\nmodule.exports = {parse: ()=\u003e{} } // ok again \nexports = {parse: ()=\u003e{} } // not ok, creates a new variable\n```\n\n---\n\n## Module Wrapper Function\n\nKeeps local vars local\n\n`require('module').wrapper`\n\n```\nnode\n\u003e require('module').wrapper\n[ '(function (exports, require, module, __filename, __dirname) { ',\n  '\\n});' ]\n```\n\n---\n\n## Tricky Local Globals \n\n`exports` and `require` are specific to each module, not true global global, same with `__filename` and `__dirname`\n\n```js\nconsole.log(global.module === module) // false\nconsole.log(arguments)\n```\n\n---\n\n## What You Export === What You Use\n\n```js\nmodule.exports = { \n  parse: (csv) =\u003e {\n    //...\n  }\n}\n```\n\nImporting object, so use: \n\n```js\nconst parse = require('./name.js').parse\nconst {parse} = require('./name.js') // or\nparse(csv)\n```\n\n---\n\n## What You Export === What You Use (Cont)\n \n```js\nconst Parser = { \n  parse(csv) {\n    // ...\n  }\n}\nmodule.exports = Parser\n```\n\nAgain importing object, so use: \n\n```js\nconst parse = require('./name.js').parse\nconst {parse} = require('./name.js') // or\nparse(csv)\n```\n\n---\n\n## What You Export === What You Use (Cont)\n\n```js\nmodule.exports = () =\u003e { \n  return {\n    parse: (csv) =\u003e {}\n  }\n}\n```\n\nImporting function, not object, so use:\n\n```js\nconst {parse} = require('./name.js')()\nconst parse = require('./name.js')().parse\n```\n\n(`modules/main-3.js` and `modules/module-3.js`)\n\n---\n\n## What You Export === What You Use (Cont)\n\n```js\nclass Parser extends BaseClass {\n  parse(csv) {\n    // 
...\n  }\n}\nmodule.exports = Parser\n```\n\n```js\nconst Parser = require('./name.js')\nconst parser = new Parser()\nconst parse = parser.parse // or const {parse} = parser\n```\n\n---\n\n## `import` vs `import()` vs `require()`\n\n* [import](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import) is static and require is dynamic\n* *.mjs experimental https://nodejs.org/api/esm.html\n* import() method ([stage 3](https://github.com/tc39/proposal-dynamic-import))\n* No `require.extensions` or `require.cache` in import \n\n---\n\n## Node experimental ESM support\n\n```js\nimport fs from 'fs'\nimport('./button.js')\n```\n\nFor now, it's better to use Babel or just stick with `require`\n\n---\n\n## Caching\n\n`require.cache` has the cache\n\n---\n\n## Clear Cache\n\n`main-4.js` prints twice (unlike `main-1.js`):\n\n```js\nrequire('./module-4.js')\ndelete require.cache[require.resolve('./module-4.js')]\nrequire('./module-4.js')\n```\n\n\n---\n\n## Global\n\n```js\nvar limit = 1000 // local, not available outside\nconst height = 50 // local\nlet i = 10 // local\nconsole = () =\u003e {} // global, overwrites console outside\nglobal.Parser = {} // global, available in other files\nmax = 999 // global too\n```\n\n---\n\n## npm\n\n* registry\n* cli: folders, git, private registries (self hosted npm, Nexus, Artifactory)\n\n* yarn\n* pnpm\n\n---\n\n## npm Git\n\n```\nnpm i expressjs/express -E\n\n```\n\n```\nnpm i expressjs/express#4.14.0 -E\nnpm install https://github.com/indexzero/forever/tarball/v0.5.6\nnpm install git+ssh://git@github.com:npm/npm#semver:^5.0\nnpm install git+https://isaacs@github.com/npm/npm.git\n```\n\nWhen in doubt: `npm i --dry-run express`\n\n---\n\n## npm ls \n\n```\nnpm ls express\nnpm ls -g --depth=0\nnpm ll -g --depth=0\nnpm ls -g --depth=0 --json\n```\n\nnpm installs in ~/node_modules (if no local)\n\n\n---\n\n## Creating package.json For Lazy Programmers\n\n\n```\nnpm init -y\n```\n\n---\n\n## Setting Init 
Configs\n\nList: \n\n```\nnpm config ls\n```\n\n\n---\n\n## My npm Configs: cli, user, global\n\n```\n; cli configs\nscope = \"\"\nuser-agent = \"npm/4.2.0 node/v7.10.1 darwin x64\"\n\n; userconfig /Users/azat/.npmrc\ninit-author-name = \"Azat Mardan\"\ninit-author-url = \"http://azat.co/\"\ninit-license = \"MIT\"\ninit-version = \"1.0.1\"\npython = \"/usr/bin/python\"\n\n; node bin location = /Users/azat/.nvm/versions/node/v7.10.1/bin/node\n; cwd = /Users/azat/Documents/Code/node-advanced\n; HOME = /Users/azat\n; \"npm config ls -l\" to show all defaults.\n```\n\n---\n\n\n## Configs for npm init\n\n```\ninit-author-name = \"Azat Mardan\"\ninit-author-url = \"http://azat.co/\"\ninit-license = \"MIT\"\ninit-version = \"1.0.1\"\n```\n\n---\n\n## Setting up npm registry Config\n\n```\nnpm config set registry \"http://registry.npmjs.org/\"\n```\n\nor \n\nedit `~/.npmrc`, e.g., `/Users/azat/.npmrc`\n\n---\n\n## Setting up npm proxy\n\n```\nnpm config set https-proxy http://proxy.company.com:8080\nnpm config set proxy http://proxy_host:port\n```\n\nNote: The https-proxy doesn't have https as the protocol, but http.\n\n---\n\n## Dependency Options\n\n* `npm i express -S` (default in npm v5)\n* `npm i express -D`\n* `npm i express -O`\n* `npm i express -E`\n\n---\n\n## `npm update` and `npm outdated`\n\n* `\u003c` and `\u003c=`\n* `=`\n* `.x`\n* `~`\n* `^`\n* `\u003e` and `\u003e=`\n\n---\n\n## npm Tricks\n\n```\nnpm home express\nnpm repo express\nnpm docs express\n```\n\n---\n\n## npm Linking for Developing CLI Tools\n\n```\nnpm link \nnpm unlink\n```\n\n---\n\n# Module 2: Node Event Loop and Async Programming\n\n---\n\n## Event loop\n\n---\n\n## Two Categories of Tasks\n\n* CPU-bound\n* I/O-bound\n\n---\n\n## CPU Bound Tasks\n\nCPU-bound tasks examples:\n\n* Encryption\n* Password\n* Encoding\n* Compression\n* Calculations\n\n---\n\n## Input and Output Bound Tasks\n\nInput/Output examples:\n\n* Disk: write, read\n* Networking: request, response\n* Database: write, 
read\n\n---\n\n\u003e CPU-bound tasks are not the bottleneck in networking apps. The I/O tasks are the bottleneck because they take up more time typically.\n\n---\n\n## Dealing with Slow I/O\n\n* Synchronous \n* Forking (later module)\n* Threading (more servers, computers, VMs, containers)\n* Event loop (this module)\n\n---\n\n## Call Stack\n\n\u003e Uses push, pop functions on the FILO/LIFO/LCFS basis, i.e., functions removed from top (opposite of queue).\n\n^https://techterms.com/definition/filo\n\n---\n\n## Call Stack Illustration\n\n```js\nconst f3 = () =\u003e {\n  console.log('executing f3')\n  undefinedVariableError //  ERROR!\n}\nconst f2 = () =\u003e {\n  console.log('executing f2')\n  f3()\n}\nconst f1 = () =\u003e {\n  console.log('executing f1')\n  f2()\n}\n\nf1()\n```\n\n---\n\n## Call Stack as a Bucket\n\nStarts with Anonymous, then f1, f2, etc.\n\n```\nf3() // last in the bucket but first to go\nf2()\nf1()\nanonymous() // first in the bucket but last to go\n```\n\n---\n\n## Call Stack Error\n\n```\n\u003e f1()\nexecuting f1\nexecuting f2\nexecuting f3\nReferenceError: undefinedVariableError is not defined\n    at f3 (repl:3:1)\n    at f2 (repl:3:1)\n    at f1 (repl:3:1)\n    at repl:1:1\n    at ContextifyScript.Script.runInThisContext (vm.js:23:33)\n    at REPLServer.defaultEval (repl.js:339:29)\n    at bound (domain.js:280:14)\n    at REPLServer.runBound [as eval] (domain.js:293:12)\n    at REPLServer.onLine (repl.js:536:10)\n    at emitOne (events.js:101:20)\n```\n\n\n---\n\n\n## Event Queue\n\nFIFO to push to call stack\n\n---\n\n## Async Callback Messes Call Stack\n\n```js\nconst f3 = () =\u003e {\n  console.log('executing f3')\n  setTimeout(()=\u003e{\n    undefinedVariableError // STILL an ERROR but async in this case\n  }, 100)\n}\nconst f2 = () =\u003e {\n  console.log('executing f2')\n  f3()\n}\nconst f1 = () =\u003e {\n  console.log('executing f1')\n  f2()\n}\n\nf1()\n```\n\n---\n\n## Different Call Stack! 
\n\nNo f1, f2, f3 for the `setTimeout` callback call stack because the event loop moved on, i.e., the error comes from a different event queue:\n\n```\n\u003e f1()\nexecuting f1\nexecuting f2\nexecuting f3\nundefined\n\u003e ReferenceError: undefinedVariableError is not defined\n    at Timeout.setTimeout [as _onTimeout] (repl:4:1)\n    at ontimeout (timers.js:386:14)\n    at tryOnTimeout (timers.js:250:5)\n    at Timer.listOnTimeout (timers.js:214:5)\n\u003e\n```\n\n---\n\n\n## Event Loop Order of Operation:\n\n1. Timers\n1. I/O callbacks\n1. Idle, prepare\n1. Poll (incoming connections, data)\n1. Check\n1. Close callbacks\n\n^https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/\n\n[.autoscale: true]\n\n---\n\n## Phases Overview\n\n1. Timers: this phase executes callbacks scheduled by setTimeout() and setInterval().\n1. I/O callbacks: executes almost all callbacks with the exception of close callbacks, the ones scheduled by timers, and setImmediate().\n1. Idle, prepare: only used internally.\n1. Poll: retrieve new I/O events; node will block here when appropriate.\n1. Check: setImmediate() callbacks are invoked here.\n1. Close callbacks: e.g. socket.on('close', ...).\n\n[.autoscale: true]\n\n---\n\n![](https://youtu.be/PNa9OMajw9w?t=5m48s)\n\n\u003chttps://youtu.be/PNa9OMajw9w?t=5m48s\u003e\n\n---\n\n![](images/true-event-loop.png)\n\n---\n\n## setTimeout vs. setImmediate vs. 
process.nextTick\n\n* `setTimeout(fn, 0)` - pushes to the *next* event loop cycle\n* `setImmediate()` similar to `setTimeout()` with 0 but timing is different sometimes, it is recommended when you need to execute on the next cycle\n* `process.nextTick` - not the next cycle (same cycle!), used to make functions fully async or to postpone code for events\n\n---\n\n## nextTick Usage\n\nAll callbacks passed to process.nextTick() will be resolved before the event loop continues\n\n* To emit event after `.on()`\n* To make some sync code async\n\n---\n\n## Event Emit nextTick Example in http\n\nIn http, to make sure event listeners are attached before emitting error (or anything else) \u003csup\u003e[source](https://github.com/nodejs/node/blob/3b8da4cbe8a7f36fcd8892c6676a55246ba8c3be/lib/_http_client.js#L205)\u003c/sup\u003e:\n\n```js\n    if (err) {\n      process.nextTick(() =\u003e this.emit('error', err));\n      return;\n    }\n```\n\n\n---\n\n## Async or Sync Error Handling in fs\n\nTo postpone callback if it's set (async) or throw error right away (sync) \u003csup\u003e[source](https://github.com/nodejs/node/blob/8aec3638cebd338b0ea2a62c3e1469b8e29616d7/lib/fs.js#L363)\u003c/sup\u003e:\n\n```js\nfunction handleError(val, callback) {\n  if (val instanceof Error) {\n    if (typeof callback === 'function') {\n      process.nextTick(callback, val);\n      return true;\n    } else throw val;\n  }\n  return false;\n}\n```\n\n---\n\n## Async Code Syntax \n\n* Just Callbacks: code and data are arguments\n* Promises: code is separate from data\n* Generators and Async/await: look like sync but actually async\n\n---\n\n## Error-First Callback\n\nDefine your async function:\n\n```js\nconst myFn = (cb) =\u003e {\n  // Define error and data\n  // Do something...\n  cb(error, data)\n}\n```\n\n---\n\n## Error-First Callback Usage\n\nUse your function: \n\n```js\nmyFn((error, data)=\u003e{\n  \n})\n```\n\n---\n\n## Arguments Naming\n\nArgument names don't matter but the order does 
matter, put errors first and callbacks last:\n\n```js\nmyFn((err, result)=\u003e{\n  \n})\n```\n\n---\n\n## Error-First\n\nErrors first but the callback last\n\n\u003e Popular convention but not enforced by Node\n\n---\n\n## Arguments Order and Callback-First?\n\nSome functions don't follow error-first and use callback first, e.g., `setTimeout(fn, time)`.\n\n---\n\n## Callback-First\n\nWith the ES6 rest operator, it might make sense to start using the callback-first style more because rest can only be the last parameter, e.g. \n\n```js\nconst myFn = (cb, ...options) =\u003e {\n\n}\n```\n\n---\n\n## How to Know a Function's Signature\n\n1. You created it so you should know\n2. Someone else created it, so get to know their modules by reading the source code, checking the documentation, and studying examples, tests, and tutorials.\n\n---\n\n## Callbacks not Always Async\n\nSync code that takes a function as an argument:\n\n```js\nconst arr = [1, 2, 3]\narr.map((item, index, list) =\u003e {\n  return item*index // called arr.length times\n})\n```\n\n---\n\n## Promises\n\n\u003e Externalize the callback code and separate it from the data arguments\n\n---\n\n## Promises for Developers\n\n* Consume a ready promise from a library/module (`axios`, `koa`, etc.) - most likely\n* Create your own using ES6 Promise or a library ([bluebird]() or [q](https://documentup.com/kriskowal/q/)) - less likely\n\n\n---\n\n## Usage and Consumption of Ready Promises\n\n---\n\n## Callbacks Syntax \n\nWhere to put the callback, and does the error argument go first? 
\n\n```js\nasyncFn1((error1, data1) =\u003e {\n  asyncFn2(data1, (error2, data2) =\u003e {\n    asyncFn3(data2, (error3, data3) =\u003e {\n      asyncFn4(data3, (error4, data4) =\u003e {\n        // Do something with data4\n      })\n    })\n  })\n})\n```\n\n---\n\n## Promise Syntax\n\nClear separation of data and control flow arguments:\n\n```js\npromise1(data1)\n  .then(promise2)\n  .then(promise3)\n  .then(promise4)\n  .then(data4=\u003e{\n    // Do something with data4\n  })\n  .catch(error=\u003e{\n    // handle error1, 2, 3 and 4\n  })\n```\n\n---\n\n## Axios GET Example\n\n```js\nconst axios = require('axios')\naxios.get('http://azat.co')\n  .then((response) =\u003e response.data)\n  .then(html =\u003e console.log(html))\n```  \n\n---\n\n## Axios GET Error Example\n\n```js\nconst axios = require('axios')\naxios.get('https://azat.co') // https will cause an error!\n  .then((response)=\u003eresponse.data)\n  .then(html =\u003e console.log(html))\n  .catch(e=\u003econsole.error(e))\n```\n\nError: Hostname/IP doesn't match certificate's altnames: \"Host: azat.co. is not in the cert's altnames: DNS:*.github.com, DNS:github.com, DNS:*.github.io, DNS:github.io\"\n\n---\n\n##  Let's implement our own naive promise. 
\n\n### We can learn how easy promises are, and this is advanced course after all so why not?\n\n---\n\n## Naive Promise: Callback Async Function\n\n```js\nfunction myAsyncTimeoutFn(data, callback) {\n  setTimeout(() =\u003e {\n    callback()\n  }, 1000)\n}\n\nmyAsyncTimeoutFn('just a silly string argument', () =\u003e {\n  console.log('Final callback is here')\n})\n```\n\n---\n\n## Naive Promise: Implementation\n\n```js\nfunction myAsyncTimeoutFn(data) {\n  let _callback = null\n  setTimeout(() =\u003e {\n    if (_callback) _callback()\n  }, 1000)\n  return {\n    then(cb){\n      _callback = cb\n    }\n  }\n}\n\nmyAsyncTimeoutFn('just a silly string argument').then(() =\u003e {\n  console.log('Final callback is here')\n})\n```\n\n---\n\n## Naive Promise: Implementation with Errors\n\n```js\nconst fs = require('fs')\nfunction readFilePromise( filename ) {\n  let _callback = () =\u003e {}\n  let _errorCallback = () =\u003e {}\n\n  fs.readFile(filename, (error, buffer) =\u003e {\n    if (error) _errorCallback(error)\n    else _callback(buffer)\n  })\n\n  return {\n    then( cb, errCb ){\n      _callback = cb\n      _errorCallback = errCb\n    }\n  }\n\n}\n```\n\n---\n\n## Naive Promise: Reading File\n\n```js\nreadFilePromise('package.json').then( buffer =\u003e {\n  console.log( buffer.toString() )\n  process.exit(0)\n}, err =\u003e {\n  console.error( err )\n  process.exit(1)\n})\n```\n\n---\n\n## Naive Promise: Triggering Error\n\n```js\nreadFilePromise('package.jsan').then( buffer =\u003e {\n  console.log( buffer.toString() )\n  process.exit(0)\n}, err =\u003e {\n  console.error( err )\n  process.exit(1)\n})\n```\n\n```\n{ Error: ENOENT: no such file or directory, open 'package.jsan'\n  errno: -2,\n  code: 'ENOENT',\n  syscall: 'open',\n  path: 'package.jsan' }\n```\n\n---\n\n## Creating Promises Using The Standard ES6/ES2015 Promise\n\n---\n\n## ES6/ES2015 Promise in Node\n\nNode version 8+ (v8 not V8):\n\n```\nPromise === global.Promise\n```\n\nES6 `Promise` 
takes callback with `resolve` and `reject`\n\n---\n\n## Simple Promise Implementation with ES6/ES2015\n\n```js\nconst fs = require('fs')\nfunction readJSON(filename, enc='utf8'){\n  return new Promise(function (resolve, reject){\n    fs.readFile(filename, enc, function (err, res){\n      if (err) reject(err)\n      else {\n        try {\n          resolve(JSON.parse(res))\n        } catch (ex) {\n          reject(ex)\n        }\n      }\n    })\n  })\n}\n\nreadJSON('./package.json').then(console.log)\n```\n\n---\n\n## Advanced Promise Implementation with ES6/ES2015 for Both Promises and Callbacks\n\n```js\nconst fs = require('fs')\n\nconst readFileIntoArray = function(file, cb = null) {\n  return new Promise((resolve, reject) =\u003e {\n    fs.readFile(file, (error, data) =\u003e {\n      if (error) {\n        if (cb) return cb(error) \n        return reject(error)\n      }\n\n      const lines = data.toString().trim().split('\\n')\n      if (cb) return cb(null, lines)\n      else return resolve(lines)\n    })\n  })\n}\n```\n\n---\n\n## Example Calls with `then` and a Callback\n\n```js\nconst printLines = (lines) =\u003e {\n  console.log(`There are ${lines.length} line(s)`)\n  console.log(lines)\n}\nconst FILE_NAME = __filename\n\nreadFileIntoArray(FILE_NAME)\n  .then(printLines)\n  .catch(console.error)\n\nreadFileIntoArray(FILE_NAME, (error, lines) =\u003e {\n  if (error) return console.error(error)\n  printLines(lines)\n})\n```\n\n---\n\n## Event Emitters\n\n1. Import `require('events')`\n1. Extend `class Name extends ...`\n1. Instantiate `new Name()`\n1. Add listeners `.on()`\n1. 
Emit `.emit()`\n\n---\n\n## Emitting Outside Event Emitter Class\n\n\n```js\nconst events = require('events')\nclass Encrypt extends events {\n  constructor(ops) {\n    super(ops)\n    this.on('start', () =\u003e {\n      console.log('beginning A')\n    })    \n    this.on('start', () =\u003e {\n      console.log('beginning B')\n    })\n  }\n}\n\nconst encrypt = new Encrypt()\nencrypt.emit('start')\n```\n\n---\n\n## Emitting Outside and Inside\n\n```js\nconst events = require('events')\nclass Encrypt extends events {\n  constructor(ops) {\n    super(ops)\n    this.on('start', () =\u003e {\n      console.log('beginning A')\n    })    \n    this.on('start', () =\u003e {\n      console.log('beginning B')\n      setTimeout(()=\u003e{\n        this.emit('finish', {msg: 'ok'})\n      }, 0)\n    })\n  }\n}\n\nconst encrypt = new Encrypt()\nencrypt.on('finish', (data) =\u003e {\n  console.log(`Finished with message: ${data.msg}`)\n})\nencrypt.emit('start')\n```\n\n---\n\n## Working with Events\n\nEvents are about building extensible functionality and making modular code flexible\n\n* `.emit()` can be in the module and `.on()` in the main program which consumes the module\n* `.on()` can be in the module and `.emit()` in the main program, and in the constructor or on an instance\n* pass data with `emit()`\n* `error` is a special event (if you listen to it, then no crashes)\n* `on()` callbacks execute in the order in which they are defined (see `prependListener` or `removeListener`)\n\n[.autoscale: true]\n\n---\n\n## Default Max Event Listeners\n\nThe default maximum number of listeners is 10 (to help find memory leaks); use `setMaxListeners` to change it\u003csup\u003e[source](https://github.com/nodejs/node/blob/master/lib/events.js#L81)\u003c/sup\u003e\n\n```js\nvar defaultMaxListeners = 10;\n...\nEventEmitter.prototype.setMaxListeners = function setMaxListeners(n) {\n  if (typeof n !== 'number' || n \u003c 0 || isNaN(n)) {\n    const errors = lazyErrors();\n    throw new errors.RangeError('ERR_OUT_OF_RANGE', 'n',\n        
                        'a non-negative number', n);\n  }\n  this._maxListeners = n;\n  return this;\n};\n```\n\n---\n\n## Promises vs Events\n\n* Events are synchronous while Promises are *typically* asynchronous\n* Events can be handled from multiple places, while a Promise is handled by just one call\n* Events can fire and be handled multiple times, while `then` fires just once\n\n---\n\n## nextTick in a Class\n\nAgain, nextTick helps to emit events later, such as from a class constructor\n\n```js\nclass Encrypt extends events {\n  constructor() {\n    super()\n    process.nextTick(()=\u003e{  // otherwise, emit will happen before .on('ready')\n      this.emit('ready', {})\n    })\n  }\n}\nconst encrypt = new Encrypt()\nencrypt.on('ready', (data) =\u003e {})\n```\n\n---\n\n## Async/await\n\n---\n\n## How Developers Use Async/await \n\n* Consume ready async/await functions from libraries which support it - often\n* Create your own from callbacks or promises - not often (Node's `util.promisify`)\n\nYou need Node v8+ for both\n\n---\n\n## Consuming Async Fn from axios\n\n```js\nconst axios = require('axios')\nconst getAzatsWebsite = async () =\u003e {\n  const response = await axios.get('http://azat.co')\n  return response.data\n}\ngetAzatsWebsite().then(console.log)\n```\n\n---\n\n## `util.promisify`\n\n```js\nconst fs = require('fs')\nconst util = require('util')\nconst f = async function () {\n  try {\n    const data = await util.promisify(fs.readFile)('os.js', 'utf8') // \u003c- try changing to a non-existent file to trigger an error\n    console.log(data)\n  } catch (e) {\n    console.log('ooops')\n    console.error(e)\n    process.exit(1)\n  }\n}\n\nf()\nconsole.log('could be doing something else')\n```\n\n(Can be used with plain Promises as well)\n\n---\n\n## Consuming Async Fn from mocha and axios\n\n```js\nconst axios = require('axios')\nconst {expect} = require('chai')\nconst app = require('../server.js')\nconst port = 3004\n\nbefore(async function() {\n  await app.listen(port, () =\u003e {\n    
console.log('server is running')\n  })\n  console.log('code after the server is running')\n})\n```\n\n---\n\n## Consuming Async Fn from mocha and axios (Cont)\n\n```js\ndescribe('express rest api server', async () =\u003e {\n  let id\n\n  it('posts an object', async () =\u003e {\n    const {data: body} = await axios\n      .post(`http://localhost:${port}/collections/test`, \n      { name: 'John', email: 'john@rpjs.co'})\n    expect(body.length).to.eql(1)\n    expect(body[0]._id.length).to.eql(24)\n    id = body[0]._id\n  })\n\n  it('retrieves an object', async () =\u003e {\n    const {data: body} = await axios\n      .get(`http://localhost:${port}/collections/test/${id}`)\n    expect(typeof body).to.eql('object')\n    expect(body._id.length).to.eql(24)\n    expect(body._id).to.eql(id)\n    expect(body.name).to.eql('John')\n  })\n  // ...\n})\n```\n\n---\n\n## Project: Avatar Service\n\nKoa Server with Mocha and Async/await Fn and `Promise.all`\n\nTerminal:\n\n```\ncd code\ncd koa-rest\nnpm i\nnpm start\n```\n\nOpen in a Browser: \u003chttp://localhost:3000/?email=YOURMAIL\u003e, e.g., \u003chttp://localhost:3000/?email=hi@node.university\u003e to see your avatar (powered by Gravatar)\n\n---\n\n# Module 3: Streaming\n\n---\n\n## Abstractions for continuous chunking of data or simply data which is not available all at once and which does NOT require too much memory.\n\n---\n\n## No need to wait for the entire resource to load\n\n---\n\n## Types of Streams\n\n* Readable, e.g., `fs.createReadStream`\n* Writable, e.g., `fs.createWriteStream`\n* Duplex, e.g., `net.Socket`\n* Transform, e.g., `zlib.createGzip`\n\n---\n\n## Streams Inherit from Event Emitter\n\n---\n\n## Streams are Everywhere!\n\n* HTTP requests and responses\n* Standard input/output (stdin\u0026stdout)\n* File reads and writes\n\n---\n\n## Readable Stream Example\n\n`process.stdin`\n\nStandard input streams contain data going into applications.\n\n* Event data: `on('data')`\n*  `read()` 
method\n\n\n---\n\n## Input typically comes from the keyboard used to start the process.\n\nTo listen in on data from stdin, use the `data` and `end` events:\n\n```js\n// stdin.js\nprocess.stdin.resume()\nprocess.stdin.setEncoding('utf8')\n\nprocess.stdin.on('data', function (chunk) {\n  console.log('chunk: ', chunk)\n})\n\nprocess.stdin.on('end', function () {\n  console.log('--- END ---')\n})\n```\n\n---\n\n## Readable stdin Stream Demo\n\n`$ node stdin.js`\n\n---\n\n## Interface `read()`\n\n```js\nvar readable = getReadableStreamSomehow()\nreadable.on('readable', () =\u003e {\n  var chunk\n  while (null !== (chunk = readable.read())) { // SYNC!\n    console.log('got %d bytes of data', chunk.length)\n  }\n})\n```\n\n^readable.read is sync but the chunks are small\n\n---\n\n## Writable Stream Example\n\n* `process.stdout`: Standard output streams contain data going out of the applications.\n* `response` (server request handler response)\n* `request` (client request)\n\nMore on networking in the next module\n\n---\n\n## Write to Writable Stream\n\nUse `write()` method\n\n```js\nprocess.stdout.write('A simple message\\n')\n```\n\nData written to standard output is visible on the command line.\n\n---\n\n## Writable stdout Stream Demo\n\n```\nnode stdout.js\n```\n\n---\n\n## Pipe\n\n```\nsource.pipe(destination)\n```\n\nsource - readable or duplex\ndestination - writable, or transform or duplex\n\n---\n\n## Linux vs Node Piping\n\nLinux shell:\n\n```\noperationA | operationB | operationC | operationD\n```\n\nNode :\n\n```js\nstreamA.pipe(streamB).pipe(streamC).pipe(streamD)\n```\n\nor\n\n```js\nstreamA.pipe(streamB)\nstreamB.pipe(streamC)\nstreamC.pipe(streamD)\n```\n\n---\n\n![left fit](images/pipe.png)\n\nHow `pipe` really works: readable source will be paused if the queue for the writable/transform/duplex destination stream is full. Otherwise, the readable will be resumed and read. 
\u003csup\u003e[source](https://nodejs.org/en/docs/guides/backpressuring-in-streams)\u003c/sup\u003e\n\n[.footer:hide]\n\n---\n\n```\n                                                   +===================+\n                         x--\u003e  Piping functions   +--\u003e   src.pipe(dest)  |\n                         x     are set up during     |===================|\n                         x     the .pipe method.     |  Event callbacks  |\n  +===============+      x                           |-------------------|\n  |   Your Data   |      x     They exist outside    | .on('close', cb)  |\n  +=======+=======+      x     the data flow, but    | .on('data', cb)   |\n          |              x     importantly attach    | .on('drain', cb)  |\n          |              x     events, and their     | .on('unpipe', cb) |\n+---------v---------+    x     respective callbacks. | .on('error', cb)  |\n|  Readable Stream  +----+                           | .on('finish', cb) |\n+-^-------^-------^-+    |                           | .on('end', cb)    |\n  ^       |       ^      |                           +-------------------+\n  |       |       |      |\n  |       ^       |      |\n  ^       ^       ^      |    +-------------------+         +=================+\n  ^       |       ^      +----\u003e  Writable Stream  +---------\u003e  .write(chunk)  |\n  |       |       |           +-------------------+         +=======+=========+\n  |       |       |                                                 |\n  |       ^       |                              +------------------v---------+\n  ^       |       +-\u003e if (!chunk)                |    Is this chunk too big?  |\n  ^       |       |     emit .end();             |    Is the queue busy?      
|\n  |       |       +-\u003e else                       +-------+----------------+---+\n  |       ^       |     emit .write();                   |                |\n  |       ^       ^                                   +--v---+        +---v---+\n  |       |       ^-----------------------------------\u003c  No  |        |  Yes  |\n  ^       |                                           +------+        +---v---+\n  ^       |                                                               |\n  |       ^               emit .pause();          +=================+     |\n  |       ^---------------^-----------------------+  return false;  \u003c-----+---+\n  |                                               +=================+         |\n  |                                                                           |\n  ^            when queue is empty     +============+                         |\n  ^------------^-----------------------\u003c  Buffering |                         |\n               |                       |============|                         |\n               +\u003e emit .drain();       |  ^Buffer^  |                         |\n               +\u003e emit .resume();      +------------+                         |\n                                       |  ^Buffer^  |                         |\n                                       +------------+   add chunk to queue    |\n                                       |            \u003c---^---------------------\u003c\n                                       +============+\n```                                       \n\n---\n\n## Pipe and Transform\n\nEncrypts and Zips:\n\n```js\nconst r = fs.createReadStream('file.txt')\nconst e = crypto.createCipher('aes256', SECRET) \nconst z = zlib.createGzip()\nconst w = fs.createWriteStream('file.txt.gz')\nr.pipe(e).pipe(z).pipe(w)\n```\n\n^Readable.pipe takes writable and returns destination\n\n---\n\n## Readable Streams Events\n\n* data\n* end\n* error\n* close\n* 
readable\n\n---\n\n## Readable Streams Methods\n\n* `pipe()` \n* `unpipe()`\n* `read()` \n* `unshift()` \n* `resume()`\n* `pause()` \n* `isPaused()` \n* `setEncoding()`\n\n---\n\n## Writable Streams Events\n\n* drain\n* finish\n* error\n* close\n* pipe\n* unpipe\n\n---\n\n## Writable Streams Methods\n\n* `write()` \n* `end()`\n* `cork()` \n* `uncork()` \n* `setDefaultEncoding()`\n\n\n---\n\n## With pipe, we can listen to events too!\n\n```js\nconst r = fs.createReadStream('file.txt')\nconst e = crypto.createCipher('aes256', SECRET) \nconst z = zlib.createGzip()\nconst w = fs.createWriteStream('file.txt.gz')\nr.pipe(e)\n  .pipe(z).on('data', () =\u003e process.stdout.write('.')) // progress dot \".\"\n  .pipe(w).on('finish', () =\u003e console.log('all is done!')) // when all is done\n```\n\n---\n\n## Readable Stream\n\npaused mode: `stream.read()` - safe; `stream.resume()` switches to flowing\n\nflowing mode: EventEmitter-style - data can be lost if there are no listeners or they are not ready; `stream.pause()` switches back to paused\n\n---\n\n## What about HTTP?\n\n---\n\n## Core http uses Streams!\n\n```js\nconst http = require('http')\nvar server = http.createServer( (req, res) =\u003e {\n  req.setEncoding('utf8')\n  req.on('data', (chunk) =\u003e { // readable\n    processDataChunk(chunk) // This function is defined somewhere else\n  })\n  req.on('end', () =\u003e {  \n    res.write('ok') // writable\n    res.end()\n  })\n})\n\nserver.listen(3000)\n```\n\n---\n\n## Streaming for Servers\n\n`streams/large-file-server.js`:\n\n```js\nconst path = require('path')\nconst fileName = path.join(\n  __dirname, process.argv[2] || 'webapp.log') // 67Mb\nconst fs = require('fs')\nconst server = require('http').createServer()\n\nserver.on('request', (req, res) =\u003e {\n  if (req.url === '/callback') {\n    fs.readFile(fileName, (err, data) =\u003e {\n      if (err) return console.error(err)\n      res.end(data)\n    })\n  } else if (req.url === '/stream') {\n    const src = fs.createReadStream(fileName)\n    src.pipe(res)\n  
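  // hypothetical addition, not in the original file: a real stream source can fail,\n    // so tear down the response instead of leaving the client hanging\n    src.on('error', (err) =\u003e {\n      console.error(err)\n      res.destroy()\n    })\n  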
}\n})\n\nserver.listen(3000)\n```\n\n---\n\n![inline](images/stream-callback-memory-2.png)\n\n---\n\n![inline](images/stream-stream-memory-2.png)\n\n---\n\n## Before, we were consuming streams; now let's create our own stream. This is ~~Sparta~~ an advanced course after all!\n\n---\n\n## Create a Stream\n\n```js\nconst stream = require('stream')\nconst writable = new stream.Writable({...})\nconst readable = new stream.Readable({...})\nconst transform = new stream.Transform({...})\nconst duplex = new stream.Duplex({...})\n```\n\nor \n\n```js\nconst {Writable} = require('stream')\nconst writable = new Writable({...})\n```\n\n---\n\n## Create a Writable Stream\n\n```js\nconst {Writable} = require('stream')\nconst translate = require('google-translate-api') // the translation client used in this example\n\nconst translateWritableStream = new Writable({\n  write(chunk, encoding, callback) {\n    translate(chunk.toString(), {to: 'en'}).then(res =\u003e {\n        console.log(res.text)\n        //=\u003e I speak English\n        console.log(res.from.language.iso)\n        //=\u003e nl\n        callback()\n    }).catch(err =\u003e {\n        console.error(err)\n        callback()\n    })\n  }\n})\n```\n\nstreams/writable-translate.js\n\n\n---\n\n## Creating Readable\n\n```js\nconst {Readable} = require('stream')\nconst Web3 = require('web3')\nconst web3 = new Web3(new Web3.providers.HttpProvider(\"https://mainnet.infura.io/jrrVdXuXrVpvzsYUkCYq\"))\n\nconst latestBlock = new Readable({\n  read(size) {\n    web3.eth.getBlock('latest')\n      .then((x) =\u003e {\n        // console.log(x.timestamp)\n        this.push(`${x.hash}\\n`)\n        // this.push(null)\n      })\n  }\n})\n\nlatestBlock.pipe(process.stdout)\n```\n\n---\n\n## Creating Duplex\n\n```js\nconst {Duplex} = require('stream')\n\nconst MyDuplex = new Duplex({\n  write(chunk, encoding, callback) {\n    callback()\n  },\n  read(size) {\n    this.push(data) // data is defined elsewhere\n    this.push(null)\n  }\n})\n```\n\n---\n\n## Creating Transform \n\n```js\nconst {Transform} = require('stream')\n\nconst MyTransform = new Transform({\n  transform(chunk, 
encoding, callback) {\n    this.push(chunk) // push the transformed chunk downstream\n    callback()\n  }\n})\n```\n\n---\n\n## Transform Real Life Example: Zlib from Node Core \u003csup\u003e[source](https://github.com/nodejs/node/blob/master/lib/zlib.js#L395)\u003c/sup\u003e\n\n```js\nZlib.prototype._transform = function _transform(chunk, encoding, cb) {\n  // If it's the last chunk, or a final flush, we use the Z_FINISH flush flag\n  // (or whatever flag was provided using opts.finishFlush).\n  // If it's explicitly flushing at some other time, then we use\n  // Z_FULL_FLUSH. Otherwise, use the original opts.flush flag.\n  var flushFlag;\n  var ws = this._writableState;\n  if ((ws.ending || ws.ended) \u0026\u0026 ws.length === chunk.byteLength) {\n    flushFlag = this._finishFlushFlag;\n  } else {\n    flushFlag = this._flushFlag;\n    // once we've flushed the last of the queue, stop flushing and\n    // go back to the normal behavior.\n    if (chunk.byteLength \u003e= ws.length)\n      this._flushFlag = this._origFlushFlag;\n  }\n  processChunk(this, chunk, flushFlag, cb);\n};\n```\n\n \n\n---\n\n## Backpressure\n\n* Data clogs \n* Reading is typically faster than writing\n* Backpressure is bad: it leads to memory exhaustion and extra GC (triggering GC too often is expensive)\n* Node streams solve backpressure automatically by pausing the source (readable) stream when needed\n* `highWaterMark` option, defaults to 16KB\n\n---\n\n## Overriding Streams\n\nSince Node.js v0.10, the Stream class has offered the ability to modify the behavior of `.read()` or `.write()` by implementing the underscore version of these respective functions (`._read()` and `._write()`).\n\n[Guide](https://nodejs.org/en/docs/guides/backpressuring-in-streams)\n\n\n---\n\n# Module 4: Networking\n\n---\n\n## net\n\n---\n\n## Any server, not just http or https!\n\n```js\nconst server = require('net').createServer()\nserver.on('connection', socket =\u003e {\n  socket.write('Enter your command: ') // Sent to client\n  socket.on('data', data =\u003e {\n    // 
Incoming data from a client\n  })\n\n  socket.on('end', () =\u003e {\n    console.log('Client disconnected')\n  })\n})\n\nserver.listen(3000, () =\u003e console.log('Server bound'))\n```\n\n---\n\n## Chat\n\nchat.js:\n\n```js\nif (!sockets[socket.id]) {\n  socket.name = data.toString().trim()\n  socket.write(`Welcome ${socket.name}!\\n`)\n  sockets[socket.id] = socket\n  return\n}\nObject.entries(sockets).forEach(([key, cs]) =\u003e {\n  if (socket.id === key) return\n  cs.write(`${socket.name} ${timestamp()}: `)\n  cs.write(data)\n})\n```    \n\n---\n\n## Client?\n\n```\ntelnet localhost 3000\n```\n\nor\n\n```\nnc localhost 3000\n```\n\nor write your own TCP/IP client using Node, C++, Python, etc.\n\n---\n\n## Bitcoin Price Ticker \n\n`node code/bitcoin-price-ticker.js`\n\n---\n\n## Ticker Server\n\n```js\nconst https = require('https')\n\nconst server = require('net').createServer()\nlet counter = 0\nlet sockets = {}\nserver.on('connection', socket =\u003e {\n  socket.id = counter++\n\n  console.log('Welcome to Bitcoin Price Ticker (Data by Coindesk)')\n  console.log(`There are ${counter} clients connected`)\n  socket.write('Enter currency code (e.g., USD or CNY): ')\n\n  socket.on('data', data =\u003e {\n    // process data from the client\n  })\n\n  socket.on('end', () =\u003e {\n    delete sockets[socket.id]\n    console.log('Client disconnected')\n  })\n})\n\nserver.listen(3000, () =\u003e console.log('Server bound'))\n```\n\n---\n\n## Processing Data from the Client\n\n```js\n    let currency = data.toString().trim()\n    if (!sockets[socket.id]) {\n      sockets[socket.id] = {\n        currency: currency\n      }\n      console.log(currency)\n    }\n    fetchBTCPrice(currency, socket)\n    clearInterval(sockets[socket.id].interval)\n    sockets[socket.id].interval = setInterval(()=\u003e{\n      fetchBTCPrice(currency, socket)\n    }, 5000)\n```\n\n---\n\n## Making request to [Coindesk API](https://www.coindesk.com/api/) (HTTPS!)\n\nAPI: 
https://api.coindesk.com/v1/bpi/currentprice/\u003cCODE\u003e.json\n\n\u003chttps://api.coindesk.com/v1/bpi/currentprice/USD.json\u003e\n\u003chttps://api.coindesk.com/v1/bpi/currentprice/JPY.json\u003e\n\u003chttps://api.coindesk.com/v1/bpi/currentprice/RUB.json\u003e\n\u003chttps://api.coindesk.com/v1/bpi/currentprice/NYC.json\u003e\n\n---\n\n\n## Response\n\n```json\n{\n  \"time\": {\n    \"updated\": \"Jan 9, 2018 19:52:00 UTC\",\n    \"updatedISO\": \"2018-01-09T19:52:00+00:00\",\n    \"updateduk\": \"Jan 9, 2018 at 19:52 GMT\"\n  },\n  \"disclaimer\": \"This data was produced from the CoinDesk \n  Bitcoin Price Index (USD). Non-USD currency data \n  converted using hourly conversion rate from openexchangerates.org\",\n  \"bpi\": {\n    \"USD\": {\n      \"code\": \"USD\",\n      \"rate\": \"14,753.6850\",\n      \"description\": \"United States Dollar\",\n      \"rate_float\": 14753.685\n    }\n  }\n}\n```\n\n---\n\n## HTTPS GET\n\n```js\nconst fetchBTCPrice = (currency, socket) =\u003e {\n  const req = https.request({\n    port: 443,\n    hostname: 'api.coindesk.com',\n    method: 'GET',\n    path: `/v1/bpi/currentprice/${currency}.json`\n  }, (res) =\u003e {\n    let data = ''\n    res.on('data', (chunk) =\u003e {\n      data +=chunk\n    })\n    res.on('end', () =\u003e {\n      socket.write(`1 BTC is ${JSON.parse(data).bpi[currency].rate} ${currency}\\n`)\n    })\n  })\n  req.end()\n}\n```\n\n\n\n---\n\n## Client\n\n```\ntelnet localhost 3000\nTrying 127.0.0.1...\nConnected to localhost.\nEscape character is '^]'.\nEnter currency code (e.g., USD or CNY): USD\n1 BTC is 14,707.9438 USD\n1 BTC is 14,694.5113 USD\n1 BTC is 14,694.5113 USD\nCNY\n1 BTC is 40,202.5000 CNY\nRUB\n1 BTC is 837,400.5342 RUB\n1 BTC is 837,400.5342 RUB\n1 BTC is 837,400.5342 RUB\n```\n\n---\n\n## http\n\nProtected SQL archive (file-server/file-server.js):\n\n```js\nconst url = require('url')\nconst fs = require('fs')\nconst SECRET = process.env.SECRET\nconst server = 
require('http').createServer((req, res) =\u003e {\n  console.log(`URL is ${req.url} and the method is ${req.method}`)\n  const course = req.url.match(/courses\\/([0-9]*)/) // works for /courses/123 to get 123\n  const query = url.parse(req.url, true).query // works for /?key=value\u0026key2=value2 \n  if (course \u0026\u0026 course[1] \u0026\u0026 query.API_KEY === SECRET) {\n    fs.readFile('./clients_credit_card_archive.sql', (error, data)=\u003e{\n      if (error) {\n        res.writeHead(500)\n        res.end('Server error')\n      } else {\n        res.writeHead(200, {'Content-Type': 'text/plain' })\n        res.end(data)\n      }\n    })\n  } else {\n    res.writeHead(404)\n    res.end('Not found')\n  }\n}).listen(3000, () =\u003e {\n  console.log('server is listening on 3000')\n})\n```\n\n---\n\n## HTTP File Server\n\nCommand to run the server:\n\n```\nSECRET=NNN nodemon file-server.js\n```\n\nBrowser request: \u003chttp://localhost:3000/courses/123?API_KEY=NNN\u003e\n\n---\n\n## HTTP Routing\n\nYou can use switch... 
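or, as a hypothetical alternative sketch, a plain object that maps URLs to handler functions (the `handlers` map and `route` helper below are made up for illustration):\n\n```js\nconst handlers = {\n  '/api': (req, res) =\u003e {\n    res.writeHead(200, { 'Content-Type': 'application/json' })\n    res.end(JSON.stringify({ok: true}))\n  },\n  '/home': (req, res) =\u003e {\n    res.writeHead(200, { 'Content-Type': 'text/html' })\n    res.end('\u003ch1\u003eHome\u003c/h1\u003e')\n  }\n}\n\nconst route = (req, res) =\u003e {\n  const handler = handlers[req.url]\n  if (handler) return handler(req, res)\n  res.writeHead(404)\n  res.end()\n}\n\nconst server = require('http').createServer(route)\nserver.listen(3000)\n```\n\nOr the classic `switch`: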
\n\n```js\nconst server = require('http').createServer((req, res) =\u003e {\n  switch (req.url) {\n    case '/api':\n      res.writeHead(200, { 'Content-Type': 'application/json' })\n      // fetch data from a database\n      res.end(JSON.stringify(data))\n      break\n    case '/home':\n      res.writeHead(200, { 'Content-Type': 'text/html' })\n      // send html from a file\n      res.end(html)\n      break\n    default:\n      res.writeHead(404)\n      res.end()\n  }\n}).listen(3000, () =\u003e {\n  console.log('server is listening on 3000')\n})\n```\n\n---\n\n## HTTP Routing Puzzle\n\nFind a problem with this server (from [Advanced Node by Samer Buna](https://app.pluralsight.com/player?course=nodejs-advanced\u0026author=samer-buna\u0026name=nodejs-advanced-m5\u0026clip=3\u0026mode=live)):\n\n```js\nconst fs = require('fs')\nconst server = require('http').createServer()\nconst data = {}\n\nserver.on('request', (req, res) =\u003e {\n  switch (req.url) {\n  case '/api':\n    res.writeHead(200, { 'Content-Type': 'application/json' })\n    res.end(JSON.stringify(data))\n    break\n  case '/home':\n  case '/about':\n    res.writeHead(200, { 'Content-Type': 'text/html' })\n    res.end(fs.readFileSync(`.${req.url}.html`))\n    break\n  case '/':\n    res.writeHead(301, { 'Location': '/home' })\n    res.end()\n    break\n  default:\n    res.writeHead(404)\n    res.end()\n  }\n})\n\nserver.listen(3000)\n```\n\n---\n\n## Puzzle Answer\n\nAlways reading (no caching) and blocking!\n\n```js\n  case '/about':\n    res.writeHead(200, { 'Content-Type': 'text/html' })\n    res.end(fs.readFileSync(`.${req.url}.html`))\n    break\n```\n\n\n---\n\n## Use HTTP Status Codes\n\n```\nhttp.STATUS_CODES\n```\n\n---\n\n## Core `https` Module\n\nServer needs the key and certificate files:\n\n```\nopenssl req -x509 -newkey rsa:2048 -nodes -sha256 -subj '/C=US/ST=CA/L=SF/O=NO\\x08A/OU=NA' \\\n  -keyout server.key -out server.crt\n```  \n\n---\n\n## HTTPS Server with Core `https` 
Module\n\n```js\nconst https = require('https')\nconst fs = require('fs')\n\nconst server = https.createServer({\n  key: fs.readFileSync('server.key'),\n  cert: fs.readFileSync('server.crt')\n}, (req, res) =\u003e {\n  res.writeHead(200)\n  res.end('hello')\n}).listen(443)\n```\n\n---\n\n## `https` Request with Streaming\n\n```js\nconst https = require('https') \n\nconst req = https.request({\n    hostname: 'webapplog.com',\n    port: 443, \n    path: '/',\n    method: 'GET'\n  }, (res) =\u003e {\n  console.log('statusCode:', res.statusCode)\n  console.log('headers:', res.headers)\n\n  res.on('data', (chunk) =\u003e {\n    process.stdout.write(chunk)\n  })\n})\n\nreq.on('error', (error) =\u003e {\n  console.error(error)\n})\nreq.end()\n```\n\n---\n\n\n## HTTP/2 with `http2`\n\n---\n\n## Generating Self-Signed SSL\n\n```\nopenssl req -x509 -newkey rsa:2048 -nodes -sha256 -subj '/C=US/ST=CA/L=SF/O=NO\\x08A/OU=NA' \\\n  -keyout server.key -out server.crt\n```  \n\n\n---\n\n## Using Core `http2` Module\n\n```js\nconst http2 = require('http2')\nconst fs = require('fs')\n\nconst server = http2.createSecureServer({\n  key: fs.readFileSync('server.key'),\n  cert: fs.readFileSync('server.crt')\n}, (req, res) =\u003e {\n  res.writeHead(200, {'Content-Type': 'text/plain' })\n  res.end('\u003ch1\u003eHello World\u003c/h1\u003e') // JUST LIKE HTTP!\n})\nserver.on('error', (err) =\u003e console.error(err))\nserver.listen(3000)\n```\n\n\n---\n\n## Running H2 Hello Server\n```\ncd code\ncd http2\nnode h2-hello.js\n```\n\nBrowser: \u003chttps://localhost:3000\u003e\n\nTerminal:\n\n```\ncurl https://localhost:3000/ -vik\n```\n\n---\n\n![inline](images/http2-click-on-advanced.png)\n\n---\n\n![inline](images/http2-click-on-proceed.png)\n\n---\n\n![inline](images/http2-localhost-request.png)\n\n---\n\n![inline](images/http2-inspecting-self-signed.png)\n\n---\n\n```\ncurl https://localhost:3000/ -vik\n```\n\n```\n Trying 127.0.0.1...\n* Connected to localhost (127.0.0.1) port 3000 
(#0)\n* ALPN, offering h2\n* ALPN, offering http/1.1\n* Cipher selection:\n...\n* SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256\n* ALPN, server accepted to use h2\n* Server certificate:\n*  subject: C=US; ST=CA; L=SF; O=NOx08A; OU=NA\n* Using HTTP2, server supports multi-use\n* Connection state changed (HTTP/2 confirmed)\n```\n\n---\n\n## Using Core `http2` Module with Stream\n\n```js\nconst http2 = require('http2')\nconst fs = require('fs')\n\nconst server = http2.createSecureServer({\n  key: fs.readFileSync('server.key'),\n  cert: fs.readFileSync('server.crt')\n})\n\nserver.on('error', (err) =\u003e console.error(err))\nserver.on('socketError', (err) =\u003e console.error(err))\n\nserver.on('stream', (stream, headers) =\u003e {\n  // stream is a Duplex\n  stream.respond({\n    'content-type': 'text/html',\n    ':status': 200\n  })\n  stream.end('\u003ch1\u003eHello World\u003c/h1\u003e')\n})\n\nserver.listen(3000)\n```\n\n---\n\n## WTF is http2 Server Push?\n\n---\n\n## Example: index.html refers to four static assets\n\nHTTP/1: server requires five requests from a client:\n\n1. index.html\n2. style.css\n3. bundle.js\n4. favicon.ico\n5. logo.png\n\n---\n\n## Example: index.html refers to four static assets (Cont)\n\nHTTP/2: server with server push requires just one request from a client: \n\n1. 
index.html\n  * style.css\n  * bundle.js\n  * favicon.ico\n  * logo.png\n\n---\n\n\u003e HTML and assets are pushed by the server but assets are not used unless referred to by HTML.\n\n---\n\n## Let's implement some server push!\n\n---\n\n## Start with a Normal H2 Server\n\n```js\nconst http2 = require('http2')\nconst fs = require('fs')\n\nconst server = http2.createSecureServer({\n  key: fs.readFileSync('server.key'),\n  cert: fs.readFileSync('server.crt')\n})\n\nserver.on('error', (err) =\u003e console.error(err))\nserver.on('socketError', (err) =\u003e console.error(err))\n```\n\n---\n\n## Use Stream and `pushStream`\n\n```js\nserver.on('stream', (stream, headers) =\u003e {\n  stream.respond({\n    'content-type': 'text/html',\n    ':status': 200\n  })\n  stream.pushStream({ ':path': '/myfakefile.js' }, (pushStream) =\u003e {\n    pushStream.respond({ \n      'content-type': 'text/javascript',\n      ':status': 200 \n    })\n    pushStream.end(`alert('you win')`)\n  })\n  stream.end('\u003cscript src=\"/myfakefile.js\"\u003e\u003c/script\u003e\u003ch1\u003eHello World\u003c/h1\u003e')\n})\n\nserver.listen(3000)\n```\n\n---\n\n![inline](images/http2-push-alert.png)\n\n---\n\n## Additional server push articles\n\n* [What’s the benefit of Server Push?](https://http2.github.io/faq/#whats-the-benefit-of-server-push)\n* [Announcing Support for HTTP/2 Server Push](https://blog.cloudflare.com/announcing-support-for-http-2-server-push-2)\n* [Innovating with HTTP 2.0 Server Push](https://www.igvita.com/2013/06/12/innovating-with-http-2.0-server-push)\n\n---\n\n## Advanced Express REST API Routing in HackHall Demo \n\n---\n\n## Conclusion\n\n### Just don't use core http directly. 
Use Express, Hapi or Koa.\n\n\n---\n\n# Module 5: Debugging\n\n\n---\n\n## Debugging Strategies\n\n* Don't guess and don't think too much\n* Isolate (use binary search)\n* Watch/check values\n* Trial and error\n* Full Stack overflow development (skip question, read answers)\n* Read source code, docs can be outdated or subpar\n\n---\n\n## `console.log` is one of the best debuggers\n\n* Doesn't break the execution flow\n* Nothing extra needed (unlike Node Inspector/DevTools or VS Code)\n* Robust: clearly shows if a line is executed\n* Clearly shows data\n\n---\n\n## Console Tricks\n\n---\n\n## Streaming Logs to Files\n\n```js\nconst fs = require('fs')\n\nconst out = fs.createWriteStream('./out.log')\nconst err = fs.createWriteStream('./err.log')\n\nconst console2 = new console.Console(out, err)\n\nsetInterval(() =\u003e {\n  console2.log(new Date())\n  console2.error(new Error('Whoops'))\n}, 500)\n```\n\n---\n\n## Console Parameters\n\n```js\nconsole.log('Step', 2) // Step 2\nconst name = 'Azat'\nconst city = 'San Francisco'\nconsole.log('Hello %s from %s', name, city)\n```\n\n---\n\n## `util.format` and `util.inspect`\n\n```js\nconst util = require('util')\nconsole.log(util.format('Hello %s from %s', name, city)) \n// Hello Azat from San Francisco\nconsole.log('Hello %s from %s', 'Azat', {city: 'San Francisco'}) \n// Hello Azat from [object Object]\nconsole.log({city: 'San Francisco'}) \n// { city: 'San Francisco' }\nconsole.log(util.inspect({city: 'San Francisco'})) \n// { city: 'San Francisco' }\n```\n\n---\n\n## `console.dir()`\n\n```js\nconst str = util.inspect(global, {depth: 0})\nconsole.dir(global, {depth: 0})\n```\n\n```\nconsole.info = console.log (alias)\nconsole.warn = console.error (alias)\nconsole.trace() // prints the call stack\nconsole.assert() // uses require('assert')\n```\n\n---\n\n## Console Timers\n\n```js\nconsole.log('Ethereum transaction started')\nconsole.time('Ethereum transaction')\nweb3.send(txHash, (error, results)=\u003e{\n  console.timeEnd('Ethereum transaction') \n  // Ethereum transaction: 
4545.921ms\n})\n```\n\n---\n\n\n## REPL Tricks (which can be used for quick testing and debugging)\n\n* Core modules are there already\n* You can load any module with `require()` (must be installed with proper path)\n* You can see all your sessions' histories in `~/.node_repl_history`, i.e., `cat ~/.node_repl_history` or `tail ~/.node_repl_history`\n\n---\n\n## REPL Commands\n\n* `.break`: When in the process of inputting a multi-line expression, entering the .break command (or pressing the \u003cctrl\u003e-C key combination) will abort further input or processing of that expression.\n* `.clear`: Resets the REPL context to an empty object and clears any multi-line expression currently being input.\n* `.exit`: Close the I/O stream, causing the REPL to exit.\n* `.help`: Show this list of special commands.\n* `.save`: Save the current REPL session to a file: \u003e .save ./file/to/save.js\n* `.load`: Load a file into the current REPL session. \u003e .load ./file/to/load.js\n\n---\n\n## Editing in REPL\n\n`.editor` - Enter editor mode (\u003cctrl\u003e-D to finish, \u003cctrl\u003e-C to cancel)\n\n```\n\u003e .editor\n// Entering editor mode (^D to finish, ^C to cancel)\nfunction welcome(name) {\n  return `Hello ${name}!`;\n}\n\nwelcome('Node.js User');\n\n// ^D\n'Hello Node.js User!'\n\u003e\n```\n\n---\n\n\n## Real Debuggers\n\n* CLI\n* DevTools\n* VS Code\n\n\n---\n\n## Node CLI Debugger\n\n```\n$ node inspect debug-me.js\n\u003c Debugger listening on ws://127.0.0.1:9229/80e7a814-7cd3-49fb-921a-2e02228cd5ba\n\u003c For help see https://nodejs.org/en/docs/inspector\n\u003c Debugger attached.\nBreak on start in myscript.js:1\n\u003e 1 (function (exports, require, module, __filename, __dirname) { global.x = 5;\n  2 setTimeout(() =\u003e {\n  3   console.log('world');\ndebug\u003e\n```\n\n---\n\n\n## Node CLI Debugger (Cont)\n\n```\nStepping#\ncont, c - Continue execution\nnext, n - Step next\nstep, s - Step in\nout, o - Step out\npause - Pause running code (like pause 
button in Developer Tools)\n```\n\n---\n\n## Node V8 Inspector\n\n```\n$ node --inspect index.js\nDebugger listening on 127.0.0.1:9229.\nTo start debugging, open the following URL in Chrome:\n    chrome-devtools://devtools/bundled/inspector.html?experiments=true\u0026v8only=true\u0026ws=127.0.0.1:9229/dc9010dd-f8b8-4ac5-a510-c1a114ec7d29\n```\n\nBetter to break right away:\n\n```\nnode --inspect-brk debug-me.js\n```\n\nOld (deprecated):\n\n```\nnode --inspect --debug-brk index.js\n```\n\n---\n\n\n## Node V8 Inspector Demo\n\n---\n\n## VS Code Demo\n\n---\n\n\n## CPU profiling\n\n---\n\n## Networking Debugging with DevTools\n\n---\n\n## V8 Memory Scheme\n\nResident Set:\n\n* Code: Node/JS code\n* Stack: Primitives, local variables, pointers to objects in the heap and control flow\n* Heap: Referenced types such as Objects, strings, closures\n\n---\n\n```\nprocess.memoryUsage()\n```\n\n```\n{ rss: 12476416,\n  heapTotal: 7708672,\n  heapUsed: 5327904,\n  external: 8639 }\n```\n\n---\n\n## Heap\n\n* New Space\u0026Young Generation: New allocations, size 1-8Mb, fast collection (Scavenge), ~20% goes into Old Space\n* Old Space\u0026Old Generation: Allocation is fast but collection is expensive (Mark-Sweep)\n\n\n---\n\n## Garbage Collection\n\nThe mechanism that allocates and frees heap memory is called garbage collection.\n\n---\n\n## Garbage Collection (Cont)\n\n* Automatic in Node, thanks to V8\n* Stops the world - expensive\n* Objects with refs are not collected (memory leaks)\n\n---\n\n## Memory Leak\n\n---\n\n![fit](images/memory-leak.png)\n\n---\n\n## Leaky Server\n\n```js\nconst express = require('express')\n\nconst app = express()\n\nlet cryptoWallet = {}\nconst generateAddress = () =\u003e {\n  const initialCryptoWallet = cryptoWallet\n  const tempCryptoWallet = () =\u003e {\n    if (initialCryptoWallet) console.log('We received initial cryptoWallet')\n  }\n  cryptoWallet = {\n    key: new Array(1e7).join('.'),\n    address: () =\u003e {\n      // ref to 
tempCryptoWallet ???\n      console.log('address returned')\n    }\n  }\n}\n\napp.get('*', (req, res) =\u003e {\n  generateAddress()\n  console.log( process.memoryUsage())\n  return res.json({msg: 'ok'})\n})\napp.listen(3000)\n```\n\n---\n\n## Starting the LEAK\n\n```\nloadtest -c 100 --rps 100 http://localhost:3000\nnode leaky-server/server.js\n```\n\n\n```\n{ rss: 1395490816,\n  heapTotal: 1469087744,\n  heapUsed: 1448368200,\n  external: 16416 }\n{ rss: 1405501440,\n  heapTotal: 1479098368,\n  heapUsed: 1458377224,\n  external: 16416 }\n{ rss: 1335377920,\n  heapTotal: 1409097728,\n  heapUsed: 1388386720,\n  external: 16416 }\n```\n\n---  \n\n## GCs\n\n```\n\u003c--- Last few GCs ---\u003e\n\n[35417:0x103000c00]    36302 ms: Mark-sweep 1324.1 (1345.3) -\u003e 1324.1 (1345.3) MB, 22.8 / 0.0 ms  allocation failure GC in old space requested\n[35417:0x103000c00]    36328 ms: Mark-sweep 1324.1 (1345.3) -\u003e 1324.1 (1330.3) MB, 26.4 / 0.0 ms  last resort GC in old space requested\n[35417:0x103000c00]    36349 ms: Mark-sweep 1324.1 (1330.3) -\u003e 1324.1 (1330.3) MB, 20.9 / 0.0 ms  last resort GC in old space requested\n```\n\n---\n\n## Line 12\n\n```\n==== JS stack trace =========================================\n\nSecurity context: 0x3c69fae25ee1 \u003cJSObject\u003e\n    2: generateAddress [/Users/azat/Documents/Code/node-advanced/code/leaky-server/server.js:12] [bytecode=0x3c69df959db9 offset=42](this=0x3c69a7f0c0b9 \u003cJSGlobal Object\u003e)\n    4: /* anonymous */ [/Users/azat/Documents/Code/node-advanced/code/leaky-server/server.js:20] [bytecode=0x3c69df959991 offset=7](this=0x3c69a7f0c0b9 \u003cJSGlobal Object\u003e,req=0x3c69389c07c1 \u003cIncomingMessage map = 0x3c693e7300f1...\n\n\n    FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory\n```\n\n---\n\n## Memory Leak Mitigation\n\n* Reproduce the error/leak\n* Check for variables and fn arguments, pure fns are better\n* Take heap dumps and compare (with debug and DevTools 
or heapdump modules)\n* Update Node\n* Get rid of extra npm modules\n* Trial and error: remove code you think is leaky\n* Modularize\u0026refactor \n\n---\n\nUseful Libraries\n\n* \u003chttps://www.npmjs.com/package/memwatch-next\u003e\n* \u003chttps://www.npmjs.com/package/systeminformation\u003e \n* \u003chttps://github.com/bnoordhuis/node-heapdump\u003e\n\n---\n\n## Heap Dumping \n\n`code/leaky-server/server-heapdump.js`:\n\n```js\n// ...\nconst heapdump = require('heapdump')\nsetInterval(function () {\n  heapdump.writeSnapshot()\n}, 2 * 1000)\n// ...\n```\n\nCreates files in the current folder:\n\n```\nheapdump-205347232.998971.heapsnapshot\nheapdump-205508465.289834.heapsnapshot\nheapdump-205513413.472744.heapsnapshot\n```\n\n---\n\n![fit](images/heap-object-count.png)\n\n---\n\n![fit](images/heap-retained-size.png)\n\n---\n\n![fit](images/heap-shallow-size.png)\n\n---\n\n# Module 6: Scaling\n\n---\n\n## Why You Need to Scale\n\n* Performance (e.g., under 100ms response time)\n* Availability (e.g., 99.999%)\n* Fault tolerance (e.g., zero downtime)\n\n^Zero downtime\n^ Offload the workload: when a Node server is a single process, it can be easily blocked\n^https://blog.interfaceware.com/disaster-recovery-vs-high-availability-vs-fault-tolerance-what-are-the-differences/\n\n---\n\n## Scaling Strategies\n\n* Forking (just buy more EC2s) - what we will do\n* Decomposing (e.g., microservices just for bottlenecks) - in another course\n* Sharding (e.g., eu.docusign.com and na2.docusign.net) - not recommended\n\n---\n\n## Offload the Workload\n\n* `spawn()` - events, stream, messages, no size limit, no shell\n* `fork()` - Node processes, exchange messages\n* `exec()` - callback, buffers the output (`maxBuffer` limit), creates a shell\n* `execFile()` - exec file, no shell\n\n---\n\n## Sync Processes (Dumb)\n\n* `spawnSync()`\n* `execFileSync()`\n* `execSync()`\n\n(There is no `forkSync()` - forked Node processes communicate over async IPC)\n\n---\n\n## Executing bash and Spawn params\n\n```js\nconst {spawn} = require('child_process')\nspawn('cd 
$HOME/Downloads \u0026\u0026 find . -type f | wc -l', \n  {stdio: 'inherit', \n  shell: true, \n  cwd: '/', \n  env: {...process.env, PASSWORD: 'dolphins'} // spread keeps PATH and HOME so the command works\n})\n```\n\n---\n\n## Good Examples of Offloading the Workload\n\n* Hashing\n* Encryption\n* Requests\n* Encoding\n* Archiving/Compression\n* Computation\n\n---\n\n## Let's use Node to launch a Python script to hash a long string with SHA-512 and get the result back into Node.\n\n---\n\n## Executing Python with `exec()`\n\n`code/exec-hash.js`:\n\n```js\nconst {exec} = require('child_process')\nconsole.time('hashing')\nconst str = 'React Quickly: Painless web apps with React, JSX, Redux, and GraphQL'.repeat(100)\nexec(`STR=\"${str}\" python ${__dirname}/py-hash.py`, (error, stdout, stderr) =\u003e {\n  if (error) return console.error(error)\n  console.timeEnd('hashing')\n  console.log(stdout)\n})\n\nconsole.log('could be doing something else')\n```\n\n---\n\n## Python SHA-512 Hashing\n\n`code/py-hash.py`:\n\n```py\nimport hashlib\nimport os\n\ntext = os.environ['STR']  # renamed to avoid shadowing the built-in str\nhash_object = hashlib.sha512(text.encode())\nhex_dig = hash_object.hexdigest()\nprint(hex_dig)\n```\n\n---\n\n## Let's launch a Ruby script from Node to encrypt a string with AES into a file, without waiting for the result.\n\n---\n\n## Node Sends a Long String for Encryption to Ruby\n\n```js\nconst {spawn} = require('child_process')\nconst str = 'React Quickly: Painless web apps with React, JSX, Redux, and GraphQL'.repeat(100)\nconsole.time('launch encryption')\n\nconst rubyEncrypt = spawn('ruby', ['encrypt.rb'], {\n  env: {...process.env, STR: str}, // spread keeps PATH so ruby can be found\n  detached: true,\n  stdio: 'ignore'\n})\nrubyEncrypt.unref() // Do not wait because the results will be in the file.\n\nconsole.timeEnd('launch encryption')\n```\n\n---\n\n## Ruby Script is Encrypting with AES-256\n\n```rb\nrequire 'openssl'\ncipher = OpenSSL::Cipher.new('aes-256-cbc')\ncipher.encrypt # We are encrypting\nkey = cipher.random_key\niv = cipher.random_iv\n\nencrypted_string = cipher.update ENV[\"STR\"]\nencrypted_string \u003c\u003c 
cipher.final\nFile.write('ruby-encrypted.txt', encrypted_string)\n```\n\n---\n\n## Quick Summary About Spawn\n\n* Use params to pass data around\n* Offload work to other processes even when they are in other languages\n* Compare timing\n\n\n---\n\n\u003e Scaling by forking will require the core `os` module.\n\n---\n\n## `os` Module\n\n---\n\n## Things You Can Do with `os`\n\n```js\nconst os = require('os')\nconsole.log(os.freemem())\nconsole.log(os.type())\nconsole.log(os.release())\nconsole.log(os.cpus())\nconsole.log(os.uptime())\nconsole.log(os.networkInterfaces())\n```\n\n---\n\n## Network Interface Results\n\n```\n{ lo0:\n   [ { address: '127.0.0.1',\n       netmask: '255.0.0.0',\n       family: 'IPv4',\n       mac: '00:00:00:00:00:00',\n       internal: true },\n  ...\n  en0:\n   [ { address: '10.0.1.4',\n       netmask: '255.255.255.0',\n       family: 'IPv4',\n       mac: '78:4f:43:96:c6:f1',\n       internal: false } ],\n  ...\n```\n\nmacOS terminal command to get the same IP:\n\n```\nifconfig | grep \"inet \" | grep -v 127.0.0.1\n```\n\n---\n\n## CPU Usage in %\n\n`code/os-cpu.js`:\n\n```js\nconst os = require('os')\nlet cpus = os.cpus()\n\ncpus.forEach((cpu, i) =\u003e {\n  console.log('CPU %s:', i)\n  let total = 0\n  for (let type in cpu.times) {\n    total += cpu.times[type]\n  }\n  for (let type in cpu.times) {\n    console.log(`\\t ${type} ${Math.round(100 * cpu.times[type] / total)}%`)\n  }\n})\n```\n\n---\n\n## The Core `cluster` Module\n\n* Master process\n* Worker processes: each has its own PID, event loop and memory space\n* Load balancing - round robin (the default), or the master process creates the listen socket and sends it to interested workers. 
The workers then accept incoming connections directly.\n* Use the `child_process.fork()` method and messaging\n\n---\n\n## Load Balancing\n\n`cluster`'s round-robin scheduler uses `shift` and `push` \u003csup\u003e[source](https://github.com/nodejs/node/blob/master/lib/internal/cluster/round_robin_handle.js#L84)\u003c/sup\u003e\n\n```js\nRoundRobinHandle.prototype.distribute = function(err, handle) {\n  this.handles.push(handle);\n  const worker = this.free.shift();\n\n  if (worker)\n    this.handoff(worker);\n};\n```\n\n---\n\n## Load/Stress Testing Tools\n\nNode loadtest:\n\n```\nnpm i loadtest -g\nloadtest -c 10 --rps 100 10.0.1.4:3000\n```\n\nor Apache ab:\n\n```\nab -c 200 -t 10 http://localhost:3000\n```\n\n---\n\n## With Clusters\n\nAvoid in-memory caching (each worker has its own memory) and sticky sessions. Use an external state store.\n\n---\n\n## Cluster Messaging\n\n\nMaster:\n\n```js\ncluster.workers\nworker.send(data)\n```\n\nWorker:\n\n```js\nprocess.on('message', data =\u003e {})\n```\n\n---\n\n## Optimizing a Slow Password Salting+Hashing Server\n\n---\n\n## A sync function doing a very CPU-intensive task\n\n`offload/server-v1.js`:\n\n```js\n// ...\nconst bcrypt = require('bcrypt')\n\nconst hashPassword = (password, cb) =\u003e {\n  const hash = bcrypt.hashSync(password, 16) // bcrypt has async but we are using sync here for the example\n  cb(hash)\n}\n// ...\napp.post('/signup', (req, res) =\u003e {\n  hashPassword(req.body.password.toString(), (hash) =\u003e { // callback but sync\n    // Store hash in your password DB.\n    res.send('your credentials are stored securely')\n  })\n})\napp.listen(3000)\n```\n\n\n---\n\n## Benchmarking The Password Salting+Hashing Server\n\nTerminal:\n\n```\nnode server-v1.js\n```\n\nAnother terminal (not the first terminal):\n\n```\ncurl localhost:3000/signup -d '{\"password\":123}' -H \"Content-Type: application/json\" -X POST\n```\n\nThird terminal window/tab:\n\n```\ncurl localhost:3000\n```\n\nResult: 2nd request (3rd 
terminal) will wait for the 1st request (2nd terminal)\n\n---\n\n## Optimizing The Password Salting+Hashing Server\n\nServer with forked hashing `code/offload/worker-v2.js`:\n\n```js\nconst bcrypt = require('bcrypt')\n\nprocess.on('message', (password) =\u003e {\n  const hash = bcrypt.hashSync(password, 16)\n  process.send(hash)\n})\n```\n\n---\n\n## Optimizing Server (Cont)\n\nOptimized server `code/offload/server-v2.js`:\n\n```js\nconst express = require('express')\nconst bodyParser = require('body-parser')\nconst {fork} = require('child_process')\nconst app = express()\n\nconst hashPassword = (password, cb) =\u003e {\n  const hashWorker = fork(`${__dirname}/worker-v2.js`)\n  hashWorker.send(password)\n  hashWorker.on('message', hash =\u003e {\n    cb(hash)\n  })\n}\napp.use(bodyParser.json())\napp.get('/', (req, res) =\u003e {\n  res.send('welcome to strong password site')\n})\n\napp.post('/signup', (req, res) =\u003e {\n  hashPassword(req.body.password.toString(), (hash) =\u003e { // async now: hashing runs in the forked worker\n    // Store hash in your password DB.\n    res.send('your credentials are stored securely')\n  })\n})\n\napp.listen(3000)\n```\n\n---\n\n## Testing Server v2 (Forked Process)\n\nTerminal:\n\n```\nnode server-v2.js\n```\n\nAnother terminal (not the first terminal):\n\n```\ncurl localhost:3000/signup -d '{\"password\":123}' -H \"Content-Type: application/json\" -X POST\n```\n\nThird terminal window/tab:\n\n```\ncurl localhost:3000\n```\n\nResult: 2nd request (3rd terminal) will NOT wait for the 1st request (2nd terminal)\n\n---\n\n## We can fork the v1 server without splitting the hashing+salting function into a worker\n\n---\n\n\n## Server With a Forked Cluster\n\n`code/offload/server-v3.js`:\n\n```js\nconst express = require('express')\nconst app = express()\nconst path = require('path')\nconst bodyParser = require('body-parser')\nconst bcrypt = require('bcrypt')\n\nconst cluster = require('cluster')\n\nif (cluster.isMaster) {\n  const os = require('os')\n  os.cpus().forEach(() =\u003e {\n    const worker = cluster.fork()\n    console.log(`Started worker ${worker.process.pid}`)\n  })\n  return true\n} \n```\n\n\n---\n\n## Server With 
a Forked Cluster (Cont)\n\n`code/offload/server-v3.js`:\n\n```js\n// cluster.isWorker === true\nconst hashPassword = (password, cb) =\u003e {  \n  const hash = bcrypt.hashSync(password, 16) // bcrypt has async but we are using sync here for the example\n  cb(hash)\n}\n\napp.use(bodyParser.json())\napp.get('/', (req, res) =\u003e {\n  res.send('welcome to strong password site')\n})\n\napp.post('/signup', (req, res) =\u003e {\n  hashPassword(req.body.password.toString(), (hash) =\u003e { // callback but sync\n    // Store hash in your password DB.\n    res.send('your credentials are stored securely')\n  })\n})\n\napp.listen(3000)\n```\n\n---\n\n## Testing Server v3 (Forked Server)\n\nTerminal:\n\n```\nnode server-v3.js\n```\n\nAnother terminal (not the first terminal):\n\n```\ncurl localhost:3000/signup -d '{\"password\":123}' -H \"Content-Type: application/json\" -X POST\n```\n\nThird terminal window/tab:\n\n```\ncurl localhost:3000\n```\n\nResult: 2nd request (3rd terminal) will NOT wait for the 1st request (2nd terminal)\n\n---\n\n\u003e Node.js does not automatically manage the number of workers, however. It is the application's responsibility to manage the worker pool based on its own needs.\n\n---\n\n## No Fault Tolerance in Server v3\n\n```\nnode server-v3.js\n```\n\n```\nps aux | grep 'node'\nkill 12668\n```\n\n(use one of the worker PIDs from the `ps` output in place of 12668)\n\n---\n\n## Implementing Fault Tolerance in Server v4\n\nin `isMaster` in `code/offload/server-v4.js`:\n\n```js\n  cluster.on('exit', (worker, code, signal) =\u003e {\n    if (signal) {\n      console.log(`worker was killed by signal: ${signal}`);\n    } else if (code !== 0) { // \u0026\u0026 !worker.exitedAfterDisconnect\n      console.log(`worker exited with error code: ${code}`);\n    } else {\n      console.log('worker success!');\n    }\n    const newWorker = cluster.fork()\n    console.log(`Worker ${worker.process.pid} exited. 
Thus, starting a new worker ${newWorker.process.pid}`)\n  })\n```\n\n---\n\n## Fault Tolerance in Server v4\n\n```\nnode server-v4.js\n```\n\n```\nps aux | grep 'node'\nkill 12668\n```\n\n---\n\n\u003e `cluster` is good but `pm2` is better\n\n---\n\n## pm2 Basics\n\n```\nnpm i -g pm2\npm2 start app.js              # Start, daemonize and auto-restart the application\npm2 start app.js --watch\npm2 start app.js --name=\"bitcoin-exchange-api\"\npm2 reset bitcoin-exchange-api\npm2 stop all\npm2 stop bitcoin-exchange-api\n```\n\n---\n\n## pm2 Advanced\n\n```\npm2 startup\npm2 save\npm2 unstartup\npm2 start app.js -i 4         # Start 4 instances of application in cluster mode\n                              # it will load balance network queries to each app\npm2 start app.js -i max       # Auto-detect the number of CPUs and start that many instances\npm2 reload all                # Zero-downtime reload\npm2 scale [app-name] 10       # Scale clustered app to 10 processes\n```\n\n---\n\n## pm2 More\n\n```\npm2-dev\npm2-docker\n```\n\n---\n\n# Outro\n\n---\n\n## Summary\n\n* Debugging\n* Console, Node REPL and npm tricks\n* Forking and spawning\n* Creating streams, async/await and naive promises\n* How globals, modules and require() really work\n\n---\n\n# The End!\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fazat-co%2Fnode-advanced","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fazat-co%2Fnode-advanced","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fazat-co%2Fnode-advanced/lists"}