Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/droiddevgeeks/nodelearning
This is a Node.js learning demo. It covers the basics of Node.
crawler database ejs ejs-express mcv middleware-nodes mongodb node node-module nodejs nodemailer npm-package router sign
Last synced: 14 days ago
- Host: GitHub
- URL: https://github.com/droiddevgeeks/nodelearning
- Owner: droiddevgeeks
- Created: 2018-04-18T21:08:12.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2020-09-04T06:37:55.000Z (over 4 years ago)
- Last Synced: 2024-11-13T22:36:05.442Z (2 months ago)
- Topics: crawler, database, ejs, ejs-express, mcv, middleware-nodes, mongodb, node, node-module, nodejs, nodemailer, npm-package, router, sign
- Language: HTML
- Homepage:
- Size: 295 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 2
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# NodeLearning
This project covers the basic topics of the Node.js framework:
- ExpressJs
- Routes
- Middleware
- Call chaining
- Error handling
- EJS HTML page rendering
- MongoDB connection and CRUD operations
- NodeMailer (a brief sketch follows this list)
- Web crawler
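NodeMailer is listed above but not demonstrated later in this README, so here is a minimal, hypothetical sketch of the usage pattern. The SMTP host, port, and credentials are placeholders, not the project's actual configuration.

```js
// Hypothetical NodeMailer sketch; the SMTP settings and addresses below
// are placeholders, not this project's real configuration.
const nodemailer = require('nodemailer');

const transporter = nodemailer.createTransport({
  host: 'smtp.example.com', // placeholder SMTP server
  port: 587,
  auth: { user: 'user@example.com', pass: 'app-password' },
});

transporter.sendMail(
  {
    from: 'user@example.com',
    to: 'recipient@example.com',
    subject: 'Hello from NodeLearning',
    text: 'This mail was sent with NodeMailer.',
  },
  (err, info) => {
    if (err) return console.error(err);
    console.log('Message sent:', info.messageId);
  }
);
```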
############################### Start ###########################################
How to start the project:
Install the latest version of Node:
sudo apt-get install nodejs-legacy
Install the latest version of the Node package manager (npm):
sudo apt-get install npm
Clone the repo: https://github.com/droiddevgeeks/NodeLearning.git
1. Run npm install to install all required dependencies.
2. Run node app.js. This will start the server on localhost:5000/.
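For orientation, here is a minimal sketch of what an app.js that starts an Express server on port 5000 could look like. The middleware and route shown are illustrative; the real app.js also wires up the routers, view engine, and MongoDB connection described in the sections below.

```js
// Minimal sketch of an Express app.js listening on port 5000, as described above.
// The route and middleware are illustrative, not the project's actual code.
const express = require('express');

const app = express();

// Body-parsing middleware; each middleware passes control onward via next().
app.use(express.json());

// Example route.
app.get('/', (req, res) => {
  res.send('NodeLearning demo is running');
});

// Error-handling middleware (the four-argument signature marks it as an error handler).
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).send('Something went wrong');
});

app.listen(5000, () => {
  console.log('Server started on http://localhost:5000/');
});
```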
##############################################################################
############ Web Crawler ####################################
1. localhost:5000/webcrawl/
This hits a website, parses the HTML, mines HTTP URLs, and stores all the URLs in the database.
2. localhost:5000/weburls GET API: returns all fetched URLs from the database.
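Below is a hypothetical sketch of how these two routes could be implemented. It assumes axios for HTTP requests, cheerio for HTML parsing, and Mongoose for MongoDB storage, with a connection established elsewhere (e.g. in app.js); the project's actual libraries, model names, and target site may differ.

```js
// Hypothetical sketch of the /webcrawl and /weburls routes.
// axios, cheerio, the Mongoose model, and the target URL are assumptions.
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');
const mongoose = require('mongoose');

const router = express.Router();
const WebUrl = mongoose.model('WebUrl', new mongoose.Schema({ url: String }));

// GET /webcrawl : fetch a page, extract http(s) links, save them to MongoDB.
router.get('/webcrawl', async (req, res, next) => {
  try {
    const { data } = await axios.get('https://example.com'); // placeholder target site
    const $ = cheerio.load(data);
    const urls = $('a[href^="http"]')
      .map((i, el) => $(el).attr('href'))
      .get();
    await WebUrl.insertMany(urls.map((url) => ({ url })));
    res.json({ saved: urls.length });
  } catch (err) {
    next(err);
  }
});

// GET /weburls : return every URL stored by the crawler.
router.get('/weburls', async (req, res, next) => {
  try {
    res.json(await WebUrl.find({}));
  } catch (err) {
    next(err);
  }
});

module.exports = router;
```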
Another example of web crawling:
1. localhost:5000/rssfeed/
This hits a website, parses the HTML, mines HTTP URLs, and stores all the URLs in the database.
2. localhost:5000/posts GET API: returns all fetched URLs from the database.
###########################################################
######################### Sign In and Sign Up ##################################
1. localhost:5000/user/alluser/:count GET
2. localhost:5000/user/signIn POST API
3. localhost:5000/user/myInfo POST API
4. localhost:5000/user/signUp POST API
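A hypothetical sketch of how these user routes could be wired with an Express router and a Mongoose model follows. The schema, password handling, and responses are assumptions, not the project's actual implementation; it also assumes JSON body parsing is enabled in app.js.

```js
// Hypothetical sketch of the /user routes; the Mongoose schema and the
// plain-text password check are assumptions for illustration only.
const express = require('express');
const mongoose = require('mongoose');

const router = express.Router();
const User = mongoose.model('User', new mongoose.Schema({
  name: String,
  email: { type: String, unique: true },
  password: String, // a real app should store a hash, not plain text
}));

// GET /user/alluser/:count : return the first :count users.
router.get('/alluser/:count', async (req, res) => {
  res.json(await User.find({}).limit(Number(req.params.count)));
});

// POST /user/signUp : create a new user from the request body.
router.post('/signUp', async (req, res) => {
  const user = await User.create(req.body);
  res.status(201).json(user);
});

// POST /user/signIn : check credentials.
router.post('/signIn', async (req, res) => {
  const user = await User.findOne({ email: req.body.email, password: req.body.password });
  if (!user) return res.status(401).json({ error: 'Invalid credentials' });
  res.json(user);
});

module.exports = router;
```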
##############################################################################
############################### HTML Rendering ################################
EJS is used as the view engine:
1. localhost:5000/policy
2. localhost:5000/rules
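A minimal sketch of how the EJS view engine and these two pages might be set up in Express; the view file names and views folder are assumptions based on the routes above.

```js
// Sketch of EJS rendering in Express; policy.ejs and rules.ejs under ./views
// are assumed file names, and the ejs package must be installed.
const express = require('express');
const path = require('path');

const app = express();

app.set('view engine', 'ejs');
app.set('views', path.join(__dirname, 'views'));

// GET /policy and GET /rules render EJS templates to HTML.
app.get('/policy', (req, res) => {
  res.render('policy', { title: 'Policy' });
});

app.get('/rules', (req, res) => {
  res.render('rules', { title: 'Rules' });
});

app.listen(5000);
```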
###############################################################################