Node JS (John Smilga - Udemy)
Node, Express, MongoDB, Mongoose and PROJECTS! PROJECTS! PROJECTS!
What is Node JS
Environment to run JS outside the browser.
Globals
There's no window object in Node, unlike vanilla JS in the browser. However, there are global variables that can be accessed from anywhere in the code. Here are some of them (there are more; this isn't an exhaustive list):
// Globals
__dirname  - path to current directory // /Users/sandeepamarnath/Desktop/node_tutorial  
__filename - file name
require    - function to use modules
module     - info about current module
process    - info about env where our code is executed
Modules
Node uses the CommonJS module pattern. Every JS file in Node is a module. One module can be imported by another, and we can choose what we would like to export, i.e. allow other modules to import.
Note: when you use your own modules, require them with require('./mymodule'). Notice that we start with ./ even though the file is in the same folder. No ./ is needed if it's a built-in module.
// Code without modules
//// app.js
const secret = "SECRET"
const peter = "Peter"
const john = "John"
const sayHi = (name) => {
    console.log(`Hi ${name}`)
}
sayHi("Susan")
sayHi(john)
// ------------------------------------
// With modules
//// names.js (we need to export what we want so that other modules can use them)
// dont share this
const secret = "SECRET"
//share them
const john = "John"
const peter = "Peter"
// exports is an object in module
module.exports = { john, peter } // This is ES6 syntax, same as writing {john:john, peter:peter}
// NOTE: If we export only one, then we can say module.exports = john, and now this will just be a string and not an object
//// app.js
const names = require('./names') // see we are using ./ even though it is in same folder as explained above
console.log(names) // {john:john, peter:peter}
//// utils.js
const sayHi = (name) => {
    console.log(`Hi ${name}`)
}
module.exports = sayHi // same as 'export default sayHi' in ES6
//// app.js 
const names = require('./names')
const sayHi = require('./utils') // sayHi function
sayHi(names.john) // Hi John
// Other ways of exporting
module.exports.items = ['item1', 'item2']; 
const person = {
    name: 'San'
}
module.exports.singlePerson = person
// Other way of importing (destructuring)
const { items, singlePerson } = require('./multipleexpprt')
The tricky part of the modules
// Let's say we export or don't export (that doesn't matter), but IF WE HAVE 
// A FUNCTION CALL IN A MODULE, the module requiring it will call that function
// test.js (not exporting anything, same result even if we export something) 
 
const num1 = 5;
const num2 = 10;
function addValues() {
    console.log(`The sum is ${num1 + num2}`)
}
addValues()  // LOOK WE ARE CALLING THE FUNCTION
// app.js 
require('./test') // THIS WILL RUN THE addValues() 
// in test.js, even if we don't export anything from test.js
Built-in modules
Some popular built-in modules are
os
http
path
fs (file system) - comes in both sync and async flavours
OS module
Provides useful methods and props to interact with OS and the server
// Basic os module related code
const os = require('os') // no ./, just 'os' as it is a built in module
// info about current user
console.log(os.userInfo())
// system uptime in seconds (a plain count of seconds, not a minutes:seconds format)
console.log(os.uptime())
const currentOs = {
    name: os.type(),
    release: os.release(),
    totalMem: os.totalmem(),
    freeMem: os.freemem()
}
console.log(currentOs)
Path module
Provides useful methods such as getting relative path, absolute path and so on
// Basic path module related code
const path = require('path')
console.log(path.sep) // gives / or \ depending on your OS. It's path separator
const filepath = path.join("/", 'content', "test.txt")
console.log(filepath) // gives /content/test.txt
console.log(path.basename(filepath)) // test.txt
// get absolute path (full path)
const absolute = path.resolve(__dirname, "content", "test.txt")
console.log(absolute)
FS module (file system)
const { readFileSync, writeFileSync } = require('fs')
// to read file, we need to provide two things
// * file path 
// * encoding (like utf-8)
const first = readFileSync('./content/first.txt', 'utf8')
const second = readFileSync('./content/second.txt', 'utf8')
// reads the content of the file in sync way
console.log(first) // Hello this is first text file
console.log(second) // Hello this is second file
writeFileSync('./content/result-sync.txt',
    `Here's the result : ${first} ${second}`,
    { flag: 'a' }) // flag 'a' means append. It will not overwrite the data, instead it accumulates
const { readFileSync, readFile } = require('fs') // readFile is Async
//SYNC VERSION
const first = readFileSync('./content/first.txt', 'utf8')
console.log(first)
//ASYNC VERSION
readFile('./content/first.txt', 'utf8', function (err, result) {
    if (err) {
        console.log(err)
        return
    }
    console.log(result)
})
// The above is same as 
//ASYNC VERSION
// define cb before passing it (const declarations are not hoisted)
const cb = function (err, result) {
    if (err) {
        console.log(err)
        return
    }
    console.log(result)
}
readFile('./content/first.txt', 'utf8', cb)
Async file write
//ASYNC VERSION
const { readFile, writeFile } = require('fs') // writeFile is needed below
readFile('./content/first.txt', 'utf8', function (err, result) {
    if (err) {
        console.log(err)
        return
    }
    const first = result
    console.log("Now getting second")
    readFile('./content/second.txt', 'utf8', (err, result) => {
        if (err)
            return console.log(err)
        const second = result
        console.log("Writing the result into a file")
        writeFile('./content/result-async.txt', second, () => { }) // not capturing result and error in callback
        
    })
})
// THIS IS CALLBACK HELL
// result from callback will be undefined even if we capture in the callback just like this
 writeFile('./content/result-async.txt', second, (err, res) => {
            if (err) return console.log(err)
            console.log(res)
        })
        
        
// appending by giving flag 'a'
 writeFile('./content/result-async.txt', second, { flag: 'a' }, (err, res) => {
            if (err) return console.log(err)
            console.log(res)
        })
Sync vs Async file reads and writes
Sync file read/write
const first = readFileSync('./content/first.txt', 'utf8')
const second = readFileSync('./content/second.txt', 'utf8')
// reads the content of the file in sync way
console.log("Started");
writeFileSync('./content/result-sync.txt',
    `Here's the result : ${first} ${second}`,
    { flag: 'a' }) // flag 'a' means append. It will not overwrite the data, instead will accumulate it
    
    
console.log('done with this task')
console.log('starting with the next task')
// OUTPUT
// Started
// done with this task
// starting with the next task
// If the file being written is big, it takes a lot of time and blocks the next lines of code,
// so no other user can do anything with the app while one user is writing a big file
Async file read/write
const { readFileSync, readFile, writeFile } = require('fs') // readFile is Async
//SYNC VERSION
const first = readFileSync('./content/first.txt', 'utf8')
console.log("Task 1 started")
//ASYNC VERSION
readFile('./content/first.txt', 'utf8', function (err, result) {
    if (err) {
        console.log(err)
        return
    }
    const first = result
    readFile('./content/second.txt', 'utf8', (err, result) => {
        if (err)
            return console.log(err)
        const second = result
        writeFile('./content/result-async.txt', second, { flag: 'a' }, (err, res) => {
            if (err) return console.log(err)
            console.log("End of Task 1")  // Appears at the end in console
        })
    })
})
console.log("Task 2 started")
// OUTPUT
//Task 1 started
//Task 2 started
//End of Task 1
// You see, the file-writing task isn't blocking
HTTP module
The HTTP module is all about server-side coding in Node. We can use Node for the back-end (server-side) because of the HTTP module. HTTP does many things; a few are listed here.
Using HTTP,
We can create a server
That server can take request and send back the response
That server can define a port on which the client can send request
// setting up simple server
const http = require('http');
// server takes a callback function that exposes request and response.
// When we (as a client) send any request to port 3000, the server responds
// with res.write(). We can see this response in the browser when we visit localhost:3000
const server = http.createServer((req, res) => {
    res.write("Hi sandeep, this is my server sdfa")
    res.end() // end of response. If this is not included, then browser will not stop querying the server and never get a response from server
})
server.listen(3000) // server is listening on port 3000. This can be any port
// There are lot of other methods and variations but this is the basic setup of a server
// Another example
const http = require('http')
const server = http.createServer((req, res) => {
    console.log("Request received... Here's your response")
    res.end("My response")
})
server.listen(5000, () => {
    console.log("I'm the server who is listening to the requests on port 5000")
})
// server.listen can take a callback. This will be executed soon after the 
// server has been set up, even before a client sends a request to the server
The server that can handle basic requests
const http = require('http');
// simple server
// const server = http.createServer((req, res) => {
//     res.write("Hi sandeep, this is my server sdfa")
//     res.end()
// })
const server = http.createServer((req, res) => {
    if (req.url === "/") {
        // res.write("Hi sandeep, this is my server sdfa")
        res.end("Hi sandeep, this is my server sdfa")
        return
    }
    if (req.url === "/a") {
        // res.write("Hi sandeep, abput page")
        res.end("Hi sandeep, abput page")
        return
    }
    res.end("This is no good")
})
server.listen(3000)
NPM (Node package manager)
When we make a change to a file, we need to run the node app.js command again to rerun the code. To avoid that, install nodemon to watch for changes made to the file.
// package.json
"scripts": {
    "dev": "nodemon app.js"
  },
  
 // In command line
 npm run dev
 
 // You can also do 
 "scripts": {
    "start": "nodemon app.js"
  },
  
  // In commandline
  
  npm start // npm run start also works, but for start you can omit run. That's not possible for other scripts; for example 'npm dev' doesn't work - you must type 'npm run dev'.
  
  
  // I cannot do this in cmd line
  nodemon app.js // gives error because I have installed nodemon locally with command - npm i nodemon
  
  // To make it work globally (any folder and command line), use
  npm install nodemon -g // installs globally
  
  // Now you can run this is cmd line
  nodemon app.js // works fine now
NPX (x - execute)
npx officially means Node package execute (a package runner). Why do we need it?
Let's say you need to run app.js. How to do it?
// In cmd line type: node app.js  OR  node app
Now what if you want to keep track of changes? We install nodemon.
// Installing locally as I don't need it globally: npm i nodemon
How to run nodemon? Write a command in the scripts tag and then use it.
"scripts": { "start": "nodemon app.js" } // In cmd line: npm start
Can I run nodemon app.js in the command line instead of putting this in scripts and then calling it like npm start? No you can't. For that you need to install nodemon globally.
// To install nodemon globally: npm i nodemon -g
But what if I don't want to install nodemon globally but still want to run nodemon app.js in the command line? Is it possible? Absolutely, with npx. npx was introduced in npm version 5.2 to solve this issue. It lets you run a package from the command line without having to install it globally.
// To install nodemon locally: npm i nodemon
// To run the package in the command line: npx nodemon app.js  OR  npx nodemon app
util
It provides the utility methods that could be very useful. For example
promisify - wraps a callback-style function so it returns a promise, which lets us avoid callback hell. We'll see this in action later.
Blocking code for all the users / resources
Let's say we have 3 resources
/
/about
/unknown
Say there is a huge for-loop in the /about handler, as shown below, and 10 users are visiting our site. The site runs smoothly for everyone until one of the users visits /about. The moment someone visits /about, the for-loop takes a lot of time to execute, and since Node is single-threaded, every other user visiting any page is stuck waiting; all pages keep buffering for all users until the for-loop in the about handler has completed.
const http = require('http')
const server = http.createServer((req, res) => {
    if (req.url === '/') {
        return res.end("Home page")
        // return
    }
    if (req.url === '/about') {
        for (let i = 0; i < 10000000; i++) { console.log("For loop", i) }
        return res.end("About page")
    }
    res.end("My response")
})
server.listen(5000, () => {
    console.log("I'm the server who is listening to the requests on port 5000")
})
Hence this is a bad design due to the blocking code.
Promisifying callbacks for ease (to avoid cb hell)
What the heck is promisifying? Why do we need Promise all of a sudden?
Let's see the below example we already know.
The first example is simple but the second is complex: the first file read becomes the input to the second file read, and the second file read becomes the input to the file write. This leads to callback hell (one callback inside another, inside another, and so on). We can avoid callback hell by promisifying. The consumption of the data then becomes linear even though one step depends on the other. Let's see how.
//Example 1 :  Simple Async file read code. (Everything handled inside callback function)
fs.readFile("./content/first.txt", 'utf8', (err, data) => {
    if (err) return console.log(err)
    console.log(data)
})
//Example 2 :  A bit complex file read, file read and file write code. (Everything handled inside callback function)
// The dependencies on one another below lead to callback hell
fs.readFile("./content/first.txt", 'utf8', (err, data1) => {
    if (err) {
        return console.log(err)
    }
    console.log(data1)
    const writeData = data1
    // 2nd file read below depends on first file read data
    fs.readFile('./content/second.txt', 'utf8', (err, data2) => {
        if (err) return console.log(err)
        console.log(data2)
        // file write below depends on second file read data
        fs.writeFile("./content/writedata.txt", `My data is ${data1} and ${data2}`, { flag: 'a' }, (err, data) => {
            console.log("The data has been written ")
        })
    })
})
Promisifying the above examples
The idea is,
const asyncFunction = () => { // this function returns a promise
  return new Promise((resolve, reject) => {
    // your async code that you have written above (but slightly modified): instead of returning the
    // error and data or console logging them, you just resolve the data and reject
    // the error
  })
}
// Then console log or do anything with that error and data outside, here
asyncFunction().then(resp => console.log(resp)).catch(err => console.log(err))
// Example 1
const fs = require('fs')
// getText is an async function as it returns a promise. Hence .then() can be called on getText()
const getText = () => {
    return new Promise((resolve, reject) => {
        fs.readFile("./content/first.txt", 'utf8', (err, data) => {
            if (err) reject(err)
            resolve(data)
        })
    })
}
getText().then((res) => console.log(res))
        .catch(err => console.log(err))
//-----------------------------------------------------------
// Example 2 (considering only 2 reads - one depends on the other - and not considering the write)
// Notice that we are writing the callback only for one read file and we
// can use it twice
const getText = (path) => {
    return new Promise((resolve, reject) => {
        fs.readFile(path, 'utf8', (err, data) => {
            if (err) {
                reject(err)
            } else {
                resolve(data)
            }
        })
    })
}
//Method 1 (then-catch, cb hell) -  utilizing the example 2 with promise then-catch (cb hell still forms)
// this still forms callback hell because in LINE 1 we are not returning the getText call so we can chain then-catch outside
getText("./content/first.txt").then(res => console.log(res))
    .catch(err => console.log(`1st link error`))
    .then(getText("../content/second.txt") // LINE 1
        .then(res => console.log(res))
        .catch(err => console.log(`2nd link error`)))
//-----------------------------------------------------------
//Method 2 (then-catch,  no cb hell) -  utilizing the example 2 with promise then-catch (cb hell doesn't form)
// this doesn't form cb hell because in LINE 1 we are returning the getText() call, so we can chain outside
getText("./content/first.txt").then(res => console.log(res))
    .catch(errr => console.log(`1st err`))
    .then(() => getText("./content/second.txt")) // LINE 1
    .then((res) => console.log(res))
    .catch(err => console.log(err))
// METHOD 2 is good but still there is a better approach using async-await shown below in method 3
//-----------------------------------------------------------
// Method 3 (async -await, no cb hell, no then-catch) - Better than then-catch outer chain (method 2)
// But for async-await to work, the function in which async await is used must be async function
// notice this is an async function as we are using async await inside it
 async function method3() {
    try {
        const first = await getText("./content/first.txt");
        const second = await getText("./content/second.txt");
        console.log(first, "From method 3")
        console.log(second, "From method 3")
    } catch (err) {
        console.log(`Error from method 3`)
    }
}
method3() 
We didn't include the write in that example. Meaning, I only considered the second read depending on the first read, and didn't consider the write depending on the second read as in Example 2 above. That's because it gets more involved: we can only resolve or reject something, which is then consumed in a then or catch block. Let's consider it now below.
// RECAP : EXAMPLE 2 FROM ABOVE
fs.readFile("./content/first.txt", 'utf8', (err, data1) => {
    if (err) {
        return console.log(err)
    }
    console.log(data1)
    const writeData = data1
    // 2nd file read below depends on first file read data
    fs.readFile('./content/second.txt', 'utf8', (err, data2) => {
        if (err) return console.log(err)
        console.log(data2)
        // file write below depends on second file read data
        fs.writeFile("./content/writedata.txt", `My data is ${data1} and ${data2}`, { flag: 'a' }, (err, data) => {
            console.log("The data has been written ")
        })
    })
})
Promisifying the above example
// FOR READ
const getText = (path) => {
    return new Promise((resolve, reject) => {
        fs.readFile(path, 'utf8', (err, data) => {
            if (err) {
                reject(err)
            } else {
                resolve(data)
            }
        })
    })
}
// FOR WRITE
const writeText = (path) => {
    return new Promise((resolve, reject) => {
        fs.writeFile(path, "This is normal text written", (err, data) => resolve("Done writing"))
    })
}
Let's see how to use them to escape callback hell using two methods
Method 1 - then-catch
Method 2 - async-await
// Method 1 - then-catch
getText("./content/first.txt").then(res => console.log(res))
    .catch(errr => console.log(`1st err`))
    .then(() => getText("./content/second.txt")) // LINE 1
    .then((res) => {
        console.log(res)
        return writeText("./content/wtext.txt")
    })
    .catch(err => console.log(err))
    .then(res => console.log(res)) // Done Writing
    .catch(err => console.log(err))
//Method 2 - async-await
async function method2() {
    try {
        const first = await getText("./content/first.txt");
        const second = await getText("./content/second.txt");
        console.log(first, "From method 3")
        console.log(second, "From method 3")
        const writefile = await writeText("./content/first.txt")
        console.log(writefile)
    } catch (err) {
        console.log(`Error from method 3`)
    }
}
method2()
// Method 2 is the best as it keeps the code cleaner without then and catch. 
// Its syntax is almost the same as normal sync code
More simplified version
// This is a function that returns a promise (promisify)
// this entire boiler plate code is based on fs.readFile
const getText = (path) => {
    return new Promise((resolve, reject) => {
        fs.readFile(path, 'utf8', (err, data) => {
            if (err) {
                reject(err)
            } else {
                resolve(data)
            }
        })
    })
}
// Built-in promisify
// The above code can be simplified using the util module, which provides promisify functionality that does the same thing
const util = require('util')
const readFilePromise = util.promisify(fs.readFile) // does the same as the code above
const writeFilePromise = util.promisify(fs.writeFile)
// using them
async function consumePromise() {
    try {
        // const first = await getText("./content/first.txt");
        // const second = await getText("./content/second.txt");
        const first = await readFilePromise("./content/first.txt", 'utf8')
        const second = await readFilePromise("./content/second.txt", 'utf8')
        console.log(first, "From consme promises")
        console.log(second, "From consume promises")
        await writeFilePromise("./content/sandeep.txt", "SANDEEP")
    } catch (err) {
        console.log(`Error`)
    }
}
consumePromise()
Can this be more simplified?
Yes, this can be simplified further, like below. We don't even need util; just use require('fs').promises
const { readFile, writeFile } = require('fs').promises
async function mostSimplifiedReadAndWrite() {
    try {
        // METHOD 1 - Complex method using our own promises
        // const first = await getText("./content/first.txt");
        // const second = await getText("./content/second.txt");
        // METHOD 2 - Simple method using util
        // const first = await readFilePromise("./content/first.txt", 'utf8')
        // const second = await readFilePromise("./content/second.txt", 'utf8')
        // METHOD 3 - MOST SIMPLIFIED method using require('fs').promises
        const first = await readFile("./content/first.txt", 'utf8')
        const second = await readFile("./content/second.txt", 'utf8')
        console.log(first, "From method 3")
        console.log(second, "From method 3")
        await writeFile("./content/sandeep.txt", "NODE IS GOOD")
    } catch (err) {
        console.log(`Error from method 3`)
    }
}
mostSimplifiedReadAndWrite()
Event-driven programming
When a certain event happens, the callback related to that event will fire. This is called event-driven programming. Node is event-driven.




Even though you don't always write your own events, events are a core building block of Node. A lot of built-in modules rely on them, so we are using them anyway.
For example,
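Here's a minimal sketch (my own example, not the course's exact one) using the built-in events module: register a listener with on() and fire it with emit().
const EventEmitter = require('events')
const customEmitter = new EventEmitter()
// the callback registered here fires every time the 'response' event is emitted
customEmitter.on('response', (name) => {
    console.log(`data received from ${name}`)
})
customEmitter.emit('response', 'john') // fires the listener above with 'john'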

Streams
Used to write or read sequentially
When we have to deal with a large file, streams come in handy
4 types of streams in Node
writeable - used to write data sequentially
readable - used to read data sequentially
duplex - used to both read and write sequentially
transform - data can be modified while writing or reading
Streams extend the EventEmitter class
Need of streams
Whether we read a file the sync or the async way, we read the whole file at once and place it into a variable. But if the file is too big, we get an error saying the content can't fit into a single string. So the solution is a read stream.


The read stream also has options where we can set how big each chunk should be (highWaterMark) and what encoding the data should be in
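A rough sketch of a read stream (the file path ./content/big.txt is assumed): it emits 'data' events one chunk at a time, and the options object controls the chunk size (highWaterMark) and the encoding.
const { createReadStream } = require('fs')
// without options the chunks arrive as Buffers of up to 64kb each
const stream = createReadStream('./content/big.txt', { highWaterMark: 90000, encoding: 'utf8' })
stream.on('data', (chunk) => {
    console.log(chunk.length) // one piece of the file, not the whole thing
})
stream.on('error', (err) => console.log(err))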


Let's see another example as to why we need streams
The client (browser) is requesting data that lives on the server (in a file called big.txt, which is 2MB). Sending the large file from server to browser in one shot (one chunk) is not a good idea: if the connection drops, the file might not make it. It's better to send large files as streams (chunks).



Let's do the same thing in streams now



Writing in chunks using pipe method on the file stream
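A hedged sketch of serving the big file over HTTP with a stream instead of in one shot (big.txt assumed): pipe() pushes each chunk from the read stream into the response, which is itself a writable stream.
const http = require('http')
const fs = require('fs')
const server = http.createServer((req, res) => {
    // instead of: const text = fs.readFileSync('./content/big.txt', 'utf8'); res.end(text)
    const fileStream = fs.createReadStream('./content/big.txt', 'utf8')
    fileStream.on('open', () => {
        fileStream.pipe(res) // sends the data chunk by chunk and ends the response when done
    })
    fileStream.on('error', (err) => res.end(err.message))
})
server.listen(5000)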



Need for the Express framework
Express is built on the http module. In order to better understand the need for Express, we need to understand a few more things about the http module.
HTTP continued


Currently there are two problems,
We are not setting the headers on the server (like status code and what type of data the server is serving)
If we navigate to any resource, say localhost:3000/anything, we still get the same Hello page
Let's handle the first problem: setting headers and the status code. Note that a status code is set by default (at least that's what I saw - even when I didn't specify it, it was present).
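For instance, a small sketch of setting the status code and the content-type header explicitly with res.writeHead (the values here are just an example):
const http = require('http')
const server = http.createServer((req, res) => {
    // first argument is the status code, second is an object of headers
    res.writeHead(200, { 'content-type': 'text/html' })
    res.write('<h1>Home page</h1>')
    res.end()
})
server.listen(5000)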


Example 2


Example 3

Example 4


All good - then what's wrong with the http module, and why do we need Express?
If we have an HTML page, then obviously it will have CSS and JS and other resources linked to it, like images and so on. We need to request every single resource and handle it individually. That's a lot of overhead as the app grows.



Let's handle all the requests using http
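A hedged sketch of what handling every resource by hand looks like with the bare http module (the navbar-app file names are assumptions): every linked asset needs its own url check, its own read and its own content-type.
const http = require('http')
const { readFileSync } = require('fs')
// read every asset and serve it manually
const homePage = readFileSync('./navbar-app/index.html')
const homeStyles = readFileSync('./navbar-app/styles.css')
const homeLogic = readFileSync('./navbar-app/browser-app.js')
const server = http.createServer((req, res) => {
    if (req.url === '/') {
        res.writeHead(200, { 'content-type': 'text/html' })
        res.end(homePage)
    } else if (req.url === '/styles.css') {
        res.writeHead(200, { 'content-type': 'text/css' })
        res.end(homeStyles)
    } else if (req.url === '/browser-app.js') {
        res.writeHead(200, { 'content-type': 'text/javascript' })
        res.end(homeLogic)
    } else {
        res.writeHead(404, { 'content-type': 'text/html' })
        res.end('<h1>Resource not found</h1>')
    }
})
server.listen(5000)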


Express 
Express is a fast, minimalist web framework for Node.
Install
 npm install express --save
// Earlier, if you omitted --save the package would be installed locally but 
// not added to package.json, so when you pushed to git the next person couldn't 
// get it. 
// But that problem doesn't exist anymore (--save is the default now)
// To install an older version just type
 npm install express@4.17.1 --save // or any other version you like
Basic setup
Similar to what we did in http module. 
const express = require('express');
const app = express();
app.get("/", (req, res) => {
    console.log("User hits the resource")
    res.send("HOME PAGE")
})
app.get("/about", (req, res) => {
    console.log("User hits the resource")
    res.send("About PAGE")
})
app.listen(5000, () => {
    console.log("Server started listening")
})
In Express, we generally do these operations

We can set the status code before sending the response. Also, we can handle all other routes using the all method.
const express = require('express');
const app = express();
app.get("/", (req, res) => {
    console.log("User hits the resource")
    res.status(200).send("HOME PAGE") // we can chain these methods like this
})
app.get("/about", (req, res) => {
    console.log("User hits the resource")
    res.status(200).send("About PAGE")
})
// for all others
app.all('*', (req, res) => {
    res.status(404).send('<h1>Resource not found</h1>')
})
app.listen(5000, () => {
    console.log("Server started listening")
})
First Express App
Sending HTML file to the client
const express = require('express')
const path = require('path')
const app = express()
app.get("/", (req, res) => {
    res.sendFile(path.resolve(__dirname, "02-express-tutorial/navbar-app/index.html"))
})
app.listen(5000, () => {
    console.log("Server is listening")
})
We get the same error (not having access to the CSS and JS) as we got when using the http module.
To solve this we use a middleware like this
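A sketch of that middleware (the folder name ./public is an assumption): express.static serves every file in the folder, so the CSS and JS referenced by the HTML are now reachable.
const express = require('express')
const path = require('path')
const app = express()
// every file in ./public (styles.css, browser-app.js, images...) is served automatically
app.use(express.static('./public'))
app.get('/', (req, res) => {
    res.sendFile(path.resolve(__dirname, './navbar-app/index.html'))
})
app.listen(5000, () => console.log('Server is listening'))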


Now, time to think. I said that if you point the static middleware at a folder, the server gets access to those files (styles, JS and so on). So I first wrote res.sendFile() and pointed it at an HTML file, so that when the user/client requests localhost:5000/, the server sends that HTML page, and for the styles it can look them up in the static folder and serve those to the client as well.
But the question is: isn't the HTML page placed inside sendFile() static as well?
Definitely it is. So instead of sending this HTML file through sendFile(), we can add it to the public folder itself, where the static middleware is pointing. That will do the same job and we will be in good shape. Let's see how.


So the first option to send an HTML file to the client was sendFile, but we don't use it much just for sending HTML. The second option was placing the HTML file in the static folder served by the middleware. And the third option is SSR (server-side rendering), where we use a template engine.
Why is Express used? Use cases, in other words
To set up APIs -> res.json()
To set up templates with server-side rendering. We send back the entire HTML, CSS and JS ourselves to the client using res.render()

1. Let's build API using Express
A simple API
Where to find in DOCS?




Enhancing the app a bit



At this point, let's enhance what we are sending (JSON) a bit.
When you see any website, if they display products, then probably they display only a few things about the product like name, image and price. Once we click on the product then it displays all the details like description, make, expiry date, seller and so on.
So the idea is: when the client requests /products, let's send only id, name and image first. Later, when the client requests a specific product (/products/3), we can send all the details related to that product.
Sending only a few details in each product when request comes for all the products


To get a single product we use route parameters
Route parameters


The problem here is we are hardcoding 1. What if we need product 2? So let's use route params.

But what if I access a product that doesn't exist? find returns undefined, and we need to handle that.
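A sketch of the route-params setup (the products data file is assumed): :productID is a placeholder whose actual value shows up in req.params, always as a string, and the missing-product case returns a 404.
const { products } = require('./data') // assumed local data file
app.get('/api/products/:productID', (req, res) => {
    const { productID } = req.params // e.g. '3' - note that it's a string
    const singleProduct = products.find((product) => product.id === Number(productID))
    if (!singleProduct) {
        return res.status(404).send('Product does not exist')
    }
    return res.json(singleProduct)
})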


FYI, the route params can be more complex than the above illustrated one. Let's see an example

Query-string parameters OR URL parameters
It's a way to send small amounts of information to the server through the URL. These params are not part of the route path; they are generally used to query the database, sort the results, and things like that. The person who sets up the server (the Node dev or backend team) decides which params will be accepted.
To give you an example, let's navigate to the hacker news API below

There's a small mistake in the above screenshot: ?query= is not the "where". Actually, everything up to /search is the URL, as mentioned, and ? is where the query starts. What I'm saying is: for the URL up to /search, get me query = foo and tags = story.
So, upon logging req.query we get {query:'foo',tags:'story'}. 
This is the general convention where we use ourDomain.com/api/version/something and here v1 is the version. 

Let's design the server in such a way that it accepts params called search and limit. It means, the user is searching for a specific product and limiting the results to a certain number.

Example 1 - The limit is 2 and search is alb

Example 2 - The search is ent

Example 3 - limit is 3

Example 4 - Some random param which is not taken care of by the server (We give back all the results)

Example 5 - Not giving any param will also result in same above (will get back everything)

Example 6 - What if no results exist for the search parameters passed?
By default, if we don't handle it, then

See we are sending back an empty array as no results were found for the search where name starts with p. Wouldn't it be nice if we can send some message saying -> "Your search URL is correct but unfortunately we didn't find anything for this match"
To tell the first part, we send 200 status code that says -> "Your search URL is correct"
For the second part we can send the data like below


Or I can also do this

It's up to us (backend devs) what we send back if the data is not found
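A sketch of a route that accepts search and limit query params (the /api/v1/query path and the products data are assumptions): filter and slice based on req.query and always send something back, even when nothing matches.
app.get('/api/v1/query', (req, res) => {
    const { search, limit } = req.query
    let sortedProducts = [...products]
    if (search) {
        // keep only products whose name starts with the search term
        sortedProducts = sortedProducts.filter((product) => product.name.startsWith(search))
    }
    if (limit) {
        sortedProducts = sortedProducts.slice(0, Number(limit))
    }
    if (sortedProducts.length < 1) {
        // the URL was fine, we just found nothing - send 200 plus an empty payload / message
        return res.status(200).json({ success: true, data: [] })
    }
    res.status(200).json(sortedProducts)
})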

The gotcha to keep in mind: we have to send one and only one response per request. If we are using an if condition to send a response, then we need to return that response, like below
DON'T Do this


DO this
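A minimal illustration (same handler as above, trimmed): without the return, execution falls through and a second response is attempted, which throws 'Cannot set headers after they are sent'.
// DON'T - both responses may be attempted for an empty result
if (sortedProducts.length < 1) {
    res.status(200).send('No products matched your search')
}
res.status(200).json(sortedProducts)
// DO - return so only one response is ever sent
if (sortedProducts.length < 1) {
    return res.status(200).send('No products matched your search')
}
res.status(200).json(sortedProducts)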

Middleware in Express js

Need for the middleware

We can do something like this

We can do better than this like below



You see that the file is growing bigger, so it would be nice to have getData in a separate file. Let's call it logger from here on. Also, wouldn't it be nice if we had a function that adds the logger / getData to all the routes by default, instead of adding it manually? Well, we have such functionality. Let's take a look.
Let's first move the logger to a separate file


Let's now add the logger to all the routes programmatically instead of adding it manually. That way, if the logger (the variable name) changes later, we don't need to change it in every route.
App.use()
The above is achieved using app.use(logger)
We pass the logger into app.use(), and that is equivalent to adding the middleware to all the routes.

Rules of app.use()
In order for app.use(logger) to apply to all the routes, the app.use() call must be placed above the route definitions. Order matters here.
Let's say we need to apply the logger only to the routes starting with /api (or some other path); we can pass that path as the first parameter and the logger as the second parameter, as sketched below.
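A sketch of the logger middleware and the app.use placement (the logger.js file name is assumed): the middleware signature is (req, res, next), and calling next() hands control to the matching route.
// logger.js
const logger = (req, res, next) => {
    const { method, url } = req
    console.log(method, url, new Date().getFullYear())
    next() // without this the request just hangs
}
module.exports = logger
// app.js
const logger = require('./logger')
app.use(logger)              // applies to every route defined BELOW this line
// app.use('/api', logger)   // or: applies only to routes starting with /api
app.get('/', (req, res) => res.send('Home'))
app.get('/about', (req, res) => res.send('About'))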

Multiple middlewares

Let's mimic authorization here (real authorization functionality might look different).
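A rough mock of the authorize middleware (the ?user=john query param is just the demo mechanism): if the expected user is present we attach it to req and call next(), otherwise we send a 401.
// authorize.js
const authorize = (req, res, next) => {
    const { user } = req.query
    if (user === 'john') {
        req.user = { name: 'john', id: 3 } // attach info for later middlewares/routes
        next()
    } else {
        res.status(401).send('Unauthorized')
    }
}
module.exports = authorize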



This authorization was just for demonstration. In real authorization we check for a JSON Web Token, and if the token exists we communicate with the DB and get the user (all this comes up later).
Now that we have the middlewares set up, we can access that info in any route. For example, in the authorize middleware we set req.user = {name:'john', id:3}. We can access this in any route, as every route has access to this middleware in our current setup. To prove that, let's use it in /api/items and see how we can access it there:

Now let's say, like before, I don't want to apply the middleware to all my routes. For example, I only need to check for authorized users in /api/items
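A sketch of using the middlewares per route instead of globally: pass a single middleware, or an array of them, as the second argument and read req.user inside the handler.
const logger = require('./logger')
const authorize = require('./authorize')
// only this route runs logger and then authorize, in that order
app.get('/api/items', [logger, authorize], (req, res) => {
    console.log(req.user) // { name: 'john', id: 3 } - set by the authorize middleware
    res.send('Items here')
})
// other routes stay middleware-free
app.get('/', (req, res) => res.send('Home'))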

Options of middleware
Our own middleware (what we saw till now)
Express provided middleware
Third party middleware
Express provided middleware - example

In this example, all the static assets are placed in the public folder, and we let Express know that via the built-in express.static middleware above, so that while serving the HTML page it can also serve the assets from the public folder.
Third party middleware
Let's install a third-party logger middleware called morgan
It helps with logging, just like the logger we wrote ourselves
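A quick sketch of plugging morgan in (install with npm i morgan): the 'tiny' format logs the method, url, status and response time for every request.
const express = require('express')
const morgan = require('morgan')
const app = express()
app.use(morgan('tiny')) // logs e.g. "GET /api/items 200 - 1.342 ms"
app.get('/', (req, res) => res.send('Home'))
app.listen(5000)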

Http methods

 Get (default method the browser performs)
To get all the data

Post (add/insert data to server)
Before going to POST, jog your memory here Render static files
In order to explain the POST request, let's start with adding static resources and rendering it into the browser
Example 1 - using index.html

The reason we do this is that we cannot simply configure our browser to perform a POST request. We need a simple app like this, or a tool like Postman, which we will see later.
So let's take a look at our index.html which is currently displaying in the browser and understand what's going on



We already know we won't get the data back, as we are not handling /login in our Express app yet, but the point to notice is below. Once you make the POST request, go to the Network tab and see that it is a POST request.

Also, to see what data we passed in the body of the POST, click the Network tab and scroll to the bottom, as shown below.

Let's handle the post data

At this point I cannot access the body (data) sent to the server - in other words, inside the POST handler I don't have access to name = 'john'. Hence I cannot add the user john to my list of people. This is where middleware comes into the picture. We use a built-in Express middleware like this.
Note that req doesn't have a body property by default, so req.body will be undefined until we add the middleware.
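A sketch of parsing the HTML form body (the /login route and the name field come from the example form): express.urlencoded populates req.body with the form fields, so the POST handler can read name.
const express = require('express')
const app = express()
app.use(express.static('./methods-public'))
// parse form data (content-type: application/x-www-form-urlencoded) into req.body
app.use(express.urlencoded({ extended: false }))
app.post('/login', (req, res) => {
    console.log(req.body) // { name: 'john' } once the middleware is in place
    res.send(`Welcome ${req.body.name}`)
})
app.listen(5000)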

Now, what is app.use(express.urlencoded({ extended: false }))
https://stackoverflow.com/questions/23259168/what-are-express-json-and-express-urlencoded

  In early versions of Express, the body parser ({extended:false} here) was only available by installing a separate npm package, as explained in the above article. Now it comes built in. But what exactly is this body parser option?

I know it's not very clear at this point why we need extended. I'll update this once I get clarity.
We could also install body parser and do this as explained in the video below

https://www.youtube.com/watch?v=vKlybue_yMQ&ab_channel=codedamn
Small task with POST request we learned so far
TASK: If name is provided, then welcome that user, else give the 401 error and ask him to provide credentials

Example 2 - using javascript.html
Here, we submit the form using the axios package in JavaScript, unlike the previous example that used the plain HTML form submit. axios is similar to fetch and they behave much the same way; axios is chosen here as it has nicer error handling and so on, but the concept remains the same.
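A hedged sketch of the browser-side submit (the element selectors and the /api/people endpoint are assumptions): axios.post sends a JSON body, which is why this variant needs express.json() on the server.
// inside javascript.html (runs in the browser, not in Node)
const formInput = document.querySelector('.form-input')
const btn = document.querySelector('.submit-btn')
btn.addEventListener('click', async (e) => {
    e.preventDefault()
    try {
        // sends { name: ... } as JSON; parsed on the server by express.json()
        const { data } = await axios.post('/api/people', { name: formInput.value })
        console.log(data)
    } catch (error) {
        console.log(error.response) // e.g. the 400 response when name is empty
    }
})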





TASK: If name is not provided then handle that when the data is submitted using JS form

At this point, please navigate to the Postman tool, understand it, and come back here
Put (Update the data)
We use route parameter. Let's say we need to update the data of order 2, then

This is a convention. Technically, there are multiple other ways we can set this up. We will cover the official name of this convention and more details about it later.
Let's first see the console log data in PUT method sent through POSTMAN.

PUT basic implementation

PUT implementation to update the new data provided

Success case where person id exists

Failure case where person id doesn't exist

Put VS Patch
Patch is used to update only the required fields. Let's say we pass the id and then would like to update only the name and nothing else, then we do it with patch.

Whereas with PUT, only the fields we pass are kept; the rest are not retained - they get removed.
Think of it like this: in React we copy the existing values and then update the required ones -> this is what PATCH does
For PUT, we only pass the values we need and don't copy the others, so those get removed -> this is what PUT does
Delete 
We don't need to pass in the body for the delete request


Postman tool
You saw that to test the GET and POST methods we had to set up a front-end app. That is time consuming. Postman allows us to test the backend without worrying about a front-end.
Get

Post

We can send the same thing like this in Postman

The body we are sending is parsed by the express.json() middleware, which adds it to req.body on the server.

TASK : Add another post method and handle /api/postman/people


At this point, navigate back to here, take a look at PUT and DELETE methods and then come back
Express Router
Basic router setup
Our complete app so far. We have only a couple of routes and you can see how big the app.js file already is. The solution is to use the Express router, where we can group routes together; as for the functionality, we can set it up as separate controllers.
Later, when we talk about databases, we will cover the common convention, the MVC pattern. It's not a rule, but it's the most-used pattern for structuring our code.
const express = require('express')
const app = express()
let { people } = require('./data')
//static assets
app.use(express.static('./methods-public'))
//parse the html form data
app.use(express.urlencoded({ extended: false }))
app.get("/api/people", (req, res) => {
    res.status(200).json({ success: true, data: people })
})
// parse javascript form data
app.use(express.json())
app.post("/api/people", (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400).json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201).json({ message: "success", person: req.body.name })
})
app.post("/api/postman/people", (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400).json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201).json({ success: true, data: [...people, { name: req.body.name }] })
})
app.post("/login", (req, res) => {
    console.log(req.body)
    if (!req.body.name) {
        return res.status(404).send("Please give us the credentials")
    }
    res.send(`The name is ${req.body.name}`)
})
app.put("/api/people/:id", (req, res) => {
    // we should get two data. 
    // One is the id we need to update
    // Two is we need the new data to be updated to
    const { id } = req.params // this will be string
    const { name } = req.body
    // first see if that ID exists. 
    // converting str to the number as per people array. +id is same as Number(id)
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person with id ${id}` })
    }
    const newPeople = people.map(person => {
        if (person.id === +id) {
            person.name = name
        }
        return person
    })
    res.status(200).json({ success: true, data: newPeople })
})
app.delete("/api/people/:id", (req, res) => {
    const { id } = req.params
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person found with id ${id}` })
    }
    const newPeople = people.filter(person => person.id !== +id)
    return res.status(200).json({ success: true, data: newPeople })
})
app.listen(5000, () => {
    console.log("Server started listening on 5000")
})
You see we have a common path prefix across /api/people, /api/people/postman and /api/people/:id. Don't you think we can somehow group them?

Let's move all the routes starting with /api/people to people.js.
Steps to turn a normal app into an Express router
In people.js, import the router from express.Router()
Move all the above marked routes starting with /api/people from app.js to people.js
Replace all the app calls with router in people.js
Cut and paste any related data, like the people import, from app.js to people.js
Import the people routes from people.js into app.js
Write the middleware app.use('/api/people', people) in app.js to set up the base path, so that when we hit /api/people from Postman, it hits this middleware and then takes us to the people.js routes
In people.js, remove /api/people from all routes, as we define that prefix in the app.js middleware as said in the above point
Test all the routes in Postman now.

const express = require('express')
const router = express.Router()
let { people } = require('../data')
router.get("/", (req, res) => {
    res.status(200).json({ success: true, data: people })
})
router.get("/:id", (req, res) => {
    const personId = +req.params.id
    const person = people.find(person => person.id === personId)
    if (!person) {
        return res.status(404)
        .json({ success: false, msg: `No person found with the id ${personId}`})
    }
    return res.status(200).json({ success: true, data: { ...person } })
})
router.post("/", (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400)
        .json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201).json({ message: "success", person: req.body.name })
})
router.post("/postman", (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400)
        .json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201)
    .json({ success: true, data: [...people, { name: req.body.name }] })
})
router.put("/:id", (req, res) => {
    // we should get two data. 
    // One is the id we need to update
    // Two is we need the new data to be updated to
    const { id } = req.params // this will be string
    const { name } = req.body
    // first see if that ID exists. 
    // converting str to the number as per people array. +id is same as Number(id)
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404)
        .json({ success: false, msg: `No person with id ${id}` })
    }
    const newPeople = people.map(person => {
        if (person.id === +id) {
            person.name = name
        }
        return person
    })
    res.status(200).json({ success: true, data: newPeople })
})
router.delete("/:id", (req, res) => {
    const { id } = req.params
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person found with id ${id}` })
    }
    const newPeople = people.filter(person => person.id !== +id)
    return res.status(200).json({ success: true, data: newPeople })
})
module.exports = router // exporting router so that it can be imported as people or something else
Let's now do the same thing for login, in an auth file
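A sketch of that auth router (file name routes/auth.js assumed), mirroring the /login handler that currently lives in app.js; app.js would then mount it with app.use('/login', auth).
//// routes/auth.js
const express = require('express')
const router = express.Router()
router.post('/', (req, res) => {
    console.log(req.body)
    if (!req.body.name) {
        return res.status(404).send('Please give us the credentials')
    }
    res.send(`The name is ${req.body.name}`)
})
module.exports = router
//// app.js
// const auth = require('./routes/auth')
// app.use('/login', auth)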


Controller
You see that we are now using the router to separate the routes, but when you look at people.js above it's still a big file, with so many methods and callback functions inside it. Wouldn't it be nicer and cleaner if we could move the callback functions into a separate file?

Yes, we can do that, and we call them controllers. Controllers are the functions passed to GET, POST and the other HTTP methods. They are called controllers because we are now slowly transforming this into the MVC (Model View Controller) pattern. Let's leave auth as-is since it's a small file, and convert people.js.
Setting up the controllers folder and, inside it, the file people.js
people.js
let { people } = require('../data')
const getPeople = (req, res) => {
    res.status(200).json({ success: true, data: people })
}
const createPerson = (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400).json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201).json({ message: "success", person: req.body.name })
}
const getPerson = (req, res) => {
    const personId = +req.params.id
    const person = people.find(person => person.id === personId)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person found with the id ${personId}` })
    }
    return res.status(200).json({ success: true, data: { ...person } })
}
const createPersonPostman = (req, res) => {
    const { name } = req.body
    if (!name) {
        return res.status(400).json({ success: 'false', msg: 'please provide name value' })
    }
    res.status(201).json({ success: true, data: [...people, { name: req.body.name }] })
}
const updatePerson = (req, res) => {
    // we should get two data. 
    // One is the id we need to update
    // Two is we need the new data to be updated to
    const { id } = req.params // this will be string
    const { name } = req.body
    // first see if that ID exists. 
    // converting str to the number as per people array. +id is same as Number(id)
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person with id ${id}` })
    }
    const newPeople = people.map(person => {
        if (person.id === +id) {
            person.name = name
        }
        return person
    })
    res.status(200).json({ success: true, data: newPeople })
}
const deletePerson = (req, res) => {
    const { id } = req.params
    const person = people.find(person => person.id === +id)
    if (!person) {
        return res.status(404).json({ success: false, msg: `No person found with id ${id}` })
    }
    const newPeople = people.filter(person => person.id !== +id)
    return res.status(200).json({ success: true, data: newPeople })
}
module.exports = {
    getPeople,
    getPerson,
    createPersonPostman,
    createPerson,
    updatePerson,
    deletePerson
}
Setting up routes : 1st way

Setting up routes : 2nd way (choose 1st or 2nd way as per your convenience)
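A sketch of both setups in routes/people.js once the controllers are imported (controller path ../controllers/people assumed):
const express = require('express')
const router = express.Router()
const { getPeople, createPerson, createPersonPostman, getPerson, updatePerson, deletePerson } = require('../controllers/people')
// 1st way - one line per method
// router.get('/', getPeople)
// router.post('/', createPerson)
// router.post('/postman', createPersonPostman)
// router.get('/:id', getPerson)
// router.put('/:id', updatePerson)
// router.delete('/:id', deletePerson)
// 2nd way - group by path with route() and chain the methods
router.route('/').get(getPeople).post(createPerson)
router.route('/postman').post(createPersonPostman)
router.route('/:id').get(getPerson).put(updatePerson).delete(deletePerson)
module.exports = router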

Mongo DB
It's a NoSQL DB. Atlas is a cloud platform that lets you host your DB.
Mongoose
It's a library used on top of MongoDB. It takes on all the heavy lifting and makes the DB part easy.
Basic MongoDB and Mongoose setup
To test this we just require this file in app.js

const mongoose = require('mongoose')
const connectionString =
    'mongodb+srv://sandeep:1234@nodeexpressprojects.yt2te.mongodb.net/03-TASK-MANAGER?retryWrites=true&w=majority'
mongoose
    .connect(connectionString, {
        useNewUrlParser: true,
        useCreateIndex: true,
        useFindAndModify: false,
        useUnifiedTopology: true
    })
    .then(() => console.log('Connected to the DB...'))
    .catch((err) => console.log(err))
Now if you observe the console, we see that the server starts first and then the DB connects. Think about it: what's the point of starting the server first, without a DB? All the data has to come from the DB. Meaning, first the DB must be connected and only then should the server start.

So we should not invoke mongoose.connect on import as shown in connect.js. We can wrap it in a function and then invoke it from app.js.
Refactoring connect.js and app.js so that the DB connects before the server starts


Code for app and connect
const express = require('express')
const app = express()
const tasks = require('./routes/tasks')
const connectDB = require('./db/connect')
// Middleware
app.use(express.json())
// Routes
app.get('/hello', (req, res) => {
    res.send('Task Manager App')
})
app.use('/api/v1/tasks', tasks)
const port = 3000
const start = async () => {
    try {
        await connectDB()
        app.listen(port, console.log(`Server is listening on port ${port}...`))
    } catch (error) {
        console.log(error)
    }
}
start()
//// db/connect.js
const mongoose = require('mongoose')
const connectionString =
    'mongodb+srv://sandeep:1234@nodeexpressprojects.yt2te.mongodb.net/03-TASK-MANAGER?retryWrites=true&w=majority'
const connectDB = (url) => {
    console.log("Connecting to the DB...")
    return mongoose
        .connect(connectionString, {
            useNewUrlParser: true,
            useCreateIndex: true,
            useFindAndModify: false,
            useUnifiedTopology: true
        })
}
module.exports = connectDB
Route not found
Unknown route
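A small sketch of a catch-all middleware for unknown routes: registered after all other routes in app.js, it answers anything that didn't match.
// app.js - after all the routes
app.use((req, res) => {
    res.status(404).send('Route does not exist')
})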

Env 
Don't you think we should avoid pushing the database credentials to git? Yes, we should not expose them. So we put them in a .env file and ignore that file in git. We can install the dotenv package in our application and then access the secret variables from anywhere in the application.
npm install dotenv
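A sketch of wiring dotenv (the MONGO_URI variable name is an assumption): the connection string moves into a git-ignored .env file and the code reads it from process.env.
// .env (add this file to .gitignore)
// MONGO_URI=mongodb+srv://<user>:<password>@cluster.mongodb.net/03-TASK-MANAGER
// app.js
require('dotenv').config()
const connectDB = require('./db/connect')
const start = async () => {
    await connectDB(process.env.MONGO_URI) // pass the secret in instead of hardcoding it
    app.listen(3000)
}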


Model
A model is the representation of a collection.
Think of documents as rows in an RDBMS and a collection as a table.
Below, it shows the documents (on the right side) for the Products collection.

This is where we use a Mongoose schema and set up the structure for all the documents that we will have in our collection.
Schema types can be found here

Based on this let's setup simple schema type for a Task collection
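A sketch of a simple schema and model for the Task collection (field names name and completed, no validation yet):
// models/Task.js
const mongoose = require('mongoose')
const TaskSchema = new mongoose.Schema({
    name: String,        // schema types only for now
    completed: Boolean
})
// 'Task' becomes the 'tasks' collection in the DB
module.exports = mongoose.model('Task', TaskSchema)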

Now we can go to the controller and start using the model.
Before you proceed, read this a bit

Create operation in DB
Step 1 : Define a schema for a collection (model) - you can also think of it as a table



Step 2 : Create a new document by receiving req.body and adding it to the collection
I showed you three approaches above to create a document. Let's pick the second approach and create one: we take the incoming request (req.body), add it to the DB using approach 2 in the controller, and then send it back as the response.
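A sketch of that controller (approach 2, i.e. Model.create with the request body):
// controllers/tasks.js
const Task = require('../models/Task')
const createTask = async (req, res) => {
    // fields in req.body that aren't in the schema are simply ignored
    const task = await Task.create(req.body)
    res.status(201).json({ task })
}
module.exports = { createTask }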

Validation
Without validation, we can send an empty object or empty fields to the DB. Let's set up validation on the schema so that the values exist before they are sent to the DB. Validation is a big topic and we'll cover more of it as we progress.
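A sketch of the same schema with basic validators (messages and limits are just examples):
const TaskSchema = new mongoose.Schema({
    name: {
        type: String,
        required: [true, 'must provide name'], // rejects a missing/empty name
        trim: true,                            // strips surrounding whitespace
        maxlength: [20, 'name can not be more than 20 characters']
    },
    completed: {
        type: Boolean,
        default: false
    }
})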

More on validation can be found in Mongoose docs below
Error handling
At this point, if a validation error occurs, we are not sending any response, so Postman keeps waiting and never gets a response back.

In other words, we are not handling the error gracefully. The reason is that we have an async operation, like below

Let's handle this using a try-catch block. Later we'll find more ways to improve this and avoid the try-catch boilerplate, but for now let's use it
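A sketch of the controller wrapped in try-catch, so a validation failure still produces a response:
const createTask = async (req, res) => {
    try {
        const task = await Task.create(req.body)
        res.status(201).json({ task })
    } catch (error) {
        // e.g. a validation error - respond so Postman isn't left hanging
        res.status(500).json({ msg: error.message })
    }
}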

Async Wrappers
The stuff explained below is a bit complicated, so I'm adding this line later, after I thoroughly understood the concept and wrote a blog post on it. Please refer to that first and then proceed.
Since we are using async functions, we wrap them in try-catch blocks. Doing that in each controller becomes redundant, as we keep repeating the same code.

Creating our own async wrapper
This is a bit tricky to understand at first. Think of it as explained below.
In this project we are creating the async wrapper on our own. However, in the upcoming projects we will use an npm package for this, and I'll explain that then
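A sketch of the wrapper itself (file name matching the require path used in the controllers below): it takes a controller, returns a new async function, and funnels any thrown error to next(), so one try-catch serves every controller.
// middleware/asyncWraper.js (name matches the require path used below)
const asyncWrapper = (fn) => {
    return async (req, res, next) => {
        try {
            await fn(req, res, next) // run the actual controller
        } catch (error) {
            next(error) // hand the error to the express error-handling middleware
        }
    }
}
module.exports = asyncWrapper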



Now we can replace all the controllers by this asyncWrapper that will result in below code
const Task = require('../models/Task')
const asyncWrapper = require('../middleware/asyncWraper')
const getAllTasks = asyncWrapper(async (req, res) => {
    const tasks = await Task.find({})
    res.status(200).json({ tasks })
})
const createTask = asyncWrapper(async (req, res) => {
    const task = await Task.create(req.body)
    res.status(201).json({ task })
})
const getTask = asyncWrapper(async (req, res) => {
    // res.status(200).json(req.params)
    console.log("Get single task")
    const { id: taskId } = req.params
    const task = await Task.findById(taskId)
    if (!task) {
        return res.status(404).json({ msg: `No task with id: ${taskId}` })
    }
    res.status(200).json({ task })
})
const deleteTask = asyncWrapper(async (req, res) => {
    const { id: taskId } = req.params
    const task = await Task.findOneAndDelete({ _id: taskId });
    if (!task) {
        return res.status(404).json({ msg: `No task with id: ${taskId}` })
    }
    const tasks = await Task.find({})
    res.status(200).json({ tasks })
})
const updateTask = asyncWrapper(async (req, res) => {
    const { id: taskId } = req.params
    const task = await Task.findByIdAndUpdate({ _id: taskId }, req.body, { new: true, runValidators: true })
    if (!task) {
        return res.status(404).json({ msg: `No task with id: ${taskId}` })
    }
    res.status(200).json({ task })
})
const editTask = asyncWrapper(async (req, res) => {
    const { id: taskId } = req.params
    const task = await Task.findByIdAndUpdate({ _id: taskId }, req.body, { new: true, runValidators: true })
    if (!task) {
        return res.status(404).json({ msg: `No task with id: ${taskId}` })
    }
    res.status(200).json({ task })
})
module.exports = { getAllTasks, createTask, getTask, updateTask, deleteTask, editTask }
But at the moment we are not actually handling the error yet, as we are passing it to the next middleware like this


Let's handle this error which has been passed to the next middleware.
Error handling with Express middleware passed into next
Now I'll ask you a question. In the above image, I said the error we see in Postman is not handled yet, as we are not handling the error passed into next. Agree? Then where is the error HTML in Postman coming from? We didn't write that code.
Well, Express does default error handling for us. Look at the docs as shown below


Writing our own error handler



The status should be 500, but I didn't set it, so it says 200. We will set it below.
Let's write the error handler in a separate file
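A sketch of that file (note the four-argument signature, which is how Express recognises an error-handling middleware):
// middleware/error-handler.js
const errorHandlerMiddleware = (err, req, res, next) => {
    // fall back to 500 when the error carries no status code
    return res.status(err.statusCode || 500).json({ msg: err.message || 'Something went wrong' })
}
module.exports = errorHandlerMiddleware
// app.js - registered after all the routes
// app.use(errorHandlerMiddleware)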



Custom error handling class for 404
For handling these in an elegant way

Simple implementation of Error class


Let's create a new folder and call it errors.
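A sketch of what goes in it: a CustomAPIError class carrying a statusCode, a small factory, and the 404 case in a controller passing the error to next() instead of responding directly (file names assumed).
// errors/custom-error.js
class CustomAPIError extends Error {
    constructor(message, statusCode) {
        super(message)
        this.statusCode = statusCode
    }
}
const createCustomError = (msg, statusCode) => new CustomAPIError(msg, statusCode)
module.exports = { CustomAPIError, createCustomError }
// controllers/tasks.js
// const { createCustomError } = require('../errors/custom-error')
// if (!task) {
//     return next(createCustomError(`No task with id: ${taskId}`, 404))
// }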





There's an npm package for this async wrapper, so we don't need to write asyncWrappers ourselves: https://www.npmjs.com/package/express-async-errors
PROJECTS
Now that we have gained enough knowledge, let's build some cool projects.
I will make notes only where I feel it is important. Please navigate through Udemy here for projects if you want to follow along.
Convention
Our server might serve different things, like /index and then /api and so on. So we could use this convention: /api/v1 means it serves /api requests of version 1, so that in the future you can add more versions while keeping the old ones active.


1. Task Manager App
This App involves
Setting up the controllers for GET, CREATE, PATCH, DELETE operations
Setting up middlewares for error handling, and an async wrapper to avoid writing try-catch in each controller
Setting up postman for creating the routes easily
Setting up mongo db and mongoose
Using a simple front-end to perform the actions
This app is not deployed as we still didn't cover authentication and security
Deployment
PORT
Locally we use 5000 or 3000 or whatever we like, but on the platform we deploy to, the same port may not be available, so we need to let the platform choose the port number. Let's set that up using the process.env.PORT variable.

β
 2. JWT 
There are two parts to this project. The figure below shows how we use JWT to access the /dashboard route. Since the dashboard route is protected, we need to verify the token that we gave to the client (jwt sign) during login/register. If the client gives us back a valid token to access the /dashboard route, then we send back the proper data from that route.

The token verification functionality is embedded into /dashboard above. But what if we need to protect some other route? We would have to repeat the same token verification code there as well. Instead of repeating it, we can put the token verification code into its own middleware and use the next functionality, roughly as sketched below. Watch the video for further details.
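As a rough sketch of the two jsonwebtoken calls involved (the payload fields and JWT_SECRET env variable name are just assumptions here):
const jwt = require('jsonwebtoken')

// on login/register: sign a token and send it back to the client
const token = jwt.sign({ name: 'john' }, process.env.JWT_SECRET, { expiresIn: '30d' })

// on a later /dashboard request: verify the token the client sent back
const decoded = jwt.verify(token, process.env.JWT_SECRET) // throws if invalid or expired
console.log(decoded.name) // 'john'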
β
 3. Jobs API (Project Steps)
Let's classify our projects' steps broadly
Step 1 (Basic setup)
Create app.js file. Import express and start the server
Define basic middlewares like express.json (to get access to the req.body) and express.static (if the app has built-in html, css and JS)
Define and test a basic route: app.get('/', (req, res) => ...) and see if it works by calling it in the URL
const express = require('express')
const app = express()
// middlewares
app.use(express.json()) // to get req.body
app.use(express.static('public')) // to serve static files
// routes
app.get('/', (req, res) => {
  res.send('Home Page')
})
const port = process.env.PORT || 3000
app.listen(port, () => {
  console.log(`Listening on port ${port}`)
})
Routes and Controllers
We need auth route and controllers for register and login
We need jobs route and controllers for create, read, update and delete operations. Boilerplate code with a console log statement in each controller should be enough at this point
const express = require('express')
const app = express()
const authRouter = require('./routes/auth')
// middlewares
app.use(express.json()) // to get req.body
app.use(express.static('public')) // to serve static files
// routes
app.get('/', (req, res) => {
  res.send('Home Page')
})
app.use('/auth', authRouter)
const port = process.env.PORT || 3000
app.listen(port, () => {
  console.log(`Listening on port ${port}`)
})
const express = require('express')
const router = express.Router()
const { register, login, logout } = require('../controllers/authController')
router.route('/register').post(register)
router.route('/login').post(login)
router.route('/logout').get(logout)
module.exports = router
const register = async (req, res) => {
  res.send('Register controller')
}
const login = async (req, res) => {
  res.send('Login controller')
}
const logout = async (req, res) => {
  res.send('Logout controller')
}
module.exports = { register, login, logout }
Step 2 (Error Handlers)
Setup error handler middlewares
One for notFound and another errorHandler which covers all other errors like BadRequest and UnAuthenticated
One of the ways this is done in 06JobsAPI is:
Create an errors folder and then create CustomAPIError, which is the main class that extends the Error class

class CustomAPIError extends Error {
  constructor(message) {
    super(message)
  }
}
module.exports = CustomAPIError
Then I'll use this class (extend this class) to create
NotFoundError class, BadRequestError class and UnAuthenticatedError class (and any other class if you want), for example:
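A sketch of one of them (the file name and the hard-coded status code are assumptions; the other classes follow the same pattern, only the code changes):
// errors/not-found.js (hypothetical file name)
const CustomAPIError = require('./custom-api')

class NotFoundError extends CustomAPIError {
  constructor(message) {
    super(message)
    this.statusCode = 404 // BadRequestError would set 400, UnAuthenticatedError 401
  }
}

module.exports = NotFoundError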



We can export all these errors from an index file, so that when we want to throw them in the controllers we can import them from that one file, as sketched below.
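A sketch of that index file, assuming one file per error class (file names are assumptions):
// errors/index.js
const CustomAPIError = require('./custom-api')
const NotFoundError = require('./not-found')
const BadRequestError = require('./bad-request')
const UnAuthenticatedError = require('./unauthenticated')

module.exports = { CustomAPIError, NotFoundError, BadRequestError, UnAuthenticatedError }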

Error middlewares
Now that we have all these errors implemented, let's define error middlewares that we can use in app.js
We will have a notFound middleware and an errorHandler middleware. The notFound one is used when a route doesn't exist. The errorHandler one is used when something else goes wrong; that something else can be an Authentication error, a BadRequest error and so on, which we defined above.


Use these two middlewares in app.js:
Not found middleware
Error handler middleware. This covers
BadRequest Error
Authentication Error
Not Found Error (this occurs when we throw NotFoundError in any controller)
If the endpoint the user entered doesn't exist, the request automatically falls through to the notFound middleware from the first step. Any other kind of error (everything except a non-existent route) is handled by the errorHandler middleware. A sketch of both follows.
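Roughly, the two middlewares and their wiring look like this (a sketch; the import paths are assumptions):
// middleware/not-found.js - hit when no route matched
const notFound = (req, res) => res.status(404).send('Route does not exist')

// middleware/error-handler.js - hit when a controller throws / passes an error to next
const { CustomAPIError } = require('../errors')
const errorHandlerMiddleware = (err, req, res, next) => {
  if (err instanceof CustomAPIError) {
    return res.status(err.statusCode).json({ msg: err.message })
  }
  return res.status(500).json({ msg: 'Something went wrong, please try again' })
}

// app.js - registered after all the routes
app.use(notFound)
app.use(errorHandlerMiddleware)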

Step 3 (Register)
We need to define a post route for register
Then define a controller for register
Then create a Model for the user and define the UserSchema. Also, define the UserSchema validations. If you will be sending name, email and password, then define the validations for these fields as shown below.
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name'],
    maxlength: 50,
    minlength: 3,
  },
  email: {
    type: String,
    required: [true, 'Please provide email'],
    match: [
      /^(([^<>()[\]\\.,;:\s@"]+(\.[^<>()[\]\\.,;:\s@"]+)*)|(".+"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))$/,
      'Please provide a valid email',
    ],
    unique: true,
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minlength: 6,
  },
})
module.exports = mongoose.model('User', UserSchema)
In the register controller, write code to put the incoming fields into the db
const user = await User.create({ ...req.body })
This creates the user in the db, but if you think about it, we are saving the passwords in plain text. So before we do that we can hash the password using the bcryptjs package. We could do this here in the controller itself, but mongoose provides a pre hook which executes before saving the data into the db. So you can use this instead of cluttering your controller, like this
// in the User model
// this executes after User.create but before going into DB. So we can hash the password here
UserSchema.pre('save', async function () {
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt) // this refers to the document
})
// we can access like this
// this.username, this.password and so on
Once the user gets created in the db with the password hashed, we then need to create and send back the token. We could also do this in the controller, but mongoose gives us instance methods where, similar to the pre hook, we can define a method on the schema and then use it in the controller like so
UserSchema.methods.createJWT = function () {
  return jwt.sign(
    { userId: this._id, name: this.name },
    process.env.JWT_SECRET,
    {
      expiresIn: process.env.JWT_LIFETIME,
    }
  )
}
Now the auth controller's register method looks like this
const register = async (req, res) => {
  const user = await User.create({ ...req.body })
  const token = user.createJWT()
  res.status(StatusCodes.CREATED).json({ user: { name: user.name }, token })
}
Step 4 (Login)
We need to get username (email in this case) and password
See if this email exists in the db
If yes, then compare that password string with the hashed password in the db
If they match, that means the user is legit, so we can send back the token just like we did in the register call. A sketch is below.
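A sketch of that login controller, assuming a comparePassword instance method on the User schema (defined with bcrypt.compare, similar to the pre hook above) and the error classes from Step 2:
const User = require('../models/User')
const { StatusCodes } = require('http-status-codes')
const { BadRequestError, UnAuthenticatedError } = require('../errors')

const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new UnAuthenticatedError('Invalid credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new UnAuthenticatedError('Invalid credentials')
  }
  // user is legit - send back the same shape of response as in register
  const token = user.createJWT()
  res.status(StatusCodes.OK).json({ user: { name: user.name }, token })
}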


Step 5 (Verify token - Auth middleware)
Once we have created the token and sent it back in the register and login routes, it's time to think about why we need this token. We have different endpoints for jobs like GetJob, GetAllJobs, CreateJob, UpdateJob and DeleteJob. All these job endpoints are secure, meaning the user should be logged in to access this functionality. How do we differentiate a logged in from a logged out user? With the token. The logged in user will have the token we sent back when they registered or logged in. So if the user sends us back that token when accessing any of these endpoints, then that user is logged in.
So what's the next step? We get the token and verify it using the jsonwebtoken package (the same package that helped us create the token). If it is valid, we allow the user to access any of the job routes.
We could add this token verification to every job related endpoint mentioned above, but it's better to write it once. So let's write it in the Auth middleware
Write Auth mw where we implement the token checking functionality
We can attach this middleware to all the routes that need to be secure, as sketched below
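A rough sketch of that middleware (the payload fields match what we put into the token in createJWT above):
const jwt = require('jsonwebtoken')
const { UnAuthenticatedError } = require('../errors')

const auth = async (req, res, next) => {
  // expect the token as "Bearer <token>" in the Authorization header
  const authHeader = req.headers.authorization
  if (!authHeader || !authHeader.startsWith('Bearer ')) {
    throw new UnAuthenticatedError('Authentication invalid')
  }
  const token = authHeader.split(' ')[1]
  try {
    const payload = jwt.verify(token, process.env.JWT_SECRET)
    // attach the user so the job controllers can read req.user.userId
    req.user = { userId: payload.userId, name: payload.name }
    next()
  } catch (error) {
    throw new UnAuthenticatedError('Authentication invalid')
  }
}

module.exports = auth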

Securing the getAllJobs endpoint as an example here. Without the token this endpoint can't be accessed.


Step 6 (Jobs - Schema setup)
Once the token is verified, we can move on to the jobs part. First we need to define the Job schema with validations, followed by the Job model. A job object or document will have these fields
company
position
status
createdBy (User) - So we need to link the User object ID (user ID) to this - Reference : https://mongoosejs.com/docs/populate.html
Also, we can enable timestamps so we know when each doc was created and updated. We can then use them to sort.
const mongoose = require('mongoose')
const JobSchema = new mongoose.Schema(
  {
    company: {
      type: String,
      required: [true, 'Please provide company name'],
      maxLength: 20,
    },
    position: {
      type: String,
      required: [true, 'Please provide the position'],
      maxLength: 120,
    },
    status: {
      type: String,
      enum: ['interview', 'declined', 'pending'],
      default: 'pending',
    },
    // Reference - https://mongoosejs.com/docs/populate.html
    createdBy: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
  },
  { timestamps: true }
)
module.exports = mongoose.model('Job', JobSchema)
Step 7 (Create and Get Job)
Now we have our db in place for Job, let's first create a job. We need to setup functionality in controllers
const createJob = async (req, res) => {
  req.body.createdBy = req.user.userId
  const job = await Job.create(req.body)
  res.status(StatusCodes.CREATED).json({ job })
}
const getAllJobs = async (req, res) => {
  // we need to get job related to the user who is logged in
  const jobs = await Job.find({ createdBy: req.user.userId }).sort('createdAt')
  res.status(StatusCodes.OK).json({ jobs, count: jobs.length })
}
Step 8 (Postman - Dynamically set the logged in user's token in the Authorization header)
Currently, to create a job for a user,
We login or register a particular user
Get the token and copy that token
Manually paste it in the Authorization header by writing Bearer <token>
Then create a job
Go to the GetAllJobs route and paste the token here as well in the Authorization header
GetAllJobs for that user
This copy-paste routine gets annoying quite fast. It would be better if we could automatically set the Authorization header to the token of the user who is currently logged in. Let's do that.
Automate token setup in Tests in postman
Navigate to Tests in Postman. When we hit login we get back the token, correct? So we write a few lines of code here to store and reuse that token, something like the snippet below.
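Something along these lines goes in the Tests tab of the login and register requests (the variable name accessToken is just an assumption; reference it as Bearer {{accessToken}} in the Authorization header of the job requests):
// runs after the response arrives: grab the token and save it as an environment variable
const jsonData = pm.response.json()
pm.environment.set('accessToken', jsonData.token)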



Now to test this, create two users (register) UserA and UserB. Login as UserA, create 2 jobs. Login as UserB and create 5 jobs. Now, when you login as UserA and get all jobs you should see only 2 jobs and as userB you should see 5 jobs.
Step 9 (Get a single job)
Now it's time to get a single job. We pass our request from Postman as {{URL}}/jobs/<JobID>, and the JobID is accepted as id in the GetJob route as shown below.

For getting a single job for a particular user, we need to check whether the logged in user created that job and whether the id passed exists in the db. Once both conditions are satisfied, we get the job from the db and send it in the response.
const getJob = async (req, res) => {
  const {
    user: { userId },
    params: { id: jobId },
  } = req
  const job = await Job.findOne({ _id: jobId, createdBy: userId }) // matching both jobID (if exists) and logged in user
  if (!job) throw new NotFoundError(`No job with id ${jobId}`)
  res.status(StatusCodes.OK).json({ job })
}
Step 10 (Update a job)
Similar to Get Single Job, we need to pass jobId and userId in the request so that mongoose can first find that particular job related to the logged in user.  
We also need to pass company and position in the request body; these are the values that get updated. A sketch follows.
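A sketch of the update controller, following the same pattern as getJob (BadRequestError and NotFoundError come from the errors folder):
const updateJob = async (req, res) => {
  const {
    body: { company, position },
    user: { userId },
    params: { id: jobId },
  } = req
  if (company === '' || position === '') {
    throw new BadRequestError('Company or Position fields cannot be empty')
  }
  // only update a job that belongs to the logged in user
  const job = await Job.findOneAndUpdate({ _id: jobId, createdBy: userId }, req.body, {
    new: true,
    runValidators: true,
  })
  if (!job) throw new NotFoundError(`No job with id ${jobId}`)
  res.status(StatusCodes.OK).json({ job })
}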

Step 11 (Delete a job)
Let's now take a look at how to delete a job. It is similar to what we did in update; findByIdAndRemove or findOneAndRemove both do the trick of deleting a job.
We need to check that a job with that ID created by that user (createdBy) exists; if so, we delete it and the deleted job is returned. If no job comes back, nothing was deleted, so we send a 404 (job not found); otherwise we just send 200 and leave the response body empty, as sketched below.
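A sketch of the delete controller:
const deleteJob = async (req, res) => {
  const {
    user: { userId },
    params: { id: jobId },
  } = req
  // only delete a job that belongs to the logged in user
  const job = await Job.findOneAndRemove({ _id: jobId, createdBy: userId })
  if (!job) throw new NotFoundError(`No job with id ${jobId}`)
  res.status(StatusCodes.OK).send() // nothing useful to send back
}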

Step 12 (Making Errors more friendly)
Currently in our setup we have 3 errors
Validation error - We throw BadRequest error when some field is missing
Duplicate error (for email) - Mongoose throws an error because of unique set to true
Cast Error - If the syntax of our request is not good. For example, while sending an ID in the route params, if we mess up the number of characters of the ID (remove or add one character) then this error is thrown
The goal here is to handle all these better. At the moment, for a duplicate email, this is what our response looks like

For this let's work in error-handler middleware. Currently it looks like this
06-jobs-api/starter/middleware/error-handler.js

Notice that whatever we throw, like BadRequestError or NotFoundError, will be caught at line 6. All the other errors, such as those thrown by Mongoose or any internal code errors, will be caught at line 9. The errors reaching line 9 are ones we are not throwing ourselves, and the response is not very informative about what exactly went wrong.
For example, say there's a duplicate error thrown by Mongoose for a unique: true field like email. Since Mongoose is throwing it and not us, we catch it at line 9 and respond with 'Something went wrong'. The user never learns that the email already exists.
The workaround is to inspect the Mongoose error and show it to the user in a useful way. The bottom line is, we need to cater for all the errors we are not throwing ourselves: the Validation Error, Duplicate Error and Cast Error thrown by Mongoose. Let's change the code like this below
Duplicate Error

Also, you can remove lines 11 to 13 in the above code, as we don't have to check whether err is an instance of CustomAPIError. We only care about the message and status code carried by the err param (line 3), and we already read them at lines 7 and 8 as err.statusCode and err.message.
Validation Error
When does a validation error occur?
A validation error occurs at the MODEL level. While creating a model we define which fields are required and what shape they should have. If we don't adhere to those rules while creating a document (when we create a document, Mongoose checks the model and sees whether all the validations defined in the MODEL pass for this DOCUMENT), then a validation error occurs.


We can make use of the error's name prop (ValidationError) and send a useful message and statusCode. Let's check if this error is thrown; if yes, we iterate through the fields within err.errors, pick the message prop for each and display them like this

and we get the below response

Cast Error
This error occurs when we tamper with the format of a parameter we are sending in the request. For example, the ID format when requesting a single job.
Let's say we get all jobs and I want the details of a single job, so I send a GET request to the GetSingleJob route with ID 62eb2fdae7df6711285626f2; this is what I get

ID format correct
Now let's say I mess up the ID format and send a string that is not of the proper length; then I get a cast error. Notice I have removed the 2 at the end of the previous request, like this: 62eb2fdae7df6711285626f
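Putting the duplicate, validation and cast error cases together, the error-handler middleware ends up looking something like this sketch (the exact messages are up to you):
const { StatusCodes } = require('http-status-codes')

const errorHandlerMiddleware = (err, req, res, next) => {
  // default: whatever the thrown error carries, or a generic 500
  let customError = {
    statusCode: err.statusCode || StatusCodes.INTERNAL_SERVER_ERROR,
    msg: err.message || 'Something went wrong, try again later',
  }
  // mongoose validation error (missing/invalid fields)
  if (err.name === 'ValidationError') {
    customError.msg = Object.values(err.errors)
      .map((item) => item.message)
      .join(', ')
    customError.statusCode = 400
  }
  // duplicate key error (e.g. email with unique: true)
  if (err.code && err.code === 11000) {
    customError.msg = `Duplicate value entered for ${Object.keys(err.keyValue)} field, please choose another value`
    customError.statusCode = 400
  }
  // cast error (malformed id in the request params)
  if (err.name === 'CastError') {
    customError.msg = `No item found with id: ${err.value}`
    customError.statusCode = 404
  }
  return res.status(customError.statusCode).json({ msg: customError.msg })
}

module.exports = errorHandlerMiddleware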



Step 13 (Security packages)
So far our apps were small and we only worked locally, so security wasn't a concern. We are now going to host this app in the cloud (Heroku), so we need to worry about attackers trying to hack our API. A hacker could try to access our secured routes (jobs in this case) without proper authentication and do more damage than we can imagine, like getting our logged in users' data, registered users' emails and so on.
Luckily there are many npm packages to our rescue. The packages we are going to use are:
helmet
Sets various HTTP headers to prevent numerous possible attacks. This is very popular
cors - Cross Origin Resource Sharing
Ensures that our API is accessible from different domains
Without cors, your app (server) is only accessible from the same domain, like in our other apps where we put the front-end in the public folder of the same app and sent requests to our own endpoints from there
CORS is a mechanism to allow or restrict requested resources on a web-server depending on where the HTTP request was initiated
By installing and implementing the CORS package, we essentially make our API accessible to the public
xss-clean
Sanitizes the user input in req.body, req.query and req.params, protecting us from cross-site scripting attacks where an attacker tries to inject malicious code
express-rate-limit
We (the server) can limit the number of requests a client can make
The app.js looks like this below after installing security packages.
require('dotenv').config()
require('express-async-errors')
const express = require('express')
const app = express()
// extra security-packages
const helmet = require('helmet')
const cors = require('cors')
const xss = require('xss-clean')
const rateLimiter = require('express-rate-limit')
// CONNECT DB
const connectDB = require('./db/connect')
// ROUTER
const authRouter = require('./routes/auth')
const jobsRouter = require('./routes/jobs')
// error handler
const notFoundMiddleware = require('./middleware/not-found')
const errorHandlerMiddleware = require('./middleware/error-handler')
// token verification to access secured route
const authMiddleware = require('./middleware/authentication')
app.use(express.json())
// for heroku we need to do this
app.set('trust proxy', 1)
app.use(
  rateLimiter({
    windowMs: 15 * 60 * 1000, // 15 minutes
    max: 100, // max 100 requests in 15 mins
  })
)
app.use(helmet())
app.use(cors())
app.use(xss())
// routes
app.use('/api/v1/auth', authRouter)
app.use('/api/v1/jobs', authMiddleware, jobsRouter)
app.get('/api/v1', (req, res) => {
  const x = 5
  console.log('The value of x is', x)
  res.send('home')
})
app.use(notFoundMiddleware)
app.use(errorHandlerMiddleware)
const port = process.env.PORT || 3000
const start = async () => {
  try {
    await connectDB(process.env.MONGO_URI)
    app.listen(port, () =>
      console.log(`Server is listening on port ${port}...`)
    )
  } catch (error) {
    console.log(error)
  }
}
start()
Step 14 (Deploy - Heroku or any other cloud provider)
Great! Congrats, you've come a long way. It's now time to deploy our server. Let's use Heroku for this (it has a free tier, so don't worry).
Let's deploy πππππ
Make a copy of the jobs-api/starter folder and put it on the desktop. I've called the project copy 06-jobs-api-deployable
Open VSCODE with this project.
Go to https://dashboard.heroku.com/apps and login with mr.sandeepamarnath@gmail.com / Heroku@123
Go to the documentation, Node JS, Deploying Node.js Apps on Heroku https://devcenter.heroku.com/articles/deploying-nodejs
Go back to vscode and remove the git folder if there's one already, by typing rm -rf .git
Make sure you have process.env.PORT set up in app.js - const port = process.env.PORT || 3000
Setup a dummy route so that, once the deploy is done, we can hit that route and know our app is deployed
// dummy route
app.get('/', (req, res) => {
  res.send('Jobs API')
})
Now follow the steps in the documentation https://devcenter.heroku.com/articles/deploying-nodejs
In package.json setup node version in engines object
"engines": {
    "node": "14.x"
  },
In package.json, in the start script, change nodemon to node as we are now in prod and not development anymore
  "scripts": {
    "start": "node app.js" // change nodemon to node
  },
Then specify a Procfile - Heroku looks at the Procfile first - https://devcenter.heroku.com/articles/deploying-nodejs#specifying-a-start-script
Create a file and name it Procfile (no extension needed)
Add this line
web: node app.js
Follow John's video 202. Deploy on Heroku for more info
Create routes in postman for login and get all jobs using deployed url. Once they work fine, you're good to go
Step 15 (Setup the documentation - Swagger UI)
Writing the docs directly in Swagger takes a lot more effort. Instead, we can generate them from Postman.
Go to Postman and make sure all the routes have the same base URL
Click on the 3 dots and export (you can change the file name, but it should be .json)

Now we cannot directly pass the exported Postman docs into Swagger UI. The data from Postman needs to be converted into something Swagger understands. For this we use APIMATIC. We need to sign up at https://www.apimatic.io/dashboard with mr.sandeepamarnath@gmail.com / Apimatic@123
In APIMATIC, click on import and choose the json file from the Postman export. Don't pay attention to the warnings you get while importing
Then click Edit and edit how you need. Watch video 207. APIMATIC Setup
Step 16 (Add swagger to our app)
Once the swagger UI is working, then you need to have two packages in your package.json
swagger-ui-express - provides swagger to our app
yamljs - converts our postman json to something swagger can understand
Once you have installed these 2 packages, create a file with the extension .yaml. In my case I'll create swagger.yaml and paste into it the yaml you produced in step 15 (the yaml you got from the APIMATIC export).
Back in our app.js we need to do the following
require swagger-ui-express, yamljs packages
Load the yaml file and then pass it on to swagger UI
Set up a route to show the documentation like this
// swagger
const swaggerUI = require('swagger-ui-express')
const YAML = require('yamljs')
const swaggerDocument = YAML.load('./swagger.yaml')
// Home route
app.get('/', (req, res) => {
  res.send('<h1>Jobs API</h1><a href ="/api-docs">API Documentation</a>') // uses below route
})
// Document route
app.use('/api-docs', swaggerUI.serve, swaggerUI.setup(swaggerDocument))
Run these commands to update PROD (Heroku)
git add -A
git commit -m "swagger docs added" 
git push heroku master
Step 17 (PAT YOUR BACK)
Great! Congrats! You've completed the project. Whenever you want to come back to node-express in the future and do some projects, I suggest you go through this entire page of notes once (optional), and take a look at the previous project notes (JWT) and this project (Jobs). You should be good.
β
 4. File upload project
In this project, let's upload files to our own server or to a cloud service called Cloudinary. We also use the express-fileupload library here.

Step 1 - Product Model
Let's define 3 fields in Product Model, name, price and then product image.
const mongoose = require('mongoose')
const ProductSchema = new mongoose.Schema({
  name: {
    type: String,
    required: true,
  },
  price: {
    type: Number,
    required: true,
  },
  image: {
    type: String,
    required: true,
  },
})
module.exports = mongoose.model('Product', ProductSchema)
Step 2 - Product and Upload Controller
Let's setup two controllers. ProductController and UploadController.
const Product = require('../models/Product')
const { StatusCodes } = require('http-status-codes')
const createProduct = async (req, res) => {
  const product = await Product.create(req.body)
  res.status(StatusCodes.OK).json({ product })
}
const getAllProducts = async (req, res) => {
  const products = await Product.find({})
  res.status(StatusCodes.OK).json({ products })
} 
module.exports = {
  createProduct,
  getAllProducts,
}
const { StatusCodes } = require('http-status-codes')
const uploadProductImage = async (req, res) => {
  res.send('upload product image')
}
module.exports = {
  uploadProductImage,
}
Now why do we need the uploadsController.js file and the uploadProductImage controller? Before we answer this question, let's first set up the Products route
Step 3 - Product Route
const express = require('express')
const router = express.Router()
const {
  getAllProducts,
  createProduct,
} = require('../controllers/productController')
const { uploadProductImage } = require('../controllers/uploadsController')
router.route('/').get(getAllProducts).post(createProduct)
router.route('/uploads').post(uploadProductImage)
module.exports = router
Step 4 - App.js setup
require('dotenv').config()
require('express-async-errors')
const express = require('express')
const app = express()
app.use(express.json()) // to have access to req.body
// database
const connectDB = require('./db/connect')
// product router
const productRouter = require('./routes/productRoutes')
// error handler
const notFoundMiddleware = require('./middleware/not-found')
const errorHandlerMiddleware = require('./middleware/error-handler')
app.get('/', (req, res) => {
  res.send('<h1>File Upload Starter</h1>')
})
app.use('/api/v1/products', productRouter)
// middleware
app.use(notFoundMiddleware)
app.use(errorHandlerMiddleware)
const port = process.env.PORT || 3000
const start = async () => {
  try {
    await connectDB(process.env.MONGO_URI)
    app.listen(port, () =>
      console.log(`Server is listening on port ${port}...`)
    )
  } catch (error) {
    console.log(error)
  }
}
start()
Step 5 - Postman setup
Once the basic setup is done let's now setup the routes in postman. We will have 3 routes
For Image upload (POST)
For creation of product (POST)
To get all products (GET)
Step 6 - Why we need uploadImage controller?
Ok, it's time to answer this question. What do we send as the image path string in the Product? We don't know the path of the image we will be uploading, correct?
The idea is: we first need to put the image on the server OR on Cloudinary (store the image in the cloud), then get the path of that image and use that path as the Product's image string.
So to upload it to the cloud or server first, we need this uploadImage controller/route.
Steps
Upload an image to the server or cloud (cloudinary) and get the path for this image
Use this path to create product
Step 7 - Upload image
Now let's focus on /products/uploads route to upload the image and get the path before creating the product.
Let's upload an image / file from postman

Now if I upload an image as shown above and click the Send button in Postman, and in our controller we console.log req, we see nothing in the body. So how can we get the image we uploaded? We need to use an additional package called express-fileupload to grab that image data and parse it.


Now that we have access to the file that got uploaded, we need to do two things
We need to move this file to our server (any folder) Or cloudinary
Also, we need this image to be publicly available (we need to store in a folder that is publicly available - example public folder)

Also, note that in the parsed file object we have an mv function (the last key in image) that helps us move the image to any other folder




So now, how do we create products?
Upload image in Upload Image path
Once the image is uploaded, you will get the path
Copy that path and create the product


Step 8 - Test on Front-end

So when you click Choose File, the /products/uploads route is called giving us back the path of the image that we can use when we Create Product (when we click Add Product)


Step 9 - Error Checks
Before we explore Cloudinary, let's first check whether the uploaded file is an image, whether it has a proper size and so on.
const { StatusCodes } = require('http-status-codes')
const path = require('path')
const CustomError = require('../errors')
const uploadProductImage = async (req, res) => {
  if (!req.files) {
    throw new CustomError.BadRequestError('No File uploaded')
  }
  const productImage = req.files.image
  if (!productImage.mimetype.startsWith('image')) {
    throw new CustomError.BadRequestError('Please upload an image')
  }
  const maxSize = 1024 * 1024 // 1 MB
  if (productImage.size > maxSize) {
    throw new CustomError.BadRequestError(
      'Please upload an image smaller than 1Mb'
    )
  }
  const imagePath = path.join(
    __dirname,
    '../public/uploads/' + `${req.files.image.name}`
  )
  await productImage.mv(imagePath)
  res
    .status(StatusCodes.OK)
    .json({ image: { src: `/uploads/${productImage.name}` } })
}
module.exports = {
  uploadProductImage,
}
Step 10 - Cloudinary Setup
Instead of storing images on the server like we did until now, we can use a popular cloud option like Cloudinary https://cloudinary.com/ to store the images in the cloud. The benefit is that the cloud can serve from many geographical locations, so clients get the image much faster.
Setup cloudinary in app.js
npm install cloudinary
require('dotenv').config()
require('express-async-errors')
const express = require('express')
const app = express()
const fileUpload = require('express-fileupload') // to access files that are uploaded
const cloudinary = require('cloudinary').v2 // store images on cloud
cloudinary.config({
  cloud_name: process.env.CLOUD_NAME,
  api_key: process.env.CLOUD_API_KEY,
  api_secret: process.env.CLOUD_API_SECRET,
  secure: true,
})
app.use(express.static('./public')) // make public folder publicly available
app.use(express.json()) // to have access to req.body - this doesn't give us the uploaded files
app.use(fileUpload())
// database
const connectDB = require('./db/connect')
// product router
const productRouter = require('./routes/productRoutes')
// error handler
const notFoundMiddleware = require('./middleware/not-found')
const errorHandlerMiddleware = require('./middleware/error-handler')
app.get('/', (req, res) => {
  res.send('<h1>File Upload Starter</h1>')
})
app.use('/api/v1/products', productRouter)
// middleware
app.use(notFoundMiddleware)
app.use(errorHandlerMiddleware)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    await connectDB(process.env.MONGO_URI)
    app.listen(port, () =>
      console.log(`Server is listening on port ${port}...`)
    )
  } catch (error) {
    console.log(error)
  }
}
start()
How to put images to the cloudinary?
const result = await cloudinary.uploader.upload(IMG_PATH)
Now the image path (the param to the cloud upload) is the /public/uploads folder. We take the image uploaded into /public/uploads and add it to the cloud using the line of code above. However, there is a second option that replaces the /public/uploads folder: the express-fileupload package can also provide us a temp folder for storing the image on the server. How do we enable that? Enable useTempFiles: true in the express-fileupload package like this. This would create a tmp folder (highlighted on the left side).
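In app.js that's a small change to the fileUpload middleware:
// tell express-fileupload to write uploads to a temp folder instead of keeping them in memory;
// the temp path is then available on req.files.image.tempFilePath
app.use(fileUpload({ useTempFiles: true }))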


Now the uploaded file will be available right away in tmp folder. We need to give this path to cloudinary.

Where can the user access the image they just uploaded? Or in other words, what do we send back as the image URL?

When we send the response with secure_url like this, this is what we see in postman
const uploadProductImage = async (req, res) => {
  console.log(req.files.image)
  const result = await cloudinary.uploader.upload(
    req.files.image.tempFilePath,
    { use_filename: true, folder: '07-File-Upload' }
  )
  res.status(StatusCodes.OK).json({ image: { src: result.secure_url } })
}
Step 11 - Clear temp folder after upload
Notice that after each upload from the front-end or Postman, temp files (created before each push to Cloudinary) pile up on the server. We need to clean up after we push the image to Cloudinary
const { StatusCodes } = require('http-status-codes')
const path = require('path')
const CustomError = require('../errors')
const cloudinary = require('cloudinary').v2
const fs = require('fs')
const uploadProductImage = async (req, res) => {
  console.log(req.files.image)
  const result = await cloudinary.uploader.upload(
    req.files.image.tempFilePath,
    { use_filename: true, folder: '07-File-Upload' }
  )
  // this will delete any files in this path
  fs.unlinkSync(req.files.image.tempFilePath)
  res.status(StatusCodes.OK).json({ image: { src: result.secure_url } })
}
module.exports = {
  uploadProductImage,
}
π Well done! Completed the File Upload Project.
β
 5. Stripe API Project
This is the project where we will work on accepting payments with Stripe, which has become a go-to platform for everything related to online payments.
Step 1 (General overview of how online payments work)

In any e-commerce app, once we add items to the cart and click on checkout, we come to this payment page as shown above. The idea is, when we click Pay, we cannot just communicate with Stripe directly from the front-end to charge the amount. That would be highly insecure.
Front-end ---> to stripe = Insecure
Instead, once the Pay button is clicked, we need to communicate with our backend where we send our payment intent.
Frontend ---> Our backend (OR serverless function) ---> Stripe
---> Stripe sends back payment intent to backend ---> We can then proceed with this payment.
So basically, for the front-end to complete a payment with Stripe, it needs a payment intent from Stripe obtained through our backend.
Let's create a stripe account - mr.sandeepamarnath@gmail.com / Rememberstripepassword@123

Step 2 (Understand Stripe Code - Docs) 

Create .env file on starter project in backend and paste this Secret Key and restart the server. Once you're done, let's go to docs and see how to accept online payments https://stripe.com/docs/checkout/quickstart

Select the options shown below. server.js is the Node.js code we need in our controller.

Go to the checkout.html file; there are the script tags you can use for your html. That way you get access to Stripe in the client-side js file.
You can use the checkout.js page to write your client-side js file. There you have to use the publishable key.

The above is the checkout.js file. Notice that we have items (cart)
const items = [{ id: "xl-tshirt" }];
and we are sending this to the create-payment-intent route (to our backend, where we add the code Stripe gave us in server.js)
  const response = await fetch("/create-payment-intent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ items }),
  });
In server.js we have the code for that create-payment-intent route, where we send the clientSecret back to the front-end (checkout.js). Once checkout.js gets the client secret, it can communicate with Stripe.
app.post("/create-payment-intent", async (req, res) => {
  const { items } = req.body;
  // Create a PaymentIntent with the order amount and currency
  const paymentIntent = await stripe.paymentIntents.create({
    amount: calculateOrderAmount(items),
    currency: "cad",
    automatic_payment_methods: {
      enabled: true,
    },
  });
  res.send({
    clientSecret: paymentIntent.client_secret,
  });
});
Step 3 (Using stripe code in our code)
Once you understand what server.js, checkout.js and checkout.html are, let's move on to our code and implement these.
3.1 Our front-end in public folder
Go to public/browser-app.js paste your public key from stripe

3.2 Stripe Controller
Let's work on our controller now in controllers/StripeController.js
We have to set up a route in app.js first. That route's name should match the route used in browser-app.js, since we call this route from browser-app.js and the request comes to our app.js (server). Let's call it /stripe.

require('dotenv').config()
require('express-async-errors')
const express = require('express')
const app = express()
// controller
const stripeController = require('./controllers/stripeController')
// error handler
const notFoundMiddleware = require('./middleware/not-found')
const errorHandlerMiddleware = require('./middleware/error-handler')
app.use(express.json())
app.use(express.static('./public'))
// stripe
app.post('/stripe', stripeController)
app.use(notFoundMiddleware)
app.use(errorHandlerMiddleware)
const port = process.env.PORT || 3000
const start = async () => {
  try {
    app.listen(port, () =>
      console.log(`Server is listening on port ${port}...`)
    )
  } catch (error) {
    console.log(error)
  }
}
start()
const stripe = async (req, res) => {
  console.log(req.body)
  res.send('Stripe post route')
}
module.exports = stripe
Start your server and go to localhost:3000 in your browser; the front-end will then hit the /stripe route, which invokes the stripe controller as shown

Note that the amount here is in cents and not dollars, because Stripe needs the smallest unit of the currency. For example, the t-shirt's price shows as 1999, which means $19.99: Stripe needs it in cents, and 19.99 * 100 = 1999.
Now in the stripeController (backend) we need to do two things
Verify if the cost of the items is actually what front-end is saying
Communicate with Stripe and get the client secret, because the front-end can make the payment directly to Stripe only if it has the client secret, as shown below

Normally in the stripe controller (our backend), we get the cart (purchase in this case) from the front-end, take the ID of each item, check its price in the DB, and confirm that the total matches what the front-end sent us. This is because the front-end can modify the values and we can't rely on them, so we always verify the prices in the backend. If they match, we communicate with Stripe, create a payment intent and get the client secret.
Once we get the client secret, we send it to the front-end, which can then make the payment directly to Stripe. Without this client secret, the front-end cannot make a payment to Stripe.
const stripe = require('stripe')(process.env.STRIPE_API_KEY)
const stripeController = async (req, res) => {
  const { purchase, total_amount, shipping_fee } = req.body
  const calcOrderAmount = () => {
    // IMPORTANT NOTEs
    // normally here, we will communicate with our database and get the price of each item and
    // verify if what front-end is saying the cost of each item is, it's actually true (because front-end can manipulate the items and cost)
    // since this is just a demo project we will not do the verification here by calling db
    // here we will just combine total_amount and shipping_fee that will give me total amount
    return total_amount + shipping_fee
  }
  const paymentIntent = await stripe.paymentIntents.create({
    amount: calcOrderAmount(),
    currency: 'usd',
  })
  res.json({ clientSecret: paymentIntent.client_secret })
}
module.exports = stripeController
Once this is set up, navigate to localhost:3000 in your browser and make a payment ππ

Good job! Done with the Stripe project
β
 6. E-Commerce API π
Buckle up for yet another very large project, the E-commerce API. This involves several features like
Role based authorization
Sending JWT via cookie
Users
Orders
Reviews
Documentation using docgen package
What is MongoDB aggregator and how to use it
And much more
The final documentation looks like this https://e-commerce-api-10.herokuapp.com/
Let's write some steps on how to build this app.
Step 1 (Basic Setup)
Get the starter file from John's repo
It already has the error handling code (Step 2 and Step 12 of the previous project, Jobs API)
In the app js file, define express and then app
Setup the port
Import dotenv and require('dotenv').config()
Setup basic route for '/' and check in the browser if it works
Setup the notFound and errorHandler middlewares
Require express.json()
Setup the DB by defining Mongo_URL in .env file (create this file)
Define the Mongoose connect.js. Note that in version 6 we don't need the optional parameters; it can just be like the sketch below
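A minimal sketch of db/connect.js with Mongoose v6:
// db/connect.js
const mongoose = require('mongoose')

// v6 no longer needs options like useNewUrlParser / useUnifiedTopology
const connectDB = (url) => mongoose.connect(url)

module.exports = connectDB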

Require this in the app js and call it like this

At this point, the app.js file looks like this
const express = require('express')
const app = express()
const connectDB = require('./db/connect')
require('dotenv').config()
// make sure this express-async-errors require is above the authRouter import. Else the code in the controllers (inside authRouter) is not covered, so it won't work.
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
//// Middlewares and Routes
app.use(express.json()) // used to get req.body data for post reqs
// Routes
// Basic Route
app.get('/', (req, res) => {
  res.send('E-Commerce API Home page')
})
// app.use('/api/v1/auth',)
app.use(notFoundMW)
app.use(errorHandlerMW)
////
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()
Step 2 (Define logging middleware)
Until now, we didn't do any logging. We can use the morgan package https://www.npmjs.com/package/morgan to handle this in our app
Require the morgan package
Define a middleware and use it with the tiny option to start with
We'll explore different options soon
Here's how it looks like in app.js
const morgan = require('morgan')
app.use(morgan('tiny'))
So once you add this package like this and call any route, for example go to localhost:5000/ which makes a GET request to route '/', this is what you see

This helps you debug nicely, as it shows in the console which route is being hit. Let's say you hit a non-existent route /apples; it will show up right away and you know what's going on

Step 3 (Auth setup)
In this project, we will be sending the JWT in cookies. Also, we will have role-based authorization: only an ADMIN user can CRUD Products, while USERs CRUD Orders and Reviews.
3.1 Create User Model
Create models folder and User.js file
Create the schema with name, email, password (all type: String)
Export mongoose model
const mongoose = require('mongoose')
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name property'],
    minLength: 2,
    maxLength: 50,
  },
  email: {
    type: String,
    unique: true,
    required: [true, 'Email address is required'],
    match: [
      /^\w+([\.-]?\w+)*@\w+([\.-]?\w+)*(\.\w{2,3})+$/,
      'Please fill a valid email address',
    ], // we will use a validator package later to validate this
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minLength: 3,
  },
  role: {
    type: String,
    enum: ['admin', 'user'],
    default: 'user',
  },
})
const UserModel = mongoose.model('User', UserSchema)
module.exports = UserModel
Setup Custom validator
As you can see above, for the email we are using the match option and doing the email validation manually. In mongoose, we can also do custom validation (write custom validators if the built-in ones are not enough)
validate is a property which we can use on any of the fields. It can have two properties
validator function
message
For the validator function, we could write our own validation, but instead we can use an npm package called validator
So, let's modify the above code like this for User model
const mongoose = require('mongoose')
const validator = require('validator')
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name property'],
    minLength: 2,
    maxLength: 50,
  },
  email: {
    type: String,
    unique: true,
    required: [true, 'Email address is required'],
    validate: {
      validator: validator.isEmail,
      message: 'Please provide a valid email',
    },
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minLength: 3,
    maxLength: 40,
  },
  role: {
    type: String,
    enum: ['admin', 'user'],
    default: 'user',
  },
})
const UserModel = mongoose.model('User', UserSchema)
module.exports = UserModel
3.2 Create Auth Controller
Create a controllers folder and add the authController file
Export the (register, login, logout) functions
Each just does res.send('some string value') for now
const register = async (req, res) => {
  res.send('Register User')
}
const login = async (req, res) => {
  res.send('Login User')
}
const logout = async (req, res) => {
  res.send('Logout User')
}
module.exports = { register, login, logout }
3.3 Create Auth Routes
Create routes folder
Setup the authRoutes file and import all controllers
Setup three routes - post('/register'), post('/login'), get('/logout')
const { register, login, logout } = require('../controllers/authController')
const express = require('express')
const router = express.Router()
// showing both ways here but I prefer router.post syntax when there's a single route
router.route('/register').post(register)
router.post('/login', login)
router.route('/logout').get(logout)
module.exports = router
3.4 Postman - Test the routes we created
Before we test these routes: in the last project, you might have noticed that after deploying the app, when you wanted to test the deployed version you had to change the local URL to the PROD URL in all the routes, which is not ideal. Instead, we can create a new Environment in Postman and define variables there. If we want to switch to production, we select the prod environment, and that way we don't have to manually change the URL or any other values.

3.5 Register Controller
Create user
Send response with entire user (only while testing)
Check if email already in use. We can check in two ways (Schema level and controller level). Let's do both here,
To check at the schema level, we already added unique: true which does it for us
To check at the controller level, we need to check if a user already exists with this email.
const register = async (req, res) => {
  const { email } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  const user = await User.create(req.body)
  // only while testing we send this entire user object
  res.status(StatusCodes.CREATED).json({ user })
}
User Role
We have two types of user roles
User
Admin - has more privileges, like
View all users' accounts, modify them if necessary
Do all CRUD operations on Products, whereas a user can only view the products but not modify them
In our current setup, we can send the role as admin or user from Postman. But that's not a good way of doing it. Why? Because, if you think about it, we don't want just anybody out there to be an admin and get more privileges.
Let's take this reg page as an example, there's no user role selection option anywhere.

We don't want people to register as admins from this page. Not only should the front-end be secure by not offering a role option, the backend should be secure too. If you look at the current implementation, the register controller looks like this, where we have User.create(req.body)
 const register = async (req, res) => {
  const { email } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  const user = await User.create(req.body) // whatever we pass in req.body gets into the DB
  // only while testing we send this entire user object
  res.status(StatusCodes.CREATED).json({ user })
}
We need to restrict this and not put the role into the DB even if the user provides it via the front-end or Postman. We can modify our code like this
const register = async (req, res) => {
  const { name,email,password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  const user = await User.create({name,email,password}) // now the role will not be sent to DB and the default User role will be used and sent to db
  // only while testing we send this entire user object
  res.status(StatusCodes.CREATED).json({ user })
}
Setting the very first user as admin
We can also use another strategy here: set up the first user as admin and all other users as regular users. We can use the countDocuments method on the User model; it counts all documents, and if the count is 0 then this user is the first one and can be the admin.
const register = async (req, res) => {
  const { name, email, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  // let's setup first user as admin
  const isFirstUser = (await User.countDocuments({})) === 0
  const role = isFirstUser ? 'admin' : 'user'
  const user = await User.create({ name, email, password, role })
  // only while testing we send this entire user object
  res.status(StatusCodes.CREATED).json({ user })
}
We can take multiple approaches to make a user admin
We manually go to our DB and change the role to some users as 'admin'
We can set first user as admin as shown above programmatically
We let front-end/postman send us the role (BAD PRACTICE)
3.6 Hash passwords
Let's hash the passwords and store them in the DB. As you know, we will be using the bcryptjs package to do this. In our previous project, we first did it in the register controller, and once you got a grasp of it there, we made use of the pre('save') hook on the Schema to hash the password. Let's directly use pre in this case
UserSchema.pre('save', async function () {
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})
3.7 Compare Passwords
We've set up the password hashing functionality. While we are here, let's also write the functionality to compare passwords. We can write an instance method (a method/function on the User Schema) to compare the password. By doing this, we can use this method in our Login controller later.
UserSchema.methods.comparePasswords = async function (candidatePassword) {
  const isMatch = await bcrypt.compare(candidatePassword, this.password)
  return isMatch
}
// the above method can be used in login controller like this 
// user.comparePasswords(passwordStr), which we will see shortly
So the entire model/User.js looks like this
const mongoose = require('mongoose')
const validator = require('validator')
const bcrypt = require('bcryptjs')
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name property'],
    minLength: 2,
    maxLength: 50,
  },
  email: {
    type: String,
    unique: true,
    required: [true, 'Email address is required'],
    validate: {
      validator: validator.isEmail,
      message: 'Please provide a valid email',
    },
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minLength: 3,
    maxLength: 40,
  },
  role: {
    type: String,
    enum: ['admin', 'user'],
    default: 'user',
  },
})
// pre('save') hook to hash the password
// this runs before the document is saved to the DB
UserSchema.pre('save', async function () {
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})
// this is an instance method we can create on the schema.
// Later it can be used in a controller as user.comparePasswords(passwordStr)
UserSchema.methods.comparePasswords = async function (candidatePassword) {
  const isMatch = await bcrypt.compare(candidatePassword, this.password)
  return isMatch
}
const UserModel = mongoose.model('User', UserSchema)
module.exports = UserModel
3.8 Issue JWT (JSON Web Token)
Once the user gets registered, we need to send back the token so that, using that token, the user can query the protected routes. If you take a look at the Jobs API (previous project), we sent back the token directly in the response (Step 3 (Register)) in the Register and Login controllers.
Then the client (front-end/postman) used to send this token in subsequent requests to access the protected route (Jobs) Step 5 (Verify token - Auth middleware)
Steps we are going to take
[] require 'jsonwebtoken' package
[] create jwt - jwt.sign(payload,secret,options)
[] verify jwt - jwt.verify(token,secret)
[] add variables in .env JWT_SECRET=jwtSecret and JWT_LIFETIME=1d
[] restart the server !!!!
[] refactor code, create jwt functions in utils
[] refactor cookie code
[] setup func attachCookiesToResponse
[] accept payload(res, tokenUser)
[] create token, setup cookie
[] optionally send back the response
Let's initially do this setup in register controller and then move this logic of token creation into utils folder
const register = async (req, res) => {
  const { name, email, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  // let's setup first user as admin
  const isFirstUser = (await User.countDocuments({})) === 0
  const role = isFirstUser ? 'admin' : 'user'
  const user = await User.create({ name, email, password, role })
  // generate jwt token. TODO: Move this logic to utils folder soon
  const tokenUser = { userId: user._id, name: user.name, role: user.role }
  const token = jwt.sign(tokenUser, 'jwtSecret', { expiresIn: '1d' })
  res.status(StatusCodes.CREATED).json({ tokenUser, token })
}
Let's move our token creation code to the utils folder and also setup the token verification which we can later use it in authMiddleware.
const jwt = require('jsonwebtoken')
const createJWT = ({ payload }) => {
  const token = jwt.sign(payload, process.env.JWT_SECRET, {
    expiresIn: process.env.JWT_LIFETIME,
  })
  return token
}
// we will use this token validation later in auth mw when we start querying protected routes
const isTokenValid = ({ token }) => jwt.verify(token, process.env.JWT_SECRET)
module.exports = { createJWT, isTokenValid }
Let's set up index.js so that we can import by targeting the utils folder
const { createJWT, isTokenValid } = require('./jwt')
module.exports = { createJWT, isTokenValid }
Let's now see how the modified authController looks for the register function
const register = async (req, res) => {
  const { name, email, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  // let's setup first user as admin
  const isFirstUser = (await User.countDocuments({})) === 0
  const role = isFirstUser ? 'admin' : 'user'
  const user = await User.create({ name, email, password, role })
  const tokenUser = {
    userId: user._id,
    name: user.name,
    role: user.role,
  }
  const token = createJWT({ payload: tokenUser }) // this is now called to create token
  res.status(StatusCodes.CREATED).json({ user:tokenUser, token })
}
3.9 Send JWT cookie
Now that we have seen the creation of the JWT, let's look at an alternative way to send it.
In the Jobs API (previous project) we sent the token in the response body, and the front-end stored it in local storage. But we can also send our JWT in a cookie attached to the response. On the front-end we then don't have to store it in local storage; it gets attached to the browser's cookies and we can see it in the browser. On the next request, the browser automatically sends the JWT cookie back to us, without the front-end having to attach it manually.
We will soon see the gotchas of both approaches, but first let's see how to send the JWT in a cookie. We will create a cookie, attach the JWT to it, and attach the cookie to our response.
Reference - https://expressjs.com/en/5x/api.html#res.cookie

const register = async (req, res) => {
  const { name, email, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new customError.BadRequestError('Email already exists')
  }
  // let's setup first user as admin
  const isFirstUser = (await User.countDocuments({})) === 0
  const role = isFirstUser ? 'admin' : 'user'
  const user = await User.create({ name, email, password, role })
  const tokenUser = {
    userId: user._id,
    name: user.name,
    role: user.role,
  }
  const token = createJWT({ payload: tokenUser })
  const oneDayinMillisec = 24 * 60 * 60 * 1000
  // send token via cookie
  res.cookie('token', token, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDayinMillisec),
  })
  res.status(StatusCodes.CREATED).json({ user:tokenUser }) // removed token from response here
}
3.10  Cookie parsing 
We saw how we can send the token in the cookie, now let's take a look at how we can parse the cookie we get back from client / postman in our request. We will check if front-end really attaches the cookie having token to it's requests.
Adding a cookie to the response, as we saw in 3.9, is easy with res.cookie. But to read the cookie that the client attaches to a request, we need to install an extra package named cookie-parser.
const express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser') // COOKIE PARSER
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
//// Middlewares and Routes
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser()) // used to parse the cookies sent from the client(front-end) or postman
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.cookies) // this is available because of the cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
////
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()
3.11 Refactor cookie code (3.10)
Once we are familiar with attaching the token to the cookie and parsing the cookie we get back in the request, let's tweak that code and move the cookie-attaching logic into utils/jwt.js (a sketch is shown below).
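Here is a minimal sketch of the refactored utils/jwt.js, pieced together from the snippets we already have (the secure and signed options are added later in 3.12):
// utils/jwt.js
const jwt = require('jsonwebtoken')
const createJWT = ({ payload }) => {
  const token = jwt.sign(payload, process.env.JWT_SECRET, {
    expiresIn: process.env.JWT_LIFETIME,
  })
  return token
}
const isTokenValid = ({ token }) => jwt.verify(token, process.env.JWT_SECRET)
// moved out of the controller: create the token and attach it to the response cookie
const attachCookiesToResponse = ({ res, user }) => {
  const token = createJWT({ payload: user })
  const oneDayinMillisec = 24 * 60 * 60 * 1000
  res.cookie('token', token, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDayinMillisec),
  })
}
module.exports = { createJWT, isTokenValid, attachCookiesToResponse }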


3.12 Secure and signed flags

Let's think about sending cookies from the browser to the server. We are currently sending them over HTTP, which is not secure. What does this cookie contain? A token. If a hacker intercepts the cookie on its way to the server, they get access to the token and can use it to hit the server's protected routes on behalf of the original client.
To avoid this we need a secure way (over HTTPS) to send cookies from the browser to the server. So we set the secure option in the production environment; it doesn't matter in the dev environment.
If the signed option is enabled then the server can detect whether the client modified the cookie the server had sent. We then access it via req.signedCookies instead of req.cookies.
// in app.js
app.use(cookieParser(process.env.JWT_SECRET)) 
// in app.js to access it
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of the cookie-parser package
  res.send('E-Commerce API Home page')
})
// in jwt.js
const attachCookiesToResponse = ({ res, user }) => {
  const token = createJWT({ payload: user })
  const oneDayinMillisec = 24 * 60 * 60 * 1000
  res.cookie('token', token, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDayinMillisec),
    secure: process.env.NODE_ENV === 'production',
    signed: true, // this will sign the cookie
  })
}3.13 Login Route 
Steps we need to take
[] check if email and password exist, if one missing return 400 - Bad Request
[] find user, if no user return 401 - Unauthorized or unauthenticated
[] check password, if does not match return 401
[] if everything is correct, attach cookie and send back the same response as in register
const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password)
    throw new customError.BadRequestError('Please provide email and password')
  const user = await User.findOne({ email })
  if (!user) throw new customError.UnauthenticatedError('Invalid credentials.')
  const isPasswordCorrect = await user.comparePasswords(password)
  if (!isPasswordCorrect)
    throw new customError.UnauthenticatedError('Invalid credentials.')
  const tokenUser = { userId: user._id, name: user.name, role: user.role }
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user:tokenUser })
}3.14 Logout Route
Having a cookie with the token means the user is logged in. So in the logout route we send some other string in the cookie instead of the token, so that once logged out the user no longer sends us a valid token on the next request.
What if he already noted down the token somewhere and sends it in the cookie after being logged out? To solve this we also expire the cookie immediately, so that when the user uses the same token on the next request it is no longer valid.
Steps
[] set token cookie equal to some string value - maybe logout
[] set expires:new Date(Date.now())
const logout = async (req, res) => {
  res.cookie('token', 'logout', {
    httpOnly: true,
    expires: new Date(Date.now()),
  })
  res.status(StatusCodes.OK).json({ msg: 'user logged out' })
}3.15 Create-React-APP with express
In Udemy John's Video 262. Cookies - Big picture and "Gotchas", he has explained how to connect CRA with node-express.
Issues when using Create-React-APP with Express
Both are on different domains. Create-React-APP is on localhost:3000 and express is on localhost:5000
When we try to access a route from CRA, it gives a CORS error. You know that the way to solve this is by adding cors() to the server so that the server's resources become available to other origins.
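A minimal sketch of that fix, assuming the cors package is installed (npm install cors):
// app.js - hedged sketch: allow cross-origin requests (e.g. from the CRA dev server on localhost:3000)
const express = require('express')
const cors = require('cors')
const app = express()
app.use(cors())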
The other issue that occurs when dealing with different domains is the cookies that we send in the response from our server. Token in this case. This cookie from our express server won't show up in the CRA front-end. The way to get around it is by proxy configuration in CRA package.json file.
We need to setup proxy pointing to localhost:5000, our server in CRA package.json.
"proxy" : "http:localhost:5000"Now where ever we try to access the server route, for example,
http://localhost:5000/api/v1/xyzthen we can just do it this way in CRA,
      const fetchRoute = async () => {
            const url = `/api/v1/xyz` // no need to write the localhost:5000 prefix, it's specified in the proxy already
            await fetch(url)
      }Step 4 (Auth Middleware)
FYI, before Step 4, I built step 5 and then came to step 4. It's ok for you to go in order but don't get confused!! π
Time to implement the authentication middleware to give access to protected routes. Next, we will be implementing the User routes and, as an admin, we might have to get all the users' info. For this, the user who is requesting this info (all users or a single user) must be an admin. How can we test that? We need a middleware (auth mw) to check whether the token is valid and whether the requesting user is actually an admin. So, two parts to this:
authenticationMW should implement a method to check the token if valid, and if it is valid then pass the payload we get from validating the token to next route
Along with checking the token is valid we also need to check if the token belongs to admin user and not the normal user
4.1 Authenticate User (Validate token)
Authentication MW - Validating the token and passing the payload to next route
const customErrors = require('../errors')
const { isTokenValid } = require('../utils')
const authenticateUser = async (req, res, next) => {
  const { token } = req.signedCookies
  console.log('The token is', token)
  if (!token)
    throw new customErrors.UnauthenticatedError('Authentication Invalid')
  try {
    // we get payload (that we stored in token while creating jwt) if token is valid
    const { name, userId, role } = isTokenValid({ token })
    req.user = { name, userId, role }
    next()
  } catch (error) {
    throw new customErrors.UnauthenticatedError('Authentication Invalid')
  }
}
module.exports = authenticateUser4.2 Authorize permissions
Authorization MW - Check if the user (attached to req by authenticateUser above) is an admin
const authorizePermissions = async (req, res, next) => {
  console.log('Coming from authorize permissions', req.user)
  if (req.user.role !== 'admin') {
    throw new customErrors.UnAuthorizedError( // creating this error below
      'Unauthorized to access this route'
    )
  }
  next()
}4.3 Unauthorized error 
We created Unauthorized error class which didn't exist before (similar to other errors)
const { StatusCodes } = require('http-status-codes')
const CustomAPIError = require('./custom-api')
class UnAuthorizedError extends CustomAPIError {
  constructor(message) {
    super(message)
    this.statusCode = StatusCodes.FORBIDDEN
  }
}
module.exports = UnAuthorizedErrorLet's now use authenticateUser and authorizePermissions as shown in the User route. We will create the User routes in the next section, so you can come back to this after setting up the User routes (section 5 below).
4.4  Using authentication MW 
const express = require('express')
const router = express.Router()
const {
  getAllUsers,
  getSingleUser,
  showCurrentUser,
  updateUser,
  updateUserPassword,
} = require('../controllers/userController')
const {
  authenticateUser,
  authorizePermissions,
} = require('../middleware/authentication')
router.route('/').get(authenticateUser, authorizePermissions, getAllUsers) // first authenticates user, then checks if admin and then allows to proceed to access protected route
router.route('/showMe').get(showCurrentUser)
router.route('/updateUser').patch(updateUser)
router.route('/updateUserPassword').post(updateUserPassword)
// this route with :id must be placed last so that it won't interfere with the routes mentioned above it
router.route('/:id').get(authenticateUser, getSingleUser)
module.exports = router4.5 Authorize permissions for multiple roles
In middlewares/authentication.js, we have this code which checks if the role is admin.
if (req.user.role !== 'admin') 
What if we have multiple roles? Do we need to hardcode and check every role? No, we can pass the allowed roles as a param from the user router and check req.user.role against them, like this
const authorizePermissions = (roles) => {
// we are now returning a function 
  return async (req, res, next) => {
    if (!roles.includes(req.user.role)) {
      throw new customErrors.UnAuthorizedError(
        'Unauthorized to access this route'
      )
    }
    next()
  }
}And it's called in the user router like this. The commented code was used earlier
const express = require('express')
const router = express.Router()
const {
  getAllUsers,
  getSingleUser,
  showCurrentUser,
  updateUser,
  updateUserPassword,
} = require('../controllers/userController')
const {
  authenticateUser,
  authorizePermissions,
} = require('../middleware/authentication')
// router.route('/').get(authenticateUser, authorizePermissions, getAllUsers)
router
  .route('/')
  .get(authenticateUser, authorizePermissions(['admin']), getAllUsers) // we can add multiple roles here, like admin, owner and so on.
router.route('/showMe').get(showCurrentUser)
router.route('/updateUser').patch(updateUser)
router.route('/updateUserPassword').post(updateUserPassword)
// this route with :id must be placed last so that it won't interfere with the routes mentioned above it
router.route('/:id').get(authenticateUser, getSingleUser)
module.exports = routerNow we are not hardcoding the role. Tomorrow, if I have more roles like owner, superuser and so on, I can include them in the array and don't have to change my authorizePermissions middleware.
Step 5 (User Routes)
Uff! We completed the auth part. Now let's think from an admin's perspective! If you were an admin, wouldn't you like to have read/write access to all the users we just created? Of course. So as an admin, we should be able to
Get All Users
Get Single User
Show Current logged in user
Update User info
Update User Password
Let's set up some boilerplate for these routes; here are the steps we will take
Steps
[] add userController file
[] export (getAllUsers,getSingleUser,showCurrentUser,updateUser,updateUserPassword) functions
[] res.send('some string value')
[] setup userRoutes file
[] import all controllers
[] setup just one route - router.route('/').get(getAllUsers);
[] import userRoutes as userRouter in the app.js
[] setup app.use('/api/v1/users', userRouter)
Code
const getAllUsers = async (req, res) => {
  res.send('Get All Users route')
}
const getSingleUser = async (req, res) => {
  res.send('Get Single User route')
}
const showCurrentUser = async (req, res) => {
  res.send('Show current user route')
}
const updateUser = async (req, res) => {
  res.send('Update user route')
}
const updateUserPassword = async (req, res) => {
  res.send('update User Password route')
}
module.exports = {
  getAllUsers,
  getSingleUser,
  showCurrentUser,
  updateUser,
  updateUserPassword,
}const express = require('express')
const router = express.Router()
const {
  getAllUsers,
  getSingleUser,
  showCurrentUser,
  updateUser,
  updateUserPassword,
} = require('../controllers/userController')
router.route('/').get(getAllUsers)
router.route('/showMe').get(showCurrentUser)
router.route('/updateUser').patch(updateUser)
router.route('/updateUserPassword').post(updateUserPassword)
// this route with :id must be placed last so that it won't interfere with the routes mentioned above it
router.route('/:id').get(getSingleUser)
module.exports = router
const express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser')
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
const userRouter = require('./routes/userRoute')
//// Middlewares and Routes
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser(process.env.JWT_SECRET)) // used to parse the cookies sent from the client(front-end) or postman
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of the cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
// user router
app.use('/api/v1/users', userRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()5.1 Get All Users and Get Single User
const { StatusCodes } = require('http-status-codes')
const customErrors = require('../errors')
const User = require('../models/User')
const getAllUsers = async (req, res) => {
  const users = await User.find({ role: 'user' }).select('-password')
  res.status(StatusCodes.OK).json({ users })
}
const getSingleUser = async (req, res) => {
  const { id } = req.params
  const user = await User.findOne({ _id: id }).select('-password')
  if (!user)
    throw new customErrors.NotFoundError(`User not found with ID ${id}`)
  res.status(StatusCodes.OK).json({ user })
}
// we will work on the below later
const showCurrentUser = async (req, res) => {
  res.send('Show current user route')
}
const updateUser = async (req, res) => {
  res.send('Update user route')
}
const updateUserPassword = async (req, res) => {
  res.send('update User Password route')
}
module.exports = {
  getAllUsers,
  getSingleUser,
  showCurrentUser,
  updateUser,
  updateUserPassword,
}5.2 Show Current User
Let's say a user logs in, we send him the token. With that he can further access the protected route. We check the token and if it's valid we allow him to access it.
What if he refreshes the page? He would still be sending the token in a cookie, and our authentication middleware checks the token and gives us back the user. We can send this user back without querying the DB (validating the token in the auth middleware already gets us the current user), and the client can use this route to display the username and so on in the front-end. He must of course be authenticated to access this route.
// router/userRouter.js (partial file content)
router.route('/showMe').get(authenticateUser, showCurrentUser)
// the above route hits this code (authenticate mw)
// middleware/authenticate.js (partial file content)
const authenticateUser = async (req, res, next) => {
  const { token } = req.signedCookies
  if (!token)
    throw new customErrors.UnauthenticatedError('Authentication Invalid')
  try {
    // we get payload (that we stored in token while creating jwt) if token is valid
    const { name, userId, role } = isTokenValid({ token })
    req.user = { name, userId, role }
    next()
  } catch (error) {
    throw new customErrors.UnauthenticatedError('Authentication Invalid')
  }
}
// if the above is successful where token would be present(meaning the user is logged in)
// , then we get the req.user passed to next request which would be showMe route
// to show the current user. This is the first route that hits when user refreshes the page
// controllers/UserController.js (partial file content)
const showCurrentUser = async (req, res) => {
  res.status(StatusCodes.OK).json({ user: req.user })
}5.3  Update Password
Let's say a logged-in user needs to update his password, then we need this route.
The save method runs the schema validation before saving. If the new password length falls outside the limits defined in the validation, it complains.
// routes/userRoute.js (partial file)
router.route('/updateUserPassword').patch(authenticateUser, updateUserPassword)
// controllers/userController.js (partial file)
const updateUserPassword = async (req, res) => {
  const { oldPassword, newPassword } = req.body
  if (!oldPassword || !newPassword) {
    throw new customErrors.BadRequestError('Old or new password is missing')
  }
  const user = await User.findOne({ _id: req.user.userId })
  const isMatch = await user.comparePasswords(oldPassword, user.password)
  if (!isMatch) {
    throw new customErrors.UnauthenticatedError('Invalid Credentials')
  }
  user.password = newPassword
  await user.save() // user.save will run the pre hook (which hashes the password) before saving to db
  // save() method also runs DB validation
  res.status(StatusCodes.OK).json({ message: 'Updated password successfully' })
}5.4 Create TokenUser function
In the authController, we have code to create the tokenUser object. We are repeating this code in both the register and login functions, and we will need it in other places later. So let's put this code in utils.

// Replacing this below code in authController register and login functions
const { attachCookiesToResponse, createTokenUser } = require('../utils')
// const tokenUser = { userId: user._id, name: user.name, role: user.role }
const tokenUser = createTokenUser(user)
/************************************************/
// utils/createTokenUser.js
const createTokenUser = (user) => {
  // this function returns userToken object
  return { userId: user._id, name: user.name, role: user.role }
}
module.exports = createTokenUser
/************************************************/
// utils/index.js
const { createJWT, isTokenValid, attachCookiesToResponse } = require('./jwt')
const createTokenUser = require('./createTokenUser')
module.exports = {
  createJWT,
  isTokenValid,
  attachCookiesToResponse,
  createTokenUser,
}5.5 Update User
Let's update the user now. What properties can we update? The role shouldn't be updated directly. The password is updated via the updateUserPassword route we have already seen. The remaining fields are name and email.
We have two update options
findOneAndUpdate / findByIdAndUpdate
save()
Let's see both
5.5.1 findOneAndUpdate
Notice the last point. Once we update properties like name and email, it would be nicer to also update the tokenUser, attach it to the cookie and send the new token user back in the response. This is because, let's say the user updated the name that was displayed in his welcome message on screen; after the update he should be seeing the new name.
// controllers/userController.js (partial file)
const updateUser = async (req, res) => {
  const { name, email } = req.body
  if (!name || !email) {
    throw new customErrors.BadRequestError('Please provide name and email')
  }
  const user = await User.findOneAndUpdate(
    { _id: req.user.userId },
    { name, email },
    { new: true, runValidators: true }
  )
  const tokenUser = createTokenUser(user)
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
// routes/userRoutes.js (partial)
router.route('/updateUser').patch(authenticateUser, updateUser)5.5.2 save() 
We can update the user using the save method as well, like we did for update password. The steps here would be
find the user using User.findOne()
update the values on this object, for example user.email = email and so on
then call user.save()
This works and updates the email and name, but it would break the password and you will not be able to log in again.
There are some gotchas with this, and knowing them will help us fix the issue.
First, let's see what happens on save(). Let's write code for this
const updateUser = async (req, res) => {
  const { name, email } = req.body
  if (!name || !email) {
    throw new customErrors.BadRequestError('Please provide name and email')
  }
  // findOneAndUpdate doesn't invoke pre save hook
  // const user = await User.findOneAndUpdate(
  //   { _id: req.user.userId },
  //   { name, email },
  //   { new: true, runValidators: true }
  // )
  // save() invokes/calls the pre save hook
  const user = await User.findOne({ _id: req.user.userId }) // findOne (not find) so we get back a single document
  user.name = name
  user.email = email
  
  await user.save() // this calls pre-hook that runs first and then is saved to db
  const tokenUser = createTokenUser(user)
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser })
}/* pre-save hook for User that's called on create user and on user.save()  */
// pre-save hook to hash the password
// this pre will be run before committing to the DB on save() and while creating new user
UserSchema.pre('save', async function () {
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})When you reach the line user.save(), then before saving into the database, it runs the 'pre save' hook shown above. This creates a new salt and hashes the (already hashed) password again, because of which the existing password becomes invalid and the user cannot log in anymore after this update. This is very important to understand and remember.
So what's the fix here?
We have a method called modifiedPaths() on the user document. It gives us the array of paths that we modified/updated. In this case, we updated email and name; the password wasn't updated, so this.modifiedPaths() gives us ['name','email'].
UserSchema.pre('save', async function () {
  console.log('shows what we are modifying', this.modifiedPaths()) // ['name','email']
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})And more interestingly, I can see if the property was modified or not using
console.log(this.isModified('password')) // we get false as this was not modifiedSo, the below code stops us from re-hashing the password when it wasn't changed
UserSchema.pre('save', async function () {
  // console.log('shows what we are modifying', this.modifiedPaths()) // ['name','email']
  // if we didn't modify the password then we just return here
  if (!this.isModified('password')) return 
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})In line 4 above, we say that if password is not modified (which is true in this case as we are just changing email and name) then don't proceed further to create salt and hash, just return.
Also, what about while creating the User?  
While creating the user, this.modifiedPaths() (line 2) would log all the fields, since every field is new when registering a user and is therefore considered modified. So the check on line 4 is false, we don't return early, and the password does get hashed. Nothing to worry about there as well.
5.6 Check Permissions
Things look great so far but there is one pothole. In the getSingleUser route, any logged-in user can currently get the details of another user if he knows that user's ID. We need to avoid this.

We will set up a util function to avoid this, which can also be reused later. In this function we will check the permissions. As params, we pass in req.user (coming from the auth middleware) and the id property we get from the model. This can be the User model (in this case _id) or, in the future, the Review model and so on.
// controllers/userController.js
const getSingleUser = async (req, res) => {
  const { id } = req.params
  const user = await User.findOne({ _id: id }).select('-password')
  if (!user)
    throw new customErrors.NotFoundError(`User not found with ID ${id}`)
  // what are we checking in checkPermissions?
  // Our authenticated user (req.user) and user._id got from DB, if they both are same
  // if they both are same, that means the logged in user is accessing his own info and we can very well give him
  // if they both are not same, that means the logged in user is accessing others' user id. We can give the user back only if the user requesting is an admin. If not we will throw error
  // both user._id and req.params.id hold the same value, but user._id is an ObjectId, so we need toString() on it
  checkPermissions(req.user, user._id)
  res.status(StatusCodes.OK).json({ user })
}
// utils/checkPermissions.js
const customError = require('../errors')
const checkPermissions = (reqUser, resourceUserID) => {
  //   console.log(reqUser) // { name: 'john', userId: '63044830c70600e17b5ac3f3', role: 'user' }
  //   console.log(resourceUserID) //new ObjectId("6304483ac70600e17b5ac3f7")
  //   console.log(typeof resourceUserID) // object
  //   console.log(resourceUserID.toString()) // 6304483ac70600e17b5ac3f7
  // if we are using return, that means we are not stopping the request, we simply return to caller (getSingleUser controller) and then proceed to give back the response
  if (reqUser.userId === resourceUserID.toString()) return
  if (reqUser.role === 'admin') return
    
  // if neither of the above two conditions is met, we throw an error as the request is unauthorized
  throw new customError.UnAuthorizedError('Not authorized to access this route')
}
module.exports = checkPermissionsStep 6 (Products)
Let's now work on creating some products. It should be straightforward from here as we have already learnt most of the concepts. Let's start with the Products model.
6.1 Products Model
Note
How to setup enum values?
// until now we have setup this way
  category: {
    type: String,
    required: [true, 'Please provide product category'],
    enum: ['office', 'kitchen', 'bedroom'],
  },
  
  // We can also setup like this below. We can use this when we need to 
  // provide back message saying the VALUE provided is not supported
  company: {
    type: String,
    required: [true, 'Please provide company category'],
    enum: {
      values: ['ikea', 'liddy', 'marcos'],
      message: '{VALUE} is not supported',
    },
  },const mongoose = require('mongoose')
const ProductSchema = new mongoose.Schema(
  {
    name: {
      type: String,
      trim: true,
      required: [true, 'Please provide product name'],
      maxLength: [100, 'Name cannot be more than 100 characters'],
    },
    price: {
      type: Number,
      required: [true, 'Please provide price'],
      default: 0,
    },
    description: {
      type: String,
      required: [true, 'Please provide product description'],
      maxLength: [1000, 'Description cannot be more than 1000 characters'],
    },
    image: {
      type: String,
      default: '/uploads/example.jpeg',
    },
    category: {
      type: String,
      required: [true, 'Please provide product category'],
      enum: ['office', 'kitchen', 'bedroom'],
    },
    company: {
      type: String,
      required: [true, 'Please provide company category'],
      enum: {
        values: ['ikea', 'liddy', 'marcos'],
        message: '{VALUE} is not supported',
      },
    },
    // we will see how this array of String works when we send data from postman
    colors: {
      type: [String],
      required: true,
    },
    featured: {
      type: Boolean,
      default: false,
    },
    freeShipping: {
      type: Boolean,
      default: false,
    },
    inventory: {
      type: Number,
      required: true,
      default: 15,
    },
    // we will calculate this rating and num of reviews as we build up
    averageRating: {
      type: Number,
      default: 0,
    },
    numOfReviews: {
      type: Number,
      default: 0,
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
  },
  { timestamps: true }
)
module.exports = mongoose.model('Product', ProductSchema)Note: We will see how to calculate averageRating and numOfReviews later here -> Step 10 (Average rating and number of Reviews on Product)
6.2 Products Structure
const createProduct = async (req, res) => {
  res.send('Create Product')
}
const getAllProducts = async (req, res) => {
  res.send('Get All Product')
}
const getSingleProduct = async (req, res) => {
  res.send('Get Single Product')
}
const updateProduct = async (req, res) => {
  res.send('Update Product')
}
const deleteProduct = async (req, res) => {
  res.send('Delete Product')
}
const uploadImage = async (req, res) => {
  res.send('Upload image')
}
module.exports = {
  createProduct,
  getAllProducts,
  getSingleProduct,
  updateProduct,
  deleteProduct,
  uploadImage,
}const express = require('express')
const router = express.Router()
const {
  createProduct,
  getAllProducts,
  getSingleProduct,
  updateProduct,
  deleteProduct,
  uploadImage,
} = require('../controllers/productController')
const {
  authenticateUser,
  authorizePermissions,
} = require('../middleware/authentication')
router
  .route('/')
  .get(getAllProducts)
  .post(authenticateUser, authorizePermissions(['admin']), createProduct)
router
  .route('/uploadImage')
  .post(authenticateUser, authorizePermissions(['admin']), uploadImage)
router
  .route('/:id')
  .get(getSingleProduct)
  .patch(authenticateUser, authorizePermissions(['admin']), updateProduct)
  .delete(authenticateUser, authorizePermissions(['admin']), deleteProduct)
module.exports = routerconst express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser')
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
const userRouter = require('./routes/userRoute')
const productRouter = require('./routes/productRoute')
//// Middlewares and Routes
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser(process.env.JWT_SECRET)) // used to parse the cookies sent from the client(front-end) or postman
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of the cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
// user router
app.use('/api/v1/users', userRouter)
// product router
app.use('/api/v1/products', productRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()6.3 Postman setup for products
Let's setup all the routes in postman for products
Get All Products (GET)
Get Single Product (GET)
Create Product (POST) - Admin route
Update Product (PATCH) - Admin route
Delete Product (DELETE) - Admin route
Upload Image (POST) - Admin route
Configure these routes in Postman and log in as admin. You should see the appropriate responses from the controllers. Just to test, log in as a normal user; all the admin routes should then give you the message 'Unauthorized to access this route'.
6.4 Product CRUD
6.4.1 Create Product
Once we have all the basic setup working, let's focus on creating a product. We get all the details in req.body except for the user. Remember, the Product model has a user property (the creator). Similar to the Jobs API, req.user (coming from the authentication middleware) has the user info; we attach that user to req.body and then create the product.


// controllers/productController.js - partial file
const createProduct = async (req, res) => {
  req.body.user = req.user.userId // req.user.userId comes from the auth mw
  const product = await Product.create(req.body)
  res.status(StatusCodes.CREATED).json({ product })
}6.4.2 Get All Products
const getAllProducts = async (req, res) => {
  const products = await Product.find({})
  res.status(StatusCodes.OK).json({ products, count: products.length })
}6.4.3 Get Single Product
const getSingleProduct = async (req, res) => {
  const { id: productId } = req.params
  // We can use both findById and findOne as shown below
  // const product = await Product.findById(productId)
  const product = await Product.findOne({ _id: productId })
  if (!product) {
    throw new CustomErrors.NotFoundError(`No product with ID ${productId}`)
  }
  res.status(StatusCodes.OK).json({ product })
}6.4.4 Update Product
const updateProduct = async (req, res) => {
  const { id: productId } = req.params
  const product = await Product.findOneAndUpdate({ _id: productId }, req.body, {
    new: true,
    runValidators: true,
  })
  if (!product) {
    throw new CustomErrors.NotFoundError(`No product with ID ${productId}`)
  }
  res.status(StatusCodes.OK).json({ product })
}6.4.5 Delete Product
We are not doing findOneAndDelete here. Instead we first find the product using findOne and then remove it. This will be helpful later when we do Reviews.
const deleteProduct = async (req, res) => {
  const { id: productId } = req.params
  const product = await Product.findOne({ _id: productId })
  if (!product) {
    throw new CustomErrors.NotFoundError(`No product with ID ${productId}`)
  }
  // we are not doing findOneAndDelete here because we will do some other functionality later
  // after finding the product and before deleting it here
  await product.remove()
  res.status(StatusCodes.OK).json({ msg: 'Success! Product removed' })
}6.4.6 Upload Image for Product
Let's store images on the server this time rather than Cloudinary. We will store them in /public/uploads on our server.
Make a new folder public under root and under public make another folder uploads. So it should be /public/uploads
Go to app.js and require package express-fileupload
Then in app.js enable public folder to be publicly available
Then in app.js invoke fileUpload
const fileUpload = require('express-fileupload')
// enable public folder to be publicly available
app.use(express.static('./public'))
app.use(fileUpload())If you want to setup some default image (if user doesn't give image URL while creating product) then add your image into public folder. Example /public/uploads/example.jpeg. Remember this is our default we had setup in Product Model
Ok, now when we upload an image from Postman to the uploadImage route, in the uploadImage controller we will have access to the uploaded image in req.files.
We will throw error if
req.files not present
the file on req.files is not an image but some other file type (determined by mimetype)
image on req.files is larger than mentioned size
Once our uploaded image passes all the conditions, we need to move it to a specific path on our server, which is /public/uploads/. For this we construct the full destination path.
Then on the uploaded image, we have mv function that helps moving the image. We can move that image to destination path mentioned above.
Finally we send back the image path (no need to send the full path); we send /uploads/{imagename}. This is sufficient because the /public folder was made publicly available in the step above.
const uploadImage = async (req, res) => {
  if (!req.files) {
    throw new CustomErrors.BadRequestError('No file uploaded')
  }
  const productImage = req.files.image
  if (!productImage.mimetype.startsWith('image')) {
    throw new CustomErrors.BadRequestError('Please upload an image file')
  }
  const maxSize = 1024 * 1024
  if (productImage.size > maxSize) {
    throw new CustomErrors.BadRequestError(
      'Please upload an image of size less than 1Mb'
    )
  }
  // the image path will be the complete path (DESTINATION PATH we need to move) of image on our computer
  const imagePath = path.join(
    __dirname,
    '../public/uploads/',
    `${productImage.name}`
  )
  // moving our uploaded image to the above mentioned path
  await productImage.mv(imagePath) // mv returns a promise, so await it before responding
  res
    .status(StatusCodes.OK)
    .json({ img: { src: `/uploads/${productImage.name}` } })
}Step 7 (Reviews)
Let's now work on Reviews.
7.1 Review Model
Also, one user should give only one review per product. We can achieve this in two ways.
By writing some code in controller (which we will see later)
By using a mongoose index. Remember we have already used unique: true for email. This means all users have a unique email: when a user is created, the email he provides must not already exist in the DB. This uniqueness is enforced by an index.
So in our Review model, one user should have at most one review per product. We therefore set an index on both user and product. This is a compound index (an index that includes multiple fields - user and product in this case for the Review schema).
ReviewSchema.index({product:1,user:1},{unique:true})const mongoose = require('mongoose')
const ReviewSchema = new mongoose.Schema(
  {
    rating: {
      type: Number,
      min: 1,
      max: 5,
      required: [true, 'Please provide a rating'],
    },
    title: {
      type: String,
      trim: true,
      required: [true, 'Please provide review title'],
      maxLength: 100,
    },
    comment: {
      type: String,
      required: [true, 'Please provide review text'],
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
    product: {
      type: mongoose.Types.ObjectId,
      ref: 'Product',
      required: [true, 'Please provide a product'],
    },
  },
  { timestamps: true }
)
// one user - one review per one product
ReviewSchema.index({ product: 1, user: 1 }, { unique: true })
module.exports = mongoose.model('Review', ReviewSchema)7.2 Review Structure
const createReview = async (req, res) => {
  res.send('Create Review')
}
const getAllReviews = async (req, res) => {
  res.send('Get All Review')
}
const getSingleReview = async (req, res) => {
  res.send('Get Single Review')
}
const updateReview = async (req, res) => {
  res.send('Update Review')
}
const deleteReview = async (req, res) => {
  res.send('Delete Review')
}
module.exports = {
  createReview,
  getAllReviews,
  getSingleReview,
  updateReview,
  deleteReview,
}const express = require('express')
const router = express.Router()
const {
  createReview,
  getAllReviews,
  getSingleReview,
  updateReview,
  deleteReview,
} = require('../controllers/reviewController')
const { authenticateUser } = require('../middleware/authentication')
router.route('/').get(getAllReviews).post(authenticateUser, createReview)
router
  .route('/:id')
  .get(getSingleReview)
  .patch(authenticateUser, updateReview)
  .delete(authenticateUser, deleteReview)
module.exports = routerconst express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser')
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
const userRouter = require('./routes/userRoute')
const productRouter = require('./routes/productRoute')
const reviewRouter = require('./routes/reviewRoute')
const fileUpload = require('express-fileupload')
//// Middlewares and Routes
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser(process.env.JWT_SECRET)) // used to parse the cookies sent from the client(front-end) or postman
// enable public folder to be publicly available
app.use(express.static('./public'))
app.use(fileUpload())
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of the cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
// user router
app.use('/api/v1/users', userRouter)
// product router
app.use('/api/v1/products', productRouter)
// review router
app.use('/api/v1/reviews', reviewRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()7.3 Postman setup for Reviews
Let's set up all the routes in Postman for reviews
Get All Reviews (GET)
Get Single Review (GET)
Create Review (POST) - Authenticated route
Update Review (PATCH) - Authenticated route
Delete Review (DELETE) - Authenticated route
7.4 Review CRUD
Once we have basic controller and routers for Review setup, let's now focus on getting Reviews working.
7.4.1 Create Review
First we need to check whether the product we get in req.body from Postman/front-end is a valid product or not
const createReview = async (req, res) => {
  const { product: productId } = req.body
  // req.body contains product.
  // First we need to validate if product sent in req.body actually exists
  // and also check that product is valid
  const isValidProduct = await Product.findOne({ _id: productId })
  if (!isValidProduct) {
    throw new CustomError.NotFoundError(`No product with id : ${productId}`)
  }
  req.body.user = req.user.userId
  const review = await Review.create(req.body)
  res.status(StatusCodes.CREATED).json({ review })
}We can also check in the controller whether the user already left a review for this product. The first way we checked this was through ReviewSchema.index in the model; this is the second way, done in the controller.
const createReview = async (req, res) => {
  const { product: productId } = req.body
  // req.body contains product.
  // First we need to validate if product sent in req.body actually exists
  // and also check that product is valid
  const isValidProduct = await Product.findOne({ _id: productId })
  if (!isValidProduct) {
    throw new CustomError.NotFoundError(`No product with id : ${productId}`)
  }
  // second way of checking if user already submitted a review for this product.
  // The first way was in the Review model - ReviewSchema.index({ product: 1, user: 1 }, { unique: true })
  const alreadySubmitted = await Review.findOne({
    product: productId,
    user: req.user.userId,
  })
  // if user already submitted review for this product then he shouldn't be able to create review again
  if (alreadySubmitted) {
    throw new CustomError.BadRequestError(
      `This user with id ${req.user.userId} already left a review for this product id ${productId}`
    )
  }
  req.body.user = req.user.userId
  const review = await Review.create(req.body)
  res.status(StatusCodes.CREATED).json({ review })
}Let's now test our create Review functionality
Create Review by logging in as a user

Create review again with the same user

7.4.2 Get All Reviews
const getAllReviews = async (req, res) => {
  const reviews = await Review.find()
  res.status(StatusCodes.OK).json({ reviews, count: reviews.length })
}7.4.3 Get Single Review
const getSingleReview = async (req, res) => {
  const { id: reviewId } = req.params
  const review = await Review.findOne({ _id: reviewId })
  if (!review) {
    throw new CustomError.NotFoundError(`No review found with id ${reviewId}`)
  }
  res.status(StatusCodes.OK).json({ review })
}7.4.4 Delete Review
Like we did in the Jobs API, a logged-in user must only be able to delete his own review. So here, if I'm trying to delete somebody else's review, we throw an error.
const deleteReview = async (req, res) => {
  const { id: reviewId } = req.params
  const review = await Review.findOne({ _id: reviewId })
  if (!review) {
    throw new CustomError.NotFoundError(`No review found with id ${reviewId}`)
  }
  // check if user who created the review is actually requesting it. If not throw unauthorized error
  // if (review.user.toString() !== req.user.userId) {
  //   throw new CustomError.UnAuthorizedError(
  //     'Not authorized to access this review'
  //   )
  // }
  // the below code does the same thing as the commented code above
  checkPermissions(req.user, review.user)
  await review.remove()
  res.status(StatusCodes.OK).json({ msg: 'Successfully deleted the review' })
}We will later learn why we are using review.remove() instead of findOneAndDelete; remove() has some advantages over findOneAndDelete in this scenario, which we will see shortly. Similarly, for Update Review below in 7.4.5 we use review.save() instead of findOneAndUpdate(). We will also see why that is important shortly.
Why do we use remove and save methods (Click these and read)---> Step 9 (Remove associations) and Step 10 (Average rating and number of Reviews on Product)
7.4.5 Update Review
Let's go with save() method instead of findOneAndUpdate().
const updateReview = async (req, res) => {
  const { id: reviewId } = req.params
  const { rating, title, comment } = req.body
  const review = await Review.findOne({ _id: reviewId })
  if (!review) {
    throw new CustomError.NotFoundError(`No review found with id ${reviewId}`)
  }
  checkPermissions(req.user, review.user)
  // since we are setting all 3 props here, even if one is missing we get a validation error.
  // Optionally, if you want to set only some of them, you can do that with if statements
  // if title is present in req.body then set that, if comment is present then set that and so on
  review.title = title
  review.rating = rating
  review.comment = comment
  await review.save()
  res.status(StatusCodes.OK).json({ review })
}Step 8 (Additional Capabilities to get data - Goodies)
We are now in good shape, but mongoose provides some more important features. Let's start with the populate method. If we need to reference documents in another collection, we use populate: https://mongoosejs.com/docs/populate.html
If you remember, we have already used this several times to reference user in Product model, and also reference user and product in Review model. Let's explore more about this.
8.1 Populate()
Now let's say I am requesting the reviews. If I want to see details of the product a review is associated with, I currently only see the product ID. Wouldn't it be nice to see more details about the product, like name, category and so on?

We need to use populate method for this. When to use populate?
When another model (Product) is referenced by this model (Review), we can use populate to pull in more info about it.
const getAllReviews = async (req, res) => {
  const reviews = await Review.find({}).populate('product') // this will give all the props of the product
  res.status(StatusCodes.OK).json({ reviews, count: reviews.length })
}The above code will give all props of product

If you want only certain fields to be populated
const getAllReviews = async (req, res) => {
  const reviews = await Review.find({}).populate({
    path: 'product',
    select: 'name company price',
  })
  res.status(StatusCodes.OK).json({ reviews, count: reviews.length })
}
We can do it for user as well.
const getAllReviews = async (req, res) => {
  const reviews = await Review.find({})
    .populate({
      path: 'product',
      select: 'name company price',
    })
    .populate({
      path: 'user',
      select: 'name',
    })
  res.status(StatusCodes.OK).json({ reviews, count: reviews.length })
}

8.2 Mongoose Virtuals
Ok, now that we know about the populate method, let's talk about this use case: while getting a single product, I also want to get all the reviews associated with that product.
Note that previously, with populate, we just expanded the review to include the product props. Before using populate we only had the product ID in the review; after using populate we had the other product props along with the product ID.
Here, we are asking: what if we need all reviews associated with a product? If you take a look, there's no reviews field on our Product model (but we had product on the Review model, which is why populate was possible).
I cannot do the below, as the Product model doesn't have the Review model as a field inside it.
const getAllProducts = async (req, res) => {
  const products = await Product.find({}).populate('review') // NOT POSSIBLE
  res.status(StatusCodes.OK).json({ products, count: products.length })
}
But after creating a virtual the above code is possible. Let's see how
In order to accomplish this we need to use mongoose virtuals. Mongoose virtuals are properties that don't persist or exist in the DB; they exist only logically and are generated on the fly.

Refer to the mongoose docs (Populate section) attached above - same as what John says.
The populate('reviews') call below will work once the Product model is modified to define the virtual.
const getSingleProduct = async (req, res) => {
  const { id: productId } = req.params
  // populate will now work as we have added virtuals - true to product model
  const product = await Product.findOne({ _id: productId }).populate('reviews')
  if (!product) {
    throw new CustomErrors.NotFoundError(`No product with ID ${productId}`)
  }
  res.status(StatusCodes.OK).json({ product })
}const mongoose = require('mongoose')
const ProductSchema = new mongoose.Schema(
  {
    name: {
      type: String,
      trim: true,
      required: [true, 'Please provide product name'],
      maxLength: [100, 'Name cannot be more than 100 characters'],
    },
    price: {
      type: Number,
      required: [true, 'Please provide price'],
      default: 0,
    },
    description: {
      type: String,
      required: [true, 'Please provide product description'],
      maxLength: [1000, 'Description cannot be more than 1000 characters'],
    },
    image: {
      type: String,
      default: '/uploads/example.jpeg',
    },
    category: {
      type: String,
      required: [true, 'Please provide product category'],
      enum: ['office', 'kitchen', 'bedroom'],
    },
    company: {
      type: String,
      required: [true, 'Please provide company'],
      enum: {
        values: ['ikea', 'liddy', 'marcos'],
        message: '{VALUE} is not supported',
      },
    },
    // we will see how this array of String works when we send data from postman
    colors: {
      type: [String],
      required: true,
    },
    featured: {
      type: Boolean,
      default: false,
    },
    freeShipping: {
      type: Boolean,
      default: false,
    },
    inventory: {
      type: Number,
      required: true,
      default: 15,
    },
    // we will calculate this rating as we build up
    averageRating: {
      type: Number,
      default: 0,
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
  },
  // setting virtuals here to true. Meaning, product model will now accept virtuals
  { timestamps: true, toJSON: { virtuals: true }, toObject: { virtuals: true } }
)
// using the name 'reviews' as we used the same name in populate() in getSingleProduct controller
ProductSchema.virtual('reviews', {
  ref: 'Review',
  localField: '_id', // this is the Product id
  foreignField: 'product', // field in the Review that reference Product Model
  justOne: false, // want more documents of reviews not just one document
})
module.exports = mongoose.model('Product', ProductSchema)

We can also match only certain documents. Let's say I want to get only the reviews whose rating is 3 for a single product.
// using the name 'reviews' as we used the same name in populate() in getSingleProduct controller
ProductSchema.virtual('reviews', {
  ref: 'Review',
  localField: '_id', // this is the Product id
  foreignField: 'product', // field in the Review that reference Product Model
  justOne: false, // want more documents of reviews not just one document
  match: { rating: 3 }, // ADDED THIS MATCH HERE
})
This is a virtual property that we added. It is not a real property, so we cannot run queries against it: we can't query on the reviews virtual from the Product side, we just get all the reviews by default. Hence, let's look at an alternative approach we can take to get the reviews of a single product.
8.3 Virtuals Alternative approach
Let's see an alternative way to get the reviews of a product without virtuals.
Steps
First we will implement a controller in the review controller called getSingleProductReviews
Then we will import it into the product routes (see the sketch below)



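A minimal sketch of those two steps (the route path /:id/reviews and the exact query are assumptions, not spelled out in the steps above):
// controllers/reviewController.js (partial) - get all reviews belonging to one product, no virtuals needed
const getSingleProductReviews = async (req, res) => {
  const { id: productId } = req.params
  const reviews = await Review.find({ product: productId })
  res.status(StatusCodes.OK).json({ reviews, count: reviews.length })
}
// remember to add getSingleProductReviews to module.exports
// routes/productRoute.js (partial) - import the review controller into the product routes
const { getSingleProductReviews } = require('../controllers/reviewController')
router.route('/:id/reviews').get(getSingleProductReviews)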
Step 9 (Remove associations) 
Ok, let's see this use case. If we remove a product, the reviews associated with that product should automatically be removed which makes sense. But think about it, in our current implementation that's not the case. If we remove a product, the reviews related to that product are not removed.
So we want a post hook that runs after removing the product and deletes all the reviews associated with that product. We can register such post hooks for the remove and save methods; that's not possible with findOneAndUpdate and findOneAndDelete. That is the reason why we use save and remove.
ProductSchema.post('remove', async function () {
  // Even though this is product schema, I can access other model (like Review)
  await this.model('Review').deleteMany({ product: this._id }) // product is the prop on Review, so we delete that
  // the above means, Remove all reviews associated with product having _id
})const mongoose = require('mongoose')
const ProductSchema = new mongoose.Schema(
  {
    name: {
      type: String,
      trim: true,
      required: [true, 'Please provide product name'],
      maxLength: [100, 'Name cannot be more than 100 characters'],
    },
    price: {
      type: Number,
      required: [true, 'Please provide price'],
      default: 0,
    },
    description: {
      type: String,
      required: [true, 'Please provide product description'],
      maxLength: [1000, 'Description cannot be more than 1000 characters'],
    },
    image: {
      type: String,
      default: '/uploads/example.jpeg',
    },
    category: {
      type: String,
      required: [true, 'Please provide product category'],
      enum: ['office', 'kitchen', 'bedroom'],
    },
    company: {
      type: String,
      required: [true, 'Please provide company'],
      enum: {
        values: ['ikea', 'liddy', 'marcos'],
        message: '{VALUE} is not supported',
      },
    },
    // we will see how this array of String works when we send data from postman
    colors: {
      type: [String],
      required: true,
    },
    featured: {
      type: Boolean,
      default: false,
    },
    freeShipping: {
      type: Boolean,
      default: false,
    },
    inventory: {
      type: Number,
      required: true,
      default: 15,
    },
    // we will calculate this rating and num of reviews as we build up
    averageRating: {
      type: Number,
      default: 0,
    },
    numOfReviews: {
      type: Number,
      default: 0,
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
  },
  // setting virtuals here to true. Meaning, product model will now accept virtuals
  { timestamps: true, toJSON: { virtuals: true }, toObject: { virtuals: true } }
)
// using the name 'reviews' as we used the same name in populate() in getSingleProduct controller
// ProductSchema.virtual('reviews', {
//   ref: 'Review',
//   localField: '_id', // this is the Product id
//   foreignField: 'product', // field in the Review that reference Product Model
//   justOne: false, // want more documents of reviews not just one document
//   match: { rating: 3 },
// })
ProductSchema.post('remove', async function () {
  // Even though this is product schema, I can access other model (like Review)
  await this.model('Review').deleteMany({ product: this._id }) // product is the prop on Review, so we delete that
  // the above means, Remove all reviews associated with product having _id
})
module.exports = mongoose.model('Product', ProductSchema)
To test this
Create a product
Create some reviews associated with that product
Remove that product
Check the reviews associated with that product and you should not see any reviews
Step 10 (Average rating and number of Reviews on Product)
Ok, it's time to calculate Average Rating and Number of Reviews on a product. Review is a different model which is linked to product (a review has a product), but the product model has no direct association with review. Even so, we need to know how many reviews (numOfReviews) a product has and what its average rating is (rating is a prop on the review). Average rating = (sum of all ratings for a product) / (number of reviews for that product). For this we use MongoDB aggregation.
The idea is, we need to recalculate numOfReviews on a product whenever a review changes in the DB (when any review gets created, updated or deleted). So basically, when a new review is created for a product, or an existing review is updated or deleted, we need to recalculate numOfReviews (and the average rating) for the product that review is associated with. We can use post save and post remove hooks for this, and this is the reason why we chose save() over findOneAndUpdate() and remove() over findOneAndRemove()
Static Method on Schema 
While we are implementing average rating and number of reviews, let's also discuss Schema static methods. Remember we implemented an instance method for comparing the password on the user document Node Js (John Smilga - Udemy). It was an instance method, meaning every user document was able to access the comparePassword method written on the Schema. UserSchema.methods.comparePassword is what we had done so that all user documents (instances of User) had access to comparePassword.
Unlike an instance method, what if we want to call a method on the model itself rather than on an instance? For example, instead of user.comparePassword() we would call it directly on the model (something like User.comparePassword()); for that we set up a static method on the Schema, which is what we will do here on ReviewSchema.
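Roughly, the difference between the two looks like this (a sketch; User and Review are the models compiled from the schemas):
// instance method - lives on schema.methods, called on a document
UserSchema.methods.comparePassword = async function (candidatePassword) {
  // 'this' is the individual user document, so this.password is available
  return bcrypt.compare(candidatePassword, this.password) // bcrypt = require('bcryptjs')
}
// usage: const user = await User.findOne({ email }); await user.comparePassword(password)
// static method - lives on schema.statics, called on the model compiled from the schema
ReviewSchema.statics.calculateAverageRating = async function (productId) {
  // 'this' is the Review model, so we can run queries/aggregations from here
}
// usage: await Review.calculateAverageRating(productId)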
// creating a static method (a review instance can't access this; we access it directly on the model)
ReviewSchema.statics.calculateAverageRating = async function (productId) {
  // Here, I need to calculate average rating on this product id.
  // I need to take all the reviews associated with this product id, and then get average of them and
  // then attach it to this product (of productId given).
  console.log(productId)
}
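For that console log to fire, the static method has to be wired up in post hooks on the ReviewSchema. The full schema in 10.3 below includes them, but here they are in isolation:
// recalculate whenever a review is saved (created/updated) or removed
ReviewSchema.post('save', async function () {
  await this.constructor.calculateAverageRating(this.product)
})
ReviewSchema.post('remove', async function () {
  await this.constructor.calculateAverageRating(this.product)
})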
Now to test this setup, create a review, update that review, and delete it. Each time, you should see the productId for the product that review belongs to logged by the static method (once the hooks shown above are in place).
10.1 Aggregation pipeline
Now that we have set up the post save and post remove hooks and decided what should happen in them (calculate both the average rating and the number of reviews), let's work on the calculateAverageRating static method. Even though the method is called calculateAverageRating, we will calculate both averageRating and numOfReviews by setting up an aggregation pipeline.
First we will set up this aggregation pipeline in the MongoDB Atlas interface and then get an idea of how to do it through code.
Idea of aggregation
So what are we going to do here? What do we want to achieve? We want two things
1. Calculate number of reviews
Here, we could do this without even using an aggregation pipeline. An aggregation pipeline is a series of stages - match certain documents, group them, and return the result to wherever it needs to go - and it all runs on the MongoDB side, not in our Express code. For the number of reviews we could instead, in our code, fetch all the reviews for a specific product, take the length of that array and attach it to the product; that is one option. But here we are going to do it with the aggregation pipeline on the MongoDB side itself, and I am assuming this would be faster. Let's look at the next case, calculating the average rating, where aggregation makes even more sense
2. Calculate average rating
This is also something we could do on the Express side: get all the reviews, iterate over them one by one, accumulate the ratings in a variable, calculate the average and attach it to the specific product. But think about it: if we have a million or a billion reviews, we would need to fetch all of them (a lot of time and processing), iterate through them (some more time), separate them out by product and then attach the result to each product. For a sense of scale, consider the Udemy website with a million courses where each course has a million reviews; the total would be a million * a million reviews.
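Just to make the contrast concrete, the naive Express-side version might look roughly like this (a sketch, not what we will actually use):
// compute numOfReviews and averageRating in Node instead of on the MongoDB side
const reviews = await Review.find({ product: productId })
const numOfReviews = reviews.length
const averageRating =
  numOfReviews === 0
    ? 0
    : reviews.reduce((sum, review) => sum + review.rating, 0) / numOfReviews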
For this reason it is nicer to do it on the MongoDB side. On MongoDB, let's say we need to know how many 1, 2, 3, 4 and 5 star ratings a course has. We would match that course, group by the rating prop so we get one group per rating value, and attach an amount prop that counts the number of reviews in each group. That is the general pattern. We won't group by separate ratings here though; all we do is calculate the number of reviews for a product and its average rating, so we are not grouping by anything and can set the group _id to null. The group _id defines what we are grouping on (rating, for example). See the illustration below.
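For illustration, grouping by rating to count each star value would look something like this (a hypothetical pipeline with a placeholder productId, not the one we will build):
[
  { $match: { product: productId } },
  {
    $group: {
      _id: '$rating', // one group per rating value (1 to 5)
      amount: { $sum: 1 }, // how many reviews have this rating
    },
  },
]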
Two steps here
First we need to go to Atlas account (Mongodb), choose our project E-commerce, go to review collection (because we are going to aggregate here), and choose Aggregation


Once we do it there and get all the documents we are looking for, we can copy the code by clicking here


10.2 MongoDB aggregation
Ok, now that you know what aggregation is and how to do it, let's jump back to the Atlas UI and do the matching and grouping to get averageRating and numOfReviews. Then in the next step, 10.3, we will write this in code.

Let's export this code

Let's copy this code and put it in a temp file. This is what we want in our code, but let's write it from scratch
[
  {
    '$match': {
      'product': new ObjectId('63156f49d6daad5d68142a3b')
    }
  }, {
    '$group': {
      '_id': null, 
      'averageRating': {
        '$avg': '$rating'
      }, 
      'numOfReviews': {
        '$sum': 1
      }
    }
  }
]
10.3 MongoDB aggregation in code
Now that we have set up our aggregation in the GUI and have the code snippet for it, let's see how we can add this code to the ReviewSchema.statics.calculateAverageRating() function to get the average rating and number of reviews for the passed product and then attach them to the product model dynamically.

Above, we just added the code (temp file shown on right side is the actual code snippet we got from atlas in our previous step). We are adding a console log to see the average rating and number of reviews. We tested it by updating an existing review. This will log even if we create or delete the review.
Let's now attach this result on to our product. So every time someone creates or updates or deletes a review for this product, then we dynamically calculate them and add it to that product again.

const mongoose = require('mongoose')
const ReviewSchema = new mongoose.Schema(
  {
    rating: {
      type: Number,
      min: 1,
      max: 5,
      required: [true, 'Please provide a rating'],
    },
    title: {
      type: String,
      trim: true,
      required: [true, 'Please provide review title'],
      maxLength: 100,
    },
    comment: {
      type: String,
      required: [true, 'Please provide review text'],
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
    product: {
      type: mongoose.Types.ObjectId,
      ref: 'Product',
      required: [true, 'Please provide a product'],
    },
  },
  { timestamps: true }
)
// one user - one review per one product
ReviewSchema.index({ product: 1, user: 1 }, { unique: true })
// creating a static method (a review instance can't access this; we access it directly on the model)
ReviewSchema.statics.calculateAverageRating = async function (productId) {
  // Here, I need to calculate average rating on this product id.
  // I need to take all the reviews associated with this product id, and then get average of them and
  // then attach it to this product (of productId given).
  const result = await this.aggregate([
    { $match: { product: productId } },
    {
      $group: {
        _id: null,
        averageRating: { $avg: '$rating' },
        numOfReviews: { $sum: 1 },
      },
    },
  ])
  console.log('The result is', result) // [ { _id: null, averageRating: 2.5, numOfReviews: 2 } ]
  // notice that the result is an array
  // if we delete all reviews for this product then this array will just be empty, so we need to check if the array has at least one element
  // we will do that through optional chaining
  try {
    // All we are doing here is - go to product and add average rating and numOfReviews
    // If array is empty (when no reviews or all reviews get deleted), then set averageRating and numOfReviews to 0
    await this.model('Product').findOneAndUpdate(
      { _id: productId },
      {
        averageRating: Math.ceil(result[0]?.averageRating || 0),
        numOfReviews: result[0]?.numOfReviews || 0,
      }
    )
  } catch (error) {
    console.log(error)
  }
}
// when a review is modified (created or updated or deleted) we need to calculate number of reviews (numOfReviews)
// for a product with which this review is associated and also calculate avg rating. So we use post save and remove hooks
ReviewSchema.post('save', async function () {
  console.log(
    'Adding or updating the review.... Average rating is being calculated for the product...'
  )
  await this.constructor.calculateAverageRating(this.product) // this is how we access static method
})
ReviewSchema.post('remove', async function () {
  console.log(
    'Deleting the review.... Average rating is being calculated for the product...'
  )
  await this.constructor.calculateAverageRating(this.product) // this is how we access static method
})
module.exports = mongoose.model('Review', ReviewSchema)
Step 11 (Orders)
We are almost there and this is going to be our last model. To understand this order, let's look at John's comfy sloth store project here https://react-course-comfy-sloth-store.netlify.app/cart. When it comes to order, a big part is cart (The items user wants to buy). So we have got two options here
Option 1 - We have all the cart data in the front-end and it is persisted in local storage (like in the comfy sloth project). We will cover this option in our project now. We will assume there's a front-end sending us all the cart info (in our case all the cart info will be sent by postman, of course).
Option 2 - We can store the cart data directly in the database. In the Yelp clone project (future project in John's course - watch out I might do it below), we will setup cart data in the database and we can then see how to do it with express sessions.


So once the user is ready with all the products and clicks checkout, we will do two things
First, we will communicate with stripe (to get the client secret key)
Second, we will set up the initial order. Before we even communicate with stripe, we want to double check that the data coming from the front-end / postman actually makes sense (does each product ID exist, and if it does, what is the correct price). Then we will communicate with stripe (first point) and set up the order. Once the payment is complete, we will get additional data that we add to our order.
11.1 Order Model
Note that for cartItems in the OrderSchema, we have set it up as a separate schema. We could technically define it directly as a simple object on the cartItems prop in OrderSchema, but this is the cleaner way to set it up, as we also get validation for each cart item.
const mongoose = require('mongoose')
// we can technically set this up directly as a simple object 
// in cartItems prop in OrderSchema, but this is the clean way 
// to setup as we will get the validation for the cart item as well
const SingleCartItemSchema = mongoose.Schema({
  name: {
    type: String,
    required: true,
  },
  image: {
    type: String,
    required: true,
  },
  price: {
    type: Number,
    required: true,
  },
  // amount is the quantity of this product
  amount: {
    type: Number,
    required: true,
  },
  product: {
    type: mongoose.Types.ObjectId,
    ref: 'Product',
    required: true,
  },
})
const OrderSchema = new mongoose.Schema(
  {
    tax: {
      type: Number,
      required: true,
    },
    shippingFee: {
      type: Number,
      required: true,
    },
    // this is the total for all cart items where for each item, 
    // we multiply price by quantity
    subtotal: {
      type: Number,
      required: true,
    },
    // total = subtotal + tax + shippingFee
    total: {
      type: Number,
      required: true,
    },
    // we can technically set this up directly as a simple object 
    // in cartItems here, but this is the clean way to setup as a 
    // different schema as we will get the validation for SingleCartItemSchema as well
    cartItems: [SingleCartItemSchema],
    status: {
      type: String,
      enum: ['pending', 'failed', 'paid', 'delivered', 'cancelled'],
      default: 'pending',
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: [true, 'Please provide a user'],
    },
    clientSecret: {
      type: String,
      required: true,
    },
    paymentIntentId: {
      type: String,
    },
  },
  { timestamps: true }
)
const Order = mongoose.model('Order', OrderSchema)
module.exports = Order
/*
The above two lines is same as
module.exports = mongoose.model('Order', OrderSchema)
*/
11.2 Order Structure
const createOrder = async (req, res) => {
  res.send('Create Order')
}
const getAllOrders = async (req, res) => {
  res.send('Get All Orders')
}
const getSingleOrder = async (req, res) => {
  res.send('Get Single Order')
}
const updateOrder = async (req, res) => {
  res.send('Update Order')
}
const getCurrentUserOrders = async (req, res) => {
  res.send('Get Current User Orders')
}
module.exports = {
  getAllOrders,
  getSingleOrder,
  getCurrentUserOrders,
  createOrder,
  updateOrder,
}
const express = require('express')
const router = express.Router()
const {
  getAllOrders,
  getSingleOrder,
  getCurrentUserOrders,
  createOrder,
  updateOrder,
} = require('../controllers/OrderController')
const {
  authenticateUser,
  authorizePermissions,
} = require('../middleware/authentication')
router
  .route('/')
  .get(authenticateUser, authorizePermissions(['admin']), getAllOrders)
  .post(authenticateUser, createOrder)
router.route('/showAllMyOrders').get(authenticateUser, getCurrentUserOrders)
router
  .route('/:id')
  .patch(authenticateUser, updateOrder)
  .get(authenticateUser, getSingleOrder)
module.exports = router
const express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser')
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
const userRouter = require('./routes/userRoute')
const productRouter = require('./routes/productRoute')
const reviewRouter = require('./routes/reviewRoute')
const orderRouter = require('./routes/orderRoute')
const fileUpload = require('express-fileupload')
//// Middlewares and Routes
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser(process.env.JWT_SECRET)) // used to parse the cookies sent from the client(front-end) or postman
// enable public folder to be publicly available
app.use(express.static('./public'))
app.use(fileUpload())
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
// user router
app.use('/api/v1/users', userRouter)
// product router
app.use('/api/v1/products', productRouter)
// review router
app.use('/api/v1/reviews', reviewRouter)
// order router
app.use('/api/v1/orders', orderRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()
11.3 Postman setup for Orders
Let's setup these routes in postman now
Get All Orders (Admin route)
All below routes must be authenticated
Create order
Update order
Get Current user's order
Get Single order

11.4 Order CRUD
Let's now do some CRUD operations on Order
11.4.1 Create Order
I would suggest reading the stripe project before proceeding with this task ---> Node Js (John Smilga - Udemy). Before we set this up, let's see what to expect from the front-end. We will be expecting to get
the shipping fee,
tax
and also the cart items array.
In the cart items array we will have a bunch of objects (each object represents a product). So each object will have a product name, price, product image, quantity (amount) and product id (very important)
So before we setup any functionality let's setup proper request in the postman
Setup Create Order in postman
You can use the mock data John has prepared; it is in the starter/mockData/products.json file. Make sure you create one or two products first, and when you create the order two things must be set correctly in the items array - the product ID (it must match an existing product) and the amount (quantity of each product).

Let's first get the req.body and destructure items, shippingFee and tax, and validate them.
const Order = require('../models/Order')
const Product = require('../models/Product')
const { StatusCodes } = require('http-status-codes')
const CustomError = require('../errors')
const checkPermissions = require('../utils/checkPermissions')
const createOrder = async (req, res) => {
  const { items: cartItems, tax, shippingFee } = req.body
  if (!cartItems || cartItems.length < 1) {
    throw new CustomError.BadRequestError('No cart items provided')
  }
  if (!tax || !shippingFee) {
    throw new CustomError.BadRequestError('Please provide tax and shipping fee')
  }
  res.send('Create Order')
}
Once this works fine, we can start testing the products. Now, the items come as an array like this in the request:
"items": [
      {
        "name": "product 1",
        "price": 2599,
        "image": "https://dl.airtable.com/.attachmentThumbnails/e8bc3791196535af65f40e36993b9e1f/438bd160",
        "amount": 34,
        "product": "6316666d1c1538a1ad5f5c58"
      },
      
      {
        "name": "product 2",
        "price": 2599,
        "image": "https://dl.airtable.com/.attachmentThumbnails/e8bc3791196535af65f40e36993b9e1f/438bd160",
        "amount": 34,
        "product": "6316666d1c1538a1ad5f5c57"
      },
      
]
Notice that we have two products in the cart. The idea in the create order controller is to loop through this items array, take each product ID, fetch that product from the db, read its price, and keep adding the cost of each product to a subtotal variable.
Then finally check that the total amount, shipping fee and tax match what the front-end sent us, and if so, contact stripe and create a client secret (we will fake this here as we don't have a front-end; a real front-end is needed to properly test stripe, since the front-end has to communicate with stripe after getting the client secret from the backend, please refer to the stripe project) and send it back to the front-end.
Now, let's step back a second and see how to loop through the cart items. We need to use a for...of loop because inside it we will be using await to get each product from the db. forEach or map wouldn't pause for await the way we need here.
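A quick illustration of the difference (a sketch using a hypothetical fetchProduct helper):
// forEach fires all the async callbacks and moves on - nothing waits for the awaits inside
cartItems.forEach(async (item) => {
  const dbProduct = await fetchProduct(item.product) // runs, but the outer code doesn't wait for it
})
// for...of actually pauses on each await, so we can validate items one by one
for (const item of cartItems) {
  const dbProduct = await fetchProduct(item.product)
  // safe to use dbProduct here before moving on to the next item
}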
See the image below to know when these checks (present in the create order controller) should happen


Creating an order in the controller covering all above concepts
const Order = require('../models/Order')
const Product = require('../models/Product')
const { StatusCodes } = require('http-status-codes')
const { BadRequestError, NotFoundError } = require('../errors')
const fakeStripeAPI = async ({ amount, currency }) => {
  const client_secret = 'someRandomValue'
  return { client_secret, amount } // mimicking the response of the real stripe function
}
const createOrder = async (req, res) => {
  const { items: cartItems, tax, shippingFee } = req.body
  if (!cartItems || cartItems.length < 1) {
    throw new BadRequestError('No cart items provided')
  }
  if (!tax || !shippingFee) {
    throw new BadRequestError('Please provide tax and shipping fee')
  }
  let orderItems = [] // used below to create the actual Order after stripe payment
  let subtotal = 0 // price * qty for each product
  for (let item of cartItems) {
    const dbProduct = await Product.findOne({ _id: item.product })
    if (!dbProduct) {
      throw new NotFoundError(`No product with id ${item.product}`)
    }
    const { name, image, price, _id } = dbProduct // no need of _id here as it is same as item.product,
    // but that's ok lets have it here, no harm
    const singleOrderItem = {
      name,
      price,
      image,
      amount: item.amount, // this is coming from front-end
      product: _id,
    }
    // add this singleOrderItem to orderItems array
    orderItems = [...orderItems, singleOrderItem]
    // I also need to calculate subtotal of my cart
    subtotal += item.amount * price // note that we are choosing price (from db, not from front-end) and amount (from front-end)
  }
  // once we have orderItems and subtotal calculated on the items fetched
  // from db, we can setup stripe (fake stripe in this case) to
  // get the client_secret and send it to front-end
  const total = subtotal + tax + shippingFee
  // get client secret (from stripe, but fake stripe in this case)
  // we will setup fakeStripeAPI function to mimic real stripe function
  const paymentIntent = await fakeStripeAPI({
    amount: total,
    currency: 'usd',
  })
  // look at the Order schema for all required values
  const order = await Order.create({
    cartItems: orderItems,
    total,
    subtotal,
    tax,
    shippingFee,
    clientSecret: paymentIntent.client_secret,
    user: req.user.userId,
  })
  res
    .status(StatusCodes.CREATED)
    .json({ order, clientSecret: order.clientSecret })
}
11.4.2 Get All Orders
This is an admin only route.
const getAllOrders = async (req, res) => {
  const orders = await Order.find({})
  res.status(StatusCodes.OK).json({ orders })
}
11.4.3 Get Single Order
You need to check permissions here as each user should access his/her own order only and not others' orders.
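We import a checkPermissions helper from utils for this. Roughly, it lets the request through only for an admin or the owner of the resource; a sketch of what that utility might look like (the UnauthorizedError class name is an assumption based on the errors folder used in this project):
//// utils/checkPermissions.js (sketch)
const CustomError = require('../errors')
const checkPermissions = (requestUser, resourceUserId) => {
  if (requestUser.role === 'admin') return // admins can access any order
  if (requestUser.userId === resourceUserId.toString()) return // owners can access their own order
  throw new CustomError.UnauthorizedError('Not authorized to access this route') // assumed error class
}
module.exports = checkPermissions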
const getSingleOrder = async (req, res) => {
  const { id: orderId } = req.params
  const order = await Order.findOne({ _id: orderId })
  if (!order) {
    throw new NotFoundError(`No order with id ${orderId}`)
  }
  
  // note: Don't populate above while getting order, else order.user will have 
  // additional fields attached from populate and the below checkPermissions will fail
  // I learnt this after some trial and error, so don't fall into this trap again. 
  // Removed populate in this controller here so don't worry
  checkPermissions(req.user, order.user)
  res.status(StatusCodes.OK).json({ order })
}
11.4.4 Get Current User Orders
Get only the orders of the currently logged in user
const getCurrentUserOrders = async (req, res) => {
  // populating here to check who the user is, for ease
  const orders = await Order.find({ user: req.user.userId }).populate({
    path: 'user',
    select: 'name',
  })
  console.log('req user userid', req.user.userId)
  res.status(StatusCodes.OK).json({ orders })
}
11.4.5 Update Order
Once the user adds items to the cart and clicks Proceed to Checkout, we hit the create order controller. After creating the order (the status of the order will be pending), we send back the clientSecret from the back-end to the front-end.
Once the front-end has the client secret, the user can proceed to the page shown below, enter their card details and make the payment to stripe
Once payment is made to stripe, stripe will send back payment Intent id to front-end
Front-end then calls update order and sends us this payment intent id that stripe sent
Now we are in update controller. We need to store this payment intent id for this order and update the order status from 'pending' to 'paid'

const updateOrder = async (req, res) => {
  const { id: orderId } = req.params
  const { paymentIntentId } = req.body
  if (!paymentIntentId) {
    throw new BadRequestError('Please provide payment intent ID')
  }
  const order = await Order.findOne({ _id: orderId })
  if (!order) {
    throw new NotFoundError(`No order with id ${orderId}`)
  }
  // we don't want Susan to update Peter's order so checking permissions
  checkPermissions(req.user, order.user)
  // save payment intent id and also the status
  order.paymentIntentId = paymentIntentId
  order.status = 'paid'
  await order.save()
  res.status(StatusCodes.OK).json({ order })
}
Order controller full code
const Order = require('../models/Order')
const Product = require('../models/Product')
const { StatusCodes } = require('http-status-codes')
const { BadRequestError, NotFoundError } = require('../errors')
const { checkPermissions } = require('../utils')
const fakeStripeAPI = async ({ amount, currency }) => {
  const client_secret = 'someRandomValue'
  return { client_secret, amount } // mimicking the response of the real stripe function
}
const createOrder = async (req, res) => {
  const { items: cartItems, tax, shippingFee } = req.body
  if (!cartItems || cartItems.length < 1) {
    throw new BadRequestError('No cart items provided')
  }
  if (!tax || !shippingFee) {
    throw new BadRequestError('Please provide tax and shipping fee')
  }
  let orderItems = [] // used below to create the actual Order after stripe payment
  let subtotal = 0 // price * qty for each product
  for (let item of cartItems) {
    const dbProduct = await Product.findOne({ _id: item.product })
    if (!dbProduct) {
      throw new NotFoundError(`No product with id ${item.product}`)
    }
    const { name, image, price, _id } = dbProduct // no need of _id here as it is same as item.product,
    // but that's ok lets have it here, no harm
    const singleOrderItem = {
      name,
      price,
      image,
      amount: item.amount, // this is coming from front-end
      product: _id,
    }
    // add this singleOrderItem to orderItems array
    orderItems = [...orderItems, singleOrderItem]
    // I also need to calculate subtotal of my cart
    subtotal += item.amount * price // note that we are choosing price (from db, not from front-end) and amount (from front-end)
  }
  // once we have orderItems and subtotal calculated on the items fetched
  // from db, we can setup stripe (fake stripe in this case) to
  // get the client_secret and send it to front-end
  const total = subtotal + tax + shippingFee
  // get client secret (from stripe, but fake stripe in this case)
  // we will setup fakeStripeAPI function to mimic real stripe function
  const paymentIntent = await fakeStripeAPI({
    amount: total,
    currency: 'usd',
  })
  // look at the Order schema for all required values
  const order = await Order.create({
    cartItems: orderItems,
    total,
    subtotal,
    tax,
    shippingFee,
    clientSecret: paymentIntent.client_secret,
    user: req.user.userId,
  })
  res
    .status(StatusCodes.CREATED)
    .json({ order, clientSecret: order.clientSecret })
}
const getAllOrders = async (req, res) => {
  const orders = await Order.find({}).populate({ path: 'user', select: 'name' })
  res.status(StatusCodes.OK).json({ orders })
}
const getSingleOrder = async (req, res) => {
  const { id: orderId } = req.params
  const order = await Order.findOne({ _id: orderId })
  if (!order) {
    throw new NotFoundError(`No order with id ${orderId}`)
  }
  checkPermissions(req.user, order.user)
  res.status(StatusCodes.OK).json({ order })
}
const updateOrder = async (req, res) => {
  const { id: orderId } = req.params
  const { paymentIntentId } = req.body
  if (!paymentIntentId) {
    throw new BadRequestError('Please provide payment intent ID')
  }
  const order = await Order.findOne({ _id: orderId })
  if (!order) {
    throw new NotFoundError(`No order with id ${orderId}`)
  }
  // we don't want Susan to update Peter's order so checking permissions
  checkPermissions(req.user, order.user)
  // save payment intent id and also the status
  order.paymentIntentId = paymentIntentId
  order.status = 'paid'
  await order.save()
  res.status(StatusCodes.OK).json({ order })
}
const getCurrentUserOrders = async (req, res) => {
  // populating here to check who the user is, for ease
  const orders = await Order.find({ user: req.user.userId }).populate({
    path: 'user',
    select: 'name',
  })
  res.status(StatusCodes.OK).json({ orders })
}
module.exports = {
  getAllOrders,
  getSingleOrder,
  getCurrentUserOrders,
  createOrder,
  updateOrder,
}
Step 12 (Documentation)
Finally we are here after a lot of hard work. Instead of Swagger UI, which we used in the Jobs API for documentation, let's use a better tool here: docgen
GitHub - thedevsaddam/docgen: Transform your postman collection to HTML/Markdown documentation. To install this, do as shown in the documentation:
curl https://raw.githubusercontent.com/thedevsaddam/docgen/v3/install.sh -o install.sh \
&& sudo chmod +x install.sh \
&& sudo ./install.sh \
&& rm install.sh
12.1 Create documentation
In postman, click on export and save the collection in a folder anywhere

Open that saved file in vscode and change the {{URL}} variable to localhost:5000 (whatever port your app is running on). Later we will change this again to the actual URL once deployed to Heroku.
Run the command below so that your exported docs, filename.json, get converted to html, filename.html
// docgen build -i <postman exported filename>.json -o index.html
docgen build -i docs.json -o index.html
This generates index.html; move this html file into the public folder
Note that since we have a line in our code to expose public folder as shown below, docs will now be available on http://localhost:5000/
// enable public folder to be publicly available
app.use(express.static('./public'))
If you comment out the above line, then the / route defined below will be hit instead and you will see its response rather than the docs
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of cookie-parser package
  res.send('E-Commerce API Home page')
})
Now we have our docs deployed to localhost:5000

Step 13 (Security Packages)
Since we will deploy our apps to heroku, it's good to consider some security packages to secure our app.
const express = require('express')
const app = express()
// rest of the packages
const morgan = require('morgan')
const cookieParser = require('cookie-parser')
const connectDB = require('./db/connect')
require('dotenv').config()
require('express-async-errors') // for avoiding writing try-catch in controllers
const notFoundMW = require('./middleware/not-found')
const errorHandlerMW = require('./middleware/error-handler')
const authRouter = require('./routes/authRoute')
const userRouter = require('./routes/userRoute')
const productRouter = require('./routes/productRoute')
const reviewRouter = require('./routes/reviewRoute')
const orderRouter = require('./routes/orderRoute')
const fileUpload = require('express-fileupload')
// Security packages
const rateLimiter = require('express-rate-limit')
const helmet = require('helmet')
const xss = require('xss-clean')
const cors = require('cors')
const expressMongoSanitize = require('express-mongo-sanitize')
//// Middlewares and Routes
// for rate limiter if it's behind the proxy then we need to set that as well
app.set('trust proxy', 1)
app.use(
  rateLimiter({
    windowMs: 15 * 60 * 1000, // 15 minutes
    max: 60,
  })
)
app.use(helmet())
app.use(xss())
app.use(cors())
app.use(expressMongoSanitize())
app.use(morgan('tiny'))
app.use(express.json()) // used to get req.body data for post reqs
app.use(cookieParser(process.env.JWT_SECRET)) // used to parse the cookies sent from the client(front-end) or postman
// enable public folder to be publicly available
app.use(express.static('./public'))
app.use(fileUpload())
// Routes
// Basic Route
app.get('/', (req, res) => {
  console.log(req.signedCookies) // this is available because of cookie-parser package
  res.send('E-Commerce API Home page')
})
// auth router
app.use('/api/v1/auth', authRouter)
// user router
app.use('/api/v1/users', userRouter)
// product router
app.use('/api/v1/products', productRouter)
// review router
app.use('/api/v1/reviews', reviewRouter)
// order router
app.use('/api/v1/orders', orderRouter)
app.use(notFoundMW)
app.use(errorHandlerMW)
const port = process.env.PORT || 5000
const start = async () => {
  try {
    // connect to db
    await connectDB(process.env.MONGO_URL)
    app.listen(port)
    console.log('Server is listening on port', port)
  } catch (err) {
    console.log(err)
  }
}
start()
Step 14 (Deploy to Heroku)
For Heroku setup, credentials and CLI, refer to Jobs API here ---> Step 14 (Deploy - Heroku or any other cloud provider)
Move your starter folder (you worked on till now) to your desktop and rename folder to what you want
Open this project with vscode
Change the start script in package.json by replacing start: nodemon app.js with start: node app.js. Rename the old nodemon script to 'dev'. If you want to run in dev mode from now on, do npm run dev
"scripts": {
    "start": "node app.js",
    "dev": "nodemon app.js"
}
Also add this to your package.json
"engines": {
    "node": "14.x"
  }
Create a file called Procfile in the root and add this line
"web: node app.js"Make sure the spacing is correct here

Remove existing git repo
rm -rf .git
Setup git
git init
git add -A
git commit -m "initial commit"
heroku login // press any key and they'll take us to login
heroku create "APP NAME" // heroku create "e-commerce-api-node-project" -> If you omit name they will give a random name
Check if remote is pointing to heroku
git remote -v
Check if our project is created on heroku

Now click on this app on heroku and setup env variables

Take your keys and values from .env and set them up one by one here. In the process of doing so I noticed that we are still using jwtSecret as the string for JWT_SECRET. Let's generate a more complex string: go to https://www.allkeysgenerator.com/, select Encryption key, click on 256-bit (keep clicking 256-bit to generate new keys), copy the result and paste it as JWT_SECRET.

Now you have your configs on Heroku

Once this is done you can push the project to heroku
git push heroku master
While this is building you can check the logs

Once the Build is successful, open app by clicking the Open App button next to More as shown in the above image and it should open the app
Cool π our project is now finally deployed at https://e-commerce-api-node-project.herokuapp.com/
14.1 Final fixes
Notice that when we click on any route, we get this error and will not be able to open any of the routes. This is because docgen uses inline javascript, which is blocked by the helmet package (its default Content Security Policy). Let's see in a second how to solve this problem, after doing a couple of other steps first.

Replace http://localhost:5000 with our app's URL https://e-commerce-api-node-project.herokuapp.com/api/v1 in docs.json and republish it
docgen build -i docs.json -o index.html
Move the resulting index.html file to the public folder again and replace the existing one
Ok, now we have updated our index.html, let's solve this problem of javascript shown in above screenshot
Currently we have the index.html that resulted from the docgen command above. This is a minified file (so first right click and format the file). It has javascript embedded inside a <script> tag. We need to pull that script out into a separate javascript file and then reference that file from index.html like this
<script src="that-file.js"></script>
That solves the issue. Restart the server and run
npm run dev
to check if everything is working fine
Now once this issue is solved, finally re-publish the app with working docs
git add -A
git commit -m "added few fixes"
git push heroku master
The docs are now published without errors: https://e-commerce-api-node-project.herokuapp.com/
Now let's create a git repo for this project and push our code
rm -rf .git // remove all git configs
AND create a repository. Awesome π π Finally completed the project: https://github.com/sandeep194920/E-Commerce-API-Node
β
 7. Send Emails
In bigger projects we will have to send emails in several places. For example, after user registration, we might want to email the user to verify their email address. So in this project we will see how to send emails.
Step 1 (General Project Structure)
require('dotenv').config()
require('express-async-errors')
const express = require('express')
const app = express()
const sendEmail = require('./controllers/sendEmail')
// error handler
const notFoundMiddleware = require('./middleware/not-found')
const errorHandlerMiddleware = require('./middleware/error-handler')
app.use(express.json())
// routes
app.get('/', (req, res) => {
  res.send(`<h1>Email Project</h1> <a href='send'>Send Email</a>`)
})
app.get('/send', sendEmail)
app.use(notFoundMiddleware)
app.use(errorHandlerMiddleware)
const port = process.env.PORT || 3000
const start = async () => {
  try {
    app.listen(port, () =>
      console.log(`Server is listening on port ${port}...`)
    )
  } catch (error) {
    console.log(error)
  }
}
start()
const sendEmail = async (req, res) => {
  res.send('Email controller')
}
module.exports = sendEmail
Step 2 (Email Sending Functionality)
2.1 Nodemailer package
We have a package to send emails in node and that is nodemailer. This is the most popular option for sending emails through node.
With nodemailer, we need to create a test account first. Then nodemailer uses a transport service (another 3rd party) to send our emails.
Nodemailer - Will setup the logic to send email
3rd Party - Will do the actual email sending
2.2 Email Transport Service
Now who is that 3rd party? The transporter who actually sends the email. Nodemailer will use this transport service to send the email.
As for our options:
For Testing we can use services like Ethereal https://ethereal.email/ or mailtrap https://mailtrap.io/
For Production we can use Sendgrid https://sendgrid.com/ or Mailgun https://cloud.google.com/compute/docs/tutorials/sending-mail/using-mailgun.
We can also use normal gmail account for this but we won't do that here
In this project let's go with Ethereal as the setup is easy and faster


To check the incoming emails you need these values. You can create as many accounts as you want just by clicking the "Create Ethereal Account" button.

You would normally put these credentials in .env as you don't want to share them with others (we won't bother in this project; see the sketch below). Once we have this Ethereal account created, we can send emails.
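If you did move the credentials to .env, the transporter setup might look like this (a sketch; the ETHEREAL_USER / ETHEREAL_PASS variable names are assumptions):
// .env (assumed variable names)
// ETHEREAL_USER=fannie.koepp@ethereal.email
// ETHEREAL_PASS=bSRSmZcCMmcC1gkWwC
require('dotenv').config()
const nodemailer = require('nodemailer')
const transporter = nodemailer.createTransport({
  host: 'smtp.ethereal.email',
  port: 587,
  secure: false, // true for 465, false for other ports
  auth: {
    user: process.env.ETHEREAL_USER,
    pass: process.env.ETHEREAL_PASS,
  },
})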
Let's navigate back to nodemailer docs and follow the steps.

const nodemailer = require('nodemailer')
const sendEmail = async (req, res) => {
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal docs as well as shown in one
  // of the above image
  let transporter = nodemailer.createTransport({
    host: 'smtp.ethereal.email',
    port: 587,
    secure: false, // true for 465, false for other ports
    auth: {
      user: 'fannie.koepp@ethereal.email', // generated ethereal user
      pass: 'bSRSmZcCMmcC1gkWwC', // generated ethereal password
    },
  })
  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: '"Sandeep Amarnath" <sandeep@gmail.com>', // sender address
    to: 'anand@gmail.com, paxy@gmail.com', // list of receivers
    subject: 'Sending Emails with node js', // Subject line
    text: 'Hello Email', // plain text body
    html: '<b>Hello Anand and Paxy</b>', // html body
  })
  res.status(200).json(info)
}
module.exports = sendEmail

2.3 Sendgrid - Send Emails in Production
Once we know how to send emails with Ethereal, let's try to do the same thing with Sendgrid. This is one of the most popular email providers for the production.
Just like Ethereal, Sendgrid also has a nice, clean, easy to understand UI. Mailgun and the other choices are comparatively confusing.
mr.sandeepamarnath@gmail.com / Sendgridmail@123
Click on Verify a single sender button and create sender email.

I've given mr.sandeepamarnath@gmail.com as a sender email

Once this is set up then we will create API key. Navigate to https://app.sendgrid.com/guide/integrate/langs/nodejs

Copy this created API key and put in env file and install this npm install --save @sendgrid/mail
SENDGRID_API_KEY=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
Copy the code present in the documentation below and paste it into the sendEmail controller file and try it out
const nodemailer = require('nodemailer')
const sgMail = require('@sendgrid/mail')
// Sending Email through Ethereal transporter
const sendEmailEthereal = async (req, res) => {
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal docs as well as shown in one
  // of the above image
  let transporter = nodemailer.createTransport({
    host: 'smtp.ethereal.email',
    port: 587,
    secure: false, // true for 465, false for other ports
    auth: {
      user: 'fannie.koepp@ethereal.email', // generated ethereal user
      pass: 'bSRSmZcCMmcC1gkWwC', // generated ethereal password
    },
  })
  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: '"Sandeep Amarnath" <sandeep@gmail.com>', // sender address
    to: 'anand@gmail.com, paxy@gmail.com', // list of receivers
    subject: 'Sending Emails with node js', // Subject line
    text: 'Hello Email', // plain text body
    html: '<b>Hello Anand and Paxy</b>', // html body
  })
  res.status(200).json(info)
}
// Sending Email through Sendgrid transporter
const sendEmailSendgrid = async (req, res) => {
  sgMail.setApiKey(process.env.SENDGRID_API_KEY)
  const msg = {
    to: 'sandeepamaranath@gmail.com', // Change to your recipient
    from: 'mr.sandeepamarnath@gmail.com', // Change to your verified sender
    subject: 'Test Email',
    text: 'Test Email from Sendgrid',
    html: '<strong>Test Email from Sendgrid</strong>',
  }
  try {
    const info = await sgMail.send(msg)
    res.status(200).json(info)
  } catch (er) {
    console.log(er)
  }
}
module.exports = sendEmailSendgrid
Done with this Email Sending Project. Good job!!
β
 8. Auth Workflow
In this project we will learn the following:
How to validate emails
Create refresh tokens
Setup reset password functionality
Step 1 (Project Overview)
When the app loads, the front-end initially calls /showMe route that we implemented in backend. This route checks if the user has already logged in and gives back 401 error (UnAuthorized) if not logged in (if no token is present in the request - cookie from front-end). 

For recalling about /showMe route and getting a complete understanding , refer how we implemented it in E-Commerce project Node Js (John Smilga - Udemy)
Now since you don't have an account yet, you can click register and enter creds to register.

Once you register the user, you get back 201 saying the user was created, along with a message to verify the email; hence you still can't access the application before verification. You can't login yet, and if you try, you get a message saying "Please verify your email".
Once you verify your email, by clicking on the verification link present in the email, the verification will be successful and that verification link will redirect you back to the front-end screen (with a token in the URL - more about this later) that says, "Email Confirmed, Please login".
Now we can login with that email and it works where it logs you in successfully. When you login you see the refresh token cookie (In this project you will see two cookies - access token and refresh token - More on this later)
So to reiterate: the user logs in, gets the refresh token stored in the cookie and also gets back a user from the /showMe route. So even when the user refreshes the page, the refresh token still exists in the cookie (it will not go away on refreshing the page π) and with this cookie the /showMe route is called (this is the first thing that happens when the page loads or reloads) and gets back the user.
When the application is deployed on Heroku, if the node app is not used much, Heroku will put it to sleep and it takes time to load when used after a long time. Just FYI, this is the annoying part of the Heroku free service.
Step 2 (Starter files overview)
To start with we will use the starter files provided by John. We have two folders: /front-end (don't worry about this yet) and /server. The /server folder contains mostly familiar files; it is actually the E-commerce project we just completed. This is just to showcase that we can add functionality to any existing project (no need to worry much here). We are not going to change anything major in the existing E-commerce code; it is used so that we get comfortable working with an existing, larger codebase.
We are not going to alter much in the e-commerce part, maybe a very little. Will see how it goes.
Install Project and run it
Navigate to starter/server and do
npm install
Setup the .env file with the Mongo URI
Run
npm start
Step 3 (Understand the workflow)
Before we continue, let's try to understand what we need to achieve. Unlike the Jobs API or our E-commerce API, where the user got a token back as soon as they registered, in this project we shouldn't send a token as soon as the user registers.
So, when a user registers with username and password, we create a user in the db, but there's an additional step where the user needs to verify the email. So when the user registers, first we need to send that verification email.
In E-commerce project, in register controller, these are the steps we did
Checked if Name, Email and Password are provided
Checked if this email already exists
If the email didn't exist, we created the user
Once the user got created, we sent back the token
In this project, we need to change the 4th point: after creating the user we shouldn't send back the token, but instead send the verification email. For now, a fake token will also be sent (we will see in a bit why we need this fake token).
Step 4 (Modify current code to implement email verification)
Let's modify existing e-commerce code to implement the email verification
4.1 Modify User Model

const mongoose = require('mongoose')
const validator = require('validator')
const bcrypt = require('bcryptjs')
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name'],
    minlength: 3,
    maxlength: 50,
  },
  email: {
    type: String,
    unique: true,
    required: [true, 'Please provide email'],
    validate: {
      validator: validator.isEmail,
      message: 'Please provide valid email',
    },
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minlength: 6,
  },
  role: {
    type: String,
    enum: ['admin', 'user'],
    default: 'user',
  },
  // Auth Workflow Project
  verificationToken: String,
  isVerified: {
    type: Boolean,
    default: false,
  },
  verified: {
    type: Date,
  },
})
UserSchema.pre('save', async function () {
  // console.log(this.modifiedPaths());
  // console.log(this.isModified('name'));
  if (!this.isModified('password')) return
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})
UserSchema.methods.comparePassword = async function (candidatePassword) {
  const isMatch = await bcrypt.compare(candidatePassword, this.password)
  return isMatch
}
module.exports = mongoose.model('User', UserSchema)
4.2 Modify Register Controller
As I said in Step 4, when the user registers, they will not get back the token but instead the verification email. Hence we need to remove the code that ran after creating the user and add some new code.

const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = 'fake token'
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  // actually we need to send an email here, but let's send verification token back just for testing it in postman!!!
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
4.2.1 Postman test register

So the user got registered in our DB but is not yet verified.

So at this point, if user tries to login, he should not be able to. He must get back a message saying "Please verify your email", so let's modify our login controller to handle this.
4.3 Modify Login Controller
Currently in our E-commerce app (and this auth app, since it's the same app as e-commerce), we send back the token to the user once they log in. This shouldn't happen unconditionally anymore; before sending this token, we need another check.
So, after we check if the provided password is correct, we need to also check if the user is verified or not. If not we need to send back 401 (Unauthorized) response. Let's implement that.

const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
4.4 Setup proper token
Right now we are sending a fake token in the register controller, but when it comes to production, sending a fake token is silly. We need a unique token for each user.
User registers
He/She gets the unique token
Proceed to the next steps
To generate this unique token for each user, there are multiple options. A clean option is the crypto library built into Node, so you don't have to install anything extra.
const crypto = require('crypto')
const verificationToken = crypto.randomBytes(40).toString('hex')
Note: crypto.randomBytes(<number of bytes>) creates a buffer of random bytes. Since this is a buffer, we want to turn it into a string; the buffer has a toString() method which, with 'hex' encoding, turns the buffer into a string where each byte is encoded as 2 hexadecimal characters. Since we asked for 40 bytes, converting to hex gives us back 80 characters.
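A quick sanity check of that byte-to-hex math (just a throwaway snippet):
const crypto = require('crypto')
const token = crypto.randomBytes(40).toString('hex')
console.log(token) // a different random hex string every run
console.log(token.length) // 80 (40 bytes * 2 hex characters per byte)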
Modified this token line in auth controller for register (line 23 below)
const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = crypto.randomBytes(40).toString('hex')
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  // actually we need to send an email here, but let's send verification token back just for testing it in postman!!!
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
4.5 Verify Email big picture
User registers ---> /register - this will send him the verificationToken
User verifies his email (body has verificationToken) ---> /verify-email - this will check if verificationToken sent by user is correct or not and if correct then user is verified
User logs in ---> /login - gets back the normal token like previous projects and he can access any route
When user registers,
User provides name, email and password during registration
These are sent to registration controller via /register route
Registration controller does the following
Checks if email exists in DB already. If yes it throws an error
Checks length of password and hashes the password
Creates a verificationToken (in this project, we are using crypto library provided by node to do this)
Stores name, email, hashed password, verificationToken to DB
Sends back the verificationToken and a message ("please verify email") to the user (now doing this through postman, but generally we will send an email to the user's email with these and expect the user to click the link - we will do this later in some time)
Note that
isVerified is also set to false by the user model in the DB at this point (the schema default)

When user clicks the email sent by registration controller
If this is done through postman then the user can't click the email link, but instead does a POST request to /verify-email (the same thing happens when the link in the email is clicked, i.e. a POST request is made to /verify-email)
User sends a POST request to /verify-email with email and verificationToken
This was the same verificationToken that was sent by registration controller in previous step
/verify-email controller verifies this verificationToken sent by user against the one stored for this user in DB
Once they match, the /verify-email controller
Sets isVerified to true
Sets verified to Date.now()
Sets verificationToken to empty string to avoid duplicate verification
Once these are set in DB, it sends a success message "Email successfully verified" back to the user

4.5.1 Verify Email
Let's now implement above steps to our code to verify email. Initially we will verify the email through postman and then later we will see how to send the email to the user

verifyEmail controller below (shown within the full authController)
const User = require('../models/User')
const { StatusCodes } = require('http-status-codes')
const CustomError = require('../errors')
const { attachCookiesToResponse, createTokenUser } = require('../utils')
const crypto = require('crypto')
const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = crypto.randomBytes(40).toString('hex')
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  // actually we need to send an email here, but let's send verification token back just for testing it in postman!!!
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
const verifyEmail = async (req, res) => {
  const { verificationToken, email } = req.body
  if (!email || !verificationToken) {
    throw new CustomError.BadRequestError('Please provide email and token')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Verification failed')
  }
  if (verificationToken !== user.verificationToken) {
    throw new CustomError.UnauthenticatedError('Verification failed')
  }
  user.isVerified = true
  user.verified = Date.now()
  // to avoid duplicate, setting verificationToken to ''. If user clicks on verify email again, then he will get 'verification failed'
  user.verificationToken = ''
  await user.save()
  res.status(StatusCodes.OK).json({ msg: 'Email verified' })
}
const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
const logout = async (req, res) => {
  res.cookie('token', 'logout', {
    httpOnly: true,
    expires: new Date(Date.now() + 1000),
  })
  res.status(StatusCodes.OK).json({ msg: 'user logged out!' })
}
module.exports = {
  register,
  login,
  logout,
  verifyEmail,
}
4.6 Send Email
Since we have set up the email verification functionality (/verify-email), where we currently send the POST request manually after register, let's now trigger that /verify-email POST request through an email.
So the idea is: when the user registers, instead of sending him the verificationToken and message in the JSON response, let's send an email which will contain the verificationToken, so that when the user clicks the link in that email, a POST request automatically happens to /verify-email.
Before we re-learn how to send email, please re-read Node Js (John Smilga - Udemy) and familiarize yourself with how we send emails in dev and prod environments.
So let's now do it step by step
4.6.1 Email Setup
npm install nodemailer
Let's create these files in the utils folder
/utils/nodemailerConfig.js
/utils/sendEmail.js
/utils/sendResetPasswordEmail.js
/utils/sendVerificationEmail.js
Note that we are going to split the email sending functionality into different files as we need to re-use them for sending the forgot/reset password email as well (a possible utils/index.js barrel that re-exports them is sketched below).
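Since the controllers later pull these helpers in with a single require('../utils') and destructuring, a minimal utils/index.js barrel for this setup could look roughly like this (a sketch; the jwt.js and createTokenUser.js file names are assumed from the earlier e-commerce setup, and createHash.js only comes in at Step 8):
// utils/index.js - re-export the helpers so controllers can do require('../utils')
const { createJWT, isTokenValid, attachCookiesToResponse } = require('./jwt')
const createTokenUser = require('./createTokenUser')
const sendVerificationEmail = require('./sendVerificationEmail')
const sendResetPasswordEmail = require('./sendResetPasswordEmail')
const createHash = require('./createHash') // added later in Step 8
module.exports = {
  createJWT,
  isTokenValid,
  attachCookiesToResponse,
  createTokenUser,
  sendVerificationEmail,
  sendResetPasswordEmail,
  createHash,
}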
Send Email
Let's now write some code in /utils/sendEmail.js file. First let's hardcode sending email part -  what we did in Node Js (John Smilga - Udemy). Later we can tweak this to re-use this for sending password reset email and verification email.
const nodemailer = require('nodemailer')
// Sending Email through Ethereal transporter
const sendEmail = async (req, res) => {
  // Not using testAccount in this case. 
  // If it was used then user should have been testAccount.user instead of 
  // 'joel39@ethereal.email' and 
  // pass should have been testAccount.pass instead of 'fRYdcDUhv1kUEp5A8F'
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal account
  const transporter = nodemailer.createTransport({
    host: 'smtp.ethereal.email',
    port: 587,
    auth: {
      user: 'joel39@ethereal.email',
      pass: 'fRYdcDUhv1kUEp5A8F',
    },
  })
  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: '"Fred Foo π»" <foo@example.com>', // sender address
    to: 'bar@example.com, baz@example.com', // list of receivers
    subject: 'Hello ✔', // Subject line
    text: 'Hello world?', // plain text body
    html: '<b>Hello world?</b>', // html body
  })
  res.status(200).json(info)
}
module.exports = sendEmail
Note that the above code will be used in authController in register. Later we will use this code in some more places, like sending the reset password email and resending the verification email (so we will later modify the above sendEmail code to accept arguments).
const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = crypto.randomBytes(40).toString('hex')
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  await sendEmail() // SENDING EMAIL HERE. NOT YET INCLUDING THE TOKEN. WILL DO IT SHORTLY
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    // NOW COMMENTING THIS BELOW AS WE ARE NOW SENDING THE EMAIL ABOVE - await sendEmail()
    // verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
Now we are actually sending email during the registration process but not yet sending the verification token in that email. The above is to showcase how we are first sending the email and then sending the response back to the user (Success message). We will send this verification token in the email shortly.

Also, one more thing to remember is that currently, in our register controller, we are sending email to hardcoded email (receivers below), but this will change later. We will later be sending to the actual user who is registering.
  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: '"Fred Foo π»" <foo@example.com>', // sender address
    to: 'bar@example.com, baz@example.com', // list of receivers
    subject: 'Hello ✔', // Subject line
    text: 'Hello world?', // plain text body
    html: '<b>Hello world?</b>', // html body
  })
4.6.2 Send Email from front-end
Now that we have set up the code for sending email (without sending a proper token in the email yet, of course), let's try to send this email from the front-end.
We know that the register functionality works since we tested it in Postman in section 4.6.1 above, so we can try doing the same by clicking the register button in the front-end. Navigate to starter/front-end in the project and run
npm install && npm start
You will see the react-app spin up. Do the registration process and you should see it send an email.





4.6.3 Sending Verification link in email
Let's work on refactoring the sendEmail part. Let's put the createTransport configuration in nodemailerConfig.js. If we are using SendGrid or any other service for production, then it's good to put the values of host, port, user and pass into .env and then use them in nodemailerConfig.js (an env-based version is sketched below).
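For production, that env-driven version of nodemailerConfig.js could look roughly like this (a sketch; the EMAIL_HOST / EMAIL_PORT / EMAIL_USER / EMAIL_PASS variable names are my own assumptions, not from the course):
// utils/nodemailerConfig.js - sketch of reading SMTP settings from .env for prod (e.g. SendGrid)
module.exports = {
  host: process.env.EMAIL_HOST,
  port: Number(process.env.EMAIL_PORT),
  auth: {
    user: process.env.EMAIL_USER,
    pass: process.env.EMAIL_PASS,
  },
}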

Once we do this the code looks like this
//11-auth-workflow/starter/server/utils/nodemailerConfig.js
module.exports = {
  host: 'smtp.ethereal.email',
  port: 587,
  auth: {
    user: 'dameon.nitzsche94@ethereal.email',
    pass: 'rvvbPUutr2MZBuM5jq',
  },
}
/* -------------------------------------------------------- */
//11-auth-workflow/starter/server/utils/sendEmail.js
const nodemailer = require('nodemailer')
const nodeMailerConfig = require('./nodemailerConfig')
// Sending Email through Ethereal transporter
const sendEmail = async (req, res) => {
  // Not using testAccount in this case. If it was used then 
  // user should have been testAccount.user instead of 'joel39@ethereal.email' 
  // and pass should have been testAccount.pass instead of 'fRYdcDUhv1kUEp5A8F'
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal account
  const transporter = nodemailer.createTransport(nodeMailerConfig)
  // send mail with defined transport object
  let info = await transporter.sendMail({
    from: '"Fred Foo π»" <foo@example.com>', // sender address
    to: 'bar@example.com, baz@example.com', // list of receivers
    subject: 'Hello ✔', // Subject line
    text: 'Hello world?', // plain text body
    html: '<b>Hello world?</b>', // html body
  })
}
module.exports = sendEmail
Once we do this, let's pass in to, subject and html as params. Also, we can return transporter.sendMail() directly and call it where we use sendEmail. There is no need for the await keyword on that return, since sendEmail is an async function and returns a promise by default, so we can omit await on the sendMail() call.

After the refactor, sendEmail looks like the version shown under Setup B below.

Ok, now we have a better understanding of how we can make the sendEmail function accept params. Currently, sendEmail is being called in registerController, but we have to use this send email functionality in resetPassword as well later.
So let's have a separate function called sendVerificationEmail to send email during registration and sendResetPasswordEmail to send reset password email later. Both of these functions will call sendEmail.
The setup we currently have  (Setup A)

The setup we need (Setup B)

We need to implement Setup B, where the sendVerificationEmail function uses sendEmail. We will then use sendVerificationEmail in the register controller.
The files for Setup B look like this
const nodemailer = require('nodemailer')
const nodeMailerConfig = require('./nodemailerConfig')
// Sending Email through Ethereal transporter
const sendEmail = async ({ to, subject, html }) => {
  // Not using testAccount in this case. If it was used then user should have been testAccount.user instead of 'joel39@ethereal.email' and pass should have been testAccount.pass instead of 'fRYdcDUhv1kUEp5A8F'
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal account
  const transporter = nodemailer.createTransport(nodeMailerConfig)
  // send mail with defined transport object. This return
  return transporter.sendMail({
    from: '"Sandeep Amarnath" <sandeep@gmail.com>', // sender address
    to, // list of receivers
    subject, // Subject line
    html, // html body
  })
}
module.exports = sendEmail
/* -------------------------------------------------------- */
// utils/sendVerificationEmail.js
const sendEmail = require('./sendEmail')
/* origin is going to be the URL for the front-end (to specify where the user 
should navigate once he clicks the link). More on this later */
// prettier-ignore
const sendVerificationEmail = async({ name,email,verificationToken,origin}) => {
    const link = 'https://google.com'
    // we will add the confirmation link shortly
    const message = `<p>Please confirm your email by clicking on the following link: <a href='${link}'>Take me to Google</a> </p>`
    // sendEmail is a promise and we are directly returning without await. 
    // It will be awaited in the controller it's called in
    return sendEmail({
        to:email,
        subject:'Email Confirmation',
        html:`<h1>Hello ${name}</h1>   
             ${message}
        `
    })
}
module.exports = sendVerificationEmail
/* -------------------------------------------------------- */
// controllers/authController.js - register
const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = crypto.randomBytes(40).toString('hex')
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  /* await sendEmail() -  Not used anymore since we are using sendVerificationEmail here.
  sendVerificationEmail internally uses sendEmail */
  // await sendEmail()
  /* origin is URL to specify where the user 
      should navigate once he clicks the link. More on this later */
  const origin = 'http://localhost:3000'
  await sendVerificationEmail({
    name: user.name,
    email: user.email,
    verificationToken: user.verificationToken,
    origin,
  })
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    // NOW COMMENTING THIS BELOW AS WE ARE SENDING THE EMAIL ABOVE - await sendVerificationEmail()
    // verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
At this point we can send the verification email and the user gets it in Ethereal. In sendVerificationEmail, where we are sending the email, we are including a link and, on clicking it, it takes us to google.com. So the current flow is
User registers
The request hits registerController
In registerController, sendVerificationEmail function is called
In the sendVerificationEmail function, the sendEmail function is called, which sends the email to the user. The email includes a link (<a href>), which when clicked by the user goes to google.com. (We are using the google.com link just for understanding here; we will replace it with the actual verification link, so that when the user clicks it he gets verified and is navigated to the confirmation screen. We will see how in a moment.)
The registerController also sends back the success message
4.6.3.1 Sending the actual verification link
Now that we know we can include link in the email (we are doing it in sendVerificationEmail and it currently takes user to google), let's extend on this knowledge and build a verification link. So, the idea is
We send a verification link in the email when user registers
This link, when clicked
It should navigate the user to the front-end route
/user/verify-email
This link should also have query params (verificationToken and email) so that, while navigating to this route, the front-end reads these query params and sends a POST request to
/api/v1/auth/verify-email
as we did from Postman with {{URL}}/verify-email. The POST request will include the verificationToken and email passed as query params, and this verifies the email.
After this, it also displays that the verification process is successful.
Front-end code
Notice that in react app below, react router is configured for /user/verify-email route. So the idea is, when the link is clicked from email, it should navigate to this route. For that to happen, the href in that link should be pointing to this route /user/verify-email. Since we need full URL, it should be http://localhost:3000/user/verify-email with query params verificationToken and email. Let's see why we need these query params to be present in this link.

Let's look at <Verify/> Component

import React, { useState, useEffect } from 'react';
import { useLocation, Link } from 'react-router-dom';
import styled from 'styled-components';
import { useGlobalContext } from '../context';
import axios from 'axios';
function useQuery() {
  return new URLSearchParams(useLocation().search);
}
const VerifyPage = () => {
  const [error, setError] = useState(false);
  const [loading, setLoading] = useState(false);
  const { isLoading } = useGlobalContext();
  const query = useQuery();
  const verifyToken = async () => {
    setLoading(true);
    try {
      const { data } = await axios.post('/api/v1/auth/verify-email', {
        verificationToken: query.get('token'),
        email: query.get('email'),
      });
    } catch (error) {
      // console.log(error.response);
      setError(true);
    }
    setLoading(false);
  };
  useEffect(() => {
    if (!isLoading) {
      verifyToken();
    }
  }, []);
  if (loading) {
    return (
      <Wrapper className='page'>
        <h2>Loading...</h2>
      </Wrapper>
    );
  }
  if (error) {
    return (
      <Wrapper className='page'>
        <h4>There was an error, please double check your verification link </h4>
      </Wrapper>
    );
  }
  return (
    <Wrapper className='page'>
      <h2>Account Confirmed</h2>
      <Link to='/login' className='btn'>
        Please login
      </Link>
    </Wrapper>
  );
};
const Wrapper = styled.section``;
export default VerifyPage;
So now we know that we need to send the query params token (the verificationToken) and email. Let's see the full href of the link (when clicked, this is what the URL should be, with query params, which then hits the Verify page in the react app):
// href of that link is (when it is clicked it should navigate to)
http://localhost:3000/user/verify-email?token=<token>&email=<email>
So the sendVerificationEmail function looks like this

const sendEmail = require('./sendEmail')
/* origin is going to be the URL for the front-end (to specify where the user 
should navigate once he clicks the link). More on this later */
// prettier-ignore
const sendVerificationEmail = async({ name,email,verificationToken,origin}) => {
    // origin is http://localhost:3000 for dev (passed in from registerController)
    // so the full URL should be - http://localhost:3000/user/verify-email?token=<token>&email=<email>
    const link = `${origin}/user/verify-email?token=${verificationToken}&email=${email}`
    // the confirmation link built from origin + front-end route + query params
    const message = `<p>Please confirm your email by clicking on the following link: 
    <a href='${link}'>Click to verify</a> </p>`
    // sendEmail is a promise and we are directly returning without await. 
    // It will be awaited in the controller its  called in
    return sendEmail({
        to:email,
        subject:'Email Confirmation',
        html:`<h1>Hello ${name}</h1>   
             ${message}
        `
    })
}
module.exports = sendVerificationEmail
Full flow of sending verification email
4.6.4 Origin info
We used origin = http://localhost:3000 in sendVerificationEmail, but what about production? We can do it the same way: first deploy our app to prod and then use that prod URL as the origin. That is the easy way.
Now what if we need this to be dynamic? If there is some other front-end using our app, then it's tedious to have to know the deployed URL of that app, which is an unnecessary headache (maybe it's fine in this case). So let's see how we can generate this origin.
The origin can be generated from the req object, which we already have access to. If we console.log the req object it looks very large; we mainly want the headers here. Here are a few important fields of the req object.

x-forwarded-host
x-forwarded-proto
The above 2 form the actual origin (because that is the actual sender of the request).
We can use them as origin like this
const origin = `${req.get('x-forwarded-proto')}://${req.get('x-forwarded-host')}`
// another way of getting fields from req headers is below
// const origin = `${req.headers['x-forwarded-proto']}://${req.headers['x-forwarded-host']}`
Note that we use get method (req.get) to get the headers
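A throwaway sketch (not part of the final code) to inspect these values from inside any controller, e.g. register:
// just to inspect what the proxy forwards - drop into register temporarily
console.log('host:', req.get('host')) // e.g. localhost:5000 (our server, because of the front-end proxy)
console.log('x-forwarded-host:', req.get('x-forwarded-host')) // e.g. localhost:3000 (the actual front-end)
console.log('x-forwarded-proto:', req.get('x-forwarded-proto')) // e.g. http
console.log('referrer:', req.get('referrer')) // URL where the request came from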
................................................................................
The origin shown inside headers is https://localhost:5000. This is because we include
our server URL as proxy to allow port forwarding for CORS. 
referrer: This is the URL where the request came from
Step 5 (Refresh token)
I'm making this Step 5 rather than 4.7. Technically, we are still building on the same E-commerce code (as the Step 4 heading specifies); it's just Step 5 for better differentiation.
5.1 Need of Refresh token
These are some of the points I am drafting based on the resources (stack overflow and youtube) attached below
In the real world, it's better to have separate servers called authServer and resourceServer/s.
authServer - used only for authentication and authorization. This server's responsibility is to issue the refresh token and access token, and also to log users out.
resourceServer - this server (there can also be multiple) provides protected data like products, reviews and so on (like we did in the e-commerce project).
One of the uses of the refresh token is that we don't have to send the username and password (credentials) over the wire (from front-end to back-end) every time we need a new access token. The refresh token gets us a new access token from the authServer so we can continue to make requests to the protected resourceServer.
The other main use of the refresh_token is this: say your authServer is much more protected than your resourceServer in the real world (third-party services like Auth0, Okta, Azure and so on, or your own implementation). You only ever send your access_token to the resourceServer (to get data) and you never send the refresh_token to the resourceServer. So there is a chance that, when your access_token is sent to the resourceServer, a hacker intercepting the resourceServer (since it's not that secure) gets access to your short-lived access_token.
For this reason, the access_token has a short life span (like 30 minutes). Remember, when this access_token expires, you send the refresh_token to the authServer (which is much more secure than the resourceServer) to get a new access_token. Since you never send the refresh_token to the resourceServer, there is no way the hacker intercepting the resourceServer gets your refresh_token. If you, as a developer, still suspect that your users' refresh_tokens might have been compromised, you can log out all users (make the refresh_token invalid for all users); the users will log in again to get a new refresh_token and things get back on track.
Resources
5.2 Access and Refresh  token in current app
This is what we need to achieve in current app
User logs in
Then he/she gets back both access-token and refresh-token
Access-token will have short lifetime and refresh token will have longer life time
Once access-token expires the refresh token is used to get back a new access token
User can continue to access the app with new access-token
5.3 Modify Setup 
5.3.1 Token Model
We are creating this token model here as we are going to store the refresh token in the db.
const mongoose = require('mongoose')
const TokenSchema = new mongoose.Schema(
  {
    refreshToken: {
      type: String,
      required: true,
    },
    /* We are not going to use the ip and UserAgent fields in this project 
  (they are here just to showcase where we can get them on the request) */
    // to specify where we are getting the request from 
    ip: {
      type: String,
      required: true,
    },
    // to specify what device is accessing our project
    userAgent: {
      type: String,
      required: true,
    },
    isValid: {
      type: Boolean,
      default: true,
    },
    user: {
      type: mongoose.Types.ObjectId,
      ref: 'User',
      required: true,
    },
  },
  { timestamps: true }
)
const tokenModel = mongoose.model('Token', TokenSchema)
module.exports = tokenModel
5.3.2 Create refresh token in login controller
We currently have code to create a token (this was just an access token, as per our previous setup in the e-commerce project), but now let's set up the refresh token.
Before I do that, let me show you the current code for the login controller before modifying it. Note that the isVerified check in it was added earlier in this Auth Workflow setup.
const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
Modified login controller
We are creating a refresh token here, let's see how it looks. For now, we are just creating the refresh token, adding it to DB, and sending it back to the user just to check if it works. We will tweak this eventually and add more functionality.

const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  // create refresh token - and tie this to a user
  let refreshToken = '' // we will see why this is let and why setting this to empty later
  // let's set all the fields the token model needs and tie this refresh token to a user
  refreshToken = crypto.randomBytes(40).toString('hex')
  const userAgent = req.headers['user-agent'] // another way is  - req.get('user-agent')
  const ip = req.ip
  const userToken = { refreshToken, ip, userAgent, user: user._id }
  const token = await Token.create(userToken)
  // check for existing refresh token (we will do this later, and we will see later why this is important)
  /* commenting below code of attachCookiesToResponse for now*/
  //attachCookiesToResponse({ res, user: tokenUser })
  res.status(StatusCodes.OK).json({ user: tokenUser, token }) // just to showcase, let's send back the token
}
5.3.2.1 Send two cookies (access + refresh token)
Earlier, in register and login controller, we just sent one cookie which was called token after user logged in or registered. Now we need to modify this to send two cookies (one for access + one for refresh token)
Current setup we have with one token 
const jwt = require('jsonwebtoken')
const createJWT = ({ payload }) => {
  // we can also remove expiresIn in this token and set it in the cookie on which we send this token below
  const token = jwt.sign(payload, process.env.JWT_SECRET, {
    expiresIn: process.env.JWT_LIFETIME,
  })
  return token
}
const isTokenValid = ({ token }) => jwt.verify(token, process.env.JWT_SECRET)
const attachCookiesToResponse = ({ res, user }) => {
  const token = createJWT({ payload: user })
  const oneDay = 1000 * 60 * 60 * 24
  res.cookie('token', token, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDay),
    secure: process.env.NODE_ENV === 'production',
    signed: true,
  })
}
module.exports = {
  createJWT,
  isTokenValid,
  attachCookiesToResponse,
}
After modification for two cookies.
We use attachCookiesToResponse in login controller, so let's change there a bit 
const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  // create refresh token - and tie this to a user
  let refreshToken = '' // we will see why this is let and why setting this to empty later
  // let's set all the fields the token model needs and tie this refresh token to a user
  refreshToken = crypto.randomBytes(40).toString('hex')
  const userAgent = req.headers['user-agent'] // another way is  - req.get('user-agent')
  const ip = req.ip
  const userToken = { refreshToken, ip, userAgent, user: user._id }
  await Token.create(userToken)
  // check for existing refresh token (we will do this later, and we will see later why this is important)
  attachCookiesToResponse({ res, user: tokenUser, refreshToken }) // adding refreshToken here so we can send access + refresh token in attachCookiesToResponse
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
/* -------------------------------------------------------- */
const jwt = require('jsonwebtoken')
const createJWT = ({ payload }) => {
  // we can also remove expiresIn in this token and set it in the cookie on which we send this token below
  // const token = jwt.sign(payload, process.env.JWT_SECRET, {
  //   expiresIn: process.env.JWT_LIFETIME,
  // })
  const token = jwt.sign(payload, process.env.JWT_SECRET)
  return token
}
const isTokenValid = ({ token }) => jwt.verify(token, process.env.JWT_SECRET)
const attachCookiesToResponse = ({ res, user, refreshToken }) => {
  // in accessTokenJWT we will have only the user
  // in refreshTokenJWT we will have the user + refreshToken string value
  const accessTokenJWT = createJWT({ payload: { user } })
  const refreshTokenJWT = createJWT({ payload: { user, refreshToken } })
  // accessTokenJWT can be short term like 15 mins, where as refreshTokenJWT can be longer like one day or 60 days
  const oneDay = 1000 * 60 * 60 * 24
  res.cookie('accessToken', accessTokenJWT, {
    httpOnly: true,
    // expires: new Date(Date.now() + oneDay), // we can give expiresIn or maxAge
    secure: process.env.NODE_ENV === 'production',
    signed: true,
    maxAge: 1000, // 1000 = 1s
  })
  res.cookie('refreshToken', refreshTokenJWT, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDay),
    secure: process.env.NODE_ENV === 'production',
    signed: true,
  })
}
// const OLD_attachCookiesToResponse = ({ res, user }) => {
//   const token = createJWT({ payload: user })
//   const oneDay = 1000 * 60 * 60 * 24
//   res.cookie('token', token, {
//     httpOnly: true,
//     expires: new Date(Date.now() + oneDay),
//     secure: process.env.NODE_ENV === 'production',
//     signed: true,
//   })
// }
module.exports = {
  createJWT,
  isTokenValid,
  attachCookiesToResponse,
}
We modified attachCookiesToResponse to attach two cookies to the response: one for access and one for refresh.

5.3.2.2 Check if refresh token exists in DB already

Since we are creating a refresh token and adding it to the DB every time the user logs in, this creates multiple refresh tokens for the same user, which is not good.
Instead, we need to check if a refresh token already exists in the DB for this user, and if it does, then we need to check if it is valid. If valid, then we can use this token and attach it to the cookie.
We can also mark this token invalid if we find any suspicious login, so that the user then has to provide credentials once again, which is the safer approach.
Let's add the code to check if refresh token exists.

const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  // create refresh token - and tie this to a user
  let refreshToken = ''
  const existingToken = await Token.findOne({ user: user._id })
  console.log('existing token', existingToken)
  // IF Refresh Token is already present in DB
  if (existingToken) {
    // if u find any suspicious activity on this refresh token, u can go and invalidate in DB for this user
    if (!existingToken.isValid) {
      throw new CustomError.UnauthenticatedError('Invalid credentials')
    }
    refreshToken = existingToken.refreshToken
    attachCookiesToResponse({ res, user: tokenUser, refreshToken })
    res.status(StatusCodes.OK).json({ user: tokenUser })
    return // adding return here so that the below code doesn't have to execute if token is already present in DB
  }
  // IF Refresh Token is NOT present in DB
  console.log('REACHED NOT BLOCK')
  refreshToken = crypto.randomBytes(40).toString('hex')
  const userAgent = req.headers['user-agent'] // another way is  - req.get('user-agent')
  const ip = req.ip
  const userToken = { refreshToken, ip, userAgent, user: user._id }
  await Token.create(userToken)
  attachCookiesToResponse({ res, user: tokenUser, refreshToken }) // adding refreshToken here so we can send access + refresh token in attachCookiesToResponse
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
5.3.3 Authentication middleware modification
Remember what authentication middleware does? Just to remind - we use this middleware in front of protected routes. Let's say we want to create a new Order, and for this the user must already be logged in.

This authentication mw checks if the request has the token attached and if yes then only it will allow us to create the order.
Currently we have this in auth mw

Now, since we no longer have a single token cookie but an access-token and a refresh-token, we need to modify this auth mw code.

// middleware/authentication.js
const CustomError = require('../errors')
const Token = require('../models/Token')
const { isTokenValid, attachCookiesToResponse } = require('../utils')
const authenticateUser = async (req, res, next) => {
  const { accessToken, refreshToken } = req.signedCookies
  try {
    if (accessToken) {
      const payload = isTokenValid(accessToken)
      req.user = payload.user
      return next()
    }
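    // no access token cookie present (e.g. its maxAge has passed): fall back to the refresh token cookie,
    // verify it, check it still exists and is valid in the DB, and if so re-issue both cookies -
    // this is how the user silently gets a new access token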
    const payload = isTokenValid(refreshToken)
    const existingToken = await Token.findOne({
      user: payload.user.userId,
      refreshToken: payload.refreshToken,
    })
    if (!existingToken || !existingToken.isValid) {
      throw new CustomError.UnauthenticatedError('Authentication Invalid')
    }
    req.user = payload.user
    attachCookiesToResponse({
      res,
      user: payload.user,
      refreshToken: existingToken.refreshToken,
    })
    next()
  } catch (e) {
    throw new CustomError.UnauthenticatedError(
      'Authentication Invalid, error occurred'
    )
  }
  // OLD IMPLEMENTATION BELOW
  // const token = req.signedCookies.token;
  // if (!token) {
  //   throw new CustomError.UnauthenticatedError('Authentication Invalid');
  // }
  // try {
  //   const { name, userId, role } = isTokenValid({ token });
  //   req.user = { name, userId, role };
  //   next();
  // } catch (error) {
  //   throw new CustomError.UnauthenticatedError('Authentication Invalid');
  // }
}
module.exports = { authenticateUser }
/* -------------------------------------------------------- */
const jwt = require('jsonwebtoken')
const createJWT = ({ payload }) => {
  // we can also remove expiresIn in this token and set it in the cookie on which we send this token below
  // const token = jwt.sign(payload, process.env.JWT_SECRET, {
  //   expiresIn: process.env.JWT_LIFETIME,
  // })
  const token = jwt.sign(payload, process.env.JWT_SECRET)
  return token
}
const isTokenValid = (token) => jwt.verify(token, process.env.JWT_SECRET)
/*
On verifying, we get back same thing what we signed
- For access token we get 
{ payload: { user } }
- For refresh token we get 
{ payload: { user, refreshToken } }
*/
const attachCookiesToResponse = ({ res, user, refreshToken }) => {
  // in accessTokenJWT we will have only the user
  // in refreshTokenJWT we will have the user + refreshToken string value
  const accessTokenJWT = createJWT({ payload: { user } })
  const refreshTokenJWT = createJWT({ payload: { user, refreshToken } })
  // accessTokenJWT can be short term like 15 mins, where as refreshTokenJWT can be longer like one day or 60 days
  const oneDay = 1000 * 60 * 60 * 24
  res.cookie('accessToken', accessTokenJWT, {
    httpOnly: true,
    // expires: new Date(Date.now() + oneDay), // we can give expiresIn or maxAge
    secure: process.env.NODE_ENV === 'production',
    signed: true,
    maxAge: 1000 * 60 * 60 * 60, // 1000 = 1s
  })
  res.cookie('refreshToken', refreshTokenJWT, {
    httpOnly: true,
    expires: new Date(Date.now() + oneDay),
    secure: process.env.NODE_ENV === 'production',
    signed: true,
  })
}
// const OLD_attachCookiesToResponse = ({ res, user }) => {
//   const token = createJWT({ payload: user })
//   const oneDay = 1000 * 60 * 60 * 24
//   res.cookie('token', token, {
//     httpOnly: true,
//     expires: new Date(Date.now() + oneDay),
//     secure: process.env.NODE_ENV === 'production',
//     signed: true,
//   })
// }
module.exports = {
  createJWT,
  isTokenValid,
  attachCookiesToResponse,
}
Step 6 (Logout)
Let's now implement the logout functionality
6.1 Remove the tokens
We can logout the users by
Setting access and refresh token cookies to timeout immediately
Removing the token for the logged in user from the database
So the idea is,
The user is already logged in before, so he can be logged out
By this, we know that we can get access to
req.user in the logout controller. We can achieve this by adding the authentication middleware in the logout route before it hits the logout controller (making the logout route a protected route like the showCurrentUser route). Also, the logout route can be a delete route
router.delete('/logout', authenticateUser, logout)
Here's the complete code for authRoutes
const express = require('express')
const router = express.Router()
const {
  register,
  login,
  logout,
  verifyEmail,
} = require('../controllers/authController')
const { authenticateUser } = require('../middleware/authentication')
router.post('/register', register)
router.post('/login', login)
router.delete('/logout', authenticateUser, logout)
router.post('/verify-email', verifyEmail)
module.exports = router
Logout controller

Logout controller code below (shown as part of the full authController)
const User = require('../models/User')
const Token = require('../models/Token')
const { StatusCodes } = require('http-status-codes')
const CustomError = require('../errors')
const {
  attachCookiesToResponse,
  createTokenUser,
  sendVerificationEmail,
} = require('../utils')
const crypto = require('crypto')
// const sendEmail = require('../utils/sendEmail') // Not used anymore since we are using sendVerificationEmail here.
// sendVerificationEmail internally uses sendEmail
const register = async (req, res) => {
  const { email, name, password } = req.body
  const emailAlreadyExists = await User.findOne({ email })
  if (emailAlreadyExists) {
    throw new CustomError.BadRequestError('Email already exists')
  }
  // first registered user is an admin
  const isFirstAccount = (await User.countDocuments({})) === 0
  const role = isFirstAccount ? 'admin' : 'user'
  /* E-Commerce CODE where we used to send token after registration was successful. 
     Commenting this below part out in this Auth Workflow setup*/
  // const tokenUser = createTokenUser(user);
  // attachCookiesToResponse({ res, user: tokenUser });
  // res.status(StatusCodes.CREATED).json({ user: tokenUser });
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending verification email */
  const verificationToken = crypto.randomBytes(40).toString('hex')
  const user = await User.create({
    name,
    email,
    password,
    role,
    verificationToken,
  })
  /* await sendEmail() -  Not used anymore since we are using sendVerificationEmail here.
  sendVerificationEmail internally uses sendEmail */
  // await sendEmail()
  /* origin is URL to specify where the user 
      should navigate once he clicks the link. More on this later */
  const origin = 'http://localhost:3000'
  await sendVerificationEmail({
    name: user.name,
    email: user.email,
    verificationToken: user.verificationToken,
    origin,
  })
  res.status(StatusCodes.CREATED).json({
    msg: 'Success! Please check your email to verify the account',
    // NOW COMMENTING THIS BELOW AS WE ARE SENDING THE EMAIL ABOVE - await sendVerificationEmail()
    // verificationToken: user.verificationToken, // we could have directly done verification token. But just seeing if user got created and has verificationToken on user object
  })
}
const verifyEmail = async (req, res) => {
  const { verificationToken, email } = req.body
  if (!email || !verificationToken) {
    throw new CustomError.BadRequestError('Please provide email and token')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Verification failed')
  }
  if (verificationToken !== user.verificationToken) {
    throw new CustomError.UnauthenticatedError('Verification failed')
  }
  user.isVerified = true
  user.verified = Date.now()
  // to avoid duplicate, setting verificationToken to ''. If user clicks on verify email again, then he will get 'verification failed'
  user.verificationToken = ''
  await user.save()
  res.status(StatusCodes.OK).json({ msg: 'Email verified' })
}
const login = async (req, res) => {
  const { email, password } = req.body
  if (!email || !password) {
    throw new CustomError.BadRequestError('Please provide email and password')
  }
  const user = await User.findOne({ email })
  if (!user) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  const isPasswordCorrect = await user.comparePassword(password)
  if (!isPasswordCorrect) {
    throw new CustomError.UnauthenticatedError('Invalid Credentials')
  }
  /* In Auth Workflow project, let's now implement this new functionality below of 
     sending 401 (Unauthorized) if user is not verified */
  if (!user.isVerified) {
    throw new CustomError.UnauthenticatedError('Please verify your email')
  }
  /* The above is implemented in AuthWorkflow project*/
  const tokenUser = createTokenUser(user)
  // create refresh token - and tie this to a user
  let refreshToken = ''
  const existingToken = await Token.findOne({ user: user._id })
  // IF Refresh Token is already present in DB
  if (existingToken) {
    // if u find any suspicious activity on this refresh token, u can go and invalidate in DB for this user
    if (!existingToken.isValid) {
      throw new CustomError.UnauthenticatedError('Invalid credentials')
    }
    refreshToken = existingToken.refreshToken
    attachCookiesToResponse({ res, user: tokenUser, refreshToken })
    res.status(StatusCodes.OK).json({ user: tokenUser })
    return // adding return here so that the below code doesn't have to execute if token is already present in DB
  }
  // IF Refresh Token is NOT present in DB
  refreshToken = crypto.randomBytes(40).toString('hex')
  const userAgent = req.headers['user-agent'] // another way is  - req.get('user-agent')
  const ip = req.ip
  const userToken = { refreshToken, ip, userAgent, user: user._id }
  await Token.create(userToken)
  attachCookiesToResponse({ res, user: tokenUser, refreshToken }) // adding refreshToken here so we can send access + refresh token in attachCookiesToResponse
  res.status(StatusCodes.OK).json({ user: tokenUser })
}
const logout = async (req, res) => {
  // We need to remove that refreshToken stored in the db for this user
  await Token.findOneAndDelete({ user: req.user.userId })
  // Setting the expiration time to now so that the tokens expires immediately
  res.cookie('accessToken', 'logout', {
    httpOnly: true,
    expires: new Date(Date.now()),
  })
  res.cookie('refreshToken', 'logout', {
    httpOnly: true,
    expires: new Date(Date.now()),
  })
  // leaving the message here just so we can see in postman
  res.status(StatusCodes.OK).json({ msg: 'user logged out!' })
}
module.exports = {
  register,
  login,
  logout,
  verifyEmail,
}
After implementing this, you can test the flow
In front-end or postman, login a user
Make sure you got back both the cookies
If you are on postman, then you can easily test showCurrentUser (/showMe) route
Delete the access token (not refresh token), and call the showMe route again. You should get back new access token. This shows that the refresh token functionality works fine
After confirming that the refresh token works fine, let's test the logout functionality by deleting both the tokens
Make a request to showMe route and it should give you back 401 (Invalid Credentials)
Login again and then you should get back both the cookies (both tokens)
Hurray! Everything works. Awesome π
Step 7 (Password Reset Functionality)
7.1 User Model modification
Let's add two fields to User Model
passwordToken
passwordTokenExpirationDate
See the new passwordToken and passwordTokenExpirationDate fields in the model below
const mongoose = require('mongoose')
const validator = require('validator')
const bcrypt = require('bcryptjs')
const UserSchema = new mongoose.Schema({
  name: {
    type: String,
    required: [true, 'Please provide name'],
    minlength: 3,
    maxlength: 50,
  },
  email: {
    type: String,
    unique: true,
    required: [true, 'Please provide email'],
    validate: {
      validator: validator.isEmail,
      message: 'Please provide valid email',
    },
  },
  password: {
    type: String,
    required: [true, 'Please provide password'],
    minlength: 2,
  },
  role: {
    type: String,
    enum: ['admin', 'user'],
    default: 'user',
  },
  // Auth Workflow Project
  verificationToken: String,
  isVerified: {
    type: Boolean,
    default: false,
  },
  verified: {
    type: Date,
  },
  // Reset / forgot password functionality support
  passwordToken: {
    type: String,
  },
  passwordTokenExpirationDate: {
    type: Date,
  },
})
UserSchema.pre('save', async function () {
  // console.log(this.modifiedPaths());
  // console.log(this.isModified('name'));
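  // only re-hash when the password field actually changed (register / resetPassword);
  // saves that don't touch the password (e.g. verifyEmail, forgotPassword) skip hashing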
  if (!this.isModified('password')) return
  const salt = await bcrypt.genSalt(10)
  this.password = await bcrypt.hash(this.password, salt)
})
UserSchema.methods.comparePassword = async function (candidatePassword) {
  const isMatch = await bcrypt.compare(candidatePassword, this.password)
  return isMatch
}
module.exports = mongoose.model('User', UserSchema)
7.2 Auth controller and auth router modifications


7.3 Forgot Password functionality
Same as send verification email functionality. Refer Node Js (John Smilga - Udemy) before you
proceed. The idea is,
User forgot the password and he clicks on "Reset/Forgot Password" link in front-end
He gets an email and needs to reset his password by clicking on the link sent to his email within a certain time (passwordTokenExpirationDate)
Then he will be redirected to a page where he can give a new password, and that will be saved to the DB
In this 7.3 section, let's work on sending that email with the reset password link when the user clicks the Forgot/Reset Password button in the front-end.
Front-end
On Front-end, we have a route for forgot password


Server
We will work in forgotPassword controller
Check for email. If not provided then throw a (400) Bad Request error
Check if user exists, and if he does then create a passwordToken (we will be sending this in the email) and also set passwordTokenExpirationDate to know how long this email is valid
Send the email to the user (if user exists) - We will do it in 7.3.1 section below
Whether the user exists or not, send back the success message saying - "Success - Please check your email to reset the password"
Even if the user doesn't exist, why do we send this success message? Because that way, an attacker/hacker who tries to randomly trigger password resets can't tell whether that email exists in the DB or not.
Hence, by sending this success message whether or not the user exists, the hacker won't know if that user is in the DB.

Small correction: passwordTokenExpirationDate should be computed as new Date(Date.now() + tenMinutes), as in the code below.
const forgotPassword = async (req, res) => {
  const { email } = req.body
  if (!email) {
    throw new CustomError.BadRequestError('Please provide a valid email')
  }
  const user = await User.findOne({ email })
  if (user) {
    const passwordToken = crypto.randomBytes(70).toString('hex')
    // send email for password reset
    const origin = 'http://localhost:3000'
    await sendResetPasswordEmail({
      name: user.name,
      email: user.email,
      token: passwordToken,
      origin,
    })
    const tenMinutes = 1000 * 60 * 10
    const passwordTokenExpirationDate = new Date(Date.now() + tenMinutes)
    user.passwordToken = passwordToken
    user.passwordTokenExpirationDate = passwordTokenExpirationDate
    await user.save()
  }
  // we will send this success response if the user exists or not for security reasons as explained above
  res
    .status(StatusCodes.OK)
    .json({ msg: 'Success! Please check your email for password reset link' })
}

If a hacker tries to trigger a reset password email for an invalid email, he still gets a success message, so he can't know whether that user exists in the DB or not, as shown below.

7.3.1 Send forgot password email


sendResetPasswordEmail

const sendEmail = require('./sendEmail')
/* origin is going to be the URL for the front-end (to specify where the user should navigate once he clicks the link).*/
const sendResetPasswordEmail = async ({ name, email, token, origin }) => {
  // origin is https://localhost:3000 for dev
  const passwordResetLink = `${origin}/user/reset-password?token=${token}&email=${email}`
  const message = `<p>Please reset your password by clicking on the following link: 
     <a href='${passwordResetLink}'>Reset Password</a> </p>`
  return sendEmail({
    to: email,
    subject: 'Password Reset',
    html: `<h1>Hello ${name}</h1>   
        ${message}`,
  })
}
module.exports = sendResetPasswordEmail
sendResetPasswordEmail uses sendEmail
const nodemailer = require('nodemailer')
const nodeMailerConfig = require('./nodemailerConfig')
// Sending Email through Ethereal transporter
const sendEmail = async ({ to, subject, html }) => {
  // Not using testAccount in this case. If it was used then user should have been testAccount.user instead of 'joel39@ethereal.email' and pass should have been testAccount.pass instead of 'fRYdcDUhv1kUEp5A8F'
  let testAccount = await nodemailer.createTestAccount()
  // create reusable transporter object using the default SMTP transport
  // You can get this createTransport code from ethereal account
  const transporter = nodemailer.createTransport(nodeMailerConfig)
  // send mail with defined transport object. This return
  return transporter.sendMail({
    from: '"Sandeep Amarnath" <sandeep@gmail.com>', // sender address
    to, // list of receivers
    subject, // Subject line
    html, // html body
  })
}
module.exports = sendEmail
Let's see the flow now






7.3.1.1 Front-end code 
Let's now look at how, on clicking the Reset Password link in the email, it redirects to front-end.
We already have written the redirected link in sendResetPasswordEmail.js file like this

So on-clicking this, it will be redirected to this URL
http://localhost:3000/user/reset-password?token=c7ed6dc701639b6cf1945077aa2ab77a8dd899a99768d20c14261e43763799691f5bbaae108559992fa277f9f66c311fa70a8cfc6f5f355bbd116bb2da3f553d1a61a9fd100d&email=john@gmail.com
The <ResetPassword/> component is the page that shows up on clicking the reset password link in the email

This way we call the resetPassword controller from the front-end, sending the query params back so we can verify that they match what we sent in the forgotPassword controller. Along with the query params, we also pass the new password. Let's now implement the resetPassword controller on the server so that we can enter a new password from the front-end and save it in the DB for that user's email.
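Since the <ResetPassword/> screenshot isn't included here, a minimal sketch of what that page does (mirroring the VerifyPage pattern above; the component name, form markup and the POST /api/v1/auth/reset-password route are assumptions based on the resetPassword controller below) could look like this:
import React, { useState } from 'react';
import { useLocation } from 'react-router-dom';
import axios from 'axios';
// same helper as in VerifyPage - read the query params from the URL
function useQuery() {
  return new URLSearchParams(useLocation().search);
}
const ResetPasswordPage = () => {
  const [password, setPassword] = useState('');
  const query = useQuery();
  const handleSubmit = async (e) => {
    e.preventDefault();
    // send token + email from the query params along with the new password
    await axios.post('/api/v1/auth/reset-password', {
      token: query.get('token'),
      email: query.get('email'),
      password,
    });
  };
  return (
    <form onSubmit={handleSubmit}>
      <input
        type='password'
        value={password}
        onChange={(e) => setPassword(e.target.value)}
      />
      <button type='submit'>New Password</button>
    </form>
  );
};
export default ResetPasswordPage;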
7.3.2 Reset Password
User clicks on the reset password link in the email and is redirected to the front-end. He then enters a new password, and the request hits controllers/authController.js - resetPassword
const resetPassword = async (req, res) => {
  const { token, email, password } = req.body
  if (!email || !token || !password) {
    throw new CustomError.BadRequestError('Please provide all values')
  }
  const user = await User.findOne({ email })
  if (user) {
    const currentDate = new Date(Date.now())
    if (
      user.passwordToken === token &&
      user.passwordTokenExpirationDate > currentDate
    ) {
      user.password = password
      user.passwordToken = null
      user.passwordTokenExpirationDate = null
      await user.save()
    }
  }
  res.status(StatusCodes.OK).json({ msg: 'Password reset Successful' })
}



Step 8 (Hash token)

When user clicks on "Reset Password" button in front-end, the forgotPassword controller is hit

In this controller, in line 179 above,
the passwordToken is generated
and in line 182, it is sent to the user - SECURE, as no one can see this email (and the token inside it) except the legit user himself
and in line 191, it is saved in the DB - NOT SECURE (in this section we will see why and how we need to hash it)
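On the front-end side, the button click mentioned at the start of this step boils down to a single request. A rough sketch (endpoint and payload shape are assumptions based on the controller code below):
// Forgot Password page (sketch - hypothetical helper)
import axios from 'axios'

const requestPasswordReset = async (email) => {
  // hits the forgotPassword controller; the server replies with a generic success message
  // whether or not the email exists, so we don't leak which emails are registered
  const { data } = await axios.post('/api/v1/auth/forgot-password', { email })
  return data.msg
}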
8.1 Why to hash passwordToken before sending to DB
Let's say John (john@gmail.com) wants to do a password reset, so he clicks on "Forgot Password" in the front-end. The forgotPassword controller creates a passwordToken, saves the unhashed passwordToken in the DB and also sends the same token to John's email.
After the email is sent, let's say John hasn't opened his email yet, and meanwhile a hacker hacks the DATABASE and gets access to the unhashed passwordToken from the DB. The hacker can now construct the URL with the passwordToken he stole from the DB, enter that URL in the browser, and reset John's password and keep using the account. The hacker has effectively become John.
If the passwordToken is hashed before it is stored in the DB, the hacker cannot construct the URL from what is in the DB, so he can't reset John's password. That's why we need to hash the passwordToken before sending it to the DB.
Once the legit user clicks on the email link sent by the server to reset his password, the resetPassword controller receives the unhashed passwordToken, hashes it, and compares it with the passwordToken in the DB (which was already hashed by the forgotPassword controller). Explained below in 8.2.
Now you may ask: we didn't hash the verifyEmailToken, so what about that?
If you think about it, hashing it is not required, because even if the hacker gets the unhashed verifyEmailToken from the DB and constructs the URL with it, all that happens is the user gets verified. There's no benefit to the hacker in verifying the user's email.
8.2 How to hash passwordToken before sending to DB
We will again use the crypto library, this time for hashing the passwordToken.
Let's create a file called createHash.js in the utils folder. Here we will have a function that takes in a normal string (the unhashed token string) and returns an md5 hash of it. Hashing is one-way and can't be reversed. When we later need to compare the string sent by the user (the unhashed passwordToken from the front-end) in the resetPassword controller, we hash the user-sent string and then compare it to the hashed one in the DB.

const crypto = require('crypto')
// hashing is one way. Once we hash something then we can't reverse that.
// We can only compare it with a hashed value. There is no way to access original
const hashString = (unhashedToken) =>
  crypto.createHash('md5').update(unhashedToken).digest('hex')
module.exports = hashString
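A quick sanity check of the idea (the token values below are made up): hashing the same string always produces the same digest, so re-hashing the token the user sends back and comparing it to the stored hash is enough - we never need to reverse anything.
// quick demo of comparison-by-rehashing (run next to createHash.js; values are made up)
const hashString = require('./createHash')

const token = 'a1b2c3' // pretend this came from crypto.randomBytes(70).toString('hex')
const storedInDb = hashString(token) // what forgotPassword saves on the user document

// later, in resetPassword, the user sends the plain token back:
console.log(hashString(token) === storedInDb)   // true  - same input, same md5 digest
console.log(hashString('wrong') === storedInDb) // false - a guessed token doesn't match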
Using createHash in the forgotPassword controller (createHash is the hashString function exported above, imported from utils/createHash):
const forgotPassword = async (req, res) => {
  const { email } = req.body
  if (!email) {
    throw new CustomError.BadRequestError('Please provide a valid email')
  }
  const user = await User.findOne({ email })
  if (user) {
    const passwordToken = crypto.randomBytes(70).toString('hex')
    // send email for password reset
    const origin = 'http://localhost:3000'
    // await so that any email-sending error surfaces here instead of becoming an unhandled rejection
    await sendResetPasswordEmail({
      name: user.name,
      email: user.email,
      token: passwordToken,
      origin,
    })
    const tenMinutes = 1000 * 60 * 10
    const passwordTokenExpirationDate = new Date(Date.now() + tenMinutes)
    // hashing the passwordToken before sending it to DB
    // But sending the unhashed passwordToken to the user. DB is what we need to secure
    user.passwordToken = createHash(passwordToken)
    user.passwordTokenExpirationDate = passwordTokenExpirationDate
    await user.save()
  }
  // we will send this success response if the user exists or not for security reasons as explained above
  res
    .status(StatusCodes.OK)
    .json({ msg: 'Success! Please check your email for password reset link' })
}
Comparing hash in resetPassword controller

const resetPassword = async (req, res) => {
  const { token, email, password } = req.body
  if (!email || !token || !password) {
    throw new CustomError.BadRequestError('Please provide all values')
  }
  console.log('The email, token and pass is', email, token, password)
  const user = await User.findOne({ email })
  if (user) {
    console.log('Before user update', user)
    const currentDate = new Date(Date.now())
    console.log('The current date is', currentDate)
    console.log('The expiration date is', user.passwordTokenExpirationDate)
    console.log(user.passwordTokenExpirationDate > currentDate)
    if (
      // since the token sent by the user is a plain string and user.passwordToken (stored in DB) is the hashed one, we can't compare them directly
      // first we hash the user-sent token and then compare. We can't un-hash user.passwordToken because hashing can't be reversed, as explained above
      user.passwordToken === createHash(token) &&
      user.passwordTokenExpirationDate > currentDate
    ) {
      user.password = password
      user.passwordToken = null
      user.passwordTokenExpirationDate = null
      console.log('The new user is', user)
      await user.save()
    }
  }
  res.status(StatusCodes.OK).json({ msg: 'Password reset Successful' })
}
Congrats on completing this project ππ