Showing posts with label express. Show all posts

Monday, August 27, 2018

Passport JWT authentication extract token


I am using express & jwt-simple to handle login/register & authenticated requests as a middleware API. I'm trying to create a .well-known endpoint so other APIs can authenticate requests based on the token sent in.

Here's my strategy:

module.exports = function() {
    const opts = {};
    opts.jwtFromRequest = ExtractJwt.fromAuthHeader();
    opts.secretOrKey = securityConfig.jwtSecret;
    passport.use(new JwtStrategy(opts, function(jwt_payload, done) {
        // User.where('id', jwt_payload.id).fetch({withRelated: 'roles'})
        console.log('jwt_payload', jwt_payload);
        User.where('id', jwt_payload.id).fetch()
            .then(user => user ? done(null, user) : done(null, false))
            .catch(err => done(err, false));
    }));
};

Here's my login route:

router.post('/login', function(req, res) {
    const { username, password } = req.body;
    Promise.coroutine(function* () {
        const user = yield User.where('username', username).fetch();
        if (user) {
            const isValidPassword = yield user.validPassword(password);
            if (isValidPassword) {
                let expires = (Date.now() / 1000) + 60 * 30;
                let nbf = Date.now() / 1000;
                const validatedUser = user.omit('password');

                // TODO: Verify that the encoding is legit..
                // const token = jwt.encode(user.omit('password'), securityConfig.jwtSecret);
                const token = jwt.encode({ nbf: nbf, exp: expires, id: validatedUser.id, orgId: validatedUser.orgId }, securityConfig.jwtSecret);
                res.json({ success: true, token: `JWT ${token}`, expires_in: expires });
            } else {
                res.status(401);
                res.json({ success: false, msg: 'Authentication failed' });
            }
        } else {
            res.status(401);
            res.json({ success: false, msg: 'Authentication failed' });
        }
    })().catch(err => console.log(err));
});

Here's my .well-known route:

router.get('/.well-known', jwtAuth, function(req, res) {
    // TODO: look over res.req.user. Doesn't seem to be the way to get those parameters.
    // We don't take those parameters from the decoded JWT; we seem to grab them from the user in the DB.
    const { id, orgId } = res.req.user.attributes;
    console.log("DEBUG: userId", id);
    console.log("DEBUG: USER", res.req.user);
    res.json({
        success: true,
        userId: id,
        orgId
    });
});

Here's my jwtAuth() function:

const passport = require('passport');

module.exports = passport.authenticate('jwt', { session: false });

How would I actually get the token in the route function and decode it? All this does right now is authenticate the request, which works, but I need to be able to decode the token to send back the stored values. I'm not sure where res.req.user.attributes comes from; is this the token?

1 Answer

Answer 1

Take a look at passport-jwt, and in your passport config (or wherever you initialize passport) set up the JWT strategy:

const JwtStrategy = require('passport-jwt').Strategy;
const ExtractJwt = require('passport-jwt').ExtractJwt;

const jwtAuth = (payload, done) => {
    const user = // ...find user in DB, fetch roles, additional data or whatever

    // do whatever with the decoded payload and call done

    // if everything is OK, call
    done(null, user);
    // whatever you pass back as the "user" object will be available in route handlers as req.user

    // if your user does not authenticate or anything, call
    done(null, false);
};

const apiJwtOptions = {};
apiJwtOptions.jwtFromRequest = ExtractJwt.fromAuthHeaderAsBearerToken();
apiJwtOptions.algorithms = [your.jwt.alg];
apiJwtOptions.secretOrKey = your.jwt.secret;
// apiJwtOptions.issuer = ???;
// apiJwtOptions.audience = ???;

passport.use('jwt-api', new JwtStrategy(apiJwtOptions, jwtAuth));

If you want just the decoded token, call done(null, payload) in jwtAuth.
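A minimal sketch of that variant (the payload fields here are illustrative): the verify callback skips the DB lookup entirely, so whatever was encoded into the token comes straight through.

```javascript
// Verify callback that skips the DB lookup entirely: whatever is passed
// to done() becomes req.user in the route handlers.
const jwtAuth = (payload, done) => {
  done(null, payload);
};

// Simulate how passport-jwt would invoke the callback after decoding a token.
jwtAuth({ id: 42, orgId: 7 }, (err, user) => {
  console.log(err, user); // null { id: 42, orgId: 7 }
});
```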

Then in your route files, when you want to protect endpoints and have info about the user, use:

const router = express.Router();

router.use(passport.authenticate('jwt-api', { session: false }));

And in the handler you should have req.user available. Which property of req the auth data is stored on is configurable; req.user is just the default.
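For example, a protected handler can read the authenticated user straight off req. This sketch uses hand-built stubs instead of a live express app, and the property names (id, orgId) are illustrative:

```javascript
// A handler that relies on req.user having been populated by the
// 'jwt-api' strategy before it runs.
function whoAmI(req, res) {
  const { id, orgId } = req.user;
  res.json({ success: true, userId: id, orgId });
}

// Stub request/response to show the data flow.
const fakeReq = { user: { id: 42, orgId: 7 } };
const fakeRes = { json: (body) => console.log(body) };
whoAmI(fakeReq, fakeRes); // { success: true, userId: 42, orgId: 7 }
```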


Friday, August 24, 2018

Pass state around between Express middleware in an isomorphic React app


I have an isomorphic react app and I would like to somehow pass state between express middleware.

I have the following express route that handles form submission:

export const createPaymentHandler = async (req: Request, res: Response, next: NextFunction) => {
  const { paymentType } = req.body;

  if (!paymentType) {
    res.locals.syncErrors = { field: 'some error.' };
    next();
    return;
  }

  try {
    const { redirectUrl } = await makeRequest<CreatePaymentRequest, CreatePaymentResponse>({
      body: { paymentType },
      method: HttpMethod.POST
    });

    res.redirect(redirectUrl);
  } catch (err) {
    error(err);

    res.locals.serverError = true;

    next();
  }
};

The next middleware is handling the rendering.

At the moment I am using res.locals, is there a better way or a recognised pattern?

4 Answers

Answer 1

Because your handler is async, you need to pass the err into next, like so:

next(err); 

In order for your middleware to process the error, instead of it being picked up by the default error handler, you need to have four parameters:

app.use((err, req, res, next) => {
  // handle the error
});

It's also worth noting that error handlers need to be specified after other middleware. For your case, it might make sense to use a normal "success" middleware alongside an error handler, rather than combining the two into one middleware.

Finally, keep in mind that passing err as a parameter is specific to error handlers. If you just want to pass some data into your next middleware, you would do that by modifying the req:

req.x = 'some data';
next();

Then, the next middleware's req parameter will have the data you set.
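A self-contained sketch of that pattern, with the middleware chained by hand (express does the chaining for you via the order of app.use() calls; the property name x is illustrative):

```javascript
// Two plain middleware functions: the first attaches data to req,
// the second reads it.
function attachData(req, res, next) {
  req.x = 'some data';
  next();
}

function readData(req, res, next) {
  console.log(req.x); // some data
  next();
}

// Manual chaining to show the data flow.
const fakeReq = {};
attachData(fakeReq, {}, () => readData(fakeReq, {}, () => {}));
```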


Further reading: https://expressjs.com/en/guide/using-middleware.html#middleware.error-handling

Answer 2

IMO your question is more about passing some data to the next middleware. Since the rendering logic is handled by the next middleware, the express route shouldn't be concerned with how the data is used. Your approach looks fine.

res.locals is the recommended way of passing data to the next middleware. From the docs:

This property is useful for exposing request-level information such as the request path name, authenticated user, user settings, and so on.

Also, since the variables added are scoped to the current request, the data will only be available for the current request's lifecycle. Perhaps you could set a convention of adding a state key on res.locals to store all your state variables, but the current approach would also work fine.
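A sketch of that convention, with stubs standing in for a live express app (the key name "state" is an illustrative choice, not an express requirement):

```javascript
// Group all request-scoped values under a single res.locals.state key.
function paymentHandler(req, res, next) {
  res.locals.state = res.locals.state || {};
  res.locals.state.syncErrors = { field: 'some error.' };
  next();
}

function renderHandler(req, res) {
  // The rendering middleware reads everything from one well-known place.
  console.log(res.locals.state);
}

const stubRes = { locals: {} };
paymentHandler({}, stubRes, () => renderHandler({}, stubRes));
```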

Answer 3

If it's passing lightweight information to the next middleware for rendering purposes, then using res.locals is fine. However, you might want to look into custom error handling for general errors, such as internal errors.

Consider the following error handling

function notFoundHandler(req, res, next) {
    res.status(404).render('notFoundPage', {
        error: '404 - not found'
    });
}

function badRequestHandler(err, req, res, next) {
    // only handle errors flagged as bad requests; pass the rest on
    if (res.statusCode !== 400) return next(err);
    res.status(400).render('badRequestPage', {
        error: 'Bad request'
    });
}

function errorHandler(err, req, res, next) {
    res.status(err.status || 500).render('errorPage', {
        error: 'Internal server error'
    });
}

app.use(notFoundHandler);
app.use(badRequestHandler);
app.use(errorHandler);

Now, instead of passing error details to the next middleware, you would simply let them flow to the error handlers, e.g.

export const createPaymentHandler = async (req: Request, res: Response, next: NextFunction) => {
    const { paymentType } = req.body;

    if (!paymentType) {
        res.status(400);
        return next(new Error('Bad request')); // This will hit the Bad Request handler
    }

    try {
        const { redirectUrl } = await makeRequest<CreatePaymentRequest, CreatePaymentResponse>({
            body: { paymentType },
            method: HttpMethod.POST
        });

        res.redirect(redirectUrl);
    } catch (err) {
        res.status(500);
        return next(err); // This will hit the Error Handler
    }
};

Answer 4

res.locals is a standard way to pass data to the next middleware in the scope of the current request. Since your use case is around the current request, it makes sense to do so.

At the same time, the standard way to handle errors is to pass the error to the next middleware.

next(err); 

Then you can handle the error scenario from the error handler. However, for an isomorphic react app, this would make things harder. So if you decide to go down that path, I would suggest using a custom error like PaymentError by extending Error. This would make even more sense since you are already using Typescript.

However, when you actually think about this scenario, an error that is not a request error is, from the point of view of the react app, a special state/property of rendering. Thus I suggest the following hybrid approach.

  1. If the error is of high priority, that is, if the error should stop rendering the expected content and fallback to a special page, use the next(err) approach.
  2. If the error should just be part of the state report, then use the res.locals approach.
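The high-priority branch could look roughly like this. PaymentError, its status code, the view names, and the handler wiring are all illustrative, not from the original code:

```javascript
// A custom error type makes the error handler's branching explicit.
class PaymentError extends Error {
  constructor(message) {
    super(message);
    this.name = 'PaymentError';
    this.status = 502;
  }
}

// Error handler: payment failures get their own page, everything else
// falls back to a generic error page.
function errorHandler(err, req, res, next) {
  if (err instanceof PaymentError) {
    return res.status(err.status).render('paymentFailedPage', { error: err.message });
  }
  res.status(err.status || 500).render('errorPage', { error: 'Internal server error' });
}

// Stubbed res to show the dispatch without a live express app.
const stub = {
  status(code) { this.code = code; return this; },
  render(view) { console.log(this.code, view); }
};
errorHandler(new PaymentError('gateway timeout'), {}, stub, () => {});
// 502 paymentFailedPage
```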

Wednesday, July 18, 2018

Extract zip into folder node via express


I'm trying to find an example where I can send a zip (e.g. via Postman), get the zip in my handler, and unzip it to a specified folder. I didn't find many examples of unzipping using express. I want to unzip it to the path web/app.

I tried something like the following, which doesn't work for me: the zip file is not unzipped into the specified folder. Any idea what I'm doing wrong?

https://nodejs.org/api/zlib.html#zlib_zlib

var zlib = require('zlib');
var fs = require('fs');
var path = require('path');

const dir = path.join(__dirname, 'web/app/');

if (req.file.mimetype === 'application/zip') {
    var unzip = zlib.createUnzip();
    var read = fs.createReadStream(req.file);
    var write = fs.createWriteStream(dir);
    // Transform stream which is unzipping the zipped file
    read.pipe(unzip).pipe(write);
    console.log("unZipped Successfully");
}

Any working example would be very helpful, or a reference to where the problem might be...

While debugging, I see that this is where the code fails:

var read = fs.createReadStream(req.file);

any idea why?

I've also tried with

var read = fs.createReadStream(req.file.body);

The issue is that I don't see the error, reason, etc.

when I change it to

var read = fs.createReadStream(req.file.buffer);

the program doesn't exit, and I was able to run it up to the logger console.log("unZipped Successfully"); but nothing happens...

If there is any example with https://www.npmjs.com/package/yauzl yauzl and multer for my context, it would be great.

Update: this is the Postman request.


4 Answers

Answer 1

Prerequisites:

  1. npm i express unzipper multiparty bluebird
  2. Create a web/app directory in your project root (or automate its creation if you want).
  3. Place all of these files into one directory.
  4. A Node version that supports async/await (7.6+, as far as I know)

server.js:

const express = require('express');
const Promise = require('bluebird');
const fs = require('fs');
const writeFile = Promise.promisify(fs.writeFile);

const { parseRequest, getFile } = require('./multipart');
const { extractFiles } = require('./zip');

const EXTRACT_DIR = 'web/app';

const app = express();

const uploadFile = async (req, res, next) => {
  try {
    const body = await parseRequest(req);
    const bodyFile = getFile(body, 'file');
    if (!/\.zip$/.test(bodyFile.originalFilename)) {
      res.status(200).json({ notice: 'not a zip archive, skipping' });
      return;
    }
    const archiveFiles = await extractFiles(bodyFile);

    await Promise.each(archiveFiles, async (file) => {
      await writeFile(EXTRACT_DIR + '/' + file.path, file.buffer);
    });
    res.status(200).end();
  } catch (e) {
    res.status(500).end();
  }
};

app.post('/files', uploadFile);

app.listen(3000, () => {
  console.log('App is listening on port 3000');
});

multipart.js

const Promise = require('bluebird');
const { Form } = require('multiparty');

function parseRequest (req, options) {
    return new Promise((resolve, reject) => {
        const form = new Form(options);
        form.parse(req, (err, fields, files) => {
            if (err) {
                return reject(err);
            }
            return resolve({ fields, files });
        });
    });
}

function getFile (body, field) {
    const bodyFile = body.files[field];
    const value = bodyFile ? bodyFile[0] : null;
    return value || null;
}

module.exports = {
    parseRequest,
    getFile,
};

zip.js

const unzip = require('unzipper');
const fs = require('fs');

async function extractFiles (file) {
    const files = [];
    await fs.createReadStream(file.path).pipe(unzip.Parse()).on('entry', async entry => {
        // Cleanup system hidden files (or drop this code if not needed)
        if (
            entry.type !== 'File'
            || /^__MACOSX/.test(entry.path)
            || /.DS_Store/.test(entry.path)
        ) {
            entry.autodrain();
            return;
        }
        const pathArr = entry.path.split('/');
        const path = entry.path;
        const buffer = await entry.buffer();
        files.push({ buffer, path, originalFilename: pathArr[pathArr.length - 1] });
    }).promise();
    return files;
}

module.exports = {
    extractFiles,
};

Usage:

  1. Start a server with node server
  2. Send your file in the file field of the request (key file in Postman). Example in curl: curl -XPOST -F 'file=@../ttrra-dsp-agency-api/banner.zip' 'localhost:3000/files'

Downsides:

  1. Unzipped files are stored in memory buffers, so this method doesn't work great and is not recommended for big archives.

Answer 2

First of all, zlib does not support extracting zip files.

I recommend formidable for handling files because

  1. it's battle-tested
  2. most widely used
  3. avoids writing boilerplate code like reading the file stream from the request, storing it, and handling errors
  4. easily configurable

Bare minimal solution for your problem with formidable and extract-zip

const express = require('express');
const fs = require('fs');
const extract = require('extract-zip');
const formidable = require('formidable');
const path = require('path');

const uploadDir = path.join(__dirname, '/uploads/');
const extractDir = path.join(__dirname, '/app/');

if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir);
}
if (!fs.existsSync(extractDir)) {
  fs.mkdirSync(extractDir);
}

const server = express();

const uploadMedia = (req, res, next) => {
  const form = new formidable.IncomingForm();
  // file size limit 100MB. change according to your needs
  form.maxFileSize = 100 * 1024 * 1024;
  form.keepExtensions = true;
  form.multiples = true;
  form.uploadDir = uploadDir;

  // collect all form files and fields and pass to its callback
  form.parse(req, (err, fields, files) => {
    // when form parsing fails throw error
    if (err) return res.status(500).json({ error: err });

    if (Object.keys(files).length === 0) return res.status(400).json({ message: "no files uploaded" });

    // Iterate all uploaded files and get their path, extension, final extraction path
    const filesInfo = Object.keys(files).map((key) => {
      const file = files[key];
      const filePath = file.path;
      const fileExt = path.extname(file.name);
      const fileName = path.basename(file.name, fileExt);
      const destDir = path.join(extractDir, fileName);

      return { filePath, fileExt, destDir };
    });

    // Check whether uploaded files are zip files
    const validFiles = filesInfo.every(({ fileExt }) => fileExt === '.zip');

    // if uploaded files are not zip files, return error
    if (!validFiles) return res.status(400).json({ message: "unsupported file type" });

    res.status(200).json({ uploaded: true });

    // iterate through each file path and extract them
    filesInfo.forEach(({ filePath, destDir }) => {
      // create directory with timestamp to prevent overwriting same directory names
      extract(filePath, { dir: `${destDir}_${new Date().getTime()}` }, (err) => {
        if (err) console.error('extraction failed.');
      });
    });
  });

  // runs when new file detected in upload stream
  form.on('fileBegin', function (name, file) {
    // get the file base name: `index.css.zip` => `index.css`
    const fileName = path.basename(file.name, path.extname(file.name));
    const fileExt = path.extname(file.name);
    // create files with timestamp to prevent overwriting same file names
    file.path = path.join(uploadDir, `${fileName}_${new Date().getTime()}${fileExt}`);
  });
};

server.post('/upload', uploadMedia);

server.listen(3000, (err) => {
  if (err) throw err;
});

This solution works for single/multiple file uploads. The one problem with this solution is that files of the wrong type still get uploaded to the upload directory even though the server throws an error.

To test with Postman: (image)

Answer 3

Without a full example it's tough to say what the real problem is. But the Express docs say:

In Express 4, req.files is no longer available on the req object by default. To access uploaded files on the req.files object, use multipart-handling middleware like busboy, multer, formidable, multiparty, connect-multiparty, or pez.

So if you are not using a middleware library to handle uploading files, it's tough to tell what the value of req.file is.

I am also a bit worried that you are trying to use zlib to decompress a zip file, since the library only supports gzip.

The zlib module provides compression functionality implemented using Gzip and Deflate/Inflate

You would want to check for req.file.mimetype === 'application/gzip'


Answer 4

This is my code for uploading a file to an express server.

// require express library
var express = require('express');
// require the express router
var router = express.Router();
// require multer for the file uploads
var multer = require('multer');

// File upload
var storage = multer.diskStorage({
  // file destination
  destination: function (req, file, cb) {
    cb(null, './uploads/logo');
  },
  // rename file
  filename: function (req, file, cb) {
    cb(null, file.originalname);
  }
});

var upload = multer({ storage: storage });

router.post("/", upload.array("uploads[]", 1), function (req, res) {
  res.json('Uploaded logo successfully');
});

module.exports = router;

Monday, April 9, 2018

How to configure pm2 with webpack for typescripts compile and reload?


Is there any boilerplate code for using pm2 with webpack's watch option for TS files with auto hot reload?

pm2 start index.js is helpful for running directly, but how do I add multiple tasks before doing so, like watching files and auto-reloading from the dist folder using webpack and pm2?

2 Answers

Answer 1

After a lot of research I am finally sticking with this, considering performance. I might add live reload, which is a TODO task, but not a priority as of now.

"scripts": {
    "build": "webpack --config webpack.config.js --watch",
    "pm2": "pm2 start ./dist/server.js --watch=true",
    "postinstall": "npm run build",
    "test": "jest --forceExit",
    "test-ci": "npm test && cat ./coverage/lcov.info | coveralls",
    "start": "supervisor ./dist/server.js",
    "server:dev": "concurrently \"npm run build\" \"npm run start\""
}

Answer 2

Create a process.json for the pm2 config. In the script key you can give a webpack compiler to run. I am not sure if it will run with watch reload.
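A rough sketch of such a process.json (app names and paths are illustrative, and as the answer says, whether webpack's watch mode behaves well under pm2 needs testing):

```
{
  "apps": [
    {
      "name": "webpack-watch",
      "script": "./node_modules/.bin/webpack",
      "args": "--config webpack.config.js --watch",
      "autorestart": false
    },
    {
      "name": "server",
      "script": "./dist/server.js",
      "watch": ["dist"]
    }
  ]
}
```

Both processes can then be started with pm2 start process.json.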


Wednesday, April 4, 2018

Express: unable to access route from browser due to accept:application/javascript header missing


I'm new to express. I have a Vue application running on express. I have some API routes that I'm able to access using axios through the browser. To access those routes using postman I have to have the header:

accept: application/javascript 

for it to return the result of the actual API. If I don't use this header, I get the generated index.html from webpack. I need to reuse one of these routes to return excel/pdf, based on a parameter and have it accessible via a link on the page.

Here's my server.js - based on https://github.com/southerncross/vue-express-dev-boilerplate

import express from 'express'
import path from 'path'
import favicon from 'serve-favicon'
import logger from 'morgan'
import cookieParser from 'cookie-parser'
import bodyParser from 'body-parser'
import webpack from 'webpack'

const argon2 = require('argon2');
const passport = require('passport')
const LocalStrategy = require('passport-local')
const session = require('express-session')

import history from 'connect-history-api-fallback'

// Formal (prod) environment: the following two modules do not need to be introduced
import webpackDevMiddleware from 'webpack-dev-middleware'
import webpackHotMiddleware from 'webpack-hot-middleware'

import config from '../../build/webpack.dev.conf'

const app = express()
app.set('trust proxy', true)

app.set("view engine", "pug")
app.set('views', path.join(__dirname, 'views'))

app.use('/', require('./routes'))

app.use(session({
    secret: process.env.SESSION_SECRET || 'secretsauce',
    resave: false,
    saveUninitialized: true
}))

app.use(history())
app.use(favicon(path.join(__dirname, 'public', 'favicon.ico')))
app.use(logger('dev'))
app.use(bodyParser.json())
app.use(bodyParser.urlencoded({
    extended: false
}))
app.use(cookieParser())
app.use(express.static(path.join(__dirname, 'public')))

const compiler = webpack(config)

app.use(webpackDevMiddleware(compiler, {
    publicPath: config.output.publicPath,
    stats: {
        colors: true
    }
}))

app.use(webpackHotMiddleware(compiler))

////////// PASSPORT ///////////////////////
app.use(passport.initialize());
app.use(passport.session());

async function authenticateUser(username, password) {
    //...
}

passport.use(
    new LocalStrategy(async (username, password, done) => {
        const user = await authenticateUser(username, password)
        if (!user) {
            return done(null, false, {
                message: 'Username and password combination is wrong',
            });
        }

        delete user.password;
        return done(null, user)
    })
);

// Serialize user in session
passport.serializeUser((user, done) => {
    done(null, user);
});

passport.deserializeUser(function(user, done) {
    if (user === undefined || !user || Object.keys(user).length === 0)
        return done(null, false)
    else
        done(null, user);
});

//////////// passport end ///////////////

app.set("view engine", "pug")
app.use(express.static(path.join(__dirname, 'views')))
app.get('/', function (req, res) {
    res.sendFile('./views/index.html')
})
app.get('/success', function (req, res) {
    res.render('./views/success')
})

app.use('/api', require('./api'))

// catch 404 and forward to error handler
app.use(function (req, res, next) {
    var err = new Error('Not Found')
    err.status = 404
    next(err)
})

app.use(function (err, req, res, next) {
    res.status(err.status || 500)
    res.send(err.message)
})

let server = app.listen(80)

export default app

And here's a bit of api.js

const { Router } = require('express')
const router = Router()

router.get('/whome', function (req, res) {
    logger.info('whome', req.user)
    return res.json(req.user)
})

router.get('/hello', auth.isAuthenticated, async (req, res) => {
    res.json({ text: 'hello' })
})

module.exports = router

I can call http://localhost/api/hello from postman with the accept:application/javascript header and I get:

{
    "text": "hello"
}

as expected. But if I call the same URL from the browser (which doesn't send that header), I get the generated bundle's index.html. How can I access these routes from the browser?

1 Answer

Answer 1

You have two options.

First one, try to add this in your server:

app.options('*', cors())

before: app.set("view engine", "pug")

If that doesn't work, try installing this add-on in your Google Chrome browser to test.

Allow-Control-Allow-Origin: *

And enable it. (The icon should be green instead of red).

Why this happens? The request that's being made is called a preflight request. Preflight requests are made by the browser, as CORS is a browser security restriction only - This is why it works in Postman, which is, of course, not a browser.

Reference: Preflight request


Friday, March 16, 2018

Error on Let's encrypt auto renewal (Nginx)


I am trying to set up greenlock-express to run behind nginx proxy.

Here is my nginx config

...
# redirect
server {
    listen 80;
    listen [::]:80;
    server_name mydomain.com;

    location / {
        return 301 https://$server_name$request_uri;
    }
}

# serve
server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;
    server_name mydomain.com;

    # SSL settings
    ssl on;
    ssl_certificate C:/path/to/mydomain.com/fullchain.pem;
    ssl_certificate_key C:/path/to/mydomain.com/privkey.pem;

    # enable session resumption to improve https performance
    ssl_session_cache shared:SSL:50m;
    ssl_session_timeout 1d;
    ssl_session_tickets off;

    # enables server-side protection from BEAST attacks
    ssl_prefer_server_ciphers on;
    # disable SSLv3 (enabled by default since nginx 0.8.19) since it's less secure than TLS
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

    # ciphers chosen for forward secrecy and compatibility
    ssl_ciphers 'ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS';

    # enable OCSP stapling (mechanism by which a site can convey certificate revocation information to visitors in a privacy-preserving, scalable manner)
    resolver 8.8.8.8 8.8.4.4;
    ssl_stapling on;
    ssl_stapling_verify on;
    ssl_trusted_certificate C:/path/to/mydomain.com/chain.pem;

    # config to enable HSTS (HTTP Strict Transport Security) https://developer.mozilla.org/en-US/docs/Security/HTTP_Strict_Transport_Security
    add_header Strict-Transport-Security "max-age=31536000; includeSubdomains; preload";

    # added to make handshake take less resources
    keepalive_timeout 70;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass https://127.0.0.1:3001/;
        proxy_redirect off;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
...

I have node server running on port 3000 (http) and port 3001 (https). Everything else seems to be working, but certificates do not update and expire after 3 months.

If I close nginx and run the node server on port 80 (http) and port 443 (https), then it updates the certs.

I made sure that .well-known/acme-challenge is forwarded to the node server, i.e. when I go to the URL http(s)://mydomain.com/.well-known/acme-challenge/randomstr I get the following response:

{
    "error": {
        "message": "Error: These aren't the tokens you're looking for. Move along."
    }
}

2 Answers

Answer 1

The easy way is to separate the webroot for ACME authentication.

Create a webroot directory for ACME authentication.

C:\www\letsencrypt\.well-known 

In the nginx configuration, set the webroot for ACME authentication to the previously created directory.

http://example.com/.well-known/acme-challenge/token -> C:/www/letsencrypt/.well-known/acme-challenge/token

server {
    listen 80;
    listen [::]:80;
    server_name mydomain.com;

    location ^~ /.well-known/acme-challenge/ {
        default_type "text/plain";
        root C:/www/letsencrypt;
    }

    location / {
        return 301 https://$server_name$request_uri;
    }
}

Restart nginx.

You can change your webroot in certbot to get authentication again.

certbot certonly --webroot -w C:\www\letsencrypt\ -d example.com --dry-run

First, test it by adding the --dry-run option. Otherwise, you may run into the rate limit on authentication attempts.

Answer 2

The error you are seeing is that when a token is placed in your

webroot/.well-known/acme-challenge/token

Then Let's Encrypt tries to verify that from the internet. Going to http://yourdomain/.well-known/acme-challenge/token, it gets a 404 error (page not found). Exactly why it gets a 404 I can't be certain. If you place a file there yourself, is it reachable from the internet?

If you are wondering there are a couple of automatic ways to renew your SSL's without restarting your nginx. The one most nginx users seem to prefer is the webroot plugin: first, obtain a new cert using something like:

certbot certonly --webroot -w /path/to/your/webroot -d example.com --post-hook="service nginx reload" 

Then set up a cron job to run certbot renew once or twice a day; it will only run the post-hook when it actually renews the certificate. You can also use --pre-hook flag if you prefer to stop nginx to run certbot in standalone mode.
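For example, a crontab entry along these lines (the schedule is an arbitrary choice):

```
# Attempt renewal twice a day; the post-hook only fires when a
# certificate was actually renewed.
17 3,15 * * * certbot renew --post-hook "service nginx reload" --quiet
```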

There’s also a full nginx plugin, which you can activate with --nginx. It’s still being tested, so experiment at your own risk and report any bugs.

Note: the --post-hook flag will take care of reloading nginx upon renewal of your certs.


Saturday, February 24, 2018

Passport.SocketIo - How to get a list of online users with NodeJS, Express and Passport


I have finished setting up sessionStore with MongoStore; every login is performed correctly and sessions are written to the database without errors. I am using the package github.com/jfromaniello/passport.socketio to align passport with socket.io, but I have searched in several places for how, after login, to process the sessionStore so that it lists which users (by name) are online and offline. Could you shed some light on this?

app.js

var express = require('express');
var mongoose = require('mongoose');
var path = require('path');
var bodyParser = require('body-parser');
var cookieParser = require('cookie-parser');
var session = require('express-session');
const MongoStore = require('connect-mongo')(session);
var flash = require('connect-flash');
var logger = require('morgan');
var passport = require('passport');

var passportSetup = require('./passport-setup');

// import routes
var routes = require('./routes');

// setup express app
var app = express();
app.use(logger());

// setup connection with mongodb
mongoose.connect(
    process.env.MONGODB_URI || "mongodb://smachs:***@d***.mlab.com:****/****-messenger",
    (err, db) => {
        if (err) return new Error(err);
        console.log('🔐  Conexão estabelecida com banco de dados!');
    });

// setup passport from different class
passportSetup();

// set view engine and connection of application
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'ejs');
app.use(bodyParser.urlencoded({extended: false}));
app.use(cookieParser());

// session storage based in mongodb
var sessionStore = new MongoStore({
    url: 'mongodb://smachs:***@d***.mlab.com:****/****-messenger',
    ttl: 1 * 24 * 60 * 60, // = 1 days. Default
    autoReconnect: true
})

// setup session based in express-session
app.use(session({
    secret: "58585858585858",
    key: "connect.sid",
    resave: false,
    saveUninitialized: false,
    store: sessionStore
}));

app.use(flash());

// public directory
app.use(express.static(__dirname + '/public'));

// passport staff
app.use(passport.initialize());
app.use(passport.session());

// start routes
app.use(routes);

// start server
var port = process.env.PORT || 3000;
var server = app.listen(port, () => {
    console.log('🌐  Servidor iniciado em localhost:', port);
});

// setup socket.io and passport.socketio packages
var io = require('socket.io').listen(server);
var passportSocketIo = require("passport.socketio");

// setup session found in express-session
io.use(passportSocketIo.authorize({
    cookieParser: cookieParser,  // the same middleware you registrer in express
    key: 'connect.sid',          // the name of the cookie where express/connect stores its session_id
    secret: '58585858585858',    // the session_secret to parse the cookie
    store: sessionStore,         // we NEED to use a sessionstore. no memorystore please
    success: onAuthorizeSuccess, // *optional* callback on success - read more below
    fail: onAuthorizeFail,       // *optional* callback on fail/error - read more below
}));

// setup route just for clients authenticate
function ensureAutheticated(req, res, next) {
    if (req.isAuthenticated()) next();
    else {
        req.flash("info", "Você precisa estar logado para visualizar essa página!");
        res.redirect('/login');
    }
}

// setup current online clients
var User = require('./models/user');
app.use((req, res, next) => {
    res.locals.currentUser = req.user;
    res.locals.errors = req.flash('error');
    res.locals.infos = req.flash('info');
    next();
});

// callback from passport.socketio
function onAuthorizeSuccess(data, accept) {
    console.log('🗲 Passport-Socket.IO conectado com sucesso');

    io.on('connection', function (socket) {
        console.log("🗲 Socket.IO-Native conectado com sucesso");
    });

    // get current user online after authentication
    io.on('connection', function (socket) {

        // get user details of documents in database
        app.get('/user-online', ensureAutheticated, (req, res) => {
            User.find()
                .sort({ createdAd: 'descending' })
                .exec((err, users) => {
                    if (err) return next(err);
                    // render response
                    res.send({
                        users: users
                    })
                });
        });
    });

    accept();
}

function onAuthorizeFail(data, message, error, accept) {
    console.log('failed connection to socket.io:', data, message);
    if (error)
        accept(new Error(message));
}

user.js

var mongoose = require('mongoose');
var bcrypt = require('bcrypt-nodejs');
const SALT_FACTOR = 10;

var userSchema = mongoose.Schema({
    username: { type: String, required: true, unique: true },
    password: { type: String, required: true },
    createdAt: { type: Date, default: Date.now },
    displayName: String,
    bio: String
});

userSchema.methods.name = function() { return this.displayName || this.username; }

function noop() { };

userSchema.pre('save', function(done) {
    var user = this;
    console.log('USER: ' + JSON.stringify(user));

    if (!(user.isModified('password'))) return done();
    bcrypt.genSalt(SALT_FACTOR, function(err, salt) {
        if (err) return done(err);
        bcrypt.hash(user.password, salt, noop,
            function (err, hashedPassword) {
                if (err) return done(err);
                user.password = hashedPassword;
                done();
            });
    });
});

userSchema.methods.checkPassword = function(guess, done) {
    bcrypt.compare(guess, this.password, function(err, isMatch) {
        done(err, isMatch);
    });
};

var User = mongoose.model('User', userSchema);

module.exports = User;

After login I was trying to run a query on a collection to list the users who are logged in, but it is limited to only 1 user and gives me no option to handle this result better. Thank you very much for any help!

1 Answers

Answers 1

You can track connection, disconnect, login and logout events to create a list of online users. You can manage online users in RAM, or you can use Redis for that. The following code snippet may help you achieve your goal:

// Store online userIds here (use Redis instead if you run multiple processes)
let onlineUsers = [];

io.on('connection', function (socket) {

    socket.on('login', (userId) => {
        // remember which user this socket belongs to, then mark them online
        socket.userId = userId;
        if (!onlineUsers.includes(userId)) onlineUsers.push(userId);
        // Other stuff
    });

    socket.on('logout', () => {
        onlineUsers = onlineUsers.filter((id) => id !== socket.userId);
        // Other stuff
    });

    // NOTE: the native 'disconnect' event does not pass a user id,
    // which is why the id was stored on the socket at login time.
    socket.on('disconnect', () => {
        onlineUsers = onlineUsers.filter((id) => id !== socket.userId);
        // Other stuff
    });
});

For better use, you can maintain an array of objects storing each userId with its list of socketIds, plus an object mapping socketId back to userId. This way you can track whether one user is online on different browsers/systems.
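That bookkeeping can be sketched in plain JavaScript. The `Presence` helper below is illustrative (the name is mine, not from the answer): it maps each userId to its set of socketIds and each socketId back to its user, so a user only goes offline once their last socket is gone.

```javascript
// Minimal in-memory presence tracker: one user may hold many sockets
// (multiple tabs/devices); the user is offline only when all are gone.
class Presence {
  constructor() {
    this.userSockets = new Map(); // userId -> Set<socketId>
    this.socketUser = new Map();  // socketId -> userId
  }
  login(userId, socketId) {
    if (!this.userSockets.has(userId)) this.userSockets.set(userId, new Set());
    this.userSockets.get(userId).add(socketId);
    this.socketUser.set(socketId, userId);
  }
  // Works for both 'logout' and 'disconnect': only the socketId is needed.
  disconnect(socketId) {
    const userId = this.socketUser.get(socketId);
    if (userId === undefined) return;
    this.socketUser.delete(socketId);
    const sockets = this.userSockets.get(userId);
    sockets.delete(socketId);
    if (sockets.size === 0) this.userSockets.delete(userId);
  }
  isOnline(userId) {
    return this.userSockets.has(userId);
  }
  onlineUsers() {
    return [...this.userSockets.keys()];
  }
}
// e.g. call presence.login(user.id, socket.id) on 'login' and
// presence.disconnect(socket.id) on 'logout'/'disconnect'
```

The reverse map matters because the native disconnect handler only knows `socket.id`, not which user owned the socket.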

Read More

Monday, February 12, 2018

Heroku - web.1: crashed in node.js app

1 comment

So I am trying to deploy my node.js app in heroku for the first time.

After deploying my code and getting to the 6th step in the heroku deployment guide: https://devcenter.heroku.com/articles/getting-started-with-nodejs#scale-the-app

It told me it had deployed fine, but when I opened the app I got "Application error"; the browser console showed a 503.

Next I added a Procfile to my project, re-deployed, and then ran heroku ps, which returned the following:

=== web (Free): yarn start:production (1) web.1: crashed 2018/01/30 13:17:05 +0000 (~ 5m ago) 

I'm not too sure where to go from here as the heroku guide carries on without explaining what to do if I hit this error.

Here are the initial few files for my node.js app:

package.json

{
  "name": "",
  "version": "2.5.1",
  "description": "",
  "main": "index.js",
  "engines": { "node": ">=6.0", "npm": ">=3.0" },
  "repository": { "type": "git", "url": "" },
  "author": "",
  "license": "",
  "bugs": { "url": "" },
  "homepage": "",
  "scripts": {
    "start": "better-npm-run start",
    "start:production": "yarn build && yarn start:prod",
    "start:prod": "better-npm-run start:prod",
    "build": "yarn clean:build && better-npm-run build",
    "lint": "yarn lint:js && yarn lint:style",
    "lint:js": "better-npm-run lint:js",
    "lint:style": "better-npm-run lint:style",
    "flow": "better-npm-run flow",
    "test": "better-npm-run test",
    "test:watch": "yarn test --watch",
    "clean:all": "yarn clean:build && yarn clean:test",
    "clean:build": "better-npm-run clean:build",
    "clean:test": "better-npm-run clean:test",
    "coveralls": "better-npm-run coveralls && yarn clean:test"
  },
  "betterScripts": {
    "start": {
      "command": "nodemon ./index.js",
      "env": { "NODE_PATH": "./src", "NODE_ENV": "development", "PORT": 3000 }
    },
    "start:prod": {
      "command": "node ./index.js",
      "env": { "NODE_PATH": "./src", "NODE_ENV": "production", "PORT": 8080 }
    },
    "build": {
      "command": "webpack --progress --hide-modules --config ./tools/webpack/config.babel.js",
      "env": { "NODE_ENV": "production" }
    },
    "lint:js": { "command": "eslint ./src ./tools ./index.js" },
    "lint:style": { "command": "stylelint \"./src/**/*.scss\" --syntax scss" },
    "flow": { "command": "flow; test $? -eq 0 -o $? -eq 2" },
    "test": {
      "command": "jest --coverage",
      "env": { "NODE_ENV": "test" }
    },
    "clean:build": { "command": "rimraf ./public/assets" },
    "clean:test": { "command": "rimraf ./coverage" },
    "coveralls": { "command": "cat ./coverage/lcov.info | coveralls" }
  },
  "babel": {
    "presets": ["env", "react", "stage-0"],
    "env": {
      "production": { "plugins": ["transform-remove-console"] }
    }
  },
  "eslintConfig": {
    "parser": "babel-eslint",
    "extends": "airbnb",
    "plugins": ["react", "jsx-a11y", "import"],
    "env": { "browser": true, "node": true, "jest": true, "es6": true },
    "rules": {
      "linebreak-style": 0,
      "global-require": 0,
      "no-underscore-dangle": 0,
      "no-console": 0,
      "react/jsx-filename-extension": [1, { "extensions": [".js", ".jsx"] }],
      "import/no-extraneous-dependencies": ["error", { "devDependencies": true }],
      "function-paren-newline": 0
    },
    "globals": {
      "__CLIENT__": true,
      "__SERVER__": true,
      "__DISABLE_SSR__": true,
      "__DEV__": true,
      "webpackIsomorphicTools": true
    }
  },
  "stylelint": {
    "extends": "stylelint-config-standard",
    "rules": {
      "string-quotes": "single",
      "selector-pseudo-class-no-unknown": [true, { "ignorePseudoClasses": ["global", "local"] }]
    }
  },
  "browserslist": ["last 2 versions", "not ie <= 8"],
  "jest": {
    "setupFiles": ["raf/polyfill", "<rootDir>/tools/jest/setup.js"],
    "collectCoverageFrom": ["src/containers/**/*.js", "src/components/**/*.js", "!src/**/__tests__"],
    "moduleNameMapper": {
      ".*\\.(css|scss|sass)$": "<rootDir>/tools/jest/styleMock.js",
      ".*\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$": "<rootDir>/tools/jest/assetMock.js"
    }
  }
}

index.js

/* @flow */

// Use babel-register to precompile ES6 syntax
require('babel-core/register');

const WebpackIsomorphicTools = require('webpack-isomorphic-tools');

// Setup global variables for server
global.__CLIENT__ = false;
global.__SERVER__ = true;
global.__DISABLE_SSR__ = false; // Disable server side render here
global.__DEV__ = process.env.NODE_ENV !== 'production';

// This should be the same with webpack context
const dirRoot = require('path').join(process.cwd());

// Settings of webpack-isomorphic-tools
global.webpackIsomorphicTools =
  new WebpackIsomorphicTools(require('./tools/webpack/WIT.config')).server(dirRoot, () => require('./src/server'));

WIT.config.js

const WebpackIsomorphicToolsPlugin = require('webpack-isomorphic-tools/plugin');

module.exports = {
  // debug: true,
  // webpack_assets_file_path: 'webpack-assets.json',
  // webpack_stats_file_path: 'webpack-stats.json',
  assets: {
    images: {
      extensions: ['png', 'jpg', 'jpeg', 'gif'],
      parser: WebpackIsomorphicToolsPlugin.url_loader_parser,
    },
    fonts: {
      extensions: ['eot', 'ttf', 'woff', 'woff2'],
      parser: WebpackIsomorphicToolsPlugin.url_loader_parser,
    },
    svg: {
      extension: 'svg',
      parser: WebpackIsomorphicToolsPlugin.url_loader_parser,
    },
    style_modules: {
      extensions: ['css', 'scss'],
      filter: (module, regex, options, log) => {
        if (options.development) {
          return WebpackIsomorphicToolsPlugin.style_loader_filter(module, regex, options, log);
        }
        return regex.test(module.name);
      },
      path: (module, options, log) => {
        if (options.development) {
          return WebpackIsomorphicToolsPlugin.style_loader_path_extractor(module, options, log);
        }
        return module.name;
      },
      parser: (module, options, log) => {
        if (options.development) {
          return WebpackIsomorphicToolsPlugin.css_modules_loader_parser(module, options, log);
        }
        return module.source;
      },
    },
  },
};

server.js

/* @flow */

import path from 'path';
import morgan from 'morgan';
import express from 'express';
import compression from 'compression';
import helmet from 'helmet';
import hpp from 'hpp';
import favicon from 'serve-favicon';
import React from 'react';
import { renderToString, renderToStaticMarkup } from 'react-dom/server';
import { StaticRouter, matchPath } from 'react-router-dom';
import { Provider } from 'react-redux';
import chalk from 'chalk';

import createHistory from 'history/createMemoryHistory';
import configureStore from './redux/store';
import Html from './utils/Html';
import App from './containers/App';
import routes from './routes';
import { port, host } from './config';

const app = express();

app.set('port', process.env.PORT || 3000);

// Using helmet to secure Express with various HTTP headers
app.use(helmet());
// Prevent HTTP parameter pollution.
app.use(hpp());
// Compress all requests
app.use(compression());

// Use morgan for http request debug (only show error)
app.use(morgan('dev', { skip: (req, res) => res.statusCode < 400 }));
app.use(favicon(path.join(process.cwd(), './public/favicon.ico')));
app.use(express.static(path.join(process.cwd(), './public')));

// Run express as webpack dev server
if (__DEV__) {
  const webpack = require('webpack');
  const webpackConfig = require('../tools/webpack/config.babel');

  const compiler = webpack(webpackConfig);

  app.use(require('webpack-dev-middleware')(compiler, {
    publicPath: webpackConfig.output.publicPath,
    hot: true,
    noInfo: true,
    stats: { colors: true },
    serverSideRender: true,
  }));

  app.use(require('webpack-hot-middleware')(compiler));
}

//GET week
app.get('/api/week', (req, res) => {
    console.log('week');
    var articles = [];
    db.collection('articles')
        .find()
        .limit(2)
        .sort("date", -1)
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET articles
app.get('/api/articles', (req, res) => {
    console.log('articles');
    var articles = [];
    db.collection('articles')
        .find()
        .limit(12)
        .sort("date", -1)
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET authorArticles
app.get('/api/authorArticles', (req, res) => {
    console.log('authorArticles');
    var articles = [];
    var ObjectId = require('mongodb').ObjectID;
    var author = {};
    var param = req.query.authorQuery;
    param = param.replace(/-/g, ' ');
    db.collection('articles')
        // .find()
        .find({"author" : {$regex : ".*" + param + ".*"}})
        .limit(12)
        .sort("date", -1)
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET extra
app.get('/api/extra', (req, res) => {
    console.log('extra');
    var articles = [];
    db.collection('articles')
        .aggregate([{ $sample: { size: 4 } }])
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET authors
app.get('/api/authors', (req, res) => {
    console.log('authors');
    var authors = [];
    db.collection('authors')
        .find()
        .limit(24)
        .toArray()
        .then(result => {
            // console.log(result);
            authors = authors.concat(result);
        }).then(() => {
            res.send(authors);
        }).catch(e => {
            console.error(e);
        });
});

//GET search
app.get('/api/search', (req, res) => {
    console.log('/api/search');
    var articles = [];
    db.collection('articles')
        .find({$or:[
                {title: {$regex : ".*" + req.query.searchQuery + ".*"}},
                {description: {$regex : ".*" + req.query.searchQuery + ".*"}},
                {author: {$regex : ".*" + req.query.searchQuery + ".*"}},
                {keywords: {$regex : ".*" + req.query.searchQuery + ".*"}}
              ]})
        .limit(24)
        .sort("date", -1)
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET category
app.get('/api/category', (req, res) => {
    console.log('category');
    var articles = [];
    db.collection('articles')
        .find({$or:[
                {category: {$regex : ".*" + req.query.categoryQuery + ".*"}}
              ]})
        .limit(12)
        .sort("date", -1)
        .toArray()
        .then(result => {
            articles = articles.concat(result);
        }).then(() => {
            // console.log(articles);
            res.send(articles);
        }).catch(e => {
            console.error(e);
        });
});

//GET article
app.get('/api/article', (req, res) => {
    console.log('article');
    var ObjectId = require('mongodb').ObjectID;
    var article = {};
    var param = req.query.title;
    param = param.replace(/-/g, ' ');
    db.collection('articles')
        .findOne({"title": param})
        .then(result => {
            article = result;
        }).then(() => {
            res.send(article);
        }).catch(e => {
            console.error(e);
        });
});

//GET author
app.get('/api/author', (req, res) => {
    console.log('author');
    var ObjectId = require('mongodb').ObjectID;
    var article = {};
    var param = req.query.title;
    param = param.replace(/-/g, ' ');
    db.collection('authors')
        .findOne({"name": param})
        .then(result => {
            article = result;
        }).then(() => {
            res.send(article);
        }).catch(e => {
            console.error(e);
        });
});

// Register server-side rendering middleware
app.get('*', (req, res) => {
  if (__DEV__) webpackIsomorphicTools.refresh();

  const history = createHistory();
  const store = configureStore(history);
  const renderHtml = (store, htmlContent) => { // eslint-disable-line no-shadow
    const html = renderToStaticMarkup(<Html store={store} htmlContent={htmlContent} />);
    return `<!doctype html>${html}`;
  };

  // If __DISABLE_SSR__ = true, disable server side rendering
  if (__DISABLE_SSR__) {
    res.send(renderHtml(store));
    return;
  }

  // Load data on server-side
  const loadBranchData = () => {
    const promises = [];
    routes.some((route) => {
      const match = matchPath(req.path, route);
      // $FlowFixMe: the params of pre-load actions are dynamic
      if (match && route.loadData) promises.push(route.loadData(store.dispatch, match.params));
      return match;
    });
    return Promise.all(promises);
  };

  // Send response after all the action(s) are dispathed
  loadBranchData()
    .then(() => {
      // Setup React-Router server-side rendering
      const routerContext = {};
      const htmlContent = renderToString(
        <Provider store={store}>
          <StaticRouter location={req.url} context={routerContext}>
            <App />
          </StaticRouter>
        </Provider>,
      );

      // Check if the render result contains a redirect, if so we need to set
      // the specific status and redirect header and end the response
      if (routerContext.url) {
        res.status(301).setHeader('Location', routerContext.url);
        res.end();
        return;
      }

      // Checking is page is 404
      const status = routerContext.status === '404' ? 404 : 200;

      // Pass the route and initial state into html template
      res.status(status).send(renderHtml(store, htmlContent));
    })
    .catch((err) => {
      res.status(404).send('Not Found :(');
      console.error(`==> 😭  Rendering routes error: ${err}`);
    });
});

// connect to mongo db
var db
const MongoClient = require('mongodb').MongoClient
MongoClient.connect('mongodb://dannyjones360:test@ds123930.mlab.com:23930/halftimefront', (err, database) => {
    if (err) return console.log(err);
    db = database
    console.log('db connected');
})

if (port) {
  app.listen(port, host, (err) => {
    const url = `http://${host}:${port}`;
    if (err) console.error(`==> 😭  OMG!!! ${err}`);
    console.info(chalk.green(`==> 🌎  Listening at ${url}`));
    // Open Chrome
    require('../tools/openBrowser')(url);
  });
} else {
  console.error(chalk.red('==> 😭  OMG!!! No PORT environment variable has been specified'));
}

config

module.exports = {
  host: 3000 || 'localhost', // Define your host from 'package.json'
  port: 3000,
  app: {
    htmlAttributes: { lang: 'en' },
    title: 'Rendah',
    titleTemplate: 'Rendah - %s',
    meta: [
      { name: 'description', content: 'Beats culture magazine' },
      { name: 'apple-mobile-web-app-title', content: 'Vernacare' },
      { name: 'apple-mobile-web-app-capable', content: 'yes' },
      { name: 'apple-mobile-web-app-status-bar-style', content: 'black' },
      { name: 'theme-color', content: '#ffffff' },
      { name: 'mobile-web-app-capable', content: 'yes' },
      { name: 'theme-color', content: '#fff' },
    ],
  },
};

Any help or advice would be appreciated - thank you in advance.

After making amendments per Yoni Rabinovitch's answer, I deployed the site and now get:

2018-02-05T11:37:12.562907+00:00 heroku[web.1]: Process exited with status 1 2018-02-05T11:37:12.580445+00:00 heroku[web.1]: State changed from starting to crashed 2018-02-05T11:37:15.775044+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=sleepy-scrubland-78530.herokuapp.com request_id=d7e4d005-d2f9-4c5f-89c8-14f2addc9ebf fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 2018-02-05T11:37:16.157734+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=sleepy-scrubland-78530.herokuapp.com request_id=0f2a7123-f9bf-4ee0-bbc6-59894e21cc1b fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 2018-02-05T11:37:19.664289+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=sleepy-scrubland-78530.herokuapp.com request_id=b3748306-5b3a-4dd6-be2c-58a171bafbd1 fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 2018-02-05T11:37:19.898230+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=sleepy-scrubland-78530.herokuapp.com request_id=99b155b8-17e9-47da-a454-d3d4c6f45ef4 fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 2018-02-05T11:37:21.262972+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=sleepy-scrubland-78530.herokuapp.com request_id=ca84a061-f834-476f-a269-7a9a8cd4adda fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 2018-02-05T11:37:21.537896+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=sleepy-scrubland-78530.herokuapp.com request_id=9e6e9096-f6b4-4d73-b0c1-faf921ea8666 fwd="185.108.171.221" dyno= connect= service= status=503 bytes= protocol=https 

4 Answers

Answers 1

You seem to be setting the PORT env var in your package.json. You should not do that for Heroku; Heroku sets it for you automatically. Instead, add something like this in your server.js:

    app.set('port', process.env.PORT || 3000); 

Then you can subsequently call app.listen(app.get('port'), ...)
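A dependency-free sketch of that pattern (the helper name resolvePort is mine, not from the answer):

```javascript
// Resolve the port the Heroku way: read PORT from the environment at
// runtime and fall back to a local-development default. Do NOT hard-code
// PORT in package.json's env blocks.
function resolvePort(env = process.env) {
  return Number(env.PORT) || 3000;
}

// In server.js this pairs with the answer's advice:
//   app.set('port', resolvePort());
//   app.listen(app.get('port'), ...);
```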

Answers 2

I was getting the same error, so I tried this method. I added this code to my package.json :

"scripts": {     "start": "node index.js" }, 

And in my Procfile:

web: npm start 

Answers 3

Try npm run start:production in your Procfile instead of using yarn. I once got the same "App crashed" on Heroku because, when the run scripts were called with yarn, it didn't initialize the process global variable.

In my latest deploys I haven't had this problem, but it may be what is occurring for you.

Answers 4

I am assuming that you are deploying a fresh/sample app. Can you please try the sample linked below for the same process :)

Heroku NodeJs sample app

Read More

Saturday, February 10, 2018

findOneAndUpdate Not Updating Discriminator

Leave a Comment

I am working on a REST API using Node, Express and Mongoose. Everything works perfectly when I update the base model, but when I try to update the discriminator object (sportEvent in this case), it doesn't work.

Event.js - Event data model has a base schema common for all the collections with a discriminator for additional detail for that collection.

// base schema for all the events
// includes basic detail for all the events
const eventSchema = new Schema({
  // title for the event
  title: {
    type: String,
    required: true
  },
  // description for the events
  description: {
    type: String,
    required: true
  },
  // event type for the event. such as Music, Sports, Expo, Leisure
  eventType: {
    type: String,
    required: true,
  }
}, { discriminatorKey: 'eventType' });

const Event = mongoose.model('Event', eventSchema);

// sport event model for extending the basic event model
const sportEvent = Event.discriminator("sports", new Schema({
  sportEvent: {
    // sport name. for eg: cricket, football, etc
    sportName: {
      type: String,
      required: true
    },
    // first team name
    firstTeam: {
      type: String,
      required: true
    },
    // second team name
    secondTeam: {
      type: String,
      required: true
    },
  }
}));

EventController.js - has a PUT method for updating the collection. Here is a code snippet.

// for updating the event, a PUT method was added on the /event route
router.put('/events/:eventId', function(req, res, next) {
  // getting the event id from the url
  eventId = req.params.eventId;

  // checking whether the provided event id is a valid mongodb _id object
  if (objectId.isValid(eventId)) {
    Event.findOneAndUpdate({_id: eventId}, {$set: req.body}, {new: true, runValidators: true}, function(err, event) {
      if (err) {
        next(err);
      }
      sendResponse(res, "Event Successfully Updated", event);
    });
  } else {
    // sending a bad request error to the user if the event id is not valid
    sendError(res, 400, "Invalid Event ID");
  }
});

1 Answers

Answers 1

Ensure the discriminator key is present in the update object (or passed as an argument to the update function), write a switch case based on the discriminator key, and call update on the specific schema type:

callback = function(err, doc) {
    if (err) console.log(err)
    console.log(doc)
};

var id = ObjectId("5a75d22e6dabf3102c059f56");

var update = {
    title: 'title-name',
    eventType: 'sports',
    sportEvent: {
        firstTeam: 'first-name',
        secondTeam: 'second-name',
        sportName: 'sport-name'
    }
};

switch (update.eventType) {
    case 'sports':
        SportEventSchema.findByIdAndUpdate(id, {$set: update}, {new: true, upsert: false}, callback)
        break;
    case 'games':
        GameEventSchema.findByIdAndUpdate(id, {$set: update}, {new: true, upsert: false}, callback)
        break;
    default:
        Event.findByIdAndUpdate(id, {$set: update}, {new: true, upsert: false}, callback);
        break;
}

output : update for a sports event type

Mongoose: events.findAndModify({ eventType: 'sports', _id: ObjectId("5a75d22e6dabf3102c059f56") }, [], { '$set': { title: 'title-name', eventType: 'sports', sportEvent: { firstTeam: 'first-name', secondTeam: 'second-name', sportName: 'sport-name' } } }, { new: true, upsert: false, remove: false, fields: {} }) { sportEvent:    { firstTeam: 'first-name',      secondTeam: 'second-name',      sportName: 'sport-name' },   eventType: 'sports',   _id: 5a75d22e6dabf3102c059f56,   title: 'title-name',   description: 'desc',   __v: 0 } 
Read More

Monday, January 29, 2018

root password using Nodejs mscdex/ssh2

Leave a Comment

I'm trying to log in as root on a remote Linux machine using mscdex/ssh2. The steps I'm trying to achieve are:

  1. connect via ssh to the remot machine
  2. execute command as root user

but I'm failing at the second part: I can't manage to enter the password correctly. Here is my code.

conn.on('ready', function() {
  conn.exec('sudo -s ls /', { pty: true }, (err, stream) => {
    if (err) {
      res.send(err);
    }

    stream.on('exit', (code, signal) => {
      console.log(`Stream :: close :: code: ${code}, signal: ${signal}`);
    });

    stream.on('data', data => {
      // here it's where supposedly the password should be given
      stream.write('r00tpa$$word' + '\n');
      console.log(data);
    });
  });
}).connect({
  host: '192.168.100.100',
  username: 'fakeAdmin',
  password: 'fakePassword'
});

I already have the pty option set to true, but I'm only getting error messages on the prompt.
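One common workaround (not from the question; the helper name is mine) is to reply only when the incoming data actually looks like a sudo password prompt, instead of writing the password on every data event:

```javascript
// Heuristic: sudo's prompt normally contains "password"
// (e.g. "[sudo] password for user:"), so only reply when we see it.
function looksLikeSudoPrompt(chunk) {
  return /password/i.test(chunk.toString());
}

// Sketch of how this slots into the exec callback above (ssh2 API as used
// in the question; 'r00tpa$$word' is the question's placeholder):
//
//   stream.on('data', (data) => {
//     if (looksLikeSudoPrompt(data)) {
//       stream.write('r00tpa$$word\n');
//     } else {
//       process.stdout.write(data);
//     }
//   });
```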

Update:

Here is my new code snippet:

const Client = require('ssh2').Client;
const conn = new Client();

const encode = 'utf8';

conn.on('ready', () => {
  conn.shell(false, { pty: true }, (err, stream) => {
    if (err) { console.log(err) }

    stream.on('data', (data) => {
      process.stdout.write(data.toString(encode));
    });

    stream.write('ls -a\n');
    stream.write('uptime\n');
    stream.write('su\n');           // here nothing seems to happen
    stream.write('rootPassword\n'); // here also
    stream.write('cd /tmp && ls\n');
  });
})
.connect({
  host: "192.168.100.100",
  username: "username",
  password: "usernamePassword"
});

I've managed to perform the several-commands part in a much cleaner way; I even raised an issue on the library's GitHub page: .shell command "su" loses interaction. But now something weird is happening: I can run as many commands as I want, yet when I run a "su" command nothing seems to happen. Has somebody run into this before, or what am I doing wrong? Sorry if I couldn't explain myself clearly.

Regards.

1 Answers

Answers 1

Consider using node-ssh. I've used it with no problem before and it is more modern as well because it has a promise-based asynchronous interface.

You should also connect directly as the root user if you need to perform root actions, unless you have a user that does not ask for a password when performing sudo, for example on EC2 instances using the Ubuntu AMI, which comes by default with an ubuntu user that has no password (you authenticate via an ssh keypair).

Trying to log in by command and inputting the password in plain text is not a good idea. Also consider adding an ssh key to the root user to authenticate with a private key instead of a password.

const SSH = require('node-ssh')

const ssh = new SSH()

ssh.connect({
  host: '<host>',
  username: 'root',
  password: '<password>'
}).then(() => ssh.exec('your command'))
Read More

Monday, January 15, 2018

Create an alias to query elasticsearch table from express

Leave a Comment

I am querying elastic search with the values places.name and places.state via a GET in the URL. The URL looks like this… localhost:9000/search?type=location&contains%5Bplaces.name%5D=Burlington&contains%5Bplaces.state%5D=NJ

My function in express is this...

function getPlaces(index, query, options, searchFn) {
  var opts = options || {};
  var body = {};
  body.query = {};
  body.query.bool = {};
  body.query.bool.must = [{
    multi_match: {
      query: query,
      fields: ["places.name", "places.state"]
    }
  }];

What I want to do is create additional alias fields that still perform a search against their intended fields. So adding static.places.name and static.places.state would show results for places.name and places.state.

I added these to fields, but that didn't work; I kind of expected that.

fields: ["places.name", "static.places.name", "places.state", "static.places.state"]

So localhost:9000/search?type=location&contains%5Bplaces.name%5D=Burlington&contains%5Bplaces.state%5D=NJ and localhost:9000/search?type=location&contains%5Bstatic.places.name%5D=Burlington&contains%5Bstatic.places.state%5D=NJ would show the same results.

Can I create an alias to query elasticsearch table from express? Does this make sense? Am I going about this the wrong way?

Thanks in advance for your help.

1 Answer

Answers 1

Field aliases are not a supported feature in Elasticsearch, though the feature has been suggested (and rejected) a number of times in the GitHub issues.

You'll need to make the transformation on the client-side after retrieving the result.
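One hedged sketch of that client-side transformation, staying within the question's field names (the alias table itself is an assumption): translate any `static.`-prefixed field back to its real counterpart before building the multi_match body.

```javascript
// Sketch: map hypothetical alias fields to the real Elasticsearch fields
// before building the query. The alias table is an assumption.
var FIELD_ALIASES = {
  'static.places.name': 'places.name',
  'static.places.state': 'places.state'
};

function resolveFields(fields) {
  return fields.map(function (field) {
    return FIELD_ALIASES[field] || field;
  });
}

// Both spellings then produce the same multi_match fields:
var fields = resolveFields(['static.places.name', 'static.places.state']);
```

In `getPlaces` you would call `resolveFields` on the incoming field list before assigning it to `multi_match.fields`, so both URL variants hit the same underlying fields.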

Read More

Wednesday, November 29, 2017

How to use JSON.parse in nunjucks

Leave a Comment

I am getting data from MongoDB in Express and the query returns without errors, but I want to use JSON.parse on the final result of my .find. Below is how I'm trying to do it:

  app.get("/login", (req, res) => {     var credentialClient = {       expire_at: false,       __v: false,       _id: false     };      rememberMe.find({ username: "test-login" }, credentialClient, function (err, credentialInfo) {       if (err) {         res.send(err);       } else {         res.render("login.html", {           usernameClient: JSON.parse(credentialInfo)         });       }     });   }); 

Without JSON.parse, the final rendering in my login.html looks like this:

{ username: 'test-login' } 

The final results appear in login.html

<p class="center-align black-text" id="preFinalCredentialClient">{{ usernameClient }}</p> 

Thanks for helping me!

3 Answers

Answers 1

I hope the code below works for you. In the example I have kept static JSON data; in your case you can store the JSON data in any variable and render index.njk passing that variable.

var express = require('express');
var nunjucks = require('nunjucks');

var app = express();

app.get('/', function (req, res) {
  var jsondata = {
    firstName: "Rakesh",
    lastName: "Barani"
  };
  return res.render('index.njk', { data: jsondata });
});

Answers 2

credentialInfo is already a JS object; there is no need to parse it.

app.get("/login", (req, res) => {     var credentialClient = {       expire_at: false,       __v: false,       _id: false     };      rememberMe.find({ username: "test-login" },          credentialClient, function (err, credentialInfo) {       if (err) {         res.send(err);       } else {         res.render("login.html", {           usernameClient: credentialInfo         });       }     });   });  <p class="center-align black-text" id="preFinalCredentialClient">{{ usernameClient.username }}</p> 

On the client, you can then access the properties of usernameClient.

Answers 3

The query already returns a parsed object, so you can use this:

res.render("login.html", {    usernameClient: credentialInfo.username }); 
Read More

Friday, November 3, 2017

Express Session Different Expiration

Leave a Comment

I have the following code to create sessions in Express.

app.use(session({     secret: 'randomstring',     saveUninitialized: false,     resave: false,     cookie: {         secure: true,         maxAge: 60000 * 60 /* 1 hour */     } })); 

I mainly use this to store session data for Passport.js. Currently, after 1 hour the session ends and users get automatically logged out.

Is there an easy way to store more data in the session and have different expiration dates for different information? So say I want the Passport.js (req.session.passport.user) session data to have an expiration of 1 hour. And I want another piece of data say for example (req.session.myDataA) to have an expiration of 30 days. And I want another piece of data (req.session.myDataB) to have no expiration and stay active forever.

Is there an easy way to achieve this behavior in Express Sessions?

2 Answers

Answers 1

You can use cookies and set different expirations on different cookie names. So you can have multiple cookies that would hold data and each one would have a specific expiration.

That being said, I would say that both sessions and cookies wouldn't be the correct solution to your problem. I would keep your sessions lean and store the data in a database. Considering you're using sessions, you could use Redis and store data with expirations. You could assign the key in Redis to your session. For example:

req.session.myDataA = 'user_id:myDataA' req.session.myDataB = 'user_id:myDataB' 

When you set your data in Redis, you can use the key user_id:myDataA and set it to expire.

// Will expire in 1 hour SET user_id:myDataA "your data" EXPIRE user_id:myDataA 3600 

While the key will still be in session, you can check if the value is null or has the data you're expecting.

I know this perhaps sounds a little more complicated, but as a good practice with sessions, you really don't want to be storing a lot of data beyond reference keys, as it becomes very difficult to manage.

If you're using MongoDB, you could also set documents to expire. However, if you're not using either, Redis is generally the easiest to set up and acts as a good session store.

Edit:

As commented below: since you can't expire parts of one session at different time frames, just store an expiration timestamp alongside each piece of data. When you read the data, check whether it has passed its expiration (1 hour, 30 days, etc.).
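A minimal sketch of that timestamp approach (the helper names are hypothetical; a plain object stands in for `req.session` here): each value carries its own expiry time and is treated as missing once that time has passed.

```javascript
// Sketch: per-value expiry inside one session object.
// Helper names are hypothetical; "store" stands in for req.session.
function setWithExpiry(store, key, value, ttlMs) {
  // ttlMs of null means "never expires"
  store[key] = { value: value, expiresAt: ttlMs ? Date.now() + ttlMs : null };
}

function getWithExpiry(store, key) {
  var entry = store[key];
  if (!entry) return null;
  if (entry.expiresAt !== null && Date.now() > entry.expiresAt) {
    delete store[key]; // lazily drop expired values on read
    return null;
  }
  return entry.value;
}

var session = {};
setWithExpiry(session, 'myDataA', 'a', 1000 * 60 * 60 * 24 * 30); // 30 days
setWithExpiry(session, 'myDataB', 'b', null);                     // forever
```

The session cookie itself keeps one maxAge (the longest you need); the per-key timestamps then decide which pieces of data are still valid inside it.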

Answers 2

You could set the maxAge of session in another middleware

code:

// after the session and passport middleware
app.use(function(req, res, next) {
    if (req.session.user) { // some condition
        req.session.cookie.maxAge = 1000 * 60 * 60 * 24 // 24 hours
    } else if (req.session.myData) {
        req.session.cookie.maxAge = 1000 * 60 * 60 // 1 hour
    }
    // more conditions...

    // save the session
    req.session.save()

    // call next at the end of the middleware
    // so control passes on to the next middleware
    next()
})
Read More

Monday, September 11, 2017

How to successfully use express routing in electron project?

Leave a Comment

I am using ExpressJS in my Electron project. The routing with Express doesn't work as expected.

Here is how I created the routing (in the main process):

const express = require('express')  const app2 = express()  app2.get('/requests/:_id', (req, res, next) =>   {   console.log('Dynamic Link WORKS!!');   hosSchemaModel.findOne({ _id: req.params._id }, function(err, request){     res.json(request)     // res.sendFile(path.join(__dirname+'../homePage.html'))   }); }); 

And in the front-end I have the following:

<a href="/requests/{{this._doc._id}}">{{this._doc.status}}</a> 

When I click on {{this._doc.status}}, it takes me to an empty white screen with nothing printed in the console.

Can I have some guidance on how to implement ExpressJS routing in Electron?

2 Answers

Answers 1

Just a shot in the dark, but you won't be able to connect without a port. Try adding `app2.listen(9000)` to the end of your server file, then hit the same URL with that port.

Answers 2

Electron has basically two processes: the main process and the renderer process. When you call console.log in the main process, it prints to the main process's console (the terminal), not to your web page. You have to pass data to the renderer process to show it in the console of your web page.

UPDATE - 2

Make the Express server listen on some port, then from the frontend hit a URL that includes that port.

Main.js

app2.get('/requests/:_id', (req, res, next) => {
  console.log('Dynamic Link WORKS!!');
  hosSchemaModel.findOne({ _id: req.params._id }, function(err, request){
    res.json(request);
    // res.sendFile(path.join(__dirname + '../homePage.html'))
  });
});

app2.listen(5000, function () {
  console.log('Example app listening on port 5000!');
});

frontend

<a href="http://localhost:5000/requests/1234">{{this._doc.status}}</a>

After this, it is working on my end.

If you want to run the Express server in cluster mode, you should fork the process and run the Express server in the new process.

Read More

Wednesday, August 30, 2017

ExpressJS router normalized/canonical urls

Leave a Comment

I'm after normalized/canonical urls for SPA with ExpressJS server.

Although it is an SPA backed by a server-side router, templates can differ a bit between app URLs. One of the differences is the <link rel="canonical" href="https://example.com{{ originalPath }}"> tag. Not the most relevant detail, but it explains the context of the question. I expect that there will be only one URL that responds with 200, and its variations are redirected to it with 301/302 (works for living humans and search engines).

I would like to make the urls case-sensitive and strict (no extra slash), similarly to Router options, but non-canonical urls (that differ in case or extra slash) should do 301/302 redirect to canonical url instead of 404.

In most apps I just want to force the urls for * routes to be lower-cased (with the exception of queries), with no extra slashes. I.e. app.all('*', ...), and the redirects are:

/Foo/Bar/ -> /foo/bar /foo/Bar?Baz -> /foo/bar?Baz 

But there may be exceptions if the routes are defined explicitly. For example, there are camel-cased routes:

possiblyNestedRouter.route('/somePath')... possiblyNestedRouter.route('/anotherPath/:Param')... 

And all non-canonical urls should be redirected to canonical (parameter case is left intact):

/somepath/ -> /somePath /anotherpath/FOO -> /anotherPath/FOO 

The logic behind canonical urls is quite straightforward, so it is strange that I couldn't find anything on this topic regarding ExpressJS.

What is the best way to do this? Are there middlewares already that can help?

3 Answers

Answers 1

I looked for an npm package but could not find one, so I coded a small middleware that Express runs for each request, which seems to work fine. Please add this to your code.

var urls = {
  '/main': '/main',
  '/anotherMain': '/anotherMain'
}

app.use(function(req, res, next){
  var index = req.url.lastIndexOf('/');

  // check for a trailing or doubled slash first
  if(req.url[index + 1] == null || req.url[index + 1] == '/'){
    // slashes are wrong
    res.send("please enter a correct url");
    return;
  }

  for(var item in urls){
    if(req.url != item && req.url.toUpperCase() == item.toUpperCase()){
      // case-insensitive match: redirect to the canonical casing
      return res.redirect(item);
    }else if(req.url == item){
      // correct url
      return next();
    }
  }

  // url doesn't exist; let later routes / 404 handling deal with it
  next();
});

app.get('/main', function(req, res){
  res.render('mainpage');
});

app.get('/anotherMain', function(req, res){
  res.send("here here");
});

USAGE

All you have to do is add your URLs to the urls object as done above, using the same key and value. That's it. Now all of your clients' requests will be redirected to the correct page, case-sensitively.

UPDATE

I have also made one for POST requests; I think it is pretty accurate, so you should give it a try. If you want a redirect when the user mixes up slashes, you need to write some regex for it. I didn't have time (also my brain was fried), so I made a simple one. You can change it however you like; every web structure has its own set of rules.

var urlsPOST = {
  '/upload': '/upload'
}

app.use(function(req, res, next){
  if(req.method != 'POST'){
    return next();
  }

  var index = req.url.lastIndexOf('/');

  if(req.url[index + 1] == null || req.url[index + 1] == '/'){
    // slashes are wrong
    res.sendStatus(400);
    return;
  }

  for(var item in urlsPOST){
    if(req.url != item && req.url.toUpperCase() == item.toUpperCase()){
      // 307 preserves the POST method and body across the redirect
      return res.redirect(307, item);
    }else if(req.url == item){
      // correct url
      return next();
    }
  }

  // url doesn't exist
  res.sendStatus(404);
});

Answers 2

You probably want to write your own middleware for this, something along the lines of this:

app.set('case sensitive routing', true);

/* all existing routes here */

app.use(function(req, res, next) {
  var url = find_correct_url(req.url); // special urls only
  if(url){
    res.redirect(url); // redirect to special url
  }else if(req.url.toLowerCase() !== req.url){
    res.redirect(req.url.toLowerCase()); // let's try the lower case version
  }else{
    next(); // url is not special, and is already lower case
  }
});

Now keep in mind this middleware can be placed after all of your current routes, so that if it does not match an existing route you can try to look up what it should be. If you are using case insensitive route matching you would want to do this before your routes.
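As a hedged sketch of that idea, the lookup itself can be reduced to a pure function over a known route list (the list here is hypothetical), which the middleware then uses to decide between `next()` and a redirect:

```javascript
// Sketch: return the canonical route for a URL, the URL itself if it is
// already canonical, or null if it matches no known route.
// The route list is hypothetical.
var ROUTES = ['/somePath', '/anotherPath'];

function canonicalFor(url) {
  // strip trailing slashes (but keep "/" itself intact)
  var trimmed = url.length > 1 ? url.replace(/\/+$/, '') : url;
  for (var i = 0; i < ROUTES.length; i++) {
    if (trimmed.toLowerCase() === ROUTES[i].toLowerCase()) return ROUTES[i];
  }
  return null;
}
```

In the middleware you would redirect with 301 when `canonicalFor(req.path)` differs from `req.path`, call `next()` when they match, and fall through to 404 handling when it returns null.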

Answers 3

Using the same approach as @user8377060, just with a regex instead:

// an array of all my urls
var urls = [
  '/main',
  '/anotherMain'
]

app.use(function(req, res, next){
  var index = req.url.lastIndexOf('/');

  // check for a trailing or doubled slash first
  if(req.url[index + 1] == null || req.url[index + 1] == '/'){
    // slashes are wrong
    res.send("please enter a correct url");
    return;
  }

  for(var i = 0; i < urls.length; i++){
    var item = urls[i];
    var currentUrl = new RegExp('^' + item + '$', 'i');

    if(req.url != item && currentUrl.test(req.url)){
      // case-insensitive match: redirect to the canonical casing
      return res.redirect(item);
    }else if(req.url == item){
      // correct url
      return next();
    }
  }

  // url doesn't exist
  next();
});
Read More

Monday, August 28, 2017

Google Analytics Via Node.js Proxy

Leave a Comment

I am trying to implement a proxy for Google Analytics on my current site. I started by following this guide.

After following all the steps I get a 400 error returned from google with no further information.

I downloaded google's analytics.js file and put it on my server, and replaced all instances of www.google-analytics.com with "+location.host+"/analytics. See the file here: https://pastebin.com/wimHim7x

I edited my tracking code from google to replace 'https://www.google-analytics.com/analytics.js' with '/analytics.js'.

Here is the proxy info from my app.js file

function getIpFromReq (req) { // get the client's IP address     var bareIP = ":" + ((req.connection.socket && req.connection.socket.remoteAddress)         || req.headers["x-forwarded-for"] || req.connection.remoteAddress || "");     return (bareIP.match(/:([^:]+)$/) || [])[1] || "127.0.0.1"; }  // proxying requests from /analytics to www.google-analytics.com. app.use("/analytics", proxy("www.google-analytics.com", {     proxyReqPathResolver: function (req) {         var path = req.url + (req.url.indexOf("?") === -1 ? "?" : "&")             + "uip=" + encodeURIComponent(getIpFromReq(req));          console.log(path)          return path;     } })); 

This is the page I get returned from google:

<!DOCTYPE html> <html lang=en>   <meta charset=utf-8>   <meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">   <title>Error 400 (Bad Request)!!1</title>   <style>     *{margin:0;padding:0}html,code{font:15px/22px arial,sans-serif}html{background:#fff;color:#222;padding:15px}body{margin:7% auto 0;max-width:390px;min-height:180px;padding:30px 0 15px}* > body{background:url(//www.google.com/images/errors/robot.png) 100% 5px no-repeat;padding-right:205px}p{margin:11px 0 22px;overflow:hidden}ins{color:#777;text-decoration:none}a img{border:0}@media screen and (max-width:772px){body{background:none;margin-top:0;max-width:none;padding-right:0}}#logo{background:url(//www.google.com/images/branding/googlelogo/1x/googlelogo_color_150x54dp.png) no-repeat;margin-left:-5px}@media only screen and (min-resolution:192dpi){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat 0% 0%/100% 100%;-moz-border-image:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) 0}}@media only screen and (-webkit-min-device-pixel-ratio:2){#logo{background:url(//www.google.com/images/branding/googlelogo/2x/googlelogo_color_150x54dp.png) no-repeat;-webkit-background-size:100% 100%}}#logo{display:inline-block;height:54px;width:150px}   </style>   <a href=//www.google.com/><span id=logo aria-label=Google></span></a>   <p><b>400.</b> <ins>That’s an error.</ins>   <p>Your client has issued a malformed or illegal request.  <ins>That’s all we know.</ins> 

0 Answers

Read More

Tuesday, August 22, 2017

Modifying user.username in Session With PassportJS

Leave a Comment

I have a form that allows people to update their profile information, which is populated with data from req.user via PassportJS.

The problem is that whenever I update the value that corresponds to user.username I get the following error message:

TypeError: Cannot read property '_id' of null  

From line 6 in this snippet of code:

passport.deserializeUser(function(id, done) {   mongo.connect("mongodb://localhost:27017/formulas", function(e, db){   if (e) {return next(e);}   var col = db.collection("users");   col.findOne({"username": id}, function(err, user){     done(err, {"id": user._id, "username": id, "activeEmail": user.activeEmail, "name": user.name, "password": user.password, "formulas": user.formulas});     });   }); }); 

I'm assuming it's because in serializeUser I'm using user.username to load it into session like so:

passport.serializeUser(function(user, done) {   done(null, user.username); }); 

Does anyone know how to go about getting around this or is it an intractable issue with Passport?

The code I have that does the update generically looks like this:

router.post('/update-profile', function(req, res) {
  var name = req.body.name;
  var username = req.body.username;
  var db = req.db.collection('users');

  db.updateOne({"username": username}, {
    $set: {
      "name": name,
      "username": username
    }
  }, function(err, r) {
    assert.equal(null, err);
    assert.equal(1, r.matchedCount);
  });

  res.render('profile', {
    user: req.user
  });
});

UPDATE:

Per the comment request: the err argument from findOne in deserializeUser is null when it's called, so it's not the query that's the problem.

2 Answers

Answers 1

Since the username is a changeable value, you should not use it as the session key.

I strongly recommend using user._id instead, as it never changes, so the server still knows the current user even if the username has been changed. Check the official http://passportjs.org/docs; they also use the id as the session key.
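A minimal sketch of serializing by `_id` (the in-memory user store and its contents are assumptions standing in for the MongoDB lookup):

```javascript
// Sketch: store only the immutable _id in the session and look the user
// up by it on the way back. The users object stands in for the database.
var users = {
  'abc123': { _id: 'abc123', username: 'test-login', name: 'Test' }
};

function serializeUser(user, done) {
  done(null, user._id); // only the immutable id goes into the session
}

function deserializeUser(id, done) {
  var user = users[id];
  if (!user) return done(new Error('User not found'));
  done(null, user);
}
```

With this shape, renaming the user leaves the session key untouched, so deserialization keeps resolving after the profile update.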

By the way, even if you keep using username, you should add a null check in passport.deserializeUser():

col.findOne({"username": id}, function(err, user){     if (user) {         done(err, {"id": user._id, "username": id, "activeEmail": user.activeEmail, "name": user.name, "password": user.password, "formulas": user.formulas});     } else {         done(new Error('User not found'));     } }); 

Answers 2

Please update the code in your passport deserialize function. You didn't check whether a user was found, so when no user is found you get that error:

passport.deserializeUser(function(id, done) {
  mongo.connect("mongodb://localhost:27017/formulas", function(e, db){
    if (e) { return done(e); }
    var col = db.collection("users");
    col.findOne({"username": id}, function(err, user){
      if (err) {
        done(err);
      } else if (user) {
        done(null, {"id": user._id, "username": id, "activeEmail": user.activeEmail, "name": user.name, "password": user.password, "formulas": user.formulas});
      } else {
        done(new Error('No user found in db'));
      }
    });
  });
});
Read More

Wednesday, August 16, 2017

Best practice for multiple organisations on couch

Leave a Comment

So I have a node express app using nano with couchdb as the backend, this is running fine. I'm now looking to learn how I would expand it to multiple organisations.

So for instance, a wildcard DNS record allowing https://customername.myapp.com for each customer. I will then check the req.headers.host in the main database, along with checking session cookies etc in each request.

What I'm struggling to get my head around though, is how the backend will work. I think I understand that the correct method is to use a database for each organisation, and copy the design from a template database.

But if this is correct, I don't understand how this translates to my code using nano. I currently use this:

var dbname = 'customer1'; var nano = require('nano')(config.dbhost); var couch = nano.db.use(dbname); 

and then in my functions:

couch.get(somevalue, function(err, body) {     // do stuff }); 

But that won't work when the database itself is a variable. Should I be looking at moving the query to a lower level, eg nano.get('dbname', query... or something else?

EDIT

Hoping someone can give me an example of how to use middleware to change the database name dependent on the host header. I have this so far:

app.use(function(req,res,next) {     var couch = nano.db.use(req.header.host);     next(); }); 

But I don't understand how to pass the couch object through ('couch' is unknown in the rest of my routing). I have tried passing it back in next(couch), but this breaks it...

1 Answer

Answers 1

First of all, I'd recommend to have the application working with a single organization. If you want to have 1 database per organization, it should be fairly easy to add more organizations later.

I would have a master database and a template database. The master database would list the existing organizations in the service with some metadata; this is what NodeJS would query first to know which database to fetch data from.

The template database would be used to sync design documents to existing or new organizations. You can technically have an old organization with an old design, and it will still work as the data will be consistent.

In your case, the line you're looking for is this one:

var couch = nano.db.use(dbname); 

When you know which database to query, you'll have to create a new nano object for each dbname you need.

You can know which database to use directly if the databases are named after the domain name or project name, as long as that information is present in the request headers/session.
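One way to sketch that per-request selection (the "first subdomain is the database name" convention is an assumption): derive the database name from the Host header and attach the handle to `req`, rather than passing it through `next()`, which only accepts an error argument.

```javascript
// Sketch: pick a per-tenant CouchDB database name from the Host header.
// The subdomain-to-database convention is an assumption.
function dbNameFromHost(host) {
  // "customer1.myapp.com:3000" -> "customer1"
  return String(host || '').split(':')[0].split('.')[0];
}

// Middleware: attach the handle to req so later routes can use req.couch.
// app.use(function (req, res, next) {
//   req.couch = nano.db.use(dbNameFromHost(req.headers.host));
//   next();
// });
```

Downstream routes would then call `req.couch.get(...)` instead of a module-level `couch` object, which is why passing the handle through `next()` was breaking.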

Anyhow, it's a really wide question that can be answered in many ways and there is no particularly best way of doing things.

You could technically keep all of your organizations in one database if that works for you. Splitting databases lets you isolate things a bit and make use of ACLs, and you could create databases not only per organization but for more specific things.

For example, I made a painting program that stores each project in its own database and allows people to draw cooperatively on a canvas. Database ACLs allowed me to restrict access to people invited to a project. My NodeJS server was technically used only for WebSockets, and the webapp was able to communicate with CouchDB directly without NodeJS.

Read More