When it finally came to receiving the Markdown users produce on the client side, and persisting it in the database, I realized I knew nothing about Node.JS. As I started writing this down, I realized I was going to need to address some more fundamental issues before diving into Node.
I had read a few articles about Node and MongoDB, but I didn’t really have any practical experience with either yet. I asked myself a few questions, and I feel like I need to answer at least some of them before moving forward.
- How am I going to prevent or mitigate over-posting issues?
- How am I going to perform model validation?
- How am I going to separate concerns?
- How am I going to structure the application? I should probably look into a framework such as Backbone.JS, Knockout.JS, or the like.
- Am I going to use the native MongoDB driver? Am I going to find Mongoose ODM useful at all?
- Lastly, how the heck am I going to render views that require a model? Will I need to pair my templating engine with some other templating language, or will my implementation do just fine on its own?
I sifted through Learning Node; it serves as a pretty decent introductory crash course, and it answered the most basic questions I had.
Refactoring Node.JS
The template I started off with had just a server.js file that contained everything server-side: HTTP server configuration, error handling, routing, route behavior, etc. I want a more robust separation of concerns going forward.
After going through the book, I formed a good idea of how I wanted to structure the server application.
The first thing I’m going to do is modularize the application: I’ll separate concerns into routing modules, controllers, and models, all in separate logical files.
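Roughly, based on the paths that will come up one by one below, the folder structure I’m aiming for looks like this (file names are tentative):

server.js
routing/core.js
controllers/main.js
controllers/entry.js
models/all.js
models/entry.js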
module.exports in Node.JS
Node.JS has a cute way of separating concerns in what’s called modules. A module is self-contained, and exposes an API that provides access only to the members you explicitly define.
Here’s an example dependency.js:
var uid = 0; // local to the module, never exposed directly

module.exports.startsWith = function(str, text){
    return str.indexOf(text) === 0;
};

module.exports.uid = function(){
    return ++uid;
};
In your server.js, you would reference it like this:
var dep = require('./dependency.js'),
    model = {
        sn: dep.uid(), // grab a unique identifier from the module
        text: 'dependency flavored model'
    };
// ...
Not the best of examples, but you get the idea.
Routing and Controller Actions
Let’s examine a more practical example. I want my routing defined somewhere other than directly in server.js, so I’ll replace my routes declaration with the following call, and defer the implementation to another file:
require('./routing/core.js')(server);
With that simple statement I can pass the server object around, and handle any routing directly in my self-contained module.
module.exports = function(server){
    server.get('/*', function(req,res){
        res.render('site.jade');
    });

    server.post('/write-entry', function(req,res){
        console.log(req.body.entry);
        res.end();
    });
};
This is pretty awesome, but all I really want my routes to do is juggle request parameters around; I’ll leave any real processing to the controllers. The main controller will be a really thin one:
module.exports = {
    get: function(req,res){
        res.render('site.jade');
    }
};
For the endpoint previously referred to as POST /write-entry, I’ll favor a more RESTful approach this time. The controller will remain pretty much unchanged:
module.exports = {
    put: function(req,res){
        console.log(req.body.entry);
        res.end();
    }
};
The routing module ends up being:
var main = require('../controllers/main.js');
var entry = require('../controllers/entry.js');

module.exports = function(server){
    server.get('/*', main.get);
    server.put('/entry', entry.put);
};
All this code might seem redundant at first, but it will gain value as our application grows.
As you might have noticed, my attempt at being RESTful is hindered by the rich application structure, where all GET requests serve the same text/html response. This could easily be mitigated in the future by reserving other endpoints for any non-static resource, such as the following (a rough routing sketch comes right after the list):
- GET http://ponyfoo.com/api/1.0/entry fetch
- PUT http://ponyfoo.com/api/1.0/entry upsert
- DELETE http://ponyfoo.com/api/1.0/entry/:id delete
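Wiring those up in the routing module could look roughly like this. Keep in mind this is just a sketch: the list and del controller actions are hypothetical, and I’m assuming Express, where server.del registers DELETE routes:

var entry = require('../controllers/entry.js');

module.exports = function(server){
    server.get('/api/1.0/entry', entry.list); // fetch; hypothetical controller action
    server.put('/api/1.0/entry', entry.put); // upsert
    server.del('/api/1.0/entry/:id', entry.del); // delete; hypothetical controller action
};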
I’ll get to this later, but before release.
Models in MongoDB, introducing Mongoose
So far we’ve covered views, controllers, and routes, but we haven’t actually done anything worthwhile on the server side. Now we’ll plunge into MongoDB. I decided to use Mongoose after exploring my options a little, and realizing how much simpler my development would be with it around.
Setting up the database environment
Install the mongoose package through npm.
$ npm install mongoose
Then we need a little initialization code to get things going:
var mongoose = require('mongoose'),
    config = require('./config.json'); // assuming config.json sits next to this file

mongoose.connect(config.db.uri); // configured in config.json
mongoose.connection.on('open', function() {
    console.log('Connected to Mongoose');
});
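While at it, it doesn’t hurt to listen for connection errors too, so failures don’t pass silently:

mongoose.connection.on('error', function(err){
    console.log('Mongoose connection error: ' + err);
});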
I set up a tentative config.db.uri = mongodb://localhost/ponyfoo.
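For reference, a minimal config.json matching that setting might look like this; the exact shape is an assumption, as long as a db.uri property is in there:

{
    "db": {
        "uri": "mongodb://localhost/ponyfoo"
    }
}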
This code will blatantly fail upon execution, due to the simple fact that we haven’t fired up MongoDB yet, so we’ll go ahead and do that now.
I set up MongoDB on my development hard drive at f:\mongodb, configured the data folder in a \data sub-directory, and created a succinct batch file to fire up MongoDB from my project root:
$ f:\mongodb\bin\mongod.exe --config f:\mongodb\mongod.conf
The batch file just starts the MongoDB database server with the configuration file I created directly in the MongoDB installation folder. The configuration file specifies the data folder:
dbpath = f:\mongodb\data
That’s it. You should now be able to establish a connection through Mongoose; try it out.
Coming from the Microsoft stack, and particularly SQL Server, I must admit I feel incredibly good about MongoDB, and I really appreciate how easy it is to set up.
Our first database schema
I will maintain a modular approach to models as well; thus, anywhere I need them, I’ll just require my models module:
var models = require('./models/all.js');
This module will, in turn, provide a list of MongoDB document models we can use:
var entry = require('./entry.js');

module.exports = {
    entry: entry.model
};
And lastly, each model should expose its schema:
var mongoose = require('mongoose'),
    schema = new mongoose.Schema({
        title: String,
        brief: String,
        text: String,
        date: Date
    });
module.exports.model = mongoose.model('entry', schema);
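With the model registered, creating and saving a document is straightforward. Here’s a quick sketch, with made-up field values:

var models = require('./models/all.js');

var doc = new models.entry({
    title: 'Hello MongoDB',
    brief: 'A first entry',
    text: 'Testing out the entry model',
    date: new Date()
});

doc.save(function(err){
    if(err){
        console.log(err);
    }
});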
Upsert world
This upsert command is the only thing left to connect our UI with the MongoDB database:
var models = require('../models/all.js');

module.exports = {
    put: function(req,res){
        var collection = models.entry,
            document = req.body.entry,
            query = { date: document.date }, // the date doubles as our upsert key
            opts = { upsert: true }, // insert when no document matches the query
            done = function(err){
                if(err){
                    console.log(err);
                }
                res.end();
            };

        // findOneAndUpdate persists the document on its own,
        // no additional save call is needed
        collection.findOneAndUpdate(query, document, opts, done);
    }
};
The last little detail would be updating the UI to actually provide a date for our entries. We’ll deal with that soon enough.
Thus far, this will create an entry the first time around, and overwrite it on every subsequent request. That makes the method ideal for editing our blog post entries; we would just need to wire the UI to provide the identifier we’re using for the upserts.
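To try the endpoint out, a request along these lines should do, assuming the server listens on port 8080 (adjust to your setup) and parses JSON request bodies:

$ curl -X PUT http://localhost:8080/entry \
       -H "Content-Type: application/json" \
       -d '{"entry":{"title":"Hello","brief":"Hi there","text":"Testing upserts","date":"2012-12-25"}}'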
MongoDB document identifiers
At a glance, you’d better have a damn good reason for wanting an auto-incrementing _id field. Every single blog post or Stack Overflow answer that tells you how to implement one also tries to persuade you not to do it.
I only wanted to have an incremental _id for routing purposes.
However, considering all the evidence against this kind of field, I came up with a simple solution, which collaterally produces better urls. I’ll simplify the route to:

/:yyyy[/:mm[/:dd[/:slug]]]

Now my only constraint is not writing two separate entries with the exact same title on the exact same date. This approach makes url hacking so easy even a zombie could try it.
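The slug itself can be derived from the entry title with a little string mangling. A minimal sketch, using a hypothetical helper that isn’t part of the codebase yet:

// turn an entry title into a url-friendly slug
module.exports.slug = function(title){
    return title
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-') // collapse anything non-alphanumeric into dashes
        .replace(/^-+|-+$/g, ''); // trim leading and trailing dashes
};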
So there we have it: a drastically basic Node.JS application that allows us to PUT a MongoDB document, and not much else, but we did set up a solid working base for the code to come.