Node.js + MongoDB, part 2: here comes memcached!

Published: 2013-08-02
Let's elaborate on the Node.js + MongoDB app. I lied: it's not really worth $1 billion... yet. It surely will be once we've added memcached to the mix ;)

Once again, we'll use a couple of EC2 instances running Ubuntu 12.04: one for the Node.js web server and one for the memcached server. MongoDB will still be served from MongoLab.

Start your instances and let's configure the memcached server first. Since memcached listens on port 11211, we need to open it up: add a couple of rules for 11211/TCP and 11211/UDP to the security group attached to the instance.

Then, let's install memcached:
ubuntu@ip-10-234-177-74:~$ sudo apt-get install memcached

We also need to edit /etc/memcached.conf in order to set the '-l' parameter to the correct IP address (running ifconfig eth0 will confirm the right one to use). In my case, it is 10.234.177.74.
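After the edit, the relevant line in /etc/memcached.conf should look like this (with your own instance's private IP, of course):

```
# Listen on the instance's private IP instead of the default 127.0.0.1
-l 10.234.177.74
```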

Then, we need to restart memcached:

ubuntu@ip-10-234-177-74:~$ sudo service memcached restart

Now, let's go to the Node.js instance and check that we can access the memcached server:
ubuntu@ip-10-48-161-115:~$ echo stats|nc 10.234.177.74 11211

If you see a lot of stats like I do, you're good to go. If not, double-check the steps above (rules, config file, restart).

Now, let's install a memcached client for Node.js with npm, the Node.js package manager. There are several clients out there; mc looks pretty good and well-maintained :)

ubuntu@ip-10-48-161-115:~$ npm install mc

That's it. Now, let's write some code, yeah! Here's the idea:
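The idea is a classic cache-aside flow: check memcached first, fall back to MongoDB on a miss, then cache the value with a 60-second expiry. Here's a minimal, hypothetical sketch of the request handler — `cache` and `db` are stand-ins for the real mc and mongodb clients, not their actual APIs:

```javascript
// Hypothetical sketch of the cache-aside flow: check the cache first,
// fall back to the database on a miss, then cache the result for 60 seconds.
// 'cache' and 'db' are stand-ins for the real mc and mongodb clients.
function handleRequest(id, cache, db, callback) {
  console.log('Request received, id=' + id);
  cache.get(id, function (err, value) {
    if (!err && value !== undefined) {
      console.log('Cache hit,  key=' + id + ', value=' + value);
      return callback(null, value);
    }
    console.log('Cache miss, key ' + id + '. Querying...');
    db.findById(id, function (err, item) {
      if (err || !item) return callback(err || new Error('not found'));
      console.log('Item found: ' + JSON.stringify(item));
      // Store with a 60-second expiry: this is what produces the
      // hit/miss pattern we'll see in the memcached stats.
      cache.set(id, item.x, 60, function () {
        console.log('Stored key=' + id + ', value=' + item.x);
        callback(null, item.x);
      });
    });
  });
}
```

In the real app, `cache.get`/`cache.set` map to the mc client's get and set calls, and `db.findById` to a findOne against the MongoDB collection.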


Alright, let's run this and hit it with some requests:
Mac:~ julien$ curl http://ec2-54-216-3-139.eu-west-1.compute.amazonaws.com:8080/?id=0
{"_id":"51e3ce08915082db3df32bf0","x":1}
{"_id":"51e3ce08915082db3df32bf1","x":2}
{"_id":"51e3ce08915082db3df32bf2","x":3}
{"_id":"51e3ce08915082db3df32bf3","x":4}
{"_id":"51e3ce08915082db3df32bf4","x":5}
{"_id":"51e3ce08915082db3df32bf5","x":6}
{"_id":"51e3ce08915082db3df32bf6","x":7}
{"_id":"51e3ce08915082db3df32bf7","x":8}
{"_id":"51e3ce08915082db3df32bf8","x":9}
{"_id":"51e3ce08915082db3df32bf9","x":10}
{"_id":"51e3ce08915082db3df32bfa","x":11}
{"_id":"51e3ce08915082db3df32bfb","x":12}
{"_id":"51e3ce08915082db3df32bfc","x":13}
{"_id":"51e3ce08915082db3df32bfd","x":14}
{"_id":"51e3ce08915082db3df32bfe","x":15}
{"_id":"51e3ce08915082db3df32bff","x":16}
{"_id":"51e3ce08915082db3df32c00","x":17}
{"_id":"51e3ce08915082db3df32c01","x":18}
{"_id":"51e3ce08915082db3df32c02","x":19}
{"_id":"51e3ce08915082db3df32c03","x":20}
{"_id":"51e3ce08915082db3df32c04","x":21}
{"_id":"51e3ce08915082db3df32c05","x":22}
{"_id":"51e3ce08915082db3df32c06","x":23}
{"_id":"51e3ce08915082db3df32c07","x":24}
{"_id":"51e3ce08915082db3df32c08","x":25}

Mac:~ julien$ curl http://ec2-54-216-3-139.eu-west-1.compute.amazonaws.com:8080/?id=51e3ce08915082db3df32bfc
{"_id":"51e3ce08915082db3df32bfc","x":13}

Console output:
Request received, id=51e3ce08915082db3df32bfc
Cache miss, key 51e3ce08915082db3df32bfc. Querying...
Item found: {"_id":"51e3ce08915082db3df32bfc","x":13}
Stored key=51e3ce08915082db3df32bfc, value=13

Memcached stats:
ubuntu@ip-10-234-177-74:~$ echo stats|nc 10.234.177.74 11211|grep '[sg]et'
STAT cmd_get 1
STAT cmd_set 1
STAT get_hits 0
STAT get_misses 1

Let's try the same request again (within 60 seconds!): +1 get, +1 hit!
Request received, id=51e3ce08915082db3df32bfc
Cache hit,  key=51e3ce08915082db3df32bfc, value=13

STAT cmd_get 2
STAT cmd_set 1
STAT get_hits 1
STAT get_misses 1

And 60 seconds later, the cached item should have expired: +1 get, +1 miss, +1 set
Cache miss, key 51e3ce08915082db3df32bfc. Querying...
Item found: {"_id":"51e3ce08915082db3df32bfc","x":13}
Stored key=51e3ce08915082db3df32bfc, value=13

STAT cmd_get 3
STAT cmd_set 2
STAT get_hits 1
STAT get_misses 2

Pretty cool, huh? A basic Node.js + memcached + MongoDB app in less than 100 lines of code, comments and logging included.

However, the really great stuff is what you DON'T see: once an item is cached, repeated reads never touch MongoDB at all, and memcached happily absorbs them. That's A LOT of scalability for free as far as the application developer is concerned.

Food for thought... Something tells me this isn't the last post on these topics. I hope you're enjoying this as much as I am. Time for a drink, I'm exhausted. Cheers!
