Have you ever wondered how Google Cloud Run magically scales your services down to zero and still manages to respond to incoming requests instantly? It feels like serverless magic, but under the hood it's powered by Knative, a technology that lets Kubernetes dynamically start and stop containers on demand.
Wouldn’t it be awesome to build something like that ourselves?
Today, we’re going to roll up our sleeves and create a Node.js-powered on-demand container system. 💡 We’ll:
✅ Start a container only when needed
✅ Forward incoming requests to it
✅ Auto-stop the container after inactivity
✅ Handle errors gracefully
Let’s go! 🚀
First, let’s create a tiny HTTP server that will run inside our container.
📄 server.js
import http from "http";

const hostname = "0.0.0.0";
const port = 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader("Content-Type", "text/plain");
  res.end("Hello from your on-demand container!\n");
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});
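Note that server.js uses ES module syntax (import). Recent Node versions can usually detect this automatically, but the most portable option is to add a minimal package.json with "type": "module" next to it (this file isn't part of the original post):
📄 package.json
{
  "type": "module"
}
You can also sanity-check the server locally before containerizing it:
node server.js
curl http://localhost:3000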
Now, let’s dockerize it.
📄 Dockerfile
FROM node:22
WORKDIR /app
COPY ./server.js ./server.js
CMD ["node", "server.js"]
Build the image:
docker build . -t simple-server:local
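Before wiring up the proxy, you can make sure the image works on its own (this quick manual test isn't part of the original walkthrough):
docker run --rm -p 8080:3000 simple-server:local
curl http://localhost:8080
Stop it with Ctrl+C afterwards; the proxy below will start its own instance when needed.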
Now, let's write a proxy script that:
✅ Listens for incoming HTTP requests
✅ Starts the container if it isn't already running
✅ Forwards each request to the container
✅ Stops the container after a period of inactivity
📄 index.js
import http from "node:http";
import { proxyRequest, isPortOpen, startContainer, stopContainer } from "./utils.js";
const CONTAINER_PORT = 8080;
const INACTIVITY_TIMEOUT = 30_000; // 30 seconds
let lastRequestTime = Date.now();
let containerRunning = false;
const httpServer = http.createServer(async (req, res) => {
  console.log("➡️ Incoming request detected...");
  lastRequestTime = Date.now();

  if (!(await isPortOpen(CONTAINER_PORT))) {
    console.log("🚀 Container is not running. Starting it now...");
    await startContainer();

    // "spawn" only means the docker process started; wait until the server
    // inside the container actually accepts connections before proxying.
    while (!(await isPortOpen(CONTAINER_PORT))) {
      await new Promise((resolve) => setTimeout(resolve, 200));
    }
  }
  containerRunning = true;

  console.log("🔁 Forwarding request to container...");
  return proxyRequest({ req, res, port: CONTAINER_PORT, host: "localhost" });
});
// Periodically check for inactivity
setInterval(async () => {
  if (containerRunning && Date.now() - lastRequestTime > INACTIVITY_TIMEOUT) {
    console.log("🛑 No activity detected. Stopping container...");
    await stopContainer();
    containerRunning = false;
  }
}, 5_000); // Check every 5 seconds

httpServer.listen(3000, "127.0.0.1", () => {
  console.log("🌐 On-Demand Proxy Server listening on http://127.0.0.1:3000");
});
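One optional addition (not in the original script): stop the container when you shut the proxy down with Ctrl+C, so nothing is left running in the background.
// Optional: clean up the container when the proxy itself is stopped.
process.on("SIGINT", async () => {
  console.log("👋 Shutting down. Stopping container...");
  if (containerRunning) await stopContainer();
  process.exit(0);
});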
📄 utils.js
import { spawn, exec } from "node:child_process";
import net from "node:net";
// Launch the container; the host port (8080) must match CONTAINER_PORT in index.js.
export function startContainer() {
  return new Promise((resolve) => {
    const child = spawn("docker", ["run", "--rm", "-p", "8080:3000", "simple-server:local"], {
      stdio: "inherit",
    });
    // "spawn" fires as soon as the docker process itself has started.
    child.on("spawn", resolve);
  });
}
// Find any running container created from our image and stop it.
export function stopContainer() {
  return new Promise((resolve, reject) => {
    exec("docker ps -q --filter ancestor=simple-server:local", (err, stdout) => {
      if (err) return reject(err);
      if (!stdout.trim()) return resolve(); // No container found
      exec(`docker stop ${stdout.trim()}`, (err) => {
        if (err) return reject(err);
        console.log("✅ Container stopped.");
        resolve();
      });
    });
  });
}
// Check whether something is listening on the port by attempting a TCP connection.
export function isPortOpen(port, host = "127.0.0.1") {
  return new Promise((resolve) => {
    const socket = new net.Socket();
    socket.setTimeout(1000);
    socket.once("connect", () => {
      socket.destroy();
      resolve(true);
    });
    socket.once("timeout", () => {
      socket.destroy();
      resolve(false);
    });
    socket.once("error", () => {
      socket.destroy();
      resolve(false);
    });
    socket.connect(port, host);
  });
}
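index.js also imports proxyRequest, which isn't shown above. Here's a minimal sketch of what it could look like, using only Node's built-in http module to stream the incoming request to the container and pipe the response back (add the http import alongside the others at the top of utils.js):
import http from "node:http";

// Forward the incoming request to the container and stream the response back.
export function proxyRequest({ req, res, port, host }) {
  return new Promise((resolve) => {
    const upstream = http.request(
      { host, port, path: req.url, method: req.method, headers: req.headers },
      (upstreamRes) => {
        res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
        upstreamRes.pipe(res);
        upstreamRes.on("end", resolve);
      }
    );
    upstream.on("error", (err) => {
      console.error("❌ Proxy error:", err.message);
      res.statusCode = 502;
      res.end("Bad Gateway");
      resolve();
    });
    req.pipe(upstream);
  });
}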
Fire up your Node.js proxy server:
node index.js
Then, make a request:
curl http://localhost:3000
🎉 You should see the container start up, handle the request, and return:
Hello from your on-demand container!
Wait 30 seconds without sending requests, and check the logs—you’ll see:
🛑 No activity detected. Stopping container...
✅ Container stopped.
Now, try the request again, and watch it restart automatically!
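To feel the difference between a cold start and a warm container, time the first request after the container has been stopped and then a second one right away (exact numbers will vary on your machine):
time curl http://localhost:3000
time curl http://localhost:3000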
We've built a simple but powerful system that:
✅ Dynamically starts a container when needed
✅ Forwards HTTP requests to the container
✅ Stops the container after a period of inactivity
✅ Handles errors properly and avoids unnecessary resource usage
This is a lightweight alternative to serverless platforms like Cloud Run, but one that runs entirely on your own infrastructure.
👉 Want more? Let me know in the comments what we should build on top of this!
Thanks for reading! 🚀