From Browser to Server
So far, JavaScript has been running in the browser. Node.js lets you run JavaScript on the server, outside of any browser. Same language, different environment, new superpowers.
Great News: JavaScript is JavaScript!
Everything you learned in the JS Refresher applies to Node.js:
- Variables, types, operators
- Functions, arrow functions, callbacks
- Objects, arrays, destructuring
- Promises, async/await
- Classes, this, spread operator
- Map, filter, reduce, and all array methods
What's NOT available in Node.js: browser-specific APIs don't exist: document, window, alert(), localStorage, etc. Node.js runs outside the browser, so it has its own APIs for files, networking, and process management.
By the end of this module, you will:
- Understand what Node.js is and how it differs from the browser
- Install Node.js and run JavaScript files from the terminal
- Master both CommonJS and ES Modules in depth
- Work with built-in modules (fs, path, os, readline)
- Create HTTP servers from scratch
- Use npm to manage packages and dependencies
- Set up nodemon for automatic server restarts
- Manage configuration with environment variables
What is Node.js?
Node.js is a JavaScript runtime built on Chrome's V8 engine. It allows you to run JavaScript outside the browser: on your computer, on a server, anywhere!
Node.js isn't a language or a framework; it's a runtime environment. To understand how it works, let's look at the three layers that make up its architecture:
V8: Google's open-source JavaScript engine (the same one inside Chrome). It takes your JavaScript code and compiles it directly to machine code rather than interpreting it line by line like older engines. This is a big part of why Node.js is so fast. V8 handles memory allocation, garbage collection, and executing your functions.
libuv: a C library that provides the event loop and asynchronous I/O. When your code reads a file, makes a network request, or queries a database, libuv handles it in a non-blocking way using a thread pool. It's the secret sauce that lets Node.js handle thousands of concurrent operations with a single JavaScript thread.
Node APIs: the built-in modules you actually use in your code: fs (files), http (servers), path (file paths), os (system info), crypto (encryption), and more. These are not browser APIs: there's no window, no document, no DOM. Instead, you get direct access to the operating system.
Think of It Like a Restaurant Kitchen
V8 is the chef: it executes the recipes (your code) extremely fast. libuv is the kitchen team: while the chef works on one dish, the team handles washing, prep, and delivery in parallel so nothing blocks. The Node APIs are the recipe book: pre-built tools (fs, http, crypto) that tell the kitchen what's available. The result: one chef can serve thousands of customers because the team handles all the waiting.
Browser
- window, document, DOM
- fetch, localStorage
- Sandboxed (no file system access)
- Runs in the user's browser
Node.js
- global, process, __dirname
- fs, http, path
- Full system access (files, network, OS)
- Runs on a server or your machine
Key Concept: Event-Driven & Non-Blocking
Node.js uses an event-driven, non-blocking I/O model. Instead of waiting for operations (like reading a file or querying a database) to finish, Node.js registers a callback and moves on to handle other tasks. When the operation completes, the callback fires. This makes Node.js extremely efficient for I/O-heavy applications like web servers and APIs, handling thousands of concurrent connections with a single thread.
Installation & Setup
1. Download Node.js
Go to nodejs.org and download the LTS (Long Term Support) version, the recommended, stable release.
Installation Tips
- Windows: download the .msi installer and run it
- macOS: download the .pkg installer or use Homebrew: brew install node
- Linux: use your package manager or download from nodejs.org
2. Verify Installation
# Check Node.js version
node --version # v20.x.x or higher
# Check npm (Node Package Manager) version
npm --version # comes with Node.js
What is npm?
npm (Node Package Manager) comes bundled with Node.js. It's the tool you use to install third-party packages (like Express, React, etc.). Think of it as an app store for JavaScript libraries!
Similar tools: Maven/Gradle (Java), pip (Python), Composer (PHP), Cargo (Rust), NuGet (.NET).
Version Manager (Recommended)
# macOS/Linux - nvm (Node Version Manager)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
nvm install --lts
nvm use --lts
# Windows - nvm-windows
# Download from https://github.com/coreybutler/nvm-windows
Tip: use nvm to switch between Node.js versions. Essential when working on multiple projects that need different versions.
Running Node.js
Node.js REPL (Interactive Console)
The REPL (Read-Eval-Print Loop) is an interactive JavaScript console. Type node in your terminal:
$ node
> 2 + 2
4
> const name = "Node.js"
undefined
> console.log(`Hello, ${name}!`)
Hello, Node.js!
undefined
> .exit // Exit the REPL (or Ctrl+C twice)
Executing a JavaScript File
Create a file called app.js and run it without a browser:
// app.js β Your first Node.js program
console.log("Hello from Node.js!");
console.log("Platform:", process.platform); // "darwin", "win32", "linux"
console.log("Node version:", process.version);
console.log("Current dir:", process.cwd());
console.log("Arguments:", process.argv.slice(2));
// Run: node app.js arg1 arg2
// Arguments: [ 'arg1', 'arg2' ]
The process Global
process is a Node.js global object (available everywhere, no import needed) that provides information about the current running process. Key properties include process.platform (OS name), process.version (Node.js version), process.cwd() (current working directory), process.argv (command-line arguments), and process.env (environment variables; more on this in Section 10).
File Path Options
1. Navigate first: cd /path/to/folder then node app.js
2. Relative path: node ./subfolder/app.js
3. Absolute path: node /Users/name/projects/app.js
Inline Execution
You can also run JavaScript directly from the command line without creating a file, using the -e (evaluate) flag. This is handy for quick tests or one-liner scripts:
# Run inline code
$ node -e "console.log(process.version)"
v20.11.0
This is a Game Changer!
You just ran JavaScript without opening a browser! No HTML, no <script> tags. Just pure JavaScript executing on your machine.
Module System
In Node.js, every JavaScript file is a module. Modules let you split code into separate, reusable files.
Three Types of Modules
1. Your Own Modules
Any .js file you write becomes a reusable module that can be imported into other files.
2. Built-in Node.js Modules
Powerful modules like fs, http, path, os β no installation needed!
3. Third-Party Modules
Community packages via npm: express, axios, lodash, and thousands more!
Two Module Systems
CommonJS (CJS) - Traditional
require() / module.exports
Synchronous. Default in Node.js.
ES Modules (ESM) - Modern
import / export
Asynchronous. Standard JavaScript. Use "type": "module" in package.json.
CommonJS: require() & module.exports
// math.js β Create a module
const add = (a, b) => a + b;
const subtract = (a, b) => a - b;
const multiply = (a, b) => a * b;
module.exports = { add, subtract, multiply };
// app.js β Import and use
const { add, multiply } = require('./math.js');
console.log(add(5, 3)); // 8
console.log(multiply(3, 7)); // 21
How require() Works Under the Hood
What Happens When You Call require('./math.js')?
- Resolve the path: Node.js figures out the full path
- Check the cache: already loaded? Return the cached result
- Load & wrap: read the file and wrap it in a function with module, exports, require, __dirname, __filename
- Execute: run the entire file top to bottom
- Return: return whatever is in module.exports
Module Caching: Execute Once, Use Many
// counter.js
console.log('Counter module loaded!');
let count = 0;
module.exports = { increment: () => ++count, getCount: () => count };
// app.js
const c1 = require('./counter.js'); // Logs: "Counter module loaded!"
const c2 = require('./counter.js'); // Nothing! (cached)
c1.increment(); // count = 1
c2.increment(); // count = 2 (same instance!)
console.log(c1.getCount()); // 2 - all point to the same module!
The Module Wrapper Function
// Your code in math.js:
const add = (a, b) => a + b;
module.exports = { add };
// What Node.js actually runs:
(function(exports, require, module, __filename, __dirname) {
const add = (a, b) => a + b;
module.exports = { add };
});
This explains: why variables in one module don't pollute others (function scope!), where module, exports, and require come from, and how you can access __dirname and __filename.
ES Modules: import & export
// math.js β ES Module exports
export const add = (a, b) => a + b;
export const subtract = (a, b) => a - b;
export default { add, subtract };
// app.js β ES Module imports
import { add } from './math.js'; // Named import
import * as math from './math.js'; // Namespace import
import mathFunctions from './math.js'; // Default import
console.log(add(5, 3)); // 8
How import/export Works Under the Hood
The ESM Process
- Static analysis: before execution, all import statements are scanned (they must be at the top level!)
- Dependency graph: build a graph of all modules
- Parallel loading: fetch all module files in parallel
- Parse & link: parse each module and link imports to exports
- Execute: run modules in the correct order (dependencies first)
Static Imports: Decided at Parse Time
// Valid: imports must be at the top level
import { add } from './math.js';
// Invalid: you can't import conditionally!
if (condition) {
  import { add } from './math.js'; // SyntaxError!
}
// Use dynamic import() for conditional loading
if (condition) {
  const { add } = await import('./math.js'); // Works!
}
Why static? Static imports allow tree shaking (dead-code elimination), better tooling, and optimization, all impossible with dynamic require()!
Live Bindings: ES Module Magic!
// counter.js
export let count = 0;
export function increment() { count++; }
// app.js
import { count, increment } from './counter.js';
console.log(count); // 0
increment();
console.log(count); // 1 (updated automatically - live binding!)
Important: you can READ the imported variable, but you cannot REASSIGN it. The exporting module owns the variable!
CommonJS vs ES Modules
| Feature | CommonJS | ES Modules |
|---|---|---|
| Syntax | require() | import |
| Loading | Synchronous | Asynchronous |
| Resolved | Runtime (dynamic) | Parse time (static) |
| Conditional? | Yes, anywhere | Top-level only |
| Bindings | Copies (snapshot) | Live references |
| Tree shaking | No | Yes |
| Browser? | No (Node only) | Yes (native support) |
Which to use? New projects should use ES Modules (import/export). Add "type": "module" to package.json. The future is ESM: it works natively in both Node.js AND browsers!
Built-in Modules
Node.js ships with dozens of powerful modules, no npm install needed. These are part of the Node.js core and are always available. Here are the essential ones you'll use daily:
- os: system info
- readline: user input
- path: file paths
- fs: file system
- http: web servers
- crypto: encryption
- events: event emitter
- url: URL parsing
os - Operating System
The os module provides information about the operating system your code is running on. It's useful for building platform-specific logic, monitoring system resources, or creating tools that adapt to the user's machine.
const os = require('os');
console.log('Platform:', os.platform()); // 'darwin', 'win32', 'linux'
console.log('Architecture:', os.arch()); // 'x64', 'arm64'
console.log('CPU Cores:', os.cpus().length);
console.log('Total Memory:', (os.totalmem() / 1024**3).toFixed(1), 'GB');
console.log('Free Memory:', (os.freemem() / 1024**3).toFixed(1), 'GB');
console.log('Home Directory:', os.homedir());
console.log('Uptime:', (os.uptime() / 3600).toFixed(1), 'hours');
readline/promises - User Input
The readline module allows your Node.js program to read input from the terminal, the equivalent of prompt() in the browser. The readline/promises variant (Node.js 17+) provides a modern async/await interface, making it easy to build interactive CLI applications.
const readline = require('readline/promises');
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout
});
async function getUserInfo() {
const name = await rl.question('What is your name? ');
const age = await rl.question('How old are you? ');
console.log(`\nHello, ${name}! You are ${age} years old.`);
rl.close();
}
getUserInfo();
path - File Paths
The path module provides utilities for working with file and directory paths. Always use path.join() instead of string concatenation ('folder' + '/' + 'file.txt'); it handles the path separator (/ on macOS/Linux, \ on Windows) automatically, making your code cross-platform.
import path from 'node:path';
path.join('/users', 'mehdi', 'docs', 'file.txt');
// "/users/mehdi/docs/file.txt"
path.resolve('src', 'app.js');
// "/absolute/path/to/src/app.js"
path.basename('/users/mehdi/app.js'); // "app.js"
path.extname('photo.png'); // ".png"
path.dirname('/users/mehdi/app.js'); // "/users/mehdi"
The node: prefix: you may see import path from 'node:path' vs require('path'). Since Node.js 16+, you can prefix built-in modules with node: to make it explicit that you're importing a core module, not a third-party package with the same name. Both forms work, but the node: prefix is recommended in modern code.
fs - File System
The fs (file system) module is one of the most important Node.js modules. It lets you create, read, update, and delete files and directories on your machine, something impossible in the browser for security reasons. This is what makes Node.js a powerful tool for backend development, build tools, and automation scripts.
Three Versions of the fs Module
Synchronous
fs.readFileSync('file.txt', 'utf8')
Blocks your program - don't use it in servers!
Async callbacks
fs.readFile('file.txt', 'utf8', (err, data) => { ... })
Non-blocking, but uses callbacks (messy)
fs/promises
await fs.readFile('file.txt', 'utf8')
Non-blocking + clean async/await - the recommended way!
Reading Files
The readFile() method reads the entire contents of a file into memory. The second argument 'utf8' tells Node.js to return the contents as a string; without it, you'd get a raw Buffer of bytes (useful for binary files like images, but not for text).
const fs = require('fs/promises');
async function readFile() {
try {
const data = await fs.readFile('example.txt', 'utf8');
console.log('File contents:', data);
} catch (error) {
console.error('Error reading file:', error.message);
}
}
readFile();
Writing Files
writeFile() creates a new file or completely overwrites an existing one. If you want to add content to the end of a file without erasing it, use appendFile() instead. Both methods are essential for logging, data storage, and file generation tasks.
const fs = require('fs/promises');
async function writeFile() {
try {
await fs.writeFile('output.txt', 'Hello from Node.js!', 'utf8');
console.log('File written successfully!');
await fs.appendFile('output.txt', '\nAppended line!', 'utf8');
console.log('Content appended!');
} catch (error) {
console.error('Error:', error.message);
}
}
writeFile();
Working with Directories
mkdir() creates directories, while readdir() lists a directory's contents. The { recursive: true } option works like mkdir -p in the terminal: it creates parent directories automatically if they don't exist, and won't throw an error if the directory already exists.
const fs = require('fs/promises');
async function workWithDirs() {
await fs.mkdir('myFolder', { recursive: true });
const files = await fs.readdir('.');
console.log('Files:', files);
try { await fs.access('myFolder'); console.log('Exists!'); }
catch(e) { console.log('Does not exist'); }
}
Complete File System Example
Let's combine everything you've learned (reading, writing, creating directories, and working with JSON) into a realistic script that demonstrates common patterns used in real Node.js applications:
const fs = require('fs/promises');
const os = require('os');
async function demo() {
// Write JSON data
const user = { name: 'John', age: 30, platform: os.platform() };
await fs.writeFile('user.json', JSON.stringify(user, null, 2), 'utf8');
  console.log('Created user.json');
  // Read it back
  const content = await fs.readFile('user.json', 'utf8');
  console.log('Read:', JSON.parse(content));
  // Create logs directory & write log
  await fs.mkdir('logs', { recursive: true });
  const log = `[${new Date().toISOString()}] User data accessed\n`;
  await fs.appendFile('logs/app.log', log, 'utf8');
  console.log('Logged activity');
  // List files
  const files = await fs.readdir('.');
  console.log('Files:', files);
}
demo().catch(err => console.error('Error:', err.message));
Building HTTP Servers
Node.js comes with a built-in http module to create web servers. A server listens for HTTP requests and sends back responses: the foundation of all web applications.
Your First Server
const http = require('http');
const server = http.createServer((req, res) => {
console.log('New request received!');
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end('Hello from Node.js server!');
});
const PORT = 3000;
server.listen(PORT, () => {
console.log(`Server running at http://localhost:${PORT}/`);
});
Server Components
http.createServer(callback)
Creates a server instance. The callback runs for every request.
server.listen(port, callback)
Starts listening on a port. Port 3000 is common for development.
req (Request Object)
req.url (path), req.method (HTTP method), req.headers (headers)
res (Response Object)
res.statusCode, res.setHeader(), res.writeHead(), res.end()
Deep Dive: Request Object Properties
- req.url: full path + query string
- req.method: GET, POST, PUT, DELETE, PATCH
- req.headers: all request headers (lowercase keys)
- req.headers['user-agent']: browser/client info
- req.headers['content-type']: format of the request body
- req.socket.remoteAddress: client's IP address
- req.on('data', fn): fires when a chunk arrives
- req.on('end', fn): fires when all data has been received
Deep Dive: Response Object Properties
- res.statusCode: 200 (OK), 404 (Not Found), 500 (Error)
- res.setHeader(name, val): set a single header
- res.writeHead(status, headers): set the status + multiple headers at once
- res.write(data): write a chunk (can be called multiple times)
- res.end([data]): finish the response - must be called!
- res.headersSent: boolean - have the headers been sent?
Critical: once you call res.write() or res.end(), the headers are sent. After that you cannot modify them. Set all headers first!
Common Status Codes
2xx: 200 (OK), 201 (Created), 204 (No Content)
3xx: 301 (Moved), 302 (Found), 304 (Not Modified)
4xx: 400 (Bad Request), 401 (Unauthorized), 404 (Not Found)
5xx: 500 (Server Error), 503 (Unavailable)
Reading Request Body (POST Data)
Request bodies arrive in chunks; you must collect them:
Data comes in chunks! The body doesn't arrive all at once. Listen for 'data' events, then process on 'end'. This is a fundamental Node.js streams concept.
Two Ways to Collect Chunks
Simple approach (text/JSON): string concatenation, easy and works perfectly for JSON data.
let body = '';
req.on('data', chunk => body += chunk.toString());
Buffer approach (binary data): for file uploads or when you need more control over encoding.
let chunks = [];
req.on('data', chunk => chunks.push(chunk));
req.on('end', () => {
const buffer = Buffer.concat(chunks);
const body = buffer.toString(); // or process binary data directly
});
Now let's put it all together in a complete working POST server. This example combines the simple chunk approach with JSON parsing and proper error handling, the exact pattern you'll use in real applications:
const http = require('http');
const server = http.createServer((req, res) => {
if (req.method === 'POST') {
let body = '';
req.on('data', (chunk) => {
body += chunk.toString();
});
req.on('end', () => {
try {
const data = JSON.parse(body);
console.log('Parsed:', data);
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ message: 'Data received!', received: data }));
} catch (error) {
res.writeHead(400);
res.end('Invalid JSON');
}
});
} else {
res.end('Send a POST request with JSON data');
}
});
server.listen(3000);
Test with cURL or Postman
# Test with cURL
curl -X POST http://localhost:3000 \
-H "Content-Type: application/json" \
-d '{"name":"John","age":30}'
# Expected response:
# {"message":"Data received!","received":{"name":"John","age":30}}π Why Does Node.js Use Chunks?
Node.js processes data in chunks instead of loading everything at once. This design pattern is what makes Node.js so powerful for I/O operations:
- Instead of loading a huge file into memory at once, Node.js processes it piece by piece; even a 10GB file won't crash your server.
- The server can handle other requests while waiting for data chunks to arrive. No waiting around!
- It can handle file uploads and large payloads without crashing. Essential for production servers.
Manual Routing with if/else
Real web servers need to respond differently based on the URL the client visits and the HTTP method used. In pure Node.js, you handle this by checking req.url and req.method with if/else statements. Think of it as a switchboard: each URL/method combination maps to a different action:
const http = require('http');
const server = http.createServer((req, res) => {
const { method, url } = req;
if (url === '/' && method === 'GET') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end('<h1>Welcome!</h1><p><a href="/about">About</a></p>');
}
else if (url === '/about' && method === 'GET') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end('<h1>About</h1><p>Node.js server</p>');
}
else if (url === '/api/users' && method === 'GET') {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify([{ id: 1, name: 'Alice' }, { id: 2, name: 'Bob' }]));
}
else {
res.writeHead(404, { 'Content-Type': 'text/html' });
res.end('<h1>404 - Not Found</h1>');
}
});
server.listen(3000, () => console.log('Server at http://localhost:3000'));
This gets messy fast! As your app grows, manual if/else routing is hard to maintain. That's why frameworks like Express (Module 5) exist, making routing elegant and powerful.
Content-Type Reference (MIME Types)
When sending responses, you must tell the browser what type of data you're sending using the Content-Type header. Without it, the browser guesses, often incorrectly.
| Type | Content-Type | Usage |
|---|---|---|
| HTML | text/html | Web pages |
| JSON | application/json | API responses, data exchange |
| Plain Text | text/plain | Simple text responses |
| CSS | text/css | Stylesheets |
| JavaScript | application/javascript | Script files |
| Images | image/png, image/jpeg, image/svg+xml | Image files |
| PDF | application/pdf | PDF documents |
Pro tip: in the static file server challenge below, we don't set Content-Type, so the browser has to guess. In production, you'd use the mime-types npm package or express.static(), which handles this automatically.
Complete Todo API Server
Time to build something real! This complete example combines everything you've learned (routing, GET, POST, DELETE, JSON parsing, error handling, and CORS headers) into a working REST API. Study how each route follows the same request, process, respond pattern. This is the exact architecture that frameworks like Express simplify (Module 5):
const http = require('http');
const todos = [
{ id: 1, task: 'Learn Node.js', completed: false },
{ id: 2, task: 'Build a server', completed: true }
];
const server = http.createServer((req, res) => {
const { method, url } = req;
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('Content-Type', 'application/json');
if (url === '/api/todos' && method === 'GET') {
res.writeHead(200);
res.end(JSON.stringify({ todos }));
}
else if (url === '/api/todos' && method === 'POST') {
let body = '';
req.on('data', chunk => body += chunk);
req.on('end', () => {
try {
const data = JSON.parse(body);
const newTodo = { id: todos.length + 1, task: data.task, completed: false };
todos.push(newTodo);
res.writeHead(201);
res.end(JSON.stringify({ message: 'Created!', todo: newTodo }));
} catch(e) {
res.writeHead(400);
res.end(JSON.stringify({ error: 'Invalid JSON' }));
}
});
}
else if (url.startsWith('/api/todos/') && method === 'DELETE') {
const id = parseInt(url.split('/')[3]);
const index = todos.findIndex(t => t.id === id);
if (index !== -1) {
todos.splice(index, 1);
res.writeHead(200);
res.end(JSON.stringify({ message: 'Deleted!' }));
} else {
res.writeHead(404);
res.end(JSON.stringify({ error: 'Not found' }));
}
}
else {
res.writeHead(404);
res.end(JSON.stringify({ error: 'Endpoint not found' }));
}
});
server.listen(3000, () => console.log('Todo API at http://localhost:3000'));
Challenge: Build a Static File Server
Build a server that serves HTML/CSS/JS from a public/ folder: a mini Live Server!
const http = require('http');
const fs = require('fs/promises');
const PUBLIC_FOLDER = 'public';
const server = http.createServer(async (req, res) => {
let filePath = req.url.split('?')[0];
if (filePath === '/') filePath = '/index.html';
try {
const content = await fs.readFile(PUBLIC_FOLDER + filePath);
res.statusCode = 200;
res.end(content);
} catch (e) {
res.statusCode = 404;
res.end('Not found');
}
});
server.listen(3000, () => {
  console.log('Static server at http://localhost:3000');
});
What you learned: you built a mini version of tools like Live Server! In production, use express.static(), but now you understand what it does internally.
npm & External Modules
npm is the world's largest software registry with over 2 million packages. These are third-party modules created by developers worldwide.
Initializing a Project
# Create package.json (interactive)
npm init
# Quick init with defaults
npm init -y
Understanding package.json
{
"name": "my-node-project",
"version": "1.0.0",
"description": "My awesome Node.js project",
"main": "index.js",
"scripts": {
"start": "node index.js",
"dev": "nodemon index.js",
"test": "echo \"Error: no test\" && exit 1"
},
"dependencies": {
"express": "^4.18.2"
},
"devDependencies": {
"nodemon": "^3.0.1"
}
}
- name: project name (required for publishing)
- version: current version (semantic versioning)
- scripts: commands you run with npm run
- dependencies: packages needed in production
- devDependencies: packages only for development
Installing Packages
# Install a package (goes to dependencies)
npm install express
npm i express # shorthand
# Install multiple
npm install express mongoose dotenv
# Install as dev dependency
npm install --save-dev nodemon
npm i -D nodemon # shorthand
# Install all from package.json
npm install
# Install specific version
npm install express@4.17.1
# Install globally
npm install -g nodemon
dependencies vs devDependencies
dependencies: packages your app needs to run in production, such as Express, database drivers, and authentication libraries. Installed with npm install <package>.
devDependencies: packages only needed during development, such as testing tools, build tools, nodemon, and linters. Installed with npm install --save-dev <package>.
When deploying: running npm install --production (or npm install --omit=dev in newer npm) installs only dependencies, skipping devDependencies and keeping your production server lean!
Understanding Version Numbers (SemVer)
npm uses Semantic Versioning (SemVer): MAJOR.MINOR.PATCH
"express": "^4.18.2" means 4 = MAJOR, 18 = MINOR, 2 = PATCH
^4.18.2 allows 4.18.2 up to (but not including) 5.0.0 (minor + patch updates)
~4.18.2 allows 4.18.2 up to (but not including) 4.19.0 (patch updates only)
4.18.2 pins the exact version (no updates)
- MAJOR (4): breaking changes. Your code might break!
- MINOR (18): new features added, backward compatible
- PATCH (2): bug fixes only, safe to update
The ^ prefix (the default when installing) allows minor + patch updates. This is usually safe, but occasionally a minor update introduces a subtle breaking change. When stability is critical, use exact versions or ~.
What Happens When You Install?
package.json
Updated with the package name + version
package-lock.json
Locks exact versions. Commit it to git!
node_modules/
The actual package code. Never commit it to git!
.gitignore: always add node_modules/ and .env to .gitignore. node_modules is huge and can be regenerated with npm install.
npm Scripts
# Special scripts - you can omit "run"
npm start # runs "start" script
npm test # runs "test" script
# Custom scripts - need "run"
npm run dev # runs "dev" script
npm run build # runs "build" script
Why npm start? It's a standard convention. Others can clone your project and immediately run npm start without knowing your file structure. Platforms like Heroku and Vercel look for this script!
Useful npm Commands
- npm list --depth=0: list top-level packages
- npm outdated: check for newer versions
- npm update: update packages
- npm uninstall express: remove a package
- npm view express: view package info
- npm search mongodb: search for packages
- npm cache clean --force: clean the npm cache
Nodemon
During development, you'll constantly be changing your server code: adding routes, fixing bugs, tweaking responses. But Node.js doesn't automatically pick up those changes. Let's see why that's a problem and how nodemon solves it.
The Problem
Every time you change your code, you must manually stop (Ctrl+C) and restart the server. This gets annoying fast!
The Solution: nodemon
nodemon automatically restarts your app when it detects file changes. A must-have for development!
How Does nodemon Work?
Under the hood, nodemon uses file system watchers (like fs.watch or the chokidar library) to monitor your project directory. When any watched file changes (is saved), nodemon automatically kills the running Node.js process and spawns a new one. It's essentially doing the Ctrl+C then node server.js cycle for you, instantly, every time you save.
# Install as dev dependency
npm install --save-dev nodemon
The --save-dev flag means nodemon is a development dependency: it's only needed while coding, not in production. Next, add a dev script to your package.json so you can start it easily:
// package.json
{
"scripts": {
"start": "node server.js", // Production
"dev": "nodemon server.js" // Development (auto-restart)
}
}
Now instead of running node server.js manually, use npm run dev. Nodemon will watch your files and automatically restart the server whenever you save a change.
Visual Comparison
Without nodemon
- Edit code
- Save file
- Ctrl+C to stop the server
- node server.js again
- Repeat for every change...
With nodemon
- Edit code
- Save file
- Auto-restart! That's it!
Advanced: nodemon.json Configuration
// nodemon.json
{
"watch": ["src"],
"ext": "js,json,html",
"ignore": ["test/*", "docs/*"],
"delay": "1000",
"env": {
"NODE_ENV": "development",
"PORT": "3000"
}
}
Pro Tip: type rs in the terminal and press Enter to manually restart nodemon anytime!
Common Gotchas
- Port already in use: if nodemon crashes and the port is still occupied, find the old process with lsof -i :3000 (macOS/Linux) and kill it, or restart your terminal.
- Watching the wrong files: by default, nodemon watches .js, .mjs, and .json files. If you edit a .env or .html file and expect a restart, you need to configure "ext" in nodemon.json.
- Don't use it in production: nodemon is a dev tool only. In production, run node server.js directly (or use a process manager like PM2).
Environment Variables
Environment variables are key-value pairs for storing configuration and secrets that change between environments (development, production).
Why Use Environment Variables?
Security
Never hardcode secrets! API keys and DB passwords should never be in source code.
Flexibility
Different configs per environment: a dev database locally, a prod database on the server - same code!
Team Work
Each developer can have their own local config without conflicts.
📖 Accessing: process.env
// Access environment variables
const PORT = process.env.PORT || 3000;
const NODE_ENV = process.env.NODE_ENV || 'development';
const DATABASE_URL = process.env.DATABASE_URL;
const API_KEY = process.env.API_KEY;

// `server` is an http.Server created as in the HTTP section
server.listen(PORT, () => {
  console.log(`Server on port ${PORT} in ${NODE_ENV} mode`);
});

⚙️ Setting Environment Variables
💻 1. Command Line (Temporary)
# Linux/Mac
PORT=3000 NODE_ENV=production node app.js
# Windows (PowerShell)
$env:PORT=3000; node app.js

📄 2. .env File (Recommended ✨)
# .env file
PORT=3000
NODE_ENV=development
DATABASE_URL=mongodb://localhost:27017/myapp
API_KEY=your_secret_api_key_here
JWT_SECRET=super_secret_token_key

📦 3. npm Scripts
You can set environment variables directly in package.json scripts. Handy for distinguishing dev vs production start commands (note: the inline VAR=value syntax works in Unix shells; on Windows, the cross-env package makes it portable):
// package.json
{
"scripts": {
"start": "NODE_ENV=production node server.js",
"dev": "NODE_ENV=development nodemon server.js"
}
}

📦 Using the dotenv Package
# 1. Install
npm install dotenv
# 2. Create .env file (see above)
# 3. Load in your app (first line!)

// server.js
require('dotenv').config(); // Load .env at the very top!
const PORT = process.env.PORT || 3000;
const DB = process.env.DATABASE_URL;
const API_KEY = process.env.API_KEY;
console.log('Port:', PORT);
console.log('Database:', DB);
// Never log API keys in production!

🔒 Critical: Never Commit .env to Git!
# .gitignore
.env
.env.local
.env.*.local
.env.development
.env.production
node_modules/

Instead: Create a .env.example with placeholder values (safe to commit). Teammates copy it to .env and fill in their own values.
# .env.example (safe to commit!)
PORT=3000
DATABASE_URL=your_database_url_here
API_KEY=your_api_key_here
NODE_ENV=development

▶ ✨ Environment Variables Best Practices
- Always use .env for local development: don't set environment variables manually each time
- Never commit .env to version control: add it to .gitignore
- Provide .env.example: document required variables for teammates
- Always provide defaults: use process.env.PORT || 3000 as a fallback
- Validate required variables on startup: fail fast if critical vars are missing
- Different values per environment: dev database locally, prod database on the server
- Load dotenv as the first line: before any other imports that might need env vars
// Validate required variables on startup
const required = ['DATABASE_URL', 'API_KEY', 'JWT_SECRET'];
for (const key of required) {
if (!process.env[key]) {
console.error(`β Missing required env var: ${key}`);
process.exit(1);
}
}
console.log('✅ All required environment variables loaded!');

📋 Common Environment Variables
- NODE_ENV: 'development', 'production', 'test'
- PORT: Server port (3000, 8080)
- DATABASE_URL: Database connection string
- API_KEY: Third-party API keys
- JWT_SECRET: Token signing secret
- CORS_ORIGIN: Allowed origins

🌍 Production Deployment
On production platforms (Heroku, AWS, Vercel), you don't use .env files. You set environment variables through the platform's dashboard or CLI, keeping secrets secure and separate from code.
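One consequence worth coding for: load dotenv only outside production, since on these platforms there is no .env file and dotenv itself is typically a devDependency. A sketch of the pattern (the try/catch is just a guard so it also runs where dotenv isn't installed):

```javascript
// Load .env only in local development; hosting platforms inject
// variables straight into process.env, so there is nothing to load.
if (process.env.NODE_ENV !== 'production') {
  try {
    require('dotenv').config();
  } catch {
    // dotenv not installed (e.g. devDependencies pruned): just skip it
  }
}

const PORT = process.env.PORT || 3000;
console.log(`Starting in ${process.env.NODE_ENV || 'development'} mode on port ${PORT}`);
```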
🚀 What You Can Build
With Node.js, the possibilities are endless!
You've now mastered the core building blocks: the runtime, modules, the file system, HTTP servers, npm, and environment variables. These foundations unlock a huge variety of real-world applications. Here's a taste of what developers build with Node.js every day:
- Desktop Apps: VS Code, Slack, Discord, Figma, Notion (Electron, Tauri)
- Web Servers & APIs: REST APIs, GraphQL, WebSocket servers, microservices (Express, Fastify, Nest.js)
- Bots & Automation: Discord bots, Telegram bots, web scrapers, task schedulers (discord.js, telegraf, puppeteer)
- CLI Tools: create-react-app, webpack, ESLint, npm/yarn (commander, inquirer, chalk)
- Real-time Apps: Chat apps, live dashboards, multiplayer games (Socket.IO, WebRTC)
- IoT & Hardware: Raspberry Pi, Arduino, smart home, robotics (Johnny-Five, node-serialport)

📝 Summary
You've made the leap from browser to server! Node.js gives you the full power of JavaScript plus direct access to the operating system: files, network, processes. With the module system, built-in APIs like fs, path, and http, the npm ecosystem, and dev tools like nodemon and dotenv, you have everything you need to start building real backend applications. In the next module, you'll learn Express, the framework that makes Node.js server development fast and elegant.
🟢 Node.js
JavaScript runtime built on V8 + libuv. Server-side, full system access.
📦 Modules
CommonJS (require) or ESM (import). Use ESM for new projects.
📁 fs
Read/write files. Use fs/promises for async operations.
📍 path
Cross-platform file paths. join, resolve, basename.
💻 os
System info: platform, CPU, memory, home directory.
🌐 http
Create servers from scratch. req/res objects, chunks, routing.
📦 npm
Package manager. npm init, npm install, scripts, package.json.
🔄 nodemon
Auto-restart on file changes. Essential for development.
🔐 .env
Environment variables via dotenv. Never commit secrets to git.