
Node.js Logging Best Practices

Structured logs, the right library, what never to log, and how to ship logs to production without exposing secrets or violating GDPR.

10 min read · Updated Apr 2026

Node.js logging best practices start with one rule: never use console.log in production. Good logging is the difference between a 10-minute outage investigation and a 3-hour one. Bad logging is the difference between a clean security posture and a data breach. This guide covers both.

Stop Using console.log in Production

console.log has no log levels, no timestamps, no structured output, and no way to send logs to an external service; it is also synchronous and can block the event loop under load. It works fine for debugging in development. For production:

❌ console.log (don't)
// No timestamp
// No severity
// Not queryable
// Can't be filtered
console.log("User logged in", userId);
console.log("Payment processed", amount);
console.error("DB error", err);

✅ Structured logger (do this)
// JSON output, timestamped, leveled
logger.info({ user_id: 882, event: "login" });
logger.info({ event: "payment", amount_cents: 4999 });
logger.error({ err, event: "db_error", query: "select_user" });

Winston vs Pino — Which to Choose

                      Pino                            Winston
Speed                 5-8x faster                     Slower
Output format         JSON (native)                   Flexible (JSON, text, custom)
Multiple transports   Via pino.multistream            Built-in, easy to configure
Async logging         Yes (pino/file)                 No (synchronous by default)
Express integration   pino-http                       morgan or express-winston
Best for              High-throughput APIs,           Feature-rich apps,
                      microservices                   multiple outputs

Use Pino if performance is critical (high request volume, tight latency budgets). Use Winston if you need multiple simultaneous transports (file + console + external service) or more flexible formatting out of the box.

Pino setup

npm install pino pino-http
// logger.js
const pino = require('pino');

const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  // Redact sensitive fields before logging
  redact: {
    paths: ['req.headers.authorization', 'req.body.password',
            'req.body.token', '*.email', '*.api_key'],
    censor: '[REDACTED]',
  },
});

module.exports = logger;

// server.js — Express HTTP request logging
const express = require('express');
const pinoHttp = require('pino-http');
const logger = require('./logger');

const app = express();
app.use(pinoHttp({
  logger,
  // Don't log health check endpoints
  autoLogging: {
    ignore: (req) => req.url === '/health',
  },
  // Custom serialiser — strip sensitive data from request logs
  serializers: {
    req: (req) => ({
      method: req.method,
      url: req.url,
      // Never log req.body or req.headers here
    }),
  },
}));

Winston setup

npm install winston
// logger.js
const { createLogger, format, transports } = require('winston');

const logger = createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: format.combine(
    format.timestamp(),
    format.errors({ stack: true }),
    format.json(),
  ),
  transports: [
    new transports.Console(),
    // Write errors to a separate file
    new transports.File({ filename: 'logs/error.log', level: 'error' }),
    new transports.File({ filename: 'logs/combined.log' }),
  ],
  // Uncaught exception handler
  exceptionHandlers: [
    new transports.File({ filename: 'logs/exceptions.log' }),
  ],
});

module.exports = logger;

Log Levels — Use Them Correctly

Level   When to use                                      Example
error   Unexpected failure requiring attention           DB connection failed, unhandled exception
warn    Recoverable problem, degraded operation          Rate limit approached, deprecated API used
info    Normal significant events                        Server started, user authenticated, payment processed
debug   Detailed flow for debugging (dev/staging only)   Function called, cache hit/miss, query executed
trace   Very verbose — individual steps                  Loop iterations, variable values during computation

Set LOG_LEVEL=info in production. Set LOG_LEVEL=debug only when actively debugging a specific issue — debug logs often contain data you don't want in long-term storage.
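The threshold works by simple numeric comparison. The toy sketch below (a hypothetical `makeLogger`, not any library's real internals, though the numeric values match Pino's levels) shows why a `debug` call becomes a no-op when the logger is configured at `info`:

```javascript
// Toy sketch of how leveled loggers decide what to emit. Hypothetical
// makeLogger; the numeric levels mirror Pino's (trace=10 ... error=50).
const LEVELS = { trace: 10, debug: 20, info: 30, warn: 40, error: 50 };

function makeLogger(minLevel = 'info') {
  const threshold = LEVELS[minLevel];
  const emitted = [];                       // stands in for stdout
  const log = (level, obj) => {
    if (LEVELS[level] >= threshold) {
      emitted.push(JSON.stringify({ level, time: new Date().toISOString(), ...obj }));
    }
  };
  return {
    emitted,
    trace: (o) => log('trace', o),
    debug: (o) => log('debug', o),
    info: (o) => log('info', o),
    warn: (o) => log('warn', o),
    error: (o) => log('error', o),
  };
}

const logger = makeLogger('info');        // as if LOG_LEVEL=info
logger.debug({ event: 'cache.miss' });    // below the threshold: dropped
logger.info({ event: 'server.started' }); // at or above threshold: emitted
```

In a real logger you would pass `process.env.LOG_LEVEL || 'info'` instead of a literal, which is exactly what the Pino and Winston setups above do.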

What Never to Log

This is where most security and compliance issues originate. Never log any of the following:

API keys and tokens
STRIPE_SECRET_KEY, OPENAI_API_KEY, AWS credentials, JWT tokens, session tokens. These are credentials — logging them creates a secondary exposure vector.
Passwords and password hashes
Including during authentication failures. Log "login failed for user_id: 882", not "wrong password for alice@example.com".
Full request bodies on sensitive routes
/login, /payment, /users — these routes receive passwords, card numbers, and PII. Log the event, not the payload.
Database connection strings
They contain passwords. Never log config objects that might include DB_URL.
Full HTTP Authorization headers
The header value IS the bearer token. Log that an auth header was present, not its value.
Personal data beyond legitimate need
Under GDPR, logging email addresses requires a lawful basis and defined retention. Use user_id (a pseudonymous identifier) instead.
Detailed stack frames in production
Stack traces in user-facing error responses leak your file structure and library versions. Log them server-side, never return them to clients.
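One way to honour the "log the event, not the payload" rule is to build log events explicitly rather than passing raw objects through. The helper below is a hypothetical sketch, not part of any library:

```javascript
// Hypothetical helper: build a safe auth-failure event carrying a
// pseudonymous user_id instead of the email or the submitted password.
function authFailureEvent(user, reason) {
  return {
    event: 'auth.failed',
    user_id: user.id, // pseudonymous identifier
    reason,           // e.g. 'bad_password', 'account_locked'
    // Deliberately omitted: user.email, the submitted password,
    // and any session or JWT tokens.
  };
}

const evt = authFailureEvent({ id: 882, email: 'alice@example.com' }, 'bad_password');
// Pass evt to logger.warn(evt); the email and password never reach the log.
```

Building events through a function like this means every field that enters the log was chosen deliberately, rather than hoping redaction catches whatever was in the object.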

Use Pino's redact Option

Pino's built-in redact option automatically censors specified field paths before writing the log — even if you accidentally pass a sensitive object:

const logger = pino({
  redact: {
    paths: [
      'req.headers.authorization',
      'req.headers.cookie',
      'req.body.password',
      'req.body.confirm_password',
      'req.body.card_number',
      'req.body.cvv',
      '*.token',
      '*.secret',
      '*.api_key',
      '*.password',
    ],
    censor: '[REDACTED]',
  },
});

// Even if you accidentally do this:
logger.info({ user: { email: 'alice@example.com', password: 'hunter2' } });
// Output: {"user":{"email":"alice@example.com","password":"[REDACTED]"}}
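For intuition, here is a rough sketch of what path-based redaction does. Pino's real implementation (the fast-redact package) is far more capable; this toy `redactPaths` is a hypothetical illustration that only handles literal paths and one-level `*` wildcards:

```javascript
// Rough sketch of path-based redaction (hypothetical redactPaths, not
// pino's real internals). Supports 'a.b.c' literals and '*.key' wildcards.
function redactPaths(obj, paths, censor = '[REDACTED]') {
  const clone = JSON.parse(JSON.stringify(obj)); // deep copy; logs are JSON anyway
  for (const path of paths) {
    const parts = path.split('.');
    const walk = (node, i) => {
      if (node == null || typeof node !== 'object') return;
      const part = parts[i];
      if (i === parts.length - 1) {
        if (part !== '*' && part in node) node[part] = censor;
      } else if (part === '*') {
        // Wildcard: descend into every child at this level
        for (const key of Object.keys(node)) walk(node[key], i + 1);
      } else {
        walk(node[part], i + 1);
      }
    };
    walk(clone, 0);
  }
  return clone;
}

const safe = redactPaths(
  { user: { email: 'alice@example.com', password: 'hunter2' } },
  ['*.password']
);
// safe.user.password is now '[REDACTED]'; safe.user.email is untouched.
```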

Structured Logging — Log Objects, Not Strings

Log structured objects rather than interpolated strings. This makes logs queryable in any logging platform:

// ❌ String interpolation — hard to query
logger.info(`User ${userId} logged in from ${ip}`);
logger.error(`Payment failed: ${error.message} for order ${orderId}`);

// ✅ Structured objects — queryable, filterable
logger.info({ event: 'user.login', user_id: userId, ip });
logger.error({ event: 'payment.failed', order_id: orderId, err });

// Good structured log shape
{
  "level": "info",
  "time": "2026-04-04T12:34:56.789Z",
  "event": "user.login",
  "user_id": 882,
  "ip": "203.0.113.0",
  "duration_ms": 45
}
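Structured logging pairs naturally with request-scoped context: bind shared fields such as a request ID once, and every subsequent line inherits them. Pino's `logger.child()` and Winston's child loggers do this for real; the hypothetical `childLogger` below fakes it with a toy collecting logger so the sketch runs standalone:

```javascript
// Sketch of request-scoped context. In Pino you would call
// logger.child({ request_id }); this hypothetical wrapper just merges
// bound fields into every entry.
function childLogger(base, bindings) {
  return {
    info: (obj) => base.info({ ...bindings, ...obj }),
    error: (obj) => base.error({ ...bindings, ...obj }),
  };
}

// Toy base logger that collects entries so the sketch is runnable.
const lines = [];
const base = { info: (o) => lines.push(o), error: (o) => lines.push(o) };

const reqLog = childLogger(base, { request_id: 'req-123', user_id: 882 });
reqLog.info({ event: 'payment.started', amount_cents: 4999 });
// Every entry from reqLog now carries request_id and user_id, so all
// log lines for one request can be found with a single query.
```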

Centralised Logging in Production

Writing logs to files on the server is fine for small apps. For anything with real traffic, you need centralised logging — a platform that aggregates logs from all your instances, lets you search and alert, and handles retention automatically.

Recommended: Better Stack (Logtail)
Structured log ingestion with SQL-based querying. Dead-simple Node.js integration — one package, two lines of code. EU data residency available, GDPR DPA provided. Free tier included.
npm install @logtail/node @logtail/winston
# or: @logtail/pino
betterstack.com

Grafana Loki (self-hosted)
Open-source log aggregation. All data stays on your servers. Best for GDPR/HIPAA where you need to avoid third-party data sharing. Requires devops setup but is free and fully private.
grafana.com/oss/loki

Better Stack integration with Pino

npm install pino @logtail/node @logtail/pino
const pino = require('pino');
const { Logtail } = require('@logtail/node');
const { LogtailTransport } = require('@logtail/pino');

const logtail = new Logtail(process.env.LOGTAIL_SOURCE_TOKEN);

const logger = pino(
  { level: 'info', redact: ['req.headers.authorization'] },
  pino.multistream([
    { stream: process.stdout },          // Also log to console
    { stream: new LogtailTransport(logtail) }, // Ship to Better Stack
  ])
);

module.exports = logger;

Log Retention Policy

Under GDPR, logs must be retained no longer than necessary. Set explicit retention in your logging platform:

Log type                         Why this window                                           Retention
Access / request logs            Operational debugging purpose is satisfied within weeks   30–90 days
Error logs                       Long-tail bugs may surface weeks after deployment         90 days
Security / auth logs             Fraud investigation requires longer history               12 months
Audit logs (financial/medical)   Document the legal basis for extended retention           Per regulation

Before Sharing Logs — Sanitize First

No matter how careful your logging setup is, you'll eventually need to share a log snippet for debugging. Always sanitize before sharing with a colleague, pasting into ChatGPT or posting on StackOverflow.
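A first pass can be a handful of regexes run locally before anything leaves your machine. The patterns below are illustrative assumptions, not an exhaustive list; real secrets take many shapes, so treat this as a starting point rather than a guarantee:

```javascript
// Minimal local log sanitizer sketch (hypothetical; patterns are
// illustrative, not exhaustive).
function sanitizeLogSnippet(text) {
  return text
    // JWTs: three base64url segments separated by dots, starting 'eyJ'
    .replace(/\beyJ[\w-]+\.[\w-]+\.[\w-]+/g, '[JWT]')
    // Email addresses
    .replace(/[\w.+-]+@[\w-]+\.[\w.-]+/g, '[EMAIL]')
    // Common key=value / key: value style secrets
    .replace(/\b(api[_-]?key|token|secret|password)\s*[=:]\s*\S+/gi, '$1=[REDACTED]')
    // Connection strings with embedded credentials
    .replace(/\b(postgres(ql)?|mysql|mongodb(\+srv)?):\/\/[^\s]+/gi, '[DB_URL]');
}

const clean = sanitizeLogSnippet('login failed for alice@example.com password: hunter2');
// clean === 'login failed for [EMAIL] password=[REDACTED]'
```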


FAQ

What is the best logging library for Node.js?

Pino for performance-critical applications — it's the fastest Node.js logger, outputting JSON with minimal overhead. Winston for applications needing multiple simultaneous transports and flexible configuration. Both are production-ready and far better than console.log.

Should I use console.log in production Node.js?

No. console.log has no log levels, no timestamps, no structured output and no way to route logs to external services. It's synchronous and can block the event loop under high load. Use Pino or Winston in production.

How do I make Node.js logging GDPR compliant?

Log only what you need for a documented purpose. Use user_id instead of email. Set defined retention periods. Sign a DPA with any third-party logging service. Never log request bodies on sensitive routes. See the GDPR-Compliant Logging Guide for full implementation details.

What should you never log in Node.js?

API keys, passwords, JWT tokens, session tokens, database connection strings, credit card numbers, full request bodies on auth/payment routes, and HTTP Authorization headers. Use Pino's redact option or Winston's custom formatters to automatically censor these fields.
