Web Simulation

Softbits: The "Certainty Spectrum" Tutorial

This tutorial illustrates the difference between hard bits (rigid 0 or 1 decisions) and soft bits (probabilistic Log-Likelihood Ratios, LLRs) in digital communications. Moving from a threshold-based "left or right" decision to a confidence value is where modern decoders (Viterbi, LDPC, Turbo) get their gain: they use soft inputs to weight reliability.

Mathematical foundation

BPSK. Transmitted symbol: x = +1 (bit 0) or x = −1 (bit 1). Received sample: y = x + n, where n is zero-mean Gaussian noise with variance σ². For unit-energy BPSK, SNR = 1/σ² (linear), so σ = 1/√SNR.
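In code, this channel model is only a few lines. A minimal sketch of the mapping and noise model above (function name is my own, not the page's):

```python
import math
import random

def bpsk_channel(bit, snr_db, rng=random):
    """Send one bit through BPSK + AWGN at the given SNR (dB).

    bit 0 -> x = +1, bit 1 -> x = -1; noise std dev sigma = 1/sqrt(SNR_linear).
    Returns (y, sigma).
    """
    snr_lin = 10 ** (snr_db / 10)     # dB -> linear
    sigma = 1 / math.sqrt(snr_lin)    # sigma = 1/sqrt(SNR)
    x = 1 if bit == 0 else -1         # BPSK mapping
    return x + rng.gauss(0.0, sigma), sigma
```

At 10 dB, σ ≈ 0.316 and most samples land well inside ±1; at 0 dB, σ = 1 and the two clouds overlap heavily.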

Hard decision. Decide bit 0 if y > 0, else bit 1. A single threshold throws away how far y is from zero—information that tells us how confident we are.

Soft bit (LLR). The Log-Likelihood Ratio for bit 0 vs bit 1 (under AWGN) is LLR = 2y/σ². LLR ≫ 0 means strong evidence for 0; LLR ≪ 0 means strong evidence for 1; LLR ≈ 0 means uncertain. This single number captures "how much" we believe the bit is 0 or 1, and is exactly what soft-decision decoders use.
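Hard and soft decisions side by side, as a sketch (function names hypothetical):

```python
import math

def hard_bit(y):
    """Threshold at zero: bit 0 if y > 0, else bit 1."""
    return 0 if y > 0 else 1

def soft_bit(y, sigma):
    """BPSK/AWGN per-sample LLR: 2y / sigma^2 (sign = bit, magnitude = confidence)."""
    return 2 * y / sigma ** 2
```

At SNR = 10 dB (σ² = 0.1), y = 0.9 gives LLR = 18 (very confident 0) while y = 0.05 gives LLR = 1.0: both hard-decide to 0, but only the LLR records that the second sample is nearly a coin flip.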

Gaussian overlap. The two bell curves in the right panel are the PDFs of y when x = +1 (green) and x = −1 (blue). Where they overlap (shaded red), a received y could plausibly have come from either symbol—hence low confidence and LLR near zero. As SNR increases, σ shrinks, the curves separate, and LLRs grow in magnitude.
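The overlap area translates directly into the hard-decision error rate: a hard error occurs exactly when noise pushes y across zero, which for BPSK happens with probability Q(1/σ) = Q(√SNR). A quick check (the formula is standard; the code is my own):

```python
import math

def bpsk_hard_ber(snr_db):
    """Hard-decision BER for BPSK over AWGN: Q(sqrt(SNR)) = 0.5*erfc(sqrt(SNR/2))."""
    snr_lin = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(snr_lin / 2))
```

At 0 dB the overlap is large and the BER is about 0.159; by 10 dB it has fallen below 10⁻³, matching how the shaded red region shrinks as the slider moves right.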

[Interactive readout: received sample y, hard bit, and soft bit (LLR); SNR slider (dB) with σ display; quantization selector, Float (ideal) by default. Panels: "BPSK constellation + AWGN (cloud)" and "Gaussian PDF overlap & LLR", whose LLR scale runs from −20 (strong 1) through 0 (uncertain) to +20 (strong 0). The "Softbit calculation & bits per sample" panel applies LLR = 2y/σ² with σ = 1/√SNR, and decodes the hard bit from the LLR sign.]

Current symbol data

[Live table: per-bit metrics for the current symbol — rows Hard decision, Soft integer, and Hardware Reg (quantized LLR); columns I-Sign (B0), I-Mag (B1), Q-Sign (B2), Q-Mag (B3).]

Image Recovery Stress Test

Send a 32×32 image through the channel at the current SNR. Hard uses only the latest received bits; Soft accumulates LLRs (like a repetition decoder) so confidence builds over transmissions. At low SNR the Hard result stays noisy; the Soft result recovers the image.
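The soft accumulator amounts to a repetition decoder: sum each bit's LLRs across packets, then slice the sign of the sum. A sketch of the idea (not the page's actual code):

```python
import math
import random

def repetition_demo(bits, snr_db, packets, rng):
    """Hard 'latest wins' vs. soft LLR accumulation over repeated packets."""
    sigma = 1 / math.sqrt(10 ** (snr_db / 10))
    llr_sum = [0.0] * len(bits)      # soft: running LLR per bit
    hard_latest = [0] * len(bits)    # hard: only the newest packet survives
    for _ in range(packets):
        for i, b in enumerate(bits):
            y = (1 if b == 0 else -1) + rng.gauss(0.0, sigma)
            llr_sum[i] += 2 * y / sigma ** 2        # confidence adds up
            hard_latest[i] = 0 if y > 0 else 1      # overwritten every packet
    return hard_latest, [0 if s > 0 else 1 for s in llr_sum]
```

At −3 dB every individual packet is very noisy, yet after a few dozen packets the accumulated LLRs slice to the correct bits with overwhelming probability, while the hard canvas keeps showing fresh errors.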

[Canvases: Source, Hard Bit Recovery, and Soft Bit (LLR) Recovery, with a transmitted-packet counter.]

Usage

SNR (dB): Drag the slider to change the signal-to-noise ratio. At low SNR the two Gaussian curves overlap heavily (red region), the received point often lands in the overlap, and the LLR stays near zero (uncertain). At high SNR the curves separate, the cloud tightens around +1 and −1, and the LLR needle moves strongly left (bit 1) or right (bit 0).

Left panel (Constellation): BPSK symbols +1 (green) and −1 (blue) on the real axis. The white dot is the current received sample y = x + noise. The ghosting effect (fading trail) shows the cloud of past samples so you see the noise distribution over time. The transmitted bit toggles every 2 seconds so you can watch the cloud jump between the two symbols.

Right panel (PDF): Two bell curves: p(y|bit 0) centered at +1 (green), p(y|bit 1) centered at −1 (blue). The red overlap is the region of ambiguity. The white dashed line marks the current y. The gradient bar below is the LLR scale (its range adjusts with SNR); the white needle marks the LLR computed from that same y. Green side = confidence for 0; blue/cyan side = confidence for 1; center = uncertain.

Key concepts

  • Hard decision: Threshold at 0; discards confidence information, which typically costs about 2 dB versus soft-decision decoding; error rate falls only once SNR is already good.
  • Soft decision (LLR): Single number 2y/σ²; magnitude = confidence; enables FEC (Viterbi, LDPC, Turbo) to weight bits and correct more errors.
  • Overlap: Where the two Gaussians overlap, the same y could come from either bit—that is exactly when LLR is near zero and a smart decoder can use other bits to resolve the ambiguity.
  • Modulation: BPSK (1 bit/symbol), 4-QAM (2 bits), 16-QAM (4 bits). Symbol table shows Hard decision, Soft integer, and Hardware Reg (quantized LLR) per bit.
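The interleaver mentioned above can be sketched as a pseudo-random permutation applied before transmission and inverted at the receiver (implementation my own, not the page's):

```python
import random

def interleave(bits, seed=42):
    """Scramble transmission order so patterned channel errors spread out."""
    order = list(range(len(bits)))
    random.Random(seed).shuffle(order)  # fixed seed: both ends share the permutation
    return [bits[i] for i in order], order

def deinterleave(rx, order):
    """Invert the permutation at the receiver."""
    out = [None] * len(rx)
    for pos, src in enumerate(order):
        out[src] = rx[pos]
    return out
```

Because the permutation is known at both ends, bursty or patterned errors (like the vertical bars in 16-QAM) land on scattered pixels after deinterleaving instead of clustering.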
  • Image Recovery: 32×32 source (Letter G, Checkerboard, or Random) is sent at the current SNR and modulation. Hard canvas shows latest received bits; Soft canvas accumulates LLRs over repeated "Transmit Packet" clicks. Apply Interleaving scrambles transmission order to avoid patterned errors (e.g. vertical bars in 16-QAM).