TemporalCloak

How TemporalCloak Works

TemporalCloak hides secret messages in the timing delays between data transmissions — not in the data content itself. The image you receive is completely normal; the secret is encoded in when each piece arrives.

[Diagram: the sender streams image chunks to the receiver; the image loads progressively while the inter-chunk gaps carry the hidden bits]

1 Message to Bits

Each character is converted to its 8-bit ASCII value. For example, "H" becomes 01001000 and "i" becomes 01101001.

A 16-bit boundary marker (0xFF00 in front-loaded mode) is prepended and appended to frame the message. An 8-bit XOR checksum is appended after the payload for integrity verification.
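A minimal sketch of this framing in Python (helper names like `frame_message` are illustrative, not the library's actual API):

```python
# Frame a message as marker + payload bits + XOR checksum + marker
# (front-loaded mode, boundary marker 0xFF00).
BOUNDARY = format(0xFF00, "016b")  # 16-bit marker: "1111111100000000"

def to_bits(data: bytes) -> str:
    return "".join(format(b, "08b") for b in data)

def frame_message(msg: str) -> str:
    payload = msg.encode("ascii")
    checksum = 0
    for b in payload:
        checksum ^= b                  # 8-bit XOR checksum over the payload
    return BOUNDARY + to_bits(payload) + format(checksum, "08b") + BOUNDARY

bits = frame_message("Hi")
# "H" -> 01001000, "i" -> 01101001; checksum = 0x48 ^ 0x69 = 0x21
```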

2 Bits to Timing Delays

Each bit maps to a time delay between consecutive data chunks:

1 1 0 1 0 0 1 1
short = 1 (~0ms), long = 0 (~100ms)

A short delay (~0ms) encodes a binary 1. A longer delay (~100ms) encodes a binary 0. The threshold between them is ~50ms.
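In code, the mapping is a simple lookup in each direction, using the delay values given above (variable names are illustrative):

```python
# Bit -> delay on the sender side, delay -> bit on the receiver side.
SHORT_MS, LONG_MS = 0, 100    # short gap encodes 1, long gap encodes 0
THRESHOLD_MS = 50             # midpoint used by the receiver

def bits_to_delays(bits: str) -> list:
    return [SHORT_MS if b == "1" else LONG_MS for b in bits]

def delay_to_bit(gap_ms: float) -> str:
    return "1" if gap_ms < THRESHOLD_MS else "0"

delays = bits_to_delays("11010011")
# -> [0, 0, 100, 0, 100, 100, 0, 0]
```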

3 Chunked Transmission

The server sends a normal image file in 256-byte chunks. The image data is completely unmodified — only the timing between chunks carries the hidden message.

Between each chunk, the server waits for the appropriate delay. The receiver sees a normal image loading progressively, but the gaps between chunks encode the secret bit stream.
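A sketch of the sender loop under these assumptions (the chunking and send hooks here are hypothetical, not the server's real interface):

```python
import time

CHUNK_SIZE = 256

def chunk_file(data: bytes, size: int = CHUNK_SIZE):
    # Split the unmodified image bytes into fixed-size chunks.
    return [data[i:i + size] for i in range(0, len(data), size)]

def send_with_delays(chunks, delays_ms, send):
    # The gap before chunk i+1 carries bit i; once the bits run out,
    # the remaining chunks are sent back to back (uniform gaps).
    for i, chunk in enumerate(chunks):
        if 0 < i <= len(delays_ms):
            time.sleep(delays_ms[i - 1] / 1000)
        send(chunk)
```

Note that the image bytes pass through untouched; only the sleep between `send` calls carries information.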

4 Decoding

The receiver measures the time gap between each arriving chunk. If the gap is below the midpoint threshold (~50ms), it records a 1. If above, it records a 0.

Once the closing boundary marker (0xFF00) is detected, the accumulated bits are grouped into 8-bit bytes and converted back to ASCII characters. The XOR checksum is verified to ensure message integrity.
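The whole decode path can be sketched in a few lines (helper names are illustrative; the real decoder also handles the other marker variants):

```python
# Classify gaps, strip the 0xFF00 framing, verify the XOR checksum,
# and recover the ASCII message.
MARKER = format(0xFF00, "016b")          # 16-bit boundary marker

def decode_gaps(gaps_ms, threshold_ms=50):
    bits = "".join("1" if g < threshold_ms else "0" for g in gaps_ms)
    start = bits.find(MARKER) + len(MARKER)
    end = bits.find(MARKER, start)       # closing boundary marker
    body = bits[start:end]               # payload bits + trailing 8-bit checksum
    data = bytes(int(body[i:i + 8], 2) for i in range(0, len(body), 8))
    payload, checksum = data[:-1], data[-1]
    expected = 0
    for b in payload:
        expected ^= b                    # XOR checksum over the payload
    if checksum != expected:
        raise ValueError("checksum mismatch: message corrupted")
    return payload.decode("ascii")
```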

Going Further: Distributed Mode

The encoding described above is called front-loaded mode — all the message bits are packed into the first N chunk gaps. This works, but it creates an obvious pattern: a burst of short and long delays at the start, followed by uniform gaps for the rest of the image. An observer watching the traffic could notice this.

Distributed mode solves this by scattering message bits across the entire image transmission. A short preamble (32 bits) at the start contains a boundary marker, a random key, and the message length. The key seeds a PRNG that determines which chunk gaps carry real message bits — all other gaps use a neutral delay that blends in naturally.

The result: to an observer, the timing pattern looks much more uniform. The message bits are hidden among hundreds of neutral-delay gaps, making the encoding significantly harder to detect. The receiver uses the same key to know exactly which gaps to read.
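The key-to-slots step can be sketched like this (an assumed scheme using Python's seeded PRNG; the library's preamble layout and PRNG may differ):

```python
import random

def pick_slots(key: int, n_bits: int, n_gaps: int) -> list:
    # The same key yields the same slot list on both ends, so sender and
    # receiver agree on which gaps carry real message bits.
    rng = random.Random(key)
    return sorted(rng.sample(range(n_gaps), n_bits))

slots = pick_slots(key=0x5A, n_bits=8, n_gaps=400)
# Sender places bit i in gap slots[i]; every other gap gets a neutral delay.
```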

The two modes use slightly different boundary markers (0xFF00 for front-loaded, 0xFF01 for distributed), so the decoder automatically detects which mode was used — no configuration needed.

Error Correction: Hamming FEC

Network jitter can flip a bit — a delay that should have been short arrives just above the threshold, or vice versa. A single flipped bit corrupts an entire character. To combat this, TemporalCloak supports Hamming(12,8) forward error correction.

Each 8-bit byte is encoded into a 12-bit block by adding 4 parity bits. These parity bits allow the decoder to detect and correct any single-bit error per byte — automatically, with no retransmission needed. The cost is 50% more bits on the wire, but for a timing channel where each bit takes ~100ms, reliability matters far more than throughput.
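A self-contained Hamming(12,8) sketch, assuming the textbook layout with parity bits at positions 1, 2, 4, and 8 (the library's actual bit ordering may differ):

```python
def hamming12_encode(byte: int) -> int:
    bits = [0] * 13                              # 1-indexed codeword positions
    data = [(byte >> (7 - i)) & 1 for i in range(8)]
    j = 0
    for pos in range(1, 13):
        if pos & (pos - 1):                      # non-power-of-two -> data bit
            bits[pos] = data[j]; j += 1
    for p in (1, 2, 4, 8):                       # parity over covered positions
        parity = 0
        for pos in range(1, 13):
            if (pos & p) and pos != p:
                parity ^= bits[pos]
        bits[p] = parity
    return sum(bits[pos] << (12 - pos) for pos in range(1, 13))

def hamming12_decode(code: int) -> int:
    bits = [0] + [(code >> (12 - pos)) & 1 for pos in range(1, 13)]
    syndrome = 0
    for pos in range(1, 13):
        if bits[pos]:
            syndrome ^= pos                      # nonzero -> error position
    if 1 <= syndrome <= 12:
        bits[syndrome] ^= 1                      # correct the single-bit error
    byte = 0
    for pos in range(1, 13):
        if pos & (pos - 1):                      # collect the 8 data bits
            byte = (byte << 1) | bits[pos]
    return byte
```

Flipping any single bit of a codeword produces a nonzero syndrome equal to the flipped position, which is why one error per byte is always recoverable.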

The boundary marker signals whether FEC is active: 0xFF02 for front-loaded + Hamming, 0xFF03 for distributed + Hamming. The decoder detects this automatically and applies error correction before assembling the message.

Not Just for the Web

This web demo is just one way to use time-based steganography. The technique works over any protocol that delivers data in discrete chunks with controllable timing.

The core temporal_cloak library is protocol-agnostic — it only cares about delay sequences, not how the data travels. A standalone CLI decoder is included for use outside the browser.

5 This Is a Demo

This web app is a demonstration of the concept — not how you'd actually send a covert message. In a real scenario, the sender and receiver would be on separate machines, communicating over a network. The receiver would measure timing delays directly from the raw connection (e.g. TCP sockets or a custom client), where precise inter-packet timing is preserved.

Here, the server plays both roles — sender and receiver — because browsers can't measure chunk-level timing accurately enough. The browser's fetch() API doesn't expose reliable inter-chunk timing, so the server decodes on your behalf and streams the results back via WebSocket. It's a useful way to visualize how time-based steganography works, but the real magic happens at the network layer.

Deep dive: how the web decoder works

1. The browser opens a WebSocket to the server when you click "Decode".

2. The server fetches the image from its own endpoint, measuring the real timing delays between chunks as they arrive.

3. The server runs the decoding logic — classifying each delay as a 1 or 0, finding boundaries, assembling characters.

4. Decode results stream back to the browser over the WebSocket in real time: each bit, the partial message, confidence scores.

5. The browser is purely a visualization layer — it renders the bits and characters as they arrive, but does no decoding itself.

Try It Yourself