WebRTC
WebRTC (Web Real-Time Communication) enables peer-to-peer audio, video, and data sharing directly between browsers without requiring plugins or intermediary servers for media transmission.
Architecture Overview
WebRTC Architecture
├── Application Layer
│   ├── getUserMedia() - Access camera/microphone
│   ├── RTCPeerConnection - P2P connection management
│   └── RTCDataChannel - Arbitrary data transfer
├── Session Layer
│   ├── Signaling (via application server)
│   ├── SDP (Session Description Protocol)
│   └── ICE (Interactive Connectivity Establishment)
├── Transport Layer
│   ├── SRTP (Secure Real-time Transport Protocol)
│   ├── SCTP (Stream Control Transmission Protocol)
│   └── DTLS (Datagram Transport Layer Security)
└── Network Layer
    ├── STUN (Session Traversal Utilities for NAT)
    └── TURN (Traversal Using Relays around NAT)
Connection Establishment Flow
WebRTC Connection Flow (Offer/Answer Model)

Peer A                    Signaling Server                    Peer B
  │                             │                                │
  │── 1. Create Offer ─────────►│                                │
  │   (SDP + ICE candidates)    │                                │
  │                             │── 2. Forward Offer ───────────►│
  │                             │                                │ 3. Create Answer
  │                             │                                │    (SDP + ICE candidates)
  │◄── 4. Forward Answer ───────│◄───────────────────────────────│
  │                             │                                │
  │◄═══════════ 5. ICE Connectivity Checks (P2P) ═══════════════►│
  │◄═══════════ 6. DTLS Handshake (P2P) ════════════════════════►│
  │◄═══════════ 7. Media/Data Exchange (P2P) ═══════════════════►│
NAT Traversal
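A quick way to validate a TURN deployment is to force relay-only candidates, so a test call can only succeed through the relay. A minimal sketch (the turn.example.com URL and credentials are placeholders):

```typescript
// With iceTransportPolicy 'relay', ICE discards host and srflx candidates,
// so the connection only succeeds if the TURN server actually relays traffic.
const relayOnlyConfig: RTCConfiguration = {
  iceServers: [
    {
      urls: 'turn:turn.example.com:3478', // placeholder TURN server
      username: 'user',
      credential: 'password',
    },
  ],
  iceTransportPolicy: 'relay', // relay candidates only
};

// const pc = new RTCPeerConnection(relayOnlyConfig);
```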
ICE (Interactive Connectivity Establishment)
ICE Candidate Types (by priority)
├── Host Candidate
│   └── Local IP address (works on same network)
├── Server Reflexive (srflx)
│   └── Public IP from STUN server (works through most NATs)
├── Peer Reflexive (prflx)
│   └── Discovered during connectivity checks
└── Relay Candidate
    └── TURN server relay (always works, highest latency)
STUN vs TURN
| Feature | STUN | TURN |
|---|---|---|
| Purpose | Discover public IP | Relay media traffic |
| Bandwidth | Minimal | High (relays all data) |
| Success Rate | ~80% | ~100% |
| Latency | Low | Higher |
| Cost | Low | High |
| When Used | NAT traversal possible | Symmetric NAT or firewall |
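To see which candidate types your deployment actually produces, tally candidates as ICE gathers them: only `host` candidates suggests your STUN server is unreachable, while relay-only connectivity means you are paying TURN bandwidth on every call. A small sketch:

```typescript
// Count gathered ICE candidates by type ('host' | 'srflx' | 'prflx' | 'relay').
type CandidateSummary = Record<string, number>;

function tallyCandidateTypes(candidates: Array<{ type?: string | null }>): CandidateSummary {
  const tally: CandidateSummary = {};
  for (const candidate of candidates) {
    const kind = candidate.type ?? 'unknown';
    tally[kind] = (tally[kind] ?? 0) + 1;
  }
  return tally;
}

// Usage with a live connection:
// const gathered: RTCIceCandidate[] = [];
// pc.onicecandidate = (event) => {
//   if (event.candidate) gathered.push(event.candidate);
//   else console.log(tallyCandidateTypes(gathered)); // null candidate = gathering done
// };
```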
ICE Configuration
const configuration: RTCConfiguration = {
iceServers: [
// Public STUN servers
{ urls: 'stun:stun.l.google.com:19302' },
{ urls: 'stun:stun1.l.google.com:19302' },
// TURN server (required for reliable connectivity)
{
urls: [
'turn:turn.example.com:3478',
'turn:turn.example.com:3478?transport=tcp',
'turns:turn.example.com:5349',
],
username: 'user',
credential: 'password',
},
],
iceCandidatePoolSize: 10,
iceTransportPolicy: 'all', // 'all' or 'relay'
};
RTCPeerConnection API
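The snippets in this section assume a `signalingChannel` object. WebRTC deliberately leaves signaling unspecified, so any transport that relays messages between the two peers works; here is one minimal WebSocket-backed sketch (the server URL and message envelope are assumptions, not part of WebRTC):

```typescript
// Minimal signaling transport over WebSocket. The wss:// URL and the
// { type, ... } message shape are illustrative — use whatever your
// signaling server expects.
type SignalingMessage =
  | { type: 'offer' | 'answer'; sdp?: string }
  | { type: 'ice-candidate'; candidate: RTCIceCandidateInit };

class SignalingChannel {
  private ws: WebSocket;
  onmessage: (msg: SignalingMessage) => void = () => {};

  constructor(url: string) {
    this.ws = new WebSocket(url);
    this.ws.onmessage = (event) => this.onmessage(JSON.parse(event.data));
  }

  send(msg: SignalingMessage): void {
    this.ws.send(JSON.stringify(msg));
  }
}

// const signalingChannel = new SignalingChannel('wss://example.com/signaling');
```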
Creating a Connection
// Create peer connection
const pc = new RTCPeerConnection(configuration);
// Handle ICE candidates
pc.onicecandidate = (event) => {
if (event.candidate) {
// Send candidate to remote peer via signaling
signalingChannel.send({
type: 'ice-candidate',
candidate: event.candidate.toJSON(),
});
}
};
// Handle connection state changes
pc.onconnectionstatechange = () => {
console.log('Connection state:', pc.connectionState);
// 'new' | 'connecting' | 'connected' | 'disconnected' | 'failed' | 'closed'
};
pc.oniceconnectionstatechange = () => {
console.log('ICE state:', pc.iceConnectionState);
// 'new' | 'checking' | 'connected' | 'completed' | 'disconnected' | 'failed' | 'closed'
};
// Handle incoming tracks
pc.ontrack = (event) => {
const remoteVideo = document.getElementById('remoteVideo') as HTMLVideoElement;
remoteVideo.srcObject = event.streams[0];
};
Creating and Handling Offers
// Caller: Create offer
async function createOffer() {
// Note: offerToReceiveAudio/Video are legacy options; new code can use
// pc.addTransceiver('audio') / pc.addTransceiver('video') instead
const offer = await pc.createOffer({
offerToReceiveAudio: true,
offerToReceiveVideo: true,
});
await pc.setLocalDescription(offer);
// Send offer to remote peer via signaling
signalingChannel.send({
type: 'offer',
sdp: offer.sdp,
});
}
// Callee: Handle offer and create answer
async function handleOffer(offer: RTCSessionDescriptionInit) {
await pc.setRemoteDescription(new RTCSessionDescription(offer));
const answer = await pc.createAnswer();
await pc.setLocalDescription(answer);
// Send answer back via signaling
signalingChannel.send({
type: 'answer',
sdp: answer.sdp,
});
}
// Caller: Handle answer
async function handleAnswer(answer: RTCSessionDescriptionInit) {
await pc.setRemoteDescription(new RTCSessionDescription(answer));
}
// Both: Handle ICE candidates from remote peer
async function handleIceCandidate(candidate: RTCIceCandidateInit) {
await pc.addIceCandidate(new RTCIceCandidate(candidate));
}
Media Capture
getUserMedia API
// Request camera and microphone access
async function getLocalMedia() {
const constraints: MediaStreamConstraints = {
audio: {
echoCancellation: true,
noiseSuppression: true,
autoGainControl: true,
},
video: {
width: { ideal: 1280, max: 1920 },
height: { ideal: 720, max: 1080 },
frameRate: { ideal: 30, max: 60 },
facingMode: 'user', // 'user' (front) or 'environment' (back)
},
};
try {
const stream = await navigator.mediaDevices.getUserMedia(constraints);
// Display local video
const localVideo = document.getElementById('localVideo') as HTMLVideoElement;
localVideo.srcObject = stream;
// Add tracks to peer connection
stream.getTracks().forEach((track) => {
pc.addTrack(track, stream);
});
return stream;
} catch (error) {
if (error instanceof DOMException) {
switch (error.name) {
case 'NotAllowedError':
console.error('Permission denied');
break;
case 'NotFoundError':
console.error('No camera/microphone found');
break;
case 'NotReadableError':
console.error('Device in use by another application');
break;
}
}
throw error;
}
}
Screen Sharing
// Request screen sharing
async function startScreenShare() {
const stream = await navigator.mediaDevices.getDisplayMedia({
video: {
displaySurface: 'monitor', // Hint: 'monitor' | 'window' | 'browser'
// cursor: 'always', // Deprecated constraint, removed from the spec
},
audio: true, // System audio (browser support varies)
});
// Replace video track
const videoTrack = stream.getVideoTracks()[0];
const sender = pc.getSenders().find((s) => s.track?.kind === 'video');
if (sender) {
await sender.replaceTrack(videoTrack);
}
// Handle user stopping share
videoTrack.onended = () => {
// Switch back to camera
switchToCamera();
};
return stream;
}
Enumerating Devices
// List available media devices
async function getMediaDevices() {
const devices = await navigator.mediaDevices.enumerateDevices();
const audioInputs = devices.filter((d) => d.kind === 'audioinput');
const videoInputs = devices.filter((d) => d.kind === 'videoinput');
const audioOutputs = devices.filter((d) => d.kind === 'audiooutput');
return { audioInputs, videoInputs, audioOutputs };
}
// Switch to specific camera
async function switchCamera(deviceId: string) {
const stream = await navigator.mediaDevices.getUserMedia({
video: { deviceId: { exact: deviceId } },
});
const videoTrack = stream.getVideoTracks()[0];
const sender = pc.getSenders().find((s) => s.track?.kind === 'video');
if (sender) {
await sender.replaceTrack(videoTrack);
}
}
RTCDataChannel
Creating Data Channels
// Create data channel (must be done before offer/answer)
// Note: maxRetransmits and maxPacketLifeTime are mutually exclusive, and
// setting either makes delivery unreliable — omit both for a fully reliable channel
const dataChannel = pc.createDataChannel('chat', {
ordered: true, // Guarantee order (default: true)
maxRetransmits: 3, // Max retransmission attempts (partial reliability)
// maxPacketLifeTime: 3000, // Alternative: max time in ms
protocol: '', // Sub-protocol name
negotiated: false, // false = announced in-band; true = app negotiates out-of-band with a fixed id
id: undefined, // Channel ID (only meaningful with negotiated: true)
});
dataChannel.onopen = () => {
console.log('Data channel open');
dataChannel.send('Hello!');
};
dataChannel.onmessage = (event) => {
console.log('Received:', event.data);
};
dataChannel.onclose = () => {
console.log('Data channel closed');
};
dataChannel.onerror = (error) => {
console.error('Data channel error:', error);
};
// Handle incoming data channels
pc.ondatachannel = (event) => {
const channel = event.channel;
channel.onmessage = (e) => {
console.log('Received on', channel.label, ':', e.data);
};
};
Sending Different Data Types
// Send text
dataChannel.send('Hello, World!');
// Send JSON
dataChannel.send(JSON.stringify({ type: 'message', content: 'Hello' }));
// Send binary (ArrayBuffer)
const buffer = new ArrayBuffer(8);
const view = new DataView(buffer);
view.setFloat64(0, 3.14159);
dataChannel.send(buffer);
// Send Blob (browser support varies; convert to ArrayBuffer when in doubt)
const blob = new Blob(['Hello'], { type: 'text/plain' });
dataChannel.send(blob);
// Handle binary data
dataChannel.binaryType = 'arraybuffer'; // or 'blob'
dataChannel.onmessage = (event) => {
if (event.data instanceof ArrayBuffer) {
// Handle binary
const view = new DataView(event.data);
console.log('Received float:', view.getFloat64(0));
} else {
// Handle text
console.log('Received text:', event.data);
}
};
File Transfer
// Chunked file transfer
async function sendFile(file: File) {
const CHUNK_SIZE = 16384; // 16KB chunks
const fileReader = new FileReader();
let offset = 0;
const readSlice = () => {
const slice = file.slice(offset, offset + CHUNK_SIZE);
fileReader.readAsArrayBuffer(slice);
};
fileReader.onload = (e) => {
const chunk = e.target?.result as ArrayBuffer;
dataChannel.send(chunk);
offset += chunk.byteLength;
if (offset < file.size) {
// Check buffer before sending more
if (dataChannel.bufferedAmount < 65535) {
readSlice();
} else {
// Wait for buffer to drain
dataChannel.onbufferedamountlow = () => {
dataChannel.onbufferedamountlow = null;
readSlice();
};
dataChannel.bufferedAmountLowThreshold = 65535;
}
} else {
console.log('File transfer complete');
}
};
// Send file metadata first
dataChannel.send(JSON.stringify({
type: 'file-start',
name: file.name,
size: file.size,
mimeType: file.type,
}));
readSlice();
}
SDP (Session Description Protocol)
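When studying SDP, it helps to pull the m= (media) sections out of a live description and compare them against the annotated structure. A small helper (sketch):

```typescript
// Extract the m= (media) lines from an SDP blob — one per media section.
// Feed it pc.localDescription?.sdp from a live connection.
function listMediaSections(sdp: string): string[] {
  return sdp.split('\r\n').filter((line) => line.startsWith('m='));
}
```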
SDP Structure
SDP Offer/Answer Structure
v=0 # Version
o=- 123456 2 IN IP4 127.0.0.1 # Origin
s=- # Session name
t=0 0 # Timing
a=group:BUNDLE 0 1 # Bundle media
a=msid-semantic: WMS stream # Media stream ID
m=audio 9 UDP/TLS/RTP/SAVPF 111 # Audio media line
c=IN IP4 0.0.0.0 # Connection info
a=rtcp:9 IN IP4 0.0.0.0 # RTCP info
a=ice-ufrag:xxxx # ICE username fragment
a=ice-pwd:yyyy # ICE password
a=fingerprint:sha-256 AA:BB:CC... # DTLS fingerprint
a=setup:actpass # DTLS role
a=mid:0 # Media ID
a=sendrecv # Direction
a=rtpmap:111 opus/48000/2 # Codec mapping
a=fmtp:111 minptime=10;useinbandfec=1 # Codec parameters
m=video 9 UDP/TLS/RTP/SAVPF 96 # Video media line
a=rtpmap:96 VP8/90000 # VP8 codec
a=rtcp-fb:96 nack # NACK feedback
a=rtcp-fb:96 nack pli # Picture loss indication
a=rtcp-fb:96 ccm fir # Full intra request
Modifying SDP
// Prefer a specific codec by reordering payload types in the SDP
// (modern alternative: RTCRtpTransceiver.setCodecPreferences)
function preferCodec(sdp: string, codec: string): string {
const lines = sdp.split('\r\n');
const mLineIndex = lines.findIndex((line) => line.startsWith('m=video'));
if (mLineIndex === -1) return sdp;
// Find payload type for desired codec
const rtpmapLine = lines.find((line) =>
line.includes('rtpmap') && line.toLowerCase().includes(codec.toLowerCase())
);
if (!rtpmapLine) return sdp;
const payloadType = rtpmapLine.split(':')[1].split(' ')[0];
// Reorder payload types in m= line
const mLine = lines[mLineIndex];
const parts = mLine.split(' ');
const payloads = parts.slice(3);
const reordered = [payloadType, ...payloads.filter((p) => p !== payloadType)];
lines[mLineIndex] = [...parts.slice(0, 3), ...reordered].join(' ');
return lines.join('\r\n');
}
// Set maximum bitrate (b=AS is in kbps; per RFC 4566, b= lines must
// follow the c= line, before the a= attributes)
function setMaxBitrate(sdp: string, maxBitrate: number): string {
return sdp.replace(
/(c=IN .*\r\n)/g,
`$1b=AS:${maxBitrate}\r\n`
);
}
Statistics and Monitoring
Getting Connection Stats
// Get comprehensive stats
async function getConnectionStats() {
const stats = await pc.getStats();
stats.forEach((report) => {
switch (report.type) {
case 'inbound-rtp':
if (report.kind === 'video') {
console.log('Inbound video:', {
packetsReceived: report.packetsReceived,
packetsLost: report.packetsLost,
bytesReceived: report.bytesReceived,
framesDecoded: report.framesDecoded,
framesDropped: report.framesDropped,
jitter: report.jitter,
});
}
break;
case 'outbound-rtp':
if (report.kind === 'video') {
console.log('Outbound video:', {
packetsSent: report.packetsSent,
bytesSent: report.bytesSent,
framesEncoded: report.framesEncoded,
qualityLimitationReason: report.qualityLimitationReason,
});
}
break;
case 'candidate-pair':
if (report.state === 'succeeded') {
console.log('Connection:', {
localCandidateId: report.localCandidateId,
remoteCandidateId: report.remoteCandidateId,
currentRoundTripTime: report.currentRoundTripTime,
availableOutgoingBitrate: report.availableOutgoingBitrate,
});
}
break;
}
});
}
// Monitor stats periodically
const statsInterval = setInterval(async () => {
if (pc.connectionState === 'connected') {
await getConnectionStats();
}
}, 1000);
Common Patterns
Renegotiation
// Handle renegotiation needed
pc.onnegotiationneeded = async () => {
try {
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
signalingChannel.send({
type: 'offer',
sdp: pc.localDescription?.sdp,
});
} catch (error) {
console.error('Renegotiation failed:', error);
}
};
// Add new track (triggers renegotiation)
function addTrack(track: MediaStreamTrack, stream: MediaStream) {
pc.addTrack(track, stream);
// onnegotiationneeded will fire automatically
}
Perfect Negotiation
// Perfect negotiation pattern (handles glare)
let makingOffer = false;
let ignoreOffer = false;
const polite = true; // Set based on role
pc.onnegotiationneeded = async () => {
try {
makingOffer = true;
await pc.setLocalDescription();
signalingChannel.send({ type: 'offer', sdp: pc.localDescription?.sdp });
} finally {
makingOffer = false;
}
};
signalingChannel.onmessage = async ({ type, sdp, candidate }) => {
try {
if (type === 'offer') {
const offerCollision = makingOffer || pc.signalingState !== 'stable';
ignoreOffer = !polite && offerCollision;
if (ignoreOffer) return;
await pc.setRemoteDescription({ type, sdp });
await pc.setLocalDescription();
signalingChannel.send({ type: 'answer', sdp: pc.localDescription?.sdp });
} else if (type === 'answer') {
await pc.setRemoteDescription({ type, sdp });
} else if (candidate) {
try {
await pc.addIceCandidate(candidate);
} catch (err) {
// Candidate errors are expected (and safe to ignore) for ignored offers
if (!ignoreOffer) throw err;
}
}
} catch (error) {
console.error('Signaling error:', error);
}
};
Handling Disconnection
pc.oniceconnectionstatechange = () => {
switch (pc.iceConnectionState) {
case 'disconnected':
// Temporary disconnection, may recover
console.log('Connection disrupted, attempting recovery...');
break;
case 'failed':
// ICE restart
pc.restartIce();
break;
case 'closed':
// Clean up
cleanup();
break;
}
};
pc.onconnectionstatechange = () => {
if (pc.connectionState === 'failed') {
// Full reconnection needed
reconnect();
}
};
Comparison with Other Technologies
| Feature | WebRTC | WebSocket | HTTP/SSE |
|---|---|---|---|
| Protocol | UDP (primarily) | TCP | TCP |
| Latency | Very low | Low | Medium |
| P2P | Yes | No (server required) | No |
| Audio/Video | Native support | Manual encoding | Not designed for media |
| NAT Traversal | Built-in (ICE) | Server handles | Server handles |
| Reliability | Configurable | Guaranteed | Guaranteed |
| Use Case | Real-time media | Bidirectional messaging | Server push |
Best Practices
WebRTC Best Practices
- Always include TURN servers for reliable connectivity
- Handle all connection state changes gracefully
- Implement ICE restart for failed connections
- Use perfect negotiation pattern to avoid glare
- Monitor connection quality with getStats()
- Implement adaptive bitrate based on network conditions
- Handle device changes (camera/microphone switching)
- Clean up resources on disconnection
- Test with various network conditions (packet loss, latency)
- Consider fallback to server relay for constrained networks
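The adaptive-bitrate item above can be implemented without SDP munging by adjusting sender parameters at runtime. A sketch (the loss thresholds and bitrate tiers are illustrative, not a tuned policy):

```typescript
// Map a measured packet-loss fraction to a target bitrate (illustrative tiers).
function chooseBitrate(lossFraction: number): number {
  if (lossFraction > 0.1) return 300_000;    // heavy loss: drop hard
  if (lossFraction > 0.02) return 1_000_000; // mild loss: cap at 1 Mbps
  return 2_500_000;                          // clean network: 2.5 Mbps
}

// Apply the cap via RTCRtpSender.setParameters (no renegotiation needed).
async function capVideoBitrate(pc: RTCPeerConnection, maxBitrate: number): Promise<void> {
  const sender = pc.getSenders().find((s) => s.track?.kind === 'video');
  if (!sender) return;
  const params = sender.getParameters();
  if (!params.encodings?.length) params.encodings = [{}];
  params.encodings[0].maxBitrate = maxBitrate;
  await sender.setParameters(params);
}
```

Drive `chooseBitrate` from the `packetsLost` / `packetsReceived` counters in the `getStats()` loop shown earlier.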
Browser Support
| Feature | Chrome | Firefox | Safari | Edge |
|---|---|---|---|---|
| RTCPeerConnection | Yes | Yes | Yes | Yes |
| getUserMedia | Yes | Yes | Yes | Yes |
| getDisplayMedia | Yes | Yes | Yes | Yes |
| RTCDataChannel | Yes | Yes | Yes | Yes |
| VP8/VP9 | Yes | Yes | Yes | Yes |
| H.264 | Yes | Yes | Yes | Yes |
| AV1 | Yes | Yes | No | Yes |
| Insertable Streams | Yes (createEncodedStreams) | Via RTCRtpScriptTransform | Via RTCRtpScriptTransform | Yes (createEncodedStreams) |
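Because support shifts between browser versions and platforms, probe the APIs at runtime rather than trusting a static table or user-agent sniffing. A sketch:

```typescript
// Runtime feature detection — each probe checks for the API itself.
function detectWebRtcFeatures() {
  const g = globalThis as any;
  return {
    peerConnection: typeof g.RTCPeerConnection === 'function',
    getUserMedia: typeof g.navigator?.mediaDevices?.getUserMedia === 'function',
    getDisplayMedia: typeof g.navigator?.mediaDevices?.getDisplayMedia === 'function',
    dataChannel: typeof g.RTCDataChannel === 'function',
    // Two competing encoded-stream APIs: Chrome/Edge ship createEncodedStreams,
    // Safari/Firefox ship the standardized RTCRtpScriptTransform.
    insertableStreams:
      typeof g.RTCRtpScriptTransform === 'function' ||
      typeof g.RTCRtpSender?.prototype?.createEncodedStreams === 'function',
  };
}
```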