Chat interfaces are deceptively complex. On the surface, it is just a list of messages with an input box at the bottom. But once you start building one for production use, you discover a long list of challenges: variable-height messages, smooth scrolling to the bottom on new messages, loading older messages when scrolling up, real-time delivery, typing indicators, read receipts, and offline support. In this post, I want to walk through how I built the chat interface for our Ionic Angular mobile app at Tawk.to.
The Message List with Virtual Scrolling
The first challenge is rendering the message list efficiently. A busy conversation can have thousands of messages, and rendering all of them as DOM elements would destroy performance on mobile devices. Virtual scrolling is the answer, but chat presents a unique twist: unlike a typical list that scrolls from top to bottom, chat messages are anchored to the bottom. New messages appear at the bottom, and you scroll up to see older ones.
```typescript
// chat.page.ts
import { Component, OnDestroy, OnInit, ViewChild } from '@angular/core';
import { IonContent } from '@ionic/angular';
import { Subject } from 'rxjs';

@Component({
  selector: 'app-chat',
  template: `
    <ion-content #content>
      <cdk-virtual-scroll-viewport
        [itemSize]="estimatedItemSize"
        [minBufferPx]="600"
        [maxBufferPx]="900"
        class="message-viewport">
        <div *cdkVirtualFor="let msg of messages; trackBy: trackByMessageId"
             class="message-wrapper"
             [class.own-message]="msg.senderId === currentUserId">
          <app-message-bubble
            [message]="msg"
            [showAvatar]="shouldShowAvatar(msg)">
          </app-message-bubble>
        </div>
      </cdk-virtual-scroll-viewport>
    </ion-content>
    <ion-footer>
      <app-message-input
        (messageSent)="onSendMessage($event)"
        (typing)="onTyping()">
      </app-message-input>
    </ion-footer>
  `
})
export class ChatPage implements OnInit, OnDestroy {
  @ViewChild('content') content!: IonContent;
  messages: ChatMessage[] = [];
  currentUserId = '';
  estimatedItemSize = 72; // average bubble height in px; a hint for the viewport, not a hard constraint
  private destroy$ = new Subject<void>();

  ngOnInit(): void { /* subscriptions are set up here; see the sections below */ }

  ngOnDestroy(): void {
    this.destroy$.next();
    this.destroy$.complete();
  }

  trackByMessageId(index: number, msg: ChatMessage): string {
    return msg.id;
  }

  shouldShowAvatar(msg: ChatMessage): boolean { return true; /* grouping logic elided */ }
  onSendMessage(text: string): void { /* delegates to the WebSocket service below */ }
  onTyping(): void { /* covered in the typing indicators section */ }
}
```
The trackBy function is essential. Because we replace the array immutably on every update, Angular would otherwise destroy and recreate the DOM for every visible message each time the list changes. With trackBy, it only creates or destroys DOM elements for messages that were actually added or removed. For a chat app receiving frequent updates, this is the difference between smooth operation and visible jank.
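The template also calls shouldShowAvatar, whose logic I have not shown. A common convention is to display the avatar only on the first message of a consecutive run from the same sender. Here is a minimal sketch; the standalone list-and-index signature (rather than the component method above) is my choice for testability, and the trimmed ChatMessage shape is an assumption:

```typescript
// Trimmed message shape, assumed for this sketch only.
interface ChatMessage {
  id: string;
  senderId: string;
}

// Show the avatar only when a message starts a new run from a sender,
// i.e. it is the first message or the previous one came from someone else.
function shouldShowAvatar(messages: ChatMessage[], index: number): boolean {
  if (index === 0) return true;
  return messages[index].senderId !== messages[index - 1].senderId;
}
```

A production version would usually also break runs on time gaps (say, more than a few minutes between messages), but the sender comparison is the core of it.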
WebSocket Integration
For real-time message delivery, we use WebSockets through a service that manages the connection lifecycle. The key design decisions here are reconnection logic and message buffering:
```typescript
// websocket.service.ts
import { Injectable } from '@angular/core';
import { Observable, Subject } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class WebSocketService {
  private socket?: WebSocket;
  private messages$ = new Subject<WsMessage>();
  private reconnectAttempts = 0;
  private maxReconnectAttempts = 10;
  private messageBuffer: OutgoingMessage[] = [];

  connect(token: string): Observable<WsMessage> {
    this.socket = new WebSocket(`wss://api.example.com/ws?token=${token}`);

    this.socket.onopen = () => {
      this.reconnectAttempts = 0;
      this.flushBuffer(); // deliver anything queued while disconnected
    };

    this.socket.onmessage = (event) => {
      const data = JSON.parse(event.data) as WsMessage;
      this.messages$.next(data);
    };

    this.socket.onclose = (event) => {
      if (!event.wasClean && this.reconnectAttempts < this.maxReconnectAttempts) {
        // Exponential backoff: 1 s, 2 s, 4 s, ... capped at 30 s.
        const delay = Math.min(1000 * Math.pow(2, this.reconnectAttempts), 30000);
        setTimeout(() => {
          this.reconnectAttempts++;
          this.connect(token);
        }, delay);
      }
    };

    return this.messages$.asObservable();
  }

  send(message: OutgoingMessage): void {
    if (this.socket?.readyState === WebSocket.OPEN) {
      this.socket.send(JSON.stringify(message));
    } else {
      this.messageBuffer.push(message); // queue until the socket reopens
    }
  }

  private flushBuffer(): void {
    while (this.messageBuffer.length > 0) {
      const msg = this.messageBuffer.shift()!;
      this.socket!.send(JSON.stringify(msg));
    }
  }
}
```
The exponential backoff on reconnection prevents hammering the server when there are network issues. The message buffer ensures that messages sent during a brief disconnection are not lost; they are queued and sent as soon as the connection is re-established.
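Factoring the delay calculation out of the onclose handler makes the schedule easy to verify in isolation. This is a standalone restatement of the same formula, not a change to the service:

```typescript
// Reconnect delay for a given attempt number: exponential backoff,
// starting at 1 s and capped at 30 s. Same formula as the onclose handler.
function reconnectDelayMs(attempt: number): number {
  return Math.min(1000 * Math.pow(2, attempt), 30000);
}
```

Attempts 0 through 4 yield 1000, 2000, 4000, 8000, and 16000 ms; everything from attempt 5 onward hits the 30-second cap. A common refinement the service above does not include is adding random jitter to each delay, which prevents many clients from reconnecting in lockstep after a server restart.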
Scroll Behavior
Getting scroll behavior right in a chat interface is surprisingly tricky. You want to auto-scroll to the bottom when a new message arrives, but only if the user is already at the bottom. If they have scrolled up to read older messages, auto-scrolling would be incredibly annoying:
```typescript
private isNearBottom = true;

onScroll(event: Event): void {
  const el = event.target as HTMLElement;
  const threshold = 150; // px from the bottom still counted as "at the bottom"
  const position = el.scrollTop + el.clientHeight;
  this.isNearBottom = (el.scrollHeight - position) < threshold;
}

onNewMessage(message: ChatMessage): void {
  this.messages = [...this.messages, message];
  if (this.isNearBottom || message.senderId === this.currentUserId) {
    // setTimeout defers the scroll until Angular has rendered the new message
    setTimeout(() => {
      this.content.scrollToBottom(300); // 300 ms smooth scroll
    });
  }
}
```
Notice the special case: if the current user sent the message, we always scroll to the bottom regardless of their scroll position. Users expect to see the message they just sent.
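Both checks are pure logic, so they can be pulled out of the component and unit tested without a DOM. A sketch under that assumption (the function names are mine, not from the component above):

```typescript
// True when the viewport is within `threshold` px of the bottom.
function isNearBottom(
  scrollTop: number,
  clientHeight: number,
  scrollHeight: number,
  threshold = 150
): boolean {
  return scrollHeight - (scrollTop + clientHeight) < threshold;
}

// Auto-scroll if the viewer is already near the bottom,
// or unconditionally for the viewer's own messages.
function shouldAutoScroll(
  nearBottom: boolean,
  senderId: string,
  currentUserId: string
): boolean {
  return nearBottom || senderId === currentUserId;
}
```

Keeping these as pure functions also makes the threshold easy to tune per device without touching the scroll handler.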
Offline Support
For a mobile chat app, offline support is not a nice-to-have; it is a requirement. Users expect to see their conversations even when they are in a subway tunnel or on a plane. We use a combination of Capacitor's Preferences plugin and a sync queue:
```typescript
// offline.service.ts
import { Injectable } from '@angular/core';
import { Preferences } from '@capacitor/preferences';

@Injectable({ providedIn: 'root' })
export class OfflineService {
  private pendingMessages: PendingMessage[] = [];

  async cacheConversation(convId: string, messages: ChatMessage[]): Promise<void> {
    await Preferences.set({
      key: `conversation_${convId}`,
      value: JSON.stringify(messages)
    });
  }

  async getCachedConversation(convId: string): Promise<ChatMessage[]> {
    const result = await Preferences.get({ key: `conversation_${convId}` });
    return result.value ? JSON.parse(result.value) : [];
  }

  queueMessage(message: OutgoingMessage): void {
    const pending: PendingMessage = {
      ...message,
      tempId: `temp_${Date.now()}`,
      status: 'pending'
    };
    this.pendingMessages.push(pending);
    this.savePendingQueue();
  }

  async syncPendingMessages(wsService: WebSocketService): Promise<void> {
    for (const msg of this.pendingMessages) {
      try {
        wsService.send(msg);
        msg.status = 'sent';
      } catch {
        msg.status = 'failed';
      }
    }
    // Keep only the failures so they can be retried or surfaced to the user.
    this.pendingMessages = this.pendingMessages.filter(m => m.status === 'failed');
    await this.savePendingQueue();
  }

  private savePendingQueue(): Promise<void> {
    return Preferences.set({
      key: 'pending_messages',
      value: JSON.stringify(this.pendingMessages)
    });
  }
}
```
When the user sends a message while offline, we immediately add it to the UI with a "pending" status indicator (a small clock icon instead of a checkmark). When the connection is restored, the sync service processes the queue and updates the status. This gives the user instant feedback and a clear indication of what has and has not been delivered.
Typing Indicators
Typing indicators are a small touch that makes chat feel alive. The implementation is straightforward but needs throttling to avoid flooding the WebSocket connection. We send a "typing" event when the user starts typing and throttle it so we send at most one event every two seconds:
```typescript
private typingSubject = new Subject<void>();

ngOnInit(): void {
  this.typingSubject.pipe(
    throttleTime(2000),       // emit the first event, then ignore the rest for 2 s
    takeUntil(this.destroy$)  // tear down with the component
  ).subscribe(() => {
    this.wsService.send({
      type: 'typing',
      conversationId: this.conversationId
    });
  });
}

onTyping(): void {
  this.typingSubject.next();
}
```
Building a chat interface taught me more about performance, real-time systems, and mobile UX than any other project I have worked on. Every detail matters when users are staring at the screen waiting for messages to appear. If you are building something similar, I hope these patterns save you some of the trial and error I went through.