Case Studies

Real architectural challenges I've solved: the problem, the decisions, and the technical blueprints. No proprietary code is shared; the focus is on the engineering logic.

CASE_001 · ERP / Industrial

Server-Driven UI Framework with Custom DAO

Dannie.CC · 2023–2024

Node.js · TypeScript · WebSocket · ArangoDB · DAO ORM · SDUI · Real-time

// The Problem

Manufacturing facility with 9 departments required real-time UI updates across complex production lines. Standard deployment cycles (3-4 hours) were blocking business velocity, and legacy database interactions were becoming untraceable.


// Approach & Design

  • Architected a TypeScript-based Platform delivering JSON UI descriptors over WebSocket channels.

  • Designed and built DAO (Data Access Object): a custom ArangoDB ORM for high-performance graph/document queries and audit-safe ledgering.

  • Implemented a Dynamic Frontend Runtime in React that renders complex terminal views from server-side fragment definitions.

  • Centralized all business logic in the backend, enabling daily UI iterations with zero client-side deployments.
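The descriptor-over-WebSocket idea can be sketched as follows. This is an illustrative minimal version, not the production schema: all type and function names (`UIFragment`, `ScreenDescriptor`, `buildQualityScreen`, `renderFragment`) are hypothetical, and the client interpreter returns strings where the real runtime would return React elements.

```typescript
// A server-driven UI fragment: the server pushes JSON trees like this over
// a WebSocket channel; the client runtime maps each `type` to a component.
type UIFragment =
  | { type: "label"; text: string }
  | { type: "metric"; label: string; value: number; unit?: string }
  | { type: "stack"; direction: "row" | "column"; children: UIFragment[] };

interface ScreenDescriptor {
  screenId: string;
  version: number; // bump to force clients to re-render
  root: UIFragment;
}

// Server side: build and serialize a descriptor for one terminal view.
function buildQualityScreen(lineId: string, rejectRate: number): string {
  const screen: ScreenDescriptor = {
    screenId: `qc/${lineId}`,
    version: 1,
    root: {
      type: "stack",
      direction: "column",
      children: [
        { type: "label", text: `Line ${lineId} quality control` },
        { type: "metric", label: "Reject rate", value: rejectRate, unit: "%" },
      ],
    },
  };
  return JSON.stringify(screen); // sent as a WebSocket text frame
}

// Client side: a tiny interpreter walking the fragment tree.
function renderFragment(f: UIFragment): string {
  switch (f.type) {
    case "label":
      return f.text;
    case "metric":
      return `${f.label}: ${f.value}${f.unit ?? ""}`;
    case "stack":
      return f.children
        .map(renderFragment)
        .join(f.direction === "row" ? " | " : "\n");
  }
}
```

Because the descriptor is plain data, a UI change is just a new JSON payload: no client build, no app-store cycle, which is what enabled daily UI iterations.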


// Key Architecture Decisions

WebSocket over HTTP: Eliminated 2s polling latency for quality-control decisions.
Custom ORM (DAO): Optimized complex manufacturing graph queries while ensuring strict data traceability.
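The "audit-safe ledgering" decision can be illustrated with a minimal in-memory stand-in for the ArangoDB-backed DAO. Everything here is hypothetical naming (`AuditedDao`, `AuditEntry`): the point is the pattern, where every mutation appends an immutable audit record so state changes stay traceable.

```typescript
// Immutable audit record appended on every write.
interface AuditEntry<T> {
  seq: number;  // monotonic sequence number
  at: string;   // ISO timestamp
  op: "insert" | "update";
  key: string;
  doc: T;
}

class AuditedDao<T extends { _key: string }> {
  private docs = new Map<string, T>();
  private log: AuditEntry<T>[] = [];

  save(doc: T): void {
    const op = this.docs.has(doc._key) ? "update" : "insert";
    this.docs.set(doc._key, doc);
    this.log.push({
      seq: this.log.length + 1,
      at: new Date().toISOString(),
      op,
      key: doc._key,
      doc: { ...doc }, // snapshot, not a live reference
    });
  }

  get(key: string): T | undefined {
    return this.docs.get(key);
  }

  // Full write history for one document: the basis of traceability audits.
  history(key: string): AuditEntry<T>[] {
    return this.log.filter((e) => e.key === key);
  }
}
```

An append-only log like this is what makes "100% server-side logic auditability" a checkable property rather than a policy statement.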

// Measurable Outcomes

80% · Deployment Reduction
1 week · Feature Velocity
0 ms · UI Latency (Internal)

// What We Shipped

Digitized 9 departments with an 80% reduction in frontend overhead. The system became the backbone for IATF 16949 certification by ensuring 100% server-side logic auditability.

CASE_002 · HealthTech / FHIR Compliance

Professional IoT & FHIR Data Platform

Actimi GmbH · 2022–2023

FHIR R4 · IoT · ECG/Pulse · Medplum · PostgreSQL · GDPR · GCP · CI/CD

// The Problem

Building a remote monitoring platform for German providers required strict FHIR R4 and GDPR compliance. Existing medical data storage lacked proper clinical resource indexing, and manual device lifecycle management was error-prone.


// Approach & Design

  • Integrated IoT Medical Devices (Tablets, ECG, Blood Pressure monitors) into a unified clinical pipeline.

  • Automated Device Lifecycle Management: factory wipe + UUID regeneration via MDM for 100% GDPR traceability.

  • Redesigned the database with PostgreSQL GIN Indexes for high-frequency clinical FHIR searches (Patient, Observation, Encounter).

  • Implemented vendor-agnostic FHIR Adapters supporting both Aidbox and Medplum layers.
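The vendor-agnostic adapter idea can be sketched like this. All names are hypothetical (`FhirStore`, `InMemoryFhirStore`, `makeStore`), and the vendors are stubbed with an in-memory map; the real adapters would wrap the Aidbox and Medplum APIs behind the same interface.

```typescript
// The minimal shape shared by all FHIR resources we care about here.
interface FhirResource {
  resourceType: string; // e.g. "Patient", "Observation", "Encounter"
  id: string;
}

// Services depend only on this interface, never on a vendor SDK directly.
interface FhirStore {
  read(resourceType: string, id: string): Promise<FhirResource | undefined>;
  create(resource: FhirResource): Promise<FhirResource>;
}

// One adapter per vendor; both are stubbed in-memory for this sketch.
class InMemoryFhirStore implements FhirStore {
  constructor(public readonly vendor: "aidbox" | "medplum") {}
  private store = new Map<string, FhirResource>();

  async read(resourceType: string, id: string) {
    return this.store.get(`${resourceType}/${id}`);
  }

  async create(resource: FhirResource) {
    this.store.set(`${resource.resourceType}/${resource.id}`, resource);
    return resource;
  }
}

// Vendor selection happens once, at startup, from configuration.
function makeStore(vendor: "aidbox" | "medplum"): FhirStore {
  return new InMemoryFhirStore(vendor);
}
```

Since the rest of the pipeline only sees `FhirStore`, swapping vendors is a configuration change rather than a rewrite, which is the lock-in avoidance described above.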


// Key Architecture Decisions

Dual-Vendor Adapter Layer: Avoided lock-in, allowing seamless switching between Aidbox and Medplum.
Hardware-Level Data Wipe: Ensured verifiable GDPR erasure that satisfied German DPA requirements.

// Measurable Outcomes

100% · GDPR Compliance
30% · Search Performance Gain
300+ · Linked IoT Devices

// What We Shipped

Reduced clinical query latency from 800ms+ to 560ms while successfully passing three GDPR audits and scaling across Germany.

CASE_003 · Fintech / High-Performance Crypto

High-Frequency Crypto Trading & Copy-Trade Engine

DigiAlpha · 2021–2022

Microservices · Matching Engine · RabbitMQ · HD Wallet · Ledger · KYC · Copy-Trading

// The Problem

DigiAlpha needed a scalable exchange architecture for thousands of orders per second. The challenge was building a lock-free matching engine while managing complex multi-currency positions and virtual ledger balances.


// Approach & Design

  • Designed a Single-Threaded Matching Engine avoiding distributed consensus latency, fed by ordered event logs.

  • Built a Microservices Mesh handling KYC, Payment (Fiat/Crypto), Order Management, and Wallet services.

  • Developed an Internal Double-Entry Ledger for instant virtual settlements, reconciled asynchronously with blockchain confirmations.

  • Implemented Copy-Trading Logic using message broadcasting via RabbitMQ to trigger mirror orders with near-zero latency.
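The single-threaded matching approach above can be sketched as a price-time-priority limit order book. This is a deliberately minimal illustration, not the production engine: names (`OrderBook`, `submit`) are hypothetical, and the sorted-array book stands in for the more efficient structures a real engine would use.

```typescript
interface Order {
  id: string;
  side: "buy" | "sell";
  price: number; // limit price
  qty: number;
}

interface Trade {
  buyId: string;
  sellId: string;
  price: number;
  qty: number;
}

// Single-threaded: submit() is called from one ordered event stream, so no
// locks or distributed consensus are needed for correctness.
class OrderBook {
  private buys: Order[] = [];  // sorted: highest price first
  private sells: Order[] = []; // sorted: lowest price first

  submit(order: Order): Trade[] {
    const trades: Trade[] = [];
    const book = order.side === "buy" ? this.sells : this.buys;
    const crosses = (best: Order) =>
      order.side === "buy" ? best.price <= order.price : best.price >= order.price;

    // Fill against the best resting orders while prices cross.
    while (order.qty > 0 && book.length > 0 && crosses(book[0])) {
      const best = book[0];
      const qty = Math.min(order.qty, best.qty);
      trades.push({
        buyId: order.side === "buy" ? order.id : best.id,
        sellId: order.side === "sell" ? order.id : best.id,
        price: best.price, // the resting order sets the trade price
        qty,
      });
      order.qty -= qty;
      best.qty -= qty;
      if (best.qty === 0) book.shift();
    }
    if (order.qty > 0) this.rest(order); // unfilled remainder rests in the book
    return trades;
  }

  private rest(order: Order): void {
    const book = order.side === "buy" ? this.buys : this.sells;
    book.push(order);
    book.sort((a, b) =>
      order.side === "buy" ? b.price - a.price : a.price - b.price
    );
  }
}
```

Because all state mutation happens inside one sequential consumer, the engine's behavior is fully determined by the order of the event log, which also makes replay and debugging straightforward.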


// Key Architecture Decisions

Event-Driven Partitioning: Used RabbitMQ to broadcast price updates and wallet movements across services with minimal coupling.
Internal Virtual Ledger: Avoided blockchain confirmation bottlenecks for a fluid, real-time trading experience.
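The internal virtual ledger can be illustrated with a minimal double-entry sketch (all names hypothetical: `Ledger`, `post`, `Posting`). Trades settle instantly against these virtual balances; blockchain confirmations reconcile them asynchronously, so users never wait on chain latency.

```typescript
// One leg of a transfer: positive amount credits, negative debits.
interface Posting {
  account: string;
  asset: string;
  amount: number;
}

class Ledger {
  private balances = new Map<string, number>(); // key: `${account}:${asset}`

  // Double-entry invariant: postings must sum to zero per asset,
  // so value can move between accounts but never appear or vanish.
  post(postings: Posting[]): void {
    const sums = new Map<string, number>();
    for (const p of postings) {
      sums.set(p.asset, (sums.get(p.asset) ?? 0) + p.amount);
    }
    for (const [asset, sum] of sums) {
      if (sum !== 0) throw new Error(`unbalanced entry for ${asset}`);
    }
    for (const p of postings) {
      const key = `${p.account}:${p.asset}`;
      this.balances.set(key, (this.balances.get(key) ?? 0) + p.amount);
    }
  }

  balance(account: string, asset: string): number {
    return this.balances.get(`${account}:${asset}`) ?? 0;
  }
}
```

Rejecting any unbalanced entry is what makes the virtual balances safe to trust before the slower on-chain reconciliation completes.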

// Measurable Outcomes

30% · Engine Throughput Gain
0 · Message Loss Events

// What We Shipped

Launched 5 trading modes in 12 months with zero data loss. The modular architecture easily scaled to support high-volume Copy-Trading features.