Over a decade of experience building data pipelines, APIs, and web applications in Python. Day-to-day tools include PySpark, Pandas, FastAPI, and Flask.
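As a small flavor of that day-to-day work, here is a minimal Pandas sketch of an aggregation step from a typical pipeline; the dataset and column names are purely hypothetical.

```python
import pandas as pd

# Hypothetical event data: one row per page view.
events = pd.DataFrame({
    "user": ["a", "b", "a", "c", "b", "a"],
    "duration_s": [12, 30, 7, 45, 3, 20],
})

# Per-user view counts and total time, busiest users first.
summary = (
    events.groupby("user", as_index=False)
          .agg(views=("duration_s", "size"), total_s=("duration_s", "sum"))
          .sort_values("total_s", ascending=False)
)
print(summary.to_string(index=False))
```

The same named-aggregation pattern scales up directly to the PySpark `groupBy(...).agg(...)` API when the data no longer fits on one machine.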
Building and operating large-scale data platforms end-to-end: Airflow for orchestration, PySpark for ETL, HDFS & Delta Lake for storage, Kafka for streaming, and Druid for analytics.
Over a decade of writing and optimizing SQL across diverse query engines including Spark SQL, Trino, and Druid.
Production experience designing schemas, writing migrations, and tuning queries in PostgreSQL, MariaDB, and SQLite.
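A minimal, self-contained sketch of the schema-plus-index-plus-plan workflow described above, using SQLite's standard-library driver since it needs no server; the table and column names are hypothetical, and the same pattern carries over to `EXPLAIN` in PostgreSQL and MariaDB.

```python
import sqlite3

# In-memory database standing in for a real server instance.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        id         INTEGER PRIMARY KEY,
        customer   TEXT NOT NULL,
        created_at TEXT NOT NULL
    );
    -- Covering index so customer lookups avoid a full table scan.
    CREATE INDEX idx_orders_customer ON orders (customer, created_at);
""")
conn.executemany(
    "INSERT INTO orders (customer, created_at) VALUES (?, ?)",
    [("alice", "2024-01-01"), ("bob", "2024-01-02"), ("alice", "2024-01-03")],
)

# EXPLAIN QUERY PLAN shows whether the index is used for this predicate.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = ?", ("alice",)
).fetchall()
count = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer = ?", ("alice",)
).fetchone()[0]
```

Checking the query plan before and after adding an index is the core of the tuning loop regardless of engine.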
Linux is my daily driver for development and production infrastructure. This site runs on a self-managed Debian VPS with nginx.
Built this site with Lume and Deno, and used JavaScript extensively for freelance full-stack projects with Node.
Exploring systems programming through hobby projects in Rust, C, C++, and Go.
