Hi, I'm

Brendan White

Senior Full Stack Engineer

Experienced Full Stack Engineer with 10 years of expertise in designing and building high-performance enterprise solutions. I specialize in the modern T3 Stack and cloud platforms.

React, Next.js, TypeScript, Tailwind CSS, Node.js, C#, Spring, Java, Ruby, MySQL, Supabase, Web3, Solana, Ethereum, Solidity, Hardhat, Truffle, AWS, Docker, Kubernetes, Jenkins, Bash, Astro

My Arsenal 💪

Languages & Frameworks

T3 Stack, Node.js, React, Next.js, Angular, GraphQL
Java, Spring Boot, Spring Framework, Quarkus
Hibernate, JPA for enterprise applications
.NET Core, ASP.NET Core, Entity Framework Core
Blockchain, Solana, Ethereum, DeFi, Web3

Cloud & DevOps

AWS (Lambda, EC2, ECS, EKS, ELB, SQS, SNS, SES, Fargate)
API Gateway, Cognito, EventBridge, Elastic Beanstalk
CodePipeline, CodeDeploy, CloudFront, Kinesis, CDK
Terraform, Docker, Kubernetes, Jenkins

Database & Middleware

PostgreSQL, Amazon RDS, Oracle, MongoDB
Cassandra, Amazon DynamoDB, Redis, Elasticsearch
Messaging systems: Kafka, Amazon SQS
MuleSoft integration & Anypoint Platform

Testing & Tools

JUnit, Jest, Jasmine, Mocha, Mockito, Selenium
Cypress, Cucumber, Postman, Spock, Geb, JMeter
Agile, Microservices, REST APIs, CI/CD, TDD, BDD
GitHub, GitLab, Bitbucket for version control

Professional Journey

Work Experience

Tech Lead

OXYgean

Dubai, UAE (Remote)

Oct 2025 – Present

Architected and led the development of cloud-native, event-driven backend services using Python/Scala and Kafka, supporting high-throughput data ingestion and processing for critical analytics and operational workflows.

Key Achievements:
  • Established comprehensive platform standards for reliability and operability, including service templates, centralized logging/metrics/tracing, SLOs, and runbooks, fostering a healthy production culture
  • Led Kubernetes-based production deployments and refined release practices using GitOps-style workflows and CI/CD automation (GitHub Actions), improving deployment consistency and speed
  • Implemented Infrastructure as Code with Terraform for repeatable, secure provisioning of cloud resources (AWS), enforcing least-privilege access controls and secrets management
  • Designed data modeling and transformation patterns using dbt and modern data warehouse/lakehouse architectures, incorporating built-in data quality checks to increase trust in outputs
  • Drove performance and cost optimization across data pipelines and APIs through query tuning, partitioning strategies, caching, and autoscaling policies, reducing operational costs by ~20%
  • Mentored and unblocked engineers through design reviews, pairing, and technical documentation, raising overall engineering quality and collaboration
  • Developed and maintained effective logging and debugging practices for both server-side and distributed systems, significantly reducing mean time to resolution (MTTR)
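
The SLO work above reduces, at its core, to error-budget arithmetic. A minimal sketch (the `error_budget` helper and all numbers are illustrative, not from any real service):

```python
# Sketch: the error-budget math behind an availability SLO.
# All numbers are invented for illustration.

def error_budget(slo: float, total_requests: int, failed_requests: int) -> dict:
    """Return how much of the error budget a service has consumed."""
    allowed_failures = total_requests * (1.0 - slo)  # budget, in requests
    consumed = failed_requests / allowed_failures if allowed_failures else float("inf")
    return {
        "allowed_failures": allowed_failures,
        "budget_consumed_pct": round(consumed * 100, 1),
        "budget_remaining_pct": round(max(0.0, 1.0 - consumed) * 100, 1),
    }

if __name__ == "__main__":
    # A 99.9% SLO over 1,000,000 requests allows roughly 1,000 failures.
    print(error_budget(0.999, 1_000_000, 250))
```

At 99.9%, a million requests buy a budget of roughly 1,000 failures; burning 250 of them leaves 75% of the budget for the rest of the window.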

Senior Data Engineer

WP Engine

Austin, TX (Remote)

Jan 2023 – Oct 2025

Led a large-scale dbt refactoring and optimization initiative, improving data reliability, reducing technical debt, and accelerating analytics delivery by 35% for cross-functional teams.

Key Achievements:
  • Designed, built, and migrated legacy ETL pipelines from Fivetran/Stitch to custom Python-based data pipelines, reducing costs by 40% while increasing performance, flexibility, and maintainability
  • Developed and maintained scalable data pipelines integrated with Ruby on Rails applications, enabling reliable, real-time data ingestion for business-critical systems
  • Deployed and operated production data workloads on GCP using Kubernetes and Docker, ensuring high availability, scalability, and fault tolerance for enterprise clients
  • Automated cloud infrastructure provisioning using Terraform, standardizing development and production environments and improving deployment consistency by 60%
  • Implemented robust data quality monitoring and observability with Monte Carlo, proactively detecting anomalies and improving data trustworthiness for stakeholders
  • Designed and implemented a custom AI agent leveraging LLMs to augment internal data and engineering workflows, improving developer productivity by 25% and accelerating troubleshooting
  • Integrated incident management and alerting with PagerDuty and Slack, improving on-call efficiency and supporting a sustainable, balanced on-call culture
  • Coached engineers through weekly technical talks, mentoring team members on modern data engineering best practices, system design, and tooling
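
The data quality monitoring mentioned above boils down to checks like not-null and uniqueness. A plain-Python sketch of those two checks (function names and sample rows are hypothetical; real setups would express these as dbt tests or Monte Carlo monitors):

```python
# Sketch: minimal not-null and uniqueness checks, the building blocks
# of dbt-style data quality testing. Names and rows are illustrative.

from collections import Counter

def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows: list[dict], column: str) -> list:
    """Return values of `column` that appear more than once."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, n in counts.items() if n > 1]

if __name__ == "__main__":
    rows = [
        {"id": 1, "email": "a@example.com"},
        {"id": 2, "email": None},
        {"id": 2, "email": "c@example.com"},
    ]
    print(check_not_null(rows, "email"))  # row index 1 has a null email
    print(check_unique(rows, "id"))       # id 2 is duplicated
```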

Full Stack Engineer

Tunnl Exchange

Miami, FL (Remote)

Oct 2023 – May 2025

Contributed to the design and development of a hybrid CEX/DEX trading platform, combining off-chain performance with on-chain settlement for secure and scalable digital asset trading.

Key Achievements:
  • Implemented an off-chain order book and microservices-based matching engine in Python, enabling low-latency trade execution while maintaining blockchain transparency
  • Built cross-chain and fiat on/off-ramp functionality by integrating Circle payment services, enabling seamless asset bridging and fiat-to-crypto transactions
  • Integrated decentralized liquidity and protocol interactions including Uniswap, ParaSwap, and Sablier, supporting token swaps, streaming payments, and optimized liquidity routing
  • Developed ERC-20 token deposit and withdrawal workflows, supporting both on-chain deposits and card-based (credit/debit) funding for a seamless user experience
  • Designed and implemented event-driven microservices for blockchain log ingestion, transaction tracking, and settlement reconciliation using Kafka
  • Deployed and operated backend services on AWS (EC2, Lambda, EKS), leveraging containerization (Docker) and serverless architectures for optimal scalability and cost efficiency
  • Wrote efficient tests and established tight feedback loops for both frontend components and backend services, ensuring code quality and reliability
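
A matching engine of the kind described above pairs incoming orders against the best-priced resting orders first. A toy price-time-priority sketch (deliberately simplified; a production engine would use heaps or sorted structures and handle many more order types):

```python
# Sketch: a toy price-time-priority matcher illustrating (not reproducing)
# the off-chain order-book approach described above.

from dataclasses import dataclass

@dataclass
class Order:
    side: str    # "buy" or "sell"
    price: float
    qty: float

class OrderBook:
    def __init__(self):
        self.bids: list[Order] = []  # resting buys, best (highest) price first
        self.asks: list[Order] = []  # resting sells, best (lowest) price first

    def submit(self, order: Order) -> list[tuple[float, float]]:
        """Match against the opposite side; return (price, qty) fills."""
        opposite = self.asks if order.side == "buy" else self.bids

        def crosses(resting: Order) -> bool:
            return (order.price >= resting.price) if order.side == "buy" \
                   else (order.price <= resting.price)

        fills = []
        while order.qty > 0 and opposite and crosses(opposite[0]):
            resting = opposite[0]
            traded = min(order.qty, resting.qty)
            fills.append((resting.price, traded))
            order.qty -= traded
            resting.qty -= traded
            if resting.qty == 0:
                opposite.pop(0)
        if order.qty > 0:  # rest the remainder on its own side of the book
            same = self.bids if order.side == "buy" else self.asks
            same.append(order)
            # stable sort preserves time priority among equal prices
            same.sort(key=(lambda o: -o.price) if order.side == "buy"
                      else (lambda o: o.price))
        return fills
```

A buy at 10.0 against resting asks at 9.0 and 10.0 fills the cheaper ask first, then the more expensive one, which is exactly the price-priority rule.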

Backend Engineer

Hashcash Consultants

Palo Alto, CA (Remote)

Feb 2020 – Dec 2022

Developed and maintained blockchain data ingestion pipelines using Python, Web3.py, and Ethereum JSON-RPC, enabling near real-time extraction of on-chain transactions and smart contract events.

Key Achievements:
  • Built scalable ETL pipelines with Apache Airflow, dbt, and PySpark, transforming raw blockchain data into structured, analytics-ready datasets for Web3 platforms
  • Optimized real-time on-chain data streaming using Kafka and AWS Kinesis, supporting low-latency processing for analytics and monitoring applications
  • Designed and implemented high-performance data partitioning and storage strategies in Snowflake and Delta Lake, reducing query latency by ~50% on large datasets
  • Engineered on-chain data indexing and query layers, leveraging BigQuery Ethereum datasets and custom GraphQL APIs for efficient data access and exploration
  • Built backend data services and models powering AI-driven dashboards and predictive analytics for blockchain trends, token flows, and network activity
  • Automated CI/CD pipelines for data platforms using Terraform, CloudFormation, Docker, and Kubernetes, improving deployment speed, consistency, and reliability
  • Deployed and operated containerized data services across hybrid cloud environments (AWS, GCP, Azure), ensuring scalability and high availability
  • Designed and optimized large-scale blockchain data pipelines using Azure Data Factory, Databricks, and Apache Spark for distributed processing
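
Partitioning strategies for on-chain data often bucket rows by block-number range so that queries prune irrelevant files. A sketch of that idea (the bucket size and the `partition_key` helper are illustrative, not from any real table):

```python
# Sketch: block-range bucketing, the idea behind partitioning on-chain
# data in a warehouse or lakehouse. Bucket size is an assumed value,
# tuned per table in practice.

BLOCKS_PER_PARTITION = 100_000

def partition_key(block_number: int) -> str:
    """Map a block number to a stable partition label (e.g. a table path suffix)."""
    start = (block_number // BLOCKS_PER_PARTITION) * BLOCKS_PER_PARTITION
    return f"blocks_{start}_{start + BLOCKS_PER_PARTITION - 1}"

if __name__ == "__main__":
    print(partition_key(17_123_456))  # blocks_17100000_17199999
```

Queries filtered on a block range then touch only the partitions whose labels overlap that range, which is where the latency reduction comes from.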

Backend Engineer

Optum

Eden Prairie, MN (Remote)

Dec 2016 – Jan 2020

Engineered and maintained backend data pipelines to ingest, transform, and store healthcare operational and performance data, supporting automated enterprise workflows.

Key Achievements:
  • Designed and optimized backend data models, schemas, and fact tables in SQL databases to ensure efficient storage, query performance, and long-term system scalability
  • Built backend data services and APIs delivering near real-time datasets for internal healthcare platforms and downstream system integrations
  • Developed SQL- and Python-based backend processing services to support automated healthcare reporting, analytics, and operational workflows
  • Implemented reliable backend pipelines to process program performance, experimentation, and operational metrics, ensuring data accuracy, traceability, and consistency
  • Automated end-to-end backend workflows using Zoho Flow, Zoho Creator, and n8n, integrating healthcare operational systems securely and efficiently
  • Reduced manual operational effort by 43% through backend automation, data validation, orchestration, and standardized processing pipelines
  • Deployed and orchestrated backend ETL services using Apache Airflow and AWS Lambda, enabling scalable, fault-tolerant data processing
  • Implemented CI/CD pipelines for backend services using GitHub Actions, Jenkins, and Docker, improving deployment reliability while maintaining compliance standards
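
Pipelines that promise accuracy and traceability typically gate loads behind explicit validation. A minimal validate-then-quarantine sketch (the field names like `member_id` are hypothetical, not from any real schema):

```python
# Sketch: a validate-then-load step of the kind used in the pipelines
# above. Field names are invented for illustration.

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not record.get("member_id"):
        errors.append("missing member_id")
    if record.get("claim_amount", 0) < 0:
        errors.append("negative claim_amount")
    return errors

def split_valid(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Route clean records onward and quarantine the rest for review."""
    valid, quarantined = [], []
    for record in records:
        (valid if not validate_record(record) else quarantined).append(record)
    return valid, quarantined
```

Quarantining instead of silently dropping bad rows is what preserves traceability: every rejected record stays inspectable with the reason it failed.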

Backend Engineer

Zappos

Las Vegas, NV

May 2013 – Dec 2016

Engineered and maintained backend data ingestion and transformation pipelines from OLTP/OLAP databases, SQL sources, and Salesforce, ensuring data consistency and integrity across e-commerce systems.

Key Achievements:
  • Implemented backend data processing and analytics logic to identify purchasing patterns and sales trends, improving conversion workflows by 12%
  • Developed backend forecasting and demand-processing pipelines supporting sales, churn analysis, and inventory planning, reducing stockouts by 17%
  • Exposed curated datasets through backend APIs and data access layers, enabling internal tools, reporting platforms, and operational decision-making
  • Automated backend reporting and data refresh workflows, delivering timely and reliable datasets for leadership and operational teams
  • Enabled customer lifecycle and retention systems by building backend data integrations supporting segmentation and personalization, contributing to a 23% increase in customer retention
  • Collaborated closely with cross-functional marketing, finance, operations, and customer success teams to design backend data solutions aligned with core e-commerce business workflows
  • Developed solid fundamentals in database design, SQL query optimization, and backend service architecture to support high-volume e-commerce transactions
  • Gained extensive experience in shipping features rapidly within a fast-paced, customer-centric Agile environment
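
Demand-processing pipelines like those above usually start from a simple baseline before anything fancier. A moving-average sketch (the sample series is invented):

```python
# Sketch: a moving-average baseline, the kind of starting point a
# demand-forecasting pipeline might use. Sample data is illustrative.

def moving_average_forecast(history: list[float], window: int = 3) -> float:
    """Forecast the next period as the mean of the last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    recent = history[-window:]
    return sum(recent) / window

if __name__ == "__main__":
    weekly_units = [120, 135, 128, 150, 162]
    print(moving_average_forecast(weekly_units))  # mean of the last 3 weeks
```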

Academic background

Education

Bachelor of Computer Science

Valencia College

2011 – 2015

Comprehensive education in computer science fundamentals, software engineering principles, and modern development practices. Foundation for a successful career in full-stack development and enterprise software solutions.

Certifications

AWS Certified Solutions Architect – Associate

Amazon Web Services

Oracle Certified Professional, Java SE 8 Programmer

Oracle Corporation

Let's talk

Contact

Have a question or a project in mind? Feel free to reach out.