Full Stack Developer
Hello! I'm a passionate software developer with a love for creating elegant solutions to complex problems. With experience in modern technologies, I focus on building responsive, user-friendly applications.
When I'm not coding, you'll find me writing on my blog, exploring new technologies, or contributing to open-source projects.
LIVE
A full-stack AI chat application with React + TypeScript frontend, Python + FastAPI backend, and microservices architecture. Features multi-model evaluation querying 6 AI models concurrently via Groq API, real-time WebSocket communication, and PostgreSQL database for conversation history. Deployed with Vercel (frontend), Render (backend), and Neon (database).
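The multi-model evaluation fans the same prompt out to several models at once and gathers the responses. A minimal sketch of that concurrency pattern, with a stub standing in for the real Groq API call (the model names and `query_model` function here are illustrative, not the app's actual code):

```python
import asyncio

# Hypothetical model list; the real app queries 6 models via the Groq API.
MODELS = ["model-a", "model-b", "model-c", "model-d", "model-e", "model-f"]

async def query_model(model: str, prompt: str) -> dict:
    await asyncio.sleep(0)  # stand-in for the network round-trip
    return {"model": model, "answer": f"{model} response to: {prompt}"}

async def query_all(prompt: str) -> list[dict]:
    # Send the prompt to every model concurrently; gather preserves
    # submission order, so responses line up with MODELS.
    return await asyncio.gather(*(query_model(m, prompt) for m in MODELS))

results = asyncio.run(query_all("Hello"))
```

In the real service the stub would be an HTTP call, and the gathered results would be streamed back to the client over the WebSocket connection.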
Click to try the live demo →
An intelligent ATS (Applicant Tracking System) tool that analyzes resumes against job descriptions to help job seekers optimize their applications. Features resume parsing, keyword matching, and actionable feedback to improve ATS compatibility scores.
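At its core, keyword matching compares the terms a job description asks for against the terms a resume contains. A toy sketch of that step (the stopword list and scoring are illustrative simplifications, not the tool's actual algorithm):

```python
import re

def tokenize(text: str) -> set[str]:
    # Lowercase word tokens; the character class keeps terms like "c++".
    return set(re.findall(r"[a-z][a-z+#.]*", text.lower()))

STOPWORDS = {"and", "or", "the", "a", "an", "with", "in", "for", "to", "of"}

def keyword_match_score(resume: str, job_description: str):
    """Return the fraction of job-description keywords found in the
    resume, plus the missing ones to surface as feedback."""
    jd_terms = tokenize(job_description) - STOPWORDS
    resume_terms = tokenize(resume)
    missing = jd_terms - resume_terms
    score = 1 - len(missing) / len(jd_terms) if jd_terms else 0.0
    return score, missing

score, missing = keyword_match_score(
    "Python and SQL developer", "Looking for Python, SQL, Docker"
)
```

A production version would also weight terms, handle synonyms, and parse the resume's sections, but the match-and-report loop is the same shape.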
Click to try the live demo →
A multi-module IoT Edge pipeline with three modules — a C++ sensor simulator, a C++ data filter for noise rejection and spike detection, and a Python analytics engine for threshold alerting. Modules communicate through Edge Hub routing with only alerts forwarded upstream to IoT Hub. Features multi-stage Docker builds for minimal image sizes, dual-mode compilation for local development, and achieved ~9% data reduction through edge filtering before cloud transmission.
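The filter module's spike detection can be pictured as rejecting any reading that deviates too far from a rolling median of recent values. A small Python sketch of that idea (the actual module is C++; the window size and threshold here are illustrative, not the pipeline's real configuration):

```python
from collections import deque
from statistics import median

class SpikeFilter:
    """Drop readings that deviate too far from the rolling median
    of recently accepted values."""

    def __init__(self, window: int = 5, max_deviation: float = 10.0):
        self.history = deque(maxlen=window)
        self.max_deviation = max_deviation

    def accept(self, value: float) -> bool:
        # Only judge once we have a few accepted readings to compare against.
        if len(self.history) >= 3 and abs(value - median(self.history)) > self.max_deviation:
            return False  # spike: reject, leave history unchanged
        self.history.append(value)
        return True

f = SpikeFilter()
readings = [20.0, 20.5, 21.0, 20.8, 95.0, 21.2]
kept = [r for r in readings if f.accept(r)]  # 95.0 is rejected as a spike
```

Rejecting spikes at the edge, and forwarding only threshold alerts rather than raw telemetry, is what keeps the upstream IoT Hub traffic small.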
GDR is a mission-critical data processing pipeline within MasterCard's Smart Data ecosystem, serving as a centralized platform for ingesting, transforming, enriching, and validating customer data before delivering it to downstream systems and external consumers. I contributed to building and maintaining core backend components that ensured high reliability, data integrity, and consistent processing across the pipeline. The system's continuous, multi-stage flow required strong attention to correctness and failure handling to prevent downstream data corruption.
During my time at AT&T, I worked on backend application development within the Commissions team, building and enhancing systems responsible for calculating commissions for agents and national retailers. The platform ingested data from multiple upstream applications, applied complex business rules, and processed high-volume transactions with a strong emphasis on accuracy and consistency. I was actively involved in analyzing and modernizing legacy components, supporting migration efforts while ensuring reliability and production readiness at scale.
During my time at CenturyLink, I worked on backend development and support tasks involving C++ and Python. I spent most of my time understanding existing code, making small enhancements, fixing issues, and helping keep the systems stable in production. Although the engagement was short, it gave me solid hands-on exposure to working with real-world codebases and production environments.
Have a question or want to work together? Feel free to reach out!