Hello

I'm Gnaneshwar Yadla

Data Engineer • AI Automation Specialist

Building production data pipelines and agentic AI systems.

3+

Years Experience

15+

ETL/ELT Pipelines

500GB+

Daily Data Managed

4+

AI APIs Integrated


About Me

Data Engineer and AI Automation specialist based in Richmond, VA. I have 3+ years of experience building production-grade data pipelines and agentic AI systems, from 500GB+ daily ETL workflows across Snowflake, AWS, and Azure at Wipro to multi-LLM automation systems that run entirely on their own. I built an AI-powered job application pipeline that submits 200+ applications a day across LinkedIn, Dice, and Monster using the Claude API, Ollama, and Airflow for under $14/month in AI costs. That's the kind of problem I enjoy: real, practical automation built with the right tools at the right cost. I'm currently completing my M.S. in Computer Science at Virginia Commonwealth University (GPA 3.8/4.0), and I'm always building something at the intersection of data and AI.

Education

Master of Science in Computer Science

Virginia Commonwealth University

GPA: 3.8 / 4.0

Bachelor of Science in Computer Science

Presidency University

Featured Data Engineering Projects

A collection of data platforms and pipelines I've built to solve complex business challenges.

AI Job Application Automator

End-to-end pipeline targeting 200+ applications/day across LinkedIn, Dice, and Monster, using multi-agent workflows for job-description analysis and form filling.

Python • Claude API • Ollama • Playwright • Flask • Airflow
View Project

Airbnb Analytics ELT Pipeline

Production ELT pipeline cutting refresh time by 60% with 20+ dbt quality tests across 12+ star schema dimensions.

Snowflake • dbt • AWS S3 • Python • Airflow • Power BI
View Project

Uber Trip Analytics Platform

Real-time streaming platform processing 100K+ daily Kafka events with sub-60s latency and 6-month time-travel capabilities.

Databricks • PySpark • Delta Lake • Kafka • AWS S3 • Tableau
View Project

Multi-Source Sales Dashboard

Consolidated 10+ data sources into 18+ dbt models, cutting query times by 55% and reducing data quality incidents by 50%.

Snowflake • SQL • Python • dbt • Power BI • Excel
View Project

Technical Skills

Technologies and tools I use to build robust data systems

Programming

Python • SQL • PySpark

Data Platforms

Snowflake • Databricks • dbt

AI & Automation

Claude API • Ollama • LLM Orchestration • Playwright • Airflow

Cloud & Infrastructure

AWS (S3, Glue) • Azure • Docker

Analytics & BI

Power BI • Tableau • Excel

Tools

Git

How I Build Modern Data Platforms

End-to-end data architecture from ingestion to insights

Data Sources

APIs • Databases • Events

Ingestion

Kafka • Batch Processing

Orchestration

Airflow • Scheduling

Data Warehouse

Snowflake • Storage

Transformations

dbt • Data Modeling

Analytics

Power BI • ML Models
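The flow above can be sketched in a few lines of Python. This is a minimal illustration, not production code: the function names, the sample records, and the in-memory "warehouse" dict are all invented for the example, standing in for what Kafka/batch ingestion, Snowflake storage, dbt models, and a BI layer do in a real deployment.

```python
def extract_events():
    """Ingestion stage: pull raw records from an API, database, or event stream.
    Hard-coded sample data here; in practice this is a Kafka consumer or batch job."""
    return [
        {"trip_id": 1, "city": "Richmond", "fare": 12.50},
        {"trip_id": 2, "city": "Richmond", "fare": 8.75},
        {"trip_id": 3, "city": "Norfolk", "fare": 20.00},
    ]

def load_raw(records):
    """Warehouse stage: land records untransformed (the "EL" in ELT).
    A plain dict stands in for a Snowflake raw schema."""
    return {"raw_trips": records}

def build_models(warehouse):
    """Transformation stage: model raw data into an analytics-ready fact table,
    the role dbt plays in the real pipeline."""
    fares_by_city = {}
    for row in warehouse["raw_trips"]:
        fares_by_city[row["city"]] = fares_by_city.get(row["city"], 0.0) + row["fare"]
    warehouse["fct_city_revenue"] = fares_by_city
    return warehouse

# Analytics stage: a BI tool would read the modeled table downstream.
warehouse = build_models(load_raw(extract_events()))
print(warehouse["fct_city_revenue"])  # {'Richmond': 21.25, 'Norfolk': 20.0}
```

The key design point the sketch preserves is ELT over ETL: raw data lands in the warehouse first, and modeling happens in-warehouse afterward, so transformations can be re-run and tested without re-ingesting.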

Let's Build Data Systems That Scale

I'm always interested in discussing new opportunities and challenging data projects.