Work

Systems we have shipped

Real problems, real constraints, real outcomes. Client names are anonymized unless we have explicit permission to share them.

Data Engineering · Series C marketplace · E-commerce

Real-time pricing engine processing 2M+ events per day

Challenge

The client's pricing system ran on batch jobs every 45 seconds — too slow to react to demand signals during peak hours, causing lost margin and inventory imbalance.

What we did

We designed a streaming pipeline using Kafka and Flink, replacing the batch layer with event-driven processing. A new pricing service consumed signals in real time and published prices to a Redis cache consumed by the product catalog.
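The event-driven repricing step can be sketched as follows. This is a minimal, self-contained illustration: the names, the demand heuristic, and the tuning constants are hypothetical, and a plain dict stands in for the Redis cache; in production the logic ran inside a Flink job consuming Kafka events.

```python
from dataclasses import dataclass


@dataclass
class DemandSignal:
    sku: str
    views_per_min: float   # demand proxy aggregated from the event stream
    stock_on_hand: int


# Hypothetical tuning constants -- the real pricing model was the client's.
BASE_MARKUP = 1.0
MAX_SURGE = 1.25


def reprice(base_price: float, signal: DemandSignal) -> float:
    """Nudge price up under high demand / low stock, capped at MAX_SURGE."""
    pressure = signal.views_per_min / max(signal.stock_on_hand, 1)
    multiplier = min(BASE_MARKUP + 0.05 * pressure, MAX_SURGE)
    return round(base_price * multiplier, 2)


# In production prices were published to Redis; a dict stands in here.
price_cache: dict[str, float] = {}


def handle_event(base_prices: dict[str, float], signal: DemandSignal) -> None:
    price_cache[signal.sku] = reprice(base_prices[signal.sku], signal)
```

The point of the streaming design is that `handle_event` fires per event as signals arrive, so a price update costs milliseconds instead of waiting for the next batch window.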

Outcomes

  • Pricing latency dropped from 45s to under 200ms
  • 12% improvement in gross margin during peak windows
  • Pipeline now handles 2M+ events/day with headroom to 10M

Python · Databricks · Airflow · AWS · GCP · Azure · Playwright
Embedded Engineering · National logistics distribution network · B2B operations

From email-driven ops to 90% platform adoption across a national distribution network

Challenge

Every operational process — approvals, resource requests, lifecycle events — ran through email threads. The system had fragmented modules with broken lifecycles, no automation, and no visibility. Teams spent more time coordinating than executing.

What we did

We embedded with the Connect India team to refactor broken lifecycle flows, consolidate fragmented modules, and systematically replace manual email-based processes with automated system workflows. Approval mechanisms were redesigned and RBAC was simplified to match real org structures.
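The shape of the redesigned approval flow can be sketched as a small role-gated state machine. The states, roles, and transition table below are illustrative only, not the client's actual scheme; the idea is that each lifecycle step encodes which roles may advance it, replacing ad-hoc email sign-offs.

```python
# Illustrative lifecycle states and the roles allowed to advance each one.
TRANSITIONS: dict[tuple[str, str], set[str]] = {
    ("draft", "submitted"): {"requester"},
    ("submitted", "approved"): {"manager"},
    ("submitted", "rejected"): {"manager"},
    ("approved", "fulfilled"): {"ops"},
}


def advance(state: str, target: str, role: str) -> str:
    """Move a request to `target` if `role` is permitted, else raise."""
    allowed = TRANSITIONS.get((state, target), set())
    if role not in allowed:
        raise PermissionError(f"{role} cannot move {state} -> {target}")
    return target
```

Keeping the permission rules in one table is also what makes RBAC simplification tractable: matching roles to the real org structure is an edit to data, not to scattered approval code.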

Outcomes

  • Platform adoption reached 90% — email-based operations eliminated
  • Resource management costs cut by 50%
  • Approval and RBAC flows simplified across all modules
  • Automation introduced at every repeatable process touchpoint

Node.js · React · FastAPI · PostgreSQL · AWS
AI & Machine Learning · Consumer goods brand · Retail

SKU-level demand forecasting cutting overstock costs by 22%

Challenge

The client's supply chain team was manually adjusting spreadsheet-based forecasts for 500+ SKUs across 12 distribution centers. Forecast error was running at 34%, leading to chronic overstock in slow-moving lines.

What we did

We built an ML pipeline using LightGBM with time-series features derived from POS data, promotional calendars, and external signals (weather, local events). The pipeline was productionized on Airflow with weekly retraining and a Streamlit dashboard for the supply chain team.
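The feature-derivation step can be sketched as below. Column names are hypothetical, and the LightGBM training, external signals, and Airflow scheduling are omitted; the sketch only shows the lag and rolling-window features computed per SKU and distribution center from daily POS rows.

```python
import pandas as pd


def add_ts_features(df: pd.DataFrame) -> pd.DataFrame:
    """Derive lag and rolling-mean features per SKU/DC from daily POS rows.

    Expects columns: sku, dc, date, units_sold (illustrative names).
    """
    df = df.sort_values(["sku", "dc", "date"]).copy()
    grp = df.groupby(["sku", "dc"])["units_sold"]
    # Same weekday last week -- a strong signal for retail demand.
    df["lag_7"] = grp.shift(7)
    # Trailing 28-day mean, shifted by one day to avoid target leakage.
    df["roll_28"] = grp.transform(
        lambda s: s.shift(1).rolling(28, min_periods=7).mean()
    )
    return df
```

The `shift(1)` before the rolling mean matters: it keeps the current day's sales out of its own feature window, so the weekly retrain never sees the value it is asked to predict.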

Outcomes

  • Forecast error (MAPE) reduced from 34% to 19%
  • Overstock costs down 22% in first 6 months
  • Adopted as the primary forecasting tool, replacing all spreadsheets
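For reference, the error metric quoted above (MAPE) is the mean of absolute percentage errors between actual and forecast demand; a minimal version, skipping zero-demand rows where the ratio is undefined:

```python
def mape(actual, forecast) -> float:
    """Mean absolute percentage error, in percent; skips zero-actual rows."""
    pairs = [(a, f) for a, f in zip(actual, forecast) if a != 0]
    return 100 * sum(abs(a - f) / abs(a) for a, f in pairs) / len(pairs)
```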

Python · LightGBM · Apache Airflow · Streamlit · BigQuery · dbt

Have a similar problem?

Tell us what you are building. We will tell you whether we can help and what that would look like.

Start a project