PyJobs
Company

Keedian

keedian.breezy.hr
Location

Remote, but you must be based in one of the following locations:

  • 🇦🇷 Argentina
  • 🇨🇴 Colombia
  • 🇲🇽 Mexico

Senior Data Engineer

Keedian (formerly Tutenlabs) is transforming how buildings operate — combining technology implementation, AI-powered optimization, and managed services to help multi-site organizations reduce costs, lower emissions, and move from reactive operations to predictive performance.

Our vision is clear: a future where buildings run autonomously, so our clients can focus on what matters most.

🌎 100% remote — we prioritize candidates based in Mexico, Colombia, and Argentina.


What this role is about

We’re looking for a Senior Data Engineer to help build and scale Keedian’s data and intelligence platform.

This role will be key in transforming large-scale operational, IoT, and telemetry data into dashboards, reporting, business metrics, and future AI-driven optimization capabilities.

You’ll own data domains end-to-end: ingestion, modeling, quality, architecture, and visualization. You’ll work closely with Engineering, Product, Operations, and Customer Success to build scalable data products with direct business impact.

This is an ideal opportunity for someone who enjoys building from scratch, making important technical decisions, and solving complex operational data challenges.


Responsibilities

  • Design, build, and evolve Keedian’s analytical platform and multi-tenant data lake architecture

  • Build and maintain scalable ingestion pipelines from operational systems, APIs, IoT devices, and telemetry sources

  • Develop and own analytical models, semantic layers, dashboards, and operational reporting

  • Ensure data quality, lineage, observability, freshness, scalability, and reliability across the platform

  • Collaborate with Engineering and Product teams to ensure systems and events are analytics-ready

  • Build the data foundation for predictive maintenance, operational intelligence, anomaly detection, and AI-driven optimization

  • Help define and measure customer impact through ROI, savings, and operational performance metrics

  • Contribute to architecture decisions, documentation, governance, and evolution of Keedian’s modern data stack

Requirements

Experience

  • 5+ years of experience in Data Engineering or Analytics Engineering

  • Experience building data platforms end-to-end

  • Experience working with telemetry, time-series, or large-scale operational data

  • Experience with cloud architectures and multi-tenant environments

  • Bonus: experience in IoT, energy, smart buildings, HVAC, or BAS industries

Technical Skills

  • Advanced SQL

  • Python for data engineering

  • AWS and modern data lake/lakehouse architectures

  • dbt or similar transformation/modeling tools

  • Workflow orchestration and pipeline management

  • Streaming and large-scale data processing

  • Data quality, observability, and Infrastructure as Code

Languages

  • Advanced English

  • Advanced Spanish


Benefits

💻 100% remote

⏰ Flexible schedule

🍰 Birthday day off

😁 Reduced hours on Fridays

🌴 Unlimited vacation

💊 Sick leave days

🏠 Moving day off

🎓 Support for studies, training, and language learning

👥 Referral program

If you’re excited about building the data foundation behind autonomous operations, large-scale analytics, and AI-driven optimization — we’d love to meet you. 💙
