Remote, but candidates must be based in the following location:
Location: Remote from LATAM
Contract Type: Full-time vendor (Contracted via Inallmedia.com)
Time Zone Alignment: Central Time (CT)
Inallmedia.com is a global technology and design firm focused on building impactful digital solutions through remote, distributed teams across LATAM. We partner with international clients across industries, providing long-term technical expertise, product innovation, and team augmentation.
You will join a high-impact engineering squad dedicated to evolving user-facing experiences through AI-driven features and intelligent workflows. As a Senior Data Engineer, your mission is to bridge the gap between raw data collection and application logic. You will be instrumental in building a cohesive, intelligent ecosystem that powers modern user interactions, working within a fast-paced Agile environment that prioritizes innovation and architectural excellence.
Service Development: Design and build scalable backend microservices using Java, Spring Boot, and Gradle.
Data Pipeline Engineering: Architect and maintain robust ETL/ELT pipelines using Python and dbt to ensure seamless data flow across the ecosystem.
API Architecture: Design high-quality RESTful APIs that connect sophisticated frontend experiences to complex backend data systems.
Big Data Management: Optimize and manage data storage across SQL and NoSQL environments, leveraging technologies like Snowflake or Redshift.
Workflow Orchestration: Utilize Airflow to manage and schedule complex data workflows and dependencies.
AI Integration: Evolve backend services to support AI-powered features, ensuring infrastructure is prepared for LLM-driven and intelligent user experiences.
System Reliability: Troubleshoot distributed systems, lead code reviews, and participate in architectural brainstorming to ensure peak performance and reliability.
Dual Language Expertise: 5–7 years of professional experience working with Python (for data engineering) and Java/Spring Boot (for service layers).
Data Engineering Mastery: 5+ years building production-grade pipelines with ETL tools (specifically dbt) and Airflow.
Big Data Stack: 3+ years of hands-on experience with Snowflake, Redshift, Spark, or Kafka.
Database Proficiency: Extensive experience navigating and optimizing both SQL and NoSQL environments.
Backend Fundamentals: Strong grasp of REST API design, Gradle, and microservices architecture in a distributed environment.
Education: Bachelor's or Master's degree in a technical field (Computer Science, Math, Statistics, or equivalent).
Remote Fluency: Proven experience working in Agile teams within 100% remote environments.
Fluent English: Excellent verbal and written communication skills for daily technical collaboration.
AI/LLM Interest: Previous exposure to integrating machine learning models or intelligent workflows into backend services.
Cloud Ecosystems: Familiarity with AWS or GCP cloud infrastructure and deployment patterns.
Soft Skills: A self-starter mindset with the critical thinking skills necessary to manage competing priorities in a dynamic project.
This position is 100% remote for candidates based in LATAM. To ensure effective collaboration with our North American partners, the role requires full alignment with Central Time (CT) business hours.
All interviews, technical documentation, and daily stand-ups will be conducted exclusively in English.