Remotebase


Remote, but you must be in the following location:

  • 🇵🇰 Pakistan

Senior Python Developer with Kafka

  • Architect data exfiltration pipelines for our fleet of field devices and high-frequency research sensors over low-bandwidth communication links.

  • Work with Kafka: messages come in via Kafka, the stack includes Kubernetes and MLK, and microservices are written in TypeScript.

  • Implement highly-available data warehousing with intuitive APIs and efficient query infrastructure.

  • Create monitoring and control software that flags issues with the power grid in real time and enables operators to dispatch the appropriate resources needed to fix them.

  • Design intuitive access patterns that enable our data science teams to run analysis on collected data.

  • Deploy new and existing cloud infrastructure that scales with our growing customer base.

  • Collaborate cross-functionally with our hardware team, data scientists, full-stack engineers, and embedded software engineers to super-charge their workflows.

  • Use Datadog for monitoring and Confluence for documentation.
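The responsibilities above center on consuming sensor messages and flagging grid issues in real time. As a rough sketch of the kind of message handler involved (the JSON schema with `device_id` and `voltage`, and the voltage limits, are illustrative assumptions, not part of the posting):

```python
import json
from dataclasses import dataclass


@dataclass
class Alert:
    device_id: str
    metric: str
    value: float


# Hypothetical operating range; real limits would come from grid specifications.
VOLTAGE_LIMITS = (210.0, 250.0)


def flag_issues(raw_message: bytes) -> list[Alert]:
    """Parse one sensor reading (as it might arrive off a Kafka topic)
    and return alerts for out-of-range voltage values."""
    reading = json.loads(raw_message)
    alerts = []
    voltage = reading.get("voltage")
    if voltage is not None and not (VOLTAGE_LIMITS[0] <= voltage <= VOLTAGE_LIMITS[1]):
        alerts.append(Alert(reading["device_id"], "voltage", voltage))
    return alerts
```

In practice a function like this would sit inside a Kafka consumer loop, with alerts forwarded to the operator-facing dispatch tooling the posting describes.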

To move forward in the application process, you should have all of the following:

  • Demonstrated expertise in one or more backend web frameworks using Python, such as Django, Flask, or FastAPI.

  • Ability to design and develop efficient and scalable web applications using Python.

  • Proven track record of at least 5 years in software engineering roles, with a focus on building robust and scalable applications.

  • Hands-on experience with Kafka, including data streaming, message queuing, and building event-driven architectures.

  • Prior experience in designing and developing large-scale cloud-based applications with an emphasis on scalability and performance.

  • Proficiency in working with Kubernetes to deploy, manage, and orchestrate containerized applications.

  • Solid experience in TypeScript development for building efficient and maintainable codebases.

  • Familiarity with version control systems (e.g., Git) for collaborative development.

  • Experience in setting up automated deployment pipelines for efficient and reliable application delivery.

  • Proficiency in implementing testing pipelines to ensure code quality and reliability.

  • Knowledge of container configuration and containerization technologies like Docker.

  • Demonstrated expertise in SQL database systems, preferably with Postgres.

  • Exposure to Halo Cast or similar technologies for data processing and analytics is a plus.

  • Familiarity with cloud infrastructure solutions (AWS, GCP, Azure):

      • Understanding of cloud computing platforms such as AWS, Google Cloud Platform (GCP), or Microsoft Azure.

      • Knowledge of cloud services and their integration to architect scalable applications.