MX.JobDiagnosis

Tech Lead Python Engineer Synchronizer

HeyDonto AI API - Cuauhtémoc


Job Description

WE ARE LOOKING FOR YOU IF YOU ARE A TECH LEAD PYTHON ENGINEER

Job Summary

HeyDonto is seeking a Tech Lead Python Engineer to lead the development of our healthcare data synchronization platform. You'll guide a team of engineers in building cloud-native microservices to orchestrate complex, event-driven data flows. This role emphasizes hands-on engineering, mentorship, and strong system design, with a particular focus on Kafka-based event processing, asynchronous Python, and FHIR-compliant healthcare data pipelines on Google Cloud Platform.

Technical Responsibilities

System Design & Leadership
- Lead the development of microservices using FastAPI and Uvicorn/Uvloop in an asynchronous architecture
- Guide system design for Kafka-based, event-driven workflows with exactly-once and idempotent semantics
- Architect and implement the outbox pattern for reliable event publishing and distributed consistency
- Collaborate cross-functionally to define scalable and fault-tolerant system boundaries
- Advocate for observability, CI/CD hygiene, and defensive programming practices

Backend Engineering
- Develop and maintain type-safe, asynchronous Python (Python 3.10+) using mypy in strict mode
- Implement robust retry logic with Stamina and structured error handling with Sentry
- Create scalable API endpoints with OpenAPI documentation and testable interfaces via dependency injection
- Write concurrent HTTP clients for high-throughput web scraping and data extraction
- Automate data capture tasks using Playwright for browser-based workflows
- Use Attrs (instead of dataclasses) for structured, immutable data modeling

Data Engineering & Healthcare Integration
- Design and build ETL pipelines that ingest and transform FHIR healthcare data
- Work with FHIR Resources (v8.0.0) and the GCP FHIR Store for secure, standards-compliant data exchange
- Validate and serialize healthcare data using Pydantic-Avro and schema-driven approaches
- Ensure data lineage, provenance, and schema transformation integrity across the pipeline
- Implement entity resolution and high-throughput data mapping mechanisms

Infrastructure & Platform Operations
- Containerize services using Docker, optimizing for reliability and reproducibility
- Collaborate on CI/CD pipelines, ensuring test coverage and deployment safety
- Monitor and trace production systems with structured logging and Sentry error tracking
- Deploy and scale services on Google Cloud Platform (GCP) using Kubernetes
- Configure service health checks, metrics, and autoscaling policies

Technical Requirements

Core Languages & Frameworks
- Expert Python (6+ years) with deep async programming and type-safety practices
- FastAPI, SQLAlchemy, and PostgreSQL (via psycopg2-binary)
- pytest with parameterized, integration, and mocking test strategies

Distributed Systems & Event-Driven Architecture
- Proficient in Apache Kafka, including:
  - Designing idempotent consumers and event processors
  - Handling message serialization, ordering, and partitioning
  - Implementing the outbox pattern for reliable message delivery
- Familiar with CQRS, event sourcing, and asynchronous message orchestration

Data & Healthcare Integration
- Hands-on experience with FHIR data models and healthcare interoperability standards
- Working knowledge of Google Cloud FHIR Store and GCP data services
- Experience with schema validation, Avro serialization, and data integrity in distributed systems

Tooling & Infrastructure
- Docker and multi-stage container builds
- Google Cloud Platform (GCP) for deployment and data services
- Kubernetes scaling and configuration strategies
- Observability tooling, including logging, metrics, health checks, and Sentry
- Familiarity with Uvicorn/Uvloop, Stamina, Attrs, and Pydantic-Avro

Leadership & Collaboration
- Proven ability to mentor and unblock team members
- Strong communication across engineering, product, and ops teams
- Skilled at code reviews, system documentation, and technical planning
- Ownership mindset with a focus on team productivity and technical quality

Technical Skills Assessment Areas
- Kafka event stream design with idempotent and fault-tolerant processing
- Type-safe, async Python with concurrency and high reliability
- Schema transformation and validation for FHIR-compliant data
- Design of retry strategies and distributed error handling
- CI/CD automation and deployment on GCP
- Testability and observability of complex backend services

This role is ideal for a senior engineer ready to lead a team through complex challenges in healthcare data integration, distributed systems, and event-based architecture, with a hands-on approach to engineering excellence.

English Level: Native or Advanced

Hiring Details
- Work Type: Hybrid
- City: Guadalajara, Jalisco, Mexico
- Salary Range: 90k to 140k MXN per month, plus sign-on bonus, PTO, and relocation bonus

If you are interested in applying, please send your CV in English to , mentioning the name of the position in the subject line of the email. In the body of the email, please include the following information:
- Salary expectations
- Availability for interview
- Availability to join the team

