Iranian Company in Tech Industry
Annual Package: 1.8 – 2.4 billion Tomans
Responsibilities
· Design and build data engineering pipelines to ingest, transform, and serve data from multiple providers.
· Own cloud infrastructure, including deployment, monitoring, and scaling, in cloud-agnostic environments.
· Build and maintain the API layer that serves both frontend applications and external integrations.
· Set up CI/CD pipelines, observability tools, and reliability systems for fast and safe iteration.
· Collaborate with algorithms engineers to productionize ML models including serving and feature stores.
· Make architectural decisions that will define the platform for the next two to three years.
· Manage PostgreSQL databases and transition toward more robust data infrastructure over time.
· Implement model serving infrastructure, A/B testing frameworks, and online feature stores.
· Integrate multiple data sources including MLS feeds, property databases, and geographic datasets.
· Choose and implement technologies wisely while keeping builds simple and maintainable.
Requirements
· End-to-end builder who has taken a product from zero to production serving real users.
· Deep data engineering experience moving data reliably at scale through ETL or ELT pipelines.
· Data warehousing knowledge with practical experience in both streaming and batch processing.
· Infrastructure fluency with Kubernetes, Docker, and cloud platforms like AWS, GCP, or Azure.
· Terraform or equivalent infrastructure-as-code proficiency for production environment setup.
· Strong backend engineering skills in Python or Go with deep PostgreSQL knowledge.
· Experience with Redis, message queues, and distributed systems concepts in production.
· Four or more years building real systems with actual users, SLAs, and on-call rotations.
· Has previously built infrastructure at an early-stage startup and understands the trade-off between "proper" and "good enough."
· Experience with ML infrastructure including model serving, feature stores, or experiment tracking.