Senior DataOps Engineer

@Indium Software
  • Chennai, Tamil Nadu, India
  • Post Date : June 27, 2025
  • Salary: Rs300,000.00 - Rs3,000,000.00 / Yearly

Job Description

Experience: 6–12 Years

Location: Chennai

Job Type: Full-Time

Role Overview:

We are seeking a highly motivated and experienced Senior DataOps Engineer to join our Data Movement team. This role suits individuals who enjoy collaborative problem-solving, innovative thinking, and ensuring the reliability and performance of large-scale data systems. The ideal candidate will bring a strategic mindset, technical expertise in DataOps practices, and a proactive approach to infrastructure stability, automation, and scalability.

Key Responsibilities:
  • Design, build, and manage robust and scalable data pipelines and workflows.
  • Collaborate with data engineers, analysts, and architects to understand data needs and ensure optimal delivery pipelines.
  • Manage and maintain data movement infrastructure, ensuring high availability and minimal latency.
  • Implement automation for data ingestion, transformation, and delivery processes using modern DevOps and DataOps tools.
  • Monitor and troubleshoot issues in real-time to ensure stability, performance, and availability of the data infrastructure.
  • Drive CI/CD practices in the data ecosystem, including testing, deployment automation, and release management.
  • Implement and maintain observability tools (logging, monitoring, alerting) across the data infrastructure.
  • Apply data quality controls and enforce governance policies across the data lifecycle.
  • Stay up to date with emerging technologies and continuously improve existing systems and processes.

Required Skills & Qualifications:
  • 6–12 years of experience in DataOps, DevOps, or Data Engineering.
  • Strong expertise in scripting languages such as Python and Shell.
  • Experience with ETL tools (e.g., Talend, Informatica, Apache NiFi) and workflow orchestration tools (e.g., Apache Airflow, Luigi).
  • Strong knowledge of cloud platforms (AWS/GCP/Azure) and associated data services (e.g., S3, BigQuery, Redshift, Snowflake).
  • Proficiency in managing data pipelines and integrating with relational and non-relational databases.
  • Hands-on experience with CI/CD pipelines, Git, Jenkins, Docker, Kubernetes.
  • Familiarity with infrastructure-as-code (e.g., Terraform, CloudFormation).
  • Understanding of data security, compliance, and governance best practices.
  • Excellent problem-solving skills and a proactive attitude toward incident management.
  • Strong communication and interpersonal skills; ability to work cross-functionally.

Preferred Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or related technical field.
  • Certifications in cloud platforms (e.g., AWS Certified Data Analytics, GCP Data Engineer).
  • Experience in Agile/Scrum methodologies.

What We Offer:
  • Dynamic, innovation-driven work environment.
  • Opportunities to work on cutting-edge data technologies.
  • Continuous learning and growth through training and certifications.
  • Competitive compensation and benefits.
