Reporting and Analytics – Senior

Full-Time @Indium Software
  • Bengaluru, Karnataka, India
  • Post Date : June 27, 2025
  • Salary: Rs450,000.00 - Rs5,000,000.00 / Yearly

Job Description

Experience: 6–8 Years

Location: Bengaluru, Karnataka, India

Employment Type: Full-Time

Role Overview

We are looking for an experienced and passionate Reporting and Analytics Senior with deep technical expertise in Google Cloud Platform (GCP), BigQuery, and Java. The ideal candidate will have a strong foundation in data engineering and analytics, with proven experience designing and developing scalable, high-performance data pipelines using modern cloud technologies.

Key Responsibilities
  • Develop and optimize large-scale data processing pipelines using Apache Beam / Cloud Dataflow (batch and streaming) in Java and optionally Python.
  • Design and implement end-to-end data pipelines using Apache Airflow / Cloud Composer on GCP.
  • Work with BigQuery to design scalable, efficient, and secure data storage and querying solutions.
  • Collaborate with stakeholders to understand reporting and analytics requirements and translate them into robust technical solutions.
  • Drive the implementation of data ingestion, curation, and transformation processes across multiple data sources.
  • Ensure data quality, performance, and scalability using best practices.
  • Apply data warehousing principles and implement SQL standards across various analytics use cases.
  • Mentor junior team members and contribute to continuous process and platform improvements.
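For illustration, the batch and streaming transforms described above usually reduce to per-record functions. Below is a minimal sketch in core Java of the element-wise logic that a Beam `DoFn`'s `@ProcessElement` method would typically wrap; the class, record, and field names are hypothetical, and a real Dataflow job would additionally need the Beam SDK on the classpath.

```java
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;

// Hypothetical per-record transform of the kind a Beam DoFn would wrap:
// parse a raw "userId,amount" CSV line, drop malformed rows, and emit
// a typed event. Names here are illustrative, not a real schema.
public class EventTransform {

    // Minimal parsed-event type (illustrative only).
    public record Event(String userId, double amount) {}

    // Parse one line; return empty for malformed input so the pipeline
    // can route bad rows to a dead-letter output instead of failing.
    public static Optional<Event> parse(String line) {
        String[] parts = line.split(",", -1);
        if (parts.length != 2 || parts[0].isBlank()) {
            return Optional.empty();
        }
        try {
            return Optional.of(new Event(parts[0], Double.parseDouble(parts[1])));
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
    }

    // Batch-style application over a bounded collection, mirroring a
    // ParDo over a PCollection<String>.
    public static List<Event> transform(List<String> lines) {
        return lines.stream()
                .map(EventTransform::parse)
                .flatMap(Optional::stream)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Event> out = transform(List.of("u1,100.5", "bad-row", "u2,20"));
        System.out.println(out.size()); // two valid events survive the filter
    }
}
```

In a streaming context the same function applies unchanged; windowing and triggering are configured on the pipeline, not in the record-level logic, which is what keeps such transforms testable in isolation.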
Required Skills and Experience
  • 6–8 years of total IT experience, with 6+ years in data engineering.
  • Strong hands-on expertise in Core and Advanced Java (must-have).
  • Proven experience with GCP BigQuery.
  • Experience building Apache Beam / Dataflow pipelines in Java (and preferably Python).
  • Hands-on experience with Apache Airflow / Cloud Composer for orchestration.
  • Deep understanding of data warehousing concepts, ETL best practices, and stream/batch data processing.
  • Strong SQL skills and experience implementing complex queries and optimization techniques.
  • Good understanding of data governance, security, and data lifecycle management.
Preferred Skills
  • Experience with Python for data processing and scripting.
  • Familiarity with CI/CD pipelines for data deployments.
  • Exposure to real-time analytics, dashboards, and reporting tools (e.g., Looker, Data Studio).
Key Attributes
  • Analytical mindset with strong problem-solving skills.
  • Ability to work independently and as part of a cross-functional team.
  • Excellent communication skills and stakeholder management.
  • Adaptable and eager to stay current with evolving GCP tools and data engineering practices.

Join us to build intelligent, scalable, and impactful data solutions that drive business decisions. If you are passionate about cloud data engineering and enjoy solving complex problems, we would love to hear from you.
