SDET - Data Quality

  • Industry: Other
  • Category: Programming/Software Development
  • Location: Kathmandu, Nepal
  • Expiry date: Jul 31, 2025
Job Description
Key Responsibilities:
  • Develop and maintain automated test frameworks for data pipelines, data migrations, and reporting systems using Python, Java, or JavaScript.
  • Perform end-to-end validation across all stages of the ETL process, verifying extraction, transformation, and loading logic as well as business rules, to ensure data accuracy, completeness, and consistency (see the pytest sketch after this list).
  • Conduct functional and performance testing using tools such as JMeter to ensure data systems meet performance requirements.
  • Write and optimize complex SQL and MongoDB queries for data testing and validation (see the MongoDB sketch after this list).
  • Monitor data quality and integrity, generate data quality reports, and work with stakeholders to resolve data issues.
  • Collaborate with engineering and analytics teams to understand data models, pipelines, and transformation logic.
  • Document test strategies, plans, results, and best practices clearly for both technical and non-technical stakeholders.
  • Recommend and implement improvements to QA practices, automation workflows, and data testing methodologies.
  • Participate actively in Agile/Scrum processes and contribute to sprint planning and retrospectives.
  • Stay up to date with emerging trends and tools in data quality, testing, and automation.
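
To give a concrete flavor of the ETL validation work described above, here is a minimal pytest sketch, not a prescribed framework: it checks row-count parity and null-key violations between a source and a target table. The orders_src/orders_tgt tables and the in-memory SQLite database are hypothetical stand-ins for real pipeline stages.

```python
# Minimal ETL validation sketch (illustrative only).
# orders_src/orders_tgt are hypothetical tables; an in-memory SQLite
# database stands in for the real source and target systems.
import sqlite3
import pytest

@pytest.fixture
def db():
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders_src (id INTEGER PRIMARY KEY, amount REAL, customer_id INTEGER);
        CREATE TABLE orders_tgt (id INTEGER PRIMARY KEY, amount REAL, customer_id INTEGER);
        INSERT INTO orders_src VALUES (1, 10.0, 100), (2, 25.5, 101);
        INSERT INTO orders_tgt SELECT * FROM orders_src;  -- simulate the load step
    """)
    yield conn
    conn.close()

def test_row_count_parity(db):
    # Completeness: every source row should arrive in the target.
    src = db.execute("SELECT COUNT(*) FROM orders_src").fetchone()[0]
    tgt = db.execute("SELECT COUNT(*) FROM orders_tgt").fetchone()[0]
    assert src == tgt

def test_no_null_business_keys(db):
    # Accuracy: business keys must never be NULL after transformation.
    nulls = db.execute(
        "SELECT COUNT(*) FROM orders_tgt WHERE customer_id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```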
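On the MongoDB side, a common data-quality check is an aggregation over a business key to surface duplicates. The sketch below is an assumption-laden example: it presumes a local mongod at localhost:27017 and a hypothetical analytics.events collection with an event_id field; both names are invented for illustration.

```python
# Duplicate-detection sketch for MongoDB (illustrative only).
# Assumes a local mongod at localhost:27017 and a hypothetical
# "analytics.events" collection keyed by event_id.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
events = client["analytics"]["events"]

# Group by business key and keep any key that appears more than once.
pipeline = [
    {"$group": {"_id": "$event_id", "count": {"$sum": 1}}},
    {"$match": {"count": {"$gt": 1}}},
]

duplicates = list(events.aggregate(pipeline))
assert not duplicates, f"Duplicate event_ids found: {duplicates[:5]}"
```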
Required Qualifications:
  • 5+ years of experience in automated testing, integration testing, and data validation.
  • Strong experience with test automation frameworks and scripting in Python, Java, or JavaScript.
  • Hands-on experience with test case management tools and continuous integration (e.g., Jenkins, GitHub Actions).
  • Proven experience with cloud platforms (AWS, Azure, GCP) and data services such as S3, Redshift, BigQuery, Snowflake, or Athena.
  • Experience with ETL tools and data formats (e.g., AWS Glue, Apache Parquet); see the Parquet sketch after this list.
  • Familiarity with data quality tools such as Talend Data Quality or Informatica Data Quality.
  • Exposure to performance testing tools like JMeter is a plus.
  • Good understanding of data governance and compliance standards (e.g., GDPR, HIPAA, CCPA).
  • Proficient in Git-based version control (GitFlow model, GitHub, Bitbucket).
  • Exposure to Linux OS, Docker, and Kubernetes environments.
  • Ability to work efficiently across Windows, Linux, and macOS environments.
  • Strong analytical and problem-solving skills with keen attention to detail.
  • Excellent communication skills with the ability to convey complex data issues to technical and non-technical stakeholders.
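
As an illustration of the Parquet experience listed above, here is a short sketch using pyarrow that round-trips a file and asserts schema stability and a null-rate threshold; the column names and the 50% threshold are assumptions, not requirements from this posting. A check like this could run as a CI step in Jenkins or GitHub Actions against each new pipeline output.

```python
# Parquet round-trip and schema check sketch (illustrative only).
# Column names and the null-rate threshold are assumptions.
import pyarrow as pa
import pyarrow.parquet as pq

# Build a small table standing in for one pipeline output partition.
table = pa.table({
    "order_id": pa.array([1, 2, 3], type=pa.int64()),
    "amount": pa.array([10.0, None, 25.5], type=pa.float64()),
})
pq.write_table(table, "orders.parquet")

loaded = pq.read_table("orders.parquet")

# Schema check: column names and types must survive the round trip.
assert loaded.schema.equals(table.schema)

# Quality check: flag columns whose null rate exceeds a threshold.
for column in loaded.column_names:
    null_rate = loaded[column].null_count / loaded.num_rows
    assert null_rate <= 0.5, f"{column} null rate {null_rate:.0%} too high"
```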
Nice to Have:
  • Experience with Power BI or other BI/reporting tools.
  • Exposure to research automation systems/applications.
  • Proficiency in additional languages (e.g., German, Japanese, French, Spanish) is a plus, as the company is expanding globally.