Data Test Engineer (SDET)
About the project
The project is a leading free streaming television service in the United States, delivering 100+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies. Millions of viewers tune in each month to watch premium news, TV shows, movies, sports, lifestyle, and trending digital series. The service is available on all mobile, web, and connected TV streaming devices.
At the project, we approach testing differently: we test, and break, code constantly, but we help rebuild it better. Data Test Engineers (SDETs) test and verify the streaming service's data pipeline applications, built with Java and Apache Kafka, and work closely with the data development teams to validate events and analytics.

This role is a DTE with a focus on data application validation. In this role you will apply your SDET and SQL experience to verify multiple data applications and tools. This DTE will work with Business Intelligence analysts and developers to make sure data and application quality and integrity are maintained. The position requires strong SDET experience with a focus on data, knowledge of data pipelines from raw data to reporting, and demonstrable SQL skills. The DTE will also represent the Software Test Engineering Team in scrum meetings and work alongside product management and development teams to improve quality coverage for the applications supported.

Responsibilities:
- Work with project development teams to implement analytics features in client applications;
- Design and develop manual and automated test cases in Java to validate new or existing data integration solutions against data pipeline business requirements;
- Verify applications and tools built on data warehousing platforms such as AWS Redshift, Snowflake, or other columnar databases;
- Develop best practices for data integration/streaming;
- Design and develop data integration/engineering workflows on Big Data technologies and platforms;
- Verify the capture of analytics events in related file systems or databases using SQL or a scripting language (Python, Java, shell scripting, etc.);
- Work with Business Intelligence and Product Management to create test strategies, plans and cases that provide acceptable coverage for a given data pipeline, from event creation to reporting;
- Work in an Agile Software Delivery methodology, highly focused on creating data validation tests based on requirements;
- Provide a risk assessment on the defects identified and set the correct priority and severity.
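As a purely hypothetical illustration of the kind of data validation work described above (the table name, schema, and event types are invented for this sketch, not part of the actual project), a scripted check that captured analytics events match expected counts might look like:

```python
import sqlite3

def validate_event_counts(conn, expected_counts):
    """Compare per-event-type row counts against expected values.

    Returns a list of (event_type, expected, actual) tuples for any
    mismatches; an empty list means the check passed.
    """
    failures = []
    for event_type, expected in expected_counts.items():
        row = conn.execute(
            "SELECT COUNT(*) FROM analytics_events WHERE event_type = ?",
            (event_type,),
        ).fetchone()
        actual = row[0]
        if actual != expected:
            failures.append((event_type, expected, actual))
    return failures

# An in-memory SQLite database stands in for the real event store;
# in practice this would be a connection to Redshift, Snowflake, etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE analytics_events (event_type TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO analytics_events VALUES (?, ?)",
    [("play_start", "{}"), ("play_start", "{}"), ("ad_beacon", "{}")],
)

failures = validate_event_counts(conn, {"play_start": 2, "ad_beacon": 1})
print(failures)  # an empty list means the pipeline captured what we expected
```

Real validations would run against the warehouse or streaming sink itself, but the pattern is the same: express the expected state as data, query the actual state with SQL, and report every discrepancy with enough detail to set priority and severity.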
Qualities / Experience We’re Seeking:
- 5+ years of Quality Assurance/Testing experience;
- 3+ years of Data Quality experience, or SDET experience with a focus on data, data warehousing, reporting, etc.;
- 3+ years of testing experience working within an Agile environment, and with Agile Management tools such as JIRA;
- Experience with Automation Framework development using Java;
- Experience with Performance Test Design, Development and load testing execution;
- Design, create, and maintain assets used to execute performance tests, and contribute to the execution and monitoring of performance tests using Apache JMeter, LoadRunner, or similar tools;
- Working knowledge of Java, the JVM, Spring Boot, data warehousing, data integration, SQL Server, Apache Kafka, data streaming, Big Data, MongoDB, SQL, Web Services, microservices, ETL, change data capture (CDC), DevOps;
- Strong SQL experience, with knowledge of AWS Redshift, Snowflake, or columnar databases;
- Experience with reporting or analytics tools like Tableau or Mode;
- Experience working with Amazon Web Services, querying and working with data in various AWS services;
- Programming experience in a language such as Python, Java, etc. for the purposes of parsing files and running queries;
- Experience with analytics implementations (network events, ad beacons, user action events, etc.) in a web or mobile application;
- Spoken English at B1 level or higher.