GrowExx is looking for a smart and passionate Data Engineer (ETL) who will design and populate a bespoke data warehousing environment for our company.
Key Responsibilities
- Take overall responsibility for the implementation of the tasks allocated during the sprint
- Ensure the software is developed conforming to the project architecture, coding standards, and NFRs
- Analyze the user requirements, NFRs, and technical requirements for the project
- Identify any unknowns, e.g. missing scenarios, and consult with the PO to ensure they are defined either as a user story or UAC
- Identify ways to implement stories and select the approach that is best suited for the project.
- Consult with L2 as required
- Break down user stories along with the team to identify technical tasks
- Provide detailed estimates before the start of each sprint, working with the team to arrive at them
- Proactively pre-plan the sprints to achieve 90%+ confidence in delivery
- Create Technical documents as required for the project in Jira, Confluence, or other tools
- Provide POs and ADMs with daily updates on the team's progress via Jira and Slack
- Proactively communicate with other members of the team
- Provide HR and Management with any relevant information to help improve organizational culture and performance
Key Skills
- Strong proficiency in SQL is a must
- Experience with Python, especially pandas, is good to have
- Understanding of the underlying storage engines and how data is maintained
- Experience with ETL solutions like Informatica, SSIS, and Talend is a plus
- Familiarity with other SQL/NoSQL databases is a plus
- Experience with Snowflake and Apache Kafka is preferred
- Experience with Apache Storm/Spark is mandatory
- Proficient understanding of code versioning tools (Git/SVN/SourceSafe)
- Ability to create scripts for automation purposes
- Strong knowledge of Elastic (preferred)
- Strong knowledge of PowerBI
- Must have experience with Cloud Platforms like AWS/GCP
- Knowledge of Kubernetes and Docker would be an added advantage
- Determine data storage needs
- Create and enhance data solutions that enable seamless delivery of data; collect, parse, manage, and analyze large data sets
- Design, develop, automate, and support complex applications that extract, transform, and load data
- Ensure data quality
- Develop logical and physical data flow models for ETL applications
- Translate data access, transformation, and movement requirements into functional requirements and mapping designs
Education and Experience
- B.Tech or B.E. (Computer Science / Information Technology)
- 2+ years as an ETL Developer or in a similar role
Analytical and Personal Skills
- Must have sound logical reasoning and analytical skills
- Good communication skills in English – both written and verbal
- Demonstrate ownership of and accountability for their work
- Attention to detail