We are looking for a skilled and experienced Data Engineer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Location: Not a constraint for the right candidate.
Primary Skills:
AWS, Python, SQL / PL/SQL
Secondary Skills:
Airflow, Python, DBT, Fivetran, Kafka, Looker, Tableau
Role Description:
This data engineering role requires creating and managing the technological infrastructure of a data platform: being in charge of, or involved in, architecting, building, and managing data flows/pipelines; constructing data stores (SQL and NoSQL); working with big-data tools (Hadoop, Kafka); and using integration tools to connect sources and other databases.
Role Responsibility:
Translate functional specifications and change requests into technical specifications
Translate business requirement document, functional specification, and technical specification to related coding
Develop efficient code with unit testing and code documentation
Ensure accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
Set up the development environment and configure the development tools
Communicate with all the project stakeholders on the project status
Manage, monitor, and ensure the security and privacy of data to satisfy business needs
Contribute to the automation of modules, wherever required
Be proficient in written, verbal, and presentation communication (English)
Coordinate with the UAT team
Role Requirement:
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
Knowledgeable in Shell / PowerShell scripting
Knowledgeable in relational databases, nonrelational databases, data streams, and file stores
Knowledgeable in performance tuning and optimization
Experience in data profiling and data validation
Experience in requirements gathering, documentation processes, and performing unit testing
Understanding and implementing QA and the various testing processes in the project
Knowledge of any BI tool will be an added advantage
Sound aptitude, outstanding logical reasoning, and analytical skills
Willingness to learn and take initiatives
Ability to adapt to fast-paced Agile environment
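The slowly changing dimensions concept listed above can be illustrated with a minimal Type 2 SCD upsert. This is a rough sketch in plain Python, with invented row and key names; real implementations would run as SQL MERGE statements or dbt snapshots against the warehouse.

```python
from datetime import date

def scd2_upsert(dimension, incoming, today=None):
    """Minimal Type 2 slowly-changing-dimension upsert (illustrative only).

    dimension: list of row dicts with keys
        'key', 'attrs', 'valid_from', 'valid_to' (valid_to=None marks the current row)
    incoming:  dict mapping business key -> latest attribute dict
    """
    today = today or date.today()
    for key, attrs in incoming.items():
        current = next(
            (r for r in dimension if r["key"] == key and r["valid_to"] is None),
            None,
        )
        if current is None:
            # New business key: insert a fresh current row.
            dimension.append(
                {"key": key, "attrs": attrs, "valid_from": today, "valid_to": None}
            )
        elif current["attrs"] != attrs:
            # Attribute change: close out the old row, open a new current one.
            current["valid_to"] = today
            dimension.append(
                {"key": key, "attrs": attrs, "valid_from": today, "valid_to": None}
            )
        # Unchanged rows are left alone, preserving their history.
    return dimension
```

The key property is that history is never overwritten: a changed attribute closes the old row and opens a new one, so point-in-time queries remain possible.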
Additional Requirement:
Experience with ETL/ELT tools (DBT, Fivetran, Airflow) to develop jobs for extracting, cleaning, transforming, and loading data into a DWH
Ability to translate business requirements into functional/technical specifications, integrate multiple data sources and databases, and create database schemas
Understanding of Python's threading limitations (the GIL), event-driven programming concepts, multi-process architecture, and accessibility and security compliance
Knowledge of user authentication & authorization between multiple systems, servers, & environments
Understanding of fundamental design principles for scalable application
Knowledge of code versioning tools such as Git
Experience with Amazon Web Services using the Boto SDK, and with Jenkins for continuous integration and delivery (CI/CD)
Strong knowledge of Amazon's AWS offerings: RDS, Redshift, S3, EC2, ECS, Data Pipeline, Glue, Spectrum, Lambda
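As a rough illustration of the extract-clean-transform-load pattern that the tools above automate, here is a minimal, self-contained sketch. It uses Python's standard-library sqlite3 as a stand-in for a real warehouse, and all table and column names (`fact_orders`, `order_id`, `amount`) are invented for the example.

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Toy ETL job: clean raw source rows and load them into a warehouse table.

    raw_rows: iterable of (order_id, amount_text) tuples from a source system.
    conn:     sqlite3 connection standing in for a real DWH (illustrative).
    """
    # Transform: drop rows with missing ids, cast amounts, skip bad values.
    cleaned = []
    for order_id, amount_text in raw_rows:
        if order_id is None:
            continue
        try:
            cleaned.append((order_id, float(amount_text)))
        except (TypeError, ValueError):
            continue  # skip (or quarantine) unparseable amounts

    # Load: idempotent upsert keyed on order_id, so re-runs are safe.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO fact_orders (order_id, amount) VALUES (?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET amount = excluded.amount",
        cleaned,
    )
    conn.commit()
    return len(cleaned)
```

The upsert keyed on the business key makes the load idempotent: re-running the same job does not duplicate rows, which is the property that pipeline orchestrators such as Airflow rely on when retrying failed tasks.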