12 Oct
International Software Systems
Ghaziabad
Senior AWS Engineer - Data Lake
Primary Responsibilities
- Design and develop data lakes; manage data flows that integrate information from various sources into a common data lake platform using an ETL tool
- Code and manage delta lake implementations on S3 using technologies like Databricks or Apache Hudi
- Triage, debug, and fix technical issues related to data lakes
- Design and develop data warehouses at scale
- Design and evaluate data models (star, snowflake, and flattened)
- Design data access patterns for OLTP and OLAP workloads
- Coordinate with business and technical teams through all phases of the software development life cycle
- Participate in making major technical and architectural decisions
- Maintain and manage code repositories using tools like Git
You Must Have:
- 5+ years of experience operating on AWS and building data lake architectures
- 3+ years of experience with AWS data services such as S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS, and Redshift
- 3+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.
- 3+ years of working experience with Apache Spark
- 3+ years of experience building delta lakes using technologies like Apache Hudi or Databricks
- 3+ years of experience with ETL tools and technologies
- 3+ years of experience in at least one programming language (Python, R, Scala, Java)
- Bachelor’s degree in computer science, information technology, data science, data analytics, or a related field
- Experience working on Agile projects and Agile methodology in general
- Strong RDBMS and data modelling skills
- AWS certification is a strong plus