Senior Data Engineer (AWS / Databricks / SQL)

Location: Mumbai
Functional Practices: Data Engineering
Job type: Permanent
Contact name: Jagrati Taneja

Contact email: jagrati.taneja@crescendogroup.in
Job ref: 97738
Published: about 16 hours ago

Vacancy Details

 

Company

GEG

Position (Designation)

Senior Data Engineer

Location

Vikhroli, Mumbai

Function

Digital

Job Profile

KRA (Key Result Areas)

  1. Identifying & driving Digital Transformation initiatives across G&B
  2. Creating and implementing cloud-based architecture
  3. Creating and implementing advanced analytics use cases
  4. Collaborating with internal and external teams
  5. Creating presentations/proposals on value proposition

 

Job Description:

We are seeking a Senior Data Engineer for our Data CoE team to help us implement advanced analytics solutions in both on-premises and AWS environments within our Enterprise Data Lake.

The incumbent must have a sound understanding of databases, relational structures, dimensional data modelling, structured query language (SQL), data warehousing, and reporting techniques.
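To make the expectation concrete, here is a minimal sketch (not part of the original posting) of the kind of dimensional-model SQL work involved, using hypothetical fact_sales and dim_product tables on a Spark/Databricks session; table names, columns, and values are illustrative assumptions only.

```python
# Minimal star-schema sketch: join a fact table to a dimension and aggregate.
# All table and column names here are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dimensional-model-sketch").getOrCreate()

# Hypothetical sales fact keyed to a product dimension.
spark.createDataFrame(
    [(1, 101, 2, 250.0), (2, 102, 1, 99.0), (3, 101, 5, 625.0)],
    ["sale_id", "product_key", "quantity", "amount"],
).createOrReplaceTempView("fact_sales")

spark.createDataFrame(
    [(101, "Widget", "Hardware"), (102, "Gadget", "Electronics")],
    ["product_key", "product_name", "category"],
).createOrReplaceTempView("dim_product")

# Typical warehouse-style reporting query.
report = spark.sql("""
    SELECT d.category,
           SUM(f.amount)   AS total_revenue,
           SUM(f.quantity) AS units_sold
    FROM fact_sales f
    JOIN dim_product d ON f.product_key = d.product_key
    GROUP BY d.category
    ORDER BY total_revenue DESC
""")
report.show()
```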

 

This position is responsible for the successful delivery of advanced analytics and ML/AI solutions for various use cases.

 

  1. Identifying & driving Digital Transformation initiatives across G&B
  • Work with business teams to understand the business needs and how emerging technologies can be leveraged.
  • Ideation, conceptualization, and analysis of Digital opportunities across the value chain, including manufacturing, supply chain, workplace, and customer experience.
  • Work closely with vendors for planning, executing & supporting Digital projects.
  • Work closely with end users to understand the use case in question.
  • Design and implement the use case.

 

  2. Creating and implementing cloud-based architecture
  • Hands-on experience in Python programming and deploying ML models; a minimal deployment sketch follows below.
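The sketch below is illustrative only and not taken from the posting: it shows the basic train, persist, reload, and predict pattern behind deploying an ML model. The model choice, synthetic features, and artifact file name are assumptions.

```python
# Minimal train -> persist -> reload -> predict sketch (illustrative only).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
import joblib

# Train a simple classifier on synthetic data.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Persist the trained model; in a real deployment this artifact would be
# versioned (for example in MLflow or S3) and loaded by a serving job.
joblib.dump(model, "example_model.joblib")

# Later, in the serving process: reload and score new records.
loaded = joblib.load("example_model.joblib")
print(loaded.predict(X[:5]))
```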

 

 

  3. Creating and implementing advanced analytics use cases
  • Work closely with end users to understand the use case in question.
  • Design and implement the use case.

 

 

  4. Collaboration with internal and external teams
  • Ability to collaborate on a team with infrastructure, BI report development, and business analyst resources, and to clearly communicate solutions to both technical and non-technical team members.
  • Coordinate with vendors and bridge the gap between teams.

 

  5. Creating presentations/proposals on value proposition
  • Create high-quality presentations and value propositions to build stakeholder buy-in.
  • Engage with internal stakeholders to evangelize the benefits of Digital technologies and drive cultural change.

 

 

Requisite Qualification

 

Essential

 

 

Bachelor's in Engineering

(Computer Application / Information Technology / Electronics and Telecommunication)

 

Preferred

 

Data Engineering or Data Science Certification

 

Requisite Experience

Essential

4-7 yrs of relevant experience

Preferred

  • AWS Certified Database - Specialty, or
  • AWS Certified Data Analytics - Specialty

Skills Required

Special Skills Required

  • Databricks (an illustrative PySpark sketch follows this list)
  • AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
  • Big data: Databricks, Spark, Glue, and Athena
  • Expertise in Lake Formation, Python programming, Spark, Shell scripting
  • Minimum Bachelor’s degree with 5+ years of experience in designing, building, and maintaining Databricks and AWS data components
  • 3+ years of experience in data component configuration, related roles and access setup
  • Expertise in Python programming
  • Knowledge in all aspects of DevOps (source control, continuous integration, deployments, etc.)
  • Strong hands-on coding skills in Python for processing large-scale data sets and developing machine learning models; experience programming in Python, R, and SQL.
  • Expertise in deploying use cases and AI/ML models in standalone and cloud-based systems and services.
  • Comfortable working with DevOps: Jenkins, Bitbucket, CI/CD
  • Hands-on ETL development experience, preferably using SSIS.
  • SQL Server experience required
  • Strong analytical skills to solve and model complex business requirements.
  • Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL), data warehousing, and reporting techniques.
  • Comfortable creating data models and visualizations using Power BI
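As a concrete illustration of the Databricks/AWS work described above, here is a minimal sketch (not part of the original posting): it reads raw data from S3, applies a basic transformation in PySpark, and writes a Delta table. The bucket paths and column names are hypothetical, and it assumes a Databricks cluster (or a Spark session with Delta Lake configured) with IAM access to the buckets.

```python
# Minimal S3 -> transform -> Delta sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-to-delta-sketch").getOrCreate()

# Bronze layer: raw CSV landed in S3 (for example via DMS or a batch extract).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-bucket/orders/")          # hypothetical path
)

# Silver layer: basic cleansing and derived columns.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write as a Delta table so downstream Athena/Redshift/Power BI consumers
# benefit from ACID guarantees and time travel.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .save("s3://example-curated-bucket/orders_silver/")  # hypothetical path
)
```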

 

Preferred Skills Required

  • Good understanding of machine learning techniques and algorithms, including clustering, anomaly detection, optimization, neural networks, etc. (a minimal anomaly-detection sketch follows this list)
  • Comfortable deploying AI/ML models (MLOps) in standalone and cloud-based systems and services.
  • Experience working in a Scrum environment.
  • Experience in Administration (Windows/Unix/Network/Database/Hadoop) is a plus.
  • Experience in SQL Server, SSIS, SSAS, SSRS
  • Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members.
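For the anomaly-detection technique named above, here is a minimal sketch (illustrative only, not from the posting) using scikit-learn's IsolationForest on synthetic data; the data, contamination rate, and threshold behaviour are assumptions.

```python
# Minimal anomaly-detection sketch with an isolation forest on synthetic data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic "normal" readings plus a few injected outliers.
normal = rng.normal(loc=50.0, scale=5.0, size=(500, 2))
outliers = rng.uniform(low=0.0, high=120.0, size=(10, 2))
readings = np.vstack([normal, outliers])

# Fit an isolation forest; contamination is the assumed outlier fraction.
detector = IsolationForest(contamination=0.02, random_state=42).fit(readings)

# predict() returns -1 for anomalies and 1 for normal points.
labels = detector.predict(readings)
print(f"flagged {np.sum(labels == -1)} of {len(readings)} readings as anomalous")
```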