Senior Data Engineer GCP

15 Oct | Logic Software Solutions | New Delhi

Title: Senior Data Engineer (GCP)

Location: New Delhi

Job Type: Full-Time

About Us: Logic Hire Software Solutions is an innovative, data-centric organization committed to leveraging advanced analytics and data engineering practices to inform strategic business decisions and improve client engagement. We are seeking a seasoned Senior Data Engineer to join our team and contribute to the optimization and expansion of our data infrastructure.

Job Description:

We are in search of a highly proficient Senior Data Engineer with over 12 years of hands-on experience in architecting and maintaining data infrastructure. The ideal candidate will possess extensive expertise in the Google Cloud Platform (GCP) and the BigQuery ecosystem, alongside a strong command of SQL, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), and Python. This role calls for a combination of technical acumen and strong interpersonal skills to engage successfully with business units while supporting the overall project lifecycle.

Key Responsibilities:

- Architect, implement, and maintain high-performance data pipelines utilizing GCP services, particularly BigQuery, Cloud Storage, Cloud Functions, and Dataflow, ensuring optimal data flow and accessibility.
- Design and write highly efficient, scalable SQL queries, including complex joins, CTEs, and aggregations, to enable robust data analysis and reporting across multiple operational facets.
- Develop ETL (Extract, Transform, Load) processes using SSIS for operational data integration, and leverage SSRS for generating executive-level reporting and analytics dashboards.
- Employ Python to create production-quality scripts and applications for data ingestion, transformation, and visualization, utilizing libraries such as Pandas, NumPy, or Apache Airflow for orchestrating workflows.
- Engage with cross-functional teams to elicit, document, and analyze business requirements, subsequently translating these into comprehensive technical specifications, data models, and workflows.
- Implement and uphold data governance frameworks to ensure data integrity, quality control, and security protocols across all data engineering processes.
- Monitor data pipelines and system performance metrics, identifying bottlenecks and implementing solutions to optimize throughput and minimize downtime.
- Provide analytical insights and recommendations to project and client management, facilitating data-driven decision-making.
- Mentor junior data engineering staff, cultivating an environment of knowledge sharing and professional development.
- Stay abreast of the latest trends in data engineering technologies, tools, and methodologies to continually refine our data practices.

Qualifications:

- Bachelor's degree in Computer Science, Engineering, Data Science, or a related discipline; a Master's degree is highly desirable.
- A minimum of 8 years of experience in the field of data engineering, particularly within GCP and the BigQuery architecture.
- Profound experience in formulating and executing complex SQL queries, and a solid understanding of relational database design principles.
- Advanced proficiency with SSIS for ETL processes and SSRS for business intelligence reporting.
- Strong programming skills in Python, with a focus on data manipulation and the development of scalable ETL solutions.
- Demonstrated ability in constructing, deploying, and maintaining data engineering pipelines utilizing modern best practices.
- Strong verbal and written communication skills, complemented by an ability to liaise effectively between technical teams and business stakeholders.
- Exceptional analytical and problem-solving capabilities, with a proactive approach to diagnosing and resolving issues.
- Working knowledge of data governance principles, compliance with data privacy regulations, and industry best practices.

Preferred Skills:

- Familiarity with additional GCP services such as Cloud Dataflow for stream/batch processing, Dataproc for managing Hadoop/Spark clusters, or Pub/Sub for messaging services.
- Understanding of machine learning concepts and frameworks (e.g., TensorFlow, scikit-learn) to integrate predictive analytics within data solutions.
- Experience working within Agile environments and proficiency with project management tools (e.g., JIRA, Trello).

What We Offer:

- A competitive salary and comprehensive benefits package.
- Opportunities for continued professional development and advancement within a cutting-edge environment.
- A collaborative workspace that encourages innovation and creativity.
- Flexible working options to support work-life balance.

If you possess the expertise and are eager to advance your career by driving impactful data initiatives at Logic Hire, we invite you to apply. Please submit your resume and a cover letter detailing your relevant qualifications and accomplishments.

Proficiency in English communication is required.

Experience in Data Engineering & Architecture (Data Modeling, ETL Processes, Data Pipeline Development, Data Integration, and Cloud Data Solutions (GCP))

Experience in Cloud Platforms (Google Cloud Platform (GCP), particularly BigQuery, Cloud Storage, Cloud Functions)

Experience in Big Data Tools (Hadoop, Spark, MapReduce, Pig, Hive, NoSQL, Apache Airflow)

Experience in Data Governance (data governance frameworks, ensuring data integrity, quality control, and security protocols)

Experience in Data Visualization & Reporting (Power BI, Tableau, SSIS, SSRS, Superset, Plotly)

Experience in Programming Languages (Python, SQL, R, Scala, C, C++, Java)

Experience in Database Technologies (Teradata, Oracle, SQL Server)

Note: Must be a Green Card holder or US Citizen.

