Chennai, Tamil Nadu
Specialty Development Senior #1030841
Job Description:
- Representing the Data Engineering Organization as a Google Cloud Platform (GCP) Data Engineer specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse on the Google Cloud Platform.
- This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency.
- This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices.
- You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics on GCP.
- You will be responsible for designing the transformation and modernization of data assets on GCP.
- Experience with large-scale solutions and with operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must.
- We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on the Google Cloud Platform.
Skills Required:
- BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner
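For illustration only, a minimal Apache Beam sketch of the kind of streaming pipeline this stack supports; the topic, table, and schema names are hypothetical placeholders, not from this posting:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # Streaming mode; on Dataflow this would run with --runner=DataflowRunner.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")  # hypothetical topic
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",          # hypothetical table
                    schema="event_id:STRING,ts:TIMESTAMP",  # hypothetical schema
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )

    if __name__ == "__main__":
        run()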
Skills Preferred:
- ETL
Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience
- 3+ years of Cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of and experience with key GCP services, especially those related to data processing (batch and real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage services including Cloud Storage, Bigtable, and Cloud Spanner (a minimal orchestration sketch follows this list).
- Experience developing microservice architectures on a container orchestration framework.
- Experience designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large global and diverse team
- Strong evidence of self-motivation to continuously develop own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support
- Evidence of a proactive mindset to problem solving and willingness to take the initiative.
- Strong prioritization, co-ordination, organizational and communication skills, and a proven ability to balance workload and competing demands to meet deadlines
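As referenced above, a minimal Cloud Composer (Airflow) sketch of batch orchestration against BigQuery; the DAG id, schedule, and SQL are hypothetical placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_warehouse_load",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # One daily BigQuery transformation step; a real pipeline would chain
        # ingestion, quality checks, and lineage capture around it.
        transform = BigQueryInsertJobOperator(
            task_id="transform_staging_to_curated",
            configuration={
                "query": {
                    # Hypothetical SQL for illustration only.
                    "query": "SELECT * FROM staging.events WHERE ts >= '{{ ds }}'",
                    "useLegacySql": False,
                }
            },
        )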
Experience Preferred:
- Professional Certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata to GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages and frameworks such as Python, Java, and/or Apache Beam.
- Experience of coaching and mentoring Data Engineers
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
- Advanced experience in Functional Architecture or Software Architecture.
- Effectively uses software configuration management (source control, DevSecOps, CI/CD, etc.).
- Proven experience with:
  - Java full-stack development (Spring Boot, microservices, React)
  - Persistence: buckets, PostgreSQL, Bigtable
- Works effectively on an agile team following agile practices, with internal software development groups as well as Tier I & II external suppliers.
- Cloud technologies experience (such as GCP, AWS, Azure).
- Experience with software operations (DevSecOps, SRE, observability, support/maintenance, etc.).
- Experience in secure coding practices and modern software development methodologies, such as pair programming and test-first/test-driven development, or demonstrated delivery of singular-focus programming.
- Proficient with Automation tools such as Selenium, Cucumber, REST Assured.
Education Required:
- Bachelor's Degree
Additional Information:
- Develop technical solutions for Data Engineering
- This role will work closely with teams in the US as well as Europe to ensure a robust, integrated migration aligned with global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal load-pattern sketch follows this list).
- Ensure timely setup of product for FCNA.
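For illustration only, a minimal batch load pattern (Cloud Storage into BigQuery) of the kind referenced above, using the google-cloud-bigquery client; the bucket and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/exports/customers/*.parquet",  # hypothetical source
        "my-project.curated.customers",                # hypothetical destination
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes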