Chennai, Tamil Nadu
Software Engineer Practitioner #1031692
Job Description:
- We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform.
- In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment.
- You'll work with GCP-native technologies like BigQuery, Dataform, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance.
- This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering.
Basic Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field of study
- 5+ Years - Strong understanding of database concepts and experience with multiple database technologies, optimizing query and data processing performance.
- 5+ Years - Full-stack data engineering competency in a public cloud (Google Cloud Platform)
- Critical thinking skills to propose data solutions, test them, and make them a reality.
- 5+ Years - Highly proficient in SQL, Python, and Java; experience programming data transformations in Python or a similar language.
- 5+ Years - Ability to work effectively across organizations, product teams and business partners.
- 5+ Years - Knowledge of Agile (Scrum) methodology and experience writing user stories
- Deep understanding of data service ecosystems, including data warehouses, data lakes, and data marts
- User experience advocacy through empathetic stakeholder relationships.
- Effective communication both internally (with team members) and externally (with stakeholders)
- Knowledge of data warehouse concepts and experience with data warehouse/ETL processes
- Strong process discipline and thorough understanding of IT processes (ISP, data security).
Skills Required:
- Data Architecture, Data Warehousing, Dataform, Google Cloud Platform (BigQuery, Dataflow, Dataproc, Data Fusion), Terraform, Tekton, Cloud SQL, Airflow, PostgreSQL, PySpark, Python, APIs
Experience Required:
- Excellent communication, collaboration, and influence skills; ability to energize a team.
- Knowledge of data, software, and architecture operations; data engineering; and data management standards, governance, and quality
- Hands-on experience in Python using libraries like NumPy, Pandas, etc.
- Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Dataform, Pub/Sub
- Experience with recoding, redeveloping, and optimizing data operations, data science, and analytical workflows and products.
Education Required: Bachelor's Degree
Additional Information:
- Interact with GDIA product lines and business partners to understand data engineering opportunities, tooling and needs.
- Collaborate with Data Engineering and Data Architecture to design and build templates, pipelines, and data products, including automation, transformation, and curation, using best practices
- Develop custom cloud solutions and pipelines with GCP-native tools: Dataprep, Data Fusion, Dataflow, Dataform, and BigQuery
- Operationalize and automate data best practices: quality, auditability, timeliness, and completeness
- Participate in design reviews to accelerate the business and ensure scalability
- Work with Data Engineering and Architecture and Data Platform Engineering to implement strategic solutions
- Advise and direct team members and business partners on company standards and processes.