Cloud Data Engineering Role
Overview of the Job
This position involves designing and developing ETL pipelines in Google Cloud Data Fusion, including the Multi DB Table and DLP plugins. The ideal candidate will also build and manage workflows with Cloud Composer (Airflow); a small orchestration sketch follows this overview.
* The role requires a deep understanding of GCP components and cloud architecture, as well as strong scripting experience in Python and Shell.
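As a rough illustration of the Composer/Data Fusion work described above, here is a minimal sketch of an Airflow DAG that starts a deployed Data Fusion pipeline. The region, instance name, pipeline name, and schedule are hypothetical placeholders, not details of this role.

```python
# Minimal sketch: a Cloud Composer (Airflow) DAG that triggers a Data Fusion
# ETL pipeline once a day. All names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.datafusion import (
    CloudDataFusionStartPipelineOperator,
)

with DAG(
    dag_id="datafusion_etl_daily",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    start_etl = CloudDataFusionStartPipelineOperator(
        task_id="start_etl_pipeline",
        location="us-central1",           # assumed region
        instance_name="example-instance", # assumed Data Fusion instance
        pipeline_name="example_etl",      # assumed deployed pipeline name
    )
```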
Key Responsibilities
* Develop scalable data solutions using BigQuery, Dataflow, and Cloud Functions.
* Automate infrastructure tasks using Google Cloud SDK and APIs.
* Implement data protection using Cloud KMS, AEAD, and DLP techniques (see the sketch after this list).
* Support release management with tools like Cloud Build.
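To give a flavour of the DLP work mentioned above, the following is a minimal sketch of one common Cloud DLP technique, character masking of sensitive findings, called from Python via the google-cloud-dlp client. The project ID and masking configuration are hypothetical, not taken from this posting.

```python
# Minimal sketch: mask EMAIL_ADDRESS findings in free text with Cloud DLP.
# Project ID and masking character are hypothetical placeholders.
from google.cloud import dlp_v2


def mask_email_addresses(project_id: str, text: str) -> str:
    """Return `text` with detected email addresses masked by '#' characters."""
    client = dlp_v2.DlpServiceClient()
    response = client.deidentify_content(
        request={
            "parent": f"projects/{project_id}",
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        {
                            "primitive_transformation": {
                                "character_mask_config": {"masking_character": "#"}
                            }
                        }
                    ]
                }
            },
            "item": {"value": text},
        }
    )
    return response.item.value


# Example usage (hypothetical project ID):
# print(mask_email_addresses("my-project", "Reach me at jane@example.com"))
```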
Required Skills and Qualifications
To be successful in this role, you should have:
* In-depth knowledge of data modeling, data validation, and test automation.
* Excellent communication skills in English.
* A proactive approach to managing projects independently.
Desirable Skills
Experience with Dataplex and Vertex AI/Gemini, or a Google Professional Cloud Data Engineer certification, would be an added advantage.