Role: Data Engineer
Location: Phoenix, AZ (Hybrid)
Job description:
· Monitor the Data Lake continuously and ensure that the appropriate support teams are engaged at the right times.
· Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of ETL processes for the various datasets being ingested.
· Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (e.g., Kafka, Splunk).
· Create reports to monitor usage data for billing and SLA tracking.
· Work with business and cross-functional teams to gather and document requirements that meet business needs.
· Provide support as required to ensure the availability and performance of ETL/ELT jobs.
· Provide technical assistance and cross training to business and internal team members.
· Collaborate with business partners for continuous improvement opportunities.
Basic Qualifications:
· A minimum of 3 years working on cloud data projects.
· A minimum of 2 years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
· A minimum of 3 years of experience with Python, with working knowledge of notebooks.
· Experience working in an onshore/offshore delivery model.
· High School Diploma or GED
Preferred Qualifications:
· A minimum of 8 years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
· A minimum of 3 years of experience with one of the leading public clouds.
· A minimum of 2 years of experience in the design and build of scalable data pipelines covering extraction, transformation, and loading.
· Mandatory: 3 years of experience with Python, with working knowledge of notebooks.
· Experience with Scala.
· A minimum of 2 years of experience in data governance and metadata management.
· Ability to work independently, solve problems, and update stakeholders.
· Ability to analyze, design, develop, and deploy solutions per business requirements.
· Strong understanding of relational and dimensional data modeling.
· Experience with DevOps and CI/CD-related technologies.
· Excellent written and verbal communication skills.