Role: AWS Data Engineer (W2 Position)
Location: Washington, DC
Duration: 6+ Months
Job Description
- Need 10+ years of experience as an AWS Data Engineer.
- Databricks & AWS Big Data Architecture certifications would be a plus.
- Solid experience with SQL-based databases is required.
- Two years of experience with Hadoop/Spark in an administrative, development, or support role is required.
- Experience in engineering practices such as development, code refactoring, leveraging design patterns, CI/CD, and building highly scalable data applications and processes.
- Knowledge of advanced data engineering concepts such as dimensional modeling, ETL, data governance, and data warehousing involving structured and unstructured data.
- Must have excellent coding skills in either Python or Scala.
- Must have experience in the Data Engineering domain.
- Must have implemented at least two projects end-to-end in Databricks.
- Must have experience with the Databricks components below:
- Delta Lake
- dbConnect
- db API 2.0
- Databricks workflows orchestration
- Must be well versed in the Databricks Lakehouse concept, the Medallion architecture, and their implementation in enterprise environments.
- Must have a good understanding of building complex data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
Preferred Qualifications/Skills
- Good to have Unity Catalog and basic data governance knowledge.
- Good to have MicroStrategy or Tableau reporting experience.
- Good to have knowledge of Docker and Kubernetes.
- Good to have Linux/Unix Admin or development skills.
Thanks & Regards,
Charan
Anveta, Inc.
1333 Corporate Drive, Suite #108
Irving, TX 75038
charan@anveta.com
https://www.linkedin.com/in/charan-reddy-ba6450236/