Mandatory Skills: AWS, PySpark, ETL, Scala, Spark
Desired Skills: Strong in Spark, with good working knowledge of Hadoop
• Create Scala/Spark jobs for data transformation and aggregation
• Produce unit tests for Spark transformations and helper methods
• Design data processing pipelines
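The duties above center on writing Scala/Spark transformation and aggregation jobs with unit-testable logic. As an illustration only (the column names, dataset, and object names below are hypothetical, not taken from this posting), a minimal sketch of such a job might look like:

```scala
// Hypothetical sketch of a Scala/Spark aggregation job.
// Assumes a Spark runtime on the classpath; names like `salesByRegion`
// and the (region, amount) schema are illustrative inventions.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, sum}

object SalesAggregationJob {

  // Keeping the transformation as a pure DataFrame => DataFrame function
  // makes it easy to cover with unit tests against a local SparkSession.
  def salesByRegion(sales: DataFrame): DataFrame =
    sales
      .filter(col("amount") > 0)          // drop refunds/invalid rows
      .groupBy(col("region"))
      .agg(sum(col("amount")).as("total_amount"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-aggregation")
      .getOrCreate()

    import spark.implicits._

    // In a real job this would be read from S3/HDFS; inlined here for brevity.
    val sales = Seq(
      ("us-east", 120.0),
      ("us-east", 80.0),
      ("eu-west", 200.0)
    ).toDF("region", "amount")

    salesByRegion(sales).show()
    spark.stop()
  }
}
```

Factoring each transformation into a pure function, as above, is what makes the "unit tests for Spark transformations" duty practical: a test can build a small in-memory DataFrame with a local-mode SparkSession, apply the function, and assert on the collected rows.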