Job Title: Analytics - Snowflake and Hadoop - Junior Role
Location: Wilmington, DE (local candidates only)
Experience: 6+ years
RTTO: Day 1
Job responsibilities:
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse datasets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 3+ years applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, and EMR
- Hands-on experience with data warehousing
- Proficient in coding in one or more languages
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Overall knowledge of the Software Development Life Cycle
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills:
- Hands-on experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, EMR
- Strong experience with distributed computing frameworks such as Apache Spark, specifically Spark in Java
- Strong hands-on experience building event driven architecture using Kafka
- Experience writing Splunk or CloudWatch queries and Datadog metrics
- Strong experience with ETL tools such as Ab Initio, Informatica, SSIS, etc.
- Experience writing SQL queries
- Experience working on AdTech/MarTech platforms is a plus