Role: Big Data Developer
Duration: 12 Months Contract
Location: Dublin
Role Summary: A seasoned Big Data Engineer to join the Prime Data Services team, responsible for developing and enhancing existing big data pipelines, performance-tuning Java components, and contributing to new data ingestion and distribution frameworks in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities:
- Develop Java applications and web services, building on an understanding of the current framework.
- Apply hands-on experience with Big Data frameworks to performance-tune them.
- Utilize knowledge of application development procedures and concepts, plus basic knowledge of other technical areas, to identify and define necessary system enhancements, including using scripting tools and analyzing/interpreting code.
- Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Operate with a limited level of direct supervision.
- Participate in design meetings and present ideas to improve application framework.
- Apply strong knowledge of Agile Scrum principles; lead and facilitate Scrum ceremonies.
Required Skills & Experience:
- 7+ years of hands-on software development experience in a fast-paced, collaborative Agile environment.
- 3+ years of Big Data experience and/or experience as a Lead Developer on Java-based projects.
- Expertise with data reconciliation, data quality, large-scale data processing, and controls.
- Robust understanding of design patterns and strong hands-on coding skills.
- Deep understanding of data structures, along with the basic principles of Java.
- Understanding of the latest Java versions, along with the Spring Framework, Executor patterns, and lambda expressions.
- Basic understanding of big data frameworks, including Spark, Hive, HDFS, Impala, and Presto, with the ability to query and tune those systems.
- Understanding of Linux, Bash, and basic scripting, including fundamentals such as Unix groups, host groups, etc.
- Basic understanding of RDBMS vs. NoSQL vs. columnar databases. Understanding of file formats such as Parquet, Avro, and ORC, and the Iceberg table format, also required.
- Good to have: understanding of the basics of microservice architecture, containerization, and orchestration.
- Good to have: understanding of Helm, Jenkins, Bitbucket, Gradle, and Maven.
- Good to have: understanding of telemetry tooling, specifically OpenTelemetry, the ELK stack, Grafana, Prometheus, etc.