Job – ETL Data Engineer
Location – Markham, CA (3 days onsite required)
Job – Contractor
Must-have skills:
• Informatica BDM/IDQ
• Hadoop/Oracle/Postgres
• MQFTE/Zena
What you’ll do
- Design, build, and operationalize large-scale enterprise data solutions in Hadoop, Postgres, Oracle, and Snowflake.
- Design and develop ETL pipelines to ingest data into Oracle/Postgres from a variety of sources (files, mainframe, relational sources, NoSQL, Hadoop, etc.) using Informatica BDM.
- Craft and develop solution designs for the acquisition and ingestion of multifaceted data sets (internal and external), data integrations, and data warehouses/marts.
- Collaborate with business partners, product owners, functional specialists, business analysts, IT architects, and developers to produce solution designs that adhere to architecture standards.
- Supervise and ensure that solutions adhere to enterprise data governance and design standards.
- Act as a point of contact for delivery teams to resolve architectural, technical, and solution-related challenges efficiently.
- Advocate the importance of data catalogs, data governance, and data quality practices.
- Apply outstanding problem-solving skills to delivery challenges.
- Work in an Agile delivery framework to evolve data models and solution designs to deliver value incrementally.
- You are a self-starter with experience working in a fast-paced agile development environment.
- Mentor and coach junior team members, leading by example.
- Stay outcome-focused, applying strong decision-making and critical-thinking skills to challenge the status quo, improve delivery pace and performance, and strive for efficiencies.
What you’ll bring
- University degree in Computer Engineering or Computer Science.
- 7+ years of experience crafting solutions for data lakes, data integrations, data warehouses/marts.
- Proficient in Informatica IDQ to design, implement, and optimize data quality solutions for our enterprise data management initiatives.
- Experience as a Managed File Transfer (MFT) specialist, with expertise in MQFTE, to ensure secure and efficient data exchange.
- Experience developing and maintaining automated scripts using shell scripting and Zena to streamline and optimize system operations and workflows.
- Solid grasp of, and hands-on experience with, data technologies and tools (Hadoop, Oracle, PostgreSQL, Informatica, etc.).
- Outstanding knowledge of, and experience in, ETL with the Informatica product suite.
- Experience implementing Data Governance principles and efficiencies.
- Familiarity with Agile software development.
- Excellent verbal and written communication skills.
- Insurance knowledge is an asset, as is the ability to understand the complex business processes that drive technical systems.