Must haves:
- 6+ years of data engineering experience (DB2, Snowflake, Oracle, MongoDB, Redshift, PostgreSQL)
- SQL, PL/SQL, Python, and shell scripting (roughly 50% PostgreSQL, 30% Python, 20% shell scripting).
The interview will include complex SQL query coding challenges as well as PostgreSQL and shell exercises. Must have experience working with ETL and streaming tools and processes (Kafka is preferred, but other tools would be considered as well, e.g., Adeptia or Alteryx).
Nice to haves:
- Financial industry experience
- Experience working with an intelligence unit or cyber fraud team
- DB2, Snowflake, Oracle, MongoDB, Redshift
Data Engineer
Fidelity TalentSource is your destination for discovering your next temporary role at Fidelity Investments. We are currently sourcing for a Data Engineer to work in Westlake, TX.
The Role
Fidelity’s Financial Intelligence Unit is looking for a data technologist to join our technology enablement team. This team deploys advanced analytics to detect and combat cyber risks for financial products and services. The Data Engineer builds and develops databases using DB2, PostgreSQL, Snowflake, NoSQL, and ETL processes; designs solutions involving databases and data modeling; ensures data access and process flows in a heterogeneous environment; writes database stored procedures, functions, automation routines, and loaders; performs SQL tuning and optimization; and designs and implements database technologies for the fraud detection and prevention engine.
Required Technical Skills
- Bachelor’s or master’s degree in a technology-related field such as Computer Science or Engineering, with demonstrated ability.
- Ensures alignment with enterprise data architecture strategies.
- Improves data availability via APIs and shared services and recommends optimization solutions using cloud technologies for data processing, storage, and advanced analytics.
- Performs SQL tuning and data integration activities.
- Provides technical guidance for cyber security on database technologies.
- Performs risk assessments and executes tests of data processing systems to ensure proper functioning of data processing activities and security measures.
- Performs independent and complex technical and functional analysis for multiple projects supporting several divisional initiatives.
- Builds the technical infrastructure required for efficient Extraction, Transformation, and Loading (ETL) of data from a wide variety of data sources by leveraging object-oriented/object-functional scripting languages such as Python.
- Expertise with relational databases, Splunk, Snowflake, YugabyteDB, Aerospike, S3 and similar data management platforms.
- Experience working with and handling large data sets in DB2 with SQL; writing sophisticated stored procedures to process data.
- Object-oriented Python programming and solid experience with machine learning libraries such as Pandas, NumPy, scikit-learn, and TensorFlow.
- Data parsing/analytics experience in large data sets using Python, scripting, and other similar technologies, integrating with and consuming APIs.
- Familiarity with quantitative techniques and methods, statistics, econometrics – including probability, linear regression, time series data analysis and optimizations.
- Knowledge of hybrid on-prem and cloud data architectures and services, especially data streaming, storage, and processing functionality.
- Awareness of event-based systems, functional programming, emerging technologies and messaging frameworks such as Kafka.
- Experience in Agile methodologies (Kanban and SCRUM) is a plus.
The Value You Deliver
- Strong analytical skills and ability to take on issues and work through ambiguous situations by making timely decisions based on facts, knowledge, experience and judgment.
- Good interpersonal and client-facing skills, with the ability to manage expectations and explain technical detail. Consistent track record of multitasking, prioritizing tasks, and quickly adjusting in a constantly evolving environment.
- Collaborates with business and technology groups and is able to deliver formal and informal presentations in various settings: one-on-one, small and large groups, with peers, and with senior management.
- Ability to navigate the organization to accomplish tasks, work on multiple efforts simultaneously, and collaborate with cross-functional teams located across geographies.
- Excellent conflict management and negotiation skills; eager to learn and continuously develop personal and technical capabilities.
- High level of dedication, initiative, vision, passion and professional approach to time, costs, and deadlines.
- Ability to handle production issues with accuracy and attention to detail; a methodical, investigative, and inquisitive mind combined with creative abilities.
- Designs robust batch and streaming programs while adhering to standards and best practices for these databases.
- Enjoys analyzing data to identify gaps, issues, patterns, and trends, and can analyze application dependencies and conduct impact assessments of changes.