Immediate need for a talented Data Engineer. This is a 7+ month contract opportunity with long-term potential and is located in Westlake, TX (onsite). Please review the job description below and contact me ASAP if you are interested.
Job ID:24-44403
Pay Range: $60 - $66/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Requirements and Technology Experience:
- Skills: Snowflake, SQL, PL/SQL, and DB2.
- 6+ years of data engineering experience (DB2, Snowflake, Oracle, MongoDB, Redshift, PostgreSQL).
- SQL, PL/SQL, Python, and shell scripting (roughly 50% PostgreSQL, 30% Python, 20% shell scripting).
- The interview will include complex SQL coding challenges, as well as PostgreSQL and shell exercises.
- Must have experience working with ETL and streaming tools and processes (Kafka is preferred, but other tools such as Adeptia or Alteryx would also be considered).
- Bachelor’s or Master’s degree in a technology-related field such as Computer Science or Engineering, with demonstrated ability.
- Ensures alignment with enterprise data architecture strategies.
- Improves data availability via APIs and shared services and recommends optimization solutions using cloud technologies for data processing, storage, and advanced analytics.
- Performs SQL tuning and data integration activities.
- Provides technical guidance on cybersecurity for database technologies.
- Performs risk assessments and executes tests of data processing systems to ensure proper functioning of data processing activities and security measures.
- Performs independent and complex technical and functional analysis for multiple projects supporting several divisional initiatives.
- Builds the technical infrastructure required for efficient Extraction, Transformation, and Loading (ETL) of data from a wide variety of data sources, leveraging object-oriented or functional scripting languages such as Python.
- Expertise with relational databases, Splunk, Snowflake, YugabyteDB, Aerospike, S3 and similar data management platforms.
- Experience working with and handling large data sets in DB2 with SQL; writing sophisticated stored procedures to process data.
- Object-oriented Python programming and solid experience with machine learning libraries such as Pandas, NumPy, Scikit-learn, and TensorFlow.
- Data parsing/analytics experience in large data sets using Python, scripting, and other similar technologies, integrating with and consuming APIs.
- Familiarity with quantitative techniques and methods, statistics, econometrics – including probability, linear regression, time series data analysis and optimizations.
- Knowledge of hybrid on-prem and cloud data architectures and services, especially data streaming, storage, and processing functionality.
- Awareness of event-based systems, functional programming, emerging technologies and messaging frameworks such as Kafka.
- Experience in Agile methodologies (Kanban and SCRUM) is a plus.
Our client is a leader in the banking industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.