- Be a good culture fit.
- Upstream oil and gas experience is a plus, especially if it involved working with domain concepts.
- Creativity and solid critical thinking skills. They need candidates who are equipped to handle the level of ambiguity that developers encounter here.
- Not looking for candidates who have been in mostly siloed roles; a selected candidate MUST be able to handle the open-ended nature of their projects.
- They tend to reject candidates whose experience is strictly in data-pipeline-focused roles; during the technical interview, EOG will expect candidates to have substantial API and server-side knowledge.
- Candidates MUST be able to see the bigger picture behind what they work on. They need to be comfortable explaining the end purpose of what they have built and the real needs of the end user/customer.
Max 1-2 page resumes
Job Title:
Data Engineer – Real-time Operations
Location: Houston, TX (ON-SITE REQUIRED, M-F)
This position and program will require the candidate to be on-site at our client's office in downtown Houston, TX.
Responsibilities
Data Management & Integration:
- Design and build pipelines to ingest and transport data
- Own data quality and reliability for app-related data pipelines.
- Collaborate with data and product teams to deliver robust, scalable data solutions.
API Development & Management
- Design and develop secure, robust, user-friendly APIs for applications and users.
- Ensure API performance, fault tolerance, and comprehensive documentation.
Data Storage & Accessibility
- Build and support databases specific to app usage.
- Work with enterprise database teams to make relevant app data available to stakeholders.
Analytics & Monitoring
- Generate business rule-based alerts and notifications for stakeholders.
- Develop and support app health monitoring infrastructure to ensure uptime and reliability.
- Enhance and maintain the performance of real-time streaming data processes.
Additional Responsibilities
- Understand the oil and gas business domain to translate needs into technical solutions.
- Participate in code review processes to ensure code quality and security.
- Provide data quality support and maintenance for assigned applications.
Requirements
- Minimum 5 years of experience as a Data Engineer or in a similar role.
- Direct experience developing scientific or engineering use cases, or a degree in a STEM field.
- Strong proficiency in Python.
- Demonstrated experience in stream processing and real-time data calculations, ideally with IoT data.
- Proven ability in working with data caching, relational databases (RDBMS), and API creation/integration.
- Strong problem-solving skills and a proven track record of learning and adapting to new technologies.
- Excellent collaboration and communication skills, with a keen ability to work effectively within dynamic team environments.
Client’s Approach To Technology
We prioritize finding the right cultural fit and strong technical aptitude, with a willingness to learn and adapt. While we've listed technologies based on past implementations, we are flexible to similar experience and value candidates who can demonstrate their ability to excel in a dynamic environment.
- Understanding of Relational Database Management Systems (RDBMS)
- Oracle, SingleStore, MySQL, or Postgres
- Experience with NoSQL Databases
- Redis, MongoDB, and Elasticsearch
- Exposure to Event Log/Message Queuing Systems
- RabbitMQ, Pulsar, or Kafka
- API Development and Integration
- API creation concepts; experience with Python and FastAPI is a benefit
- Stream Processing Expertise
- Experience with large-scale IoT stream processing using Python for real-time data calculations and transformations (on data volumes exceeding MBs per second) is desirable.
- Infrastructure and Deployment Skills
- GitHub Actions, Kubernetes, Docker, Terraform, and Azure for deployment and infrastructure management on Linux (CentOS)
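To illustrate the kind of real-time calculation work described above, here is a minimal, hedged sketch of a rolling-window computation over a stream of sensor readings. It uses only the Python standard library; the function name, window size, and sample data are hypothetical and not taken from the client's actual stack.

```python
from collections import deque

def rolling_average(stream, window=3):
    """Yield the rolling mean of the last `window` readings.

    Illustrative only: real IoT pipelines would consume from a
    message queue (e.g., Kafka or RabbitMQ) rather than a list.
    """
    buf = deque(maxlen=window)  # automatically drops the oldest reading
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)

# Hypothetical pressure readings arriving from a sensor feed
readings = [100.0, 102.0, 101.0, 99.0]
print(list(rolling_average(readings)))
```

In a production setting this kind of stateful transformation would typically sit behind a stream processor or an API endpoint, but the core pattern (bounded buffer, incremental computation per event) is the same.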