Title: Senior Data Engineer
Location:
Job Type: Full-Time
About Us: Logic Hire Software Solutions is an innovative, data-centric organization committed to leveraging advanced analytics and data engineering practices to inform strategic business decisions and improve client engagement. We are seeking a seasoned Senior Data Engineer to join our team and contribute to the optimization and expansion of our data infrastructure.
Job Description
We are searching for a highly proficient Senior Data Engineer with over 8 years of hands-on experience in architecting and maintaining data infrastructure. The ideal candidate will possess extensive expertise with the Google Cloud Platform (GCP) and the BigQuery ecosystem, alongside a strong command of SQL, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), and Python. This role requires a combination of technical acumen and strong interpersonal skills to engage effectively with business units while supporting the full project lifecycle.
Key Responsibilities
- Architect, implement, and maintain high-performance data pipelines utilizing GCP services, particularly BigQuery, Cloud Storage, Cloud Functions, and Dataflow, ensuring optimal data flow and accessibility.
- Design and write highly efficient, scalable SQL queries, including complex joins, CTEs, and aggregations, to enable robust data analysis and reporting across multiple operational facets.
- Develop ETL (Extract, Transform, Load) processes using SSIS for operational data integration, and leverage SSRS to generate executive-level reports and analytics.
- Employ Python to create production-quality scripts and applications for data ingestion, transformation, and visualization, utilizing libraries such as pandas and NumPy, and Apache Airflow for workflow orchestration.
- Engage with cross-functional teams to elicit, document, and analyze business requirements, subsequently translating these into comprehensive technical specifications, data models, and workflows.
- Implement and uphold data governance frameworks to ensure data integrity, quality control, and security protocols across all data engineering processes.
- Monitor data pipelines and system performance metrics, identifying bottlenecks and implementing solutions to optimize throughput and minimize downtime.
- Provide analytical insights and recommendations to project and client management, facilitating data-driven decision-making.
- Mentor junior data engineering staff, cultivating an environment of knowledge sharing and professional development.
- Stay abreast of the latest trends in data engineering technologies, tools, and methodologies to continually refine our data practices.
Qualifications
- Bachelor’s degree in Computer Science, Engineering, Data Science, or a related discipline; a Master’s degree is highly desirable.
- A minimum of 8 years of experience in data engineering, particularly within GCP and the BigQuery ecosystem.
- Extensive experience writing and optimizing complex SQL queries, and a solid understanding of relational database design principles.
- Advanced proficiency with SSIS for ETL processes and SSRS for business intelligence reporting.
- Strong programming skills in Python, with a focus on data manipulation and the development of scalable ETL solutions.
- Demonstrated ability in constructing, deploying, and maintaining data engineering pipelines utilizing modern best practices.
- Strong verbal and written communication skills, complemented by an ability to liaise effectively between technical teams and business stakeholders.
- Exceptional analytical and problem-solving capabilities, with a proactive approach towards diagnosing and resolving issues.
- Working knowledge of data governance principles, compliance with data privacy regulations, and industry best practices.
Preferred Skills
- Familiarity with additional GCP services such as Cloud Dataflow for stream/batch processing, Dataproc for managing Hadoop/Spark clusters, or Pub/Sub for messaging services.
- Understanding of machine learning concepts and frameworks (e.g., TensorFlow, scikit-learn) to integrate predictive analytics within data solutions.
- Experience working within Agile environments and proficiency with project management tools (e.g., JIRA, Trello).
What We Offer
- A competitive salary and comprehensive benefits package.
- Opportunities for continued professional development and advancement within a cutting-edge environment.
- A collaborative workspace that encourages innovation and creativity.
- Flexible working options to support work-life balance.
If you possess the expertise and are eager to advance your career by driving impactful data initiatives at Logic Hire, we invite you to apply. Please submit your resume and a cover letter detailing your relevant qualifications and accomplishments.
Proficient English communication skills are required.
Experience Areas
- Data Engineering & Architecture: data modeling, ETL processes, data pipeline development, data integration, and cloud data solutions (GCP)
- Cloud Platforms: Google Cloud Platform (GCP), particularly BigQuery, Cloud Storage, and Cloud Functions
- Big Data Tools: Hadoop, Spark, MapReduce, Pig, Hive, NoSQL, and Apache Airflow
- Data Governance: data governance frameworks, ensuring data integrity, quality control, and security protocols
- Data Visualization & Reporting: Power BI, Tableau, SSIS, SSRS, Superset, and Plotly
- Programming Languages: Python, SQL, R, Scala, C, C++, and Java
- Database Technologies: Teradata, Oracle, and SQL Server
Note: This position is open only to candidates who hold a Green Card or are US citizens.