Enterprise Data Modeler/Architect
Hybrid – Richmond, VA (2 days onsite per week)
Impact Makers is a mission-driven management and technology consulting company that delivers exceptional value to make an impact for our clients and the world. Our experienced teams help organizations manage transformation in IT, data, cloud, and security with a focus on their customers and their people.
We utilize our innovative business model to transform the business value of our work into social value for the community. We are committed to gifting 100% of our net profits to the community over the life of the company and our annual community contributions rival companies 100x our size.
Job Summary:
The Enterprise Data Modeler/Architect provides expert support across the enterprise information framework, analyzes and translates business needs into long-term solution data models by evaluating existing systems, and works with business and data architects to create conceptual data models and data flows. The role develops best practices for data asset development, ensures consistency within the system, and reviews modifications to existing systems for cross-compatibility. It also optimizes data systems, evaluates implemented systems for discrepancies and efficiency, and maintains logical and physical data models along with accurate metadata.
Responsibilities:
- Create conceptual data models to identify key business entities, visualize their relationships, and define concepts and rules.
- Translate business needs into data models; build logical and physical data models for the client hierarchy; and document data designs for the team.
- Populate use case data into the physical model using SQL, stored functions, or stored procedures.
- Analyze and mitigate performance issues through ongoing evaluation.
- Knowledge of data modeling tools like ER/Studio, ERwin, or similar.
- Familiarity with data warehousing concepts, star schema, snowflake schema, and data normalization techniques.
- Understanding of data security, encryption, and compliance requirements in the cloud.
- Proficiency in SQL, including writing complex queries, stored procedures, and functions.
- Experience with NoSQL databases and understanding of their data modeling techniques.
- Present and communicate modeling results and recommendations to internal stakeholders and development teams, and explain features that may affect the physical data model.
- Ability to work closely with data architects, data engineers, and other stakeholders to gather requirements and translate them into data models.
- Ensure and enforce a governance process to oversee implementation activities and ensure alignment to the defined architecture.
- Perform data profiling/analysis activities that help establish, modify, and maintain data models.
- Develop canonical models and data-as-a-service models, and apply knowledge of SOA to support integrations.
- Analyze data-related system integration challenges and propose appropriate solutions with a strategic approach.
- Strong professional experience with the Inmon and Kimball methodologies, with the ability to demonstrate practical use cases from prior work.
- Strong understanding of the Data Vault methodology and the ability to articulate its effectiveness relative to the Inmon and Kimball approaches.
- Strong understanding of the Azure platform and working knowledge of Microsoft Azure Data Lake Storage (ADLS Gen2) and zone-based architecture.
- Experience designing partitioning schemes, folder structures, and file naming conventions optimized for a data lake.
- Knowledge of data formats for a data lake (Parquet, ORC, Avro, XML, and JSON).
- Working knowledge of design patterns for data lakes
- Knowledge of integrating a data lake with Azure Synapse for analytics and reporting.
- Perform data profiling and analysis to maintain data models; develop and support usage of the MDM toolkit; integrate source systems into the MDM solution; implement business rules for data reconciliation and deduplication; and enforce data models and naming standards across deliverables.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, data redundancy, and improvement.
- Conduct data capacity planning, life-cycle and usage analysis, feasibility studies, and related tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are in regulatory compliance.
- Experience with big data platforms and tools like Azure Databricks, Azure HDInsight, or similar.
- Familiarity with big data technologies like Hadoop, Spark, and Hive.
- Willingness to stay updated with the latest trends and advancements in cloud data modeling and Azure services.
- Good knowledge of applicable data privacy practices and laws.
- Strong written and oral communication skills, along with strong presentation and interpersonal skills.
- Ability to present ideas in user-friendly language.
- Strong analytical and problem-solving skills.
- Excellent communication skills to collaborate with cross-functional teams.
- Attention to detail and the ability to work independently.
- Experience with Azure data integration tools like Azure Data Factory, Azure Stream Analytics, and Azure Logic Apps.
- Knowledge of ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.
- Familiarity with Azure DevOps and CI/CD pipelines for data solutions.
- Familiarity with Azure security features like Azure Key Vault, Azure Policy, and Azure Blueprints.
- Ability to create comprehensive documentation for data models, data dictionaries, and design specifications.
- Familiarity with Azure SDKs and APIs for automation and integration.
- Familiarity with Postman and experience using it to test APIs.
- Experience writing queries (SQL, Python, R, Scala) as needed, and experience with data technologies such as Azure Synapse, SQL Server, Snowflake, and Databricks.
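To make the dimensional-modeling and SQL expectations above concrete, here is a minimal sketch of a star schema with a fact table joined to its dimensions. It uses Python's built-in sqlite3 module purely for portability; all table and column names (dim_client, dim_date, fact_engagement) are illustrative assumptions, not taken from any actual client system.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# Names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_client (
    client_key  INTEGER PRIMARY KEY,
    client_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240101
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_engagement (
    client_key INTEGER REFERENCES dim_client(client_key),
    date_key   INTEGER REFERENCES dim_date(date_key),
    hours      REAL NOT NULL
);
""")

# Populate use-case data into the physical model.
cur.executemany("INSERT INTO dim_client VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20240101, "2024-01-01"), (20240102, "2024-01-02")])
cur.executemany("INSERT INTO fact_engagement VALUES (?, ?, ?)",
                [(1, 20240101, 8.0), (1, 20240102, 6.5), (2, 20240101, 4.0)])

# Typical analytic query: join the fact table to a dimension and aggregate.
rows = cur.execute("""
    SELECT c.client_name, SUM(f.hours) AS total_hours
    FROM fact_engagement f
    JOIN dim_client c ON c.client_key = f.client_key
    GROUP BY c.client_name
    ORDER BY c.client_name
""").fetchall()
print(rows)  # [('Acme', 14.5), ('Globex', 4.0)]
```

In a snowflake schema, a dimension such as dim_client would itself be normalized into related sub-tables; the star form shown here keeps dimensions denormalized for simpler joins and faster reporting queries.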
Benefits of working at Impact Makers:
As a values-based company, taking care of our employees is of the utmost importance and our employee benefits package reflects that. We offer healthcare plans with benefit premiums 100% paid-for, 401(k) with matching, and generous PTO. Impact Makers embraces the pursuit of balance between work, family, and personal interests, with a flexible, dog-friendly work environment in Scott’s Addition. We like to have fun at work, collaborating through company sponsored events and pro bono opportunities. Our team members work with Impact Makers, not for Impact Makers.
Impact Makers provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, Impact Makers complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.