What you’ll do:
Be a leader who strengthens data engineering to support agile growth. Re-engineer the data warehouse and ODS to manage increased demand from analytics and our member and provider portals. Enable transitions to open source, machine learning, and cloud technology on a solid data foundation. Consult with Domain Data Architects, Business Stakeholders, Data and Application Integration teams, and DBAs.
This role is responsible for converting data architecture into physical designs, and for implementing and maintaining data structures and processing pipelines for the extraction, transformation, and loading (ETL) of data from a variety of data sources. You will design and develop robust, scalable solutions that transform data into a useful format for analysis, enhance data flow, and enable efficient, cost-effective consumption and analysis of data. You can write complex SQL or programs and mentor technical and business users about our data. You may evaluate and recommend tools and technologies for data infrastructure and processing.
This role is involved in these areas:
Understand business and technical requirements. Convert logical data models to physical data models, DDL, roles, and views.
Create and update data architecture designs and data flows for subject areas impacted by a program/project. Create source-to-target documents for ETL developers and ensure an efficient design.
Work with BAs and Business SMEs on programs/projects to understand requirements and identify non-functional requirements that will impact the data design. Actively engage as a core member of scrum teams to deliver on data programs/projects.
Work with ETL and QA teams to ensure all requirements are met by the solution. Work with DBAs to tune performance.
BS in Computer Science or a related field. A Master's degree is a plus.
7 years of experience in software development and database applications, with the majority of this experience in data strategy, sourcing, modeling, integration, and architecture.
Skilled in data analysis, data profiling, data integration, and data quality.
Data modeling experience with logical, physical, dimensional, and relational (3NF) models.
Extensive experience creating data warehouses, data marts, and operational system databases.
Evaluate, analyze, design and implement solutions based on technical requirements.
Ability to communicate with a variety of audiences in both verbal and written form.
What’s nice to have:
Advanced knowledge of SQL, query tuning, RDBMS, and query tools. Database performance tuning.
Experience with cloud, NoSQL, or Snowflake databases; ADLS; Python.
Prior experience with ETL, reporting and analytics, and business intelligence tools is preferred.
Prior experience in health care and payer domain is preferred.
CDMP (Certified Data Management Professional) or CBIP (Certified Business Intelligence Professional) certification is nice to have.
Preferred Soft Skills:
You like to get stuff done and you enjoy sharing your expertise with other talented people. You’re curious about new technology and passionate about the fundamentals of data management. You enjoy putting together complex pieces of a puzzle to deliver better products or services for your business partners. You’re a translator between people and teams that don’t use the same lingo. You understand that incremental changes must be part of the long-term roadmap and you have the patience, endurance, and collaboration skills to move the organization forward.
*** In addition to the above qualifications, please note:
Experience with any of ERwin, ER Studio, or PowerDesigner.
Experience with Informatica (preferred), DataStage, or Microsoft SSIS/SSRS when used with a defined SDLC in a large, structured work process.
Experience with any modern RDBMS using both 3NF and dimensional design. Strong SQL skills and familiarity with executing DDL. Query and database performance tuning is a plus.
Healthcare experience is very desirable, especially payer-domain knowledge of member, enrollment, claims, and provider data.
Successful candidates will already have 10-20 years' experience in IT, healthcare, or data roles.
ETL development experience is a plus if complemented with strong data modeling knowledge.
The Medica data environment includes Oracle Exadata, MS Azure, Snowflake, Informatica, ERwin, TOAD, Excel, Visio, OBI, Qlik Sense, and Azure DevOps.
Key initiatives this year include:
Data Architecture and platform design for joint venture
Redesign of Individual Business platforms into ODS, DW, data marts, and analytical extracts
Deploying Informatica EDC (data catalog), IDQ (data quality) for data governance and data stewardship
Migrating Enterprise Architecture into Cloud Infrastructure
Continuing Delivery Model Optimization for Agile (SAFe) development