Plan, design, document, and implement comprehensive end-to-end data pipelines and integration processes, handling both batch and real-time data
Perform tasks including data analysis, profiling, cleansing, lineage tracking, mapping, transformation, ETL/ELT job and workflow development, and deployment of data solutions
Monitor data quality, recommend enhancements, and execute strategies to improve reliability, efficiency, and cleanliness, while refining and optimizing ETL/ELT processes
Provide guidance on, apply, and deliver industry best practices in data management and lifecycle processes, including modular ETL/ELT process development, coding and configuration standards, error handling protocols, auditing procedures, and data archiving criteria
Prepare test data and execute test plans, test cases, and test scripts
Engage with Data Architects, Data Modelers, IT team members, subject matter experts, vendors, and internal business stakeholders to gather data requirements and implement data solutions aligned with business objectives
Provide ongoing support for data issues and change requests, meticulously documenting all investigations, conclusions, recommendations, and resolutions
Requirements:
Over 5 years of ETL experience, ideally in the insurance industry
Proficiency with IBM DataStage is mandatory
Familiarity with ETL/ELT frameworks, data warehousing principles, data management frameworks, and data lifecycle processes
Skilled in managing various data types (structured, semi-structured, unstructured)
Strong knowledge of diverse database technologies (RDBMS, NoSQL, columnar)
Strong working knowledge of programming languages such as Python
Excellent communication and interpersonal skills, including prioritization and problem-solving
Demonstrated proficiency in time management and organizational abilities, including effective prioritization and meeting deadlines
Strong independent work ethic and the ability to collaborate effectively in multifaceted team environments