In keeping with our global position as an industry leader and innovator, Munich Re is driving transformative change in the life reinsurance industry. Our innovation strategy sets a new standard in digitization that will radically transform our end-to-end business operations and deliver world-class, fully automated business processes and workflows to our North American Life & Health business, which focuses on traditional reinsurance solutions centred on the transfer of mortality risk as well as on living benefits products.
We currently lead the market in the products we offer and are simultaneously exploring new and exciting opportunities by expanding our business model beyond traditional Life and Health reinsurance services. There is no blueprint for what we're doing; we're creating our own path, and our own brand. Opportunities to effect change on this scale are rare, and we want to hire the best and brightest to bring a unique perspective to our industry and accelerate the rate of innovation, leveraging the tools of tomorrow, whether it's artificial intelligence, blockchain, machine learning, or a technology that's still emerging.
Are you highly dynamic, creative, curious, resilient and excited about disrupting and driving continuous transformative change within an established industry? Then we would like to get to know you!
Data Engineers have formal accountability for ensuring the effective use, governance and control of the enterprise data assets throughout the data lifecycle. They will support Munich Re's Data Strategy and adhere to enterprise data modelling and management guidelines and policies. This role requires an energetic and forward-thinking individual who appreciates the value of data as an asset and can significantly contribute to the evolution of our data management practice. The successful candidate is expected to:
- Work independently and with teams on complex data engineering problems to directly support and deliver on strategic data initiatives.
- Translate functional and technical requirements into detailed designs and robust deliverables.
- Build and maintain optimal data pipeline architectures leveraging open technologies and ETL concepts.
- Ensure timely delivery and meet project timelines by automating development and deployment tasks where applicable.
- Document and communicate standard methods and tools used.
- Work with other data engineers, data ingestion specialists, and experts across the company to consolidate methods and tool standards where practical.
- Act as a subject matter expert in data manipulation and processing for end-to-end business processes.
- Develop data models for various business data subject areas.
- Construct ETL/ELT packages for data transformation and integration between business applications or to the data warehouse.
- Configure unit tests that validate and assure the quality of business data processes.
- Manage and maintain master/reference data accuracy and integrity in the Master Data Management system.
- Adhere to the business and data architecture guidelines for data patterns and security.
- Source and provision data in the data warehouse from on-premises or cloud sources.
- Liaise with business stakeholders to gather data requirements for projects.
- Participate in the design and provisioning of the Data Lake and Data Warehouse.
- Provide data engineering support to development projects and address tasks or bugs related to data issues.
- Recommend and, where appropriate, implement ways to improve data reliability, efficiency, and quality.
- Create and maintain metadata for each critical data element.
- Create and maintain master data processes, data flow diagrams and entity relationship documentation.

Working knowledge of development platforms:
Visual Studio and DevOps
Working knowledge of data manipulation languages:
SQL or (T-SQL, PL/SQL, pgSQL, Spark SQL or Python)
Working knowledge of data modeling tools:
Sparx EA or (ERwin, ER/Studio, IBM InfoSphere Data Architect or Visual Paradigm)
Working knowledge of data integration tools:
SSIS or (Informatica, DataStage or Azure Data Factory)
Working knowledge of Master Data Management tools:
MDS or (Profisee, Informatica MDM, Ataccama or IBM MDM)
Working knowledge of BI tools:
Power BI or (QlikView, Tableau, MicroStrategy or SSRS)
Nice to Have:
Working knowledge of cloud environments such as Azure, AWS or Google Cloud
Working knowledge of Big Data environments such as Cloudera, Hortonworks, Redshift or Databricks
Working knowledge of Data Lake technologies such as Azure Data Lake or Amazon S3
Munich Re is one of the world’s leading reinsurance companies with approximately 42,000 employees in over 50 locations around the globe. As an industry leader, we provide a unique opportunity to be part of a global success story. We offer our employees a diverse and challenging work environment which champions high performance, professional development, innovation and passion; and rewards top performers with a highly competitive total rewards package.