Our client, a global leader in healthcare technology, is currently hiring a Data Engineer to partner with product owners and data consumers to meet the client's data needs.
In this role you will help build out a new, modern data platform using infrastructure and tools such as Kafka, the Azure stack, Postgres, Spark, Databricks, Jenkins, and Kubernetes.
You will oversee the entire lifecycle of data pipeline development from data discovery and design to quality and maintenance.
- Design and implement a cloud solution while helping maintain an enterprise Data Warehouse using various ingestion patterns
- Develop, test, deploy and schedule complex ETL/ELT solutions to integrate multiple data assets across the organization
- Research and evaluate alternative approaches, and design and code efficient, effective solutions for challenging problems across small to large work efforts
- Act as a lead developer responsible for the planning, completion, and coordination of work activity on projects involving more than one developer
- Participate in code reviews and provide constructive feedback to improve code or data quality
- Participate in an agile work environment, attending daily scrum meetings and completing sprint deliverables on time
- 5+ years of experience in a Data Warehouse/BI environment
- 4+ years of experience with programming languages (shell scripting, Python, Java/Scala)
- 4+ years of experience writing SQL queries and stored procedures against relational databases
- 4+ years of experience designing and developing well-documented source code
- Experience with Snowflake, ETL, and database schemas
- Experience building CI/CD pipelines with tools such as GitHub and Jenkins
- DataStage experience
This is a full-time, interim position for an initial period of 6 months, with the option to extend up to 24 months and/or become permanent (although this is not guaranteed).
Please apply online for consideration.