Summary
The Data Engineer will be responsible for designing, building, and maintaining scalable and robust data pipelines, ensuring the efficient and reliable flow of data throughout the organization.
Responsibilities
- Design, build, and maintain scalable data pipelines to collect, process, and store data from various sources, ensuring efficient and reliable data flow.
- Develop and implement data integration solutions using ETL (Extract, Transform, Load) tools and techniques.
- Collaborate with data analysts, data scientists, and other stakeholders to understand and translate their data needs into technical requirements.
- Optimize and improve existing data pipelines and architectures to ensure high performance, reliability, and maintainability.
- Implement data validation, cleansing, and error-handling processes to ensure data quality and consistency across the organization.
- Design and implement data storage solutions, such as databases, data warehouses, and data lakes, ensuring scalability, security, and accessibility.
- Monitor and maintain the performance and reliability of data pipelines and systems, troubleshooting and resolving any issues that arise.
- Stay current with industry trends, advancements in big data technologies, and emerging data engineering best practices.
- Develop and maintain thorough documentation of data pipelines, architectures, and processes, ensuring that knowledge is preserved and accessible to the team.
- Collaborate with cross-functional teams, providing support and guidance on data engineering best practices, tools, and technologies.
Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related field. A master’s degree is a plus.
- 3-5 years of experience in data engineering, big data, or a related field, with a proven track record of building and maintaining data pipelines and systems.
- Strong proficiency in programming languages (e.g., Python, Java, Scala) and SQL.
- In-depth knowledge of database management systems (e.g., SQL Server, MySQL, PostgreSQL) and data warehousing concepts.
- Familiarity with ETL tools such as SSIS, Azure Data Factory, or Apache Airflow (a minimal pipeline sketch follows this list).
- Excellent problem-solving skills and the ability to debug and optimize complex data pipelines.
- Strong understanding of data architecture principles and best practices.
- Good communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- A proactive and curious mindset, with a passion for driving data-driven decisions and staying current with industry trends and technologies.
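As a rough illustration of the kind of pipeline work described in the responsibilities, here is a minimal Apache Airflow DAG sketch. Airflow is one of the tools named in the requirements above, but the DAG name, schedule, data shapes, and load step here are purely hypothetical and are not drawn from the employer's actual stack.

```python
# Minimal illustrative ETL DAG for Apache Airflow 2.x.
# All task logic, names, and data below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a hypothetical source system.
    return [{"id": 1, "amount": "42.50"}, {"id": 2, "amount": "17.00"}]


def transform(ti, **context):
    # Basic validation/cleansing: cast amounts to float, drop malformed rows.
    raw = ti.xcom_pull(task_ids="extract")
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in raw
        if r.get("amount") is not None
    ]


def load(ti, **context):
    # Placeholder: in a real pipeline this would write to a warehouse table.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} cleaned rows")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```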
Apply Here
Also Apply:
- Call for Interns: Data Science Internship at 6sense
- Remote Backend Engineer Needed at WorkMotion
- Customer Service Assistant Needed at Aspire Consulting
- Free Software Engineering Course and Certification at ALX Africa (Cohort 2 2023)
- Call for Interns: Google Summer of Code 2023
- 2023 Graduate Internship Program at Exxon Mobil (Major Project)
- Full Stack Engineer Needed at Deep Technologys Limited
- UI/UX Design Interns Needed at eHealth4everyone
Never miss an opportunity again. Join our Telegram Group or WhatsApp Group.