Key Responsibilities:
- Data Pipeline Development: Assist in creating and maintaining ETL pipelines that move data from diverse sources into our data lake.
- Data Quality Assurance: Collaborate on data validation and quality checks to ensure accurate and reliable data.
- Database Management: Work with relational databases, optimizing data storage and retrieval.
- Technology Exploration: Learn and apply new data engineering tools and frameworks.
- Documentation: Create and update pipeline and workflow documentation.
- Collaboration: Work with cross-functional teams to understand data requirements.
- Performance Optimization: Optimize data pipelines for efficiency and scalability.
Qualifications:
- Enrolled in a Computer Science, Data Science, or related program.
- Basic understanding of data engineering, ETL, and relational databases.
- Familiarity with SQL and Power BI (DAX and M-Query).
- Strong problem-solving skills.
- Eagerness to learn and adapt.
- Excellent communication and teamwork.
- Attention to detail and commitment to data quality.
What the Role Offers:
- Hands-on experience with Microsoft Fabric, Azure Synapse, and more.
- Mentorship from experienced data professionals.
- Exposure to cutting-edge data technologies.
- A collaborative and inclusive work environment.
- Networking opportunities.
- Competitive compensation and potential for full-time employment.