Job Description
| Code: | 15213 |
| Job Title: | D&A Data Engineer - FSR 5163 |
| Location: | São Paulo, SP |
| Region: | Other |
| Professional Level: | Analyst |
| Academic Level: | Completed higher education |
| Professional Areas: | IT - Projects |
Description:

Standard Profile D&A Data Engineer – FSR-5163

Main Skills
• Experience with Azure Data Factory, Databricks, and Azure-based data solutions.
• Strong knowledge of ETL/ELT processes, data modeling, SQL, and data warehousing.
• Familiarity with Delta Lake and Azure Data Lake Storage (ADLS).
• Proven experience in data modeling.
• Familiarity with cloud data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies.
• Experience with data governance, data quality, and data security best practices.
• Ability to work collaboratively with cross-functional teams and to communicate complex technical concepts to non-technical stakeholders.
• Strong problem-solving, analytical, and organizational skills.

Required Qualifications
• Certifications in data architecture, cloud platforms, or data management (e.g., AWS Certified Data Analytics, Google Professional Data Engineer, DAMA).
• Experience with data visualization tools (e.g., Power BI, Tableau) and programming languages (e.g., Python, Java) is a plus.
• Knowledge of industry-standard data models and regulatory requirements.

Main Activities and Responsibilities

Position Overview
As a Data Engineer, you will play a pivotal role in ensuring the effective storage, integration, and use of data across the Retail portfolio. You will collaborate with cross-functional teams to align data strategies with organizational goals, optimize data systems, and ensure data security and compliance.

Key Accountabilities
• Collaborate with cross-functional teams to align data engineering practices with business objectives.
• Create conceptual, logical, and physical data models to support business requirements.
• Determine organizational data needs and translate them into robust data architectures.
• Define and enforce data architecture standards, policies, and procedures for effective data management, governance, and archival strategies.
• Develop and implement data migration strategies from legacy systems.
• Develop ETL/ELT pipelines in Azure Data Factory and Databricks.
• Integrate usage history, weather data, pricing curves, and product terms into standardized models.
• Build and optimize Delta Lake tables in ADLS for staging, reference, and historical use cases.
• Enable workflows through Azure Notebooks and Databricks jobs.
| Skills: | English level: Advanced |
| End Date: | June 30, 2026 |