Tasks
We are seeking a Data Engineer to consolidate multiple databases into Microsoft Azure, leverage Microsoft Fabric, and design, build, and manage reliable data pipelines. The role ensures high-quality, secure data integration and transformation so that curated data can be safely exposed to downstream applications and analytics systems. The candidate will work during Indian business hours, with flexibility to support early US hours.
Key Responsibilities
Data Integration & Consolidation
- Migrate and unify on-premises and cloud databases into Azure (e.g., Azure SQL Database, Azure Storage, Azure Synapse, Microsoft Fabric OneLake).
- Design landing, curated, and serving layers aligned to business domains.
Pipeline Engineering (ETL/ELT)
- Build and manage pipelines using Azure Data Factory and Microsoft Fabric Data Pipelines/Notebooks; use Synapse/Spark for ingestion, transformation, and orchestration.
- Implement CI/CD for data workflows; monitor performance, reliability, and cost.
Data Modeling & Architecture
- Develop logical and physical data models (star/snowflake schemas; Data Vault where appropriate).
- Establish standards for schema design, partitioning, and optimization.
Secure Data Exposure
- Create well-defined interfaces (views, APIs, lakehouse tables, semantic models) for downstream BI/analytics and operational systems.
- Implement data quality checks, lineage, and observability.
Governance & Operations
- Apply role-based access control (RBAC) across Azure and Fabric resources to enforce least-privilege access.
- Manage metadata and cataloging (e.g., Microsoft Purview), tagging, and data policies.
- Provide operational support during early US hours in addition to Indian business hours.