Data Engineer
About Candidate
Introduction:
Experienced Big Data Engineer with 10+ years in data engineering, cloud computing, and business intelligence. Skilled in designing and implementing scalable distributed systems on Azure, GCP, and Oracle OCI, with expertise in Databricks, SQL, Python, and Terraform for data integration, transformation, and automation. Developed ETL pipelines, data warehouses, and reporting solutions that enhance business intelligence, and orchestrated data workflows with Azure Data Factory and Synapse for seamless processing. Designed the Unity Catalog structure in Databricks, using Terraform for access management, and built CI/CD data pipelines to streamline data transformation from source to destination. Created interactive dashboards and reports in Power BI, OTBI, and BI Publisher, and worked extensively with Oracle Cloud IAM and Big Data Services for secure data handling. Strong background in Agile methodologies, DevOps practices, and performance tuning for optimized data solutions.
Responsibilities:
- Developed and optimized ETL pipelines using Databricks, SQL, and Python for data integration.
- Designed and implemented Unity Catalog structure in Databricks, ensuring secure access management.
- Built CI/CD data pipelines to automate data transformation and deployment processes.
- Orchestrated data ingestion and transformation workflows using Azure Data Factory and Synapse.
- Developed interactive dashboards and reports using Power BI, OTBI, and BI Publisher.
- Designed data warehouse solutions to enable efficient data storage and retrieval.
- Managed Oracle Cloud IAM solutions, implementing secure access and identity management.
- Optimized SQL queries and tuned performance to accelerate data processing.
- Utilized Terraform for infrastructure automation across cloud environments.
- Applied Agile methodologies (Scrum, Kanban) for efficient project delivery and collaboration.