Data Engineer
About Candidate
- Extensive experience in developing, enhancing, and managing ETL pipelines and data integration processes across diverse environments.
- Proficient in creating data pipelines for processing and transforming structured and semi-structured data.
- Expertise in cloud platforms such as Azure, GCP, and AWS, with hands-on experience in Databricks, ADF, ADLS, BigQuery, and Snowflake.
- Skilled in using Python, PySpark, and Terraform for data processing, pipeline automation, and cloud infrastructure provisioning.
- Strong capabilities in designing and developing APIs using Python frameworks and deploying them on cloud platforms with Kubernetes.
- Experience in building dashboards and monitoring data pipeline health and performance with tools such as Grafana.
- Proficient in automating workflows and orchestrating jobs using Airflow and other scheduling tools.
- Adept at implementing data transformation logic, applying business rules, and optimizing performance for large datasets.
- Strong background in database technologies, including SQL Server, and working with data formats like Parquet, Delta, and CSV.
- Experience in data modeling, creating data marts, and working with dimensional schemas.
- Expertise in CI/CD pipelines and automated deployments using Azure DevOps and related tools.
- Skilled in developing user-friendly front-end applications and UI tools for data validation and management.
- Comprehensive experience in troubleshooting production issues, handling change requests, and performing quality assurance testing.
- Knowledgeable in integrating and synchronizing data across platforms using connectors and APIs.
- Proven ability to collaborate with cross-functional teams to meet business requirements and deliver high-quality data solutions.
- Familiarity with implementing robust error handling and data validation mechanisms in pipelines and APIs.
- Experience in providing technical support, documentation, and training for seamless project execution and user onboarding.
- Strong understanding of data security, masking, and compliance processes in data engineering workflows.