Job Description
- 4-10 years of experience in Big Data and Data Engineering.
- Strong knowledge of advanced SQL and data warehousing concepts, including BigQuery.
- Strong programming skills in SQL and Python/PySpark.
- Experience designing and developing data pipelines and ETL/ELT processes.
- Experience with at least one public cloud provider: GCP, Azure, or AWS.
- Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
- Experience with workflow management tools such as Airflow, AWS Data Pipeline, Google Cloud Composer, etc.
- Comfortable using Azure DevOps or similar CI/CD tools and Git.
- Experience building Docker images and deploying them to production, and with Kubernetes container orchestration (creating pods, ConfigMaps, and deployments) provisioned using Terraform.
- Strong problem-solving and communication skills.
- Bachelor’s or an advanced degree in Computer Science or a related engineering discipline.
- Experience with C#, .NET Core, and microservices is a plus.
Roles & Responsibilities
- Use the latest technology to build data pipelines and integrate machine learning models
- Build and expand our data platform
- Develop applications that run on a Google Cloud-based infrastructure
Skill Set
Big Data, Data Warehousing, SQL