Key Responsibilities:
- Design, develop, and optimize data pipelines with Spark/Scala.
- Collaborate with business analysts and data teams to translate requirements into technical solutions.
- Ensure quality, documentation, and automated testing of developments.
- Work with AWS services such as EMR, Glue, and Lambda, leveraging Terraform for infrastructure as code.
- Align with technical best practices and regulatory requirements.
Requirements
- More than 3 years of experience in Spark/Scala within Big Data environments (Hadoop).
- Hands-on experience with AWS (EMR, Glue, Lambda) and Terraform.
- Solid knowledge of Unix/Linux, Bash/Python scripting, and code repositories (Git, Maven).
- Experience in DevOps practices (Jenkins, CI/CD).
- Advanced English (C1) - mandatory.
- Background in Financial Services/Banking.
- Advanced SQL knowledge and batch process control.
- Experience with BI platforms (ideally Power BI).
We Offer
- 100% remote: work from anywhere in Spain.
- Salary based on the candidate's qualifications and experience.
- Continuous training.
- A professional, highly specialized work environment.
- The opportunity to work in a multinational company growing both nationally and internationally.
- Internal promotion aligned with your own goals.