BECA Data Engineer
BNP Paribas · Madrid, ES (New)
Tags: Remote work · Big Data · Power BI · Tableau · Office
Data Engineer Academy
Do you speak English and Spanish?
Would you like to work in an international environment?
Do you want to take part in the growth of an IT team that provides a quality service to all BNP Paribas Personal Finance entities worldwide?
If you answered yes to those 3 questions, please keep reading!
Within BNP Paribas Personal Finance, we are creating an academy to take on students, starting under an internship arrangement, to support an international development team that is part of South Europe Technologies (SET), the international shared services centre for IT, Data, Operations and Cybersecurity.
After an onboarding period in the Data Analyst role, alongside the technical lead and the rest of the team, the main responsibilities of the position will be:
* Identify and propose improvements to the performance and processes of the data solutions in place.
* Gather and analyse the requirements put forward by the different business areas.
* Interpret large volumes of data and draw relevant conclusions from them.
* Implement solutions for extracting and transforming the stored data (see the sketch after this list).
* Develop the ability to translate complex data into reports that technical and non-technical audiences can understand. These reports must be designed, automated and published using Power BI and/or Tableau.
* Collaborate with other areas and cross-functional teams (business analysts, data scientists...) to develop data-driven analytics products.
* Drawing on the experience gained, propose improvements to the data quality and data governance standards set by the Data Office.
* Keep up to date with trends and best practices in the design and implementation of the technologies used to manage and deliver data-driven projects.
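As a rough illustration of the extract-and-transform work described in this list, here is a minimal sketch in Python with pandas; the file names, columns and aggregation are hypothetical, not taken from the posting.

    import pandas as pd

    # Extract: read a raw export (hypothetical file and columns)
    raw = pd.read_csv("loans_raw.csv", parse_dates=["signed_at"])

    # Transform: clean the rows and aggregate into a report-friendly shape
    clean = (
        raw.dropna(subset=["customer_id", "amount"])
           .assign(month=lambda df: df["signed_at"].dt.to_period("M").astype(str))
    )
    monthly = clean.groupby(["month", "product"], as_index=False)["amount"].sum()

    # Load: write a dataset that Power BI or Tableau can pick up
    monthly.to_csv("loans_monthly.csv", index=False)

The same pattern scales up: the extract step becomes a database query, and the output lands in a warehouse table instead of a CSV.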
The requirements for this position are:
* A degree in Computer Science or another engineering discipline, or in a related field such as mathematics, statistics, physics or equivalent.
* Training through specialised courses and/or bootcamps in data analytics and/or Big Data.
Soft skills
* A proactive person, oriented towards problem solving and decision making.
Languages
* English (B2 minimum) / French (optional).
We offer
* Continuous training during the internship in the Data Engineer role, aimed at strengthening your knowledge and skills.
* Involvement in technology projects where candidates can contribute knowledge and value from day one. Our goal is for your work to be visible and to have an impact from the start.
* You will see practical examples of ETL (Extract, Transform, Load) processes and the different tools involved.
* A 6-month internship with a real possibility of a permanent contract when it ends.
* A hybrid remote-working model.
* One day of holiday per month worked (in line with the duration of the agreement with the university).
Considerations
* Full-time schedule, 9:00 to 18:00.
* Pay: €880 gross/month.
* Immediate start.
Cloud Engineer / Amazon Web Services
Incoming Domain · Barcelona, ES (June 4)
Tags: Remote work · React · API · .Net · C# · Java · Python · Agile · CSS · TSQL · HTML · Azure · NoSQL · Scrum · Maven · Jenkins · Linux · Angular · Docker · Cloud Computing · Kubernetes · Ansible · Git · Android · REST · Jira · Groovy · OpenShift · AWS · Spring · iOS · PowerShell · Sass · Bash · DevOps · JUnit · QA · MVC · Gradle · Eclipse · Microservices · Perl · SQL Server
REMOTE WORK for developers (AF, AT, AP, PS) on any of the following OPENINGS.
ACTIVE OPENINGS (*): DATA SCIENTIST / CLOUD ENGINEER - AMAZON WEB SERVICES
* DATA SCIENTIST:
- Experts in any of the AI verticals are required.
- Extract and analyse data from the company's databases to drive optimisation and improve product development for the Second Digital Wave.
- Assess the effectiveness and accuracy of new data sources and apply data-gathering techniques.
- Using the available data, and making use of algorithms or RAIP components, develop custom models to address the existing AI initiatives.
- Use predictive modelling to enhance and optimise customer experience, revenue generation, ad targeting and other business outcomes.
- Design A/B testing frameworks and always define a metric to evaluate the models (see the sketch after this list).
- Develop processes and tools to analyse model performance and data accuracy.
- Strong problem-solving skills with an emphasis on product development.
- Experience with the Python language.
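As a loose illustration of the A/B-testing point above, the sketch below compares conversion rates between two variants with a two-proportion z-test, using a single pre-chosen metric; the counts are made up for the example.

    from math import sqrt
    from statistics import NormalDist

    # Hypothetical counts: (conversions, visitors) for each variant
    conv_a, n_a = 120, 2400
    conv_b, n_b = 156, 2350

    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se

    # Two-sided p-value under the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"uplift={p_b - p_a:.4f}  z={z:.2f}  p={p_value:.4f}")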
* CLOUD ENGINEER / AMAZON WEB SERVICES:
- Development and support of a Gen AI architecture for a banking client.
- Previous experience in IT support, DevOps or systems administration in cloud environments.
- Knowledge of AWS (IAM, EC2, S3, Lambda, CloudWatch, etc.).
- Familiarity with Azure and GCP (nice to have, not required).
- Experience with SQL and NoSQL databases.
- Experience with cloud monitoring and logging tools.
- Ability to run and debug scripts in Python, Bash or PowerShell.
- Familiarity with REST APIs and SaaS integration tools.
* MIGRATION: COBOL, CICS, DB2, JCL / Micro Focus Enterprise Developer / Server; knowledge of Eclipse, Linux (bash scripting) and Python.
* CORE BANKING: general banking knowledge.
  1. Requirements, functional and user-acceptance-testing analysts.
  2. Regulatory core-banking analysts (financial markets and means of payment): Business Analysts with extensive knowledge of CIRBE.
* HOST: maintenance analyst-programmers: COBOL, CICS, DB2, JCL, AS400. One AP with C1 German, one AP with B2 English, one senior programmer.
* DWH / ETL: Informatica, PL/SQL.
* Front end: HTML5, HTML, CSS3, Appverse, QWT, JavaScript, React.
* Back end: APIs (essential), Spring MVC, Maven, Git, Jira, Docker, Kubernetes, OpenShift, JUnit, Nexus, SonarQube, Spring Data, Swagger, Agile.
* Java: Java 8+, J2EE, J2SE, scripting (shell, Python, Perl), microservices, Spring Boot, SVN/Git, REST APIs, RAML, Jira, Remedy, GitLab.
* Microsoft: .NET, C#, Visual Basic, Angular, SQL Server.
* Mobility: iOS, Android.
* Other: C/C++, Visual Basic.
* Testing: QA, Scrum, automation, manual...
* DBA: Linux, Docker, Kubernetes, Jenkins, Ansible, Maven, Gradle, JUnit, Karma, Jasmine, Python, Java, Groovy, network administration.
* Database: SQL Server, ETLs, DTSX, Visual Basic, .NET, SAS.
* TIBCO: BPM, Linux.
Data Engineer
AstraZeneca · Barcelona, ES (June 4)
Tags: Python · TSQL · AWS · Office
Introduction to role:
Are you ready to dive into the world of data engineering and make a real impact? As a Junior Data Engineer, you'll be a vital part of our dynamic Data Team, collaborating closely with Senior Data Engineers and other team members. This role is perfect for someone with foundational skills in data engineering and a passion for creating automated data pipelines and innovative data solutions. Get ready to learn, grow, and contribute to exciting data initiatives!
Accountabilities:
• Assist in the development of Extract, Load, Transform (ELT) processes (see the sketch after this list).
• Support the creation and maintenance of data models and datasets.
• Assist in identifying and implementing improvements to data integrity and quality.
• Perform preliminary assessments of new data sources for quality and suitability.
• Participate in maintaining documentation related to data processes and solutions.
• Engage in data governance practices to ensure data consistency and alignment with best practices.
• Support the team in resolving data-related issues and undertake root cause analysis.
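For context on what an automated pipeline behind these accountabilities can look like, here is a minimal ELT sketch written as an Apache Airflow DAG (one of the tools named below); the task bodies and names are placeholders invented for the example, not AstraZeneca's setup.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        # Placeholder: copy raw files from the source into a warehouse staging area
        ...

    def transform():
        # Placeholder: run the SQL models that reshape the staged data
        ...

    with DAG(
        dag_id="example_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
        model = PythonOperator(task_id="transform", python_callable=transform)
        load >> model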
Essential Skills/Experience:
• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent practical experience.
• Basic understanding of data engineering principles and experience in a related role, including internships.
• Familiarity with programming languages such as Python and SQL.
• Ability to write maintainable code with tests, applying version control with GitHub.
• Exposure to some data engineering tools and services, such as AWS, Snowflake, dbt, Fivetran, Acceldata, or Apache Airflow.
• Ability to work collaboratively within a team setting and demonstrate eagerness to learn and adapt.
• Good communication skills to work effectively with both technical and non-technical stakeholders.
Desirable Skills/Experience:
• Experience or familiarity with cloud-based data ecosystems (e.g., AWS, Snowflake).
• Understanding of basic project management principles.
• Prior experience in the Pharmaceutical or Life Sciences industry is a plus.
When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions.
That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.
At AstraZeneca, you'll find an environment where innovation thrives and your career can flourish. Our entrepreneurial spirit combined with the resources of a global pharma company creates a unique atmosphere where you can define your career path while contributing to meaningful work. We are committed to making a difference for patients with rare diseases, fostering a culture of inclusivity, integrity, and collaboration. Here, your growth is aligned with our mission to change lives for the better.
Ready to embark on this exciting journey? Apply now and become part of a team that truly makes a difference!
DevOps Engineer
EPAM · Barcelona, ES (June 4)
Tags: Python · Agile · TSQL · Docker · Cloud Computing · Kubernetes · Ansible · Git · AWS · R · DevOps · QA · Terraform · LESS · Machine Learning · Office
We are looking for a DevOps Engineer to join our new Enterprise AI platform team.
This is an exciting opportunity to be part of a high-impact, highly technical group focused on solving some of the most challenging machine learning problems in the Life Sciences & Healthcare industry. You will bring proven experience in AWS cloud environments and a strong track record of designing and deploying large-scale production infrastructure and platforms.
You will play a critical role in shaping how we use technology, machine learning and data to accelerate innovation. This includes designing, building and deploying next-generation data engines and tools at scale.
This is a hybrid role, with the expectation of occasional office visits in Barcelona.
Responsibilities
- Develop and maintain the essential infrastructure and platform required to deploy, monitor and manage ML solutions in production, ensuring they are optimized for performance and scalability (see the sketch after this list)
- Collaborate closely with data science teams in developing cutting edge data science, AI/ML environments and workflows on AWS
- Liaise with R&D data scientists to understand their challenges and work with them to help productionize ML pipelines, models and algorithms for innovative science
- Take responsibility for all aspects of software engineering, from design to implementation, QA and maintenance
- Lead technology processes from concept development to completion of project deliverables
- Liaise with other teams to enhance our technological stack, to enable the adoption of the latest advances in Data Processing and AI
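One small slice of the monitoring responsibility above might look like the following boto3 sketch, which publishes a custom model metric to CloudWatch so it can be graphed and alarmed on; the namespace, metric and dimension names are invented for illustration.

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

    # Publish one data point for a hypothetical model-latency metric
    cloudwatch.put_metric_data(
        Namespace="MLPlatform/ModelMonitoring",
        MetricData=[
            {
                "MetricName": "PredictionLatencyMs",
                "Dimensions": [{"Name": "Model", "Value": "churn-v3"}],
                "Value": 42.0,
                "Unit": "Milliseconds",
            }
        ],
    )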
Requirements
- Significant experience with AWS cloud environments is essential. Knowledge of SageMaker, Athena, S3, EC2, RDS, Glue, Lambda, Step Functions, EKS and ECS is also essential
- Modern DevOps mindset, using best DevOps tools, such as Docker and Git
- Experience with infrastructure-as-code technology such as Ansible, Terraform and CloudFormation
- Strong software coding skills, with proficiency in Python; exceptional ability in any language will be recognized
- Experience managing an enterprise platform and service, handling new client demand and feature requests
- Experience with containers and microservice architectures e.g., Kubernetes, Docker and serverless approaches
- Experience with Continuous Integration and building continuous delivery pipelines, such as CodePipeline, CodeBuild and Code Deploy
- GxP experience
- Excellent communication, analytical and problem-solving skills
Nice to have
- Experience building large scale data processing pipelines e.g., Hadoop/Spark and SQL
- Use of Data Science modelling tools e.g., R, Python and Data Science notebooks (e.g., Jupyter)
- Multi cloud experience (AWS/Azure/GCP)
- Demonstrable knowledge of building MLOPs environments to a production standard
- Experience in mentoring, coaching and supporting less experienced colleagues and clients
- Experience with SAFe agile principles and practices
We offer
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses
EPAM is a leading digital transformation services and product engineering company with over 52,650 EPAMers in more than 55 countries and regions. Since 1993, our multidisciplinary teams have been helping make the future real for our clients and communities around the world. In 2018, we opened an office in Spain that quickly grew to over 1,450 EPAMers distributed between the offices in Málaga and Madrid as well as remotely across the country. Here you will collaborate with multinational teams, contribute to numerous innovative projects, and have an opportunity to learn and grow continuously.
Why Join EPAM
- WORK AND LIFE BALANCE. Enjoy more of your personal time with flexible work options, 24 working days of annual leave and paid time off for numerous public holidays.
- CONTINUOUS LEARNING CULTURE. Craft your personal Career Development Plan to align with your learning objectives. Take advantage of internal training, mentorship, sponsored certifications and LinkedIn courses.
- CLEAR AND DIFFERENT CAREER PATHS. Grow in engineering or managerial direction to become a People Manager, in-depth technical specialist, Solution Architect, or Project/Delivery Manager.
- STRONG PROFESSIONAL COMMUNITY. Join a global EPAM community of highly skilled experts and connect with them to solve challenges, exchange ideas, share expertise and make friends.
Middle/Senior Machine Learning Engineer (GenAI)
Provectus · Madrid, ES
Tags: Remote work · Python · Docker · Cloud Computing · AWS · Machine Learning
Join us at Provectus to be a part of a team that is dedicated to building cutting-edge technology solutions that have a positive impact on society. Our company specializes in AI and ML technologies, cloud services, and data engineering, and we take pride in our ability to innovate and push the boundaries of what's possible.
As an ML Engineer, you'll have every opportunity for development and growth.
Let's work together to build a better future for everyone!
Requirements:
- Comfortable with standard ML algorithms and underlying math
- Strong hands-on experience with LLMs in production, RAG architecture, and agentic systems (see the sketch after these lists)
- AWS Bedrock experience strongly preferred
- Practical experience with solving classification and regression tasks in general, feature engineering
- Practical experience with ML models in production
- Practical experience with one or more use cases from the following: NLP, LLMs, and Recommendation engines
- Solid software engineering skills (i.e., ability to produce well-structured modules, not only notebook scripts)
- Python expertise, Docker
- English level - strong Intermediate
- Excellent communication and problem-solving skills
- Practical experience with cloud platforms (AWS stack is preferred, e.g. Amazon SageMaker, ECR, EMR, S3, AWS Lambda)
- Practical experience with deep learning models
- Experience with taxonomies or ontologies
- Practical experience with machine learning pipelines to orchestrate complicated workflows
- Practical experience with Spark/Dask, Great Expectations
Responsibilities:
- Create ML models from scratch or improve existing models
- Collaborate with the engineering team, data scientists, and product managers on production models
- Develop the experimentation roadmap
- Set up a reproducible experimentation environment and maintain experimentation pipelines
- Monitor and maintain ML models in production to ensure optimal performance
- Write clear and comprehensive documentation for ML models, processes, and pipelines
- Stay updated with the latest developments in ML and AI and propose innovative solutions
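Since the role centres on RAG, here is a deliberately minimal sketch of the retrieve-then-generate pattern; the tiny in-memory "knowledge base", the toy bag-of-words embedding and the prompt format are all stand-ins for real components (a vector store, an embedding model and an LLM endpoint such as AWS Bedrock).

    from collections import Counter
    from math import sqrt

    DOCS = [  # stand-in for a real vector store
        "Invoices are processed within 30 days of receipt.",
        "Refund requests must include the original order number.",
        "Support is available Monday to Friday, 9:00-18:00 CET.",
    ]

    def embed(text):
        # Toy bag-of-words "embedding"; a real system would call an embedding model
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def retrieve(question, k=2):
        q = embed(question)
        return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

    question = "How long does invoice processing take?"
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # The prompt would then be sent to the LLM (e.g. via AWS Bedrock) for generation
    print(prompt)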
Data Engineer
Factorial · Barcelona, ES (June 3)
Tags: MySQL · Python · TSQL · SaaS · R · PostgreSQL · Office
Hello!
The Data Entry Team Data Engineer plays a critical role in managing and optimizing the flow of data from various sources into the SaaS platform. This position is responsible for data extraction, transformation, and loading (ETL), working closely with other data and configuration teams to ensure seamless integration of operational and historical data.
Key Responsibilities:
- Design and implement data pipelines to efficiently extract, transform, and load (ETL) data into the SaaS platform.
- Collaborate with the Data Analysts and Business Analysts to understand data requirements and ensure the proper structuring of data.
- Manage the integration of multiple data sources, ensuring consistency and accuracy during the data loading process.
- Develop and maintain scripts or tools for automating data entry processes, improving speed and accuracy.
- Handle data migrations from legacy systems, ensuring the integrity and compatibility of historical data.
- Troubleshoot and resolve data discrepancies, errors, and issues that arise during data loading.
- Monitor data performance and troubleshoot slow-running queries or processes.
Key Skills:
- Proficient in SQL, ETL tools, and programming languages (e.g., Python, R) for data manipulation.
- Experience with database management systems like MySQL or PostgreSQL.
- Strong understanding of data integration and transformation methodologies.
- Familiarity with SaaS platforms and their data architectures.
- Ability to handle complex data migration tasks effectively.
- Exceptional verbal and written communication skills in English.
Qualifications:
- Bachelor’s degree in Computer Science or a related discipline.
- 3+ years of experience in data engineering, data integration, or database management.
- Experience with ETL tools and data migration processes.
- Strong problem-solving skills and attention to detail.
- Experience with SaaS data structures is highly desirable.
A closer look at your responsibilities:
Data Pipeline Development:
- Design, build, and manage data pipelines that support the extraction, transformation, and loading (ETL) of data from various sources into the SaaS platform (a minimal sketch follows these sections).
- Ensure that data flows efficiently and securely between the client’s systems and the SaaS platform.
Data Integration:
- Handle the integration of different data sources, ensuring data consistency and accuracy during data transfer and loading.
- Develop scripts or automated processes to streamline data entry and reduce manual effort.
Data Transformation and Loading:
- Transform raw data into formats suitable for the SaaS platform, mapping data fields and ensuring compatibility with system configurations.
- Oversee the loading of data, ensuring that all data is accurately imported into the platform without loss or corruption.
Troubleshooting and Debugging:
- Identify, troubleshoot, and resolve any technical issues related to data migration or integration.
- Address any errors or bottlenecks in the data flow that could affect system performance or data accuracy.
Automation and Process Optimization:
- Develop automated solutions for recurring data entry tasks to reduce manual errors and improve efficiency.
- Continuously optimize data handling processes to enhance speed and accuracy.
Collaboration with Stakeholders:
- Work closely with the Data Analysts and Business Analysts to understand data requirements and business rules.
- Coordinate with the SaaS Implementation Team to ensure that data configurations align with system performance needs.
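To make the pipeline work above concrete, here is a minimal, self-contained ETL sketch using Python's standard-library sqlite3 module as a stand-in for the real source and target databases (the posting names MySQL/PostgreSQL); the tables, columns and validation rules are invented.

    import sqlite3

    # Stand-in databases; in practice these would be MySQL/PostgreSQL connections
    src = sqlite3.connect(":memory:")
    dst = sqlite3.connect(":memory:")

    src.execute("CREATE TABLE employees_raw (id INTEGER, name TEXT, salary TEXT)")
    src.executemany("INSERT INTO employees_raw VALUES (?, ?, ?)",
                    [(1, " Alice ", "52000"), (2, "Bob", None), (3, "Carol", "61000")])
    dst.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")

    # Extract
    rows = src.execute("SELECT id, name, salary FROM employees_raw").fetchall()

    # Transform: trim names, cast salaries, drop rows that fail validation
    clean = [(i, n.strip(), float(s)) for i, n, s in rows if n and s is not None]

    # Load
    dst.executemany("INSERT INTO employees VALUES (?, ?, ?)", clean)
    dst.commit()
    print(dst.execute("SELECT COUNT(*) FROM employees").fetchone()[0], "rows loaded")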
About us
Factorial is a fast-growing company founded in 2016 that builds all-in-one HR software. Our mission is to help SMEs automate HR workflows, centralize people data, and make better business decisions. Currently, we serve thousands of customers in over 60 countries worldwide and across industries, and we have built a diverse and multicultural team of over 900 people in our Barcelona, Brazil, Mexico, and US offices.
Our Values
- We own it: We take responsibility for every project. We make decisions, not excuses.
- We learn and teach: We're dedicated to learning something new every day and, above all, sharing it.
- We partner: Every decision is a team decision. We trust each other.
- We grow fast: We act fast. We believe the worst mistake is not learning from mistakes.
Benefits
We care about people and offer many benefits for employees:
- High growth, multicultural, and friendly environment
- Continuous training and learning based on your needs
- Alan private health insurance
- Healthy life with Wellhub (gyms, pools, outdoor classes)
- Save expenses with Cobee
- Language classes with Preply
- Get the most out of your salary with Payflow
And when at the office...
- Breakfast in the office and organic fruit
- Nora and Apeteat discounts
- Pet Friendly
Wanna learn more about us? Check our website!
Senior Data Engineer
Capgemini · Sevilla, ES (June 2)
Tags: Python · TSQL · AWS
Choosing Capgemini means choosing the chance to shape your career as you wish. You will be supported and inspired by a collaborative community of colleagues around the world, and you will be able to reimagine what is possible. Join our team and help the world's leading organisations unlock the value of technology and build a more sustainable and inclusive world.
Would you like to join us and take part in multi-sector projects in a team of data professionals such as Data Scientists, Data Engineers and Data Analysts? Our goal is to help our clients on the path of continuous innovation.
What will you do on the project? What will your role be?
In your first project, as a Data Engineer, you will join the global data platform team of one of our main clients.
You will lead and make decisions to turn messy incremental loads into clean, reliable data, including manual interventions to remove duplicate records (see the sketch below).
You will work with AWS Redshift, SQL and Data Warehouses/Data Marts, as well as Python and DynamoDB.
You will work in an international environment with a high level of interaction in English.
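A hedged illustration of the deduplication task mentioned above: with incremental extracts, a common approach is to keep only the latest version of each business key. The sketch below does this with pandas; in Redshift itself the same idea is usually expressed with ROW_NUMBER() over the key. The table and column names are invented.

    import pandas as pd

    # Hypothetical incremental extract: the same order appears once per change,
    # identified by a business key plus an update timestamp
    increments = pd.DataFrame({
        "order_id":   [101, 102, 101, 103, 102],
        "status":     ["new", "new", "paid", "new", "cancelled"],
        "updated_at": pd.to_datetime([
            "2024-05-01", "2024-05-01", "2024-05-03", "2024-05-04", "2024-05-05",
        ]),
    })

    # Keep only the most recent row per order_id
    latest = (
        increments.sort_values("updated_at")
                  .drop_duplicates(subset="order_id", keep="last")
    )
    print(latest)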
To do well in this position you will need between 6 and 9 years of experience, as well as knowledge of:
SQL and Data Warehousing.
Python and DynamoDB.
AWS Redshift.
Very fluent English (B2/C1).
Holding a disability certificate will be viewed positively, as part of our inclusion and diversity policy.
We will consider every application. We offer a very wide range of training, in person and online, including certifications. Even if you don't have 100% of the skills listed, we would love to meet you!
Our commitment to inclusion and equal opportunities means we have an Equality Plan and a Code of Ethics that guarantee the professional development of our staff and equal opportunities in recruitment, within an environment free of discrimination on the grounds of ethnicity, nationality, social origin, age, sexual orientation, gender expression, religion or any other personal, physical or social circumstance.
What will you like about working here?
We have a very complete catalogue of development and work-life-balance measures, for example:
- A unique working environment, highly rated by our professionals in our periodic surveys.
- Wellbeing HUB, including policies and actions for physical (Wellhub) and mental health.
- 24 days of holiday + 2 personal days + 24 and 31 December + the option to buy up to 7 extra days of holiday per year.
- FlexAbroad: the possibility of working remotely from another country for 45 days.
- Flexible compensation plan (health insurance, transport, training, restaurant card or meal subsidy, nursery...).
- Continuous training: you will have access to MyLearning, Capgemini University, our Digital Campuses and professional communities, and platforms such as Coursera, Udemy, Pluralsight, Harvard ManageMentor and Education First for languages (English, French, German...), among others!
- Participation in volunteering and social action initiatives with our Sustainability, Inclusion and Equality groups.
- Support in your first steps through our Buddies programme.
- Life and accident insurance.
Capgemini is a global leader in transforming clients' businesses by harnessing the full power of technology. We are guided by the purpose of achieving an inclusive and sustainable future through technology and the energy of those who develop it. We are a responsible and diverse company, an international leader in IT and engineering services with more than 360,000 professionals in over 50 countries. With a solid 55-year heritage and deep industry expertise, clients trust Capgemini to address the full breadth of their business needs, from strategy and design to operations, powered by the fast-moving, innovative world of cloud, data, AI, connectivity, software, and digital engineering and platforms. The Group reported 2022 global revenues of €22 billion.
Rewrite your future. Join the team!
www.capgemini.com/es-es
DevOps Engineer
Pluxee · Madrid, ES (June 2)
Tags: API · Python · Azure · Cloud Computing · Ansible · Git · PowerShell · Bash · DevOps · Terraform
Pluxee is a global player in employee benefits and engagement that operates in 31 countries. Pluxee helps companies attract, engage, and retain talent thanks to a broad range of solutions across Meal & Food, Wellbeing, Lifestyle, Reward & Recognition, and Public Benefits.
Powered by leading technology and more than 5,000 engaged team members, Pluxee acts as a trusted partner within a highly interconnected B2B2C ecosystem made up of more than 500,000 clients, 36 million consumers and 1.7 million merchants.
Conducting its business as a trusted partner for more than 45 years, Pluxee is committed to creating a positive impact on all its stakeholders, from driving business to local communities, to supporting wellbeing at work for employees while protecting the planet.
🚀 Your next challenge
We’re seeking a hands-on, automation-first DevOps Engineer to join our global engineering team. You’ll lead the design and implementation of secure and scalable infrastructure in Azure, focusing on robust CI/CD pipelines, Infrastructure as Code, and GitOps. You'll be responsible for optimizing deployments, managing AKS clusters, and automating with tools like Terraform, Ansible, and Python, all within the framework of Microsoft’s Cloud Adoption Framework (CAF).
🛠️ Key Responsibilities:
- Build and maintain efficient CI/CD pipelines with Azure DevOps (YAML-based), enabling secure, repeatable, and fast delivery.
- Define, provision, and manage infrastructure using Terraform and Ansible, ensuring modular, reusable, and compliant configurations.
- Operate and optimize AKS clusters, including autoscaling, ingress controllers, namespaces, and RBAC.
- Apply GitOps principles using Flux v2 (or ArgoCD), managing cluster state declaratively from Git.
- Automate operational tasks and platform workflows using Python, Bash, or PowerShell (see the sketch after this list).
- Integrate Azure-native services (App Services, Functions, API Management, Gateways) in deployment and infra flows.
- Collaborate with development and security teams to align infrastructure and deployment workflows with the Microsoft Cloud Adoption Framework (CAF).
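As a small example of the automation responsibility above, the sketch below uses the official kubernetes Python client to list pods that are not Running in a namespace, the kind of health check that might feed an AKS operations workflow; the namespace is a placeholder and a real workflow would be more involved.

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (e.g. after `az aks get-credentials`)
    config.load_kube_config()

    v1 = client.CoreV1Api()
    NAMESPACE = "platform"  # placeholder namespace

    # Report pods that are not in the Running phase
    for pod in v1.list_namespaced_pod(NAMESPACE).items:
        if pod.status.phase != "Running":
            print(f"{pod.metadata.name}: {pod.status.phase}")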
🧰 Required Skills & Experience:
- Proven experience designing and managing CI/CD pipelines in Azure DevOps (multi-stage, templates, approvals).
- Expert-level knowledge in Terraform (remote state, workspaces, modules) and IaC best practices.
- Experience with Ansible for configuration management and provisioning.
- Proficiency in scripting with Python, Bash, or PowerShell.
- Strong background in container orchestration with AKS/Kubernetes, including security and autoscaling practices.
- Familiarity with GitOps tooling such as Flux or ArgoCD.
- Experience with observability platforms like Azure Monitor, Prometheus, or Grafana.
- Understanding of cloud security: secrets management, policy enforcement, image scanning, etc.
- Working knowledge of Azure services: App Services, Functions, API Management, Gateways.
- Professional working proficiency in English (Spanish is a plus).
- Certification in AZ-400, CKA/CKAD, or Terraform Associate is a plus.
- Experience with Azure Policy, Landing Zones, and Blueprints is a plus.
- Knowledge of tools such as Helm, Vault, ArgoCD, or Kustomize is a plus.
- Knowledge of FinOps practices and cloud cost optimization is a plus.
☀️ Happy at work
1) A meaningful job: Be the change! Help us build the future of employee benefits by bringing to life sustainable and personalized experiences and contribute to make a real impact on millions of lives. Our business model delivers not just for individuals but their communities too, by supporting local businesses and economies.
2) A great culture: People matter – a lot! Be part of a multicultural team that moves as one in a fast-paced and innovative environment. We respect and care authentically about our people, we embrace wellbeing, work-life balance and new ideas, and we have a lot of fun!
3) An empowering environment: Be yourself! At Pluxee we proudly embrace diversity and value the uniqueness of our talents, fostering an inclusive workplace where all abilities are celebrated, and equal learning and growing opportunities are a given.
Backend Data Engineer
Social You · Madrid, ES (June 2)
Tags: Java · Node.js · Python · TSQL · Cloud Computing · Git · AWS · PostgreSQL · Spring
We are looking for a Backend Developer with experience in distributed environments to join a strategic project in the financial sector, focused on private banking and investment solutions.
We are looking for someone with at least 3 years of experience, passionate about software engineering and driven by technical excellence, who wants to grow within a constantly evolving multidisciplinary team.
Requirements:
Advanced backend development experience with Java (Spring Boot), Python or Node.js.
Solid knowledge of designing and consuming RESTful APIs.
Experience in data transformation and pipeline development with relational databases such as PostgreSQL, including data modelling and SQL query optimisation (see the sketch below).
Familiarity with AWS environments and good cloud design practices.
Experience with version control tools (Git) and CI/CD workflows.
AWS-related certifications (a plus).
Knowledge of the financial sector, especially investments and private-banking data models (a plus).
Good level of English (a plus).
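A hedged sketch of the PostgreSQL pipeline work listed above: the snippet reads raw rows, applies a transformation in Python and batch-writes the result with psycopg2 in a single round trip; the connection string, tables and columns are placeholders.

    import psycopg2
    from psycopg2.extras import execute_values

    conn = psycopg2.connect("dbname=bank user=etl host=localhost")  # placeholder DSN

    with conn, conn.cursor() as cur:
        # Extract raw positions (hypothetical table and columns)
        cur.execute("SELECT account_id, instrument, amount_cents FROM positions_raw")
        rows = cur.fetchall()

        # Transform: convert integer cents into a decimal amount in euros
        clean = [(a, i, c / 100.0) for a, i, c in rows if c is not None]

        # Load with one batched statement instead of row-by-row inserts
        execute_values(
            cur,
            "INSERT INTO positions (account_id, instrument, amount_eur) VALUES %s",
            clean,
        )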
We offer:
Competitive pay + social benefits
Hybrid working model based in Madrid