Junior Data Engineer
BNP Paribas · Madrid, ES
Skills: TSQL · Kubernetes · Elasticsearch · Scala · Kafka · Spark
About The Job
Junior Data Engineer – South Europe Technologies (S.ET), BNP Paribas Personal Finance
South Europe Technologies (S.ET) is one of BNP Paribas Personal Finance's shared services centers, delivering the best IT solutions to BNP Paribas Personal Finance entities around the world:
- Applications Management (Architecture, Project management, Development, and Quality Assurance)
- IT Risks & Cybersecurity services
- Platforms management
- Data
- Ad-hoc, T&M development
As a Data Engineer, your mission is to design, implement, and optimize robust data pipelines and infrastructure, enabling reliable, secure, and high-performance data flows throughout the organization. You will work closely with stakeholders and multidisciplinary teams to support data integration, transformation, and delivery processes, contributing to the ongoing evolution and stability of our data platforms.
Your Main Activities Are To
- Implement and maintain orchestrators and scheduling systems to automate data pipeline execution (e.g., Airflow as a service).
- Modify and enhance existing codebases in line with business requirements, continuously driving improvements in performance and maintainability.
- Monitor, ensure, and optimize the performance and security of the data infrastructure, applying best practices in Data Engineering.
- Contribute to production support, incident resolution, and anomaly correction, as well as support functional and technical evolutions to ensure process stability.
- Develop and maintain comprehensive technical documentation to ensure effective knowledge capitalization.
- Assist in building and maintaining data pipelines using Spark on Scala for collecting and processing data from diverse sources such as Kafka topics, APIs, HDFS, and structured databases.
- Support data transformation activities and contribute to data quality assurance, ensuring the reliability and accuracy of information.
- Help set up CI/CD pipelines under the guidance of senior team members to automate testing and deployment.
- Learn and employ orchestration tools like Airflow for scheduling and automating data workflows.
- Make incremental improvements to code and contribute to performance enhancements as required, aligned with business needs.
- Participate in monitoring data infrastructure for performance and security, learning and applying industry best practices.
- Assist with production support tasks, including incident identification and resolution, and support ongoing technical improvements.
- Document and update technical processes to ensure clear records of changes and procedures.
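The pipeline work described above (collecting raw events, transforming them, and enforcing data quality) can be illustrated with a minimal sketch in plain Python. This is only an assumption-laden toy: the team's actual stack is Spark on Scala, and the record fields and validation rules here are invented for illustration.

```python
# Hypothetical sketch of a single pipeline step: parse raw messages,
# keep records that pass quality checks, and set aside rejects for review.
# Field names (event_id, timestamp, amount) are illustrative, not a real schema.
import json

REQUIRED_FIELDS = {"event_id", "timestamp", "amount"}

def transform(raw_messages):
    """Parse JSON messages; return (valid records, rejected raw inputs)."""
    valid, rejected = [], []
    for raw in raw_messages:
        try:
            record = json.loads(raw)
        except json.JSONDecodeError:
            rejected.append(raw)
            continue
        # Data quality gate: required fields present and amount non-negative.
        if REQUIRED_FIELDS <= record.keys() and record["amount"] >= 0:
            record["amount_cents"] = int(round(record["amount"] * 100))
            valid.append(record)
        else:
            rejected.append(raw)
    return valid, rejected
```

Keeping rejects instead of silently dropping them is what makes downstream quality assurance (one of the listed responsibilities) possible.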
- Good knowledge of
- Spark on Scala
- CI/CD tools (Gitlab, Jenkins…)
- HDFS and structured databases (SQL)
- Full understanding of
- Apache Airflow
- Streaming processes (Kafka, event streams…)
- S3 storage
- Shell script
- Some knowledge of
- Kubernetes
- Optionally, as a plus
- Elasticsearch and Kibana
- HVault
- Dremio as a tool to virtualize data
- Dataiku
- Demonstrated knowledge of the banking sector and related business processes
- Experience in managing business and IT relationships
- Ability to understand, explain, and support change initiatives
- Results-driven mindset and capacity to deliver
- Strong collaboration and teamwork skills
- Ability to synthesize and simplify complex technical topics
- Proficiency in analytical thinking and resilience in handling challenges
- Desirable: Familiarity with tools such as DWH, Dataiku, Spark, Airflow, S3, Kubernetes, and CI/CD platforms
- English: B2 level or higher
- French: B1 level (optional)
- Training programs, career paths, and opportunities for internal mobility—nationally and internationally—thanks to our global presence
- Diversity and Inclusion Committee fostering an inclusive work environment, with employee communities organizing awareness actions (PRIDE, We Generations, MixCity, etc.)
- Corporate volunteering program (1MillionHours 2 Help) supporting employees in their commitment to volunteering activities
- Flexible compensation plan
- Hybrid telecommuting model (50%)
- 31 vacation days
BNP Paribas Group in Spain is an equal opportunity employer, proud to provide equal employment opportunities to all job seekers. We are committed to ensuring that no individual is discriminated against on the grounds of age, disability, gender reassignment, marital or civil partnership status, pregnancy and maternity/paternity, race, religion or belief, sex, or sexual orientation. Equity and diversity are at the core of our recruitment policy, as we believe they foster creativity and efficiency, increasing performance and productivity. We strive to reflect the society in which we live, while keeping in line with the image of our clients.
About Our Culture
We are proud to create, maintain, and develop strategic business applications for BNP Paribas Personal Finance entities worldwide, maintaining high service levels and delivering added value to our customers.
Working in a multicultural environment, we encourage our people to develop their talents and skills, offering a variety of career opportunities and internal mobility programs, within local S.ET teams or other entities across the Group, both in Spain and internationally.
We value the experience of our employees and strive to maintain a balanced work environment, with flexibility regarding work schedules and respect for personal time. Our hybrid working model reflects our belief that social connection enhances daily activities.
Diversity and inclusion are among our core values, as S.ET is an equal opportunity employer. Therefore, we are committed to ensuring employment opportunities regardless of race, skin color, beliefs, religion, nationality, ethnic background, age, sex, sexual orientation, marital status, or political opinions.
Junior Data Engineer
Fever · Madrid, ES
Skills: Python · TSQL · Django · PostgreSQL · Machine Learning · Office
Hey there!
We’re Fever, the world’s leading tech platform for culture and live entertainment.
Our mission? To democratize access to culture and entertainment. With our proprietary cutting-edge technology and data-driven approach, we’re revolutionizing the way people engage with live entertainment.
Every month, our platform inspires over 300 million people in +40 countries (and counting) to discover unforgettable experiences while also empowering event creators with our data and technology, helping them scale, innovate, and enhance their events to reach new audiences.
Our results? We’ve teamed up with major industry leaders like Netflix, F.C. Barcelona, and Primavera Sound, presented international award-winning experiences, and are backed by several leading global investors! Impressive, right?
To achieve our mission, we are looking for bar-raisers with a hands-on mindset who are eager to help shape the future of entertainment!
Ready to be part of the experience?
Now, let’s discuss this role and what you will do to help achieve Fever’s mission.
About The Role
- You’ll be part of the Data organization, building and operating the core technologies that enable data scientists, analysts and the different business units to leverage rich data in efficient and innovative ways to generate impact and connect people to the most relevant real-world experiences.
- You'll own parts of our data warehouse and the resulting data products that are used daily across the company to inform all sorts of decisions and models.
- You'll ideate and implement tools and processes that increase our ability to exploit our diverse sources of data to solve business problems, understand behaviors, …
- You'll work closely with other business units to understand the challenges they face and apply an engineering vision to create structured and scalable solutions to those challenges.
- You'll contribute to the development of a complex data and software ecosystem using the latest technologies in the data and software engineering stack.
- You will be fully integrated into the team. During this month you will have already participated in onboarding, pair programming, one to one, retrospective sessions, and you will have met the different departments at Fever.
- You will get familiar with Fever’s tech stack and frameworks used to develop our data strategy.
- You will attend some of the Fever Original’s experiences like Candlelight.
- You’ll be able to come up with solutions to new difficult problems and you'll be generating impact and creating new business opportunities.
- You’ll have responsibilities and ownership over parts of our Data Warehouse or other critical tools.
- You will participate in some of the hackdays or hackathons we organise with other teams, and you will mostly know everybody from the data and engineering communities.
- You’ll contribute to the overall health of our data ecosystem, improving performance, scalability, robustness, …
- You'll be able to identify gaps in our platforms and processes and be a champion for continuous improvement.
- You’ll be mentoring other new joiners to the team.
- You will participate in some of the team-building events we organise for your team or the whole engineering team.
- Have a data-oriented mindset to understand complex data assets and business challenges, and use engineering skills to solve them.
- Build trusted data assets that power Fever's decision making.
- Build automations to create huge business opportunities.
- Design, build and support modern and scalable data infrastructure, e.g.:
- Write robust, maintainable code to orchestrate our ETL workflows and build data quality monitoring processes
- Extend our data APIs
- Build data tools to make the company more data-driven
- Understand the technical trade-offs of different solutions, implement them, and make them scalable
- Collaborate with other engineers, and stakeholders to understand what data is required and how best to make it available in our Data Platform.
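Orchestrating ETL workflows, mentioned above, ultimately means running tasks in dependency order. A toy sketch of that idea follows; task names are made up, and a real deployment would delegate this to a tool like Airflow rather than hand-rolling it:

```python
# Toy illustration of what a workflow orchestrator does: resolve task
# dependencies into a valid execution order (Kahn's topological sort).
# Task names are hypothetical; real pipelines would use e.g. Airflow.
from collections import deque

def execution_order(dependencies):
    """dependencies maps task -> set of upstream tasks it waits on."""
    indegree = {task: len(ups) for task, ups in dependencies.items()}
    downstream = {task: [] for task in dependencies}
    for task, ups in dependencies.items():
        for up in ups:
            downstream[up].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in sorted(downstream[task]):
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dependencies):
        raise ValueError("cycle in task graph")
    return order
```

The cycle check at the end mirrors what orchestrators enforce: a workflow must be a directed acyclic graph before it can be scheduled.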
- You have a background in computer science, data engineering, or data science with a strong academic record in your bachelor's program. It would be an advantage if you have a Master's degree in one of the above areas.
- You are a collaborative team player with strong communication skills, adaptable to a multidisciplinary, international, and fast-paced environment.
- You are proactive, driven, and bring positive energy to your work, thriving in dynamic settings.
- You are familiar with software engineering best practices, and you take pride in writing clean, robust, and maintainable code.
- You possess strong analytical and problem-solving abilities, backed by solid software engineering skills.
- You are proficient in Python 3, with a deep understanding of SQL for data manipulation and querying.
- You have experience handling large volumes of data from diverse sources.
- You are proficient in business English, ensuring clear and effective communication in a professional setting.
- Collaborated effectively in a multidisciplinary team, interacting with roles like data analysts, data scientists, marketing, and product managers to meet project goals and deliver actionable insights.
- Gained experience with scheduling and workflow orchestration tools, such as Airflow, or similar technologies, to manage data pipelines and automate tasks.
- Worked with databases like Snowflake and PostgreSQL for data storage, retrieval, and management, ensuring efficient and accurate data handling.
- Utilized Business Intelligence (BI) tools, such as Metabase or Superset, for data visualization and reporting to support decision-making processes.
- Integrated and interacted with APIs from popular marketing platforms (e.g., Facebook, Google, Instagram) to extract and process data relevant for analysis.
- Developed data-powered tools and applications, either as part of a professional setting or through personal projects, showcasing hands-on skills in practical data applications.
- Gained familiarity with tools and processes designed to support reproducible, production-ready machine learning applications, contributing to ML workflows.
- Acquired knowledge of backend frameworks, including Django, and their use cases in data engineering and application development.
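Extracting data from the kinds of marketing-platform APIs listed above typically follows a cursor-pagination loop. A minimal sketch with an injected fetch function is shown below; the page shape and field names (`data`, `next_cursor`) are assumptions, not any specific platform's API:

```python
# Hypothetical sketch of cursor-based pagination. fetch_page is injected,
# so the loop stays independent of any particular marketing platform's API.
def extract_all(fetch_page):
    """fetch_page(cursor) -> {"data": [...], "next_cursor": str or None}."""
    records, cursor = [], None
    while True:
        page = fetch_page(cursor)
        records.extend(page["data"])
        cursor = page.get("next_cursor")
        if cursor is None:  # last page reached
            return records
```

Injecting the fetch function also makes the loop easy to unit-test with canned pages, without calling a live API.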
- Attractive compensation package consisting of base salary and the potential to earn a significant bonus for top performance.
- Stock options.
- Opportunity to have a real impact in a high-growth global category leader
- 40% discount on all Fever events and experiences
- Home office friendly
- Responsibility from day one and professional and personal growth
- Great work environment with a young, international team of talented people to work with!
- Health insurance and other benefits such as Flexible remuneration with a 100% tax exemption through Cobee.
- English Lessons
- Gympass Membership
- Possibility to receive in advance part of your salary by Payflow.
- Salary range: 25.000 - 35.000 EUR
If you want to learn more about us: Fever's Blog | Tech.Eu | TechCrunch
Fever is committed to creating an inclusive and diverse workspace where everyone's background and ideas count. Our main goal is to find the best possible talent regardless of place of birth, racial or ethnic origin, gender, gender identity, religion, opinion, sexual orientation, disability, pregnancy, marital status, age or caring responsibilities. We encourage everyone to apply!
If you require any kind of accommodation during the selection process please contact our Talent team so we can help you by providing a welcoming and seamless journey.
If you want to know more about how Fever processes your personal data, click here Fever - Candidate Privacy Notice
AI Lab - Junior Machine Learning Engineer
HP UK · Sant Cugat del Vallès, ES
Skills: C# · Python · C++ · Machine Learning · Office
Description -
Junior Machine Learning Engineer (GenAI Modelling)
The AI Lab (under the well-known Technology and Innovation Office) is responsible for bringing state-of-the-art research in Generative AI (and AI in general) to HP’s product portfolio to solve top user needs and enable all-new types of user experiences. The team will be in two main locations: Sant Cugat (ESP) and Palo Alto (US). The AI Lab’s goal is to operationalize the Gen AI model lifecycle, build models to support the top initiatives within Personal Systems, and do research on Generative AI. We are looking for a Junior Machine Learning Engineer with expertise and passion in the Gen AI space.
The AI Lab is seeking an individual to join our team as an ML Engineer in our HP Sant Cugat R&D unit. The candidate will research and develop generative AI models and work with other team members and business unit partners to develop proof-of-concept prototypes and help move technologies to product. Also, the candidate should be able to collaborate with other scientists, developers, and product managers on new applications. Strong communication skills are required.
Responsibilities
- Help execute the technical strategy of the AI Lab with the goal of supporting HP’s ambitious AI roadmap.
- Support the application of new processes and standards to ensure that the teams are building high quality and safe models.
- Develop tooling to support the model lifecycle and encourage the use of best practices.
- Adopt state-of-the-art Gen AI modelling (finetuning) and evaluation techniques.
- Help develop strategic engineering proof of concepts in the Gen AI space.
- BSc or MSc in Computer Science, Artificial Intelligence, Mathematics, Data Science, or any other related discipline or commensurate work experience or demonstrated competence. MSc related to Generative AI would be a plus.
- Between 0 and 3 years of work experience; internships related to the job content would also be valuable. Exposure to Gen AI in previous projects / internships.
- Good communication and interpersonal skills, with the ability to collaborate effectively and learn with agility.
- Programming Language/s certification (Python, and C++ / C# is a plus).
- Knowledge of machine learning, deep learning, generative AI and statistical modelling.
- Finetuning or evaluation work pursued in generative AI and machine learning (LLMs, LVMs) development and deployment.
- Experience leveraging models from repositories such as Hugging Face.
- Experience with deep learning frameworks such as PyTorch.
- Optional: Contributions to the AI community (e.g., publications, patents, open-source projects, or participation in conferences and workshops).
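Evaluation work of the kind listed above often starts with simple normalized-match metrics over model outputs. A minimal sketch follows; the normalization rules are illustrative assumptions, and real Gen AI evaluation suites layer far richer metrics on top:

```python
# Illustrative sketch of a basic Gen AI evaluation step: score model
# outputs against references with a normalized exact-match metric.
import string

def normalize(text):
    """Lowercase, strip ASCII punctuation, and collapse whitespace."""
    text = text.lower().translate(str.maketrans("", "", string.punctuation))
    return " ".join(text.split())

def exact_match_accuracy(predictions, references):
    """Fraction of predictions that match their reference after normalization."""
    hits = sum(normalize(p) == normalize(r)
               for p, r in zip(predictions, references))
    return hits / len(references)
```

Normalizing before comparison avoids penalizing a model for trivial formatting differences, which is usually the first design decision in any text-evaluation harness.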
- Opportunity to work in an international organization with colleagues coming from all over the world.
- Diverse, continued internal growth and career opportunities. Including HP’s own learning platform and LinkedIn Learning.
- An attractive benefits package:
- Health & Life insurance.
- Lunch at reduced prices at our canteen/ ticket restaurant vouchers.
- HP product discount.
- Work life balance / flexible working hours.
- Women, Pride, Young employees, Sustainability and DisAbility! Just a few of our fantastic global business networks you can get involved with locally.
- We also dedicate time and resources to contribute with our community through Corporate Volunteering activities, including our onsite HP Charity day.
- Do you like to give back to the community? Then join one of our many volunteering teams or be a part of the incredible HP charity day held on site annually.
- Love sports? Then take advantage of our sports center (indoor and outdoor) with 25+ regular coordinated activities.
- We have an onsite Doctor and medical team for our employees, including services such as: nutrition, physiotherapy, and general health.
- Printing Happy hour – from photographs to large posters. And Hands-on workshops to print with the latest technology – from wall covers to 3D printed models.
- Our Women Network organizes activities such as networking, the promotion of STEM vocations, and talks on improving business acumen, work-life balance, and skills of the future, etc.
Software
Schedule -
Full time
Shift -
No shift premium (Spain)
Equal Opportunity Employer (EEO) -
HP, Inc. provides equal employment opportunity to all employees and prospective employees, without regard to race, color, religion, sex, national origin, ancestry, citizenship, sexual orientation, age, disability, or status as a protected veteran, marital status, familial status, physical or mental disability, medical condition, pregnancy, genetic predisposition or carrier status, uniformed service status, political affiliation or any other characteristic protected by applicable national, federal, state, and local law(s).
Please be assured that you will not be subject to any adverse treatment if you choose to disclose the information requested. This information is provided voluntarily. The information obtained will be kept in strict confidence.
If you’d like more information about HP’s EEO Policy or your EEO rights as an applicant under the law, please click here: Equal Employment Opportunity is the Law | Equal Employment Opportunity is the Law – Supplement
Cloud Engineer (posted 12 Aug)
EPAM · Barcelona, ES
Skills: Python · Agile · TSQL · Docker · Cloud Computing · Kubernetes · Ansible · Git · AWS · R · DevOps · QA · Terraform · LESS · Machine Learning · Office
We are looking for a Cloud Engineer to join our new Enterprise AI platform team.
This is an exciting opportunity to be part of a high-impact, highly technical group focused on solving some of the most challenging machine learning problems in the Life Sciences & Healthcare industry. You will bring proven experience in AWS cloud environments and a strong track record of designing and deploying large-scale production infrastructure and platforms.
You will play a critical role in shaping how we use technology, machine learning and data to accelerate innovation. This includes designing, building and deploying next-generation data engines and tools at scale.
This is a hybrid role, with an expectation of working from the Barcelona office one day per week.
RESPONSIBILITIES
- Develop and maintain the essential infrastructure and platform required to deploy, monitor and manage ML solutions in production, ensuring they are optimized for performance and scalability
- Collaborate closely with data science teams in developing cutting edge data science, AI/ML environments and workflows on AWS
- Liaise with R&D data scientists to understand their challenges and work with them to help productionize ML pipelines, models and algorithms for innovative science
- Take responsibility for all aspects of software engineering, from design to implementation, QA and maintenance
- Lead technology processes from concept development to completion of project deliverables
- Liaise with other teams to enhance our technological stack, to enable the adoption of the latest advances in Data Processing and AI
REQUIREMENTS
- Significant experience with AWS cloud environments is essential. Knowledge of SageMaker, Athena, S3, EC2, RDS, Glue, Lambda, Step Functions, EKS and ECS is also essential
- Modern DevOps mindset, using best DevOps tools, such as Docker and Git
- Experience with infrastructure as code technology such as Ansible, Terraform and Cloud Formation
- Strong software coding skills, with proficiency in Python; however, exceptional ability in any language will be recognized
- Experience managing an enterprise platform and service, handling new client demand and feature requests
- Experience with containers and microservice architectures e.g., Kubernetes, Docker and serverless approaches
- Experience with Continuous Integration and building continuous delivery pipelines, such as CodePipeline, CodeBuild and CodeDeploy
- GxP experience
- Excellent communication, analytical and problem-solving skills
NICE TO HAVE
- Experience building large scale data processing pipelines e.g., Hadoop/Spark and SQL
- Use of Data Science modelling tools e.g., R, Python and Data Science notebooks (e.g., Jupyter)
- Multi cloud experience (AWS/Azure/GCP)
- Demonstrable knowledge of building MLOPs environments to a production standard
- Experience in mentoring, coaching and supporting less experienced colleagues and clients
- Experience with SAFe agile principles and practices
WE OFFER
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses
Azure Cloud Engineer (posted 12 Aug)
Devoteam · Madrid, ES
Skills: Remote work · Azure · Cloud Computing · AWS · DevOps · Terraform · Machine Learning · Salesforce
Company description
Devoteam is a leading European consultancy focused on digital strategy, technology platforms, cybersecurity, and business transformation through technology. Centered on 6 areas of expertise, we address our clients' strategic challenges: Digital Business & Products, Data-driven Intelligence, Distributed Cloud, Business Automation, Cybersecurity, and Sustainability achieved through Digitalization.
Technology is in our DNA, and we believe in it as a lever capable of driving change for the better, maintaining a balance that allows us to offer our clients first-class technological tools, always with the closeness and professionalism of a team that acts as a guide along the way.
Our 25 years of experience make us an innovative, consolidated, and mature consultancy that enables the development of our 8,500 people, continuously certifying our consultants in the latest technologies and counting on experts in: Cloud, BI, Data Analytics, Business Process Excellence, Customer Relationship Management, Cybersecurity, Digital Marketing, Machine Learning, and Software Engineering and Development.
Devoteam was named 2021 Partner of the Year by the 5 cloud leaders: AWS, Google Cloud, Microsoft, Salesforce and ServiceNow.
#CreativeTechForBetterChange
Job description
Would you like to have full ownership of the cloud infrastructure in a data environment? Are you motivated by working autonomously, making key technical decisions without depending on third parties?
At Devoteam we are looking for an Azure Cloud Engineer to support a Data & AI team, as the sole person responsible for the infrastructure. We are looking for someone with real experience in Terraform, resolution of complex tickets, knowledge of services such as Databricks or Azure Data Factory... and above all, the ability to move forward independently, without depending on anyone to keep things going.
What will you do day to day?
- Design and deploy infrastructure on Azure using Terraform (IaC).
- Autonomously resolve technical tickets related to:
- Azure Data Factory (linked services, Key Vault).
- Azure Databricks (configuration, permissions, connectivity).
- Security groups, networking, connectivity, etc.
- Act as the point of connection between the Data, Infrastructure, and Support teams (a process accelerator).
- Avoid technical blockers that depend on external/offshore support: you will handle them directly.
- Identify technical improvements and execute them on your own initiative.
- Act as the single cloud infrastructure reference for the data team.
Profile we are looking for
- Experience with Terraform and deploying infrastructure on Azure.
- Solid knowledge of services such as ADF, Databricks, VNETs, NSGs, Key Vault.
- Ability to solve problems without waiting for instructions: a high level of autonomy and proactivity.
- Previous experience working alone or with minimal technical supervision.
- A solution-oriented, communicative approach focused on simplifying.
Certifications we value
- Azure Administrator Associate
- Azure DevOps Engineer Expert
- Terraform Associate
What do we offer?
- Competitive salary, reviewable depending on profile.
- A stable project with real technical responsibility.
- A culture that values autonomy, trust, and continuous improvement.
- Ongoing training and official certifications covered.
- 100% remote work.
Data Engineer (posted 12 Aug)
Serem · Madrid, ES
Skills: Remote work · Agile · TSQL · Azure · DevOps
At serem we are committed to a wide range of projects and want to count on the best professionals in the sector.
We are currently looking for a Data Engineer for an Agile DevOps data migration team.
Description:
Services for a data migration project at a client that is accelerating its transformation into an innovative, data-centric organization. A transformation that ensures the transition from legacy systems to modern platforms.
Tasks:
- Design, develop and maintain ETL pipelines using Informatica PowerCenter and Azure Data Factory;
- Transform and migrate data from source systems to target systems, prioritizing data quality, consistency and added value;
- Work closely with analysts, solution designers, testers and developers to deliver end-to-end migration solutions;
- Advise on migration strategy, technical solutions and data mapping challenges;
- Continuously develop your skills and share your knowledge with the team and the wider organization.
Skills:
We are looking for someone who combines technical expertise with strong collaboration and communication skills. Your profile includes:
- A bachelor's degree in IT or a related field;
- At least 2 years of experience in data engineering or ETL development;
- Proven experience with Informatica PowerCenter and Azure Data Factory;
- Solid knowledge of SQL and experience in data analysis and transformation;
- Familiarity with Agile/Scrum and a flexible, T-shaped mindset;
- A proactive attitude, capable of both structured execution and creative problem solving;
- Experience with SAP Data Services is a plus.
A proactive, solution-oriented attitude in a constantly changing environment;
A willingness to give and receive feedback, and to grow continuously with the team.
English level C1 required.
100% remote work.
We foster a multicultural and inclusive work environment; we do not discriminate on the basis of age, gender or beliefs, and we offer equal opportunities to all staff.
We carry out our activities under the principles of environmental care, sustainability and corporate social responsibility, collaborating in reforestation and sustainability projects.
We support the 10 principles of the Global Compact and the 17 Sustainable Development Goals on human rights, labour conditions, the environment and anti-corruption.
Our recruitment processes are run to high quality standards, with hiring decisions based on the candidate's experience and skills.
We are a leading Spanish technology services and talent attraction company, present in the market since 1995. We have more than 600 employees on both national and international projects in the IT sector.
Lead Data Engineer (posted 11 Aug)
Thoughtworks · Madrid, ES
Skills: Remote work · TSQL · NoSQL · TDD · Big Data
Lead data engineers at Thoughtworks develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. They might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On projects, they will be leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. Alongside hands-on coding, they are leading the team to implement the solution.
Job responsibilities
- You will lead and manage data engineering projects from inception to completion, including goal-setting, scope definition and ensuring on-time delivery with cross team collaboration
- You will collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and data quality
- You will design, develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
- You will be responsible for creating, designing and developing intricate data processing pipelines, addressing clients' most challenging problems
- You will collaborate with data scientists to design scalable implementations of their models
- You will write clean and iterative code based on TDD and leverage various continuous delivery practices to deploy, support and operate data pipelines
- You will lead and advise clients on how to use different distributed storage and computing technologies from the plethora of options available
- You will develop data models by selecting from a variety of modeling techniques and implementing the chosen data model using the appropriate technology stack
- You will be responsible for data governance, data security and data privacy to support business and compliance requirements
- You will define the strategy for and incorporate data quality into your day-to-day work
Technical Skills
- You have experience in leading the system design and implementation of technical solutions
- Working with data excites you: you have created Big Data architecture, can build and operate data pipelines, and maintain data storage, all within distributed systems
- You have a deep understanding of data modeling and experience with modern data engineering tools and platforms
- You have experience in writing clean, high-quality code using the preferred programming language
- You have built and deployed large-scale data pipelines and data-centric applications using any of the distributed storage platforms and distributed processing platforms in a production setting
- You have experience with data visualization techniques and can tailor how you communicate insights to the audience
- You have experience with data-driven approaches and can apply data security and privacy strategy to solve business problems
- You have experience with different types of databases and data stores (e.g. SQL, NoSQL, data lakes, data schemas)
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives
- You don’t shy away from risks or conflicts, instead you take them on and skillfully manage them
- You coach, mentor and motivate others and you aspire to influence teammates to take positive action and accountability for their work
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- You are a proven leader with a track record of encouraging teammates in their professional development and relationships
- Cultivating strong partnerships comes naturally to you: you understand the importance of relationship building and how it can bring new opportunities to our business
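As a loose illustration of the TDD workflow this posting mentions (the helper function and its tests are invented for the sketch, not taken from any Thoughtworks material), a test-first exercise in Python might look like:

```python
# Hypothetical TDD-style sketch: the assertions below were written first
# and drove the implementation of a tiny data-cleaning helper.
def dedupe_preserving_order(items):
    """Remove duplicates while keeping first-seen order."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

# Red/green: these assertions are the "tests" that drive the design.
assert dedupe_preserving_order([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe_preserving_order([]) == []
```

In a real TDD cycle the assertions would fail first (red), the function body would then be written to make them pass (green), and the code would be refactored with the tests as a safety net.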
Learning & Development
There is no one-size-fits-all career path at Thoughtworks: however you want to develop your career is entirely up to you. But we also balance autonomy with the strength of our cultivation culture. This means your career is supported by interactive tools, numerous development programs and teammates who want to help you grow. We see value in helping each other be our best and that extends to empowering our employees in their career journeys.
About Thoughtworks
Thoughtworks is a global technology consultancy that integrates strategy, design and engineering to drive digital innovation. For 30+ years, our clients have trusted our autonomous teams to build solutions that look past the obvious. Here, computer science grads come together with seasoned technologists, self-taught developers, midlife career changers and more to learn from and challenge each other. Career journeys flourish with the strength of our cultivation culture, which has won numerous awards around the world.
Join Thoughtworks and thrive. Together, our extra curiosity, innovation, passion and dedication overcome the ordinary.
See our AI policy here.
DevOps Engineer - AWS & Terraform
Aug 11 · Hopla! Software
DevOps Engineer - AWS & Terraform
Hopla! Software · Barcelona, ES
Remote Jenkins Docker Kubernetes AWS DevOps Microservices Terraform
🚀 Join HOPLA! and transform the future with us 🚀
At HOPLA! we don't just help companies modernize their applications and infrastructure with Open Source technology, microservices and cloud solutions... we also want you to be part of this challenge!
If you are passionate about technology, innovation and the freedom to build your own solutions, here you will find a team of experts committed to industry best practices. We work with top-tier companies and a network of leading technology partners to deliver projects that truly make a difference.
Here you don't just come to work: you come to grow, learn and add value in a dynamic environment full of opportunities. Will you join the journey? 🌍☁️
💼 Right now we are looking for a DevOps Engineer - AWS & Terraform to join a stable project in hybrid mode. Interested? We're waiting for you! 🚀😃
HOW WILL YOU HELP US FULFIL OUR MISSION?
- You will join the DevOps team that supports the technology solutions of the industrial area at a global level.
- You will come on board at a key transition moment for the company.
- Design, implementation and maintenance of robust, efficient CI/CD pipelines for multiple projects.
- Deployment of complex infrastructure on AWS, using Terraform as the main tool.
- Automation of infrastructure, configuration and environment deployment processes, with a focus on scalability, resilience and performance.
- Collaboration with development and architecture teams to promote DevOps and cloud-native best practices.
- Management of multi-account AWS environments, cost control, security and compliance with global policies.
- Monitoring, logging and traceability of deployed systems.
- Design of ad hoc infrastructure solutions for specific projects of the industrial business.
- Clear, maintainable documentation of architectures, processes and environments.
- Daily meetings in spoken English.
- Vocational training in systems administration, a degree in Computer Engineering or Telecommunications, or related studies.
- 4+ years of experience as a DevOps engineer or in similar roles.
- Solid experience with Terraform.
- Advanced knowledge of AWS.
- Experience with CI/CD tools (GitLab CI, Jenkins, etc.).
- Knowledge of containers (Docker, Kubernetes, etc.).
- Fluent English (C1 minimum), as the working environment is 100% in spoken English.
- Availability to come to the Sant Cugat office 1 day/week.
- You will join a dynamic, growing IT company with cutting-edge technology projects.
- 📄 Permanent contract.
- 🏥 Private health insurance.
- 🚀 Professional development and long-term growth.
- 📚 Continuous training and learning programme.
- 🕓 Flexible working hours.
- 💻 Fully remote or hybrid work, as you prefer.
- 💳 Flexible compensation (transport, meals, childcare).
- 📅 Intensive (shorter) Fridays all year round.
- 🎉 Company events.
- 🌈 An open, inclusive and collaborative environment where every person counts.
Cloud Engineer
Aug 11 · EPAM
Madrid, ES
Cloud Engineer
EPAM · Madrid, ES
Python Agile TSQL Docker Cloud Computing Kubernetes Ansible Git AWS R DevOps QA Terraform LESS Machine Learning Office
We are looking for a Cloud Engineer to join our new Enterprise AI platform team.
This is an exciting opportunity to be part of a high-impact, highly technical group focused on solving some of the most challenging machine learning problems in the Life Sciences & Healthcare industry. You will bring proven experience in AWS cloud environments and a strong track record of designing and deploying large-scale production infrastructure and platforms.
You will play a critical role in shaping how we use technology, machine learning and data to accelerate innovation. This includes designing, building and deploying next-generation data engines and tools at scale.
This is a hybrid role, with the expectation of occasional office visits in Barcelona.
RESPONSIBILITIES
- Develop and maintain the essential infrastructure and platform required to deploy, monitor and manage ML solutions in production, ensuring they are optimized for performance and scalability
- Collaborate closely with data science teams in developing cutting-edge data science and AI/ML environments and workflows on AWS
- Liaise with R&D data scientists to understand their challenges and work with them to help productionize ML pipelines, models and algorithms for innovative science
- Take responsibility for all aspects of software engineering, from design to implementation, QA and maintenance
- Lead technology processes from concept development to completion of project deliverables
- Liaise with other teams to enhance our technological stack, to enable the adoption of the latest advances in Data Processing and AI
REQUIREMENTS
- Significant experience with AWS cloud environments is essential; knowledge of SageMaker, Athena, S3, EC2, RDS, Glue, Lambda, Step Functions, EKS and ECS is also essential
- Modern DevOps mindset, using the best DevOps tools, such as Docker and Git
- Experience with infrastructure-as-code technology such as Ansible, Terraform and CloudFormation
- Strong software coding skills, with proficiency in Python; exceptional ability in any language will also be recognized
- Experience managing an enterprise platform and service, handling new client demand and feature requests
- Experience with containers and microservice architectures e.g., Kubernetes, Docker and serverless approaches
- Experience with Continuous Integration and building continuous delivery pipelines, e.g. CodePipeline, CodeBuild and CodeDeploy
- GxP experience
- Excellent communication, analytical and problem-solving skills
NICE TO HAVE
- Experience building large-scale data processing pipelines, e.g. Hadoop/Spark and SQL
- Use of data science modelling tools, e.g. R, Python and data science notebooks (e.g. Jupyter)
- Multi-cloud experience (AWS/Azure/GCP)
- Demonstrable knowledge of building MLOps environments to a production standard
- Experience mentoring, coaching and supporting less experienced colleagues and clients
- Experience with SAFe agile principles and practices
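The large-scale pipeline experience listed above rests on the map/shuffle/reduce pattern that Hadoop and Spark distribute across a cluster. As a toy, single-process sketch (the input lines are invented for illustration), the same pattern in plain Python looks like:

```python
from collections import defaultdict
from itertools import chain

# Toy single-process map/reduce word count. Hadoop and Spark run the same
# map, shuffle and reduce phases, but spread across many machines.
def map_phase(line: str):
    """Emit (word, 1) pairs for each word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Group emitted values by key, as the shuffle phase does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Combine each key's values into a final count."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "data pipelines"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)  # → {'big': 2, 'data': 2, 'pipelines': 2}
```

The point of the distributed frameworks is that `map_phase` and `reduce_phase` are embarrassingly parallel, so only the shuffle requires moving data between workers.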
WE OFFER
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses