Junior Data Engineer
New · BNP Paribas
Madrid, ES
Junior Data Engineer
BNP Paribas · Madrid, ES
TSQL Kubernetes Elasticsearch Scala Kafka Spark
About The Job
Junior Data Engineer – South Europe Technologies (S.ET), BNP Paribas Personal Finance
South Europe Technologies (S.ET) is one of BNP Paribas Personal Finance's shared services centers, delivering the best IT solutions to BNP Paribas Personal Finance entities around the world:
- Applications Management (Architecture, Project management, Development, and Quality Assurance)
- IT Risks & Cybersecurity services
- Platforms management
- Data
- Ad-hoc, T&M development
As a Data Engineer, your mission is to design, implement, and optimize robust data pipelines and infrastructure, enabling reliable, secure, and high-performance data flows throughout the organization. You will work closely with stakeholders and multidisciplinary teams to support data integration, transformation, and delivery processes, contributing to the ongoing evolution and stability of our data platforms.
Your Main Activities Are To
- Implement and maintain orchestrators and scheduling systems to automate data pipeline execution (e.g., Airflow as a service); a minimal DAG sketch follows this list.
- Modify and enhance existing codebases in line with business requirements, continuously driving improvements in performance and maintainability.
- Monitor, ensure, and optimize the performance and security of the data infrastructure, applying best practices in Data Engineering.
- Contribute to production support, incident resolution, and anomaly correction, as well as support functional and technical evolutions to ensure process stability.
- Develop and maintain comprehensive technical documentation to ensure effective knowledge capitalization.
- Assist in building and maintaining data pipelines using Spark on Scala for collecting and processing data from diverse sources such as Kafka topics, APIs, HDFS, and structured databases.
- Support data transformation activities and contribute to data quality assurance, ensuring the reliability and accuracy of information.
- Help set up CI/CD pipelines under the guidance of senior team members to automate testing and deployment.
- Learn and employ orchestration tools like Airflow for scheduling and automating data workflows.
- Make incremental improvements to code and contribute to performance enhancements as required, aligned with business needs.
- Participate in monitoring data infrastructure for performance and security, learning and applying industry best practices.
- Assist with production support tasks, including incident identification and resolution, and support ongoing technical improvements.
- Document and update technical processes to ensure clear records of changes and procedures.
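For orientation, the orchestration activity above usually boils down to defining a DAG of dependent tasks. The following is a sketch only, assuming a recent Airflow 2.x; the DAG id, task names and callables are hypothetical placeholders rather than details from the posting, and in practice the extract step would trigger the Spark-on-Scala job rather than run pure Python.

```python
# Minimal, hypothetical Airflow 2.x DAG illustrating pipeline orchestration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: the real step would submit the Spark-on-Scala job that
    # consumes from Kafka/HDFS, e.g. via a spark-submit style operator.
    print("extracting...")


def load():
    print("loading into the target store...")


with DAG(
    dag_id="pf_daily_ingestion",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # simple linear dependency
```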
- Good knowledge of
- Spark on Scala
- CI/CD tools (Gitlab, Jenkins…)
- HDFS and structured databases (SQL)
- Full understanding of
- Apache Airflow
- Streaming processing (Kafka, event streams…)
- S3 storage
- Shell script
- Some knowledge of
- Kubernetes
- Optionally, as a plus
- Elasticsearch and Kibana
- HVault
- Dremio as a tool to virtualize data
- Dataiku
- Demonstrated knowledge of the banking sector and related business processes
- Experience in managing business and IT relationships
- Ability to understand, explain, and support change initiatives
- Results-driven mindset and capacity to deliver
- Strong collaboration and teamwork skills
- Ability to synthesize and simplify complex technical topics
- Proficiency in analytical thinking and resilience in handling challenges
- Desirable: Familiarity with tools such as DWH, Dataiku, Spark, Airflow, S3, Kubernetes, and CI/CD platforms
- English: B2 level or higher
- French: B1 level (optional)
- Training programs, career paths, and opportunities for internal mobility—nationally and internationally—thanks to our global presence
- Diversity and Inclusion Committee fostering an inclusive work environment, with employee communities organizing awareness actions (PRIDE, We Generations, MixCity, etc.)
- Corporate volunteering program (1MillionHours 2 Help) supporting employees in their commitment to volunteering activities
- Flexible compensation plan
- Hybrid telecommuting model (50%)
- 31 vacation days
BNP Paribas Group in Spain is an equal opportunity employer, proud to provide equal employment opportunities to all job seekers. We are committed to ensuring that no individual is discriminated against on the grounds of age, disability, gender reassignment, marital or civil partnership status, pregnancy and maternity/paternity, race, religion or belief, sex, or sexual orientation. Equity and diversity are at the core of our recruitment policy, as we believe they foster creativity and efficiency, increasing performance and productivity. We strive to reflect the society in which we live, while keeping in line with the image of our clients.
About Our Culture
We are proud to create, maintain, and develop strategic business applications for BNP Paribas Personal Finance entities worldwide, maintaining high service levels and delivering added value to our customers.
Working in a multicultural environment, we encourage our people to develop their talents and skills, offering a variety of career opportunities and internal mobility programs, within local S.ET teams or other entities across the Group, both in Spain and internationally.
We value the experience of our employees and strive to maintain a balanced work environment, with flexibility regarding work schedules and respect for personal time. Our hybrid working model reflects our belief that social connection enhances daily activities.
Diversity and inclusion are among our core values, as S.ET is an equal opportunity employer. Therefore, we are committed to ensuring employment opportunities regardless of race, skin color, beliefs, religion, nationality, ethnic background, age, sex, sexual orientation, marital status, or political opinions.
Junior Data Engineer
New · Fever
Madrid, ES
Junior Data Engineer
Fever · Madrid, ES
Python TSQL Django PostgreSQL Machine Learning Office
Hey there!
We’re Fever, the world’s leading tech platform for culture and live entertainment.
Our mission? To democratize access to culture and entertainment. With our proprietary cutting-edge technology and data-driven approach, we’re revolutionizing the way people engage with live entertainment.
Every month, our platform inspires over 300 million people in +40 countries (and counting) to discover unforgettable experiences while also empowering event creators with our data and technology, helping them scale, innovate, and enhance their events to reach new audiences.
Our results? We’ve teamed up with major industry leaders like Netflix, F.C. Barcelona, and Primavera Sound, presented international award-winning experiences, and are backed by several leading global investors! Impressive, right?
To achieve our mission, we are looking for bar-raisers with a hands-on mindset who are eager to help shape the future of entertainment!
Ready to be part of the experience?
Now, let’s discuss this role and what you will do to help achieve Fever’s mission.
About The Role
- You’ll be part of the Data organization, building and operating the core technologies that enable data scientists, analysts and the different business units to leverage rich data in efficient and innovative ways to generate impact and connect people to the most relevant real-world experiences.
- You'll own parts of our data warehouse and the resulting data products that are used daily across the company to inform all sorts of decisions and models.
- You'll ideate and implement tools and processes that increase our ability to exploit our diverse sources of data to solve business problems, understand behaviors, …
- You'll work closely with other business units to understand the challenges they face and apply an engineering vision to create structured and scalable solutions to those challenges.
- You'll contribute to the development of a complex data and software ecosystem using the latest technologies in the data and software engineering stack.
- You will be fully integrated into the team. During your first month you will have already taken part in onboarding, pair programming, one-to-one and retrospective sessions, and you will have met the different departments at Fever.
- You will get familiar with Fever’s tech stack and frameworks used to develop our data strategy.
- You will attend some of the Fever Originals experiences, like Candlelight.
- You’ll be able to come up with solutions to new difficult problems and you'll be generating impact and creating new business opportunities.
- You’ll have responsibilities and ownership over parts of our Data Warehouse or other critical tools.
- You will participate in some of the hackdays or hackathons we organise with other teams, and you will get to know most people from the data and engineering communities.
- You’ll contribute to the overall health of our data ecosystem, improving performance, scalability, robustness, …
- You'll be able to identify gaps in our platforms and processes and be a champion for continuous improvement.
- You’ll be mentoring other new joiners to the team.
- You will participate in some of the team buildings we organise for your team or the whole engineering team.
- Have a data-oriented mindset to understand complex data assets and business challenges, and use engineering skills to solve them.
- Build trusted data assets that power Fever's decision making.
- Build automations that create major business opportunities.
- Design, build and support modern and scalable data infrastructure, e.g.:
- Write robust, maintainable code to orchestrate our ETL workflows and build data quality monitoring processes (a minimal sketch follows this list)
- Extend our data APIs
- Build data tools to make the company more data-driven
- Understand the technical trade-offs of different solutions, implement them and make them scalable
- Collaborate with other engineers, and stakeholders to understand what data is required and how best to make it available in our Data Platform.
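To make the data quality bullet above concrete, here is a deliberately small, framework-free sketch. The field names, thresholds and in-memory sample are hypothetical; in practice checks like these would run as a task inside the orchestrated ETL against the warehouse rather than on a Python list.

```python
# Hedged sketch: a tiny data quality check (row count + required-field nulls).
from typing import Any


def run_quality_checks(
    rows: list[dict[str, Any]],
    min_rows: int = 1,
    required_fields: tuple[str, ...] = ("event_id", "user_id"),  # hypothetical fields
) -> list[str]:
    """Return human-readable failures; an empty list means every check passed."""
    failures: list[str] = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} is below the minimum of {min_rows}")
    for field in required_fields:
        nulls = sum(1 for row in rows if row.get(field) in (None, ""))
        if nulls:
            failures.append(f"{nulls} null values in required field '{field}'")
    return failures


if __name__ == "__main__":
    sample = [{"event_id": 1, "user_id": "a"}, {"event_id": 2, "user_id": None}]
    print(run_quality_checks(sample))  # reports the null user_id
```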
- You have a background in computer science, data engineering, or data science with a strong academic record in your bachelor's program. It would be an advantage if you have a Master's degree in one of the above areas.
- You are a collaborative team player with strong communication skills, adaptable to a multidisciplinary, international, and fast-paced environment.
- You are proactive, driven, and bring positive energy to your work, thriving in dynamic settings.
- You are familiar with software engineering best practices, and you take pride in writing clean, robust, and maintainable code.
- You possess strong analytical and problem-solving abilities, backed by solid software engineering skills.
- You are proficient in Python 3, with a deep understanding of SQL for data manipulation and querying.
- You have experience handling large volumes of data from diverse sources.
- You are proficient in business English, ensuring clear and effective communication in a professional setting.
- Collaborated effectively in a multidisciplinary team, interacting with roles like data analysts, data scientists, marketing, and product managers to meet project goals and deliver actionable insights.
- Gained experience with scheduling and workflow orchestration tools, such as Airflow, or similar technologies, to manage data pipelines and automate tasks.
- Worked with databases like Snowflake and PostgreSQL for data storage, retrieval, and management, ensuring efficient and accurate data handling.
- Utilized Business Intelligence (BI) tools, such as Metabase or Superset, for data visualization and reporting to support decision-making processes.
- Integrated and interacted with APIs from popular marketing platforms (e.g., Facebook, Google, Instagram) to extract and process data relevant for analysis.
- Developed data-powered tools and applications, either as part of a professional setting or through personal projects, showcasing hands-on skills in practical data applications.
- Gained familiarity with tools and processes designed to support reproducible, production-ready machine learning applications, contributing to ML workflows.
- Acquired knowledge of backend frameworks, including Django, and their use cases in data engineering and application development.
- Attractive compensation package consisting of base salary and the potential to earn a significant bonus for top performance.
- Stock options.
- Opportunity to have a real impact in a high-growth global category leader
- 40% discount on all Fever events and experiences
- Home office friendly
- Responsibility from day one and professional and personal growth
- Great work environment with a young, international team of talented people to work with!
- Health insurance and other benefits, such as flexible remuneration with a 100% tax exemption through Cobee.
- English Lessons
- Gympass Membership
- Possibility to receive part of your salary in advance through Payflow.
- Salary range: 25,000–35,000 EUR (base salary plus the potential for a significant performance bonus).
If you want to learn more about us: Fever's Blog | Tech.Eu | TechCrunch
Fever is committed to creating an inclusive and diverse workspace where everyone's background and ideas count. Our main goal is to find the best possible talent regardless of place of birth, racial or ethnic origin, gender, gender identity, religion, opinion, sexual orientation, disability, pregnancy, marital status, age or caring responsibilities. We encourage everyone to apply!
If you require any kind of accommodation during the selection process please contact our Talent team so we can help you by providing a welcoming and seamless journey.
If you want to know more about how Fever processes your personal data, click here Fever - Candidate Privacy Notice
Azure Cloud Engineer
12 Aug · Devoteam
Azure Cloud Engineer
Devoteam · Madrid, ES
Remote Azure Cloud Computing AWS DevOps Terraform Machine Learning Salesforce
Company description
Devoteam is a leading European consultancy focused on digital strategy, technology platforms, cybersecurity and business transformation through technology.
Centered on 6 areas of expertise, we address our clients' strategic challenges: Digital Business & Products, Data-driven Intelligence, Distributed Cloud, Business Automation, Cybersecurity and Sustainability achieved through Digitalization.
Technology is in our DNA, and we believe in it as a lever capable of driving change for the better, maintaining a balance that allows us to offer our clients first-class technological tools, always with the closeness and professionalism of a team that acts as a guide along the way.
Our 25 years of experience make us an innovative, consolidated and mature consultancy that enables the development of our 8,500 people, continuously certifying our consultants in the latest technologies, with experts in: Cloud, BI, Data Analytics, Business Process Excellence, Customer Relationship Management, Cybersecurity, Digital Marketing, Machine Learning, and Software Engineering and Development.
Devoteam was named 2021 Partner of the Year by the 5 cloud leaders: AWS, Google Cloud, Microsoft, Salesforce and ServiceNow.
#CreativeTechForBetterChange
Job description
Would you like to have full ownership of the cloud infrastructure in a data environment? Are you motivated by working autonomously, making key technical decisions without depending on third parties?
At Devoteam we are looking for an Azure Cloud Engineer to support a Data & AI team, as the only person responsible for the infrastructure. We are looking for someone with real experience in Terraform, resolving complex tickets, knowledge of services such as Databricks or Azure Data Factory... and above all, the ability to move forward on their own, without depending on anyone else to keep things progressing.
What will you do day to day?
- Design and deploy infrastructure in Azure using Terraform (IaC).
- Autonomously resolve technical tickets related to:
- Azure Data Factory (linked services, Key Vault).
- Azure Databricks (configuration, permissions, connectivity).
- Security groups, networking, connectivity, etc.
- Act as the connection point between the Data, Infrastructure and Support teams (a process accelerator).
- Avoid technical blockers that depend on external/offshore support: you will handle them directly.
- Identify technical improvements and implement them on your own initiative.
- Act as the single point of reference for cloud infrastructure for the data team.
Profile we are looking for
- Experience with Terraform and deploying infrastructure in Azure.
- Solid knowledge of services such as ADF, Databricks, VNETs, NSGs, Key Vault.
- Ability to solve problems without waiting for instructions: a high level of autonomy and proactivity.
- Previous experience working alone or with minimal technical supervision.
- A solution-oriented, communicative approach geared towards simplifying.
Certifications we value
- Azure Administrator Associate
- Azure DevOps Engineer Expert
- Terraform Associate
What do we offer?
- Competitive salary, reviewable according to profile.
- Stable project with real technical responsibility.
- A culture that values autonomy, trust and continuous improvement.
- Ongoing training and official certifications covered.
- 100% remote work.
Data Engineer
12 Aug · Serem
Data Engineer
Serem · Madrid, ES
Remote Agile TSQL Azure DevOps
At serem we are committed to a wide range of projects and want to count on the best professionals in the sector.
We are currently looking for a Data Engineer for an Agile DevOps data migration team.
Description:
Services for a data migration project at a client that is accelerating its transformation into an innovative, data-centric organization, a transformation that ensures the transition from legacy systems to modern platforms.
Tasks:
- Design, develop and maintain ETL pipelines using Informatica PowerCenter and Azure Data Factory;
- Transform and migrate data from source systems to target systems, prioritizing data quality, consistency and added value (a minimal reconciliation sketch follows this list);
- Work closely with analysts, solution designers, testers and developers to deliver end-to-end migration solutions;
- Advise on migration strategy, technical solutions and data mapping challenges;
- Continuously develop your skills and share your knowledge with the team and the wider organization.
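As a small illustration of the migration validation implied by the tasks above, here is a hedged sketch of a source-vs-target reconciliation check. The table and column names are hypothetical, and both connections use in-memory SQLite purely so the example runs as-is; a real project would point them at the legacy source and the target platform.

```python
# Hedged sketch: verify that a migrated table holds the same number of rows
# as the source. Names are illustrative placeholders.
import sqlite3


def row_count(conn: sqlite3.Connection, table: str) -> int:
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def reconcile(src: sqlite3.Connection, dst: sqlite3.Connection, table: str) -> bool:
    """Return True when source and target row counts match for the table."""
    src_n, dst_n = row_count(src, table), row_count(dst, table)
    print(f"{table}: source={src_n} target={dst_n}")
    return src_n == dst_n


if __name__ == "__main__":
    src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (src, dst):
        conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ana"), (2, "Luis")])
    dst.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ana"), (2, "Luis")])
    print(reconcile(src, dst, "customers"))  # True when the counts match
```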
Skills:
We are looking for someone who combines technical expertise with strong collaboration and communication skills. Your profile includes:
- A bachelor's degree in IT or a related field;
- At least 2 years of experience in data engineering or ETL development;
- Proven experience with Informatica PowerCenter and Azure Data Factory;
- Solid knowledge of SQL and experience in data analysis and transformation;
- Familiarity with Agile/Scrum and a flexible, T-shaped mindset;
- A proactive attitude, capable of both structured execution and creative problem solving;
- Experience with SAP Data Services is a plus.
A proactive, solution-oriented attitude in a constantly changing environment;
Willingness to give and receive feedback and to keep growing with the team.
C1 level of ENGLISH required.
100% remote work.
We foster a multicultural and inclusive work environment; we do not discriminate on the grounds of age, gender or beliefs, and we offer equal opportunities to all staff.
We carry out our activities under the principles of environmental care, sustainability and corporate social responsibility, collaborating in reforestation and sustainability projects.
We support the 10 principles of the Global Compact and the 17 Sustainable Development Goals in the areas of human rights, labour conditions, the environment and anti-corruption.
Our recruitment processes follow high quality standards, with hiring decisions based on the candidate's experience and skills.
We are a leading Spanish technology services and talent attraction company, present in the market since 1995, with more than 600 employees on national and international projects in the IT sector.
Lead Data Engineer
11 Aug · Thoughtworks
Lead Data Engineer
Thoughtworks · Madrid, ES
Remote TSQL NoSQL TDD Big Data
Lead data engineers at Thoughtworks develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. They might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On projects, they will be leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. Alongside hands-on coding, they are leading the team to implement the solution.
Job responsibilities
- You will lead and manage data engineering projects from inception to completion, including goal-setting, scope definition and ensuring on-time delivery with cross team collaboration
- You will collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and data quality
- You will design, develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
- You will be responsible for creating, designing and developing intricate data processing pipelines, addressing clients' most challenging problems
- You will collaborate with data scientists to design scalable implementations of their models
- You will write clean and iterative code based on TDD and leverage various continuous delivery practices to deploy, support and operate data pipelines (a minimal TDD-style sketch follows this list)
- You will lead and advise clients on how to use different distributed storage and computing technologies from the plethora of options available
- You will develop data models by selecting from a variety of modeling techniques and implementing the chosen data model using the appropriate technology stack
- You will be responsible for data governance, data security and data privacy to support business and compliance requirements
- You will define the strategy for and incorporate data quality into your day-to-day work
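As a deliberately tiny illustration of the TDD-based workflow mentioned in the responsibilities above (test first, then the minimal implementation that makes the test pass), here is a hedged sketch; the function and test names are hypothetical and not taken from the posting.

```python
# Hedged sketch of a TDD loop: the tests drive a minimal implementation.
import pytest


def normalize_country_code(raw: str) -> str:
    """Minimal implementation written to satisfy the tests below."""
    code = raw.strip().upper()
    if len(code) != 2 or not code.isalpha():
        raise ValueError(f"invalid country code: {raw!r}")
    return code


def test_normalizes_case_and_whitespace():
    assert normalize_country_code(" es ") == "ES"


def test_rejects_malformed_input():
    with pytest.raises(ValueError):
        normalize_country_code("Spain")
```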
Technical Skills
- You have experience in leading the system design and implementation of technical solutions
- Working with data excites you; You have created Big Data architecture, can build and operate data pipelines, and maintain data storage, all within distributed systems
- You have a deep understanding of data modeling and experience with modern data engineering tools and platforms
- You have experience in writing clean, high-quality code using the preferred programming language
- You have built and deployed large-scale data pipelines and data-centric applications using any of the distributed storage platforms and distributed processing platforms in a production setting
- You have experience with data visualization techniques and can communicate the insights as per the audience
- You have experience with data-driven approaches and can apply data security and privacy strategy to solve business problems
- You have experience with different types of databases (e.g., SQL, NoSQL, data lakes, data schemas, etc.)
- You understand the importance of stakeholder management and can easily liaise between clients and other key stakeholders throughout projects, ensuring buy-in and gaining trust along the way
- You are resilient in ambiguous situations and can adapt your role to approach challenges from multiple perspectives
- You don’t shy away from risks or conflicts, instead you take them on and skillfully manage them
- You coach, mentor and motivate others and you aspire to influence teammates to take positive action and accountability for their work
- You enjoy influencing others and always advocate for technical excellence while being open to change when needed
- You are a proven leader with a track record of encouraging teammates in their professional development and relationships
- Cultivating strong partnerships comes naturally to you; You understand the importance of relationship building and how it can bring new opportunities to our business
Learning & Development
There is no one-size-fits-all career path at Thoughtworks: however you want to develop your career is entirely up to you. But we also balance autonomy with the strength of our cultivation culture. This means your career is supported by interactive tools, numerous development programs and teammates who want to help you grow. We see value in helping each other be our best and that extends to empowering our employees in their career journeys.
About Thoughtworks
Thoughtworks is a global technology consultancy that integrates strategy, design and engineering to drive digital innovation. For 30+ years, our clients have trusted our autonomous teams to build solutions that look past the obvious. Here, computer science grads come together with seasoned technologists, self-taught developers, midlife career changers and more to learn from and challenge each other. Career journeys flourish with the strength of our cultivation culture, which has won numerous awards around the world.
Join Thoughtworks and thrive. Together, our extra curiosity, innovation, passion and dedication overcomes ordinary.
See our AI policy here.
Cloud Engineer
11 Aug · EPAM
Madrid, ES
Cloud Engineer
EPAM · Madrid, ES
Python Agile TSQL Docker Cloud Computing Kubernetes Ansible Git AWS R DevOps QA Terraform LESS Machine Learning Office
We are looking for a Cloud Engineer to join our new Enterprise AI platform team.
This is an exciting opportunity to be part of a high-impact, highly technical group focused on solving some of the most challenging machine learning problems in the Life Sciences & Healthcare industry. You will bring proven experience in AWS cloud environments and a strong track record of designing and deploying large-scale production infrastructure and platforms.
You will play a critical role in shaping how we use technology, machine learning and data to accelerate innovation. This includes designing, building and deploying next-generation data engines and tools at scale.
This is a hybrid role, with the expectation of occasional office visits in Barcelona.
RESPONSIBILITIES
- Develop and maintain the essential infrastructure and platform required to deploy, monitor and manage ML solutions in production, ensuring they are optimized for performance and scalability
- Collaborate closely with data science teams in developing cutting edge data science, AI/ML environments and workflows on AWS
- Liaise with R&D data scientists to understand their challenges and work with them to help productionize ML pipelines, models and algorithms for innovative science
- Take responsibility for all aspects of software engineering, from design to implementation, QA and maintenance
- Lead technology processes from concept development to completion of project deliverables
- Liaise with other teams to enhance our technological stack, to enable the adoption of the latest advances in Data Processing and AI
REQUIREMENTS
- Significant experience with AWS cloud environments is essential. Knowledge of SageMaker, Athena, S3, EC2, RDS, Glue, Lambda, Step Functions, EKS and ECS is also essential (a minimal sketch using two of these services follows this list)
- Modern DevOps mindset, using best DevOps tools, such as Docker and Git
- Experience with infrastructure-as-code technology such as Ansible, Terraform and CloudFormation
- Strong software coding skills, with proficiency in Python; however, exceptional ability in any language will be recognized
- Experience managing an enterprise platform and service, handling new client demand and feature requests
- Experience with containers and microservice architectures e.g., Kubernetes, Docker and serverless approaches
- Experience with Continuous Integration and building continuous delivery pipelines, such as CodePipeline, CodeBuild and CodeDeploy
- GxP experience
- Excellent communication, analytical and problem-solving skills
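The requirements above lean heavily on the AWS SDK surface (S3, Lambda, Step Functions, SageMaker). As a hedged sketch only, an "upload an artifact and kick off a deployment workflow" step might look like the following in Python with boto3; the bucket name, key layout and state machine ARN are hypothetical placeholders, not details from this posting.

```python
# Hedged sketch: push a model artifact to S3 and start a Step Functions
# execution for a (hypothetical) deployment workflow.
import json

import boto3

s3 = boto3.client("s3")
sfn = boto3.client("stepfunctions")


def promote_model(local_path: str, version: str) -> str:
    """Upload the artifact and start the deployment state machine; return its execution ARN."""
    bucket = "ml-artifacts-example"                      # placeholder bucket
    key = f"models/{version}/model.tar.gz"               # placeholder key layout
    s3.upload_file(local_path, bucket, key)

    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:eu-west-1:123456789012:stateMachine:deploy-model",
        input=json.dumps({"model_uri": f"s3://{bucket}/{key}"}),
    )
    return response["executionArn"]
```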
NICE TO HAVE
- Experience building large-scale data processing pipelines, e.g., Hadoop/Spark and SQL
- Use of Data Science modelling tools e.g., R, Python and Data Science notebooks (e.g., Jupyter)
- Multi cloud experience (AWS/Azure/GCP)
- Demonstrable knowledge of building MLOPs environments to a production standard
- Experience on mentoring, coaching and supporting less experienced colleagues and clients
- Experience with SAFe agile principles and practices
WE OFFER
- Private health insurance
- EPAM Employees Stock Purchase Plan
- 100% paid sick leave
- Referral Program
- Professional certification
- Language courses
Azure Cloud Engineer
11 Aug · Grupo NS
Madrid, ES
Azure Cloud Engineer
Grupo NS · Madrid, ES
Python Azure Jenkins Cloud Computing Kubernetes Ansible Oracle Terraform
At Grupo NS we are hiring an Azure Cloud Engineer to collaborate with one of our strategic clients. We are looking for a versatile profile with a proactive, solution-oriented attitude and a strong desire to learn and contribute in a dynamic technical environment.
Conditions:
* Location: 1 day per week at the client's offices (Madrid - north area)
* Rest of the week: remote work
* On-call duty: 1 week out of every 3 (rotating among 3 colleagues)
* Stable project, with growth opportunities
What do we offer?
* Joining cutting-edge technology projects
* Good working environment and collaborative culture
* Job stability and a career plan
* Compensation in line with experience
Sign up now and take the next step in your career with Grupo NS!
Job requirements:
Previous experience in Cloud environments (minimum 2-3 years recommended)
Knowledge of and experience in:
1. Azure
2. AKS (Azure Kubernetes Service)
3. Terraform (Infrastructure as Code)
4. Jenkins
Additional experience in the following will be valued:
1. Ansible
2. OCI (Oracle Cloud Infrastructure)
3. Python
Valued soft skills:
* Ability to work in a team
* Good communication with the client
* Proactivity and autonomy
Cloud Engineer (Python, Aws)
10 Aug · CIPEx - Consejo de Ingenieros Peruanos en el EXterior
Madrid, ES
Cloud Engineer (Python, Aws)
CIPEx - Consejo de Ingenieros Peruanos en el EXterior · Madrid, ES
Python Azure Cloud Computing AWS Terraform
From RED we are looking for a Cloud Engineer for a new Cloud transformation project with one of our key partners.
The initial contract is for 6 months, for a long-term project with the possibility of extension.
Start: ASAP
Work model: mostly on-site – approximately 4 days/week in Madrid
Key requirements:
- Experience with AWS and Azure (Terraform and CloudFormation)
- IaC deployment of IaaS and PaaS infrastructure
- Knowledge of Windows/Linux, security (security groups), networking (VPC, VPN) and automation
- Experience with cloud migrations and hybrid environments
- Cloud cost management and best practices
If you are interested, please apply here with your updated CV and I will get in touch with you.
If you know someone who might be interested in this opportunity, feel free to let me know: ******.
DATA Engineer
8 Aug · sg tech
Madrid, ES
DATA Engineer
sg tech · Madrid, ES
Python Git Jira AWS Big Data
Description
We are opening a new request in the financial sector for a Data Engineer.
- Experience between 1-3 years.
- Madrid, hybrid OK (Las Tablas) - 1-2 days on-site in Las Tablas
We are asked for a Big Data profile with 2-3 years of experience and the following:
- Programming knowledge in Python and PySpark (a minimal PySpark sketch follows this list)
- Use of Git for version control
- Understanding of data pipelines
- Good communication with the client
- Used to working with an agile, sprint-based methodology and Jira
- On-site presence 1-2 days at the client (offices in Madrid, Las Tablas)
Nice to have
- Knowledge of the BBVA Data ecosystem
- Knowledge of AWS
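As a minimal illustration of the Python/PySpark pipeline work described above, here is a hedged sketch of a batch read-transform-write job; the paths and column names are hypothetical placeholders, not details from the posting.

```python
# Hedged sketch: a minimal PySpark batch pipeline (read, aggregate, write).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

# Read a hypothetical raw dataset, aggregate it, and persist the result.
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")   # placeholder path
daily_totals = (
    raw.withColumn("day", F.to_date("event_ts"))
       .groupBy("day", "account_id")
       .agg(F.sum("amount").alias("total_amount"))
)
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")

spark.stop()
```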