Data Engineer
B. Braun Group · Barcelona, ES
Python Agile TSQL Azure Cloud Computing DevOps Terraform Power BI
We are seeking a Data Engineer to join our team, focusing on building scalable and governed data products in a cloud data mesh architecture for the SAP Finance & Controlling domain.
This specialized role is paramount for designing, maintaining, and optimizing robust data pipelines and semantic models on our Azure-based Data Analytics Platform, leveraging Databricks and Microsoft Fabric. The ideal candidate combines strong technical proficiency in modern data engineering with the ability to translate finance and controlling business logic into governed, performant data models.
Experience with SAP FI/CO processes is preferred, as well as advanced skills in data modeling, Data Contracts, and cost/performance optimization. You will be instrumental in ensuring high data quality, governance, and availability for critical business intelligence and analytical dashboards. We are looking for a proactive, solution-oriented individual eager to contribute to a multidisciplinary, agile, and international environment.
Your Tasks in the Team
- Design, build, and operate data pipelines on Azure Data Factory and Databricks (PySpark/SQL, Delta Lake) using Azure DevOps for CI/CD.
- Apply advanced data modeling techniques (dimensional/star, data vault, normalized models) and implement Medallion architecture (Bronze/Silver/Gold).
- Define and enforce Data Contracts: schemas, SLAs/SLOs, versioning, and validation gates.
- Optimize Databricks workloads for performance and cost (partitioning, Z-ORDER, caching, Photon, autoscaling, cluster policies).
- Standardize delivery with Databricks Asset Bundles and implement observability (job metrics, audit logs).
- Ensure compliance with governance, security, and regulatory requirements via Unity Catalog and RBAC/ABAC policies.
- Embed data quality frameworks, automated tests, and monitoring for pipeline health, SLA breaches, and anomaly detection.
- Collaborate closely with Finance stakeholders and domain engineers to ensure KPI sign-off and business alignment.
- Contribute to technical documentation, participate in code reviews, and drive continuous improvement.
- (Preferred) Build semantic models in Microsoft Fabric/Power BI aligned with curated data and governed KPIs.
- (Preferred) Translate SAP FI/CO business logic (GL, AP/AR, allocations, exchange rates) into reconciled semantic models.
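To make the "validation gates" idea in the Data Contract task concrete, here is a minimal, hypothetical sketch in plain Python. The field names, types, and the 24-hour freshness SLA are invented for illustration; a real pipeline would enforce this with Delta Live Tables expectations or a schema registry rather than hand-rolled checks:

```python
# Hypothetical Data Contract for a Gold-layer finance table: an expected
# schema plus a freshness SLA, checked as a gate before publishing.
CONTRACT = {
    "schema": {"company_code": str, "gl_account": str, "amount": float},
    "max_staleness_hours": 24,  # SLA: data must be fresher than this
}

def validate_batch(rows, hours_since_load):
    """Return a list of violations; an empty list means the gate passes."""
    violations = []
    if hours_since_load > CONTRACT["max_staleness_hours"]:
        violations.append("SLA breach: data is staler than the contract allows")
    for i, row in enumerate(rows):
        for field, expected_type in CONTRACT["schema"].items():
            if field not in row:
                violations.append(f"row {i}: missing field '{field}'")
            elif not isinstance(row[field], expected_type):
                violations.append(f"row {i}: '{field}' is not {expected_type.__name__}")
    return violations

good = [{"company_code": "1000", "gl_account": "400000", "amount": 125.5}]
bad = [{"company_code": "1000", "amount": "125.5"}]  # missing field, wrong type
```

A batch that fails any check would be held back from the Gold layer instead of silently propagating downstream.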
Requirements
- Strong experience with Microsoft Azure (ADLS Gen2, Data Factory, Key Vault) and foundational networking/security.
- Hands-on expertise in Databricks: PySpark, SQL, Delta Lake, Unity Catalog, Asset Bundles; performance tuning and cost optimization.
- Advanced data modeling skills: dimensional/star, data vault, semantic layers; optimization for query performance.
- Proficiency in Python and SQL for data processing; modular code and unit testing.
- Experience with Azure DevOps (Repos, Pipelines, approvals) and CI/CD strategies with rollback procedures.
- Knowledge of Data Contracts: schema definition, SLAs/SLOs, versioning, compatibility policies.
- Familiarity with event-driven architectures and real-time data streaming.
- Experience working in Agile/Scrum environments.
- Fluent in English (written and spoken).
Preferred Qualifications
- SAP FI/CO domain knowledge (GL, AP/AR, Asset Accounting, Cost Center Accounting, Internal Orders, CO-PA).
- Microsoft Fabric / Power BI: semantic modeling, dataset governance, KPI standardization.
- Infrastructure as Code (Terraform for Azure & Databricks).
- Data Quality & Anomaly Detection frameworks (DLT expectations, Great Expectations).
- Cost governance: tagging, dashboards, budgets/alerts.
- Advanced modeling patterns: slowly changing dimensions, snapshotting, late-arriving facts.
- Security & Compliance: data masking, tokenization, PII minimization.
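Of the advanced modeling patterns listed above, slowly changing dimensions are the one most often asked about in interviews. A naive SCD Type 2 merge, sketched in plain Python with invented column names (a real implementation would use Delta Lake MERGE):

```python
from datetime import date

def apply_scd2(dimension, updates, today):
    """Naive Slowly Changing Dimension Type 2 merge: when a tracked
    attribute changes, expire the current row and append a new version."""
    current_by_key = {r["key"]: r for r in dimension if r["current"]}
    for upd in updates:
        cur = current_by_key.get(upd["key"])
        if cur is None:  # brand-new member: open its first version
            dimension.append({"key": upd["key"], "attr": upd["attr"],
                              "valid_from": today, "valid_to": None, "current": True})
        elif cur["attr"] != upd["attr"]:  # changed: close old row, open a new one
            cur["valid_to"] = today
            cur["current"] = False
            dimension.append({"key": upd["key"], "attr": upd["attr"],
                              "valid_from": today, "valid_to": None, "current": True})
    return dimension

# A cost center moves departments; history is preserved as two versioned rows.
dim = [{"key": "CC100", "attr": "Sales", "valid_from": date(2023, 1, 1),
        "valid_to": None, "current": True}]
dim = apply_scd2(dim, [{"key": "CC100", "attr": "Marketing"}], date(2024, 6, 1))
```

The point of the pattern: facts loaded before the change still join to the old row via its validity window, so historical reports do not rewrite themselves.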
AWS Cloud Engineer with English
Aubay · Barcelona, ES
Python Cloud Computing AWS Terraform
Duties
- Design and architect secure, scalable AWS infrastructures.
- Implement Infrastructure as Code (Terraform) and automate deployments with GitLab CI.
- Develop and maintain high-quality Python code for cloud solutions.
- Manage AWS services such as Lambda, Fargate, S3, and IAM, among others.
- Implement and maintain serverless and containerized architectures.
- Configure and administer SIEM solutions (Splunk) and AWS security tools.
- Apply DevSecOps practices and security across the entire development lifecycle.
- Set up monitoring, logging, and alerting with CloudWatch, Prometheus, Grafana, and PagerDuty.
Hybrid model: 3-4 remote days + 1-2 on-site days at our offices (next to the Bogatell metro stop, Barcelona)
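As a rough illustration of the serverless work in the duties above: an AWS Lambda function in Python is just a handler that receives an event dict and returns a response. This sketch is hypothetical; the payload shape ("user_id") and response format are invented, following the API Gateway proxy convention:

```python
import json

def handler(event, context):
    """Hypothetical Lambda entry point behind API Gateway: validate a JSON
    payload and return an HTTP-style response dict."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}
    if "user_id" not in body:
        return {"statusCode": 422, "body": json.dumps({"error": "user_id required"})}
    return {"statusCode": 200,
            "body": json.dumps({"ok": True, "user_id": body["user_id"]})}

# Local smoke test: no AWS needed, since Lambda simply calls handler(event, context).
resp = handler({"body": json.dumps({"user_id": "u-123"})}, None)
```

Because the handler is a plain function, it can be unit-tested locally and in CI before Terraform or GitLab CI ever deploys it.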
Requirements
- Experience in engineering and architecting AWS infrastructure solutions.
- Experience with AWS Landing Zone, AWS networking and security services, and multi-account AWS strategy.
- Knowledge of Infrastructure as Code principles and design with Terraform.
- Experience with GitLab and GitLab CI.
- Proven experience writing Python code.
- Deep understanding of AWS infrastructure and services (Fargate, Lambda, S3, WAF, KMS, Transit Gateway, IAM, AWS Config).
- Experience with SIEM solutions, ideally Splunk.
- Experience with the following concepts: shift-left and DevSecOps approaches, SBOM, SAST, and AWS security and compliance services (AWS Config, Inspector, Network Firewall, etc.).
- Experience with logging, monitoring, and alerting best practices on AWS Cloud and standard tools (Splunk, CloudWatch Logs, Prometheus, Grafana, Alertmanager, and PagerDuty).
- English.
TECHNICAL TEST FOR THE ROLE PRIOR TO THE INTERVIEW
What We Offer
At Aubay we are hiring an AWS Cloud Engineer with English in Barcelona.
We offer the chance to join a company in continuous growth, taking part in innovative projects that will let you round out your training and develop your skills. We value commitment and dedication to the work you do.
Aubay is a multinational digital services company (DSC) founded in 1998 and currently growing strongly. We operate in high-value-added markets, both in France and elsewhere in Europe. Aubay currently employs 5,000 people.
From consulting to technology projects of every kind, we support the transformation and modernization of information systems across all sectors, including industry, R&D, telecommunications, and infrastructure, and especially major banks and insurance companies, which account for more than 80% of our French revenue and 65% of our European revenue.
Join us, we look forward to meeting you!
#LI-LR1
Data Engineer
1 Mar · IO Interactive · Barcelona, ES
.Net Python TSQL Azure Cloud Computing AWS Power BI
Welcome to IO Interactive, where we shape worlds, stories, and adventures for players around the globe. Now, we’re looking for our next adventurer: a Data Engineer to join our centralized Business Intelligence team.
You’ll be joining a team that genuinely enjoys working together. We are a group known for curiosity, collaboration, and a great sense of humor. Most of the team is based in Copenhagen, but we work seamlessly across all our studios. You’ll be stepping into an environment where people support each other, share knowledge openly, and have fun while tackling complex data challenges.
This is not just a support role. As a Data Engineer at IO Interactive, you will help shape how we understand our players, our games, and our business. You will turn raw information into meaningful, actionable intelligence that empowers our teams to make smarter decisions, from post‑launch game performance to commercial insights to financial forecasting across all of IO Interactive.
This is a role for someone curious, grounded, collaborative, and capable of navigating ambiguity. You enjoy understanding the problem before rushing to the solution, and you can translate complex technical topics into clear, accessible insights for non‑technical stakeholders.
If you want your work to directly influence iconic, industry‑defining games, while being part of a genuinely warm, international, and fun team, then you are our new companion, and this is the adventure for you.
This position is open in our Malmö, Copenhagen, Brighton, and Barcelona studios. We offer a welcoming studio culture with a hybrid setup of 4 days in the studio and 1 optional remote day.
What You Will Do
- Finance: financial dashboards, cashflow monitoring, budget allocation dashboards, assignment plan insights, salary review dashboards etc.
- Games: post‑release game analytics for Hitman and 007 First Light, game performance dashboards, player behavior insights, hardware usage analytics etc.
- Commercial: commercial insights and reporting across the IO Interactive portfolio.
- Identify and enable data collection requirements across finance, commercial, and game analytics domains, in collaboration with stakeholders.
- Build, maintain, and optimize robust data processing pipelines with consideration for data protection laws, cost, and scalability.
- Implement dashboards and ensure stakeholders are onboarded and empowered to use them effectively.
- Monitor pipeline health, troubleshoot issues, and ensure reliable data availability.
- Contribute to BI team initiatives that enhance tooling, processes, prioritization, and data culture across IOI.
What You Bring
- Several years of experience working in Data Engineering, Data Analytics, or Business Intelligence, ideally within the financial or tech industry.
- Experience working with financial and commercial analytics, such as P&L reports, cash flow analysis, revenue monitoring, and multinational financial structures.
- Comfortable working with data warehouses, databases, ETL/ELT pipelines, and dashboarding tools.
- Experience automating data extraction from RESTful APIs. Knowledge of Python or .NET is helpful.
- Practical experience with SQL and data processing techniques.
- A degree in a relevant field such as data science, computer science, information systems, software engineering, statistics, applied mathematics, finance, or a related discipline.
- Experience with Microsoft data stack is a plus: Azure Synapse, Data Factory, Azure SQL, Fabric, Power BI, Blob Storage.
- Experience in Google Cloud, AWS, or equivalent ecosystems is also valued.
IO Interactive is an independent videogame development and publishing company with studios in Copenhagen, Malmö, Barcelona, Istanbul, and Brighton. As the creative force behind some of the most talked-about multiplatform video games in the last decade, we are committed to creating unforgettable characters and experiences – all powered by our award-winning, proprietary Glacier technology.
IOI is a studio that values in-person collaboration. Being together helps us focus our collective energy on our immediate goals. For us, being both in-office and connected across our studios helps us integrate our teams faster, strengthen relationships, and improve knowledge-sharing. We believe that the more time we spend together, the more quality and progress we achieve for our games and players.
We know that to achieve those goals, we need courage, talented people, and a great working environment – and we do our utmost to have all of that. Across our multiple studios, we’re working on several projects. Crucially, though, we’re all one team. We value the work and impact that each person brings to the table, and we actively encourage new ideas, whilst listening to your insights along the way.
We have a dedicated team of People Managers who look after you as an individual and as an employee. With more than 40 nationalities, we know that everyone is different, and we are proud to have a reputation for being a friendly workplace with highly talented people.
DevOps Engineer
24 Feb · Krell Consulting & Training · Barcelona, ES
Python Cloud Computing AWS DevOps Terraform
Description
At Krell Consulting we are looking for a Senior DevOps Engineer with consolidated experience in cloud environments and infrastructure automation to join a high-level technology project.
Main Duties
Design, implementation, and maintenance of Infrastructure as Code (IaC) using Terraform.
Management and evolution of AWS architectures, especially serverless and container-based environments.
Development and automation using Python.
Design, implementation, and maintenance of CI/CD pipelines in GitLab.
Collaboration with development teams to improve integration, deployment, and software quality processes.
Application of DevOps best practices and continuous improvement of technical processes.
Minimum Requirements
3 to 5 years of professional experience in DevOps environments.
Mastery of Terraform for Infrastructure as Code.
Solid experience in AWS (serverless and containerized architectures).
Advanced knowledge of Python.
Experience with GitLab and CI/CD pipeline configuration.
Minimum English level B2 (able to work in an international environment).
Education: Higher Vocational Training degree (Ciclo Formativo de Grado Superior) or equivalent qualification in a technology field.
Nice to Have
Experience with monitoring and alerting tools such as Prometheus, Grafana, Alertmanager, PagerDuty, or Dynatrace.
Knowledge of DevSecOps practices.
Personal Skills
Autonomous, proactive person with high motivation and an interest in continuous improvement.
Analytical, problem-solving mindset.
Capacity for constant learning.
Good interpersonal skills and ability to work in a team.
Conditions
Location: preferably Barcelona, with on-site attendance 2 days per week.
Stable project with an advanced technology environment.
Full-time schedule.
Hybrid work model.
Permanent contract.
22 working days of vacation.
Data Engineer Internship - Esp Amazon University
Amazon · Barcelona, ES
C# Java Python TSQL NoSQL C++
We're on the lookout for the curious, those who think big and want to define the world of tomorrow. At Amazon, you will grow into the high-impact, visionary person you know you're ready to be. Every day will be filled with exciting new challenges, developing new skills, and achieving personal growth.
How often can you say that your work changes the world? At Amazon, you'll say it often. Join us and define tomorrow.
2026 Spain Data Engineering Internship
Do you love building tools and data pipelines? Are you excited by the opportunity to create clear, effective reports and data visualizations, and collaborate with stakeholders to answer key business questions? Do you want to be a part of a fast-paced environment and contribute to one of the most visited sites on the Internet?
If this describes you, consider joining us as an intern. Amazon is looking for a data engineer intern to join one of our many lines of business. Amazon interns have the opportunity to work alongside the industry's brightest engineers who innovate every day on behalf of our customers. You will be matched to a manager and a mentor. You will have the opportunity to affect the evolution of Amazon technology as well as lead mission-critical projects early in your career. Your work will contribute to solving some of the most complex technical challenges in the company.
In addition to working on an impactful project, you will have the opportunity to engage with Amazonians for both personal and professional development, expand your network, and participate in fun activities with other interns throughout the summer. No matter the location of your internship, we give you the tools to own your summer and learn in a real-world setting.
Key job responsibilities
- Design, implement, and automate deployment of our distributed system for collecting and processing log events from multiple sources
- Design data schema and operate internal data warehouses and SQL/NoSQL database systems
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards to drive key business decisions
- Monitor and troubleshoot operational or data issues in the data pipelines
- Drive architectural plans and implementation for future data storage, reporting, and analytic solutions
- Work collaboratively with Business Analysts, Data Scientists, and other internal partners to identify opportunities/problems
- Provide assistance to the team with troubleshooting, researching the root cause, and thoroughly resolving defects in the event of a problem
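The first two responsibilities above, collecting log events and rolling them into metrics for dashboards, can be sketched at toy scale in plain Python. The event shape and service names are invented; in practice this aggregation would run in a distributed system, not a single process:

```python
from collections import Counter

def aggregate_events(events):
    """Toy metrics rollup for a log pipeline: count events per service and
    compute an error rate, the kind of aggregate a dashboard would read."""
    counts, errors = Counter(), Counter()
    for e in events:
        counts[e["service"]] += 1
        if e["level"] == "ERROR":
            errors[e["service"]] += 1
    return {svc: {"events": counts[svc],
                  "error_rate": errors[svc] / counts[svc]}
            for svc in counts}

# Hypothetical log events from two services.
sample = [
    {"service": "checkout", "level": "INFO"},
    {"service": "checkout", "level": "ERROR"},
    {"service": "search", "level": "INFO"},
]
metrics = aggregate_events(sample)
```

The same counting logic, expressed as a MapReduce or SQL GROUP BY, is the core of the "ongoing metrics, reports, and dashboards" work described above.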
A day in the life
Our Data Engineers build and maintain the infrastructure to answer questions with data, using software engineering best practices, data management fundamentals, data storage principles, and recent advances in distributed systems (e.g., MapReduce, MPP architectures, NoSQL databases). We're looking for Data Engineer interns to join one of our many lines of business.
About the team
If you're insatiably curious and always want to learn more, then you've come to the right place. Depending on your location, country, job status, and other requirements, some or all of the following benefits may be available to you as an intern.
- Competitive pay
- Impactful project and internship/role deliverables
- Networking opportunities with fellow interns
- Internship events such as speaker series, intern panels, Leadership Principles sessions, and Amazon writing-skills sessions.
- Mentorship and career development
If you're successful during your internship, you could be considered for a graduate role after finishing your university studies.
Internship start dates vary throughout the year.
The ideal internship length is 3 months.
We are committed to diversity, equity, and inclusion, and leveraging our unique perspectives to scale our impact and grow. Amazon has 13 affinity groups, sometimes known as employee resource groups, which bring employees together across businesses and locations around the world. With executive and company sponsorship, these groups play an important role in building internal networks for creating a community, advising Amazon business units, leading in service projects, and reaching out to communities where Amazonians live and work.
Want to know more about our opportunities?
BASIC QUALIFICATIONS
- Work 40 hours/week minimum and commit to a 6-month internship maximum
- Are enrolled in a Bachelor's degree program or above
PREFERRED QUALIFICATIONS
- Experience with at least one modern language such as Java, Python, C++, or C# including object-oriented design
- Experience with SQL, including class projects, personal projects, research, prior internships, or volunteer work
Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience and skills. We value your passion to discover, invent, simplify and build. Protecting your privacy and the security of your data is a longstanding top priority for Amazon.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit the appropriate resources for more information.
5G NTN Test Engineer (Junior)
21 Feb · Sateliot · Barcelona, ES
Linux IoT Office
WHO ARE WE?
Sateliot is a Barcelona-based startup in the New Space sector, on track to become the first satellite telecommunications operator to provide global, continuous connectivity to all the devices that will make up the massive Internet of Things (IoT) universe under the 5G protocol.
To do so, Sateliot is launching a constellation of latest-generation nanosatellites in low orbit that act as mobile towers. Sateliot is the perfect complement for large telecommunications companies, providing them with the necessary infrastructure where terrestrial technologies do not reach.
YOUR MISSION
Lead and coordinate Proofs of Concept (PoCs) with Mobile Network Operators (MNOs) in both lab and field environments, ensuring seamless integration of IPX and NTN services and effectively managing end-to-end service validation.
Your Main Functions
- Manage Proofs of Concept (PoCs) with Mobile Network Operators (MNOs), both in the lab and on-site in the MNO’s country when required.
- Provide technical support during MNO integration processes, including IPX and NTN network elements.
- Coordinate and execute test passes and service validations, ensuring alignment with integration requirements.
- Analyze and report test results, identify anomalies, and recommend technical solutions.
- Collaborate with internal engineering, product, and operations teams to optimize the integration process and resolve technical blockers.
- Contribute to the documentation of procedures, integration steps, and lessons learned from each MNO onboarding.
- Act as a technical interface between Sateliot and the MNO’s technical teams during integration and validation.
Your Profile
- Degree in Telecommunications Engineering.
- Familiarity with IPX networks, 5G networks, roaming setup, and interconnection models.
- Ability to work with testbeds and telecom monitoring tools (e.g., Wireshark, protocol analyzers).
- Knowledge of IP networking protocols (IPv4/IPv6, DNS, routing, VPN, etc.).
- Basic Linux user skills.
- Communication and coordination skills for working with international MNOs and technical partners.
- Analytical and troubleshooting mindset in telecom environments.
- Capacity to work independently and manage technical projects in parallel.
- Availability to travel when needed for PoCs and integration projects abroad.
- Good level of English (spoken and written) is essential; additional languages are a plus.
- Understanding of mobile and satellite communication protocols, including NB-IoT, LTE-M, 3GPP NTN, and LEO technologies.
You will be part of one of the fastest-growing startups in Spain, with global reach, while entering the challenging world of New Space & Telecommunications.
Our culture is based on openness: welcoming multicultural talent, being respectful to everybody, and staying open to exchanging ideas. We are also committed to a healthy lifestyle, helping our team balance their work and personal lives and providing facilities for healthy habits.
We are a driven team with big goals that seeks people who are genuinely passionate about their work and who want to keep learning and improving, both personally and professionally!
WHAT DO WE OFFER?
- Full-time permanent contract
- Hybrid Work Model 💻
- Schedule flexibility
- Flat and transparent organizational structure
- Buddy Program to help you with your integration during your first month 🫂
- Flexible compensation package: Tax benefits with ticket restaurant, transportation and kindergarten, training programs.💰
- We promote good physical and mental health, with health insurance, fresh fruit in the office, and the possibility of sharing the cost of bicycle commuting or gym memberships.🏋🏻
- Work in a dynamic, multidisciplinary and multicultural environment that will allow you to boost your professional career 🌍
- Being part of a strong, international, friendly, and motivated team, where you can progress both personally and professionally 🪴
- The chance to be part of one of the most exciting and disruptive space projects in Europe 🚀
DevOps Engineer
20 Feb · Randstad ES · Barcelona, ES
Python Cloud Computing Jira AWS Bash DevOps Terraform
Do you have experience as a DevOps engineer working with AWS?
If you are motivated to contribute positively to the life-science industry and to deserving patients worldwide, keep reading!
our client
Our client is a software-as-a-service provider that transforms manufacturing operations in life science industries using advanced analytics and artificial intelligence.
Their mission is to improve global health by optimizing how medicines are manufactured so that pharma and biotech companies can provide patients worldwide with the right medicine at the right time and price.
Reporting to the Team Lead, you will ensure that the platform is automated, well documented, cared for, and maintained efficiently, and you will be responsible for infrastructure delivery automation processes in AWS.
your functions
Responsibilities:
- Design, develop, and implement an infrastructure on AWS using Infrastructure as Code (IaC) principles with OpenTofu (Terraform), Terragrunt, and AWS CloudFormation.
- Commission and decommission product environments, and deploy product releases.
- Implement continuous integration and continuous delivery (CI/CD) pipelines for infrastructure deployments and company code using tools like GitHub Actions/Workflows, programming languages like Python, and shell programming languages like Bash/Zsh.
- Ensure configuration management processes and infrastructure comply with ISO 9001, ISO 27001/27017, NIS 2, SOC 2, and GMP/GAMP standards.
- Automate infrastructure patching and vulnerability management.
- Monitor and respond to incidents.
position requirements
- Education: Higher Vocational Training degree (Ciclo Formativo de Grado Superior)
- Languages: English B2
- Skills: AWS, CI/CD, scripting, IaC
- Experience: 1 year
1+ years of experience using AWS cloud technologies in a DevOps or similar role.
Basic Infrastructure as Code (IaC) knowledge, particularly with OpenTofu (Terraform) and/or Terragrunt.
Experience with CI/CD pipelines for automated deployments.
Basic scripting skills (Bash, Python, etc).
Good problem-solving and troubleshooting skills.
Effective communication and collaboration skills.
Experience with CI/CD tools (GitHub Actions).
Familiarity with ticket-based task scheduling (Jira).
Data Engineer
19 Feb · HAYS · Barcelona, ES
Python Agile TSQL Azure Git Machine Learning Power BI SQL Server
Senior Data Engineer
We are looking for a Senior Data Engineer capable of building solid, scalable data platforms ready to drive real business decisions. If you are passionate about designing modern architectures, raising technical standards, and working in an environment where your ideas count, this role is for you.
What You Will Do
• Build and maintain end-to-end pipelines on Azure.
• Model data with dbt, SQL, and Databricks/Spark.
• Ensure data quality, reliability, and performance.
• Design models optimized for Power BI.
• Act as a technical reference: best practices, mentoring, and architectural vision.
• Work in Agile with business and analytics teams.
What You Will Bring
• 5+ years as a Data Engineer.
• Mastery of SQL, dbt, Databricks/Spark, and Azure (ADLS, ADF, Synapse).
• Experience with SQL Server and semantic models in Power BI.
• Good command of Git and modern data architectures.
• A critical mindset, autonomy, and a passion for doing things well.
Bonus Points
• Power BI + dbt integration.
• CI/CD for data or BI.
• Data governance: RLS, access control, best practices.
• Power BI Service optimization.
• Python (machine learning)
Why You Might Be Interested
Because your work will have a direct impact on how the company makes decisions, scales its systems, and uses data as a competitive advantage. Here you will be able to lead, propose, and build.
Data Analyst & Data Engineer in Sant Cugat
Michael Page · Sant Cugat del Vallès, ES
Remote Work Python TSQL Azure R ERP Machine Learning Power BI
- Do you live in the Vallès area?
- Do you have experience as a Data Analyst or Data Engineer?
Where Will You Work?
A company in the retail sector.
Description
Main Responsibilities
Power BI Development and Governance
Design, develop, and maintain interactive reports, corporate dashboards, and strategic scorecards in Power BI.
Implement RLS/OLS aligned with security groups; manage workspaces, permissions, and auditing.
Build efficient, scalable, and well-documented data models.
Implement and optimize complex DAX measures, ensuring performance, accuracy, and consistency.
Manage datasets, incremental refresh, gateways, and connections to multiple sources.
Apply data-modeling best practices (star schema, normalization, relationships, granularity, etc.).
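The star-schema practice mentioned above can be sketched at toy scale: a fact table keyed to a dimension table, with a measure aggregated per dimension attribute, which is conceptually what a DAX measure does over a tabular model. Table contents and names here are invented for illustration:

```python
# A tiny star schema in plain Python: products are the dimension,
# sales rows are the facts, joined via the product_id surrogate key.
dim_product = {1: {"category": "Food"}, 2: {"category": "Drinks"}}
fact_sales = [
    {"product_id": 1, "amount": 10.0},
    {"product_id": 1, "amount": 5.0},
    {"product_id": 2, "amount": 7.5},
]

def total_sales_by_category(facts, dim):
    """Roll fact rows up to a dimension attribute via the key lookup."""
    totals = {}
    for row in facts:
        category = dim[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals
```

Keeping measures on the fact table and descriptive attributes on dimensions is what lets BI tools slice one aggregation by many attributes without duplicating data.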
Quantitative Analysis and Data Science
Develop descriptive, diagnostic, and predictive analyses that add value to the business.
Use Python / R (where applicable) for advanced processing, machine learning, or load automation.
Document methodologies, assumptions, and results clearly for non-technical stakeholders.
Data Management and Architecture
- Collaborate with Data Engineering to guarantee data quality, availability, and traceability.
- Participate in defining ETL/ELT pipelines and in optimizing ingestion and transformation processes.
- Guarantee data governance, security, and compliance standards.
Technical Requirements
Must-Haves
Advanced command of Power BI Desktop and Power BI Service.
Solid experience in DAX (time intelligence, advanced functions, optimization).
Experience in data modeling (tabular models, star schema, role-playing dimensions, many-to-many relationships, etc.).
Solid knowledge of SQL applied to extracting and transforming data from DWH/Lake, ERP, and CRM systems, including a good command of relational (ER) modeling, basic normalization, primary/foreign keys, and an understanding of granularity and data quality.
Knowledge of Python or R
Nice-to-Haves
- Performance tools: Tabular Editor, DAX Studio, Performance Analyzer.
- Experience with Python libraries (Pandas, NumPy, Scikit-learn) or other analytical languages.
- Knowledge of Power Query (M).
- Familiarity with the Azure Data Platform (Synapse, Data Factory, Lakehouse, Fabric).
- Prior experience in high-data-volume environments or regulated sectors.
Who Are We Looking For (M/F/D)?
Key Competencies
- Analytical thinking and the ability to solve complex problems.
- Rigor in data validation and a quality-oriented approach.
- Effective communication with technical and non-technical profiles.
- Proactivity in identifying opportunities for improvement and automation.
- Ability to work in a structured way, prioritizing by impact and urgency.
Education and Experience
- Between 3 and 5 years of experience in Data Engineer, Data Analyst, Data Scientist, or BI Specialist roles
- Certifications in Power BI or Azure.
- Mastery of tabular modeling, star schemas, and complex relationships.
- A solid SQL foundation for extraction and transformation from Data Warehouse, ERP, and CRM systems.
- Experience in Python or R and Power BI/Azure (Microsoft's cloud platform) certifications are valued.
What Are Your Benefits?
- A stable, permanent position.
- 1 remote day per week.
- A project with growth opportunities.