Current jobs related to Data Acquisition Engineer - Ceuta - Walkway
-
Data Acquisition Engineer
2 weeks ago
Ceuta, Spain. Walkway. Full time. Job description: About Walkway. Walkway builds AI-driven revenue intelligence for tours and activities. Operators use our platform for real-time analytics, competitive benchmarks, and dynamic pricing. Our data team collects large-scale web and API data to power these insights. The Role: We have a small, focused group that owns source coverage and...
-
Senior Acquisition Analyst
2 weeks ago
Ceuta, Spain. Flutter International. Full time. Senior Acquisition Analyst - SEU. Job Title: SEU Senior Acquisition Analyst. Location: Malta/Ceuta. Reports to: Marketing Lead. Closing Date: 12th of December 2025. Grade: I5 / I6.1. PokerStars SEU: a growth story you can be part of. PokerStars is part of Flutter Entertainment, the world's largest sports betting and iGaming operator, with market-leading positions...
-
Data Engineer
2 weeks ago
Ceuta, Spain. AMURA IT. Full time. What does this opportunity involve? At Amura-Unikal we are expanding our Data Technologies business line. We are looking for a Data Engineer to join a dynamic team working with cutting-edge cloud technologies and modern tools for...
-
Paid Media Strategy
2 weeks ago
Ceuta, Spain. OPTIMAES Talent. Full time. Overview: Our client is a leading global beauty retailer, part of a luxury multinational with 150 points of sale in Iberia and a significant digital budget. The Traffic...
-
Head of Data Science Program
2 weeks ago
Ceuta, Spain. Evolve. Full time. Overview: We train the tech professionals of the future. Intensive 8-month master's programs in Data Science & AI, Cybersecurity, and Generative Artificial Intelligence. Practical, up to date, and with an employability rate that we...
-
Ceuta, Spain. ESS Bilbao. Full time. Training Requirements: PhD in Science (Physics or Chemistry) / MS Physics or Chemistry / MS Electronics Engineering or similar. Location: The position will principally be based in Bilbao but may also require the provision of services in any of the facilities of ESS-Bilbao. ...
-
Data Acquisition Engineer
2 weeks ago
Job description

About Walkway
Walkway builds AI-driven revenue intelligence for tours and activities. Operators use our platform for real-time analytics, competitive benchmarks, and dynamic pricing. Our data team collects large-scale web and API data to power these insights.

The Role
We have a small, focused group that owns source coverage and freshness. The Data Acquisition Lead sets priorities and reviews complex fixes; the Data Engineer maintains schemas, pipelines, and SLAs. You'll own day-to-day spider health and QA. Your focus is 80 percent web data collection and spider reliability, and 20 percent light transformations when formats change so downstream tables stay consistent. You will keep pipelines healthy, support internal users, and run QA checks so data stays accurate. This is an early-career role with significant room for growth.

What you will do

80 percent – Spiders and data collection
- Build and maintain spiders and API collectors in Python/JavaScript; adapt quickly when sites change.
- Handle the basics: headers, cookies, sessions, pagination, rate limits, and retries with backoff.
- Use browser automation when needed: Playwright or Puppeteer for dynamic pages.
- Triage and fix breakages: selectors, auth flows, captcha or anti-bot responses, proxy rotation.
- Monitor runs and freshness; create alerts and simple dashboards; flag when SLAs are at risk.
- Write validation checks and source-level QA to prevent bad data from entering the warehouse.
- Document playbooks so fixes are repeatable.

20 percent – Transformations, QA, and support
- Adjust small Python or SQL transformations when a source's output changes.
- Reconcile row counts and key fields against benchmarks; raise and resolve data quality issues.
- Collaborate with Data Engineers on schemas and idempotent loads into the warehouse.
- Update DAGs or jobs when source formats change so downstream tasks run idempotently and on schedule.
- Provide lightweight technical support to internal consumers.
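The "retries with backoff" item above can be sketched in Node.js roughly as follows. This is a minimal illustration only; the `withRetry` helper and its parameter names are hypothetical, not Walkway's actual code.

```javascript
// Illustrative retry-with-backoff helper (hypothetical, not Walkway's code).
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry an async operation with exponential backoff and full jitter.
// The delay cap doubles on each failed attempt; the random jitter spreads
// retries so many spiders recovering at once do not hit a host in lockstep.
async function withRetry(fn, { retries = 4, baseMs = 500, maxMs = 30000 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn(attempt);
    } catch (err) {
      lastError = err;
      if (attempt === retries) break; // out of attempts, rethrow below
      const cap = Math.min(maxMs, baseMs * 2 ** attempt);
      await sleep(Math.random() * cap); // full jitter: uniform in [0, cap)
    }
  }
  throw lastError;
}
```

Wrapping a collector call, e.g. `withRetry(() => fetchPage(url))` where `fetchPage` throws on 429/5xx responses, makes transient failures recoverable without hand-written retry loops in every spider.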
- Always follow legal and ethical guidelines for data collection; respect terms, privacy, and access controls.
- Communicate clearly in English with engineers and non-technical stakeholders.

Our stack (you do not need all of it)
- Node.js in JavaScript or TypeScript; async/await fundamentals.
- crawlee framework: PlaywrightCrawler, PuppeteerCrawler.
- Browser automation: Playwright or Puppeteer.
- Crawling and DOM parsing: Cheerio.
- Large-scale crawling: request queues, autoscaled concurrency, session pools.
- Proxy providers: integration and rotation, residential or datacenter, country targeting, session stickiness.
- GCP basics: Cloud Run or Cloud Functions, Pub/Sub, Cloud Storage, Cloud Scheduler.
- Data: BigQuery or Postgres fundamentals, CSV or Parquet handling.

What you bring
- Some hands-on scraping experience; personal projects or internships are fine.
- Core web fundamentals: headers and cookies, session handling, JSON APIs, simple auth flows.
- Comfort in Node.js and TypeScript or JavaScript; willingness to learn browser automation and concurrency patterns.
- Curiosity and high energy; you like chasing down failures and making things work again.
- Adaptability in a fast-changing environment; comfortable prioritizing under guidance.
- Experience with other web crawling frameworks, e.g. Scrapy, is valued.
- Ability to schedule and orchestrate runs reliably using Cloud Scheduler and Airflow or Mage where appropriate, with clear SLAs and alerting.

Nice to have
- Familiarity with anti-bot tactics and safe bypass strategies; rotating proxies; headless browsers.
- Basic SQL; comfort reading or writing simple queries for QA.
- Experience with GitHub Actions, Docker, and simple cost-aware choices on GCP.
- Exposure to data quality checks or anomaly detection.

Your first 90 days
- 30 days: ship your first spider, add monitoring and a QA checklist, and fix a real breakage end to end.
- 60 days: own a set of sources; reduce failure rate and mean time to repair; document playbooks.
- 90 days: propose a reliability or cost improvement; automate a repeated QA step.

Why Walkway
- Real impact on a data product used by operators.
- Ship quickly with a pragmatic, low-ego team; see your work move from concept to production fast.
- Fully remote with EU and US overlap; a few team gatherings per year; travel covered.
- Learn from senior engineers and grow toward data engineering or platform paths.

How to apply
Apply to this job offer and add links to a repo or code sample; if possible, include one example of a scraper you built and what it collected. If you are based in Europe, we would love to hear from you.
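The source-level QA described in the listing (reconciling row counts and key fields against benchmarks before data enters the warehouse) could look roughly like this. The `qaCheck` function, its tolerance, and its field list are illustrative assumptions, not Walkway's pipeline.

```javascript
// Illustrative source-level QA check (hypothetical, not Walkway's code).
// Flags a scraped batch when its row count drifts too far from a benchmark,
// or when required key fields are empty, so bad data is caught before loading.
function qaCheck(rows, { benchmarkCount, tolerance = 0.2, requiredKeys = [] }) {
  const issues = [];

  // Row-count reconciliation: allow +/- tolerance around the benchmark.
  const lower = benchmarkCount * (1 - tolerance);
  const upper = benchmarkCount * (1 + tolerance);
  if (rows.length < lower || rows.length > upper) {
    issues.push(`row count ${rows.length} outside [${lower}, ${upper}]`);
  }

  // Key-field completeness: every required key must be present and non-empty.
  for (const key of requiredKeys) {
    const missing = rows.filter(
      (r) => r[key] === undefined || r[key] === null || r[key] === ""
    ).length;
    if (missing > 0) issues.push(`${missing} rows missing "${key}"`);
  }

  return { ok: issues.length === 0, issues };
}
```

A check like this would run at the end of each spider run; a non-empty `issues` list blocks the load and raises an alert instead of letting a silently broken selector shrink a table.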