Current jobs related to Data Acquisition Engineer - Bilbao - Walkway
-
Software Engineer, Data Infrastructure
1 day ago
Bilbao, Spain · Speechify · Full time
Software Engineer, Data Infrastructure & Acquisition - Bilbao, Spain. The mission of Speechify is to make sure that reading is never a barrier to learning. Speechify’s text-to-speech products help over 50 million people read more efficiently. The team builds and maintains backend services across payments, analytics, subscriptions, new products,...
-
Data Engineer
2 weeks ago
Bilbao, Spain · We Bring · Full time
**_Would you like to be part of an agile, collaborative team that works with the latest technologies and is eager to learn every day?_** At **We Bring** we are recruiting a **Data Engineer (Python) in Bilbao** to join a technology company and international leader in the banking sector. **WHAT IS THE STACK...
-
Data Engineer
2 weeks ago
Bilbao, Vizcaya, Spain · TheWhiteam · Full time
Data Engineer (ODI/OBIEE), Europe (Full-remote) – International client. Job role: Data Engineer (ODI/OBIEE). Location: Europe (Full-remote). Languages required: English MANDATORY. Studies required: Master's degree. Minimum experience required: 5 to 7 years. Description: Knowledge and Skills: Deep understanding of data integration and ETL concepts. Strong analytical and...
-
Data Architect (Remote)
1 week ago
Bilbao, Spain · Exportadora Data Base S.A. · Full time
We are looking for a Data Engineer with experience in Talend and PySpark to join a stable project, taking part in the development, maintenance, and optimization of data integration and transformation processes in analytics environments. The position is aimed at profiles with hands-on data engineering experience, used to...
-
Senior Data Infrastructure Engineer
2 weeks ago
Bilbao, Spain · Speechify · Full time
A leading tech company in Bilbao is seeking a Software Engineer to join its Data team. The role involves data collection for model training, focusing on high-quality datasets. Ideal candidates have a background in software development with strong skills in scripting, cloud infrastructure, and data processing.
-
Data Engineer
1 week ago
Bilbao, Vizcaya, Spain · TheWhiteam · Full time
Job description: We are looking for a Data Engineer with experience in Python, PySpark, and Azure services for a stable, 100% remote project. The engagement is at 50% (4 hours per day), ideal for professionals seeking flexibility and compatibility with other projects. Responsibilities: Design and develop data engineering processes using Python,...
-
Data Engineer
2 weeks ago
Bilbao, Vizcaya, Spain · Athletic Club · Full time
The Technology Department at Athletic Club is looking for someone to join the Data & Analytics area as a Data Engineer. This person will technically lead the development and evolution of the Informational Platform services, ensuring its performance and fit with the Club's needs and following best practices in...
-
Data Engineer Databricks
2 weeks ago
Bilbao, Spain · The White Team · Full time
**Data Engineers Databricks & DBT (Barcelona or anywhere in EU) - International client** **Job Requirements** **Profile**: Engineer. **Knowledge**: Databricks. **Studies**: Technical Engineer. **Languages**: English. **Minimum experience**: 3 to 5 years. **Technical Skills required** - Databricks (strong hands-on experience) - DBT (Data Build Tool) -...
Data Acquisition Engineer
2 weeks ago
Contractor role; US-based company. We operate remotely; most of the Engineering team is CET.

About Walkway
Walkway builds AI-driven revenue intelligence for tours and activities. Operators use our platform for real-time analytics, competitive benchmarks, and dynamic pricing. Our data team collects large-scale web and API data to power these insights.

The Role
We have a small, focused group that owns source coverage and freshness. The Data Acquisition Lead sets priorities and reviews complex fixes; the Data Engineer maintains schemas, pipelines, and SLAs. You will own day-to-day spider health and QA.
Your focus is 80 percent web data collection and spider reliability, and 20 percent light transformations when formats change so downstream tables stay consistent. You will keep pipelines healthy, support internal users, and run QA checks so data stays accurate at all times. This is an early-career role with significant growth potential.

What you will do

80 percent - Spiders and data collection
- Build and maintain spiders and API collectors in Python/JavaScript; adapt quickly when sites change.
- Handle HTTP basics: headers, cookies, sessions, pagination, rate limits, retries with backoff (see the collector sketch after this section).
- Use browser automation when needed: Playwright or Puppeteer for dynamic pages.
- Triage and fix breakages: selectors, auth flows, captcha or antibot responses, proxy rotation.
- Monitor runs and freshness; create alerts and simple dashboards; escalate when SLAs are at risk.
- Write validation checks and source-level QA to prevent bad data from entering the warehouse.
- Document playbooks so fixes are repeatable.

20 percent - Transformations, QA, and support
- Adjust small Python or SQL transformations when a source output changes.
- Reconcile row counts and key fields against benchmarks; raise and resolve data quality issues (see the QA sketch below).
- Collaborate with Data Engineers on schemas and idempotent loads into the warehouse.
- Update DAGs or jobs when source formats change so downstream tasks run idempotently and on schedule.
- Provide lightweight technical support to internal consumers.

Always
- Follow legal and ethical guidelines for data collection; respect terms, privacy, and access controls.
- Communicate clearly in English with engineers and non-technical stakeholders.
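For a concrete flavor of the day-to-day, here is a minimal collector sketch in TypeScript using Crawlee's PlaywrightCrawler, the crawler named in the stack below. It is a sketch only: the target URL, extracted fields, and proxy endpoints are placeholders, not Walkway's actual sources or code.

```ts
// Minimal spider sketch with Crawlee's PlaywrightCrawler (npm i crawlee playwright).
// All URLs, fields, and proxies below are hypothetical placeholders.
import { PlaywrightCrawler, ProxyConfiguration } from 'crawlee';

// Proxy rotation: Crawlee cycles through the configured proxy URLs.
const proxyConfiguration = new ProxyConfiguration({
    proxyUrls: ['http://proxy-a.example.com:8000', 'http://proxy-b.example.com:8000'],
});

const crawler = new PlaywrightCrawler({
    proxyConfiguration,
    useSessionPool: true, // session pool: spreads cookies and identity across requests
    maxConcurrency: 10,   // concurrency is autoscaled up to this ceiling
    maxRequestRetries: 3, // retry budget before a request is marked as failed
    async requestHandler({ page, request, pushData, log }) {
        log.info(`Collecting ${request.url}`);
        // Placeholder extraction: a real spider would parse tour names, prices, etc.
        const title = await page.title();
        await pushData({ url: request.url, title, scrapedAt: new Date().toISOString() });
    },
    failedRequestHandler({ request, log }) {
        // Retries exhausted: log for triage; alerting would hook in here.
        log.error(`Giving up on ${request.url} after repeated failures.`);
    },
});

await crawler.run(['https://example.com/tours']);
```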
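And a sketch of the row-count and key-field reconciliation from the 20 percent list, as a plain TypeScript check. The record shape, benchmark, and thresholds are invented for illustration, not Walkway's real schema.

```ts
// QA sketch: reconcile a scraped batch against a benchmark before it is loaded.
interface TourRecord {
    sourceId: string;
    name: string;
    priceEur: number | null;
}

function runQa(rows: TourRecord[], benchmarkCount: number, tolerance = 0.1): string[] {
    const issues: string[] = [];

    // Row-count reconciliation: flag batches deviating more than `tolerance` from the benchmark.
    const deviation = Math.abs(rows.length - benchmarkCount) / benchmarkCount;
    if (deviation > tolerance) {
        issues.push(`row count ${rows.length} deviates ${(deviation * 100).toFixed(1)}% from benchmark ${benchmarkCount}`);
    }

    // Key-field checks: identifiers must be present and the null-price rate must stay low.
    const missingIds = rows.filter((r) => !r.sourceId).length;
    if (missingIds > 0) issues.push(`${missingIds} rows missing sourceId`);

    const nullPrices = rows.filter((r) => r.priceEur === null).length;
    if (rows.length > 0 && nullPrices / rows.length > 0.05) {
        issues.push(`null price rate above 5% (${nullPrices}/${rows.length})`);
    }

    return issues;
}

// Usage: gate the warehouse load on the checks; a failing batch is triaged, not loaded.
const batch: TourRecord[] = [{ sourceId: 'a1', name: 'City walking tour', priceEur: 25 }];
const issues = runQa(batch, 1);
if (issues.length > 0) {
    throw new Error(`QA failed: ${issues.join('; ')}`);
}
```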
Our stack (you do not need all of it)
- Node.js in JavaScript or TypeScript; async and await fundamentals.
- Crawlee framework: PlaywrightCrawler, PuppeteerCrawler, HttpCrawler.
- Browser automation: Playwright or Puppeteer.
- HTTP-based crawling and DOM parsing: Cheerio.
- Large-scale crawling: request queues, autoscaled concurrency, session pools.
- Proxy providers: integration and rotation, residential or datacenter, country targeting, session stickiness.
- GCP basics: Cloud Run or Cloud Functions, Pub/Sub, Cloud Storage, Cloud Scheduler.
- Data: BigQuery or Postgres fundamentals, CSV or Parquet handling.

What you bring
- Some hands-on scraping experience; personal projects or internships are fine.
- Core web fundamentals: HTTP, headers and cookies, session handling, JSON APIs, simple auth flows.
- Comfortable in Node.js with TypeScript or JavaScript; willing to learn browser automation and concurrency patterns.
- Curiosity and high energy; you like chasing down failures and making things work again.
- Adaptability in a fast-changing environment; comfortable prioritizing under guidance.
- Experience with other web crawling frameworks, such as Scrapy, is a plus.
- Ability to schedule and orchestrate runs reliably using Cloud Scheduler and Airflow or Mage where appropriate, with clear SLAs and alerting.

Nice to have
- Familiarity with antibot tactics and safe bypass strategies; rotating proxies; headless browsers.
- Basic SQL; comfort reading or writing simple queries for QA.
- Experience with GitHub Actions, Docker, and simple cost-aware choices on GCP.
- Exposure to data quality checks or anomaly detection.

Your first 90 days
- 30 days: ship your first spider, add monitoring and a QA checklist, and fix a real breakage end to end.
- 60 days: own a set of sources; reduce failure rate and mean time to repair; document playbooks.
- 90 days: propose a reliability or cost improvement; automate a repeated QA step.

Why Walkway
- Real impact on a data product used by operators.
- Ship quickly with a pragmatic, low-ego team; see your work move from concept to production fast.
- Fully remote with EU and US overlap; a few team gatherings per year, travel covered.
- Learn from senior engineers and grow toward data engineering or platform paths.

How to apply
Apply to this job offer and include in your resume links to a repo or code sample; if possible, one example of a scraper you built and what it collected. If you are based in Europe, we would love to hear from you.