Data Engineer (Spain Only)
Technosylva
We are looking for a new colleague to join our Data Engineering team, a core part of the wider Data Platform area at Technosylva (which also includes Data QA and Integrations).
Our team’s mission is to coordinate all internal and external data publication, ensuring it is automated, reliable, consistent, and of the highest quality. We see ourselves as a "Platform Team" in the sense of the Team Topologies book—we have a strong service-oriented mindset and are focused on enabling other teams. This gives us a unique perspective, as we interact with nearly every part of the company.
You'll be joining a team that is mostly remote, though we have a great office in León (Spain) if you prefer an in-person environment. This position is open to being fully remote. On a daily basis, our team communicates in Spanish, but official company documentation and some cross-team forums are in English, so comfort in both is key.
We are currently building a new data platform based on Airflow and Kubernetes. These tools will serve as the central orchestration and support layer for our data pipelines. These pipelines, in turn, will run tasks across our diverse and powerful ecosystem, which includes Azure Blob Storage, Azure Batch, Tinybird, PostgreSQL, and HPC (High-Performance Computing) clusters.
RESPONSIBILITIES
As a Data Engineer on our team, your work will be focused on evolving our data infrastructure. A large part of our current work involves migrating existing data pipelines (many based on Windows services) to this new, modern, and scalable platform.
Your day-to-day work will involve:
- Designing, building, and maintaining robust data pipelines on our new platform.
- Orchestrating complex workflows that process massive volumes of data, primarily in batch, with some near-real-time needs.
- Handling a significant and fascinating geospatial data component, including its specific file formats and processing challenges.
- Collaborating closely with our Science teams to adapt their calculation models (which may come in Python, R, or .NET) so they can be validated, monitored, and scaled effectively within our production pipelines.
- Contributing to our DevOps culture by working closely with the Platform team. This includes managing infrastructure as code (Terraform) and building and maintaining our CI/CD pipelines in GitLab.
- Helping our organization on its journey to democratize data access for everyone.
REQUIRED EXPERIENCE / SKILLS
We want to be transparent about the levels we are targeting. We are expanding the team and are open to hiring both mid-level and senior engineers. Whether you are a professional with a solid foundation ready to take the next step, or a seasoned expert capable of driving architectural decisions and mentoring others, we have complex challenges waiting for you.
This section is not a rigid checklist. If you don't meet 100% of these points but feel that your experience and mindset align with our mission, we strongly encourage you to apply.
We believe this role is a great fit if you bring:
- A strong foundation in Python as a primary language for data processing and backend development.
- Solid experience in data engineering: You have built and maintained data pipelines before and understand the fundamentals of data orchestration, validation, and processing.
- A collaborative, service-oriented mindset: You enjoy helping others and understand the value of building platforms that enable other teams (that "Team Topologies" spirit).
- A genuine interest in DevOps and infrastructure: You are comfortable working close to the metal and believe that teams should own their services, from code to deployment (CI/CD, IaC).
- A pragmatic approach to technology: You understand that we must support existing codebases (like .NET or R) while building the future in Python.
- Professional fluency: You must be fluent in Spanish for daily team communication and professionally proficient in English for documentation and company-wide discussions.
The following skills are not required, but they are great complements to our team. If you don't have them, this is a perfect place to learn them:
- Direct experience with Airflow and/or Kubernetes.
- Familiarity with the Azure cloud ecosystem.
- Previous exposure to geospatial data and its specific libraries or formats.
- Experience with Infrastructure as Code tools like Terraform or CI/CD systems like GitLab CI.
- An interest in emerging data platform technologies, such as Apache Iceberg.
EDUCATION
- Education or certifications in Computer Science, Computer Engineering, Systems Administration, Software Development, or similar fields.
- English certifications will be valued.
BENEFITS
- Competitive annual salary.
- An annual bonus based on individual and company performance.
- Flexible working hours.
- Possibility of remote work.
At Technosylva, we value diverse experiences and skills, and we understand that each career path is unique. Therefore, we encourage all individuals who believe they meet most of the requirements and are interested in growing and contributing in the role to apply.
DISCLAIMER
The final salary and benefits depend on a variety of factors, including location, experience, training, qualifications, and market demands.
INCLUSION COMMITMENT
Technosylva is an equal opportunity employer. We are committed to creating an inclusive environment where diverse perspectives contribute to better solutions.