One sunny Barcelona morning, two designers sat down to build a beautiful lead gen form for a bathroom company’s showroom. They ended up reimagining the data collection experience, and realized it was one idea they wouldn’t be flushing down the toilet. Today, Typeform is a team of over 330 people from more than 30 countries, with offices in Barcelona, Spain and San Francisco, US.
We’re building a world where brands, businesses, and their communities can have more personal relationships with the people who matter most. To do this, we give people the tools they need to have conversational experiences at scale—because meaningful relationships start with conversations.
From online lead gen forms and engaging quizzes, to face-to-face encounters in the office, help us build more meaningful relationships, one conversation at a time.
About the Team
Different people employ different methods when it comes to making decisions. SWOTs, pros and cons, a coin flip, gut feeling...none of them beat the facts. At Typeform, the Data team is making sure it’s easy for data to become our #1 asset.
Typeform’s data needs are growing, and we need to find new technical solutions to respond to these needs—fast.
That’s where you come in. We’re looking for a passionate Data Engineer to design a big data infrastructure for our machine learning projects, a set of APIs to ingest/expose data, and a coordination framework for the whole thing. All this to drive Typeform to the gold standard of data collection.
Here’s Albert Franzi, our Data Engineering Team Lead:
“In our team, collaboration between data scientists, machine learning engineers, and data engineers is key. We work together to challenge the status quo and improve our ingestion, processing, and consumption of data day by day, enabling us to grow using cutting-edge technology.”
We're very proud of what we achieved in 2020. We ran Airflow and Spark on K8s, enabled our DS and ML teams to build their own DAGs with our DAG Factory tool, started ingesting real-time data with Kafka and Spark Structured Streaming, and deployed machine learning models into production using MLflow, K8s, and ArgoCD. Find out more in our latest LinkedIn post.
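To give a flavor of the "DAG Factory" idea mentioned above, here is a minimal, hypothetical sketch in plain Python: building a dependency graph of pipeline tasks from a declarative spec and ordering them so each task runs after its upstream dependencies. The names (`build_dag`, `topological_order`, the example tasks) are illustrative assumptions, not Typeform's actual tool or Airflow's API.

```python
# Hypothetical config-driven "DAG factory" sketch: a declarative task spec
# is turned into a dependency graph and a valid execution order.
# All names here are illustrative, not Typeform's internal tooling.
from collections import defaultdict, deque

def build_dag(spec):
    """Return an adjacency map {task: [downstream tasks]} from a task spec."""
    graph = defaultdict(list)
    for task, upstream in spec.items():
        graph.setdefault(task, [])       # ensure every task has a node
        for dep in upstream:
            graph[dep].append(task)      # edge: dependency -> dependent
    return dict(graph)

def topological_order(graph):
    """Order tasks so every task runs after all of its dependencies."""
    indegree = {t: 0 for t in graph}
    for downstream in graph.values():
        for t in downstream:
            indegree[t] += 1
    queue = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while queue:
        task = queue.popleft()
        order.append(task)
        for t in graph[task]:
            indegree[t] -= 1
            if indegree[t] == 0:
                queue.append(t)
    if len(order) != len(graph):
        raise ValueError("cycle detected in DAG spec")
    return order

# Example spec: task -> list of upstream dependencies (illustrative names).
spec = {
    "extract_events": [],
    "extract_users": [],
    "join_sessions": ["extract_events", "extract_users"],
    "train_model": ["join_sessions"],
}
print(topological_order(build_dag(spec)))
# -> ['extract_events', 'extract_users', 'join_sessions', 'train_model']
```

In a real deployment this spec would instead be compiled into Airflow operators and `set_upstream`/`set_downstream` relationships, with Airflow's scheduler handling the ordering.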
Do you love diving into data projects? Do you dream in numbers? Do you see problems as challenges? Then come be a part of something big.
About the Role
Our data platform is core to Typeform’s business and we’ve made it the asset it is by working together to build, enable and constantly develop our tools.
Here’s what you’ll do:
- Design, implement, evaluate, and challenge data pipelines to ingest, transform, and store data in real-time and batch mode.
- Collect and structure external data to feed our internal services and help our product teams make better data-driven decisions.
- Develop tools and frameworks to empower and enable our DS and ML colleagues in their daily work.
- Work closely with ML engineers to deliver ML products into production.
- Design, implement, and challenge the architecture that supports all of the above.
- Understand how data flows within various systems, ensure data integrity and availability, and organize it properly for reporting and analytics.
- Work with other departments to meet their data needs, making the info clear for tech experts and newbies alike.
- Champion a healthy data culture throughout the organization.
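The first two duties above (ingest, transform, store) can be sketched as a toy batch pipeline. In practice this work would run on Spark; the plain-Python version below is only an assumed illustration of the stages, with made-up event fields (`type`, `form_id`, `completed`).

```python
# Toy batch pipeline sketch (ingest -> transform -> store).
# Real pipelines here would run on Spark; this only shows the shape of the stages.
import json
import io

def ingest(raw_lines):
    """Parse newline-delimited JSON events, dropping malformed rows."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production: route to a dead-letter store instead

def transform(events):
    """Keep completed form submissions and normalize the schema."""
    for e in events:
        if e.get("type") == "submission" and e.get("completed"):
            yield {"form_id": e["form_id"], "user": e.get("user", "anon")}

def store(rows, sink):
    """Write transformed rows to a sink; return the row count."""
    count = 0
    for row in rows:
        sink.write(json.dumps(row) + "\n")
        count += 1
    return count

# Illustrative input: two valid submissions, one view event, one broken row.
raw = [
    '{"type": "submission", "form_id": "f1", "completed": true, "user": "u1"}',
    '{"type": "view", "form_id": "f1"}',
    'not-json',
    '{"type": "submission", "form_id": "f2", "completed": true}',
]
sink = io.StringIO()
print(store(transform(ingest(raw)), sink))  # -> 2
```

The same three stages map directly onto a Spark job (read, filter/select, write) or a Structured Streaming query in real-time mode.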
Here’s what we’re after:
- Professional experience as a data engineer building data pipelines and data lakes.
- Solid knowledge of Scala or Python programming languages.
- Experience building large-scale data pipelines in Spark.
- Experience in SQL and data warehouses.
- Experience integrating data from multiple sources, including DBs, product tracking, and APIs.
- Satisfaction when your jobs run like clockwork.
- Knowledge about cloud computing platforms such as AWS, Google Cloud or Azure.
- A love for getting stuck into juicy problems. The word ‘impossible’—given the appropriate time and resources, of course—does not compute.
- A collaborative spirit and eagerness to keep growing as a data engineer: you're always ready for new challenges and new technologies.
- Proficiency in English and eagerness to work in a multicultural, international environment.
- Comfort with a continuous feedback culture.
And here are some things that would be great, but aren’t essential:
- You’ve got experience with AWS Redshift or other MPP databases (BigQuery, Vertica, Exasol, etc.). You’ve got basic DBA skills.
- Familiarity with nice-to-have (but not mandatory) technologies and libraries: Kafka, K8s, Airflow or Luigi, FastAPI, Redis, ArgoCD.
- Experience in Machine Learning.
Think you’re a good fit? Hit ‘Apply’—success might be just around the corner.