Role Overview
Join Fullstory as a Senior Data Engineer and harness the vast trove of data within our systems. This role is fully remote from anywhere in the United States, giving you the flexibility and autonomy to innovate and drive projects forward. You will collaborate with internal groups across the company to understand their data needs and ensure that data is accessible and actionable.
Responsibilities
- Work with internal teams to identify valuable data sources.
- Develop and maintain efficient ETL pipelines using tools like Airflow and Google BigQuery (see the sketch after this list).
- Build monitoring for existing data pipelines so issues are detected and resolved quickly.
- Participate in strategy meetings to maximize data utilization for achieving business goals.
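To make the pipeline work concrete, here is a minimal sketch of the kind of Airflow DAG this role might involve: a daily job loading newline-delimited JSON from Cloud Storage into BigQuery. It assumes Airflow 2.4+ with the Google provider package installed; the bucket, dataset, and table names are placeholders, not Fullstory's actual infrastructure.

```python
# Hypothetical sketch only: a daily Airflow DAG that loads newline-delimited
# JSON events from a GCS bucket into a BigQuery table. Bucket, dataset, and
# table names are placeholders, not Fullstory's actual infrastructure.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="load_events_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-events-bucket",                  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],       # one folder per day
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example_project.analytics.events",
        write_disposition="WRITE_APPEND",                # append each day's load
        autodetect=True,                                 # infer schema from the files
    )
```

Monitoring of the kind described above would typically hang off the same DAG, for example via `on_failure_callback` hooks or alerting on task state, rather than as a wholly separate system.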
Qualifications
- Proven experience in data engineering with a strong focus on ETL processes.
- Proficiency in Python and familiarity with REST APIs (see the sketch after this list).
- Experience with common data formats and databases (CSV, JSON, relational, NoSQL).
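As a rough illustration of the Python, REST API, and data-format skills listed above, the sketch below pulls paginated JSON records from a REST endpoint with the requests library and writes them out as CSV. The endpoint URL, pagination scheme, and field names are hypothetical, not a real Fullstory API.

```python
# Hypothetical sketch only: fetch paginated JSON records from a REST API and
# write them out as CSV. The endpoint, pagination scheme, and fields are
# illustrative placeholders.
import csv

import requests

API_URL = "https://api.example.com/v1/events"  # placeholder endpoint


def fetch_events(page_size: int = 100) -> list[dict]:
    """Collect all records, following simple page-number pagination."""
    records, page = [], 1
    while True:
        resp = requests.get(
            API_URL, params={"page": page, "per_page": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:          # an empty page means we have everything
            break
        records.extend(batch)
        page += 1
    return records


def write_csv(records: list[dict], path: str) -> None:
    """Write records to CSV, using the first record's keys as the header row."""
    if not records:
        return
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)


if __name__ == "__main__":
    write_csv(fetch_events(), "events.csv")
```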
Impact
- Within 6 months: Ensure data pipelines run smoothly and deliver new feature sets to internal teams through ML and AI.
- Within 12 months: Expand that support to deliver deeper insights from the data you oversee.
About Fullstory
Fullstory is dedicated to making businesses more customer-centric through behavioral data insights. Our technology transforms digital visits into actionable data, helping enterprises understand customer sentiment and drive conversions.
Support and Culture
At Fullstory, we value clarity, empathy, bionics, and trust. We support our employees by offering a remote-first work environment, comprehensive benefits, and opportunities for growth and development. Connect with a diverse team and contribute to a culture that embraces all dimensions of diversity.
How to Apply
Interested candidates are encouraged to apply through our dedicated careers page. We are committed to fostering an inclusive environment and consider all qualified applicants without regard to race, color, religion, sex, national origin, or disability.
Benefits
- Remote work flexibility
- Flexible paid time off
- Annual company-wide closure
- Sponsored benefit packages for US-based employees
- Supplemental coverage for international employees
- Professional development opportunities
- Career coaching sessions
- Annual learning subsidy
- Monthly productivity stipend
- Reimbursement for home office setup
- Employee Resource Group events
- Team off-sites
- Paid parental leave
- Global fertility and family building benefit
- Bereavement leave
- Miscarriage/Pregnancy loss leave
Similar jobs
Senior Software Engineer - Remote
Join Fullstory as a Senior Software Engineer to build and maintain services for behavioral data governance.
Remote Data Engineer - Google BigQuery
Join Hostaway as a Remote Data Engineer, leveraging Google BigQuery and Python to optimize data infrastructure and support revenue operations.
Senior Data Engineer
Senior Data Engineer role focusing on data architecture, ETL processes, and big data technologies in Denver, CO.
Senior Software Engineer, Backend
Senior Backend Software Engineer role focusing on distributed systems using Go, Kubernetes, and GCP at Fullstory, a leading tech company.
Senior Data Engineer
Join Algolia as a Senior Data Engineer to design and scale data pipelines using Python, Airflow, and AWS technologies.
Senior Data Engineer
Senior Data Engineer needed to develop high-scale data systems using Python, PostgreSQL, and cloud services. Remote work flexibility.
Senior Data Engineer - GCP/Airflow
Senior Data Engineer role focusing on GCP, Airflow, and ETL in Stockholm. Join Bambuser to enhance data-driven decision making.
Senior Analytics Engineer
Join Remote as a Senior Analytics Engineer to drive impactful decision-making with data analytics and engineering.
Senior Data Engineer (f/m/d)
Senior Data Engineer needed in Berlin. Expertise in Python, SQL, Data Modeling, and ETL required. Hybrid work policy.
Lead Data Engineer with GCP Expertise
Lead Data Engineer role in Berlin, focusing on GCP, BigQuery, and data pipelines.
Senior Data Engineer
Senior Data Engineer role in Barcelona, skilled in Python, Spark, AWS, SQL, and big data processing.
Senior Data Platform Engineer
Senior Data Platform Engineer specializing in AWS and GCP services, data pipelines, and cloud infrastructure.
Data Engineer II
Join Strava as a Data Engineer II in San Francisco, CA. Work with modern data technologies and a diverse team.
Senior Data Engineer
Join us as a Senior Data Engineer in Lisbon to design and maintain data infrastructure. Hybrid role with flexible benefits.
Senior Data Engineer, Data Platform
Senior Data Engineer needed to build scalable data platforms using Kafka, Spark, and AWS. Inclusive team, great benefits.
Senior Data Platform Engineer
Senior Data Platform Engineer needed for Blip, focusing on Big Data management and cloud solutions. Expertise in SQL, Python, Spark, and cloud platforms required.
Software Engineer II, Data Engineering
Join GitHub as a Software Engineer II in Data Engineering, focusing on data pipelines with Python, SQL, Airflow, and Spark.
Staff Software Engineer, Data Infrastructure
Senior Data Infrastructure Engineer at Airbnb, focusing on data engineering tools and frameworks, remote eligible.
Senior Software Engineer, Data
Join Airtable as a Senior Software Engineer, Data, to design and maintain scalable data pipelines and solutions.
Senior Data Engineer
Join Stability AI as a Senior Data Engineer to build scalable data infrastructure for AI models. Remote work from Germany.
Senior Data Engineer
Senior Data Engineer at Bynder in Barcelona, skilled in Apache Kafka, Airflow, ETL, and cloud data solutions.
Senior Data Engineer (Contract)
Senior Data Engineer, fully remote, contract. Expertise in Snowflake, SQL, Python, GCP required. $45-$60/hr.
Senior Data Engineer
Join SeQura as a Senior Data Engineer to develop and maintain data pipelines using AWS, Python, and SQL in a hybrid work environment.
Remote Data Engineer - Research
Join Stability AI as a Remote Data Engineer to build scalable data infrastructure for AI models.