Data Engineering Intern

Blockhouse

Job Description: Data Engineering Intern

Blockhouse is focused on real-time machine learning and data engineering, building scalable infrastructure for high-frequency ML models that redefine how organizations extract actionable insights from data. Our systems drive the future of real-time analytics, leveraging cutting-edge technology to deploy machine learning pipelines with sub-second response times. If you’re passionate about building the future of MLOps and want to work with a world-class team, this is your opportunity.

Role Description:

We are looking for an exceptional Data Engineering Intern to join our team and help architect the data systems of the future. In this role, you will build and scale real-time data pipelines and analytics infrastructure, powering high-frequency machine learning models. This is not a typical internship – you will be working on mission-critical projects that process millions of data points per second, collaborating closely with machine learning scientists and MLOps engineers.

Your work will directly influence the performance of trading models and real-time decision-making engines. You’ll work with cutting-edge technologies for event-driven streaming and OLAP analytics, delivering insights at scale and speed.

Key Responsibilities:

  • Real-Time Data Pipelines: Design, develop, and optimize real-time data pipelines that feed high-frequency machine learning models. Ensure seamless data ingestion, transformation, and storage for analytics and machine learning at scale.
  • Advanced Data Integration: Collaborate with MLOps engineers and machine learning teams to ensure real-time data flows between systems, enabling models to continuously learn from and react to new data streams.
  • Performance Optimization: Work on optimizing the performance and reliability of data architectures using technologies like ClickHouse for high-throughput OLAP querying and Redpanda for low-latency event streaming.
  • Real-Time Monitoring & Diagnostics: Implement robust monitoring and diagnostic tools to track the health and performance of data pipelines, ensuring real-time models are supplied with accurate, up-to-date data.
  • Cloud Infrastructure: Build and manage scalable cloud infrastructure to support data pipelines in production, leveraging AWS or GCP services to ensure fault-tolerant, cost-efficient deployments.
  • Cross-Team Collaboration: Engage with top-tier engineers, data scientists, and quantitative researchers to build scalable solutions that bridge the gap between data engineering and machine learning.

What You’ll Need:

  • 1+ Years of Data Engineering Experience: Hands-on experience building and scaling data pipelines, especially in high-throughput, low-latency environments.
  • Mastery of Real-Time Data Systems: Expertise in real-time data streaming and processing, with strong hands-on experience using technologies like Redpanda (or Kafka) and ClickHouse (or similar OLAP databases).
  • Proficiency in Data Engineering Tools: Strong command of Python, SQL, and other tools commonly used in data engineering. Experience with frameworks such as Apache Spark, Airflow, or similar is a plus.
  • Cloud Expertise: Proven experience with cloud platforms such as AWS or GCP, including services like S3, Lambda, EKS, or other tools for building scalable data infrastructure.
  • Data Architecture & Integration: Experience architecting systems that handle both streaming and batch processing, integrating real-time pipelines with machine learning workflows.
  • Monitoring at Scale: Familiarity with monitoring and alerting tools such as Prometheus, Grafana, or CloudWatch to ensure seamless operation of real-time data systems.

Ideal Candidate Profile:

  • Passion for Real-Time Systems: A deep interest in building data systems that operate in real time, optimizing for performance, latency, and throughput.
  • Experience with High-Frequency Systems: Familiarity with the challenges and complexities of handling large-scale, high-frequency data.
  • Self-Motivated & Results-Driven: You thrive in a fast-paced environment and can work independently on complex tasks.
  • Collaborative Mindset: A team player with excellent communication skills, who can work effectively across teams to drive innovation and problem-solving.

Why You Should Join Us:

  • Innovative Environment: Be part of a team that is pushing the boundaries of real-time data engineering, solving complex challenges in financial technology and beyond.
  • Expert Team: Work alongside some of the brightest minds in data engineering, machine learning, and quantitative research.
  • Professional Growth: Blockhouse fosters a culture of continuous learning and development, ensuring you gain hands-on experience with cutting-edge technologies and best practices.
  • Cutting-Edge Projects: You’ll work on transformative projects that directly impact the future of trade execution, real-time analytics, and financial technology.
  • Compensation & Perks: Equity-only compensation. NYC-based employees enjoy daily free lunch and weekly company bonding events.

How to Apply:

If you are passionate about real-time data systems and eager to apply your skills to solve complex engineering challenges, join us at Blockhouse. Together, we will redefine the future of data engineering and real-time analytics.
