Job Description: Data Engineering Intern
Blockhouse is focused on real-time machine learning and data engineering, building scalable infrastructure for high-frequency ML models that redefine how organizations extract actionable insights from data. Our systems drive the future of real-time analytics, leveraging cutting-edge technology to deploy machine learning pipelines with sub-second response times. If you’re passionate about building the future of MLOps and want to work with a world-class team, this is your opportunity.
Role Description:
We are looking for an exceptional Data Engineering Intern to join our team and help architect the data systems of the future. In this role, you will build and scale real-time data pipelines and analytics infrastructure, powering high-frequency machine learning models. This is not a typical internship – you will be working on mission-critical projects that process millions of data points per second, collaborating closely with machine learning scientists and MLOps engineers.
Your work will directly influence the performance of trading models and real-time decision-making engines. You’ll work with cutting-edge technologies for event-driven streaming and OLAP analytics, delivering insights at scale and speed.
Key Responsibilities:
- Real-Time Data Pipelines: Design, develop, and optimize real-time data pipelines that feed high-frequency machine learning models. Ensure seamless data ingestion, transformation, and storage for analytics and machine learning at scale.
- Advanced Data Integration: Collaborate with MLOps engineers and machine learning teams to ensure real-time data flows between systems, enabling models to continuously learn from and react to new data streams.
- Performance Optimization: Optimize the performance and reliability of data architectures using technologies like ClickHouse for high-throughput OLAP querying and Redpanda for low-latency event streaming.
- Real-Time Monitoring & Diagnostics: Implement robust monitoring and diagnostic tools to track the health and performance of data pipelines, ensuring real-time models are supplied with accurate, up-to-date data.
- Cloud Infrastructure: Build and manage scalable cloud infrastructure to support data pipelines in production, leveraging AWS or GCP services to ensure fault-tolerant, cost-efficient deployments.
- Collaborate with Elite Teams: Engage with top-tier engineers, data scientists, and quantitative researchers to build scalable solutions that bridge the gap between data engineering and machine learning.
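To give a flavor of the pipeline work described above: a common pattern when feeding a Redpanda/Kafka stream into an OLAP store like ClickHouse is micro-batching, where events are buffered and flushed in groups to keep insert throughput high. The sketch below is illustrative only, not Blockhouse's actual stack; the `MicroBatcher` class, its parameters, and the sample tick events are hypothetical.

```python
import time

class MicroBatcher:
    """Buffers incoming events and flushes them in batches, either when
    the batch reaches max_size or when max_wait_s has elapsed since the
    first buffered event. flush_fn receives the list of buffered events
    (e.g. a function performing a bulk insert into an OLAP table)."""

    def __init__(self, flush_fn, max_size=500, max_wait_s=1.0):
        self.flush_fn = flush_fn
        self.max_size = max_size
        self.max_wait_s = max_wait_s
        self._buf = []
        self._first_ts = None

    def add(self, event):
        if self._first_ts is None:
            self._first_ts = time.monotonic()
        self._buf.append(event)
        if len(self._buf) >= self.max_size:
            self.flush()

    def poll(self):
        # Call periodically (e.g. each consumer-loop iteration) to flush
        # a partially filled batch that has waited long enough.
        if self._buf and time.monotonic() - self._first_ts >= self.max_wait_s:
            self.flush()

    def flush(self):
        if self._buf:
            self.flush_fn(self._buf)
            self._buf = []
            self._first_ts = None

# Example: group hypothetical market ticks into batches of up to 3.
batches = []
b = MicroBatcher(batches.append, max_size=3, max_wait_s=60)
for tick in [{"px": 100.0}, {"px": 100.1}, {"px": 99.9}, {"px": 100.2}]:
    b.add(tick)
b.flush()  # drain the remainder
```

Batching like this trades a bounded amount of latency (at most `max_wait_s`) for far fewer, larger inserts, which columnar stores handle much more efficiently than row-at-a-time writes.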
What You’ll Need:
- 1+ Years of Data Engineering Experience: Hands-on experience building and scaling data pipelines, especially in high-throughput, low-latency environments.
- Mastery of Real-Time Data Systems: Expertise in real-time data streaming and processing, with strong hands-on experience using technologies like Redpanda (or Kafka) and ClickHouse (or similar OLAP databases).
- Proficiency in Data Engineering Tools: Strong command of Python, SQL, and other tools commonly used in data engineering. Experience with frameworks such as Apache Spark, Airflow, or similar is a plus.
- Cloud Expertise: Proven experience with cloud platforms such as AWS or GCP, including services like S3, Lambda, EKS, or other tools for building scalable data infrastructure.
- Data Architecture & Integration: Experience architecting systems that handle both streaming and batch processing, integrating real-time pipelines with machine learning workflows.
- Monitoring at Scale: Familiarity with monitoring and alerting tools such as Prometheus, Grafana, or CloudWatch to ensure seamless operation of real-time data systems.
Ideal Candidate Profile:
- Passion for Real-Time Systems: A deep interest in building data systems that operate in real time, optimizing for performance, latency, and throughput.
- Experience with High-Frequency Systems: Familiarity with the challenges and complexities of handling large-scale, high-frequency data.
- Self-Motivated & Results-Driven: You thrive in a fast-paced environment and can work independently on complex tasks.
- Collaborative Mindset: A team player with excellent communication skills, who can work effectively across teams to drive innovation and problem-solving.
Why You Should Join Us:
- Innovative Environment: Be part of a team that is pushing the boundaries of real-time data engineering, solving complex challenges in financial technology and beyond.
- Expert Team: Work alongside some of the brightest minds in data engineering, machine learning, and quantitative research.
- Professional Growth: Blockhouse fosters a culture of continuous learning and development, ensuring you gain hands-on experience with cutting-edge technologies and best practices.
- Cutting-Edge Projects: You’ll work on transformative projects that directly impact the future of trade execution, real-time analytics, and financial technology.
- Compensation & Perks: Equity-only compensation. NYC-based employees enjoy daily free lunch and weekly company bonding events.
How to Apply:
If you are passionate about real-time data systems and eager to apply your skills to solve complex engineering challenges, join us at Blockhouse. Together, we will redefine the future of data engineering and real-time analytics.