We are looking for a

Software Engineer (Core Engine)

Remote

About Tenzir

Tenzir is changing how enterprises tackle security and data infrastructure challenges: our AI-powered data pipelines empower builders and cyber defenders to shape, enrich, and route their security data. Our mission is to provide the critical data infrastructure that enables organizations to manage and leverage their security data effectively, protecting them from evolving threats.

We are at a pivotal stage of our journey: Our customers rely on us for their mission-critical data infrastructure, and we are seeking an exceptional engineer to help evolve our core data processing engine.

Based in Hamburg with a fully remote-friendly culture, we believe in finding the best talent anywhere across European timezones.

The Opportunity

This is a rare opportunity for a systems engineer who wants to work on query execution and streaming data processing.

As a Software Engineer on our Core Engine team, you'll work on Tenzir's C++ execution engine—the heart of our platform that processes, transforms, and routes security telemetry at scale. You'll tackle problems in query optimization, streaming execution, data serialization, and performance engineering.

If you've worked on database internals, query engines, or streaming systems and want to apply that expertise to security data infrastructure, this role offers a unique combination of deep technical challenges and real-world impact.

You'll work directly with our founder and a small team of engineers who care deeply about building robust, performant systems.

What You'll Do

  • Evolve our query execution engine that processes TQL (Tenzir Query Language) pipelines

  • Optimize streaming data processing for high-throughput, low-latency security telemetry

  • Work with Apache Arrow for in-memory columnar data representation and zero-copy data exchange

  • Implement new operators for data transformation, enrichment, and routing

  • Improve query optimization including predicate pushdown, operator fusion, and pipeline parallelization

  • Design storage and indexing strategies for efficient data retrieval and replay

  • Profile and optimize performance across the entire data path

  • Contribute to our open source codebase and engage with the community

What We're Looking For

Must-Haves

  • Significant systems programming experience in C++ (modern C++23/26)

  • Experience with data-intensive systems: databases, query engines, streaming systems, or similar

  • Strong computer science fundamentals: algorithms, data structures, memory management

  • Performance engineering mindset: you profile, measure, and optimize systematically

  • Quality standards: you write tests, handle edge cases, and think about failure modes

  • Professional fluency in English (written and spoken)

  • Located in a European timezone, with residency in the EU or in a country the EU recognizes as providing an adequate level of data protection

Technical Requirements

  • Languages: Strong modern C++ (23/26); familiarity with Python for tooling

  • Data Processing: Experience with columnar formats, vectorized execution, or SIMD optimization

  • Systems: Memory management, concurrency, I/O optimization, Linux internals

  • Formats: Apache Arrow, Parquet, or similar columnar/serialization formats

  • Build Systems: CMake, familiarity with CI/CD for native code

  • Nice to have: Query optimization, catalog design, compression algorithms

The Engineer We're Looking For

  • Gets excited about shaving microseconds off hot paths

  • Reads database papers for fun (or at least professionally)

  • Thinks about memory layout and cache efficiency

  • Enjoys debugging complex concurrent systems

  • Cares about correctness as much as performance

  • Wants to understand the full stack, from query plan to syscall

Nice-to-Haves

  • Experience with database internals (DuckDB, ClickHouse, DataFusion, Velox, PostgreSQL)

  • Background in streaming systems (Kafka, Flink, or similar)

  • Familiarity with query languages and parsers

  • Open source contributions to data infrastructure projects

  • Experience with security data formats (CEF, LEEF, syslog) or SIEM systems

  • Exposure to TypeScript in addition to C++—we value engineers who can navigate the full stack

What Makes This Role Special

  • Hard Problems: Query execution, streaming processing, and performance optimization in a real production system.

  • Apache Arrow Ecosystem: Work with cutting-edge columnar data technology used across the industry.

  • Open Source: Contribute to a codebase with real users and community engagement.

  • Full Stack Impact: Your engine work directly affects how customers process millions of security events.

  • Small Team: Direct collaboration with founders, no layers of abstraction between you and decisions.

  • Early-Stage Upside: Join at an inflection point with meaningful equity and growth potential.

  • Remote Flexibility: Work from anywhere in EU timezones with quarterly team meetups.

Our Technical Stack

  • Core Engine: Modern C++ (C++23/26), Apache Arrow, custom query execution

  • Query Language: TQL, our pipeline query language with streaming semantics

  • Storage: Parquet, custom indexing, tiered storage

  • Deployment: Docker, Kubernetes, cloud-native

  • Integrations: TypeScript/Node.js for platform, Python for tooling

We're not just building another SIEM or log aggregator. We're building a data processing engine that sits at the heart of modern enterprise security infrastructure.

Compensation & Benefits

  • Base Salary: €60,000–€115,000 (based on experience and location)

  • Equity: Meaningful early-stage equity participation (VSOP)

  • Remote Work: Full flexibility within EU timezones

  • Employment Model: Candidates based in Germany will be employed directly. Candidates based outside Germany will work with us as independent contractors

  • Equipment: Provided for employees; BYOD for independent contractors

  • Team Building: Quarterly in-person meetups in Hamburg or rotating EU cities, plus one annual company-wide retreat bringing the entire team together

Interview Process

  1. Initial Call (30 min) - Meet our engineering lead

  2. Technical Interview (60 min) - Code review, architecture discussion, and live problem-solving

  3. Team Fit (30 min) - Open discussion with founder and HR representative

Reference check and final offer conclude the process.

Ready to Join Us?

If you're a systems engineer who wants to work on query execution and data processing in a domain that matters, we want to hear from you. Send us:

  • Your CV highlighting relevant systems work

  • A brief note on a technical challenge you've solved in data processing or databases

  • GitHub profile or links to relevant projects/contributions

  • Optional: Papers, blog posts, or talks you've given on relevant topics