Overview
We are opening the search for a critical role at Parable: our first Backend Engineer.
This person will play an essential role in building a product that transforms how companies understand and use their most precious resource: their time.
As our first backend engineer, you'll have the unique opportunity to shape our technical architecture from the ground up, working directly with our CTO and our Data and AI teams to build scalable systems that power our AI-driven insights platform.
If you're excited to tackle complex technical challenges while working with seasoned entrepreneurs on a fast-moving team, we'd love to talk.
This role is for someone who:
Is obsessed with building elegant, scalable systems. You're not just a coder; you're an architect who thinks deeply about system design, data flows, and scalability. You've spent years building backend infrastructure, but you're always learning new approaches and technologies.
Combines technical excellence with business impact. You can architect complex systems and write efficient code, but you never lose sight of what truly matters: delivering value to your users. You're as comfortable diving deep into technical specifications as you are collaborating with AI engineers and data scientists to understand their needs.
Has deep data expertise. You have a strong understanding of data modeling and transformation techniques, experience with semi-structured data and building scalable data pipelines, a knack for solving complex data problems, and a passion for data quality.
Is a lean experimenter at heart. You believe in shipping to learn, but you also know how to build for scale. You have a track record of delivering results in one-third the time that most competent engineers think possible, not by cutting corners, but through smart architectural decisions and iterative development.
Is deeply invested in learning. You are eager to learn and grow, comfortable with ambiguity, and willing to adapt. You bring a collaborative spirit and the ability to work effectively in a small, fast-moving team. Your personal career goal is to grow as much as possible.
You will be responsible for:
Data Transformation: Design, build, and maintain scalable and reliable data pipelines that ingest data from various SaaS APIs and transform it into a standardized, usable format for AI analysis and visualization.
Data Modeling: Develop a deep understanding of diverse data sources and create robust data models that can be used for our AI/ML workflows.
Service Architecture: Build and maintain highly performant and scalable data services with an emphasis on modularity and maintainability.
Normalization & Linkage: Tackle the challenges of normalizing and linking semi-structured data from various sources, ensuring data quality, integrity, and accessibility.
Data Integration: Develop robust connectors to seamlessly integrate with a wide variety of external data sources, keeping in mind our goal of making data ingestion easy for others to contribute.
AI/ML Collaboration: Work closely with our AI engineers and data scientists to ensure data is optimized for model training and inference, including integrating data into our pub/sub system.
Technical Standards: Contribute to our technical standards and best practices as our first backend engineer.
In your first 3 months, you'll:
Work with our Data Engineers to build our core data transformation pipeline for processing workplace data from multiple sources.
Implement services based on our initial service architecture.
Create our first set of API integrations with major workplace tools.
Contribute to our backend development standards and practices.
Partner with our AI team to optimize data flows for our machine learning pipeline.
Ship multiple iterations of our core systems based on real customer feedback.
Requirements:
Experience: 3-5 years of experience building backend services, with a focus on data transformation, modeling, and integration, preferably in the context of AI/ML workflows.
Technical Skills: Solid understanding of building scalable, reliable, and maintainable web services. Experience in any of Python, Java, Scala, Rust, or TypeScript.
Database Experience: Experience with one or more database paradigms (SQL, NoSQL, graph, time series).
Tooling Experience: Experience with data processing tools and libraries such as Spark, and familiarity with containerization technologies.
API Integration: Proven experience connecting to and working with various APIs, including SaaS APIs (REST, GraphQL).
Education: Bachelor's degree in Computer Science or a related field preferred.