About Port
At Port.io, we are building an open and flexible Agentic Engineering Platform for modern engineering organizations. Following our recent $100M Series C funding round, we are in a phase of hypergrowth with strong enterprise momentum.
We act as the central nervous system for engineering, enabling platform teams to unify their stack and expose it as a governed layer through golden paths for developers and AI agents.
By combining rich engineering context, workflows, and actions, we help organizations transition from manual processes to autonomous, AI-assisted engineering workflows while maintaining control and accountability.
As a product-led company, we believe in building world-class platforms that fundamentally shape how modern engineering organizations operate.
What you’ll do:
- Lead the design and development of scalable and efficient data lake solutions that handle high-volume data from a large number of sources, both predefined and custom.
- Utilize advanced data modeling techniques to create robust data structures supporting reporting and analytics needs.
- Implement ETL/ELT processes to extract, transform, and load data from various sources into a data lake that serves Port users.
- Identify and address performance bottlenecks within our data warehouse, optimize queries and processes, and enhance data retrieval efficiency.
- Collaborate with cross-functional teams (product, analytics, and R&D) to enhance Port's data solutions.
Who you’ll work with:
- You’ll be joining a collaborative and dynamic team of talented and experienced developers where creativity and innovation thrive.
- You'll collaborate closely with our dedicated Product Managers and Designers to bring our developer portal product to life.
- Additionally, you will have the opportunity to work closely with our customers and engage with our product community. Your insights and interactions with them will play an important role in ensuring we deliver the best product possible.
- Together, we'll continue to empower platform engineers and developers worldwide, providing them with the tools they need to create seamless and robust developer portals. Join us in our mission to revolutionize the developer experience!
Requirements:
- 5+ years of experience in a Data Engineering role
- Expertise in building scalable pipelines and ETL/ELT processes, with proven experience with data modeling
- Expert-level proficiency in SQL and experience with large-scale datasets
- Strong experience with Snowflake
- Strong experience with cloud data platforms and storage solutions such as AWS S3 or Redshift
- Hands-on experience with ETL/ELT tools and orchestration frameworks such as Apache Airflow and dbt
- Experience with Python and software development
- Strong analytical and storytelling capabilities, with a proven ability to translate data into actionable insights for business users
- Collaborative mindset with experience working cross-functionally with data engineers and product managers
- Excellent communication and documentation skills, including the ability to write clear data definitions, dashboard guides, and metric logic
Advantages:
- Experience with Node.js + TypeScript
- Experience with streaming data technologies such as Kafka or Kinesis
- Familiarity with containerization tools such as Docker and Kubernetes
- Knowledge of data governance and data security practices