Forward Deployed Engineer - Data

Overview
Skills
  • Python
  • RESTful API
  • CI/CD
  • Docker
  • Airflow
  • Interface engines
  • Cloud APIs
  • Data ingestion
  • Data replication
  • Data warehousing
  • Databases
  • Distributed storage systems
  • AI Inference tools
  • GPU Accelerated Computing
About Rhino Federated Computing


Rhino solves one of the biggest challenges in AI: seamlessly connecting siloed data through federated computing. The Rhino Federated Computing Platform (Rhino FCP) serves as the ‘data collaboration tech stack’, extending from providing computing resources to data preparation and discoverability, to model development and monitoring, all in a secure, privacy-preserving environment. To do this, Rhino FCP offers a flexible architecture (multi-cloud and on-prem hardware), end-to-end data management workflows (multimodal data, schema definition, harmonization, and visualization), privacy-enhancing technologies (e.g., differential privacy), and secure deployment of custom code and third-party applications via persistent data pipelines. Rhino is trusted by more than 60 leading organizations worldwide, including 14 of the 20 hospitals on Newsweek’s ‘Best Smart Hospitals’ list and top-20 global biopharma companies, and is extending this foundation to financial services, ecommerce, and beyond.


The company is headquartered in Boston, with an R&D center in Tel Aviv.


About the Role


Forward Deployed Engineer – Rhino Federated Computing Platform

As a Forward Deployed Engineer (FDE) at Rhino, you will lead the deployment of our Federated Computing Platform (FCP) in high-stakes, real-world environments. You’ll work directly with customers, often embedding within their teams, where data privacy, data and model protection, regulatory constraints, and distributed infrastructure are the norm. You will map out complex customer workflows and architecture, then structure and own the delivery, and ship quickly. You will manage multiple workstreams, whether data harmonization, data validation, AI model training or inference, deploying encrypted containers, or validating data security and privacy setups. You will scope, sequence, and build solutions that create measurable value, and you will drive clarity across internal and external teams. You will identify reusable patterns and share field signals and customer feedback that influence the roadmap.


Success in this role means holding the line on quality deliverables, delivering with urgency, and helping Rhino and its customers learn rapidly through execution while delivering business outcomes. You will shape how federated computing is adopted at scale: in production, under privacy and security controls, and where outcomes matter most.


Key Responsibilities


  • Map out complex data workflows and architecture. Scope, sequence, and build data processing and AI inference solutions using the Rhino FCP product and customers’ data sources.
  • Nurture strong customer relationships during the product evaluation and implementation phases.
  • Support healthcare and life science organizations in large scale data projects on Rhino FCP.
  • Deliver pre-built demos and build custom demos and reference architectures aligned with customers’ use cases and integration needs.
  • Onboard and educate users, equipping them with training and tutorials while demonstrating deep technical understanding of the product.
  • Provide project management support to help users achieve successful Rhino FCP implementations. 
  • Debug technical issues with users and implement effective process improvements.
  • Identify reusable patterns and share field signals / customer feedback that influences the roadmap. 



Required Skills
  • Proficiency in Python programming, REST APIs, and databases.
  • Expertise in data engineering, with an understanding of data science concepts and workflows.
  • Hands-on experience building with cloud APIs, data ingestion, data replication, interface engines, data warehousing, and distributed storage systems.
  • Experience with workflow orchestration tools (e.g., Airflow), containerization tools (e.g., Docker), and CI/CD tools.
  • Excellent communication, product demonstration, and interpersonal skills.
  • Ability to take full ownership of problems from start to finish, driving success for both the team and our customers.
  • Ability to prioritize effectively and thrive in ambiguous environments.


Preferred Skills
  • Experience with healthcare and biopharma data harmonization.
  • Knowledge of healthcare data models (FHIR, OMOP) and ontologies (e.g., LOINC).
  • Experience with GPU-accelerated computing and AI inference tools.
  • Experience in customer-facing technical roles, such as solutions engineer, sales engineer, solutions architect, or professional services engineer.
  • Project management experience: delivery excellence, stakeholder engagement, and impact measurement.
  • Degree in a quantitative field, preferably computer science, engineering, or biomedical informatics/bioinformatics.

