Job Title: Principal Data Architect
Role Summary
We are seeking a Principal Data Architect to design the data backbone for our next-generation products. HP Indigo presses are complex, multi-disciplinary machines that generate massive volumes of high-velocity data.
In this role, you will be the "System Hydraulics" expert. You will design the architecture that ensures data flows from the press sensors to the edge, and eventually to the cloud, without hitting bandwidth bottlenecks or crashing sub-systems. You will balance traditional data modelling with the high-stakes world of industrial flow control.
Core Responsibilities
- Multi-Disciplinary Data Strategy
  - Unified Data Model: Create a cohesive data schema that links disparate data types.
  - Edge-to-Cloud Architecture: Design the path for data as it moves from the Press (Edge) to local servers and finally to the HP PrintOS cloud.
  - Digital Twin Support: Develop the data foundations required to build "Digital Twins" of our presses for predictive maintenance and remote diagnostics.
- High-Velocity Flow Control & Bandwidth Management
  - Throughput Optimization: HP Indigo presses generate gigabytes of telemetry. You are responsible for designing the sampling rates and data compression strategies so the internal network isn't overwhelmed.
  - Congestion Management: Implement "safety valves" (Edge Buffering) to ensure that if the cloud connection is slow, the press can continue to operate and store logs locally without data loss.
  - Real-time vs. Batch: Define which data must be processed in "Hard Real-Time" (for machine safety/quality) vs. what can be sent in batches (for long-term analytics).
- Industrial Data Governance & Quality
  - Fleet-wide Standardization: Ensure that data coming from a Series 3 press is structurally compatible with a Series 6 press for cross-generational analysis.
  - Security: Architect secure data transmission protocols for sensitive customer print-job data and proprietary HP machine logs.
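The "safety valve" pattern in the Congestion Management bullet can be sketched as a minimal store-and-forward buffer. This is a hypothetical illustration, not HP's implementation: the class name, the injected `send` callable, and the in-memory queue are all assumptions (a real press would persist the backlog to disk and publish via a broker).

```python
import collections
from typing import Callable

class EdgeBuffer:
    """Hypothetical store-and-forward 'safety valve': every telemetry record
    is buffered locally first, then drained to the cloud in order. If the
    uplink fails, records simply stay queued -- no data loss, no blocking."""

    def __init__(self, send: Callable[[dict], None], capacity: int = 10_000):
        self._send = send                                 # cloud uplink (assumed callable)
        self._queue = collections.deque(maxlen=capacity)  # local FIFO backlog

    def publish(self, record: dict) -> None:
        self._queue.append(record)  # buffer first, so ordering survives an outage
        self.flush()

    def flush(self) -> None:
        # Drain oldest-first; stop at the first failure and keep the rest queued.
        while self._queue:
            try:
                self._send(self._queue[0])
            except ConnectionError:
                return  # uplink is down: retain the backlog locally
            self._queue.popleft()

    @property
    def backlog(self) -> int:
        return len(self._queue)
```

During an outage `publish` keeps accepting records (the press keeps running); once the uplink recovers, the next `publish` or an explicit `flush` drains the backlog in order.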
Technical Domain Expertise
- Protocols: Deep knowledge of MQTT, AMQP, or OPC-UA for machine-to-machine communication.
- Message Brokers: Expert-level design in Kafka or RabbitMQ to handle high-velocity telemetry streams.
- Time-Series Databases: Experience with InfluxDB, TimescaleDB, or Azure Data Explorer for storing high-frequency sensor readings.
- Imaging Data: Understanding the flow of heavy raster data (RIP output) and its impact on system memory and bandwidth.
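To make the time-series requirement concrete: the core operation these databases (InfluxDB, TimescaleDB, Azure Data Explorer) perform on high-frequency sensor readings is time-bucketed downsampling. The sketch below shows only the windowing logic in plain Python; real deployments run this server-side, and the function name and sample shape are illustrative assumptions.

```python
from statistics import mean

def rollup(samples, window_s=60):
    """Bucket (timestamp_seconds, value) samples into fixed windows and keep
    min/mean/max per bucket -- the downsampling a time-series store applies
    so raw high-frequency telemetry doesn't overwhelm storage or bandwidth."""
    buckets = {}
    for ts, value in samples:
        start = ts - ts % window_s          # align timestamp to window boundary
        buckets.setdefault(start, []).append(value)
    return {
        start: {"min": min(vs), "mean": mean(vs), "max": max(vs)}
        for start, vs in sorted(buckets.items())
    }
```

For example, three readings in a 60-second window collapse to one min/mean/max row, trading raw fidelity for a bounded, query-friendly footprint.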
Required Qualifications
- Experience: 8+ years in Data Architecture, specifically within Manufacturing, Robotics, or Aerospace.
- Systems Thinking: Ability to understand how a change in one sub-system impacts the data load across the network.