
Transformation Flows
I led the UX design of Transformation Flows: defining user journeys, prototyping in UI5 and Figma, testing with data modelers, and collaborating with Product Management and Engineering through handover of the feature.
Jan 2, 2023
CLIENT
SAP
Role
Designer
Service
UX Design



Work Details
Data modelers in SAP Datasphere faced a limitation: they could load data from different sources, but they couldn’t easily transform it and output it into a target table. Every time they needed to combine datasets or apply business logic, the workarounds were clunky, time-consuming, or forced the work outside the product entirely.
This was more than just an inconvenience — it was a bottleneck for anyone responsible for modeling enterprise data.
The Problem
Data modelers needed to:
Load data from one or more sources.
Transform it with joins, filters, or calculations.
Output the result in a target table inside SAP Datasphere.
Yet this workflow wasn’t natively supported. Without a unified transformation capability, users had to stitch together partial solutions, often at the cost of performance and governance.
This gap was particularly painful because it undermined SAP Datasphere's core purpose: providing a trusted, centralized environment for enterprise data.


Design Solution
I started by gathering inputs from product managers, engineers, and — most importantly — data modelers themselves. Their feedback was clear: “We don’t just want to move data. We need to shape it before it lands in the target.”
At first, I explored extensions of existing flows. But it became clear that adding transformation logic into current paradigms would overcomplicate the UI and confuse users. That was a dead end.
The pivot came when we asked: what if we introduced an entirely new flow type, purpose-built for transformation yet still consistent with the Data Builder canvas?
The Solution: Transformation Flows
We designed Transformation Flows, a new flow type in the Data Builder:
Source: connect to one or multiple tables.
Transform: apply SQL-based logic, joins, filters, or aggregations.
Target: save the results in a defined table.
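Conceptually, the three steps above map onto plain SQL. The sketch below illustrates that mapping using SQLite as a stand-in engine; the table names, schema, and logic are hypothetical examples, not Datasphere's actual implementation.

```python
import sqlite3

# Hypothetical sketch of a Transformation Flow's three steps
# (Source -> Transform -> Target), expressed as SQL on SQLite.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source: two input tables with an assumed example schema.
cur.executescript("""
CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
CREATE TABLE customers (id INTEGER, region TEXT);
INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 20, 50.0);
INSERT INTO customers VALUES (10, 'EMEA'), (20, 'APAC');
""")

# Transform + Target: join, filter, and aggregate the sources,
# then persist the result into a named target table in one statement.
cur.execute("""
CREATE TABLE sales_by_region AS
SELECT c.region, SUM(o.amount) AS total_amount
FROM orders AS o
JOIN customers AS c ON o.customer_id = c.id
WHERE o.amount > 0
GROUP BY c.region
""")

# The target table now holds the shaped result.
result = dict(cur.execute("SELECT region, total_amount FROM sales_by_region"))
```

The point of the design was that users never write this boilerplate by hand: the canvas nodes for source, transform, and target express the same pipeline visually.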
The UI was designed to feel familiar — leveraging the same canvas, nodes, and interactions as existing flows — but extended with transformation-specific capabilities.

Collaboration Behind the Scenes
This feature wouldn’t have happened without collaboration across teams:
Product Management defined priorities and user requirements.
Engineering explored feasibility and built core flow execution.
Design (my role) translated requirements into user journeys, prototypes, and usability-tested flows.
Quality Assurance ensured transformation logic performed reliably at scale.
Other Designers helped align the solution with broader Datasphere patterns for consistency.
Impact & Results
Qualitative: Transformation Flows now cover the most common modeling requirements, leveraging delta capabilities and SQL logic. Data Modelers can finally work with delta tables and transformations directly in Datasphere.
Broader value: We unlocked the ability to load data from multiple sources, apply transformations, and write the output back to a target — all within one seamless experience.
Even without hard metrics yet, the feature represents a major step in making Datasphere the central hub for enterprise data modeling.
Learn more:
SAP Datasphere - https://www.sap.com/products/data-cloud/datasphere.html
