Take your variable mapping skills to the next level with advanced patterns, optimization strategies, and power user techniques.
This guide assumes you’re comfortable with basic syntax, filters, and the pipe operator. If you’re new to variable mapping, start with Basic Syntax.
Combining Multiple Data Sources
Reference and combine data from triggers, multiple nodes, and nested structures:
{
  "enrichedOrder": {
    "orderId": "$.trigger.order_id",
    "customer": {
      "id": "$.trigger.customer_id",
      "name": "$.fetch_customer.name",
      "email": "$.fetch_customer.email",
      "tier": "$.fetch_customer.tier"
    },
    "orderDetails": {
      "items": "$.fetch_order.items",
      "subtotal": "$.fetch_order.subtotal",
      "tax": "$.calculate_tax.amount",
      "total": "$.calculate_total.final_amount"
    },
    "shipping": {
      "address": "$.fetch_customer.shipping_address",
      "method": "$.fetch_order.shipping_method",
      "cost": "$.calculate_shipping.cost",
      "estimatedDays": "$.calculate_shipping.estimated_days"
    }
  }
}
Combine trigger data with multiple node outputs into a unified structure.
{
  "notification": {
    "to": "$.fetch_customer.email",
    "template": "$.fetch_customer.tier|",
    "subject": "Order |$.trigger.order_id| - |$.fetch_order.status|",
    "priority": "$.fetch_customer.tier===premium|high|normal"
  }
}
Use pipe operator patterns for conditional values based on customer tier.
{
  "matchingProducts": "$.fetch_products.items[?(@.categoryId===$.trigger.selected_category)]",
  "customerOrders": "$.fetch_orders.data[?(@.customerId===$.fetch_customer.id)]",
  "recentItems": "$.fetch_items.data[?(@.timestamp>$.trigger.after_date)]"
}
Filter one node’s data based on values from trigger or other nodes.
{
  "summary": {
    "totalOrders": "$.fetch_orders.data.length",
    "completedOrders": "$.fetch_orders.data[?(@.status===completed)].length",
    "totalRevenue": "$.calculate_revenue.total",
    "averageOrderValue": "$.calculate_metrics.avg_order_value",
    "topProduct": "$.fetch_orders.data[?(@.total===$.fetch_orders.data[*].total.max())][0].product"
  }
}
Build comprehensive summaries from multiple data sources.
Nested Filter Patterns
Apply multiple levels of filtering for complex queries:
{
  "specific": "$.categories[?(@.name===Electronics)].products[?(@.price>100 && @.inStock===true)]"
}
// Step 1: Filter categories by name
$.categories[?(@.name===Electronics)]
// Step 2: From those categories, get products
.products
// Step 3: Filter products by price and stock
[?(@.price>100 && @.inStock===true)]
Chain filters to progressively narrow results.
{
  "qualifiedUsers": "$.departments[?(@.budget>100000)].employees[?(@.performance>=4)]"
}
Filter parents, then filter their children. Filter against multiple trigger values simultaneously.
{
  "usersWithTasks": "$.users[?(@.tasks && @.tasks.length>0 && @.tasks[?(@.priority===high)])]"
}
Filter users who have at least one high-priority task.
Dynamic Property Access
Access properties using computed or variable names:
Property Name from Trigger
Conditional Property Selection
Array Index from Variable
{
  "dynamicValue": "$.config[$.trigger.setting_name]"
}
{
  "trigger": { "setting_name": "theme" },
  "config": {
    "theme": "dark",
    "language": "en",
    "timezone": "UTC"
  }
}
{
  "dynamicValue": "dark"
}
Access properties dynamically based on trigger data.
{
  "address": "$.user[$.trigger.address_type]"
}
{
  "trigger": { "address_type": "billing" },
  "user": {
    "billing": { "street": "123 Main St" },
    "shipping": { "street": "456 Oak Ave" }
  }
}
Select between billing or shipping based on trigger.
{
  "selectedItem": "$.items[$.trigger.index]",
  "firstN": "$.items[0:$.trigger.limit]"
}
Use trigger values for array indexing and slicing.
Dynamic property access is powerful but can make debugging harder. Always validate that the property names exist and document the expected structure.
Recursive Operations
Work with deeply nested or recursive structures:
Find All Occurrences
Deep Property Search
Recursive with Filtering
{
  "allEmails": "$..email",
  "allPrices": "$..price",
  "allIds": "$..id"
}
Use recursive descent (..) to find all occurrences at any nesting level.
{
  "errorMessages": "$..error.message",
  "allUserNames": "$..user.name",
  "allStatuses": "$..status"
}
Find nested properties regardless of structure depth.
{
  "allActiveUsers": "$..[?(@.type===user && @.active===true)]",
  "errorNodes": "$..[?(@.error)]"
}
Combine recursive descent with filters for powerful searches.
When to use recursive descent:
Data structure varies or is deeply nested
You need all occurrences regardless of location
Schema is flexible or unknown
When to avoid:
You know the exact path (use direct path for better performance)
Working with very large datasets (can be slow)
You need only one specific occurrence
Performance Optimization
Optimize your variable mappings for better performance:
Specific vs Recursive
Filter Early
Simple Filters
Reuse vs Recalculate
{
  // ✅ Specific path - fast
  "email": "$.fetch_user.profile.contact.email"
}
{
  // ⚠️ Recursive - searches entire structure
  "email": "$..email"
}
Use specific paths when you know the structure.
{
  // ✅ Filter first, then process
  "names": "$.users[?(@.active===true)].name"
}
{
  // ❌ Gets all names, then filters (not possible with JSONPath)
  // This pattern shows why filtering early matters
  "names": "$.users.name[?(@.active===true)]"
}
Apply filters as early as possible in the path.
{
  // ✅ Simple condition
  "premium": "$.users[?(@.tier===premium)]"
}
{
  // ⚠️ Complex nested condition
  "complex": "$.users[?(@.orders[?(@.total>1000)].length>5 && @.tier===premium)]"
}
Keep filters simple when possible.
{
  // ⚠️ Repeats the same expensive filter three times
  "userList": "$.fetch_users.data[?(@.active===true)]",
  "count": "$.fetch_users.data[?(@.active===true)].length",
  "firstUser": "$.fetch_users.data[?(@.active===true)][0]"
}
Better Pattern - Use Another Node
// Node 1: filter_active_users
{
  "activeUsers": "$.fetch_users.data[?(@.active===true)]"
}
// Node 2: use filtered data
{
  "userList": "$.filter_active_users.activeUsers",
  "count": "$.filter_active_users.activeUsers.length",
  "firstUser": "$.filter_active_users.activeUsers[0]"
}
Store complex query results in intermediate nodes.
// ✅ Fast
"$.user.profile.email"
// ⚠️ Slower
"$..email"
// ✅ Filter then extract
"$.items[?(@.active===true)].name"
// ❌ Don't wildcard then filter
"$.items[*].name[?(@.active===true)]" // Won't work as intended
// ✅ Specific
"$.categories[0].products"
// ⚠️ May return more than needed
"$.categories[*].products"
Store expensive query results in intermediate nodes rather than repeating the same complex expression multiple times.
Performance characteristics change with data size. Test your mappings with realistic data volumes.
Error Handling Strategies
Build robust mappings that handle missing or invalid data gracefully:
Trailing Pipe for Missing Data
Existence Checks with Filters
Handle Empty Arrays
Default Values in Flow Logic
{
"safeEmail": "$.user.profile.contact.email|",
"safeName": "$.user.firstName| |$.user.lastName|",
"safeString": "$.items[*].name|"
}
What the trailing pipe does: a missing property returns "" (empty string); an array returns a comma-separated string ("item1,item2"); an object returns "[object Object]". This prevents properties from being removed when data is missing, but does NOT provide custom defaults.
{
  "verifiedUsers": "$.users[?(@.email && @.verified===true)]",
  "itemsWithPrice": "$.products[?(@.price)]",
  "safeNested": "$.data[?(@.metadata && @.metadata.priority)]"
}
Check property existence with && before accessing or comparing nested values. This prevents errors from undefined properties.
{
  "firstItem": "$.items[0]",
  "itemCount": "$.items.length",
  "hasItems": "$.items[?(@)]",
  "allNames": "$.items[*].name"
}
Array handling: $.items[0] on an empty array returns undefined; $.items.length returns 0 for an empty array and a number otherwise; $.items[?(@)] returns [] if empty, which is useful for checking. Use .length checks in flow logic to determine whether an array has items.
{
  "email": "$.user.email",
  "name": "$.user.name",
  "status": "$.user.status"
}
For actual default values, handle missing data in subsequent flow nodes: check whether the value exists or is an empty string, use conditional routing to provide defaults, and use transformation nodes to set fallback values. Let flow logic handle missing data, not JSONPath.
Key points:
The pipe operator (|) concatenates strings; it does NOT provide defaults.
Use a trailing pipe ($.path|) only to prevent property removal when data is missing.
For real default values, use flow logic and conditional routing.
Use filter existence checks ?(@.property) to safely access nested data.
Common error scenarios to handle:
Missing nested properties - Use filters with existence checks ?(@.nested && @.nested.prop)
Empty arrays - Check .length before accessing indexes
Null values - Filter with ?(@.value!==null) to exclude nulls
Type mismatches - Use type selectors ?(@string()) to filter by type
Out-of-bounds array access - Check array length, use negative indexes for last items
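The checks above can be combined in a single defensive mapping. A sketch, assuming a hypothetical fetch_data node whose items array may be empty or contain null values:

```json
{
  "safeNested": "$.fetch_data.items[?(@.metadata && @.metadata.priority)]",
  "nonNull": "$.fetch_data.items[?(@.value!==null)]",
  "itemCount": "$.fetch_data.items.length",
  "lastItem": "$.fetch_data.items[-1]"
}
```

Here itemCount can drive flow logic before any index access, and lastItem uses a negative index so it stays valid regardless of array length.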
Complex String Building
Advanced patterns for building dynamic strings:
Multi-Line Templates
Conditional Sections
Formatted Lists
JSON-like Output
{
"emailBody": "Dear |$.customer.firstName| |$.customer.lastName|,
Thank you for your order #|$.order.id|!
Order Summary:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Items: |$.order.items.length|
Subtotal: $|$.order.subtotal|
Tax: $|$.order.tax|
Shipping: $|$.order.shipping|
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Total: $|$.order.total|
Status: |$.order.status|
Tracking: |$.shipping.tracking|
Questions? Contact us at |$.company.support_email|
Best regards,
|$.company.name| Team"
}
{
"message": "Order |$.order.id| status: |$.order.status||$.order.tracking|| - Tracking: |$.order.tracking|||$.order.estimatedDate|| - ETA: |$.order.estimatedDate||"
}
Sections appear only when data exists.
{
  "itemList": "• |$.items[0].name| ($|$.items[0].price|)
• |$.items[1].name| ($|$.items[1].price|)
• |$.items[2].name| ($|$.items[2].price|)"
}
{
  "jsonString": "{
  \"orderId\": \"|$.order.id|\",
  \"customer\": \"|$.customer.name|\",
  \"total\": |$.order.total|,
  \"status\": \"|$.order.status|\"
}"
}
Build JSON strings (though typically you’d use object structure instead).
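As that note suggests, an object structure is usually the better choice. The same data expressed as a plain mapping object, with no escaping needed and output that stays structured:

```json
{
  "order": {
    "orderId": "$.order.id",
    "customer": "$.customer.name",
    "total": "$.order.total",
    "status": "$.order.status"
  }
}
```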
Mapping Strategies for Large Datasets
Handle large arrays and datasets efficiently:
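A sketch combining patterns from earlier sections, assuming a hypothetical fetch_orders node that returns a large data array: slice to a bounded page, and filter before extracting properties so later steps touch less data.

```json
{
  "firstPage": "$.fetch_orders.data[0:100]",
  "activeOrders": "$.fetch_orders.data[?(@.status===active)]",
  "activeIds": "$.fetch_orders.data[?(@.status===active)].id"
}
```

For repeated use of the same expensive filter, stage the result in an intermediate node (the Reuse vs Recalculate pattern above) rather than re-running it over the full dataset in every mapping.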
Testing and Debugging
Strategies for testing and debugging complex mappings:
Incremental Testing
Isolation Testing
Debug Output
Type Checking
Step 1: Start with the Base Path
{
  "test": "$.node.data"
}
Step 2: Add a Filter
{
  "test": "$.node.data[?(@.active===true)]"
}
Step 3: Add Property Access
{
  "test": "$.node.data[?(@.active===true)].email"
}
Step 4: Add String Building
{
  "test": "Emails: |$.node.data[?(@.active===true)].email|"
}
Build complexity gradually.
{
  "part1": "$.trigger.id",
  "part2": "$.node.data[?(@.id===$.trigger.id)]",
  "part3": "$.node.data[?(@.id===$.trigger.id)][0]",
  "final": "$.node.data[?(@.id===$.trigger.id)][0].name"
}
Test each part of a complex expression separately.
{
  "debug": {
    "triggerData": "$.trigger",
    "nodeData": "$.fetch_data",
    "filterResult": "$.fetch_data.items[?(@.active===true)]",
    "filterCount": "$.fetch_data.items[?(@.active===true)].length",
    "firstItem": "$.fetch_data.items[?(@.active===true)][0]"
  },
  "actual": {
    "result": "$.fetch_data.items[?(@.active===true)][0].name"
  }
}
Include debug fields to understand data flow.
{
  "valueType": "$.node.value|",
  "isArray": "$.node.items",
  "arrayLength": "$.node.items.length|",
  "hasProperty": "$.node.optional|"
}
Check types and existence to debug issues.
Debugging Checklist
Verify Data Structure
Use Flow Debugger to inspect actual data from previous nodes
Test Path Components
Break complex paths into pieces and test each part
Check Filter Logic
Verify filter conditions return expected results
Validate String Concatenation
Test pipe operator segments individually
Handle Edge Cases
Test with missing data, empty arrays, null values
Monitor Performance
Check execution time for complex expressions
Advanced Use Cases
Best Practices Summary
Performance
Use specific paths over recursive
Filter early in expressions
Cache expensive queries
Test with realistic data sizes
Reliability
Use trailing pipes for optional data
Check existence in filters
Handle empty arrays
Validate assumptions
Maintainability
Use descriptive node IDs
Document complex mappings
Break into steps
Test incrementally
Debugging
Test pieces individually
Use debug output fields
Verify data structures
Monitor performance
What’s Next?