Conigma Connect
Data Bindings
Introduction
When writing flows, a recurring pattern is that input values (sources) and output values (targets) need to be defined. Because the flow context relies on flexible JSON manipulation, configuring these fields consists of setting options that allow precise and powerful control.
To illustrate the available configurations, we use the example of the Set Variable instruction shown above. This block performs a single data binding, assigning a value from a source to a target. Each field will be examined step by step to explain the available options.
In general, all instruction blocks in Connect are composed of different data sources, data targets, and context configurations. Once these concepts are understood, the behavior of other instruction blocks becomes much easier to follow, since they are all based on the same underlying principles.
All examples on this page use the "simple view" unless stated otherwise.
Source
The first field to examine is the configuration of the data source. This field plays a central role, as it is not only applied in the Set Variable instruction but also used across the flow editor in many other cases, such as when defining the iteration set of a For Each loop or specifying the payload of a web request.
On the upper left side, a dropdown menu determines the general type of the source. Based on this selection, the selector on the lower right side is adjusted accordingly, providing the correct interface to define the chosen type of input.
Constants and Special Types
For constant data sources, several options are available that represent different data types and produce corresponding outputs. A constant source, as the name suggests, refers to a value defined statically during flow design.
This is comparable to writing an assignment such as "x = 20" in a conventional programming language, where the predefined value is directly assigned to the target without further evaluation or dynamic resolution.
Important: defining a variable by setting the data source to constant does not mean the variable cannot be modified later. "Constant" in this context only refers to the right-hand side of the assignment operation being a constant value.
The constant data types correspond directly to JSON types and can be defined explicitly. Options such as Boolean, String, Number, or Null provide a user-friendly interface for specifying values. For example, selecting Boolean changes the input on the lower position to a checkbox representing the JSON values true or false.
More complex structures can be created using JSON Node. In these cases, the input becomes a free text field where the corresponding JSON content (i.e. a JSON object or a JSON array) can be inserted. Once the focus leaves the field, the editor automatically validates and reformats the JSON.
Some examples:
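For instance, a JSON Node constant could hold an object or an array. The values below are illustrative, not taken from a specific flow:

```json
{
  "user": "alice",
  "roles": ["admin", "editor"],
  "active": true
}
```

An array such as `["Trace", "Debug", "Information"]` is equally valid; the editor validates and reformats the content once the field loses focus.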
The remaining options in the list represent special value types that are frequently used when creating flows but do not correspond directly to standard JSON nodes, even though they are sometimes stored in the flow context in JSON format.
One example is the Connection type. Here, the user can select from the connections already defined in the tenant, with an additional option to open and review the selected connection for verification. Once chosen, the connection name is stored in the context as a JSON string.
The main benefit of this approach is that these specialized input fields enforce static validation during save and syntax checks, which prevents errors such as misspelled connection names.
If an invalid input is entered, the editor automatically clears the field to highlight the mistake and prompt correction.
The remaining special types extend the possibilities of constant inputs and are only relevant in specific situations. While JSON Paths or Expressions can theoretically be applied wherever a string is expected and primarily serve as type hints, the other types are reserved for particular instructions where they have defined meaning. By selecting an instruction, the editor guides the user toward the correct type using these options.
Dynamic Code Name: Reserved for the Dynamic Code block, where it references the name of a stored code snippet. Dynamic code provides a mechanism to integrate custom logic beyond the standard instruction blocks, enabling flexible extensions.
Connection Method: Only valid in combination with a connection. Used in the Web API instruction, where the connection specifies the target service and the method defines the request to execute.
Flow Name: Exclusive to the Invoke Flow block, where it specifies which flow should be called, allowing a flow to delegate work to another flow within the same tenant.
BPMN: Represents the internal format of BPMN diagrams. Only meaningful in instructions that process parsed BPMN data, such as comparing diagrams for structural differences.
Enums: Predefined sets of valid values bound to a specific instruction input field. Shown as a restricted list in the UI. A typical example is the Log instruction, where the log level must be one of: Trace, Debug, Information, Warning, Error, Critical, or None.
Compare Values
Selecting Compare Values as a data source makes it possible to perform direct comparisons between two values. The result is always a Boolean (true or false). This is useful for defining conditional logic, such as checking whether one number is smaller than another or whether two objects are equal.
A comparison requires three components:
Left-hand value
Operator
Right-hand value
The operator is defined as a constant string and determines the type of comparison.
| Operator | Description |
|---|---|
| lower | Left < right |
| lower or equal | Left <= right |
| equal | Left == right (deep JSON comparison) |
| greater or equal | Left >= right |
| greater | Left > right |
| not equal | Left != right (deep JSON comparison) |
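The comparison semantics can be sketched as follows. This is a minimal illustration, not the product's implementation; it assumes that equality is a deep structural comparison of the JSON values and that the ordering operators receive comparable operands:

```python
import json

def compare(left, operator, right):
    """Illustrative comparison of two JSON-decoded values.

    'equal' / 'not equal' perform deep structural comparison;
    the ordering operators assume comparable (e.g. numeric) operands.
    """
    if operator == "equal":
        return left == right          # deep comparison of dicts/lists
    if operator == "not equal":
        return left != right
    if operator == "lower":
        return left < right
    if operator == "lower or equal":
        return left <= right
    if operator == "greater":
        return left > right
    if operator == "greater or equal":
        return left >= right
    raise ValueError(f"unknown operator: {operator}")

# Deep equality treats structurally identical objects as equal:
a = json.loads('{"id": 1, "tags": ["x", "y"]}')
b = json.loads('{"tags": ["x", "y"], "id": 1}')
print(compare(a, "equal", b))   # True: key order is irrelevant
print(compare(3, "lower", 10))  # True
```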
Concatenate Values
Concatenate Values allows multiple string inputs to be combined into a single output. Inputs are concatenated in the order they are defined, and all must resolve to strings.
This is especially useful for building dynamic strings. Each part of the concatenation can come from different sources, for example, static text, expressions, or JSON paths, making it possible to generate flexible, context-dependent values.
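Conceptually, the concatenation joins the resolved string parts in their defined order. A hedged sketch, assuming non-string inputs are rejected rather than coerced (the example values are illustrative):

```python
def concatenate(parts):
    """Join resolved string inputs in their defined order.

    Every part must already resolve to a string; non-strings
    are rejected rather than coerced in this sketch.
    """
    for part in parts:
        if not isinstance(part, str):
            raise TypeError(f"input must resolve to a string: {part!r}")
    return "".join(parts)

# Mixing static text with values resolved from the context:
base_url = "https://api.example.com"  # could come from a JSON path
print(concatenate([base_url, "/users/", "alice"]))
# https://api.example.com/users/alice
```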
Evaluate Expression
The Evaluate Expression option allows expressions to be written directly in a source field. This enables inline logic, comparisons, and simple transformations without requiring multiple instruction blocks.
Expressions can both access values from the flow or global context using JSON paths and perform logical or computational operations.
Example: $.testValue > 5 returns true if testValue equals 10.
Expressions are not limited to comparisons. They support binary operations, type casting, collections, element access, member access, and even string interpolation.
See
Evaluate JSON Path
The Evaluate JSON Path option queries the context using a JSON Path expression and returns the resulting value. The syntax follows the JSON Path standard.
The Mode field defines how results are handled if the query matches multiple values. For example, filtering an array may return several elements. Depending on the mode, you can return the first match, all matches, or another subset.
This allows precise control when extracting data, for example selecting a single matching item from a larger array in one step.
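The interplay of path and mode can be sketched with a heavily simplified evaluator. This toy version supports only dot paths and a `*` wildcard over arrays; real JSON Path also offers filters, slices, and recursive descent:

```python
def evaluate_json_path(context, path, mode="All"):
    """Minimal sketch of JSON Path evaluation with a result mode.

    Supports only simple dot paths like '$.a.b' and a wildcard '*'
    step over arrays; real JSON Path supports far more.
    """
    steps = path.lstrip("$").strip(".").split(".") if path != "$" else []
    matches = [context]
    for step in steps:
        next_matches = []
        for node in matches:
            if step == "*" and isinstance(node, list):
                next_matches.extend(node)
            elif isinstance(node, dict) and step in node:
                next_matches.append(node[step])
        matches = next_matches
    if mode == "First":
        return matches[0] if matches else None
    return matches  # 'All' returns the full result set

context = {"users": [{"name": "alice"}, {"name": "bob"}]}
print(evaluate_json_path(context, "$.users.*.name"))           # ['alice', 'bob']
print(evaluate_json_path(context, "$.users.*.name", "First"))  # 'alice'
```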
Get Variable
The Get Variable option provides a simplified way to access values from the context. It behaves like Evaluate JSON Path with the mode set to All, meaning the entire result set of the path is returned.
The user specifies the JSON path of the variable, and the corresponding value is resolved directly from the selected context.
Logical AND, NOT, and OR
The Logical AND, OR, and NOT sources make it possible to perform boolean logic directly in a binding.
AND: Returns true only if all inputs are true.
OR: Returns true if at least one input is true.
NOT: Expects a single boolean input and returns its negation.
Inputs must resolve to JSON booleans, and can themselves be nested sources. This makes it possible to define checks such as requiring multiple conditions simultaneously.
The example above evaluates to true if $.testValue equals 10.
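The nesting of boolean sources can be sketched as follows. This is an illustration, not the editor's implementation; nested sources are modeled as zero-argument callables, and inputs that do not resolve to JSON booleans are rejected:

```python
def resolve(source):
    """Resolve a nested source: either a plain value or a
    zero-argument callable standing in for a nested binding."""
    value = source() if callable(source) else source
    if not isinstance(value, bool):
        raise TypeError(f"input must resolve to a JSON boolean: {value!r}")
    return value

def logical_and(*sources):
    return all(resolve(s) for s in sources)   # true only if all are true

def logical_or(*sources):
    return any(resolve(s) for s in sources)   # true if at least one is true

def logical_not(source):
    return not resolve(source)                # negation of a single input

context = {"testValue": 10}
# Nested source: an expression comparing a context value
greater_than_five = lambda: context["testValue"] > 5
print(logical_and(greater_than_five, True))   # True
print(logical_not(greater_than_five))         # False
```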
Map Value
Map Value as a data source is designed to solve a central problem that occurs frequently in system integrations.
When two services need to exchange data, they often represent the same entities but use different identifier schemes. A typical example is when service A and service B both store information about the same users, but each system assigns its own IDs. If a user performs an action in service A and this action needs to be mirrored in service B, the system must know how the user in A corresponds to the user in B. Without mapping, this would require repeated manual lookups or repetitive handling in every flow.
To avoid this overhead, the concept of mapping arrays was introduced. A map is a JSON array where each entry is an object with the same structure, effectively acting like a table of relationships. In a user mapping scenario, each object could contain fields such as name, service1Id, and service2Id. This array can be stored globally and reused wherever an identifier translation is required.
The Map Value source makes use of such arrays directly within a flow. The Value field specifies the identifier to look up. The Map field references the JSON array that contains the mappings. Original Value indicates which property of the mapping objects should be compared against the provided Value and Transformed specifies which property should be returned as the result.
In the example above, the input value alice is provided as the Value. The map array contains multiple user objects, and Original Value is set to the user field. This means the system searches for the object where user == "alice". Once found, the Transformed field is applied, here set to service1Id, so the result returned is ABC-123. If no entry is found, null is returned as a fallback.
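The lookup behaves like a single pass over the mapping array. A minimal sketch; the user and service1Id values follow the example in the text, while the service2Id values are illustrative additions:

```python
def map_value(value, mapping, original_field, transformed_field):
    """Translate `value` using a mapping array.

    Finds the first object whose `original_field` equals `value`
    and returns that object's `transformed_field`; returns None
    (JSON null) when no entry matches.
    """
    for entry in mapping:
        if entry.get(original_field) == value:
            return entry.get(transformed_field)
    return None

user_map = [
    {"user": "alice", "service1Id": "ABC-123", "service2Id": "u-001"},
    {"user": "bob",   "service1Id": "DEF-456", "service2Id": "u-002"},
]
print(map_value("alice", user_map, "user", "service1Id"))  # ABC-123
print(map_value("carol", user_map, "user", "service1Id"))  # None
```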
This mechanism is heavily used in practice in combination with global maps (stored in global context), as it allows flows to automatically convert between identifiers, codes, or any other values used differently across systems.
Target
In contrast to data sources, data targets are relatively simple. A target can either store the result of a data source in a variable, or be set to Ignore value, which discards the output.
Ignoring a result is useful in cases where the outcome of an operation is not needed, such as when the response of a web request has no relevance for the rest of the flow.
When storing a result, the only configuration required is the context (cf.
In expert mode, the field for the path itself is again defined as a nested source, allowing more advanced scenarios where the target location is calculated dynamically. This makes it possible to build flexible flows that adjust their output structure during runtime.
Variable Scope
Switching between contexts in the flow editor is done by toggling the context selector between Flow and Global when defining a target.
In this example, the content of the flow context variable $.sourceVariable is copied to the global context into variable $.targetVariable.
Note: using the global context is typical for values such as a project ID that need to be available across multiple API calls within different flow executions, where repeatedly requesting the same information would be inefficient. See