Tips for Migrating SAP IDoc Reception Workloads from BizTalk to Azure Logic Apps

Introduction

The Azure Logic Apps SAP connector provides a trigger named “When a message is received”, which receives IDoc messages and initiates a Logic App workflow – similar to how a BizTalk Receive Location triggers a process, either through orchestration or messaging.

When migrating workloads, avoiding the reimplementation or modification of existing code is key to reducing regression risk. In most integration scenarios, incoming messages are transformed—often using BizTalk maps, which ultimately rely on XSLT. In many cases, BizTalk developers bypass the visual mapper and write XSLTs directly.

Let’s take note of the typical XML structure of an IDoc received in BizTalk:

<Receive xmlns="ReceiveNamespace">
  <idocData>
    <EDI_DC40 xmlns="IDocNamespace">
      <TABNAM xmlns="CommonNamespace">EDI_DC40</TABNAM>
      ...
    </EDI_DC40>
    <IDOC_ROOT xmlns="IDocNamespace">
      ...
    </IDOC_ROOT>
  </idocData>
</Receive>
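
A BizTalk map’s XSLT is bound to these namespaces explicitly. As a minimal, illustrative sketch (using the placeholder namespace URIs from the sample above), a template matching the control record looks like this:

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:r="ReceiveNamespace"
    xmlns:i="IDocNamespace">
  <!-- Matches only when the incoming elements carry the expected namespaces -->
  <xsl:template match="/r:Receive/r:idocData/i:EDI_DC40">
    <!-- mapping logic -->
  </xsl:template>
</xsl:stylesheet>
```

If the received message lacks these namespaces or the <Receive> wrapper, such templates simply stop matching.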

Wouldn’t it be ideal to reuse existing BizTalk XSLTs as-is? Let’s explore how to achieve that.

SAP trigger “When a message is received”

The SAP connector “When a message is received” trigger offers several parameters that are not thoroughly documented. Based on experience, I will present the combination of parameters that facilitates the reuse of BizTalk XSLTs.

IDoc Format Options

When configuring the trigger, the first parameter to set is “IDoc Format”, which offers three options:

  • FlatFile
  • SapPlainXml
  • MicrosoftLobNamespaceXml

Since BizTalk is XML-centric and the goal is to reuse existing XSLTs, the relevant IDoc Format choices are SapPlainXml and MicrosoftLobNamespaceXml.

Let’s explore the behavior of each.

Option 1: SapPlainXml

When selecting the SapPlainXml IDoc format, the received IDoc is structured as XML without any namespace and without a <Receive> wrapper element:

<IdocTypeName>
  <IDOC BEGIN="1">
    <EDI_DC40 SEGMENT="1">
      ...
    </EDI_DC40>
    <IDOC_ROOT SEGMENT="1"/>
  </IDOC>
</IdocTypeName>

This structure differs from what BizTalk expects, so existing XSLTs would require adjustments to be reused.
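
To make the mismatch concrete, the sketch below (with illustrative namespace URIs and IDoc type, not actual connector output) shows that a namespace-qualified lookup, which is effectively what a BizTalk XSLT performs, finds the control record in the BizTalk-style message but not in the SapPlainXml one:

```python
import xml.etree.ElementTree as ET

# BizTalk-style message: elements are namespace-qualified (URIs illustrative).
biztalk_xml = """<Receive xmlns="http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ORDERS05//740/Receive">
  <idocData>
    <EDI_DC40 xmlns="http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ORDERS05//740">
      <TABNAM>EDI_DC40</TABNAM>
    </EDI_DC40>
  </idocData>
</Receive>"""

# SapPlainXml-style message: no namespaces, no <Receive> wrapper.
plain_xml = """<ORDERS05>
  <IDOC BEGIN="1">
    <EDI_DC40 SEGMENT="1">
      <TABNAM>EDI_DC40</TABNAM>
    </EDI_DC40>
  </IDOC>
</ORDERS05>"""

ns = {"t": "http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ORDERS05//740"}

# A namespace-qualified lookup, as a BizTalk XSLT would perform...
found_in_biztalk = ET.fromstring(biztalk_xml).find(".//t:EDI_DC40", ns)
found_in_plain = ET.fromstring(plain_xml).find(".//t:EDI_DC40", ns)

print(found_in_biztalk is not None)  # True:  matches the BizTalk structure
print(found_in_plain is not None)    # False: misses the SapPlainXml structure
```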

Option 2: MicrosoftLobNamespaceXml

When selecting the MicrosoftLobNamespaceXml IDoc format, the IDoc message is received in XML that closely matches the BizTalk structure. However, the namespaces are slightly different—specifically, the release number is missing.

For example, the Receive namespace might appear as:

http://Microsoft.LobServices.Sap/2007/03/Idoc/3/IDOCTYPENAME///Receive

Whereas in BizTalk it could be:

http://Microsoft.LobServices.Sap/2007/03/Idoc/3/IDOCTYPENAME//740/Receive

As a result, existing XSLTs will still require some namespace updates.

Option 3: MicrosoftLobNamespaceXml + Generate Namespace From Control Record = Yes

When selecting the MicrosoftLobNamespaceXml IDoc format and enabling the advanced parameter “Generate Namespace From Control Record”, the received message includes both the correct structure and the expected namespace, including the release number. This makes it compatible with BizTalk’s structure and allows direct reuse of existing XSLTs.

In the workflow designer, “IDoc Format” is among the trigger’s main settings, while “Generate Namespace From Control Record” appears under the advanced parameters.

Note: In the Logic App code view, the “Generate Namespace From Control Record” parameter appears as EnforceControlRecordNamespace.
Example snippet from the code view:

{
  "type": "ServiceProvider",
  "inputs": {
    "parameters": {
      "idocFormat": "MicrosoftLobNamespaceXml",
      "DegreeOfParallelism": 10,
      "GatewayHost": "dummy",
      "GatewayService": "dummy",
      "ProgramId": "dummy",
      "EnforceControlRecordNamespace": true
    },
    "serviceProviderConfiguration": {
      "connectionName": "sap",
      "operationId": "SapTrigger",
      "serviceProviderId": "/serviceProviders/sap"
    }
  }
}

Conclusion

To ensure maximum compatibility with BizTalk and reuse existing XSLT artifacts with minimal changes, configure the SAP Logic App trigger as follows:

  • IDoc Format: MicrosoftLobNamespaceXml
  • Generate Namespace From Control Record: Yes (or EnforceControlRecordNamespace: true in code view)

This setup preserves the BizTalk-style message structure and namespace conventions, enabling a smoother migration path.

Bug When Generating Schemas for SAP IDocs Using the Logic App Built-In Connector

When integrating SAP with Azure Logic Apps, one of the first steps is to obtain XSD schemas that describe the structure of SAP artifacts like IDocs and RFCs. These schemas are essential for building workflows that send or receive data from SAP.

To generate these schemas, you typically create a Logic App Standard workflow and use the Generate Schema action from the SAP built-in connector. This action introspects the connected SAP system and produces the necessary schema files based on the selected artifact.

For an IDoc, we must provide:

  • The IDoc Type
  • The Release number
  • The Version number
  • The Direction, specifying whether we intend to send or receive the IDoc

Bug Description

While migrating existing workloads from BizTalk Server to Azure Logic Apps, we noticed inconsistencies when generating IDoc schemas via the “Generate Schema” Logic App action.
Specifically, when comparing introspection results between the “Generate Schema” action in Logic Apps and the “Consume Adapter Service” wizard of the BizTalk Server Extension for Visual Studio 2019, we noticed that:

  • Some schemas differed in their structure, e.g., some element names were different or missing altogether.
  • The schemas’ XML namespaces were different. E.g., when introspecting the ALE AUDIT IDoc type, the “Generate Schema” action returned:
    http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ALEAUD01//30C
    instead of:
    http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ALEAUD01//731

Upon further investigation, it became evident that the Logic Apps “Generate Schema” action does not respect the specified release number for the IDoc. Instead, it appears to return the schema for the first available release, resulting in inaccurate schemas.
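
A quick way to catch this in practice is to compare the release segment embedded in a generated schema’s target namespace against the release that was requested. A minimal sketch, using the namespace strings from the example above:

```python
import re

def release_from_namespace(ns: str) -> str:
    """Extract the trailing release segment from an IDoc type namespace.

    IDoc type namespaces end with .../<IdocType>//<Release>.
    """
    match = re.search(r"//(\w+)$", ns)
    return match.group(1) if match else ""

# Namespace actually returned by the "Generate Schema" action (from the example above)
generated_ns = "http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ALEAUD01//30C"
requested_release = "731"

actual_release = release_from_namespace(generated_ns)
print(actual_release)                        # 30C
print(actual_release == requested_release)   # False: the bug described above
```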

We raised this issue with Microsoft Support, and the Logic Apps Product Group confirmed the behavior as a bug. They acknowledged that the release parameter is currently ignored by the connector, and a fix is planned for deployment across Azure.

Temporary Workaround

Until the fix is deployed, our current workaround is to continue using schemas generated by the BizTalk Server Extension for Visual Studio. It provides reliable and accurate schema definitions that align with the intended IDoc version and release.

Unit Testing Low Code Logic Apps Standard Workflows

Microsoft recently introduced, in preview, the ability to create unit tests, defined in an MSTest project, for Logic Apps Standard workflows.

This significantly improves the development experience when writing and maintaining low-code Logic Apps workflows. It enables developers to:

  • Write tests in C# using tools integrated into the IDE via the Logic Apps Standard VS Code extension. Writing tests in C# is crucial as it allows the use of custom utilities and useful NuGet packages.
  • Execute tests in a CI/CD pipeline.

Both practices are well-known to developers and are considered industry best practices. Without automated testing, teams risk undetected bugs, slower feedback loops, longer release cycles, and compromised software quality.

This new feature offers a more robust approach to developing and testing Logic Apps workflows, which is likely to boost its adoption.

Microsoft provides documentation on creating unit tests for workflows in VS Code, either from the workflow definition or from a workflow run. I won’t repeat the details here, but the main points are:

  • The unit of testing is the entire workflow, with mock objects injected into it.
  • The unit test wizard generates mock types for the workflow’s trigger and actions that depend on external systems (e.g., HTTP, Service Bus, Files, SAP, etc.).
  • Mock object instances can be created either programmatically in C# or in a mock definition JSON file; the latter is a serialization of the mock objects. Note that when creating a unit test from a workflow run, the JSON file is generated by the unit test wizard with data taken from the run instance.
  • Unit tests are written as C# methods decorated with the [TestMethod] attribute.

Implementing Negative Tests

For the workflow I wrote unit tests for, I did not encounter any limitations for happy path scenarios. However, I quickly came across a limitation when writing negative tests.

To illustrate this, let’s consider the simple workflow below, which calls an HTTP endpoint through the HTTP Action. Depending on the success or failure of this call, the business process takes different paths. To simplify the illustration, I replaced these different paths with returning different responses in the Response actions.

When generating unit tests from a successful run, the wizard creates a negative test like this:


[TestMethod]
public async Task GetGreetings_GetGreetingsSuccess_ExecuteWorkflow_FAILED_Sample3()
{
    // PREPARE
    var mockData = this.GetTestMockDefinition();
    var mockError = new TestErrorInfo(code: ErrorResponseCode.BadRequest, message: "Input is invalid.");
    mockData.ActionMocks["HTTP"] = new HTTPActionMock(status: TestWorkflowStatus.Failed, error: mockError);

    // ACT
    var testRun = await this.TestExecutor
        .Create()
        .RunWorkflowAsync(testMock: mockData).ConfigureAwait(false);

    // ASSERT
    Assert.IsNotNull(testRun);
    Assert.AreEqual(TestWorkflowStatus.Failed, testRun.Status);
}

Note that the HTTPActionMock type models the mock for the workflow’s HTTP action. The code above overrides the action mock for the HTTP action, loaded from the JSON file, with a mock whose status is set to Failed instead of Succeeded. Since the HTTP action now fails, the entire workflow fails.

Now, let’s imagine that I want to enhance the test and ensure that when the HTTP action fails, the “Response OK” action does not run, and the “Response Failure” action runs instead. To implement this, I can simply add:


Assert.AreEqual(expected: TestWorkflowStatus.Skipped, actual: testRun.Actions["Response_OK"].Status);
Assert.AreEqual(expected: TestWorkflowStatus.Succeeded, actual: testRun.Actions["Response_Failure"].Status);

These asserts ensure that I have implemented my business logic correctly and also act as a regression test to detect any breaking change introduced later on.

Current Limitations

Current Limitation with Negative Testing

In more complex scenarios, business logic might depend on the actual content of the error response returned by the HTTP call. For example, a web API might return specific error codes.

I expect to implement such a scenario by overriding the Action Mock for the HTTP action with:

  • A failed status
  • An output for the mock with the specific payload returned by the HTTP action

I tried defining this in C#:


var actionOutput = new HTTPActionOutput
{
    Body = new JObject { ["errorCode"] = "009" },
    StatusCode = HttpStatusCode.BadRequest
};

var httpFailedActionMock = new HTTPActionMock(
    status: TestWorkflowStatus.Failed,
    outputs: actionOutput
);

mockData.ActionMocks["HTTP"] = httpFailedActionMock;

But this causes the TestExecutor to throw the following exception:

The workflow '' associated with unit test '' has action mock 'HTTP' that should have non empty error message when status is set to 'Failed'.

The only way to prevent the exception is to use a constructor that takes a TestErrorInfo object — but this constructor does not allow passing a custom HTTP response payload, which prevents me from implementing my test scenario.

Even trying to bypass this limitation by editing the JSON file directly did not work:

"actionMocks": {
  "HTTP": {
    "name": "HTTP",
    "status": "Failed",
    "outputs": {
      "statusCode": 400,
      "body": {
        "errorCode": "009"
      }
    },
    "error": {
      "Code": "BadRequest",
      "Message": "The request is invalid."
    }
  }
}

Current Limitation with MSTest project

As of now, the MSTest project must target .NET 6.0, which is no longer supported by Microsoft. Although this code is only used for testing and not deployed to production, some company security policies may still flag it. Static analysis tools like SonarQube, Snyk, and others might raise alerts due to the use of an unsupported framework, requiring documented exceptions and justification.
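
For reference, the pin shows up in the generated test project file roughly like this (a minimal sketch):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Pinned by the unit test wizard; .NET 6 is out of support -->
    <TargetFramework>net6.0</TargetFramework>
  </PropertyGroup>
</Project>
```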

Conclusion

Given that this is the initial public preview, I’m satisfied with the current capabilities and have provided feedback to Microsoft, expressing hope for improvements in negative test handling and support for a more recent version of .NET.
