
Nintex Forms for Nintex Workflow Cloud – Saving Multi-choice fields to SharePoint Lists

Nintex Forms for NWC was just released yesterday, and just in time for me to add a contact form to my new website (https://www.docfluix.com).  Matt Jennings already posted an awesome intro for Nintex Forms for NWC, so check that out to understand what this awesome new product is all about.

Background Info

My goal for my first Nintex Forms for NWC project was to create a simple web contact form (Name, Company, Phone, Email) and save the data to a SharePoint list. I also decided to add a multi-choice control named CustomerInterest to my form with the following options:

  • Workflow / Business Process Automation
  • Document Management (ECM / Records Management)
  • Collaboration
  • Intranet / Portals
  • Migration to SharePoint
  • SharePoint Governance
  • SharePoint Adoption

In my SharePoint site, I added an out of the box Contacts list, and customized it with a choice field (also named “CustomerInterest”) that allows multiple choices and includes the identical options above.

After the Start Event, I added the SharePoint Online action Create an Item to my workflow and connected it to my site.

This action displays the columns from my new SharePoint contact list, including the CustomerInterest field.  Initially, my plan was to directly map the CustomerInterest collection variable to the CustomerInterest SharePoint column.  However, that approach does not work.  The solution, however, is quite easy, just not totally obvious…

To understand the solution, please note the following:

  • When you define a Multiple Choice field in Nintex Forms for NWC, it creates a variable for that control of type collection.  This collection is populated with only the choices that the user selected.
  • If you want to populate a multi-choice column in SharePoint, you need to pass it a concatenated string of all values separated with ;# (a semicolon followed by a pound sign).  So I would need to export a string that looks like this: “Collaboration;#SharePoint Governance;#SharePoint Adoption”
  • In Nintex Forms for NWC, the title that is displayed for each control becomes the name of the start event variable created in the workflow.  So in my form, the multi-choice field is titled “Are you looking for assistance with:”, and that is also the name of the collection variable that I will reference in my workflow.


The solution is to iterate through the collection variable and format a text variable the way SharePoint expects for a Multi choice column.  Although this part is quite simple, I’ll outline my steps:

  1. Define new variables:
    • CustomerInterests (Text) – Will store the concatenated value to map to the SharePoint choice column
    • ChoiceItem (Text) – Will store each selected choice from the collection variable
    • Idx (Integer) – Used for loop processing
  2. Add a Loop For Each action configured as follows:
    • Target Collection: Are you looking for assistance with:
    • Store Item: ChoiceItem
    • Index: Idx
  3. Within the Loop For Each action, I added:
    • Get Item from Collection action
      • Target Collection: Are you looking for assistance with:
      • Index: Idx
      • Store Item: ChoiceItem
    • Set a variable value action (optionally, the Create a text string action would also work the same way)
      • Variable: CustomerInterests
      • Value: [CustomerInterests][ChoiceItem];# (this concatenates the next ChoiceItem to the end of the string for each iteration)
  4. In the Create an Item action, now I was able to simply assign the CustomerInterests text variable to the CustomerInterest SharePoint column.  When this action executes we end up with a new list item in SharePoint with the intended choices selected.
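For reference, the concatenation that the loop above performs can be sketched outside of NWC.  Here's a minimal Python sketch (the function name and sample choices are mine, not part of the workflow):

```python
def to_sharepoint_multichoice(choices):
    # SharePoint multi-choice columns expect the values joined with ";#",
    # which is what the Loop For Each builds up one iteration at a time
    result = ""
    for choice in choices:
        result = result + choice + ";#"
    return result

selected = ["Collaboration", "SharePoint Governance", "SharePoint Adoption"]
print(to_sharepoint_multichoice(selected))
# Collaboration;#SharePoint Governance;#SharePoint Adoption;#
```

Note that, like the workflow, this leaves a trailing ;# on the string, which SharePoint tolerates when setting a multi-choice column.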

Here is what the completed form looks like:

DocFluix Contact Form


Using Nintex Workflow Cloud? Always Start with External Start!

When I first started using Nintex Workflow Cloud back in the early preview days, the first workflows that I created used various Start Events from both Nintex (Public Web Form, Nintex Mobile, Scheduled Start, External Start) and some external connectors (like Salesforce & Dynamics CRM).

As I jumped into NWC and started creating some workflows, I would pick a particular start event (say Public Web Form) and implement my workflow logic. Then I might decide that I want to test the same workflow logic but using a different start event, perhaps Nintex Mobile or Salesforce.  However, one cannot simply change Start Events if there are any references in the workflow to the Start Event variables.  These need to be removed before you can change the type of Start Event.

After developing a few processes with Nintex Workflow Cloud, I realized the best bet is to always put your core workflow logic in a workflow using the External Start Event.

Then you can create one or more additional workflows that leverage other Start Event types, which can easily call your core workflow via External Start.  This is accomplished using the Call a Workflow action.  Here are the steps:

  1. Create your core workflow using the External Start and add whatever start event variables that your workflow needs.  Then implement the workflow logic and publish your workflow.
  2. Create a 2nd workflow, using whichever start event you expect the end users will need, such as Public Web Form, Nintex Mobile, Salesforce, etc.  This workflow will need the identical start event variables as the first workflow. However, this workflow will only require one action.
  3. After the Start Event, simply drag the Call a Workflow action (from the Logic and Flow group) under the Start Event.
  4. In the configuration for the Call a Workflow action, the first setting is to select which workflow you wish to start.  It will display a list of all published workflows that use an External Start Event:

  5. After selecting the desired workflow, the configuration screen will display fields for each Start Event variable from the selected workflow.  You can simply map the Start Event variables from the current workflow to the inputs of the workflow that you will call via External Start.

After mapping the Start Event variables, you can simply publish your 2nd workflow.  If you know you need additional ways to call your core workflow logic, you can immediately repeat steps 2 – 5 above using other Start Event types.

By using the External Start event you’ll also be able to initiate your NWC workflow just as easily from Nintex for SharePoint 2013/2016 or SharePoint Online, using the “Start workflow in Nintex Workflow Cloud” action.  Additionally, custom apps and other cloud services will be able to initiate your workflow using the OpenAPI (swagger) protocol.

Even if you don’t need to initiate your core workflow logic in multiple ways, NWC makes it so easy to use this pattern that it’s almost always a good idea to start with the External Start event.  You just never know when your business requirements might change in the future, and this way you (and your NWC workflows) will always be prepared for new requirements.



C# code for calling GSA Per Diem API from Azure Functions (Part 2 of 4)


This article is part 2 of a 4-part series, where we’ll cover the C# code for our custom connector for Nintex Xtensions and NWC.

In part 1, I covered the project requirements and how to set up Azure and Visual Studio 2017 with Tools for Azure Functions.  The goal of this project was to allow a workflow designer in NWC to query the GSA Per Diem API to look up the allowed Per Diem amounts for Hotel and Meals based on the travel date and location of a trip.

Below I’ll cover the GSA Per Diem API and the C# code for a wrapper function to make this API very easy to consume from Nintex Workflow Cloud.


The GSA’s Per Diem API is a RESTful service with a very simple interface.  The API URL is: https://inventory.data.gov/api/action/datastore_search?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b

This URL allows one to pass in an additional “filters” parameter with a JSON-formatted string containing the parameter values that the API will use to return the desired results.  The API allows four types of filters, and each requires two arguments: FiscalYear plus one of the following:

  • Zip
  • County
  • DestinationID
  • State
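The filters argument is just a small JSON object appended to the query string.  As a rough sketch (Python, with hypothetical example values; the C# equivalent appears later in GetPerDiemJson), the full request URL can be composed like this:

```python
import json

BASE_URL = "https://inventory.data.gov/api/action/datastore_search"
RESOURCE_ID = "8ea44bc4-22ba-4386-b84c-1494ab28964b"

def build_per_diem_url(fiscal_year, zip_code):
    # FiscalYear plus exactly one location filter (Zip, in this case)
    filters = json.dumps({"FiscalYear": fiscal_year, "Zip": zip_code},
                         separators=(",", ":"))
    return "{0}?resource_id={1}&filters={2}".format(BASE_URL, RESOURCE_ID, filters)

print(build_per_diem_url("2017", "92101"))
```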

For my initial project, I used Zip.  So to call this API, you’ll need to format a REST call as follows (shown here with 2017 and 92101 as example values):

https://inventory.data.gov/api/action/datastore_search?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b&filters={"FiscalYear":"2017","Zip":"92101"}
The response from this REST API is a fairly lengthy JSON message containing a lot of data.  However, my NWC connector only needs to receive two values (MealsPerDiem & HotelPerDiem).  As noted above, the Azure Function I created was a fairly simple wrapper around this REST API, that enabled the following:

  1. Allow API to be called as an HTTP POST operation
  2. Accept an input value from Nintex Workflow Cloud of a specific date, rather than the FiscalYear.
  3. Return only the two values that we need: MealsPerDiem & HotelPerDiem
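The date handling in point 2 boils down to deriving a year and a three-letter month abbreviation from the trip date.  A quick Python equivalent of what the C# does (assuming an ISO-style date string; the C# version accepts any parseable date):

```python
from datetime import datetime

def parse_trip_date(trip_date):
    # Derive the fiscal year and three-letter month abbreviation from the trip
    # date (the wrapper simply uses the calendar year of the trip as FiscalYear)
    d = datetime.strptime(trip_date, "%Y-%m-%d")
    return str(d.year), d.strftime("%b")

print(parse_trip_date("2017-08-15"))
# ('2017', 'Aug')
```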

So here is the C# code that I used (Since the code is pretty well commented, I’m not providing additional explanations):

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Threading.Tasks;
using System.Text;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace GsaPerDiemFunction
{
    /// <summary>
    /// This class implements behavior for the GetPerDiemByZip function in Azure
    /// </summary>
    public static class GetPerDiemByZip
    {
        private const string GSA_REST_URL = "https://inventory.data.gov/api/action/datastore_search";
        private const string RESOURCEID = "?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b";

        /// <summary>
        /// The PerDiemInput class is used to deserialize JSON input data from the HTTP POST
        /// Zip - Zip code of where the travel occurred
        /// TripDate - Date that the specific travel occurred on
        /// </summary>
        public class PerDiemInput
        {
            public string Zip { get; set; }
            public string TripDate { get; set; }
        }

        /// <summary>
        /// The PerDiemOutput class is returned as serialized JSON with the requested values from the GSA Per Diem API
        /// Hotel - Per Diem amount allowed by GSA for Hotel expense on specified date and zip code
        /// Meals - Per Diem amount allowed by GSA for Meals on specified date and zip code
        /// </summary>
        public class PerDiemOutput
        {
            public int Hotel { get; set; }
            public int Meals { get; set; }
        }

        /// <summary>
        /// This method implements the core logic for our Azure Function
        /// </summary>
        /// <param name="req">Incoming HTTP Request data as a POST operation</param>
        /// <param name="log">Log provided by Azure Functions for debug purposes.  Logged data is visible to admins in the Azure portal.</param>
        /// <returns>HTTP response containing the serialized PerDiemOutput, or a BadRequest on invalid input</returns>
        [FunctionName("GetPerDiemByZip")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            try
            {
                log.Info("Received Per Diem Request.");

                //Deserialize the incoming JSON data into an instance of PerDiemInput
                string jsonInput = await req.Content.ReadAsStringAsync();
                var perDiemInput = JsonConvert.DeserializeObject<PerDiemInput>(jsonInput);

                //Capture the incoming values and check that the inputs are valid
                string zip = perDiemInput.Zip;
                string tripDate = perDiemInput.TripDate;
                string fiscalYear;
                int monthNum;
                string month;
                DateTime travelDate;
                int zipInt;

                //If TripDate is not valid, return a BadRequest response; otherwise parse the date into year and month values
                if (!DateTime.TryParse(tripDate, out travelDate))
                {
                    string tripDateInvalidMsg = string.Format("The specified TripDate ({0}) is not valid.", tripDate);
                    return req.CreateResponse(HttpStatusCode.BadRequest, tripDateInvalidMsg);
                }
                else
                {
                    fiscalYear = travelDate.Year.ToString();
                    monthNum = travelDate.Month;
                    DateTime monthDate = new DateTime(1, monthNum, 1);

                    month = monthDate.ToString("MMM");
                }

                //If Zip is not valid, return a BadRequest response
                if ((zip.Length != 5) || (!int.TryParse(zip, out zipInt)))
                {
                    string zipInvalidMsg = string.Format("The specified Zip ({0}) is not valid.", zip);
                    return req.CreateResponse(HttpStatusCode.BadRequest, zipInvalidMsg);
                }

                log.Info(string.Format("Per Diem Requested for Zip: {0} and Fiscal Year: {1}", zip, fiscalYear));

                //Get per diem data in JSON format from the GSA REST service, based on the specified year and zip code,
                //then deserialize the per diem JSON into an object.  (The Rootobject class, not shown here, models
                //the JSON response returned by the GSA API.)
                string jsonResponse = GetPerDiemJson(zip, fiscalYear, log);
                var gsaPerDiem = JsonConvert.DeserializeObject<Rootobject>(jsonResponse);
                log.Info("Deserialized jsonResponse");

                //Create the output object that we want to return.  This is a simplified version of what the GSA provides,
                //returning only the max per diem allowed for meals and hotel based on the fiscal year and zip.
                PerDiemOutput perDiemOut = new PerDiemOutput();
                perDiemOut.Meals = int.Parse(gsaPerDiem.result.records[0].Meals);
                perDiemOut.Hotel = GetHotelPerDiemByMonth(ref gsaPerDiem, month);

                return req.CreateResponse(HttpStatusCode.OK, perDiemOut);
            }
            catch (Exception e)
            {
                log.Info(string.Format("ERROR:\r\n Message: {0}\r\n Source: {1}\r\n Stack: {2}\r\n TargetSite: {3}", e.Message, e.Source, e.StackTrace, e.TargetSite));
                return req.CreateResponse(HttpStatusCode.BadRequest, e.Message);
            }
        }

        /// <summary>
        /// Formats input filters into JSON and calls the GSA Per Diem REST API
        /// </summary>
        /// <param name="zip">Zip code specified by the caller</param>
        /// <param name="fiscalYear">Fiscal Year specified by the caller</param>
        /// <param name="log">Log provided by Azure Functions for debug purposes.  Logged data is visible to admins in the Azure portal.</param>
        /// <returns>The raw JSON response returned by the GSA API</returns>
        private static string GetPerDiemJson(string zip, string fiscalYear, TraceWriter log)
        {
            HttpWebResponse response = null;
            StreamReader streamReader = null;
            try
            {
                //Format the filters parameter as JSON
                string filters = string.Format("&filters={{\"FiscalYear\":\"{0}\",\"Zip\":\"{1}\"}}", fiscalYear, zip);
                string resourceUrl = string.Format("{0}{1}{2}", GSA_REST_URL, RESOURCEID, filters);

                //Create the request to the GSA API
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(resourceUrl);

                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls
                                        | SecurityProtocolType.Tls11
                                        | SecurityProtocolType.Tls12;

                log.Info(string.Format("Created Request for: {0}", resourceUrl));

                request.Method = "POST";
                request.ContentType = "application/x-www-form-urlencoded";

                log.Info("Request Host: " + request.Host);

                response = (HttpWebResponse)request.GetResponse();
                log.Info("Received Response from Request");

                Stream receiveStream = response.GetResponseStream();
                streamReader = new StreamReader(receiveStream, Encoding.UTF8);
                string jsonResponse = streamReader.ReadToEnd();

                return jsonResponse;
            }
            catch (Exception e)
            {
                log.Info("ERROR calling GSA API: " + e.Message);
                throw;
            }
            finally
            {
                //Clean up the reader and response, whether or not the call succeeded
                if (streamReader != null)
                    streamReader.Close();
                if (response != null)
                    response.Close();
            }
        }

        /// <summary>
        /// Returns the hotel per diem for the specified month, since the GSA returns a separate hotel value for each month
        /// </summary>
        /// <param name="gsaPerDiem">Deserialized GSA API response</param>
        /// <param name="month">Three-letter month abbreviation derived from the TripDate</param>
        /// <returns>The hotel per diem amount for the specified month</returns>
        private static int GetHotelPerDiemByMonth(ref Rootobject gsaPerDiem, string month)
        {
            int perDiem = 0;

            switch (month.ToLower())
            {
                case "jan":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jan);
                    break;
                case "feb":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Feb);
                    break;
                case "mar":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Mar);
                    break;
                case "apr":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Apr);
                    break;
                case "may":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].May);
                    break;
                case "jun":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jun);
                    break;
                case "jul":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jul);
                    break;
                case "aug":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Aug);
                    break;
                case "sep":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Sep);
                    break;
                case "oct":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Oct);
                    break;
                case "nov":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Nov);
                    break;
                case "dec":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Dec);
                    break;
            }

            return perDiem;
        }
    }
}

Once the code is complete, you can publish again to Azure.  I should also mention that there are a few ways to test and debug your Azure Function code, both locally (without publishing to Azure) and in Azure Functions itself.  Many REST API developers use Postman, which is what I did as well.
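If you'd rather script the test than use Postman, the request is easy to compose.  Here's a Python sketch (the app name and function key are placeholders; with Function-level authorization, Azure Functions expects the key in the code query parameter):

```python
import json

def build_function_request(app_name, function_key, zip_code, trip_date):
    # Placeholders: copy the real function URL and key from the Azure portal
    url = "https://{0}.azurewebsites.net/api/GetPerDiemByZip?code={1}".format(app_name, function_key)
    # The body matches the PerDiemInput class that the function deserializes
    body = json.dumps({"Zip": zip_code, "TripDate": trip_date})
    return url, body

url, body = build_function_request("myfunctionapp", "<function-key>", "92101", "2017-08-15")
# POST `body` to `url` with Content-Type: application/json (via Postman, curl, or urllib)
```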

Before you can consume your Azure Function as a Connector for Nintex Workflow Cloud, you’ll need to define the Swagger definition file, which I’ll cover in part 3.


Using Azure Functions to create Xtensions for Nintex Workflow Cloud (Part 1 of 4)


Many people are familiar with Nintex as a workflow solution for SharePoint and Office 365, but last year Nintex released their Workflow Cloud offering (NWC), which has no dependency on SharePoint or Office 365. NWC comes with dozens of native workflow actions connecting to many different external cloud services, such as Salesforce, Box, Dropbox, Google, SharePoint, and many others.  Nintex frequently updates NWC with new connectors/actions for additional cloud services.  And more importantly, Nintex recently added an extensibility framework with a published SDK to allow developers to create their own custom NWC connectors.

This new extensibility model is branded as Nintex Xtensions, and it leverages the OpenAPI specification (aka Swagger), which is considered the most popular open source framework for describing RESTful APIs.  While Nintex Xtensions is well documented in the SDK & Help files, the documentation assumes that you (the developer) are familiar with how to write the code for your custom connector as well as create the necessary Swagger definition file, which you need to import into NWC in order to configure your connector.

I decided the easiest approach for me to create a custom Xtension for NWC was to use Azure Functions, which provides a scalable, “serverless”, compute-on-demand platform to host and execute the code.  Additionally, Azure Functions includes tooling to generate the Swagger definition file for my code.  And you can develop and test Azure Functions for free.  If your Azure Function is used in production, you’ll end up paying for the actual compute time.  But Azure provides a limited amount of compute time per month at no cost, so for testing or a Proof of Concept, it would likely cost you $0/month.

Typically, when developing in Azure Functions, the development is done in the browser, within the Azure Portal, and you can choose between various languages (Javascript, C#, F#, etc.).  However, if you prefer to develop using Visual Studio, that’s possible as well using Visual Studio 2017 Tools for Azure Functions, which is currently available as a preview.  That’s the approach I used.

Since there is a lot of content for this article, I decided to break it down into four separate posts:


For my first NWC Xtension, I wanted to select a project where the coding logic would be routine, since I was learning a few new technologies (Azure Functions & the NWC Xtensions Framework), but also a project that would have some real-world value.

Having implemented various projects in the past for clients to process Travel Authorization Requests and Expense Reports, I figured it would be useful to have an Xtension to calculate the allowed per diem expense amounts for hotels and meals.  The U.S. GSA website allows government employees and contractors to look up the max per diem amounts that can be reimbursed for hotels and meals.  The GSA also provides a REST-based Per Diem API that can be queried by year and location (Zip, County or City & State).

This API returns a JSON message, containing the maximum per diem amount for meal and hotel expenses.  The response contains a single value for the meal per diem and 12 values for hotel per diem (one value for each month, as the hotel per diem changes based on peak months when hotels tend to be more expensive).
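Since the hotel per diem is month-specific, the connector has to pick the right month's value out of the returned record.  Conceptually, that selection looks like this (a Python sketch with made-up sample rates; the month field names Jan through Dec mirror the GSA response):

```python
def hotel_per_diem_for_month(record, month_abbrev):
    # Each GSA record carries one hotel rate per month (fields "Jan" .. "Dec"),
    # so we index the record by the three-letter month of the trip
    return int(record[month_abbrev])

record = {"Jan": "126", "Feb": "126", "Aug": "152"}  # made-up sample rates
print(hotel_per_diem_for_month(record, "Aug"))
# 152
```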

To simplify the requirements for my Xtension, I decided it would accept two inputs (Zipcode and TravelDate) and return two values (HotelPerDiem and MealsPerDiem).  I then used Visual Studio 2017 to create my Azure Function as a simple wrapper around the GSA Per Diem API.


I assume you’re reading this article due to your interest in Azure Functions and/or Xtensions for Nintex Workflow Cloud, not for GSA Per Diem rules.  I expect that anyone reading this will adapt the concepts to whatever business logic or integration requirements that you may have.  Although I’ve provided code samples for the GSA Per Diem integration, that’s only to ensure completeness of the content.


If you haven’t used Azure Functions before, I suggest that you Create your first function in the Azure portal to understand the basics of Azure Functions before starting in Visual Studio.  Ultimately, you may decide you prefer to develop in the portal rather than in Visual Studio.  But the rest of this article is based on using Visual Studio.

The following steps are needed before you can really begin:

  1. Make sure you have an Azure subscription.  If not, you’ll need to create a free account before you begin.
  2. Download and install the Visual Studio 2017 Tools for Azure Functions, including:
  3. Once your Visual Studio environment is configured per instructions above, you can create your first Azure Functions project in Visual Studio.  From the New Project dialog, select Visual C# as your language and the Azure Functions project template, as shown below:


  4. Provide a project and solution name for your new function, select your desired folder location, and click the OK button.  This creates the Visual Studio project, which represents a single Azure Function “App”.  We can then add one or more individual Azure Functions to this App / Project.
  5. Once the new Project is created, we’ll need to add an Azure Function to it.  From the Solution Explorer, right click on the project name and select Add > New Item.  This will display the Add New Item dialog.  Select Azure Function, provide a name for the new class, and click Add, as shown below:


  6. You’ll then be prompted to specify the type of trigger that this function should use.  Since NWC will call our Xtension over HTTP, you’ll need to select the HttpTrigger.  At this point, you can select the AccessRights (I used “Function”), and specify the FunctionName.  I suggest using the same name as was used to create the C# class in the prior step.  This will create a new C# class so you can start developing your code.  Note that at this point, everything has been done locally.  Later, we’ll publish our new Function App and any specific Functions to Azure.

Publish to Azure Functions

Now that our Visual Studio project is created, we can publish the project, which will create the new Function App with an empty function in Azure Functions.  From the Visual Studio Solution Explorer, right click on your project name and click the Publish option.  This will display the Publishing screen with options to create a new Function App or select an existing one:

Click the Publish button, which will display the Create App Service dialog, as shown below:


Provide your preferred App Name and select the Subscription, Resource Group, Service Plan and Storage Account.  Then click the Create button, which will provision your new Function App in Azure.  You can then login to your Azure portal to inspect your new Function App and function:


As we start and continue to write the code for our function, we can publish our changes as often as needed to Azure.  Now that our Azure Functions project is set up in Visual Studio, we can start on our C# code, which I’ll cover in part 2.


Welcome to Insights on Workflow and ECM…

So, this is my first Blog post in over 2 years, since I sold my share in Hershey Technologies to Konica Minolta.  I never had a personal Blog before, as I had simply used the HersheyTech website for my personal blogging.  Since our acquisition by KM, I simply hadn’t gotten around to launching my own Blog.

As a software consultant/architect with SharePoint and Office 365, I’ve worked on many projects covering many aspects of these products (collaboration, upgrades, migrations, intranets/portals, ECM, workflow, custom development, etc.).

This Blog will cover all of these topics and more.  But as the site name says, my focus here will be mainly related to Workflow (especially Nintex and Microsoft Flow) and Enterprise Content Management (with a focus on leveraging native ECM-related features in SharePoint and Office 365).