
Tips & Tricks for Automated SharePoint Site Provisioning with PnP (Part 2)

Overview

Welcome to Part 2 of this series on Automated SharePoint Site Provisioning with PnP.  This blog series is focused on lessons learned from using PnP PowerShell along with Site Designs, Site Scripts, Azure Automation and other components to automate the creation of new SharePoint sites in a repeatable and well-governed manner.

If you haven’t reviewed part 1, you can check it out here:

TL;DR: when using Document Sets in your PnP Provisioning solution, be sure to:

  1. Enable site scripting (by setting DenyAddAndCustomizePages to false)
  2. Include feature activation for Document Sets in your provisioning template

This article is focused on something I just came across tonight, and I figured I would post it while it's fresh in my head.  The situation I stumbled on was related to deploying Document Sets as part of a provisioning site template.  In this case, I had a base PnP provisioning template with existing sections for SiteFields and ContentTypes containing a few basic site columns and content types.

I then added a custom document set content type (named “Project Packet”) to my test site, which included columns for values like Project Manager, Project Name, Status, Start Date and End Date.  Since I needed to replicate the functionality of my test site in a new site, I added the related elements for my new SiteFields and ContentTypes to my existing provisioning template.
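If you need to pull just those pieces out of a reference site, exporting only the Fields and ContentTypes handlers keeps the output small.  Here is a minimal sketch (the site URL and output file name are placeholders):

# Connect to the reference site that already has the custom columns and content types
Connect-PnPOnline -Url "https://[yourtenant].sharepoint.com/teams/[YourTestSite]"

# Export only the site columns and content types into a small template
Get-PnPProvisioningTemplate -Out "FieldCTs-Extract.xml" -Handlers "Fields,ContentTypes"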

Using the “Features” handler in PnP

When you want to use Document Sets in a new site, you have to activate the Document Sets site collection feature (until you do this, the base Document Set content type is not deployed to the site).  Rather than doing this through the normal Site Settings, I decided to update my provisioning template to handle this for me.  Since this was my first time using PnP Provisioning with Document Sets, I ran the following PowerShell cmdlet on my test site, which already had the Document Set feature activated:

 
Get-PnPProvisioningTemplate -Out "MyTemplate-Features.xml" -Handlers "Features"

Note the use of the -Handlers parameter with a value of "Features".  This tells the cmdlet to only export the custom site (site collection) and web features from the site, so we'll get a small template from this, as shown below:

<?xml version="1.0"?>
<pnp:Provisioning xmlns:pnp="http://schemas.dev.office.com/PnP/2018/07/ProvisioningSchema">
  <pnp:Preferences Generator="OfficeDevPnP.Core, Version=3.2.1810.0, Culture=neutral, PublicKeyToken=5e633289e95c321a" />
  <pnp:Templates ID="CONTAINER-TEMPLATE-D9976DB23F1040BAB0A41BE5BC677F83">
    <pnp:ProvisioningTemplate ID="TEMPLATE-D9976DB23F1040BAB0A41BE5BC677F83"
                              Version="1" BaseSiteTemplate="GROUP#0" Scope="RootSite">
      <pnp:Features>
        <pnp:SiteFeatures>
          <pnp:Feature ID="3bae86a2-776d-499d-9db8-fa4cdc7884f8" />
        </pnp:SiteFeatures>
        <pnp:WebFeatures>
          <pnp:Feature ID="f151bb39-7c3b-414f-bb36-6bf18872052f" />
        </pnp:WebFeatures>
      </pnp:Features>
    </pnp:ProvisioningTemplate>
  </pnp:Templates>
</pnp:Provisioning>

In this case, the Site Feature "3bae86a2-776d-499d-9db8-fa4cdc7884f8" is the Document Set feature.  The Web Feature "f151bb39-7c3b-414f-bb36-6bf18872052f" refers to the Site Notebook feature.  Quick side note: it's a mystery to me why the Site Notebook feature was included here.  I did not explicitly activate the Site Notebook feature, as it is activated by default in modern team sites.  If I run the same PowerShell cmdlet on a brand new modern team site that does not have the Document Set feature activated, the generated PnP template does not contain either of these features.  But for some reason, the PnP engine feels that it needs to include the Site Notebook feature along with the Document Set feature.
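As a quick aside, if you ever want to activate these features outside of a provisioning template, the legacy PnP module can do it directly.  A minimal sketch, assuming you are already connected to the target site:

# Activate the Document Sets feature at the site collection scope
Enable-PnPFeature -Identity 3bae86a2-776d-499d-9db8-fa4cdc7884f8 -Scope Site

# Activate the Site Notebook feature at the web scope
Enable-PnPFeature -Identity f151bb39-7c3b-414f-bb36-6bf18872052f -Scope Web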

I now copied the <pnp:Features> </pnp:Features> section from this XML file and merged it into my main PnP template (above the <pnp:SiteFields> section).  At this point I now have two PnP provisioning templates:

  1. FieldCTs.xml – with sections for Features, SiteFields and ContentTypes (as described above)
  2. Lists.xml – with list definitions containing references to my Project Packet content type and related columns

Now, I figured, I was good to go, so I applied my updated PnP templates containing the Document Set feature activation along with my custom document set content type (Project Packet) and associated columns, using these PowerShell cmdlets:

Apply-PnPProvisioningTemplate -Path "FieldCTs.xml"
Apply-PnPProvisioningTemplate -Path "Lists.xml"

These cmdlets completed without error, so I went in to use my updated site.  From the library containing my Project Packet content type, I clicked New Project Packet, filled out the metadata fields and clicked OK.  But instead of seeing my new Project Packet document set, I was greeted with a 404 error!  I tested this a few times and replicated the error on additional sites.

Enable Site Scripting

So with a little research, I found that in order to activate the Document Set feature properly, we first need to enable site scripting on the target site.  This article in the PnP-Sites-Core GitHub repo got me started down the right path, but it referenced a solution using SharePoint Online PowerShell. I wanted to complete this using PnP PowerShell.  I also found this discussion from the MS Tech Community to be helpful: How do I set DenyAddAndCustomizePages using PnP? (thanks Alan Trafford and Pieter Veenstra).

So here is the final script with everything together:

Import-Module SharePointPnPPowerShellOnline

try
{
    Set-PnPTraceLog -On -LogFile "pnplog.txt" -Level Debug

    # Target site to provision (set this to your own site URL)
    $siteUrl = "https://tenant.sharepoint.com/sites/YourSite"

    $DenyAddCustomizeEnum =
        [Microsoft.Online.SharePoint.TenantAdministration.DenyAddAndCustomizePagesStatus]

    # Connect to the tenant admin site in order to update the site collection property
    Connect-PnPOnline -Url https://tenant-admin.sharepoint.com

    $ctx = Get-PnPContext

    $site = Get-PnPTenantSite -Detailed -Url $siteUrl
    Write-Host "Site: $site"

    # Setting DenyAddAndCustomizePages to Disabled enables site scripting on the target site
    $site.DenyAddAndCustomizePages = $DenyAddCustomizeEnum::Disabled

    $site.Update()
    $ctx.ExecuteQuery()

    Write-Host "Disabled DenyAddAndCustomizePages"

    Get-PnPTenantSite -Detailed -Url $siteUrl |
        Select-Object Url, DenyAddAndCustomizePages

    # Wait until the site reports an Active status before applying the templates
    $status = $null

    Do
    {
        Write-Host "Waiting... $status"
        Start-Sleep -Seconds 5
        $site = Get-PnPTenantSite -Url $siteUrl -Detailed
        $status = $site.Status
    } While ($status -ne 'Active')

    Disconnect-PnPOnline

    # Reconnect to the target site before applying the provisioning templates
    Connect-PnPOnline -Url $siteUrl

    Apply-PnPProvisioningTemplate -Path "FieldCTs.xml"
    Apply-PnPProvisioningTemplate -Path "Lists.xml"
}
catch
{
    Write-Error "Exception Occurred"
    Write-Error "Exception Type: $($_.Exception.GetType().FullName)"
    Write-Error "Exception Message: $($_.Exception.Message)"
}
finally
{
    Set-PnPTraceLog -Off
}
 

Conclusion

So when including document sets within your PnP Provisioning solution, remember two things:

  1. Enable site scripting (by setting DenyAddAndCustomizePages to false)
  2. Include feature activation in your provisioning template

Tips & Tricks for Automated SharePoint Site Provisioning with PnP (Part 1)

Overview

For the last few months, I’ve been working on a few new solutions where clients need to easily create new project sites in SharePoint.  These customers have found that by using Modern Team & Communication sites, Hub sites, Office 365 Groups and Microsoft Planner they can provide a compelling platform for a wide range of project management requirements. 

Each client has unique requirements for the specific components they need in each project site.  But there are common requirements that all of my customers share…

  1. Need to customize a Modern Team site with a unique set of content types, columns, home page layout and other features to support their own project management approach.
  2. Ability for users or IT to easily spin up new project sites, and have each site contain the appropriate configuration (columns, content types, pages, web parts, permissions, etc.)
  3. Enable users across projects to easily access and share common project resources through use of a Hub site.

To implement these solutions, I’ve leveraged a combination of Office & SharePoint technologies including:

  • PnP Powershell
  • Site Designs / Site Scripts
  • Microsoft Flow
  • Azure Storage Queues
  • Azure Functions / Azure Automation

The end result is a solution that behaves like this:

image

Although this type of solution has a lot of moving parts, the end result is a solution that offers incredible flexibility to support changing client requirements in the future. 

If you’re not already familiar with these components, Microsoft provides great documentation for PnP Powershell and the overall PnP Provisioning Engine, so you may wish to start with these articles first.  Since the major setup steps are well documented elsewhere, this blog series will cover various tips and tricks (and lessons learned) that I encountered while implementing my customers’ provisioning solutions.

Background on PnP PowerShell

But before we jump into the first “tip”, let's review some basics of PnP PowerShell…

With PnP PowerShell, the easiest way to generate a PnP provisioning template from a site is with this cmdlet:

Connect-PnPOnline -Url "https://[yourtenant].sharepoint.com/teams/[YourSite]"

Get-PnPProvisioningTemplate -Out "MyTemplate.xml"

This command will generate a site provisioning template that contains all aspects of your site.  When you generate a provisioning template like this, your MyTemplate.xml will include any customizations that you created in the site (lists, libraries, content types, columns, web parts, navigation, security, etc.) along with definitions for all of the Out of the Box components in a modern team site.

If you execute this PowerShell cmdlet on a brand new (uncustomized) modern team site, the resulting XML file will contain 890 lines of XML, including:

  • 233 lines for OOTB site columns
  • 180 lines for OOTB site content types
  • 306 lines for OOTB libraries

Now let’s say you have a modern SharePoint team site and  you have customized the site by adding some custom site columns, content types, lists and document libraries.  You’ve also added some web parts to the home page and customized the left navigation.

If you generate a provisioning template for your customized site, the XML file created will be larger, as it will contain both the OOTB components and your custom components.  Now that you have a customized “MyTemplate.xml”, you could use that template to apply your customizations to a new modern team site using this PnP PowerShell cmdlet:

Connect-PnPOnline -Url "https://[yourtenant].sharepoint.com/teams/TestSite001"

Apply-PnPProvisioningTemplate -Path "MyTemplate.xml"

One of the best parts about PnP Provisioning is that you can re-apply a template to the same site multiple times.  The provisioning engine is very smart about applying changes from the template to the site.  For elements that have not been changed, the provisioning engine simply skips those items.  So even though our template contains definitions for existing libraries like Site Assets, Style Library, etc., if we have not customized those libraries there is no problem re-applying the template that contains those definitions.

This also means that if you apply a template to a site, then make a small tweak to your template, you can apply the revised template to the existing site and it will simply apply the changes.

Tip # 1 – Customize your PnP site templates

Although the PnP Provisioning engine does an excellent job of ignoring definitions that already exist, you may still wish to customize your templates to make them smaller and more modular.  Personally, I tend to delete the XML definitions for the out of the box Fields, ContentTypes and Lists from my template XML file, to make the template easier to review in the future.

Removing the OOTB lists is pretty easy, but removing the OOTB Fields and ContentTypes is much more tedious.  If you open your provisioning template XML file in your favorite XML editor and collapse all sections, you can see exactly where the OOTB ListInstances are stored.  In the example below, my template has one custom list (a library named “Sales”) and four OOTB libraries: Documents, Form Templates, Site Pages and Style Library.

Assuming we have not customized the OOTB libraries, then we don’t need to keep them in the template, as we’ll have a copy of these libraries created by SharePoint in each new team site.  Therefore, I’ll delete the pnp:ListInstance elements for the OOTB libraries:

Before Delete

This Provisioning template XML only contains definitions for the site’s lists and it is 326 lines.

image
After Delete

Note that the provisioning template XML is now down to 59 lines

image

It's been my experience that there are numerous scenarios where you'll need or want to manually edit the provisioning template XML file.  By starting with a vastly smaller XML file, future tweaks to the template will be easier.
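If you'd rather not hand-edit the list section at all, newer builds of the legacy PnP module can also extract only the lists you name.  Treat the following as a sketch, since parameter availability depends on your module version:

# Export only the custom "Sales" library, skipping the OOTB libraries entirely
Get-PnPProvisioningTemplate -Out "MyTemplate-Lists.xml" -Handlers Lists -ListsToExtract "Sales"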

Tip # 2 – Modularize your PnP site templates

In addition to removing redundant or unneeded components from the template, you may also find value in refactoring templates to make them more re-usable.

Consider a scenario where you're defining a site template to manage projects.  Your company has various divisions that all manage projects, so you want a site template that is customized to your company's project management standards.  Each project site needs a consistent set of custom columns, content types, libraries, lists, web parts, etc.  However, each department may wish to customize the home page and left navigation for its project sites differently from the standard company project sites.

It is easier to manage this type of requirement by creating PnP site templates that are modular and re-usable.  In this case, we may want to generate two provisioning templates for each project site:

  • A template for the “information architecture” (e.g. ProjectIA-Template.xml)
    • Contains definitions for site fields, content types, lists and libraries
    • One version of this template will be used across all departments
  • A template for the layout and web parts on the home page along with left navigation (e.g. ProjectUI-DeptX-Template.xml)
    • Contains definitions for home page and left navigation
    • A unique version of this template may be created for each department

To generate modular templates, we use the Handlers parameter within the Get-PnPProvisioningTemplate cmdlet.  For example:

  • Get-PnPProvisioningTemplate -Out "ProjectIA-Template.xml" -Handlers "Fields,ContentTypes,Lists"
  • Get-PnPProvisioningTemplate -Out "ProjectUI-DeptX-Template.xml" -Handlers "Pages,PageContents,Navigation"
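Applying the modular templates to a new department site is then just a matter of applying them in order.  For example (URLs are placeholders):

Connect-PnPOnline -Url "https://[yourtenant].sharepoint.com/teams/DeptXProject001"

# Shared information architecture first, then the department-specific UI template
Apply-PnPProvisioningTemplate -Path "ProjectIA-Template.xml"
Apply-PnPProvisioningTemplate -Path "ProjectUI-DeptX-Template.xml"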

A few common examples of PnP Provisioning Handlers include:

Handler Name  | Description                               | Represented in template XML as
------------- | ----------------------------------------- | ----------------------------------
Fields        | Site columns                              | pnp:Fields
ContentTypes  | Site content types                        | pnp:ContentTypes
Lists         | Site lists and libraries                  | pnp:Lists
SiteSecurity  | Members of SharePoint groups in the site  | pnp:Security
PageContents  | Layout and web parts on the home page     | pnp:WebSettings, pnp:PageContents
Navigation    | Left and top navigation links             | pnp:Navigation

Note: For a full list of all PnP Provisioning Handler values, check out OfficeDevPnP.Core on GitHub

Conclusion

That concludes Part 1 of this series on PnP Provisioning tips and tricks.  I'll follow up this article soon with Part 2.


Nintex Forms for Nintex Workflow Cloud – Saving Multi-choice fields to SharePoint Lists

Nintex Forms for NWC was just released yesterday, just in time for me to add a contact form to my new website (https://www.docfluix.com).  Matt Jennings already posted an awesome intro for Nintex Forms for NWC, so check it out to understand what this new product is all about.

Background Info

My goal for my first Nintex Forms for NWC project was to create a simple web contact form (Name, Company, Phone, Email) and save the data to a SharePoint list. I also decided to add a multi-choice control named CustomerInterest to my form with the following options:

  • Workflow / Business Process Automation
  • Document Management (ECM / Records Management)
  • Collaboration
  • Intranet / Portals
  • Migration to SharePoint
  • SharePoint Governance
  • SharePoint Adoption

In my SharePoint site, I added an out of the box Contacts list, and customized it with a choice field (also named “CustomerInterest”) that allows multiple choices and includes the identical options above.

After the Start Event, I added the SharePoint Online action Create an Item to my workflow and connected it to my site.

This action displays the columns from my new SharePoint contact list, including the CustomerInterest field.  Initially, my plan was to directly map the CustomerInterest collection variable to the CustomerInterest SharePoint column. That approach, however, does not work.  The solution is quite easy but not totally obvious…

To understand the solution, please note the following:

  • When you define a Multiple Choice field in Nintex Forms for NWC, it creates a variable for that control of type collection.  This collection will be populated with only the choices that the user selected.
  • If you want to populate a multi-choice column in SharePoint, you need to pass it a concatenated string of all values separated with ;# (semi-colon hashtag).  So I would need to export a string that looks like this: “Collaboration;#SharePoint Governance;#SharePoint Adoption” (see the short illustration after this list).
  • In Nintex Forms for NWC, the title that is displayed for each control becomes the name of the start event variable created in the workflow.  So in my form, the multi-choice field is titled “Are you looking for assistance with:“, and that is also the name of the collection variable that I will reference in my workflow.
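Just to illustrate the target format (the workflow itself builds this string with Nintex actions, not PowerShell), the value SharePoint expects is simply the selected choices joined with ;# :

# Illustration only: join the selected choices with ";#" for a SharePoint multi-choice column
$selected = @("Collaboration", "SharePoint Governance", "SharePoint Adoption")
$customerInterests = $selected -join ';#'
# $customerInterests now contains: Collaboration;#SharePoint Governance;#SharePoint Adoption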

Solution

The solution is to iterate through the collection variable and format a text variable the way SharePoint expects for a multi-choice column.  Although this part is quite simple, I'll outline my steps:

  1. Define new variables:
    • CustomerInterests (Text) – Will store the concatenated value to map to the SharePoint choice column
    • ChoiceItem (Text) – Will store each selected choice from the collection variable
    • Idx (Integer) – Used for loop processing
  2. Add a Loop For Each action configured as follows:
    • Target Collection: Are you looking for assistance with:
    • Store Item: ChoiceItem
    • Index: Idx
  3. Within the Loop For Each action I added
    • Get Item from Collection action
      • Target Collection: Are you looking for assistance with:
      • Index: Idx
      • Store Item: ChoiceItem
    • Set a variable value action (optionally, the Create a text string action would also work the same way)
      • Variable: CustomerInterests
      • Value: [CustomerInterests][ChoiceItem];# (this concatenates the next ChoiceItem to the end of the string for each iteration)
  4. In the Create an Item action, I was now able to simply assign the CustomerInterests text variable to the CustomerInterest SharePoint column.  When this action executes, we end up with a new list item in SharePoint with the intended choices selected.

Here is what the completed form looks like:

DocFluix Contact Form


Using Nintex Workflow Cloud? Always Start with External Start!

When I first started using Nintex Workflow Cloud back in the early preview days, the first workflows that I created used various Start Events from both Nintex (Public Web Form, Nintex Mobile, Scheduled Start, External Start) and some external connectors (like Salesforce & Dynamics CRM).

As I jumped into NWC and started creating some workflows, I would pick a particular start event (say Public Web Form) and implement my workflow logic. Then I might decide that I want to test the same workflow logic but using a different start event, perhaps Nintex Mobile or Salesforce.  However, one cannot simply change Start Events if there are any references in the workflow to the Start Event variables.  These need to be removed before you can change the type of Start Event.

After developing a few processes with Nintex Workflow Cloud, I realized the best bet is to always put your core workflow logic in a workflow using the External Start Event.

Then you can create one or more additional workflows that leverage other Start Event types, which can easily call your core workflow via External Start.  This is accomplished using the Call a Workflow action.  Here are the steps:

  1. Create your core workflow using the External Start and add whatever start event variables that your workflow needs.  Then implement the workflow logic and publish your workflow.
  2. Create a 2nd workflow, using whichever start event you expect the end users will need, such as Public Web Form, Nintex Mobile, Salesforce, etc.  This workflow will need the identical start event variables as the first workflow. However, this workflow will only require one action.
  3. After the Start Event, simply drag the Call a Workflow action (from the Logic and Flow group) under the Start Event.
  4. In the configuration for the Call a Workflow action, the first setting is to select which workflow you wish to start.  It will display a list of all published workflows that use an External Start Event:

  5. After selecting the desired workflow, the configuration screen will display fields for each Start Event variable from the selected workflow.  You can simply map the Start Event variables from the current workflow to the inputs of the workflow that you will call via External Start.


After mapping the Start Event variables, you can simply publish your 2nd workflow.  If you know you need additional ways to call your core workflow logic, you can immediately repeat steps 2 – 5 above using other Start Event types.

By using the External Start event you’ll also be able to initiate your NWC workflow just as easily from Nintex for SharePoint 2013/2016 or SharePoint Online, using the “Start workflow in Nintex Workflow Cloud” action.  Additionally, custom apps and other cloud services will be able to initiate your workflow using the OpenAPI (swagger) protocol.

Even if you don't need to initiate your core workflow logic in multiple ways, NWC makes it so easy to use this pattern that it's almost always a good idea to start with the External Start event.  You just never know when your business requirements might change in the future, and this way you (and your NWC workflows) will always be prepared for new requirements.

 


Add New Connector to Nintex Workflow Cloud (Part 4 of 4)

Overview

This article is part 4 of a 4 part series, where I’ll cover the steps to import your Azure Function as a new connector using the Nintex Workflow Cloud Xtensions framework.  If you haven’t already reviewed the prior posts in this series, here’s what you missed:

My sample project here was to build an Azure Function that wrapped the GSA Per Diem API, making it easy to consume as an Xtension in Nintex Workflow Cloud.  The use case for this would be workflow designers creating a solution for Travel Authorization Requests and/or Expense Reports.  This custom workflow action could validate that the amount of expenses for Meals and Hotels is within the limit allowed by the GSA.

Now that we've created our Azure Function, we're left with the best part of the solution – importing it into Nintex Workflow Cloud as a new Xtension / Connector.

Create the Xtension in NWC

After logging into your NWC tenant, click on the Xtensions link on the left side of your Dashboard.  Then click the orange plus (+) sign on the right side.

image

This will display the Connector Definition dialog.  Simply copy and paste the API Definition URL into the textbox for the OpenAPI specification URL (see the bottom of part 2 for how to obtain this URL in the Azure Portal).  After pasting the URL, NWC will immediately validate the definition and display a green checkmark on the right.  Then click Next.

image

This will display the Security dialog.  We can accept the default settings here.  These settings indicate that Nintex Xtensions will use an API key for security to call our Azure Function (the API key is the token at the end of the API Definition URL) and will pass the API key value as a parameter named “code” in the URL querystring.  The API key will be entered by the workflow designer when they use the GSA Per Diem action within a workflow.  Click Next to continue.

image

This will display the Publish dialog as shown below.  Enter a Name and Description for the Connector.

image

Then click on one of the available icons or if you prefer you can upload your own icon.  Then click the Publish button.

image

 

When the Publishing is complete your new Connector will show up in the custom connectors list:

image

Before using the Connector, we'll need to create a Connection for it.  While you can create the connection after you add the GSA Per Diem action to your workflow, in this case we'll create a new connection first.  Click on the Connections link on the left side of the Dashboard, then click the Add New button:

image

From the Add a new connection screen, select the name of your new Connector from the Connector list.  In this case, I'm selecting the GSA Per Diem connector.  Then click the Connect button:

image

Enter values for the Connection name and the API Key that you plan to use for this specific connection, then click the Connect button:

image

The new connection will be displayed in the connection list:

image

Now, we can actually use our custom Connector in an NWC workflow!  So, let's create a new Workflow, and when we review the toolbox, we'll see our new GSA Per Diem connector. If we expand the GSA Per Diem section, we'll see the specific action that corresponds to our Azure Function.  Note that the name displayed for this action is defined in our Swagger definition, as the summary.  So if we prefer a different name for the action, we can change that summary value (see part 3 for details).

image

In this case, I created a workflow using the Nintex Public Web Form as the Start Event, with two start variables: ZipCode and TripDate.  I then added my new GSA Per Diem action onto the workflow design surface.  Let's look at the configuration:

image

For this action there are 6 settings to be provided:

  • Connection – Select the name of the connection that was created previously from the drop down list
  • The API Key – You may use the same API key that was provided in the API Definition URL.  Azure Functions also allows you to create and manage additional API keys for your function app.  This can be done from the Settings screen of your Azure Function App.
  • Gsa Per Diem Input Zip – assign this to the ZipCode start event variable.
  • Gsa Per Diem Input Trip Date – assign this to the TripDate start event variable
  • Results Hotel – Define a new variable of type integer called HotelPerDiem.  This is an output variable that will be populated by this action.
  • Results Meals – Define a new variable of type integer called MealsPerDiem. This is an output variable that will be populated by this action.

Conclusion

So we have now concluded our project to implement a solution using C# and Visual Studio to develop our logic within Azure Functions, and to import our API definition as a custom connector for Nintex Workflow Cloud.

The expected scenario with this example is that we may capture actual Meal and/or Hotel expenses and compare those against the allowed Per Diems.  If the actual expenses exceed the Per Diem, the workflow may take certain actions, like automatically rejecting the request or routing it for special approval.  But you can hopefully adapt the concepts here for any custom business logic or integration scenarios that you can imagine.  Good luck!

If you have questions or comments please contact me on Twitter @TomCastiglia


Configuring Swagger Definition for your Azure Functions (Part 3 of 4)

Overview

This article is part 3 of a 4 part series, where I’ll cover editing of the Swagger definition file that we’ll import into NWC in part four in order to create the actual Nintex Xtension.  In part 1, I covered the steps to setup your Azure Functions and Visual Studio environment, and in part 2 I covered the C# code that was developed and published to Azure Functions.

In part 4, we’ll import our Azure Function into Nintex Workflow Cloud as a new Connector using NWC’s Xtensions framework.

Configure Methods

Before creating our API definition, we’ll need to configure the allowed HTTP methods for our new function, so that it only allows POST methods.  First, navigate to your new Function App in the Azure portal, expand the Functions section, then expand the specific function and click on the Integrate option in the portal, as shown below:

image

From the Integrate screen, review the Trigger configuration and find the Allowed HTTP methods drop down field, which may be defaulted to a value of All methods.  Change it to Selected Methods instead and then uncheck all of the HTTP methods except for POST, as shown below:

image

Now that the allowed methods are set, we can create our Swagger API Definition.

Swagger Definition

Creating a Swagger Definition was a new experience for me, and there are various tools out there that can help with the process.  One tool is built into the Azure Functions portal site, and that is primarily where I created my definition.  But you can also use the online editor at http://editor.swagger.io/.  Although both editors are very similar, there were a few things that I found easier to figure out on the Swagger site than in the Azure Portal, so I used both of them.  But we'll start in the Azure Portal.

Regardless of the editor that you use, you create your Swagger definition using a language called YAML, which stands for “YAML Ain't Markup Language”.  Instead, it's described as “a human friendly data serialization standard for all programming languages”.  The output of a Swagger definition is JSON, but you edit it with YAML.

To generate the API definition for your Azure Function:

Navigate to your Function App in Azure Portal, and click on the API Definition link.

image

This will display the Function API definition (Swagger) page.  For API definition source, click on the Function (preview) button.

image

This will display the Swagger Editor, with a blank definition, as shown below:

image

Click on the button for Generate API definition template.

image

This will generate a boilerplate API definition based on the metadata about your function, as shown below.  As you edit the swagger definition, it will validate your changes and provide error details in real time.

image

Here is what the completed API Definition looks like for my GsaPerDiemFunctions app:

image

To review the generated Swagger JSON, click the Copy button next to the API Definition URL:

image

Then test it in your browser, or better yet a REST client tool like Postman.  The resulting JSON should look like this:

{
    "swagger": "2.0",
    "info": {
        "title": "gsaperdiemfunctions.azurewebsites.net",
        "version": "1.0.0"
    },
    "schemes": [
        "https"
    ],
    "host": "gsaperdiemfunctions.azurewebsites.net",
    "basePath": "/api",
    "paths": {
        "/GetPerDiemByZip?code={APIKEY}": {
            "post": {
                "tags": [
                    "Looks up GSA Per Diems for Hotel and Meals by Date and Zipcode."
                ],
                "summary": "Looks up GSA Per Diems for  Hotel and Meals by Date and Zipcode.",
                "operationId": "GetPerDiemByZip",
                "x-ntx-summary": "Looks up GSA Per Diems for Hotel and Meals by Date and Zipcode.",
                "parameters": [
                    {
                        "name": "APIKEY",
                        "in": "path",
                        "description": "The API Key",
                        "x-ntx-summary": "The API Key",
                        "required": true,
                        "type": "string"
                    },
                    {
                        "name": "GsaPerDiemInput",
                        "in": "body",
                        "description": "Contains Zipcode and Tripdate.",
                        "x-ntx-summary": "Contains Zipcode and Tripdate.",
                        "required": true,
                        "schema": {
                            "$ref": "#/definitions/GsaPerDiemInput"
                        }
                    }
                ],
                "produces": [
                    "application/json"
                ],
                "consumes": [
                    "application/json"
                ],
                "responses": {
                    "200": {
                        "description": "Returns Per Diem Amounts for Hotel and Meals by Zip and TripDate",
                        "schema": {
                            "$ref": "#/definitions/GsaPerDiemOutput"
                        }
                    }
                },
                "security": [
                    {
                        "apikeyQuery": []
                    }
                ]
            }
        }
    },
    "securityDefinitions": {
        "apikeyQuery": {
            "type": "apiKey",
            "name": "code",
            "in": "query"
        }
    },
    "definitions": {
        "GsaPerDiemInput": {
            "description": "Zipcode and tripdate for the trip",
            "properties": {
                "Zip": {
                    "type": "string",
                    "description": "Zipcode for main destination of the trip"
                },
                "TripDate": {
                    "type": "string",
                    "description": "Date of the first day of the trip"
                }
            },
            "required": [
                "Zip",
                "TripDate"
            ]
        },
        "GsaPerDiemOutput": {
            "description": "Max Per Diem amounts for Meals and Hotel",
            "properties": {
                "Hotel": {
                    "type": "integer",
                    "description": "Per Diem amount allowed for Hotel"
                },
                "Meals": {
                    "type": "integer",
                    "description": "Per Diem amount allowed for Meals"
                }
            },
            "required": [
                "Hotel",
                "Meals"
            ]
        }
    }
}
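If you prefer to script this sanity check rather than use a browser or Postman, a short PowerShell sketch like the following works (paste in your own API Definition URL, including the code token):

# Paste the API Definition URL copied from the Azure portal (includes the ?code= token)
$apiDefinitionUrl = "<your API Definition URL>"

# Retrieve the generated Swagger JSON and inspect the title and defined paths
$swagger = Invoke-RestMethod -Uri $apiDefinitionUrl
$swagger.info.title
$swagger.paths.PSObject.Properties.Name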

C# code for calling GSA Per Diem API from Azure Functions (Part 2 of 4)

Overview

This article is part 2 of a 4 part series, where we’ll cover the C# code for our custom connector for Nintex Xtensions and NWC.

In part 1, I covered the project requirements and how to set up Azure and Visual Studio 2017 with Tools for Azure Functions.  The goal of this project was to allow a workflow designer in NWC to query the GSA Per Diem API to look up the allowed Per Diem amounts for Hotel and Meals based on the travel date and location of a trip.

Below I’ll cover the GSA Per Diem API and the C# code for a wrapper function to make this API very easy to consume from Nintex Workflow Cloud.

Development

The GSA's Per Diem API is a RESTful service with a very simple interface.  The API URL is: https://inventory.data.gov/api/action/datastore_search?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b

This URL allows one to pass in an additional “filters” parameter with a JSON-formatted string containing the parameter values that the API will use to return the desired results.  The API allows four types of filters, and each requires two arguments: FiscalYear as one of the arguments and one of the following as the other:

  • Zip
  • County
  • DestinationID
  • State

For my initial project, I used Zip.  So to call this API, you’ll need to format a REST call as follows:

https://inventory.data.gov/api/action/datastore_search?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b&filters={"FiscalYear":"2017","Zip":"10036"}
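If you want to poke at the raw GSA response before writing any code, a quick PowerShell sketch like this works (the endpoint and resource_id come straight from the URL above; the response shape may of course change over time):

# Build the filters JSON and URL-encode it before appending it to the query string
$filters = [uri]::EscapeDataString('{"FiscalYear":"2017","Zip":"10036"}')
$url = "https://inventory.data.gov/api/action/datastore_search" +
       "?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b&filters=$filters"

# Invoke-RestMethod parses the JSON response; the per diem values live under result.records
$result = Invoke-RestMethod -Uri $url
$result.result.records[0] | Select-Object Zip, FiscalYear, Meals, Jan, Jun, Dec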

The response from this REST API is a fairly lengthy JSON message containing a lot of data.  However, my NWC connector only needs to receive two values (MealsPerDiem & HotelPerDiem).  As noted above, the Azure Function I created was a fairly simple wrapper around this REST API, that enabled the following:

  1. Allow API to be called as an HTTP POST operation
  2. Accept an input value from Nintex Workflow Cloud of a specific date, rather than the FiscalYear.
  3. Return only the two values that we need: MealsPerDiem & HotelPerDiem

So here is the C# code that I used (since the code is pretty well commented, I'm not providing additional explanations):

using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Threading.Tasks;
using System.Text;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;

namespace GsaPerDiemFunction
{
    /// <summary>
    /// This class implements behavior for the GetPerDiemByZip function in Azure
    /// </summary>

    public static class GetPerDiemByZip
    {
        //Constants
        private const string GSA_REST_URL = "https://inventory.data.gov/api/action/datastore_search";
        private const string RESOURCEID = "?resource_id=8ea44bc4-22ba-4386-b84c-1494ab28964b";

        /// <summary>
        /// The PerDiemInput class is used to deserialize JSON input data from the HTTP POST
        /// Zip - Zip code of where the travel occurred
        /// TripDate - Date that the specific travel occurred on
        /// </summary>

        public class PerDiemInput
        {
            public string Zip { get; set; }
            public string TripDate { get; set; }
        }

        /// <summary>
        /// The PerDiemOutput class is returned as serialized JSON with the requested values from the GSA Per Diem API
        /// Hotel - Per Diem amount allowed by GSA for Hotel expense on specified date and zip code
        /// Meals - Per Diem amount allowed by GSA for Meals on specified date and zip code
        /// </summary>

        public class PerDiemOutput
        {
            public int Hotel { get; set; }
            public int Meals { get; set; }
        }

        /// <summary>
        /// This method implements the core logic for our Azure Function
        /// </summary>

        /// <param name="req">Incoming HTTP Request data as a POST operation</param>
        /// <param name="log">Log provided by Azure functions for debug purposes.  Logged data is visible by admins in the Azure portal.</param>
        /// <returns></returns>
        [FunctionName("GetPerDiemByZip")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            try
            {
                log.Info("Received Per Diem Request.");

                //Deserialize the incoming JSON data into an instance of the PerDiemInput
                string jsonInput = await req.Content.ReadAsStringAsync();
                log.Info(jsonInput);
                var perDiemInput = JsonConvert.DeserializeObject<PerDiemInput>(jsonInput);

                //Cast incoming values and check that the inputs are valid
                string zip = perDiemInput.Zip;
                string tripDate = perDiemInput.TripDate;
                string fiscalYear;
                int monthNum;
                string month;
                DateTime travelDate;
                int zipInt;

                //If TripDate is not valid, return BadRequest response, otherwise parse the date into year and month values
                if (!DateTime.TryParse(tripDate, out travelDate))
                {
                    string tripDateInvalidMsg = string.Format("The specified TripDate ({0}) is not valid.", tripDate);
                    log.Info(tripDateInvalidMsg);
                    return req.CreateResponse(HttpStatusCode.BadRequest, tripDateInvalidMsg);
                }
                else
                {
                    fiscalYear = travelDate.Year.ToString();
                    monthNum = travelDate.Month;
                    DateTime monthDate = new DateTime(1, monthNum, 1);

                    month = monthDate.ToString("MMM");
                }

                //If Zip is not valid, return BadRequest response
                if ((zip.Length != 5) || (!int.TryParse(zip, out zipInt)))
                {
                    string zipInvalidMsg = string.Format("The specified Zip ({0}) is not valid.", zip);
                    log.Info(zipInvalidMsg);
                    return req.CreateResponse(HttpStatusCode.BadRequest, zipInvalidMsg);
                }

                log.Info(string.Format("Per Diem Requested for Zip: {0} and Fiscal Year: {1}", zip, fiscalYear));

                //Get per diem data in json format from GSA REST services, based on the specified year and zip code
                //Deserialize the per diem json into an object
                string jsonResponse = GetPerDiemJson(zip, fiscalYear, log);
                var gsaPerDiem = JsonConvert.DeserializeObject<Rootobject>(jsonResponse);
                log.Info("Deserialized jsonResonse");

                //Create the output object that we want to return.  This is a simplified version of what the GSA provides,
                //returning only the max per diem allowed for meals and hotel based on the fiscal year and zip.
                PerDiemOutput perDiemOut = new PerDiemOutput();
                perDiemOut.Meals = int.Parse(gsaPerDiem.result.records[0].Meals);
                perDiemOut.Hotel = GetHotelPerDiemByMonth(ref gsaPerDiem, month);

                return req.CreateResponse(HttpStatusCode.OK, perDiemOut);
            }
            catch (Exception e)
            {
                log.Info(string.Format("ERROR:\n\r Message: {0}\r\n Source: {1}\r\n Stack: {2}\r\n TargetSite: {3}", e.Message, e.Source, e.StackTrace, e.TargetSite));
                return req.CreateResponse(HttpStatusCode.BadRequest, e.Message);
            }
        }

        /// <summary>
        /// Formats input filters into JSON and calls the GSA Per Diem REST API
        /// </summary>

        /// <param name="zip">Zip code specified by the caller</param>
        /// <param name="fiscalYear">Fiscal Year specified by the caller</param>
        /// <param name="log">Log provided by Azure functions for debug purposes.  Logged data is visible by admins in the Azure portal.</param>
        /// <returns></returns>
        private static string GetPerDiemJson(string zip, string fiscalYear, TraceWriter log)
        {
            HttpWebResponse response = null;
            Stream receiveStream = null;
            StreamReader streamReader = null;
            try
            {
                //format filter parameter as JSON
                string filters = string.Format("&filters={{\"FiscalYear\":\"{0}\",\"Zip\":\"{1}\"}}", fiscalYear, zip);
                string resourceUrl = string.Format("{0}{1}{2}", GSA_REST_URL, RESOURCEID, filters);

                //Create request to GSA API
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(resourceUrl);

                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls
                                        | SecurityProtocolType.Tls11
                                        | SecurityProtocolType.Tls12
                                        | SecurityProtocolType.Ssl3;

                log.Info(string.Format("Created Request for: {0}", resourceUrl));

                request.Method = "POST";
                request.ContentType = "application/x-www-form-urlencoded";

                log.Info("Request Host: " + request.Host);

                response = (HttpWebResponse)request.GetResponse();
                log.Info("Received Response from Request");

                receiveStream = response.GetResponseStream();
                streamReader = new StreamReader(receiveStream, Encoding.UTF8);
                string jsonResponse = streamReader.ReadToEnd();

                response.Close();
                streamReader.Close();
                return jsonResponse;
            }
            catch
            {
                throw;
            }
            finally
            {
                if (response != null)
                {
                    response.Close();
                }
                if (streamReader != null)
                {
                    streamReader.Close();
                }
            }
        }

        private static int GetHotelPerDiemByMonth(ref Rootobject gsaPerDiem, string month)
        {
            int perDiem = 0;

            switch(month.ToLower())
            {
                case "jan":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jan);
                    break;
                case "feb":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Feb);
                    break;
                case "mar":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Mar);
                    break;
                case "apr":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Apr);
                    break;
                case "may":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].May);
                    break;
                case "jun":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jun);
                    break;
                case "jul":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Jul);
                    break;
                case "aug":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Aug);
                    break;
                case "sep":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Sep);
                    break;
                case "oct":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Oct);
                    break;
                case "nov":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Nov);
                    break;
                case "dec":
                    perDiem = int.Parse(gsaPerDiem.result.records[0].Dec);
                    break;
            }

            return perDiem;
        }
    }
}

Once the code is complete, you can publish again to Azure.  I should also mention that there are a few ways to test and debug your Azure Function code, both locally (without publishing to Azure) and from Azure Functions.  Many REST API developers use Postman, which is what I did as well.
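If you want a quick smoke test from the command line instead of (or in addition to) Postman, a PowerShell sketch like this exercises the deployed function.  The host name and function key below are placeholders for your own deployment:

# Function URL and key are placeholders - substitute your own values
$functionUrl = "https://gsaperdiemfunctions.azurewebsites.net/api/GetPerDiemByZip?code=<your-function-key>"

# POST the Zip and TripDate values the function expects and read back the Hotel/Meals per diems
$body = @{ Zip = "10036"; TripDate = "2017-06-15" } | ConvertTo-Json
Invoke-RestMethod -Uri $functionUrl -Method Post -Body $body -ContentType "application/json"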

Before you can consume your Azure Function as a Connector for Nintex Workflow Cloud, you’ll need to define the Swagger definition file, which I’ll cover in part 3.