Dynamics 365 portals v9 solutions now available and liquid editor change

If you are running Dynamics 365 v9.x, the solutions that accompany the latest portal update, which moves the portal up a major version to v9, are now available in all geos. Unlike the portal code itself, which is applied by Microsoft, solution updates must be installed manually by an administrator. Administrators can access this update in the Dynamics 365 Administration Center using the solution editor for the instance.

The new v9.x solutions are only available for v9.x instances though; you will not have them installed if you are still on v8.2 of Dynamics 365. So be aware there are v9-only changes in the future of portals. You can read more about the documented changes on the Microsoft Support site, and instructions for new features like Reset Portal and Change Base URL can be found on the Microsoft Docs site.

But not everything is documented. The big change in these solution updates, first pointed out to me privately by @readyxrm (Nick Doelman), is the inclusion of IntelliSense for liquid within the Web Template editor. But that is actually just the most visible change. The major change is that the editor for Web Templates and liquid templating has moved to the Microsoft Monaco Editor, the same editor that VS Code uses. This moves away from the ACE Editor that ADXStudio had originally implemented in the product.

You can see some interesting methods in the initialization of the editor that point to a liquid parser being built for the Monaco editor and the autocomplete registration.

registerLiquidLanguage(monaco);
registerAutocompleteProvider(monaco);
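The two calls above map onto Monaco's public API. As a rough illustration only (not the product's actual implementation), registerLiquidLanguage likely combines monaco.languages.register with a Monarch tokenizer. The minimal monaco stub below exists purely so the sketch runs standalone:

```javascript
// Minimal stub of the monaco namespace so this sketch is self-contained;
// in the portal editor this would be the real Monaco Editor global.
const monaco = {
  languages: {
    registered: [],
    register(lang) { this.registered.push(lang.id); },
    setMonarchTokensProvider(id, def) { this.tokenizer = { id, def }; },
  },
};

// Plausible shape of registerLiquidLanguage: declare the language id, then
// supply a Monarch tokenizer that highlights {% %} tags and {{ }} output.
function registerLiquidLanguage(monaco) {
  monaco.languages.register({ id: 'liquid' });
  monaco.languages.setMonarchTokensProvider('liquid', {
    tokenizer: {
      root: [
        [/\{%.*?%\}/, 'keyword'],    // liquid tags, e.g. {% assign x = 1 %}
        [/\{\{.*?\}\}/, 'variable'], // liquid output, e.g. {{ user.fullname }}
      ],
    },
  });
}

registerLiquidLanguage(monaco);
console.log(monaco.languages.registered); // [ 'liquid' ]
```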

Within the registerAutocompleteProvider method they have the start of the IntelliSense definition. It is very basic at this point: there is no dynamic building of IntelliSense based on your existing liquid code, and you can't yet access the standard liquid objects (user, page, sitemap, etc.) and their attributes, or attributes of queried entities. It is a first implementation though, and I would imagine there is more to come in later releases.

function registerAutocompleteProvider(monaco) {
	monaco.languages.registerCompletionItemProvider('liquid', {
		provideCompletionItems: () => {
			var autocompleteProviderItems = [];
			var keywords = ['assign', 'capture', 'endcapture', 'increment', 'decrement',
						'if', 'else', 'elsif', 'endif', 'for', 'endfor', 'break',
						'continue', 'limit', 'offset', 'range', 'reversed', 'cols',
						'case', 'endcase', 'when', 'block', 'endblock', 'true', 'false',
						'in', 'unless', 'endunless', 'cycle', 'tablerow', 'endtablerow',
						'contains', 'startswith', 'endswith', 'comment', 'endcomment',
						'raw', 'endraw', 'editable', 'endentitylist', 'endentityview', 'endinclude',
						'endmarker', 'entitylist', 'entityview', 'forloop', 'image', 'include',
						'marker', 'outputcache', 'plugin', 'style', 'text', 'widget',
						'abs', 'append', 'at_least', 'at_most', 'capitalize', 'ceil', 'compact',
						'concat', 'date', 'default', 'divided_by', 'downcase', 'escape',
						'escape_once', 'first', 'floor', 'join', 'last', 'lstrip', 'map',
						'minus', 'modulo', 'newline_to_br', 'plus', 'prepend', 'remove',
						'remove_first', 'replace', 'replace_first', 'reverse', 'round',
						'rstrip', 'size', 'slice', 'sort', 'sort_natural', 'split', 'strip',
						'strip_html', 'strip_newlines', 'times', 'truncate', 'truncatewords',
						'uniq', 'upcase', 'url_decode', 'url_encode'];

			for (var i = 0; i < keywords.length; i++) {
				autocompleteProviderItems.push({ 'label': keywords[i], kind: monaco.languages.CompletionItemKind.Keyword });
			}

			return autocompleteProviderItems;
		}
	});
}

They have a good list of tags and filters, but there are no relationships between them and no awareness of where each one works with operators, so don't expect a lot out of this autocomplete.

This is, though, a big change and hopefully a very fruitful one for Dynamics 365 portals developers. Using the same editor as VS Code, with the multitude of features that come with it, could really open up the possibilities, both for Microsoft and for customizers to enhance it. Monaco is open source and has a well-documented API that can be used to extend it.
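As an illustration of that API, a customizer could register richer completions than the keyword list, such as the standard liquid objects the built-in provider is missing. There is no supported hook in the portal editor today to run code like this, so treat it purely as a sketch of the Monaco API; the monaco stub exists only to make it self-contained, and the return shape matches the older array-based provider API the product code uses:

```javascript
// Stub of monaco so the sketch runs standalone; in a real scenario this
// would be the editor's actual monaco namespace.
const monaco = {
  languages: {
    CompletionItemKind: { Variable: 4 },
    providers: [],
    registerCompletionItemProvider(langId, provider) {
      this.providers.push({ langId, provider });
    },
  },
};

// Register completions for standard liquid objects the built-in list lacks.
function registerLiquidObjectCompletions(monaco) {
  const liquidObjects = ['user', 'page', 'sitemap', 'request', 'settings'];
  monaco.languages.registerCompletionItemProvider('liquid', {
    provideCompletionItems: () =>
      liquidObjects.map((name) => ({
        label: name,
        kind: monaco.languages.CompletionItemKind.Variable,
      })),
  });
}

registerLiquidObjectCompletions(monaco);
const items = monaco.languages.providers[0].provider.provideCompletionItems();
console.log(items.map((i) => i.label).join(', '));
// user, page, sitemap, request, settings
```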

Sadly, they weren't yet able to address the much-loved developer shortcut of CTRL + S to save a web template, or a bigger or full-screen code window. Maybe soon though, with the investment in this new editor. :)

Dynamics 365 portal: Use liquid fetchxml with paging cookie

You might already be familiar with a previous post on Use Liquid to Return JSON or XML, but what if you want efficient paging included in your scenario? I have had a couple of queries about how to do this with large data sets, so that the fetchxml limit of 5000 results can be exceeded or results can be returned as efficiently as possible. Fetchxml has a solution with the paging cookie, and the portal natively uses this in all its entity view type queries, but you can use it as well in your custom liquid fetchxml!

Using the same method as the previous post, Use Liquid to Return JSON or XML, we will set up a web template that makes the fetchxml query and, instead of returning HTML, set the MIME type to application/json.

This is the stubbed in liquid code we are starting with in our web template:

{% fetchxml feed %}
  <fetch version="1.0" mapping="logical">
    <entity name="contact">
      <attribute name="firstname" />
      <attribute name="lastname" />
      <attribute name="contactid" />
      <order attribute="lastname" descending="false" />
    </entity>
  </fetch>
{% endfetchxml %}{
  "results": [
    {% for item in feed.results.entities %}
      {
        "firstname": "{{ item.firstname }}",
        "lastname": "{{ item.lastname }}",
        "contactid": "{{ item.contactid }}"
      }{% unless forloop.last %},{% endunless %}
    {% endfor %}
  ]
}

Here we are just making a simple query using the liquid fetchxml tag and returning a list of all contacts (up to 5000 with the fetchxml limit).

The problem with using it just like this is that it does not get back a limited number of results and there is no paging of records involved. To make paging efficient on large data sets Microsoft has included what is called a paging cookie in fetchxml so that you can get faster application performance. Read more about the fetchxml paging cookie on the Microsoft Docs site: Page Large Result Sets with FetchXML.

The paging_cookie property should be used with the more_records boolean property, both available on the results object of a fetchxml query. The code below has been updated to include both of those properties in the JSON returned by the endpoint.

{% fetchxml feed %}
  <fetch version="1.0" mapping="logical">
    <entity name="contact">
      <attribute name="firstname" />
      <attribute name="lastname" />
      <attribute name="contactid" />
      <order attribute="lastname" descending="false" />
    </entity>
  </fetch>
{% endfetchxml %}{
  "morerecords": {{ feed.results.more_records }},
  "paging-cookie": "{{ feed.results.paging_cookie }}",
  "results": [
    {% for item in feed.results.entities %}
      {
        "firstname": "{{ item.firstname }}",
        "lastname": "{{ item.lastname }}",
        "contactid": "{{ item.contactid }}"
      }{% unless forloop.last %},{% endunless %}
    {% endfor %}
  ]
}

Now you can make logic decisions about whether to get more records based on the value of more_records, and use the value of paging_cookie to provide to the fetchxml.

At this point we are going to want to include a page size or returned record count so that we aren't getting all the records at once (to a max of 5000). To do this you add the count attribute to the opening fetch element with an integer value for the number of records per page of results.

<fetch version="1.0" mapping="logical" count="10">

Now we have set up the returned JSON with all the necessary details for the UI to make choices about getting more data. Next we need to further enhance the liquid logic to allow the UI to pass the endpoint parameters for the page and paging cookie so they can be included in the liquid fetchxml query. To do this we need to collect both of those as query string parameters and then add them to the fetchxml query if they exist.

For the paging cookie we want some logic so that the cookie is only included when it is passed as a parameter to the endpoint. Adding the following code to the top of your web template will check the request parameters for the key 'paging-cookie' and, if it has a value, set up the XML attribute with the value of the query string parameter.

{% assign pagingCookie = request.params['paging-cookie'] %}
{% if pagingCookie %}
  {% assign pagingCookie = ' paging-cookie="{{ pagingCookie }}"' | liquid %}
{% endif %}

Note we also have a filter at the end of the paging cookie variable assignment, the liquid filter, so that the liquid within the string is executed during the assignment of the variable.

With the paging cookie and the page parameter we want to add those to the opening fetch xml tag.

<fetch version="1.0" mapping="logical"{{ pagingCookie }} page="{{ request.params['page'] | default:1 }}" count="10">

We have also applied the default filter on request.params['page'] so that when it isn't included as a query string parameter it assumes you want the first page. The first page also doesn't require a paging cookie.

Everything looks pretty good at this point, with the exception of one gotcha. The paging cookie value is going to be XML. XML does not go well into a query string parameter because it includes illegal URL characters. We could solve this at the UI layer itself and translate or encode the XML to be URL safe, but that would require logic in the UI. We can instead encode the XML on the liquid end so that whatever UI consumes the data doesn't need to worry about any translating, just passing that same data back.

We are going to add another property to the returned JSON that is the encoded version of the paging cookie. For this we chain the escape filter, which HTML-entity encodes the XML, with the url_escape filter, which encodes the result into URL-friendly characters.

"paging-cookie-encoded": "{{ feed.results.paging_cookie | escape | url_escape }}",

For example the original XML paging cookie of:

<cookie page="1"><lastname last="Vermander" first="Administrator" /><contactid last="{D77E163F-4B77-E811-A960-000D3A1CA7D6}" first="{7469FD95-C0BD-4236-90BF-1D1100291DF5}" /></cookie>

Becomes:

%26lt%3Bcookie+page%3D%26quot%3B1%26quot%3B%26gt%3B%26lt%3Blastname+last%3D%26quot%3BVermander%26quot%3B+first%3D%26quot%3BAdministrator%26quot%3B+%2F%26gt%3B%26lt%3Bcontactid+last%3D%26quot%3B%7BD77E163F-4B77-E811-A960-000D3A1CA7D6%7D%26quot%3B+first%3D%26quot%3B%7B7469FD95-C0BD-4236-90BF-1D1100291DF5%7D%26quot%3B+%2F%26gt%3B%26lt%3B%2Fcookie%26gt%3B
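The transformation those two filters perform can be reproduced outside of liquid. This JavaScript sketch mirrors escape (HTML entity encoding) followed by url_escape (URL encoding, where spaces become + as in .NET's UrlEncode, rather than JavaScript's usual %20):

```javascript
// Mirror of the liquid `escape` filter: HTML-entity encode the XML.
// (Only the characters present in a paging cookie are handled here.)
function htmlEscape(s) {
  return s
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Mirror of `url_escape`: URL-encode, with + for spaces like .NET's UrlEncode.
function urlEscape(s) {
  return encodeURIComponent(s).replace(/%20/g, '+');
}

// The example paging cookie from above.
const cookie = '<cookie page="1"><lastname last="Vermander" first="Administrator" />'
  + '<contactid last="{D77E163F-4B77-E811-A960-000D3A1CA7D6}"'
  + ' first="{7469FD95-C0BD-4236-90BF-1D1100291DF5}" /></cookie>';

console.log(urlEscape(htmlEscape(cookie)));
// %26lt%3Bcookie+page%3D%26quot%3B1%26quot%3B%26gt%3B... (the encoded value shown above)
```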

With this you now have a JSON endpoint that supports paging with the fetchxml paging cookie and can efficiently return any number of records with various page sizes in your portal implementations. I do always recommend you keep your page sizes reasonable for performance considerations.

Below is the completed web template example with paging cookie included in the input of the endpoint and output of JSON.

{% assign pagingCookie = request.params['paging-cookie'] %}
{% if pagingCookie %}
  {% assign pagingCookie = ' paging-cookie="{{ pagingCookie }}"' | liquid %}
{% endif %}
{% fetchxml feed %}
  <fetch version="1.0" mapping="logical"{{ pagingCookie }} page="{{ request.params['page'] | default:1 }}" count="10">
    <entity name="contact">
      <attribute name="firstname" />
      <attribute name="lastname" />
      <attribute name="contactid" />
      <order attribute="lastname" descending="false" />
    </entity>
  </fetch>
{% endfetchxml %}{
  "morerecords": {{ feed.results.more_records }},
  "paging-cookie": "{{ feed.results.paging_cookie }}",
  "paging-cookie-encoded": "{{ feed.results.paging_cookie | escape | url_escape }}",
  "page": {{ request.params['page'] | default: 1 }},
  "results": [
    {% for item in feed.results.entities %}
      {
        "firstname": "{{ item.firstname }}",
        "lastname": "{{ item.lastname }}",
        "contactid": "{{ item.contactid }}"
      }{% unless forloop.last %},{% endunless %}
    {% endfor %}
  ]
}

You can test your new service now, with and without parameters. Your query for the first page would follow this format, with the page parameter being optional.

https://[portalname].microsoftcrmportals.com/[json-endpoint]/?page=1

Queries for next or previous pages should include the encoded version of the paging cookie.

https://[portalname].microsoftcrmportals.com/[json-endpoint]/?page=2&paging-cookie=%26lt%3Bcookie+page%3D%26quot%3B1%26quot%3B%26gt%3B%26lt%3Blastname+last%3D%26quot%3BVermander%26quot%3B+first%3D%26quot%3BAdministrator%26quot%3B+%2F%26gt%3B%26lt%3Bcontactid+last%3D%26quot%3B%7BD77E163F-4B77-E811-A960-000D3A1CA7D6%7D%26quot%3B+first%3D%26quot%3B%7B7469FD95-C0BD-4236-90BF-1D1100291DF5%7D%26quot%3B+%2F%26gt%3B%26lt%3B%2Fcookie%26gt%3B
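On the UI side, the response body has everything needed to build the next request. A small sketch (the endpoint path here is a placeholder for whatever page your web template is bound to; wire the actual fetch in however your portal page loads data):

```javascript
// Build the URL for the next page from the JSON body the endpoint returned,
// or null when morerecords reports there is nothing left.
// The endpoint path passed in is a placeholder, not a real portal route.
function nextPageUrl(endpoint, body) {
  if (!body.morerecords) return null;
  const nextPage = (body.page || 1) + 1;
  // paging-cookie-encoded is already URL-safe, so it can be appended verbatim.
  return endpoint + '?page=' + nextPage + '&paging-cookie=' + body['paging-cookie-encoded'];
}

// Example: the endpoint said there are more records after page 1.
console.log(nextPageUrl('/contacts-json/', {
  morerecords: true,
  page: 1,
  'paging-cookie-encoded': 'ENCODED_COOKIE_VALUE',
}));
// /contacts-json/?page=2&paging-cookie=ENCODED_COOKIE_VALUE
```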

Dynamics 365 portals: Authentication Deprecation?

Portal updates and news have been a little slow in 2018, but we have had some good small updates with enhancements to features. Most recently the portal update brought with it GDPR compatibility, and hidden within was some direction for the future of authentication on the portal. If you carefully read the GDPR article on the Microsoft Docs site you will find the following quote:

Going forward, we recommended that you use only Azure AD B2C identity provider for authentication and that you deprecate other identity providers.

😮 you might say…even local authentication?! Yes. All of the existing authentication methods in the portal are recommended to be deprecated, with Azure AD B2C as the identity provider of choice. The key is that right now this is only a recommendation. Long term, I think we can expect that all the existing authentication methods will actually be marked as deprecated, not just suggested or recommended, but forced deprecation.

The great news is the changes brought with GDPR provide the functionality to help migrate off of the deprecated providers. The enhancements bring with them a way to mark existing providers, including local authentication, as deprecated, and a user experience to help existing users on deprecated providers migrate to the Azure AD B2C provider.

What is Azure AD B2C? It is an Azure service targeted at helping your organization utilize consumer-based identities within your sites and applications. It provides the ability to set up identity for any application in a super simple manner, or to get into complicated policies. It provides a robust identity experience that allows you to utilize any number of social providers (like Google, Facebook, LinkedIn, Microsoft Account, etc.).

The reason to use Azure AD B2C, or another identity service, with your Dynamics 365 portal is to abstract the identity of your users outside of Dynamics 365. This is important for GDPR, to provide a hardened service that meets its requirements, but beyond GDPR it allows you to share the identities of your users outside of the Dynamics 365 portal. Native authentication with the Dynamics 365 portal, be it local authentication or a social provider, is next to impossible to share with other applications. This makes for a bad end-user experience of having to maintain yet another username and password just for your application. Using Azure AD B2C sets you up to allow a common identity across all applications in your organization, so that external users only need one identity, and that identity can also be a social login.

The direction of the Dynamics 365 portal is clear: the future is Azure AD B2C for all authentication, both local and social providers. Even if you don't need to meet the GDPR requirements, it is highly advised that all new implementations utilize Azure AD B2C with no legacy providers, and that existing implementations start planning to migrate off the legacy authentication currently in use.

You can review the full GDPR article with the new deprecation settings for authentication; below is a quick summary of some of the important highlights.

You can mark the local authentication using the following site setting:

Authentication/Registration/LocalLoginDeprecated

You can mark any other provider using the following format of a site setting, replacing [provider] with the name of your provider:

Authentication/[protocol]/[provider]/Deprecated

Both site settings are boolean supporting true or false.
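Since the provider setting follows a fixed naming pattern, a tiny helper (purely illustrative, not anything the portal ships) also documents the convention:

```javascript
// Build the site setting name that marks an identity provider as deprecated.
// Pattern from the docs: Authentication/[protocol]/[provider]/Deprecated
function deprecationSettingName(protocol, provider) {
  return 'Authentication/' + protocol + '/' + provider + '/Deprecated';
}

// e.g. for a Facebook provider registered under the OpenAuth protocol:
console.log(deprecationSettingName('OpenAuth', 'Facebook'));
// Authentication/OpenAuth/Facebook/Deprecated
```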

When you sign in with a provider marked as deprecated, the portal will switch into an experience prompting that user to transition to the Azure AD B2C provider.

This screen can be customized with your own content by modifying the content snippets in use on this page.

  • Account/Conversion/PageTitle
  • Account/Conversion/PageCopy
  • Account/Conversion/SignInExternalFormHeading

For more information please review the Migrating identity providers to Azure AD B2C section of the GDPR article (https://docs.microsoft.com/en-us/dynamics365/customer-engagement/portals/implement-gdpr#migrating-identity-providers-to-azure-ad-b2c) for deprecation details, and Azure AD B2C provider settings for portals for details on how to configure your portal with Azure AD B2C.

Dynamics 365 portal: New Release 8.4.0.265 – Portal Error Logs!

Portal development has been quiet from Microsoft for the last couple of months, so it's great to see a new release from the Microsoft portals team. The new release, 8.4.0.265, is now available and rolling out to environments. If you have the "Early Upgrade" option set in your administrative interface then you may have seen this release a little earlier. There are some major fixes within this release as well as new features. The new features (disabling custom errors, diagnostic logging, and plugin errors) will make portal developers' lives a lot easier, allowing them to see detailed error messages themselves when the portal encounters an issue rather than having to file a support request. This should really help to improve the development process and debugging time.

First, let's look at how you can tell whether you have the new version of the portal, and how you can access the administrative actions to configure custom errors. To check your portal version you can navigate to the services about endpoint, /_services/about. The full portal address would look like: https://exampleportalname.microsoftcrmportals.com/_services/about. From here you should see the portal version, regardless of whether or not you are logged into the portal.
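If you want to check this in script, the interesting part is comparing the version string that the about endpoint shows (fetching the page itself is left out of this sketch):

```javascript
// Return true when a portal version string like '8.4.0.265' is at least the
// given major.minor, e.g. to confirm the 8.4 error-logging features are present.
function hasVersionAtLeast(versionString, major, minor) {
  const [maj, min] = versionString.split('.').map(Number);
  return maj > major || (maj === major && min >= minor);
}

console.log(hasVersionAtLeast('8.4.0.265', 8, 4)); // true
console.log(hasVersionAtLeast('8.3.1.014', 8, 4)); // false
```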

If you have version 8.4.x.x then you will have the fixes and new features in this release. The next item to check is that your administrative console for the portal also has its update. You can do this by navigating to the Dynamics 365 Administration Center, going to Applications and then selecting Manage on your portal add-on. Within the portal administrative management, select Portal Actions in the left navigation. If you have the updated administrative interface you should see new tiles for Disable custom errors and Enable diagnostic logging.

The three new features, custom errors, diagnostic logging, and plugin errors, provide a self-service way to get error and debugging details from the portal. Having these as self-service functions is hugely helpful, as the time to get error details is dramatically cut. Disable custom errors will take the We're Sorry "friendly" error message and change it to the full ASP.NET error details. See the following example when we don't have an ASPX page for the defined page template.

Custom Errors Enabled:

Custom Errors Disabled:

For your developer portals I would suggest simply disabling custom errors at all times so that developers can immediately see the debug information and correct the issue. For test and production environments, so that the direct error message is not revealed, you can use the diagnostic logging feature and have all the error details written to Azure Blob storage while showing the end user the "friendly" We're sorry… message.

To enable diagnostic logging you will first need an Azure subscription set up and a storage account created. You can follow the Microsoft Docs quick start guide for specific directions on how to create a storage account.

Once you have your storage account, navigate in the Azure portal to its Access keys and copy the connection string.

Now from the portal administrative management, in Portal Actions, select Enable diagnostic logging. Within the dialog, paste in your Azure storage account connection string and select the time period to keep logs for. If you pick Always, be aware that your storage account will continuously grow in size and that cost will be passed on to you.

Once you have configured the logging, go create some traffic on your site, or reproduce an error, then navigate in the Azure Portal to the storage account and select Blobs, Containers. There should now be a container called telemetry-logs; this container holds your portal's diagnostic logs.

Navigating into the container will usually show you two sites. This is because your site has a replica within another Azure data center for performance and redundancy.

Within each site you will then have folders by year/month/day/hour (all in GMT) and then finally the actual log file itself as a CSV.

You can download these CSV files by selecting them and then selecting Download. Within the file you will find a row for each request, and if there are error details such as a stack trace then they will be available in the message column. Below is an example of the same ASPX-page-not-found error shown earlier with custom errors disabled.
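Once downloaded, the CSVs can be filtered for error rows with a few lines of script. This sketch uses a naive comma split and a hypothetical two-column layout (timestamp, message) for the sample rows; real log rows have more columns and can contain quoted commas, so use a proper CSV parser against actual files:

```javascript
// Hypothetical sample data; real portal diagnostic logs have more columns.
const sampleCsv = [
  'timestamp,message',
  '2018-06-01T10:00:00Z,Request served /page-1/',
  '2018-06-01T10:00:05Z,System.Web.HttpException: The file /example.aspx does not exist',
  '2018-06-01T10:00:09Z,Request served /page-2/',
].join('\n');

// Return the rows whose message column mentions an exception.
// Naive comma split for illustration only; quoted commas would break it.
function errorRows(csvText) {
  const [header, ...rows] = csvText.split('\n');
  const messageIndex = header.split(',').indexOf('message');
  return rows.filter((row) => row.split(',')[messageIndex].includes('Exception'));
}

console.log(errorRows(sampleCsv).length); // 1
```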

Having the diagnostic logs of previous events will help the portal development team go back and investigate errors that are reported from testing or live use and get insight into the issues directly from the logs.

Both of these features, once enabled, can also be turned off. The Portal Actions tiles will update to disable or re-configure once either feature is enabled.

Finally, plugin errors is a simple site setting, Site/EnableCustomPluginError, which takes a boolean (true or false) value. This allows your plugin exceptions to be shown in the portal, but not the full stack trace. You can use this to debug more easily, or to actually provide a more targeted error message to end users if you have custom business logic that the portal also needs to follow, directing the user on the boundaries or parameters.

For the full change log of fixes available in this release please check out the Microsoft Support article for 8.4.0.275.

You can also view the full documentation on the new features for error logging on the Microsoft Docs site. If you aren’t seeing your portal upgraded to 8.4.x.x then you might need to wait for the roll-out to be available in your region and server, this process is usually staggered to help mitigate issues or major problems in the release.

Upgrade Adxstudio Portals v7.x to xRM Portals Community Edition

With Adxstudio Portals having an announced end-of-support date of August 2018, many existing Adxstudio customers are looking to see what their upgrade options are. One of the options to upgrade to the new Dynamics 365 portals from Microsoft is to utilize the Portals Source Code version, or xRM Portals Community Edition, as part of the upgrade path. Hidden within the PowerShell scripts for importing the Portals Source Code version is logic to actually upgrade your portal data and make it compatible with the new data formats in the release of the Portals Source Code. This allows you to maintain your existing implementation instead of having to completely re-implement it on the new version. Once you are at Portals Source Code (v8.3 portals) you can then move to online portals, maintaining the implementation as well. To trigger this you need to be aware of two site settings that will need to be created for each web site. In this post we will look at the first step of the upgrade process, from Adxstudio Portals v7 to Portals Source Code/xRM Portals Community Edition.

Firstly, run a backup of your instance and read over some caveats to be aware of.

  • This example uses the Basic/Starter/Custom Portal template with a completely out-of-the-box portal.
  • The upgrade process will not perfectly upgrade all your components. It is focused on transforming data to work within the new data model. You will need to review your configurations to ensure they all function properly within the new version.
  • Custom branding will likely need to be redone.
  • New portal data from the latest version will not be imported, so you will not get any new templates or configurations from the new version, but your existing data will be maintained. If you want new portal data and configurations you can install a fresh portal in a new instance and manually copy data over.
  • If multiple web sites are present, all web sites must be upgraded at once, and therefore all must have the two site settings.
  • Code customizations need to be ported to the updated code base and tested for compatibility, OR implemented using the new out-of-box configuration based methods. NOTE: all custom code must be removed or abstracted to a Dynamics Portal Companion App approach before upgrading to the online Dynamics 365 portals.
  • If you are not using ASP.NET Identity in Adxstudio Portals, switching to it is a required change prior to removing the Adxstudio solutions, as the schema for the old forms-based authentication will be removed with them.

For this example I have a new Dynamics 365 instance, version 8.2.x with Adxstudio Portals v7.0.0025 installed using the Basic Portal starter template. Here are the solutions installed in the system currently:

To do the upgrade we are going to use the package deployer packages that come with the solution components (MicrosoftDynamics365PortalsSolutions.exe). You can download these from the Microsoft Download Center.

Within these packages there is logic that looks at the following two site settings: UpgradeWebsiteData, to determine whether it will run the data transformation that resolves the data model changes, and WebsiteLCIDforUpgrade, to determine the language to apply to the site. Below is an example of the site setting values:

Name | Example Value | Description
UpgradeWebsiteData | true | Boolean value, default of false
WebsiteLCIDforUpgrade | 1033 | LCID code for the web site. Use one of the 43 supported portal languages.

Once you have these site settings in place, we can proceed with running the PowerShell script, which will run one of the packages. Remember, if you have multiple sites you will want the settings in for all sites. You cannot upgrade one site now and another later; the sites all share the same solutions, which will be upgraded when the first site is done.
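If you prefer to create the settings through code rather than the UI, the records are plain adx_sitesetting rows. A hedged sketch of building the payloads (the adx_name/adx_value schema names are the standard Adxstudio ones, the website GUID is a placeholder, and the actual create call via the SDK or Web API of your choice is omitted):

```javascript
// Build the two adx_sitesetting records that trigger the upgrade transformation.
// websiteId is a placeholder GUID; bind it with whatever API you use to create rows.
function upgradeSiteSettings(websiteId, lcid) {
  return [
    { adx_name: 'UpgradeWebsiteData', adx_value: 'true', websiteId },
    { adx_name: 'WebsiteLCIDforUpgrade', adx_value: String(lcid), websiteId },
  ];
}

const settings = upgradeSiteSettings('00000000-0000-0000-0000-000000000000', 1033);
console.log(settings.map((s) => s.adx_name + '=' + s.adx_value));
// [ 'UpgradeWebsiteData=true', 'WebsiteLCIDforUpgrade=1033' ]
```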

Open your PowerShell and navigate to the .\PackageDeployerPackages\ folder. Within that folder execute the import script, .\Import.ps1.

This will prompt you to select your connection type, on-premises or online. Enter your full organization URL, e.g. https://orgname.crm.dynamics.com, then enter your language LCID code. Then you will be prompted to select your package. You will want to select the starter portal you already have installed. Remember, this package list and its names differ a bit from the Adxstudio Installer website gallery list. Below is a map from Adxstudio to Dynamics 365 portals templates.

Adxstudio Portals Website Gallery Template | Microsoft Dynamics 365 portals Template
Basic Portal | Starter/Custom Portal
Community Portal | Community Portal
Company Portal | Not Supported
Conference Portal | Not Supported
Customer Portal | Customer Self-Service Portal
Government Portal | Not Supported
Partner Portal | Partner Portal
Retail Portal | Not Supported
Not Supported | Employee Self-Service Portal

It is important to note that features in the portals have also changed. The Community Portal in xRM Portals Community Edition and Dynamics 365 portals online no longer contains event management functionality, and there will be other gaps such as this in other portals. Please ensure you validate using a portals comparison, like the one available on adoxio.com, or by installing the latest portals in a clean instance.

Once you have selected your package, the package deployer will start its process of importing solutions and then running the data transformation against the websites that have the site settings above.

When it has completed running you will be left with a combination of the Adxstudio solutions and the new Dynamics 365 portals ones from the Portals Source Code release. Here is an example of the combined solutions for the Basic/Starter/Custom Portal template.

You can now test the portal using the xRM Portals Community Edition code base against your instance and start to validate some of the custom configurations you put in place.

Once you have completed a quick validation, the next step is to remove all the Adxstudio solutions based on their dependency tree, usually last in, first out, so work your way down the list and delete each of them. Because the Dynamics 365 portals solutions are there, the entities will remain in the system, so none of your data should be removed either. It is always best practice to take backups at logical points of this process as well. With Dynamics 365 online this is a super easy process.

Note that if you have published web notification steps then you will need to unregister the SDK message steps where the event handler is “Adxstudio.Xrm.Plugins.Webnotifications”, otherwise you may run into issues removing some solutions.

Once you have removed all the Adxstudio solutions you should be left with just the Dynamics 365 ones. You can retest your functionality and move on to addressing any configuration and branding issues.

This process will have upgraded you to the Portal Source Code version solutions, and you can utilize xRM Portals Community Edition as the code base for your portal, deploying it to your web servers or to an Azure App Service. If you had any custom code that you need to continue to maintain, then you should have moved it into this code base and made any necessary changes to continue its existing function.

Upgrading to Portal Source Code/xRM Portals Community Edition could be where you stop for now. When you are ready to go fully online you can continue with an upgrade to the online version of Dynamics 365 portals. In an upcoming post we will look at how you can do that upgrade from Portal Source Code version or xRM Portals Community Edition to the latest Microsoft Dynamics 365 portals online service.

If you have questions related to the upgrade please post a comment and I will try to include answers in future blog posts on the topic.