
Wednesday, December 4, 2019

D365FO - Postman and OData service validation

If you work in Dynamics 365 Finance and Operations development and have a synchronous integration scenario, you have most probably used the OData service, along with the Postman app, to validate your scenarios.

A short intro: Postman is a free tool, available for download, which can be used as a test client for API development. It is available as a native app (download required) and also as a Chrome extension.

To give some context around APIs, the different types of APIs, D365FO, the different integration approaches in D365FO, and the OData service (format=JSON) - I have tried to illustrate the relationships in the picture below.


Disclaimer: the above illustration obviously doesn't contain all the relevant information; it is only meant to set some context.

In this blog post I would like to share how easy it is to work with Postman against D365FO OData endpoints. General information about integration as such can be found in various sources already available online.

Let's begin with a screenshot of the Postman native app.


For the best use of Postman, it is important to understand the highlighted topics in the above screenshot: 

Collections - as the name suggests, a collection holds a set of requests. These requests can be organized in a folder structure as shown above, and can contain GET, POST, or any other request types you wish.
Having a collection defined lets you run those requests on demand, any number of times, and against different environments.

Environments (top-right corner) - here you can define environment-specific variables so that you can run your collections/requests efficiently. This also makes it easier to keep environment-specific (DEV, Test, UAT..) client IDs and secrets.
Below is a screenshot of the variables I have defined for my environment, just as a reference.

Requests - in the example screenshot above you can see that I have used a variable in my request:
GET      {{resource}}/data/Documents
By doing so, I make use of the environment variables defined and can run the OData endpoint without making any changes across environments.
For my example here, {{resource}} resolves to https://mydevbox.cloudax.dynamics.com. So when I push the Send button, the actual request becomes
https://mydevbox.cloudax.dynamics.com/data/Documents -- a normal OData endpoint to get data from Dynamics.
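For reference, a Postman environment export for such variables could look roughly like the sketch below. This is a simplified, assumed layout: only the resource value comes from my example; the other variable names (tenant_id, client_id, client_secret) are placeholders I use for authentication, and a real export contains more metadata.

{
  "name": "DEV",
  "values": [
    { "key": "resource",      "value": "https://mydevbox.cloudax.dynamics.com", "enabled": true },
    { "key": "tenant_id",     "value": "<your-AAD-tenant-id>",                  "enabled": true },
    { "key": "client_id",     "value": "<your-app-client-id>",                  "enabled": true },
    { "key": "client_secret", "value": "<your-app-client-secret>",              "enabled": true }
  ]
}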

Pre-request Scripts - Postman has several nice features, and the most useful one for me is pre-request scripts. You can define them at the request level, the folder level, (or) the collection level. A script placed here runs before the request is sent out.
One of the best uses of this is to automatically generate the bearer token, instead of manually entering the username and password to authenticate towards the source system. Plenty of blogs are available online that explain this and even provide samples; a minimal sketch follows below.
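As a reference, here is a minimal sketch of such a pre-request script, using Postman's pm API and the AAD client-credentials flow against the v1 token endpoint. The variable names (tenant_id, client_id, client_secret, resource, bearer_token) are my own assumptions matching the environment sketch above, not something Postman or Microsoft prescribes:

// Pre-request script: fetch an AAD bearer token and store it in the environment.
// Assumes the environment variables tenant_id, client_id, client_secret and resource exist.
pm.sendRequest({
    url: 'https://login.microsoftonline.com/' + pm.environment.get('tenant_id') + '/oauth2/token',
    method: 'POST',
    header: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: {
        mode: 'urlencoded',
        urlencoded: [
            { key: 'grant_type',    value: 'client_credentials' },
            { key: 'client_id',     value: pm.environment.get('client_id') },
            { key: 'client_secret', value: pm.environment.get('client_secret') },
            { key: 'resource',      value: pm.environment.get('resource') }
        ]
    }
}, function (err, res) {
    if (err) { console.log(err); return; }
    // The request itself can then use the token via its Authorization header,
    // e.g. Authorization: Bearer {{bearer_token}}
    pm.environment.set('bearer_token', res.json().access_token);
});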

Output - and finally, when we hit the Send button on the request and everything is correctly configured, a JSON output is presented in the response body window. You can also see the "response code", "time taken" and "size of the output" in the same window. See below for reference.



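If you want Postman to verify these values automatically, a few assertions can be placed in the request's Tests tab. Here is a minimal sketch using Postman's built-in pm API; the 5-second response-time threshold is an arbitrary assumption of mine:

// Tests tab: basic checks on the OData response.
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});
pm.test("Response time is acceptable", function () {
    pm.expect(pm.response.responseTime).to.be.below(5000); // milliseconds; arbitrary threshold
});
pm.test("Body contains an OData value array", function () {
    var body = pm.response.json();
    pm.expect(body.value).to.be.an("array");
});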
Hopefully this helps. Please comment below if you have any questions or suggestions; I will try to respond asap. Cheers.

Tuesday, December 3, 2019

Microsoft 365: Admin center

In my previous post, I tried to illustrate how Microsoft licensing works. Of course, a lot more information is already shared by Microsoft in Docs - have a look there for the latest. In this post I illustrate how to add a new subscription from the Microsoft 365 admin center, otherwise known as the Office 365 admin center (or) the Dynamics 365 admin center - all of them will most probably lead you to https://admin.microsoft.com

Basically, to assign one of your organization's users in a particular tenant to a SaaS-based cloud offering, you need to go through the admin center. The organization's AD administrator has access to this and can delegate it to others by creating/updating the AD user accounts accordingly.

Steps to add a subscription: 

1. Sign in to the Microsoft 365 admin center (https://admin.microsoft.com) with your global administrator account. The Home page will look something like this:

2. From the left navigation of the admin center home page, click Billing, and then Subscriptions (or) Licenses (this leads to the Purchase services page, for me at least 😉 - a short disclaimer, as I have seen Microsoft change the UI several times)

3. On the Purchase services page, purchase your new subscriptions.

The admin center assigns the organization and Azure AD tenant of your Office 365 subscription to the new subscriptions for SaaS-based cloud offerings.

To add an Azure subscription with the same organization and Azure AD tenant as your Office 365 subscription:

1. Sign in to the Azure portal (https://portal.azure.com) with your Office 365 global administrator account.

2. In the left navigation, click Subscriptions, and then click Add.

3. On the Add subscription page, select an offer and complete the payment information and agreement.

If you purchased Azure and Office 365 subscriptions separately and want to access the Office 365 Azure AD tenant from your Azure subscription, see the instructions in Microsoft Docs. Hope this helps. 

Monday, December 2, 2019

Microsoft 365: Licensing illustration

Now with all the cloud offerings Microsoft provides to its customers/users, it has become a little more important to understand a few terms around licensing. The primary ones for me were:
  1. Organizations 
  2. Subscriptions
  3. Licenses
  4. User accounts
  5. and Tenants
An organization can be any business entity. Let's take Walmart as our example of an organization.

A subscription is an agreement that Walmart makes with Microsoft; the terms, the agreed price, the offers/rebates from Microsoft, and how long the subscription is valid are all covered under it.

Licenses are needed on top of a subscription (sometimes a subscription comes with a certain number of free/included licenses). Only when a license under a particular subscription is allocated to a Walmart user can he/she use the cloud offering.
That also explains what user accounts are - users and user groups within Walmart's Active Directory are the user accounts, and they are needed for assigning a license (under a subscription).

And finally the tenant: this determines the regional location that houses the servers providing the cloud services which are part of the purchased subscription.


Walmart's AAD tenants could be spread across the globe. Let's say Walmart has its head office in the western USA - the head office users would then be held in TENANT2: West USA. And suppose Walmart opens a new branch office in the Nordics; they would get a new Azure AD tenant instance containing the organization's user accounts for the Nordics, and all the cloud services that are part of Walmart's subscription agreement with Microsoft would also become available from this new TENANT1: North Europe.

Another important thing to note is that all user accounts for cloud offerings must be held in Azure Active Directory; any local user accounts in legacy Active Directory Domain Services (AD DS) will have to be synced with AAD.

Microsoft Dynamics 365 Finance and Operations Subscription: 
I am not providing details specifically about D365FO here for a reason - there have been a lot of changes in Dynamics licensing from July onwards - so it is best to get the latest from Microsoft.

However, if you (or) your customer as an organization has a D365FO subscription, the minimum user license count is 20, and that includes the following:
  1. FastTrack onboarding support/meeting
  2. One PRODUCTION environment
  3. One Tier 2 UAT environment
  4. One Tier 1 DEV / BUILD / TEST environment 
However, the production environment only becomes available later in the implementation project timeline, after the readiness assessment, which includes:
  1. Uploading the "Usage profile" to Microsoft
  2. Code and configuration readiness (discussion with MS)
  3. Customer's UAT sign-off
Tip: production is always sized by Microsoft, and you can influence the sizing decision by providing the most accurate usage profile possible (peak-hour transaction numbers and much more) and by providing the output (telemetry data) of your performance testing on a production-like (maybe Tier-5) environment. So please plan for these two in your project timeline.

I will write a separate blog post to explain the new approach for Dynamics subscriptions and the subscription model. That's it for now.

D365FO - Copy of databases across environments

When working on Dynamics 365 Finance and Operations implementations, moving databases across environments is an obvious step. We did this in the previous version (AX2012) as well, where we had to change the environment-related settings manually after the database copy.

Now with D365FO, we can end up in different scenarios based on the tier level of the environments involved. I have tried to identify the four major scenarios here and share the approach I have used.

  1. Copy of DB from Tier 1 env. to another Tier 1 env.
  2. Copy of DB from Tier 1 env. to a Tier 2 env.
  3. Copy of DB from Tier 2 env. to another Tier 2 env.
  4. Copy of DB from Tier 2 env. to a Tier 1 env.
The picture below illustrates the different types and the corresponding solutions at a high level (sorry for my not-so-good handwriting 😅)

#1 Tier 1 -> Tier 1 DB copy: 

This scenario is quite common if you have multiple Tier-1 environments in your ongoing project, such as DevBoxes, a configuration environment, or even an internal test environment. You might find yourself in a situation where the latest configuration sits in a config/test environment and you want it moved to a DevBox so a developer can test an ongoing development with the appropriate configuration and master data.

In such scenarios, a simple SQL Server backup from the source environment and a restore to the target environment, followed by a full database synchronization, would suffice. There are several blogs online that can help you with SQL backup and restore, so I am not going into further detail here.

A tip, however: you should run the Environment reprovisioning tool available in LCS > Shared library to get all the settings updated accordingly. If it is a DevBox, you can do without this step as well.



You can just use LCS > Maintain > Apply updates > select the Environment reprovisioning tool from the Asset library.
The approach is different if you have an on-premises DevBox - again, a lot of material is available online already.


#2 Tier 1 -> Tier 2 DB copy: 

This scenario occurs when you have been maintaining a Tier-1 environment with the golden configuration and would like to import it into your UAT or ultimately your PRODUCTION environment. The process has always been quite complex, because Tier-1 environments host their Dynamics database on SQL Server, whereas Tier-2 and above environments host theirs on Azure SQL.
The database settings in Azure SQL are quite different from those of SQL Server; for example, there won't be any tempDBs associated, and there won't be any Windows/AD users associated with Azure SQL.

So if you intend to copy a SQL Server DB to an Azure SQL DB, the process involves running several scripts to make the mandatory changes. Several blogs are available describing this process.

That said, a tip is to challenge the cost-vs-risk assessment when choosing a Tier-1 environment for the golden configuration. Given that Microsoft no longer allows DB access on Tier-2 and above environments, running the scripts would involve the Microsoft DSE team. So go for a Tier-2 environment for the golden configuration to avoid the hassle.


#3 Tier 2 -> Tier 2 DB copy: 

This scenario happens when moving databases from a test environment to another test (or) UAT environment. This is pretty straightforward, however, given the tools available via LCS.

As described in the image above, it is as simple as selecting Move database from LCS > Maintain on the source environment and exporting the database to a .bacpac file, which is stored in the Asset library automatically.
Then go to the target environment and, again from LCS > Maintain > Move database, import by choosing the saved .bacpac file.

The tip here is to ensure that the database size is less than 200 GB for the best results. Microsoft has now improved the scripts to handle databases larger than 200 GB, of course, but the import then takes a lot of time as well.


#4 Tier 2 -> Tier 1 DB copy: 

This scenario occurs when you want to get either PRODUCTION data or, more commonly, UAT data into a DevBox for troubleshooting purposes. The process, as shown in the picture above, was to create a copy of the Azure SQL database and run some scripts on top of it. However, I haven't tried this myself; I will write more about it in another post.

A tip for PRODUCTION database troubleshooting is to follow the step-by-step method in MS Docs.

That's it for today. Please share your comments/feedback, if any. Thanks and Happy D365ing 😄