
Tuesday, January 9, 2024

Hello World, Happy New Year :)

I would like to begin by wishing you a very Happy New Year 2024 😊

It’s been over two years since I last wrote a blog. With the rapid pace of technological advancements and the shift towards Generative Artificial Intelligence (Gen AI), I felt inspired to return to my old policy of "Learn by doing.." and start blogging again. Although I’ve been away from this space for a while, I pushed myself to write this very generic blog - to get started, to motivate and push myself, and in the hope that I gather the time and courage needed to write some good material.

I’m sure many of you are familiar with FOMO, or the Fear of Missing Out. With so many new skills to learn and technologies to explore, I’ve been feeling FOMO for quite some time. However, I’ve recently realized that I should rather embrace JOMO, or the Joy of Missing Out. Instead of worrying about all the details - and now that Microsoft has released a co-pilot for almost everything to do half of my job - I’m going to focus on topics I can really sink my teeth into, joyfully exploring one topic at a time. And I hope to share while I do so - wish me luck 😅

Just like most of you, I have been exploring and utilizing Generative AI quite a bit lately. So, why don't we talk about it today?

So, what is Generative AI? 
Why don't we ask one of the tools available for exactly this - Bing Chat.


Ok - when you post a question, Bing Chat immediately triggers the GPT-4 engine from OpenAI, does a search through the internet and writes a response back. 
Not only that, Bing Chat also provides the reference links it used to generate the response.

So, in simple terms - Generative AI is AI capable of generating new and original content based on what it has learned from various sources. The generated content could be just plain text, as shown above, (or) it could be images and videos too - a sample is shown in the above screenshot as well. Not only this, GenAI is widely used to generate programming code too; GitHub Copilot is a prominent example.

OpenAI says this is only a minor step in their journey to achieve Artificial General Intelligence (AGI), otherwise known as full AI. OpenAI and many other similar tech companies want to create artificial intelligence that can act like a human: it should be able to take up any mental or intellectual task which a human can do.

Ok, before I forget: in the above screen grab, the question I raised to Bing Chat (the blue bubble, top right) is called a prompt. There are much better ways to ask a question to a GenAI model - the better you engineer your prompt, the better the response coming your way. We even have a term for the skill of asking the right question to a GenAI model; it is called Prompt Engineering 😁

So, now let's try to provide some context to Bing Chat and ask about Artificial General Intelligence and see what it has to share. 


As you can notice, the model changes its tone, avoids writing overly long sentences and even adds a smiley automatically. This is where a human touch has been added directly into the model. This was all taken care of by Microsoft while training the model in the first place - so all you have to do is set the scene and wait.

There are multiple ways to engineer your prompt; some key approaches are: 
  1. Specifying the Audience - like I did above
    Example: But imagine you are speaking to a 15-year-old kid.

  2. You can also set the Tone in advance
    Example: Be caring and empathetic with your response. 

  3. Also the most commonly used (by me, at least) is Personas.
    Example: Act like an X++ developer and suggest how to....

So far so good.. right!! 

However, not so much in reality. The biggest risk with Generative AI models as of today is Hallucinations - a term used to describe a defect in GenAI models: the model can create responses that might look very convincing (it might even present them to you as facts) but are entirely fictional, invented by the GenAI to prove its point and respond to your question. This is a bit scary, ain't it? 

This is the reason why GenAI models are primarily suggested to be used as co-pilots / supporting partners - meaning someone knowledgeable has to review the responses and take support from co-pilots for documenting, drafting templates, brainstorming new ideas and other mundane tasks you can think of, getting them done by GenAI models.

Another concern is people misusing GenAI models, even training GenAI models to create deep fakes - manipulating existing audio, images and videos to create pretty realistic clones of them with added corruptive content. So, I would say we need to learn more in order to be better prepared, while striking the balance of using co-pilots and GenAI models to reduce the work effort in our jobs and daily tasks.

I will summarize today with a prompt illustrating the various components (the persona to use while responding, the recommended length of the response, the tone to use, and the intended audience for the response).
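
For reference, a prompt of this shape (illustrative only - not the exact one from the screenshot) would combine all four components: 

    Example: Act like a poet (persona). In at most six short lines (length), with a warm and hopeful tone (tone), write a New Year wish for IT professionals (audience).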


You see, there is a minor touch of rhyming words without using too complex English, and it also shared what it thinks would be the most prominent outcome of the year. It has probably hallucinated based on the previous conversations - you never know, but let's raise the glass nevertheless 🥂🥂🥂
 
 

Friday, December 3, 2021

Azure Data Lake : 101

We are living in a digital world where data is everything, and the ability to process it and generate insights that enable business decision-making is the absolute superpower you want to have.

And in order to process data into meaningful information, it is a good practice to have a place to store all kinds of data - and Microsoft provides one such storage service with Azure Data Lake.

So What is Azure Data Lake Storage? 

Azure Data Lake Storage (ADLS) can be literally compared to a large lake/pond, where rainwater passing through various terrains gets collected. Irrespective of whether a water stream passing through fields is muddy (or) a water stream passing through a cluster of rocks is clean - a lake takes in the water as it comes.

Just like that, ADLS can be considered a repository with the capacity to hold large amounts of data in its native, raw format.

Data lakes can be terabytes or petabytes in size. Data can come from multiple heterogeneous sources (different in nature). Structured, semi-structured or unstructured data - all of it can be stored in a data lake in its original, untransformed state.

Advantages of a Data lake: 

  1. Faster than traditional ETL tools 
  2. Data is never thrown away
  3. Users have the possibility to query and explore the data
  4. More flexible than a traditional Data warehouse, as there is no demand to ingest only structured data

Then What is Azure Data Lake Storage Gen2?

ADLS Gen2 converges the capabilities of ADLS Gen1 with Azure Blob Storage. So it basically has a top-up: it provides file system semantics, file-level security, and better scalability.

All these additional capabilities of ADLS Gen2 are built on Azure Blob Storage - thereby supporting low-cost data storage, tiered storage and higher availability (Blob Storage disaster recovery capabilities are inherited).
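
As a small illustration of those file system semantics, here is a minimal Python sketch (assuming the azure-storage-file-datalake package; the account URL, key, file system and path are placeholders) that lands a raw file into a Gen2 file system:

    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder account details - replace with your own storage account
    service = DataLakeServiceClient(
        account_url="https://<storage-account>.dfs.core.windows.net",
        credential="<account-key>",
    )

    # In ADLS Gen2, a "file system" is the container and directories are real objects
    file_system = service.get_file_system_client(file_system="raw")
    file_client = file_system.get_file_client("sales/2021/orders.csv")

    # Land the file in its original, untransformed state
    with open("orders.csv", "rb") as data:
        file_client.upload_data(data, overwrite=True)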

A lot more details on data lakes can be found in the links shared below.

Also it is important to understand that, 

  • A data lake is usually the first stop in the data flow, so further processing of the raw data needs to be done utilizing big data technologies. 
  • Dumping raw data into a data lake comes with a responsibility to include governance and to ensure the quality of metadata. Data discovery and analytics capabilities should be developed in order to make proper use of the data stored in the data lake.

The Azure Data Lake add-in for Microsoft Dynamics 365 Finance and Operations is also now generally available, depending on where you are placed on our planet. This add-in basically helps push data out from D365FO into the data lake based on the configuration and setup. Installation details can be found in https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/configure-export-data-lake

And once installed, further setup details can be found in https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/finance-data-azure-data-lake

It is important to understand that this is a fairly new feature. Microsoft is promising big on this approach going forward, so we can anticipate enhancements in the near future. More details and a better overview can be found in https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/azure-data-lake-ga-version-overview


Microsoft has detailed information on Docs regarding data lakes - https://docs.microsoft.com/en-us/azure/architecture/data-guide/scenarios/data-lake

Read more on ADLS Gen1 in https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-overview

Details on ADLS Gen2 in https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-introduction

If you would like to try out setting up Azure Data Lake yourself, try this blog - it helped me set up mine - https://allaboutdynamic.com/2020/07/09/entity-store-in-azure-data-lake-d365-finance-operations/amp/

If you are in a location where you don't have the feature enabled (or) if you would like to understand the details of what the add-in does, try and explore this GitHub link - https://github.com/microsoft/Dynamics-365-FastTrack-Implementation-Assets/tree/master/Analytics/AzureDataFactoryARMTemplates/SQLToADLSFullExport

However, if you are at the very beginning of understanding all these concepts, I would always recommend going through MS Learn - https://docs.microsoft.com/en-us/learn/modules/introduction-to-azure-data-lake-storage/

Happy exploring. Good luck 😄


Wednesday, December 1, 2021

Azure Data Factory : 101

If you have ever worked with a Data warehousing solution, you would probably say that the most important part of the job is to ensure proper Data ingestion (Data loading). If you lose any data at this point, then the resulting information (reports) will end up inaccurate, failing to represent the facts on which Business decisions are made.

Microsoft Azure provides several services which you can use to ingest data and one of them is Azure Data Factory.

So What is Azure Data Factory? 

Azure Data Factory (ADF) is a Platform-as-a-Service offering from Microsoft. The primary purpose of this service is to do Extract, Transform and Load (ETL) or Extract, Load and Transform (ELT), and this is done using the concept of pipelines. There are two types of pipelines to begin with: data movement pipelines (Extract & Load) and data transformation pipelines (Transform). And being a PaaS service, ADF automatically scales out based on the demand placed on these pipelines.
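
To make the pipeline concept a bit more concrete, here is a minimal Python sketch that triggers a pipeline run through the public ADF REST API (the createRun operation); the subscription, resource group, factory and pipeline names, as well as the bearer token, are placeholders:

    import requests

    # Placeholder identifiers - replace with your own
    subscription_id = "<subscription-id>"
    resource_group = "<resource-group>"
    factory_name = "<data-factory-name>"
    pipeline_name = "<pipeline-name>"
    token = "<azure-ad-bearer-token>"  # e.g. from: az account get-access-token

    # createRun starts a pipeline run and returns its run ID
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}/pipelines/{pipeline_name}/createRun"
        "?api-version=2018-06-01"
    )
    response = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    print("Pipeline run started:", response.json()["runId"])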

ADF is ideal for working with structured as well as unstructured data. It allows you to load raw data from many different sources, both on-premises and in the cloud.

Like many other products from Microsoft, I would call ADF a collection of several tools packaged together:

  1. For ease of understanding
  2. To eradicate unnecessary maintenance work
  3. To streamline the approach to be taken 

Microsoft has detailed information in Docs and you can probably start digging from https://docs.microsoft.com/en-us/azure/data-factory/introduction 

If you have the necessary details and would like to get started with Azure Data Factory already - you can start in https://azure.microsoft.com/en-us/services/data-factory/

An idea of the pricing details can be found in https://azure.microsoft.com/en-us/pricing/details/data-factory/data-pipeline/

And if you are a hands-on person, there is a GitHub lab tutorial with all needed details in https://github.com/kromerm/adflab 

If you would like to have a poster on your wall reminding you of everything about Azure Data Factory - then feel free to head to https://aka.ms/visual/azure-data-factory

And probably good to know: you can utilize up to 5 free low-frequency activities with Azure Data Factory by signing up for a free Azure account. More details in https://azure.microsoft.com/en-us/free/free-account-faq/

And if you are into Learn from Microsoft Docs - I would recommend going through https://docs.microsoft.com/en-us/learn/modules/explore-azure-synapse-analytics/ to get a better perspective on large-scale data analytics.

Sunday, September 20, 2020

How to: copy databases (for D365FO tests)

First post in the year 2020 - I am surprised myself that I haven't managed to make a single post this year, given that I am, like most of you, working from home for the majority of the year.

Anyways, here comes a small tip about copying databases. With Dynamics 365 for Finance and Operations (now a.k.a. Dynamics 365 Finance, Dynamics 365 Supply Chain Management and also Dynamics 365 Commerce) you might need to perform several database operations during the project implementation phases. You might also want to take a backup of the existing database from a recovery perspective.

Below is what I use to support copy actions on databases.

How to make a quick copy of AXDB hosted in Azure SQL (Tier 2 and above)

  1. To begin, connect to the Azure SQL database from within the primary AOS server (AOS 01 - because this is where you would get maximum permissions to perform actions on the SQL database)
  2. You should use the credentials from LCS to connect to the Azure SQL database from SSMS in AOS 01
  3. Open the SQL query window
  4. And use the below command to perform the copy action

    CREATE DATABASE [Name-for-New-Copy] AS COPY OF [Database-name-of-the-Source-DB(AXDB)];
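    -- Note: run this against the master database of the server; the copy itself continues asynchronously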

  5. Once you execute this command, you will get a message stating that the action is finished - and SSMS is usually quick to give you this message
  6. However, it is important to understand that this is only the trigger; the copy continues in the background
  7. You can confirm by looking at the database size of the [New copy] (or query the progress directly - see the sketch below)
  8. I usually leave it overnight
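
If you would rather query the progress than eyeball the database size, here is a minimal Python sketch (assuming pyodbc and the ODBC Driver 17 for SQL Server are available; the server name and credentials are placeholders - use the ones from LCS) that reads the copy progress from sys.dm_database_copies in the master database:

    import pyodbc

    # Connection details are placeholders - use the Azure SQL server and credentials from LCS
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<your-server>.database.windows.net;"
        "DATABASE=master;UID=<user>;PWD=<password>"
    )

    # sys.dm_database_copies lists in-flight database copies and their progress
    row = conn.execute(
        "SELECT percent_complete FROM sys.dm_database_copies"
    ).fetchone()
    print(f"Copy in progress: {row.percent_complete:.0f}%" if row else "No copy running")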

How to make a quick copy of AXDB hosted in SQL Server (Tier 1 and/or Cloud hosted environments)

  1. The easiest way is always to use the functions available in SSMS UI
  2. Connect to SQL database using SSMS in the server
  3. Usually credentials are auto-populated with the User details
  4. Right click on the [Source] database > Tasks > Copy Database


  5. Just follow the wizard 
  6. However, it is important to understand that SQL Server Agent needs to be active for this to work
  7. Another tip: if you are working with a Microsoft-managed Tier 1 environment, the easiest way to get SQL Server Agent running is to go to Services.msc and start the service from there
  8. Once you are all set, click Finish on the wizard and you should have a copy of your database soon enough

Hope this helps. Happy DAXing.

 

Wednesday, December 4, 2019

D365FO - Postman and OData service validation

If you are in Dynamics 365 Finance and Operations development and have a synchronous integration scenario, you have most probably used the OData service and, along with it, the Postman app to validate your scenarios.

A short intro: Postman is a free tool, available for download, which can be used as a test client for API development. It is available as a native app (needs downloading) and also as a Chrome extension.

Just to give you some context around APIs, the different types of APIs, D365FO, the different integration approaches in D365FO and the OData service (format=JSON) - I have tried to illustrate the relationship in the picture below.


Disclaimer: Obviously, the above illustration doesn't contain all the relevant information, but might only help in setting some context.    

Now, in this blog post I would like to share the ease of working with Postman against D365FO OData endpoints. Information regarding integration as such will have to be obtained from the various sources already available online.

Let's begin with a screenshot of the Postman native app.


For the best use of Postman, it is important to understand the highlighted topics in the above screenshot: 

Collections - as the name suggests, a collection holds a set of requests. These requests can be organized in a folder structure as shown above, and could contain GET/POST or other methods as you wish. 
Having a collection defined helps you run those requests on demand, any number of times, and against different environments.

Environments (top-right corner) - here you can define environment-specific variables so that you can be efficient in running your collections/requests. This also makes it easier to have environment-specific (DEV, Test, UAT..) client IDs and secrets.
Below is a screenshot of the variables I have defined for my environment, just as a reference.

Requests - In the above example screenshot, you can see I have used a variable in my request: 
GET :      {{resource}}/data/Documents 
By doing so, I make use of the environment variables defined and can still run the OData endpoint without making any changes across other environments. 
For my example here, {{resource}} would resolve to https://mydevbox.cloudax.dynamics.com. So when I push the Send button, the actual request becomes 
https://mydevbox.cloudax.dynamics.com/data/Documents - which is a normal OData endpoint to get the data from Dynamics.

Pre-request Scripts - Postman has several nice features; one of them - the most useful for me - is pre-request scripts. You can define them at the request level, folder level (or) collection level. A script placed here runs before the request is sent out. 
One of the best uses of this is to automatically generate the bearer token, instead of manually entering the username and password to authenticate towards the source system. Lots of blogs are available online that explain this and even provide samples.
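
As a reference outside Postman, here is a minimal Python sketch of the same idea (the tenant, client ID, secret and environment URL are placeholder values for an Azure AD app registration; it uses the standard client-credentials flow that D365FO OData endpoints accept):

    import requests

    # Placeholder values - use your own Azure AD app registration details
    TENANT_ID = "<tenant-id>"
    CLIENT_ID = "<client-id>"
    CLIENT_SECRET = "<client-secret>"
    RESOURCE = "https://mydevbox.cloudax.dynamics.com"  # the D365FO environment URL

    # Request a bearer token via the client-credentials grant
    token_response = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": RESOURCE,
        },
    )
    token_response.raise_for_status()
    token = token_response.json()["access_token"]

    # Call the same OData endpoint as in the example above
    response = requests.get(
        f"{RESOURCE}/data/Documents",
        headers={"Authorization": f"Bearer {token}"},
    )
    print(response.status_code)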

Output - And finally, when we hit the Send button on the request and everything is correctly configured, a JSON output is presented in the output body window. You will be able to see the response code, the time taken and the size of the output in the window. See below for reference.



Hopefully this helps. Please comment below if you have any questions/suggestions - I will try to respond asap. Cheers.

Tuesday, December 3, 2019

Microsoft 365: Admin center

In my previous post, I tried to illustrate how Microsoft licensing works. Of course, a lot more information is already shared by Microsoft in Docs - have a look there for the latest from Microsoft. In this post I try to illustrate how to add a new subscription from the Microsoft 365 admin center, otherwise called the Office 365 admin center (or) the Dynamics 365 admin center - all of them will most probably lead you to https://admin.microsoft.com

Basically, to assign one of your organization's users in a particular tenant to a SaaS-based cloud offering, you need to go through the admin center. The organization's AD administrator has access to this and can delegate it to others by creating/updating the AD user accounts accordingly.

Steps to add a subscription: 

1. Sign into the Microsoft 365 admin center (https://admin.microsoft.com) with your global administrator account. The Home page would look something like below: 

2. From the left navigation of the admin center home page, click Billing, and then Subscriptions (or) Licenses (this leads to the Purchase services page, for me at least 😉 - a short disclaimer, as I have experienced the UI being changed by Microsoft several times)

3. On the Purchase services page, purchase your new subscriptions

The admin center assigns the organization and Azure AD tenant of your Office 365 subscription to the new subscriptions for SaaS-based cloud offerings.

To add an Azure subscription with the same organization and Azure AD tenant as your Office 365 subscription:

1. Sign in to the Azure portal (https://portal.azure.com) with your Office 365 global administrator account.

2. In the left navigation, click Subscriptions, and then click Add

3. On the Add subscription page, select an offer and complete the payment information and agreement.

If you purchased Azure and Office 365 subscriptions separately and want to access the Office 365 Azure AD tenant from your Azure subscription, see the instructions in Microsoft Docs. Hope this helps. 

Monday, December 2, 2019

Microsoft 365: Licensing illustration

Now, with all the cloud offerings which Microsoft provides to their customers/users, it has become a little more important to understand a few terminologies around licensing. The primary ones for me were:
  1. Organizations 
  2. Subscriptions
  3. Licenses
  4. User accounts
  5. and Tenants

An Organization can be any business entity. Let's take Walmart as our example organization.

And a Subscription is an agreement that Walmart has made with Microsoft; the terms, the agreed price, the offers/rebates from Microsoft and how long the subscription stays valid - all such elements are covered under it.

Now, Licenses are needed on top of a subscription (sometimes a subscription comes along with a certain number of free/included licenses). Only when a license under a particular subscription is allocated to a Walmart user can he/she use the cloud offering. 
That also explains what User accounts are - users and user groups within Walmart's Active Directory are the user accounts, and they are needed for assigning a license (under a subscription).

And finally, the Tenant - this is what determines the regional location that houses the servers providing the cloud services which are part of the purchased subscription.


The AAD tenants for Walmart could be spread across the globe. Let's say Walmart has their head office in the West USA - they would have the head office users held in TENANT2: West USA. And suppose Walmart is opening up a new branch office in the Nordics; they would then get a new instance of an Azure AD tenant containing the organization's user accounts for the Nordics. All the cloud services which are part of Walmart's subscription agreement with Microsoft would then also be available from this new TENANT1: North Europe.

Another important thing to note is that all the user accounts for cloud offerings are to be held in Azure Active Directory; if there are any local user accounts using legacy Active Directory Domain Services (AD DS), those will have to be synced with AAD.

Microsoft Dynamics 365 Finance and Operations Subscription: 
I am not providing many details specifically about D365FO here for a reason - there have been a lot of changes in the licensing for Dynamics from July onwards - so it is best to get the latest from Microsoft.

However, if you (or) your customer as an organization has a D365FO subscription, then the minimum user license count is 20. And that includes the following:
  1. FastTrack onboarding support/meeting
  2. One PRODUCTION environment
  3. One Tier 2 UAT environment
  4. One Tier 1 DEV / BUILD / TEST environment 

However, the Production environment only becomes available later in the implementation project timeline, after the Readiness assessment, which includes:
  1. For you to upload the "Usage profile" to Microsoft
  2. Code and Configuration readiness (discussion with MS)
  3. Customer's UAT signed off

Tip: Production is always sized by Microsoft, and the way you can influence the sizing decision is by providing the most accurate usage profile (which includes peak-hour transaction numbers and much more) and by providing the output (telemetry data) of your performance testing on a production-like (maybe Tier 5) environment. So please make appropriate plans for these two in your project timeplan.

I will write a separate blog to explain the new approach for Dynamics subscriptions and the subscription model. That's it for now.