Build a Personal 1:1 Conversational Bot With Microsoft Teams

Introduction

 
A conversational bot lets users interact through multiple forms of content, such as text, adaptive cards, and forms, from the Microsoft Teams client.
 
A bot in Microsoft Teams can be part of a one-to-one conversation, a group chat, or a channel in a team. Each scope has its own opportunities and challenges.


 

Conversational Bots 

 
A bot conversation is a series of messages exchanged between users and the bot. There are three types of conversations within the Microsoft Teams client:
  • Personal conversation or personal chat: a one-to-one chat between a user and the bot.
  • Group conversation or group chat: a conversation between the bot and two or more users.
  • Teams channel: available to all members of the channel; any team member can converse with the bot. 
Bots behave differently depending on the type of conversation, for example:
  • A personal (one-to-one) conversation doesn't need an @mention; every message the user sends is routed directly to the bot.
  • A Teams channel or group conversation requires users to @mention the bot to bring it into the conversation (a simple way to handle both cases in code is shown below). 
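Because channel and group messages arrive with the bot's @mention embedded in the activity text, it helps to strip that mention before matching commands, so the same logic works in every scope. Below is a minimal TypeScript sketch, assuming the TeamsActivityHandler-based bot generated later in this article (the "hello" command is only an example):

    // Inside the bot's constructor, where the onMessage handler is registered.
    // TurnContext comes from the 'botbuilder' import shown later in this article.
    this.onMessage(async (context: TurnContext, next: () => Promise<void>): Promise<void> => {
      // removeRecipientMention strips the bot's own @mention (if any) from the incoming text.
      const cleaned = TurnContext.removeRecipientMention(context.activity);
      const text = (cleaned || context.activity.text || "").trim().toLowerCase();

      if (text.startsWith("hello")) {
        // Reached with "hello" in a personal chat, or "@BotName hello" in a channel or group chat.
        await context.sendActivity("Hello!");
      }
      await next();
    });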
Prerequisites
 
To build a Microsoft Teams bot, the following tools and services need to be installed or available in your environment:
  1. Microsoft Azure Subscription
  2. Office 365 Tenant
  3. Microsoft Teams enabled on the Office 365 tenant, with custom app upload (sideloading) allowed
  4. Node.js (V10.* or higher)
  5. NPM (V6.* or higher)
  6. Gulp (V4.* or higher)
  7. Yeoman (V3.* or higher)
  8. Yeoman generator for Microsoft Teams
  9. Visual Studio Code

Let's start with Azure Bot

 
Creating a Microsoft Teams bot requires two activities:
  1. Create a Microsoft Azure bot.
  2. Add the bot to a Microsoft Teams code base. 
Step 1 - Create Microsoft Azure Bot
  1. Browse to https://portal.azure.com and sign in with your work or school account.
  2. Click "+Create Resource"
  3. Select "AI + Machine Learning"
  4. Select "Web App Bot"


Step 2 - Complete all required information and click Create to provision the bot
  1. Bot handle: the unique name of your bot.
  2. Subscription: select a valid available subscription.
  3. Resource group: select an appropriate resource group or create a new one as required. 
  4. Location: select your closest Azure region or preferred location.
  5. Pricing tier: select the required pricing tier, or F0 (free) for a POC.
  6. App name: leave as-is for this demo.
  7. App service plan: leave as-is for this demo.
  8. Bot template: leave as-is for this demo.
  9. Application Insights: select Off for this demo.
  10. Microsoft App ID and password: leave as-is for this demo.
  11. Click Create to provision the bot. 

Step 3 - Enable Microsoft Team Channel for your Bot
 
In order for the bot to interact with Microsoft Teams, you must enable the Teams channel.

Once the process completes, Microsoft Teams and Web Chat should both appear in your bot's channel list.


Step 4 - Create a Microsoft Teams App
 
In this section, we are going to create a Node.js project.
  • Open the Command Prompt and navigate to the desired directory to create a project.
  • Run the Yeoman generator for Microsoft Teams:

    yo teams  
  • The Yeoman generator will ask a series of questions about the solution (name, title, which components to include, and so on); when prompted for components, choose a bot.

Step 5 - Update Default Bot Code
 
Here we will add a response for the 1:1 (personal) conversation. 
 
Browse to the InteractiveChatBotBot file.

Step 6
 
To implement this functionality, locate and open the ./src/app/interactiveChatBotBot/InteractiveChatBotBot.ts file and make the following changes to the InteractiveChatBotBot class:
 
Add the code below after the existing import statements: 
    import * as Util from "util";
    const TextEncoder = Util.TextEncoder;
Add a MessageFactory reference to the existing botbuilder import:
    import {
      StatePropertyAccessor,
      CardFactory,
      TurnContext,
      MemoryStorage,
      ConversationState,
      ActivityTypes,
      TeamsActivityHandler,
      MessageFactory,
    } from 'botbuilder';
Locate the handler onMessage() within the constructor().
 
Locate and replace the line if (text.startsWith("hello")) { in the onMessage() handler with the following code:
    if (text.startsWith("onetoonepersonalchat")) {
      await this.handleMessageMentionMeOneOnOne(context);
      return;
    } else if (text.startsWith("hello")) {
Add the following method to the class:
    private async handleMessageMentionMeOneOnOne(context: TurnContext): Promise<void> {
      const mention = {
        mentioned: context.activity.from,
        text: `<at>${new TextEncoder().encode(context.activity.from.name)}</at>`,
        type: "mention"
      };

      const replyActivity = MessageFactory.text(`Hi ${mention.text} from a one to one personal chat.`);
      replyActivity.entities = [mention];
      await context.sendActivity(replyActivity);
    }
Step 7 - Update Project Environment Variable
 
Locate and open the file ./.env.
 
Locate the following section in the file and set the values of the two properties that you obtained when registering the bot:
    # App Id and App Password for the Bot Framework bot
    MICROSOFT_APP_ID=
    MICROSOFT_APP_PASSWORD=
Step 8 - Update Manifest File
  • Browse to the manifest.json file at ./src/manifest/manifest.json
  • Update the following properties:
  • id: replace with the Azure bot ID
  • version: 1.0.0
  • packageName: InteractiveBotBot
  • Bot property:
  1. "bots": [  
  2.     {  
  3.       "botId""b0edaf1f-0ded-4744-ba2c-113e50376be6",  
  4.       "needsChannelSelector"true,  
  5.       "isNotificationOnly"false,  
  6.       "scopes": [  
  7.         "team",  
  8.         "personal"  
  9.       ],  
  10.       "commandLists": [  
  11.         {  
  12.           "scopes": [  
  13.             "team",  
  14.             "personal"  
  15.           ],  
  16.           "commands": [  
  17.             {  
  18.               "title""Help",  
  19.               "description""Shows help information"  
  20.             },  
  21.             {  
  22.               "title""mentionme",  
  23.               "description""Sends message with @mention of the sender"  
  24.                 
  25.             }  
  26.           ]  
  27.         }  
  28.       ]  
  29.     }  
  30.   ],  
Step 9 - Test and Run the Bot
 
From VS Code, select View -> Terminal and run the following command:
    gulp ngrok-serve
The ngrok-serve task builds your project and starts it on a local web server. Ngrok then starts with a random subdomain and creates a secure public URL for that local web server.
 
Microsoft Teams requires all content to be served over HTTPS, while local debugging uses a plain HTTP web server. Ngrok bridges the gap by creating a secure, routable HTTPS URL that tunnels to the local HTTP web server, so you can debug the application securely.



Ngrok created a temporary URL ("14dceed6815b.ngrok.io" in this run). The same URL needs to be set as the messaging endpoint of the Azure bot, as shown in the screenshot below.
  • Select Settings under Bot Management
  • Update the Messaging endpoint with the ngrok URL (typically https://<subdomain>.ngrok.io/api/messages)
  • Click Save to update

Step 10 - Upload the Manifest File
 
Navigate to the manifest folder ./src/manifest. Select both .png files and the manifest.json file, and zip all three files. Name the zip file "manifest".
 
Sign in to https://teams.microsoft.com 
  1. Navigate to App Studio
  2. Select Import an existing app
  3. Select the manifest.zip file
  4. The bot name and icon will appear on the screen

Step 11 - Install the Custom App in Teams
 
Using the app bar navigation menu, select the More added apps button. Then select Browse all apps, followed by Upload for me or my teams.


In the file dialog that appears, select the Microsoft Teams package in your project. This app package is a ZIP file that can be found in the project's ./package folder.


Click Add, then select the app to open a chat with the bot. 


Select the MentionMe command, or manually type mentionme in the compose box, then press Enter.
 
After a few seconds, you should see the bot respond, mentioning the user you are signed in with.



 

 I hope you have enjoyed and learned something new in this article. Thanks for reading and stay tuned for the next article.


Join me at Microsoft 365 Virtual Marathon, a 36-hour event happening May 27-28, 2020.

Power Platform Evolution with AI Builder Model

Speed up digital transformation through the Power Platform and AI Builder. AI Builder provides AI templates (binary classification, text classification, object detection, form scanning) that you can tailor to easily add intelligence to your apps and processes in Power Apps and/or Microsoft Flow.





I will be speaking about Power Platform evolution with AI Builder 
Register here: https://lnkd.in/dA8vQS3 and hope to see you there! 
Virtual event page: https://lnkd.in/dqZ6Ks5 

🗓 Thursday, May 28th 📺 Room Mile 14 ⏱️ 06:30 PM IST / 06:00 AM PDT / 03:00 PM CEST

Speaking at Virtual Global AI on Tour 2020 – Mumbai – 17 May 2020
 
Join me at the Global AI on Tour (online) on 17 May (Sunday). Topic:

How to design next generation document search Azure Bot (SDK V4) with MS Teams


Usually, a bot is used for text-based intelligent conversation; this session will demonstrate how to extend the bot's capability to cognitive document search.

Timing: 01:30 PM IST to 2:30 PM IST

Use the GoToMeeting details below to join:
https://global.gotomeeting.com/join/861754429

You can also dial in using your phone. 
United States: +1 (408) 650-3123 | Access Code: 861-754-429

Next Generation | Document Search Bot using Bot Framework SDK V4

Azure Bot Service provides an integrated environment that is purpose-built for bot development, enabling you to build, connect, test, deploy, and manage intelligent bots.

Overview 

In this article, I am going to discuss how to implement document search using the Azure Bot Framework SDK V4.
The SharePoint Online search engine is quite robust at the enterprise level and allows developers to define custom managed properties and use them in search.
Here I am going to use the SharePoint Online search API to find documents based on keywords.


  1. The Azure Bot Framework is used to build the chatbot interface.
  2. The LUIS cognitive service is used to define intents; it returns a confidence score and the identified keywords.
  3. The QnA Maker cognitive service is used to define FAQ-style questions and answers.
  4. An Azure App Service API is used to build the search API and return data for the supplied keywords.
  5. SharePoint Online is used as the content repository.
  6. Adaptive Cards are used to present the API results as cards.  

Let's get started with the implementation.
Step 1
Launch Visual Studio 2019 and create an Echo Bot project.

The bot project structure looks like this:
Step 2
Log in to luis.ai and create an app with an intent, entities, and utterances.
LUIS (Language Understanding) is used to define intents, i.e. the intention behind a phrase, while entities capture the key words within phrases using custom and built-in entity types.
In short, LUIS has built-in machine learning capability that helps understand the intention behind a phrase as well as extract the defined keywords that will drive the document search.
Select and define the intent name: "DocumentSearch" 


Select Entities and add some phrases 


Once you finish with Intent and Entities, train and publish the app.



Note the LUIS app ID, key, and endpoint; you will need them in the next step.




Step 3
Browse to the appsettings.json file in the solution and add the LUIS and QnA Maker keys and endpoint URLs:
    {
      "MicrosoftAppId": "",
      "MicrosoftAppPassword": "",
      "ScmType": "",

      "LuisAppId": "XXXXXXXX-3b11-4ae7-89fb-9c169d78e0ff",
      "LuisAPIKey": "XXXXXX63f5ed4881a152c055309ac809",
      "LuisAPIHostName": "westus.api.cognitive.microsoft.com/",

      "QnAKnowledgebaseId": "XXXXXX7b-e0d4-4b32-b9c3-27634d078778",
      "QnAAuthKey": "XXXXXXX-17a9-4772-aaed-fb2ba3f8775e",
      "QnAEndpointHostName": "https://docsearchqna.azurewebsites.net/qnamaker"
    }
Step 4
Add a LUIS recognizer class, i.e. create a connection to LUIS based on the configured LUIS app ID, key, and endpoint.
Create a folder named CognitiveServices.
Add a new file, SearchLuisRecognizer.cs, and add the code below as-is. 
    public class SearchLuisRecognizer : IRecognizer
    {
        private readonly LuisRecognizer _recognizer;

        public SearchLuisRecognizer(IConfiguration configuration)
        {
            var luisIsConfigured = !string.IsNullOrEmpty(configuration["LuisAppId"]) && !string.IsNullOrEmpty(configuration["LuisAPIKey"]) && !string.IsNullOrEmpty(configuration["LuisAPIHostName"]);
            if (luisIsConfigured)
            {
                var luisApplication = new LuisApplication(
                    configuration["LuisAppId"],
                    configuration["LuisAPIKey"],
                    "https://" + configuration["LuisAPIHostName"]);

                var recognizerOptions = new LuisRecognizerOptionsV3(luisApplication)
                {
                    PredictionOptions = new Microsoft.Bot.Builder.AI.LuisV3.LuisPredictionOptions
                    {
                        IncludeInstanceData = true,
                    }
                };

                _recognizer = new LuisRecognizer(recognizerOptions);
            }
        }

        public virtual bool IsConfigured => _recognizer != null;

        public virtual async Task<RecognizerResult> RecognizeAsync(ITurnContext turnContext, CancellationToken cancellationToken)
            => await _recognizer.RecognizeAsync(turnContext, cancellationToken);

        public virtual async Task<T> RecognizeAsync<T>(ITurnContext turnContext, CancellationToken cancellationToken)
            where T : IRecognizerConvert, new()
            => await _recognizer.RecognizeAsync<T>(turnContext, cancellationToken);
    }
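The generic RecognizeAsync<T> overload above expects a strongly typed result class that implements IRecognizerConvert. Step 5 uses one named ContentSearchLuisRecognizer (with a DocumentSearch intent and a DocumentNameEntities property) that the article does not list. The following is a minimal sketch of such a class, modeled on the SDK's generated CoreBot recognizer result and assuming the LUIS app defines a DocumentName entity (the entity name is an assumption):

    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.Bot.Builder;
    using Newtonsoft.Json;

    // Sketch (assumption): strongly typed LUIS result matching how it is used in MainDialogs.ActStepAsync.
    public class ContentSearchLuisRecognizer : IRecognizerConvert
    {
        public enum Intent { DocumentSearch, None }

        public string Text { get; set; }
        public Dictionary<Intent, IntentScore> Intents { get; set; }
        public _Entities Entities { get; set; }

        public class _Entities
        {
            // Must match the entity defined in the LUIS app (assumed name).
            public string[] DocumentName { get; set; }
        }

        // Convenience accessor used by ActStepAsync to read the first recognized document name.
        public string DocumentNameEntities => Entities?.DocumentName?.FirstOrDefault();

        public void Convert(dynamic result)
        {
            // Re-serialize the raw RecognizerResult into this strongly typed shape.
            var app = JsonConvert.DeserializeObject<ContentSearchLuisRecognizer>(JsonConvert.SerializeObject(result));
            Text = app.Text;
            Intents = app.Intents;
            Entities = app.Entities;
        }

        public (Intent intent, double score) TopIntent()
        {
            var maxIntent = Intent.None;
            var max = 0.0;
            foreach (var entry in Intents ?? new Dictionary<Intent, IntentScore>())
            {
                var score = entry.Value.Score ?? 0.0;
                if (score > max)
                {
                    maxIntent = entry.Key;
                    max = score;
                }
            }
            return (maxIntent, max);
        }
    }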
Step 5
Add the main dialog and define the waterfall model that calls LUIS and QnA Maker.
MainDialog recognizes the intention behind the user's phrases.
  1. Create a folder named Dialogs
  2. Add a new file named MainDialogs.cs  
    private readonly SearchLuisRecognizer _luisRecognizer;
    public QnAMaker SearchBotQnA { get; private set; }

    //string messageText = "What can I help you with today?";
    // Dependency injection uses this constructor to instantiate MainDialog
    public MainDialogs(SearchLuisRecognizer luisRecognizer, QnAMakerEndpoint endpoint)
        : base(nameof(MainDialogs))
    {
        _luisRecognizer = luisRecognizer;
        SearchBotQnA = new QnAMaker(endpoint);

        AddDialog(new TextPrompt(nameof(TextPrompt)));
        AddDialog(new WaterfallDialog(nameof(WaterfallDialog), new WaterfallStep[]
        {
            IntroStepAsync,
            ActStepAsync,
            FinalStepAsync,
        }));

        // The initial child Dialog to run.
        InitialDialogId = nameof(WaterfallDialog);
    }
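For context, these members live inside a dialog class. In the standard SDK V4 samples the main dialog derives from ComponentDialog, so the surrounding declaration (an assumption, since the article lists only the members) is roughly:

    // Sketch (assumption): the class that wraps the members shown above and the waterfall steps below.
    // Requires: using Microsoft.Bot.Builder.AI.QnA; using Microsoft.Bot.Builder.Dialogs;
    public class MainDialogs : ComponentDialog
    {
        // _luisRecognizer, SearchBotQnA, the constructor, and the
        // IntroStepAsync / ActStepAsync / FinalStepAsync steps shown in this article go here.
    }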
MainDialog defines the LUIS connection, the QnA Maker connection, and the waterfall dialog model used to implement a sequential conversation flow.


1. IntroStepAsync -> checks whether the LUIS connection is configured and moves the conversation forward. 
    private async Task<DialogTurnResult> IntroStepAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        if (!_luisRecognizer.IsConfigured)
        {
            return await stepContext.NextAsync(null, cancellationToken);
        }

        return await stepContext.PromptAsync(nameof(TextPrompt), new PromptOptions { }, cancellationToken);
    }
2. ActStepAsync -> determines the intent behind the user's phrase and, based on that intention, calls the respective method.
    private async Task<DialogTurnResult> ActStepAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        var luisResult = await _luisRecognizer.RecognizeAsync<ContentSearchLuisRecognizer>(stepContext.Context, cancellationToken);
        APICaller aPICaller = new APICaller();
        switch (luisResult.TopIntent().intent)
        {
            case ContentSearchLuisRecognizer.Intent.DocumentSearch:
                string documentName = luisResult.DocumentNameEntities != null ? luisResult.DocumentNameEntities : "";
                string documents = aPICaller.GetDocument(documentName);
                var docAttachments = DocumentCard.GetDocumentCard(documents);
                await stepContext.Context.SendActivityAsync(MessageFactory.Carousel(docAttachments), cancellationToken);
                break;

            default:
                var results = await SearchBotQnA.GetAnswersAsync(stepContext.Context);
                if (results.Length > 0)
                {
                    var answer = results.First().Answer;
                    await stepContext.Context.SendActivityAsync(MessageFactory.Text(answer), cancellationToken);
                }
                else
                {
                    string documentNames = aPICaller.GetDocument(stepContext.Context.Activity.Text);
                    if (!String.IsNullOrEmpty(documentNames) && documentNames != "[]")
                    {
                        var documentAttachments = DocumentCard.GetDocumentCard(documentNames);
                        await stepContext.Context.SendActivityAsync(MessageFactory.Carousel(documentAttachments), cancellationToken);
                    }
                    else
                    {
                        Activity reply = ((Activity)stepContext.Context.Activity).CreateReply();
                        reply.Text = $"😢 **Sorry!!! I found nothing** \n\n Please try to rephrase your query.";
                        await stepContext.Context.SendActivityAsync(reply);
                    }
                }
                break;
        }

        return await stepContext.NextAsync(null, cancellationToken);
    }
3. FinalStepAsync -> restarts the conversation flow.
    private async Task<DialogTurnResult> FinalStepAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
    {
        return await stepContext.ReplaceDialogAsync(InitialDialogId, null, cancellationToken);
    }
Step 6 - Call the Custom API 
The API returns data from SharePoint Online in JSON format. For now I used an anonymous API, so no access token is generated; it is always best practice to use a user token and access token to secure the API and the connection.
  • Create a folder named Services
  • Create a file named "APICaller.cs"
    public string GetDocument(string qry)
    {
        HttpClient httpClient = new HttpClient();
        var baseUrl = "http://botapi.azurewebsites.net/";
        var route = "api/document?query=" + qry;
        httpClient.BaseAddress = new Uri(baseUrl);
        httpClient.DefaultRequestHeaders.Accept.Clear();
        httpClient.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));

        string responseString = string.Empty;
        var response = httpClient.GetAsync(route).Result;
        if (response.IsSuccessStatusCode)
        {
            responseString = response.Content.ReadAsStringAsync().Result;
        }
        return responseString;
    }
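GetDocument blocks on .Result, which is fine for a quick demo but can waste threads or deadlock under load. A non-blocking sketch of the same call (using the same base URL and route as above) would be:

    // Sketch: an async version of GetDocument that avoids blocking on .Result.
    // Requires: using System; using System.Net.Http; using System.Net.Http.Headers; using System.Threading.Tasks;
    public async Task<string> GetDocumentAsync(string qry)
    {
        using (var httpClient = new HttpClient())
        {
            httpClient.BaseAddress = new Uri("http://botapi.azurewebsites.net/");
            httpClient.DefaultRequestHeaders.Accept.Clear();
            httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

            var response = await httpClient.GetAsync("api/document?query=" + Uri.EscapeDataString(qry ?? string.Empty));
            if (!response.IsSuccessStatusCode)
            {
                return string.Empty;
            }
            return await response.Content.ReadAsStringAsync();
        }
    }

The dialog would then call it with await aPICaller.GetDocumentAsync(documentName) instead of the synchronous version.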
Step 7
Define the Adaptive Card (i.e. the document card) based on the API response.
The adaptive document card renders the API response in card format, and the resulting cards are attached to a carousel.
Create a folder named Cards.
 
Create a file named DocumentCards.cs. 
    public static List<Attachment> GetDocumentCard(string dataSet)
    {
        var attachments = new List<Attachment>();
        List<DocumentDto> documentDtos = JsonConvert.DeserializeObject<List<DocumentDto>>(dataSet);
        foreach (DocumentDto info in documentDtos)
        {
            string summary = HtmlToPlainText(info.Summary);
            string documentIcon = GetFileIcon(info.DocumentPath);
            //Icon fileIcon=Icon.ExtractAssociatedIcon("<fullPath>");

            var card = new AdaptiveCard("1.2");
            List<AdaptiveElement> AdaptiveElements = new List<AdaptiveElement>
            {
                new AdaptiveColumnSet()
                {
                    Columns = new List<AdaptiveColumn>()
                    {
                        new AdaptiveColumn()
                        {
                            Width = "100px",
                            Items = new List<AdaptiveElement>()
                            {
                                new AdaptiveImage(documentIcon)
                                {
                                    Id = "documentIcon",
                                    Size = AdaptiveImageSize.Medium,
                                    Style = AdaptiveImageStyle.Default,
                                },
                            }
                        },
                        new AdaptiveColumn()
                        {
                            Width = AdaptiveColumnWidth.Stretch,
                            Items = new List<AdaptiveElement>()
                            {
                                new AdaptiveTextBlock()
                                {
                                    Id = "title",
                                    Text = info.Title,
                                    Size = AdaptiveTextSize.Medium,
                                    Weight = AdaptiveTextWeight.Bolder,
                                    HorizontalAlignment = AdaptiveHorizontalAlignment.Left,
                                },
                                new AdaptiveTextBlock()
                                {
                                    Id = "author",
                                    Text = "✍ " + info.Author,
                                    Weight = AdaptiveTextWeight.Lighter,
                                    Size = AdaptiveTextSize.Small,
                                    Color = AdaptiveTextColor.Dark,
                                    Wrap = true,
                                },
                                new AdaptiveTextBlock()
                                {
                                    Id = "date",
                                    Text = "🗓 " + info.CreatedDateTime,
                                    Weight = AdaptiveTextWeight.Lighter,
                                    Color = AdaptiveTextColor.Dark,
                                    Size = AdaptiveTextSize.Small,
                                    Wrap = true,
                                },
                            }
                        }
                    }
                },
                new AdaptiveColumnSet()
                {
                    Columns = new List<AdaptiveColumn>()
                    {
                        new AdaptiveColumn()
                        {
                            Items = new List<AdaptiveElement>()
                            {
                                new AdaptiveTextBlock()
                                {
                                    Id = "summary",
                                    Text = summary,
                                    Weight = AdaptiveTextWeight.Default,
                                    Size = AdaptiveTextSize.Small,
                                    Color = AdaptiveTextColor.Dark,
                                    Wrap = true,
                                },
                            }
                        }
                    }
                },
                new AdaptiveActionSet()
                {
                    Actions = new List<AdaptiveAction>()
                    {
                        new AdaptiveOpenUrlAction()
                        {
                            Id = "open_url_action",
                            Title = "View",
                            UrlString = info.DocumentPath
                        }
                    }
                }
            };

            card.Body = AdaptiveElements;
            Attachment attachment = new Attachment()
            {
                ContentType = AdaptiveCard.ContentType,
                Content = card
            };
            attachments.Add(attachment);
        }
        return attachments;
    }
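GetDocumentCard deserializes the API response into a list of DocumentDto objects. The article does not show that DTO, but based on the properties the card reads (Title, Author, Summary, CreatedDateTime, DocumentPath) a minimal sketch would be:

    // Sketch (assumption): the shape of each document item returned by the search API,
    // inferred from the properties used in GetDocumentCard above.
    public class DocumentDto
    {
        public string Title { get; set; }
        public string Author { get; set; }
        public string Summary { get; set; }          // HTML summary from SharePoint search; converted to plain text by HtmlToPlainText
        public string CreatedDateTime { get; set; }  // kept as string because it is concatenated straight into the card text
        public string DocumentPath { get; set; }     // absolute URL of the document; also used to pick a file-type icon
    }

The HtmlToPlainText and GetFileIcon helpers used above are likewise not listed in the article; the first presumably strips HTML tags from the SharePoint summary, and the second presumably maps the file extension to an icon URL.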
Step 8
Create the search bot file and call the MainDialogs file.
Add a new file, SearchBot.cs, to the bot folder. It has two main functions.
OnMembersAddedAsync
    protected override async Task OnMembersAddedAsync(IList<ChannelAccount> membersAdded, ITurnContext<IConversationUpdateActivity> turnContext, CancellationToken cancellationToken)
    {
        foreach (var member in membersAdded ?? Array.Empty<ChannelAccount>())
        {
            if (member.Id != turnContext.Activity.Recipient.Id)
            {
                //if (turnContext.Activity.MembersAdded[0].Name == "USER_NAME")
                //{
                Activity reply = ((Activity)turnContext.Activity).CreateReply();
                reply.Text = $"😀 **Hi, I am Virtual Assistant!!** \n\n I am here to assist you.";
                await turnContext.SendActivityAsync(reply, cancellationToken);
                //}
            }
        }
        await Dialog.RunAsync(turnContext, ConversationState.CreateProperty<DialogState>(nameof(DialogState)), cancellationToken);
    }
OnTeamsMembersAddedAsync
    protected override async Task OnTeamsMembersAddedAsync(IList<TeamsChannelAccount> membersAdded, TeamInfo teamInfo, ITurnContext<IConversationUpdateActivity> turnContext, CancellationToken cancellationToken)
    {
        foreach (var teamMember in membersAdded)
        {
            Activity reply = ((Activity)turnContext.Activity).CreateReply();
            reply.Text = $"😀 **Hi, I am Virtual Assistant!!** \n\n I am here to assist you.";
            await turnContext.SendActivityAsync(reply, cancellationToken);
        }
    }
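The two overrides above live in the SearchBot class. The article does not show the class declaration or the message handler that actually runs MainDialogs on each user message; a minimal skeleton, following the DialogBot pattern from the SDK samples (an assumption, not verbatim from this project), is:

    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Bot.Builder;
    using Microsoft.Bot.Builder.Dialogs;
    using Microsoft.Bot.Builder.Teams;
    using Microsoft.Bot.Schema;

    // Sketch (assumption): SearchBot wires ConversationState and MainDialogs together,
    // mirroring the DialogBot pattern used by the Bot Framework samples.
    public class SearchBot : TeamsActivityHandler
    {
        protected readonly ConversationState ConversationState;
        protected readonly Dialog Dialog;
        protected readonly UserState UserState;

        public SearchBot(ConversationState conversationState, UserState userState, MainDialogs dialog)
        {
            ConversationState = conversationState;
            UserState = userState;
            Dialog = dialog;
        }

        public override async Task OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken = default)
        {
            await base.OnTurnAsync(turnContext, cancellationToken);

            // Save any state changes that occurred during the turn.
            await ConversationState.SaveChangesAsync(turnContext, false, cancellationToken);
            await UserState.SaveChangesAsync(turnContext, false, cancellationToken);
        }

        protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
        {
            // Run the MainDialogs waterfall for every incoming message.
            await Dialog.RunAsync(turnContext, ConversationState.CreateProperty<DialogState>(nameof(DialogState)), cancellationToken);
        }

        // OnMembersAddedAsync and OnTeamsMembersAddedAsync from this step go here.
    }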
Step 9
Register SearchBot in the Startup.cs file:
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);

        // Create the Bot Framework Adapter with error handling enabled.
        services.AddSingleton<IBotFrameworkHttpAdapter, AdapterWithErrorHandler>();

        // Create the storage we'll be using for User and Conversation state. (Memory is great for testing purposes.)
        services.AddSingleton<IStorage, MemoryStorage>();

        // Create the User state. (Used in this bot's Dialog implementation.)
        services.AddSingleton<UserState>();

        // Create the Conversation state. (Used by the Dialog system itself.)
        services.AddSingleton<ConversationState>();

        services.AddSingleton<SearchLuisRecognizer>();

        // The Dialog that will be run by the bot.
        services.AddSingleton<MainDialogs>();

        services.AddSingleton(new QnAMakerEndpoint
        {
            KnowledgeBaseId = Configuration.GetValue<string>($"QnAKnowledgebaseId"),
            EndpointKey = Configuration.GetValue<string>($"QnAAuthKey"),
            Host = Configuration.GetValue<string>($"QnAEndpointHostName")
        });

        // Create the bot as a transient. In this case the ASP Controller is expecting an IBot.
        services.AddTransient<IBot, SearchBot>();
    }
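ConfigureServices is only half of Startup.cs. For completeness, the Configure method of the ASP.NET Core 2.1 bot template (shown here as a typical sketch, not necessarily verbatim from this project) serves static files and enables MVC so the /api/messages endpoint is reachable:

    // Typical Configure method from the ASP.NET Core 2.1 bot template (sketch).
    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        if (env.IsDevelopment())
        {
            app.UseDeveloperExceptionPage();
        }
        else
        {
            app.UseHsts();
        }

        app.UseDefaultFiles();
        app.UseStaticFiles();
        app.UseMvc();
    }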
Step 10
Build and run the solution, and test it in the Bot Framework Emulator.
Press F5 to test locally, i.e. using the Bot Framework Emulator.


I hope you have enjoyed and learned something new in this article. Thanks for reading and stay tuned for the next article.