# Bot Framework v4 NLP with Orchestrator bot sample
This bot has been created using Bot Framework. It shows how to create a bot that relies on multiple LUIS.ai and QnAMaker.ai models for natural language processing (NLP).
Use the Orchestrator dispatch model when:

- Your bot consists of multiple language modules (LUIS + QnA) and you need help routing user utterances to the appropriate module in order to integrate the different modules into your bot.
- You need to create a text classification model from text files.
This bot uses Orchestrator to route user utterances to multiple LUIS models and QnA Maker services, supporting multiple conversational scenarios.
| OS      | Version              | Architectures   |
| ------- | -------------------- | --------------- |
| Windows | 10 (1607+)           | ia32 (x86), x64 |
| macOS   | 10.15+               | x64             |
| Linux   | Ubuntu 18.04, 20.04  | x64             |
## Prerequisites

This sample requires the following prerequisites in order to run.

- Install the latest supported version of Visual C++ Redistributable
- Install the latest Bot Framework Emulator
- .NET SDK version 6.0

  ```bash
  dotnet --version
  ```

- Install BF CLI with the Orchestrator plugin

  ```bash
  npm i -g @microsoft/botframework-cli
  ```

  Make sure the `bf orchestrator` command is working and shows all available Orchestrator commands:

  ```bash
  bf orchestrator
  ```
## To try this bot

- Clone the repository

  ```bash
  git clone https://github.com/microsoft/botbuilder-samples.git
  ```

- In a terminal, navigate to `samples\csharp_dotnetcore\14.nlp-with-orchestrator`

  ```bash
  cd samples\csharp_dotnetcore\14.nlp-with-orchestrator
  ```
- Configure the LUIS applications (HomeAutomation and Weather) required for this sample.
  - Get your LUIS authoring key

    ```bash
    bf luis:build --in CognitiveModels --authoringKey <YOUR-KEY> --botName <YOUR-BOT-NAME>
    ```

  - Update the application settings in `./appsettings.json`
- Configure the QnA Maker KB required for this sample.
  - Get your QnA Maker subscription key

    ```bash
    bf qnamaker:build --in CognitiveModels --subscriptionKey <YOUR-KEY> --botName <YOUR-BOT-NAME>
    ```

  - Update the KB information in `./appsettings.json`
- Configure Orchestrator to route utterances to the LUIS/QnA language services set up above.
  - Download the Orchestrator base model

    ```bash
    mkdir model
    bf orchestrator:basemodel:get --out ./model
    ```

  - Create the Orchestrator snapshot

    ```bash
    mkdir generated
    bf orchestrator:create --hierarchical --in ./CognitiveModels --out ./generated --model ./model
    ```

    The `--hierarchical` flag creates top-level intents in the snapshot file derived from the `.lu`/`.qna` file names in the input folder. As a result, the example utterances are mapped to the HomeAutomation, QnAMaker, and Weather intents/labels.
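For instance, given this sample's `CognitiveModels` folder, the file-name-to-label mapping works out as sketched below (the arrow annotations are illustrative, not CLI output):

```text
CognitiveModels/
├── HomeAutomation.lu   →  intent "HomeAutomation"
├── Weather.lu          →  intent "Weather"
└── QnAMaker.qna        →  intent "QnAMaker"
```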
- Verify that `appsettings.json` contains the following:

  ```json
  "Orchestrator": {
    "ModelFolder": ".\\model",
    "SnapshotFile": ".\\generated\\orchestrator.blu"
  }
  ```
- Run the bot from a terminal or from Visual Studio, choosing option A or B.

  A) From a terminal

  ```bash
  cd samples\csharp_dotnetcore\14.nlp-with-orchestrator
  dotnet run
  ```
  B) Or from Visual Studio

  - Launch Visual Studio
  - File -> Open -> Project/Solution
  - Navigate to the `Orchestrator` folder
  - Select the `OrchestratorSamples.sln` file
  - Right-click the `01.dispatch-bot` project in the solution and select 'Set as Startup Project'
  - Press `F5` to run the project
To interact with the bot:

- Launch Bot Framework Emulator
- File -> Open Bot
- Enter a Bot URL of `http://localhost:3978/api/messages`