
AZ-300T06
Developing for the Cloud

Contents

Module 0: Start Here
● Welcome to Developing for the Cloud

Module 1: Developing Long-Running Tasks and Distributed Transactions
● Implement large-scale, parallel, and high-performance apps by using batches
● Implement resilient apps by using queues
● Implement code to address application events by using webhooks
● Online Lab - Configuring a Message-Based Integration Architecture
● Review Questions

Module 2: Configuring a Message-Based Integration Architecture
● Configure an app or service to send emails
● Configure an event publish and subscribe model
● Configure the Azure Relay service
● Create and configure a notification hub
● Create and configure a service bus
● Configuring apps and services with Microsoft Graph
● Review Questions

Module 3: Developing for Asynchronous Processing
● Implement parallelism, multithreading, and processing
● Implement Azure Functions and Azure Logic Apps
● Implement interfaces for storage or data access
● Implement appropriate asynchronous computing models
● Review Questions

Module 4: Developing for Autoscaling
● Implement autoscaling rules and patterns
● Implement code that addresses singleton application instances
● Review Questions

Module 5: Developing Azure Cognitive Services Solutions
● Cognitive Services Overview
● Develop Solutions using Computer Vision
● Develop Solutions using Bing Web Search
● Develop Solutions using Custom Speech Service

Welcome to Developing for the Cloud. This course is part of a series of five courses to help students prepare for Microsoft’s Azure Solutions Architect technical certification exam AZ-300: Microsoft Azure Architect Technologies. These courses are designed for IT professionals and developers with experience and knowledge across various aspects of IT operations, including networking, virtualization, identity, security, business continuity, disaster recovery, data management, budgeting, and governance.

The outline for this course is as follows:

● Implementing resilient apps by using queues, as well as implementing code to address application events by using webhooks. Implementing a webhook gives an external resource a URL for an application; the external resource then issues an HTTP request to that URL whenever a change is made that requires the application to take an action.

● Begin creating apps for autoscaling.

● Understand Azure Cognitive Services solutions.

High-performance computing (HPC)

Traditionally, complex processing was something saved for universities and major research firms. A combination of cost, complexity, and accessibility served to keep many from pursuing potential gains for their organizations by processing large and complex simulations or models. Cloud platforms have democratized hardware so much that massive computing tasks are within the reach of hobbyist developers and small and medium-sized enterprises.


HPC Pack using Azure Virtual Machines

Starting with HPC Pack 2012 R2 Update 2, HPC Pack has supported several Linux distributions running on compute nodes deployed in Azure VMs, managed by a Windows Server head node.

Remote Direct Memory Access (RDMA)

RDMA is a technology that provides a low-latency network connection between processes running on two servers or virtual machines in Azure. From a developer's perspective, RDMA is implemented in a way that makes it seem as if the machines are sharing memory.

A subset of the compute-intensive instances (H16r, H16mr, A8, and A9) feature a network interface for RDMA connectivity. Selected N-series sizes designated with "r" (such as NC24r) are also RDMA-capable. These RDMA capabilities can boost the scalability and performance of certain Message Passing Interface (MPI) applications.

Azure Batch for HPC

Azure Batch is a service that manages VMs for large-scale parallel and HPC applications. Batch is a platform as a service (PaaS) offering that manages the VMs necessary for your compute jobs for you.

Scaling out parallel workloads

The Batch API can handle scaling out an intrinsically parallel workload, such as image rendering, on a pool of up to thousands of compute cores. Instead of having to set up a compute cluster or write code to queue and schedule your jobs and move the necessary input and output data, you automate the scheduling of large compute jobs and scale a pool of compute VMs up and down to run them. You can write client apps or front ends to run jobs and tasks on demand, on a schedule, or as part of a larger workflow managed by tools such as Azure Data Factory.
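To make the workflow concrete, here is a minimal sketch using the Microsoft.Azure.Batch client library; the account URL, account key, and the pool and job IDs are placeholders, and the pool is assumed to already exist:

using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;

class BatchSketch
{
    static void Main()
    {
        // Placeholder credentials for an existing Batch account.
        var credentials = new BatchSharedKeyCredentials(
            "https://mybatchaccount.westus.batch.azure.com",
            "mybatchaccount",
            "<account key>");

        using (BatchClient batchClient = BatchClient.Open(credentials))
        {
            // Create a job on a pool that is assumed to already exist.
            CloudJob job = batchClient.JobOperations.CreateJob();
            job.Id = "renderjob";
            job.PoolInformation = new PoolInformation { PoolId = "renderpool" };
            job.Commit();

            // Queue an intrinsically parallel task; Batch schedules it
            // onto an available compute node in the pool.
            var task = new CloudTask("frame-001", "cmd /c echo render frame 001");
            batchClient.JobOperations.AddTask("renderjob", task);
        }
    }
}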

Use a message queue to implement the communication channel between the application and the instances of the consumer service. The application posts requests in the form of messages to the queue, and the consumer service instances receive messages from the queue and process them. This approach enables the same pool of consumer service instances to handle messages from any instance of the application. The figure illustrates using a message queue to distribute work to instances of a service.

To insert a message into an existing queue, first create a new CloudQueueMessage. Next, call the AddMessage method. A CloudQueueMessage can be created from either a string (in UTF-8 format) or a byte array. Here is code that creates a queue (if it doesn't exist):

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference("myqueue");
queue.CreateIfNotExists();

The next code sample inserts the message "Hello, World" into the referenced queue:

CloudQueueMessage message = new CloudQueueMessage("Hello, World");
queue.AddMessage(message);

You can peek at the message at the front of the queue, without removing it, by calling the PeekMessage method:

CloudQueueMessage peekedMessage = queue.PeekMessage();

To finish removing a message from the queue, you must also call DeleteMessage. This two-step process of removing a message assures that if your code fails to process a message due to a hardware or software failure, another instance of your code can get the same message and try again. Your code calls DeleteMessage right after the message has been processed.
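A minimal sketch of this two-step dequeue, using the same CloudQueue instance created above:

CloudQueueMessage retrievedMessage = queue.GetMessage();
// ... process the message; it stays invisible to other consumers
// for the visibility timeout (30 seconds by default) ...
queue.DeleteMessage(retrievedMessage);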

Using Azure Service Bus queues in code

Azure Service Bus supports a set of cloud-based, message-oriented middleware technologies, including reliable message queuing and durable publish/subscribe messaging. Service Bus queues offer first in, first out (FIFO) message delivery to one or more competing consumers. That is, receivers typically receive and process messages in the order in which they were added to the queue, and only one message consumer receives and processes each message. To create a QueueClient in code, you need the connection string for the Service Bus namespace and the name of a specific queue:

QueueClient queueClient = new QueueClient(ServiceBusConnectionString, QueueName);

var messages = new List<Message>();
for (int i = 0; i < 10; i++)
{
    var message = new Message(Encoding.UTF8.GetBytes($"Message {i:00}"));
    messages.Add(message);
}

await queueClient.SendBatchAsync(messages);

static async Task MethodToHandleNewMessage(Message message, CancellationToken token)
{
    string messageString = Encoding.UTF8.GetString(message.Body);
    Console.WriteLine($"Received message: {messageString}");
    await queueClient.CompleteAsync(message.SystemProperties.LockToken);
}

You will notice that we need to call the CompleteAsync method of the queue client at the end of the message handler method. This ensures that the message is not received again. Alternatively, you can call AbandonAsync if you wish to stop handling the message and receive it again.
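For context, here is a sketch of how such a handler is typically registered with the queue client; the option values are illustrative, and ExceptionReceivedHandlerAsync is a hypothetical helper:

var options = new MessageHandlerOptions(ExceptionReceivedHandlerAsync)
{
    // AutoComplete is disabled because the handler calls CompleteAsync itself.
    AutoComplete = false,
    MaxConcurrentCalls = 1
};
queueClient.RegisterMessageHandler(MethodToHandleNewMessage, options);

// A minimal exception handler required by MessageHandlerOptions.
static Task ExceptionReceivedHandlerAsync(ExceptionReceivedEventArgs args)
{
    Console.WriteLine($"Message handler error: {args.Exception}");
    return Task.CompletedTask;
}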

Implement code to address application events by using webhooks

"type": "http",
"direction": "out"
}
]
}

The following function.json configures an Azure Function as a GitHub webhook, with an HTTP trigger for input and an HTTP output binding:

{
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "webHookType": "github",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ],
    "disabled": false
}

The JavaScript implementation logs the body of the comment from the GitHub event that triggered the webhook:

module.exports = function (context, data) {
    context.log('GitHub WebHook triggered!', data.comment.body);
};

Online Lab - Configuring a Message-Based Integration Architecture

Scenario

Adatum has several web applications that process files uploaded in regular intervals to their on-premises file servers. File sizes vary, but they can reach up to 100 MB. Adatum is considering migrating the applications to Azure App Service or Azure Functions-based apps and using Azure Storage to host uploaded files. You plan to test two scenarios:

In this lab, you will use the Azure Storage Blob service to store files to be processed. A client just needs to drop the files to be shared into a designated Azure Blob container. In the first exercise, the files will be consumed directly by an Azure Function, leveraging its serverless nature. You will also take advantage of Application Insights to provide instrumentation, facilitating monitoring and analyzing file processing. In the second exercise, you will use Event Grid to automatically generate a claim check message and send it to an Azure Storage queue. This allows a client application to poll the queue, get the message, and then use the stored reference data to download the payload directly from Azure Blob Storage.


2. In the Azure portal, in the Microsoft Edge window, start a Bash session within the Cloud Shell.

● Storage account: a name of a new storage account

● File share: a name of a new file share

export LOCATION='<Azure_region>'

6. From the Cloud Shell pane, run the following to create a resource group that will host all resources that you will provision in this lab:

export CONTAINER_NAME="workitems"

export STORAGE_ACCOUNT=$(az storage account create --name "${STORAGE_ACCOUNT_NAME}"

8. Note: The same storage account will be also used by the Azure function to facilitate its own processing requirements. In real-world scenarios, you might want to consider creating a separate storage account for this purpose.

9. From the Cloud Shell pane, run the following to create a variable storing the value of the connection string property of the Azure Storage account:

ED 18
MCT USE ONLY. STUDENT USE PROHIBIT

10. From the Cloud Shell pane, run the following to create an Application Insights resource that will

provide monitoring of the Azure Function processing blobs and store its key in a variable:

CATION}" --properties '{"Application_Type": "other", "ApplicationId":

export APPINSIGHTS_KEY=$(az resource show --name "${APPLICATION_INSIGHTS_NAME}" --query "properties.InstrumentationKey" --resource-group "${RESOURCE_GROUP_NAME}" --resource-type "Microsoft.Insights/components" -o tsv)

11. From the Cloud Shell pane, run the following to create the Azure Function that will process events:

az functionapp create --name "${FUNCTION_NAME}" --resource-group "${RESOURCE_GROUP_NAME}" --storage-account "${STORAGE_ACCOUNT_NAME}" --consumption-plan-location "${LOCATION}"

12. From the Cloud Shell pane, run the following to configure Application Settings of the newly created

az functionapp config appsettings set --name "${FUNCTION_NAME}" --resource-group "${RESOURCE_GROUP_NAME}" --settings "STORAGE_CONNECTION_STRING=$STORAGE_CONNECTION_STRING" FUNCTIONS_EXTENSION_VERSION=~2

Runtime stack drop down list and click Go.

16. On the Choose a template below or go to the quickstart blade, click Azure Blob Storage trigger

17. On the Extensions not Installed blade, click Install and wait until the installation of the extension

19. On the Azure Blob Storage trigger blade, specify the following and click Create to create a new function within the Azure function:

● Name: BlobTrigger

21. Note: By default, the function is configured to simply log the event corresponding to creation of a new blob. In order to carry out blob processing tasks, you would modify the content of this file.

Task 2: Validate an Azure Function App Storage Blob trigger

export CONTAINER_NAME="workitems"

3. From the Cloud Shell pane, run the following to upload a test blob to the Azure Storage account you created earlier in this task:

4. From the Cloud Shell pane, run the following to create an Azure Storage account and its container that will be used by the Event Grid subscription that you will configure in this task:

export STORAGE_ACCOUNT_NAME="az300t06st3${PREFIX}"

export STORAGE_ACCOUNT_ID=$(az storage account show --name "${STORAGE_ACCOUNT_NAME}" --query "id" --resource-group "${RESOURCE_GROUP_NAME}" -o tsv)

6. From the Cloud Shell pane, run the following to create the Storage Account queue that will store messages generated by the Event Grid subscription that you will configure in this task:

az eventgrid event-subscription create --name "${QUEUE_SUBSCRIPTION_NAME}" --included-event-types 'Microsoft.Storage.BlobCreated' --endpoint "${STORAGE_ACCOUNT_ID}/queueservices/default/queues/${QUEUE_NAME}" --endpoint-type "storagequeue" --source-resource-id "${STORAGE_ACCOUNT_ID}"

Task 2: Validate an Azure Event Grid subscription-based queue messaging

export AZURE_STORAGE_ACCESS_KEY="$(az storage account keys list --account-name "${STORAGE_ACCOUNT_NAME}" --resource-group "${RESOURCE_GROUP_NAME}" --query "[0].value" -o tsv)"

export WORKITEM='workitem2.txt'

touch "${WORKITEM}"

2. From the Cloud Shell pane, run the following to upload the file to the blob container you created in the previous task of this exercise.

3. On the blade of the Azure Storage account, click Queues to display the list of its queues.

5. Note that the queue contains a single message. Click its entry to display the Message properties blade.

Exercise 3: Remove lab resources

The main tasks for this exercise are as follows:

1. Remove the Azure resources that you provisioned in this lab.

for RESOURCE_GROUP_NAME in 'az300T0602-LabRG' 'az300T0603-LabRG'; do
  az group delete --name "${RESOURCE_GROUP_NAME}" --no-wait --yes
done

Azure Batch

You are designing a video editing solution. The solution will require extensive hardware resources to perform the required processing on large video files.

Messaging

You manage an e-Commerce solution that uses Azure Batch.

Azure Batch uses a message queue to implement the communication channel between the application and the instances of the consumer service. The application posts requests in the form of messages to the queue, and the consumer service instances receive messages from the queue and process them. This approach enables the same pool of consumer service instances to handle messages from any instance of the application.

Azure Service Bus


Service Bus queues offer first in, first out (FIFO) message delivery to one or more competing consumers.

Messages are processed in the order in which they were added to the queue.

SendGrid provides transactional email delivery, scalability based on email volume, and real-time analytics for the sent messages. SendGrid also has a flexible API to enable custom integration scenarios.

Sending emails through SendGrid by using C#

SendGridMessage message = new SendGridMessage()
{
    Subject = "Hello from SendGrid",
    PlainTextContent = "Hello, World!"
};
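Because only the opening of the sample survives above, here is a self-contained sketch of sending a message with the SendGrid C# client; the addresses and the environment-variable name are placeholders:

using System;
using System.Threading.Tasks;
using SendGrid;
using SendGrid.Helpers.Mail;

class EmailSketch
{
    static async Task Main()
    {
        // The API key is read from a placeholder environment variable.
        var client = new SendGridClient(
            Environment.GetEnvironmentVariable("SENDGRID_API_KEY"));

        var message = new SendGridMessage()
        {
            From = new EmailAddress("sender@contoso.com", "Contoso"),
            Subject = "Hello from SendGrid",
            PlainTextContent = "Hello, World!"
        };
        message.AddTo(new EmailAddress("recipient@contoso.com"));

        // Submit the message to SendGrid for delivery.
        var response = await client.SendEmailAsync(message);
        Console.WriteLine(response.StatusCode);
    }
}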

Modern application requirements stipulate that applications we build should be able to handle a high volume and velocity of data, process that data in real time, and allow multiple systems to respond to the same data. If we were to build applications in a serial manner, this stipulation would be incredibly difficult to meet. To help handle these application scenarios, many modern systems are built using an architectural style referred to as event-driven architecture. An event-driven architecture consists of event producers that generate a stream of events and event consumers that listen for the events.

Event stream processing. You use a data streaming platform, such as Azure IoT Hub or Apache Kafka, as a pipeline to ingest events and feed them to stream processors. The stream processors act to process or transform the stream. There may be multiple stream processors for different subsystems of the application.


Azure Event Grid allows you to build applications with event-based architectures. You select the Azure resource you would like to subscribe to, and give the event handler or webhook endpoint to send the event to. Event Grid can send events to multiple endpoints and make sure your events are reliably delivered. Event Grid also has built-in support for events from Azure services as well as custom and third-party sources, including:

● Custom topics
● Azure Event Hubs
● Azure Blob storage
● General Purpose v2 storage

Subscribing to Blob storage events using Azure CLI

Event Grid connects data sources and event handlers. For example, Event Grid can instantly trigger a serverless function to run an image analysis each time a new photo is added to a Blob storage container. In this example, we will create an event source using Blob storage.

az storage account create --name demostor --location eastus --resource-group DemoGroup


Azure CLI also contains the capability to query JSON objects and arrays for specific values. We can further refine the query by using the --query parameter to extract just the resource id of the storage account:

storageid=$(az storage account show --name demostor --resource-group DemoGroup --query id --output tsv)

Event Grid needs to know where to send those events. The following example uses the az eventgrid event-subscription create command to subscribe to the storage account you created earlier.

Configure the Azure Relay service

Note: The Azure Relay capabilities differ from network-level integration technologies, such as virtual private network (VPN) technology, in that Azure Relay can be scoped to a single application endpoint on a single machine, while VPN technology is far more intrusive, as it relies on altering the network environment.

Hybrid Connections

const WebSocket = require('hyco-ws');

var listenUri = WebSocket.createRelayListenUri(ns, path);

The two classes are mostly contract compatible, meaning that an existing application using the ws.Server class can easily be changed to use the relayed version. The main differences are in the constructor and in the available options. The RelayedServer constructor supports a different set of arguments than the Server, because it is not a standalone listener or able to be embedded into an existing HTTP listener framework. There are also fewer options available, since the WebSocket protocol management is largely delegated to the Azure Relay service:

var server = WebSocket.RelayedServer;

var wss = new server({
    server: WebSocket.createRelayListenUri(ns, path),
    token: WebSocket.createRelayToken(ns, keyrule, key)
});

The RelayedServer constructor has two required arguments to establish a connection over the WebSocket protocol using Azure Relay:


Azure Notification Hubs

Azure Notification Hubs is a scaled-out push engine that allows you to send push notifications to any platform from any back end. Typical scenarios include:

● Sending location-based coupons to interested user segments
● Sending event-related notifications to users or groups for media, sports, finance, or gaming applications
● Sending codes for multi-factor authentication

Push notifications

Push notifications are vital for consumer apps in increasing app engagement and usage, and for enterprise apps in communicating up-to-date business information. Push notifications are delivered through platform-specific infrastructures called Platform Notification Systems (PNSs). They offer bare-bones push functionalities to deliver a message to a device with a provided handle, and have no common interface. At a high level, here is how push notifications work:

1. The client app decides it wants to receive notifications, so it contacts the corresponding PNS to retrieve its unique and temporary push handle.

2. The client app stores this handle in the app back end or provider.

3. To send a push notification, the app's back end contacts the PNS using the handle to target a specific client app.

Configuring Notification Hubs in iOS


In most examples, you would first add a few constants to HubInfo.h that will contain important connection details for your notification hub:

#ifndef HubInfo_h
#define HubInfo_h

#define HUBNAME @"<Enter the name of your hub>"
#define HUBLISTENACCESS @"<Enter your DefaultListenSharedAccessSignature connection string>"

#endif /* HubInfo_h */

#import <UserNotifications/UserNotifications.h>
#import "HubInfo.h"

-(void) application:(UIApplication *) application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *) deviceToken {
    SBNotificationHub *hub = [[SBNotificationHub alloc] initWithConnectionString:HUBLISTENACCESS
                              notificationHubPath:HUBNAME];

    [hub registerNativeWithDeviceToken:deviceToken tags:nil completion:^(NSError* error) {
        if (error != nil) {
            NSLog(@"Error registering for notifications: %@", error);
        }
        else {
            [self MessageBox:@"Registration Status" message:@"Registered"];
        }
    }];
}

-(void)MessageBox:(NSString *) title message:(NSString *)messageText
{
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:title message:messageText delegate:self
        cancelButtonTitle:@"OK" otherButtonTitles: nil];
    [alert show];
}


● You must register the com.google.firebase.iid.FirebaseInstanceIdReceiver receiver to enable PNS registration and message receipt.

In most examples, you would first add a few constants to a C# class that will contain important connection details for your notification hub:

public override void OnTokenRefresh()
{
    var refreshedToken = FirebaseInstanceId.Instance.Token;
    Log.Debug(TAG, "FCM token: " + refreshedToken);
    SendRegistrationToServer(refreshedToken);
}

void SendRegistrationToServer(string token)
{
    // Register with Notification Hubs
    hub = new NotificationHub(Constants.NotificationHubName,
                              Constants.ListenConnectionString, this);

    var tags = new List<string>() { };
    var regID = hub.Register(token, tags.ToArray()).RegistrationId;
}


Sending messages from an application back end to Notification Hubs using C#

var alert = @"{""aps"": {""alert"": ""From contoso-admin: Hello World""}}";
await hub.SendAppleNativeNotificationAsync(alert);

var notif = @"{""data"": {""message"": ""From contoso-admin: Hello World""}}";
await hub.SendGcmNativeNotificationAsync(notif);
}
}


Create and configure an event hub

Azure Event Hubs is a highly scalable data streaming platform and event ingestion service capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

Event Hubs represents the "front door" for an event pipeline, often called an event ingestor in solution architectures. An event ingestor is a component or service that sits between event publishers and event consumers to decouple the production of an event stream from the consumption of those events.

Event Hubs capabilities are built around high-throughput and event-processing scenarios. Event Hubs contains the following key components:

● Partitions: Each consumer reads only a specific subset, or partition, of the message stream.
● Consumer groups: Views (state, position, or offset) of an entire event hub.
● Throughput units: Prepurchased units of capacity that control the throughput capacity of Event Hubs.
● Event receivers: Any entities that read event data from event hubs. All Event Hubs consumers connect through an AMQP 1.0 session.

Azure IoT Hub

Azure provides services specifically developed for diverse types of connectivity and communication to help you connect your data to the power of the cloud. Both Azure IoT Hub and Azure Event Hubs are cloud services that can ingest large amounts of data and process or store that data for business insights. The two services are similar in that they both support the ingestion of data with low latency and high reliability, but they are designed for different purposes. IoT Hub was developed specifically to address the unique requirements of connecting IoT devices—at scale—to Azure Cloud Services, while Event Hubs was designed for big data streaming. This is why Microsoft recommends using Azure IoT Hub to connect IoT devices to Azure.

Azure IoT Hub is the cloud gateway that connects IoT devices to gather data to drive business insights and automation. In addition, IoT Hub includes features that enrich the relationship between your devices and your back-end systems.

Whether an application or service runs in the cloud or on-premises, it often needs to interact with other applications or services. Different situations call for different styles of communication. Sometimes, letting applications send and receive messages through a simple queue is the best solution. In other
situations, an ordinary queue isn't enough; a queue with a publish-and-subscribe mechanism is better. In some cases, all that's needed is a connection between applications, and queues are not required.

To provide a broadly useful way to do this, Azure offers Azure Service Bus. Azure Service Bus provides all three options, enabling your applications to interact in several different ways. Service Bus is a multitenant cloud service, which means that the service is shared by multiple users.

When you create a queue, topic, or relay, you give it a name. Combined with whatever you called your namespace, this name creates a unique identifier for the object. Applications can provide this name to Service Bus and then use that queue, topic, or relay to communicate with each other.

Service Bus queues


Service Bus queues are commonly used for:

● Messaging: Transfer business data, such as sales or purchase orders, journals, or inventory movements.
● Decoupling applications: Improve the reliability and scalability of applications and services.
● Message sessions: Implement workflows that require message ordering or message deferral.

When a sender places a message in a queue, the message is held safely in redundant storage. Messages are delivered in pull mode, which delivers messages on request.

Queues enable asynchronous communication through an intermediary (broker). A message producer (sender) hands off a message to the queue and then continues its processing. Asynchronously, a message consumer (receiver) pulls the message from the queue and processes it. Messages are received and processed by the receivers in the order in which they were added to the queue, and each message is received and processed by only one message consumer. Using queues for communication between on-premises apps and Azure-hosted apps in a hybrid solution also adds resiliency in your architecture.


require "azure"

To create a connection to Service Bus using the client object, use the following code to set the values of the namespace, key name, key, signer, and host (the values below are placeholders):

sb_namespace = 'yourServiceBusNamespace'
sb_key_name = 'yourKeyName'
sb_key = 'yourKeyValue'
signer = Azure::ServiceBus::Auth::SharedAccessSigner.new(sb_key_name, sb_key)
sb_host = "https://#{sb_namespace}.servicebus.windows.net"
service_bus = Azure::ServiceBus::ServiceBusService.new(sb_host, { signer: signer })

The following example demonstrates how to send a test message to the queue named test-queue using send_queue_message():

message = Azure::ServiceBus::BrokeredMessage.new("test queue message")
service_bus.send_queue_message("test-queue", message)

Microsoft Graph

Microsoft Graph is the gateway to data and intelligence in Microsoft 365. Microsoft Graph provides a unified programmability model that you can use to query and manipulate data in Microsoft Office 365, Microsoft Enterprise Mobility + Security, and Windows 10.

● Windows 10 services: activities and devices

Microsoft Graph connects all the resources across these services using
relationships. For example, a user can be connected to a group through a memberOf relationship and to another user through a manager relationship. Your app can traverse these relationships to access these connected resources and perform actions on them through the API.

Example requests:

● GET https://graph.microsoft.com/v1.0/me/people (https://developer.microsoft.com/graph/graph-explorer/?request=me%2Fpeople&version=beta)
● GET items trending around me
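As an illustration, here is a minimal sketch of calling the people endpoint with HttpClient; acquiring the OAuth access token (for example, with MSAL) is assumed to have happened already, and the environment-variable name is a placeholder:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GraphSketch
{
    static async Task Main()
    {
        string accessToken = Environment.GetEnvironmentVariable("GRAPH_ACCESS_TOKEN");

        using (var client = new HttpClient())
        {
            // Microsoft Graph expects the token as a Bearer authorization header.
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // Query the people most relevant to the signed-in user.
            HttpResponseMessage response =
                await client.GetAsync("https://graph.microsoft.com/v1.0/me/people");

            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}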

Many personal computers and workstations have several CPU cores that enable the simultaneous execution of multiple threads. As client devices increase in performance and server computers opt for horizontal over vertical scaling, it's becoming increasingly common for a local client device to have more computing power and threads than the server it is currently accessing. In the near future, computers are expected to have even more cores, and this divide will continue to grow. It is more critical now than ever to take advantage of parallelism in client devices, server-side computers, and cloud-managed computing.

Parallelism


To get started with the TPL, create a Task instance that has a delegate parameter. The delegate contains the code that is encapsulated and executed in the context of the task. The actual Task instance contains the state of the asynchronous operation. After you have a Task instance, invoke the Task.Run method to start the task. The Task class contains a Wait method that can be invoked to block execution until the asynchronous task has finished executing:

Task<string> asyncOperation = Task.Run(() =>
{
    string greeting = $"You are running {Environment.OSVersion}";
    return greeting;
});

string message = asyncOperation.Result;

Note: The TPL also contains convenient helper methods, such as Task.Run and TaskFactory.StartNew, that create and start a task in a single call.

C# also provides the async and await keywords, which simplify the tracking of asynchronous methods. The keywords and their abstraction make your code more efficient and easier to read. To get started with this keyword, first define the delegate as a separate method by using the async modifier:

// The method name here is illustrative.
public static async Task<string> GetGreetingAsync()
{
    string greeting = $"You are running {Environment.OSVersion}";
    return greeting;
}

You can then define another method to use the asynchronous method. Within this method, use the await keyword to asynchronously call the first method:

public static async Task PrintGreetingAsync()
{
    string message = await GetGreetingAsync();
    Console.WriteLine(message);
}


Implement Azure Functions and Azure Logic Apps

Azure Functions integrates with various Azure and third-party services. These services can trigger your function and start execution, or they can serve as input and output for your code. Azure Functions supports the following service integrations:

For example, a binding is configured in the function's function.json file:

{
    "disabled": false,
    "bindings": [
        {
            "type": "queueTrigger",
            "direction": "in",
            "name": "message"
        }
    ]
}

The C# script for the corresponding function uses the name property of the "in" binding as a method parameter:

public static void Run(string message, TraceWriter log)
{
    log.Info($"New message: {message}");
}

Azure Logic Apps helps you build solutions that integrate apps, data, systems, and services across enterprises or organizations. Logic Apps simplifies how you design and create scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication. For example, you can use Logic Apps to:

● Move uploaded files from an FTP server to Azure Storage.
● Monitor tweets for a specific subject, analyze the sentiment, and create alerts or tasks for items that need review.
● Create scheduled workflows with the Schedule trigger.

Each time the trigger fires, the Logic Apps engine creates a logic app instance that runs the workflow's actions. For example, Dynamics 365 provides a trigger that has the built-in criterion "When a record is updated." If the trigger detects an event that matches this criterion, it fires and runs the workflow.

You can visually build your logic apps by using the Logic Apps designer, which is available in the Azure portal through your browser and in Visual Studio. You can also use Azure PowerShell commands and Azure Resource Manager templates for select tasks. For more-custom logic apps, you can create or edit logic app definitions in JavaScript Object Notation (JSON) by working in code view mode. Here is an example:

    }
  },
  "actions": {
    "readData": {
      "type": "Http",
      "inputs": {
        "method": "GET",
        "uri": "@parameters('uri')"
      }
    }
  },
  "outputs": {}
}

4. In the New blade, in the Search the Marketplace box, enter function app, and then press Enter.

5. In the search results, select the Function App template.

3. In the OS group, select Windows.

4. In the Hosting Plan list, select Consumption Plan.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
log.Info("C# HTTP trigger executed");
return req.CreateResponse(
await req.Content.ReadAsAsync<object>()
);
}

18. Note: This script will echo the body of a request back to the calling client.

2. Replace the request body field's value with the following JSON object:

{
"sizeSystem": "US",
"sizeType": "regular",
"targetCountry": "US",
"taxes": [
{
"country": "US",
"rate": "9.9",
"region": "CA",
"taxShip": "True"
}
],
"title": "Cute Toddler Sundress"
}

7. At the top of the FUNCDEMO blade, select Delete resource group.

8. In the deletion confirmation dialog box, enter the name FUNCDEMO, and then select Delete.


Blocking the calling thread during I/O can reduce performance and affect vertical scalability. You can see this antipattern when your application attempts to access data synchronously via either Azure Storage or an Azure SQL Database instance.

To fix this problem, implement asynchronous interfaces for all of your data-tier code that accesses any I/O-bound resource. For example, consider a web application that accesses data in an Azure SQL Database instance using Entity Framework Core. In the simplest implementation, you create a DbContext implementation that is used directly in the controller action:

public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ItemsController : Controller
{
    public string Get(int id)
    {
        ItemsContext context = new ItemsContext();
        Item item = context.Items.Find(id);
        return item.Name;
    }
}

This implementation has two problems. First, it couples your web application to the specific technology used to implement the object-relational mapper, because Entity Framework–specific code exists directly in your web application. Things might be even worse if you chose to use a different database service altogether. Second, you might observe I/O blocking while your code uses the synchronous version of the Entity Framework methods in the System.Data.Entity namespace. You can address both problems by hiding data access behind an asynchronous interface and awaiting its methods:

public class EntityFrameworkDataLayer : IDataLayer
{
    public async Task<string> GetNameForIdAsync(int id)
    {
        ItemsContext context = new ItemsContext();
        Item item = await context.Items.FindAsync(id);
        return item.Name;
    }
}

public class ItemsController : Controller
{
    public async Task<string> Get(IDataLayer dataLayer, int id)
    {
        return await dataLayer.GetNameForIdAsync(id);
    }
}


Review Questions

Module 3 Review Questions

The solution must route messages to manufacturing, create orders in a sales system, and deliver notifications. What should you use?

Suggested Answer ↓

Use Azure Logic Apps. Logic Apps simplifies how you design and create scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication—whether in the cloud, on-premises, or both.

How can you ensure that a user request does not affect the experience of other users? Which design pattern should be used to develop the code? Which tools are available?

Suggested Answer ↓

Computing patterns

You are designing a solution to handle order processing from an online e-Commerce site. Which computing pattern is most appropriate?

Suggested Answer ↓

Cloud workloads follow several common computing design patterns. For this scenario, unpredictable bursting is most appropriate. Solutions that use this pattern are designed to respond to unexpected spikes in activity. Resources are automatically added to handle increased workloads and then released when no longer necessary.

In distributed application scenarios, you are often advised to scale out your application horizontally by adding multiple, small instances of your application to the cloud. With horizontal scaling, you can serve more client devices and ensure high availability and resiliency against any specific fault.

Unfortunately, application workloads are unpredictable. To illustrate this, here are four of the most common computing patterns you will see for web applications hosted in the cloud:

● On and off: Workloads occur occasionally (for example, batch processing). Over-provisioned capacity is wasted, and the time to market can be cumbersome.


You can easily overestimate or underestimate the number of instances required to provide the best user experience. If you overestimate, you can end up paying for unnecessary compute resources. If you underestimate, your users can experience degraded performance.

To provide redundancy and improved performance, applications are typically distributed across multiple instances. Customers may access your application through a load balancer that distributes requests to any of the application instances.

A primary advantage of the cloud is elastic scaling—the ability to use as much capacity as you need, scaling out as load increases and scaling in when the extra capacity is not needed. In the context of Microsoft Azure, many services provide the capability to both manually and automatically scale, so your cloud service can scale out and in to exactly match the number of instances needed for your specific computing pattern.

Auto-scale metrics

Auto-scale settings help ensure that you have the right amount of resources running to handle the fluctuating load of your application. You can configure auto-scale settings to be triggered based on metrics or on a schedule. Auto-scale settings can also notify you when they trigger:

● Send email notifications to the service administrator and co-administrators
● Send email to additional email addresses that you specify
● Call a webhook
● Start the execution of an Azure runbook


az vm list --query '[].{name:name, image:storageProfile.imageReference.offer}'

Using the [] operator, you can create queries that filter your result set by comparing the values of various JSON properties:

az vm list --query "[?starts_with(storageProfile.imageReference.offer, 'Ubuntu')].{name:name, id:vmId}"

The output includes the name and unique ID of each specific VM in your subscription.

Implement code that addresses singleton application instances

{
  "clientId": "b52dd125-9272-4b21-9862-0be667bdf6dc",
  "clientSecret": "ebc6e170-72b2-4b6f-9de2-99410964d2d0",
  "subscriptionId": "ffa52f27-be12-4cad-b1ea-c2c241b6cceb",
  "tenantId": "72f988bf-86f1-41af-91ab-2d7cd011db47",
  "activeDirectoryEndpointUrl": "https://login.microsoftonline.com",
  "resourceManagerEndpointUrl": "https://management.azure.com/",
  "activeDirectoryGraphResourceId": "https://graph.windows.net/",
  "sqlManagementEndpointUrl": "https://management.core.windows.net:8443/",
  "galleryEndpointUrl": "https://gallery.azure.com/",
  "managementEndpointUrl": "https://management.core.windows.net/"
}

azure.VirtualMachines

The properties have both synchronous and asynchronous versions of methods to perform actions such as Create, Delete, List, and Get. If we wanted to get a list of VMs asynchronously, we could use the ListAsync method:
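A sketch of how this might look end to end with the fluent management library, assuming the credentials file shown above was saved as azureauth.properties:

// Authenticate with the service principal file and build the entry point.
var credentials = SdkContext.AzureCredentialsFactory.FromFile("azureauth.properties");

IAzure azure = Azure.Configure()
    .Authenticate(credentials)
    .WithDefaultSubscription();

// List the virtual machines in the subscription asynchronously.
var vms = await azure.VirtualMachines.ListAsync();
foreach (var vm in vms)
{
    Console.WriteLine(vm.Name);
}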

Implement code that addresses a transient state

Handling transient errors

In the cloud, transient faults aren't uncommon, and an application should be designed to handle them elegantly and transparently. This minimizes the effects faults can have on the business tasks the application is performing.

For the more common transient failures, the period between retries should be chosen to spread requests from multiple instances of the application as evenly as possible. This reduces the chance of a busy service continuing to be overloaded. If many instances of an application are continually overwhelming a service with retry requests, it'll take the service longer to recover.

If the request still fails, the application can wait and make another attempt. If necessary, this process can be repeated with increasing delays between retry attempts, until some maximum number of requests have been attempted. The delay can be increased incrementally or exponentially depending on the type of failure and the probability that it'll be corrected during this time.
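As a small illustration of the exponential variant, a helper like the following doubles the wait before each attempt; the base delay and cap are arbitrary choices:

// Returns the delay before the given retry attempt (1-based),
// doubling each time: 2s, 4s, 8s, ... capped at 60 seconds.
static TimeSpan GetExponentialBackoff(int attempt)
{
    double seconds = Math.Min(Math.Pow(2, attempt), 60);
    return TimeSpan.FromSeconds(seconds);
}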


1. The application invokes the operation. The request fails, and the application detects the resulting exception and handles it accordingly.

2. The application waits for a short interval and tries again. The request still fails.

3. The application waits for a longer interval and tries again. This time, the request succeeds with HTTP response code 200 (OK).

The application should wrap all attempts to access a remote service in code that implements a retry policy matching one of the strategies listed above. Requests sent to different services can be subject to different policies and retry behaviors. If a service is frequently unavailable or busy, it's often because the service has exhausted its resources. You can reduce the frequency of these faults by scaling out the service. For example, if a database service is continually overloaded, it might be beneficial to partition the database and spread the load across multiple servers.

The OperationWithBasicRetryAsync method, shown below, invokes an external service asynchronously through the TransientOperationAsync method:

private int retryCount = 3;
private readonly TimeSpan delay = TimeSpan.FromSeconds(5);
The statement that invokes this method is contained in a try/catch block wrapped in a for loop. The for loop exits if the call to the TransientOperationAsync method succeeds without throwing an exception. If the TransientOperationAsync method fails, the catch block examines the reason for the failure. If it's believed to be a transient error, the code waits for a short delay before retrying the operation.

The for loop also tracks the number of times that the operation has been attempted, and if the code fails three times, the exception is assumed to be longer lasting. If the exception isn't transient or it's long lasting, the catch handler throws an exception. This exception exits the for loop and should be caught by the code that invokes the OperationWithBasicRetryAsync method.

Review Questions

Auto-scale metrics
You are designing a solution for a university. Students will use the solution to register for classes. You plan to implement elastic scaling to handle the high levels of activity at the beginning of each class registration period.

You plan to implement an App Service instance and trigger auto-scaling based on metrics. Which metrics can you use?

What options are available to handle transient errors?

Suggested Answer ↓
When an application loses a connection to a resource such as a database, you can use cancel, retry, or retry with delay logic to handle and potentially resolve the issue. In this case, the database may become unavailable due to network issues. Alternatively, the database may be busy with maintenance tasks. You can implement retry logic to reconnect to the database.

Getting started with free trials

Signing up for free trials takes only an email address and a few simple steps. You will need a Microsoft Account if you don't already have one. You will receive a unique pair of keys for each API requested; the second one is just a spare. Please do not share the secret keys with anyone. Trials have both a rate limit, in terms of transactions per second or minute, and a monthly usage cap. A transaction is simply an API call. You can upgrade to paid tiers to unlock the restrictions.


Microsoft Computer Vision algorithms can analyze visual content in different ways based on inputs and user choices.

● Image dimension: Greater than 50 x 50 pixels.

Tagging images

After uploading an image or specifying an image URL, the Computer Vision API's algorithms output tags for the visual content identified in the image. The API can also generate a 'description' displayed as human-readable language formatted in complete sentences. Note that, at this point, English is the only supported language for image description.

In addition to tagging and descriptions, the Computer Vision API returns the taxonomy-based categories defined in previous versions. These categories are organized as a taxonomy with parent/child hereditary hierarchies.

There are several ways to categorize images. Computer Vision API can set a boolean flag to indicate whether an image is black and white or color. The API can also set a flag to indicate whether an image is a line drawing or not. It can also indicate whether an image is clip art or not and indicate its quality on a scale of 0-3.

Domain-specific content

Analyze only a chosen model by invoking an HTTP POST call. If you know which model you want to use, specify the model's name. You only get information relevant to that model. For example, you can use this option to only look for celebrity-recognition. The response contains a list of potential matching celebrities, accompanied by their confidence scores.

Requirements for OCR:
● The size of the input image must be between 40 x 40 and 3200 x 3200 pixels.

● The image cannot be bigger than 10 megapixels.

OCR accuracy can be reduced by:

● Artistic font styles.
● Small text size.

Text recognition saves time and effort. You can be more productive by taking an image of text rather than transcribing it. Text recognition makes it possible to digitize notes. This digitization allows you to implement quick and easy search. It also reduces paper clutter.

Input requirements:
● Supported image formats: JPEG, PNG, and BMP.

Develop Solutions using Computer Vision

● Click the Browse tab, and in the Search box type "Microsoft.Azure.CognitiveServices.Vision.ComputerVision".

● Select Microsoft.Azure.CognitiveServices.Vision.ComputerVision when it displays, then click the checkbox next to your project name, and Install.

7. Optionally, set remoteImageUrl to a different image.

8. Run the program.

private const string remoteImageUrl =
    "http://upload.wikimedia.org/wikipedia/commons/3/3c/Shaki_waterfall.jpg";

// Specify the features to return
private static readonly List<VisualFeatureTypes> features =
    new List<VisualFeatureTypes>()
    {
        VisualFeatureTypes.Categories, VisualFeatureTypes.Description,
        VisualFeatureTypes.Faces, VisualFeatureTypes.ImageType,
        VisualFeatureTypes.Tags
    };


ComputerVisionAPI computerVision = new ComputerVisionAPI(
    new ApiKeyServiceClientCredentials(subscriptionKey),
    new System.Net.Http.DelegatingHandler[] { });

// You must use the same region as you used to get your subscription
// keys. For example, if you got your subscription keys from westus,
// replace "Westcentralus" with "Westus".
//
// Free trial subscription keys are generated in the westcentralus region.
// If you use a free trial subscription key, you shouldn't need to change
// the region.

// Specify the Azure region
computerVision.AzureRegion = AzureRegions.Westcentralus;

Console.WriteLine("Press any key to exit");
Console.ReadLine();

// Analyze a remote image
private static async Task AnalyzeRemoteAsync(
    ComputerVisionAPI computerVision, string imageUrl)
{
    if (!Uri.IsWellFormedUriString(imageUrl, UriKind.Absolute))
    {
        Console.WriteLine(
            "\nInvalid remoteImageUrl:\n{0} \n", imageUrl);
        return;
    }

    ImageAnalysis analysis =
        await computerVision.AnalyzeImageAsync(imageUrl, features);
    DisplayResults(analysis, imageUrl);
}

// Analyze a local image
private static async Task AnalyzeLocalAsync(
    ComputerVisionAPI computerVision, string imagePath)
{
    using (Stream imageStream = File.OpenRead(imagePath))
    {
        ImageAnalysis analysis = await computerVision.AnalyzeImageInStreamAsync(
            imageStream, features);
        DisplayResults(analysis, imagePath);
    }
}

// Display the most relevant caption for the image
private static void DisplayResults(ImageAnalysis analysis, string imageUri)
{
Console.WriteLine(imageUri);
Console.WriteLine(analysis.Description.Captions[0].Text + "\n");
}
}
}

The following information shows you how to generate a thumbnail from an image using the Computer Vision Windows client library.

Prerequisites


● The Microsoft.Azure.CognitiveServices.Vision.ComputerVision client library NuGet package. It isn't necessary to download the package. Installation instructions are provided below.

GenerateThumbnailAsync method

You can use the GenerateThumbnailAsync method to generate a thumbnail from an input image. Computer Vision uses smart cropping to intelligently identify the region of interest and generate the thumbnail around it.

To run the sample, do the following steps:

1. Create a new Visual C# Console App in Visual Studio.
2. Install the NuGet package: click the Browse tab, and in the Search box type "Microsoft.Azure.CognitiveServices.Vision.ComputerVision". Select Microsoft.Azure.CognitiveServices.Vision.ComputerVision when it displays, then click the checkbox next to your project name, and Install.
3. Replace "Westcentralus" with the region where you obtained your subscription keys, if necessary.
4. Optionally, replace <LocalImage> with the path and file name of a local image (it is ignored if not set).

using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;

using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;

{

class Program



static void Main(string[] args)
{
    ComputerVisionAPI computerVision = new ComputerVisionAPI(
        new ApiKeyServiceClientCredentials(subscriptionKey),
        new System.Net.Http.DelegatingHandler[] { });

    // You must use the same region as you used to get your subscription
    // keys. For example, if you got your subscription keys from westus,
    // replace "Westcentralus" with "Westus".

// Create a thumbnail from a remote image
private static async Task GetRemoteThumbnailAsync(
    ComputerVisionAPI computerVision, string imageUrl)
{
    if (!Uri.IsWellFormedUriString(imageUrl, UriKind.Absolute))
    {
        Console.WriteLine(
            "\nInvalid remoteImageUrl:\n{0} \n", imageUrl);
        return;
    }

    // ... call GenerateThumbnailAsync and write the result to disk ...
}

When the program runs successfully, the console output resembles the following:

Thumbnail written to: C:\Documents\LocalImage_thumb.jpg
Thumbnail written to: ...\bin\Debug\Bloodhound_Puppy_thumb.jpg

● Any edition of Visual Studio 2015 or 2017.

● The Microsoft.Azure.CognitiveServices.Vision.ComputerVision11 client library NuGet package. It isn't necessary to download the package. Installation instructions are provided below.

● A description of image content in a complete sentence.
● The coordinates, gender, and age of any faces contained in the image.
● The ImageType (clip art or a line drawing).

//
// Free trial subscription keys are generated in the westcentralus
// region. If you use a free trial subscription key, you shouldn't
// need to change the region.

// Specify the Azure region
computerVision.AzureRegion = AzureRegions.Westcentralus;

await GetTextAsync(computerVision, textHeaders.OperationLocation);

private static async Task ExtractLocalHandTextAsync(
    ComputerVisionAPI computerVision, string imagePath)
{
    if (!File.Exists(imagePath))
    {
        Console.WriteLine(
            "\nUnable to open or read localImagePath:\n{0} \n", imagePath);
        return;
    }

    using (Stream imageStream = File.OpenRead(imagePath))
    {
        // Start the async process to recognize the text
        RecognizeTextInStreamHeaders textHeaders =
            await computerVision.RecognizeTextInStreamAsync(
                imageStream, TextRecognitionMode.Handwritten);

        await GetTextAsync(computerVision, textHeaders.OperationLocation);
    }
}

// Retrieve the recognized text
private static async Task GetTextAsync(
    ComputerVisionAPI computerVision, string operationLocation)
{
    // Retrieve the URI where the recognized text will be
    // stored from the Operation-Location header
    string operationId = operationLocation.Substring(
        operationLocation.Length - numberOfCharsInOperationId);

    TextOperationResult result =
        await computerVision.GetTextOperationResultAsync(operationId);

    // Wait for the operation to complete
    int i = 0;
    int maxRetries = 10;
    while ((result.Status == TextOperationStatusCodes.Running ||
            result.Status == TextOperationStatusCodes.NotStarted) && i++ < maxRetries)
    {
        Console.WriteLine(
            "Server status: {0}, waiting {1} seconds...", result.Status, i);
        await Task.Delay(1000);

        result = await computerVision.GetTextOperationResultAsync(operationId);
    }
}

RecognizeTextAsync response

A successful response displays the lines of recognized text for each image.

Develop Solutions using Bing Web Search

Images

The images answer contains a list of images that Bing thought were relevant to the query. Each image in the list includes the URL of the image, its size, its dimensions, and its encoding format. The image object also includes the URL of a thumbnail of the image and the thumbnail's dimensions.

Videos

The videos answer contains a list of videos that Bing thought were relevant to the query. The video object also includes the URL of a thumbnail of the video and the thumbnail's dimensions.

News

TimeZone

If the user enters a time or date query, the response may contain a TimeZone answer. This answer supports implicit or explicit queries. An implicit query, such as "What time is it?", returns the local time of the user's location. An explicit query, such as "What time is it in Seattle?", returns the local time of Seattle, WA.

Develop a Bing Web Search query in C#

const string searchTerm = "Microsoft Cognitive Services";

// Used to return search results including relevant headers
struct SearchResult
{
    public String jsonResult;
    public Dictionary<String, String> relevantHeaders;
}

    Console.WriteLine("\nJSON Response:\n");
    Console.WriteLine(JsonPrettyPrint(result.jsonResult));
}
else
{
    Console.WriteLine("Invalid Bing Search API subscription key!");
    Console.WriteLine("Please paste yours into the source code.");
}

Console.Write("\nPress Enter to exit ");
Console.ReadLine();
}


// Performs a Bing Web Search request and returns the result
static SearchResult BingWebSearch(string searchQuery)
{
    // Construct the URI of the search request
    var uriQuery = uriBase + "?q=" + Uri.EscapeDataString(searchQuery);

    // Perform the Web request and get the response
    WebRequest request = HttpWebRequest.Create(uriQuery);
    request.Headers["Ocp-Apim-Subscription-Key"] = subscriptionKey;
    HttpWebResponse response = (HttpWebResponse)request.GetResponseAsync().Result;
    string json = new StreamReader(response.GetResponseStream()).ReadToEnd();

    // Create the result object for return
    var searchResult = new SearchResult()
    {
        jsonResult = json,
        relevantHeaders = new Dictionary<String, String>()
    };

    // Extract Bing HTTP headers
    foreach (String header in response.Headers)
    {
        if (header.StartsWith("BingAPIs-") || header.StartsWith("X-MSEdge-"))
            searchResult.relevantHeaders[header] = response.Headers[header];
    }

    return searchResult;
}

/// <summary>
/// Formats the given JSON string by adding line breaks and indents.
/// </summary>
/// <param name="json">The raw JSON string to format.</param>
/// <returns>The formatted JSON string.</returns>
static string JsonPrettyPrint(string json)
{
    if (string.IsNullOrEmpty(json))
        return string.Empty;

    StringBuilder sb = new StringBuilder();
    bool quote = false;
    bool ignore = false;
    char last = ' ';


    foreach (char ch in json)
    {
        switch (ch)
        {
            case '"':
                if (!ignore) quote = !quote;
                break;
            case '\\':
                if (quote && last != '\\') ignore = true;
                break;
        }
    }


{
  ...
  "relatedSearches": {
    "value": [
      { "text": "microsoft cognitive services news", ... },
      { "text": "ms cognitive service", ... },
      { "text": "microsoft cognitive services text analytics",
        "displayText": "microsoft cognitive services text analytics", ... },
      { "text": "microsoft cognitive services toolkit",
        "displayText": "microsoft cognitive services toolkit", ... },
      { "displayText": "microsoft cognitive services api", ... }
    ]
  },
  "rankingResponse": {
    "mainline": {
      "items": [
        {
          "value": {
            "id": "https://api.cognitive.microsoft.com/api/v7/#WebPages.0"
          }
        },
        ...
      ]
    },
    "sidebar": {
      "items": [ ... ]
    }
  }
}

GET https://api.cognitive.microsoft.com/bing/v7.0/search?q=sailing+dinghies&responseFilter=images%2Cvideos%2Cnews&mkt=en-us HTTP/1.1
Ocp-Apim-Subscription-Key: 123456789ABCDE
User-Agent: Mozilla/5.0 (compatible; MSIE 10.0; Windows Phone 8.0; Trident/6.0; IEMobile/10.0; ARM; Touch; NOKIA; Lumia 822)
X-Search-ClientIP: 999.999.999.999
X-Search-Location: lat:47.60357;long:-122.3295;re:100
X-MSEdge-ClientID: <blobFromPriorResponseGoesHere>


GET https://api.cognitive.microsoft.com/bing/v7.0/search?q=sailing+dinghies&answerCount=2&mkt=en-us HTTP/1.1
Ocp-Apim-Subscription-Key: 123456789ABCDE
User-Agent: Mozilla/5.0 (compatible; MSIE 10.0; Windows Phone 8.0; Trident/6.0; IEMobile/10.0; ARM; Touch; NOKIA; Lumia 822)
X-Search-ClientIP: 999.999.999.999
X-Search-Location: lat:47.60357;long:-122.3295;re:100
X-MSEdge-ClientID: <blobFromPriorResponseGoesHere>
Host: api.cognitive.microsoft.com

The response includes only webPages and images.


Promoting answers that are not ranked

GET https://api.cognitive.microsoft.com/bing/v7.0/search?q=sailing+dinghies&answerCount=2&promote=images%2Cvideos&mkt=en-us HTTP/1.1
X-Search-Location: lat:47.60357;long:-122.3295;re:100
X-MSEdge-ClientID: <blobFromPriorResponseGoesHere>

The following is the response to the above request. Bing returns the top two ranked answers, webpages and images, along with the promoted videos answer:

{
  "queryContext": {
    "originalQuery": "sailiing dinghies"
  },
  ...
  "rankingResponse": { ... }
}

If you set promote to news, the response doesn't include the news answer because it is not a ranked answer for this query. Promoting videos and news instead produces a response that contains videos and news.

You may use promote only if you specify the answerCount query parameter.

The Custom Speech Service enables you to create customized language models and acoustic models tailored to your application and your users. By uploading your specific speech and/or text data to the Custom Speech Service, you can create custom models that can be used in conjunction with Microsoft’s existing state-of-the-art speech models.

For example, if you’re adding voice interaction to a mobile phone, tablet or PC app, you can create a custom language model that can be combined with Microsoft’s acoustic model to create a speech-to-text endpoint designed especially for your app. If your application is designed for use in a particular environ-ment or by a particular user population, you can also create and deploy a custom acoustic model with this service.

Both the acoustic and language models are statistical models learned from training data. As a result, they perform best when the speech they encounter when used in applications is similar to the data observed during training. The acoustic and language models in the Microsoft Speech-To-Text engine have been trained on an enormous collection of speech and text and provide state-of-the-art performance for the most common usage scenarios, such as interacting with Cortana on your smart phone, tablet or PC, searching the web by voice or dictating text messages to a friend.

Why use the Custom Speech Service?

● If you have background noise in your data, it is recommended to also have some examples with longer segments of silence, for example, a few seconds, in your data, before and/or after the speech content.

● Each audio file should consist of a single utterance, for example, a single sentence for dictation, a single query, or a single turn of a dialog system.

The transcriptions for all WAV files should be contained in a single plain-text file. Each line of the transcription file should have the name of one of the audio files, followed by the corresponding transcription. The file name and transcription should be separated by a tab (\t).

For example:
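A hypothetical transcription file with two entries might look like this (the file names and text are placeholders, separated by a tab):

speech01.wav	the quick brown fox jumped all over the place
speech02.wav	the lazy dog was not amused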

The following steps are done using the Custom Speech Service Portal18.

17 https://docs.microsoft.com/en-us/azure/cognitive-services/custom-speech-service/customspeech-how-to-topics/cognitive-services- custom-speech-transcription-guidelines
18 https://cris.ai/


To upload your data, first ensure you are signed into the Custom Speech Service Portal. Then click the “Custom Speech” drop-down menu on the top ribbon and select “Adaptation Data”. If this is your first time uploading data, the table will be empty.

Enter a Name and Description in the appropriate text boxes. These are useful for keeping track of the various data sets you upload. Next, click “Choose File” for the “Transcription File” and “WAV files”, select your transcription file and the zipped WAV files, and then click “Import” to upload your data.

When the import is complete, you will return to the acoustic data table and will see an entry that corresponds to your acoustic data set. Notice that it has been assigned a unique id (GUID). The data will also have a status that reflects its current state. Its status will be “Waiting” while it is being queued for processing, “Processing” while it is going through validation, and “Complete” when the data is ready for use.

When the status is “Complete”, you can click “Details” to see the acoustic data verification report. The report lists how many utterances passed and failed verification. In the example below, two WAV files failed verification because of improper audio format (in this data set, one had an incorrect sampling rate and one was the incorrect file format).

To create a custom acoustic model, click “Acoustic Models” in the “Custom Speech” drop-down menu. You will see a table called “Your models” that lists all of your custom acoustic models. This table will be empty if this is your first use.

The current locale is shown in the table title. Currently, acoustic models can be created for US English only.

In this lesson, you learn how to:

● Prepare the data

Ensure that your Cognitive Services account is connected to a subscription by opening the Cognitive Services Subscriptions20 page.

If no subscriptions are listed, you can either have Cognitive Services create an account for you by clicking the Get free subscription button, or you can connect to a Custom Speech Service subscription created in the Azure portal by clicking the Connect existing subscription button.

When the import is complete, you will return to the language data table and will see an entry that corresponds to your language data set. Notice that it has been assigned a unique id (GUID). The data will also have a status that reflects its current state. Its status will be “Waiting” while it is being queued for processing, “Processing” while it is going through validation, and “Complete” when the data is ready for use. Data validation performs a series of checks on the text in the file and some text normalization of the data.


When the status of the language data set is “Complete”, it can be used to create a custom language model.

After you have specified the base language model, select the language data set you wish to use for the customization using the “Language Data” drop-down menu.

QnA Maker enables you to power a question and answer service from your semi-structured content like FAQ (Frequently Asked Questions) documents or URLs and product manuals. You can build a model of questions and answers that is flexible to user queries, providing responses that you'll train a bot to use in a natural, conversational way.

An easy-to-use graphical user interface enables you to create, manage, train and get your service up and running without any developer experience.

Prerequisites

If your preferred IDE is Visual Studio, you'll need Visual Studio 2017 to run this code sample on Windows. (The free Community Edition will work.)

You must have a trial or paid subscription key from your new API account in your Azure dashboard. To retrieve your key, select Keys under Resource Management in your dashboard. Either key will work for this quickstart.

3. Replace the key value with your valid subscription key.

4. Run the program.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace QnAMaker
{
    class Program
    {
        // QnA Maker REST endpoint paths.
        static string host = "https://westus.api.cognitive.microsoft.com";
        static string service = "/qnamaker/v4.0";
        static string method = "/knowledgebases/create";

        // NOTE: Replace this with a valid subscription key.
        static string key = "YOUR SUBSCRIPTION KEY HERE";

        /// <summary>
        /// Defines the data source used to create the knowledge base.
        /// (Shortened placeholder; supply your own question-and-answer pairs.)
        /// </summary>
        static string kb = @"{ 'name': 'QnA Maker FAQ', 'qnaList': [ { 'id': 0, 'answer': 'You can use our REST APIs to manage your knowledge base.', 'source': 'Custom Editorial', 'questions': [ 'How do I manage my knowledge base?' ] } ] }";

        /// <summary>
        /// Represents the HTTP response headers and body returned by QnA Maker.
        /// </summary>
        public struct Response
        {
            public HttpResponseHeaders headers;
            public string response;

            public Response(HttpResponseHeaders headers, string response)
            {
                this.headers = headers;
                this.response = response;
            }
        }

        /// <summary>
        /// Asynchronously creates the knowledge base.
        /// </summary>
        /// <param name="kb">The data source for the knowledge base.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        /// <remarks>The method constructs the URI to create a knowledge base in QnA Maker, and then
        /// asynchronously invokes the <see cref="QnAMaker.Program.Post(string, string)"/> method
        /// to send the HTTP request.</remarks>
        async static Task<Response> PostCreateKB(string kb)
        {
            // Builds the HTTP request URI. The Post(string, string) helper
            // mirrors the Patch and Get helpers shown later in this lesson.
            string uri = host + service + method;
            return await Post(uri, kb);
        }

        /// <summary>
        /// Gets the status of the specified QnA Maker operation.
        /// </summary>
        /// <param name="operation">The QnA Maker operation to check.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        /// <remarks>Constructs the URI to get the status of a QnA Maker
        /// operation, then asynchronously invokes the <see cref="QnAMaker.Program.Get(string)"/>
        /// method to send the HTTP request.</remarks>
        async static Task<Response> GetStatus(string operation)
        {
            // Builds the HTTP request URI.
            string uri = host + service + operation;
            // Asynchronously invokes the Get(string) method, using the
            // HTTP request URI.
            return await Get(uri);
        }

        /// <summary>
        /// Creates the knowledge base, then periodically checks the status
        /// until the knowledge base is created.
        /// </summary>
        async static Task Main(string[] args) // async Main requires C# 7.1 or later.
        {
            // Starts the QnA Maker operation to create the knowledge base.
            var response = await PostCreateKB(kb);

            // Retrieves the operation ID, so the operation's status can be
            // checked periodically.
            var operation = response.headers.GetValues("Location").First();

            // Displays the JSON in the HTTP response returned by the
            // PostCreateKB(string) method.
            Console.WriteLine(PrettyPrint(response.response));

            // Iteratively gets the state of the operation creating the
            // knowledge base. Once the operation state is something other
            // than "Running" or "NotStarted", the loop ends.
            var done = false;
            while (true != done)
            {
                response = await GetStatus(operation);

                // Displays the JSON in the HTTP response returned by the
                // GetStatus(string) method.
                Console.WriteLine(PrettyPrint(response.response));

                // Deserializes the response body to read the operation state.
                var fields = JsonConvert.DeserializeObject<Dictionary<string, string>>(response.response);

                if (fields["operationState"] == "Running" || fields["operationState"] == "NotStarted")
                {
                    // QnA Maker is still creating the knowledge base. The thread is
                    // paused for a number of seconds equal to the Retry-After header value,
                    // and then the loop continues.
                    var wait = response.headers.GetValues("Retry-After").First();
                    Console.WriteLine("Waiting " + wait + " seconds...");
                    Thread.Sleep(Int32.Parse(wait) * 1000);
                }
                else
                {
                    // QnA Maker has completed creating the knowledge base.
                    done = true;
                }
            }

            Console.WriteLine("Press any key to continue.");
            Console.ReadLine();
        }
    }
}

When the knowledge base has been created, the final status check returns JSON like the following, and the program prints it before exiting:

{
  "operationState": "Succeeded",
  "createdTimestamp": "2018-06-25T10:30:15Z",
  "lastActionTimestamp": "2018-06-25T10:30:51Z",
  "resourceLocation": "/knowledgebases/1d9eb2a1-de2a-4709-91b2-f6ea8afb6fb9",
  "userId": "0d85ec294c284197a70cfeb51775cd22",
  "operationId": "d9d40918-01bd-49f4-88b4-129fbc434c94"
}
Press any key to continue.

Once your knowledge base is created, you can view it in your QnA Maker Portal, My knowledge bases26 page. Select your knowledge base name, for example QnA Maker FAQ, to view.

3. Replace the key value with a valid subscription key.

4. Replace the kb value with a valid knowledge base ID. Find this value by going to one of your QnA Maker knowledge bases28. Select the knowledge base you want to update. Once on that page, find the ‘kbid=’ in the URL as shown below. Use its value for your code sample.

5. Run the program.

namespace QnAMaker
{
    class Program
    {
        // Fields such as host, service, method, key, and kb parallel the
        // create-knowledge-base sample earlier in this lesson; not all of
        // them are shown in this extract.

26 https://www.qnamaker.ai/Home/MyServices
27 https://westus.dev.cognitive.microsoft.com/docs/services/5a93fcf85b4ccd136866eb37/operations/5ac266295b4ccd1554da7600
28 https://www.qnamaker.ai/Home/MyServices

        /// <summary>
        /// Formats and indents JSON for display.
        /// </summary>
        /// <param name="s">The JSON to format and indent.</param>
        /// <returns>A string containing formatted and indented JSON.</returns>
        static string PrettyPrint(string s)
        {
            return JsonConvert.SerializeObject(JsonConvert.DeserializeObject(s), Formatting.Indented);
        }

        /// <summary>
        /// Asynchronously sends a PATCH HTTP request.
        /// </summary>
        /// <param name="uri">The URI of the HTTP request.</param>
        /// <param name="body">The body of the HTTP request.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        async static Task<Response> Patch(string uri, string body)
        {
            using (var client = new HttpClient())
            using (var request = new HttpRequestMessage())
            {
                request.Method = new HttpMethod("PATCH");
                request.RequestUri = new Uri(uri);
                request.Content = new StringContent(body, Encoding.UTF8, "application/json");
                request.Headers.Add("Ocp-Apim-Subscription-Key", key);

                var response = await client.SendAsync(request);
                var responseBody = await response.Content.ReadAsStringAsync();
                return new Response(response.Headers, responseBody);
            }
        }

        /// <summary>
        /// Asynchronously sends a GET HTTP request.
        /// </summary>
        /// <param name="uri">The URI of the HTTP request.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        async static Task<Response> Get(string uri)
        {
            using (var client = new HttpClient())
            using (var request = new HttpRequestMessage())
            {
                request.Method = HttpMethod.Get;
                request.RequestUri = new Uri(uri);
                request.Headers.Add("Ocp-Apim-Subscription-Key", key);

                var response = await client.SendAsync(request);
                var responseBody = await response.Content.ReadAsStringAsync();
                return new Response(response.Headers, responseBody);
            }
        }

        /// <summary>
        /// Asynchronously updates the knowledge base.
        /// </summary>
        /// <param name="new_kb">The new data source for the updated knowledge base.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        /// <remarks>The method constructs the URI to update a knowledge base in QnA Maker,
        /// then asynchronously invokes the <see cref="QnAMaker.Program.Patch(string, string)"/>
        /// method to send the HTTP request.</remarks>
        async static Task<Response> PatchUpdateKB(string new_kb)
        {
            // Builds the HTTP request URI. In this program, method is the
            // "/knowledgebases/" path segment, so the knowledge base ID is appended.
            string uri = host + service + method + kb;
            Console.WriteLine("Calling " + uri + ".");
            return await Patch(uri, new_kb);
        }

        /// <summary>
        /// Gets the status of the specified QnA Maker operation.
        /// </summary>
        /// <param name="operation">The QnA Maker operation to check.</param>
        /// <returns>A <see cref="System.Threading.Tasks.Task{TResult}"/>
        /// object that represents the HTTP response.</returns>
        /// <remarks>Constructs the URI to get the status of a QnA Maker
        /// operation, then asynchronously invokes the <see cref="QnAMaker.Program.Get(string)"/>
        /// method to send the HTTP request.</remarks>
        async static Task<Response> GetStatus(string operation)
        {
            string uri = host + service + operation;
            Console.WriteLine("Calling " + uri + ".");
            return await Get(uri);
        }

        async static Task Main(string[] args) // async Main requires C# 7.1 or later.
        {
            // new_kb (a static field, not shown in this extract) is the JSON
            // that describes the changes to apply to the knowledge base.
            var response = await PatchUpdateKB(new_kb);

            // Retrieves the operation ID, so the operation's status can be checked.
            var operation = response.headers.GetValues("Location").First();

            response = await GetStatus(operation);
            // Displays the JSON in the HTTP response returned by the
            // GetStatus(string) method.
            Console.WriteLine(PrettyPrint(response.response));

            // The console waits for a key to be pressed before closing.
            Console.ReadLine();
        }
    }
}

Understand what QnA Maker returns

Publish a knowledge base in C#

The following code publishes an existing knowledge base, using the Publish method.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    // NOTE: Replace this with a valid subscription key.
    static string key = "ENTER KEY HERE";

    // NOTE: Replace this with a valid knowledge base ID.
    static string kb = "ENTER ID HERE";

    // Endpoint paths, as in the earlier samples in this lesson.
    static string host = "https://westus.api.cognitive.microsoft.com";
    static string service = "/qnamaker/v4.0";
    static string method = "/knowledgebases/";

    /// <summary>
    /// Asynchronously sends a POST HTTP request.
    /// </summary>
    async static Task<string> Post(string uri)
    {
        using (var client = new HttpClient())
        using (var request = new HttpRequestMessage())
        {
            request.Method = HttpMethod.Post;
            request.RequestUri = new Uri(uri);
            request.Headers.Add("Ocp-Apim-Subscription-Key", key);

            var response = await client.SendAsync(request);

            // A successful publish returns an empty response body, so a
            // synthetic JSON result is returned instead.
            if (response.IsSuccessStatusCode)
            {
                return "{'result' : 'Success.'}";
            }
            else
            {
                return await response.Content.ReadAsStringAsync();
            }
        }
    }

    /// <summary>
    /// Publishes the knowledge base by sending a POST request to the
    /// /knowledgebases/{id} endpoint. (The body is reconstructed from the
    /// helpers above; the original listing is incomplete.)
    /// </summary>
    async static void PublishKB()
    {
        var uri = host + service + method + kb;
        Console.WriteLine("Calling " + uri + ".");
        var response = await Post(uri);
        Console.WriteLine(response);
    }
}

A successful response is returned in JSON, as shown in the following example:

{
"result": "Success."
}



● Use Azure Logic Apps to automate business processes.
● Store, synchronize, and query device metadata and state information for all your devices.
● Set device state either per-device or based on common characteristics of devices.
● Automatically respond to a device-reported state change with message routing integration.

4. In the IoT hub pane, enter the following information for your IoT hub:

Subscription: Choose the subscription that you want to use to create this IoT hub.

8.

9. Select Review + create.

10. Review your IoT hub information, then click Create. Your IoT hub might take a few minutes to create. You can monitor the progress in the Notifications pane.
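A device identity must be registered with the hub before its connection string can be retrieved. Step 1 of that procedure is a reasonable sketch based on the standard Azure CLI quickstart; the device name matches the step that follows:

1. Run the following command to register a device with your IoT hub:

az iot hub device-identity create --hub-name {YourIoTHubName} --device-id MyDotnetDevice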

2. Run the following command to get the device connection string for the device you just registered:

az iot hub device-identity show-connection-string --hub-name {YourIoTHubName} --device-id MyDotnetDevice --output table


Review Questions

What criteria must the images meet to use Computer Vision API v2?

Suggested Answer ↓

The image must be presented in JPEG, PNG, GIF, or BMP format; the file size must be less than 4 MB; and the image dimensions must be greater than 50 x 50 pixels.

What do you need to develop solutions that use the Computer Vision API?

Suggested Answer ↓

To use the Computer Vision API, you must procure a subscription key. You can use Visual Studio 2015 or 2017 to develop solutions that use the API. You must import the Microsoft.Azure.CognitiveServices.Vision. ComputerVision client library NuGet package into your Visual Studio solution.

Module 6 Module Develop for Azure Storage

Develop Solutions that use Azure Cosmos DB Storage

Global replication

Azure Cosmos DB has a feature referred to as turnkey global distribution that automatically replicates data to other Azure datacenters across the globe without the need to manually write code or build a replication infrastructure.


Most distributed databases offer either no well-defined, provable consistency choices at all, or only two extreme programmability choices (strong versus eventual consistency). The former burdens application developers with the minutiae of their replication protocols, while the latter forces a choice between the two extremes. Azure Cosmos DB instead offers five well-defined consistency levels; four of them are relaxed levels, because they provide less consistency than strong, which is the most highly consistent model available.

The consistency levels range from strong consistency, where reads are guaranteed to return the most recent committed version of an item, to eventual consistency, where reads can trail writes but replicas eventually converge.

Consider the following points if your application is built by using the Cosmos DB SQL API or Table API:

● For many real-world scenarios, session consistency is optimal and it's the recommended option.

If you need less strict consistency guarantees than the ones provided by session consistency, it is recommended that you use the consistent prefix consistency level.

You may get stronger consistency guarantees in practice. Consistency guarantees for a read operation correspond to the freshness and ordering of the database state that you request; read consistency is tied to the ordering and propagation of the write/update operations.

When the consistency level is set to bounded staleness, Cosmos DB guarantees that the clients always read the value of a previous write, with the lag bounded by the configured staleness window.

The consistency you observe also depends on the workload. For example, if there are no write operations on the database, a read operation with eventual, session, or consistent prefix consistency levels is likely to yield the same results as a read operation with strong consistency level.

You can find out how consistent your reads are in practice by looking at the Probabilistic Bounded Staleness (PBS) metric. This metric is exposed in the Azure portal, and it shows how often you get stronger consistency than the level currently configured on your Cosmos DB account. In other words, you can see the probability (measured in milliseconds) of getting strongly consistent reads for a combination of write and read regions.
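The default consistency level is set on the Cosmos DB account, but a client can relax it per connection. A minimal sketch with the .NET SDK used later in this lesson; the endpoint and key are placeholders, and the client-level setting can only weaken, never strengthen, the account default:

// Sketch: request session consistency for this DocumentClient instance.
DocumentClient client = new DocumentClient(
    new Uri("[endpoint]"),
    "[key]",
    new ConnectionPolicy(),
    ConsistencyLevel.Session);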

The five consistency models offered by Azure Cosmos DB are natively supported by the Azure Cosmos DB SQL API. Databases such as Apache Cassandra and MongoDB, by contrast, don't offer precisely defined consistency models or SLA-backed guarantees for consistency levels. They typically provide only a subset of the five consistency models offered by Azure Cosmos DB. The following tables show the mapping between the native Apache Cassandra and MongoDB settings and the Azure Cosmos DB consistency levels.

Mapping between Apache Cassandra and Azure Cosmos DB consistency levels

The following table shows the mapping between Apache Cassandra consistency levels and the default consistency level in Azure Cosmos DB. The table shows multi-region and single-region deployments.

The following table shows the “read concerns” mapping between MongoDB 3.4 and the default consist-ency level in Azure Cosmos DB. The table shows multi-region and single-region deployments.

MongoDB 3.4

Strong

Majority

Consistent prefix

Azure Cosmos DB Supported APIs

The Table API in Azure Cosmos DB is a key-value database service built to provide premium capabilities (for example, automatic indexing, guaranteed low latency, and global distribution) to existing Azure Table storage applications without making any app changes.

Gremlin API

Prior to migrating, you should increase the container’s throughput to at least 1,000 Request Units (RUs) per second so that the import tools are not throttled. The throughput can be reverted back to the typical values after the import is complete.

Azure Cosmos DB containers can be created as fixed or unlimited in the Azure portal. Fixed-size containers have a maximum limit of 10 GB and a 10,000 RU/s throughput. To create a container as unlimited, you must specify a partition key and a minimum throughput of 1,000 RU/s. Azure Cosmos DB containers can also be configured to share throughput among the containers in a database.

If you created a fixed container with no partition key or a throughput less than 1,000 RU/s, the container will not automatically scale. To migrate the data from a fixed container to an unlimited container, you need to use the data migration tool or the Change Feed library.

A physical partition is a fixed amount of reserved solid-state drive (SSD) backend storage combined with a variable amount of compute resources (CPU and memory). Each physical partition is replicated for high availability. A physical partition is an internal concept of Azure Cosmos DB, and physical partitions are transient. Azure Cosmos DB will automatically scale the number of physical partitions based on your workload.

A logical partition is a partition within a physical partition that stores all the data associated with a single partition key value. Partition ranges can be dynamically subdivided to seamlessly grow the database as the application grows while simultaneously maintaining high availability. When a container meets the partitioning prerequisites, partitioning is completely transparent to your application. Azure Cosmos DB handles distributing data across physical and logical partitions and routing query requests to the right partition.

Manage Collections and Documents by using the Microsoft .NET SDK

To get started with the Azure Cosmos DB SQL API, you will need the Microsoft.Azure.DocumentDB.Core NuGet package1.

Then, you can create a DocumentClient instance by using the endpoint from your Azure Cosmos DB account and one of your keys:

DocumentClient client = new DocumentClient(new Uri("[endpoint]"), "[key]");

To reference any resource in the software development kit (SDK), you will need a URI. The UriFactory class provides helper methods that build correctly formatted URIs for databases, collections, and documents.

1 https://www.nuget.org/packages/Microsoft.Azure.DocumentDB.Core
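For example, the collection URI used by the query below can be built as follows; the database and collection names here are hypothetical:

// Sketch: build the URI for the "families" collection in the "familydb" database.
Uri collectionUri = UriFactory.CreateDocumentCollectionUri("familydb", "families");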

If you want to query the database, you can perform SQL queries by using the SqlQuerySpec class:

var query = client.CreateDocumentQuery<Family>(
    collectionUri,
    new SqlQuerySpec()
    {
        QueryText = "SELECT * FROM f WHERE (f.surname = @lastName)",
        Parameters = new SqlParameterCollection()
        {
            new SqlParameter("@lastName", "Andt")
        }
    },
    DefaultOptions
);
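Tying this back to the earlier discussion of partitioning, the same SDK can create a partitioned (unlimited) container by supplying a partition key definition and at least 1,000 RU/s of throughput. A minimal sketch, with hypothetical database, collection, and partition key names:

// Sketch: create a collection partitioned on /surname with 1,000 RU/s.
DocumentCollection collection = new DocumentCollection { Id = "families" };
collection.PartitionKey.Paths.Add("/surname");

await client.CreateDocumentCollectionAsync(
    UriFactory.CreateDatabaseUri("familydb"),
    collection,
    new RequestOptions { OfferThroughput = 1000 });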

Develop Solutions that use a Relational Database

SQL Database is a general-purpose relational database managed service in Microsoft Azure that supports structures such as relational data, JSON, spatial, and XML, and that can be scaled with minimal interruption. SQL Database has additional features that are not available in SQL Server, such as built-in intelligence and management. Azure SQL Database offers several deployment options:

● As a single database with its own set of resources managed via a logical server
● As a pooled database in an elastic pool, with a shared set of resources managed via a logical server
● As part of a collection of databases known as a managed instance, which contains system and user databases and shares a set of resources

The main differences between these options are listed in the following table:

A database copy is a snapshot of the source database as of the time of the copy request. You can select the same server or a different server, its service tier and compute size, or a different compute size within the same service tier (edition). After the copy is complete, it becomes a fully functional, independent database. At this point, you can upgrade or downgrade it to any edition. The logins, users, and permissions can be managed independently.

Note: Automated database backups are used when you create a database copy.

After the copying succeeds and before other users are remapped, only the login that initiated the copying, the database owner, can log in to the new database.

Copy a database by using the Azure portal


To copy a database by using PowerShell, use the New-AzureRmSqlDatabaseCopy cmdlet.

New-AzureRmSqlDatabaseCopy -ResourceGroupName "myResourceGroup" `
    -ServerName "mySourceServer" `
    -DatabaseName "MySampleDatabase" `
    -CopyResourceGroupName "myResourceGroup" `
    -CopyServerName "myTargetServer" `
    -CopyDatabaseName "CopyOfMySampleDatabase"

Resolve logins

When you copy a database to a different server, the security principal that initiated the database copy becomes the database owner of the new database and is assigned a new security identifier (SID). After the copying succeeds and before other users are remapped, only the login that initiated the copying, the database owner, can log in to the new database.

Entity Framework is an object-relational mapper that bridges the impedance mismatch between the relational and object-oriented worlds. The goal of the library is to free developers from most of the data-access “plumbing” code that they usually need to write to access data in a database.


The Entity Framework provider model allows Entity Framework to be used with different types of database servers. For example, one provider can be plugged in to allow Entity Framework to be used against Microsoft SQL Server, whereas another provider can be plugged in to allow Entity Framework to be used against Oracle Database. There are many current providers in the market for databases, including:

SQL Server provider

This database provider allows Entity Framework Core to be used with Microsoft SQL Server (including Microsoft Azure SQL Database). The provider is maintained as an open-source project as part of the Entity Framework Core repository on GitHub (https://github.com/aspnet/EntityFrameworkCore).


Logically, our database has a table that is a collection of these blog instances. Without knowing anything about Entity Framework, we would probably create a class that looks like this:
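A plausible reconstruction of that class, using the property names that appear in the configuration code below:

public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
    public string Description { get; set; }
}

The shape of the model can then be configured by overriding OnModelCreating on the context: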

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Blog>()
        .HasKey(b => b.BlogId);

    modelBuilder.Entity<Blog>()
        .Property(b => b.Url)
        .IsRequired();

    // Including Description here simply ensures the property is mapped.
    modelBuilder.Entity<Blog>()
        .Property(b => b.Description);
}

Data annotations


DbContext implementation

After you have a model, the primary class your application interacts with is System.Data.Entity.DbContext (often referred to as the context class).

The context class allows you to:

● Write and execute queries.
● Track changes that are made to those objects.
● Persist object changes back to the database.

The simplest context class for the blogging model exposes a DbSet for each entity type:

public class BloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
}

The DbSet properties indicate which entity types are included in the model.

Querying Databases by using Entity Framework Core

Each DbSet<TEntity> property implements the IEnumerable<> interface, giving you access to many of the existing LINQ operators.

For example, you can load all the data from a table by enumerating the collection with a call to the ToList method, optionally applying a filter first with an operator such as Where:

using (var context = new BloggingContext())
{
    var blogs = context.Blogs
        .Where(b => b.Url.Contains("dotnet"))
        .ToList();
}

You can also use the Single method to get a single instance that matches a specific filter:
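A minimal sketch of that pattern; the key value is illustrative:

using (var context = new BloggingContext())
{
    var blog = context.Blogs
        .Single(b => b.BlogId == 1);
}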

The query is only sent to the database when the results are consumed, for example by:

● Iterating the results in a for loop.
● Using an operator such as ToList, ToArray, Single, or Count.

For example, if your storage account is named mystorageaccount, then the default endpoint for Blob storage is:

http://mystorageaccount.blob.core.windows.net

Blobs

Azure Storage supports three types of blobs:

● Block blobs, which are optimized for uploading large amounts of data efficiently.
● Append blobs, which are made up of blocks like block blobs but are optimized for append operations such as logging.
● Page blobs, which are optimized for random read and write operations.

You may only tier your object storage data to Hot, Cool, or Archive in Blob storage or General Purpose v2 (GPv2) accounts. General Purpose v1 (GPv1) accounts do not support tiering. However, customers can easily convert their existing GPv1 or Blob storage accounts to GPv2 accounts through a simple one-click process in the Azure portal. GPv2 provides a new pricing structure for blobs, files, and queues, and access to a variety of other new storage features as well. Furthermore, going forward some new features and price cuts will only be offered in GPv2 accounts. Therefore, customers should evaluate using GPv2 accounts, but only use them after reviewing the pricing for all services, as some workloads can be more expensive on GPv2 than GPv1.

Blob storage and GPv2 accounts expose the Access Tier attribute at the account level, which allows you to specify the default storage tier as Hot or Cool for any blob in the storage account that does not have an explicit tier set at the object level. For objects with the tier set at the object level, the account tier will not apply. The Archive tier can only be applied at the object level. You can switch between these storage tiers at any time.
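Switching the tier of an individual blob can be done with the .NET storage client library. A minimal sketch, with hypothetical connection string, container, and blob names:

// Sketch: move a single block blob to the Cool tier.
CloudStorageAccount account = CloudStorageAccount.Parse("<connection-string>");
CloudBlobContainer container = account.CreateCloudBlobClient()
    .GetContainerReference("mycontainer");
CloudBlockBlob blob = container.GetBlockBlobReference("report.csv");
blob.SetStandardBlobTier(StandardBlobTier.Cool);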

Cool access tier

Cool storage tier has lower storage costs and higher access costs compared to Hot storage. This tier is intended for data that will remain in the Cool tier for at least 30 days. Example usage scenarios for the Cool storage tier include:

Archive storage has the lowest storage cost and higher data retrieval costs compared to Hot and Cool storage. This tier is intended for data that can tolerate several hours of retrieval latency and will remain in the Archive tier for at least 180 days.


Once a blob has been created, its type cannot be changed, and it can be updated only by using operations appropriate for that blob type, i.e., writing a block or list of blocks to a block blob, appending blocks to an append blob, and writing pages to a page blob.

Block blobs let you upload large blobs efficiently. Block blobs are comprised of blocks, each of which is identified by a block ID. You create or modify a block blob by writing a set of blocks and committing them by their block IDs. Each block can be a different size, up to a maximum of 100 MB (4 MB for requests using REST versions before 2016-05-31), and a block blob can include up to 50,000 blocks. The maximum size of a block blob is therefore slightly more than 4.75 TB (100 MB X 50,000 blocks). For REST versions before 2016-05-31, the maximum size of a block blob is a little more than 195 GB (4 MB X 50,000 blocks). If you are writing a block blob that is no more than 256 MB (64 MB for requests using REST versions before 2016-05-31) in size, you can upload it in its entirety with a single write operation.

Storage clients default to a 128 MB maximum single blob upload, settable using the SingleBlobUploadThresholdInBytes property of the BlobRequestOptions object. When a block blob upload is larger than the value in this property, storage clients break the file into blocks. You can set the number of threads used to upload the blocks in parallel on a per-request basis using the ParallelOperationThreadCount property of the BlobRequestOptions object.
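A minimal sketch of tuning those two properties with the WindowsAzure.Storage library; the file path, threshold, and thread count are illustrative, and container is a CloudBlobContainer reference as shown later in this lesson:

// Sketch: split uploads above 8 MB into blocks and upload up to
// four blocks in parallel.
BlobRequestOptions options = new BlobRequestOptions
{
    SingleBlobUploadThresholdInBytes = 8 * 1024 * 1024,
    ParallelOperationThreadCount = 4
};

CloudBlockBlob blockBlob = container.GetBlockBlobReference("largefile.bin");
blockBlob.UploadFromFile(@"C:\data\largefile.bin", options: options);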


If you write a block for a blob that does not exist, a new block blob is created, with a length of zero bytes.

About Page Blobs

Page blobs are a collection of 512-byte pages optimized for random read and write operations. To create a page blob, you initialize it and specify the maximum size it will grow to; writes then target one or more pages at offsets aligned to 512-byte boundaries within the blob. The maximum size for a page blob is 8 TB.

About Append Blobs

An append blob is comprised of blocks and is optimized for append operations. When you modify an append blob, blocks are added to the end of the blob only; updating or deleting existing blocks is not supported. Each block in an append blob can be a different size, up to a maximum of 4 MB, and an append blob can include up to 50,000 blocks. The maximum size of an append blob is therefore slightly more than 195 GB (4 MB X 50,000 blocks).

Blob storage events allow applications to react to the creation and deletion of blobs, which is a common requirement in serverless architectures. It does so without the need for complicated code or expensive and inefficient polling services. Events are pushed through Azure Event Grid to subscribers such as Azure Functions, Azure Logic Apps, or even to your own custom HTTP listener, and you only pay for what you use.

Blob storage events are reliably sent to the Event Grid service, which provides reliable delivery services to your applications through rich retry policies and dead-letter delivery.

When changes are infrequent but your scenario requires immediate responsiveness, an event-based architecture can be especially efficient.

Blob storage accounts

Event Schema

Blob storage events contain all the information you need to respond to changes in your data. You can identify a Blob storage event because the eventType property starts with “Microsoft.Storage”. Additional information about the usage of Event Grid event properties is documented in Event Grid event schema3.

3


By handing clients a Shared Access Signature, you can enable them to access resources in your storage account without sharing your account key with them.

A sample SAS URI that grants read and write permissions to a blob looks like the following:

https://myaccount.blob.core.windows.net/sascontainer/sasblob.txt?sv=2015-04-05&st=2015-04-29T22%3A18%3A26Z&se=2015-04-30T02%3A23%3A26Z&sr=b&sp=rw&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D

Its components are:

Blob URI: https://myaccount.blob.core.windows.net/sascontainer/sasblob.txt. The address of the blob. Note that using HTTPS is highly recommended.

Storage services version: sv=2015-04-05. The version of the storage services used to authorize the request.

Start time: st=2015-04-29T22%3A18%3A26Z. Specified in an International Organization for Standardization (ISO) 8601 format. If you want the SAS to be valid immediately, omit the start time.

Expiration time: se=2015-04-30T02%3A23%3A26Z. Specified in ISO 8601 format.

Resource: sr=b. The resource is a blob.

Permissions: sp=rw. The permissions granted by the SAS are Read (r) and Write (w).

Signature: sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D. Used to authenticate access to the blob; it is an HMAC computed over a string-to-sign with the account key, using SHA-256, and then Base64-encoded.
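A SAS like the one above can be generated with the .NET storage client library. A minimal sketch (the blob name is hypothetical, and container is a CloudBlobContainer reference as shown later in this lesson):

// Sketch: issue a one-hour read/write SAS for a single blob.
CloudBlockBlob blob = container.GetBlockBlobReference("sasblob.txt");

SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
{
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1),
    Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write
};

// The SAS token is appended to the blob URI before it is handed to a client.
string sasUri = blob.Uri + blob.GetSharedAccessSignature(policy);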

Valet key pattern that uses Shared Access Signatures

A common scenario where an SAS is useful is a service where users read and write their own data to your storage account. In a scenario where a storage account stores user data, there are two typical design patterns:

● Clients upload and download data via a front-end proxy service, which performs authentication. This front-end proxy service has the advantage of allowing validation of business rules, but for large amounts of data or high-volume transactions, creating a service that can scale to match demand may be expensive or difficult.

● A lightweight service authenticates the client as needed and then generates an SAS. Once the client receives the SAS, it can access storage account resources directly, with the permissions defined by the SAS and for the interval allowed by the SAS. The SAS mitigates the need for routing all data through the front-end proxy service.


Names are case-insensitive. Note that metadata names preserve the case with which they were created, but are case-insensitive when set or read. If two or more metadata headers with the same name are submitted for a resource, the Blob service returns status code 400 (Bad Request).

The metadata consists of name/value pairs. The total size of all metadata pairs can be up to 8 KB.

Retrieving Properties and Metadata

The GET and HEAD operations both retrieve metadata headers for the specified container or blob. These operations return headers only; they do not return a response body. The URI syntax for retrieving metadata headers on a container is as follows:

GET/HEAD https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=metadata

The URI syntax for retrieving metadata headers on a blob is as follows:

GET/HEAD https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=metadata

Setting Metadata Headers

The URI syntax for setting metadata headers on a blob is as follows:

PUT https://myaccount.blob.core.windows.net/mycontainer/myblob?comp=metadata
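With the .NET client library, each of these operations is a single method call. A minimal sketch (the metadata name and value are hypothetical, and container is a CloudBlobContainer reference as shown later in this lesson):

// Sketch: set metadata (a PUT ...comp=metadata request under the hood).
container.Metadata["author"] = "contoso";
container.SetMetadata();

// Sketch: retrieve metadata (a HEAD request that populates the reference).
container.FetchAttributes();
Console.WriteLine(container.Metadata["author"]);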

Metadata and property headers are both represented as standard HTTP headers; the difference between them is in the naming of the headers. Metadata headers are named with the header prefix x-ms-meta- and a custom name. Property headers use standard HTTP header names, as specified in the Header Field Definitions section 14 of the HTTP/1.1 protocol specification.

The standard HTTP headers supported on blobs include:

ETag

Content-Encoding

Content-Language

The CloudStorageAccount class contains the CreateCloudBlobClient method that gives you programmatic access to a client that manages your blob containers:

CloudBlobClient client = storageAccount.CreateCloudBlobClient();

Using the client, you can get a reference to a container and then create the container if it does not already exist in the Azure storage account:

CloudBlobContainer container = client.GetContainerReference("mycontainer");
container.CreateIfNotExists();

With a hydrated reference, you can perform actions such as fetching the properties (metadata) of the container:

container.FetchAttributes();
var properties = container.Properties;

This class has properties that can be set to change the container, including (but not limited to) those in the following table.


Review Questions

Module 6 - Review Questions

A Shared Access Signature (SAS) is a URI that grants restricted access rights to containers, binary large objects (blobs), queues, and tables. What are the two types of SAS you can create?

> Click to see suggested answer

An ad hoc SAS. When you create an ad hoc SAS, the start time, expiration time, and permissions for the SAS are all specified on the SAS URI (or implied in the case where the start time is omitted). This type of SAS may be created on a container, blob, queue, or table.

An SAS with a stored access policy. A stored access policy is defined on a resource container and can be used to manage constraints for one or more Shared Access Signatures. When you associate an SAS with a stored access policy, the SAS inherits the constraints (the start time, expiration time, and permissions) defined for the stored access policy.

Copying Blobs between containers

Like with Azure Files, you can use AzCopy to copy blobs between storage containers. By default, does AzCopy copy data synchronously or asynchronously?

> Click to see suggested answer

By default, AzCopy copies data between two storage endpoints asynchronously. The copy operation runs in the background by using spare bandwidth capacity that has no Service Level Agreement (SLA) in terms of how fast a blob is copied.

Azure SQL Database

SQL Database is a general-purpose relational database managed service in Microsoft Azure that supports structures such as relational data, JSON, spatial, and XML. What are the three deployment options for Azure SQL Database?

> Click to see suggested answer

As a single database with its own set of resources managed via a logical server

As a pooled database in an elastic pool with a shared set of resources managed via a logical server

As a part of a collection of databases known as a managed instance that contains system and user databases and shares a set of resources





Write and execute queries.

> Click to see suggested answer
