Merge pull request #43 from mattchenderson/funcsample

Adding Functions sample
This commit is contained in:
Yaron Schneider 2019-10-15 11:35:48 -07:00 committed by GitHub
commit bfa033c29d
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
46 changed files with 1393 additions and 0 deletions


@ -0,0 +1,69 @@
# Dapr, Azure Functions, and KEDA
This sample shows Dapr being used with Azure Functions and KEDA to create a polyglot Functions-as-a-Service application that leverages Dapr pub/sub. In it, a Python function is triggered by a message created in an Azure storage queue. This function then uses Dapr to publish that message to two subscribers: a C# function and a JavaScript function, each of which receives the event and processes it accordingly.
## Run the sample
### Requirements
Setting up this sample requires you to have several components installed:
- [Install the Dapr CLI](https://github.com/dapr/cli)
- [Install Docker](https://docs.docker.com/install/)
- [Install kubectl](https://kubernetes.io/docs/tasks/tools/install-kubectl/)
- [Install Helm](https://github.com/helm/helm)
- [Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest)
- [Install PowerShell Core 6](https://github.com/PowerShell/PowerShell)
- [Install the Azure Functions Core Tools](https://docs.microsoft.com/azure/azure-functions/functions-run-local#v2)
### Run the setup script
Before running this script, note that it will provision the following resources into your Azure subscription and will incur associated costs:
- A Kubernetes service
- A container registry
- A storage account
To run the script, first log into the Azure CLI:
```powershell
az login
```
Then:
```powershell
./setup.ps1
```
You will be prompted for a name that determines the resource group where everything will be deployed; it also serves as a base name for the resources themselves. Note that storage account names accept only lowercase alphanumeric characters and must start with a letter, so choose the base name accordingly.
You will also be prompted for the Azure region in which to deploy these resources, for example: "westus".
The script will create an entirely new cluster and configure it with this sample.
## Explore the configured sample
Once the sample script has completed, run the following command to see what pods are running:
```powershell
kubectl get pods -w
```
You should see three pods for the Dapr infrastructure, as well as some Redis pods. You'll also see two pods for the function projects: one for C# and one for JavaScript. The Python function project is not visible because KEDA has scaled it to zero; we'll see this pod momentarily. The `-w` flag means the view will update as new pods become available.
Navigate to the [Azure portal](https://portal.azure.com) and find the resource group based on the name you provided earlier. You should see all three of the resources mentioned earlier.
Select the storage account, and then its "Storage Explorer (Preview)" option in the left-hand navigation. Select "Queues" and then "items". Click "Add message", provide your message text, and hit "OK".
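Alternatively, you can queue the same message from the Azure CLI without opening the portal. This is a sketch that assumes the naming convention used by the setup script; `<base-name>` is a placeholder for the base name you entered:
```powershell
# Fetch the connection string for the storage account created by setup.ps1
$CONNECTION_STRING = az storage account show-connection-string `
  -g <base-name> -n <base-name>sa --query connectionString -o tsv

# Queue a message on the "items" queue that the Python function watches
az storage message put --queue-name items `
  --content "hello from the CLI" `
  --connection-string $CONNECTION_STRING
```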
If you head back to the terminal where you are running the `kubectl get pods -w` command, you should see a new pod enter the `ContainerCreating` state. This is the Python function app being scaled out because KEDA saw a message sitting in the queue. Note that two containers are created: one of them is the Dapr sidecar!
The Python function will consume the message from the queue and then use Dapr to publish a message that both the C# and JavaScript apps have registered themselves to consume. You can check the logs for these apps to see them process the message. To do this, copy the pod name from the `kubectl get pods` output and run a `kubectl logs` command, as shown below:
```powershell
kubectl logs csharp-function-subscriber-7b874cd7f9-bqgsj csharp-function-subscriber
# OR
kubectl logs javascript-function-subscriber-6b9588c86-2zlxh javascript-function-subscriber
```
You should see log messages indicating that the message was processed.
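Under the hood, the Python function's publish step is just an HTTP POST to its Dapr sidecar on `localhost:3500`. A roughly equivalent manual call, sketched here with `kubectl exec` against a hypothetical publisher pod name (and assuming `curl` is available inside that container), would be:
```powershell
kubectl exec python-function-publisher-<pod-suffix> -c python-function-publisher -- `
  curl -s -X POST http://localhost:3500/v1.0/publish/myTopic `
  -H "Content-Type: application/json" `
  -d '{"message": "hello from kubectl"}'
```
Both subscribers should log the message just as they do for queue-triggered publishes, since Dapr delivers the event to every function registered on `myTopic`.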


@ -0,0 +1,34 @@
apiVersion: v1
kind: Secret
metadata:
  name: csharp-function-subscriber
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: csharp-function-subscriber
  labels:
    app: csharp-function-subscriber
spec:
  replicas: 1
  selector:
    matchLabels:
      app: csharp-function-subscriber
  template:
    metadata:
      labels:
        app: csharp-function-subscriber
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "csharp-function-subscriber"
        dapr.io/port: "80"
    spec:
      containers:
      - name: csharp-function-subscriber
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        imagePullPolicy: Always


@ -0,0 +1,48 @@
apiVersion: v1
kind: Secret
metadata:
  name: javascript-function-subscriber
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
  FUNCTIONS_WORKER_RUNTIME: bm9kZQ==
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: javascript-function-subscriber
  namespace: default
  labels:
    app: javascript-function-subscriber
  annotations:
    osiris.deislabs.io/enabled: "true"
    osiris.deislabs.io/minReplicas: "1"
spec:
  replicas: 1
  selector:
    matchLabels:
      app: javascript-function-subscriber
  template:
    metadata:
      labels:
        app: javascript-function-subscriber
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "javascript-function-subscriber"
        dapr.io/port: "80"
    spec:
      containers:
      - name: javascript-function-subscriber
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        env:
        - name: AzureFunctionsJobHost__functions__0
          value: MyTopic
        - name: AzureFunctionsJobHost__functions__1
          value: Subscribe
        envFrom:
        - secretRef:
            name: javascript-function-subscriber
---


@ -0,0 +1,76 @@
apiVersion: v1
kind: Secret
metadata:
  name: python-function-publisher
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
  FUNCTIONS_WORKER_RUNTIME: cHl0aG9u
---
apiVersion: v1
kind: Service
metadata:
  name: python-function-publisher
  labels:
    app: python-function-publisher
spec:
  selector:
    app: python-function-publisher
  ports:
  - protocol: TCP
    port: 80
    targetPort: 80
  type: LoadBalancer
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-function-publisher
  labels:
    app: python-function-publisher
spec:
  replicas: 1
  selector:
    matchLabels:
      app: python-function-publisher
  template:
    metadata:
      labels:
        app: python-function-publisher
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "python-function-publisher"
        dapr.io/port: "80"
    spec:
      containers:
      - name: python-function-publisher
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        imagePullPolicy: Always
        envFrom:
        - secretRef:
            name: python-function-publisher
---
apiVersion: keda.k8s.io/v1alpha1
kind: ScaledObject
metadata:
  name: python-function-publisher
  namespace: default
  labels:
    deploymentName: python-function-publisher
spec:
  scaleTargetRef:
    deploymentName: python-function-publisher
  triggers:
  - type: azure-queue
    metadata:
      name: myQueueItem
      type: queueTrigger
      direction: in
      queueName: items
      queueLength: "1"
      connection: AzureWebJobsStorage
---


@ -0,0 +1,23 @@
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore
spec:
  type: state.redis
  metadata:
  - name: "redisHost"
    value: "REPLACE_HOST"
  - name: "redisPassword"
    value: "REPLACE_SECRET"
---
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.redis
  metadata:
  - name: "redisHost"
    value: "REPLACE_HOST"
  - name: "redisPassword"
    value: "REPLACE_SECRET"


@ -0,0 +1,117 @@
$setupFolder = "./utils/base-templates"
$deployFolder = "./deploy"
$pythonName = "python-function-publisher"
$dotnetName = "csharp-function-subscriber"
$javascriptName = "javascript-function-subscriber"
$sourceFolder = "./src"
$pythonFolder = "$sourceFolder/$pythonName"
$dotnetFolder = "$sourceFolder/$dotnetName"
$javascriptFolder = "$sourceFolder/$javascriptName"
# Prompts
$resourceBase = Read-Host -Prompt "Enter resource name base"
$location = Read-Host -Prompt "Enter location"
$groupName= "$resourceBase"
$clusterName= "$resourceBase" + "-cluster"
$registryName="${resourceBase}reg"
$storageName = "${resourceBase}sa"
# Resource Group
Write-Host
Write-Host "Creating resource group $groupName..."
az group create -n $groupName -l $location
# AKS
Write-Host
Write-Host "Creating AKS cluster $clusterName..."
az aks create -g $groupName -n $clusterName --generate-ssh-keys
az aks get-credentials -n $clusterName -g $groupName
kubectl apply -f $setupFolder/helm-rbac.yaml
helm init --service-account tiller
# ACR
Write-Host
Write-Host "Creating ACR registry $registryName..."
az acr create --resource-group $groupName --name $registryName --sku Basic
$CLIENT_ID= az aks show --resource-group $groupName --name $clusterName --query "servicePrincipalProfile.clientId" --output tsv
$ACR_ID= az acr show --name $registryName --resource-group $groupName --query "id" --output tsv
Write-Host
Write-Host "Connecting ACR and AKS..."
az role assignment create --assignee $CLIENT_ID --role acrpull --scope $ACR_ID
az acr login -n $registryName
# Install DAPR
Write-Host
Write-Host "Installing DAPR on $clusterName..."
dapr init --kubernetes
Write-Host
Write-Host "Installing Redis as the DAPR state store on $clusterName..."
helm install stable/redis --name redis --set image.tag=5.0.5-debian-9-r104 --set rbac.create=true
Start-Sleep -Seconds 60
$redisHost= $(kubectl get service redis-master -o=custom-columns=IP:.spec.clusterIP --no-headers=true) + ":6379"
$encoded = kubectl get secret --namespace default redis -o jsonpath="{.data.redis-password}"
$redisSecret = [System.Text.Encoding]::ASCII.GetString([System.Convert]::FromBase64String($encoded))
(Get-Content $setupFolder/redis-base.yaml) | Foreach-Object {$_ -replace "REPLACE_HOST", $redisHost} | Foreach-Object {$_ -replace "REPLACE_SECRET", $redisSecret} | Set-Content $deployFolder/redis.yaml
# Install KEDA
Write-Host
Write-Host "Installing KEDA on $clusterName..."
func kubernetes install --namespace keda
#### Application section
# Provision Azure Storage
Write-Host
Write-Host "Creating storage account $storageName..."
az group create -l $location -n $groupName
az storage account create --sku Standard_LRS -l $location -g $groupName -n $storageName --kind StorageV2
$CONNECTION_STRING= az storage account show-connection-string -g $groupName -n $storageName --query connectionString
az storage queue create -n items --connection-string $CONNECTION_STRING
# Build images and deployment templates
Write-Host
Write-Host "Building and publishing images..."
$trimmedConnectionString = $CONNECTION_STRING -replace "`"", ""
$encodedConnectionString = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($trimmedConnectionString))
(Get-Content $setupFolder/python-function-publisher-base.yaml) `
| Foreach-Object {$_ -replace "IMAGE_NAME", "$registryName.azurecr.io/$pythonName"} `
| Foreach-Object {$_ -replace "CONNECTION_STRING_B64", $encodedConnectionString} `
| Set-Content $deployFolder/python-function-publisher.yaml
docker build -t "$registryName.azurecr.io/$pythonName" $pythonFolder
docker push "$registryName.azurecr.io/$pythonName"
(Get-Content $setupFolder/csharp-function-subscriber-base.yaml) `
| Foreach-Object {$_ -replace "IMAGE_NAME", "$registryName.azurecr.io/$dotnetName"} `
| Foreach-Object {$_ -replace "CONNECTION_STRING_B64", $encodedConnectionString} `
| Set-Content $deployFolder/csharp-function-subscriber.yaml
docker build -t "$registryName.azurecr.io/$dotnetName" $dotnetFolder
docker push "$registryName.azurecr.io/$dotnetName"
(Get-Content $setupFolder/javascript-function-subscriber-base.yaml) `
| Foreach-Object {$_ -replace "IMAGE_NAME", "$registryName.azurecr.io/$javascriptName"} `
| Foreach-Object {$_ -replace "CONNECTION_STRING_B64", $encodedConnectionString} `
| Set-Content $deployFolder/javascript-function-subscriber.yaml
docker build -t "$registryName.azurecr.io/$javascriptName" $javascriptFolder
docker push "$registryName.azurecr.io/$javascriptName"
# Deploy
Write-Host
Write-Host "Deploying application..."
kubectl apply -f $deployFolder


@ -0,0 +1 @@
local.settings.json


@ -0,0 +1,264 @@
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.
# Azure Functions localsettings file
local.settings.json
# User-specific files
*.suo
*.user
*.userosscache
*.sln.docstates
# User-specific files (MonoDevelop/Xamarin Studio)
*.userprefs
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
bld/
[Bb]in/
[Oo]bj/
[Ll]og/
# Visual Studio 2015 cache/options directory
.vs/
# Uncomment if you have tasks that create the project's static files in wwwroot
#wwwroot/
# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*
# NUNIT
*.VisualState.xml
TestResult.xml
# Build Results of an ATL Project
[Dd]ebugPS/
[Rr]eleasePS/
dlldata.c
# DNX
project.lock.json
project.fragment.lock.json
artifacts/
*_i.c
*_p.c
*_i.h
*.ilk
*.meta
*.obj
*.pch
*.pdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*.log
*.vspscc
*.vssscc
.builds
*.pidb
*.svclog
*.scc
# Chutzpah Test files
_Chutzpah*
# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opendb
*.opensdf
*.sdf
*.cachefile
*.VC.db
*.VC.VC.opendb
# Visual Studio profiler
*.psess
*.vsp
*.vspx
*.sap
# TFS 2012 Local Workspace
$tf/
# Guidance Automation Toolkit
*.gpState
# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper
*.DotSettings.user
# JustCode is a .NET coding add-in
.JustCode
# TeamCity is a build add-in
_TeamCity*
# DotCover is a Code Coverage Tool
*.dotCover
# NCrunch
_NCrunch_*
.*crunch*.local.xml
nCrunchTemp_*
# MightyMoose
*.mm.*
AutoTest.Net/
# Web workbench (sass)
.sass-cache/
# Installshield output folder
[Ee]xpress/
# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html
# Click-Once directory
publish/
# Publish Web Output
*.[Pp]ublish.xml
*.azurePubxml
# TODO: Comment the next line if you want to checkin your web deploy settings
# but database connection strings (with potential passwords) will be unencrypted
#*.pubxml
*.publishproj
# Microsoft Azure Web App publish settings. Comment the next line if you want to
# checkin your Azure Web App publish settings, but sensitive information contained
# in these scripts will be unencrypted
PublishScripts/
# NuGet Packages
*.nupkg
# The packages folder can be ignored because of Package Restore
**/packages/*
# except build/, which is used as an MSBuild target.
!**/packages/build/
# Uncomment if necessary however generally it will be regenerated when needed
#!**/packages/repositories.config
# NuGet v3's project.json files produces more ignoreable files
*.nuget.props
*.nuget.targets
# Microsoft Azure Build Output
csx/
*.build.csdef
# Microsoft Azure Emulator
ecf/
rcf/
# Windows Store app package directories and files
AppPackages/
BundleArtifacts/
Package.StoreAssociation.xml
_pkginfo.txt
# Visual Studio cache files
# files ending in .cache can be ignored
*.[Cc]ache
# but keep track of directories ending in .cache
!*.[Cc]ache/
# Others
ClientBin/
~$*
*~
*.dbmdl
*.dbproj.schemaview
*.jfm
*.pfx
*.publishsettings
node_modules/
orleans.codegen.cs
# Since there are multiple workflows, uncomment next line to ignore bower_components
# (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
#bower_components/
# RIA/Silverlight projects
Generated_Code/
# Backup & report files from converting an old project file
# to a newer Visual Studio version. Backup files are not needed,
# because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm
# SQL Server files
*.mdf
*.ldf
# Business Intelligence projects
*.rdl.data
*.bim.layout
*.bim_*.settings
# Microsoft Fakes
FakesAssemblies/
# GhostDoc plugin setting file
*.GhostDoc.xml
# Node.js Tools for Visual Studio
.ntvs_analysis.dat
# Visual Studio 6 build log
*.plg
# Visual Studio 6 workspace options file
*.opt
# Visual Studio LightSwitch build output
**/*.HTMLClient/GeneratedArtifacts
**/*.DesktopClient/GeneratedArtifacts
**/*.DesktopClient/ModelManifest.xml
**/*.Server/GeneratedArtifacts
**/*.Server/ModelManifest.xml
_Pvt_Extensions
# Paket dependency manager
.paket/paket.exe
paket-files/
# FAKE - F# Make
.fake/
# JetBrains Rider
.idea/
*.sln.iml
# CodeRush
.cr/
# Python Tools for Visual Studio (PTVS)
__pycache__/
*.pyc


@ -0,0 +1,6 @@
{
  "recommendations": [
    "ms-azuretools.vscode-azurefunctions",
    "ms-vscode.csharp"
  ]
}


@ -0,0 +1,11 @@
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to .NET Functions",
      "type": "coreclr",
      "request": "attach",
      "processId": "${command:azureFunctions.pickProcess}"
    }
  ]
}


@ -0,0 +1,6 @@
{
  "azureFunctions.deploySubpath": "bin/Release/netcoreapp2.1/publish",
  "azureFunctions.projectLanguage": "C#",
  "azureFunctions.projectRuntime": "~2",
  "azureFunctions.preDeployTask": "publish"
}


@ -0,0 +1,45 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "clean",
      "command": "dotnet clean",
      "type": "shell",
      "problemMatcher": "$msCompile"
    },
    {
      "label": "build",
      "command": "dotnet build",
      "type": "shell",
      "dependsOn": "clean",
      "group": {
        "kind": "build",
        "isDefault": true
      },
      "problemMatcher": "$msCompile"
    },
    {
      "label": "clean release",
      "command": "dotnet clean --configuration Release",
      "type": "shell",
      "problemMatcher": "$msCompile"
    },
    {
      "label": "publish",
      "command": "dotnet publish --configuration Release",
      "type": "shell",
      "dependsOn": "clean release",
      "problemMatcher": "$msCompile"
    },
    {
      "type": "func",
      "dependsOn": "build",
      "options": {
        "cwd": "${workspaceFolder}/bin/Debug/netcoreapp2.1"
      },
      "command": "host start",
      "isBackground": true,
      "problemMatcher": "$func-watch"
    }
  ]
}


@ -0,0 +1,12 @@
FROM microsoft/dotnet:2.2-sdk AS installer-env

COPY . /src/dotnet-function-app
RUN cd /src/dotnet-function-app && \
    mkdir -p /home/site/wwwroot && \
    dotnet publish *.csproj --output /home/site/wwwroot

FROM mcr.microsoft.com/azure-functions/dotnet:2.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true
COPY --from=installer-env ["/home/site/wwwroot", "/home/site/wwwroot"]


@ -0,0 +1,38 @@
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Net.Http;

namespace DAPR.Sample
{
    public class MyTopic
    {
        private readonly HttpClient _client;
        private const string DAPR_URL = "http://localhost:3500/v1.0";

        public MyTopic(IHttpClientFactory httpClientFactory)
        {
            _client = httpClientFactory.CreateClient();
        }

        [FunctionName("MyTopic")]
        public async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "myTopic")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("MyTopic C# trigger function processed a request.");
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic daprMessage = JsonConvert.DeserializeObject(requestBody);
            log.LogInformation($"Got value: {daprMessage["data"]["message"]}");
            return new OkResult();
        }
    }
}


@ -0,0 +1,15 @@
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(DAPR.Sample.Startup))]

namespace DAPR.Sample
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();
        }
    }
}


@ -0,0 +1,31 @@
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace DAPR.Sample
{
    public class Subscribe
    {
        public Subscribe()
        {
        }

        [FunctionName("Subscribe")]
        public async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "dapr/subscribe")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Subscription HTTP trigger function processed a request.");
            string[] subscriptions = { "myTopic" };
            return new OkObjectResult(JsonConvert.SerializeObject(subscriptions));
        }
    }
}


@ -0,0 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
    <AzureFunctionsVersion>v2</AzureFunctionsVersion>
    <RootNamespace>csharp_subscribe</RootNamespace>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.Functions.Extensions" Version="1.0.0" />
    <PackageReference Include="Microsoft.Extensions.Http" Version="2.2.0" />
    <PackageReference Include="Microsoft.NET.Sdk.Functions" Version="1.0.29" />
  </ItemGroup>
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>


@ -0,0 +1,8 @@
{
  "version": "2.0",
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}


@ -0,0 +1 @@
local.settings.json


@ -0,0 +1,43 @@
bin
obj
csx
.vs
edge
Publish
*.user
*.suo
*.cscfg
*.Cache
project.lock.json
/packages
/TestResults
/tools/NuGet.exe
/App_Data
/secrets
/data
.secrets
appsettings.json
local.settings.json
node_modules
dist
# Local python packages
.python_packages/
# Python Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class


@ -0,0 +1,5 @@
{
  "recommendations": [
    "ms-azuretools.vscode-azurefunctions"
  ]
}


@ -0,0 +1,9 @@
FROM mcr.microsoft.com/azure-functions/node:2.0

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    npm install


@ -0,0 +1,19 @@
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "post"
      ],
      "route": "myTopic"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}


@ -0,0 +1,3 @@
module.exports = async function (context, req) {
    context.log('MyTopic JavaScript function processed a request.');
};


@ -0,0 +1,19 @@
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get"
      ],
      "route": "dapr/subscribe"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}


@ -0,0 +1,9 @@
module.exports = async function (context, req) {
    context.log('JavaScript Subscription HTTP trigger function processed a request.');
    context.res = {
        // status: 200, /* Defaults to 200 */
        body: [
            "myTopic"
        ]
    };
};


@ -0,0 +1,12 @@
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  },
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}


@ -0,0 +1,9 @@
{
  "name": "",
  "version": "",
  "description": "",
  "scripts": {
    "test": "echo \"No tests yet...\""
  },
  "author": ""
}


@ -0,0 +1 @@
local.settings.json


@ -0,0 +1,5 @@
.git*
.vscode
local.settings.json
test
.venv


@ -0,0 +1,131 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that dont work, or not
# install all needed dependencies.
#Pipfile.lock
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# Azure Functions artifacts
bin
obj
appsettings.json
local.settings.json
.python_packages
python-publish.zip


@ -0,0 +1,6 @@
{
  "recommendations": [
    "ms-azuretools.vscode-azurefunctions",
    "ms-python.python"
  ]
}


@ -0,0 +1,12 @@
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to Python Functions",
      "type": "python",
      "request": "attach",
      "port": 9091,
      "preLaunchTask": "func: host start"
    }
  ]
}


@ -0,0 +1,9 @@
{
  "azureFunctions.deploySubpath": "python-publish.zip",
  "azureFunctions.pythonVenv": ".venv",
  "azureFunctions.projectLanguage": "Python",
  "azureFunctions.projectRuntime": "~2",
  "debug.internalConsoleOptions": "neverOpen",
  "azureFunctions.preDeployTask": "func: pack",
  "python.pythonPath": ".venv/bin/python3.6"
}


@ -0,0 +1,26 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "type": "func",
      "command": "host start",
      "problemMatcher": "$func-watch",
      "isBackground": true,
      "dependsOn": "pipInstall"
    },
    {
      "label": "pipInstall",
      "type": "shell",
      "osx": {
        "command": "${config:azureFunctions.pythonVenv}/bin/python -m pip install -r requirements.txt"
      },
      "windows": {
        "command": "${config:azureFunctions.pythonVenv}\\Scripts\\python -m pip install -r requirements.txt"
      },
      "linux": {
        "command": "${config:azureFunctions.pythonVenv}/bin/python -m pip install -r requirements.txt"
      },
      "problemMatcher": []
    }
  ]
}


@ -0,0 +1,9 @@
FROM mcr.microsoft.com/azure-functions/python:2.0

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot
RUN cd /home/site/wwwroot && \
    pip install -r requirements.txt


@ -0,0 +1,19 @@
import logging
import requests
import json

import azure.functions as func

dapr_url = "http://localhost:3500/v1.0"


def main(msg: func.QueueMessage):
    logging.info("Python queue-triggered function received a message!")
    message = msg.get_body().decode('utf-8')
    logging.info(f"Message: {message}")

    # Publish an event via the Dapr sidecar
    url = f'{dapr_url}/publish/myTopic'
    content = {"message": message}
    logging.info(f'POST to {url} with content {json.dumps(content)}')
    p = requests.post(url, json=content)
    logging.info(f'Got response code {p.status_code}')


@ -0,0 +1,12 @@
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "items",
      "connection": "AzureWebJobsStorage"
    }
  ]
}


@ -0,0 +1,7 @@
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}


@ -0,0 +1,2 @@
requests
azure-functions


@ -0,0 +1,34 @@
apiVersion: v1
kind: Secret
metadata:
  name: csharp-function-subscriber
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: csharp-function-subscriber
  labels:
    app: csharp-function-subscriber
spec:
  replicas: 1
  selector:
    matchLabels:
      app: csharp-function-subscriber
  template:
    metadata:
      labels:
        app: csharp-function-subscriber
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "csharp-function-subscriber"
        dapr.io/port: "80"
    spec:
      containers:
      - name: csharp-function-subscriber
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        imagePullPolicy: Always


@ -0,0 +1,18 @@
apiVersion: v1
kind: ServiceAccount
metadata:
  name: tiller
  namespace: kube-system
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: tiller
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: ServiceAccount
  name: tiller
  namespace: kube-system


@ -0,0 +1,48 @@
apiVersion: v1
kind: Secret
metadata:
  name: javascript-function-subscriber
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
  FUNCTIONS_WORKER_RUNTIME: bm9kZQ==
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: javascript-function-subscriber
  namespace: default
  labels:
    app: javascript-function-subscriber
  annotations:
    osiris.deislabs.io/enabled: "true"
    osiris.deislabs.io/minReplicas: "1"
spec:
  replicas: 1
  selector:
    matchLabels:
      app: javascript-function-subscriber
  template:
    metadata:
      labels:
        app: javascript-function-subscriber
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "javascript-function-subscriber"
        dapr.io/port: "80"
    spec:
      containers:
      - name: javascript-function-subscriber
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        env:
        - name: AzureFunctionsJobHost__functions__0
          value: MyTopic
        - name: AzureFunctionsJobHost__functions__1
          value: Subscribe
        envFrom:
        - secretRef:
            name: javascript-function-subscriber
---


@ -0,0 +1,76 @@
apiVersion: v1
kind: Secret
metadata:
  name: python-function-publisher
  namespace: default
data:
  AzureWebJobsStorage: CONNECTION_STRING_B64
  FUNCTIONS_WORKER_RUNTIME: cHl0aG9u
---
apiVersion: v1
kind: Service
metadata:
  name: python-function-publisher
  labels:
    app: python-function-publisher
spec:
  selector:
    app: python-function-publisher
  ports:
  - protocol: TCP
    port: 80
    targetPort: 80
  type: LoadBalancer
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-function-publisher
  labels:
    app: python-function-publisher
spec:
  replicas: 1
  selector:
    matchLabels:
      app: python-function-publisher
  template:
    metadata:
      labels:
        app: python-function-publisher
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "python-function-publisher"
        dapr.io/port: "80"
    spec:
      containers:
      - name: python-function-publisher
        image: IMAGE_NAME
        ports:
        - containerPort: 80
        imagePullPolicy: Always
        envFrom:
        - secretRef:
            name: python-function-publisher
---
apiVersion: keda.k8s.io/v1alpha1
kind: ScaledObject
metadata:
  name: python-function-publisher
  namespace: default
  labels:
    deploymentName: python-function-publisher
spec:
  scaleTargetRef:
    deploymentName: python-function-publisher
  triggers:
  - type: azure-queue
    metadata:
      name: myQueueItem
      type: queueTrigger
      direction: in
      queueName: items
      queueLength: "1"
      connection: AzureWebJobsStorage
---


@ -0,0 +1,23 @@
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore
spec:
  type: state.redis
  metadata:
  - name: "redisHost"
    value: "REPLACE_HOST"
  - name: "redisPassword"
    value: "REPLACE_SECRET"
---
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
spec:
  type: pubsub.redis
  metadata:
  - name: "redisHost"
    value: "REPLACE_HOST"
  - name: "redisPassword"
    value: "REPLACE_SECRET"


@ -9,3 +9,4 @@ This repository contains a series of samples that highlight Dapr capabilities. T
| [3.distributed-calculator](./3.distributed-calculator) | Demonstrates a distributed calculator application that uses Dapr services to power a React web app. Highlights polyglot (multi-language) programming, service invocation and state management. |
| [4.pub-sub](./4.pub-sub) | Demonstrates how we use Dapr to enable pub-sub applications. Uses Redis as a pub-sub component. |
| [5.bindings](./5.bindings) | Demonstrates how we use Dapr to create input and output bindings to other components. Uses bindings to Kafka. |
| [6.functions-and-keda](./6.functions-and-keda) | Demonstrates use of Dapr pub/sub from Azure Functions, as well as composition with KEDA. |