My GitHub repo shows how to write an Azure Function that responds to the BlobCreated event that Azure Blob Storage raises through Event Grid, decodes an encoded field in the file, and writes the result to a new blob file in a different container.

In this example, any blob file dropped into the input container fires an Azure Blob Storage BlobCreated event through Event Grid, much as it would if an IoT device were writing new files to the storage account. This hypothetical IoT device base64-encodes its data in one of the fields of the blob file, but the downstream parts of the overall solution expect the JSON files in the storage account to be plain text. We therefore inject an Azure Function into the middle of the process to decode the base64-encoded field and write out a new "plaintext" JSON file that downstream processes can read.

In the src/DecodeAndWriteFile.cs file, the Run method is decorated with the Function("DecodeAndWriteFile") attribute. This defines the endpoint that the Event Grid trigger will be linked to.
[Function("DecodeAndWriteFile")]
public async Task Run([EventGridTrigger] BlobCreatedEvent input)
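
To make the function's role concrete, here is a minimal sketch of what the body of Run might do, written for the isolated worker model. It is not the repo's actual implementation: the shape of BlobCreatedEvent (a Url property), the name of the encoded field (payload), the injected BlobServiceClient, and reusing the source blob's name in the output container are all assumptions made for illustration.

```csharp
using System;
using System.Text;
using System.Text.Json.Nodes;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.Functions.Worker;

// Assumed shape of the POCO bound from the Event Grid payload; the repo's
// actual BlobCreatedEvent may expose different properties.
public record BlobCreatedEvent(string Url);

public class DecodeAndWriteFileSketch
{
    private readonly BlobServiceClient _blobServiceClient;

    // A BlobServiceClient registered via dependency injection is assumed.
    public DecodeAndWriteFileSketch(BlobServiceClient blobServiceClient)
        => _blobServiceClient = blobServiceClient;

    [Function("DecodeAndWriteFileSketch")]
    public async Task Run([EventGridTrigger] BlobCreatedEvent input)
    {
        // Read the blob that raised the event. The standard BlobCreated event
        // data carries the blob's URL; real code also needs credentials here.
        var sourceBlob = new BlobClient(new Uri(input.Url));
        var json = JsonNode.Parse(
            (await sourceBlob.DownloadContentAsync()).Value.Content.ToString())!.AsObject();

        // Decode the base64-encoded field in place ("payload" is a made-up field name).
        var encoded = json["payload"]!.GetValue<string>();
        json["payload"] = Encoding.UTF8.GetString(Convert.FromBase64String(encoded));

        // Write the plaintext JSON to the output container, reusing the blob name.
        await _blobServiceClient
            .GetBlobContainerClient("output")
            .GetBlobClient(sourceBlob.Name)
            .UploadAsync(BinaryData.FromString(json.ToJsonString()), overwrite: true);
    }
}
```
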
This linkage is created declaratively in the infra/eventSubscription.bicep file.
resource newBlobCreatedEventSubscription 'Microsoft.EventGrid/systemTopics/eventSubscriptions@2021-06-01-preview' = {
  name: '${blobCreatedEventGridTopic.name}/newBlobCreatedForRaiseEventFunctionAppEventSubscription'
  properties: {
    destination: {
      endpointType: 'AzureFunction'
      properties: {
        resourceId: '${functionApp.id}/functions/DecodeAndWriteFile'
      }
    }
  }
}

Run the code
- Take the sample data in the ./src/sampleData/data.json file and upload it to your Azure blob storage account, in the input container.
- Wait a few seconds, then look in the output container to see the new decoded data file.
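
If you would rather script these two steps than use the Azure portal, a small console sketch along the following lines would work. The storage account URL is a placeholder, the Azure.Storage.Blobs and Azure.Identity packages are assumed to be referenced, your identity is assumed to have blob data access, and the sketch assumes the function writes the decoded file to the output container under the same blob name.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Storage.Blobs;

class UploadSample
{
    static async Task Main()
    {
        // Placeholder account URL; replace with your storage account.
        var service = new BlobServiceClient(
            new Uri("https://<your-storage-account>.blob.core.windows.net"),
            new DefaultAzureCredential());

        // Step 1: upload the sample data to the input container.
        await service.GetBlobContainerClient("input")
            .GetBlobClient("data.json")
            .UploadAsync("./src/sampleData/data.json", overwrite: true);

        // Step 2: poll the output container until the decoded file appears
        // (assumes the function keeps the same blob name).
        var decoded = service.GetBlobContainerClient("output").GetBlobClient("data.json");
        while (!(await decoded.ExistsAsync()).Value)
        {
            await Task.Delay(TimeSpan.FromSeconds(2));
        }
        Console.WriteLine((await decoded.DownloadContentAsync()).Value.Content.ToString());
    }
}
```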