This repo shows how to build an ASP.NET Core application and a Java Spring application that access Azure Data Lake Storage (ADLS). The code demonstrates two approaches: accessing ADLS with Storage Account access keys and accessing it with a managed identity.

Here is a screenshot of the Java application accessing ADLS and displaying the JSON contents of the files.

Accessing Storage Account using Storage Account Access Keys
In this example, the Storage Account access keys are stored in Azure Key Vault and exposed as environment variables to the App Service the application runs in.
ASP.NET Core
In the ASP.NET Core example, a singleton service of type DataLakeServiceClient is registered with the dependency injection container at startup so the application can use it later on. Look in the /src/aspnet/Program.cs file.
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(
    builder.Configuration["StorageAccountName"],
    builder.Configuration["StorageAccountAccessKey"]);

builder.Services.AddSingleton(new DataLakeServiceClient(
    new Uri($"https://{builder.Configuration["StorageAccountName"]}.dfs.core.windows.net"),
    sharedKeyCredential));
These configuration values are populated by App Service at startup time.
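The access key itself never has to appear in the App Service configuration: the app settings can use Key Vault references, which App Service resolves into environment variables at startup. A hypothetical set of settings (the storage account, container, and vault names below are placeholders) might look like this:

BlobContainerName       = my-container
StorageAccountName      = mystorageaccount
StorageAccountAccessKey = @Microsoft.KeyVault(SecretUri=https://my-key-vault.vault.azure.net/secrets/StorageAccountAccessKey/)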

In the controller /src/aspnet/Controllers/HomeController.cs, we can get this service and connect to ADLS.
public HomeController(ILogger<HomeController> logger, DataLakeServiceClient dataLakeServiceClient, IConfiguration configuration)
{
    _logger = logger;
    _dataLakeServiceClient = dataLakeServiceClient;
    _configuration = configuration;
}

public async Task<IActionResult> Index()
{
    List<JsonFile> jsonFiles = new List<JsonFile>();
    var fileSystemClient = _dataLakeServiceClient.GetFileSystemClient(_configuration["BlobContainerName"]);

    // Walk the container recursively and read every file (directories are skipped).
    await foreach (var path in fileSystemClient.GetPathsAsync("/", true))
    {
        if (path.IsDirectory == false)
        {
            var fileClient = fileSystemClient.GetFileClient(path.Name);
            using var fileStream = await fileClient.OpenReadAsync();
            using var streamReader = new StreamReader(fileStream);

            jsonFiles.Add(new JsonFile
            {
                Name = path.Name,
                Content = await streamReader.ReadToEndAsync()
            });
        }
    }

    ViewBag.JsonFiles = jsonFiles;
    return View();
}
Finally, we can render the JSON files in the view /src/aspnet/Views/Home/Index.cshtml.
<script src="https://cdn.jsdelivr.net/gh/google/code-prettify@master/loader/run_prettify.js"></script>

<div>
    <ul>
        @foreach (var jsonFile in ViewBag.JsonFiles) {
            <li>
                <details>
                    <summary>@jsonFile.Name</summary>
                    <pre class="prettyprint"><code>@jsonFile.Content</code></pre>
                </details>
            </li>
        }
    </ul>
</div>
Java
The Java example is similar to the ASP.NET Core example. The environment variables are read into a POJO. Look in the src/java/src/main/resources/application.properties file to see how they are mapped from the App Service configuration to Java.
azure.blobContainerName=${BlobContainerName}
azure.managedIdentityClientId=${ManagedIdentityClientId}
azure.storageAccountAccessKey=${StorageAccountAccessKey}
azure.storageAccountName=${StorageAccountName}
azure.useManagedIdentity=${UseManagedIdentity}
These will get automatically populated by Spring at startup time. Look in the src/java/src/main/java/com/microsoft/azure/aspnetadls/ConfigProperties.java file.
@Configuration
@ConfigurationProperties(prefix = "azure")
public class ConfigProperties {
    private String blobContainerName;

    public String getBlobContainerName() {
        return blobContainerName;
    }

    public void setBlobContainerName(String blobContainerName) {
        this.blobContainerName = blobContainerName;
    }

    ...
We can instantiate the DataLakeServiceClient in a service so it can be used by other code. Look in the src/java/src/main/java/com/microsoft/azure/aspnetadls/service/DataLakeServiceClientService.java file.
@Service
public class DataLakeServiceClientService {
    private ConfigProperties configProperties;

    @Autowired
    public void setConfigProperties(ConfigProperties configProperties) {
        this.configProperties = configProperties;
    }

    @PostConstruct
    public DataLakeServiceClient getDataLakeServiceClient() {
        String endpoint = "https://" + configProperties.getStorageAccountName() + ".dfs.core.windows.net";
        DataLakeServiceClient dataLakeServiceClient;
        ...
        StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(
            configProperties.getStorageAccountName(), configProperties.getStorageAccountAccessKey());

        DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
        builder.credential(sharedKeyCredential);
        builder.endpoint(endpoint);
        dataLakeServiceClient = builder.buildClient();

        return dataLakeServiceClient;
    }
}
We can now use it to access ADLS in the HomeService. Look in the src/java/src/main/java/com/microsoft/azure/aspnetadls/service/HomeService.java file.
@Service
public class HomeService {
    ...
    private DataLakeServiceClientService dataLakeServiceClientService;

    @Autowired
    public void setDataLakeServiceClientService(DataLakeServiceClientService dataLakeServiceClientService) {
        this.dataLakeServiceClientService = dataLakeServiceClientService;
    }

    public Vector<JsonFile> getJsonFiles() {
        Vector<JsonFile> jsonFiles = new Vector<JsonFile>();
        var dataLakeServiceClient = dataLakeServiceClientService.getDataLakeServiceClient();
        var fileSystemClient = dataLakeServiceClient.getFileSystemClient(configProperties.getBlobContainerName());

        // Walk the container recursively and read every file (directories are skipped).
        for (var path : fileSystemClient.listPaths(new ListPathsOptions().setPath("/").setRecursive(true), null)) {
            if (path.isDirectory() == false) {
                var fileClient = fileSystemClient.getFileClient(path.getName());
                var jsonFile = new JsonFile();
                jsonFile.setName(path.getName());
                try (var fileStream = fileClient.openInputStream().getInputStream()) {
                    jsonFile.setContent(new String(fileStream.readAllBytes(), StandardCharsets.UTF_8));
                } catch (Exception e) {
                    jsonFile.setContent("Unable to load file");
                }
                jsonFiles.add(jsonFile);
            }
        }

        return jsonFiles;
    }
}
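The JsonFile type returned by HomeService isn't shown above; it is just a small model holding a file name and its content. A minimal sketch (the field names are assumptions based on how it is used in HomeService and the view) could look like this:

// Hypothetical sketch of the JsonFile model used by HomeService and the view.
public class JsonFile {
    private String name;     // path of the file inside the container
    private String content;  // raw JSON text read from the file

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getContent() {
        return content;
    }

    public void setContent(String content) {
        this.content = content;
    }
}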
Now the controller can request data from ADLS using the HomeService. Look in the src/java/src/main/java/com/microsoft/azure/aspnetadls/controller/HomeController.java file.
@Controller
public class HomeController {
    @Autowired
    HomeService homeService;

    @GetMapping("/")
    public String index(ModelMap model) {
        Vector<JsonFile> jsonFiles = homeService.getJsonFiles();
        model.put("jsonFiles", jsonFiles);
        return "index";
    }
}
Finally, we can render the results in the view. Look in the src/java/src/main/webapp/WEB-INF/templates/index.html file.
<script src="https://cdn.jsdelivr.net/gh/google/code-prettify@master/loader/run_prettify.js"></script>

<div>
    <ul>
        <li th:each="jsonFile: ${jsonFiles}">
            <details>
                <summary th:text="${jsonFile.getName()}" />
                <pre
                    class="prettyprint"
                ><code th:text="${jsonFile.getContent()}" /></pre>
            </details>
        </li>
    </ul>
</div>
Accessing Storage Account using a managed identity
Using a managed identity simplifies both the code and the infrastructure, since there are no secrets to store in Key Vault and manage. The application obtains a token at runtime and uses it to access ADLS; the identity only needs to be granted an RBAC role on the storage account, such as Storage Blob Data Reader.
ASP.NET Core
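As in the shared-key example, the DataLakeServiceClient singleton is registered at startup, but with a DefaultAzureCredential pointing at the app's managed identity instead of an access key.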
TokenCredential tokenCredential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
    ManagedIdentityClientId = builder.Configuration["ManagedIdentityClientId"]
});

builder.Services.AddSingleton(new DataLakeServiceClient(
    new Uri($"https://{builder.Configuration["StorageAccountName"]}.dfs.core.windows.net"),
    tokenCredential));
Java
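On the Java side, the client is built with a DefaultAzureCredential instead of the StorageSharedKeyCredential.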
DefaultAzureCredential defaultAzureCredential = new DefaultAzureCredentialBuilder()
    .managedIdentityClientId(configProperties.getManagedIdentityClientId())
    .build();

DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
dataLakeServiceClient = builder.credential(defaultAzureCredential).endpoint(endpoint).buildClient();
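The role assignment mentioned above is the one piece of infrastructure the managed identity still needs. A rough sketch with the Azure CLI (the IDs and resource names below are placeholders) might look like this:

# Grant the app's managed identity read access to the storage account's data
# (the IDs and names below are placeholders).
az role assignment create \
  --assignee <managed-identity-client-id> \
  --role "Storage Blob Data Reader" \
  --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>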