Experimenting with the Kudu API

Posts in this series:

1. Experimenting with the Kudu API
2. Building and Packaging .NET Core with AppVeyor
3. Parsing command line arguments in .NET Core

I’ve been playing around with the idea of some remote tools for Azure Functions using the Kudu API, mostly to make a nicer way to edit and run Azure Functions from a local machine (larger screen, easier access, etc.).

The repo is here https://github.com/jakkaj/k-scratch. It’s a work in progress.

I figure the parts I need are:

  • Log in from PublishSettings.xml
  • LogStream
  • Ability to query the functions in the site
  • Ability to list / download files
  • Ability to upload files

I decided to build the libraries on top of .NET Core.

I thought initially about a UWP app: click + to add a new PublishSettings file, which would add a vertical tab down the left-hand side. Each tab would be a little window into that function – the LogStream, the files, and editing, where saving would immediately upload and run the change.

Perhaps I’ll get to that, but for now I’m building each bit as a standalone console app (.NET Core based) that takes the PublishSettings.xml as param.

So far I have the LogStream working, which was actually pretty easy!

You can grab a PublishSettings.xml file from your function by going to Function app settings / Go to App Service Settings / Click “…” and click Get Publish Profile.

You load the PublishSettings XML into a nice entity. I created the entity to load the XML into by using Xml2Csharp.com. The resulting entity is here.

See this file for how the XML is loaded into the entity.
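As a rough sketch of that loading step, XmlSerializer can map the publish profile straight onto an entity. The attribute names below (publishUrl, userName, userPWD) come from the standard .PublishSettings format; the generated entity in the repo may name things differently.

```csharp
using System.IO;
using System.Xml.Serialization;

// Minimal entity for the standard .PublishSettings format. The real entity in
// the repo (generated via Xml2Csharp.com) may carry more fields.
[XmlRoot("publishData")]
public class PublishData
{
    [XmlElement("publishProfile")]
    public PublishProfile[] Profiles { get; set; }
}

public class PublishProfile
{
    [XmlAttribute("profileName")]
    public string ProfileName { get; set; }

    [XmlAttribute("publishMethod")]
    public string PublishMethod { get; set; }

    [XmlAttribute("publishUrl")]
    public string PublishUrl { get; set; }

    [XmlAttribute("userName")]
    public string UserName { get; set; }

    [XmlAttribute("userPWD")]
    public string Password { get; set; }
}

public static class PublishSettingsLoader
{
    // Deserialize a .PublishSettings file from disk into the entity above.
    public static PublishData Load(string path)
    {
        var serializer = new XmlSerializer(typeof(PublishData));
        using (var stream = File.OpenRead(path))
        {
            return (PublishData)serializer.Deserialize(stream);
        }
    }
}
```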

The PublishSettings.xml file contains the Kudu API end points, and also the user and password needed to authenticate. Authentication is done using the basic authentication header.

var authToken = Convert.ToBase64String(
    Encoding.UTF8.GetBytes($"{settings.UserName}:{settings.Password}"));
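To use that token on every request, it can be attached to the HttpClient defaults. A minimal sketch – the KuduClientFactory name is mine, not from the repo:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public static class KuduClientFactory
{
    // Build an HttpClient carrying the basic authentication header Kudu expects.
    public static HttpClient Create(string userName, string password)
    {
        var token = Convert.ToBase64String(
            Encoding.UTF8.GetBytes($"{userName}:{password}"));

        var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", token);
        return client;
    }
}
```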

Once I had that organised, I could start calling services.

I tried a few:

  • GET /api/zip/site/{folder} will zip up that folder and send it back
  • GET /api/vfs/site/{folder}/ will list a path
  • GET /api/vfs/site/{folder}/{filename} will return that file

etc.

Great, I can get files! Change the HTTP method to PUT and I can upload files.
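As a sketch of those two calls (the VfsApi helper and its names are mine, not the repo's): one thing worth knowing is that Kudu's VFS API checks ETags when you PUT over an existing file, so an If-Match of "*" is sent here to overwrite unconditionally.

```csharp
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class VfsApi
{
    // Build the VFS URL for a path under site/ (assumed layout from the endpoints above).
    public static string VfsUrl(string scmBase, string relativePath)
        => $"{scmBase.TrimEnd('/')}/api/vfs/site/{relativePath}";

    // Download a file via GET /api/vfs/site/{folder}/{filename}.
    public static async Task<string> DownloadAsync(HttpClient client, string scmBase, string path)
    {
        var response = await client.GetAsync(VfsUrl(scmBase, path));
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }

    // Upload a file via PUT. EntityTagHeaderValue.Any is the "*" ETag,
    // which tells Kudu to overwrite whatever version is on the server.
    public static async Task UploadAsync(HttpClient client, string scmBase, string path, string content)
    {
        var request = new HttpRequestMessage(HttpMethod.Put, VfsUrl(scmBase, path))
        {
            Content = new StringContent(content)
        };
        request.Headers.IfMatch.Add(EntityTagHeaderValue.Any);
        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
```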

I then tried the LogStream. Using the new Bash shell on Windows, I was able to curl the log stream directly.

 curl -u username https://xxx.scm.azurewebsites.net/logstream

Straight away, logs were streaming. Too easy! The next step was to see whether the .NET Core HttpClient could stream those logs too.

 _currentStream = await response.Content.ReadAsStreamAsync();

Works really well, it turns out – just read the stream as lines come in and you have yourself log output.

using (var reader = new StreamReader(_currentStream))
{
    while (!reader.EndOfStream && _currentStream != null)
    {
        // We are ready to read the stream
        var currentLine = reader.ReadLine();
        // ... handle the line
    }
}

Full implementation of that file here.
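The one detail worth calling out is HttpCompletionOption.ResponseHeadersRead, which makes HttpClient hand back the stream as soon as the headers arrive rather than trying to buffer the never-ending body. A minimal sketch of the whole streaming call – the LogStreamer wrapper is my own naming, not the repo's:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class LogStreamer
{
    // Stream the Kudu logstream endpoint line by line, invoking onLine for each line.
    public static async Task StreamAsync(HttpClient client, string scmBase, Action<string> onLine)
    {
        var url = $"{scmBase.TrimEnd('/')}/logstream";

        // ResponseHeadersRead: return once headers arrive instead of
        // waiting for the (infinite) response body to complete.
        using (var response = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
        using (var stream = await response.Content.ReadAsStreamAsync())
        using (var reader = new StreamReader(stream))
        {
            string line;
            while ((line = await reader.ReadLineAsync()) != null)
            {
                onLine(line);
            }
        }
    }
}
```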

Then I added a simple app – KScratchLog – which takes the PublishSettings.xml path as a parameter and starts showing logs.

So why not just use curl? Mostly the simplicity of loading the PublishSettings.xml to get the user, password and API endpoint in one go.

Next steps – file download, change monitoring and uploading after edit. The goal is to allow editing in Visual Studio Code and have it auto save back to the App Service on file save.
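For the change-monitoring part, .NET Core's FileSystemWatcher should cover the "upload on save" trigger. A hypothetical sketch – the ChangeMonitor name and the callback wiring are my assumptions, not code from the repo; the callback is where a VFS PUT upload would go:

```csharp
using System;
using System.IO;

public static class ChangeMonitor
{
    // Watch a local folder and invoke onChanged with the relative path of any
    // created or saved file. The caller would wire this to a VFS PUT upload.
    public static FileSystemWatcher Watch(string folder, Action<string> onChanged)
    {
        var watcher = new FileSystemWatcher(folder)
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName
        };

        // Report the path relative to the watched folder, since that maps to
        // the path segment used by the VFS API.
        void Report(string fullPath) =>
            onChanged(fullPath.Substring(folder.Length).TrimStart(Path.DirectorySeparatorChar, '/'));

        watcher.Changed += (s, e) => Report(e.FullPath);
        watcher.Created += (s, e) => Report(e.FullPath);
        watcher.EnableRaisingEvents = true;
        return watcher;
    }
}
```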