Building a Real-Time Photo Gallery with Xamarin, SignalR, Azure, and WebAPI

Lately I’ve been throwing around the idea of writing an app for my wedding, and started thinking of what I’d want to put in it. One feature that I thought could be fun would be to let guests upload photos directly from the app and display them both to other guests and on our site in real time. In this post I’ll go over how I was able to quickly put together a simple prototype for how this could work using SignalR, Azure Storage, and WebAPI.

First things first: all of the code for this sample is available on GitHub.

Here’s a short video showing the gallery in action:

The first upload in the video is done from a script run in LINQPad, using basically the same code you’ll see later in the iOS app.

To The Code!

Enough of that, let’s look at some code. This sample includes several projects:

  • Gallery: MVC/WebAPI/SignalR project for viewing and uploading photos
  • GalleryApp.Core: Shared core logic for apps to let them upload photos and listen for new uploads
  • GalleryApp.Core.iOS: A file-linked version of GalleryApp.Core compiled for iOS
  • GalleryApp-iOS: An iOS app that lets you upload photos and see what others upload

Let’s start with the Gallery project. In the sample project I have it set up to use the local Azure Storage emulator, which makes it easy to test, but you can swap in a real connection string in Web.config if you want.
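
For reference, the setting lives in appSettings and is read by the GetContainer method you’ll see in the controller below. A minimal sketch of that Web.config entry, using the emulator’s standard connection string (swap in your real account credentials when deploying):

<appSettings>
  <!-- Local Azure Storage emulator; replace with a real storage account connection string in production -->
  <add key="CloudStorageConnectionString" value="UseDevelopmentStorage=true" />
</appSettings>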

First we need an API endpoint for uploading new photos to our storage container:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Gallery.Storage;
using Microsoft.AspNet.SignalR;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
 
namespace Gallery.Controllers
{
    public class PhotoController : ApiController
    {
        private readonly CloudBlobContainer _container;
 
        public PhotoController()
        {
            _container = GetContainer();
        }
 
        public async Task Put()
        {
            if (!Request.Content.IsMimeMultipartContent("form-data"))
            {
                throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.UnsupportedMediaType));
            }
 
            var provider = new BlobStorageProvider(_container);
            await Request.Content.ReadAsMultipartAsync(provider);
 
            var context = GlobalHost.ConnectionManager.GetHubContext<Hubs.Gallery>();
 
            context.Clients.All.newPhotosReceived(provider.Urls);
        }
 
        private CloudBlobContainer GetContainer()
        {
            var storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("CloudStorageConnectionString"));
            var blobClient = storageAccount.CreateCloudBlobClient();
            var container = blobClient.GetContainerReference("weddingpictures");
 
            container.CreateIfNotExists();
 
            var permissions = container.GetPermissions();
            if (permissions.PublicAccess == BlobContainerPublicAccessType.Off)
            {
                permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
                container.SetPermissions(permissions);
            }
 
            return container;
        }
    }
}

When a PUT request gets routed to this controller, it pulls the files out of the form data and pipes them through a custom provider (defined next) that uploads them to Azure Storage. Once that has completed, it broadcasts a message on a SignalR hub so that anybody listening will know about the new photos.
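
Nothing special is needed on the routing side here. Assuming the standard WebAPI project template, the default route registration is enough to map a PUT to /api/photo onto this controller, with the HTTP verb selecting the Put action. A quick sketch of that registration for reference:

using System.Web.Http;

namespace Gallery
{
    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // The default template route: PUT /api/photo resolves to PhotoController,
            // and the HTTP verb selects the Put() action.
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }
}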

The custom storage provider used there looks like this:

using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
 
namespace Gallery.Storage
{
    public class BlobStorageProvider : MultipartFileStreamProvider
    {
        private readonly CloudBlobContainer _container;
 
        public BlobStorageProvider(CloudBlobContainer container)
            : base(Path.GetTempPath())
        {
            _container = container;
            Urls = new List<string>();
        }
 
        public IList<string> Urls { get; private set; } 
 
        public override Task ExecutePostProcessingAsync()
        {
            foreach (var file in FileData)
            {
                string fileName = Path.GetFileName(file.Headers.ContentDisposition.FileName.Trim('"'));
                var blob = _container.GetBlockBlobReference(fileName);
 
                using (var stream = File.OpenRead(file.LocalFileName))
                {
                    blob.UploadFromStream(stream);
                }
 
                File.Delete(file.LocalFileName);
                Urls.Add(blob.Uri.AbsoluteUri);
            }
 
            return base.ExecutePostProcessingAsync();
        }
    }
}

Each file in the request is saved down to a local file, uploaded to Azure Storage, and then the local copy is deleted. This code is based on a helpful blog post I found by Yao Huang Lin. After each file is uploaded, its public URL is added to a collection so that it can be broadcast out.
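
As an aside, the blob upload here is synchronous. If the version of the storage client library you’re using exposes the Task-based methods, the same loop can be awaited instead; here’s a rough sketch of a drop-in replacement for the method above under that assumption (UploadFromStreamAsync may not exist in older package versions):

// Drop-in replacement for ExecutePostProcessingAsync, assuming your storage
// client version exposes the Task-based UploadFromStreamAsync method.
public override async Task ExecutePostProcessingAsync()
{
    foreach (var file in FileData)
    {
        string fileName = Path.GetFileName(file.Headers.ContentDisposition.FileName.Trim('"'));
        var blob = _container.GetBlockBlobReference(fileName);

        using (var stream = File.OpenRead(file.LocalFileName))
        {
            // Await the upload instead of blocking the request thread
            await blob.UploadFromStreamAsync(stream);
        }

        File.Delete(file.LocalFileName);
        Urls.Add(blob.Uri.AbsoluteUri);
    }

    await base.ExecutePostProcessingAsync();
}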

The other piece referenced here that hasn’t been defined yet is the SignalR hub:

using Microsoft.AspNet.SignalR;
 
namespace Gallery.Hubs
{
    public class Gallery : Hub
    {
    }
}

Since PhotoController is manually broadcasting its message to clients, the hub itself only needs to exist, so we don’t have to define anything extra here.
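
The only other wiring SignalR needs is route registration at startup, which is what makes the ~/signalr/hubs script reference in the view below work. The exact call depends on the SignalR version; here’s a sketch of the SignalR 1.x style (SignalR 2.x would use app.MapSignalR() in an OWIN Startup class instead):

using System.Web;
using System.Web.Routing;
using Microsoft.AspNet.SignalR;

namespace Gallery
{
    public class Global : HttpApplication
    {
        protected void Application_Start()
        {
            // Register the SignalR hub route before the other MVC/WebAPI routes
            RouteTable.Routes.MapHubs();

            // ...the usual WebAPI/MVC route, filter, and bundle registration goes here...
        }
    }
}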

Now that the API side is defined, it would be nice to have a web interface that displays the gallery as well, so let’s define a view for that:

@section head {
    <style type="text/css">
        #Photos img {
             width: 300px;
        }
        #Photos li {
            list-style-type: none;
            display: inline;
            padding: 2em;
        }
    </style>
}
 
<div id="body">
    <section class="content-wrapper main-content clear-fix">
        <h2>Watch the photos come in as they're uploaded!</h2>
        
        <h4>Connection status:  <span id="Status">Disconnected</span></h4>
 
        <ul id="Photos"></ul>
    </section>
</div>
 
@section scripts {
    <script src="~/signalr/hubs"></script>
    <script type="text/javascript">
        $(function() {
            var gallery = $.connection.gallery,
                $photoList = $("#Photos");

            gallery.client.newPhotosReceived = function(urls) {
                $.each(urls, function(i, url) {
                    var $img = $("<img/>").attr("src", url);
                    var $item = $("<li/>").append($img).hide();

                    $photoList.prepend($item);

                    $item.fadeIn();
                });
            };

            $.connection.hub.start().done(function() {
                $("#Status").html("Connected");
            });
        });
    </script>
}

As you can see, there’s not much going on here. The markup is just a list for photos to be added to. When the page loads, it connects to the SignalR hub and listens for messages about new photos added to the gallery. When new photos come in, they’re prepended to the list and faded into view. Simple!

In a real application you’d definitely want to add security to prevent just anyone from being able to use your storage container, but I left that out of this sample to keep things simple. You might also want to do things like resizing images, restricting file formats, and so on.
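
As one example, restricting file formats could be as simple as having the storage provider check each file’s extension before uploading it. Here’s a rough sketch of a helper for that (this class isn’t part of the sample, and the allowed extension list is just an example):

using System;
using System.IO;
using System.Linq;

namespace Gallery.Storage
{
    // Hypothetical helper (not part of the sample): a simple extension whitelist
    // that BlobStorageProvider could consult before uploading each file.
    public static class ImageFileFilter
    {
        private static readonly string[] AllowedExtensions = { ".jpg", ".jpeg", ".png", ".gif" };

        public static bool IsAllowed(string fileName)
        {
            var extension = Path.GetExtension(fileName);
            return AllowedExtensions.Contains(extension, StringComparer.OrdinalIgnoreCase);
        }
    }
}

Inside ExecutePostProcessingAsync you’d then skip, or reject the whole request for, any file that doesn’t pass the check.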

GalleryApp.Core

Let’s take a look at the shared component apps use to hook into the gallery. This component contains just two classes. The PhotoUploader class, as the name implies, takes a byte array representing an image along with its file extension, assigns it a unique filename, and sends it to the API we defined earlier.

using System;
using System.Threading.Tasks;
using System.Net.Http;
using System.Net.Http.Headers;
 
namespace GalleryApp.Core
{
    public class PhotoUploader
    {
        private const string UploadUrl = "http://192.168.1.103/api/photo";
 
        public async Task UploadPhoto(byte[] photoBytes, string fileExtension)
        {
            var content = new MultipartFormDataContent();
            var fileContent = new ByteArrayContent(photoBytes);
            fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
            {
                FileName = Guid.NewGuid() + "." + fileExtension
            };
            content.Add(fileContent);
 
            using (var client = new HttpClient())
            {
                await client.PutAsync(UploadUrl, content);
            }
        }
    }
}

Next up is the PhotoListener class, which connects to the SignalR hub and listens for new photos. When new photos are received, it raises an event that the app can respond to.

using System;
using System.Collections.Generic;
using Microsoft.AspNet.SignalR.Client.Hubs;
using System.Threading.Tasks;
 
namespace GalleryApp.Core
{
    public class PhotoListener
    {
        private const string Url = "http://192.168.1.103/signalr";
        private HubConnection _connection;
        private IHubProxy _proxy;
 
        public event EventHandler<IList<string>> NewPhotosReceived;
 
        public async Task StartListening()
        {
            _connection = new HubConnection(Url);
            _proxy = _connection.CreateHubProxy("gallery");
 
            _proxy.On<IList<string>>("newPhotosReceived", urls =>
                                     {
                if (NewPhotosReceived != null)
                    NewPhotosReceived.Invoke(this, urls);
            });
 
            await _connection.Start();
        }
    }
}

That’s it! The GalleryApp.Core.iOS project simply links these files into a Xamarin.iOS class library so they can be used from an app. These classes make use of some newer features like async/await and HttpClient, which are currently available in the alpha and beta channels of Xamarin.
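
Before getting into the iOS specifics, here’s the basic usage pattern for these two classes on their own. This is just a minimal console-style sketch (the file path is a placeholder); any platform that can run the library could do the same:

using System;
using System.IO;
using System.Threading.Tasks;
using GalleryApp.Core;

public class GalleryClientSketch
{
    public static async Task Main()
    {
        // Listen for photos uploaded by anyone else and print their URLs
        var listener = new PhotoListener();
        listener.NewPhotosReceived += (sender, urls) =>
        {
            foreach (var url in urls)
                Console.WriteLine("New photo: " + url);
        };
        await listener.StartListening();

        // Upload a photo of our own (the file path here is just a placeholder)
        var uploader = new PhotoUploader();
        await uploader.UploadPhoto(File.ReadAllBytes(@"C:\temp\photo.jpg"), "jpg");

        Console.ReadLine();
    }
}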

GalleryApp-iOS

Finally, let’s look at the iOS app. First we’ll look at the code, then discuss what’s going on in there:

using System;
using System.Runtime.InteropServices;
using GalleryApp.Core;
using MonoTouch.Dialog;
using MonoTouch.Foundation;
using MonoTouch.UIKit;
 
namespace GalleryAppiOS
{
    public partial class MainViewController : DialogViewController
    {
        private const string UploadUrl = "http://192.168.1.103/api/photo";
 
        private UIImagePickerController _picker;
        private PhotoUploader _uploader;
        private PhotoListener _listener;
        private Section _imageSection;
 
        public MainViewController() 
            : base (UITableViewStyle.Grouped, null)
        {
            _imageSection = new Section();
 
            Root = new RootElement("RealTime Gallery")
            {
                _imageSection
            };
        }
 
        public override void ViewDidLoad()
        {
            base.ViewDidLoad();
 
            NavigationItem.RightBarButtonItem = new UIBarButtonItem(UIBarButtonSystemItem.Add);
            NavigationItem.RightBarButtonItem.Clicked += delegate { UploadPicture(); };
 
            _uploader = new PhotoUploader();
            _listener = new PhotoListener();
 
            _listener.NewPhotosReceived += (sender, urls) =>
            {
                InvokeOnMainThread(() =>
                {
                    foreach (var url in urls)
                    {
                        _imageSection.Add(
                            new ImageStringElement(DateTime.Now.ToString(), 
                                                   UIImage.LoadFromData(NSData.FromUrl(new NSUrl(url))))
                        );
                    }
                });
            };
 
            _listener.StartListening();
        }
 
        private void UploadPicture()
        {
            _picker = new UIImagePickerController();
            _picker.SourceType = UIImagePickerControllerSourceType.PhotoLibrary;
            _picker.Canceled += delegate { _picker.DismissViewController(true, null); };
            _picker.FinishedPickingMedia += (s, e) =>
            {
                _picker.DismissViewController(true, null);
                var image = (UIImage)e.Info.ObjectForKey(new NSString("UIImagePickerControllerOriginalImage"));
                byte[] bytes;
                using (var imageData = image.AsJPEG())
                {
                    bytes = new byte[imageData.Length];
                    Marshal.Copy(imageData.Bytes, bytes, 0, Convert.ToInt32(imageData.Length));
                }
 
                _uploader.UploadPhoto(bytes, "jpg");
            };
 
            PresentViewController(_picker, true, null);
        }
    }
}

To simplify the UI creation, I’m making use of MonoTouch.Dialog, which ships with Xamarin.iOS and makes it really easy to create UI elements in code. When the view loads, we set the right bar button to an Add symbol and attach a click handler that kicks off a photo upload. The controller then starts listening for new photos, adding a new ImageStringElement to the UI for each one that comes in.

To upload a photo we just use the built-in iOS image picker, so we don’t have to write any of that code ourselves. Once a photo is chosen, it is converted to a JPEG, then to a byte array, and uploaded via the shared uploader code from earlier.

Summary

In this post we went over how you can combine the powers of Xamarin, SignalR, Azure, and WebAPI to quickly create a real-time photo gallery with support for multiple platforms. The best part is that it didn’t require much code at all, which is a real testament to the power of these technologies. You could also swap out Azure and WebAPI for other technologies you may prefer, such as Amazon, Nancy, ServiceStack, etc. I was personally curious to try out Azure and WebAPI, which is why I chose to go that route, and both worked quite well in this prototype.
