• New Favicon

    Last week I sold my laptop and received delivery of the new iPad Pro. I’ve a bunch of posts I want to make about the iPad, but here’s a short one for now.

    I’ve been meaning to add a favicon for this blog for a while, but just never got round to it. So, this morning, whilst playing around in Procreate, I decided to draw one up. I wanted some depiction of a pizza, however all the references that I could find online were certainly not Neapolitan (the undisputed best form of pizza), so, I made my own. Here is the process as exported from Procreate - a really cool feature that lets you export a time lapse of your drawing.


  • Presenting a series of UIAlertControllers with RxSwift

    I was recently writing a helper application for work that would store a bunch of URL paths that we support as Deeplinks. The user can pick from one of these paths and will be prompted to input a value for any parameters (any part of the path that matched a specific regex). However, the app got pretty complex when having to deal with multiple capture groups in that regular expression.

    The problem was this - given an array of strings (capture groups from my regular expression), show an alert one after the other requesting a value that the user wishes to input for that ‘key’.

    Traditionally, I would have probably made some UIAlertController-controller that might look something like this.

    protocol AlertPresenting: class {
        func present(alert: UIViewController)
    }

    protocol MultipleAlertControllerOutput: class {
        func didFinishDisplayingAlerts(withChoices choices: [(String, AlertViewModel)])
    }

    class MultipleAlertController {
        private var iterations = 0
        private var viewModels = [AlertViewModel]()
        private var results = [(String, AlertViewModel)]()
        private weak var alertPresenter: AlertPresenting?
        weak var output: MultipleAlertControllerOutput?

        func display(alerts alertViewModels: [AlertViewModel]) {
            self.iterations = 0
            self.results = []
            self.viewModels = alertViewModels
            self.showNextAlert()
        }

        private func showNextAlert() {
            guard viewModels.indices.contains(self.iterations) else {
                self.output?.didFinishDisplayingAlerts(withChoices: self.results)
                return
            }
            self.display(alert: viewModels[self.iterations])
        }

        private func display(alert: AlertViewModel) {
            // some stuff here showing the alert, appending the result to `results`
            // and invoking showNextAlert() after completion
            iterations += 1
        }
    }

    Now this isn’t necessarily bad, but having to keep track of or reset state can always lead to bugs - particularly in asynchronous situations. Instead I had an idea for doing this using RxSwift.

    import UIKit
    import RxSwift

    func populateParameters(for alertViewModels: [AlertViewModel], presentingAlertFrom viewController: ViewControllerPresenting?) -> Observable<[(String, AlertViewModel)]> {
        return Observable
            .from(alertViewModels)
            .concatMap { alertViewModel -> Observable<(String, AlertViewModel)> in
                guard let vc = viewController else { return Observable.empty() }
                let textFieldResponse: Observable<String> = vc.show(textFieldAlertViewModel: alertViewModel)
                return textFieldResponse
                    .map { textInputValue in (textInputValue, alertViewModel) }
            }
            .toArray()
    }

    // Helper method for Rx UIAlertControllers
    protocol ViewControllerPresenting: class {
        func present(viewController: UIViewController)
    }

    extension ViewControllerPresenting {
        func show(textFieldAlertViewModel: AlertViewModel) -> Observable<String> {
            return Observable.create { [weak self] observer in
                let alert = UIAlertController(title: textFieldAlertViewModel.title,
                                              message: textFieldAlertViewModel.message,
                                              preferredStyle: .alert)
                alert.addAction(.init(title: "Cancel",
                                      style: .cancel,
                                      handler: { _ in
                                        observer.onCompleted() }))
                alert.addAction(.init(title: "Submit",
                                      style: .default,
                                      handler: { _ in
                                        observer.onNext(alert.textFields?.first?.text ?? "")
                                        observer.onCompleted() }))
                alert.addTextField(configurationHandler: { _ in })
                self?.present(viewController: alert)
                return Disposables.create {
                    alert.dismiss(animated: true, completion: nil)
                }
            }
        }
    }

    Now, this looks like a lot of code, but the lower part is simply an Rx wrapper around UIAlertController - something that can be re-used for any UIAlertController with a text field.

    Let’s break down the main bits.


    The first part, Observable.from(alertViewModels), takes an array and gives you an observable that emits each element of the array in turn. In this example that's one event per match in the regular expression.

    .concatMap { alertViewModel -> Observable<(String, AlertViewModel)> in
        guard let vc = viewController else { return Observable.empty() }
        let textFieldResponse: Observable<String> = vc.show(textFieldAlertViewModel: alertViewModel)
        return textFieldResponse
            .map { textInputValue in (textInputValue, alertViewModel) }
    }

    Here we take the stream of matches for the regular expression and we concatMap each one into an observable of the String that the user typed into a UIAlertController, paired with the original AlertViewModel instance. The RxSwift operator doing the heavy lifting is the concatMap. It waits for a completed event from the current text field observable before subscribing to the next one, so the next alert is only shown once the previous one has been dismissed.
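    To make that ordering guarantee concrete, here's a minimal sketch (not from the app) of how concatMap behaves with plain values:

    ```swift
    import RxSwift

    // concatMap subscribes to one inner observable at a time and only
    // moves on to the next once the current one completes - unlike
    // flatMap, which subscribes to all of them immediately.
    _ = Observable.from([1, 2, 3])
        .concatMap { n in Observable.just(n * 10) }
        .subscribe(onNext: { print($0) })
    // prints 10, 20, 30, strictly in order
    ```

    In the alert case the inner observables only complete asynchronously - when the user taps a button - so concatMap is what serialises the alerts.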

    The result of the concatMap is an Observable of (String, AlertViewModel) tuples, which can be used by the consumer to populate the matches of the regex with the String values.

    We pipe this tuple Observable into a toArray, which collects every tuple and emits them as a single array once the source completes - which happens automatically for us once every item in the array has had an alert shown.
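    Putting it together, a call site might look something like this - a sketch, assuming the tuple result described above and a disposeBag owned by the caller:

    ```swift
    populateParameters(for: alertViewModels, presentingAlertFrom: self)
        .subscribe(onNext: { choices in
            // one (userInput, viewModel) pair per regex match, delivered
            // as a single array once every alert has been answered
            for (input, viewModel) in choices {
                print("\(viewModel.title): \(input)")
            }
        })
        .disposed(by: disposeBag)
    ```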

    I’ve cut a bunch of corners in these examples in order to simplify flow and make these two approaches more comparable. You can see the full example in my deeplinking helper app here.

  • Rab Microlight Alpine

    I recently purchased the Rab Microlight Alpine down jacket. Like with my Arcteryx jacket, I thought it might be helpful to others if I shared my opinions and some pictures as there’s limited info or pictures outside of those by the manufacturer online.

    I chose this garment to replace a North Face Thermo Ball jacket that my brother has adopted to be his own. The Rab number uses real down and is noticeably warmer than the synthetic insulation of the North Face. That being said, the Rab fits slimmer, which certainly helps to keep the warmth in.

    Rab Microlight

    The jacket comes branded with a small Nikwax - the waterproofing company - logo printed at the back. The down used in the jacket is claimed to be ‘hydrophobic’, but I’m convinced this is just marketing chat. I don’t see how the material /inside/ the jacket can really keep you dry, but I guess it’s better than nothing?

    Like the Arcteryx, this jacket has both a good hood and a soft microfibre-like material on the inside of the zip. There’s even a small sleeping-bag-like case that Rab provide for you to pack the jacket away into. Some jackets I have tried can pack away into their own pockets, which defeats the need to keep track of a little bag that I will inevitably lose.

    The jacket doesn’t pack away /that/ small, but it’s so light that you hardly notice it on your person.

    I went for size Medium and I’m happy with the fit. I’m just shy of 6’2. Feel free to send me any questions, I’ll try and remember to add more pictures when I get some.

    Rab Microlight

  • Arcteryx Beta SL

    A few weeks ago coming down Ben Ledi, soaked to the bone for what felt like the hundredth time that year, I decided it was time to invest in a solid waterproof jacket. As soon as I was home I did a bunch of online reading (notably Outdoor Gear Lab and The Wirecutter) and over the next few weeks I took every opportunity to run into an outdoors shop in order to try out brands and sizes. Eventually I settled on the Arcteryx. Had I not gone for this jacket, the Mountain Equipment Rupal was next on my list. Both were highly waterproof, not crazily expensive and, in my opinion, looked pretty good.

    In my looking I couldn’t find many great photos of the jackets online and any good quality ones I found did not do the fits of the jackets justice. I thought I’d share some pictures of mine for anyone else who isn’t sold by the styling you see online.

    In each of these photos I’m wearing the Arcteryx with a Rab Microlight Alpine down jacket underneath - both in size UK medium.

    Arcteryx Beta SL with Rab Microlight

    Arcteryx Beta SL with Rab Microlight

    The hood on the jacket is great: it’s cut large so that it can fit over a helmet, but it can also be cinched down for normal wear just fine. I’ll often wear over-ear headphones to work and it’s nice that they fit easily under this hood - something I couldn’t say about my previous North Face Triclimate.

    Arcteryx Beta SL with Rab Microlight

  • Photo Storage Workflow

    When I first started this blog I wrote a post about how much I love using my Fujifilm X-T20. Around the time I wrote that post I made a decision to stop paying for a Creative Cloud subscription and began using my phone as a hub for all my photos. Now, several months later, I can’t see myself ever going back.

    Previously 💾

    To provide some context, it’s worth noting my old workflow. I previously used a Canon 6D and shot exclusively in raw, meaning that each picture was around 10-25mb in size. Whenever I had taken a set of photos I would plug my SD card and an external hard drive into my laptop, open up Lightroom and transfer everything. I would then do some batch edits to the set and would often spend some more time on my favourites. Then I would export a subset of the photos and add them to Photos for Mac to be uploaded to iCloud, where I could share them with others. Finally, I had Backblaze cloud backup set up, so I would usually end up leaving my laptop on overnight, plugged full of external drives, to allow time for backups to complete.

    Going full iCloud ☁️

    Back in March I was getting pretty fed up with how long it took to get my photos anywhere. I was enjoying the portability of my Fujifilm but was still restricted by either WiFi transfer speeds to my phone or the hard drive workflow described above. At that point I had stopped using raw as I was happy with the JPEGs I got straight out of the camera. I decided, on a whim, to cancel both my Backblaze and my Creative Cloud subscriptions. I moved every photo from my hard drives into Photos.app and left it for weeks (yes, plural) to upload everything. As of today I have ‘37,611’ photos and ‘1,744’ videos stored in iCloud. I’ve also got the free tier of Google Photos so I have lower quality backups on the other side too.

    This has been great for so many reasons.

    • Access to all my photos on virtually any device - including my phone, laptop, work laptop and the increasingly good iCloud.com
    • Great utilisation of machine learning for photos. I’ve tagged friends and family and can now make full use of search in Photos. This has only been made better by iOS 12 where you can combine search terms. I can instantly find photos of all sorts. For example ‘Rory Bain’, ‘Snowboarding’ in ‘France’.
    • Vastly improved “Memories”. When adding my photos from my hard drives, I also added photos from my Mum’s old hard drives. I’ve still many more to add, but even now it’s made the Memories feature far more enjoyable. On a person’s birthday, for example, it automatically creates an album with my photos of that person over the years. Most people only get these albums covering their social media presence of the last 10 years, and those photos are limited to ones that they’ve selectively kept online. This gives me photos going back decades.
    • Less scraping through sets of photos. Another benefit of machine learning and image recognition techniques is that Photos automatically adds only distinct photos to auto-created albums. There aren’t 10 takes of the same family photo where I accidentally left continuous shutter on.
    • iOS photo editing apps are great. I personally use Darkroom 90% of the time and can do 80% of what I did in Lightroom without any importing faff.

    And much more…

    The final (for now 😬) piece of the iCloud set up that I received last week was a Lightning SD card adapter. Until now I have been using my laptop to upload my photos, having given up on WiFi transfer for being too slow. I got the Apple SD adapter, which I would highly recommend despite its high price. I tried to cheap out at first and was left with an adapter that hardly worked. Plus, the Apple one transfers at USB-3 speeds on the iPad Pro. I now leave this adapter in my bag alongside my camera and I can transfer photos in an instant. I love this setup. Its only problem is that it has me eyeing the upcoming 2018 iPad Pro… 🙄

  • WWDC 2018


    15 year old me was pretty excited about watching the Apple Event where Apple would unveil the penultimate iPods Nano and Touch. 23 year old me was even more excited to have the chance to go to San Jose and attend WWDC in person. I thought I’d write about my takeaways from attending and why I’m so keen to return.


    Learnings and Inspiration

    Although you can watch the vast majority of the WWDC sessions online, that’s nothing compared to 5 days of dedicated learning in proper theatres. There are around 6 time slots during each day, each with up to 3 concurrently running sessions, so there is always something worth going to. You end up attending so many sessions that you would likely never have watched online, and there are so many tricks to learn. I work on a large Objective-C and Swift project and our compile time is dreadfully slow. I attended several sessions that taught changes you could make to your application to improve compile time, which have now been compiled into a massive JIRA board of project cleanup tasks at work - the speed improvements alone are worth the cost of a WWDC ticket.

    The sessions are great, but even more useful are the labs. These offer you the chance to speak to the actual Apple engineers who wrote the APIs that you are consuming. I cannot recommend enough attending as many labs as possible. I went to a UIKit one and an LLVM one. In the UIKit lab I asked about Auto Layout-based sizing of UITableView headers and footers, where I had a working solution (thanks to StackOverflow) but was convinced there was a better way. Sure enough, I was linked a great GitHub Gist on how to do this properly. In the LLVM lab I was hoping for some simple explanation as to why our compile time is so slow (up to 40 minutes for a clean build and overly keen 20-minute recompiles). The engineer I spoke to was nice, but he was just shocked at how poor the compiler’s performance was on our project. The feedback I got was to file a couple of radars and to try to split up our main target into more separate frameworks - which is easier said than done. Even with no concrete actions to leave with, it was incredibly useful to at least have some validation that our build settings were correctly configured - something that can be all too easy to mess up.

    My biggest regret would be that I could not remember enough things to take to the labs. In the future I’m planning on keeping track of all solutions I have to problems that are less than satisfactory. This last week I’ve added a note to check how Apple recommends that you manage screen orientation for single cases where you need to display in landscape. Particularly for the case where you have an AVPlayerViewController - which you should not subclass.

    My personal favourites of the sessions I attended -

    Tim Cook


    Possibly my favourite thing about the conference was the sheer variety of developers you met. There are 5000 developers attending WWDC, and the Layers conference and Alt Conf run at the same time, also in San Jose. I met people who worked on custom email clients, banking clients, car rentals, music streaming, cycling, design prototyping, podcasting and even simulated Santa phoning, to name a few. All these companies are working on so many different technologies, and you are always chatting and exchanging experiences and recommendations. I met more developers in a day than I had previously met in my life. I found this invaluable, particularly at my age: should I ever come across a problem that I’m struggling with and that my co-workers have no experience in, I now have a wealth of people to contact, where there’s bound to be someone with experience in what I’m working on.

    San Jose

    Silicon Valley is fully committed to all its cliches which is both a strength and a weakness. Whilst I was there, the most exciting things for me were the dock-less electric scooters and the Tesla Model 3. Both of which appear to me to be the future of transport.

    For the unaware, the scooters are electrically powered, internet connected scooters that you can find on most sidewalks in certain US cities. You scan a barcode with your phone and you are then free to ride the scooter as far as its range of roughly 35 kilometres will allow. Typically you pay $1 to start and subsequently 25 cents per minute.

    I, for one, cannot wait for these to come to the UK. I hope that they become commonplace and that people learn to be sensible with them. My hotel in San Jose was around 2.5 miles from the conference center; for this commute I tried the tram, taxis, bikes, and scooters. Scooters were by far the most fun, convenient and cost-effective option. When traveling around Edinburgh, I’ll take my bike any chance I can get, because I love being out in the open, moving at my own pace and not being limited by public transport timetables. Dockless scooters - and bikes - offer this same freedom wherever you are. They also have the added benefit that you don’t have to lock up and keep track of your own vehicle, and scooters may lower the bar for access to last-mile transport. Not everyone feels comfortable riding a bike - some may feel far more comfortable on a scooter, where you don’t get sweaty and, in the case of an accident, it’s easier to step off than it is from a bike.

    Lime Scooters

    Whilst on the train from San Francisco International to San Jose I spotted my first Model 3. While I did not get to try one (I talked myself out of renting one on Turo) I can say that they look amazing in person.

    Model 3

    Model 3