Animation can vastly improve user experience in an application. I love buttons that animate and give you the feeling that you are actually pressing something. These combined with the great TapticEngine APIs (UIFeedbackGenerator) can completely change the way your application feels.
Here are a couple of ways of doing button animations in Swift - both of which utilise UIButton addTarget methods.
Vanilla UIKit Method
If you are not using RxSwift or RxCocoa, this method works just as well. The only downside is that once a button becomes animatable, you have no way of making it un-animatable.
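A minimal sketch of this approach (the class and method names are my own, not the original post's):

```swift
import UIKit

final class AnimatedButton: UIButton {

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Press-down events shrink the button slightly.
        addTarget(self, action: #selector(animateDown),
                  for: [.touchDown, .touchDragEnter])
        // All other touch events return it to its normal size.
        addTarget(self, action: #selector(animateUp),
                  for: [.touchUpInside, .touchUpOutside, .touchDragExit, .touchCancel])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func animateDown() {
        animate(to: CGAffineTransform.identity.scaledBy(x: 0.95, y: 0.95))
    }

    @objc private func animateUp() {
        animate(to: .identity)
    }

    private func animate(to transform: CGAffineTransform) {
        // A spring animation keeps the press feeling responsive;
        // .allowUserInteraction lets the user keep tapping mid-animation.
        UIView.animate(withDuration: 0.4,
                       delay: 0,
                       usingSpringWithDamping: 0.5,
                       initialSpringVelocity: 3,
                       options: [.curveEaseInOut, .allowUserInteraction],
                       animations: { self.transform = transform })
    }
}
```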
RxSwift + RxCocoa
This is my preferred method of making generic animations. The benefit of using RxSwift is that you can start and stop animating button presses - should you wish to - simply by disposing of the DisposeBag that you pass in. It also avoids having to expose your control-event methods with @objc.
This bit of Rx revolves around mapping button events to an animation state. For touch-down or touch-drag-enter events, we want to animate the press-down action of the button; this ‘pressed’ state is represented by CGAffineTransform.identity.scaledBy(x: 0.95, y: 0.95). For all other touch events we want to animate back to the identity (the default) transform. For more info on how transforms work, check out this article by HackingWithSwift. These transform mappings are merged and subscribed to, with the animateTransform function being called whenever a new event is received.
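The approach described can be sketched roughly as follows (the extension and its names are mine; the original post’s exact code isn’t reproduced here):

```swift
import UIKit
import RxSwift
import RxCocoa

extension UIButton {

    // Wires up press animations; dispose the bag to stop animating.
    func animatePresses(disposedBy bag: DisposeBag) {
        // Press-down events map to the scaled-down 'pressed' transform.
        let pressed = rx.controlEvent([.touchDown, .touchDragEnter])
            .map { CGAffineTransform.identity.scaledBy(x: 0.95, y: 0.95) }

        // All other touch events map back to the identity transform.
        let released = rx.controlEvent([.touchUpInside, .touchUpOutside,
                                        .touchDragExit, .touchCancel])
            .map { CGAffineTransform.identity }

        // Merge both streams and animate whenever a new transform arrives.
        Observable.merge(pressed, released)
            .subscribe(onNext: { [weak self] transform in
                self?.animateTransform(transform)
            })
            .disposed(by: bag)
    }

    private func animateTransform(_ transform: CGAffineTransform) {
        UIView.animate(withDuration: 0.4,
                       delay: 0,
                       usingSpringWithDamping: 0.5,
                       initialSpringVelocity: 3,
                       options: [.curveEaseInOut, .allowUserInteraction],
                       animations: { self.transform = transform })
    }
}
```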
If you are ever trying to get the compile time of a particular function down then hopefully the following examples will be of some use. For me, the best option from the following examples is to split long functions into smaller, composite functions.
To easily see the compile times of functions, I created an empty project and added an Other Swift Flags entry (this can be found inside Swift Compiler - Custom Flags) of -Xfrontend -warn-long-function-bodies=1. Doing this means Xcode will warn you every time you build if any function in your project took longer than 1 millisecond to type check.
Please note that these examples are just that - examples. Your mileage may vary and I would recommend not to take these as must-dos. In most cases, it makes more sense to write code that is clear and understandable - not code that compiles quickly but is confusing to yourself and others.
As you might expect, if you explicitly tell the compiler what type the values are, you get significantly faster compile times. Option C was the only function body that did not prompt a warning for compile time being >= 1ms. Option D only took 1ms though, so the difference is negligible.
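The original options A-D aren’t reproduced here, but a representative sketch of the idea, assuming a literal-heavy expression, looks like this:

```swift
// Inferred literal: the type checker has to work out the
// element types and the result type of each expression itself.
let inferred = ["small": 10 + 2, "medium": 20 + 2, "large": 30 + 2]

// Fully annotated: the type checker is told the answer up front,
// so it only has to verify that the literal matches.
let annotated: [String: Int] = ["small": 10 + 2, "medium": 20 + 2, "large": 30 + 2]
```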
I had often suspected ?? to be slower to type check than a guard or if let, but in this short example there appears to be no significant difference.
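For illustration (these functions are mine, not the post’s), the two styles being compared look like this:

```swift
// Nil-coalescing: compact, and in this comparison no slower to type check.
func scoreUsingCoalescing(_ score: Int?) -> Int {
    return score ?? 0
}

// guard let: more verbose, but type checks in roughly the same time.
func scoreUsingGuard(_ score: Int?) -> Int {
    guard let score = score else { return 0 }
    return score
}
```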
Interestingly, simply naming your arguments significantly reduces compile-time - even if you don’t explicitly type the named arguments.
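A small example of what this means in practice, assuming a closure passed to reduce:

```swift
// Anonymous shorthand arguments: reportedly slower to type check.
let total = (1...100).reduce(0) { $0 + $1 }

// Named (but still untyped) arguments: measurably faster,
// even without annotating sum and n with types.
let namedTotal = (1...100).reduce(0) { sum, n in sum + n }
```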
A nice feature of Swift is its ability to add custom subscripts. The most common one I’ve seen used is the safe operator to safely access array values.
The subscript option is clearly the cleanest implementation, but annoyingly it also takes the longest to compile by a good amount.
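The safe subscript mentioned above is usually written as a small Collection extension along these lines:

```swift
extension Collection {
    /// Returns the element at `index`, or nil if the index is out of bounds,
    /// instead of crashing like the standard subscript does.
    subscript(safe index: Index) -> Element? {
        return indices.contains(index) ? self[index] : nil
    }
}

let numbers = [1, 2, 3]
let second = numbers[safe: 1]   // Optional(2)
let missing = numbers[safe: 9]  // nil, rather than a crash
```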
Splitting Up Long Functions
For this example, I’ve taken some sample UITableViewCell code and have two options. One option has all the setup in one function. The other option has three functions, one function does half the setup, another the other half and the main function calls the other two. The time I’ve recorded is the total time for both options.
This example is perhaps the most useful of all the examples on this page. It is much better for compile time if you split your functions into smaller composite functions. The best thing about this is that generally this also results in more readable code.
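A sketch of the split-up shape, assuming a simple cell with hypothetical subviews (not the post’s actual sample code):

```swift
import UIKit

final class ProfileCell: UITableViewCell {
    let avatarView = UIImageView()
    let nameLabel = UILabel()

    // The main function only composes the smaller setup functions,
    // so each body stays short and quick to type check.
    func setup() {
        setupAvatar()
        setupNameLabel()
    }

    private func setupAvatar() {
        avatarView.contentMode = .scaleAspectFill
        avatarView.layer.cornerRadius = 20
        avatarView.clipsToBounds = true
        contentView.addSubview(avatarView)
    }

    private func setupNameLabel() {
        nameLabel.font = .boldSystemFont(ofSize: 16)
        nameLabel.numberOfLines = 1
        contentView.addSubview(nameLabel)
    }
}
```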
- Its screen is the best I’ve ever used - viewing anything edge-to-edge on this phone is a delight. The best way I can describe it is by saying it looks like it has been superimposed onto the phone using CGI.
- The gestures for navigating between apps are second to none. Using a home button to move around iOS feels like a chore in comparison.
- The build quality is the best of any iPhone. This phone just feels great in your hand - in my opinion it is the perfect size and weight.
- Face ID is good, but it is not great. It does not work when you are in bed; when your phone is on a desk (with your face slightly out of the view of the scanner); or sporadically in fairly normal conditions. Touch ID is more reliable and predictable.
- Portrait + Lighting Modes are average at best. I’ve seen many people say portrait mode has improved significantly since its release; I am highly sceptical of this. Portrait mode probably wouldn’t be so disappointing if it weren’t for the Google Pixel outperforming Apple’s offering using only one camera. Lighting modes are bad. Both effects need the absolute best lighting conditions (which are rarely available) to be any good.
- Wireless charging is pretty disappointing in my opinion. I am fairly sure that this phone (and the 8 and 8 Plus) only have glass backs so that they can do wireless charging. I would swap to a stainless steel backing - for robustness reasons - in an instant. Wireless chargers are slow, expensive and awkward to use. Apple have their own proprietary charging mat coming soon, which is supposed to be easier to use, but it is going to be ridiculously expensive and it has already been delayed for months.
Between my third and fourth years of university, I worked as a distillery tour guide and as an intern software developer. By the end of the first month I bought my ideal camera - a Canon 6D - and a month later I picked up the Sigma 35mm 1.4 ART lens. Two years later I sold both of these and instead bought a second hand Fujifilm X-T20 and Fujinon 23mm 1.4 lens and bar a few nitpicks, I believe it was a brilliant swap to make.
What I miss from my Canon:
- Brilliant low light performance
- A solid handgrip
- Long battery life
- The detail of a full frame sensor
Here are the reasons I much prefer the Fujifilm:
1. It’s small and light
- The lens and body of the Fujifilm combined weigh less than either the lens or the body of the Canon. This is probably the main factor for me. My Canon was brilliant, but I had to consciously bring it places, and it was so bulky that when I did take it somewhere, I wouldn’t want to keep it out all the time. The X-T20, on the other hand, is so small and light that I can happily leave it slung over my shoulder, so it comes with me far more often.
2. The digital viewfinder is brilliant
- Prior to getting this camera, I presumed that optical viewfinders were far superior to digital ones; on the contrary, this viewfinder works much better for me than my Canon’s did. You get focus peaking whilst looking down the viewfinder, and whilst in manual focus the middle of the screen zooms right in so you can get your shot pin-sharp. When you half-press the shutter, the screen zooms back out and you can compose your shot. It’s a great system that I wish all cameras had.
3. The in-camera JPEG processing is great
- If you set the X-T20 to JPEG mode then you can choose which film simulation is used to process your JPEGs. I now shoot in JPEG 90% of the time. I realized that I spent more time backing up and processing my photos than I ever did looking at them. Now, by shooting straight in JPEG, I don’t think nearly as much about processing, and 99% of the time I am just as happy with the photos.
4. A physical aperture ring, shutter speed dial and exposure compensation ring
- Having physical controls for each setting feels great. Combining this with the brilliant film simulations gives you the closest you can get to shooting film without spending £10 per 30 shots. Having an aperture ring has encouraged me to change the aperture in my photos far more often; I no longer shoot everything wide open and hope for bokeh.
5. Its friendliness
- This is a hard one to describe but people react differently to this camera than they do with large full frame cameras. This camera feels much more point and shoot and people are more comfortable both getting their picture taken and taking pictures themselves with it.
6. Charging over USB
- The X-T20 can be charged with any micro-USB cable. This is great for travelling as you don’t need to pack a camera specific charger and can even charge your camera using a battery pack. I regularly would forget the proprietary Canon charger and never wanted to fork out the cash for a spare one, so USB charging is a welcome change.
This weekend I forayed from the safety of my go-to Flour Water Salt Yeast book to try a recipe from The Perfect Loaf. The reason for this was that I was keen to improve the scoring on my bread. FWSY says that scoring is unnecessary and you are fine to just let your loaves split naturally. However, I’m keen to improve the neatness of my loaves and want to move on to baguettes, which will certainly require scoring. If you don’t score your loaves, they can split anywhere on the loaf as they expand - leading to a less even bake.
Going into this bake there were two factors I wanted to change in comparison to my previous loaves:
- I wanted to generate greater tension in the dough at both the folding and shaping stages. In practice, this should help the dough retain its shape and should make it easier to score.
- I wanted to try to follow the recipe of someone who likes to score their bread. The key difference with this recipe was that your shaped dough sits in a proofing basket overnight; previously I would do a long bulk ferment overnight and wait only a short time after shaping before baking in the morning.
I had varied success with these goals. Maurizio from The Perfect Loaf calls for room-temperature water to feed your starter prior to baking. However, the room temperature in my cold Edinburgh flat at this time of year is around 14℃. Knowing this, I tried using warmer water (by 2-3 degrees) than the recipe called for. In retrospect, though, I feel I should have gone even warmer; my final dough temperature still managed to be 3 degrees cooler than the recipe expected.
I was determined to get a tight ball when shaping my dough, which proved difficult as I had possibly the wettest dough I have worked with yet. Over 5 minutes I eventually got a great round ball, but at the expense of the great rise I had generated with my starter. I noticed the next morning, after baking my bread, that I had been using Allinson’s Country Grain Bread Flour, which I had naively presumed to be wholegrain flour - it is, in fact, a blend of various flours. I presume mixing this (which already contains a good deal of white and rye flour) with my own mix of white and rye flour resulted in skewed proportions of water to flour. From what I have read, rye flour typically absorbs less water and will give you a stickier dough.
Although I struggled to shape the dough, the tension I generated seemed to work very well. Scoring the loaves went really well; in the past the razor has immediately stuck to the dough, but this time it glided right through no problem - I actually wish I had cut deeper and gone for a bigger cut once I knew that it was going to cut well.
One other thing I have previously struggled with is getting the dough from the proofing basket to the counter and then into the combo cooker. I read on Maurizio’s blog that you can use baking paper and a pizza peel to invert the basket straight onto the paper, which can then simply be lifted into the pan. This turned out to be so much easier than trying to lift a wet dough with just my hands. The only downside was that I cooked the loaf in the deep pan of the combo cooker, which made the baking paper curl in at the edges and left me with a bumpy loaf. In future, when I have a good tight boule, I will just cook in the lid (inverting the pan), so the paper should pull less at the edges of the dough.
You can see here the boule had significantly less rise than my previous post. I believe this was due to me overworking the dough during shaping and also due to me mixing the wrong flours. I’m going to try again soon with the right flour, warmer water and I’ve also ordered a new dough knife from an Oban based bakery equipment fabricator.
Every iPhone since the 6S has had the ability to shoot Live Photos (a photo with the surrounding 1.5 seconds stored as video). There is nothing breakthrough about this ‘file format’, but I find them to be one of the nicest features of iOS. The video captured has a low frame rate (somewhere around 15 fps) and takes up relatively little space. Unless you are completely devoid of storage space on your phone, I see no reason not to leave them enabled.
I wanted to write this post to share some uses of Live Photos that may be less apparent.
Any live photo can be turned into a boomerang long after you took the photo. This is particularly nice as you can just quickly open the camera from your phone lock screen, take your pictures and later convert those live photos into boomerangs. To do this you need to go to the stories camera on Instagram, swipe up to view your camera roll, select a live photo and then press hard (3D touch) on the photo to turn it into a boomerang.
This app is pretty simple, but it can help you get much more use out of your live photos. You can convert them to GIFs, convert them to videos and also loop them. The best thing about this app, though, is its motion stabilization. A shaky live photo can be perfectly still after being resaved as a Live Photo using Motion Stills. I regularly use this on my Instagram posts in order to get videos where only one part of the image is moving. I’ve also used the video export of this app quite extensively in order to create a holiday video where half the content was just the video portion of the Live Photos I took whilst away.
Darkroom has become my go-to photo editing app. I love not having to import photos, being able to batch edit efficiently and the fact that it properly uses Photos APIs so that you can ‘modify’ photos instead of saving a copy (I used to end up with multiple copies of photos I was editing in various apps). Recently the app added support for editing Live Photos, in that you can apply edits to a photo and those same edits are applied to the video portion of the Live Photo. I use a combination of this and Google Motion stills to create many of my Instagram posts.
The only caveat of using Darkroom and Motion Stills to create videos for posting on Instagram is that Instagram requires videos to be longer than 3 seconds. To get around this, I created a simple app that queries your photo library for Live Photos. You simply tap the photo you wish to post, and it extracts the video component of the Live Photo, loops it as many times as required to reach 3 seconds, and saves it to your camera roll.
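A rough sketch of how such an app might work with PhotoKit and AVFoundation (the function names are mine, not the actual app’s, and error handling is omitted):

```swift
import Photos
import AVFoundation

// Fetch only assets whose media subtype includes Live Photo.
func fetchLivePhotos() -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "(mediaSubtypes & %d) != 0",
                                    PHAssetMediaSubtype.photoLive.rawValue)
    return PHAsset.fetchAssets(with: .image, options: options)
}

// Write the paired video component of a Live Photo to a temporary file.
func extractVideo(from asset: PHAsset, completion: @escaping (URL?) -> Void) {
    guard let videoResource = PHAssetResource.assetResources(for: asset)
        .first(where: { $0.type == .pairedVideo }) else { return completion(nil) }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString + ".mov")
    PHAssetResourceManager.default().writeData(for: videoResource,
                                               toFile: url,
                                               options: nil) { error in
        completion(error == nil ? url : nil)
    }
}

// Repeat the clip in a composition until it passes the 3-second minimum.
func loopedComposition(for url: URL, minimumDuration: Double = 3) -> AVMutableComposition {
    let asset = AVURLAsset(url: url)
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: asset.duration)
    guard asset.duration.seconds > 0 else { return composition }
    while composition.duration.seconds < minimumDuration {
        try? composition.insertTimeRange(range, of: asset, at: composition.duration)
    }
    return composition
}
```

The composition can then be exported with an AVAssetExportSession and saved back to the camera roll.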