Thursday, Oct. 29th 2020
For the last two years, I’ve been working on a massive rewrite of Air Lookout. It finally launched last week.
Learning On The Go
When Voyager launched in 1977, radio technology at the time couldn’t have sent or received communications at the distance Voyager is from Earth today. Fortunately for the Voyager team, radio technology has advanced faster than Voyager has receded from Earth.
When I originally made Air Lookout 1, it was only the second iOS app I had made (R.I.P. Block Circle Block, my first app), and I barely understood many of the essential APIs in the iOS SDK. Since Air Lookout launched in 2016, I’ve learned a lot about iOS development, and I’ve had to grow a lot to support Air Lookout where it is today. This update is a big step in my ability and confidence to support Air Lookout, and with it, I can continue to support where Air Lookout is heading.
Redesigned To Feel Like iOS
When I made Air Lookout 1, I didn’t know more than the basics of Swift and the iOS SDK. Everything was built with UIViews, UIButtons and UIImageViews. I even made a custom navigation controller and tab controller because I didn’t understand how to reliably use UINavigationController or UITabBarController.
By comparison, Air Lookout 2 uses a broad range of APIs from the iOS, iPadOS, watchOS (and soon macOS Catalyst) SDKs. Because of my previous inability to utilize the library of iOS APIs, Air Lookout never felt like a true iOS app. I grew to hate this. I decided that, for Air Lookout to feel and work the way I wanted for users, I’d have to learn to use far more of the SDKs Apple provides.
It took a lot of learning and mistakes, and many frustrating nights and weekends over more than two years, but Air Lookout 2 is finally here, designed with the functionality and features that I want (and more to come).
New in 2.0
There are a lot of new features in Air Lookout 2. Here are a few of my favorites.
Qualitative Meets Quantitative With Week-At-A-Glance
One of the hardest aspects of air quality is that it’s difficult to react to the quantitative numbers, e.g. Moderate (55) or Unhealthy For Sensitive Groups (115), based on experience alone. While the Air Quality Index categories help, seeing the context of previous days helps users answer the common questions: “Is today’s air quality better or worse than yesterday’s?” and “Is the air quality improving or worsening?”
Week-at-a-glance works by showing today’s AQI in comparison with the past three recorded AQI highs and the forecast AQIs (where available). Additionally, week-at-a-glance overlays the current air quality reading with the forecast for today. When an exact forecast AQI is not given, week-at-a-glance will show the expected air quality index range.
While there is a slight learning curve, once you’re used to reading week-at-a-glance, the higher information density is quick to parse.
Your Favorite Locations Without Location Sharing
With Air Lookout 2.0, there are a number of ways to keep track of your favorite locations without ever giving Air Lookout location permissions.
The easiest way, available to everyone, is to set a home site. This can be accessed under Settings → Home Site.
More importantly, all sites and their data can be accessed by search, which is available at the top of the Stations tab on iOS and at the top of the sidebar on iPadOS.
If you wish to keep track of even more sites, you can unlock Air Lookout Pro and save as many sites as you want as favorites.
Now Available For iPad
For the first time, Air Lookout is available on the iPad. I’m not sure how many people were clamoring for air quality apps on the iPad, but I love using Air Lookout on my iPad Pro.
Air Lookout Pro features, like the detailed hourly chart and map, are awesome on the larger iPad screen. Additionally, there’s a sidebar for easy access to the same sections as the iOS tab bar, plus shortcuts to view favorite locations.
There are more Air Lookout Pro features coming that I think will be exceptional on the iPad.
AQI On The Go: Graphic Rectangular Complication
Complications for watchOS have always been a high priority feature for Air Lookout.
When I created week-at-a-glance, I wanted to make sure it would work on everything from screens as large as an iPad’s down to the small graphic rectangular complication. The result for this complication is simplified: a bar chart instead of floating dots and category ranges. This decision was made to increase clarity on small watch screens.
During business hours I use the calendar graphic rectangular complication; outside of them, I switch to a Modular Compact watch face with this graphic rectangular complication1.
I had a lot of fun programming Shortcuts. It was hard to resist delaying Air Lookout to create additional Shortcuts (I strangely want to make a Shortcut that changes a smart light to an air quality index category color based on current conditions).
The Shortcuts that ship with Air Lookout should be enough to do a number of basic and useful tasks relating to getting AQI from nearby or favorite stations (App Settings → Shortcuts).
I made a basic shortcut, which you can download, that controls my air filter through a smart outlet. This can be further automated by using the fantastic Pushcut app.
The Most Important Lesson
After I shipped Air Lookout 1.0, which was originally a $0.99 USD paid app, I had an unexpected discovery: my best sales were during the worst wildfires. This made me feel unbelievably gross. Profiting off others’ misfortune is the last thing I want Air Lookout to do. As a result, I made Air Lookout 1.x free. No tip jar. No in-app purchase. No profit. I would prefer to run Air Lookout at a loss than profit when people are at risk or in danger.
Not only is clean air an essential human right, but access to the air quality data needed to make daily health and safety decisions should never be behind a paywall.
Foundation For The Future
The biggest feature in Air Lookout 2.0 is, selfishly, for me2: there’s a new foundation for Air Lookout. To prepare for Air Lookout 2, I removed considerable tech debt that I had accumulated from early and naive decisions. The framework that powers Air Lookout is rewritten and ready for new technologies (such as SwiftUI and Combine) across a variety of devices from watchOS and iOS to macOS and HomePod.
I’m excited about all the upcoming and future features that this foundation will be able to support.
Nearly 50 beta testers provided essential feedback for Air Lookout 2.0. Without them, the binary and design that shipped wouldn’t have been nearly as good. More importantly, I need to thank Val for giving me the personal support I needed. When I started working on Air Lookout, Val and I were dating. Now, by the time Air Lookout 2 has shipped, we’ve been married for over two years. That’s pretty neat!
Thanks for reading this whole post.
Do you want to support my work? Then download Air Lookout on the App Store and consider upgrading to Air Lookout Pro. You can learn more at airlookout.com.
If you have any feedback or questions, contact me on twitter. I would recommend following @airlookout on twitter for future update information.
1. If you open this link on an iOS device that your watch is paired with, it should give you the exact Modular Compact watch face with the Air Lookout complication.
2. However, the easier and more reliably I can update Air Lookout, the more users will benefit.
Friday, April 24th 2020
Many Apple developers and rumor followers have been expecting, or hoping for, ARM-based Macs in the near future. The performance of the A12- and A13-powered iPhones and iPads has surpassed the performance of Intel-powered Mac laptops and desktops in certain areas.
The Verge: Apple will reportedly use 12-core 5nm ARM processor in a 2021 Mac:
Apple will release its first Mac powered by an ARM processor in 2021, Bloomberg reports. The company is thought to have three Mac processors in development as part of its Kalamata project, which are all based on the A14 chip that’s due to be used in this year’s flagship iPhone lineup. According to Bloomberg, the first of these processors will include a 12-core CPU with eight high-performance “Firestorm” cores and at least four energy-efficient “Icestorm” cores.
Bloomberg’s report offers a lot of technical details on the form Apple’s chips could take:
- Three Mac System-on-Chip (SoC) designs based on the A14 processor are currently in development, and work has also started on a Mac SoC based on next year’s iPhone processor. Bloomberg speculates that Apple is planning to keep both its laptop and mobile chips on the same development cycle.
- The Mac chips will reportedly be manufactured by TSMC based on a 5nm fabrication process.
- The first of these chips will feature eight high-performance CPU cores and at least four energy-efficient cores, for 12 cores in total. The A12Z chip used in the current iPad Pro has eight cores: four high performance and four energy efficient.
- As well as a CPU, the SoC will also include a GPU.
- ARM Mac computers will continue to run macOS rather than switching to iOS, similar to the approach taken with existing Windows laptops that use Qualcomm ARM processors.
- Bloomberg speculates that Apple’s first ARM-based machines will be lower-powered MacBooks because its own chips won’t be able to match Intel’s performance in its higher-end MacBook Pros, iMacs, and Mac Pro computers.
- Back in 2018, Apple reportedly developed a prototype Mac chip based on that year’s iPad Pro A12X processor. The success of this prototype is thought to have given the company the confidence to target a transition as early as 2020.
mjtsai: ARM Macs in 2021:
I expect the ARM transition to be accompanied by removal of lots of APIs, so developers will have to contend with that, as well as porting and testing their own code, and dealing with any dependencies that have broken.
While everyone has mostly focused on what the first lower-powered ARM Mac could be (likely the MacBook), they’re quick to say that Apple’s ARM chips couldn’t compete with the high-end Intel Mac laptops or the Intel Xeons in the iMac Pro and Mac Pro. This has left me wondering what a high-powered, workstation-class macOS ARM processor could be like.
While I do have a travel laptop, almost all of my work is done on a 2015 iMac1, so I also wonder how long it might be until a workstation-class ARM-based Mac is available.
Fortunately, some ARM powered workstations and servers2 already exist for comparison.
AnandTech: Arm Development For The Office: Unboxing an Ampere eMag Workstation
Inside the system is a 32-core Ampere eMag server, with 256 GB of eight-channel DDR-2666 memory, a 500GB WD Black SN750 NVMe SSD, a 960 GB Micron 5300 Pro SATA SSD in the rear, a Corsair VS 650W power supply, and an AMD Radeon Pro WX 5100 graphics accelerator…
The eMAG 8180 is a 32-core design running at 2.8 GHz with a turbo up to 3.3 GHz, with a TDP of 125 W. This is a first generation eMAG, which uses the old AppliedMicro Skylark microarchitecture, a custom design of Arm v8 with 32 MB of L3, 42 PCIe lanes, and eight memory channels. Avantek offers the system with three optional graphics cards: AMD FirePro W2100, a Radeon Pro WX 5100, and the NVIDIA Quadro GV100.
I am really curious to see how this CPU benchmarks against some similar wattage Xeons. This seems really promising for a design and development workstation.
This Ampere eMag Workstation can be configured on their website and starts at $3,938. A setup like this could be comfortably within the Mac Pro price range.
AnandTech: Next Generation Arm Server: Ampere’s Altra 80-core N1 SoC for Hyperscalers against Rome and Xeon:
On top of the 80 cores, the SoC will also have eight DDR4-3200 memory channels with ECC support, up to 4 TB per socket. There are also 128 PCIe 4.0 lanes, with which the CPU can use 32 of them to hook up to another CPU for dual socket operation. The dual socket system can then have a total of 192 PCIe 4.0 lanes between it, as well as support for up to 8 TB of memory. We are told that it’s actually the CCIX protocol that runs over these PCIe lanes, which means 25 GB/s per x16 linkup. That’s good for 50 GB/s in each direction.
Each of the 80 cores is designed to run at 3.0 GHz all-core, and Ampere was consistent in its messaging in that the top SKU is designed to run at 3.0 GHz at all times, even when both 128-bit SIMD units per core are being used (thus an unlimited turbo at 3.0 GHz). The CPU range will vary from 45W to 210W, and vary in core count - we suspect these SKUs will be derived from the single silicon design, and it will depend on demand as well as binning as to what comes out of the fabs.
This definitely sounds promising. 3 GHz across 80 cores would be amazing. The 45 W TDP of the lower-spec CPUs (barely) matches the 45 W TDP of the 16” MacBook Pro’s CPU. The high-end 210 W spec is also very close to the 2019 Mac Pro Xeon W TDP of 205 W.
While I have no idea what an Apple-designed workstation CPU would be like, it should be noted that both of the above examples from Ampere are manufactured by TSMC, the same fab that will be producing Apple’s chips.
A Brief Note on Software: Unfortunately, a lot of design tools are still single-core or barely multi-core capable. Design tools aren’t ready for this massively parallel future… yet.
I hope that once more 64 core (and more!) CPUs become standard in design workstations it will motivate software to take advantage of more parallel computing.
1: The last two generations of iPhone have a faster single core speed than my iMac. The iPhone 11 even has a 30% faster single core benchmark. Regarding multi-core, the 2018 iPad Pro is 9% faster than my iMac.
2: ARM powered servers have been making some big gains for their great performance per watt. I wonder if there’s a small possibility of an ARM-based Xserve returning. At least, I can dream about it.
Thursday, April 23rd 2020
I have been using the guetzli image encoder from Google, and I’m really impressed with the results it delivers.
Here’s a quick comparison using a photo that I recently took with an iPhone Xs that would potentially be a problematic image for most JPEG encoders. It is straight from the camera and not edited.
Below are 800 × 600 crops of the encoded images, exported at 200% as PNGs (a lossless format). The original full-size photo from the iPhone was given to the encoders.
All exports/encodings were done on my iMac (27-inch Retina Late 2015) with Intel Core i7-6700K @ 4.0 GHz (4 cores).
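The crops above can be reproduced with a one-liner. This is a minimal sketch, assuming ImageMagick’s `convert` is installed; the crop offset and file names are illustrative, not the exact values I used:

```shell
#!/bin/bash
# Cut an 800x600 crop from an encoded image, scale it to 200%,
# and save it losslessly as a PNG for side-by-side comparison.
crop_2x() {
  convert "$1" -crop 800x600+0+0 -resize 200% "$2"
}
```

e.g. `crop_2x guetzli-q80.jpg crop-q80.png`.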
- Shot on iPhone Xs
- Full photo file size: 6.6MB (6,638,714 bytes)
- Resolution: 4032 × 3024
- Used 2× lens
View crop of original
Photoshop JPEG Export Quality 30
- Full photo file size: 2.9MB (2,892,924 bytes)
- Time to export: “a second or two”
View crop of Photoshop export at 30 quality
Guetzli JPEG Encode Quality 80
- Full photo file size: 2.5MB (2,451,868 bytes)
- Time to encode: 24:47.25
View crop of guetzli encode at 80 quality
When exporting large photographic images at retina resolutions for the web, it’s fairly common to export at the full retina (2×) resolution but drop the JPEG quality down to 30 or 40.
This is exactly what I did with the first Photoshop export. These results are fairly in-line with what I would expect and aim for.
However, for the guetzli JPEG encode, I was able to bring the quality all the way up to 80 and still have the resulting image be 0.4MB smaller.
Here’s what I noticed about the guetzli export:
- Better color reproduction (especially in the sky blues)
- Less color banding in the sky’s color gradient
- Better separation between the mountain ridge and the sky
- Similar but slightly less JPEG artifacts in the tree branches
Photoshop JPEG Export Quality 80
- Full photo file size: 6.8MB (6,849,138 bytes)
- Time to export: “a second or two”
View crop of Photoshop export at 80 quality
To be fair to Photoshop, here’s what Adobe’s JPEG encoder can do at a quality of 80. While the visual quality is definitely improved (no sky banding, better color), the file size has grown to be larger than the original photo!
Where did this extra information (around 0.2MB!) come from? I assume the increased file size is because iOS has a more optimized Huffman table than Adobe Photoshop. I also don’t know what JPEG quality the iOS JPEG encoder is aiming for (there is some slightly noticeable compression in the original JPEG1 if you look closely).
Due to the resulting increase in file size, I would never be able to export large photographic retina assets for the web at this quality using the Photoshop JPEG encoder.
Guetzli JPEG Encode Quality 30
- Full photo file size: 1.9MB (1,902,340 bytes)
- Time to encode: 22:15.62
View crop of Guetzli export at 30 quality
Inversely, here’s what guetzli is capable of doing with a quality of 30.
I’ve noticed that below 60 quality, the resulting JPEG can be quite poor, with a lot of JPEG artifacts. While the sky maintains a reasonable gradient with minimal banding, the sharpness of the ridge-line is gone, and the branches have become quite blocky and lack sharpness. In some ways, I would consider this lower visual quality than the Photoshop export at 30 quality.
While the actual visual quality has dropped tremendously, this has only saved about 0.5MB on a 12MP photo. When I export JPEG images for the web, I always try to find these steep quality reductions. The higher visual quality of the guetzli encode at 80 for only ~0.5MB more (a ~29% increase in file size) would, in this scenario, be an easy decision for me.
Photoshop JPEG Export Quality 100
- Full photo file size: 9.6MB (9,596,247 bytes)
- Time to export: “a second or two”
View crop of Photoshop export at 100 quality
I also wanted to see what these two encoders could do at a quality of 100. What’s interesting about this Photoshop export is that although the resulting visual quality is quite good, the file size is considerably larger than the original photo’s! Again, I’m interested to know where this extra information came from (an additional 3MB!).
Guetzli JPEG Encode Quality 100
- Full photo file size: 6.1MB (6,138,260 bytes)
- Time to encode: 21:12.58
View crop of Guetzli export at 100 quality
Fortunately sanity has been restored (at least in this scenario) and the resulting encoded JPEG is smaller than the original photo! Like the other high quality exports, the visual quality is incredible but the file size is just too large.
It’s worthwhile to note how steep the file-size drop-off can be with guetzli. From a quality of 100 to 80, we saved 3.7MB (around 60%) without much loss of visual quality.
I always search for these sweet spots when creating JPEG assets for websites. I wish there were image export tools that could automatically export images at a variety of quality levels and optimizations and graph the resulting file sizes next to their respective image previews. It would save me a ton of time.
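Lacking such a tool, a rough version of the sweep can be scripted. Here’s a minimal sketch, assuming ImageMagick’s `convert` as the stand-in encoder; the quality levels and file names are illustrative:

```shell
#!/bin/bash
# Encode one source image at several JPEG qualities and print a crude
# file-size table, so the "sweet spot" quality level is easy to eyeball.
sweep() {
  local src="$1"; shift
  local q out
  for q in "$@"; do
    out="${src%.*}.q${q}.jpg"
    convert "$src" -quality "$q" "$out"   # any encoder CLI could go here
    printf '%3s  %9s bytes\n' "$q" "$(wc -c < "$out")"
  done
}
```

e.g. `sweep photo.png 30 50 80 100` prints one size line per quality level, which you can then compare against the image previews by hand.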
Guetzli JPEG Encode Quality 1
- Full photo file size: 1.9MB (1,902,340 bytes)
- Time to encode: 21:01.04
View crop of Guetzli export at 1 quality
Out of curiosity I wanted to see the extreme lowest quality that Photoshop and guetzli would encode.
It should be noted that the guetzli quality 1 and quality 30 JPEGs have the exact same file size. Running md5 on both confirms they are, in fact, the exact same file.
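The byte-for-byte check is easy to reproduce. A small sketch (the file names are illustrative); `cmp -s` compares the raw bytes directly, which is an even stricter check than comparing md5 hashes:

```shell
#!/bin/bash
# Report whether two encoder outputs are the exact same file, byte for byte.
identical() {
  if cmp -s "$1" "$2"; then
    echo "identical"
  else
    echo "differ"
  fi
}
```

e.g. `identical out-q30.jpg out-q1.jpg`.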
I suspect the guetzli encoder has not been optimized for these very low quality levels or might be hitting a bottom barrier or threshold for the algorithm. This also makes me wonder if the above quality 30 has already hit that barrier and what quality level that barrier is really at.
Photoshop JPEG Export Quality 1
- Full photo file size: 1.5MB (1,467,329 bytes)
- Time to export: “a second or two”
View crop of Photoshop export at 1 quality
This quality 1 export from Photoshop is definitely not usable for this specific image. To Adobe’s credit, their algorithm does continue to save file size beyond the quality 30 export (another 1.4MB, or about 50%).
It may seem like a goofy test, but occasionally there are simple photographic images that can look reasonably good at JPEG quality levels of 10 or 20.
Comparison Of Quality 100, 80 And 30 From Guetzli And Photoshop
View full size comparison
Encoding JPEGs with guetzli can create visually high quality results with considerably smaller file sizes than other comparable tools. However, this all comes at a computational cost. While spending more than 20 minutes for a 12MP photo likely doesn’t make sense for photographers, it does make sense for web and app designers or developers.
Spending a few extra minutes to encode a large and prominent image in a way that saves megabytes of file size on a website that could be downloaded and viewed by millions seems reasonable to me. My hope is that design tools like Photoshop, Sketch or Figma will update their image exporting process to optionally use guetzli in the future (although I won’t be holding my breath…).
Unfortunately, compiling C++ programs and editing shell scripts is out of reach for most designers. I hope that other peripheral tools like ImageOptim will support guetzli encoding soon.
If you aren’t afraid of installing ports via Homebrew or the command line and you regularly work with JPEGs, I’d strongly recommend giving guetzli a try.
Let me know on twitter if you have used guetzli before or found this post helpful.
Also: R.I.P. JPEG2000
Here are all the full, not cropped, encoded and exported JPEGs that are referenced above.
The guetzli tool from Google rejects all quality settings below 84. I do not understand this decision.
If you try to encode at a quality setting below 84, guetzli gives this response:
Guetzli should be called with quality >= 84, otherwise the output will have noticeable artifacts. If you want to proceed anyway, please edit the source code.
I took their advice. I created a fork of Google’s guetzli project and disabled the test for quality levels below 84.
rectangular/guetzli on Github
I also created a bash script to speed up encoding a folder full of JPEG images. While guetzli will not take advantage of additional CPU cores or threads, this shell script will start simultaneous guetzli encoding tasks to keep your CPU busy. Warning: guetzli encoding is very CPU intensive, and using my shell script will demolish your computer for many minutes. However, you will have some nice JPEGs in a relatively short amount of time.
rectangular/convert.sh gist on Github
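The parallel-encode approach can be sketched roughly like this. It’s a minimal, hand-written version, assuming `guetzli` is on your PATH; the job count and quality are illustrative defaults, not values from my gist:

```shell
#!/bin/bash
# Keep up to $JOBS guetzli processes running at once over a folder of JPEGs.
# guetzli itself is single-threaded, so parallelism comes from running
# several independent encodes side by side.
JOBS=4
QUALITY=84
encode_all() {
  local f
  for f in "$1"/*.jpg; do
    # Wait for a free slot before launching the next background encode.
    while [ "$(jobs -rp | wc -l)" -ge "$JOBS" ]; do
      sleep 1
    done
    guetzli --quality "$QUALITY" "$f" "${f%.jpg}.guetzli.jpg" &
  done
  wait  # block until every remaining encode finishes
}
```

e.g. `encode_all ./photos` will chew through the folder four encodes at a time.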
It should also be noted that guetzli is not aware of JPEG rotation data or embedded color profiles. This is unfortunately not the first time that JPEG rotation data has caused me additional frustration.
An Interactive Introduction to Fourier Transforms (via):
Fourier transforms are a tool used in a whole bunch of different things. This is an explanation of what a Fourier transform does, and some different ways it can be useful.
Did you know Fourier transforms can also be used on images? In fact, we use it all the time, because that's how JPEGs work! We're applying the same principles to images – splitting up something into a bunch of sine waves, and then only storing the important ones.
After publishing my post, I was led to this great interactive article on Fourier transforms. I think Fourier transforms, and how JPEGs use them, finally clicked for me.
1. Why did I use a JPEG for my input file and not RAW, HEIF or a lossless format? Most clients and web projects get their image assets from stock photo websites. Typically these are high-quality JPEGs. It would be unrealistic to mimic web asset creation by using a lossless image file as an input to the encoders.
2. The file that was generated by guetzli at quality 1 is the exact same file as was generated at quality 30. I included it anyway.
Saturday, April 18th 2020
This documentary mostly follows AlphaGo (Google’s Go-playing AI) competing against Lee Sedol (one of the top-ranked Go players in the world).
I don’t know much about Go but I found the documentary to be extremely interesting. I strongly believe that computers should be used to make humanity smarter and not to be a crutch to make us dumber. It’s amazing to watch AlphaGo be a catalyst for one of the world’s best Go players who has to rethink his understanding of Go in order to attempt beating this AI.
If you have a spare hour and a half, I’d consider watching it.
AlphaGo Documentary on YouTube
Monday, April 13th 2020
The inside-story of what it takes to win sailing’s iconic race around the world, and the ultimate test of a team in professional sport. Seven fully professional sailing teams battle to claim ocean racing’s greatest prize, in a race that covers 11 legs over 45,000 nautical miles, taking in 12 major cities and six continents.
I am forever fascinated by extreme endurance challenges. There aren't many that can compare to the Volvo Ocean Race, a nine-month-long sailboat race around the world. It has the tech of F1, the navigation of an adventure or orienteering competition, the weight consciousness of pro cycling, the brutality of the Dakar, and survival challenges not quite like anything else.
If you have some spare TV time, I think you'll find this entertaining. The entire documentary series is on YouTube free of charge.
The Volvo Ocean Race 2017–2018 playlist on YouTube