I thought it was just something that we lived with as Mac users. If you wanted to use the built-in photo picker to grab an image from your photo library, it was going to take a while. For as long as I can remember, the photo picker has taken a long time to load when choosing a file from the first-party interface.
Naively, I thought the introduction of the M1 chip might speed things up. Everything else was faster, so why not? But no dice: it felt as sluggish as ever, even on my M1 Mac mini. Other files were instantly available to choose from, but something about Photos bogged things down. I put it out of my mind.
Enter the Home app. A little-known feature in that app is that you can customize the background image for specific rooms to your liking, either with a built-in color or one of your images. I was recently changing the backgrounds of my rooms to make them more recognizable, and since those images don’t sync between devices, I found myself in the Home app on the Mac to update them there. But when I went to the Home icon → Room Settings → Room Wallpaper → Choose Photo…, my jaw dropped. My photos were immediately available!
The reason is apparent: it’s using the iOS-style Photo Picker. My educated guess is that since the Home app is built using the Catalyst framework (basically an easy, but not the easiest, way to convert an iPad app to work on the Mac), it has shed the decades of cruft that slows down getting to those photos.
It’s possible that getting photos this way has been lightning-quick since 2018, when macOS Mojave’s ‘Sneak Peek’ brought the News, Stocks, Voice Memos, and Home apps over from iOS using frameworks that later became known as Mac Catalyst. But I never had a reason to upload a photo to one of those apps before, so I wouldn’t know.
You can test it yourself. In Safari, try to upload a photo from a specific album by navigating through the ‘Choose File’ UI to the Photos location in the sidebar. Count how many seconds it takes before any photos appear. For me, with a 15,000+ item library, it takes about 7.5 seconds.
Almost 8 seconds from clicking to loaded images. ⌘
Next, go through the Home app. Less than 2 seconds for me, and often even quicker. That’s nearly 4x faster.
Let’s try one final test. Shortcuts is perhaps the most complicated app Apple has built with its newest cross-platform framework, SwiftUI. When pulling up the Photos Picker on macOS, my library loads, if anything, even faster than in the Home app. It feels instant.
The fastest yet, Shortcuts loads photos almost instantly. ⌘
So, Apple, if you’re looking for some low-hanging fruit to speed up what is supposed to be the “world’s fastest web browser”, I’d suggest digging down through that old file-choosing code and sprucing it up with something more modern.
Wordle continues to be part of the zeitgeist weeks after it exploded in popularity — in large part due to its ingenious built-in sharing mechanism. Honestly, Wordle’s 15 minutes of fame have far outlasted my expectations, but I’m glad! I love Wordle, and haven’t missed (or failed!) a day since I started playing.
Almost as soon as Wordle made Twitter’s trending lists, people started discussing the accessibility, or lack thereof, of those colorful emoji squares that have become iconic to the game. While they’re fun and mysterious to look at while scrolling your timeline, those emojis are not very inclusive for people who use screen readers to engage with content.
The first stab I saw at making Wordle tweets more accessible came from Zach Knox, who made the Wordle Normalizer shortcut. This simply replaces the yellow squares (🟨) and green squares (🟩) in the shared text with different emojis (🟡 and ✅) so that they are differentiated for people who have more trouble distinguishing colors. Everything else about the text stays the same, which means it doesn’t solve the screen reader issue.
Federico Viticci went further with WordleBot, which automatically added plain text to the end of each row of emojis, describing the game’s progress line-by-line. He quickly found, though, that just adding text next to the emojis still didn’t solve the accessibility problem since screen readers would continue to read out each emoji square. A clever update to WordleBot added the ability to export the emoji to an actual image that could include Alt Text — the gold standard for describing pictures for screen readers.
But I thought I could take it one step further with a few tweaks to Federico’s shortcut. I’ve combined the best of both Wordle Normalizer and WordleBot to make a more complete solution for those who can see the emojis and those who use screen readers. It keeps the mechanics of WordleBot, but before converting the grid to an image, it replaces the colored squares with their more distinguishable counterparts from Wordle Normalizer. I also added a couple of quality-of-life enhancements that smooth out the sharing flow, which I’ll describe below.
First things first, when I share my Wordle results, I like to take advantage of the power of the hashtag, so I replaced “Wordle” with “#Wordle” in the first line of text. That gets saved for later use to put together the final shared text.
I left all of Federico’s regex alone to match and tally up all the line results. Those results get formatted into the line descriptions that will be used as the Alt Text (e.g., Line 1 - 2 partial, 1 perfect). However, I realized that there was no description given to lines that had neither partial nor perfect matches — they were just blank. I used an ‘If’ action to check for the letter “p”, which appears in both “partial” and “perfect”; if it isn’t found, the line is replaced to show that 0 letters were correct.
Each line gets a description, even if no partials or perfects were counted. ⌘
Once the line text is saved as a variable, I replace the standard square emojis with their alternate shapes inspired by Wordle Normalizer.
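If it helps to see those two tweaks outside of Shortcuts, here’s a rough Swift sketch of the same logic. This isn’t the shortcut itself, and the exact wording differs a bit, but the idea is identical: tally up each row (including the previously blank zero-correct case), then swap in the more distinguishable emoji.

```swift
import Foundation

// A hedged sketch, not the actual WordleBot actions. The description text is
// illustrative; the shortcut formats its lines a little differently.
func describe(row: String) -> String {
    let partial = row.filter { $0 == "🟨" }.count
    let perfect = row.filter { $0 == "🟩" }.count
    guard partial > 0 || perfect > 0 else {
        return "0 correct"   // the case that used to come out blank
    }
    return "\(partial) partial, \(perfect) perfect"
}

func normalize(row: String) -> String {
    row.replacingOccurrences(of: "🟨", with: "🟡")
       .replacingOccurrences(of: "🟩", with: "✅")
}

// describe(row: "⬜🟨🟨⬜🟩")   → "2 partial, 1 perfect"
// normalize(row: "⬜🟨🟨⬜🟩")  → "⬜🟡🟡⬜✅"
```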
I only ever want to share the image version of the puzzle grid with Alt Text, so there was no reason to keep the extra actions for copying the original emojis. I removed that ‘Choose from Menu’ and its associated actions. Luckily, Federico’s technique for converting emoji to images still works perfectly, even with the new emoji shapes.
I realized that in the original WordleBot, “5 perfect” was only replaced with a phrase confirming puzzle completion when sharing the text/emoji version, not the image one. I fixed it so that the Alt Text also gets that same treatment. I also added an explainer for how users can customize the text that ultimately gets shared and use the Alt Text.
If folks decide they want to change how the final text is formatted, here’s where it can be customized. ⌘
The rest of the flow for sharing remains the same from Federico’s WordleBot. You still get the grid image from your photo library when you’re ready to share. But I added one final option: if you always share your Wordle results to the same place, you can automatically open that web page. For me, it’s a Twitter thread. The ‘URL’ action is populated during a setup question, and if you choose not to use that feature, the shortcut ends by outputting the saved grid image.
Copying, confirming, and (optionally) continuing to sharing destination. ⌘
In the end, we’re left with the quintessential Wordle grid — readable by people regardless of seeing ability — and text to share with the world. A few clipboard maneuvers get the Alt Text in the right place (the ‘+ALT’ button below), and it’s ready.
The results, ready for cutting/pasting as Alt Text, and then posting. ⌘
This whole saga around Wordle accessibility warms my heart. A few years ago, it may not have been part of the discussion at all. But I’m glad that folks are thinking about making even small things, like silly game emoji on social media, more inclusive. I’m happy to have played a small role and look forward to using WordleBot again tomorrow morning.
My thanks to Zach Knox for getting the ball rolling and to Federico Viticci for doing the heavy lifting on this shortcut. I decided not to change the ‘WordleBot’ name since most of it is still Federico’s work.
If you see how WordleBot can be further improved, please do let me know! And if you’re using it to share your results, best of luck on tomorrow’s word.
As announced on the Apple Developer website, developers can now create their own custom discount codes (like SPRINGPROMO) to distribute to customers, similar to what online stores already offer.
A good move that means we’ll probably be seeing more app ads in podcasts.
The console’s design reflects the eject button’s priority. The disc eject button is bigger, higher up, and surrounded by an LED ring in the console’s iconic green glow, drawing even more attention to it.
The reasoning here is simple: the original Xbox (like its contemporaries and predecessors) was useless without discs for games, DVDs, and CDs. Without the disc tray button, your Xbox was never more than a hulking hunk of green and black plastic. So Microsoft wanted to direct you toward that button because it meant that you had bought a game and were ready to play or that you wanted to swap out discs to play something else.
What’s in a button? A good look at how product design should reflect a product’s use, both intended and actual.
HomePod mini’s light-up surface is on the top of the device, but I propose that Apple angle the surface and add a proper touch display. The new angle would allow users to more easily use controls and view content — it’d be far better and add more utility to the product.
I would love this product for around the house, and it seems like the natural evolution for added functionality in the HomePod mini. Basing it off the watchOS interface makes sense for the limited interaction I’d expect, and it plays nicely into the fact that the HomePod mini already runs on the Apple Watch processor. I share Matt Birchler’s concern for its usefulness across a room due to the display size, and continue to long for an iPad/HomePod mashup device for the kitchen in particular.
Mile 1 provides you with the most comprehensive highway milepost locator service ever built. Easy to use day or night, quickly know your distance to the nearest posts. Nearly every highway on the U.S. Interstate system is included along with a growing list of state and U.S. highways.
It’s in the worst situations on the highway, like when you’re stuck, that you need to know your mile marker. But if there’s not one in sight, what do you do? This free Mile 1 app made by developer Chris Dry saves you from walking up and down the interstate to determine the closest mile marker. I haven’t had to use it yet, but I’m glad to have it in my toolbox. It’s just too bad that it can’t get an entitlement to work with CarPlay.
So: Last.fm. There are a few things I like about it. First, it seems to take into account my entire listening history, though it does give greater weight to recency and frequency. Second, it shows me why it is recommending a particular artist or album. Something as simple as that helps me contextualize a recommendation. Third, its suggestions are a blend of artists I am familiar with in passing and those that I have never heard of.
Most importantly, it feels free of artificial limitations. Apple Music only shows a maximum of eight similar artists on my iPhone, but there are pages of recommendations on Last.fm. Echo and the Bunnymen has twenty-five pages with ten artists each. I can go back and see my entire listening history since I started my account there. Why can I only see the last forty things I listened to on Apple Music?
Should Apple buy Last.fm? It would bolster their recommendations. People love Last.fm. It would give them another lens into up-and-coming artists. I’ve been giving Last.fm a try over the last month to get more insight into my listening habits, and hopefully some better new-artist suggestions than I’m getting from Apple Music itself.
If you’ve ever tried using a hotel app as your digital room key, this is so much better. The key is now stored in Apple Wallet, which means there is no need to unlock your phone, open an app or activate a key before you want to use it.
The key is always ready, just tap your phone to unlock a door in seconds.
The apps use Bluetooth; this new Apple feature uses NFC.
If you’re a fan of Nick Offerman, you’ve got to listen to his podcast interview with Ezra Klein. I love Nick’s outlook on life, and was tickled by his study of Aldo Leopold, who is another favorite of mine.
Thanks for reading 7 Things! If you enjoyed these links, or have something else neat to share, please hit me up on Twitter or send me an email!
The Mac is such an inconvenient platform for Apple. It prevents the company from making any credible claim of an impending security catastrophe, if lawmakers force the iPhone to allow installation of apps without the App Store. With the Mac, we have almost forty years of proof that computers don’t need an App Store to be safe. Made by the same company that now tries to pretend to legislators that this isn’t possible!
If you think back to the Mac vs. PC ads, Apple pushed pretty hard on the rock-solid security of their platform. Sure, things have gotten more complicated since those aired, but does Apple really think users should worry about trusting their Mac?
To be clear, I’m not in favor of allowing side-loading on iOS. Nor do I think that the Mac should be locked down to iOS levels. I like that iOS and macOS are different in that regard. I just think that Hansson is rightfully holding Apple’s feet to the fire on how they characterize the relative security of their platforms.
As a follow-on to my ‘Copy as Embedded Tweet’ shortcut that I made last year, I set out to further improve my workflow for preparing posts for 7 Things. That shortcut let me get the proper HTML for embedding a live Tweet right from the Share Sheet. Since I also share a lot of videos in those weekly posts, I wanted a similar solution for embedding videos without having to click through the YouTube website. So, I give you a new shortcut that accomplishes a similar task, but in a different way.
This time around, I used very different techniques to build the embed code. Instead of building up a longer URL by combining pieces, I needed to break down the canonical YouTube link for the video.
1️⃣ To make sure we’re working with a real link and not just text, we use a ‘Get URLs from Input’ action. That pulls the ‘Shortcut Input’ variable, which was either passed into the shortcut or retrieved from the clipboard.
Since we can’t rely on the shared URL to be a youtube.com link, the first step is to expand the URL out in case it’s a bit.ly or youtu.be version.
2️⃣ Once we have the full-length URL, we make the first of two splits using the ‘Split Text’ action. Our goal is to narrow down to just the video’s ID, which always comes directly after the ?v= characters in the URL. After splitting by those characters, we grab the last item, which includes the ID, using ‘Get Item from List’.
3️⃣ The next split, on that item, is by & since additional attributes come after ampersands. We get the first item from the list this time, which we know must be the video’s ID.
4️⃣ Now that we have the ID, it’s just a matter of filling in that string of characters as a variable into the pre-formatted ‘Text’ action. Since the ID is the only important thing that changes between different embedded HTML for YouTube videos, that’s the only bit we need to change. The text is then copied to the clipboard.
5️⃣ The final two actions serve to confirm that everything has been copied as expected. The ‘Text’ action just lets us format the confirmation message as we please, and the ‘Show Result’ action displays it.
Now the code is ready to be pasted into another text document or website.
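If you think more easily in code than in Shortcuts actions, here’s roughly the same flow as a Swift sketch. It assumes the URL has already been expanded to a full youtube.com link, and the iframe markup is a generic example rather than the exact contents of my ‘Text’ action.

```swift
import Foundation

// A sketch of the split-and-template logic, not the shortcut itself.
func embedCode(for expandedURL: String) -> String? {
    // Split on "?v=": the video ID always comes directly after it.
    let pieces = expandedURL.components(separatedBy: "?v=")
    guard pieces.count > 1, let tail = pieces.last else { return nil }

    // Split on "&": anything after an ampersand is an extra attribute.
    guard let videoID = tail.components(separatedBy: "&").first,
          !videoID.isEmpty else { return nil }

    // Drop the ID into a pre-formatted embed template.
    return """
    <iframe width="560" height="315" \
    src="https://www.youtube.com/embed/\(videoID)" \
    frameborder="0" allowfullscreen></iframe>
    """
}

// embedCode(for: "https://www.youtube.com/watch?v=dQw4w9WgXcQ&t=42s")
// returns an iframe pointing at https://www.youtube.com/embed/dQw4w9WgXcQ
```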
My next challenge is going to be combining my two embedding shortcuts into one, or running them as functions from a different menu. That, however, will be for another time. So far, using this shortcut throughout the week to prepare links for 7 Things has been working great and saves me time.
The Afterparty is the latest series to come to Apple’s streaming service, which is one of the only services not to feature a free tier. They do offer generous free trial periods, three months with any new Apple product that can play their movies and shows, but it’s a one-time deal. That’s why posting an entire episode to YouTube is so notable. Now anyone can watch the first episode, whether or not they’ve ever tried or paid for Apple TV+.
I’ve been looking forward to this show, the first three episodes of which are available to watch now on Apple TV+. It’s directed by Christopher Miller and has an all-star cast. I’ve got Knives Out vibes from the trailer, not just from the murder mystery genre, but also the laugh-out-loud humor. It’s going to the top of my watch list.
The video’s description shows that this YouTube premiere will be short-lived and points viewers to Apple TV+ to continue watching the series:
Watch the first episode of The Afterparty, a new whodunnit comedy from Lord Miller starring Tiffany Haddish, Ben Schwartz, Sam Richardson, Ilana Glazer, Ike Barinholtz, Zoe Chao, and Dave Franco. Continue watching the next two episodes, available now on Apple TV+ and don’t miss Xavier’s final EP and music video for “Imma Live Forever” dropping February 1 https://apple.co/_TheAfterpartyEpisode2
[…]
Available on YouTube until February 6 at 11:59p PST
Apple has made their streaming content freely available before, but not in full on another platform — you always had to figure out how to get to Apple TV+. I could see this being the start of a new strategy to hook potential subscribers on a new show by meeting them where they already find new video content, and that’s YouTube.
Your browser will store these topics for three weeks before deleting them. Google says that these categories “are selected entirely on your device” and don’t involve “any external servers, including Google servers.” When you visit a website, Topics will show the site and its advertising partners just three of your interests, consisting of “one topic from each of the past three weeks.”
There are probably a lot of nuances to learn about here, but Topics sounds way more private for Chrome users than the cookies they have now, and better than FLoC. Will it only be applicable to Chrome users, though?
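If the “one topic from each of the past three weeks” part is hard to picture, here’s a toy Swift sketch of how I understand the selection working. The topic names are made up, and this is only my reading of the announcement, not Google’s actual implementation.

```swift
// Purely illustrative: three weeks of locally inferred interests…
let topicsByWeek: [[String]] = [
    ["Fitness", "Travel", "Jazz"],        // three weeks ago
    ["Cooking", "Travel", "Photography"], // two weeks ago
    ["Cycling", "Podcasts", "Jazz"],      // last week
]

// …and a site (plus its advertising partners) sees just three of them,
// one drawn from each week.
let sharedWithSite = topicsByWeek.compactMap { $0.randomElement() }
print(sharedWithSite) // e.g. ["Jazz", "Cooking", "Podcasts"]
```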
If you’re a long-time reader, you’ll know that HeyDingus doesn’t have a lot of photography. Screenshots? Sure. But featured photos? Not really. Despite having created a complicated shortcut for uploading pictures to a CDN, and cobbled together a workflow to make those Markdown-based images work with Squarespace, it’s rare that I include images in my posts.
Part of that is because I’m an inexperienced photographer without a lot of confidence. And it’s partly due to my influences — I tend to gravitate toward word-focused blogs like Daring Fireball. However, there’s no denying that the occasional featured image can enhance a post and clue the reader in on the subject. But with my Shortcuts-centric, iPhone- and iPad-heavy workflow, what’s the best (read: easiest, because any friction will reduce my interest in using them) way to get quality photos ready to insert with Markdown?
With Shortcuts itself, of course!
I’ll cut to the chase. This shortcut searches a stock photo library, presents options, previews your selection, and then gives you options on what to do with it using the Share Sheet, and copies the photo’s credit to the clipboard. It’s pretty neat.
Months ago, I noticed a Shortcuts action provided by Toolbox Pro that mentioned Pexels photos. I knew I had seen that word, “Pexels”, before. A quick search showed me that it’s a stock photo library. It is similar to Unsplash (which seems to integrate with every service and their brother, including Squarespace) in that it offers free stock photos and videos but is different in that it isn’t owned by a large corporation.
When I saw that action, I knew I could build what I wanted: a shortcut that would let me choose a photo and then pass it and its metadata through to my existing workflows. But the key to its effectiveness was leveraging the excellent parameters that developer Alex Hay built into the Toolbox Pro action.
Toolbox Pro makes excellent use of both search and output parameters to make this an incredibly flexible action. ⌘
Now would be a good time to mention that while Toolbox Pro is a free application with tons of outstanding actions at no cost, this particular ‘Find Pexels Photos’ action is part of its premium in-app purchase. That said, a mere $6 one-time payment unlocks all of the premium actions. I’ve barely scratched the surface of Toolbox Pro’s power, and there will be so much more to explore once Alex introduces macOS support down the road. Yes, that also means that this shortcut doesn’t yet work on the Mac. It’s a bummer for me, too.
Let’s get into how the shortcut is put together.
Putting Together the Pieces
Evaluating the Shortcut Input, getting the search query, and presenting the results. ⌘
1️⃣ I got clever in how this shortcut handles input. I wanted it to accept text input so that I could highlight a word in an article and then share it to the shortcut as the search term. I could have used iOS 15’s new Shortcut Input parameters to have it ask for text if nothing was passed in, but I wanted to reuse the actions if I needed to restart a search using the same search term — more on that later. So, in this case, I have the shortcut continue onto an ‘If’ action when no input is passed.
The ‘If’ action checks to see if the input has any value and then presents an ‘Ask for Input’ action. When there is input detected, it’s filled by default. If not, the ‘Ask for Input’ is blank.
2️⃣ Here’s the meat of the shortcut: the ‘Find Pexels Photos’ action. You can see here that I’m using the ‘If’ result as a Magic Variable, labeled as ‘searchQuery’. I’ve set the search parameters to any orientation, a minimum of 12 megapixels, and 15 results. Those results are passed as a variable to a ‘Choose from List’ action, using the ‘Pexels Photos’ type.
The ‘Choose from List’ action gives us rich, albeit small, thumbnail previews of the search results. ⌘
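I haven’t peeked inside Toolbox Pro, but conceptually the action is running a search against the public Pexels API. Here’s a hedged Swift sketch of what that call looks like; the endpoint and field names follow the Pexels API documentation as I understand it, not anything pulled from Alex’s code.

```swift
import Foundation

// Assumption: field names per the public Pexels API docs (v1/search).
struct PexelsPhoto: Decodable {
    struct Source: Decodable {
        let original: URL   // full-quality file
        let medium: URL     // good enough for a quick preview
        let tiny: URL       // thumbnail-sized
    }
    let photographer: String
    let url: URL            // the photo's page on pexels.com
    let src: Source
}

struct PexelsSearchResponse: Decodable {
    let photos: [PexelsPhoto]
}

func searchPexels(for query: String, apiKey: String) async throws -> [PexelsPhoto] {
    var components = URLComponents(string: "https://api.pexels.com/v1/search")!
    components.queryItems = [
        URLQueryItem(name: "query", value: query),
        URLQueryItem(name: "per_page", value: "15"),  // mirror the 15 results I ask for
    ]
    var request = URLRequest(url: components.url!)
    request.setValue(apiKey, forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(PexelsSearchResponse.self, from: data).photos
}
```

Those different-sized source URLs are presumably what feed the medium, tiny, and original URL output parameters that the rest of the shortcut leans on.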
Although it looks straightforward, we get a little more advanced with these ‘Choose from Menu’ choices. ⌘
3️⃣ Here’s where we start making use of the output parameters. This ‘Get Images from Input’ action retrieves the image from the medium-sized image URL that Pexels provides. It’s big enough to preview using the ‘Show Result’ action, but not full quality. This method speeds things up and reduces the amount of data you’re downloading throughout the shortcut.
Here’s where you can see the selected image in more detail. ⌘
4️⃣ Since the ‘Choose from List’ thumbnails are pretty small, I wanted the option to choose again in case the selected image wasn’t quite right upon further inspection with ‘Show Result’.
5️⃣ The first option is simply to continue, and it gets the variable for the selected image so it can be passed as the result of the ‘Choose from Menu’ action.
6️⃣ The second option is where things get interesting. I told you we’d come back to rerunning this shortcut as a function. Since there’s no easy way to return to an arbitrary point in the shortcut once it’s running, the best way to return to the result list is just to run it again. So that’s what happens if you select “No, choose again”. It reruns itself, passes the original search term as input, and stops the first run-through. It’s important to stop the initial shortcut here because otherwise, you get into a nasty loop.
Remember how the initial ‘If’ action presents an ‘Ask for Input’ based on the Shortcut Input? The shortcut will default to the inputted search term when running as a function. All you need to do is hit “Done”, but you could edit the query on the second round, too.
Running shortcuts as functions, especially running itself, is an advanced technique that I’m still somewhat wrapping my head around.
7️⃣ If you’d prefer to start fresh rather than rerun the exact search, this option runs the shortcut as a function, but no input text is passed along. Again, we stop running the original flow with a ‘Stop Shortcut’ action.
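If it helps, here’s the same “run yourself again, then stop” idea as a tiny, self-contained Swift sketch. None of this is the shortcut’s actual logic; it only shows why the retry is a fresh call followed by an immediate stop rather than a jump back to an earlier step.

```swift
// A toy stand-in for the search-and-confirm loop. Everything here is made up.
func pickPhoto(searchTerm: String?) -> String {
    // 'Ask for Input', pre-filled whenever a term was passed along.
    let query = searchTerm ?? "mountains"

    // Stand-in for searching, choosing from the list, and previewing.
    let candidate = "photo of \(query) #\(Int.random(in: 1...15))"

    // Stand-in for the confirmation menu.
    if Bool.random() {
        return candidate                  // "Yes, use this one"
    } else {
        // "No, choose again": start a fresh run with the same query
        // ('Run Shortcut'), and end this run right away ('Stop Shortcut').
        return pickPhoto(searchTerm: query)
    }
}
```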
8️⃣ In the end, we want the actual full-quality photo, right? By using another ‘Get Images from Input’ action, but acting on the Original URL output parameter this time, we download the selected photo for later use.
For a little while, I got tripped up by the ‘Pexels Photo’ output parameter when I tried to use that as the final photo. It doesn’t work as well as getting the original image from its URL.
9️⃣ People like to be recognized for their work, so on HeyDingus I do my best to credit everyone whose articles I link or quote and whose images I use but didn’t create. Pexels doesn’t require it as part of their Terms of Service, but I still think it’s important.
This ‘Text’ action formats the image credit the way I like it for HeyDingus. My personal preference is to credit the artist rather than the platform, so I link the photographer’s name back to the photo’s page on Pexels’ website. That page also links to the photographer’s profile and the rest of their work.
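As a concrete (if hypothetical) example, the Markdown this step produces looks something like the output below. The exact format is just my preference, and the two values come from the photographer name and photo-page URL that the Pexels action provides.

```swift
// Hypothetical helper: builds the credit line I paste under an image.
// The format is my own preference, not anything Pexels requires.
func photoCredit(photographer: String, photoPageURL: String) -> String {
    "(Image: [\(photographer)](\(photoPageURL)))"
}

// photoCredit(photographer: "Josh Sorenson",
//             photoPageURL: "https://www.pexels.com/photo/123456/")
// → "(Image: [Josh Sorenson](https://www.pexels.com/photo/123456/))"
```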
‘Show Notification’ actions can include images, too. ⌘
🔟 All that’s left is to copy that Markdown-formatted credit text to the clipboard, so it can be pasted alongside wherever the photo goes next. However, I hope you’ll notice that there’s something special about the notification I use to confirm that everything has finished as expected.
In addition to displaying the text just copied to the clipboard, the ‘Show Notification’ action lets you include an attachment. When adding the ‘Selected Image’ variable, we get a tiny image preview right in the notification. I’ve set it to the ‘Tiny URL’ parameter since it’s only going to be a small thumbnail anyway.
🎉 By grabbing the Magic Variable from our last ‘Get Images from Input’ action for this ‘Share’ action, you finally send it wherever you want next.
There we have it. The original-quality stock photo and its credit, ready to go. ⌘
I say “finally”, but in reality, this whole shortcut runs in less than 30 seconds from start to finish — including typing in a search term and making a selection.
And here’s the resulting stock image! (Image: Josh Sorenson) ⌘
A brief note: When you run this shortcut for the first time, it will look like you’re passing along all of the search results and their metadata through the actions. I don’t know why it runs like that, but I can confirm that the result is only your selected image. If you choose “Always Allow”, you won’t see that weirdness again.
I’ve spent a lot of time testing and adapting this shortcut and had fun learning about some trickier techniques to get it just right. It’s amazing that you can build a tool like this that saves you from browsing through websites to download files and copy things endlessly back and forth. I look forward to using it to spruce up my blog posts, and I hope you’ll get some use out of it, too.
I’m baffled by Google’s lack of long-term support for their first-party Pixel phones. I thought that was a huge selling point for going with the “Google Phone”.
Straight from the horse’s mouth, only about three years of updates for the Pixels. ⌘
I had to do some Wikipedia research to remember when the first Pixel phone debuted. It was in 2016, and then it stopped getting the latest updates after three years. It was Google’s first phone (even though it really wasn’t), so I could forgive that. Let’s check in a few years later with the Pixel 3. That phone came out in 2018, and it won’t get any updates after this year (2022); its guaranteed Android version updates already ended in 2021. Again, just three years of new features. Hmm. Fast-forward to today. The Pixel 6 and Pixel 6 Pro, the latest and greatest from Google, which run on their own custom silicon, are still only guaranteed the newest version of Android for three years (until 2024).
Shall we compare to the iPhone? iOS 15 is available on phones all the way back to the iPhone 6s, which was released in 2015. A whole year before the first Pixel. Apple doesn’t provide a timeline for how long their phones will receive the latest version, but the support length has only been getting longer over time, and it appears to be at least double that of Google with Pixel.
It’s hard not to see the parallels between Google’s commitment, or lack thereof, to its messaging services and its Pixel phones. You can’t build loyalty with your customers if you’re not loyal to your own product. That’s why I scoff when folks suggest that Apple intentionally inhibits their phones so that people have to buy a new one. I’m not a happy customer if I have to do that. Not good “customer sat”, Tim Cook’s favorite metric. Instead, Apple offers the most extended support in the industry, while their most comparable competitor gives up halfway around the track.
As the A-series silicon has gotten more advanced, efficient, and faster, I wouldn’t be surprised to see the iPhone 13 dancing on the grave of the Pixel 6 as it gets iOS 22, 7+ years after its debut.
It’s [sic] would be a trivially small amount of money for Apple to create an internal group dedicated to proactively finding and eliminating scam, copycat, infringing, exploitive apps. But every one it finds costs Apple money. And doing nothing isn’t hurting sales, not when it’s so much cheaper to just market the App Store as so secure and trustworthy. Apple seems to view App Store trust and quality as a marketing activity more than a real technical or service problem.
The scams and ripoffs keep on coming. While I don’t think that Apple actively views scams as a profit area, there’s no denying that they do benefit — 15-30% of every immoral transaction — from them. And that’s pretty gross.