September 12, 2023
Later today, the world will be introduced to Apple’s latest and greatest. If Apple follows its longstanding routines, new iPhone models, regular and Pro, will be unveiled alongside updated Apple Watches.
If I’m honest, I’m a bit underwhelmed by the rumors and leaks, but I’ll be happy if proven wrong and Tim Cook and Co show off something that really rocks my world. But seeing as they just took the wraps off Vision Pro, which did rock my socks, I can give them a pass if the rest of the year’s announcements are more benign.
Anyway, here’s what I’m feeling at the moment about the anticipated product releases.
iPhone
👍
- Titanium finish for Pro models. Titanium seems to be the new stainless steel, and I’m all for the durability paired with lighter weight. I’m not sure it’ll be the jaw-dropping weight savings that some think it will be, but any dropped grams are okay by me.
- Dynamic Island coming to all models. It was such a cool introduction last year that it's one of the few features I lust after while I cling to my 13 mini. I hope its arrival on more models will help it become more useful across more apps.
- Better battery life. I’m really waiting for that next leap in electron storage technology to make a meaningful difference in day-to-day battery life. But in the tick-tock cycle we seem to have settled into (great battery life one year, okay battery life the next), this should be a “tock” year with better battery life. That’s never a bad thing.
- Action button. Although I’ll be a little sad to see the mute switch go, it makes so much sense to turn it into a user-programmable button. Nearly everyone just sets and forgets their ring/silence preference, so the “switching” nature of the mute switch is probably hardly ever used. And for those who do regularly turn their phone’s ringer on and off, I imagine that software and haptics will make doing that with the Action button just as easy. But for the rest of us, we just might finally get that dedicated hardware camera button, flashlight, app launcher, or shortcut starter we’ve always wanted. I just hope it’s faster than the Action button on the Apple Watch Ultra, which is so criminally slow to launch a shortcut that I only begrudgingly use it.
- USB-C all the things. Lightning has had a great run and lasted the decade that Phil Schiller told us it would. I didn’t begrudge the port change then, and I don’t begrudge it now. I do think that every iPhone should get bumped to at least USB 3.0 speeds though. USB 2.0 just feels cruelly slow for a premium product like an iPhone in 2023. Most of all, though, I hope that they’ll use the opportunity to update the Magic Keyboard with Touch ID with USB-C as well, and release it in a black and silver color option.
👎
- Periscope lens for the Pro Max. After several failed attempts, I’ve come to accept that I just don’t have enough interest to pursue becoming a discerning mobile photographer. I’m, at best, a point-and-shoot kind of person. And while a longer lens would certainly open up more options in those quick and dirty shots I take, I just can’t get myself excited for the ever-diminishing returns in smartphone camera improvements. And certainly not if it only comes to the largest of iPhones.
- Pro Max vs. Ultra. Speaking of large phones, until recently, it seemed that the “Ultra” moniker was going to replace “Pro Max” for the largest iPhone Pro. I was on board with the change, especially since it’s been rumored there would be a real feature difference between the regular-sized Pro phone and the big boy. Now it seems “Pro Max” will be sticking around. I’ve never liked that name and think “Ultra” would be clearer.
- No mini-sized phone. It’s the best one. A couple of years ago, I had hoped that Apple would alternate between a smaller and a bigger version of their regular phone: a mini year, then a Plus year. No signs pointed toward that trend, but still, I held out hope. Despite supposedly weak sales of the iPhone 14 Plus, I guess I’ll have to keep my fingers crossed for a two-year back and forth. They did offer the mini for two years before switching to the Plus, after all…
Apple Watch
👍
- Chip updates. It’s been far too long since we’ve seen a performance leap in the system-on-a-chip in the Apple Watch. And while most people don’t seem to crave a faster watch, we shouldn’t forget that the S-series chip has also gone into other products like the HomePod. I want the HomePod’s software to stay relevant for as long as possible since its audio quality should continue to be top-notch. Plus, faster chips usually bring better battery life — always a welcome thing for a watch — and I’ll take any performance boost that brute-forces shortcuts to run faster.
🤷‍♂️
- Anything else? I’ve been wracking my brain to come up with anything else to say about the Apple Watch. New colors? Sure. Different band options? Great. Action button on the regular Series models? Okay, but that would dilute the Ultra’s differentiation. But we haven’t heard any whispers about new sensors, screen sizes, or anything else of note this year. It might be a sleeper year for the Watch lineup.
Grab Bag
👍
- Event day in general. This is only the second Apple event of 2023, which is a bit of an outlier. Can you believe that the only one since 2022’s iPhone event was WWDC23 in June? It’s a great year when we have an Apple Event to look forward to in each season. So to be “back” at Apple Park for anything is exciting.
- Bye-bye leather. I’m good with Apple replacing their leather accessories with something more progressive. I know some people swear by their leather cases, but I’ve never used one and I’m confident Apple will have developed a worthy replacement. Their textile game is so good. And “FineWoven” is a cool name.
🤷‍♂️
- Pre-recorded events. Okay, I’ll concede that the polished videos are slick, fun to watch, and pack more in than a live presentation ever could. And pre-2020 keynotes now look a little dated and dark compared to the brightly lit scenes from around Apple Park of their modern event videos. But it’s a bit awkward to invite all the press to come watch a movie together in the Steve Jobs Theater, and I do miss the electric atmosphere of in-person presentations on event day, knowing that it was all happening live.
At this point, I’m not planning on purchasing anything rumored to be announced today — gotta save those pennies for the Vision Pro.
September 9, 2023
A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.
1️⃣ I love this drought-resistant sidewalk garden that a family has been maintaining for the last six years. [🔗 zachklein.com]
2️⃣ I’ll never not be impressed by the care and usefulness poured into CleanShot X. It’s got a great new update this week, and Matt Birchler made this excellent feature promo video. [▶️ CleanShot X // youtube.com]
3️⃣ I can’t say that I’m a Rolling Stones fan, but I did really like the creativity of their latest music video. The song’s pretty good, too. [▶️ Rolling Stones // youtube.com] (Via Daring Fireball)
4️⃣ It’s heartwarming to see that the late, great Alex Hay has been memorialized by Starfield’s developers in the game he was so excited to play. [🔗 Matthew Cassinelli // matthewcassinelli.com]
5️⃣ Could next year finally be the year that Siri gets good? I’ve been duped into that idea before, but tighter integration with Shortcuts is always a good thing in my book. [🔗 Zac Hall // 9to5mac.com] (Via Matthew Cassinelli)
6️⃣ Words to live by. [🔗 HeyScottyJ // heyscottyj.com]
7️⃣ This is an awesome tool to rank any list of things. [🔗 Chorus.fm // chorus.fm]
Take a Chance
Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.
7 Things
August 27, 2023
A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.
1️⃣ 🔗 Link
2️⃣ 🔗 Link
3️⃣ 🔗 Link
4️⃣ 🔗 Link
5️⃣ 🔗 Link
6️⃣ 🔗 Link
7️⃣ 🔗 Link
Take Another Chance
Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.
7 Things
August 13, 2023
Casey Liss, describing his new app Callsheet:
Callsheet, in short, allows you to look up movies, TV shows, cast, and crew. You can think of it as similar to the IMDb app but… with respect for its users. Which, actually, makes it not like IMDb at all. 🙃
When I watch a movie or TV show, I’m constantly trying to figure out who that actor is, who the director is, and so on. Early this year, I wanted a way to look this up that was native to iOS/iPadOS, but also fast, with no fluff that I wasn’t interested in. I wanted a bespoke version of the IMDb app.
You’ve probably already heard about Callsheet this week, but I’m here to pile on the praise. I’ve been using Casey’s app throughout the beta period, and it’s been solid from day one.
I’m totally that person who can’t help but point out what show or movie we know an actor from, and it completely derails my attention until I can figure it out. Callsheet makes figuring that out easy, ad-free, and — most importantly — fast. I love all the attention to detail that Casey has poured into the app. For example, try tapping on a show or movie’s runtime to see at what time it would end!
The pricing is more than fair. $1 per month or $9 per year (as of launch). Plus you get 20 searches totally free to try it out AND a weeklong free trial when you start your subscription. I even appreciate the honest, whole-number pricing — no $0.99s or $8.99s in sight.
Casey is right to disparage the IMDb app. It’s become an ad-filled, in-your-face, and in-your-way atrocity. One of the only good things that I can say about it is that it has inspired the creation of Callsheet. In a myriad of ways, right down to app size, Callsheet respects its users where IMDb doesn’t. IMDb weighs in at 103 MB of storage on your device. Callsheet punches way above its weight class, beating IMDb at its own game with a minuscule 5 MB.
Stephen Hackett summarized it best, saying, “Callsheet is the indie app scene at its best — taking on a huge app written by people who don’t seem to care about their users — and doing a better job at it in every single way.”
The TV Tracking Dynamic Duo
Callsheet pairs well with my other favorite TV-focused, indie-developed app: TV Forecast. Callsheet excels at looking up people, and TV Forecast is best at tracking shows (and now movies!) that you’ve watched and want to watch. And while you can get actor details in TV Forecast, Callsheet has quickly become the tool I reach for to solve the “where do we know them from?” problem.
You should give both apps a look, but especially Callsheet this week to boost Casey’s launch. And, for the love of god, delete IMDb!
Apps
August 13, 2023
A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.
1️⃣ If you’re wondering how long we’ve been waiting for delivery on an Elon Musk promise, this site’s got the receipts. [🔗 elonmusk.today]
2️⃣ AI-generated summaries of product reviews sound pretty useful! [🔗 Emma Roth // theverge.com]
3️⃣ An absolutely mesmerizing percussive performance in the water. Yeah, you read that right. [🔗 instagram.com]
4️⃣ Super cool portfolio of Mike Matas’ work across Apple, Facebook, Nest, and other startups. Sounds like he’s headed to Jony Ive’s LoveFrom next. [🔗 Mike Matas // mikematas.com]
5️⃣ I loved this exploration of web design that replicates many of the constraints that writing with a typewriter would have. The result is a beautiful and simple site with every detail considered. [🔗 Leon Paternoster // thisdaysportion.com]
6️⃣ As recommended by Joanna Stern, this Grover tech rental site looks super interesting and sustainability-conscious. [🔗 grover.com]
7️⃣ RSS is still just the best and hasn’t died on the vine as some major news publications seem to think. Matt Birchler sets the story straight. [🔗 Matt Birchler // birchtree.me]
Take a Chance
Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.
7 Things
August 10, 2023
Over the years, I’ve become more passionate about making the web an ever more accessible place. At a basic level, that means adding image descriptions (alt text) for images that I post. But I’ve also tinkered around with shortcuts that help bring more clarity to the things shared online. So I was intrigued when rumors spread about Threads not only building support for adding custom alt text to images, but also automatically generating a description if a custom one wasn’t added. It took some time, but Threads delivered.
For the past few hours, I’ve been toying around with VoiceOver, the technology in Apple’s operating systems that reads out things on the screen. It’s also how you can hear those image descriptions. (To enable it, go to Settings → Accessibility → VoiceOver. Or just ask Siri to turn it on.) With that new tool on my toolbelt, I could start testing.
I started a thread with two posts. One had an image for which I added a custom alt text: “Eating my first meal on the floor of our new house with our dog.” The other was the same image, except without any alt text. What would Threads describe it as? I turned on VoiceOver to learn and…
…my first lesson was in patience. While VoiceOver dutifully read out my custom description for the first image, the second was described as just “photo by hey.jarrod”. Threads apparently needed some time to process.
So I went scouring through my older posts with images and was delightfully surprised at what I found. Threads did a pretty good job of creating brief descriptions, and it even helpfully started them with “Maybe an image of” to convey uncertainty. It also read out any text it found within an image. Upon further testing, I think reading that found text is a feature of VoiceOver and isn’t exclusive to Threads, but it’s cool nonetheless!
I headed to my timeline and tapped through more images there. Meta’s Llama generative AI model is doing some great work. I found a post that very accurately described a landscape photo as “Maybe an image of beach, long grass, horizon”, and another as “Maybe an image of pottery, coffee cup, long grass, landscape.” (I wish I could link to that post, but Threads automatically refreshed before I could save it and the post was forever lost to the algorithmic timeline. 🙃)
One more example. In another post, Threads described the final photo as “Maybe an image of two people, landscape, [the three different dog breeds].” Pretty good, right? Another hiccup, though: I wanted to go back and relisten to the post’s VoiceOver so I could document the specific dog breeds it found, but it had lost the generated description. It was once again blandly described as “an image”. The generative feature must not yet be stable, or it just hasn’t permanently attached the generated description to the image.
Okay, time to check on my test thread.
Voilà! After about an hour, Threads came through. Even though I hadn’t added a description of any sort, here’s what it thought my photo showed: “Maybe an image of one person, golden retriever, and pet food.”
The photo in question. ⌘
As I mentioned in a follow-up reply, I can assure you that I wasn’t eating dog food. But as a basic description, I give it a pass! Furthermore, this is just the beginning for these auto-generated descriptions. The technology will progress, getting more accurate, faster, and more detailed. Major swaths of the internet were once literally hidden from folks using screen reading technology; on Threads, at least, everyone will now have a better idea of what’s shown in an image.
To be clear, I still think custom alt text is better to accurately convey your intent when posting an image. I’m still planning on manually adding a description for most images. But not everyone will do that, and this seems like a big win for everyone.
August 6, 2023
A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.
1️⃣ Has your computer been lacking in whimsy? Ever lose track of your mouse on that giant screen? MouseosaurusRex will fulfill both needs. [🔗 Tyler // tyler.io]
2️⃣ A video by 9to5Mac showcasing how a man with a degenerative muscle condition uses his Apple technology to retain his independence was instrumental in getting the cost of that tech covered as health care. [🔗 Ben Lovejoy // 9to5mac.com]
3️⃣ Is delayed gratification always the best? A few years ago, I would have responded with an enthusiastic, “Yes!” But now, as time moves ever faster and years feel ever shorter, I tend to agree more with Sreekar, here. [🔗 Sreekar // sreekarscribbles.com]
4️⃣ This article from over 20 years ago was eye-opening about how vastly different dental diagnoses can be. [🔗 William Ecenbarger // rd.com] (Via Jason Becker)
5️⃣ If you’re not a regular Macalope reader, what even are you doing? That whole “non-folding iPhones will be worthless” article really was excessive, wasn’t it? [🔗 The Macalope // macworld.com]
6️⃣ In what I hope will be a poetic finale of my posts about Twitter/X, I give you a Basic Apple Guy’s resurrection of the Fail Wail in the form of an icon that you could use to replace that uninspired X icon on your home screen, if you want. [🔗 BasicAppleGuy // basicappleguy.com]
7️⃣ A few thoughts that popped into my mind while watching this video of pencils being manufactured: (1) I wonder how many pounds of sawdust are produced by this factory per day/month/year. (2) With so many moving parts, it’s miraculous that the factory isn’t always on pause for one repair or another. (3) The little flipper flappers are my favorite. (4) Process X is definitely making it into my YouTube subscriptions. [▶️ Process X // youtube.com] (Via Jason Kottke)
Take a Chance
Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.
7 Things
July 31, 2023
Dan Seifert, writing for The Verge: (emphasis added is mine)
Reading Mode was released late last year and is intended to be an accessibility feature — it makes reading content on your phone easier if you are vision impaired. You can customize the typeface, font size, colors, and spacing to fine-tune how it presents the text. It can also use Google’s onboard text-to-voice transcription to read the content aloud, which you can customize between various voices and adjust the speed of. The best part is, unlike reading modes that are built into some browsers, the Reading Mode app works on almost anything your phone is displaying, whether that’s in Chrome, an in-app browser, or an app itself.
It makes so much sense for a reading mode to work across apps, not just the browser. Yesterday, I shared a tip about reading aloud any selected text, but now I want this reading and listening view to just be a layer living atop anything on the screen.
Three other things I like about Google’s implementation: (1) The swipe gesture to activate it anywhere, anytime. (2) The estimated reading time is prominently displayed, making it easy to judge if I have time to read now or if I should save it for later. (3) Text-to-speech all the things. Weirdly, iOS 17’s “read this” feature of Safari Reader is only available in Safari, not Safari View Controller where you can also activate Reader. Like I said, weird.
One thing I don’t like: Reading Mode is a separate app that users have to download to use.
Linked
July 31, 2023
Devon Dundee, on Threads last week:
Alt text for images is now available in Threads! A great step in making the app more accessible.
Just press the Alt button after selecting an image to upload. The prompt says, “Add a short description for screen readers. If you leave this blank, we’ll automatically add a description for you.”
Interested to see what those automatic descriptions look like.
I’m likewise intensely curious about how this will work. It’s precisely the use case I’ve thought would be perfectly suited for LLMs to tackle. Better yet would be if the automatic descriptions were easy to manually edit. Sometimes it’s just hard to get started!
It’s the sort of feature that I’d love to test and use, but I haven’t yet found that ‘Alt’ button, nor any other way to add descriptions natively. Once I do, though, I’m going to have to figure out a way to see or hear the descriptions that the app adds on its own. Some are going to be wild, I’m sure.
Linked
July 30, 2023
It’s no secret that I’ve become enamored with listening to web articles being read aloud to me. I’ve been testing the narration features of Reader, Instapaper, and Omnivore. (Spoiler: Reader’s is good, Instapaper’s is okay, and Omnivore’s is next-level.) But what about elsewhere, outside of those apps?
If you’ve been following along with my Beta Impressions thread, you’ll know that we’ll soon be able to enjoy good quality narrations by Siri in Safari Reader. That will make it way easier to get text read out from any webpage without having to send it to one of those apps first. But there are two major downsides I’ve noted. First, the text is not highlighted as it’s read aloud. It’s surprising how much better my focus and retention are when I’m reading and listening to something. Second, it’s still just for web articles, not for arbitrary text. For example, you can’t listen to text you’ve written as a proofreading method.
But I have good news! Already in iOS 16 (et al.), you can turn on ‘Spoken Content’ in accessibility settings. It adds a ‘Speak’ button to the popup context menu for any selected text. Plus, you can turn on ‘Highlight Content’ to focus your attention on the current sentence and (optionally) word being spoken. I have no idea how long these features have been hiding in the accessibility settings, but I’m so glad to have found them!
Having the text highlighted along with the audio makes all the difference. ⌘
You can tweak other settings as well. For instance, I prefer Siri Voice 4, so I switched to it. It’s got great intonation and keeps the voice consistent with Siri’s spoken feedback on my devices. I sped up the speaking rate a little, too.
Here’s how I’ve tuned my Spoken Content settings. ⌘
For a capability that hasn’t had much marketing behind it — at least from what I’ve seen — Spoken Content is quite full-featured! Don’t miss that you can add specific pronunciations for tricky words and names. It’s worth taking some time to explore all the submenus.
This very article was proofread aloud by Siri using the Spoken Content feature. ⌘
Future Feature Requests
I can’t try something without having some thoughts on how it could be improved.
- I’d love to see this added as an API for developers to implement in their apps. Then they wouldn’t have to spend their time building their own voices or text-to-speech engines. Likewise, whenever Apple’s voices improve, so would any app that uses them for spoken content.
- I was surprised when trying Omnivore’s reading voices at how much of an improved experience it is to have a secondary voice pop in to read block quotes. It really helps clarify the intended meaning of an article to know which text is being quoted. It’s probably tricky to implement for arbitrary selected text — as opposed to specifically audiobooks or articles where the format is more constrained — but would be great if Apple figured it out.
Tips