August 13, 2023

Yep, Everyone Is Right. Callsheet Is Great.

Casey Liss, describing his new app Callsheet:

Callsheet, in short, allows you to look up movies, TV shows, cast, and crew. You can think of it as similar to the IMDb app but… with respect for its users. Which, actually, makes it not like IMDb at all. 🙃

When I watch a movie or TV show, I’m constantly trying to figure out who that actor is, who the director is, and so on. Early this year, I wanted a way to look this up that was native to iOS/iPadOS, but also fast, with no fluff that I wasn’t interested in. I wanted a bespoke version of the IMDb app.

You’ve probably already heard plenty about Callsheet this week, but I’m here to pile on the praise. I’ve been using Casey’s app throughout the beta period, and it’s been solid from day one.

I’m totally that person who can’t help but point out what show or movie we know an actor from, and it completely derails my attention until I can figure it out. Callsheet makes figuring that out easy, ad-free, and — most importantly — fast. I love all the attention to detail that Casey has poured into the app. For example, try tapping on a show or movie’s runtime to see at what time it would end!

The pricing is more than fair. $1 per month or $9 per year (as of launch). Plus you get 20 searches totally free to try it out AND a weeklong free trial when you start your subscription. I even appreciate the honest, whole-number pricing — no $0.99s or $8.99s in sight.

Casey is right to disparage the IMDb app. It’s become an ad-filled, in-your-face, and in-your-way atrocity. One of the only good things that I can say about it is that it has inspired the creation of Callsheet. In a myriad of ways, right down to app size, Callsheet respects its users where IMDb doesn’t. IMDb weighs in at 103 MB of storage on your device. Callsheet punches way above its weight class, beating IMDb at its own game with a minuscule 5 MB.

Stephen Hackett summarized it best, saying, “Callsheet is the indie app scene at its best — taking on a huge app written by people who don’t seem to care about their users — and doing a better job at it in every single way.”

The TV Tracking Dynamic Duo

Callsheet pairs well with my other favorite TV-focused, indie-developed app: TV Forecast. Callsheet excels at looking up people, and TV Forecast is best at tracking shows (and now movies!) that you’ve watched and want to watch. And while you can get actor details in TV Forecast, Callsheet has quickly become the tool I reach for to solve the “where do we know them from?” problem.

You should give both apps a look, but especially Callsheet this week to boost Casey’s launch. And, for the love of god, delete IMDb!

Apps


August 13, 2023

7 Things This Week [#107]

A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.


1️⃣ If you’re wondering how long we’ve been waiting for delivery on an Elon Musk promise, this site’s got the receipts. [🔗 elonmusk.today]

2️⃣ AI-generated summaries of product reviews sound pretty useful! [🔗 Emma Roth // theverge.com]

3️⃣ An absolutely mesmerizing percussive performance in the water. Yeah, you read that right. [🔗 instagram.com]

4️⃣ Super cool portfolio of Mike Matas’ work across Apple, Facebook, Nest, and other startups. Sounds like he’s headed to Jony Ive’s LoveFrom next. [🔗 Mike Matas // mikematas.com]

5️⃣ I loved this exploration of web design that replicates many of the constraints that writing with a typewriter would have. The result is a beautiful and simple site with every detail considered. [🔗 Leon Paternoster // thisdaysportion.com]

6️⃣ As recommended by Joanna Stern, this Grover tech rental site looks super interesting and sustainability-conscious. [🔗 grover.com]

7️⃣ RSS is still just the best and hasn’t died on the vine as some major news publications seem to think. Matt Birchler sets the story straight. [🔗 Matt Birchler // birchtree.me]


Take a Chance


Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.

7 Things


August 10, 2023

Alt Text is (Actually) Enabled on Threads

Over the years, I’ve become more passionate about making the web an ever more accessible place. At a basic level, that means adding image descriptions (alt text) for images that I post. But I’ve also tinkered around with shortcuts that help bring more clarity to the things shared online. So I was intrigued when rumors spread about Threads not only building support for adding custom alt text to images, but also automatically generating a description if a custom one wasn’t added. It took some time, but Threads delivered.

For the past few hours, I’ve been toying around with VoiceOver, the technology in Apple’s operating systems that reads out things on the screen. It’s also how you can hear those image descriptions. (To enable it, go to Settings → Accessibility → VoiceOver. Or just ask Siri to turn it on.) With that new tool on my toolbelt, I could start testing.

I started a thread with two posts. One had an image for which I added a custom alt text: “Eating my first meal on the floor of our new house with our dog.” The other was the same image, except without any alt text. What would Threads describe it as? I turned on VoiceOver to learn and…

…my first lesson was in patience. While VoiceOver dutifully read out my custom description for the first image, the second one was just “a photo by hey.jarrod”. Threads apparently needed some time to process.

So I went scouring through my older posts with images and was delightfully surprised at what I found. Threads did a pretty good job at creating brief descriptions, and they even helpfully started with “Maybe an image of” to convey the uncertainty. It also reads out any text it finds within an image. Upon further testing, I think reading that found text is a feature of VoiceOver and isn’t exclusive to Threads, but it’s cool nonetheless!

I headed to my timeline and tapped through more images there. Meta’s Llama generative AI model is doing some great work. I found a post that very accurately described a landscape photo as “Maybe an image of beach, long grass, horizon”, and another as “Maybe an image of pottery, coffee cup, long grass, landscape.” (I wish I could link to that post, but Threads automatically refreshed before I could save it and the post was forever lost to the algorithmic timeline. 🙃)

One more example. In another post, Threads described the final photo as “Maybe an image of two people, landscape, [the three different dog breeds].” Pretty good, right? Another hiccup, though: I wanted to go back and relisten to the post’s VoiceOver so I could document the specific dog breeds it found, but it had lost the generated description. It is once again blandly described as “an image.” The generative feature must not yet be stable, or it just hasn’t permanently attached the generated description to the image.

Okay, time to check on my test thread.

Voilà! After about an hour, Threads came through. Even though I hadn’t added a description of any sort, here’s what it thought my photo showed: “Maybe an image of one person, golden retriever, and pet food.”

Eating my first meal on the floor of our new house with our dog.
The photo in question.

As I mentioned in a follow-up reply, I can assure you that I wasn’t eating dog food. But as a basic description, I give it a pass! Furthermore, this is just the beginning for these auto-generated descriptions. The technology will progress and get more accurate, faster, and more detailed. Major swaths of the internet were once effectively hidden from folks using screen reading technology; on Threads, at least, everyone will now have a better idea of what’s shown in an image.

To be clear, I still think custom alt text is better to accurately convey your intent when posting an image. I’m still planning on manually adding a description for most images. But not everyone will do that, and this seems like a big win for everyone.


August 6, 2023

7 Things This Week [#106]

A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.


1️⃣ Has your computer been lacking in whimsy? Ever lose track of your mouse on that giant screen? MouseosaurusRex will fulfill both needs. [🔗 Tyler // tyler.io]

2️⃣ A video by 9to5Mac showcasing how a man with a degenerative muscle condition uses his Apple technology to retain his independence was instrumental in getting the cost of that tech covered as health care. [🔗 Ben Lovejoy // 9to5mac.com]

3️⃣ Is delayed gratification always the best? A few years ago, I would have responded with an enthusiastic “Yes!” But now, as time moves ever faster and years feel ever shorter, I tend to agree more with Sreekar here. [🔗 Sreekar // sreekarscribbles.com]

4️⃣ This article from over 20 years ago was eye-opening about how vastly different dental diagnoses can be. [🔗 William Ecenbarger // rd.com] (Via Jason Becker)

5️⃣ If you’re not a regular Macalope reader, what even are you doing? That whole “non-folding iPhones will be worthless” article really was excessive, wasn’t it? [🔗 The Macalope // macworld.com]

6️⃣ In what I hope will be a poetic finale of my posts about Twitter/X, I give you Basic Apple Guy’s resurrection of the Fail Whale in the form of an icon that you could use to replace that uninspired X icon on your home screen, if you want. [🔗 BasicAppleGuy // basicappleguy.com]

7️⃣ A few thoughts that popped into my mind while watching this video of pencils being manufactured: (1) I wonder how many pounds of sawdust are produced by this factory per day/month/year. (2) With so many moving parts, it’s miraculous that the factory isn’t always on pause for one repair or another. (3) The little flipper flappers are my favorite. (4) Process X is definitely making it into my YouTube subscriptions. [▶️ Process X // youtube.com] (Via Jason Kottke)


Take a Chance


Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.

7 Things


July 31, 2023

‘Google’s Reading Mode app for Android can save you from the worst of web design’

Dan Seifert, writing for The Verge: (emphasis added is mine)

Reading Mode was released late last year and is intended to be an accessibility feature — it makes reading content on your phone easier if you are vision impaired. You can customize the typeface, font size, colors, and spacing to fine-tune how it presents the text. It can also use Google’s onboard text-to-voice transcription to read the content aloud, which you can customize between various voices and adjust the speed of. The best part is, unlike reading modes that are built into some browsers, the Reading Mode app works on almost anything your phone is displaying, whether that’s in Chrome, an in-app browser, or an app itself.

It makes so much sense for a reading mode to work across apps, not just the browser. Yesterday, I shared a tip about reading aloud any selected text, but now I want this reading and listening view to just be a layer living atop anything on the screen.

Three other things I like about Google’s implementation: (1) The swipe gesture to activate it anywhere, anytime. (2) The estimated reading time is prominently displayed, making it easy to judge if I have time to read now or if I should save it for later. (3) Text-to-speech all the things. Weirdly, iOS 17’s “read this” feature of Safari Reader is only available in Safari, not Safari View Controller where you can also activate Reader. Like I said, weird.

One thing I don’t like: Reading Mode is a separate app that users have to download to use.

Linked


July 31, 2023

Alt Text is (Maybe) Enabled on Threads

Devon Dundee, on Threads last week:

Alt text for images is now available in Threads! A great step in making the app more accessible.

Just press the Alt button after selecting an image to upload. The prompt says, “Add a short description for screen readers. If you leave this blank, we’ll automatically add a description for you.”

Interested to see what those automatic descriptions look like.

I’m likewise intensely curious about how this will work. It’s precisely the use case I’ve thought would be perfectly suited for LLMs to tackle. Better yet would be if the automatic descriptions were easy to manually edit. Sometimes it’s just hard to get started!

It’s the sort of feature that I’d love to test and use, but I haven’t yet found that ‘Alt’ button, nor any other way to add descriptions natively. Once I do, though, I’m going to have to figure out a way to see or hear the descriptions that the app adds on its own. Some are going to be wild, I’m sure.

Linked


July 30, 2023

Quick Tip: Get Visual Narration Anywhere There’s Text

It’s no secret that I’ve become enamored with listening to web articles being read aloud to me. I’ve been testing the narration features of Reader, Instapaper, and Omnivore. (Spoiler: Reader’s is good, Instapaper’s is okay, and Omnivore’s is next-level.) But what about text outside of those apps?

If you’ve been following along with my Beta Impressions thread, you’ll know that we’ll soon be able to enjoy good-quality narration by Siri in Safari Reader. That will make it way easier to get text read aloud from any webpage without having to send it to one of those apps first. But there are two major downsides I’ve noted. First, the text is not highlighted as it’s read aloud. It’s surprising how much better my focus and retention are when I’m reading and listening to something at the same time. Second, it’s still just for web articles, not for arbitrary text. For example, you can’t listen to text you’ve written as a proofreading method.

But I have good news! Already in iOS 16 (et al.), you can turn on ‘Spoken Content’ in accessibility settings. It adds a ‘Speak’ button to the popup context menu for any selected text. Plus, you can turn on ‘Highlight Content’ to focus your attention on the current sentence and (optionally) word being spoken. I have no idea how long these features have been hiding in the accessibility settings, but I’m so glad to have found them!

The speak button, and the narrated text being highlighted.
Having the text highlighted along with the audio makes all the difference.

You can tweak other settings as well. For instance, I prefer Siri Voice 4, so I changed the voice to that one. It’s got great intonation and keeps things consistent with Siri’s spoken feedback on my devices. I sped up the speaking rate a little, too.

The Accessibility settings for Spoken Content, including Speak Selection, Highlight Content, Voices, and Speaking Rate.
Here’s how I’ve tuned my Spoken Content settings.

For a capability that hasn’t had much marketing behind it — at least from what I’ve seen — Spoken Content is quite full-featured! Don’t miss that you can add specific pronunciations for tricky words and names. It’s worth taking some time to explore all the submenus.

Text being read and highlighted in the Drafts app.
This very article was proofread aloud by Siri using the Spoken Content feature.

Future Feature Requests

I can’t try something without having some thoughts on how it could be improved.

  • I’d love to see this added as an API for developers to implement in their apps. Then they wouldn’t have to spend their time building their own voices or text-to-speech engines. Likewise, whenever Apple’s voices improved, so would any app that used them for spoken content. (A rough sketch of the kind of in-app narration I’m picturing follows this list.)
  • When trying Omnivore’s reading voices, I was surprised by how much better the experience is when a secondary voice pops in to read block quotes. It really helps clarify the intended meaning of an article to know which text is being quoted. It’s probably tricky to implement for arbitrary selected text — as opposed to audiobooks or articles where the format is more constrained — but it would be great if Apple figured it out.
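
To make that first request a little more concrete, here’s a minimal sketch of what in-app narration with highlighting could look like today using Apple’s existing AVSpeechSynthesizer API. The ArticleNarrator name is my own, the print call stands in for real on-screen highlighting, and this is nowhere near as polished as the system’s Spoken Content feature — which is exactly why a first-party API would be so welcome.

```swift
import AVFoundation

// A rough sketch of in-app narration using AVFoundation's speech synthesizer.
// Class and method names are illustrative, not an actual Apple API.
final class ArticleNarrator: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // any installed voice
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate         // tweak to taste
        synthesizer.speak(utterance)
    }

    // Called just before each range of text is spoken — this is where an app
    // could highlight the current sentence or word on screen.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           willSpeakRangeOfSpeechString characterRange: NSRange,
                           utterance: AVSpeechUtterance) {
        let spoken = (utterance.speechString as NSString).substring(with: characterRange)
        print("Now speaking: \(spoken)")
    }
}
```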

Tips


July 30, 2023

‘The past is not true’

Derek Sivers:

Seems we had both been told the accident was our fault, and had spent eighteen years feeling bad about it. This time she started crying, sniffled, grabbed a tissue to wipe her eyes and said, “It’s so stupid - these stories.”

And

Aim a laser pointer at the moon, then move your hand the tiniest bit, and it’ll move a thousand miles at the other end. The tiniest misunderstanding long ago, amplified through time, leads to piles of misunderstandings in the present.

Every time I shine a flashlight into the air, I think about how those photons are being cast out for untold millions of miles and, yes, how the most minuscule movement on my end will drastically change their trajectory down the line.

Derek’s story and deft comparison to our memories of past actions having longstanding impact — even if they’re “not true” — left me almost shell-shocked.

Linked


July 30, 2023

7 Things This Week [#105]

A weekly list of interesting things I found on the internet, posted on Sundays. Sometimes themed, often not.


1️⃣ People are amazing. This guy made an AirPods Pro charging case from off-the-shelf parts and 3D prints. [🔗 Exploring the Simulation // youtube.com] (Via Nick Heer)

2️⃣ I’m no Ezra Klein stan, but I thought this interview of him was quite good. I like the four questions he uses to evaluate his day. They seem like a solid foundation. [🔗 Clay Skipper // gq.com]

3️⃣ The Strange Planet series looks as good as I hoped! [🔗 Apple TV // youtube.com]

4️⃣ There are some bad internet bills introduced in Congress that would weaken encryption and privacy, give unprecedented access to law enforcement, and restrict kids’ access to necessary resources. This site makes it easy to tell your representatives that you oppose them all. [🔗 badinternetbills.com] (Via Nick Heer)

5️⃣ Whether you like his music or not (I do), it’s hard to deny that Ed Sheeran is one of a kind. This is one of the most impressive musical performances I think I’ve ever seen. (See also, “I Don’t Care (Live at Abbey Road)”) [🔗 Songkick // youtube.com] (Via Chance Miller)

6️⃣ This Neeva search engine sounded pretty great. Too bad, as explained in the article, it couldn’t make it in a room filled by one big elephant. [🔗 David Pierce // theverge.com]

7️⃣ I’m not a “baby person,” but I did like this retrospective on one year of parenting authored by an adventure writer. [🔗 brendan // semi-rad.com]


Take a Chance


Thanks for reading 7 Things. If you enjoyed these links or have something neat to share, please let me know.

7 Things


July 27, 2023

“Reading” the Web Could Get Way More Intimate With Personal Voice

I’m terribly excited about the possibilities that Personal Voice will unlock. I imagine a future internet where recordings of articles being read by the author are posted alongside the story. There are plenty of outlets that already incorporate audio versions of their stories — The Wall Street Journal and Stratechery come to mind — which open up alternative ways to access the great stuff people are putting out into the world. However, those types of hosted content are complicated and expensive to produce. At Stratechery, Ben records a podcast (with a secondary speaker for quoted material!) for each and every article. WSJ, as best as I can tell, uses a proprietary service to provide those narrations, and I think The Verge used to have something similar. Very few people have the time or resources to provide similar offerings for the text they post.

The Overcast podcast app listing Stratechery articles read by Ben as podcasts.
You can read Stratechery articles on the web, in RSS, email, or as podcasts.

But now anyone with Apple’s latest operating systems can create a passable, if not perfect, recreation of their voice. It can be used to say any sort of text, and the possibility of automating the creation of an audio file for any blog post (or social media post?) seems tantalizingly close.
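
Here’s a hedged sketch of the kind of automation I’m picturing, leaning on AVSpeechSynthesizer’s write(_:toBufferCallback:) API to render text straight into an audio file. The function name, the stock voice, and the overall shape are my assumptions — a starting point rather than a finished workflow.

```swift
import AVFoundation

// Sketch: render a post's text to an audio file with AVSpeechSynthesizer.
// Names and structure here are assumptions for illustration only.
let fileSynthesizer = AVSpeechSynthesizer() // keep a reference while rendering

func renderPostToAudioFile(_ text: String, to url: URL) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // stock voice for now

    var outputFile: AVAudioFile?
    fileSynthesizer.write(utterance) { buffer in
        guard let pcmBuffer = buffer as? AVAudioPCMBuffer, pcmBuffer.frameLength > 0 else {
            return // an empty buffer arrives when synthesis finishes
        }
        do {
            // Create the file lazily so it matches the synthesizer's audio format.
            if outputFile == nil {
                outputFile = try AVAudioFile(forWriting: url, settings: pcmBuffer.format.settings)
            }
            try outputFile?.write(from: pcmBuffer)
        } catch {
            print("Failed to write audio: \(error)")
        }
    }
}
```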

An article on The Wall Street Journal’s website with an annotation pointing out the option to listen to it.
Imagine this, but read in Joanna Stern’s voice.

Why does Personal Voice set my spidey-sense for important technical advances a’tingling? In short, personal connection. I’m a loudly proclaimed fan of text-to-speech features in reading apps. Sometimes it’s just better, easier, and more convenient to listen to a story rather than to read its text. Not to mention the obvious accessibility benefits for folks with vision impairments. But articles in those apps are all read in the same, somewhat robotic, voice. I often need to double-check the author and publication it came from because my brain assumes that every article read in that same voice was written by that one non-person. So at the heart of my excitement is the opportunity to feel more connected with the true originator. To recognize their voice as an author, not just through their writing style, but also through the sound of their actual voice.

While Personal Voice is billed as a feature intended to give people an audible approximation of their voice for a given moment rather than for an everlasting record, it’s not hard to imagine that next leap. I’m already engineering the shortcuts in my mind to produce those recordings, provided Apple gives us the necessary building blocks to make it happen.
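
As best I can tell, those building blocks largely arrived with iOS 17’s speech APIs. Below is a hypothetical sketch — the function name and the workflow around it are mine — that asks for Personal Voice access and speaks with it once granted.

```swift
import AVFoundation

// Sketch of speaking with a Personal Voice, using iOS 17's
// requestPersonalVoiceAuthorization and the .isPersonalVoice trait.
// The surrounding workflow is hypothetical.
let narrationSynthesizer = AVSpeechSynthesizer()

func speakInMyVoice(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return } // the user has to grant access first

        // Once authorized, Personal Voices show up alongside the system voices.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice // falls back to the default voice if nil
        narrationSynthesizer.speak(utterance)
    }
}
```

Swap that utterance into the file-writing sketch above and, at least in principle, you’d have a blog post rendered as audio in your own voice.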

Once the tools are in place, the next thing I’ll be advocating for is a standard for attaching such audio files alongside their text counterparts so that they can live and travel together from their home website, across shares to social media, RSS, and more. The future with such tools is bright, and you can bet that you’ll be able to hear HeyDingus articles read to you in my voice just as soon as I can string it all together.