Artificial intelligence continues to be the latest buzzword in the technology industry. (Sorry, blockchain and NFTs, your 15 minutes are up.) And while there are plenty of ways Apple is already using machine learning to power its technology, it’s hard to deny that there are some areas where the company could still benefit from jumping on this latest trend.
So it’s interesting to hear, via a report in The New York Times, that Apple engineers are actively looking into AI that generates language, like the systems that underpin chatbots such as ChatGPT, for a number of applications. A follow-up report from 9to5Mac confirms that the technology, codenamed “Bobcat,” is already being tested in the tvOS 16.4 beta.
How might this technology be used in Apple products? Well, it just so happens that I can think of several ways to deploy it, not all of which involve simply building a chatbot.
Help I need someone
Computers in general have gotten simpler over the past few decades, but that doesn’t mean they can’t still mystify us from time to time. Anyone who has ever tried to troubleshoot a loved one’s device and ended up wanting to smash it into a million pieces can probably vouch for that fact.
Apple has long tried to build in ways for people to find solutions to their problems, whether through the company’s online knowledge base or tools like macOS’s universal Help menu. But navigating these systems can be tricky in its own way, and they don’t always contain the most up-to-date information.
This is one place where an AI chatbot could potentially help users. If you could just type a query about some functionality into a search box and be shown exactly the steps you need to take, that would go a long way toward fulfilling the promise of technology that just works. And it’s hardly out of reach: I already have one friend who regularly uses ChatGPT to help him figure out the correct configuration details for arcane command-line tools.
Seek and you will find
Apple has never tried to compete in the internet search market with Microsoft or Google, but that doesn’t mean that search as a concept isn’t important to the company. Whether it’s Spotlight or Siri, people use Apple’s search to find all sorts of information both on their computers and on the web.
But the more data there is, the harder it becomes to sift through all that noise to find what someone is actually looking for, as Google and Microsoft are learning. Apple has done a reasonable job with Spotlight, but there are certainly times when being able to interact with it via a chat interface could be more useful. Imagine simply asking it to “show me all the spreadsheets I’ve edited in the last month.”
Similarly, we’ve all experienced the frustration of asking Siri for information on our Apple Watch or HomePod, only to find that the relevant results were sent to our phone. Or of simply getting a list of websites that may or may not answer our query. What if, instead, it could speak an answer aloud and send the fuller details to your phone? To be fair, Siri has gotten better at this, but adding an AI-based chat interface could provide more power and flexibility than Apple has offered so far.
Time to get Siri-ous
And so we come to the elephant in the room: yes, Siri. It’s already the closest thing Apple has to a chatbot, but anyone who’s spent much time with it quickly realizes that the illusion of intelligence doesn’t extend very far: it mostly gives the impression that you’re dealing with prearranged, randomized responses. In some ways, it doesn’t offer much of an improvement over the voice-command interface Apple shipped back in the classic Mac OS days.
That also means Siri is ripe for a leap in power. Rumor has it that the aforementioned language-generation model is already being rolled out in the upcoming tvOS 16.4 update, albeit in a limited way, both in terms of which features it will affect and in that it is being used to enhance the existing virtual assistant rather than replace it outright.
While chatbots have their limitations, it’s clear that they are a step up from conventional voice interfaces, potentially finally living up to the promise of a truly virtual assistant that can handle complex concepts. That’s unlike today, when I have to tell my HomePod mini very precisely how to turn on the lights in my living room, or else I run the risk of turning on (or off) the lights somewhere else in my house.
Siri was an impressive piece of technology at its debut, but it’s become clear over the past decade that it has stalled, improving only modestly. I’ve been championing a Siri 2.0 for more years than I can remember, and this future, where our virtual assistant adapts to us and not the other way around, may finally be within reach.