Arc Search, an iPhone-only app released earlier this year, aims to compete with established voice assistants like Google Assistant. While the app already offered voice search, the company has now launched a fun new way to get answers to your questions.
Call Arc Feature Explained
Arc Search’s new Call Arc feature lets users ask questions to an AI as if making a phone call. Users can lift their phones to their ear, ask their questions, and hear answers in real time.
Arc already had a voice search feature, but this one is “voice-activated”, which makes it easier to use: it feels just like making a quick call to a friend or an expert. The company is playing on a familiar human behaviour that some people might enjoy in certain situations, and it is easy to explain to someone who isn’t tech-savvy, maybe your grandparents.
Here is how the company describes it in the official release notes:
“Arc Search now has voice-activated search, triggered by holding your phone to your ear and saying your query — just like you’re making a phone call. Once you do, you’ll instantly hear your search results, accompanied by an animated smiley face! You can also click “Read More” on the screen to access full Browse for Me results for each query.”
However, users don’t have to keep holding the phone to their ear after asking a query; the conversation can continue as a chat with the AI. The app also displays a smiley face and plays some mellow music while it fetches results.
But is this feature really good enough to be revolutionary? Will people make a habit of using it?
Is This Feature Really Necessary?
While it is a novel feature and something new for people to enjoy, is it really necessary? We are slowly moving towards more and more hands-free tech. Most people use AirPods or earbuds to take calls and rarely hold a phone to their ear anyway.
After launching the app, wouldn’t it be easier to simply tap the voice search button and ask your question rather than pretending to be on a call with an AI?
The company reportedly uses an LLM from OpenAI, although it’s not clear which model. With the ChatGPT app shipping the latest available models, and Google’s own Gemini also offering voice queries, developing this feature seems quite redundant.
Once Apple integrates OpenAI tech into Siri, which is built into iPhones and already has Safari support, this feature might become obsolete.
There may still be room to improve on this feature, but the general consensus among netizens is that while it is a fun little novelty, there isn’t much scope for regular use.
Many users on Reddit pointed out use cases like querying while crossing the road or otherwise on the go, where a feature like this could be genuinely useful. But the feature currently suffers from significant lag when answering questions. If its response time improves, it could become a viable product.
Conclusion
Overall, the new Call Arc feature is a fun experiment! While it could be useful in places with restricted or expensive data, a large language model is prone to hallucinations. Beyond that, the recent direction of development by big tech companies like Google and Apple points to far more capable voice engines, and the novelty of querying by “calling” isn’t enough to compensate for that gap in capability.