1 min read

Link: Live AI on Meta’s smart glasses is a solution looking for a problem

I tested Meta's new Ray-Ban glasses featuring Live AI, hoping it would streamline everyday queries. Instead, the AI's responses were often too obvious to be useful, or it simply redirected me to do my own internet search.

For instance, when I asked for breakfast options based on my sparse fridge contents, the AI suggested meals that were impossible to make with the ingredients on hand. Its practical application in real-life scenarios clearly still has significant room for improvement.

Testing its utility further, I asked about potential dinner dishes, only to receive suggestions so generic that I opted to order takeaway instead. The AI couldn't meaningfully assist with meal planning or preparation.

Experimenting beyond food, I queried the AI about fashion and literature, which similarly ended in basic answers of little value. My attempts to use Live AI for meaningful insight consistently fell short, exposing the system's limits in understanding context or offering tailored advice.

In one instance, the AI was moderately helpful in suggesting specific artists for office decor, but this success was rare during my testing, and even it required precise, detailed questions to produce a somewhat useful response.

Despite Live AI's potential as a real-time, visual assistant, its frequent misinterpretations and generic advice highlight a significant gap between its current functionality and everyday usefulness. The technology, enthralling in theory, often compels a return to a traditional internet search for reliable information.

--

Yoooo, this is a quick note on a link that made me go, WTF? Find all past links here.