Meta and Ray-Ban have launched new smart glasses with advanced AI capabilities. The glasses feature an ultra-wide 12-megapixel camera, a five-speaker setup for immersive audio, and a redesigned charging case that extends use to up to 36 hours. They also include a built-in AI assistant with multimodal capabilities: by saying "Hey Meta," users can make calls, send texts, control the glasses' features, and ask questions about whatever they are looking at. The assistant's grasp of real-time information is still shaky, however, and it often answers simple questions inaccurately. Despite this, the glasses show significant promise, particularly for travelers, who can take advantage of real-time translation and text-summarization features. They are available in two classic silhouettes and more than 150 lens and frame combinations.
Read more at: www.theverge.com