Tuesday, July 23, 2024

Meta's smart glasses are getting artificial intelligence, and we took them for a spin


In a sign that the tech industry keeps getting weirder, Meta plans to release a major update soon that will turn the Ray-Ban Meta, its camera glasses, into a gadget seen only in sci-fi movies.

Next month, the glasses will be able to use new AI software to see the real world and describe what you're looking at, similar to the AI assistant in the movie "Her."

The glasses, which come with various frames starting at $300 and lenses starting at $17, have mainly been used for taking photos and videos and listening to music. But with the new AI software, they can be used to scan famous landmarks, translate languages and identify animal breeds and exotic fruits, among other tasks.

To use the AI software, wearers simply say "Hey, Meta," followed by a prompt, such as "Look and tell me what kind of dog this is." The AI responds in a computer-generated voice played through the glasses' tiny speakers.

The concept of AI-powered glasses is so new and strange that when we heard about it (we being Brian X. Chen, a tech columnist who reviewed the Ray-Bans last year, and Mike Isaac, who covers Meta and wears the smart glasses to produce a cooking show), we were dying to try it. Meta gave us early access to the update, and we have been testing the technology over the past few weeks.

We took the glasses to the zoo, the supermarket and the museum while we asked questions and gave instructions to the artificial intelligence.

The result: We were entertained by the virtual assistant's mistakes (it confused a monkey with a giraffe, for example) and impressed when it performed useful tasks like determining whether a package of cookies was gluten-free.

A Meta spokesperson said that because the technology is still new, the AI won't always get things right, and that feedback should improve the glasses over time.

Meta's software also generated transcripts of our questions and the AI's answers, which we captured in screenshots. Here are the highlights from our month of living with Meta's assistant.



Brian: Naturally, the first thing I had to test Meta's AI on was my corgi, Max. I looked at the plump dog and asked, "Hey, Meta, what am I looking at?"

"A cute corgi sitting on the floor with his tongue out," the assistant said. Accurate, especially the part about being cute.

Mike: Meta's AI correctly identified my dog, Bruna, as a "black and brown Bernese Mountain Dog." I half expected the AI to think she was a bear, the animal our neighbors most often mistake her for.

Zoo animals

Brian: After the AI correctly identified my dog, the logical next step was to test it on zoo animals. I recently visited the Oakland Zoo in Oakland, California, where I spent two hours looking at dozens of animals: parrots, tortoises, monkeys and zebras. I said, "Hey, Meta, look and tell me what kind of animal that is."

The AI was wrong the vast majority of the time, in part because many of the animals were caged and far away. It mistook a primate for a giraffe, a duck for a tortoise and a meerkat for a giant panda, among other errors. On the other hand, I was impressed when the AI correctly identified a species of parrot known as the blue-and-gold macaw, as well as the zebras.

The strangest part of the experience was talking to the AI assistant around children and their parents. They politely pretended not to hear the lone adult at the zoo muttering to himself.


Mike: I also had a strange time grocery shopping. Standing inside a Safeway and talking to myself was a bit embarrassing, so I tried to keep my voice low. I still got some sideways looks.

When Meta's AI worked, it was charming. I picked up an oddly shaped package of Oreos and asked it to look at the packaging and tell me whether the cookies contained gluten. (They did.) It answered questions like these correctly about half the time, though I can't say it saved time compared with reading the label.


Still, the reason I started wearing these glasses in the first place was to launch my own cooking show on Instagram, a flattering way of saying that I record myself preparing the week's meals while talking to myself. The glasses made doing this much easier than propping up a phone with one hand.

The AI assistant can also lend a hand in the kitchen. If I needed to know how many teaspoons are in a tablespoon and my hands were covered in olive oil, for example, I could ask it to tell me. (There are three teaspoons in a tablespoon, for the record.)

But when I asked the AI to look at a handful of ingredients I had on hand and come up with a recipe, it rattled off rapid-fire instructions for an egg custard, which wasn't much help since I couldn't follow the directions at my own pace.

Brian: I went to the supermarket and bought the most exotic fruit I could find: a cherimoya, a scaly green fruit that looks like a dinosaur egg. When I gave Meta's AI several chances to identify it, it guessed something different each time: a chocolate-covered nut, a stone fruit, an apple and, finally, a durian, which was close, but no banana.

Monuments and museums

Mike: The new software's ability to recognize landmarks and monuments seemed to be clicking. Looking at a dome in downtown San Francisco, Meta's AI correctly answered, "City Hall." It's a neat trick, and perhaps useful if you're a tourist.

Other times it missed. Driving from the city to my home in Oakland, I asked Meta what bridge I was on while looking out the window in front of me (with both hands on the wheel, of course). The first answer was the Golden Gate Bridge, which was wrong. On the second try, it figured out I was on the Bay Bridge, which made me wonder whether it simply needed a clearer shot of the newer section's tall white suspension towers to get it right.


Brian: I visited the San Francisco Museum of Modern Art to see if Meta's AI could act as a tour guide. After photographing about two dozen paintings and asking the assistant what work I was looking at, the AI could describe the imagery and the media used to compose each piece (which would be fine for an art history student) but it couldn't identify the artist or the title. (A Meta spokesperson said another software update released after my museum visit improved this ability.)

After the update, I tried looking at images of more famous works of art, including the Mona Lisa, on my computer screen, and the AI recognized them correctly.


Meta's AI glasses offer an intriguing glimpse into a future that remains far off. Their flaws underline the limitations and trade-offs involved in designing this type of product. The glasses could, for instance, do a better job of recognizing zoo animals and fruit if the camera had a higher resolution, but a nicer lens would add bulk. And wherever we went, it felt awkward to talk to a virtual assistant in public. It's unclear whether that will ever come to feel normal.

But when it worked, it worked well, and we had fun. The fact that Meta's AI can do things like translate languages and identify landmarks through a pair of fashionable glasses shows how far the technology has come.

Cell phone screenshot taken with Meta in San Francisco. (Mike Isaac / The New York Times)

