Ray-Ban Meta Wayfarer Smart Glasses Unreview

By Unregistered User (not verified), 27 February, 2024

Forum
Apple Hardware and Compatible Accessories

I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment, and hopefully others will too. Questions are welcome.

They cost £329 in the UK. They arrived two days early.

I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.


Comments

By Holger Fiallo on Wednesday, May 1, 2024 - 18:28

Cookies? Now that you know, do not eat them. If you do and tell her the glasses told you, she might hide them.

By Stephen on Wednesday, May 1, 2024 - 18:46

OK, so as some of you folks may know, I have a couple pairs of smart glasses. I had the Seleste, and I also now have the Ray-Ban Metas. The Metas are everything I've ever been looking for in a pair of smart glasses. Response time is super fast. It does give you short descriptions, but if you want more detail, just ask it. I prefer short descriptions because sometimes I don't need every single detail. Text reads great; I even tried holding my finger over a button on a microwave and asked it to tell me what button my finger was on. It worked phenomenally. I ended up buying a second pair as a spare. It also works fine when it comes to reading my thermostat from across the room. I also love that you can do multiple things with it, not just descriptions: making and receiving phone calls, listening to music, taking pictures and live streaming, all of which are phenomenal. Not gonna lie though, it makes me worried for Envision and Seleste. Meta and Ray-Ban are behemoths in their respective fields.

By mr grieves on Wednesday, May 1, 2024 - 18:58

Cookies? What cookies? I don't even know how those crumbs got there.

By Andy Lane on Wednesday, May 1, 2024 - 21:19

Why are you getting another pair already. I’m over the moon excited about where these are going but 1 pair will do for me. Are you going to try the Skyla frames? I’d be really interested to know what they’re like.

By Stephen on Wednesday, May 1, 2024 - 21:29

The reason why I'm getting a second pair is so that I can swap them out when the glasses' battery gets low.

By CrazyEyez on Wednesday, May 1, 2024 - 23:04

Hey Stephen.
Would you mind explaining how you got yours to read text?
Mine will not go past the summary.
The product recognition isn't the most accurate either.
I had some deodorant, and it got the brand wrong twice. Both times it gave 2 different responses.
The colour recognition seems okay.
The speed is great.

I'm still excited to try the Seleste glasses to compare the two devices.

Good times are ahead.

By 3AM on Thursday, May 2, 2024 - 01:00

I have been using these glasses for about 5 weeks now. All I can say is, awesome! No, they're not perfect. But which one is? It's been said before, and I would agree: this is the worst they will be.
I took a trip to Eastern Europe a couple weeks ago. I was hoping for easy sign translation, etc. Nope! And the Look and tell features were not available. Well actually, two days before returning to the States, they appeared to get an update that enabled Look and tell along with other AI functions. NICE!
But OK, in case you may not know, there are a couple of very useful apps for the phone that really take AI interaction to the next level. And in tandem with the Meta glasses, they shine!
I primarily use Pi: Personal Assistant, which is a completely free and very capable AI platform that allows for two-way communication. And in general, it is excellent. The other is the well-known ChatGPT app. It too has a very useful free mode that also provides two-way voice conversation.
So the pro tip is: in the Shortcuts app, create shortcuts to either or both apps' conversation mode. Then simply invoke the respective AI chatbot with a "Hey Siri" command.
You may need to unlock your device or use Face ID. But from there on you can have a full conversation and get the information you may be looking for. And all this from the convenience of your glasses. How cool is that?
PI works well at finding real-time location information.
Let me know what you think! I'm thinking of doing a YouTube tutorial. Give it a try and report back!

By CrazyEyez on Thursday, May 2, 2024 - 02:16

Do those apps use the cameras in the glasses, or are these apps just something to talk to and ask questions?

By Falco on Thursday, May 2, 2024 - 09:06

Hello,

Does Meta AI also speak other languages? If you take a picture of text in another language, does it read it correctly?

Regards

By mr grieves on Thursday, May 2, 2024 - 09:53

Well, yes I sort of agree. But what is the use case for a sighted person to ask the glasses to describe what's in front of them? I get this feature being useful for "what kind of plant is this?" or other very specific questions, but a general "where the hell am I?" is probably not a question a sighted person is even going to ask.

I have noticed that if I say "look and tell me what you see in a lot of detail" that you get a bit more information. Maybe we still disagree on the definition of "a lot of" but when I tried that it did describe the trees and give a bit of a feeling of the ambience.

You might be right that getting it to read an entire document isn't going to happen right now, and we aren't going to be able to sit with a book and have it read it all to us. But for bits and pieces I can see them being really useful.

With any AI gadget there are always big caveats - there is quite a large BS factor to consider for starters. I think right now both developers and us are just testing the waters to see what can be achieved. I hope Meta picks up on how much we all love these glasses and what potential they have in our world.

By Portia on Thursday, May 2, 2024 - 11:38

Hey all,

I did end up getting the shipping email from Amazon US, saying that my Meta Wayfarer smart glasses will arrive Tuesday.
I'm excited to put them to the test around my room, and maybe around my house as well.
I figured it would still be worth getting them?

By Victor Dima on Thursday, May 2, 2024 - 12:06

Hey everyone. You probably don't know me, but I am a totally blind journalist and accessibility advocate. I have been using these glasses since January; I had access to the preview program for AI and have been testing features within early access, and now with the public version.
I have to say that I am impressed with the updates I've gotten during these months. The fact that I can activate AI functionality in Norway using DNS when setting up the glasses, the fact that I can stream calls directly through WhatsApp Messenger and Facebook Messenger complete with video, and the fact that they included Apple Music functionality, not only Spotify, make me really happy for now.
Of course, I’m hoping that the rumored glasses from Apple are in development and we will get something from them soon, but until then, these should do the trick for me when it comes to Identifying objects, describing areas and reading text.
Always here if you have any kind of questions.

By mr grieves on Friday, May 3, 2024 - 10:57

Oh wow, I had no idea this sort of thing was even possible. I'm guessing this could open up the glasses to do all sorts of things if you had the time and inclination. I've not gone into detail, but I presume the webhook needs to be over the internet, not Wi-Fi.

I suspect I am too lazy to go all the way through with this, but I am very tempted to have a go. Probably with a Raspberry Pi on the internet you could unlock your whole smart home if you so desired (I've not looked into the security of this yet though). I do have a Pi but have never been bothered to try to get it available on the internet.
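In case anyone fancies experimenting, here's roughly what the Pi end could look like. This is only a sketch under my own assumptions: the port, the "action"/"secret" payload fields, and the shared-secret check are all made up for illustration, and actually exposing this to the internet safely (TLS, real authentication) is a whole separate job I haven't looked into.

```python
# A rough sketch only: a tiny webhook receiver you could run on a Raspberry Pi.
# The port, the "action"/"secret" payload fields, and the shared-secret check
# are invented for illustration; real deployments need TLS and proper auth.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

SHARED_SECRET = "change-me"  # placeholder, not a real security scheme

def handle_payload(raw: bytes) -> tuple:
    """Validate a webhook body and return (status_code, action)."""
    try:
        payload = json.loads(raw or b"{}")
    except json.JSONDecodeError:
        return 400, ""
    if not isinstance(payload, dict):
        return 400, ""
    if payload.get("secret") != SHARED_SECRET:
        return 403, ""
    return 200, payload.get("action", "unknown")

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        status, action = handle_payload(self.rfile.read(length))
        if status == 200:
            print(f"Received action: {action}")  # e.g. toggle a smart plug here
        self.send_response(status)
        self.end_headers()

# To actually run it on the Pi, you would do something like:
#   HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```

Anything that can make an HTTP POST (whatever automation the glasses end up triggering) could then hit that endpoint, which is why getting the security right matters before hooking it up to door locks.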

By mr grieves on Saturday, May 4, 2024 - 15:33

I had a good chance to put the AI through its paces today.

At one point I was sitting outside a cafe. I asked meta to tell me the menu of the cafe I was at. It told me that I could enable location services in the meta view app settings to get information like this. So I asked where I was and it said that according to location services on the Meta View app, I was at Towcester. Now I wasn't really in the town, but it was the nearest one. I tried this again later and again it just told me the nearest town.

I then asked what direction I was facing and it said North. So I kept my body still and moved my head to the left and asked again and it said West. So it looks like the glasses have head tracking in them which I wasn't aware of before. The second attempt took two goes before it worked - in the middle it said I had to enable location services in the app even though it understood my question exactly. I asked the same thing again and it told me.

I can't find any specific location options in the app but obviously it knows where I am. A bit later my wife disappeared into an antiques shop and left me outside with the dog. I asked what shop I was outside and it said Tesco Express. As far as I am aware there wasn't one of those for miles.

The where am I/what direction questions seem new. I tried them last weekend and they didn't work, although I was on VPN and in the middle of a forest. But it said I couldn't do that on these glasses. If they can tweak it to give a bit more accuracy that could be useful.

When my wife brought over the printed menu, I asked the glasses to read it. It said that it was a menu and had breakfast, lunch and dinner options. I asked it to tell me what was on the lunch menu. It gave me some funny answer about "the best lunch options in Towcester include the Ship Inn", etc.

A few other things I tried - there was a bench and it told me it was ornate and had some text on it. I asked it to read the text. It said it was Latin then told me it roughly translated to something or other. I had heard it could translate some things, but was surprised it managed Latin.

I also tried asking about a few buildings I was looking at. It told me things like "this is a gothic temple from the 18th century" and I could then find out what year it was built and by whom (assuming it was telling the truth). Or it said "this is Stowe House" and could then go on to describe how many storeys it has, the Corinthian columns outside and so on.

So maybe I take back my assertion that a sighted person wouldn't ask what they are looking at, because I guess this sort of makes sense.

I think the detail it gives in general is a bit substandard compared to Be My AI, or even a sighted person. But it is quite smart when it gives me some extra information that you wouldn't know just from sight. And as always with AI, follow-up questions are amazing.

Another thing I tried was asking what the score was in the Arsenal v Bournemouth match. It told me 1-0 to Arsenal and who had scored from the penalty spot. I asked how long the game had been going on and it told me "it hasn't started - it kicks off at 3pm". About an hour later I asked for the score and it told me it was 0-0.

The other thing I hadn't realised was that it keeps track of the conversations in the AI tab, so I can refer back with VoiceOver. And in there are the images I was asking about too. I presume these are stored in the cloud and not on the glasses, as they are not in the gallery.

Anyway, AI definitely enhanced the day for me even if it is still a bit hit and miss at times.

The other thing that was quite good was that I could take the occasional photo to keep the wife happy. But a few times it told me that my hand was over the camera. I'm pretty sure this wasn't the case. One time I asked the M-guy to take the photo and he told me that, but my arms were by my sides. Unless it was some shadow from the hat I was wearing that confused it.

By Portia on Saturday, May 4, 2024 - 15:47

Wow!
I cannot wait to get mine now!
OMG, this is going to be so exciting to test around my house, etc.
According to Amazon US, mine left Georgia on Friday, so we will see when they get here, Amazon says Tuesday.

By Peter Holdstock on Saturday, May 4, 2024 - 19:36

I got my glasses in the UK 24 hours ago, but unfortunately there's still no sign of Meta AI.

I've gone through and pretty much allowed everything that it wants access to. As the Meta AI functionality is used in the marketing material, I would have hoped that new users would get it from the start so as to avoid any confusion. Of course, if anybody has any ideas about what I might be doing wrong, I'm happy to hear them.

By mr grieves on Saturday, May 4, 2024 - 19:48

I think if it's not working for you, I would make sure you are on the latest firmware. Then quit and reopen the app. Then ask the glasses something like "look and tell me what you see". If it says something like you need to enable AI, quit the app again and reopen. Hopefully then it should prompt you to enable AI.

This is what happened to me after I used proton VPN to pretend I was in the US last weekend. Once enabled, the app will have a Meta AI tab and something similar appears in Settings.

If it’s not available still, then it’s possible that it is only rolling out slowly. It only came on in the UK mid last week, possibly Wednesday. At the time I heard it might be rolling out slowly as some of these things tend to do.

If you can't wait, then you can get a free account with Proton VPN, pretend you are in the US, and once you are set up, just turn off the VPN and you should be good. But I don't think you'll need to do that for much longer.

By Peter Holdstock on Sunday, May 5, 2024 - 08:45

As mentioned, I'm in the UK and Meta AI wasn't showing up. As recommended, I connected to a VPN to show I was in the United States, and as soon as I went back into the Meta View app, Meta AI appeared.

I then disconnected straight from the VPN and it is still working. It's a shame the information given by Meta AI isn't up to date, but the image descriptions are super fast. Initially very short, but it's so quick to answer all questions that it doesn't really matter. In fact, you probably get the exact information you want much more quickly than if it described everything.

By mr grieves on Sunday, May 5, 2024 - 12:52

I was at a market today and asked what I was looking at. I was told a dog treats stall. I tried asking what products it was selling. I was told something like "I can't help you with product availability, but I will be able to soon". This wasn't recorded in my AI log and I can't seem to get it to reproduce it again. So not sure what that is exactly.

One other thing I didn't realise it could do. After this I asked it about another market stall. A little later I said "tell me more about the dog treats stall" and it told me about a couple of products it was selling (which my wife verified). I had a walk and asked a few more things. This was a few hours ago. And I asked it just now to tell me more about the dog treats stall and it still can.

So I thought I would ask it again about the bench I saw yesterday. It told me it cannot access previous conversations. I tried again just now to get it to describe the bench to me and I think it basically just described any old bench. It had a different inscription and I asked where it was and it said it was in the middle of a dog park. I asked where in the world it was. Apparently San Francisco. I don't have the VPN turned on any more I should say, and the man in the glasses seems to know where in the world I really am.

I think you could quite easily lose your grasp on reality with this sort of thing if you aren't careful! The fact it was sunny did make me think I couldn't possibly still be in the UK.

By CrazyEyez on Sunday, May 5, 2024 - 15:15

Some random thoughts:
I tried the "what direction am I facing" question someone else tried and got various answers, so no head tracking on mine. If there is such a thing, it's unreliable.

It is very hit and miss with this AI.

I'm getting slightly annoyed with all the summarization.
Yesterday I was in a coffee shop. I had to ask it 5 questions just so I could find 1 menu item and its price.
It got the sausage egg and cheese breakfast sandwich right but got the price of a cup of coffee wrong.
Product recognition is 60-40 at best.
It can't recognize any of my cologne bottles, and it took 5 tries to guess the brand of deodorant I prefer.
It recognized I was holding a bottle of water but not the brand.

Finding directions on packaging is frustrating.
So far colour recognition has been spot on for me.
It's fun to play with in the store.
I wore the glasses to Costco and had a ball.
It correctly identified 3 cars that belong to family members and their colours.
The best features are the video calling and the ability to snap a quick picture or shoot a video with the press of a button.
The speakers are quite good for such a tiny product.
It's not all bad, but it definitely needs work.
If anyone has any tips and tricks to aid in proper product recognition, I'd love to hear them.
I'm glad I have a 45-day return policy, so I can play with them a little more before I decide if I'm keeping them or not.
Still waiting for my Seleste glasses. Hopefully they come soon so I can do a proper comparison.

By mr grieves on Sunday, May 5, 2024 - 15:31

I may have got lucky with the directions. It's hard to know. I was assuming head tracking because it seemed to know my head had turned, but who knows really.

I think there is a debate to be had about the importance of being able to trust tech. Right now I think we are all giving AI a free pass because of the potential it has and how mind-blowing it is when it gets something right. But I think once the novelty wears off we will stop accepting all the false positives and start having the same expectations we would have for any other kind of tech.

I think right now it is too early to buy a product only based on the AI in it. My belief is that these glasses are well worth the money without it, and I just see the AI as a fun thing to play about with. I'm lucky that my wife can see so I have an easy way to check the results, and we can have a laugh if it is wrong. But no way would I depend on it.

As has been said elsewhere, this is the worst AI will ever be. It is going to improve at a rapid rate, I'm sure. Whether they can get past the hallucinations I don't know.

I definitely wouldn't use the AI in these glasses in an app on the phone - the other options seem better to me. But you can't deny that the form factor is so spot on and the convenience is incredible.

By CrazyEyez on Sunday, May 5, 2024 - 16:16

It's incredible to be able to hold something and ask what it is.
It's just somewhat frustrating when it gives you wrong answers.
It frustrates me further because it recognized the breeds of my little dogs, the fact that a dog was drinking water or sitting on the floor, and their colours.
It recognized cabbage leaves and green onion.
I love the convenience of having the glasses.
I'm excited to see what the future holds.

By CrazyEyez on Sunday, May 5, 2024 - 16:42

More amazement, lol.

It correctly answered cabbage rolls and mashed potatoes when I asked it what was in front of me.

We don't need it since our currency is brailled, but it successfully recognized Canadian currency ranging from $10 to $100 bills.

By Brooke on Sunday, May 5, 2024 - 19:29

I think we all need to remember that these aren't designed specifically for blind people. They aren't meant to read menus to us, as it's unlikely sighted people would be requesting this. The AI is cool for what it is. But if I need something specific read word for word, I'm going to try one of the apps on my phone or the Seleste glasses. That's more of what they're meant for. My Meta glasses did correctly identify every seasoning, jar, and can I held in front of them. Lol, it even used a Mexican accent when identifying my Midz Alfredo sauce. It's also been completely accurate with colors and scene descriptions.

By Stephen on Sunday, May 5, 2024 - 19:51

Ah, the quest for the ultimate tech gadget—it’s like expecting a Michelin-star meal but ending up with fast food! Your Meta glasses seem to have quite the flair, especially with that Mexican accent when identifying your Alfredo sauce. It’s like they’re ready for their own segment on a food network!

But let’s talk turkey about something crucial: inclusivity.

It’s wonderful that your glasses can identify every spice in your cabinet and narrate your surroundings like they’re in an Oscar-nominated film. However, the real magic happens when technology serves everyone at the table. Right now, it feels like tech companies aim for the moon but only clear the tree line. They dazzle with features for some, but what about making sure everyone can benefit?

We need devices that go beyond just impressing at tech expos; they need to be as versatile and accessible as a Swiss Army knife—useful for everyone, whether they have perfect vision or need a little extra help seeing the menu.

Imagine a world where every gadget is like the ultimate potluck: dishes for every taste, preference, and need. Everyone, from those who can’t read tiny text to those who just want to spice their chicken correctly, finds something useful.

Let’s champion technology that doesn’t just boast smart features but also embraces a warm, inclusive spirit. These gadgets should be as commonplace and essential as salt on a dinner table, ensuring everyone can enjoy their meal.

Here’s to a future where technology is as inclusive as a family reunion buffet—everyone finds something they love. Let’s make it the standard, not the exception!

By Holger Fiallo on Sunday, May 5, 2024 - 20:34

Meta would do better to allow third-party apps that can take advantage of the glasses. Until then, it will be just a nice toy. If they let in third-party apps, then and only then will the blind be able to fully take advantage of the possibilities of the glasses. Google Maps, Be My Eyes and apps that are set up for the blind would be able to make the glasses worth the price. Until then, it is a nice toy for those who have the money to get them, unlike us who do not have the money for toys. I hope Apple will release similar glasses that would be worth the price.

By Gokul on Monday, May 6, 2024 - 02:26

@Stephen, you make an important point there. It's sad that no one bothered to think about inclusivity or the potential of such glasses for the disabled community when they were conceived; things would have been so holistic if the glasses had been built with accessibility and inclusivity in mind from scratch. Having said that, the platform becoming inclusive is still a very real possibility so long as we as a community can make our voices heard. I still remember the state of Android accessibility as recently as Android 8/9, and how far it has evolved since then. So it's all about letting our voices and concerns reach places where they could help make a difference.
Also, why is the comparison always between Meta and Seleste glasses? Why is Envision Glasses never in the picture? Is it the cost? Or is it something else? I mean, the range of services they offer is almost the same. I'm genuinely curious.

By Stephen on Monday, May 6, 2024 - 04:07

I think the reason, for me anyway, is the unflattering appearance of the device; that alone is not worth the time or effort, and yes, the almost $4,000 CAD price tag is another factor for basically the same features as the Seleste and Meta glasses. In all reality, the Envision AI is about the same as the other two. Keep in mind the Seleste glasses and Envision glasses use the exact same engine under the hood. Meta, however, is building their own, which is why their AI model needs so much training before it can compare with OpenAI's ChatGPT and others like it.

By Holger Fiallo on Monday, May 6, 2024 - 08:42

I would not trust any AI regarding medications. That would be like playing Russian roulette. The same goes for any scanner app.

By Gokul on Monday, May 6, 2024 - 09:25

I have been using OCR apps to identify medication for some time now, and have also used Be My AI (having said that, I don't have to do it constantly; it's once-in-a-while stuff). I guess it's about trusting one's own knowledge of where certain medicines are, the shape/size of the tablets, etc., and then confirming that with one or multiple apps. And yes, it's a very subjective thing; it's also about how much one usually relies on their own awareness of the things around them.
As for the toy argument, as I have already mentioned elsewhere, I work in a job where awareness of visual info is a huge plus if not an imperative, and I find that the use of AI-based apps has already upped my game, and the use of wearable tech is taking it to the next level.

By Holger Fiallo on Monday, May 6, 2024 - 11:11

You just play one? Take care.

By mr grieves on Monday, May 6, 2024 - 13:36

So I think maybe I was wrong about the head tracking and I've just been a little gullible.

So today I tried again. I stood up and looked forward. What direction am I facing? North. OK, no idea if that's true. Moved my head to the left. What direction am I facing? Firstly, I apparently need to enable location services (no I don't). So I repeated it. Oh, I'm looking East, it seems. Now I'm not the outdoorsy sort and I'm not going to go out hiking in the mountains using the moon as a guide. But I'm reasonably sure that east is not to the left of north. So I turned my body so it was facing to the right of north. Now I am looking south. Again, I'm pretty sure that's not where south should be in relation to north.

So I positioned myself back where it told me that north was and asked what direction. Now it's west. I ask again without moving. Now it's back to north.

So I think the other day it just guessed and happened to get it right. And now I just feel stupid for having taken its word for it.

Again so much of this is smoke and mirrors, but it's so convincing that it is easy to forget.

@Lottie - I agree with you to some extent. We should absolutely be hyped up about this sort of thing. But I also don't think we should just ignore its limitations, because with tech like this it is absolutely essential that you understand what you are getting into.

It would be a very bad idea to trust everything the glasses have to tell you.

And this goes back to my previous comment - how important is it that we can trust it?

I think for something that some of us might be wanting to depend on then it is essential. For something that is just giving us a little extra flavour to our lives, then not so much.

But it does beg the question - what am I going to do with all this information?

Am I going to swallow these pills it has confidently told me are the right medication? Am I going to open this can and hope that there really are tomatoes inside? Am I going to try to look clever and tell everyone that I can translate the Latin on this bench?

I think it is much more important for us to get reliable information than for us to get information tailored to being blind.

I think the other side to this is how much confidence AI always has in itself. So, like an idiot, I totally fell for the idea that it had a compass or head tracking built in. Like, why would it lie to me about something like that? In this case it wasn't important. But what if I had been out hiking under the assumption that it could do this?

Contrast with OCR - it's usually fairly obvious if it can't cope, but you know it is at least trying to answer the specific question you are giving it, and not potentially inventing a whole new question you weren't aware of.

So I don't think we should just be saying how wonderful this all is without acknowledging the other side. I feel right now we are excited by the potential not its current state.

But we should also not be so hung up on the problems that we are unable to reap the benefits and look forward to what comes next.

When I had a good try with these glasses before I took them away and was going off to try to find something that normally I would ask the wife for. So I was looking through all these things getting the glasses to read the labels and for a while I thought this is absolutely incredible - it is giving me back something I have lost. I wouldn't go as far as saying suddenly I feel partially sighted instead of blind as that's a bit of a leap but it was something of a rush to be able to do it.

So I then made my choice and took it over to the wife... and it wasn't at all what the glasses had told me it was.

And that's the two sides to it. We shouldn't discuss one without the other. But you should also not feel that just because some of us post negative comments, it in any way invalidates the successes we do have.

I would also agree that £300 is actually not much considering what you get. That doesn't mean it is affordable to everyone, and I am lucky I could afford to get them. Consider that the top-end Bose Frames were a little more expensive and didn't have the nice charging case, or the camera, or the ability to share videos or make WhatsApp video calls. Let alone open up this other world to us.

Anyway, Lottie, please keep doing what you do, which is sharing your enthusiasm and making at least one old curmudgeon a little bit more excited about the future.

By Holger Fiallo on Monday, May 6, 2024 - 14:27

If the AI from the device tells you that there is a bridge in San Francisco, CA that you can get for $4, do not believe it.

By Brooke on Monday, May 6, 2024 - 16:27

Your comment... "I think we should all stop being negative and should all start being grateful for what we have been given. £300 to turn a blind person into a partially sighted person is pretty good value, isn't it?" ... is one of the best things I've read in a long time. It's exactly how I feel. I love these glasses, flaws and all. Do I hope they improve? Absolutely! But I appreciate what they do right now.

By Brooke on Monday, May 6, 2024 - 16:27

For me, the Envision glasses have never been an option because of the price.

By Brooke on Monday, May 6, 2024 - 17:35

With WhatsApp video calls... I'm guessing the call has to be started on the phone, then transferred to the glasses? I've done this successfully. Just wanted to make sure I can't actually initiate the video call from the glasses.

By Andy Lane on Monday, May 6, 2024 - 18:14

This can be done by starting a voice call on your glasses then double pressing the button on the top of the right arm to switch to a video call.

By Brooke on Monday, May 6, 2024 - 18:25

Will give that a try!

By MarkSarch on Monday, May 6, 2024 - 18:29

You don't actually need the phone to make a video call; you can do it directly from the glasses without touching the phone.
First, you have to know where the capture button is located on the right temple.
You can try the following to see what works for you.
You can keep your phone in your pocket.
Start by asking Meta to make a WhatsApp voice call to any of your contacts.
Once your call has been answered, press the capture button twice and you will hear the synthesizer say something about the camera. Then the other person just needs to accept the video call request, and that's it.

By Andy Lane on Monday, May 6, 2024 - 18:48

Just start a call then double press that button and it will start the video on the glasses. I’ve used it to make a 15 minute call which used 24% of the battery which should hopefully mean around an hour of video calling from the glasses without recharging.
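For anyone wondering where that "around an hour" comes from, it's just linear extrapolation from the one data point (and battery drain won't be perfectly linear, so treat it as a rough guess):

```python
# Back-of-the-envelope battery estimate: a 15-minute video call used 24%
# of the glasses' charge, so extrapolate linearly to a full charge.
minutes_used = 15
percent_used = 24

minutes_per_percent = minutes_used / percent_used   # 0.625 minutes per 1%
full_charge_minutes = minutes_per_percent * 100

print(round(full_charge_minutes, 1))  # 62.5 -> roughly an hour of video calling
```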

By Stephen on Monday, May 6, 2024 - 19:27

Honestly, I loved our back-and-forth banter! I just removed it because there were some ridiculously disrespectful people in the comment section whom I don't want to respond to. I'm really not interested in entertaining drama. It's because of some people that we just can't have nice things, lol.

By Matt D on Monday, May 6, 2024 - 21:33

So, a weird thing happened to me this past Saturday. We were out at a local pub and there was a paper copy of the specials. I said "Hey Meta, look and read these specials to me", and to my astonishment it read the first 3 paragraphs verbatim, and then stopped. It did this every time. This indicates to me a potential for real-time OCR. I did ask it as a follow-up to just tell me Saturday's specials, and of course it summarized them for me, but what was most interesting was the fact that it read those first few paragraphs as they were written. I wish I could figure out what was different there to make it decide to go in that direction. FYI, I did have my sighted wife confirm that the conversation matched the print as written.

By Stephen on Monday, May 6, 2024 - 21:58

Thanks. I thought it was pretty good myself... not to toot my own horn or anything, lol. Honestly, I don't mind constructive criticism or constructive feedback. What I do mind is when people are bored and just start attacking people online for no apparent reason. You can still get your point across without calling people something that I probably shouldn't say on this forum. I don't tolerate it whether it's directed at me or at other people who are commenting with replies to other comments. I don't put that negative energy anywhere near me 😊.

By Brooke on Wednesday, May 8, 2024 - 17:57

I'm curious, because I enjoy FB groups. Is there one relating to the glasses? I guess I could go and look it up instead of asking here, but... I'll post this anyway.

By mr grieves on Saturday, May 25, 2024 - 15:50

So a few times I was out and about and opened Voice Vista whilst wearing the glasses. It did its thing as I would expect, but then every time I tried to say "Hey Meta" it would start playing my audiobook (from Easy Reader).

When I got back to base I was playing with other apps instead of Voice Vista and they seemed fine. But then I tried Voice Vista again and it worked. So I think maybe this bug only materialises when I want to actually use Voice Vista.

I ended up just not bothering but it was a shame. There might be an obvious answer - not sure if Easy Reader was open or not. But I didn't want to spend ages fiddling with my phone when I was wanting to explore. Has anyone else come across this?

I will fiddle some more anyway.

By mr grieves on Wednesday, June 12, 2024 - 15:25

I know GPT-4o has suddenly made these seem a bit old hat now, but I still love my glasses. I use them all the time and they are so useful. But it doesn't always know what the important information is, as I've mentioned before.

One fantastically useless answer it gave me: I was holding a box of tea and asked it to look and tell me what this is. It said "this is a box with information on it". Yup, bingo. That's what I was after - thanks.

The other one that made me laugh was when I was on holiday walking through a field and my wife said "ooh look to the right and ask your glasses what that is!". So I asked and it said something like "there is a green field with flowers and some trees with branches sticking out." What it failed to mention was the 20 foot high naked man made out of bronze standing right in front of me.

In the latter one I did manage to eventually encourage it into a proper answer. "So, er, can you see a statue maybe?". Sometimes you do have to give it clues which would mean being able to see the thing in the first place.

But I had to give up with the tea after a number of tries.

I know I'm not mentioning all the times when it was genuinely useful but they weren't very funny.