I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment and hopefully others will too. Questions are welcome.
They cost £329 in the UK. They arrived two days early.
I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.
Comments
@Ollie
Whilst I agree with your statement I still think it is not necessarily answering the questions that a sighted person would ask. For example, my wife also didn't know about the big naked man in the field until she saw it. So "Hey Meta, look and tell me what that is" would be an obvious thing for her to ask. The fact that there was grass and trees around she could figure out for herself.
Similarly "this is a box with information on it" is not going to be useful to anyone. She may have been asking because she wanted to know a bit more about the product.
I think regardless of whether it is designed for us or not, the AI is just not as good as the others right now.
But given it is on my face and just ridiculously convenient, I can easily forgive it because when it does work it just blows my mind.
And the video calling thing is brilliant. The other day the post lady threw a parcel into a bush. I was able to video call for help. I would never have found it there in a million years. Sure I could have used my phone, but having two hands to prod around was very useful.
mr grieves
If she did ask, it probably couldn't respond and would just make a statement about privacy. Nuts.
Potential for integration with other apps…
Hello everyone,
I have been researching these smart glasses for the last few weeks and I just ordered myself a pair yesterday. It appears that there is room for growth with a product such as this. I found the article below from Envision AI back in January of this year. They would like to be integrated with this application and device! I feel that would be awesome! And I wonder whether other app services such as Be My AI or Seeing AI could be integrated into a product such as this?
When I get mine, I will be submitting some accessibility feedback on this very subject.
https://www.letsenvision.com/blog/ray-ban-meta-smart-glasses-accessibility-envision
Meta and envision
Hey, that could be brilliant actually! All the features that Envision glasses offer with the Meta hardware. Consider the cost factor! Plus it'd solve the OCR problem if it actually happens. Though, like Ollie was saying, the real thing is going to be the OpenAI multimodal thing in a pair of glasses...
@Ollie
You beat me to it - at least I don't need to apologise for my crap nuts joke now.
I do find the Meta AI a mixture of totally useless, quite fun and quite useful, depending on the task. If I have a number of things I want it to read the labels of so I can choose one, and I roughly know what the options are, then it's quite good. It'll sometimes blurt out something random, but I know if I've never heard of the answer then it's probably wrong. If I'm out with mrs grieves and I find myself standing around waiting for her, I will use it to try to get a feeling for where I am, which stops me getting bored. She enjoys getting me to ask it when we see odd things.
I also use the AI to just ask random questions like I might an echo. I think I posted this elsewhere but one time we were driving home and I asked where we were. I then said tell me some facts about that place, and it told me that a dinosaur fossil or some bones or something had been found a few years ago. So I asked what kind of dinosaur and it told me. I then could say "oh, what does that name mean?" and it translated it. I then asked what kind of dinosaur it was, whether it had tiny little arms like a t-rex, who found it and when. And it was really good at giving me the details.
Or if we are visiting a heritage site or something I can ask questions and get random little details. It's really quite good at that.
And yesterday my wife saw lots of Croatia flags about and said "oh, are they in the Euros?". I was fairly sure they were, but I asked Meta and it said yes, and then I could ask who they were playing first in the tournament and it was giving me all the live information.
Not really accessibility related, but I just find it handy being able to ask thin air whatever random crap life throws up.
But you are right, I really, really want to be able to have it describe what the ducks are doing and help me hail a taxi. But that won't be the end game - it will just provoke the next "but what if it could do this too. The more powerful these things are the more it fires up our imagination..."
I also agree that AirPods don't seem like the right place to have this. I personally don't like shoving things in my ears much anyway, but I can't even imagine how the camera would work on them. I would never walk about with my AirPods 3 in my ears as they could fall out at any time and that would be the end of that. Maybe the AirPods Pro are a bit more secure. But I love the feel of the glasses.
Ollie
Wouldn't the EU insist? They did with Apple regarding iOS. Or is that only for phones?
Re: EU
I think it is quite different here.
With iOS it was an anti-competition thing. Apple had an app store that only they control and therefore they have a monopoly and can do what they want, which is bad for consumers.
With the glasses, they aren't a platform, they are an accessory. So telling them to open it up to other apps is almost like forcing them to make the glasses into something they are not, which doesn't make much sense to me, much as I might want it.
I'm not sure technically how easy it would be to open up access to the glasses camera, microphone and voice assistant to other apps. I'd love it if they did but it also feels like there is no great reason for it unless they can somehow license it.
I think there are a number of things that we want that could maybe be framed in a way that isn't specific to accessibility that other people might find useful.
For example, being able to start turn-by-turn directions from your glasses, or being able to find out what shops are nearby but out of sight. Maybe "what places are there to eat nearby?" and then "OK, direct me to xxx". That's not specific to us, but we would benefit. My wife does similar things with the Google Assistant in the car, so a more portable version would be good for her too.
Similarly for OCR, there may be some benefits for sighted users to be able to grab text they see in the world then copy it as text into a message or note or something. And there are a lot of people with low vision who are nowhere near blind who might find it helpful if it could read small text to them.
mr grieves
Do meta allow third party apps? That was the issue with apple.
Ollie
Thanks.
New update plus the Verge
I saw the other day that there has been an update for the glasses. You can now change the maximum length of a video clip from 1 minute to 3 minutes. You have to do so from settings. There was also something about being able to use Amazon Music but only with CarPlay, which seems a bit odd.
Other thing: I was listening to the Verge podcast yesterday and they were talking about these glasses. Apparently they have shipped a million of them, which is far more than they were expecting. They were talking about the possibility of a new version maybe coming out next year with AR. Unlike Vision Pro, which has everything on a screen in your face, this would be you looking into the world with things projected over the top of it. I guess like a HUD in a video game. I'm not sure how much appetite there is from sighted people. It doesn't sound helpful to those of us who can't see, and I hope that they either keep the existing glasses as the entry-level option or at least have a way to turn off the visuals to save battery. Again, not sure if adding extra visuals is really that helpful for someone who can see. I'm guessing such a thing would cost more.
Still it is great that these are proving so successful.
In other interesting news
Meta put out a couple of papers on multimodal (read: GPT-4o-type) capabilities and suggested that these could be put out for testing sometime in the future. Maybe this kind of stuff might come to the Meta glasses sometime down the line...?
@Gokul
That does sound incredible. That chatgpt demo but on my face is the dream.
I'm still amazed by what these glasses can do already. However, I'd slightly prefer that they just improved what is there rather than getting too ambitious too soon. For example, if I take a photo and ask about it, I'm never quite sure if the glasses have any idea what I'm looking at or are just making assumptions. So maybe it knows I'm looking at a lake, so it will just tell me something about some random lake it found online. Or maybe I'm looking at a building or monument and ask a specific question about a feature of it, but it answers based on something else in the vicinity, not what I'm actually looking at.
Hopefully one day AI will be good enough to rely on.
Mr G
What you're talking about is Augmented Reality, and it is one of the two main uses for this technology - it is where they are going; we are just on step one.
You are right, it won't help us, but Assistive Reality will - which is what we are using the Ray-Ban glasses for as blind people.
@Mrg
Llama 3 does appear quite deficient compared to, say, GPT 3.5 and even Claude in its first iteration, but hopefully Meta is investing in improving the smarts, especially since the glasses have already been a bigger hit than even they expected. Also, I'd like full OCR capabilities to come to the glasses before anything else... The multimodal capabilities are for somewhere in the little more far-off future; far enough that it's far, but not so far as not to be imagined.
Has the what are you looking at ability come to the UK yet
Hi. Do all the "what are you looking at" features now work in the UK? Also, can you make unlimited-length video calls, so a family member could guide you through a shopping centre using WhatsApp or FB Messenger?
Kind regards
Graham
One answer....sorry
The video calls are WhatsApp calls, so they are as long as you want, from what I can see.
meta ray ban
It would have been nice to be able to record more than one minute. How about the OhO Sunshine video glasses with access to Siri? A much better deal at $75.
Video recordings now 3 minutes
The new 6.0 update that came out a couple of days ago adds the ability to extend video recordings to 3 minutes. You'll have to change that setting yourself in the Meta View app, but it's rather nice to go from a limit of 1-minute videos to being able to record 3 minutes. So far I'm loving the glasses, and again, a VPN is the answer for getting all AI functions enabled outside of the US. I'm in Norway, and it works like a charm for me here :)
OhO sunshine Waterproof Video Sunglasses, 1080P HD Outdoor Sport
These are a thing, but I didn't see anything about Siri on Amazon, just Mac/Windows. If they could work on the iPhone, they would be the obvious companion to the Envision apps - for £2,650 less! So probably something we are all missing.
These really are the cat's pyjamas
I'm wearing these more and more. They are still a bit weird at times - there is a way they work that you have to get tuned in to - but they are fabulous really. For the price, they are the best assistive technology I have ever bought, even if I am paying £10.99 a month for a VPN.
On the downside, I found myself looking on Amazon earlier, to see if I could buy Twizzlers and Snapple in the UK!
Oho Sunshine
Well I did not know these were a thing. I have the Oho Sunshine audio glasses. I thought they were extremely comfortable, the battery life was great but the sound was pretty bad. I ended up pairing them with my Apple Watch and they were just a tiny bit better than just holding my watch up to my ear.
With the video glasses, is the idea that you wear these when you are out, take a load of videos or photos or whatever, and then download them to your computer when you get home? So they are completely independent from the phone? If so, it seems a bit weird to give them the same brand name when they work in an entirely different way.
Doesn't sound like they can really compare to the Meta Ray-bans but interesting to know about them.
2 more apps added to the Meta glasses
In the latest update you can now connect Amazon Music and Calm to the Meta glasses. I wonder who/what is driving this? Could AIRA do an AIRA Anywhere integration? Could Envision also? That would be cool!
Or even, Seeing AI? I would love to be able to use that on these glasses.
Re: Amazon and the Car
I think I totally misheard the release notes before. For some stupid reason I thought that it had Amazon Music support but only with CarPlay. Which didn't make sense because, it seems, that's not what it was at all. You'd think I'd be better at this sort of thing by now...
So Amazon Music works like Spotify Tap, I think, where you can have it play what it thinks you will like but can't actually choose anything. Is that right? Whereas I think Apple Music will allow you to play anything? I noticed Spotify is not showing up in settings with the others. Not sure if that's because it's the only one I've set up or if it's been given the boot? I don't really use it anyway.
And Calm is a meditation thing and nothing to do with the wheely thing you sit in.
God I'm such an idiot. I can only apologise. I appreciate anyone who politely ignored my stupid message from before.
oHo Sunshine
I don't know if these would work with the Envision app the way we imagine. I never thought about it; I guess I have a pair somewhere around here; let me see. But even if they do, they will never give us the seamlessness that the Meta glasses do. And I do feel that this is the best piece of assistive tech I've ever owned, maybe just below a computer with a screen reader (that could just be sentimental value); and yes, the addition of something like the Envision app or even just an OCR facility would make these incredible.
Speaking of which, why don't we think about pushing the Envision devs to approach Meta about some kind of collab? But I guess that'd totally destroy the Envision glasses, so...
UK Availability
I was listening to Double Tap last night and Steven was saying that the AI on his glasses had just booted him out saying not available in your region. So looks like it is not available in the UK yet unless you use a VPN.
I hope they get a move on!
Someone has said that there is a page that shows availability of it but I've not yet found it after my admittedly slightly half-arsed look.
Solos AirGo Vision Smart Glasses
Are introduced below the fold - a hint for those of you who avoid walking on the wild side!
Free VPN
I was listening to the RNIB podcast and was surprised that they recommended using a VPN to get Meta AI. They suggested Windscribe as it is free.
I downloaded the app and registered. If you provide an email address you get 10GB of data per month and can unlock your account, but you can choose not to use an email address and get 2GB. (I initially didn't put an email in, then went back and did after it warned me, and it says I only have 2GB left, so not sure if that's a bug.)
Anyway, once in I was able to choose a US server easily and connect.
If you are just using the VPN for the Meta AI then this is plenty. You don't need to reboot or uninstall Meta View or anything like that (as was made out in the podcast). Instead, what I did was have Meta View closed, then open it once the VPN was on. Then quit and reopen it again, and I think that was it. It's obvious if it works because you are prompted when you open the app. (If that doesn't work, try using Look and Tell, then quit and reopen the app again.)
Once AI is enabled, you can disable the VPN and it should last a month or so. Then just repeat the process when it stops working.
I haven't tried Windscribe for this specific purpose, but I reckon it will do a better job than Proton VPN for those not wanting to pay for a VPN. With Proton you can't really choose a country on a free subscription, although I have a feeling it might work every month or so. (In my test it seemed to allow it a few days after my Meta AI stopped working.)
Anyway, I'd not heard of Windscribe before, but it seemed accessible enough and it took me no time to get it connected to a US server.
Windscribe
As usual, I wasn't paying enough attention. You have to confirm your email before you get the 10GB, which I've just done. So that explains why I only had 2GB.
It's like talking to an idiot!
The AI is really good, but stupid. It took me three questions earlier to deal with an envelope I picked up off the mat:
1. Whats this? "an envelope"
2. Who is it addressed to? "my name"
3. Who is it from? "Humanware"
It was only four pages; I managed to get it to read most of each page with one or two questions. I think it did invent a new product - I can't find the Harc Reader with AI anywhere online!
compass direction thought
I've been reading this thread, and have a thought. Even when using GPS apps on my phone, results are not immediate. Could there be a slight delay when using these glasses when asking which direction you are facing? I plan to get a pair next month. Having been blind for 70 years, this will be a thrill. Actually wearing glasses for a reason other than for looks.
Weird things sighted people ask
There are lots, obviously, but more than you would think ask "What if they tell you someone is standing right next to you when you think you are alone?"
I was terrified just now; I thought my cat IgglePiggle had become invisible! Turns out I was just looking over her. Phew!
Compass, follow-up questions and magic taps
I think it just makes the compass directions up. Every time you ask you get a different response. I think it is just doing the AI thing of plucking some random thing out of the air and assuming that is correct. I don't think it is tied into the compass at all.
I noticed I was using it yesterday with "look and ask" and it seemed to have a lot of difficulty with follow-up questions, and kept telling me to start my questions with "look and". Like I'd ask "look and tell me what this is", "it's a bottle", "what kind of bottle?", "if you want to know what you are looking at, start the question with look and". Not had that before. I thought maybe the feature was a bit broken, but on my last attempt it did work. Quite annoying when you usually have to do a couple of follow-ups to get the answer.
And I still get a lot of phantom magic taps. Like I'd ask a question, it would give me the response then Easy Reader would start playing. Has no one else had this sort of thing? I started closing all my apps when going out with this, but then Easy Reader forgets where in the book I am which is quite annoying.
Never had it start anything else
I wonder if they really have a compass in them. I poked my head out of the door yesterday and asked "what colour is the van?" It told me purple and white; I checked with the driver, and it was right. The van was at the end of my drive - very clever!
Aira
Aira are exploring ways to integrate with the Meta Ray-bans: https://aira.io/new-wearables-pilot/
I don't personally use it but it is supposed to be a great experience for those that need it and can afford it. Hopefully this might be the start of something good.
amazing news MrG
BTW, do you know you get five minutes of AIRA free every so often? So you can use it without a subscription.
AIRA
Users of AIRA get a 5 minute call a day, and they are also beta testing the use of these glasses. I ordered mine yesterday.
Aira
Aira were on Double Tap yesterday talking about this amongst other things.
The impression I got was that they are trying to get some input from Meta to do a proper integration. I may be wrong but I don't think they have had a lot back yet.
Their other approach is to do something via WhatsApp given that the glasses support video conversations with that already. I think this is the option if Meta don't allow them proper access.
I can't remember which thread it was - we've had a lot on these glasses and I'm too lazy to go find it - but someone mentioned that you can write Chatbots on WhatsApp and someone had managed to do something in php to allow you to talk to ChatGPT with them. So not sure if it's something like that - presumably not just a number of you can call them on normally if they need to filter out non-subscribers.
I think Aira itself is a service I'd be more interested in if I didn't have easy access to a working pair of eyes. But I find this exciting if it can help open other doors. If I didn't have a full-time job sucking out all my energy I'd be really curious to mess about and see what could be done via WhatsApp.
I suspect a lot of the other things we want - Seeing AI, ChatGPT etc - could be seen by Meta as competitors as they are using a different AI than the one they are keen to promote. I noticed on the Verge podcast the other day there were a lot of adverts for Meta AI trying to convince me that it was the most advanced AI out there.
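On the WhatsApp chatbot idea mentioned above: a minimal sketch of what the plumbing could look like, assuming the WhatsApp Business Cloud API's published webhook payload shape. The function names here are my own, and the actual AI relay (passing the text on to ChatGPT or similar and POSTing the answer back) is left out; this is just the parse-and-reply scaffolding, not anything Aira or Meta have actually built.

```python
def extract_text_message(payload):
    """Pull (sender, text) out of a WhatsApp Cloud API webhook payload.

    Returns None if the event isn't a plain text message (e.g. a status
    update or media message). The nested structure below follows the
    Cloud API's documented webhook format.
    """
    try:
        value = payload["entry"][0]["changes"][0]["value"]
        msg = value["messages"][0]
        if msg.get("type") != "text":
            return None
        return msg["from"], msg["text"]["body"]
    except (KeyError, IndexError):
        return None


def build_text_reply(recipient, text):
    """Build the JSON body for a Cloud API text reply.

    In a real bot this would be POSTed to the
    /{phone-number-id}/messages endpoint with a bearer token;
    the AI-generated answer would go in as `text`.
    """
    return {
        "messaging_product": "whatsapp",
        "recipient_type": "individual",
        "to": recipient,
        "type": "text",
        "text": {"body": text},
    }
```

A webhook handler would sit between the two: extract the incoming text, send it off to whatever AI service you like, then build and POST the reply. Subscriber filtering (only answering numbers on an allow list) would slot in right after the extract step.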
Multimodal AI coming to glasses?
I just happened on an article on the Verge and right at the bottom was a little throwaway comment that suggested that the Meta Ray-bans will be getting multimodal AI features. It doesn't necessarily say that it is this generation of glasses and I'm not sure if it is speculation or actual news. But sounds quite interesting.
The article is here: https://www.theverge.com/2024/7/18/24201041/meta-multimodal-llama-ai-model-launch-eu-regulations
I hope Meta realise that the UK isn't part of the EU any more...
Llama 3.1
So Meta released the new and upgraded Llama 3.1, the LLM that powers the Meta glasses among other Meta services. The word is that it is state-of-the-art stuff that can directly compete with GPT-4o, and it's all open source. Presumably the AI behind the glasses will be upgraded? Also, what do you think are the implications of the model being open source? Will they also open up the glasses to third-party services?
It's not Open Source!
Ah, the audacity of calling LLama 3.1 "open source" is nothing short of a grandiose illusion! One must delve into the very essence of what open source embodies—complete transparency, unfettered access, and the unmitigated freedom to modify, distribute, and use the software. Alas, LLama 3.1 falls tragically short of these noble ideals.
Instead of granting the open access it purports, it shackles developers with restrictive licenses and convoluted terms of use. These are nothing but gilded cages that superficially resemble the freedom of open source but in reality, confine and control. It's a farce, a mere façade of openness, designed to placate the masses while maintaining a firm grip on the reins of control.
In truth, LLama 3.1 is the antithesis of the open source ethos—a wolf in sheep's clothing, masquerading as a paragon of transparency. It is high time we call out this charade for what it truly is and champion the cause of genuine open source projects that honor the spirit of true freedom and collaboration.
Open source or not
I know nothing about the open sauciness of the llamas, but I don't think it has any bearing at all on whether the glasses get opened up to 3rd parties or not.
However, if the AI did start to improve to the point of matching chatgpt then that really would be amazing.
Case battery?
Hi all. I’m sorry if this question has already been answered here.
Where can you find the case’s battery percentage?
Thanks.
Finally Got My Own!
So after an Amazon screw-up - they sent me a regular pair of Ray-Bans, even though I made sure to order the correct pair - I ordered a pair from my optician and got them two days ago.
After an hour or two of setting up and configuring them, I got them to work. I didn't realise that, in order for all of the functions to work properly, it's best to keep the Meta View app running in the background. Although some features do function even when the app is closed, functions like calling someone through WhatsApp or Messenger require the app to be running in the background.
Out of curiosity, do anyone else's glasses take about 20 seconds to boot up? I get that they're a computer on your face and it can take a while longer than the Bose Frames, for example, but I'm just curious to know if it's the same for everyone else.
Apparently, if you are going to use them often enough, you can leave them powered on and, when you place them in the case, they'll go into a low power mode. It appears that it won't cause any issues, but I kind of like knowing they're powered off, so will probably shut them down.
A nice thing here in Canada: it looks like if you buy them from an optician, you save on the taxes. I didn't know that, so I saved about $60. Plus, since the model with clear lenses had a shorter wait time, they ordered those and made the polarized lenses in-house.
Out of curiosity, did anyone get a USB cable with their glasses? I didn't, but I just thought, since my optician had to open the box to make the swap of lenses, perhaps the cable fell out or Ray-Ban just forgot to include one. But a friend of mine ordered the same polarized pair from Amazon, and although it looks like they sent him a used pair, even though he ordered a new one, he too didn't get a cable.
I also asked one of my Mastodon followers and they too said that they didn't get a cable. So when I asked Pi AI whether one was included and it responded with yes, it looks like the AI may have been wrong.
Granted, I think that may be the case, as it also claimed that the glasses use a 5 MP camera, but when searching on Google, it says it's a 12 MP camera. So who knows!!!
Anyways, so far, I'm liking them, but haven't really had them for that long and haven't had the chance to put them through their paces.
Will probably end up talking about them in more detail on the next episode of my podcast.
In case anyone feels like checking it out, just search for "eyeBytes Podcast" in your favourite podcast app.
Battery and charging
To get the case battery level, I think the glasses need to be in the case. Then in the Meta View app, if you go to settings and find the glasses, it will speak the case battery level after the level of the glasses. For some reason it doesn’t do this on the home tab. I think if you double tap the glasses on the home tab it might take you there too.
I don’t tend to turn the glasses off. I just fold them up when done. Usually they come on quite quickly, but occasionally it can take maybe the 20 seconds.
I don’t remember getting a usb c cable with them but I could be wrong.
@mr grieves
Thanks much!
The app needs to be running for most things
I almost never launch it. I just checked and I was able to make a WhatsApp call. I wasn't able to switch to video during the call though, but I'm not sure if that was me or the other person.
some replies
@Assistive Intelligence:
Please, please, if you're going to use ChatGPT to write your replies, make sure they include a bit of substance. You made a very strongly emotionally driven (and AI-generated) argument for Llama 3.1 not being open-source, but never actually explained why it's not open-source. That AI-generated passage could have been replaced by the sentence "Llama is not open-source" and it would have had the same meaning.
Llama is definitely not open-source, but it's also not as closed-source as your message would imply. After all, they do provide the weights, with which you can run the models yourself, fine-tune them, or use them to distill smaller models. Their license is significantly more restrictive than traditional open-source licenses like the GPL or the MIT license, but those kinds of licenses are unrealistic for large language models.
Compass
I can't remember who originally brought this up because it was on the previous page of posts, but the glasses do not have a compass or GPS. All they have is a camera, speakers, and a bunch of microphones. If Meta AI tells you that it can determine your compass orientation, it's making that up. A good rule of thumb for an LLM is that you should never ask it what its capabilities are, because it doesn't know what its capabilities are, and LLMs struggle to say things like "I don't know," so you end up with a ton of made-up nonsense.
Battery
yeah, double tapping the glasses in the Meta View app will take you to the settings page where the case's battery level will be shown. I agree that it should be shown on the app's home screen as well.
Llama on the glasses
https://www.fonearena.com/blog/430069/meta-llama-3-1-meta-ai-multilingual-support.html
This article says Llama 3.1 will be available on the glasses very soon, and also that long-form summarisation is supported. So hopefully we can expect the glasses to at least read short texts like menu cards in full detail?
Hope So
The Llama 3.1 models don't yet support vision though, so it's possible Meta AI will still be falling back on the cheaper lower-quality vision model to handle those requests. They do say 3.1 with vision is coming soon though, so we'll just have to see!
I hope they do something though. The vision capabilities currently available are awful!