Smart Glasses Suggestions

By Maldalain, 15 April, 2026

Forum
Assistive Technology

I have some spare money that I’m considering investing in something useful for my work, and I’ve been thinking about trying smart glasses.
My main use case would be reading printed text during meetings and possibly getting some light assistance with mobility. I understand that these glasses won’t replace my cane, but I believe having an additional tool could still be helpful.
I don’t have any prior experience with smart glasses, so I would really appreciate guidance. I’m currently deciding between Meta Ray-Ban glasses and Rokid glasses, and I’d like to understand which one would suit my needs better.
In my work, I often sit in meetings where documents are shared, and having some support in reading printed material would be very useful. I also walk frequently, so even simple contextual information about my surroundings could enhance my daily experience.

Comments

By Brian on Wednesday, April 15, 2026 - 07:39

Mark Sarch's initial Meta Ray-Bans review:
https://applevis.com/podcasts/review-ray-ban-meta-smart-glasses

Mark Sarch's follow-up review of the Meta Ray-Bans:
https://applevis.com/blog/applevis-unlimited-whats-new-noteworthy-may-2024

Gokul's review of the Meta Display smart glasses with the Neural Band:
https://applevis.com/forum/assistive-technology/meta-ray-ban-display-neural-band-blind-users-honest-take-future-wearable

Stephen's podcast on the Rokid Styles smart glasses:
https://1drv.ms/u/c/c0bea78b5c1b00b3/IQAN5-sdB90BRYHwYudC3GhdAeYsoZn_f_Xf5jHfIBVpEM8

You may or may not have already listened to or read these, but hey, you want advice, so I thought these would help. 😉

By Maldalain on Wednesday, April 15, 2026 - 12:32

That is very helpful, many thanks. I'm wondering if it's an issue that I'm not in the US or EU; will I miss out on any features?

By Brian on Wednesday, April 15, 2026 - 14:27

Best thing to do is Google it and see which features are, or will be, available in whichever country you're in. 🙂

By Seanoevil on Wednesday, April 15, 2026 - 23:50

Hi,
One comment specific to your suggested use case of scanning documents during a meeting: consider how comfortable you are giving audible prompts to your glasses in front of other people, or whether it is even appropriate to do so.
The standard Meta and Oakley glasses will require you to audibly prompt the glasses to Look and Summarize, or Look and Transcribe the documents. If this is not an issue, then you may get some benefit from the Meta Gen 2 and Oakley Glasses.
If this is not workable, then consider the Oakley Vanguard Glasses. They have an action button that can be mapped to a pre-established prompt. This means you can trigger the Glasses to Look and Describe with a discreet touch of the action button.
I do not believe that this feature can be replicated on the other Meta Glasses. The Vanguard Glasses are, I am told, a Bold Style Choice.
One final thought is regarding Privacy and Security. How comfortable are you, and the others in the meeting, with Documents being scanned and interpreted by Meta?
HTH.
@SeaNoEvil00

By Stephen on Thursday, April 16, 2026 - 04:40

As Brian mentioned above, I'm the one who did the half-baked podcast, or whatever you wanna call it, about the Rokid Styles glasses.
When the Metas came out, I was a huge fan. I had both the Ray-Bans and the Oakleys; however, I have found the AI on the Metas to be fairly limited. It's great that they eventually partnered with Be My Eyes, and it's great that they've opened up the developer kit, but it all seems pretty underwhelming even with apps coming out for the Metas. Not to mention you can access the AI in some places and not in others.
Ever since I purchased the Rokid Styles, they have actually been my daily drivers; I haven't worn the Metas since. They read text extremely well, and I like that you can turn on the notes feature, where the glasses will record and give you a summary of your meeting. Honestly, there are tons of things you can do with the Rokid Styles. You can also set up a custom gesture. For example, I have one where a two-finger single tap on the side of the glasses triggers a prompt to read text. I think my prompt says something along the lines of "read the first page top to bottom. Read the right page afterwards from top to bottom. If the book is not in view, tell me how to adjust it." I'm paraphrasing, but it especially helps if you're trying to read notes or a book somewhere and you don't particularly want to talk to your glasses. As much as I hate to say it, I would highly recommend the Rokid Styles over the Meta glasses any day.
You can also choose what language model you want to use whether that be ChatGPT or Google Gemini.
Also if you have prescription lenses, most likely your prescription will work for the glasses.

By emassey on Thursday, April 16, 2026 - 13:54

There is also the Agiga glasses, which have much more detailed AI than the Meta glasses. They also do not require you to speak to read text or get a scene description, and they have a live AI mode that gives constant updates. However, these advantages might decrease as more apps come to the Meta glasses. Also, it might be worth waiting if you are purchasing smart glasses because the Google glasses running Android XR will likely come out this year. In addition to Gemini or the AI Google includes on the glasses, it is likely that you could run any Android app, including Seeing AI or Be My Eyes with Be My AI. The Apple glasses might also be announced late this year or early next year, although it is unknown if third-party apps will be able to use the camera or how good their AI will be. Perhaps the AirPods Pro with cameras coming out this year will have AI features too.

By Brian on Thursday, April 16, 2026 - 14:46

So have there been any more updates on the Agiga Echo Vision glasses? I feel like these are somebody's pipe dream at this point...

By emassey on Thursday, April 16, 2026 - 15:34

Yes, there are regular updates to both the app and the firmware, and I have had mine for months now. I'm pretty sure anyone who pre-orders now and fills out a form will get the Pioneer Edition very soon afterwards. The only difference, I think, between that and the full edition is that the full edition will have a bigger battery. They are still improving the software, although it is already pretty good.

By Stephen on Thursday, April 16, 2026 - 15:56

Unfortunately, as they keep postponing rollouts, they keep becoming more and more irrelevant in this space. With the Rokid Styles coming in at $299 USD with no subscription, and the fact that they can do basically everything other smart glasses can do but better (in my opinion), I feel like they are going to be another Seleste situation.

By Michael Hansen on Thursday, April 16, 2026 - 16:17

Member of the AppleVis Editorial Team

Mark it on the calendar, I am actually ***possibly*** interested in buying smart glasses for the first time.
My question is, how do the Rokid glasses compare to the Agiga? I have no interest in Meta because of brand reputation and privacy concerns, though I realize that it's all relative.
All I am really interested in any smart glasses for is AI-assisted visual tasks and perhaps the occasional picture.
Thanks!

By Stephen on Thursday, April 16, 2026 - 16:48

So I can't give you a side-by-side comparison of the two, but from my hands-on time with the Rokids and from what I've seen of the Agiga in YouTube videos, they look pretty much the same. I don't know about the Agiga, but with the Rokid Styles you can do everything you mentioned above, no problem. The delivery time for the Rokid Styles was horrendous, though; it took about a month for them to get to me. Funny story about that: they had been taking too long, so I emailed customer service, got no response, and ended up getting a refund. The day they refunded me was the day the glasses shipped out, lol. So honestly, if you're going to order them, you're gonna have to have a lot of patience while you wait for them to ship. Story aside, I don't even bother with the Metas. I guess it would really depend on what AI-assisted tasks you wanna do.

By Michael Hansen on Thursday, April 16, 2026 - 18:25

Member of the AppleVis Editorial Team

Hi Stephen, thanks for this.
The versions of ChatGPT and Gemini, just as examples: are these the same models that one can access through their web or app interfaces? Are there any artificial limitations?

By Maldalain on Thursday, April 16, 2026 - 19:00

Same questions as above, and one further question: if I have a ChatGPT subscription, can it somehow be used with the glasses to get better responses?

By Brian on Thursday, April 16, 2026 - 23:09

Admittedly, the podcast Stephen did makes me a touch jealous. The Rokid has a far more pleasant reading voice for long documents and book pages. Maybe once I have killed off my Gen 1 Metas I will look into Rokid. 😇

By TheBlindGuy07 on Thursday, April 16, 2026 - 23:27

I am also looking for smart glasses. Are the Rokid cheaper than Metas?
Can we choose between different models?

By Stephen on Thursday, April 16, 2026 - 23:50

Michael: I'm honestly not sure how to answer that question. From my own use, I can tell you that you have the option of using either ChatGPT or Google Gemini, and my assumption is that it's using the latest ChatGPT model. It doesn't look like you can see the specific model it's using, but those are my observations. I hope I answered that correctly lol 😊. Basically, you have to use the Hi Rokid app; that's where you get access to both ChatGPT and Gemini. I'm not sure about limitations. If you'd like to expand on that question a little further, I can hunt down the answer 😊.
Maldalain, you don't need a ChatGPT subscription at all to use these glasses. I'm not sure how the backend of all that works, but either they have their own API key or they struck some sort of deal with OpenAI.
TheBlindGuy07, the Rokid Styles are regularly priced at US$299, but right now they're on sale for US$279.
If y’all have any more questions, I’ll try to answer them to the best of my abilities 😊.

By emassey on Friday, April 17, 2026 - 02:55

Something to be aware of when comparing glasses is that, right now, the Meta glasses have the best third-party app support. They support Be My Eyes natively; Aira, Seeing AI, and Envision are coming as well, and I've heard that PiccyBot may also come to the Meta glasses. Oorion supports them now, and there is a HumanWare navigation app coming in the future. The Rokid AI glasses have none of these right now, and while there is an SDK, I haven't found any apps supporting it yet, or planning to, and it seems less documented than Meta's. Even if the AI on the Meta glasses does not meet your needs, there are likely to be numerous options to choose from in the very near future, and you are likely to find something you like. When the Google or Apple glasses come out, they might have more apps available than Meta, but we'll just have to see. The Agiga glasses have Aira and Be My Eyes, but no public SDK and no other supported apps.

By emassey on Friday, April 17, 2026 - 03:21

Also, Meta has announced a significantly improved AI model called Muse Spark. This will be coming to the Meta glasses soon, and hopefully this will make the glasses give better descriptions. I've also heard someone mention (Jonathan Mosen I think) that they assigned the action button on the Oakley Meta Vanguard to a prompt requesting a detailed description, and this made the glasses give longer and more detailed descriptions.

By Stephen on Friday, April 17, 2026 - 04:24

Yeah but the third-party app support is pretty underwhelming. The apps get glitchy when trying to connect to the camera, and they can’t use the touchpad on the side at all — that’s just not available to third-party developers. Oorion is genuinely good and honestly might be the only one worth anything, but even then there’s a catch baked into the Meta SDK: if you haven’t used the AI in ten minutes, it stops listening for the wake word. You have to unlock your phone, go back into the app, and then try again. That’s not on the developers, that’s a Meta SDK problem — but you’re still the one dealing with it.
Be My Eyes is another one I’m giving a wide berth right now. There have been reports on Facebook of volunteers recording their sessions with blind users and posting the videos on TikTok. That’s not Meta’s problem, but it’s enough to keep me away.
Keep in mind I’m not hating on Be My Eyes at all, I think it’s a great service, but it sounds like people are becoming volunteers for content. I hate people lol. Regardless of my thoughts and opinions on that, it’s not a reflection on all volunteers nor the organization itself.
The Rokid Styles could have all of these apps and more if developers knew the platform existed. The SDK is already there. People are already building their own apps and posting them on Reddit and GitHub. The ecosystem just needs people asking for it.
There are also things the Rokid Styles can do that the Meta Ray-Bans simply cannot. The Rokids have turn-by-turn navigation built right in — just ask for directions and they start guiding you through the earpiece. The Metas have nothing like that. Translation on the Rokids covers 89 languages, including six you can download and use offline without any internet connection. The Metas handle four languages and need a connection for all of them. The Rokids let you set custom AI shortcuts on the stem — a two-finger long press that triggers whatever action you assign to it. The Metas don’t have anything like that. Prescription support on the Rokids goes up to -16 diopters; the Metas stop around -4, so if you need a strong prescription, the Metas are already out of the picture. The Rokids also record longer videos, support multiple aspect ratios, have stabilization built in, and come with 32GB of onboard storage.
Meta AI also isn’t available in a lot of regions. The Rokids don’t have that problem.
I used to be firmly in the Meta camp. I’m not anymore. I prefer the Rokid Styles, thin third-party support on the other side and all. But if those apps matter to you, spending a bit more for the Metas is the right call — they’re good glasses. Just skip the Gen 1s entirely, that’s money down the drain. The Gen 2s have better battery life and water resistance, which you’ll appreciate when you get caught in the rain. I have both generations and the Gen 1s are already struggling as the updates pile up and more developers add to the load. They weren’t built for it.
Before you spend the money, be honest with yourself about a few things:
What do I actually need these glasses for?
What’s my budget?
If I had them today, what specifically in my day-to-day life gets easier?
Those third-party apps look fantastic on paper. The YouTube videos are stunning — some companies in this space know exactly how to make their product look incredible on camera, nudge nudge — and the videos are edited to show the best possible experience under the best possible conditions. They’re designed to drive sales, and what you see on screen may look nothing like what lands on your face. Some of these apps are paid services too, which tends not to come up in the content. I genuinely cannot take my eyes off some of this footage, and I mean that with full awareness of the irony. Buy with some healthy skepticism.

By Justin Harris on Friday, April 17, 2026 - 12:31

I do look forward to this new AI version and seeing what it can do. Since most stuff from the glasses gets sent back to the phone for processing anyway, I hope it'll still work well with the gen 1s.

By Michael Hansen on Friday, April 17, 2026 - 12:50

Member of the AppleVis Editorial Team

Hi Stephen,

Thanks for the additional info.

When I asked about models, I think my real question is... Could I get the same level of description I would get were I using ChatGPT or Gemini on the phone to describe the same picture? Or is this a more limited version of the models?

Thanks!

By Stephen on Friday, April 17, 2026 - 16:03

Oh and here I was making my answer more complicated haha. Yes you would get the same level of description.

By mr grieves on Friday, April 17, 2026 - 17:10

I only have experience with the Gen 1 Meta glasses. I like them a lot. The AI is a tad limited, but I was quite impressed by the Oorion support and am excited to see what other third parties can do. In my experience it's a bit iffy at reading text, although others have more luck. It's good enough to be helpful, but I'm not going to read a book with them. I noticed the Oorion app has a text-reading option, but for small text it doesn't really recommend you use the glasses and says you should revert to the phone camera. I'm not sure if this is a limitation of the glasses or the app. I know the Gen 2 camera is a little better than the Gen 1's. Also, Oorion does kill the battery; I think maybe half an hour on the Gen 1, so I guess maybe double that on the Gen 2?

But you can control the Oorion app through your iPhone and use the Metas as a speaker, so you could read text without a wake word. I did hear that Seeing AI was coming to the Metas, which could be good.

I was listening to the Daily Tech Headlines podcast about a week ago, and they said that the Apple glasses had been announced: no display, with a camera etc., set up to work with the new Siri, with a release date of late next year, I think. I've not seen this mentioned anywhere else, so I'm a bit suspicious of that info. With tech you can always be waiting for the next big thing, and at some point you just need to jump in somewhere. But I'm hanging on for this before I ditch my current Metas.

Steven's doing a great job at promoting the rokids - they sound really good.

By Brian on Friday, April 17, 2026 - 17:24

A little alcohol swab on the Meta camera lens goes a long way. 😉

By Stephen on Friday, April 17, 2026 - 17:27

I'm kind of intrigued to see where the Apple smart glasses go. I don't have much faith in them yet, especially because they can't seem to fix dictation. They are so far behind in the AI department that it has me worried for them, especially in the smart glasses space, but maybe, just maybe, they will be good.

By Dan Cook on Friday, April 17, 2026 - 20:37

I got a pair of meta generation two last November and absolutely love them!
I take them everywhere with me and have found them incredibly helpful. That being said I am fully aware we are still in the infancy stage of this technology, so I’m keeping a close eye on this space because once these eventually die in a couple of years, I will definitely be getting another pair.
Whichever brand that ends up being will be interesting to see by that time, but I can definitely see myself using a pair of these forever.

By Michael Hansen on Friday, April 17, 2026 - 22:01

Member of the AppleVis Editorial Team

Thanks Steven, this is very encouraging.
I wonder how necessary third-party apps are for basic scene descriptions. I can appreciate that for something like live AI, you would really want a blindness-tuned integration for the best results. But if all these image description services rely on the same key AI engines, wouldn't you be able to get similar results with a good enough prompt of your own?
For me, my ideal pair of smart glasses is obviously Apple. Besides that, I would want something from a reputable brand that can plug into ChatGPT, or Gemini, or Claude, or whatever model I want... And just describe what it sees according to my prompting. I don't want social networking, no other integrations, just a pair of glasses that will actually do something for me in a way that I would expect.

By emassey on Friday, April 17, 2026 - 22:23

I agree that for basic scene description, if you find a good app or the glasses AI is good, then you don't need other apps. However, there are other things third-party apps would be useful for. For example, the Seeing AI short text feature. If you are trying to read text on a sign, or on a screen, or a package, Seeing AI will continuously read any text it sees instantly, so you can just move the camera around or the package until it reads it. With scene description, you would have to try it multiple times until the camera can see the text and wait a few seconds between tries. So it would be great if this came to glasses, and I think the Oorion app has something like this for text and objects. Also, there are navigation apps. The OKO app uses the camera to identify traffic lights and notifies the user when it is safe to cross, the GoodMaps app uses the camera for indoor navigation, and there are a few GPS apps that use the camera to try to make outdoor navigation more precise. It would be great if all of those could come to glasses. In addition, I think there will also be AI apps that offer unique features or use different models that have different advantages. For example, AIRA Access AI allows you to ask for a human to verify the AI's output, and Envision Ally allows you to save information and prompts that you want it to always use rather than saying everything again each time.

I'm hoping the Apple glasses will have support for third-party apps when they come out, and hopefully they do not block access to the camera like on the Vision Pro. If not, the Google glasses should let you run any Android app I think which would be amazing.

By Brian on Friday, April 17, 2026 - 23:06

I won't lie. I would kill for Be My AI on my Metas. I cannot tell you how often I use it on my iPhone. It is too good.
Oh, and as far as public LLMs go, ChatGPT is boss. 👌😊

By MarkSarch on Saturday, April 18, 2026 - 00:52

I hope my comment serves as useful information for anyone who is planning to buy smart glasses in the near future.
Each of you is free to verify the information I'm about to provide, as it is publicly available.
According to the majority of technology analysts (let's estimate more than 80%), once Google's Android XR glasses hit the market, other smart glasses will not enjoy the same level of success. Throughout the first and second quarters of this year, you will see various brands of smart glasses launching onto the market; they are compelled to do so before the Android XR glasses arrive, in order to ensure some sales.
Agiga, Ally, scribem, and many more use Google Gemini Enterprise.
I'll leave the link below; if it doesn't work, I invite you to search for "Google Gemini Enterprise" so you can better understand what it's all about.
https://cloud.google.com/gemini-enterprise

Unfortunately, Rokid is one of these brands, and although it uses the Gemini and GPT language models, as a free tier it doesn't offer the same features and functionality.
From Google's documentation: the Starter edition is available as an optional free-to-use experience after the initial 30-day free trial of the Business edition. Only in Gemini Enterprise Starter Edition will your data be used for service improvement and training; see the FAQ labeled "Who owns the data and does Google use my data for training models?" for more information.
You can find all of this info in the Google Gemini Enterprise documentation.

Unfortunately, the features of any AI language model depend on language and region.
Personally, I'm grateful to those developers, including some mentioned above, who are making changes to smart glasses to provide better accessibility for our community.
I can say this openly: I have collaborated on improving accessibility in the Meta glasses. My next pair of glasses will be Google Android XR glasses, because they will offer better accessibility and features.
Think about what Google DeepMind and Google's Deep Think labs will offer using Project Astra.

By mr grieves on Saturday, April 18, 2026 - 16:39

I think the expectation with Apple is that the glasses would have better integration and the privacy would be stronger.

I think the Meta glasses apps are great, but it does seem to be that every developer needs to add its own voice assistant at the moment to make it usable. So now if I open up oorion I can talk to oorion but no longer meta. And I can't navigate to another app and start using it without fiddling on the phone. I think this is a great workaround - actually a lot more than I was expecting - but it does feel like a bit of a hack to get things working. I struggle enough to remember which name I'm supposed to be shouting out into the ether as it is - it's like having dozens of children.

Anyway, they are a way off and Apple definitely does have to prove it can get New Siri right which we will have to wait and see about.

From Steven's demo and descriptions the Rokid do feel like they would suit the OP's requirements the best. As has been said before, the Metas aren't really that great if you aren't using voice commands. Sure there may be apps that change this, but you are still fiddling about with apps on your phone at that point.

I suspect the Meta glasses do have some better communication options - for example I can just take and send a photo using WhatsApp hands-free and hear the result coming back. So I've been able to send photos to the wife and hear what she says in response, which is great. Or I can open up a video call with her hands-free. And obviously Be My Eyes etc.

My main concern with the Rokid is that no one other than Steven seems to be talking about them and I'm not sure why as they sound great, and I've not heard of the company before even though they have been around for a little while now.

I think the market is becoming a bit saturated with all these glasses and they aren't all going to survive.

I heard about another one today from a company called Mavis which is being made just near to where I live. It has object detection and a load of other features, but costs over a thousand pounds and has a £30 a month subscription. Much as I would love to support a local company, I don't really feel comfortable going with someone I've not heard of in such a competitive space.

However if I was still low vision they also have a pair of glasses with night vision which sounds pretty awesome. Many years ago I was given a deluxe edition of one of the Call of Duty games which came with a pair of night vision goggles. I used them a couple of times when taking the bins out. It felt very middle-aged SAS.

By Brian on Sunday, April 19, 2026 - 13:06

Remember a few years ago when AI first came out? No, I'm not talking about personal assistants like Siri, Google Assistant, etc., but the AI LLMs we have now: ChatGPT, Google Gemini, Copilot, and so on. When OpenAI released their demonstration to the world, the world went a little nuts. Tech companies all over the world scrambled to start producing their own LLMs to be competitive, because something new was out that was "eye-catching".
Now we have smart glasses. Granted, AR glasses have kind of been around for a little while. Does anyone remember the original Google Glass? As I understand it, it was kind of a bust, but it still paved the way for what we have now. Of course, as trends go, everyone is now scrambling to get their hand in the cookie jar, so to speak. I said this in another post, but someone from Be My Eyes mentioned on the Double Tap podcast that these tech companies have spent hundreds of billions on research, and now they just want a return on their investment. So everyone is scrambling to push out new stuff: new applications, new features, and of course, new tech.
Personally I don't mind; variety is the spice of life, and all that jazz. So long as they actually deliver what they promise. Otherwise, we just have another Seleste fiasco.

... but wait, there's more..

Now Meta has produced the Neural Band, a piece of tech that reportedly allows a user to control their smart glasses with hand and finger gestures.

Give it a few months, and tech companies will be producing their own version of the Neural Band.

Aren't trends exciting? 🙄😇

By Maldalain on Sunday, April 19, 2026 - 15:03

Yeah, I get what you’re saying—and honestly, you’re not wrong.
It’s kind of funny how predictable this cycle has become. Something new drops, everyone loses their mind for a minute, and then suddenly every company on the planet is acting like they were already working on it. AI was the perfect example—one big splash, and then boom, a dozen “me too” products overnight.
And yeah, the smart glasses thing feels like it’s following that exact same script. Google Glass walked so all this new stuff could run, even if it tripped over itself back then. Now the tech’s a bit more polished, a bit more practical, and suddenly it’s back on the menu like it never left.
What you said about companies trying to claw back their investment hits the nail on the head. They’ve sunk insane amounts of money into R&D, and now it’s like: okay, time to monetize everything, fast. Which is why we’re seeing this flood of half-baked features mixed in with genuinely cool innovations.
A good example? Foldable phones. When Samsung first pushed out the Galaxy Fold, people were curious, but it was rough—fragile screens, crazy price tags, and a lot of “is this actually useful?” floating around. Fast forward a bit, and suddenly you’ve got Huawei, Motorola, and a bunch of others all jumping in, each with their own version, all trying to refine the same idea. Same pattern: one company takes the risk, everyone else piles in once there’s proof it might stick.
And the Neural Band? Yeah… that’s definitely about to become the next “must-have” checkbox feature. Give it a little time and we’ll see ten different versions of it, all claiming to be more “intuitive” than the last. Same playbook, different gadget.
I’m with you though—variety is great when it’s real. When it actually works, when it delivers. Otherwise it just turns into hype fatigue, and people start tuning it all out.
Trends are exciting… right up until they start feeling like déjà vu 😅

By Brian on Sunday, April 19, 2026 - 17:13

Sadly, tech trends are a tired old game; the industry has been at it for decades. Let's take a look at the video game industry. Consoles were a marvel when they first came out. Anyone here who is old enough will probably remember when Atari was the new kid on the block. Then there was the Commodore 64. There was also the ColecoVision (pronounced Co'Lee'Co Vision), and so on. Then of course there was the mighty Nintendo Entertainment System (NES), and then the Sega Master System.
Then the Super Nintendo Entertainment System (SNES). Then the Sega Genesis. Oh, and Atari tried to get their hands in the cookie jar again with the Jaguar. Not to mention the handhelds: the Game Boy, the Game Gear, the Game Boy Color/Advance/DS/3DS, and the Sony PSP. Nokia even made something called the N-Gage, iirc; it was a mobile phone/gaming handheld.

Point is someone releases something new and shiny into the really real world, and then a dozen other companies try to copy and/or make something even shinier.

Tech changes about as much as the weather.
Tech trends, not so much. 😅

By Troy B on Sunday, April 19, 2026 - 22:36

I bought my generation 1 Meta glasses near the end of May last year, so I've had them just short of a year. I admittedly haven't used them much, but the two main things I've used them for are reading mail and identifying boxes of food, including retrieving cooking instructions.

When trying to read my mail, pretty much all I can get the glasses to do is summarize, and the couple of times I tried to read a bill and get the amount, it wasn't easy. Regarding the cooking instructions: I ask if the glasses see any cooking instructions, and if they don't (say it's a pot pie, for example), the glasses will tell me that they don't see any instructions but that a pot pie of that brand is normally cooked this way. The problem is that the instructions are wrong.

A friend of mine, who was the first person to buy a pair of these glasses and eventually talked me into getting a pair, told me recently that I just need to practice. But although I purchased the glasses from a place in Florida that sells blindness products, I didn't get much, if any, help with using them. Can somebody give me some tips that might make things easier? I've even thought that maybe these are the wrong glasses for me and that I need to find some other brand that might meet my needs better. I listened to a couple of audio files on the Echovisions and they sounded good, but I don't know how old those files are, and after reading this thread it sounds like at least some of you think the Echovisions may not survive.