For years, weekend bike rides have been sacred escapes for me. Every pedal stroke helps melt away the stressors that piled up throughout the week, and I’ve collected a few gadgets that make these rides better. However, I’ve learned the hard way that bringing along too much gear takes away from the ride itself, forcing you to manage a network of pings and battery levels instead of just riding the damn bike.
Enter Ray-Ban Meta: smart glasses that made my weekend rides simpler and a bit more fun.
Instead of wearing sunglasses and a pair of headphones, and fumbling around with my phone to take photos throughout the ride, I now have one device that helps with everything.

The Ray-Ban Meta smart glasses have been a surprise hit with more folks than just me — Meta says it has sold millions of these devices, and CEO Mark Zuckerberg recently said sales have tripled in the last year.
Several Reddit threads and YouTube videos suggest that lots of folks are wearing Ray-Ban Meta glasses while biking. Meta has caught on as well — it’s reportedly building a next generation of AI smart glasses with Oakley, specifically built for athletes.
I never expected to use my Ray-Ban Metas on the bike. But a few months ago, I decided to try them out.
Now, I wear these glasses on bike rides more than anywhere else. Meta got just enough things right with these smart glasses to convince me there’s something here. They’re almost a joy to use, and with a few upgrades, they could get there.
A key selling point of the Ray-Ban Meta glasses is that they’re just a solid pair of Ray-Ban sunglasses — mine are the Wayfarer style with transition lenses and a clear plastic body.
I found these work well for bike rides, protecting my eyes from the sun, dirt, and pollen. They sit comfortably under a bike helmet — but maybe not perfectly. (More on that later.)
The killer feature of Meta’s smart glasses is the camera built into the front of the frames. The glasses let me grab photos and videos of things I see on my rides just by pressing a button on the top right corner of the frames, instead of fumbling with my phone — something that feels slightly cumbersome and dangerous on the bike.
While riding through Golden Gate Park in San Francisco last weekend, I used the Ray-Ban Meta glasses to snap photos of the beautiful Blue Heron Lake, the shrub-covered dunes where the park meets the Pacific Ocean, and the tree-covered track that sits at the park’s entrance.
Is the camera amazing? No. But it’s pretty good, and I end up capturing moments I simply never would have if I weren’t wearing the glasses. For that reason, I don’t see the camera as a replacement for my phone’s camera, but rather a way to capture more photos and videos overall.
The feature I use the most: the open-ear speakers in the arms of the glasses, which allow me to listen to podcasts and music without blocking out the sounds of people, bikers, and cars around me. Meta was far from the first company to put speakers in glasses — Bose has had a solid pair for years. But Meta’s take on open-ear speakers is surprisingly good. I’ve been impressed by the audio quality and how little I miss traditional headphones on these rides.
I’ve found myself chatting with Meta’s AI assistant a bit on my weekend rides. I recently asked it questions about the nature I was seeing throughout the park — such as “Hey, Meta, look and tell me what kind of tree this is?” — as well as the origins of the historic buildings I saw.
I typically use bike rides as a way to unplug from the world, so it seemed counterintuitive to talk with an AI chatbot during the rides. However, I found these short queries stoked my curiosity about the world around me without sucking me into a rabbit hole of content and notifications, which is what usually happens when I use my phone.
And, again, the greatest thing about these features is they all come in one device.
That means fewer things to charge, less clutter in my biking gear box, and fewer devices to manage along my ride.
Potholes
While the Ray-Ban Meta glasses look great for walking around, they clearly weren’t designed with biking in mind.
Oftentimes, the Ray-Ban Meta glasses slide down my nose during a bumpy ride. When I’m bent over on the bike and looking up to see what’s ahead of me, the thick frames block my view. (Most sunglasses for cyclists have thin frames and nose pads to solve these problems.)
The Ray-Ban Meta glasses are also limited in how they work with other apps. While I love taking photos and pausing music with the glasses, for anything else, my phone has to come out of my pocket.
For example, Ray-Ban Meta has a Spotify integration, but I had a hard time getting the AI assistant to play specific playlists. Sometimes, the glasses played nothing when I asked for a playlist or played the wrong playlist altogether.
I’d love to see these integrations improved — and expanded to include more biking-specific integrations with apps like Strava or Garmin.
The Ray-Ban Meta glasses also don’t integrate deeply with the rest of my iPhone, which is likely due to Apple’s restrictive policies.
I’d love to be able to fire off texts or easily navigate through Apple Maps with my Ray-Ban Meta glasses, but features like that may not be available until Apple releases its own smart glasses.
That leaves Meta’s AI assistant. The AI feature is often touted as the main selling point of these glasses, but I frequently found it lacking.
Meta’s voice AI is not as impressive as other voice AI products from OpenAI, Perplexity, and Google. Its AI voices feel more robotic, and I find its answers are less reliable.
I tested Ray-Ban Meta’s recently launched live video AI sessions, which were first unveiled at last year’s Meta Connect conference. The feature streams live video and audio from the glasses to an AI model in the cloud, aiming to create a more seamless way to interact with your AI assistant by letting it “see” what you see. In reality, it was a hallucinated hot mess.
I asked Ray-Ban Meta to identify some of the interesting cars I was biking past near my apartment. The glasses described a modern Ford Bronco as a vintage Volkswagen Beetle, even though the two look nothing alike. Later, the glasses confidently told me that a 1980s BMW was a Honda Civic. Closer, but still very different cars.
During the live AI session, I asked the AI to help identify some plants and trees. The AI told me a eucalyptus tree was an oak tree. When I said, “No, I think that’s a eucalyptus tree,” the AI responded, “Oh yeah, you’re right.” Experiences like that make me question why I’m talking to AI at all.
Google DeepMind and OpenAI are also working on multimodal AI sessions like the one that Meta offers with its smart glasses. But for now, the experiences seem far from finished.
I really want to see an improved version of AI smart glasses that I can take on bike rides. The Ray-Ban Meta glasses are one of the most convincing AI devices I’ve seen yet, and I could see how wearing them on a ride would be a joy after a few key upgrades.