Alright, so this one caught me off guard. Like, completely. Bloomberg's Mark Gurman dropped a report on Monday and I had to read it twice because I wasn't sure Apple was really doing all three of these things at once. Smart glasses to compete with Meta Ray-Bans? Sure, we've been hearing about that for a while. But an AI pendant? AirPods with cameras built into them? Apple is going all in on this idea that your devices should be able to see the world around you, and I gotta be honest — the more I think about it, the more I think they might actually be onto something here.
Let me break down all three of these products for you, because they're not all at the same stage and they're not all trying to do the same thing. But together? They paint a really clear picture of where Apple thinks personal technology is headed. And it's not your phone screen anymore.
The Strategy: Give Siri Eyes and Ears Everywhere
Before I get into each product, I wanna talk about what Apple is actually doing here, because the strategy is more interesting than any single device. Apple CEO Tim Cook held an all-hands meeting with employees earlier this month and said the company is "extremely excited" about new product categories enabled by AI. That's Tim Cook talk for "we're betting big on this."
The idea is simple: all three of these wearables — the glasses, the pendant, and the AirPods — will have cameras. Not necessarily for taking photos, but so that Siri and Apple Intelligence can see what you see. The glasses have proper high-res cameras for photos and video plus a computer vision camera. The pendant and AirPods have lower-res cameras that are just there to give the AI context about your surroundings.
Think about it. Right now, if you wanna use Visual Intelligence on your iPhone, you gotta pull your phone out, point it at something, and tap a button. With any of these three devices, Siri can just... see. All the time. You walk past a restaurant and ask "hey Siri, what are the reviews for that place?" and it already knows what you're looking at. That's the play.
Now, all three of these are designed as iPhone accessories, not standalone devices. That's the key difference between what Apple is doing and what Humane tried to do. Humane said "your phone is dead, use our pin instead." Apple is saying "keep your phone, but let us extend what it can do." Very different energy. And I think that's the right call.
The Three Devices: What We Know
The Glasses: Apple's Answer to Meta Ray-Bans
Okay, let's dig into the glasses, because these are the most developed of the three and I think they have the most potential to actually change how people interact with tech day-to-day.
First off — Apple has given prototypes to their hardware engineering team already. Production could start as early as December 2026 with a public launch in 2027. That's not vaporware. That's a real timeline. And to put a fine point on how serious Apple is about this: they paused development on a new version of the Vision Pro — the lighter "Vision Air" headset everyone was expecting — to funnel those engineers into the glasses project instead. That's a massive strategic pivot.
The first generation won't have a display. I know that might sound like a dealbreaker to some people, but honestly I think that's the right move. Meta's standard Ray-Bans don't have a display either and they sell really well. The new Ray-Ban Display model just came out with a screen, but the original camera-and-audio version is the one that actually built the market. Apple is clearly taking notes.
What the glasses will have is a dual camera system. One high-resolution camera for capturing photos and 1080p video, and a second camera dedicated to computer vision — giving Siri the ability to understand what you're looking at, measure distances, identify landmarks, read signs, and provide navigation context. Combine that with speakers for spatial audio and multiple mics, and you've got a device that can handle calls, play music, give directions, translate languages, and let you interact with Siri hands-free all day long.
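And the "read signs" part is less sci-fi than it sounds. Apple already ships this kind of on-device computer vision on every iPhone through the Vision framework. Here's a minimal sketch of text recognition with it. To be clear, this isn't Apple's glasses stack, just an illustration of the building blocks Siri would be drawing on; the CGImage is a stand-in for a frame coming off a camera.

```swift
import CoreGraphics
import Vision

// Minimal sketch: on-device text reading with Apple's Vision framework.
// An illustration of the building blocks, not Apple's glasses stack;
// the CGImage stands in for a frame coming off a camera.
func readText(in frame: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // clean up raw OCR output

    // Run the request (in a real app, do this off the main thread).
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    // Each observation is one detected text region; take its best guess.
    let observations = request.results ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

Everything in that sketch runs on-device, which is exactly the kind of pipeline you'd want when the camera is strapped to your face all day.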
And because this is Apple, they're going all-in on fashion. Multiple frame styles and sizes, premium materials like acrylic and potentially titanium, prescription lens compatibility. They know the number one reason people rejected Google Glass back in the day was that it looked ridiculous. Apple's not about to make that mistake.
Apple isn't entering a vacuum here. Meta's Ray-Bans basically created this category and they're already on their second generation with a display. Samsung confirmed their Android XR glasses are coming in 2026. Google is building AI glasses with Warby Parker. Even Alibaba launched Quark AI glasses in China. TrendForce projects smart glasses shipments could hit 32 million units by 2030.
Apple's entry is the one the entire supply chain has been waiting for though. Taiwanese optical suppliers have already increased capital spending and shifted R&D priorities based on Apple's expected requirements alone. That's the Apple effect — they don't have to ship a single unit to move the industry.
The Pendant: Yes, It's Like the Humane AI Pin. No, It's Not the Same Thing.
Okay, I know what you're thinking. "Didn't the Humane AI Pin just crash and burn spectacularly?" Yes. Yes it did. It was a $699 disaster that tried to replace your phone with a lapel projector and couldn't even get basic tasks right. So why on earth would Apple do something similar?
Because they're doing it completely differently. And I think the difference matters.
The Humane AI Pin failed for a bunch of reasons, but the biggest one was that it tried to be a standalone device. It wanted to be your phone replacement. It had its own cellular connection, its own subscription plan, its own half-baked OS. And it was terrible at all of it. Apple's pendant isn't trying to be any of that. It's an iPhone accessory. It's the dumbest device in the lineup on purpose — the computing power is closer to AirPods than to an Apple Watch. It does one job: give your iPhone a set of eyes and ears that you don't have to hold in your hand.
| Feature | Humane AI Pin | Apple AI Pendant |
|---|---|---|
| Philosophy | Phone replacement | iPhone accessory |
| Processing | Standalone (Snapdragon) | Offloaded to iPhone |
| Display | Laser projector | None |
| Cellular | Built-in + $24/mo plan | None (uses iPhone) |
| Camera | 13MP for photos + AI | Low-res, AI context only |
| Speaker | Yes | Still being debated |
| Price | $699 | TBD (likely much less) |
| App ecosystem | Proprietary (barely any apps) | Apple ecosystem + Siri |
The pendant was born out of the industrial design team's work on the glasses. Apparently while they were figuring out how to put cameras on your face, someone had the idea: what about people who don't wear glasses? What if you could clip something small to your shirt that does the same basic AI context thing?
And look — I'm gonna be real. The idea of an always-on camera clipped to your chest is going to make some people very uncomfortable. The privacy conversation around this is gonna be intense. But if Apple leans into their privacy-first reputation and handles it the way they've handled Face ID and on-device processing... maybe? It's a big "maybe" but Apple has more credibility on privacy than basically any other company that would try this.
The project is still early though. Gurman describes Apple's plans for this entire lineup as "fluid," and the pendant feels like the most likely candidate to get shelved if it doesn't come together. Could launch in 2027 if everything goes well. Could also never see the light of day.
AirPods with Cameras: The Sleeper Hit
Of the three, this is the one I think most people are sleeping on, and it's actually the closest to shipping. We're probably looking at a higher-end AirPods Pro 3 variant dropping later this year with built-in infrared cameras.
Now before you picture a GoPro sticking out of your ear — these are tiny IR sensors. You won't see them. They're not for taking photos or recording video. They're there so the AirPods can understand the space around you and feed that visual context to Apple Intelligence on your iPhone. Think of it like Visual Intelligence but completely hands-free. You look at a menu in a foreign language, ask Siri to translate it, and the answer comes through your AirPods. You never touched your phone.
The cameras also enable hand gesture recognition, which is interesting. Imagine being able to wave your hand to skip a song or accept a phone call without reaching for anything. Ming-Chi Kuo has been talking about this for years based on supply chain data, so this isn't some last-minute pivot — Apple has been building toward this for a while.
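That gesture piece isn't exotic either. Vision ships a hand pose detector today, and a basic pinch check on top of it is only a few lines. A rough sketch, assuming frames are coming in from some camera source; the 0.05 threshold is an illustrative guess, not a number from Apple.

```swift
import CoreGraphics
import Vision

// Rough sketch: detect a thumb-to-index "pinch" with Vision's hand pose request.
// The 0.05 threshold is an illustrative guess, not a value from any Apple product.
func isPinching(in frame: CGImage) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    guard (try? handler.perform([request])) != nil,
          let hand = request.results?.first,
          let thumbTip = try? hand.recognizedPoint(.thumbTip),
          let indexTip = try? hand.recognizedPoint(.indexTip),
          thumbTip.confidence > 0.5, indexTip.confidence > 0.5 else {
        return false
    }

    // Joint locations come back in normalized image coordinates (0...1 per axis).
    let dx = thumbTip.location.x - indexTip.location.x
    let dy = thumbTip.location.y - indexTip.location.y
    return (dx * dx + dy * dy).squareRoot() < 0.05  // fingertips together = pinch
}
```

Swap out the geometry check and the same plumbing gives you a wave or a swipe instead of a pinch.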
Price-wise, the leaker Kosutami claims the same $249 price as current AirPods Pro 3, but I'm skeptical. More likely we see a two-tier setup like the current AirPods 4 — base model without cameras, premium model with cameras at something like $299. That would fit Apple's playbook perfectly.
The Elephant in the Room: Siri Doesn't Have an AI Foundation
Okay, I need to be really direct about this because I think it's the single biggest risk to everything I just described. All three of these devices are built around Siri. Siri. The voice assistant that, as I'm writing this in February 2026, still cannot reliably set two timers at the same time. And I'm not being dramatic — Apple's entire wearable AI strategy depends on a version of Siri that literally does not exist yet.
Let me walk you through how we got here, because the timeline is kind of insane. Apple showed off a dramatically improved Siri at WWDC in June 2024. Personal context, on-screen awareness, app intents, multi-step tasks — the whole thing. They even ran iPhone 16 ads showing these features working. The problem? None of it shipped. Not with iOS 18. Not with iOS 26. Apple admitted in early 2025 that the features needed more time and pushed everything to 2026. Craig Federighi, Apple's software chief, straight up said Siri had "unacceptable error rates" in internal testing.
So they rearchitected the whole thing. New foundation models. More powerful backend. Target: iOS 26.4, expected around March 2026. And then, literally last week — last week — Gurman reported that Apple hit more problems. Siri sometimes doesn't process queries properly and takes too long to respond. The iOS 26.4 beta dropped on February 17th. Guess what's not in it? Any new Siri features. Zero. The features are now being pushed to iOS 26.5 in May, or possibly iOS 27 in September. These are features Apple showed off in June 2024 that still haven't shipped as of February 2026. Almost two years. Here's the full timeline:
June 2024: Apple announces dramatically improved Siri at WWDC, shows working demos, runs iPhone 16 ads featuring the capabilities.
Fall 2024: iOS 18 ships without the new Siri features. Apple stays quiet.
March 2025: Apple officially confirms delay, says features are coming "in the coming year." Multiple class-action lawsuits filed over advertising features that don't exist.
June 2025: Federighi acknowledges "unacceptable error rates." Apple rearchitects Siri with new foundation models. Internal target: iOS 26.4.
February 11, 2026: Gurman reports Siri is having problems again in internal testing. Features may slip to iOS 26.5 or iOS 27.
February 17, 2026: iOS 26.4 beta drops with zero new Siri features.
And here's the part that really gets me: Apple doesn't even have its own AI models powering this. They partnered with Google. The next version of Siri — the one that's supposed to save everything — runs on a custom model built by the Google Gemini team. Apple is paying Google roughly a billion dollars a year for access to these models. They're even discussing running Siri queries on Google's TPU servers because Apple doesn't have the cloud infrastructure to handle chatbot-level requests from billions of devices.
Let me say that again. Apple — the company that built its entire brand on controlling every piece of the stack — is outsourcing the AI brain of their voice assistant to Google. Steve Jobs is spinning in his grave. I don't say that to be disrespectful, but it's the honest truth of where Apple is right now with AI. They're behind. Significantly behind.
The plan going forward has two phases. Phase one: iOS 26.5 (spring 2026, assuming no further slips) brings the personalized Siri features Apple showed two years ago — personal context, app intents, on-screen awareness. Phase two: iOS 27 (September 2026) introduces "Project Campos," a full chatbot version of Siri that replaces the current interface entirely. It'll support text and voice, search the web, generate content, analyze files, and integrate deeply into every Apple app. Think ChatGPT, but built into your entire phone.
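Worth noting: "app intents" is the one piece of this that developers can already touch. It's Apple's existing App Intents framework, where you describe an action your app can perform and Siri, Shortcuts, and Spotlight can invoke it. Here's a minimal sketch of exposing one action; the framework is real, but CoffeeTracker and its names are hypothetical stand-ins.

```swift
import AppIntents

// Minimal sketch of the App Intents framework that Siri's app-control story
// builds on. "CoffeeTracker" and its model layer are hypothetical stand-ins.
struct LogCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Coffee"

    @Parameter(title: "Cups", default: 1)
    var cups: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would write to its own storage here, e.g.:
        // CoffeeStore.shared.log(cups: cups)
        return .result(dialog: "Logged \(cups) cup(s) of coffee.")
    }
}
```

So the hooks for phase one already exist in developers' apps today. What doesn't exist yet is a Siri that reliably decides when to call them.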
If both of those ship on time and work well? These wearables could be incredible. The glasses with a chatbot-level Siri that can see what you see? That's genuinely exciting. But if Siri 2.0 ships half-baked — and let's be honest, Apple Intelligence has had a rough rollout so far, including the notification summary disaster where AI was literally rewriting news headlines with incorrect information — then it doesn't matter how beautiful the hardware is.
I really want this to work. I think Apple's privacy-first approach to AI is the right one, and I think having a smart assistant that can see the world through your glasses or your AirPods would be genuinely useful. But they need to ship. They need Siri to actually work. And right now, in February 2026, they're still missing their own internal deadlines. That should concern anyone who's excited about these wearables.
The Competition: Apple Is Late but That Might Not Matter
Let's be honest — Apple didn't invent this category. Meta's been selling Ray-Ban smart glasses for a couple of years now and they're really good. I've used them. The Ray-Ban Display model with the actual screen is genuinely impressive. And Meta has a massive head start with Meta AI baked into every pair.
Google is coming in hot too, partnering with Warby Parker and Kering (the luxury group behind Gucci) on their own AI glasses for 2026. Samsung confirmed Android XR glasses this year. Even OpenAI has been making noise about hardware. This space is about to get very crowded very fast.
But here's the thing about Apple, and I've seen this play out so many times over the years: they don't need to be first. They need to be best-integrated. The iPhone, AirPods, Apple Watch, Mac — when you're already living in that ecosystem, an Apple wearable that just works with everything you own is a very easy sell. Meta can have the best standalone smart glasses in the world and it still won't matter to the person who's got an iPhone, a MacBook, and AirPods on their desk right now.
That ecosystem lock-in is Apple's superpower and they know it. Every one of these three devices requires an iPhone. That's not a limitation — that's the strategy.
When Does All of This Actually Ship?
Here's what the timeline looks like based on everything we know right now:

| What | Expected timing |
|---|---|
| Siri overhaul (personal context, app intents, on-screen awareness) | iOS 26.5 in May 2026, possibly slipping to iOS 27 |
| "Project Campos" chatbot Siri | iOS 27, September 2026 |
| AirPods Pro with IR cameras | Late 2026 |
| Smart glasses | Production from December 2026, launch in 2027 |
| AI pendant | 2027 at the earliest, if it ships at all |
My Take: This Is Smarter Than It Looks
I'm gonna tell you something that might surprise you: I actually think this is one of the smartest moves Apple has made in a while. And I don't say that as an Apple fanboy — you guys know me, I'll call it out when Apple messes up.
The reason I think this is smart is the three-tier approach. Not everyone wants to wear glasses. Not everyone wants earbuds. Not everyone wants a pendant on their shirt. But almost everyone wants at least one of those things. Apple is giving you three entry points into the same basic capability — ambient AI that can see the world — and letting you pick the form factor that works for your life.
Meta only has one: glasses. And if you don't wear glasses? Tough. Google is doing glasses too. OpenAI's been vague about hardware. Apple is the only company saying "here are three different ways to get this same experience, and they all work with the phone you already own." That's a really strong position.
The big question marks are Siri (and I just spent a lot of words explaining why that concern is very real), privacy (always-on cameras are a tough sell no matter who makes them), and whether the pendant lands or gets laughed out of the room. But the overall strategy? I think it's right. And I think a lot of people are underestimating how big this shift to ambient AI wearables is going to be over the next few years.
The phone in your pocket changed everything 18 years ago. The thing you wear on your face — or your ears, or your chest — might be the next version of that. And Apple clearly thinks so too, because they don't redirect Vision Pro engineers and launch three products at once unless they believe this is the future.
So let me wrap this all up. Smart glasses are the headline product — Meta Ray-Ban competitor, no display, dual cameras, production starting December 2026, likely shipping early-to-mid 2027. This is the one to watch.
AirPods with cameras are the sleeper — IR sensors for Visual Intelligence and gesture control, possibly shipping late 2026 as a premium AirPods Pro 3 variant around $299. Closest to launch.
The AI pendant is the wildcard — always-on camera in an AirTag-sized disc, fascinating concept but earliest stage of development and most likely to get cancelled. If it ships, probably 2027.
All three require an iPhone, all three are built around a Siri that — as of this week — is still missing its own internal deadlines and running on Google's AI models instead of Apple's own. The hardware strategy is smart. The software foundation it depends on is the biggest question mark in Apple's product pipeline right now. If Siri delivers, this changes everything. If it doesn't, these are three very pretty paperweights.
Which One Would You Actually Wear?
Smart glasses, the pendant, or camera AirPods? Or does the whole always-on camera thing freak you out? Drop your take in the comments — I'm genuinely curious where you guys land on this.
