Meta just turned its smart glasses into calorie counters, an innovative move that immediately raises privacy questions. The company recently announced that Ray-Ban and Oakley smart glasses will soon let U.S. users log meals using built-in cameras and AI. Snap a photo of your lunch, ask the glasses what is in it, and Meta AI will log it into a nutrition diary. It is clever technology that promises frictionless tracking, though the implications of always-on food surveillance deserve scrutiny.
The feature works like this: you wear your Ray-Ban or Oakley smart glasses, spot a plate of food, and either snap a photo or ask the AI what you are looking at. Meta AI then identifies the dish and logs it into a nutrition diary inside the Meta AI app. The company says the tool will offer personalized suggestions, things like "what to eat for more energy," which could prove helpful for users seeking nutritional guidance.
For now, the feature requires you to actively prompt it. Continuous analysis, where the glasses automatically recognize food without you asking, is slated for a future update. That is probably for the best, as passive tracking raises additional privacy considerations beyond user-initiated scanning.
Meta says the nutrition tracker will roll out later this summer in the U.S., limited to adults 18 or older. It will debut on Ray-Ban and Oakley models that already have built-in heads-up displays, so not every pair of smart glasses will get the feature. The company has not named specific models yet, but if you have a pair with a display, you are likely in the running.
The age restriction makes sense from a legal and ethical standpoint, given that nutrition tracking can be a sensitive area for younger users. Meta is entering territory that apps like MyFitnessPal and Lose It! have occupied for years, except now the camera is integrated into wearable eyewear.
The promise here is frictionless tracking: if logging meals takes three seconds instead of three minutes, adoption rates will increase significantly. No need to pull out your phone, open an app, search for "grilled chicken breast," and scroll through multiple variations. Just look, ask, done.
However, when your glasses are always capable of watching, the line between helpful and intrusive becomes blurred. Meta says privacy safeguards will be in place, as required by data protection rules, but questions remain about what data is stored, how long it is kept, and whether it is used to train future AI models. Transparency on these points will be critical for user trust and adoption.
This is not just about counting calories. It is about Meta embedding AI deeper into everyday life, literally onto your face. The company has been pushing hard to make smart glasses a mainstream product, and nutrition tracking is the kind of practical feature that could actually move the needle. It is not flashy or revolutionary, but it is useful in a way that novelty AR features never were.
The real test will be accuracy. AI food recognition has come a long way, but it still stumbles on portion sizes, mixed dishes, and anything that does not look like standardized food photography. If the glasses cannot reliably handle those cases, user confidence in the feature will erode quickly.
The U.S. rollout is set for later this summer. For users 18 or older who own compatible glasses and are comfortable with Meta handling their meal data, the feature represents a notable step forward in wearable health tracking technology.