© 2026 Wanture. All rights reserved.

Tech/Trends

Space-Based Data Centers: Why Tech Giants Are Moving Computing to Orbit

Solar-powered satellites promise to solve AI's energy crisis, but latency and physics stand in the way

6 February 2026

Explainer

Tasha Greene

AI training consumes as much power as small towns, sparking community opposition to data centers. Companies like SpaceX, xAI, and Google are exploring orbital computing: satellites powered by near-constant solar energy, with no water cooling or local grid strain. The concept solves resource conflicts but introduces latency problems that rule out real-time AI. Here's what works, what doesn't, and which workloads might actually benefit from computing in space.


Summary:

  • Space-based data centers could solve Earth's AI energy crisis by running satellites on 24/7 solar power in orbit, eliminating water cooling needs and local grid strain, but latency of 5 to 30 milliseconds per round trip makes real-time AI tasks impossible.
  • SpaceX, Google, China, and the European Space Agency are exploring orbital computing, though no operational systems exist yet; technical hurdles include massive radiator panels for heat dissipation and launch costs of thousands of dollars per kilogram of hardware.
  • Batch processing tasks like climate modeling, protein folding, and archival storage could work in orbit where delays don't matter, but moving infrastructure to space removes community oversight and shifts jobs from distributed data centers to concentrated control facilities.

A single AI training run can consume as much electricity as a hundred homes use in a year. The servers generate so much heat they need water cooling systems that rival small municipal plants. Communities from Virginia to Arizona are blocking new data center projects, worried about power grid strain and water depletion.

This is why some of the world's biggest tech companies are now looking up instead of out. Space-based data centers place computing hardware on satellites in orbit, powered by solar panels that collect unfiltered sunlight 24 hours a day. In February 2025, Elon Musk announced plans to explore orbital computing infrastructure through SpaceX and xAI partnerships. Google has Project Suncatcher exploring similar concepts. China and the European Space Agency have launched feasibility studies.

What sounds like science fiction is becoming an engineering question worth serious money and attention.

How Orbital Computing Works

The core concept involves launching satellites equipped with computing hardware and large solar panel arrays into low Earth orbit, roughly 200 to 1,200 miles above the surface. The satellites receive data from ground stations via laser or radio links, process computations onboard, and transmit results back down.

The appeal is environmental and logistical. Solar panels in orbit collect energy without atmospheric filtering, cloud cover, or nighttime interruptions. A satellite above Earth receives about 35 percent more solar energy than a ground-based panel at peak efficiency. There's no need for cooling towers or water systems because in vacuum the only way to shed heat is thermal radiation: heat radiates directly into the void through radiator panels, the same way the International Space Station manages temperature.
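The 35 percent figure is roughly the ratio of the solar constant above the atmosphere to peak clear-sky irradiance at the ground. A back-of-envelope sketch (the irradiance values are standard approximations, and the capacity-factor and eclipse assumptions are illustrative, not from the article):

```python
# Rough comparison of orbital vs. ground solar yield.
# Standard approximate values; exact gains depend on orbit and site.
SOLAR_CONSTANT = 1361.0   # W/m^2 above the atmosphere
GROUND_PEAK = 1000.0      # W/m^2 at the surface, clear sky, sun overhead

def orbital_gain_at_peak() -> float:
    """Fractional gain of an orbital panel over a ground panel at its best moment."""
    return SOLAR_CONSTANT / GROUND_PEAK - 1.0

def daily_energy_ratio(ground_capacity_factor: float = 0.22,
                       orbit_sunlit_fraction: float = 0.95) -> float:
    """Ratio of daily energy yield, folding in night, weather, and sun angle
    on the ground (capacity factor) and eclipse time in orbit (assumed values)."""
    orbital = SOLAR_CONSTANT * 24 * orbit_sunlit_fraction
    ground = GROUND_PEAK * 24 * ground_capacity_factor
    return orbital / ground

print(f"peak gain: {orbital_gain_at_peak():.0%}")        # ~36%
print(f"daily energy ratio: {daily_energy_ratio():.1f}x")
```

The peak-moment comparison understates the real advantage: averaged over a day, an orbital panel under these assumptions yields several times the energy of a ground panel, because it rarely sees night or weather.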

Proponents argue this solves three problems simultaneously. It eliminates local resource conflicts by removing data centers from communities. It taps an abundant energy source. It sidesteps the real estate and permitting challenges that slow terrestrial expansion. The infrastructure moves off the grid entirely.

Why Data Centers Became an Infrastructure Crisis

Modern AI models require thousands of specialized processors running simultaneously for weeks or months. Training GPT-4 class models consumes an estimated 10 to 25 megawatts of power. That's enough to run a small factory, operating around the clock. Multiply that by dozens of training runs happening concurrently across the industry, plus the energy needed for inference, and the numbers become staggering.
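Those power figures translate into energy as watts times hours. A quick sketch of the range (run durations here are illustrative assumptions, not reported figures):

```python
# Back-of-envelope energy for a large training run.
# Average power comes from the article's 10-25 MW estimate;
# run lengths of 30-90 days are illustrative assumptions.
def training_energy_mwh(avg_power_mw: float, days: float) -> float:
    """Total energy in megawatt-hours for a run at constant average power."""
    return avg_power_mw * 24 * days

low = training_energy_mwh(10, days=30)    # 7,200 MWh
high = training_energy_mwh(25, days=90)   # 54,000 MWh
print(f"{low:,.0f} to {high:,.0f} MWh per run")
```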

The cooling problem compounds the power problem. High-performance chips generate intense heat in tiny spaces. Data centers use refrigeration systems, evaporative cooling towers, or direct liquid cooling that circulates water through server racks. In arid regions like Phoenix, a single large data center can consume millions of gallons of water per day, equivalent to thousands of households.

Local opposition has grown fierce. Residents near proposed data center sites in Loudoun County, Virginia, have organized against projects they view as infrastructure leeches. The facilities bring few jobs compared to their resource demands. They strain electrical grids during peak hours. The NIMBY dynamic is slowing approvals and driving up costs for tech companies that need to expand capacity quickly.

The Technical Barriers No One Has Solved

Latency is the killer constraint. Light travels fast, but not instantaneously. A signal traveling from a ground station to a satellite in low Earth orbit and back covers roughly 500 to 2,500 miles, depending on satellite position and ground station location. That round trip introduces 5 to 30 milliseconds of latency under ideal conditions, not accounting for atmospheric interference or signal processing delays.
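The latency floor follows directly from path length divided by the speed of light. A minimal check of the geometry (straight overhead paths, no processing delay):

```python
C_MILES_PER_SEC = 186_282  # speed of light in vacuum, miles per second

def round_trip_ms(one_way_miles: float) -> float:
    """Propagation-only round-trip latency in milliseconds."""
    return 2 * one_way_miles / C_MILES_PER_SEC * 1000

# Satellite directly overhead at the low and high ends of LEO:
print(f"{round_trip_ms(200):.1f} ms")    # ~2.1 ms
print(f"{round_trip_ms(1200):.1f} ms")   # ~12.9 ms
```

Propagation alone sits at the low end; slant paths to a satellite near the horizon, ground-segment routing, and signal processing push real figures toward the upper end of the 5 to 30 millisecond range.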

For many AI workloads, this matters enormously. Training large language models requires millions of rapid-fire data exchanges between processors. Each training step depends on the results of the previous one. Adding even 10 milliseconds per exchange would slow training runs from weeks to months or make them computationally impractical. Real-time inference tasks like chatbots, voice assistants, and autonomous vehicle decision making cannot tolerate an orbital round trip, plus queuing and processing delays, on every request.
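The compounding effect of per-exchange latency on a synchronous training run can be sketched as follows (the compute time per step and exchanges per step are hypothetical numbers chosen for illustration):

```python
def slowdown_factor(compute_ms_per_step: float, exchanges_per_step: int,
                    latency_ms_per_exchange: float) -> float:
    """How much longer each synchronous training step takes when every
    inter-processor exchange stalls on an extra link latency."""
    stalled = compute_ms_per_step + exchanges_per_step * latency_ms_per_exchange
    return stalled / compute_ms_per_step

# Hypothetical numbers: 50 ms of compute per step, 10 synchronizations per step,
# each paying an extra 10 ms of link latency.
factor = slowdown_factor(50, exchanges_per_step=10, latency_ms_per_exchange=10)
print(f"{factor:.0f}x slower")  # a six-week run stretches past four months
```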

The Cooling Paradox

Thermal management in orbit is more complex than it appears. Yes, space is cold, but there's no air to carry heat away. Spacecraft rely entirely on thermal radiation, which means large radiator panels to dissipate heat from densely packed processors. The International Space Station's radiators span 1,700 square feet to cool a fraction of the power load a modern data center generates. Scaling this to handle megawatts of computing heat requires radiator systems so large they might dwarf the computing hardware itself.
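Radiator sizing follows from the Stefan-Boltzmann law, P = εσAT⁴. A rough sizing sketch (the emissivity and panel temperature are typical assumed values, not spacecraft specifications):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_w: float, temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """One-sided radiating area needed to reject heat_w watts at temp_k."""
    return heat_w / (emissivity * SIGMA * temp_k ** 4)

# Rejecting 1 MW of computing heat at an assumed 300 K panel temperature:
area = radiator_area_m2(1_000_000)
print(f"{area:,.0f} m^2 (~{area * 10.764:,.0f} sq ft)")  # ~2,400 m^2
```

Roughly 2,400 square meters, or about 26,000 square feet, for a single megawatt: more than an order of magnitude beyond the ISS's 1,700 square feet, before accounting for the 10 to 25 megawatt loads quoted for large training runs.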

Satellites in low Earth orbit move at roughly 17,000 miles per hour, circling the planet every 90 minutes. Maintaining constant communication with ground stations requires either constellations of dozens of satellites so one is always overhead, or higher orbits where satellites move slower but latency increases. Launch costs, while declining, still run thousands of dollars per kilogram of payload. A single rack of high-performance servers weighs over 1,000 pounds before adding solar panels, radiators, and communications equipment.
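Launch-mass arithmetic is equally unforgiving. A quick sketch using the article's rack weight (the per-kilogram prices are assumed placeholders; actual prices vary by vehicle and contract):

```python
LB_TO_KG = 0.4536

def launch_cost_usd(payload_lb: float, usd_per_kg: float) -> float:
    """Launch cost for a payload at a given price per kilogram to orbit."""
    return payload_lb * LB_TO_KG * usd_per_kg

# One 1,000 lb server rack at an assumed $2,000-$5,000 per kg to LEO:
low = launch_cost_usd(1000, 2000)
high = launch_cost_usd(1000, 5000)
print(f"${low:,.0f} to ${high:,.0f} per rack, before panels and radiators")
```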

What This Is Actually Good For

Space-based computing isn't viable for latency-sensitive workloads, but it could excel at specific batch processing tasks that don't require real-time responses. Training certain types of models where data can be uploaded in bulk, processed over hours or days, and results downloaded later might work. Climate modeling projects like the European Centre for Medium-Range Weather Forecasts' seasonal prediction models, which process historical atmospheric data over multiple days to generate three-month forecasts, could tolerate orbital delays. The same applies to protein folding simulations and large-scale data analysis jobs that already run overnight on terrestrial servers.

Archival storage represents another potential use case. Data that needs to be retained but rarely accessed could sit on orbital storage satellites, retrieved only when necessary. The latency penalty matters less when retrieval times measured in minutes are acceptable.

Specialized edge processing for satellite networks might close the loop more elegantly. If data is already being collected by Earth observation satellites, processing it in orbit before sending condensed results to the ground reduces bandwidth needs. This works for applications like agricultural monitoring, disaster response analysis, or environmental tracking where time scales are measured in hours or days, not milliseconds.
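The dividing line these sections draw, that latency tolerance decides placement, can be sketched as a toy scheduler (the workload names, tolerances, and threshold are illustrative, not from any real system):

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_tolerable_delay_s: float  # how long a result can wait

def placement(w: Workload, orbital_delay_s: float = 60.0) -> str:
    """Route a workload to orbit only if it can absorb the round-trip and
    scheduling delays of an orbital platform (assumed 60 s threshold)."""
    return "orbit" if w.max_tolerable_delay_s >= orbital_delay_s else "ground"

jobs = [
    Workload("chatbot inference", 0.2),
    Workload("autonomous-vehicle decision", 0.05),
    Workload("seasonal climate model", 3 * 86_400),
    Workload("protein folding batch", 86_400),
    Workload("archival retrieval", 600),
]
for job in jobs:
    print(f"{job.name}: {placement(job)}")
```

Under these assumptions the batch jobs and archival retrieval route to orbit while anything interactive stays on the ground, which is exactly the split the article describes.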

The Companies Making Real Moves

Several major technology and aerospace organizations are actively exploring orbital computing infrastructure, though no operational systems have been deployed yet.

SpaceX and xAI have announced plans to explore orbital computing infrastructure, positioning it as central to long-term AI development strategy. The announcements leverage SpaceX's launch capabilities and satellite experience from Starlink, though concrete timelines and technical specifications remain undisclosed.

Google's Project Suncatcher explores solar-powered orbital computing with a focus on sustainability metrics. The project remains in early research phases, with no announced launch dates. Internal presentations reportedly emphasize reducing data center carbon footprints by moving compute-intensive training runs to space-based platforms powered entirely by solar energy.

China's space program has allocated funding for feasibility studies on orbital data infrastructure as part of its broader space station and satellite network ambitions. European Space Agency documents reference edge computing in orbit as a long-term research area, integrated with Earth observation satellite programs already in operation.

The pattern is consistent across all initiatives: ambitious concepts backed by preliminary research, but no operational hardware in orbit or confirmed deployment schedules.

What This Means for Infrastructure Decisions

Moving data centers to orbit doesn't address the underlying question: who gets a say in infrastructure decisions that affect shared resources? The terrestrial data center fights revealed something deeper than NIMBY politics. Residents were asking whether their local power and water should subsidize global AI infrastructure that brings minimal local benefit. Space-based systems remove the immediate environmental burden but also remove any community participation in governance or benefit-sharing.

Labor implications remain murky. Data centers employ thousands of technicians, engineers, and operations staff. Orbital systems would require far fewer hands-on workers, with most operations managed remotely from ground control facilities. The infrastructure jobs shift from distributed local employment to concentrated technical roles at launch sites and control centers.

Progress That Lasts Requires Honest Tradeoffs

Space-based computing addresses real problems. Terrestrial data centers strain resources and face opposition. Solar power in orbit is abundant and constant. The environmental case has merit.

But the technical challenges are not yet solved, the economics remain speculative, and the social implications deserve scrutiny beyond corporate press releases. The question is not whether we can put data centers in orbit. The question is whether we should, for which workloads, and at what cost to whom.

Right now, we have more announcements than answers, more ambition than evidence, and more engineering problems than operational systems. That's not a reason to dismiss the concept. It's a reason to watch the evidence closely and ask hard questions about feasibility, accountability, and who benefits when infrastructure moves beyond democratic reach.

Every watt is a choice. So is every orbit.

What is this about?

  • commercial space infrastructure
  • satellite deployment
  • data center power
  • AI energy consumption
  • orbital computing
  • data center sustainability
