
AI's Energy Cost: What Every Query Really Consumes

From training to inference, how artificial intelligence became one of tech's most power-hungry innovations

11 February 2026

Explainer

By Tasha Greene

Every ChatGPT query uses 10 times more energy than a Google search. Training GPT-3 consumed enough electricity to power 120 American homes for a year. As AI adoption explodes, data centers face mounting pressure on water, electricity grids, and carbon emissions. This explainer breaks down the hidden infrastructure costs behind each interaction.

Summary:

  • Training GPT‑3 burned ~1,300 MWh—enough to power 120 U.S. homes for a year—and evaporated ~185,000 gal of water, total footprint ≈1.4 M gal.
  • Each ChatGPT reply uses ~0.34 Wh (≈10× a Google search). With 3.5 billion weekly queries, AI’s energy and water use surge dramatically.
  • Data‑center cooling consumes ~40% of power; U.S. centers used ~17 billion gal of water in 2023. AI could push data‑center electricity to 6–12% of U.S. use by 2028.

When ChatGPT answers your question about weekend dinner recipes, somewhere in a data center, hundreds of specialized chips light up. They process your request in milliseconds, then cool down and wait for the next query. That single moment of computation costs energy. Multiply it by millions of users asking millions of questions, and you begin to see why artificial intelligence has become one of the tech industry's most power-hungry innovations.

The question isn't whether AI consumes energy. It does. The real question is whether that consumption will outpace our ability to make it cleaner and more efficient.

Training Consumes Energy at City Scale

Building a large language model requires an enormous one-time energy investment. Training GPT-3, for example, consumed about 1,287 megawatt-hours of electricity. That's roughly equivalent to powering 120 average American homes for an entire year, compressed into several weeks of continuous computation.
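The "120 homes" comparison can be checked with simple arithmetic. This sketch assumes an average U.S. household consumes roughly 10,700 kWh per year, which is a common ballpark figure and not stated in the article:

```python
# Back-of-envelope check on the GPT-3 training comparison.
# Assumption: an average U.S. home uses ~10,700 kWh per year (ballpark figure).
TRAINING_MWH = 1_287            # reported GPT-3 training energy
HOME_KWH_PER_YEAR = 10_700      # assumed average household consumption

homes_powered_for_a_year = TRAINING_MWH * 1_000 / HOME_KWH_PER_YEAR
print(round(homes_powered_for_a_year))  # ≈ 120 homes
```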

The process works like this: thousands of graphics processing units (GPUs) run simultaneously, analyzing patterns across billions of text examples. Each GPU generates heat while crunching numbers. The system learns language structure, context, and relationships between concepts. This happens 24 hours a day until the model reaches acceptable performance levels.

Training happens once per model. GPT-4 required more energy than GPT-3. Future models will likely require more still, unless efficiency improvements outpace scale increases. The data centers running these training operations draw power comparable to small industrial facilities during peak training periods.

Water consumption adds another invisible cost. GPT-3's training evaporated around 185,000 gallons of water on-site for cooling. When you include the water used to generate the electricity that powered the training, the total footprint reaches about 1.4 million gallons—enough to fill more than two Olympic swimming pools.
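The Olympic-pool comparison checks out under a standard assumption (not in the article) that a competition pool holds 2,500 cubic meters, or about 660,000 U.S. gallons:

```python
# Sanity-check the "more than two Olympic pools" comparison.
# Assumption: an Olympic pool holds 2,500 m^3 ≈ 660,000 U.S. gallons.
TOTAL_FOOTPRINT_GAL = 1_400_000   # reported total water footprint
POOL_GAL = 660_000                # assumed pool volume

pools = TOTAL_FOOTPRINT_GAL / POOL_GAL
print(f"{pools:.1f} pools")  # ≈ 2.1
```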

Every Query Triggers Invisible Computations

After training comes inference, the process of actually answering user questions. This is where scale creates ongoing energy costs. A single ChatGPT query consumes around 0.34 watt-hours of electricity. That's roughly 10 times more than a traditional Google search, which uses about 0.03 watt-hours.

That difference seems trivial until you consider volume. ChatGPT handled over 100 million weekly active users by early 2024. If each user makes just five queries per day, that's 3.5 billion queries weekly. The cumulative energy cost grows with every new user and every additional question.
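Reproducing that volume math makes the scale concrete; under the article's own assumptions, the weekly inference load approaches the energy of an entire GPT-3 training run:

```python
# Reproducing the article's volume math under its stated assumptions.
WEEKLY_USERS = 100_000_000     # reported weekly active users, early 2024
QUERIES_PER_USER_PER_DAY = 5   # assumed in the article
WH_PER_QUERY = 0.34            # reported per-query energy

weekly_queries = WEEKLY_USERS * QUERIES_PER_USER_PER_DAY * 7
weekly_mwh = weekly_queries * WH_PER_QUERY / 1_000_000
print(f"{weekly_queries:,} queries -> {weekly_mwh:,.0f} MWh per week")
# 3,500,000,000 queries -> ~1,190 MWh per week
```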

Complexity matters too. Generating a 500-word essay requires more computation than answering a simple factual question. Image generation through models like DALL-E or Midjourney demands even more processing power. The energy cost scales with output length and complexity.

Each query also consumes water. A single ChatGPT interaction uses about 0.01 fluid ounces for cooling infrastructure. Individually negligible, but at billions of queries per week, the total adds up quickly.
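Scaling that per-query figure to the weekly volume computed above shows how "individually negligible" becomes substantial:

```python
# Scaling the per-query water figure to the article's weekly query volume.
FL_OZ_PER_QUERY = 0.01            # reported water per interaction
WEEKLY_QUERIES = 3_500_000_000    # from the article's usage estimate
FL_OZ_PER_GALLON = 128            # U.S. fluid ounces per gallon

weekly_gallons = FL_OZ_PER_QUERY * WEEKLY_QUERIES / FL_OZ_PER_GALLON
print(f"{weekly_gallons:,.0f} gallons per week")  # ≈ 273,000
```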

Cooling Systems Drive Hidden Infrastructure Costs

Data centers don't just run AI models. They fight heat. Modern GPUs can reach temperatures exceeding 82 degrees Celsius (about 180 degrees Fahrenheit) under full load. Pack thousands of them into a single facility and you create an environment that requires industrial-scale cooling systems.

In temperate climates, data centers use outside air for cooling when possible. In warmer regions like Arizona or Texas, massive air conditioning systems run continuously. Google's data centers, which serve AI workloads among other services, dedicate around 40 percent of their total energy consumption to cooling and power distribution rather than computation itself.
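One way to read that 40 percent figure is through power usage effectiveness (PUE), the standard data-center efficiency metric, defined as total facility power divided by IT power. Treating the article's 40 percent as total non-IT overhead (an interpretation, not something the article states) gives:

```python
# Translating "40% of energy goes to cooling and distribution" into PUE.
# PUE = total facility power / IT (compute) power.
overhead_fraction = 0.40   # article's figure for non-compute energy

pue = 1 / (1 - overhead_fraction)
print(f"implied PUE ≈ {pue:.2f}")  # ≈ 1.67
```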

Water consumption operates on a staggering scale. U.S. data centers consumed about 17 billion gallons of water directly in 2023. When you include the indirect water used to generate the electricity that powers these facilities, the total reaches roughly 211 billion gallons. Google alone reported 8.1 billion gallons of water use across data centers and offices in 2024.

Evaporative cooling systems, common in hot climates, pull millions of gallons annually per facility. As AI workloads expand, so does the infrastructure needed to keep them from overheating. Microsoft's data centers consumed around 2.07 billion gallons of water in fiscal year 2023 across all operations.

Carbon Emissions and Grid Pressure Mount

A query processed in Iceland carries a different environmental cost than one processed in West Virginia. Iceland's data centers run almost entirely on geothermal and hydroelectric power. West Virginia's grid relies heavily on coal.

Major AI companies recognize this disparity. Google reports that 64 percent of its global data center energy came from carbon-free sources in 2023. Microsoft committed to being carbon negative by 2030. Meta aims for 100 percent renewable energy across all operations.

The challenge is matching renewable energy supply with actual usage patterns. Solar panels generate power during the day, but AI queries happen around the clock. Battery storage helps, but at current scale, most data centers still draw from mixed grids that include fossil fuel generation during peak demand periods.

U.S. data centers consumed 176 terawatt-hours in 2023, representing about 4.4 percent of U.S. electricity. Globally, data centers used roughly 415 terawatt-hours in 2024, about 1.5 percent of global electricity. AI servers accounted for 24 percent of server electricity demand. Projections suggest U.S. data centers could consume between 6.7 and 12 percent of U.S. electricity by 2028, depending on AI adoption rates.
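These figures are internally consistent: 176 TWh at 4.4 percent implies a U.S. total of about 4,000 TWh. The sketch below also converts the 2028 projection into absolute terms, with the caveat (my assumption, not the article's) that total U.S. consumption stays roughly flat:

```python
# Cross-checking the U.S. figures and converting the 2028 projection.
DATA_CENTER_TWH_2023 = 176
SHARE_2023 = 0.044

us_total_twh = DATA_CENTER_TWH_2023 / SHARE_2023   # implied U.S. total
# Assumes flat total consumption through 2028 (simplification).
low_2028 = us_total_twh * 0.067
high_2028 = us_total_twh * 0.12
print(f"U.S. total ≈ {us_total_twh:.0f} TWh; "
      f"2028 data centers ≈ {low_2028:.0f}-{high_2028:.0f} TWh")
```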

Efficiency Gains Emerge From Optimization

AI models are becoming more efficient per unit of performance. GPT-3 required 175 billion parameters to achieve its capabilities. Newer architectures achieve similar results with fewer parameters through techniques like mixture of experts, which activate only relevant parts of the model for each query rather than the entire network.

Quantization reduces the precision of calculations without significantly degrading output quality, cutting energy consumption by 30 to 50 percent in some implementations. Specialized AI chips from companies like Google (TPUs) and startups like Cerebras deliver better performance per watt than general-purpose GPUs.
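A toy sketch of the idea behind quantization: store weights as 8-bit integers plus a scale factor instead of 32-bit floats, shrinking memory traffic (a major energy cost) at the price of a small reconstruction error. This is an illustrative simplification, not any production scheme:

```python
# Toy symmetric int8 quantization: floats -> 8-bit codes + one scale factor.
def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127   # map the largest weight to 127
    codes = [round(w / scale) for w in weights]  # int8 codes in [-127, 127]
    return codes, scale

def dequantize(codes, scale):
    return [c * scale for c in codes]

weights = [0.123, -0.57, 0.338, 1.27, -0.981]
codes, scale = quantize_int8(weights)
restored = dequantize(codes, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(codes, f"max reconstruction error ≈ {max_err:.4f}")
```

Each weight now occupies one byte instead of four, and the error stays below one quantization step.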

Edge computing shifts some AI processing to devices themselves. Your phone can now run smaller language models locally for tasks like predictive text or voice recognition. This reduces data center load, though it transfers energy consumption to billions of individual devices with smaller but cumulative impact.

Companies are also investing in mitigation. Google's water stewardship projects supplied around 4.5 billion gallons in 2024, offsetting roughly 55 percent of its 8.1-billion-gallon consumption. Microsoft committed to becoming water-positive by 2030, replenishing more water than it consumes.

These improvements matter. But they're incremental gains in a context of exponential growth. The question is whether efficiency curves can keep pace with adoption curves.

Whether AI Becomes a Crisis Depends on Choices Made Now

The International Energy Agency estimates that data centers could consume 3 to 4 percent of global electricity by 2030, up from roughly 1.5 percent today. AI workloads represent the fastest-growing segment of that demand.

Whether this becomes a crisis or a manageable transition depends on three factors. First, renewable energy deployment must accelerate. Tech companies are signing power purchase agreements for solar and wind farms, but construction timelines lag behind AI adoption rates. Second, algorithmic efficiency must continue improving. Third, use cases matter. AI that optimizes electrical grids or accelerates renewable energy research could offset its own carbon footprint through systemic improvements elsewhere.

For product teams evaluating AI features, the energy equation has become part of the decision calculus. Caching frequently requested outputs reduces redundant computation. Choosing smaller models for simpler tasks makes engineering and environmental sense. Selecting data center regions with cleaner grids reduces carbon intensity.

For individual users, the calculus is different. The environmental impact of a single query is negligible. But collective usage patterns shape infrastructure development. The models that get used most heavily receive the most investment in optimization.

AI's energy consumption is neither insignificant nor apocalyptic. It's a design problem. The answers lie in cleaner grids, smarter architectures, and honest accounting about which applications justify their resource costs. Progress only counts if it lasts.

What is this about?

  • AI energy consumption
  • data center power
  • data center sustainability
  • machine learning optimization
  • water consumption tech
  • carbon-free computing
