Science/Tech

AI now consumes 50% of data center power

Demand doubled in two years, rivaling UK's national grid by 2025

17 December 2025

Data Story

Nadia Bennett
Artificial intelligence's energy appetite has exploded from 10% to 50% of global data center electricity in under two years. By 2025, AI will consume power equivalent to the United Kingdom's entire national demand. Infrastructure planners face unprecedented constraints as chip production doubles and grid capacity becomes the limiting factor for AI deployment.

Summary:

  • AI's electricity consumption in 2024 matched the Netherlands' entire annual grid demand and is projected to rival the UK's national consumption by 2025.
  • TSMC's AI chip production capacity more than doubled between 2023 and 2024, serving as a leading indicator of exponential energy demand growth for AI infrastructure.
  • Technical leaders must prioritize energy efficiency as policy interventions such as mandatory energy disclosure and renewable energy requirements become binding constraints.

In 2024, artificial intelligence consumed electricity at a pace that rivals national grids. Projections indicate that by the end of 2025, AI could claim half of all power flowing through the world's data centers. The surge is forcing infrastructure planners, technology leaders, and engineers to treat energy capacity not as an environmental footnote but as a first-order operational constraint.

The 10% to 50% Trajectory

AI's share of data center electricity doubled in under two years. In 2023, it consumed roughly 10% of total capacity. By 2024, that figure reached 20%. Projections from technology energy consumption specialist Alex de Vries indicate it will hit 50% by the close of 2025.

Scale matters. In 2024, AI's energy appetite matched the Netherlands' entire annual consumption of approximately 120 terawatt-hours. The 2025 forecast suggests it will rival the United Kingdom's national demand, an average continuous draw of approximately 23 gigawatts.
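
These comparisons mix two units: terawatt-hours measure energy over a year, gigawatts measure instantaneous power. A quick conversion using only the figures above shows the two framings are consistent (a sketch assuming 8,760 hours per year, ignoring leap years):

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def gw_to_twh_per_year(gw: float) -> float:
    """Average continuous draw in GW -> energy consumed per year in TWh."""
    return gw * HOURS_PER_YEAR / 1000.0

def twh_per_year_to_gw(twh: float) -> float:
    """Annual energy in TWh -> implied average continuous draw in GW."""
    return twh * 1000.0 / HOURS_PER_YEAR

print(round(gw_to_twh_per_year(23), 1))   # 23 GW sustained ~ 201.5 TWh/yr
print(round(twh_per_year_to_gw(120), 1))  # 120 TWh/yr ~ 13.7 GW average
```

In other words, moving from the Netherlands-scale 2024 figure to the UK-scale 2025 projection implies average draw rising from roughly 14 to 23 gigawatts.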

These figures emerge from a triangulation methodology: cross-referencing equipment specifications from semiconductor manufacturers, corporate energy disclosures where available, and third-party analytics. This multi-source approach addresses a persistent gap—corporations rarely publish granular energy data, making accurate forecasting difficult and accountability nearly impossible.

Production Capacity as Leading Indicator

Taiwan Semiconductor Manufacturing Company (TSMC) more than doubled its AI chip production capacity between 2023 and 2024. Since these chips power the training and inference workloads driving AI's expansion, their production volume serves as a reliable predictor of future energy demand.

This correlation matters because AI infrastructure scales exponentially, not linearly. Each generation of models requires more compute, more memory, more cooling—compounding the energy load with every deployment cycle. When chip production doubles, energy consumption follows.

The Supply Chain Signal

De Vries' methodology tracks TSMC's CoWoS (Chip-on-Wafer-on-Substrate) packaging capacity—a specialized process used for high-performance AI accelerators. Growth in this metric precedes energy demand by roughly six to nine months, offering infrastructure planners a forward-looking indicator.

Current data shows CoWoS capacity continuing to expand through 2025, suggesting energy demand will maintain its steep trajectory through at least the first half of 2026.

AI vs. National Grids: The Comparative Frame

Translating gigawatts into tangible scale requires comparison:

  • 2024 baseline: AI energy use equals the Netherlands' total annual electricity consumption
  • 2025 projection: Expected to match the United Kingdom's national demand
  • Growth rate: Consumption increasing faster than Bitcoin mining at its 2021 peak

Bitcoin mining—long criticized for its environmental footprint—consumes approximately 150 terawatt-hours annually. AI is on track to exceed that figure within months, yet operates with far less public scrutiny or regulatory oversight.

Infrastructure Constraints and Grid Stability

Energy demand at this velocity creates cascading challenges. Data centers are requesting grid connections that exceed local utility capacity, forcing infrastructure upgrades that can take years to complete.

In regions where AI companies are clustering—Northern Virginia, Dublin, Singapore—utilities are scrambling to add transmission capacity while managing aging equipment designed for slower load growth. In the United States, data center electricity use reached approximately 176 terawatt-hours in 2023, representing 4.4% of national consumption. Scenario analyses project this could rise to 325–580 terawatt-hours by 2028, depending on GPU deployment rates and operational assumptions.
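
The 2028 scenario range implies a wide band of growth rates. A back-of-envelope compound-growth calculation from the figures above (the 176 TWh baseline and the 325 to 580 TWh range come from the text; the arithmetic is only illustration):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by growing start -> end over `years`."""
    return (end / start) ** (1.0 / years) - 1.0

BASE_2023 = 176.0  # TWh, US data-center electricity use in 2023
for label, twh_2028 in [("low scenario", 325.0), ("high scenario", 580.0)]:
    rate = cagr(BASE_2023, twh_2028, years=5)
    print(f"{label}: {100 * rate:.1f}% per year")  # roughly 13% and 27%
```

Even the low scenario implies sustained demand growth well above the few percent per year that utility planners have historically assumed.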

The timing compounds the problem. Many developed economies are simultaneously electrifying transportation and heating while decommissioning fossil fuel plants. AI's demand surge risks crowding out renewable energy investments, delaying decarbonization goals as utilities prioritize immediate grid stability over long-term transition planning.

The Waitlist Reality

Companies are already encountering power allocation limits that delay AI projects, regardless of budget or technical readiness. In some markets, data center operators are implementing waitlists for high-density compute deployments. Energy capacity, not capital or technical expertise, is becoming the binding constraint.

The Transparency Gap

De Vries' research highlights a critical obstacle: corporate opacity. Most AI companies disclose aggregated energy use at the corporate level, if at all, making it impossible to track model-specific consumption or compare efficiency across platforms.

Without standardized reporting, regulators lack the data to set meaningful benchmarks, and customers can't make informed choices about which AI tools carry the heaviest environmental cost. Major analyses, including those from the International Energy Agency and Lawrence Berkeley National Laboratory, note large uncertainty due to limited corporate disclosure of AI-specific energy use.

This opacity extends to operational details. Which applications consume the most—training or inference? How do different model architectures compare? What's the energy cost of a single query versus a batch process? These questions remain largely unanswered in public documentation.

Efficiency as Countervailing Force

Not all AI development follows the same energy trajectory. China's DeepSeek model demonstrates that architectural choices matter: it achieves comparable performance to Meta's Llama 3.1 while requiring significantly fewer computational resources.

This efficiency gap suggests optimization pathways exist—if companies prioritize them. Technical strategies for reducing AI energy consumption include:

  • Model pruning: Removing redundant parameters without sacrificing accuracy
  • Quantization: Using lower-precision arithmetic to reduce memory and compute requirements
  • Sparse architectures: Activating only relevant network sections per query
  • Efficient training techniques: Transfer learning and fine-tuning rather than training from scratch
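
Of these, quantization is the easiest to illustrate. The sketch below shows symmetric per-tensor int8 quantization in plain Python; it is a toy illustration of the general technique, not any particular framework's implementation. Storing weights in 8 bits instead of 32 shrinks memory, and the energy spent moving it, roughly fourfold, at the cost of a small, bounded rounding error:

```python
import random

def quantize_int8(weights):
    """Map floats to integers in [-127, 127] using one shared scale.

    8-bit storage replaces 32-bit floats, shrinking weight memory ~4x;
    less data moved per inference means less energy per inference.
    """
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [max(-127, min(127, round(w / scale))) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the int8 representation."""
    return [q * scale for q in quantized]

random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1000)]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

max_err = max(abs(w - r) for w, r in zip(weights, restored))
print(max_err <= scale / 2 + 1e-12)  # True: error bounded by half a quantization step
```

Production systems typically quantize per channel with calibration data, but the energy logic is the same: fewer bits per parameter means fewer joules per token.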

Yet these optimizations remain secondary considerations in a competitive landscape where model capability—not energy efficiency—drives market positioning. Until energy cost becomes a visible metric in AI product comparisons, efficiency gains will likely remain marginal.

Operational Implications for Technical Leaders

For engineering leaders evaluating AI integration, energy consumption is shifting from an environmental concern to an operational constraint. Critical questions to consider:

  • What is the total cost of ownership when factoring in energy expenses over a system's lifecycle?
  • Does your data center infrastructure have capacity for AI workloads, or will upgrades be required?
  • Are you evaluating AI vendors based on model efficiency, or only on capability?
  • How will energy costs scale as you move from pilot projects to production deployment?
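
The first question lends itself to simple arithmetic. A rough lifecycle-cost sketch follows; every number in it (server power, PUE, electricity price, utilization, lifetime) is a hypothetical placeholder, not a figure from this article:

```python
def lifetime_energy_cost(it_load_kw, pue, price_per_kwh, years, utilization):
    """Rough electricity cost of running a deployment over its lifetime.

    it_load_kw:  average IT power draw of the hardware
    pue:         data-center overhead multiplier (cooling, power delivery)
    utilization: fraction of time the hardware runs at that load
    """
    hours = years * 8760
    return it_load_kw * utilization * pue * hours * price_per_kwh

# Hypothetical 8-GPU server: 6 kW IT load, PUE 1.3, $0.10/kWh, 4 years, 70% busy
cost = lifetime_energy_cost(6.0, pue=1.3, price_per_kwh=0.10, years=4, utilization=0.7)
print(f"${cost:,.0f}")  # roughly $19,000 of electricity over the lifecycle
```

Under these illustrative assumptions, energy alone approaches the purchase price of mid-range server hardware, which is exactly why it belongs in total-cost-of-ownership models.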

These aren't hypothetical considerations. Infrastructure teams are encountering power allocation limits that delay deployment regardless of technical readiness.

Policy and Accountability Mechanisms

Addressing AI's energy trajectory requires systemic responses beyond individual company optimization. Potential policy interventions include:

  • Mandatory energy disclosure: Requiring companies to report model-specific consumption metrics
  • Efficiency standards: Setting performance-per-watt benchmarks for commercial AI systems
  • Grid impact assessments: Evaluating regional capacity before approving large-scale AI data centers
  • Renewable energy requirements: Mandating that AI infrastructure sources power from clean generation

These measures face resistance from an industry that moves faster than regulatory cycles. But as AI energy consumption approaches the scale of national grids, treating it as critical infrastructure becomes unavoidable. U.S. Congressional Research Service analyses now cite these energy projections, indicating policy awareness is growing.

Immediate Actions for Technical Teams

While systemic change requires policy action, technical teams can take immediate steps:

  • Audit current AI usage: Identify which tools and workflows consume the most resources
  • Prioritize efficient alternatives: Compare energy profiles when selecting AI vendors or open-source models
  • Optimize query patterns: Batch requests, cache results, avoid redundant processing
  • Advocate for transparency: Request energy metrics from AI vendors and incorporate them into procurement criteria
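
The two middle items, batching and caching, need nothing beyond the standard library. A minimal sketch, where `expensive_model_call` is a stub standing in for whatever inference API a team actually uses:

```python
from functools import lru_cache

CALLS = 0

def expensive_model_call(prompt: str) -> str:
    """Stub standing in for a real (energy-hungry) inference request."""
    global CALLS
    CALLS += 1
    return prompt.upper()

@lru_cache(maxsize=4096)
def cached_query(prompt: str) -> str:
    # Identical prompts hit the cache instead of re-running inference.
    return expensive_model_call(prompt)

def batched(items, batch_size=32):
    # Group requests so fixed per-call overhead is amortized across a batch.
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

for prompt in ["summarize Q3", "summarize Q3", "draft intro", "summarize Q3"]:
    cached_query(prompt)
print(CALLS)  # 2: four requests, but only two distinct inferences ran
```

Neither trick changes model quality; both simply stop paying the energy cost of redundant or fragmented work.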

For product teams, consider whether energy efficiency should be part of the value proposition when communicating AI features to end users. As environmental awareness grows, lower carbon footprint may become a meaningful product differentiator.

The Path Forward

AI's energy consumption isn't inherently unsustainable—it's a design challenge. Data centers themselves now deliver far more compute per watt than a decade ago, demonstrating that efficiency gains are achievable when prioritized.

The International Energy Agency's base case projects global data center electricity rising to approximately 945 terawatt-hours by 2030, with AI as the main driver. Current trajectories suggest energy demand is growing faster than efficiency improvements can offset, creating a widening gap between AI's computational ambitions and the infrastructure capacity to power them sustainably.

Closing that gap requires treating energy as a first-order constraint, not an externality to be managed later. For those building, deploying, or analyzing AI systems, the imperative is clear: watts are the new currency of artificial intelligence. Understanding their cost—and designing to minimize it—isn't just environmental responsibility. It's operational necessity in a world where the grid itself is becoming the limiting factor.

