Tech/Trends

AI now consumes 50% of data center power

Demand doubled in two years, rivaling UK's national grid by 2025

17 December 2025

Data Story

Nadia Bennett

Artificial intelligence's energy appetite has surged from roughly 10% to a projected 50% of global data center electricity in under two years. By the end of 2025, AI is expected to draw power equivalent to the United Kingdom's entire national demand. Infrastructure planners face unprecedented constraints as chip production doubles and grid capacity becomes the limiting factor for AI deployment.


Summary

  • AI's electricity consumption in 2024 matched the Netherlands' entire annual grid demand, projected to rival the UK's national consumption by 2025.
  • TSMC's AI chip production capacity more than doubled between 2023 and 2024, serving as a leading indicator of exponential energy demand growth for AI infrastructure.
  • Technical leaders must prioritize energy efficiency as potential policy interventions, such as mandatory energy disclosure and renewable energy requirements, move closer to adoption.

In 2024, artificial intelligence consumed electricity at a scale that rivals national grids. By the end of 2025, projections indicate AI could claim half of all power flowing through the world's data centers. The surge is forcing infrastructure planners, technology leaders, and engineers to treat energy capacity not as an environmental footnote, but as a first-order operational constraint.

The 10% to 50% Trajectory

AI's share of data center electricity is climbing steeply. In 2023, it consumed roughly 10% of total capacity; by 2024, that figure had doubled to 20%. Projections from technology energy consumption specialist Alex de Vries indicate it will hit 50% by the close of 2025.

Scale matters. In 2024, AI's energy appetite matched the Netherlands' entire annual consumption—approximately 120 terawatt-hours. The 2025 forecast suggests it will rival the United Kingdom's national demand, reaching approximately 23 gigawatts of continuous power draw.
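
Those two units describe the same scale in different terms: a continuous power draw in gigawatts, sustained for a year, converts to annual terawatt-hours. A quick back-of-envelope check, using the article's approximate figures:

```python
# Rough unit check: a continuous draw in gigawatts, sustained for a full
# year, converts to annual terawatt-hours of energy.
HOURS_PER_YEAR = 8_760  # 365 days * 24 hours

def gw_to_twh_per_year(gigawatts: float) -> float:
    """Convert a continuous power draw (GW) to annual energy (TWh)."""
    return gigawatts * HOURS_PER_YEAR / 1_000  # GWh -> TWh

# 23 GW of continuous demand works out to roughly 200 TWh per year,
# comfortably above the ~120 TWh Netherlands baseline cited for 2024.
print(round(gw_to_twh_per_year(23)))  # -> 201
```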

These figures emerge from a triangulation methodology: cross-referencing equipment specifications from semiconductor manufacturers, corporate energy disclosures where available, and third-party analytics. This multi-source approach addresses a persistent gap—corporations rarely publish granular energy data, making accurate forecasting difficult and accountability nearly impossible.

Production Capacity as Leading Indicator

Taiwan Semiconductor Manufacturing Company (TSMC) more than doubled its AI chip production capacity between 2023 and 2024. Since these chips power the training and inference workloads driving AI's expansion, their production volume serves as a reliable predictor of future energy demand.

This correlation matters because AI infrastructure scales exponentially, not linearly. Each generation of models requires more compute, more memory, more cooling—compounding the energy load with every deployment cycle. When chip production doubles, energy consumption follows.

The Supply Chain Signal

De Vries' methodology tracks TSMC's CoWoS (Chip-on-Wafer-on-Substrate) packaging capacity—a specialized process used for high-performance AI accelerators. Growth in this metric precedes energy demand by roughly six to nine months, offering infrastructure planners a forward-looking indicator.

Current data shows CoWoS capacity continuing to expand through 2025, suggesting energy demand will maintain its steep trajectory through at least the first half of 2026.

AI vs. National Grids: The Comparative Frame

Translating gigawatts into tangible scale requires comparison:

  • 2024 baseline: AI energy use equals the Netherlands' total annual electricity consumption
  • 2025 projection: Expected to match the United Kingdom's national demand
  • Growth rate: Consumption increasing faster than Bitcoin mining at its 2021 peak

Bitcoin mining—long criticized for its environmental footprint—consumes approximately 150 terawatt-hours annually. AI is on track to exceed that figure within months, yet operates with far less public scrutiny or regulatory oversight.

Infrastructure Constraints and Grid Stability

Energy demand at this velocity creates cascading challenges. Data centers are requesting grid connections that exceed local utility capacity, forcing infrastructure upgrades that can take years to complete.

In regions where AI companies are clustering—Northern Virginia, Dublin, Singapore—utilities are scrambling to add transmission capacity while managing aging equipment designed for slower load growth. In the United States, data center electricity use reached approximately 176 terawatt-hours in 2023, representing 4.4% of national consumption. Scenario analyses project this could rise to 325–580 terawatt-hours by 2028, depending on GPU deployment rates and operational assumptions.
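
As a back-of-envelope check on those figures, the 2023 numbers imply a U.S. national total of roughly 4,000 terawatt-hours, which puts the 2028 scenario range in context:

```python
# Back-of-envelope on the cited U.S. figures: 176 TWh was 4.4% of
# national consumption, implying a total of roughly 4,000 TWh in 2023.
dc_twh_2023 = 176
share_2023 = 0.044
national_twh = dc_twh_2023 / share_2023
print(round(national_twh))  # -> 4000

# Against a flat ~4,000 TWh baseline (an illustrative assumption only;
# total demand will itself grow), the 325-580 TWh scenario range for
# 2028 would correspond to roughly an 8-15% share.
for scenario_twh in (325, 580):
    print(f"{scenario_twh / national_twh:.3f}")
```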

The timing compounds the problem. Many developed economies are simultaneously electrifying transportation and heating while decommissioning fossil fuel plants. AI's demand surge risks crowding out renewable energy investments, delaying decarbonization goals as utilities prioritize immediate grid stability over long-term transition planning.

The Waitlist Reality

Companies are already encountering power allocation limits that delay AI projects, regardless of budget or technical readiness. In some markets, data center operators are implementing waitlists for high-density compute deployments. Energy capacity, not capital or technical expertise, is becoming the binding constraint.

The Transparency Gap

De Vries' research highlights a critical obstacle: corporate opacity. Most AI companies disclose aggregated energy use at the corporate level, if at all, making it impossible to track model-specific consumption or compare efficiency across platforms.

Without standardized reporting, regulators lack the data to set meaningful benchmarks, and customers can't make informed choices about which AI tools carry the heaviest environmental cost. All major studies—including analyses from the International Energy Agency and Lawrence Berkeley National Laboratory—note large uncertainty due to limited corporate disclosure of AI-specific energy use.

This opacity extends to operational details. Which applications consume the most—training or inference? How do different model architectures compare? What's the energy cost of a single query versus a batch process? These questions remain largely unanswered in public documentation.

Efficiency as Countervailing Force

Not all AI development follows the same energy trajectory. China's DeepSeek model demonstrates that architectural choices matter: it achieves comparable performance to Meta's Llama 3.1 while requiring significantly fewer computational resources.

This efficiency gap suggests optimization pathways exist—if companies prioritize them. Technical strategies for reducing AI energy consumption include:

  • Model pruning: Removing redundant parameters without sacrificing accuracy
  • Quantization: Using lower-precision arithmetic to reduce memory and compute requirements
  • Sparse architectures: Activating only relevant network sections per query
  • Efficient training techniques: Transfer learning and fine-tuning rather than training from scratch
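
Of these, quantization is the easiest to illustrate. The sketch below is a minimal, framework-agnostic example of symmetric int8 post-training quantization, not any specific vendor's implementation: weights are mapped to 8-bit integers plus a single float scale, cutting memory roughly 4x versus float32.

```python
import numpy as np

# Minimal sketch of symmetric int8 post-training quantization: each weight
# tensor becomes 8-bit integers plus one float32 scale factor.
def quantize_int8(weights: np.ndarray):
    scale = np.abs(weights).max() / 127.0  # map the largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
error = np.abs(dequantize(q, scale) - w).max()  # bounded by ~scale / 2
print(q.nbytes / w.nbytes)  # -> 0.25: int8 uses a quarter of the memory
```

Lower-precision arithmetic saves compute energy as well as memory, which is why it appears in most production serving stacks.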

Yet these optimizations remain secondary considerations in a competitive landscape where model capability—not energy efficiency—drives market positioning. Until energy cost becomes a visible metric in AI product comparisons, efficiency gains will likely remain marginal.

Operational Implications for Technical Leaders

For engineering leaders evaluating AI integration, energy consumption is shifting from an environmental concern to an operational constraint. Critical questions to consider:

  • What is the total cost of ownership when factoring in energy expenses over a system's lifecycle?
  • Does your data center infrastructure have capacity for AI workloads, or will upgrades be required?
  • Are you evaluating AI vendors based on model efficiency, or only on capability?
  • How will energy costs scale as you move from pilot projects to production deployment?
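
The first question lends itself to simple arithmetic. The sketch below estimates lifecycle energy cost for a single node; every input value is a hypothetical placeholder to be replaced with measured figures from your own deployment.

```python
# Illustrative lifecycle energy cost for one compute node. All figures
# below are placeholder assumptions, not vendor data.
def lifecycle_energy_cost(
    avg_draw_kw: float,      # average power draw, including cooling overhead
    utilization: float,      # fraction of time under load
    price_per_kwh: float,    # blended electricity price, $/kWh
    years: float,            # expected service life
) -> float:
    hours = years * 8_760
    return avg_draw_kw * utilization * hours * price_per_kwh

# e.g. a hypothetical 10 kW node at 70% utilization, $0.12/kWh, 5 years:
print(round(lifecycle_energy_cost(10, 0.70, 0.12, 5)))  # -> 36792
```

Even at these modest placeholder rates, energy rivals hardware cost over a multi-year horizon, which is why it belongs in total-cost-of-ownership models from the start.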

These aren't hypothetical considerations. Infrastructure teams are encountering power allocation limits that delay deployment regardless of technical readiness.

Policy and Accountability Mechanisms

Addressing AI's energy trajectory requires systemic responses beyond individual company optimization. Potential policy interventions include:

  • Mandatory energy disclosure: Requiring companies to report model-specific consumption metrics
  • Efficiency standards: Setting performance-per-watt benchmarks for commercial AI systems
  • Grid impact assessments: Evaluating regional capacity before approving large-scale AI data centers
  • Renewable energy requirements: Mandating that AI infrastructure sources power from clean generation

These measures face resistance from an industry that moves faster than regulatory cycles. But as AI energy consumption approaches the scale of national grids, treating it as critical infrastructure becomes unavoidable. U.S. Congressional Research Service analyses now cite these energy projections, indicating policy awareness is growing.

Immediate Actions for Technical Teams

While systemic change requires policy action, technical teams can take immediate steps:

  • Audit current AI usage: Identify which tools and workflows consume the most resources
  • Prioritize efficient alternatives: Compare energy profiles when selecting AI vendors or open-source models
  • Optimize query patterns: Batch requests, cache results, avoid redundant processing
  • Advocate for transparency: Request energy metrics from AI vendors and incorporate them into procurement criteria
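
The caching pattern can be sketched in a few lines; `cached_inference` and its call counter are hypothetical stand-ins for a real, billable model endpoint.

```python
from functools import lru_cache

# Sketch of the caching pattern: duplicate prompts never reach the
# (hypothetical) model endpoint, so repeated queries cost nothing extra.
CALLS = 0

@lru_cache(maxsize=1024)
def cached_inference(prompt: str) -> str:
    global CALLS
    CALLS += 1                      # stands in for one billable model call
    return f"response:{prompt}"     # placeholder for a real API response

prompts = ["status?", "status?", "summary", "status?"]
results = [cached_inference(p) for p in prompts]
print(CALLS)  # -> 2: four requests, but only two distinct model calls
```

Caching suits deterministic, repeated queries; for sampled or personalized outputs, batching requests into fewer, larger calls is the safer lever.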

For product teams, consider whether energy efficiency should be part of the value proposition when communicating AI features to end users. As environmental awareness grows, lower carbon footprint may become a meaningful product differentiator.

The Path Forward

AI's energy consumption isn't inherently unsustainable—it's a design challenge. Data centers themselves now deliver far more compute per watt than a decade ago, demonstrating that efficiency gains are achievable when prioritized.

The International Energy Agency's base case projects global data center electricity rising to approximately 945 terawatt-hours by 2030, with AI as the main driver. Current trajectories suggest energy demand is growing faster than efficiency improvements can offset, creating a widening gap between AI's computational ambitions and the infrastructure capacity to power them sustainably.

Closing that gap requires treating energy as a first-order constraint, not an externality to be managed later. For those building, deploying, or analyzing AI systems, the imperative is clear: watts are the new currency of artificial intelligence. Understanding their cost—and designing to minimize it—isn't just environmental responsibility. It's operational necessity in a world where the grid itself is becoming the limiting factor.

Topic

AI Energy Consumption Crisis


What is this about?

  • AI energy consumption
  • data center power
  • environmental sustainability
  • infrastructure constraints
  • energy efficiency

Tech/Trends

AI now consumes 50% of data center power

Demand doubled in two years, rivaling UK's national grid by 2025

17 December 2025

—

Data Story *

Nadia Bennett

banner

Artificial intelligence's energy appetite has exploded from 10% to 50% of global data center electricity in under two years. By 2025, AI will consume power equivalent to the United Kingdom's entire national demand. Infrastructure planners face unprecedented constraints as chip production doubles and grid capacity becomes the limiting factor for AI deployment.

telegram-cloud-photo-size-2-5307767060697910295-y

Summary:

  • AI's electricity consumption in 2024 matched the Netherlands' entire annual grid demand, projected to rival the UK's national consumption by 2025.
  • TSMC's AI chip production capacity more than doubled between 2023-2024, serving as a leading indicator of exponential energy demand growth for AI infrastructure.
  • Technical leaders must prioritize energy efficiency, with potential policy interventions like mandatory energy disclosure and renewable energy requirements becoming critical constraints.
banner

Artificial intelligence consumed electricity at a velocity that rivals national grids in 2024. By year's end, projections indicate AI could claim half of all power flowing through the world's data centers. The surge is forcing infrastructure planners, technology leaders, and engineers to treat energy capacity not as an environmental footnote, but as a first-order operational constraint.

The 20% to 50% Trajectory

AI's share of data center electricity doubled in under two years. In 2023, it consumed roughly 10% of total capacity. By 2024, that figure reached 20%. Projections from technology energy consumption specialist Alex de Vries indicate it will hit 50% by the close of 2025.

Scale matters. In 2024, AI's energy appetite matched the Netherlands' entire annual consumption—approximately 120 terawatt-hours. The 2025 forecast suggests it will rival the United Kingdom's national demand, reaching approximately 23 gigawatts.

These figures emerge from a triangulation methodology: cross-referencing equipment specifications from semiconductor manufacturers, corporate energy disclosures where available, and third-party analytics. This multi-source approach addresses a persistent gap—corporations rarely publish granular energy data, making accurate forecasting difficult and accountability nearly impossible.

Production Capacity as Leading Indicator

Taiwan Semiconductor Manufacturing Company (TSMC) more than doubled its AI chip production capacity between 2023 and 2024. Since these chips power the training and inference workloads driving AI's expansion, their production volume serves as a reliable predictor of future energy demand.

This correlation matters because AI infrastructure scales exponentially, not linearly. Each generation of models requires more compute, more memory, more cooling—compounding the energy load with every deployment cycle. When chip production doubles, energy consumption follows.

The Supply Chain Signal

De Vries' methodology tracks TSMC's CoWoS (Chip-on-Wafer-on-Substrate) packaging capacity—a specialized process used for high-performance AI accelerators. Growth in this metric precedes energy demand by roughly six to nine months, offering infrastructure planners a forward-looking indicator.

Current data shows CoWoS capacity continuing to expand through 2025, suggesting energy demand will maintain its steep trajectory through at least the first half of 2026.

AI vs. National Grids: The Comparative Frame

Translating gigawatts into tangible scale requires comparison:

  • 2024 baseline: AI energy use equals the Netherlands' total annual electricity consumption
  • 2025 projection: Expected to match the United Kingdom's national demand
  • Growth rate: Consumption increasing faster than Bitcoin mining at its 2021 peak

Bitcoin mining—long criticized for its environmental footprint—consumes approximately 150 terawatt-hours annually. AI is on track to exceed that figure within months, yet operates with far less public scrutiny or regulatory oversight.

Infrastructure Constraints and Grid Stability

Energy demand at this velocity creates cascading challenges. Data centers are requesting grid connections that exceed local utility capacity, forcing infrastructure upgrades that can take years to complete.

In regions where AI companies are clustering—Northern Virginia, Dublin, Singapore—utilities are scrambling to add transmission capacity while managing aging equipment designed for slower load growth. In the United States, data center electricity use reached approximately 176 terawatt-hours in 2023, representing 4.4% of national consumption. Scenario analyses project this could rise to 325–580 terawatt-hours by 2028, depending on GPU deployment rates and operational assumptions.

The timing compounds the problem. Many developed economies are simultaneously electrifying transportation and heating while decommissioning fossil fuel plants. AI's demand surge risks crowding out renewable energy investments, delaying decarbonization goals as utilities prioritize immediate grid stability over long-term transition planning.

The Waitlist Reality

Companies are already encountering power allocation limits that delay AI projects, regardless of budget or technical readiness. In some markets, data center operators are implementing waitlists for high-density compute deployments. Energy capacity, not capital or technical expertise, is becoming the binding constraint.

The Transparency Gap

De Vries' research highlights a critical obstacle: corporate opacity. Most AI companies disclose aggregated energy use at the corporate level, if at all, making it impossible to track model-specific consumption or compare efficiency across platforms.

Without standardized reporting, regulators lack the data to set meaningful benchmarks, and customers can't make informed choices about which AI tools carry the heaviest environmental cost. All major studies—including analyses from the International Energy Agency and Lawrence Berkeley National Laboratory—note large uncertainty due to limited corporate disclosure of AI-specific energy use.

This opacity extends to operational details. Which applications consume the most—training or inference? How do different model architectures compare? What's the energy cost of a single query versus a batch process? These questions remain largely unanswered in public documentation.

Efficiency as Countervailing Force

Not all AI development follows the same energy trajectory. China's DeepSeek model demonstrates that architectural choices matter: it achieves comparable performance to Meta's Llama 3.1 while requiring significantly fewer computational resources.

This efficiency gap suggests optimization pathways exist—if companies prioritize them. Technical strategies for reducing AI energy consumption include:

  • Model pruning: Removing redundant parameters without sacrificing accuracy
  • Quantization: Using lower-precision arithmetic to reduce memory and compute requirements
  • Sparse architectures: Activating only relevant network sections per query
  • Efficient training techniques: Transfer learning and fine-tuning rather than training from scratch

Yet these optimizations remain secondary considerations in a competitive landscape where model capability—not energy efficiency—drives market positioning. Until energy cost becomes a visible metric in AI product comparisons, efficiency gains will likely remain marginal.

Operational Implications for Technical Leaders

For engineering leaders evaluating AI integration, energy consumption is shifting from an environmental concern to an operational constraint. Critical questions to consider:

  • What is the total cost of ownership when factoring in energy expenses over a system's lifecycle?
  • Does your data center infrastructure have capacity for AI workloads, or will upgrades be required?
  • Are you evaluating AI vendors based on model efficiency, or only on capability?
  • How will energy costs scale as you move from pilot projects to production deployment?

These aren't hypothetical considerations. Infrastructure teams are encountering power allocation limits that delay deployment regardless of technical readiness.

Policy and Accountability Mechanisms

Addressing AI's energy trajectory requires systemic responses beyond individual company optimization. Potential policy interventions include:

  • Mandatory energy disclosure: Requiring companies to report model-specific consumption metrics
  • Efficiency standards: Setting performance-per-watt benchmarks for commercial AI systems
  • Grid impact assessments: Evaluating regional capacity before approving large-scale AI data centers
  • Renewable energy requirements: Mandating that AI infrastructure sources power from clean generation

These measures face resistance from an industry that moves faster than regulatory cycles. But as AI energy consumption approaches the scale of national grids, treating it as critical infrastructure becomes unavoidable. U.S. Congressional Research Service analyses now cite these energy projections, indicating policy awareness is growing.

Immediate Actions for Technical Teams

While systemic change requires policy action, technical teams can take immediate steps:

  • Audit current AI usage: Identify which tools and workflows consume the most resources
  • Prioritize efficient alternatives: Compare energy profiles when selecting AI vendors or open-source models
  • Optimize query patterns: Batch requests, cache results, avoid redundant processing
  • Advocate for transparency: Request energy metrics from AI vendors and incorporate them into procurement criteria

For product teams, consider whether energy efficiency should be part of the value proposition when communicating AI features to end users. As environmental awareness grows, lower carbon footprint may become a meaningful product differentiator.

The Path Forward

AI's energy consumption isn't inherently unsustainable—it's a design challenge. Data centers themselves now deliver far more compute per watt than a decade ago, demonstrating that efficiency gains are achievable when prioritized.

The International Energy Agency's base case projects global data center electricity rising to approximately 945 terawatt-hours by 2030, with AI as the main driver. Current trajectories suggest energy demand is growing faster than efficiency improvements can offset, creating a widening gap between AI's computational ambitions and the infrastructure capacity to power them sustainably.

Closing that gap requires treating energy as a first-order constraint, not an externality to be managed later. For those building, deploying, or analyzing AI systems, the imperative is clear: watts are the new currency of artificial intelligence. Understanding their cost—and designing to minimize it—isn't just environmental responsibility. It's operational necessity in a world where the grid itself is becoming the limiting factor.

Topic

AI Energy Consumption Crisis

212,000 Banking Jobs Face AI Elimination by 2030

2 January 2026

212,000 Banking Jobs Face AI Elimination by 2030

AI's scaling era is over. What comes next?

2 January 2026

What is this about?

  • AI energy consumption/
  • data center power/
  • environmental sustainability/
  • infrastructure constraints/
  • energy efficiency

Feed

    Roborock Saros Rover climbs stairs and vacuums

    Roborock's Saros Rover uses wheel-legs and real-time AI navigation to climb traditional, curved, and carpeted stairs while vacuuming each surface—a first for stair-climbing robots. Eufy and Dreame prototypes transport vacuums but don't clean during climbs. Expect pricing above $2,500 with release dates unconfirmed.

    2 days ago

    Instagram Will Mark Real Photos as Human-Made

    Instagram head Adam Mosseri announced fingerprinting technology to verify authentic human photos and videos instead of flagging AI-generated content. The shift comes as synthetic imagery saturates the platform, with AI posts expected to outnumber human content within months. Creators face new friction proving work is real.

    Instagram Will Mark Real Photos as Human-Made
    5 days ago
    How Peptides Actually Rebuild Your Skin

    How Peptides Actually Rebuild Your Skin

    6 days ago

    10 Biohacking Methods Ranked by Scientific Evidence

    We evaluated ten popular biohacking interventions against peer-reviewed research, prioritizing documented physiological effects, reproducibility, cost-benefit ratios, and real-world accessibility. Finnish sauna studies show 40% mortality reduction, light hygiene rivals prescription sleep aids for near-zero cost, and cold exposure boosts dopamine 250%—while some expensive gadgets deliver marginal returns.

    6 days ago
    ASUS Zenbook A14 Review: 2.18 Pounds That Change Everything

    ASUS Zenbook A14 Review: 2.18 Pounds That Change Everything

    6 days ago

    Norway hits 97.5% EV sales—diesels outnumbered

    Norway registered 172,232 battery-electrics in 2025—97.5% of all new passenger cars—and EVs now outnumber diesel in the total fleet for the first time. Tesla captured 19.1% market share, Chinese brands rose to 13.7%, and only 487 pure gasoline cars sold all year. The country proved eight years of consistent tax policy can flip an entire market.

    Norway hits 97.5% EV sales—diesels outnumbered
    2 January 2026

    OpenAI pivots to audio-first AI devices

    OpenAI merged engineering and research teams to develop audio models for a personal device expected early 2026. The move signals an industry shift from screens to voice interfaces. With Jony Ive on board and competitors launching AI rings, the race is on—but past failures like Humane's AI Pin show audio-first hardware remains high-risk.

    2 January 2026

    212,000 Banking Jobs Face AI Elimination by 2030

    Morgan Stanley projects 212,000 banking roles will disappear across Europe by 2030 as AI absorbs compliance, risk modeling, and back-office work. Major lenders including ABN AMRO and Société Générale plan deep cuts, while U.S. banks from Goldman Sachs to Wells Fargo follow suit. The shift raises questions about institutional memory and training pipelines.

    2 January 2026

    Clicks launches distraction-free Android 16 phone and universal magnetic keyboard

    Clicks Technology unveiled two devices Thursday: a BlackBerry-style Communicator smartphone running Android 16 that strips out Instagram, TikTok, and games while keeping work apps like Gmail and Slack, and a slide-out Power Keyboard that magnetically attaches to phones, tablets, and TVs. Pre-orders open today with spring 2026 shipping for both products.

    2 January 2026

    Tesla Deliveries Drop 9% in 2025 as BYD Takes Global EV Crown

    Tesla delivered 1,636,129 vehicles in 2025, down 9% year-over-year and marking the automaker's second consecutive annual decline. BYD claimed global leadership with 2,256,714 battery-electric units while Tesla's Q4 deliveries of 418,227 vehicles fell 15.6% despite price cuts and zero-percent financing. The $7,500 federal tax credit expired January 1.

    2 January 2026

    AI's scaling era is over. What comes next?

    2 January 2026

    Samsung Galaxy S26 Ultra—same specs, new look

    Samsung's Galaxy S26 Ultra keeps the S25's camera hardware, 5,000mAh battery, and 45W charging while Chinese rivals push 100W+ solutions. The leaked prototype shows a fresh camera module from the Z Fold 7, 6.9-inch display hitting 2,600 nits, Snapdragon 8 Elite Gen 5, and up to 16GB RAM. Launch delayed to February 25, breaking tradition and signaling supply issues or strategic repositioning.

    1 January 2026
    Ruby 4.0 bets on method-level optimization

    31 December 2025

    SpaceX launches Twilight rideshare for dawn-dusk orbits

    SpaceX debuts specialized rideshare service for sun-synchronous terminator orbits, where satellites stay on the day-night boundary with continuous solar power. The Pandora/Twilight mission democratizes access to dawn-dusk orbits previously requiring dedicated launches, serving SAR, IoT, and communications markets with proven demand.

    25 December 2025

    Gmail now lets you change your address without a new account

    Google is gradually rolling out the ability to change Gmail addresses without creating new accounts, addressing years of user requests to escape outdated usernames. Users can check availability through account settings, but face critical limitations: a maximum of roughly three username changes per account lifetime, and deleted addresses can't be reused for 12 months.

    25 December 2025

    Max Hodak: From Neuralink Co-Founder to Networked Consciousness Pioneer

    Max Hodak co-founded Neuralink at 28, helping compress decade-long timelines into years. Now he's building Science, a venture that replaces metal electrodes with living neurons to overcome the brain's 10-bit-per-second output bottleneck. His goal isn't just treating paralysis—it's networking consciousness itself, making the boundary of the skull negotiable and human experience shareable. One decade remains before the phase transition becomes irreversible.

    24 December 2025
    Jet Lag Reset in 48 Hours: The Science-Backed Protocol That Works

    24 December 2025
    Latent-X2 claims zero-shot antibody design. Does it work?

    22 December 2025

    Unitree's robot app store is live — but the robots can't think yet

    22 December 2025
    Xiaomi 17 Ultra: When a One-Inch Sensor Meets a 6,800 mAh Battery

    19 December 2025