© 2026 Wanture. All rights reserved.

Tech/Trends
AI's Energy Cost: What Every Query Really Consumes

From training to inference, how artificial intelligence became one of tech's most power-hungry innovations

11 February 2026

Explainer

Tasha Greene

Every ChatGPT query uses 10 times more energy than a Google search. Training GPT-3 consumed enough electricity to power 120 American homes for a year. As AI adoption explodes, data centers face mounting pressure on water, electricity grids, and carbon emissions. This explainer breaks down the hidden infrastructure costs behind each interaction.


Summary:

  • Training GPT‑3 burned ~1,300 MWh—enough to power 120 U.S. homes for a year—and evaporated ~185,000 gal of water, total footprint ≈1.4 M gal.
  • Each ChatGPT reply uses ~0.34 Wh (≈10× a Google search). With 3.5 billion weekly queries, AI’s energy and water use surge dramatically.
  • Cooling and power distribution consume ~40% of data‑center energy; U.S. centers used ~17 billion gal of water in 2023. AI could push data‑center electricity to 6.7–12% of U.S. use by 2028.

When ChatGPT answers your question about weekend dinner recipes, somewhere in a data center, hundreds of specialized chips light up. They process your request in milliseconds, then cool down and wait for the next query. That single moment of computation costs energy. Multiply it by millions of users asking millions of questions, and you begin to see why artificial intelligence has become one of the tech industry's most power-hungry innovations.

The question isn't whether AI consumes energy. It does. The real question is whether that consumption will outpace our ability to make it cleaner and more efficient.

Training Consumes Energy at City Scale

Building a large language model requires an enormous one-time energy investment. Training GPT-3, for example, consumed about 1,287 megawatt-hours of electricity. That's roughly equivalent to powering 120 average American homes for an entire year, compressed into several weeks of continuous computation.
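As a sanity check, the comparison follows directly from the numbers (the per-home figure is an assumption based on the commonly cited U.S. average of roughly 10,700 kWh per household per year; the exact value varies by source):

```python
# Back-of-envelope check: 1,287 MWh of training energy vs. annual
# household consumption. The per-home figure is an assumed average.
TRAINING_KWH = 1_287 * 1_000       # GPT-3 training run, converted to kWh
AVG_HOME_KWH_PER_YEAR = 10_700     # assumed average U.S. household use

homes_for_a_year = TRAINING_KWH / AVG_HOME_KWH_PER_YEAR
print(f"~{homes_for_a_year:.0f} homes powered for one year")  # ~120
```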

The process works like this: thousands of graphics processing units (GPUs) run simultaneously, analyzing patterns across billions of text examples. Each GPU generates heat while crunching numbers. The system learns language structure, context, and relationships between concepts. This happens 24 hours a day until the model reaches acceptable performance levels.

Training happens once per model. GPT-4 required more energy than GPT-3. Future models will likely require more still, unless efficiency improvements outpace scale increases. The data centers running these training operations draw power comparable to small industrial facilities during peak training periods.

Water consumption adds another invisible cost. GPT-3's training evaporated around 185,000 gallons of water on-site for cooling. When you include the water used to generate the electricity that powered the training, the total footprint reaches about 1.4 million gallons—enough to fill more than two Olympic swimming pools.
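The pool comparison also checks out, assuming the standard minimum Olympic pool volume of about 2,500 cubic meters (roughly 660,000 gallons):

```python
# Check the Olympic-pool comparison. The pool volume is the commonly
# cited minimum for a 50 m x 25 m x 2 m pool (~2,500 cubic meters).
TOTAL_FOOTPRINT_GALLONS = 1_400_000
OLYMPIC_POOL_GALLONS = 660_000     # assumed pool volume in gallons

pools = TOTAL_FOOTPRINT_GALLONS / OLYMPIC_POOL_GALLONS
print(f"~{pools:.1f} Olympic pools")  # ~2.1
```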

Every Query Triggers Invisible Computations

After training comes inference, the process of actually answering user questions. This is where scale creates ongoing energy costs. A single ChatGPT query consumes around 0.34 watt-hours of electricity. That's roughly 10 times more than a traditional Google search, which uses about 0.03 watt-hours.

That difference seems trivial until you consider volume. ChatGPT handled over 100 million weekly active users by early 2024. If each user makes just five queries per day, that's 3.5 billion queries weekly. The cumulative energy cost grows with every new user and every additional question.
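Scaling the per-query figure by that volume makes the point concrete (the five-queries-per-day average is the article's assumption, not a measured figure):

```python
# Scale the per-query energy estimate to fleet level.
USERS = 100_000_000              # weekly active users, early 2024
QUERIES_PER_USER_PER_DAY = 5     # assumed average from the text
WH_PER_QUERY = 0.34              # per-query energy estimate

weekly_queries = USERS * QUERIES_PER_USER_PER_DAY * 7
weekly_gwh = weekly_queries * WH_PER_QUERY / 1e9
print(f"{weekly_queries/1e9:.1f}B queries/week -> {weekly_gwh:.2f} GWh/week")
```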

Complexity matters too. Generating a 500-word essay requires more computation than answering a simple factual question. Image generation through models like DALL-E or Midjourney demands even more processing power. The energy cost scales with output length and complexity.

Each query also consumes water. A single ChatGPT interaction uses about 0.01 fluid ounces for cooling infrastructure. Individually negligible, but at billions of queries per week, the total adds up quickly.
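Aggregating the per-query water figure over the 3.5 billion weekly queries estimated above shows how "individually negligible" compounds:

```python
# Aggregate the per-query water figure over weekly query volume.
FL_OZ_PER_QUERY = 0.01           # per-interaction cooling water
WEEKLY_QUERIES = 3_500_000_000   # from the usage estimate above
FL_OZ_PER_GALLON = 128

gallons_per_week = WEEKLY_QUERIES * FL_OZ_PER_QUERY / FL_OZ_PER_GALLON
gallons_per_year = gallons_per_week * 52
print(f"~{gallons_per_week:,.0f} gal/week, ~{gallons_per_year/1e6:.1f}M gal/year")
```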

Cooling Systems Drive Hidden Infrastructure Costs

Data centers don't just run AI models. They fight heat. Modern GPUs can exceed 82 degrees Celsius (about 180 degrees Fahrenheit) under full load. Pack thousands of them into a single facility and you create an environment that requires industrial-scale cooling systems.

In temperate climates, data centers use outside air for cooling when possible. In warmer regions like Arizona or Texas, massive air conditioning systems run continuously. Google's data centers, which serve AI workloads among other services, dedicate around 40 percent of their total energy consumption to cooling and power distribution rather than computation itself.
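That 40 percent figure can be restated in the industry's standard PUE metric (power usage effectiveness: total facility energy divided by IT-equipment energy). This is just a unit conversion of the article's number, not a reported PUE; hyperscalers typically publish fleet PUEs closer to 1.1 because the 40 percent here bundles power distribution with cooling:

```python
# Convert "40% of energy goes to cooling and power distribution"
# into an implied PUE: total energy / IT-equipment energy.
OVERHEAD_FRACTION = 0.40         # figure from the text

it_fraction = 1 - OVERHEAD_FRACTION
pue = 1 / it_fraction
print(f"Implied PUE: {pue:.2f}")  # 1.67
```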

Water consumption operates on a staggering scale. U.S. data centers consumed about 17 billion gallons of water directly in 2023. When you include the indirect water used to generate the electricity that powers these facilities, the total reaches roughly 211 billion gallons. Google alone reported 8.1 billion gallons of water use across data centers and offices in 2024.

Evaporative cooling systems, common in hot climates, pull millions of gallons annually per facility. As AI workloads expand, so does the infrastructure needed to keep them from overheating. Microsoft's data centers consumed about 2.07 billion gallons of water in fiscal year 2023 across all operations.

Carbon Emissions and Grid Pressure Mount

A query processed in Iceland carries a different environmental cost than one processed in West Virginia. Iceland's data centers run almost entirely on geothermal and hydroelectric power. West Virginia's grid relies heavily on coal.

Major AI companies recognize this disparity. Google reports that 64 percent of its global data center energy came from carbon-free sources in 2023. Microsoft committed to being carbon negative by 2030. Meta aims for 100 percent renewable energy across all operations.

The challenge is matching renewable energy supply with actual usage patterns. Solar panels generate power during the day, but AI queries happen around the clock. Battery storage helps, but at current scale, most data centers still draw from mixed grids that include fossil fuel generation during peak demand periods.

U.S. data centers consumed 176 terawatt-hours in 2023, representing about 4.4 percent of U.S. electricity. Globally, data centers used roughly 415 terawatt-hours in 2024, about 1.5 percent of global electricity. AI servers accounted for 24 percent of server electricity demand. Projections suggest U.S. data centers could consume between 6.7 and 12 percent of U.S. electricity by 2028, depending on AI adoption rates.
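The 2023 figures imply a total U.S. electricity base, which lets the 2028 percentage projections be translated into absolute demand (holding total U.S. consumption flat, a simplifying assumption):

```python
# Derive total U.S. electricity from the 2023 data-center figures,
# then apply the projected 2028 share range.
DC_TWH_2023 = 176                # data-center consumption, TWh
DC_SHARE_2023 = 0.044            # share of U.S. electricity

us_total_twh = DC_TWH_2023 / DC_SHARE_2023   # ~4,000 TWh implied total
low_2028 = 0.067 * us_total_twh              # ~268 TWh
high_2028 = 0.12 * us_total_twh              # ~480 TWh
print(f"2028 projection: {low_2028:.0f}-{high_2028:.0f} TWh")
```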

Efficiency Gains Emerge From Optimization

AI models are becoming more efficient per unit of performance. GPT-3 required 175 billion parameters to achieve its capabilities. Newer architectures achieve similar results with fewer parameters through techniques like mixture of experts, which activate only relevant parts of the model for each query rather than the entire network.

Quantization reduces the precision of calculations without significantly degrading output quality, cutting energy consumption by 30 to 50 percent in some implementations. Specialized AI chips from companies like Google (TPUs) and startups like Cerebras deliver better performance per watt than general-purpose GPUs.
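A toy illustration of the idea behind quantization: store weights as 8-bit integers plus one scale factor instead of 32-bit floats, cutting memory traffic (a major energy cost) by roughly 4x. Production systems use per-channel scales and calibration data; this sketch shows only the core round-trip:

```python
# Toy symmetric int8 quantization: map floats into [-127, 127] with a
# single scale factor, then reconstruct approximate values.
def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127   # largest weight -> 127
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    return [q * scale for q in quantized]

weights = [0.42, -1.27, 0.05, 0.9]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
print(q)       # small integers in [-127, 127]
print(approx)  # close to the original weights
```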

Edge computing shifts some AI processing to devices themselves. Your phone can now run smaller language models locally for tasks like predictive text or voice recognition. This reduces data center load, though it transfers energy consumption to billions of individual devices with smaller but cumulative impact.

Companies are also investing in mitigation. Google's water stewardship projects supplied around 4.5 billion gallons in 2024, offsetting roughly 55 percent of its 8.1 billion gallons of consumption. Microsoft committed to becoming water-positive by 2030, replenishing more water than it consumes.

These improvements matter. But they're incremental gains in a context of exponential growth. The question is whether efficiency curves can keep pace with adoption curves.

Whether AI Becomes a Crisis Depends on Choices Made Now

The International Energy Agency estimates that data centers could consume 3 to 4 percent of global electricity by 2030, up from roughly 1.5 percent today. AI workloads represent the fastest-growing segment of that demand.

Whether this becomes a crisis or a manageable transition depends on three factors. First, renewable energy deployment must accelerate. Tech companies are signing power purchase agreements for solar and wind farms, but construction timelines lag behind AI adoption rates. Second, algorithmic efficiency must continue improving. Third, use cases matter. AI that optimizes electrical grids or accelerates renewable energy research could offset its own carbon footprint through systemic improvements elsewhere.

For product teams evaluating AI features, the energy equation has become part of the decision calculus. Caching frequently requested outputs reduces redundant computation. Choosing smaller models for simpler tasks makes engineering and environmental sense. Selecting data center regions with cleaner grids reduces carbon intensity.
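The caching point can be sketched with a memoized wrapper around an inference call; `call_model` here is a hypothetical stand-in, not a real API:

```python
from functools import lru_cache

@lru_cache(maxsize=10_000)
def call_model(prompt: str) -> str:
    # Hypothetical stand-in for an expensive, energy-hungry inference call.
    return f"answer to: {prompt}"

call_model("easy weeknight dinner ideas")  # computed once
call_model("easy weeknight dinner ideas")  # served from cache, no recompute
print(call_model.cache_info())  # hits=1, misses=1
```

Real deployments cache at the service layer (normalized prompts, semantic similarity, TTLs), but the principle is the same: identical requests should not trigger identical computation.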

For individual users, the calculus is different. The environmental impact of a single query is negligible. But collective usage patterns shape infrastructure development. The models that get used most heavily receive the most investment in optimization.

AI's energy consumption is neither insignificant nor apocalyptic. It's a design problem. The answers lie in cleaner grids, smarter architectures, and honest accounting about which applications justify their resource costs. Progress only counts if it lasts.

What is this about?

  • Explainer
  • Tasha Greene
  • Tech
  • Trends
  • AI energy consumption
  • data center power
  • data center sustainability
  • machine learning optimization
  • water consumption tech
  • carbon-free computing
