
DF Direct: our weekly show celebrates 200 episodes


DF Direct, our weekly news and Q+A show, recently celebrated its bicentennial - in the sense of reaching 200 episodes, not marking a 200-year anniversary! All four not-on-holiday members of Digital Foundry assembled for the occasion, sharing stories of those early episodes and other highlights from the show's nearly four-year run thus far.

As part of the on-holiday contingent, I had a lot of fun watching the show back - I heartily recommend viewing at least that initial segment if you're a fan of the channel, as it contains some nice reminiscences from Rich, John, Alex and Oliver.

With five full-time members of Digital Foundry testing, scripting, recording and editing their own videos - Rich, Tom, John, Alex and Oliver - we only have the capability to cover around three or four topics a week, which just isn't enough when we have to contend with the volume of game releases and updates. Having the Direct means that those smaller topics can be tested by one or two members of the team over the course of minutes or hours, then reported off-the-cuff, enabling much broader coverage than we'd otherwise manage.

The Direct is also one of my favourite ways to maintain a dialogue with our community, in the form of the questions submitted for each episode. Like the news portion of the show, the questions-and-answers segment has gradually grown over the years, and our community continues to step up its game both in terms of insightful questions and comic memery. Everyone on the team really looks forward to selecting questions for the week's show, and there are often many more slots on the docket for questions than there are news topics.

The questions on this week's Direct are, as always, high-quality ones. I particularly liked Michael Giles' enquiry into the establishment of a DF Latency Rating, which he proposes could be based on millisecond thresholds - thereby giving viewers an easy way to know whether the experience delivered by various levels of frame generation had good, acceptable or unacceptable levels of input latency.

0:02:07 News 1: DF Direct celebrates episode 200!

0:32:01 News 2: Nintendo sets Switch 2 price expectations.

0:46:17 News 3: Next Battlefield shown in brief teaser.

0:55:45 News 4: Monster Hunter Wilds beta, benchmark tested.

1:09:05 News 5: RTX 5080 overclocking tested.

1:26:52 News 6: Epic admits Unreal Engine has a #StutterStruggle problem.

1:34:17 News 7: Star Wars Outlaws updated with new PSSR version.

1:41:07 Supporter Q1: Could you test the 5090 with no AI elements vs. the 5080 with all AI elements?

1:47:23 Supporter Q2: Could DF establish a latency rating system?

1:54:41 Supporter Q3: Would increasing resolution and AA quality be enough for Nintendo Switch 2 games?

1:57:53 Supporter Q4: Is a Bloodborne remaster imminent?

2:03:32 Supporter Q5: Could 1080p actually be preferable to 4K for Switch 2 games?

It's an interesting thought, and perhaps has an equally interesting answer. This concept and others like it have been suggested a few times as a way of intuitively classifying varying input latency at equal frame-rates, and there's certainly merit in that. However, the problem facing such a system is that acceptable levels of input latency vary quite substantially depending on the input method, game and even person. You may find that an action-adventure game played on a controller is perfectly acceptable latency-wise with 4x frame generation, but a mouse-and-keyboard shooter is intolerable with even regular 2x frame gen.
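
To make the idea concrete, here's a minimal sketch of what such a rating could look like in code, with separate thresholds per input method. Every threshold value here is invented for illustration; settling on real numbers is precisely the hard part.

```python
# Hypothetical sketch of a "DF Latency Rating": classify measured
# button-to-photon latency (in ms) against per-input-method thresholds.
# All threshold values are invented purely for illustration.

THRESHOLDS = {
    # input method: (good_up_to, acceptable_up_to) in milliseconds
    "controller": (80, 130),
    "mouse_keyboard": (40, 70),
}

def latency_rating(input_method: str, latency_ms: float) -> str:
    good, acceptable = THRESHOLDS[input_method]
    if latency_ms <= good:
        return "good"
    if latency_ms <= acceptable:
        return "acceptable"
    return "unacceptable"

# The same measured latency earns different verdicts per input method:
print(latency_rating("controller", 95))      # acceptable
print(latency_rating("mouse_keyboard", 95))  # unacceptable
```

The per-method table captures the controller-versus-mouse problem above, but it still can't account for per-game or per-person variation.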

To put this in historical context, we've often seen quite dramatic shifts in input latency from one console generation to another, with Rich picking out PS3 as a notable downgrade over PS2. That's because of a confluence of factors, notably an industry-wide switch to 30fps games rather than 60fps, game engines becoming more complicated to handle more advanced graphics, and early LCD TVs replacing CRTs. All three changes caused input latency to increase substantially - we saw measurements in Killzone 2 of over 200ms, even factoring out the display latency! - yet in the absence of good measuring tools, little was made of the change.

In the current context then, perhaps we need more personalised tools to judge where the threshold is for input latency that's too high. I like Rich's idea of someone like the makers of 3DMark producing an input latency test with various game genres represented, then randomly varying latency with the user marking when things become too unresponsive.
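
A crude sketch of how such a personalised test might work, with a simulated user standing in for a real one. The procedure and all numbers are my own invention, not anything 3DMark (or anyone else) has built:

```python
import random

# Hypothetical sketch of a personalised latency test: present trials
# with randomly added input latency, record the user's "too sluggish"
# judgements, and estimate their personal threshold as the midpoint
# between the highest tolerated and the lowest rejected value.

def estimate_threshold(judge, trials=50, max_added_ms=200, seed=0):
    rng = random.Random(seed)
    tolerated, rejected = [], []
    for _ in range(trials):
        added = rng.uniform(0, max_added_ms)  # randomly varied latency
        (tolerated if judge(added) else rejected).append(added)
    if not tolerated or not rejected:
        return None  # user never (or always) objected
    return (max(tolerated) + min(rejected)) / 2

# Simulated user who finds anything over ~90 ms too unresponsive:
print(estimate_threshold(lambda ms: ms < 90))
```

A real version would show actual gameplay per genre rather than call a lambda, but the estimation logic could look much like this.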

With those two discussion points out of the way, I should close with this final thought: Digital Foundry would not be where it is today without the support of its viewers. We massively appreciate everyone that reaches out to offer their feedback, questions and good humour each week, or just reads and watches what we put out. Thank you.

If you are interested in joining our brilliant community, take a look at the DF Patreon. You get some neat perks, including access to our private Discord server filled with some lovely individuals, weekly reports on what we're working on and high-quality video downloads of every video we've put out for nearly a decade. Higher tiers get all of the above plus extra rewards, including early access to non-embargoed videos, behind-the-scenes content, and even early access to DF Retro episodes.

As always, thanks again for watching and supporting Digital Foundry.


I've been testing Nvidia's new Neural Texture Compression toolkit and the impressive results could be good news for game install sizes


At CES 2025, Nvidia showcased so many new things that it was somewhat hard to figure out just what was really worth paying attention to. While the likes of the RTX 5090 and its enormous price tag were grabbing all the headlines, one new piece of tech sat to one side with lots of promise but no game to showcase it. However, Nvidia has now released a beta software toolkit for its RTX Neural Texture Compression (RTXNTC) system, and after playing around with it for an hour or two, I'm far more impressed with this than any hulking GPU.

At the moment, all textures in games are compressed into a common format, to save on storage space and download requirements, and then decompressed when used in rendering. It can't have escaped your notice, though, that today's massive 3D games are…well…massive and 100 GB or more isn't unusual.

RTXNTC works like this: The original textures are pre-converted into an array of weights for a small neural network. When the game's engine issues instructions to the GPU to apply these textures to an object, the graphics processor samples them. Then, the aforementioned neural network (aka decoding) reconstructs what the texture looks like at the sample point.

The system can only produce a single unfiltered texel so for the sample demonstration, RTX Texture Filtering (also called stochastic texture filtering) is used to interpolate other texels.
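
The scheme just described (store network weights instead of texels, then evaluate the network at each sample point) can be caricatured in a few lines of Python. The tiny MLP, its layer sizes and its random weights are all invented here; this is nothing like Nvidia's actual format, just the shape of the idea:

```python
import numpy as np

# Toy illustration of decode-at-sample-time: instead of storing texels,
# store a small network's weights and evaluate it at each sample point.
# This is a cartoon of the concept, not Nvidia's actual RTXNTC format.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 16)), np.zeros(16)   # the "compressed texture"
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)    # is just these weights

def decode_texel(u: float, v: float) -> np.ndarray:
    """Reconstruct one unfiltered texel (RGB) at texture coordinate (u, v)."""
    h = np.maximum(np.array([u, v]) @ W1 + b1, 0.0)  # small MLP, ReLU
    return 1 / (1 + np.exp(-(h @ W2 + b2)))          # RGB in [0, 1]

# One call yields a single unfiltered texel, which is why a separate
# filtering step (like the stochastic filtering mentioned above) is
# needed to interpolate between samples.
print(decode_texel(0.25, 0.75))
```

The real system compresses whole PBR material stacks into shared latents plus a decoder, but the sample-then-infer flow is the key departure from classic block compression.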

Nvidia describes the whole thing using the term 'Inference on Sample,' and the results are impressive, to say the least. Without any form of compression, the texture memory footprint in the demo is 272 MB. With RTXNTC in full swing, that reduces to a mere [website] MB.

The whole process of sampling and decoding is pretty fast, though not quite as fast as normal texture sampling and filtering. At 1080p, the non-NTC setup runs at 2,466 fps but this drops to 2,088 fps with Inference on Sample. Stepping the resolution up to 4K, the performance figures are 930 and 760 fps, respectively. In other words, RTXNTC incurs a frame rate penalty of 15% at 1080p and 18% at 4K—for a 96% reduction in texture memory.

Those frame rates were achieved using an RTX 4080 Super, and lower-tier or older RTX graphics cards are likely to see a larger performance drop. For that kind of hardware, Nvidia hints at using 'Inference on Load' (NTC Transcoded to BCn in the demo), where the pre-compressed NTC textures are decompressed as the game (or demo, in this case) is loaded. They are then transcoded into a standard BCn block compression format, to be sampled and filtered as normal.
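
The choice between the two modes Nvidia describes might be sketched like this; the capability check is a made-up stand-in for illustration, not a real API:

```python
# Hedged sketch of the mode split described above: newer RTX cards can
# decode NTC textures per sample, while older hardware decompresses at
# load time and transcodes to standard BCn blocks instead.
# "has_cooperative_vectors" is a hypothetical flag, not a real query.

def pick_texture_pipeline(has_cooperative_vectors: bool) -> str:
    if has_cooperative_vectors:        # e.g. RTX 40/50 series
        return "inference_on_sample"   # decode NTC at every texture sample
    return "inference_on_load"         # decode once at load, store as BCn

print(pick_texture_pipeline(True))   # inference_on_sample
print(pick_texture_pipeline(False))  # inference_on_load
```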


The texture memory reduction isn't as impressive, but the performance hit isn't anywhere near as big as with Inference on Sample. At 1080p it runs at 2,444 fps, almost as fast as standard texture sampling and filtering, and the texture footprint is just 98 MB. That's a 64% reduction over the uncompressed format.
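
As a sanity check, the percentages quoted above follow directly from the raw figures:

```python
# Deriving the article's percentages from its raw fps and MB numbers.

def pct_drop(before: float, after: float) -> float:
    return round(100 * (1 - after / before), 1)

# Frame-rate penalty of Inference on Sample vs. standard sampling:
print(pct_drop(2466, 2088))  # 15.3 (% at 1080p)
print(pct_drop(930, 760))    # 18.3 (% at 4K)

# Texture memory saved by Inference on Load vs. the 272 MB uncompressed:
print(pct_drop(272, 98))     # 64.0 (%)

# And the on-load mode's frame-rate cost is tiny by comparison:
print(pct_drop(2466, 2444))  # 0.9 (%)
```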

All of this would be for nothing if the texture reconstruction was rubbish but as you can see in the gallery below, RTXNTC generates texels that look almost identical to the originals. Even Inference on Load looks the same.

Of course, this is a demonstration and a simple beta one at that, and it's not even remotely like Alan Wake 2, in terms of texture resolution and environment complexity. RTXNTC isn't suitable for every texture, either, being designed to be applied to 'physically-based rendering (PBR) materials' rather than a single, basic texture.

It also requires cooperative vector support to work this quickly, which is essentially limited to RTX 40 or 50 series graphics cards. A cynical PC enthusiast might be tempted to claim that Nvidia only developed this system to justify equipping its desktop GPUs with less VRAM than the competition.

But the tech itself clearly has lots of potential and it's possible that AMD and Intel are working on developing their own systems that achieve the same result. While three proprietary algorithms for reducing texture memory footprints aren't what anyone wants to see, if developers show enough interest in using them, then one of them (or an amalgamation of all three) might end up being a standard aspect of DirectX and Vulkan.

That would be the best outcome for everyone, so it's worth keeping an eye on AI-based texture compression because, just like with Nvidia's other first-to-market technologies (e.g. ray tracing acceleration, AI upscaling), the industry eventually adopts them as the norm. I don't think this means we'll see a 20 GB version of Baldur's Gate 3 any time soon but the future certainly looks a lot smaller.


In a mere decade 'everyone on Earth will be capable of accomplishing more than the most impactful person can today' says OpenAI boss Sam Altman


In surprising news to me, OpenAI co-founder and CEO Sam Altman has a blog. And in among the ruminations on traditional blog-like topics like "What I Wish Someone Had Told Me" and "The Strength of Being Misunderstood", he recently posted three observations on AGI (Artificial General Intelligence) and its potential uses for the human race.

"The economic growth in front of us looks astonishing, and we can now imagine a world where we cure all diseases, have much more time to enjoy with our families, and can fully realize our creative potential," says Altman.

"In a decade, perhaps everyone on earth will be capable of accomplishing more than the most impactful person can today."

Well, that sounds lovely, doesn't it? Who doesn't love a promise of a Star Trek-style, utopian, post-disease, post-scarcity future, in which our endeavours are supported by our own private genius. To justify that thinking, Altman's three observations are thus:

1. The intelligence of an AI model roughly equals the log of the resources used to train and run it. "These resources are chiefly training compute, data, and inference compute. It appears that you can spend arbitrary amounts of money and get continuous and predictable gains; the scaling laws that predict this are accurate over many orders of magnitude."
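
Observation one has a stark consequence: if capability tracks the log of resources, then linear capability gains require exponential spending. A tiny illustration of the curve's shape (the "capability" metric here is purely notional, not anything OpenAI publishes):

```python
import math

# If model capability scales with the *logarithm* of resources, each
# fixed step in capability costs a constant *multiple* of compute.
# The units and the log-base are invented for illustration only.

def capability(resources: float) -> float:
    return math.log10(resources)

for compute in [1e3, 1e6, 1e9, 1e12]:
    print(f"{compute:.0e} units of compute -> capability {capability(compute):.0f}")

# Each +3 on this notional scale demands 1000x more resources:
# linear gains in the metric require exponential growth in spend.
```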

2. The cost to use a given level of AI falls about 10x every 12 months, and lower prices lead to much more use. "You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger."
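
It's worth annualising those two rates to see how stark the comparison is; a quick back-of-the-envelope check:

```python
# Comparing Altman's claimed cost decline with Moore's law, annualised.

moore_per_year = 2 ** (12 / 18)   # 2x every 18 months
ai_per_year = 10                  # 10x every 12 months (Altman's figure)

print(f"Moore's law: {moore_per_year:.2f}x per year")   # 1.59x
print(f"AI token cost: {ai_per_year}x per year")

# Sanity check on the GPT-4 -> GPT-4o figure: at 10x/year, the roughly
# 18 months between early 2023 and mid-2024 would give ~32x, so the
# quoted 150x drop actually outpaces the trend Altman states.
print(f"{10 ** (18 / 12):.0f}x over 18 months")  # 32x
```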

3. The socioeconomic value of linearly increasing intelligence is super-exponential in nature. "A consequence of this is that we see no reason for exponentially increasing investment to stop in the near future."


That last point seems particularly pertinent, given that OpenAI has previously been reported to be burning through billions of dollars in staffing and model training costs. Exponentially increasing investment has been a keystone of the modern AI boom, and the release of China-based startup DeepSeek's R1 model (supposedly trained at a fraction of the cost of existing efforts) has recently shaken investor confidence in the US-dominated AI industry.

So it's no surprise Altman is highlighting its importance here. On the first point, however, it looks like Altman has no illusions of the mainstream AI market (if such a thing exists) letting up on training costs, hardware requirements, and "arbitrary amounts of money" in order to continue gaining ground in AI development, at least when it comes to AGI.

Still, if his predictions about the future of AI agents come true, all that investment will be necessary to enable our AI-co-worker hellsc... I mean, future working methods.

"Let’s imagine the case of a software engineering agent... imagine it as a real-but-relatively-junior virtual coworker. Now imagine 1,000 of them. Or 1 million of them. Now imagine such agents in every field of knowledge work.

"The world will not change all at once; it never does. Life will go on mostly the same in the short run, and people in 2025 will mostly spend their time in the same way they did in 2024. We will still fall in love, create families, get in fights online, hike in nature, etc.

"But the future will be coming at us in a way that is impossible to ignore, and the long-term changes to our society and economy will be huge. "

Goody. I'm pleased to hear that, in Altman's eyes, I'll still be getting in fights online and hiking in nature this year. But AGI-enabled assistants are coming, says the OpenAI head honcho, and given the previous trends he's highlighting here, they appear to be coming rather quickly (providing the money tap keeps flowing, of course).

"Agency, willfulness, and determination will likely be extremely valuable," Altman continues. "Correctly deciding what to do and figuring out how to navigate an ever-changing world will have huge value; resilience and adaptability will be helpful skills to cultivate."

"AGI will be the biggest lever ever on human willfulness, and enable individual people to have more impact than ever before, not less."

So, a lot of optimistic thinking going on here, it seems. I'd like to hold my hand up and say that I'm not too keen on the idea of an AI co-worker writing my articles for me, but if they could "marshall the intellectual capacity" to whittle down my inbox reliably without sending significant messages to the spam folder, that'd be grand.

I don't want an AGI Mozart, more of a competent Jeeves. Still, as Altman has it, things do sound suspiciously bright and rosy for our creative futures:

"There is a great deal of talent right now without the resources to fully express itself, and if we change that, the resulting creative output of the world will lead to tremendous benefits for us all."


Market Impact Analysis

Market Growth Trend

2018: 6.0%   2019: 7.2%   2020: 7.5%   2021: 8.4%   2022: 8.8%   2023: 9.1%   2024: 9.2%

Quarterly Growth Rate

Q1 2024: 8.5%   Q2 2024: 8.8%   Q3 2024: 9.0%   Q4 2024: 9.2%

Market Segments and Growth Drivers

Segment          Market Share   Growth Rate
Console Gaming   28%            6.8%
Mobile Gaming    37%            11.2%
PC Gaming        21%            8.4%
Cloud Gaming     9%             25.3%
VR Gaming        5%             32.7%


Competitive Landscape Analysis

Company            Market Share
Sony PlayStation   21.3%
Microsoft Xbox     18.7%
Nintendo           15.2%
Tencent Games      12.8%
Epic Games         9.5%

Future Outlook and Predictions

The gaming tech landscape is evolving rapidly, driven by technological advancements and shifting business requirements. Based on current trends and expert analyses, we can anticipate several significant developments across different time horizons:

Year-by-Year Technology Evolution

Based on current trajectory and expert analyses, we can project the following development timeline:

2024: Early adopters begin implementing specialized solutions with measurable results
2025: Industry standards emerging to facilitate broader adoption and integration
2026: Mainstream adoption begins as technical barriers are addressed
2027: Integration with adjacent technologies creates new capabilities
2028: Business models transform as capabilities mature
2029: Technology becomes embedded in core infrastructure and processes
2030: New paradigms emerge as the technology reaches full maturity

Technology Maturity Curve

Different technologies within the ecosystem are at varying stages of maturity, influencing adoption timelines and investment priorities:


Innovation Trigger

  • Generative AI for specialized domains
  • Blockchain for supply chain verification

Peak of Inflated Expectations

  • Digital twins for business processes
  • Quantum-resistant cryptography

Trough of Disillusionment

  • Consumer AR/VR applications
  • General-purpose blockchain

Slope of Enlightenment

  • AI-driven analytics
  • Edge computing

Plateau of Productivity

  • Cloud infrastructure
  • Mobile applications


Expert Perspectives

Leading experts in the gaming tech sector provide diverse perspectives on how the landscape will evolve over the coming years:

"Technology transformation will continue to accelerate, creating both challenges and opportunities."

— Industry Expert

"Organizations must balance innovation with practical implementation to achieve meaningful results."

— Technology Analyst

"The most successful adopters will focus on business outcomes rather than technology for its own sake."

— Research Director

Areas of Expert Consensus

  • Acceleration of Innovation: The pace of technological evolution will continue to increase
  • Practical Integration: Focus will shift from proof-of-concept to operational deployment
  • Human-Technology Partnership: Most effective implementations will optimize human-machine collaboration
  • Regulatory Influence: Regulatory frameworks will increasingly shape technology development

Short-Term Outlook (1-2 Years)

In the immediate future, organizations will focus on implementing and optimizing currently available technologies to address pressing gaming tech challenges:

  • Technology adoption accelerating across industries
  • Digital transformation initiatives becoming mainstream

These developments will be characterized by incremental improvements to existing frameworks rather than revolutionary changes, with emphasis on practical deployment and measurable outcomes.

Mid-Term Outlook (3-5 Years)

As technologies mature and organizations adapt, more substantial transformations will emerge in how gaming technology is approached and implemented:

  • Significant transformation of business processes through advanced technologies
  • New digital business models emerging

This period will see significant changes in technical architecture and operational models, with increasing automation and integration between previously siloed functions. Organizations will shift from reactive to proactive technology strategies.

Long-Term Outlook (5+ Years)

Looking further ahead, more fundamental shifts will reshape how gaming technology is conceptualized and implemented across digital ecosystems:

  • Fundamental shifts in how technology integrates with business and society
  • Emergence of new technology paradigms

These long-term developments will likely require significant technical breakthroughs, new regulatory frameworks, and evolution in how organizations approach technology as a fundamental business function rather than a purely technical discipline.

Key Risk Factors and Uncertainties

Several critical factors could significantly impact the trajectory of gaming tech evolution:

  • Technological limitations
  • Market fragmentation
  • Monetization challenges

Organizations should monitor these factors closely and develop contingency strategies to mitigate potential negative impacts on technology implementation timelines.

Alternative Future Scenarios

The evolution of technology can follow different paths depending on various factors including regulatory developments, investment trends, technological breakthroughs, and market adoption. We analyze three potential scenarios:

Optimistic Scenario

Rapid adoption of advanced technologies with significant business impact

Key Drivers: Supportive regulatory environment, significant research breakthroughs, strong market incentives, and rapid user adoption.

Probability: 25-30%

Base Case Scenario

Measured implementation with incremental improvements

Key Drivers: Balanced regulatory approach, steady technological progress, and selective implementation based on clear ROI.

Probability: 50-60%

Conservative Scenario

Technical and organizational barriers limiting effective adoption

Key Drivers: Restrictive regulations, technical limitations, implementation challenges, and risk-averse organizational cultures.

Probability: 15-20%

Scenario Comparison Matrix

Factor                    Optimistic      Base Case      Conservative
Implementation Timeline   Accelerated     Steady         Delayed
Market Adoption           Widespread      Selective      Limited
Technology Evolution      Rapid           Progressive    Incremental
Regulatory Environment    Supportive      Balanced       Restrictive
Business Impact           Transformative  Significant    Modest

Transformational Impact

Technology is becoming increasingly embedded in all aspects of business operations. This evolution will necessitate significant changes in organizational structures, talent development, and strategic planning processes.

The convergence of multiple technological trends—including artificial intelligence, quantum computing, and ubiquitous connectivity—will create both unprecedented challenges and innovative new capabilities.

Implementation Challenges

Technical complexity and organizational readiness remain key challenges. Organizations will need to develop comprehensive change management strategies to successfully navigate these transitions.

Regulatory uncertainty, particularly around emerging technologies like AI, will require flexible architectures that can adapt to evolving compliance requirements.

Key Innovations to Watch

Artificial intelligence, distributed systems, and automation technologies are leading innovation. Organizations should monitor these developments closely to maintain competitive advantages.

Strategic investments in research partnerships, technology pilots, and talent development will position forward-thinking organizations to leverage these innovations early in their development cycle.
