DeepSeek Open-Sources DeepSeek-R1 LLM with Performance Comparable to OpenAI's o1 Model

DeepSeek open-sourced DeepSeek-R1, an LLM fine-tuned with reinforcement learning (RL) to improve reasoning capability. DeepSeek-R1 achieves results on par with OpenAI's o1 model on several benchmarks, including MATH-500 and SWE-bench.

DeepSeek-R1 is based on DeepSeek-V3, a mixture of experts (MoE) model recently open-sourced by DeepSeek. This base model is fine-tuned using Group Relative Policy Optimization (GRPO), a reasoning-oriented variant of RL. The research team also performed knowledge distillation from DeepSeek-R1 to open-source Qwen and Llama models and released several versions of each; these models outperform larger models, including GPT-4, on math and coding benchmarks.
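The group-relative idea behind GRPO can be sketched in a few lines. This is our own simplified illustration, not DeepSeek's implementation: each sampled completion is scored against the mean and spread of its own group of samples, which removes the need for a separate value (critic) model.

```python
import statistics

def grpo_advantages(rewards):
    """Group-relative advantages: normalize each completion's reward
    by the mean and standard deviation of its sampling group."""
    mean = statistics.mean(rewards)
    stdev = statistics.pstdev(rewards)
    if stdev == 0:
        return [0.0 for _ in rewards]
    return [(r - mean) / stdev for r in rewards]

# For one prompt, sample a group of completions and reward each
# (e.g., 1.0 if the final answer is correct, else 0.0):
rewards = [1.0, 0.0, 1.0, 0.0]
print(grpo_advantages(rewards))  # [1.0, -1.0, 1.0, -1.0]
```

Completions that beat their group's average get a positive advantage and are reinforced; the rest are pushed down.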

[DeepSeek-R1 is] the first step toward improving language model reasoning capabilities using pure reinforcement learning (RL). Our goal is to explore the potential of LLMs to develop reasoning capabilities without any supervised data, focusing on their self-evolution through a pure RL [website]... [DeepSeek-R1 excels] in a wide range of tasks, including creative writing, general question answering, editing, summarization, and more. Additionally, DeepSeek-R1 demonstrates outstanding performance on tasks requiring long-context understanding, substantially outperforming DeepSeek-V3 on long-context benchmarks.

To develop the model, DeepSeek started with DeepSeek-V3 as a base. They first tried fine-tuning it only with RL, without any supervised fine-tuning (SFT), producing a model called DeepSeek-R1-Zero, which they have also released. While this model exhibits "[despite its] powerful reasoning behaviors, it faces several issues. For instance, DeepSeek-R1-Zero struggles with challenges like poor readability and language mixing."

To address this, the team used a short stage of SFT to prevent the "cold start" problem of RL. They collected several thousand examples of chain-of-thought reasoning to use in SFT of DeepSeek-V3 before running RL. After the RL process converged, they then collected more SFT data using rejection sampling, resulting in a dataset of 800k samples. This dataset was used for further fine-tuning and to produce the distilled models based on Llama and Qwen.
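The rejection-sampling step can be sketched as follows. This is a hedged simplification of the pipeline described above: `generate` and `accept` are hypothetical stand-ins for the converged RL model and the answer/quality checker, and kept pairs become SFT training data.

```python
def rejection_sample(prompt, generate, accept, n=16):
    """Generate n candidate responses and keep only those that pass
    an acceptance check (e.g., a verified final answer); the kept
    (prompt, response) pairs form supervised fine-tuning data."""
    kept = []
    for _ in range(n):
        response = generate(prompt)
        if accept(prompt, response):
            kept.append((prompt, response))
    return kept

# Toy stand-ins for a real model and verifier:
generate = lambda p: f"{p} -> answer"
accept = lambda p, r: r.endswith("answer")
dataset = rejection_sample("2+2?", generate, accept, n=4)
print(len(dataset))  # 4
```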

DeepSeek evaluated their model on a variety of reasoning, math, and coding benchmarks and compared it to other models, including [website], GPT-4o, and o1. DeepSeek-R1 outperformed all of them on several of the benchmarks, including AIME 2024 and MATH-500.

Within a few days of its release, LMArena rankings showed DeepSeek-R1 at #3 overall in the arena and #1 in coding and math. It was also tied for #1 with o1 in the "Hard Prompt with Style Control" category.

Django framework co-creator Simon Willison wrote about his experiments with one of the DeepSeek distilled Llama models on his blog:

Each response starts with a ... pseudo-XML tag containing the chain of thought used to help generate the response. [Given the prompt] "a joke about a pelican and a walrus who run a tea room together"[website] then thought for 20 paragraphs before outputting the joke!...[T]he joke is awful. But the process of getting there was such an interesting insight into how these new models work.
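The pseudo-XML reasoning block Willison describes can be separated from the final answer with a small parser. This sketch assumes the common `<think>...</think>` tag format used by DeepSeek-R1-style outputs; it is an illustration, not an official DeepSeek utility.

```python
import re

def split_reasoning(text):
    """Split a DeepSeek-R1-style response into its chain-of-thought
    (inside the pseudo-XML <think> tag) and the final answer."""
    m = re.search(r"<think>(.*?)</think>\s*(.*)", text, re.DOTALL)
    if not m:
        return None, text.strip()
    return m.group(1).strip(), m.group(2).strip()

sample = "<think>Pelicans have pouches; walruses have tusks...</think>Here is the joke."
thought, answer = split_reasoning(sample)
print(answer)  # Here is the joke.
```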

DeepSeek is rapidly emerging as a strong builder of open models. Not only are these models great performers, but their license permits use of their outputs for distillation, potentially pushing forward the state of the art for language models (and multimodal models) of all sizes.

The DeepSeek-R1 models are available on Hugging Face.

InfoQ's New Certification Focuses on Practical Skills for Senior Developers and Architects

InfoQ is introducing its first hands-on software architecture certification at QCon London 2025 (April 7-11), the international software development conference. The certification will combine practitioner-led conference sessions with a hands-on workshop focused on real-world architectural challenges.

Wesley Reisz, sixteen-time QCon chair, co-host of the InfoQ Podcast, and technical principal consultant at Equal Experts, will lead the InfoQ Certified Software Architect in Emerging Technologies (ICSAET) certification workshop.

"While many certification programs exist, few address the practical challenges of implementing emerging technologies at enterprise scale," said Reisz. "This certification bridges that gap by focusing on real-world scenarios software architects face daily."

The certification requires participants to have a minimum of five years of senior technical experience. Participants will work through architectural challenges drawn from actual enterprise implementations, inspired by conference sessions such as "Architectures You've Always Wondered About," which features real-world examples from companies scaling systems for massive traffic and complexity; "Modern Data Architectures," which addresses critical challenges in building scalable systems that integrate AI and machine learning; and "The Changing Face of Architectural Practice," which examines how traditional approaches are evolving to meet new challenges.

The InfoQ Certified Software Architect in Emerging Technologies (ICSAET) certification differs from traditional architecture certifications through its:

Integration with QCon’s cutting-edge software architecture tracks.

Focus on practical implementation challenges.

Real enterprise implementation scenarios.

"Participants will work through actual challenges we've encountered in enterprise implementations," Reisz explained. "The focus is on the trade-offs and constraints that shape real-world solutions, not theoretical frameworks."

Dio Synodinos, president of C4Media, Inc., the makers of InfoQ and QCon, said:

"This certification emerges from nearly two decades of InfoQ and QCon's work with international enterprise software teams. Through our conferences and community, we've helped senior software developers navigate emerging technologies and architectural challenges since 2006. We've seen firsthand what senior architects need: practical insights into implementing emerging technologies at scale. That's exactly what this certification delivers - real-world knowledge from practitioners who are actively solving today's complex enterprise-scale architectural challenges."

The ICSAET certification signals participants’ authority as senior technical leaders and their dedication to continuous growth. Successfully completing both the QCon London conference and the workshop will earn participants the InfoQ Certified Software Architect in Emerging Technologies (ICSAET) credential, which demonstrates leadership and expertise in modern software architecture.

The certification will initially be available at QCon London 2025, with plans to expand to QCon San Francisco 2025 in November. The workshop component will be limited in size to ensure quality interactions and meaningful peer discussions.

Registration for the ICSAET certification is now open. More information and registration details can be found at [website].

Vercel Rolls Out More Cost-Effective Infrastructure Model

Gone are the days of the edge worker/runtime for frontend cloud hosting provider Vercel, CEO Guillermo Rauch tweeted early yesterday.

“In fact, as of last week, all Edge middleware runs on Fluid in the Vercel cloud,” he added.

Edge Network is a Vercel offering that is both a Content Delivery Network (CDN) and a globally distributed platform for running compute in regions around the globe.

Fluid is a new web application infrastructure model that purports to blend the best of servers and serverless, while offering efficient resource utilization and — more importantly for Vercel consumers — reduced costs.

Fluid is multiregion, but that doesn’t mean “dozens or hundreds of CDN edges,” Rauch wrote.

Edge computing is best for routing, static assets and pre-renders, but it’s not for apps, APIs and databases, Rauch stated in his tweet and reiterated in a webcast later that day with CTO Malte Ubl.

“The big insight is that your application, [the] real logic, has to run close to the data, because you’re going back and forth as data waterfalls; that’s going to be slow,” Ubl stated. “The dream of edge compute, that you suddenly have your data in all these 200 locations, that dream is just not reality.”

Plus, the vast majority of Vercel users host their data in one location, he added.

“The example I like to give is that Google has eight data centers for Google search,” Rauch added. “Google did pretty well with eight data centers. You’re not going to copy petabytes of data to 200 locations.”

With Fluid, the compute runs closer to where your data already lives instead of “attempting unrealistic replication across every edge location,” wrote Mariano Cocirio, product manager for CI/CD and Compute, in a post introducing Fluid.

“Rather than forcing widespread data distribution, this approach ensures your compute is placed in regions that align with your data, optimizing for both performance and consistency,” he wrote. “Dynamic requests are routed to the nearest healthy compute region — among your designated locations — ensuring efficient and reliable execution.”

For enterprise customers, multi-region failover is the default when activating Fluid, he added.

Serverless computing can suffer from cold starts, which are delays that occur when a function is invoked for the first time or after inactivity.

Serverless functions run in containers. When the function is deployed, the cloud provider packages the code and dependencies into a container. Containers need initialization when a function is invoked, a process that takes time as the container is allocated, initialized, and the code loaded. Also, containers shut down due to inactivity to save resources.
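The cold-start cost described above can be made concrete with a toy model (ours, not Vercel's): treat each container creation as a cold start and count how many occur under a per-invocation model versus a single reused warm instance.

```python
init_count = 0

class Instance:
    """Stands in for a serverless container: constructing one models
    the cold start (allocation, runtime init, code load)."""
    def __init__(self):
        global init_count
        init_count += 1

    def handle(self, request):
        return f"handled {request}"

# Per-invocation model: every request gets a fresh container.
for i in range(4):
    Instance().handle(i)
per_invocation_cold_starts = init_count  # 4 cold starts

# Warm-instance model: one container serves all four requests.
init_count = 0
warm = Instance()
for i in range(4):
    warm.handle(i)
warm_cold_starts = init_count  # 1 cold start

print(per_invocation_cold_starts, warm_cold_starts)  # 4 1
```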

“If I turn on Fluid, which I just did, I’m going to send another four requests, 1-2-3-4, they’re all hitting the same instance,” Ubl noted. “What you can see in the demo is it counts the total duration of the time the function was alive, in this case, [website] seconds. So that’s all I’m billed for. Yes, so you get billed for [website] seconds instead of 12 seconds […] that’s what Fluid is.”

Fluid reduces the frequency of cold starts by maintaining a ‘warm instance.’ It trades single-invocation functions for high-performance mini-servers, Cocirio stated.

“When cold starts do happen, a Rust-based runtime with full [website] and Python support accelerates initialization,” Cocirio wrote. “Bytecode caching further speeds up invocation by pre-compiling function code, reducing startup overhead.”

As a result, the model maximizes resource efficiency and, in early adopters, has reduced costs by up to 85%, he added.

Fluid bills on actual compute usage, minimizing waste, he emphasized. It also prioritizes using existing resources before creating new instances, “eliminating hard scaling limits and leveraging warm compute for faster, more efficient scaling,” Cocirio noted. “By scaling functions before instances, Fluid shifts to a many-to-one model that can handle tens of thousands of concurrent invocations.”

Fluid also mitigates the risk of uncontrolled execution, which can drive up costs, Cocirio explained. Functions that are waiting on backend responses can process other requests instead of wasting compute.
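That many-to-one behavior can be sketched with Python's `asyncio` as an analogy (not Fluid's actual runtime): while one invocation awaits a backend response, the same instance starts processing other requests, so four 0.1-second waits overlap instead of stacking up to 0.4 seconds.

```python
import asyncio
import time

async def handler(request_id):
    # While this invocation awaits a slow backend call, the event loop
    # lets the same instance begin handling other requests.
    await asyncio.sleep(0.1)   # stand-in for a backend/database response
    return f"done {request_id}"

async def main():
    start = time.monotonic()
    # Four concurrent invocations share one instance.
    results = await asyncio.gather(*(handler(i) for i in range(4)))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)   # all four complete after roughly 0.1 s of wall time
```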

Fluid also supports advanced tasks such as streaming and post-response processing. It’s fully managed, which retains one of the appealing aspects of the serverless model.

That said, Vercel is not automatically switching customers over to Fluid; enabling it requires flipping a single switch in the Functions tab under Project Settings. Rauch explained that Vercel decided not to enable it for everyone because the execution model changes slightly.

“It requires no code changes. We have … mitigations built in. It’s powered by Node and Python and more open runtimes to come, and you’re ready to enable it today,” Rauch said.

Market Impact Analysis

Market Growth Trend

Year    2018   2019   2020    2021    2022    2023    2024
Growth  7.5%   9.0%   9.4%    10.5%   11.0%   11.4%   11.5%

Quarterly Growth Rate

Quarter       Q1 2024   Q2 2024   Q3 2024   Q4 2024
Growth Rate   10.8%     11.1%     11.3%     11.5%

Market Segments and Growth Drivers

Segment               Market Share   Growth Rate
Enterprise Software   38%            10.8%
Cloud Services        31%            17.5%
Developer Tools       14%            9.3%
Security Software     12%            13.2%
Other Software        5%             7.5%
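As a quick worked check on the segment table above, weighting each segment's growth rate by its market share gives a blended overall growth figure of roughly 12.8%.

```python
segments = {  # segment: (market share as fraction, growth rate in %)
    "Enterprise Software": (0.38, 10.8),
    "Cloud Services":      (0.31, 17.5),
    "Developer Tools":     (0.14, 9.3),
    "Security Software":   (0.12, 13.2),
    "Other Software":      (0.05, 7.5),
}

# Share-weighted blended growth rate across all segments:
blended = sum(share * rate for share, rate in segments.values())
print(round(blended, 2))  # 12.79
```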

Technology Maturity Curve

Different technologies within the ecosystem are at varying stages of maturity:

(Chart: AI/ML, Blockchain, VR/AR, Cloud, and Mobile positioned along the hype cycle, from the Innovation Trigger through the Peak of Inflated Expectations, Trough of Disillusionment, and Slope of Enlightenment to the Plateau of Productivity.)

Competitive Landscape Analysis

Company      Market Share
Microsoft    22.6%
Oracle       14.8%
SAP          12.5%
Salesforce   9.7%
Adobe        8.3%

Future Outlook and Predictions

The open-model landscape around DeepSeek is evolving rapidly, driven by technological advancements, changing threat vectors, and shifting business requirements. Based on current trends and expert analyses, we can anticipate several significant developments across different time horizons:

Year-by-Year Technology Evolution

Based on current trajectory and expert analyses, we can project the following development timeline:

2024: Early adopters begin implementing specialized solutions with measurable results
2025: Industry standards emerge to facilitate broader adoption and integration
2026: Mainstream adoption begins as technical barriers are addressed
2027: Integration with adjacent technologies creates new capabilities
2028: Business models transform as capabilities mature
2029: Technology becomes embedded in core infrastructure and processes
2030: New paradigms emerge as the technology reaches full maturity

Technology Maturity Curve

Different technologies within the ecosystem are at varying stages of maturity, influencing adoption timelines and investment priorities:

(Interactive diagram available in the full report: adoption/maturity plotted against development stage, from Innovation and Early Adoption through Growth and Maturity to Decline/Legacy, distinguishing emerging tech, current focus areas, and mature solutions.)

Innovation Trigger

  • Generative AI for specialized domains
  • Blockchain for supply chain verification

Peak of Inflated Expectations

  • Digital twins for business processes
  • Quantum-resistant cryptography

Trough of Disillusionment

  • Consumer AR/VR applications
  • General-purpose blockchain

Slope of Enlightenment

  • AI-driven analytics
  • Edge computing

Plateau of Productivity

  • Cloud infrastructure
  • Mobile applications

Technology Evolution Timeline

1-2 Years
  • Technology adoption accelerating across industries
  • Digital transformation initiatives becoming mainstream
3-5 Years
  • Significant transformation of business processes through advanced technologies
  • New digital business models emerging
5+ Years
  • Fundamental shifts in how technology integrates with business and society
  • Emergence of new technology paradigms

Expert Perspectives

Leading experts in the software development sector provide diverse perspectives on how the landscape will evolve over the coming years:

"Technology transformation will continue to accelerate, creating both challenges and opportunities."

— Industry Expert

"Organizations must balance innovation with practical implementation to achieve meaningful results."

— Technology Analyst

"The most successful adopters will focus on business outcomes rather than technology for its own sake."

— Research Director

Areas of Expert Consensus

  • Acceleration of Innovation: The pace of technological evolution will continue to increase
  • Practical Integration: Focus will shift from proof-of-concept to operational deployment
  • Human-Technology Partnership: Most effective implementations will optimize human-machine collaboration
  • Regulatory Influence: Regulatory frameworks will increasingly shape technology development

Short-Term Outlook (1-2 Years)

In the immediate future, organizations will focus on implementing and optimizing currently available technologies to address pressing software development challenges:

  • Technology adoption accelerating across industries
  • Digital transformation initiatives becoming mainstream

These developments will be characterized by incremental improvements to existing frameworks rather than revolutionary changes, with emphasis on practical deployment and measurable outcomes.

Mid-Term Outlook (3-5 Years)

As technologies mature and organizations adapt, more substantial transformations will emerge in how software systems are architected and delivered:

  • Significant transformation of business processes through advanced technologies
  • New digital business models emerging

This period will see significant changes in architecture and operational models, with increasing automation and integration between previously siloed functions. Organizations will shift from reactive to proactive postures.

Long-Term Outlook (5+ Years)

Looking further ahead, more fundamental shifts will reshape how software development is conceptualized and practiced across digital ecosystems:

  • Fundamental shifts in how technology integrates with business and society
  • Emergence of new technology paradigms

These long-term developments will likely require significant technical breakthroughs, new regulatory frameworks, and evolution in how organizations approach technology as a fundamental business function rather than a purely technical discipline.

Key Risk Factors and Uncertainties

Several critical factors could significantly impact the trajectory of software development evolution:

  • Technical debt accumulation
  • Security integration challenges
  • Maintaining code quality

Organizations should monitor these factors closely and develop contingency strategies to mitigate potential negative impacts on technology implementation timelines.

Alternative Future Scenarios

The evolution of technology can follow different paths depending on various factors including regulatory developments, investment trends, technological breakthroughs, and market adoption. We analyze three potential scenarios:

Optimistic Scenario

Rapid adoption of advanced technologies with significant business impact

Key Drivers: Supportive regulatory environment, significant research breakthroughs, strong market incentives, and rapid user adoption.

Probability: 25-30%

Base Case Scenario

Measured implementation with incremental improvements

Key Drivers: Balanced regulatory approach, steady technological progress, and selective implementation based on clear ROI.

Probability: 50-60%

Conservative Scenario

Technical and organizational barriers limiting effective adoption

Key Drivers: Restrictive regulations, technical limitations, implementation challenges, and risk-averse organizational cultures.

Probability: 15-20%

Scenario Comparison Matrix

Factor                    Optimistic       Base Case     Conservative
Implementation Timeline   Accelerated      Steady        Delayed
Market Adoption           Widespread       Selective     Limited
Technology Evolution      Rapid            Progressive   Incremental
Regulatory Environment    Supportive       Balanced      Restrictive
Business Impact           Transformative   Significant   Modest

Transformational Impact

Technology is becoming increasingly embedded in all aspects of business operations. This evolution will necessitate significant changes in organizational structures, talent development, and strategic planning processes.

The convergence of multiple technological trends—including artificial intelligence, quantum computing, and ubiquitous connectivity—will create both unprecedented security challenges and innovative defensive capabilities.

Implementation Challenges

Technical complexity and organizational readiness remain key challenges. Organizations will need to develop comprehensive change management strategies to successfully navigate these transitions.

Regulatory uncertainty, particularly around emerging technologies like AI in security applications, will require flexible security architectures that can adapt to evolving compliance requirements.

Key Innovations to Watch

Artificial intelligence, distributed systems, and automation technologies are leading innovation. Organizations should monitor these developments closely to maintain competitive advantages and effective security postures.

Strategic investments in research partnerships, technology pilots, and talent development will position forward-thinking organizations to leverage these innovations early in their development cycle.

Technical Glossary

Key technical terms and definitions to help understand the technologies discussed in this article.

Understanding the following technical concepts is essential for grasping the full implications of the technologies discussed in this article. These definitions provide context for both technical and non-technical readers.

API (beginner)

APIs serve as the connective tissue in modern software architectures, enabling different applications and services to communicate and share data according to defined protocols and data formats.

Example: Cloud service providers like AWS, Google Cloud, and Azure offer extensive APIs that allow organizations to programmatically provision and manage infrastructure and services.

platform (intermediate)

Platforms provide standardized environments that reduce development complexity and enable ecosystem growth through shared functionality and integration capabilities.