Technology News from Around the World, Instantly on Oracnoos!


IBM Granite 3.2 Brings New Vision Language Model, Chain of Thought Reasoning, Improved TimeSeries


IBM has introduced its new Granite 3.2 family of multi-modal and reasoning models. Granite 3.2 features experimental chain-of-thought reasoning capabilities that significantly improve on its predecessor's performance, a new vision language model (VLM) that outperforms larger models on several benchmarks, and smaller models for more efficient deployments.

IBM says its Granite 3.2 8B Instruct and Granite 3.2 2B Instruct significantly outperform their Granite 3.1 predecessors thanks to enhanced reasoning capabilities. Instead of providing specialized reasoning models, as other companies currently do, IBM chose to include reasoning in its Instruct models as an option that can be toggled on and off depending on the task at hand.
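In practice such a toggle lives at the request level: the same model serves both modes, and only the prompt or chat-template arguments change. The sketch below illustrates the idea with a hand-built system prompt; the instruction wording and the `thinking` parameter name are assumptions for illustration, not IBM's actual Granite chat template.

```python
# Illustrative sketch of an on/off reasoning toggle for an instruct model.
# The system-prompt wording and the `thinking` flag are assumptions, not
# IBM's actual Granite 3.2 chat template.

def build_messages(user_prompt: str, thinking: bool = False) -> list[dict]:
    """Build a chat request, optionally asking the model to reason step by step."""
    system = "You are a helpful assistant."
    if thinking:
        # Toggling reasoning on simply changes the instructions the model sees.
        system += (
            " Think through the problem step by step inside a reasoning section, "
            "then give the final answer."
        )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_prompt},
    ]

# The same model serves both modes; only the request differs.
plain = build_messages("What is 17 * 23?")
with_cot = build_messages("What is 17 * 23?", thinking=True)
```

With a real deployment the switch would typically be an argument passed to the tokenizer's chat template rather than a hand-assembled system prompt.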

One technique IBM uses in Granite 3.2 to build reasoning capabilities is inference scaling, inspired by the idea of letting an LLM generate multiple answers and then picking the best based on some reward model, only applied here to the reasoning process itself.

In the context of reasoning tasks, this idea of scoring multiple candidates to pick the best can also be applied to the "chain of thought" that often precedes answer generation. In fact, there is no need to wait for the entire reasoning chain to complete before judging whether it is on the right track.

IBM's approach extends the one popularized by DeepSeek, which uses an inference model to measure its own progress, by also using a search model to explore the reasoning space. The process reward model helps the LLM detect and avoid wrong reasoning turns, while the search algorithm makes the process more flexible.
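The combination can be sketched as a beam-style search over partial reasoning chains: at each step several candidate continuations are generated, a process reward model (PRM) scores each partial chain, and only the most promising chains are kept. The generator and reward function below are toy stand-ins, not IBM's actual models:

```python
import heapq

# Toy sketch of inference scaling with a process reward model (PRM):
# score partial chains of thought and keep only the most promising ones.
# The generator and reward function are stand-ins for real models.

def generate_continuations(chain: list[str]) -> list[str]:
    """Stand-in for an LLM proposing candidate next reasoning steps."""
    step = len(chain)
    return [f"step{step}-option{i}" for i in range(3)]

def process_reward(chain: list[str]) -> float:
    """Stand-in for a PRM scoring a partial chain (higher is better)."""
    # Pretend lower option indices correspond to sounder reasoning.
    return -sum(int(s.split("option")[1]) for s in chain)

def best_of_n_reasoning(n_steps: int, beam_width: int = 2) -> list[str]:
    beams: list[list[str]] = [[]]
    for _ in range(n_steps):
        candidates = [b + [c] for b in beams for c in generate_continuations(b)]
        # Prune wrong turns early instead of waiting for complete answers.
        beams = heapq.nlargest(beam_width, candidates, key=process_reward)
    return max(beams, key=process_reward)

chain = best_of_n_reasoning(3)
```

The key point the article makes is visible in the loop: scoring happens on partial chains, so bad reasoning turns are discarded before a full answer is ever produced.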

According to IBM, this inference scaling approach boosts performance on the MATH500 and AIME2024 math-reasoning benchmarks and lets Granite 3.2 outperform much larger models such as GPT-4o-0513 in single-pass inference.

Granite 3.2 also includes a VLM aimed particularly at document understanding, named Granite Vision 3.2 2B. According to IBM, this lightweight model rivals larger models on enterprise benchmarks such as DocVQA and ChartQA, but it is not intended as a replacement for the text-only Granite models. It was trained on a purpose-built dataset, DocFM, which IBM assembled from curated enterprise data, including general document images, charts, flowcharts, and diagrams.

Another component of the Granite family is Granite Guardian 3.2, a guardrail model able to detect risks in prompts and responses. Guardian 3.2 provides performance similar to Guardian 3.1 at greater speed and with lower inference costs and memory usage, IBM says. It also introduces a new feature, verbalized confidence, which assesses potential risk in a more nuanced way by providing a confidence value alongside the risk label.
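Verbalized confidence means downstream code can route on two fields, the verdict and the model's stated confidence, rather than a bare yes/no flag. The sketch below shows how that might look; the "Yes (confidence: high)" output format is a hypothetical illustration, not Guardian's actual output schema:

```python
import re

# Hypothetical sketch of consuming a guardrail verdict with verbalized
# confidence. The "label (confidence: ...)" format is an assumption for
# illustration, not Granite Guardian's actual output schema.

def parse_verdict(raw: str) -> dict:
    match = re.match(r"(?i)\s*(yes|no)\s*\(confidence:\s*(high|low)\)", raw)
    if not match:
        raise ValueError(f"unrecognized verdict: {raw!r}")
    return {"risky": match.group(1).lower() == "yes",
            "confidence": match.group(2).lower()}

def route(verdict: dict) -> str:
    """Use the confidence value for more nuanced handling than a bare flag."""
    if verdict["risky"] and verdict["confidence"] == "high":
        return "block"
    if verdict["risky"]:
        return "flag-for-review"  # risky but low confidence: softer action
    return "allow"

decision = route(parse_verdict("Yes (confidence: low)"))
```

The extra field lets an application block confidently flagged content while sending low-confidence detections to human review instead of rejecting them outright.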

Guardian 3.2 comes in two variants: Guardian 3.2 5B (down from 8B in Granite 3.1) and Guardian 3.2 3B-A800M, with the added optimization of activating only 800 million of its three billion parameters at inference time.

As a final note on Granite 3.2, it is worth mentioning that it brings new time-series models (TTM) supporting weekly and daily forecasting, in addition to the minutely-to-hourly resolutions its predecessors already supported.

TTM-R2 models (including the newly released variants) top all models for point-forecasting accuracy as measured by mean absolute scaled error (MASE). TTM-R2 also ranks in the top five for probabilistic forecasting, as measured by the continuous ranked probability score (CRPS).
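MASE normalizes a model's mean absolute error by the in-sample error of a naive forecaster on the training series, so values below 1 mean the model beats the naive baseline. A minimal implementation (seasonality handling omitted for brevity):

```python
# Minimal MASE (mean absolute scaled error) implementation, the metric used
# to rank point-forecasting accuracy above. Values < 1 beat a naive
# last-value forecaster; seasonal variants are omitted for brevity.

def mase(actual: list[float], forecast: list[float], train: list[float]) -> float:
    if len(actual) != len(forecast):
        raise ValueError("actual and forecast must be the same length")
    mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
    # Scale: in-sample MAE of the naive forecast y[t] = y[t-1].
    naive_mae = sum(abs(train[t] - train[t - 1])
                    for t in range(1, len(train))) / (len(train) - 1)
    return mae / naive_mae

train = [10.0, 12.0, 11.0, 13.0]          # naive MAE = (2 + 1 + 2) / 3
score = mase([14.0, 15.0], [13.0, 15.5], train)   # 0.75 / (5/3) = 0.45
```

Because the scale is taken from the series itself, MASE allows accuracy comparisons across series with very different magnitudes, which is what makes it suitable for leaderboard-style rankings.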

In its announcement, IBM does not fail to note that its TTM models are "tiny" in comparison to Google's TimesFM-2.0 (500M parameters) and Amazon's Chronos-Bolt-Base (205M parameters), which rank second and third by MASE.

While IBM's announcement appeared an impressive feat to some Reddit users, others noted that the reported performance may reflect overfitting to a few benchmarks while ignoring others. Still, while it would be naive to think such small models (8B and 2B parameters) are preferable to larger models that perform much better overall or on complex tasks like coding, they may be a good fit for more specialized tasks.

Others speculate that IBM's offering specifically targets enterprises, where legal guarantees matter in case things go wrong, as do potential IP issues with the datasets used for training.

All Granite models are licensed under the Apache 2.0 license and available on Hugging Face, Ollama, and LM Studio, among other platforms.


.NET Aspire 9.1 Improves Dashboard Features


The .NET Aspire team has released version 9.1 of the platform. This minor revision focuses on new dashboard features and small improvements across different platform components.

The dashboard functionality, built into the .NET Aspire app host, is meant to closely track various aspects of the app such as logs, traces, resources, and environment configurations. In this version, it has been enhanced with six new features.

The first enhancement is that "child" resources that depend on other resources (as in 1:N relationships) are now displayed nested under their "parent" resource instance. For example, if a database server resource has one or more databases in the .NET Aspire app, they are displayed underneath the server resource.

The resources in the dashboard now feature a more complete details pane, adding references, back references, and volume information. In addition, a filter button on the resource list allows filtering by resource type, state, or health state.

Another small enhancement is the ability to change the dashboard's display language independently of the browser language. It affects only the UI language, not number and date formatting, which remains tied to browser settings.

The dashboard application exposes an OpenTelemetry endpoint for receiving telemetry from client-side apps, restricted to the HTTP POST method. The allowed CORS origins are set in the configuration file. In .NET Aspire 9.1, developers can override these CORS settings and add allowed origins for custom localhost domains. This is achieved by setting the DOTNET_DASHBOARD_CORS_ALLOWED_ORIGINS environment variable.
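Since the override is just an environment variable on the app host process, it can be set by whatever launches the process. A sketch, using Python as the launcher; the comma-separated origin list is an assumption for illustration, so check the .NET Aspire documentation for the exact expected format:

```python
import os

# Hypothetical sketch: preparing an environment with the CORS override set
# before launching the .NET Aspire app host. The comma-separated origin
# list is an assumption, not a documented format.

env = dict(os.environ)
env["DOTNET_DASHBOARD_CORS_ALLOWED_ORIGINS"] = ",".join([
    "http://localhost:5000",
    "http://custom.localhost:3000",  # custom localhost domain
])

# import subprocess
# subprocess.run(["dotnet", "run"], env=env)  # launch the app host with the override
```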

The last dashboard addition in this version is the possibility of downloading app logs for detailed analysis in an external tool.

Lastly, there are local development Azure integration enhancements in the new version. Specifically, Azure Service Bus, Azure Cosmos DB on Linux, and Azure SignalR are now added as emulated services, allowing developers to test them without having to use real cloud resources.

This release comes three months after the release of .NET Aspire 9.0. .NET Aspire major versions are released roughly in step with major .NET versions, while minor versions ship more frequently and out of band.

Judging by community reactions, .NET developers appreciate the orchestration capabilities that .NET Aspire brings to the local development experience.

The official release notes contain more information about this release for interested readers, together with the breaking changes in version 9.1.


AWS CDK Introduces Garbage Collection to Remove Outdated Assets


Amazon recently introduced the preview of garbage collection in the AWS CDK. The new feature automatically deletes old assets in bootstrapped S3 buckets and ECR repositories, reducing maintenance effort and deployment costs.

The new cdk gc command performs garbage collection on unused assets stored in the resources of the bootstrap stack, allowing developers to view, manage, and delete assets that are no longer needed. Kaizen Conroy, software engineer at AWS, and Adam Keller, senior cloud architect at AWS, explain:

CDK developers that leverage assets at scale may notice over time that the bootstrapped bucket or repository accumulates old or unused data. If users wanted to clean this data on their own, CDK didn't provide a clear way of determining which data is safe to delete. (...) We expect CDK Garbage Collection to help AWS CDK end-users save on storage costs associated with using the product while not affecting how end-users use CDK.

The AWS Cloud Development Kit (CDK) is an open source framework that provides higher-level abstractions and enables developers to define cloud infrastructure using TypeScript, JavaScript, Python, Java, C#/.NET, and Go. Developers define reusable cloud components known as constructs that can be composed together into stacks and apps. The garbage collection feature has been a long-standing request by the community, with Janne Sinivirta, principal DevOps consultant at Polar Squad, highlighting the issue as far back as 2019:

Each cdk build creates a new assets folder under [website]. If this includes node_modules, the total size of the [website] folder can add up pretty quickly (mine was over 10Gb)!

According to AWS, the cdk gc command is still in development and preview mode, and while its current features are considered production-ready and safe to use, the scope of the command and its features might be subject to change. Developers are required to explicitly opt in by providing the --unstable=gc option. For example, while the current version of garbage collection is scoped to an individual account and region, there is a feature request to scope it to each stack instead.

CDK Garbage Collection exposes some parameters that let developers customize how aggressive the garbage collection should be. This is achieved using the --rollback-buffer-days and --created-buffer-days parameters, which specify, respectively, how many days an asset must be marked as isolated before it is eligible for deletion, and how many days an asset must live before it is eligible for deletion. Conroy and Keller clarify:

Rollback Buffer Days should be considered when you are not using cdk deploy and instead use a deployment method that operates on templates only, like a pipeline. If your pipeline can rollback without any involvement of the CDK CLI, this parameter will help ensure that assets are not prematurely deleted.
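The interaction of the two buffers can be sketched as a simple eligibility check: an asset must be old enough and isolated long enough before deletion. The helper below illustrates the documented semantics; it is not the CDK's actual implementation, and the default values are placeholders:

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative sketch of the --created-buffer-days / --rollback-buffer-days
# semantics described above. Not the CDK's actual implementation; the
# default buffer values here are placeholders.

def eligible_for_deletion(created_at: datetime,
                          isolated_since: Optional[datetime],
                          now: datetime,
                          created_buffer_days: int = 1,
                          rollback_buffer_days: int = 0) -> bool:
    if isolated_since is None:
        return False  # asset is still referenced by some stack
    # created-buffer-days: the asset must have lived long enough.
    old_enough = now - created_at >= timedelta(days=created_buffer_days)
    # rollback-buffer-days: guards pipelines that may roll back to an
    # older template without involving the CDK CLI.
    isolated_long_enough = (now - isolated_since
                            >= timedelta(days=rollback_buffer_days))
    return old_enough and isolated_long_enough

now = datetime(2025, 3, 1)
fresh = eligible_for_deletion(datetime(2025, 2, 28, 12), None, now)
stale = eligible_for_deletion(datetime(2025, 1, 1),
                              datetime(2025, 2, 1), now,
                              rollback_buffer_days=7)
```

A still-referenced asset is never deleted; an asset isolated for four weeks with a seven-day rollback buffer is.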

Keller summarizes on LinkedIn:

This was a pain point for a lot of folks as they were required to clean up resources on their own without any sort of intervention from the CDK. With the new garbage collection feature in the CDK toolkit, old, unused assets can be cleaned up with ease.

The CDK Garbage Collection is available starting in AWS CDK version [website].

Microservice architecture has become the standard for modern IT projects, enabling the creation of autonomous services with independent lifecycles. In......

Kubernetes has emerged as the go-to orchestration tool for managing containerized applications. ’s 2024 Voice of Kubernetes Exper......

Amazon ne veut pas que les candidat(e)s aux offres d'emploi puissent utiliser l'IA / GenIA pour rédiger un CV, une lettre de motivation ou lire des ré......

Market Impact Analysis

Market Growth Trend

Year Growth Rate
2018 7.5%
2019 9.0%
2020 9.4%
2021 10.5%
2022 11.0%
2023 11.4%
2024 11.5%

Quarterly Growth Rate

Quarter Growth Rate
Q1 2024 10.8%
Q2 2024 11.1%
Q3 2024 11.3%
Q4 2024 11.5%

Market Segments and Growth Drivers

Segment Market Share Growth Rate
Enterprise Software 38% 10.8%
Cloud Services 31% 17.5%
Developer Tools 14% 9.3%
Security Software 12% 13.2%
Other Software 5% 7.5%

Technology Maturity Curve

Different technologies within the ecosystem are at varying stages of maturity:

(Hype-cycle chart: AI/ML, Blockchain, VR/AR, Cloud, and Mobile placed at various stages from Innovation Trigger to Plateau of Productivity.)

Competitive Landscape Analysis

Company Market Share
Microsoft 22.6%
Oracle 14.8%
SAP 12.5%
Salesforce 9.7%
Adobe 8.3%

Future Outlook and Predictions

The landscape around these technologies is evolving rapidly, driven by technological advancements and shifting business requirements. Based on current trends and expert analyses, we can anticipate several significant developments across different time horizons:

Year-by-Year Technology Evolution

Based on current trajectory and expert analyses, we can project the following development timeline:

2024: Early adopters begin implementing specialized solutions with measurable results
2025: Industry standards emerge to facilitate broader adoption and integration
2026: Mainstream adoption begins as technical barriers are addressed
2027: Integration with adjacent technologies creates new capabilities
2028: Business models transform as capabilities mature
2029: Technology becomes embedded in core infrastructure and processes
2030: New paradigms emerge as the technology reaches full maturity

Technology Maturity Curve

Different technologies within the ecosystem are at varying stages of maturity, influencing adoption timelines and investment priorities:

(Chart: adoption/maturity plotted over development stages, from Innovation through Early Adoption, Growth, and Maturity to Decline/Legacy. Interactive diagram available in full report.)

Innovation Trigger

  • Generative AI for specialized domains
  • Blockchain for supply chain verification

Peak of Inflated Expectations

  • Digital twins for business processes
  • Quantum-resistant cryptography

Trough of Disillusionment

  • Consumer AR/VR applications
  • General-purpose blockchain

Slope of Enlightenment

  • AI-driven analytics
  • Edge computing

Plateau of Productivity

  • Cloud infrastructure
  • Mobile applications

Technology Evolution Timeline

1-2 Years
  • Technology adoption accelerating across industries
  • Digital transformation initiatives becoming mainstream
3-5 Years
  • Significant transformation of business processes through advanced technologies
  • New digital business models emerging
5+ Years
  • Fundamental shifts in how technology integrates with business and society
  • Emergence of new technology paradigms

Expert Perspectives

Leading experts in the software development sector provide diverse perspectives on how the landscape will evolve over the coming years:

"Technology transformation will continue to accelerate, creating both challenges and opportunities."

— Industry Expert

"Organizations must balance innovation with practical implementation to achieve meaningful results."

— Technology Analyst

"The most successful adopters will focus on business outcomes rather than technology for its own sake."

— Research Director

Areas of Expert Consensus

  • Acceleration of Innovation: The pace of technological evolution will continue to increase
  • Practical Integration: Focus will shift from proof-of-concept to operational deployment
  • Human-Technology Partnership: Most effective implementations will optimize human-machine collaboration
  • Regulatory Influence: Regulatory frameworks will increasingly shape technology development

Short-Term Outlook (1-2 Years)

In the immediate future, organizations will focus on implementing and optimizing currently available technologies to address pressing software development challenges:

  • Technology adoption accelerating across industries
  • Digital transformation initiatives becoming mainstream

These developments will be characterized by incremental improvements to existing frameworks rather than revolutionary changes, with emphasis on practical deployment and measurable outcomes.

Mid-Term Outlook (3-5 Years)

As technologies mature and organizations adapt, more substantial transformations will emerge in how software is built and operated:

  • Significant transformation of business processes through advanced technologies
  • New digital business models emerging

This period will see significant changes in architectures and operational models, with increasing automation and integration between previously siloed functions. Organizations will shift from reactive to proactive operating postures.

Long-Term Outlook (5+ Years)

Looking further ahead, more fundamental shifts will reshape how software development is conceptualized and practiced across digital ecosystems:

  • Fundamental shifts in how technology integrates with business and society
  • Emergence of new technology paradigms

These long-term developments will likely require significant technical breakthroughs, new regulatory frameworks, and evolution in how organizations treat technology as a fundamental business function rather than a purely technical discipline.

Key Risk Factors and Uncertainties

Several critical factors could significantly impact the trajectory of software development evolution:

Technical debt accumulation
Security integration challenges
Maintaining code quality

Organizations should monitor these factors closely and develop contingency strategies to mitigate potential negative impacts on technology implementation timelines.

Alternative Future Scenarios

The evolution of technology can follow different paths depending on various factors including regulatory developments, investment trends, technological breakthroughs, and market adoption. We analyze three potential scenarios:

Optimistic Scenario

Rapid adoption of advanced technologies with significant business impact

Key Drivers: Supportive regulatory environment, significant research breakthroughs, strong market incentives, and rapid user adoption.

Probability: 25-30%

Base Case Scenario

Measured implementation with incremental improvements

Key Drivers: Balanced regulatory approach, steady technological progress, and selective implementation based on clear ROI.

Probability: 50-60%

Conservative Scenario

Technical and organizational barriers limiting effective adoption

Key Drivers: Restrictive regulations, technical limitations, implementation challenges, and risk-averse organizational cultures.

Probability: 15-20%

Scenario Comparison Matrix

Factor Optimistic Base Case Conservative
Implementation Timeline Accelerated Steady Delayed
Market Adoption Widespread Selective Limited
Technology Evolution Rapid Progressive Incremental
Regulatory Environment Supportive Balanced Restrictive
Business Impact Transformative Significant Modest

Transformational Impact

Technology is becoming increasingly embedded in all aspects of business operations. This evolution will necessitate significant changes in organizational structures, talent development, and strategic planning processes.

The convergence of multiple technological trends, including artificial intelligence, quantum computing, and ubiquitous connectivity, will create both unprecedented challenges and innovative new capabilities.

Implementation Challenges

Technical complexity and organizational readiness remain key challenges. Organizations will need to develop comprehensive change management strategies to successfully navigate these transitions.

Regulatory uncertainty, particularly around emerging technologies like AI, will require flexible architectures that can adapt to evolving compliance requirements.

Key Innovations to Watch

Artificial intelligence, distributed systems, and automation technologies are leading innovation. Organizations should monitor these developments closely to maintain competitive advantages.

Strategic investments in research partnerships, technology pilots, and talent development will position forward-thinking organizations to leverage these innovations early in their development cycle.

Technical Glossary

Key technical terms and definitions to help understand the technologies discussed in this article.

Understanding the following technical concepts is essential for grasping the full implications of the technologies discussed in this article. These definitions provide context for both technical and non-technical readers.

platform: Platforms provide standardized environments that reduce development complexity and enable ecosystem growth through shared functionality and integration capabilities.