Introduction

Software project failures are a harsh reality in the world of technology. Despite the best intentions and efforts, projects can unravel due to various reasons, such as poor estimation and planning, inadequate requirements gathering, scope creep, and unrealistic timelines. These failures not only result in financial losses but also tarnish a company’s reputation and erode stakeholder trust. Addressing project failures requires a proactive approach, emphasizing communication, risk management, continuous evaluation and especially realistic estimation and planning. Embracing these lessons can lead to improved project outcomes and foster a culture of learning and growth in the software development industry.

In the dynamic world of software development, accurate cost estimation is crucial to ensure project success. Organizations rely on dependable software cost estimation practices to manage budgets, meet deadlines, and deliver quality products. To address this need, a new Software Cost Estimation Certification has emerged, complemented by the Cost Estimation Body of Knowledge for Software (CEBoK-S). In this blog, we will delve into the significance of this certification and the CEBoK-S, shedding light on how they empower professionals to excel in the field of software cost estimation.

The New Software Cost Estimation Certification (SCEC)

The new Software Cost Estimation Certification is a comprehensive program designed to equip professionals with the latest tools, methodologies, and best practices for accurately estimating software project costs. Offered by the International Cost Estimating and Analysis Association (ICEAA) through its special interest group, ICEAA-Software, this certification reflects the industry’s evolving demands and ensures that participants stay up to date with the latest trends.

Key Components:

  1. Advanced Estimation Techniques: The certification program covers a wide array of advanced estimation techniques, from traditional methods like function point analysis and COCOMO to modern approaches like agile estimation and parametric modelling. By learning these techniques, professionals gain the flexibility to adapt their approach to diverse project requirements.
  2. Risk Assessment and Mitigation: Effective cost estimation involves identifying potential risks and uncertainties that can impact the project’s outcome. The certification equips participants with the skills to assess and mitigate risks, allowing for better planning and resource allocation.
  3. Industry Case Studies: Real-world case studies are an integral part of the certification program. These case studies provide valuable insights into how cost estimation principles are applied in various scenarios, offering participants a practical understanding of the challenges they may encounter.
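One of the traditional techniques named above, COCOMO, reduces to a simple power law: effort grows slightly faster than linearly with software size. As a minimal sketch (not part of the certification material), here is Basic COCOMO in Python using Boehm’s published constants; the 32 KLOC “organic” project is a hypothetical example:

```python
# Basic COCOMO (Boehm, 1981): effort and schedule estimated from size.
# The (a, b, c, d) values are the published Basic COCOMO constants;
# the project classification below is an illustrative assumption.

COCOMO_BASIC = {
    # mode: (a, b, c, d) where effort = a * KLOC**b (person-months)
    #       and duration = c * effort**d (months)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str = "organic") -> tuple[float, float]:
    """Return (effort in person-months, duration in months) for a project
    of `kloc` thousand lines of code in the given development mode."""
    a, b, c, d = COCOMO_BASIC[mode]
    effort = a * kloc ** b
    duration = c * effort ** d
    return effort, duration

effort, duration = basic_cocomo(32, "organic")  # hypothetical 32 KLOC project
print(f"{effort:.1f} person-months over {duration:.1f} months")
```

The nonlinearity is the point: doubling the size of a project more than doubles the estimated effort, which is one reason naive per-line extrapolation underestimates large projects.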

The CEBoK-S – Cost Estimation Body of Knowledge for Software

The CEBoK-S is a comprehensive guide that provides a structured framework for software cost estimation. Developed by industry experts, this body of knowledge encompasses a wide range of topics, from fundamental concepts to advanced practices, creating a solid foundation for professionals in the field.

Key Features:

  1. Detailed Framework: The CEBoK-S offers a detailed framework that covers all aspects of software cost estimation. It defines the key processes, activities, and inputs required for accurate estimation, guiding professionals through the entire estimation lifecycle.
  2. Best Practices and Standards: In an ever-changing industry, adhering to best practices and standards is crucial. The CEBoK-S outlines established industry standards, ensuring consistency and reliability in cost estimation practices across projects and organizations.
  3. Continuous Updates: Software development is continually evolving, and the CEBoK-S keeps pace with these changes. It undergoes regular updates to reflect the latest advancements and emerging trends in the field, making it a reliable and relevant resource for professionals.

Impact on the Software Industry

The combination of the new Software Cost Estimation Certification and the CEBoK-S has revolutionized the software industry’s approach to cost estimation. Certified professionals armed with the knowledge from the CEBoK-S are better equipped to address the challenges posed by modern software projects, leading to improved project outcomes and client satisfaction.

  1. Enhanced Project Planning: The comprehensive knowledge gained from the certification and the CEBoK-S enables professionals to create accurate and realistic project plans. This, in turn, leads to better resource allocation, reduced budget overruns, and timely project deliveries.
  2. Quality and Consistency: Employing standardized cost estimation practices ensures consistency in project management across different teams and organizations. This leads to higher-quality software development, as well as improved collaboration and communication among stakeholders.
  3. Improved Stakeholder Trust: Clients and stakeholders place their trust in organizations that employ certified professionals and follow industry standards. The certification acts as a testament to an organization’s commitment to excellence and professionalism.
  4. Higher Project Success Rates: Sound estimation practices raise the success rate of software development projects, resulting in fewer cost and schedule overruns and sparing companies substantial financial and reputational damage.

Conclusion

In conclusion, the new Software Cost Estimation Certification and the CEBoK-S are instrumental in equipping professionals with the knowledge and skills required to excel in software cost estimation. By combining advanced estimation techniques with a structured body of knowledge, these resources elevate the industry’s cost estimation practices to new heights. As organizations continue to embrace these certifications, we can expect to see more successful projects, satisfied clients, and a stronger, more reliable software industry overall.

IDC Metri is proud to announce that its Software Cost Estimation Center of Excellence now has two Software Cost Estimation Certified professionals: Frank Vogelezang and Harold van Heeringen. More information can be found here: https://www.idc.com/eu/idcmetri/it-intelligence

Better insights

The vast majority of the nearly 800 CxO participants in our future enterprise survey experienced challenges in getting insight into the performance of their software development and DevOps teams. Most organizations have practices in place to monitor and report on performance, but this insight does not reach the CxO level. What causes this paradox, and what can you do to turn the insights within the organization into effective CxO information?

Insights into the teams

As the numbers show, there is a good chance that your development and DevOps teams use at least one of the top four practices to monitor performance. What we encounter in practice is that these practices are implemented inconsistently across development and DevOps teams, in ways that leave the results open to personal judgment or taste. If you have implemented code quality management tools such as SonarQube, Checkstyle, or FindBugs but have no rules on how these tools should be used, how do you know the quality of the software your teams are developing or maintaining? And if you don’t use them consistently throughout your whole application portfolio, how do you know where to direct improvement attention?

Team productivity is quite often determined based on DORA or Flow metrics. These metrics are ideal at the team level for helping teams improve, but they are not suitable for CxO-level decisions because they have no consistent meaning beyond the team level.
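For illustration, the team-level DORA numbers meant here, such as deployment frequency and lead time for changes, can be derived from delivery tooling with a few lines. The records and field layout below are hypothetical, not a standard schema:

```python
# A minimal sketch of two DORA metrics computed from hypothetical
# delivery records: deployment frequency and lead time for changes.
from datetime import datetime, timedelta
from statistics import median

deployments = [
    # (commit timestamp, production deploy timestamp) per change
    (datetime(2024, 5, 1, 9, 0),  datetime(2024, 5, 2, 14, 0)),
    (datetime(2024, 5, 3, 11, 0), datetime(2024, 5, 3, 16, 30)),
    (datetime(2024, 5, 6, 8, 0),  datetime(2024, 5, 8, 10, 0)),
]

window_days = 7
deploy_frequency = len(deployments) / window_days        # deploys per day
lead_times = [deploy - commit for commit, deploy in deployments]
median_lead_time = median(lead_times)                    # a timedelta

print(f"Deployment frequency: {deploy_frequency:.2f}/day")
print(f"Median lead time: {median_lead_time}")
```

Note that both numbers depend entirely on how a team defines “commit” and “deploy,” which is exactly why they do not compare cleanly across teams, let alone organizations.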

IDC is all about data. With our KPI model, we can help you determine the right dataset for making fact-based decisions that direct teams to bring value to your organization. If you are already a value-driven organization, IDC can tie the right KPIs to your objectives and key results. With an assessment of the data in your tools, we can help you piece the right data together, or we can help you fill the gaps based on best practices.

Examples of real insight

Some examples of real insight that we have helped set up for some of our customers:

  • Output of the teams. How much software the teams are producing, compared with similar teams in similar companies, based on standardized sizing metrics. This is real management information on top of the DORA, Flow, and happiness metrics the teams use themselves.
  • Estimation erosion. As internal team processes evolve, so do internal estimates, and improved internal processes are not always reflected in more value per sprint. By comparing internal metrics to standardized metrics, you can determine whether estimation is improving or eroding.
  • Code analysis at system or portfolio level. Almost half of the organizations we interviewed use automated code quality management tools. Most of these tools are limited to establishing the quality of an individual piece of code. Code analysis at the system or portfolio level looks deeper and wider, taking into account the quality of the total code base. This can bring to light risks arising from interactions between individual pieces of code, or from poor update practices for the open source components used in your code base.
  • Insight into the real Bill of IT. All IT costs, including cloud spend, can be mapped to the right value stream.

Better decisions

By comparing the real insights from your company with the insights from our IDC global market data, we can show you how you compare with similar organizations. By comparing like for like, we show you the differences between you and your peers. With this information, you can make fact-based decisions on whether or not you need to change something. Our insight can help you cooperate more effectively with external service providers, make better sourcing decisions, and decide how to set up your IT organization, how to create more value, or how to transform your application landscape.

Better outcomes

What we see with all of our clients is that better-informed decisions lead to better outcomes.

To learn more about how IDC can help you and get connected with someone on our team, click the button below.

In recent years, B2B digital commerce has seen significant growth in the manufacturing sector. Manufacturers are increasingly turning to digital channels for both purchasing and sales, driven by several factors. According to our 2023 Global Manufacturing Survey, 47% of manufacturers globally consider B2B digital commerce as a strategic initiative to drive customer experience.

We speak of digital commerce because it encompasses buying and selling in digital channels and covers the processes that improve engagement across the customer journey, whereas e-commerce has a narrower focus on buying and selling online.

The COVID-19 pandemic acted as a catalyst for the adoption of digital commerce in the manufacturing industry, as digital channels provided a safe and convenient alternative that ensured continuity in sales and procurement processes. The competitive landscape in manufacturing also requires companies to optimize costs and shorten their time to market.

In an era of rapidly changing customer demands and technological advancements, manufacturers need to be agile and responsive. Furthermore, digital channels open up opportunities for manufacturers to reach new customer segments and markets.

By embracing digital commerce, manufacturers can overcome these barriers and tap into previously untapped potential.

Customer Experience is Key to Digital Commerce

In addition to the growing importance of B2B digital commerce, customer experience (CX) has become a key competitive differentiator for manufacturers globally. In the past, manufacturers focused on efficiency and product quality, but in today’s highly competitive environment, improving the customer experience has become crucial for success.

Many decision-makers involved in B2B transactions are millennials, who have grown accustomed to seamless online experiences in their B2C interactions and therefore expect the same level of convenience and personalization in their B2B transactions. To meet these expectations, manufacturers need to provide self-service platforms that allow customers to navigate and make purchases independently.

Slow workflows and manual processes are no longer acceptable; customers demand on-demand access to information and assistance. In response, manufacturers have embedded chat functionalities (chatbots) to provide real-time support and address queries or concerns.

Manufacturers should also invest in designing intuitive platforms that are easy to navigate and provide a seamless buying experience. Detailed product information, along with complementary product suggestions and immersive product visualization tools, can significantly enhance CX.

How Manufacturers Can Prioritize the Customer

Delivering Personalized and Immersive Customer Experiences Through Cutting-edge Technologies

Personalization is key in B2B digital commerce. Customers expect tailored recommendations, targeted promotions, and real-time updates on order and delivery statuses. They also desire the ability to configure products and witness dynamic price changes during the buying process.

Additionally, customers increasingly prefer digital-first experiences, allowing them to make purchases anytime, anywhere, from their mobile devices.

To execute these CX initiatives successfully, manufacturers are relying on advanced analytics and artificial intelligence (AI). These technologies provide valuable insights into customer behaviors and preferences, enabling manufacturers to deliver personalized experiences at scale.

AI-powered recommendation engines, for example, can suggest relevant products based on customer browsing and purchasing history. Manufacturers are also delivering immersive experiences enabled by 3D visualization, AR/VR, and other digital technologies in different phases of the buyer journey.

The customer experience does not end with the purchase. Manufacturers must have efficient processes in place to handle product returns and resolve disputes promptly. There is also an opportunity to provide additional services such as maintenance and upgrades, further enhancing the overall customer experience.

Staying Ahead of the Game Requires Continuous Innovation

In this rapidly evolving segment, manufacturers should keep an eye on emerging models and technologies. Industrial B2B marketplaces are emerging as a viable option for manufacturers to reach a broader customer base. Direct-to-customer models, which aim to establish closer relationships with customers, can provide manufacturers with valuable insights into product usage and promote loyalty.

New architectures, such as headless commerce, can enable manufacturers to quickly deploy features that improve the user experience without significant infrastructure investments. This flexibility allows manufacturers to adapt and innovate at a rapid pace, staying ahead of evolving customer needs and preferences.

Connected Value Chains, Sales Empowerment, and Customer-centricity Are Key to a Successful B2B Digital Commerce Strategy

Despite the promise of B2B digital commerce, there are areas that manufacturers need to address to fully leverage its potential.

Breaking down data silos and ensuring seamless integration of digital commerce with different value chain processes is crucial. This means aligning sales and marketing efforts and integrating ERP, CRM, and other business applications with e-commerce platforms.

Manufacturers must also determine their optimal omnichannel strategy and cultivate mutually beneficial relationships with distributors to jointly create value.

The role of humans should not be overlooked in this digital transformation. Salespeople still play an important role in the B2B buying process, and their capabilities need to be augmented to align with the digital commerce landscape.

Thanks to digital commerce, sales reps are no longer burdened by time-consuming tasks and can focus on other activities, such as guiding customers through the buying process for more complex orders. Empowering salespeople with readily available information and insights can enhance their interactions with customers and drive sales.

Finally, building an organization-wide culture based on customer centricity is vital for manufacturers. Putting the customer at the center of decision-making across all organizational functions ensures that CX remains a top priority and drives continuous improvement.

 

The IDC Manufacturing Insights: Worldwide Manufacturing B2B Commerce and Customer Experience Technology Strategies subscription service analyses the key trends in B2B digital commerce and how manufacturers are creating value and delivering innovative customer experiences by leveraging digital technologies. Reach out to me at mcasidsid@idc.com to learn more about this new program.

Is GenAI the “Next Big Thing” in Retail?

The retail industry is moving fast. Evolving customer expectations and needs, fierce competition, and the quest for enhanced online customer experience, among other factors, are driving retailers to rush into experimenting with emerging technologies. In 2022, newspapers were full of headlines about bold retailers and brands landing in the Metaverse; 2023 has already proved to be the year of Generative AI (GenAI).

Following the hype, in recent months we have seen the web crawling with OpenAI DALL-E experiments; examples include the viral AI-generated video series “By Balenciaga” and the ultra-realistic image of Pope Francis wearing a Moncler puffy jacket. However, the potential applications of the technology in the retail industry are far more tangible than digital content created merely for customers’ entertainment.

GenAI technologies (such as ChatGPT and DALL-E) and solutions powered by LLMs or text-to-image models are widely expected to have a disruptive impact across all retail operations and functions. The expected benefits of the technology span from enhancing and accelerating retailers’ marketing creative processes to personalizing online customer journeys based on individual preferences and needs.

How Are Retailers Approaching Generative AI Nowadays?

With OpenAI’s release of ChatGPT in November 2022, large language models (LLMs), or foundation models (FMs), became available to the general public, and everyone had the chance to experience firsthand their transformative power in augmenting human productivity and creativity, retailers included.

According to our findings, 40% of worldwide retailers and brands are in the experimentation phase of GenAI, trying to figure out the most relevant field of applications and use cases, while 21% are already investing in the implementation of GenAI technologies. Today we see organizations such as Coca-Cola, Mattel, and Carrefour starting to pilot GenAI applications, even if still on a limited scale and with a test-and-learn approach, predominantly across the areas of product development, marketing, and customer service.

 

Download eBook: Generative AI in EMEA: Opportunities, Risks, and Futures

 

Which GenAI Use Cases Do Retailers Expect to Explore in the Short Term?

In the short term, most retailers expect to explore LLM and FM applications in marketing, sales, and customer engagement, as confirmed by our data showing that 50% of them will prioritize GenAI marketing use cases over the next 18 months. Knowledge management, design, and conversational applications follow marketing in retailers’ current prioritization of GenAI use cases, while code applications are lagging for the moment, in contrast to other industries, where they are leading the way in technology implementation.

Unsurprisingly, retailers are starting to approach GenAI through less strategic and less data-sensitive use cases (e.g., marketing and conversational applications), using publicly available commercial models or plug-in solutions. Other, more strategic retail use cases (e.g., product prototyping and knowledge management) would require fine-tuning commercial models on the organization’s own dataset, and potentially training a proprietary model or enterprise solution.

We can therefore expect these strategic use cases to reach wide adoption in the medium to long term, once the technology matures and retailers have developed stronger in-house GenAI capabilities and workforce change management.

How Will GenAI Impact the Retail Industry?

We define Generative AI as a branch of computer science that involves unsupervised and semi-supervised algorithms that enable computers to create new content using previously created content, such as text, audio, video, images, and code in response to short prompts. Therefore, the power of GenAI goes beyond chatbots and lies in its ability to create infinite, contextualized content of any format.

Theoretically, the retail industry could see significant improvements once advanced GenAI content creation is added on top of the existing AI/ML applications that retailers use to automate processes, enhance personalization, and increase efficiency, across all retail functions.

In the short term, even though the technology inevitably raises important questions about proprietary data sharing, customer data privacy, and factual inaccuracy, integrating GenAI into online customer journeys could significantly improve both the back end of e-commerce workflows and the front end of the online customer experience, leading to major gains in retail e-commerce efficiency and efficacy.

The disruptive power of GenAI for digital commerce is not going unnoticed. For instance, the Chinese e-commerce giant JD.com announced the imminent release of its own retail-specific ChatGPT solution, which aims to improve online retailers’ product-listing rankings on search engine results pages (SERPs), generate product descriptions tailored to a shopper’s preferences, and optimize the generation of online product images and video.

 

Watch the Webcast: Generative AI in EMEA: Opportunities, Risks, and Futures

 

Two Nascent GenAI Use Cases with Examples 

  1. Conversational Customer Service Chatbots: Carrefour

Compared with standard chatbots, which are typically limited by a defined number of interactions and decision trees, GenAI conversational applications open up an almost unlimited range of responses to customer inquiries. A GenAI virtual assistant can offer more inspirational advice, guiding shoppers through the entire website navigation journey with contextual, relevant information about the products in a vast online catalogue.

In the short term, conversational AI could be an attractive option for retailers needing to improve a poor e-commerce search experience, since it can be added by plugging in GenAI solutions through an API rather than overhauling in-house search capabilities. Ultimately, GenAI could lead to a different kind of product search, giving shoppers the chance to start a conversation with the retailer or brand by uploading an image, photo, or audio clip, or by simply speaking to an intelligent AI concierge.

For example, since June 8, French grocery retailer Carrefour has integrated a ChatGPT-based chatbot into its company website, giving shoppers the ability to use natural-language AI to assist their daily grocery shopping. Customers can find “Intelligent Assistance” on the company home page and consult it to choose the best products based on budget, food intolerances and allergies, or even specific menu ideas.

Because it is also connected to Carrefour’s website search engine, the chatbot can produce a real-time list of products while conversing with the customer until the purchase is finalized.

  2. Enhanced Product Visualization Online: Anthropologie, Everlane, H&M and LOFT

Product visualization has historically been a main source of frustration for shoppers, and of forgone sales for retailers, especially when the product in question is an experiential one such as a piece of clothing or furniture. Using GenAI for image generation, retailers and brands can streamline the creation of product images, reducing the time and costs associated with in-studio shoots. GenAI tools can also transform text or 2D images into 3D product representations, making online product catalogues more appealing and interactive.

H&M, LOFT, Everlane, and Anthropologie, among other brands, are all partnering with Google on its latest release of a GenAI virtual try-on tool for women’s apparel in Search. Starting from a product image in the brand’s catalogue, the tool creates, in real time, an AI-generated rendering of how the garment would fit a diverse set of real models in various poses.

Compared with previous attempts at virtual try-on for clothes, such as the AR-powered “Be Your Own Model” released by Walmart, GenAI raises the bar by accurately reflecting how articles of clothing drape, fold, cling, stretch, and form wrinkles and shadows, and by doing so at scale.

GenAI in Retail: What’s Next?

LLM, FM, and GenAI advancements are happening quickly, especially compared with the slower-burn technologies that emerged in previous years, such as the metaverse, blockchain, and augmented and virtual reality (AR/VR). The immediate applications for the retail industry described above suggest that retail and brand executives must start planning their GenAI strategy today, since many use cases that seem futuristic now could become viable within months rather than years.

In this context, a wait-and-see approach could be risky for retailers. They must instead start experimenting with a test-and-learn approach now to develop a repeatable process that can later be deployed at larger scale, while laying the groundwork in the workforce for GenAI change management and in-house capability building.

One can be forgiven a quizzical expression on hearing that a third of the world’s population still lacks internet access.

According to the International Telecommunication Union (ITU), an estimated 2.9 billion people, representing 35% of the world’s population, do not have access to the internet. The least developed countries are the least connected; however, these connectivity gaps also extend to developed countries.

For example, approximately 19 million Americans, about 6% of the population, still lack access to fixed broadband service at threshold speeds. Connectivity gaps also exist in parts of Europe, Asia, Africa, and South America.

Against this backdrop, the challenge of bridging the digital divide creates an opportunity for communications service providers (comms SPs) to partner with low Earth orbit (LEO) satellite operators and offer high-speed, cost-effective options. These partnerships can close the remote and rural connectivity gap, where terrestrial network deployment has not been financially feasible for building robust digital infrastructure.

The pairing of satellite providers’ LEO constellations with comms SPs’ mobile networks promises to deliver high-speed satellite-to-cellular coverage and broadband internet access to connect the unconnected. Comms SPs expect to derive significant benefits, including extended coverage with lower latency, resiliency, access to emerging markets, and the ability to offer new services and applications.

The Surge in LEO Satellite Constellations

Historically, satellite ventures were hindered by the substantial costs and thin profit margins associated with launching and maintaining a fleet of satellites. This time, satellite companies have a higher chance of success for several key reasons:

  • Smaller, more capable LEO satellites are being placed in orbit at altitudes of 2,000 km or less, and advances in rocket technology have made reusable rockets a reality. This significantly improves the economics of launching satellites into low Earth orbit, making the business case more viable.
  • Conventional satellites in more distant GEO and MEO orbits have faced challenges with cost, signal delay, and high latency in many use cases, whereas LEO satellite services offer lower latency, in the range of 20–50 ms, thanks to the satellites’ lower orbit, making them a more viable high-speed satellite connectivity option.
  • Combined private and public funding for LEO satellite ventures runs into the billions of dollars. Governments, driven by the desire to establish a presence in space, are diverting resources from other programs to invest in space-related initiatives. Prominent examples include the U.S., India, China, and the European Union.
  • Private companies such as SpaceX are outperforming governments in deployment speed, shaving years off the process. Previously struggling companies such as OneWeb and Iridium have also been revived out of bankruptcy through public and private funds.
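The latency gap in the second bullet follows largely from orbital altitude and the speed of light; a quick propagation-delay check, ignoring processing, queuing, and ground-segment routing (so real figures are higher), illustrates it:

```python
# One-way propagation delay to a satellite directly overhead,
# ignoring processing, queuing, and ground-segment routing.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def one_way_delay_ms(altitude_km: float) -> float:
    return altitude_km / C_KM_PER_S * 1000

leo = one_way_delay_ms(550)      # e.g. a Starlink-class LEO shell
geo = one_way_delay_ms(35_786)   # geostationary orbit

print(f"LEO one-way: {leo:.2f} ms, GEO one-way: {geo:.1f} ms")
# A user -> satellite -> gateway round trip traverses four such legs,
# which is why LEO services can quote latencies in the tens of ms
# while GEO links start well above 200 ms round trip.
```

The 550 km altitude is an assumption for illustration; any altitude under 2,000 km gives single-digit one-way delays, versus roughly 120 ms one way for GEO.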

With these factors at play, the current landscape presents a more favorable environment for LEO satellite operators to thrive and overcome previous challenges in addressing a vast untapped market.

Comms SPs Tying the Knot with LEO Satellite Partners

The advent of LEO satellites signals an important new opportunity for comms SPs to extend the reach of their networks. Early and notable comms SP and satellite operator partnerships capitalizing on this opportunity with first-mover advantage include:

  • T-Mobile announced its technology alliance with SpaceX (Starlink) in August 2022 to develop satellite-to-smartphone connectivity in the United States by pairing the Starlink satellite constellation in LEO with T-Mobile’s wireless network.
  • Verizon announced its strategic collaboration with Amazon (Project Kuiper) LEO satellites in October 2021 to extend 4G/LTE and 5G connectivity solutions for both consumers and businesses in rural and remote communities in the United States.
  • Orange SA and OneWeb announced their partnership, in March 2023, to expand connectivity services across Europe, Africa, Latin America, and other global regions using LEO satellite communications.  
  • AT&T, Rakuten, and Vodafone have each separately partnered with AST SpaceMobile to use its LEO satellites to provide direct satellite-to-cellular communications. The partnership, in April 2023, tested the first-ever direct voice connection from space to an everyday cellular device.
  • NTT and SES announced in April 2023 a multiyear partnership to use MEO satellites to deliver NTT’s edge as a service to enterprise customers.
  • Apple in partnership with Globalstar unveiled satellite connectivity for its iPhone 14 and iPhone 14 Pro models in November 2022 to provide emergency SOS satellite service in remote areas without regular cell phone coverage.
  • Qualcomm and Iridium announced their partnership in January 2023 to bring satellite-to-cellular communications to Android smartphones connected to Iridium’s LEO satellite constellation.

It is evident that satellite providers who also own the rockets to launch their LEO satellites have an advantage in building out their constellations over those that rely on third-party rocket providers. Due to increased competition, IDC expects smaller satellite operators will either be acquired or be forced to partner to survive. The right partnerships will also be crucial for comms SPs to support their growth plans for satellite-to-cellular coverage, as well as their future road maps for broadband wireless services.

IDC’s Conclusion

IDC is projecting that alliances with long-term implications will increase and become more significant through the pairing of satellite providers’ LEO constellations with comms SPs’ networks, enabling comms SPs to evolve beyond terrestrial cellular with LEO satellite services that include high-speed internet, 4G/LTE and 5G broadband backhaul, and the Internet of Things (IoT).

The partnering of terrestrial and satellite communications holds the promise of helping to bridge the digital divide and open new economic opportunities for remote areas of the world. It also aligns with the United Nations mandate that, by 2030, every person in the world should have safe and affordable access to the internet.

The coexistence of satellite and cellular networks will become increasingly prevalent, and the continued standardization effort in 3GPP ensures that 5G systems integrate NTNs, such as high-altitude platform systems and LEO, MEO, and GEO satellites, with terrestrial options.

The new wave of LEO satellite deployments has placed them on the radar of government regulatory bodies. Regulators are taking note of next-generation satellite technologies, and government oversight will continue to increase as nations look to establish protective borders in space.

LEO satellite constellations offer a pragmatic and more cost-effective option for closing the remote and rural connectivity gap, given that terrestrial wireless network deployment has not been financially feasible as a way to develop a robust digital infrastructure in these areas.

Integrating LEO satellite connectivity presents global comms SPs with a huge opportunity to connect the unconnected and help bridge the digital divide. Hence, the right alliances with satellite operators will be critical for comms SPs to support their growth plans for ubiquitous satellite-to-cellular coverage and 5G wireless data and broadband services.

For more insight and information on comms SPs expanding service and revenue opportunities with LEO satellites, please see IDC’s market perspective, “Communication Service Providers Expand Service and Revenue Opportunities with Low Earth Orbit Satellites.”

Peter Chahal - Research Director, Worldwide Telecommunications Services and Strategies - IDC

Peter Chahal is a Research Director in IDC's Network and Telecommunications research practice, covering telecommunication services and strategies. Key areas of his research include mobile broadband services, 5G monetization, SD-WAN, wireline broadband services, and other emerging telecom digital services. Peter also looks at telecommunication service providers’ broader strategies and how those strategies influence their digital transformations. His research helps telecommunication service providers and vendors better understand the worldwide telecommunications market and discover new opportunities for growth.

The breach of U.S. government email accounts by Chinese hackers, which Microsoft reported this week, will no doubt become one of the leading cybersecurity stories of 2023. It will join the likes of the SolarWinds hack and the Log4J fiasco as a major cybersecurity breach. Analysts will cite this breach for years to come to underscore why investment in cybersecurity is so critical.

But the government email breach isn’t just another example of what can go wrong when the bad guys find holes in cyberdefenses. It exemplifies novel cybersecurity challenges that forward-thinking businesses should be preparing for today. Doing more of the same on the cybersecurity front won’t protect organizations against attacks like this one.

To prove the point, let’s unpack three key takeaways from the Microsoft government hack and what they mean for the future of cybersecurity at organizations everywhere.

1.  Sophisticated cyber threats are the new normal

Perhaps the most important lesson from this incident is that the most serious threats facing organizations today are highly sophisticated ones. Gone are the days when ransomware or phishing attacks targeting low-hanging fruit – meaning businesses that were underprepared because they failed to meet standard baselines of cybersecurity preparedness – were the main challenge that CISOs had to worry about.

In the case of this hack, sophisticated, government-sponsored threat actors who presumably had extensive resources at their disposal carried out a complex attack. They didn’t exploit a known vulnerability in a server that someone had forgotten to patch, or email unsuspecting employees hoping to trick them into handing over passwords. They apparently discovered and exploited a zero-day flaw within a complex system managed by one of the world’s most sophisticated and successful tech companies – Microsoft.

The takeaway here is that erecting defenses that make your organization harder-than-average to breach will no longer suffice because the bad guys might not simply be searching for low-hanging fruit. Protecting against more mundane risks, like those that traditional ransomware attackers exploit, remains important, of course. But understanding which threat actors might be seeking to carry out more sophisticated and targeted attacks against your organization will be critical, too.

2.  The meaning of software supply chain security needs to expand

The government hack is also notable because it highlights how sophisticated threat actors are targeting software supply chains in novel ways.

Traditionally, when cybersecurity analysts talked about software supply chain security, they referred to ensuring that any third-party software components that businesses imported into their applications – such as open source libraries or modules – were secure. That is why the U.S. government instructed developers to compile Software Bills of Materials (SBOMs), which provide visibility into the third-party software resources that businesses depend on.
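As a concrete illustration of that visibility, here is a minimal sketch of the kind of lookup an SBOM enables. The component names, versions, and advisory below are hypothetical, and the SBOM is reduced to a plain Python dictionary rather than a real SPDX or CycloneDX document:

```python
# A tiny, hypothetical SBOM: the third-party components an application depends on.
sbom = {
    "application": "example-app",
    "components": [
        {"name": "libexample", "version": "1.4.2"},
        {"name": "jsonparser", "version": "2.0.1"},
    ],
}

# A known-vulnerability advisory (also hypothetical).
advisory = {"name": "libexample", "affected_versions": {"1.4.1", "1.4.2"}}

def is_affected(sbom: dict, advisory: dict) -> bool:
    """Return True if any component in the SBOM matches the advisory."""
    return any(
        c["name"] == advisory["name"] and c["version"] in advisory["affected_versions"]
        for c in sbom["components"]
    )

print(is_affected(sbom, advisory))  # this SBOM contains libexample 1.4.2
```

In practice, vulnerability scanners perform this matching automatically against public advisory databases at much larger scale.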

SBOMs are important for determining whether a known vulnerability impacts a business’s apps. But in the case of the government hack, an SBOM would not have helped protect victims because this was not a conventional software supply chain attack. The Chinese hackers did breach the software supply chain of government agencies, but they did so by targeting a third-party SaaS platform – Microsoft Exchange Online – rather than targeting code that victims incorporated directly into their own software applications.

This type of supply chain breach isn’t entirely novel. The SolarWinds attack similarly gave attackers a backdoor into the IT environments of companies that used a third-party application. Although in that case the attackers planted malicious code into the product, which made the incident resemble a traditional supply chain breach in many ways. With this latest government hack, threat actors didn’t plant malicious code into anyone’s software supply chain; they just found a way to exfiltrate user data from third-party software that government agencies depend on.

Plus, to remediate the attack, victims had to rely on their software vendor. That also makes this incident different from conventional software supply chain attacks, which developers can remediate themselves by removing vulnerable components from their applications. In this case, only Microsoft could supply the remediation.

The key lesson here is that CISOs must think more broadly about software supply chain security. Securing your supply chain must involve more than just ensuring the security of third-party code that your business depends on. Engaging with software vendors to manage supply chain security in SaaS apps is equally critical.

3.  Geopolitics continues to impact cybersecurity

Finally, this incident is a reminder that geopolitics is poised to play an increasingly important role in cybersecurity. Although it’s not clear exactly why Chinese hackers targeted government agencies, it’s a reasonable assumption that Chinese government competition with the United States was a factor.

A similar story has already been playing out in the context of the Ukraine conflict, which bore important implications for global cybersecurity. This latest attack highlights how tension between two other nations – the United States and China – may also lead to increased cyber attacks and cyberespionage, and perhaps not only against government agencies.

Advice for CISOs

A deep dive on what all of the above means for CISOs and the future of cybersecurity is beyond the scope of this post. But in short, businesses must adapt to a changing cybersecurity environment by:

  • Steeling themselves against highly sophisticated threat actors.
  • Taking a broader and more dynamic approach to software supply chain security by extending it to include partnerships with third-party SaaS vendors.
  • Anticipating and preparing to respond to cybersecurity threats linked to global conflicts or tensions.

That, at least, is what we’re thinking about the future of cybersecurity based on the information released so far about the Chinese hack. We look forward to sharpening our thoughts and guidance as more details emerge about the nature and scope of the incident.

To access more insights and learn more about the benefits of IDC and how our IT Executive Programs can help you with your IT strategy, implementation, and digital transformation, click the button below.

Christopher Tozzi - Adjunct Research Advisor - IDC

Christopher Tozzi, an adjunct research advisor for IDC, is senior lecturer in IT and Society at Rensselaer Polytechnic Institute. He is also the author of thousands of blog posts and articles for a variety of technology media sites, as well as a number of scholarly publications. Prior to pivoting to his current focus on researching and writing about technology, Christopher worked full-time as a tenured history professor and as an analyst for a San Francisco Bay area technology startup. He is also a longtime Linux geek, and he has held roles in Linux system administration. This unusual combination of "hard" technical skills with a focus on social and political matters helps Christopher think in unique ways about how technology impacts business and society.

On May 24, AMD revealed its new Radeon RX 7600 graphics card. This is an entry-level card positioned to play the newest games at 60+ frames per second (fps) at 1080p. It supports very efficient streaming using the latest AV1 encoding technology. According to AMD, the card performs 1080p gaming 29% faster on average than the AMD Radeon RX 6600.

AMD’s latest RDNA 3 generation of cards have marked ray tracing improvements over the previous RDNA 2 versions. Our tests show that the Radeon RX 7600 can get close to the performance of the Radeon RX 6700 XT midrange card in ray tracing benchmarks such as Speed Way and Port Royal. The RX 7600 achieved around 86% of the performance of the midrange card in both tests using default driver settings.

The Radeon RX 7600 is based on the AMD RDNA 3 architecture and includes revamped compute units with unified ray tracing and AI accelerators. It also features the second generation of AMD’s Infinity Cache technology.

The Test Platform

The AMD Ryzen 5 7600X processor, the Radeon RX 7600 graphics card, the GIGABYTE X670E Aorus Master motherboard, and a G.SKILL Trident Z5 Neo 2x16GB DDR5-6000 EXPO memory kit — which were all provided to IDC by AMD — comprised the test PC hardware components. The primary Windows 11 disk was a 1TB GIGABYTE Aorus NVMe Gen4 solid state drive.

A be quiet! Silent Loop 2 280mm water cooler was fitted for the processor, which was coupled with a be quiet! STRAIGHT POWER 11 Platinum 850W power supply. A 34” Dell Gaming S3422DWG monitor — a quad-HD 3440x1440 display with a 144Hz refresh rate, FreeSync, 10-bit colors, and high dynamic range functionality — was also used.

The reviewers utilized the motherboard’s optimal default settings, set the memory profile to EXPO 6000, and made sure that smart access memory was enabled. No special tuning, optimization, or overclocking was carried out for the tests.

Synthetic Benchmarks and Productivity Performance

Blender Benchmark 3.5.0 was used to evaluate the graphics card’s rendering performance. The Radeon RX 7600 ranked in the top 29% of all benchmarks, thanks to the Heterogeneous-Compute Interface for Portability (HIP), AMD’s GPU compute language, which Blender Benchmark uses instead of OpenCL. The result was far quicker than expected. This is good news for gamers who do light personal and family photo editing or enhance pictures for social media posts.

The system’s 3DMark Time Spy score of 10,557 was better than 60% of all results, which is respectable for an entry-level gaming machine.

Gaming Performance

Various old and new video games were tested on the platform, including next-gen versions.

Shadow of the Tomb Raider

This game averaged 134fps at 1080p with the maximum graphics settings and AMD’s FidelityFX Contrast Adaptive Sharpening enabled. With ray-traced shadows enabled at the high setting, the game ran at an average of 77fps with a low of 53fps. Increasing ray-traced shadow quality to extreme resulted in an average of 70fps and a minimum of 43fps.

Far Cry 6

This game averaged 118fps at the 1080p high graphics quality setting, registering a minimum of 98fps. During testing, all DirectX Raytracing (DXR) and FidelityFX Super Resolution (FSR) capabilities were activated. Increasing the graphics settings to ultra quality resulted in an average of 99fps and a minimum of 85fps.

Cyberpunk 2077

At 1080p, this game averaged 37fps with a minimum of 22fps. Ultra ray tracing presets and FSR 2.1 capabilities were activated automatically. The game performed at an average 50fps and a minimum of 35fps using the medium ray tracing setting, resulting in a much smoother experience.

The Witcher 3: Wild Hunt Next-Gen

This game averaged 38fps at 1080p, with a minimum of 26fps. Ultra ray tracing presets and FSR 2.1 capabilities were activated automatically. The game functioned significantly better at the medium ray tracing setting, clocking an average 57fps and a minimum 46fps. Without ray tracing, rasterization performance averaged 104fps and registered a minimum of 76fps in extreme settings.

Frequency, Power Consumption, Temperature, and Noise

The RX 7600 operated at an average frequency of 2545MHz, consumed 160W of power, and attained an average temperature of 79C when playing The Witcher 3 in ultra ray tracing mode, with the GPU loaded to 99%. Due to their small size and low revolutions per minute, the two 90mm fans kept the card cool and noiseless.

Final Words and Conclusion

According to IDC’s monitor tracker, about two-thirds of new monitors still have a max resolution of 1080p. There is a massive installed base of such monitors. Not every customer with full HD aspirations is seeking the best and most costly gear. For example, Minecraft and Roblox are popular among youngsters, while Fortnite in performance mode is popular among teens. Such groups will be very delighted with a PC powered by the RX 7600, and their parents will not have to seek a loan to build it!

AMD faces increased competition now that Intel has entered the arena alongside Nvidia. Difficult macroeconomic conditions — ranging from inflation to a war on the ground in Europe — are reducing consumer purchasing power. However, AMD has wisely evaluated the market conditions and taken quick and clever measures to adjust, such as reducing the proposed end-user price of the Radeon RX 7600 from an anticipated $299 to $269! AMD has also reduced the prices of its previous generation RDNA 2-based RX 6000 series cards, thereby providing gamers and customers with a wide selection of goods at various price points.

In conclusion, there is a lot to like about the AMD Radeon RX 7600. It is an affordable, sleek, and compact dual slot, dual fan graphics card that delivers impressive 1080p gaming performance at 50+fps on the highest graphical settings with FSR and ray tracing enabled.

Mohamed Hakam Hefny - Senior Program Manager - IDC

Mohamed Hefny leads market research in EMEA on professional workstation PCs and solutions. He also reports on professional computing semiconductors, processors, and accelerators (CPUs and GPUs), as well as breakthroughs and trends related to the market. In addition, Mohamed is actively involved in AI PC taxonomy and research. He participates in business development projects, contributes to consulting activities, and provides IDC customers with analysis, opinions, and advice.

In today’s fast-paced world, automation has emerged as a revolutionary force reshaping the technological landscape.

From self-driving cars to intelligent virtual assistants, automation is rapidly permeating various industries. Because of its increasing popularity and the role it plays in streamlining processes and reducing costs, it has become a critical part of many organizations’ digital transformation strategies.

So far, enterprise automation has been mostly reactive, implemented as a piecemeal, noninvasive way to automate routine, repetitive tasks and structured processes and data. For the next chapter of the digital journey, the business drivers, goals, and means of enterprise automation have expanded across business processes, IT operations, and software development.

According to IDC’s Future Enterprise Resiliency and Spending Survey, Wave 1 (January 2023), 30% of organizations are more than 50% done with their automation goals for their front office businesses.

The promise of automation means we will continue to see new use cases pop up. This is particularly evident with the ever-increasing amount of data that organizations need to compile, organize, and analyze. B2B buyers’ expectations for agility, flexibility, and the ability to roll out new products and services quickly have also pushed companies to embrace automation.

Digital journeys and automation are now the dynamic duo that will run a viable digital business at scale.

Rapid innovation in AI-assisted automation adds another layer of possibilities to what can be accomplished. Below, we’ll discuss a couple of ways in which automation is being used today to deliver a better customer experience overall.

Personalization at Scale

Automation plays a crucial role in the ability of businesses to tailor and customize customer experiences, communications, and offerings. Starting with data collection and analysis, automation streamlines the process of gathering customer data from multiple sources such as website interactions, purchase history, social media activity, and customer surveys.

The use of artificial intelligence (AI) and machine learning (ML) within automation enables algorithms to extract meaningful insights from the data gathered and make informed decisions. With automation tools, businesses can create and update customer profiles, incorporating data from various sources in real time. These profiles give businesses a holistic view of each customer, allowing them to effectively customize their experiences.

Automation tools can also help organizations generate and deliver content in real-time. From personalized product recommendations based on browsing and purchase history to email marketing campaigns based on customer segmentation, dynamic content delivery ensures that customers receive relevant and engaging information that resonates with their specific needs.

Enterprise automation means artificial intelligence continuously supports decision-making and automated actions that proactively optimize and enrich outcomes. This process spans the entire organization and maximizes business value.

Proactive Engagement

Automation enables organizations to proactively engage with customers based on triggers or predefined conditions. For instance, when a customer abandons a shopping cart, an automated workflow can send a personalized email with a reminder or a special discount to encourage them to complete the purchase.

These workflows can be designed to address various customer interactions such as onboarding, upselling, cross-selling, and re-engagement. Automation can extend proactive engagement to social media platforms where businesses can monitor customer mentions, comments, and questions.
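The abandoned-cart trigger described above can be sketched in a few lines. This is a minimal illustration, not a production workflow: the cart record, threshold, and discount message are all hypothetical, and a real system would react to events from a database or message queue rather than a hard-coded dictionary:

```python
from datetime import datetime, timedelta

# Hypothetical cart record; in practice this would come from an event stream.
cart = {
    "customer_email": "customer@example.com",
    "items": ["wireless headphones"],
    "last_activity": datetime(2023, 7, 1, 12, 0),
}

# A cart counts as abandoned after an hour of inactivity (illustrative value).
ABANDON_THRESHOLD = timedelta(hours=1)

def is_abandoned(cart: dict, now: datetime) -> bool:
    """True if the cart has items but no activity within the threshold."""
    return bool(cart["items"]) and (now - cart["last_activity"]) > ABANDON_THRESHOLD

def build_reminder(cart: dict) -> str:
    """Compose the personalized reminder the workflow would email out."""
    return (f"Hi! You left {', '.join(cart['items'])} in your cart. "
            "Complete your purchase today and enjoy 10% off.")

now = datetime(2023, 7, 1, 14, 0)
if is_abandoned(cart, now):
    print(build_reminder(cart))
```

The same trigger-plus-action pattern extends naturally to the onboarding, upselling, and re-engagement workflows mentioned above.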

An infrastructure that’s scaled and agile delivers a great user experience. As part of digital transformation, leaders must enhance their risk and controls environment to be more intuitive and automated.

AI and ML have had a considerable impact on automation, particularly in how they’ve enabled better customer experiences. The introduction of generative AI has been met with enthusiasm. Automation use cases are already being created that have the potential to impact customer experience. Some examples of where automation within CX is headed:

  • Resource reallocation. Automation continues to take over manual tasks that humans perform daily, freeing up resources to focus on more complex, skill-driven activities. Everything from recruiting to medical diagnoses will be assisted by AI-driven automation, giving back valuable time to highly skilled employees to meet the unique needs of each customer.
  • Communication mining. Communication mining uses intelligent automation to extract valuable information and insights with AI and natural language processing (NLP). These come from various forms of communication data such as text messages, emails, social posts, customer support interactions, phone calls, recordings, and more. By mining communication data, organizations can gain insight into customer preferences, identify emerging trends, improve customer service, and make data-driven decisions.
  • Employee experience. Investing in employees and creating a positive work environment not only leads to happier, motivated, and engaged employees but improves customer service. With automation handling routine and time-consuming activities, employees can work on high-value projects and achieve higher levels of productivity. Automation can also provide opportunities for employees to develop new skills and expand their expertise. When employees feel empowered and have the right tools to be productive, they are more likely to deliver a positive customer experience.

The future of automation in CX is intelligent, automated, and engaging. The brands that achieve the most favorable results in CX are those that will merge automation and intelligent tools with human ingenuity and compassion.

Michelle Morgan - Research Manager, Customer Experience - IDC

Before joining IDC, Holtz worked for ABN AMRO. As a senior analyst in the bank's Treasury Operations, Wholesale Division, she had a strategic advisory role on business organizational matters and was responsible for internal IT cost control and internal service level and performance data management.

Oil and gas industry players have a mixed view of generative AI (GenAI). While the technology vendor community is highly enthusiastic, oil and gas end-user organisations are cautious and are taking a more conservative position — for now. It may still be too early for them to commit or to comment on their next moves in the GenAI space.

This is reflected in IDC’s Future Enterprise Resilience and Spending Survey Wave 2 (March 2023), which shows that only 18% of oil and gas companies worldwide will invest in GenAI technologies this year. The remaining 82% are either neutral or are carrying out initial assessments to identify the best use cases.

 

Download eBook: Generative AI in EMEA: Opportunities, Risks, and Futures

 

Potential Use Cases for the Oil and Gas Industry

There are three main use cases where oil and gas industry early adopters will be able to generate value with GenAI:

  1. Asset operations: GenAI can create new data and content to enhance multiscenario authentic simulations and prediction capabilities of operational assets. It can enhance the capabilities of digital twins, predictive maintenance and asset-management-specific workflow automation.
  2. Upstream subsurface data analysis: GenAI can enhance images to create 3D models. It can also generate subsurface images using fewer seismic data scans, avoiding the need for repeated data acquisitions to fill the data gaps that are common in the upstream oil industry.
  3. Enterprise ChatGPT for business leaders: Oil and gas companies’ unstructured data is generally held by different personas in different locations. All this data can be operationalised to create instant access to the right information to support organisations’ leadership in business decisions. Large language models (LLMs), such as ChatGPT, can play a crucial role here as they can generate human-like text, respond to domain questions and be used in the form of chatbots and virtual assistants.

 

Watch the Webcast: Generative AI in EMEA: Opportunities, Risks, and Futures

 

Uncertainties

There are a lot of uncertainties around the adoption of GenAI, such as the development of new regulatory frameworks and organisations’ data security. Also, with oil and gas companies seeking to improve their ESG performance and making serious commitments to net-zero emissions, they are trying to adopt new technologies that let them operate efficiently with the minimum possible environmental impact. One operational concern is the sustainability credentials of GenAI technologies, as the technology can have a huge carbon footprint. The training of a single common natural-language processing AI model, for example, can emit nearly five times the lifetime emissions of a single car.

With GenAI at the early stages of adoption, there are still questions about how it will support business outcomes. How industries such as oil and gas utilise it will depend on how effectively it supports and enhances performance, while mitigating the risks that come with adopting a new technology. For the oil and gas market it seems that in the short term it’s a case of watch this space.

It’s no secret that datacenters – the digital heartbeats powering our interconnected world – are voracious consumers of energy resources. Their energy consumption and corresponding carbon emissions are pressing concerns to those within and outside the datacenter industry.  

The importance of reducing carbon emissions as a crucial strategy to tackle climate change cannot be overstated. From the potential consequences of rising sea levels to the unpredictability of extreme weather events and disruptive impacts on ecosystems, the stakes are high. Pressure is mounting on industries worldwide to curtail their carbon footprints and contribute to global environmental efforts. 

Understanding this, the datacenter industry is far from indifferent. Almost every datacenter provider now advocates for net-zero operations. Leaders are targeting a specific date to achieve this lofty ambition. Simultaneously, IT vendors tout their innovative solutions, purporting to aid their customers in shrinking their carbon footprint. But amidst these eco-conscious promises, a troubling void becomes evident – the lack of quantitative data. 

Ironically, for an industry that thrives on data, the datacenter sector lacks comprehensive, credible data. There is a data deficit regarding power capacity, energy consumption, and carbon emissions resulting from a datacenter’s operations. We don’t have an accurate measure of the industry’s environmental impact. As a result, we don’t have a clear pathway to meeting stated net-zero goals. 

Recognizing this, IDC created new research aiming to estimate the energy consumption and carbon emissions of the datacenter market. This endeavor promises to shed new light on the datacenter industry’s environmental footprint. IDC’s research offers a quantitative lens through which to assess and compare the impact of different datacenter types and geographies. 

This dataset quantifies key parameters such as energy consumed, carbon emitted, and carbon avoided through the use of carbon-neutral energy sources. It also details power capacity, square footage, and expenditure by datacenter type, including Internet Giants, Colocation, Internal, Edge, and Telco datacenters. 

Datacenter Carbon Emissions – A Key Business Issue 

As digital transformation continues to redefine the operational landscape of organizations, another influential paradigm has become increasingly prominent – sustainability.   

IDC estimates that worldwide digital transformation investments will reach $3.4 trillion in 2026, all of which drives demand for datacenter capacity. 

“The demand for datacenter capacity is outpacing sustainability advancements” 

As part of digital transformation efforts, organizations will invest in Generative AI technologies. These revolutionary technologies are particularly energy-hungry, placing unprecedented demand on datacenter resources compared with traditional workloads. For example, the energy consumed to train GPT-3 (a precursor to the model behind ChatGPT) is estimated at 1.287 gigawatt-hours, and this does not include end-user consumption while interacting with the model. 
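To put that training-energy figure in perspective, a rough back-of-the-envelope conversion into carbon emissions looks like this. The grid carbon intensity used here (0.4 kg CO2 per kWh) is an assumed illustrative value; actual intensity varies widely by region and energy mix:

```python
# Convert the estimated GPT-3 training energy into approximate CO2 emissions.
training_energy_gwh = 1.287
kwh = training_energy_gwh * 1_000_000      # 1 GWh = 1,000,000 kWh
carbon_intensity = 0.4                     # kg CO2 per kWh (assumed, illustrative)
emissions_tonnes = kwh * carbon_intensity / 1000

print(f"{emissions_tonnes:,.0f} tonnes of CO2")  # roughly 515 tonnes at this intensity
```

On a carbon-neutral energy supply the figure would of course be far lower, which is exactly why the energy sourcing of datacenters matters as much as their raw consumption.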

It is also clear that sustainability is no longer an optional add-on or a public relations talking point. ESG (Environmental, Social, and Governance) practices create business value by fostering long-term sustainability, mitigating risks, attracting socially conscious investors, enhancing brand reputation, and driving innovation. 

These two seemingly opposing goals are creating the pre-eminent challenge for the datacenter industry. How can we meet the rapidly growing capacity demands while ensuring our operations are sustainable? 

Progress toward Net-Zero Goals 

In recent years, the datacenter industry has seen a remarkable increase in sustainability-oriented claims. In an effort to stay competitive, almost all datacenter vendors are asserting their sustainability credentials. Cloud Service Providers (CSPs) and Colocation Providers are pledging to reach net-zero carbon emissions by specific target dates.

In addition, IT vendors boast about the enhanced energy efficiency of their latest chips and equipment compared to older models. While these claims are often valid and reflect a promising shift towards more sustainable practices, they can inadvertently give the impression that the datacenter industry is rapidly becoming more sustainable.  

However, this might not be the full picture. Individual components are becoming more efficient and companies are setting emission reduction goals, but the overall environmental impact of the industry may not be decreasing at the same pace, especially considering its continuous and rapid expansion. Therefore, there is a pressing need for reliable data.

Accurate figures on energy consumption and carbon emissions of the entire datacenter industry would provide a more objective assessment of its sustainability progress. This data will help identify gaps and develop strategies to mitigate the environmental impact effectively.

So, while we must applaud the efforts and strides taken so far, it’s equally important to validate them with robust, industry-wide data to ensure a truly sustainable future for the datacenter industry. 

IDC estimates the global energy consumption of datacenters in 2022 at 382 terawatt-hours (TWh), growing at a compound annual growth rate (CAGR) of 16.0% to reach 803TWh by 2027. 
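As a quick sanity check on that projection, compounding the 2022 baseline forward over five years reproduces the 2027 figure (to within rounding):

```python
# Project 382 TWh (2022) forward at a 16.0% CAGR through 2027.
base_twh = 382
cagr = 0.16
years = 2027 - 2022

projected = base_twh * (1 + cagr) ** years
print(round(projected))  # ~802 TWh, consistent with the ~803TWh figure after rounding
```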

For more insight and information on the trends in datacenter energy growth and sustainability, please see the IDC Datacenter Deployment Model.