We set out to assess the potential uses that a manufacturing organisation could make of chat-enabled generative AI technology, and we gauged the technology's potential impact on the industry by analysing the very answers an open tool gave us.

We questioned none other than ChatGPT, which stands for Chat Generative Pre-trained Transformer. ChatGPT is based on OpenAI's GPT-3.5 series, which can perform a variety of natural language processing tasks.

With the professional version now available, and assuming that the outage and availability problems are a thing of the past, does it make sense for a manufacturing organisation to fork out the few dollars a month for a subscription? Let's find out.

When we asked ChatGPT how it can be useful in manufacturing processes, the response listed the following areas:

Quality control: ChatGPT can be used to analyse written reports and data from quality control inspections, identifying patterns and trends in product defects.

Predictive maintenance: ChatGPT can be used to analyse sensor data and logs from manufacturing equipment, identifying patterns and trends that indicate when maintenance is needed.

Document summarisation: ChatGPT can be used to automatically summarise large amounts of technical documents, such as manuals and procedures, making it easier for employees to find the information they need.

Natural Language Processing: ChatGPT can be used to process natural language input from users, such as voice commands, and translate them into actionable instructions for manufacturing robots or machines.

Chatbot: ChatGPT can be used to create a chatbot for customer service, handling customer inquiries and complaints.

Language Translation: ChatGPT can be used for language translation for multi-lingual communication with customers and suppliers.

These are just a few examples of how ChatGPT can be used in manufacturing. The specific application will depend on the needs of the manufacturing company and how they want to utilise the technology.

We then proceeded to investigate each of the answers in more detail. The follow-up question was "How can ChatGPT be used in quality management in manufacturing?"

Below we present the actual answers, each with a brief comment and our verdict.

“ChatGPT can be used in quality management in manufacturing in several ways. One way is by analysing data from quality control inspections to identify patterns and potential issues. The model can be trained on historical data to predict when quality issues are likely to occur, and can also be used to generate reports and alerts to notify quality control personnel.”

Any modern Quality Management System (QMS) application can already be augmented with an embedded AI analytics module to achieve this. Such a module is very useful for helping companies figure out complex root cause structures and, in general, investigate their "known" and "unknown" unknowns. In our case, it is unclear which data ChatGPT would rely on. So, at the present time, we think ChatGPT is overselling itself a little here.

“Additionally, ChatGPT can be used to generate automated test plans and reports to increase the efficiency of quality control inspections. ChatGPT can also be used to generate automated documentation for compliance with regulations and industry standards. This can include generating standard operating procedures, work instructions, and other documents to ensure that quality management processes are followed consistently.”

This is quite interesting. This can help a lot with the simplification of clerical work and create multiple reports for different audiences very easily and at scale. But for the real business impacts to materialise, a company should first do some preparatory work and carefully define business rules, templates, formats, and audience requirements.

“Another way ChatGPT can be used in quality management is by analysing customer feedback and complaints, and identifying patterns and common issues. This information can be used to improve the quality of products, and to prevent similar issues from occurring in the future.”

The good old "feedback loop" story: understanding user habits and recurrent issues, and feeding the quality department with relevant information. But in the manufacturing industry, the problem for most companies is how to bypass distributors and directly access timely, good-quality market information to feed the analysis. If ChatGPT relies only on publicly available information and does not tap into any market intelligence database, everything it talks about is something a company should already know. So, as an add-on to a market intelligence subscription, it could become an intelligent way to consume the data; otherwise, it is mostly of little use.

“Overall, ChatGPT can be used to improve the efficiency and effectiveness of quality management processes in manufacturing, and help to identify and prevent quality issues before they occur.”

Here, it seems a definite oversell.

We then followed up with the next use case, predictive maintenance. This is the answer:

“Analysing large amounts of data from manufacturing processes and equipment to predict when maintenance will be needed. This can include data from sensors on equipment, as well as data on past maintenance and repair history.

The model can also be trained on historical data to identify patterns and make predictions about when equipment is likely to fail.

Additionally, ChatGPT can be used to generate reports and alerts to notify maintenance personnel when maintenance is required, and can also be used to optimize maintenance schedules to minimize downtime and improve overall equipment efficiency.”

We are not doing a point-by-point analysis here, but in general we see the same pattern as in the previous use case, right down to ChatGPT's boilerplate conclusion: "Overall, ChatGPT can be used to improve the efficiency and effectiveness of document management processes in manufacturing and help to identify and extract important information from documents quickly and accurately."

Good for simplifying clerical work, redundant in other areas, and definitely overselling itself at times.

Now, let's analyse the other points that ChatGPT brought forward: document summarisation, natural language processing, chatbots, and language translation.

A quick glance at the detailed answers showed that here ChatGPT was not providing additional business use cases, but rather outlining some basic principles of how it works. A little like asking an intelligent washer-dryer what it does and getting the reply: "Washing clothes, drying clothes, having a spinning drum inside, being heavy, consuming electricity, etc."

Any Missing Areas?

We also thought about some obvious "missing areas" in its self-assessment. One is idea generation: using the generative text functionality as a way to explore concepts, visions, and user needs, like a "crowdsourcing" tool that prompts ideas which may simply not come to mind for a decision maker weighing several options.

Another obvious miss was code programming. Our research has consistently highlighted a critical need for digital skills in factories and the importance of low/no-code applications to address it (we have been talking about the notion of "buyilders", a hybrid between "build" and "buy" software users), enabling workers on the shop floor to create their own interactive tools, personal assistants, digital work instructions, and so on.

As personnel with some knowledge of coding become a more frequent sight in plants, a conversational tool for programming can easily lower the digital skills threshold in disciplines such as data analysis, machine connectivity, and operational improvement.

What Are the Risks that a Manufacturing Company May Incur While Using this Type of Solution in Their Business Practices?

First, there is the risk of losing grip on information quality. Over time, managers can lose touch with the practical aspects of the information they have to evaluate (where it comes from, how it was created, whether it carries inherent biases, etc.), with an increased risk of a "garbage in/garbage out" effect. (A side note: with example answers such as "gnothi seauton" or "42!", we can think of plenty of "oracles" providing unintelligible answers to poorly structured questions.)

It is true that sloppy interns will produce wrong information as well, but the system has no way to check the quality of its information, at least not yet. Eventually, there is a risk of having supervisors who have never done the job in the first place, because that step of the work is simply not done anymore.

Also, these tools automate a lot of entry-level, clerical, repetitive transactional tasks (filling in repetitive reports, data entry, data checking, etc.). This is not bad per se, but it could be an issue in organisations where "learning by doing" career growth is heavily based on those menial activities.

The risk is that there could be no available route for juniors to learn and grow in these organisations.

Another risk is hampering radical innovation by reducing the questioning of the status quo. The widespread use of generative tools will also force companies to reinvent excellence.

In its answers, ChatGPT tends to converge to a median level, the "common knowledge". So not only can this prevent "disruptive discovery" if the tool is not used properly, it can also amplify and reinforce organisational biases, favouring routine and procedural consistency over innovation. Removing the transactional costs of procedures (clerical work is boring and expensive) can make organisational streamlining feel less necessary.

Making the fulfilment of useless steps automated and cheap may lead to the accumulation of inefficiencies. When nobody owns and carries the burden of executing these tasks, nobody questions the procedure, and the risk of procedural errors or bottlenecks arising from convoluted processes is never addressed.

Our advice is to take this technology with a grain of salt. Companies have to look at it with an eye for process transformation. Automating something manual is not a benefit by itself; it is a benefit when it simplifies the organisational structure, saves time, and frees key decision makers to focus on more relevant tasks that create real business value.

Purpose is a word that is often talked about. But what does it mean and how does it impact your business?

Purpose is not just about what you sell, but the value you bring. Not just about the products and services you produce, but your company ethos. The impact of what you sell, how you make it and the effect your company has on the wider community.

Purpose impacts everything from buying decisions to employee productivity. Purpose, therefore, drives not only what you do but how you do it.

A purpose gets to the why of the company. Why do your products or services exist? What value does your company bring? Beyond making a profit, what are you providing?

But why does this matter?

Customers are not just concerned with the performance of the products they buy, but with who is providing them. Interest in things like a company's carbon footprint, the working conditions of its employees, and even the charities it donates to all influence a customer's decision to buy.

What you stand for affects whether people buy from you. So you need to ensure you can demonstrate the value you bring: what your "purpose" is.

Your brand purpose is also important for your employees.

It helps build strategy and focus goals. It helps with decision making. It needs to be more than a PR statement; it should be an ethos that can help drive the company. Something an employee can look to and ask: does the work I am doing contribute to our purpose?

It promotes an aligned and connected organisation that delivers value. Purpose enhances performance and creates value.

Brand purpose is an important keystone for any business. Brand purpose should influence everything from your strategy to messaging to recruitment.

Your company may already have a brand purpose. If you do, it’s good to revisit it and see if it still reflects the ethos and goals of your business.

There are key questions that you should be asking when thinking about your business that your purpose should answer.

Questions to ask when looking at brand purpose:

  • What does our business help people achieve?
  • How does our business impact the societies we operate in?
  • What value(s) do we have while building our products and services?

Customers are becoming more concerned about what a brand stands for. More likely to check whether your business is doing what you say it does. Performative actions without substance (such as greenwashing) will not go down well with potential customers. Purpose can be aspirational, but it must be honest.

Examples from major companies:

  • Coca-Cola. Our purpose: Refresh the world. Make a difference.
  • HP. We are a technology company born of the belief that companies should do more than just make a profit. They should make the world a better place. Our efforts in climate action, human rights and digital equity prove that we are doing everything in our power to make it so.
  • Nike. Our mission: Bring inspiration and innovation to every athlete* in the world. (*If you have a body, you are an athlete.)

Purpose goes beyond a slogan.

Purpose should tell whoever encounters it what your company stands for and what it provides.

Purpose is a factor in a business’ reach and awareness within a market. It can determine whether a customer buys from you, or not. A driver for success.

Act with purpose.

 


If speed of adoption and scale of use were the top considerations in the last "cloud first" decade, this decade is about adding control through cloud economics, efficiency, and sustainability.

These value pillars are driving IT investment decisions for the C-suite across industry sectors and countries today. In fact, in a recently conducted IDC survey, "value for money" was the highest-rated attribute for selecting a cloud technology vendor partner.

The proactive focus on cloud costs over the last 18-24 months is so dramatic that, in some cases, it is superseding the focus on cloud security. At a broad cloud-themed roundtable of an exclusive group of C-suite executives hosted by IDC in February, the tech and business leaders spent the majority of the time discussing cloud cost optimization tips and challenges. "Security is paramount for the cloud, but so is cost control and efficiency for us," said the CIO of a multinational communications company at the roundtable.

According to a recent IDC CIO Quick Poll, 6 in every 10 organizations surveyed admitted to spending more on the cloud than initially budgeted. There is consensus among C-suite leaders that cloud is an imperative for innovation and the foundation of the digital business platform.

While cloud adoption accelerated, cloud governance and control mechanisms haven’t kept pace. As a result, up to 30% of cloud spend is categorized as “waste” spend that can be optimized.

This perception of cloud waste, combined with tighter budgets, efficiency pressures (from the finance and ESG teams), and macroeconomic factors such as inflation and a potential recession, is making cloud costs a top-of-mind consideration for organizations.

With cloud (IaaS, PaaS, SaaS, data clouds) accounting for a dominant portion of IT spend, efficient and optimized use of cloud resources is a key priority. But cloud democratises resources and decentralises purchase decisions. The combination of consumption-based pricing, dynamic resources, and decentralised decisions makes cloud cost control harder. Traditional budgeting, planning, and forecasting of IT spend don't lend themselves to this dynamic cloud world.

Rising cloud costs won't be as much of a concern if tech leaders can monitor the spend, manage it, and see the return on investment. This is the premise many organizations use to adopt FinOps: a means to optimize cloud spend rather than to cut or avoid cloud usage altogether.

FinOps is a framework, culture and mindset that enables organizations to maximize the value of their cloud investments. At the heart of it, FinOps is about accountability, transparency and a culture of optimization.

FinOps is not yet another dashboard but a powerful methodology focused on people and processes—creating a culture change in many companies. Organizations adopting FinOps principles are able to:

  • Be more informed and in control of their cloud environment
  • Have the confidence and insights to better plan and forecast cloud budgets
  • Make cloud economics a strength in negotiations, internally and externally
  • Identify areas of optimization (decommissioning orphaned resources, rightsizing environments, matching cloud resources to workload needs, etc.), resulting in quick savings
  • Empower developers and cloud users with the information to take the right decisions
  • Put cross-functional collaboration at the heart of cloud projects
  • Encourage discipline and ownership of cloud spending
  • Foster a blameless culture through a single source of truth
  • Make IT a business enabler rather than a cost center, by starting with visibility and, over time, building mechanisms to dynamically charge back business units (a minimal sketch of the idea follows this list)
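
That last point, chargeback, depends on consistently tagging cloud resources with an owner. As a rough illustration, here is a minimal Python sketch of tag-based showback/chargeback; the tag name, resources, and costs are hypothetical, and a real implementation would read them from the cloud provider's billing export:

```python
# Minimal showback/chargeback sketch: aggregate billed cost per business
# unit using resource tags. All data here is made up for illustration.
from collections import defaultdict

billing_records = [
    {"resource": "vm-web-01", "tags": {"business_unit": "retail"},  "cost": 412.50},
    {"resource": "db-core",   "tags": {"business_unit": "finance"}, "cost": 980.00},
    {"resource": "vm-batch",  "tags": {},                           "cost": 135.25},  # untagged
]

def chargeback(records):
    """Sum cost per business unit; untagged spend is surfaced, not hidden."""
    totals = defaultdict(float)
    for record in records:
        owner = record["tags"].get("business_unit", "UNALLOCATED")
        totals[owner] += record["cost"]
    return dict(totals)

print(chargeback(billing_records))
# {'retail': 412.5, 'finance': 980.0, 'UNALLOCATED': 135.25}
```

Surfacing the "UNALLOCATED" bucket is the point: it makes untagged spend visible and gives teams an incentive to tag their resources.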

Shifting the cost element left: the rise of DevSecFinOps

Digital mandates brought synergies between business, IT, security, and developers. We are now rapidly entering the era of FinOps, which adds the finance dimension to this synergy, leading to a DevSecFinOps methodology.

 

The new CIO approach is that if security is a Day 0 job, FinOps is a Day 0.5 job, and it is time to shift FinOps left.

Business value of FinOps:

In the short term, FinOps delivers visibility and easy savings that bring discipline to cloud use. It also makes users more aware of the economic aspects of the cloud.

In the medium term, FinOps strengthens an organization's cloud governance capabilities and improves cloud planning, forecasting, and negotiating. It helps teams consider application right-scaling, resource matching, and better tagging of cloud resources.

In the long term, it fosters a culture of collaboration and makes all users take responsibility for cloud costs. It also gives IT the metrics and KPIs to contribute positively to broader sustainability and carbon footprint reduction goals. FinOps maturity includes unit economics concepts: a business approach in which a P&L compares cloud-driven revenue to the cloud costs incurred, so that the complete return-on-investment picture is evident.
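
As a rough illustration of the unit economics idea, consider this minimal sketch; all figures are hypothetical:

```python
# Hypothetical cloud unit-economics sketch: compare cloud-driven revenue
# with the cloud costs incurred, per month and per transaction.
monthly = {
    "cloud_driven_revenue": 1_200_000.0,  # euros
    "cloud_costs":            180_000.0,  # euros
    "transactions":         3_000_000,
}

cost_per_transaction = monthly["cloud_costs"] / monthly["transactions"]
cloud_cost_ratio = monthly["cloud_costs"] / monthly["cloud_driven_revenue"]

print(f"Cloud cost per transaction: {cost_per_transaction:.4f} euros")
print(f"Cloud cost as a share of cloud-driven revenue: {cloud_cost_ratio:.1%}")
```

Tracked over time, a falling cost per transaction indicates that cloud spend is scaling more slowly than the business it supports.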

Recession Tech Playbook Guidance: How to Excel with FinOps

Focus on the "and", not the "or" or "but". FinOps is not an excuse to cut cloud use. Quality, security, speed, and cost are all business value pillars, and a balance between them is key for sustainable and consistent IT transformation. Cost is one dimension. Organizations should be aware of trade-offs when making cost optimization decisions, as this is only one objective (albeit an increasingly high-priority one).

Start small but start soon. Regardless of where organizations are in their cloud journeys, FinOps as a strategy is critical for cloud success. Businesses can bring cloud cost visibility as close as possible to engineering teams, so these teams understand how their decisions affect cloud costs and can take appropriate measures at the design point. With regard to teams, organizations can start with one or two IT members with a passion for optimization, or with the cloud center of excellence (CCoE) embracing FinOps tasks. The company can then slowly build FinOps certification and a full-fledged team.

Explore FinOps-enabling tech. A triad of automation, observability, and optimization tools can help organizations take their cost governance strategies to scale. FinOps is first and foremost a strategy and a discipline; once organizations have clear objectives and KPIs, they can evaluate platforms to implement it.

Adhere to industry standards to benchmark and track progress. Exploring the FinOps Foundation maturity model and adopting the three phases of FinOps (Inform, Optimize, Operate) will help organizations benchmark themselves against their peers and track progress. It will also help them identify weaknesses and speak the same language as the rest of the industry.

Don't lose momentum. Cloud cost governance is not a one-time exercise. Over time, as consumption gets more granular and innovation continues at breakneck speed, keeping up is key.

Understand that FinOps is a shared responsibility. Explore the cloud partnership to ensure both cost optimization of the cloud and cost optimization in the cloud.

Don’t forget to explore the tips and best practices shared by my IDC colleague Rick Villars, Group Vice President, IDC, in this blog: Adopt a FinOps and Cloud Economics Strategy to Minimize the Risks of Cloud.

We at IDC have made FinOps a key part of our cloud research agenda, so watch out for more research insights from us this year to help you put your best, optimized and collaborative foot forward in the cloud this decade!

Luisa Pannozzi - Sr. Manager, Marketing - IDC

Customer-empathic marketer who excels at integrated marketing strategy. 15+ years in branding, integrated campaigns, content, and GTM programs. Known for uniting teams, building momentum, and turning insight into impact.

Are we in a recession? Or is it a downturn? Or would it be more apt to call today's business landscape "economically challenging"? The answer is that there is no single answer, especially when it comes to highly exposed small and medium-sized businesses (SMBs). Economic conditions and their impact vary dramatically for SMBs based on factors such as their vertical, target market, region, and company size, to name just a few.

However, new IDC survey data makes one point abundantly clear: the recession fear is here for SMBs.

73% of SMBs believe there will be a recession in the coming year

Source: IDC’s January Future Enterprise Resiliency & Spending Survey

Of those that forecast a recession, nearly 46% said they are experiencing one right now, 35% expect one to hit in the first half of 2023, and 20% in the second half of this year. Of the SMBs that predict a recession, 20% think it will be mild, 60% moderate, and another 20% are planning for a severe recession.

A Game Plan to Weather Economic Challenges

In times of uncertainty, it’s essential for SMBs to take emotion out of the equation. One way to do this, and to feel better prepared and organized as a company, is to develop a clear and specific game plan—such as a list of actionable items that will (relatively) quickly deliver measurable results. If executed thoughtfully, this exercise can help SMBs weather today’s economic challenges and put in place strategies to help them operate more efficiently for years to come.

Below are three concrete steps SMBs can take now to survive an economic downturn and thrive once it passes.

Invest in In-House IT Expertise

SMBs need to lean further into digital capabilities during economic downturns to reduce manual work and errors, be more efficient with slimmer staff, and combat rising costs of skilled labor. As a result, SMBs need to hire strong technology leaders who can help them as they invest in digital transformation.

IDC finds that 40% of global SMBs do not employ even one full-time IT staffer, and of those that do, around 70% have only between 2 and 4. These new in-house IT leaders can get to know an SMB's inefficiencies more intimately than an outside vendor or consultant can, and they can be technology advocates for the SMB.

High-level IT leaders should be tasked with researching, vetting vendors and technologies, and making technology purchase decisions. When beginning the recruiting process, SMBs should be sure to vet any new IT leaders carefully before hiring. Ensure that they understand the range of IT programs your business is considering and that they have experience with companies similar to yours, in similar verticals.

These experts should have the capabilities and expertise to ask potential technology suppliers tough questions surrounding details and costs of implementation, integrations, maintenance, and time to ROI.

And finally, these technology executives should be able to carefully evaluate whether it is better to build a technology in-house or use a vendor for it, and also determine if a new technology is necessary and if it will deliver timely cost savings.

Amplify Automation

Inflation is leading to rising costs of skills, and tough economic circumstances are forcing companies to reassess and trim their workforces. This one-two labor punch makes today’s SMB landscape a prime environment for automation. This broad technology comes in many flavors.

Automation technologies can, for example, use ML to learn and improve on their own over time without human intervention. These automation programs could hone marketing and advertising campaigns as they learn what factors lead to a sale or conversion, or detect fraudulent transactions by learning the typical characteristics of such transactions over time. AI chatbots can take the pressure off customer service staff by solving basic inquiries without human intervention. Robotics can save on labor for picking and packing in warehouses while doing these tasks more quickly than humans could. And digital apps and kiosks can reduce the number of order takers needed at quick-service restaurants.
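
To give a flavour of the fraud example, here is a minimal, hypothetical sketch using scikit-learn's IsolationForest to flag transactions that deviate from learned "typical" characteristics. The data is made up, and a production system would need far more features and validation:

```python
# Hypothetical anomaly-detection sketch with scikit-learn (assumed installed).
# Each row: (transaction amount in euros, hour of day). Data is invented.
from sklearn.ensemble import IsolationForest

typical_history = [
    (25.0, 12), (40.0, 13), (18.5, 9), (60.0, 18), (33.0, 11),
    (22.0, 14), (48.0, 19), (29.0, 10), (55.0, 17), (38.0, 12),
]

# Learn what "typical" looks like from past transactions.
model = IsolationForest(random_state=0).fit(typical_history)

new_transactions = [(35.0, 13), (4_900.0, 3)]  # the second looks unusual
for tx, label in zip(new_transactions, model.predict(new_transactions)):
    print(tx, "-> flag for review" if label == -1 else "-> looks typical")
```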

Shop Smarter

SMBs should quickly reinstate cost-saving measures that may have fallen by the wayside during healthier economic times. SMBs should seek to become master negotiators. Don’t be afraid to ask for more flexible contract terms and explore any cost savings a vendor might offer, such as volume discounts or bundled savings for when a business buys several offerings from a provider.

Beyond cost savings, this approach also offers the added benefit of reducing the integration headaches that can surface when an SMB uses several different point solutions from multiple technology suppliers. This is important because, as noted above, SMBs don't typically have large IT staffs. Even if a contract is not yet up for renewal, ask suppliers if there is any room for contract negotiations, or to revise fee structures in a way that works better for your business. Use your network of community resources, such as a local small business association or chamber of commerce, for advice and guidance on the best suppliers for your industry and business size.

Now is also a wise time to diversify your supply chain and sourcing, if possible, to account for global currency fluctuations that can impact energy, component, material, hardware, and outsourced skilled labor prices. And lastly, be vigilant about tracking the ROI of each and every technology in your tech stack. CFOs should also require businesses and IT to perform ROI calculations before approving the budget for new technologies. One person should be held accountable for each project and deliver on those previously forecasted results.

Find more information related to macroeconomic impacts on SMBs in these related reports, part of IDC's Worldwide Small and Medium Business Research practice.

Katie Evans - Sr. Director, Research - IDC

Katie Evans is Senior Director of the Worldwide Small and Medium Business (SMB) Research Program within the Digital Transformation space. Katie's core research coverage includes identifying and supporting the unique, evolving needs of the very small, small, and medium business technology buyer. Katie has a strong, SMB-focused research and writing background, having covered SMBs in the retail and ecommerce space for over 12 years. Most recently, her primary coverage area was researching the technology needs of SMB retailers and analyzing the vendor offerings on the market to meet those evolving needs. Katie has also conducted extensive writing and research on mobile and international ecommerce and has authored several custom reports for vendors serving SMBs.

Neurodivergent people have many talents that can add value to the IT industry. Autistic people can be strong logical thinkers: highly focused, detail oriented, reliable, and loyal. People with ADHD tend to have high energy and strong imaginations. Dyslexic people can bring out-of-the-box thinking and pattern recognition. Dyspraxic people tend to be very good at strategic thinking and problem solving, and highly motivated.

There is no question that neurodivergent talent can add value in a variety of technical and business roles in the IT industry. So, we don’t need to ask, “What can neurodivergent people do for the IT industry?” The answer, unequivocally, is a lot.

We need to change the perspective. We need to ask ourselves what we can do better to attract and retain these talents in IT buyer and supplier organisations.

In a previous blog, I talked about how cities should think of how to become autism friendly, including through the intelligent application of technology. In this piece, I’m reflecting on how the technology industry itself can make the workplace more autism friendly.

Making the IT Industry Autism Friendly

According to Digital Scotland, 10% of Scottish people are neurodivergent, but many of them are unemployed. In fact, research from the UK Office for National Statistics shows that just 22% of autistic people are in any kind of employment, though many more are eager to work. That's a lot of wasted talent for the IT industry at a time when there's a dire shortage of it.

According to our surveys, around 74% of European organisations find it difficult or very difficult to hire for technology roles, whether in the line of business or in IT. Most importantly, that's a lot of unrealised self-fulfilment and happiness for autistic people.

The good news is that the IT industry has started to pay attention. On the technology buyer side, the Israeli army recruited autistic soldiers for a highly specialised visual intelligence unit. On the supplier side, IBM established the ND@IBM BRG (Business Resource Group), which includes neurodivergent employees and allies in IBM offices across the globe.

SAP, Microsoft, DXC and EY have invested to raise awareness both among their employees through internal webinars and training, and for the overall industry by sponsoring Autism at Work Summits. There are even companies that make neurodivergent talent their core asset, such as Auticon, which provides quality assurance, testing, data science and cybersecurity services with a delivery workforce that includes around 400 autistic consultants in its 20 offices.

There’s a long way to go, but these examples show that a different perspective on autism at work is possible for the IT industry. Companies embracing this new perspective need to consider that matching the skillset of neurodivergent people with the right projects and activities, and raising awareness, are only the first steps in the process.

Success comes from changing recruiting and hiring processes by finding alternatives to one-to-one interviews, which can be a barrier for people with gaps in their social skills. For example, they could combine cognitive written tests with week-long workshops, where psychologists bring candidates together for group work and meals to evaluate their individual soft skills.

Workspaces need to be adapted. Just as an employee in a wheelchair may need a ramp, an autistic person may find a low-light, low-noise environment more conducive to concentrating. Psychologists need to be retained as job coaches to help prevent situations that cause anxiety, based on each individual’s profile, and to facilitate interaction with clients.

Dress codes need to be relaxed for autistic people who may be hyper-sensitive to touch and therefore can't wear certain fibres. Neurotypical employees must be immersed in teams with neurodivergent people to learn how to interact.

A simple change of language from “I need this deliverable to be completed ASAP” to “I need this deliverable to be completed by tomorrow at 5pm” makes an immense difference for an autistic person. The former quite simply does not make sense, and just creates anxiety. The latter provides a clear deadline that an autistic person can meet.

Neurodivergent talent can bring a different perspective to help IT buyers and suppliers avoid bias when tackling business and technical problems in our fast-paced industry. All we need is a change in mindset to make the IT industry a good place to work for neurodivergent talent.

Massimiliano Claps - Research Director - IDC

Massimiliano (Max) Claps is the research director for the Worldwide National Government Platforms and Technologies research in IDC's Government Insights practice. In this role, Max provides research and advisory services to technology suppliers and national civilian government senior leaders in the US and globally. Specific areas of research include improving government digital experiences, data and data sharing, AI and automation, cloud-enabled system modernization, the future of government work, and data protection and digital sovereignty to drive social, economic, and environmental outcomes for agencies and the public.

AMD made major changes to its CPU and GPU portfolios in 2022. On August 29, AMD unveiled its Ryzen 7000 series of desktop processors powered by Zen 4 architecture and the new socket AM5 platform. These processors have been available globally from September 27. The flagship 16-core AMD Ryzen 9 7950X processor is presently offered at the manufacturer’s suggested retail price (MSRP) of $699.

On November 3, AMD unveiled the Radeon RX 7900 series of graphics cards powered by RDNA 3 architecture. The cards have been available on AMD's website from December 13, 2022. Leading AMD board partners, including ASRock, ASUS, Biostar, Gigabyte, MSI, PowerColor, Sapphire, Vastarmor, XFX, and Yeston, have offered the cards from mid-December. The Radeon RX 7900 XTX has a starting price of $999, while the AMD Radeon RX 7900 XT has an $899 price tag.

AMD offered samples of the Radeon RX 7900 XTX along with the Ryzen 9 7950X processor, a socket AM5 motherboard, and a DDR5 memory kit. The samples allowed IDC to test how well AMD’s flagship technology platform performed as a completely new system or upgrade over an existing system. IDC also gauged how well the new samples performed over prior generations in terms of performance per watt and computations per U.S. dollar.

What’s New with the AMD Ryzen 7000 Series Desktop CPUs?

The core compute die of the Ryzen 7000 series processor is built on TSMC’s 5nm process node. The input/output die — which now includes basic RDNA 2 graphics capabilities for 2D workloads or diagnostic/troubleshooting purposes — is based on TSMC’s 6nm process.

AMD made a significant change by moving from the previously long-lived AM4 socket platform to the new LGA AM5 socket. Since the pins have moved from the CPU to the socket, there is less risk of damage during installations or upgrades. The AM5 platform also supports dual-channel DDR5 memory at up to 5200 megatransfers per second (MT/s), in line with JEDEC standards.

AMD has also invested in a memory configuration standard it has branded EXPO. EXPO provides advanced pre-configured profile settings, designed specifically for AMD memory controllers, that are stored on the DIMMs. These profiles simplify overclocking for greater bandwidth and tighter timings, delivering faster responses and lower latencies.

The new socket AM5 motherboards have up to 24 PCIe 5.0 lanes and are available with chipsets up to the AMD X670E. In addition to possessing the highest overclocking capabilities, the enthusiast-grade X670E and B650E motherboards support the PCIe 5.0 standard for graphics and storage components. However, PCIe 5.0 support is an optional feature on the mainstream B650 motherboards.

AMD has pledged to support AM5 motherboards through 2025 and at least two subsequent CPU generations. Support may last even longer, however, if the long history of the AM4 platform is anything to go by.

What’s New with the AMD Radeon RX 7900 Series GPUs?

The AMD Radeon RX 7900 XTX and Radeon RX 7900 XT graphics cards are the first gaming cards to feature an advanced chiplet design. Like the Ryzen 7000 series, the AMD RDNA 3 generation architecture’s chiplet design combines 5nm and 6nm process nodes. The main graphics compute die (GCD) is built on TSMC’s 5nm process and provides the compute units for rasterization and ray tracing. AMD includes up to six TSMC 6nm memory cache dies (MCDs), each with 16MB of second-generation AMD Infinity Cache technology and a 64-bit wide memory interface. AMD invested in a novel interconnect to link the GCD and MCD chiplets together, enabling bandwidth of up to 5.3TB/s.

The AMD Radeon RX 7900 XTX has 96 compute units. Each has four texture units (for 384 in total), one ray accelerator (96 total), and two artificial intelligence (AI) units (192 total). According to AMD, the performance of the ray tracing units in RDNA 3 architecture is nearly double that of the corresponding units in RDNA 2, while the new AI instructions are nearly three times as many as those in the previous generation.

The AMD Radeon RX 7900 XTX features six MCDs that support 24GB of GDDR6 memory running at 20Gbps over a 384-bit bus. The Radeon RX 7900 XTX also supports USB-C, DisplayPort 2.1, and HDMI 2.1a connectivity with UHBR 13.5, allowing displays with high refresh rates to be connected (up to 480Hz refresh rates on 4K panels and 165Hz refresh rates on 8K panels). A dual media engine supports simultaneous encode or decode streams up to 8K60 for HEVC, as well as the new AV1 codec.

The Test Platform

The test PC hardware components included the AMD Ryzen 9 7950X processor, the Radeon RX 7900 XTX graphics card, the Gigabyte X670E Aorus Master motherboard, and a G.SKILL Trident Z5 Neo 2x16GB DDR5-6000 EXPO memory kit. The Windows 11 main drive was a 1TB GIGABYTE AORUS NVMe Gen4 solid-state drive.

A be quiet! Silent Loop 2 280mm water cooler was installed for the processor, which was paired with a be quiet! STRAIGHT POWER 11 Platinum 850W power supply. A 34″ Dell Gaming S3422DWG monitor — a Quad-HD 3440×1440 display with a 144Hz refresh rate, FreeSync, 10-bit colors, and high dynamic range support — was also utilized.

The reviewer utilized the motherboard’s optimal default settings, set the memory profile to EXPO 6000, and made sure that smart access memory was enabled. No special tuning, optimization, or overclocking was carried out for the tests.

Synthetic Benchmarks and Productivity Performance

PCMark 10 is a comprehensive benchmarking tool that covers the wide variety of tasks performed in the modern workplace. Web browsing, videoconferencing, spreadsheet and word-processing workloads, photo and video editing, and rendering and visualization are some of the tasks tested by PCMark 10.

The gaming test focused on real-time graphics and physics engines of the platform hardware. The 9,186 score the platform achieved was better than 99% of all results produced by PCMark 10.

Blender Benchmark 3.4.0 was used to test the rendering performance of the graphics card. Thanks to the Heterogeneous Interface for Portability (HIP), AMD's compute language for GPUs, which Blender Benchmark uses instead of OpenCL, the Radeon RX 7900 XTX ranked in the top 10% of all benchmarks, delivering an excellent result.

IndigoBench, an OpenCL benchmark based on Indigo 4’s advanced rendering engine, was used to measure the performance of the Radeon RX 7900 XTX. The AMD Radeon RX 7900 XTX achieved 21.500 M samples/s in the bedroom rendering test and 49.599 M samples/s in the supercar rendering test. According to the published test results, the graphics card was 56% faster than the Radeon Pro W6800 and 51% faster than the 6900 XT.

3DMark Port Royal is a dedicated real-time ray tracing benchmark for gamers. The system’s score of 16,319 was better than 93% of all results and almost double the 8,784 score of the older AMD Ryzen 9 5900X and Radeon RX 6800 XT system.

The system’s 3DMark Time Spy Extreme 4K score of 14,242 was better than 95% of all results. The graphics score of 14,593 was 48% higher than what was achieved by the AMD Radeon RX 6800 XT on the same Ryzen 9 7950X platform.

Gaming Performance

Various old and new video games were tested on the platform, including next-gen versions.

Shadow of the Tomb Raider

This game ran at an average 111 frames per second (fps) at 1440p, registering a minimum of 79fps. The highest graphical settings, as well as AMD’s FidelityFX CAS package, were enabled.

Far Cry 6

Far Cry 6 ran at an average 140fps at 1440p, registering a minimum of 123fps. All DirectX Raytracing (DXR) and FidelityFX Super Resolution (FSR) features were enabled during testing.

Cyberpunk 2077

Cyberpunk 2077 ran at an average 67fps at 1440p, registering a minimum of 49fps. Ultra ray tracing presets and FSR 2.1 features were automatically enabled.

The Witcher 3 Wild Hunt Next-Gen

The Witcher 3 Wild Hunt Next-Gen ran at an average 75fps at 1440p, registering a minimum of 60fps. Ultra ray tracing presets and FSR 2.1 features were automatically enabled.

Fortnite

The latest Fortnite game runs on Unreal Engine 5.1 and makes use of several next-gen features (such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution). For example, the Lumen Global Illumination and Lumen Reflection features can exploit the Radeon GPU’s hardware-accelerated ray-tracing capabilities. The updated game ran at an average 82fps, with a 65fps minimum.

IDC Opinion and Conclusion

Ryzen 7950X Performance, Power Consumption, and Heat Considerations

With the Ryzen 7000 generation of desktop CPUs, AMD has taken a new approach to delivering maximum performance. Rather than limiting power, AMD allows its CPUs to consume as much power as the socket can deliver and reach their thermal throttling temperature (95°C for the Ryzen 9 7950X). With a much higher CPU thermal design power of 170W (compared with 105W for the Ryzen 9 5950X), the 7950X reaches much higher single-thread and all-core boost clocks, increasing overall performance and speeding up challenging productivity tasks.

However, enhanced cooling is needed to maintain the performance of the processor (this means increased fan noise). The system and cooler’s fans will quickly ramp up to high speeds when the processor is executing a demanding workload.

AMD provides several tools for performance and efficiency customizations. For example, Precision Boost Overdrive (PBO) is a feature that has dramatic impacts on power consumption, heat, and noise as it enables users to set overall socket power. On the test system, PBO had a major impact on the power consumption of the Ryzen 9 7950X, with only minimal drawbacks in terms of performance.

For example, the Ryzen 9 7950X running at 105W delivered 93% of the performance and used just 46% of the power of a higher 230W system setting. When AMD’s suggested 65W “eco mode” setting was used, the test platform delivered 73% of the performance and used 38% of the power of the 230W setting. While this configuration flexibility is a welcome change, AMD should make it simple and easy for users to change PBO socket power profiles on the fly. AMD could work with Microsoft to enable this capability via the power management features of Windows 10 or 11.
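
Put in terms of performance per watt, the figures quoted above are striking; a quick back-of-the-envelope calculation, normalised to the 230W setting:

```python
# Relative efficiency of the Ryzen 9 7950X at different PBO socket-power
# settings, using the performance/power figures quoted above (230 W = 1.0).
settings = {
    230: (1.00, 1.00),  # (relative performance, relative power)
    105: (0.93, 0.46),
    65:  (0.73, 0.38),  # AMD's suggested "eco mode"
}

for watts, (perf, power) in settings.items():
    efficiency = perf / power  # performance per unit of power vs. the baseline
    print(f"{watts:>3} W: {perf:.0%} performance at {power:.0%} power "
          f"-> {efficiency:.2f}x efficiency")
```

At 105W, the chip delivers roughly twice the performance per watt of the 230W setting, which is why an easy way to switch profiles on the fly would be so valuable.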

Radeon RX 7900 XTX Considerations

On the GPU side, AMD has increased the GPU power draw to 355W. This is a 55W increase over the 300W draw of the Radeon RX 6900 XT and RX 6800 XT but only a 20W increase over the Radeon RX 6950 XT. Overclocking delivered small gains of around 200 points in 3DMARK tests and slightly boosted gaming fps, but at the expense of increased power consumption, heat, and fan noise. The RDNA 3-based Radeon RX 7900 XTX can hit frequencies of up to 3GHz, with additional power available on AIB partner cards that include a third 8-pin PCIe power connector and significantly enhanced cooling systems compared to AMD reference card designs.

As with the Ryzen 9 7950X, reducing the GPU power limit and undervolting the graphics card significantly reduces overall temperatures and noise for only a few percentage drops in gaming performance.

AMD Ryzen 7000 Series Pricing

AMD faced some criticism about the Ryzen 7000X series and the AM5 platform at launch. Although the pricing of the CPUs was competitive, the initial AM5 motherboards based on the X670E chipset were limited in choice and very expensive. This was compounded by the high price of DDR5 RAM (in comparison with DDR4).

AMD has since lowered the prices of the Ryzen 7000X series and introduced more cost-effective, lower-power non-X CPUs (the 6-core Ryzen 5 7600, 8-core Ryzen 7 7700, and 12-core Ryzen 9 7900). AMD board partners have also introduced a wider range of AM5 motherboards based on the X670, B650E, and B650 chipsets. Volume production of DDR5 has also resulted in more cost-competitive memory options.

AMD Radeon 7900 XTX Pricing

The European street price of the Radeon 7900 XTX is around €1,130, which is higher than the American $999 MSRP listed on amd.com, due to fluctuating exchange rates and taxes (e.g., value-added tax). The cards from board partners are in some cases more expensive, chiefly due to their beefed-up coolers, uprated power systems, and the additional margins of the board manufacturers and their channel partners.

It does not make sense for AMD to lower its profit margins, especially on niche flagship products. AMD’s key competitor asks higher prices for cards in the same category. At the same time, customers may not actually benefit from lower prices in some instances due to intermediaries such as channel partners or scalpers.

It's getting harder to find a bargain these days, but with an MSRP of $999, the RX 7900 XTX reference card is actually $100 cheaper than the previous-generation RX 6950 XT, despite having more compute units, more RAM, a wider RAM interface, and significantly better performance. This should come as good news for high-end gamers.

User Experience

There are a few user experience challenges with the AMD AM5 platform, some of which are created by partners. For example, updating the motherboard BIOS may be difficult for some users. In fact, updating the BIOS of the motherboard in the test system failed a few times; only after a bit of reading and research did the reviewer discover that the older BIOS first needed to be reset to a default state before an upgrade could be made. Fortunately, new standard features (e.g., BIOS flashback) simplify the recovery process when experiments with settings make a system unbootable and clearing the CMOS memory does not work.

Another example: the reviewer found that when games such as Hell Let Loose, Fortnite, or Genshin Impact were launched, a sudden surge in frame rates could dramatically increase power draw and voltage. Fortunately, AMD has countered this issue with safety mechanisms that turn the device off when such surges are detected. After carrying out various tests and checking the health of system components, the reviewer found that these problems can be solved by setting a frame rate cap (e.g., 240fps) using Radeon Software.

Casual users who encounter such issues may feel displeased with the overall experience, particularly if they have to spend hours troubleshooting with systems integrators or manufacturers. They may attribute the problem to the AMD platform rather than to the software house's failure to cap frame rates.

AMD Advantage

The AMD Advantage framework combines Ryzen 7950X processors, Radeon RX 7900 XTX graphics cards, the AMD Software: Adrenalin Edition application, and smart technologies to deliver the best experiences to gamers and creators. The company works closely with prominent system integrators, including CSL, Cyberpower, eBuyer, Falcon Northwest, Maingear, Origin PC, and XIdax, to bring stable and enhanced experiences to gamers, streamers, and content creators. Customers who want to avoid the headache of troubleshooting issues during PC buildouts or upgrades will be pleased with AMD’s investments in additional testing, certification, and support.

Performance

AMD is innovating on multiple levels. The company is enhancing its core architecture for instructions-per-clock (IPC) gains, improving chiplet design for scalable manufacturing, making cache enhancements for memory-intensive workloads, and supporting process and design improvements for clock speed increases. As a result of these efforts, its new products are 30–50% faster than previous generations. The company is not holding back or milking the market with small, incremental gains. Competition is one of the biggest drivers of innovation, and the performance improvements of AMD CPUs and GPUs are benefiting the entire market.

The move to PCIe 5.0 and DDR5 will future-proof the new platform. The inclusion of technologies such as EXPO memory configuration profiles will make performance improvements simpler and more accessible to the wider PC market.

AMD is also investing in backend software development with ISVs. AMD is collaborating with Adobe, Blender, and OBS to improve hardware acceleration across CPUs and GPUs for content creation, productivity, and encoding, decoding, and streaming use cases, thereby adding productivity value to the gaming cards.

Final Words and Conclusion

According to IDC’s monitor tracker, 1440p is the fastest-growing screen resolution. In addition, 3% of users buy 4K and higher-resolution displays. As these users are the target customers for high-end hardware, AMD is on the right track.

However, AMD has increased the power draw of its CPUs and GPUs to deliver this high level of performance. Per IDC's tests, power consumption can be reduced significantly in many cases while still maintaining over 90% of peak performance. In turn, this reduction can lower noise and energy bills. AMD can provide users with the best of both worlds by making it simple to switch performance profiles depending on the workload at hand.

In conclusion, AMD has certainly taken a big step forward with its new generation of CPUs and GPUs. AMD's 2023 desktop platform is taking 1440p gaming mainstream and making 120–140Hz refresh rates the norm. The new platform also makes 4K gaming possible at 60fps, with the highest graphical settings achievable at native resolutions or with FSR. For this and other reasons, including AMD's well-earned reputation for remedying early challenges and aging very well, we have no issue recommending the platform.

Mohamed Hakam Hefny - Senior Program Manager - IDC

Mohamed Hefny leads market research in EMEA on professional workstation PCs and solutions. He also reports on professional computing semiconductors, processors, and accelerators (CPUs and GPUs), as well as breakthroughs and trends related to the market. In addition, Mohamed is actively involved in AI PC taxonomy and research. He participates in business development projects, contributes to consulting activities, and provides IDC customers with analysis, opinions, and advice.

Public cloud may sound quite new to a lot of people, but in fact public cloud as we know it has been around since 2006, when Amazon Web Services (AWS) launched its S3 storage and EC2 IaaS services. The first major organizations started to move their workloads to the public cloud before 2010. In the decade and more of migrating and running workloads in the cloud since then, a lot of lessons have been learned. Surprisingly, however, some of these lessons are still not applied today. Three major lessons that everyone migrating to public cloud should apply are the following:

  1. Don’t just lift and shift your workloads to the public cloud
  2. Cloud management requires new practices
  3. Cloud expertise shouldn’t just be left to application teams

Lesson 1: Don’t just lift and shift your workloads to the public cloud

Lifting and shifting applications does sound attractive at first: a lift-and-shift migration is relatively fast and cheap. Since the applications themselves are touched as little as possible during the migration, the risk of defects resulting from it is low. But then comes the drawback: the added value of the cloud isn't there. By lifting and shifting applications, you also lift and shift the mistakes made in the past and the rubbish accumulated from running the application for years in the datacenter. On top of this, public cloud isn't just another datacenter: design decisions perfectly suited to the datacenter often do not work out in the cloud. IDC Metri estimates from experience that a lift-and-shift migration delivers, at best, what you already had in the datacenter, except at a roughly 30% higher cost.

A cloud migration should be thoroughly prepared, with an application-based approach rather than an infrastructure-based one. For every application, it should be determined whether it will be retired (phased out), retained (kept "as is" in the datacenter), rehosted (migrated "as is"), replatformed (reinstalled on a newly designed cloud platform without adapting source code), refactored (adapted and optimized for the cloud), or repurchased (bought or built from scratch). Tools offering scans at the source code level can greatly enhance the speed and quality of such an inventory, helping achieve a successful migration with actual added value.
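
To make the inventory idea concrete, here is a minimal, purely illustrative sketch; the application names and dispositions are hypothetical, and a real assessment would be driven by code scans and business criteria:

```python
# Hypothetical portfolio inventory mapping applications to the six
# migration dispositions described above. Names are invented.
DISPOSITIONS = {"retire", "retain", "rehost", "replatform", "refactor", "repurchase"}

portfolio = {
    "legacy-reporting": "retire",      # no longer used; phase out
    "core-erp":         "retain",      # compliance constraints; keep in the datacenter
    "intranet-portal":  "rehost",      # migrate "as is"
    "order-api":        "replatform",  # reinstall on a managed cloud platform
    "pricing-engine":   "refactor",    # adapt and optimize for the cloud
    "crm":              "repurchase",  # replace with a SaaS offering
}

assert all(d in DISPOSITIONS for d in portfolio.values())
for app, disposition in sorted(portfolio.items()):
    print(f"{app:>18}: {disposition}")
```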

Lesson 2: Cloud management requires new practices

A typical way of working when it comes to a datacenter is that the datacenter, and the infrastructure inside it, is managed by an infrastructure department (which can also be a managed services provider). The infrastructure department acts as the king of its own castle, with a gatekeeper controlling everything that comes into the datacenter.

The ease of use of public cloud makes it far easier for application teams to manage the full application stack instead of only the software part. The tedious task of installing and configuring infrastructure manually is being replaced by creating infrastructure through Infrastructure-as-Code scripts and ready-made building blocks. In this way, public cloud has become a driving force behind DevOps and the credo "you build it, you run it".
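
For readers unfamiliar with Infrastructure-as-Code, here is a minimal sketch of the idea using Pulumi's Python SDK (one of many IaC tools; the AMI ID is a placeholder). The whole stack is declared in code and rolled out with a single command:

```python
# Minimal Infrastructure-as-Code sketch using Pulumi's Python SDK
# (assumes the pulumi and pulumi_aws packages; run with `pulumi up`).
import pulumi
import pulumi_aws as aws

# A storage bucket and a small virtual machine, declared as code.
bucket = aws.s3.Bucket("app-logs")

web = aws.ec2.Instance(
    "web",
    ami="ami-0123456789abcdef0",  # placeholder AMI ID
    instance_type="t3.micro",     # a deliberately small, inexpensive size
)

pulumi.export("bucket_name", bucket.id)
pulumi.export("web_public_ip", web.public_ip)
```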

But there is a drawback here too. Knocking down the castle wall, and the associated bureaucracy between the application teams and the infrastructure department, also means the gatekeeper isn't there anymore. With that, the natural control mechanism governing the rollout of infrastructure and its associated costs is also gone. Organizations are seeing their cloud costs run out of hand due to this missing control mechanism.

Controlling the cloud without throwing away advantages such as flexibility and scalability requires new practices. Reinstating the gatekeeper is a practice IDC Metri encounters regularly, but this also reinstates the bureaucracy and throws away the advantages of the cloud. The way to go is to make application teams responsible for their cloud costs by giving them insight and control mechanisms: building guardrails with policies that restrict them from instantiating overly expensive resources, creating alerts that warn them when costs are getting out of hand, and actively helping them analyze and manage their costs.
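
In code, such guardrails and alerts can be conceptually very simple. The sketch below is purely illustrative (the allowed sizes, threshold, and team data are hypothetical); real implementations would use the cloud provider's policy and budgeting services:

```python
# Illustrative guardrail-and-alert sketch, not a real cloud-provider API.
ALLOWED_INSTANCE_TYPES = {"t3.micro", "t3.small", "m5.large"}
MONTHLY_ALERT_THRESHOLD = 10_000  # euros, per team (hypothetical)

def check_instance_request(team: str, instance_type: str) -> None:
    """Guardrail: block instance types outside the approved list."""
    if instance_type not in ALLOWED_INSTANCE_TYPES:
        raise PermissionError(
            f"{team}: instance type '{instance_type}' is outside the guardrail"
        )

def spend_alerts(month_to_date_spend: dict[str, float]) -> list[str]:
    """Alert: flag teams whose month-to-date spend exceeds the threshold."""
    return [
        f"ALERT: {team} at EUR {spend:,.0f} (threshold EUR {MONTHLY_ALERT_THRESHOLD:,})"
        for team, spend in month_to_date_spend.items()
        if spend > MONTHLY_ALERT_THRESHOLD
    ]

print(spend_alerts({"retail": 7_500.0, "finance": 14_200.0}))
```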

Lesson 3: Cloud expertise should be bundled and not just left to application teams

Lesson two leads to lesson three. While making application teams responsible for managing their own cloud costs, you can't expect them to know all the ins and outs of the cloud. In-depth knowledge of the cloud should be bundled within a single entity in the organization: the Cloud Center of Excellence (CCoE). The CCoE acts both as an assistance dog and as a watchdog. It helps the application teams with their design decisions, so that they use the cloud optimally and pick the right cloud services for the right task. It also monitors cloud costs at the organizational level and alerts application teams when it sees costs getting out of hand. Besides assisting and watching application teams, the CCoE manages discount schemes, such as reserved instances and savings plans, at the organizational level. More complex cost savings, such as license optimization, should also be driven by the CCoE, since license management appears to be a science of its own.

Critical success factors for the CCoE are a proper mandate, backed by senior management, to prevent it from becoming a paper tiger and, of course, the availability of skilled and competent team members. On top of this, the CCoE should take an organizational and financial approach rather than a purely technical one. Application teams running their applications in the cloud have the technical knowledge; the CCoE enhances this by bringing in knowledge of cloud billing, pricing models, and governance.

Conclusion

Cloud brings a lot of opportunities, but also a lot of challenges. Ten years of widespread public cloud usage have taught us valuable lessons on how to use the cloud in a way that actually brings us forward instead of leaving us where we were, at a higher cost. By applying these lessons, public cloud can be used optimally while keeping control of spend.

Controlling cloud costs and delivering cloud value is within your reach with IDC Metri’s Cloud Economics service. Learn how you can achieve great results.

Inflation is forcing tech leaders to make difficult choices in 2023. CIOs and their teams must choose between delaying some technology investments to keep budgets under control, or increasing budgets by more than business growth or transformation alone would require. The second choice is even less palatable today, as most IT leaders anticipate a recession in the coming year that may force even more aggressive cuts.

Inflation and Currency Fluctuations Alter IT Budget Decisions

IT leaders around the world faced many disruptive challenges in 2022, from IT supply chain issues to continued difficulties securing the skilled people required to support new digital business initiatives. IDC tracked how all of these “storms of disruption” were influencing organizations’ IT spending plans in our monthly Future Enterprise Resiliency & Spending surveys of ~850 IT decision makers around the world. Top concerns changed over time, but at the end of last year, concerns about the rising cost of technology due to inflation and “extra cost bumps” due to currency fluctuations topped the list for organizations across regions.

Source: Future Enterprise Resiliency & Spending Survey – Wave 11, IDC, December 2022, N=840, NA: 255, AP: 370, Europe: 215

In challenging economic times, Chief Information Officers (CIOs) and Chief Financial Officers (CFOs) tend to scrutinize the largest and fastest growing parts of their budgets in anticipation of a need to pare spending. The technology investment areas where surveyed IT leaders most expect inflation and currency fluctuations to affect budget decisions were SaaS, infrastructure hardware/software, IaaS, and PaaS. All, however, are also important investment areas for supporting ongoing digital business transformation. They are at the core of the automation and data analytics projects that can improve a business’s ability to navigate economic disruptions.

Aligning IT Spending with Business Conditions in an Inflationary Environment

When working with CIOs on their technology investment plans for 2023 and 2024, IDC recommends that organizations adopt a two-track approach.

  • Sustain momentum in technology projects that help the overall business navigate and even take advantage of this period of economic instability.
  • Adopt cloud optimization (FinOps) practices and processes that eliminate waste in current cloud investments while also resetting long-term cloud cost metrics (Cloud Economics).

In both tracks, it’s also important to adopt a phased approach that delivers short-term savings, mid-term gains that protect the business if conditions worsen more than expected, and long-term architectural choices that ensure sustainably lower operating costs as business conditions improve.

Adopt a Tech Project Prioritization Strategy that Aligns with the Strengths of Cloud

Cloud Services, as well as the growing number of as-a-Service solutions for infrastructure hardware and software, deliver several benefits.

  • The broad portfolio of cloud services offered by leading providers makes it possible to use the optimum cost/performance service without increasing administrative costs or business risk.
  • SaaS solutions provide the earliest access to innovation in collaborative applications, customer experience, AI/automation, data integration, and security that can be directly tied to critical business priorities.
  • The increasingly intelligent governance capabilities in cloud platforms make it possible to observe and automate reactions to actual changes in compute, storage, data, and process resource requirements in near real time as business conditions change.

All these benefits are about “speed”: lower upfront costs, more flexibility, and paying only for what you need when you need it. Companies that prioritize application modernization and data augmentation technology projects can quickly realize a reduction in budget pressure. Businesses should gravitate towards technologies that can deliver meaningful short-term (in 2023) business benefits through greater use of cloud capabilities.

IDC also advises that CIOs adopt a mid-term strategy that reallocates funds from “run” projects that mostly just push out the elimination of technical debt. We recommend companies focus on “new” projects that can boost revenue or productivity by early 2024. This approach ensures the organization can recover faster if conditions improve or respond more quickly if conditions worsen.

It’s also important to start thinking early about priorities for 2024 and beyond. Regardless of how inflation and currency conditions change, IT organizations will remain sensitive to potential future cost shocks associated with people and processes. Technology suppliers are embedding more automation and AI into all their products and services. 2024 will be the year that forward-thinking businesses identify and execute the right IT automation mix to deliver long-term operational improvements. IDC finds that IT automation projects can yield consistent paybacks in less than 12 months, based on post-implementation reviews.

Adopt a FinOps and Cloud Economics Strategy to Minimize the Risks of Cloud

While “speed” is a cloud benefit that can contribute to lower costs for many workloads, speed also creates new risks. After numerous conversations with CIOs and surveys of IT leaders in 2022, it became clear to IDC that many technology leaders feel their companies are overspending on cloud resources.

To achieve the full benefits of a cloud-centric project prioritization strategy, CIOs must ensure that they are reducing current “waste” in cloud and other IT resources. In the short term, they must empower existing vendor/product procurement and service management teams to pursue cost optimization strategies more aggressively. These include:

  • Scrutinizing duplicate SaaS contracts.
  • Assessing actual usage of existing service levels, particularly for SaaS applications that use graduated per-user pricing (see the sketch after this list).
  • Leveraging and maximizing volume-based discounts with leading cloud providers by selecting complementary products/services from within their marketplaces.
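The sketch below illustrates the second bullet: checking graduated per-user pricing against actual seat usage. The tier table and seat counts are made-up examples, and the model assumes all seats are billed at the rate of the tier the total falls into; real vendor pricing schemes vary.

```python
# A hedged sketch of checking graduated per-user SaaS pricing against
# actual seat usage. The tier table and seat counts are made up.
TIERS = [  # (max_users, price_per_user_per_month)
    (100, 12.0),
    (500, 9.0),
    (float("inf"), 7.0),
]

def monthly_cost(users: int) -> float:
    """Price all seats at the rate of the tier the user count falls into."""
    for max_users, price in TIERS:
        if users <= max_users:
            return users * price
    raise ValueError("unreachable")

licensed, active = 480, 310  # seats paid for vs. seats actually used
print(f"paying {monthly_cost(licensed):,.0f}, need {monthly_cost(active):,.0f}")
```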

In the mid-term, cloud operations teams need to adopt a wide range of standard processes that continually prune spending on unused resources. These teams also need to identify and accelerate adoption of new, lower-cost options (e.g., new compute instances, storage tiers, tiered per-user pricing) as they become available. One significant opportunity is to consider an emerging group of SaaS optimization solutions that provide greater visibility into actual SaaS use across the company. IDC has heard from IT leaders who noted that these solutions allowed them to get a clearer picture of current SaaS use and make changes that delivered major (up to 50%) reductions in recurring spend very quickly.
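As a minimal sketch of one such standard pruning process, the snippet below uses boto3 to list unattached EBS volumes, which continue to be billed while doing no work; region configuration and credentials are assumed to be in place.

```python
# A minimal sketch of one standard pruning process: finding unattached
# (and therefore billed-but-unused) EBS volumes with boto3.
import boto3

ec2 = boto3.client("ec2")

# 'available' status means the volume is not attached to any instance.
paginator = ec2.get_paginator("describe_volumes")
for page in paginator.paginate(Filters=[{"Name": "status", "Values": ["available"]}]):
    for vol in page["Volumes"]:
        print(f"unattached: {vol['VolumeId']} ({vol['Size']} GiB)")
```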

Both of these short- and mid-term efforts fall under what is commonly referred to today as FinOps. While many CIOs note that they already have staff dedicated to these tasks, what many lack is the unified view of actual cloud usage that is critical for achieving sustained rather than just one-time cost savings. Finding the right partner that can help create the required visibility and establish sustainable processes can show paybacks, tied to less over-provisioning, in just a few months.

In the long term, CIOs must also start laying a foundation of governance and cost-optimization guardrails for their organizations’ future cloud use. The end goal of FinOps is to enable sustained assessment, reassessment, and future projection of cloud operating costs. IDC calls this Cloud Economics.

Businesses around the world are building entirely new business models based on smart connected things, AI-enhanced operations, and continuous customer engagement. As a result, the effective use of cloud services will require timely and accurate implementation, scale-out, and long-term cost forecasts for vital digital business systems, and will be less focused on simply cutting the cost of running back-end business systems.

What If Everyone is Overreacting?

As 2023 has progressed, industry observers have begun asking IDC whether this is all an overreaction. In some countries, inflation showed signs of cooling down in early 2023, and the US dollar lost some of its strength. Others wonder if we may all be calling this the “Godot recession of 2023” next year. Is all this “cost optimization” and “project prioritization” wasted energy?

The simple answer is, “No!”

Regardless of where the shifting winds of the economy take global businesses over the next 12 months, the factors that triggered technology cost concerns are already locked in. They include the long-term price increases that many SaaS providers initiated in 2022. They also include the reset in energy prices that corporate datacenters, colocation operators, and cloud providers will be grappling with throughout 2023 and 2024 at a minimum. SaaS providers that run their own infrastructure or leverage IaaS from leading cloud providers will be dealing with cost increases, forcing them to weigh the costs of expansion and innovation against profit margins. CIOs need to keep a close eye on how this develops.

In the long term, focusing on prudent and methodical cost optimization after several years of accelerated technology investment driven by radical, pandemic-driven disruptions is the right strategy no matter how the economic situation evolves in 2023. Tech investment is no longer about supporting the business. Tech is at the core of competing as a digital business. Using Cloud Economics to invest wisely and ensure maximum returns on those technology investments is the key to long term success.

Rick Villars - Group VP, Worldwide Research - IDC

Rick is IDC's chief analyst guiding research on the future of the IT Industry. He coordinates all IDC research related to the impact of Cloud and the shift to digital business models across infrastructure, platforms, software, and services. He helps enterprises develop effective strategies for using their diverse portfolio of cloud investments and applications. He supplies early guidance on implications of critical innovations such as the shift to cloud-based control platforms for deploying/managing infrastructure, data, and code delivery as well as the emergence of AI as a critical IT workload and part of all IT products/services.

Navigating the Storms of Disruption in the Nordics — Finding Potential Beyond the Winds of Change

From the beginning of the pandemic, IDC has talked about the new winds of change that were to come and how to navigate them in a new normal context. Now, we are right in the middle of the change, and people, businesses, and society are enduring new storms of disruption, striving to find opportunities to keep afloat and continue to deliver value.

First the pandemic, and then the Russia-Ukraine war, have created a critical turning point for Europe and the entire world, impacting the functioning and evolution of global ICT markets, from tech demand and supply chain limitations to inflationary pressure, significantly increased costs of energy, skills availability, and increased cyberattacks. And Nordic countries have not been spared the consequences of this situation, with changing macroeconomic conditions resulting in a shift of business priorities. Customer satisfaction has remained the top priority throughout the year, but cost savings have become relatively more important at the expense of profits.

According to the latest IDC Future Enterprise Resiliency & Spending (FERS) Survey from December 2022, more than half of Nordic companies find that rising IT costs have the greatest or second-greatest impact on their ICT spending plans. This is followed by challenges related to the IT supply chain and expectations of a coming recession.

IT investment plans are directly affected. When prices increase, companies need to either increase spending or reduce the amount purchased — e.g., buying fewer units, licenses, or consulting hours. While few markets are uniformly impacted, the technologies and projects that have the strongest and most immediate impact on business performance fare better than the ones that can relatively easily be postponed.

Despite the gloomy expectations, IT spending in the Nordic region (Denmark, Finland, Norway, and Sweden) is expected to outpace GDP growth, as shown in IDC’s Black Book Live Edition from December 2022. However, with inflation expected to be around 5% in 2023, real growth will be limited.

Where Will Nordic Companies Expand Their Investments, Despite the Headwinds?

To maintain performance and mitigate the impact of the geopolitical and macroeconomic situation, Nordic companies will accelerate investments in technologies aiming to support them in their digital transformation.

IDC has identified the following key areas as drivers of spending among Nordic organizations:

Cloud

Companies will keep to their budgetary plans for cloud infrastructure (IaaS) services. Overall cloud spending in the Nordics is estimated to grow at a 22% five-year CAGR, reaching more than $27 billion by 2026. Even though it is more established than other emerging technologies, cloud will continue to drive investments across companies aiming to become digitally ready and accelerate their business performance.
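As a quick sanity check of that growth figure, the arithmetic below works backwards from the ~$27 billion 2026 estimate: a 22% five-year CAGR implies a base of roughly $10 billion five years earlier.

```python
# A quick back-of-the-envelope check of what a 22% five-year CAGR
# implies, worked backwards from the ~$27 billion 2026 figure above.
def implied_base(end_value: float, cagr: float, years: int) -> float:
    return end_value / (1 + cagr) ** years

# ~$27B in 2026 at 22% CAGR implies a base of roughly $10B five years earlier.
print(f"{implied_base(27e9, 0.22, 5) / 1e9:.1f} billion")  # -> ~10.0 billion
```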

Internet of Things

According to the latest IDC European Emerging Technologies Survey from 2022, more than 40% of Nordic companies are already using IoT solutions. IoT is seen as part of the solution to the energy crisis, being critical to reducing costs, optimizing processes, and improving performance. IoT spending will remain steady in the Nordics, especially on use cases related to smart grids, electric vehicle charging, advanced payments, and so on.

Security

Investing in cybersecurity technologies is among the top three priorities for organizations over the next two years. Such investment can be perceived as a cost center, but it also enables new offerings and reassures existing and potential customers. Cybersecurity spending in the Nordics is estimated to grow at a 9% five-year CAGR, reaching more than $6 billion by 2026.

Artificial Intelligence

Artificial intelligence is one of the fastest-growing technologies due to its tremendous potential to improve customer experience, enable new employee experiences, mitigate skills shortages, and transform the workplace environment. According to the latest IDC European Emerging Technologies Survey from 2022, more than 40% of Nordic companies are planning to adopt artificial intelligence solutions in the next two years.

Enterprise infrastructure, managed services, project/professional services, and anything “as a service” are additional areas where Nordic organizations indicated they will continue the pace of their investments.

How to Capture the Pockets of Growth in the Nordic Region

Many companies are struggling to find the right skills or partners to work with on their digital transformation journey, especially in such uncertain times. Offering guidance and support with the right technology stack, customized to address different industry-specific challenges, will be the winning bet in these times.

To address all the challenges Nordic companies are facing, IT vendors must plan their strategy with precision. Adjusting to industry changes and rapid demand fluctuation, being use-case centric, and placing the right “growth” bets will be crucial to resist storms and headwinds.

IDC will help technology vendors remain resilient, stay competitive, and win revenue despite turbulent times by supporting their precision planning with the following assets:

  • Assess your competition and your position by analyzing technology markets, vendor shares, and forecasts — IDC Trackers
  • Position your products and services to the appropriate audience with an extensive market overview — IDC Black Book
  • Find strategic opportunities by industry, company size, use case, and geography — IDC Spending Guide

Contact us for more information about how IDC Data products help business leaders target, plan, and execute their most important strategic initiatives, analyzing more than 100 countries, 120+ technology markets, 20 industries, and more than 400 use cases.

In a recent webinar, we presented an entirely new marketing and sales approach to replace the well-known funnel. Based on extensive research, IDC’s new Adaptive Customer Experience (ACE) model is a circular, evolving framework, not a linear one like its predecessor. This is because engagement with tech buyers is no longer linear. B2B buyers have B2C experience expectations today. They expect you, as a vendor, to understand and provide solutions for their challenges, jobs to be done, and business outcomes. ACE is a customer-centric framework for evolving how you go to market.

“Marketing is the conductor of orchestrated journey engagement, with data, automation and analytics to make this all work.”

– Laurie Buczek, Research VP, CMO Advisory Practices

And perhaps most importantly, to fully engage with a buyer, marketing and sales need to play on the same field. Put another way, the two functional groups should no longer be looked at as if they are running a relay race, where marketing passes a baton to sales to finish the race. It is, instead, a customer experience journey, more like an orchestra, where all the instruments work together to make one beautiful masterpiece. And if marketing does a good job as conductor of the orchestra, the sound becomes music that buyers want to sit and listen to, rather than just noise that they walk away from.

“More activity doesn’t bring more success” in “random acts of marketing”

– Jason Cunliffe, GVP Content Marketing Services

The webinar introduces this new framework and covers all of its elements in depth, from new analytics to content considerations. If you did not have an opportunity to attend, you can listen to the on-demand version. We received fantastic questions that we think will help you get started with a new marketing and selling approach.

Q1: Is the customer data platform (CDP) available in marketing and sales automation solutions?

CDP is a very large category right now. You’re seeing niche players, but also established CRM and marketing automation vendors that are baking CDP into their platform experience. For more information, you can read IDC’s MarketScape: Worldwide Customer Data Platforms Focused on Front-Office Users 2021–2022 Vendor Assessment.

Q2: Do you recommend building bespoke analytic platforms or are most off-the-shelf platforms good enough?

We are seeing that commercially available CDP and analytics capabilities can deliver functionality faster, more economically, and at greater scale than home-grown solutions. Use the 80/20 rule to start: 80% off the shelf, with the option of light customization for the remaining 20% of capability.

Q3: On the customer data platforms and analytics, can you talk about where you would typically see these functions reside in an organization? 

IDC recommends that customer data platforms (CDPs) be implemented as enterprise capabilities rather than as an overlay on a functional data silo. ACE hinges on democratizing customer data rather than keeping it a departmental resource. Data and analytics are the enterprise resources that enable adaptive engagement in near real time. Implementing them successfully requires a collaborative effort between customer-facing business functions and IT. For more information, refer to The Data Paradox Reshaping the CIO and CMO Relationship.

Q4: Do you assume the tech stack and data scientists are in place to implement ACE? If not, how does one start?

It is important to approach this holistically and as a system. Often, companies buy the tech first. Let’s learn from the historical missteps of buying technology and thinking it will magically change the way you operate. It’s like buying a Ferrari without having anyone who knows how to drive it. You must start by thinking about the skill set you need. Marketing leaders are building marketing data operations competencies in-house, including people who understand the business and how to use data at a sophisticated level to extract insights that inform marketing action. Then couple this with people who really know how to create content that tells a connected story and supports buying decision journeys, rather than creating content in silos.

IDC Research finds that at least 25% of your people competencies are going to be in content strategy and in marketing data science.

Q5: Can you expand on the type of effective content that moves the needle?

Content that is highly measurable outperforms content that cannot be measured. So, content that is interactive and provides a deeper, richer experience is both more engaging and more measurable. It’s through testing that you can adjust content until it delivers the outcomes you want. You should also consider all of the other elements working together. Examine what efforts come before the content launches and what comes after. Think about the entire customer experience with the content.

In addition, IDC Research finds that B2B buyers expect more interactive, immersive, and omnichannel content across their engagement stages. It is a mix of content mediums that lines up with how buyers want to receive specific types of information. Content creativity has been noted as most important for marketing effectiveness.

Q6: What are some examples of interactive content?

This is important to start to define. What does this look like? It can be calculators or business value assessment tools on your website, where users can calculate how their business outcomes can be achieved (or the ROI) with your solution, without having to speak immediately to a human. Some buyers still want long-form content pieces such as PDFs, but significantly more want shorter content formats, such as ebooks with embedded video, that provide the option to read the content in smaller, digestible pieces if needed. Also consider mobile applications and augmented reality that provide a realistic overlay of what your solutions could look like in a buyer’s environment.
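For a concrete feel of what such a calculator might compute behind the scenes, here is a minimal sketch; the formula and the numbers are generic illustrations, not an IDC methodology.

```python
# A minimal sketch of the kind of ROI calculator described above, as it
# might sit behind an interactive web widget. Inputs are illustrative.
def simple_roi(annual_benefit: float, annual_cost: float, years: int = 3) -> float:
    """Return ROI as a fraction over the given horizon."""
    total_benefit = annual_benefit * years
    total_cost = annual_cost * years
    return (total_benefit - total_cost) / total_cost

# Example: $450k/year benefit against $300k/year cost -> 50% ROI.
print(f"{simple_roi(450_000, 300_000):.0%}")
```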

IDC Custom Content Solutions sees that 70% of the content we create for clients is consumed on mobile. So, this is a key channel that needs to be considered, along with improvements in content readability.

It is time for a new operating logic.

To purchase the new IDC Perspective and learn more about the research behind the ACE Framework, click here – The Digital-First Era Demands a New Marketing and Sales Model: Introducing Adaptive Customer Engagement.

Or, to learn more about how we can help you implement the ACE model please reach out to one of our colleagues at IDC. Contact us today.

Introducing a new lead generation service! To make it easier for marketers to gain the most value from their IDC MarketScape and truly drive qualified leads, we have worked with Foundry to create an enticing lead generation package leveraging their media brands, which capture proven customer engagements.

Laurie Buczek - GVP, Research - IDC

Laurie Buczek is the Group Vice President of Executive Insights at IDC, where she spearheads the global research initiatives that shape the industry's understanding of digital business transformation, evolving buying behaviors, and technology investments. She leads IDC's premier research practices, including the CMO Advisory Practice, C-Suite Tech Agenda, and Digital to AI Business Transformation. As the principal analyst for the CMO Advisory Practice, Laurie advises senior marketing leaders on driving business growth through deeper customer connections and the strategic evolution of the marketing function, with a keen focus on AI's transformative impact. Her expertise and thought leadership empower executives to navigate the intersection of technology, business strategy, and customer engagement in today's dynamic digital landscape.