We are a very inquisitive species with a remarkable long-term record of adaptation and with even more remarkable recent accomplishments in making the lives of most of the world’s population healthier, richer, safer, and longer. Still, fundamental constraints persist: We have changed some of them through our ingenuity, but such adjustments have their own limits.

— Vaclav Smil, How the World Really Works (2022)

 

The industry sector needs resources more than ever, particularly rare minerals. Even as the hunt for such resources intensifies, the industry is pushing to achieve sustainable growth and meet new environmental, social, and governance (ESG) goals.

According to the Copper Alliance, renewable energy systems require up to 12x more copper than traditional energy systems. Copper demand is expected to increase nearly 600% by 2030.

Renault’s Chairman Jean-Dominique Senard told Reuters news agency: “If there’s a real geopolitical crisis, the damage to battery factories solely powered by products coming from outside will be considerable.”

According to the UN’s Intergovernmental Panel on Climate Change, reducing industry’s greenhouse gas (GHG) emissions requires coordinated action across value chains. Such action includes circular material flows and transformational changes in production processes.

The manufacturing industry remains at the forefront of efforts to reduce the impacts of extracting natural resources and to secure materials that enable low-carbon production. But to meet these and other challenges, organizations must continue to find efficient ways to transform their value chains into closed-loop flows of the basic materials needed to extend product lifetimes. And they should double down on their recycling programs by finding ways to turn materials from end-of-life products into completely new products.

A series of game changers have been pushing organizations to be more efficient with resources and to adopt the principles of the circular economy. These include:

  • Organizations have been adopting sustainability policies that call for them to reduce their carbon footprints to at least net zero.
  • The massive spread of electromobility has turned the EV battery business, and the rare minerals needed for such batteries, into critical assets.
  • The COVID-19 crisis showed that it can be risky to depend on third parties to transport strategic materials around the world.
  • Digital technology has developed significantly in the past three years, especially in terms of cloud-based digital platforms and IT infrastructure, artificial intelligence-powered digital tools, and generative AI engines.

Manufacturing organizations, at least in theory, are in an ideal position to make circular principles inseparable from operations. Operationalizing circular principles at scale, however, remains one of the biggest challenges for managers across lines of business and industries.

In IDC’s 2022 global survey of 1,300+ manufacturing organizations, 58% of respondents said they had already incorporated circular economy principles into operations, including design and production processes, waste reuse, and local sourcing of resources. More than two-fifths (43%) of respondents said that shrinking carbon emissions and their CO2 footprints are key elements of achieving their ESG/sustainability strategic business goals. A similar share (41%) also cited the goals of reducing waste and driving cost efficiencies.

Reduced carbon emissions, along with cost reductions driven by the optimized use of materials, labor, and assets, are among the benefits organizations report after adopting circular economy principles.

The auto industry is pioneering circularity principles in operations, particularly in the area of EV and EV battery production.

  • In 2022, General Motors announced an initiative to recover and reuse the raw material in its Ultium battery packs, thus driving down costs and making the manufacturer’s EVs even more sustainable.
  • Stellantis established a Circular Economy Business Unit whose objective is to “extend the life of vehicles and parts, ensuring that they last for as long as possible, and returning material and end-of-life vehicles to the manufacturing loop for new vehicles and products.” According to the company’s website, multi-brand parts that are still in good condition are recovered from end-of-life vehicles and sold in 155 countries through the B-Parts ecommerce platform.
  • Renault’s “The Future Is NEUTRAL” entity aims to scale the closed-loop automotive circular economy, with the aim of moving the automotive industry toward resource neutrality.

These are all great initiatives that seek to improve material resiliency, make more efficient use of resources across the value chain, slow the impacts of climate change, and deliver sustainable profit and increased customer trust.

 

Download eBook: Sustainability in EMEA: Opportunities for Tech Vendors, Challenges for Tech Buyers

 

Operational Challenges

The following is a brief rundown of the operational challenges that organizations must tackle to reach a meaningful level of profitable circularity.

  • Fragmented Approach: Many organizations lack a clear, unified strategy, so circular principles are applied opportunistically, mostly in production areas where the effort brings immediate benefits or solves obvious issues.
  • Logistics: Many organizations struggle with insufficient production infrastructure and related logistics. Retrofitting remanufacturing and repair onto current operational setups significantly reduces overall efficiency across production, warehousing, and delivery processes.
  • Transparency and Flexibility: Implementation of circular principles in operations requires absolute transparency, traceability, and operational flexibility. To secure circular principles during the entire life cycle of the product, data related to the product’s usage must be captured and shared in real time in an autonomous, touchless way.

Faced with these challenges, a digital thread — a closed loop between the physical product and its digital representation — can provide relevant feedback to the product’s lifetime stakeholders. To make such data flows a reality, however, several technology elements must converge, including ubiquitous connectivity, IoT, digital twins, and data capture and sharing via cloud-based digital platforms.
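As a rough illustration of what such a closed loop involves, the sketch below models a digital twin that captures usage readings pushed from a physical product and aggregates them for life-cycle stakeholders. All names and values are hypothetical; a production system would sit on an IoT platform and a cloud data layer rather than in-memory objects.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class UsageEvent:
    timestamp: str
    metric: str
    value: float

@dataclass
class DigitalTwin:
    """Digital representation of one physical product instance."""
    product_id: str
    events: list = field(default_factory=list)

    def capture(self, metric: str, value: float) -> None:
        # A usage reading pushed from the physical product (e.g., via IoT).
        self.events.append(
            UsageEvent(datetime.now(timezone.utc).isoformat(), metric, value)
        )

    def lifetime_summary(self) -> dict:
        # Average per metric: the kind of feedback shared with stakeholders.
        totals: dict = {}
        for e in self.events:
            totals.setdefault(e.metric, []).append(e.value)
        return {m: sum(v) / len(v) for m, v in totals.items()}

twin = DigitalTwin("battery-pack-001")
twin.capture("charge_cycles", 1.0)
twin.capture("cell_temp_c", 31.5)
twin.capture("cell_temp_c", 33.5)
print(twin.lifetime_summary())
```

The point of the sketch is the loop itself: usage data flows from the product into its digital counterpart, where it is contextualized and made queryable across the product’s lifetime.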

Detailed transparency requires seamless integration of enterprise software. Examples of such systems include product life-cycle management, bills of material hierarchy, enterprise resource planning with remanufacturing functionality, logistics management, manufacturing management platforms, and servicing platforms.

Up-front costs and investments can be significant barriers to circularity. Achieving meaningful impact at scale requires coordination across functions and the involvement of various stakeholders inside and even outside of the company.

Suppliers, reverse logistics providers, remanufacturing and repair centers, customers, and technology partners must be coordinated into a perfectly synchronized machine. A circular environment is far more complex than a traditional linear chain, and organizations may struggle to build a business case with a short ROI horizon.

Organizations must also determine whether circular principles can be applied to a product that is already in production — or if circular product design and management should instead be implemented only for new products, at the beginning of their life cycles.

Going “circular native,” as I term this last option, was very important to 46% of respondents, and extremely important to 38%, in an IDC Manufacturing Insights survey. “Circular native” is defined not by materials or extended life cycles but by a connection, via digital thread, to data sharing across a product’s entire lifetime.

The operationalization of circularity requires solid collaboration among procurement, engineering, and supply chain managers, especially during the design and supplier selection process.

It must also be acknowledged that the complexity of supply chains can make it challenging to establish closed-loop systems. Collaboration and coordination among suppliers, customers, and other partners are necessary for efficient material flows. And resource recovery must be underpinned by digital technology (e.g., cloud-based supply chain control towers).

 

Register for the webcast: Sustainability in EMEA: The Challenge of Moving from Ambition to Action

 

Boiling the Ocean?

For some leaders, embedding circularity principles in manufacturing operations — including reengineering product specifications according to circular principles — may feel a bit like “boiling the ocean,” or undertaking a seemingly impossible or unnecessarily difficult task.

Yet there are a great many benefits to providing data on technology processes and supply chains to stakeholders in real time. Products connected via digital thread to closed-loop stakeholders can help organizations better manage the product’s life-cycle bill of materials, collect data to improve the next generation of the product, and contextualize product data with current point-of-use data to provide a comprehensive view of the product’s life-cycle status.

To achieve circular economy success, circular principles must be embedded across the entire product life cycle, including packaging. And the digital twin of the product must be integrated with a cloud data platform.

Circularity is not just about utilizing sustainable and recyclable materials: Life extension is a significant element. The most sustainable material is one that doesn’t need to be processed. Repair and remanufacturing are thus integral steps of the product life cycle.

Circularity also requires investments in digital tools capable of handling manufacturing processes in which input and output indicators may not always be well defined. Manufacturers that tackle this challenge should consider dedicated software enhanced with features such as reverse bills of material, disassembly, expected recovery and kitting, remanufactured parts management, and remanufacturing pricing with core charges.
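A reverse bill of materials inverts the usual BOM question: instead of what goes into a product, what can be recovered from it? The sketch below, using invented components and recovery rates, shows how expected recovery might be estimated from such a structure.

```python
# Hypothetical reverse BOM: for an end-of-life product, the recoverable
# components, quantity per unit, and expected recovery rate after disassembly.
REVERSE_BOM = {
    "ev_battery_pack": [
        ("battery_module", 8, 0.85),
        ("busbar", 16, 0.60),
        ("housing", 1, 0.95),
    ]
}

def expected_recovery(product: str, units: int) -> dict:
    """Expected count of reusable components from disassembling `units` products."""
    return {
        comp: round(units * qty * rate)
        for comp, qty, rate in REVERSE_BOM[product]
    }

# Planning input for remanufacturing and kitting: expected yield from 100 packs.
print(expected_recovery("ev_battery_pack", 100))
# {'battery_module': 680, 'busbar': 960, 'housing': 95}
```

In practice these recovery rates would be learned from inspection and disassembly history rather than fixed constants, which is exactly the kind of feedback a digital thread can supply.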

Data and contextualized life-cycle information, including carbon emissions, are real enablers of optimizing circularity principles in the manufacturing and supply chain environment.

In today’s hyperconnected world, moving from fascination with circular principles, to visualizing them, to implementing them isn’t viable without reliable and secure digital infrastructure, relevant digital tools, and AI-powered technology.

 

Bottom line: When it comes to securing material resiliency and achieving ESG goals, there is no time for hesitation or inertia!

 

To find out more about manufacturing, visit our website; for framework-based guidance on how manufacturers can develop and deploy circular principles in their operations, click here.

Europe is gradually recovering from the worst energy crisis in a generation, which started as a tight supply market in 2021 and quickly escalated into a full-blown global supply shock, with energy prices peaking in Q3 2022 at levels unseen in decades. This year, as prices and supply readjust to profoundly changed market fundamentals, Europeans are weighing the long-term consequences of this crisis on their consumption behavior, the cost of doing business and broader decarbonization strategy.

In this context, energy efficiency has quickly risen to the top of the business and policy discourse, not only as a tactical tool to tackle higher energy prices today, but also as a key foundation of the EU’s climate transition under the ‘Fit for 55’ strategy.

In the near term, energy efficiency can improve consumer resilience, helping them cope with a higher cost environment. In the medium term, it should make it relatively less painful for Europe to regain its lost energy security, helping reduce energy dependency and diversify supplier risk.

Longer term, it has the potential to lower the cost of the energy transition by reducing the investment needed to decarbonize power production and electrify energy use.

Converging Towards Energy Efficiency: Policies, Prices and Demand Across Sectors

From a market standpoint, the time is ripe for Europe to raise its energy efficiency game as it now sits at the convergence of three critical enablers of a functioning energy services market.

  1. Policies and subsidies. Several pieces of legislation are being (or have recently been) rolled out that will accelerate changes in the way energy is used and produced in the EU. The most critical one on the use side of the balance is the ongoing revision of the Energy Efficiency Directive (EED); others include revisions of the directives covering the Energy Performance of Buildings, Renewable Energy, and Energy Taxation.
  2. Energy prices. In June 2023, EU wholesale electricity prices were still more than 70% higher, and gas prices 2.7 times higher, than in June 2019. Pivoting away from cheap and abundant piped Russian gas to new supplies (including via LNG, with all the related infrastructure and transport complexities) means the market may remain tight, resulting in higher prices than pre-2021 levels in the medium term.
  3. Market demand. In just one year, the energy crisis has done more to fuel the European consumer’s demand for energy efficiency than decades of direct incentives and tax credits. Especially for commercial and industrial energy consumers, from process manufacturers to food retailers and hospitals, the tactical need to react to higher energy cost is triggering investments that can serve these businesses well in their longer-term decarbonization plans. In the immediate aftermath of the energy crisis – IDC data shows – almost half of European businesses were planning to improve the efficiency of their energy use to limit the impact of higher energy prices on the cost of doing business. At the same time, between 50% and 60% were planning to invest in energy efficiency (both data- and capital investment-driven) as part of their broader decarbonization strategies.

This renewed focus has profound implications not only for energy suppliers and service providers but also for large and small energy consumers across European industries and their technical ecosystems.

European manufacturers and retailers, for example, have long been working on their energy mix and consumption to generate cost efficiencies, meet growing customer expectations and target ambitious long-term sustainability goals. In today’s energy price environment, however, energy efficiency has become critical to sustain profitability and competitiveness. This is particularly the case for organizations competing with non-European producers that have access to cheaper energy supplies.

Manufacturing

While energy efficiency has always been a consideration for manufacturing organizations, access to relatively cheap energy, loose regulatory requirements and the lack of effective digital technology led to some complacency in the past. Nowadays, manufacturers have the ability to contextualize and analyze real-time data by breaking down data silos across their IT and OT estate. With access to data, technology owners on the shop floor can adjust production plans and material routes accordingly.

Additionally, energy efficiency initiatives have the long-term potential to help manufacturers jump-start broader data-driven process improvement strategies. For example, a prominent Tier 1 global automotive supplier successfully connected over 250 energy-related data points. The energy management system allowed the company to analyze the energy consumption of injection molding machines for each produced part. With this data, the company could not only adjust production equipment and determine the most efficient injection molding machine for the parts being produced but also detect equipment anomalies and alert supervisors.
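The core of that analysis can be sketched in a few lines. The machine IDs, consumption figures, and the 20% anomaly threshold below are all invented for illustration; a real energy management system would stream these readings from the shop floor.

```python
readings = [
    # (machine_id, kWh consumed, parts produced) over the same shift
    ("IM-01", 120.0, 400),
    ("IM-02", 150.0, 420),
    ("IM-03", 118.0, 410),
]

# Energy consumed per produced part, per injection molding machine.
energy_per_part = {m: kwh / parts for m, kwh, parts in readings}

# The most efficient machine for this part is the benchmark.
best = min(energy_per_part, key=energy_per_part.get)

# Flag machines consuming >20% more energy per part than the benchmark,
# as candidates for a supervisor alert.
anomalies = [m for m, e in energy_per_part.items() if e > energy_per_part[best] * 1.2]

print(best)       # IM-03
print(anomalies)  # ['IM-02']
```

Even this simple per-part normalization is enough to route production to the most efficient machine and to surface equipment drifting out of its normal consumption band.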

Retail

Retailers too are prioritizing the implementation of energy management systems in their retail operations, along with a growing focus on supply chain and logistics efficiency, to minimize overall energy consumption.

For retailers in particular, implementing energy-efficient technologies and practices goes well beyond sustaining profitability and competitiveness. As consumers become increasingly conscious of the environmental impact of their purchases, energy efficiency becomes a clear first step towards achieving sustainability goals that align with such changing preferences.

This should not be viewed (only) as a way to enhance brand reputation and attract environmentally conscious consumers, but as a way to materially improve their environmental footprint. For example, at the beginning of the energy crisis, one of the UK’s leading food and grocery retailers strengthened its commitment to tackling the climate crisis. This meant cutting as much as five years from its target to become carbon neutral in its own business and operations (Scope 1 and 2), bringing it forward to 2035. To do so, the grocer is focusing on maximizing the energy efficiency of its operations; reducing carbon emissions, food waste, plastic packaging, and water usage; and increasing recycling.

Healthcare

For European healthcare organizations, higher energy prices are rubbing salt into the wound of the enormous resource strain caused by two years of the pandemic.

The sector is one of the largest and most sophisticated energy consumers, and hospitals are typically among a territory’s most energy-intensive buildings. Not only do medical equipment and healthcare facilities, on which patients’ lives depend, require a 24/7 power supply, but within the same hospital each facility and department has its own requirements in terms of access, lighting, temperature and humidity, cleanliness and air filtration, and availability of water, power, medical gases, and communications.

With healthcare fees typically lagging inflation, often by several years, and with energy bills up by as much as 100% or more since 2021, energy prices are not only hurting hospitals’ bottom lines but diverting crucial resources from patient care. This adds to inflation increasing the cost of medical equipment, pharmaceuticals, medical logistics and other expenses outside core operations.

In this context, European hospitals are prioritizing efforts to reduce energy consumption (and limit their carbon emissions in the process) without impacting the quality and safety of day-to-day care.

Two investment areas are worth calling out. The first is adopting sustainable design principles for new builds, for example using parametric modelling to track the rise and fall of the sun across the seasons so that designers can make the most of natural light and solar radiation. Plans for rooftop solar are also increasing, enabling hospitals to self-generate and decarbonize part of their energy needs. The second is deploying smart assets and measurement systems to monitor temperature, air quality, occupancy, and humidity and to optimize operations.

Public Sector

European governments and public administrations, for their part, will have an increasingly relevant role to play going forward. They are expected not only to regulate and orchestrate but to actually lead the energy transition, demonstrating best practices and setting a benchmark against which other organizations can measure themselves.

The proposed revision of the EU EED is a case in point. It firmly establishes that the public sector should have an “exemplary role” underscored by specific, more aggressive energy efficiency goals than the rest of the economy. Similarly, the UK Government’s Net Zero Strategy states that “the wider public sector will lead by example during the transition to net zero.”

This is critical because governments are among the largest contributors to European economies. They have their own significant direct environmental footprint and therefore a critical influence on the journey to net zero. For example, the UK Government estimates that emissions from public buildings account for approximately 2% of total UK emissions, and this figure includes only estimates of fuel burnt, not wider Scope 1, 2, and 3 emissions.

Driven by regulation, higher energy prices, NextGeneration EU funding, and public expectations, local, regional and national governments are putting in place measures to improve the efficiency of their biggest emitters – transport fleets and public buildings and assets.

IDC research highlights that, across Europe, 37% of governments are investing in building energy management systems and nearly 60% are investing in workplace management systems to optimize space utilization and occupancy. It must be noted that a selection of government departments, due to their size and function, generate the bulk of public sector emissions.

For instance, the Ministry of Defence is estimated to account for 50% of UK central government emissions; accelerating energy efficiency measures in such departments is therefore essential.

Financial Services

Because financial services is a relatively less energy-intensive sector, the direct effects of higher energy costs were less critical for it than for other industries. The major energy consumers in financial services are data centers and, to a lesser degree, office buildings, and even for these the increase in cost remained manageable.

Financial services, however, play a critical role in enabling the energy transition of their corporate and consumer customers through the issuance of green and social bonds, credit, and other financing options. The surge in energy prices will likely have delayed the net-zero targets of banks’ lending portfolios, as customers have been forced to use working capital to pay their energy bills.

The bigger dilemma, however, is that, in addition to renewables, diversification away from Russian gas will require major investments in oil and gas exploration and import infrastructure, which fundamentally counters Europe’s Green Deal policies.

In autumn 2022 there were also concerns that energy suppliers and energy-intensive industries might buckle under the crisis, which increased the pressure on banks to prepare for loan defaults. Thanks largely to the estimated €758 billion (source: Bruegel) in fiscal policy measures allocated by European governments to protecting consumers from rising energy costs (including the nationalization of energy utility giants Uniper and EDF), European banks saw only a marginal increase in loan defaults.

Overall, the energy crisis may have slowed down the green transformation of the financial services industry asset base, but the long-term opportunities of going net-zero remain sound.

Utilities

Finally, turning to the supply side of the energy balance, energy and utility companies represent the business and infrastructure backbone of the energy transition.

Over the past five to 10 years there has been a substantial uptick in investment by European utilities and energy suppliers in the energy services company (ESCo) space. From diversified energy companies to international electric utilities, energy infrastructure operators, and municipal multi-utilities, many traditional players have added energy management technology and efficiency capabilities to their portfolios.

For example, between 2015 and 2019, a major European power utility acquired companies covering the full stack of B2B energy technology and services. The resulting ESCo offers energy analytics and energy management technology, financing and operations of solar, storage and co-generation plants, energy audit services and performance contracting, as well as demand side response solutions.

The strategic intent is clearly to integrate horizontally by adding to the existing commodity business a set of solutions that enable customers to consume more sustainably and cost-effectively, in an effort to meet the growing demand for efficiency. The energy crisis has obviously provided fresh impetus to this type of strategy. Reflecting this acceleration, at the end of last year IDC predicted that by 2025 a third of competitive gentailers would set up integrated supply, efficiency, decarbonization, and electrification service portfolios, growing average profit per customer by more than 20%.

 

Contributing analysts: Jan Burian, Adriana Allocato, Massimiliano Claps, Louisa Barker, Tom Zink and Filippo Battaini

For more in-depth insights into industry coverage, visit our website.

Global EV Market Landscape

The automotive industry is facing the most important transition period in its history — the replacement of the traditional internal combustion engine with more sustainable, energy-saving, and environmentally friendly technologies. The traditional engine has dominated powertrains for more than a century.

In 2022, the worldwide electric vehicle (EV) market exceeded 10 million units, with a penetration rate of 14%. In 2023, it is expected to reach 14 million units, with a penetration rate of 18%. Overall, China and Europe are leading the market, whereas the United States and other developing regions have great potential.

Electrification, connectivity, autonomous driving, and ride sharing are the four key trends driving this transition, resulting in the rapid growth of the global EV market. On the supply side, governments treat EVs as a national strategic priority, providing subsidies that encourage players to develop their businesses.

Greater investment in R&D and innovation has resulted in breakthroughs in core technologies such as 5G, operating systems, and V2X. As such, traditional OEMs, technology giants, and emerging players are all trying to seize the opportunities of the electric vehicle market. On the demand side, more and more customers now prefer green travel and are willing to pay for intelligent functions.

In 2022, the top three players in the worldwide EV market were BYD, Tesla, and SAIC-GM-Wuling, with Tesla falling behind BYD. As more electric models have emerged from other players, Tesla’s market share has steadily eroded, falling from 17% in 2019 to 13% in 2022; it is expected to stabilize at around 10%. Tesla needs to diversify its product line with a cheaper compact car if it is to regain the number one spot.

The industry transition will be fast, and both opportunities and challenges exist. Only by establishing an advantage early can players get ahead of their competitors and win.

Competition in the EV market is fierce. Here is some of IDC’s advice for OEMs looking to capture opportunities in this market:

  • Targeting Valuable Markets: Continue investing in the most valuable markets: China, Europe, and North America. China and Europe are the leading electric vehicle markets, with strong government subsidies and promotion over the last few years and high customer awareness. In these markets, competition will become more intense and products more segmented. The United States has also begun to drive the electric vehicle market: the Inflation Reduction Act (IRA), signed by the Biden administration in August 2022, has had a significant impact on the electric vehicle industry. Some developing countries, such as India, Thailand, the Philippines, and Indonesia, are also showing potential.
  • Strong Company Positioning: The leading companies in the EV market position themselves as energy or technology companies, which are bigger than automotive, while maximizing synergies between different business portfolios. Overall, high-end brands are beginning to penetrate the low-end, and low-end brands are trying to break through to the high-end. Segmented markets and high-quality EV products have become competitive hotspots.
  • Technology Strategy: Choose the most suitable technology route for the company’s goals, whether low cost, fast time to market, or high product quality. At the same time, increase investment in R&D and innovation, especially in software.
  • Create a Resilient Supply Chain: Actively reconsider a pure outsourcing strategy and secure upstream core components from a strategic perspective to enhance control of the supply chain. In addition, build a more digital and intelligent supply chain management system to increase resilience and agility.
  • Talent Strategy: Talent has become increasingly important across industries, and OEMs are no exception. Start as early as possible to identify talent and skills shortages, especially in the areas of the internet, AI, information and communications technology, energy, and power. Build healthy and attractive systems to attract top talent and maintain employee satisfaction.

For more information on IDC’s Worldwide Semiconductor Automotive Ecosystem and Supply Chain Research, please visit this page.


Adela Guo - Research Manager - IDC


Generative AI has wowed individuals across the globe with its ability to find information and author high-quality content. For enterprises, the use cases are still being explored and defined. In this blog, we explore a potential ‘killer app’ for generative AI: the virtual mentor, a new way to do learning and onboarding.

In today’s organizations, the vast majority of mentoring is done by speaking to experienced colleagues, looking for answers on the public internet or in company-specific intranets, trawling through various PDF guides and presentations, or perhaps taking e-learning courses or classroom sessions. The problem is that existing technologies and approaches offer no easy way of finding the information employees need.

Current e-learning and onboarding solutions struggle with multiple challenges. First, the content is costly and time-consuming to produce. Second, once produced, it is generally static and quickly becomes outdated. Third, the one-size-fits-all approach to learning and onboarding doesn’t meet the needs of the individual who already knows all about A but would like to deep-dive into B.

We believe generative AI will be a game changer in solving these problems because, for the first time, the systems themselves can generate the needed learning content. Future virtual mentors will meet many of today’s unserved learning and onboarding needs. Employees will be able to interact digitally, remotely or in the office, intensively or in drip-feed style, and the learning content will be created on the fly, shaped largely by the nature of the interaction and the learner’s queries.

AI-Powered Virtual Mentor vs. Previous Learning Approaches

First of all, let’s define generative AI. We define generative AI as a branch of computer science that involves unsupervised and semi-supervised algorithms that enable computers to create new content using previously created content, such as text, audio, video, images and code.

Secondly, let’s define what an AI-powered virtual mentor is. We envision the AI-powered mentor having the following characteristics:

  • Always available. Like Microsoft’s failed personal digital assistant Clippy (remember the animated talking paperclip?), a virtual mentor will be an always-available resource for the learner.
  • Creates content itself. If fed enough material, a generative AI-powered virtual mentor will be able to create the relevant teaching material itself by synthesizing existing content.
  • Conversational. Just like a real-life, human mentor, the AI-powered virtual mentor interacts via conversation. The human mentor converses verbally, while the virtual mentor works best via written conversation (although verbal user experience is on its way, as well).
  • Adaptive. A virtual mentor goes far beyond what is known today as ‘adaptive learning’, i.e., an e-learning experience with some variation in the course depending on the individual learner. A virtual mentor can freestyle and follow the learner wherever they want to go within a general topic area.

An employee would be able to ask a wide variety of general questions to the virtual mentor, such as:

  • What is the pricing structure for product X?
  • Do we have representation in Peru?
  • What are the key new features in the version YY.YYY of product Z?
  • What is the expense management policy for a client meeting?
  • Who in my company works with [expertise area]?

Let’s compare working with a generative AI-powered virtual mentor to traditional e-learning and classroom training:

Why Do We Need Virtual Mentors When We Already Have ChatGPT and Similar Generative AI Platforms?

ChatGPT is of limited use in an enterprise context for one simple reason: Employees using the platform are likely to reveal sensitive company information. This is why many organizations have banned or restricted the use of ChatGPT among employees.

Just imagine an employee at a healthcare provider uploading the raw transcript of an internal meeting about the cancer treatment of patient XX and asking for abbreviated meeting minutes. Such an upload to a public internet system would constitute a major violation of patient XX’s privacy.

Virtual mentors, on the other hand, would leverage public internet-based large language models (LLMs) but would not feed employee inquiries back to the public internet. Such ChatGPT replicas in confined corporate settings will be the first wave of generative AI virtual mentors to reach the market.

These will, in other words, be general-purpose virtual mentors based on public internet information. They can be adopted by organizations of any size and are ready to use immediately.

A subsequent wave of virtual mentors will be based on curated content specific to a functional area, an industry, or a similar domain. Such specialized-content virtual mentors will be sold by vendors responsible for curating the content and maintaining the AI solution.

A virtual mentor in the area of accounting could be offered by a learning content provider or by an accounting solution provider. Some specialized virtual mentors could be provided as free add-ons to commercial software subscriptions.

Finally, we will see a wave of organization-specific virtual mentors that will act as experts in one organization. In this case, the organization itself would be in charge – possibly aided by a services provider – of feeding the system with learning material.

A product manufacturer would input all manuals, product FAQs, marketing material, customer service interactions, HR policies, internal communication, public pricing information, everything on the intranet and company internet sites, training materials, etc. That solution could be very helpful in onboarding new employees and in answering inquiries from existing employees. However, it would take time and resources to implement and would require a certain company size to be worthwhile.
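As a simplified illustration of how such an organization-specific mentor might ground its answers in internal material without sending data to the public internet, consider a retrieval step that selects the most relevant internal documents for a question and passes only those to a privately hosted model. This is a toy sketch, not a description of any vendor’s implementation; the document names and the word-overlap scoring are invented for illustration.

```python
# Minimal sketch of retrieval-grounded answering: the mentor retrieves the
# most relevant internal documents and combines only those excerpts with the
# question in the prompt sent to a privately hosted language model.
# Document names, contents, and the scoring heuristic are all illustrative.

def score(query: str, doc: str) -> int:
    """Count overlapping words between query and document (toy relevance)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Return the names of the k most relevant internal documents."""
    ranked = sorted(corpus, key=lambda name: score(query, corpus[name]), reverse=True)
    return ranked[:k]

corpus = {
    "product_faq": "pricing structure for product X is tiered by volume",
    "hr_policy": "expense management policy for client meetings and travel",
    "release_notes": "key new features in version 2.1 of product Z",
}

question = "What is the expense management policy for a client meeting?"
context_docs = retrieve(question, corpus)

# Only these excerpts, never the whole corpus, travel with the question:
prompt = question + "\n\nContext:\n" + "\n".join(corpus[d] for d in context_docs)
print(context_docs[0])  # hr_policy ranks highest for this question
```

In a real deployment the keyword overlap would be replaced by semantic search over embeddings, but the pattern is the same: the employee’s inquiry and the retrieved company material stay inside the confined environment.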

The figure below shows the different levels of data feeding into a virtual mentor. The interaction between the virtual mentor and the employee will be chat-based to begin with. However, in the medium term, interaction could also be done through verbal communication, games, metaverses, augmented reality, etc.

Evidence of Generative AI Replacing Existing Digital Learning and Coaching Solutions

Chegg, an established American education technology (EdTech) company known for textbook rentals, online tutoring, and a variety of student services, was among the first to feel competitive pressure from generative AI. Its initial projection was that generative AI tools, such as ChatGPT, would take longer to truly influence the market.

However, the release and subsequent popularity of GPT-4 among students, credited to its swift response time, efficiency, and affordability, led to a sales slowdown and a dramatic 48% decline in Chegg’s stock price in early May 2023.

In response to these trends, Chegg entered into a partnership with OpenAI in April 2023, leading to the development of CheggMate. This tool, still in its development phase, intends to combine GPT-4’s generative AI capabilities with Chegg’s existing question database.

The goal for CheggMate is to enhance user experience by better aligning user queries with the most suitable resources.

Other EdTech vendors, including Duolingo, have unveiled new AI-driven features. Specifically, Duolingo introduced a role-play chat where users can learn a language by conversing with an AI. After these interactions, they receive feedback and suggestions to enhance their language-learning journey.

We have also witnessed the first examples of generative AI approaches in mentoring. CoachHub, a leading vendor of digital coaching solutions, recently unveiled AIMY, a virtual AI-powered career coach built on OpenAI’s ChatGPT. AIMY is designed to let users try personalized coaching sessions without any human interaction and without the costs associated with traditional coaching. It emulates human-to-human coaching, is still in beta, and is not yet able to manage highly complex discussions.

Challenges to Overcome for Virtual Mentor Solutions

Adopting virtual mentor solutions for learning, onboarding, and coaching purposes is not without challenges. Here are a few key obstacles that organizations might encounter when introducing these new AI-driven solutions:

  • Data privacy and security concerns. The first cases of data breaches related to the use of generative AI solutions by employees have already emerged, such as Samsung’s discovery of staff uploading a variety of sensitive information to ChatGPT. Future virtual mentor solutions will not feed data back to public generative AI systems such as ChatGPT.

As shown in the figure above, virtual mentors will use a combination of user data, curated company data, curated industry- or function-specific data, and publicly available data as training material. Such approaches will significantly limit the risk of data breaches.

However, adoption will require significant attention to security-related aspects, such as ensuring robust encryption, compliance with data protection regulations, etc.

  • Implementation complexity and skills gap. Introducing virtual mentor solutions on top of existing data is likely to require specialist AI training skills, which many organizations may not possess. In terms of the overview figure above, the company-specific layer presents the biggest challenges, because the training material is limited (compared to the vast resources available on the public internet) and because it must be curated, updated, and deleted (in the case of obsolete material).
  • Risk of hallucinations. AI-driven virtual mentors can produce “hallucinations,” or inaccurate answers. In a mentoring context, this can lead to confusion or misguidance and, ultimately, rejection of the mentor system by employees as unreliable. The risk of hallucinations means that organizations will have to dedicate resources to quality assurance, a ticketing system for incorrect or inappropriate answers, and similar safeguards.
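The layered data sources described above (company-specific, industry- or function-specific, and public) can be pictured as a simple precedence lookup: the mentor prefers company-specific material and falls back to broader layers only when needed. The sketch below is purely illustrative; the layer names, topics, and answers are invented, and a production system would blend layers during model training or retrieval rather than via literal dictionary lookups.

```python
# Illustrative precedence across the data layers a virtual mentor might draw
# on: company-specific material wins over curated industry content, which
# wins over public data. All entries below are made up for illustration.

LAYERS = [
    ("company", {"expense policy": "Use the internal travel portal; limits apply."}),
    ("industry", {"ifrs 16": "Leases are recognized on the balance sheet."}),
    ("public", {"expense policy": "Generic advice: keep receipts.",
                "python": "A general-purpose programming language."}),
]

def answer(topic: str) -> tuple[str, str]:
    """Return (layer, answer) from the highest-precedence layer covering the topic."""
    for layer_name, data in LAYERS:
        if topic in data:
            return layer_name, data[topic]
    return "none", "No material found; escalate to a human mentor."

print(answer("expense policy")[0])  # the company layer shadows the public entry
```

The fallback branch also hints at one mitigation for hallucinations: when no curated material covers a topic, the mentor can decline and escalate rather than improvise an answer.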

Implications for HCM and Payroll Vendors

Generative AI will have a major impact on the field of Human Capital Management solutions. There has been a significant initial focus on the impact of generative AI on recruiting, candidate marketing, and employee performance.

However, learning and onboarding will also see massive change as a result of generative AI.

A market for the curation of large language models for various industries and functional areas will appear. This could open new revenue streams for providers with strong existing domain knowledge.

As displayed in the table above, different learning delivery methods will have different sweet spots. Classroom-based learning and traditional e-learning formats will not disappear.

What will happen, however, is that many of the more general learning and onboarding tasks will transition to generative AI-based learning formats. Initially, these formats will revolve around chat-based interfaces, but over time other user experiences and communication formats will emerge.

Generative AI is an opportunity for vendors of learning and onboarding solutions. However, they will need to react fast in terms of evolving existing solutions and building in generative AI features and aspects.

Existing learning and onboarding vendors will come under pressure from new providers of virtual mentors and other related generative AI-based solutions. Generative AI is a double-edged sword for HCM vendors: a blessing for those willing to revisit their existing offerings, but a curse for those that fail to respond.

Bo Lykkegaard - Associate VP for Software Research Europe - IDC

Bo Lykkegaard is associate vice president for the enterprise-software-related expertise centers in Europe. His team focuses on the $172 billion European software market, specifically on business applications, customer experience, business analytics, and artificial intelligence. Specific research areas include market analysis, competitive analysis, end-user case studies and surveys, thought leadership, and custom market models.

In today’s sprawling realm of content marketing, establishing an authentic connection with your audience is no longer an option but a necessity. With the proliferation of generic content and the ever-expanding outreach channels, personalization in the digital age is a significant challenge for many marketing organizations.

The Imperative of Personalization

Imagine a bustling marketplace, vendors competing for attention. Amid the noise, what captures your interest? It’s the vendor who remembers your preferences, understands your needs, and tailors their offerings accordingly; the vendor who shows empathy, builds trust and inspires loyalty in customers.

In the digital sphere, content personalization recreates this personalized shopping experience. Successful content personalization entails crafting content that isn’t just broadcasted but resonates deeply with individual preferences, making each user feel valued and understood.

However, in the cacophony of content flooding digital channels, traditional one-size-fits-all strategies fall short. Personalized content goes beyond merely inserting a name; it’s about deciphering user behaviors, comprehending their wants, and delivering content that genuinely strikes a chord.

Data Drives Personalization Expansion

Until recently, marketers often employed “batch and blast” campaigns: generic messages sent out en masse, devoid of personalization. These campaigns were a mere numbers game, lacking relevance and often running afoul of compliance.

However, with the advent of marketing automation platforms layered atop CRM data, campaigns started gaining personal touches. Marketers could design campaigns based on limited yet more personal information. This transition marked a move from arithmetic growth – one campaign after another – to multiplicative growth, where controlled scenarios were crafted with ease. 

Marketing campaign growth based on data-driven personalization.

Nowadays, marketers have access to even more data, courtesy of tools like customer data platforms (CDPs). Each data point, from buyer intention to past purchases, drives exponential campaign growth, as shown in the green boxes on the right of the figure above. This data-driven reality poses challenges for downstream functional groups, who struggle to manage the influx of auto-generated, real-time campaign ideas. In this context, content curation at scale becomes a crucial driver of success, and it is a challenge many CMOs are grappling with.

In the digital age, there are exponential possibilities for content personalization. Modern CMOs need to leverage new tools (real-time data, generative AI, content management, atomic content) to engage audiences across a myriad of mediums and a deluge of distractions.

Breaking Down Content to Build Up Personalization

To address the need for highly customized messages, today’s CMOs are adopting innovative strategies. Taking a page from dynamic content optimization, atomic content, and other technologies, marketers are looking to break content into smaller components. This allows them to generate personalized assets as needed, leveraging generative AI and real-time data where possible.

For instance, instead of creating a standard product page, marketers can create a collection of product features, each tagged with alignment attributes like seasonality, buyer stage, and demographics. These tags then facilitate reassembly for specific data-driven use cases.

Instead of developing assets far in advance for, say, a “manufacturing CFO doing research,” the system can generate personalized assets for countless scenarios: CFO vs. CTO, manufacturer vs. retailer, top of funnel vs. post-purchase, and more. This process creates content on the fly, adapting in response to real-time inputs.
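The tagging-and-reassembly idea above can be sketched as a simple filter over tagged content atoms. This is a toy model only; the attribute names (persona, buyer stage) come from the examples in the text, while the atoms themselves and the matching logic are invented for illustration. A real system would layer generative AI on top to smooth the assembled fragments into finished copy.

```python
# Toy model of "atomic content": each content fragment carries alignment
# attributes, and a page is assembled by filtering atoms against the
# visitor's profile. Atoms and attribute values are illustrative.

atoms = [
    {"text": "Cut audit time by 40%.",      "persona": "CFO", "stage": "top-of-funnel"},
    {"text": "REST API with SSO support.",  "persona": "CTO", "stage": "top-of-funnel"},
    {"text": "Quarterly ROI review guide.", "persona": "CFO", "stage": "post-purchase"},
]

def assemble(persona: str, stage: str) -> list[str]:
    """Select the atoms matching this visitor's persona and buyer stage."""
    return [a["text"] for a in atoms
            if a["persona"] == persona and a["stage"] == stage]

# A manufacturing CFO doing early research sees only the matching fragment:
print(assemble("CFO", "top-of-funnel"))  # ['Cut audit time by 40%.']
```

Adding a new attribute (say, seasonality) multiplies the number of addressable scenarios without multiplying the authoring effort, which is the multiplicative growth described earlier.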

The Evolution of Engagement: Unleashing Personalization

Modern CMOs are embracing a practical, data-driven approach to content creation. They’re breaking down content into manageable pieces and utilizing AI-powered tools to seamlessly blend these components for diverse audiences. This adaptability not only streamlines the marketing process but also empowers brands to establish significant connections, harnessing the potent power of data for tangible impact.

Content creation from a data-driven approach drives personalization which will create better user experiences based on empathetic relationships between customers and brands. These relationships are built on what the customer wants and how they want to be treated through the lens of technology. Brands need to engage with customers in a contextual manner based on awareness, engagement, learning, and measurement.

As channels multiply and data flows in, mastering this complexity presents both a challenge and an exhilarating opportunity. It’s a chance for brands to navigate profound connections and wield the potential of data to craft impactful experiences.

By diving into this complexity and making the most of available tools, brands can effectively resonate with their audience, traverse the evolving currents of content marketing, and emerge as genuine pioneers of audience engagement in the digital age.

For more information on the Future of Customer Experience, read our blog:

Roger Beharry Lall - Research Director, Marketing Applications for Growth Companies - IDC

With over 25 years' experience leading technology-driven marketing programs, Mr. Beharry Lall is now a Research Director with IDC covering Advertising Technologies and SMB Marketing Applications. He brings a unique multidisciplinary perspective, evangelizing the innovative and pragmatic use of both martech and adtech solutions for companies of all sizes. Early in his career, Roger worked with an IBM subsidiary expanding into the Asian market, and subsequently he spent over a decade at RIM (BlackBerry) building marketing leadership across new industry segments, geographies, and product categories. This background fuels his perspective as he researches enterprise customer engagement tools and tactics across the unified omnichannel.

Sales enablement has emerged as a pivotal function within modern businesses, bridging the gap between sales and marketing to drive revenue growth. As the landscape of sales continues to evolve, so does the role of sales enablement. To effectively lead a sales enablement function, there are key questions that leaders must address. In this article, we’ll delve into these critical questions, shedding light on the core aspects of sales enablement and providing insights to guide your strategy.

What is Sales Enablement?

Sales enablement is the strategic process of equipping sales teams with the right resources, tools, content, and training to engage potential customers and close deals effectively. It encompasses a wide range of activities, from developing targeted content and training programs to optimizing sales processes and providing technology solutions that empower sales professionals.

Why is Sales Enablement Critical? What the Research Says

The importance of sales enablement is not just anecdotal; it’s backed by data. According to recent IDC studies (IDC 2022 Outcome Selling Advisory; IDC Survey on Value Selling Excellence), 49% of sales representatives state that pipeline development and finding qualified buyers are a challenge, and 45% struggle to move a proof of concept to a sale.

3 Areas of Focus for Sales Enablement Leaders

To lead a successful sales enablement function, it’s crucial to direct your efforts in alignment with organizational goals. Here are three areas of focus:

  1. Content Development and Management: Creating relevant and engaging content that aligns with different stages of the buyer’s journey is essential. Effective content empowers sales teams to have meaningful conversations with prospects. Collaborate with marketing to ensure a steady stream of high-quality content that addresses buyer pain points and objections.
  2. Sales Training and Development: Continuous training and skill development are imperative for a high-performing sales team. Implement a structured training program that covers product knowledge, objection handling, sales techniques, and market insights. Utilize both in-person and digital training methods to accommodate various learning preferences.
  3. Technology Integration: Leverage sales enablement tools to streamline processes and enhance efficiency. Tools for content management, CRM integration, analytics, and communication can provide real-time insights into prospect interactions, enabling sales teams to make informed decisions and tailor their approach.

A Robust Go-To-Market Strategy Is Paramount for Sales Teams Aiming To Thrive

To achieve success with a sales enablement strategy, sales teams must embrace three pivotal shifts in their approach. A profound comprehension of the digital journey, along with a keen understanding of key personas and their priorities, is integral to a solid go-to-market strategy. This foundation allows sales professionals to tailor their interactions with precision, ensuring relevance and resonance at every touchpoint.

Secondly, the era of product and feature selling is waning within the realm of go-to-market strategies. Modern buyers demand more—they seek a clear demonstration of how a solution can address their unique business challenges and deliver tangible value. This necessitates a shift towards solution-focused selling that revolves around solving problems rather than just promoting features as part of an effective go-to-market strategy.

Lastly, value selling takes center stage by seamlessly aligning with the buyer persona’s overarching business strategy as a component of an integrated go-to-market strategy. This approach intertwines your solution’s value proposition with the customer’s long-term goals, creating an enduring, adaptable partnership. Through value selling within the context of a comprehensive go-to-market strategy, a symbiotic relationship emerges, fostering ongoing collaboration that not only meets immediate needs but also adapts to future transformations.

As sales enablement becomes an integral part of this dynamic landscape, mastering these three tenets within your overarching go-to-market strategy ensures that sales teams can navigate complexity, drive engagement, and forge lasting connections that transcend transactional interactions.

Sales Enablement Tools

Sales enablement tools play a pivotal role in the success of your function. These tools provide automation, data-driven insights, and improved collaboration. Consider implementing:

Mastery Classes: Mastery class programs, meticulously tailored, offer comprehensive insight into specific priorities, personas, and use cases that align with vertical requirements and with the comprehensive suite of solutions and value propositions offered by you, the vendor. Throughout IDC’s program, interactive elements are integrated to foster peer-to-peer information exchange and collaborative learning. Our program also features outcome-oriented, task-based actions, accompanied by clear directives and strategic account planning frameworks, equipping participants to seamlessly apply their newfound knowledge in the field starting tomorrow.

Digital Coaching: The advantage of digital coaching for sales reps lies in its ability to provide personalized, on-demand guidance that enhances skills, boosts performance, and adapts to the dynamic needs of each individual representative. Delivered in a user-friendly format, IDC’s pre-recorded videos or audio sessions, accompanied by informative slides and conveniently organized chapters, cater to various vertical and technology markets, while also addressing distinct profiles of target buyers. These resources can seamlessly integrate into your learning management system, creating a holistic approach that guarantees effortless access to pertinent insights for salespeople and partners alike.

Sales Playbooks: Sales playbooks offer sales reps a structured roadmap, streamlining their approach with proven strategies and best practices for more effective and consistent sales engagements. Harnessing our wealth of existing research and profound comprehension of IT buyers, we empower sales professionals with the tools to grasp the intricate landscape of a given market, whether it pertains to technology or geography. Our approach delves into market trends and drivers, unveiling insights that enable sales teams to engage in informed conversations. We illuminate the path toward solutions by demonstrating precisely how the vendor’s offerings align with and overcome these challenges. This equips sales professionals with the knowledge and confidence to articulate the value proposition coherently.

Buyer Conversation Guides: Research-based buyer conversation guides facilitate interactions that transcend traditional sales pitches, enabling sales teams to seamlessly converse with executive-level buyers. These interactions get to the very core of the challenges that organizations face, offering a panoramic understanding of their pain points and aspirations. Armed with this knowledge, sales professionals can adeptly steer the conversation towards how your solutions stand as beacons of resolution, poised to surmount these challenges and steer them towards elevated business outcomes.

Did you know? IDC has a Sales Enablement practice that empowers organizations to sell more effectively and helps connect and align your marketing and sales efforts. Browse IDC’s Sales Enablement Solutions.

Leading a sales enablement function requires addressing critical questions that shape the strategy and approach. Understanding the role of sales enablement, leveraging research-backed insights, focusing on key areas, and implementing the right tools are all essential for driving success. By aligning your efforts with the evolving sales landscape and the needs of your sales teams, you can empower them to achieve exceptional results and contribute to your organization’s growth.

The landscape of business may be seeing a seismic shift with the rise of Generative AI (Gen AI). This shift reflects not just the direct impact of Gen AI itself, but also how Gen AI is reaffirming the importance of AI overall and raising its profile within the business. This sea change appears to be as tectonic as the PC revolution of the ’80s and the smartphone revolution of the aughts. We are seeing the potential to revolutionize and disrupt industries, foster innovation, streamline operations, impact workforces, realize the promise of knowledge management, and democratize/consumerize AI in a remarkable fashion.

With this potential, it is important for organizations to navigate this transformation in a consistent, methodical, mindful fashion. This does not mean it has to be a slow, laborious process, but it requires consideration to ensure responsible deployment and tangible, beneficial, and scalable outcomes. This involves both technology and business stakeholders working together to quickly identify and implement short-term initiatives with a view towards longer-term imperatives and supportable environments.

Contaminated data can lead to incorrect models that fail to meet the desired outcomes. Addressing data quality, accuracy, and security challenges is a priority.

Daniel Saroff – Group Vice President of Consulting and Research

It Starts With Establishing a Foundation

Before embarking on the journey to harness the power of Generative AI, organizations need to establish a solid foundation. This foundation comprises several crucial elements:

1. Responsible AI Policy

A well-defined AI policy that outlines principles of fairness, transparency, accountability, and data protection is paramount. Ensuring explainability of AI model outputs and complying with legal/statutory regulations like GDPR are table stakes.

2. AI Strategy and Roadmap and the Role of the Proof of Concept

Crafting a comprehensive AI strategy with prioritized use cases is essential to aligning the organization’s efforts with business impact (both short and long term).

The AI strategy should include the rules or guidelines for Gen AI proofs of concept (POCs), and it should incorporate the results of the POCs to recursively improve the strategy. This enables the strategy to self-correct and refine itself for a more successful, long-term approach, and facilitate responsive decision-making.

Many organizations don’t have the skills, policies, or data to leap into large, enterprise-changing Gen AI initiatives. Focused POCs offer the opportunity for rapid action to identify skill, data, policy, and technology gaps in an efficient fashion, with limited investment to prove out the technology.

To select POCs, use the following criteria for evaluation:

  • Value: Economic, strategic alignment, risk
  • Complexity: Data, algorithmic, system requirements, required ‘know-how’ or skills

For example, assess each proposal on value, complexity, risk, and data quality. If a proposal is high value but has poor data, don’t run the POC. If one is exceptionally high value with good data but high risk, wait until you’ve built experience with Gen AI.
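One way to make such a screen concrete is a small scoring rubric that applies the gating rules just described. This is a hypothetical sketch; the 1-to-5 scales, thresholds, and wording are invented, not an IDC methodology, and a real evaluation would weigh more dimensions (strategic alignment, required skills, system requirements).

```python
# Hypothetical POC screening rubric based on the value/complexity/risk
# criteria above: score each proposal 1-5 on value, data quality, and risk,
# then apply simple gating rules. Thresholds are illustrative only.

def screen_poc(value: int, data_quality: int, risk: int) -> str:
    """Return a go/no-go recommendation for a Gen AI POC proposal."""
    if data_quality <= 2:
        return "no-go: data too poor, fix data first"
    if value >= 4 and risk >= 4:
        return "defer: high value but high risk, build experience first"
    if value >= 3:
        return "go"
    return "no-go: insufficient value"

print(screen_poc(value=5, data_quality=1, risk=2))  # high value, poor data: no-go
print(screen_poc(value=5, data_quality=4, risk=5))  # high value, high risk: defer
print(screen_poc(value=4, data_quality=4, risk=2))  # balanced proposal: go
```

Even a crude rubric like this forces proposals to be scored on the same axes, which keeps POC selection defensible and comparable across business units.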

Organizations should avoid overthinking their strategy and roadmap and thereby delaying pilots of this technology.

3. Intelligence Architecture

While Gen AI POCs do not need to build a platform to support enterprise Gen AI initiatives, part of their selection criteria should be how well they develop the understanding required for such a platform to exist. The architecture needs to consider how a platform can be implemented and governed, the data models (structured, evaluated, integrated) required, and integration into existing systems.

Data privacy, security, and intellectual property protection must also be embedded within this platform architecture.

4. Reskilling and Training

Most organizations do not have the mature skill sets (prompt engineering, data science, data analysis, AI ethics, modeling) required to take full advantage of Gen AI. Nurturing a workforce equipped to build and use Generative AI models is a fundamental requirement. This requires hiring (which can be costly for high-demand roles) or reskilling.

Training should also be provided broadly across the organization. Due to its potential business impact, training all staff is important to create a baseline knowledge of the benefits and risks of the technology.

Implementing programs to assess organizational readiness is key to ensuring a smooth transition. This drives an understanding of the impacts on the organization and its staffing, surfaces potential cultural inertia, and helps proactively address staff concerns about the technology’s effects on their employment.

Data’s Crucial Role

Data serves as the foundation for Generative AI.

However, most organizations struggle with enterprise data. While business leaders often state that data is their most important asset, it is an asset that is frequently poorly curated, managed, understood, or analyzed. When IDC surveyed clients about their data, troubling results were revealed.

When assessing POC candidates, evaluate data quality, ease of data retrieval and access, and integration as selection criteria. Where multiple POCs can leverage decent-quality, common data sets, select them over POCs that require managing disparate data sets. There are enough challenges in the strategic leverage of Gen AI without adding data as another.

Security Concerns

Ensuring high-quality, accurate, and protected data is imperative. The integrity and privacy of data used to train AI models directly impacts their performance. Contaminated data can lead to incorrect models that fail to meet the desired outcomes. Addressing data quality, accuracy, and security challenges is a priority.

Impact on Infrastructure and Software Platforms

Generative AI’s adoption affects infrastructure and software platforms. For infrastructure, the investment question must be answered: whether to fund these investments via as-a-service models or via more traditional capital purchases. POCs can help drive the thought process for this decision making.

Software development lifecycles will accelerate, and low-code/no-code programming efforts will diversify code across AI-optimized architectures. This shift demands adaptable, API-enabled environments that balance portability, security, performance, cost control, and resilience.

Defining and Prioritizing Use Cases

Use cases are pivotal in driving the impact of Generative AI and strategy. Use cases typically fall into several categories:

  1. Industry-Specific: Tailored solutions, like generative drug discovery or material design, require customization and specialized data sharing. They can create substantial business value but demand unique models and integrations and carry added risk.
  2. Business Function: Integrating models with corporate data for specific departments (e.g., marketing, sales, procurement) needs careful data governance. Integration with established enterprise applications is crucial.
  3. Productivity: Basic productivity use cases range from summarizing multiple reports to code generation to RFP template creation. They often integrate into existing applications or are delivered as standalone SaaS solutions or cloud-based APIs. Gen AI may also be used to accelerate the meta tagging and categorization of enterprise data to improve data quality and retrieval. Additional productivity is gained through Gen AI’s unique ability to automate knowledge management through the synthesis of disparate data and sources across the enterprise. Knowledge management was largely unsuccessful in the first decade of this century because it asked people to work differently; Gen AI allows knowledge productivity without staff changing their work habits.

Engage your C-Suite and key leaders in collaborative sessions to uncover relevant use cases and design a realistic roadmap. Also, provide a mechanism for capturing ideas that organically bubble up from divisions or lines of business.

Think Through Your Vendor Partner Selection Carefully

In the rapidly evolving landscape of Generative AI, there’s considerable ambiguity surrounding the technology and its practical applicability. Despite this uncertainty, a few key insights have begun to crystallize, especially regarding the usage of publicly-shared foundation models and the role of cloud platform providers.

According to IDC’s recent survey, cloud platform providers are perceived as the most strategic technology partners for GenAI initiatives.

Cloud providers deliver publicly-shared foundation models, often as PaaS or SaaS, that will find their place within a subset of enterprise use cases. While these may deliver short-term advantage, they are a commodity and so are unlikely to provide long-term competitive advantage. For most organizations, lasting benefit comes from leveraging finely-tuned, domain-specific models accessible in a private or controlled manner – a current example being Microsoft’s strategic investments in generative AI technologies, which make it a strong contender in this space.

In the same survey, IT consultants and system integrators emerged as a strong second. They are poised to guide organizations through the complex journey of Gen AI implementation. Organizations need to determine how extensively, and for how long, to use these partners.

As an initial driver for change, they are valuable. Consultants and systems integrators provide skills and tools in short supply in many enterprises. In the long run, though, if the capability is a competitive advantage (versus a necessity), developing those skills and capabilities internally drives greater sustainable benefit.

Conclusion

Navigating the world of Generative AI requires an approach that encompasses responsible policies, sound data practices, technology understanding, a comprehensive view of use cases, and collaboration between IT and business leadership. Develop POCs with an eye to how they can help you create a consistent and defensible Gen AI moat for your organization, which can grow and evolve as strategy and competitive pressures demand. And be ready for a wild ride!

Daniel Saroff - GVP, Consulting and Research Services - IDC

Daniel Saroff is Group Vice President of Consulting and Research at IDC, where he is a senior practitioner in the end-user consulting practice. This practice provides support to boards, business leaders, and technology executives in their efforts to architect, benchmark, and optimize their organization's information technology. IDC's end-user consulting practice utilizes our extensive international IT data library, robust research base, and tailored consulting solutions to deliver unique business value through IT acceleration, performance management, cost optimization, and contextualized benchmarking capabilities.

In the digital business era, transformative advancements have reached unprecedented heights, driving rapid digital transformation and widespread cloud adoption across industries. This transformation has profoundly impacted customer experiences, enabling companies to offer seamless, personalized, real-time interactions across multiple touchpoints. By leveraging digital technologies and cloud capabilities, enterprises can create meaningful and engaging experiences that set them apart in the competitive digital economy.

However, this shift to cloud-based solutions has also led to an expansion of attack surfaces, creating newer areas of vulnerability. From smartphones and tablets to IoT devices and wearables, the proliferation of interconnected devices has resulted in a complex and vast digital landscape, each representing a potential entry point for cyberattacks.

Cyberthreat Landscape in Asia/Pacific

Cyberattacks worldwide are escalating at an alarming rate, becoming highly targeted and sophisticated. Cybercriminals continuously develop more intelligent methods to exploit vulnerabilities, steal sensitive data, or demand ransom. Securing all connected applications to critical infrastructure becomes more challenging, making it easier for attackers to find vulnerabilities to exploit, including the use of bots for both legitimate and malicious purposes. As a result, businesses face frequent, targeted, and complex cyberattacks, leading to significant financial burdens, customer attrition, and damage to brand reputation.

The Asia/Pacific Japan (APJ) region has seen a surge in cyberattacks, with a cyberthreat landscape that is intricate and constantly evolving. The region is influenced by geopolitical tensions, rapid digitalization, and the growing expertise of cybercriminals and state-sponsored hackers. According to IDC’s 2023 Future Enterprise Resiliency and Spending (FERS) Survey, Wave 2, a staggering 59% of enterprises in APJ fell victim to ransomware attacks in 2022, and 32% ultimately paid the ransom. Australia, New Zealand, Singapore, and India were the worst-affected markets. Among the affected businesses, 97% reported that the impact lasted from a single day to several weeks. This signals that now is the opportune moment for enterprises to strategically invest in cutting-edge technologies for proactive threat detection and decisive attack mitigation.

Significant Advancements in Threat Detection and Response

Today’s cyberthreat landscape has led to the emergence of endpoint detection and response (EDR) and extended detection and response (XDR) solutions, backed by managed detection and response (MDR) services, to detect and respond to cyberthreats. Early detection allows organizations to prevent or limit the damage caused by attacks, reducing data loss and minimizing the attack’s impact. According to IDC’s 2023 FERS Survey, Wave 2, 71.5% of the surveyed enterprises in APJ said that threat detection and response tools, including EDR, NDR, and security information and event management (SIEM), helped them detect attacks before intruders had a chance to act.

EDR has become essential in enterprise cybersecurity strategies, used by organizations of all sizes and industries to protect their endpoints from cyberthreats. MDR services offer a comprehensive approach to shielding businesses from advanced and frequent cyberthreats, delivered by experienced cybersecurity experts in a 24 x 7 remote SOC with cutting-edge solutions and hands-on support. As per IDC’s Asia/Pacific IT Services Survey, 2022, a majority of enterprises stated that the most important capability they seek in an MDR provider is the ability to effectively integrate network and endpoint at the architectural level, for enhanced visibility into assets and proactive threat detection across all surfaces. Beyond this, they also require an MDR provider to offer strong analytical capabilities. Some enterprises also indicate the need for a third-party analytical platform that can absorb inputs from web, email, network, and endpoints as well as cloud, and deliver a comprehensive threat analysis. Added to this is the need for proactive threat hunting for knowns and unknowns, including third-party risk assessments from all sources, and a well-suited, integrable range of threat detection and response offerings.

Enterprises are now directing their investments towards XDR solutions, empowering them to identify and effectively counter threats across networks, endpoints, and cloud environments. With advanced analytics, XDR solutions can form complex correlations between relevant data sources, reducing false positives and improving threat detection. IDC emphasizes that every XDR solution should include EDR capabilities, which can be enhanced with network detection and response (NDR), integration with external threat intelligence, and an underlying log management backplane providing alerts from virtualized resources over the cloud.
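
Conceptually, the cross-domain correlation that reduces false positives can be sketched in a few lines. The alert fields, window size, and signal names below are illustrative assumptions, not any vendor’s schema:

```python
from datetime import datetime, timedelta

# Hypothetical alert records from two telemetry sources (names illustrative).
endpoint_alerts = [
    {"host": "srv-01", "time": datetime(2023, 9, 1, 10, 2), "signal": "suspicious_process"},
    {"host": "srv-02", "time": datetime(2023, 9, 1, 11, 0), "signal": "macro_execution"},
]
network_alerts = [
    {"host": "srv-01", "time": datetime(2023, 9, 1, 10, 5), "signal": "beaconing_c2"},
]

def correlate(endpoint_alerts, network_alerts, window=timedelta(minutes=15)):
    """Pair endpoint and network alerts on the same host within a time window.

    A lone alert from one sensor is often a false positive; the same host
    tripping two independent sensors in quick succession is far stronger
    evidence, so only correlated pairs are escalated.
    """
    incidents = []
    for ep in endpoint_alerts:
        for net in network_alerts:
            if ep["host"] == net["host"] and abs(ep["time"] - net["time"]) <= window:
                incidents.append({"host": ep["host"], "signals": [ep["signal"], net["signal"]]})
    return incidents
```

Here only srv-01 is escalated: its endpoint and network alerts fall within the same window, while the lone srv-02 alert is held back.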

XDR solutions must also incorporate a security orchestration, automation, and response (SOAR) solution for workflow management, DDoS security, a web application firewall (WAF), web and email defense, identity and access management (IAM), data loss prevention (DLP), workload management, and file integrity monitoring (FIM). XDR platforms are known for their scalability, reliability, extensibility, and modularity. While many XDR tools are cloud-based, some organizations prefer dedicated or on-premises solutions or a hybrid approach due to concerns about public cloud environments. Regardless of the chosen approach, a cloud-based XDR solution offers accessibility and flexibility for experts and analysts working in hybrid setups. A comprehensive XDR solution, much in demand these days, assists enterprises with threat quarantine, automated and manual remediation, alert escalation, reporting, and forensic analysis, and must be the focus area for security service providers looking to cater to future enterprises.

Proactive detection and response may not always be sufficient, particularly as cybercriminals adopt multi-vector approaches. The threat landscape’s complexity has led to the evolution of threat detection, including signature-based and behavior-based detection, threat intelligence, automation and orchestration, integration with incident response, and deception technology.

Using AI for TI – Threat Intelligence

AI-powered threat hunting leverages ML and data analytics to uncover hidden patterns and anomalies, improving the identification of potential threats. Businesses are now investing in threat hunting solutions that deploy AI/ML capabilities to predict threats based on historic patterns, addressing known and unknown threats with relevant insights and minimal false positives using comprehensive security analytics. AI’s relationship with threat detection and response is symbiotic, enabling more accurate and efficient threat detection, facilitating faster incident response and remediation, and empowering security analysts with advanced tools to proactively hunt for threats.

The potential use cases for threat intelligence are a significant leap forward compared with detection and response strategies. A prime example is identifying adversaries, a captivating aspect of threat intelligence, as it traces known threat vectors back to the responsible miscreant, be it a cybergang or a nation-state-sponsored attacker. Moreover, threat intelligence platforms can collect and correlate data from in-house security tools, including SIEM, UEBA, IDS/IPS, and antivirus software. This grants insights, validates possible insider threats, and supports external intelligence for forensic investigations.

The ever-evolving threat intelligence feeds necessitate consistent cross-referencing with up-to-date IoCs, such as behaviors, tactics, exploits, and open source code vulnerabilities. Here, automation plays a pivotal role in artifact collection, thereby ensuring accuracy. Additionally, there are times when unmanaged devices within a network can become inadvertent targets for attackers due to misconfigurations, incomplete patch management, or other issues. Threat intelligence also mitigates the challenges of shadow IT and enhances detection across data graveyards.
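
At its core, cross-referencing collected artifacts against an IoC feed is a set-membership check that automation can run continuously. The feed contents and artifact records below are hypothetical:

```python
# Hypothetical IoC feed and locally collected artifacts (values illustrative).
ioc_feed = {
    "ip": {"203.0.113.7", "198.51.100.23"},  # known-bad addresses from the feed
    "sha256": {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},
}

observed = [
    {"type": "ip", "value": "203.0.113.7", "source": "proxy_log"},
    {"type": "ip", "value": "192.0.2.10", "source": "proxy_log"},
    {"type": "sha256",
     "value": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
     "source": "endpoint_agent"},
]

def match_iocs(observed, feed):
    """Return every observed artifact whose value appears in the current feed."""
    return [o for o in observed if o["value"] in feed.get(o["type"], set())]

hits = match_iocs(observed, ioc_feed)
```

Because feeds change constantly, the value of this loop depends on how freshly `ioc_feed` is refreshed, which is exactly where automated artifact collection earns its keep.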

Remarkably, specific threat intelligence solutions cater to industrial control systems, APT intelligence, crime, and forensics intelligence. In the transportation industry, enterprises are leveraging threat intelligence to proactively prepare for attacks and fortify their infrastructure. Notably, a major Indian insurance company utilized threat intelligence to thwart 3.4 billion INR worth of fraud across various domains, integrating AI technology to enhance the fraud investigation process.

In the current landscape, establishing strategic partnerships between threat intelligence vendors and service providers holds the utmost significance. Enterprises seeking relief from financial and operational burdens want consolidated service offerings. This market shift calls for security service providers to offer comprehensive solutions, including SOC, vulnerability assessments, incident management, and threat intelligence. Cultivating strong and strategic partnerships is pivotal for ensuring a unified, all-encompassing approach that aligns with evolving customer demands. Additionally, collaborative partnerships between security vendors and service providers, aimed at delivering advanced threat intelligence capabilities by seamlessly blending global threat data with localized insights, will give potential clients a robust framework and a comprehensive perspective on the threats that matter in their unique operational context. This synchronized approach empowers organizations to stay ahead of evolving cyber risks and enhance their security posture.

Advice for Enterprises

For enterprises looking to adopt or elevate their threat detection and response capabilities, efforts to reduce dwell time typically start with EDR. However, sophisticated attacks often encompass more than just endpoints, necessitating the adoption of XDR as the next evolutionary step. Technology buyers are advised to assess their requirements and then invest in the many advancements that AI/ML models are bringing to threat hunting and threat intelligence as the next step beyond detection and response.

When selecting threat intelligence vendors, technology buyers should remember to prioritize those offering contextual insights that align with their industry and environment. It’s crucial to assess vendors based on their ability to provide actionable insights, enabling proactive defense strategies and swift responses to emerging threats. Integration capabilities are key, ensuring seamless collaboration with existing security tools and infrastructure. Look for vendors who blend global and local threat data to offer a comprehensive perspective and consider their automation and data enrichment capabilities to enhance threat detection accuracy. Scalability is essential to accommodate growth and evolving threat landscapes. Additionally, evaluate vendors using real-performance metrics such as Mean Time To Detect (MTTD) and Mean Time To Respond (MTTR) to ensure their effectiveness in rapid threat identification and resolution.
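
The MTTD and MTTR metrics mentioned above reduce to simple averages over incident timeline timestamps. A minimal sketch, with illustrative incident records:

```python
from datetime import datetime

# Hypothetical incident timeline records (timestamps illustrative).
incidents = [
    {"intrusion": datetime(2023, 9, 1, 8, 0),
     "detected": datetime(2023, 9, 1, 9, 30),
     "resolved": datetime(2023, 9, 1, 14, 0)},
    {"intrusion": datetime(2023, 9, 5, 22, 0),
     "detected": datetime(2023, 9, 6, 0, 30),
     "resolved": datetime(2023, 9, 6, 6, 30)},
]

def mean_hours(deltas):
    """Average a list of timedeltas, expressed in hours."""
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 3600

# MTTD: average time from intrusion to detection (dwell time before discovery).
mttd = mean_hours([i["detected"] - i["intrusion"] for i in incidents])
# MTTR: average time from detection to resolution.
mttr = mean_hours([i["resolved"] - i["detected"] for i in incidents])
```

For these two incidents, MTTD is 2.0 hours and MTTR is 5.25 hours; comparing such figures across vendors only works if each vendor defines the start and end timestamps the same way.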

This approach ensures that your chosen threat intelligence vendor aligns with your organization’s unique needs and contributes to a robust security posture. Also, it is essential to note that AI is not a silver bullet and should be used with human expertise, as security analysts play a critical role in validating and interpreting AI findings within the organization’s environment to make informed decisions regarding threat response and mitigation. The collaboration between AI and human intelligence undoubtedly bolsters an organization’s overall security posture.

Get insight on the adoption and perception of threat intelligence solutions by Indian enterprises in this on-demand IDC webinar.

Interested in how enterprises should strategize their investments moving forward?

Sakshi Grover - Senior Research Manager - IDC

Sakshi Grover is a senior research manager for IDC Asia/Pacific Cybersecurity Services, supporting its research and client engagement activities across Asia/Pacific markets. Additionally, she serves as the lead security analyst for IDC India. Sakshi is responsible for delivering syndicated custom research and consulting engagements on next-generation emerging and disruptive technologies. Her tasks include developing and socializing IDC's point of view within security services, covering both legacy and modern cybersecurity technologies. Her role involves close collaboration with technology vendors and buyers, developing market insights, and providing research, consulting, and advisory services in the fields of security software and services. This includes partnering on research efforts with relevant country analysts in the local IDC offices. Sakshi's views on security have been quoted in numerous publications, such as the Economic Times, Business Standard, Data Quest, CRN, and others.

In a world inundated with AI buzz, are you feeling overwhelmed by the incessant chatter around artificial intelligence? As the AI frenzy reaches its zenith, it’s imperative that we take a collective breath and evaluate how to get to a positive impact for our organization. Let’s pause and reevaluate the landscape.

Neil Ward-Dutton, one of our distinguished IDC analysts, once aptly noted that AI appears as an enchanting spell until we unravel its inherent limitations and complications. And limitations, there are many, regardless of how you approach the matter. A considerable portion of these limitations stem from the data that fuels AI’s algorithms: for AI systems developed internally, the scarcity of high-quality data in most organizations often proves to be a stumbling block; and for generative AI systems that leverage pre-built “foundation models” already trained on external data sources, the lack of transparency about the provenance and quality of those data sources creates a number of risks.

While generating training data is an option, recent observations indicate that excessive training can actually yield adverse outcomes. And that’s just the tip of the iceberg; the complications extend into the realms of bias and ethical quandaries.

Dispelling any illusions, let’s be clear – acquiring a software package won’t instantaneously immerse your organization into the realm of AI excellence. Remember the CIO who, a few years back, joyfully declared hiring two data scientists as the path to “getting AI”? Upon inquiry about their role and the benefits they’d bring, he confessed, “I’m not a data scientist; I don’t know.” Such anecdotes underscore the essence of the issue.

Presently, history seems to be repeating itself. When asked about AI’s potential, responses often resemble, “I don’t know; I’ll ask the AI.” This reveals a common theme – many are intrigued by AI-enabled possibilities, but grasping its tangible advantages remains a challenge. As the chasm between curiosity and efficacy widens, it begs the question: Is the investment in perfecting AI worth the monumental effort?

We have been defining what is needed to be successful with AI and the steps needed to make it work. We invite you to come and discuss with your peers which steps make sense and where to hold back investment. By the end of the session, you should have insight into the value you can realistically derive from AI over the next three horizons and what the pitfalls may be.

 

Join the CIO ThinkTank on September 28th: On September 28th, from 5:00 to 6:00 PM CET, we invite you to participate in the CIO ThinkTank – an open dialogue among peers. 

In A CISO’s Guide to Artificial Intelligence, we view artificial intelligence as providing advisory, enhanced service, and semiautonomous cybersecurity defense functionality based on a range of structured and unstructured data, including logs, device telemetry, network packet headers, and other available information.

Simply put, AI is applied statistics used to solve cybersecurity problems. The goal is to create analytics platforms that capture and replicate the tactics, techniques, and procedures of the finest security professionals; democratize the traditionally unstructured threat detection and remediation process; or complete a range of near-real-time automated detection and response tasks that a security professional could theoretically replicate, but far too slowly to matter.

As AI continues to promise simplicity in the face of the complexity of today’s security environment, it will be helped by the homogeneity of data.

Frank Dickson – Group Vice President, Security and Trust

However, our collective focus, in our opinion, is in the wrong place. The hype and conversation center on AI. Why not? The possibilities of AI inspire the imagination, illuminating the possible. But the key to enabling outcomes in security is not the AI; it is the data. Many children are inspired by the power and girth of locomotives. The potential of the locomotive, though, relies on the boring and tedious process of laying the tracks and the enabling infrastructure. Likewise, data is the enabling infrastructure for security AI. Three characteristics determine success:

  • Data framework structures
  • Data management
  • Data curation

Data Framework Structures

As we look to apply artificial intelligence to unlock the potential and promise of, for example, extended detection and response (XDR), creating frameworks and structures is critical. The most basic definition of XDR is:

  • The collecting of telemetry from multiple security tools
  • The application of analytics to the collected and homogenized data to arrive at a detection of maliciousness
  • The response to and remediation of that maliciousness

As we look to apply analytics to the collected and homogenized data to detect maliciousness, AI needs structure to be able to examine the data at scale. After all, AI is really no more than a mathematical model that infers relationships within the data. Telemetry optimized for a point use case, such as a firewall’s perimeter-centric network defense, is of little use if you cannot relate it to other data sets, such as identity, and if it is not framed in a way to achieve an end goal.

When we discussed the value of event sequencing as a core attribute of most detection and response offerings, much of the value was unlocked by application of the MITRE ATT&CK framework. Not only does the framework provide structure to the task of threat detection by mapping to the cyber kill chain, but it also creates a manner in which different tools from different vendors can structure data and prepare it for analysis.
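
As a rough illustration of what that structure buys, the sketch below tags raw detections with ATT&CK technique IDs and orders them along the kill chain. The technique IDs are real ATT&CK entries, but the event names and the mapping itself are hypothetical:

```python
# Illustrative mapping from raw detections to MITRE ATT&CK techniques.
# The technique IDs and tactic names are real ATT&CK entries; the event
# names on the left are hypothetical detector outputs.
TECHNIQUE_MAP = {
    "spearphish_link_clicked": ("T1566", "Initial Access"),
    "powershell_encoded_cmd": ("T1059", "Execution"),
    "registry_run_key_added": ("T1547", "Persistence"),
    "data_sent_to_unknown_host": ("T1041", "Exfiltration"),
}

# Kill-chain ordering for the tactics used above.
TACTIC_ORDER = ["Initial Access", "Execution", "Persistence", "Exfiltration"]

def sequence_events(events):
    """Tag known events with ATT&CK techniques and order them along the kill chain."""
    tagged = [(e, *TECHNIQUE_MAP[e]) for e in events if e in TECHNIQUE_MAP]
    return sorted(tagged, key=lambda t: TACTIC_ORDER.index(t[2]))
```

Once two vendors’ detections are expressed in the same technique vocabulary, sequencing them into a single attack narrative becomes a sort, not a translation project.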

Data Management

Data has weight and gravity. Security data has a lot of weight. For example, a typical endpoint protection platform agent will produce 150-200MB of data a day. Movement, storage, and management of such data quickly create a problem of scale. Data retention policies can thus quickly become divisive topics.
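
The arithmetic behind that weight is worth making explicit. A back-of-envelope sketch, using the per-agent figure above with a hypothetical fleet size and retention policy:

```python
# Back-of-envelope scale of endpoint telemetry, using the figures above.
endpoints = 10_000                 # hypothetical fleet size
mb_per_endpoint_per_day = 175      # midpoint of the 150-200MB range
retention_days = 90                # hypothetical retention policy

daily_tb = endpoints * mb_per_endpoint_per_day / 1_000_000
retained_tb = daily_tb * retention_days
# ~1.75TB ingested per day and ~157TB held under retention; moving and
# storing data at this scale is why retention policies become contentious.
```

Doubling the fleet or the retention window doubles the retained volume, which is also why per-GB-ingested pricing models come under pressure as telemetry grows.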

In addition, only with AI can the increasing pools of telemetry be put to the very best use. ML has limits, but by using AI to train for previously unseen patterns and to apply new lenses to the data, key metrics (time-to-X) can be reduced in a truly significant way.

Data weight has become a competitive differentiating tool. For example, the move by the infrastructure-as-a-service (IaaS) vendors to retain their own cloud logs at no or very low cost is significant, as SIEM is often priced based on the volume of data ingested, and the SIEM vendors cannot simply “eat” the cost of ingesting and storing voluminous cloud logs. Analysis needs to happen on the native format in a predictable manner. The entire business model of SIEM, XDR, and other analysis platforms thus is increasingly challenged and is changing based on the weight of data.

Data Curation

In a world where every vendor has a different data structure, curating heterogeneous data sets to create the homogeneity needed for analysis is an extra step, and a potentially onerous one depending on the calculus and scale required. As AI continues to promise simplicity in the face of the complexity of today’s security environment, it will be helped by the homogeneity of data; without that homogeneity, curation becomes an inhibitor.

Restructuring data takes time and costs money. Thus, large vendors with broad portfolios have the advantage, as multiproduct, single-platform offerings save time and cost by having a larger percentage of multi-technology homogeneous data sets.

Overcoming the issue of data curation is the objective of many standards. For example, Structured Threat Information Expression (STIX) and Trusted Automated eXchange of Indicator Information (TAXII) were developed by MITRE in its role as operator of a U.S. Department of Homeland Security FFRDC. STIX is a common language for threat intelligence, so it can be shared and machine-read by any tool supporting it. TAXII is the application layer protocol designed to simplify the transmission of threat intel data. In 2015, STIX/TAXII development moved to the OASIS international standards organization. Today, the work is free, open, and community driven.
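
To make the “common language” point concrete, here is a minimal STIX 2.1 indicator object built by hand (all values illustrative); in practice, libraries such as the `stix2` Python package generate and validate these objects:

```python
import json
import uuid
from datetime import datetime, timezone

# A minimal STIX 2.1 indicator, assembled manually for illustration.
# Real deployments would use a STIX library and ship this over TAXII.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "C2 traffic to known-bad IP",
    "pattern": "[ipv4-addr:value = '203.0.113.7']",
    "pattern_type": "stix",
    "valid_from": now,
}
print(json.dumps(indicator, indent=2))
```

Because the object is plain, schema-governed JSON, any STIX-aware tool can machine-read it without a bespoke parser, which is exactly the curation cost the standard removes.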

We would be remiss if we did not mention Open Cybersecurity Schema Framework (OCSF) here and its significance to AI. Normalization of hybrid multicloud security telemetry is needed before any converged data is useful. The goal of OCSF is to simplify the exchange of data between the tools that ingest it, manage it, and enrich it because every organization has a cornucopia of solutions purchased over the past half dozen years. OCSF means a single format to make it easy for those getting started instead of writing data connectors to a lot of solutions. The real story here is one of simplicity, which is the holy grail of cybersecurity solutions.
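
A toy sketch of what OCSF-style normalization buys: two vendors’ login events mapped into one shape so a single analytic works across both. The field names follow OCSF conventions (`class_uid`, `time`, `severity_id`), but the mappings and vendor formats here are illustrative assumptions, not a complete OCSF implementation:

```python
# Hypothetical raw login events from two different vendors.
vendor_a_event = {"ts": 1693526400, "user": "alice", "result": "FAIL", "sev": "high"}
vendor_b_event = {"eventTime": 1693526460, "accountName": "bob", "outcome": 1, "level": 2}

def normalize_a(e):
    """Map vendor A's format into an OCSF-style authentication record."""
    return {"class_uid": 3002, "time": e["ts"], "user": e["user"],
            "status": "Failure" if e["result"] == "FAIL" else "Success",
            "severity_id": {"low": 2, "medium": 3, "high": 4}[e["sev"]]}

def normalize_b(e):
    """Map vendor B's format into the same OCSF-style record."""
    return {"class_uid": 3002, "time": e["eventTime"], "user": e["accountName"],
            "status": "Success" if e["outcome"] == 1 else "Failure",
            "severity_id": e["level"]}

# Once normalized, one analytic runs across both sources without
# per-vendor connectors.
events = [normalize_a(vendor_a_event), normalize_b(vendor_b_event)]
failed_logins = [e for e in events if e["status"] == "Failure"]
```

The per-vendor `normalize_*` functions are the connector work OCSF aims to eliminate: with producers emitting the schema natively, only the shared analytic at the bottom remains.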

So what? What Does This Mean to YOU?

Look. Every cybersecurity vendor is going to roll out a generative AI interface for their tools, and they should. It is the fourth generation of the user interface; it is significant. A vendor will be conspicuous without one. By the end of 2023, every tool of relevance will have one; tools without one will likely become irrelevant or subservient to those that do. The ability of the tool to create outcomes in your environment, however, will be determined not by the power of generative AI but by the data and the predictive AI models behind the generative AI. It’s Not About the AI; It’s About the Data.

Frank Dickson - Group Vice President, Research - IDC

Frank Dickson is the Group Vice President for IDC's Security & Trust research practice. In this role, he leads the team that delivers compelling research in the areas of AI Security; Cybersecurity Services; Information and Data Security; Endpoint Security; Trust; Governance, Risk & Compliance; Identity & Digital Trust; Network Security; Privacy & Legal Tech; and Application Security & Fraud. Topically, he provides thought leadership and guidance for clients on a wide range of security topics including ransomware and emerging products designed to protect transforming architectures and business models.