The idea of a New India has inspired a great deal of confidence and optimism among Indians in recent years. Our ability to take a confident, independent stance on multiple issues in the global and regional political theatre, register exceptional sporting achievements (including at the Olympics), and become the first nation to land a spacecraft near the Moon's south pole are just a few of the reasons to believe that a New India is finally emerging.

India’s ongoing economic transformation, driven in significant measure by rapid digitalization, is another shining example of the New India that is evolving.

The current government had actively pushed for a digital economy and digital delivery models well before the Covid-19 pandemic forced countries and organizations across the world to go digital. Launched in 2015, the Digital India mission ushered in a revolution in the Indian Government’s delivery of public services.

The government also initiated the India Stack, a unique platform that different constituents of the economy can leverage for growth. This mission led to new business models in critical industries such as agriculture, transportation, healthcare, and education – all of which are undergoing fundamental transformation. This digitalization cut down leakages in the system and enabled formalization of the economy, with rising GST registrations and collections serving as a good benchmark of its success.

The India Stack has been a game changer as individuals and enterprises leverage it for convenience, speed and efficiency, the combination of which will drive greater velocity of business. As India (and the world) moves to digital, this underlying, enabling digital infrastructure is having the same effect as railroads did after the Industrial Revolution.

India Stack has had an exponential, cascading effect on different sectors of the economy and driven growth opportunities for citizens. Jan Dhan Yojana, CoWIN, UPI, and ONDC are some of the world’s leading examples of digital at scale. Financial institutions and organizations in the wider financial ecosystem have probably been the strongest adopters of the India Stack, driving financial inclusion and bringing significant ease to the adoption of financial products.

As an example, Pradhan Mantri Jan Dhan Yojana (PMJDY) accounts grew three-fold from 147.2 million in March 2015 to 462.5 million in August 2022, covering a large segment of the unbanked population.

In response to the increased opportunities digitalization provides, we see widespread usage of the India Stack by enterprises especially in the B2C domain. This underlines the immense benefits it provides Indian citizens.

In fact, the government’s push for digital has been so compelling that in IDC’s recent Digital Business Survey in India (August 2023), 25% of medium to large enterprises said that the government’s push toward digitalization was one of the key factors in their adoption of a digital business model.

IDC defines a digital business as one where value creation is based on the use of digital technologies which includes internal and external processes; how an organization engages with customers, citizens, suppliers, and partners; how it attracts, manages, and retains employees; and what products, services, and experiences it provides.

It is encouraging to note that Indian enterprises do not significantly lag behind global peers in digital adoption. This is in large part thanks to our strong developer community and traditional strength in IT services. IDC’s survey also indicates that CEOs drive the digital strategy in more than a third of Indian companies today.

To advance the digital agenda, another 25% of Indian enterprises have instituted new designations such as Chief Digital Officer and Chief Data Officer in the last two years. Enterprises realize that digital business models help drive operational efficiencies, enable enhanced decision making, increase competitive advantage, and, most importantly, meet customer expectations.

This digital embrace is not the preserve of enterprises alone, though. The Indian consumer has also adopted it enthusiastically. With 850 million internet subscribers, 1.1 billion mobile subscribers (of whom 630+ million are smartphone users), and 398 million engaged on social media, India already has a connected population. And this connected population is embracing digital: DigiLocker has more than 137 million users, UPI has 300 million active users, and CoWIN has 1.1 billion registered persons.

With enterprises and consumers actively adopting digital, it is clear that India’s shift to digital is real, irreversible, and widespread. The Honorable Minister of State for IT & Electronics, Shri Rajeev Chandrasekhar, recently stated that the digital economy will contribute 20% of India’s GDP by 2026.

IDC is similarly bullish about India’s digital prospects. Two years ago, 34.8% of Indian businesses told IDC that 50% or more of their revenues came from digital business models. Today, that number is 50%, and in three years’ time IDC expects 62% of Indian enterprises to derive more than 50% of their revenues from digital business models. For these reasons, the future of New India is digital, and bright!

Digital-native businesses’ (DNBs) deal-making, valuations, and exit activities were all down in 2023 in the European venture market, according to Atomico’s The State of European Tech 2023. This market correction, however, can be considered a worldwide phenomenon.

The key fundamentals that led to a downturn in the funding environment in the last two years are still in place. Limited partners are still cautious about providing more money to the venture capital (VC) ecosystem, due to persisting macroeconomic and geopolitical uncertainties. With difficulties continuing in the funding environment, the number of exits is expected to remain limited in the short term, in favor of M&A and consolidation.

With all this as a backdrop, what will 2024 look like for European DNBs?

From AI to Sustainability Technologies: Where Is the Money?

As a result of this lack of activity, European venture capital firms hold a considerable amount of dry powder, which could be invested in selected deals this year. A 2024 rebound is expected in the event of a cut in interest rates, which could lower risk perception among limited partners. While only 10 new unicorns (privately owned companies valued above $1 billion) were created in Europe in 2023, down from 46 in 2022, we expect an upturn in deal-making activity to bring a larger number of DNBs into the unicorn cohort.

European artificial intelligence (AI) DNBs are expected to be at the forefront of investors’ interest again in 2024. While the focus of VC and corporate VC deals in 2023 was on large language models (LLMs), deals will most probably shift toward vertical AI applications. With regulations such as the EU AI Act coming into effect, investment will also shift toward start-ups and scale-ups focused on AI security and privacy.

Sustainability technology DNBs, from carbontech to climatetech, dominated capital flows in 2023, and the segment is expected to attract more capital in 2024 too, with climate change a key topic on European (and worldwide) leaders’ agendas, as demonstrated by the outcomes of COP28. Furthermore, tech start-up growth in Europe is also sustained by national and EU stimulus funds, such as the European Innovation Council (EIC) work programme 2024, which allocates €1.2 billion for strategic technologies and for scaling up companies in deep tech innovation, from spacetech to quantum technologies.

How Will External Conditions Shape European DNBs’ Technology Investments?

Uncertain market conditions are pushing digital natives to reprioritize their tech spending toward optimizing processes and increasing profitability, but tech expenditure will not be cut, as it is essential to sustain their digital-based business models. More specifically, security technologies and cloud platforms are pivotal investments for developing secure and scalable digital products and services, whereas increased focus on AI and automation technologies is set to make larger DNBs leaner and more cost effective. Investments in data infrastructure, integration, and quality will remain pivotal to boosting wider AI adoption, also targeting customer experience initiatives with the aim of retaining and enlarging the existing customer base.

Want to know more? You can find these and other key trends driving the European DNB landscape in IDC’s 2024 Digital-Native Business Trends, or by getting directly in touch at mlongo@idc.com.

Martina Longo - Research Manager, Digital Business - IDC

Martina Longo is a research manager in the IDC Digital Business Research Group. In her role, she advises ICT players on how European organizations create business value using digital technologies. She also leads IDC’s European Digital Native Business research, focused on enterprises born in a modern technological world: a mix of start-ups, scale-ups, and more mature digital natives. Within European Digital Business Research, the European Digital Native Business, Start-ups and Scale-ups theme advises technology suppliers on the market dynamics and segmentation, business priorities, tech buying patterns, and go-to-market approaches (sell to/sell with) needed to engage digital-native organizations in Europe.

If I told you the single most effective step your business could take to improve cybersecurity is to stop using passwords, you might think I’m crazy. It’s sort of like saying that the best way to make your car faster is to remove the tires, or that swearing off vegetables is key to losing weight.

But the reality is that, by many measures, passwords are past their prime. There’s a better solution – passkeys – and organizations looking for ways to strengthen their security posture would do well to consider passkey-based authentication in contexts where it makes sense.

That said, passkeys remain subject to a variety of challenges, which make it unrealistic for most businesses to shift entirely to passkey-based logins in the near future.

Plus, as passkeys grow in popularity, it will be increasingly important for business decision-makers to distinguish between hype and reality when it comes to passkeys. Expect to hear more and more in the near future about how wonderful passkeys are – especially from vendors who sell passkey solutions – but don’t assume that passkeys are preferable to passwords for every use case and circumstance.

With those realities in mind, here’s a balanced look at the pros and cons of passkeys, along with tips on when and how to take advantage of passkeys as part of your business’s authentication strategy.

What are passkeys?

Passkeys are a way of authenticating with a website or app without requiring a username and password. Instead, passkeys confirm a user’s identity using methods like biometric authentication or the entry of a PIN code.

Under the hood, passkeys rely on a set of keys – one public and one private – that are generated for each user. The user’s public key is shared with a website or app to which the user wants to log in. The private key is stored only on the user’s personal device (such as a phone or laptop).

To authenticate with a website or app, the user must unlock his or her private key from the device where it resides. Typically, the method for unlocking the key involves biometric authentication (like scanning the user’s face or fingerprint) or the entry of a unique PIN code that the user configured when setting up the passkey.
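
To make that flow concrete, here is a toy sketch of the registration-and-login handshake. The signature scheme below is a deliberately simplified, insecure Schnorr-style construction over a tiny group, used only to illustrate the shape of the protocol; real passkeys use standardized curves (such as P-256 or Ed25519) through the WebAuthn/FIDO2 APIs, and all names here are illustrative.

```python
import hashlib
import secrets

# Toy Schnorr-style signatures over a tiny prime-order group.
# INSECURE and for illustration only -- real passkeys use standardized
# curves (e.g., P-256, Ed25519) via the WebAuthn/FIDO2 APIs.
P = 2879   # small safe prime: P = 2*Q + 1
Q = 1439   # prime order of the subgroup we work in
G = 4      # generator of the order-Q subgroup mod P

def _hash(r: int, message: bytes) -> int:
    data = str(r).encode() + b"|" + message
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def keygen():
    """Registration: the device generates a key pair and sends only
    the public half to the website; the private half never leaves."""
    private = secrets.randbelow(Q - 1) + 1
    public = pow(G, private, P)
    return private, public

def sign(private: int, challenge: bytes):
    """Login: the device signs the server's challenge after the user
    unlocks the key (a biometric scan or PIN in a real implementation)."""
    k = secrets.randbelow(Q - 1) + 1
    r = pow(G, k, P)
    e = _hash(r, challenge)
    s = (k + private * e) % Q
    return e, s

def verify(public: int, challenge: bytes, signature) -> bool:
    """Server side: check the signature against the stored public key."""
    e, s = signature
    # Recover r' = G^s * public^(-e) mod P; a valid signature gives r' == r.
    r_prime = (pow(G, s, P) * pow(public, -e, P)) % P
    return _hash(r_prime, challenge) == e

# Simulated login round trip
priv, pub = keygen()                  # done once, at registration
challenge = secrets.token_bytes(16)   # server issues a fresh nonce per login
assert verify(pub, challenge, sign(priv, challenge))
```

Note that the server never learns the private key: it stores only the public key at registration and verifies a fresh challenge at each login, which is what removes the server-side password database as an attack target.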

Advantages of passkeys vs. passwords

Compared to password-based authentication, which has been widespread for decades, the benefits of passkeys boil down to three main advantages:

  • Greater convenience: Passkeys are more convenient for users, who don’t have to remember login names or passwords to log in.
  • Enhanced security: Passkeys improve security because, unlike passwords, they cannot be guessed or brute-forced by attackers (at least in most cases). In addition, because private passkeys reside only on users’ personal devices, passkeys eliminate the risk that threat actors could hack a server or database containing passwords and usernames, then use the information to compromise user accounts.
  • Spoofing/phishing protection: Passkeys mitigate spoofing and phishing risks because a user’s private key must be paired with the public key of a specific site or app when logging in. Therefore, attempts to trick users into logging into malicious sites masquerading as legitimate ones won’t work, because the malicious sites won’t have the same public keys as the legitimate sites they are impersonating.
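
The phishing protection in the last bullet can be sketched in a few lines. This is a simplified, hypothetical model (the class and method names are invented for illustration); in practice, the browser and operating system enforce this exact-origin matching as part of the WebAuthn protocol.

```python
import secrets

# Simplified sketch of origin binding: the client keeps credentials
# keyed by the exact relying-party origin, so a lookalike phishing
# domain can never obtain a credential meant for the real site.
# Class and method names are illustrative, not a real API.
class PasskeyStore:
    def __init__(self):
        self._credentials = {}  # origin -> private key material

    def register(self, origin: str) -> None:
        # In reality, only the *public* key would ever leave the device.
        self._credentials[origin] = secrets.token_bytes(32)

    def credential_for(self, origin: str):
        # Exact-match lookup: no matching credential, no login attempt at all.
        return self._credentials.get(origin)

store = PasskeyStore()
store.register("https://bank.example")

assert store.credential_for("https://bank.example") is not None
# A convincing lookalike domain simply has no matching credential:
assert store.credential_for("https://bank-example.com") is None
```

Because the lookup fails silently for an unregistered origin, the user is never even prompted to authenticate on the phishing site, which is a stronger guarantee than hoping users notice a suspicious URL.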

For these reasons, passkeys are a growing focus of identity and authentication providers like Okta and Microsoft, according to recent IDC research. Businesses that use authentication products or services from vendors who have added passkey support can make passkey-based logins an option – or a mandatory requirement, if desired – for their employees and customers.

The pitfalls of passkeys

On the other hand, passkeys are not perfect. They are subject to several distinct drawbacks that could hinder the ability of enterprises to adopt passkeys in certain situations:

Websites and apps may need to be updated to support passkeys

Sites and apps that are already configured to integrate with third-party authentication providers who support passkeys can add passkey-based logins relatively easily. Otherwise, however, businesses will need to overhaul the authentication logic in their apps to make passkeys viable for employees and customers – a process that takes time and money.

Passkeys are tied to devices

Because passkey-based authentication depends on access to private keys stored on specific devices, it’s not a good option for use cases where it’s difficult to predict which device an employee or customer will use to log in. For instance, if a customer sometimes connects to your site using a mobile phone but also uses a personal laptop or work laptop, they’d need to configure a separate passkey for each device.

The passkey vendor ecosystem is fragmented

To date, most solutions for configuring and managing passkeys work only on certain operating systems, devices or vendor ecosystems. For example, Apple’s offerings do not support Android devices.

Passkeys only support certain devices and operating systems

Passkey-based authentication also works only on devices and operating systems designed to support it. Older devices are likely not to be compatible, which could create confusion among users about which devices are supported.

Passkeys can be hacked

Passkeys are substantially more secure than passwords, but they’re not impervious to attack. Sophisticated threat actors who manage to obtain physical access to devices may find ways to work around biometric authentication or guess PIN codes in order to access passkeys stored on the device.

These types of attacks are much more challenging to carry out than conventional techniques for bypassing passwords, and to date, no major breach has occurred involving stolen passkeys. But they are plausible nonetheless.

Enterprise security policies don’t accommodate passkeys

Currently, most enterprise security policies that govern authentication and authorization were not designed with passkeys in mind. Enterprises will therefore need to update their security policies (and associated security practices) to accommodate them.

This is feasible, but updating security policies is likely to take some time, delaying enterprise passkey implementation.

A long horizon for passkey adoption

The nature of passkeys, and the challenges surrounding their implementation, mean that very few businesses are likely to migrate exclusively to passkey-based authentication anytime soon. To get to that point, organizations would need to overhaul all of their websites and applications to support passkeys.

In addition, identity and authentication management vendors would need to make passkey-based authentication a first-class citizen within their solutions. To date, few have done that, although it’s reasonable to expect that this will happen over the next two or three years.

When businesses should – and should not – use passkeys

Rather than approaching the question of “to passkey or not to passkey” as a binary, either-or choice, organizations should at present think about the specific situations where it does and doesn’t make sense to adopt passkeys as a primary means of authentication.

In general, shifting to passkeys is a good strategy for websites and applications that have well-defined sets of users with predictable behavior. If you know that the employees or customers who need to access a certain resource typically log in using specific types of devices, and they access those resources frequently, asking them to set up passkeys is reasonable – especially if the website or app already integrates with an identity management service that supports passkeys.

On the other hand, it’s harder to make a case for switching to passkeys in situations where the cost, complexity and hassle of configuring and maintaining them – from the perspective of both the business and users – outweigh the benefits. For example, legacy applications that can’t integrate with authentication services offering built-in passkey support are likely not worth updating just to enable passkey logins. Likewise, if you have a group of customers who access a website only a few times, they may view passkey requirements as more trouble than they’re worth.

Learn more about passkeys in the enterprise

Passkeys remain a fast-evolving topic as more and more identity and authentication management providers embrace the concept of passwordless logins, and as enterprises continue to evaluate use cases for passkeys.

IDC is following this scene closely and will be unveiling multiple resources in the coming months to offer actionable guidance on how enterprises can (and can’t) benefit from passkeys. To learn more – or to request access to IDC assets and analysts focused on passkey authentication – contact us.

If you’re interested in learning more about IDC’s guidance around cybersecurity, watch the on-demand webinar, Cybersecurity Norms and Trends: How Does Your Business Stack Up?, by clicking the button below.

Christopher Tozzi - Adjunct Research Advisor - IDC

Christopher Tozzi, an adjunct research advisor for IDC, is senior lecturer in IT and Society at Rensselaer Polytechnic Institute. He is also the author of thousands of blog posts and articles for a variety of technology media sites, as well as a number of scholarly publications. Prior to pivoting to his current focus on researching and writing about technology, Christopher worked full-time as a tenured history professor and as an analyst for a San Francisco Bay area technology startup. He is also a longtime Linux geek, and he has held roles in Linux system administration. This unusual combination of "hard" technical skills with a focus on social and political matters helps Christopher think in unique ways about how technology impacts business and society.

Sales success today goes beyond simply offering a superior product or service. It demands a strategic approach that encompasses understanding market dynamics, addressing customer needs effectively, and mastering the art of persuasive communication. At the heart of this approach lies the sales playbook, a comprehensive guide that not only outlines the path to success but also equips sales teams with the tools and strategies needed to navigate the intricacies of the modern buyer and marketplace.

The sales playbook serves as the cornerstone of a company’s sales efforts, providing a roadmap for engaging potential customers and driving revenue growth. It is more than just a static document; it is a dynamic resource that evolves alongside market trends and customer preferences. A well-crafted sales playbook encapsulates the collective wisdom and experience of a sales organization, distilling best practices, proven strategies, and invaluable insights into a single, cohesive framework.

In essence, the sales playbook serves as a compass, guiding sales professionals through the complexities of the sales process and empowering them to make informed decisions at every turn. From identifying promising leads to closing deals and nurturing long-term relationships, the playbook provides a blueprint for success at every stage of the customer journey.

However, creating the perfect sales playbook is no small feat. It requires a deep understanding of market trends, customer behavior, competitive landscapes, and the ever-evolving role of technology in driving sales effectiveness. It demands meticulous planning, careful analysis, and a willingness to adapt and innovate in the face of change.

In this blog post, we will address the essential components of a modern sales playbook, exploring four key building blocks that underpin its success. Whether you’re a seasoned sales professional looking to refine your approach or a business leader seeking to empower your sales team for success, this blog post will provide you with the knowledge, tools, and inspiration you need to create a sales playbook that sets your organization apart in today’s competitive marketplace.

Building Block 1: Understanding Market Trends and Drivers

The foundation of any successful sales playbook lies in a deep understanding of market trends. By analyzing industry data, competitor strategies, and emerging technologies, sales teams can identify opportunities and anticipate shifts in customer preferences. Tools such as market research reports, industry publications, and data analytics platforms can provide invaluable insights into market dynamics.

Market trends encompass various aspects, including changes in consumer behavior, evolving regulatory landscapes, and emerging technologies. Sales teams must stay vigilant and adapt their strategies accordingly to capitalize on new opportunities and mitigate potential risks.

Building Block 2: Identifying Challenges and Needs

Effective selling begins with a thorough understanding of the challenges and needs facing your prospective customers. Understanding common challenges and pain points at a persona level allows sales professionals to customize messaging and solutions to meet their audience’s specific needs.

60% of survey respondents say they won’t engage/respond to outreach unless communication is personalized for relevancy.

IDC 2023 B2B Technology Buyer Survey

While many sales teams often prioritize discussing price as a determining factor, it’s essential to recognize that various roles within organizations weigh multiple considerations beyond fiscal budgets. IDC’s 2023 B2B Technology Buyer Survey revealed that product innovation from category market leaders was the primary reason customers switch vendors. This underscores the importance of understanding challenges and needs at a persona level to facilitate relevant conversations that resonate with potential customers.

Building Block 3: Relevance and Customer Value

Technology serves as a cornerstone in shaping business processes. However, the true measure of success lies in demonstrating how YOUR technology precisely meets the needs of the individual persona you’re addressing within their unique position in the buying center.

A robust playbook should empower sales teams to quickly grasp and articulate current market dynamics, prevalent challenges, and how their technology directly alleviates common pain points. It’s imperative to integrate a value-selling approach that highlights the tangible benefits and reliable return on investment (ROI) that your technology offers to customers.

By emphasizing the specific value propositions and ROI metrics associated with your technology, sales professionals can effectively communicate its relevance and significance to potential buyers. However, it’s crucial to recognize that value varies for each persona based on their role and objectives within the buying center. Leveraging solutions that account for persona-specific needs during research and the development of sales enablement tools empowers sales teams to engage in more accurate conversations with potential customers. This tailored approach fosters authentic relationships and enhances engagement while instilling confidence in the solution’s capacity to address pressing needs and deliver measurable results.

Incorporating detailed information about value selling and proven ROI metrics into the sales playbook equips teams with the tools and knowledge they need to navigate complex sales cycles successfully. It enables them to tailor their messaging and pitch strategies to resonate with the priorities and objectives of each persona within the buying center, ultimately driving stronger relationships and more meaningful conversions.

Building Block 4: Crafting Key Questions and Hooks

The art of selling lies in asking the right questions and delivering compelling value propositions that resonate with prospects. A well-crafted sales playbook should equip sales professionals with a set of key questions and hooks designed to uncover customer needs and differentiate their offerings from the competition.

Key questions should probe into the pain points, goals, and priorities of prospective customers. By actively listening to customer responses and empathizing with their challenges, sales professionals can tailor their solutions to address specific needs and add value.

Hooks are persuasive messages or value propositions that capture the attention of prospects and differentiate the offering from competitors. Whether it’s highlighting unique features, showcasing customer testimonials, or offering exclusive incentives, hooks should resonate with the target audience and compel them to act.

Sales Enablement Tools to Support Your Success

Sales enablement tools play a pivotal role in supporting the success of sales teams as they navigate the complexities of the modern marketplace. These tools provide invaluable resources that build new sales skills and engage prospects in a more compelling buying experience.

Ultimately, while a sales playbook serves as a valuable tool for navigating the sales journey, success hinges on thorough preparation and ongoing training. Just as in a marathon, adequate training beforehand is essential for optimal performance. Invest in sales enablement tools that equip your sales team with the knowledge and skills they need to excel, such as sales mastery classes, educational workshops, or online coaching programs. By prioritizing education and skill development, your team will be well-prepared to engage prospects in informed and meaningful conversations, ultimately increasing their effectiveness and success rates.

Creating the perfect sales playbook requires a comprehensive understanding of market dynamics, customer needs, technology trends, and persuasive communication strategies. By leveraging tools and methodologies across these four building blocks, sales enablement and business development leaders can develop a playbook that empowers them to engage prospects effectively, drive meaningful conversations, and ultimately, win more business.

Sales playbooks should be living documents that evolve in tandem with market trends, customer feedback, and technological advancements. By continuously refining and optimizing their sales strategies, organizations can stay ahead of the curve and achieve sustainable growth in today’s competitive marketplace.


The role of the Chief Information Officer (CIO) has evolved significantly in the digital age. From managing IT infrastructure to becoming key business strategists, CIOs now stand at the intersection of technology and business, leveraging innovations to shape organizational directions, create value, and boost revenue. 

The “IDC FutureScape: Worldwide CIO Agenda 2024 Predictions” provides valuable insights into the future of this role. The FutureScape examined ten predictions spanning the next four years and their impact on CIOs (see Figure below). In this blog, we will detail the first five.

Here are five key predictions that every CIO should be aware of:

1. Embracing AI, Automation, and Analytics

By 2028, it is predicted that 85% of CIOs will leverage organizational changes to harness AI, automation, and analytics, driving agile, insight-driven digital businesses. This means that CIOs will need to stay ahead of the curve in understanding these technologies and implementing them effectively within their organizations. The most challenging aspect will not be the technology, but aligning both the business and technical cultures to make these changes successful.  

2. Pressure to Adopt Digital Tech

By 2024, 65% of CIOs will face pressure to adopt digital tech such as GenAI and deep intelligence. However, limited IT support may diminish the benefits and heighten risks. CIOs will need to ensure they have the right support and talent in place to navigate these challenges and reap the benefits of these advanced technologies. In fact, we give in-depth guidance on approaches to adopting and applying GenAI to the organization in this eBook. In some cases, IT may have talent it can leverage internally or train up, but in other cases, IDC sees CIOs leveraging third parties to initially provide this talent and support.

3. Proactive Cybersecurity Measures

By 2027, 75% of CIOs will integrate cybersecurity measures directly into systems and processes to proactively detect and neutralize vulnerabilities, fortifying against cyberthreats and breaches. This highlights the increasing importance of a multi-layered, systematic, and systemic approach to cybersecurity protection, rather than the more traditional point-solution approach organizations have adopted over the years. Download this checklist to unlock the set of practices that technology leaders should employ to proactively protect their organizations.

4. Prioritizing Strategic Data Management

By 2025, 45% of CIOs will prioritize strategic data management and foster a data-centric culture, ensuring competitive differentiation in the digital era. This means that CIOs will need to focus on how data is managed and used within their organizations, ensuring it is used strategically to drive business outcomes. As part of this, IT organizations will need to work with the business to identify what data is critical to achieving business outcomes, as well as the quality of that data. And as part of driving a data-centric culture, creating a common data platform (which doesn’t mean one tool, but a common set of processes, practices, and data accessibility) is critical to enterprise success.

5. Misaligned Investments Hindering Business Performance

Two-thirds of CIOs will not meet their 2025 digital revenue goals due to misaligned investments hindering business performance. This highlights the need for CIOs to ensure their IT investments align with their business goals and strategies, and that CIOs understand both their company and the markets their company competes in. CIOs can no longer simply be good technologists. They must also gain acumen in their business and their market. They must understand the ‘art of the possible’ with technology and be able to translate it into terms that build business conviction.

Above all, these forecasts highlight the crucial need for CIOs to sync up IT investments with the wider business strategy. It’s not enough to chase the latest trends; it’s about ensuring that every tech move aligns with the company’s goals. By fostering collaboration between tech experts and business leaders, CIOs can steer their organizations toward sustainable growth and success in the digital age. So, as we look ahead, let’s embrace these insights, stay agile, and keep innovating to thrive in tomorrow’s world.

To wrap things up, the IDC FutureScape forecasts paint a vivid picture of the road ahead for CIOs navigating the ever-evolving tech landscape. From AI to automation and analytics, the opportunities for innovation are huge. Yet, it’s not just about adopting shiny new tools; it’s about building a solid foundation of data management, resilient platforms, and proactive cybersecurity measures.

For more IDC FutureScape content for CIOs, be sure to check out the on-demand webinar, IDC FutureScape: Worldwide CIO Agenda 2024 Predictions. Click the button below to watch the webinar now.

Mona Liddell - Research Manager, Quantitative Analysis, CIO Executive Research - IDC

Mona Liddell is a Research Manager for IDC’s CIO Executive Research team. She is responsible for leading the creation, analysis, and delivery of quantitative-based research and related marketing content for business and technology leaders. This research provides guidance on how to leverage technology to achieve innovative and disruptive business outcomes.

San Francisco-based OpenAI’s introduction of ChatGPT on November 30, 2022, marked a significant milestone in the development of large language models (LLMs) and generative AI (GenAI) technology. The launch by OpenAI, the creator of the initial GPT series, sparked a race among technology vendors, system providers, consultants, and app builders. These entities immediately recognized the potential of ChatGPT and similar models to revolutionize industries.

2023 saw a surge in efforts to develop GenAI tools that are smarter, more powerful, and less prone to hallucinations. The competition led to an influx of innovative ideas and tools aimed at harnessing the capabilities of LLMs. The goal became to leverage these models as ultimate tools to enhance productivity, competitiveness, and customer experience across diverse sectors.

With ChatGPT paving the way, a broad range of organizations and professionals are exploring how to integrate GenAI into workflows and solutions. The widespread interest and investment have underscored the technology’s transformative potential and laid the groundwork for its continued evolution in the years to come.

4 Use Cases for GenAI in Manufacturing

In manufacturing organizations, the utilization of GenAI-powered tools and solutions is primarily focused on four key areas:

  1. Content Generation: This includes automated report generation, in which GenAI algorithms are employed to automatically generate reports based on predefined parameters and data inputs.
  2. User Interface Enhancement: This involves the integration of chatbots into user interfaces, enabling more intuitive and interactive communication between users and systems.
  3. Knowledge Management: GenAI facilitates knowledge management by providing co-pilot services that help users access and interpret vast amounts of data and information.
  4. Software and Delivery: This encompasses various applications, such as code generation, in which GenAI is leveraged to automate the creation of software code, streamlining development processes.
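As a minimal sketch of the first use case, automated report generation can be as simple as filling a predefined template with data inputs and handing the result to a model. Everything here is illustrative: `llm_complete` is a stub standing in for whichever GenAI service an organization actually uses, and the template fields are invented for the example.

```python
# Illustrative sketch of use case 1: automated report generation from
# predefined parameters and data inputs. `llm_complete` is a stub.

REPORT_TEMPLATE = (
    "Write a short production report for line {line}.\n"
    "Period: {period}\n"
    "Units produced: {units}\n"
    "Defect rate: {defect_rate:.1%}\n"
    "Flag anything above a 2% defect rate."
)

def llm_complete(prompt: str) -> str:
    """Stand-in for a call to a hosted GenAI model."""
    return f"[generated report based on a {len(prompt)}-character prompt]"

def generate_report(line: str, period: str, units: int, defect_rate: float) -> str:
    prompt = REPORT_TEMPLATE.format(
        line=line, period=period, units=units, defect_rate=defect_rate
    )
    return llm_complete(prompt)

print(generate_report("A3", "2024-W12", units=12480, defect_rate=0.013))
```

In a real deployment, the stub would be replaced by an actual model API call, and the generated text would typically be reviewed before distribution.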

According to IDC’s GenAI ARC Survey of 2023, manufacturing organizations are actively evaluating or implementing GenAI solutions.

Around 30% of European respondents have already invested significantly in GenAI, with spending plans established for training, acquiring GenAI-enhanced software, and consulting. Nearly 20% are doing some initial testing of models and focused proofs of concept, but don’t yet have a spending plan in place.

These results suggest steady growth in the adoption of GenAI-powered tools and solutions within the manufacturing sector. The initial hype surrounding GenAI in 2023, fueled by its perceived potential as a “wonder technology,” has evolved into a pragmatic recognition of its capacity to address ongoing challenges such as workforce shortages, skills gaps, language barriers, data complexity, regulatory compliance, and more.

In the manufacturing industry, GenAI is increasingly viewed as an enabling technology capable of facilitating innovation and overcoming barriers to success.

Framework for Manufacturing Organizations to Implement GenAI

To fully capitalize on the potential of GenAI pilots, manufacturing organizations recognize the need for comprehensive frameworks that encompass processes and policies. Key measures include:

  • Data Sharing and Operations Practices: Organizations should prioritize the implementation of practices that ensure data integrity for LLMs developed internally or in collaboration with third parties. This ensures that data used in GenAI models is accurate, reliable, and ethically sourced.
  • Corporate-Wide Guidelines for Transparency: Guidelines should be established to evaluate transparency and track the use of GenAI code, data, and trained models throughout the organization. This promotes accountability in GenAI usage.
  • Mandatory GenAI Awareness and Acceptable Use Training Programs: Mandatory training programs should be implemented to raise awareness of GenAI capabilities and ethical considerations among designated workforce groups. This helps ensure that employees understand how to responsibly utilize GenAI technologies.

As excitement over the capabilities of GenAI has died down, organizations are becoming increasingly aware of the risks posed by potential intellectual property theft and privacy threats linked to the technology.

To address these concerns, many organizations are prioritizing the establishment or expansion of formal AI governance/ethics/risks councils tasked with overseeing the ethical use of GenAI and mitigating risks associated with privacy, manipulation, bias, security, and transparency.

As a manufacturing interviewee in one of my studies put it, “The governance framework is indispensable in ensuring responsible and ethical AI implementation.” This underscores the importance of implementing robust governance measures to ensure the ethical use of GenAI within manufacturing organizations.

Deployment Strategies

Strategies for selecting the right solution for the right use case can vary substantially. A global white goods company, for example, piloted several GenAI-powered use cases in 2023. Its selection and deployment strategy encompassed a range of approaches, including:

  • Off-the-Shelf Solutions: The company utilized ready-to-use, commercially available GenAI-embedded software-as-a-service solutions. These offered immediate access to GenAI capabilities without the need for extensive development or customization.
  • AI Assistants: It deployed AI assistants to support specific tasks within their business processes. These assistants helped, for example, to create designs based on predetermined workflows, providing valuable support and efficiency gains.
  • AI Agents: The company deployed AI agents in complex use cases requiring the orchestration of workflows and decision-making based on AI-driven insights. The agents leveraged GenAI to analyze data and make informed decisions autonomously.

A primary challenge often mentioned in such endeavors is selecting the optimal LLM for company-specific use cases from a multitude of possibilities. With new models and solutions constantly emerging and becoming accessible, this task can be daunting. The selection process typically involves thorough market research, vendor presentations, and internal discussions about the technology framework underlying current and future use cases.

However, the success of GenAI ultimately hinges on the quality and quantity of the data utilized. Curating a diverse and sufficient data set is critical to ensuring unbiased outcomes and maximizing the effectiveness of GenAI solutions. Data curation therefore remains a cornerstone of success in leveraging GenAI technologies.

The Bottom Line

GenAI-powered technology holds immense potential across industries and regions, offering capabilities that traditional machine learning algorithms or neural networks may struggle to match in terms of breadth and depth. GenAI can assist in co-piloting humans, thereby addressing challenges associated with an aging and/or unqualified workforce.

However, organizations must prioritize addressing concerns such as data leakage, biases, and maintaining sovereignty over IT processes running in the background. These issues must be carefully managed to ensure the responsible and ethical implementation of this powerful technology.

The past year and a half has demonstrated the impressive capabilities of generative AI (GenAI) systems, such as ChatGPT, Bard, and Gemini. Business application vendors have since begun a sprint to include the most recently enabled capabilities (summarizing, drafting text, natural language conversation, etc.) into their products. And organizations across industries have started to deploy generative AI to help serve customers — hoping that GenAI-powered chatbots could provide a better customer experience than the failed and largely useless service chatbots of the past.

The results have started to come out, and they are mixed. The service chatbots of organizations such as Air Canada and DPD have made unsubstantiated offers or even produced rogue poetry. Another customer chatbot, for a Nordic insurance company, was not updated after the latest website reorganization and kept sending customers to outdated, decommissioned web pages.

The popular Microsoft Copilot hallucinated about recent events and invented occurrences that never happened. Based upon personal experience, a customer meeting summary written by generative AI included a final evaluation of the meeting as “largely unproductive due to technical difficulties and unclear statements” — an assessment not echoed by the human participants.

These issues highlight several dilemmas related to using generative AI in software applications:

  • Autonomous AI functions versus human-supervised AI. Autonomous AI is attractive to customer service departments because of the cost difference between a chatbot and a human customer service agent. This cost-saving potential must, however, be balanced against the risk of reputational damage and negative customer experiences resulting from chatbot failures and mishaps.

Instead, designing solutions with a “human in the loop” may have multiple benefits. Incorporating employee oversight to guide, validate, or enhance the performance of AI systems may not only improve output accuracy but also increase adoption of GenAI solutions. For example, a customer service agent could have a range of tools, such as automatically drafted chat and email responses, intelligent knowledge bases, and summarization tools, that augment productivity without replacing the human.
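One minimal way to sketch such a human-in-the-loop design (all names here are hypothetical, and the AI call is stubbed so the sketch runs on its own): the model only drafts, and nothing is sent until a human agent approves or edits the draft.

```python
# Hypothetical human-in-the-loop sketch: the model drafts a reply,
# but nothing reaches the customer until an agent has reviewed it.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Draft:
    customer_message: str
    ai_text: str
    approved: bool = False
    final_text: str = ""

def draft_reply(customer_message: str) -> Draft:
    # Stand-in for a real LLM call that proposes a response.
    return Draft(customer_message, f"[AI draft replying to: {customer_message!r}]")

def agent_review(draft: Draft, edited_text: Optional[str] = None) -> Draft:
    # The human agent validates the draft, optionally editing it first.
    draft.final_text = draft.ai_text if edited_text is None else edited_text
    draft.approved = True
    return draft

def send(draft: Draft) -> str:
    # Guardrail: an unreviewed AI draft is never sent autonomously.
    if not draft.approved:
        raise RuntimeError("refusing to send an unreviewed AI draft")
    return draft.final_text
```

The design choice is that accepting the draft unchanged costs the agent one click, while the safe default is that nothing is sent at all.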

  • At what point is company-specific training enough? That is, should organizations make extensive training investments in company-specific large language models (LLMs), or rely on out-of-the-box LLMs, such as ChatGPT, for good-enough answers? In some of the generative AI failures described above, it seems that the company-specific training of the AI engine was too superficial and did not cover enough interaction scenarios.

As a result, the AI engine fell back on its foundational LLM, such as GPT or PaLM, which in some cases acted in unexpected and undesired ways. Organizations are understandably eager not to reinvent the wheel with respect to LLMs, but the examples above show that overreliance on general LLMs is risky.

  • Keeping the chat experience simple versus allowing the user to report issues. This includes errors, biased information, irrelevant information, offensive language, and incorrect format. To this end, it is crucial to understand sources and training methods. A good software user experience is helped by a clean user interface. In the context of generative AI, think of the prompt input field in an application. Traditional wisdom suggests keeping this very clean. However, what is the user supposed to do in case of errors or other types of unacceptable AI responses, and how is the user supposed to verify sources and methodologies?

This is linked to the need for “explainable AI”, which refers to the concept of designing and developing AI systems in such a way that their decisions and actions can be easily understood, interpreted, and explained by humans.

The need for explainability has arisen because many advanced machine learning models, especially deep neural networks, are often treated as “black boxes” due to their complexity and the lack of transparency in their decision-making processes.

  • Using generative AI for very specific and controlled use cases versus general AI scenarios. One way to potentially curb the risks of AI errors is to frame the use of AI into specific and limited application use cases. One example is a “summarize this” button as part of a specific user experience next to a field with unstructured text. There is a limit to how wrong this can go, as opposed to an all-purpose prompt-based digital assistant.

This is a difficult dilemma simply because of the attractiveness of a general-purpose assistant, which has prompted vendors to announce such general assistants (e.g., Joule from SAP, Einstein Copilot from Salesforce, Oracle Digital Assistant, and Sage Copilot).
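The contrast can be made concrete with a small sketch (hypothetical names, stubbed model call): a “summarize this” button fixes the instruction and passes only the selected text, whereas a general assistant forwards arbitrary user prompts.

```python
# Sketch of a narrow, controlled GenAI use case versus a general assistant.
# `llm_complete` is a stub standing in for a real model call.

SUMMARIZE_INSTRUCTION = "Summarize the following text in two sentences."

def llm_complete(prompt: str) -> str:
    return "[model output]"

def summarize_button(selected_text: str) -> str:
    # The user cannot change the instruction; only the selected text
    # is sent, which bounds how wrong the output can go.
    if not selected_text.strip():
        return ""  # nothing to summarize; never invent content
    return llm_complete(f"{SUMMARIZE_INSTRUCTION}\n\n{selected_text}")

def general_assistant(user_prompt: str) -> str:
    # All-purpose prompt box: the model may be asked anything,
    # so errors and off-policy answers are much harder to contain.
    return llm_complete(user_prompt)
```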

  • Charging customers for generative AI value versus wrapping it into existing commercial models. GenAI is known to be expensive in terms of compute costs and the manpower needed to orchestrate and supervise training. This raises the question of whether such new costs should be passed on to customers.

This is a complex dilemma for a number of reasons. Firstly, AI costs are expected to decline over time as this technology matures. Secondly, AI functionality will be embedded into standard software, which is already paid for by customers.

The embedded nature of many AI application use cases will make it very difficult for vendors to charge for incremental, separate new AI functions. Mandatory additional AI-related fees on existing SaaS solutions are likely to be met with strong objections from customers.

  • Sharing the risk of generative AI output inaccuracy with customers and partners versus letting customers be fully accountable. Generative AI will increasingly be leveraged to support key personas’ decision-making processes in organizations. What if the model hallucinates and the outputs are misleading? And what if the consequence is a wrong decision with a serious negative impact on the client organization? Who is going to take responsibility for the consequences of those actions? Should customers accept this burden alone, or should accountability be distributed between vendors, their partners (e.g., LLM providers), and end customers?

In any case, vendors should have full transparency into their solutions (including clear procedures for training, implementing, monitoring, and measuring the accuracy of generative AI models) so they can immediately provide the required information to the customer in case of an emergency.

 

After taking the enterprise technology space by storm, generative AI is likely to progress more slowly than initially expected. As a new technology, GenAI might enter the “phase of disillusionment,” to paraphrase colleagues in the analyst industry.

This slowdown will be driven by a more cautious adoption of AI in enterprise software, as new horror stories instill fear of reputational damage in CEOs across industries. We believe that new generative AI rollouts will have more guardrails, more quality assurance, more iterations, and much better feedback loops compared to earlier experiments.

Bo Lykkegaard - Associate VP for Software Research Europe - IDC

Bo Lykkegaard is associate vice president for the enterprise-software-related expertise centers in Europe. His team focuses on the $172 billion European software market, specifically on business applications, customer experience, business analytics, and artificial intelligence. Specific research areas include market analysis, competitive analysis, end-user case studies and surveys, thought leadership, and custom market models.

The efficient management of identities and access has become central to digital business. It determines the speed and agility with which an organization is able to operate or pursue new goals; it underpins employee productivity and enables operational efficiencies; and it is key to security, privacy, and compliance. Most organizations have deployed identity and access management (IAM) solutions to handle their operational demands effectively.

However, the identity infrastructure and processes themselves are a frequent target of cyberattackers, driving recognition that identity security measures need to be improved.

What Are the Main Identity Threats?

IDC’s Global Identity Management Assessment Survey 2023 found that in Western Europe, the two categories of identity that are perceived as the biggest threats are hybrid or remote employees and partners, suppliers, or affiliates (each category mentioned by 49.6% of respondents). The external nature of these identities — from a location perspective, an employment perspective or both — increases the attack surface of the organization and creates potential vulnerability and exposure of data, systems, and processes.

Nevertheless, those roles also provide access to a broader talent pool and deliver operational efficiencies and economies of scale, allowing organizations to outsource non-core functions. Consequently, organizations are striving to accurately assess and manage the risk.

What Are the Top IAM Investments?

Accordingly, the top two service areas in which Western European organizations are planning to make significant IAM investments to address the security risk are identity management for roles and authorizations (56.9%) and privileged access management (PAM – 53.3%).

Note that since the onset of the COVID-19 pandemic, investments in PAM have been growing steadily, as organizations required greater control over remote employees accessing sensitive corporate applications and data.

Which IAM Areas Must Improve?

The survey also asked which IAM areas organizations need to improve on significantly in the next 18 months. From a list of options including functional, operational, structural, and organizational aspects, the top responses were squarely in the area of identity security:

  • The biggest share of organizations (45.1%) want to improve their ability to detect insider threats.
  • A further 44.3% aim to improve identity threat detection and response (ITDR).
  • 9% aim to improve integration with other IT security solutions.

The emergence of ITDR in the last couple of years as a key priority for organizations building out their security and identity capabilities has been a key takeaway of multiple recent IDC surveys.

The final area to touch on is the “wish list” question, always a good barometer of what respondents really value. In this case, if your organization had the budget and resources to do so, what’s the one identity technology solution you’d add or strengthen in the next three months?

The top response was strong authentication, such as two-factor authentication or multifactor authentication (MFA), cited by 25.6%. This was followed by generative AI (GenAI) for fraud detection and identification of synthetic identities (20.3%) and, again, ITDR (19.5%).

The rapid maturing of deepfake tools and capabilities, underlined by real-world examples of successful attacks, is already driving demand for security tools to protect against them as the GenAI arms race heats up.

Identity really is at the heart of everything in the digital era: business, security, trust, compliance, risk management, operational efficiency, and more. It is fundamental to enterprise initiatives such as building cyber resilience or adopting zero trust principles.

Many direct references to IAM and identity security controls in the growing landscape of EU legislation further emphasize why identity should be high on every organization’s priority list. This new report maps many of the key trends shaping the European identity and access landscape in 2024.

Mark Child - Associate Research Director, European Security - IDC

Associate Research Director Mark Child of IDC’s European Security Group leads the group's Endpoint Security and Identity & Digital Trust (IDT) research for both Western Europe and Central & Eastern Europe. He monitors developments in security technologies and strategies as organizations address the challenges of evolving business models, IT infrastructure, and cyberthreats. Mark's coverage includes in-depth security market studies, end-user research, white papers, and custom consulting.

When NASA created its Apollo launch vehicles to carry payloads (including humans) to space, they were designed with multiple stages. The segment nearest the ground at launch (the “first stage”) contained huge engines and fuel tanks that could lift everything off the pad and accelerate it through the dense lower atmosphere. At that point, well before reaching orbit, the spent first stage would be jettisoned, to fall back to Earth. The rest of the vehicle would continue on its way, its upper stages eventually accelerating it to escape velocity.

A Frenzy of FOMO

OpenAI is the outfit that — above all others — is responsible for the rapid acceleration of interest and investment in generative AI (GenAI) technologies. The launch of ChatGPT in November 2022 kick-started a frenzy of FOMO, first for many individuals (after all, ChatGPT did surpass 1 million users in just five days) and then in businesses — as well as catalyzing conversations about intellectual property in the digital age, potential impacts of AI on employment and skills, and more.

Just over 12 months after the GenAI market launch, created primarily by the attractiveness of OpenAI’s consumer services, IDC conducted a worldwide survey that demonstrated the incredible momentum behind the new technology within businesses: in January 2024, 68% of organizations already exploring or working with GenAI said it would have an impact on their business in 2024-2025, and an astounding 29% said that GenAI had already disrupted their business to some extent.

OpenAI continues to benefit from amazing levels of mindshare, thanks to the good old rule of “be first”, but also to the undeniable PR power of its CEO Sam Altman — not least within senior business leadership circles. But mindshare is not enough; OpenAI also benefits from a strategic partnership with Microsoft, under which Microsoft has committed $13 billion of investment in return for an exclusive license to OpenAI’s IP and an agreement that Microsoft would be OpenAI’s exclusive cloud provider.

The heavily promoted downstream results of that partnership (Azure OpenAI Service, use of OpenAI models in Copilots, and so on) have continued to create mindshare momentum.

And yet: OpenAI is not currently traveling along the route that businesses want to take.

OpenAI’s Alignment Problem

The outfit was founded as a not-for-profit research institute, focused on developing artificial general intelligence (AGI) — a currently hypothetical future level of capability that envisions AI systems that can perform as well or better than humans on a wide range of cognitive tasks — with a capped profit company subsidiary (which is the entity invested in by Microsoft and others).

However, when we ask organizations what they need from GenAI in order to create business value from the technology, they typically cite qualities such as accuracy, privacy, security and frugality. For example: 28% of organizations are concerned that GenAI jeopardizes control of data and intellectual property; 26% are concerned that GenAI use will expose them to brand or regulatory risks; and 19% of respondents are concerned about the accuracy or potential toxicity in the output of GenAI models.

OpenAI is innovating fast, but the dominant innovation focus is on breadth and depth of functionality (e.g., the introduction of “multimodal” models that can manipulate multiple content types, including text, images, sound, and video). Not on accuracy, privacy, security, frugality, and so on.

Currently, it is vendors “higher up the stack” (enterprise application and enterprise software platform vendors) who are attempting to bridge the gap with functionality aimed at addressing trust issues and minimizing risks. But it is clear that foundation model providers also need to bear some responsibility for… being responsible.

Beyond OpenAI: An Explosion of GenAI Model Providers

OpenAI might have amazing mindshare right now, but it is already far from the only source of GenAI model innovation. Fueled by venture capital and corporate investment, competitors have flooded into the space, including:

  • GenAI research-focused vendors like Anthropic, AI21, and Cohere
  • Hyperscale public cloud providers AWS and Google
  • Enterprise technology platform vendors including IBM, Oracle, ServiceNow, and Adobe
  • Sovereignty-focused providers, including Mistral, Aleph Alpha, Qwen, and Yi
  • Industry-specialized providers, including Harvey (legal) and OpenEvidence (medicine)
  • A vibrant and fast-growing open-source model community, with thousands of GenAI-related projects hosted by Hugging Face and GitHub

Open-source communities are a particularly energetic vector of innovation: open-source projects are quickly evolving model capabilities in terms of model size and efficiency, training and inferencing cost, explainability, and more.

Microsoft Is Clearly Looking Beyond OpenAI

In late February, Microsoft President Brad Smith published a blog post announcing Microsoft’s new “AI Access Principles”.

There’s a lot of detail in the post, but underpinning it all is a clear direction: in order to reinforce its credentials as a “good actor” in the technology industry and minimize the risks of interventions by industry regulators around the world, Microsoft is committing to support an open AI (no pun intended) ecosystem across the full AI technology stack (from datacenter power and connectivity and infrastructure hardware to services for developers). As part of this, it is increasingly emphasizing the importance of a variety of different model providers. For instance, it’s made a recent small investment in France’s Mistral AI and is expanding support for models from providers like Cohere, Meta, NVIDIA, and Hugging Face in its platform.

Will OpenAI Fly or Crash?

In order for OpenAI to reap significant rewards from business demand for GenAI technology implementation, it is going to have to evolve its approach. While the initial success of ChatGPT captured market attention, the rapidly evolving landscape of both GenAI technology supply and demand requires a stronger business focus. OpenAI is faced with tension between its research-oriented ethos and the market’s demand for practical AI applications. This alignment problem raises questions about its identity and future strategy.

Lastly — what about Microsoft? It must back its new principles with tangible actions that genuinely advance AI responsibly. It needs to ensure transparency and avoid actions that would suggest it only uses “responsible AI” as a PR tool for driving profits. It needs to promote both innovation and competition. Nobody wants a world where one model’s dominance could stifle competition and limit options for developers.

Hence, fostering an open and inclusive ecosystem where smaller players can grow will be imperative for Microsoft’s credibility and allow for a trustworthy AI ecosystem benefiting everyone.

 

Want to know more? Join IDC’s experts from across EMEA on the 19th of March for an exclusive peek into our latest research to:

  • Uncover real-world use cases from organizations aiming to maximize positive impact of GenAI on their business,
  • Learn about evolving GenAI technology, supplier dynamics, and the shifting regulatory landscape,
  • Gain actionable insights and a roadmap to navigate GenAI possibilities and challenges in 2024 and beyond.

Register for the webcast here: How EMEA Organizations Will Deliver Business Impact With GenAI – Beyond the Hype.

Neil Ward-Dutton - VP AI, Automation, Data & Analytics Europe - IDC

Neil Ward-Dutton is vice president, AI, Automation, Data & Analytics at IDC Europe. In this role he guides IDC’s research agendas, and helps enterprise and technology vendor clients alike make sense of the opportunities and challenges across these very fast-moving and complicated technology markets. In a 28-year career as a technology industry analyst, Neil has researched a wide range of enterprise software technologies, authored hundreds of reports and regularly appeared on TV and in print media.

A couple of months back, I was on my way to a medical appointment when the service loaner vehicle I was driving suffered significant tire damage from an unforeseen road hazard. The car detected the impact, and the vehicle immediately offered the option to connect with an Emergency Response Specialist.

The agent confirmed that I was safe and dispatched a tow service for the vehicle to be towed back to the dealership. Unfortunately, this was about the only high point of the experience! My requests for a replacement vehicle were met with rude and indifferent responses from the emergency agent as well as customer service personnel at my dealership. I was forced to stay with my vehicle until the tow service arrived despite mentioning that it was crucial for me to make my doctor’s appointment. No alternative transportation options were offered, and I had to wait for someone from my family to finally drop me off at the doctor’s.

This is a classic, and unfortunately all too common, example of an experiential disconnect. Where the brand failed on customer experience (CX) was in not being able to connect the different pieces of my loaner vehicle journey to my expected outcome. Past insights, such as my tenure as a customer, service/sales referrals, past service records, amount spent at the dealership, and my long-standing relationship with my service advisor, were neglected.

Intelligent orchestration allows enterprises to bring forward relevant insights from customer, organization, and other stakeholder relationships, like the insurance company, tow service, or rideshare companies, and apply these insights to the current interaction. Making this complex connection across numerous different contexts within a single customer interaction, and correlating it to understand and meet a customer outcome, is where generative AI (GenAI) shines.

GenAI enables deeper and more accurate contextual awareness in customer engagements and more accurate recognition of customer intent. As a result, experiences are closer to customer desired outcomes.

With its ability to apply generative foundation models that can be trained on extraordinarily large amounts of data, GenAI is positioned well to strengthen foundational capabilities that power intelligent experience orchestration. These include:

Capturing richer customer context: Context defines a customer’s immediate need(s) and refers to the customer’s updated preferences, prior and current actions, behavior, sentiment, intent, location, goal/purpose, and circumstances.

Customer conversations are built on semantics, structures of concepts and observations that need to be inferred – an area where GenAI vastly outpaces predictive and interpretive AI.

GenAI was born to find patterns and correlations in unstructured data (e.g., sentiment, intent, emotion), trained on vast collections of textual data organized by discovered proximities of usage.
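As a hedged sketch of what capturing context from unstructured data might look like in code (the model call is stubbed, and the field names are assumptions rather than any specific product’s schema), an LLM can be asked to turn a free-text customer message into a small structured context record:

```python
# Illustrative sketch: extracting structured context (sentiment, intent)
# from an unstructured customer message. `llm_complete` is a stub that
# returns the kind of JSON a real model might infer.

import json

def llm_complete(prompt: str) -> str:
    return json.dumps(
        {"sentiment": "negative", "intent": "request_replacement_vehicle"}
    )

def extract_context(message: str) -> dict:
    prompt = (
        "Return JSON with keys 'sentiment' and 'intent' "
        "for this customer message:\n" + message
    )
    return json.loads(llm_complete(prompt))

ctx = extract_context("My loaner car has a flat and I will miss my appointment.")
```

The resulting record is what an orchestration layer could then carry forward into the next interaction.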

Improved context continuity: Context continuity means being able to bring forward relevant insights from past customer, organization, and ecosystem data, actions/inactions, and outcomes, and apply these insights as relevant to the current engagement scenario.

The 2023 Future of Customer Experience survey found that only about a fifth of enterprises globally have the capability to maintain context continuity for all customers across all their brands. Other factors that will continue to impact enterprises' ability to manage context continuity include the massive proliferation of channels and the rise of connected customer data and insights. With GenAI, customer journeys can be infused with insights across multiple modes/channels.
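
Conceptually, context continuity amounts to merging insights persisted from prior engagements into the current session before any decision is made. The sketch below is purely illustrative – the field names and values are hypothetical stand-ins for whatever a real customer data platform would supply.

```python
# Insights persisted from prior engagements (illustrative fields).
past_insights = {
    "tenure_years": 8,
    "service_referrals": 3,
    "preferred_advisor": "J. Smith",
    "last_service_outcome": "satisfied",
}

# Signals from the interaction happening right now.
current_session = {"channel": "phone", "intent": "loaner_request"}

# Bring prior insights forward so the current interaction starts
# with full context rather than a blank slate; current-session
# values take precedence where keys overlap.
engagement_context = {**past_insights, **current_session}
print(engagement_context)
```

The value of GenAI here is not the merge itself but deciding *which* past insights are relevant to the current intent – a judgment the simple dictionary union above cannot make.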

Anchoring customer journeys on outcomes (vs. outputs): In the earlier service incident example, the brand's loaner vehicle journey focused only on making sure that the vehicle could be returned to the dealership – i.e., the output, and an inward-looking, organizational output at that. While required, this action raises the question of whether customer outcomes were even considered as part of the design.

GenAI inherently uses a declarative approach in which a goal is the starting point. Combined with foundation models trained on a vast knowledge base of richer contextual customer data, GenAI can even assist enterprises in beginning journey design with specific customer outcomes. In addition, GenAI's active-learning capability can adjust customer interactions and journeys to meet these outcomes by actively accounting for real-time changes in customer needs, emotions, and intent in each interaction.

Improved system of connected insights: Optimizing orchestration depends on industrializing a system of connected insights – i.e., consuming a continuous stream of accurate and authentic customer intelligence.

The ability to consume vast amounts of unstructured data across channels/modalities offers enterprises a low-cost way to make collection of insights a byproduct of their customer engagements and not a separate process. Industrializing insights also includes wiring a deep understanding of customers into the company’s day-to-day actions – essentially, customer insights driving the business model.

Real-time journey automation and optimization: A key part of delivering intelligent orchestration is the fundamental automation capability required to connect data, tasks, and outcomes together. At its core, automation comprises data and process connectivity and the correlation of insights to determine and execute actions.

As customer journeys become increasingly non-linear, GenAI can make them more dynamic and adaptive. The model can evolve new responses based on changing customer and business events while keeping the customer outcome constant – as with the exception noted in the earlier loaner vehicle example. Large language models (LLMs) can suggest alternative journey steps or even navigational pathways (multiple journey steps), essentially redesigning journeys dynamically.
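
That dynamic re-planning can be sketched as follows, with a deterministic stub standing in for the LLM planner. The journey steps, event name, and `replan` helper are illustrative assumptions, not any vendor's actual orchestration API.

```python
def replan(remaining_steps, event, outcome):
    """Stand-in for an LLM call that proposes alternative journey
    steps when an exception event occurs, keeping `outcome` fixed."""
    if event == "loaner_unavailable":
        # Swap the blocked step for an alternative pathway that still
        # satisfies the outcome of keeping the customer mobile.
        return ["arrange_rideshare", "expedite_repair"] + [
            s for s in remaining_steps if s != "provide_loaner"
        ]
    return remaining_steps

journey = ["check_in_vehicle", "provide_loaner", "repair", "return_vehicle"]
outcome = "customer stays mobile while vehicle is repaired"

# An exception arrives after check-in; re-plan the rest of the journey
# while the customer outcome stays constant.
completed, remaining = journey[:1], journey[1:]
remaining = replan(remaining, "loaner_unavailable", outcome)
print(completed + remaining)
```

The key design point is that the outcome, not the original step sequence, is the invariant – the steps are free to change as events unfold.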

While GenAI excels at connecting and orchestrating insights at scale, it will not solve the most crucial gap plaguing customer experience transformations – delivering value parity. Parity in the value exchange with customers means that the customer and the organization get something equally meaningful out of the exchange. Customer value parity is crucial because an imbalance can erode customer trust and often ends in customer attrition.

The age of AI Everywhere promises to offer enterprises significant experience-based market differentiation. However, to grow profitably in a highly competitive digital economy, enterprises must capitalize on intelligent orchestration, anchor on customer desired outcomes, and aim to achieve value parity between customers and brands.

Visit our Future of Customer Experience website to learn more about achieving customer empathy at scale and get more insights on our thought leadership for how intelligent customer experiences can drive profitable growth.

Sudhir Rajagopal - Research Director, Future of Customers and Consumers - IDC

Sudhir is Research Director for the CMO Advisory Service, focused on creating and executing programs and research to help companies make data-informed decisions about marketing. Sudhir's research and advisory focuses on how organizations must consider transforming their marketing function with AI at the center. In his role, Sudhir monitors the continual innovation of technologies, business strategy, and customer experiences to empower marketing leaders to make decisions on marketing strategy and operationalization.