Generative AI has wowed consumers across the globe with its ability to find information and author high-quality content. For enterprises, the use cases are still being explored and defined. In this blog, we will explore a potential ‘killer app’ for generative AI: the virtual mentor as a new way to approach learning and onboarding.

In today’s organizations, the vast majority of mentoring happens by speaking to experienced colleagues, searching for answers on the public internet or on company-specific intranets, trawling through PDF guides and presentations, or perhaps taking e-learning courses or classroom sessions. The problem is that existing technologies and approaches offer no easy way for employees to find the information they need.

Current e-learning and onboarding solutions struggle with multiple challenges. Firstly, the content is costly and time-consuming to produce. Secondly, once produced it is generally static and quickly becomes outdated. Thirdly, the one-size-fits-all approach to learning and onboarding doesn’t quite meet the needs of the individual, who already knows all about A but would like to deep-dive into B.

We believe that generative AI will be a game changer in solving these problems, because the systems themselves – for the first time in history – can generate the needed learning content. Future virtual mentors will meet many of today’s unserved learning and onboarding needs. Employees would be able to interact digitally, remotely or in the office, intensively or in drip-feed style, and the learning content would be created on the fly, determined largely by the nature of the interaction and the learner’s queries.

AI-Powered Virtual Mentor vs. Previous Learning Approaches

First of all, let’s define generative AI. We define generative AI as a branch of computer science that involves unsupervised and semi-supervised algorithms that enable computers to create new content using previously created content, such as text, audio, video, images and code.

Secondly, let’s define what an AI-powered virtual mentor is. We envision the AI-powered mentor as having the following characteristics:

  • Always available. Like Microsoft’s failed personal digital assistant Clippy (remember the animated talking paperclip?), a virtual mentor will be an omni-available resource to the learner.
  • Creates content itself. If fed enough material, a generative AI-powered virtual mentor will be able to create the relevant teaching material itself by synthesizing existing content.
  • Conversational. Just like a real-life, human mentor, the AI-powered virtual mentor interacts via conversation. The human mentor converses verbally, while the virtual mentor works best via written conversation (although verbal user experience is on its way, as well).
  • Adaptive. A virtual mentor goes far beyond what is known today as ‘adaptive learning’, i.e., an e-learning experience with some variation in the course depending on the individual learner. A virtual mentor can freestyle and go wherever the learner would like to go within a general topic area.

An employee would be able to ask a wide variety of general questions to the virtual mentor, such as:

  • What is the pricing structure for product X?
  • Do we have representation in Peru?
  • What are the key new features in the version YY.YYY of product Z?
  • What is the expense management policy for a client meeting?
  • Who in my company works with [expertise area]?

Let’s compare working with a generative AI-powered virtual mentor to traditional e-learning and classroom training:

Why Do We Need Virtual Mentors When We Already Have ChatGPT and Similar Generative AI Platforms?

ChatGPT is of limited use in an enterprise context for one simple reason: employees using the platform are likely to reveal sensitive company information. This is why many organizations have banned the use of ChatGPT among employees.

Just imagine an employee at a healthcare provider uploading the raw transcript of an internal meeting regarding the cancer treatment of patient XX and asking for abbreviated meeting minutes. Such an upload to a public internet system would constitute a major violation of patient XX’s privacy.

Virtual mentors, on the other hand, would leverage public internet-based Large Language Models but would not feed any inquiries from employees back to the public internet. Such ChatGPT replicas in a confined corporate setting will be the first wave of generative AI virtual mentors to reach the market.

These will, in other words, be general-purpose virtual mentors based upon public internet information. They can be adopted by organizations of any size and are ready to use immediately.

A subsequent wave of virtual mentors will be based on curated content specific to a functional area or an industry or similar. Such specialized content virtual mentors will be sold by vendors that are in charge of curating content and maintaining the AI solution.

A virtual mentor in the area of accounting could be offered by a learning content provider or, alternatively, by an accounting solution provider. Some specialized virtual mentors could be provided as free add-ons to commercial software subscriptions.

Finally, we will see a wave of organization-specific virtual mentors that will act as experts in one organization. In this case, the organization itself would be in charge – possibly aided by a services provider – of feeding the system with learning material.

A product manufacturer would input all manuals, product FAQs, marketing material, customer service interactions, HR policies, internal communication, public pricing information, everything on the intranet and company internet sites, training materials, etc. That solution could be very helpful in onboarding new employees and in answering inquiries from existing employees. However, it would take time and resources to implement and would require a certain company size to pay off.

The figure below shows the different levels of data feeding into a virtual mentor. The interaction between the virtual mentor and the employee will be chat-based to begin with. However, in the medium term, interaction could also be done through verbal communication, games, metaverses, augmented reality, etc.
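To make the confined-mentor pattern concrete, here is a toy Python sketch. The three-document corpus, the naive keyword-overlap retrieval, and the prompt format are all invented for illustration; a production system would instead use embeddings, a private vector index, and a privately hosted language model, so that neither the question nor the retrieved context ever leaves the corporate boundary.

    import string

    # Toy corpus of curated internal content; entries are invented examples.
    CORPUS = [
        ("pricing guide", "Product X uses tiered pricing: Basic, Pro, and Enterprise."),
        ("HR policy", "Client meeting expenses are reimbursed up to the per-diem cap."),
        ("release notes", "The new version of Product Z adds SSO and audit logging."),
    ]

    def tokens(text):
        # Lowercase, strip punctuation, and split into a set of words.
        return set(text.lower().translate(str.maketrans("", "", string.punctuation)).split())

    def retrieve(question, top_k=2):
        # Rank internal documents by word overlap with the question.
        q = tokens(question)
        ranked = sorted(CORPUS, key=lambda doc: len(q & tokens(doc[1])), reverse=True)
        return ranked[:top_k]

    def answer(question):
        # Assemble a grounded prompt; a privately hosted LLM would complete it.
        context = "\n".join(f"[{src}] {text}" for src, text in retrieve(question))
        return f"Context:\n{context}\n\nQuestion: {question}"

    print(answer("What is the pricing structure for Product X?"))

The design point is the boundary: the employee’s question is matched against company-curated content, and generation happens against that retrieved context rather than against the public internet.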

Evidence of Generative AI Replacing Existing Digital Learning and Coaching Solutions

Chegg, an established American education technology (EdTech) company known for textbook rentals, online tutoring, and a variety of student services, was among the companies to feel the competition from generative AI. Chegg initially projected that generative AI tools such as ChatGPT would take longer to truly influence the market.

However, the release and subsequent popularity of GPT-4 among students, credited to its swift response time, efficiency, and affordability, led to a sales slowdown and a dramatic Chegg stock price decline of 48% in early May 2023.

In response to these trends, Chegg entered into a partnership with OpenAI in April 2023, leading to the development of CheggMate. This tool, still in its development phase, intends to amalgamate GPT-4’s generative AI capabilities with Chegg’s existing question database.

The goal for CheggMate is to enhance user experience by better aligning user queries with the most suitable resources.

Other EdTech vendors, including Duolingo, have unveiled new AI-driven features. Specifically, Duolingo introduced a role-play chat where users can learn a language by conversing with an AI. After these interactions, they receive feedback and suggestions to enhance their language-learning journey.

We have also witnessed the first examples of generative AI approaches in mentoring. CoachHub, a leading vendor of digital coaching solutions, recently unveiled AIMY, a virtual AI-powered career coach rooted in OpenAI’s ChatGPT. AIMY is designed to let users try personalized coaching sessions without any human interaction and without the costs associated with traditional coaching. It emulates human-to-human coaching but is still in beta and not yet able to manage highly complex discussions.

Challenges to Overcome for Virtual Mentor Solutions

Adopting virtual mentor solutions for learning, onboarding, and coaching purposes is not without challenges. Here are a few key obstacles that organizations might encounter when introducing these new AI-driven solutions:

  • Data privacy and security concerns. The first cases of data breaches related to the use of generative AI solutions by employees have already emerged, such as Samsung’s discovery of staff uploading a variety of sensitive information to ChatGPT. Future virtual mentor solutions will not feed data back to public generative AI systems, such as ChatGPT.

As shown in the figure above, virtual mentors will use a combination of user data, curated company data, curated industry- or function-specific data, as well as publicly available data as training material. Such approaches will limit the risk of data breaches significantly.

However, adoption will require significant attention to security-related aspects, such as ensuring robust encryption, compliance with data protection regulations, etc.

  • Implementation complexity and skills gap. Introducing virtual mentor solutions on top of existing data is likely to require specialist AI training skills, which many organizations do not possess. In terms of the overview figure above, the company-specific layer presents the biggest challenges. This is because training material is limited (compared to the vast number of resources available on the public internet) and because training material must be curated, updated, deleted (in the case of obsolete material), etc.
  • Risk of hallucinations. AI-driven virtual mentors can produce “hallucinations”, or inaccurate answers. In a mentoring context, this can lead to confusion or misguidance and, ultimately, employees rejecting the mentor system as unreliable. The risk of hallucinations means that organizations will have to dedicate resources to quality assurance, a ticketing system for incorrect or inappropriate answers, etc.

Implications for HCM and Payroll Vendors

Generative AI will have a major impact on the field of Human Capital Management solutions. There has been a significant initial focus on the impact of generative AI on recruiting, candidate marketing, and employee performance.

However, learning and onboarding will also see massive change as a result of generative AI.

A market for the curation of Large Language Models for various industries and functional areas will appear. This could open new revenue streams for providers with strong existing domain knowledge.

As displayed in the table above, different learning delivery methods will have different sweet spots. Classroom-based learning and traditional e-learning formats will not disappear.

What will happen, however, is that many of the more general learning and onboarding tasks will transition to generative AI-based learning formats. Initially, these formats will revolve around chat-based interfaces, but over time other user experiences and communication formats will emerge.

Generative AI is an opportunity for vendors of learning and onboarding solutions. However, they will need to react fast in terms of evolving existing solutions and building in generative AI features and aspects.

Existing learning and onboarding vendors will come under pressure from new providers of virtual mentors and other related generative AI-based solutions. Generative AI is a double-edged sword for HCM vendors: a blessing for those who are willing to revisit their existing offerings, but a curse for those that fail to respond.

Bo Lykkegaard - Associate VP for Software Research Europe - IDC

Bo Lykkegaard is associate vice president for the enterprise-software-related expertise centers in Europe. His team focuses on the $172 billion European software market, specifically on business applications, customer experience, business analytics, and artificial intelligence. Specific research areas include market analysis, competitive analysis, end-user case studies and surveys, thought leadership, and custom market models.

In today’s sprawling realm of content marketing, establishing an authentic connection with your audience is no longer an option but a necessity. With the proliferation of generic content and the ever-expanding outreach channels, personalization in the digital age is a significant challenge for many marketing organizations.

The Imperative of Personalization

Imagine a bustling marketplace, with vendors competing for attention. Amid the noise, what captures your interest? It’s the vendor who remembers your preferences, understands your needs, and tailors their offerings accordingly; the vendor who shows empathy, builds trust, and inspires loyalty in customers.

In the digital sphere, content personalization recreates this personalized shopping experience. Successful content personalization entails crafting content that isn’t just broadcasted but resonates deeply with individual preferences, making each user feel valued and understood.

However, in the cacophony of content flooding digital channels, traditional one-size-fits-all strategies fall short. Personalized content goes beyond merely inserting a name; it’s about deciphering user behaviors, comprehending their wants, and delivering content that genuinely strikes a chord.

Data Drives Personalization Expansion

Until recently, marketers often employed “batch and blast” campaigns – generic messages sent out en masse, devoid of personalization. These campaigns were a mere numbers game, lacking relevance and compliance.

However, with the advent of marketing automation platforms layered atop CRM data, campaigns started gaining personal touches. Marketers could design campaigns based on limited yet more personal information. This transition marked a move from arithmetic growth – one campaign after another – to multiplicative growth, where controlled scenarios were crafted with ease. 

Marketing campaign growth based on data-driven personalization.

Nowadays, marketers have access to even more data courtesy of tools like Customer Data Platforms (CDPs). Each data point – from buyer intent to past purchases – drives exponential campaign growth, as shown in the green boxes on the right of the figure above. This data-driven reality poses challenges for downstream functional groups, who struggle to manage the influx of auto-generated, real-time campaign ideas. In this context, content curation at scale becomes a crucial driver of success – a challenge many CMOs are grappling with.

In the digital age there are exponential possibilities for content personalization. Modern CMOs need to leverage new tools – real-time data, GenAI, content management, atomic content – to engage audiences across myriad media and the deluge of distractions.
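To see why the shift from arithmetic to multiplicative growth matters, a quick back-of-the-envelope calculation helps; the dimension counts below (3 personas, 4 industries, 2 funnel stages) are invented for illustration:

    # Invented dimension counts: 3 personas, 4 industries, 2 funnel stages.
    personas, industries, stages = 3, 4, 2

    # Arithmetic growth: one hand-built campaign per dimension value.
    print("hand-built campaigns:", personas + industries + stages)   # 9

    # Multiplicative growth: one variant per combination of values.
    print("data-driven variants:", personas * industries * stages)   # 24

Every dimension a CDP adds multiplies, rather than adds to, the number of relevant variants – which is exactly why content curation at scale becomes the bottleneck.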

Breaking Down Content to Build Up Personalization

To address the need for highly customized messages, today’s CMOs are adopting innovative strategies. Taking a page from dynamic content optimization, atomic content, and other technologies, marketers are looking to break content into smaller components. This allows marketers to generate personalized assets as needed – leveraging Generative AI and real-time data where possible.

For instance, instead of creating a standard product page, marketers can create a collection of product features, each tagged with alignment attributes like seasonality, buyer stage, and demographics. These tags then facilitate reassembly for specific data-driven use cases.

Instead of developing assets far in advance for, say, a “manufacturing CFO doing research,” the system can generate personalized assets for varied, effectively infinite scenarios: CFO vs. CTO, Manufacturer vs. Retailer, Top of Funnel vs. Post-Purchase, and more. This process creates content on the fly, adapting in response to real-time inputs.
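As a minimal sketch of the idea – with invented tags, copy, and schema rather than any specific vendor’s implementation – tagged content atoms can be filtered and reassembled per audience:

    # Invented content atoms, each tagged with alignment attributes.
    ATOMS = [
        {"copy": "Cut close-the-books time by 40%.", "role": "CFO", "stage": "top-of-funnel"},
        {"copy": "Deploys alongside your existing ERP.", "role": "CTO", "stage": "top-of-funnel"},
        {"copy": "Open the ROI dashboard ahead of your renewal.", "role": "CFO", "stage": "post-purchase"},
    ]

    def assemble(role, stage):
        # Select every atom whose tags match the target persona and stage.
        picked = [a["copy"] for a in ATOMS if a["role"] == role and a["stage"] == stage]
        return " ".join(picked) or "(no matching atoms - fall back to generic copy)"

    # A "manufacturing CFO doing research" becomes one cell in a large grid:
    print(assemble(role="CFO", stage="top-of-funnel"))

The atoms stay small and reusable; Generative AI and real-time data then do the work of selecting and blending them for each scenario.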

The Evolution of Engagement: Unleashing Personalization

Modern CMOs are embracing a practical, data-driven approach to content creation. They’re breaking down content into manageable pieces and utilizing AI-powered tools to seamlessly blend these components for diverse audiences. This adaptability not only streamlines the marketing process but also empowers brands to establish significant connections, harnessing the potent power of data for tangible impact.

Content creation from a data-driven approach drives personalization, which in turn creates better user experiences built on empathetic relationships between customers and brands. These relationships are built on what the customer wants and how they want to be treated, viewed through the lens of technology. Brands need to engage with customers contextually, based on awareness, engagement, learning, and measurement.

As channels multiply and data flows in, mastering this complexity presents both a challenge and an exhilarating opportunity. It’s a chance for brands to forge profound connections and wield the potential of data to craft impactful experiences.

By diving into this complexity and making the most of available tools, brands can effectively resonate with their audience, traverse the evolving currents of content marketing, and emerge as genuine pioneers of audience engagement in the digital age.

For more information on the Future of Customer Experience, read our blog.

Roger Beharry Lall - Research Director, Marketing Applications for Growth Companies - IDC

With over 25 years’ experience leading technology-driven marketing programs, Mr. Beharry Lall is now a Research Director with IDC covering Advertising Technologies and SMB Marketing Applications. He brings a unique multidisciplinary perspective, evangelizing the innovative and pragmatic use of both martech and adtech solutions for companies of all sizes. Early in his career, Rog worked with an IBM subsidiary expanding into the Asian market and subsequently spent over a decade at RIM (BlackBerry) building marketing leadership across new industry segments, geographies, and product categories. This background fuels his perspective as he researches enterprise customer engagement tools and tactics across the unified omnichannel.

Sales enablement has emerged as a pivotal function within modern businesses, bridging the gap between sales and marketing to drive revenue growth. As the landscape of sales continues to evolve, so does the role of sales enablement. To effectively lead a sales enablement function, there are key questions that leaders must address. In this article, we’ll delve into these critical questions, shedding light on the core aspects of sales enablement and providing insights to guide your strategy.

What is Sales Enablement?

Sales enablement is the strategic process of equipping sales teams with the right resources, tools, content, and training to engage potential customers and close deals effectively. It encompasses a wide range of activities, from developing targeted content and training programs to optimizing sales processes and providing technology solutions that empower sales professionals.

Why is Sales Enablement Critical? What the Research Says

The importance of sales enablement is not just anecdotal; it’s backed by data. According to recent IDC studies (IDC 2022 Outcome Selling Advisory, IDC Survey on Value Selling Excellence), 49% of sales representatives state that pipeline development and finding qualified buyers are a challenge, and 45% struggle to move a proof of concept to a sale.

3 Areas of Focus for Sales Enablement Leaders

To lead a successful sales enablement function, it’s crucial to direct your efforts in alignment with organizational goals. Here are three areas of focus:

  1. Content Development and Management: Creating relevant and engaging content that aligns with different stages of the buyer’s journey is essential. Effective content empowers sales teams to have meaningful conversations with prospects. Collaborate with marketing to ensure a steady stream of high-quality content that addresses buyer pain points and objections.
  2. Sales Training and Development: Continuous training and skill development are imperative for a high-performing sales team. Implement a structured training program that covers product knowledge, objection handling, sales techniques, and market insights. Utilize both in-person and digital training methods to accommodate various learning preferences.
  3. Technology Integration: Leverage sales enablement tools to streamline processes and enhance efficiency. Tools for content management, CRM integration, analytics, and communication can provide real-time insights into prospect interactions, enabling sales teams to make informed decisions and tailor their approach.

A Robust Go-To-Market Strategy Is Paramount for Sales Teams Aiming To Thrive

To achieve success with a sales enablement strategy, sales teams must embrace three pivotal shifts in their approach. First, a profound comprehension of the digital journey, along with a keen understanding of key personas and their priorities, is integral to a solid go-to-market strategy. This foundation allows sales professionals to tailor their interactions with precision, ensuring relevance and resonance at every touchpoint.

Secondly, the era of product and feature selling is waning within the realm of go-to-market strategies. Modern buyers demand more—they seek a clear demonstration of how a solution can address their unique business challenges and deliver tangible value. This necessitates a shift towards solution-focused selling that revolves around solving problems rather than just promoting features as part of an effective go-to-market strategy.

Lastly, value selling takes center stage by seamlessly aligning with the buyer persona’s overarching business strategy as a component of an integrated go-to-market strategy. This approach intertwines your solution’s value proposition with the customer’s long-term goals, creating an enduring, adaptable partnership. Through value selling within the context of a comprehensive go-to-market strategy, a symbiotic relationship emerges, fostering ongoing collaboration that not only meets immediate needs but also adapts to future transformations.

As sales enablement becomes an integral part of this dynamic landscape, mastering these three tenets within your overarching go-to-market strategy ensures that sales teams can navigate complexity, drive engagement, and forge lasting connections that transcend transactional interactions.

Sales Enablement Tools

Sales enablement tools play a pivotal role in the success of your function. These tools provide automation, data-driven insights, and improved collaboration. Consider implementing:

Mastery Classes: Meticulously tailored mastery class programs offer comprehensive insight into specific priorities, personas, and use cases that align with vertical requirements and the comprehensive suite of solutions and value propositions offered by you, the vendor. Throughout IDC’s program, interactive elements are integrated to foster peer-to-peer information exchange and collaborative learning. Our program also features outcome-oriented, task-based actions, accompanied by clear directives and strategic account planning frameworks, equipping participants to seamlessly apply their newfound knowledge in the field starting tomorrow.

Digital Coaching: The advantage of digital coaching for sales reps lies in its ability to provide personalized, on-demand guidance that enhances skills, boosts performance, and adapts to the dynamic needs of each individual representative. Delivered in a user-friendly format, IDC’s pre-recorded videos or audio sessions, accompanied by informative slides and conveniently organized chapters, cater to various vertical and technology markets, while also addressing distinct profiles of target buyers. These resources can seamlessly integrate into your learning management system, creating a holistic approach that guarantees effortless access to pertinent insights for salespeople and partners alike.

Sales Playbooks: Sales playbooks offer sales reps a structured roadmap, streamlining their approach with proven strategies and best practices for more effective and consistent sales engagements. Harnessing our wealth of existing research and profound comprehension of IT buyers, we empower sales professionals with the tools to grasp the intricate landscape of a given market, whether it pertains to technology or geography. Our approach delves into market trends and drivers, unveiling insights that enable sales teams to engage in informed conversations. We illuminate the path toward solutions by demonstrating precisely how the vendor’s offerings align with and overcome these challenges. This equips sales professionals with the knowledge and confidence to articulate the value proposition coherently.

Buyer Conversation Guides: Research-based buyer conversation guides facilitate interactions that transcend traditional sales pitches, enabling sales teams to seamlessly converse with executive-level buyers. These interactions get to the very core of the challenges that organizations face, offering a panoramic understanding of their pain points and aspirations. Armed with this knowledge, sales professionals can adeptly steer the conversation towards how your solutions stand as beacons of resolution, poised to surmount these challenges and steer them towards elevated business outcomes.

Did you know? IDC has a Sales Enablement practice that empowers organizations to sell more effectively and helps connect and align your marketing and sales efforts. Browse IDC’s Sales Enablement Solutions.

Leading a sales enablement function requires addressing critical questions that shape the strategy and approach. Understanding the role of sales enablement, leveraging research-backed insights, focusing on key areas, and implementing the right tools are all essential for driving success. By aligning your efforts with the evolving sales landscape and the needs of your sales teams, you can empower them to achieve exceptional results and contribute to your organization’s growth.

The landscape of business may be seeing a seismic shift with the rise of Generative AI (Gen AI). This shift is not just the direct impact of Gen AI itself, but also how Gen AI is re-affirming the importance of AI overall and raising its profile within the business. This sea change appears as tectonic as the PC revolution of the ’80s and the smartphone revolution of the aughts. We are seeing the potential to revolutionize and disrupt industries, foster innovation, streamline operations, impact workforces, realize the promise of knowledge management, and democratize/consumerize AI in a remarkable fashion.

With this potential, it is important for organizations to navigate this transformation in a consistent, methodical, mindful fashion. This does not mean it has to be a slow laborious process, but it requires consideration to ensure responsible deployment and tangible/beneficial and scalable outcomes. This involves both technology and business stakeholders working together to quickly identify and implement short-term initiatives with a view towards longer-term imperatives and supportable environments.

Contaminated data can lead to incorrect models that fail to meet the desired outcomes. Addressing data quality, accuracy, and security challenges is a priority.

Daniel Saroff – Group Vice President of Consulting and Research

It Starts With Establishing a Foundation

Before embarking on the journey to harness the power of Generative AI, organizations need to establish a solid foundation. This foundation comprises several crucial elements:

1. Responsible AI Policy

A well-defined AI policy that outlines principles of fairness, transparency, accountability, and data protection is paramount. Ensuring explainability of AI model outputs and complying with legal/statutory regulations like GDPR are table stakes.

2. AI Strategy and Roadmap and the Role of the Proof of Concept

Crafting a comprehensive AI strategy with prioritized use cases is essential to aligning the organization’s efforts to business impact (both short and long term).

The AI strategy should include the rules or guidelines for Gen AI proofs of concept (POCs), and it should incorporate the results of the POCs to recursively improve the strategy. This enables the strategy to self-correct and refine itself for a more successful, long-term approach, and facilitate responsive decision-making.

Many organizations don’t have the skills, policies, or data to leap into large, enterprise-changing Gen AI initiatives. With focused POCs, they have the opportunity for rapid action to identify skill, data, policy, and technology gaps in an efficient fashion, with limited investment to prove out the technology.

To select POCs, use the following criteria for evaluation:

  • Value: Economic, strategic alignment, risk
  • Complexity: Data, algorithmic, system requirements, required ‘know-how’ or skills

For example, assess each proposal on value, complexity, risk, and data quality. If a proposal is high value but has poor data, don’t run the POC. Or, if one is exceptionally high value with good data but high risk, wait until you’ve built experience in Gen AI.
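A minimal sketch of that triage rule, with an invented 1-5 scoring scale and thresholds that a real organization would tailor to its own criteria:

    def triage_poc(value, data_quality, risk, has_genai_experience):
        # Screen a Gen AI POC proposal on data quality, then risk, then value.
        if data_quality <= 2:
            return "skip: even high value cannot offset poor data"
        if risk >= 4 and not has_genai_experience:
            return "defer: build Gen AI experience on lower-risk POCs first"
        if value >= 4:
            return "proceed"
        return "backlog: revisit if the value case strengthens"

    print(triage_poc(value=5, data_quality=1, risk=2, has_genai_experience=False))  # skip
    print(triage_poc(value=5, data_quality=4, risk=5, has_genai_experience=False))  # defer

Note the ordering: data quality and risk act as gates before value is even considered, which mirrors the guidance above.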

Organizations should avoid overthinking their strategy and roadmap and thereby delaying pilots of this technology.

3. Intelligence Architecture

While Gen AI POCs do not need to build a platform to support enterprise Gen AI initiatives, part of the criteria for selecting them should be how well they develop the understanding required for such a platform to exist. The architecture needs to consider how a platform can be implemented and governed, the data models (structured, evaluated, integrated) required, and integration into existing systems.

Data privacy, security, and intellectual property protection must also be embedded within this platform architecture.

4. Reskilling and Training

Most organizations do not have the mature skill sets (prompt engineering, data science, data analysis, AI ethics, modeling) required to take full advantage of Gen AI. Nurturing a workforce equipped to build and use Generative AI models is a fundamental requirement. This requires hiring (which can be costly for high-demand roles) or reskilling.

Training should also be provided broadly across the organization. Due to its potential business impact, training all staff is important to create a baseline knowledge of the benefits and risks of the technology.

Implementing programs to assess organizational readiness is key to ensuring a smooth transition. This drives an understanding of the impacts on the organization and staffing, surfaces potential cultural inertia, and proactively addresses staff concerns about the effects on their employment.

Data’s Crucial Role

Data serves as the foundation for Generative AI.

However, most organizations struggle with enterprise data. While business leaders often state that data is their most important asset, it is an asset that is frequently poorly curated, managed, understood, or analyzed. When IDC surveyed clients about their data, troubling results were revealed.

When assessing for POC selection, evaluate data quality, ease of data retrieval/access, and integration as selection criteria. Where multiple POCs can leverage common, decent-quality data sets, select them over POCs that require managing disparate data sets. There are enough challenges in the strategic leverage of Gen AI without adding data as another.

Security Concerns

Ensuring high-quality, accurate, and protected data is imperative. The integrity and privacy of data used to train AI models directly impacts their performance. Contaminated data can lead to incorrect models that fail to meet the desired outcomes. Addressing data quality, accuracy, and security challenges is a priority.

Impact on Infrastructure and Software Platforms

Generative AI’s adoption affects infrastructure and software platforms. For infrastructure, the question of investment must be answered: whether to fund these investments via as-a-service models or via more traditional capital purchases. POCs can help drive the thought process for this decision-making.

Software development lifecycles will accelerate, and low-code/no-code programming efforts will diversify code across AI-optimized architectures. This shift demands adaptable, API-enabled environments that balance portability, security, performance, cost control, and resilience.

Defining and Prioritizing Use Cases

Use cases are pivotal in driving the impact of Generative AI and in shaping strategy. Use cases typically fall into several categories:

  1. Industry-Specific: Tailored solutions, like generative drug discovery or material design, require customization and specialized data sharing. They can create substantial business value but demand unique models, integration effort, and risk management.
  2. Business Function: Integrating models with corporate data for specific departments (e.g., marketing, sales, procurement) needs careful data governance. Integration with established enterprise applications is crucial.
  3. Productivity: Basic productivity use cases range from summarizing multiple reports to code generation to RFP template creation. They often integrate into existing applications or arrive as standalone SaaS solutions or cloud-based APIs. Gen AI may also be used to accelerate the meta-tagging and categorization of enterprise data to improve data quality and retrieval. Additional productivity is gained through Gen AI’s unique ability to automate knowledge management through the synthesis of disparate data and sources across the enterprise. Knowledge management was largely unsuccessful in the first decade of this century because it asked people to work differently. Gen AI allows knowledge productivity without staff changing their work habits.

Engage your C-Suite and key leaders in collaborative sessions to uncover relevant use cases and design a realistic roadmap. Also, provide a mechanism for capturing ideas that organically bubble up from divisions or lines of business.

Think Through Your Vendor Partner Selection Carefully

In the rapidly evolving landscape of Generative AI, there’s considerable ambiguity surrounding the technology and its practical applicability. Despite this uncertainty, a few key insights have begun to crystallize, especially regarding the usage of publicly-shared foundation models and the role of cloud platform providers.

According to IDC’s recent survey, cloud platform providers are perceived as the most strategic technology partners for GenAI initiatives.

Cloud providers deliver publicly-shared foundation models, often as PaaS or SaaS, that will find their place within a subset of enterprise use cases. While these may deliver short-term advantage, they are a commodity and so are unlikely to provide long-term competitive advantage. For most organizations, lasting benefit comes from leveraging finely-tuned, domain-specific models accessible in a private or controlled manner – a current example being Microsoft’s strategic investments in generative AI technologies, making it a strong contender in this space.

In the same survey, IT consultants and system integrators emerged as a strong second. They are poised to guide organizations through the complex journey of Gen AI implementation. Organizations need to determine how well and long to use these partners.

As an initial driver for change, they are valuable. Consultants and systems integrators provide skills and tools in short supply in many enterprises. But in the long run, if Gen AI is a competitive advantage (versus a necessity), developing those skills and capabilities internally drives greater sustainable benefit.

Conclusion

Navigating the world of Generative AI requires an approach that encompasses responsible policies, sound data practices, technology understanding, a comprehensive view of use cases, and collaboration between IT and business leadership. Develop POCs with an eye to how they can help you create a consistent and defensible Gen AI moat for your organization, which can grow and evolve as strategy and competitive pressures demand. And be ready for a wild ride!

Daniel Saroff - GVP, Consulting and Research Services - IDC

Daniel Saroff is Group Vice President of Consulting and Research at IDC, where he is a senior practitioner in the end-user consulting practice. This practice provides support to boards, business leaders, and technology executives in their efforts to architect, benchmark, and optimize their organization's information technology. IDC's end-user consulting practice utilizes our extensive international IT data library, robust research base, and tailored consulting solutions to deliver unique business value through IT acceleration, performance management, cost optimization, and contextualized benchmarking capabilities.

In the digital business era, transformative advancements have reached unprecedented heights, driving rapid digital transformation and widespread cloud adoption across industries. This transformation has profoundly impacted customer experiences, enabling companies to offer seamless, personalized, real-time interactions across multiple touchpoints. By leveraging digital technologies and cloud capabilities, enterprises can create meaningful and engaging experiences that set them apart in the competitive digital economy.

However, this shift to cloud-based solutions has also led to an expansion of attack surfaces, creating newer areas of vulnerability. From smartphones and tablets to IoT devices and wearables, the proliferation of interconnected devices has resulted in a complex and vast digital landscape, each representing a potential entry point for cyberattacks.

Cyberthreat Landscape in Asia/Pacific

Cyberattacks worldwide are escalating at an alarming rate, becoming highly targeted and sophisticated. Cybercriminals continuously develop more intelligent methods to exploit vulnerabilities, steal sensitive data, or demand ransom. Securing everything from connected applications to critical infrastructure becomes more challenging, making it easier for attackers to find vulnerabilities to exploit – including via bots, which are used for both legitimate and malicious purposes. As a result, businesses face frequent, targeted, and complex cyberattacks, leading to significant financial burdens, customer attrition, and damage to brand reputation.

The Asia/Pacific and Japan (APJ) region has seen a surge in cyberattacks, with a cyberthreat landscape that is intricate and constantly evolving. The region is influenced by geopolitical tensions, rapid digitalization, and the growing expertise of cybercriminals and state-sponsored hackers. According to IDC’s 2023 Future Enterprise Resiliency and Spending (FERS) Survey, Wave 2, a staggering 59% of enterprises in APJ fell victim to ransomware attacks in 2022, and 32% ultimately paid the ransom. Of these, Australia, New Zealand, Singapore, and India were the worst affected markets. Among the affected businesses, 97% reported that the impact lasted from a single day to several weeks. This signals that now is the opportune moment for enterprises to strategically invest in cutting-edge technologies for proactive threat detection and decisive attack mitigation.

Significant Advancements in Threat Detection and Response

Today’s cyberthreat landscape has led to the emergence of EDR (Endpoint Detection and Response) and XDR (Extended Detection and Response) solutions backed by MDR (Managed Detection and Response) services to detect and respond to cyberthreats. Early detection allows organizations to prevent or limit the damage caused by attacks, reducing data loss and minimizing the attack’s impact. According to IDC’s 2023 FERS Survey, Wave 2, 71.5% of the surveyed enterprises in APJ mentioned that threat detection and response tools, including EDR, NDR, and SIEM (Security Information and Event Management), helped them detect attacks before intruders had a chance to act.

EDR has become essential in enterprise cybersecurity strategies, used by organizations of all sizes and industries to protect their endpoints from cyberthreats. MDR services offer a comprehensive approach to shielding businesses from advanced and frequent cyberthreats, delivered by experienced cybersecurity experts in a 24x7 remote SOC with cutting-edge solutions and hands-on support. As per IDC’s Asia/Pacific IT Services Survey, 2022, a majority of enterprises stated that the most important capability they seek in an MDR provider is the ability to effectively integrate network and endpoint at the architectural level, for enhanced visibility into assets and proactive threat detection across all surfaces. They also require an MDR provider to offer strong analytical capabilities. Some enterprises indicate the need for a third-party analytical platform that can absorb inputs from web, email, network, and endpoints as well as cloud, and deliver a comprehensive threat analysis. This is reinforced by the need for proactive threat hunting for knowns and unknowns, including third-party risk assessments from all sources, as well as a well-suited, integrable range of threat detection and response offerings.

Enterprises are now directing their investments towards XDR solutions, empowering them to identify and effectively counter threats across networks, endpoints, and cloud environments. With advanced analytics, XDR solutions can form complex correlations between relevant data sources, reducing false positives and improving threat detection. IDC emphasizes that every XDR solution should include EDR capabilities, which can be enhanced with NDR (Network Detection and Response), integration with external threat intelligence, and an underlying log management backplane providing alerts from virtualized resources over the cloud.

XDR solutions must also incorporate a SOAR solution for workflow management, DDoS security, a WAF, web and email defense, identity and access management (IAM), data loss prevention (DLP), workload management, and FIM. XDR platforms are known for their scalability, reliability, extensibility, and modularity. While many XDR tools are cloud-based, some organizations prefer dedicated or on-premises solutions or a hybrid approach due to concerns about public cloud environments. Regardless of the chosen approach, a cloud-based XDR solution offers accessibility and flexibility for experts and analysts working in hybrid setups. A comprehensive XDR solution – much in demand these days – assists enterprises with threat quarantine, automated and manual remediation, alert escalation, reporting, and forensic analysis, and must be a focus area for security service providers looking to cater to future enterprises.

Proactive detection and response may not always be sufficient, particularly as cybercriminals adopt multi-vector approaches. The threat landscape’s complexity has led to the evolution of threat detection, including signature-based and behavior-based detection, threat intelligence, automation and orchestration, integration with incident response, and deception technology.

Using AI for Threat Intelligence (TI)

AI-powered threat hunting leverages ML and data analytics to uncover hidden patterns and anomalies, improving the identification of potential threats. Businesses are now investing in threat hunting solutions that deploy AI/ML capabilities to predict threats based on historic patterns, addressing known and unknown threats with relevant insights and minimal false positives using comprehensive security analytics. AI’s relationship with threat detection and response is symbiotic, enabling more accurate and efficient threat detection, facilitating faster incident response and remediation, and empowering security analysts with advanced tools to proactively hunt for threats.

The potential use cases for threat intelligence are a significant leap forward compared to detection and response strategies. A prime example is identifying adversaries, a captivating aspect of threat intelligence, as it traces known threat vectors back to the responsible miscreant, be it a cybergang or a nation-state-sponsored attacker. Moreover, threat intelligence platforms can collect and correlate data from in-house security tools, including SIEM, UEBA, IDS/IPS, and antivirus software. This grants insights, validates possible insider threats, and supports external intelligence for forensic investigations.

The ever-evolving threat intelligence feeds necessitate consistent cross-referencing with up-to-date IoCs, such as behaviors, tactics, exploits, and open source code vulnerabilities. Here, automation plays a pivotal role in artifact collection, thereby ensuring accuracy. Additionally, there are times when unmanaged devices within a network can become inadvertent targets for attackers due to misconfigurations, incomplete patch management, or other issues. Threat intelligence also mitigates the challenges of shadow IT or enhances detection across data graveyards.

Remarkably, specific threat intelligence solutions cater to industrial control systems, APT intelligence, crime, and forensics intelligence. In the transportation industry, enterprises are leveraging threat intelligence to proactively prepare for attacks and fortify their infrastructure. Notably, a major Indian insurance company utilized threat intelligence to thwart INR 3.4 billion worth of fraud across various domains, integrating AI technology to enhance the fraud investigation process.

In the current landscape, establishing strategic partnerships between threat intelligence vendors and service providers holds the utmost significance. Enterprises seeking relief from financial and operational burdens desire consolidated service offerings. This market shift calls for security service providers to offer comprehensive solutions, including SOC, vulnerability assessments, incident management, and threat intelligence. Cultivating strong and strategic partnerships is pivotal for ensuring a unified, all-encompassing approach that aligns with evolving customer demands. Additionally, collaborative partnerships between security vendors and service providers aimed at delivering advanced threat intelligence capabilities and solutions by seamlessly blending global threat data with localized insights will offer a robust framework and a comprehensive perspective to potential clients on threats that hold significance in their unique operational context. This synchronized approach empowers organizations to stay ahead of evolving cyber risks and enhance their security posture.

Advice for Enterprises

For enterprises looking to adopt or elevate their threat detection and response capabilities, efforts to reduce dwell time typically start with EDR. However, sophisticated attacks often encompass more than just endpoints, necessitating the adoption of XDR as the next evolutionary step. Technology buyers are advised to assess their requirements and then invest in the many advancements that AI/ML models are bringing to threat hunting and threat intelligence as an extension of detection and response.

When selecting threat intelligence vendors, technology buyers should prioritize those offering contextual insights that align with their industry and environment. It’s crucial to assess vendors on their ability to provide actionable insights, enabling proactive defense strategies and swift responses to emerging threats. Integration capabilities are key, ensuring seamless collaboration with existing security tools and infrastructure. Look for vendors who blend global and local threat data to offer a comprehensive perspective, and consider their automation and data enrichment capabilities to enhance threat detection accuracy. Scalability is essential to accommodate growth and evolving threat landscapes. Additionally, evaluate vendors using real performance metrics such as Mean Time To Detect (MTTD) and Mean Time To Respond (MTTR) to ensure their effectiveness in rapid threat identification and resolution.
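For readers new to these metrics, a small illustrative computation (with invented incident timestamps) shows how MTTD and MTTR are typically derived: MTTD averages the gap between occurrence and detection, and MTTR the gap between detection and resolution.

    from datetime import datetime
    from statistics import mean

    # Invented sample incidents with occurrence, detection, and resolution times.
    incidents = [
        {"occurred": datetime(2023, 9, 1, 2, 0), "detected": datetime(2023, 9, 1, 8, 0),
         "resolved": datetime(2023, 9, 1, 20, 0)},
        {"occurred": datetime(2023, 9, 3, 9, 0), "detected": datetime(2023, 9, 3, 10, 30),
         "resolved": datetime(2023, 9, 3, 15, 0)},
    ]

    def hours(delta):
        return delta.total_seconds() / 3600

    # MTTD averages (detected - occurred); MTTR averages (resolved - detected).
    mttd = mean(hours(i["detected"] - i["occurred"]) for i in incidents)
    mttr = mean(hours(i["resolved"] - i["detected"]) for i in incidents)
    print(f"MTTD: {mttd:.1f} hours, MTTR: {mttr:.1f} hours")  # 3.8 and 8.2 hours

Tracking these numbers over time, rather than as one-off snapshots, is what makes them useful for comparing vendors.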

This approach ensures that your chosen threat intelligence vendor aligns with your organization’s unique needs and contributes to a robust security posture. It is also essential to note that AI is not a silver bullet and should be used alongside human expertise, as security analysts play a critical role in validating and interpreting AI findings within the organization’s environment to make informed decisions regarding threat response and mitigation. The collaboration between AI and human intelligence undoubtedly bolsters an organization’s overall security posture.

Get insight on adoption and perception of threat intelligence solutions by Indian enterprises in this on-demand IDC webinar here.

Interested in how enterprises should strategize their investments moving forward?

Sakshi Grover - Senior Research Manager - IDC

Sakshi Grover is a senior research manager for IDC Asia/Pacific Cybersecurity Services, supporting its research and client engagement activities across Asia/Pacific markets. Additionally, she serves as the lead security analyst for IDC India. Sakshi is responsible for delivering syndicated custom research and consulting engagements on next-generation emerging and disruptive technologies. Her tasks include developing and socializing IDC's point of view within security services, covering both legacy and modern cybersecurity technologies. Her role involves close collaboration with technology vendors and buyers, developing market insights, and providing research, consulting, and advisory services in the fields of security software and services. This includes partnering on research efforts with relevant country analysts in the local IDC offices. Sakshi's views on security have been quoted in numerous publications, such as the Economic Times, Business Standard, Data Quest, CRN, and others.

In a world inundated with AI buzz, are you feeling overwhelmed by the incessant chatter around artificial intelligence? As the AI frenzy reaches its zenith, it’s imperative that we take a collective breath and evaluate how to achieve a positive impact for our organizations. Let’s pause and reevaluate the landscape.

Neil Ward-Dutton, one of our distinguished IDC analysts, once aptly noted that AI appears as an enchanting spell until we unravel its inherent limitations and complications. And limitations there are many, regardless of how you approach the matter. A considerable portion of these limitations stem from the data that fuels AI’s algorithms: for AI systems developed internally, the scarcity of high-quality data in most organizations often proves to be a stumbling block; and for generative AI systems that leverage pre-built “foundation models” already trained on external data sources, lack of transparency about the provenance and quality of those data sources creates a number of risks.

While generating training data is an option, recent observations indicate that excessive training can actually yield adverse outcomes. And that’s just the tip of the iceberg; the complications extend into the realms of bias and ethical quandaries.

Dispelling any illusions, let’s be clear – acquiring a software package won’t instantaneously immerse your organization into the realm of AI excellence. Remember the CIO who, a few years back, joyfully declared hiring two data scientists as the path to “getting AI”? Upon inquiry about their role and the benefits they’d bring, he confessed, “I’m not a data scientist; I don’t know.” Such anecdotes underscore the essence of the issue.

Presently, history seems to be repeating itself. When asked about AI’s potential, responses often resemble, “I don’t know; I’ll ask the AI.” This reveals a common theme – many are intrigued by AI-enabled possibilities, but grasping its tangible advantages remains a challenge. As the chasm between curiosity and efficacy widens, it begs the question: is the investment in perfecting AI worth the monumental effort?

We have been defining what is needed to be successful with AI and the steps needed to make it work. We invite you to come and discuss with your peers which steps make sense and where to hold back investment. By the end of the session, you should have insight into the value you can realistically derive from AI over the next three horizons and what the pitfalls may be.

 

Join the CIO ThinkTank on September 28th: from 5:00 to 6:00 PM CET, we invite you to participate in the CIO ThinkTank – an open dialogue among peers.

In A CISO’s Guide to Artificial Intelligence, we view artificial intelligence as providing advisory, enhanced service, and semiautonomous cybersecurity defense functionality based on a range of structured and unstructured data, including logs, device telemetry, network packet headers, and other available information.

Simply put, AI is the application of statistics to solve cybersecurity problems. The goal is to create analytics platforms that capture and replicate the tactics, techniques, and procedures of the finest security professionals; democratize the traditionally unstructured threat detection and remediation process; or complete a range of near-real-time automated detection and response techniques that a security professional could theoretically replicate – but only long after it was far too late.

As AI continues to promise simplicity in the face of the complexity of today’s security environment, it will be helped by the homogeneity of data.

Frank Dickson – Group Vice President, Security and Trust

However, in our opinion, our collective focus is in the wrong place. The hype and conversation focus on AI. Why not? The possibilities of AI inspire the imagination, illuminating the possible. But the key to enabling outcomes in security is not the AI; it is the data. Many children are inspired by the power and girth of locomotives. The potential of the locomotive, though, relies on the boring and tedious process of laying the tracks and the enabling infrastructure. Likewise, data is the enabling infrastructure for security AI. Three characteristics determine success:

  • Data framework structures
  • Data management
  • Data curation

Data Framework Structures

As we look to use artificial intelligence to unlock the potential and promise of – for example – extended detection and response (XDR), creating frameworks and structures is critical. The most basic definition of XDR is:

  • The collection of telemetry from multiple security tools
  • The application of analytics to the collected and homogenized data to arrive at a detection of maliciousness
  • The response to and remediation of that maliciousness

As we look to apply analytics to the collected and homogenized data to detect maliciousness, AI needs structure to be able to look at the data at scale. After all, AI is really no more than a mathematical model of the relationships within the data. Telemetry optimized for a point use case, such as a firewall’s perimeter-centric network defense, is of little use if you cannot relate it to other data sets, such as identity, and if it is not framed in a way to achieve an end goal.

As we discussed the value of event sequencing as a core attribute of most detection and response offerings, much of that value was unlocked by application of the MITRE ATT&CK framework. Not only does the framework provide structure to the task of threat detection by mapping to the cyber kill chain, but it also creates a manner in which different tools from different vendors can structure data and prepare it for analysis.
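As a simple illustration of that idea – with invented events and an invented mapping table, since real mappings ship with each tool’s detection content – telemetry from different vendors can be normalized by tagging each event with a shared ATT&CK technique ID:

    # Invented mapping from vendor-specific signals to ATT&CK technique IDs.
    ATTACK_MAP = {
        "powershell_spawned": "T1059.001",  # Command and Scripting Interpreter: PowerShell
        "lsass_memory_read": "T1003.001",   # OS Credential Dumping: LSASS Memory
    }

    def normalize(event):
        # Attach a shared ATT&CK technique ID to a vendor-specific event.
        return {
            "host": event["host"],
            "technique": ATTACK_MAP.get(event["signal"], "unmapped"),
            "raw": event,
        }

    edr_event = {"host": "wks-042", "signal": "powershell_spawned", "vendor": "edr-a"}
    print(normalize(edr_event))

Once every tool’s events carry the same technique vocabulary, sequencing them along the kill chain becomes a data problem rather than a vendor-by-vendor translation problem.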

Data Management

Data has weight and gravity, and security data has a lot of weight. For example, a typical endpoint protection platform agent will produce 150-200MB of data a day. The movement, storage, and management of such data quickly creates a problem of scale, and data retention policies can thus quickly become divisive topics.
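A back-of-the-envelope calculation shows the scale problem; the per-agent figure comes from the paragraph above, while the fleet size and retention window are invented for illustration:

    # Per-agent figure from above; fleet size and retention are invented.
    endpoints = 10_000
    mb_per_endpoint_per_day = 175      # midpoint of the 150-200MB range
    retention_days = 90

    daily_tb = endpoints * mb_per_endpoint_per_day / 1_000_000
    print(f"~{daily_tb:.2f} TB ingested per day")                           # ~1.75 TB
    print(f"~{daily_tb * retention_days:.0f} TB held at 90-day retention")  # ~158 TB

At roughly 1.75TB per day for a modest fleet – and over 150TB on hand at a 90-day retention policy – it is easy to see why retention becomes contentious.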

In addition, only with AI can the increasing pools of telemetry be put to their very best use. ML has limits, but by using AI to train for previously unseen patterns and new lenses on the data, key metrics (time-to-X) can be reduced in a truly significant way.

Data weight has become a competitive differentiating tool. For example, the move by the infrastructure-as-a-service (IaaS) vendors to retain their own cloud logs at no or very low cost is significant, as SIEM is often priced based on the volume of data ingested, and the SIEM vendors cannot simply “eat” the cost of ingesting and storing voluminous cloud logs. Analysis needs to happen on the native format in a predictable manner. The entire business model of SIEM, XDR, and other analysis platforms thus is increasingly challenged and is changing based on the weight of data.

Data Curation

In a world where every vendor has a different data structure, curating heterogeneous data sets into a homogeneous form for analysis is an extra step, and a potentially onerous one depending on the computation and scale required. As AI continues to promise simplicity in the face of the complexity of today’s security environment, it will be helped by the homogeneity of data; where that homogeneity is missing, curation becomes an inhibitor.

Restructuring data takes time and costs money. Large vendors with broad portfolios therefore have an advantage: multiproduct, single-platform offerings save time and cost because a larger share of their multi-technology data sets is already homogeneous.

Overcoming the data curation problem is the objective of many standards. For example, Structured Threat Information Expression (STIX) and Trusted Automated eXchange of Indicator Information (TAXII) were developed by MITRE in its role as operator of a U.S. Department of Homeland Security FFRDC. STIX is a common language for threat intelligence, so intel can be shared and machine-read by any tool that supports it. TAXII is the application-layer protocol designed to simplify the transmission of threat intel data. In 2015, STIX/TAXII development was moved to the OASIS international standards organization. Today, the work is free, open, and community-driven.
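
For a taste of what the common language looks like in practice, here is a minimal STIX 2.1 indicator built with the open-source `stix2` Python library (pip install stix2); the domain in the pattern is fabricated.

```python
# A minimal STIX 2.1 indicator; any TAXII-capable tool could consume the JSON.
from stix2 import Indicator

indicator = Indicator(
    name="Known C2 domain",
    description="Domain observed in command-and-control traffic.",
    pattern="[domain-name:value = 'bad.example.com']",  # fabricated IOC
    pattern_type="stix",
    valid_from="2023-08-01T00:00:00Z",
)

# serialize() emits the vendor-neutral JSON that TAXII transports.
print(indicator.serialize(pretty=True))
```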

We would be remiss if we did not mention the Open Cybersecurity Schema Framework (OCSF) here and its significance to AI. Normalization of hybrid, multicloud security telemetry is needed before any converged data is useful. The goal of OCSF is to simplify the exchange of data between the tools that ingest, manage, and enrich it, because every organization has a cornucopia of solutions purchased over the past half dozen years. OCSF offers a single format, so those getting started do not have to write data connectors for every solution. The real story here is one of simplicity, which is the holy grail of cybersecurity solutions.
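
In that spirit, the sketch below shows the kind of normalization OCSF standardizes: two vendors' login events mapped into one shared shape so downstream analytics needs a single connector. The field names are simplified stand-ins, not the actual OCSF attribute set.

```python
# Illustrative schema normalization: two vendors' login events mapped
# into one common shape before analysis. Field names are simplified.
def from_vendor_a(e: dict) -> dict:
    return {"time": e["eventTime"], "user": e["userName"],
            "src_ip": e["sourceIp"],
            "status": "success" if e["ok"] else "failure"}

def from_vendor_b(e: dict) -> dict:
    return {"time": e["ts"], "user": e["actor"]["id"],
            "src_ip": e["net"]["client_ip"], "status": e["outcome"]}

normalizers = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

raw = [
    ("vendor_a", {"eventTime": "2023-08-01T09:00:00Z", "userName": "mlopez",
                  "sourceIp": "203.0.113.7", "ok": False}),
    ("vendor_b", {"ts": "2023-08-01T09:00:05Z", "actor": {"id": "mlopez"},
                  "net": {"client_ip": "203.0.113.7"}, "outcome": "failure"}),
]

# One schema, two sources: ready for a single analytics pipeline.
unified = [normalizers[vendor](event) for vendor, event in raw]
print(unified)
```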

So what? What Does This Mean to YOU?

Look. Every cybersecurity vendor is going to roll out a generative AI interface for their tools, and they should. It is the fourth generation of the user interface; it is significant. A vendor without one will be conspicuous. By the end of 2023, every tool of relevance will have one; tools without one will likely become irrelevant or subservient to those that do. The ability of the tool to create outcomes in your environment, however, will be determined not by the power of generative AI but by the data and the predictive AI models behind the generative AI. It’s Not About the AI; It’s About the Data.

Frank Dickson - Group Vice President, Research - IDC

Frank Dickson is the Group Vice President for IDC's Security & Trust research practice. In this role, he leads the team that delivers compelling research in the areas of AI Security; Cybersecurity Services; Information and Data Security; Endpoint Security; Trust; Governance, Risk & Compliance; Identity & Digital Trust; Network Security; Privacy & Legal Tech; and Application Security & Fraud. Topically, he provides thought leadership and guidance for clients on a wide range of security topics including ransomware and emerging products designed to protect transforming architectures and business models.

Interested in account-based marketing? Be sure to check out IDC’s on-demand webinar The Company is the Key: How Account-Level Intelligence Helps You Gain Share.

Why Competitors Matter to Your Account-Based Marketing Effort

Account-based marketing (“ABM”) is a strategic B2B marketing approach that targets a single company, division, or individual within a company. As such, it deploys far more targeted tactics than general marketing, designing campaigns around names and emails, individualized value propositions, and highly specific personas.

If your firm is engaged in ABM, it’s guaranteed your competitors are as well. This means you need to know what they are saying, how they are positioning themselves, and how they are engaging their prospects and clients so you can better align your own efforts.

Simply identifying the competitors you need to analyze can be a formidable first task in crowded IT markets. Granted, for some technology areas, the number of true players is small enough that everyone knows who they are. For instance, in the Canadian market for notebooks, five manufacturers hold more than 85% of the market value.

Identify Key Competitors in Busy IT Markets

For many IT markets, however, the list of competitors is long. For instance, in the U.K., IDC tracks more than 100 players in the market for financial applications. While the top 10 control around 58% of the market, the next 10 control less than 15%, leaving lots of room for ambitious software houses.

In China, the market for human capital management software is wide open, with the largest supplier holding less than 10% of the market. In the U.S., the market for custom application development is both enormous and fragmented, with the top 10 players accounting for only one-third of the value. In all three markets, there are lots of fast-growing players aiming to break into the top 10.

Figure 1: Market Analysis Example

To do ABM right, you need to identify the largest players and the fastest growing players both at the top of the market and in your revenue range. You then analyze their strategy and tactics for best practices and pitfalls.

* Only one supplier grew; the other was simply less negative.
** There are only three vendors in the top 11–20; only one grew, while the other was simply less negative.

Source: IDC, 2023

Focus Attention on Priority Competitors

With so many tech suppliers, it can be hard to know on which of your competitors to focus your analysis. This is where market data comes in. IDC believes there are three primary ways you can use data to identify competitors worth your scrutiny:

  • Competitors Outperforming the Market: While you will already be aware of your largest competitors, it can be extremely useful to rank them by share. This reveals who has the most visibility and the messaging and approach you’ll need to position yourself against. You should also rank them by growth, as this is a strong indicator of the effectiveness of their go-to-market strategy, including ABM. For instance, in the U.K. market for financial applications, only two of the top 5 gained share in 2022. The rest lost share.
  • Fast Growing Competitors In or Near the Top 10: IT suppliers that are rapidly gaining share are doing something right. For smaller companies, a good year can give the illusion of exceptional growth. IDC therefore recommends looking at the fastest-growing suppliers in the top 20–30 (depending on the market), as these organizations are usually large enough to be dangerous. Returning to the U.K. market for financial applications, half of the top 20 software providers expanded much faster than the market; it’d be a good idea to catalog their ABM strategy and tactics for best practices.
  • Fast Growing Competitors In Your Revenue Range: If you are among the top performing tech suppliers or a fast-growing company nearing the top 10 or 20, the two points above have you covered. But if you are further down the list, identifying which firms in your revenue range are growing fast tells you who to watch out for — and perhaps who to emulate when it comes to ABM. In Germany, IDC tracks around 70 firms trying to steal share from SAP in the supply chain management space. In 2022, in the $2–5 million revenue range, five beat the market by significant margins. If you were in that range, these five would be worth examination.

In short, the right data can help you quickly identify which of your competitors to analyze for ABM best practices and the positioning and messaging to set yourself apart.
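
As a minimal illustration of the ranking exercise described above, the sketch below computes share and growth from revenue data and flags the competitors beating the market; all figures are fabricated.

```python
# Rank competitors by 2022 share and growth; flag those beating the market.
import pandas as pd

df = pd.DataFrame({
    "vendor": ["A", "B", "C", "D", "E"],
    "rev_2021": [120.0, 95.0, 60.0, 22.0, 18.0],   # $M, fabricated
    "rev_2022": [118.0, 104.0, 61.0, 30.0, 17.5],  # $M, fabricated
})

df["share_2022_pct"] = 100 * df["rev_2022"] / df["rev_2022"].sum()
df["growth_pct"] = 100 * (df["rev_2022"] / df["rev_2021"] - 1)
market_growth = 100 * (df["rev_2022"].sum() / df["rev_2021"].sum() - 1)

# Competitors outperforming the market are the ones to catalog first.
watchlist = df[df["growth_pct"] > market_growth].sort_values(
    "growth_pct", ascending=False)
print(f"Market growth: {market_growth:.1f}%")
print(watchlist[["vendor", "share_2022_pct", "growth_pct"]])
```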

IDC Company Lens provided the data for this post.

Get Started With ABM Resources and IDC Data

ABM planning can be a time-consuming and challenging process to get right, especially the first time. To help organize your thinking and make key decisions, you can use this account-based marketing starter guide. This step-by-step guide helps bring marketing and sales teams together to develop a cohesive ABM campaign by asking the right questions and identifying the insights necessary for planning.

Whether you are approaching ABM from the perspective of marketing or sales, and through direct or indirect business channels, in today’s economic climate objective insight and expert advice about buyers, partners, and competitors are vital to inform and accelerate decision making, campaign production, and account planning cycles. IDC Data & Analytics offers a broad array of solutions detailing the company and ecosystem dynamics of the global tech market that matter most for answering critical ABM planning and execution questions.

To get in contact with us and book a demo, please reach out here.

Wall Street’s top regulator has adopted new cybersecurity rules that require companies to disclose a cyber breach within four days of determining that the breach is material. The 96-hour requirement has been on the table for months, but the materiality qualifier puts a critical onus on boards and CISOs to get specific about their cyber-risk tolerance.

Until now, breach notification has been driven primarily by regulations or industry rules requiring notification “without unreasonable delay.” That afforded a fair amount of latitude within which to understand and assess a situation and then determine the most appropriate path forward. The new SEC rules raise the bar for publicly traded companies, demanding not only that they know an incident has occurred but also that boards quickly get fact-based in the context of materiality.

Any situation where shareholders would consider the breach important, or where there is significant potential impact on the company’s financial position, operations, customer relationships, or reputation, would clearly be material. Often, though, data breaches and other cyber incidents are more nuanced. For example, a data breach that impacts a small number of customers, or a denial of service (DoS) attack that impacts a single small location and is quickly remediated, might not be considered material for SEC reporting purposes.

In 2022, for example, there were an estimated 490 million ransomware attacks, and Microsoft said that it mitigated an average of 1,435 DoS attacks a day; most of those incidents would probably not meet a standard of materiality. While we always must be beyond reproach on reporting, we should not fall into the trap of launching a disclosure cycle only to find out that the incident was not, in fact, material.

Here are best practices to follow to ensure compliance with the new SEC rule.

Consider Materiality as a New and Critical Element of Cyber Oversight

The interpretation of materiality should come from the board, in the form of clear risk tolerance guidelines. Defining risk tolerance is a normal practice at the board level. Use scenario-based analysis to clearly define the triggers that would push an incident into the SEC’s four-day window, including the scenarios below; a sketch of how such triggers might be codified follows the list.

  • Customer data: If the breach impact is known, contained, and minimal, is it material?
  • Operational impact: If a subset of operations is impacted, and impacts can be contained and recovered, is it material?
  • Reputational risk: If a disclosure occurred where there was a small impact and the awareness and response are beyond reproach, is that material?
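
The sketch below shows one hypothetical way such board-approved triggers could be codified, so the incident team checks facts against pre-agreed tolerance instead of debating thresholds mid-incident; every threshold and field here is invented.

```python
# Hypothetical board-defined materiality triggers encoded as data.
from dataclasses import dataclass

@dataclass
class Incident:
    customers_affected: int
    sites_down: int
    data_encrypted: bool    # encrypted data generally softens notification duty
    fully_contained: bool

# Board-approved tolerance (illustrative numbers only).
TOLERANCE = {"max_customers": 500, "max_sites_down": 1}

def likely_material(i: Incident) -> bool:
    """True if the incident exceeds board-defined risk tolerance."""
    if i.customers_affected > TOLERANCE["max_customers"] and not i.data_encrypted:
        return True
    if i.sites_down > TOLERANCE["max_sites_down"]:
        return True
    return not i.fully_contained

print(likely_material(Incident(customers_affected=40, sites_down=1,
                               data_encrypted=True, fully_contained=True)))   # False
print(likely_material(Incident(customers_affected=9_000, sites_down=0,
                               data_encrypted=False, fully_contained=True)))  # True
```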

Understand How the Board of Directors Interprets ‘Materiality’

Neither the CISO nor the technology team should be responsible for determining or interpreting materiality. What matters under the new SEC rules is very much subject to interpretation, so the team needs to know in advance how the board wants “materiality” to be interpreted.

In addition, be mindful of opportunities to proactively stay within approved risk tolerance. For example, notification is generally not required for encrypted data, so take advantage of encryption; it continues to be your best defense. If you have not already encrypted sensitive personal information, consider starting with the data that is most exposed from the board’s risk tolerance perspective.
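
As a minimal sketch of field-level encryption, the example below uses the open-source `cryptography` package (pip install cryptography). In practice, key management via a KMS or HSM is the hard part; it is reduced here to a single in-memory key purely for illustration.

```python
# Encrypt an exposed field so a breach discloses ciphertext, not the value.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in production, store and rotate via a KMS/HSM
fernet = Fernet(key)

record = {"customer_id": "C-1042", "ssn": "000-00-0000"}  # dummy data
record["ssn"] = fernet.encrypt(record["ssn"].encode())    # encrypt the field

print(record)                                  # ciphertext at rest
print(fernet.decrypt(record["ssn"]).decode())  # authorized read path
```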

Ensure that You Have the Data to Assess, Monitor, and Report in the Context of the Approved Risk Tolerance

Plan around the defined risk tolerance so you know exactly how to bring together the necessary data to monitor and report. Then build the capability to produce a clear, concise, and meaningful report for management and the board in an incident situation. Develop communications templates in advance, including models for reporting on progress and on incident closure with a consistent notification and reporting cadence. Understand how you would report against each of the risk tolerance elements, and exercise the data sources to know how those boundary conditions will be tested and reported on.

The new SEC rules raise the risk that the board will be distracted by the clock in the heat of a cyber incident. Time pressures make it easy to say too much or to elaborate beyond what is required. By planning the critical data strategy beforehand and using templated communications to share the right message, you can ensure that nothing is missed but the situation is not exacerbated by oversharing.

We look forward to learning more as the SEC rules are absorbed, and sharpening our thoughts and guidance as more details emerge.

Alizabeth Calder - Research Adjunct Advisor - IDC

Alizabeth Calder, an adjunct analyst with IDC's IT Executive Programs (IEP), is the former CIO of HomeEquity Bank, a contributing writer to IT World Canada, and a best-selling author and sought-after keynote speaker. She focuses on bridging the gap between the technology sector and the leaders who provide the governance and investment needed to succeed.

Enterprise leaders now see digital technology and capabilities as foundational to innovating and succeeding in the digital business era. However, as enterprises continue to navigate economic uncertainty, we are seeing a greater emphasis on achieving clear business outcomes from technology spending.

The growing complexity and pervasiveness of technology within enterprises is also driving expectations for faster time to value. As a result, IDC sees a greater need for clarity on technology investment priorities, resource allocation, and the path to achieving business outcomes.

These issues are far from confined to IT; they now extend upward into the remit of the C-suite and affect all functional areas. Indeed, 44% of CEOs have told IDC they need help with their digital business strategies. Moreover, a lack of skills, both within the C-suite and across the organization, remains a key hurdle to achieving business outcomes from digital initiatives.

Unveiling IDC’s FoX Scorecard Unique Value Proposition

One of the tools IDC believes will be instrumental in enabling business and IT leaders to navigate these obstacles is the IDC Future of “X” (FoX) Scorecard. The FoX Scorecard provides IDC clients with proprietary data and research to:

  • Understand Future Enterprise capabilities and their correlation to business outcomes.
  • Compare performance relative to peers and leading enterprises.
  • Identify areas for resource and process optimization and investment.
  • Implement recommendations from best-in-class organizations.

The FoX Scorecard methodology brings together the insights of our worldwide FoX analysts, who have a deep understanding of the capabilities required to become a Future Enterprise, with the analysis expertise of IDC’s quantitative survey team:

  • The FoX research framework explains the processes, organizational structures, and enabling technologies that empower enterprises to achieve their top business goals.
  • The FoX Scorecard survey data identifies the investments, readiness, and performance across diverse enterprises across the globe.

For example, IDC’s Future of Work Agenda research reveals that organizations are struggling to implement the right balance of on-site and flexible work practices, to understand the best practices for maintaining company culture, and to determine how automation can make employees more productive.

Learn from Leading Organizations

The recently published IDC Future of Work Scorecard compares the approaches and technology deployments that differentiate “Leading” enterprises from their peers: for example, best practices in hardware, software, and services investment that drive increased operational efficiency, improved employee productivity, and cost savings. These improvements arise not simply from a single investment or deployment. They occur across key areas of work augmentation that in turn shape work culture as it evolves in the office, remotely, and in the spaces in between. At the most advanced level, enterprises that lead in the culture, space, and augmentation pillars quickly embrace work transformation and new ways of working.

IDC’s analysis reveals that only 11% of worldwide enterprises are at the “Leading” stage. KPIs such as employee and customer satisfaction, quality scores, improved skill levels, innovation, and task-based metrics are important to these Leading enterprises. The Scorecard shows the stark, measurable differences in positive business outcomes achieved by Leading organizations compared to all others, especially those that are nascent in their work transformation journeys. The gap between Nascent and Leading enterprises points to clear areas for improvement.

Another example comes from IDC’s Future of Connectedness agenda program. The Future of Connectedness Scorecard analysis reveals that just 8% of enterprises are at the Leading stage, highlighting the connectivity technology areas of investment that most enterprises need to accelerate innovation. The insights from this Scorecard identify areas for optimization and investment across three capabilities: (1) Connectivity Transformation, (2) Services Enablement, and (3) Contextual Experiences. The results show the largest gap between the Nascent and Leading enterprises today is in their ability to use real-time insights to improve business outcomes.

Relevance to the Most Pressing C-Suite Agenda Items

Our 2023 Global CEO survey revealed that economic pressures top the list of risks to organizations. This underscores the need to measure the outcomes from any business investments, including technologies, services, and new hires. The most successful enterprises in the digital business era will be intentional about their investments. Using a fact-based approach to decision-making is what IDC FoX Scorecards are designed to offer.

The connection to business outcomes is what really makes the Scorecard methodology so relevant in today’s economic climate. IDC has built a standard approach to measuring business value across all our research domains. Scorecards are being rolled out across the following IDC research programs this year:

  • Future of Operations
  • Future of Trust
  • Future of Customer Experience
  • Future of Enterprise Intelligence
  • Future of Connectedness
  • Future of Industry Ecosystems
  • Future of Digital Infrastructure
  • Future of Work
  • Worldwide Digital Business Strategies

Technology Suppliers are Part of the Equation

While the FoX Scorecards are designed to help end-user organizations, IDC believes that success will come when enterprises work closely with their trusted technology and service providers to advance their capabilities. The ongoing challenges facing the C-suite create an opportunity for suppliers to deliver targeted solutions and services to help enterprises drive business outcomes.

In an increasingly crowded and competitive tech industry, we expect the winning tech companies and services firms to finely tune their offerings and engagement model to empower enterprise customers to achieve business outcomes. Becoming a trusted advisor means being engaged with customers, helping C-suite and group leaders identify strengths and shortcomings, and demonstrating the benefits of improvement from accelerated technology investment.  

Ultimately, when the economic picture begins to turn more universally positive, enterprise leaders will remember and reward the vendors that were there to help them during challenging times.

Summary: If You Don’t Measure It, You Can’t Improve It

During this period of economic uncertainty, a time when inflation is high, geopolitical conflict threatens supply chains, and qualified workers are in short supply, enterprise buyers are seeking faster time to value and quantifiable business outcomes from their tech investments. IDC FoX Scorecards will serve as valuable tools for IT and business leaders, aiding them in prioritizing and optimizing technologies and capabilities that can maximize business outcomes. And for IT suppliers, FoX Scorecards will be instrumental in demonstrating and measuring the value of their technology solutions.

Tony Olvet - GVP, Worldwide C-Suite & Digital Business Research - IDC

Tony Olvet is Group Vice President, Worldwide C-suite and Digital Business Research at IDC. His team's global research focuses on the connection between business transformation and digital investments across enterprises. Tony's analysis and insights help vendors, IT professionals, and business executives make fact-based decisions on technology strategy and digital business. Tony has worked with clients across a variety of organizations including global IT manufacturers, enterprise software vendors, telecom service providers, financial institutions and public sector organizations. He has been quoted in major business and industry media including CIO Magazine, The Globe and Mail, CBC and The Financial Post.