November 2022 was a busy month for the European Commission, with two major pieces of legislation passed that aim to bolster the cybersecurity and cyber resilience of Member States and of organisations across the bloc.

The first was the Digital Operational Resilience Act (DORA), which covers the finance sector and companies that provide ICT services and infrastructure to financial sector entities. The second was the long-awaited update of the Security of Network and Information Systems (NIS) directive, known as NIS 2.

The broad aim of NIS 2 is to engender a high common level of cybersecurity in the EU, across all Member States, in the long term.

This is the first in a two-part IDC blog series that will focus on the implications of NIS 2.

The Clock is Ticking

The full text of the NIS 2 directive was published in the Official Journal of the European Union on December 27, 2022, and enters into force 20 days later, on January 16, 2023. Thereafter, Member States will have 21 months to transpose the directive into their national law (by October 17, 2024). What happens between now and then?

Building the Frame(work)

The next 21 months will be critical for the success of NIS 2 as regional and national bodies get to work on transposing the articles of the directive into their national legislation. Who will be responsible for this part of the process?

The prime mover in this respect will be the NIS Cooperation Group, which was established in 2017 to support the first NIS directive. The Cooperation Group comprises representatives of all the EU Member States, the European Commission and the EU Agency for Cybersecurity (ENISA).

The group will provide guidance to the national authorities of the Member States on transposing and implementing the directive. It will also provide guidance, advice and cooperation on numerous related areas including cybersecurity policy initiatives, capacity building, training and awareness, exchange of information and best practices, and vulnerability disclosure. It will also be responsible for defining standards and technical specifications, as well as maintaining a central register of essential and important entities in each country.

A second key group will be a network of computer security incident response teams (CSIRTs) across all the Member States. At least one CSIRT in each country will be designated as a competent authority for various roles including international cooperation and coordination, threat monitoring and analysis, and the provision of incident response and assistance to essential entities.

The third key entity is the European Cyber Crisis Liaison Organisation Network (EU-CyCLONe). Its task is to support coordinated management of large-scale cybersecurity incidents and crises at an operational level. It will also ensure regular exchange of information among Member States and relevant entities within the union. EU-CyCLONe’s role will really crank up once the directive is in place.

Key responsibilities will include:

  • Developing shared situational awareness for large-scale cybersecurity incidents
  • Assessing the impact of large-scale cybersecurity incidents and proposing potential mitigation measures
  • Coordinating the management of large-scale cybersecurity incidents and supporting decision making at the political level

Between them, these organisations, along with the Member States themselves, will be tasked with ensuring that when NIS 2 comes into force at the national level, it is appropriately transposed into national law and the countries are able to put in place the necessary structures and resources.

Kicking the Tyres

One criticism of the first NIS directive was that it lacked teeth. The EC is striving to establish NIS 2 more firmly throughout the bloc, and one measure through which it seeks to do this is peer reviews. These are aimed at assessing, at a national level, conformity with the directive as well as progress and readiness in implementing it. For example, peer reviews will assess:

  • The level of implementation of cybersecurity risk management measures and reporting obligations
  • The level of capabilities, including available financial, technical and human resources
  • The operational capabilities of the country’s CSIRTs
  • The level of implementation of cybersecurity information-sharing arrangements

Peer reviews are to be carried out by designated cybersecurity experts from at least two Member States, at most once every two years. The experts conducting the reviews are expected to provide reports with recommended improvements on any of the reviewed aspects. Those reports will be submitted to the Cooperation Group and the CSIRTs network where relevant.

Conclusion

These entities and processes should ensure that at a regional and national level the EU and its Member States can develop a higher level of cybersecurity and resilience by adhering to the NIS 2 directive.

The second instalment of this blog series will look at which organisations NIS 2 will apply to and what will be required of them.

Mark Child - Associate Research Director, European Security - IDC

Associate Research Director Mark Child of IDC’s European Security Group leads the group's Endpoint Security and Identity & Digital Trust (IDT) research for both Western Europe and Central & Eastern Europe. He monitors developments in security technologies and strategies as organizations address the challenges of evolving business models, IT infrastructure, and cyberthreats. Mark's coverage includes in-depth security market studies, end-user research, white papers, and custom consulting.

Throughout 2022, I talked a great deal about how digital sovereignty was evolving, and we now have the evidence to prove it.

Ongoing geopolitical uncertainties, along with macroeconomic trends such as the ongoing threat of a new pandemic and inflationary fears, mean digital sovereignty is shifting gears and emphasis from self-determination to self-sufficiency and survivability.

As a result, digital sovereignty encompasses several layers, which we present as part of the “Sovereignty Stack”. Organisations that are ready to embrace this broader view of digital sovereignty should consider the following attributes when designing, procuring, implementing and managing sovereign solutions.

Self-Determination

  • Data sovereignty: While you may be familiar with data (and cloud) sovereignty, these are just subsets of digital sovereignty (as explained in my video A Few Words on Digital Sovereignty).

Data sovereignty is the starting point that enables organisations to achieve the full stack of digital sovereignty outcomes.

As data regulations emerge and evolve, organisations should look for technology solutions that provide a holistic view of how data is collected, classified, processed and stored to ensure that data legislation and rules are being met.

  • Technical sovereignty: Organisations cannot lock themselves into custom-built solutions that become legacy systems in their own right.

They must aim to work with platform players that can deliver plug-and-play capabilities. Here, open source solutions will lend themselves well to interoperability, as well as data portability and transferability.

  • Operational sovereignty: IT executives will seek technology suppliers that offer cloud capabilities that enable transparency in controlling operations, from provisioning and performance management, to monitoring of physical and digital access to the infrastructure.

Transparency equals trust — a fundamental tenet of digital sovereignty.

  • Assurance sovereignty: Data availability is essentially all about resilience.

For example, in Europe, this is mandated by rules such as the EU Cybersecurity Strategy, Network and Information Systems Directive, and the Digital Operational Resilience Act (DORA).

The latter defines the principles to ensure that digital infrastructure across the continent’s financial sector is always available to provide critical services.

Self-Sufficiency

  • Supply chain sovereignty: As well as reinforcing digital supply chain resilience, the aim here is to strengthen the digital economy’s competitiveness, its capacity to innovate, and ability to create jobs.

Skills sovereignty is also a part of this layer. Here, organisations should expect service providers to invest in knowledge transfer as this can support and foster local talent and empower companies to develop their own digital innovations.

Without this transfer, the much-talked-about IT talent shortage will persist.

Survivability

  • Geopolitical sovereignty: IT and digital technologies are now at the heart of a nation’s critical infrastructure.

This takes the idea of digital sovereignty to a broader, macro level.

As a result, governments as well as business leaders want to use cloud solutions to help deal with the strategic weaknesses, vulnerabilities and high-risk dependencies of an increasingly volatile geopolitical environment.

Note that the attributes you see in the IDC Sovereignty Stack are mutually dependent.

For instance, the easier and more affordable it is to integrate solutions and switch among vendors, the greater the opportunity to promote co-innovation with local companies that create local jobs.

The “Stack” in Action

Since the start of the Russia-Ukraine War in February 2022, a combined 75% of organisations in Europe now consider digital sovereignty to be more important (source: IDC EMEA, FERS Survey Europe, Wave 4, May 10–25, 2022).

As a result, they are either adjusting their operations or changing their IT strategies.

When asked what actions their business and IT leaders were taking due to their increased digital sovereignty concerns, many of the top results seen in the graph below are evidence of the Sovereignty Stack in action. While improving privacy measures is the foundation of data sovereignty, supply chain management and enhancing resilience in the face of geopolitical volatility are examples of the “self-sufficiency” and “survivability” layers as depicted in the stack.

Supply chain issues were already highlighted during the COVID-19 pandemic.

Organisations now want to get on top of this, and also to know the sources of their IT services and goods to ensure business continuity, because digital sovereignty is ultimately about operational resilience.

Find out more:

Why Does Digital Sovereignty Matter in Cloud Buying Decisions?

European Digital Sovereignty

Are you an IT leader facing the challenge of making change happen at your organization with only a limited budget? Here are IDC's suggestions for considerably improving the software development lifecycle with a small spend. These small investments can create enough leverage to make real changes happen.

Shift from run to change by creating better predictability. A healthy organization should spend most of its effort on creating new business value. We regularly speak to IT leaders who are struggling to bring the run cost of their software maintenance below 50%. By comparing your maintenance productivity to comparable portfolios in the market, you can determine whether there is room for improvement.

By analyzing the types of tasks your teams are spending their time on, you can identify improvement areas. These improvements create additional capacity and/or budgetary room to develop and deploy more changes that bring value to your organization. 

On the change side of the software development lifecycle, predictability is critical. By comparing yourself against comparable teams in the market, it will become clear which aspects you can improve on and in which you are already above the market average. These insights will increase the predictability of software development initiatives and help you make better use of your resources.

When NCOI, one of the leading life-long learning providers in Europe, analyzed the maintenance productivity of the nearshore team maintaining its core administrative system with IDC Metri's help, the results were astounding. Within seven months, that team could deliver the same output with only a third of the people. In this way, NCOI was able to redirect about €250,000 per month from maintenance to the development of new business value.

Introduce better prioritization of value by giving your product owners better insight into which features bring value to your organization, both in the long run and in the short term. In most agile application development organizations, product owners are (implicitly) responsible for the software value creation of the organization, without a solid basis to work from. You can help your product owners create better insight into the value of features across multiple dimensions, so that your business gets the features it needs to better serve your clients while your IT portfolio remains future-proof.
 

It's possible to train your product owners to work with the Software Value Map approach. Organizations that have partnered with IDC Metri have approached each requirement from 2-4 value perspectives. In this way, all value perspectives are balanced, so that not only are new business features released, but there is also sufficient attention to the prevention and reduction of technical debt, the usage of (cloud) infrastructure resources, and product innovation. This leads to proud and effective product owners and teams that are satisfied with the features they deliver and the architectural runway they can work with.

Create better insight into your portfolio by getting a grip on the dependencies within it, so that you can prevent rework and unnecessary reshuffling of priorities. Within weeks you can have your software portfolio analyzed. That will give you and your teams insight into the dependencies and possible risks to business benefits. This insight will help you and your teams make guided prioritization decisions and provides a fact-based starting point for improving the robustness of your portfolio.

When our clients' software portfolios were analyzed by our Software Intelligence provider partner, CAST, this not only gave them insight into the security, robustness and quality of their portfolios, but also gave them valuable information about where they could make (more) use of standard building blocks from their cloud providers. By using standard building blocks, they could really shift left in their software development lifecycle. These organizations had to invest less in capabilities that were already available and could reduce the resources allocated to testing. On average, these clients saved 5-10% on their software development lifecycle.

Induce a shift from re-invent to repeat by facilitating your teams so they can benefit from each other's work. When you have insight into your portfolio, you can identify common building blocks that can benefit multiple teams. When you have identified many of these building blocks, you could even dedicate a team to the common part of your architecture. When you and your teams start to look at which building blocks or microservices your enterprise architecture contains, you will see the mentality of your teams shift from re-inventing their own solutions to a common issue to reusing a common solution created by one of the teams. This way of thinking will not only boost your productivity, but also create common ownership of the whole portfolio across all the teams.

Another driver for the centralization and standardization of building blocks is that IT is increasingly moving into the business side of organizations. Different business units can build their own applications on an enterprise architecture that provides them with standard building blocks to work from. This decreases their security vulnerabilities and requires less test effort.

Can't wait to get started, or want to see what impact these four initiatives could have on your specific situation? We're available to help you discover how small investments, in the right balance, can make a huge impact on your organization, quality and budget. All of these improvements require limited investment but will optimize your value chains so that you can bring more value to your organization as quickly as possible.

I always like to leave some time before writing about the World Smart City Congress in Barcelona to give myself a reality check. The main reason is that while you are with around 20,000 other engaged and excited people from around the globe, it is difficult not to suffer from groupthink.

It’s always instructive to see who has “gone big” each year and who has cut back, as it reflects how the market is developing. This is equally true of the countries involved and the major themes behind the solutions presented.

Enormous single stands from the likes of IBM and Schneider have come and gone, and new entrants like Mastercard have taken their place, but there is a definite move towards the power of partnership. Global players such as AWS and Red Hat are happy to have a small presence on the Fiware booth. Companies the size of Accenture and Siemens ply their wares from the Microsoft booth and Nvidia and Dell promoted the synergies between their solutions at a combined stand.

This move can also be seen with natural solution curators such as professional service providers. Deloitte ran one of the busiest and most popular stands as a platform for partners to display the depth and breadth of their market reach.

There was a wide spectrum of technology being promoted, from smart city 101-type parking and lighting solutions to futuristic visions of the benefits of the metaverse. What became clear, though, was that better and more efficient use of data was the driving force behind many of the solutions being presented.

This was matched by a focus on the security, privacy, governance and ethics of data collection and use. It was heartening to hear tech vendors highlighting trust, democratisation, diversity and sustainability alongside how many new widgets their solution had.

The big difference this year was the emergence of cross-industry partnerships. At IDC, we have been researching the new industry partner ecosystems that will drive the market.

In essence, to achieve the outcomes required by government clients, there will need to be a coordinated approach across architecture, engineering and construction (AEC) companies, commercial real estate (CRE) and tech vendors. Some AEC companies, such as AECOM, were at the event, and we predict more will be there next year.

The technology solution that was ubiquitous throughout was digital twins. Digital twins evolved from BIM solutions used by AEC and CRE companies and are critical to both visualising data and for scenario planning. Government clients want to see a “golden thread of information” from data collection, through data synthesis into information made accessible by digital twins.

One consequence of the above is that the purview of “smart” has expanded. The original locus was captured by IBM’s Smarter Planet marketing back in 2008 and a recognition that new technologies had fundamentally changed the world.

The tech industry then spent the next decade looking for a customer, focusing on cities as the best level of subsidiarity. Arguably, this has been less than successful, as cities have never produced a customer with the concomitant budget, responsibility and authority to engage in holistic solutions across all the functions of a city. This inability is exacerbated by an election cycle that reduces the scope for long-term planning. Ergo, we are still stuck selling point solutions.

The nascent cross-industry partner ecosystems outlined above provide a much-needed expansion beyond City Hall and a return to the smarter planet idea, just as we are beginning to recognise that we have a planet-sized problem. Carbon, energy, water and food are the new riders of the apocalypse.

See the Barcelona interview with Jennifer Schooling, director, Cambridge Centre for Smart Infrastructure and Construction, and me here: University of Cambridge & IDC | Construction that Maximize Urban Potential (tomorrow.city).

Today's tech buyer is but one person in a larger committee. Sales and marketing teams are talking to a larger tech buying committee that consists of multiple personas, each with different jobs to be done and business challenges. However, sales reps often struggle to articulate the value of their organization's solutions as it directly applies to their customer's business.

A recent IDC survey found that 61% of sales reps are not skilled at selling to C-level buyers and 49% are having issues finding qualified buyers.

Source: 2022 Outcome Selling Advisory IDC Survey on Value Selling Excellence

Sales Enablement is the Key to Empowered Sales Conversations

Digital marketing activities begin a conversation about outcomes and promises of value that a sales team must be able to continue to articulate. But how do you engage with different tech stakeholders at the same organization as well as C-level executives? A strong understanding of the digital journey and key personas and their priorities is essential for driving sales discussions with this diverse group. That is what sales enablement centers on. At IDC specifically, it’s a practice that focuses on delivering sales and education programs that provide the right information, to the right person, at the right time and place.

Evolve Your Sales Model

Especially under the weight of the current economy, it’s critical today to move away from product and feature selling. This model makes the technology you are selling the focus and the value. That’s not what buyers need today. Buyers are looking for a clear understanding of how your solution will solve their business challenges and provide real value.

As marketers and sellers started realizing this, it gave way to the consultative selling approach. This model, compared with product and feature selling, aimed to provide problem solving as the value, which results in larger, more transformative deals. Once the transaction is over, however, consultative selling isn't an approach that integrates with the buyer's business strategy. Enter value selling, focused on ongoing value generation, co-created with the customer.

Value selling is the creation and extraction of value in a continuous and virtuous circle. It is deeply integrated with the customer’s business strategy and, because of that, creates an ongoing, agile and iterative partnership between you as a vendor and your customer and partners. It doesn’t just focus on the sale, but on pre- and post-sale alignment.

The Tech Buyer Journey

Continuity between marketing and sales is critical for a seamless buyer journey. Engagement activities conducted by marketing should set up a conversation that sales should be able to intelligently carry forward. Ultimately, to engage more effectively with your audience, your value proposition must be aligned. In another IDC blog post, we take a deeper dive into understanding how to align digital and interpersonal strategies. Simply put, as marketers nurture this value dialogue digitally, they are moving the buyer closer to sales who must understand the marketing dialogue and progress to demonstrating that value.

Marketing and sales must now work together using a persona-based, journey-nurturing approach, because buyers expect an ongoing relationship with you; one that does not end after the purchase is made.

Creating a Sales Enablement Strategy and Supporting Tools

A strategically crafted sales enablement strategy will help you map your customer journey so that you are prepared and on-message through pre-sales, sales and post-sales. It will familiarize your team with new markets and changing demand, and equip everyone with quantitative and qualitative research and metrics to validate ROI and respond to buyers.

A sales enablement strategy addresses key situations with a clear plan to respond to today’s challenges. If built leveraging the right research, it will allow you to:

  1. Tailor messages to key industries
  2. Better understand the C-level and LoB buyers in the buying committee
  3. Learn who best to position your technology to and how to address key use cases, with messaging around the problems your new technology solves
  4. Become familiar with new markets and their specific dynamics
  5. Use quantitative and qualitative metrics to respond to ROI questions

IDC’s Sales Enablement practice provides tailored solutions from target market education, to sales call aids and sales engagement. Our research offers a wealth of insight across technology markets, with industry-specific insights that you can leverage in your strategy.

Learn more about IDC’s industry research, high impact business value solutions and interactive selling tools.

Watch our latest webinar where we introduce a new model for marketers and sellers to work within, for lead generation, customer creation, and value management efforts.

The 3 Dimensions of Retail Immersive Customer Experience

“The opportunity with the Metaverse is it creates infinite possibilities for us to connect, create and belong. […] In essence, how can we help our current members, get educated, informed and show them the way?” – Tareq Nazlawy, Senior Director Digital, adidas

Despite the internal and external challenges that retailers must face, real-time contextual customer experience is a key factor for retailers and brands that aim to engage, retain and enhance shoppers' experiences across different and multiple interactions and interfaces.

According to our Global Retail Operating Models 2022 Survey, brick-and-mortar will remain retailers' main channel for revenue generation over the next two years. At the same time, eCommerce and marketplaces, as well as retail media networks, conversational and social commerce, are expected to become new and growing sources of revenue.

This would allow retailers' operating models to become fully omni-channel while completely integrating online and physical operations in real time.

However, the traditional convergence between online and offline is now characterized by a third dimension that aims to augment the overall shopping experience, including the metaverse continuum. This ranges from AR-enabled visual and image search to 3D product visualization, to more sophisticated and immersive virtual realities for visual commerce, such as Web3 and Metaverse platforms and related NFT capabilities. At the center of this 3D integration – between online, physical stores, and the metaverse continuum – there are three core elements that are foundational to the immersive retail customer experience. The core elements are:

  • Customers. Shoppers' needs, behaviours and preferences, as well as the customer-centric approach that permeates B2C, B2B2C and D2C business models.
  • Data. By collecting and managing first-, second- and/or third-party data, retailers and brands become key players in partner ecosystems, leveraging customer data platforms (CDPs) and customer data sharing hubs to guarantee data accuracy, security and identity management.
  • Technology. AR, VR, digital twins, and the Metaverse are (re)emerging technologies that, fueled by data and powered by AI and ML analytics, enable retailers to enhance personalization of the customer journey.

As we have recently published in the IDC FutureScape: Worldwide Retail 2023 Predictions, “By 2024, 65% of Retailers Will Invest in Visual Commerce to Enable Personalization Through 3D Product Configuration and Virtual Try-On and Reduce Complexity Through Image-Based Interfaces”.

There are also concrete examples of retailers that have launched metaverse initiatives: Nike launching on Roblox; Walmart adopting an AR-powered virtual try-on option for in-app virtual fitting from home; Lacoste recently opening its Web3 community to Le Club loyalty program members; and the Swedish retailer H&M currently piloting a new tech-enabled Collection of Style (COS) store – including smart mirrors – with the intention of soon expanding this format across all its stores.

These are all examples of retailers already investing in visual commerce-enabling technologies to enhance personalization and customer engagement at the highest level of immersion. To achieve this goal, there are some IT implications to consider:

Frame visual search applications within the overall retail commerce platform infrastructure to enrich a solid customer data base (such as a customer data platform) for CX personalization.

Integrate AR/VR with current in-store technologies (e.g., RFID, sensors, and smart mirrors) to achieve a fully omni-channel experience.

Setting the basis for the metaverse requires overcoming any complexities deriving from existing, disconnected customer-facing interfaces.

Key Actions for Retailers

So, what should retailers do? Here are three immediate key actions:

  • Include augmented commerce as part of business model innovation and long-term strategic digital roadmap use case definition.
  • Become fully and truly omni-channel by implementing the capabilities and technologies that bridge personalization and augmented customer experience.
  • Co-innovate along the retail metaverse continuum with the right partners to generate customer lifetime value across multiple interfaces while – again! – reducing complexity.

IDC Retail Insights analysts Ornella Urso and Filippo Battaini will be onsite in New York for NRF 2023: Retail's Big Show. They look forward to meeting you and sharing their research themes and upcoming IDC initiatives for 2023!

The digital airwaves and social media feeds have recently gone wild with examples of how the AI-driven chatbot ChatGPT has solved riddles, generated high school essays and explained why the Croatian football team has outperformed similar sized nations at recent World Cup Tournaments. Understandably, it has again raised important questions about the impact of AI on our lives, enterprises, and broader society.

First and foremost, let’s start with definitions. What is Generative AI and where does OpenAI/ChatGPT fit within all of this? Generative AI is a branch of computer science that involves unsupervised and semi-supervised algorithms that enable computers to create new content using previously created content, such as text, audio, video, images and code.

ChatGPT (which stands for Chat Generative Pre-Trained Transformer) is a chatbot developed by OpenAI. ChatGPT is built on top of OpenAI's GPT-3.5 family of large language models (LLMs) and is fine-tuned with both supervised and reinforcement learning techniques. It is being hailed as the smartest chatbot ever developed. OpenAI was founded in 2015 (initially as a non-profit organization) and early investors included Elon Musk and Peter Thiel. In 2019, it became a for-profit organization and inked a $1bn deal with Microsoft. This deal allowed it to use Microsoft's Azure cloud platform for its research and development; in return, Microsoft was given the first opportunity to commercially leverage early results from OpenAI's research. OpenAI has a stated goal of promoting and developing friendly AI in a way that benefits humanity as a whole and is viewed as the leading competitor to DeepMind (acquired by Google in 2014 for $500M).

It is important to understand that while ChatGPT is a good example of generative AI technology, the market segment is much broader. LLMs began at Google Brain in 2017, where they were initially used for translation of words while preserving context. Since then, large language and text-to-image models have proliferated at leading tech firms like Google (BERT and LaMDA), Facebook (OPT-175B and BlenderBot) and OpenAI (GPT-3 for text, DALL-E 2 for images and Whisper for speech). Online communities (e.g., Midjourney), open-source providers (e.g., Hugging Face) and startups such as Stability AI have also created generative models. In Q4 this year, a spate of text-to-video models from Google, Meta and others emerged. Generative models have largely been confined to larger tech companies because training them requires massive amounts of data and computing power. But once a generative model is trained, it can be “fine-tuned” for a particular content domain with much less data. Today, generative AI applications largely exist as plugins within software ecosystems.
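
To make the "fine-tune with much less data" point concrete, below is a minimal, hypothetical sketch of domain fine-tuning using the open-source Hugging Face Transformers library. The base model, the corpus file name and the hyperparameters are illustrative assumptions for the example, not a description of how any of the vendors named above build their products.

```python
# Illustrative sketch only: fine-tuning a small pretrained causal language model
# on a domain corpus with Hugging Face Transformers. "gpt2", the file
# "domain_corpus.txt" and the hyperparameters are assumptions for the example.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "gpt2"                       # small, openly available base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# A few thousand lines of in-domain text can shift the model's style and
# vocabulary -- orders of magnitude less data than pretraining required.
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-model",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
trainer.save_model("domain-model")        # reusable, domain-adapted checkpoint
```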

The questions that technology and business leaders should be asking in terms of what Generative AI means for the enterprise are outlined below:

How will it be incorporated in existing enterprise technology environments?

  • Code Generation – GPT-3 has proven to be an effective generator of computer program code. GPT-3's Codex program is specifically trained for code generation and works well when given a small function. Microsoft's GitHub offers a version of GPT-3 for code generation, called Copilot (see the sketch after this list). The latest versions of Codex can identify bugs and fix mistakes in their code, and can occasionally explain what the code does. The goal for these tools is not to eliminate programmers, but to have tools like Codex and Copilot act as digital pair-programming assistants that improve developers' speed and effectiveness.
  • Enterprise Content Management – Vendors in the headless content management space are incorporating these types of generative AI tools for both content generation and recommendations. This helps deal with increased content velocity, as additional forms of content are based on a single source generated by AI with human oversight. It is not being used to write whole copy, but rather an outline for the content author to use as a draft. In addition, it is likely to impact GUI design in the form of “generative design”, with the likes of Figma or Stackbit potentially including generative AI capabilities as part of collaborative interface design engines.
  • Marketing and CX Applications – Beyond content generation for advertising and marketing and the automation of marketing campaigns, the primary application for early versions of generative AI is in AI-driven chatbots and agents for contact centers and customer self-service, such as those employed by Salesforce and Genesys, which have initially delivered mixed results. However, this next generation of capabilities will mean a broader range of interactions, more accurate answers, and lower levels of required human interaction, which will result in higher adoption and eventually more training data for the models. In the near future, generative AI will become more prevalent in the creation of personalized product recommendations through insight analytics, better and deeper customer segmentation as a steppingstone to true personalization and contextualization of experiences, and better understanding of customer satisfaction and performance.
  • Product Design & Engineering – It will also affect technologies in the product lifecycle management (PLM) and innovation space with the likes of Autodesk, Dassault Systemes, Siemens, PTC and Ansys continuing to build capabilities to enable design engineers & R&D teams to automate and expand the ideation and optioning process during early-stage product design, simulation, & development.  Generative AI design would allow options for engineering and R&D teams to consider in terms of structure, materials, and optimal manufacturing/production tooling.  For example, it would potentially suggest a part design that optimizes against factors like cost, load bearing, and weight. Generative design can also enable reimagining of product look and feel, often resulting in unique aesthetics and form that is not only more compelling to end users, but more practical and environmentally sustainable. Many of these vendors have attached their generative design offerings to additive manufacturing capabilities that are needed to realize these unique products.  Opportunities exist across multiple industries for generative design.  Automotive, aerospace, and machinery organizations can improve product quality, sustainability, and success, while life sciences, healthcare, and consumer products companies can improve patient outcomes and customer experiences.   
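
To ground the Code Generation bullet above, here is a minimal, hypothetical sketch of prompting a Codex-family model for a code completion. It assumes the pre-1.0 openai Python SDK and API access; the model name and prompt are examples only, and this is not how GitHub Copilot itself is implemented.

```python
# Illustrative sketch only: asking a Codex-family model to complete a small
# function. Assumes the pre-1.0 openai Python SDK; model name and prompt are
# examples, not a description of Copilot's internals.
import openai

openai.api_key = "YOUR_API_KEY"           # placeholder, not a real key

prompt = (
    '"""\n'
    "Return the n-th Fibonacci number iteratively.\n"
    '"""\n'
    "def fibonacci(n):"
)

response = openai.Completion.create(
    model="code-davinci-002",             # Codex model available at time of writing
    prompt=prompt,
    max_tokens=120,
    temperature=0,                        # deterministic output suits code completion
)

print(prompt + response.choices[0].text)  # prints the completed function
```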

What are the pitfalls?

Generative AI, while providing lower-cost, higher-value solutions, has significant ethical and perhaps legal implications. There are significant questions over issues like copyright, trust and safety. Organizations must consider issues such as privacy and consent around data, reproduction of biases and toxicity, generation of harmful content, sufficient security against third-party manipulation, and accountability and transparency of processes. Neglect of AI ethics isn’t just a moral quandary – it is a significant business risk that means less trust, less control, and less ability to advance the models in an optimal way. Businesses must take a multi-pronged approach to AI from developer to end-user, first and foremost guided by a framework including principles that appropriately consider all ramifications of AI. Businesses should also choose models where techniques such as adversarial input (training against bad or manipulated data), benchmark dataset training (checking for biases via label tests), and XAI (explainable AI) are used. Finally, concerns with AI ethics are intrinsically linked to how accountability measures are enacted. Businesses should ensure they take a Human-in-the-Loop (HITL) approach to ensure minimal model drift, rigorous monitoring of output, and continuous improvement. AI must not be viewed as an independent, black box entity, but should rather be seen as human-computer interaction where optimal usage comes from deep understanding, meticulous monitoring, and striving for accuracy of the model.

How will it affect jobs?

At the end of 2020, the World Economic Forum (WEF) predicted that AI would displace 85 million jobs by 2025. The main jobs it identified as under threat were the likes of data entry clerks, administrative assistants, and accounting and auditing professionals, amongst others. Over the same timeframe, it predicted that 97 million new jobs would be created as AI becomes more mainstream in the enterprise. Growing job demand would focus on data scientists, process automation specialists, and digital marketing and strategy experts, as well as many other roles. Generative AI means that we can add a new role to that list – prompt engineers. Basically, this role focuses on working out what to type into AI chatbots to get the best out of them. Some would expect these individuals to also deal with so-called ‘hallucinations’ – where generative AI gets it completely wrong. These types of entirely new job descriptions highlight how an emerging technology not only displaces activities, but also creates new ones – the classic creative destruction principle initially outlined by Schumpeter. For business and technology leaders, however, it does require a dynamic and ongoing assessment of required digital skills, including continuous gap analysis and roadmaps, to ensure that the necessary capabilities are available to support the digital business of the future.

Moving forward, the best place to watch new and interesting generative AI use cases is in the start-up and scale-up space. The likes of Jasper (Copywriting), Stability AI (Visual art), DoNotPay (Legal Services), Omnekey (Creative Content), Paige.ai (Cancer diagnostics) and Mostly.ai (Synthetic data) showcase how quickly this space is fueling a range of game changing innovations across the industry – and potentially what’s around the corner for so many industries. It is incumbent on all of us to ensure that we approach this fascinating space with the right balance of curiosity and skepticism.

Philip Carter - Group Vice President, General Manager, Research AI - IDC

Philip Carter is General Manager and Group Vice President for AI, Data, and Automation research at IDC. In this role, he leads a global team of analysts focused on delivering IDC's research and insights at the intersection of AI, data platforms, and intelligent automation – three foundational areas shaping the future of technology and business. His work is centered on helping C-suite executives make sense of the rapid innovation in the AI space and drive meaningful transformation through data- and intelligence-led strategies.

Carter has held multiple senior roles at IDC across regions. Prior to his current position, he served as GVP and GM of IDC TechMatch, where he led a global team tasked to build and commercialize IDC's first AI-powered digital platform, focused on helping CIOs and procurement executives evaluate and source technology vendors leveraging IDC trusted intelligence. Earlier in his IDC career, Carter was the lead for IDC's Global Thought Leadership research function and was also Chief Analyst for IDC Europe, where he drove innovation in research related to digital transformation, emerging business models, and technology strategy at the C-suite level. Before that, he worked in IDC's Asia/Pacific region, covering software, services, and sustainability. Prior to joining IDC, he held various leadership roles at SAS Institute across EMEA and APAC in marketing strategy, product management, and business development. He is a recognized industry voice, regularly featured on platforms such as CNBC and Bloomberg, and quoted in leading publications including the New York Times.

He holds an honors degree in Business Science, majoring in Economics and Law, from the University of Cape Town, South Africa.

The number of digital native businesses (DNBs), start-ups, and scale-ups has grown exponentially over the past decade and now represents a significant market cap. Despite the economic downturn, digital native businesses should be on your radar as they drive economic growth, create new jobs, and foster innovation. However, their DNA differs from that of traditional organizations, as they are completely cloud-driven.

What is a Digital Native Business?

IDC defines Digital Native Businesses (DNBs) as companies built from the start around modern, cloud-native technologies, leveraging data and AI across all aspects, from product development to logistic operations and customer engagement. By leveraging new and emerging technologies, platform services, and marketplaces, DNBs grow and scale fast, disrupt industries, and create new markets.

DNBs set the pace when it comes to product innovation and development; they set new standards in customer intimacy and customer experience. As technology gives them their competitive advantage, even more important in today’s economic environment, they should be on every tech vendor’s radar.

Simone De Bruin, Research Director, Worldwide Digital Native Business, Start-ups & Scale-ups, IDC

DNBs encompass a wide range of both B2B and B2C enterprises, from food delivery to carbon sequestration to cryptocurrency. There are five defining characteristics of DNBs, distilled as follows:

  1. Use tech as differentiator – they depend on UX-driven innovation cycles, and use tech to compete or to monetize their services and products. All core value and revenue-generating processes are dependent on digital technologies.
  2. Are born digital – they are cloud-native and data-driven and being digital is part of their DNA. They have a tech-driven operating model.
  3. Scale and innovate at speed – with tech-savvy developer and data scientist teams, DNBs aim for rapid growth.
  4. Have an ecosystem-centric approach – they are highly marketplace-driven and a DNB leverages its ecosystem of stakeholders to drive community-led innovation, dynamically evolve, and co-create offerings.
  5. Are significantly funded – whether venture capital, bootstrapped, or crowdfunded, DNBs enjoy a high degree of funding to support their growth ambitions.

Why it’s Important to Pay Attention to DNBs

Despite the economic downturn, DNBs should be on your radar. There are a number of reasons for this, but mostly it boils down to a combination of their innovative mindsets, focus on customers, and ability to rapidly scale by leveraging new technologies. Other factors that help set the trend:

  • The number of DNBs continues to increase, and the time it takes startups and scale-ups to grow into unicorn status decreases (average 6 years)
  • At the same time, traditional enterprises are fading as a concept as their lifespan decreases.
  • Compared to 2021, VC investments in 2022 have dropped. Over the longer term though, VC investments and startup initiatives have grown significantly. 2022 again broke records on fundraising levels as well as on dry powder (funds to invest) available.

The rise of DNBs was set in motion two decades ago, and they will continue to be a major source of innovation for decades to come. The value that startups create is nearly on par with the GDP of a G7 economy. As Eynat Guez, co-founder and CEO of Papaya Global, says: “Technology startups are more than catalysts for growth. They are the engine of growth itself. They solve problems no other sector is addressing with innovative thinking, thus pushing society forward – all while creating jobs, stimulating the economy, and attracting foreign investment.”

DNBs scale rapidly and can generate returns unmatched by those of traditional enterprises. Vendors who aim to engage with DNBs in the traditional way will find it difficult to keep up with their fast-changing requirements. They also represent different kinds of opportunities. IDC sees vendors engaging in three ways:

  1. Invest / Acquire – Corporate venture capital (CVC) arms invest in young and emerging startups and scale-ups. Examples include Salesforce Ventures, M12, and Workday Ventures.
  2. Partner / Accelerate – Vendors partner with DNBs, start-ups, and scale-ups, or engage with them through accelerator or incubator programs. Examples include SAP.io and the IBM Sustainability Accelerator.
  3. Sell-to – Focused on a commercial relationship where DNBs, startups, and scale-ups are a new customer segment.

Although there is a fair amount of overlap between these categories, many initiatives are still disconnected and siloed. Vendors should take a more holistic approach to engaging digital native businesses in order to build longer-term relationships, partnerships, and commercial engagements. Once a vendor is clear on the type of engagement, it needs to assess which types of DNBs fit that purpose. The following segmentation looks at the DNB business model:

IDC distinguishes:

  • Technology Providers. Providing third-party organizations with next-generation technology products or services (e.g., a chatbot for enabling better customer experience in retail banking, a SaaS tool for analyzing space imagery).
  • Technology Enabled – B2B. Offering products or services to businesses where those services are enabled by next-generation technologies at the core (e.g., a B2B e-commerce company connecting businesses with suppliers and manufacturers or a B2B trading platform for SMEs to source products from distributors and wholesalers).
  • Technology Enabled – B2C. Offering products or services to consumers where those services are enabled by next-generation technologies at the core (e.g., a car-sharing service enabling consumers to book and locate cars on the fly using an app, a social network for dating, a video-streaming application).

Vendors will look to identify, or recruit, technology-oriented DNBs to invest in or partner with. The B2C and B2B types of DNBs are more likely to be a new customer segment. Vendors who want to engage with the next Snowflake, Uber or Instacart need to first be able to identify what type of opportunity they present, and then have the systems in place to cherish and nurture them as they grow.

Taking this segmentation one step further, the following subsegments can be distilled to look at high-growth opportunities. Each category within the DNB landscape consists of various functional markets. These functional markets are not mutually exclusive: metaverse (tech-driven) solutions, for example, will impact the (tech-enabled B2B) retail industry, which in turn affects (tech-enabled B2C) consumer tech in fashion.

Source: IDC, 2022. This market glance highlights some of the largest (either unicorn, or recently substantially funded) Digital Native Businesses per segment, illustrating the type of vendors that play in each category. Company selection is up to analyst discretion.

The Lifeblood of the Future Digital Economy

DNBs are the lifeblood of the future digital economy. With growing investments in digital natives and exponential growth in the number of companies born in the digital age, digital natives are starting to command a sizeable portion of tech spend. However, that tech spend will differ for each of the categories defined by IDC. As DNBs exert a strong influence on the market, they should be on the radar of any tech provider. Tech is their key competitive advantage. Even during an economic downturn, they are not likely to downsize their IT investments. However, to get traction and increase engagement, there is great value in understanding their business operations, what enables their success, and what their IT requirements are.

  • If you’d like to learn more check out our latest research here, or contact Simone de Bruin, Research Director, WW Digital Native Business, Start-ups & Scale-ups
  • At the moment we are collecting data from digital native businesses. So, if you are a startup, scale-up, or mature digital native business and would like to participate in our research – and receive our CEO tech book as a thank-you gift, please click this link to participate!

“Humanity has a choice: cooperate or perish. It’s either a Climate Solidarity Pact — or a Collective Suicide Pact”.

COP27, held in Sharm El Sheikh, Egypt, in November 2022, began with this sobering opening statement from UN Secretary-General António Guterres. It set the mood for the two-week conference, which fell well short of meeting its targets. According to the Economist, “There is no way Earth can now avoid a temperature rise of more than 1.5°C. There is still hope that the overshoot may not be too big, and may be only temporary, but even these consoling possibilities are becoming ever less likely.”

Governments need to keep investing to tackle climate change, but they now also need to invest to increase our collective resilience. Since COP26 in 2021, not only has the geopolitical environment changed significantly, but the increase in global temperatures, causing wildfires and flooding, has reminded us of the heavy cost of inaction.

While people expect decisive action from their governments, their leaders seem overwhelmed with different priorities and planned investments.

A Real Test of Leadership

This year, 130 developing countries succeeded in their attempt to add the notion of “loss and damages” to the official COP27 agenda. But with COP now over for another year, that looks like the only success in 2022. Even that still needs to be ironed out, however, and it should also be remembered that it only tackles the consequences and not the causes.

Mahmoud Mohieldin, UN Climate Change High Level Champion for Egypt, reminded us that global warming is not only about changing the way we produce and consume energy, but also about the way we produce food. “Transforming food systems could release back the $12 trillion the world spends on the hidden cost of food, from transportation to fertilisers,” he said. “We could also eliminate nearly all of the 8.5% of emissions that come from agriculture.”

There are many reasons why such important matters were not intensively discussed at COP27, but we believe one of them was the lack of global leadership.

If no leader stands out when there is so much to coordinate and activate, the transformation must come from cooperation and greater transparency in the promises made to lower our emissions and our dependence on fossil energies.

COP28: Climate Data for the Common Good

Next year’s COP will come at the same time as the first report since the Paris Agreement of 2015, as the final biennial reports for developed countries will be multilaterally assessed to complete the final IAR cycle during 2023–2024. It’s hard to believe that the direction set in 2015 — to limit global warming to well below 2°C and preferably to 1.5°C — will be reached by then. It’s also hard to think that we will have concrete data to rely on by then.

Some initiatives with data transparency at their core have already been implemented. We think of the Climate Data Steering Committee, the EU’s Corporate Sustainability Reporting Directive and the One Data Hub. By the time these reporting mechanisms are live, there will be more data to track and report, including the loss and damages funds agreed at COP27.

However, these reports will need to include the same KPIs and data formats to follow up on. One goal for government executives will be to agree on a data format for each component of climate change, which will need to be transparent for citizens so that they can hold their governments to account.

Philosopher Günther Anders once described the notion of the Promethean gap, which refers to the incapacity of the human brain to perceive the dangers it might encounter. At the beginning of 2022, IDC found that the number-two challenge for governments attempting to become more sustainable was the lack of IT tools to measure impact, almost as challenging as the lack of funds. If we need concrete data before we take action, it's time to understand that, when it comes to “cooperate or perish”, it's not too late to make the right choice.

Remi Letemple - Senior Research Analyst, IDC Government Insights - IDC

Remi Letemple leads IDC’s Worldwide Sustainable Transportation and Smart Vehicles Strategies service, where he provides strategic guidance and thought leadership on the future of mobility and transportation. Operating at a global level, he is recognized as a subject matter expert in smart mobility and transportation technologies—including connected, autonomous, shared, and electric mobility—enabled by software-defined vehicle (SDV) architectures, over-the-air (OTA) updates, cloud and edge platforms, and AI, including generative AI.

In IDC's Cloud Pulse, a quarterly survey that takes in views from up to 1,700 cloud consumers, we have been tracking how companies are being impacted by, and how they are responding to, macroeconomic trends. We then consider how this relates to the consumption of cloud. Questions around Inflation and Energy Costs were first added to our survey in Q1 2022. The Possibility of a Recession was new in Q3, and we will now be adding Recession to future surveys.

These new additions tell a story about the challenges businesses have had to endure over the last year – though not as much as the responses we received when we asked about the likely impacts on their businesses.

Source: IDC 3Q22 Cloud Pulse Survey (QP1), September 2022; n = 1,700

Each quarter, companies are asked which events are most likely to cause disruption to their business over the course of the year. They can respond using a sliding scale of 0-10, with 10 being the highest amount of disruption. The above results show only those responses between 8 and 10 (what we consider 'high impact').
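
For readers who want to reproduce this kind of cut on their own survey data, here is a minimal, hypothetical sketch of the calculation; the ratings in it are made up and are not Cloud Pulse data.

```python
# Illustrative only: deriving a "high impact" share from 0-10 disruption
# ratings by counting responses of 8, 9 or 10. The ratings are invented.
from collections import Counter

ratings = [3, 8, 10, 7, 9, 5, 8, 2, 10, 6, 9, 4]   # hypothetical responses

high_impact = sum(1 for r in ratings if r >= 8)     # top-three-box count
share = high_impact / len(ratings)

print(f"High-impact share: {share:.0%}")            # -> "High-impact share: 50%"
print(Counter(ratings))                             # full distribution, if needed
```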

During Q1 and Q2, little more than a third of companies said they felt they were experiencing major impacts from the macroeconomic trends we asked about. In Q3, this rose to around a half. Inflationary costs were seen across more of the business, energy price rises became a reality, and national and global recessions became harder to avoid. At the same time, markets still battled ongoing supply chain challenges, in part brought on by continued reactions to the COVID-19 pandemic. Many of these impacts are intertwined, making the current macroeconomic landscape even more difficult for companies to navigate.

IT Budgetary Impacts

During Q3, for the first time, we saw companies shift towards a more pessimistic view of the business environment. Early indications from our Q4 data suggest we will see even higher rates of pessimism moving forward.

This pessimism impacts budgets. Cloud makes up around 31% of IT budgets – up only slightly from the roughly 30% seen in Q4 2021, despite previous years showing higher annual growth in cloud's share of IT budgets. The real challenge, however, is that IT budgets are decreasing when viewed as a proportion of overall company revenue. As we know, company revenue is also being challenged (and where it isn't, margins are suffering from increased costs).

Source: IDC, 2022

If we focus on the area of inflation – where we see some of the earliest impacts on industry in terms of economic stress – we gain valuable insight into why we are seeing budgetary constraints across IT. Companies operating or consuming a cloud environment first felt inflationary pressure in the form of increased professional services costs, and then in application software subscriptions. These are two areas where companies are more likely to operate with rolling monthly contracts.

During Q3, we started to see more companies (25%) saying they could now see increases across private cloud infrastructure (most likely brought about as a result of increased equipment and energy costs). Internal skills costs were also a major challenge.

Public Cloud as a Response

Responding to these challenges, a quarter of respondents said they will be looking to migrate more of their environments to public cloud/software as a service. A similar share (23%) said they will be looking to reduce spend across their cloud estates.

With almost a fifth (18%) of respondents saying they are still waiting to assess the impact of inflationary pressures, these figures could grow.

Cloud Pulse findings also show that the share of companies saying they are ‘Public Cloud First’ in their cloud adoption strategy is higher among those directly impacted by inflation (32%) than among those that are not (21%).

Note that the rate of Public Cloud First companies has been decreasing annually. Q3 marks an overall increase in the number of companies relying on public cloud before taking a hybrid approach (though hybrid remains the number-one approach to adoption).

Qualities that Count

Another shift is in the qualities companies are now requiring of their cloud providers. In Q3, 2020, the most important Company Attribute sought by cloud consumers was a global/international footprint. What companies now seek are providers they can label ‘trustworthy’.

In this case, Trustworthy relates to a provider’s ability to be reliable and responsive, to meet contractual requirements, to be transparent about pricing and to keep promises made to the business (we did ask). During this time of macroeconomic uncertainty, customers also want to work with technical experts that can deliver services at speed with guarantees along the supply chain.

We also see shifts in what companies require in terms of cost and pricing. In 2020, the focus was on flexible payment terms and enterprise-wide agreements. Now it is more about flexible licensing and credits that allow companies to move their applications and workloads across cloud and even non-cloud environments as they require.

Many companies know they may have to alter the way they consume and gain access to IT over the coming year. Where rising costs – from inflation or energy increases – are already being felt, the requirement for flexibility gives way to predictable pricing, with many IT departments focusing more on cost optimization and forecasting.

Many cloud vendors have already started to notice these shifts across their own businesses, from the customer conversations they are now having to the budgetary bottom line. Those who succeed will be those that can quickly pivot and reinvent solutions – in particular with the right financial models.

Companies want to continue to benefit from cloud as they navigate uncertain times, but they will be more selective about what they deploy and how over the coming year. Many will be taking stock of current digital transformation agendas and application portfolios to create leaner, more efficient IT responses to current business needs.

For more on these key cloud trends, including how individual providers aligned to current market sentiments and needs, ask further about IDC’s Cloud Pulse data. Cloud Pulse’s rich insights cover a range of cloud topics from deployments to application landscapes, vendor selection, ROI and more.