Recently, Google appointed McKinsey Global Institute veteran James Manyika as the company’s first SVP of technology and society, as reported by Axios and Protocol. Manyika has been studying the global impacts of innovation and the internet for decades. Now he’ll report directly to the CEO of Alphabet, “shaping and sharing” the company’s understanding of the social impacts of tech, including AI, the future of work, and sustainability.

Given the growing “tech-lash” headlines, declining consumer trust in technology, contentious congressional hearings, and mounting proposed legislation to address tech’s impacts, it comes as no surprise that big tech firms are feeling pressure to get a handle on their role in society. Academic centers and think tanks such as the Berkman Klein Center for Internet & Society (where I’ve been a research fellow) have long studied technology’s social effects. Google’s move signals that tech companies are now trying to think holistically and strategically about tech’s impact, for good and for ill, on customers, society, and the environment.

The tech industry is listening

Google isn’t the only firm talking in terms of technology’s impact on society. Efforts have cropped up across the industry to coordinate responses to mounting ethical concerns.
Digital acceleration requires companies to understand their technological and social impacts to build customer trust

This isn’t just the tech industry’s concern. As every firm becomes increasingly digital, and as more tech leaders directly support business outcomes and customer experiences, the question of tech’s impact on society will loom large. Protecting against security breaches and mitigating tech risks will no longer be enough: values-based consumers increasingly demand ethical uses of data, and data will become a currency for building and maintaining trust in the customer experience. Remits like Manyika’s will make their way into the enterprise technology organization, either through new ethics roles or as added responsibilities under the purview of current CIOs, CTOs, and digital leaders.

Responsible and ethical tech efforts need to be coordinated across the enterprise

Leading an adaptive, creative, and resilient tech organization increasingly means that technology executives must be able to account for and nurture the responsible and ethical use of technology. Future fit technology leaders are starting to monitor fairness in AI systems, embed privacy by design throughout engineering and development processes, and track social and environmental impacts against ESG goals and sustainability targets; a brief sketch at the end of this post shows what a basic fairness check can look like. These efforts will require greater coordination across teams, functions, and lines of business, and they’ll need data and governance structures to form a comprehensive view.

A responsible and ethical tech strategy will enable tech executives to deliver technology that enhances the overall customer experience by increasing trust and reducing the risk of faltering on brand promises. Future fit tech execs will play a critical role in bringing all of this together. To learn more, check out my keynote session at the upcoming Technology & Innovation North America event, September 29-30, 2022, and read our report to develop your Responsible And Ethical Technology Strategy.
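The fairness monitoring mentioned above can be made concrete with a small example. Below is a minimal sketch of one common check, the demographic parity gap: the difference in a model's favorable-outcome rates across two groups. The function, data, and review threshold are all hypothetical, and a real fairness program involves far more than a single metric.

```python
# Minimal sketch of a demographic parity check on model outcomes.
# All names, data, and thresholds here are hypothetical.

def rate(outcomes):
    """Share of favorable outcomes (1 = favorable, 0 = unfavorable)."""
    return sum(outcomes) / len(outcomes)

def parity_gap(outcomes_a, outcomes_b):
    """Difference in favorable-outcome rates between two groups."""
    return rate(outcomes_a) - rate(outcomes_b)

# Hypothetical decisions for two groups of applicants.
group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # 75% favorable
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # 37.5% favorable

gap = parity_gap(group_a, group_b)
print(gap)        # 0.375
print(gap > 0.1)  # True -> gap exceeds an (assumed) tolerance; flag for review
```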
Overseeing current IT projects and operations will always be part of the IT management mandate. But today's CIOs also need to use technology in new, innovative ways to help the business keep pace with rapid change, and IT management software and tools can help. Data and analytics, as well as cloud, are among the areas CIOs have pursued; at the same time, they're looking at artificial intelligence (AI), the Internet of Things (IoT), and more to prepare for the future.

Analytics

An analytics solution can mine terabytes of operational data quickly to find the root cause of service impacts. It helps identify potential bottlenecks, predict outages, and drive greater efficiency. Organizations gain insight into data or processing issues, negative IT trends, and anomalies, making it easier to take steps to avoid system chaos. Beyond internal operations, analytics also provide insights that help enterprises better understand their customers, which in turn can drive business strategy.

Cloud computing

Cloud services offer scalability, data security, data recovery services, and more. Using the cloud can improve efficiency and reduce infrastructure costs. It can benefit all aspects of the business, from operations to finance, and help position the organization for transformative cloud-based solutions in the future. Many enterprises host core business applications on mainframes, which process millions of transactions each day. Cloud enablement helps IT departments modernize their mainframe systems while freeing up CIOs to focus on other priorities. Organizations benefit from higher levels of productivity and performance with less overhead.

AI and cognitive computing

AI systems analyze data, learn, and predict problems to help IT managers deliver better service quality. AI-based chatbots can also function as virtual agents, talking with users to resolve technical issues; customers can use them to learn about products and services as well. Moving ahead, cognitive computing may become vital to helping enterprises manage IT and accelerate innovation.

IoT

IoT platforms collect and analyze data from devices and sensors, helping organizations resolve issues proactively and improve productivity. IT managers can quickly derive insights into what the organization is doing right and what it could be doing better. Cognitive learning further enables businesses to unlock IoT's value: it can combine multiple data streams to identify patterns and provide more context than would otherwise be available, and intelligent sensors have the potential to self-diagnose and adapt to their environment without human intervention.

For several years, a growing number of executives, analysts, and management writers have argued that business leaders, not technologists, should take "ownership" of corporate information technology by holding themselves responsible both for its impact and for the money spent to improve it. Information technology's role within organizations has changed, these critics argue, and the way companies manage their investments in technology must change, too. At one time, the IT organization could be run effectively as a support function. Today, however, most new IT applications span businesses and functions, and some connect organizations to their partners and customers. Companies that aim to derive full value from their investments in IT must therefore alter their business processes and understand how IT can be used to foster improvement and competitive advantage.
But these advances will be achieved only if business leaders become more involved in technological decision making and, in fact, only if they call the shots. Some companies have heeded this advice, yet few believe the effort had the desired effect: after appointing business leaders to corporate technology committees, they found that the hoped-for improvement in relations between the two sides did not occur. In the meantime, useless applications continue to be implemented, and IT costs continue to rise.

Even so, companies have unquestionably taken a step forward by creating structures and processes that encourage business and IT managers to work together. This collaboration at least ensures that business leaders oversee investments, evaluate proposed IT applications, and help the organization plan for the changes any new system requires. But oversight aside, business leaders have no incentive to run IT with the same rigor they bring to running the business, so the management of information technology is still left to IT leaders, who struggle to balance the changing demands of the companies for which they work.

How, then, can companies close the gap between IT and the business it supports? The key is to ensure that business executives not only set the corporate IT agenda but also manage its performance, and that their compensation reflects their ability to do so. We have seen business leaders ignore potentially valuable technology projects and then suddenly cancel them when they ran into difficulties, instead of taking responsibility, up front, for ensuring their successful completion. These leaders must own decisions instead of just making them and assuming that someone else will be accountable.

Other changes are needed, too. Most companies now manage IT as a function separate and distinct from the business. IT executives lead complicated organizations that serve not only the company as a whole (networks, for instance, and corporate applications) but also individual businesses and functions; yet there are too few links between those businesses and IT. To bridge the gap, selected IT managers should be drawn more closely into the business units and made more accountable for the performance of the business, just as business leaders should answer for the performance of IT.

A handful of companies in financial services, energy, and high technology have begun to make this transformation and, as a result, are improving their return on investment and managing their IT costs more successfully. Taking these companies as our example, we have developed some practical advice on how to encourage a more effective partnership between IT and business.
What goes wrong?
• A legacy of two cultures. Business and IT managers have long operated as two distinct cultures, each with its own language, priorities, and career paths, and technology committees on their own can't directly address this deep-rooted division. Committees and joint processes for setting agendas can certainly help the two sides develop a common technological vision and a shared language for discussing the issues that divide them. But both sets of managers ultimately return to their respective camps, since the incentives governing their careers reward the performance of their own units, not the achievement of joint goals.
• Too much bureaucracy. When IT planning processes become too complex, decisions end up being made under the table: business managers circumvent the formal channels for making IT requests, and rational investment and effective spending become almost impossible. Executives in several companies we have worked with were shocked to learn that this "shadow" spending amounted to as much as 40 percent of their total application-development expenditures.
• Too many junior managers. Companies too often delegate their seats at the technology table to junior managers who lack the authority to set priorities or to commit their business units.

A few companies have overcome these formidable challenges and now use technology as a competitive weapon. Their progress rests on three important steps: they make business leaders accountable for the return on IT investments; they put those leaders in charge of setting the IT agenda; and they integrate their IT organizations more closely into the business.
Accountability

In view of these challenges, most companies shy away from the question of accountability. But unless they face it, business executives have little incentive to give their full attention to setting the technology agenda. We have seen such executives arrive at technology committee meetings unprepared; others constantly leave the room to take telephone calls; and there are always absentees. Even if business executives do manage to set an agenda, they have little motivation to follow it through. Moreover, without accountability, cultural differences are difficult to reconcile: business leaders who are responsible for the outcome of an initiative are more likely to commit senior staff to it and to ensure close cooperation between the two sides.

Leadership is clearly needed if the issue of accountability is to be faced. At one of the few companies to have grasped this challenge, a core group of leaders recognized that technology had become essential to success and felt comfortable using it to change the way business was done. At another, the technology organization was broken, threatening its ability to support the core business. The CEOs of both companies made all of their business unit leaders responsible for setting priorities, for developing and overseeing their IT investments, and for the results. In consequence, the leaders of both companies engaged fully in efforts to make IT a success.

These experiences suggest practical measures that promote accountability. For a start, companies should charge business units for spending on items such as personal computers, telecommunications equipment, and the development of new applications. One company, for instance, made its groups pay for their employees' cell phone use and saw the bill drop by almost half. Of course, such charge-backs can be controversial, particularly if they involve allocating expenses for shared infrastructure (the network and data centers, say) or for systems that cut across business units and functions (customer relationship management and enterprise resource planning, for example). The best approach is to balance simplicity and fairness while always keeping the end goal in mind: getting managers to spend their IT dollars more wisely. When a company allocates the cost of a project-management application used by three departments, for instance, the simplest way is to divide the amount equally; depending on the size of each department or how often it uses the application, at least one department head might view that split as unfair. Whichever method is chosen, the allocation forces the three managers to reach an agreement among themselves; more important, the exercise of allocating the costs forces them to define the application's features and interface and, later, to monitor the progress of implementation.
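To make the charge-back trade-off concrete, here is a minimal sketch of the two allocation approaches discussed above: an equal split versus a usage-weighted split. The department names, cost, and usage figures are hypothetical.

```python
# Minimal sketch of two ways to allocate a shared application's cost.
# Department names and figures are hypothetical.

def allocate(cost, usage):
    """Return (equal-split, usage-weighted) allocations per department."""
    equal = {dept: cost / len(usage) for dept in usage}
    total = sum(usage.values())
    weighted = {dept: cost * hours / total for dept, hours in usage.items()}
    return equal, weighted

# A $90,000/year project-management application used unevenly
# by three departments (hours per month).
usage_hours = {"engineering": 700, "marketing": 200, "finance": 100}
equal, weighted = allocate(90_000, usage_hours)

print(equal)     # each pays $30,000 (simple, but light users subsidize heavy ones)
print(weighted)  # engineering $63,000, marketing $18,000, finance $9,000
```

Either method gives the three department heads a concrete number to negotiate over, which is precisely the point: the negotiation itself creates accountability.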
Getting business to set the IT agenda

The basic process is straightforward, though the details vary from company to company. First, each business unit ranks its IT spending priorities and develops a business case for every project costing more than, say, $100,000. Hard numbers for costs and benefits are needed, since the projected benefits become the evaluation criteria for the business and IT managers leading the project. The first meeting to decide priorities can be an eye-opener: executives often find that as many as 30 percent of their current IT projects should be abandoned. Next, an IT prioritization committee of business unit executives and the CIO reviews the projects and draws up the corporate IT agenda. A project should be funded only if a senior business executive is willing to take responsibility for the results, and business units should not be allowed to pursue independently any project the committee rejects. This simple system goes a long way toward ensuring that technology expenditures are linked to a company's business strategy and will produce results.

Consider the experience of a prominent global investment bank. Every quarter, each of the bank's business units ranks its IT spending priorities. With input from the IT department, it develops a succinct business case for each new investment, covering its objectives, its up-front and ongoing costs, its other resource requirements, and its expected benefits. The heads of the business units and the CIO gather to review these proposals and, in days rather than months, decide which investments will go forward. They also review the CFO's budget goals and negotiate, on the spot, any cuts needed. The process has two distinguishing features: both business and IT executives are engaged, and they collaborate because each side is on the hook for delivering results. Since the process was introduced, the bank has concentrated its IT investments in a few high-impact areas and jettisoned many cool but nonessential technology baubles. Spending, controlled among other things by stricter adherence to technology standards, is now more finely balanced between maintaining legacy systems and developing new applications. Several innovative client and back-office systems adopted under this scheme have made the bank a widely recognized technology leader in its field.

Getting business leaders to spearhead the determination of a company's IT priorities helps ensure that investments have a strategic impact; it also makes it easier both to adhere to strict budgets and, paradoxically, to adjust them. One asset-management company, for example, installed a rigorous process to determine its priorities for IT projects and budgeted $135 million a year for them. Several months into the company's fiscal year, revenues were lower than expected, and the budget for IT projects had to be slashed by $50 million. Decisions about which projects to postpone to the next fiscal year could have sparked a huge battle, but thanks to clear corporate priorities these decisions were relatively easy to make.
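Here is a minimal sketch of the prioritization mechanics described above: rank business cases by expected benefit per dollar, fund them in order until the budget runs out, and re-run the same ranking when the budget is cut mid-year. The project names and figures are hypothetical, and a real committee weighs strategy and risk, not just a single ratio.

```python
# Minimal sketch of budget-constrained project prioritization.
# Project names and figures (in $ millions) are hypothetical.

def fund_projects(projects, budget):
    """Greedily fund projects ranked by benefit-to-cost ratio."""
    ranked = sorted(projects, key=lambda p: p["benefit"] / p["cost"], reverse=True)
    funded, remaining = [], budget
    for project in ranked:
        if project["cost"] <= remaining:
            funded.append(project["name"])
            remaining -= project["cost"]
    return funded

projects = [
    {"name": "client portal",  "cost": 40, "benefit": 120},
    {"name": "CRM upgrade",    "cost": 55, "benefit": 110},
    {"name": "data warehouse", "cost": 35, "benefit": 60},
    {"name": "office refresh", "cost": 30, "benefit": 25},
]

print(fund_projects(projects, budget=135))  # full-year plan: the top three projects
print(fund_projects(projects, budget=85))   # after a $50M cut: the ranking decides
```

Because the ranking is agreed on up front, a mid-year cut (as in the asset-management example) reduces to re-running the same decision rule rather than refighting the priorities.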
Aligning the IT organization

To correct the misalignment between IT and the business, the first move should be to decentralize the applications-development function so that senior applications managers report both to a business unit head and to the CIO. Business applications (the area in which business and IT can collaborate most effectively) are tools that automate interactions within a company, or between it and its business partners and customers, as well as platforms for creating customer value. Under this new arrangement, applications managers maintain a direct reporting line to the CIO and a dotted line to the business unit leaders for whom they work. IT developers thus have a clear incentive to understand the businesses they serve and to collaborate with them to identify ways of using technology to enhance their efficiency and effectiveness. Within IT, shared-services groups, such as those responsible for architecture and infrastructure, should in most cases remain centralized to capture economies of scale and to enforce technology standards.

Implementing this new reporting and accountability structure will almost certainly send a jolt through the IT organization, and IT managers will understandably be nervous about serving two bosses. Companies can minimize the conflict by aligning the incentives of business unit managers and CIOs so that both focus on ensuring that IT expenditures produce tangible business results. A well-defined performance evaluation mechanism is also crucial. Companies that have adopted this structure find that their application developers become smarter about the business and produce well-matched, user-friendly applications, while businesspeople become more aware of what technology can and, no less important, cannot offer, and get involved in developing the requirements for the company's systems. At one large financial institution, business leaders and IT managers now discuss the ways IT can add value and the resources that should be deployed for that purpose; only a few years ago, the IT department was viewed largely as a support function that responded to requests on a first-come, first-served basis.

For companies that have been disappointed with their past IT investments, the message is clear: senior executives must summon the courage to realign the IT and business organizations by demanding the real accountability that creates a true partnership between the two sides. Those who succeed will find that technology can be a vital strategic tool, not just a necessary expense.