Robot Overlords: AI At Facebook, Amazon, Disney And Digital Transformation At GE, DBS, BNY Mellon – Forbes
Searching recently for evidence of artificial intelligence taking over our jobs, lives, and everything else, Tom Davenport came up empty. "Nary a robot overlord to be found," he declared at the 14th MIT Sloan CIO Symposium. A day later, at the 3rd RE•WORK Deep Learning Summit, I found many humans who are successfully lording over robots, telling their artificial intelligence creations how to perform a number of narrow cognitive tasks.
Davenport, who has published eighteen books on topics ranging from managing organizational data and processes to leading with AI and analytics, is working on his next one, tentatively titled The Cognitive Company. Throughout his distinguished career in a field littered with "hype cycles," Davenport has opted to respond with healthy skepticism to the typical breathless and enthusiastic pronouncements regarding the latest new new thing. This time around, his let's-look-at-reality approach is aimed at tempering the hype around the negative implications of artificial intelligence.
Surveying 160 AI-related projects performed by consulting firm Deloitte (where he is a Senior Advisor), Davenport couldn't find even one job lost because of the adoption of AI (but also no sign of increased economic productivity). What he did find in the real world of companies adopting the latest advances in AI were three types of "cognitive projects":
- Robotics and cognitive automation (44% of the projects he surveyed) – transferring digital tasks from people to machines (e.g., replacing lost credit or ATM cards without human intervention).
- Cognitive insights (48%) – delivering granular insights and detecting patterns (e.g., identifying credit/claims fraud in real time at banks and insurance companies).
- Cognitive engagement (8%) – interacting with customers and employees (e.g., virtual digital assistants for customer service).
The fastest-growing AI applications today are the most prosaic ones, observed Davenport. For the two he mentioned, Robotic Process Automation and machine learning for data integration, it is probably easier than for other AI applications to quantify and demonstrate the benefits to the business.
Be prosaic rather than overly ambitious, and go slow in becoming a cognitive company, Davenport recommended: take small steps but keep a larger goal in mind, including moving to a platform and developing an ecosystem; pick the right technology for the problem; plan for augmentation (assisting humans in their work), not automation; and build on your existing analytics capabilities and staff – these are the people best suited for an AI skills upgrade.
Davenport hedged his bets by promising that AI will be "transformative in the long run." His colleagues at the MIT Initiative on the Digital Economy, Erik Brynjolfsson and Andrew McAfee, argue that it is transformative now. In their best-selling The Second Machine Age, they "underestimated the scale of destruction" of jobs, said Brynjolfsson at the MIT CIO Symposium. Their new book, Machine, Platform, Crowd: Harnessing Our Digital Future, is about the second phase of the second machine age: the move from codifying knowledge to allowing machines to learn on their own. This shift has "massive implications for how we run our companies and live our lives," and we are "underestimating what's ahead of us," said McAfee, adding that with "machine learning and interconnected humanity put together, we are in new territory."
New? Color me skeptical, in a much deeper shade than Davenport's. The term "machine learning" was coined in 1959 (and the better-sounding "artificial intelligence" in 1955) to denote machines that learn on their own. And talk about interconnected humanity preceded computers by a few hundred years. As for destruction of jobs, physicist Richard Feynman in 1944 organized a contest between human "computers" (the women who for the previous 60 years had performed complex scientific and military calculations by hand) and IBM's punched-card machines. Just like chess and Go champions, the human computers lost that contest. With the advent of digital computers (or "giant brains," as they were called at the time) a few years later, the job of a human computer was eventually destroyed, with some of the women performing that role becoming the first computer programmers, a completely new occupation and a much bigger job creator.
If you don't subscribe to Mark Twain's dictum that history sometimes rhymes, and instead hold to the widespread notion that everything changed with the introduction of the smartphone (or the Internet, or whatever other recent technological development) and the invention of "deep learning" (the new "artificial intelligence"), how about the following present and future numbers from McKinsey: