Tracking the Effects of Technology-Automation Disruptions on Human Work

Author: Xinyu Chu, Catherine Liu, Pooja Juvekar

Many industry leaders believe we are currently at the peak of the fourth industrial revolution (The World Economic Forum, 2016). Unlike previous industrial revolutions, in which technology was used to streamline production, aiding with simple, specific, routinized tasks, modern technologies have enabled automation of a much wider range of functions. To better understand the effects of these disruptors, we first need to define and clarify key terms that are often (mis)used.  

Automation
Automation refers to the performance of a process in the absence of human participation (The World Economic Forum, 2016).
Automated teller machines (ATMs) complete transactions without the participation of bank staff.

Artificial Intelligence
Artificial Intelligence (AI) is a computational method of converting environmental inputs into actions or outputs in a way that mirrors human cognitive functioning (Shahin, 2016, p. 33).
Siri can recognize and interpret human speech and provide relevant answers or actions. 

Robots
A robot is a computer-programmed machine that can sense and gather information about its surrounding environment, build a model of that environment, and act on it automatically (Kuipers, 2018).
Roomba, a robot vacuum cleaner, is able to autonomously navigate and clean one’s floors.

The Internet of Things
Physical objects can be equipped with electronics, sensors, and network connectivity that allow them to communicate with one another. This interconnection of objects is the Internet of Things (Whitmore et al., 2015).
Thermostats, home lighting, and background music can all be controlled and monitored by apps on a smartphone.

Cloud Computing
Cloud computing connects multiple remote servers or computers in a network to carry out the functions normally performed by an isolated computer (Mell & Grance, 2011).
Programs like Google Docs or Dropbox provide online cloud storage, so that people can work on their documents at any time and on any device.

Machine Learning 
Machine learning is a type of AI that allows for data-driven improvements in prediction and performance, without the need for additional human programming (Samuel, 1969).
Online retailers like Amazon use machine learning to make product recommendations to customers based on their previous purchases and searches.
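
The idea behind such a recommender can be sketched in a few lines. The shoppers, items, and purchase vectors below are hypothetical, and commercial systems are far more sophisticated; this is only a minimal nearest-neighbor sketch based on cosine similarity.

```python
import math

# Hypothetical purchase histories: 1 = purchased, 0 = not purchased.
items = ["laptop", "mouse", "keyboard", "novel", "cookbook"]
purchases = {
    "alice": [1, 1, 1, 0, 0],
    "bob":   [1, 1, 0, 0, 0],
    "carol": [0, 0, 0, 1, 1],
}

def cosine(u, v):
    """Cosine similarity between two purchase vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norms = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norms if norms else 0.0

def recommend(user):
    """Suggest items bought by the most similar other shopper."""
    _, nearest = max(
        (cosine(purchases[user], vec), name)
        for name, vec in purchases.items() if name != user)
    return [item for item, mine, theirs
            in zip(items, purchases[user], purchases[nearest])
            if theirs and not mine]

print(recommend("bob"))  # bob's nearest neighbor is alice -> ['keyboard']
```

New purchases simply update the vectors, which is what makes the system "learn" from data rather than from additional programming.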

Deep Learning 
Deep learning is a set of machine learning techniques that rely on multiple levels of processing to progressively convert raw data into increasingly abstract and usable representations (Bengio, Courville, & Vincent, 2013).
Xu and colleagues (2015) created a program that could analyze photographic images and generate concise, accurate text captions for them.

Artificial Neural Networks
An artificial neural network is a collection of nodes (computational units that convert inputs to outputs with a programmed function) that represent different attributes of the system being modeled.
Anderer and colleagues (1995) used an artificial neural network to identify dementia from an analysis of EEG patterns.
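
The computation performed by a single node, and by a network of them, can be sketched directly. The weights below are made up for illustration rather than trained; in a real network, they are learned from data.

```python
import math

def node(inputs, weights, bias):
    """One network node: a weighted sum of inputs passed through a
    sigmoid activation, yielding an output between 0 and 1."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def tiny_network(x):
    """A tiny two-layer network with made-up weights: two hidden
    nodes feed one output node."""
    h1 = node(x, [0.5, -0.6], 0.1)
    h2 = node(x, [-0.3, 0.8], 0.0)
    return node([h1, h2], [1.2, -0.7], 0.2)

score = tiny_network([1.0, 0.5])
print(round(score, 3))
```

Each node is simple on its own; the modeling power comes from composing many of them and tuning the weights, which is what training algorithms do.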

Big Data
Big data refers to large volumes of data, collected from sources both internal and external to an organization, that can be analyzed to reveal trends, patterns, and associations. Big data is marked by large volume, high velocity of new data, and a wide variety of data inputs.
Industries such as retail, healthcare, and education use big data to detect trends among their customers, such as purchasing habits or long-term educational performance.

Data Science
Data science is the process of capturing, maintaining, processing, analyzing, and communicating data (Berkeley Data Science).
Internet search engines use data science to process search queries and return the most relevant results.

Natural Language Processing
Natural language processing is a form of artificial intelligence that allows computers to understand, interpret, and manipulate human language (SAS Institute).
Devices such as Google Home and Amazon Echo are able to understand and respond to spoken commands.
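
At its simplest, command interpretation can be sketched as matching words against intent vocabularies. The intents and trigger words below are hypothetical, and commercial assistants use statistical language models rather than word lists; this is only a toy sketch of the matching idea.

```python
# Hypothetical intents mapped to trigger words.
INTENTS = {
    "weather": {"weather", "rain", "temperature", "forecast"},
    "music":   {"play", "song", "music"},
    "lights":  {"lights", "lamp", "dim", "brightness"},
}

def interpret(command):
    """Pick the intent whose trigger words best overlap the command."""
    words = set(command.lower().replace("?", "").split())
    scores = {intent: len(words & triggers)
              for intent, triggers in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(interpret("What is the weather forecast today?"))  # weather
print(interpret("Please play my favorite song"))         # music
```
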
Computer Vision
Computer vision is the theory and technology of building artificial systems that obtain information from images or other multi-dimensional data (Science Daily).
Snapchat filters calculate the distance between objects and the relative positions of elements in a stream of images from the user’s camera.
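
The distance calculations such filters rely on can be illustrated with a minimal sketch. The landmark coordinates below are hypothetical; a real filter would first detect them in each frame with a trained model.

```python
import math

# Hypothetical facial landmarks detected in an image frame,
# as (x, y) pixel coordinates.
landmarks = {
    "left_eye":  (120, 95),
    "right_eye": (180, 95),
    "nose_tip":  (150, 130),
}

def distance(a, b):
    """Euclidean distance between two landmark points, in pixels."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# A filter might scale an overlay (e.g., glasses) to the eye spacing.
eye_spacing = distance(landmarks["left_eye"], landmarks["right_eye"])
print(eye_spacing)  # 60.0
```
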

Data Mining
Data mining is the process of finding patterns and relationships within large data sets to predict outcomes (SAS Institute).
E-commerce sites use data mining to advertise similar products to consumers and to predict what other products a consumer may consider purchasing.
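
A minimal sketch of this kind of pattern mining counts how often pairs of items are bought together. The baskets below are hypothetical, and production systems use more scalable association-rule algorithms; the counting idea, however, is the same.

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping baskets from an e-commerce site.
baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"laptop", "mouse"},
    {"phone", "charger"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs suggest products to advertise together.
print(pair_counts.most_common(2))
```
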
Virtual Reality
Virtual reality is a computer-generated, three-dimensional artificial environment that a user can interact with when equipped with devices such as a helmet with an internal screen or gloves fitted with sensors (Merriam-Webster).
Lowe’s, a home improvement retailer, created a virtual-reality-based skills-training clinic that teaches participants visually and more realistically in a 3D environment.

New technologies provide the power for different forms of automation. Deep learning, for example, is a method of machine learning, which is in turn a critical component of the broader field of artificial intelligence, which tends to rely heavily on cloud computing and data science to rapidly complete necessary calculations. Many modern robots, as well as the internet of things, rely on machine learning, cloud computing, and aspects of artificial intelligence to enhance the user experience and further increase data processing speeds. Taken together, these technologies work in tandem to automate different aspects of production and human work.

Effects of automation on the human work experience
 
Although organizations have long relied on automation in production, new automated tools are being implemented ever more successfully across a range of industry sectors. For example, BMW has introduced more than 2,000 robots and smart technologies to its manufacturing plant in Spartanburg, South Carolina (Mitchell, 2018). Previously, robots were cordoned off in areas inaccessible to humans because of safety concerns. But greater attention to the human-robot interface has made possible a more seamless integration of robots and humans, an integration in which robots often take over the repetitive and routinized assembly-line tasks that pose strain and safety issues for human workers (Allinson, 2017). One robot, referred to as Miss Charlotte by the employees, has been integrated as part of the team, entirely taking over a difficult task. Workers typically feel more personal responsibility toward a task when they are interacting with a machine-like robot subordinate (Hinds et al., 2004).

Advances in artificial intelligence are increasing the use of machine learning in healthcare. Machine learning tools have shown promise in diagnosing skin cancer and many rare diseases (Sennaar, 2018). A new machine learning algorithm processed millions of previous digital medical records to generate an accurate diagnosis for patients within a few seconds (Jiang et al., 2017). If these algorithms become widespread, they may shift the role of physicians from diagnosticians to physical and psychosocial healers. Further, collaboration between AI and doctors may not only improve the quality of diagnoses but also shorten waiting times for patients and increase access to quality healthcare.

To capitalize on this technology, machine learning algorithms must be seen as trustworthy by physicians, who may otherwise ignore them in favor of their own clinical expertise. This requires transparency in the decision-making process, including what information a decision was based upon and what probabilities are associated with different diagnoses. One study found that when presented with more transparency about the plans, rationale, and possible outcomes of actions conducted by AI agents, human operators were generally more trusting and more willing to incorporate the agents' results into their problem-solving process (Chen, 2018). The study also showed a positive relationship between transparency about the intentions behind actions and people's task performance in decision making and job operation. Furthermore, Chen (2018) argues that transparency about behavioral intention can improve the machine learning process of the AI agents themselves, as it allows human operators to evaluate and analyze the information the agents present, thus including human operators in the decision-making process.

The implementation of new technologies in the marketplace is also changing the work experience of office personnel.  For example, Alexa for Business can integrate workers’ calendars, personal phones, meeting room equipment, and other smart technology across physical locations to create an IoT that simplifies the work experience (Walker, 2017).  Advances in artificial intelligence as well as the IoT allow for a smart office that eliminates some daily hassles which can contribute to making work stressful or unpleasant. By integrating personal information from multiple devices, tasks that previously were complicated and bothersome (e.g., finding a time and location for everyone to meet) can be offloaded to technology and done almost instantly. 

Implications of automation for human jobs

As automation becomes more common, increased attention has been paid to the human cost in terms of jobs eliminated or altered. Acemoglu and Restrepo (2017) estimated that for each robot introduced to the workplace, six people would lose their jobs. A report by the McKinsey Global Institute estimated that about half of current work activities could technically be automated, and that by 2030 up to 30% of the hours now spent on work could be automated (Bughin, Manyika, & Woetzel, 2017).

However, as some jobs disappear, others are created or expanded by automation and new technologies. Rideshare drivers (e.g., Uber and Lyft), for example, would not exist if it were not for the automation and programming available today. Similarly, the introduction of robots into BMW’s facilities necessitated the creation of an apprentice mechanic program to fill the many open equipment service associate positions (Mitchell, 2018). 

If past industrial revolutions serve as a guide, we would expect automation to create many new jobs that currently don’t exist. However, how many jobs will be created and the skill requirements associated with performing such jobs are still largely unknown. Therefore, we focus on more concrete questions, such as: (1) What jobs are least likely to be substituted by automation? (2) What is the likely pattern of task reallocation among existing jobs as automation becomes more prevalent across industry sectors?

Frey and Osborne (2017) suggest three task domains in which humans have a comparative advantage over robots: (1) perception and manipulation tasks, (2) creative intelligence tasks, and (3) social intelligence tasks. Humans’ ability to rapidly sense, perceive, and understand the world around them is currently unmatched in many ways by technology. This ability enables a range of complex behaviors, from handling irregular objects to interacting with dynamic and unpredictable environments. Jobs that rely on these skills (e.g., daycare provider, nurse) should be relatively immune, at least in the near term, to the effects of automation. For example, although there is a push for the integration of artificial intelligence in healthcare, the trust between clinicians and patients is nearly impossible to replicate (Wahl et al., 2018). Further, jobs that require the creation of novel ideas, ranging from poems to scientific discoveries, while enhanced by automation, should also be safe. Automation may actually make these jobs more attainable for individuals who would not normally have access to the training or technology traditionally necessary to succeed. Lastly, jobs that require sophisticated social interaction, such as negotiation and persuasion, should also be relatively secure, given that they center on complicated human interactions, the development of novel ideas, and the quick capture of visual, auditory, and bodily cues. This includes many jobs in management, finance, media, and art.

With the increasing demand for humans with strong “soft” or people skills, we expect a rise in the number of mixed human/robot teams.  For example, financial managers may spend less time on data processing, which can be handled easily by artificial intelligence, instead shifting their focus toward managing employees and interacting with stakeholders and business partners. 

Promising Research Directions

Although automation is often regarded by workers as a threat of job displacement, automation may also improve the human work experience by reducing the number of dangerous, monotonous, or minimally rewarding tasks that employees must perform. We need to further examine strategies and interventions that facilitate and improve human-automation interaction so that it benefits both employees and organizations.

Automation engineering is only one aspect of the industrial revolution. To be successful from the human perspective, more research is needed to understand how automation, and the process by which it is implemented, affects the human work experience. For example, how best can employees and automated programs or robots coordinate and communicate to avoid errors and redundancies? Also, how can we best select, train, and support employees who will be required to coordinate and communicate with automated technology? Lastly, because much of the focus on technology in the workplace has centered on selection, hiring, and performance appraisal, it is vital that the team dynamic between technology and employees also be analyzed, especially as automated technology becomes a large part of many people’s day-to-day interactions in their workplaces.

Three takeaways on technology-automation disruption: 

The implementation of new technologies in the form of automated machines, agents, and robots will have multiple effects on the human labor market as some jobs are automated out of the market, new jobs are created, and yet other jobs are changed.
 
The current limits of technology and automation suggest that human jobs will change in ways that can foster a more meaningful work experience (e.g., jobs that involve more social interaction, creativity, and challenge).

Successful integration of automated technologies requires that employers consider not only the design, purpose, and functionality of new tools but also how they interface with human workers and their effect on team functioning. 

Further Reading:

Acemoglu, D., & Autor, D. (2011). Skills, tasks and technologies: Implications for
employment and earnings. Handbook of Labor Economics, 4, 1043-1171.

Allinson, M. (2017). BMW shows off its smart factory technologies at its plants worldwide.
Robotics & Automation. https://roboticsandautomationnews.com/2017/03/04/bmw-shows-off-its-smart-factory-technologies-at-its-plants-worldwide/11696/ 

Bengio, Y., Courville, A., & Vincent, P. (2013). Representation Learning: A Review and New 
Perspectives. IEEE Transactions On Pattern Analysis & Machine Intelligence, 35, 1798-1828. doi:10.1109/TPAMI.2013.50

Bughin, J., Manyika, J., & Woetzel, J. (2017). Jobs lost, jobs gained: Workforce transitions in a
time of automation. Retrieved from McKinsey website:
https://www.mckinsey.com/~/media/McKinsey/Featured%20Insights/Future%20of%20Organizations/What%20the%20future%20of%20work%20will%20mean%20for%20jobs%20skills%20and%20wages/MGI-Jobs-Lost-Jobs-Gained-Report-December-6-2017.ashx

Chen, J. C. (2018). Human-autonomy teaming in military settings. Theoretical Issues
In Ergonomics Science, 19, 255-258. doi:10.1080/1463922X.2017.1397229

Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human
Factors, 37, 32-64. doi:10.1518/001872095779049543

Frey, C. B., & Osborne, M. A. (2017). The future of employment: How susceptible are jobs to
computerisation? Technological Forecasting and Social Change, 114, 254-280.

Hinds, P. J., Roberts, T. L., & Jones, H. (2004). Whose Job Is It Anyway? A Study of
Human-Robot Interaction in a Collaborative Task. Human-Computer Interaction,19(1/2), 
151-181.

How Human-Robot Teamwork Will Upend Manufacturing (2014, Sept 16). Retrieved from 
https://www.technologyreview.com/s/530696/how-human-robot-teamwork-will-upend-manufacturing/

Jiang, F., Jiang, Y., Zhi, H., Dong, Y., Li, H., Ma, S., Wang, Y., Dong, Q., Shen, H., & Wang, Y.
(2017). Artificial intelligence in healthcare: Past, present and future. Stroke and Vascular Neurology, 2. doi:10.1136/svn-2017-000101

Kuipers, B. (2018). How can we trust a robot? Communications of The ACM, 61, 86-95.
doi:10.1145/3173087

Lee, S. H., Chan, C. S., Mayo, S. J., & Remagnino, P. (2017). How deep learning extracts and
learns leaf features for plant classification. Pattern Recognition, 71, 1-13. doi:10.1016/j.patcog.2017.05.015

Mell, P., & Grance, T. (2011). The NIST definition of cloud computing. [electronic resource]. 
Gaithersburg, MD : Computer Security Division, Information Technology Laboratory, National Institute of Standards and Technology.

Mitchell, A. B. (2018). BMW seeks more humans to maintain Greer plant’s robots. Greenville
News. https://www.greenvilleonline.com/story/news/2018/01/31/bmw-doubles-down-tech-apprentice-program/1082773001/ 

Samuel, A. (1969). Some studies in machine learning using the game of checkers. II—Recent 
progress. Annual Review in Automatic Programming, 6 (Part 1), 1-36. 
doi:10.1016/0066-4138(69)90004-4

Sennaar, K. (2018). Machine learning for medical diagnostics — 4 current applications. Techemergence. https://www.techemergence.com/machine-learning-medical-diagnostics-4-current-applications/ 

Shahin, M. A. (2016). State-of-the-art review of some artificial intelligence applications in pile
foundations. doi:10.1016/j.gsf.2014.10.002

The Difference Between Artificial Intelligence, Machine Learning, and Deep Learning.
(2017, December 4). Retrieved from
https://medium.com/iotforall/the-difference-between-artificial-intelligence-machine-learning-and-deep-learning-3aa67bff5991

Thomas, C., Stankiewicz, L., Grötsch, A., Wischniewski, S., Deuse, J., & Kuhlenkötter, B.
(2016).  Intuitive Work Assistance by Reciprocal Human-robot Interaction in the Subject 
Area of Direct Human-robot Collaboration. Procedia CIRP, 44(6), 275-280. doi:10.1016/j.procir.2016.02.098

Walker, T. (2017). Announcing Alexa for Business: Using Amazon Alexa’s voice enabled
devices for workplaces. AWS News Blog. https://aws.amazon.com/blogs/aws/launch-announcing-alexa-for-business-using-amazon-alexas-voice-enabled-devices-for-workplaces/ 

World Economic Forum (2016). The Future of Jobs: Employment, Skills and Workforce
Strategy for the Fourth Industrial Revolution. Retrieved from World Economic Forum website: http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf

What is Cloud Computing: a beginner’s guide. (2018, January 28). Retrieved from 
https://azure.microsoft.com/en-us/overview/what-is-cloud-computing/

Who Needs the Internet of Things? (2016, September 13). Retrieved from
https://www.linux.com/news/who-needs-internet-things 

Whitmore, A., Agarwal, A., & Xu, L. (2015). The Internet of Things-A survey of topics and
trends. Information Systems Frontiers, 17(2), 261-274. doi:10.1007/s10796-014-9489-2

Xu, K., Ba, J., Kiros, R., Cho, K., Courville, A., Salakhutdinov, R., Zemel, R., & Bengio, Y.
(2015). Show, attend and tell: Neural image caption generation with visual attention. In ICML.
