Article

The Digital Transformation and Business Process

Source: BPTrends
Author: Paul Harmon


I've been reading a lot about digital transformation recently. My first impression is “so what?” As far as I can tell, what people now term digital transformation is just another way of saying that companies are embracing computers and information technology. That said, “the digital transformation” seems to be the current faddish way of talking about that transition.

Let's be clear: the computer revolution/IT revolution/digital transformation began just after World War II, when computer technology first became available to business organizations. The revolution – which some would claim to be the third phase of the Industrial Revolution – advanced slowly in the 50s, began to pick up speed in the 60s, and became an overwhelming fact of business life in the 70s. PCs were introduced in the early 80s. The Internet and then the Web became truly popular in the 90s. And the transition continues.

Another way to think about it is that we began by using computers to automate routine tasks and to store data. We automated the backroom activities of the census operation. We automated payroll and payroll records. Vast warehouses of paper documents were gradually duplicated and then eliminated by computer databases. This was the era of mainframes.

In the late 70s, minicomputers made it possible to use computers for smaller applications, and computers began to take on a wide variety of office tasks, often pulling the backroom operations out of processes and turning them over to machines. The introduction of the PC in the Eighties completed the transition, and suddenly computers were everywhere. The key application of this era was the spreadsheet. Used on a PC, the spreadsheet empowered every manager to maintain his budget, make changes to see how they would affect operations, and update the budget whenever needed. This, and hundreds of similar applications that quickly followed, introduced the idea of work done in conjunction with a computer. Instead of replacing workers, computers empowered workers – making them much more productive. The computer became the new typewriter that sat on everyone's desk, and documents became Word files.

Right after World War II, there was a period when the US government hoped to obtain great things from a special field of computing termed Artificial Intelligence (AI). It was hoped, for example, that AI would allow instantaneous translation of Russian into English. After some experimentation, it was determined that AI wasn't up to the task: current theories of how human languages worked didn't provide an adequate basis for analysis, and the programming wasn't up to the translation task.

A second round of AI bloomed in the early Eighties, when companies were founded to develop “expert systems” – applications that would duplicate the work of human experts and, potentially, replace physicians doing diagnosis or experts scanning photos for missile sites. This second round of AI led to a revolution in how people thought about computing. During the Eighties, most computer workers had learned computing on mainframes and focused on big data processing tasks. They thought of computers as big calculators or dumb typewriters. They didn't think of computers as being capable of understanding what was written, of maintaining knowledge, or of reasoning. Those who truly understood computer technology had always known of these possibilities, but they hadn't been stressed when computers were first being acquired for bookkeeping and record-keeping tasks. The immediate goals of the second round of AI proved unsuccessful: expert systems could indeed capture and use human expertise to solve problems, but the cost of capturing the knowledge, and then of keeping it up to date, proved too high. Keep in mind that the PCs available in the early 80s were based on Intel 286 chips that lacked the speed and RAM to support the kinds of applications AI developers wanted to create. The expert systems movement fizzled out by the late Eighties.

The education that computer employees got, however, had a major impact. If we couldn't build expert systems, we could at least use rule-based technology to capture business policies and decision-making rules. Similarly, ex-expert-systems developers revolutionized game-playing programs and developed data mining and analytics applications that could analyze vast amounts of data and find patterns that humans couldn't detect.
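That idea of capturing business policies as rules is easiest to see in code. Below is a minimal sketch, in Python, of the pattern; the loan-approval policy, rule names, and thresholds are invented for illustration and are not from the article.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str                          # label, useful for audit trails
    condition: Callable[[dict], bool]  # when the rule applies
    outcome: str                       # the decision it produces

# Hypothetical loan-approval policy; names and thresholds are invented.
RULES = [
    Rule("reject_low_credit", lambda a: a["credit_score"] < 580, "reject"),
    Rule("refer_large_loan",  lambda a: a["amount"] > 50_000, "refer to underwriter"),
    Rule("approve_default",   lambda a: True, "approve"),
]

def decide(applicant: dict) -> str:
    # First matching rule wins, as in many simple business rule engines.
    for rule in RULES:
        if rule.condition(applicant):
            return rule.outcome
    return "no decision"

print(decide({"credit_score": 610, "amount": 12_000}))  # -> approve

The value of the pattern is that the policy lives in a readable rule table that a business analyst can review and change, rather than being buried in procedural code.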

So we arrive at 2016 and the third round of AI, which IBM terms Cognitive Computing. In one sense we are using a lot of technology that was initially created in the Eighties, supplemented by much that has been improved since then. And everything is now being done on computers that are many generations faster and more powerful than anything available in the Eighties. Natural language problems have been solved: Watson played Jeopardy against human champions and beat them, answering at human speed. Rule-based systems have been supplanted by deep neural network systems that are able to learn and improve as they gain experience. Deep Blue beat a human chess champion using rules and brute-force processing. Recently, Google's AlphaGo beat the European Go champion in 5 of 5 games. (Go is a much more difficult game than chess.) The key, however, is that AlphaGo didn't need to do brute-force searches of hundreds of possible future moves to decide which move to make. More important, AlphaGo wasn't created simply to be a Go-playing system. It was developed to be a learning system, was then taught to play Go, and was finally turned loose to play against itself until it became a world-class player.

AI is only a part of the digital transformation. At its base are incredibly fast computer chips and databases capable of storing vast amounts of data. There is the Internet that connects everything and, increasingly, the cloud, which provides massive data storage and processing to anyone, anywhere, via a PC, tablet, or phone.

There are still jobs that only people can do, but they are becoming fewer and fewer. Now that customers are armed with their own computers, and natural language systems can perform quickly and effectively, the parts of a business process that can't be automated are precious few.

The challenge facing organizations today isn't what to automate, but what to automate first – and how far to push the automation.

The current digital transformation is ultimately a business process transformation. We are capable of changing all aspects of our businesses. The dreams that Michael Hammer shared in Reengineering the Corporation – when he urged CEOs to obliterate their current business process “cow paths” and replace them with “super highways” – are pale things compared to what we can do today.

Any CEO should assume that he or she could significantly improve half the business processes in his or her organization. The challenge is deciding which changes will provide the largest return, and which will respond best to the changing environment in which the organization will be operating when the changes are complete.

The digital transition is usually described in terms of technology. Technology is great, but it doesn't make money as such. Implementations of technology that provide value to customers – what we call business processes – are what make money. Opportunities are everywhere. The challenge is determining which to seize.
