The IT sector is growing and evolving rapidly, with new advances being made every day. Keeping up to date is a must for everyone involved in this sector. For those starting out, it is essential to know about the new applications being developed, as well as their future uses. For those who have been in the industry for some time, it is vital to remain active and up to date in an ever-changing world. Training and knowledge are increasingly necessary in a sector with a great present and a promising future.

At LafargeHolcim IT EMEA, we want to give back to our community by creating a small glossary of the concepts we discuss, together with others that seem relevant to our readers. As with all digital projects, collaboration and teamwork are fundamental to success. We would therefore like to invite everyone who follows us to contribute comments, suggestions and even definitions to be included.

This glossary will cover the topics we have discussed so far, as well as those we will address in future publications. We hope you find it useful, and we are counting on your contributions!

ARTIFICIAL INTELLIGENCE

According to mathematician John McCarthy, who coined the term in 1956, artificial intelligence is “the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable”.

Colloquially, the term artificial intelligence is used when a machine is capable of imitating the “cognitive” functions that humans associate with other human minds, such as “perceiving”, “reasoning”, “learning” and “problem-solving”.

Its development has been exponential, especially over the last few decades, in which it has become a central element of information technology. It is behind technologies such as Digital Twins, Big Data and the Internet of Things, among many others. As we have already mentioned in several posts on our blog, artificial intelligence plays a key role in the digital transformation of companies: it makes it possible to speed up and automate processes, analyze large amounts of data, predict possible events, and much more.
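
To make the “learning” and “predicting” ideas above a little more concrete, here is a minimal sketch of a machine learning from examples. We use the scikit-learn library and its bundled iris dataset purely for illustration; any labelled dataset would do:

```python
# A machine "learns" a pattern from labelled examples, then "predicts"
# labels for data it has never seen. Illustrative sketch only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier()   # the "learning" component
model.fit(X_train, y_train)        # learn from the training examples

accuracy = model.score(X_test, y_test)  # predict on unseen data
print(f"Prediction accuracy on unseen data: {accuracy:.2f}")
```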

If you want to know more about artificial intelligence we recommend the following links:

9 POWERFUL EXAMPLES OF ARTIFICIAL INTELLIGENCE IN USE TODAY 

Understanding the Four Types of Artificial Intelligence 

Top 12 AI Tools, Libraries, and Platforms 

DATA SCIENCE

Data science is a discipline of computer science that focuses on designing algorithms and programming machines capable of analyzing amounts of data so large that it would not otherwise be possible. It involves developing methods of recording, storing and analyzing data in order to extract useful information effectively.

As Jorge Diaz Limon pointed out on our blog, data science has evolved intensely over the past two decades. And it continues to grow exponentially, as we are surrounded by more and more sensors, cameras and devices that analyze the world around us and deliver vast amounts of information.

The main benefit of data science today is the ability to consume and process large volumes of data and turn them into information the business can use to make the best decisions possible.
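
In practice, that journey from raw records to decision-ready information can be as simple as an aggregation. Here is a minimal sketch using the pandas library on a small, entirely hypothetical sales table (the column names and figures are illustrative assumptions):

```python
# Turn raw transaction records into a decision-ready summary:
# total revenue per region and product, sorted by value.
import pandas as pd

# Hypothetical raw data; in reality this would come from databases,
# sensors or application logs.
sales = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC", "EMEA"],
    "product": ["cement", "aggregates", "cement", "cement", "aggregates"],
    "revenue": [120.0, 80.0, 95.0, 110.0, 130.0],
})

summary = (sales.groupby(["region", "product"])["revenue"]
                .sum()
                .sort_values(ascending=False))
print(summary)
```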

Data science and data scientists have become the pillars on which the digital transformation of business rests.

Want to know more? You can find more information here:

What is Data Science? 

What skills are needed to be a data scientist? 

DIGITAL TWINS

A digital twin is a virtual replica of a model, system or product, created for the purpose of evaluating its performance, predicting its evolution and anticipating potential problems. It is a technological trend that has evolved greatly in recent years and is now part of the decision-making processes in the world of information technology.

As we mentioned in our previous post, this method was first used exactly 50 years ago, during the Apollo XIII mission to the Moon. The mission failed in its goal of landing on our satellite, but the rescue of the astronauts, whose lives were put at risk by several technical problems, went down in history as one of NASA’s greatest successes. Part of that success was due to the use of a replica of the spacecraft in which the astronauts were travelling. This replica was used for tests and trials, and also to develop the tools that made it possible for the astronauts to repair their spacecraft and return to Earth.

In the 50 years since, this technique has been developed in different areas and, thanks to new connection networks such as 5G, it is today one of the fastest-growing models.

One of the primary reasons digital twin technology is being adopted so rapidly is that it has multiple use cases across the industrial enterprise: engineering, manufacturing and operations, and maintenance and service. Digital twins are made possible (and improved) by a multitude of Industry 4.0 technologies – IoT, AR, CAD, PLM, AI and edge computing, to name a few – creating a powerful tool that drives business value.
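
At its core, a digital twin is a software model that runs alongside the physical asset, mirrors its expected behaviour, and compares that prediction with real sensor readings. Here is a minimal sketch of that idea; the motor, its heating rate and the tolerance threshold are all illustrative assumptions:

```python
# A toy digital twin: a virtual motor that predicts how its physical
# counterpart should heat up, and flags drift in sensor readings.

class MotorTwin:
    """Virtual replica of a motor, tracking its expected temperature."""

    def __init__(self, ambient_temp=20.0, heat_per_hour=5.0):
        self.expected_temp = ambient_temp
        self.heat_per_hour = heat_per_hour

    def simulate_hour(self):
        # Advance the model one step: what the real motor *should* do.
        self.expected_temp += self.heat_per_hour

    def check(self, measured_temp, tolerance=3.0):
        # Compare the physical sensor reading against the prediction.
        drift = measured_temp - self.expected_temp
        return abs(drift) <= tolerance, drift


twin = MotorTwin()
for hour, reading in enumerate([25.2, 30.1, 38.9], start=1):
    twin.simulate_hour()
    ok, drift = twin.check(reading)
    status = "OK" if ok else "ANOMALY"
    print(f"hour {hour}: measured {reading}°C, expected "
          f"{twin.expected_temp}°C -> {status} (drift {drift:+.1f}°C)")
```

In a real deployment the model would of course be far richer (physics simulations, CAD geometry, machine-learned behaviour) and the readings would arrive from IoT sensors rather than a hard-coded list, but the predict-compare-alert loop is the same.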

Want to know more? Check out these links:

What Is Digital Twin Technology – And Why Is It So Important? 

7 Amazing Examples of Digital Twin Technology In Practice

IBM explains what Digital Twins are