The IT sector is growing and evolving rapidly, and new advancements are being made every day. Keeping up to date is an obligation for all those involved in this sector. For those starting out, it is essential to know about the new applications being developed, as well as their future uses. For those who have been in the industry for some time, it is vital to remain active and up-to-date in an ever-changing world. Training and knowledge are increasingly necessary for a sector with a great present and a promising future. 

At LafargeHolcim IT EMEA, we want to contribute to our community by creating a small glossary of the concepts we discuss and those that seem relevant to all those who read us. As with all digital projects, collaboration and teamwork are fundamental to success. Therefore, we would like to invite all those who follow us to collaborate with comments, contributions and even definitions that can be included.

This glossary will refer to the topics that have been discussed so far as well as to the topics that we will address in future publications. We hope that it will be useful and we are counting on your contributions!

IoT

IoT is the abbreviation for the Internet of Things: the machine-to-machine (M2M) interconnection of devices and objects through a network, such as the internet.

Although the beginnings of the IoT date back to the early 1980s, the term was not coined until 1999. The first IoT device is said to have been a Coke machine installed at Carnegie Mellon University and connected to the ARPANET. Today, more and more devices are connected to the internet and, essentially, IoT technology allows data to be collected and sent to the network for analysis, or even analysed on the device first and then sent to the network.
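
To make that pattern concrete, here is a minimal sketch in Python of a device taking a reading and sending it to the network for analysis. The endpoint URL, device name and simulated temperature are purely illustrative assumptions, not part of any real platform.

```python
import json
import random
import urllib.request

# Hypothetical ingestion endpoint -- replace with your platform's real URL.
INGEST_URL = "https://example.com/api/telemetry"

def read_temperature() -> float:
    """Simulate a sensor reading; a real device would query its hardware here."""
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def send_reading(device_id: str, value: float) -> int:
    """POST one JSON-encoded measurement to the ingestion endpoint."""
    payload = json.dumps({"device": device_id, "temperature_c": value}).encode("utf-8")
    request = urllib.request.Request(
        INGEST_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(send_reading("machine-42", read_temperature()))
```

In a real deployment the device would often publish over a lightweight protocol such as MQTT, but the idea is the same: collect locally, send to the network, analyze centrally.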

Manufacturing is another industry in which IoT has become essential: many processes are already carried out almost entirely by machines, and IoT also underpins the supply chain, reporting, analytics and optimization. IoT is now part of our daily life, and it will only become more present in the coming years.

Want to know more? You can find more information here:

What is the Internet of Things (IoT)?

The connected car is one of the engines of the IoT, according to Telefónica’s Things Matter 2019 Study – [Download the report]

 

Smart cities

A ‘Smart city’ is a complex, interconnected system that applies new technologies to manage everything from the smooth running of public and private transport, the efficient use of power and water resources, and civil protection plans, to socio-economic aspects such as the vitality of public spaces and the commercial fabric, or the notification of incidents to visitors and citizens.

Smart cities are those that apply information and communication technologies (ICT) to the management of the city. More and more people are migrating from the countryside to the city, and some estimates suggest that by 2050 as much as 85% of the world’s population will live in cities; the term ‘smart’ therefore implies a commitment to constant improvement.

Stefan Junestrand highlights that incorporating blockchain into the development of smart cities will provide a common, transversal platform that connects the city’s different services and brings more transparency and security to all its processes. He also affirms that this technology can improve the management of the administration, support urban planning, enable the collaborative economy and contribute to sustainability policies.

 

Want to know more? You can find more information here:

What are smart cities?

What does ‘blockchain’ bring to the ‘smart cities’?

ARTIFICIAL INTELLIGENCE

According to mathematician John McCarthy, who coined this term in 1956, artificial intelligence is “the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable”.

Colloquially, the term artificial intelligence is used when a machine is capable of imitating the “cognitive” functions that humans associate with other human minds, such as “perceiving”, “reasoning”, “learning” and “problem-solving”.

Its development has been exponential, especially over the last few decades, during which it has become a central element of information technologies. It is behind technologies such as digital twins, big data and the Internet of Things, among many others. As we have already mentioned in several posts on our blog, artificial intelligence plays a key role in the digital transformation of companies: it makes it possible to speed up and automate processes, analyze large amounts of data, predict possible events and much more.

If you want to know more about artificial intelligence we recommend the following links:

9 POWERFUL EXAMPLES OF ARTIFICIAL INTELLIGENCE IN USE TODAY 

Understanding the Four Types of Artificial Intelligence 

Top 12 AI Tools, Libraries, and Platforms 

DATA SCIENCE

Data science is a discipline of computer science that focuses on designing algorithms and programming machines to analyze amounts of data so large that it would not otherwise be possible. It involves developing methods of recording, storing and analyzing data in order to effectively extract useful information.

As Jorge Diaz Limon pointed out in our blog, data science has evolved intensively over the past two decades. In addition, it continues to grow exponentially as we are surrounded by more and more sensors, cameras and devices that analyze the entire universe around us and deliver vast amounts of information. 

The main benefit of data science today is the ability to consume and process a large volume of data and to generate information that helps businesses make the best decisions possible.
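
As a small illustration of that idea, the Python sketch below (assuming the pandas library and an invented set of sales records) turns raw rows into aggregated information that could support a decision.

```python
import pandas as pd

# Toy records standing in for the large data volumes described above.
sales = pd.DataFrame({
    "plant":  ["A", "A", "B", "B", "B"],
    "month":  ["Jan", "Feb", "Jan", "Feb", "Feb"],
    "tonnes": [120, 135, 98, 110, 102],
})

# Aggregate raw records into decision-ready information: output per plant.
summary = sales.groupby("plant")["tonnes"].agg(["sum", "mean"])
print(summary)
```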

Data science and data scientists are becoming the pillars on which the digital transformation of business rests.

Want to know more? You can find more information here:

What is Data Science? 

What skills are needed to be a data scientist? 

DIGITAL TWINS

A digital twin is a virtual replica of a model, system or product, created for the purpose of evaluating its performance, predicting its evolution and anticipating potential problems. It is a technological trend that has evolved greatly in recent years and is now part of the decision-making processes in the world of information technology.

As we mentioned in our previous post, the first use of this method was exactly 50 years ago, during the Apollo XIII mission to the moon. The mission itself was a failure, as the astronauts never set foot on the satellite, but the serious technical problems that put their lives at risk made their rescue one of NASA’s greatest successes. Part of that success was due to the use of a replica of the spacecraft in which the astronauts were travelling. This replica was used for tests and trials, and also to develop the tools that made it possible for the astronauts to repair their spacecraft and get back to Earth.

In the 50 years since, this technique has been developed in different areas and, thanks to new connection networks such as 5G, it is today one of the fastest-growing models.

One of the primary reasons Digital Twins technology is rapidly being adopted is that there are multiple use cases across the industrial enterprise: engineering, manufacturing and operations, and maintenance and service. Digital twins are made possible (and improved) by a multitude of Industry 4.0 technologies – IoT, AR, CAD, PLM, AI, edge computing, to name a few – to create a powerful tool that’s driving business value.
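
As a very simplified illustration of the concept (not of any particular product), the Python sketch below keeps a virtual replica of a hypothetical pump in sync with incoming sensor readings and flags when the physical asset may need attention. The asset, threshold and rule are invented for the example.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class PumpTwin:
    """Minimal virtual replica of a hypothetical pump, fed by sensor readings."""
    asset_id: str
    temperature_limit_c: float = 80.0
    readings: list = field(default_factory=list)

    def update(self, temperature_c: float) -> None:
        """Mirror the latest measurement taken on the physical asset."""
        self.readings.append(temperature_c)

    def needs_attention(self, window: int = 3) -> bool:
        """Flag a potential problem when the recent average nears the limit."""
        recent = self.readings[-window:]
        return bool(recent) and mean(recent) > 0.9 * self.temperature_limit_c

twin = PumpTwin("pump-07")
for reading in (68.0, 74.5, 77.0):
    twin.update(reading)
print(twin.needs_attention())  # True: the recent average is above 90% of the limit
```

Real digital twins combine many more data sources with physics or machine learning models, but the principle is the same: the virtual copy evolves with the physical asset and is used to anticipate problems.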

Want to know more? Check out these links:

What Is Digital Twin Technology – And Why Is It So Important? 

7 Amazing Examples of Digital Twin Technology In Practice

IBM explains what Digital Twins are 

MACHINE LEARNING

The definition of artificial intelligence depends on the concept of intelligence itself. Thus, we understand that its most general definition would be the sub-discipline of computer science that seeks to create machines capable of imitating intelligent behaviour. This definition, first used in 1955, establishes the essence of the concept. But artificial intelligence is a complex field with many branches.

As early as 1959, Arthur Lee Samuel, a pioneer in artificial intelligence, coined the term Machine Learning. A widely cited formal definition, later given by Tom Mitchell, states: “A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E.” Thus, we understand machine learning as the branch of artificial intelligence that seeks to give machines the ability to learn, that is, to generalize knowledge from a defined set of experiences. For example, in a spam filter, T is classifying emails, E is a set of already-labelled messages and P is the share of messages classified correctly.

This procedure can be subdivided into three main paradigms: supervised, unsupervised and reinforcement learning. 

Supervised learning is based on discovering the relationship between input variables and output variables. Learning arises from teaching the algorithm the result we want to obtain for a given input, until the algorithm can give a correct result even for inputs it has never seen before. Supervised learning has been the most widely used paradigm in recent years; however, it has limitations.
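
A minimal sketch of supervised learning, assuming the scikit-learn library and its bundled iris dataset: the algorithm is shown inputs together with the desired outputs, and is then evaluated on examples it has never seen before.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Labelled examples: both the inputs (measurements) and the outputs (species) are known.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)         # learn the relationship between inputs and outputs
print(model.score(X_test, y_test))  # accuracy on examples the model has never seen
```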

Unsupervised learning, on the other hand, is often described as the future of machine learning. It is the paradigm that generates knowledge from the input data alone, without the need to explain to the system what result we want to obtain. Unlike supervised learning, there is no need to provide as much data or to tell the system what result to produce, which is sometimes too complex or costly.
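
By contrast, the sketch below (again assuming scikit-learn) shows unsupervised learning: the same data is provided without any labels, and the algorithm groups it on its own.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

# Only the input data is provided; no desired output is given to the system.
X, _ = load_iris(return_X_y=True)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)  # groups discovered from the data alone
print(labels[:10])
```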

Reinforcement learning is a very common form of learning, not only in humans but also in other animals. It involves assigning a task to an algorithm and “rewarding” or “penalizing” each step depending on whether or not the goal is achieved. When the answer is correct, reinforcement learning is similar to supervised learning: in both cases, the learner receives information about what is appropriate. With wrong answers, however, the procedure is different: while supervised learning tells the algorithm what it should have responded, reinforcement learning simply informs it that the behaviour or response was not appropriate and how large the error was, without directly indicating the correct response.
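
The toy Q-learning sketch below, written in plain Python on an invented five-state corridor, illustrates this reward-and-penalty loop: the agent is only told how good each outcome was, never the correct action, yet it gradually learns to move towards the goal.

```python
import random

# Tiny corridor world: states 0..4, the goal is state 4; actions are move left (0) or right (1).
N_STATES, GOAL = 5, 4
ACTIONS = (0, 1)
alpha, gamma, epsilon = 0.5, 0.9, 0.2

Q = [[0.0, 0.0] for _ in range(N_STATES)]  # learned value of each action in each state

def step(state, action):
    """Environment: reward of 1 only when the goal is reached, otherwise 0."""
    nxt = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
    return nxt, (1.0 if nxt == GOAL else 0.0)

for _ in range(500):  # episodes of trial and error
    state = 0
    while state != GOAL:
        # Explore occasionally, otherwise exploit the best known action.
        action = random.choice(ACTIONS) if random.random() < epsilon else max(ACTIONS, key=lambda a: Q[state][a])
        nxt, reward = step(state, action)
        # The update only says how good the outcome was, never what the "right" answer is.
        Q[state][action] += alpha * (reward + gamma * max(Q[nxt]) - Q[state][action])
        state = nxt

print([max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES - 1)])  # learned policy: always move right
```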

Each of the main machine learning paradigms has its particular applications depending on the needs that the algorithm must meet. 

Do you want to know more? Check out these links

What is Machine Learning? A definition

Supervised vs. Unsupervised Learning

A Beginner’s Guide to Deep Reinforcement Learning

What is reinforcement learning? The complete guide

10 real-life examples of Machine Learning

5G

As we mentioned in our blog, 5G technology has arrived to revolutionize and enhance the digital transformation of our society. The term 5G refers to the fifth generation of wireless communication technologies. It is therefore a new technological paradigm of communication built on previous generations, especially the immediately preceding one, 4G, but 10 to 20 times faster.

5G is not a new technology in itself but is based on the development and improvement of existing technologies, with the aim of maximizing and optimizing processes to achieve high speeds. The assignment of new, higher frequencies, together with the development of infrastructures such as Distributed Antenna Systems (DAS), has made it possible to achieve speeds never before experienced.

In addition to speed, another of the main characteristics, and probably the most relevant one, will be the decrease in latency, or response time. With 4G technology, the average latency was 100 milliseconds; with 4G+, it was reduced to 20 milliseconds. The arrival of 5G promises latencies of 1 or 2 milliseconds, which is considered almost immediate. This will greatly facilitate the development of the Internet of Things, enable interconnected cars that can drive themselves, and make it possible to perform ultra-precise surgery remotely.

Source: Digital Trends

If you want to know more about 5G we recommend our post “5G networks and its implications for the industry”

Still want more? Visit these links

What is 5G?

5G Network Architecture, a high-level perspective