Computing: Updated Guides and Tutorials (2022)

Definition of the word “informatics”

The word "informatics" comes from the union of two terms: information and automatic. Information, because the data that is disseminated is processed and automatically reconstructed into something new, and this new data is then transmitted.

What is computing and what are its characteristics? Concept and definition

Computing can be defined as the science that studies information and all the means used to automate and transmit processes, transforming and treating data to produce something new.

Arguably, the raw material of computing is information: without it, end tasks could not be carried out. For example, how could you withdraw money from your bank's ATM without transmitting your account information through your card and security code?

The ATM needs this information to work, to validate your identity and, in turn, to dispense the requested money. This task obviously involves other computer processes that we will not describe in detail, but you can see that the process is complex.

The objective of this field is to answer the question of how to treat that information in order to transform the data into something new and transmit it to the user. In other words, by providing a few simple pieces of data, more complex results can be achieved, simplifying tasks and work.

Characteristics of computing

These characteristics can vary widely, but several are common across different systems; in every case, information must first be transmitted in order to perform a task.

Among the features we have:

  • In the long run, they save users time and labor while improving comfort.
  • Most computer equipment is easy to use and built to quality standards that provide a better user experience.
  • A computer system must automatically process all the information provided to it; this is done through electronic devices that run computer systems.
  • Three basic steps must be carried out (the set of these three tasks is known as an algorithm; see the sketch after this list):
    • Input (information capture), which is when some information is provided.
    • Processing, where all the information is processed and analyzed.
    • Output (transmission of results), where the information provided is processed and calculated at high speed, performing all the necessary calculations; depending on the task requested, the results are sent back and displayed.
  • Information must be stored digitally, which makes it possible to recover any file later with minimal loss.
  • It provides the user with an interface for managing the system and satisfying their information needs.
  • It can transmit or share that information with other computer systems through networks or telecommunications.
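
The input-processing-output cycle above can be illustrated with a minimal sketch. The code below uses the ATM withdrawal mentioned earlier as the example task; all names, values, and validation rules are purely illustrative, not a real banking API.

```python
# A minimal sketch of the input -> processing -> output cycle described
# above, using a hypothetical ATM withdrawal as the task.

def capture_input() -> dict:
    # Input: information is captured (hard-coded sample data here).
    return {"card": "1234-5678", "pin": "0000", "amount": 50}

def process(request: dict) -> str:
    # Processing: the information is analyzed against a stand-in rule.
    if request["pin"] == "0000" and request["amount"] <= 200:
        return f"Dispensing {request['amount']} to card {request['card']}"
    return "Transaction rejected"

def transmit_output(result: str) -> None:
    # Output: the result is transmitted back and displayed to the user.
    print(result)

transmit_output(process(capture_input()))
```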

History of computing

The development of this science began in the middle of the 20th century, with great advances such as the creation of the first computer and the mobile phone.

The first machine to carry out automatic processes based on an algorithm was called the Z3, invented by the German scientist Konrad Zuse in 1941. This equipment weighed one ton, an incredible weight that is hard to believe by the standards of modern technology.

But for that time, the electromechanical components that made up the computer were very large and heavy. It could perform simple operations such as addition in 0.7 seconds, and multiplication or division in three seconds.

The word "informatics" was first used in the 1960s in France by the engineer Philippe Dreyfus, who joined the words "information" and "automatique" (information and automatic), giving birth to the term. Prior to this, in 1957, Karl Steinbuch had used the word "Informatik" in the title of a published paper.

Origin

The origin goes back much further than many believe, since this science is often associated with modern technologies such as electronics and programming. In reality, computer science arose in antiquity with the creation of various methods used to perform mathematical calculations.

During the period from 1600 to 1700, incredible and important scientific advances took place. One of them came from Blaise Pascal, who in 1642 designed and built the first working mechanical calculator, Pascal's calculator.

In 1673, Gottfried Leibniz demonstrated a mechanical digital calculator called the Stepped Reckoner. This scientist can be considered the first computer scientist in history, and one of his most important contributions was the binary number system.

In the years that followed, human beings almost continuously created new methods to automate work processes, improving their productivity and quality.

New calculating machines were created, such as the Difference Engine and its successor, the Analytical Engine, both designed by Charles Babbage (1791-1871). The latter innovated with the use of punched cards, allowing the Analytical Engine to be programmed in different ways.

Thus, in 1843, while translating a French article about the Analytical Engine, Ada Lovelace (1815-1852) wrote what is now considered the first computer program in history, which was designed to compute Bernoulli numbers.

Evolution

Computer science has undergone remarkable changes throughout its history. It began with a series of simple tasks, which were later transformed into more complex ones as new computing functions and technologies appeared that made it easier to transmit, store, and transform information.


This has allowed the development of programmable machines such as computers, electronic devices, mobile phones and automobiles. These machines are very useful in various areas such as medicine, robotics, transportation, business administration and management, industrial processes, and many more.

In fact, computing has given rise to the so-called information age and has driven the information revolution, which is seen as the third-greatest technological advance of humanity, on a par with the Industrial Revolution (1750–1850 AD) and the Neolithic (agricultural) Revolution (8000–5000 BC).

Future

The future of computing is somewhat uncertain due to the constant changes the world is undergoing today; the leaps in technology are growing larger and arriving in ever shorter time frames.

But one thing will not change: the importance of computer security. This area is gaining more and more relevance as the world modernizes and more computer systems take charge of daily life, a consequence of more people using and storing data on their computer equipment, computers, mobile phones, etc.

So it can be said that, in the future, greater importance will be given to the protection of user data in order to achieve greater reliability and credibility in this field.

Computer science has become a very important pillar for humanity. It is already a great help in other areas such as electronics, electricity, physics, mathematics, graphic design, and architecture, and every day its programs receive improvements and new tools that simplify work, saving time for workers and users.

That is why, although it may seem there is no new ground left for computing, the field keeps seeking to grow and evolve in order to help create better technologies in the future, which is quite a challenge and already a reality for computer systems.

Units of measurement in computing

There are several kinds of units of measurement: those used to measure the size (space) of a file, those that measure the speed at which information or a document is transmitted, and those that indicate the rate at which a process runs or completes.

The smallest unit of measurement is the bit. This can be thought of as the state of a switch (open or closed) where each state is represented by a binary digit 0 or 1.


Bits are grouped into bytes of 8 bits. Although formally a byte can contain between 6 and 9 bits, you will almost always find 8-bit bytes, which are also called octets.
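
A small illustration of the two definitions above: a bit is one binary digit, and an 8-bit byte can take 256 distinct values, which is enough to hold one ASCII character, for example.

```python
# A bit is a two-state value (0 or 1); an 8-bit byte (octet) can
# therefore represent 2**8 = 256 different values.

print(2 ** 8)                    # 256 possible values in one byte
print(format(ord("A"), "08b"))   # the letter 'A' as 8 bits: 01000001
```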

Memory and storage

Bits and bytes can be used to measure storage space: the larger the file, the more space it takes up. It is important to note that memory and storage use the binary system, where each unit is 1024 times the previous one (a conversion sketch follows the list below), so we have:

  • 1024 bytes are 1 Kilobyte (KB)
  • 1024 KB is 1 Megabyte (MB)
  • 1024 MB is 1 Gigabyte (GB)
  • 1024 GB is 1 Terabyte (TB)
  • 1024 TB is 1 Petabyte (PB)
  • 1024 PB is 1 Exabyte (EB)
  • 1024 EB is 1 Zettabyte (ZB)
  • 1024 ZB is 1 Yottabyte (YB)
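
As a sketch of how these units relate, the snippet below converts a raw byte count into a human-readable size by dividing by 1024 at each step; the sample value is just an illustrative number.

```python
# Convert a raw byte count into the binary units listed above
# (1 KB = 1024 bytes, 1 MB = 1024 KB, and so on).

UNITS = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def human_readable(size: float) -> str:
    # Divide by 1024 until the value drops below the next unit boundary.
    for unit in UNITS:
        if size < 1024 or unit == UNITS[-1]:
            return f"{size:.2f} {unit}"
        size /= 1024

print(human_readable(3_500_000_000))  # prints "3.26 GB"
```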

In practice, once past the terabyte barrier, the amount of space becomes practically unlimited for an average user: the available storage is so vast that it would take a long time to fill it.

Data transmission speed

In this case, the decimal system (International System) is used, so the units are multiples of 10 and not powers of 2. Here, bits per second (bps) are used as the base unit.

*Note: bits are not bytes; the two are often confused.

The most used units are:

  • Kilobit (Kbps) = 1,000 bits per second
  • Megabit (Mbps) = 1,000,000 bits per second (1,000 Kbps)
  • Gigabit (Gbps) = 1,000,000,000 bits per second (1,000 Mbps)

Internet speeds are often displayed in kilobytes per second (KB/s), counting bytes rather than bits. Since each byte is 8 bits, you must multiply by 8 to obtain the real transmission speed in bits per second.
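
The multiply-by-8 conversion just described can be sketched as follows; the 1250 KB/s figure is simply an example value.

```python
# A download speed displayed in kilobytes per second, converted to bits
# per second (decimal units for transmission; each byte is 8 bits).

def kBps_to_bps(kilobytes_per_second: float) -> float:
    return kilobytes_per_second * 1000 * 8

speed = 1250                                          # shown as 1250 KB/s
print(f"{kBps_to_bps(speed) / 1_000_000:.0f} Mbps")   # prints "10 Mbps"
```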

Frequency (processor, memory)

The processing speed of the processor, memory, graphics, etc., is measured in hertz, a hertz (Hz) being one cycle or repetition of an event per second.

Today the plain hertz has fallen behind the speed of modern devices, which now operate at frequencies ranging from megahertz (MHz, millions of hertz) to gigahertz (GHz, billions of hertz).

So, as explained above, the working speed or frequency of a processor is measured in megahertz as the current standard, these…
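
To make the hertz definition concrete, the sketch below turns a clock frequency into the duration of a single cycle; the 3.5 GHz figure is just an illustrative value, not a specific product's spec.

```python
# Hertz are cycles per second, so a clock frequency fixes how long
# one cycle lasts.

clock_ghz = 3.5
cycles_per_second = clock_ghz * 1e9   # 1 GHz = 1,000,000,000 Hz
print(f"{1 / cycles_per_second * 1e9:.3f} ns per cycle")  # ~0.286 ns
```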
