"Aurel Vlaicu" University of Arad
Faculty of Exact Sciences
Master's Programme
Applied Informatics in Science, Technology and Economics
Master's Dissertation
Calea spre calculatorul cuantic
Towards Quantum Computing
SCIENTIFIC ADVISER
Prof. univ. dr. Ioan Dzițac
GRADUATE: [anonimizat] 2018
Rezumat: „Calea spre calculatorul cuantic”
De la ideile de automatizare a calculelor matematice, apoi la criptarea mesajelor secrete din campaniile militare și apoi cele economice, spre calculatoare electronice, apoi rețele neuronale, sisteme de calcul avansate – supercomputere, toate bazate pe fizica clasică și tehnologia sistemelor electromagnetice … apoi limitările fizicii clasice, apariția și dezvoltarea conceptelor de fizică cuantică, elaborarea și succesele mecanicii cuantice și ale tehnologiilor bazate pe fizica cuantică … însoțite de limitările tehnologiei informației bazate pe sistemele de calcul clasice … urmate de calea spre calculatoarele cuantice, unde a ajuns umanitatea și încotro mergem în științele exacte și în tehnologia informației … un periplu de la științele exacte, ficțiune științifică, anticipație, până la așteptările umanității legate de evoluția informaticii în viitorul apropiat și viitorul mai îndepărtat ale științei calculatoarelor.
Abstract: "Towards Quantum Computing"
Starting from the ideas of automating mathematical calculus, to the encryption of secret messages in military and then economic campaigns, towards electronic computing machines, then neural networks alongside advanced computing systems – supercomputers – all of these based on classical physics and the technology of electromagnetic systems … then, due to the limitations of classical physics, the appearance and development of the concepts of quantum physics, the elaboration and achievements of quantum mechanics and of quantum-physics-based technologies … accompanied by the limitations of information technology based on classical computing systems … leading towards quantum computers: where humanity has arrived up to the present day, and where we are heading in the exact sciences and in information technology … a path through exact sciences, science fiction, anticipation, up to the expectations of mankind related to the evolution of informatics in the close and the more distant future of computer science.
Chapter 1 Introduction
1.1 Science and Technology in History
Most of our present culture has its roots in Mediterranean antiquity, as developed by the ancient Egyptians, the Phoenicians, the Greeks and the Romans in their succeeding empires. Later on, in the Middle Ages, the Byzantine, the Spanish, the Portuguese and then the British Empire spread the culture of western civilization throughout the world, imposing it over the cultures of the local human civilizations. Huge empires like the far-eastern Chinese, Persian and Indian ones were simply occupied and their culture overruled by their invading masters. Finally, in the 18th, 19th and 20th centuries most of the technological advancements moved on to northern America, then reflected back to Europe, once again superimposing western culture onto the roots it had developed from.
The development of modern science and technology, the related industrial revolutions (standardization, mechanical production, steam power, electrification, electronic information technology) and globalization dramatically changed the world and the way most humans live in present civilizations.
In the second half of the 20th century, the so-called "cold war" between the North Atlantic Alliance "NATO" (the USA with Canada and the western European states) and the Eastern European Alliance "Warsaw Pact" (the USSR with the states ruled by socialist or communist governments) triggered and led to an escalation of technological research and development for the purposes of the military armaments race, with an unprecedented accumulation of weapons of mass destruction (nuclear bombs), finally stopped in the early 1990s.
One of the leading technologies after World War II proved to be Information and Communications Technology, "IT&C". Computers and digitally driven economies, alongside mobile wireless communications and the global data transmission network that we call the "internet", are quite common nowadays, exponentially increasing the amount of data flowing through every sector and every activity of civilized human societies.
So it is quite important to understand how science and technology work together and how we can improve this cooperative relationship through the intelligent and efficient development of information and communication technologies.
Let us examine this in what follows.
Chapter 2 Science and Technology
2.1 The heritage of Aristotle
As the favored student of the Greek philosopher Plato, Aristotle was appointed teacher of the young Alexander (later Alexander the Great), whom he accompanied on his way to becoming the first world emperor in recorded human history. After the death of Alexander and the end of his empire, Aristotle left Athens for exile, but his followers at the Lykaion (Lyceum) Academy continued his legacy, so that his influence upon philosophy and science was respected until the Enlightenment Era, Aristotle being considered the most valued thinker of antiquity and the middle ages. As a matter of fact, his legacy lives on in the structure of western culture.
Several predecessors of Aristotle promoted quite similar ideas, but he was the one to consecrate them in the western community; let us just recall them – Pythagoras of Samos, Thales of Miletus, Empedocles of Acragas – his works later being supported, continued and promoted by Archimedes of Syracuse and, in the Roman Empire, by Claudius Ptolemaeus of Alexandria.
So Aristotle defined science as "the knowledge about the world that is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe" – as cited in Wikipedia.
On the other hand, technology is the practical application of science, the skill of craft, as stated by Aristotle in his book "Physics" … Since Greek citizens' public opinion considered working an activity of the lower classes (slaves, workers, craftsmen), Aristotle did not deepen the concepts concerning the technics and technological abilities of his time; these were to be reconsidered only through the contribution of Archimedes, a century later, and then of his followers. Archimedes himself learned and successfully practiced several crafts.
2.2 How science works
So, science builds and organizes knowledge with testable explanations and predictions. The most popular example of such a structure of organized knowledge is the "Elements of Geometry" by the Greek mathematician Euclid of Alexandria (c. 325–270 BC).
His major contribution to mathematics was the comprehensive structure of interconnected rules of reasoning, used to demonstrate the veracity or – on the contrary – the lack of truthfulness of a geometrical assumption and the conclusions derived from it.
The base of his method was a set of fundamental laws, called "axioms". An axiom is a statement that is considered to be true, serving as a premise for further reasoning and arguments. In Greek, the word "axioma" means a righteous and obviously true affirmation.
The elements of the axioms are previously given definitions – short statements which name the elements and describe their roles and mathematical features. Then some statements or affirmations are constructed as sentences considered to be obviously true, which can neither be demonstrated nor have to be proved – the axioms.
Building on the definitions and the axioms we can structure a series of theorems –
affirmations which consequently have to be verified and can be proved as being truthful
based on the axioms and on the previously demonstrated theorems.
With a sufficient stack of theorems we then can formulate a hypothesis concerning a
phenomenon we are studying or a mathematical reasoning we are interested in.
If and when we are able to formulate some assumptions and even predictions about the behavior of the mathematical or physical structure, and those predictions match the findings in reality or within the mathematical structure we observe, then we have found a valid theory.
Such a theory remains valid until we find a single example (or several) which is proven to contradict the conclusions of the theory under the given conditions of axioms and theorems. In this case we have to discard the theory and try to build a new one which might fit our observations and findings in reality.
The British mathematician, physicist and astronomer Stephen Hawking published several editions of his popular science book "The Theory of Everything", with references to the observable and verifiable evolution of the Universe – with light and gravity.
So far, his predictions about the existence of black holes in our galaxy have been confirmed.
2.3 How physics works
While in mathematics the systems of axioms (called "axiomatics") depend mostly on the will and knowledge of their author, in nature there are fundamental laws to observe which obviously depend neither on the will nor on the knowledge of anybody, but can be noticed and proclaimed by a scientist who observes them and then formulates a suitable text to describe them. These are "natural axioms" – so-called principles.
Based on these principles we can then formulate and prove theorems, also known as "laws of physics" or, generally, "laws of nature" (if extended to other domains such as biology, chemistry, astronomy and so on).
These can also lead to the statement of a theoretical hypothesis, for which some predictions or expectations of observable phenomena can be formulated. If these examples are confirmed by several independent observers (that is, by laboratories in dedicated research institutes and universities), then the hypothesis (literally, a "little thesis") turns into a thesis / theory.
The theory remains valid as long as it fulfills all the expectations and predictions deriving from it, but it is questioned and suspended if even a single example occurs that verifiably contradicts the theory, observed by multiple sources.
Such an overthrown theory must then be replaced by some new theory which corresponds to the observations in nature and which contains all theorems and observations of the former theory as particular cases or extreme situations.
In 1687 the British genius mathematician, physicist, astronomer, philosopher and theologian Sir Isaac Newton published his groundbreaking work on the principles of natural sciences (then called, in Latin, natural philosophy), which stated his vision of the structure and functioning of the universe and of the physical phenomena observable in the world.
The main part of his work was a set of three principles (plus two auxiliary ones, not mentioned here), formulated as follows:
"1. In an inertial frame of reference, an object either remains at rest or continues to move at a constant velocity, unless acted upon by a force.
2. In an inertial reference frame, the vector sum of the forces on an object is equal to the mass of that object multiplied by the acceleration of the object.
3. When one body exerts a force on a second body, the second body simultaneously exerts a force equal in magnitude and opposite in direction on the first body." [9]
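The second principle can be sketched numerically. The following Python fragment (an illustrative sketch with invented force values, not part of the original text) sums 2-D force vectors and divides by the mass:

```python
def acceleration(forces, mass):
    """Acceleration produced by a list of 2-D force vectors (Newton's second law)."""
    fx = sum(f[0] for f in forces)
    fy = sum(f[1] for f in forces)
    return (fx / mass, fy / mass)

# Equal and opposite forces cancel, so the object stays at rest (first law as a special case):
print(acceleration([(10.0, 0.0), (-10.0, 0.0)], 2.0))   # (0.0, 0.0)
# A net force of 6 N on a 2 kg object yields 3 m/s^2:
print(acceleration([(4.0, 0.0), (2.0, 0.0)], 2.0))      # (3.0, 0.0)
```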
These three principles were then continued by a whole construction of physical theorems – ordinary laws – which explained most of the phenomena observed by and known to the scientists of the seventeenth and eighteenth centuries. This physical knowledge was then used in technology – that is, in building machinery for the manufacturing of goods – triggering the first technological revolution of the western world.
2.4 Postulates
Sometimes the scientific community will neither recognize nor accept some natural principles as they are stated by their authors – the individuals who first notice, then observe and finally describe and proclaim them. The reasons for this are mainly the lack of public acceptance or the poor reputation of the particular scientists, or simply the egos of the dominant public figures of the academic community.
So, if the discoverer of a natural principle fails to proclaim it as such, she or he has no other choice but to proclaim it as an axiom – depending on the knowledge and the will of its author.
Let us conclude then:
Axioms are fundamental laws depending on the knowledge and the will of their author.
Principles are fundamental laws valid independently of the knowledge or will of anybody.
Postulates are valid natural principles proclaimed as axioms by their author.
2.5 What technology does
Technics ("crafting") and technology ("the knowledge of crafting") form the practical part of humanity's scientific culture. Based on theories about natural phenomena of economic interest which have been found valid, technics designs and develops applications such as buildings, devices, mechanisms, machines, apparatuses – structures of practical utility for their owners and users. The purpose of these technical products is to fulfill certain tasks in order to make life better, at least in some of its aspects, for the civilization which has developed the technical knowledge. Complementarily, technology is the set of knowledge, abilities and skills about how to make the products of technics.
Therefore technics and technology obtain from science just the knowledge needed for application and utilization. Professionals with technical education and practical training in technology were known to build engines – moving machines to get things done; this is why they are called "engineers" – a professional title for technicians trained at the highest academic educational level, usually at technical universities or technical institutes.
Lately, over the last five decades, the most dynamic part of technics and technology has been the domain of Information Technology and Communication, abbreviated IT&C or ITC. The main practical service of IT&C is computing, that is collecting, transmitting, processing, storing and interpreting data and the outcomes of mathematical operations on them, achieved with the help of electronic "computing" devices – computers or computing machines.
This graduation paper is about the imminent and far future of these computing machines.
Chapter 3 Measurement and computing
3.1 Measurement and measures
It is obvious that in order to deal with a mathematical model of reality, we need to transfer the features of the subject of study into our model. The best ways to do so are either to choose some descriptors (in a corresponding language, based on a form of logic) or to transpose the numerical and spatio-temporal properties of our subject. Therefore we need measurement.
Measurement is a comparison with a measurement unit – a comparison with a prototype model of the same type as the subject.
Wikipedia states this in a more academic way:
"Measurement is the assignment of a number to a characteristic of an object or event, which can be compared with other objects or events. The scope and application of measurement are dependent on the context and discipline."
Hence it basically describes the process of comparison.
In order to have corresponding terms of observation, different observers need to relate their findings to the same kind of measure, agreed upon in advance. These measures are usually called measurement units.
Most present-day countries use the International System of Units, first introduced in France shortly after the 1789 French Revolution. After World War II, most of the European countries and many of their former colonies worldwide, members of the United Nations, adopted the SI measurement system, while the related gauge models, or calibration standards, are kept at Sèvres, near Paris. Wikipedia describes it as follows:
"The International System of Units … is the modern form of the metric system, and is the most widely used system of measurement. It comprises a coherent system of units of measurement built on seven base units, which are the ampere, kelvin, second, metre, kilogram, candela, mole, and a set of twenty prefixes to the unit names and unit symbols … The system also specifies names for 22 derived units for other common physical quantities like lumen, watt, etc. … The reliability of the SI depends not only on the precise measurement of standards for the base units in terms of various physical constants of nature, but also on precise definition of those constants."
Some other countries, most of them former colonies of the British Empire, still use the imperial British measures, mainly for customary reasons – the people got used to them. The United States of America, although detached from the British Empire, still uses inches, feet, yards, furlongs and miles for length; ounces, pounds, quintals/hundredweights and tons for mass; temperature measured in Fahrenheit degrees; and some more.
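Since these imperial units are fixed by definition against the SI (the inch has been defined as exactly 25.4 mm since 1959), the conversions can be sketched directly; a minimal illustrative example:

```python
# Exact imperial-to-SI length conversions (the inch is defined as 25.4 mm exactly).
INCH_M = 0.0254
FOOT_M = 12 * INCH_M        # 0.3048 m
MILE_M = 5280 * FOOT_M      # 1609.344 m

def miles_to_km(miles):
    """Convert a distance in statute miles to kilometres."""
    return miles * MILE_M / 1000.0

print(round(miles_to_km(1.0), 6))  # 1.609344
```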
3.2 Processing and interpreting the measured data
Once the numerical data of the variables (observables) are collected, they have to be processed with mathematical methods, starting from simple arithmetical operations – addition, subtraction, multiplication, division – and continuing with more complex algebraic formulae, with operators like squares, cubes, square roots, fractions and polynomial expressions, and furthermore with expressions of mathematical analysis (derivatives – total and partial, integrals – one-dimensional or multi-dimensional, mathematical operators – gradient, d'Alembert and Lagrange operators, and so on), placed in what we call mathematical equations or formulae. After carrying out the mathematical operations upon the data, we obtain one or more outcomes of the procedures, sometimes called results or solutions.
These solutions are nothing more than the simple results of the mathematical transformations to which we have subjected the data; their importance derives from their interpretation. That is, the observer and science practitioner, whoever conducts the study, has to correlate and explain the meaning of the results.
If, for reasons of speed and convenience, computing machines are used, they have to be programmed in advance to "know" what a certain type of outcome means for the human observers and eventually to express it in a convenient way – displayed on a screen, printed on paper, transmitted as electromagnetic signals into a data network, transmitted to the actuators of a device or a machine, etc.
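The simplest processing step described above can be sketched with Python's standard library; the readings below are invented sample data, not measurements from the original text:

```python
import statistics

# Hypothetical repeated measurements of the gravitational acceleration g, in m/s^2:
readings = [9.79, 9.82, 9.80, 9.81, 9.78]

mean = statistics.mean(readings)     # the processed result ...
spread = statistics.stdev(readings)  # ... and a measure of its dispersion

print(round(mean, 2))  # 9.8
```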
3.3 Computing the model of reality
Let us summarize what has been discussed so far:
Technics is the practical application of science.
Science operates with mathematical and structural models of reality.
In its mathematical model, science builds validated theories based upon a structure of natural or artificial rules:
– fundamental laws (axioms, principles, postulates),
– theorems or provable laws, demonstrated based on the fundamentals,
validated by several practical observations, confirming each hypothesis into a truthful thesis, and gathering the confirmed sets of theses into a theory.
10
Whether we make the calculations by hand, with a given precision, or we use computing machines for them (which also have a precision limit), when we then turn the outcome of our computed data into a result or a solution for our problem of interest, the solutions will be neither precise/exact, nor will they replace reality. The results of our calculations remain what they are – a model of reality, not reality itself.
For practical purposes the scientific and technological solutions do not necessarily have to be exact, but they have to be an approximation close enough for our practical goals.
Within the last five decades computing machines have become better and better with every new generation: they calculate faster and faster, their parts are built smaller and smaller, their processing memory gets bigger and bigger, computing networks spread wider and wider all over the inhabited world, and the power consumption of global mobile communications and data transmission grows with every technological improvement of the IT&C systems … but they still remain a collection of models of reality, with their technical limitations.
Chapter 4 Classical physics
4.1 How classical physics works
Classical physics basically works as described before, in the sections about measurement, computing data, interpreting the results and modeling reality. Classical physics relies on plain arithmetic, elementary algebra, geometry, calculus, mathematical analysis, higher algebra, vector spaces, the theories of groups and of mathematical spaces, even nonlinear geometries, etc. All of these domains deal with the simplest and therefore most tractable mathematical elements: mostly numbers, scalars, vectors, sets, rows and strings, geometric figures, functions, families of functions (fields) and many more. Most of the utilized functions happen to be continuous, differentiable and integrable … but reality works with exceptionally complex objects as well – and they are not rare at all; as a matter of fact, the observable universe is swarming with them.
4.2 Deterministic models of reality
Therefore the models of reality used in classical physics and in the corresponding technics and technology are the equivalent of continuous and mostly bijective (one-to-one) functions. For each and every real object that we observe we obtain exactly one corresponding model, given the methods of measurement, computing data and interpreting the results. This one-to-one correspondence is called "determinism".
This is the world we know and are comfortable to live in, this is the way we model reality in our minds, this is what we are used to calling science and technology.
4.3 What was wrong with classical physics
At the end of the 19th century and at the dawn of the 20th century some leading physicists extensively studied the propagation of light and radio waves (electromagnetic fields and waves) and discovered a shocking fact: light did not entirely respect the deterministic laws of classical physics, particularly regarding the laws and rules of propagation. Soon two categories of phenomena emerged from the observations showing the "indiscipline" of light: black-body radiation with the photoelectric effect, and the propagation of light.
It became clear that the classical theories about light and the related electromagnetic fields were outrageously contradicted by the behavior of light itself.
They had to be dropped … and replaced with some new theories, corresponding to reality.
Chapter 5 Quantum physics
5.1 Dawn of quantum physics
In order to explain the black -body radiation, after sustained intellectual efforts an d following
his very keen intuition, in 1990 the German physicist Max P LANCK succeeded to prove that
in fact the electromagnetic radiation within the model of the black body necessarily had to
follow a discrete distribution of values, since it did not matc h a continuous function but rather
a statistical distribution, based on the works of Austrian physicist Ludwig B OLTZMANN .
Wikipedia states about the quanta "that electromagnetic energy could be emitted only in quantized form, in other words, the energy could only be a multiple of an elementary unit – where h is Planck's constant, also known as Planck's action quantum (introduced already in 1899), and ν is the frequency of the radiation", as in ε = hν.
The Nobel Prize website states about the work of Max Planck: "When a black body is heated, electromagnetic radiation is emitted with a spectrum corresponding to the temperature of the body, and not to its composition. … Max Planck solved this problem … introducing the theory of 'quanta' … with specific energies determined by a new fundamental constant h, thereafter called Planck's constant."
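Planck's relation ε = hν is simple enough to evaluate directly; a short sketch, using the exact SI value of h fixed by the 2019 redefinition (the green-light frequency is an illustrative value):

```python
H = 6.62607015e-34  # Planck's constant in J*s (exact since the 2019 SI redefinition)

def photon_energy(frequency_hz):
    """Energy of one light quantum, epsilon = h * nu."""
    return H * frequency_hz

# Green light at nu ~ 5.45e14 Hz carries about 3.6e-19 J per photon:
print(photon_energy(5.45e14))
```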
5.2 Early success of quantum physics
Quite soon after this first success, in 1905 quantum theory offered another gifted physicist, Albert EINSTEIN, the opportunity to prove its rightfulness. Einstein noticed that one of the two unsolved mysteries of light's behavior – the photoelectric effect – could be explained quite credibly with the help of Planck's quantum theory. Einstein published a short article in the leading physics journal of the time, 'Annalen der Physik', proposing a new concept of light particles – which he called "photons" – instead of continuous waves within the EM field, as the manifestation of Planck's electromagnetic energy quanta.
Accepting the idea of photons, it became very easy to rewrite the conservation law of energy (corresponding to the first principle of thermodynamics), indispensable in explaining the interaction of a photon with an electron within a metal atom. Thus the observable outcome of the photoelectric effect would be consistent with the theoretical assumptions, confirming the predictions of Planck's theory of light quanta.
Chapter 6 Development of Quantum Physics
6.1 It gets complicated with quanta
After the successful introduction of the concepts of energy quanta and light photons by the genius tandem Planck and Einstein, quantum theory soon experienced a fulminating development. A few years after Einstein's success, another young physicist – the Danish all-round genius Niels Henrik David BOHR – used quantum theory to explain the strange light spectra of atoms, which contradicted the relatively young planetary theory of the atom, introduced just a few years earlier by the winner of the Nobel Prize for chemistry Sir Ernest RUTHERFORD.
Bohr introduced the idea that an electron could drop from a higher-energy orbit to a lower one, in the process emitting a quantum of discrete energy. This became a basis for what is now known as the old quantum theory. [4] He therefore proposed two postulates stating the quantum behavior of the electrons within the atom, quite different from the interpretation of classical physics – but confirmed by spectroscopic observations of the atoms' radiation.
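Those spectroscopic confirmations follow the Rydberg formula, 1/λ = R_H(1/n₁² − 1/n₂²), which Bohr's postulates reproduce; a small sketch (the constant's value is quoted from memory to the precision shown):

```python
R_H = 1.0967758e7  # Rydberg constant for hydrogen, in 1/m

def line_wavelength_nm(n_upper, n_lower):
    """Wavelength of the photon emitted when the electron drops n_upper -> n_lower."""
    inv_lambda = R_H * (1.0 / n_lower**2 - 1.0 / n_upper**2)
    return 1e9 / inv_lambda

# The H-alpha line of the Balmer series (3 -> 2) is the well-known red line near 656 nm:
print(round(line_wavelength_nm(3, 2)))  # 656
```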
6.2 Quantum Mechanics
Quite soon some of the promoters of quantum physics realized that the usual mathematics for modeling reality was no longer suited for the new kinds of phenomena they were studying in the submicroscopic world. Mainly the principles and the prime theorems of the rules of motion and the related momentum and energy transformations had to be reformulated in order to match the observable outcome of the phenomena.
It became imperative to rewrite the whole structure of mechanics, first published in the 17th century by Isaac Newton, adapting it to the successes of quantum theory in explaining what was going on in the subatomic world of elementary particles and light quanta.
The cradle for this enterprise appeared at the Institute of Theoretical Physics, a research center near Copenhagen, financed by the Danish Government and the Carlsberg Foundation. The institute opened in 1921 and, under the guidance of Niels Bohr, the concepts of physics were rewritten with the help of many innovative young physicists invited to join the think tanks of the institute. Among them we should mention Werner HEISENBERG (Germany), Erwin SCHRÖDINGER (Austria) and Paul DIRAC (United Kingdom).
6.3 Lie groups and Hermitian mathematics
At the end of the 19th century, the Norwegian mathematician Sophus Marius LIE developed a theory of continuous symmetry, later on applied to differential equations, geometry and theoretical physics – mainly quantum mechanics. The Lie transformation groups in particular play an extremely important role in modern quantum physics, as developed starting in the 1920s at the Institute of Theoretical Physics in Copenhagen.
The construct of complete normed vector spaces – as developed by the Polish mathematician Stefan BANACH in 1920 – followed by the development of self-adjoint operators, also called Hermitian matrices, in Hilbert spaces (higher algebra applied in functional analysis): these all proved to be the appropriate mathematical tools to rewrite the principles and laws of mechanics and, later on, the laws of all physical phenomena known at the time.
With his PhD paper in 1924, Werner Heisenberg published his findings derived from Hermitian mathematics, and a year later applied them to the newly developed quantum mechanics, as elaborated with the help of the German mathematical physicist Pascual JORDAN and under the guidance of Niels Bohr. He discovered some amazing features of quantum particle-waves, namely the Principle of Complementarity alongside the Uncertainty Principle, as well as the Principle of Superposition alongside the quantum collapse of physical observables.
The Principle of Complementarity states that the variables appear grouped in pairs of complementary properties, of which neither member can be observed simultaneously with the other, nor can they be precisely measured together. The position and the momentum of an elementary particle are the typical example of such paired-up observables.
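For the canonical position-momentum pair, the quantitative form of this limitation is Heisenberg's uncertainty relation, Δx·Δp ≥ ħ/2, which can be evaluated directly; a minimal sketch with an illustrative confinement length:

```python
HBAR = 1.054571817e-34  # reduced Planck constant h/(2*pi), J*s

def min_momentum_spread(delta_x_m):
    """Lower bound on the momentum spread: delta_x * delta_p >= hbar / 2."""
    return HBAR / (2.0 * delta_x_m)

# An electron confined to about one atomic diameter (1e-10 m) cannot have a momentum
# spread smaller than roughly 5.3e-25 kg*m/s:
print(min_momentum_spread(1e-10))
```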
The Principle of Superposition, on the other hand, holds that there is a huge number (up to an infinity) of possible eigenstates, with their eigenvalues (in German "Eigenwerte"), for each physical feature of the object – evolving as wave functions; but a "wave function collapse" occurs when the observation and measurement of a quantum phenomenon reduces the multitude of equally possible eigenstates to one single outcome – the one value that we can and will observe in our reality.
That is: the bare observation of a physical phenomenon influences and determines the result of the object's evolution in time.
These extremely surprising and unexpected behavior patterns of quantum objects – waves and particles alike – represent the foundation of the goal we pursue in this paper: the eloquent comprehension of quantum computing and the possible construction of quantum computers.
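Superposition and collapse can be illustrated with the simplest quantum-computing object, a single qubit; the following sketch (an illustrative toy simulation, not a description of real hardware) collapses an equal superposition a|0⟩ + b|1⟩ to a classical bit with probabilities |a|² and |b|²:

```python
import math
import random

def measure(amp0, amp1, rng):
    """Collapse the single-qubit state a|0> + b|1> to one classical outcome."""
    p0 = abs(amp0) ** 2
    assert abs(p0 + abs(amp1) ** 2 - 1.0) < 1e-9, "state must be normalized"
    return 0 if rng.random() < p0 else 1

# An equal superposition: each outcome occurs with probability 1/2.
amp = 1.0 / math.sqrt(2.0)
rng = random.Random(1)  # fixed seed so the run is reproducible
counts = [0, 0]
for _ in range(10_000):
    counts[measure(amp, amp, rng)] += 1
print(counts)  # roughly [5000, 5000]
```

Each individual measurement yields one definite value, yet the statistics over many runs reveal the underlying superposition – exactly the behavior pattern described above.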
6.4 Limitations of appliances
Most electrical, electronic and computing models, devices, machines and apparatuses have the structure of quantum matter inside and work according to the laws of quantum mechanics and, furthermore, the principles and laws of quantum physics, but they are used in the old ways of classical physics.
In the macroscopic world, the simple observation of objects and the measurement of their properties has already collapsed their variables, each to a single value – the one we have just measured. The measurements are then processed with approximate calculations and displays of the results and solutions, yielding an approximate result and approximate solutions to the application issues to be solved.
In addition, our electrically driven appliances work in an extremely inefficient way concerning their energy consumption, namely with a horrendous waste of energy, transforming most of the involved mechanical labor and electromagnetic energy into heat – that is, useless mechanical vibrations of the molecules and atoms of matter.
If we go on like this, we will very soon have to come up with solutions to build new, energy-efficient and much faster and more reliable computing machines, namely the quantum computers. We will describe our expectations about them later on.
Chapter 7 Classical physics versus quantum physics
7.1 Duality of matter
Photons and electromagnetic fields behave in a peculiar way: they are fields – that is, they travel in the form of multiple superposed continuous functions, as classical physics describes them – but at the same time they behave like discrete energy quanta – that is, they interact in the form of a bunch of material particles, each of them colliding with other point-like objects. They do both at the same time, although we observe their dual nature in different states of the phenomena, depending on how we observe them.
There is also the problem of the velocity of light: light, being a set of electromagnetic waves,
travels at the highest speed presently known to man. The Special Theory of Relativity and the
General Theory of Relativity of Albert Einstein (the latter with Karl Schwarzschild's early
exact solutions) state that for any massive object (bearing individual "inertial" masses for
every atom it consists of), reaching the speed of light in vacuum would require an amount of
energy tending to infinity.
We are practically unable to observe what light really does while travelling through space
and time, so we must assume that our closest theory about the phenomena is approximately
true, as long as we cannot figure out how to observe the propagation of light even more
precisely; in collisions, light behaves like particles.
Later on, the French physicist Louis de BROGLIE showed that duality also works the other
way around: common "massive" particles can behave like waves and, as a matter of fact, can
be waves as well, a clue that matter de facto is just another form of field; let us say it consists
of condensed fields. Matter waves were proposed to science by de Broglie in 1924 and by
1927 were already confirmed experimentally by George Paget THOMSON and, independently,
by Clinton DAVISSON with Lester GERMER, both in electron diffraction experiments.
7.2 Double slit experiment
In the early 19th century the British scientist Thomas YOUNG introduced an experiment to
study and then to explain the interference phenomena of light. For this purpose he built an
experimental device with an opaque pane which had two closely spaced slits cut into it.
When light is sent through the double slit, the electromagnetic light wave has to pass its wave
front through both slits, the two partial wave fronts remaining coherent with each other.
This class of experiments would later be called "double path" experiments, in which a wave
front is split into two separate waves that are later combined back into a single wave.
Because of changes in the path lengths of the two waves, one can observe an interference
pattern, caused by the phase shift between the two combined wave fronts.
After the superposition of the wave front segments on the projection screen, interference
occurs, seen as a figure of lighter and darker stripes or bands.
A century later, the newly emerged quantum physics pondered whether the energy quanta
called "photons" were crossing either through the left slit or through the right one, or through
both simultaneously.
According to classical physics, each of the photons would have to "choose" one and only one
of the slits, then arrive on the screen placed behind the double slit pane.
That model states that we should expect to see two dots of light on the screen. In reality we
observe the phenomenon of light interference, with a central light fringe called the
"maximum" and several lateral secondary fringes. This should be impossible, since classical
light would not be able to double its quanta: they would pass through a single slit and could
not pass through the other one at the same time. But light does. This is a clear sign of the
ubiquity of light: it can be present in two places at the same time. Strange!
Some explanations: since light is always absorbed at the screen as individual photon particles
(light interacts with matter as particles, not as waves), the interference pattern appears
because of the variable density of the photons hitting the screen.
Other versions of the experiment have shown that each observed photon passes through one
slit only (as a particle would) rather than through both slits (as a wave would).
So these experiments definitely prove that particles do not build up the interference pattern if
the observer detects which slit they have passed through. This fact illustrates the principle of
wave-particle duality. Later on we shall return to this subject, since it constitutes one of the
quantum effects of matter at the microscopic level of existence.
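The interference pattern described above can be sketched numerically. The following Python snippet is only an illustrative model of the far-field two-slit intensity; the wavelength and slit distance are assumed values, and the single-slit envelope is ignored:

```python
import math

# Far-field two-slit interference, ignoring the single-slit envelope.
# Illustrative (assumed) values: green light, slits 20 micrometers apart.
wavelength = 532e-9    # m
slit_distance = 20e-6  # m

def intensity(theta):
    """Relative intensity at viewing angle theta (radians)."""
    phase = math.pi * slit_distance * math.sin(theta) / wavelength
    return math.cos(phase) ** 2

# Central maximum: equal path lengths, fully constructive interference.
print(intensity(0.0))  # → 1.0

# First minimum: path difference of half a wavelength, destructive.
theta_min = math.asin(wavelength / (2 * slit_distance))
print(round(intensity(theta_min), 6))  # → 0.0
```

The alternation of maxima (lighter fringes) and minima (darker fringes) as the angle varies is precisely the figure of stripes and bands that Young observed.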
7.3 Matter waves of de Broglie
In his PhD thesis of 1924 the French scientist Louis de BROGLIE proposed the concept that
massive matter could behave like a wave. Already knowing that light has both wave-like and
particle-like properties, he concluded that electrons must also have wave-like properties,
based on a remark by the Austrian-Dutch physicist Paul EHRENFEST. All massive particles in
motion would behave as waves, with a corresponding wavelength inversely proportional to
their momentum, the coefficient being Planck's constant h. Since the value of this constant is
so very small, it is difficult to observe the wave effects of massive matter with the naked eye.
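De Broglie's relation, wavelength = h / p, makes this smallness concrete. The short Python sketch below uses assumed illustrative speeds, and the function name is ours:

```python
# de Broglie wavelength: lambda = h / p, with p = m * v for a slow particle.
PLANCK_H = 6.626e-34       # Planck's constant, J*s
ELECTRON_MASS = 9.109e-31  # kg

def de_broglie_wavelength(mass, velocity):
    """Wavelength in meters of a massive particle moving at the given speed."""
    return PLANCK_H / (mass * velocity)

# An electron at 1% of light speed: wavelength around 2.4e-10 m,
# comparable to atomic spacings, hence observable in diffraction.
electron_lambda = de_broglie_wavelength(ELECTRON_MASS, 0.01 * 3.0e8)
print(electron_lambda)

# A 0.15 kg ball at 30 m/s: wavelength around 1.5e-34 m, far too small
# to ever be observed, as the text explains.
print(de_broglie_wavelength(0.15, 30.0))
```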
Wave-like behavior of matter was first experimentally proved by George Paget Thomson and
shortly after by Davisson and Germer, both with the diffraction of electrons, and later
confirmed by other universities as well.
It was clear that the concept introduced in 1905 by Einstein about the coexistence of the wave
and corpuscular aspects was equally right for light photons and for massive particles. From
the 1930s up to the 1960s, repeated experiments showed that matter waves and corpuscular
quanta exist for all forms of matter, fields and particles alike, observable even for huge
organic molecules having hundreds of atoms in their structure.
7.4 Weisskopf's Dilemma
A physics joke attributed to Victor Frederick WEISSKOPF (Austrian-American physicist)
states that, asked whether light was waves or particles, professor Weisskopf answered:
"Light is waves on Mondays, Wednesdays and Fridays; it is particles on Tuesdays, Thursdays
and Saturdays; and on Sundays we think about it."
The same applies to massive matter, according to Louis de Broglie: "Electrons would be
particles on Mondays, Wednesdays and Fridays; they would be waves on Tuesdays,
Thursdays and Saturdays; and on Sundays they would go out to date light photons."
It is not a dilemma any more, after all!
Chapter 8 Quantum entanglement
8.1 Quantum entanglement
The quantum structure and behavior of matter determine some outstanding and surprising
effects; one of them is called "quantum entanglement". The QE phenomenon consists in the
fact that the quantum states of two or more "entangled" objects have to be described with
reference to each other, even though the individual objects may be widely separated in space
and time.
These connections lead to correlations between observable physical variables of the systems,
called quantum states, so that their mutual observation leads to a unique and unrepeatable set
of data, the kind Niels Bohr called "quantum collapse": the entangled objects would have
unique interconnected pairs of values of their parameters, evident to the observers involved
but accordingly different for other observers.
8.2 What we could do with entangled particles
It would be possible to connect two particles in a single quantum state, but we could not
predict which particular set of measured parameters will be recorded. As an immediate
result, measurements performed on one of the systems would instantaneously influence the
behavior and parameters of the system entangled with it, at any distance.
The next step would be to build applications in computing, constructing what we can call a
Quantum Computer, in order to perform immensely fast quantum computing and quantum
cryptography, but also to practically achieve quantum teleportation.
These facts would also have philosophical consequences for how we perceive our
surrounding world and any place in the universe, since so-called local realism would lose its
sense, with objects at any tangible distance being instantly connected to each other.
8.3 Quantum communication and computing
In the first half of this year (2018), several research teams worldwide presented their newest
results on quantum entanglement at room temperature, already visible to the naked eye,
although the processors still have to be kept at cryogenic temperatures; the phenomenon gets
really spooky now that big research institutes like those of Google and IBM have just
introduced their versions of working quantum computers.
We will return to this in the chapter reserved for conclusions.
Chapter 9 Origins of classical computing
After a long period of developing and using mathematical tools in several civilizations, such
as mathematical tables correlating variables with their corresponding calculated values,
mechanical devices emerged out of the need for the automation of calculations. The abacus
and the slide rule are the most relevant examples:
9.1 Abacus
The abacus or counting frame is a calculating tool that emerged in resembling forms in Asia
and Europe some time in the early Middle Ages. Today, abaci made from simple materials
(beads sliding on wires mounted in a frame) are used in kindergartens and elementary
schools to teach children to count and to perform elementary arithmetic.
Later on, in the first half of the 17th century, the French scientist Blaise PASCAL developed
his arithmetic machine or calculator, intending to help his father, a supervising tax collector
in his home city of Rouen, with scrupulous arithmetical calculations.
The "Pascaline" is still in use today, almost unimproved, best known as the mileage
(odometer) wheels on the speedometer panel of cars and as the little numbered wheels of
electromechanical utility meters.
9.2 Slide rule
The slide rule, also known colloquially in the United States as a "slipstick", is basically a
mechanical analog computing device. Used extensively before the appearance of electronic
pocket calculators, mainly by engineers and scientists, the slide rule serves primarily for
multiplication and division, but also for more complex operations such as square roots,
logarithms, exponents and trigonometric functions, rather than for additions or subtractions.
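The principle that makes this possible is that the slide rule adds lengths proportional to logarithms, and log(a*b) = log(a) + log(b). A minimal Python sketch of that trick (the function name is ours, for illustration):

```python
import math

# A slide rule multiplies by adding lengths proportional to logarithms:
# sliding two log-ruled scales against each other adds the logarithms.
def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does."""
    return 10 ** (math.log10(a) + math.log10(b))

print(round(slide_rule_multiply(3.0, 7.0), 6))   # → 21.0
print(round(slide_rule_multiply(2.5, 4.0), 6))   # → 10.0
```

Division works the same way by subtracting the logarithms, which is why plain additions and subtractions are the one thing a slide rule does not help with.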
The inventor of the slide rule is considered to be the reverend William OUGHTRED, in the
first half of the 17th century, building on the emerging work on logarithms by John Napier.
The emergence of the handheld electronic scientific calculator in the mid-1970s made slide
rules obsolete; nowadays it gets quite difficult to even find one, perhaps in forgotten office
cabinet drawers or in science and technology museums.
9.3 Charles Babbage's Difference Engine
Although the 19th-century British mathematician Charles BABBAGE was never able to
complete the construction of any of his inventions (his Difference Engine and the Analytical
Engine, the latter described in 1837), having harsh conflicts with his chief engineer, his
devices and machine parts laid the foundations of the industry of building mechanical and
electrical accounting, ciphering and computing machines. However, in the late 1930s and
early 1940s, his concept of a general-purpose computer was actually built in several
functional variants.
Babbage's mechanical Analytical Engine incorporated an arithmetic logic unit, provided
some flow control in the form of loops, and furthermore had integrated memory, making it
the first design of a so-called Turing-complete computer. That means that the logical
structure of the Analytical Engine was much the same as what we use today in modern
electronic computers.
Chapter 10 Mechanical computing
After the development and usage of the abacus counting device in antiquity, followed later by
the Pascaline and the slide rule, alongside other calculating and counting devices, another
family of computer precursor machines emerged: the encrypting machines.
10.1 Encrypting machines
During the Middle Ages, military units had to communicate while situated in different
locations on the battlefield or in their camps, having to exchange news and receive orders
from their leading officers at the courts of the monarchs. Usually the military dispatches were
sent as notes carried either by emissaries or by pigeons; the enemy forces would try to
intercept these notes in order to find out about the intentions of their foes, with devastating
effects on the outcome of the confrontation once they had learned about them.
So secrecy was the key to successful communications. To avoid the loss of information to the
enemy, encrypting the messages was the obvious solution to the problem. Even if the
emissary fell into the hands of the foe, his messages would not be intelligible to those for
whom they were not intended.
For encryption, a message, referred to as the "plaintext", is processed with a "cipher" (an
encryption algorithm), resulting in a "ciphertext" that can be read and understood by its
recipient only if properly and completely decrypted.
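This round trip can be illustrated with a deliberately weak toy cipher, the classical Caesar shift; it is far simpler than any rotor machine, and the code below is only a sketch of the plaintext, cipher and ciphertext terminology:

```python
# Toy substitution cipher (Caesar shift): shift each letter by a fixed
# amount. Decryption is the same operation with the opposite shift.
def caesar(text, shift):
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

plaintext = "Attack at dawn"
ciphertext = caesar(plaintext, 3)
print(ciphertext)               # → Dwwdfn dw gdzq
print(caesar(ciphertext, -3))   # → Attack at dawn
```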
Several algorithms and ciphering methods were developed through the centuries, but the first
half of the 20th century brought some astonishing mechanical and electromechanical
solutions for the ciphering business, through the invention and distribution of ciphering
machines.
As early as 1918 the German engineer and inventor Arthur SCHERBIUS developed and
patented the idea of a rotor-based cipher machine, mainly for banking and stock exchange
purposes, much as ciphering codes for bank accounts are used nowadays.
With his associate Ernst Ritter, in 1923 Arthur Scherbius started to produce and market their
machine under the brand name "Enigma" in the banking and accounting business. Pretty soon
the German military purchased and implemented the Enigma ciphering machines for their
telecommunications by radio and telegraph.
We should also mention that Arthur Scherbius patented his invention of ciphering
mechanisms at the United States Patent Office in 1928, only to be overlooked by the
authorities.
10.2 Zuse's Z1 computing machine
In his biography, the German engineer and inventor Konrad ZUSE recalled how he got into
the computing machine business, really out of laziness. [11, Zuse Konrad, My life, pp. 27-29]
"After graduating from technical university, Zuse started working for the Henschel airplane
factory in Berlin, in charge of performing static calculations for airplane wings. Being
terribly bored by his job, he quickly resigned and decided to change his life, becoming an
entrepreneur and inventor.
Between the years 1935-1937 he built his first creation, a machine called the "automaton Z1",
financed with money loaned from his parents and friends.
Meanwhile, in the USA, other inventors were working on computing machines, like
Atanasoff and Berry, Aiken, Mauchly and Eckert; they all had the resources of universities or
companies at their disposal. Zuse was working unattended and alone, meaning that the whole
conception and the design of the machines was his own work," states his biographer.
More about the mechanical computer from the biography: "The Z1 was operational in 1938.
Zuse, ignorant of the internal structure of any type of calculator built at the time, started from
scratch and developed a whole new kind of mechanical construction.
While the calculators of the time were using the decimal system with rotating mechanical
parts, Konrad Zuse opted for the simpler binary system, with metallic plates that could take
only two states, by shifting position. The difficulty lay in the fact that the movements of
every single logical gate had to be coupled with all the other gates. In fact, the functioning of
the mechanical parts was more difficult than the logical system."
10.3 Rubik's cube
Incidentally, in 1976 the Hungarian inventor and architect Ernö Rubik reinvented mechanical
slides similar to those of the Zuse mechanical computing machine, for educational purposes,
presenting to the world a mathematical computing toy bearing his name, the "Rubik's Cube",
still in use today by teenagers and young adults.
Chapter 11 Electrical computing
11.1 Electrical encrypting machines
After a short period of working with a mechanical drive, the business-purpose and military
ciphering machines soon turned electric, or rather electromechanical, keeping their rotor
wheels. The readings and the transmission of data were, in fact, done electrically, for reasons
of reliability. The successive generations of the Enigma ciphering machine got better and
better, but their functions basically kept the rotor components of Scherbius.
11.2 Zuse's electrical computers Z2 to Z4
"Reassigned to the airplane factory at the beginning of WW II, in 1940 he also built the
machine Z2, which used an integer processor built out of relays and a mechanical memory,
salvaged from decommissioned telephone exchanges.
Zuse convinced the German Aerospace Research Office (DVL in German) to partially
finance the development of the computing machine Z3, which would be built using only
relays, operational in 1941. Luckily, the national-socialist government of the time failed to
realize the military importance of Zuse's invention, so they simply ignored him and left him
alone.
Zuse continued working for the Henschel airplane factory, but started his own business in
1941, "Zuse Ingenieurbüro und Apparatebau", founded with the sole scope of developing and
marketing computers." The biography reveals even further developments [11, pp. 31-39]:
"The next machine, Z4, built for the aviation research association DVL, would have 1024
memory units (exceeding the mere 64 of Z2), constructed and almost ready in 1945, just
before the Russian occupation. Zuse fled to Switzerland, thus escaping potential military
captivity.
After the war, Konrad Zuse started his company again in Bavaria and continued developing
his computing machines, not being aware of the British and American efforts. He also
developed an algorithmic language, called Plankalkül (calculus of programs), presently
considered to be the first high-level programming language conceived at the time.
The Swiss Technical University of Zürich rented the restored Z4 in 1950 for university
research; thus the Z4 was the first commercial computer in operation, several months before
the first UNIVAC was delivered in the USA.
We can conclude that the most important achievement of Konrad Zuse was the invention of a
general-purpose family of computing machines: completely digital, floating-point, fully
programmable machines, constructed in almost total isolation from 1936 to 1945.
His life's dream of creating a small computer for scientific applications and for business was
finally fulfilled, though economically not very successful. Obsessed with the efficiency of his
computing machines, Zuse always considered himself the true inventor of the computer,
although the rest of the world would not acknowledge it until his death in 1995."
Later on, he sold his company to Nixdorf Incorporated, which eventually merged with the
Siemens holding and with the Japanese company Fujitsu, so most of Zuse's inventions were
indirectly included in the scientific, technical and industrial heritage of the western world.
Other appliances also emerged from Konrad Zuse's inventions after the transfer of his
company to Siemens AG, wherever the Siemens corporation distributed electrical equipment.
Although largely unknown to the general public (owing to his individuality and to not
belonging to some great industrial company or university research center), Konrad Zuse
remained a genius and simultaneously a discreet master professional of the middle class,
struggling for the economic survival of himself, his family, his company and his entourage.
Chapter 12 Development and success of classical computers
12.1 The race for computing machines
During World War II the efforts to intercept and decipher the communications of the German
military went bitterly and ruthlessly on. The British and the American military each
developed a project of building an electronic computing machine, to help them interpret the
German notes without having the ciphering codes for the Enigma machine, since those were
changed every day for new ones.
Therefore the British hired in great secrecy the most gifted young scientist of the time, Alan
Turing, who defined and developed a model for a general-purpose computer, named after him
the "Turing-complete computer", describing the technical features such a machine should
have.
By the end of the war, the research team had almost succeeded in finishing the British
deciphering computer, but they lacked a corresponding programming language, not being
aware of the existence of such endeavours. Still, they achieved remarkable successes.
Meanwhile, the United States of America deployed their own efforts to develop a running
computing machine, not being aware of the British efforts and progress. Starting from the
computer model of Atanasoff and Berry, they achieved their goal at the end of the war.
12.2 Post-war computing machines
In the USA, the Atanasoff-Berry Computer (ABC, 1937-1942) at Iowa State College was a
good start for digital computer models, but it was neither programmable nor fit for the Turing
criteria.
Let us recall some of the major milestones in computer development:
1943: Colossus Mark 1, then Mark 2 in 1944, at the UK Bletchley Park deciphering center
1944: IBM ASCC Mark 1 at Harvard University, with a staff led by Howard Aiken
1945: ENIAC at the United States Army's Ballistic Research Laboratory
1948: Manchester Baby in the UK, using cathode-ray (Williams) tube memory
1949: CSIRAC in Australia, with mercury delay-line memory
The year 1947 also marked the invention of the transistor by William Shockley, John
Bardeen and Walter Brattain, which soon led to the development of integrated circuits and
finally to microprocessors (the Motorola and Intel corporations).
The 1970s brought the home computer and the personal computer by Apple Inc., and then the
development of computer software, most successfully by the Microsoft Corporation and its
Windows series of operating systems, in the 1980s and onward through the 1990s to the
present.
The big computing companies like IBM continued to develop their mainframe computers for
scientific, industrial and military purposes, still working on them for big data, as we write.
12.3 Development of computing
Thousands of articles, books, studies and graduation papers have already been written on the
history of computing, so we will not add another one to them. We will keep in mind, though,
what the purpose of computing machines was at the dawn of their development, what it is in
present days, and what is to come next.
Following the principle of the modern computer, as proposed in 1936 by Alan Turing, any
"universal computing machine" should be capable of computing any data representable by
numbers or tables and of executing instructions written in programs, previously stored on a
device called memory.
The American mathematician and inventor John von Neumann later led a research team
developing the theory of computer programming and also assisted in the building of the first
US computer models.
12.4 Limitations of classical computing
With all the hardly imaginable developments of supercomputers and computer networks,
alongside mobile computing devices for personal use like laptops, tablet computers,
smartphones and smart watches, and more recently general-purpose programmable mini
computers like the Raspberry Pi or its smaller sibling the Arduino, all of them have some
serious limitations in comparison with our ever-growing demand for precision and for
so-called big data, which keeps getting bigger and bigger.
First of all, they have limited storage capabilities. Even with storage externalized to network
storage devices "in the cloud", we are gathering more and more data, most of which becomes
obsolete or useless by the minute, and we do not know how to filter them and get rid of the
useless data and computer trash.
Secondly, their programs are subject to planned obsolescence, generated by the software
industry itself. We constantly have to buy new software, which then requires new hardware to
keep up with the development of processor speed and memory usage.
Thirdly, we are using more and more electrical energy from the grids simply by running our
computing devices: the private ones at home or at our working place, but also the ones we do
not even see and do not know where they are, since they belong to big computer companies
like Google, Microsoft or IBM. Statistics show that humanity is already using more than 20%
of all the electrical power in the world's grids just to power computers and communication
networks. And it is growing.
12.5 Seeking solutions
We are going to suffocate in our own technological success if we continue to use classical
computers and classical software.
If we want to survive as a technological species, we simply need to leap to the next
computing level: we need even smaller computers that consume far less power than the
present ones, and we need them to compute much faster than the classical ones, performing
far more flips per second than processors can do now.
The first issue could be addressed by further miniaturizing the processors (the microchip
industry is approaching conducting layers of 7 nanometers or less, a few atoms in cross
section). Furthermore, we need to develop data processing devices that replicate the
functioning of biological neurons and neural networks, as energy-efficient as nature itself.
The second issue could be solved by finally developing the Quantum Computer into an
affordable model, working at room temperature and at superfast computing speeds.
Chapter 13 Quantum Computing
13.1 New computational strategies
We have already learned that classical computers tend to reach their physical boundaries,
mainly because of the limitations of classical physics itself, despite our best efforts. So it
becomes obvious that we have to focus more on quantum physics, seeking applications for
building functional quantum computers. "For that, we have to study quantum mechanics from
a computer science point of view." [14]
In the early 1980s, the Nobel Prize-winning physicist Richard Feynman suggested two steps
for quantum computing research: first, accurately simulating quantum physical systems;
then, writing the software and undertaking the actual construction of quantum computers.
In 1985, David Deutsch presented his concepts of quantum algorithms with quantum bits.
13.2 Qubits
In classical digital computers, information is processed in digits, thus all processed data are
represented as binary numbers, with only two possible values {0,1}.
A "qubit" (derived from "quantum bit") is the equivalent unit of quantum information. Its
name is attributed to the American physicist Benjamin Schumacher [15].
Since quantum mechanics allows the qubit to exist in a simultaneous superposition of both
states, the system has another observable associated with it, for instance an equal probability
of 1 and 0, corresponding to quantum "eigenstates" of information, where the individual
probabilities sum to unity (normalization of the state vector).
Hence, qubits can hold more information, up to two classical bits, using superdense coding.
The outcome is a multitude of possible states, outperforming classical computation by an
exponential factor and thus raising the computing power to an extremely high level.
When a qubit system is observed, it is reduced to a Boolean parameter, as it can then take
only the classical values 0 and 1; hence this state is called a sharp observable. In reality this
corresponds to the presence or the absence of a particle.
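These notions can be sketched numerically: a qubit is described by two complex amplitudes whose squared magnitudes give the measurement probabilities and must sum to unity. A minimal Python illustration (the function name is ours):

```python
import math

# A qubit as a pair of complex amplitudes (alpha, beta) over the basis
# states |0> and |1>. Measuring yields 0 with probability |alpha|^2 and
# 1 with probability |beta|^2; the two must sum to one (normalization).
def probabilities(alpha, beta):
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1

# Equal superposition: both outcomes equally likely.
s = 1 / math.sqrt(2)
p0, p1 = probabilities(s, s)
print(p0, p1)  # both approximately 0.5

# A classical-looking state: outcome 0 is certain (a "sharp observable").
print(probabilities(1.0, 0.0))  # → (1.0, 0.0)
```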
13.3 Computation with Qubits
In order to perform computation with qubits, in 1989 David Deutsch invented what he called
Quantum Logic Networks. This is how they work:
The system is divided into individual qubits. Two orthogonal states of each qubit are
designated as the computational basis states, "0" and "1".
Just like in binary logic, we can transform the qubit information through equivalent quantum
logic gates, which are local unitary transforms that operate on only a few state bits at a time.
For a complete description of a system having n components, classical physics requires only
n bits, while quantum physics requires 2^n complex numbers to represent the same system.
Hence quantum computation allows many more values to evolve simultaneously.
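As a small sketch of such a gate, the Hadamard transform, one of the standard quantum logic gates, is a 2x2 unitary matrix acting on the single-qubit state vector. A Python illustration using NumPy (the variable names are ours):

```python
import numpy as np

# The Hadamard gate H maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)  # the basis state |0>
superposed = H @ ket0
print(np.round(np.abs(superposed) ** 2, 3))  # → [0.5 0.5]

# H is unitary and its own inverse: applying it twice restores |0>.
print(np.round(np.abs(H @ superposed) ** 2, 3))  # → [1. 0.]
```

For n qubits the state vector has 2^n complex entries, which is exactly the exponential bookkeeping mentioned above.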
13.4 What is Quantum Computing
In order to understand how quantum computation works, David Deutsch first developed
corresponding simulation algorithms, which could run on classical computers up to their
natural limit. Later on, starting in the 1990s, the leading IT companies began to build real
Quantum Computers, handling a very small number of qubits and slowly increasing the
number of qubits they could manage. In April 2018, IBM announced their 50-qubit computer
to be completed and operational.
13.5 Quantum Entanglement
In a conversation with Niels Bohr about quantum effects, Albert Einstein called "spooky
action at a distance" the behavior of connected pairs of particles that are separated in space
and time but instantly show the same outcomes of interactions. This is because quantum
physics states that entangled particles stay connected, hence any action performed on one
object immediately affects the other, no matter at what distance.
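A minimal numerical sketch of such a connected pair, assuming the standard Bell state (|00> + |11>)/sqrt(2) written as a four-component state vector:

```python
import numpy as np

# Bell state of two qubits; amplitude order: |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Squared magnitudes give outcome probabilities for a joint measurement.
probs = np.abs(bell) ** 2
print(np.round(probs, 3))  # → [0.5 0.  0.  0.5]

# Only the correlated outcomes 00 and 11 ever occur: once one qubit is
# measured, the result of measuring the other is fixed, at any distance.
```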
The rules of Bohr's quantum collapse indicate that, for example, an "ignored" photon actually
exists in all its possible states simultaneously, but when "observed" or measured by an
interested person, the photon appears in one state only, its eigenstates having collapsed into
one.
A corresponding experiment to check the validity of such entanglement plans to send one
photon of an entangled pair up to the International Space Station in orbit while observing its
pair in a terrestrial laboratory, just to see whether light shows instant entanglement effects.
Incidentally, in April 2018, the science magazines "Nature" and "The Conversation" reported
the success of a research team that had proven the entanglement of two macroscopic objects
at an appreciable distance, visible to the naked human eye.
13.6 Quantum Communication and Teleportation
These facts open new perspectives for Quantum Teleportation, by which the quantum
information of a given object can be exactly and almost instantly transmitted from one place
to another, using regular classical communication and previously established quantum
entanglement between the sender and the receiver.
Chapter 14 Quantum Teleportation
14.1 Actual Quantum Teleportation
Until recently, teleportation was to be seen only in science fiction films like Star Trek, with
the command meme "Beam me up! Engage!" Within the last two decades this quasi-fictional
phenomenon has proven to be feasible and, even more, has been accomplished and
double-checked by several research labs and universities around the globe.
For now, the information of one entangled object is sent somewhere else to its entanglement
pair and used to instantly change the state of the latter in a manner identical to the emitter.
Later on we hope to be able to dismantle an object at its emission location, send its
constituent information to the receiver's location and then instantly rebuild it out of material
already present on the other side. That would be SF-like object teleportation.
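The information-transfer step just described corresponds to the textbook teleportation protocol: one shared Bell pair plus two classical correction bits. The following Python sketch (our own simulation with NumPy, not the code of any actual experiment) verifies that the receiving qubit ends up in the original state in all four measurement branches:

```python
import numpy as np

# Single-qubit gates and measurement projectors.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)  # projector |0><0|
P1 = np.array([[0, 0], [0, 1]], dtype=complex)  # projector |1><1|

def op(gates):
    """Tensor product of one 2x2 matrix per qubit (qubit 0 leftmost)."""
    m = gates[0]
    for g in gates[1:]:
        m = np.kron(m, g)
    return m

def cnot(c, t):
    """CNOT with control qubit c and target qubit t, on 3 qubits."""
    a = [I2] * 3; a[c] = P0
    b = [I2] * 3; b[c] = P1; b[t] = X
    return op(a) + op(b)

# Qubit 0 holds the (arbitrary) state to teleport; qubits 1, 2 start in |0>.
psi = np.array([0.6, 0.8], dtype=complex)
state = np.kron(psi, np.kron([1, 0], [1, 0]))

# 1) Entangle qubits 1 and 2 into a Bell pair shared by sender and receiver.
state = cnot(1, 2) @ (op([I2, H, I2]) @ state)
# 2) Bell-basis measurement preparation on qubits 0 and 1.
state = op([H, I2, I2]) @ (cnot(0, 1) @ state)

# 3) For every possible measurement outcome (m0, m1), applying the
#    classical corrections X^m1 then Z^m0 leaves qubit 2 exactly in psi.
for m0 in (0, 1):
    for m1 in (0, 1):
        proj = op([P1 if m0 else P0, P1 if m1 else P0, I2])
        branch = proj @ state
        branch = branch / np.linalg.norm(branch)  # collapse to this outcome
        if m1:
            branch = op([I2, I2, X]) @ branch
        if m0:
            branch = op([I2, I2, Z]) @ branch
        q2 = branch.reshape(2, 2, 2)[m0, m1, :]  # amplitudes of qubit 2
        assert np.allclose(q2, psi)
print("qubit 2 carries the original state in all four branches")
```

Note that only classical information (the two bits m0, m1) travels from sender to receiver; no matter is transported, which is exactly the scheme described above.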
As a matter of fact, in July 2017 a Chinese research team working with the Jiuquan Satellite
Launch Centre actually teleported, with the reconstruction method, the quantum state of a photon
to a satellite in orbit, for the first time over such a distance, as published in their science
and research magazine. [17]
Even if teleporting macroscopic objects and living organisms is to be expected only in the far
future, this experiment has already been performed several times with microscopic objects like
subatomic particles or even simple molecules, over distances of several kilometers. Hopefully our
descendants, generations from now, will succeed in actually fulfilling our visions of turning
Science Fiction features into tangible reality.
Chapter 15 Towards Quantum Computing
15.1 Professional expectations on IT&C
At the 2002 CeBIT exhibition in Hannover, Germany, Microsoft CEO Steve Ballmer made
some very interesting statements about the future of computing:
"We're sitting in a room like this and as I look out I see a lot of people with paper and pencils.
Why aren't we helping you with information technology devices? Someday in the next few
years you'll come with a little tablet device. It's your PC — you'll write on it, my voice will
be recorded and it will be automatically fed as a piece of voice input over the wireless
network in this room onto your tablet PCs. That's the way a meeting like this should work."
15.2 Fictional expectations of "Quantonics"
Teenager visitor Meike Bärmann replied to Ballmer with a rhetorical promise:
"By the time I grow up, there will come a day when we will not be able to build smaller
computer chips with the usual methods, but will have to develop new techniques, like the use
of anatomical materials (neural pathways of leeches) or light pathway technologies.
In the distant future there will be quantum computers available, which will exceed many
thousands of times the speed and the efficiency of present-day computers."
15.3 Achievements
As a matter of fact, both of them were right, and rather soon: most of those future visions have
become everyday reality by now, only a decade and a half after their statements.
The essential parameters of circuitry technology – the so-called hardware – improve by a
factor of 1000 every 20 years. In 1964 the first IBM computer family had four transistors
on a chip. According to this rule we would expect four million transistors on one chip by 2004;
instead we have since exceeded hundreds of millions of transistors packed into one single-core chip.
This shows us how difficult it gets to precisely anticipate the future evolution.
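The rule of thumb above can be checked with a few lines of arithmetic. This is a minimal sketch, assuming the text's figures (a factor of 1000 every 20 years, starting from 4 transistors in 1964); the function name is ours:

```python
# Rule of thumb from the text: circuitry improves by a factor of 1000 every 20 years,
# starting from 4 transistors per chip in 1964.
BASE_YEAR, BASE_COUNT = 1964, 4
FACTOR_PER_20_YEARS = 1000

def expected_transistors(year):
    """Transistor count predicted by the 1000x-per-20-years rule."""
    return BASE_COUNT * FACTOR_PER_20_YEARS ** ((year - BASE_YEAR) / 20)

for year in (1984, 2004, 2018):
    print(year, f"{expected_transistors(year):,.0f}")
# 2004 -> 4,000,000, matching the text's "four million" prediction;
# 2018 -> roughly half a billion, i.e. "hundreds of millions".
```

So the rule itself held up reasonably well; what the author notes is how quickly reality overtook any fixed-year prediction.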
Both predictions weren't so wrong after all. We can hereby expect significant progress in
computing design along two paths:
• further developing computing machines based on artificial neural networks similar to
biological neural networks,
• continuing to develop quantum computers, making them more independent of
cryogenics (functional at room temperature) and making them accessible for business,
educational and home purposes.
These goals would not work very well without excluding military and economic dominance
from the development practices … which seems an impossible target, just like the duality of
matter – or is it? So maybe we should observe, and thereby choose, a quantum state in which
the empowered ones turn peaceful and act accordingly; after all, it is a matter of the observer's
quantum collapse – just as quantum physics states.
Chapter 16 Conclusions
More than a century of quantum physics and eight decades of computing machines have most
astonishingly changed our world, forever. We have lived through good and less good things.
Humanity, as we know it in the western world, has reached a technological level with a
development speed far too fast for many of the elderly and middle-aged.
Children, teenagers and young adults adapt much faster, but some of the developments are
leaving the slower individuals behind. Digital gaming is a good example for observing
such things. The entertainment industry has come a long way until now, and a long way awaits
ahead of us to fulfill our childhood dreams.
Anticipation and science fiction started as an entertainment game, in the youth era of
television. We were charmed by the tricorder device of Mr. Spock and the communicator of
Captain Kirk in the Star Trek series. The foldable 2G mobile phones came and went; we now
have 4G smartphones with wireless connections, Bluetooth and NTS data transmission,
several applets to download, cloud storage devices, voice-recognition board computers in cars
and lots of other things to download from the App Stores and Play Stores. We can find out
almost anything on the world wide web and the other parts of the internet, we store huge
amounts of data in cloud storage services, we get more and more dependent on the internet of
things … but the development of IT keeps going on at an ever faster pace.
Amazing, isn’t it?!
Then again, the limitations of classical computing cannot be overlooked anymore, and neither
can the growing tribute we have to pay to computing, mobile communication and automation.
It becomes imperatively necessary for Information and Communication Technology to take the
leap towards quantum computers – a huge and difficult leap indeed, but surely unavoidable.
Let us conclude:
It is up to the young generation to take over the mission and lead it on, towards the future –
with new quantum computing hardware, with simulation software and then authentic quantum
computing software, maybe even a new informatics concept – let us look forward and call it:
"Quantonics".