The Impact of Simulations’ Predictive Power on the Nanotechnology Revolution

  • Ali AlZubi (Speaker)

Activity: Talk or presentation › Oral presentation

Description

Throughout human history, scientists have been obsessed with understanding the universe beyond our planet Earth. Until 1900, the main interest of all the sciences was connected to what we can see and observe with our eyes. In the early 1900s, the rudder was turned in the completely opposite direction: the debate became about describing and understanding the true nature of the basic building units of our entire universe, atoms!
Atoms are so small that we cannot see or observe them, neither with our naked eyes nor with any traditional microscope. The size of an atom is about one tenth of a nanometer (a nanometer being one billionth of a meter). To imagine how small an atom is, think of our body next to a planet 100 times larger than Earth: an atom compares to our body roughly the way our body compares to that planet. It is therefore nearly impossible, practically and technologically, to build something atom by atom the way we wish to; what we can build is always tied to the nature of the chemical reactions between atoms and how they bond with each other.
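As a rough, illustrative order-of-magnitude check (not part of the talk description itself), taking an atomic diameter of about $10^{-10}$ m, the number of atoms lined up along a single meter is

\[
\frac{1\ \mathrm{m}}{10^{-10}\ \mathrm{m}} = 10^{10},
\]

that is, ten billion atoms side by side in one meter.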
The idea of creating materials that do not exist and could not form naturally started at the end of the 1980s, building on a device IBM had invented in the early 1980s: the Scanning Tunneling Microscope (STM). The STM is not an optical device; it is an electronic device that measures the variation of the electric (tunneling) current when a conducting tip is brought to within about one nanometer of a metal surface. Thanks to this device, scientists were able to capture direct images with an accuracy of 0.1 nanometer! It was the first time in human history that we could see how atoms are arranged on a gold surface, and that success convinced scientists to sharpen their picture of the atomic world at the nanoscale. At that time, theoretical modeling and simulation of nanoscale atomic layers had begun, but they were severely limited by computational power: storage in a premium PC of the 1980s did not exceed about 80-120 megabytes per square inch (MB/in²). That was a big challenge for computational physicists.
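As a brief aside (a standard textbook relation, not given in the talk description), the STM's sensitivity comes from the exponential dependence of the tunneling current $I$ on the tip-surface distance $d$:

\[
I \propto e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2 m \phi}}{\hbar},
\]

where $m$ is the electron mass and $\phi$ is the work function of the surface. Because a change in $d$ of even a fraction of a nanometer changes $I$ by orders of magnitude, the instrument can resolve features at the 0.1-nanometer scale.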
The lack of computational power drove scientists to focus their research on how the materials used in data storage devices could be improved, or perhaps replaced. They tried to model and simulate nanomaterials of different structures. In 1988, one theoretical simulation predicted that non-magnetic metals become magnetic when their size shrinks to about one nanometer. Then, in 1989, experimental scientists discovered a new phenomenon that appears at the scale of a few nanometers; for further reading, it is called Giant Magnetoresistance (GMR). This discovery led industry to increase data storage capacity by about 100 times, from 100 megabytes (MB) to 10 gigabytes (GB), within only ten years (1990-2000), compared with the previous forty years since the first computer was made. Because of the tremendous impact of the GMR discovery, the 2007 Nobel Prize in Physics was awarded to the scientists who discovered the phenomenon. After that, modeling and simulation became very powerful, and we started to hear about supercomputers. Scientists did not stop their research in modeling and simulation to improve the materials that had been found. As a result, computational power kept increasing, and data storage density came very close to 1 TB/in² by 2010: again, roughly 100 times more storage within ten years!
Since 2010, scientists have employed that huge computational power both to push it further and to investigate nanomaterials that do not exist naturally. They have predicted many materials with unusual properties that were never known before. These materials may contain fewer than 100 atoms (less than 20-50 nanometers in size) and take different forms and shapes, such as nanowires, nanolayers, and nanoparticles. The only drawback was the technological side of manufacturing nanodevices and building them atom by atom, in order to confirm the theoretically predicted findings.
In 2012, scientists at IBM were able to improve the STM, invented in the early 1980s, and use it to manipulate atoms and arrange them exactly the way we wish. They announced the smallest data storage device ever, storing a bit in only 12 atoms! At that point the revolution in the nano industry began. Since then, the capacity of data storage devices has improved by a factor of about 15, from 1 TB/in² in the early 2010s to about 15 TB/in².
Period: 15 Nov 2022
Event title: Faculty Speaker Series: Panel Discussion on Education, American University of Kuwait, Salmiya, Kuwait
Event type: Conference