Researchers use supercomputer to determine whether ‘molecules of life’ can be formed naturally in the right conditions.


Basic biology textbooks will tell you that all life on Earth is built from four types of molecules: proteins, carbohydrates, lipids, and nucleic acids. And each group is vital for every living organism.

But what if humans could actually show that these “molecules of life,” such as amino acids and DNA bases, can be formed naturally in the right environment? Researchers at the University of Florida are using HiPerGator—the fastest supercomputer in U.S. higher education—to put that question to the test.

HiPerGator—with its AI models and vast capacity for graphics processing units, or GPUs (specialized processors designed to accelerate graphics rendering)—is transforming the molecular research game.

Until a decade ago, conducting research on the evolution and interactions of large collections of atoms and molecules could only be done using simple computer simulation experiments; the computing power needed to handle the datasets just wasn’t available.

It is now, thanks to HiPerGator. Using the supercomputer, UF Ph.D. student Jinze Xue (from the Roitberg Computational Chemistry Group) was able to conduct a large-scale early Earth chemistry experiment during the 2023 winter break.

Xue utilized more than 1,000 A100 GPUs on HiPerGator to perform a molecular dynamics experiment on 22 million atoms that identified 12 amino acids, three nucleobases, one fatty acid, and two dipeptides. The discovery of larger molecules, which would not have been possible on smaller computing systems, was a significant achievement.

“Our previous success enabled us to use Machine Learning and AI to calculate energies and forces on molecular systems, with results that are identical to those of high-level quantum chemistry but around 1 million times faster,” said Adrian Roitberg, Ph.D., a professor in UF’s Department of Chemistry who has been using Machine Learning to study chemical reactions for six years.

“These questions have been asked before but, due to computational limitations, previous calculations used small numbers of atoms and could not explore the range of time needed to obtain results. But with HiPerGator, we can do it.”
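For readers curious what “energies and forces from machine learning” look like in practice, here is a minimal, illustrative sketch using TorchANI, the Roitberg group’s open-source neural-network potential library. The ANI-2x model and the single water molecule below are stand-ins chosen for brevity; the article does not say which model or settings were used in the 22-million-atom HiPerGator run.

```python
# Minimal sketch: neural-network potential energies and forces with TorchANI
# (the Roitberg group's open-source library). ANI-2x and this lone water
# molecule are illustrative stand-ins, not the HiPerGator production setup.
import torch
import torchani

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torchani.models.ANI2x(periodic_table_index=True).to(device)

# Atomic numbers (O, H, H) and Cartesian coordinates in Angstrom
species = torch.tensor([[8, 1, 1]], device=device)
coordinates = torch.tensor([[[0.000,  0.000, 0.000],
                             [0.000,  0.757, 0.587],
                             [0.000, -0.757, 0.587]]],
                           requires_grad=True, device=device)

# Energy comes straight from the trained network; forces come from autograd
energy = model((species, coordinates)).energies          # Hartree
forces = -torch.autograd.grad(energy.sum(), coordinates)[0]

print(f"E = {energy.item():.6f} Ha")
print("forces (Ha/Angstrom):\n", forces.squeeze(0))
```

A production molecular dynamics run repeats this energy-and-force evaluation at every time step for millions of atoms, which is where the GPU count quoted above comes in.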

Erik Deumens, Ph.D., the senior director for UFIT Research Computing, explained how this full takeover of HiPerGator was possible.

“HiPerGator has the unique capability to run very large ‘hero’ calculations that use the entire machine, with the potential to lead to breakthroughs in science and scholarship,” Deumens said. “When we found out about the work Dr. Roitberg’s group was doing, we approached him to try a ‘hero’ run with the code he developed.”

The emergence of AI and powerful GPUs now enables such data-intensive scientific simulations to be carried out—calculations that scientists could only imagine a few years ago.

“Using Machine Learning methods, we created a simulation using the complete HiPerGator set of GPUs,” Roitberg said. “We were able to see, in real time, the formations of almost every amino acid (alanine, glycine, etc.) and a number of very complex molecules. This was very exciting to experience.”

This project is part of an ongoing effort to discover how complex molecules can form from basic building blocks, and to make the process automatic through large computer simulations. Roitberg and his research group spent many hours working with members of UFIT. Ying Zhang, UFIT’s AI support manager, ran point for the experiment.

“Ying put together a team comprised of Research Computing staff and staff from NVIDIA to help scale compute runs, provide invaluable advice and help, and accelerate analysis of the data to the point where the analyses were done in just seven hours (instead of the three days we initially expected it to take),” Roitberg said. “We met every week, from initial conception to the final results, in a very fruitful collaboration.”

The results, and the short time in which HiPerGator was able to deliver them, were groundbreaking, bringing researchers one step closer to answering questions about how complex molecules are formed. And the fact that Roitberg was able to run this computation shows that UF has the capability to support “hero runs” or “moonshot calculations” that move scientific, engineering, and scholarly projects forward.

“This is a great opportunity for UF faculty,” Roitberg said. “Having HiPerGator in-house—with the incredible staff willing to go above and beyond to help researchers produce groundbreaking science like this—is something that makes my non-UF colleagues very jealous.”

Mass extinction by 2100? Supercomputer predicts one-quarter of Earth’s species will die by century’s end


More than a quarter of the world’s animals and plants will go extinct by the end of the century, according to a scientific model created by one of Europe’s most advanced supercomputers.

Scientists say 10 percent of plants and animals will disappear by 2050, with the number rising to 27 percent by 2100. This extinction “cascade” means that children born today might well be the last generation to see elephants or koalas, the researchers warn.

The researchers say the world is undergoing its “sixth mass extinction event,” driven by global warming and changes to land use. The team says earlier models of extinction trajectories are not particularly useful because they do not account for co-extinctions, where species go extinct because others on which they depend die off.

“Think of a predatory species that loses its prey to climate change,” says study author Professor Corey Bradshaw of Flinders University in a statement.

“The loss of the prey species is a ‘primary extinction’ because it succumbed directly to a disturbance. But with nothing to eat, its predator will also go extinct, a co-extinction. Or, imagine a parasite losing its host to deforestation, or a flowering plant losing its pollinators because it has become too warm,” Prof. Bradshaw continues.

“Every species depends on others in some way.”

Scientists are trying to find real-world tipping points

Until now, researchers have not been able to interconnect species on a global scale in order to work out how much additional loss will take place through co-extinctions.

While earlier studies have examined different aspects of extinction, such as the direct effects of climate change and habitat loss on species’ fates, these have not been combined to predict the overall scale of extinctions.

For the study, academics used one of Europe’s most powerful supercomputers to make “synthetic Earths” complete with virtual species and more than 15,000 food webs.

The networks were linked by who eats whom, and then climate and land-use changes were applied to the system in order to inform future projections.

Virtual species were able to recolonize regions as the climate changed, adapt to changing conditions, go extinct because of global heating, or fall victim to an extinction cascade.
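The study’s actual model is far richer than this, layering climate forcing, land-use change, dispersal, and adaptation on top of the food webs, but the core co-extinction rule the article describes, that a consumer dies out once everything it eats is gone, can be sketched in a few lines. The mini food web below is purely hypothetical.

```python
# Toy sketch of a co-extinction cascade: after a set of primary extinctions,
# any consumer whose prey have all disappeared also goes extinct, and the
# loss propagates. Purely illustrative; the study's model also includes
# climate, land use, dispersal, and adaptation.
import networkx as nx

def coextinction_cascade(food_web: nx.DiGraph, primary_extinctions):
    """Edges point prey -> consumer. Basal species (no prey) never starve."""
    extinct = set(primary_extinctions)
    changed = True
    while changed:
        changed = False
        for sp in food_web.nodes:
            if sp in extinct:
                continue
            prey = set(food_web.predecessors(sp))
            if prey and prey <= extinct:   # it had prey, and all of it is gone
                extinct.add(sp)
                changed = True
    return extinct

# Hypothetical mini food web: grass -> rabbit -> fox, flower -> bee
web = nx.DiGraph([("grass", "rabbit"), ("rabbit", "fox"), ("flower", "bee")])
print(coextinction_cascade(web, {"grass"}))    # {'grass', 'rabbit', 'fox'}
print(coextinction_cascade(web, {"flower"}))   # {'flower', 'bee'}
```

Removing a single basal species drags its consumers down with it; the researchers scaled this kind of cascade up to more than 15,000 real food webs on the supercomputer.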

“Essentially, we have populated a virtual world from the ground up and mapped the resulting fate of thousands of species across the globe to determine the likelihood of real-world tipping points,” says study author Dr. Giovanni Strona from the University of Helsinki.

“We can then assess adaptation to different climate scenarios and interlink with other factors to predict a pattern of coextinctions.”

“By running many simulations over three main scenarios of climate until 2050 and 2100 — the so-called Shared Socioeconomic Pathways (SSP) from the Intergovernmental Panel on Climate Change (IPCC), we show that there will be up to 34% more co-extinctions overall by 2100 than are predicted from direct effects alone,” Dr. Strona says.

Co-extinctions triple the threat to vulnerable species

Prof. Bradshaw says what is even more frightening is that co-extinctions will raise the overall extinction rate of the most vulnerable species by up to 184 percent by the end of the century, according to the new projections.

“This study is unique, because it accounts also for the secondary effect on biodiversity, estimating the effect of species going extinct in local food webs beyond direct effects. The results demonstrate that interlinkages within food webs worsen biodiversity loss, to a predicted rate of up to 184% for the most susceptible species over the next 75 years,” Bradshaw explains.

“Compared with traditional approaches to predicting extinctions, our model provides a detailed insight into variation in patterns of species diversity responding to the interplay of climate, land use, and ecological interactions,” the researcher continues.

“Children born today who live into their 70s can expect to witness the disappearance of literally thousands of plant and animal species, from the tiny orchids and the smallest insects, to iconic animals such as the elephant and the koala … all in one human lifetime.”

Bradshaw adds that although climate change is already acknowledged as a major driver of extinction, the new study demonstrates that humans have underestimated its true impact on the diversity of life on Earth.

“Without major changes in human society, we stand to lose much of what sustains life on our planet,” the team concludes.

Dr. Strona says that the findings leave “no doubt” that climate change is responsible for most extinctions globally.

The world’s fastest supercomputer just broke the exascale barrier.


https://www.sciencenews.org/article/supercomputer-exascale-frontier-speed-record-computing-calculations

NASA Upgraded Its Supercomputer To Recreate The Origin Of Stars



Pleiades is the supercomputer housed at the NASA Advanced Supercomputing (NAS) facility at NASA Ames Research Center near Mountain View, California; it was ranked the world’s thirteenth-fastest computer on the TOP500 list in November 2015. But this was not enough for the team at the research center, and as of July 1, 2016, Pleiades features a total of 28 racks of Intel Xeon (Broadwell) processors, increasing its theoretical peak performance from 6.28 to 7.25 petaflops, which should give it a boost in the next TOP500 list.

At the NASA Advanced Supercomputing (NAS) facility researchers combine data from several observatories including the Hubble Space Telescope with the supercomputer’s amazing capabilities to get the detailed images they need to uncover the truths of the great unknown universe.

Without the upgrades to the supercomputer, the images now being produced would have been impossible, and researchers can now progress their studies at a much faster rate than before. So far, the results generated from ORION2 have been consistent, and the simulations clearly show the interplay of gravity, radiation, hydrodynamics, magnetic fields, and much more. Building on the current study, researchers can also use the upgraded technology to take a closer look at planetary formation.

Press Release Update:

To make room for more Intel Xeon E5-2680v4 (Broadwell) racks, as of July 1 NASA removed all of the remaining racks that housed Intel Xeon X5670 (Westmere) processors. The upgrade doubled the number of Broadwell nodes, increasing the computer’s theoretical peak performance to 7.25 petaflops, and brings the total number of racks containing Intel processors to 161, providing more than 900 terabytes of memory.


Are humans the new supercomputer? Team blurred the boundaries between man and machine


A screenshot of one of the many games available. In this case the task is to shoot spiders in the “Quantum-Shooter,” but there are many other kinds of games.

Philosopher René Descartes’ saying about what makes humans unique is beginning to sound hollow. ‘I think—therefore soon I am obsolete’ seems more appropriate. When a computer routinely beats us at chess and we can barely navigate without the help of a GPS, have we outlived our place in the world? Not quite. Welcome to the front line of research in cognitive skills, quantum computers and gaming.

Today there is an ongoing battle between man and machine. While genuine machine consciousness is still years into the future, we are beginning to see computers make choices that previously demanded a human’s input. Recently, the world held its breath as Google’s algorithm AlphaGo beat a professional player at the game Go—an achievement demonstrating the explosive speed of development in machine capabilities.

But we are not beaten yet—human skills are still superior in some areas. This is one of the conclusions of a recent study by Danish physicist Jacob Sherson, published in the journal Nature.

“It may sound dramatic, but we are currently in a race with technology—and steadily being overtaken in many areas. Features that used to be uniquely human are fully captured by contemporary algorithms. Our results are here to demonstrate that there is still a difference between the abilities of a man and a machine,” explains Jacob Sherson.

At the interface between quantum physics and computer games, Sherson and his research group at Aarhus University have identified one of the abilities that still makes us unique compared to a computer’s enormous processing power: our skill in approaching problems heuristically and solving them intuitively. The discovery was made at the AU Ideas Centre CODER, where an interdisciplinary team of researchers works to transfer some human traits to the way computer algorithms work.

Quantum physics holds the promise of immense technological advances in areas ranging from computing to high-precision measurements. However, the problems that need to be solved to get there are so complex that even the most powerful supercomputers struggle with them. This is where the core idea behind CODER—combining the processing power of computers with human ingenuity—becomes clear.

Our common intuition

Like Columbus in QuantumLand, the CODER research group mapped out how the human brain is able to make decisions based on intuition and accumulated experience. This was done using the online game “Quantum Moves.” Over 10,000 people have played the game, which allows anyone to contribute to basic research in quantum physics.

“The map we created gives us insight into the strategies formed by the human brain. We behave intuitively when we need to solve an unknown problem, whereas for a computer this is incomprehensible. A computer churns through enormous amounts of information, but we can choose not to do this by basing our decision on experience or intuition. It is these intuitive insights that we discovered by analysing the Quantum Moves player solutions,” explains Jacob Sherson.

The laws of quantum physics dictate an upper speed limit for data manipulation, which in turn sets the ultimate limit to the processing power of quantum computers—the Quantum Speed Limit. Until now a computer algorithm has been used to identify this limit. It turns out that with human input researchers can find much better solutions than the algorithm.

This is how the “Mind Atlas” looks. Based on 500,000 completed games, the group has been able to visualize our ability to solve problems. Each peak on the ‘map’ represents a good idea, and the areas with the most peaks – marked by red rings – are where human intuition has hit on a solution. A computer can then learn to focus on these areas, and in that way ‘learn’ about the cognitive functions of a human. Credit: CODER/AU

“The players solve a very complex problem by creating simple strategies. Where a computer goes through all available options, players automatically search for a solution that intuitively feels right. Through our analysis we found that there are common features in the players’ solutions, providing a glimpse into the shared intuition of humanity. If we can teach computers to recognise these good solutions, calculations will be much faster. In a sense we are downloading our common intuition to the computer,” says Jacob Sherson.

And it works. The group has shown that we can break the Quantum Speed Limit by combining the cerebral cortex and computer chips. This is the new powerful tool in the development of quantum computers and other quantum technologies.
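The Quantum Moves optimization code itself is not shown in the article, but the general idea of handing a player’s intuitively “good” starting solution to a conventional numerical optimizer can be illustrated with a toy problem. Everything below, the cost function included, is an invented stand-in and not the real tweezer-transfer physics.

```python
# Toy illustration of "downloading intuition to the computer": seed a local
# optimizer with a smooth, player-style ramp versus a random guess. The cost
# function is an invented stand-in, not the real Quantum Moves fidelity.
import numpy as np
from scipy.optimize import minimize

T = 32
t = np.linspace(0.0, 1.0, T)

def cost(u_inner):
    u = np.concatenate(([0.0], u_inner, [1.0]))       # path with fixed endpoints
    smoothness = 50.0 * np.sum(np.diff(u, 2) ** 2)    # penalize jerky moves
    ruggedness = np.sum(np.sin(3 * np.pi * u) ** 2)   # creates local minima
    return smoothness + ruggedness

rng = np.random.default_rng(0)
seeds = {
    "random guess": rng.uniform(0.0, 1.0, T - 2),
    "player-style smooth ramp": 0.5 * (1.0 - np.cos(np.pi * t[1:-1])),
}
for name, seed in seeds.items():
    res = minimize(cost, seed, method="L-BFGS-B")
    print(f"{name:>24s}: final cost = {res.fun:.4f}")
```

The mechanics are what matter here: the human-supplied guess simply becomes the optimizer’s starting point, and the machine refines it from there.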

We are the new supercomputer

Science is often perceived as something distant and exclusive, conducted behind closed doors. To enter you have to go through years of education, and preferably have a doctorate or two. Now a completely different reality is materialising.

In recent years, a new phenomenon has appeared—citizen science breaks down the walls of the laboratory and invites in everyone who wants to contribute. The team at Aarhus University uses games to engage people in voluntary science research. Every week people around the world spend 3 billion hours playing games. Games are entering almost all areas of our daily life and have the potential to become an invaluable resource for science.

“Who needs a supercomputer if we can access even a small fraction of this computing power? By turning science into games, anyone can do research in quantum physics. We have shown that games break down the barriers between quantum physicists and people of all backgrounds, providing phenomenal insights into state-of-the-art research. Our project combines the best of both worlds and helps challenge established paradigms in computational research,” explains Jacob Sherson.

The difference between the machine and us, figuratively speaking, is that we intuitively reach for the needle in a haystack without knowing exactly where it is. We ‘guess’ based on experience and thereby skip a whole series of bad options. For Quantum Moves, intuitive human actions have been shown to be compatible with the best computer solutions. In the future it will be exciting to explore many other problems with the aid of human intuition.

“We are at the borderline of what we as humans can understand when faced with the problems of quantum physics. With the problem underlying Quantum Moves we give the computer every chance to beat us. Yet, over and over again we see that players are more efficient than machines at solving the problem. While Hollywood blockbusters on artificial intelligence are starting to seem increasingly realistic, our results demonstrate that the comparison between man and machine still sometimes favours us. We are very far from computers with human-type cognition,” says Jacob Sherson and continues:

“Our work is first and foremost a big step towards the understanding of physical challenges. We do not know if this can be transferred to other challenging problems, but it is definitely something that we will work hard to resolve in the coming years.”

 

Supercomputer Faster Than Humans Uses Less Power Than a Hearing Aid


Lawrence Livermore National Laboratory, one of the country’s top scientific research facilities, announced the new mega machine yesterday, a product of IBM and the US government. The new supercomputer, built for national security missions, makes decisions like a human brain and uses less power than a hearing aid.

Using a platform called TrueNorth, a brain-inspired group of chips that mimics a network of 16 million neurons with 4 billion synapses’ worth of processing power, this supercomputer is capable of recognizing patterns and solving problems much like a human does.

While AIs that think the way humans do have been a struggle to create for quite some time, this machine finally does the trick by learning from its mistakes, as humans do, and adapting quickly to changing, complex situations. The system uses just 2.5 watts of power, similar to the amount needed by an LED lightbulb!
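TrueNorth’s own programming tools are not described in the article, but the style of computation such chips perform in silicon, sparse and event-driven rather than clock-driven number crunching, can be illustrated with a generic leaky integrate-and-fire neuron. All parameter values here are illustrative, not TrueNorth’s.

```python
# Generic leaky integrate-and-fire (LIF) neuron, the style of spiking,
# event-driven computation that neuromorphic chips like TrueNorth implement.
# All parameters and the input drive are illustrative, not TrueNorth's.
import numpy as np

def lif_run(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Return the time steps at which the neuron fires."""
    v, spikes = 0.0, []
    for step, drive in enumerate(input_current):
        v += dt * (-v / tau + drive)   # leak toward rest, integrate input
        if v >= v_thresh:              # threshold crossing emits a spike
            spikes.append(step)
            v = v_reset
    return spikes

rng = np.random.default_rng(42)
drive = 200.0 * (rng.random(1000) < 0.3)   # 1 s of noisy, sparse input
print(f"{len(lif_run(drive))} spikes in 1 s of simulated input")
```

Because the neuron only does work when spikes occur, hardware built around this model can sit near zero power most of the time, which is how a “supercomputer” ends up drawing less than a hearing aid.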

SF Gate points out that in five years or so, TrueNorth chips could even help smartphones achieve facial recognition, or help smart glasses guide blind people through their surroundings.

The array of TrueNorth chips, which set the lab back a relatively cheap-sounding $1 million, will be used in government cybersecurity missions. It will also help “ensure the safety, security and reliability of the nation’s nuclear deterrent without underground testing,” according to the press release.

 

A supercomputer that knows if you’re going to die


A supercomputer developed in the US can predict the likelihood of a person’s death with almost 100 per cent accuracy, a media report said.

The machine, installed at Boston’s Beth Israel Deaconess Medical Centre, draws on the data of more than 250,000 people collected over a period of 30 years to make speedy diagnoses, The Mirror reported on Monday.

The machine’s capacity for speedy disease recognition could potentially save lives as well as predict patients’ imminent demise, the report added.

“Our goal is not to replace the clinician… This artificial intelligence is really about the augmenting of doctors’ ability to take care of patients,” Steve Horng, a doctor at the hospital, was quoted as telling BBC.

Patients at the hospital are linked up to the supercomputer, which collects and analyses data about their condition every three minutes, measuring everything from oxygen levels to blood pressure to give doctors “everything we need to know about a patient”.

When the computer says no, doctors can “predict with 96 per cent confidence” when patients may die.

“If the computer says you’re going to die, you probably will die in the next 30 days,” Horng said.

This physicist has built a supercomputer from old PlayStations.


A home-made PlayStation 3 supercomputer is 3,000 times more powerful than any desktop processor, and is being used to study black holes.

Next time you upgrade your gaming console for the latest model, there may be a better option than simply throwing the old one away – you can use it for science instead.

Gaurav Khanna, a black hole physicist at the University of Massachusetts Dartmouth in the US, has managed to build a powerful and extremely cheap supercomputer using old PlayStation 3s (PS3s), and he’s used it to publish several papers on black holes.

His research focusses on finding gravitational waves, which are curvatures in space-time that ripple out from a violent astrophysical event, such as two black holes colliding. They were first predicted by Einstein’s theory of general relativity, but no one has been able to observe them.

“Science has become expensive,” Khanna told Parker. “There’s simply not that much money going around, either at the university or the federal level. Supercomputing allows scientists to make up for the resources they don’t have.”

In principle, a supercomputer is simply many standard computers linked together via a network. But instead of using regular laptops, Khanna decided to go with a cheaper option and link up PS3s. Their main benefit was that they allowed users to install their preferred operating system on the console, and they retailed for around US$250.

To help with his research, Sony donated four consoles to the experiment, and Khanna and the university bought another 12.

All 16 were then loaded up with Linux and linked over the Internet – the result was a machine that could speed up calculations by a factor of nearly 10 compared to an ordinary computer. He published the results of the makeshift supercomputer in the journal Parallel and Distributed Computing Systems in 2009.
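Khanna’s gravitational-wave software is custom numerical-relativity code, but the basic cluster pattern, splitting a job across networked nodes and combining the pieces at the end, can be sketched with MPI. The workload below is an arbitrary stand-in, and the script name and node count are hypothetical; it would be launched with a command like the one shown in the comment.

```python
# Generic cluster pattern: each node computes its slice of a job, and the
# pieces are combined on one node. The workload here is an arbitrary
# stand-in, not Khanna's gravitational-wave code.
# Run with e.g.:  mpirun -np 16 python partial_sums.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 10_000_000                                       # total work items
mine = np.arange(rank, N, size, dtype=np.float64)    # round-robin slice
partial = np.sum(1.0 / (1.0 + mine) ** 2)            # this node's share

total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"combined result from {size} node(s): {total:.6f}")
```

The same pattern scales from 16 consoles in a lab to the Air Force’s 1,760-console cluster described below; only the node count and the workload change.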

He used that first device to model the behaviour of gravitational waves and publish several papers on the phenomenon, and since then he’s made an even more powerful model, as Parker reports.

In 2010, the US Air Force Research Laboratory in New York found out about Khanna’s PS3 supercomputer and decided to make its own out of 1,760 consoles in order to process radar image surveillance.


As a thank you, the US Department of Defense donated 176 additional PS3s to Khanna and his team. They now house their supercomputer in a refrigerated shipping container, designed to carry milk. This model is as powerful as 3,000 desktop computers, and only cost around US$75,000 to make – a ridiculously cheap amount for a supercomputer.

Although PS3s have their limitations – the memory is much smaller than that of traditional supercomputers, for example – the supercomputer is continuing to grow in power and has helped not only Khanna with his research, but also other scientists around the university. The team will add another 220 PS3s to the system by 2015.

The next project Khanna wants to work on is creating a supercomputer out of PC graphics cards, which are similarly low-cost but each roughly as powerful as 20 PS3 consoles.

“The next supercomputer we’re going to build will probably be made entirely of these cards,” Khanna told Parker. “It won’t work for everything, but it will certainly cover a large set of scientific and engineering applications, especially if we keep improving on it.”

Source: The New York Times

 

Supercomputers Can Save U.S. Manufacturing.


The key to reviving manufacturing in the U.S. may lie in the nation’s supercomputers

The U.S. used to be a powerhouse in manufacturing. In the past quarter of a century we have relinquished this leadership position, in large part because we made a decision—consciously or unconsciously—that the service and financial sectors are sufficient to sustain our economy. But they are not. Service jobs pay little. The financial industry makes nothing of value and therefore cannot maintain, let alone raise, the nation’s standard of living.

The fate of manufacturing is in some ways linked to our prowess in the physical sciences. In the 1960s and 1970s high-performance computing (HPC) developed at the national labs made its way to the manufacturing sector, where it now powers much of the innovation behind our most successful commercial firms. Yet we are ceding leadership in the physical sciences, too. Canceling the Superconducting Super Collider in the 1990s ended U.S. dominance in particle physics. NASA’s decision to delay, and possibly eventually abandon, the Wide-Field Infrared Survey Telescope could do the same for cosmology.

Fortunately, the nation’s lead in high-performance computing still stands. HPC is the advanced computing physicists use to model the dynamics of black holes, meteorologists use to model weather and engineers use to simulate combustion. This expertise may also be our best chance to rescue U.S. manufacturing. If we can successfully deliver it to engineers at small firms, it might give the sector enough of a boost to compete with lower labor costs overseas.

We already know how useful HPC is for big firms. When Boeing made the 767 in the 1980s, it tested 77 wing prototypes in the wind tunnel. When it made the 787 in 2005, it tested only 11. In the future, Boeing plans to bring that number down to three. Instead of physical wind tunnels, it uses virtual ones—simulations run on supercomputers—saving much time and money and quickening the pace of new product development. HPC modeling and simulation has become an equally powerful tool in designing assembly lines and manufacturing processes in a broad range of fields—big manufacturers such as Caterpillar, General Electric, Goodyear and Procter & Gamble use it routinely. Small manufacturers could get similar benefits from these tools, if only they had access to them.

I first came to appreciate the potential of HPC to help small manufacturers in 2009 as part of the Obama transition team. Working with the Council on Competitiveness, we identified lack of software, cost of entry and shortages of expertise as the main obstacles to the use of HPC by small manufacturers and proposed a partnership among government, manufacturers and universities to help. The result is the National Digital Engineering and Manufacturing Consortium, or NDEMC, a pilot program created by the council and the federal government.

Recently NDEMC made HPC resources available to a handful of firms, including Jeco Plastic Products. This 25-employee firm in Plainfield, Ind., makes plastic pallets for packaging of auto parts. The plastic pallets are a less expensive alternative to steel pallets, which are heavier and prone to rusting. When Jeco makes a new product, its engineers build a prototype, test it in the lab to see how it bears up under the stress it is likely to encounter in the field and repeat the process until they arrive at the best design. Last December, however, Jeco engineers got a chance to tap expertise at Purdue University to develop simulations of a pallet designed for a German automotive company and ran them on hardware at the Ohio Supercomputing Center in Columbus. As a result, Jeco bypassed that trial-and-error process completely, arriving at a design in only a few hours of computer time.

Many other small firms could reap similar benefits. NDEMC’s goal is to find the best business models for getting HPC to these firms and eventually take the effort nationwide. Small manufacturers today are in some ways like farmers at the beginning of the 20th century, most of whom did not know what contour farming, crop rotation and fertilizers could do for productivity. When the U.S. agricultural extension service, in conjunction with land-grant universities, made the requisite expertise available, it triggered a revolution in agricultural productivity. A similar revolution could be in the cards for small manufacturers if we can get supercomputing technology into the hands of their engineers.

Source: Scientific American

 

Supercomputer Pioneer Roadrunner To Shut Down.


http://news.sky.com/story/1071902/supercomputer-pioneer-roadrunner-to-shut-down