Researchers at the University of Illinois at Urbana-Champaign have created a unique living cell simulation using NVIDIA GPU-accelerated software, modelling a 2-billion-atom cell that metabolises and grows like a living cell. The simulated cell contains its very own microcosm, consisting of thousands of components responsible for protein building, gene transcription, energy production and more. The 3D simulation replicates physical and chemical characteristics at the particle scale and provides a fully dynamic model that mimics the behaviour of a living cell for researchers to study.
Living cell simulation
To build the living cell model, the Illinois researchers simulated one of the simplest living cells, a parasitic bacterium called mycoplasma. They based the model on a trimmed-down version of a mycoplasma cell synthesized by scientists at the J. Craig Venter Institute in La Jolla, Calif., which had just under 500 genes to keep it viable. For comparison, a single E. coli cell has around 5,000 genes; a human cell has more than 20,000.
Luthey-Schulten’s team then used known properties of the mycoplasma’s inner workings, including amino acids, nucleotides, lipids and small-molecule metabolites, to build out the model with DNA, RNA, proteins and membranes. “Even a minimal cell requires 2 billion atoms,” said Zaida Luthey-Schulten, chemistry professor and co-director of the university’s Center for the Physics of Living Cells. “You cannot do a 3D model like this in a realistic human time scale without GPUs.”
Using Lattice Microbes software on NVIDIA Tensor Core GPUs, the researchers ran a 20-minute 3D simulation of the cell’s life cycle, before it starts to substantially expand or replicate its DNA. The model showed that the cell dedicated most of its energy to transporting molecules across the cell membrane, which fits its profile as a parasitic cell. “If you did these calculations serially, or at an all-atom level, it’d take years,” said graduate student and paper lead author Zane Thornburg. “But because they’re all independent processes, we could bring parallelization into the code and make use of GPUs.”
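The idea Thornburg describes, that independent processes can be computed in parallel rather than one after another, can be illustrated with a toy sketch. The code below is not the researchers' Lattice Microbes code (which runs its kernels on GPUs); it is a minimal, hypothetical stand-in showing how independent simulation regions can be farmed out to parallel workers, with the region update invented purely for illustration.

```python
from multiprocessing import Pool

def simulate_region(seed):
    # Toy stand-in for one independent simulation region: a deterministic
    # pseudo-random walk of a molecule count, driven by a simple linear
    # congruential generator. Each region depends only on its own seed,
    # so regions never need to communicate.
    count = 100
    state = seed
    for _ in range(1000):
        state = (state * 1103515245 + 12345) % (2 ** 31)
        count += 1 if state % 2 else -1
    return count

if __name__ == "__main__":
    # Because the regions are independent, they can run on separate
    # worker processes here, or on thousands of GPU threads in a real
    # GPU-accelerated simulation.
    with Pool(4) as pool:
        results = pool.map(simulate_region, range(8))
    print(len(results))  # prints 8
```

The serial equivalent would be `[simulate_region(s) for s in range(8)]`, which produces identical results but uses only one core; the independence of the regions is exactly what makes the parallel version safe.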
Disclosure: Some of our articles include affiliate links. If you buy something through one of these links, Geeky Gadgets may earn an affiliate commission. Learn about our Disclosure Policy.