Showing posts with label simulation.

Thursday, October 2, 2014


 Uncanny Valley
Computer graphics artist Chris Jones has created a remarkably realistic-looking head using the latest software. His short video of the piece, called 'Ed,' must be seen.




Australia-based computer game artist Chris Jones has created one of the most realistic human head simulations we've seen.

One person commented on Sploid, "This IS the most realistic human face I've ever seen produced via computer; however it's so perfect that something still screams uncanny valley."

The head, 'Ed,' was made with LightWave, Sculptris and Krita, and composited with DaVinci Resolve Lite. Jones also created the music for the video. It has been an ongoing work for the artist.

Hard To Believe This Face Is Computer Generated


Jones has worked as a freelance children’s book illustrator, and after graduating with an industrial design degree, he continued illustrating and animating before becoming a computer game artist at Beam Software (later to become Infogrames).

He left Infogrames in May 2000 to work full-time on a film called The Passenger.


Soon this quality of avatar will be walking around in Second Life.





SOURCE  Chris Jones

By 33rd Square

Monday, June 16, 2014

Computer Simulations Reveal Secrets of Influenza Virus

 Simulation
Researchers using computer simulations have revealed a key mechanism in the replication process of influenza A.  The work may help defend against future deadly pandemics.




Treating influenza relies on drugs, such as Amantadine, that are becoming less effective due to the evolution of the virus. Now University of Chicago scientists have published computational results that may give drug designers the insight they need to develop the next generation of effective influenza treatments.

“It’s very hard to design a drug if you don’t understand how the disease functions,” said Gregory Voth, the Haig P. Papazian Distinguished Service Professor in Chemistry. Voth and three co-authors offer new insights into the disease’s functioning in the Proceedings of the National Academy of Sciences.

Amantadine is a bulky organic compound originally designed to treat influenza A by blocking proton flow through the M2 channel, one of the few proteins that are targets for antiviral therapies. “The proton flow is essential for influenza viral replication,” said Voth, who also is director of the Center for Multiscale Theory and Simulation. Unfortunately, subsequent mutations in different forms of the flu have changed the ability of Amantadine to bind to the M2 protein. “There’s a big, worldwide push to find new drugs that will block this or other influenza proteins,” Voth said.

"Computer simulation, when done very well, with all the right physics, reveals a huge amount of information that you can’t get otherwise."


The UChicago team conducted extensive multiscale simulations of proton permeation, a critical step in viral replication, through the M2 channel from influenza A. The simulations enabled them to visualize this process at three interconnected scales, from the electronic (the smallest), to the molecular (intermediate) to the mesoscopic (the largest). The capability of the technique was demonstrated by last year’s Nobel Prize in Chemistry, which was awarded to three scientists “for the development of multiscale models for complex chemical systems.”

“Computer simulation, when done very well, with all the right physics, reveals a huge amount of information that you can’t get otherwise,” Voth said. “In principle, you could do these calculations with potential drug targets and see how they bind and if they are, in fact, effective.”

The flow of protons through the watery M2 channel is a complex process, one involving many phenomena, including the making and breaking of chemical bonds. Scientists have attempted to simulate this process computationally for more than 20 years to understand how it works, but only now has the feat been achieved. No other experimental or simulation technique is capable of examining the proton flow process in such detail.
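
To give a sense of the kind of quantity such simulations produce, here is a minimal sketch, in Python with made-up numbers (not the UChicago team's code), of how a free-energy profile for the proton along the channel axis could be recovered from sampled positions by Boltzmann inversion:

import numpy as np

kB_T = 0.593  # kcal/mol at roughly 298 K

# Hypothetical excess-proton positions along the channel axis (angstroms),
# standing in for coordinates collected from a multiscale simulation.
z_samples = np.random.normal(loc=0.0, scale=4.0, size=100_000)

# Histogram the positions and Boltzmann-invert to a free-energy profile,
# F(z) = -kT ln p(z), shifted so the minimum sits at zero.
counts, edges = np.histogram(z_samples, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
free_energy = -kB_T * np.log(counts[mask])
free_energy -= free_energy.min()

for z, f in zip(centers[mask][::10], free_energy[::10]):
    print(f"z = {z:6.2f} A   F(z) = {f:5.2f} kcal/mol")

A real calculation would of course use the simulation's own proton coordinates and enhanced-sampling corrections rather than a synthetic Gaussian, but the post-processing step has this general shape.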

Scientists have, however, succeeded in experimentally producing mutations of different parts of the M2 protein. The UChicago team’s simulations of the protein’s dynamics not only agree with those experimental data, which validates the results, but also explain the effects of these mutations, one of which is a dominant cause of drug resistance.

To reach such significant conclusions, the UChicago team tapped the power of four high-performance computer clusters. Principal among these was the Midway high-performance computing cluster at the University’s Research Computing Center. The Midway cluster worked on various aspects of the problem continuously for an entire year under the watchful guidance of Ruibin Liang, a graduate student in chemistry and the study’s lead author.

But the team also needed clusters at the Texas Advanced Computing Center at the University of Texas at Austin, the San Diego Supercomputer Center at the University of California, San Diego, and the Department of Defense High Performance Computing Center in Vicksburg, Miss.

“This was a huge amount of work, so I used every resource available. Professor Voth devoted a lot of machine time to this project,” Liang said.

More work lies ahead for Voth and his team, including trying to make the simulation process run more quickly, explaining the effects of drug-resistant mutations, and targeting other forms of influenza. According to Liang, the stage has been set and work is underway to reveal the proton permeation mechanism in influenza B, another form of the flu that has a different M2 channel and is entirely resistant to drugs like Amantadine.


SOURCE  University of Chicago

By 33rd Square

Thursday, May 22, 2014

Computer Models Help Unravel the Science of Life

 Computer Models
Scientists have developed a sophisticated computer modelling simulation to explore how cells of the fruit fly react to changes in the environment.




Researchers have developed a sophisticated computer modelling simulation to explore how cells of the fruit fly react to changes in the environment. The research, published in the journal Cell, is part of an ongoing study at the Universities of Manchester and Sheffield investigating how external environmental factors affect health and disease.

The model shows how cells of the fruit fly communicate with each other during its development. Dr Martin Baron, who led the research, said:

"It is exciting that the computer model was able to make predictions that we could test by going back to the fly experiments to investigate the effects of different mutations which alter the components of the cells."


“The work is a really nice example of researchers from different disciplines of maths and biology working together to tackle challenging problems.”

The paper describes how the computer model provides a theoretical framework by which to explore how different environmental and other regulatory inputs can be integrated with the core signaling mechanism to result in adaptive—or, possibly, maladaptive—outcomes on the development, maintenance, and health of an organism.

Drosophila Simulation

The current phase of the study aims to understand how temperature interacts with cell signalling networks during development. Flies are able to develop normally across a wide range of temperatures and it is not understood how this is achieved.

The combined-disciplines approach was undertaken because the complexity of development involves numerous components interconnected in networks of cell-to-cell communication pathways, whose outcomes are difficult to predict without computer simulations.

The fruit fly is commonly used in lab work because, although its development is relatively simple, around 75% of known human disease genes have a recognizable match in the fruit fly genome, which means flies can be used to study the fundamental biology behind complex conditions such as neurodegeneration and cancer.

Baron said: “It is exciting that the computer model was able to make predictions that we could test by going back to the fly experiments to investigate the effects of different mutations which alter the components of the cells. It shows us that the model is working well and provides a solid basis on which to develop its sophistication further.”

The next phase will see the team research how the cell signalling network adjusts and responds to other environmental changes, such as nutrition. Baron says, "There is a lot of interest in how environmental inputs influence our health and disease by interacting with our genetic makeup. Our initial studies have already shown that changes to the adult fly's diet can also affect how cells inside a fly communicate with each other and produce responses in certain fly tissues. This is a promising avenue for future studies."

Baron explains that there are wider implications for understanding human health and disease: “Many different types of signal control normal development but when some of these signals are mis-activated they can result in the formation of tumors."

“What we’ve learnt from studying the flies” said Baron, “is that some communication signals can arise in different ways and this means that, in cancer, mis-activation of these signals can also occur by different routes. This is important because it can help us to understand how to stop mis-activation from occurring.”


SOURCE  University of Manchester

By 33rd Square

Thursday, May 8, 2014

Illustris
 
Cosmology
A new computer simulation called Illustris shows the formation of galaxies with unprecedented precision, allowing astrophysicists to indirectly confirm the standard model of cosmology. 




Astronomers have created the first realistic virtual universe using a computer simulation called "Illustris." Illustris has recreated 13 billion years of cosmic evolution in a cube 350 million light-years on a side with unprecedented resolution.

"Until now, no single simulation was able to reproduce the universe on both large and small scales simultaneously," says Mark Vogelsberger  of the MIT/Harvard-Smithsonian Center for Astrophysics, who conducted the work in collaboration with researchers at several institutions, including the Heidelberg Institute for Theoretical Studies in Germany.

These results have been published in the journal Nature.

"Illustris is like a time machine. We can go forward and backward in time. We can pause the simulation and zoom into a single galaxy or galaxy cluster to see what's really going on."


Previous simulations of the growth of cosmic structures have broadly reproduced the ‘cosmic web’ of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of a lack of computing power and the complexities of the underlying physics. Moreover, they were unable to track the small-scale evolution of gas and stars to the present within a representative portion of the Universe. Earlier simulations also had trouble modeling complex feedback from star formation, supernova explosions, and supermassive black holes.

Illustris is a simulation that starts 12 million years after the Big Bang and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube 106.5 megaparsecs on a side. The simulation is filled with a population of elliptical and spiral galaxies, and reproduces the observed distribution of galaxies in clusters and the characteristics of hydrogen on large scales, while at the same time matching the ‘metal’ and hydrogen content of galaxies on small scales.

Illustris simulation

Illustris employs a sophisticated computer program to recreate the evolution of the universe in high fidelity. It includes both normal matter and dark matter using 12 billion 3-D "pixels," or resolution elements.

The team dedicated five years to developing the Illustris program. The actual calculations took 3 months of "run time," using a total of 8,000 CPUs running in parallel. If they had used an average desktop computer, the calculations would have taken more than 2,000 years to complete.
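
Those figures are easy to sanity-check: 8,000 CPUs running for three months is about 24,000 CPU-months, or roughly 2,000 CPU-years, which is what a single desktop processor would need. A quick back-of-the-envelope version of that arithmetic (an illustration, not the team's own accounting):

cpus = 8000
run_time_months = 3

cpu_months = cpus * run_time_months   # 24,000 CPU-months of work
cpu_years = cpu_months / 12           # roughly 2,000 CPU-years
print(cpu_years)                      # 2000.0 -> one core would need about 2,000 years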

The computer simulation began a mere 12 million years after the Big Bang. When it reached the present day, astronomers counted more than 41,000 galaxies in the cube of simulated space. Importantly, Illustris yielded a realistic mix of spiral galaxies like the Milky Way and football-shaped elliptical galaxies. It also recreated large-scale structures like galaxy clusters and the bubbles and voids of the cosmic web. On the small scale, it accurately recreated the chemistries of individual galaxies.

Because light travels at a fixed speed, the farther away astronomers look, the farther back in time they can see. A galaxy one billion light-years away is seen as it was a billion years ago. Telescopes like Hubble can give us views of the early universe by looking to greater distances. However, astronomers can't use Hubble to follow the evolution of a single galaxy over time.

"Illustris is like a time machine. We can go forward and backward in time. We can pause the simulation and zoom into a single galaxy or galaxy cluster to see what's really going on," says co-author Shy Genel of the CfA.



SOURCE  Harvard-Smithsonian Center for Astrophysics / Illustris

By 33rd Square

Wednesday, January 22, 2014


 Computer Graphics
A team of researchers has developed a realistic walking simulator for a variety of bipedal creatures. In the simulator, two-legged computer-based creatures walk in various conditions with a system using discrete muscle control parameters.




A group of researchers from Utrecht University and the University of British Columbia has created a muscle-based control method for simulated two-legged computer-based creatures in which the muscle control parameters are optimized.

Through an evolutionary algorithm, the system yields effective gaits for the creatures under various parameters, including speed, rotation, and even gravity.

Watch A Computer Learn How To Walk
Image Source - Geijtenbeek, van de Panne and van der Stappen

The generic locomotion control method, described in the paper Flexible Muscle-Based Locomotion for Bipedal Creatures, supports a variety of bipedal creatures. All actuation forces are the result of 3D simulated muscles, and a model of neural delay is included for all feedback paths.

The researchers' controllers generate torque patterns that incorporate biomechanical constraints. The synthesized controllers find different gaits based on target speed, and can cope with uneven terrain and external perturbations, like blocks being thrown at the creatures.
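
As a rough illustration of the optimization loop described above, the sketch below runs a generic evolution strategy over a vector of muscle control parameters against a stand-in cost function; it is an assumption-laden toy, not the authors' actual controller or fitness measure:

import numpy as np

def gait_cost(params: np.ndarray) -> float:
    # Hypothetical stand-in for running the muscle-based simulation and
    # scoring the resulting gait (deviation from target speed, effort, falls).
    target = np.linspace(-1.0, 1.0, params.size)  # placeholder optimum
    return float(np.sum((params - target) ** 2))

def evolve(n_params=16, pop_size=32, elite=8, generations=200, sigma=0.5, seed=0):
    # Minimal (mu, lambda)-style evolution strategy over the parameter vector.
    rng = np.random.default_rng(seed)
    mean = np.zeros(n_params)
    for _ in range(generations):
        pop = mean + sigma * rng.standard_normal((pop_size, n_params))
        costs = np.array([gait_cost(p) for p in pop])
        elites = pop[np.argsort(costs)[:elite]]
        mean = elites.mean(axis=0)  # recombine the best candidates
        sigma *= 0.97               # slowly narrow the search
    return mean, gait_cost(mean)

best_params, best_cost = evolve()
print(f"best cost after optimization: {best_cost:.4f}")

In a system like the one described, each cost evaluation means running a full physics simulation of the creature, which is what makes the optimization the expensive part.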

Muscle Path
An example muscle path from the research.  Image Source- Geijtenbeek, van de Panne and van der Stappen
The current method still has limitations, so it won't be powering up any humanoid robots soon. Compared to the results of other studies, the walking and running motions in the system are of somewhat lesser fidelity, especially for the upper body.

This can be partially explained by the absence of specific arm features in the researchers' humanoid models. For now, they favored a generic approach, but the researchers say that focusing on a more faithful human gait could make their models even more realistic.

Despite this, the team's lower-body walking motions are very close to state-of-the-art results. "We witness a similar near-passive knee usage during swing, as well as a natural build-up of the ankle plantarflexion moment during stance," they write.

Work on an improved set of authoring tools remains an important direction for future development. Areas that could be further improved include: greater fidelity in the modeling of joints such as the knees, ankles, and shoulders; more accurate muscle path wrapping models that interact with the skeleton geometry; further thought on how much detail the target feature trajectories need; the addition of anticipatory feed-forward control to the architecture; and the use of alternate dynamics simulators.


SOURCE  ACM Transactions on Graphics

By 33rd Square

Saturday, December 21, 2013


 Artificial Life
The Open Worm project aims to build a lifelike copy of a nematode roundworm entirely out of computer code. Now the creature's creators have added code that gets the virtual worm wriggling like the real thing.




The open-source OpenWorm project has reached a major milestone, creating an artificial life form from the cellular level in silico.

"That's a simulated worm body with muscle segments that resemble an actual C.Elegans," project advocate John Hurliman told New World Notes.

"Each muscle segment can receive a contraction signal, and although the current setup just has a hardcoded algorithm driving the muscles, its movement closely resembles published literature on how C. Elegans swims."

OpenWorm Milestone as Artificial Worm Wriggles to Life

"The core algorithm for the physics simulation is called PCI-SPH, which is a somewhat advanced but well understood particle simulation method. The main source of complexity is the architecture: going from brain firing signals to muscle contractions to moving particles around."

The Open Worm project started in May 2013 and is slowly working towards creating a virtual copy of the C. elegans nematode. This worm is one of the most widely studied creatures on Earth and was the first multicelled organism to have its entire genome mapped.

The simulated worm slowly being built out of code aims to replicate C. elegans in exquisite detail, with each of its roughly 1,000 cells modelled on a computer.

The next steps for OpenWorm are to continue working on performance and hook up a synthetic brain, based on the worm's connectome.

Early work on the worm involved making a few muscle segments twitch, but now the team has a complete worm to work with. The code governing how the creature's muscles move has been refined so that its swaying motion and speed match those of its real-life counterpart. The tiny C. elegans moves through water at a rate of about 1 mm per second.


SOURCE  New World Notes

By 33rd Square

Tuesday, July 16, 2013

Infographic - Are We Living In A Computer Simulation?

 Infographics
How do we know that the world as we know it isn't just a simulation running on some cosmic computer, where the Big Bang essentially executed the command 'Start Program'? Our new infographic explores these concepts and the ideas behind them.




When philosopher Nick Bostrom first proposed the simulation argument a decade ago, it was just an idea.

Now researchers in fields ranging from cosmology to quantum mechanics to string theory are referring to Bostrom's idea.

He initially argued that "at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation."
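
The arithmetic behind the argument is often summarized as a simple fraction: if f_p is the fraction of civilizations that reach a posthuman stage and N is the average number of ancestor-simulations each of them runs, then, roughly speaking, the share of human-like observers who live in simulations is f_p*N / (f_p*N + 1). A small illustrative calculation, with numbers chosen purely for the example:

def fraction_simulated(f_p: float, n_sims: float) -> float:
    # Share of human-type observers who are simulated, as the argument is
    # usually stated: f_p * N / (f_p * N + 1).
    return (f_p * n_sims) / (f_p * n_sims + 1.0)

print(fraction_simulated(0.01, 1000))  # ~0.91, even with a pessimistic f_p

The point of the trilemma is that the only ways to keep this fraction small are for f_p or N to be effectively zero, which is what propositions (1) and (2) assert.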


The question is compelling.  What if our universe is just a computer simulation?  How could we determine if it is?  What would this mean for our day-to-day lives?

Physicist Martin Savage, from the University of Washington, believes we can't discount the idea. He and two colleagues (Silas Beane and Zohreh Davoudi) published a paper in November 2012 exploring the possibility. Savage and his group have also devised a method of checking whether we are living in a simulation using cosmic ray observations.

For these reasons we have made the following infographic looking at the question, "Are We Living In A Computer Simulation?"  





By 33rd Square

Monday, June 3, 2013

DARPA Robotics Challenge Simulator

 
DARPA Robotics Challenge
For the first part of the DARPA Robotics Challenge, competitors need to prove themselves in the Virtual Robotics Challenge (VRC), in which teams compete through a computer simulation of a robot and the challenge tasks. For the last month, teams from around the world have been submitting entries to the VRC qualification event. Now the Open Source Robotics Foundation (OSRF) has released a preview of how the simulator, called Gazebo, works.








The goal of the DARPA Robotics Challenge (DRC) is to generate groundbreaking research and development so that future robotics can perform the most hazardous activities in future disaster response operations, in tandem with their human counterparts, in order to reduce casualties, avoid further destruction, and save lives.

Within the coming months, the Challenge will test the participating teams’ robots’ ability to work in rough terrain and their capacity to use human aids such as vehicles and hand tools in three events.

To facilitate robot software development, DARPA is developing an open source simulation tool: the DRC Simulator. The Simulator will be populated with models of robots, robot components, and field environments and will be made available to organizations skilled in robotic software development. This simulator will help to expand the supplier base for ground robot systems (both hardware and software), increase capabilities, and in the future will help lower acquisition costs.

The simulation environment, called Gazebo, and the tasks that teams will need to complete in the Virtual Robotics Challenge (VRC) later this month have been developed by the Open Source Robotics Foundation (OSRF).

To come up with these tasks, DARPA talked to disaster responders like firefighters, police, and nuclear engineers, asking them, "What would you want a robot to be able to do?" In addition to things like walking over rubble and driving, a high priority was basic tool use and the manipulation of hoses and valves, for example to get cooling water into a nuclear power plant after a disaster.

OSRF has put a lot of time into making Gazebo as close as possible to the real world. As a functional simulation, the focus has been on the dynamics of the system rather than the aesthetics, but the details go so far as to ensure that the simulated robots do not move perfectly smoothly, just like their real counterparts.

Gazebo is a 3D multi-robot simulator with dynamics. It is capable of simulating a population of robots, sensors and objects in a three-dimensional world. It generates both realistic sensor feedback and physically plausible interactions between objects (it includes an accurate simulation of rigid-body physics).
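
To make concrete what "simulating rigid-body physics" involves at its simplest, here is a toy one-dimensional sketch of a body falling onto a ground plane with a crude contact response. It is purely illustrative and says nothing about Gazebo's internals, which rely on full 3D dynamics engines such as ODE:

# Toy 1-D rigid-body step: a mass dropping onto a ground plane with a simple
# contact response, integrated with semi-implicit Euler.
dt, g = 0.001, 9.81
z, vz = 1.0, 0.0          # body starts 1 m above the ground, at rest
restitution = 0.2         # how much bounce the contact keeps

for step in range(3000):  # simulate 3 seconds
    vz -= g * dt          # gravity
    z += vz * dt          # semi-implicit Euler position update
    if z < 0.0:           # penetration: resolve the contact
        z = 0.0
        vz = -restitution * vz

print(f"height after 3 s: {z:.4f} m, velocity: {vz:.4f} m/s")

A real simulator does this in three dimensions, with rotation, joints, friction, and sensor models layered on top, many thousands of times per simulated second.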

DRC Team ROBIL Simulation
The Israeli team ROBIL has posted a video of their navigation strategy in the simulator.

The VRC is scheduled to take place in a few weeks. Based on their performance in the VRC, some of these teams could win the opportunity to work with actual Boston Dynamics ATLAS robots at the DARPA Robotics Challenge Trials this December.



SOURCE  IEEE Spectrum

By 33rd Square

Wednesday, May 22, 2013


The SIGGRAPH Technical Papers program is the premier international forum for disseminating new scholarly work in computer graphics and interactive techniques. SIGGRAPH 2013 brings together thousands of computer graphics professionals to share and discuss their work.



The SIGGRAPH 2013 Technical Papers program is the premier international forum for disseminating new scholarly work in computer graphics and interactive techniques. The 40th International Conference and Exhibition on Computer Graphics and Interactive Techniques, 21-25 July 2013 at the Anaheim Convention Center in California, received submissions from around the globe and features high-quality, never-before-seen scholarly work. Submitters are held to extremely high standards in order to qualify.

“Computer Graphics is a dynamic and ever-changing field in many ways,” says Marc Alexa, SIGGRAPH 2013 Technical Papers Chair from Technische Universität Berlin. “The range of ground-breaking papers presented at SIGGRAPH is getting broader every year, now also encompassing 3D printing, and fabricating realistic materials as well as generating ever more realistic images of complex phenomena.”

SIGGRAPH accepted 115 technical papers (out of 480 submissions) this year, representing an acceptance rate of 24 percent (one percentage point higher than in 2012). The selected papers were chosen by a distinguished committee of academic and industry experts.

This year's Technical Papers program also includes conference presentations for 37 papers published this year in the journal ACM Transactions on Graphics (TOG).

Highlights from the SIGGRAPH 2013 Technical Papers program include:

OpenFab: A Programmable Pipeline for Multi-Material Fabrication
Authors: Kiril Vidimce, Szu-Po Wang, Jonathan Ragan-Kelley and Wojciech Matusik, Massachusetts Institute of Technology CSAIL

Open Fab

This paper proposes a programmable pipeline, inspired by RenderMan, for synthesis of multi-material 3D printed objects. The pipeline introduces user-programmable fablets, a corollary to procedural shaders for 3D printing, and is designed to stream over arbitrary numbers of voxels with a fixed and controllable memory footprint.
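
The streaming idea can be illustrated with a toy sketch: a per-voxel material function evaluated over the object in fixed-size chunks, so memory use stays constant no matter how many voxels there are. This is only an analogy for the fablet concept, not the OpenFab API:

import itertools

def fablet(x: int, y: int, z: int) -> str:
    # Hypothetical per-voxel material routine: alternate two materials in
    # bands along z to produce a layered look.
    return "rigid" if z % 2 == 0 else "flexible"

def stream_voxels(nx: int, ny: int, nz: int, chunk: int = 4096):
    # Yield (voxel, material) pairs chunk by chunk so memory stays bounded.
    coords = itertools.product(range(nx), range(ny), range(nz))
    while True:
        block = list(itertools.islice(coords, chunk))
        if not block:
            return
        yield [((x, y, z), fablet(x, y, z)) for x, y, z in block]

total = sum(len(block) for block in stream_voxels(64, 64, 64))
print(total)  # 262144 voxels evaluated without ever holding them all in memory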

Opacity Optimization for 3D Line Fields
Authors: Tobias Günther, Christian Roessl, and Holger Theisel, Otto-von-Guericke-Universität Magdeburg

Opacity Optimization for 3D Line Fields

For visualizing dense line fields, this method selects lines by view-dependent opacity optimizations and applies them to real-time free navigation in flow data, medical imaging, physics, and computer graphics.

AIREAL: Interactive Tactile Experiences in Free Air
Authors: Rajinder Sodhi, University of Illinois; Ivan Poupyrev, Matthew Glisson, Ali Israr, Disney Research, The Walt Disney Company

AIREAL: Interactive Tactile Experiences in Free Air

AIREAL is a tactile feedback device that delivers effective and expressive tactile sensations in free air, without requiring the user to wear a physical device. Combined with interactive graphics and applications, AIREAL enables users to feel virtual objects, experience free-air textures and receive haptic feedback with free-space gestures.

Bi-Scale Appearance Fabrication
Authors: Yanxiang Lan, Tsinghua University; Yue Dong, Microsoft Research Asia; Fabio Pellacini, Sapienza Universita’ Di Roma, Dartmouth College; Xin Tong, Microsoft Research Asia

Bi-Scale Appearance Fabrication

A system for fabricating surfaces with desired spatially varying reflectance, including anisotropic ones, and local shading frames.

Map-Based Exploration of Intrinsic Shape Differences and Variability
Authors: Raif Rustamov, Stanford University; Maks Ovsjanikov, École Polytechnique; Omri Azencot, Mirela Ben-Chen, Technion - Israel Institute of Technology; Frederic Chazal, INRIA Saclay - Île-de-France; and Leonidas Guibas, Stanford University

Map-Based Exploration of Intrinsic Shape Differences and Variability

A novel formulation of shape differences, aimed at providing detailed information about the location and nature of the differences or distortions between the shapes being compared. This difference operator is much more informative than a scalar similarity score, so it is useful in applications requiring more refined shape comparisons.

Highly Adaptive Liquid Simulations on Tetrahedral Meshes
Authors: Ryoichi Ando, Kyushu University; Nils Thuerey, ScanlineVFX GmbH; and Chris Wojtan, Institute of Science and Technology Austria

Highly Adaptive Liquid Simulations on Tetrahedral Meshes

This new method for efficiently running fluid simulations with extreme amounts of spatial adaptivity combines several key components into a simulation algorithm capable of creating animations at high effective resolutions while avoiding common pitfalls like inaccurate boundary conditions and inefficient computation.

SIGGRAPH 2013 will bring thousands of computer graphics and interactive technology professionals from five continents to Anaheim, California for the industry's most respected technical and creative programs focusing on research, science, art, animation, music, gaming, interactivity, education, and the web from Sunday, 21 July through Thursday, 25 July 2013 at the Anaheim Convention Center. SIGGRAPH 2013 includes a three-day exhibition of products and services from the computer graphics and interactive marketplace from 23-25 July 2013.

More details are available at SIGGRAPH 2013 or on Facebook and Twitter.



SOURCE  SIGGRAPH 2013

By 33rd Square

Tuesday, May 21, 2013


 Simulated Biology
The OpenWorm project aims to build the first comprehensive computational model of the roundworm C. elegans. With only a thousand cells, it solves basic problems such as feeding, mate-finding and predator avoidance. Despite being extremely well studied in biology, this organism still eludes a deep, principled understanding of its biology.






The OpenWorm project aims to build the first comprehensive computational model of Caenorhabditis elegans (C. elegans), a microscopic roundworm.

With only a thousand cells, it solves basic problems such as feeding, mate-finding and predator avoidance. Despite being extremely well studied in biology, this organism still eludes a deep, principled understanding of its biology.

C. elegans

If it succeeds, OpenWorm will have created a first in executable biology: a simulated animal using the principles of life to exist on a computer.

The international group collaborating on the project is using a bottom-up approach, aimed at observing worm behaviour emerge from a simulation built on data derived from scientific experiments carried out over the past decade. To do so, they are incorporating the data available in the scientific community into software models.

According to the organization, rigorous predictive models are the cornerstone of science and engineering. Unfortunately, today, there are no comprehensive predictive models of living cells and tissues. Consequently, the entire field of biology and medicine is in a kind of “pre-mathematical” era.

A revolution in the biosciences driven by simulation-based research, using predictive models running on high performance computing architectures with flexible user interfaces, is now possible. Simulating a living organism will have impacts on understanding mechanisms of disease, drug discovery and development, synthetic biology, bioengineering, neuroscience and artificial intelligence.


OpenWorm is engineering Geppetto, an open-source simulation platform, to be able to run these different models together. They are also forging new collaborations with universities and research institutes to collect data that fill in the gaps.

"If you're going to understand a nervous system or, more humbly, how a neural circuit works, you can look at it and stick electrodes in it and find out what kind of receptor or transmitter it has," said John White, who built the first map of C. elegans's neural anatomy, and recently started contributing to the project. "But until you can quantify and put the whole thing into a computer and simulate it and show your computer model can behave in the same way as the real one, I don't think you can say you understand it."


David Dalrymple, an MIT graduate student who has contributed to OpenWorm and is working on a worm brain modeling project of his own, pointed out what he sees as a limitation of the effort. OpenWorm has incorporated a lot of anatomical data -- the structures of the worm's nervous system and musculature. The issue is that these studies were carried out on dead worms. They can't tell scientists about the relative importance of connections between neurons within the worm's nervous system, only that a connection exists. Very little data from living animals' cells exist in the published literature, and such data may be required to develop a good simulation.

"I believe that an accurate model requires a great deal of functional data that has not yet been collected, because it requires a kind of experiment that has only become feasible in the last year or two," Dalrymple told Alexis Madrigal. Dalrymple's own research is to build an automated experimental apparatus that can gather up that functional data, which can then be fed into these models. "We're coming at the problem from different directions," he said. "Hopefully, at some point in the future, we'll meet in the middle and save each other a couple years of extra work to complete the story."



SOURCE  The Atlantic

By 33rd Square