A new neural network can solve the three-body problem up to 100 million times faster than existing methods, Science Alert reports, citing a preprint on arXiv.org.
What is the three-body problem?
First formulated by Isaac Newton, the three-body problem asks for the relative motion of three bodies (material points) interacting according to Newton's law of gravity (for example, the Sun, the Earth, and the Moon).
Unlike the two-body problem, in the general case it has no solution in the form of closed analytical expressions. Only certain exact solutions are known, for special initial velocities and coordinates of the objects.
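Because no general closed-form solution exists, the problem is usually attacked numerically: the pairwise gravitational accelerations are computed and the equations of motion are integrated step by step. A minimal sketch (the leapfrog scheme, initial positions, and units here are illustrative choices, not taken from the paper):

```python
import numpy as np

def accelerations(pos, masses, G=1.0):
    """Pairwise Newtonian gravitational accelerations for N point masses."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * masses[j] * r / np.linalg.norm(r) ** 3
    return acc

def integrate(pos, vel, masses, dt=1e-3, steps=500):
    """Leapfrog (velocity Verlet) integration of the N-body equations."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc
    return pos, vel

# Three equal masses released from rest -- the simplified configuration
# the researchers used (these particular coordinates are made up).
masses = np.ones(3)
pos0 = np.array([[1.0, 0.0], [-0.5, 0.8], [-0.5, -0.8]])
vel0 = np.zeros_like(pos0)
pos, vel = integrate(pos0, vel0, masses, dt=1e-3, steps=500)
```

For chaotic configurations a production solver such as Brutus must take extremely small, adaptive steps to stay accurate, which is what makes direct integration so expensive.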
Between 1892 and 1899, the French mathematician, mechanician, physicist, astronomer, and philosopher Henri Poincaré proved that there are infinitely many particular solutions of the three-body problem. At present, at least 21 particular solutions are known. For example, in 1911 the American mathematician and astronomer William Duncan MacMillan found a new particular solution, but without a clear mathematical justification; only in 1961 did the Soviet mathematician Kirill Sitnikov provide a rigorous mathematical proof for this case.
In 2013, the Serbian scientists Milovan Suvakov and Veljko Dmitrasinovic from the Institute of Physics in Belgrade found 13 new particular solutions to the three-body problem, in which three objects of equal mass move in a repeating cycle.
Today the three-body problem is essential for studying the behavior of globular star clusters, galactic nuclei containing binary black holes, and other astronomical systems.
Can a neural network solve the problem?
More precisely, a neural network can dramatically speed up obtaining an answer.
An international group of scientists from Britain, Portugal, and the Netherlands applied a deep artificial neural network (ANN) to the problem, which finds the answer up to 100 million times faster than any available algorithm.
The scientists built the neural network and trained it on a database of already computed solutions. To simplify the task, they restricted it to three particles of equal mass with zero initial velocity, and then ran an existing integrator called Brutus more than 10 thousand times.
On this basis, the new ANN received five thousand fresh scenarios to work with, and its results were compared with Brutus's own predictions.
It turned out that the neural network performed best when the time intervals in the training set were minimal. The network was a hundred thousand times faster than Brutus, and in some cases a hundred million times faster.
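In essence, the setup above is supervised regression: the inputs are the initial conditions plus a time t, and the targets are the positions the integrator computed for that time. A toy illustration with a small NumPy multilayer perceptron follows; the synthetic data, network size, and training loop are placeholder assumptions (the paper's actual network was deeper and trained on real Brutus trajectories):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the Brutus-generated dataset: 4 input coordinates
# plus a time value, mapped to a smooth 2-D target. Purely illustrative.
X = rng.uniform(-1, 1, size=(5000, 5))
Y = np.stack([np.sin(X @ rng.normal(size=5)),
              np.cos(X @ rng.normal(size=5))], axis=1)

# Tiny two-layer MLP trained with mean-squared error and full-batch
# gradient descent.
W1 = rng.normal(scale=0.5, size=(5, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.5, size=(64, 2)); b2 = np.zeros(2)
lr = 0.05

losses = []
for epoch in range(200):
    H = np.tanh(X @ W1 + b1)            # hidden layer
    P = H @ W2 + b2                     # predicted positions
    err = P - Y
    losses.append((err ** 2).mean())
    # Backpropagation of the mean-squared-error gradient
    dP = 2 * err / len(X)
    dW2 = H.T @ dP; db2 = dP.sum(0)
    dH = (dP @ W2.T) * (1 - H ** 2)
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The speed-up reported in the article comes from this structure: once trained, the network answers with a single cheap forward pass, whereas the integrator must step through the whole chaotic trajectory every time.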
“Eventually, we envision that the network may be trained on more complex chaotic problems, such as the four- and five-body problems, further reducing the computational load,” the researchers conclude.