- Published on 20 December 2017
Plastic scintillators consist of one or several fluorescent probes embedded in a polymer matrix. They produce light when interacting with a radioactive source. Recently, the technology has been modified by making the scintillators denser, improving their absorption while limiting fluorescence quenching and making them usable as pseudo-gamma spectrometers.
This is just one example of the numerous advances reported in a review based on four editions of the ANIMMA conferences (www.animma.com). The review is organized by measurement methodology: neutronic, photonic, thermal, acoustic and optical. It also covers medical imaging as well as progress in data acquisition and electronic hardening. Applications span many fields, including fundamental physics, fission and fusion reactors, medical imaging, environmental protection and homeland security, and radioactive waste measurement and control.
EPJ C Highlight - Combining experimental data to test models of new physics that explain dark matter
- Published on 19 December 2017
The most statistically consistent and versatile tool to date is designed to gain insights into dark matter from models that extend the standard model of particle physics, rigorously comparing them with the latest experimental data
In chess, a gambit refers to a move in which a player risks one piece to gain an advantage. The quest to explain dark matter, a missing ingredient from the minimal model that can describe the fundamental particles we have observed (referred to as the standard model of particle physics), has left many physicists eager to gain an advantage when comparing theoretical models to as many experiments as possible. In particular, maintaining calculation speed without sacrificing the number of parameters involved is a priority. Now the GAMBIT collaboration, an international group of physicists, has just published a series of papers in EPJ C that offer the most promising approach to date to understanding dark matter.
- Published on 05 December 2017
The personal recollections of a physicist involved in developing a reference model in particle physics, called the Standard Model, particularly in Italy
Understanding the Universe requires first understanding its building blocks, a field covered by particle physics. Over the years, an elegant model of particle physics, dubbed the Standard Model, has emerged as the main point of reference for describing the fundamental components of matter and their interactions. The Standard Model is not confined to particle physics; it also provides us with a guide to understanding phenomena that take place in the Universe at large, back to the first moments of the Big Bang, and it sets the stage for a novel cosmic problem, namely the identification of dark matter. Placing the Standard Model in a historical context sheds valuable light on how the theory came to be. In a remarkable paper published in EPJ H, Luciano Maiani from the University of Rome and the National Institute of Nuclear Physics, Italy, shares his personal recollections with Luisa Bonolis from the Max Planck Institute for the History of Science, Berlin, Germany. During an interview recorded over several days in March 2016, Maiani outlines the role of those researchers who were instrumental in the evolution of theoretical particle physics in the years when the Standard Model was developed.
- Published on 04 December 2017
New study elucidates the DNA sequences that offer the perfect conditions for packaged DNA to unwrap and ‘breathe’, thus allowing genes to be read
Accessing DNA wrapped into basic units of packaging, called nucleosomes, depends on the underlying sequence of DNA building blocks, or base pairs. Like Christmas presents, some nucleosomes are easier to unwrap than others. This is because what makes the double helix stiffer or softer, straight or bent—in other words, what determines its elasticity—is the actual base pair sequence. In a new study published in EPJ E, Jamie Culkin from Leiden University, the Netherlands, and colleagues demonstrate the role of the DNA sequence in making it possible for packaged DNA to open up and let genes be read and expressed.
- Published on 04 December 2017
Nowadays, platforms like Twitter play a big role in the aftermath of disasters, such as natural disasters, mass shootings, or terror attacks, as people try to receive the latest information on what happened through social media channels. A new study published in EPJ Data Science shows how an analysis of social media responses to disasters might help us better understand the dynamics of the public’s attention during these events, what such an analysis reveals about people’s attention spans and focus points in the aftermath of disasters, and how analyses like these could be performed in a cost-effective way.
(Guest post by Yu-Ru Lin, originally published on SpringerOpen blog)
- Published on 01 December 2017
New method creates time-efficient way of computing models of complex systems reaching equilibrium
When the maths cannot be done by hand, physicists modelling complex systems, like the dynamics of biological molecules in the body, need to use computer simulations. Such complicated systems require a period of time to settle into a balanced state before they can be measured. The question is: how long do computer simulations need to run to be accurate? Speeding up processing time to elucidate highly complex study systems has been a common challenge. And it cannot be done by running parallel computations, because the results from the previous time step matter for computing the next one. Now, Shahrazad Malek from the Memorial University of Newfoundland, Canada, and colleagues have developed a practical partial solution to the problem of saving time when using computer simulations that require bringing a complex system into a steady state of equilibrium and measuring its equilibrium properties. These findings are part of a special issue on “Advances in Computational Methods for Soft Matter Systems,” recently published in EPJ E.
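The two points above, that each time step depends on the previous one, and that measurements are only meaningful after an equilibration period, can be illustrated with a toy relaxation simulation. This is a minimal sketch, not the authors' method; the decay rate, noise level, and burn-in fraction are illustrative assumptions.

```python
import random

def simulate(n_steps, x0=10.0, relax=0.1, noise=0.05, seed=42):
    """Sequential toy dynamics: each new state depends on the previous
    one, so the time loop cannot be parallelised across steps."""
    rng = random.Random(seed)
    x = x0
    traj = []
    for _ in range(n_steps):
        # Relax toward equilibrium (x = 0) with small thermal noise.
        x += -relax * x + rng.gauss(0.0, noise)
        traj.append(x)
    return traj

def equilibrium_average(traj, burn_in_fraction=0.5):
    """Discard the equilibration (burn-in) period before measuring."""
    start = int(len(traj) * burn_in_fraction)
    tail = traj[start:]
    return sum(tail) / len(tail)

traj = simulate(2000)
avg = equilibrium_average(traj)
# Early samples are biased by the initial condition x0 = 10; the
# post-burn-in average settles near the equilibrium value (0 here).
```

The burn-in step is the expensive part the article refers to: every independent measurement run has to pay this sequential equilibration cost before its data become usable.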
EPJ Data Science Highlight - Sentiment analysis methods for understanding large-scale texts: a case for using continuum-scored words and word shift graphs
- Published on 29 November 2017
Due to the emergence and continuously increasing usage of social media services all over the world, it is now possible to estimate in real time how entire groups of people are feeling at a given moment. However, in order to be able to interpret the available data correctly, the right tools and methods need to be used. A new article in EPJ Data Science examines a range of such methods and shows both their abilities and their limitations.
(Guest post by Andrew Reagan, originally published on SpringerOpen blog)
As a grad student trying to understand the emotional content of some unreadably large collection of texts, you may find a typical night goes something like this: you’re up late planning a new research study, thinking about trying some of this fancy sentiment-based text analysis. You resort to your favorite search engine with the query “sentiment analysis package python.” We have all been there, except maybe with R instead of Python (the latter being my favorite).
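The continuum-scored approach the article advocates can be sketched in a few lines: each word carries a real-valued score (e.g. on a 1-9 happiness scale, as in dictionary-based methods), and a text is scored by averaging its matched words while excluding near-neutral ones. The lexicon values and the neutral band below are illustrative assumptions, not the study's actual data.

```python
# Toy continuum-scored sentiment lexicon (1 = saddest, 9 = happiest).
# These scores are made up for illustration.
LEXICON = {
    "happy": 8.3, "love": 8.4, "good": 7.5,
    "sad": 2.4, "terrible": 1.9, "the": 5.0, "and": 5.0,
}

def sentiment_score(text, lexicon=LEXICON, neutral_band=(4.0, 6.0)):
    """Average the continuum scores of matched words, excluding
    near-neutral scores so they don't wash out the signal."""
    lo, hi = neutral_band
    scores = [lexicon[w] for w in text.lower().split()
              if w in lexicon and not (lo <= lexicon[w] <= hi)]
    if not scores:
        return None  # no scored, non-neutral words found
    return sum(scores) / len(scores)

print(sentiment_score("the terrible sad news"))  # low (negative) score
print(sentiment_score("love the happy ending"))  # high (positive) score
```

Word shift graphs, the article's other main tool, then decompose the difference between two texts' scores into per-word contributions, which is what makes the result interpretable rather than a bare number.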
- Published on 22 November 2017
Chinese physicists manipulate the transfer of thermal energy as a means of reducing heat waste, using thermal camouflage tactics
Ever heard of the invisibility cloak? It manipulates how light travels along the cloak to conceal an object placed behind it. Similarly, the thermal cloak is designed to hide heated objects from infrared detectors without distorting the temperature outside the cloak. Materials for such cloaks would need to offer zero thermal conductivity to help camouflage the heat. Now, Liujun Xu and colleagues from Fudan University, Shanghai, China, have explored a new mechanism for designing such materials. These findings published in EPJ B could have implications for manipulating the transfer of thermal energy as a way to ultimately reduce heat waste from fossil fuels and help mitigate energy crises.
- Published on 22 November 2017
New study of the trading interactions that determine the stock price using AI algorithms reveals unexpected microstructure for stock evolution, useful for financial crash modeling
Every day, thousands of orders for selling or buying stocks are registered and processed within milliseconds. Electronic stock exchanges, such as NASDAQ, use what is referred to as microscopic modelling of the order flow - reflecting the dynamics of order bookings - to facilitate trading. The study of such market microstructures is a relatively new research field focusing on the trading interactions that determine the stock price. Now, a German team from the University of Duisburg-Essen has analysed the statistical regularities and irregularities in the recent order flow of 96 different NASDAQ stocks. Since prices are strongly correlated during financial crises, they evolve in a way that is similar to what happens to nerve signals during epileptic seizures. The findings of the Duisburg-Essen group, published in EPJ B, contribute to modelling price evolution, and could ultimately be used to evaluate the impact of financial crises.
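As a rough illustration of the kind of correlation analysis mentioned above (not the Duisburg-Essen group's actual method), one can compute log returns from two price series and measure their Pearson correlation; during crises this cross-stock correlation rises toward one. The toy price series below are invented for the example.

```python
import math

def returns(prices):
    """Log returns from a price series."""
    return [math.log(b / a) for a, b in zip(prices, prices[1:])]

def correlation(xs, ys):
    """Pearson correlation between two return series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Two toy stocks moving largely in lockstep, as in a crisis:
a = [100, 98, 95, 97, 93, 90]
b = [50, 49, 47.5, 48.6, 46.4, 45.1]
rho = correlation(returns(a), returns(b))
# rho close to 1 indicates strongly correlated price evolution
```

The actual study works at a much finer level, on the order flow itself (the stream of individual buy and sell bookings), rather than on end-of-day prices as sketched here.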
- Published on 22 November 2017
Ion beam cancer therapy could be improved if ion-induced shock waves are discovered. A new study explores how these predicted waves can be observed
An arrow shooting through an apple makes for a spectacular explosive sight in slow motion. Similarly, energetic ions passing through liquid droplets induce shock waves, which can fragment the droplets. In a study published in EPJ D, Eugene Surdutovich from Oakland University, Rochester, Michigan, USA, together with colleagues from the MBN Research Centre, Frankfurt, Germany, has proposed a way to observe the predicted ion-induced shock waves. They believe these can be identified by observing the way incoming ions fragment liquid droplets into multiple smaller droplets. The discovery of such shock waves would change our understanding of the nature of ion-induced radiation damage to cancerous tumours. This matters for the optimisation of ion-beam cancer therapy, which requires a thorough understanding of the relation between the physical characteristics of the incoming ion beam and its effects on biological tissues.