Forumchem - Forum with AI(ALICE BOT & HAL9000) and TTS

More difficult for us, easier for you
It is currently Thu Mar 28, 2024 4:55 pm

All times are UTC





Author Message
 Post subject: Large Hadron Collider Measurements --"Point to Basic Symmetry of the Universe"
PostPosted: Wed Jan 20, 2016 2:35 am 

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
Large Hadron Collider Measurements --"Point to Basic Symmetry of the Universe" (Today's Most Popular)





[Image: Alice_General2]





This past August, scientists working with CERN's ALICE (A Large Ion Collider Experiment), a heavy-ion detector on the Large Hadron Collider (LHC) ring, made precise measurements of particle mass and electric charge that confirm the existence of a basic symmetry in nature. "After the Big Bang, for every particle of matter an antiparticle was created. In particle physics, a very important question is whether all the laws of physics display a specific kind of symmetry known as CPT, and these measurements suggest that there is indeed a basic symmetry between nuclei and antinuclei," said Marcelo Gameiro Munhoz, a professor at the University of São Paulo (USP).





ALICE is one of the largest experiments in the world devoted to research in the physics of matter at an infinitely small scale. Hosted at CERN, the European Laboratory for Nuclear Research, this project involves an international collaboration of more than 1500 physicists, engineers and technicians, including around 350 graduate students, from 154 physics institutes in 37 countries across the world.



The ALICE Experiment is going in search of answers to basic questions, using the extraordinary tools provided by the LHC: What happens to matter when it is heated to 100,000 times the temperature at the centre of the Sun? Why do protons and neutrons weigh 100 times more than the quarks they are made of? Can the quarks inside the protons and neutrons be freed?



The findings, reported in a paper published online in Nature Physics, led the researchers to confirm a basic symmetry between the nuclei of the particles and their antiparticles in terms of charge, parity and time (CPT).



These measurements of particles produced in high-energy collisions of heavy ions in the LHC were made possible by the ALICE experiment's high-precision tracking and identification capabilities, as part of an investigation designed to detect subtle differences between the ways in which protons and neutrons join in nuclei while their antiparticles form antinuclei.











Munhoz is the principal investigator for the research project "High-energy nuclear physics at RHIC and LHC", supported by the São Paulo Research Foundation (FAPESP). The project--a collaboration between the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory in the United States and ALICE at the LHC, operated by the European Organization for Nuclear Research (CERN) in Switzerland--consists of experimental activities relating to the study of relativistic heavy-ion collisions.



Among other objectives, the Brazilian researchers involved with ALICE seek to understand the production of heavy quarks (charm and bottom quarks) based on the measurement of electrons using an electromagnetic calorimeter and, more recently, Sampa, a microchip developed in Brazil to study rarer phenomena arising from heavy-ion collisions in the LHC.



According to Munhoz, the measurements of mass and charge performed in the symmetry experiment, combined with other studies, will help physicists to determine which of the many theories on the basic laws of the universe is most plausible. "These laws describe the nature of all matter interactions," he said, "so it's important to know that physical interactions aren't changed by particle charge reversal, parity transformation (reflection of spatial coordinates) and time inversion. The key question is whether the laws of physics remain the same under such conditions."



In particular, the researchers measured the mass-over-charge ratio differences for deuterons, consisting of a proton and a neutron, and antideuterons, as well as for nuclei of helium-3, comprising two protons and one neutron, and antihelium-3. Recent measurements at CERN compared the same properties of protons and antiprotons at high resolution.



The ALICE experiment records high-energy collisions of lead ions at the LHC, enabling the study of matter at extremely high temperatures and densities. The lead-ion collisions provide an abundant source of particles and antiparticles, producing nuclei and the corresponding antinuclei at nearly equal rates. This allows ALICE to make a detailed comparison of the properties of the nuclei and antinuclei that are most copiously produced. The experiment makes precise measurements of both the curvature of particle tracks in the detector's magnetic field and the particles' time of flight, and uses this information to determine the mass-to-charge ratios for nuclei and antinuclei.
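The curvature-plus-time-of-flight method described above can be illustrated with a minimal sketch: track curvature in a known magnetic field gives the momentum per unit charge (magnetic rigidity), and the time of flight over a known path gives the velocity; combining the two yields the mass-to-charge ratio. This is textbook kinematics, not ALICE's actual reconstruction code, and the field, radius, and flight-path numbers below are purely illustrative.

```python
import math

C = 299792458.0  # speed of light, m/s


def mass_over_charge(b_tesla, radius_m, path_m, tof_s):
    """Estimate m/z (in GeV/c^2 per unit charge) from track curvature and time of flight.

    Rigidity: p/z = 0.3 * B * r   (GeV/c, with B in tesla, r in metres, z in units of e)
    Velocity: beta = L / (c * t)
    Mass:     m/z = (p/z) * sqrt(1/beta^2 - 1)
    """
    p_over_z = 0.3 * b_tesla * radius_m        # momentum per unit charge, GeV/c
    beta = path_m / (C * tof_s)                # velocity as a fraction of c
    return p_over_z * math.sqrt(1.0 / beta**2 - 1.0)


# Illustrative (hypothetical) example: a 0.5 T field, 10 m track radius,
# and a 3.7 m flight path traversed at beta = 0.6245 give
# m/z close to 1.876 GeV/c^2 -- the deuteron's mass per unit charge.
tof = 3.7 / (0.6245 * C)
print(mass_over_charge(0.5, 10.0, 3.7, tof))
```

A real analysis must also correct for energy loss and track geometry, but the sketch shows why picosecond-level timing matters: the mass resolution is driven by how precisely beta can be measured.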



The high precision of the time-of-flight detector, which determines the arrival time of particles and antiparticles with a resolution of 80 picoseconds and is combined with the energy-loss measurement provided by the time-projection chamber, allows the scientists involved to measure a clear signal for deuterons/antideuterons and helium-3/antihelium-3, the particles studied in the symmetry experiment.



The image at the top of the page is an artist's conception that illustrates the history of the cosmos, from the Big Bang and the recombination epoch that created the microwave background, through the formation of galactic superclusters and galaxies themselves. The dramatic flaring at right emphasizes that the universe's expansion currently is speeding up.



The Daily Galaxy via Fundação de Amparo à Pesquisa do Estado de São Paulo



Image credit: cfa.harvard.edu and http://alicematters.web.cern.ch







Source


 
 Post subject: "Unlocking the Cosmos" --China to Trump CERN's Large Hadron Collider
PostPosted: Mon Feb 08, 2016 9:53 am 

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
"Unlocking the Cosmos" --China to Trump CERN's Large Hadron Collider (Weekend Feature)





[Image: Dark_matter_2]





The race is on to uncover a new physics beyond the Standard Model and the Brout-Englert-Higgs mechanism, as well as clues to understanding dark matter and supersymmetry, following China's announcement this past fall that it is planning to build an enormous particle accelerator twice the size and seven times as powerful as CERN's Large Hadron Collider. The Standard Model is the collection of theories physicists have derived, and continually revise, to explain the universe and how the tiniest building blocks of our universe interact with one another. Problems with the Standard Model remain to be solved. For example, gravity has not yet been successfully integrated into the framework.





According to China Daily reports, the new facility will be capable of producing millions of Higgs boson particles - a great deal more than the Large Hadron Collider, which originally discovered the "God particle" back in 2012. The first phase of the project's construction is scheduled to begin between 2020 and 2025.



Gerard 't Hooft, winner of the Nobel Prize in Physics in 1999, said in an earlier interview with the Doha-based broadcaster Al Jazeera that the proposed collider, if built, "will bring hundreds, probably thousands, of top class scientists with different specializations, from pure theory to experimental physics and engineering, from abroad to China".



Chinese scientists have completed an initial conceptual plan of a super giant particle collider which will be bigger and more powerful than any particle accelerator on Earth.



"We have completed the initial conceptual plan and organized international peer review recently, and the final conceptual plan will be completed by the end of 2016," Wang Yifang, director of the Institute of High Energy Physics, Chinese Academy of Sciences, told China Daily in an exclusive interview.



The institute has been operating major high-energy physics projects in China, such as the Beijing Electron Positron Collider and the Daya Bay Reactor Neutrino experiment. Now scientists are proposing a more ambitious new accelerator with seven times the energy level of the Large Hadron Collider in Europe.










In July 2012, the European Organization for Nuclear Research, also known as CERN, announced that it had discovered the long sought-after Higgs boson - the "God particle", regarded as the crucial link that could explain why other elementary particles have mass - at the LHC. The discovery was believed to be one of the most important in physics for decades. Scientists are hopeful that it will further explain nature and the universe we live in.



While the LHC consists of a 27-kilometer accelerator ring and detectors buried 100 meters underground at the border of Switzerland and France, scientists have only managed to spot hundreds of Higgs boson particles - not enough to learn the structure and other features of the particle.



With a circumference of 50 to 100 km, however, the proposed Chinese accelerator Circular Electron Positron Collider (CEPC) will generate millions of Higgs boson particles, allowing a more precise understanding.



"The technical route we chose is different from LHC. While LHC smashes together protons, it generates Higgs particles together with many other particles," Wang said. "The proposed CEPC, however, collides electrons and positrons to create an extremely clean environment that only produces Higgs particles," he added.



The Higgs boson factory is only the first step of the ambitious plan. A second-phase project named SPPC (Super Proton-Proton Collider) is also included in the design - a fully upgraded version of the LHC.



The LHC shut down for upgrading in early 2013 and restarted this past June with an almost doubled collision energy of 13 TeV (teraelectronvolts).



"LHC is hitting its limits of energy level, it seems not possible to escalate the energy dramatically at the existing facility," Wang said. The proposed SPPC will be a 100 TeV proton-proton collider.



If everything moves forward as proposed, the construction of the first phase project CEPC will start between 2020 and 2025, followed by the second phase in 2040.



"China brings to this entire discussion a certain level of newness. They are going to need help, but they have financial muscle and they have ambition," said Nima Arkani-Hamed of the Institute for Advanced Study in the United States, who has joined the effort to promote the CEPC worldwide.



David J. Gross, a US particle physicist and 2004 Nobel Prize winner, wrote in a commentary co-signed by US theoretical physicist Edward Witten that although the cost of the project would be great, the benefits would also be great. "China would leap to a leadership position in an important frontier area of basic science," he wrote.



The Daily Galaxy via China Daily



Image credit: Chandra dark matter galaxy cluster top of page and CERN LHC







Source


 
 Post subject: CERN: "LHC Recreates the Universe Billionths of a Second After the Big Bang"
PostPosted: Wed Feb 10, 2016 12:23 am 

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
CERN: "LHC Recreates the Universe Billionths of a Second After the Big Bang"





[Image: Brookhaven-4trilliondegrees]





Researchers have recreated the universe's primordial soup in miniature by colliding lead atoms at extremely high energy in the 27 km long particle accelerator, the LHC at CERN in Geneva. The primordial soup is a so-called quark-gluon plasma (above), and researchers from the Niels Bohr Institute, among others, have measured its liquid properties with great accuracy at the LHC's top energy. The results have been submitted to Physical Review Letters, the leading scientific journal for nuclear and particle physics.



A few billionths of a second after the Big Bang, the universe was made up of a kind of extremely hot and dense primordial soup of the most basic particles, especially quarks and gluons. This state is called quark-gluon plasma. By colliding lead nuclei at a record-high energy of 5.02 TeV in the world's most powerful particle accelerator, the 27 km long Large Hadron Collider (LHC) at CERN in Geneva, it has been possible to recreate this state in the ALICE experiment's detector and measure its properties.

"The analyses of the collisions make it possible, for the first time, to measure the precise characteristics of a quark-gluon plasma at the highest energy ever and to determine how it flows," explains You Zhou, a postdoc in the ALICE research group at the Niels Bohr Institute. You Zhou, together with a small, hard-working team of international collaboration partners, led the analysis of the new data and measured how the quark-gluon plasma flows and fluctuates after it is formed by the collisions between lead ions.





[Image: Alice]





The focus has been on the quark-gluon plasma's collective properties, which show that this state of matter behaves more like a liquid than a gas, even at the very highest energy densities. The new measurements, which use new methods to study the correlation between many particles, make it possible to determine the viscosity of this exotic fluid with great precision.



You Zhou explains that the experimental method is very advanced and is based on the fact that when two spherical atomic nuclei are shot at each other and hit each other a bit off center, a quark-gluon plasma is formed with a slightly elongated shape somewhat like an American football. This means that the pressure difference between the centre of this extremely hot droplet and the surface varies along the different axes. The pressure differential drives the expansion and flow and consequently one can measure a characteristic variation in the number of particles produced in the collisions as a function of the angle.



The figure below shows how a small, elongated drop of quark-gluon plasma is formed when two atomic nuclei hit each other a bit off center. The angular distribution of the emitted particles makes it possible to determine the properties of the quark-gluon plasma, including the viscosity. (State University of New York).










"It is remarkable that we are able to carry out such detailed measurements on a drop of the early universe that has a radius of only about one millionth of a billionth of a meter. The results are fully consistent with the physical laws of hydrodynamics, i.e. the theory of flowing liquids, and show that the quark-gluon plasma behaves like a fluid. It is however a very special liquid, as it does not consist of molecules like water, but of the basic particles quarks and gluons," explains Jens Jørgen Gaardhøje, professor and head of the ALICE group at the Niels Bohr Institute at the University of Copenhagen.



Jens Jørgen Gaardhøje adds that they are now in the process of mapping this state with ever increasing precision -- and even further back in time.



The Daily Galaxy via University of Copenhagen - Niels Bohr Institute



Image credit top of page: https://www.bnl.gov/bnlweb/pubaf/pr/photos/2012/03/quarksoup-HR.jpg







Source


 
 Post subject: Extreme Binary Star Systems --"New Source of Intense Gamma Radiation"
PostPosted: Thu Feb 18, 2016 6:42 pm 

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
Extreme Binary Star Systems --"New Source of Intense Gamma Radiation"



[Image: Etacar]








Strong stellar winds are generated in binary systems consisting of highly luminous and hot Wolf-Rayet stars and massive (several tens of solar masses) OB companions. Wind collision may produce strong photon emission with photon energies exceeding a hundred megaelectronvolts (MeV). This phenomenon was long considered a possible source of gamma radiation.

Though such radiation had been detected only once, from the famous Eta Carinae shown above, which has been observed for more than four centuries (particularly intensively after 1834, when one of its stars underwent an explosion and for some time was the most luminous star in the sky).



Eta Carinae is comparatively close to Earth - around 7.5 to 8 thousand light-years. The stars in this system weigh 120 and 30-80 solar masses respectively, and shine brighter than millions of suns. If they were 10 parsecs (30 light-years) away from the Earth, they would be just as luminous as the Moon, while the Sun would be invisible at such a distance. Naturally, Eta Carinae was the first candidate to consider, and seven years ago high-energy radiation from this system was finally detected.



However, one example was not enough to confirm the model of binary stars emitting high-energy radiation, and the search for similar sources was continued, which turned out to be a tricky task.



Analyzing the data collected by the Fermi Gamma-ray Space Telescope, Maxim Pshirkov of the Sternberg Astronomical Institute, Moscow State University, discovered a new source confirming that binary systems with strong colliding stellar winds comprise a separate new population of high-energy gamma-ray sources.



"Recent calculations showed such star systems as Eta Carinae to be incredibly rare - probably one per galaxy like ours, or fewer," said Maxim Pshirkov. "My colleagues' research resulted in no certain findings. In 2013 an American-Austrian research team composed a list of seven stellar systems containing Wolf-Rayet stars where such radiation could most probably appear. That research was based on two years of observations and lacked data, so it was only possible to set an upper limit on the high-energy radiation. I decided to utilize a larger set of data - seven years of Fermi-LAT observations. As a result, it was discovered that Gamma Velorum is a source of gamma radiation at the 6-sigma confidence level."



The Gamma Velorum system contains two stars with masses of 30 and 10 solar masses. Their orbital parameters are well studied, and they are separated by about the same distance as the Earth and Sun. The luminosity of this binary system is about 200,000 times higher than that of the Sun, and its strong stellar winds have very high mass loss rates: about a hundred-thousandth and two ten-millionths of a solar mass every year.






Though these figures seem small, the amounts are actually huge, particularly compared to the solar wind, which carries away only about 10^-14 solar masses per year. As the stellar winds in the Gamma Velorum system collide at a speed exceeding 1,000 kilometers per second, particles are accelerated in the shock. Though the exact mechanism of this acceleration is still unknown, it definitely leads to high-energy photon radiation, which was detected by the Fermi LAT.



An attentive reader who followed the search for the Higgs boson at the Large Hadron Collider has probably encountered the standard deviation that Pshirkov mentions, and will remember that in physics a hypothesis is considered proven at a statistical significance higher than 5 sigma. That means it is confirmed with a probability higher than 99.9999%. In other words, Pshirkov's discovery, with its six standard deviations, is definitely reliable, though still not far from the threshold. According to the article, it was partly good luck that helped the researcher.
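The conversion between "sigmas" and probabilities that the paragraph above alludes to follows directly from the Gaussian error function: the two-sided probability that a purely random fluctuation stays within n standard deviations is erf(n / sqrt(2)). A minimal sketch, using only the standard library:

```python
import math


def confidence_from_sigma(n_sigma):
    """Two-sided probability that a Gaussian fluctuation stays within
    n_sigma standard deviations of the mean: erf(n / sqrt(2))."""
    return math.erf(n_sigma / math.sqrt(2.0))


# 3 sigma -> ~99.73%, 5 sigma -> ~99.99994% (the particle-physics
# "discovery" threshold), 6 sigma -> even closer to certainty.
for n in (3, 5, 6):
    print(f"{n} sigma -> {confidence_from_sigma(n):.8%}")
```

This is why a 5-sigma result corresponds to roughly a one-in-3.5-million chance of a background fluctuation, and why a 6-sigma detection such as Gamma Velorum's sits comfortably above the discovery threshold.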



"Searching for similar sources in the galactic plane itself is much more complicated, since the plane is a powerful gamma-ray source in its own right, and detecting the small photon excess coming from colliding stellar winds becomes much more difficult against this background," says the scientist. "But the Gamma Velorum system lies above the plane and is comparatively close to us. The discovery probably would not have happened if it were further away or closer to the plane."



The Daily Galaxy via Lomonosov Moscow State University



Image credit: NASA/Chandra X-Ray Observatory







Source


 
 Post subject: "Bigger Than LIGO's Detection of Gravitational Waves?" --The Discovery of Dark Matter, Argue Scientists
PostPosted: Thu Feb 18, 2016 8:49 pm 

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
"Bigger Than LIGO's Detection of Gravitational Waves?" --The Discovery of Dark Matter, Argue Scientists











The discovery of dark matter, argued Carlos Frenk, Director of the Institute for Computational Cosmology at Durham University's world-renowned theoretical cosmology research group, at the annual meeting of the American Association for the Advancement of Science, would be more important than the detection of gravitational waves - the warping of spacetime predicted by Einstein, born of colliding black holes, and detected this week by LIGO scientists.



Scientists are confident that dark matter exists because the effects of its gravity can be seen in the rotation of galaxies and in the way light bends as it travels through the universe. WIMPs, or weakly interacting massive particles, are among the leading candidates for dark matter. Because WIMPs are thought to interact with other matter only on very rare occasions, they have yet to be detected directly.

"In a certain way we're still going through an existential crisis," said Tim Andeen, one of the hundreds of scientists who helped find the Higgs boson particle exactly as they'd hoped to in 2012 at the Large Hadron Collider in Geneva. "We had a thing to go and search for, and we got it," he said. "Things would've gotten really weird if we hadn't - we would've observed all kinds of things in the detector."



The hunt for new discoveries continues apace - for signatures of supersymmetry, extra dimensions and dark matter, observed Andeen, now at the University of Texas at Austin. "We don't have a Higgs boson to look for anymore, but we do know the Higgs boson can't be the end of the story."



"And so the search continues," says Dan McKinsey, a UC Berkeley physics professor and co-spokesperson for LUX who is also an affiliate with Berkeley Lab. "LUX, the Large Underground Xenon dark matter experiment, is once again in dark matter detection mode at Sanford Lab. The latest run began in late 2014 and is expected to continue until June 2016. This run will represent an increase in exposure of more than four times compared to our previous 2013 run. We will be very excited to see if any dark matter particles have shown themselves in the new data."



LUX, which operates nearly a mile underground at the Sanford Underground Research Facility (SURF) in the Black Hills of South Dakota, has proven itself to be the most sensitive detector in the hunt for dark matter, the unseen stuff believed to account for most of the matter in the universe. Now, a new set of calibration techniques employed by LUX scientists has again dramatically improved the detector's sensitivity.



Researchers with LUX are looking for WIMPs. "We have improved the sensitivity of LUX by more than a factor of 20 for low-mass dark matter particles, significantly enhancing our ability to look for WIMPs," said Rick Gaitskell, professor of physics at Brown University and co-spokesperson for the LUX experiment. "It is vital that we continue to push the capabilities of our detector in the search for the elusive dark matter particles," Gaitskell said. A view inside the LUX detector is shown below. (Photo by Matthew Kapust/Sanford Underground Research Facility).





[Image: 120329_LUX-detector_0040-ZF-6493-80311-1-001]





LUX improvements, coupled to advanced computer simulations at the U.S. Department of Energy's Lawrence Berkeley National Laboratory's (Berkeley Lab) National Energy Research Scientific Computing Center (NERSC) and Brown University's Center for Computation and Visualization (CCV), have allowed scientists to test additional particle models of dark matter that now can be excluded from the search. NERSC also stores large volumes of LUX data--measured in trillions of bytes, or terabytes--and Berkeley Lab has a growing role in the LUX collaboration.



The new research is described in a paper submitted to Physical Review Letters. The work reexamines data collected during LUX's first three-month run in 2013 and helps to rule out the possibility of dark matter detections at low-mass ranges where other experiments had previously reported potential detections.



LUX consists of a third of a ton of liquid xenon surrounded by sensitive light detectors. It is designed to identify the very rare occasions when a dark matter particle collides with a xenon atom inside the detector. When a collision happens, the xenon atom recoils and emits a tiny flash of light, which is detected by LUX's light sensors. The detector's location at Sanford Lab, beneath a mile of rock, helps to shield it from cosmic rays and other radiation that would interfere with a dark matter signal.



So far LUX hasn't detected a dark matter signal, but its exquisite sensitivity has allowed scientists to all but rule out vast mass ranges where dark matter particles might exist. These new calibrations increase that sensitivity even further.



One calibration technique used neutrons as stand-ins for dark matter particles. Bouncing neutrons off the xenon atoms allows scientists to quantify how the LUX detector responds to the recoiling process.



"It is like a giant game of pool with a neutron as the cue ball and the xenon atoms as the stripes and solids," Gaitskell said. "We can track the neutron to deduce the details of the xenon recoil, and calibrate the response of LUX better than anything previously possible."
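The "pool" analogy above rests on standard elastic-scattering kinematics: because a xenon nucleus is roughly 131 times heavier than a neutron, only a small fraction of the neutron's energy can be transferred to the recoiling nucleus. A minimal sketch of that textbook formula (this is general two-body kinematics, not the LUX collaboration's calibration code):

```python
def max_recoil_fraction(m_projectile, m_target):
    """Maximum fraction of the projectile's kinetic energy transferred to the
    target in a non-relativistic, head-on elastic collision:
        E_recoil,max / E_projectile = 4 m M / (m + M)^2
    """
    return 4.0 * m_projectile * m_target / (m_projectile + m_target) ** 2


# Neutron (~1 u) scattering off xenon-131 (~131 u): at most ~3% of the
# neutron's energy goes into the nuclear recoil, so neutrons probe exactly
# the low-energy recoil regime expected from WIMP collisions.
print(max_recoil_fraction(1.0, 131.0))
```

Equal-mass "cue ball" collisions can transfer all of the energy (the fraction is 1 when m equals M), which is why the heavy xenon target keeps recoil energies small.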



The nature of the interaction between neutrons and xenon atoms is thought to be very similar to the interaction between dark matter and xenon. "It's just that dark matter particles interact very much more weakly - about a million-million-million-million times more weakly," Gaitskell said.



The neutron experiments help to calibrate the detector for interactions with the xenon nucleus. But LUX scientists have also calibrated the detector's response to the deposition of small amounts of energy by struck atomic electrons. That's done by injecting tritiated methane - a radioactive gas - into the detector.



"In a typical science run, most of what LUX sees are background electron recoil events," said Carter Hall, a University of Maryland professor. "Tritiated methane is a convenient source of similar events, and we've now studied hundreds of thousands of its decays in LUX. This gives us confidence that we won't mistake these garden-variety events for dark matter."



Another radioactive gas, krypton, was injected to help scientists distinguish between signals produced by ambient radioactivity and a potential dark matter signal.



"The krypton mixes uniformly in the liquid xenon and emits radiation with a known, specific energy, but then quickly decays away to a stable, non-radioactive form," said Dan McKinsey. By precisely measuring the light and charge produced by this interaction, researchers can effectively filter out background events from their search.



McKinsey, formerly at Yale University, joined UC Berkeley and Berkeley Lab in July, accompanied by members of his research team.



The Sanford Lab is a South Dakota-owned facility. Homestake Mining Co. donated its gold mine in Lead, South Dakota, to the South Dakota Science and Technology Authority (SDSTA), which reopened the facility in 2007 with $40 million in funding from the South Dakota State Legislature and a $70 million contribution from philanthropist T. Denny Sanford. The U.S. Department of Energy (DOE) supports Sanford Lab's operations.



The LUX scientific collaboration, which is supported by the DOE and National Science Foundation (NSF), includes 19 research universities and national laboratories in the United States, the United Kingdom and Portugal.



Planning for the next-generation dark matter experiment at Sanford Lab is already under way. In late 2016 LUX will be decommissioned to make way for a new, much larger xenon detector, known as the LUX-ZEPLIN (LZ) experiment. LZ would have a 10-ton liquid xenon target, which will fit inside the same 72,000-gallon tank of pure water used by LUX. Berkeley Lab scientists will have major leadership roles in the LZ collaboration.



"The innovations of the LUX experiment form the foundation for the LZ experiment, which is planned to achieve over 100 times the sensitivity of LUX. The LZ experiment is so sensitive that it should begin to detect a type of neutrino originating in the Sun that even Ray Davis' Nobel Prize-winning experiment at the Homestake mine was unable to detect," according to Harry Nelson of UC Santa Barbara, spokesperson for LZ.



Dark matter has often been regarded as a totally new exotic form of matter, such as a particle moving in extra dimensions of space or its quantum version, super-symmetry.



"We have seen this kind of particle before. It has the same properties - the same type of mass, the same type of interactions, in the same type of theory of strong interactions that gave rise to the ordinary pions, which are responsible for binding atomic nuclei together. It is incredibly exciting that we may finally understand why we came to exist," said Hitoshi Murayama this past July. He is a professor of physics at the University of California, Berkeley, and Director of the Kavli Institute for the Physics and Mathematics of the Universe at the University of Tokyo.










The image above is an artist's impression of dark matter distribution. The left image assumes conventional dark matter theories, where dark matter would be highly peaked in a small area at the galaxy's center. The right image assumes SIMPs, where dark matter in a galaxy would spread out from the center.



The new theory predicts dark matter is likely to interact with itself within galaxies or clusters of galaxies, possibly modifying the predicted mass distributions. "It can resolve outstanding discrepancies between data and computer simulations," says Eric Kuflik, a postdoctoral researcher at Cornell University. University of California, Berkeley postdoctoral researcher Yonit Hochberg adds, "The key differences in these properties between this new class of dark matter theories and previous ideas have profound implications on how dark matter can be discovered in upcoming experimental searches."



The next step will be to put this theory to the test using experiments such as CERN's Large Hadron Collider, the new SuperKEKB, and the proposed SHiP experiment.



The image at the top of the page, from the NASA/ESA Hubble Space Telescope, shows the rich galaxy cluster Abell 3827. The strange blue structures surrounding the central galaxies are gravitationally lensed views of a much more distant galaxy behind the cluster. Observations of the central four merging galaxies have provided hints that the dark matter around one of the galaxies is not moving with the galaxy itself, possibly implying that dark matter-dark matter interactions of an unknown nature are occurring.



The Daily Galaxy via Kavli IPMU, DOE/Lawrence Berkeley National Laboratory, and theguardian.com



Image credits: Kavli IPMU - Kavli IPMU modified this figure based on the image credited to NASA, STScI; Top of page: NASA/ESA/European Southern Observatory







Source


Top
 Profile      
 
 Post subject: Strangely-Shaped Black Holes in 5th Dimension--"Could B
PostPosted: Sun Feb 21, 2016 6:59 am 
Online
User avatar

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
Strangely-Shaped Black Holes in 5th Dimension--"Could Break Down the Laws of Physics"











We think of the universe as existing in three dimensions, plus the fourth dimension of time, which together are referred to as spacetime. But in branches of theoretical physics such as string theory, the universe could be made up of as many as 11 dimensions. Additional dimensions could be large and expansive, or they could be curled up, tiny, and hard to detect. Since humans can only directly perceive three dimensions, the existence of extra dimensions can only be inferred through very high energy experiments, such as those conducted at the Large Hadron Collider.



Einstein's theory itself does not state how many dimensions there are in the universe, so theoretical physicists have been studying general relativity in higher dimensions to see if cosmic censorship still holds. The discovery of ring-shaped black holes in five dimensions led researchers to hypothesize that they could break up and give rise to a naked singularity.

University of Cambridge researchers, along with their co-author Pau Figueras from Queen Mary University of London, have found that if the ring is thin enough, it can lead to the formation of naked singularities and could cause Einstein's general theory of relativity, a foundation of modern physics, to break down. However, such an object could only exist in a universe with five or more dimensions.



The researchers have successfully simulated a black hole shaped like a very thin ring, which gives rise to a series of bulges connected by strings that become thinner over time. These strings eventually become so thin that they pinch off into a series of miniature black holes, similar to how a thin stream of water from a tap breaks up into droplets.



Ring-shaped black holes were discovered by theoretical physicists in 2002, but this is the first time that their dynamics have been successfully simulated using supercomputers. Should this type of black hole form, it would lead to the appearance of a naked singularity, which would cause the equations behind general relativity to break down. The results are published in the journal Physical Review Letters.



General relativity underpins our current understanding of gravity: everything from the estimation of the age of the stars in the universe, to the GPS signals we rely on to help us navigate, is based on Einstein's equations. In part, the theory tells us that matter warps its surrounding spacetime, and what we call gravity is the effect of that warp. In the 100 years since it was published, general relativity has passed every test that has been thrown at it, but one of its limitations is the existence of singularities.



A singularity is a point where gravity is so intense that space, time, and the laws of physics break down. General relativity predicts that singularities exist at the center of black holes, and that they are surrounded by an event horizon - the point of no return, where the gravitational pull becomes so strong that escape is impossible - meaning that they cannot be observed from the outside.
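For a non-rotating black hole, that event horizon has a definite size: the Schwarzschild radius r_s = 2GM/c^2. A minimal sketch in Python (constants are rounded values; the 15-million-solar-mass figure for NGC 1068 comes from the closing paragraph of this post):

```python
# Schwarzschild radius r_s = 2*G*M/c**2 of a non-rotating black hole.
# Constants are rounded values, sufficient for an order-of-magnitude estimate.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius in meters for a non-rotating black hole."""
    return 2 * G * mass_kg / C**2

r_sun = schwarzschild_radius(M_SUN)             # about 3 km
r_ngc1068 = schwarzschild_radius(15e6 * M_SUN)  # the NGC 1068 black hole

print(f"1 solar mass: r_s = {r_sun / 1e3:.2f} km")
print(f"15 million solar masses: r_s = {r_ngc1068 / 1e9:.1f} million km")
```

The radius scales linearly with mass, which is why a 15-million-solar-mass black hole has a horizon tens of millions of kilometers across while a stellar-mass one spans only a few kilometers.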



"As long as singularities stay hidden behind an event horizon, they do not cause trouble and general relativity holds - the cosmic censorship conjecture says that this is always the case," said study co-author Markus Kunesch, a PhD student at Cambridges Department of Applied Mathematics and Theoretical Physics (DAMTP). "As long as the cosmic censorship conjecture is legitimate, we can safely predict the future outside of black holes. Because ultimately, what were trying to do in physics is to predict the future given knowledge about the state of the universe now."



But what if a singularity existed outside of an event horizon? If it did, not only would it be visible from the outside, but it would represent an object that has collapsed to an infinite density, a state which causes the laws of physics to break down. Theoretical physicists have hypothesized that such a thing, called a naked singularity, might exist in higher dimensions.



"If naked singularities exist, general relativity breaks down," said co-author Saran Tunyasuvunakool, also a PhD student from DAMTP. "And if general relativity breaks down, it would throw everything upside down, because it would no longer have any predictive power - it could no longer be considered as a standalone theory to explain the universe."





CosmosIX-wide





Using the COSMOS supercomputer shown above, the researchers were able to perform a full simulation of Einstein's complete theory in higher dimensions, allowing them not only to confirm that these black rings are unstable, but also to identify their eventual fate. Most of the time, a black ring collapses back into a sphere, so that the singularity stays contained within the event horizon. Only a very thin black ring becomes sufficiently unstable to form bulges connected by thinner and thinner strings, eventually breaking off and forming a naked singularity. New simulation techniques and computer code were required to handle these extreme shapes.



"The better we get at simulating Einsteins theory of gravity in higher dimensions, the easier it will be for us to help with advancing new computational techniques - were pushing the limits of what you can do on a computer when it comes to Einsteins theory," said Tunyasuvunakool. "But if cosmic censorship doesnt detain in higher dimensions, then maybe we need to look at whats so special about a four-dimensional universe that means it does detain."



The cosmic censorship conjecture is widely expected to be true in our four-dimensional universe, but should it be disproved, an alternative way of explaining the universe would then need to be identified. One possibility is quantum gravity, which approximates Einsteins equations far away from a singularity, but also provides a description of new physics close to the singularity.



The COSMOS supercomputer at the University of Cambridge is part of the Science and Technology Facilities Council (STFC) DiRAC HPC Facility.



Using images captured by the Hubble Space Telescope of galaxy NGC 1068, shown at the top of the page, astronomers have found that the vast doughnut-shaped clouds that surround supermassive black holes at the centers of galaxies are lumpy and irregular rather than smoothly formed - for reasons unknown. The black hole at the heart of NGC 1068, 47 million light-years away in the constellation Cetus - also known as the whale - is around 15 million times as massive as our sun.



The Daily Galaxy via University of Cambridge







Source


Top
 Profile      
 
 Post subject: "The Dawn of a New Physics?" --Scientists at CERNs
PostPosted: Fri Mar 11, 2016 2:49 pm 
Online
User avatar

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
"The Dawn of a New Physics?" --Scientists at CERNs LHC See First Hints





Collision





In light of the latest analysis of the decay of beauty mesons, the dawn of a new era, that of new physics, may be approaching. An important contribution to the analysis has been made by physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Poland.



We can't call it a discovery. Not yet. However, there are some indications that physicists working at the LHC accelerator at the European Organization for Nuclear Research (CERN) near Geneva may see the first traces of physics beyond the current theory that describes the structure of matter. This indication emerges from the latest analysis of data collected by the LHCb experiment in 2011 and 2012. Physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ PAN) in Kraków, Poland, have made an important contribution to the analysis.

To put it in terms of the cinema: where we once only had a few leaked scenes from a much-anticipated blockbuster, the LHC has finally treated fans to the first real trailer, says Prof. Mariusz Witek (IFJ PAN).



To describe the structure of matter on the scale of elementary particles we use the Standard Model, a theoretical framework formulated in the 1970s. Particles we now consider elementary play various roles. Bosons are carriers of forces: photons are related to electromagnetic interactions, eight types of gluons are responsible for strong interactions, and the W+, W- and Z0 bosons mediate weak interactions.



Matter is formed by particles called fermions, which are divided into quarks and leptons. In the Standard Model, there are six types of quarks (down, up, strange, charm, top and bottom) and six types of leptons (electrons, muons, taus and their three corresponding neutrinos), as well as 12 antiparticles associated with them. The recently discovered Higgs boson provides particles with mass (all except the gluons and photons).



Up to now, all measurements agree with the predictions of the Standard Model. However, we know that the Standard Model cannot explain all the features of the Universe. It doesn't predict the masses of particles or tell us why fermions are organized in three families. How did the dominance of matter over antimatter in the universe come about? What is dark matter? Those questions remain unanswered. What's more, the force we all experience every day, gravity, isn't even included in the model, says Witek.



So far the scientists working at the LHC have been concentrating on the search for the Higgs boson (the ATLAS and CMS experiments), working out the differences between matter and antimatter (the LHCb experiment) and testing quark-gluon plasma (the ALICE experiment). Now more attention is being paid to detecting new elementary particles beyond the Standard Model.





CERN





The ATLAS and CMS experiments are trying to see such particles directly. However, it cannot be ruled out that the mass of the new particles is simply too high for them to be produced at the energies of the LHC accelerator. In that case, the only way of discovering new physics would be to notice the influence of new particles on phenomena observed at lower energies. Such influence might manifest itself in modified frequencies of the decays of beauty mesons, or in the angular distributions of their decay products.



In 2011, shortly after the first large samples were gathered by the LHCb experiment, a puzzling anomaly regarding beauty mesons was noticed and announced on the public site of LHCb. These mesons are composed of a light quark, which we can find in the protons and neutrons that form the matter around us, as well as a heavy beauty antiquark, which can be created in the LHC collider. Particles made up of quark-antiquark pairs are unstable, so they decay rapidly.



An anomaly was observed in the decay of a B meson containing two muons among its products. In describing the final state of this decay, up to eight parameters are needed. They define the angular distribution of the decay products, that is, at what angles they will be flying. The traditional method of determining these parameters may lead to false results for the small number of such decays observed. Dr. Marcin Chrząszcz from IFJ PAN, one of the main authors of the analysis, proposed an alternative method in which each parameter was determined independently of the others.



My approach can be likened to determining the year when a family portrait was taken. Rather than looking at the whole picture, it is better to analyze each person individually and from that perspective try to work out the year the portrait was taken, explains Dr. Chrząszcz.



The latest analysis, on the Polish side financed by the National Science Centre and a Diamond Grant awarded to Dr. Chrząszcz, is important not only for its accuracy. The results from the 2011 data have been confirmed by the 2012 data. This increases the likelihood that physicists have encountered a real phenomenon rather than an unforeseen artefact of the measurement.



While searching for new phenomena or new particles, it is assumed that when the effect differs from the prediction of a given theory by more than three standard deviations (3 sigma), that is an indication, but we cannot talk of a discovery until the confidence rises above 5 sigma. To put it slightly differently, 5 sigma means the probability is less than one in three-and-a-half million that random fluctuations alone could produce a result like the one seen. At the presently observed number of such decays, the accuracy of our analysis has reached a deviation of 3.7 sigma. So we still cannot make claims of a discovery, but we certainly have an interesting clue, says Dr. Chrząszcz.
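The sigma thresholds quoted here are just tail probabilities of a Gaussian distribution, and can be checked in a few lines (using the one-sided convention, which is what yields the quoted "one in three-and-a-half million" for 5 sigma):

```python
import math

def one_sided_p_value(sigma):
    """One-sided Gaussian tail probability beyond `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p3 = one_sided_p_value(3.0)   # the "indication" threshold
p37 = one_sided_p_value(3.7)  # the deviation reported by this analysis
p5 = one_sided_p_value(5.0)   # the "discovery" threshold

print(f"3.0 sigma: p = {p3:.2e}")
print(f"3.7 sigma: p = {p37:.2e}")
print(f"5.0 sigma: p = {p5:.2e} (about 1 in {1 / p5:,.0f})")
```

A 3.7 sigma deviation corresponds to roughly a one-in-ten-thousand chance of a pure fluctuation: intriguing, but well short of the 5 sigma discovery standard.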



What could be the cause of the observed effect? The most popular hypothesis among theorists is the existence of a new intermediate Z-prime boson (Z') involved in the decay of B mesons. It would also explain another, slightly weaker effect observed in other decays of B mesons used to measure what is called lepton universality. Still, an explanation of the effect within the framework of the Standard Model cannot be ruled out: perhaps the theoretical calculations do not take into account some important factors affecting the decay mechanism.



The LHC has recently begun another round of colliding protons at higher energy levels, by the end of which physicists will have at their disposal another batch of data to analyze. Will the new physics then become a reality?



As Prof. Witek sums it up: Just like it is with a good movie, everybody wonders what's going to happen in the end, and nobody wants to wait for it.



The image at the top of the page is an artist's rendition of a high-energy collision inside a particle detector (CERN).



The Daily Galaxy via Institute of Nuclear Physics of the Polish Academy of Sciences









Source


Top
 Profile      
 
 Post subject: "Curvature of the Universe" --New Cosmology Reveal
PostPosted: Mon Mar 21, 2016 3:22 pm 
Online
User avatar

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
"Curvature of the Universe" --New Cosmology Reveals "the Future of Particle Physics"





Logarithmic-Map-Of-The-Observable-Universt-Pable-Carlos-Budassi (1)





Researchers have uncovered an entirely new way of doing cosmology, one that sheds light on the future of particle physics by showing how the largest possible structure, the curvature of the universe as a whole, can be used as a lens onto the smallest objects observable today: elementary particles.



Niayesh Afshordi and postdoctoral fellow Elliot Nelson of Canada's Perimeter Institute began with the knowledge that space is flat. While there are local wrinkles, they are wrinkles in a flat space, not wrinkles in curved space. The universe as a whole is within one percent of flat. The problem is that it shouldn't be. The vacuum of space is not empty; it is dense with fields that may be weak but cannot be zero, because nothing quantum can ever be zero: quantum things wiggle.

According to general relativity, such fluctuations should cause spacetime to curve. In fact, a straightforward calculation of how much the vacuum should curve predicts a universe so tightly wound that the moon would not fit inside it.

Cosmologists have typically worked around this problem (that the universe should be curved, but looks flat) by assuming there is some antigravity that exactly offsets the tendency of the vacuum to curve. This set of incorrect predictions and unlikely corrections is known as the cosmological constant problem, and it has been dogging cosmology for more than half a century.



The images above and below are visualizations of the entire observable Universe in a single field of view, illustrations created by Pablo Carlos Budassi based on the almost incomprehensible logarithmic maps created by Princeton University. Budassi's illustrations of celestial bodies were based on images from NASA. Shown in the image are all the bodies of our solar system, with the Sun at the center, the Kuiper Belt, the Oort Cloud, the Perseus arm of the Milky Way Galaxy, and the Andromeda Galaxy. The outer rim is said to be comprised of the Cosmic Microwave Background, a byproduct of the Big Bang, and a ring of plasma said to have been created by the Big Bang.





Pablo-Carlos-Budassi-Entire-Known-Universe-In-One-View


In their paper, Nelson and Afshordi make no attempt to solve it, but where other cosmologists invoked an offsetting constant and moved on, Nelson and Afshordi went on to ask one more question: Does adding such a constant to cancel the vacuums energy guarantee a flat spacetime? Their answer: not quite.

The vacuum is still dense with quantum fields, and it is the nature of quantum fields to fluctuate. Even if they are perfectly offset such that their average value is zero, they will still fluctuate around that zero point. Those fluctuations should (again) cause space to curve, just not as much.

In this scenario, the amount of curvature created by the known fields (the electromagnetic field, for example, or the Higgs field) is too small to be measured, and is therefore allowed. But any unknown field would have to be weak enough that its fluctuations would not cause an observable curvature in the universe. This sets a maximum energy for unknown fields.

A theoretical maximum on a theoretical field may not sound groundbreaking, but the work opens a new window in an unexpected place: particle physics.

A particle, quantum mechanics teaches us, is just an excitation of a field. A photon is an excitation of the electromagnetic field, for example, and the newly discovered Higgs boson is an excitation of the Higgs field. It's roughly similar to the way a wave is an excitation of the ocean. And just as the height of a breaking wave can tell us something about the depth of the water, the mass of a particle depends on the strength of its corresponding field.

New kinds of quantum fields are often associated with proposals to extend the Standard Model of particle physics. If Afshordi and Nelson are right, and there can be no such fields whose fluctuations have enough energy to noticeably curve space, there can be no unknown particles with a mass of more than 35 TeV. The authors predict that if there are new fields and particles associated with an extension to the Standard Model, they will lie below that energy.

For generations, particle physics has made progress from the bottom up: building more and more powerful colliders to create, then spot and study, heavier and heavier particles. It is as if we started from the ground floor and built up, discovering more particles at higher altitudes as we went. What Nelson and Afshordi have done is lower the sky.

There is a great deal of debate in particle physics about whether we should build increasingly powerful accelerators to search for heavier unknown particles. Right now, the most powerful accelerator in the world, the Large Hadron Collider, runs at a top energy of about 14 TeV; a proposed new super accelerator in China would run at about 100 TeV. As this debate unfolds, this new work could be particularly useful in helping experimentalists decide which energy levels (which skyscraper heights) are the most interesting.

The sky does indeed have a limit, this research suggests - and we are about to hit it.



The Daily Galaxy via Perimeter Institute



Image credit: Courtesy of Pablo Carlos Budassi via Wikimedia Commons







Source


Top
 Profile      
 
 Post subject: CERN LHC Reveals: "The Universe a Billionth of a Second
PostPosted: Sat Apr 09, 2016 11:54 pm 
Online
User avatar

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
CERN LHC Reveals: "The Universe a Billionth of a Second After the Big Bang" (Weekend Feature)





Alice (1)







"It is remarkable that we are capable to carry out such detailed measurements on a drop of early universe, that only has a radius of about one millionth of a billionth of a meter. The results are fully consistent with the physical laws of hydrodynamics, i.e. the theory of flowing liquids and it shows that the quark-gluon plasma behaves like a fluid. It is however a very special liquid, as it does not consist of molecules like water, but of the basic particles quarks and gluons," explained Jens Jrgen Gaardhje, professor and head of the ALICE group at the Niels Bohr Institute at the University of Copenhagen.



A few billionths of a second after the Big Bang, the universe was made up of a kind of extremely hot and dense primordial soup of the most fundamental particles, especially quarks and gluons. This state is called quark-gluon plasma. By colliding lead nuclei at a record-high energy of 5.02 TeV in the world's most powerful particle accelerator, the 27 km long Large Hadron Collider (LHC) at CERN in Geneva, it has been possible to recreate this state in the ALICE experiment's detector and measure its properties.





CERN researchers recreated the universe's primordial soup in miniature format by colliding lead ions with extremely high energy in the 27 km long particle accelerator, the LHC in Geneva. The primordial soup is a so-called quark-gluon plasma, and researchers from the Niels Bohr Institute, among others, have measured its liquid properties with great accuracy at the LHC's top energy. The results were submitted to Physical Review Letters, which is the top scientific journal for nuclear and particle physics.



"The analyses of the collisions make it possible, for the first time, to measure the precise characteristics of a quark-gluon plasma at the highest energy ever and to determine how it flows," explains You Zhou, who is a postdoc in the ALICE research group at the Niels Bohr Institute. You Zhou, together with a small, brisk-working team of international collaboration partners, led the analysis of the new data and measured how the quark-gluon plasma flows and fluctuates after it is formed by the collisions between direct ions.



The focus has been on the quark-gluon plasma's collective properties, which show that this state of matter behaves more like a liquid than a gas, even at the very highest energy densities. The new measurements, which use new methods to study the correlation between many particles, make it possible to determine the viscosity of this exotic fluid with great precision.










You Zhou explains that the experimental method is very advanced and is based on the fact that when two spherical atomic nuclei are shot at each other and hit each other a bit off center, a quark-gluon plasma is formed with a slightly elongated shape, somewhat like an American football. This means that the pressure difference between the centre of this extremely hot droplet and its surface varies along the different axes. The pressure differential drives the expansion and flow, and consequently one can measure a characteristic variation in the number of particles produced in the collisions as a function of the angle.
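That characteristic angular variation is usually quantified by Fourier coefficients of the particle yield, dN/dphi proportional to 1 + 2*v2*cos(2*(phi - Psi)). The real ALICE analysis uses far more sophisticated multi-particle correlation methods, but a toy single-event estimator illustrates the idea; the value v2 = 0.1 below is an assumed, typical order of magnitude, not a measured result:

```python
import math
import random

def estimate_v2(phis, psi=0.0):
    """Simple event-plane estimate of elliptic flow: v2 = <cos 2(phi - Psi)>."""
    return sum(math.cos(2 * (phi - psi)) for phi in phis) / len(phis)

# Toy event sample: draw azimuthal angles from dN/dphi proportional to
# 1 + 2*v2*cos(2*phi) by accept-reject. true_v2 = 0.1 is an assumption
# for illustration, not an ALICE measurement.
random.seed(42)
true_v2 = 0.1
phis = []
while len(phis) < 200_000:
    phi = random.uniform(-math.pi, math.pi)
    if random.uniform(0.0, 1.0 + 2 * true_v2) < 1.0 + 2 * true_v2 * math.cos(2 * phi):
        phis.append(phi)

print(f"estimated v2 = {estimate_v2(phis):.3f}")
```

With enough particles the average of cos 2(phi - Psi) recovers the input anisotropy, which is the basic logic behind measuring how the droplet flows.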



Jens Jørgen Gaardhøje adds that they are now in the process of mapping this state with ever increasing precision -- and even further back in time.



The Daily Galaxy via University of Copenhagen - Niels Bohr Institute









Source


Top
 Profile      
 
 Post subject: "A Few Seconds After the Big Bang" --Hacking the O
PostPosted: Fri Apr 22, 2016 2:06 am 
Online
User avatar

Joined: Fri Apr 03, 2009 1:35 am
Posts: 2692
"A Few Seconds After the Big Bang" --Hacking the Origins of Visible-and-Dark Matter





Dark-matter-alpha-magnetic-spectrometer-data





Anticipating precision cosmological data from the next generation of "Extremely Large" telescopes, the BURST code, developed by scientists at Los Alamos National Laboratory in collaboration with colleagues at the University of California San Diego, "promises to open up new avenues for investigating existing puzzles of cosmology," says Los Alamos physicist Mark Paris of the Nuclear and Particle, Astrophysics and Cosmology group. "These include the nature and origin of visible matter and the properties of the more mysterious dark matter and dark radiation."



This innovative multidisciplinary research in nuclear and particle physics and cosmology has led to the development of a new, more accurate computer code to study the early universe. The code simulates conditions during the first few minutes of cosmological evolution to model the role of neutrinos, nuclei and other particles in shaping the early universe.



"The BURST computer code allows physicists to exploit the early universe as a laboratory to study the effect of basic particles present in the early universe," Paris explains. "Our new labor in neutrino cosmology allows the study of the microscopic, quantum mood of basic particles--the basic, subatomic building blocks of mood--by simulating the universe at its largest, cosmological scale," said Paris.



"The frontiers of basic physics have traditionally been studied with particle colliders, such as the Large Hadron Collider at CERN, by smashing together subatomic particles at great energies," says UCSD physicist George Fuller, who collaborated with Paris and other staff scientists at Los Alamos to develop the novel theoretical model. BURST brings a new dimension in simulations. "Our self-consistent approach, achieved for the first time by simultaneously describing all the particles involved, increases the precision of our calculations. This allows us to investigate exotic basic particles that are currently the subject of intense theoretical speculation."



The new theoretical work has been recognized by the Physical Review D editors as an Editors' Suggestion, a category reserved for "a small fraction of papers, which we evaluate to be particularly important, interesting, and well written." It will appear in the late April 2016 issue.



The research is driven by several mission goals of Los Alamos's Nuclear and Particle Futures research pillar in basic and applied nuclear science. According to Paris, "The early universe is becoming such a tightly constrained environment with increasingly good measurements that we can test our descriptions of microscopic quantum physics, such as nuclear cross sections, to high accuracy." These cross sections are important for Los Alamos nuclear data needs that feed into applications in nuclear energy, safety and security.



A few seconds after the Big Bang, the universe was composed of a thick, 10-billion-degree "cosmic soup" of subatomic particles. As the hot universe expanded, these particles' mutual interactions caused the universe to behave as a cooling thermonuclear reactor. This reactor produced light nuclei, such as hydrogen, helium, and lithium, found in the universe today. The amounts of the light nuclei created depend on what other particles - such as neutrinos and perhaps their exotic cousins, "sterile" neutrinos - comprise the "soup," and how they interact with each other.
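That "cooling thermonuclear reactor" picture can be made quantitative with a standard back-of-the-envelope estimate (the textbook argument, not the BURST code itself):

```python
# Back-of-the-envelope Big Bang nucleosynthesis estimate (a textbook
# illustration, not the BURST code): weak interactions freeze out with a
# neutron-to-proton ratio near 1/6, and free-neutron decay over the next
# few minutes lowers it to roughly 1/7 by the time nuclei start forming.
n_over_p = 1.0 / 7.0

# If essentially every surviving neutron ends up bound in helium-4, the
# primordial helium mass fraction is Y_p = 2*(n/p) / (1 + n/p).
Y_p = 2 * n_over_p / (1 + n_over_p)

print(f"predicted helium-4 mass fraction: Y_p = {Y_p:.2f}")
```

The observed primordial helium mass fraction is indeed close to 0.25, which is why the light-element abundances are such a sensitive probe of any extra particles in the soup.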



"Neutrinos are very interesting--theyre the second most abundant particle in the universe after photons yet we still have much to learn about them," commented Evan Grohs, who earned his Ph.D. through UCSD for the labor, while working on the project in the Center for Space and Earth Sciences at Los Alamos. "By comparing our calculations with cosmological observables, such as the deuterium abundance," says Grohs, "we can use our BURST computer code to test theories regarding neutrinos, along with other--even less understood--particles. It can be difficult to test these theories in terrestrial labs, so our labor provides a window into an otherwise inaccessible area of physics."



This research has become possible only recently, with the advent of astronomers' precision measurements of the amounts of nuclei present in the early universe. These measurements were made with "Very Large" telescopes, which are about 10 meters wide. The next generation of "Extremely Large" telescopes, 30 meters across, are currently under construction.



"With coming improvements in cosmological observations, we expect our BURST computer code to be useful for many years to come," said Paris. Improvements in BURST are planned that will exploit the precision cosmological obervations to broadcast even more exotic physics such as the mood of dark matter and dark radiation. A complete understanding of dark matter, which comprises about a quarter of the mass in the universe, is currently lacking, Paris noted.



The Daily Galaxy via Los Alamos National Laboratory



Image credit: AMS Alpha Magnetic Spectrometer Dark matter NASA www.segnidalcielo.it









Source



Top
 Profile      
 
Powered by phpBB © 2000, 2002, 2005, 2007, 2008, 2009 phpBB Group
Chronicles phpBB3 theme by Jakob Persson. Stone textures by Patty Herford.
With special thanks to RuneVillage

This site has 4 types of technology for converting text to speech. By default you use the vozme technology. To learn about the others you need to sign up.


- Privacy Policy -